Conduct a quick Google search using “site:yourwebsite.com” to make sure your pages are actually indexed. If you’re noticing that critical pages aren’t appearing in the SERPs, you’ve likely found the culprit. Check your robots.txt file to make sure you haven’t blocked important pages or directories. If that looks good, check individual pages for a noindex tag.
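If you'd rather script that check than spot-check pages by hand, here is a minimal sketch in Python using the standard library's robotparser plus the requests and beautifulsoup4 packages (the URLs are placeholders, not real pages):

```python
# Minimal sketch: check whether a URL is blocked by robots.txt or carries a noindex directive.
# Assumes requests and beautifulsoup4 are installed; the URLs below are placeholders.
from urllib import robotparser

import requests
from bs4 import BeautifulSoup

SITE = "https://yourwebsite.com"
PAGE = f"{SITE}/important-page/"

# 1. Is the page blocked by robots.txt for Googlebot?
rp = robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()
print("Blocked by robots.txt:", not rp.can_fetch("Googlebot", PAGE))

# 2. Does the page carry a meta robots noindex tag or an X-Robots-Tag header?
resp = requests.get(PAGE, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
print("Meta robots:", meta.get("content") if meta else "none")
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag", "none"))
```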
Traffic coming from SEO (Search Engine Optimization) is free, while traffic generated from PPC (Pay Per Click) ads is not. PPC ads are displayed above the organic search results or in the sidebar of the search results page for queries related to their topic. Landing page keywords, ad keywords, CPC (cost per click) bids, and targeting all influence how those ads are “served”, or displayed, on the page.
Implementing structured data markup (such as that from schema.org) might seem like a one-time project, but that “set it and forget it” mentality can land you in hot water. You should monitor the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information: as you change the content on your website, the markup can be altered without warning.
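Monitoring doesn't have to be fully manual, either. As a rough sketch (assuming the requests and beautifulsoup4 packages are available, and using a placeholder URL), you could periodically pull a page and confirm its JSON-LD blocks still parse and still declare the types you expect:

```python
# Rough sketch: pull a page and list the JSON-LD structured data types it declares.
# requests/beautifulsoup4 are assumed installed; the URL is a placeholder.
import json

import requests
from bs4 import BeautifulSoup

resp = requests.get("https://yourwebsite.com/product/example/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("Invalid JSON-LD block found")  # markup likely broke when content changed
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        print("Schema type:", item.get("@type"), "| name:", item.get("name"))
```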
Hi, thanks for your comment. When you refer to incognito browsing, I think it's important to clarify that referrer data *is* passed between sites. Click on a link whilst in incognito, and the referrer header is passed as usual. The exception is if you're browsing normally, and then opt to open a specific link in incognito mode - this won't pass referrer data.
If both pages are closely related (lots of topical overlap), I would merge the unique content from the lower-ranking article into the top-ranking one, then 301 redirect the lower-performing article to the top-ranking one. This makes the canonical version more relevant and gives it an immediate authority boost. I would also fetch it right away, do some link building, and possibly a little paid promotion to seed some engagement. Update the timestamp.
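How you implement the 301 depends on your stack; purely as an illustration (a Flask sketch with made-up paths), the old URL would permanently point at the merged article:

```python
# Illustration only: a 301 redirect from the merged (lower-ranking) URL to the canonical one.
# Flask is just an example stack; the paths are made up.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-lower-ranking-article/")
def old_article():
    # 301 = permanent, so search engines consolidate signals onto the target URL
    return redirect("/top-ranking-article/", code=301)
```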

Optimise for your personas, not search engines. First and foremost, write your buyer personas so you know to whom you’re addressing your content. By creating quality educational content that resonates with your ideal buyers, you’ll naturally improve your SEO. This means tapping into the main issues of your personas and the keywords they use in search queries. Optimising for search engines alone is useless; all you’ll have is keyword-riddled nonsense.
Hey, Kevin, love connecting with people and more than happy to tell you that my user name is brankica81. I will sometimes just stumble for fun and thumb up the pages I like. That is how I got a high number of favorites. The good thing is that once I started using SU to promote my blog (just occasionally), my discoveries were valued more. I am actually a pretty fair user, I never thumb up something just to have higher numbers. A great thing is to have a friend with a high number of stumbles, that can skyrocket your post, which happened to me the other day as well. Paper.li is still new (for me) and I can see a lot of potential in it.
Otherwise you might end up polluting your website’s traffic. Pages filled with obsolete or low-quality content aren’t useful or interesting to the visitor, so they should be pruned for the sake of your website’s health. Low-quality pages may affect the performance of the whole site: even if the website itself plays by the rules, low-quality indexed content can drag down the organic traffic of the site as a whole.
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests. See the analysis below. From my experience it can take anywhere from 15-30+ days for Google to respond to reconsideration requests; during peak periods it can take even longer.
Do you have a content strategy in place, or are your efforts more “off the cuff?” Not having a clearly defined keyword map can spell trouble — especially if two or more pages are optimized for the same keyword. In practice, this will cause pages to compete against each other in the SERPs, potentially reducing the rankings of these pages. Here is an example of what this might look like:
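As a purely hypothetical illustration: imagine yoursite.com/blog/best-running-shoes/ and yoursite.com/guides/best-running-shoes/ are both optimised for “best running shoes”. The two pages chase the same query, so links, clicks, and relevance signals are split between them, and neither tends to rank as well as a single consolidated page would.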
If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data which is currently being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like Let's Encrypt, but you should still explore the potentially significant SEO implications of a site migration. Remember, HTTPS and HTTP/2 are the future of the web.
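If you're not sure whether a migration (or an old link) still lands visitors on HTTP, a quick sketch like the one below (using the requests package and a placeholder domain) prints the redirect chain a URL goes through, so you can confirm traffic ends up on HTTPS:

```python
# Quick sketch: follow an HTTP URL and print its redirect chain,
# to confirm visitors (and referrer data) end up on HTTPS.
# Uses the requests package; the domain is a placeholder.
import requests

resp = requests.get("http://yourwebsite.com/some-page/", timeout=10)

for hop in resp.history:
    print(hop.status_code, hop.url)
print("Final:", resp.status_code, resp.url)

if resp.url.startswith("https://"):
    print("Good: HTTP requests are redirected to HTTPS.")
else:
    print("Still serving over HTTP - consider migrating.")
```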
Buy organic search traffic from Google.com – We will send a minimum of 2,500 real human visitors to your site via Google.com’s search field with your keywords to improve your SERPs and SEO strategy. All visitors will show up as organic search traffic in your Google Analytics. A constant flow of organic search traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don’t wait, buy Google Search Traffic Visitors today …
Recent studies have found that upwards of 80% of consumers’ outbound sharing from publishers’ and marketers’ websites now occurs via these private channels. In terms of numbers of active users, messaging apps are outpacing social networking apps. All the activity driven by these thriving platforms is typically bucketed as direct traffic by web analytics software.

Put simply, duplicate content is content that appears on the world wide web in more than one place. Is this such a big problem for readers? Maybe not. But it is a very big problem for you if you have duplicate content (internally or externally), as the search engines don’t know which version of the content is the original and which one they should be ranking.
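When you suspect two URLs carry the same content, one quick sanity check is to see what each page declares as its canonical URL. Here is a rough sketch (requests and beautifulsoup4 assumed installed, placeholder URLs):

```python
# Rough sketch: compare the rel=canonical declarations of two suspected duplicate pages.
# requests/beautifulsoup4 are assumed installed; the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

def canonical_of(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

for url in ("https://yourwebsite.com/page-a/", "https://yourwebsite.com/page-b/"):
    print(url, "->", canonical_of(url))

# If both point at the same canonical URL, search engines should consolidate them;
# if not, you may be asking them to rank two copies of the same content.
```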
Lol, yeah, I am so happy it's published because it is awesome to be a part of this contest you guys are doing. I just know I will meet some great people. You should check out Website Babble, it is related to blogging and a great related traffic source. By the way, the post was pretty long so I didn't want to stuff it with more details, but a lot of the mentioned answer sites have your links set to do-follow :) Thanks so much for the comment! Kisses!
Wow, great post, Brankica!!! I've used Flickr in the past and while I haven't gotten much traffic from it, I've gotten a ton of backlinks. Like there's this one photo in particular that I tagged as "SEO" and included a link in the description back to the blog post I used the photo in. I guess there are lots of autoblogs that scrape stuff like SEO-related images, and since they scrape the description, too, I get lots of automatic backlinks. Sure, they're crappy backlinks, but still backlinks :)
The following example shows a client who receives 55% of their website traffic from Organic Search – typically this is a strong performance, but again this will vary greatly based on paid search activity. More important than the percentage is the number of sessions itself – this tells us how many visits (note: not unique visitors) came through this channel. In this example, we have 7,486 visits in one month that all came from organic searches.
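As a quick back-of-the-envelope check using the figures quoted above, you can work back from the organic share to an estimate of the site's total monthly sessions:

```python
# Back-of-the-envelope: if 7,486 sessions are 55% of traffic, estimate the monthly total.
organic_sessions = 7486
organic_share = 0.55
total_sessions = organic_sessions / organic_share
print(round(total_sessions))  # roughly 13,611 sessions in the month
```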

Lol, start from the middle :) Yeah, Squidoo is doing great and I know they are cleaning out all the time when they see a lousy lens. They have angels over there that report stuff like that (I am an angel myself on Squidoo :). And it's not even that hard to make money over there, which is why I always suggest it to beginners that don't have money to invest in blogs and sites.
Another tip you can use is to just reach out to the prior agency and say something like the following: “We realise you were using link networks for our website, which has resulted in a Google penalty and a loss in business. Can you please remove our website from any link network you have built?” If the prior agency is decent, they will remove the links from the network.
Now that we have the data we want in the chart, we use the advanced search to filter it down to only the traffic we want to see. Click the blue “Advanced” link beside the search bar, just to the top right of your list of landing pages. This will open the Advanced search screen, where we want to set up our query. In the green drop-down, choose “Medium”, and in the text box at the end of the row, type “organic”. Click the Apply button below the query builder to apply the search.
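If you'd rather pull the same slice of data programmatically than click through the UI, a sketch along these lines uses the Google Analytics Reporting API v4 (it assumes you've set up a service account with read access; the key file path and VIEW_ID below are placeholders you'd replace with your own):

```python
# Sketch: landing-page sessions filtered to medium == "organic" via the GA Reporting API v4.
# Assumes a service account JSON key and a view ID; both values below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account-key.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "VIEW_ID",
        "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:landingPagePath"}],
        "dimensionFilterClauses": [{
            "filters": [{
                "dimensionName": "ga:medium",
                "operator": "EXACT",
                "expressions": ["organic"],
            }],
        }],
    }],
}).execute()

# Print each landing page and its organic session count.
for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```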
The "Direct Session" dimension, not to be confused with the default channel grouping or source dimension, is something different. It can tell you if a session was genuinely from its reported campaign (reported as No), or if it was simply bucketed there due to the last non-direct click attribution model (reported as Yes). I didn't mention this dimension in the article because it has some flaws which can cause brain bending complexities, plus it goes a little beyond the scope of this article, which I tried to gear more towards marketers and non-expert GA users. It's certainly an interesting one, though.
For our client: we took the top PPC terms based on conversion and worked these keywords into existing pages on the website. We also created new high-quality content-based pages from these conversion terms. This type of strategy can work very well in assisting overall conversions on the website and driving more revenue. We also conducted a large-scale keyword research project for the client, which uncovered many areas of opportunity for content development and targeting.

This topic seems actually quite controversial. Google answered the question by what could be taken as a denial. But their answer was kind of open to interpretations. And on the other hand, there are studies (one of them from Moz) that showed linking out has an impact. So, how can you be so assertive? Is it something that comes out from your own experiments?
For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.
A great way to break your web traffic without even noticing is through robots.txt. Indeed, the robots.txt file has long been debated among webmasters, and it can be a strong tool when it is well written. Yet the same robots.txt can also be a tool you can shoot yourself in the foot with. In another post, we’ve written an in-depth article on the critical robots.txt mistakes that can ruin your traffic.
Wow, what a massive post Brankica! I love it. :) I especially liked your suggestion to answer questions on answer sites. I've tried Yahoo Answers before, though didn't do much with it. I'll definitely give it another go in the future though! Man, just when I was tidying up my to-do list, you had to make it longer for me. ;) Thanks for such an awesome read! By the way, I found this post on StumbleUpon (crazy how the world works). =p Christina
Sanjeev, I really appreciate the feedback. I am glad you actually put these tips to use and made them work for you. I love hearing about people getting results from my tips :) Is there such a thing as a Ph.D. in traffic generation? LOL, I know some that would love the title, but somehow never really walk the walk... Now about those leads and closures, I will have to consult you, seems like you know some things way better than I do, I would like to see how that happens! Thank you so much for the comment and I hope you will have even better results in a few weeks!
Use long tail keywords. Don’t just go with the most popular keywords in your market. Use keywords that are more specific to your product or service. In time, Google and other search engines will identify your website or blog as a destination for that particular subject, which will boost your content in search rankings and help your ideal customers find you. These tools will help.