"We have used quite a few tools for tracking SERPs and keywords as part of our content marketing effort. And there was always something missing. It wasn’t until we found AccuRanker that we were completely satisfied. It has increased our productivity. The powerful filters, tagging, instant checks, integration with GSC, multiple URL detection per keyword, accurate search volume, and notes are all features we now can’t live without. AccuRanker really has taken our SEO efforts to another level. We were able to grow our organic traffic by 571% in just 13 months."
The Forums – yes, I know some bloggers are telling you that you can’t get anything from forums, but I can prove them wrong. One example: I met one of my dear blogging friends on a forum way before I started blogging. She has been helping me ever since. Instead of struggling with newbie mistakes, I skipped most of them and went straight for success. Most forums allow links in signatures – use them!
Recent studies have found that upwards of 80% of consumers’ outbound sharing from publishers’ and marketers’ websites now occurs via these private channels. In terms of numbers of active users, messaging apps are outpacing social networking apps. All the activity driven by these thriving platforms is typically bucketed as direct traffic by web analytics software.
So, Google has accepted the reconsideration request; you can now move forward with creating a high-quality link building and content creation strategy. I see everyone creating threads about great content marketing examples, but the problem is that most of the time these are big-business examples. SMEs and start-ups do not have big dollars to do such things, so the next best thing is to create a content marketing calendar for your clients.

Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates those parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
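As an illustration, here is a hypothetical robots.txt showing two frequently seen mistakes of this kind (not necessarily the exact two discussed here): a single stray slash that blocks the entire site, and blocking the resources Google needs to render your pages.

```
# Mistake 1: "Disallow: /" tells every crawler to stay away from the whole site.
# This often survives from a staging environment and silently kills organic traffic.
User-agent: *
Disallow: /

# Mistake 2: blocking CSS/JS directories prevents Googlebot from rendering
# pages properly, which can hurt how those pages are evaluated and ranked.
User-agent: Googlebot
Disallow: /assets/css/
Disallow: /assets/js/
```

The directory paths above are placeholders; the point is that a robots.txt audit should check both what is disallowed and whether rendering resources are still crawlable.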
Lol, start from the middle :) Yeah, Squidoo is doing great and I know they are cleaning out all the time when they see a lousy lens. They have angels over there that report stuff like that (I am an angel myself on Squidoo :) And it’s not even that hard to make money over there, which is why I always suggest it to beginners who don’t have money to invest in blogs and sites.
Everyone wants to rank for those broad two or three word key phrases because they tend to have high search volumes. The problem with these broad key phrases is they are highly competitive. So competitive that you may not stand a chance of ranking for them unless you devote months of your time to it. Instead of spending your time going after something that may not even be attainable, go after the low-hanging fruit of long-tail key phrases.

People find their way to your website in many different ways. If someone is already familiar with your business and knows where to find your website, they might just navigate straight to your website by typing in your domain. If someone sees a link to a blog you wrote in their Facebook newsfeed, they might click the link and come to your website that way.

Another point which you did mention (between the lines: "and updating your backlinks to point to HTTPS URLs") but I didn’t fully appreciate is that updating backlinks when your site switches from HTTP to HTTPS is very important. Redirecting all HTTP traffic to the HTTPS version of your site from the server (301) is necessary, but doesn’t solve losing referral data. Only updating backlinks, especially those on HTTPS sites, does. I’d like to emphasize that!
First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this quite general so everyone can adapt this case study to their own situation.

At the end of the day, it depends on the size of the website you are working with and how well known the brand is in the market. You can adapt some of the strategies listed above in the post at scale, and it can have a highly positive impact on a web property; the property in question is a real content house, so anything is possible. What else do you suggest we do? I will let you know if it has been done already.
Wow Brankica, what a list! Completely and utterly usable list that I am sure to apply in my niche. Very detailed post especially as you stated how to make the most of even the well known ones. We usually overlook the offline traffic generation forgetting that once upon a time, there was no internet. Slideshare sounds like something I could use. This post is unbeatable! Good job!
AllRecipes: This site is one of the larger recipe-focused sites on the web, and it also has a reputation amongst foodies for having a ton of comments about “this recipe was terrible, also I made X, Y, and Z substitutions.” You, too, can take advantage of these recipe comments to promote your own changed version of the recipe, if you have a cooking blog.
This is an easy one. Don’t use meta refreshes or JavaScript-based redirects — these can wipe or replace referrer data, leading to direct traffic in Analytics. You should also be meticulous with your server-side redirects, and — as is often recommended by SEOs — audit your redirect file frequently. Complex chains are more likely to result in a loss of referrer data, and you run the risk of UTM parameters getting stripped out.
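To make the server-side approach concrete, a minimal sketch of a one-hop 301 in nginx (the domain is a placeholder, and this assumes a standard nginx setup): `$request_uri` carries both the path and the query string, so UTM parameters survive the redirect intact.

```nginx
# One-hop 301 from HTTP to the canonical HTTPS host.
# $request_uri includes the path AND the query string, so
# ?utm_source=newsletter&utm_medium=email is passed through unchanged.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://www.example.com$request_uri;
}
```

Compare this with a chain (HTTP → HTTPS → www → trailing slash): each extra hop is another chance for parameters to be dropped or referrer data to be lost along the way.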
Ezinearticles.com, although hit by the new Google algorithm, is still a great source of highly targeted traffic. The bounce rate of visitors I get from EZA is always less than 20%! Choosing a good keyword for an article can result in incredible amounts of traffic. I have been receiving a lot of traffic from a single well-written article for a year and a half now!
Well, as noted in the post, it is not just about the links – that was only one key part of a wider strategy. The website in question has deep levels of content. So it is not just about a blog section; they have numerous high-quality content sections we have developed over time. It would never be advisable to attack competitors’ sites with low-quality links.
So we’re confident that the high social traffic in the sixth example above reflects the highly successful social media campaigns the organization is working on. We could also see this pattern for a site which has decided that they can’t succeed with search and has, as Google suggests for such websites, chosen to work on social media success instead. The difference is, one site would have good Search and Direct traffic and really good social media, while the other might have dismal Search and rely heavily on social media, which is very time-consuming and often has a low ROI. This second pattern is one we’ve seen with microbusinesses where the business owner is spending hours each day on social media and making very little progress in the business. Making the investment in a better website would probably pay off better in the long run, even if it seems like an expensive choice.
Sanjeev, I really appreciate the feedback. I am glad you actually put these tips to use and made them work for you. I love hearing about people getting results from my tips :) Is there such a thing as a Ph.D. in traffic generation? LOL, I know some that would love the title, but somehow never really walk the walk... Now about those leads and closures, I will have to consult you; it seems like you know some things way better than I do, and I would like to see how that happens! Thank you so much for the comment and I hope you will have even better results in a few weeks!
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
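A hypothetical silo for a gardening site might look like the sketch below, with the URL structure (physical) and internal links (virtual) reinforcing the same theme; the topic and page names are made up for illustration.

```
/vegetable-gardening/                  ← silo landing page, in the main navigation
├── /vegetable-gardening/soil-prep/    ← subpage: links to sibling subpages
├── /vegetable-gardening/watering/        and back up to the landing page
├── /vegetable-gardening/pest-control/
└── /vegetable-gardening/harvesting/
```

Keeping all cross-links inside the silo (rather than linking sideways into unrelated sections) is what establishes the content theme for search engines.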

In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.
Hey Caroline, that is one of the great things, being included like that. Happened to me once, when my favorite blogger asked a question on her FB page and then included the testimonials in one of the pages. So not only did I get a dofollow link from a site with an Alexa rank under 5,000, but I was so happy to be featured on her blog. Thanks for the awesome comment and I would love some feedback in a week or so, when the first results come in :)
By far the best rank tracking software on the market. We’ve tested a variety of different rank tracking tools, and AccuRanker is by far the most precise on the market. As a tool, AccuRanker is very intuitive to use and the support is just great. We appreciate the speed and the possibilities regarding filters, tagging, and especially the instant refresh function. These features, and a bunch of others, are something we use a lot, and I myself have been satisfied using it since 2016.

To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
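The same filtering step can be done without a spreadsheet. A minimal Python sketch of the idea (the `/user/` path pattern comes from the example above; the URL list and domain are made up):

```python
# Filter a crawled URL list down to user profile pages.
# On this hypothetical forum, all profile pages live under /user/.
crawled_urls = [
    "https://example.com/user/alice",
    "https://example.com/forum/thread-123",
    "https://example.com/user/bob",
    "https://example.com/about",
]

profile_pages = [url for url in crawled_urls if "/user/" in url]
print(profile_pages)
# → ['https://example.com/user/alice', 'https://example.com/user/bob']
```

In practice you would load the Screaming Frog export (e.g. a CSV of URLs) instead of a hard-coded list, but the filter itself is the same one-line check.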


For many startups, this means doing enterprise SEO on a small business budget, which comes with a few compromises. The problem is, Google doesn’t accept compromises when it comes to search optimisation and you need to get the fundamentals spot on. The good news is, the sooner you get these right, the faster you’ll be able to build a self-sustaining SEO strategy that doesn’t come back to bite you in the budget later.
Thank you Jayne, for doing all that voting and sharing, it means the world to me! I am glad you saw some new ideas and, looking at all the comments, I think I killed a really big tree with this post – that is how many people want to print it out, lol. If at any point you get stuck with one of these, send me an e-mail, a tweet, whatever, and I will try to help you with extra ideas :)

For our client: we monitored everything on a daily basis. If something came up which needed to be fixed, we were quick to implement it with the development team at the business. We also re-ran several campaigns multiple times, as they had worked effectively the first time around in generating significant traffic, so doing the same thing twice was second nature.
It works in a similar way to a canonical tag, which shows when a duplicate version of a page exists. But in this case, it helps Google index local content more easily within local search engines. It also helps to pass trust across sites and improve the way Google crawls these pages. To learn more about the errors you should avoid when it comes to hreflang tags, you can check our previous article.
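As a reminder of what the markup looks like, here is a minimal hreflang set for a hypothetical page with English and German versions (the domains and paths are invented). Every alternate, including the page itself, should appear in the set, and the set must be identical on each version:

```html
<!-- Placed in the <head> of BOTH pages: list every alternate, self included -->
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/products/" />
<link rel="alternate" hreflang="de-de" href="https://example.de/produkte/" />
<!-- x-default: the fallback page for users matching no listed language -->
<link rel="alternate" hreflang="x-default" href="https://example.com/products/" />
```

Missing return links (page A lists B, but B doesn’t list A) are among the most common reasons hreflang annotations get ignored.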
To do this, I often align the launch of my content with a couple of guest posts on relevant websites to drive a load of relevant traffic to it, as well as some relevant links. This has a knock-on effect toward the organic amplification of the content and means that you at least have something to show for the content (in terms of ROI) if it doesn't do as well as you expect organically.
It’s content like this that forms the foundation of effective content marketing: a crucial component in modern day integrated marketing campaigns that cohesively drive marketing results. It’s so vital, in fact, that some 22% of those surveyed at Smart Insights said that content marketing would be the digital marketing activity with the greatest commercial impact in 2016.
I have always believed in good quality content, well structured and written in a way that isn’t just about promotional talk. Thanks for sharing this information with us, it’s always helpful to have everything written in a concise manner so we can remind ourselves now and again of what it takes to increase organic traffic. As an SEO consultant myself I come across websites all the time that are messy and still using tactics that have long been out of date. Having a successful website is all about quality content and links. I like it that you stated what the problem was and then how you fixed it for the client. Great article.

Google claims their users click (organic) search results more often than ads, essentially rebutting the research cited above. A 2012 Google study found that 81% of ad impressions and 66% of ad clicks happen when there is no associated organic search result on the first page.[2] Research has shown that searchers may have a bias against ads, unless the ads are relevant to the searcher's need or intent.[3]
Another tip you can use is just reach out to the prior agency and say something like the following: “We realise you were using link networks for our website which has resulted in a Google penalty and loss in business. Can you please remove my website from any link network you have built?”. If the prior agency is decent, they will remove the links from the network.
Mobile traffic: In the Groupon experiment mentioned above, Groupon found that both browser and device matter in web analytics’ ability to track organic traffic. Although desktops using common browsers saw a smaller impact from the test (10–20 percent), mobile devices saw a 50 percent drop in direct traffic when the site was de-indexed. In short, as mobile usage grows, we are likely to see even more organic search traffic misattributed as direct traffic.