The following example shows a client who receives 55% of their website traffic from organic search. Typically this is strong performance, but it will vary greatly depending on how much paid search you run. More important than the percentage is the number of sessions itself – this tells us how many visits (note: not unique visitors) came through this channel. In this example, we have 7,486 visits in one month that all came from organic searches.
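As a back-of-the-envelope check on that arithmetic, here is a minimal sketch in Python. Only the 55% share and the 7,486 sessions come from the example above; the total session count is simply implied by them.

```python
# Channel-share arithmetic from the example above.
organic_sessions = 7486   # organic search visits in one month
organic_share = 0.55      # organic's share of all sessions

total_sessions = organic_sessions / organic_share
print(f"Implied total sessions: {total_sessions:,.0f}")           # ~13,611
print(f"Organic share: {organic_sessions / total_sessions:.0%}")  # 55%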
I feel that an article is only good if it adds value to the reader, and this one qualifies as a great article because it has shown me quite a few new ways to generate traffic to my blogs. I would like to tell you that I had read your suggestion about using Flickr images on a blog for traffic generation somewhere else as well, and I have religiously followed it on my blog – today at least 15% of the traffic to my site comes from there! I am therefore going to follow the other suggestions as well (the ones I am not following yet) and hope to take my blog from Alexa 500K to sub-100K in the next couple of months!

Hey Ryan, thanks for including me, I will be over there to thank you as well :) I am glad you liked the post, and I definitely don't advise getting into all of them at once, lol. I am the first one who would try them all, but I have learned that that is the wrong way to go about almost anything. The best thing is to choose one or two of these and track the results. I learned that Flickr can take too much time for some types of blogs, while others will have great results with it – it depends a lot on the niche. But I know you will do the best thing, you always do! Thanks for the comment and for helping me in the contest!

Unfortunately, this particular client was fairly new, so their Moz campaign data wasn't being collected during the high organic traffic period in 2015. If it had been, a good place to start would be the search visibility metric (as long as the primary keywords have stayed the same). If this metric has dropped drastically over the years, it's a good indicator that your organic rankings have slipped quite a bit.


Everything on Wikipedia and other wikis has a source attribution. So if an image's source is your blog (and I am talking about useful, good images, not just anything), you will get traffic. I explained in the post how it generally works for a travel site, for example. It works better for some niches than others, but creative people can make anything work for them.

We used to like to see a fairly balanced mix, back when there were mostly just three sources: Organic, Direct, and Referral. Organic search traffic is a good sign of general site health – it's usually the top source of conversions, and it generally shows great ROI. Direct traffic is often the people who love your site or are coming back to buy after they found you some other way. These two groups of visitors are probably the ones that bring you the most sales or leads.
Looking at the keyword rankings and organic landing pages provided a little insight into the organic traffic loss, but nothing definitive. Because of this, I moved on to the on-page metrics for further clarity. To be clear, when I talk about on-page metrics, I mean bounce rate, page views, average page views per session, and time on site.
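To make those definitions concrete, here is a minimal sketch of how these metrics reduce to simple ratios over session data. The records below are hypothetical, not from the client in question, and the time-on-site calculation is simplified compared to how analytics packages actually compute it.

```python
# Hypothetical session records: (pages viewed, seconds on site).
sessions = [(1, 0), (4, 180), (2, 95), (1, 0), (6, 420)]

total = len(sessions)
bounces = sum(1 for pages, _ in sessions if pages == 1)  # single-page sessions
page_views = sum(pages for pages, _ in sessions)

bounce_rate = bounces / total                                  # 2/5 = 40%
pages_per_session = page_views / total                         # 14/5 = 2.8
avg_time_on_site = sum(secs for _, secs in sessions) / total   # 695/5 = 139 s

print(f"Bounce rate: {bounce_rate:.0%}, pages/session: {pages_per_session:.1f}, "
      f"avg time on site: {avg_time_on_site:.0f}s")
```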
AccuRanker is faster, better, and more accurate for rank tracking than enterprise tools, and if you want a best-of-breed strategy in your toolbox, AccuRanker is part of that solution. Instant access to bulk upload and download of thousands of keywords, covering clients' rankings and their competitors' rankings on specific chosen keywords, enabled GroupM to analyze big data within short deadlines. AccuRanker is the best in the industry when it comes to one thing: rank tracking.

Implementing structured data markup (such as that from schema.org) might seem like a one-time project, but that “set it and forget it” mentality can land you in hot water. You should be monitoring the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information. As you change the content on your website, this can alter the markup without warning.
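As a reminder of what this markup looks like in practice, here is a minimal schema.org example in JSON-LD (the product name and rating values are placeholders). If a template or plugin changes, values like these can silently drift out of sync with the visible page content, which is exactly why ongoing monitoring matters.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```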

Hi Chris, "Good content" means a couple of things - good for readers and good for Google. Good content for readers means that the content answers questions, provides value, offers solutions, and is engaging. You want to keep the reader on the page and on your website for as long as possible. To make good content for Google, you have to provide the search engine with a set of signals - e.g., keywords, backlinks, low bounce rates, etc. The idea is that if you make good content for readers (engaging, valuable, actionable, and informative), your content will get more engagement. When your content gets more engagement, Google will see it as good content too and put it higher in the SERPs. Making "good content" is about striking that balance. Let us know if that answers your question!

Optimise for your personas, not search engines. First and foremost, write your buyer personas so you know to whom you're addressing your content. By creating quality educational content that resonates with your ideal buyers, you'll naturally improve your SEO. This means tapping into the main issues of your personas and the keywords they use in search queries. Optimising for search engines alone is useless; all you'll have is keyword-riddled nonsense.
For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates those parts of your site you don't want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we list two common yet critical robots.txt mistakes.
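As a hedged illustration of how small such a mistake can look (an example of my own, separate from the specific list that follows), consider a blanket disallow accidentally carried over from a staging server:

```
# robots.txt at the site root, e.g. https://example.com/robots.txt
# These two lines tell ALL crawlers to stay out of the ENTIRE site.
# Fine on a staging server; on production, organic traffic will collapse.
User-agent: *
Disallow: /
```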

BuzzFeed: Everyone knows the primary perpetrator of clickbait and content appropriation on the web, but seemingly few people realize that anyone can write for them. You can sign up for a free account and simply start creating content on their site. A lot of content sees very little exposure there, but if something does catch on – perhaps because of your audience and your brand reputation – BuzzFeed will reach out to you and help by promoting it on the main sections of their site.
There are two ways to set up interviews. The first is the manual option. Identify sites and specific people you want to interview, or who you would like to have interview you, and approach them about it. Send them a message asking if they would be willing to be interviewed, or saying that you're available if they would like to interview you about something specific related to your industry. In the latter case, try to have a specific subject they've talked about recently, a press release of your own that might intrigue them, or a case study with data they might want to use – something to sweeten the pot, basically.
Regarding Link Detox: the links it diagnoses as Toxic are generally accurate calls, as they're either not indexed by Google or carry malware, viruses, and the like, but I recommend a manual review of anything diagnosed as Suspicious. I used it recently to start cleaning up our backlinks, and some legit sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
You are giving some great advice for increasing readership and loyal followers. I am trying to include 2-3 sources from each of your list categories so that I have a balanced approach to increasing traffic. Right now I am learning StumbleUpon and Squidoo.com. Although I submit articles to Ezine, I think my keywords are not strong enough to get a good click-through rate. I am getting better about writing articles with strong keywords now; I sort of fell off the pace when I started using a lot of guest authors :( Thanks for your great insights and checklist for increasing traffic, Brankica!
Hey Christy, thanks so much for the comment – this is the first time I'm hearing about this HARO thing. I am definitely going to have to check it out. I have this OCD (lol) where as soon as I hear about something new, I gotta join and try it out. This is officially the new item on my to-do list, which now has only 563 bullet points. Just kiddin' – although we all have long to-do lists, I always find extra time to see what other bloggers are talking about. Thanks so much again :)
We have been using AccuRanker for the past few months and the team has had a lot of good things to say about it. Not only do the rankings seem accurate, there are a lot of added features within the platform that make reporting on the data very effective. As with any new tool, there are some feature recommendations we have put forward, and AccuRanker has been very receptive to these ideas; we are hopeful they will be implemented in the near future.
Thank you Jayne, for doing all that voting and sharing, it means the world to me! I am glad you saw some new ideas, and looking at all the comments, I think I killed a really big tree with this post – that is how many people want to print it out, lol. If at any point you get stuck with one of these, send me an e-mail, a tweet, whatever, and I will try to help you with extra ideas :)
Using the Content Assistant feature of Cognitive SEO, I've included the majority of the keywords suggested by the tool in the content assets we're aiming to increase in organic visibility. I've also restructured the underperforming content and added new sections to make the inclusion of the suggested key phrases more logical and natural, and to create relevant content.

Organic search traffic used to just mean the amount of traffic that came to your site via someone who found your site using a search engine. Then, you could drill down into more detail to see which keyword they searched to get to your website. For example, we might learn that someone came to HubSpot.com by searching the keyword phrase "inbound marketing." This is important to know because it helps businesses understand which keywords are driving the most traffic, leads, and customers so they can develop better-informed content and keyword strategies.
The strength of your link profile isn’t solely determined by how many sites link back to you – it can also be affected by your internal linking structure. When creating and publishing content, be sure to keep an eye out for opportunities for internal links. This not only helps with SEO, but also results in a better, more useful experience for the user – the cornerstone of increasing traffic to your website.
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
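For illustration, a silo built this way might look like the following (the URLs and page names are hypothetical):

```
/seo/                     <- core silo landing page
/seo/keyword-research/    <- subpage: links to sibling subpages and back to /seo/
/seo/on-page-basics/      <- subpage: links to sibling subpages and back to /seo/
/seo/link-building/       <- subpage: links to sibling subpages and back to /seo/
```

The links between sibling subpages and the links back up the hierarchy are what concentrate topical relevance on the silo landing page.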