One of the things that can slowly bleed traffic from your site is the quality (or lack thereof) of the content you publish. Previous Google updates like Panda were released specifically to deal with the issue of low-quality content on websites. Long story short: Panda was intended to stop sites with bad content from appearing in the search results.
AccuRanker is faster, better, and more accurate for rank tracking than enterprise tools, and if you want a best-of-breed toolbox, AccuRanker is part of that solution. Getting instant access to bulk upload and download of thousands of keywords covering clients' rankings and their competitors' rankings on specific chosen keywords enabled GroupM to analyze big data within short deadlines. AccuRanker is the best in the industry when it comes to one thing: rank tracking.
As I mentioned earlier, one of the biggest downsides of SEO is that Google is constantly making changes to its algorithm; they come out of nowhere, they can drop your domain authority by 5, 10, even 15 points, and it almost feels like you need to start optimizing all over again. It's frustrating, it's hard work, and the results aren't always visible - but if you can get SEO right, it can truly be an amazing source of traffic.
For our client: We were lucky enough to remove most of the links left over from the prior agency's outreach, and we also went directly to many of the webmasters whose links we wanted removed. We did not use the Disavow tool, as it was not around when we completed this link cleanup, but it has often been said that if you are going to use the Disavow tool, use it with caution.
Having large groups of content that all revolve around the same topic will build more relevance around the keywords you're trying to rank for within those topics, and it makes it much easier for Google to associate your content with specific topics. Not only that, but it makes it much easier to interlink your content, pushing more internal links through your website.
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality team does not look at reconsideration requests. See the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to reconsideration requests; during peak periods it can take even longer.

On the other hand, structured data is also a real asset for increasing your organic traffic and improving your CTR. It refers to values that help search engines categorize and index your content, and present it in more creative ways to the user. While there is no direct correlation between structured data and an SEO improvement, it can really help you boost your visibility in SERPs.
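To make that more concrete, here is a minimal Python sketch of what an Article structured-data object looks like as JSON-LD, one of the formats search engines accept for structured data. The function name and the example headline, author, and URL are purely illustrative and not taken from any real site.

```python
import json

def build_article_jsonld(headline, author_name, date_published, url):
    """Return a minimal schema.org Article object serialized as JSON-LD."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author_name},
        "datePublished": date_published,
        "mainEntityOfPage": url,
    }
    # The resulting string would be embedded in the page inside a
    # <script type="application/ld+json"> tag so crawlers can read it.
    return json.dumps(data, indent=2)

print(build_article_jsonld(
    "How Structured Data Can Improve Your CTR",   # illustrative values only
    "Jane Doe",
    "2018-06-01",
    "https://example.com/structured-data-and-ctr",
))
```

Markup like this does not change what the page says; it simply labels the content so search engines can display it in richer ways.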
Thank you Jayne, for doing all that voting and sharing, it means the world to me! I am glad you saw some new ideas and, looking at all the comments, I think I killed a really big tree with this post, that is how many people want to print it out, lol. If at any point in time you get stuck with one of these, send me an e-mail, a tweet, whatever, and I will try to help you with extra ideas :)
Thanks for the comment Slava, good to see your team is on top of things, and happy you liked the post. The website in this case was a client who had taken on an agency doing lower-quality SEO work that was affecting the site, such as the huge link network and a strategy which revolved mainly around head terms. We saw no long-tail integration in the old agency's strategy, so we were able to yield great results to begin with. The client's site has hundreds of high-quality articles which we were able to re-optimize and update as noted. Further to this, they had a large index of high-quality pages to work from. Sure enough, the points listed above were key elements of a far wider strategy which could run to hundreds of points. I just wanted to include some of the biggest wins and the easiest points to implement.
In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.
Everything on Wikipedia and other wikis has a source attribution. So if an image's source is your blog (and I am talking about useful and good images, not just anything) you will get traffic. I explained in the post how it generally works for a travel site, for example. It works better for some niches than others, but creative people can make anything work for them.
I don't generally think of my own list as a traffic source, because those are people who subscribed and keep coming back, mostly because of good content. This is more of a tutorial about where to get fresh traffic in case a person is not using some of these sources already. Where have you guest posted? I haven't seen any and would like to read some of those :)
About.com – How many times have you tried to do something and needed instructions? You Google it and it brings up how-tos from About.com. Well, you can contribute to this site with some of the how-tos for your niche and get traffic from it. Additional tip – there is an About.com Forum where you can be helpful to people in need and get additional traffic to your blog.
One of the things that makes social media a bit more attractive than search engine optimization is that you get to maintain a bit more control over your success. You can always find new ways of improving your strategy and you can learn from your mistakes and your wins so that you can improve your traffic in the future. The same can't really be said about SEO - although it's very clear what not to do, it's not always as clear exactly what strategies can help you improve your ranking.
Backlinks are essentially authoritative links: another site signals that its readers can find more helpful information about a particular keyword or market on your site, and it does that by linking to you; those links are what we call backlinks. The more high-quality, authoritative links you have, the more credible Google considers you to be in your market. Your website gains authority when other website owners link to it; the search engine's algorithm then gives your SEO a boost, your site is likely to rank higher, and the more of these authoritative links you earn, the stronger the effect. Blog commenting is a great way to get backlinks to your website. Step 1: Find a relevant, high-traffic blog in your niche. Step 2: Actually read the post and understand what it is about. Step 3: Leave a comment relevant to the topic, then simply place your link in the comment.
As I already mentioned above, it doesn't matter how small or big your company is; you need to take care of your SEO. Remember that 89 percent of consumers use search engines to help make their purchasing decisions. To get people to buy from your company, you need to make your site visible at some point in the buyer's journey, which means you definitely need to be visible on the search results pages.

Organic is what people are looking for; the rest of these simply put things in front of people who may or may not be seeking what you offer. We know that approximately X number of people are looking for Y every day. So if we can get in front of those people, we have a much greater opportunity to create long-term relationships and increase our overall ROI.
To get a full list of the internal links on your website, go to the Search Traffic > Internal Links section in the same Search Console. There you'll see a list of all the pages on your website (indexed or not) along with the number of links pointing to each. This can be a great starting point for discovering which pages should be subject to content pruning.
For our client: We rolled out numerous new pieces of content on their blog and news section; we aimed to make the content creative and funny. As the client was in the careers space, we made use of "funny interview questions" and "technical interview questions" style articles. Amazingly, one of the articles even made it to the first page of Reddit. We also pushed out content related to various holidays that year, specific to the client's industry, and tied to current trends in the market.
Let's say you wish to block all URLs that have the .pdf extension. You would write in your robots.txt a line that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The "$" at the end tells bots that only URLs ending in .pdf shouldn't be crawled, while any other URL merely containing ".pdf" can still be crawled. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should be on the safe side of your website's traffic.
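To see how that "$" changes the matching, here is a small Python sketch that mimics Google's documented wildcard rules for robots.txt patterns ("*" matches any run of characters, a trailing "$" anchors the pattern to the end of the URL). This is only an illustration of the matching logic, not Google's actual crawler code.

```python
import re

def google_wildcard_match(pattern, path):
    """Return True if a URL path matches a robots.txt Disallow pattern,
    following Google's wildcard rules: '*' matches any sequence of
    characters, and a trailing '$' anchors the match to the end of the URL."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn the robots.txt '*' back into '.*'.
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.search(regex, path) is not None

# With the trailing '$', only URLs that end in .pdf are blocked.
print(google_wildcard_match("/*.pdf$", "/files/report.pdf"))      # True
print(google_wildcard_match("/*.pdf$", "/files/report.pdf?v=2"))  # False
# Without the '$', any URL containing '.pdf' after the slash is blocked.
print(google_wildcard_match("/*.pdf", "/files/report.pdf?v=2"))   # True
```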
Back when there were mostly just three sources (Organic, Direct, and Referral), we liked best to see a fairly balanced mix. Organic Search traffic is a good sign of general site health; it's usually the top source of conversions, and it generally shows great ROI. Direct traffic is often the people who love your site or are coming back to buy after they found you some other way. These two groups of visitors are probably the ones that bring you the most sales or leads.
There are always high profile blogs in your industry, no matter what that industry is. It might be sites like Business Insider, Forbes, and Inc. It might be sites like Medium and Gawker. It might be sites like Search Engine Journal and QuickSprout. The fact is, every industry has its dominant forces, and as long as you’re not the dominant blog, you can use the dominant blogs as traffic sources for your own.

If you don't want your Google traffic to drop dramatically due to indexing and content pruning, below we list the steps you need to take in order to prune your own content successfully. We've covered this subject in a previous blog post. Before doing so, though, we want to stress that removing indexed pages from the Google Index is not an easy decision and, if handled improperly, it may go very wrong. Still, at the end of the day, you should keep the Google Index fresh with information that is worth ranking and that helps your users.
Hi Brankica, I'm honored to be in the same contest as you! Your article is really breathtaking! You have added so many unique factors into this one that it's extremely beneficial for any blogger to read. You covered so many areas and gave great ideas and suggestions about them. Great article! Thanks for sharing! Also THANK YOU for adding blogengage, I love ya!
Google measures average time on site by first collecting each visitor's exact time on a particular page. Imagine that a visitor lands on page 1 of your site. Google places a cookie, including a unique code for the visitor and a time stamp. When that visitor clicks through to page 2 of your site, Google again notes the time, and then subtracts the time the visitor arrived at page 1 from the time the visitor arrived at page 2, which gives the time spent on page 1. Google then averages the time spent on each page to get the average time each visitor spends on the site.
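Following that description, here is a rough Python sketch of the arithmetic: each page's time is the next pageview's timestamp minus the current one, and the per-page times are then averaged. The function name and the sample timestamps are invented for illustration; this is not Google's actual implementation.

```python
from datetime import datetime

def average_time_on_site(pageview_times):
    """Average the time spent on each page, where a page's time is the
    difference between the next pageview's timestamp and its own.
    The final (exit) page has no following pageview, so it contributes
    no measurable duration in this simple model."""
    durations = [
        (nxt - cur).total_seconds()
        for cur, nxt in zip(pageview_times, pageview_times[1:])
    ]
    return sum(durations) / len(durations) if durations else 0.0

visit = [
    datetime(2018, 6, 1, 10, 0, 0),   # arrives on page 1
    datetime(2018, 6, 1, 10, 2, 30),  # clicks through to page 2 (150 s on page 1)
    datetime(2018, 6, 1, 10, 6, 0),   # clicks through to page 3 (210 s on page 2)
]
print(average_time_on_site(visit))  # (150 + 210) / 2 = 180.0 seconds
```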
Hi Matt, I'm realizing now how difficult it is to run a blog while trying to promote it and carry on with your daily activities. I would say it's a full-time job. Once you think you're done learning about something, something else comes along :). My blog is about preparing for an Ironman, so I need to add the training on top of it. Thanks a lot for sharing this article with us so we can keep our focus!!!