If you want to know how engaging your content is, you might look at bounce rates, average time on page and the number of pages visited per session to get an idea – and Google can do precisely the same. If a user clicks through to your site, quickly returns to the results page and clicks on another listing (called “pogo-sticking”), it suggests you haven’t provided what this person is looking for.
So, you have downloaded your link profile as a CSV and you now have an extensive list of all your linking domains. If you have been doing SEO for 8+ years like me, you can probably tell at a glance which links are bad just from the TLD and URL. If you are less experienced, you can use tools such as Link Detox (http://www.linkdetox.com/) to run an analysis of your link profile. I would always consult an expert SEO in this instance, because it is easy for these tools to mistake good links for bad ones.
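As a rough illustration of that first-pass triage, here is a minimal Python sketch that scans an exported backlink CSV and flags links whose domain ends in a TLD you consider risky. The `URL` column name and the TLD list are assumptions, not part of any particular tool's export format; adapt both to your own file and audit criteria.

```python
import csv
from urllib.parse import urlparse

# Hypothetical list of TLDs that often signal low-quality links;
# adjust this to your own audit criteria.
SUSPICIOUS_TLDS = {".xyz", ".top", ".biz", ".info"}

def flag_suspicious(csv_path, url_column="URL"):
    """Return the URLs whose linking domain ends in a suspicious TLD."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column]).hostname or ""
            if any(host.endswith(tld) for tld in SUSPICIOUS_TLDS):
                flagged.append(row[url_column])
    return flagged
```

This only narrows the list down for manual review; as noted above, a human expert should make the final call on each link.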
Hi Brankica, wow, what an extensive list. As a one-month-old blogger I find this list overwhelming, so there are four things I have noted and will look into further: StumbleUpon, Digg, roundups and Vark. I hope to implement them soon and will keep track of the results. As for the other sources, I will come back to this post to see which ones to use next. I am curious to know from your experience: what are the Top 10 most effective traffic sources?
This topic actually seems quite controversial. Google answered the question with what could be taken as a denial, but their answer was somewhat open to interpretation. On the other hand, there are studies (one of them from Moz) showing that linking out has an impact. So, how can you be so assertive? Is this something that comes out of your own experiments?
The term “organic” refers to something having the characteristics of an organism. Although black hat SEO methods may boost a website’s search engine page rank in the short term, these methods could also get the site banned from the search engines altogether. However, it is more likely that readers will recognize the low quality of sites employing black hat SEO at the expense of the reader experience, which will reduce the site’s traffic and page rank over time.
So what does this mean? “Bounce rate” can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they cannot think of a good reason to stay. There is one notable exception to this: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
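As a quick back-of-the-envelope definition (not tied to any specific analytics tool, which may each compute it slightly differently), bounce rate is the share of sessions that viewed only one page:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Fraction of sessions that left after viewing a single page."""
    if total_sessions == 0:
        return 0.0  # no traffic at all; avoid dividing by zero
    return single_page_sessions / total_sessions

# Example: 420 of 1,000 sessions viewed only one page -> 0.42 (42%).
```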
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates the parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we list two common yet critical mistakes when it comes to robots.txt.
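To make the risk concrete, here is a hypothetical before/after robots.txt illustrating the kind of mistake we mean: a single stray `Disallow: /` blocks crawlers from the entire site, while a scoped rule blocks only what you intend. The directory names are examples only.

```
# MISTAKE: this one rule blocks the ENTIRE site for all crawlers
User-agent: *
Disallow: /

# INTENDED: block only the specific directories you want hidden
User-agent: *
Disallow: /admin/
Disallow: /checkout/
```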
To find the right people I downloaded a list of some of the most popular users within the community. To do this, I used Screaming Frog SEO Spider to gather a list of all the URLs on the website. I then exported this list into an Excel spreadsheet and filtered the URLs to only show those that were user profile pages. I could do this because all of the profile pages had /user/ within the URL.
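The same filtering step can also be done without Excel. Here is a small Python sketch, assuming the export's URL column is named `Address` (Screaming Frog's default) and that, as above, every profile page contains `/user/` in its URL:

```python
import csv

def profile_urls(csv_path, url_column="Address"):
    """Return only the URLs from a crawl export that look like
    user profile pages (i.e. contain /user/ in the path)."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return [row[url_column]
                for row in csv.DictReader(f)
                if "/user/" in row[url_column]]
```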
One of the things that can slow-bleed the traffic from your site is the quality (or lack thereof) of the content you publish there. Previous Google updates like Panda were released specifically to deal with the issue of low-quality content on websites. Long story short: Panda was intended to stop sites with bad content from appearing in the search results.
On the other hand, structured data is also a real asset for increasing your organic traffic and improving your CTR. It refers to values that help search engines categorize and index your content in creative ways for the user. While there is no direct correlation between structured data and an SEO improvement, it can really help you boost your visibility in SERPs.
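For illustration, here is a minimal JSON-LD snippet of the kind search engines read. All of the values (headline, author, date) are placeholders, and the schema.org `Article` type is just one of many types you might mark up:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Placeholder headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-01-01"
}
</script>
```

Snippets like this go in the page's `<head>` or `<body>` and tell search engines what the page is about, which can make it eligible for richer listings in the SERPs.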
It works in a similar way to a canonical tag, which shows when a duplicate version of a page exists. In this case, though, it helps Google index local content more easily within local search engines. It also helps pass trust across sites and improves the way Google crawls these pages. To learn more about the errors you should avoid when it comes to hreflang tags, you can check our previous article.
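As a sketch of what this looks like in practice (the URLs and locales below are hypothetical), each language version of a page declares its alternates in the `<head>`:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Every version listed should carry the same set of tags, including a self-reference; the `x-default` entry tells Google which page to serve users who match none of the listed locales.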
You are giving some great advice to use for increasing readership and loyal followers. I am trying to include 2-3 sources from each of your list categories so that I have a balanced approach to increasing traffic. Right now I am learning Stumble Upon and Squido.com. Although I submit articles to Ezine, I think my keywords are not strong enough to get a good click-through rate. I am getting better about writing articles with strong keywords now; I sort of fell off the pace of that when I started using a lot of guest authors :( Thanks for your great insights and checklist for increasing traffic, Brankica!
You know, a lot of blog posts promise to show you how to grow your blog and bring traffic in. And then they go and copy/paste a bunch of other bloggers, change two or three words, and then publish. Then wonder why no-one read, commented or shared. Somehow I don't think you'll have this problem, Brankica - this is one of the most solid pieces on traffic generation around. Great stuff, miss, and something every blogger can take something away from.
Hey Julia, thanks so much for the comment. I just saw the idea you were talking about, that is awesome. I think you are right, because although it seems a bit inconvenient, I have personally written down a few URLs like that in my cell phone or notebook to check out when I am back at the computer. Wow, this is something I need to think about. Also, how about those big billboards?
This is the number of views that you can test each month on your website. It's up to you how you use them: you can allocate all the views to one test or to multiple tests, on one page or on multiple pages. If you have selected the 10,000 tested views plan and you run an experiment on a category page that is viewed 7,000 times per month, then at the end of the month 7,000 views will be counted against your tested views quota.