One of the things that can slowly bleed traffic from your site is the quality (or lack thereof) of the content you publish. Previous Google updates like Panda were released specifically to deal with low-quality content on websites. Long story short: Panda was intended to stop sites with bad content from appearing in the search results.
I've started a fairly new blog, aimed at fairly new bloggers, by a fairly new blogger! I've had another blog up and running for about four months and it's building up quite well, but I've learned a lot from you here, and am looking forward to trying all the ideas that are new to me. You've been so thorough that I can't suggest a 51, but perhaps a 50a - I use Tweetadder with my Twitter accounts; it's a brilliant tool for automating tasks and driving traffic. I won't spam the comment with a link here, but there is one on my blog for anyone who is interested. Thanks again for all those ideas, cheers, Brent
Wow Brankica! I just don't know how I missed this awesome post :) I have just been going through it for quite some time and finally bookmarked it for future reference whenever I need it - it is that good! Awesome indeed :) I loved the way you shared all the ways you can get traffic, and even though I am doing most of these things, there is so, so much more that I need to work on, and some others I need to look at differently. Thanks so much for sharing, and wishing you the best for the contest :)
Users land on this page without tracking code. They click on a link to a deeper page which does have tracking code. From GA's perspective, the first hit of the session is the second page visited, so the referrer appears to be your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as it is by default), the session is bucketed as direct. This will happen even if the first URL is tagged with UTM campaign parameters.
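The attribution rule described above can be sketched in a few lines of Python. This is a simplified illustration, not GA's actual implementation: it assumes the referrer of the first *tracked* hit decides the source, and that referrers on the exclusion list fall back to direct.

```python
# Simplified sketch of GA session attribution (not Google's real code):
# the referrer of the first tracked hit determines the source, and any
# referrer host on the exclusion list is treated as direct traffic.
from urllib.parse import urlparse

def classify_session(first_tracked_referrer, exclusion_list):
    """Return the traffic source the session would be bucketed as."""
    if not first_tracked_referrer:
        return "direct"
    host = urlparse(first_tracked_referrer).netloc
    if host in exclusion_list:
        # Self-referrals are excluded, so the session falls back to direct.
        return "direct"
    return f"referral: {host}"

# The scenario above: the user lands on an untagged page, then clicks
# through to a tagged page, so the first tracked hit's referrer is our
# own (hypothetical) domain -> the session shows up as direct.
print(classify_session("https://www.example.com/landing", {"www.example.com"}))
```

With the same function, a genuine external referrer (e.g. a Google result page) would be reported as a referral, which is why the missing tracking code on the landing page distorts the numbers.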
I’ve always been a believer that hard work gets the best results, and in practice it always ends up being true. On the web it’s no different. If you want more organic traffic, you have to work for it. That means giving your best effort every time, going after opportunities your competitors have missed, being consistent, guest blogging strategically, and staying on Google’s good side.
But now, that is all changing. We can still see all of that information ... sometimes. Things got a whole heck of a lot more complicated a little over a year ago, when Google introduced SSL encryption. What does that mean? It means Google started encrypting keyword data for anyone conducting a Google search while logged in to Google. So if you're logged in to, say, Gmail when you do a search, the keyword you used to arrive on a website is now encrypted, or hidden. So if that website drilled down into its organic search traffic source information, it would see something like this:
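As a hypothetical illustration of what that drill-down looks like, the snippet below aggregates some made-up organic sessions: searches from logged-in users arrive with the keyword stripped, so they all collapse into a single "(not provided)" row, which is exactly the row site owners started seeing dominate their reports.

```python
# Toy keyword report built from made-up session data, to show how
# encrypted searches collapse into one "(not provided)" bucket.
from collections import Counter

sessions = [
    {"source": "google", "keyword": None},         # logged-in user, keyword hidden
    {"source": "google", "keyword": "buy shoes"},  # logged-out user, keyword visible
    {"source": "google", "keyword": None},         # logged-in user, keyword hidden
]

report = Counter(
    s["keyword"] or "(not provided)"
    for s in sessions
    if s["source"] == "google"
)
print(report.most_common())
# -> [('(not provided)', 2), ('buy shoes', 1)]
```

The more searchers are logged in, the larger that "(not provided)" bucket grows relative to everything else.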
For our client: We monitored everything on a daily basis. If something came up that needed to be fixed, we were quick to implement the fix with the development team at the business. We also re-ran several campaigns that had generated significant traffic the first time around; repeating what worked was second nature.
To sum up all of this information: even organic traffic, like direct traffic, has some gray areas. For the most part, though, organic traffic is driven by SEO. The better you rank for competitive keywords, the more organic traffic you will see. Websites that consistently create content optimized for search will see a steady increase in organic search traffic and improved positioning in the search results. As a marketer, it is important to look at your keywords and high-ranking pages to identify new SEO opportunities each month.