Haven’t heard of search engine optimization yet? Search engine optimization (or SEO for short) is the practice of boosting your website’s rankings within organic search results. Techniques used to improve organic SEO include using keywords, building local citations or mentions of your business across the web, and writing content that’s valuable and relevant to real people. The goal is unique, quality content, a user-friendly website, and relevant links pointing to your site. These are just a few of the things that can influence a site’s rank in organic search.

For our client: we were lucky enough to get most of the links removed through the prior agency’s outreach, and we also went directly to many webmasters whose links we wanted removed. We did not use the Disavow tool, as it was not around when we completed this link cleanup, but as has often been said, if you are going to use the Disavow tool, use it with caution.
Lol, yeah, I am so happy it's published because it is awesome to be a part of this contest you guys are doing. I just know I will meet some great people. You should check out Website Babble; it is related to blogging and a great source of related traffic. By the way, the post was pretty long so I didn't want to stuff it with more details, but a lot of the mentioned answer sites have your links set to do-follow :) Thanks so much for the comment! Kisses!
For the short term, PPC is generally your best bet. The minute your ad runs, it delivers results. But when the budget stops, so do the benefits. Because SEO relies on your website’s content, linking structure, and metadata, developing a thorough strategy that delivers real results takes more time than setting up a PPC campaign. However, SEO’s results may be more valuable to you and your organization in the long run.
Users land on this page without tracking code. They click on a link to a deeper page which does have tracking code. From GA’s perspective, the first hit of the session is the second page visited, meaning that the referrer appears as your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as per default configuration), the session is bucketed as direct. This will happen even if the first URL is tagged with UTM campaign parameters.
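The bucketing logic described above can be sketched in miniature. This is a conceptual illustration only; the field names and structure are assumptions for the example, not GA's actual internals:

```python
# Sketch of how the first *tracked* hit of a session gets bucketed.
# REFERRAL_EXCLUSIONS mimics the referral exclusion list (your own domain
# is on it by default).
REFERRAL_EXCLUSIONS = {"example.com"}

def classify_session(first_tracked_hit):
    """Classify a session's source from its first tracked pageview."""
    referrer = first_tracked_hit.get("referrer")
    utm_source = first_tracked_hit.get("utm_source")
    if utm_source:
        return f"campaign:{utm_source}"
    if referrer is None:
        return "direct"
    domain = referrer.split("/")[2]
    if domain in REFERRAL_EXCLUSIONS:
        # Self-referral: treated as if there were no referrer at all.
        return "direct"
    return f"referral:{domain}"

# The user lands on an untracked page (any UTM parameters live on that URL
# and never reach GA), then clicks through to a tracked page. The first
# tracked hit's referrer is our own site, so the session comes out "direct".
print(classify_session({"referrer": "https://example.com/landing"}))  # direct
```

Note how the UTM parameters are lost along with the untracked first page: only what reaches the first tracked hit matters.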

If your page is ranking for a keyword like “social media management,” you might want to look into similar long-tail keywords, such as “social media management pricing” or “social media management tools.” Chances are, adding in sections that address these subjects will be fairly easy. Additionally, by adding in a few extra long-tail keywords, you’ll have the added bonus of increasing the total keyword density of the page.
If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data which is being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like Let's Encrypt, but that’s not to say you should neglect to explore the potentially significant SEO implications of site migrations. Remember, HTTPS and HTTP/2 are the future of the web.
However, this may not be the case for your company or your clients. You may start by looking at keyword rankings, and realize that you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you’ve quickly discovered your issue, and your game plan should be to invest in your core pages to get them ranking again for those core keywords.
So, you have downloaded your link profile as a CSV and you now have an extensive list of all your linking domains. If you have been doing SEO for 8+ years like me, you can probably tell which links are bad just from the TLD and URL. If you are less experienced, you can use tools such as Link Detox (http://www.linkdetox.com/) to analyze your link profile. I would always consult an expert SEO in this instance, because it is easy for these tools to misclassify good and bad links.
You can apply this to marketing in a few ways. If, for example, you purchase paid search advertising, you’ll want to make sure those “CPC” sources have generally low bounce rates. If a pay-per-click or cost-per-click campaign has a high bounce rate (1) check your landing page to make sure that it provides the content promised in your ad, (2) check your ad copy to ensure it is clear, and (3) check your keywords.
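As a concrete illustration of the bounce-rate check described above, here is a minimal sketch; the session data and field names are made up for the example:

```python
# Bounce rate for a source = single-page sessions / all sessions from it.
# Illustrative session records (not a real analytics export).
sessions = [
    {"source": "cpc", "pages_viewed": 1},
    {"source": "cpc", "pages_viewed": 4},
    {"source": "organic", "pages_viewed": 1},
    {"source": "cpc", "pages_viewed": 1},
]

def bounce_rate(sessions, source):
    matching = [s for s in sessions if s["source"] == source]
    bounces = sum(1 for s in matching if s["pages_viewed"] == 1)
    return bounces / len(matching)

print(f"{bounce_rate(sessions, 'cpc'):.0%}")  # 67%
```

A 67% bounce rate on paid clicks like this would be the cue to run through checks (1) to (3) above.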
Having large groups of content that all revolve around the same topic will build more relevance around keywords that you're trying to rank for within these topics, and it makes it much easier for Google to associate your content with specific topics. Not only that, but it makes it much easier to interlink between your content, pushing more internal links through your website.
Good point. The thing with this client is that they wanted to mitigate the risk of removing a large number of links, so high-quality link building was moved in early, before keyword research. So it is decided case by case, but it's definitely a good point: for most new clients I work with who do not have pre-existing issues, you want to do keyword research very early in the process.

Google doesn't always include a whole paragraph of text in the Featured Snippet. If you add "Step 1," "Step 2," "Step 3," etc. to the start of each HTML heading within your content (for example, within your H2 tags), Google will sometimes just list out your headings within the Featured Snippet. I've started to see this happen more and more in keywords beginning with "how to".
Hi Pavan, I would agree that it's a long post - but some posts are just worth the read no matter how long they are - especially this one since it's loaded with useful resources. I've actually bookmarked it and I plan to read it a few times over in hopes of putting these great tips to use. All in all - it's not length that matters - it's how a post is presented and the information it contains. If a writer can keep me captivated or entertained during the entire thing - then I'm willing to read it regardless of how long or short it is. Just my opinion :). Have a great week. Cheers
AccuRanker is faster, better, and more accurate for rank tracking than enterprise tools, and if you want a best-of-breed toolbox, AccuRanker is part of that solution. Instant access to bulk upload and download of thousands of keywords, covering clients' rankings and their competitors' rankings on specific chosen keywords, enabled GroupM to analyze big data within short deadlines. AccuRanker is the best in the industry when it comes to one thing: rank tracking.

Hey Marcus, thank you so much. Hearing all the compliments makes me wish I had made a list of 100 traffic sources :) I don't think anyone would publish it though...too much work, lol. I hope everyone will use it and find some fresh traffic for their blogs; most of these sources will work for any blog, but some will work better for specific niches. Thanks for sharing and appreciate the comment!
Now, there's one more little group that might fall into direct traffic -- it's something you might have heard about as "Dark Social" if you've been staying up to date on, well, dark social as a referral source. Basically, dark social is what some people are calling the social site traffic that most analytics programs can't capture since it lacks referral data, but is coming from things like emails and instant messengers (which, technically, are social mechanisms). Often, this unidentifiable traffic -- anything without a referring URL -- is also bucketed into direct traffic. Now, as BuzzFeed so eloquently puts it, "all 'dark social' traffic is direct traffic, but not all direct traffic is dark social."

Buy Spanish web traffic visitors from Google.es for 30 days – we will send real Spanish visitors to your site via Google.es's search field with your keywords to improve your SERP CTR and SEO strategy. All visitors will show as organic traffic in your Google Analytics. A constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait; use our Google España Web Traffic Service today …
The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.

WOW Brankica, I cannot believe that you also included Craigslist.org on your list! What a great post, indeed. ;) Elise added #51, email list building, so I would add #52, RSS feed submission; #53, review websites like resellerratings.com, reviewcentre.com, etc.; #54, blog search engines like Technorati. Plus I have about 40 video websites where you can upload and share your video, like break.com and blip.tv. Good luck with the contest. ;)
The "Direct Session" dimension, not to be confused with the default channel grouping or source dimension, is something different. It can tell you if a session was genuinely from its reported campaign (reported as No), or if it was simply bucketed there due to the last non-direct click attribution model (reported as Yes). I didn't mention this dimension in the article because it has some flaws which can cause brain bending complexities, plus it goes a little beyond the scope of this article, which I tried to gear more towards marketers and non-expert GA users. It's certainly an interesting one, though.
In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.
I'm not in the contest, but if I was - I'd be SKEEEEERED. This was awesome, really - there were MAYBE 3 things I knew... The rest was new. I'm lame that way. The great thing about this is that as a blogger - you've covered ideas I've never thought of...I get my traffic mostly from SEO (not on my blog, but in my websites which are product based review sites) - but there's enough meat in this post I can use for my niche sites to keep me in the black, so to speak (ink I mean, not socks). Awesome post, Brankica - I'm speechless. (If you ignore the foregoing paragraph.)

So what does this mean? “Bounce rate” can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they cannot think of a good reason to stay. There is one notable exception to this: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
That helped explain some of the organic traffic loss, but knowing that this client had gone through a few website redesigns, I wanted to make sure that all redirects were done properly. Regardless of whether or not your traffic has changed, if you’ve recently done a website redesign where you’re changing URLs, it’s smart to look at your top organic landing pages from before the redesign and double check to make sure they’re redirecting to the correct pages.
Thanks for the comment. I would not say it is impossible to create high-quality backlinks from scratch without content; you just need to review competitors' backlinks and see if there are any easy targets. We have had some good luck in the education space acquiring links on the same pages as competitors from PR5+ .edu sites. It all revolves around the outreach strategy that you put in place.
Superb resource list Brankica, thank you. I've also included it in this week's Erudition, not just because you're in a competition, but because it really is a resource we should all have bookmarked. Actually, I need to make it a study priority and see how many of the sources I can reasonably use on a regular basis. Link to Erudition | Help files from Information Junkies Anonymous

Google measures average time on site by first collecting each visitor’s exact time on a particular page. Imagine that a visitor lands on page 1 of your site. Google sets a cookie with a unique code for the visitor and records a timestamp. When that visitor clicks through to page 2 of your site, Google again notes the time, and then subtracts the time the visitor arrived at page 1 from the time they arrived at page 2. Google then averages the time spent on each page to get the average time each visitor spends on the site. (Note that the last page of a visit has no “next” timestamp, so no time on page can be recorded for it.)
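The timestamp-difference method above can be worked through with a small example; the timestamps are illustrative values, not real analytics data:

```python
# Seconds (since the session started) at which pages 1-4 were loaded.
pageviews = [0, 40, 100, 130]

# Time on each page = next page's timestamp minus this page's timestamp.
# The final page has no next hit, so no time can be recorded for it.
times_on_page = [b - a for a, b in zip(pageviews, pageviews[1:])]
print(times_on_page)       # [40, 60, 30]
print(sum(times_on_page))  # 130 seconds of measurable time on site
```

This is also why a bounced one-page visit contributes a time on site of zero: there is no second timestamp to subtract from.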
Keyword research is a very important process for search engine optimization, as it can tell you the exact phrases people use to search on Google. Whenever you write a new blog post, you should check the most popular keywords, but don’t be obsessed with the search volume. Sometimes long-tail keywords are more valuable, and you can rank for them more easily.
"We have used quite a few tools for tracking SERPs and keywords as part of our content marketing effort. And there was always something missing. It wasn’t until we found AccuRanker that we were completely satisfied. It has increased our productivity. The powerful filters, tagging, instant checks, integration with GSC, multiple URL detection per keyword, accurate search volume, and notes, are all features we now can’t live without. AccuRanker really has taken our SEO efforts to another level. We were able to grow our organic traffic by 571% in just 13 months."
I would also advise continuing to do what works. If something you have rolled out generates great traffic and links, bring out a new version of the content; for example, if the 2012 version worked effectively, bring out the 2013 version. Another effective strategy is to turn the piece of content into an evergreen article which you add to over time, so it is always up to date.
Hi Chris, "Good content" means a couple of things - good for readers and good for Google. Good content for readers means that the content answers questions, provides value, offers solutions, and is engaging. You want to keep the reader on the page and on your website for as long as possible. To make good content for Google, you have to provide the search engine with a set of signals - e.g., keywords, backlinks, low bounce rates, etc... The idea is that if you make good content for readers (engaging, valuable, actionable, and informative), your content will get more engagement. When your content gets more engagement Google will see it as good content too and put it higher in the SERPs. Making "good content" is about striking that balance. Let us know if that answered your question!