I’ve always believed that hard work gets the best results, and in practice that has always proven true. On the web it’s no different. If you want more organic traffic, you have to work for it. That means giving your best effort every time, going after opportunities your competitors have missed, being consistent, guest blogging strategically, and staying on Google’s good side.

Clean, fast code is important, and you need to be aware of this if you’re using WordPress themes or other CMS platforms that typically come with a lot of bloated code. Despite its chunky build, one of the key benefits of WordPress is its template structure, which lets you create new pages at the push of a button and produce content in a visual interface rather than coding everything yourself.
Content is one of the three main Google ranking factors. As a matter of fact, having a blog can help you get your content indexed faster. An active blog – relying on quality, insightful content, and backed by authoritative links – will help you improve your rankings and thus your organic traffic. Also, long-form copy (above 1,200 words) tends to rank higher and be crawled more often.
So, Google has accepted the reconsideration request, and you can now move forward with a high-quality link building and content creation strategy. I see everyone creating threads about great content marketing examples, but the problem is that most of the time these are big-business examples. SMEs and start-ups do not have big dollars to do such things, so the next best thing is to create a content marketing calendar for your clients.
Hey Delena, I am not worried about you stalking me, I like it :) I think you can advertise any service on Craigslist. The idea you have is awesome. Not only can you point your ad to your blog, but you can also make some extra cash. And instead of just going locally, I am sure there is something you can do worldwide, like giving people advice over Skype or something. Thanks for the awesome comment and the twithelp suggestion.

Hey Michael, thanks so much for the comment. After this post, I decided to revamp my traffic generation strategy, work more on my existing sources, and put a bit more into the ones I was not using that much. I saw results almost from the first day. This list is good because of the diversity of the ideas, 'cause if one thing doesn't work for you, there has to be another one that will skyrocket your stats :)

Brankica, what a valuable post you've contributed here! These are all great methods for driving traffic to your website. Here are two more suggestions: 1. Write Amazon reviews on products/books related to your website and sign those comments with your name, the name of your blog, and its URL. You can even do video reviews now and mention your blog as part of your qualifications to review a particular book or product. 2. Put QR codes on flyers. People can scan these with their phone and be sent directly to your blog. I'm seeing these all over the city lately, linking to things like bus schedules, Foursquare pages, and whatnot. Once again, thanks for this post!


Hey Jeevan, forums can ban people for promotion, but most of them only do it for certain types of behavior. For example, some of them don't allow signatures until a certain number of posts. All of them will allow related links if they help answer the question. I have never been banned from a forum and try to play by their rules. Thanks for the compliment, and I think you have a great contestant as well – I RTed it the other day 'cause it is really useful!
The amount of dark social that comprises one's direct traffic is going to vary. For example, you might be diligent about incorporating tracking tokens into your email marketing campaigns, and asking your co-marketing partners to do the same when they promote your site content. Great. You won't have a huge chunk of traffic dumped into direct traffic that should really go into email campaigns. On the other hand, perhaps you're embroiled in a viral video firestorm, and a video on your site gets forwarded around to thousands of inboxes ... you don't exactly have control over that kind of exposure, and as a result you'll be seeing a lot of traffic without referral data in the URL that, consequently, gets bucketed under direct traffic. See what I mean? It depends.
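If you do want to keep those email clicks out of the direct bucket, tagging the links is straightforward string work. Here is a minimal Python sketch for appending tracking tokens to a URL – the URL and the campaign names are hypothetical, so swap in your own naming convention:

    from urllib.parse import urlencode

    base_url = "https://www.example.com/blog/awesome-post"  # hypothetical post URL
    utm_params = {
        "utm_source": "newsletter",     # where the link lives
        "utm_medium": "email",          # the channel
        "utm_campaign": "spring_promo"  # hypothetical campaign name
    }

    tagged_url = base_url + "?" + urlencode(utm_params)
    print(tagged_url)
    # https://www.example.com/blog/awesome-post?utm_source=newsletter&utm_medium=email&utm_campaign=spring_promo

Any analytics package that parses UTM parameters will then attribute those sessions to the email campaign instead of lumping them into direct traffic.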
There are always high profile blogs in your industry, no matter what that industry is. It might be sites like Business Insider, Forbes, and Inc. It might be sites like Medium and Gawker. It might be sites like Search Engine Journal and QuickSprout. The fact is, every industry has its dominant forces, and as long as you’re not the dominant blog, you can use the dominant blogs as traffic sources for your own.

Lol, start from the middle :) Yeah, Squidoo is doing great, and I know they are cleaning out all the time when they see a lousy lens. They have angels over there that report stuff like that (I am an angel myself on Squidoo :) And it is not even that hard to make money over there, which is why I always suggest it to beginners who don't have money to invest in blogs and sites.
Let’s say you wish to block all URLs that have the .pdf extension, so you write in your robots.txt a pair of lines that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf shouldn’t be crawled, while any other URL that merely contains “.pdf” still can be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list with correct robots.txt wildcard matches and, as long as you keep it in mind, you should be on the safe side of your website’s traffic.
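To make the matching rules concrete, here are a few patterns the way Googlebot interprets them – the paths are made-up examples, not recommendations for any particular site:

    User-agent: Googlebot
    # Ends-with match: blocks /report.pdf but not /report.pdf?page=2
    Disallow: /*.pdf$
    # Contains match: blocks any URL with /private/ anywhere in its path
    Disallow: /*/private/
    # Plain prefix match (no wildcard needed): blocks /temp, /temp/ and /template
    Disallow: /temp

Test any rule in Google Search Console’s robots.txt tester before deploying it; one overly greedy pattern can block far more than you intended.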
With the exception of crude oil and Picassos, very few industries are “recession-proof” and experience an inelastic product demand. Look at how your competitors are faring, and see if they’re experiencing the same problems. While you should take Google Trends data with a grain of salt, looking at the bigger picture may help provide some clarity. I’d suggest taking this a step further by conducting trends research and reading industry reports.
Put simply, duplicate content is content that appears on the web in more than one place. Is this such a big problem for readers? Maybe not. But it is a very big problem for you if you have duplicate content (internally or externally), because the search engines don’t know which version of the content is the original and which one they should be ranking.
Input your post’s URL into SEMrush to discover the keywords it’s already ranking for. Are you ranking just off the first page of the SERPs for any keywords with high search volume? Take a look at keywords ranking in positions 2-10 and try optimizing for these first – moving from third to first position for a high-volume term can drastically increase organic traffic. Plus, it’s easier to bump a page up the SERPs when it’s already ranking for that keyword.
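If you export those rankings, a few lines of Python can surface the positions-2-10 opportunities for you. This sketch assumes a CSV export with “Keyword”, “Position”, and “Search Volume” columns – adjust the file name and column names to match whatever your export actually uses:

    import csv

    candidates = []
    with open("rankings.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            position = int(row["Position"])
            volume = int(row["Search Volume"])
            if 2 <= position <= 10:  # just off the top spot
                candidates.append((row["Keyword"], position, volume))

    # Highest-volume opportunities first: these are the terms worth optimizing
    for keyword, position, volume in sorted(candidates, key=lambda c: -c[2]):
        print(f"{keyword}: position {position}, volume {volume}")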
Hey Rob, thanks for the awesome comment, and sorry for flooding your inbox, lol. Also, thanks for the great words, I really appreciate it! I can't wait to see your numbers after 2 weeks. You will tell me which 5 you chose, and if they don't work as expected I will help you choose others that might work better for you. It depends a lot on the niche, but you can count on me helping you find the best ones for you :)

The "Direct Session" dimension, not to be confused with the default channel grouping or source dimension, is something different. It can tell you if a session was genuinely from its reported campaign (reported as No), or if it was simply bucketed there due to the last non-direct click attribution model (reported as Yes). I didn't mention this dimension in the article because it has some flaws which can cause brain bending complexities, plus it goes a little beyond the scope of this article, which I tried to gear more towards marketers and non-expert GA users. It's certainly an interesting one, though.
While it’s best to create a custom dimension for filtering out bots, applying the generic bot filter is a good place to start. It’s important to note that filters cannot be applied retroactively, so if you’ve recently turned on this feature, expect your reported traffic to drop a little. Additionally, double-check that you are filtering out your own traffic and IP address.
Now that we have the data we want in the chart, we use the advanced search to filter it down to only the traffic we want to see. Click the blue “Advanced” link beside the search bar that sits just to the top right of your list of landing pages. This will open the Advanced search screen, where we want to set up our query. In the green drop-down, choose “Medium”, and in the text box at the end of the row, type “organic”. Click the Apply button below the query builder to apply this search.
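The same filter can be pulled programmatically if you’d rather not click through the UI each time. Below is a minimal sketch against the Universal Analytics Reporting API v4 – the key file path and view ID are placeholders you would replace with your own:

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    KEY_FILE = "service-account.json"  # hypothetical credentials path
    VIEW_ID = "123456789"              # replace with your GA view ID

    creds = service_account.Credentials.from_service_account_file(
        KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"])
    analytics = build("analyticsreporting", "v4", credentials=creds)

    # Same query as the Advanced search: landing pages where medium is "organic"
    response = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
            "dimensions": [{"name": "ga:landingPagePath"}],
            "metrics": [{"expression": "ga:sessions"}],
            "filtersExpression": "ga:medium==organic",
            "orderBys": [{"fieldName": "ga:sessions", "sortOrder": "DESCENDING"}]
        }]
    }).execute()

    for row in response["reports"][0]["data"].get("rows", []):
        print(row["dimensions"][0], row["metrics"][0]["values"][0])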
Hi Chris, "good content" means a couple of things – good for readers and good for Google. Good content for readers means that the content answers questions, provides value, offers solutions, and is engaging. You want to keep the reader on the page and on your website for as long as possible. To make good content for Google, you have to provide the search engine with a set of signals – e.g., keywords, backlinks, low bounce rates, etc. The idea is that if you make good content for readers (engaging, valuable, actionable, and informative), your content will get more engagement. When your content gets more engagement, Google will see it as good content too and put it higher in the SERPs. Making "good content" is about striking that balance. Let us know if that answered your question!