Content is one of the three main Google ranking factors. In fact, having a blog can help you get your content indexed faster. An active blog – built on quality, insightful content and backed by authoritative links – will help you improve your rankings and, in turn, your organic traffic. Long-form content (above 1,200 words) also tends to rank higher and be crawled more often.
RankBrain can have an impact on your keyword campaigns. When you define your keywords, you are looking for relevant terms that answer customers' queries. This is how a successful SEO strategy works. Logically, if you pick keywords your audience is not searching for, your campaign will have no chance of succeeding. This is where RankBrain can play a role.
This is a crucial area. If you do not have schema markup and rel="author", you are costing your business money. It is as simple as that. As an example, say I want to make spaghetti for dinner. I search for "Spaghetti Recipe" and instantly see some great markup in play, but one competitor has no markup and no rel="author" – they are losing business in my eyes. Wouldn't you agree?
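To make that concrete, here is a minimal sketch of what recipe markup could look like in JSON-LD – the recipe details below are invented placeholders, not taken from any real page:

<!-- Illustrative placeholder values only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Spaghetti",
  "author": { "@type": "Person", "name": "Example Author" },
  "prepTime": "PT15M",
  "cookTime": "PT20M",
  "recipeIngredient": ["400g spaghetti", "2 cups marinara sauce"],
  "recipeInstructions": "Boil the spaghetti, heat the sauce, combine and serve."
}
</script>

With markup like this in place, search engines can show the recipe as a rich result with details like cook time right on the results page – which is exactly the edge the unmarked competitor is giving away.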
Brankica, aloha. Yet another great article. With the in-depth info and resources you have been giving us, I don't know how you have time for anything else. This is another one that will live on my computer. You mention several words/places that I have not seen before. Look forward to exploring. Off to tweet. Will be working on a 51 for you. Take good care, my friend. Aloha. Janet
Oh, I wish you'd told me what was wrong with it :) I only discovered it recently but I am getting nice traffic from it. I hope you will let me know how it worked for you. At the moment I am posting both to my page and personal profile. I also realized that I might just leave it on the personal page (yeah, sounds weird) cause on my fan page, I kinda like to add a little comment to the post. Anyway, thanks for the comment and I will try to find your blog over there and subscribe to it on Networked Blogs.
Let's say you wish to block all URLs that have the .pdf extension, and you write in your robots.txt a couple of lines that look like this:

User-agent: Googlebot
Disallow: /*.pdf$

The "$" sign at the end tells bots that only URLs ending in .pdf shouldn't be crawled, while any other URL containing "pdf" should be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a sketch of the correct robots.txt wildcard matches and, as long as you keep track of them, you should be on the safe side of your website's traffic.
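As an illustrative sketch of those wildcard matches (the directives and paths here are made up for the example), the two special characters behave like this:

User-agent: Googlebot
# "*" matches any sequence of characters:
# blocks /private/, /private-files/, and so on
Disallow: /private*/
# "$" anchors the pattern to the end of the URL:
# blocks /docs/report.pdf but leaves /pdf-guides/ crawlable
Disallow: /*.pdf$
# everything else stays crawlable
Allow: /

Get the "*" or "$" wrong and you can accidentally block far more of your site than you intended.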
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
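As a rough sketch (the topic and URLs are invented for illustration), a physical silo might look like this:

/espresso/ (silo landing page, linked from the main navigation)
/espresso/how-to-pull-a-shot/ (subpage, links back up to /espresso/)
/espresso/grind-size-guide/ (subpage, cross-links to the other subpages)
/espresso/machine-buying-guide/ (subpage, links back up to /espresso/)

The URL paths mirror the internal linking, so the physical and virtual structure reinforce the same content theme.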
The Google, Yahoo!, and Bing search engines insert advertising on their search results pages. The ads are designed to look similar to the search results, though different enough for readers to distinguish between ads and actual results. This is done with various differences in background, text, link colors, and/or placement on the page. However, the appearance of ads on all major search engines is so similar to genuine search results that a majority of search engine users cannot effectively distinguish between the two.[1]
Hey Brankica, I'm going to have to kill a tree for this one too.... going to print it! :-) (but is it as bad if I use a piece of paper that I printed something else on the other side?) Lots of great ideas here I haven't seen as well as some reminders- thanks! Personally, my biggest challenge is consistency! Have you found that some methods take longer to see results or are there certain things you do that consistently work for you? Just curious... I would think that it might depend on the niche as well? (of course you don't know until you try it, huh? ;-)) Thanks for such a great resource- I'll definitely be sharing this with my readers too! :-) Kim

First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this fairly general so everyone can adapt this case study to their own situation.
Hi Brankica, With respect to EZA, I no longer submit to them. They rejected some of my articles, saying that the page I was linking to did not contain enough information. I linked to my blog and to the original article. I believe they didn't like the link to the original article. That being the case, they no longer allow one of the cardinal rules of syndication as per Google itself..."Link To The Original". Once they stopped allowing that, they were no longer useful to me. Thanks for the great resource. Mark
For our client: we were lucky enough to remove most of the bad links left over from the prior agency through outreach, and we also went directly to many webmasters whose links we wanted removed. We did not use the Disavow tool, as it was not around when we completed this link cleanup, but as has often been said, if you are going to use the Disavow tool, use it with caution.
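For anyone who does go that route: the disavow file itself is just a plain-text list uploaded through Search Console, with one domain or URL per line. A hedged sketch, using made-up domains:

# links from the prior agency's network; outreach was unsuccessful
domain:spammy-directory.example.com
domain:link-farm.example.net
# a single page we could not get taken down
http://blog.example.org/paid-links-page.html

The domain: prefix disavows every link from that domain, while a bare URL disavows only that specific page.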
Those two ideas are great. I didn't know you are allowed to post links to your Amazon reviews! That is great to know, and I already have some ideas in my head about this :) QR codes are definitely something to start using ASAP, too. With all the new gadgets and technology being developed, the sooner you start using them, the better results you can get :) Thanks for such an awesome comment!
The Featured Snippet section appearing inside the first page of Google is an incredibly important section to have your content placed within. I did a study of over 5,000 keywords where HubSpot.com ranked on page 1 and there was a Featured Snippet being displayed. What I found was that when HubSpot.com was ranking in the Featured Snippet, the average click-through rate to the website increased by over 114%.

Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don't want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to cover two common yet critical robots.txt mistakes.
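First, as a quick refresher, a minimal robots.txt – placed at the root, e.g. https://www.example.com/robots.txt, with a placeholder domain and path – could look like this:

User-agent: *
# keep crawlers out of a private section (hypothetical path)
Disallow: /admin/
# everything else may be crawled
Allow: /

Small as this file is, it is read by every major crawler, which is exactly why small mistakes in it are so costly.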
The truth is that a major problem for search engines is determining the original source of content that is available on multiple URLs. If you serve the same content over http as well as https, you will "confuse" the search engine, which can penalize you, and you will suffer a traffic loss. This is why it's highly important to use rel="canonical" tags. What exactly is this?
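In short, it is a tag placed in the <head> of a page that tells search engines which URL is the preferred, "original" version. A minimal sketch, using a placeholder domain:

<!-- on both the http and https versions of the page -->
<link rel="canonical" href="https://www.example.com/product-page/" />

Both versions then consolidate their ranking signals onto the single canonical URL instead of competing with each other.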
Wonderful little article, with lovely detail on the possible sources of direct traffic, which can be positives or negatives for your overall SEO strategy. Direct traffic could be the prime conversion driver for your eCommerce website, or it could point to a flaw in your website as a whole. It also looks like direct traffic correlates with organic traffic: as organic traffic increases, your overall direct traffic increases along with it. Do some kinds of bot activity lead to direct traffic as well?
Would you mind sharing your stumbleupon user name with me? I would love to follow you and your shares! :) I am beginning to get more active on SU and it is definitely paying off. I have had a few posts in the last several weeks that have brought in 600-700 visits each. :) In regards to your "challenge" I am going to try out paper.li I will let you know how it goes.
The other big search engine people recommend is, of course, Bing. Bing and Yahoo have something of an alliance, with Yahoo taking their data primarily from the Bing index, so appealing to either one is the same as appealing to both. SEO for Bing is a little different than it is for Google, though. Exact match keywords tend to have greater weight, for one thing. Bing also puts a bit more emphasis on links from .edu and .gov sites.

You are giving some great advice to use for increasing readership and loyal followers. I am trying to include 2-3 sources from each of your list categories so that I have a balanced approach to increasing traffic. Right now I am learning StumbleUpon and Squidoo.com. Although I submit articles to Ezine, I think my keywords are not strong enough to get a good click through rate. I am getting better about writing articles with strong keywords now; sort of fell off the pace of that when I started using a lot of guest authors :( Thanks for your great insights and check list for increasing traffic, Brankica!
Note the penultimate processing step (previous campaign within timeout), which has a significant impact on the direct channel. Consider a user who discovers your site via organic search, then returns via direct a week later. Both sessions would be attributed to organic search. In fact, campaign data persists for up to six months by default. The key point here is that Google Analytics is already trying to minimize the impact of direct traffic for you.
Have you considered traffic exchanges, or blog exchanges? After submitting your URL and signing up to about 10 exchanges, it takes about an hour to click through all 10 surf sites at a time. But it does give you some good traffic. I recently started doing it and I am now getting around 200 visitors a day. That is just from traffic exchanges. Not bad for just one type of traffic source.
In the example below, Google's organic results generated the most site traffic for the time period shown, but had a bounce rate greater than the site average. By contrast, direct traffic generated fewer visits but had a lower-than-average bounce rate. A page from another website generated 5,946 visits but had a bounce rate 22.62 percent below the site average.
Hey Jym, thanks a bunch :) Yeah, I don't think everyone needs to use each and every one of these, especially at the beginning. But I do find the list useful for those with "older" blogs, when you are thinking "Where else can I go to get more people to see my blog?" That is what I do with my first website. Again, I agree with the part about identifying the ones that will work best, so we don't spend too much time getting no results. Thanks so much for the comment
This is the number of views that you can test each month on your website. It's up to you how you choose to use them: allocate all the views to one test or to multiple tests, on one page or on multiple pages. If you have selected the 10,000 tested-views plan and you run an experiment on your category page, which is viewed 7,000 times per month, then at the end of the month 7,000 views will be counted against your tested-views quota.