Well, as noted in the post, it is not just about the links; those were only one key part of a wider strategy. The website in question has deep levels of content, so it is not just about a blog section: they have numerous high-quality content sections that we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.

Search engine optimisation is tricky for any business, but you’ve got a real challenge on your hands as a startup. You need to make an impact fast, get things moving and start building traction before those limited funds run out, which is probably why a lot of startups take shortcuts with SEO, hoping to cut a few corners and speed their way to search-ranking glory.
If you check out some of the suggestions below, though, you're likely to find some opportunities. You can also plug in a few variations of the question to find search volume; for example, I could search for "cup of java" instead of "what is the meaning of a cup of java", and I'd get a number of keyword opportunities that I can align to the question.
But now, that is all changing. We can still see all of that information... sometimes. Things got a whole heck of a lot more complicated a little over a year ago, when Google introduced SSL encryption. What does that mean? It means Google started encrypting keyword data for anyone conducting a Google search while logged in to Google. So if you're logged in to, say, Gmail when you do a search, the keyword you used to arrive on a website is now encrypted, or hidden. So if that website drilled down into its organic search traffic source information, it would see the keyword reported only as "(not provided)".
Using the Content Assistant feature of Cognitive SEO, I’ve included the majority of the keywords suggested by the tool in the content assets we’re aiming to increase in organic visibility. I’ve also restructured the underperforming content and added new sections to make the inclusion of the suggested key phrases more logical and natural, and to create more relevant content.
Looking at the keyword rankings and organic landing pages provided a little insight into the organic traffic loss, but nothing definitive. Because of this, I moved on to the on-page metrics for further clarity. To be clear, when I talk about on-page metrics, I’m talking about bounce rate, page views, average page views per session, and time on site.
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on the subject, there are still many ways a robots.txt mistake can cause an organic search traffic drop. Below are two common yet critical mistakes.
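To make these mistakes concrete, here is a minimal sketch using Python's built-in urllib.robotparser. The robots.txt contents and URLs are hypothetical, but the two patterns shown, a site-wide Disallow left over from staging and a directory rule that is broader than intended, are classic causes of exactly this kind of traffic drop.

from urllib.robotparser import RobotFileParser

def crawlable(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if `agent` may fetch `url` under this robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# Mistake 1: a site-wide Disallow accidentally carried over from staging.
staging_leftover = "User-agent: *\nDisallow: /\n"
print(crawlable(staging_leftover, "https://example.com/blog/seo-tips"))  # False

# Mistake 2: a rule broader than intended; blocking /blog also blocks
# every post underneath it, not just the /blog index page.
too_broad = "User-agent: *\nDisallow: /blog\n"
print(crawlable(too_broad, "https://example.com/blog/seo-tips"))  # False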
After adjusting that, the table refreshes again and I’m looking at a month-by-month summary of my Organic Traffic. Hover your mouse over any single month dot to view a summary of that month’s numbers. In this particular example, we can see that recently there’s been an increase in Organic Traffic. January had 6,630 organic sessions, February (short month) had 5,982 and then March came in strong with 7,486 organic sessions. This information lets us know that something on the site is performing better than usual in March. In most cases, this means that either interest in a topic has increased or the website has begun to rank better in the search engines for specific keywords. In the next section we’ll begin to break this down further.
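Before reading too much into that February dip, by the way, it helps to normalize each month by its number of days. A quick back-of-the-envelope sketch in Python, using the session counts quoted above:

# Monthly organic sessions from the report above, divided by days in the month.
sessions = {"January": 6630, "February": 5982, "March": 7486}
days = {"January": 31, "February": 28, "March": 31}

for month, total in sessions.items():
    print(f"{month}: {total} sessions, {total / days[month]:.1f} per day")

# January: 6630 sessions, 213.9 per day
# February: 5982 sessions, 213.6 per day  <- the dip is just a shorter month
# March: 7486 sessions, 241.5 per day     <- a genuine ~13% per-day lift

Seen per day, February performed almost exactly like January; only March reflects a real change in performance.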
Implementing structured data markup (such as that from schema.org) might seem like a one-time project, but that “set it and forget it” mentality can land you in hot water. You should monitor the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information: as you change the content on your website, the markup can change along with it, without warning.
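One lightweight way to run that monitoring is to fetch each key page on a schedule and compare the JSON-LD it serves against the values you expect. Here is a minimal sketch using only the Python standard library; the URL, the @type and the expected product name are hypothetical placeholders.

import json
import urllib.request
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collects every <script type="application/ld+json"> block on a page."""
    def __init__(self):
        super().__init__()
        self.blocks = []
        self._buffer = None  # non-None while inside a JSON-LD script tag

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._buffer = []

    def handle_data(self, data):
        if self._buffer is not None:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._buffer is not None:
            self.blocks.append(json.loads("".join(self._buffer)))
            self._buffer = None

html = urllib.request.urlopen("https://example.com/product/widget").read().decode("utf-8")
extractor = JSONLDExtractor()
extractor.feed(html)

# Alert when the markup no longer matches what the rich snippet should show.
for block in extractor.blocks:
    if block.get("@type") == "Product" and block.get("name") != "Acme Widget":
        print("WARNING: Product markup drifted:", block.get("name"))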
Brankica, I like your point about being a master of catchy titles when using CommentLuv. I can also see that you use a different link whenever possible when replying to comments here. That just proves you know what you are talking about: great insight into getting traffic from multiple sources and looking for alternative traffic, not just thinking about getting visitors from search engines.

Keyword research is a very important process for search engine optimization, as it can tell you the exact phrases people are using to search on Google. Whenever you write a new blog post, you should check the most popular keywords, but don’t be obsessed with search volume. Sometimes long-tail keywords are more valuable, and you can rank for them more easily.


I don't generally think of my own list as a traffic source, because those are people who subscribed and keep coming back, mostly because of good content. This is more of a tutorial about where to get fresh traffic in case a person is not using some of these sources already. Where have you guest posted? I haven't seen any; I would like to read some of those :)
We want you to be able to make the most of AccuRanker, which is why we have a dedicated team to help you with any questions you have. We offer complimentary one-on-one walkthroughs where our Customer Success Manager will take you through our software step by step. We also have a chat function where you can ask us questions and give us your feedback and suggestions.
Another big benefit of using social media is that it can help you gain more influence and grow your business across the board. And although social media has little direct influence on your search engine ranking, it can help your SEO indirectly: by helping to grow your business and your website, gaining more traffic, more backlinks and so on, it will improve your website's domain authority and, by extension, its search engine ranking.

We use automated methods and search engine marketing methods to provide you with visits to your links. Our algorithm uses public search services; we are not affiliated with the search engine companies in any way. RealVisits.org will not accept websites that contain or promote illegal, adult or offensive content. We reserve the right to reject any website that we feel does not meet our standards. There is no risk of getting your page banned or penalized by Google, and it is 100% AdSense safe. We sell our services only for business marketing purposes. By purchasing and using our services, you agree to our Terms and Conditions.
Look at the biggest startup success stories and analyse their content. Airbnb positions itself as the place to find unique travel experiences with locals, Uber is the cheap, on-demand taxi alternative, and Skillshare carved its niche as a free learning exchange platform. They all address their audience's concerns in a way nobody else does, and their content has a friendly, approachable tone, even when we’re talking about software startups like IFTTT or Slack.
One of the things that can slow-bleed the traffic from your site is the quality (or lack thereof) of the content you publish. Google updates like Panda were released specifically to deal with the issue of low-quality content on websites. Long story short: Panda was intended to stop sites with bad content from appearing in the search results.
Cheers for sharing that thread; I'd not read it. I think most of the confusion here arises from how standard GA reports work on a last non-direct click basis: if there's a previous campaign within the timeout window, that user will be attributed to that campaign, even if they're technically returning via direct. MCF reports are the exception, of course, given that they show *actual* direct.
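For anyone who hasn't internalized that model, here is a toy sketch of last non-direct click attribution. The visit data is made up for illustration; the six-month figure is GA's default campaign timeout.

from datetime import datetime, timedelta

CAMPAIGN_TIMEOUT = timedelta(days=180)  # GA's default campaign timeout (~6 months)

def attribute(sessions):
    """Last non-direct click: a direct session inherits the most recent
    non-direct source, provided it falls within the campaign timeout."""
    results, last_non_direct = [], None
    for when, source in sessions:
        if source != "direct":
            last_non_direct = (when, source)
            results.append((when, source))
        elif last_non_direct and when - last_non_direct[0] <= CAMPAIGN_TIMEOUT:
            results.append((when, last_non_direct[1]))  # inherits the campaign
        else:
            results.append((when, "direct"))  # *actual* direct, as in MCF reports
    return results

visits = [
    (datetime(2024, 1, 5), "email_campaign"),
    (datetime(2024, 1, 9), "direct"),  # within timeout: reported as email_campaign
    (datetime(2024, 8, 1), "direct"),  # timeout expired: reported as direct
]
for when, source in attribute(visits):
    print(when.date(), source)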

At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level: these sites aren’t relying as much on Google for traffic any more. In fact, some of them are getting less than 10 percent (or only slightly more) of their traffic from the search giant’s organic results.
Here is a common narrative that many e-tailers can relate to: you identified your “sweet spot” in the marketplace and know that charging above this threshold leads to price sensitivity. Your core products drive volume, which allows you to achieve amazing growth. Then, one day, your focus shifts. Maybe you stop churning out iterations of your best sellers, or maybe you try to focus on your higher-revenue products, all the while alienating the people who liked your previous offerings.
It increases relevancy: siloing ensures all topically related content is connected, and this in turn drives up relevancy. For example, linking to each of the individual yoga class pages (e.g. Pilates, Yoga RX, etc.) from the “Yoga classes” page helps confirm, to both visitors and Google, that these pages are in fact different types of yoga classes. Google can then feel more confident ranking these pages for related terms, as it is clearer that the pages are relevant to the search query.
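As a toy illustration of that structure (the paths are hypothetical), a silo is essentially a hub-and-spoke link graph, and you can sanity-check that no spoke has lost its link back to the hub:

# Hypothetical silo: the hub links down to each class page, and every
# class page links back up to the hub, keeping the topic cluster connected.
internal_links = {
    "/yoga-classes/": ["/yoga-classes/pilates/", "/yoga-classes/yoga-rx/"],
    "/yoga-classes/pilates/": ["/yoga-classes/"],
    "/yoga-classes/yoga-rx/": ["/yoga-classes/"],
}

hub = "/yoga-classes/"
for page in internal_links[hub]:
    if hub not in internal_links.get(page, []):
        print(f"{page} breaks the silo: no link back to {hub}")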