Oh, I wish you had told me what was wrong with it :) I only discovered it recently but I am getting nice traffic from it. I hope you will let me know how it worked for you. At the moment I am posting both to my page and my personal profile. I also realized that I might just leave it on the personal page (yeah, sounds weird) because on my fan page I kinda like to add a little comment to the post. Anyway, thanks for the comment, and I will try to find your blog over there and subscribe to it on Networked blogs.
For our client: We rolled out a successful implementation of rel="author" for the company's three in-house content writers. The client had over 300 articles written by these content writers over the years, and we were able to implement rel="author" across all of the older articles. I advise anyone who has a large body of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, as it can only help CTR.
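For reference, a typical per-article rel="author" implementation looks like the sketch below; the author name and profile URL are placeholders, not the client's actual markup.

```html
<!-- Option 1: link element in the <head> of each article page (placeholder URL) -->
<link rel="author" href="https://www.example.com/authors/jane-doe/">

<!-- Option 2: a visible byline link within the article body -->
<p>Written by <a rel="author" href="https://www.example.com/authors/jane-doe/">Jane Doe</a></p>
```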
Implementing structured data markup (such as that from schema.org) might seem like a one-time project, but that “set it and forget it” mentality can land you in hot water. You should be monitoring the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information. As you change the content on your website, this can alter the markup without warning.
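As an illustration of the kind of markup worth re-checking after content edits (the type and field values below are placeholders, not any client's actual data), a JSON-LD Article block might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15",
  "image": "https://www.example.com/images/example.jpg"
}
</script>
```

If the page's visible headline, author, or images change, this block needs to change with them, which is exactly why rich snippet appearance should be re-checked periodically (Google's Rich Results Test is a common quick check).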
Nice post. I was wondering if all the content in your strategy was written on the site's blog, or if you added content to some other specific parts of the site. I don't believe 100% in the strategy of removing links. If Google just penalized you based on your inbound links, it would be so easy to attack your competitors simply by buying dirty link packages targeting their sites.

Well, as noted in the post, it is not just about the links; that was only one key part of a wider strategy. The website in question has deep levels of content. So it is not just about a blog section; they have numerous high-quality content sections we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.
The first step that I take is to do a quick Google search to find pages on my domain where I've mentioned the keyword in question so that I can add an internal link. To do this, I'll use the following search query, replacing DOMAIN with your domain name (e.g. matthewbarby.com) and KEYWORD with the keyword you're targeting (e.g. "social media strategy"):
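The query itself is cut off in this excerpt; the standard form of this kind of site-restricted lookup uses Google's site: operator, with the placeholders filled in as described above:

```
site:matthewbarby.com "social media strategy"
```

This returns only pages on that domain that mention the exact phrase, which gives a quick shortlist of internal-link candidates.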
At the end of the day it depends on the size of the website you are working with and how well known the brand is in the market. You can adapt some of the strategies listed above in the post at scale, and they can have a highly positive impact on a web property; the property in question is a real content house, so anything is possible. What else do you suggest we should do? I will let you know if it has been done already.
"We have used quite a few tools for tracking SERPs and keywords as part of our content marketing effort. And there was always something missing. It wasn’t until we found AccuRanker that we were completely satisfied. It has increased our productivity. The powerful filters, tagging, instant checks, integration with GSC, multiple URL detection per keyword, accurate search volume, and notes, are all features we now can’t live without. AccuRanker really has taken our SEO efforts to another level. We were able to grow our organic traffic by 571% in just 13 months."
Buy Brazilian web traffic from Yahoo Brasil for 30 days – We will send real Brazilian visitors to your site via Yahoo Brasil's search field, using your keywords, to improve your SERP CTR and SEO strategy. All visitors will show up as organic traffic in your Google Analytics. Having a constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, buy web traffic from Yahoo Brasil today …
Hey Tia, thank you so much for the comment and glad you liked it. I like Hub pages and Squidoo for traffic, but the money is next to nothing compared to what you have to do to get so many views, lol, guess no pain no gain. I am also not worried about Ezines because I actually wrote unique articles and did my best to make them great quality. Guess that is why I posted just a few of them, lol. But they still get me a nice amount of traffic. I like that I didn't have to post 200 articles to make it worthwhile. Thanks again for the comment!
For our client: We rolled out numerous new pieces of content on their blog and news section; we aimed to make the content creative and funny. As the client was in the careers space, we made use of “funny interview questions” and “technical interview questions” style articles. Amazingly, one of the articles even made it to the first page of Reddit. We also pushed out content related to various holidays that year, specific to the client’s industry, and tied to current trends in the market.
Let’s say you wish to block all URLs that have the .pdf extension. You can write in your robots.txt a rule that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf shouldn’t be crawled, while any other URL that merely contains “.pdf” still can be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches, and as long as you keep them in mind, you should be on the safe side of your website’s traffic.
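The wildcard reference list isn't reproduced in this excerpt, but a minimal sketch of the rule described above (Googlebot only, illustrative paths) looks like this:

```
# Block only URLs that end in .pdf for Googlebot
User-agent: Googlebot
Disallow: /*.pdf$

# Without the trailing $, the rule below would be broader and would also block
# URLs that merely contain ".pdf", such as /whitepaper.pdf?download=1
# Disallow: /*.pdf
```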

Everything on Wikipedia and other wikis has a source attribution. So if an image's source is your blog (and I am talking about useful, good images, not just anything), you will get traffic. I explained in the post how it generally works for a travel site, for example. It works better for some niches than others, but creative people can make anything work for them.
The most profitable visitors you can get to your site come from organic search results, meaning people who search for something and land on your website; these are the people most likely to convert into customers or clients. In blogging it's also the most profitable traffic, as these users will tend to see higher-CPC AdSense ads, since the ads are matched to their search term.
So we’re confident that the high social traffic in the sixth example above reflects the highly successful social media campaigns the organization is working on. We could also see this pattern for a site which has decided that they can’t succeed with search and has, as Google suggests for such websites, chosen to work on social media success instead. The difference is, one site would have good Search and Direct traffic and really good social media, while the other might have dismal Search and rely heavily on social media, which is very time consuming and often has a low ROI. This second pattern is one we’ve seen with microbusinesses where the business owner is spending hours each day on social media and making very little progress in the business. Making the investment in a better website would probably pay off better in the long run, even if it seems like an expensive choice.
Unlike text and display ads, PLAs enable e-commerce companies to target specific products and product groups at the decision stage of the buyer’s journey. They also help to increase brand awareness as they position their brand in front of a highly targeted audience that is looking for a product the company offers. Typically, a PLA contains a product picture and a price, along with a store brand. So it comes as no surprise that Walmart, as one of the world’s largest online retailers, appeared to be the undisputed leader in using PLAs, with almost 604,000 keywords used in April. The company is followed by Home Depot, which sells various home improvement items and targets 139,000 keywords.
I've started a fairly new blog, aimed at fairly new bloggers, by a fairly new blogger! I've had another blog up and running for about four months and it's building up quite well, but I've learned a lot from you here, and am looking forward to trying all the ideas that are new to me. You've been so thorough that I can't suggest a 51, but perhaps a 50a - I use Tweetadder with my Twitter accounts; it's a brilliant tool for automating tasks and driving traffic. I won't spam the comment with a link here, but there is one on my blog for anyone who is interested. Thanks again for all those ideas, cheers, Brent

This quickly turns into a “chicken-and-egg” situation. Are fewer people coming to your site due to poor visibility in the SERPs? Or have you shifted your product focus, and is that why consumers are no longer interested in your brand? For a quick check, look at Google Search Console data and pull positions and clicks by page. If position is staying relatively stable, your brand is not losing visibility in the SERPs, and there may be a bigger issue at play.
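As a sketch of that quick check (assumptions: a service-account JSON key with read access to the property, and the google-api-python-client and google-auth libraries installed; the site URL, dates, and key file name are placeholders), positions and clicks by page can be pulled from the Search Console API like this:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

# Placeholder key file and property URL -- swap in your own.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["page"],   # group clicks and average position by page
        "rowLimit": 1000,
    },
).execute()

for row in response.get("rows", []):
    page, clicks, position = row["keys"][0], row["clicks"], row["position"]
    print(f"{page}\tclicks={clicks}\tavg_position={position:.1f}")
```

If average position holds roughly steady while clicks fall, the drop is more likely a demand or snippet problem than a visibility problem.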


Hi SEO 4 Attorneys, it could be anything. Is this for your own site or for a client's site? It could be an attempt at negative SEO from a competitor. The thing is, people may try to push hundreds of spammy links to a site in the hope of knocking it down. At the end of the day my best advice is to monitor your link profile on a weekly basis. Try to remove negative links where possible; if you can't remove them, then opt for the disavow tool as a last resort.
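For reference, if the disavow route is taken as that last resort, the file Google expects is a plain-text list with one domain or URL per line (lines starting with # are comments); the entries below are placeholders only:

```
# Links we could not get removed (example entries only)
domain:spammy-directory.example
domain:paid-link-network.example
https://blog.example.net/low-quality-page-linking-to-us/
```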
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don't want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
Lol, start from the middle :) Yeah, Squidoo is doing great and I know they are cleaning out all the time when they see a lousy lens. They have angels over there that report stuff like that (I am an angel myself on Squidoo :) And it is not even that hard to make money over there, which I always suggest to beginners that don't have money to invest in blogs and sites.
Buy Canadian search traffic visitors from Google.ca for 30 days – We will send real Canadian visitors to your site via Google.ca's search field, using your keywords, to improve your SERP CTR and SEO strategy. All visitors will show up as organic traffic in your Google Analytics. Having a constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, use our Google Canada Search Traffic Service today …
Hi Brankica, I'm a total Kindle addict too! As much as I love to hoard books, the Kindle makes it even easier to consume books in volume! One other suggestion: There's a nifty new service called http://paywithatweet.com that allows you to give away an electronic download in exchange for a Tweet or post on Facebook. I LOVE this and have been testing it out with my latest book. Very effective!
Good point. The thing with this client is that they wanted to mitigate the risk of removing a large number of links, so high-quality link building was moved in early, before keyword research. So it is on a case-by-case basis, but definitely a good point: for most new clients I work with who do not have pre-existing issues, you want to do keyword research very early in the process.

For example, sometimes this could include social media websites, sometimes it might not -- in HubSpot's software, social media websites are not included in referral traffic because they're included in a separate "social media" bucket. Another instance of variance is whether subdomains are included -- HubSpot's software, for example, includes subdomains (like academy.hubspot.com) as a traffic source under Referrals. And sometimes it's not that tricky -- you'll always see third-party domains, like mashable.com, for instance, listed right there in the report. This is particularly helpful if you're trying to ascertain which web properties are great for co-marketing, SEO partnerships, and guest blogging opportunities.


The online space has definitely become extremely competitive. New businesses, platforms, and complementary services are entering the market almost every day. In fact, nowadays you can even set up an online store in just five minutes, but sustaining it as a profitable e-commerce business requires significant amounts of time and resources, plus a great deal of business experience and marketing knowledge.
I am very late commenting on this article. I wanted to read about how much organic traffic you can get through SEO, found your article at the top, and it was really interesting. James Norquay, you did good research. I think nowadays Google blocks most SEO activities. Is this still worthwhile in the current marketing scenario? If you have any other posts on related strategies for increasing organic traffic, please refer me to them.

The truth is that a major problem for search engines is determining the original source of content that is available at multiple URLs. Therefore, if you have the same content on http as well as https, you will “confuse” the search engine, which can punish you, and you will suffer a traffic loss. This is why it’s highly important to use rel=canonical tags. What exactly is this?
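In its simplest form, it is a link element in the page's head that names the preferred version of a URL. A minimal sketch for the http/https case, with a placeholder address:

```html
<!-- On both the http:// and https:// versions of the page,
     point search engines at the preferred https:// URL. -->
<link rel="canonical" href="https://www.example.com/blog/organic-traffic-drop/">
```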


On the basis of this article alone, you should be given a Ph.D. in internet traffic generation. Absolutely fantastic set of ideas & sources for traffic generation. I took a print of this article the first time I read it and then took serious action on some of the ideas mentioned here - flickr images, blog commenting, slideshare (this even resulted in some great business leads and closures), sites such as blog engage and blog interact, social media sources such as LinkedIn, forum participation etc. And they have done wonders for my sites. The fact that there are 319 comments here even as I write is testimony to the authenticity of the ideas presented here. Great article which will one day be a classic - if it is not already.
If you don’t want your Google traffic to drop dramatically due to indexing and content pruning, below we list the steps you need to take in order to prune your own content successfully. We’ve developed this subject in a previous blog post. Yet, before doing so, we want to stress that it’s not an easy decision to remove indexed pages from the Google Index and, if handled improperly, it may go very wrong. Yet, at the end of the day, you should keep the Google Index fresh with information that is worth ranking and that helps your users.
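Those steps aren't reproduced in this excerpt, but one common mechanism used while pruning (an illustration only, not necessarily the exact process the post lists) is to flag a thin page with a noindex directive so it drops out of the index while remaining crawlable:

```html
<!-- Placed in the <head> of a page you want removed from Google's index -->
<meta name="robots" content="noindex">
```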

Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
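As a rough sketch of that structure (placeholder URLs, not a prescription), a single silo might look like this:

```
/guides/                         <- core silo landing page (in main navigation)
/guides/keyword-research/        <- subpage: links back up to /guides/
/guides/on-page-seo/             <- subpage: cross-links to sibling subpages
/guides/internal-linking/        <- subpage
/guides/technical-audits/        <- subpage
/guides/content-pruning/         <- subpage (3-5+ quality subpages per silo)
```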