Nice post. I was wondering whether all of this strategy's content was written on the site's blog, or whether you added content to other specific parts of the site. I don't believe 100% in the strategy of removing links. If Google penalized you purely on the basis of your inbound links, it would be easy to attack your competitors just by buying dirty link packages targeting their sites.
The monthly volume of searches entered on keywords can be found with a few different methods. If you have a Google AdWords account, you can use Keyword Planner for this step. If you don’t, there are a few free sites out there that will give you similar numbers. Obviously, if a keyword has higher monthly searches you’ll want to keep it in mind. However, that also might mean that it has a higher keyword difficulty, and fiercer competition.
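As a rough illustration, here is a short Python sketch that ranks candidate keywords by weighing monthly search volume against keyword difficulty. The keyword list, numbers and weighting formula are all hypothetical; in practice the volumes would come from Keyword Planner or one of the free volume tools mentioned above.

```python
# Minimal sketch: rank candidate keywords by balancing monthly search
# volume against keyword difficulty. The data below is hypothetical --
# in practice it would come from Keyword Planner or a free volume tool.
keywords = [
    {"term": "organic marketing", "monthly_searches": 5400, "difficulty": 62},
    {"term": "organic traffic definition", "monthly_searches": 880, "difficulty": 35},
    {"term": "what is seo", "monthly_searches": 74000, "difficulty": 81},
]

def opportunity_score(kw):
    # Reward volume, penalise difficulty; the weighting is illustrative only.
    return kw["monthly_searches"] * (1 - kw["difficulty"] / 100)

for kw in sorted(keywords, key=opportunity_score, reverse=True):
    print(f'{kw["term"]}: score {opportunity_score(kw):,.0f}')
```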
For our client: We rolled out a successful implementation of rel="author" for the three in-house content writers the company had. The client had 300+ articles written by these content writers over the years, and it was possible to implement rel="author" for all of the aged articles. I advise anyone who has a large section of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, as it can only benefit CTR.
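For illustration only, this Python sketch prints the kind of markup involved: a rel="author" link element and a JSON-LD block for schema.org Course markup. The author URL, course name and provider are placeholders, not the client's actual data.

```python
import json

# Illustrative sketch of the two kinds of markup discussed above.
# Names and URLs are placeholders, not the client's actual pages.
author_link = '<link rel="author" href="https://plus.example.com/authors/jane-doe">'
print(author_link)

course_schema = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "Introduction to Organic Marketing",  # hypothetical course
    "description": "A beginner course on growing organic traffic.",
    "provider": {"@type": "Organization", "name": "Example Academy"},
}

# JSON-LD block that would be placed in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(course_schema, indent=2))
print("</script>")
```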

Page and Brin founded Google in 1998.[22] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[23] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[24]
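As a toy sketch of the intuition behind PageRank (a page's score flows from the pages linking to it), here is a tiny power-iteration example over a made-up four-page link graph; it is not Google's actual algorithm or parameters.

```python
# Toy illustration of the idea behind PageRank: a page's score depends on
# the scores of the pages linking to it. The four-page link graph is made up.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

damping = 0.85
rank = {page: 1 / len(links) for page in links}

for _ in range(50):  # power iteration until the scores settle
    new_rank = {}
    for page in links:
        inbound = sum(rank[p] / len(links[p]) for p in links if page in links[p])
        new_rank[page] = (1 - damping) / len(links) + damping * inbound
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```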
The social media landscape is constantly evolving. New networks rise to prominence (e.g. Snapchat), new technology increases user participation and real-time content (e.g. Periscope) and existing networks enhance their platform and product (e.g. Facebook, Twitter, Pinterest and Instagram launching ‘buy’ buttons). Organic reach is also shrinking as the leading networks ramp up their paid channels to monetise platform investment.
If both pages are closely related (lots of topical overlap), I would merge the unique content from the lower-ranking article into the top-ranking one, then 301 redirect the lower-performing article to the top-ranking one. This will make the canonical version more relevant and give it an immediate authority boost. I would also fetch it (request indexing) right away, do some link building, and possibly a little paid promotion to seed some engagement. Update the time stamp.
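Below is a minimal sketch of the 301 step using Flask; the URLs and route are hypothetical, and in practice the redirect would more often live in the web server or CMS configuration.

```python
# Minimal sketch of the 301 redirect step; URLs are hypothetical.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/blog/old-lower-ranking-article")
def merged_article():
    # Permanently point the merged (lower-performing) URL at the canonical one.
    return redirect("/blog/top-ranking-article", code=301)
```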
To sum up all of this information, even organic traffic, like direct traffic, has some gray areas. For the most part, though, organic traffic is driven by SEO. The better you are ranking for competitive keywords, the more organic traffic will result. Websites that consistently create content optimized for search will see a steady increase in organic search traffic and improved positioning in the search results. As a marketer, it is important to look at your keywords and high-ranking pages to identify new SEO opportunities each month.  
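One hedged way to act on that last point is to scan a keyword report for queries that already rank just off page one, since those are often the quickest organic wins. The sample rows and thresholds below are assumptions, loosely modeled on a Search Console performance export.

```python
# Hedged sketch: find keywords sitting just off page one (positions 11-20)
# with decent impressions -- often the quickest monthly SEO wins.
# The rows below are made up; they would normally come from a
# Search Console performance export.
rows = [
    {"query": "organic marketing strategy", "impressions": 2400, "position": 12.3},
    {"query": "what is organic traffic", "impressions": 310, "position": 14.8},
    {"query": "alt text seo", "impressions": 900, "position": 7.1},
]

opportunities = [
    r for r in rows
    if r["impressions"] >= 500 and 11 <= r["position"] <= 20
]

for r in sorted(opportunities, key=lambda r: -r["impressions"]):
    print(r["query"], r["impressions"], r["position"])
```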

An organic marketing strategy generates traffic to your business naturally over time, rather than using paid advertising or sponsored posts. Anything you don’t spend money on directly – blog posts, case studies, guest posts, unpaid tweets and Facebook updates – falls under the umbrella of organic marketing. That email blast you just sent out? Yup, that’s organic. So is that user-generated content campaign you just launched.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed only to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
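The crawl-and-extract step described above can be illustrated with a small Python sketch that downloads one page, collects its outgoing links, and splits out the words it contains. The URL is a placeholder, and real spiders also handle robots.txt, politeness and scheduling, which this skips.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

# Toy sketch of the crawl step described above: download one page,
# pull out its outgoing links, and note the words it contains.
class LinkAndTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

parser = LinkAndTextParser()
page = urlopen("https://example.com/").read().decode("utf-8", errors="ignore")
parser.feed(page)

print("links found:", parser.links[:10])
print("first words:", parser.words[:10])
```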
The typical Web user might not realize they’re looking at apples and oranges when they get their search results. Knowing the difference enables a searcher to make a better informed decision about the relevancy of a result. Additionally, because the paid results are advertising, they may actually be more useful to a shopping searcher than a researcher (as search engines favor research results).
ALT text is set with the HTML alt attribute and specifies alternative text to display when the element it is applied to (such as an image) can’t be rendered. ALT text can have a strong correlation with Google SEO rankings, so when you have images and other elements on your web pages, be sure to always use a descriptive ALT tag with targeted keywords for that page.
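Here is a small, illustrative Python check for img tags that are missing descriptive ALT text; the HTML snippet is made up for the example.

```python
from html.parser import HTMLParser

# Quick illustrative check for <img> tags missing alt text.
# The HTML snippet below is made up for the example.
class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt", "") or ""
            if not alt.strip():
                self.missing += 1

html = '<img src="chart.png" alt="organic traffic growth chart"><img src="logo.png">'
checker = AltChecker()
checker.feed(html)
print("images missing alt text:", checker.missing)
```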

There are also a few more similarities. All of these marketing methods are measurable to an extent never seen in any other media. Every click can be measured – where and when it came from – and followed through to the conversion, the sale and the lifetime customer value. This feedback loop creates optimization opportunities that can yield huge incremental improvements in your SEM campaigns.
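As a made-up illustration of that feedback loop, the sketch below turns click-level assumptions (cost per click, conversions, lifetime value) into conversion rate and return on ad spend; every number is hypothetical.

```python
# Hedged sketch of the measurement loop described above, using made-up
# click-level numbers: spend per click, conversions, and customer value.
clicks = 2_000
cost_per_click = 1.25          # hypothetical average CPC
conversions = 60
avg_lifetime_value = 180.00    # hypothetical customer lifetime value

spend = clicks * cost_per_click
conversion_rate = conversions / clicks
revenue = conversions * avg_lifetime_value

print(f"spend: ${spend:,.2f}")
print(f"conversion rate: {conversion_rate:.1%}")
print(f"return on ad spend: {revenue / spend:.2f}x")
```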

At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level. These sites aren’t relying as much on Google for traffic any more. Not only are they relying less on Google traffic, but these sites are also getting only around 10 percent or less of their organic traffic from the search giant.