Understanding how social algorithms work is no small task, and each platform’s algorithm works differently. For instance, on Facebook a core factor affecting a post’s ranking is its relevancy score, whereas on YouTube the total watch time per session decides whether a video enters the ‘Recommended Videos’ section or not.

Billions of people search the web every day. Search engine marketing (SEM for short) is how you can get your ads in front of these future customers where it counts: in premium spots on the first page of search results. You set your own budget and are charged only when your ad is clicked. This makes SEM an affordable way to reach more customers for businesses of all sizes — including yours.
Melissa: I completely agree. And the other thing about the way video is added in the LinkedIn video option is that there’s autoplay. So as folks are scrolling through their feed, they’re more likely to stop on a video that just starts playing. I think that’s a big opportunity to get some more eyes on your content.
Another way search engine marketing is managed is through contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads enter the field of vision of browsers who are seeking information from those sites. A successful SEM plan captures the relationships among information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but over recent years the use of search engines for accessing information has become vital to increasing business opportunities.[31] The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it can also pose various challenges.[32] These challenges include the competition companies face within their industry and from other sources of information that can draw the attention of online consumers.[31] To meet these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility. Search engines, in turn, keep adjusting their algorithms and the criteria by which web pages are ranked, both to combat search engine misuse and spamming and to supply the most relevant information to searchers.[31] This can enhance the relationship among information searchers, businesses, and search engines through a better understanding of the marketing strategies used to attract business.

To sum up all of this information, even organic traffic, like direct traffic, has some gray areas. For the most part, though, organic traffic is driven by SEO. The better you are ranking for competitive keywords, the more organic traffic will result. Websites that consistently create content optimized for search will see a steady increase in organic search traffic and improved positioning in the search results. As a marketer, it is important to look at your keywords and high-ranking pages to identify new SEO opportunities each month.  
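To make that monthly review concrete, here is a minimal sketch in Python. It assumes you have exported keyword rankings (for example from a rank tracker or Search Console) into a simple list; the field names and numbers are illustrative, not from any specific tool's API. It flags "page two" keywords (positions 11 to 20) as likely quick-win SEO opportunities.

```python
# Hypothetical export of keyword rankings; replace with your own data source.
keyword_report = [
    {"keyword": "buy shoes", "position": 14, "monthly_searches": 9000},
    {"keyword": "shoe sale", "position": 3, "monthly_searches": 6000},
    {"keyword": "where to buy shoes", "position": 18, "monthly_searches": 2500},
]

def page_two_opportunities(rows, min_searches=1000):
    """Keywords ranking on page two (positions 11-20) with meaningful volume
    are often the quickest organic wins to work on next month."""
    return sorted(
        (r for r in rows
         if 11 <= r["position"] <= 20 and r["monthly_searches"] >= min_searches),
        key=lambda r: r["monthly_searches"],
        reverse=True,
    )

for row in page_two_opportunities(keyword_report):
    print(f'{row["keyword"]}: position {row["position"]}, {row["monthly_searches"]} searches/mo')
```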
With organic search, you don’t have to outspend your competitors to outrank them. Your competitors can’t recreate the content experience that you use to drive organic traffic.  This one is major. PPC is easy to replicate and reverse engineer. Many spy tools allow you to dissect paid campaigns to see what’s working and what’s not. You can get insight into what ad creatives generate the most clicks.
Click-through rate: Except for high purchase-intent searches, users click on paid search listings at a lower rate than organic search listings. Organic listings have more credibility with search engine users. In one UK study, published by Econsultancy, only 6% of clicks went to paid listings; in another study, it was 10%. The important thing to remember is that click-through rate varies by purchase intent. Organic rankings will get higher click-through rates for “top of funnel” keyword search queries.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[36] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[37] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognised term 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few individual words.[38] With regard to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content and rely on its creators as 'trusted' authors.
So just how much of the traffic that finds itself labeled as direct is actually organic? Groupon conducted an experiment to try to find out, according to Search Engine Land. They de-indexed their site for the better part of a day and looked at direct and organic traffic, by hour and by browser, to pages with long URLs, knowing that pages with shorter URLs actually do get a large amount of direct traffic, as they can be typed quickly and easily into a browser. The results showed a 50% drop in direct traffic, clearly demonstrating how all of these other factors come into play during the analytics process.
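The arithmetic behind a de-indexing test like Groupon's is straightforward. The sketch below uses made-up visit counts purely for illustration; only the 50% drop mirrors the figure quoted above.

```python
# Hypothetical visit counts to long-URL pages (illustrative only).
direct_visits_normal = 10_000    # "direct" visits on a normal day
direct_visits_deindexed = 5_000  # same pages while the site was de-indexed

# Visits that disappear when organic search is switched off were never truly
# direct; they were organic visits misattributed as direct by analytics.
misattributed_organic = direct_visits_normal - direct_visits_deindexed
share_actually_organic = misattributed_organic / direct_visits_normal

print(f"Roughly {share_actually_organic:.0%} of 'direct' traffic to these pages was really organic.")
```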
If both pages are closely related (lots of topical overlap), I would merge the unique content from the lower-ranking article into the top-ranking one, then 301 redirect the lower-performing article to the top-ranking one. This makes the canonical version more relevant and gives it an immediate authority boost. I would also fetch it right away, do some link building, and possibly a little paid promotion to seed some engagement. Update the timestamp.
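In practice the 301 is usually configured in your web server or CMS, but here is a minimal standard-library Python sketch of what that permanent redirect looks like. The URLs are hypothetical placeholders for a lower-ranking article being folded into the canonical one.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of the merged article's old URL to the canonical URL.
REDIRECTS = {
    "/blog/old-lower-ranking-article": "/blog/top-ranking-article",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 signals a permanent move, so search engines consolidate
            # relevance and authority signals on the canonical URL.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```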

Hi, the post is really nice, and it made me think about whether our current strategy is okay or not. Two things are important: a "high-quality content strategy" and "good-quality links", and joining those correctly can pose some real challenges. Say we have n content writers writing for a couple of websites; to keep it generic, let's consider 1 writer @ 1 website. We have to make one content strategy for the website's in-house blog to drive authentic traffic to it, and a separate content strategy for grabbing links from authentic high-PR websites, i.e. the content strategy should go two ways: in-house and out-house.


Quick question. Do you ever click on the paid results when you conduct a search? It turns out, most users don’t. People typically bypass paid results and click on the top organic results. I get it. They’re looking for the most relevant and trustworthy answers to their problems. A top result that appears to be bought doesn’t appeal to them as much as an organic result. That’s where the credibility factor comes into play. It’s why 75% of clicks are organic.
This refers to your ability to rank for certain keywords. For instance, if you sell shoes online, you will need to optimize your site, backlinks, website speed, and much more so that you can “rank” high for keywords that are highly relevant to your business. Relevant keywords may include “buy shoes,” “shoe sale,” “where to buy shoes,” and so on. Once you rank high (top page) for these keywords, you will enjoy an increase in traffic and business as a result.
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other pages already in a search engine's index do not need to be submitted, because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[39] Google offers Google Search Console, in which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[40] in addition to its URL submission console.[41] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[42] however, this practice was discontinued in 2009.
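For illustration, here is a minimal sketch of building the kind of XML Sitemap feed mentioned above, using only the Python standard library. The URLs and dates are placeholders; a real sitemap would be generated from your CMS or URL database and then submitted via Google Search Console.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    # Standard sitemap namespace per the sitemaps.org protocol.
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod
    return tostring(urlset, encoding="unicode")

# Hypothetical pages, including a deep page with no inbound links that a
# crawler might otherwise miss.
pages = [
    ("https://www.example.com/", "2023-01-15"),
    ("https://www.example.com/blog/deep-page-with-no-inbound-links", "2023-01-10"),
]
print(build_sitemap(pages))
```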
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). It requires pages to contain keywords people are looking for and ensure that they rank high enough in search engine rankings. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
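A toy illustration of those two measures, saturation (indexed pages versus total pages) and popularity (backlink counts), is sketched below. The numbers are invented; real values would come from an indexed-pages count (for example, a Search Console coverage report) and a backlink tool.

```python
# Hypothetical figures for a single site (illustrative only).
total_pages_on_site = 400
pages_indexed = 320          # pages the search engine has indexed
backlinks = 1_850            # total backlinks pointing at the site
referring_domains = 240      # unique domains those links come from

saturation = pages_indexed / total_pages_on_site
print(f"Saturation: {saturation:.0%} of the site's pages are indexed")
print(f"Popularity: {backlinks} backlinks from {referring_domains} referring domains")
```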
Hi Chris, "good content" means a couple of things: good for readers and good for Google. Good content for readers answers questions, provides value, offers solutions, and is engaging. You want to keep the reader on the page and on your website for as long as possible. To make good content for Google, you have to provide the search engine with a set of signals, e.g., keywords, backlinks, low bounce rates, and so on. The idea is that if you make good content for readers (engaging, valuable, actionable, and informative), your content will get more engagement. When your content gets more engagement, Google will see it as good content too and place it higher in the SERPs. Making "good content" is about striking that balance. Let us know if that answered your question!

That's why it's necessary to always stay abreast of developments in the SEO world, so that you can see these algorithm updates coming or you can determine what to do once they’ve been released. The WordStream blog is a great resource for SEO updates, but we also recommend Search Engine Land and Search Engine Roundtable for news on updates. Glenn Gabe of G-Squared Interactive is also a great resource for analyzing the causes and impact of algorithm updates.


Remember when you used to rely solely on search engines for traffic? Remember when you worked on SEO and lived and died by your placement in Google? Were you #1? Assured success. Well, okay, maybe not assured. Success only came if the keywords were relevant to your site users, but it was the only real roadmap to generating site traffic and revenue.