Look at the different search engines (sources) that drive traffic to your site to determine where you want to invest your resources. For example, if you're getting an overwhelming number of visitors and revenue from a particular search engine, that's an obvious source of profitable traffic and an area in which you might want to make further investment. But you might also find another search engine that delivers only a few visitors, each of whom represents a very high Per Visit Value. In that case, you might want to increase your spend there to drive more of those high-value visitors to your site.
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not review reconsideration requests. See the analysis below. In my experience it can take anywhere from 15 to 30+ days for Google to respond to a reconsideration request, and during peak periods it can take even longer.
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag. When a search engine visits a site, the robots.txt file in the root directory is the first file crawled; it is then parsed and tells the robot which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically blocked from crawling include login-specific pages such as shopping carts and user-specific content such as results from internal site searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
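As a concrete illustration of the mechanism described above (the directory names here are hypothetical examples, not prescribed paths), a minimal robots.txt placed at the root of the domain might look like this:

```
# robots.txt — served from the domain root, e.g. https://example.com/robots.txt
User-agent: *        # these rules apply to all crawlers
Disallow: /cart/     # hypothetical shopping-cart pages
Disallow: /search    # hypothetical internal search results
```

For a page that should stay crawlable but never appear in results, the robots meta tag `<meta name="robots" content="noindex">` inside the page's `<head>` explicitly excludes it from the index. The distinction matters: robots.txt controls crawling, while the meta tag controls indexing, and a page blocked in robots.txt can still be indexed from external links because the crawler never sees the meta tag on it.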
These types of keywords each tell you something different about the user. For example, someone using an informational keyword is not in the same stage of awareness as someone employing a navigational keyword. Here’s the thing about awareness: informational needs change as awareness progresses, and you want your prospects to be highly aware. If you’re on a bare-bones budget, you can be resourceful and achieve that with a single piece of content.
Even if you don’t have a website, you can still make sure customers can find you online by creating listings on sites like DexKnows and Yelp. Just be aware that your customer base will rely more and more on the internet to learn about your company, and a website will better provide the information they seek while helping you build their confidence in your business.
Guest blogging purely for inbound links is a flawed strategy because the value of those links is going down. However, guest blogging for traffic is still an incredibly viable strategy. While the inbound link you get at the end of a guest post doesn’t have as much SEO value as it used to, it still has the value of exposing your content to a new audience.
I think that, as far as how we agencies appear in organic search ourselves, we are definitely going to need to leverage all three of the solutions you talk about. Agencies that haven't branded their products/services are going to have to do that, and to employ branding strategies as well. In addition, we have to optimize for other search areas, as you say in point #2, and we must look at optimizing existing content for voice search and answers/featured snippets, as you say in point #3.
However, with a properly created PPC campaign, results can be analyzed and any conversion-related problems can be fixed in no time. It shouldn’t be surprising to see massive results from a PPC campaign that’s been running for only a few weeks. When and if you have the budget, getting quick results with PPC is entirely doable.
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of the site's pages indexed by search engines (saturation) and how many backlinks the site has (popularity). Achieving both requires pages to contain the keywords people are searching for and to rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools for measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
It’s unreasonable to assume that you will pull top rank in Google for every keyword relating to your industry. Your goal should be to pull top rank on the most desired keywords. This is an exercise that will take the effort of both marketing and management. Think about how people would search for your products and services, make a list of these keywords, and check the traffic for each term with a tool like Google’s Keyword Planner. Naturally you will want to rank for the keywords with the most traffic, so whittle your list down to the highest-trafficked, most relevant terms.
Organic is what people are looking for; the rest of these simply put things in front of people who may or may not be seeking what you offer. We know that approximately X number of people are looking for Y every day. So if we can get in front of those people, we have a much greater opportunity to create long-term relationships and increase our overall ROI.