Search engines: Google vs. Bing. Google was the first search engine to deliver noticeably better results; people told others about it, it spread virally, and “Google it” became a verb. Bing, by contrast, is trying to buy its way into the market through ads, deals with Facebook and Yahoo, and so on. Most people weren’t asking for a second, “me-too” search engine: the first one solved their search pain and continues to do so, so trust was built and people have remained loyal to it.

Using organic search data through Data Cube you can make your PPC campaign even stronger. You can research keywords that have the highest traffic and use the BrightEdge Recommendations engine to learn the types of sites that people are most likely targeting with specific queries. You can then create content for your PPC campaigns armed with this insight, positioning yourself well for paid search success.

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit and the amount of traffic to the site dropped by 70%. On March 16, 2007 the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[69][70]
The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).

Search engines reward you when sites link to yours: they assume your site must be valuable, so you’ll rank higher in search results. And the higher the “rank” of the sites that link to you, the more they count toward your own ranking. You want links from popular industry authorities, recognized directories, and reputable companies and organizations.
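As a rough sketch of why links from already-popular sites count for more, the short Python example below runs a simplified PageRank-style calculation over a tiny, made-up link graph. The domain names, damping factor, and iteration count are all hypothetical, and real search engines weigh far more signals than this.

    # Simplified PageRank-style sketch: pages that receive links from
    # already-important pages end up with higher scores themselves.
    # The link graph below is hypothetical.
    links = {
        "yoursite.com":      [],                                    # receives links only
        "industry-blog.com": ["yoursite.com"],
        "big-directory.com": ["yoursite.com", "industry-blog.com"],
    }

    damping = 0.85
    scores = {page: 1.0 for page in links}

    for _ in range(20):  # iterate until the scores settle
        new_scores = {}
        for page in links:
            # Each page passes an equal share of its score to every page it links to.
            incoming = sum(
                scores[src] / len(targets)
                for src, targets in links.items()
                if page in targets
            )
            new_scores[page] = (1 - damping) + damping * incoming
        scores = new_scores

    for page, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

Running it, yoursite.com ends up with the highest score, simply because it is the page that the other (already-scored) pages link to.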
Black hat SEO refers to the practice of trying to trick the search engines into giving you higher rankings by using unethical tactics, such as buying links. The risk is just too great: even if you enjoy a temporary boost in rankings from black hat tactics, it’s likely to be short-lived. Google is getting better and better at spotting dirty tricks, and sooner or later the progress you made will be wiped out by an algorithm update, or worse, your site will be removed from the index altogether.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by doing so; Google’s new system punishes sites whose content is not unique.[35] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[36] Although Penguin has been presented as an algorithm aimed at fighting web spam generally, it really focuses on spammy links[37] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing falls under the newly recognised term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few isolated words.[38] For content publishers and writers, Hummingbird is intended to resolve these issues by weeding out irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
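As a minimal sketch, a robots.txt placed in the root directory might look like the following; the /cart/ and /search/ paths stand in for the shopping-cart and internal-search pages mentioned above and are only illustrative:

    User-agent: *
    Disallow: /cart/
    Disallow: /search/

To exclude an individual page from the index itself, the robots meta tag can be placed in that page’s HTML head:

    <meta name="robots" content="noindex">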
Organic marketing isn’t going to run up your grocery bill. In fact, investing time and energy in developing an organic marketing strategy is one of the most important steps you can take to ensure the long-term success of your business’s digital presence. I’ve helped businesses design and run organic marketing campaigns for years, and it’s one of the most effective ways to build an authentic audience and fan base for your product or service. It takes more time, consistency, and patience to pull off, but it’s worth the extra effort.
One important thing to note is a website’s domain authority (DA) and page authority (PA). Each is a score from 1 to 100 that indicates the strength of a website’s domain or of a specific page. DA and PA are two of several factors that go into how a website will rank on a SERP: the higher they are, the better the chances of that webpage ranking on the front page of a SERP (everyone’s dream!). The scores are determined by a few things, such as the age of the website and the number of links pointing to it (backlinks).
The challenge for SEOs, then, is to tell this to clients without worrying about losing them. What to report on, then? For GMB, impressions (these should decrease, because I found that on Maps the link to the website isn’t always there!) and GMB dashboard views (a test showed the stats on the GMB dashboard are incorrect); the suggested channels, social and YouTube, don’t fall under organic traffic.
Use long tail keywords. Don’t just go with the most popular keywords in your market. Use keywords that are more specific to your product or service. In time, Google and other search engines will identify your website or blog as a destination for that particular subject, which will boost your content in search rankings and help your ideal customers find you. These tools will help.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[53] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[54]
It’s unreasonable to assume that you will pull top rank in Google for every keyword relating to your industry. Your goal should be to pull top rank on the most desired keywords. This is an exercise that will take the effort of both marketing and management. Think about how people would search for your products and services, make a list of these keywords, and check the traffic for each term with a tool like Google’s Keyword Planner. Naturally you will want to rank for the keywords with the most traffic, so whittle your list down to the highest-trafficked, most relevant terms.
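As a sketch of that whittling-down step, the Python snippet below sorts a hypothetical keyword list by monthly search volume and keeps the most-trafficked relevant terms. The keywords, the volumes, and the “running” relevance filter are invented for illustration; they are not real Keyword Planner output.

    # Hypothetical keyword data, e.g. exported from a keyword research tool.
    keywords = [
        {"term": "running shoes",                  "monthly_searches": 90500},
        {"term": "trail running shoes",            "monthly_searches": 12100},
        {"term": "waterproof trail running shoes", "monthly_searches": 720},
        {"term": "shoe repair near me",            "monthly_searches": 8100},
    ]

    # Keep only terms relevant to what we actually sell (assumed here: running shoes),
    # then sort by traffic and take the top of the list.
    relevant = [kw for kw in keywords if "running" in kw["term"]]
    shortlist = sorted(relevant, key=lambda kw: kw["monthly_searches"], reverse=True)[:3]

    for kw in shortlist:
        print(f'{kw["term"]}: {kw["monthly_searches"]:,} searches/month')

In practice you would export the real volumes from Keyword Planner and apply your own relevance filter before settling on the shortlist.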
Another benefit of SEM is that the people who see your PPC ads are those most likely to want to buy your product or service. PPC ads require you to choose a geographic location and specific search queries to target. As a result, you can be sure that anyone who clicks on your ad is not arbitrarily surfing the web but is actively looking for your product or service and in a position to buy.

Optimize for opt-ins. Make sure you lead to something more than your content: when people read your content, they should know where to go next. This may come in the form of a call to action or an offer of additional content, perhaps as a PDF. Growing organic traffic is important, but it doesn’t matter much if you are not converting those viewers into leads. Your business doesn’t pay its bills with raw traffic.
For our client, we rolled out numerous new pieces of content onto their blog and news section, aiming to make the content creative and funny. As the client was in the careers space, we made use of “funny interview questions” and “technical interview questions” style articles; amazingly, one of them even made it to the first page of Reddit. We also pushed out content tied to various holidays that year, to the client’s industry, and to current trends in the market.
Since there is an obvious barrier to entry for anyone trying to beat you once you’re established, you won’t have to worry about competitors “buying” their way to the top. Their only option is pay-per-click ads, and that isn’t the same as earning a higher position on the SERPs. Again, this assumes that you took the right steps and were patient enough to solidify your place in the top search results.
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially when Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing but to the complete product lifecycle, along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.