For a long time, digital marketers summed up direct and organic traffic in much the same simple terms. To most, organic traffic consists of visits from search engines, while direct traffic is made up of visits from people typing your company's URL into their browser. This explanation, however, is oversimplified and leaves most digital marketers poorly equipped to fully understand and draw insights from their web traffic, especially from organic and direct sources.
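As a rough sketch of that naive model, here is how the simple rule might be written down. This is a hypothetical illustration only, not how any particular analytics package actually classifies sessions:

# Naive traffic classification: the oversimplified rule described above.
# Hypothetical sketch only; real analytics tools use far more signals.
SEARCH_ENGINES = ("google.", "bing.", "yahoo.", "duckduckgo.")

def classify_visit(referrer: str) -> str:
    if not referrer:
        # No referrer is assumed to mean a typed-in URL, but apps and some
        # email clients also strip referrers, so this bucket is overstuffed.
        return "direct"
    if any(engine in referrer for engine in SEARCH_ENGINES):
        return "organic"
    return "referral"

print(classify_visit(""))                              # direct
print(classify_visit("https://www.google.com/"))       # organic
print(classify_visit("https://news.example.com/post")) # referral

The comment on the direct bucket hints at why the simple definition falls short: anything arriving without a referrer gets counted as direct, whether or not anyone actually typed the URL.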
Get a handle on your brand reputation. Your brand story is the one you tell; your reputation is the story customers tell on your behalf. If someone consistently stumbles onto your site when they type in niche search queries, they'll be intrigued. The result? They'll start conducting navigational searches for your brand. The intent behind that search? They want reviews and other customers' experiences with your business. Ask your customers for reviews and reach out to third-party review sites in your niche so these navigational searches don't come up empty. I also recommend monitoring your brand mentions. The easy way is to set up Google Alerts: type in your brand name, create your alert, and you'll be notified of any mention online.
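Google Alerts can also deliver to an RSS feed instead of email, which makes this monitoring easy to script. Below is a minimal sketch using the feedparser library; the feed URL is a placeholder you would copy from your own alert's delivery settings:

# Pull brand mentions from a Google Alerts RSS feed.
# The feed URL is a placeholder; copy the real one from your alert's delivery settings.
import feedparser

ALERT_FEED_URL = "https://www.google.com/alerts/feeds/YOUR_FEED_ID/YOUR_ALERT_ID"

def fetch_mentions(feed_url: str):
    """Return (title, link) pairs for every entry currently in the alert feed."""
    feed = feedparser.parse(feed_url)
    return [(entry.title, entry.link) for entry in feed.entries]

for title, link in fetch_mentions(ALERT_FEED_URL):
    print(title)
    print("  " + link)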
To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots-specific meta tag. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
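To make the mechanics concrete, here is a minimal sketch of how a well-behaved crawler consults robots.txt before fetching a page, using Python's standard-library urllib.robotparser; the domain and Disallow rules are placeholders, not any real site's file:

# How a polite crawler applies robots.txt rules before fetching URLs.
# example.com and the Disallow rules below are placeholders.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /cart/",    # cart / login-specific pages
    "Disallow: /search",   # internal search results
]

parser = RobotFileParser()
parser.parse(rules)  # a live crawler would call set_url(...) and read() instead

for url in ("https://example.com/products/widget",
            "https://example.com/search?q=widget"):
    verdict = "crawl" if parser.can_fetch("*", url) else "skip"
    print(f"{url} -> {verdict}")

Note that robots.txt only governs crawling; to keep a page out of the index itself, the robots meta tag mentioned above (for example, a noindex directive in the page's head) is the appropriate tool.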

In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
Click-through rate: Except for high purchase-intent searches, users click on paid search listings at a lower rate than on organic search listings, because organic listings carry more credibility with search engine users. In one UK study published by Econsultancy, only 6% of clicks went to paid listings; in another study it was 10%. The important thing to remember is that click-through rate varies by purchase intent: organic rankings earn higher click-through rates for “top of funnel” keyword searches.
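For clarity on the arithmetic: click-through rate is simply clicks divided by impressions, while the "share of clicks" figures above are a different ratio (paid clicks over all clicks). A quick sketch with made-up numbers, not taken from the studies cited:

# Click-through rate vs. share of clicks, with purely illustrative numbers.
def ctr(clicks: int, impressions: int) -> float:
    return clicks / impressions

organic_clicks, paid_clicks, impressions = 80, 10, 1000  # hypothetical SERP totals

print(f"organic CTR: {ctr(organic_clicks, impressions):.1%}")   # 8.0%
print(f"paid CTR:    {ctr(paid_clicks, impressions):.1%}")      # 1.0%
print(f"paid share of clicks: {paid_clicks / (organic_clicks + paid_clicks):.0%}")  # 11%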

You know who and where your best customers are, and Bing Ads lets you choose when and how to reach them. Control where your ads appear by city, state, or country, or target worldwide. Fine-tune your targeting further by setting the time of day your ads display and the devices they appear on. By targeting only your most relevant customers, you can cut unnecessary spending.
At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level: these sites aren't relying as much on Google for traffic anymore. In fact, many of them now get roughly 10 percent or less (sometimes slightly more) of their organic traffic from the search giant.