While it’s best to create a custom dimension for filtering out bots, applying the generic bot filter is a good place to start. Note that filters are not applied retroactively: historical data stays as it was reported, so if you’ve recently turned this feature on, expect reported traffic to drop only from that point forward. Additionally, double-check that you are filtering out your own traffic by excluding your own IP address.
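In Google Analytics these filters are configured in the admin UI, but the underlying logic is simple. As a minimal sketch, here is the same idea applied to hypothetical raw hit records (the IP addresses and bot signatures below are illustrative assumptions, not real values):

```python
# Hypothetical raw hits; in GA this filtering happens server-side via view filters.
OWN_IPS = {"203.0.113.7"}                      # assumption: your own office IP
BOT_SIGNATURES = ("bot", "spider", "crawler")  # generic bot markers in user agents

def is_filtered(hit):
    """Return True if the hit should be excluded from reports."""
    if hit["ip"] in OWN_IPS:
        return True
    ua = hit["user_agent"].lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

hits = [
    {"ip": "203.0.113.7", "user_agent": "Mozilla/5.0"},    # your own visit
    {"ip": "198.51.100.4", "user_agent": "Googlebot/2.1"}, # a crawler
    {"ip": "198.51.100.9", "user_agent": "Mozilla/5.0"},   # a real visitor
]
clean = [h for h in hits if not is_filtered(h)]
```

Only the third hit survives the filter, which is exactly what you want your reports to reflect.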
When you run a PPC campaign, whether through Google or another PPC provider, you can track how much site traffic it drives in this part of your sources report. For proper PPC campaign management, you'll also need to review whether that traffic actually converts. As with email marketing as a source, be sure to include tracking tokens in all of your paid search campaigns so the traffic is properly bucketed as paid search.
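Those tracking tokens are UTM query parameters appended to your landing-page URLs. A minimal sketch of building one (the base URL and campaign name are placeholders; `utm_medium=cpc` is the value analytics tools conventionally bucket as paid search):

```python
from urllib.parse import urlencode

def tag_ppc_url(base_url, campaign, source="google", medium="cpc"):
    """Append UTM tracking tokens so the visit is attributed to paid search."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

url = tag_ppc_url("https://example.com/landing", "spring-sale")
```

Use the tagged URL as the destination in your ad platform; untagged ad clicks often fall into referral or direct instead.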
Relying too heavily on one source of traffic is a risky strategy. A channel or strategy that is fantastic for generating traffic today won't necessarily stay that way tomorrow. Some sites lost out when Penguin started penalizing certain SEO link-building practices. Others lost out when Facebook massively restricted organic reach. If your site relies exclusively on one source of traffic, an algorithm change can create serious trouble for you. So be aware of the importance of diversifying, and know the options available from different traffic sources.
This topic actually seems quite controversial. Google answered the question with what could be taken as a denial, but their answer was open to interpretation. On the other hand, there are studies (one of them from Moz) showing that linking out has an impact. So, how can you be so assertive? Does that conclusion come from your own experiments?
A great way to break your web traffic without even noticing is through robots.txt. The robots.txt file has long been debated among webmasters, and it can be a powerful tool when it is well written. Yet that same robots.txt is also a tool you can shoot yourself in the foot with. We’ve written an in-depth article on the critical robots.txt mistakes that can ruin your traffic in another post.
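The classic self-inflicted wound is a stray `Disallow: /`, which tells every crawler to stay away from the entire site. As a quick sketch, Python's standard `urllib.robotparser` lets you test what a given robots.txt actually blocks before you deploy it (the domain below is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# An accidental blanket Disallow blocks the whole site for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With the rules above, Googlebot may not fetch any page on the site.
print(parser.can_fetch("Googlebot", "https://example.com/pricing"))
```

Running a check like this against your staging robots.txt is a cheap safeguard against de-indexing yourself.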
Google measures average time on site by first collecting each visitor’s exact time on a particular page. Imagine that a visitor lands on page 1 of your site. Google places a cookie containing a unique code for the visitor and a time stamp. When that visitor clicks through to page 2 of your site, Google again notes the time, and then subtracts the time the visitor arrived at page 1 from the time the visitor arrived at page 2. Google then averages the time spent on each page to get the average time each visitor spends on the site. (Note that the time spent on the final page of a visit can’t be measured this way, since there is no subsequent pageview to subtract from.)
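The arithmetic above can be sketched in a few lines. The pageview timestamps here are invented for illustration; the key points are that each page's duration is the next arrival minus the current one, and that the final page drops out of the average:

```python
from datetime import datetime

# Hypothetical pageviews for one visitor: (page, arrival time).
pageviews = [
    ("page-1", datetime(2024, 1, 1, 12, 0, 0)),
    ("page-2", datetime(2024, 1, 1, 12, 2, 30)),
    ("page-3", datetime(2024, 1, 1, 12, 3, 30)),
]

# Time on each page = next arrival minus this arrival.
# The last page has no "next" hit, so it contributes no duration.
durations = [
    (later - earlier).total_seconds()
    for (_, earlier), (_, later) in zip(pageviews, pageviews[1:])
]
avg_time_on_page = sum(durations) / len(durations)
```

Here the visitor spent 150 seconds on page 1 and 60 seconds on page 2, for an average of 105 seconds per measured page.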
Organic traffic is a special kind of referral traffic, defined as visitors that arrive from search engines. This is what most marketers strive to increase. The higher you rank for certain keywords, the more often your search result appears (increasing your impressions), which ultimately results in more visitors (clicks). It’s also important to note that paid search ads are not counted in this category.
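To make the distinction concrete, here is a rough sketch of how an analytics tool might bucket a session from its referrer and UTM medium. This is a simplification under assumed inputs, not the actual logic any specific product uses:

```python
# Simplified search-engine referrer markers (illustrative, not exhaustive).
SEARCH_ENGINES = ("google.", "bing.", "duckduckgo.", "yahoo.")

def classify(referrer, utm_medium=None):
    """Rough sketch of bucketing a session into a traffic source category."""
    if utm_medium == "cpc":
        return "paid search"   # ads are split out, never counted as organic
    if referrer is None:
        return "direct"        # no referrer information at all
    if any(se in referrer for se in SEARCH_ENGINES):
        return "organic"       # arrived from a search engine result
    return "referral"          # arrived from some other site
```

Note that a click from a Google ad and a click from a Google organic result share a referrer; only the tracking token separates them, which is why tagging matters.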

Mobile traffic: In the Groupon experiment mentioned above, Groupon found that both browser and device affect web analytics’ ability to track organic traffic. While desktops using common browsers saw a smaller impact from the test (10-20 percent), mobile devices saw a 50 percent drop in direct traffic when the site was de-indexed. In short, as mobile usage grows, we are likely to see even more organic search traffic misreported as direct traffic.