Google measures average time on site by first collecting each visitor’s time on each page. Imagine that a visitor lands on page 1 of your site. Google places a cookie containing a unique code for the visitor and a time stamp. When that visitor clicks through to page 2 of your site, Google again notes the time, then subtracts the time the visitor arrived at page 1 from the time the visitor arrived at page 2 to get the time spent on page 1. (Because the last page of a visit triggers no further hit, no time can be recorded for it.) Google then averages the time spent on each page to get the average time each visitor spends on the site.
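The arithmetic above can be sketched in a few lines of Python. The pages and timestamps are hypothetical; the point is that each page’s time is the next hit’s arrival minus its own, so the final page of the visit contributes nothing:

```python
from datetime import datetime

# Hypothetical pageview hits for one visitor: (page, arrival timestamp).
hits = [
    ("/page-1", datetime(2024, 1, 1, 10, 0, 0)),
    ("/page-2", datetime(2024, 1, 1, 10, 2, 30)),
    ("/page-3", datetime(2024, 1, 1, 10, 3, 30)),
]

# Time on each page = next page's arrival minus this page's arrival.
# The final page has no subsequent hit, so no time is recorded for it.
times = [
    (hits[i][0], (hits[i + 1][1] - hits[i][1]).total_seconds())
    for i in range(len(hits) - 1)
]

total = sum(seconds for _, seconds in times)
print(times)   # [('/page-1', 150.0), ('/page-2', 60.0)]
print(total)   # 210.0 seconds measured for this visit
```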
Users land on this page without tracking code. They then click through to a deeper page that does have tracking code. From GA’s perspective, the first hit of the session is the second page visited, so the referrer appears to be your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as it is by default), the session is bucketed as direct. This happens even if the first URL is tagged with UTM campaign parameters, because the untracked landing page never sends those parameters to GA.
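A rough Python sketch of this bucketing logic may make it concrete. This is not GA’s actual code, just the decision rule described above, with hypothetical function and parameter names:

```python
def classify_traffic_source(referrer_domain, own_domain, exclusion_list,
                            utm_source=None):
    """Rough sketch of the bucketing described above (not GA's actual code)."""
    if utm_source:
        # UTM parameters on the first *recorded* hit win. With an untracked
        # landing page, any UTM tags on that first URL are never sent, so
        # this branch is skipped and the session falls through to "direct".
        return f"campaign:{utm_source}"
    if referrer_domain is None or referrer_domain in exclusion_list:
        return "direct"
    if referrer_domain == own_domain:
        return "self-referral"
    return f"referral:{referrer_domain}"

# First recorded hit carries your own domain as the referrer; with that
# domain on the exclusion list (the default), the session becomes direct:
print(classify_traffic_source("example.com", "example.com", {"example.com"}))
# → direct
```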
Do you have a content strategy in place, or are your efforts more “off the cuff”? Not having a clearly defined keyword map can spell trouble — especially if two or more pages are optimized for the same keyword. In practice, these pages compete against each other in the SERPs, potentially reducing the rankings of both. Here is an example of what this might look like:
Armed with this information, ecommerce marketers can learn why some sources might be underperforming or focus efforts on sources that drive better quality traffic. In some cases, this might mean relying less on search engines and more on social media or brand awareness. Other times the opposite could be the best course of action. Either way, the Traffic Sources section in Google Analytics can help.
Hey Mavis, thanks for that :) I wrote down all the ideas from the comments as well, so there are at least 65 ideas now, lol. I just wish we all had more time in a day to really work on all of them. I guess all that testing and tracking comes now so we can focus on the best ones for our blogs. But do I even need to say how much I hate testing and tracking?
So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.
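As a minimal sketch of that first step, the helper below appends UTM parameters to a URL before it is handed to a share button. The function name and parameter values are illustrative, not a required convention:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_share_url(url, source, medium="social", campaign="share"):
    """Append UTM parameters so privately shared links stay attributable."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))   # keep any existing parameters
    query.update({
        "utm_source": source,       # e.g. whatsapp, email, slack
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_share_url("https://example.com/post?id=7", "whatsapp"))
# → https://example.com/post?id=7&utm_source=whatsapp&utm_medium=social&utm_campaign=share
```

The same tagged URL can also be run through a shortener, as the paragraph above suggests, as long as the redirect preserves the parameters.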
Implementing structured data markup (such as that from schema.org) might seem like a one-time project, but that “set it and forget it” mentality can land you in hot water. You should monitor the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information. Changes to your website’s content can alter the markup without warning.
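A simple automated check along these lines is to re-read the JSON-LD on key pages and confirm the fields your rich results depend on are still present. The markup below is a hypothetical schema.org Article block, and the required-field list is an assumption you would tailor to your own snippets:

```python
import json

# Hypothetical Article markup; field names follow the schema.org vocabulary.
markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How rich snippets work",
    "datePublished": "2024-01-15",
    "author": {"@type": "Person", "name": "Jane Doe"},
}

# Minimal periodic check: the fields a rich result depends on still exist.
required = ["headline", "datePublished", "author"]
missing = [field for field in required if field not in markup]
print(json.dumps(markup, indent=2))
print("missing fields:", missing)   # → missing fields: []
```

In practice you would fetch the live page, extract the `<script type="application/ld+json">` block, and run this check on a schedule, alerting when `missing` is non-empty.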
Brankica, I like your point about being a master of catchy titles when using CommentLuv. I can also see that you use a different link whenever possible when replying to comments here. This just proves you know what you are talking about: great insight into getting traffic from multiple sources and looking for alternative traffic, not just thinking about getting visitors from search engines.
We have been in the SEO industry for a while now, wearing many hats: digital marketers, tool developers, advisers, researchers, copywriters, and so on. But first of all we are, just like you, site owners. We have been through ups and downs with organic traffic, and we too have found ourselves trying to understand why our Google traffic dropped dramatically. With all this in mind, we thought we would ease your work and put together a list of the top 16 reasons that can cause a sudden traffic drop, unrelated to Google algorithm changes or penalties.