The "Direct Session" dimension, not to be confused with the default channel grouping or source dimension, is something different. It can tell you if a session was genuinely from its reported campaign (reported as No), or if it was simply bucketed there due to the last non-direct click attribution model (reported as Yes). I didn't mention this dimension in the article because it has some flaws which can cause brain bending complexities, plus it goes a little beyond the scope of this article, which I tried to gear more towards marketers and non-expert GA users. It's certainly an interesting one, though.
If you’re building your website from scratch, create your own templates for new pages and important elements so you don’t have to keep typing out the same code. Also make sure you’re familiar with dynamic web pages so you can edit elements like your website’s header in one place instead of having to make the same changes manually across every page.
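As one illustration of the "edit it in one place" idea, here is a minimal sketch using a server-side include, assuming a server with SSI enabled (such as Apache with mod_include); the file names are hypothetical:

```html
<!-- page.html: the header and footer each live in one shared file,
     so a change there updates every page that includes them -->
<!--#include virtual="/includes/header.html" -->
<main>
  <h1>Page-specific content goes here</h1>
</main>
<!--#include virtual="/includes/footer.html" -->
```

Server-side languages and templating engines (PHP includes, Jinja, and the like) achieve the same thing: the header lives in one file, and every page pulls it in at render time.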
Remember when you used to rely solely on search engines for traffic? Remember when you worked on SEO and lived and died by your placement in Google? Were you #1? Assured success. Well, okay, maybe not assured. Success only came if the keywords were relevant to your site users, but it was the only real roadmap to generating site traffic and revenue.
Let’s say you wish to block all URLs that have the .pdf extension, so you write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in “.pdf” should be kept out of the crawl, while any other URL that merely contains “.pdf” may still be crawled. It might sound complicated, but the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
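To make the difference concrete, here is a short sketch of the two variants (Googlebot supports the `*` wildcard and the `$` end-of-URL anchor; the example paths are hypothetical):

```
# Blocks only URLs that END in .pdf, e.g. /guide.pdf
# (/guide.pdf?download=1 would still be crawlable)
User-agent: Googlebot
Disallow: /*.pdf$

# Without the $, this blocks ANY URL containing ".pdf",
# including /guide.pdf?download=1
User-agent: Googlebot
Disallow: /*.pdf
```

Running new rules through a robots.txt testing tool, such as the one in Google Search Console, before deploying them is a cheap way to avoid the mistake described above.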
Using the Content Assistant feature of Cognitive SEO, I’ve included the majority of the keywords suggested by the tool in the content assets we’re aiming to boost in organic visibility. I’ve also restructured the underperforming content and added new sections to make the inclusion of the suggested key phrases more logical and natural, and to create more relevant content.
Put simply, duplicate content is content that appears on the web in more than one place. Is this a big problem for readers? Maybe not. But it is a very big problem for you if you have duplicate content (internally or externally), because search engines don’t know which version of the content is the original and which one they should rank.
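One common fix, sketched below, is to point every duplicate at a single preferred URL with a canonical tag (the URL here is just a placeholder):

```html
<!-- Placed in the <head> of each duplicate page, this tells search
     engines which URL is the original, preferred version -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

Search engines treat the canonical as a strong hint rather than a binding directive, but it resolves most internal duplication problems.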

Broadly speaking, there are two misconceptions about direct traffic. The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.


Hey Michael, thanks so much for the comment. After this post, I decided to revamp my traffic generation strategy: keep working on my existing sources and put a bit more effort into the ones I wasn't using that much. I saw results almost from the first day. This list is good because of the diversity of the ideas, because if one thing doesn't work for you, there has to be another one that will skyrocket your stats :)
If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data that is being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like Let's Encrypt, but don’t neglect to explore the potentially significant SEO implications of a site migration before you switch. Remember, HTTPS and HTTP/2 are the future of the web.
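As a minimal sketch of the redirect side of such a migration, assuming an Apache server with mod_rewrite enabled (rules like this typically go in your .htaccess or vhost config):

```apache
# 301-redirect every HTTP request to its HTTPS equivalent,
# preserving the host and the requested path
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The permanent (301) redirect matters here: it tells search engines to consolidate ranking signals onto the HTTPS URLs rather than treating them as new pages.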
Relying too much on one source of traffic is a risky strategy. A particular channel or strategy can be fantastic for generating traffic today, but that doesn’t mean it will stay that way tomorrow. Some sites lost out when Penguin started penalizing certain SEO linking practices. Others lost out when Facebook decided to massively restrict organic reach. If your site relies exclusively on one source of traffic, algorithm changes can create serious trouble for you. So be aware of the importance of diversifying, and know the options available from different traffic sources.
People who use the ambiguous phrase “social media marketing” are typically referring to advertising: you broadcast your message and hope people will listen. Even if you overcome consumer indifference with a well-targeted campaign, any subsequent interactions are affected by their very public nature. The privacy of dark social, by contrast, represents a potential goldmine of intimate, targeted, and relevant interactions with high conversion potential. Nebulous and difficult to track though it may be, dark social has the potential to let marketers tap into the elusive power of word of mouth.
Having large groups of content that all revolve around the same topic builds more relevance around the keywords you're trying to rank for, and it makes it much easier for Google to associate your content with specific topics. Not only that, it also makes it much easier to interlink between your pieces of content, pushing more internal links through your website.



The term was first used by Internet theorist John Kilroy in a 2004 article on paid search marketing. Because the distinction is important (and because the word "organic" has many metaphorical uses), the term is now in widespread use within the search engine optimization and web marketing industry. As of July 2009, "organic search" had become common currency outside the specialist web marketing industry, and is even used frequently by Google (throughout the Google Analytics site, for instance).
Brankica, excellent list. The only one I could think of to add is mentioning your blog through email to family and friends. Example: "Hey guys and gals, sorry I haven't been in touch for a while. I've been working my tail off building something I hope will be special, and I would appreciate it if you could do me a huge favor. My blog is still new and I need some opinions on where it can improve. Let me know the good, the bad and the ugly :) Also, can you pass it along to some of your friends? The more opinions, the better idea I have of what works and what doesn't." Thanks for the post. Live it LOUD!
The second major source of traffic for e-commerce sites is organic search, which is responsible for 32 percent of overall monthly traffic. Interestingly, while the ratio of search vs paid traffic for e-commerce websites is 20:1, the average ratio for all industries is 8:1 (87.17% search clicks and 13.23% paid clicks as of January 2018), which leaves room for growth of paid traffic for online retailers.
Guest blogging purely for inbound links is a flawed strategy because the value of those links is going down. However, guest blogging for traffic is still an incredibly viable strategy. While the inbound link you get at the end of a guest post doesn’t have as much SEO value as it used to, it still has the value of exposing your content to a new audience.