We have been using AccuRanker for the past few months and the team has plenty of good things to say about it. Not only do the rankings seem accurate, but the platform also includes a lot of added features that make reporting on the data very effective. As with any new tool, we have put forward some recommendations for features, and AccuRanker has been very receptive to these ideas; we are hopeful they will be implemented in the near future.
Get really good at campaign tagging: Even amongst data-driven marketers I encounter the belief that UTM tagging begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing silly things like tagging internal links. Control what you can, and your ability to carry out meaningful attribution will improve markedly.
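As a concrete illustration, here is a minimal sketch of building consistently tagged campaign URLs. The `build_utm_url` helper and the example parameter values are hypothetical, not part of any particular email or analytics platform.

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def build_utm_url(base_url, source, medium, campaign, content=None):
    """Append UTM parameters to an outbound (never internal) link."""
    parts = urlparse(base_url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,      # where the traffic comes from, e.g. "newsletter"
        "utm_medium": medium,      # the channel, e.g. "email" or "social"
        "utm_campaign": campaign,  # the campaign name, e.g. "spring_sale"
    })
    if content:
        query["utm_content"] = content  # optional: distinguish individual links or creatives
    return urlunparse(parts._replace(query=urlencode(query)))

# Example: tag a link used in an email campaign
print(build_utm_url("https://example.com/pricing",
                    source="newsletter", medium="email", campaign="spring_sale"))
```

Keeping the naming scheme consistent (lowercase, no spaces, agreed values for source and medium) is what makes the resulting attribution reports meaningful later on.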
Hi Brankika, With respect to EZA, I no longer submit to them. They rejected some of my articles, saying that the page I was linking to did not contain enough information. I linked to my blog and to the original article. I believe they didn't like the link to the original article. That being the case, they no longer allow one of the cardinal rules of syndication as per Google itself..."Link To The Original". Once they stopped allowing that, they were no longer useful to me. Thanks for the great resource. Mark
Referral traffic in Google Analytics can also include your social traffic or even your email visits. This is largely because there are different ways to track data and there are imperfections in the system, but it also depends on which reports you look at. For example, Acquisition > All channels > Referrals will include much of your social traffic, but the default All channels report does not include Social in your Referrals. The best solutions:
But now, that is all changing. We can still see all of that information ... sometimes. Things got a whole heck of a lot more complicated a little over a year ago, when Google introduced SSL encryption. What does that mean? It means Google started encrypting keyword data for anyone conducting a Google search while logged in to Google. So if you're logged in to, say, Gmail when you do a search, the keyword you used to arrive on a website is now encrypted, or hidden. So if that website drilled down into its organic search traffic source information, it would see the keyword reported as "(not provided)" instead.
For many startups, this means doing enterprise SEO on a small business budget, which comes with a few compromises. The problem is, Google doesn’t accept compromises when it comes to search optimisation, and you need to get the fundamentals spot on. The good news is, the sooner you get these right, the faster you’ll be able to build a self-sustaining SEO strategy that doesn’t come back to bite you in the budget later.
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
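To make the risk concrete before getting to that list, here is a minimal sketch of how a single over-broad Disallow rule can lock crawlers out of an entire site. The robots.txt content and URLs are hypothetical, and this is an illustration rather than one of the specific mistakes discussed below.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that was meant to block only /admin/,
# but an extra blanket rule blocks the whole site for every crawler.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Every important page now reports as blocked for Googlebot.
for url in ("https://example.com/", "https://example.com/blog/seo-basics"):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'BLOCKED'}")
```

Running a quick check like this against your live robots.txt before and after deployments is a cheap way to catch the kind of mistake that silently drains organic search traffic.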
Direct traffic is defined as visits with no referring website. When a visitor follows a link from one website to another, the site of origin is considered the referrer; referrers can be search engines, social media sites, blogs, or any other websites that link out. Direct traffic, by contrast, covers visits that do not come from a referring URL.