"We have used quite a few tools for tracking SERPs and keywords as part of our content marketing effort. And there was always something missing. It wasn’t until we found AccuRanker that we were completely satisfied. It has increased our productivity. The powerful filters, tagging, instant checks, integration with GSC, multiple URL detection per keyword, accurate search volume, and notes, are all features we now can’t live without. AccuRanker really has taken our SEO efforts to another level. We were able to grow our organic traffic by 571% in just 13 months."

Hey Julia, thanks so much for the comment. I just saw the idea you were talking about, and that is awesome. I think you are right because, although it seems a bit inconvenient, I've personally written down a few URLs like that in my cell phone or notebook to check out when I'm back at the computer. Wow, this is something I need to think about. Also, how about those big billboards?
Implementing structured data markup (such as schema.org) might seem like a one-time project, but that "set it and forget it" mentality can land you in hot water. You should be monitoring the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information. As you change the content on your website, the markup can break or fall out of sync without warning.
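If you want to automate part of that monitoring, here's a minimal sketch of the idea, assuming Python 3 with the third-party requests library; the URL, the expected field names, and the regex-based extraction are simplifications for illustration, not a production-grade parser:

```python
# Minimal sketch of a periodic rich-snippet check (hypothetical URL and fields).
# Assumes Python 3 with the third-party "requests" library installed.
import json
import re

import requests

JSONLD_RE = re.compile(
    r'<script[^>]+type="application/ld\+json"[^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def extract_structured_data(url):
    """Fetch a page and return every JSON-LD block it declares."""
    html = requests.get(url, timeout=10).text
    blocks = []
    for raw in JSONLD_RE.findall(html):
        try:
            blocks.append(json.loads(raw))
        except json.JSONDecodeError:
            print(f"Warning: malformed JSON-LD on {url}")
    return blocks

def check_expected_fields(url, expected_fields):
    """Flag pages whose markup no longer contains the fields you rely on."""
    present = set()
    for block in extract_structured_data(url):
        if isinstance(block, dict):
            present.update(block.keys())
    missing = [field for field in expected_fields if field not in present]
    if missing:
        print(f"{url}: missing {missing} - markup may have drifted")
    else:
        print(f"{url}: all expected fields present")

if __name__ == "__main__":
    # Example: a product page that should always expose name, offers and rating.
    check_expected_fields(
        "https://www.example.com/product/espresso-machine",  # placeholder URL
        ["name", "offers", "aggregateRating"],
    )
```

Run on a schedule (cron, a CI job, etc.), a check like this can alert you when a template or content change silently drops a field your rich snippets rely on.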
The thing about SEO in 2018 is that Google changes its algorithms more than once a day! Reports say the company makes up to 600 algorithm changes a year. While the majority of those updates are minor, among them is the occasional major update, like Hummingbird or Panda, that can wreak havoc on your traffic and search rankings.
In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.
I'm not in the contest, but if I was - I'd be SKEEEEERED. This was awesome, really - there were MAYBE 3 things I knew... The rest was new. I'm lame that way. The great thing about this is that as a blogger - you've covered ideas I've never thought of...I get my traffic mostly from SEO (not on my blog, but in my websites which are product based review sites) - but there's enough meat in this post I can use for my niche sites to keep me in the black, so to speak (ink I mean, not socks). Awesome post, Brankica - I'm speechless. (If you ignore the foregoing paragraph.)

Fantastic post Tom! There's a lot of confusion around direct traffic to begin with, and these are even finer nuances that you've explained here. I have a client whose top source of traffic is direct, and part of that may be due to the fact that they don't do any promotion and it's mostly shared around internally within their team. But I just noticed that their URL is http and not https, so this might be something for me to take back to the team. Thanks!

Clean, fast code is important, and you need to be aware of this if you're using WordPress themes or other CMS platforms that typically come with a lot of bloated code. Despite its chunky build, one of the key benefits of using WordPress is its template structure, which lets you create new pages at the push of a button and write content in a visual interface rather than coding everything yourself.

Many bloggers struggle to find the time to maintain a consistent posting frequency, and search engine bots tend not to favor blogs that are updated irregularly. Keeping a regular schedule isn't that tough; you just need to be a bit organized. But posting frequently doesn't mean writing articles unrelated to your niche. Always write articles related to your niche and pay attention to your keywords.
Forum comments work much the same way as blog comments, except you tend to lack the base post from which to play off. Instead, you need to invest some time into getting to know the culture of the forum, the prominent users, the rules, and the discussion flow. If people generally post one sentence at a time, adding a 3,000-word post will be excessive and may be mocked. If people tend to post lengthy discussions, short posts may have a negative effect. And, like Reddit, some forums are very strict about enforcing their no-advertising rules.

Get really good at campaign tagging: Even amongst data-driven marketers I encounter the belief that UTM begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing ill-advised things like tagging internal links, which overwrites the visitor's original traffic source with a new session. Control what you can, and your ability to carry out meaningful attribution will markedly improve.
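One of the easiest ways to "control what you can" is to build tagged URLs from a single helper rather than by hand, so source, medium, and campaign names stay consistent. Here's a minimal sketch using only Python's standard library; the landing page and campaign values are made-up examples:

```python
# Minimal sketch of consistent campaign tagging (standard library only).
# The URL and campaign names below are invented for illustration.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_url(url, source, medium, campaign, content=None):
    """Append UTM parameters to a landing-page URL, preserving existing ones."""
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    params.update({
        "utm_source": source.lower(),      # keep casing consistent:
        "utm_medium": medium.lower(),      # "Email" and "email" show up as
        "utm_campaign": campaign.lower(),  # separate rows in your reports
    })
    if content:
        params["utm_content"] = content.lower()
    return urlunparse(parts._replace(query=urlencode(params)))

print(tag_url(
    "https://www.example.com/spring-sale",
    source="newsletter",
    medium="email",
    campaign="spring_sale_2018",
    content="hero_button",
))
# -> https://www.example.com/spring-sale?utm_source=newsletter&utm_medium=email&...
```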

It increases relevancy: Siloing ensures all topically related content is connected, and this in turn drives up relevancy. For example, linking to each of the individual yoga class pages (e.g., Pilates, Yoga RX, etc.) from the "Yoga classes" page helps confirm, to both visitors and Google, that these pages are in fact different types of yoga classes. Google can then feel more confident ranking these pages for related terms, as it is clearer the pages are relevant to the search query.
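To make the structure concrete, here's a minimal sketch that models a silo as a simple data structure and lists the internal links it implies. The yoga and massage pages are illustrative examples, not a prescription:

```python
# Minimal sketch of a content silo as a data structure (example pages are made up).
# Each hub page links down to its children; children link back up and across
# within the same silo only, which keeps topical relevance concentrated.
SILOS = {
    "/yoga-classes/": ["/yoga-classes/pilates/", "/yoga-classes/yoga-rx/", "/yoga-classes/hot-yoga/"],
    "/massage/": ["/massage/deep-tissue/", "/massage/sports-massage/"],
}

def internal_links_for(page):
    """Return the internal links a siloed structure suggests for a given page."""
    for hub, children in SILOS.items():
        if page == hub:
            return children                       # hub links to every child
        if page in children:
            siblings = [c for c in children if c != page]
            return [hub] + siblings               # child links up and sideways
    return []

for page in ["/yoga-classes/", "/yoga-classes/pilates/"]:
    print(page, "->", internal_links_for(page))
```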

Now, in your reconsideration request, make sure you are honest and tell Google everything that the prior agency was up to. Be sure to include the spreadsheet documenting the links you have removed, and state that you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.
I would like to talk about a case study for a large start-up I worked on for over eight months in the Australian and US markets. This client originally came to the company with the typical link building and SEO problems. They had been using an SEO company that had an extensive link network and had relied on less-than-impressive SEO tactics and methodologies over the previous 12 months. The company was also losing considerable revenue as a direct result of this low-quality SEO work. So, I had to scramble and develop a revival strategy for this client.

A few links down and I've noticed that Brian has a link from WordPress.org. Not bad! Turns out that his content has been referenced within one of WordPress's codex posts. If I were to reach out and offer some additional insight, citing one of my articles, there's a chance I could bag a similar link, especially considering they have a 'Useful Resources' section.


Armed with this information, ecommerce marketers can learn why some sources might be underperforming or focus efforts on sources that drive better quality traffic. In some cases, this might mean relying less on search engines and more on social media or brand awareness. Other times the opposite could be the best course of action. Either way, the Traffic Sources section in Google Analytics can help.
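If you export that source data (or pull it via the API), even a rough comparison of conversion and bounce rates by source makes the quality differences obvious. A minimal sketch, using invented numbers purely to illustrate the calculation:

```python
# Minimal sketch of comparing traffic-source quality from exported analytics data.
# The figures below are invented purely to illustrate the calculation.
sessions_by_source = {
    # source / medium:      (sessions, transactions, bounces)
    "google / organic":     (12000, 240, 5400),
    "facebook / social":    (4000, 30, 2600),
    "(direct) / (none)":    (6500, 180, 2100),
}

print(f"{'Source':<22}{'Conv. rate':>12}{'Bounce rate':>14}")
for source, (sessions, transactions, bounces) in sessions_by_source.items():
    conv = transactions / sessions * 100
    bounce = bounces / sessions * 100
    print(f"{source:<22}{conv:>11.1f}%{bounce:>13.1f}%")
```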

This is a breath of fresh air. Information Technology has, for too long, been plagued by those exploiting the ignorance of well-intentioned people. The more good information we have, the better we will do and the better services will become. We put useful information on our website for people wanting to start renting out cottages. It is extensive, and it quite annoys our rivals, who keep this sort of thing secret, only giving it out if you 'send for an information pack'. This, then, gives them the green light to pester you with sales calls and threatened sales visits. Generosity with information is the way forward. People remember it and associate it with a positive attitude towards business relationships. Thank you for your help with these links.
There are always high profile blogs in your industry, no matter what that industry is. It might be sites like Business Insider, Forbes, and Inc. It might be sites like Medium and Gawker. It might be sites like Search Engine Journal and QuickSprout. The fact is, every industry has its dominant forces, and as long as you’re not the dominant blog, you can use the dominant blogs as traffic sources for your own.
Thanks for this handy info. I wasn't aware that Google owned Vark, so I'm off to check it out. There is so much to learn about the best sites to use on the internet for traffic generation. I'm so frightened of spending more time surfing than actually writing for my own blogs that I probably don't do nearly enough 'looking'. Does anyone else have this problem of browsing time vs. actual work time?

Lee Wilson is head of enterprise SEO at Vertical Leap and has been leading digital marketing teams since the early 2000s. He has headed up the SEO department at Vertical Leap, a top-10 search and digital agency, since 2010. In 2016, his first solely authored industry book, Tactical SEO: The Theory and Practice of Search Marketing, was published.
Are you currently excluding all known bots and spiders in Google Analytics? If not, you may be experiencing inflated traffic metrics and not even know it. Typically, bots enter through the home page and cascade down throughout your site navigation, mimicking real user behavior. One telltale sign of bot traffic is a highly trafficked page with a high bounce rate, low conversions and a low average time on page.
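If you want a quick first pass at spotting that pattern in your own data, here's a minimal sketch; the thresholds and page figures are invented assumptions for illustration, not Google Analytics defaults:

```python
# Minimal sketch of flagging pages that match the bot pattern described above:
# lots of traffic, a high bounce rate, low conversions and little time on page.
# Thresholds and page data are illustrative assumptions.
pages = [
    # path,            pageviews, bounce_rate, conversions, avg_time_on_page (s)
    ("/home",              52000,        0.91,           2,                   4),
    ("/pricing",            8000,        0.38,         310,                  75),
    ("/blog/seo-guide",    15000,        0.55,          40,                 190),
]

def looks_like_bot_traffic(pageviews, bounce_rate, conversions, avg_time):
    return (
        pageviews > 10000       # heavily trafficked
        and bounce_rate > 0.85  # almost everyone "leaves" immediately
        and conversions < 10    # but hardly anyone converts
        and avg_time < 10       # and nobody actually reads the page
    )

for path, pv, br, conv, secs in pages:
    if looks_like_bot_traffic(pv, br, conv, secs):
        print(f"Investigate possible bot traffic on {path}")
```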
You could get even more specific by narrowing it down to customer base. Is there a specific group of clients you tend to serve? Try including that in your long-tail key phrase. For example: “SEO agency for non-profits in Albuquerque NM.” That’s a key phrase you’re a lot more likely to rank for. Not to mention it will also attract way more targeted, organic traffic than a broad key phrase like “SEO agency.”
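If you're brainstorming these long-tail variations at any scale, a few lines of code can enumerate the combinations for you. A minimal sketch, with made-up example modifiers:

```python
# Minimal sketch of generating long-tail keyword candidates by combining a core
# service with audience and location modifiers (all example strings are made up).
from itertools import product

service = "SEO agency"
audiences = ["for non-profits", "for law firms", "for ecommerce stores"]
locations = ["in Albuquerque NM", "in Santa Fe NM"]

for audience, location in product(audiences, locations):
    print(f"{service} {audience} {location}")
# e.g. "SEO agency for non-profits in Albuquerque NM"
```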