First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search traffic on a monthly basis. Then I will walk you through the steps we took to get the client to this level. I have also tried to keep this fairly general so everyone can adapt this case study to their own situation.
Let’s say you wish to block all URLs that have the .pdf extension. You would write a rule in your robots.txt that looks like this:

User-agent: Googlebot
Disallow: /*.pdf$

The “$” at the end tells bots that only URLs ending in .pdf shouldn’t be crawled, while any other URL that merely contains “pdf” may still be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should be on the safe side of your website’s traffic.
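If you want to sanity-check a wildcard rule before shipping it, here is a small Python sketch of my own (an assumption about how to model it, not an official parser) that mirrors the “*” and trailing “$” semantics described above:

```python
import re

def robots_pattern_matches(pattern, path):
    """Check whether a Google-style robots.txt pattern matches a URL path.
    '*' matches any sequence of characters; a trailing '$' anchors the
    match to the end of the URL. Without '$', it is a prefix match."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# With 'Disallow: /*.pdf$', only URLs truly ending in .pdf are blocked:
print(robots_pattern_matches("/*.pdf$", "/guides/brochure.pdf"))    # True
print(robots_pattern_matches("/*.pdf$", "/pdf-guides/index.html"))  # False
print(robots_pattern_matches("/*.pdf$", "/file.pdf?version=2"))     # False
# Drop the '$' and the same PDF-with-query-string URL is blocked too:
print(robots_pattern_matches("/*.pdf", "/file.pdf?version=2"))      # True
```

Notice the third example: with “$”, a PDF served with a query string slips through, which is exactly the kind of subtlety that one stray symbol can cause.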
According to our data, twenty-three out of the twenty-five largest retailers use Product Listing Ads (PLAs), which are cost-per-click ads that are purchased through AdWords to promote products. Google initially launched Product Listing Ads in the US market in 2011. However, the advertising format experienced an astronomical rise around 2014, when the search engine launched the feature in other countries. Today, PLAs account for 43 percent of all retail ad clicks and a staggering 70 percent of non-branded clicks!
The moral of this story is that everything is contextual. And this applies to everyday happenings as well as to your online marketing, traffic sources, and conversion rates. Sometimes what looks like an opportunity turns out to be a setback, and vice versa. We all make changes to our sites in the hope of winning tons of traffic. We are in a continuous race for inbound links, domain authority, technical SEO, diagnosing traffic drops, search volume, and keyword research, and we forget to take a few steps back and see how all of these influence our site’s overall performance. We’ve documented the ranking drop issue in an earlier post, and you can take a look at it to better understand this phenomenon.
So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.
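The campaign-tracking step above boils down to appending UTM parameters to the URLs your share buttons emit. Here is a minimal sketch using only the Python standard library (the parameter values are hypothetical examples, not a prescribed naming scheme):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def add_utm(url, source, medium, campaign):
    """Append UTM parameters to a URL so that shares via email,
    WhatsApp, Slack, etc. remain attributable instead of landing
    in the 'direct' bucket. Existing query parameters are preserved."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunsplit(parts._replace(query=urlencode(query)))

share_url = add_utm("https://example.com/post?id=7",
                    "whatsapp", "dark_social", "spring_launch")
print(share_url)
```

A WhatsApp share button would then link to this tagged URL (or to a shortened vanity URL that redirects to it), so the visit is attributed to the campaign rather than to direct traffic.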
It increases relevancy: Siloing ensures all topically related content is connected, and this in turn drives up relevancy. For example, linking to each of the individual yoga class pages (e.g. Pilates, Yoga RX, etc) from the “Yoga classes” page helps confirm—to both visitors and Google—these pages are in fact different types of yoga classes. Google can then feel more confident ranking these pages for related terms, as it is clearer the pages are relevant to the search query.
The amount of dark social that comprises one's direct traffic is going to vary. For example, you might be diligent about incorporating tracking tokens into your email marketing campaigns, and asking your co-marketing partners to do the same when they promote your site content. Great. You won't have a huge chunk of traffic dumped into direct traffic that should really go into email campaigns. On the other hand, perhaps you're embroiled in a viral video firestorm, and a video on your site gets forwarded around to thousands of inboxes ... you don't exactly have control over that kind of exposure, and as a result you'll be seeing a lot of traffic without referral data in the URL that, consequently, gets bucketed under direct traffic. See what I mean? It depends.
Another good thing to look at is domain authority and core page authority. If your site has had a few redesigns, moved URLs, or anything like that, it’s important to make sure that the domain authority has carried over. It’s also important to look at the page authorities of your core pages. If these are much lower than they were before the organic traffic slide, there’s a good chance your redirects weren’t done properly, and the page authority isn’t being carried over to the new URLs.
Guest blogging purely for inbound links is a flawed strategy because the value of those links is going down. However, guest blogging for traffic is still an incredibly viable strategy. While the inbound link you get at the end of a guest post doesn’t have as much SEO value as it used to, it still has the value of exposing your content to a new audience.
In all of the above cases, you’ve potentially found a page on your website that could turn into a keyword monster with a little extra content and keyword integration. Although we’ve discussed blog posts in this guide, don’t completely overlook the other pages on your website – these tips will work to increase organic traffic on all pages, not just individual blog posts.
I used to work with Ranker-Tracker and was more than pleased when I changed to AccuRanker, which is not only the fastest SERP tool available but also very accurate. The keyword data in AccuRanker is refreshed every single day. I can also get instant on-demand ranking updates - rankings refresh in a matter of seconds whenever we need them. Reporting is very easy and saves us time, as we can set automated emails to be sent directly to our own and our clients’ inboxes.
Now, in your reconsideration request, make sure you are honest and tell Google everything the prior agency was up to. Be sure to include your full spreadsheet of removed links and say you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.
Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm - essentially a flow chart based on the referring website or on parameters set within the URL - to determine the source of traffic. Here is a breakdown of all sources:
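That flow-chart logic can be sketched as a chain of rules. The following is a simplified, hypothetical Python sketch of the idea - not any platform’s actual algorithm, which uses far longer rule chains and configurable channel definitions:

```python
from urllib.parse import urlsplit

def classify_session(referrer, utm_medium=""):
    """Bucket a session by source, roughly in the order an analytics
    platform checks: explicit campaign tags first, then referrer."""
    if utm_medium:                      # explicit URL tagging wins first
        if utm_medium == "cpc":
            return "Paid Search"
        if utm_medium == "email":
            return "Email"
        return "Other Campaign"
    if not referrer:                    # no referrer and no tags
        return "Direct"                 # dark social lands here too
    host = urlsplit(referrer).netloc
    if any(e in host for e in ("google.", "bing.", "duckduckgo.")):
        return "Organic Search"
    if any(s in host for s in ("facebook.", "twitter.", "linkedin.")):
        return "Social"
    return "Referral"                   # any other website

print(classify_session("https://www.google.com/search"))  # Organic Search
print(classify_session(""))                                # Direct
print(classify_session("https://www.google.com/", "cpc"))  # Paid Search
```

Note how a tagged URL is classified before the referrer is even consulted - which is exactly why diligent UTM tagging keeps traffic out of the wrong bucket.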