Let’s say you wish to block all URLs that have the .pdf extension. You might write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf should not be crawled, while any other URL merely containing “.pdf” may still be crawled. It might sound complicated, but the moral of the story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
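Here are a few illustrative wildcard matches (a minimal sketch; every path except the .pdf rule is hypothetical, included only to show how the patterns behave):

User-agent: Googlebot
# $ anchors the match to the end of the URL, so only URLs that
# actually end in .pdf are blocked, e.g. /files/report.pdf
Disallow: /*.pdf$

# Without the $, the pattern would match ".pdf" anywhere in the URL,
# e.g. /files/report.pdf?download=1 or /archive/old.pdf.bak
# Disallow: /*.pdf

# * matches any sequence of characters, so this would block
# /private/, /private-archive/, /private2/docs, and so on
# Disallow: /private*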

Thanks for the comment, Slava. Good to see your team is on top of things, and happy you liked the post. The website in this case was a client who had taken on an agency doing lower-quality SEO work that was hurting the site, such as the huge link network and a strategy that revolved mainly around head terms. We saw no long-tail integration in the old agency's strategy, so we were able to yield great results from the start. The client's site has hundreds of high-quality articles which we were able to re-optimize and update as noted, and they also had a large index of high-quality pages to work from. Sure enough, the points listed above were key elements of a far wider strategy that could run to hundreds of points; I just wanted to include some of the biggest wins and the easiest points to implement.
Secure (https) to non-secure sites (http): Since Google began emphasizing the importance of having a secure site, more websites are hosted securely, as indicated by the “https” in their URLs. Per the security protocol, however, any traffic going from a secure site to a non-secure site will not pass referral information. You can correct this issue by making your own site secure with a third-party SSL certificate.
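If your server runs Apache, a minimal sketch of the redirect half of that fix (assuming an SSL certificate is already installed) looks like this in your .htaccess file:

# Send every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Once your pages are served over HTTPS, referral data from other secure sites is no longer stripped, so those visits stop being misattributed to direct traffic.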

Buy Canadian web traffic visitors from Yahoo Canada for 30 days – We will send real Canadian visitors to your site using Yahoo Canada's search field with your keywords to improve your SERP CTR and SEO strategy. All visitors will be shown as organic traffic in your Google Analytics. Having a constant stream of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait; use our Yahoo Canada Web Traffic Service today …
For our client: We rolled out a successful implementation of rel="author" for the company's three in-house content writers. The client had over 300 articles written by these writers over the years, and it was possible to implement rel="author" for all of the aged articles. I advise anyone who has a large section of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, as it can only benefit CTR.
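A minimal sketch of both kinds of markup, with hypothetical author and course details (your URLs and names will differ):

<!-- rel="author" on the article byline, pointing to the writer's profile page -->
<a href="https://example.com/authors/jane-doe" rel="author">Jane Doe</a>

<!-- JSON-LD schema markup for a course page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Course",
  "name": "Introduction to SEO",
  "description": "A beginner-friendly course on search engine optimization.",
  "provider": {
    "@type": "Organization",
    "name": "Example Academy"
  }
}
</script>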
This section is particularly helpful when looking at organic results from search engines, since it will let you know which search queries resulted in engaged traffic. Below is another example from a site that focuses on electronic components. Overall, the Google organic source was well behind the site average, but some specific search queries were actually performing better than average.
If your page is ranking for a keyword like “social media management,” you might want to look into similar long-tail keywords, such as “social media management pricing” or “social media management tools.” Chances are, adding in sections that address these subjects will be fairly easy. Additionally, by adding in a few extra long-tail keywords, you’ll have the added bonus of increasing the total keyword density of the page.
So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.
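As a concrete illustration, a share button for WhatsApp might look like the snippet below (the domain and campaign values are placeholders). The shared URL carries UTM parameters, so anyone who clicks it arrives tagged as social traffic rather than vanishing into the direct bucket:

<!-- The URL-encoded payload decodes to:
     https://example.com/post?utm_source=whatsapp&utm_medium=social&utm_campaign=share -->
<a href="https://wa.me/?text=https%3A%2F%2Fexample.com%2Fpost%3Futm_source%3Dwhatsapp%26utm_medium%3Dsocial%26utm_campaign%3Dshare">Share on WhatsApp</a>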
For our client: We were lucky enough to remove most of the prior agency's links through outreach, and we also went directly to many of the webmasters from whom we wanted links removed. We did not use the Disavow Tool, as it was not around when we completed this link cleanup, but it has often been said that if you are going to use the Disavow Tool, you should use it with caution.
Referral means people clicked on a link somewhere else. This can be email or social, but it is mostly links on other websites. If you switch the view in the Channels pie chart to Source/Medium, as we did for the screenshot below, you can see your most important links. For our lab site, Pinterest is major, as are Google's educator sites and several homeschool sites. We can click on Acquisition > Referrals to see more. Referral traffic can be a little confusing because it overlaps with Email and Social; more on that later.
Basically, what I’m talking about here is finding websites that have mentioned your brand name but haven’t actually linked to you. For example, someone may have mentioned my name in an article they wrote (“Matthew Barby did this…”) but didn’t link to matthewbarby.com. By checking for websites like this, you can find quick opportunities to get them to add a link.
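One simple way to surface these pages (an illustrative query, not the only approach) is a search that matches the name while excluding your own domain:

"Matthew Barby" -site:matthewbarby.com

Note that the operator only filters out pages hosted on your own site; you still need to check each result (manually or with a crawler) to confirm it doesn't already link to you before reaching out.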
Hi Pavan, I would agree that it's a long post - but some posts are just worth the read no matter how long they are - especially this one, since it's loaded with useful resources. I've actually bookmarked it and plan to read it a few times over in hopes of putting these great tips to use. All in all - it's not length that matters - it's how a post is presented and the information it contains. If a writer can keep me captivated or entertained the entire time, then I'm willing to read it regardless of how long or short it is. Just my opinion :). Have a great week. Cheers
I would like to talk about a case study for a large startup I worked on for over eight months in the Australian and US markets. This client originally came to the company with the typical link-building and SEO problems. They had been using an SEO company that had an extensive link network and had been applying less-than-impressive SEO tactics and methodologies over the previous 12 months. The company was also losing considerable revenue as a direct result of this low-quality SEO work. So I had to scramble and develop a revival strategy for this client.
For a long time, digital marketers summed up the properties of direct and organic traffic pretty similarly and simply: to most, organic traffic consists of visits from search engines, while direct traffic is made up of visits from people entering your company URL into their browser. This explanation, however, is too simplistic and leaves most digital marketers ill-equipped to fully understand and draw insights from web traffic, especially organic and direct sources.