Let’s say you wish to block all URLs that have the .pdf extension. You would write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end basically tells bots that only URLs ending in “.pdf” shouldn’t be crawled, while any other URL that merely contains “.pdf” (say, followed by parameters) still can be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list with the correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
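To make the wildcard behaviour concrete, here is a minimal Python sketch of how a Googlebot-style matcher treats “*” and a trailing “$”. The function name is illustrative (it is not part of any robots.txt library), and it is a simplified model of the rule-matching step only, not a full robots.txt parser.

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Check whether a URL path matches a robots.txt Disallow pattern.

    Models Google-style wildcards: '*' matches any run of characters,
    and a trailing '$' anchors the match to the end of the URL.
    Without '$', robots rules behave as prefix matches.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Translate the robots pattern into a regular expression.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if not anchored:
        regex += ".*"  # default prefix-match behaviour
    return re.fullmatch(regex, path) is not None

# '/*.pdf$' blocks only URLs that END in ".pdf" ...
print(robots_pattern_matches("/*.pdf$", "/files/report.pdf"))   # True
# ... while a URL that merely contains ".pdf" is still crawlable:
print(robots_pattern_matches("/*.pdf$", "/report.pdf?page=2"))  # False
```

Drop the “$” and the rule becomes a prefix match again, which is exactly the kind of one-character difference that can quietly block (or unblock) far more of a site than intended.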
Ha ha ha, John, aren't you trying to be funny. Well, I need to inform you that I have been to your site, read it, and totally DON'T agree with you :P Your blog is great and I don't see many people running from it, lol. And it's definitely a bookmark for future reference; there is no way to go through it in a day (I can't do it, and I wrote it!). Hope you will use some of these and send me some feedback about the results.
Branica, I wasn't sure what to expect. Part of me thought about skimming the article because I figured it would 'read' like all the other articles on generating traffic. I was WRONG! I slowed my skimming and started reading; you made the otherwise sterile information interesting, while your pace infused a sense of excitement ... so much so that I want to implement everything you suggested, and I have, I guess, what you would consider a niche blog. Thanks for the lists, along with your insights on their value, etc. I came upon this article through Blog Interact, and I'm glad to have found you. Peppy
I feel that an article is only good if it adds value to the reader, and this one qualifies as a great article because it has shown me quite a few new ways to generate traffic to my blogs. I had read your suggestion about using Flickr images on a blog for traffic generation somewhere else as well, and I have religiously followed it on my blog; today at least 15% of the traffic to my site comes from there! I am therefore going to follow the other suggestions as well (the ones that I am not following now) and hope to take my blog from Alexa 500K to sub-100K in the next couple of months!
If you feel as if your business is being overtaken by your competitors, it may be because they are already taking advantage of everything that social traffic can do for them. Having a Facebook page or Twitter account with a huge number of likes or followers automatically makes you look more legitimate. For example, would you rather buy something from someone whose page has 50 likes, or someone whose page is really popular and has 2,000 likes? The answer is obvious.
Referral means people clicked on a link somewhere else. This can be email or social, but it is mostly links on other websites. If you switch the view in the Channels pie chart to Source/Medium, as we did for the screenshot below, you can see your most important links. For our lab site, Pinterest is major, as are Google’s educators’ sites and several homeschool sites. We can click on Acquisition > Referrals to see more. Referral traffic can be a little confusing because it overlaps with Email and Social; more on that later.
Well, as noted in the post, it is not just about the links; those were only one key part of a wider strategy. The website in question has deep levels of content, so it is not just about a blog section: they have numerous high-quality content sections we have developed over time. It would never be advisable to attack competitors' sites with low-quality links.
To get the whole bundle of internal links for your website, go to the Search Traffic > Internal Links category in the same Search Console. There you’ll find a list of all the pages on your website (indexed or not), along with the number of links pointing to each. This can be a great starting point for discovering which pages should be subject to content pruning.
For our client: we were lucky enough to remove most of the links left over from the prior agency's outreach, and we also went directly to many of the webmasters whose sites we wanted links removed from. We did not use the Disavow tool, as it was not around when we completed this link cleanup, but as has often been said, if you are going to use the Disavow tool, use it with caution.
You could spend a week researching and writing a 3,000-word in-depth guide only to find that in a month its traffic is being eclipsed by a 300-word blog post that took you one tenth of the time to write. That little gem could start ranking for some pretty valuable keywords, even if you never planned for it to. Give it a makeover and you’ll see your rankings on SERPs (search engine results pages) and your organic traffic soar. We’ve seen this strategy work with both our clients and our own website. Big bonus: it’s actually an easy strategy to pull off. Here’s how:
Now, it is not that these sites are not interested in Google users. In fact, they have hired us to help them increase their share. However, they are getting so much traffic from sites like Facebook that it seems there is less urgency about attracting this traffic and less willingness to change the site to meet organic standards. Not long ago, sites would urgently and unquestioningly abide by Google’s standards to court that traffic.
Hi, thanks for your comment. When you refer to incognito browsing, I think it's important to clarify that referrer data *is* passed between sites. Click on a link whilst in incognito, and the referrer header is passed as usual. The exception is if you're browsing normally, and then opt to open a specific link in incognito mode - this won't pass referrer data.
Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm and flow chart based on the referring website or on parameters set within the URL to determine the source of traffic. Here is a breakdown of all sources: