While it’s best to create a custom dimension for filtering out bots, applying the generic bot filter is a good place to start. It’s important to note that filters cannot be applied retroactively, so if you’ve recently turned this feature on, expect your reported traffic to drop going forward; your historical data will still include bot visits. Additionally, double-check that you are filtering out your own traffic by excluding your own IP address.
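In Google Analytics (Universal Analytics), the IP exclusion lives under Admin > View > Filters. As a minimal sketch, the filter looks roughly like this (the IP address is a placeholder; swap in your own office IP):

    Filter Type:    Custom > Exclude
    Filter Field:   IP Address
    Filter Pattern: ^203\.0\.113\.42$

The pattern field takes a regular expression, so the dots need escaping and the ^ and $ anchors keep it from accidentally matching longer addresses.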
Hey Christy, thanks so much for the comment. This is the first time I’m hearing about this HARO thing, and I’m definitely going to have to check it out. I have this OCD (lol) where as soon as I hear about something new, I gotta join and try it out. It’s officially the newest item on my to-do list, which now has only 563 bullet points. Just kiddin’. Although we all have long to-do lists, I always find extra time to see what other bloggers are talking about. Thanks so much again :)
Let’s say you wish to block all URLs that have the .pdf extension, so you write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf should be excluded from crawling, while any other URL that merely contains “.pdf” may still be crawled. It might sound complicated, but the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of correct robots.txt wildcard matches and, as long as you keep track of them, you should be on the safe side of your website’s traffic.
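Since the pattern syntax trips people up, here are a few illustrative wildcard rules (the paths are made up; the syntax is what matters):

    User-agent: Googlebot
    Disallow: /*.pdf$        # blocks only URLs that end in .pdf
    Disallow: /private/      # blocks everything under /private/
    Disallow: /*?            # blocks any URL containing a query string
    Allow: /private/ok.html  # carves an exception out of a broader Disallow

Here “*” matches any sequence of characters and “$” anchors the pattern to the end of the URL. Drop the “$” and /*.pdf would also match something like /guide.pdf?download=1, along with any other URL containing “.pdf” anywhere in its path.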
Understanding where your site visitors come from is an integral part of any marketing strategy. Your website is the heart of your digital marketing practices, with traffic acting as the blood. No traffic means your website can’t do anything for your business; knowing the different kinds of traffic and how they play into your website gives you the power to make educated decisions on how to improve your marketing practices.
You can apply this to marketing in a few ways. If, for example, you purchase paid search advertising, you’ll want to make sure those “CPC” sources have generally low bounce rates. If a pay-per-click or cost-per-click campaign has a high bounce rate: (1) check your landing page to make sure it provides the content promised in your ad, (2) check your ad copy to ensure it is clear, and (3) check your keywords.
Great post, though http://www.blogcatalog.com, which sends millions of visits a month to blogs and has an engaged community of bloggers from startup bloggers to pros, should be on any list. BlogCatalog is the parent company of http://www.bloggersunite.org, a site dedicated to harnessing the power of the blogosphere and another great place for bloggers to connect, get traffic, and become more effective bloggers.
If you publish whitepapers or offer downloadable PDF guides, for example, you should be tagging the embedded hyperlinks with UTM campaign parameters. You’d never even contemplate launching an email marketing campaign without campaign tracking (I hope), so why would you distribute any other kind of freebie without similarly tracking its success? In some ways this is even more important, since these kinds of downloadables often have a longevity not seen in a single email campaign. Here’s an example of a properly tagged URL which we would embed as a link:
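(The domain and parameter values below are placeholders; swap in your own source, medium, and campaign names.)

    https://www.example.com/seo-guide?utm_source=whitepaper&utm_medium=pdf&utm_campaign=annual_seo_guide

Here utm_source identifies where the link lives (the whitepaper itself), utm_medium describes the link type, and utm_campaign ties the click back to the specific piece, so these sessions show up under Campaigns in Analytics instead of being lumped into direct traffic.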
Hey Andreas, there are two main reasons I saw for suspended accounts: mean people and too many links. But they come down to the same thing. On Yahoo Answers, like everywhere else, you have competition, and competitors will flag your answer just so theirs gets chosen as the best. On the other hand, putting too many links in your answers can provoke the same thing. I usually put a link in every 3rd or 4th answer I post, and I make it the best answer I can. Glad you will try out the other sources; would love to see the results.
Great article! SEMrush’s new ranking study of 600,000 search queries showed a positive correlation between direct traffic and top rankings. They suggest that Google prioritizes domains with authority, and consequently more direct traffic, when ranking websites in its search results. See their study here: https://www.semrush.com/ranking-factors/. So, direct traffic can be viewed as positively correlated with rankings.
You had to know social media would be on the list. I generally recommend that a site only have a presence on 2-3 social networks, at least while they’re small. It’s a lot of work to maintain, engage, and update a social network profile, and keep its messaging consistent with your branding. You can always pay someone to do it for you, but then it’s not a free traffic source, is it?
The strength of your link profile isn’t solely determined by how many sites link back to you – it can also be affected by your internal linking structure. When creating and publishing content, be sure to keep an eye out for opportunities for internal links. This not only helps with SEO, but also results in a better, more useful experience for the user – the cornerstone of increasing traffic to your website.
For our client: we used only a small quantity of very high-quality link building each month. For example, we built just 40 of the best links each month to supplement the work we were doing on the content marketing front. We also invested heavily in tracking competitor backlink profiles using Majestic SEO and Open Site Explorer. We worked out how the competitors acquired specific backlinks, then obtained those same links through outreach and content creation.
Direct traffic refers to traffic you receive to your website that doesn't come through any other channel. So, when you type www.hubspot.com into your search bar and hit 'Enter,' you're accessing HubSpot.com via direct traffic. If someone posted a link to www.hubspot.com on Facebook, however, and you clicked on that link, your visit would be bucketed in HubSpot.com's social media sources.

Facebook ads are a great way to get highly targeted traffic to your blog (landing page, fan page, whatever). Although not a free traffic source, it is a great one. The price is not very high, and I think it is pretty acceptable considering you can choose the demographics of the Facebook users who will see your ad. I had great success getting traffic that converts on one of my niche sites.
This is an easy one. Don’t use meta refreshes or JavaScript-based redirects — these can wipe or replace referrer data, leading to direct traffic in Analytics. You should also be meticulous with your server-side redirects, and — as is often recommended by SEOs — audit your redirect file frequently. Complex chains are more likely to result in a loss of referrer data, and you run the risk of UTM parameters getting stripped out.
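As a minimal sketch, assuming you’re consolidating everything onto one canonical host, a clean single-hop server-side redirect in nginx looks like this (example.com is a placeholder):

    server {
        listen 80;
        server_name example.com www.example.com;
        # Permanent redirect in one hop; $request_uri carries the full
        # path and query string, so UTM parameters survive the redirect
        return 301 https://www.example.com$request_uri;
    }

One hop, one rule: the fewer redirects in the chain, the less chance of the referrer or your UTM parameters being dropped along the way.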

I’ve always been a believer that hard work gets the best results, and in practice it always ends up being true. On the web it’s no different. If you want more organic traffic, you have to work for it. That means giving your best effort every time, going after opportunities your competitors have missed, being consistent, guest blogging strategically, and staying on Google’s good side.