Let’s say you wish to block all URLs that have the .pdf extension. You write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf shouldn’t be crawled, while any other URL that merely contains “.pdf” still will be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
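To make the effect of that trailing “$” concrete, here is a minimal Python sketch (the helper name and test paths are made up for illustration) that translates a Google-style robots.txt pattern into a regular expression and shows which URLs the rule above would actually block:

```python
import re

def google_pattern_to_regex(pattern: str) -> re.Pattern:
    """Convert a Google-style robots.txt path pattern to a regular expression.

    '*' matches any run of characters; a trailing '$' anchors the match to the
    end of the URL. All other characters are treated literally.
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile(regex + ("$" if anchored else ""))

# Disallow: /*.pdf$  -- blocks only URLs that END in .pdf
rule = google_pattern_to_regex("/*.pdf$")

for path in ["/guide.pdf", "/guide.pdf?page=2", "/pdf-tips/", "/guide.html"]:
    print(f"{path:20} blocked={bool(rule.match(path))}")

# Illustrative output:
#   /guide.pdf           blocked=True
#   /guide.pdf?page=2    blocked=False   (the '$' leaves this one crawlable)
#   /pdf-tips/           blocked=False
#   /guide.html          blocked=False
```

Note how /guide.pdf?page=2 is not blocked: because of the “$”, the rule only matches URLs that end in .pdf, which is exactly the kind of subtlety that can quietly expose (or hide) thousands of URLs.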

Brankica, I wasn't sure what to expect - part of me thought about skimming the article because I figured it would 'read' as all the other articles on generating traffic - I was WRONG! I slowed my skimming and started reading - you made the otherwise sterile information interesting while your pace infused a sense of excitement ... so much so that I want to implement everything you suggested and I have, I guess, what you would consider a niche blog. Thanks for the lists along with your insights on their value, etc. I came upon this article through Blog Interact - and glad to have found you. Peppy
The most profitable visitors you can get to your site come from organic search results, meaning people who search for something and land on your website. These are the people who are most likely to convert into customers or clients. In blogging, organic traffic is also the most profitable because those users will see higher-CPC AdSense ads, since the ads are matched to their search terms.

Superb resource list Brankica, thank you. I've also included it in this week's Erudition, not just because you're in a competition, but because it really is a resource we should all have bookmarked. Actually, I need to make it a study priority and see how many of the sources I can reasonably use on a regular basis. Link to Erudition | Help files from Information Junkies Anonymous
Unless you have an invite, you can’t comment or submit a new product to PH. Even then, if you were to submit yourself, the likelihood is that you’d miss out on a lot of traction compared to someone influential on PH submitting. You only get one chance to submit to Product Hunt, so you’ll need to identify someone who would be interested in your startup and who also has influence within the PH community. To do this, go to Twitter and search the following query in the search bar:
Another special mention is Quora. The question and answer site is like a high class, actually-useful version of the defunct Yahoo Answers. Experts find questions in their industry to answer and provide detailed answers, either in the form of a moderately lengthy post, or in a post that links out to their websites. You, too, can take advantage of industry questions by answering them the best you can. Users can vote on the most useful answer, and it floats to the top, so the more useful you can be, the more exposure your link will get.
Hey James - congrats on your success here. Just a question about removing crummy links. For my own website, there are hundreds of thousands of backlinks in webmaster tools pointing to my site. The site has no penalties or anything - the traffic seems to be growing every week. Would you recommend hiring someone to go through the link profile anyway to remove crummy links that just occur naturally?

We are here to give you the best web traffic you can get. We are experts in SEO methods and have been in this business for many years. We understand that our customers need to be our number one priority so one of our main objectives is to ensure the rapid growth of your website and to do everything to make your business more visible online.          
As I said at the beginning of the article, more naturally earned backlinks will help you get more organic traffic and improve SEO. You can encourage your readers to link back to your website by placing a small widget at the bottom of your posts. Just as you have social media sharing links, an embedded linking widget will increase your chances of getting more backlinks.
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
Conduct a quick Google search using “site:yourwebsite.com” to make sure your pages are actually indexed. If you’re noticing that critical pages aren’t appearing in the SERPs, you’ve likely found the culprit. Check your robots.txt file to make sure you haven’t blocked important pages or directories. If that looks good, check individual pages for a noindex tag.
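If you want to spot-check pages programmatically rather than one by one, here is a rough Python sketch (it uses the third-party requests library, and the URLs are placeholders) that looks for a noindex directive in either the X-Robots-Tag header or the robots meta tag:

```python
import re
import requests  # third-party: pip install requests

def check_noindex(url: str) -> None:
    """Rough indexability check: looks for a noindex directive in the
    X-Robots-Tag response header or in a robots meta tag in the HTML."""
    response = requests.get(url, timeout=10)

    header = response.headers.get("X-Robots-Tag", "")
    header_noindex = "noindex" in header.lower()

    # Crude pattern: assumes name="robots" appears before content="...";
    # real-world markup can order the attributes either way.
    meta_noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        response.text, re.IGNORECASE))

    print(url)
    print(f"  X-Robots-Tag noindex: {header_noindex}")
    print(f"  meta robots noindex:  {meta_noindex}")

# Replace with the critical pages you expect to see indexed
for page in ["https://yourwebsite.com/", "https://yourwebsite.com/key-landing-page/"]:
    check_noindex(page)
```

It won’t replace a proper crawl, but it’s a quick way to confirm whether a stray noindex tag is what’s keeping a critical page out of the SERPs.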
Now, in your reconsideration request, make sure you are honest and tell Google everything that the prior agency was up to. Be sure to include the Excel file documenting all removed links and say you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.
We want you to be able to make the most out of AccuRanker, which is why we have a dedicated team to help you with any questions you have. We offer complimentary one-on-one walkthroughs where our Customer Success Manager will take you through our software step-by-step. We also have a chat function where you can ask us questions and give us your feedback and suggestions.
Before you say it – no, true guest blogging isn’t dead, despite what you may have heard. Securing a guest post on a reputable site can increase blog traffic to your website and help build your brand into the bargain. Be warned, though – standards for guest blogging have changed radically during the past eighteen months, and spammy tactics could result in stiff penalties.
If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data which is being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like Let's Encrypt, though you should still explore the potentially significant SEO implications of a site migration. Remember, HTTPS and HTTP/2 are the future of the web.
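As a quick sanity check after such a migration, a small Python sketch like the one below (the domain is a placeholder, and the script uses the third-party requests library) can confirm that the plain-HTTP version of your site permanently redirects to HTTPS:

```python
import requests  # third-party: pip install requests

def check_https_redirect(domain: str) -> None:
    """Check whether http://<domain>/ answers with a permanent redirect to HTTPS."""
    response = requests.get(f"http://{domain}/", allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    is_permanent = response.status_code in (301, 308)
    points_to_https = location.startswith("https://")

    print(f"{domain}: status={response.status_code}, Location={location or '(none)'}")
    print(f"  permanent redirect to HTTPS: {is_permanent and points_to_https}")

check_https_redirect("yourwebsite.com")  # placeholder domain
```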
Hey Don, that is a great question and I will give you the overview. Now, I have been trying these out for almost two years; for example, Yahoo Answers was one of the first ones I tried and it brought great results then. So first you need to decide which ones you will try and then track the results. I use Google Analytics to track the incoming traffic. So if I am focusing, for example, on blog commenting, I will note where I commented, how many times, etc., and then see if those comments brought me any traffic. It depends a lot on the type of comment I make and the title of my ComLuv post, but you can still get some overview of how that traffic source is working for you.

There are of course sources I discover "by chance". For example, in the last month Paper.li brought me 24 visitors who spent more than 2 minutes on average on my blog. That is more than some blogs I comment on regularly bring me. So in this case, I will try to promote the paper.li a bit better and make it work for me. I will unfollow some people on Twitter who are not tweeting anything good, so my paper.li chooses better posts and better tweeps and gets me exposed to them. A lot of them will RT my paper.li daily, so there is more potential for my content being shared. In the case of the blog I mentioned, since none of the posts are going viral and the blogger is average, it is obviously not bringing me any traffic, so I will start commenting less and work more on those that bring me more traffic. Now this is all great, except I get emotionally attached to blogs I read so I don't look at numbers like that :) But that is how you should track results.

The main thing after reading this post is to choose up to 5 of these sources you feel comfortable with, work on them and track results. Keep those that work for you and ditch those that don't. It depends a lot on the niche you are in, too. I always try one source for up to a month; if there are no results in a month, I stop working on it and move to another one. If I didn't answer all you wanted to know, just ask additional questions, I am more than glad to help :)
Love the five different areas of investigation that you went over, a great way to analyze and diagnose the issue. I would also definitely agree with doing a rankings comparison between the two time frames, and not only checking what your Google ranking is, but also tracking the search volume for your keywords to see if it has fluctuated or gone down. Google Trends is a great tool for this as well, as one of the keywords you're ranking for may have just lost popularity online.
Amy Gesenhues is Third Door Media's General Assignment Reporter, covering the latest news and updates for Search Engine Land and Marketing Land. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs.com, SoftwareCEO.com, and Sales and Marketing Management Magazine. Read more of Amy's articles.
This is a crucial area. If you do not have schema markup and rel="author", you are costing your business money. It is as simple as that. As an example, say I want to make spaghetti (pasta) for dinner. I search for “Spaghetti Recipe” and instantly I see some great markup in play, but one competitor has no markup and no rel="author"; in my eyes, they are losing business. Wouldn't you agree?
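For a sense of what that structured data can look like, here is a minimal, purely illustrative Python sketch (every recipe value is a placeholder) that builds a schema.org Recipe block as JSON-LD and prints the script tag you would embed in the page:

```python
import json

# Minimal, illustrative schema.org Recipe markup; every value is a placeholder.
recipe_markup = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Spaghetti",
    "author": {"@type": "Person", "name": "Jane Example"},
    "prepTime": "PT15M",
    "cookTime": "PT20M",
    "recipeIngredient": ["400 g spaghetti", "2 cups tomato sauce"],
}

# Paste the output into the page's HTML so search engines can read it.
print('<script type="application/ld+json">')
print(json.dumps(recipe_markup, indent=2))
print("</script>")
```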
I feel that an article is only good if it adds value to the reader, and this one qualifies as a great article because it has shown me quite a few new ways in which to generate traffic to my blogs. I would like to tell you that I had read your suggestion about using Flickr images on a blog for traffic generation somewhere else as well, and I have religiously followed that on my blog, and today at least 15% of the traffic to my site comes from there! I am therefore going to follow the other suggestions as well (the ones that I am not following now) and hope to take my blog from Alexa 500K to sub 100K in the next couple of months!
Hi Pavan, I would agree that it's a long post - but some posts are just worth the read no matter how long they are - especially this one since it's loaded with useful resources. I've actually bookmarked it and I plan to read it a few times over in hopes of putting these great tips to use. All in all - it's not length that matters - it's how a post is presented and the information that it contains within. If a writer can keep me captivated or entertained during the entire thing - then I'm willing to read it regardless of how long or short it is. Just my opinion :). Have a great week. Cheers

One of the things that makes social media a bit more attractive than search engine optimization is that you get to maintain a bit more control over your success. You can always find new ways of improving your strategy and you can learn from your mistakes and your wins so that you can improve your traffic in the future. The same can't really be said about SEO - although it's very clear what not to do, it's not always as clear exactly what strategies can help you improve your ranking.
Buy Canadian web traffic visitors from Yahoo Canada for 30 days – We will send real Canadian visitors to your site using Yahoo Canada's search field with your keywords to improve your SERP CTR and SEO strategy. All visitors will be shown as organic traffic in your Google Analytics. Having a constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, use our Yahoo Canada Web Traffic Service today …
Go to local events or Meetup events and connect with bloggers in your industry. An example of an event I run to connect with bloggers and people in the online marketing world is: http://www.meetup.com/Online-Marketing-Sydney/. Make friends first and then try to gain guest posts later. I am not really a fan of websites which are flooded with guest posts one after another; it is the type of thing which Google is just waiting to target.
Hey Delena, I am not worried about you stalking me, I like it :) I think you can advertise any service on Craigslist. The idea you have is awesome. Not only can you point your ad to your blog, but you can also make some extra cash. And instead of just going locally, I am sure there is something you can do worldwide, like giving people advice over Skype or something. Thanks for the awesome comment and the twithelp suggestion.
Thanks for this handy info. I wasn't aware that Google owned Vark so I'm off to check it out. There is so much to learn about the best sites to use on the internet for traffic generation. I'm so frightened of spending more time surfing than actually writing for my own blogs that I probably don't do nearly enough 'looking'. Does anyone else have this problem of surfing time vs. actual work time?
Remember when you used to rely solely on search engines for traffic? Remember when you worked on SEO and lived and died by your placement in Google? Were you #1? Assured success. Well, okay, maybe not assured. Success only came if the keywords were relevant to your site users, but it was the only real roadmap to generating site traffic and revenue.
The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.
However, this may not be the case for your company or your clients. You may start by looking at keyword rankings, and realize that you’re no longer ranking on the first page for ten of your core keywords. If that’s the case, you quickly discovered your issue, and your game plan should be investing in your core pages to help get them ranking again for these core keywords.
The moral of this story is that everything is contextual. And this applies to everyday happenings as well as to your online marketing, traffic sources, and conversion rates. It happens that what looks like an opportunity is actually a setback, and vice versa. We all make changes within our sites with the purpose of having tons of traffic. We are in a continuous race for inbound links, domain authority, technical SEO, diagnosing traffic drops, search volume, and keyword research, and we forget to take a few steps back and see how all of these influence our overall site’s performance. We’ve documented the ranking drop issue in an earlier post, and you can take a look at it to better understand this phenomenon.

The time has never been better to jump on the Facebook and Twitter bandwagons and buy social traffic at incredibly affordable rates. Imagine having each and every post you make on Twitter or Facebook get seen by thousands of people and then having their hundreds of thousands of friends and followers see them as well. Don’t forget about the added benefits that these types of boosts can give you in the big search engines either. Stop wasting time trying to build these accounts by yourself. It’s time to let the professionals take care of getting likes and followers so you can spend your valuable time on building your business and taking care of your customers.
The good news, however, is that there are indications dark social should decrease over time, as social media as a sharing mechanism (as opposed to email) only continues to grow. In fact, in the same article we mentioned earlier, BuzzFeed cited a preference among millennials for sharing over Facebook and Twitter, alongside a longer-term downward trend in sharing over email.

In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.
The top five leaders in paid traffic (Google Adwords) are the same companies that lead in search, with Amazon again on top. Leading retailers don’t want to put all their eggs in one basket, and they are willing to invest heavily in pay-per-click campaigns. Clearly, PPC has worked for them because we see four out of the top five spenders are the leaders in annual sales numbers.
For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.
If you feel as if your business is being overtaken by your competitors, it may be because they are already taking advantage of everything that social traffic can do for them. Having a Facebook page or Twitter account with a huge amount of likes or followers automatically makes you look more legitimate. For example, would you rather buy something from someone whose page has 50 likes, or someone whose page is really popular and has 2,000 likes? The answer is obvious.
Once you've set up an alert within Mention, go to your settings and then 'Manage Notifications'. From here you can select the option to get a daily digest email of any mentions (I'd recommend doing this). You also have the option of getting desktop alerts - I personally find them annoying, but if you really want to stay on the ball then they could be a good idea.
At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level. These sites aren’t relying as much on Google for traffic any more. Not only are they not relying on Google traffic, but these sites are also getting less than 10 percent (or only slightly more) of their organic traffic from the search giant.

But now, that is all changing. We can still see all of that information ... sometimes. Things got a whole heck of a lot more complicated a little over a year ago, when Google introduced SSL encryption. What does that mean? It means Google started encrypting keyword data for anyone conducting a Google search while logged in to Google. So if you're logged in to, say, Gmail when you do a search, the keyword you used to arrive on a website is now encrypted, or hidden. So if that website drilled down into its organic search traffic source information, they'd see something like this:
If you’ll indulge me an analogy: when you’re moving from one place to another, you would like the postman to forward all the love letters to your new address, and you wouldn’t want them lost in an old mailbox that no one uses, right? (I am guessing you would want the bills to be sent to the old address instead.) A similar thing happens when it comes to moving your site’s address.
Brilliant stuff Brankica! This is a very comprehensive overview of major traffic generation strategies. My guess is that nobody will actually need to use all of them! Identifying a few which are relevant to your particular niche and fit the vision you have for your blog is the way to go. Awesome post, loads of value in there... Well done! All the best, Jym
At the end of the day, it depends on the size of the website you are working with and how well known the brand is in the market. You can adapt some of the strategies listed above in the post at scale and it can have a highly positive impact on a web property; the property in question is a real content house, so anything is possible. What else do you suggest we should do? I will let you know if it has been done already.
Buy French web traffic visitors from Google.fr for 30 days – We will send real French visitors to your site using Google.fr's search field with your keywords to improve your SERP CTR and SEO strategy. All visitors will be shown as organic traffic in your Google Analytics. Having a constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, use our Google France Web Traffic Service today …
Buy Brazilian web traffic from Yahoo Brasil for 30 days – We will send real Brazilian visitors to your site using Yahoo Brasil's search field with your keywords to improve your SERP CTR and SEO strategy. All visitors will be shown as organic traffic in your Google Analytics. Having a constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, buy web traffic from Yahoo Brasil today …
Hey Ryan, thanks for including me, I will be over there to thank you as well :) I am glad you liked the post and I definitely don't advise getting into all of them at once, lol. I am the first one that would try all, but I learned that it is the wrong way to go in just about anything. The best thing would be choosing one or two of these and tracking results. I learned that Flickr can take too much time for some types of blogs, while others will have great results with it. It depends on the niche a lot. But I know you will do the best thing, you always do! Thanks for the comment and helping me in the contest!

Google measures average time on site by first collecting each visitor’s exact time on a particular page. Imagine that a visitor lands on page 1 of your site. Google places a cookie, including a unique code for the visitor and a time stamp. When that visitor clicks through to page 2 of your site, Google again notes the time, and then subtracts the time that the visitor arrived at page 1 from the time that the visitor arrived at page 2. Google then averages the time spent on each page to get the average time each visitor spends on the site.
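Here is a minimal Python sketch of that arithmetic, using made-up timestamps for a single visit:

```python
from datetime import datetime

# Hypothetical page-view arrival times for one visit (page path, arrival time).
pageviews = [
    ("/home",    datetime(2023, 1, 1, 10, 0, 0)),
    ("/pricing", datetime(2023, 1, 1, 10, 1, 30)),
    ("/contact", datetime(2023, 1, 1, 10, 4, 0)),
]

# Time on each page = arrival time of the NEXT page minus arrival time of this page.
# The last page in the visit has no "next" arrival, so its time on page is unknown,
# a limitation that applies to real analytics data as well.
times_on_page = []
for (path, arrived), (_, next_arrived) in zip(pageviews, pageviews[1:]):
    seconds = (next_arrived - arrived).total_seconds()
    times_on_page.append(seconds)
    print(f"{path}: {seconds:.0f} seconds")

average = sum(times_on_page) / len(times_on_page)
print(f"Average time on page for this visit: {average:.0f} seconds")
```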

Direct traffic is defined as visits with no referring website. When a visitor follows a link from one website to another, the site of origin is considered the referrer. These sites can be search engines, social media, blogs, or other websites that have links to other websites. Direct traffic categorizes visits that do not come from a referring URL.
