Backlinks are essentially authoritative links: another site links to yours, signaling to its readers (and to search engines) that your site is a helpful, authoritative source on a particular keyword or in a particular market. The more high-quality, authoritative links you have, the more credible Google considers your site. You earn this authority by getting other website owners to link to your website; search engine algorithms then give your site an SEO boost, and the more of these authoritative links you accumulate, the higher your site is likely to rank. Blog commenting is a great way to get backlinks to your website. Step 1: find a relevant, high-traffic blog in your niche. Step 2: actually read the post and understand what it is about. Step 3: leave a comment relevant to the topic, then place your link in the comment.

Hey Delena, stress management and parenting seem like really related areas to me :) Slideshare is really great; I have only uploaded two slideshows so far, but I am satisfied with the numbers. What you can do is create a slideshow, upload it to Slideshare, and then create a video out of that same slideshow and upload it to YouTube. It will take you only a few extra minutes, but you will get exposure on YT as well. I hope the Craigslist ads will help, if not with traffic then with some extra gigs! Thanks so much for the comment!
The other way visitors can access your website is by coming from other websites; in this instance, the user lands on your website after following a link from another site. The link the user clicked is referred to as a “backlink,” as it links back to your website. This traffic is much more beneficial to the search engine optimization (SEO) of your website than direct traffic, which has little to no effect. The reason is that Google and other search engines interpret backlinks as little doses of credibility for your website. If other credible websites are linking to your site, that must mean it contains relevant and accurate content, which is exactly what search engines want.
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
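If you want to sanity-check what your robots.txt actually blocks before it costs you organic traffic, here is a minimal sketch using Python’s built-in urllib.robotparser. The domain and paths are placeholders, so swap in your own site and the URLs you care about.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; point this at your own robots.txt.
robots = RobotFileParser("https://www.example.com/robots.txt")
robots.read()

# Placeholder paths; list the sections of your site you care about.
for path in ("/", "/blog/", "/private/"):
    url = "https://www.example.com" + path
    allowed = robots.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'blocked by robots.txt'}")
```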

You might remember “Face/Off,” a movie from ‘97 in which Nicolas Cage is a terrorist and John Travolta is an FBI agent. And, as happens in movies, they become obsessed with one another. Travolta needs information from Cage’s brother, so he undergoes an operation in which he gets Cage’s face; a few plot twists later and Cage has Travolta’s. They continue to pursue each other with new identities, the bad guy as the good guy, the good guy as the bad guy, and so on, and (almost) no one knew who was who. Dizzy yet?
Kristine Schachinger has 17 years of digital experience, including a focus on website design and implementation, accessibility standards, and all aspects of website visibility involving SEO, social media, and strategic planning. She additionally specializes in site health auditing, site forensics, technical SEO, and site recovery planning, especially when Google algorithms such as Penguin and Panda are involved. Her seventeen years in design and development and eight years in online marketing give her a depth and breadth of understanding that comes from broad exposure not only to digital marketing but to the complete product lifecycle, along with the underlying technology and processes. She is a well-known speaker and author and can be found on LinkedIn, Google+ and Twitter.
Content is one of the three main Google ranking factors. In fact, having a blog can help you get your content indexed faster. An active blog – relying on quality, insightful content and backed by authoritative links – will help you improve your rankings and thus your organic traffic. Also, long-form copy (above 1,200 words) tends to rank higher and be crawled more often.

To get the full picture of internal links for your website, go to the Search Traffic > Internal Links report in the same Search Console. This gives you a list of all the pages on your website (indexed or not), along with the number of internal links pointing to each. It can be a great starting point for deciding which pages should be candidates for content pruning.
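If you want to spot-check those numbers outside of Search Console, a rough approximation is to crawl a sample of your own pages and count how many internal links point at each URL. Below is a minimal sketch assuming the third-party requests and beautifulsoup4 packages; the start URL and page limit are placeholders, and a real crawl would also respect robots.txt and rate limits.

```python
from collections import Counter
from urllib.parse import urljoin, urldefrag, urlparse

import requests
from bs4 import BeautifulSoup

def internal_link_counts(start_url, max_pages=50):
    """Crawl a handful of pages on one domain and count how many internal
    links point at each URL: a rough stand-in for the Search Console report."""
    site = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    counts = Counter()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target, _ = urldefrag(urljoin(url, a["href"]))
            if urlparse(target).netloc == site:
                counts[target] += 1
                queue.append(target)
    return counts

# Placeholder start URL; the ten most-linked-to pages are printed first.
for page, links in internal_link_counts("https://www.example.com/").most_common(10):
    print(links, page)
```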


The amount of dark social that comprises one's direct traffic is going to vary. For example, you might be diligent about incorporating tracking tokens into your email marketing campaigns, and asking your co-marketing partners to do the same when they promote your site content. Great. You won't have a huge chunk of traffic dumped into direct traffic that should really go into email campaigns. On the other hand, perhaps you're embroiled in a viral video firestorm, and a video on your site gets forwarded around to thousands of inboxes ... you don't exactly have control over that kind of exposure, and as a result you'll be seeing a lot of traffic without referral data in the URL that, consequently, gets bucketed under direct traffic. See what I mean? It depends.
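One way to keep campaign clicks out of the direct-traffic bucket is to make tagging URLs painless. Here is a minimal sketch of a helper that appends UTM parameters using Python’s standard urllib.parse; the example URL, source, medium, and campaign names are hypothetical.

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_url(url, source, medium, campaign):
    """Append UTM parameters so clicks get attributed to the right channel
    instead of falling into direct traffic."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical URL and campaign values.
print(tag_url("https://www.example.com/blog/post", "newsletter", "email", "spring_launch"))
```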
Hey there, I had to come back and post some updates. I tried your tips on my brand new blog and got the following results. I have to say it is a brand new free WordPress site, and I got it to 30 daily visitors after only 5 days of following your advice. It's been 2 weeks that I have been using the tips, and I am up to 45 daily visitors. I also made my first money with that blog. Since I am a lot more active on Squidoo, I decided to try some of these ideas over there too, and my traffic doubled in only one week. Thanks, and I will be back with more numbers and feedback on how I made these traffic sources work for me.

Referral traffic in Google Analytics can also include your social traffic or even your email visits. This is largely because there are different ways to track data and there are imperfections in the system, but it also depends on which reports you look at. For example, Acquisition > All channels > Referrals will include much of your social traffic, but the default All channels report does not include Social in your Referrals. The best solutions:
Implementing structured data markup (such as that from schema.org) might seem like a one-time project, but that “set it and forget it” mentality can land you in hot water. You should be monitoring the appearance of your rich snippets on a regular basis to ensure they are pulling in the correct information. As you change the content on your website, this can alter the markup without warning.
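A lightweight way to monitor that markup is to periodically pull the JSON-LD blocks from your key pages and review (or diff) what they contain. The sketch below assumes the third-party requests and beautifulsoup4 packages, and the URL is a placeholder; Google’s Rich Results Test remains the authoritative check.

```python
import json

import requests
from bs4 import BeautifulSoup

def extract_json_ld(url):
    """Collect every JSON-LD block on a page so the markup can be reviewed
    or compared against what you expect it to say."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    blocks = []
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            blocks.append(json.loads(tag.string or ""))
        except json.JSONDecodeError:
            print(f"Warning: malformed JSON-LD found on {url}")
    return blocks

# Placeholder URL: point this at the pages whose rich snippets you rely on.
for block in extract_json_ld("https://www.example.com/some-product-page"):
    if isinstance(block, dict):
        print(block.get("@type"), "-", block.get("name"))
```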
One of the things that makes social media a bit more attractive than search engine optimization is that you get to maintain a bit more control over your success. You can always find new ways of improving your strategy and you can learn from your mistakes and your wins so that you can improve your traffic in the future. The same can't really be said about SEO - although it's very clear what not to do, it's not always as clear exactly what strategies can help you improve your ranking.

If, on the other hand, you’ve already migrated to HTTPS and are concerned about your users appearing to partner websites as direct traffic, you can implement the meta referrer tag. Cyrus Shepard has written about this on Moz before, so I won’t delve into it now. Suffice it to say, it’s a way of telling browsers to pass some referrer data to non-secure sites, and it can be implemented as a <meta> element or an HTTP header.
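As an illustration of the HTTP-header route, here is a minimal sketch using Python’s built-in http.server that attaches a Referrer-Policy header to every response. The policy value and port are assumptions; in practice you would more likely set this in your web server or CDN configuration, or via the <meta name="referrer"> tag.

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

class ReferrerPolicyHandler(SimpleHTTPRequestHandler):
    """Serve files while sending a Referrer-Policy header, so browsers pass
    origin-level referrer data even when linking out to non-HTTPS sites."""

    def end_headers(self):
        # "origin-when-cross-origin" shares only the origin with other sites;
        # adjust the policy to match how much referrer data you want to expose.
        self.send_header("Referrer-Policy", "origin-when-cross-origin")
        super().end_headers()

if __name__ == "__main__":
    # Assumed port; serves the current directory for demonstration purposes.
    HTTPServer(("0.0.0.0", 8000), ReferrerPolicyHandler).serve_forever()
```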


So what does this mean? “Bounce rate” can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they cannot think of a good reason to stay. There is one notable exception: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
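As a quick illustration of the underlying arithmetic, bounce rate is simply single-page sessions divided by total sessions; the numbers below are hypothetical.

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate: the share of sessions in which the visitor viewed only
    one page and left without interacting further."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions * 100

# Hypothetical numbers: 420 of 1,000 sessions bounced, i.e. a 42.0% bounce rate.
print(f"{bounce_rate(420, 1000):.1f}%")
```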
As I mentioned earlier, one of the biggest downsides to SEO is that Google is constantly making changes to their algorithm; they come out of nowhere, and they can drop your domain authority by 5, 10, even 15 points and it almost feels like you need to start optimizing all over again. It's frustrating, it's hard work and the results aren't always visible - but if you can get SEO right, it can truly be an amazing source of traffic.

So, you have downloaded your link profile as a CSV and you now have an extensive list of all your linking domains. If you have been doing SEO for 8+ years like me, you can probably tell from the TLD and URL alone which links are bad. If you are less experienced, you can use tools such as Link Detox (http://www.linkdetox.com/) to analyze your link profile. I would always consult an expert SEO in this instance, because it is easy for these tools to confuse good links with bad ones.
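As a rough first pass before reaching for a tool like Link Detox, you can scan the exported CSV for linking domains on TLDs that tend to attract spammy link networks. The sketch below assumes a column named linking_domain and an illustrative TLD list, so adjust both to match your export; it is a crude heuristic, not a verdict on any individual link.

```python
import csv

# Illustrative TLD list only; tune it to what you actually see in your profile.
SUSPICIOUS_TLDS = {".xyz", ".top", ".loan", ".click"}

def flag_suspicious_links(csv_path):
    """Do a rough first pass over an exported link profile and flag
    domains whose TLD is commonly associated with spammy link networks."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Hypothetical column name; match it to your backlink tool's export.
            domain = row["linking_domain"].strip().lower()
            if any(domain.endswith(tld) for tld in SUSPICIOUS_TLDS):
                flagged.append(domain)
    return flagged

print(flag_suspicious_links("link_profile_export.csv"))
```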

Now, in your reconsideration request, make sure you are honest and tell Google everything the prior agency was up to. Be sure to include the spreadsheet documenting the links you removed, and state that you are going to make an ongoing effort to remove everything negative. It is common knowledge that Google may not accept your first reconsideration request, so it may take a few attempts.
Love the five different areas of investigation that you went over; great way to analyze and diagnose the issue. I would also definitely agree with doing a rankings comparison between the two time frames, and not only checking what your Google ranking is, but also tracking the search volume for your keywords to see if it has fluctuated or gone down. Google Trends is a great tool for this as well, as one of the keywords you're ranking for may have simply lost popularity online.
In other words, businesses have much less visibility into which keywords searchers are using to find them, making it much harder to understand which words and terms are working -- or not working -- in their search engine optimization. Google said this would only affect about 11% of searches, but the truth of the matter is the number is much greater than that, and it is only continuing to increase as Google's user base grows. It's important to keep this curveball in mind when evaluating organic search as a traffic and lead generation source.
For our client: we rolled out a successful implementation of rel="author" for the company's three in-house content writers. The client had over 300 articles written by these content writers over the years, and it was possible to implement rel="author" for all of the older articles. I advise anyone who has a large body of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, as it can only benefit CTR.
Hey Brankica, Wow, what an amazing post, packed with incredible sources of traffic. No more excuses if there are so many places to tap into to generate traffic. This is definitely the most popular post in the contest. Well done, girl! :) Thanks for sharing your insights. I have picked up a few sources that I haven't used before. All the best, Mavis
The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.
Understanding how people landed on your website is a key component of optimization. If you’ve ever looked at Google Analytics (and if you haven’t you should), you’ve probably seen the words “Direct,” “Referral,” and “Organic” in relation to your traffic. These are the sources where your users come from — or what Google calls channels. But what do these words really mean, and why do they matter?
I'm not in the contest, but if I was - I'd be SKEEEEERED. This was awesome, really - there were MAYBE 3 things I knew... The rest was new. I'm lame that way. The great thing about this is that as a blogger - you've covered ideas I've never thought of...I get my traffic mostly from SEO (not on my blog, but in my websites which are product based review sites) - but there's enough meat in this post I can use for my niche sites to keep me in the black, so to speak (ink I mean, not socks). Awesome post, Brankica - I'm speechless. (If you ignore the foregoing paragraph.)

Who are you, man!? You better be back here by tomorrow and give me number 51 or you will be doing push-ups for the rest of your blogging life!!! RLMAO Hey Steve, thanks so much for the comment, and I am glad that there is someone after all who is getting all the traffic his webhost can take! I am already thinking of how to get into your Webmaster Tools and redirect your whole blog to mine... thinking, thinking, thinking.... Thanks for the RT and love having you as my blogging friend!
First, I will show you a quick snapshot of the traffic uplift, which yielded an additional 400,000 unique visitors from organic search on a monthly basis. Then I will explain the steps we took to get the client to this level. I have also tried to keep this fairly general so that everyone can adapt this case study to their own situation.
Oh, I wish you had told me what was wrong with it :) I only discovered it recently, but I am getting nice traffic from it. I hope you will let me know how it worked for you. At the moment I am posting both to my page and my personal profile. I also realized that I might just leave it on the personal page (yeah, sounds weird) 'cause on my fan page I kinda like to add a little comment to the post. Anyway, thanks for the comment, and I will try to find your blog over there and subscribe to it on Networked blogs.
PPC platforms hinge not on fixed prices but on bids. Marketers bid what they're willing to pay for a single click on a given keyword. Some industry keywords are much more expensive than others; the more expensive the keyword, the more likely you should rely on SEO to deliver traffic and leads to your organization. To find average industry CPCs, you can use Google's Keyword Planner tool.
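To make that trade-off concrete, here is a tiny sketch of the arithmetic; the CPC and click volume are hypothetical figures, not benchmarks.

```python
def monthly_ppc_cost(avg_cpc, clicks_per_month):
    """Rough monthly spend if you bought this traffic through PPC
    instead of earning it organically."""
    return avg_cpc * clicks_per_month

# Hypothetical figures: a $12.00 average CPC at 2,000 clicks per month
# works out to roughly $24,000 per month, a strong case for investing in SEO.
print(f"${monthly_ppc_cost(12.00, 2000):,.2f}")
```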
Hi Chris, "Good content" means a couple of things - good for readers and good for Google. Good content for readers means that the content answers questions, provides value, offers solutions, and is engaging. You want to keep the reader on the page and on your website for as long as possible. To make good content for Google, you have to provide the search engine with a set of signals - e.g., keywords, backlinks, low bounce rates, and so on. The idea is that if you make good content for readers (engaging, valuable, actionable, and informative), your content will get more engagement. When your content gets more engagement, Google will see it as good content too and rank it higher in the SERPs. Making "good content" is about striking that balance. Let us know if that answered your question!