Before developing this subject further, let me remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of the site you don’t want accessed by search engine crawlers. Although there is plenty of documentation on the subject, there are still many ways a robots.txt file can cause an organic search traffic drop. Below we are going to list two common yet critical robots.txt mistakes.
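To make the risk tangible, here is a minimal sketch using Python's standard urllib.robotparser to test what a given robots.txt actually blocks. The file contents and URLs are hypothetical, and the directives are illustrative examples of how a single line can hide a whole site, not necessarily the specific mistakes discussed here:

```python
# Minimal sketch: test what a robots.txt actually blocks before deploying it.
# The directives and URLs are hypothetical, for illustration only.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /        # classic foot-gun: this one line blocks the entire site
Disallow: /assets/ # blocking CSS/JS can also hurt how Google renders pages
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for url in ("https://example.com/", "https://example.com/products/widget"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{url} -> {verdict}")
```

Both URLs come back BLOCKED here: a single stray "Disallow: /" does silent, site-wide damage, which is why testing the file before deploying it is worth the thirty seconds.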
I'm not in the contest, but if I were - I'd be SKEEEEERED. This was awesome, really - there were MAYBE 3 things I knew... The rest was new. I'm lame that way. The great thing about this is that as a blogger you've covered ideas I've never thought of... I get my traffic mostly from SEO (not on my blog, but on my websites, which are product-based review sites) - but there's enough meat in this post I can use for my niche sites to keep me in the black, so to speak (ink I mean, not socks). Awesome post, Brankica - I'm speechless. (If you ignore the foregoing paragraph.)

A few links down and I've noticed that Brian has a link from WordPress.org. Not bad! Turns out that his content has been referenced within one of WordPress's codex posts. If I were to reach out and offer some additional insight, citing one of my articles, there's a chance I could bag a similar link, especially considering they have a 'Useful Resources' section.
I don’t know how much time it took to gather all this stuff, but it is simply great. I was elated to see the whole concept (backlinks, content strategies, visitors, etc.) in one place. I hope it will be helpful for beginners like me. I recently started a website, and I’m a newbie to the blogging industry. I hope your information helps me a lot on the way to success.
If you want to know how engaging your content is, you might look at bounce rates, average time on page and the number of pages visited per session to get an idea – and Google can do precisely the same. If a user clicks through to your site, quickly returns to the results page and clicks on another listing (called “pogo-sticking”), it suggests you haven’t provided what this person is looking for.
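To make those engagement metrics concrete, here is a small sketch that computes bounce rate, pages per session and average time on site from raw session records. The record layout and the numbers are made up for illustration; real analytics tools derive these figures from hit-level data:

```python
# Hypothetical session records: (session_id, pages_viewed, seconds_on_site).
# The layout and numbers are invented for illustration.
sessions = [
    ("s1", 1, 8),    # one page, then gone: a bounce
    ("s2", 4, 210),
    ("s3", 1, 5),    # another bounce
    ("s4", 6, 540),
]

bounces = sum(1 for _, pages, _ in sessions if pages == 1)
print(f"bounce rate:       {bounces / len(sessions):.0%}")
print(f"pages per session: {sum(p for _, p, _ in sessions) / len(sessions):.1f}")
print(f"avg time on site:  {sum(s for _, _, s in sessions) / len(sessions):.0f}s")
```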
An easy way to break your web traffic without even noticing is through robots.txt. Indeed, the robots.txt file has long been debated among webmasters, and it can be a strong tool when it is well written. Yet the same robots.txt can just as easily be a tool you shoot yourself in the foot with. We’ve written an in-depth article on the critical robots.txt mistakes that can ruin your traffic in another post.
We want to see landing pages that came from organic searches, so first we need to add the “Medium” dimension to this report - medium is how Analytics identifies traffic channels (organic, referral, cpc and so on). To do this, use the secondary-dimension drop-down above the data table and locate the option for “Medium”. The table should refresh, and you will now have a second column of data showing the medium for each landing page.
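If you would rather pull the same report programmatically, here is a sketch against the Google Analytics Reporting API v4 (the API behind Universal Analytics). It assumes the google-api-python-client package is installed and OAuth credentials are already set up; the view ID is a placeholder you would replace with your own:

```python
# Sketch: the "organic landing pages" report pulled via the Google
# Analytics Reporting API v4 instead of the web UI. Assumes
# google-api-python-client is installed and credentials exist;
# VIEW_ID is a placeholder for your own Universal Analytics view.
from googleapiclient.discovery import build

VIEW_ID = "ga:XXXXXXXX"  # placeholder

def organic_landing_pages(credentials):
    analytics = build("analyticsreporting", "v4", credentials=credentials)
    return analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
            "dimensions": [{"name": "ga:landingPagePath"},
                           {"name": "ga:medium"}],
            "metrics": [{"expression": "ga:sessions"}],
            # Keep only organic sessions, mirroring the filter in the UI.
            "dimensionFilterClauses": [{
                "filters": [{"dimensionName": "ga:medium",
                             "operator": "EXACT",
                             "expressions": ["organic"]}],
            }],
        }]
    }).execute()
```

The dimension filter on ga:medium mirrors the step above: it keeps only landing pages whose sessions arrived via organic search.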
Relying too much on one source of traffic is a risky strategy. A channel that is fantastic for generating traffic today may not stay that way tomorrow. Some sites lost out when Google’s Penguin update started penalizing certain link-building practices. Others lost out when Facebook decided to massively restrict organic reach. If your site relies on only one source of traffic, an algorithm change can create serious trouble for you. So be aware of the importance of diversifying, and know the options available from different traffic sources.
Though a long break is never advisable, there are times when money can be shifted toward other resources for a short period. A good example is an online retailer. In the couple of weeks leading up to Christmas, you are unlikely to gain more organic placement than you already have. Besides, the window for shipping gifts in time for Christmas is closing, and you are heading into a slow season.
Hey Jym, thanks a bunch :) Yeah, I don't think everyone needs to use each and every one of these, especially at the beginning. But I do find the list useful for those with "older" blogs, when you are thinking, "Where else can I go to get more people to see my blog?" That is what I do with my first website. Again, I agree with the part about identifying the ones that will work best, so we don't spend too much time getting no results. Thanks so much for the comment.
For example, sometimes this could include social media websites, and sometimes it might not -- in HubSpot's software, social media websites are not included in referral traffic because they're counted in a separate "social media" bucket. Another source of variance is whether subdomains are included -- HubSpot's software, for example, counts subdomains (like academy.hubspot.com) as a traffic source under Referrals. And sometimes it's not that tricky -- you'll always see third-party domains, like mashable.com, listed under Referrals. This is particularly helpful if you're trying to ascertain which web properties are good candidates for co-marketing, SEO partnerships, and guest blogging opportunities.
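The bucketing logic described above is easy to imitate. Here is a hedged sketch - the domain lists and the naive root-domain extraction are illustrative, not HubSpot's actual rules - that sorts a referrer into a social bucket, an internal-subdomain bucket, or plain referral traffic:

```python
# Illustrative only: not HubSpot's actual rules, just the idea of
# splitting referrers into social, internal-subdomain and referral buckets.
from urllib.parse import urlparse

SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "linkedin.com"}
OWN_ROOT = "hubspot.com"  # the example root domain from the paragraph above

def classify_referrer(referrer_url: str) -> str:
    host = urlparse(referrer_url).hostname or ""
    root = ".".join(host.split(".")[-2:])  # naive; breaks on e.g. .co.uk
    if root in SOCIAL_DOMAINS:
        return "social media"
    if root == OWN_ROOT and host != OWN_ROOT:
        return "internal subdomain"
    return "referral"

for url in ("https://mashable.com/article",
            "https://academy.hubspot.com/courses",
            "https://facebook.com/some-page"):
    print(url, "->", classify_referrer(url))
```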
So what does this mean? “Bounce rate” can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they cannot find a good reason to stay. There is one notable exception to this: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
Regarding Link Detox, the links it diagnoses as Toxic are generally safe to act on, as those sites are either not indexed by Google or carry malware, viruses, etc., but I recommend a manual review of anything diagnosed as Suspicious. I used it recently to start cleaning up our backlinks, and some legit sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
I used to work with Ranker-Tracker and was more than pleased when I changed to AccuRanker, which is not only the fastest SERP tool available but also very accurate. The keyword data in AccuRanker is refreshed every single day. I can also get instant on-demand ranking updates - rankings refresh in a matter of seconds whenever we need them. Reporting is very easy and saves us time, as we can set automated emails to be sent directly to our own and our clients' inboxes.

Everyone wants to rank for those broad two- or three-word key phrases because they tend to have high search volumes. The problem with these broad key phrases is that they are highly competitive - so competitive that you may not stand a chance of ranking for them unless you devote months to the effort. Instead of spending your time chasing something that may not even be attainable, go after the low-hanging fruit of long-tail key phrases, as in the sketch below.
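One practical way to find that low-hanging fruit is to filter a keyword export by phrase length, volume and difficulty. The CSV columns and thresholds below are hypothetical - adjust them to whatever your keyword tool actually exports:

```python
# Hypothetical CSV layout (keyword, monthly_volume, difficulty); adjust the
# column names and thresholds to whatever your keyword tool exports.
import csv

def long_tail_candidates(path, min_words=4, min_volume=50, max_difficulty=30):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if (len(row["keyword"].split()) >= min_words
                    and int(row["monthly_volume"]) >= min_volume
                    and int(row["difficulty"]) <= max_difficulty):
                yield row["keyword"]

# Usage, with a hypothetical export file:
# for phrase in long_tail_candidates("keywords.csv"):
#     print(phrase)
```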