Let’s say you wish to block all URLs that have the .pdf extension, so you write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf should not be crawled, while any other URL containing “.pdf” may still be crawled. It might sound complicated, but the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
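As a minimal sketch of the difference (the file paths here are hypothetical examples, not from any real site):

    User-agent: Googlebot
    # With the trailing $, only URLs that end in .pdf are blocked, e.g. /guide.pdf
    Disallow: /*.pdf$
    # Without the $, the rule would also block URLs such as /guide.pdf?version=2
    # Disallow: /*.pdf

Note that /guide.pdf?version=2 does not end in .pdf, so the $-anchored rule leaves it crawlable; dropping the $ blocks it as well.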
Having large groups of content that all revolve around the same topic builds more relevance around the keywords you're trying to rank for within that topic, and it makes it much easier for Google to associate your content with specific topics. Not only that, but it makes it much easier to interlink between your content, pushing more internal links through your website.
Thanks for this handy info. I wasn't aware that Google owned Vark, so I'm off to check it out. There is so much to learn about the best sites to use on the internet for traffic generation. I'm too frightened of spending more time surfing than actually writing for my own blogs, so I probably don't do nearly enough 'looking'. Does anyone else have this problem of browsing time vs. actual work time?
For our client: we only used a small quantity of very high-quality link building each month. For example, we built only 40 of the best links each month to supplement the work we were doing on the content marketing front. We also invested heavily in tracking competitor backlink profiles, using Majestic SEO and Open Site Explorer. We worked out how the competitors acquired specific backlinks, then obtained those same links through outreach and content creation.
Hey Jym, thanks a bunch :) Yeah, I don't think everyone needs to use each and every one of these, especially at the beginning. But I do find the list useful for those with "older" blogs, when you are thinking "Where else can I go to get more people to see my blog?" That is what I did with my first website. Again, I agree about identifying the ones that will work best, so we don't spend too much time getting no results. Thanks so much for the comment.
Brankica, very comprehensive list there. Can't think of anything else :D RT! The traffic source I hardly use is answering questions; I find that traffic conversion there is poor for me, so I don't focus too much on it. Of all those in the list, focus on the ones that give you the best kind of traffic. Forums can be particularly useful, especially when people are there asking for help. Answering their questions and giving them a link that benefits their query is one good way of getting the right kind of traffic.
Hey Julia, thanks so much for the comment. I just saw the idea you were talking about, and it is awesome. I think you are right because, although it seems a bit inconvenient, I have personally written down a few URLs like that in my cell phone or notebook to check out when I got back to the computer. Wow, this is something I need to think about. Also, how about those big billboards?
Optimise for your personas, not search engines. First and foremost, write your buyer personas so you know to whom you’re addressing your content. By creating quality educational content that resonates with your ideal buyers, you’ll naturally improve your SEO. This means tapping into the main issues of your personas and the keywords they use in search queries. Optimising for search engines alone is useless; all you’ll have is keyword-riddled nonsense.

Direct traffic refers to traffic you receive to your website that doesn't come through any other channel. So, when you type www.hubspot.com into your search bar and hit 'Enter,' you're accessing HubSpot.com via direct traffic. If someone posted a link to www.hubspot.com on Facebook, however, and you clicked on that link, your visit would be bucketed in HubSpot.com's social media sources.
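Analytics tools generally make this distinction by inspecting the HTTP referrer of the visit. As a rough, hypothetical sketch of the bucketing logic (not HubSpot's or any vendor's actual rules):

    # Hypothetical channel bucketing by referrer domain, for illustration only.
    SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "linkedin.com"}
    SEARCH_DOMAINS = {"google.com", "bing.com"}

    def classify_channel(referrer_domain):
        if not referrer_domain:            # no referrer: typed URL or bookmark
            return "direct"
        if referrer_domain in SOCIAL_DOMAINS:
            return "social"
        if referrer_domain in SEARCH_DOMAINS:
            return "organic search"
        return "referral"                  # any other linking site

    classify_channel(None)             # -> "direct"
    classify_channel("facebook.com")   # -> "social"

The key point is that a visit carrying no referrer information falls into the direct bucket by default, which is why direct traffic ends up being such a mixed bag.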
You can apply this to marketing in a few ways. If, for example, you purchase paid search advertising, you’ll want to make sure those “CPC” sources have generally low bounce rates. If a pay-per-click or cost-per-click campaign has a high bounce rate: (1) check your landing page to make sure that it provides the content promised in your ad, (2) check your ad copy to ensure it is clear, and (3) check your keywords.
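For reference, bounce rate is typically calculated as single-page sessions divided by total sessions. With hypothetical numbers: if a CPC campaign sends 1,000 visits and 700 of them leave without viewing a second page, then

    bounce rate = single-page sessions / total sessions = 700 / 1,000 = 70%

which would be a strong signal to review the landing page, the ad copy, and the keywords.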
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
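As a hypothetical illustration of such a silo (the topic and URLs are made up):

    /running-shoes/                      <- silo landing page, in the main navigation
        /running-shoes/trail/            <- subpage; links to siblings and back up
        /running-shoes/road/
        /running-shoes/beginners-guide/
        /running-shoes/buying-checklist/

Each subpage interlinks with its siblings and links back up to /running-shoes/, while the landing page links down to every subpage, keeping the whole theme tightly connected.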
Two misconceptions have grown up around direct traffic. The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.
This was an extremely comprehensive, well-thought-out list, Brankica! I would never have thought of eBay or Craigslist, so I thought your spin on that was interesting. I would definitely recommend this list to any newbie blogger. :) It took me a while to come up with a #51 just because you seemed to cover everything, lol. It's an angle on answering questions: volunteer to be interviewed about your niche by other bloggers or on a site like HARO. If you provide some great value, it is the next logical step for someone to visit your site to learn more about you. Great work!!
Google is currently being inundated with reconsideration requests from webmasters all over the world. On public holidays the Search Quality teams do not look at reconsideration requests; see the analysis below. In my experience it can take anywhere from 15-30+ days for Google to respond to reconsideration requests, and during peak periods it can take even longer.
Regarding Link Detox, the links it diagnoses as Toxic are generally accurate calls, as they're either not indexed by Google or carry malware/viruses, but I recommend a manual review of any links diagnosed as Suspicious. I used it recently to start cleaning up our backlinks, and some legitimate sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.
Hi Brankica! I had to stop by again and let you know how a few of these ideas of yours are working for me. Purely because of this article, I put an ad on Craigslist for my Natural Parent consulting. Also I started posting a bit on Q&A sites, but I should bump that up. Paper.li didn't make a lot of sense to me until the other day when I was mentioned on Twitter, followed the link, and got to see exactly what it was. Then I went, "Ohhh! That's what she meant!" lol Next on my list is Slideshare. My Beat the Stress articles seem to be the most successful so I'll use those. Maybe I should shift focus a little to stress management and parenting? =) Thanks, as always! I should just print out this list.

Hey, Matt! Thank you for sharing; I learned a lot from it, but I still have a question. We began SEO work on our site two years ago, and our organic traffic grew five times (from 8K to 40K per day). But two years later, it has become very difficult to grow it any further, and it has even dropped to 3.2K per day. Can you give me any advice to make our site's traffic grow again? Thank you in advance!