Another key consideration is that social platforms have become search engines in their own right. Just look at Twitter, for example: you can use it to see the biggest trending news of the day and easily search for great content. You can search based on your location, in different languages, and if you click on Advanced Search there are even more options.
Go to local events or Meetup events and connect with bloggers in your industry. An example of an event I run to connect with bloggers and people in the online marketing world is: http://www.meetup.com/Online-Marketing-Sydney/. Make friends first and then try to gain guest posts later. I am not really a fan of websites that are flooded with guest posts one after another; that is the type of thing Google is just waiting to target.
When you run a PPC campaign -- whether through Google or some other PPC provider -- you can track how much site traffic it drives in this part of your sources report. Obviously, for proper PPC campaign management, you'll also need to review whether that traffic actually converts. As with email marketing as a source, be sure to include tracking tokens with all of your paid search campaigns so this traffic is properly bucketed as paid search.
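As a rough illustration of what those tracking tokens look like (the domain, campaign and keyword values below are hypothetical, not from any real campaign), a paid-search landing URL is usually tagged with UTM parameters:

```python
from urllib.parse import urlencode

# Hypothetical landing page and campaign values, for illustration only.
base_url = "https://www.example.com/landing-page"
utm_params = {
    "utm_source": "google",         # where the click came from
    "utm_medium": "cpc",            # typically bucketed as paid search by analytics tools
    "utm_campaign": "spring_sale",  # made-up campaign name
    "utm_term": "organic traffic tips",
}

tagged_url = f"{base_url}?{urlencode(utm_params)}"
print(tagged_url)
# https://www.example.com/landing-page?utm_source=google&utm_medium=cpc&utm_campaign=spring_sale&utm_term=organic+traffic+tips
```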
I had great results with StumbleUpon because I have been a user for years and I stumble a lot of things that have nothing to do with blogging or my blog. And NetworkedBlogs is working great for me too :) I am glad you found some new sources here. I would love it if you had the time to look into one or two that seem interesting to you, and maybe try them out... and then come back and tell me if they worked OK for you?
Ha ha ha, John, aren't you trying to be funny. Well, I need to inform you that I have been to your site, read it, and totally DON'T agree with you :P Your blog is great and I don't see many people running from it, lol. And it's definitely bookmarked for future reference; there is no way to go through it in a day (I can't do it and I wrote it!). Hope you will use some of these and send me some feedback about the results.
It works in a similar way to a canonical tag, which signals when a duplicate version of a page exists. In this case, though, it helps Google index local content more easily within local search engines. It also helps to pass trust across sites and improves the way Google crawls these pages. To learn more about the errors you should avoid when it comes to hreflang tags, you can check our previous article.
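To make the idea concrete, here is a minimal sketch of the kind of hreflang annotations involved; the URLs and locale codes are made-up placeholders, and the snippet simply prints the link elements each variant would carry:

```python
# A minimal sketch of hreflang annotations mapping locale variants of one page.
# URLs and locale codes are hypothetical placeholders.
variants = {
    "en-us": "https://www.example.com/us/pricing",
    "en-gb": "https://www.example.com/uk/pricing",
    "fr-fr": "https://www.example.com/fr/tarifs",
    "x-default": "https://www.example.com/pricing",
}

# Every variant should list all of the alternates (including itself) so the
# annotations are reciprocal -- missing return links are a common hreflang error.
for lang, url in variants.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
```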
Traffic data is a great way to take the temperature of your website and marketing initiatives. When you are writing and promoting blog content on a regular basis, you can use traffic data to track results and correlate these efforts to actual ROI. Be sure to look at traffic numbers over long-term intervals to see trends and report on improvement over time.  

Hi Brankika, With respect to EZA, I no longer submit to them. They rejected some of my articles, saying that the page I was linking to did not contain enough information. I linked to my blog and to the original article. I believe they didn't like the link to the original article. That being the case, they no longer allow one of the cardinal rules of syndication as per Google itself... "Link to the original". Once they stopped allowing that, they were no longer useful to me. Thanks for the great resource. Mark
Would you mind sharing your StumbleUpon username with me? I would love to follow you and your shares! :) I am beginning to get more active on SU and it is definitely paying off. I have had a few posts in the last several weeks that have brought in 600-700 visits each. :) As for your "challenge", I am going to try out paper.li. I will let you know how it goes.

And although livescience.com has way more links than its competitor, it seems that emedicinehealth.com is performing better for a lot of important keywords. Why, you might wonder? The answer is simple: content. Although both offer good content, livescience.com is a site offering general information about a lot of things, while emedicinehealth.com offers articles highly relevant to its topics, leaving the impression that the site really offers reliable content. We are not saying that the 5 million links livescience.com has do not matter, yet it seems that in some cases content weighs more.
This is my first time on your blog and I found it when I was reading up on who won the contest at Famous Bloggers. I can definitely see why now! I spent a good amount of time reading and going through all the advice that you've stated in this post since I am fairly new to blogging. I believe one of the biggest hurdles that I will come across when trying to further my blogging experience is driving traffic to my blog. I will make sure to apply many of the techniques you've listed in this post to my blog and hopefully I'll see some traffic! I will definitely continue to read your informative posts. Thanks! :)
You could spend a week researching and writing a 3,000-word in-depth guide only to find that, in a month, its traffic is being eclipsed by a 300-word post that took you one tenth of the time to write. That little gem could start ranking for some pretty valuable keywords – even if you never planned for it to. Give it a makeover and you’ll see your rankings on SERPs (search engine results pages) and organic traffic soar. We’ve seen this strategy work with both our clients and our own website. Big bonus: it’s actually an easy strategy to pull off. Here’s how:
Before developing this subject further, indulge me while I remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
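Before shipping robots.txt changes, it can help to sanity-check that the pages you care about are still crawlable. This is only a rough sketch using Python's standard robots.txt parser; the domain and paths are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Check that important pages are not accidentally disallowed for Googlebot.
# The domain and paths below are hypothetical placeholders.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/organic-traffic-guide",
    "https://www.example.com/category/seo/",
]

for url in important_urls:
    status = "OK" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:8} {url}")
```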
Buy Canadian search traffic visitors from Google.ca for 30 days – we will send real Canadian visitors to your site via Google.ca's search field with your keywords to improve your SERP CTR and SEO strategy. All visitors will show as organic traffic in your Google Analytics. Having a constant stream of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, use our Google Canada Search Traffic Service today…
The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.
For many startups, this means doing enterprise SEO on a small business budget, which comes with a few compromises. The problem is, Google doesn’t accept compromises when it comes to search optimisation and you need to get the fundamentals spot on. The good news is, the sooner you get these right, the faster you’ll be able to build a self-sustaining SEO strategy that doesn’t come back to bite you in the budget later.
Now that we have the data we want in the chart, we use the advanced search to filter it down to only the traffic we want to see. Click the blue “Advanced” link beside the search bar at the top right of your list of landing pages. This opens the Advanced search screen, where we want to set up our query. In the green drop-down, choose “Medium”, and in the text box at the end of the row type “organic”. Click the Apply button below the query builder to apply this search.
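If you prefer to work outside the interface, the same filter can be reproduced on an exported report. This is just a sketch; the file name and column headings are assumptions about how your export is laid out:

```python
import pandas as pd

# Hypothetical CSV export of the landing pages report.
# Assumed columns: "Landing Page", "Medium", "Sessions".
report = pd.read_csv("landing_pages.csv")

# Keep only organic traffic, mirroring the Medium = "organic" advanced search.
organic = report[report["Medium"] == "organic"]

top_landing_pages = organic.sort_values("Sessions", ascending=False).head(20)
print(top_landing_pages[["Landing Page", "Sessions"]])
```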
While it’s best to create a custom dimension for filtering out bots, applying the generic bot filter is a good place to start. It’s important to note that filters cannot be applied retroactively, so if you’ve recently turned on this feature, expect reported traffic to drop going forward rather than in your historical data. Additionally, double-check that you are filtering out your own traffic and IP address.
I have 103 with the trackbacks at the bottom, lol. For now... and I hope to get many more. I think the sources can be useful and I hope someone will find good use for them :) I love how people are in the mood to comment. My favorite thing about blogging is all the connections, and I try to be really helpful. So I guess that is the reason I get a lot of comments. Not to mention that I even put a post up on my blog asking everyone to help me be one of the winners :) This is my first contest. I am glad you liked it and sorry for popping up everywhere, lol.
I have always believed in good quality content, well structured and written in a way that isn’t just promotional talk. Thanks for sharing this information with us; it’s always helpful to have everything written in a concise manner so we can remind ourselves now and again of what it takes to increase organic traffic. As an SEO consultant myself, I come across websites all the time that are messy and still using tactics that have long been out of date. Having a successful website is all about quality content and links. I like that you stated what the problem was and then how you fixed it for the client. Great article.
Armed with this information, ecommerce marketers can learn why some sources might be underperforming or focus efforts on sources that drive better quality traffic. In some cases, this might mean relying less on search engines and more on social media or brand awareness. Other times the opposite could be the best course of action. Either way, the Traffic Sources section in Google Analytics can help.

I'm not in the contest, but if I was - I'd be SKEEEEERED. This was awesome, really - there were MAYBE 3 things I knew... The rest was new. I'm lame that way. The great thing about this is that as a blogger - you've covered ideas I've never thought of...I get my traffic mostly from SEO (not on my blog, but in my websites which are product based review sites) - but there's enough meat in this post I can use for my niche sites to keep me in the black, so to speak (ink I mean, not socks). Awesome post, Brankica - I'm speechless. (If you ignore the foregoing paragraph.)

For our client: we rolled out a successful implementation of rel="author" for the three in-house content writers the company had. The client had over 300 articles written by these content writers over the years, and it was possible to implement rel="author" for all the aged articles. I advise anyone who has a large section of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, as it can only benefit CTR.
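For readers unfamiliar with what that course markup looks like, here is a rough sketch of a schema.org Course object; the course name, description and provider are invented placeholders, and the snippet just prints the JSON-LD you would embed in a script tag:

```python
import json

# A rough sketch of schema.org Course markup; all values are hypothetical placeholders.
course_markup = {
    "@context": "https://schema.org",
    "@type": "Course",
    "name": "Introduction to Online Marketing",
    "description": "A beginner course covering SEO, PPC and analytics basics.",
    "provider": {
        "@type": "Organization",
        "name": "Example Academy",
        "sameAs": "https://www.example.com",
    },
}

# This is the JSON-LD body you would place inside a
# <script type="application/ld+json"> tag on the course page.
print(json.dumps(course_markup, indent=2))
```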
Talk about a blog post that's worth bookmarking and reading over and over!! Nicely done Sharp Shooter. There are just too many here that I don't even think of looking at (and even a few I've never heard of) and now I'm feeling kinda silly that I have all these resources to take advantage of and I don't. I'm just gonna have to count on you to remind me and give me a good kick in the toosh every so often lol! This is beyond useful Brankica. I hope I'll smarten up and take some of your awesome advice here! Thanks for writing this up for us. You rock "Mon..y" lol ;)

Referral traffic in Google Analytics can also include your social traffic or even your email visits. This is largely because there are different ways to track data and there are imperfections in the system, but it also depends on which reports you look at. For example, Acquisition > All Channels > Referrals will include much of your social traffic, while the default channel grouping keeps Social out of Referrals. The most reliable fixes are to tag the links you control with campaign parameters and to cross-check more than one report before drawing conclusions.
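As one example of the cross-checking side, a referrals export can be reclassified against a list of known social domains. This is only an illustration; the source names, session counts and domain list are all assumptions:

```python
import pandas as pd

# Hypothetical referrals export; sources and session counts are made up.
referrals = pd.DataFrame({
    "Source": ["t.co", "m.facebook.com", "news.ycombinator.com", "partner-blog.com"],
    "Sessions": [420, 310, 150, 95],
})

# Domains we would rather count as social than as generic referrals.
SOCIAL_DOMAINS = {"t.co", "facebook.com", "m.facebook.com", "linkedin.com", "pinterest.com"}

referrals["Channel"] = referrals["Source"].apply(
    lambda source: "Social" if source in SOCIAL_DOMAINS else "Referral"
)
print(referrals.groupby("Channel")["Sessions"].sum())
```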
I’ve always been a believer that hard work gets the best results, and in practice it always ends up being true. On the web it’s no different. If you want more organic traffic, you have to work for it. That means giving your best effort every time, going after opportunities your competitors have missed, being consistent, guest blogging strategically, and staying on Google’s good side.