Forum comments work much the same way as blog comments, except you tend to lack the base post from which to play off. Instead, you need to invest some time into getting to know the culture of the forum: the prominent users, the rules, and the discussion flow. If people generally post one sentence at a time, a 3,000-word post will be excessive and may be mocked. If people tend to post lengthy discussions, short posts may land badly. And, like Reddit, some sites may be very strict about enforcing no-advertising rules.
So what does this mean? "Bounce rate" can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they cannot think of a good reason to stay. There is one notable exception to this: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
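As a rough sketch, bounce rate per traffic source is just single-page sessions divided by total sessions. The session data and source names below are purely illustrative, not pulled from any real analytics account:

```python
# Bounce rate per source: single-page sessions / total sessions.
# Each number is the count of pages viewed in one session.
def bounce_rate(sessions):
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if pages == 1)
    return bounces / len(sessions)

by_source = {
    "organic":  [3, 1, 5, 2, 1],   # 2 of 5 sessions bounced
    "referral": [1, 1, 1, 2, 1],   # 4 of 5 sessions bounced
}

for source, sessions in by_source.items():
    print(f"{source}: {bounce_rate(sessions):.0%} bounce rate")
```

On this toy data the referral source bounces at 80% versus 40% for organic, which is exactly the kind of gap that flags a low-quality traffic source.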
Visual assets aren’t regular images you might pull from a Google Image search. Instead, these are unique diagrams or infographics you’ve created specifically for your epic content. These kinds of diagrams or infographics explain a theory, communicate a point, or showcase data in exciting and interesting ways—and gain attention (and links) because of it.
When a user follows a link on a secure (HTTPS) page to a non-secure (HTTP) page, no referrer data is passed, meaning the session appears as direct traffic instead of as a referral. Note that this is intended behavior. It’s part of how the secure protocol was designed, and it does not affect other scenarios: HTTP to HTTP, HTTPS to HTTPS, and even HTTP to HTTPS all pass referrer data.
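The bucketing rule described above can be sketched in a few lines. This is an illustrative simplification, not how any particular analytics product is implemented (real tools also layer campaign tags and search-engine detection on top):

```python
from typing import Optional

def classify_hit(referrer: Optional[str]) -> str:
    """Bucket a pageview by its Referer header: no header means 'direct'."""
    return "referral" if referrer else "direct"

# The four protocol combinations described above. Only the
# HTTPS -> HTTP hop arrives with the referrer stripped.
scenarios = {
    ("https", "http"):  None,                     # referrer stripped
    ("http",  "http"):  "http://blog.example/",
    ("https", "https"): "https://blog.example/",
    ("http",  "https"): "http://blog.example/",
}
for (src, dst), ref in scenarios.items():
    print(f"{src} -> {dst}: {classify_hit(ref)}")
```

The HTTPS-to-HTTP row is the one that shows up as direct traffic, matching the behavior described above.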
If you don't want your Google traffic to drop dramatically due to indexing and content pruning, below are the steps you need to take to prune your own content successfully. We've covered this subject in a previous blog post. Before diving in, though, we want to stress that the decision to remove indexed pages from the Google Index is not an easy one and, if handled improperly, it can go very wrong. At the end of the day, you should keep the Google Index fresh with content that is worth ranking and that helps your users.
Another big benefit of using social media is that it can help you gain more influence and grow your business across the board. And although social media has little direct influence over your search engine ranking, it can help your SEO, albeit indirectly. By helping you grow your business and your website (more traffic, more backlinks, and so on), it improves your website's domain authority and, by extension, its search engine ranking.

Hey Delena, I am not worried about you stalking me, I like it :) I think you can advertise any service on Craigslist. The idea you have is awesome. Not only can you point your ad to your blog, but you can also make some extra cash. And instead of just going locally, I am sure there is something you can do worldwide, like giving people advice over Skype or something. Thanks for the awesome comment and the twithelp suggestion.

Google measures average time on site by first collecting each visitor's exact time on a particular page. Imagine that a visitor lands on page 1 of your site. Google places a cookie, including a unique code for the visitor and a time stamp. When that visitor clicks through to page 2 of your site, Google again notes the time, and then subtracts the time the visitor arrived at page 1 from the time the visitor arrived at page 2, giving the time spent on page 1. Google then totals the time spent on each page and averages across visitors to get the average time each visitor spends on the site.
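The timestamp arithmetic described above looks like this. The session data is made up for illustration:

```python
from datetime import datetime

# One visitor's session: (page, arrival time) pairs.
session = [
    ("page1", datetime(2024, 1, 1, 12, 0, 0)),
    ("page2", datetime(2024, 1, 1, 12, 1, 30)),
    ("page3", datetime(2024, 1, 1, 12, 4, 0)),
]

# Time on each page = next arrival minus this arrival. Note that the
# time on the final page can't be measured this way, because there is
# no later timestamp to subtract from.
times = [
    (nxt[1] - cur[1]).total_seconds()
    for cur, nxt in zip(session, session[1:])
]
total_time_on_site = sum(times)

print(times)               # [90.0, 150.0]
print(total_time_on_site)  # 240.0
```

Averaging these per-visit totals across all visitors gives the "average time on site" figure in the report.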
This is a crucial area. If you do not have schema markup and rel="author", you are costing your business money. It is as simple as that. As an example, say I want to make spaghetti (pasta) for dinner. I search for "Spaghetti Recipe" and instantly see some great markup in play, but one competitor has no markup and no rel="author"; they are losing business in my eyes. Wouldn't you agree?
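For the recipe example, the markup in question is typically a schema.org Recipe block. Here is a minimal JSON-LD sketch built in Python; the field names follow the schema.org Recipe vocabulary, but the values (recipe name, author, ingredients) are invented for illustration:

```python
import json

# Minimal schema.org Recipe markup, serialized as JSON-LD.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Spaghetti Bolognese",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "cookTime": "PT30M",  # ISO 8601 duration: 30 minutes
    "recipeIngredient": ["spaghetti", "ground beef", "tomato sauce"],
}

# The output would go inside a <script type="application/ld+json"> tag.
print(json.dumps(recipe, indent=2))
```

Search engines can read this block and show rich results (cook time, ratings, images) directly on the results page, which is the visibility advantage described above.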
The following example shows a client who receives 55% of their website traffic from organic search. Typically this is a strong performance, but again it will vary greatly based on paid search. More important than the percentage is the number of sessions itself: this tells us how many visits (note: not unique visitors) we received through this channel. In this example, we have 7,486 visits in one month that all came from organic searches.
Superb resource list Brankica, thank you. I've also included it in this week's Erudition, not just because you're in a competition, but because it really is a resource we should all have bookmarked. Actually, I need to make it a study priority and see how many of the sources I can reasonably use on a regular basis. Link to Erudition | Help files from Information Junkies Anonymous
Regarding Link Detox: its Toxic diagnoses are generally reliable, since those links are either not indexed by Google or carry malware, viruses, and the like, but I recommend manually reviewing anything diagnosed as Suspicious. I used it recently to get started cleaning up our backlinks, and some legitimate sites and blogs landed under Suspicious simply because they didn't have many links pointing to them.

When you run a PPC campaign -- whether through Google or some other PPC provider -- you can track how much site traffic it drives in this part of your sources report. Obviously, for proper PPC campaign management you'll also need to be reviewing whether that traffic actually converts, too. As with email marketing as a source, be sure to include tracking tokens with all of your paid search campaigns so this traffic is properly bucketed as, you know, paid search.
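"Tracking tokens" here usually means UTM query parameters appended to the landing-page URL. A minimal sketch of tagging a paid-search URL (the parameter names follow the common UTM convention; the domain and campaign values are illustrative):

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign):
    """Append UTM tracking tokens so analytics buckets the visit correctly."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,      # "cpc" conventionally marks paid clicks
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

url = tag_url("https://www.example.com/landing", "google", "cpc", "spring_sale")
print(url)
# https://www.example.com/landing?utm_source=google&utm_medium=cpc&utm_campaign=spring_sale
```

Without these tokens, paid clicks can be misattributed to ordinary organic or referral traffic.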
Put simply, duplicate content is content that appears on the web in more than one place. Is this such a big problem for readers? Maybe not. But it is a very big problem for you if you have duplicate content (internal or external), because search engines don't know which version of the content is the original and which one they should rank.
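One quick way to hunt for internal duplicates is to fingerprint each page's normalized text and look for collisions. This is a toy sketch with made-up URLs and page text, not a crawler:

```python
import hashlib

def fingerprint(text):
    """Hash of the text with case and whitespace normalized away."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

pages = {
    "/widgets": "Our widgets are the best widgets.",
    "/widgets?sort=price": "Our widgets are   the best widgets.",  # same content
    "/about": "We have been making widgets since 1999.",
}

seen = {}
for url, text in pages.items():
    h = fingerprint(text)
    if h in seen:
        print(f"{url} duplicates {seen[h]}")
    else:
        seen[h] = url
```

Here the sorted product-listing URL is flagged as a duplicate of the base page, which is exactly the kind of internal duplication search engines struggle to disambiguate.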
For our client, we monitored everything on a daily basis. If something came up that needed fixing, we were quick to implement it with the development team at the business. We also re-ran numerous campaigns multiple times: they had worked effectively the first time around in generating significant traffic, so repeating them was second nature.

It increases relevancy: Siloing ensures all topically related content is connected, and this in turn drives up relevancy. For example, linking to each of the individual yoga class pages (e.g., Pilates, Yoga RX, etc.) from the “Yoga classes” page helps confirm—to both visitors and Google—that these pages are in fact different types of yoga classes. Google can then feel more confident ranking these pages for related terms, as it is clearer the pages are relevant to the search query.
About.com – How many times have you tried to do something and needed instructions? You Google it, and it brings up how-tos from About.com. Well, you can contribute some of the how-tos for your niche to this site and get traffic from it. Additional tip – there is an About.com forum where you can be helpful to people in need and get additional traffic to your blog.
Hi Brankica, Thanks for a great article, you are so generous with sharing your knowledge and I always look out for your posts. I recently bought a product for my personal use that really blew me away, so I wrote about it on my blog and sent the supplier a link. They were absolutely delighted when they saw my blog post and added it to their testimonial page with a link back to me. Indirectly the product links to a topic covered on my website, so this was a great way for me to get a little extra exposure. I'm definitely going to work through your list and do at least something new from it every day. Cheers, Caroline

We have been using AccuRanker for the past few months and the team has so much good stuff to say about it. Not only do the rankings seem accurate, there are a lot of added features within the platform that make reporting on the data very effective. As with any new tool, there are some feature recommendations that we have put forward, and AccuRanker has been very receptive to these ideas; we are hopeful that they will be implemented in the near future.
Bonus: WHAFF is a good tool to drive traffic, and also a good tool to make money if you share your invite code on microworkers.com and similar websites. Get WHAFF rewards on your Android device, and use my invite code CG90779 to get your first $0.50. After accumulating at least $10.50 you can withdraw via your Skrill account and upload the balance to Microworkers to run further campaigns.
Organic search engine optimization (organic SEO) refers to the methods used to obtain a high placement (or ranking) on a search engine results page in the unpaid, algorithm-driven results of a given search engine. Methods such as targeting relevant keywords, earning backlinks, and writing high-quality content can all improve a site's ranking. Black hat SEO methods, such as keyword stuffing and link farming, can also boost rankings in the short term, but they violate search engine guidelines and put a site at risk of penalties.

The first step that I take is to do a quick Google search to find pages on my domain where I've mentioned the keyword in question so that I can add an internal link. To do this, I'll use the following search query, replacing DOMAIN with your domain name (e.g. matthewbarby.com) and KEYWORD with the keyword you're targeting (e.g. "social media strategy"):
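The query itself appears to have been dropped from the excerpt. A common form for this kind of domain-restricted lookup is Google's site: operator; this reconstruction is my assumption, not necessarily the author's exact query:

```
site:DOMAIN "KEYWORD"
```

For the example values given, that would be `site:matthewbarby.com "social media strategy"`, which returns only pages on that domain mentioning the phrase.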

Hey Brankica! Awesome contest entry! Love it. I, like you, get a decent amount of traffic from Ezine Articles, too! I am not in a rush to do anything drastic just because of the algorithm change. Hubpages is also a lot of fun and I am pretty close to finally making my $100 payout from Adwords. My hubs have been viewed over 10,000 times and I've had a lot of fun there.
You could get even more specific by narrowing it down to customer base. Is there a specific group of clients you tend to serve? Try including that in your long-tail key phrase. For example: “SEO agency for non-profits in Albuquerque NM.” That’s a key phrase you’re a lot more likely to rank for. Not to mention it will also attract way more targeted, organic traffic than a broad key phrase like “SEO agency.”