For our client: We rolled out a successful implementation of rel="author" for the client's three in-house content writers. The client had more than 300 articles written by these writers over the years, and it was possible to implement rel="author" across all of the older articles. I advise anyone who has a large body of content to do the same, as it will only benefit the website. We were also in the process of rolling out further schema markup to the site's course content, since it benefits CTR.

Hey Brankica, this is a great post. I loved it. I am a newbie in the world of blogging, just one month old, and I am really happy with my blogging so far. The problem is that I get very little traffic and almost no comments at all. I will try some of the tips you have given here and then tell you how they worked out for me. Thanks, this was a lot of information I had never heard before. I hope it will be useful for me. Thanks again for the greatest post I have ever read.
Visual assets aren’t regular images you might pull from a Google Image search. Instead, these are unique diagrams or infographics you’ve created specifically for your epic content. These kinds of diagrams or infographics explain a theory, communicate a point, or showcase data in exciting and interesting ways—and gain attention (and links) because of it.
By far the best rank tracking software on the market. We've tested a variety of rank tracking tools, and AccuRanker is by far the most precise. As a tool, AccuRanker is very intuitive to use, and the support is just great. We appreciate the speed and the options for filtering and tagging, and especially the instant refresh function. These features, and a bunch of others, are things we use a lot, and I myself have been a satisfied user since 2016.
In all of the above cases, you’ve potentially found a page on your website that could turn into a keyword monster with a little extra content and keyword integration. Although we’ve discussed blog posts in this guide, don’t completely overlook the other pages on your website – these tips will work to increase organic traffic on all pages, not just individual blog posts.
At our agency, we work with sites of varying sizes, from very large to quite small, and recently we have noticed a trend at the enterprise level: these sites aren't relying as much on Google for traffic any more. In fact, some of them are getting only around 10 percent of their organic traffic from the search giant.
Hi Brankica, I'm really hard pressed to come up with #51 here, as it's such an extensive list. Quite a few of my customers have had success driving traffic and awareness with Groupon recently. By setting up special offers, they are able to get real people through the doors, too. Just asking people to quote the coupon code when they fill in the contact form, or when they pick up the phone, is a great way of tracking success. I'm still getting my head around how this could be used by an online-only business, but for a brick-and-mortar with a web presence it's a nice little option.

What we look for in a list like this is to identify the pages that are performing well so we can continue to capitalize on those. In this example, we see that the inventory pages are getting significant traffic, which is great, but we also see that the Team page and the Service page are both also ranking well. With this information in mind, we should revisit these pages to ensure that they are structured with the right content to perform as the visitor’s first page view, possibly their first glimpse at your business.
This section is particularly helpful when looking at organic results from search engines, since it will let you know which search queries resulted in engaged traffic. Below is another example from a site that focuses on electronic components. Overall, the Google organic source was well behind the site average, but some specific search queries were actually performing better than average.
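The comparison described above can be sketched in a few lines. This is a hedged illustration only: the query names, bounce rates, and the site average are invented for the example, not taken from the article's data.

```python
# Assumed site-wide average bounce rate (illustrative number).
site_avg_bounce = 0.55

# Hypothetical per-query bounce rates from an organic search report.
queries = {
    "buy resistors online": 0.40,
    "electronic components": 0.62,
    "capacitor datasheet": 0.35,
}

# Queries performing better than the site average (lower bounce = more engaged).
better_than_average = [q for q, bounce in queries.items() if bounce < site_avg_bounce]
print(better_than_average)
```

The same filter applies to any engagement metric: swap bounce rate for pages per session or average session duration, flipping the comparison where a higher number means better engagement.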
One of the things that can slow-bleed the traffic from your site is the quality (or lack thereof) of the content you publish. Previous Google updates like Panda were released specifically to deal with the issue of low-quality content on websites. Long story short: Panda was intended to stop sites with bad content from appearing in the search results.
Hi SEO 4 Attorneys, it could be anything. Is this for your own site or a client's site? It could be an attempt at negative SEO from a competitor. The thing is, people may try to push hundreds of spammy links at a site in hopes of knocking it down. At the end of the day, my best advice is to monitor your link profile on a weekly basis. Try to remove negative links where possible; if you can't remove them, then opt for the disavow tool as a last resort.
We want to see landing pages that came from organic searches, so first we need to add to this dataset the parameter “Medium” which is how Analytics identifies channels. To do this, use the drop down above the table of data and locate the option for “Medium”. The table below should refresh and now you will have a second column of data showing the channel for each landing page.
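The same join of landing pages and mediums can be reproduced offline against an exported report. A minimal sketch with pandas, assuming a flat export with "Landing Page", "Medium", and "Sessions" columns (the column names and numbers here are illustrative, not the actual GA export schema):

```python
import pandas as pd

# Hypothetical rows from a landing-page report export.
data = pd.DataFrame({
    "Landing Page": ["/", "/inventory", "/team", "/blog/post-1"],
    "Medium": ["organic", "organic", "referral", "organic"],
    "Sessions": [1200, 800, 150, 300],
})

# Keep only organic landing pages, busiest first.
organic = data[data["Medium"] == "organic"].sort_values("Sessions", ascending=False)
print(organic[["Landing Page", "Sessions"]])
```

Sorting by sessions surfaces the same "top organic landing pages" view the secondary-dimension report gives you in the interface.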
It works in a similar way to a canonical tag, which signals when a duplicate version of a page exists. But in this case, it helps Google index local content more easily within local search engines. It also helps to pass trust across sites and improves the way Google crawls these pages. To learn more about the errors you should avoid when it comes to hreflang tags, you can check our previous article.
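For reference, hreflang annotations are plain `<link rel="alternate">` elements in the page head, one per localized variant. A small generator sketch (the locale codes and example.com URLs are assumptions for illustration):

```python
def hreflang_tags(variants):
    """Build hreflang <link> elements from a mapping of locale code -> URL.

    Sorted for a stable output order; each variant page should carry the
    full set, including a self-reference and an x-default fallback.
    """
    return [
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in sorted(variants.items())
    ]

variants = {
    "en-us": "https://example.com/page",
    "de-de": "https://example.com/de/page",
    "x-default": "https://example.com/page",
}
for tag in hreflang_tags(variants):
    print(tag)
```

A common error the article alludes to is missing return links: if the English page lists the German variant, the German page must list the English one back, or the annotation is ignored.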

In other words, businesses have much less visibility into which keywords searchers are using to find them, making it much harder to understand which words and terms are working -- or not working -- in their search engine optimization. Google said this would only affect about 11% of searches, but the truth of the matter is that the number is much greater than that, and it is only continuing to increase as Google's user base grows. It's important to keep this curveball in mind when evaluating organic search as a traffic and lead generation source.
Cheers for sharing that thread, I'd not read it. I think most of the confusion here arises because of how standard GA reports work on a last non-direct click basis - if there's a previous campaign within timeout, that user will be attributed to that campaign, even if they're technically returning via direct. MCF reports are the exception, of course, given that they show *actual* direct.
Make a campaign on Microworkers called "Post my Link on Relevant Forums". Microworkers currently has about 850,000 freelance workers worldwide. Posting a link on forums is not that easy: thousands of workers will drop out of your campaign after struggling to find a forum that accepts a link within a post. But the benefit for you is that they will have visited your link to understand your site while searching for a relevant forum for it. In this campaign, you only accept real, clickable links on forums. Out of about 850,000 workers, tens to hundreds of thousands of freelancers will be interested in working on your campaign, provided you make it long-lasting by hiring at least 200 positions and paying $0.30 to $0.50 each time someone successfully posts your link on a relevant forum.
Hey Kimberly, thank you so much for such a comment. Facebook and Google ads are great, if you know what you are doing. I could never get Google ads quite right, so I didn't have as great results as some other people I know. When it comes to Facebook ads, they were great for my niche site and I loved the results. You, as a photographer, can get so much traffic from Flickr. If you are not sure how to do it, go to the Traffic Generation Cafe blog; I posted a guest post over there about getting traffic from Flickr. Forums are great too, although they can be a bit time consuming :) Thanks again, and I can't wait to see your results from the newly found traffic sources!
Are you currently excluding all known bots and spiders in Google Analytics? If not, you may be experiencing inflated traffic metrics and not even know it. Typically, bots enter through the home page and cascade down throughout your site navigation, mimicking real user behavior. One telltale sign of bot traffic is a highly trafficked page with a high bounce rate, low conversions and a low average time on page.
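The telltale pattern described above can be turned into a simple screening heuristic. This is an illustrative sketch only; the threshold values are assumptions, not an established detection rule, and real bot filtering should rely on GA's own bot exclusion setting.

```python
def looks_like_bot_traffic(sessions, bounce_rate, avg_time_on_page,
                           min_sessions=500, bounce_threshold=0.90,
                           time_threshold=5.0):
    """Flag a page matching the bot pattern: heavy traffic, very high
    bounce rate, and near-zero average time on page (seconds).
    Thresholds are illustrative assumptions."""
    return (sessions >= min_sessions
            and bounce_rate >= bounce_threshold
            and avg_time_on_page <= time_threshold)

print(looks_like_bot_traffic(1200, 0.97, 2.1))   # suspicious pattern
print(looks_like_bot_traffic(1200, 0.40, 45.0))  # looks like human traffic
```

Pages flagged this way are candidates for closer inspection (service provider and network reports), not automatic exclusion.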

Google doesn't always include a whole paragraph of text in the Featured Snippet. If you add "Step 1," "Step 2," "Step 3," etc. to the start of each HTML heading within your content (for example, within your H2 tags), Google will sometimes just list out your headings within the Featured Snippet. I've started to see this happen more and more in keywords beginning with "how to".
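The heading pattern above is easy to template. A minimal sketch, assuming H2 headings and invented step titles:

```python
def step_headings(steps, level=2):
    """Render 'Step N: title' HTML headings for a how-to article."""
    return [
        f"<h{level}>Step {i}: {title}</h{level}>"
        for i, title in enumerate(steps, start=1)
    ]

# Example step titles (illustrative only).
for heading in step_headings(["Research your keyword",
                              "Draft the content",
                              "Add step-numbered headings"]):
    print(heading)
```

Keeping the "Step N:" prefix inside the heading element itself, rather than in the surrounding body text, is what lets Google lift the headings directly into a list-style Featured Snippet.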
So what does this mean? "Bounce rate" can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they could not find a good reason to stay. There is one notable exception to this: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
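Concretely, bounce rate is just single-page sessions divided by total sessions. A minimal sketch of the arithmetic, with invented numbers:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Fraction of sessions that viewed only one page."""
    if total_sessions == 0:
        return 0.0
    return single_page_sessions / total_sessions

# Example: 420 of 1,000 sessions left after one page.
print(f"{bounce_rate(420, 1000):.0%}")  # 42%
```

Comparing this figure per traffic source against the site-wide average is what separates engaged sources from low-quality ones.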
Otherwise you might end up polluting your website's traffic. Pages filled with obsolete or low-quality content aren't useful or interesting to the visitor, so they should be pruned for the sake of your website's health. Low-quality pages can affect the performance of the whole site: even if the website itself plays by the rules, low-quality indexed content may drag down the organic traffic of the whole batch.

While it’s best to create a custom dimension for filtering out bots, applying the generic bot filter is a good place to start. It’s important to note that filters cannot be applied retroactively, so if you’ve recently turned on this feature, you should see your reported traffic decrease going forward. Additionally, double-check that you are filtering out your own traffic and IP address.
You could get even more specific by narrowing it down to customer base. Is there a specific group of clients you tend to serve? Try including that in your long-tail key phrase. For example: “SEO agency for non-profits in Albuquerque NM.” That’s a key phrase you’re a lot more likely to rank for. Not to mention it will also attract way more targeted, organic traffic than a broad key phrase like “SEO agency.”