I have always believed in good-quality content that is well structured and written in a way that isn't just promotional talk. Thanks for sharing this information with us; it's always helpful to have everything written in a concise manner so we can remind ourselves now and again of what it takes to increase organic traffic. As an SEO consultant myself, I come across websites all the time that are messy and still using tactics that have long been out of date. Having a successful website is all about quality content and links. I like that you stated what the problem was and then how you fixed it for the client. Great article.
If both pages are closely related (lots of topical overlap), I would merge the unique content from the lower-ranking article into the top-ranking one, then 301 redirect the lower-performing URL to the top-ranking one. This makes the canonical version more relevant and gives it an immediate authority boost. I would also fetch it in Search Console right away, do some link building, and possibly a little paid promotion to seed some engagement. Update the timestamp.
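As a minimal sketch of the redirect step, assuming an Apache server and placeholder paths (neither URL comes from the original text), the 301 could look like this in `.htaccess`:

```apache
# Hypothetical example: permanently redirect the lower-ranking
# article to the merged, top-ranking one.
Redirect 301 /blog/old-lower-ranking-article /blog/top-ranking-article
```

The 301 (permanent) status is what passes the old URL's authority to the merged page; a 302 (temporary) would not consolidate signals the same way.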
For an agency like Aira, the speed of AccuRanker is a huge feature - the ability to do fast checks and instant refreshes on keywords means we can diagnose issues as and when they happen, during meetings or on a client call. We love how quick AccuRanker is and how scalable it is across regions around the world. We use it all the time for client reports and are now also making use of the Data Studio connector to pull data into our own existing report templates.
The response rate here was huge because this is a mutually beneficial relationship. The bloggers get free products to use within their outfits (as well as more clothes for their wardrobe!) and I was able to drive traffic through to my site, get high-quality backlinks, a load of social media engagement and some high-end photography to use within my own content and on product pages.
Visitors can also fall into the direct category by clicking a link to your site from an email or PDF document, accessing your site from a shortened URL (an abbreviated version of your website address), clicking on a link from a secure (HTTPS) site to your non-secure (HTTP) site (the browser drops the referrer in this case), or clicking on a link to your site from a social media application like Facebook or Twitter. And there is a chance that a visit from an organic search can end up being reported as direct traffic.
Plan your link structure. Start with the main navigation and decide how to best connect pages both physically (URL structure) and virtually (internal links) to clearly establish your content themes. Try to include at least 3-5 quality subpages under each core silo landing page. Link internally between the subpages. Link each subpage back up to the main silo landing page.
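As a sketch of the structure described above, a single silo under a hypothetical landing page (all URLs are placeholders, not from the original text) might look like this:

```text
/coffee-gear/                     <- silo landing page (in main navigation)
/coffee-gear/espresso-machines/   <- subpage: links to sibling subpages
/coffee-gear/grinders/            <- subpage: links to sibling subpages
/coffee-gear/french-presses/      <- subpage: links to sibling subpages
```

Each subpage links laterally to its siblings and back up to `/coffee-gear/`, so both the URL paths and the internal links reinforce the same topical theme.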
Let’s say that you want to move your blog from a subdirectory URL (yourwebsiterulz.com/blog) to a subdomain (blog.yourwebsiterulz.com). Although Matt Cutts, former Google engineer, said that “they are roughly equivalent and you should basically go with whichever one is easier for you in terms of configuration, your CMSs, all that sort of stuff”, it seems that things are a bit more complicated than that.
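If you do make that move, every old subdirectory URL needs a 301 redirect to its subdomain equivalent so rankings and link equity carry over. A minimal sketch, assuming an Apache server (the rewrite rule below is illustrative, not a complete migration checklist):

```apache
# Hypothetical .htaccess on yourwebsiterulz.com:
# 301 redirect every /blog/ URL to the matching subdomain URL.
RewriteEngine On
RewriteRule ^blog/(.*)$ https://blog.yourwebsiterulz.com/$1 [R=301,L]
```

The captured path (`$1`) preserves each post's slug, so `/blog/my-post` lands on `blog.yourwebsiterulz.com/my-post` rather than the subdomain's homepage.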
So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.
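For instance, a WhatsApp share button might append campaign parameters like these (the domain and parameter values are placeholders, not from the original text):

```text
https://example.com/article?utm_source=whatsapp&utm_medium=social&utm_campaign=share-buttons
```

When a recipient opens that link, analytics tools can attribute the visit to the `whatsapp` source instead of lumping it into direct, which is exactly the dark-social leakage the paragraph above describes.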
So what does this mean? "Bounce rate" can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they cannot think of a good reason to stay. There is one notable exception to this: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you would like to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
Although it may have changed slightly since BrightEdge published its report last year, the data still seem to hold true. Organic is simply better for delivering relevant traffic. The only channel that performs better in some capacities is paid search ads, but only for conversions, not overall traffic delivery (paid search accounted for just 10 percent of total traffic).
Buy Brazilian search traffic visitors from Google.com.br for 30 days – We will send real Brazilian visitors to your site via Google.com.br's search field using your keywords, to improve your SERP CTR and SEO strategy. All visitors will be shown as organic traffic in your Google Analytics. Maintaining a constant flow of organic web traffic is imperative in order to maintain your rankings and minimize the risk of dropping in the SERPs. So don't wait, use our Google Brasil Web Traffic Service today …
The amount of dark social that comprises one's direct traffic is going to vary. For example, you might be diligent about incorporating tracking tokens into your email marketing campaigns, and asking your co-marketing partners to do the same when they promote your site content. Great. You won't have a huge chunk of traffic dumped into direct traffic that should really go into email campaigns. On the other hand, perhaps you're embroiled in a viral video firestorm, and a video on your site gets forwarded around to thousands of inboxes ... you don't exactly have control over that kind of exposure, and as a result you'll be seeing a lot of traffic without referral data in the URL that, consequently, gets bucketed under direct traffic. See what I mean? It depends.
Hi Brankika, With respect to EZA, I no longer submit to them. They rejected some of my articles, saying that the page I was linking to did not contain enough information. I linked to my blog and to the original article. I believe they didn't like the link to the original article. That being the case, they no longer allow one of the cardinal rules of syndication as per Google itself..."Link To The Original". Once they stopped allowing that, they were no longer useful to me. Thanks for the great resource. Mark
Hey Julia, thanks so much for the comment. I just saw the idea you were talking about, that is awesome. I think you are right because, although it seems a bit inconvenient, I have personally written down a few URLs like that in my cell phone or notebook to check out when I am back at the computer. Wow, this is something I need to think about. Also, how about those big billboards?
Hey Michael, thanks so much for the comment. After this post, I decided to revamp my traffic generation strategy: work more on my existing sources and a bit more on the ones I was not using that much. I saw results almost from the first day. This list is good because of the diversity of the ideas, because if one thing doesn't work for you, there has to be another one that will skyrocket your stats :)
Thanks for the comment. I would not say it is impossible to create high-quality backlinks from scratch without content; you just need to review competitor backlinks and see if there are any easy targets. We have had some good luck in the education space acquiring links on the same pages as competitors on PR5+ .edu sites. It all revolves around the outreach strategy that you put in place.
Wow, what a massive post Brankica! I love it. :) I especially liked your suggestion to answer questions on answer sites. I've tried Yahoo Answers before, though didn't do much with it. I'll definitely give it another go in the future though! Man, just when I was tidying up my to-do list, you had to make it longer for me. ;) Thanks for such an awesome read! By the way, I found this post on StumbleUpon (crazy how the world works). =p Christina
Let’s say you wish to block all URLs that have the .pdf extension. You would write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” at the end tells bots that only URLs ending in .pdf should not be crawled, while any other URL that merely contains “.pdf” can still be. It might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
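To make the “$” behavior concrete, here is a small robots.txt sketch (the example paths are illustrative and not taken from the list mentioned above):

```text
User-agent: Googlebot
# "$" anchors the match to the end of the URL:
Disallow: /*.pdf$      # blocks /files/report.pdf
                       # does NOT block /files/report.pdf?download=1
Disallow: /private/    # no "$": blocks everything under /private/
```

Dropping the “$” from the first rule would also block URLs like `/files/report.pdf?download=1`, which is exactly the kind of one-character difference the paragraph above warns about.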
This is the number of views that you can test each month on your website. It's up to you how you use them: allocate all the views to one test or to multiple tests, on one page or on multiple pages. If you have selected the 10,000 tested-views plan and you run an experiment on a category page that is viewed 7,000 times per month, then at the end of the month 7,000 views will be counted against your tested-views quota.