This community is full of opportunities if you're a fashion retailer. One of the major advantages is that members add links to each of the products they feature in their outfits, and those links go directly to product pages. This is the holy grail for ecommerce SEO, and the traffic those links bring through will convert at a very high rate.
And although livescience.com has far more links than its competitor, emedicinehealth.com performs better for a lot of important keywords. Why, you might wonder? The answer is simple: content. Both sites offer good content, but livescience.com publishes general information about a wide range of subjects, while emedicinehealth.com offers articles tightly focused on its topics, leaving the impression that this site really does provide reliable content. We are not saying that the 5 million links livescience.com has do not matter, but it seems that in some cases content weighs more.
BuzzFeed: Everyone knows the primary perpetrator of clickbait and content appropriation on the web, but seemingly few people realize that anyone can write for them. You can sign up for a free account and simply start creating content on their site. A lot of content sees very little exposure there, but if something does catch on – perhaps because of your audience and your brand reputation – BuzzFeed will reach out to you and help by promoting it on the main sections of their site.
Another point you did mention (between the lines: "and updating your backlinks to point to HTTPS URLs") but that I hadn't fully appreciated is that updating backlinks when your site switches from HTTP to HTTPS is very important. Redirecting all HTTP traffic to the HTTPS version of your site at the server level (a 301 redirect) is necessary, but it doesn't solve the loss of referral data. Only updating the backlinks themselves, especially those on HTTPS sites, does. I'd like to emphasize that!
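For reference, the server-side 301 redirect described above can be sketched as an nginx server block (this is a minimal illustration assuming nginx; example.com is a placeholder domain, and your real config will differ):

```nginx
server {
    listen 80;
    server_name example.com www.example.com;
    # Permanently (301) redirect every HTTP request to its HTTPS equivalent
    return 301 https://$host$request_uri;
}
```

Note that this only fixes the protocol on your side; the referral data carried by links on other people's HTTPS sites is still lost until those backlinks themselves are updated.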
Let’s say you wish to block all URLs that have the .pdf extension. You would write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end basically tells bots that only URLs ending in .pdf shouldn’t be crawled, while any other URL merely containing “.pdf” still can be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should stay on the safe side of your website’s traffic.
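To see how crawlers interpret these wildcards, here is a minimal sketch of Google-style robots.txt pattern matching (the function name and example URLs are my own illustration, not part of any library):

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt path pattern matches a URL path.
    '*' matches any sequence of characters; a trailing '$' anchors
    the match to the very end of the URL (Google-style wildcards)."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*' in the regex
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Disallow: /*.pdf$  -> blocks only URLs that END in .pdf
print(robots_pattern_matches("/*.pdf$", "/files/report.pdf"))       # True
print(robots_pattern_matches("/*.pdf$", "/files/report.pdf?dl=1"))  # False
# Disallow: /*.pdf   -> also blocks URLs merely containing .pdf
print(robots_pattern_matches("/*.pdf", "/files/report.pdf?dl=1"))   # True
```

The difference between the last two calls is exactly the trap described above: drop the “$” and you block every URL containing “.pdf”, not just those ending in it.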
If your company is like my client’s, there’s a good chance you’re taking advantage of the maximum of 20 goals that can be tracked simultaneously in Analytics. However, to make things easier and more consistent (since goal configurations can change), I looked only at buyer-intent conversions. In this case those were Enterprise, Business, and Personal edition form fills, as well as Contact Us form fills.
Mobile traffic: In the Groupon experiment mentioned above, Groupon found that both browser and device affect web analytics’ ability to track organic traffic. While desktops using common browsers saw a smaller impact from the test (a 10–20 percent drop), mobile devices saw a 50 percent drop in direct traffic when the site was de-indexed. In short, as mobile usage grows, we are likely to see even more organic search traffic misreported as direct traffic.