Unlike text and display ads, PLAs enable e-commerce companies to target specific products and product groups at the decision stage of the buyer’s journey. They also help increase brand awareness by positioning the brand in front of a highly targeted audience that is already looking for a product the company offers. Typically, a PLA contains a product picture and a price, along with a store brand. So it comes as no surprise that Walmart, as one of the world’s largest online retailers, is the undisputed leader in using PLAs, with almost 604,000 keywords used in April. The company is followed by Home Depot, which sells various home improvement items and targets 139,000 keywords.
Let’s say you wish to block all URLs that have the .pdf extension, so you write a line in your robots.txt that looks like this: User-agent: Googlebot Disallow: /*.pdf$. The “$” sign at the end tells bots that only URLs ending in .pdf shouldn’t be crawled, while any other URL that merely contains “.pdf” should be. I know it might sound complicated, yet the moral of this story is that a single misused symbol can break your marketing strategy along with your organic traffic. Below you can find a list of the correct robots.txt wildcard matches and, as long as you keep track of them, you should be on the safe side of your website’s traffic.
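To make the wildcard behavior concrete, here is a small sketch of Google-style robots.txt path matching in Python. Note that this is an illustrative helper, not any official parser: “*” matches any run of characters, and a trailing “$” anchors the rule at the end of the URL.

```python
import re

def robots_rule_matches(pattern: str, path: str) -> bool:
    """Google-style robots.txt path matching (illustrative sketch):
    '*' matches any sequence of characters, a trailing '$' anchors
    the rule at the end of the URL path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as a wildcard
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    regex = "^" + regex + ("$" if anchored else "")
    return re.search(regex, path) is not None

# Disallow: /*.pdf$  -- blocks only URLs that truly END in .pdf
print(robots_rule_matches("/*.pdf$", "/files/report.pdf"))   # True  (blocked)
print(robots_rule_matches("/*.pdf$", "/report.pdf?page=2"))  # False (still crawlable)
# Without the '$', the same rule also catches URLs that merely contain .pdf
print(robots_rule_matches("/*.pdf", "/report.pdf?page=2"))   # True  (blocked)
```

The last two calls show exactly the difference the “$” makes: dropping it widens the rule from “ends in .pdf” to “contains .pdf”.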
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates the parts of your site you don’t want accessed by search engine crawlers. And although there is plenty of documentation on this subject, there are still many ways you can suffer an organic search traffic drop. Below we are going to list two common yet critical mistakes when it comes to robots.txt.
Thank you, Jayne, for doing all that voting and sharing, means the world to me! I am glad you saw some new ideas and, looking at all the comments, I think I killed a really big tree with this post, that is how many people want to print it out, lol. If at any point you get stuck with one of these, send me an e-mail, tweet, whatever, and I will try to help you with extra ideas :)
Get really good at campaign tagging: Even amongst data-driven marketers I encounter the belief that UTM tagging begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing silly things like tagging internal links. Control what you can, and your ability to carry out meaningful attribution will markedly improve.
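If you’d rather build tagged URLs deliberately than rely on a software switch, a helper like the sketch below keeps the standard UTM parameters consistent. The function name and parameter choices here are illustrative, not from any particular tool:

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append the three core UTM parameters to a URL,
    preserving any query arguments already present."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/offer", "newsletter", "email", "spring_sale"))
# https://example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

Because the tags live in the URL itself, the same approach works for any outbound channel you control, which is exactly the point: tag what you can, and skip internal links.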
One of the most significant changes to Google’s algorithm over the years has been how it uses keywords to match relevant content to queries. The days of simply matching search terms to the keywords on your page are behind us, and Google’s machine learning algorithms are able to match the topical relevance of your content to the contextual meaning behind user searches.
For an agency like Aira, the speed of AccuRanker is a huge feature - the ability to do fast checks and instant refreshes on keywords means we can diagnose issues as and when they happen, during meetings or when on a client call. We love how quick AccuRanker is and how scalable it is across regions around the world. We use it all the time for client reports and are now also making use of the Data Studio connector to pull data into our own existing report templates.
Direct traffic comes from people who type your URL into the address bar, or who use a bookmark. These are your regular visitors, people who’ve discovered you in some other way and are now coming back, and — less now than in the past — people who type in a URL they’ve seen on your offline ads or who guess your web address based on your company name. Direct traffic is often the highest converting kind, but it can also be regular blog readers or your own staff. Filter your workers out if at all possible to keep your data clean. If you can’t filter them, at least ask them to use direct methods (rather than search, for example) so you may be able to identify them when you work in Analytics. Any traffic that Google can’t identify will also show up in Direct traffic, and that can include ads if you haven’t hooked up your ad accounts with your analytics, email or SMS campaigns if you haven’t tagged them for Google, and other sources that aren’t identified. Tag well and you’ll see less of this.
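The fallback logic described above — anything the analytics platform can’t identify lands in Direct — can be sketched as a rough classifier. The search-engine list and function name below are hypothetical simplifications; real platforms use far longer lookup tables and more rules:

```python
from urllib.parse import urlparse, parse_qs

# Illustrative subset; real analytics tools ship a much longer list
SEARCH_ENGINES = {"google.com", "bing.com", "duckduckgo.com"}

def classify(landing_url, referrer=""):
    """Rough sketch of how a session gets bucketed into a source.
    Campaign tags and ad click IDs win; then the referrer is checked;
    anything left over falls through to 'direct'."""
    params = parse_qs(urlparse(landing_url).query)
    if "gclid" in params or params.get("utm_medium") == ["cpc"]:
        return "paid search"
    if "utm_source" in params:
        return "campaign"        # email, social ads, etc., per utm_medium
    if not referrer:
        return "direct"          # untagged, unidentified traffic ends up here
    host = urlparse(referrer).netloc.removeprefix("www.")
    if host in SEARCH_ENGINES:
        return "organic search"
    return "referral"

print(classify("https://example.com/"))                              # direct
print(classify("https://example.com/", "https://www.google.com/"))   # organic search
```

The key takeaway mirrors the paragraph above: an untagged email or SMS click has no UTM parameters and usually no referrer, so it falls all the way through to “direct”.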
When you run a PPC campaign -- whether through Google or some other PPC provider -- you can track how much site traffic it drives in this part of your sources report. Obviously, for proper PPC campaign management, you'll also need to be reviewing whether that traffic actually converts, too. Like email marketing as a source, be sure to include tracking tokens with all of your paid search campaigns so this is properly bucketed as, you know, paid search.
Oh, I wish you had told me what was wrong with it :) I only discovered it recently but I am getting nice traffic from it. I hope you will let me know how it worked for you. At the moment I am posting both to my page and personal profile. I also realized that I might just leave it on the personal page (yeah, sounds weird) because on my fan page I kinda like to add a little comment to the post. Anyway, thanks for the comment and I will try to find your blog over there and subscribe to it on Networked blogs.
Beyond organic and direct traffic, you must understand the difference between all of your traffic sources and how traffic is classified. Most web analytics platforms, like Google Analytics, use an algorithm, essentially a flow chart, based on the referring website or on parameters set within the URL to determine the source of traffic. Here is a breakdown of all sources: