Now, in your reconsideration request, be honest and tell Google everything the prior agency was up to. Include your spreadsheet documenting the links you have removed, and state that you will make an ongoing effort to remove everything negative. Google often does not accept the first reconsideration request, so it may take a few attempts.
Visitors can also fall into the direct category by clicking a link to your site from an email or PDF document, accessing your site from a shortened URL (an abbreviated version of your website address), clicking a link from a secure (HTTPS) site to your non-secure (HTTP) site (the referrer is stripped in that case), or clicking a link to your site from a social media application like Facebook or Twitter. And there is a chance that a visit from an organic search can end up being reported as direct traffic.
Direct traffic is people who type your URL into their address bar, or who use a bookmark. These are your regular visitors: people who discovered you some other way and are now coming back, and, less often than in the past, people who type in a URL they've seen in your offline ads or who guess your web address from your company name. Direct traffic is often the highest-converting kind, but it can also include regular blog readers or your own staff. Filter employees out if at all possible to keep your data clean. If you can't filter them, at least ask them to use direct methods (rather than search, for example) so you have a chance of identifying them when you work in Analytics. Any traffic that Google can't identify will also show up as Direct, and that can include ads if you haven't connected your ad accounts to your analytics, email or SMS campaigns you haven't tagged for Google, and other unidentified sources. Tag well and you'll see less of this.
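Tagging works by appending UTM parameters to the URLs you share in emails, SMS, and ads, so Analytics can attribute the visit instead of dumping it into Direct. A minimal sketch in Python, using hypothetical campaign values and an example.com landing page:

```python
from urllib.parse import urlencode

# Hypothetical campaign values -- substitute your own.
base_url = "https://www.example.com/landing-page"
utm = {
    "utm_source": "newsletter",    # where the traffic comes from
    "utm_medium": "email",         # the marketing medium
    "utm_campaign": "spring_sale", # the specific campaign
}
tagged_url = base_url + "?" + urlencode(utm)
print(tagged_url)
# https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

Use the tagged URL everywhere you would have used the bare one; Google's Campaign URL Builder produces the same kind of link by hand.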
So what does this mean? Bounce rate can be thought of as a measure of engagement. If visitors are moving around your site, they are engaged. If they are bouncing, they could not find a good reason to stay. There is one notable exception: blogs, videos, and news sites often have higher bounce rates because a visitor reads a particular article or watches a video and then leaves. For an ecommerce site, however, you want to see relatively low bounce rates. Sources that bounce a lot are probably not providing quality traffic.
If both pages are closely related (lots of topical overlap), I would merge the unique content from the lower-ranking article into the top-ranking one, then 301 redirect the lower-performing article to the top-ranking one. This makes the canonical version more relevant and gives it an immediate authority boost. I would also fetch it in Google Search Console right away, do some link building, and possibly a little paid promotion to seed some engagement. Update the timestamp.
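For the redirect step, here is a minimal sketch assuming an Apache server and hypothetical URLs (on nginx, a `return 301` directive inside a matching `location` block does the same job):

```
# .htaccess -- permanently (301) redirect the merged, lower-ranking
# article to the canonical top-ranking one. Paths are hypothetical.
Redirect 301 /old-lower-ranking-article/ https://www.example.com/top-ranking-article/
```

The 301 status code tells search engines the move is permanent, which is what passes the old page's equity to the new one; a 302 (temporary) redirect would not.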
Brankica, aloha. Yet another great article. With the in-depth info and resources you have been giving us, I don't know how you have time for anything else. This is another one that will live on my computer. You mention several words/places that I have not seen before. Look forward to exploring. Off to tweet. Will be working on a 51 for you. Take good care, my friend. Aloha. Janet
Google makes constant efforts to improve its search algorithm and detect quality content, which is why it rolled out the Google Panda 4.0 update. What does this update really do? It deranks low-quality content and boosts high-quality content, according to Google's idea of valuable content. In the screenshot below you can see the kind of content to avoid, unless you are determined to make a mess of your organic traffic.
The "Direct Session" dimension (not to be confused with the default channel grouping or the source dimension) is something different. It tells you whether a session was genuinely from its reported campaign (reported as "No"), or whether it was simply bucketed there by the last non-direct click attribution model (reported as "Yes"). I didn't mention this dimension in the article because it has some flaws that can cause brain-bending complexities, and it goes a little beyond the scope of this article, which I tried to gear toward marketers and non-expert GA users. It's certainly an interesting one, though.
One of the things that can slow-bleed traffic from your site is the quality (or lack thereof) of the content you publish. Previous Google updates like Panda were released specifically to deal with low-quality content on websites. Long story short: Panda was intended to stop sites with bad content from appearing in the search results.
Before developing this subject further, allow me to remind you that, according to Google, a robots.txt file is a file at the root of your site that indicates which parts of your site you don't want accessed by search engine crawlers. Although there is plenty of documentation on this subject, there are still many ways a robots.txt file can cause an organic search traffic drop. Below are two common yet critical robots.txt mistakes.
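As an illustration of how easily this goes wrong, here is a hypothetical robots.txt that blocks the entire site instead of a single directory:

```
# robots.txt at https://www.example.com/robots.txt
# DANGEROUS: this blocks every crawler from the whole site, which will
# eventually remove your pages from organic search results.
User-agent: *
Disallow: /

# What was probably intended: block only one private directory.
# User-agent: *
# Disallow: /private/
```

A single stray `/` on the Disallow line is the difference between hiding one folder and de-indexing everything, so check this file first whenever organic traffic falls off a cliff.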
There are a few key pieces of information we want to look at for Organic Traffic. The first piece that helps us frame website performance is the percentage of total traffic that is organic. This figure will vary greatly based on your AdWords spend, how many email campaigns you send, and many other factors. To view it, go to the Acquisition section of your Analytics dashboard and then proceed to Channels.
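The calculation behind that figure is simple: organic sessions divided by total sessions across all channels. A quick sketch in Python, using hypothetical session counts in place of your own numbers from the Channels report:

```python
# Hypothetical monthly session counts by channel -- replace these with
# your own figures from Acquisition > Channels.
sessions = {
    "Organic Search": 12400,
    "Direct": 5300,
    "Paid Search": 3100,
    "Email": 1200,
}

total = sum(sessions.values())
organic_share = sessions["Organic Search"] / total * 100
print(f"Organic share: {organic_share:.1f}% of {total} sessions")
```

Tracking this percentage month over month shows whether organic growth is keeping pace with (or being masked by) paid and email traffic.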
Traffic coming from SEO (Search Engine Optimization) carries no per-click cost, while traffic generated from PPC (Pay Per Click) ads does. PPC ads are displayed above organic search results, or in the sidebar of search engine results pages, for queries related to their topic. Landing-page keywords, ad keywords, CPC (cost-per-click) bids, and targeting all influence how those ads are "served," or displayed, on the page.
Mobile traffic: In the Groupon experiment mentioned above, Groupon found that both browser and device affect how well web analytics can track organic traffic. Although desktops using common browsers saw a smaller impact from the test (a 10-20 percent drop), mobile devices saw a 50 percent drop in direct traffic when the site was de-indexed. In short, as mobile usage grows, we are likely to see even more organic search traffic misreported as direct traffic.