Just a couple items this week:
- Yahoo has seen its share of search traffic increase to 10.4% in December, a two-point increase from November, thanks to the Firefox Web browser adopting it as its default search engine. I bet we’ll see this drop as people switch back to Google.
- Moz has a lengthy case study on actions taken to help a site recover from poor SEO.
Some of the things that I found interesting about the Moz case study:
- The author, Alan Bleiweiss, starts with an analysis to determine which of the various Google algorithms caused the traffic declines, but the remedy is basic SEO/Webmaster best practices.
“The only correct approach in this scenario is to step back and understand that for maximum sustainable improvement, you need to consider every aspect of SEO. Heck, even if a site was ONLY hit by Panda, or Penguin, or the “Above the Fold” algorithm, I always approach my audits with this mindset. It’s the only way to ensure that a site becomes more resilient to future updates of any type.”
That seems to be the message that Google would preach. If we focus on improving the quality of our sites first, we will worry less about specific algorithms.
- The first thing he addresses in rehabilitating the site is improving page speed. Page speed is a key component of a visitor’s experience, even with broadband. Google’s PageSpeed tool is helpful for identifying the factors that slow down a page. In addition to image size and compression, it flags render-blocking JavaScript and CSS, file compression, caching, etc. (This is one of the things we addressed with our own site right after we implemented a redesign, and we still need to address it on our portfolio page, which is very heavy with graphics.)
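If you want to check page speed across more than a handful of URLs, you can query Google’s PageSpeed Insights API instead of pasting pages into the web tool one at a time. Here’s a rough Python sketch of that idea; the example URL is a placeholder, the API key is optional for light use, and the response field names reflect the v5 format as I understand it, so treat this as a starting point rather than a finished script.

```python
# Rough sketch: pull a Lighthouse performance report for one URL via the
# PageSpeed Insights API (v5). The URL below is a placeholder.
import requests

PAGESPEED_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def check_page_speed(url, api_key=None):
    """Fetch the performance score and a few relevant audits for a URL."""
    params = {"url": url, "strategy": "mobile"}
    if api_key:
        params["key"] = api_key
    resp = requests.get(PAGESPEED_ENDPOINT, params=params, timeout=60)
    resp.raise_for_status()
    result = resp.json()["lighthouseResult"]

    # Audits that map to the issues mentioned above: render-blocking
    # resources, text compression, caching, image optimization.
    audits = result["audits"]
    for audit_id in ("render-blocking-resources",
                     "uses-text-compression",
                     "uses-long-cache-ttl",
                     "uses-optimized-images"):
        audit = audits.get(audit_id, {})
        print(f"{audit_id}: {audit.get('displayValue', 'no issues flagged')}")

    score = result["categories"]["performance"]["score"]  # 0 to 1
    print(f"Performance score: {score:.2f}")

if __name__ == "__main__":
    check_page_speed("https://www.example.com/")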
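```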
- Fixing broken links and internal redirects, the kind of housekeeping chores that are pretty basic but can also be time-consuming to implement. Yet another best practice that can fall by the wayside. I think the more we can implement links as a separate content type within a CMS tool, the easier it is to keep up going forward. A link-checking bot can be very helpful in checking both internal and external links throughout a site.
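A basic link-checking bot doesn’t have to be elaborate. Here’s a minimal sketch in Python using requests and BeautifulSoup that crawls one domain and reports broken links and internal redirects; the starting URL and the 200-page cap are placeholders, and a real crawler would want politeness delays and robots.txt handling.

```python
# Minimal internal link checker: crawl pages on one domain and report
# anything that errors, 404s, or redirects. START_URL is a placeholder.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
DOMAIN = urlparse(START_URL).netloc

def crawl(start_url, max_pages=200):
    seen, queue = set(), [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10, allow_redirects=False)
        except requests.RequestException as exc:
            print(f"ERROR {url}: {exc}")
            continue
        if resp.status_code >= 400:
            print(f"{resp.status_code} {url}")  # broken link
            continue
        if 300 <= resp.status_code < 400:
            print(f"{resp.status_code} {url} -> {resp.headers.get('Location')}")  # internal redirect
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        # Queue up any same-domain links found on this page.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == DOMAIN and link not in seen:
                queue.append(link)

if __name__ == "__main__":
    crawl(START_URL)
```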
- Finally, setting pages that are light on original content to “noindex, nofollow,” the idea being that these pages might otherwise hurt your site’s overall perceived quality. This might be something for us to consider, since we sometimes publish short blog posts or ones that are relatively light on original content.
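In practice this just means emitting a robots meta tag on thin pages. Here’s a tiny Python sketch of how that decision might be automated in a template pipeline; the 300-word threshold is an assumption I made up for illustration, and you’d wire the word count to whatever your CMS actually exposes.

```python
# Sketch: choose a robots meta tag based on how much original content a
# page has. The 300-word threshold is a hypothetical cutoff, not a rule.
THIN_CONTENT_WORDS = 300

def robots_meta_tag(word_count, threshold=THIN_CONTENT_WORDS):
    """Return the robots meta tag for a post based on its original word count."""
    if word_count < threshold:
        # Keep thin pages out of the index so they don't drag down
        # the site's overall perceived quality.
        return '<meta name="robots" content="noindex, nofollow">'
    return '<meta name="robots" content="index, follow">'

# Example: a 120-word roundup post gets the noindex tag.
print(robots_meta_tag(120))
```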