SEO and Replatforming: How Your Ecommerce Upgrades or Overhauls Can Affect Search

Apr 01, 2011 9:30 PM

More than half (57%) of online retailers increased their ecommerce technology spending in 2010, according to The Forrester Wave: B2C ecommerce Platforms, Q4 2010. That’s because ecommerce platforms have a “relatively low cost” and a high return on investment.

But the reality is that changing or upgrading ecommerce platforms can have a much higher cost than anticipated when the impact on search engine optimization is overlooked.

Most major ecommerce platforms tout their ability to produce “search-engine-friendly” or even “search-engine-optimized” websites by simply deploying their out-of-the-box software.

But like many other commitments made during the sales process, this one doesn’t always survive contact with reality: real-world examples make it clear that out-of-the-box search engine optimization can be as elusive as a free lunch. And we all know there is no such thing as a free lunch.

Several times during the past year, our SEO team has been called on to help merchants diagnose and remedy the ill effects of platform implementations on search engine performance. These retailers were not seeing the performance gains they were expecting from launching their sites on a new platform. In fact, all of them experienced a detrimental impact on the search results they’d had prior to the upgrade.

The experience of a major online specialty retailer that migrated its existing commerce site to a new one built on an industry-leading ecommerce platform illustrates how such a move can affect SEO.

Prior to the site migration, organic search represented one of the largest (and certainly most economical) sources of website traffic for this large retailer, averaging nearly 30% of site traffic throughout the year. Immediately following the launch of the new site, however, the picture changed dramatically.

Most, if not all, of the organic search key performance indicators (keyword rankings, number of indexed site pages, and the share of traffic and revenue referred by organic search) pointed to a major problem that had to be addressed quickly.

By the time holiday 2010 came around, the retailer was in true crisis mode, with organic search down roughly 95% from the prior year. This meant that a significant increase in paid search expenditure was needed just to help fill the gaping hole left by the drop in organic search performance.

The end result was a precipitous drop in the merchant’s fourth-quarter revenue and profit due to the disastrous organic search performance. So much for ecommerce platforms having a “relatively low cost.”

While there were many factors that ultimately contributed to the organic search problems, the following are some of the major culprits that were tied to the new ecommerce platform or decisions made during the migration process:

URL structure

Best practice: An optimized URL ideally includes targeted keywords and only alphanumeric characters.

What we found: This ecommerce deployment produced painfully long URLs that included session IDs and other suboptimal parameters. While the ecommerce platform uses a “cloaking” solution to (supposedly) remove session IDs for the search crawlers, Google’s index clearly shows the solution does not work in all instances.

Recommended solution: If the selected ecommerce system package does not offer flexibility for altering the URL structure, you can use various external URL mapping or rewriting solutions to remove the nonalphanumeric parameters, as well as shorten the overall length and complexity.
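As one hedged illustration of an external rewriting solution (not the retailer’s actual configuration, and assuming the session ID travels as a hypothetical `jsessionid` query parameter on an Apache-fronted site), a mod_rewrite rule can 301-redirect session-ID URLs to a clean version of the same path:

```apache
RewriteEngine On
# If the query string contains a session-ID parameter...
RewriteCond %{QUERY_STRING} (^|&)jsessionid= [NC]
# ...redirect to the same path with no query string at all.
# (A simplification: this drops the entire query string, which is
# acceptable only if no other parameters need to survive.)
RewriteRule ^(.*)$ /$1? [R=301,L]
```

A production rule set would typically preserve legitimate parameters while stripping only the session ID, but the principle is the same: crawlers should only ever see one clean URL per page.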

Directory length/structure

Best practice: While URL length is more often a business decision than a technology issue, Rosetta recommends that URLs be shorter than 75 characters and contain fewer than three levels of directory depth.

What we found: In this deployment, we found URLs that averaged in excess of 150 characters and six levels of depth for product detail pages, causing significant problems with indexation and passing of link value.

Recommended solution: Just because a page technically lives deep in the site hierarchy does not mean that the URLs must reflect that depth. Many retailers make the mistake of building URLs in a similar fashion to navigational breadcrumbs.

But it’s not necessary to represent all of the multiple sub-categories and filtering options in the URL. Simple rules can be established to help flatten some of the directory levels presented in the resulting URLs.
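A simple flattening rule might look like the following sketch (the `/p/` prefix, the internal handler path, and the `slug` parameter are all hypothetical, chosen only to show the pattern):

```apache
# Publish and link a flat URL such as /p/acme-trail-runner instead of
# /shoes/mens/running/trail/waterproof/acme-trail-runner, while the
# platform still resolves the page internally by its product slug.
RewriteRule ^p/([a-z0-9-]+)$ /catalog/product.jsp?slug=$1 [L]
```

Because this is an internal rewrite (no redirect flag), visitors and search engines see only the short, shallow URL.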

Canonical URLs

Best practice: Canonicalization is the process of picking the best URL when there are several choices. This helps the search engines concentrate their focus and consolidate incoming link value.

What we found: We found as many as six different ways (URLs) to access this retailer’s homepage.

Recommended solution: Canonicalization is most effectively managed through a combination of rel="canonical" tags and 301 redirects. This approach ensures that all major search engines find and maintain only one version of each URL in their indices, thereby consolidating link value and authority.
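A minimal sketch of that combination, using example.com as a placeholder domain: a canonical tag in the head of every variant that renders the homepage, backed by a server-side 301 redirect consolidating the hostname (shown here as an Apache rewrite rule).

```html
<!-- Placed in the <head> of every URL variant that renders the homepage -->
<link rel="canonical" href="http://www.example.com/" />
```

```apache
# 301-redirect the bare domain to the single canonical www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The redirect handles visitors and crawlers arriving at the wrong variant; the tag acts as a safety net for variants (such as tracking-parameter URLs) that cannot be redirected.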

Duplicate content

Best practice: High quality — and highly valued — sites avoid publishing multiple pages with duplicate content. Duplicate content can not only affect an individual page’s performance, but, in aggregate, reduce the overall authority and quality score for the domain.

What we found: Poor implementation of ecommerce platforms and connected applications (like on-site search and product recommendations) can lead to inadvertent, technology-driven duplicate content. We found tens of thousands of pages and empty page templates that were likely perceived as duplicate content by the engines.

Recommended solution: Because duplicate content can be caused by a wide variety of technical missteps, the recommended solutions vary almost as greatly. But directing the search engine spiders away from certain pages or directories, via the robots.txt protocol or a meta robots tag with a noindex directive, is an effective correction for many instances of duplicate content.
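For a page-level fix, a meta robots tag in the head of the offending template looks like this (a generic example, not the retailer’s markup):

```html
<!-- Keep an empty or duplicate results template out of the index
     while still allowing spiders to follow the links on the page -->
<meta name="robots" content="noindex, follow" />
```

This is often preferable to robots.txt for pages that already carry link value, since a blocked page cannot pass that value along.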

Google and Bing both support limited pattern matching in robots.txt. These are not full regular expressions, but two special characters that can be used to identify pages or subfolders that should be excluded: the asterisk (*) and the dollar sign ($).

  • * – is a wildcard that represents any sequence of characters
  • $ – matches the end of the URL
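Putting the two characters to work, a robots.txt file might include entries like these (the paths and parameter names are placeholders, not the retailer’s actual URLs):

```text
User-agent: *
# * matches any sequence of characters: block any URL, in any
# directory, that carries a session-ID query parameter
Disallow: /*?sessionid=
# $ anchors the pattern to the end of the URL: block print-view
# pages ending in ".print" without touching URLs that merely
# contain that string somewhere in the middle
Disallow: /*.print$
```

As always with robots.txt, changes should be verified against the engines’ own testing tools before launch, since a mistyped pattern can block far more than intended.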

While there are no doubt many other on- and off-page factors that affect search engine performance, starting with a solid technical base is a must. As the cautionary tale of our retail example demonstrates, your business deserves more than an empty sales promise from a software salesman.

A well-planned and tested technical SEO strategy will lay the foundation for overall SEO health and strong organic search performance.

Paul Elliott is a partner in Rosetta’s Consumer Products and Retail Vertical, and previously founded and led the digital and direct interactive agency’s Search & Media Practice.