Redesign with SEO in mind

A site redesign or switch to a new platform is kind of like a rebirth — it’s one of the most exciting and nerve-wracking times for the entire Internet marketing team. With everyone caught up in the branding, design, usability and technology, the impact on search engine optimization can sometimes be forgotten until the last minute.

Whether the task at hand is redesigning or replatforming, start by doing some SEO reconnaissance. Ask the vendor to provide several of their clients’ sites, in particular sites that are structured like your planned site or that contain similar features.

Press releases are another good source of client lists, as are on-site testimonials. You can get an idea of what might lie ahead by performing a 60-second Website audit of each site the vendor services and noting the challenges those sites have in common.

It can be hard to determine what the natural search impact will be until working code hits a development server. But remembering the following five SEO development mantras, and repeating them often, will keep the team focused on the elements most critical to a successful launch.

Links must be crawlable with JavaScript, CSS and cookies disabled.

Links provide the ability to crawl a site, passing link popularity and keyword signals deeper into the site. If a site’s links are not crawlable by search spiders, that site will be critically limited in the search terms for which it can rank, and in the traffic and sales it can drive.

While their ability to crawl more complex sites is getting better, search engines do not traditionally crawl with JavaScript and CSS enabled, and they don’t accept cookies. Whiz-bang interactive sites designed in AJAX and Flash are likely to be only minimally crawlable, and only if specific optimization techniques are used.

Even less-complex elements such as expandable navigation and rollovers can be coded in different ways to be more or less crawlable. Do your company a favor and plan to include a “graceful degradation” or progressive enhancement version of elements on the site that need to be both fantastically interactive and SEO friendly.
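If you want a quick, automated version of this check, a short script can approximate what a spider sees. The sketch below is my own illustration, not part of any particular platform: it fetches the raw HTML without executing JavaScript, accepting cookies or applying CSS, and lists the anchor links a crawler could actually follow. Any link that only appears after scripts run will be missing from the output. The requests library and the example URL are assumptions.

```python
# A rough "spider's-eye view" check: fetch the raw HTML (no JavaScript
# execution, no CSS, no cookies) and list the anchor hrefs a crawler
# could follow. The URL below is a placeholder.
import requests  # assumes the third-party requests library is installed
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect href values from plain <a> tags in the raw HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


def crawlable_links(url):
    # Scripts are never executed and cookies are never sent, so links that
    # only exist after JavaScript runs will not appear in this list.
    html = requests.get(url).text
    collector = LinkCollector()
    collector.feed(html)
    return collector.links


if __name__ == "__main__":
    for link in crawlable_links("http://www.example.com/"):  # placeholder URL
        print(link)
```

If the list comes back empty, or missing whole sections of the site, that's a strong hint the navigation depends on JavaScript or Flash and needs a progressive enhancement fallback.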

Plain text must be crawlable on the page with JavaScript and CSS disabled.

Similar to the linking issue above, the unique text on the page shouldn’t be locked inside images or Flash, or disappear when JavaScript and CSS are disabled. The important word here is “unique.” If the only crawlable plain text on the page is the navigational header and footer, then every page will send a very similar keyword signal.

Crawlable anchor text is important, yes, but this second mantra focuses on the unique body content. As with links, uncrawlable content can be exposed using a number of optimization techniques.

Every page must send a unique keyword signal.

Just as every page needs crawlable plain text on the page, every page must send its own unique keyword signal. Every page has a unique reason to exist, or else it would be part of another page.

The platform must have the ability to expose these unique keyword signals on every page in a unique title tag, HTML headings and other optimizable fields. Every template needs to be designed to contain at least an H1 heading and one sentence of body copy.

The fields in each template should be able to be optimized automatically with a customizable formula that places specified text elements from the database in a specified order in the title tag, HTML headings and meta data. But the platform also needs to enable manual optimization of those fields so that the critical pages can be hand-optimized.
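As an illustration of what such a formula might look like, the minimal sketch below (not tied to any specific platform; the field names are hypothetical) composes a title tag and H1 from database fields in a specified order, and uses the automated formula only when no hand-written override has been entered for the page.

```python
# Minimal sketch of formula-driven optimization with a manual override.
# Field names (product_name, category, brand, manual_title, manual_h1)
# are hypothetical placeholders for whatever the platform stores.

def build_title(record, max_length=70):
    """Compose a title tag from database fields in a specified order,
    unless a hand-optimized title exists for this page."""
    if record.get("manual_title"):
        return record["manual_title"]
    parts = [record.get("product_name"), record.get("category"), record.get("brand")]
    title = " | ".join(p for p in parts if p)
    return title[:max_length]


def build_h1(record):
    """The H1 echoes the page's unique keyword signal."""
    return record.get("manual_h1") or record.get("product_name") or record.get("category")


page = {"product_name": "Blue Widget", "category": "Widgets", "brand": "Acme"}
print(build_title(page))  # Blue Widget | Widgets | Acme
print(build_h1(page))     # Blue Widget
```

The same pattern applies to meta descriptions and headings: an automated default for the long tail of pages, with a manual field that always wins for the critical pages.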

One URL for one page of content.

Each unique page of content needs one single URL. This can be trickier than it sounds.

Some platforms publish content in multiple locations with different URLs. Some analytics programs append tracking parameters. Some servers aren’t canonicalized to a single protocol, TLD, domain, subdomain, directory or file extension.

For example, consider the following 10 URLs (a hypothetical illustration), which could theoretically all load the same page of content:
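http://example.com/widgets
http://www.example.com/widgets
https://www.example.com/widgets
http://shop.example.com/widgets
http://www.example.com/widgets/
http://www.example.com/widgets/index.html
http://www.example.com/Widgets
http://www.example.com/category/widgets
http://www.example.com/widgets?sessionid=12345
http://www.example.com/widgets?source=email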

I have seen every one of these duplication examples — and more — in play on client sites. Often, a site will suffer from several duplication sources, which then multiply each other to create hundreds or thousands of URLs for the same page of content. The best way to stop content duplication is to ensure that it never starts, at the platform and server level.

We’re going to 301 that, right?

This is a good mantra to pull out every time URL changes are discussed, whether the change affects a single URL or the whole site, to ensure that every legacy URL has been considered as part of the 301 redirect plan. More on this below.

Test development sites for SEO prelaunch

As the time draws near to launch the redesign, the SEO professional should be a key part of the testing corps. Of course, you’ll want to go through the usual slew of testing rituals that your company has outlined before a major launch. But on the SEO front, a couple more checks are required.

Use a tool such as Chris Pederick’s Web Developer Toolbar add-on for Firefox to disable JavaScript, cookies and CSS. Can you still navigate the site? Or do links disappear/unlink? Try disabling images.


Can you understand the page content without the visual cues humans are used to processing? Try outlining headings. Does anything stand out as more prominent? Do those headings use unique, relevant, popular keywords? Or do they all say “More Info”?

If you can’t understand the site when surfing in this way, it’s likely that the natural search performance will be seriously limited as well.

Free crawlers are another excellent tool for testing how crawlable a site is, and how many URLs are generated unintentionally. Crawlers such as GSite Crawler (http://gsitecrawler.com) and Link Sleuth (http://home.snafu.de/tilman/xenulink.html#Description) catalog URLs as they crawl and generate an exported list that can be sorted and filtered to identify duplicate URLs and title tags, among other things.

I typically start by sorting the URLs alphabetically in Excel and looking for appended parameters I didn’t expect. Then I might choose a product or category number or identifying keyword and filter by that to determine if different URL structures represent the same product or category.

I’ll also sort the title tags alphabetically and filter on identifying keywords to identify duplication. Fixing the sources of duplication in development is far easier than weeding them out when they’re live and indexed.
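If sorting and filtering in Excel becomes unwieldy, the same checks can be scripted. The sketch below is an illustration, assuming the crawler export has been saved as a two-column CSV of URL and title tag; the file name and column layout are assumptions. It flags URLs that differ only by appended parameters and title tags shared by more than one page.

```python
# Flag likely duplication in a crawler export.
# Assumes crawl_export.csv has two columns: url, title (hypothetical layout).
import csv
from collections import defaultdict
from urllib.parse import urlsplit

by_stripped_url = defaultdict(list)   # URLs grouped with parameters removed
by_title = defaultdict(list)          # URLs grouped by identical title tag

with open("crawl_export.csv", newline="") as f:
    for url, title in csv.reader(f):
        base = urlsplit(url)._replace(query="", fragment="").geturl()
        by_stripped_url[base].append(url)
        by_title[title.strip().lower()].append(url)

print("URLs that differ only by appended parameters:")
for base, urls in by_stripped_url.items():
    if len(urls) > 1:
        print(base, "->", urls)

print("\nTitle tags shared by more than one URL:")
for title, urls in by_title.items():
    if len(urls) > 1:
        print(title, "->", urls)
```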

If the structure of the site is strong, content optimization and external link building can be strengthened after launch. But if the structure is weak, or the templates and platform don’t allow for optimization at launch, resolving those problems post-launch is a much bigger challenge.

Create 301 maps to pass SEO strength to the new site

When the site is stable in the development environment and the URLs are ironed out, develop a 301 redirect plan. The basic principle is simple: for both SEO and user experience, every legacy URL should be 301 redirected to its new equivalent. For example:
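A legacy URL such as http://www.example.com/products/widget123.html would return a 301 redirect pointing to its new equivalent, say http://www.example.com/widgets/blue-widget/. (The URLs here are hypothetical; the pattern is what matters.)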

Many servers default to 302 temporary redirects. As small as the difference seems, the redirects must be 301 permanent redirects. Only 301 redirects pass the legacy URL’s link popularity to the new URL and prompt the engines to de-index the legacy URL. A 302 redirect merely moves the user agent to the new URL, without passing link popularity or prompting de-indexation.

The tricky part comes in identifying the universe of legacy URLs to 301 redirect. It’s not enough to redirect the URLs you know exist; you need to clean house and find the forgotten pockets of duplicate and orphaned pages littering the server.

To find these URLs, comb through the major engines’ indexes using combined search operators such as site: and inurl: queries to determine what’s indexed. For a large site, crawling the existing live site using the crawlers mentioned above will be much more effective and efficient, though it may take many hours to complete.

You can also examine log files to identify which URLs have been served, but URLs lurking in dark corners may not appear in recent log files. A combination of all these approaches will yield significant overlap, but also the most complete final list. Dump all of the legacy URLs into a single Excel spreadsheet and pair each one with the new URL that contains the same — or most similar — content.

Ideally, every known legacy URL would have a 301 redirect to a new URL. This can be done with pattern matching if the URLs follow predictable patterns. If not, thousands of individual 301 redirects are not likely to be practical.
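One practical approach, sketched below under the assumption of an Apache-style server and a CSV export of the legacy-to-new mapping spreadsheet described above, is to generate the redirect directives directly from the spreadsheet, and to collapse predictable URL patterns into a single pattern-matched rule where possible. The file names, column layout and regular expression are hypothetical.

```python
# Generate Apache-style 301 rules from the legacy -> new mapping spreadsheet.
# Assumes redirect_map.csv has two columns: legacy_path, new_url (hypothetical).
import csv
import re

with open("redirect_map.csv", newline="") as src, open("redirects.conf", "w") as out:
    for legacy_path, new_url in csv.reader(src):
        # One directive per known legacy URL.
        out.write(f"Redirect 301 {legacy_path} {new_url}\n")

# Where legacy URLs follow a predictable pattern, one pattern-matched rule
# can replace thousands of one-off directives, e.g. (hypothetical pattern):
#   RewriteRule ^products/([0-9]+)\.html$ /widgets/item-$1/ [R=301,L]
legacy = "/products/12345.html"
print(re.sub(r"^/products/(\d+)\.html$", r"/widgets/item-\1/", legacy))
```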

Determine which URLs have the most link popularity to pass to the new URLs. Harvesting legacy link popularity to boost the performance of the new site is the primary SEO purpose of 301 redirects.

For legacy URLs that can’t be 301 redirected, the server should return a hard 404 error. The 404 will also prompt de-indexation, and it can be configured to load a custom, friendly error page. Ensure that the server header returned is actually a 404, though, or de-indexation of the legacy URLs will not occur.
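It’s worth verifying the actual server headers rather than trusting what the browser shows, since a “friendly” error page served with a 200 status will keep legacy URLs indexed. A quick spot check might look like the sketch below; the requests library and the URL list are assumptions.

```python
# Spot-check the HTTP status codes that legacy URLs actually return.
import requests  # assumes the third-party requests library is installed

legacy_urls = [
    "http://www.example.com/products/widget123.html",  # placeholder URLs
    "http://www.example.com/old-category/",
]

for url in legacy_urls:
    resp = requests.head(url, allow_redirects=False)
    if resp.status_code == 301:
        print(url, "-> 301 to", resp.headers.get("Location"))
    elif resp.status_code == 404:
        print(url, "-> 404 (will be de-indexed)")
    else:
        print(url, "-> unexpected status", resp.status_code)
```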

Measure the transition

It can take 30 to 90 days after a redesign or replatform before the site stabilizes and begins performing predictably. Whether the transition takes 30 days or 90 depends on the strength of your current SEO, how cleanly the transition is executed, and the strength of the new site’s SEO.

To measure the transition, track crawling (log files, Google Webmaster Tools), indexation (major engine site: queries), rankings (major engines, or ranking tools), traffic and conversions (Web analytics) weekly until the trend line stabilizes. Wherever it stabilizes is the new baseline to measure ongoing SEO efforts.
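For the crawling piece, even a crude script over the raw access logs can show whether spider activity dips and recovers after launch. The sketch below assumes a combined-format Apache access log named access.log and counts Googlebot requests per day; the file name and log format are assumptions to adjust for your server.

```python
# Count Googlebot requests per day from a combined-format access log.
# The log file name and format are assumptions; adjust to your server.
import re
from collections import Counter
from datetime import datetime

hits_per_day = Counter()
date_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [19/Jun/2009

with open("access.log") as log:
    for line in log:
        if "Googlebot" in line:
            match = date_pattern.search(line)
            if match:
                hits_per_day[match.group(1)] += 1

for day in sorted(hits_per_day, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(day, hits_per_day[day])
```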

Give the transition trend lines time to stabilize before applying additional optimization. Layering change upon change can result in a muddy mess where cause and effect become impossible to measure.

Redesigning or replatforming a site is a complex process that requires a lot of planning and a lot of testing. But if you’re included in the design process and armed with the SEO mantras, a strong testing plan, and a 301 map, you can minimize SEO performance issues during the transition.

Jill Kocher ([email protected]) is manager of natural search consulting at search marketing firm Netconcepts.

Online resource guide

For details about performing a 60-second Website audit, visit http://www.naturalsearchblog.com/archives/2009/06/19/60-second-website-audit/

For more on progressive enhancement, visit http://en.wikipedia.org/wiki/Progressive_enhancement

To find Chris Pederick’s Web Developer Toolbar add-on, visit http://chrispederick.com/work/web-developer/

To brush up on search operators, visit http://www.google.com/help/cheatsheet.html