Many websites optimize their URLs for high rankings but pay little attention to optimizing them for maximum clickthrough. Yet the URL undeniably affects searcher clickthrough rates in the search engine results pages, as MarketingSherpa demonstrated in an eyetracking study published in its Search Marketing Benchmark Guide.
Specifically, MarketingSherpa found that short URLs get clicked on twice as often as long URLs, given equal ranking position. The study’s heatmaps show that experiment participants spent more time viewing the long URL itself but less time viewing the listing as a whole.
You could conclude from this that the long URL distracts the searcher from viewing the listing’s title and description. Not a great outcome.
Worse yet, a long URL appears to act as a deterrent to clicking, drawing attention away from its own listing and toward the listing below it, which then gets clicked 2.5 times more frequently.
It’s open to debate, of course, what counts as a “short” URL or a “long” URL. But this is the first data I’ve seen that attempts to quantify the affinity searchers have for the URL component of natural search listings.
These MarketingSherpa findings confirm that success at SEO still requires more than just XML Sitemaps, and that an unoptimized URL is money left on the table. Search engine crawlers have evolved to handle dynamic URLs with multiple parameters, avoid session-based spider traps, and even fill out forms on occasion, but that shouldn’t lull us into a false sense of security that our URLs are “good enough” and don’t need work.
You should be on an unending mission to find and implement opportunities to test and optimize URLs for both rankings and clickthrough.
Even though URLs you’d never have dreamed of getting indexed a few years ago are now regularly making it into the index, this doesn’t mean that suboptimal URLs are going to rank well or convert searchers into clickers.
While at my previous agency Netconcepts (now owned by Covario), my team and I conducted countless tests over the years using my GravityStream platform. These tests proved that optimized URLs consistently outperform unoptimized URLs. Given that, here are some general best practices for URLs that I believe hold true:
- The fewer the parameters in your dynamic URL, the better. One or two parameters is much better than seven or eight. Avoid superfluous, nonessential parameters such as tracking codes (a sketch of stripping these appears after this list).
- A static-looking URL (one containing no ampersands, equal signs or question marks) is more search-optimal than a dynamic one.
- Having keywords in the URL is better than no keywords.
- A keyword in the filename portion of the URL is more beneficial than in a directory/subdirectory name.
- Hyphens are the preferred word separator. Underscores are not, and have never been, treated as word separators by Google (according to renowned Google engineer Matt Cutts). So if you have multiple-word keyword phrases in your URLs, I’d strongly recommend using hyphens to separate the words.
- Stuffing too many keywords in the URL looks spammy. Three, four or five words in a URL look perfectly normal. A little longer and it starts to look worse to Google, according to Cutts.
- The domain name is not a good place for multiple hyphens, as they can make your URL look spammy. That said, sometimes a domain name needs a hyphen, as the domain faux pas “arsecommerce.com” demonstrates. (You may not get this joke if you aren’t familiar with the Queen’s English!)
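To make the tracking-code point concrete, here is a minimal Python sketch of stripping superfluous query parameters from a dynamic URL. The whitelist of “essential” parameter names below is my own hypothetical example; substitute whatever parameters your platform actually needs to select content.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters that actually select content and must be kept; these names
# are hypothetical -- substitute the ones your platform requires.
ESSENTIAL_PARAMS = {"storeId", "catalogId", "N"}

def strip_superfluous_params(url: str) -> str:
    """Drop tracking codes and other nonessential query parameters."""
    parts = urlparse(url)
    kept = [(key, value)
            for key, value in parse_qsl(parts.query, keep_blank_values=True)
            if key in ESSENTIAL_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_superfluous_params(
    "http://www.example.com/product?storeId=10051&N=525285"
    "&cm_sp=TopNav&utm_source=newsletter"))
# -> http://www.example.com/product?storeId=10051&N=525285
```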
Given the above, it’s absolutely worthwhile to rewrite your dynamic URLs to make them appear static and to include keyword phrases, with hyphens separating the words (within reason). So a targeted search term of “blue widgets” would be represented as “blue-widgets” in the URL.
Bare spaces cannot be used in URLs, so if a space must be represented, a substitute character is needed: either the plus sign (+) or the URL encoding for a space, %20. I’m not a fan of the encoded version, as it’s not quite as pretty: blue%20widgets.
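As an illustration of the slug advice above (my own sketch, not code from the GravityStream tests), the following Python function lowercases a keyword phrase, converts underscores and spaces to hyphens, and drops any character that would otherwise require percent-encoding:

```python
import re

def slugify(phrase: str) -> str:
    """Turn a keyword phrase into a hyphen-separated URL slug."""
    slug = phrase.strip().lower()
    slug = re.sub(r"[_\s]+", "-", slug)        # underscores/spaces -> hyphens
    slug = re.sub(r"[^a-z0-9-]", "", slug)     # drop chars needing %-encoding
    return re.sub(r"-{2,}", "-", slug).strip("-")  # collapse repeated hyphens

print(slugify("Blue Widgets"))         # blue-widgets
print(slugify("blue_widgets (2024)"))  # blue-widgets-2024
```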
Stability is overrated
The above best practices are generally accepted. Things get a lot more contentious when it comes to the stability, or permanence, of your URLs.
The general school of thought is that stable is better. In other words, decide on an optimal URL for a page and stick with it for the long haul. I have a different view: URLs can be as fluid as a title tag.
URLs can be experimented with and optimized iteratively over time—just like any other on-page factor. Why would you “set it and forget it” when it comes to your URLs when you don’t do that with your titles, H1 headlines, page copy, and internal linking structure?
For example, all of the following hypothetical URLs follow best practices, with the exception of the first one, which is the actual URL in use today. Now which one will perform the best?
• http://www.homedepot.com/webapp/wcs/stores/servlet/Navigation?storeId=10051&N=10000003+90401+525285&langId=-1&catalogId=10053&Ntk=AllProps&cm_sp=Navigation-_-GlobalHeader-_-TopNav-_-Appliances-_-Dehumidifiers
• http://www.homedepot.com/webapp/stores/50364/100053.html
• http://www.homedepot.com/Appliances/Dehumidifiers/
• http://www.homedepot.com/Appliances/Dehumidifiers.html
• http://www.homedepot.com/Appliances-Dehumidifiers.html
• http://www.homedepot.com/Dehumidifiers-Appliances.html
• http://appliances.homedepot.com/Dehumidifiers.html
Test to ensure success
If your content management system or ecommerce platform supports malleable URLs, why not exploit that capability and embark on a regimen of testing and continuous improvement? The popular blog platform WordPress supports this fairly well: once a blog post’s “post slug” (the URL-friendly post name) is changed in the admin, WordPress automatically 301-redirects requests for the old permalink URL.
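If you are building this behavior yourself rather than relying on WordPress, the pattern is straightforward. Here is a minimal sketch in Python using Flask, with a hypothetical slug-history table that maps every retired slug to the current one:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical slug history: every slug a post has ever lived at,
# mapped to its current slug.
SLUG_HISTORY = {
    "blue-widgets-overview": "blue-widgets",
    "widgets-blue": "blue-widgets",
}

@app.route("/blog/<slug>")
def show_post(slug):
    current = SLUG_HISTORY.get(slug, slug)
    if current != slug:
        # 301 (permanent) redirect so search engines transfer the old
        # URL's equity to the new one.
        return redirect(f"/blog/{current}", code=301)
    return f"Rendering post: {current}"  # stand-in for real rendering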
Unfortunately, most ecommerce platforms do not support such a capability. If you’re stymied by your platform, you have just a few options.
You can replace your CMS with one that supports malleable URLs, customize the CMS to support it (assuming you have access to the source code), or put a layer on top of your CMS by using an SEO proxy technology like Covario’s Organic Search Optimizer or similar.
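To give a rough sense of how the proxy-layer option works (a generic sketch under assumed mappings, not a description of Covario’s actual product), a thin front end can accept the optimized, static-looking URL and fetch the corresponding dynamic URL from the underlying platform:

```python
import urllib.request

from flask import Flask

app = Flask(__name__)

# Hypothetical mapping from optimized paths to the platform's real
# dynamic URLs.
URL_MAP = {
    "/Appliances-Dehumidifiers.html":
        "http://backend.example.com/Navigation?storeId=10051&N=525285",
}

@app.route("/<path:path>")
def proxy(path):
    backend_url = URL_MAP.get(f"/{path}")
    if backend_url is None:
        return "Not found", 404
    with urllib.request.urlopen(backend_url) as response:
        # Serve the dynamic page's content under the static-looking URL.
        return response.read()
```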
Regardless of how you accomplish continuous URL optimization, the MarketingSherpa study shows that complacency about iterative testing and improvement of your URLs (or any other on-page factor, for that matter) sends more traffic to your competitors’ listings. That is fatal to your organic search program.
Stephan Spencer is co-author of The Art of SEO and founder of Netconcepts (acquired by Covario).