Yahoo’s Search Life After Google

Yahoo!’s Feb. 18 announcement that it would stop using Google’s search technology in favor of its own search engine — including a new Web crawler, Yahoo! Slurp, to index Web pages — could help catalogers who rely on search engine marketing for online traffic and sales. Then again, it could also hurt the same catalogers.

In January, Google accounted for 39% of all U.S. Internet searches, according to a Nielsen/NetRatings survey. Yahoo! tied for second place with MSN; each had a 30% share. America Online accounted for 15% of searches, followed by Ask Jeeves with 8%. But Google had powered Yahoo!’s searches. In effect, then, Google’s search engine actually accounted for 69% of all searches.

Because Google was the source of so much search traffic, many catalogers optimized their sites according to Google’s criteria, notes Kevin Lee, CEO of Did-it.com, a New York-based search marketing agency. “The fear catalogers may now have is what if the Yahoo! spider doesn’t find all their hundreds or thousands of different products the same way that Google does. They may see a fall-off in ‘free’ traffic” — traffic from organic, or natural, search results as opposed to paid placement.

Wait and see

Sunnyvale, CA-based Yahoo! built its search engine in-house with technology from search engine developers Inktomi and Overture Services, both of which it acquired last year. In the days immediately following Yahoo!’s announcement, search marketing professionals were not yet clear on what the criteria and rules of Yahoo!’s search spider would be.

But Brian Klais, vice president of e-business services for Madison, WI-based e-marketing agency Netconcepts, notes that Yahoo! has bumped up its threshold limit — the amount of a page that it will crawl — to 500k. “This means that catalogers can include more content on product or category pages,” Klais says. In comparison, Google indexes only the first 100k of a page’s content.

If a cataloger’s site has pages loaded with content, such as customer-contributed testimonials, “it’s a great thing because it’ll influence Yahoo’s search rankings,” Klais explains. On the other hand, “Google won’t even index it if it’s more than 100k.”
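To get a feel for how the two thresholds play out, here is a minimal Python sketch that measures the HTML size of a few catalog pages and flags which ones run past the 100k and 500k limits cited above. The URLs are hypothetical placeholders, and the byte limits are simply the figures quoted in the article, not official specifications.

```python
# Illustrative sketch: flag catalog pages whose HTML size exceeds the
# per-page crawl limits cited in the article (100k for Google, 500k for
# Yahoo! Slurp). The URLs below are hypothetical examples.
import urllib.request

GOOGLE_LIMIT_BYTES = 100 * 1024   # first ~100k indexed, per the article
YAHOO_LIMIT_BYTES = 500 * 1024    # Yahoo!'s higher threshold, per the article

def page_size(url: str) -> int:
    """Download a page and return the size of its HTML in bytes."""
    with urllib.request.urlopen(url) as response:
        return len(response.read())

def check_pages(urls):
    for url in urls:
        size = page_size(url)
        print(f"{url}: {size / 1024:.0f}k "
              f"(exceeds Google limit: {size > GOOGLE_LIMIT_BYTES}, "
              f"exceeds Yahoo! limit: {size > YAHOO_LIMIT_BYTES})")

if __name__ == "__main__":
    # Replace with your own product or category page URLs.
    check_pages([
        "http://www.example.com/products/widgets.html",
        "http://www.example.com/products/gadgets.html",
    ])
```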

By bumping up its threshold, Yahoo! may be trying to best Google in terms of depth. Google, which claims to scour nearly half of the approximately 10 billion pages on the World Wide Web, can certainly turn up more catalogers in consumers’ subject searches than Yahoo! can. Yahoo! doesn’t publicly reveal how much of the Web it scours, but the amount is believed to be considerably less.

“The Google spider tends to be fairly aggressive in finding content,” Lee says. “Current data show that Inktomi isn’t quite as aggressive. But Yahoo! wants to be as aggressive.”

Not surprisingly, Mountain View, CA-based Google doesn’t have much to say about its client-turned-competitor. “We’re trying to crawl the Web and get as much content as possible,” says a spokesperson who asked not to be named. The move by Yahoo! “has no bearing on catalogers who optimize with Google.”

So, if you invested resources to ensure that your Web pages float to the top of Google search results, will you now have to reoptimize your site to appeal to Yahoo! Slurp? Perhaps — especially given that “we’ve found the conversion rate on e-commerce sites tends to be higher with Yahoo! users vs. Google users,” says Netconcepts founder/president Stephan Spencer. “Yahoo! users are more shopping oriented than Google users, who are more research oriented.”

Before rushing to make wholesale changes, Spencer says, you “need to study Yahoo!’s search results and experiment with your pages to see the impact on rankings. For example, you can evaluate how important the title tag is by changing it on some pages and seeing what that does to your Yahoo! rankings.”
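As a rough illustration of the kind of experiment Spencer describes, the Python sketch below compares rank movement for a “test” group of pages whose title tags were changed against a “control” group left alone. The page names and rank figures are invented for the example; the actual rankings would have to be observed in Yahoo!’s results.

```python
# Minimal sketch of a title-tag experiment: change the tag on a test group
# of pages, leave a control group alone, and compare how rankings move.
# All page names and rank numbers here are hypothetical.
from statistics import mean

# (page, rank before change, rank after change) -- lower rank is better.
test_pages = [          # title tags rewritten on these pages
    ("garden-hose.html", 24, 11),
    ("bird-feeder.html", 37, 19),
]
control_pages = [       # title tags left untouched
    ("patio-set.html", 15, 14),
    ("rain-gauge.html", 28, 30),
]

def average_movement(pages):
    """Average change in rank (negative means the pages moved up)."""
    return mean(after - before for _, before, after in pages)

print("Test group movement:   ", average_movement(test_pages))
print("Control group movement:", average_movement(control_pages))
# If the test group improves noticeably more than the control group,
# the title tag likely carries weight in Yahoo!'s rankings.
```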

What’s more, while Yahoo! currently accounts for 30% of all U.S. searches, that number could decrease should Web users find Yahoo!’s search ineffective or otherwise unsatisfactory. “If users find the Yahoo! search less useful than it used to be when they were using Google results, then Yahoo! risks a mass migration of search engine users to Google,” Spencer says.

Get out your checkbooks?

At the same time that Yahoo! dropped Google, it introduced a pay-per-click service that charges Website owners to be included in the database. For Danny Sullivan, editor of search engine marketing portal Searchenginewatch.com, “That’s the real change. Catalogers may want to first sit back and compare traffic in Google in the beginning of February and the end of February and see if there were any problems. If there are, they may want to look into Yahoo!’s paid inclusion, which includes more content and traffic.”

The service costs $0.15-$0.30 per click. In a sense it’s an enhancement of the paid inclusion programs already offered by Yahoo! properties Overture and Inktomi. “Yahoo! said, ‘Let’s take the best of these paid inclusion programs and come up with more-unified pricing and replace all the older paid inclusion programs,’” Lee says. Because search engines AltaVista and AlltheWeb are part of Overture, participating in Yahoo!’s paid inclusion database will ensure that your listing appears on those engines as well.
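For a sense of what that pricing could mean in practice, the short sketch below turns an assumed monthly click volume into a cost range. Only the 15- to 30-cent fees come from the article; the click count is hypothetical.

```python
# Back-of-the-envelope estimate of Yahoo!'s per-click fees for a budget.
LOW_RATE, HIGH_RATE = 0.15, 0.30   # per-click fees cited above
monthly_clicks = 5_000             # assumed clicks from paid-inclusion listings

print(f"Estimated monthly cost: "
      f"${monthly_clicks * LOW_RATE:,.2f} - ${monthly_clicks * HIGH_RATE:,.2f}")
```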

According to Jennifer Stephens, a spokesperson for Overture, catalogers that pay to be included can discuss with Yahoo! which areas of their sites they want the spider to crawl. Yahoo! wants participants to “tell us what is important to their sites so that we’ll be able to crawl deep Web content we couldn’t get without an algorithmic crawl,” she says.

Catalogers that want even more control over what pages Yahoo! — or any other search spider — crawls may want to consider XML paid inclusion, suggests Lee. Rather than having the spider come to your site, you would provide a data feed to the search engine of the pages you want represented. “You could provide a data dump of all your pages directly to Overture,” Lee explains, “or through a search marketing agency.”

XML paid inclusion can also improve the rankings of Websites that aren’t optimized for the particular spider. A pay-per-click inclusion program like Yahoo!’s doesn’t guarantee position, Lee says, “just that you’re in the database.”
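As a rough sketch of the kind of data feed Lee describes, the Python snippet below assembles a simple XML file listing the product pages a cataloger might want represented. The element names, fields, and URLs are hypothetical stand-ins; the real feed format would come from Overture’s program documentation or a search marketing agency.

```python
# Rough sketch of building an XML feed of the product pages you want
# represented. The schema (element names, fields) is a hypothetical
# stand-in, not Overture's actual specification.
import xml.etree.ElementTree as ET

# Hypothetical product catalog -- in practice this would come from your
# product database.
products = [
    {"url": "http://www.example.com/products/widgets.html",
     "title": "Widgets", "description": "All widget models and prices."},
    {"url": "http://www.example.com/products/gadgets.html",
     "title": "Gadgets", "description": "Gadget catalog pages."},
]

feed = ET.Element("pages")
for product in products:
    page = ET.SubElement(feed, "page")
    ET.SubElement(page, "url").text = product["url"]
    ET.SubElement(page, "title").text = product["title"]
    ET.SubElement(page, "description").text = product["description"]

# Write the feed to a file that could be handed to the search engine
# or to a search marketing agency.
ET.ElementTree(feed).write("paid_inclusion_feed.xml",
                           encoding="utf-8", xml_declaration=True)
```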

Search Engine Marketing Glossary

cost per click Paying for sponsored links or paid inclusion links each time a user clicks on the link.

crawler Software used by a search engine to find and index Web pages; also called a robot or a spider.

paid inclusion Paying to be included in a search engine or directory index. Does not improve search rankings but guarantees inclusion of pages a spider might have missed and periodic “respidering” of pages. Also called pay for inclusion (PFI).

paid placement Paying for a link to be included on a search results page, usually at the top or right of the regular search results and set off by a label such as “sponsored links.”

search engine optimization (SEO) Techniques used to improve a Web page’s ranking on a search results page.

XML feed A structured data file, written in Extensible Markup Language, that allows data (including product databases) to be sent to search engines in the format they request.