SEO tactics to avoid

You know there are search engine optimization tactics you should steer clear of. But which are the worst of the worst?

The list of SEO tactics to avoid depends heavily on how risk-averse a site owner is. For instance, I work primarily with big brand e-commerce and media sites, so my risk tolerance matches my clients’ risk tolerance — very low.

In the world of SEO, the consequences of risky tactics can be difficult to diagnose and trace back to actions that can be reversed. Change comes slowly in natural search, so the time to recover can be lengthy. There are enough solid SEO tactics that are safe, scalable and effective, so there’s no need to dabble in risky tactics.

I find that a rule of thumb from my childhood applies to SEO: “If you’re worried you’ll get caught doing something, you probably shouldn’t be doing it in the first place.”

But everyone’s favorite scolding works equally well: “If your friend jumped off a bridge, would you jump too?” These are the SEO tactics I’d be worried about.


The 301 redirect is one of an SEO’s favorite tools for passing link popularity to another URL and de-indexing content. But sometimes site owners want to harvest link popularity from URLs that humans still need to see.

It’s possible to conditionally 301 redirect URLs based on user agent/bot detection. This, however, is a tactic that risks a Google penalty or ban, according to Google Webspam head Matt Cutts.

Conditional redirecting is considered a form of cloaking in which the human is shown one thing and search engines are shown another. There are many other methods of cloaking to feed text and links to search engines but hide them from humans.
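To make the pattern concrete, here is a rough sketch of the user-agent sniffing that conditional redirects rely on. The bot signatures and target URL are made up for illustration; this is the risky behavior to recognize and avoid, not a recommendation.

```python
# Illustration of the risky pattern described above: serving a 301 only
# when the requester's User-Agent looks like a search engine crawler.
# The bot substrings and the target URL here are illustrative only.

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def is_search_bot(user_agent: str) -> bool:
    """Naive user-agent sniffing -- exactly the detection cloakers rely on."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def handle_request(user_agent: str, url: str) -> tuple[int, str]:
    """Return (status, location): bots get a 301, humans get the page."""
    if is_search_bot(user_agent):
        return (301, "/seo-target-page")   # crawlers are silently redirected
    return (200, url)                      # humans see the original URL
```

Because humans and crawlers receive different responses for the same URL, this is cloaking by definition, and it is trivially detectable by an engine crawling with a non-bot user agent.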

The best way to stay safe is to avoid tactics that present content to crawlers that humans can’t see.


Speaking of which, hiding text and links seems like an easy way to get those juicy SEO benefits without having yucky text and links cluttering up the sleek design of the site. There are a lot of ways to accomplish this, including white-on-white text, off-page positioning with CSS (cascading style sheets), visibility: hidden in CSS, and placing the content a page view beneath the footer.

Technically, this isn’t cloaking because it isn’t based on user agent or bot detection. But the engines are onto these tactics. Nothing is truly hidden anymore: The engines can crawl CSS and JS files.
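To illustrate how detectable these tricks are, here is a toy heuristic that flags some of the hidden-text patterns mentioned above in a stylesheet. The patterns are illustrative only; real engine analysis is far more sophisticated (white-on-white text, for instance, requires comparing text color to the rendered background).

```python
import re

# Rough heuristic for the hidden-text tricks listed above. The patterns
# are illustrative; a real crawler's CSS analysis is far more thorough.
SUSPICIOUS_CSS = [
    re.compile(r"visibility\s*:\s*hidden"),
    re.compile(r"text-indent\s*:\s*-\d{4,}px"),   # off-screen positioning
    re.compile(r"left\s*:\s*-\d{4,}px"),          # off-screen positioning
]

def flag_hidden_text(css: str) -> list[str]:
    """Return the patterns that match -- i.e., what a reviewer might flag."""
    return [p.pattern for p in SUSPICIOUS_CSS if p.search(css)]
```

Even this ten-line toy catches the classic off-screen and visibility tricks, which is the point: nothing hidden with CSS is hidden from a crawler that fetches CSS.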

It’s debatable whether they’re sophisticated enough to make algorithmic decisions based on CSS and JS. But if a site is flagged for human review, I don’t want to be found using these tactics.


You want keyword URLs, but you don’t want to go through the whole URL rewriting process. It’s easier just to stick a directory or two — or three — at the front of the existing URL that doesn’t really serve any functional purpose, right?

Don’t bother. The addition of keywords to a URL without rewriting the elements that actually drive the content of the page is like taking one step forward and one step back. The URL has keywords, but now it’s even longer than it was before.

Sure, it will probably get indexed. And if the legacy URLs are 301-redirected to the new keyword URL to preserve link popularity and de-index the legacy URLs, it’s not likely that there will be much impact.

If the keyword insertion is overdone, however, this tactic could easily look like risky keyword stuffing. For example, I’ve seen URLs in the wild (details changed to protect the innocent) with several keyword directories stacked in front of the functional path. A single, concise keyword URL produced by true rewriting would be so much more optimal.
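The difference between the two approaches can be sketched in a few lines. The slug helper and the /widgets/ path below are hypothetical, but they show why rewriting beats prepending: one replaces the opaque legacy URL, the other just makes it longer.

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short keyword slug (hypothetical helper)."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# True rewriting: the keyword slug *replaces* the opaque legacy URL.
def rewritten_url(title: str) -> str:
    return f"/widgets/{slugify(title)}"

# Prepending: keyword directories are bolted onto the front, and the
# URL ends up longer than it was before.
def prepended_url(title: str, legacy_url: str) -> str:
    return f"/{slugify(title)}{legacy_url}"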


By now most folks have heard about the great text-link smack-down of late 2007, in which Google penalized sites that were blatantly buying or selling text links for natural search link benefit. Yet many sites continue to buy and sell text links that pass link popularity.

Whatever your personal opinion about buying and selling text links, if you plan to do it, make certain that the links are nofollowed, pass through a 302 redirect, or are placed with JavaScript, to negate their ability to pass link popularity. Focus your SEO link-building strategy instead on blogging, social media, link bait and other natural methods of building links to your site.


Speaking of unnatural link building, the king of these tactics is the link farm or ring made up of sister sites. Sometimes these sites are all about the same topic, sometimes not.

Sometimes the sites are actually “real,” and sometimes they re-post scraped content. Sometimes it’s a collection of microsites that contain many of the same products, but focus on different niches.

The kicker for this tactic is that, typically, all of the sites are hosted on the same class-C IP block and registered to the same company or person. The search engines are accredited domain registrars themselves; they know the hosting and registration details for every site. Fifty thousand links from 50 external domains have far less value when they can all be traced back to the same owner.
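Grouping sites by their class-C block is trivial, which is why this footprint is so easy to spot. A minimal sketch using Python’s standard ipaddress module (the example IPs are from documentation ranges):

```python
import ipaddress

def class_c_block(ip: str) -> str:
    """Collapse an IPv4 address to its /24 ("class C") network."""
    return str(ipaddress.ip_network(f"{ip}/24", strict=False))

def count_distinct_blocks(ips: list[str]) -> int:
    """How many distinct /24 blocks a set of 'independent' sites spans."""
    return len({class_c_block(ip) for ip in ips})
```

Fifty sister sites that collapse to one or two /24 blocks look nothing like fifty genuinely independent domains.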


An HTML sitemap is a tried-and-true usability feature and a secondary navigation path for search engine crawlers. But some sites take sitemaps to the extreme, creating long chains of sitemaps linking to sitemaps linking to sitemaps to compensate for a poorly structured site.

Pages that lack content and contain only links do provide a crawlable path deeper into a site, but beware. Very little value passes through a string of pages that themselves offer no value besides links.

Those pages are highly unlikely to rank well, or to pass on the kind of keyword and link popularity signals that are required for the deep pages to rank well. At the end of the trail, all that site has is indexation without the ability to rank.
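The decay can be sketched with a deliberately simplified PageRank-style model, in which value passed along a chain of single-link pages shrinks by the damping factor at each hop (classically d = 0.85). Real ranking is vastly more complex; this only illustrates why each extra sitemap hop dilutes what reaches the deep pages.

```python
# Simplified, illustrative model only: value passed through a chain of
# link-only pages decays by the damping factor d at every hop.
DAMPING = 0.85

def value_after_hops(initial: float, hops: int, d: float = DAMPING) -> float:
    """Value surviving after passing through `hops` link-only pages."""
    return initial * d ** hops
```

Under this toy model, three chained sitemap pages pass along only about 61 percent of the original value, and six chained pages barely 38 percent.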

In order for content to have SEO value, it needs to have human value as well. If a list of a list of a list of links starts to seem like a good idea, ask yourself if you’d honestly find value in that page if Google landed you there. If the answer is no, either don’t do it, or plan to include useful, targeted content on the page as well as links.

At the end of the day, relevance is a search engine’s product. Search engine software engineers spend their lives analyzing what humans consider to be relevant content and optimizing their algorithms to identify and deliver relevance in their search results.

As such, one of the best rules of thumb for SEO is, “What’s good for the goose is good for the gander.” Especially if by goose you really mean humans and by gander you really mean SEO.

The time you may be tempted to spend on thinking of ways to trick the engines into ranking a site higher? Take that time and refocus it on developing a structurally crawlable site, unique content, a strong value proposition, and links built the old-fashioned way — earned.


Jill Kocher is manager of natural search consulting at search marketing firm Netconcepts.