Search Engine Optimization - long term



In this section:

Page 1: Analysing the site's SEO - too much spells disaster

Page 2: Understanding what is and what isn't good ranking

This page: Recognising and building long term, long haul SEO

Page 4: Bad link buying and how to avoid it


Long Haul SEO

If an SE doesn't consistently provide useful results, people will stop using it and the SE will die. Relevance, relevance, relevance is their mantra. To provide that relevance SEs need to trawl the billions of pages out there and use some ranking system to deliver the most useful pages first. Quite simply, there are two ways you can make your site feature on the first page. The first is by analysing their ranking system and applying the results to your site - a challenging job and one filled with constant change. The second is by aligning your site with the SEs' long term interests, i.e. making your site "relevant" for that subject/keyword. Long haul SEO is about the latter.

Some sites are made without ranking algorithms in mind. They largely ignore current SEO fads and focus on quality content rather than quality SEO. That's what you're hoping to detect in a site you're looking to buy, and it's what long term strategists hope to achieve with sites they own. The less time spent chasing algos, the more time available to concentrate on content. Initially this may come at the expense of losing some traffic to algo chasers but, with the SEs constantly revising their algorithms, the long term strategists aren't chasing rankings in the search engines; the search engines keep chasing them, trying to ensure they stay on the front page.

SEs see SEOs as people looking to manipulate their search results, and they have a natural animosity towards the SEO industry. SEs can assign higher trust to sites that they know haven't played around with any of the known optimisation techniques. Some examples:

Excessive use of <H1>
Optimising keyword density
Overuse of Meta tags
Building incoming backlinks/PR (even from unrelated sites)
Keyword stuffing alt tags
Cloaking
Excessive outgoing links to on-subject sites (subsequent to the Google update called Florida, filling a page with a large number of relevant outgoing links worked a dream and was primarily responsible for the phenomenon of "scraper" sites)
Hidden text
Using CSS Z-index/fixed positioning/tiny text to keep spammy text out of view of users (see the detection sketch below)
Manipulation of anchor text for incoming links
Volume for volume's sake (creating a large number of pages using auto-generation software)
Mod-rewriting URLs (particularly on dynamic sites with session variables in the URL) for SEs ... and even
Reciprocal linking

If a site's ever used any of the above there's always a chance - a high probability - that the SEs know about it. Even user agent cloaking!
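
During due diligence you can at least spot-check for the cruder tricks yourself. Here's a minimal, standard-library-only Python sketch of that idea: it flags CSS patterns commonly used to keep text out of a visitor's view. The URL is a placeholder, the patterns are only rough heuristics (plenty of legitimate stylesheets use display:none), and a hit means "worth a manual look", not proof of spam.

# hidden_text_check.py - rough heuristic scan for hidden-text tricks.
# A due-diligence sketch only: it cannot prove intent and will miss
# styles that are applied by JavaScript at run time.

import re
import urllib.request

SUSPECT_PATTERNS = {
    "display:none":         r"display\s*:\s*none",
    "visibility:hidden":    r"visibility\s*:\s*hidden",
    "tiny font":            r"font-size\s*:\s*[0-2]px",
    "negative text-indent": r"text-indent\s*:\s*-\d{3,}px",
    "off-screen position":  r"(left|top)\s*:\s*-\d{3,}px",
}

def scan(url):
    # Fetch the raw HTML and count occurrences of each suspect pattern.
    html = urllib.request.urlopen(url, timeout=15).read().decode("utf-8", "replace")
    hits = {}
    for label, pattern in SUSPECT_PATTERNS.items():
        count = len(re.findall(pattern, html, flags=re.IGNORECASE))
        if count:
            hits[label] = count
    return hits

if __name__ == "__main__":
    for label, count in scan("http://www.example.com/").items():
        print(f"{label}: {count} occurrence(s) - worth a manual look")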

There's no better SEO-ed site than one that hasn't been SEO-ed at all! If a domain was registered a long time ago, was originally paid up for 10 years (why bother renewing every year?), has page names and page extensions in the Internet Archive that haven't changed for years, has never had a keyword meta tag on any page, hasn't followed fads from framed pages to PHP delivery, and hasn't gained more than the odd incoming back link in the last two years ... those may actually act in its favour! (Read the famous Google patent page.) Bung some fresh content on a site like that and the history of "good behaviour" may push that fresh, no-link content to the front page of the SE results.
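
You can look at some of that history yourself: the Internet Archive's Wayback Machine exposes a public CDX search API. Below is a rough Python sketch that asks it for the earliest and most recent captures of a domain, as a crude proxy for how long the site has existed. The endpoint and parameters are the publicly documented ones at the time of writing and may change; example.com is a placeholder.

# archive_history_check.py - query the Wayback Machine's CDX API for the
# earliest and latest captures of a domain. A sketch only, not an official
# client; results depend on what the Archive happened to crawl.

import json
import urllib.parse
import urllib.request

def capture_range(domain):
    base = "http://web.archive.org/cdx/search/cdx"
    query = urllib.parse.urlencode({
        "url": domain,
        "output": "json",
        "fl": "timestamp",
        "collapse": "timestamp:4",   # at most one capture per year
    })
    with urllib.request.urlopen(f"{base}?{query}", timeout=30) as resp:
        body = resp.read().decode("utf-8")
    rows = json.loads(body) if body.strip() else []
    timestamps = [row[0] for row in rows[1:]]   # first row is the header
    return (timestamps[0], timestamps[-1]) if timestamps else (None, None)

if __name__ == "__main__":
    first, last = capture_range("example.com")
    print("Earliest capture:", first, "| most recent:", last)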

Who's in control?

To achieve relevance the SEs want to base search results on things we can't easily influence - algorithms that make it more difficult for us to figure out how to manipulate our sites into the top spot. And to achieve this they have a variety of tools.

The SEs' armoury:

1. They've got the history: History of the site, of individual pages, of content on those pages, of links to and from those pages etc. You can't change history as they've got it recorded in their databases. If a site was heavily into SEO two years ago, they know about it.

2. SEs have access to a whole range of information on how people use your site: the routes they take through it, how useful they find it, how many bookmark it, how quickly users exit, how often they return and a lot, lot more. Google, for example, can, in theory, combine information from AdSense/AdWords (and conversion tracking), the PR toolbar, personalisation services, and the clicks they track in the search results pages (including how often people hit the back button when they arrive on your site) - and that's just for starters. Though it's a phenomenal amount of information on its own, they can combine it with information from other sources and databases from companies they've acquired, and it becomes even more potent.

3. SEs have the experts, technology, and the money to chase relevance and weed out sites that don't have it.

4. Software is getting smarter, and heuristic programs are extending the SEs' ability to understand what searchers want.

What should always stand a site in good stead in the long term

Here's one good beginner's guide to SEO and a highly recommended ebook from a recognised expert. But, if you don't fancy all that reading:

1. Clean, standards-compliant, "useful" websites that load quickly, are content-rich and frequently updated, and that follow the web's simpler conventions (a short and descriptive title, easy navigation, no disabling of browser buttons, no keyword stuffing or other blatant SEO work ... that sort of thing).
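
A couple of those basics are easy to spot-check. Here's a small Python sketch, standard library only, that times a page's response and reports the length of its title; the URL is a placeholder and the numbers are indicative rather than a verdict.

# quick_page_check.py - crude spot-check of response time, title and size.
# A sketch, not a benchmark: a single request from one location tells you
# very little about real-world load times.

import re
import time
import urllib.request

def basic_check(url):
    start = time.time()
    with urllib.request.urlopen(url, timeout=15) as resp:
        html = resp.read().decode("utf-8", "replace")
    elapsed = time.time() - start

    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""

    print(f"Response time : {elapsed:.2f}s")
    print(f"Title length  : {len(title)} characters -> {title!r}")
    print(f"Page size     : {len(html) // 1024} KB")

if __name__ == "__main__":
    basic_check("http://www.example.com/")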

2. Natural-looking link building. A few links from directories, very minimal reciprocal or three-way linking, no apparent buying or selling of links, no attempted PR manipulation (buying/selling/hogging), no outward links to less reputable sites. Link-baiting (taking a stand on a controversial issue - "Abortion is murder", "Hitler was a kind and good man") to, er, tempt webmasters to link to you is OK. Paying to be listed in the "big" directories is OK.

3. A constant stream of new content. Not auto-generated content, not machine-translated content, but original, useful content. And it's got to be quality content. Article writing seems to have replaced link-building as the new Answer To Everything. But if SEs are going to keep users they will have to keep serving not just content pages but pages with good content. Can they filter on quality? They can use their toolbars, Analytics programs, PPC programs, SERP click tracking, and a variety of other tools to discern how visitors behave on hitting your content. And, though the SE has never sent a human to view your page, it "knows" how "good" your content is relative to competing content for that subject. Content isn't the long term answer; quality, authoritative content is.

4. No technical errors, no duplicate pages, a valid robots.txt, and a little pandering to the other things robots look for - like a favicon.ico, a sitemap, custom error pages (not all 404 pages return a server header of 404 - test that yours does) etc.
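
That 404-header point is easy to test yourself. Here's a minimal Python sketch of the idea: request a path that almost certainly doesn't exist and see what status code comes back. The domain is a placeholder.

# soft_404_check.py - check whether a missing page really returns a 404
# header or a "soft" 200. A sketch only; run it against your own site.

import urllib.error
import urllib.request
import uuid

def check_404(domain):
    # A random path that is vanishingly unlikely to exist on the site.
    url = f"http://{domain}/{uuid.uuid4().hex}.html"
    try:
        with urllib.request.urlopen(url, timeout=15) as resp:
            print(f"{url} returned {resp.status}: a soft 404 - "
                  "the error page is served with a success header.")
    except urllib.error.HTTPError as err:
        if err.code == 404:
            print(f"{url} correctly returned a 404 header.")
        else:
            print(f"{url} returned {err.code} - worth investigating.")

if __name__ == "__main__":
    check_404("www.example.com")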

Having no duplicate content is good too. The following can all be seen as different pages:

http://mysite.com/
http://www.mysite.com/
http://mysite.com/index.htm
http://mysite.com/Index.htm
http://mysite.com/index.html
http://mysite.com/index.htm?id=1

Other bugs/features in dynamically created sites can also appear as duplicate content to SEs.
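
One way to spot-check this is to request a few of those variants and see whether they all redirect to a single canonical address. A rough Python sketch along those lines, with mysite.com standing in for the site in question; ideally all but one variant should answer with a 301 redirect.

# canonical_check.py - request common URL variants of the home page and
# report where each one ends up. A sketch only, not a full duplicate-content
# audit.

import urllib.error
import urllib.request

VARIANTS = [
    "http://mysite.com/",
    "http://www.mysite.com/",
    "http://mysite.com/index.htm",
    "http://mysite.com/index.html",
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Stop urllib from following redirects so we can see the raw response.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_variants():
    opener = urllib.request.build_opener(NoRedirect)
    for url in VARIANTS:
        try:
            resp = opener.open(url, timeout=15)
            print(f"{url} -> {resp.getcode()} (served directly)")
        except urllib.error.HTTPError as err:
            if err.code in (301, 302, 303, 307, 308):
                print(f"{url} -> {err.code} redirect to {err.headers.get('Location', '?')}")
            else:
                print(f"{url} -> HTTP {err.code}")
        except urllib.error.URLError as err:
            print(f"{url} -> failed: {err.reason}")

if __name__ == "__main__":
    check_variants()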

5. Regularly running a spider like Xenu Link Sleuth through your site to check whether any of your links to external sites point to 404 pages or to sites that no longer exist is a good idea.
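
If you'd rather script a quick check than run a desktop spider, something along these lines does a very cut-down version of the same job for a single page. It's a Python sketch using only the standard library; it checks one page rather than a whole site, and the start URL is a placeholder.

# dead_link_check.py - pull the outbound links from one page and report any
# that return an error or no longer respond. A very small stand-in for a
# proper site-wide spider.

import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def check_external_links(page_url):
    html = urllib.request.urlopen(page_url, timeout=15).read().decode("utf-8", "replace")
    parser = LinkCollector()
    parser.feed(html)
    own_host = urlparse(page_url).netloc
    for href in sorted(parser.links):
        url = urljoin(page_url, href)
        if urlparse(url).scheme not in ("http", "https"):
            continue
        if urlparse(url).netloc == own_host:
            continue                      # only interested in external links
        try:
            status = urllib.request.urlopen(url, timeout=15).status
            print(f"OK   {status}  {url}")
        except urllib.error.HTTPError as err:
            print(f"BAD  {err.code}  {url}")
        except urllib.error.URLError as err:
            print(f"DEAD       {url}  ({err.reason})")

if __name__ == "__main__":
    check_external_links("http://www.example.com/")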
