responses

This page discusses responses to search engine optimisation myths.

It supplements discussion elsewhere on this site regarding search engines (and popular search terms), metadata, online resource identification and networks.

A fundamental response to the challenges outlined in the preceding page of this note is simply to offer search engines and humans a site that is worth visiting.

Another response is to recognise that appearance in a search engine's list of results is merely one way that a past or potential user can find a site, page or other online resource.

For some users the most appropriate - and perhaps most cost-effective - route to that resource may be offline: citation on a letterhead, in an email, on a business card, on the side of a bus or in a newspaper advertisement. Links from other sites, particularly sites that the user is likely to regard as authoritative (or is merely likely to encounter), are also valuable.

We have cautioned some clients against a fixation with being 'number one', questioning the axiom that "if you aren't in the top 20 you don't exist". Some sites are aimed at a demographic that undertakes sophisticated searching (eg Boolean-style queries) and is prepared to explore several pages of results.

A third response is to 'know' the market for your site and to be aware of who is visiting it (and how they are navigating it). Commercial web statistics services and free services such as Google Analytics are valuable for understanding arrivals, departures and roadblocks.

Attention to the needs of human users and search engines is also useful, both to maximise the likelihood of a page being found and to address broader concerns about accessibility.

Principles include -

  • using standard, validated code
  • coherent site architecture
  • provision of a site map (or equivalent) for any site that has more than a few pages/files
  • maintenance of internal links
  • avoidance of text that is clearly redundant (on a paragraph-by-paragraph or page-by-page basis)
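The internal-link maintenance principle above can be sketched as a small checker. This is a minimal illustration only - the `pages` mapping of paths to HTML source is an assumed input, not the API of any particular tool:

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def broken_internal_links(pages):
    """Given a dict mapping path -> HTML source, return (page, href)
    pairs for internal links that point at no known page."""
    broken = []
    for path, html in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            # Skip external, mail and fragment-only links.
            if href.startswith(("http://", "https://", "mailto:", "#")):
                continue
            if href.split("#")[0] not in pages:
                broken.append((path, href))
    return broken
```

Run periodically over a site's source, a check of this kind catches the dead internal links that both irritate human visitors and waste a crawler's visit.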

It appears to be useful to keep sites up-to-date (some engines are biased against sites where no content has been changed for several years, particularly sites that have not attracted substantial traffic).

Recognition of variant spellings may also be valuable - one reason why this page refers to both optimisation and optimization.

Engines disfavour sites that

  • use non-compliant HTML
  • feature content that appears to have been created automatically (eg random text salted with keywords)
  • rely on 'invisible' (aka 'white') text to create a higher keyword density
  • install malware or are linked to known spammers
  • feature 'spamdex' metadata (eg heavily repeated keywords in the page header)
  • have content that is strongly at variance with page names and navigation points
  • are reliant on 'link exchange' schemes
  • are repeatedly resubmitted to those engines
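As an illustration of the 'spamdex' and keyword-density points above, a naive check might look like the following. This is an assumption-laden sketch - real engines use far more sophisticated signals, and the 30% threshold is arbitrary:

```python
import re
from collections import Counter


def keyword_density(text, keyword):
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)


def looks_spamdexed(meta_keywords, threshold=0.3):
    """Flag a meta-keywords string whose most common term dominates it."""
    words = re.findall(r"[a-z0-9']+", meta_keywords.lower())
    if not words:
        return False
    _, count = Counter(words).most_common(1)[0]
    return count / len(words) >= threshold
```

The point of the sketch is the asymmetry: a page author can compute density as easily as an engine can, which is why stuffing keywords into invisible text or headers is so readily detected and penalised.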

Some site operators have responded to online search challenges by choosing to pay for placement, deciding that the best way of getting a user's attention during a search is to pay to appear above or otherwise adjacent to the list of results on the search engine results page (SERP) for a particular query.











version of May 2006
© Bruce Arnold