Websites exist for sharing information.
Whether it’s news of your latest big product release, general info about your company or industry, or a story about your day in the park with your dog, chances are you’re putting it out there for people to read. Since the days of Lycos, AltaVista, Yahoo, and, of course, Google, search engines have been a big part of that goal.
Building your site to entice search engines to index and favorably place your pages has gone from the brute-force spider-baiting methods of the late ’90s to the… well, brute-force spider-baiting methods of the 2000s.
SEO has become an industry in its own right, but many SEO companies still seem focused on keyword bombing, link farming, and site “build-out.” That approach does work up to a point, so these firms can get away with it and sell their clients on their “success,” but it usually means leaving two things behind: 1) your customers, and 2) sane, usable content.
There is a better way. It is possible to build search engine friendly sites without making your site look like a dictionary or random pile of keywords. With a little bit of time and effort, a good understanding of your site’s real goals (“getting a top search ranking” is not a real goal), and some thoughtful copywriting, you can serve your customers a readable, usable site and still rank well in your target searches.
As I see it, there are two basic principles of SEO:
- Understand how the spiders see your site’s pages.
- Create compelling, accessible, usable content, and organize it so spiders can see it.
I’ll talk about each of these in more depth, though not in great detail; this post is intended to offer an overview, and perhaps a better general approach to SEO, not blow-by-blow implementation guidelines. With that disclaimer in place, let’s continue…
Understand how the spiders see your site’s pages
It’s a bit of a technical question, and it does vary from search engine to search engine, but it is crucial to understand what a spider sees when it visits your pages.
First, look at your site’s code. Go to a page, right-click, and select “View Source,” “View Page Source,” or your browser’s equivalent function. The window that pops up is the soup that a spider has to strain to get to the meat of your site.
Most spiders are smart enough to ignore irrelevant content (pretty much everything above `<body>`, except some of the tags that start with `<meta`). The spider is looking for links, both deeper into your site (so it can index more of your pages) and out to other sites (so it can gauge your site’s relevance). It is also looking for copy, meaning textual content. It gets some of this from the content visible to your users, some from things like `title` attributes, and some from linked or embedded filenames.
The spiders grab all* that juicy content, wrap it up in a little sack, and send it home to digest.
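To make the idea concrete, here is a toy sketch of the stripping process described above, built on Python’s standard-library `html.parser`. The page markup and the exact rules (skip `<style>`/`<script>`, keep links, `title` attributes, and visible text) are my own illustration, not any real search engine’s logic:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Toy model of what a crawler keeps from a page:
    links, title-attribute text, and visible copy."""
    SKIP = {"script", "style"}  # content a spider treats as irrelevant

    def __init__(self):
        super().__init__()
        self.links = []   # hrefs, for crawling deeper and gauging relevance
        self.copy = []    # textual content, for indexing
        self._skipping = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in self.SKIP:
            self._skipping += 1
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        if "title" in attrs:          # title attributes count as copy too
            self.copy.append(attrs["title"])

    def handle_endtag(self, tag):
        if tag in self.SKIP:
            self._skipping -= 1

    def handle_data(self, data):
        if not self._skipping and data.strip():
            self.copy.append(data.strip())

# Invented sample page for demonstration.
page = """<html><head><style>p { color: red }</style></head>
<body><h1>Dog Parks</h1>
<p>Our <a href="/guide" title="park guide">guide</a> covers local parks.</p>
</body></html>"""

spider = SpiderView()
spider.feed(page)
print(spider.links)  # ['/guide']
print(spider.copy)   # headings, link text, and the title attribute; no CSS
```

Everything the parser discards here (the `<style>` rules, the tag soup itself) is the “soup” mentioned earlier; what survives in `links` and `copy` is the meat that goes home in the sack.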
The search engine then takes your stripped-down content and runs it through its ranking algorithms. The key things it is looking for are:
- relevance, a measure not just of which keywords you use, but of how they relate to each other,
- and how your site relates to itself and to other sites covering the same subjects.
So yes, you can get away with a bit of keyword bombing and link farming, but search engines have gotten pretty good at spotting it, so you also need good, relevant content to keep your keywords from looking like obvious spam or spider bait.
Which leads us into our second principle…
Create compelling, accessible, usable content, and organize it so spiders can see it.
The number one problem with keyword bombing is that it’s very hard to integrate into consumer-friendly, readable, and usable content. You’ll likely end up with paragraphs where every third word is a keyword, and, no matter how good your copywriters are, your visitor is going to feel like she’s reading gibberish.
As I mentioned above, creating good content is primarily dependent on understanding your site’s goals and your audience’s needs. The content should support those things. First develop a good site strategy—aligned with your client’s branding, tapped into industry trends, and mindful of SEO—then set your star copywriter loose to pull it together into a cohesive and compelling package.
“SEO is simply a positive side effect of good content strategy.” **
Sure, you want to cover the basics by ensuring your keywords appear in `<title>`, `<meta>`, and header tags, and are sprinkled throughout the rest of your content, but it’s much more important that you have an easily navigable site structure and clear, readable, interrelated content. Not only does that create a better experience for the average visitor with a constantly decreasing attention span, but it’s also more enticing to the spider and more understandable to the search engine’s indexing algorithms.
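As a sketch of what “covering the basics” might look like, here is a hypothetical page fragment (all names and copy invented) where the same topic shows up naturally in the title, meta description, and headers, with plain readable body copy rather than keyword stuffing:

```html
<html>
<head>
  <!-- keywords live naturally in the title and meta description -->
  <title>Dog Park Guides and Reviews | Acme Parks</title>
  <meta name="description"
        content="Maps, reviews, and visiting tips for local dog parks.">
</head>
<body>
  <!-- headers echo the same topic without stuffing -->
  <h1>Dog Park Guides</h1>
  <h2>Finding a park near you</h2>
  <p>Every park page links to its neighbors, so both visitors
     and spiders can move through related content.</p>
</body>
</html>
```

The point is not the specific tags but the consistency: title, description, headers, and body all tell the same story, which reads well to a person and indexes well for a spider.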
More important still, you want your visitors to recommend your site to others, especially in contexts that are relevant to your site’s goals. Visitors are only going to do that if they can quickly make sense of your content, and if it answers a question, solves a problem, or simply provides information that piques their interest.
The emperor is dead, long live the king.
Obviously, there’s much more to creating compelling, user- and search-engine-friendly sites than what I’ve covered here, and this only scratches the surface of the arguments against “traditional” SEO practices. In my experience and opinion, however, these are the two most basic principles on which all SEO should rest.
In summation: Compelling content is king, while content caked with keywords is merely an emperor with no clothes.
* There was a time when spiders could only consume so much from a single page… anything beyond a certain amount they’d just ignore. I don’t think this is still the case, but I haven’t been able to find direct confirmation.
** Thanks to @odonnell for that juicy tidbit.