Editor’s note: This is the first of a multipart series on SEO, designed to give small businesses and organizations insight into the important, but often-deceptive, world of SEO consulting. The series is meant to give website owners enough background to carry on a well-informed conversation with a consultant, as well as tips on steps they can take themselves to improve their sites.
Part 1 of a Series About SEO
In the beginning, we called it the Information Superhighway, a cliché that now seems quaint. But to follow that analogy, the web was a highway with few exits, no billboards and no maps.
The newly public World Wide Web was fresh from the incubator of universities and think tanks, which shared favorite links to help spread the word about new and interesting sites. Those were the days when a link might be an IP address rather than a domain name, and DotNet had much more credibility than DotCom in a tech world that still debated whether the web should be commercial at all.
We navigated the web by looking at lists of links on well-known sites, where we learned about other cool and useful sites. Almost immediately, web sites emerged that collected and shared links, organized by category.
The colossus of this era was the original Yahoo, with thousands of wannabes and “cool site of the day” listings trailing in its wake.
But this type of site required human editors to collect and post links, a job that quickly became unthinkable as the web mushroomed. It also led to one of the first ethical controversies of the web: link-buying.
When the bills came due, the big operations needed cash, and they began selling directory placements – and they didn’t label them as paid ads. Meanwhile, the constant surge of new sites had little way to break into the directory listings. So directories began to lose credibility; people could no longer trust that the sites displayed prominently in the directories had really been chosen and endorsed by an unbiased web editor.
Grey Hats take the stage
The directory system also spawned the first schemes to game the web link business. Most directories included a way for a website owner to submit their site for inclusion – or at least consideration – on the list. The schemers developed and sold standalone software – or in some cases, services – that allowed you to enter your web address, and then automatically submitted your URL to hundreds or thousands of web directories.
(This is still a popular “grey hat” scheme today, updated a bit for the search-engine algorithm world.)
Building better ways to catalog and map the web became one of the top priorities for entrepreneurs, and they began to focus on search technology. Searching a database is a fairly old technology. Every database has search functionality, from blunt-instrument to extremely sophisticated. A database that’s not searchable in a useful way is, well, useless. And the web, to stretch a definition, is a database – an aggregation of billions of bits of information residing on websites linked directly or indirectly.
The early search sites developed programmed “bots” that moved from link to link within and outside a website, and from site to site – like a spider moving across a web – and sent information on pages and link text back to the search site. Visitors could then search the collected information on the search site in various ways.
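For the curious, the spider idea boils down to a simple traversal: start at one page, follow every link you haven’t seen yet, and record each page you reach. Here is a minimal sketch in Python; the “web” is a made-up in-memory link map, not real sites, and real crawlers add fetching, parsing and politeness rules on top of this skeleton.

```python
from collections import deque

# Toy "web": each page maps to the pages it links to (illustrative data only).
WEB = {
    "home": ["about", "links"],
    "about": ["home"],
    "links": ["home", "about", "external"],
    "external": [],
}

def crawl(start):
    """Follow links breadth-first from `start`, like an early spider,
    recording every page reached exactly once."""
    seen = {start}
    queue = deque([start])
    index = []
    while queue:
        page = queue.popleft()
        index.append(page)          # "send back" this page to the index
        for link in WEB.get(page, []):
            if link not in seen:    # don't revisit pages
                seen.add(link)
                queue.append(link)
    return index

print(crawl("home"))  # → ['home', 'about', 'links', 'external']
```

The `seen` set is what keeps the spider from chasing its own tail when pages link back to each other, which real websites do constantly.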
And in come the Black Hats
Unfortunately this information covered only a portion of the web, and was poorly organized. And a new wave of black hats learned they could capture the top pages of search results by “stuffing” their sites with hundreds of hot links that didn’t necessarily match the site content. Search results became increasingly useless.
Popular early search engines such as Excite, Webcrawler and Alta Vista were a step up from web directories, because they showed you everything they turned up, instead of cherry-picking and showcasing paying customers. But seeing everything isn’t necessarily useful; the noise drowned out the results you were actually searching for.
Using even a top-rated search engine of the day often meant having to flip past the first several pages of results, which generally were occupied by various spam sites – very often porn sites using respectable keywords to lure visitors. You had to go much deeper into the search results to find what you were looking for.
Google changed all that in the late 1990s, launching with a system called “PageRank,” the foundation of an incredibly complex algorithm that seeks to do one thing: clear away the black-hat and low-value links and deliver high-value, relevant content based on the search words. This algorithm is updated continually to tweak performance and dodge attempts to game the system.
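The core insight of PageRank, as originally published, is that a link is a vote: a page counts for more when other pages – especially other highly ranked pages – link to it. The sketch below shows that idea in simplified Python; the link data is invented for illustration, and Google’s production algorithm has long since grown far beyond this.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: a page's score is the chance that a "random
    surfer" lands on it, where the surfer usually follows a link and
    occasionally jumps to a random page (the damping factor)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}  # random-jump share
        for page, outgoing in links.items():
            if outgoing:
                share = rank[page] / len(outgoing)
                for target in outgoing:          # pass rank along each link
                    new[target] += damping * share
            else:
                for p in pages:                  # dead-end page: spread evenly
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Hypothetical link graph: pages "b", "c" and "d" all link to "a".
toy = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}
ranks = pagerank(toy)
```

Running this, page “a” – the one everyone links to – ends up with the highest score, and “b” outranks “c” and “d” because its single inbound link comes from the high-ranked “a”. That second effect is exactly why black-hat link schemes chase links from prominent sites.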
Gaming the system today
And there are more gamers than ever. An entire industry has grown up around the idea of taking actions on and off sites to push a client further up the search results page. Some Search Engine Optimization (SEO) work is transparent and open, some involves “trade secrets” that are of value only if held close to the chest, and some moves into black-hat territory. And of course, there are a significant number of flim-flam “consultants” who promise things they can’t deliver, or use sleight-of-hand to inflate success rates.
In its legitimate form, SEO works with a website’s underlying code, proper page and content structures and other strategies. Google itself encourages this type of SEO. The search company even offers tools that analyze sites for their transparency to the indexing search bots, as well as tips for legitimate ways to improve the way you present your site to Google.
But the darker side of the industry works continually to find new loopholes and ways around Google’s “anti-gaming” safeguards, while Google works continually to close loopholes and shut down the latest schemes. SEO work in this area is brinkmanship; at best, it’s a continuing effort to keep a step ahead; as soon as the crowd learns the loophole, Google will certainly move in. At worst, a misstep can result in a client’s site facing Google’s “death penalty” – being zapped from search results.
The SEO for Non-Geeks series focuses on Google, as its algorithm delivers most search results in much of the world, and a site that is properly optimized for Google is also generally good for other search engines.
In the following parts of this continuing series, we’ll cover several major topics, including:
- SEO starts with the right platform
- Content is still king
- Things you do that hurt your SEO
- Your site updates
- Warning flags: Scams, cheats & black hats
Our SEO Services
Cook Profitability Services offers complete SEO services, from site setup on search-engine-friendly web platforms, to tweaking content and processes for a client’s top search terms, to ongoing maintenance and training. While we do have our own proprietary systems, our methods are strictly “white hat” and above board. Our SEO clients have seen their critical search rankings reach or approach the top of Google search engine results pages. For a free evaluation of your web development, SEO, internet marketing or public relations needs, call us at 210-601-1050 or submit a contact form.