A Quick History of SEO Algorithms
Search Engine Optimization (SEO) has always been something every website owner is “in” by default the second they launch a website.
Whether they know it or not, search engines relentlessly scan the web looking for the pages we post to our sites. These “spiders,” as they are called, essentially make a copy of your site’s pages and then try to determine what each page is about based on its content and the HTML tags that support that content.
If the algorithm likes your page better than your competitors’ pages, your site goes to the top of the rankings for any number of keyword phrase variations (hundreds, even thousands) that relate to what your site is about and what it offers its visitors.
It’s easy to see how it is in your best interest as a site owner to keep these elements in line with the search engine’s ranking rules. A site built around what the search engines are looking for and how they rank pages will attract more visitors (potential customers) than your competitor’s website.
Even if we took NO strides toward making our pages more ‘friendly’ for search engines, odds are good we’d have a page or two rank for SOMETHING somewhere along the way (the company name, for instance). Your company name is simply not a competitive phrase, right? It’s more than likely going to pop up because no one else is trying to rank for it.
However, ‘active’ SEO is the ongoing study, effort, and practice of influencing a search engine (or search engines) to rank a website’s pages higher than the competition. A consistent, well-planned SEO strategy becomes necessary when we are tackling more competitive keywords and casting the widest net possible with content marketing and backlink building.
I’d go one step further these days and add that Google is SMARTER than ever (and getting smarter each year). We don’t have to ‘trick’ Google; we just have to give it a reason to LOVE our site’s content.
In fact, unlike the old days of black-hat, risky manipulation, SEO is really about building the best experience for our VISITORS (humans, not ranking algorithms), with the idea that those visitors will engage (and hopefully link to our pages), prompting Google to rank our sites higher over time.
In the old days (the early 2000s), we could stuff tons of keywords into a page and then not touch it for months while it climbed the ranks, driving new traffic. Now we have to think of websites as living documents. It’s been suggested (and I have a case study showing this) that an untouched website is a bit like a horse put out to pasture. Google doesn’t want to show your old horse to its customers!
Keyword Stuffing: SEO before 2003
What do I mean by ‘influence’ exactly? To oversimplify, search engines rank your site’s pages based on what YOU say each page is about (the optimization of your pages and the keywords you use), AS WELL AS what OTHER people say it’s about (the number of sites that link to those pages).
Search engines check both on-page AND off-page factors together. These days they check other technical factors too (like site speed, secure certificates, etc.), but let’s stick with this manipulation theme for a moment.
Since the search engines don’t tell site owners exactly what their rules are (they do provide guidelines), site owners have always had to guess most of the time. They would (and still do) arrive at a hypothesis by observing sudden changes in traffic volume and rankings while comparing notes with others doing the same. When multiple sites reflect similar patterns, it’s a good sign an update has just taken place.
It didn’t take site owners long to figure out that if they used certain words in their web pages many more times than their competitors did, they could rank higher than those competitors. They could effectively ‘stuff’ a page with far too many keywords, and at the time that was very effective.
I can remember cramming 14 variations of a core service keyword phrase into a home page title and meta description. That’s what you did at one time as an SEO professional, because it worked: increased rankings almost always resulted in a short amount of time.
It really didn’t matter what other sites linked to yours; it just mattered what YOU said the page was about with your keywords. The more the better, and you jumped right to the top of the results pages because your competitors often weren’t taking the time to practice the same stuffing technique.
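For illustration, a stuffed page’s head section from that era might have looked something like this (the business and keyword phrases here are made up, not from a real site):

```html
<!-- Hypothetical example of early-2000s keyword stuffing -->
<head>
  <!-- Title crammed with repeated keyword variations instead of one readable phrase -->
  <title>Dentist | Dentists | Dental | Best Dentist | Cheap Dentist | Family Dentist</title>
  <!-- Meta description written for the ranking algorithm, not the human reader -->
  <meta name="description"
        content="Dentist, dentists, dental care, best dentist, affordable dentist, family dentist, cosmetic dentist">
  <!-- The old keywords meta tag, once a prime stuffing target, now ignored by Google -->
  <meta name="keywords"
        content="dentist, dentists, dental, teeth, best dentist, cheap dentist">
</head>
```

Today, the same page would be better served by a single readable title and a description written for the human scanning the results page.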
Needless to say, the folks at Google didn’t like what was happening with all this keyword stuffing. Simple optimization and manipulation meant that lots of spammy sites were infesting the search results. Oftentimes, it was these spam-focused site owners who were experimenting the most with these techniques.
Thus came the first search engine UPDATE of significance, called the ‘Florida’ update (Google used to name its updates…not so much these days).
Authority: SEO between 2004 and 2011
Google’s answer with the Florida update was to change the weight of what made a website rank. No longer was it enough to stuff pages with keywords in order to manipulate rankings.
If a page was truly a ‘good’ and ‘reliable’ authority on a certain topic, it followed that other sites would likely link to it. That’s generally what site owners do when they want to share a resource with their visitors.
Think of a link to a site as a VOTE of confidence from another site. Links became all the rage: if 30 websites were linking to dental website “A” while its local competitor (dental website “B”) had NO sites linking to it, it made sense that site “A” had more authority or was more desirable. That site would now rank higher (in addition to many other factors, obviously).
Can you guess what happened? Spammers and other studious site owners found ways to ‘build’ links, lots of them, quickly. These were not natural links but links purchased from link farms, for instance (companies setting up lots of fake sites and selling links in bulk).
User Experience: Panda Update in 2011
Google blew away lots of garbage website rankings with its Panda update in 2011. Since the focus had been on manipulation both on the page and off, scores of sites had completely ignored humans in an effort to appeal to a computer ranking algorithm.
Pages were almost unreadable thanks to over optimization, and if that wasn’t bad enough, sites with absolutely no value to readers had been pushed to the tops of search results simply because they had more links (from other ‘no value’ sites) than the next site ranking below it.
The Panda update really alerted site owners to focus on the user experience rather than outguessing Google’s algorithm. It changed the SEO game from memorizing the latest list of rules (keyword density, for instance) to focusing on useful, engaging content.
Any SEO who has spent time studying the backlink profile of a website can easily see the ‘real’ links versus the ‘spammy’ or purchased links. Google figured out that the best way for a site to earn a GENUINE link from a REAL website was through great content that prompts natural linking.
Link Spam: Penguin Update in 2012
Just to cap off what Panda started, the Penguin update took the ‘link problem’ a bit further by targeting paid links with much greater accuracy.
Links are S