A Quick History of SEO Algorithms

Search Engine Optimization (SEO) is something every website owner participates in by default, whether they realize it or not, from the second they launch a website.

[Image: Search Engine Optimization Algorithms]

Whether site owners know it or not, search engines relentlessly scan the web looking for the pages we post to our sites. These crawlers, commonly called “spiders,” essentially make a copy of your site’s pages and then try to determine what each page is about based on its content and the HTML tags that support that content.
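To make this concrete, here is a toy sketch (in Python’s standard library, not anything resembling Google’s actual crawler) of the kind of on-page signals a spider pulls out of a page’s HTML: the title, the meta description, and the headings. The page content is invented for illustration.

```python
# Toy illustration of what a "spider" reads first from a page's HTML.
from html.parser import HTMLParser


class PageSummary(HTMLParser):
    """Collects a few on-page signals an indexer typically weighs."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.headings = []
        self._in = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")
        elif tag in ("title", "h1", "h2"):
            self._in = tag

    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in in ("h1", "h2"):
            self.headings.append(data.strip())

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None


page = """<html><head><title>Smith Family Dentistry</title>
<meta name="description" content="Gentle dental care in North Georgia.">
</head><body><h1>Welcome to Smith Family Dentistry</h1></body></html>"""

parser = PageSummary()
parser.feed(page)
print(parser.title)        # Smith Family Dentistry
print(parser.description)  # Gentle dental care in North Georgia.
print(parser.headings)     # ['Welcome to Smith Family Dentistry']
```

A real crawler does far more (rendering, link extraction, canonicalization), but the starting point is the same: your tags tell it what the page claims to be about.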

If the algorithm likes your page better than your competitors’ pages, your site goes to the top of the rankings for any number (hundreds, even thousands) of keyword phrase variations that relate to what your site is about and what it offers its visitors.

It’s easy to see how it is in your best interest as a site owner to make sure these elements are in line with the search engines’ ranking rules. A site that reflects an understanding of what the search engines are looking for, and how they rank, will attract more visitors (potential customers) than a competitor’s site that doesn’t.

Even if we took NO strides toward making our pages more ‘friendly’ for search engines, odds are good we’d have a page or two rank for SOMETHING somewhere along the way (the company name, for instance). Your company name is simply not a competitive phrase, right? It’s more than likely going to pop up because no one else is trying to rank for it.

However, ‘active’ SEO is the ongoing study, effort and practice of influencing a search engine (or search engines) to rank a website’s pages higher than the competition. A consistent and well-planned SEO strategy becomes truly necessary when we are tackling more competitive keywords, as well as casting the widest net possible with content marketing and backlink building.

I’d go one step further and add that Google is SMARTER today than ever (and getting smarter each year). We don’t have to ‘trick’ Google; we just have to give it a reason to LOVE our site’s content.

In fact, unlike the old days of black hat, risky manipulation, SEO is really about building the best experience for our VISITORS (humans, not ranking algorithms), with the idea that those visitors will engage (and hopefully link to our pages), prompting Google to rank our sites higher over time.

In the old days (the early 2000s), we could stuff tons of keywords into a page and then not touch it for months while it climbed the ranks and drove new traffic. Now we have to think of websites as living documents. It’s been suggested (and I have a case study showing this) that leaving a website untouched is a bit like putting a horse to pasture. Google doesn’t want to show your old horse to its customers!

Keyword Stuffing:  SEO before 2003

What do I mean by ‘influence’ exactly? To oversimplify, search engines rank your site’s pages based on what YOU say each page is about (the optimization of your pages and the keywords you use), AS WELL AS what OTHER people say your page is about (the number of sites that link to those pages).

[Image: Keywords in HTML]

Search engines will check both on-page AND off-page factors together. These days, they will check other technical factors too (like site speed, secure certificates, etc) but let’s stick with this manipulation theme for a moment.

Since the search engines don’t tell site owners exactly what their rules are (they do provide guidelines), site owners have always had to guess most of the time. They would (and still do) arrive at a hypothesis by observing sudden changes in traffic volume and rankings while comparing notes with others doing the same. When multiple sites reflect similar patterns, a genuine update has likely just taken place.

It didn’t take site owners long to figure out that if they put certain words in their web pages many more times than their competitors did, they could rank higher than those competitor sites. They could effectively ‘stuff’ a page with far too many keywords, and at the time that was very effective.

I can remember cramming 14 variations of a core service keyword phrase into a home page title and meta description. That’s what you did at one time as an SEO professional, because it worked, and increased rankings almost always followed in a short amount of time.

It really didn’t matter what other sites linked to yours; it only mattered what YOU said the page was about with your keywords. The more the better, and you jumped right to the top of the results pages because your competitors were often not taking the time to practice the same stuffing technique.
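The metric being gamed here was essentially keyword density. Below is a back-of-the-envelope sketch of that calculation; the function name and the stuffed sample text are my own illustration, not any real engine’s formula.

```python
# Rough keyword-density check: occurrences of a phrase per 100 words.
# Early engines rewarded inflating this number; modern ones penalize it.
import re


def keyword_density(text, phrase):
    """Return occurrences of `phrase` per 100 words of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 100.0 * hits / len(words)


stuffed = ("cheap dental implants cheap dental implants "
           "best cheap dental implants in town cheap dental implants")
print(round(keyword_density(stuffed, "cheap dental implants"), 1))  # 26.7
```

A 26.7% density like the one above is exactly the sort of unreadable over-optimization that the updates described next were built to catch.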

Needless to say, the Google search engine folks didn’t like what was happening with all this keyword stuffing. Simple optimization and manipulation meant that lots of spammy sites were infesting the search results. Oftentimes, it was these spam-based site owners who were experimenting the most with these techniques.

Thus came the first search engine UPDATE of significance, called the ‘Florida’ update (Google used to name updates…not so much these days.)

Authority:  SEO between 2004 and 2011

Google’s answer with the Florida update was to change the weight of what made a website rank. No longer was it enough to stuff pages with keywords in order to manipulate rankings.

[Image: SEO Florida Update]

If a page was truly a ‘good’ and ‘reliable authority’ on a certain topic, it followed that other sites would likely link to it. That’s generally what site owners do when they want to share a resource with their visitors.

Think of a link to a site as a VOTE of confidence from another site. Links became all the rage: if 30 websites were linking to dental website “A” while its local competitor, dental website “B,” had NO sites linking to it, it made sense that the “A” dental site might have more authority or be more desirable. That site would now rank higher (in addition to many other factors, obviously).

Can you guess what happened? Spammers and other studious site owners found ways to ‘build’ links, lots of them, quickly. These were not natural links, but links purchased from link farms, for instance (companies setting up lots of fake sites and selling links in bulk).
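The “votes” idea is the intuition behind Google’s original PageRank algorithm. Here is a toy power-iteration version of it on the dental example above; the real algorithm adds scale, personalization, and the anti-spam refinements this sketch deliberately omits.

```python
# Toy PageRank: each page splits its score among the pages it links to.
def pagerank(links, iters=50, d=0.85):
    """links: {page: [pages it links to]}. Returns page -> score."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}  # baseline "random jump"
        for p, outs in links.items():
            if not outs:
                continue  # dangling page: its vote is simply dropped here
            share = d * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank


# Dental site A earns links from three other sites; site B earns none.
web = {
    "dentistA": [],
    "dentistB": [],
    "blog1": ["dentistA"],
    "blog2": ["dentistA"],
    "newspaper": ["dentistA"],
}
scores = pagerank(web)
print(scores["dentistA"] > scores["dentistB"])  # True
```

You can also see from this sketch why link farms worked for a while: fabricating a few hundred “blog” nodes pointing at your page inflates its score, until the engine learns to discount where the votes come from.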

User Experience:  Panda Update in 2011

[Image: SEO Panda Update]

Google blew away lots of garbage website rankings with its Panda update in 2011. Since the focus had been on manipulation both on the page and off, scores of sites had completely ignored humans in an effort to appeal to a computer ranking algorithm.

Pages were almost unreadable thanks to over optimization, and if that wasn’t bad enough, sites with absolutely no value to readers had been pushed to the tops of search results simply because they had more links (from other ‘no value’ sites) than the next site ranking below it.

The Panda update really pushed site owners to focus on the user experience rather than on outguessing Google’s algorithm. It changed the SEO game from memorizing the latest list of rules (keyword density, for instance) to focusing on useful, engaging content.

Any SEO who has spent time studying the backlink profile of a website can easily tell the ‘real’ links from the ‘spammy’ or purchased ones. Google figured out that the best way for a site to earn a GENUINE link from a REAL website was for that site to attract natural links with great content, so that is what its algorithm began to reward.

Link Spam:  Penguin Update in 2012

Just to cap what Panda started, Penguin took the ‘link problem’ a bit further by targeting paid links with much greater accuracy.

[Image: SEO Penguin Update]

Links are STILL a huge part of Google’s ranking algorithm, however, the best links will be natural and unsolicited if Penguin has anything to say about it.

Panda and Penguin are still working today, and they are still evolving. However, as I try to explain to my SEO clients, my services have nothing to do with secret handshakes with Google. There is no magic button to push these days, and if there is, it isn’t worth the effort and the risk of being caught and penalized to almost zero site traffic overnight, at least not for a legitimate business.

SEO has become a mixture of so many elements today that you REALLY need to work regularly to improve a number of factors. Always make it better, or the competition will leave your site further down in the rankings.

What are the factors? The exact formula isn’t published anywhere I know of, but here are a few things to consider. I know what has worked with my own business sites and what hasn’t, so I can vouch for these.

    • Do other sites link to yours? If no other sites have taken the time to link to yours, you must not have much authority. If you did, people would consider your site as a leader in the industry and important enough to share with their own visitors.
    • Are they quality sites? Do they make sense? For instance, does the mortgage site have tons of real estate agent sites linking to it, or are they random sites that have nothing to do with the mortgage industry (like a blog about hair products)?
    • Does your site appear safe (HTTPS?) or is Google Chrome flashing that nasty message about it being ‘insecure’ and untrustworthy? Is it fast enough to keep users moving from page to page or are they leaving a slow loading site because the experience is miserable?
    • Is the content you put on your site built with your visitor in mind? Every piece of content you add to your site should have a purpose. Every piece of content plays a specific role in the lead funnel that is your website.
    • Are there good points made? Is there something to learn?
    • Is your content keeping people on the page? Any video? Are there visuals to complement? If people are immediately leaving your page (and site) then Google will know (analytics) and the rankings will not be in your favor.
    • If your site isn’t mobile friendly (responsive) in 2019, you might be a bit behind the eight ball. Each year, more and more of the sites I track show a higher percentage of mobile users than the year before. The pinch-and-zoom technique is no longer acceptable for mobile users with tiny screens!
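A few of the factors above (HTTPS, a title tag, a mobile viewport) are mechanical enough to check automatically. Here is a minimal sketch of such an audit; the function name, the checks chosen, and the sample page are my own shorthand, not Google’s actual scoring.

```python
# Minimal on-page health check: HTTPS scheme, <title> tag, mobile
# viewport meta. A real audit tool checks dozens more signals.
from urllib.parse import urlparse


def quick_audit(url, html):
    """Flag a few basic health signals for a page at `url`."""
    lowered = html.lower()
    report = {
        "https": urlparse(url).scheme == "https",
        "has_title": "<title>" in lowered,
        "mobile_viewport": 'name="viewport"' in lowered,
    }
    report["passed"] = sum(report.values())  # count of checks that passed
    return report


page = ('<html><head><title>Hi</title>'
        '<meta name="viewport" content="width=device-width"></head></html>')
print(quick_audit("https://example.com", page))
```

Checks like these won’t rank a site by themselves, but failing them is the kind of miserable-experience signal (insecure warnings, pinch-and-zoom pages) that sends visitors, and eventually rankings, elsewhere.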

RankBrain:  Understanding Search Meaning in 2015

One of the most important changes in recent years to search, and to how Google ranks, was the introduction of an algorithm called RankBrain. As I’ve mentioned previously, Google originally awarded high rankings to the pages that had the most keywords in the right places for a given search. Simply put, if you typed the word “hammer” into Google and my site’s ranking page mentioned the word 4 times more than my closest competitor’s, I’d rank higher. You ‘ranked’ on Google in a universal sense.

However, a search engine needs to be smarter than this in order to serve the best results.  What if it could actually learn on the fly to make searches better for each individual?  RankBrain looks at many factors to customize the results it thinks might make the most sense (rather than who has optimized pages the best.)  

For instance, think about a word like “hammer”, as SEO expert Bruce Clay pointed out in a podcast I heard recently.  Words can have different meanings, different “intent” based on things like your previous search history (in your own browser) or where you are searching from in that instant.

If you type the word “hammer” into Google, you might logically expect to find information about a hammering tool.  In North Georgia where I live, odds are good that if I search for this word I’m looking for a tool to buy at the local Home Depot or Lowes.  Or, maybe I just want to learn about the types of hammers that exist today before I even shop for the hammer I need for my next home project.

However, if you live in Los Angeles, odds are good that many people are actually looking for the Hammer Museum. In New York, people could be looking for the Hammer Galleries, and so on.
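The “hammer” example can be sketched as a tiny intent-aware re-ranker: the same query returns a different top result depending on a location signal. Everything here, the candidate results, the per-city intent priors, and the scoring, is invented for illustration; RankBrain’s real machine-learned model is vastly more sophisticated.

```python
# Toy intent personalization: one query, different results by location.
RESULTS = {
    "Local hardware store: hammers": {"topic": "tool"},
    "Hammer Museum (Los Angeles)": {"topic": "museum"},
    "Hammer Galleries (New York)": {"topic": "gallery"},
}

LOCAL_INTENT = {  # assumed prior: what "hammer" usually means per place
    "North Georgia": "tool",
    "Los Angeles": "museum",
    "New York": "gallery",
}


def rank_for(query, location):
    """Order candidates so the locally likely intent comes first.

    In a real engine `query` would select the candidate set; here the
    candidates are fixed and only the location signal re-orders them.
    """
    want = LOCAL_INTENT.get(location, "tool")
    return sorted(RESULTS, key=lambda r: RESULTS[r]["topic"] != want)


print(rank_for("hammer", "Los Angeles")[0])   # Hammer Museum (Los Angeles)
print(rank_for("hammer", "North Georgia")[0]) # Local hardware store: hammers
```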

The lesson to take from this 2015 update is that search engines are getting smarter all the time. We need to be in the business of providing value to people, not trying to figure out how to manipulate search engines into higher rankings. If we take care of our target audience, the search engines will take care of our visibility.

The Modern SEO “Pie”

To reiterate from the last section, SEO these days has nothing to do with trying to manipulate or out-guess algorithms in the short term.

Working with businesses on SEO projects, we essentially try to create the best, most user friendly and useful sites we can for the types of visitors THEY want to attract.

When we put our focus on making a website the best, most informative site that people can’t help but come back to again and again, good things (and rankings) often seem to follow from Google.

When we focus on the following items on a regular basis, the process becomes very logical. Presumably, a site that is easy to read, easy to use, and provides information that I need is good for me as a visitor.

It’s a site that I will not only want to revisit, but one where I’ll likely spend more than an average amount of time on its pages (and visit lots of them, as Google can more than likely see from its analytics data), and perhaps link to it at some point from one of my own sites.

When Google sees that behavior from me as a visitor on that particular website (as well as from other visitors like me), it’s likely to want to share that site (rank it well) with other people searching for the same things we keep returning to it for!

[Image: Search Engine Market Share 2019]