SEO 101: How Search Results Are Ordered
October 14, 2021 | Posted in: Search
It may or may not be something you’ve thought about: the websites listed on the first page of Google are there for a reason. They are not ordered at random, nor is someone at the Googleplex hand-picking sites for every possible search query.
As a business owner, it’s easy to imagine the value of your website appearing at or near the top when a potential customer is looking for products and services like yours. After all, isn’t that how you find someone to hire? Products to buy?
Introducing: Google’s algorithm
Algorithms can sound like an incomprehensible concept, but an algorithm is simply a set of rules for solving a problem or accomplishing a task. Algorithms range in complexity from very simple scripts to computer-driven methods beyond human comprehension.
Here are some examples:
Goal: Get the clothes as clean as possible without shrinking them
Algorithm to accomplish the task:
- When I wash delicate clothing, set the washing machine to Cold
- When I’m washing towels and regular items, set it to Warm
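The laundry rules above are already enough to write down as code. Here is a minimal sketch of that two-rule algorithm (the function name and setting labels are our own, purely for illustration):

```python
def washer_setting(load_type: str) -> str:
    """Pick a washing-machine temperature for a load of laundry."""
    if load_type == "delicates":
        return "Cold"  # avoid shrinking delicate clothing
    return "Warm"      # towels and regular items

print(washer_setting("delicates"))  # Cold
print(washer_setting("towels"))     # Warm
```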
Less Simple Task:
Cooking a full course meal
Goal: Serve four different dishes at the correct temperature at the same time
Algorithm to accomplish the task:
- Pick a serving time for the meal
- Read the packaging on any frozen items
- Check how long the recipe specifies to cook them
- Set them out to thaw in time to go into the oven
- Preheat the oven
- Start preparing the cold items like salads
- Prepare any items that will be cooked in the oven
- When the oven reaches the proper temperature, put in oven-cooked items in order of longest cooking time to shortest cooking time
And so on…
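The cooking steps above boil down to working backward from the serving time: the dish with the longest cooking time goes in the oven first. A toy sketch of that scheduling idea (the dishes and cooking times are invented):

```python
from datetime import datetime, timedelta

def oven_schedule(serve_at: datetime, cook_minutes: dict) -> dict:
    """Work backward from the serving time: each dish starts cooking
    just early enough to finish exactly at serve_at, so the dish with
    the longest cooking time goes in first."""
    return {dish: serve_at - timedelta(minutes=m)
            for dish, m in cook_minutes.items()}

serve = datetime(2021, 10, 14, 18, 0)  # dinner at 6:00 PM
times = {"roast": 90, "casserole": 45, "rolls": 15}
for dish, start in sorted(oven_schedule(serve, times).items(),
                          key=lambda kv: kv[1]):
    print(f"{start:%H:%M}  put in the {dish}")
```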
Seemingly Impossible Task:
Generating search results for any query a person can come up with
Goal: When someone types in any combination of words, abbreviations, and numbers, deliver a full page of relevant web pages for any keyword(s) in less than a tenth of a second
Google’s problem happens to be a lot more complex than laundry or cooking. The list of rules and steps required is well beyond human comprehension. Their algorithm is built and modified on an ongoing basis using “machine learning”: human beings fine-tune code that effectively rewrites itself to become more effective over time. Rather than a simple list of steps, the Google algorithm is a flexible, ever-changing computerized process. But if we break down how Google fundamentally works, their algorithm is at least something a person can envision:
Google needs to keep a copy of every page on every website on the entire internet
What sounds impossible is surprisingly doable with the power of computers. A program called Googlebot spends its days crawling the web and saving a copy of every page it visits to the index.
Crawling means that Google’s program clicks on every link or button it sees, then opens up the destination of that link. By doing this, Googlebot continually finds new pages to add to the Google index.
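In miniature, crawling is just a loop: fetch a page, collect its links, and queue any pages you haven’t seen yet. A toy sketch over a hard-coded link graph (a real crawler fetches pages over HTTP; the page names here are invented):

```python
from collections import deque

# A pretend website: each page maps to the pages it links to.
LINKS = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["products/jacket", "home"],
    "products/jacket": [],
}

def crawl(start: str) -> list:
    """Breadth-first crawl: visit every page reachable from `start`."""
    seen, queue, index = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        index.append(page)            # "save a copy" of the page to the index
        for link in LINKS.get(page, []):
            if link not in seen:      # only queue pages we haven't visited
                seen.add(link)
                queue.append(link)
    return index

print(crawl("home"))  # ['home', 'about', 'products', 'products/jacket']
```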
The index or indexing refers to a massive chunk of data used by Google’s algorithm. Think of it like a library. Anytime someone types in a search query, Google can look through its library and hand them a stack of results. There are currently billions of pages in their index (and it is growing larger every day).
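The “library” being searched is essentially an inverted index: a map from each word to every page that contains it, so lookups are fast no matter how many pages exist. A toy version (the pages and their text are invented):

```python
# A pretend index of two pages and their text.
PAGES = {
    "fruit-stand": "fresh apple and pear prices",
    "tech-news": "new apple phone released",
}

def build_index(pages: dict) -> dict:
    """Map each word to the set of pages whose text contains it."""
    index = {}
    for page, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(page)
    return index

index = build_index(PAGES)
print(sorted(index["apple"]))  # ['fruit-stand', 'tech-news']
print(sorted(index["phone"]))  # ['tech-news']
```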
Googlebot crawls the average small business website every 3–4 weeks, but SEOs can use special tools to request additional crawls as needed. For example, if you added a new product or service page the day after Googlebot visited, it could take weeks for it to appear in the index (and search results) without a manual request.
Did you know? Googlebot actually uses an additional algorithm to determine how often your site needs to be crawled.
Highly authoritative, frequently updated sites like cnn.com get visited by Googlebot multiple times per day, while a lawn care service in Nebraska will likely only get one crawl per month. In the SEO industry, we refer to this as your website’s “crawl budget.”
Mobile vs Desktop
A relatively new feature at Google is Googlebot’s use of a “secondary crawler” on your site. In most cases, the primary crawler looks at your website through the mobile version of Chrome while the secondary crawler views the desktop format. Pages that don’t perform well on both formats tend to appear lower in search results, especially on the type of device that saw errors: if your website performs poorly on a mobile crawl, it will appear lower in mobile results than on desktop.
Keeping Unwanted Pages Off Google
SEOs can use code snippets called “robots meta directives” to specify which pages of a website should or should not be indexed. If you have a page on your site that you prefer not to show up in results, a directive can tell Googlebot to keep it out of consideration. There are many different scenarios where meta directives come into play:
- On e-commerce sites, the “shopping cart” page is rarely a great place for a searcher to start. In most cases it is helpful to block Google from that page and instead have them show your homepage, shop page, or anything else that is packed full of useful content.
- If your site has multiple variations of a single page, it is a best practice to let Google know those pages are grouped together. If you offer one product in 4 sizes and 8 colors, that could look to Googlebot like 32 different pages. In reality, it’s just one page with selectable options. SEOs implement “canonicalization” in your website’s source code to tell Googlebot which version of the page is the primary one.
- Duplicate content is a big concern for SEOs. Anytime Googlebot perceives multiple webpages as being nearly identical, it chooses one and leaves the others out of search results. Properly canonicalized content shows Googlebot that you have different versions of a page and informs it which one you want shown in results.
- Canonicalization also helps consolidate link and quality signals to one URL!
- In this scenario, a page like /products/champion-jacket/ is preferable in search results versus allowing Google to guess and potentially choose something like /products/champion-jacket/size-XXL/color-yellow/ when someone searches “medium black champion jacket”
- Another reason to direct Googlebot away from certain pages is if you have pages that are only for paid or logged-in users and would otherwise appear blank. Including a meta directive will keep Google from including those pages in search results.
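The directives discussed above are ordinary tags placed in a page’s HTML head. For example (the URL is a placeholder):

```html
<!-- Keep this page out of Google's index entirely -->
<meta name="robots" content="noindex">

<!-- Tell Googlebot which version of a page is the primary one -->
<link rel="canonical" href="https://example.com/products/champion-jacket/">
```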
Google needs to understand the basic purpose of every page in the index
The Google Webmasters site and other Google press releases alert the public to changing rules and methodology, but Google doesn’t actually publish which factors they use or how those factors are weighted when it comes to picking which sites show up first. They are very cautious about opening the door to manipulation: scammers have long sought to outrank legitimate sites for the purposes of stealing credit card info and spreading misinformation. Google holds their cards close for that reason.
Google discloses that they use hundreds of factors in evaluating what types of keywords a given web page is relevant to, but they do not publicize that list and instead encourage users to focus on providing them with great content on fast-loading and technically-sound websites.
Because Google’s algorithm changes on an ongoing basis, the job of an SEO Strategist does as well!
By studying Google’s technical requirements and content recommendations, SEOs are able to deduce a significant number of factors included in the algorithm. Additionally, ongoing testing and development can reveal which elements will push a web page higher in results and which will lower its positioning.
Google needs to recognize the intent of each search query and deliver the results in order of relevance
After sorting through all of the pages across the Internet that could possibly meet the searcher’s needs, Google essentially has to guess which of those relevant pages are the best fit. To do this, they use the behavior of past searchers and little clues in the search term itself.
Think about searches that include the word “apple.” Users might be looking for a new phone, somewhere they can buy fruit, or a stock quote. How can Google tell the difference?
For each of these types of terms, Google has built on past experience. If you search for “apple price,” Google is quite confident you want the stock price so the results will start with that. However, the page quickly transitions to a handful of pages selling iPhones. You won’t find the cost per bushel of the fruit until somewhere around page 3 or 4. If the majority of “apple price” searchers started clicking all the way through to page 4 before visiting a page about fruit prices, that may soon change…
User behavior is used to improve the algorithm
The way we interact with each search result page is used by Google to measure the quality of its results. Google tracks our scrolling, clicks, and time spent on different websites to measure how good a candidate each result is for a given topic. Using that data, the algorithm constantly improves itself with minor tweaks.
For example, there are millions of pages on the internet dedicated to selling dresses. Most of those websites sell prom dresses. If the searcher types in “cheap prom dresses,” Google tries to predict which of those websites are the most suitable. If prior searchers looking for “cheap” or “inexpensive” dresses tended to hit the Back button quickly after visiting [site A], but spent quite a bit of time on [site B], Google will likely reshuffle those results to list the second site above the first site. Even though [site A] might have more dresses or a better website, Google’s ultimate goal is to send searchers to pages they will like.
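That reshuffling can be pictured as a simple re-sort by how long searchers stayed on each result. This is purely an illustrative sketch, not Google’s actual method; the site names and numbers are invented:

```python
# Average seconds searchers spent on each result before hitting Back.
dwell_time = {"site-a.example": 8, "site-b.example": 95, "site-c.example": 40}

# Re-rank: pages that held searchers' attention longest rise to the top.
reranked = sorted(dwell_time, key=dwell_time.get, reverse=True)
print(reranked)  # ['site-b.example', 'site-c.example', 'site-a.example']
```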
Google customizes the results for each user
Personalized results go beyond traditional ranking factors and further adjust the results using factors like:
- Where the searcher is physically located
  - Businesses and content closer in proximity to the user will be given additional weight
- The types of searches the user previously entered and the web pages they visited
  - Google will start to make assumptions about the topics you are likely to be interested in
- Which results the user has recently clicked on
  - Google uses your history to decide which websites you like visiting and also which results do not seem to be answering your questions
- What type of device the searcher is using
  - Search results change by as much as 30% between a computer and a mobile phone
- Information from other apps such as Gmail, Drive, and Google Play
  - By reading your emails and files, Google can use details like the dates and location of your next vacation when deciding which results to show you