By Darren DeYoung
Trying to rank on Google can be frustrating, elusive, and sometimes incomprehensible. There are a lot of websites out there, and somehow this mysterious machine called Google is able to determine which one matters most compared to all the others... but how?
Since its beginning in 1998, Google’s mission has been to “organize the world’s information and make it universally accessible and useful.”
With a vast amount of information available on the web, this is no easy feat and pretty much impossible without some help. That is where Google's ranking systems come into play. The algorithms are designed to sort through billions of web pages and surface the most relevant and useful results in a fraction of a second.
Google doesn’t reveal exactly which factors are most critical to its search algorithms, but through research, testing, and experience, a good SEO learns which ones matter most. Most SEOs would agree that the factors discussed below are near the top of the reported 200+ algorithm ranking factors.
Although each algorithm is set up by humans, the rankings themselves are calculated automatically by a series of algorithms. The website ranked first is the one that receives the best overall score once all ranking factors are taken into account.
To provide the best results, Google first crawls the web and builds an index of its pages. Googlebot, the generic name for Google’s web crawlers, scans each page and comes in two types: a desktop crawler that simulates a user on a desktop computer, and a smartphone crawler that simulates a user on a mobile device.
The Googlebots continuously scan the web and collect information to determine which web pages are most relevant and useful according to their complex algorithms.
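For intuition, here is a minimal, purely illustrative sketch of what crawling and indexing might look like in Python. It is not how Googlebot actually works: the libraries (requests, BeautifulSoup), the seed URL, and the toy word-to-URL index are all assumptions made for the example.

```python
# Illustrative only: a toy crawler and inverted index, nothing like the real Googlebot.
import re
from collections import defaultdict
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup


def crawl(seed_urls, max_pages=20):
    """Fetch pages breadth-first and build an inverted index: word -> set of URLs."""
    index = defaultdict(set)
    links = defaultdict(set)           # page -> outgoing links (useful for PageRank-style scoring)
    queue, seen = list(seed_urls), set()

    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=5).text
        except requests.RequestException:
            continue
        soup = BeautifulSoup(html, "html.parser")

        # Index every word on the page so it can be matched against queries later.
        for word in re.findall(r"[a-z]+", soup.get_text().lower()):
            index[word].add(url)

        # Queue outgoing links, the same way a crawler discovers new pages.
        for a in soup.find_all("a", href=True):
            target = urljoin(url, a["href"])
            links[url].add(target)
            queue.append(target)

    return index, links


# Usage (hypothetical seed URL):
# index, links = crawl(["https://example.com"])
# print(index.get("puppy", set()))
```

Real crawlers also respect robots.txt, render JavaScript, schedule revisits, and store far richer data than a simple word-to-URL map.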
Google Search has immense power. Not only does it need to filter through billions of web pages, it also has to be smart enough to return results that match the search query, which means it must understand the user’s intent. Understanding intent is much more complicated than understanding language, and it is a critical part of Search. From interpreting spelling mistakes (“finance” vs “fiance”) and recognizing synonyms (“pet adoption” vs “animal rescue”) to disambiguating words through natural language processing (“bass” the fish vs “bass” the musical instrument), the algorithm is very advanced and usually returns what we were actually looking for. And sometimes it takes our queries literally: if you search “do a barrel roll”, the results page will rotate 360 degrees before coming to rest.
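As a toy illustration of spelling correction and synonym expansion, here is a short Python sketch. The vocabulary, the synonym table, and the fuzzy-matching approach are all assumptions made for the example; Google’s real query understanding relies on large-scale language models, not hand-written lookups.

```python
# Illustrative only: toy query interpretation with fuzzy matching and a tiny synonym table.
import difflib

# Assumed toy data for the example.
VOCAB = ["finance", "fiance", "pet", "adoption", "animal", "rescue"]
SYNONYMS = {"pet adoption": ["animal rescue"], "animal rescue": ["pet adoption"]}


def correct(word):
    """Snap a word to the closest known vocabulary term, if one is close enough."""
    matches = difflib.get_close_matches(word, VOCAB, n=1, cutoff=0.8)
    return matches[0] if matches else word


def interpret(query):
    corrected = " ".join(correct(w) for w in query.lower().split())
    # Expand with synonyms so "pet adoption" also retrieves pages about "animal rescue".
    return [corrected] + SYNONYMS.get(corrected, [])


print(interpret("pet adopton"))   # -> ['pet adoption', 'animal rescue']
```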
The Google algorithm analyzes the content of web pages to see if any page contains information that is relevant to what the user is looking for. For example, does the web page contain the same keyword(s) as your search query? This would be the most basic signal for determining if a web page is relevant or not.
Many other signals, aggregated through machine learning, are used to determine a web page's relevance. These signals help Search algorithms assess whether a web page actually contains an answer to the search query or merely repeats the question. For example, if you search for “puppy”, you likely don’t want a page with just the word “puppy” on it; you are probably looking for additional content such as pictures of puppies, videos, or information like how to train a puppy.
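To make that concrete, here is a deliberately naive keyword-overlap scorer in Python. It is an assumption-laden sketch, not Google’s relevance model, but it shows why simply repeating a keyword shouldn’t count for much.

```python
# Illustrative only: a naive keyword-overlap score, far simpler than Google's relevance signals.
import re


def relevance_score(query, page_text):
    """Score a page by which query words it contains, with diminishing returns for repetition."""
    query_words = set(re.findall(r"[a-z]+", query.lower()))
    page_words = re.findall(r"[a-z]+", page_text.lower())
    score = 0.0
    for word in query_words:
        count = page_words.count(word)
        if count:
            score += 1 + (count - 1) ** 0.5   # repeating a word 100 times is not 100x as relevant
    return score


# A page that only repeats the query term scores lower than one that actually answers it.
print(relevance_score("how to train a puppy", "puppy puppy puppy puppy"))                          # ~2.7
print(relevance_score("how to train a puppy", "To train a puppy, start with short daily sessions"))  # 4.0
```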
Although matching keywords is important, Google also aims to prioritize the most reliable sources. It identifies many signals that indicate whether a web page demonstrates expertise, experience, authoritativeness, and trustworthiness on a given topic.
The best-known quality signal is whether other prominent websites link to the page (the idea behind PageRank). If more trusted websites link to a given web page, that page is deemed higher quality and more credible.
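For the curious, here is a compact sketch of the classic PageRank iteration as originally published in 1998; the link graph below is hypothetical, and Google’s production systems have long since grown far beyond this simple calculation.

```python
# A minimal sketch of the classic PageRank power iteration (the published 1998 formulation).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * rank[page] / len(targets)
            for target in targets:
                new_rank[target] += share   # a link passes on a share of the linking page's own rank
        rank = new_rank
    return rank


# Hypothetical link graph: pages that trusted pages point to end up with higher rank.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(graph))
```

The key intuition survives in the prose above: a link from a highly ranked page is worth more than a link from an obscure one, because each page passes on a share of its own score.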
It is worth noting that many spam algorithms work to make sure low-quality sites do not rise in the search results. Websites that use deception or manipulative behavior violate Google’s webmaster guidelines and are designated as low-quality spam sites. All algorithms work to find the right balance of information so users can trust the results that are provided.
Even if a web page is relevant and of high quality, it may still rank poorly if it is not usable. The usability of web pages is measured by yet another algorithm that promotes easy-to-use pages over less usable ones, analyzing signals such as mobile-friendliness, page load speed, and whether the page is served over a secure connection (HTTPS).
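As a rough illustration, the sketch below computes two crude stand-ins for such signals, load time and the presence of a mobile viewport tag; the URL is hypothetical and these checks are nothing like Google’s real measurements.

```python
# Illustrative only: crude proxies for usability signals, not Google's actual page-experience metrics.
import time

import requests
from bs4 import BeautifulSoup


def usability_checks(url):
    start = time.monotonic()
    response = requests.get(url, timeout=10)
    load_time = time.monotonic() - start            # rough stand-in for page speed
    soup = BeautifulSoup(response.text, "html.parser")
    has_viewport = soup.find("meta", attrs={"name": "viewport"}) is not None   # crude mobile-friendliness hint
    return {
        "load_seconds": round(load_time, 2),
        "mobile_viewport_tag": has_viewport,
        "https": url.startswith("https://"),
    }


# print(usability_checks("https://example.com"))   # hypothetical URL
```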
Location and context also influence results. Searching for “happy dog” while in White Bear Lake, Minnesota may yield results for a well-known web development agency based there, whereas the same search in Sydney, Australia will likely return images of, well, happy dogs.
Additionally, Google will attempt to personalize search results based on recent search activity. If you are planning a trip to Lake Havasu in Arizona and have recently researched hotels in the area, then a search for “London Bridge” is an important clue that you probably want information on the London Bridge in Lake Havasu City rather than the one in London, England.
If you don’t have time to master all 200+ ranking factors, here are a few basic rules and tips to keep in mind.
If you are stuck and need more questions answered, reach out to Hoist. We help businesses grow by improving their online presence and digital marketing. Google’s ranking systems are complex, but we have successfully helped many others navigate them.
And if you have five more minutes to spare, check out this video on how Google Search works: