When search engines first started, they determined whether a webpage deserved the #1 position for a search query based on how many times that page contained the search terms.
Remember any websites from back in the day that would have hundreds of keywords at the bottom of the page? Sometimes they would even be in white text on a white background.
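In spirit, that early approach amounts to something like the toy sketch below. The page names, text, and query are all made up for illustration, and real engines were more involved than this, but the core idea was "more keyword matches, higher rank":

```python
# Toy sketch of early keyword-frequency ranking (all sites and text are hypothetical).
def keyword_score(page_text: str, query: str) -> int:
    # Count how many times each query term appears in the page (punctuation ignored for simplicity).
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

pages = {
    "honest-widgets.com": "We sell quality blue widgets and ship them fast",
    "spammy-widgets.com": "blue widgets " * 200,  # keyword stuffing wins under this scheme
}

query = "blue widgets"
ranked = sorted(pages, key=lambda url: keyword_score(pages[url], query), reverse=True)
print(ranked)  # the stuffed page comes out on top
```

You can see why keyword stuffing worked: the scoring function rewards raw repetition with no notion of quality at all.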
Google came along and recognized that ranking websites based on how many times they used a keyword just encouraged them to repeat their keywords over and over. And Google wanted better results for its users.
So Google came up with a system of ranking websites based on a factor the websites couldn't control themselves (or at least Google believed they couldn't).
They declared that the internet was a democracy, and each website got to "vote" for other sites (by linking to them).
The theory was that you wouldn't be able to get other websites to link to you unless you were a quality website.
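As a simplified illustration of that "one site, one vote" idea (this is not Google's actual algorithm, and the link graph below is invented), you could rank pages by how many distinct sites link to them:

```python
# Toy "one site, one vote" ranking: count distinct domains linking to each page.
# The link graph below is entirely made up for illustration.
links = [
    ("blog-a.com", "widgets.com"),
    ("blog-b.com", "widgets.com"),
    ("blog-c.com", "gadgets.com"),
]

votes = {}
for source, target in links:
    votes.setdefault(target, set()).add(source)  # each linking site counts once

ranking = sorted(votes, key=lambda site: len(votes[site]), reverse=True)
print([(site, len(votes[site])) for site in ranking])
# [('widgets.com', 2), ('gadgets.com', 1)]
```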
Turns out they were wrong. A new era of search engine manipulators was born.
They used automated tools that would create forum accounts, blogs on free blogging platforms, blog and forum comments, and so on, all automatically. And all of these would contain links to their websites.
Again, spammers found a way to get the better of Google, but it wouldn't last forever. If Google wanted to stay #1 in search, it needed to come up with ways to prevent poor-quality content from being manipulated to the top of its search engine.
Or at least make it harder to do so.
Google then decided that giving each website an equal vote was a bad system, and that websites with more sites linking to them should cast a more powerful vote than websites with few or no sites linking to them.
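That recursive idea, where a link is worth more when the linking page is itself well linked, is roughly what PageRank formalizes. Below is a minimal power-iteration sketch; the graph, damping factor, and iteration count are illustrative assumptions, not Google's real formula:

```python
# Simplified PageRank-style iteration: a link's "vote" is weighted by the
# importance of the page casting it. Graph and damping factor are made up.
graph = {
    "a.com": ["c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

damping = 0.85
rank = {page: 1.0 / len(graph) for page in graph}

for _ in range(50):  # iterate until the scores roughly stabilize
    new_rank = {page: (1 - damping) / len(graph) for page in graph}
    for page, outlinks in graph.items():
        share = damping * rank[page] / len(outlinks)  # split this page's vote among its links
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: kv[1], reverse=True))
# c.com ends up highest: it receives votes from two pages, one of which is itself well linked
```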
That should stop the spammers right?
Nope! They evolved to spam even more. They realized that multiple layers, or tiers, of spamming was the solution. If they made a webpage that linked to the site they were trying to rank, and its vote carried no weight because nothing linked to it, they would auto-create even more spammy sites linking to that page, and more linking to those, and so forth.
Google again had to come up with a way of stopping this ever-increasing mass of spammy and spam-powered websites from showing up in its hugely profitable search engine.
Aha! We have a solution. We will start valuing websites by how trustworthy they are, and we can evaluate how trustworthy they are based on which trustworthy sites link to them. But how do you define trustworthy sites?
Well, Google isn't revealing its secret sauce, but the most credible theory is that Google picked a handful of very trustworthy sources manually.
These likely included government websites, college websites, and big, credible news organizations.
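One published formalization of this idea is TrustRank: hand-pick a seed set of trusted sites, then let trust flow outward along their links, decaying with each hop. Here is a rough sketch; the seed site, link graph, decay factor, and hop count are all made-up assumptions for illustration, not a description of what Google actually runs:

```python
# Rough TrustRank-style sketch: trust starts at hand-picked seed sites and
# decays as it propagates along outgoing links. All values are illustrative.
graph = {
    "state.gov": ["university.edu", "news-site.com"],
    "university.edu": ["research-blog.com"],
    "news-site.com": ["research-blog.com", "spam-farm.net"],
    "research-blog.com": [],
    "spam-farm.net": [],
}

seeds = {"state.gov": 1.0}   # manually vetted, fully trusted
decay = 0.85                 # each hop passes along only part of the trust

trust = dict(seeds)
frontier = dict(seeds)
for _ in range(3):  # propagate a few hops outward from the seeds
    next_frontier = {}
    for page, score in frontier.items():
        outlinks = graph.get(page, [])
        if not outlinks:
            continue
        share = decay * score / len(outlinks)  # split the decayed trust among outgoing links
        for target in outlinks:
            next_frontier[target] = next_frontier.get(target, 0.0) + share
            trust[target] = trust.get(target, 0.0) + share
    frontier = next_frontier

print(sorted(trust.items(), key=lambda kv: kv[1], reverse=True))
# sites closer to the trusted seed accumulate more trust than distant ones
```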
But then people found out how to ....