Once a bot has crawled a website and collected the necessary information, the pages are added to an index, where they are organized according to their content, authority, and relevance. This makes it much easier for the search engine to return the results that best match our query.
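To make the idea of an index concrete, here is a minimal sketch in Python of an inverted index, the basic data structure behind this step: each word points to the pages that contain it, so a query can be answered without rescanning every page. The URLs and texts are invented for illustration, and real search-engine indexes are vastly more elaborate.

```python
from collections import defaultdict

# Toy collection of crawled pages (illustrative data only).
pages = {
    "https://example.com/coffee-guide": "how to brew coffee at home",
    "https://example.com/tea-guide": "how to brew tea at home",
    "https://example.com/espresso": "espresso is a strong coffee drink",
}

# Build the inverted index: word -> set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

# Answering the query "brew coffee" becomes a simple set intersection.
results = index["brew"] & index["coffee"]
print(results)  # {'https://example.com/coffee-guide'}
```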
Early search engines relied mainly on how many times a word was repeated. They would scan their index to find which pages contained the query terms, ranking highest the page that repeated them most often. Today they are far more sophisticated and base their rankings on hundreds of different signals, such as publication date, whether the page contains images, microformats, and so on, giving much more weight to the quality of the content.
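The sketch below shows, under simplified assumptions, what that early "count the keyword" approach looked like: each page is scored purely by how often the query term appears, so sheer repetition beats quality. The pages are made up for the example.

```python
# Toy pages: one stuffs the keyword, the other is genuinely useful.
pages = {
    "page-a": "coffee coffee coffee buy coffee now",
    "page-b": "a thorough guide to brewing coffee, with photos and tips",
}

def keyword_count(text: str, term: str) -> int:
    # Early-style scoring: just count occurrences of the query term.
    return text.lower().split().count(term)

ranked = sorted(pages, key=lambda p: keyword_count(pages[p], "coffee"), reverse=True)
print(ranked)  # ['page-a', 'page-b'] -- repetition wins, regardless of quality
```

This is exactly the weakness that pushed search engines toward the hundreds of signals mentioned above.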
Once the pages are crawled and indexed, it is the algorithm's turn to act. Algorithms are the computer processes that decide which pages appear higher or lower in the search results: they query the index and work out which pages are the most relevant, taking hundreds of ranking factors into account. All of this happens in a matter of milliseconds.
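As a rough illustration of combining many factors into a single order, here is a hedged sketch of a weighted scoring scheme. The signal names ("relevance", "authority", "freshness"), the values, and the weights are all assumptions made up for the example; real algorithms use hundreds of factors and far more complex models.

```python
# Each indexed page carries a handful of signals (invented values).
pages = {
    "page-a": {"relevance": 0.9, "authority": 0.3, "freshness": 0.8},
    "page-b": {"relevance": 0.7, "authority": 0.9, "freshness": 0.5},
}

# Hypothetical weights for each signal; they must be tuned in practice.
weights = {"relevance": 0.5, "authority": 0.35, "freshness": 0.15}

def score(signals: dict) -> float:
    # Weighted sum of all ranking signals for one page.
    return sum(weights[name] * value for name, value in signals.items())

ranking = sorted(pages, key=lambda p: score(pages[p]), reverse=True)
for page in ranking:
    print(page, round(score(pages[page]), 3))
```

Here the more authoritative page ends up first even though the other repeats the topic more strongly, which mirrors the shift from keyword counting to quality-based ranking.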