From its beginnings as a student project at Stanford University called BackRub to its present status as a search engine so dominant that its name is both a noun and a verb, Google’s path has been one of growth and constant adaptation.
Examining that history provides some interesting insights into what the world’s most popular search engine is likely to look like in 10 years.
Why 10 years? Because that span covers almost the entire life of Google since its first major algorithm update, “Boston,” in 2003. Ten years from now, the search engine will probably look very different than it does today, serving needs in everyday life far beyond the family computer.
Google search history
In 1995, when Larry Page and Sergey Brin first met and began collaborating on the search engine that would eventually become Google — then called BackRub — it’s doubtful they could have imagined the size and power of the company that would result from those efforts.

By 1999, the small company had outgrown its garage beginnings and set up its first real office in Palo Alto with eight employees. Things got really interesting in 2002, with the Google Search Appliance, an overhaul of AdWords, Google Labs, Google News, and the first set of Google APIs for developers.
Not surprisingly, the following year saw the first major Google algorithm update, designed to thwart SEOs who, up to that point, had been quite successful at pushing landing pages to the top of Google results through keyword stuffing and massive backlink-building campaigns.
Boston curbed the backlink game somewhat and began an ongoing battle between the designers of Google’s algorithm, working toward better results, and webmasters working to push their sites and pages as high as possible in those results.

The evolution of this algorithm, beyond its efforts to thwart SEO gaming, actually reveals a lot about the vision of Google’s search planners and where they have led the company so far. Here is a breakdown of the major updates that offer this insight.
- Brandy (2004) — Latent Semantic Indexing (LSI), a mathematical technique for identifying relationships between concepts within a body of text (a minimal sketch of the idea follows this list).
- Personalized Search (2005) — This update used the user’s search history to influence search results.
- Google Local (2005) — Local business data integrated into Google Maps.
- Universal Search (2007) — News, video and local results integrated with organic search results.
- Real-time search (2009) — Social content, such as tweets, integrated into search results in real time.
- Caffeine (2010) — Overhauled indexing to improve the freshness of search results.
- Search, plus Your World (2012) — Google+ content and authorship integrated into search results.
- Venice (2012) — Better local results for broad queries.
- Knowledge Graph (2012) — Displays information and an image related to your search term next to regular results.
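To make the LSI idea behind Brandy concrete, here is a minimal sketch, not Google’s actual implementation: it builds a small term-document matrix, takes a truncated singular value decomposition, and compares documents in the reduced “concept” space. The toy corpus and all names are invented for illustration.

```python
# Minimal LSI sketch: not Google's implementation, just the core idea.
# Build a term-document count matrix, take a truncated SVD, and compare
# documents in the reduced "concept" space. Requires only numpy.
import numpy as np

docs = [
    "google search engine ranks web pages",
    "search engine results list web links",
    "recipe for apple pie and cake",
]

# Vocabulary and term-document matrix (rows = terms, columns = documents).
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(t) for d in docs] for t in vocab], dtype=float)

# Truncated SVD: keep only the k strongest "concepts".
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # each row: one document in concept space

# Cosine similarity in concept space: the two search-related documents
# end up close to each other and far from the recipe document, even
# though they do not share every keyword.
def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cos(doc_vecs[0], doc_vecs[1]))  # high: both about search
print(cos(doc_vecs[0], doc_vecs[2]))  # low: unrelated topics
```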
None of these updates were meant to thwart black-hat SEO; instead, they focused on developing the algorithm to deliver information in a whole new way, whether by integrating information about the user doing the search or by using something like the Knowledge Graph to predict what the user really intends to find.
Predictive search results
The science of the search algorithm has applications far beyond the web search page itself. Factor in the plethora of mobile devices and the IoT movement, and using a search algorithm to surface exactly the right data becomes even more critical. Google is perhaps best positioned for this, especially considering that it has already created one of the most popular mobile platforms on the planet, Android.

Past efforts to fold information about users and their behavior into search results point to a future in which artificial intelligence will be used to predict more accurately what a user wants to know, even before the question is asked. Google Now is a good example of this early generation, and Google Glass is a prime example of an alternative search delivery system. There is a growing movement toward augmented reality, and Google seems ready to take advantage of it.
In an interview with the BBC, Amit Singhal, Head of Google Search, explained it this way:
“For a computer, understanding means that when you ask about something, it can tell you much more about it. For the Taj Mahal, for example, it will be able to tell you that it is a monument, where it is located, and what you should know about it if you are interested, rather than just returning a bunch of links.”
In other words, in the near future, by knowing where you are and drawing on your past web searches, shopping, and travel, Google will be able to gauge what you probably want to know and provide that information before you even open your mouth to ask. One day that information could be delivered on a heads-up display built into a pair of sunglasses, a wristwatch, or even special electronic contact lenses.
Personalized information
Google’s long-running efforts to improve local search results suggest that the search team’s future goal is to provide personalized data based on your location and current activity.

This kind of update, starting with local business results back in 2005, has continued to evolve in the search algorithm right up to today. In 10 years, this personalization of delivered information will be even more embedded in our daily lives. The concept of “auto-complete” takes on a whole new meaning when Google can potentially complete the intended search without the user typing a single letter. Amit Singhal explained this vision in the same BBC interview:
“Now you can imagine contextual prompts where you don’t even have to enter the first letter for it to fill in what most people do in contexts like this.”
In other words, if most people like you almost always look for a particular piece of data in a similar situation, Google can predict that you are probably looking for it too. Imagine driving down the interstate wearing Google Glass: when you and your family start to get hungry, Google automatically displays the locations of the four top-rated takeout restaurants in the area. Such predictions become possible when Google collects statistics on millions of queries from millions of places around the world.
“So, for example, if someone is standing in front of Buckingham Palace and most of the people who stand there search for ‘Trafalgar Square’, then potentially that suggestion can appear without the user even going to the search box.”
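As a rough illustration of the statistic Singhal describes, here is a hypothetical sketch: log which queries users issue in each context (here, a location), then surface the most common one before anything is typed. The class, the data, and the 70/30 split are invented for illustration.

```python
# Hypothetical sketch of context-driven query prediction: log
# (context, query) pairs, then suggest the query most often issued
# from a given context before the user types anything.
from collections import Counter, defaultdict
from typing import Optional

class ContextPredictor:
    def __init__(self):
        self.stats = defaultdict(Counter)  # context -> query frequencies

    def log(self, context: str, query: str) -> None:
        self.stats[context][query] += 1

    def suggest(self, context: str) -> Optional[str]:
        counts = self.stats.get(context)
        if not counts:
            return None  # no history for this context yet
        return counts.most_common(1)[0][0]

predictor = ContextPredictor()
for q in ["Trafalgar Square"] * 70 + ["Changing of the Guard"] * 30:
    predictor.log("Buckingham Palace", q)

# Standing in front of Buckingham Palace, the likely query surfaces
# without the user going to the search box at all.
print(predictor.suggest("Buckingham Palace"))  # -> "Trafalgar Square"
```

At scale, the real problem is harder: “context” would blend location, time, and personal history, and the counts would come from millions of users, but the underlying statistic is the same frequency-by-context idea.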
Of course, while it would be incredibly convenient to have exactly the data you need just when you need it, it is also a little unsettling that Google could soon hold so much information about people’s behavior, far more than it does today. In the wrong hands, such information could be dangerous… but that may be the price of this kind of progress.