Since the release of ChatGPT in November 2022, there has been much speculation that OpenAI’s latest large language model (LLM) spells doom for Google Search. That sentiment has only intensified with recent reports that Microsoft is preparing to integrate ChatGPT into its Bing search engine.
There are several reasons to believe that a ChatGPT-powered Bing (or any other search engine) will not seriously threaten Google’s search near-monopoly. LLMs have several critical problems to solve before they can make a dent in the online search industry. Meanwhile, Google’s share of the search market, its technical ability and its financial resources will help it remain competitive (and possibly dominant) as conversational LLMs start to make their mark in online search.
The real (and less discussed) potential of LLMs such as ChatGPT, however, is the “unbundling” of online search, and this is where the genuine opportunities for Microsoft and other companies lie. By integrating ChatGPT into successful products, companies can chip away at the use cases of Google Search.
Integrating ChatGPT into search engines
While ChatGPT is a remarkable technology, it has several fundamental problems, which are also present in other LLMs. This is why Google, which already has similar technology, has taken a conservative approach toward integrating conversational LLMs into its search engine.
- As many users and researchers have shown, LLMs such as ChatGPT can “hallucinate,” generating answers that are fluent and grammatically coherent but factually wrong.
- LLMs do not cite their sources, which makes it difficult to further validate and investigate the truthfulness of their output.
- The costs of running LLMs are huge. According to one estimate, with one million daily users, ChatGPT costs around $100,000 per day.
- LLMs are slow to run. Search engine databases can return millions of results within milliseconds. LLMs take several seconds to generate responses.
- LLMs are slow to update. Google can add millions of records to its search index every hour at virtually no cost. LLMs must undergo slow and expensive retraining whenever they need to incorporate new knowledge (ChatGPT’s training data cuts off in 2021).
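To put the cost point above in perspective, here is a back-of-envelope sketch. The $100,000-per-day and one-million-user figures come from the estimate cited above; the per-user breakdown is an illustrative extrapolation, not a reported number.

```python
# Rough per-user cost of running a ChatGPT-scale service,
# using the estimate cited in the article (assumed figures).
DAILY_COST_USD = 100_000    # estimated daily operating cost
DAILY_USERS = 1_000_000     # estimated daily active users

cost_per_user_per_day = DAILY_COST_USD / DAILY_USERS    # $0.10 per user per day
cost_per_user_per_year = cost_per_user_per_day * 365    # ~$36.50 per user per year

print(f"${cost_per_user_per_day:.2f}/user/day, ${cost_per_user_per_year:.2f}/user/year")
```

Even a dime per user per day adds up to tens of millions of dollars annually at search-engine scale, which is why serving cost is a real obstacle to LLM-powered search.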
A company like Microsoft might be able to solve these problems by using its highly efficient Azure cloud and developing suitable LLM architectures, training techniques, and complementary tools.
Microsoft and OpenAI might also be able to solve the truthfulness problem by adding automated guardrails that fact-check ChatGPT’s answers before showing them in Bing results.
However, nothing prevents Google from doing the same thing. Google has immense data and compute resources and one of the most talented AI teams. Google also has the advantage of being the default search engine on Chrome, most Android devices, and Safari (included with macOS and iOS devices). This means that unless it’s significantly better, a ChatGPT-powered Bing will not convince users to go out of their way to make the switch from Google Search.
People use Google Search to solve various problems, from locating nearby restaurants to finding academic papers, retrieving news articles, querying historical information, looking for coding advice and more.
ChatGPT and other LLMs can also solve some of these problems. We’re already seeing this happen in software development. When programmers need help writing code for a specific problem, they usually search for it on Google or visit a coding forum such as Stack Overflow. Today, thanks to GitHub Copilot and OpenAI Codex, they can simply write a textual description in their integrated development environment (IDE), such as Visual Studio Code or GitHub Codespaces, and have the LLM generate the code for them. This helps developers stay in the flow by avoiding the switch from their IDE to Google Search. This is an example of “unbundling” some of the work that Google Search currently does.
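The workflow looks something like the following sketch. This is a hypothetical illustration: the prompt comment is the kind of description a developer might write, and the function body stands in for the assistant’s inline suggestion, not actual Copilot or Codex output.

```python
# Developer types a natural-language prompt as a comment or docstring;
# the coding assistant suggests an implementation inline, without the
# developer ever leaving the IDE to search the web.

def is_palindrome(text: str) -> bool:
    """Return True if text reads the same forwards and backwards,
    ignoring case and non-alphanumeric characters."""
    cleaned = [ch.lower() for ch in text if ch.isalnum()]
    return cleaned == cleaned[::-1]


print(is_palindrome("A man, a plan, a canal: Panama"))  # True
print(is_palindrome("hello"))                            # False
```

The developer’s job shifts from hunting down a Stack Overflow answer to reviewing and accepting (or correcting) the suggested code.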
There are many other opportunities to unbundle search through LLMs, such as developing assistants for academic papers, essays and other content creation. There are several benefits to unbundling:
- It allows for specialization. The LLM can be fine-tuned to the specific application it is integrated with. This improves the accuracy of the LLM’s output and also allows for the use of smaller models, which considerably reduces costs.
- Unbundling reduces the update overhead. As long as users don’t expect the LLM to know up-to-the-minute facts, it will not need to be retrained very frequently.
- Companies can avoid direct competition with Google’s search behemoth. Instead, they can tap into their existing markets. For example, Microsoft could integrate ChatGPT as assistants into Office, Visual Studio, Teams and other products that collectively have billions of users. Other content platforms can find opportunities in friction points where users have to switch from their apps to Google Search to find content. Some of those problems might be solved by integrating an LLM into the application.
- The integration model unlocks new business models. Google Search earns its revenue from its vast ad network. Integrated LLMs could be monetized through other means such as subscriptions. As Copilot shows, if the LLM boosts productivity and saves time, users will be willing to pay a monthly fee for it.
The future of online search
For many use cases, Google’s list of blue links will remain the dominant tool. For example, if you want to do a precise search in specific domains and timeframes, Google’s search technology is better than current LLMs.
Unbundling will not pose an existential threat to Google Search yet. In fact, the history of large platforms such as Craigslist and Amazon shows that unbundling usually results in the expansion of a market (and Google already has a stake in many of those markets). However, unbundling will weaken Google’s hold on the online information market to a degree. And in the long run, LLMs can trigger more profound shifts in the market.