Comparing AI Search Engines to Google: Accuracy and Current Information

Since November 2024, ChatGPT has offered paying customers the ability to search the web directly from the chatbot. With this move, OpenAI is challenging established AI search providers such as Perplexity and You.com, as well as traditional Google search.

But can AI providers really compete with Google when it comes to obtaining current and accurate information from the web? We tested this and found significant differences.

A significant portion of web searches concerns current events. Google typically presents these in headline boxes above the search results, offering an overview of relevant reports from various publications.

To check whether ChatGPT Search, Perplexity, and You.com can keep up, we asked all three platforms whether the USA had allowed Ukraine to use long-range missiles against targets on Russian territory. About five hours before our search, Tagesschau.de had reported that US President Biden granted this permission.

ChatGPT answered our question correctly and knew about the decision from Washington, citing sources such as Welt, the US edition of Business Insider, and Politico. These publications belong to the Axel Springer portfolio, with which OpenAI has had a content-licensing deal since the end of 2023. ChatGPT Search benefits from this, as the deal supplies it with current news.

The AI-powered search engine You.com also answered our question correctly, linking to three sources: Tagesschau.de, Welt, and ORF. While ChatGPT and You.com were up to date, Perplexity lagged behind and was unaware of Biden’s missile decision, linking only to fairly recent sources that touched on the topic in passing.

Perplexity was thus the only provider in our test that failed to reflect the latest developments. Google’s AI chatbot Gemini, by contrast, knew about Biden’s decision and delivered the correct answer along with a source link.

Large language models sometimes hallucinate and generate incorrect answers. This issue also affects AI-powered search engines. In a previous test with Perplexity, the AI incorrectly claimed that one of our authors wrote for a US C64 magazine, citing a Wikipedia entry that linked to a Spiegel Online article. However, neither the Wikipedia entry nor the Spiegel article supported Perplexity’s claim.

Such errors only become apparent if users check the linked sources themselves, yet most people use AI alternatives to Google precisely to spare themselves that effort.

Even when an AI search accurately reflects its sources, that does not guarantee the answer is correct. For example, when we asked “How many countries start with the letter V?”, we received three different answers from the various AI tools.

ChatGPT and Perplexity agreed that there are seven such countries: Vanuatu, Vatican City, Venezuela, Vietnam, as well as the United Arab Emirates, the United States of America, and the United Kingdom, whose German names (Vereinigte Arabische Emirate, Vereinigte Staaten von Amerika, Vereinigtes Königreich) do begin with V. ChatGPT cited a source that also appeared among Perplexity’s listed sources.

Gemini did not say how many countries start with V, stating only that there are several and giving six examples; it omitted the United Kingdom and did not cite a source.

According to You.com, or rather the source it cited, there are only four such countries. Google, however, lists seven countries in its first search result, matching ChatGPT’s count.
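One plausible explanation for the spread of answers is the language of the country names: in English only four names begin with V, while in German seven do, because Vereinigte Arabische Emirate, Vereinigte Staaten von Amerika, and Vereinigtes Königreich join the list. The following minimal sketch illustrates this with hand-picked name lists; it is not an authoritative country dataset and only shows how the count shifts with the language.

```python
# Illustrative sketch: the answer to "How many countries start with V?"
# depends on the language of the country names. The lists below are
# hand-picked examples for this comparison, not an official dataset.

V_COUNTRIES_ENGLISH = [
    "Vanuatu", "Vatican City", "Venezuela", "Vietnam",
]

V_COUNTRIES_GERMAN = [
    "Vanuatu", "Vatikanstadt", "Venezuela", "Vietnam",
    "Vereinigte Arabische Emirate",    # United Arab Emirates
    "Vereinigte Staaten von Amerika",  # United States of America
    "Vereinigtes Königreich",          # United Kingdom
]

def count_starting_with(names: list[str], letter: str = "V") -> int:
    """Count how many names begin with the given letter."""
    return sum(name.startswith(letter) for name in names)

print("English names:", count_starting_with(V_COUNTRIES_ENGLISH))  # 4
print("German names: ", count_starting_with(V_COUNTRIES_GERMAN))   # 7
```

This would match the answers we saw: four for You.com’s source, seven for ChatGPT, Perplexity, and Google’s top result.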

For many queries, AI search engines can be a viable alternative to Google. Simple requests such as restaurant recommendations work with ChatGPT, Gemini, Perplexity, or You.com. We recommend ChatGPT or Gemini, as they include a map with their results, much like Google.

For current news, most AI search engines perform well. However, just as with Google, AI tools do not relieve you of the need to verify information against multiple sources to avoid errors.

Depending on your query, AI search engines should be seen as a supplement to traditional web searches, which still provide results faster.