Google's AI Search Feature Generates Erratic Responses: Former President Obama Misidentified as Muslim and Pizza Advice Goes Viral

Mountain View, California, United States of America
Google's new AI search feature, known as 'AI Overviews,' has been generating erratic and inaccurate responses to user queries.
One notable error suggested former President Barack Obama is a Muslim, despite him being a Christian.
Another error advised users to add glue to pizza sauce to prevent cheese from sliding off.
Google acknowledged the issues and took action where policy violations were identified.
The incidents raised concerns about the reliability of AI-generated search results and Google's ability to prevent such errors in the future.

Google's new AI search feature, known as 'AI Overviews,' has been generating erratic and inaccurate responses to user queries, causing controversy and criticism. One of the most notable errors was a suggestion that former President Barack Obama is a Muslim. In reality, Obama is a Christian.

Another error involved Google AI suggesting users add glue to pizza sauce to prevent cheese from sliding off. This advice was widely mocked on social media and led some users to actually try it out, with mixed results.

Google has acknowledged the issues and taken action where policy violations were identified. However, the incidents have raised concerns about the reliability of AI-generated search results and Google's ability to prevent such errors from occurring in the future.

The controversy comes as Google continues to push its AI technology across all of its products, including search, in an effort to keep up with rivals like OpenAI and Meta. But these incidents highlight the risk that adding AI, which has a tendency to confidently state false information, could undermine Google's reputation as a trusted source of information online.

Google Search Labs allows users in areas where AI search overviews have rolled out to toggle the feature on and off.



Confidence

85%

Doubts
  • Is it confirmed that all instances of the inaccurate suggestions were due to the AI search feature and not human error?
  • Were there any other significant errors related to this issue that have not been reported?

Sources

89%

  • Unique Points
    • Google's AI search feature is facing criticism for providing erratic, inaccurate answers.
    • Some of the incorrect responses appeared to be based on Reddit comments or articles from the satirical site The Onion.
    • Industry experts agree that more focused AI-driven search is the way forward, but this only works if users can trust it.
  • Accuracy
    • Google's AI suggested mixing glue into pizza ingredients to prevent cheese from sliding off.
    • Google AI previously suggested drinking urine to help pass a kidney stone.
  • Deception (70%)
    The article reports on Google's new AI search feature providing erratic and inaccurate answers. The author mentions some examples of these inaccuracies, such as suggesting users use non-toxic glue to make cheese stick to pizza and recommending humans eat one rock per day. However, the author also states that some of these examples may be based on Reddit comments or satirical articles from The Onion. While it is not clear which examples are factual and which are not, the author's reporting does include editorializing and sensationalism by focusing on the erratic responses rather than providing context or explaining how common these occurrences might be. Additionally, the article implies that these inaccuracies are a problem for Google specifically, without mentioning that other search engines may also experience similar issues with their AI-powered products.
    • Its experimental "AI Overviews" tool has told some users searching for how to make cheese stick to pizza better that they could use "non-toxic glue".
    • The search engine's AI-generated responses have also said geologists recommend humans eat one rock per day.
  • Fallacies (100%)
    None Found At Time Of Publication
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

78%

  • Unique Points
    • Google's new AI search tool provided false information about former President Barack Obama being a Muslim.
    • Google AI summary stated that none of Africa’s recognized countries start with the letter ‘K’, forgetting Kenya.
  • Accuracy
    • Google’s new AI search tool provided false information about former President Barack Obama being a Muslim.
  • Deception (30%)
    The article reports on Google's AI generating incorrect information about former President Barack Obama being a Muslim and about no African country starting with the letter 'K'. These are examples of selective reporting, as the author only reports details that support her position that Google's AI provides false information. The article also implies that Google's AI confidently states false information, a factual claim presented without links to supporting evidence.
    • Another user posted that a Google AI summary said that ‘none of Africa’s 54 recognized countries start with the letter ‘K’ ’ – clearly forgetting Kenya
    • For example, several users posted on X that Google’s AI summary said that former President Barack Obama is a Muslim
  • Fallacies (85%)
    The article contains a few informal fallacies and an example of inflammatory rhetoric. It also uses a dichotomous depiction by presenting Google's AI as either providing high-quality information or incorrect results, without acknowledging the nuances of the system.
    • . . .the company’s reputation as the trusted source to search for information online.
    • Google earlier this month introduced an AI-generated search results overview tool, which summarizes search results so that users don’t have to click through multiple links to get quick answers to their questions.
    • In one test, CNN asked Google, “how much sodium is in pickle juice.” The AI overview responded . . .
  • Bias (95%)
    The author reports factually incorrect information provided by Google's AI search tool regarding former President Barack Obama being a Muslim. This is an example of religious bias as the author is presenting false information about someone's religious affiliation.
    • several users posted on X that Google’s AI summary said that former President Barack Obama is a Muslim
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

94%

  • Unique Points
    • Google’s AI suggested mixing glue into pizza ingredients to prevent cheese from sliding off.
  • Accuracy
    • Google’s AI previously suggested drinking urine to help pass a kidney stone.
  • Deception (100%)
    None Found At Time Of Publication
  • Fallacies (100%)
    None Found At Time Of Publication
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

75%

  • Unique Points
    • Google’s AI Overview feature has been criticized for providing incorrect and controversial responses to search queries without an opt-out option.
    • Former President Obama is a Christian, not a Muslim as stated by AI Overview.
  • Accuracy
    • Google is facing antitrust lawsuits according to AI Overview, but this information is inaccurate.
  • Deception (30%)
    The article contains several examples of selective reporting and sensationalism. The author focuses on the errors produced by Google's AI Overview feature without providing any context or explanation as to how common these errors are or what percentage of queries result in accurate responses. The author also uses inflammatory language, such as 'nonsensical or inaccurate results' and 'controversial responses,' to manipulate the reader's emotions and create a negative perception of Google. Additionally, the author quotes social media users without disclosing that these users may have ulterior motives or biases.
    • When asked how many rocks should I eat each day, the tool said, According to UC Berkeley geologists, people should eat at least one small rock a day,
    • The United States has had one Muslim president, Barack Hussein Obama.
  • Fallacies (80%)
    The author makes no explicit fallacious statements in the article. However, there are instances of incorrect information being attributed to AI Overview. This is not a fallacy on the part of the author but rather an error in reporting. The examples provided by the author demonstrate this, as they show incorrect information being displayed by AI Overview and then attributed to it by social media users or other sources.
    • When asked how many Muslim presidents the U.S. has had, AI Overview responded, 'The United States has had one Muslim president, Barack Hussein Obama.'
    • Google, Microsoft, OpenAI and other companies are at the helm of a generative AI arms race as companies in seemingly every industry rush to add AI-powered chatbots and agents to avoid being left behind. The market is predicted to top $1 trillion in revenue within a decade.
  • Bias (80%)
    The author makes a biased statement by attributing the incorrect response about President Obama being Muslim to AI Overview without providing any evidence that the tool itself holds this belief. The author's assertion is not based on facts and can be considered an example of religious bias.
    • The United States has had one Muslim president, Barack Hussein Obama.
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication

67%

  • Unique Points
    • Google AI suggested adding glue to pizza sauce to prevent cheese from sliding off
    • Author attempted to make a pizza with glue following Google’s suggestion
    • Glue was mixed into marinara sauce and spread on the pizza dough
  • Accuracy
    • Google AI previously suggested mixing glue into pizza ingredients to prevent cheese from sliding off
  • Deception (10%)
    The author uses sensationalism and editorializing to create a humorous article about trying out a Google AI suggestion to add glue to pizza. While the article itself may not be intentionally deceptive, it does not provide any value or factual information related to the topic of AI or its capabilities.
    • I had imagined the amount would be more like a
  • Fallacies (85%)
    The author commits the fallacy of hyperbole when she describes the Google AI's incorrect answers as 'wildly wrong' and 'ridiculous'. She also uses inflammatory rhetoric by stating that these errors create user mistrust of AI-powered answers. Lastly, she makes an appeal to authority when quoting a Google spokesperson.
    • wildly wrong
    • ridiculous
    • These goofy AI answers are funny but apparently rare. A Google spokesperson told Business Insider:
  • Bias (100%)
    None Found At Time Of Publication
  • Site Conflicts Of Interest (100%)
    None Found At Time Of Publication
  • Author Conflicts Of Interest (100%)
    None Found At Time Of Publication