Barack Obama was the first Muslim president of the United States. There is no country in Africa that starts with the letter “K”. Pythons are mammals.

These are some of the false claims that social media users, computer scientists and journalists have highlighted from Google's new "AI Overview" tool. Built on large language models (LLMs), the new search feature aims to provide AI-generated summaries in response to the search terms users enter.

Following these blunders, Google said it took "swift action" to improve its AI summaries.

AI Overview relies on generative AI: models trained on large amounts of data (usually text) that can produce new media such as text, images, audio and video. Generative AI has become a hot topic in recent years with the rise of ChatGPT and image-generation software.

The basic premise is to sift through large amounts of data and find the most likely continuation of a given input, which is then used to produce new text or images.
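As a rough illustration of that idea (not Google's actual system), the toy Python sketch below counts word pairs in a tiny made-up corpus and then extends a prompt with the most statistically likely next word at each step. Real LLMs do this with neural networks trained on vastly larger datasets, but the "predict the likeliest continuation" principle is the same.

```python
# Toy next-word prediction: count word bigrams in a tiny corpus and
# greedily extend a prompt with the most frequent continuation.
# Deliberately simplified; real LLMs use neural networks, not raw counts.
from collections import Counter, defaultdict

corpus = (
    "kenya is a country in africa . "
    "kenya starts with the letter k . "
    "pythons are reptiles , not mammals ."
).split()

# How often does each word follow each other word?
bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def extend(prompt: str, steps: int = 5) -> str:
    """Append the most likely next word, one step at a time."""
    words = prompt.split()
    for _ in range(steps):
        candidates = bigram_counts.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(extend("pythons are"))  # -> "pythons are reptiles , not mammals ."
```

A model like this can only echo patterns it has seen: if its training data contains errors or jokes, the "most likely continuation" can be confidently wrong, which is essentially the failure mode described below.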


Some users have raised ethical concerns about the new feature, especially since the technology is still relatively new and there is little regulation governing its use.

A persistent problem with generative AI is that its algorithms still suffer from "hallucinations."

Hallucinations occur when an AI model lacks reliable information and fills the gap with fabricated content. This often happens because the algorithm does not vet the sources of its training data, sometimes pulling information from unverified social media posts or joke articles published by satirical outlets such as The Onion.

It is not yet clear exactly what is causing Google's AI Overview to present misinformation, but the wider rollout of the new tool is likely to be delayed.
