Can OpenAI's SearchGPT Be Trusted? Expert Expresses Profound Doubts
The company has introduced its innovative approach to web search through the launch of SearchGPT, a provisional search tool currently in prototype phase in collaboration with prominent publishers such as The Atlantic, Vox Media, and News Corp.
OpenAI states that the new tool offers immediate responses to user inquiries, whether seeking the day's weather, the latest international news, local concert schedules, or recipe ingredients, complete with attributions and source links.
SearchGPT is not yet an officially released product. OpenAI has announced that it will be accessible to a limited group of users and publishers in the coming weeks as part of its efforts to refine the tool prior to its integration into ChatGPT. Interested users can sign up for the waitlist.
According to a Northeastern University expert in artificial intelligence and communications, OpenAI's strategy with SearchGPT is commendable for its collaboration with publishers, ensuring proper permission and attribution of their work. However, the researcher notes that it may still suffer from the fundamental challenges inherent to all large language model technologies.
Michael Ann DeVito, a professor of computer science and communication studies at Northeastern University, points out that current systems lack true intelligence and rely on context-free pattern matching, which often results in misleading or nonsensical outputs. She also notes that while training on news content is preferable to scraping the public internet, that content still carries inherent biases.
The SearchGPT system resembles Perplexity AI's search engine, offering summarized responses and source links. However, Perplexity AI has faced criticism from publishers for unauthorized use of journalistic content.
Google offers AI products such as its chatbot, Gemini, and a search feature called AI Overview. The AI Overview service encountered issues earlier this spring, delivering inaccurate information, including recommendations to consume rocks and apply glue to pizza.
OpenAI has long been rumored to be developing a search product designed to compete with Google, particularly given its recent partnerships with news publishers and platforms like Reddit.
ChatGPT has become a popular pseudo-search tool among users for its quick and interactive responses. Nevertheless, it is frequently noted for its tendency to 'hallucinate' and disseminate incorrect information.
According to DeVito, the advent of these technologies coincides with a marked deterioration in public media literacy over the last ten years, which exacerbates the difficulty of discerning trustworthy information from dubious sources.
She laments that there has been considerable regression over the past decade, noting, 'we've reached a point where we simply tell people to search online, but we haven't equipped them with the skills to do so effectively.'
ChatGPT acknowledges on its webpage that it may produce errors and advises users to verify the information independently. However, DeVito contends that this cautionary note is insufficient.
"This situation is reminiscent of the small warnings once found on cigarette packages. Were those warnings effective? Certainly not. Significant strides in public health were made only after we introduced large, graphic warnings," DeVito says. Similarly, she argues, warnings on digital platforms need to be as prominent and attention-grabbing as the rest of the interface, rather than relegated to fine print that is often ignored.
"These tools should be viewed as cutting-edge novelties designed for experimentation, not as reliable solutions for mission-critical operations," she explains.
"My recommendation is to steer clear of these tools," she notes. "If you do use them, approach the results with significant skepticism and diligently verify the information. The time spent on this may well exceed the time required for a manual search."