Retail News

ChatGPT hallucinates

The New York Times 05/02/2023

ChatGPT has been criticized for sometimes getting facts wrong. In some cases it has not merely gotten facts wrong; it has made them up entirely. These so-called AI hallucinations could be dangerous to people who rely on chatbot answers to make personal or professional decisions. "If you don't know an answer to a question already, I would not give the question to one of these systems," said Subbarao Kambhampati, a professor at Arizona State University.
