Hallucinations — the falsehoods generative AI models confidently assert, basically — are a big problem for businesses looking to integrate the technology into their operations. Because models have no real intelligence and are simply predicting words, images, speech, music and other data according to an internal statistical schema, they sometimes get it wrong. Very wrong. In a […]
ChatGPT’s ‘hallucination’ problem hit with another privacy complaint in EU
OpenAI is facing another privacy complaint in the European Union. This one, filed by the privacy rights nonprofit noyb on behalf of an individual complainant, targets the inability of its AI chatbot, ChatGPT, to correct misinformation it generates about individuals. The tendency of GenAI tools to produce information that is plainly wrong has been […]