Ever since the release of OpenAI's humanlike chatbot ChatGPT in late 2022, users have been warned about the tool's potential for hallucinations, meaning the bot can spit out inaccurate or fictitious information in a very confident manner.

But more recently, these hallucinations have become one of the grounds for challenging the chatbot's compliance with the European Union's General Data Protection Regulation (GDPR).

While OpenAI has denied that its chatbot violates the GDPR, ChatGPT has drawn increasing scrutiny from European data protection authorities, including recent explicit warnings from the Italian Data Protection Authority.