Asking chatbots for short answers can increase hallucinations, study finds

techcrunch.com/2025/05/08/asking-chatbots-for-short-answers-can-increase-hallucinations-study-finds

It turns out that telling an AI chatbot to be concise could make it hallucinate more than it otherwise would. That’s according to a new study from Giskard, a Paris-based AI testing company developing a holistic benchmark for AI models. In a blog post detailing their findings, researchers…

This story appeared on techcrunch.com, 2025-05-08 12:05:00.