- cross-posted to:
- hackernews@lemmit.online
- hackernews@lemmy.smeargle.fans
Google rolled out AI Overviews across the United States this month, exposing its flagship product to the hallucinations of large language models.
Hallucination is a technical term. It has nothing to do with thinking. The scientific community could have chosen another term to describe the issue, but "hallucination" captures really well what's happening.
huh, i kinda assumed it was a term made up/taken by journalists mostly, are there actual research papers on this using that term?
Yup. Loads of them! https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=hallucinations+llm&btnG=
It used to refer to all generated output, though. Calling only the mistakes hallucinations is new, and definitely a product of the hype.