@Da_Gut @davidgerard They can't be reduced. "Hallucinations" are just the system working to generate plausible-looking text, as it has been trained to do.
The fact that it spits out factually correct text a high percentage of the time is an interesting side effect, and tells us some neat things about how we encode knowledge in our texts. But you can't get rid of hallucinations from what is essentially a hallucination-generating machine.