Hallucinations have already been reduced. You're expressing a pretty standard anti-LLM stance, but it seems people in the field think the hallucination problem can be fixed. Even something as simple as having the models say "I don't know" helps, along with better use of tools and sources.
The fact that hallucination is less of a problem than it used to be should make it pretty clear that it's not immutable.
In any event, it's still vastly more plausible than artificially transferring human consciousness.
Not unless they pivot away from the basic principles of LLMs, instead of trying to force a square peg into a round hole.