• MiddleAgesModem@lemmy.world
    2 days ago

    Hallucinations have already been reduced. You’re expressing a pretty standard anti-LLM stance, but people in the field think the hallucination problem can be fixed — even with something as simple as having models say “I don’t know”, plus better use of tools and sources.

    The fact that hallucination is less of a problem than it used to be should make it pretty clear that it’s not immutable.

    In any event, it’s still ridiculously more plausible than artificially transferring human consciousness.