• WraithGear@lemmy.world
    1 day ago

    Not unless they pivot on the basic principles of LLMs, instead of attempting to force a square peg into a round hole.

    • MiddleAgesModem@lemmy.world
      14 hours ago

      Hallucinations have already been reduced. You're expressing a pretty standard anti-LLM stance, but people in the field seem to think the hallucination problem can be fixed, even with something as simple as having the models say "I don't know", along with better use of tools and sources.

      The fact that hallucination is less of a problem than it used to be should make it pretty clear that it's not immutable.

      In any event, it's still ridiculously more feasible than artificially transferring human consciousness.