I know this is unpopular as hell, but I believe that LLMs have the potential to do more good than bad for learning, as long as you don’t use them for critical things. So no health-related questions, or questions where a wrong answer is totally unacceptable.

The ability to learn about most subjects in a really short time from a “private tutor” makes it an effective but flawed tool.

Let’s say that it gets historical facts wrong 10% of the time. Is the world better off if people learn a lot more, even with some errors here and there? Most people seem to know almost no history at all.

Currently, people know very little about critical topics that are important to a society. That ignorance is politically and societally very damaging, maybe a lot more damaging than the source being 10% wrong. If you ask an LLM about social issues, its answers are often more empathetic than the mainstream political discourse: “criminals are criminals for societal reasons,” “human rights are important,” etc.

Yes, I know the truth can be manipulated, so the model has to be neutral, which some LLMs probably aren’t or won’t be.

Am I totally crazy for thinking this?

  • Maven (famous)@piefed.zip · 1 day ago

    I don’t think that getting a summary overview of something and actually learning it are the same thing.

    The famous example is Rome and how it fell.

    If you ask for a basic summary, you’ll get hundreds of different answers, if not thousands, because that’s how many factors were in play over the extremely long course of the fall of Rome. Skipping over those details only leads to… well… skipping over details.

    Even if AI were 100% accurate, always right, there’s no way a summary can substitute for actually learning something.

    • ComradePenguin@lemmy.ml (OP) · 1 day ago

      Can’t you just keep digging to learn more? Ask follow-up questions, as you would with a private tutor?