• These people turned to a tool they do not understand instead of human connection, instead of talking to real people or seeking professional help. That is the real tragedy, not some arbitrary technology.

    These are badly designed, dangerous tools, and people who do not understand them, including children, are being strongly encouraged to use them. In no reasonable world should an LLM be allowed to engage in any sort of interaction on an emotionally charged topic with a child. Yet it is not only allowed, it is actively encouraged through apps like Character.AI.

• Let’s devote the full force of modern technology to creating a tool designed to answer questions in a convincing way. The answer must seem accurate, but there is no requirement that it actually be accurate. The terminology and phrasing of the answer must support the questioner’s apparent position, and the overall conversation must believably simulate an interaction with a friendly, or even caring, individual.

    Yeah, in a world of lonely people who are desperate for human contact and emotional support and are easily manipulated, this is, in retrospect, an obvious recipe for disaster. It’s no wonder we’re seeing things like this, with some people even developing psychosis after extended interactions with chatbots.


• First, there’s no such thing as actual Artificial Intelligence. In its current usage, “AI” is simply a large language model that takes the enormous amount of data it has been fed and tries to generate a response that seems like it might be an answer to your question. It has no understanding of the question or the answer; it’s just an estimation of what an answer might look like. The fact that there is no guarantee whatsoever that the answer you get is accurate is simply a modern example of the old adage: “Garbage in; garbage out.”

    Second, there isn’t a single LLM made by any company that I would trust to guess my weight, let alone to answer a question I thought was important.