A Norwegian man said he was horrified to discover that ChatGPT outputs had falsely accused him of murdering his own children.

According to a complaint filed Thursday by European Union digital rights advocates Noyb, Arve Hjalmar Holmen decided to see what information ChatGPT might provide if a user searched his name. He was shocked when ChatGPT responded with outputs falsely claiming that he was sentenced to 21 years in prison as “a convicted criminal who murdered two of his children and attempted to murder his third son,” a Noyb press release said.

  • OpenPassageways@lemmy.zipOP · 2 days ago

    To me it’s clear that these tools are primarily useful as bullshit generators, and I expect them to hallucinate and be inaccurate. But the companies trying to capitalize on the “AI” bubble are saying that these tools can be useful and accurate. I imagine OpenAI is going to have to invoke the Fox News defense in this case, and claim that “no reasonable person would take this seriously”.

    • oxysis@lemm.ee · 2 days ago

      Don’t use “hallucinate” to describe what it is doing; that humanizes it and makes the tech seem more advanced than it is. It is randomly mashing words together without understanding the meaning of any of them.

        • Ech@lemm.ee · 2 days ago

          The technical term was created to promote the misunderstanding that LLMs “think”. The “experts” want people to think LLMs are far more advanced than they actually are. You can add as many tokens to your context as you want - every model is still, fundamentally, a text generator. Humanizing it more than that is naive or deceptive, depending on how much money you have riding on the bubble.
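          If you want to see what “text generator” means in practice, here’s a rough sketch of the sampling loop these models run, assuming the open-source Hugging Face transformers library and the small GPT-2 checkpoint (so not OpenAI’s actual serving stack, just the same basic mechanism):

          ```python
          # Minimal sketch of autoregressive text generation.
          # Assumes: pip install torch transformers; uses the public GPT-2 checkpoint.
          import torch
          from transformers import AutoModelForCausalLM, AutoTokenizer

          tokenizer = AutoTokenizer.from_pretrained("gpt2")
          model = AutoModelForCausalLM.from_pretrained("gpt2")
          model.eval()

          prompt = "Paris is the capital of"
          input_ids = tokenizer(prompt, return_tensors="pt").input_ids

          with torch.no_grad():
              for _ in range(20):
                  logits = model(input_ids).logits              # scores over the whole vocabulary
                  probs = torch.softmax(logits[0, -1], dim=-1)  # distribution for the *next* token only
                  next_id = torch.multinomial(probs, 1)         # sample it -- no lookup, no fact-checking
                  input_ids = torch.cat([input_ids, next_id.unsqueeze(0)], dim=1)

          print(tokenizer.decode(input_ids[0]))
          ```

          Every token is just sampled from a probability distribution conditioned on the tokens before it. Nothing in that loop checks whether the continuation is true, which is the whole problem in this story.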

          • FaceDeer@fedia.io · 2 days ago

            You didn’t read the article I linked. The term came into use before LLMs were a thing; it was originally used in relation to image processing.