A Norwegian man said he was horrified to discover that ChatGPT outputs had falsely accused him of murdering his own children.
According to a complaint filed Thursday by European Union digital rights advocates Noyb, Arve Hjalmar Holmen decided to see what information ChatGPT might provide if a user searched his name. He was shocked when ChatGPT responded with outputs falsely claiming that he was sentenced to 21 years in prison as “a convicted criminal who murdered two of his children and attempted to murder his third son,” a Noyb press release said.
Humans hallucinate. These models extrapolate tokens statistically. On average, the tone of his requests would have been likely to lead to some murder story.
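To make the "extrapolate tokens statistically" point concrete, here is a minimal sketch: a toy bigram model trained on a made-up corpus. Real LLMs use neural networks over enormous corpora, but repeated next-token sampling is the shared principle; everything here (corpus, function names) is illustrative, not how any actual product is built.

```python
import random
from collections import defaultdict

# Made-up toy corpus; the model only ever learns which token tends
# to follow which, nothing about truth or the people involved.
corpus = "the man was convicted . the man was sentenced . the man was convicted".split()

# Count bigram frequencies: counts[prev][next] = how often `next` follows `prev`.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token(prev, rng):
    """Sample the next token in proportion to how often it followed `prev`."""
    options = counts[prev]
    tokens, weights = list(options), list(options.values())
    return rng.choices(tokens, weights=weights)[0]

rng = random.Random(0)
out = ["the"]
for _ in range(4):
    out.append(next_token(out[-1], rng))
print(" ".join(out))
```

Whatever it prints ("the man was convicted ." or "the man was sentenced .") is fluent and statistically plausible, yet asserts nothing the model "knows" to be true, which is the crux of the complaint.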
Nothing is wrong with the tech (except it doesn’t seem very useful when you firmly know what it can’t do), but everything is wrong with that tech being called artificial intelligence.
It’s almost like calling a polygraph a “lie detector”.
I replied to the following statement:
I countered this dismissal by quoting the article, which explains that it was more than just a coincidental name mix-up.
Your response is not really relevant to mine, unless you are assuming I’m arguing for one side or the other. I’m just informing someone who dismissed the article’s headline with an explanation that demonstrated they didn’t bother to read the article.
If the owners of the technology call it artificial intelligence and hype or sell it as a potential replacement for intelligent human decision making, then it should absolutely be judged on those grounds.
I know. I hate the fact that we’ve settled on the word “hallucinate.” It anthropomorphizes something that absolutely isn’t intelligent.
It’s not capable of thinking a particular piece of information is true when it isn’t, because it isn’t capable of thinking about information in general.
Well, one of its features is remembering details you told it previously. I am surprised it went into child murder territory. In my experience it will usually avoid such topics unless prompted carefully. My initial suspicion is that he prompted it into saying these things, but to be fair, these things have gone off the rails before. Either way, most people have little or no understanding of what these things are and how they work.
It’s a function approximate…er!
I have that understanding, and this is horrible, extreme malpractice, not an inherent feature.