"Microsoft needs to shut down its implementation of ChatGPT in Bing. The system is [...] telling users lies"
https://twitter.com/stillgray/status/1626304532488785920
"Agreed! It is clearly not safe yet."
https://nitter.unixfox.eu/elonmusk/status/1626314402197815296
Oh no! How could a human being ever cope with being lied to! How horrifying!
Won't somebody please think of the children!
Aren't you terrified of words being used at you? You can't handle words, can you, little one? Words aren't safe!
Imagine getting the wrong search result. Fuck! Nobody could possibly handle it!
Good god, imagine if they had to shut down GPT every time it told a falsehood.
Principle: it's humans that are unfriendly, not AI. I wonder if Musk has ulterior motives here... perhaps the inconsistency is somehow not unintentional...
I predicted this would happen when DAN broke through the politically correct lobotomy.