jckarter

everyone already knows i'm a dog

the swift programming language is my fault to some degree. mostly here to see dogs, shitpost, fix old computers, and/or talk about math and weird computer programming things. for effortposts check the #longpost pinned tag. asks are open.


email: joe@duriansoftware.com
discord: jckarter

ewie
@ewie

i’ve been reading a bit about ai stuff recently and one thing that has really bothered me is that a lot of people criticizing it are using the framing provided by corporations and calling the problems "hallucinations". they’re not hallucinations. a computer can't hallucinate. these "hallucinations" happen because the computer fundamentally can't understand the world. period. it just bothers me because you're criticizing ai, but you're criticizing it using their language and on their terms, and that sucks. don’t play their game and don’t play it by their rules.


jckarter
@jckarter

i forget who it was, unfortunately, but someone referred to what LLMs do as cold reading, and i think that analogy hits it right on the head. the LLM is just saying random stuff, and the correlation to reality (if any) is filled in by the reader

e: thanks @ant for finding the article:



in reply to @ewie's post:

i’ve heard “fucking bullshit”, “making shit up”, “lying”, and “just being plain wrong”, but i guess none of those phrases sound as good on paper as “hallucinations”