spiders
@spiders

was reading a fic where a home ai becomes "sapient" (because i guess normal home ai is "nonsapient" and basically a very advanced alexa) and so she's declared "legally a sophont" free to do whatever, after a thorough "examination" of her digital consciousness, and it's portrayed completely positively and man this gives me really weird vibes.

like in the story it's celebrated as a beautiful thing, but personally it freaks me the fuck out to imagine them having a system of determining who is legally sapient vs nonsapient, where a line is drawn. the inhabitants of the home are aware for days leading up to the event that she is becoming "self-aware", based on signs and behaviors that artificial intelligences don't normally show, but she isn't legally fully "self-aware"/sapient yet. they even narrow their estimate of when she will cross that line down to a matter of days. and that just scares me, that there would be a possibility for someone to be "almost sapient, but not"

like it gives me bad associations and bad vibes because of autism, because of plurality, because of ecology.

and like i was talking about this with the author and they really didn't see anything wrong with this. when i brought up the terraformers as a really good exploration of these concepts, they were outright horrified in a way that confused me but maybe i didn't explain the story well enough

anyways i would like scifi authors to consider for a moment that "sapience" is kind of a fake thing that can't actually be quantified or measured or defined. what people mean when they say "sapient, and not just sentient" is that the being "thinks and acts like a human" (and by human they often mean, "a neurotypical abled human").

and if that's your definition of sapient, at best you will find yourself "alone" in the universe; none of the life you find will fit into a neat easy box that's simple and quick to understand. you will disqualify beings who have truly beautiful but very alien intelligences as not being "sapient" enough to be included in your definition of personhood, because they can't be reduced to human behaviors, human language.

i think in a lot of ways, sapience in science fiction is defined by if the language barrier can be broken. so are orcas "sapient"? is Alex the Grey Parrot, or Koko the Gorilla? are crows? if a fungus or plant was "sapient", how would you ever really know, if their desires and behaviors and ways of speaking are so radically different from you?

even the terraformers acknowledges this. their definition for what is a "person" cow or train or earthworm or tunnel boring machine or moose vs a nonperson cow or train or earthworm or tunnel boring machine or moose is literally that the ones who are "people" were bioengineered to explicitly think like a human, instead of a cow or train or earthworm or tunnel boring machine or moose.

theres just so much fucking grey area within humans too. is someone in the latest stages of Alzheimer's disease "sapient"? or someone with "braindeath"? is it suddenly okay to strip them of their personhood, their rights, just because they don't think or act like a neurotypical abled human? nonspeaking autistic people have historically been considered "nonsapient" by the medical establishment. even speaking autistic people are often treated like we are less "cognitively aware". and when experiencing a psychedelic dissolution of the ego, and the boundaries stop existing, and there is no experiencer or experiencee, only Experience, is there "sapience" there? is the Experience temporarily stripped of "sapience"?

"sapience" as a concept in sci fi feels like a comforting quilt they drape over the uncomfortable truth of the world, which is that there is no line between "nonaware" and "sentient" and "sapient", these are human constructs at their core, influenced by prevailing ideology. humans deciding who is similar enough as a human to be a person. ways of thinking and being exist on a multidimensional field and "sapience" is a little circle, a little fortification, a little gatekeep of personhood, drawn around a spot in that field where the thinking seems similar to how neurotypical humans think. there is an entire universe out there tho. and theres no reason that humans should expect to meet someone who thinks like a human, or that in a sci fi context the only way to become a spacefaring species is to be like a human and do it in a similar way to how humans did it.



in reply to @spiders's post:

this is part of a bigger worrying trend I see in HDG, where it's casually asserted that the affini can casually Know what constructed categories a person belongs to, in a hard, concrete way, by looking at their neurology or other inner workings. I read one once where the affini asserted that the character had... some set of vague microlabels, as if those represented some True, Tangible, Meaningful Universal Category independent of the communities that coined them and the context in which they were originally used. it's a really insidious ontology that's been baked into that universe.

yeah a lot of fic writers have really not done a lot of thinking about this sort of thing. there are some that go against this, that say: no, the only person who can know who or what you are is you, nobody else can tell, there's no brainscanner, it's for you to decide.

in our half assed attempts at writing fic we try to also assert this, even when neurology comes into play. like with the fic we wrote where a plural system has one member who's a pet and one who's independent, there's no way for the implant to know who is who definitively, even tho it has access to the neurons! it just makes guesses based on who's likely to have which thought patterns and mannerisms, and sometimes it gets it wrong, because its only guessing.

but yeah it's extremely frustrating when this kind of thing comes up. there was another fic where basically a doctor was like "i would need to examine them to be certain if they really are a plural system" and it just felt Oh Fuck No.

it comes from i think a very understandable place of like, wanting the comfort of someone coming in and saying "yes, you are inherently, without a doubt, trans/plural/asexual/whatever" which is a thing we wished for at one point when we were inexperienced about these things.

but its a desire you have to move past because ultimately its not a realistic desire.

i do really think a lot of it is just like... it is very hard to determine where one would stop, specifically because there is no line? naturally the goal is, yknow, respect and understanding of all creatures, but... ok i need to kinda be reductive about it to get my point across. like- i don't know if i could make a reasoned compromise with a mouse. we have to live in a world with each other in it but, yknow, the mouse doesn't give a shit why i feel the need to hide my food from it and try to get it to leave.

a long while ago we read some of chris wayan's work, and at that early age the concept of "oh, this group of people is quantifiably Less Sapient than the others" freaked us out enough to hope there was a defining line somewhere. the concept is very fraught IMO, just as fraught as drawing the line, for lack of a better phrasing, too high. two sides of the same coin? i dunno

sorry this isn't... coherent or well-reasoned. i am very tired. i hope it's wordsable though

Yeah this is a minefield. I find it difficult to even start thinking about this, since yes, apparently i already don't think like a human, according to the people whose opinions hold power. I guess the only reason i get any shred of respect at all is because of the arbitrary biology of a vessel that i'd rather transcend, but would fulfilling that dream waive my remaining rights? It's so backwards that it's literally reversed

It feels like the working definitions of sapience, humanity, and personhood are drawn up to be as conflated and exclusionary as possible and cannot withstand any (potentially looming) trial by fire. Of course i'm hoping that trial is a breakthrough in morphological freedom and not, like, techbros rushing towards AGI. Nevertheless, i fear that the level of body modding that i yearn for will cause the violent collapse of society—and not for any issue with the mods themselves. This problem is a lot more than a sci-fi writing thing, hopefully i'm preaching to the choir with that

And as far as the infinite world of fiction goes, i don't have anything to add that can top your own closing paragraph yeah

the parogenic community has a bit of a problem with this. it's less of an issue now, because everyone realized "hey this is a shitty thing to do to another person" but it used to be normal for hosts in the early stages to use a "sentience test" whenever they were having doubts.
like, talk about messed up power dynamics. imagine that the person you depend on to exist at a basic level is feeling a bit down, so they ask you to surprise them (or something), and if you don't, there's a chance they'll just give up.
so that's thankfully a thing of the past, but the echoes of that ideology are still around. like, there's an MRI study that should be coming out in a year or two and a lot of folks are hopeful that it'll prove... something. and of course there are the eternal battles over the DSM-5 as if that means anything at all.
it's weird, because you'd think that this is one area where rejecting that anyone else can measure personhood would be not just liberatory, but actually useful?
but what do I know.

On one hand I agree with you, determining sapience or sentience is arbitrary, kind of like trying to measure the soul. Humans perceived a pattern, made up rules to explain the pattern, extrapolated those rules as if they were laws, and eventually the cart gets put before the horse and the rules get applied to things where they don't make sense, or get used as justifications. The same thing that makes humans good at building on previous experiences makes them susceptible to fallacies. A naive exploration can then very easily miss the dangers of the mindset and conclusions it has reached.
On the other paw, 'we live in a society' and there are 'rules' that, even if they are not correct, are what will decide things like personhood. Laws and regulations HATE us, because we don't fit cleanly into boxes and litigation. In a world where AI has progressed enough to actually function as an individual, I pretty much guarantee there will either be 1) a test that decides when they count as a person, 2) a limiter designed to prevent them from becoming what society deems a person, or 3) a refusal of personhood regardless. Of the three, the option where they meet some sort of criteria is bleak, but still probably the least bleak option.
Now, does that mean that option is unambiguously good and should be presented uncritically? No.
I don't really write stories myself, but I do write code, and one of the most important but frustrating lessons a programmer has to learn is that sometimes the correct answer isn't the right answer. Society is based on compromises, and half of them aren't ever even vocalized. In that mishmash a lot of people don't get the protection they should. You brought up an Alzheimer's patient as an example, but at a certain point they often DO lose their legal personhood. Sometimes that is for their safety, but sometimes it means they lose the right to choose things even when they are capable of making the choice.

Tldr it's complicated and society likes to standardize things that are complicated.