lupi

cow of tailed snake (gay)

avatar by @citriccenobite

you can say "chimoora" instead of "cow of tailed snake" if you want. it's a good pun.


i ramble about aerospace sometimes
I take rocket photos and you can see them @aWildLupi


I have a terminal case of bovine pungiform encephalopathy, the bovine puns are cowmpulsory


they/them/moo where "moo" stands in for "you" or where it's funny, like "how are moo today, Lupi?" or "dancing with mooself"



Bovigender
bovigender pride flag, by @arina-artemis



GoopySpaceShark
@GoopySpaceShark

If journalists truly think AI is going to supplant them, then I suppose in a way, journalism is already dead. An AI can spew out coherent - and often outright incorrect - sentences with all the sensationalism you want, and without having any kind of moral compass of its own.

Journalism is meant to be unbiased, factual, fact-checked reporting. An AI can't fact-check, because an AI does not truly comprehend sentences, let alone interpersonal relationships, politics, or any advancement in any scientific field. An AI is biased by whoever puts together the initial framework, by their motives, and by the vast quantities of (ironically, often auto-generated) training data it scrapes from the internet.

It kinda feels like actual journalism died shortly after the Snowden leaks, as if journalists decided all the facts would just come to them -- or maybe it's just upper management once again optimising for clicks and ad revenue. I'm not sure the distinction really matters any more.



in reply to @GoopySpaceShark's post:

I know that AI is what the techbros and media call it, but it isn't AI. It's all Machine Learning and Large Language Models.

An AI would have comprehension and self awareness.

A learned machine is just very good at following the rules it was trained on. It can never leave the bounds of its programming, because it can't 'learn' anything it wasn't told to 'learn'. It doesn't actually know anything; it just knows how to take an input and produce an output. And it has no ability to assess whether its output is a good output, since it doesn't actually know what words are, only indexes of what the units of its output should look like.
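(To make that concrete, here's a toy sketch: not how any real LLM works internally, but the same principle in miniature. It's a bigram model in Python that 'writes' text purely from counts of which token followed which in its training text; the corpus and function names are just made up for the example. It produces plausible-looking output with zero understanding and no way to judge itself.)

import random
from collections import defaultdict

def train(text):
    # Count which token follows which: the model's entire "knowledge".
    counts = defaultdict(lambda: defaultdict(int))
    tokens = text.split()
    for a, b in zip(tokens, tokens[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length=10):
    # Walk the counts, picking each next token by frequency alone.
    # Nothing here knows what a word means or whether the output is good.
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break  # never saw anything follow this token in training
        words, weights = zip(*followers.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

corpus = "the cow says moo and the snake says hiss and the cow dances"
model = train(corpus)
print(generate(model, "the"))  # plausible-looking, meaning-free output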

And if a true AI is ever made, well, we know how people get treated in this world.

(Apologies for the diatribe; this particular subject is galling for multiple reasons. Nothing you said was incorrect, OP.)