Sheri

it's worth fighting for 🌷

Writer of word both truth and tale. Video producer, editor, artist, still human. Hire me?

Check #writeup for The Good Posts.

Slowly making a visual novel called We Will Not See Heaven, demo is free. Sometimes I stream, or post adult things. Boys' love novel enthusiast. Take care, yeah?

💟💟💟
TECH CAN ONLY BE AS KIND TO US AS WE ARE TO ONE ANOTHER.


🖥️ blog
sherishaw.net/blog

thewaether
@thewaether

it is clear to me that no mechanical change to social media will reduce the toxicity. people will always try: they've tried changing the algorithm, using "AI", changing the gamification... and it doesn't work

and it doesn't work because there's only one real answer and it's an answer none of the people running these sites want to hear. you just have to ban the people who cause trouble. there's people out there who simply want to cause trouble, hurt people, and fuck things up, and they will stop at nothing to do it. no mechanic will get in their way because they will try their best to work around it.

...Jack Dorsey's run as CEO of twitter was characterised by this ignorant sense of optimism that everyone on the site was interested in approaching it with good faith and posting only the truth, and he seemed to believe that the only reason "bad" posts would appear is because someone with good in their heart had simply been misled, or misunderstood the site's rules.

...this eventually led him, in his ignorance, once the toxic posts started coming without end, to assume that the neo-nazi point of view was "winning" in the public sphere because it was the viewpoint that was argued the best. "if you disagree with it," he would say, "why don't you make a counter-argument?" but it doesn't work that way, because trolls on the internet, as I already said, don't like rationality. they don't like arguing. they like to come in and destroy things, and the thing that pisses them off most is barriers that stand between them and destruction. they don't want to make friends and they don't want to help the site run smoothly. and if a site admin doesn't get that, then I just assume they're probably one of them


vectorpoem
@vectorpoem

Banning users for unacceptable speech requires you, a platform owner, to hold and publicly state values with regards to what things are unacceptable to say. Even the most libertarian-right freezepeachers have speech they won't tolerate - say, posting a photo of their home address with your current GPS coordinates a few blocks away and a gun sitting next to it - and it's adorable to watch them pretend they don't.

The reason they are so reticent to do this, though, is that they see it as being pure downside. If you state your values, people will criticize you where their values differ. You'll lose customers, beyond just the people you had to ban. Capital Brain tells you that losing customers is always bad, even if those customers are driving away other customers and making everyone miserable. You lose the safety of being able to pretend that you are "apolitical" and above it all. You're down in the muck with the rest of the human race, in the messy eternal squabble over what kind of world we actually want to live in - a domain where technology can almost never solve social problems, not really.

So that's why you could hardly torture a moral stance of any clarity out of Jack Dorsey. That's why Steam's user culture was overflowing with white supremacy for years. The unbearable, unavoidably accountable daylight of saying they believe some ideas are good and other ideas are bad, sizzling the vampire's flesh.


Sheri
@Sheri

there's a lot of games that theme themselves in ways that make it inherently dangerous to treat their fans as a Group, a Unit, a Cult

thinking about how payday 2 put in-game items behind joining the steam group. and then held community drives to increase the number of people in the steam group. social media pushes to get more followers, more people, more numbers

all the while the actual content of the game they're making is "ancient aliens control america from under the white house, use your guns to take hostile control instead". this all riding on the back of an ARG 'secret' involving hints in official announcements, coded messages in posts, and an overall air of "what you see isn't all there is to see, don't trust anything told to you by authority"

this was all unfolding over the course of late 2018, a famously good time to make your game's aesthetic around a secret cabal that controls the american government which must be violently overthrown.

the companies aestheticizing themselves to cultic conspiracy while doing follower drives and asking $20 american dollars for cosmetics to 'pledge your support!' probably don't see what they're doing wrong. probably.

but steam could. valve could easily look at this and go "hey guys maybe cut it out with the cult shit, we've made so many concessions for you as is". but valve goes where the community goes, a community built on the back of war shooter internet computer games

they're too libertarian to care. and damn if that group didn't have millions of people in it arguing about assault weapons and what's underneath the white house. i'm sure this is fine.



in reply to @thewaether's post:

it is clear to me that no mechanical change to social media will reduce the toxicity.

That runs counter to my experience, but I think we could just be drawing different lines around what counts as a mechanical change. I agree that recommendation algorithms and such can't be counted on to solve moderation problems, and site-level moderation will always be important.

In addition to site-level moderation, there are other, smaller-scale mechanical choices that can make a difference to how things play out socially.

If the block feature creates a big notice saying "you have been blocked by (specific name here)" that someone can screenshot and parade around like a trophy, that's a mechanic with social effects.

If blocklist information is public enough that people are afraid to use the feature for fear of being targeted specifically for who they've blocked, that's a mechanic with social effects.

If having a conversation necessarily broadcasts that conversation to all your followers and you inadvertently draw other people into a fight just by participating in one yourself, that's a mechanic with social effects.

If there are user-moderated groups where mods can remove off-topic posts and boot the trolls, that's a mechanic with social effects.

If moderation is all ToS-based and site-level, and all you have is tag subscriptions that are vulnerable to off-topic posting and other misuses that are technically within site rules yet socially disruptive, to the point that all people can do is yell at each other to please behave, that's a mechanic with social effects.

So in light of things like these, mechanical changes can definitely make an impact on the social atmosphere and, by extension, toxicity. But no site should treat user-end options as a reason to dispense with site-level moderation, either.

in reply to @vectorpoem's post:

Important to remember that pretending to have no content moderation or even the understanding that such a concept existed was the keystone of social media companies' defense against legislators coming after them for, say, all the child porn. Or helping coordinate a pogrom for any police state with a buck. If they were seen to coherently enforce policies against anything less vile, so the toddler-logic goes, they'd be taking responsibility for all of it, and being responsible for something like Facebook or Twitter is the kind of thing tribunals hang people for

in reply to @Sheri's post: