i took a peek at the furry community on bluesky and there's a lot of the usual clamoring you see in new social media platforms. "we have an opportunity to reinvent ourselves! when the twitter 'main characters' arrive, do NOT engage! everyone communicate with each other in good faith! no quote dunking! no needlessly incendiary takes!" etc.
it's great that everyone recognizes and sincerely wants to treat the symptoms of twitter brain, and i'll admit i'm not the most qualified person to analyze social systems, but i struggle not to roll my eyes when the proposed solution to ANY problem involving large numbers of people is "this time everyone be on their best behavior". i don't think there's such a thing as "twitter without the consequences of twitter". i think that if you want a social space where people predominantly act in good faith it needs to be designed to disincentivize bad faith behavior. i think it's overly ambitious to go to a platform that opaquely incentivizes bad faith behavior, to acknowledge that those incentives are immutable properties of the platform itself, and then to say "please don't, though". i would love to be proven wrong.
Systemic problems need systemic solutions. Twitter didn’t become what it was due to individuals making poor decisions. Like you said, the system rewards certain behaviors and disincentivizes others. The problem with a society that fanatically teaches everyone the ideology of individualism is that people become incapable of understanding or effectively shaping group behavior.
I’m running Fursky, the Bluesky furry Discord community.
One of the key components of our moderation is that people who are asses have already shown their history. It’s usually trivial to spot someone who’s a fuckwit. You deal with these problems not by starting over or hoping everyone will be on their best behavior - you eject them out the fucking airlock before they even get a chance.
Building safe communities requires proactive action, and that means knowing your threats when you see them. You can’t forget the history of a community just because you’re on a new platform.
We keep us safe.
Yes! That's how actual, real-ass moderation works. And it's something you can't do via algorithm - not now, and not in some theoretical near future with fancier tech.