
posts from @lori tagged #meta


Yes, this is inspired by recent discourse, but it's not directly about it, and it's something I've written about many times elsewhere in the context of various scuffles. It's just obviously timely right now.

I see a lot of people in a lot of communities right now who see community rules as something that has to be algorithmic in nature. Your rules have to be hard and concrete, and if there's any room for a grey area, then you shouldn't have those rules. Obviously in the most recent discourse this was about lolicon/cub content: "what if you can't determine if it's underage?", "what if it's an elf character, is that considered human or no?", "how will you decide if it's porn or not?" And all of this is seen as a justification for why we shouldn't have rules against it.

But that's not how communities should work, and it's not how successful communities work. All rules have human input. Someone posting their GoFundMe a few days in a row could be considered spam by some sites; for other sites, spam is more like an account dumping a thousand links to a shady knockoff Nike seller. You have rules against racism, but there's no easy algorithmic rule to determine if something is racist; it's up to the moderator in charge to make that call. You have rules against trolling; it's up to the moderator in charge to determine if someone is trolling or posting sincerely. In a good community, a grey area will be handled with some grace--maybe the moderators are open to an explanation from the person who posted it, open to community input, willing to reverse the decision if they believe they made the wrong call. It's worth going through a little friction on grey areas in order to have rules that prevent the stuff that is so far away from the grey area that it can't even get cell reception anymore.

If you want an algorithmic view of community moderation, just let ChatGPT be your moderator. But you don't; you want human moderators that you can trust to do the right thing. Now of course, that requires two things:

  1. Y'know, human moderators you actually trust. If you don't trust them, then it doesn't matter what the rules are; they will do what they want. If you do trust them, then you don't need to worry about whether they'll be unfair with the rules as written.

  2. A strong community ethos. A community that shares what it stands for lets you know how it will handle grey areas. A community that is focused on not allowing bigotry is going to be more heavy-handed at removing things they think are bigoted. The grey area is enforced more harshly. A community that makes a big deal about free speech may do the opposite. The important thing is, you know what the community's vision of itself is. Now, it's easy for a site to say one thing and not actually follow it, but that goes back to the trust in #1.

I think it's an absolute joke to try to have a community where you refuse to make a rule because there could be grey areas. All of your other rules already have grey areas. Heck, Cohost has a rule saying they're allowed to decide whether your posts are funny. So I am not swayed at all by someone saying you can't make a rule against X because there are grey areas. That argument boils down to "you can't make a rule against X because it's hard to enforce." Most rules are hard to enforce. We enforce them anyway because it makes the community better. If a rule gets rid of 99% of the stuff that makes the site worse and is nowhere near the grey area, then I think it's worth it for the 1% left in the grey area to be considered carefully. Most of the time, most of the content you're moderating isn't very grey to begin with. But "this rule can't be applied perfectly in every scenario, so we shouldn't have it" can be said of almost every rule in a community, and at that point you might as well not have rules at all and see how far that gets you.



I've seen several people on here say they prefer Cohost's CW/hide system to Mastodon's. Can someone explain to me what's different about the two? Maybe I just haven't interacted with all the different mute features, but some of the comments I've seen sound identical to how they work on Mastodon, so I want to know what I'm missing.

(Just FYI, this question is about the actual UI/UX; this is not a call to debate how you think Mastodon users use these features or what different people choose to CW or not CW. I'm just curious what's different about them functionally speaking, because I feel like I have to be missing something here.)

Edit: also, just because I'm the sort of person who tends to want to Over Clarify Themselves: I definitely don't find Mastodon's tagging or content warning systems flawless. It's just that, from what I can tell, Mastodon and Cohost have largely the same systems, and I'm having a hard time identifying what's different about them that could make people feel strongly about one being better than the other.



My main takes on the latest Takes are pretty secondary but:

  1. I've seen multiple people say "there aren't transphobes/Nazis/etc. on Cohost". Extremely dangerous. Believing that is the first step to them getting away with being here. They're everywhere. Some of them just keep their mouths shut enough to not get banned.

  2. The constant invoking of Mastodon in these CW discussions is so pointless. Half the people doing so haven't been on Mastodon since 2017 (which was a wildly different landscape), and you have to remember how Mastodon works: replace "Mastodon" with "Forums" and you'll see why. "Forum culture is batshit with how they handle CWs!" as if forums don't wildly differ. I was on Mastodon in 2017 and have been back on it for over a year now, and I never see the weird CW debates. I'm sure they happen somewhere, but I never see them, and it's bizarre to think the entire fediverse is like that. My instance just requires CWs on a handful of things like NSFW content, alcohol...normal stuff. The "CW eyelashes" or whatever shit I keep hearing about just...doesn't exist to me. I'm sure it exists somewhere, but it does so in the same way Gab exists out there somewhere: I never see it, and I'd argue most Mastodon users never encounter this kind of thing. I think it has more to do with certain social circles than with Mastodon or Fedi. And I don't bring this up out of a strong need to defend Fedi; it's not perfect. But Cohost's problems are Cohost's problems: most people here haven't used Mastodon, and it is not affecting how they behave here. Assuming it's some sort of reaction to a small subset of Fedi instances that go too far with CWs means misidentifying the cause. It's really not relevant to what I've seen argued here.

  3. I've said it before and I'll say it again--issues like this cannot be solved by changing the way the site works. Social problems cannot be solved via code. You need social solutions. Conversation, moderation, leadership...that's what it requires. If people don't care to CW something, and you think they should CW it, you either need to convince them they should care about it, or they need to be mandated to care about it whether they want to or not. No change to tagging systems will solve the issue of "I Don't Want To Do It".



There are a lot of people online who think that when a site that handles payments has to appease a payment processor, that site has committed some kind of massive ethical crime. But how exactly do you all think that plays out realistically? Should itch.io shut their doors forever because a small handful of devs threatened their ability to process payments for every other dev, given that payment processors are a nightmare these days? Is that really some moral good?

I have bad news: if Cohost continues to grow, it will have the exact same problem. If they allow monetizing content and that content involves porn, they will hit the same roadblock and have to choose between banning adult content monetization or losing the ability to take money at all, which would mean death. People keep acting like this is due to some sort of puritan values from site devs. That's not the reality of it.