customize your hot cakes with syrup


Yes, this is inspired by recent discourse, but it's not directly about it, and it's something I've written about many times elsewhere in the context of various scuffles. It's just obviously timely right now.

I see a lot of people in a lot of communities right now who see community rules as something that has to be algorithmic in nature. Your rules have to be hard and concrete, and if there's any room for a grey area, then you shouldn't have those rules. Obviously in the most recent discourse this was about lolicon/cub content: "what if you can't determine if it's underage?", "what if it's an elf character, is that considered human or not?", "how will you decide if it's porn or not?" And all of this is seen as a justification for why we shouldn't have rules against it.

But that's not how communities should work, and that's not how successful communities work. All rules have human input. Someone posting their GoFundMe a few days in a row could be considered spam by some sites; for other sites, spam is more like an account dumping a thousand links to a shady knockoff Nike seller. You have rules against racism; there's no easy algorithmic rule to determine if something is racist, so it's up to the moderator in charge to make that call. You have rules against trolling; it's up to the moderator in charge to determine if someone is trolling or posting sincerely. In a good community, a grey area will be handled with some grace--maybe the moderators are open to an explanation from the person who posted it, open to community input, willing to reverse the call if they believe they made the wrong choice. It's worth going through a little friction on grey areas in order to have rules that prevent the stuff that is so far away from the grey area that it can't even get cell reception anymore.

If you want an algorithmic view of community moderation, just let ChatGPT be your moderator. But you don't--you want human moderators that you can trust to do the right thing. Now of course, that requires two things:

  1. Y'know, human moderators you actually trust. If you don't trust them, then it doesn't matter what the rules are, they will do what they want. If you do trust them, then you don't need to worry about whether they'll be unfair with the rules as written.

  2. A strong community ethos. A community that is clear about what it stands for lets you know how it will handle grey areas. A community that is focused on not allowing bigotry is going to be more heavy-handed at removing things it thinks are bigoted; the grey area is enforced more harshly. A community that makes a big deal about free speech may do the opposite. The important thing is, you know what the community's vision of itself is. Now, it's easy for a site to say one thing and not actually follow it, but that goes back to the trust in #1.

I think it's an absolute joke to try to have a community where you refuse to make a rule because there could be grey areas. All of your other rules already have grey areas. Heck, Cohost has a rule saying they're allowed to decide whether your posts are funny. So I am not swayed at all by someone saying you can't make a rule against X because there are grey areas. That argument boils down to "you can't make a rule against X because it's hard to enforce," and most rules are hard to enforce. We enforce them anyway because it makes the community better. If a rule gets rid of the 99% of the stuff that makes the site worse and is nowhere near the grey area, then I think it's worth it for the 1% left in the grey area to be considered carefully. Most of the time, most of the content you're going to be moderating isn't very grey to begin with. But "this rule can't be applied perfectly in every scenario, so we shouldn't have it" can apply to almost every rule in a community, and at that point you might as well just not have rules and see how far that gets you.



in reply to @lori's post:

i feel like when people argue against the rule with whatever "oh it's so hard to enforce, what about-" reasoning, we should take them at face value.
they're arguing against the rule. that's materially observable.
the justification is weak and contradictory because it's an afterthought.

And to apply the grey area thing to the current discourse (I know I know I shouldn't but y'know):

People are really out here fighting over "but how can you tell the difference between a 17 year old and an 18 year old????" when I have seen content on this very site that was cub art drawn to look like four-year-olds (like, a lot of it as it turns out; I guess it wasn't banned at all in practice before now). Trust me, they do not have the same-looking genitals as grown adults do; it's very clear when people are drawing toddler shit. So what the fuck is the point of fighting over 17 vs 18 when we're talking about 4, and that's what we want to see banned? The 17-vs-18 debate is going to be a small fraction of what gets moderated, and it's worth having the 17 and 18 cases go through extra friction in moderation if it means getting rid of the 4.

I think a lot of the discussion is not just about rules, though, but also about the way rules are perceived. A common sentiment among one crowd is that any disagreement or rules violation is viewed as an absolute moral crime that constitutes being a child rapist, and so pointing out ambiguity in the rules demonstrates that that hardline perception is not coherent.

The problem with the discourse is that a lot of it was about moral castigation (from multiple crowds) rather than gauging how to effectively achieve common goals. Most points can't be viewed without that in mind.

People don't always have common goals, that's another issue with site moderation. Some people's goals are in direct conflict with others. In that scenario, focusing on a theoretical grey area is extra pointless because if you were able to magically remove the grey areas, one side would still want the content there and one side wouldn't. For the lolicon debate, there are people talking about grey areas who would not be satisfied if you could magically make that grey area go away because they still want the stuff that's nowhere near the grey area to be there. At that point, pointing out the ambiguity is going to come off as disingenuous, because whether there's a grey area or not doesn't actually affect how that person feels about the rule.

Sure, that's true for some, but that doesn't dismiss my point entirely. I'm pretty sure there are definitely common goals, like making sure the place doesn't become like certain "loli-porn-y" sites culturally, doing practical things to reduce grooming, avoiding objectification of actual children, and making this a good place for the traumatized to be.

Like yes, for some % of one side, pointing out the ambiguity means nothing because they'd want ABSOLUTE FREEDOM in any case, but I think the vast majority of both "sides" share those goals.

A lot of this is less about "what do the rules mean for content itself" and more about "what does the overall context of the discussion mean for the culture": there's a fear of both "this'll become like 4chan or Inkbunny" and "this'll gain a toxic harassment/policing culture like parts of tumblr", and so people make statements and stake out subpositions that are less about "how does this interface with the rules" and more about "how does this interface with my fear of Cohost's culture potentially changing".

"I'm pretty sure there's definitely common goals about like making sure the place doesnt become like certain "loli-porn-y" sites culturally, doing practical things to reduce grooming, avoiding objectification of actual children, and making this a good place for the traumatized to be."

I don't think "these two incompatible goals are fine if we work together on other goals" matters when we're talking about how specific rules get enforced. For a lot of people, the problem with the loli porn sites is the part where they have loli porn.

And as for the traumatized part: a lot of people traumatized by this porn, who were groomed using it in the past, who have been assaulted, have left the site because it was allowed (or was going to be--and even if the decision is reversed, the fact that this is the second time it's been considered a possibility is too much of a risk for a lot of people). This is exactly what I mean: some people have fully incompatible goals. If someone says "I need to be able to post this art because of my trauma," and someone else says "I need to be on a site that doesn't allow that content because of my trauma," those people cannot be on the same site and be happy. There is not a common goal to work for there.

This has gone wildly away from my original post which was about grey areas, so I don't really want to relitigate this part further.

Yeah, I'm aware of the incompatibility between some sections of the population; my main point was that the arguments you talked about (the grey areas thing) often aren't just about rules but also about cultural aspects and moral castigation.