krveale
@krveale

I've seen some discussion about Cohost's intention to create a 'missing stair' rule, and wanted to talk about how and why that is actually a good idea, although one where an enormous amount will come down to how it's implemented.


I know of several major sites which clearly state that "staff reserves the right to infract and/or ban people for patterns of behaviour," rather than being tied down and forced to ONLY deal with people who clearly cross the line.

That way the mods can deal with people who repeatedly dance up to the edge of the line without crossing it, or with just general mess.

It's possible to do well, and not just on small sites.

Dreamwidth is an example of doing it well, and here's a thread from Rahaeli that details how and why that's necessary, including with examples: be warned, those examples feature awful nazi bullshit!

The RPG.net forums feature this example of a Rule 0, setting context for everything else:

Rule 0: The mission statement of this forum is "Keep the forums friendly and welcoming to as wide a range of people as possible." This principle guides all moderation. To the extent that the letter of the rules conflicts with the mission statement, the mission statement takes priority. The moderators may engage in moderation when necessary that is not based on the text of the following rules in order to keep the forums friendly and welcoming. Adherence to the letter of the following rules or “being right” is no excuse if your participation is destructive or disruptive to our communities.

I also want to highlight that the basic ability to say "staff reserves the right to refuse service" is a necessary tool even outside of dealing with extremist shitbags: it's MOST important in dealing with them, but not only FOR them.

The quotes which stay with me from Rahaeli's thread are:

People do not understand how the bad actors who want to use a platform for organizing and doing horrible things know your ToS and content policy better than you do. They will rules lawyer edge cases for DAYS.

and

this is 100% why every policy team needs someone who is empowered to look at a particular report and say "I am invoking the 'I was not fucking born yesterday' clause". Because you have to treat deliberate, systemic boundary pushing as a hostile act.

These statements are true even when you're not dealing with nazis! I have seen people do boundary-pushing fuckery over tiny, TINY things that they absolutely 2000% will NOT let go.

I know of an example from Twitter where the entire Trust and Safety team needed to weigh in on whether the statement "pussy ass bitch" was ONE insult, or THREE. Because that distinction somehow mattered, and someone was pushing back.

And in every case, people who do this kind of dancing up to the line are specifically trying to find a fulcrum/lever combination that will allow them to Get Away With Shit, whether that's a particular hobby horse or some other weird niche thing, or outright nazi schemes.

In many cases they feel justified and that they're the 'good guys,' too!

Basically on every site, if we get behind the curtain, we will find that an alarming amount of mod/staff time spent dealing with moderation reports comes down to two things:

  • Non-actionable things
  • Problems caused by a short list of people who make up most of the rest of the pile of things to work through, none of which individually crosses the line into clearly breaking a rule.

I actively suggest that staff need to be able to say "we reserve the right to refuse service, goodbye" to that short list of people, because they are literally not worth the time and trouble of keeping around.

HOWEVER. An enormous amount comes down to how this ability gets implemented in practice.

It's very important that the implementation of the rule comes from the staff and their experiences behind the moderation-curtain. They need to be able to respond to the people who are actually causing ongoing disproportionate problems, and not be driven by REPORTS of problems. In fact, a small number of people being responsible for a vast proportion of moderation reports can itself be a sign of an ongoing problem, and staff need to have that on the radar too!

The current statement on Cohost about what they intend to do is not policy, it is an emergency announcement that they intend to develop policy, so I'm giving it some grace. If it turns out to be the policy in itself, then the phrasing isn't great - but I'd be surprised if it was.

It's also a situation strongly rooted in its wider moderation context: there's discussion currently that moderation on Cohost has not been consistent in ways that have caused acknowledged problems, and that's a much trickier environment in which to ask people to trust this kind of approach.

So there we go: this initiative in itself is not a bad idea, it's not only something that works on small scale, and big sites HAVE made it work. However, context matters a great deal and we'll have to see how the specifics shake out.

I do also think that having one staff member responsible for moderation is not safe for that person: they could be superhuman and it would still be too much.
