
shel
@shel

Just to be absolutely clear so that we are all on the same page. Here is the Cohost policy that everybody seems to be angry about:

  • child sexual abuse material is illegal, morally repugnant, and banned from cohost. we will fully cooperate with law enforcement in prosecuting anyone who posts it.
  • “child sexual abuse material” refers to realistic depictions of human minors, either actual photographs/videos, or visual art difficult to distinguish from actual photographs/videos
  • this also includes non-explicit depictions which are obviously intended to be sexual. (e.g. creepshots, underage “fashion modeling”, etc)
    fictional 18+ content featuring minors
    [...]
  • do not post photography or photorealistic depictions of minors
    • as described above, photography and photorealistic sexual depictions of human minors are illegal, morally repugnant, and banned from cohost. please refer to that section (under “other prohibited content”) for full details.
  • do not post sexually explicit non-photorealistic visual art of characters who are apparently minors.
    • we will not be using third-party evidence (i.e. fan wikis) when determining if a character depicted is a minor.
    • non-explicit art with suggestive themes is allowed with a mandatory content warning
  • written material is generally permitted with a mandatory content warning, barring two major restrictions:
    • do not post written fiction involving real people (i.e. RPF) who are minors or depicted as minors.
      • this does not apply to content written in a documentary capacity (e.g. describing personal experiences) however these instances must include a content warning
    • do not post anything encouraging, glorifying, or advocating sex between adults and minors.

I keep seeing posts that say "Why won't Staff ban child pornography?" and I am so confused. Where in this policy is child pornography allowed? The part where child pornography is described as "illegal, morally repugnant, and banned from cohost?" The part that bans "anything encouraging, glorifying, or advocating sex between adults and minors?" The part that bans "sexually explicit non-photorealistic visual art of characters who are apparently minors?"

Without the clause allowing "non-explicit art with suggestive themes" you couldn't have panels from Naruto or One Piece—which are in the teen section of your local book shop. And Cohost requires you to add a content warning for it, which Barnes & Noble does not do.

As for written materials, it's extremely difficult to determine at what point Lolita turns into lolicon when it's in writing. We can't just ban any piece of written fiction where bad things happen to a minor because someone might be getting off on it. Even so, the policy still bans "anything encouraging, glorifying, or advocating sex between adults and minors." So if a piece of writing does seem to be glorifying or eroticizing what's being depicted, then it's not allowed.

This policy, all in all, is stricter than what you'd find on most websites, and stricter than what's accepted in the mainstream vis-à-vis what you find in book stores.

Anybody who read this policy without the context of there having been a whole process of creating it and modifying it based on community feedback would look at this and understand it to be a policy banning child pornography. I don't understand what forms of child pornography aren't banned by this. Photos of erotic pottery depicting apparently underaged mythological figures? The policy is so specific because "pornography" isn't a very precise term. Avoiding vague language is good policy writing 101.

I can't think of anything that I would consider child pornography which this policy allows. Child pornography is already banned on Cohost. Nobody is advocating for pedophilia or defending child pornography. In fact, doing so is a bannable offense on Cohost under this policy and if you actually do see someone defending pedophilia on here then please report the post.

Also, as a reminder, nobody under the age of 16 is allowed on Cohost. This website is not meant to be "a safe place for minors."


aetataureate
@aetataureate
Sorry! This post has been deleted by its original author.


in reply to @shel's post:

I timed this one so that the west coasters would see it since it seems like every night I go to sleep at midnight and wake up to some sort of fire that happened overnight.

I used the phrase "child pornography" because that's the phrase being used by the people who are criticizing the policy for "not banning child pornography."

The policy does explicitly ban CSAM using that language, which for some reason this crowd considers not to cover everything they'd call "child pornography," as if CSAM were only a subset of it.

clearly this discourse is not being had from a place of everyone being well educated on the subject matter

I also want to stress that I understand your reasoning. Some people think CSAM does not cover "child pornography." But there's no need to concede ground to people who are either uneducated on the matter or simply have ill intent. The reason we don't say "child pornography" (even if some people do) is that we understand that child sexual abuse, even when filmed or photographed, will always be abuse. Porn is art made for titillation purposes by consenting adults. CSAM can, by definition, not be porn.

100%! There is no way to read this policy in good faith and come to the conclusion that the staff is somehow secretly pro-child porn.

And honestly, if the Puritans think that this is some kind of safe haven for pedophilia, then I wish they'd make good on their threats to leave.

I am speaking as someone with zero interest in any of the fetishes under discussion. My concern is that a group of self-righteous moral crusaders have been emboldened to harass the staff and other users.

It is so easy to indulge in righteous indignation; it feels so good to shout down someone you see as your moral inferior. But that's not how you treat someone you care about, or someone you're interested in building a community with. If these folks aren't interested in being kind and compassionate, if they're not willing to engage in good faith, then they should leave.

It is kinda fucking wild. And it's not even only anime and manga that make things weird, here, I mean

Some of the most popular shows, if hype is anything to go by, of the last several years have been shit like Euphoria and Game of Thrones. Which. While clearly all of the actors in any given lewd scene are legally clear, their supposed character ages get real fucking uncomfortable fast, especially if you're relying on "fan wiki" information.

To bring Game of Thrones up again, the written works are even worse on this. Personally could not read them with how stuff goes. Like. Not going to be calling for their Ban or anything but yikes.

I was one of the people who did write an email to staff when the initial Thing picked up steam, because I believe that initial version of the guidelines did allow for something I don't think should be permissible on a public platform like this.

And, yeah, you're right, what we have now is precisely what I was asking for in that email, and I see absolutely no fault with the way the guidelines are written now—I am baffled at the sheer poor reading comprehension on display when people say that "writing it is still allowed", it definitely isn't and it takes a worst-faith read to assume the staff aren't going to err on the side of caution. I'm exhausted by this entire discourse at this point, and I do think some people are taking it way too fucking far by bullying staff; the change we needed, in my opinion, already happened.

There's a whole other discussion to be had about how quickly the change happened, too, largely because while I am grateful for the change, I don't think adjusting guidelines within the same 24 hours of them going live is a good thing—but I can't blame staff for desperately trying to quell such violent backlash. Even though I agree with the sentiment that csam has no place on cohost, people need to actually give staff the fucking time to figure out what needs to change, rather than demanding immediate same-day action. As you pointed out, staff does take their time on shit like this, and they should be allowed to take time on adjusting it, too.

The terminology thing is something that CSA survivors and organizations that fight against the CSAM industry are very, very stringent about. It's not a way to nitpick; it's a precise term meant to emphasize that CSAM is a recording of a real child being abused, in order to prevent people from thinking of it as something sexy. Even in this post I used the phrase "child pornography," and someone in the comments linked to an article from Child Rescue Coalition explaining the term CSAM and telling me not to call it porn.

For like 3 hours Cub wasn't banned. Now it is. So what is the problem? I asked what the policy allows that you think shouldn't be allowed. You said Cub. But Cub is not allowed! Now you're saying it's good it's not allowed but bad that it was, briefly. Well, that's not the original question.

Part of my problem with this line of language is that CSAM explicitly means "child sexual abuse material": material created as a result of sexual abuse targeting a child. That is... patently not the case, I would like to think, for the majority of the art being discussed in recent policy decisions. That there is a risk of it being the case--instances where cub art is based on actual CSAM images--justifies policy action, sure, but the fact of the matter here is that most artists in these communities are not engaged in the consumption of CSAM, even if they're drawing underage characters. Because they're drawings, and not actual real-life children being abused.

91% of child sexual abuse happens either in the home, with family, or in the care of extended family or trusted family friends. The majority of CSAM proliferated online is a result of cases like that, where a trusted individual in the family's circle, or heaven forbid a member of the child's direct family, has committed heinous abuse, documented it, and proliferated it on the internet. That abuse usually isn't taking place in the form of drawings.

This terminology matters for the reasons Shel has addressed already. Even under the original policy guidance, there was never a world in which CSAM was acceptable on Cohost. Cub and lolisho can cause real world harm without being CSAM, without even being adapted from CSAM. Both of these things can be true, and the distinction of language used serves to honor the experiences of survivors.

Everyone getting up in arms about the language and throwing around flattened abuse terms, while maybe understandable bc of a lack of knowledge and the fact it’s such a triggering subject with high stakes, is the reason why it’s important not to flatten the language around the materials involved in the policy and the mechanisms of grooming.
