staff
@staff

hey folks, we’ve got a couple big trust and safety updates coming today, including some changes to the community guidelines. we wanted to go over everything here for transparency about what we’re doing and why.

first off, the community guidelines. we've gotten a lot of questions and reports about content that, while we considered it borderline but permitted, was absolutely in a gray area in the written community guidelines. we had internally developed a set of policies that we were applying to the small number of cases that came up, but had not publicly announced them because of some open questions we still had. this was a bad call, and moving forward we're going to be more transparent about areas of uncertainty and indecision in our policy.

here’s a summary of the changes:

  • we’ve added clarifications to the section regarding child sexual exploitation material, and how it pertains to non-realistic depictions of minors, in an attempt to provide clarity and consistency for enforcement.
    • internally, we had been drawing the line at the prevailing legal definition of “realistic depictions,” which includes photographs/videos of actual human minors, or content difficult to distinguish from actual photographs/videos.
    • policy around non-realistic depictions, such as lolicon/shotacon, has not yet been finalized. we don’t want to implement a policy that the majority of users would feel uncomfortable with. we are currently working to implement a system to allow us to get user input on this area of policy. until such time, please refrain from posting it; up to this point, we have been asking people posting it to remove it pending a final policy decision.
  • we’ve added a new section clarifying and adding new rules around content warnings.
    • previously, content warnings were only strongly recommended for posts containing potentially sensitive content. in most cases, this is still true. however, we are now requiring CWs for certain types of content.
    • this policy change is accompanied by a technical change that prevents posts with these CWs from showing up on unrelated tag pages. these posts will still show up on your dashboard (if you are following the poster), on profile pages, and in tag searches for any of the terms on the list.
    • the full list of mandatory content warnings can be found on our support site. this page is also linked from the community guidelines.
    • repeated failure to add mandatory content warnings, as well as attempts to circumvent the filtering system (such as by using numbers or symbols in place of letters), are considered bannable offenses. we don't want to ban you, so please be normal about this.
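a minimal sketch of how the "numbers or symbols in place of letters" matching could work, assuming a simple substitution table (all names here are invented for illustration, not cohost's actual implementation):

```python
import re

# hypothetical substitution table; the real filter's rules aren't public
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

def normalize_tag(tag: str) -> str:
    """collapse common number/symbol substitutions so a disguised
    spelling matches the same filter entry as the plain one."""
    lowered = tag.lower().translate(LEET_MAP)
    # drop any remaining non-letter characters (dashes, dots, underscores)
    return re.sub(r"[^a-z]", "", lowered)

# illustrative stand-in for the real mandatory-CW term list
MANDATORY_CW_TERMS = {"noncon"}

def needs_mandatory_cw(tags: list[str]) -> bool:
    # a post needs a mandatory CW if any tag normalizes to a listed term
    return any(normalize_tag(t) in MANDATORY_CW_TERMS for t in tags)
```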

the tag page change is live now. our motivation for this change is not to censor any type of allowed content, but to prevent certain kinds of sensitive content from showing up in large, more general tags. while we may make changes to this list in the future, all changes will come with a notice, as well as a grace period for users to start adding CWs to their posts.
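a rough model of the visibility rule described above, with an invented stand-in for the real term list (this is a sketch of the described behavior, not cohost's actual code):

```python
# posts whose CWs include a listed term are hidden from pages for their
# *other* tags, but still appear in searches for the listed terms
# themselves (and, per the post above, on dashboards and profiles)
MANDATORY_CW_TERMS = {"noncon", "gore"}  # illustrative stand-in list

def visible_on_tag_page(post_cws: set[str], page_tag: str) -> bool:
    flagged = post_cws & MANDATORY_CW_TERMS
    if not flagged:
        return True             # ordinary posts are unaffected
    return page_tag in flagged  # only show on pages for the listed terms
```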

our goal is to provide a robust set of tools that allow everyone to customize their own experience to their level of comfort and safety. to support this, we are actively working on a system with which you will be able to completely hide posts that include CWs you never want to see and skip the clickthrough on CWs you do not need a warning for. these tools are being worked on in addition to general tag filtering tools. above all, we believe that you know your own preferences, limits, and triggers better than anyone else; our intent with these changes is to help you see the posts you want to see and none of what you don’t.
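a minimal sketch of what such per-user CW preferences might look like, with hypothetical field and function names (this models the described behavior, not an actual implementation):

```python
from dataclasses import dataclass, field

@dataclass
class CWPreferences:
    always_hide: set[str] = field(default_factory=set)  # never show at all
    auto_reveal: set[str] = field(default_factory=set)  # skip the clickthrough

def render_mode(post_cws: set[str], prefs: CWPreferences) -> str:
    """decide how one post renders for one user, given the post's CWs."""
    if post_cws & prefs.always_hide:
        return "hidden"  # suppressed entirely, never reaches the reader
    if post_cws <= prefs.auto_reveal:
        return "shown"   # no CWs, or every CW pre-acknowledged by the user
    return "behind-clickthrough"  # at least one CW still needs a click
```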

we also want to clarify that, thus far, we have not received any reports for content that, under the new rules, would require a mandatory content warning but did not already have one. we really appreciate that people are using the content warning system correctly, even before we had rules in place. the purpose of these rules isn’t to change anyone’s behavior, but to codify behavior we already saw, as well as to make our job moderating easier.

we are, as always, open to feedback on these policy and technical changes. this is a tricky, sensitive area to work in, and we're making sure to act deliberately and with consideration. this is not a sudden decision; we have been thinking over these changes for well over a month now. (related: having weekly hours-long conversations with your coworkers about lolicon kind of sucks, and we would recommend against being in a position where that's necessary.)

that’s all for now. please let us know if you have any questions or feedback and, as always, thanks for using cohost!

EDIT 10/7/22, 8:21am PDT:

In an attempt to reduce the amount of unconstructive nastiness and name calling in this comment thread, we are going to be removing comments (both "on our side" and not) that detract from actual conversation.

Please note: due to the sheer volume, we will not be sending emails to users whose comments were removed. These removals will not be held against you in any future reports. This is a special situation for many reasons. If you have any questions, you can email us at support@cohost.org.



in reply to @staff's post:

in addition to general tag filtering tools

As always, I appreciate very much that this is something you're working on - in general, and not just for this policy!

Regarding content like lolicon/shotacon - I hope your timeline works out so that tag filtering is available by the time you make a decision on this. I think it's not illegal in your country, but it is in mine, so being able to opt out of even accidentally seeing it would be appreciated.

Regarding content like lolicon/shotacon - I hope your timeline works out so that tag filtering is available by the time you make a decision on this.

yes, internally we consider this a non-negotiable dependency, along with geoblocking so that those posts aren't visible in countries where they're illegal.

that list should be linked from the 'new post' and 'submit comment' panels, I think, and the shorthands should have full descriptions (what is 'noncon'? i can guess but i suspect many people can't)

I want you to level with me here, and whatever your answer I'm not going to judge you. You're doing some great shit here, in the real world. But I need to know.

Are these policy updates in any way related to The Search For New Funding?

PS for other people saying it's "safer" to allow it, because it "doesn't hurt real kids", there have been cases where that's not the case (one of them being this https://www.vice.com/en/article/epz57p/manga-abuse-children-japan-sexual )

I have also personally witnessed someone grooming a minor trans girl, and this guy was the creepiest guy I've ever met, he was able to tell from memory which states have lolicon allowed. He was really into it. And he was an active serviceman in the US military too.

Except this article has no actual proof of harm. Someone being into something fictional doesn't make them commit crimes. Someone who is actually sick in the head enough to convince themselves a child can consent isn't the fault of an artist or a genre of fiction. Personally, I have never talked to any shota artist who actually thought CSA(M) was ok. You find a startling number are CSA survivors themselves and cope through their art (I believe there is definitely a connection there, but I can't prove it), and even if they aren't, the reasons they like it don't matter; thought crimes don't exist.

The reason why CSAM is wrong is because children do not have the capacity for, or the understanding of, what sex really is. It can be extremely harmful to them (I would know!) to go through that sort of thing. Consuming CSAM feeds into a cycle of creation, retraumatisation, etc. The same thing doesn't happen with fictional content. It's fictional; no one is hurt.

I don't blame any sort of fiction for why my abuser chose to be a predator and abuse a child, I blame him because he was selfish and knew what he was doing.

Blaming fictional porn for people committing actual crimes and hurting actual people is like blaming video games and violent movies for murders. The only difference is that one involves sex, and even then, the way the brain reacts to violence can be really similar to the way the brain reacts to sex (not literal intercourse, but the idea of sex). I've known people who were so anti-lolicon, and then it got out they were actually grooming minors the whole time. It has nothing to do with the fictional content they are into, but who they are as a person.

having weekly hours long conversations with your coworkers about lolicon kind of sucks

Yikes... I wouldn't wish something like that on anyone. Certain things have to be dealt with at some point though.

I've been looking into trying out this site and I guess I'm glad I did right when something important like this was being dealt with. As someone who generally leans to the side of anti-censorship, the tag muting system mentioned here sounds absolutely perfect to handle these sensitive cases. I wonder if making tags more immediately visible would help to avoid situations where someone might instinctively click through an adult content warning before seeing a tag they might not have wanted to see? (I have had my fair share of forgetting to read tags before clicking through those on other sites haha.)

One way or another, I won't really be affected by whatever decision is made regarding this so I won't mind the outcome. I still feel like it's important to play a bit of devil's advocate having seen the claims being made in the comments. I am not seeing many sources. I see a fair amount of "loli porn art supports and normalizes sexual abuse towards kids" and not a single one I've seen delves into how and where the info comes from. Having read research papers on this myself, I've usually seen the exact opposite being argued. I'm not trying to claim that it doesn't cause harm as I still want to read more on the topic, but it worries me to see some people be so vehemently against something without really providing any sort of proof.

That being said, I would still completely understand the decision to ban such content regardless. Image is important, especially for a social media site in its infancy, and a hot topic like this is EXTREMELY difficult to navigate. No matter what side is chosen, the other one will have people who demonize you. Maybe there's also some convincing papers I haven't read that might contribute to a ban. In the end my hope is really just that your decision is informed based on facts, let that be whether or not such content is harmful, what would harm/benefit the site, etc. I'm so sorry you all have to make a choice on such a sensitive topic like this.

P.S: I'd love to share the papers I've read regarding this topic, but unfortunately they use snippets of lewd loli art (albeit ancient and tame ones from like the 90s and early 2000s) for demonstrative purposes so I'm hesitant to really link anything here.

No one is using "underage" and "noncon" euphemistically in an attempt to conceal what is being talked about. They are valuable as tags for fiction because it's important to not obscure when we are talking about a real person being abused.

I cannot tell you how unserious I find it to deliberately eliminate the distinction between these things. It used to be possible for me to ignore content that triggers me, because it's not real and it can't hurt me, and I am begging you, whatever you think of this material, not to demand that people refer to it as the real thing and put readers on high alert.

These are both tags that provide information about the creator's intent and framing, because they indicate that the subject was approached as a sexual fantasy and not as a serious treatment.

It is actually good when people tag their porn clearly instead of trying to put it under the same category as realistic depictions of traumatic subjects.

Agreed. All drawing it does is normalize it, strip it of all severity, and sexualize the trauma of real people. While there are ways to depict it respectfully and with the gravity it deserves, those will never come in the form of an online blogger's porn drawings.

Also sorry if this comes with a ghost notification, I commented from the wrong page at first.

uh, guess I'm kinda genuinely wondering what the argument for allowing explicit loli/shota stuff (even behind a cw) is and where this has been popping up. appreciate everything you do for the site and don't need an answer, this just seems big and very sudden to me lol

this, which I found by accident, may have something to do with it

Edit: as I didn't include this information originally, and this has led to some confusion: that tweet was posted on September 29. The admins are taking this move now in reaction to it, not the other way around.

cw: link to a tweet by an artist who draws this kind of stuff. their pinned tweet contains images of it. the cohost account linked in the quoted tweet also contains images of it, behind a CW.

summary for people not wanting to click: the poster, a lolicon artist, talks about cohost potentially being a space for them, as they received a response from staff stating essentially the same as this post, saying they would be consulting the community to determine where to draw the line on things like this.

the artist then talks about getting people from a Discord server, presumably based around this same material, to sign up for Cohost "to make [their] voices heard" and subscribe to Cohost Plus in order to "lend [their] input a little more weight."

they go on to say "if we can build a strong userbase there, put our money and time where our mouth is, we have a good chance of supporting and fostering the site we want to be on." So, basically, textbook entryism.

Fortunately I'm pretty sure the staff here aren't going to take them more seriously just because they're paying $5. Getting a hard "no" from the Cohost community at large would provide good backup for banning this kind of material.

data point: the furry art gallery site Inkbunny takes a hands-off approach to letting artists post that stuff. One outcome of that is that Inkbunny and the artists who use it are regarded as Sus by the furry community at large. At this point the main reason someone would have an Inkbunny profile at all is to post things other sites specifically prohibit ("cub").

By taking a permissive stance, they made themselves a pariah and, in the community's eyes, a specialized purveyor of fictionalized child pornography.

Figuring out where to draw lines in a project like this is always difficult but I feel like, even only taking practical matters into account and not moral ones, it seems clear where a line ought to be drawn here, for the site's sake. Everyone generally tries to stay away from that sort of material, up to and including the various and oft-maligned ABDL communities.

Cohost has the potential to have some of the best CW and content filtering capabilities of any social media platform, which allows a highly permissive stance toward art while minimizing harm and discomfort for people who don't want to see various subjects depicted. However, drawing a line excluding things that could land people in jail for viewing them in most of the world seems to make sense on the practical side of things.

In terms of the personal impact of this art on others, other people are stating it better than I can in this thread. Cohost has generally had a thoughtful and considerate approach to things, and I hope this continues in regards to its content policies.

i think the concern that actively allowing controversial fiction makes the rest of the site sus is reasonable. id say look to a site like e621, which still allows everything that inkbunny does but doesnt get flack for it in the same capacity, because to see that content you need to get an account, go deep into settings, and manually turn on those tags just to see them.

I think including a default list of muted tags for all users could go a long way. i remember finding out the hard way what snuff was, and while i respect an artists right to make whatever they want, id rather never see it.

if you do it right you wont have to censor borderline artists or make most users aware that that sort of thing exists at all.

Yeah, I think this is the thing. If the risk is getting the reputation of being "the site" for it, the impact on everyone else is that the site you go to has become "that site", and it shifts the population in favour of just... that.

a lot of folks have said more or less this same thing but i'm responding more or less just to boost it; allowing pedophilic content, regardless of legality, is going to make the site more welcoming to pedophiles and more hostile to everyone else. i respect the majoritarian approach, and i understand that the viewpoints expressed in this comments section may not be representative, but this is content that, like bigotry, is actively harmful to victims. in allowing it you would be siding with those who perpetrate that violence, and not those who are victims of it. thank you for your consideration

If pedophilic content like lolicon/shotacon/cub porn is welcomed here I'm deleting my account and leaving right away. I don't want to be associated with sites that allow it in any way or form. That's all I really have to say on that topic.

pleased to see this strong, specific response in the comments of this post. i would personally note that referring to, say, "noncon" rather than anything invoking the word "rape" makes sense to me, just because anyone who posts it almost certainly isn't going to be calling it the latter, and so could consider the latter ambiguous

while i think getting community input before making decisions on what type of content is allowed or denied is great, i really don't see it being a good idea in this case.

not just because it's a "controversial topic" (my personal vote is Please God No), but because it leaves open the possibility of groups that create/engage with that type of content joining the site explicitly to shift the majority opinion towards allowing it, and exclusively to post said content - as others have noted, there are already cases of people advocating for this, and buying Plus in the hope that it'll help further shift site policy towards allowing it.

if that content were to be permitted, even under strict tag and filter requirements, i can see a lot of people choosing to leave - personally, while i'm not sure it'd cause me to dip immediately, i would definitely feel a bit uncomfortable even with perfect filtering, especially given its legal status. i'm especially concerned by the risk of cohost becoming a "safe haven" for posting that content, as other sites have in the past.

i hope this doesn't come off as too strong a reaction - i'm glad this is being discussed openly, even if i disagree with the concept of putting it to a user vote, and that it's under a de-facto ban until policy is finalized. but for what it's worth, i dislike the concept of this even being a user choice, but if it will be then my choice will be "hell fucking no".

in addition to the obvious moral problems, i think working around the legal issues would be a waste of engineering time.

i apologize for editorializing, and i'm sure those who engage with that content would disagree with my terminology, but i don't think it's worth investing the time to design adequate filtering just so a bunch of pedos in the small handful of countries that allow it can have a wank.

My personal feelings on taboo fictional content and paraphilias are complicated and not something I trust the open Internet with. That being said, I think your "having weekly hours long conversations with your coworkers about lolicon kind of sucks" comment brings up an important point to consider, which is your (referring to all of staff's) personal boundaries.

You are not a multimillion dollar corporation with lots of funding and personpower, but a small team that's still figuring out stability. If you decide that this topic is way too stressful/depressing/triggering to manage, much less moderate on a regular basis, and decide to simplify things by just banning it altogether, it would be perfectly within your right, even if the majority of the community feels otherwise.

(Hell, you could even say "it is not within our ability to moderate NSFW content, so no NSFW content will be allowed from here on out" and it would still be within your right. I'd be hugely disappointed about losing a space for NSFW and would spend some time fretting about how asinine "child protection" laws are wrecking safe spaces for fandom and sex workers, but my feelings about it do not change what's sustainable for you to manage, nor do I feel it's fair to lay the task of "solving" a society-wide problem at the feet of any individual small team rather than campaigning against the people pushing this at the top.)

Those are my two beans. Thank you, as always, for being so transparent with folks.

really agree with this - didn't want to add my thoughts on the matter to my comment because it was already long enough, but frankly i think if the staff don't want to/can't deal with it, it's entirely reasonable to just say no

Also agree with this. While it's noble to want to involve the user base in the decision, the fact is none of us are the ones doing the moderation; we do not feel the weight of that choice, the staff do. Therefore, decisions on what to host or not host should be up to what the staff can handle.

yeah, i think honestly this stuff is a whole can of worms wrt nsfw content generally, especially as it approaches borderline stuff, so mostly my feeling right now is that based on this post, and previous stories of other websites dealing with moderation, i would really not feel super comfortable with it being allowed on the site without the team feeling very very confident about the site's moderation capability

i'll also add that, given how i've seen nsfw artists start to come over here as a possible space for their art and community, i do appreciate and agree with the fact that you're trying to communicate openly and honestly about this, even though i know it's probably bringing some heat in from people. regardless of what you end up implementing i think being communicative with the people affected makes a huge difference.

yeah, especially as someone else said with regards to this being a small team site - y'all don't need to do everything, you're already doing a lot. I'm glad adult art is a thing here and also I feel it's completely fine to have difficult-to-moderate stuff like the specific tags being discussed above be "hey this is too taxing for our team to moderate effectively so Don't"

youre well within your rights to ban legal disagreeable works of fiction but censorship is a slippery slope and people reporting posts are always going to have a different line than the people moderating them. (see "made in abyss")

its going to cost a lot fewer human resources to push the undesirables into their own little self-managed box than to ban them outright.

hell, it even makes it easier to catch actual predators and save actual children. trust me, ABDL people will crucify an actual pedophile because a good chunk of them are child abuse survivors.

i would be blessed to never see shota/lolicon ever in my life.

but i make borderline content. i draw characters having sex in classroom settings, i focus on large age gaps, i have an artstyle that is stubbier than the average bara artist. my most popular OCs are a Daddy/son pairing. this is all stuff that these "cops" are going to report me for. i know this from experience and the (troll)death threats ive gotten.

putting that stuff behind a default muted tag list seems like the best option from my perspective as a risqué artist whos fallen victim to censorship creep on tumblr and patreon. I dont want cohost to end up like patreon where i cant post half my shit because they deem all hypnosis kink art as glorifications of rape.

My art is how i process my own trauma but i also do my best to give a guiding light for people that experience similar troubles as me. i want my art to say "you are worthy of love" and theres so many bad faith people out there trying to attack that.

One of the best things about cohost is the built in age gate. ive always assumed most trolls are children that wandered into the wrong part of the internet and havent figured out how to discern reality from fiction yet. i wonder if the comments on the community guidelines post would be as bad as they are if the post were marked as adult content.

I trust staff to make good choices, i will respect those choices, i believe they are doing their best. and im sick of dealing with the thoughts and opinions of people who never ever have to interact with me or my art.

in conclusion, if theres any blurry lines im going to personally suffer for it. quarantine it, dont ban it.

these comments are not fun and i hope you guys get tea or something after your next meeting reading these

All said as somebody that posts nothing but porn here...

I hope the @staff hasn't been so busy that it missed today's news that the U.S. Supreme Court is going to review two cases involving Section 230. With SCOTUS now having a majority that is just plain spiteful, this whole discussion could become a moot point very quickly as the Dishonorable Clarence "Uncle Ruckus" Thomas would love nothing more than to ban all sexuality from the internet.

That being said, for now, I would ask the @staff to think long and hard about this list...

https://help.antisoftware.club/support/solutions/articles/62000226150-mandatory-content-warnings

Do you truly want to defend this content both among users as well as legally?

Is it worth having, considering how many have already expressed their intent to leave if it is not banned by cohost?

There is freedom of expression and then there is common sense. Just like you can't yell "Fire!" in a crowded movie house, you shouldn't be showing people sexually abusing children...period.

...i kinda feel like this post explicitly states that staff have indeed been thinking long and hard about the implications and consequences, though; and that they would just want to ensure that their actions align with what people using cohost want.

notably, this comments section is not full of people saying "no, i want to see this content", which certainly bodes well for that!

I think on the whole, most people want to make sure their content is found by the people that want to see it, but not stumbled upon by those that may be upset by it. So I'm glad to hear that most of the people that have joined have been self-monitoring, as I do think people on the whole try to be good about that. Though, I'm admittedly surprised that extreme gore isn't on the list of things that are mandatory to content-warn.

That being said, once tag filters are in, that can do so much to help one person's browsing experience, which can make further discussion about the more sensitive topics easier to contend with.

the tone of this does seem to be a bit more "we are starting to see this, how should we move on banning it explicitly in addition to making it difficult for others to see" than a lot of people are assuming

consider: twitter explicitly does not ban it (which, IMO, sucks), and I'm guessing a majority of people on this site have a Twitter account. I didn't know Twitter didn't ban it until I found it by accident, which was a pretty unpleasant experience

This is exactly what I think this comment section needed. I was at Discord when the company made a similar decision, and I have not been at Discord for very long in the grand scheme of things. This is not happening in isolation or behind closed doors, and the idea that people are somehow shocked that these decisions need to be discussed at all is ridiculous to me, because of course they do. There have almost certainly been conversations about this at Twitter, but you don't know about them, because Twitter doesn't care to involve you.

Reading through the comments reminds me of the story of kicking a single nazi out of a bar to prevent more nazis from showing up and it becoming a nazi bar.

Cohost is still small, so shaping the community is still possible, but also important for growth. I'd like to have the site be known for all kinds of cool things, and not as a freehaven for some really questionable content (which might even be illegal in my country).

I think I'd support banning certain subjects if they present a moderation headache, cost way too much time in discussions about whether to allow them, or pose an existential threat to cohost because they attract creepy people. This does not have to be permanent; changing it when the site has grown and has better structures in place to deal with edgier subjects is perfectly acceptable.

i don't want to get real deep into this, but i will say that while i think this is basically a reasonable post saying reasonable things, it's also the first one from you folks that's left me feeling less than 100% trusting and invested in the site. i've been intending to go to plus, but i don't think i will be if this doesn't shake out as a ban of this kind of content - even though this kind of content is never on my radar as even existing, until i read a discussion like this.

whatever you think about whether this type of stuff should exist, it seems like a no-brainer that it should not exist here. i feel like the last thing anyone wants out of cohost is yet another environment that splits hairs about whether something fucked up and disturbing is actually illegal enough to be banned.

also, with the talk in the post about "prevailing legal definitions of realistic depictions", etc etc... i would just like to note that nothing obligates you to have that much specificity in your rules or enforcement, and often communities are better off without it. i'd rather be on a site where something borderline-but-maybe-debatably-paedophilic is, like... banned for being borderline paedophilic.

this is maybe the most i've agreed with an internet post in years. i appreciate the transparency of a team who clearly wants to democratize decisions that affect the community with direct input from said community, but i would much more appreciate there being a hard line for uncontroversial issues like this, like the one taken against fascism in cohost's design philosophy. putting something like this to a vote only wastes everyone's time and mental bandwidth, especially the team.

just ban it and work on cool shit. no legalese necessary; there's no slippery slope; nothing of value will be lost; there isn't a "pedophile community" to which this decision would be unjustly exclusionary; it's not anti-artist or anti-sex work. it's simply choosing not to platform harmful and inhumane content.

As a CSA survivor and shotacon (obviously, by the name) I feel the need to give my input on this: (TW for brief discussion of paraphilia)

Thought crimes do not exist. This idea of "this art is morally wrong to me, therefore it should not exist and should be illegal" is ridiculous and harmful. Loli/shotacon hurts no one; it's actually a good thing that it's allowed to exist in the first place. For people who are actually attracted to actual, living and breathing children, it's an alternative that causes no harm, and actually does the opposite of what some people on here are saying it does. It actually decreases rates of CSAM use and CSA. To me it's pretty obvious that fictional depictions of minors, animals, whatever the desired subject is, are a good thing. People cannot control what they are attracted to, and these attractions never go away (conversion therapy doesn't work for paraphilias, either); having a harmless outlet helps people.

I'm having difficulty typing this out on my phone right now, but people see no problem in violent video games and movies that arguably "glorify" violence and murder; they only see a problem when something involves sex. Liking violent video games doesn't make someone a dangerous person, and neither does liking dark fictional content.

I don't know how long my tangent is going to go on for, but I also wanted to add that it is not traumatising for me to consume this content, before someone tries to tell me I'm "retraumatising" myself. I see a psychologist, they know what shotacon is and they know I like it, and they literally didn't care and said it's actually an extremely good outlet for some people with their trauma.

There should be tags in settings for certain types of content (noncon, underage, gore, etc) and they should automatically be ticked to be off so people who actually want to see these things can manually click a box so they know what they're getting into. Most artists I've seen are good with tagging their stuff, so it shouldn't really be a problem that people come across it when they didn't want to.

I'm also curious, if it does end up getting banned, where the line will be drawn. Anyone can claim a character to be 18. Some characters look younger (ex: Venti) but are adults. Some characters look like adults but are underage (basically any character from JoJo). Will there be a line drawn with furry content as well? Personally I do not like any animal-based content, but people have the right to draw whatever they want (assuming they are not drawing actual people without consent/actual minors).

I can't really scroll up to read what I typed because it looks weird on my phone, so if there's any spelling errors or if it's all just generally confusing, my apologies. And sorry it's so long

Ok? That is no one's fault. Someone being uncomfortable or getting triggered by art isn't anyone's fault (unless someone is purposefully posting it where they shouldn't)

Personally furry porn freaks me out because it's based on animals and I don't like seeing those types of themes in nsfw at all but I'm not going to complain about getting "hurt" by it because it's a squick of mine

If you don't like something, don't look for it. Block those who post it. You are responsible for your own online experience and consumption, not random people online.

even if one were to take the most "devil's advocate" stance possible on this: nobody here is an animal who has been subject to abuse.

saying furry porn and shota are on the same level because they both cause "squick" to people who don't like them requires ignoring the material facts under discussion and the impact on others who have been subject to child abuse and are materially harmed by seeing it depicted in an eroticized fashion

You can argue that with any piece of media. A fictional movie about a murderer can upset and trigger people, but that doesn't mean it should be banned or censored when it's a piece of fiction.

And I am one of those people who was subject to child sexual abuse, and so are the majority of my friends who are into lolisho. My therapist sees no problem with me liking shotacon and thinks it's a good outlet for me, and personally it has helped me. Obviously for some people it will not help them, but those people should stay far away from it. Some things trigger me and remind me of my trauma but I don't try to control or censor what people say/make.

I don't really understand what material harm means in this context; from what I understand the term relates to monetary loss, but I'm probably just misunderstanding what "material" means here

The thing is, it really isn't my problem what other people are upset by. It's fiction; it's not based upon any real people. Personally there are types of fiction that I do not like, but I am not going to tell someone what they can and cannot draw/write just because it personally upsets me.

Again, someone who witnessed a murder might be upset by the depiction of violence in fictional media, but that doesn't mean people should stop writing it. Personally I don't really understand the appeal of gore for the sake of gore, but I'm not pro-censorship of it.

I appreciate the transparency from staff on this very fraught subject, and I'm sorry to hear about how much this particular subject has proliferated into your lives off-site. To clarify my own position, I am in favour of loli/shota content and am willing to provide a rationale if need be, though I suspect I had better save it for the referendum. That being said, where legality enters the picture (and questions of obscenity in particular), I cannot fault you for being hesitant, especially as a small group of devs who lack the funds for a robust legal team. Your tagging system is already very good (I pray you never change the case sensitivity feature), and I think toggling the mandatory content warnings to "off" for new users by default would be head and shoulders above what other sites are doing. For not being the OTW, you're doing a great job.

Back on topic... like several people in this thread and off it, I was also considering investing in Plus before this update was released. I will be holding off until a final decision is reached. My hope for the outcome is that even if you are not able to host that content at this time, you will remain open to it in the future. I would be willing to consider financially supporting Cohost if that were the ultimate decision staff made.

On the one hand, I'd really like to finally have a social media site that's anti-censorship-of-art without being an anti-moderation trashfire. On the other hand, you're a small team and shouldn't do things that stress you out, or put the site at legal risk.

For my own stances/ arguments against a ban here:

I'm not personally into lolisho, but I am into other controversial themes in fiction, and if the community is allowed to vote to ban one thing, then you've introduced the idea that the community is allowed to vote to ban "gross stuff" or anything controversial. The slope from "community bans lolisho" to "community bans shipping abuse" to "community bans everything" is historically extremely short.

My account was just activated yesterday - I just joined because my friends recommended it as a chill community that wasn't overrun by antis or censorship witchhunts. I was considering a plus subscription if I end up using the space, since I like what y'all are doing. But many responses from users in the comments here make me feel unsafe in this community and reluctant to invest in it, and I'll likely delete my account if lolisho is banned.

I think the suggestion I've seen in a few other comments here, of a default blacklist that applies to logged out users like e621 has, might be a good balance. If you need to for legal compliance, you could use geoblocking or "set your country on sign up" or etc to prevent things from being un-blacklisted. e621's system isn't perfect, and there's still people who go into the comments of art that's tagged with default-blacklist items to complain about that art existing, but "this was appropriately tagged, don't like don't read" should hopefully be easier to moderate than "there is not a canonical tag for this because we technically don't allow it." (Knowing humans, people are always going to post things like lolisho, but are usually pretty good about staying in their designated box if one is provided. "If it's at all borderline, tag it with this" would probably save you a lot of "but this character is actually 700 years old, it shouldn't be banned"/ "but that character is actually 16, it should be banned" moderation headaches. And the fewer avenues you have for users to use the mods to harass each other through, the better, and banning any dubious/ judgment call/ fictional content will create a lot of harassment avenues.)

I also know several people afraid of expressing their views in favor of lolisho being allowed in the comments here, because they're afraid of being harassed/ dogpiled by anti-lolisho members fwiw to anyone looking at the comment counts and judging by that. (Also, if there's a community vote, then who voted for what needs to be invisible to other users, otherwise voting rolls will have a pretty high chance of being used as harassment hit lists. Or people being afraid of that will discourage voting, even if the threat doesn't manifest.)

banning specific manifestations of fictional depictions of kink always leads to a bit of a "slippery slope" argument; i say this in good faith, introducing the individual judgement element to fictional content moderation always adds a gray area to what is and isn't allowed. in order to have an accurate policy regarding what is and isn't allowed to be depicted in kink fiction/art, you have to have a moderation team who's willing to look at each individual piece of fiction/art and make a measured, informed decision about that specific piece, and that's not always functional long-term nor is it the easiest mentally on a small moderation team who may be upset or squicked by engaging in-depth with certain topics.

it opens a whole can of worms, basically: is your moderation team all going to be on the same page about what qualifies as lolisho? as cub art? as ABDL kink? as littlespace? etc etc etc. what about moderators you hire in the future? what kinds of clear, consistent guidelines will you create that will leave as little room for interpretation as possible? what about the community--is everyone in the space you're curating on board with what the nuances of each individual term are, and which are banned vs. not? and at the end of the day, your moderators are still going to make individual judgement decisions as to whether or not a specific report falls into one of the banned categories, and these have the potential to be very controversial decisions.

you can also go the scorched-earth route, and ban any discussion or depiction of sexuality wrt minors or any aesthetics associated with minors--not something i am personally on board with, but it's respectable as a clear and consistent ideological approach.

Regardless of 'kodocon' and/or fictional underage content being regulated or banned, it will not personally affect me (except if it isn't tagged for filtering). I do consume other fictional 'taboo' content (BDSM, gore, non-con, exophilia) though. As @ bazelgeuse-apologist said, your team's boundaries are just as important. I also appreciate the transparency and improved filtering/warning tools! (And not calling CSA(M) "CP") /pos /gen

I also want to mention that others saying/implying paraphilias are synonymous with being a predator or enabler contributes to sanism/ableism and stigmatization. It's possible to combat abuse without spreading harm to marginalized groups. I think this should be brought up as breaking TOS. You could also add mental health resources on your about page. /gen /npa

i think it's an excellent idea to add technical features for all the child porn enthusiasts (elsewhere known as 'pedophiles') on your platform to identify themselves as such. in the next version i suggest you take the mandatory cw policy one step further and also imprison them in real life. you could call this the 'cogulag'

i appreciate your input, the first ever cohost gimmick account. while i certainly agree i don't want that shit on this website i find this an annoyingly snappy and twitter-like take

more seriously, as ive said upthread i think the existence of a more robust content filter system is useful for considerably less objectionable things that i nevertheless wouldn't want to see.

I'll be frank, the fact that this is even being considered is astounding to me. What on earth would possess you to even think it would be appropriate? On the site that, last I heard, was designed not to be a shithole of "free speech" as a code word for "yeah, we don't care if nazis hang out here". And yeah, I bet having to talk about it does fucking suck! That should probably tell you something!

(related: having weekly hours long conversations with your coworkers about lolicon kind of sucks and we would recommend against being in a position where that’s necessary.)

i mean you could have simply taken a principled stance against choosing to host loli and shota. just for starters

they're currently removing it, the post literally also says "we don’t want to implement a policy that the majority of users would feel uncomfortable with. we are currently working to implement a system to allow us to get user input on this area of policy. until such time, please refrain from posting it; up to this point, we have been asking people posting it to remove it pending a final policy decision." so it's currently not allowed, and they're trying to get feedback to ensure the safety and comfort of the community. this is a good thing compared to wholesale allowance like twitter

I'm in two minds about this. On the one hand, lolicon sucks, it's gross and disgusting and I don't want it anywhere near me, (plus it's illegal in my country,) nor do I want this site to gain a reputation for harbouring pedophilia. On the other hand, banning it outright takes considerable resources and moderation that I'm not sure Cohost has, and a robust tagging system that allows people like me to never ever see it would effectively solve the problem while still giving CSA survivors space to work out their trauma. It's a difficult topic, and I respect staff for being open and honest about it, especially when sites like Twitter simply allow it wholesale with a total lack of moderation. That said, I personally would lean towards banning it at this point.

The mandatory content warnings page should define their terms. I am familiar with most of them and can guess the remaining one from context, but this is not going to be true of everyone and you can't expect people to follow rules they don't understand (and besides, those are terms you might not want to search for yourself; for obvious reasons, anything that requires a mandatory content warning might result in upsetting material showing up in a search).

Appreciate y'all bringing this to the community so transparently. Decisions of content moderation & censorship are complicated. Personally, I'm largely anti-censorship but have the hardest time reconciling that with this sort of content. If I were in y'all's shoes I wouldn't allow it on such a "general use" site. Comes down to "what do we want this platform to be like", and all the potential danger, hurt, etc. that comes with this isn't something I personally want for the site or for y'all to have to navigate.

to provide a slightly countervailing voice: it's comforting to me that this needs to be a discussion. I really appreciate the staff's serious consideration of free speech issues, even if in this case it does end up coming down the other way. Despite my own personal leanings on this subject, which are roughly "why should this site be any more restrictive than legally required in its jurisdiction", the other commenters have brought up salient practical concerns re: site reputation/moderation workload, which in my mind sufficiently answer that question.

So I also side against this content, and encourage seriously weighing practical concerns against ideological ones; in the end, it's a higher priority that this should be a cozy site for our community, even if that implies some ideological compromise. As well, economizing moderation resources where possible means they can be freed up for site improvements, and I think it shouldn't be downplayed that staff morale is actually important for community morale here, at least at this time and scale. Take care of yourselves, seriously; this is a marathon more than a sprint.

they aren't being cool, being cool means not bringing a discussion about allowing pedophile content on the site to its users and just going 'wow, what reprehensible shit! ban them all!' like they SHOULD HAVE

I would rather see the staff of a site involve its community in things like this than to make those decisions on their own. What do we gain by being jerks to them? Especially when the decision they're making was already forecasted to be the one you and I would already want? What do you gain by treating the other people in here, who all have dignity and thoughts and personhood, like that? C'mon. Be cool.

I would rather have the staff care about its users enough not to make a huge post dangling over their heads announcing that they're going to be involved in a discussion about materials involving child exploitation. That is not good stuff at all. They did some serious harm to their users by announcing it this ambiguously. The new post says that this isn't allowed, which is great and good, so why did this post say it was going to be a discussion? There are entire communities of pedos mobilizing to get their voices heard here so they can claim this site too.

This is not just happening in a vacuum! This is the fiftieth or so social media site out there, this conversation has happened on all of them in some way or other, and it's absolutely vile. The default for any of these sites should be "no child exploitation materials, period." with no discussion. Not everything is up for debate.

You want benevolent tyrants or parents. That's not what you'll get anywhere. This is the work of shaping a community -- you're a part of it, and you being able to voice this at all instead of some faceless corporation deciding it should be a boon to you (a corp won't decide the way you want, they'll decide by legality).

This is about child sexual exploitation, friend, not about discussion of pineapple on pizza. This is about potentially allowing child pornography. This is not some small potatoes situation, this is about something legitimately harmful and real. This is exactly the kind of topic that should not have to be brought up, this is the kind of safety we expect as a baseline.

Let's take all of that as true -- it's not the baseline you get today. Is being rude to the people legitimately trying to help serve you a safer site and involve you in that choice the right play? Do you honestly believe in a theory of change that works like that?

This is the kind of baseline we should expect on a site run by people who claim to have the best interest of it's users at heart. This is the kind of baseline we should expect here, if this site is what they said it was going to be.

These people have experience with this very situation from other sites, they should understand. The post they made is absolutely baffling, they are straight up saying they are going to poll the userbase about this. Once again, this is about content depicting child sexual exploitation.

I assumed this site was run by "benevolent tyrants" (I'd have used something that meant more along the lines of closed leadership) otherwise what's the point of building a new centralized platform?

You may say I'm being "rude" but the reality is they should expect trauma responses when they post triggering content. Once again, this is not about something casual.

Were you exploited, as a child? Were you abused, sexually, as a child? If not I need you to reconsider what you just said.

You cannot exploit drawings. Lolisho is not child sexual exploitation. I was exploited, as a child. I was sexually abused. Comparing what I and other survivors went through to lolisho is deeply disingenuous and I need you to reconsider.

I understand that, but I also know the source of my anger is the repeated comparison of the experiences of myself and others to drawings. That's what makes me angry, and that's why I have been hurt by the tenor of these comments.

It hurts, a lot. These things are not the same and I wish people understood that.

The source of our anger is that someone did something so heinous to us and society at large still acts confused about what to do when it comes up. Survivors attack survivors, saying their survival is better than the others because they react to trauma "better" in their eyes. This narrow text window is making this hard, but depictions of child abuse are traumatic. Discussions of allowances of that content, traumatic. Having all my social media feeds come up saying "cohost is allowing child sexual exploitation content" hits my triggers, not only about the exploitation but about the absolutely vile way people discuss it. The attacks they make on others who they should have solidarity with.

I think it is EXTREMELY inappropriate to ask strangers if they were abused as children, to the point where I don't think that should be allowed on this site either.

If you read this post and your initial reaction is to imply that staff are pedophiles/pedophile apologists you should not be posting on this website, and should give some thought to not posting on the internet entirely

Allowing that material would have severely negative consequences for the culture and community of the website and it's surprising that this is considered a serious topic for discussion at all; this seems like it should be on the "ban nazis y/n" level of obviousness.

There have been some strong points made in the comments about the importance of getting this right. Maybe there's a deficiency of rightness in general that we're just now addressing. Instead of having to rely on existing norms, we could be changing them into new ones more suited to preserving our social fabric.

I see this like we're in the middle of putting our pants on one leg at a time with regard to depiction of abuse. What started as crude but commonplace whether recorded or imagined became verboten to record (but "tolerable" to fabricate) as consensus progressed on the definition of abuse. Now we're on what to do about the fantasized depiction, both on this site and in general society. It's not just a matter of national legality or personal disgust - we're faced with how to shape our norms to contain actual social harm that occurs through the act of depiction itself, no matter whether the subject ever lived or not.

I'm trying to square this with mentions earlier of sexualization as a way of dealing with trauma. That's its own valid thing out of context, but then why would the material be made if the trauma wasn't there to inform it? What would a world without that trauma look like, and what part do we, here, have in getting there?

At a minimum it can't just be a matter of putting up with what everyone else already does. We have to keep it from pushing the rest of us out (like the nazi bar scenario). Where to draw the line from there - between stopping the cycle of exploitation and pushing it elsewhere, with regard to moderation capacity - is a policy decision that I don't envy, and I thank our gracious website runners for putting it up for comment.

i don't believe that sort of content has any merit to being on this site, and i am looking forward to whatever means of feedback is provided

that said, do also keep an eye on those who have been openly advocating for swaying your decision-making process by joining now, and by subscribing to provide a financial incentive for you to decide in their favor

I research some related areas to this and it's a gnarly topic. I think the Cohost staff are really putting in the effort to dig into one of the most complex moderation discussions it's possible to have, and that's always thankless and rough. That is itself something I appreciate: thank you, staff!

Focusing on the pragmatic tensions involved, the core issue for me is mutually-exclusive accommodations and needs.

On the one hand, you've got people who have to avoid encountering fictionalised CSEM because it triggers them.

On the other hand, you've got people who find fictionalised CSEM helpful to them personally, due to their experiences.

Those two groups seem like the ones to tailor a response around, since they're the most adversely affected in either direction. As such, they get priority through triage, and their needs are more important. Even so, they conflict.

AO3's approach, along with the other sites using system-enforced tags, seems like the least-worst way to thread that needle. That way people who have to avoid fictionalised CSEM of various kinds can, and have security that they won't encounter it. Or, if they DO encounter it, it's actionable because someone's broken site rules through failing/refusing to apply appropriate tags. Meanwhile, the people who want that content still have a space where they can find it.

This also builds a structure where accounting for the needs of the most adversely-affected people here also includes people less adversely-affected on the site in different directions.

...I'm not sure that putting this to a vote is a good idea, left to my own devices? With the best will in the world, this is a profoundly hot-button topic and I have a hard time imagining how to set the terms of the discussion and a moderation framework that would avoid it turning into an inevitable trainwreck.

As such, possibly using the sites that have Already Had These Conversations - like AO3 - as a guide, potentially including reaching out to the staff involved, might be less high-stakes? And also less rough on the Cohost staff as a way forward?

+1 to using AO3 as a guidepost here. While there are some who choose not to use it due to its policies on underage content, on the whole it has avoided the reputation for being exclusively or primarily for that kind of content, which many commenters here have expressed concern about.

I'd like to say that, while I respect any choices the admin team makes, I'm way more comfortable in spaces that permit depictions of immoral acts than spaces that prohibit them.

I also want to throw in agreement that with such a small group of staff, you should definitely lay down rules where you're comfortable moderating them. I've been a community moderator before and it's a role that leads to INTENSE burn-out. Anything to slow that down is good and should take precedence. I do agree with the people saying that allowing it may be less moderation hassle than banning it (people don't generally report things that are explicitly allowed, and the way you're suggesting the mandatory content warnings work would keep this content well out of the way, but people do tend to post things that are forbidden, which then get reported, which the moderators then need to deal with). But frankly that's all dealing in hypotheticals.

I personally find "lolisho" content repulsive, but I consume content that I know others would find objectionable.

I appreciate you seeking community input on this matter. If you are truly interested in getting the full perspective of your community, I would recommend establishing some ability for users to provide feedback anonymously (other than creating a secondary page, as I have obviously done), as I expect many people are hesitant to share their feelings on such a fraught subject, and I don't think the loudest and most aggressive voices should be the only ones heard.

For those questioning why this even needs to be considered, I would encourage you to consider the practical task of moderating this. It is easy enough to say that art depicting fictional sex acts involving minors is not allowed, but there are a number of edge cases that will either need to be ruled on as a matter of policy, or left open to interpretation by the whims of individual moderators. I would much prefer that staff create a clear, robust policy up-front, rather than stumble into a collection of contradictory or piecemeal rulings.

Those outraged that staff would need to think carefully about this policy should consider which of the following they think should be banned, and how they would write a policy to consistently distinguish between them:

  1. An illustration depicting a sex act involving a minor who is identified as such (the baseline that most people here are concerned about)
  2. An illustration depicting a sex act involving an individual who looks like they could be a minor, but is not explicitly identified as such (what criteria should be used to determine this?)
  3. An illustration depicting a sex act involving an existing character who is a minor in their source material, but who has been "aged up" to 18+ (e.g. "sexy Velma")
  4. An erotic illustration of a real person who is 18+, but who appears young (e.g. a drawing of any of a number of porn actresses popular for their "youthful" appearances)
  5. Written erotic fiction describing a sex act involving a minor (e.g. anything in the "Underage" AO3 tag)
  6. A written account of a fictional sex act involving a minor, which is not explicitly erotic in nature
  7. A written non-fiction account of a personal experience with CSA
  8. "Age-play" content depicting adults (real or fictional) dressed as, behaving like, or pretending to be minors.
  9. Erotic content depicting fictional incest between adults (e.g. Game of Thrones fanfic, a shocking amount of mainstream pornography)
  10. Erotic content depicting fictional rape between adults
  11. Erotic content depicting "consensual non-consent" between fictional adults who are depicted giving explicit consent
  12. Erotic content depicting CNC between fictional adults, where consent is assumed or stated to have been given "off-screen"
  13. Filmed pornography depicting fictional rape, in which the actors involved knowingly consented and willingly participated
  14. Erotic content depicting adults engaging in sexual acts under the influence of drugs or alcohol

There are, perhaps, an infinite number of further edge cases, and there are many, many forms of paraphilia, kink, and taboo sexuality not even addressed here that many would find objectionable. I suspect there are those here who would choose to ban all of the above, and those who would choose to ban none of them. Most, I suspect, would choose to ban some but not all of the above, and I very much doubt that you would be able to come to anything approaching a consensus about which is which. I don't list these in an attempt to draw an equivalency between them, nor to argue that if you allow any, you must allow all, but instead to point out the necessity and difficulty of creating a robust policy that reflects both the desires of the community and the realities of content moderation. If Cohost goes beyond the requirements of the law, they will need to rule on these sooner or later, or else end up with a vague, arbitrarily-enforced policy that frustrates and confuses both the people producing such content and the people reporting it.

If Cohost intends to allow NSFW/erotic content at all, it will need to deal with the fact that a major component of human sexuality (including the stuff people who actually pay for content want to pay for) involves stuff other people find "gross" or objectionable. A site that nominally allows NSFW content as long as no one specifically objects to it is not a safe site for sex workers or erotic artists. (Which, again, is not to argue against banning specific types of content, as long as those policies are clear and it's not just a matter of removing anything a specific user or moderator finds distasteful.)

I feel like there are two camps to consider the most:

  • those who are invested in non-realistic loli/shota art being allowed
  • those who are invested in that art being banned

In my experience, the latter camp tends to want the former camp, i.e. the people who enjoy such art, to be forbidden from using any platform at all, whereas the former camp tends to want the latter camp just to leave them alone, and wants to share such art amongst themselves without any risk of the latter camp seeing it.

This is certainly subjective, but I believe that to cater to the latter camp is to say that anyone can be banned for the art they create or enjoy, if enough people are offended enough, loudly enough, by that art.

I think that, no matter what, people invested in this conflict will see cohost as aligning with one side or the other, and that the cohost staff should assertively align with the side that is most consistent with their values, rather than trying to satisfy as many people as possible. And I feel like it doesn't - or, it shouldn't - make a difference if everyone here comes out of the woodwork to share their personal stance; what matters is recognizing that both of these camps exist.

(That being said, I want to share my personal stance anyway, not for the staff's sake but in case anyone is tallying up the people on each side: I think non-realistic loli/shota art should be permitted, with the requirement that it is properly tagged, and that it's opt-in to be able to see it at all. Anything else I'd want to say has already been said better than I could say it, in other comments and replies.)

+1. Also note the tone differences emerging in the comments of the post... And there's no underlying principled stance you can take as to why lolisho can be banned by sufficient outcry but other controversial subjects/fictional depictions of immoral or harmful behaviors can't.

I'm gonna come out and say it. I'm not into lolisho, but I'm against censorship. I don't believe lolisho 'encourages pedophilia' any more than I believe GTA will turn people into violent criminals. Crimes are not the fault of some piece of fiction; they're the fault of the one who commits the acts. Saying otherwise takes responsibility away from actual predators.

Predators will use SFW images, candy, and puppies to groom their victims. We don't keep children safe by banning candy and puppies. It's a ridiculous idea, and on the same level as saying that banning lolisho will save children in my eyes. The groomers I've dealt with have always used completely innocent things. Things these detractors would have no problem with existing.

Quite frankly, people who compare drawings to actual CSEM repulse me. It waters down what CSEM is and how it harms. A picture of Sonic the hedgehog should never be compared to a child being harmed. It's like a sick joke and the people being harmed are the punchline.

Yes, I'm aware that there are people who feel that lolisho allowed them to be harmed. My heart goes out to you, but that doesn't mean I agree. I mean this in the kindest way I can muster: you don't speak for everyone who shares your trauma. This goes for any trauma.

Ultimately, if the site gives you ways to avoid content, then you should use them. It's your (general you) responsibility to curate your experience. If you have qualms with what a site allows, then you're welcome to go to any number of art hosting sites that are more strict. Most of them disallow lolisho, I'm sure, so there are options for you.

I'm concerned that if cohost bows to these demands, it will bolster more demands for more censorship. I've seen this play out with other newly established art sites. First they get lolisho banned, then they campaign to remove incest and 'aged up' art of cartoon characters. Or the endless headache of feral art and what that even is.

In my opinion, a hard line needs to be drawn. Allow people to make the choice whether they'll stay or leave knowing that disagreeable art is being hosted. Don't let them bully or scare you into obeying demands.

That's my opinion anyway. Ultimately, it's staff's choice and they know what's best for their site and their team. I sympathize with not wanting to deal with the controversy. I'll respect their decision even if it's one I disagree with.

Hugely appreciative of your thoughtfulness and transparency here where other social media sites have failed. It takes a lot of time and effort and thought to do.

I’ll put my two cents in here: I’m pro tagging content like noncon and/or lolisho but not banning it entirely. Every person and country draws the line of acceptability differently, but in the case of the above, I draw the line as to whether a real person who did not or was unable to consent was involved.

There are queer artists out there who draw noncon as a way of survivorship. There are folks who write fictional characters in situations that play out an adult fantasy that they wish they could have as an adult with another adult, but due to whatever circumstance, cannot.

I support folks who don’t want to see this type of art and writing, AND it is a real form of messiness that queer and marginalized communities especially have to navigate.

Thank you again for your thoughtfulness and candor throughout this process.

I appreciate you folks a lot. Some thoughts:

  • Seems like the convo is being brigaded by a bunch of new users, which is not pogchamp.
  • I would personally feel like the vibes on this site would be a tiny bit rancid if I knew there was lolisho content lurking around.
  • There are other websites for that, anyway. It's pretty clear the staff aren't free speech absolutists, and I'm unsure of whether I want to share this community with people who extremely vehemently defend simulated CSAM.

Sorry to hear that this has become such a flashpoint topic.

I see a lot of hay is being made out of the observation that more than a few of the people in this comment section in favor of allowing lolisho content are people who made their accounts within the past couple days. Which is to say, they are producers/consumers of lolisho content who heard about this debate and are transparently lobbying in an attempt to secure a new haven for their niche interest. I can promise you that @staff were not born yesterday and can put two and two together with regard to these users and their motivations.

However, we would be remiss if we didn't acknowledge that this tactic is hardly unique to the "Yea" side of the debate. It takes very little looking to see that more than a few of the most vociferously negative voices in the comments are from accounts who, by all appearances, signed up a month or two ago, made a few posts, and then forgot this website existed until they rematerialized yesterday filled with terrible, terrible concerns over the direction the site culture was taking. Which is to say, these are not people who care about Cohost, these are people who care about getting their paws on a ripe dogpile target. Again, @staff were not born yesterday.

Now, a quick exercise: let's say you're someone who hasn't made up their mind on this issue yet. Which group of transparent carpetbaggers do you think is doing a better job advancing their cause?

The pro-lolisho camp have been, to a person, polite, patient, and understanding towards @staff, stating their appreciation for the transparency and amicably promising to abide by whatever ruling comes down following the community consultation.

The opposition, you might notice, have been variably condescending, scolding, and just-this-side-of-threatening to @staff, berating them for even having the temerity to not consider this a settled issue, for even daring to host a discussion. Multiple people in the "nay" camp have promised to leave if the ruling doesn't go the way they want, with most promising to "warn" their communities about Cohost. I've seen more than one person go on to Twitter and start telling outright lies about @staff and Cohost, claiming that they are pedos/pedo apologists (they are not) and that Cohost currently allows CSEM (it does not).

None of this is to imply that simply by being polite the pro-lolishota carpetbaggers automatically have the better arguments. Rather, this is an observation that because they clearly believe themselves to be in the Unassailable Moral Right, a large contingent of the anti-lolishota camp appear to believe that they are absolved of any expectation that they should conduct themselves civilly & in good faith.

Personally, I find this conceit to be particularly noxious, and that it further does the job of the pro-lolishota camp far more effectively than the pro-lolishota users themselves. I posted about my feelings on this yesterday, coming to the conclusion that I ultimately favored a ban because I felt an influx of pro-lolishota users would be distasteful and bad for the site's image. However, seeing the arrogant, disrespectful behavior of the "Nay" camp towards @staff and other users, I can say flat out that I'm starting to lean towards opening the floodgates, if for no other reason than to drive these kinds of toxic users from the site. Congratulations!

Depending on how this all shakes out, I may stay on anyway. I like Cohost and staff and mostly post SFW content. I would be fine with linking to things off-site (e.g., to AO3) if they decide on a total ban. But the fact that there are many users who can be reasonable, even when their opinions don't perfectly align with mine, already makes this place vastly preferable to 99% of the rest of social media.

I'm one of those who literally just joined the site, since I only found out about it a few days earlier. I was really excited to find a new site to try and had already made a few posts, and then the previous post by staff went up and completely floored me, like, you're seriously considering allowing that kind of content? Hell no, I want nothing to do with a place that allows it! So the natural reaction was to delete my posts in case this would turn into a place that allows it, since that's what I'd do anyway at that point.

Due to my short stay so far, it was easy to just respond that I'm leaving if that's allowed, since I'm not really losing anything. If I had been here longer and had more posts, friends here, etc., I might have been more hesitant about just deleting everything, but as I just joined I don't have any of that; I was just shouting into the void alone. I'm happy the staff said such content is not allowed, but now I know many users here were voting in favor of it, and that in turn left me feeling iffy. As in: is this the kind of community I want to be part of and/or associated with? I don't have the answer to that question for myself yet, so I'm hesitant to keep posting here for now. Maybe after the dust settles a little I can make better decisions, but this whole thing did leave a sour taste in my mouth overall; this shouldn't have been up for debate at all.

+1. additionally, i'll say that although i'm a new account and i'm generally heavily anti-censorship, i didn't join to try to sway any kind of debate or change the site culture--i'm part of a wave of new accounts that are coming over from tumblr simply because we want somewhere to post fandom content that isn't prone to sudden, random bouts of censorship, and that allows NSFW content.

this does mean that a lot of us will be on the "allow tagged content" side of this debate; it's poor timing that we're coming in right before this staff post dropped, imho, but there's nothing to be done about it.

we are operating in good faith and i haven't encountered anyone in my personal circles arguing for any kind of carpetbagging operation; we just heard about this site for a specific reason, namely the open-ended community guidelines and potential safe haven for "problematic" adult content i.e. kinky weird porn, and tend to have a specific set of views on this staff post as a result.

My earlier note feels hopelessly vague now. It's just starting to hit me, thanks to this comment and the linked post, how a social campaign that nominally opposes abuse has been proactively abusing staff and others and why. Shades of the old harm-reduction debate - why are you giving needles to addicts and condoms to students? Doesn't that show approval of immoral behavior?

Because it really boils down to showing approval, doesn't it - not results or effects or outcomes. Policy is appeasement. All that matters is being seen judging others. Be the right kind of person. Be on the right side. Or else.

Not to say, of course, for the crowd, that I approve of or consume or even support the inclusion of such material. The vagueness of what I posted earlier was as much a product of trauma around this subject as missing the swell of context. But harm is harm, and forming policy to address harm shouldn't be subject to even more harm from sheer revulsion to the subject, just so isolated individuals can congratulate themselves that they're one of the good ones. This has been revealing.

I think you hit the nail on the head here. A lot of people don't really care about minimising harm, but about distancing themselves from discomfort and visibly staying on the moral high ground.

Just one more voice to say that at least some of the things on that "CWs required" list do need to be banned outright because they are intrinsically and unavoidably harmful. I was honestly a little shocked to click through and see that list of tags when I was expecting things about the sorts of traumatic lived experiences that people must be allowed to talk about but for which CWs are very important to let others who've had similar experiences choose how to engage with them. But to see "toddlercon" on a list of content that is explicitly allowed on a site does not make me want to be here.

I'm staying for now because the followup post suggests that you are open to changing these rules, but it worries me that this need wasn't obvious from the outset. I say this as an experienced social media moderator myself: I know that some issues genuinely are a lot more complicated and delicate than they look from the outside, but I also know that some really really are not.

I'm also troubled by the list of tags with no explanation. What does "noncon" even denote, for example? It could be anything from control kink to rape porn: things that demand very different moderation responses, and that people should not be left guessing about. To just post a list with no explanation looks worryingly naïve to me.

To be clear, that is not a list of content that is explicitly allowed. That is a list of content that, if it were to be posted, would need to be tagged as such. This post and the followup are fairly clear that fictional erotic depictions of minors (i.e. everything but "noncon" on that list) are currently not allowed and are being removed, pending finalization of their policy. That policy will, presumably, provide the explanation you're asking for in your final paragraph, which is why they are taking their time and soliciting community feedback to ensure they get it right.

What is the point of a list of required CWs if they're for content that mustn't be posted anyway? This has at best been communicated worryingly poorly, considering how predictable it was that policy around this sort of material would be controversial.

And again: some of the things on that list are so obviously harmful that for there to even be a discussion needed is a bad sign.

Hey! Thanks so much to the staff here for being open and willing to talk about these topics. Since I am an artist, I felt the desire to give my 2 cents. The reason I joined this platform was the potential for sharing nsfw art. I draw a wide array of topics and don’t shy away from taboo themes. Being unable to post certain art here would impact my usage of this space. I support fictional freedom and really love the values and community of this site and hope I can continue using it! Thank you.