This is Part 3 of a multi-part series on mass layoffs at The Trevor Project during union negotiations. Read Part 1 here, and then Part 2 here.

Content discussing suicidal LGBTQ+ youth follows. I tried to write with care, if sometimes with dark humor, to make this easier to read.


nonprofit doesn't mean no overhead; employees still must be paid, resources acquired. graphic design and bandwidth for hotlines, ai chat bots and business parking lots.

“A couple of years ago when Amit started, he
wanted us to really think about two core pillars
to growth. We needed to drive down what we call our
costs-per-youth-served by 50%.”

and i mean lemme tell ya, all these suicidal children are really driving up the price of this suicide hotline. gotta halve the cost of each suicidal child by the end of this fiscal year, or how else will we meet our quota?

“That means that we can help two times the number
of young people with the same amount of funding that
we have. And the second pillar is that we’ll never sacrifice
quality for scale. We’ll always maintain or improve the quality
that we provide to youth in crisis.”

- John Callery, Former Senior Vice President of Tech
for The Trevor Project, to Fast Company, 2022

you may call your sea a lake, but you're part of the ocean.

'We' Means 'You' When It Comes To Labor

"The Trevor Project’s goal is to expand our capacity to serve the 1.8 MILLION+ LGBTQ YOUNG PEOPLE who seriously consider suicide in the U.S. every year." A map of the United States of America with data on LGBTQ+' youth suicidal consideration' rates. "Guiding Principles Diversity and Inclusion, Youth-centricity, Growth, Quality, Innovation, Best-in-class team"

1.8 million suicidal youth is not 1.8 million callers. these are children in danger that need outreach.

it should not be treated as a market.

but, i guess if we're trying to reach 'capacity' for just shy of 2 million youth, even as a nonprofit, how can Trevor most... efficiently achieve that?

most... affordably? most... cheaply, even?

1: Hire Advisors, Not Employees - So Say The Advisors

Via The Trevor Project's Volunteer sign-up page. "Our Crisis Services. We offer two remote programs volunteers can choose from: TrevorLifeline (Phone): Lifeline Volunteer Counselors are trained to answer calls from LGBTQ young people who reach out to our 24/7 phone Lifeline when they are feeling suicidal or need a safe, non-judgmental place to talk.
TrevorChat/TrevorText (Digital): Digital Volunteer Crisis Counselors are trained to answer chats and texts from LGBTQ young people who reach out about issues such as coming out, LGBTQ identity, depression, and suicide. Please note that our greatest need for volunteers is currently in our TrevorChat/TrevorText Digital program. Young people are close to twice as likely to reach out to Trevor’s service on our chat/text platform."

in 2019, to expand Trevor's volunteer capacity and, uh, "strengthen its technological capabilities", accounting conglomerate PricewaterhouseCoopers International Limited used their charitable foundation to allot a $6 million grant to Trevor over the course of four years.

The PwC Foundation and The Trevor Project Announce Multi-Year Collaboration. Over stock footage of counselors is a chyron that reads: "Increase the number of volunteer crisis counselors by ten times."

the only catch, as with every good charitable foundation, is that PwC US LLP would do the consulting on just how that $6 million gets spent.

The PwC Charitable Foundation, Inc. awarded the nonprofit a $6 million grant to help strengthen its technological capabilities and revamp the volunteer program over four years. PwC US LLP provided pro bono consulting, bringing our BXT (business, experience and technology) way of working to deliver results.

"pro bono", of course.

The goal was simple, but critical: recruit, onboard
and retain more volunteers faster to respond to
growing needs. And, end long delays between applying,
interviewing and training.

The Trevor Project was eager to transcend traditional
ways of operating to eliminate wasted time, so PwC
brought our BXT approach to help equip their volunteers
and support youth in need.

ah yeah, so we were taking too long screening the people who'll be on the phone with suicidal teenagers, which is why we need $6 million american dollars for... BXT?

the thing graphics cards have? what the fuck is BXT??

B is for Business! "How you build value. Industry and functional expertise inform the most relevant ideas." X is for Experience! "What people remember. Human-centric thinking engages people and breathes life into everything." T is for Technology. "How you make it real. Connected technology becomes a platform for the things you make."

oh that's just fucking rich.

"Talk to us about how BXT Works will work for you. We are your partner in innovation. We work with you from ideation to launch and beyond, providing guidance, feedback and support along the way. We bring deep partnerships with world-class technology providers and a proven record of delivering successful digital products for clients across industries." After some stock photos of people sitting at laptops, the text: "BXT Works brings speed to results. Driven by technology." Logos for Microsoft Azure, Amazon AWS, Adobe, Oracle, SAP, and Cloud Salesforce follow.

you're telling me this 'charitable foundation' donated $6 million dollars to collaborate with The Trevor Project, to- do what, exactly?

configure OneDrive for them??

THE SOLUTION: We partnered with PwC Charitable Foundation to deploy a new back-end volunteer management system that connects the following platforms: Okta, volunteer online application, Salesforce CRM, asynchronous training, and volunteer scheduling. The team rapidly deployed technology capabilities and automation on the Salesforce platform — cutting development time by using pre-built components, out of the box functionality, and Custom Apex code, which ultimately allowed for more robust testing periods and made it possible to deliver a scalable product faster. This streamlines several processes around recruitment, training, and retention, including:

Centralized data: For the first time, Trevor’s volunteer journey — from application to training to managing crisis counselors — is tracked in a single database that is accessible by all of our volunteer management teams.

Streamlined Registration and Automated User Provisioning: We implemented an Okta registration for a smoother user experience and a more simplified system provisioning process.

Revamping the Volunteer Application: We redesigned our volunteer application, improving the user experience and linking the application directly to Salesforce so applicant Contacts are automatically created to begin their volunteer journey.

Automating Emails: We implemented email tracking in Salesforce with Einstein Activity Capture and incorporated spam prevention and deliverability tracking via SendGrid integration (SMTP relay).

Streamlining Interview Scheduling: We embedded Calendly links with the recruiter’s updated schedule and integrated Calendly to update Interview Status in Salesforce to make scheduling easy and seamless.
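
for the curious, here's a rough sketch of the kind of glue work that case study is describing. this is not Trevor's actual code; the custom field Interview_Status__c, the environment variable names, and the idea that the applicant's Contact Id rides along in the Calendly booking link are all my assumptions for illustration.

```python
# NOT Trevor's code: an illustrative sketch of "integrated Calendly to update
# Interview Status in Salesforce". The Interview_Status__c field, env var
# names, and the payload fields marked "assumed" are hypothetical.
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

SF_INSTANCE = os.environ["SF_INSTANCE_URL"]   # e.g. https://example.my.salesforce.com
SF_TOKEN = os.environ["SF_ACCESS_TOKEN"]      # an OAuth access token, however it's obtained


def set_interview_status(contact_id: str, status: str) -> None:
    """PATCH a (hypothetical) custom field on the applicant's Contact record."""
    url = f"{SF_INSTANCE}/services/data/v58.0/sobjects/Contact/{contact_id}"
    resp = requests.patch(
        url,
        json={"Interview_Status__c": status},
        headers={"Authorization": f"Bearer {SF_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()


@app.route("/calendly-webhook", methods=["POST"])
def calendly_webhook():
    event = request.get_json(force=True)
    if event.get("event") == "invitee.created":
        # assumed: the Contact Id is passed through the Calendly booking link's
        # tracking parameters so the webhook can tie the booking back to Salesforce
        contact_id = event["payload"]["tracking"]["utm_content"]
        set_interview_status(contact_id, "Interview Scheduled")
    return jsonify({"ok": True})
```

mechanically, that's the whole integration: Calendly says someone booked, the handler flips one field on the applicant's record.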

these were the guys you needed for that, amit?! i mean, i agree keeping track of your volunteers is a good idea, but if you really listened to the "LGBTQouth" as you call them, they'd have already convinced you to switch to linux.

what qualifies these guys to take our mental health resource for gay children in crisis into the 21st century?

Turn ideas into products, faster.

yeah, like turning chronic illness into opiate addiction! amirite, amit?

jesus christ.

act like a nonprofit if you're going to claim to be one. fuck all.

still, this is all business fluff. capitalist foreplay. as for how "embedding Calendly links" helps increase volunteer capacity toward that goal of, reminder, ten-fold growth...

In front of a picture of many Google programmers posing in a very 'cool' office, text reading: "Google and Trevor teamed up to use machine learning to: Ensure youth at the highest risk of suicide are connected to a crisis counselor most quickly. Roll out user-experience improvements that will create a more welcoming and affirming experience for youth in crisis. Train significantly more volunteer crisis counselors. Exponentially grow our global community of LGBTQ youth on TrevorSpace and work to keep them safe." Following that: "Volunteers trained: 100 per year to 100+ per month." "In 2020 we launched our asynchronous training platform that helps us to train 10x the number of volunteers each year on more flexible timelines"

well. there's a new one.

2: Use AI To Decide Which Suicidal Child Needs Help First

An article from Fast Company in late 2020, entitled "How the Trevor Project is using AI to prevent LGBTQ suicides". By KC Ifeanyi. The byline reads: "Over the past three years, the nation’s largest suicide prevention organization for LGBTQ youth has undergone a major tech overhaul, most recently using machine learning to assess high-risk outreach." Image is of several Trevor Project counselors on calls across Macs in a white office building.

Leveraging AI in suicide prevention has
gained traction over the years.

Trevor Project, thankfully, knew that a cry for help being answered by a chatbot could be even worse than a delay. but, instead of, you know, seeing this as a compelling reason to not incorporate AI, they found a compromise.

With Google’s help, The Trevor Project will be able
to assess suicide risk level of youth in crisis more quickly,
allowing counselors to better tailor their support and to provide
relevant resources and consistent quality of care.

and really, when dealing with suicidal children, compromising with technology is only as dangerous as you acknowledge it to be; just ask Trevor Project's AI expert, John Callery

“Sitting at the intersection of social
impact, bleeding-edge technology,
and ethics, we at Trevor recognize
the responsibility to address systemic
challenges to ensure the fair and
beneficial use of AI. We have a set of
principles that define our fundamental
value system for developing technology
within the communities that exist.

Right now, we have a lot of great data
that shows that our model is treating
people across these groups fairly, and we
have regular mechanisms for checking
that on a weekly basis to see if there are
any anomalies.”

our AI isn't racist, trust us! johnny-boy over here checked!

and aaand! he's got Great Data™ coming back every monday to double-check the AI didn't learn racism over the weekend.
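
for reference, the "regular mechanisms" he's describing can be pretty small. here's a minimal sketch, assuming each week's conversations have been labeled with a demographic group, the model's flag, and the counselor's own risk assessment; the column names and the anomaly threshold are made up for illustration, not Trevor's.

```python
# A minimal sketch of a weekly fairness check across demographic groups.
# Column names and the anomaly gap are assumptions for illustration.
import pandas as pd


def weekly_fairness_report(chats: pd.DataFrame, max_gap: float = 0.05) -> pd.DataFrame:
    """Expects columns: group, model_flagged (bool), counselor_high_risk (bool)."""
    def per_group(g: pd.DataFrame) -> pd.Series:
        truly_high_risk = g["counselor_high_risk"].sum()
        missed = (g["counselor_high_risk"] & ~g["model_flagged"]).sum()
        return pd.Series({
            "flag_rate": g["model_flagged"].mean(),
            # share of chats the counselor judged high-risk that the model missed
            "miss_rate": missed / max(truly_high_risk, 1),
        })

    report = chats.groupby("group").apply(per_group)
    # flag any group whose miss rate sits noticeably above the best group's
    report["anomaly"] = (report["miss_rate"] - report["miss_rate"].min()) > max_gap
    return report
```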

[Gaunt] offers pragmatic advice for those seeking
to reduce the bias in their data. “Define the problem
and goals up front. Doing so in advance will inform the
model’s training formula and can help your system stay
as objective as possible,” she said. “Without predetermined
problems and goals, your training formula could
unintentionally be optimized to produce irrelevant results.”

"tell the robot ahead of time not to be racist" shit good thinking, write that down!!

better yet, get google to do it for you!!

The Trevor Project applied for Google’s AI Impact
Challenge and was selected as one of 20 finalists
from 2,602 applications. Google granted The Trevor
Project $1.5 million and a team of Google Fellows to
help the organization problem-solve with AI.

“And from there, Google kind of flipped up the heat
on how to set goals, how to approach responsible AI,
[and] how to productionize AI,” Callery said.

just be sure to remind them not to be racist! put it on a sticky note next to 'don't be evil!'.

speaking of- you can spend hours, months, entire fiscal years debating the ethics of AI- but none of this matters here and now. speculation doesn't matter here and now.

you know what does matter here and now? getting qualified people on the line with suicidal kids.

A visual demonstration of The Trevor Project's AI tool for prioritizing incoming mental health crisis calls. First, chats begin rolling in. Second, Trevor's Machine Learning Tool begins Content Analysis. Messages include "Hey... it's the anniversary of my mom's death?" "Why are the teachers at my school so mean?" and "I'm feeling very alone right now. I'm not sure I can keep going."

an AI cannot be qualified to field suicidal DMs.

Stage Three: Chats that are flagged as high-risk are moved to the front of the queue. Stage Four: The High-priority Chats are Connected with a Trevor Volunteer First. In the visual, the message "I'm feeling very alone right now." is highlighted, then shown on a phone with a response. "Hi Erin, my name is Ansel and I'm here for you. Tell me more about what's going on."

if you don't sound suicidal enough to the AI, you'll have to wait for another unpaid volunteer to be told by the machine it's your turn to live.

this is an appalling use of this appalling technology. a volunteer trained by AI, being told by AI which suicidal child is most efficient to speak with next. a complete callousness to compassion at every level, all in the name of 'growth'.
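
to be concrete about what "moved to the front of the queue" means mechanically, here's a stripped-down sketch of the triage step in that diagram. score_risk() is a stand-in for whatever model Trevor actually runs; everything else is just a priority queue.

```python
# A stripped-down sketch of the queueing step: higher-scored chats jump ahead,
# ties go to whoever has been waiting longest. score_risk() stands in for the
# actual model.
import heapq
import itertools

_arrival = itertools.count()           # FIFO tie-breaker among equal scores
_queue: list[tuple[float, int, str]] = []


def score_risk(message: str) -> float:
    """Stand-in for the ML model's risk score in [0, 1]."""
    raise NotImplementedError


def enqueue(message: str) -> None:
    # heapq is a min-heap, so negate the score to pop the highest risk first
    heapq.heappush(_queue, (-score_risk(message), next(_arrival), message))


def next_chat() -> str:
    """The chat the next available counselor gets handed."""
    _, _, message = heapq.heappop(_queue)
    return message
```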

“We didn’t set out to and are not setting out to
design an AI system that will take the place of a
counselor, or that will directly interact with a
person who might be in crisis.”

-Dan Fichter to MIT Technology Review, Feb 2021

funny thing about "setting out" is, eventually-

For the Trevor Project, someone reaching out via
text or chat is met with a few basic questions such as
“How upset are you?” or “Do you have thoughts of suicide?”
From there, Google’s natural language processing model
ALBERT gauges responses, and those considered at a
high risk for self-harm are prioritized in the queue to
speak with a human counselor.

-you arrive somewhere, don't you?
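
and to be concrete about that step too: "gauges responses" most plausibly means running the intake answers through a fine-tuned classifier and treating the "high risk" probability as the queue score (roughly what the score_risk() stand-in in the earlier sketch would wrap). the checkpoint path and label layout below are assumptions; Trevor's fine-tuned model isn't public.

```python
# Assumed checkpoint and label layout: a binary risk classifier built on an
# ALBERT-style model (label 1 = high risk). "path/to/risk-albert" is a placeholder.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

CHECKPOINT = "path/to/risk-albert"  # hypothetical fine-tuned checkpoint

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT)
model.eval()


def risk_probability(intake_answers: list[str]) -> float:
    """Probability, per this hypothetical model, that the youth is high risk."""
    text = tokenizer.sep_token.join(intake_answers)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()


# e.g. risk_probability(["pretty upset", "yes, i have been thinking about suicide"])
```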

3: Use A Suicidal AI Child To Train Unpaid Volunteers

A screengrab of Riley, the AI tool used to help train volunteers for The Trevor Project. In phone DM style formatting, the chat reads as such: "YOUTH: it was awful... they acted like i was such a freak. they told me i just wanted attention and that i should just be a man. COUNSELOR: Makes sense that would be really upsetting to hear. Earlier you mentioned thinking about coming out to your parents. Tell me more. YOUTH: idk it's all just making me anxious thinking about coming out to my parents makes me wanna throw up tbh. COUNSELOR: You said earlier you were pretty close, what makes you think they would not understand if you come out to them?"

this is riley. they're having one of the worst days of their life, by design.

they are an AI chatbot which used, at least for a time, GPT-2 (yes, the one trained on reddit links) as a base.

next, The Trevor Project trained it on transcripts of older semi-scripted roleplay exercises by counselors helping to train one another.

to avoid the 'What If The AI Learns Racism?' problem (the one Trevor has their white tech dude checking The Data on weekly), Riley was narrowed to just this task.

the suicidal AI child robot only knows how to be a suicidal AI child robot.
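
for a sense of how small that recipe is, here's a hedged sketch of building a "Riley"-style persona the way the article describes: take off-the-shelf GPT-2 and fine-tune it on the old role-play transcripts. the file name, transcript formatting, and hyperparameters are guesses; Trevor's actual training setup isn't public.

```python
# A hedged sketch, not Trevor's pipeline: fine-tune GPT-2 on a text file of
# role-play transcripts (file name, formatting, and hyperparameters are guesses).
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# assumed format: one counselor/youth role-play transcript per line,
# e.g. "COUNSELOR: ... YOUTH: ..."
dataset = load_dataset("text", data_files={"train": "roleplay_transcripts.txt"})


def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)


tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="riley-gpt2",
        num_train_epochs=3,
        per_device_train_batch_size=2,
    ),
    train_dataset=tokenized,
    # mlm=False -> plain causal language modeling, i.e. next-token prediction
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```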

“Emulating youth language really does feel genuine.
I think, now, the model might do a better job of
that than the adult staff.”
       -Jen Carter, Global Head of Tech
       and Volunteering at Google, 2021

look- i'm not flatly against the concept of crisis simulators. and for LGBTQ youth mental health, the only shape it can really take is a convincing-sounding voice of a child in crisis.

but remember, all of this wasn't done to better train counselors; growth is the primary goal, don't you recall? Riley was developed with the explicit intention of training more volunteers, to get them on the phone faster. same with drew.

"Named “Drew,” the new simulated youth persona displays a new set of life experiences and risk level and will be used alongside the existing training persona “Riley”"

oh yeah, drew is the other suicidal child robot

“Starting from the first conception of the
Crisis Contact Simulator two years ago, it
has always been our hope to develop a variety
of training role-play personas that represent the
diverse experiences and intersectional identities
of the LGBTQ young people we serve, each with
their own stories and feelings.”

     -Dan Fichter, Head of AI and Engineering
         at The Trevor Project, 2021

can we stop talking about this like it's a product launch?

“This project is a perfect example of how we can
leverage industry gold-standard technology innovations
and apply them to our own life-saving work. I’m so proud
of our dynamic technology team for developing tools that
directly support our mission, while also creating a new
paradigm that can set an example for other mental health
and crisis services organizations.”

     -Amit Paley, CEO & Executive Director
         of The Trevor Project, 2021

can we stop, turning suicidal ideation into products?

The organization currently employs a technology team of
more than 30 full-time staff dedicated to product development,
AI and machine learning, engineering, UX,
and technology operations.

Looking ahead, The Trevor Project intends to continue
exploring technology applications to grow its impact
by investing in new tools to scale. 

        -The Trevor Project, 2021

CAN WE FUCKING STOP THIS?!

4: Pay The Fewest People Possible At Every Stage

From The Trevor Project's volunteer page. "I don’t have experience in mental health counseling. Can I still become a crisis counselor? Yes! Our trainees undergo a 40-hour virtual training to prepare them extensively for the many potential scenarios that may emerge in conversations with LGBTQ youth in crisis."

who supervises this damned training, anyways?

the Training Coordinators, of course. who are they? it could be you! after all, they're literally always hiring

Using our online learning platform, Training Coordinators
provide structured support and expectations for volunteers,
deliver clear and compassionate feedback, and promote
volunteer success through rigor and kindness.

Please note: Because the Training Coordinator role is
mission-critical to our organization and because we employ
a large number of these positions, we interview for this role
even when we don’t have a currently open position.

you, too, can be thrown on a pile of resumes to hopefully someday be underpaid and overworked in the name of growth

oh yeah and help some kids or whatever

so chatgpt trains the volunteers, overseen by exhausted coordinators, to gain the "important soft skills" and "world-class crisis intervention training" promised by Trevor, and then?

and fucking then?

Volunteer FAQs: What is the time commitment for volunteers? After completing a 40-hour training over the course of 10 weeks, volunteers must commit to a weekly 3-hour shift for a minimum of one year.

make them do the heart-breaking work.

make them, your unpaid volunteers, into a marketable demographic for little white pills.

did Purdue teach you that, amit? or did the AI help you learn something new?


Part 4: Developing Every Resource Except Human

