Raptor

Fake gamer; real girl.

🏳️‍⚧️ Amateur game scholar.
Social worker. Events person.
Living on stolen Duwamish lands.


Old website I made for travel writing that lost its database
elucidovia.com/
Mastodon I probably won't use
tech.lgbt/@Raptor
Tumblr I probably won't use
www.tumblr.com/ohthatraptor
Username pretty much everywhere
OhThatRaptor

gamedeveloper
@gamedeveloper

"Toxic people are usually the ones about to burnout."

Tinybuild boss Alex Nichiporchik says the company is using AI to identify toxic workers, which according to the CEO also includes those suffering from burnout. In excerpts from the talk published by Why Now Gaming, Nichiporchik explains how the Hello Neighbor publisher feeds text from Slack messages, automatic transcriptions from Google Meet and Zoom, and task managers used by the company, into ChatGPT to conduct an "I, Me Analysis" that allows Tinybuild to gauge the "probability of the person going to a burnout."

Nichiporchik said the technique, which he claims to have invented, is "bizarrely Black Mirror-y" and involves using ChatGPT to monitor the number of times workers have said "I" or "Me" in meetings and other correspondence. Why? Because he claims there is a "direct correlation" between how many times somebody uses those words, compared to the amount of words they use overall, and the likelihood of them experiencing burnout at some point in the future.

Update: In a separate response sent directly to Why Now Gaming, Nichiporchik said the HR portion of his presentation was purely a "hypothetical" and that Tinybuild doesn't use AI tools to monitor staff.

"The HR part of my presentation was a hypothetical, hence the Black Mirror reference. I could’ve made it more clear for when viewing out of context," reads the statement. "We do not monitor employees or use AI to identify problematic ones. The presentation explored how AI tools can be used, and some get into creepy territory. I wanted to explore how they can be used for good."

Read more on this story, and hear Nichiporchik's response to our reporting, over at Game Developer.




in reply to @gamedeveloper's post:

Absolutely fuckin' foul. Between this and Chucklefish (both for its mistreatment of development staff and some personal reasons it'd be improper to just throw in here), I'm growing incredibly wary of indie publishers.

in reply to @ghoulnoise's post:

but like

wordcount of uses of "me" is like two lines of python

the role of ChatGPT is to make this slower and more expensive, and probably also much less accurate
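Something like that "two lines" sketch, roughly (the function name and word regex are mine, not anything Tinybuild described):

```python
import re

def i_me_ratio(text: str) -> float:
    """Fraction of words that are 'I' or 'me' (case-insensitive) -- no AI required."""
    words = re.findall(r"[A-Za-z']+", text)  # crude word tokenizer
    hits = sum(1 for w in words if w.lower() in {"i", "me"})
    return hits / len(words) if words else 0.0

print(i_me_ratio("I think they want me to do it, but I won't."))  # 3 of 11 words
```

Deterministic, free, and it counts the same way every time, which is more than can be said for asking a language model to tally words.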