"Toxic people are usually the ones about to burnout."
Tinybuild boss Alex Nichiporchik says the company is using AI to identify toxic workers, a group that, according to the CEO, also includes those suffering from burnout. In excerpts from the talk published by Why Now Gaming, Nichiporchik explains how the Hello Neighbor publisher feeds text from Slack messages, automatic Google Meet and Zoom transcriptions, and the company's task managers into ChatGPT to conduct an "I, Me Analysis" that allows Tinybuild to gauge the "probability of the person going to a burnout."
Nichiporchik said the technique, which he claims to have invented, is "bizarrely Black Mirror-y" and involves using ChatGPT to count the number of times workers say "I" or "me" in meetings and other correspondence. Why? Because he claims there is a "direct correlation" between how often somebody uses those words, relative to the total number of words they use, and the likelihood of them experiencing burnout at some point in the future.
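The talk gives no formula or implementation, but the signal as described reduces to a simple ratio: first-person word count divided by total word count. A minimal sketch of that ratio, purely for illustration (the function name, tokenization, and word set are assumptions; nothing here reflects Tinybuild's actual prompt or process):

```python
import re

def i_me_ratio(text: str) -> float:
    """Hypothetical 'I, Me' ratio: occurrences of 'I' or 'me'
    divided by the total word count of the text."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    if not words:
        return 0.0
    first_person = sum(1 for w in words if w in {"i", "me"})
    return first_person / len(words)

sample = "I think I should handle this myself, just give me the task."
print(round(i_me_ratio(sample), 3))  # 3 of 12 words -> 0.25
```

Even granting the premise, such a ratio is a crude lexical proxy; the talk's claimed "direct correlation" with burnout is Nichiporchik's assertion, not an established result.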
Update: In a separate response sent directly to Why Now Gaming, Nichiporchik said the HR portion of his presentation was purely a "hypothetical" and that Tinybuild doesn't use AI tools to monitor staff.
"The HR part of my presentation was a hypothetical, hence the Black Mirror reference. I could’ve made it more clear for when viewing out of context," reads the statement. "We do not monitor employees or use AI to identify problematic ones. The presentation explored how AI tools can be used, and some get into creepy territory. I wanted to explore how they can be used for good."
Read more on this story, and hear Nichiporchik's response to our reporting, over at Game Developer.

