Tinybuild CEO Alex Nichiporchik stirred up a hornet's nest at a recent Develop Brighton presentation when he seemed to imply that the company uses artificial intelligence to monitor its employees in order to determine which of them are toxic or suffering burnout, and then deal with them accordingly. Nichiporchik has since said that his presentation was taken out of context online, and that he was describing hypothetical scenarios aimed at illustrating potential good and bad uses of AI.
As reported by Whynow Gaming, Nichiporchik said during his presentation that employee communications through online channels like Slack and Google Meet can be processed through ChatGPT in what he called an "I, Me Analysis" that searches for the number of times an employee uses those words in conversation.
"There is a direct correlation between how many times someone uses 'I' or 'me' in a meeting, compared to the amount of words they use overall, to the probability of the person going to a burnout," Nichiporchik said during his talk.
He made similar comments about "time vampires" who "talk too much during meetings" or "type too much [and] can't condense their thoughts," saying that once those people are no longer with the company, "the meeting takes 20 minutes and we get five times more done."
Nichiporchik said combining AI with conventional HR tools might enable game studios to "identify someone who is on the verge of burning out, who might be the reason the colleagues who work with that person are burning out," and then fix the issue before it becomes a real problem. He acknowledged the dystopian edge to the whole thing, calling it "very Black Mirror level of stuff," but added, "it works," and suggested that the studio had already put
Read more on pcgamer.com