In a talk at this week's Develop:Brighton conference, tinyBuild CEO Alex Nichiporchik gave examples of how large language models such as ChatGPT could be used by video game studios to identify "potential problematic players on the team." The suggestions included feeding employees' text chats and video call transcripts into a system to detect certain words which could indicate burnout and "time vampires."
After receiving criticism online, Nichiporchik tweeted to say that parts of his presentation had been taken out of context and that the examples were "hypothetical."
"We do not use AI tools for HR, this part of the presentation was hypothetical," said Nichiporchik in the final tweet of a thread.
During the presentation, Nichiporchik described a process he called "I, Me Analysis", as reported by Whynow Gaming. The process involves feeding Slack transcripts, along with automated transcriptions of video calls, into ChatGPT to count how many times an employee uses the words "I" and "me".
"There is a direct correlation between how many times someone uses ‘I’ or ‘me’ in a meeting, compared to the amount of words they use overall, to the probability of the person going to a burnout," Nichiporchik reportedly said. "I should really copyright this, because to my knowledge, no one has invented this."
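The metric Nichiporchik describes, the ratio of "I"/"me" usage to total word count, does not actually require an LLM; it is a simple word-frequency calculation. A minimal sketch in Python (the function name and threshold-free output are illustrative assumptions, not anything from the talk):

```python
import re

def i_me_ratio(transcript: str) -> float:
    """Ratio of first-person singular pronouns ("I", "me") to total words.

    A toy illustration of the "I, Me Analysis" described in the talk;
    the talk itself reportedly fed transcripts to ChatGPT rather than
    counting words locally like this.
    """
    # Tokenise into words, keeping apostrophes so "I'm" stays one token
    words = re.findall(r"[A-Za-z']+", transcript)
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.lower() in ("i", "me"))
    return hits / len(words)

print(round(i_me_ratio("I think we should ship what I built; trust me."), 2))
# prints 0.3 — 3 first-person pronouns out of 10 words
```

Whether such a ratio correlates with burnout is, of course, exactly the contested claim; the sketch only shows how trivially the number itself can be produced.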
He also spoke about how a similar process could be used to identify "time vampires": employees who talk too much in meetings. "Once that person is no longer with the company or with the team, the meeting takes 20 minutes and we get five times more done."
Through these systems, Nichiporchik suggested a company might be able to "identify someone who is on the verge of burning out, who might be the reason the colleagues who work
Read more on rockpapershotgun.com