Alex Nichiporchik, CEO of Hello Neighbor publisher TinyBuild, has raised eyebrows during a keynote in which he outlined potential uses of AI tools in the workplace, including the monitoring of employee Slack messages and meeting transcriptions to help identify "potential problematic players" — a discussion he has since insisted was "hypothetical".
Nichiporchik (as reported by WhyNowGaming) was speaking at this week's Develop: Brighton conference, in a talk titled 'AI in Gamedev: Is My Job Safe?' which promised an "in-depth [look] into how [TinyBuild] adopted AI in daily practices to exponentially increase efficiency".
One part of the presentation in particular, focusing on "AI for HR", has proved especially contentious since news of its contents began to spread around the internet. Here, Nichiporchik discussed how AI could be used by HR to spot burnout (later described as being synonymous with "toxicity") among employees by first identifying "potential problematic team members", then collating their Slack messages and automatic transcriptions from the likes of Google Meet and Zoom and running them through ChatGPT, in a process he terms "I, Me Analysis".
"There is a direct correlation between how many times someone uses 'I' or 'me' in a meeting," Nichiporchik posited, "compared to the amount of words they use overall, to the probability of the person going to a burnout."
According to Nichiporchik, by identifying employees who 'talk too much about themselves', who 'suck up too much time in meetings' so that "nothing gets done", and who receive negative feedback in 360-degree peer reviews, it's then possible to "identify someone who is on the verge of burning out, who might be the reason the colleagues who work with that person
Read more on eurogamer.net