
Reaching Beyond the Contract: xAI Uses Employees’ Biometric Data to Train AI

It’s a new day, and we have another update to the AI saga that just doesn’t feel right. Elon Musk’s xAI reportedly used biometric data from employees to train AI avatars, particularly Ani, an anime-style AI girlfriend. It’s not the kind of headline one might expect in 2025, but then again, such occurrences are not unusual for Musk’s many businesses.

While there is much discussion on his newly gifted $1 trillion pay package from Tesla, controversial news stories emerging from the xAI business prompt us to ask just how employees are being compensated for the time, resources, and now, biometric data that they are allegedly obligated to provide their employer.


xAI allegedly cornered certain tutor employees to provide biometric data to train its AI chatbots, including an explicit AI girlfriend, Ani. (Image: Pexels)

xAI Employees “Compelled” to Hand Over Biometric Data to Train Musk’s AI Girlfriend Chatbot

A new report from the Wall Street Journal brought to light some troubling details about xAI Ani’s biometric training, stating that employees at the organization were “compelled to turn over their biometric data” for the purpose of training the AI avatars for the chatbot. This was particularly alarming when it came to companions like Ani, an anime-themed girlfriend that is designed for explicit activities.

While the employees who were obliged to do so were part of the team assigned to tutor these chatbots, the use of their personal features was certainly an unusual addition to their employment contract.

In a staff meeting held in April, company lawyers spoke with xAI employees, specifically AI tutors, to request that they relinquish their biometric rights for a program internally labeled Project Skippy. WSJ viewed documents that stated that xAI staff felt compelled to give up their rights to the company. They were asked to sign a form that provided the AI organization with “a perpetual, worldwide, non-exclusive, sub-licensable, royalty-free license” to use their data and likeness for its internal and external needs. 

The lawyers informed workers that aspects such as their facial likeness and voices could be beneficial for training AI avatars to present more natural, human-like responses during their interactions with viewers. 

xAI Employees’ Biometric Rights Handed Over for Company Use

Were employees allowed to opt out of giving xAI their biometric data? That remains unclear. Employees were reportedly kept in the dark about their options, with most receiving a note after a week to “actively participate in gathering or providing data, such as…recording audio or participating in video sessions.” The note also added that “such data is a job requirement to advance xAI’s mission,” which likely left little room for the employees to back out.

xAI has not formally commented on the situation; a spokesperson only told Gizmodo, “Legacy Media Lies.” The incident is decidedly the most unusual employer practice we’ve seen around the use and development of AI. Employees were understandably unhappy when xAI’s Ani, built on their data, was revealed in its highly sexualized form. But having signed a form to permit such use, there was likely little for them to do in response.

Thoughts on xAI’s Biometric Data Controversy and Musk’s Overreach

xAI’s decision to utilize biometric data from employees isn’t explicitly forbidden by law, and by involving lawyers in the conversation, the company signaled that it understood the consequences and was making a calculated decision to solicit employee data. This does, however, bring us into the grey area that the CEO’s businesses often operate in.

As these employees were specifically brought in to train the AI, some would argue that the decision to use xAI employees’ biometric data for training Ani was permissible. However, unless the job description explicitly states this would be the case, it is important to provide a clear path to opt out of such tasks, specifically with clauses that confirm no repercussions will be levied for doing so. Not only does this help to put workers at ease, but it also reasserts employee faith in the organization. 

Compliance and Consent Can Be Hard to Provide Under Pressure

The avenues of accessing AI training data have already proven controversial over the last few years, with multiple cases being fought on the grounds of illegal use. Unfortunately, when it comes to employees, there is little representation for their individual concerns. The tutors affected by the incident have not emerged to speak about their own experience with xAI soliciting their biometric data to create its NSFW content for high-paying customers, so it’s hard to verify the details of the case for now. 

Ultimately, it falls to companies to uphold the highest standards of transparency with employees and customers to ensure that there are no unfortunate incidents where workers feel cornered into making decisions. From the WSJ reports, it appears that xAI employees’ biometric rights were obtained in a way that left them with few options, which can be a concerning path to go down. 

Eventually, more AI regulations may be set in place for AI training and use both in and outside of the workplace. For now, it’s up to employers to ensure employees are well-informed and fairly compensated for the work they do for the organization.

What do you think about xAI using its employees’ biometric data for training Musk’s AI girlfriend? Share your thoughts with us. Subscribe to The HR Digest for more insights on workplace trends, layoffs, and what to expect with the advent of AI. 


Anuradha Mukherjee
Anuradha Mukherjee is a writer for The HR Digest. With a background in psychology and experience working with people and purpose, she enjoys sharing her insights into the many ways the world is evolving today. Whether starting a dialogue on technology or the technicalities of work culture, she hopes to contribute to each discussion with a patient pause and an ear listening for signs of global change.
