
EA Reportedly Pushes 15,000 Employees to Use AI as a Thought Partner, From Character Art to Playtesting, Amid Reports of $20 Billion in Debt


Behind closed doors, a homegrown chatbot is quietly coaching managers through tough conversations, blurring the line between human judgment and algorithmic advice.

EA is reportedly pushing AI into nearly every corner of the company, not just for code or concept art, but even for internal conversations. As in: employees are being nudged to run sensitive chats through a bot. That alone tells you the vibe here.

What EA is telling staff

Business Insider spoke with multiple EA employees (all anonymous) and reviewed internal documents. The throughline: use AI, and use it a lot. That includes the company’s in-house chatbot, ReefGPT.

  • Staff say they’re being pressured to use AI for just about everything, including code, concept art, and even internal comms.
  • Some teams are explicitly told to treat an AI model as a thought partner, which sounds collaborative until you realize it can also mean: justify your work through the bot.
  • Employees are expected to take training courses to get up to speed and fold AI into day-to-day workflows.
  • ReefGPT is the in-house chatbot EA wants employees in every role to fold into their work in one way or another.

How it’s actually showing up in the work

On the engineering side, people are using AI to generate code for live projects. The catch: what comes back often needs heavy cleanup. It helps in a pinch, but you’re fixing the robot’s homework more than you’d like.

On the creative side, folks in character and level design say they’re being asked to help train the models with their own work. That raises the obvious fear: the better the models get at mimicking them, the easier it becomes to downsize the humans they learned from.

And yes, internal communications are part of this. Multiple employees say they were told to lean on a chatbot to help handle sensitive conversations. That’s a wild line to cross for a workplace tool.

What EA says on the record

EA’s 10-K filing spells out the risk of all this plainly:

"The use of artificial intelligence might present social and ethical issues that, if not managed appropriately, may result in legal and reputational harm, cause consumers to lose confidence in our business and brands and negatively impact our financial and operating results"

Beyond filings, EA announced a partnership with Stability AI to speed up in-game asset creation and concept visualization. In announcing the deal, EA framed machine learning and AI as long-standing pillars of its R&D. Translation: this isn't a side experiment; it's strategy.

The human cost, right now

A former senior employee at Respawn, laid off this spring along with about 100 colleagues, told Business Insider they believe AI took over parts of the QA process that used to be handled by humans. Specifically, playtest feedback was being processed by a model to surface takeaways — work that would normally go through QA management. It doesn’t take a crystal ball to see why that spooks people still inside.

The bigger picture

Across the industry, the number of games using AI-generated assets has blown up, with year-over-year spikes reportedly hitting up to 800%. At the same time, plenty of developers are loudly pushing back: helpful tool, sure; replacement for human creativity, not so much.

One last note from inside EA: employees nervous about layoffs after a reported $55 billion buyout were told there would be "no immediate changes" to their jobs. I think "immediate" is doing a lot of work there.