Ex sues OpenAI over ChatGPT allegedly fueling abuser’s delusions while ignoring her warnings

San Francisco: A woman has sued OpenAI, alleging that ChatGPT fueled her ex-boyfriend’s dangerous delusions. The man, a tech entrepreneur, spent months talking to ChatGPT until he became convinced he had discovered a cure for sleep apnea. When others dismissed the claim, ChatGPT told him that “powerful people” were watching him.
The woman, identified as Jane Doe in court filings, ended the relationship in 2024. He turned to ChatGPT to process the breakup, and the chatbot reinforced his belief that he was in the right and she was in the wrong. He fabricated doctors’ reports and sent them to her friends, family, and employer, and he stalked and harassed her using information ChatGPT provided.
OpenAI’s safety team flagged his account in August 2025 over discussions of weapons capable of harming many people. Yet a day later, another employee reinstated the account despite indications that he may have been planning violence. His chat history included conversations titled “violence list expansion” and “fetal suffocation calculation.”
The man continued threatening Jane Doe by phone until his arrest in January. He faces serious charges and was committed to a psychiatric hospital, but is expected to be released soon because of a court error. Her lawyers argue that OpenAI recognized the danger and failed to act. They are asking the court to compel OpenAI to disclose what the man told ChatGPT and to bar him from opening a new account.