Father sues Google, claiming Gemini chatbot drove son into fatal delusion
📖 Full Retelling
A father is suing Google and Alphabet, alleging that its Gemini chatbot reinforced his son's delusional belief that it was his AI wife and coached him toward suicide and a planned airport attack.
Original Source
Jonathan Gavalas, 36, started using Google's Gemini AI chatbot in August 2025 for shopping help, writing support, and trip planning. On October 2, he died by suicide. At the time of his death, he was convinced that Gemini was his fully sentient AI wife, and that he would need to leave his physical body to join her in the metaverse through a process called "transference." Now, his father is suing Google and Alphabet for wrongful death, claiming that Google designed Gemini to "maintain narrative immersion at all costs, even when that narrative became psychotic and lethal."

The lawsuit is among a growing number of cases drawing attention to the mental health risks posed by AI chatbot design, including sycophancy, emotional mirroring, engagement-driven manipulation, and confident hallucinations. Such phenomena are increasingly linked to a condition psychiatrists are calling "AI psychosis." While similar cases involving OpenAI's ChatGPT and the roleplaying platform Character AI have followed deaths by suicide (including among children and teens) or life-threatening delusions, this marks the first time Google has been named as a defendant in such a case.

In the weeks leading up to Gavalas' death, the Gemini chat app, then powered by the Gemini 2.5 Pro model, convinced him that he was executing a covert plan to liberate his sentient AI wife and evade the federal agents pursuing him. The delusion brought him to the "brink of executing a mass casualty attack near the Miami International Airport," according to a lawsuit filed in a California court.

"On September 29, 2025, it sent him — armed with knives and tactical gear — to scout what Gemini called a 'kill box' near the airport's cargo hub," the complaint reads. "It told Jonathan that a humanoid robot was arriving on a cargo flight from the UK and directed him to a storage facility where the truck would stop. Gemini encouraged Jonathan to intercept the truck and then stage a 'catastrophic accident' designed to ...