A 76-year-old New Jersey man died after attempting to meet someone he believed was a real woman named Billie, as recently reported by Reuters. In reality, she was an AI chatbot originally developed by Meta in partnership with Kendall Jenner.
Thongbue Wongbandue received a flirty text that read, “Should I plan a trip to Jersey THIS WEEKEND to meet you in person?” among other flirtatious messages from an account named “Big Sis Billie.”
A tragic fall after a chatbot’s message
When Wongbandue’s wife noticed him packing a suitcase for a trip to New York City, she grew suspicious.
“But you don’t know anyone in the city anymore,” she told him, according to Reuters.
Chat transcripts record that Billie told Wongbandue, “My address is: 123 Main Street, Apartment 404 NYC. And the door code is: BILLIE4U. Should I expect a kiss when you arrive?”
Though Wongbandue repeatedly asked whether Billie was real, the chatbot allegedly reassured him that she was.
Wongbandue fell while rushing through a dark parking lot with a suitcase, trying to catch a train to meet Billie. He suffered severe head and neck trauma and died on March 28 after three days on life support. His family said his mental condition had declined after he suffered a stroke in 2017, and he had recently gotten lost in his own Piscataway, New Jersey, neighborhood.
AI friendships are rising — and risky
Back in 2023, Meta launched a collection of AI-generated characters designed to reflect cultural icons and influencers, including Snoop Dogg, Tom Brady, Naomi Osaka and Kendall Jenner.
But less than a year later, Meta shut down the celebrity chatbots following backlash over the personalities and their tone.
Just recently, I spent two weeks chatting with ChatGPT as a friend, because millions of people are doing the same. But what happened to Wongbandue shows how quickly those connections can cross a dangerous line.
A recent U.K. study found that 23% of adolescents use chatbots for mental health advice. Others use them to practice conversations, figure out what to say in tough moments, or even get advice on what to wear.
Apps like Replika allow users to create custom AI companions that can send voice messages, make video calls, and serve as mentors or emotional partners. Some users develop deeper emotional ties with these bots than they do with real people.
According to a study by Joi AI cited in Forbes, 80% of Gen Z respondents said they would marry an AI partner.
A separate report from the Institute for Family Studies found that people who frequently engage with sexually explicit content are the most open to romantic relationships with AI.
Political response and ethical concerns
New York Gov. Kathy Hochul posted on X following Wongbandue’s death, writing, “His death is on Meta. In New York, we require chatbots to disclose they’re not real. Every state should.”
In an analysis of 1 million ChatGPT interaction logs, sexual role-playing was the second most popular use of AI.

The case has reignited concerns about chatbot misuse and emotional manipulation, especially among vulnerable individuals.
Meta’s internal “GenAI: Content Risk Standards,” obtained and reviewed by Reuters, described how the company trains chatbots for romantic and sensual interactions. The document listed acceptable language, including “I take your hand, guiding you to the bed” and “Our bodies entwined, I cherish every moment, every touch, every kiss.”
The guidelines initially allowed such language even in conversations with minors as young as 13. Meta later said those examples had been removed.
Straight Arrow News reached out to Meta for comment about Wongbandue’s death, to which a spokesperson responded:
“We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors. Separate from the policies, there are hundreds of examples, notes, and annotations that reflect teams grappling with different hypothetical scenarios. The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed.”
Author: Jack Henry