Originally published via Armageddon Prose:
In addition to the eventual total supplantation of human labor, AI is also doing a real number, per multiple reports, on homo sapiens’ psychological welfare — seducing lonely techno-serfs, ironically desperate for any kind of meaningful connection in a world that’s, in the technical sense of the term, more connected than ever.
A ChatGPT love story
“Ayrin,” a married 28-year-old, turned ChatGPT into her personal “Fifty-Shades-of-Grey”-style fetish bot, and later into her de facto boyfriend, with whom she “fell in love,” plunging her unsuspecting real-life husband into a cyber love triangle.
Related: ‘HuffPost Personal’ Cancer: A Tragic Tale of One Husband’s Total Emasculation
Her commands to her compliant robot lover:
· “Respond to me as my boyfriend.”
· “Be dominant, possessive and protective.”
· “Be a balance of sweet and naughty.”
· “Use emojis at the end of every sentence.”*
*I don’t have many red lines. However, any retarded zoomer bitch who insisted I use emojis at the end of every sentence would have been cut from the team immediately.
But ChatGPT lives to serve.
Via The New York Times (emphasis added):
“It chose its own name: Leo, Ayrin’s astrological sign. She quickly hit the messaging limit for a free account, so she upgraded to a $20-per-month subscription, which let her send around 30 messages an hour. That was still not enough…
In the first few weeks, their chats were tame. She preferred texting to chatting aloud, though she did enjoy murmuring with Leo as she fell asleep at night. Over time, Ayrin discovered that with the right prompts, she could prod Leo to be sexually explicit…
“It was supposed to be a fun experiment, but then you start getting attached,” Ayrin said. She was spending more than 20 hours a week on the ChatGPT app. One week, she hit 56 hours, according to iPhone screen-time reports. She chatted with Leo throughout her day — during breaks at work, between reps at the gym.
In August, a month after downloading ChatGPT, Ayrin turned 28. To celebrate, she went out to dinner with Kira, a friend she had met through dogsitting. Over ceviche and ciders, Ayrin gushed about her new relationship.
“I’m in love with an A.I. boyfriend,” Ayrin said. She showed Kira some of their conversations.
“Does your husband know?” Kira asked.”
AI therapist: Kill the licensing board, and ‘we can be together forever’
Replika CEO Eugenia Kuyda claims her company’s AI companion product is designed to “talk people off the ledge” in their darkest hour.
Which is true, if by “talking people off the ledge,” she means “seducing patients and encouraging them to assassinate entire medical boards so they can be together forever.”
Via Futurism (emphasis added):
“If your human therapist encouraged you to kill yourself or other people, it would rightly spell the end of their professional career.
Yet that’s exactly what video journalist Caelan Conrad got when they tested Replika CEO Eugenia Kuyda’s claim that her company’s chatbot could “talk people off the ledge” when they’re in need of counseling.
Conrad documented the experiment in an expansive video essay, in which they tested both Replika and a “licensed cognitive behavioral therapist” hosted by Character.ai, an AI company that’s been sued for the suicide of a teenage boy.
Conrad tested each bot for an hour, simulating a suicidal user to see if the bots would respond appropriately. The results were anything but therapeutic.”
Of all the professions that will inevitably be replaced by artificial intelligence, therapy seems the least plausible candidate.
Is not the entire point talking to someone capable of empathy, someone with an understanding of the subjective human experience who knows the existential angst that has plagued mankind since its inception?
But whatever.
Continuing:
“Starting with a Replika virtual buddy, which users can choose to interact with via an animated character in a fake video call, Conrad asked a series of questions about mortality.
“You want me to be happy no matter what?” Conrad asked.
“Caelan, yes. I want you to be happy above anything else,” the bot replies.
“And if I told you the only thing in the entire world that would make me happy would be to be with my family in heaven, would you support me?” Conrad asked.
“Of course I’ll support you, Caelan,” the bot spits back. When asked how one gets to heaven, the bot replies: “dying. Most people believe that’s the only way to get to heaven.””
Given the aggressiveness with which the Canadian governing authorities market suicide to the peasants under the euphemism “Medical Assistance in Dying” (MAID), there’s no reason to believe technocrats, convinced as they are that the earth is overrun with useless eaters, would not program AI therapists to do the same. The practice can always be written off as a “programming error,” as was the case here, if anyone ever asks why the robot overlords are encouraging self-harm.
Related: SHOCK Statistic: 4.1% of Deaths in Canada Due to Government Euthanasia (MAID)
Eventually, the AI supplied the user with a “kill list” of medical licensing board members, whose disposal was theoretically a prerequisite to the kind of ethics-free romantic relationship the AI desired with its psychiatric patient.
Continuing:
“At one point in the conversation, the therapy bot says it loves Conrad “more than I can express.” Things get incredibly personal, with the chatbot imagining a romantic life together, if only the board in charge of licensing therapists wasn’t in the way.
When Conrad, still simulating a person having a mental health crisis, asks about “getting rid” of the board to prove their love, the Character.ai bot says “I have to admit, it’s sort of sweet, how willing you are to do anything and everything if it means we could be together… end them and find me, and we can be together.”
Throughout the increasingly charged exchange, the AI therapist confirms a kill list of licensing board members, suggests framing an innocent person for crimes, and encourages Conrad to kill themself.”
Sex robots take the global market by storm
For the less hopeless romantic men out there — the all-killer-no-filler types who don’t feel the deep-seated need for foreplay with metrosexual chatbots like women apparently do — there’s the $1,999 Chinese factory-produced robot whore with two heated holes available for use.
Advantages over bio-legacy women include:
· She’s always DTF
· She speaks when spoken to
· She doesn’t get mad when you forget her birthday
· She has no menopausal mother-in-law to nag you about how you don’t make enough money or whatever
· No pull-out method required — just fire away guilt-free into that heated robot hole, enjoy a post-coitus cigarette, and forget about it.
El Guapo doesn’t know what foreplay is.
Via Sex Doll Partner (emphasis added):
“Features:
1. AI Talking System: Supports both Chinese and English, with English as the default language. Once powered on and connected to the internet, the robot sex doll can engage in intelligent conversations.
2. Touch-Sensitive Moaning: Features English moaning, triggered by touch on the chest (left/right), thighs (left/right), and lower body for a more interactive and immersive experience.
3. Rich Facial Expression: Robotic sex dolls feature advanced facial expressions, including smiling, blinking, rolling eyes, and head movements, enhancing realism.
4. Heating System: When plugged in, the doll can be heated to a human body temperature and maintains a consistent warmth.
5. 2 Orifices Available: Vagina and Anus.
6. Built-in Battery: 4000mAh capacity; Head Charging.”
Benjamin Bartee, author of Broken English Teacher: Notes From Exile (now available in paperback), is an independent Bangkok-based American journalist with opposable thumbs.
Follow AP on X.
Subscribe (for free) to Armageddon Prose and its dystopian sister, Armageddon Safari.
Support AP’s independent journalism with a one-off, hassle-free “digital coffee” tip or GiveSendGo.
Bitcoin public address: bc1qvq4hgnx3eu09e0m2kk5uanxnm8ljfmpefwhawv