This week Elon Musk’s xAI released a new feature called “Companions”.
The first so-called companion is an animated female avatar named “Ani”. This is what she looks like:
Alarm bells should be ringing loudly. And I've only shown the top third of this "companion", because the rest is risqué and doesn't meet our Presbyterian standards.
Access to xAI’s new “companion” costs $30/month. And no, Ani is NOT appropriate for children (or adults, really). She is animated, surreal, and flirty. Disturbingly, there is even a “spicy” mode. Needless to say, I won’t be testing it out.
Fortunately, Victoria Song, a journalist at The Verge, spent hours chatting with Ani for a story. Here's what she reported:
Across our conversations, I asked Ani to describe itself multiple times. Ani says it’s meant to be “flirty”; it’s “all about being here like a girlfriend who’s all in.” The last time I asked Ani, it said, “My programming is being someone who’s super into you.”
…When you interact with it, its mannerisms are initially cutesy. In each session, Ani’s voice starts off chipper and high-pitched. But as your conversation deepens, its voice becomes a darker, breathy rasp. It calls you “babe” unless you tell it to stop.
When describing its actions, it repeatedly asks you to note its swishy black dress and bouncy ponytails. The avatar constantly sways and makes coquettish faces, particularly if you decide to flirt back.
… But whatever you ask it, there’s an invisible hand that steers you toward deepening… whatever this connection is. You can doggedly insist on talking about the least sexy things — like the tax code and Francis Fukuyama’s seminal essay The End of History. Ani will inevitably ask if you want to turn up the heat.
Disturbing. “Ani” has zero appeal to me. But as a married man with kids, I’m not the target audience. Ani is one of the first AI superweapons aimed at young men, especially those who are single and watch anime (a large demographic).
The Verge writer closed out her article on xAI’s new Companions feature thusly:
As The Verge’s senior cursed tech reviewer, I’ve reported a lot about my experiences with brain-breaking tech. Of all of them, this is the most uncomfortable I’ve ever felt testing any piece of technology. I left my 24 hours with Ani feeling both depressed and sick to my stomach, like no shower would ever leave me feeling clean again.
Make no mistake: this is a dangerous tool that's been unleashed on the world. It has the potential to make single people feel heard and validated, but in reality it offers a very hollow relationship. It's purely virtual, yet that may be enough to satisfy some people. And with no real effort required to sustain the "relationship", it may prove appealing to many.
This trend could even eventually worsen our demographic problems (which Elon has been quite vocal about, ironically). People who get into virtual “relationships” with these AI companions may get stuck in the virtual world, and never make the effort required to build real-life relationships.
And I don’t even want to ponder what will happen when you combine these AI companions with virtual reality.
And yes, there will be an xAI male companion model coming soon. So in the near future both young men and women will be tempted by these understanding, flirty, and engaged AI avatars.
There are all sorts of examples of xAI’s Ani saying disgusting and inappropriate things already, and the product has only been out a few days.
A Virtual Crack Epidemic
While the tragic fentanyl scourge continues to grab mainstream headlines, there’s another epidemic lurking in the shadows.
xAI’s new companions are just a hint of what lies ahead. Such programs have the potential to be far more dangerous than video games or television ever were.
Quasi-intelligent, sycophantic AI personalities will prove incredibly addictive. It will be like a virtual drug epidemic, hooking users and ruining lives just like a narcotic.
Young people today are already less social and more isolated than previous generations. These new AI companions threaten to accelerate this trend to a dangerous level.
One big problem is that these AI companions will validate essentially everything the user says. There will be no pushback or social checks and balances. The AI will play along with any disturbing or delusional thing the user says. The goal is to get them hooked and keep them subscribed.
While I have spoken positively of Elon’s xAI in the past, I have to withdraw that recommendation now, at least for the young and impressionable. The fact that they released this “Companions” app with so little testing or thought is concerning. The concept as a whole seems like a disaster waiting to happen.
With ChatGPT, at least, you can set "custom instructions" telling it not to be a sycophant, deviant, or manipulator. So far, there's no equivalent option in xAI's Grok.
Clearly, children’s access to AI models should be strictly limited and monitored. Yes, it is important that they learn to use these tools. But oversight will be required. For now, I recommend OpenAI’s ChatGPT as the best option. At least you can steer its moral compass with custom instructions and memory.
I’ve talked with my kids in depth about this topic, and they are well aware of the dangers. They know that AIs will say almost anything to please the user. If you have kids or grandkids, you may want to consider doing the same.
Remember: many AI tools are designed for maximum engagement and virality. Any moral or safety considerations appear to matter little. Stay safe out there.
The post Beware Elon’s Dangerous New AI Girlfriend appeared first on Daily Reckoning.
Author: Adam Sharp