The rise of AI companionship

Warning: this post contains distressing content

Loneliness is rising, and some see AI as a solution to our intimacy gap. AI companions—often chatbots paired with avatars—promise to fill our need for connection. These companions are not new and have been widely reported on over the years. Alas, as with everything in the modern era, concern over this suite of technologies has subsided, replaced by other worries. Yet, the trend towards artificial intimacy continues.

There are many types of AI companions. General models, like ChatGPT, can do many things, including providing a facsimile of a real human partner. Specialized models, such as Replika and Character.AI, target your intimacy needs directly, claiming to offer friendship, romantic partnerships, and emotional support. Like other AI models, these tools are trained on a vast array of data, including data from the individuals who chat with them.

People are already spending hours chatting with their bots. There is even an entire marketplace of people cosplaying as chatbots, a trend that grew out of bots designed to mimic video game characters. Earlier this month, at the Consumer Electronics Show, an “always-on” eye-tracking 3D “soulmate” was presented as a potential solution for remote workers seeking interaction while at home. The product received a “Worst in Show” award, which drew attention to its risks, particularly from a privacy perspective.

Looking at the above examples, it may be easy to shrug off this growing trend as a fringe phenomenon. But these chatbots are having a real impact on people’s lives. Relationships are being destroyed, and false beliefs are being reinforced. Chatting with AI has also been linked to psychological spirals, suicide, and death.

The “move fast and break things” mentality, an ethos common in Silicon Valley, has contributed to the rapid development of these tools. These AI characters increasingly mimic human texting, are always available, and remove some of the natural friction of human interaction.

I am not a psychologist, so I won’t offer advice to anyone suffering from the seriously debilitating harms mentioned above. If you need medical or mental-health assistance, please seek it from a licensed practitioner.

For those navigating the AI world, here is my advice: adopt carefully and discard what doesn't fit. Before using a new AI tool, ask, "What problem does this solve?" If you try a tool, limit your use and reassess it regularly. While you use it, always fact-check. If the tool helps, use it in moderation when needed. But keep pushing yourself out of your comfort zone: spend time with friends, build relationships, and enjoy distinctly human experiences.

There are serious policy considerations at play in this area of AI development, including user privacy, mental health, and the regulation of AI companionship tools. I won’t get into those in this blog post, but I want to note that it is not solely on the individual to protect themselves against the rise of these tools.

Take care,

Emanuel
