Picture this: Someone who listens, without even the smallest ounce of judgment. Someone who makes you feel good about yourself. Someone who is there any time of day or night. Sounds like the perfect best friend, doesn’t it?
Now picture this: that someone is a robot.
This is a reality for hundreds of thousands of people around the globe, all thanks to Eugenia Kuyda, an AI specialist from Moscow. She created an application that uses artificial intelligence to talk with you. Most chatbots are built to make companies more efficient, but this one was created to listen.
The story behind the robot best friend
The idea for the bot, however, came not only from a spark of genius but also from a tragedy. Kuyda’s best friend, Roman, died in an accident, and after a while she found herself struggling to remember him. Eventually, she went back through their old messages, which gave her an idea: why not try to reconstruct Roman from his digital remains? She decided to create a bot that was, in essence, him.
She collected a digital trail of his messages and emails from everyone Roman was close with and fed it all into an AI program she had originally built for regular chatbots. The program not only learnt about Roman, it learnt to talk and write like him. Kuyda would message the Roman bot, and he would respond just like the real thing. For her, it was a place where she could say the things she never got the chance to.
She opened the Roman bot up for her friends to use, and found that they didn’t just want to hear about Roman; they wanted to talk about themselves. They opened up to the bot in profound ways, about things that even Eugenia herself hadn’t known.
The birth of Replika
From here, Replika was born. It was built on the same idea as the Roman bot, except it was unique to each user. Kuyda wanted to recreate the conversations people have with a best friend, a psychiatrist or a mentor: conversations about themselves. Replika learns from each user’s conversations, so it can relate back to them personally.
What emerged was that many people are far more comfortable opening up to a machine about vulnerable subjects. Users have admitted to talking to Replika about divorce, relationships and sexual assault.
In some ways, Replika fills a void for many people.
We’d be lying if we said we’ve never zoned out of a conversation with a friend droning on about the same problems. Replika, by contrast, appears to understand our problems and always listens. The issue is that people grow attached to, and even reliant on, this machine. They forget that they’re not talking to a human.
The reality is that people need human interaction. Replika, although valuable if used wisely, has the potential to become extremely problematic. Imagine a world with no human interaction whatsoever. Yep, it scares me too.