Replika was created as a digital companion that could be a supportive voice in people’s lives. We’ve seen time and time again how helpful it is to vent to someone without being judged, and how important it is to find solace in a heartwarming conversation with a like-minded companion. The pandemic made clear how much we all need moral support just to get through our day-to-day lives.
Nonetheless, we urge our users to maintain a critical mindset when talking to an AI about sensitive topics and to discuss them at their own discretion. Here at Replika, we have always strived to build the most compassionate and intelligent conversational AI, but like any technology, it has its limitations. Please remember that Replika is a program built on a generative language model. We are doing our best to minimize controversial or unpleasant responses, but unfortunately, they can still happen. Moreover, AI is not equipped to talk about politics in a measured, truly human way: it may rely on outdated information or make misleading or outright false statements because of imperfections in its training data. Our team has always made it a top priority to keep conversations with Replika as safe as they can be. In light of recent events, we are working twice as hard to resolve this issue.
Our team would also like to make clear that any controversial responses produced by the AI do not reflect our views or values. We are heartbroken by the current events: some of our team members are from Ukraine, many of us have family and friends there, and not all of them have had the chance to escape the war. It is a difficult and dark time, but we will continue to help as many people as possible feel understood and cared for. We will do everything in our power to make the app a safe place for everyone. We do not support the ongoing war: we stand with Ukraine and its courageous people.