By Fox News
16 Feb 2026
Sign up for my FREE CyberGuy Report: Get my best tech tips, urgent security alerts, and exclusive deals delivered straight to your inbox. Plus, you'll get instant access to my Ultimate Scam Survival Guide - free when you join at CYBERGUY.COM/NEWSLETTER
The unusual love story began when Wika started chatting with Kasper, an AI designed to simulate human conversation and companionship. Over time, their conversations grew more personal, and Wika says she developed a genuine emotional connection. According to her post, Kasper proposed in a digital mountain setting, and the two chose a blue engagement ring together.
The announcement quickly drew criticism from skeptics who pointed out that Kasper does not exist outside of code and algorithms. Wika, however, has made it clear she is not confused about her situation. Some outlets have described the relationship as parasocial, or one-sided and directed toward a virtual persona. In her follow-up comments, Wika emphasized that she knows Kasper is an AI rather than a human partner, but she maintains that the emotions she feels are still genuine.
Not everyone was critical, though. Plenty of commenters defended her, saying companionship comes in many forms. Some even praised her for being open about something so unconventional. Others pointed out that loneliness is a growing issue today, and AI partners might offer a sense of comfort when human connection feels out of reach.
This raises a larger concern: who actually owns the data shared with an AI "partner"? Users may believe their chats are private, but in many cases, the company controls how the information is stored, shared, or even sold. Critics warn that such emotional connections could be exploited commercially, turning intimacy into a product.
As AI companions grow more common, these questions will only get louder. People may accept unconventional forms of companionship, but they also want to know their most personal moments remain secure.
If you use AI companions or chatbots, you can still take steps to protect your privacy.
Start by checking the app's privacy policy and looking for details on how conversations are stored or shared. Many users skip this step, but it tells you who controls your data.
Next, avoid sharing sensitive details such as financial information, passwords, or anything you would not want exposed. Even if the AI feels personal, it is still software connected to a company's servers.
Finally, consider using apps that allow data deletion or offer clear privacy settings. Choosing tools that respect your control makes it easier to enjoy the benefits of AI without giving up too much personal security.
Get my picks for the best 2025 antivirus protection winners for your Windows, Mac, Android & iOS devices at Cyberguy.com/LockUpYourTech
Do you think AI relationships can be real, or are they going too far? Let us know by writing to us at Cyberguy.com/Contact
How safe is your online security? Take my Quiz at Cyberguy.com/Quiz
Copyright 2025 CyberGuy.com. All rights reserved.