The emergence of chatfishing: Why your potential match on a dating app could actually be a bot
By Kajal Sharma - 18 Feb 2026 05:53 PM
In the past, dating apps were criticized for pushing users to show off highly edited versions of themselves. The issue now is more complicated: it is possible that people are no longer composing their own messages at all. According to Scientific American, users are increasingly turning to artificial intelligence to help them flirt, reply, and keep conversations going on dating apps. The tactic now has a name: "chatfishing." The tools that enable it are easy to obtain and becoming harder to detect. According to Scientific American, users can turn to "wingman apps," paste messages into chatbots like ChatGPT, or rely on AI coaching tools built into the dating apps themselves.
According to a 2025 Norton study, "six out of ten dating app users think they've come across at least one conversation written by AI." Research also shows that people struggle to tell machine-generated language from human writing, which raises concerns about consent, emotional connection, and authenticity in online dating.

Chatfishing points to a deeper unease with text-based intimacy that goes beyond dates and matches. Humans evolved to communicate through voice, facial expressions, and physical presence, yet dating apps reduce connection to text-based exchanges, a format where algorithms frequently outperform people.

"When I look at chatfishing through a psychological lens, the key difference is intent," says Dr. Sakshi Mandhyan, a psychologist and the creator of Mandhyan Care, in an interview with IndianExpress.com. Traditional catfishing sets out to create a fake identity. Chatfishing, by contrast, is mainly about concealing perceived inadequacy: the person is real, but their emotional voice is outsourced.

"I see this often in people who struggle with low relational confidence or social anxiety," she adds. They worry about saying the wrong thing, coming across as uninteresting, or getting rejected too soon. Using AI feels like emotional armor. Psychologically, this ties into impression management and low self-efficacy. Because they are not lying about who they are, people convince themselves it is harmless; they believe they are merely "editing" themselves.