Beth Hyland wasn’t looking for love when she moved to Portage, Michigan, three years ago. She was freshly divorced and still settling into her life as a newly single woman.
But when she watched a couple of her coworkers find healthy, loving, long-term relationships on Tinder, she decided to give it a try.
Very quickly, one profile stood out.
“It was so intriguing, so much like mine,” she said. “I swear this scammer targeted me.”
Conversation flowed naturally. The scammer showered Beth with compliments, masterfully moving the scam along without making things feel rushed. Within 10 days, they were talking about falling in love. A short video chat (which Beth later recognized as an AI-generated deepfake) gave her just enough visual confirmation to quiet any doubts.
When the scammer claimed sudden work travel, first to San Diego and later to Qatar, for an unpaid construction project, their long-distance romance intensified. He shared fabricated receipts, daily check-ins, and stories of struggle that made Beth feel like they were in it together. Soon, they were talking marriage and buying a house together.
Then the scammer hinted at a crisis: He needed to appear in person for a verification check in England but lacked the funds. He never directly asked Beth for money, but she loved him and wanted to help. So, she offered. She took out multiple loans and sent the scammer $26,000 in Bitcoin — more than a quarter of what she'd saved for retirement.
The turning point came when a supposed $10 million payout required yet another $50,000 activation fee. For the first time, Beth grew suspicious — not that her love was a scam, but that her boyfriend might himself be the victim of one. So, she told her financial advisor.
“He said: I think you are the victim of a romance scam,” Beth said. “I was like, what? I’ve never heard that term before?”
Global losses in the billions
Beth’s story is, unfortunately, not unique.
- Nasdaq’s 2024 Global Financial Crime report found romance scams and other confidence schemes account for an estimated $3.8 billion in annual global losses.
- In 2023, Americans filed nearly 18,000 romance scam complaints, with losses totaling $652 million, according to the FBI’s 2023 Internet Crime Report.
- The FTC’s Consumer Sentinel Data Book recorded more than 64,000 reports of romance scams, with U.S. losses totaling $1.14 billion.
- BioCatch customers around the world reported a 63% uptick in romance scams between 2024 and 2025.
And this isn’t happening only on dating apps and social media platforms. Fitness and interest-based networks like Strava, Fitbit, and Playtomic aren’t immune either.
Whatever the platform, the pattern is painfully familiar: Someone starts following; kudos and comments graduate to built-in messenger chats; and, eventually, the conversation migrates to encrypted messaging platforms like WhatsApp or Telegram.

Evidence of this trend from Strava and Playtomic
At that point, the pace intensifies. Victims consistently report romance scammers messaging them all day long, isolating them from friends and family while deepening emotional dependence.
Increasingly, this emotional grooming is becoming a gateway to romance-investment scams. In those instances, the scammer presents themselves as financially successful, often mentioning a strategy that recently made them good money. The scammer then introduces the victim to a fake trading platform, sends them token returns to build trust, and gradually coerces the victim into depositing larger and larger sums. When the victim attempts to withdraw their supposed earnings, they discover there is no money. They’ve been looking at a fake portfolio. All of the funds went directly to the scammer, who will soon (if they haven’t already) disappear.
An elusive threat
From a bank’s perspective, these scams are among the hardest to stop. They appear legitimate from start to finish. Unlike account takeover, the criminal never hacks the account. There is no coercion, as seen in law enforcement impersonation scams. Instead, the customer willingly initiates the payment, convinced they are helping someone they love, someone with whom they plan to build a future.
Most banking fraud systems were designed to catch what deviates: unusual devices, abnormal transaction patterns, unfamiliar geographies, sudden spikes in velocity. Romance scams rarely trigger those alarms. Victims use their own devices, log in from familiar locations, and transact in ways that appear consistent with their historical behavior. In many cases, the first payment is a small and deliberate test of trust followed by larger transfers only after the emotional manipulation has fully taken hold.
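To make the point concrete, here is a minimal sketch of the kind of deviation-based rules described above. The field names and thresholds are illustrative assumptions, not any real bank's rules — the takeaway is that a typical romance-scam payment trips none of them:

```python
# Hypothetical rule-based fraud checks of the kind described above.
# Field names and thresholds are illustrative assumptions, not any
# real bank's rules.

def rule_based_flags(txn, profile):
    """Return the list of classic anomaly rules a transaction trips."""
    flags = []
    if txn["device_id"] not in profile["known_devices"]:
        flags.append("unusual_device")
    if txn["country"] != profile["home_country"]:
        flags.append("unfamiliar_geography")
    if txn["amount"] > 5 * profile["avg_transfer"]:
        flags.append("abnormal_amount")
    if txn["transfers_last_hour"] > 3:
        flags.append("velocity_spike")
    return flags

# A typical romance-scam payment: the victim's own device, home
# country, a modest first "test of trust" amount, no burst of activity.
profile = {
    "known_devices": {"dev-123"},
    "home_country": "US",
    "avg_transfer": 400.0,
}
txn = {
    "device_id": "dev-123",
    "country": "US",
    "amount": 500.0,
    "transfers_last_hour": 1,
}

print(rule_based_flags(txn, profile))  # prints [] — no rules fire
```

The same rules would catch a stolen-credential attack (new device, huge transfer), which is exactly why systems built around them are blind to a willing victim.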
Industrialization is scaling the devastation
Gone are the days of clumsy and poorly spelled emails. Modern scammers use generative AI to perfect tone, spelling, grammar, and emotional pacing. They research victims across social platforms, tailor narratives to life events, and deploy fake profiles that are nearly indistinguishable from real people.
Across parts of Southeast Asia, romance and investment scams are run from industrial-scale scam centers. Fortified compounds in border regions of Myanmar, Cambodia, and Laos operate like call centers for fraud. Investigations by international law enforcement and news agencies show these hubs running scripted romance and “love-investment” scams at scale, often using trafficked or coerced workers, generating billions annually.
How behavioral intelligence reveals social engineering in real time
Stopping social engineering scams requires a fundamentally different approach — one that focuses not only on what a customer is doing, but also how and why they’re doing it.
BioCatch Scams360 uses behavioral intelligence derived from thousands of subtle, real-time signals during a digital banking session, including changes in typing rhythm, navigation patterns, app switching, active VOIP calls, idle screen time, and the presence of risky apps, such as crypto-related ones. Together, these signals act as predictors of a customer’s cognitive state — whether they are pressured, confused, hesitant, frustrated, or distracted — revealing when someone is acting under external influence, even when the transaction itself appears completely normal.
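As a toy illustration of the idea, in-session signals like these can be combined into a single risk indicator. The signal names and weights below are assumptions for the sketch, not BioCatch's actual features or model:

```python
# Illustrative only: signal names and weights are assumptions for this
# sketch, not BioCatch's actual features or scoring model.

SIGNAL_WEIGHTS = {
    "typing_rhythm_change": 0.25,  # hesitant, dictated-feeling typing
    "active_voip_call": 0.30,      # being coached over the phone mid-session
    "long_idle_screens": 0.15,     # pausing to read instructions
    "app_switching": 0.15,         # flipping to a messaging app
    "risky_app_present": 0.15,     # e.g. an unfamiliar crypto app
}

def session_risk(signals):
    """Combine boolean behavioral signals into a 0-1 risk score."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

# A session that looks transactionally normal but behaviorally coerced.
session = {
    "typing_rhythm_change": True,
    "active_voip_call": True,
    "long_idle_screens": True,
}
print(round(session_risk(session), 2))  # prints 0.7
```

A production model would learn such weights from labeled sessions rather than hand-tune them, but the principle is the same: the transaction looks normal, while the behavior around it does not.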
At one major APAC bank, Scams360 resulted in a 67% improvement in scam capture rate within the first month, successfully identifying more than 77% of attempted scams. As the system continued to learn and adapt, performance improved further, with the bank identifying nearly 84% of attempted scams in subsequent months.
Another large APAC bank saw a 73% improvement in scam capture rate in its very first month after deployment.
In The human cost of scams, my colleague shows how subtle changes in a customer’s digital behavior reveal manipulation, illustrating how several behavioral intelligence signals detect vulnerability and expose the true picture.
After confronting her scammer, Beth endured years of heartbreak, grief, and anger, blocking him, wavering briefly, and then walking away for good. She wrote a book that details her ordeal, founded an LLC, and now educates others on how to avoid and survive romance scams like hers.
Behavioral intelligence might not have stopped her heartbreak. But it could have stopped her from losing a quarter of the money she’d saved for retirement.