Artificial Intelligence | Cyber Security

Beware Cupid’s Deception: Tenable Warns Against Deepfakes and AI Boosting Romance Scams

(source: Canva)

In the past few years, romance scams have become scarier than ever before. Scammers are now integrating advanced technologies, such as generative AI and deepfakes, to trick people. These enhanced tactics make it even harder to tell if someone is real or not. 

Various media outlets have reported a rapid rise in love-related scams across Southeast Asia in recent years. In Indonesia, for instance, 88 Chinese nationals were arrested for their involvement in a cross-border telephone and online romance scam syndicate. In Singapore, victims reportedly lost SGD25.9 million to love scams in the first six months of 2023.

Using original and edited videos, audio manipulation, and face-tracking webcam tools, scammers can trick people into handing over money or compromising themselves. These tactics have also fuelled a rise in sextortion involving digitally altered images, in which scammers blackmail victims by threatening to expose explicit content featuring their likenesses.

Tenable, the Exposure Management company, sheds light on the romance scams likely to take place in 2024, revealing the sinister ways scammers will exploit their targets:

Generative AI and deepfakes crafting convincing personas

Celebrity impersonations have become widespread. Online tools and tutorials make it easy for scammers to map a celebrity's likeness onto their webcam feed, blurring the line between reality and deception. These scams often begin on social media sites such as Facebook, which can lull victims into a false sense of security.

Preying on the vulnerable

A worrying trend is that scammers target elderly individuals, especially those who are widowed or have memory issues. They strike up conversations with a victim and assess their familiarity with technology before using pre-recorded videos or live interactions to deceive them. In one notable Facebook scam, an elderly artist was defrauded of USD500,000 by a scammer using a deepfake of actor Mark Ruffalo.

Protecting yourself from Cupid’s deception

In the realm of online relationships, requests for money from newfound connections should set off immediate alarm bells. It is also crucial to scrutinize photographs and videos that deliberately conceal background details, as this hinders online verification.

“While social media platforms may lack explicit guidance on romance scams, I urge users to report any suspicious activities using the available reporting tools. Awareness and vigilance are our best defences against these heartless manipulations, ensuring that love seekers don’t fall victim to the tangled web of AI-enhanced deception,” said Chris Boyd, staff research engineer at Tenable.


Written by
Tech Beat Philippines

Tech Beat Philippines is the social media news platform for all things technology. It is also a part of the GEARS section on Daddy's Day Out.


