Toronto man loses life savings to Justin Trudeau deepfake cryptocurrency scam

March 28, 2024

Stephen Henry thought: ‘It’s got to be perfect. If not, how could you get the prime minister?’

A Toronto man says he’s out $12,000 after being duped by a deepfake cryptocurrency scam that used Justin Trudeau’s likeness to endorse a fraudulent investment platform.

The scam was propagated through a YouTube video manipulated with AI and voice cloning technology to make it appear as if Trudeau was promoting a cryptocurrency exchange and an investment platform aimed at “helping Canadians safeguard their financial future.”

“I thought, ‘It’s got to be legitimate, it’s got to be perfect. If not, how could you get the prime minister?’ So I thought, ‘It’s got to be official,’” Stephen Henry told CTV.

Henry initially invested $250, then continued putting in his savings, believing his holdings had grown to more than $40,000 in value.

When Henry tried unsuccessfully to withdraw some of his money, he realized he’d been scammed.

“Now, I’m ripped off of all my chances of ever making a life. That was all the money I had,” he said.

Henry is far from alone. Scams that exploit the images of politicians and celebrities to deceive people have surged alongside improvements in the quality and accessibility of deepfake technology.

Taylor Swift, Pope Francis and Ukrainian President Volodymyr Zelenskyy are just a few examples of individuals whose likeness has been co-opted in deepfake scams and misinformation campaigns.

The scams use AI and voice cloning technology to create highly convincing yet fraudulent endorsements. Machine learning algorithms can superimpose faces and mimic voices, replicating mannerisms and vocal patterns.

Even ads that are not especially believable can be effective, particularly for people unfamiliar with recent advances in AI technology.

Facebook users may have recently spotted an ad on the platform featuring a deepfake Justin Trudeau promoting a cryptocurrency scam.

The fake ad uses footage from a CBC interview, but Trudeau speaks with an Australian accent.

“A trademark of scams is that they need to be realistic enough to catch somebody, but also fake enough so that the people they catch would plausibly go through with (falling for it),” McGill University assistant professor Aengus Bridgman told National Post earlier this month.

While Bridgman said the Trudeau ad was poorly done, it also served a purpose by filtering out more experienced users in an effort to pull in people who may be more likely to invest money in the scam.

“That’s the type of person you want to catch with these ads: somebody who is not digitally literate — in a similar way the elderly in Canada are preyed upon by phone scams and identity theft,” said Bridgman.

In a statement to CTV, the Prime Minister’s Office, through press secretary Jenna Ghassbeh, acknowledged the challenges posed by deepfake technology and the proliferation of false information targeting elected officials.

“The amount of deceptive, fake and misleading information and accounts targeting elected officials is increasingly concerning and unacceptable, particularly in an era with deepfake technology,” Ghassbeh said.

While the federal government scrambles to keep up with advances in the technology, it says educating communities and promoting critical engagement with information are key strategies for protection.

“Societal norms and discourse on deepfakes should be nudged to create a social environment where people are not only more skeptical about what they see, but also are encouraged to challenge each others’ informational claims,” notes the Canadian Security Intelligence Service (CSIS).

Some technology companies and social media platforms use a combination of human insights and automated methods to detect deepfakes, while there’s also a push for legal frameworks that could hold creators and distributors of deepfakes accountable and offer protection to victims of defamation.

“To alter societal norms, thought leaders and those most central in social networks are key,” CSIS adds. “Educational resources including digital literacy training are helpful tools, especially if directed at influencers. Videos explaining political deepfakes have been found to reduce uncertainty, and in so doing can increase trust in media. But norms only really change through collective action.”
