There Is Now Also a Deepfake Video of DPM Lawrence Wong Selling Some Investment Scam


With the rise of artificial intelligence (AI), it’s sometimes difficult to tell what is real anymore.

A deepfake video of Deputy Prime Minister Lawrence Wong promoting an investment scam has been circulating on Facebook and Instagram.

The worst part is that it looks real.

Deepfake Video of DPM Lawrence Wong Promoting an Investment Scam Circulating Online

Deepfakes are media that have been altered by AI to look or sound like someone.

In the video, DPM Wong’s mouth is altered to synchronise with a fake voiceover that sounds like him.

Yes, the voiceover mimics the pitch and intonation of DPM Wong’s actual voice.

Don’t believe me? You can watch the deepfake video here:

[Embedded TikTok video from @straitstimes: “In the video, DPM @lawrencewongst’s mouth is noticeably altered to synchronise with a fake voice-over promoting an investment scam.” #sgnews #deepfake #fake #scammeralert]

Notably, the video was created from doctored footage of an interview DPM Wong gave to The Straits Times.

The deepfake video promotes an investment scam, even using phrases reminiscent of DPM Wong’s official speeches, like “my dear Singaporeans”.

An SPH Media spokeswoman clarified that the deepfake video was not created or published by The Straits Times or SPH Media. 

She said, “It has come to our attention that there is a video attributed to The Straits Times, featuring Deputy Prime Minister Lawrence Wong endorsing commercial projects, circulating online.

“We urge members of the public to stay vigilant and not circulate videos of unknown sources.”

According to The Straits Times, the police confirmed that a report had been lodged about the deepfake video.

DPM Lawrence Wong’s Response

In a Facebook post on 11 December, DPM Wong said he was aware of the deepfake video.

He added that he was aware of another fake video claiming that the government planned to reinstate a circuit breaker.


He wrote, “These are all falsehoods. Let’s stay vigilant and discerning online!”

Why the Hoo-ha?

As someone reading this article online, you’re probably no stranger to deepfake videos.

In fact, there have been deepfake videos of Ho Ching, the former chief executive of Temasek Holdings, promoting cryptocurrency.

Of course, this is yet another scam.

Speaking at the Regional Anti-Scam Conference 2023 in June, Minister of State for Home Affairs Sun Xueling noted that scammers can use deepfake technology to impersonate authority figures.


She cautioned, “We need to constantly monitor this threat, work with research institutes, relevant government agencies, market players who themselves are at the forefront of these technologies, to study ways to counter them.”

National University of Singapore associate professor Terence Sim, who researches deepfakes, spoke to The Straits Times about the dangers of such technologies in June.

He noted that a scammer merely needs a few photos of a person’s face to create a deepfake.

Furthermore, with the prevalence of social media, photos of a person’s face are easy to find.

He noted that audio deepfakes are a problem as well.

A short clip of the target’s voice is all a scammer needs to create an audio deepfake.



Deepfakes Can Trick Victims’ Friends and Family Too

Deepfakes can be created of anyone.

While an audio deepfake of DPM Wong might arouse suspicion, one of a close relative or friend could appeal to your emotions far more.

Prof Sim noted that deepfake technology can use a target’s voice to create what appear to be distress calls.

Mr Adrian Hia, managing director for Asia-Pacific at cyber-security firm Kaspersky, told The Straits Times that deepfake technology can be used to scam people anywhere.


He added that successful deepfake campaigns play into human emotions like fear, excitement, curiosity, guilt and sadness.

He noted that a sense of urgency is created when emotions are heightened, leading the victim to act without rational thought.

For instance, in May, there was a case in Inner Mongolia where a scammer used face-swopping technology to impersonate a victim’s friend during a video call.

The victim transferred S$805,000 to the scammer, believing him to be a friend in need.

There is also a growing concern about such scams in Europe, where AI has been used to recreate the sound of family members’ voices to trick people into transferring money.

The Sumsub Identity Fraud Report 2023, released in November, showed a 10-fold increase in deepfakes detected globally from 2022 to 2023.

As you probably know, Singapore isn’t safe from this phenomenon either.

Earlier this month, it was reported that the number of deepfakes detected in Singapore had jumped fivefold in the past year.


What Should Be Done About Deepfakes

Professor Mohan Kankanhalli, the executive chairman of AI Singapore, told CNA that strict regulations need to be implemented to prevent harm caused by deepfakes.

He suggested a risk-based approach, where a country would try to understand the risks posed by different types of deepfakes and regulate them accordingly.

He also stressed that regulators need to understand deepfake technology.

According to Mr Hia, there are ways to spot deepfakes.

For instance, the lighting in a deepfake video may be inconsistent.

The speaker in the video may exhibit jerky movements, blink weirdly, or not blink at all.

Another indication that a video is a deepfake is when the speaker’s lips are not synchronised with his speech.

For audio deepfakes, Prof Sim recommended that people pay attention to the words used.

For instance, if the speaker uses a phrase your loved one would not normally use, the audio could be a deepfake.

Furthermore, people should always check with their loved ones before acting.

He added, “At the end of the day, scammers do this to create a sense of urgency in the victim.”