PM Lee’s Deepfake in Investment Scam Alert: A Cautionary Tale
Imagine this: you’re leisurely scrolling through your usual YouTube shorts, and suddenly, an unexpected advertisement pops up.
Prime Minister (PM) Lee Hsien Loong appears to be promoting a crypto-trading video on the Beijing-based news outlet China Global Television Network (CGTN).
Hold on…wait a minute…
Yes, PM Lee seems to be discussing the benefits of a hands-free crypto trading platform, which boasts the ability to compute algorithms, analyse market trends, make strategic investment decisions, and execute trades—all autonomously, without any manual input from the user.
Come on, that’s definitely fake.
Indeed, while the video’s premise is out of the ordinary, it still manages to look real and convincing.
Now, let’s hear from the real PM Lee.
“‘Tis the season for scams!” PM Lee humorously wrote in his Facebook post, aiming to remind Singaporeans to remain vigilant against such deepfake scams.
On 29 Dec, PM Lee shared a recent deepfake video that has been circulating online.
Elaborating on the type of scam involved, PM Lee explained that scammers employ AI (artificial intelligence) technology to mimic our voices and images.
They transform real footage of us, taken from official events, into very convincing but entirely bogus videos of us purportedly saying things we have never said.
PM Lee urged people not to respond to such scam videos, which promise guaranteed returns on investments.
And, of course, PM Lee reminded Singaporeans that official videos are always available on the official Prime Minister’s Office YouTube channel.
Sadly, people often fall prey to scams simply because they did not take the extra step to cross-reference (at least for claims that can be cross-referenced).
In the classic WhatsApp scams, sometimes all it takes to save the day is to call, or meet in person, the friend or family member who suddenly asks you for money.
More About Deepfake
Deepfake is essentially the 21st century’s answer to Photoshopping.
Introduced in 2017, deepfakes use a form of artificial intelligence called deep learning to create images of fake events, hence the name deepfake.
Initially, the technology was predominantly used in pornographic videos, where people mapped the faces of their favourite female celebrities onto porn stars, thereby fuelling revenge porn.
All it required was one or a few videos of the person whose face you wanted to replace, and, of course, images of the celebrity.
Under the hood, two face models are trained with a shared encoder and one decoder per identity; in simpler terms, you just feed encoded images into the “wrong” decoder.
For a convincing video, the mapping of the image to the target video has to be meticulously done on every frame.
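The shared-encoder, two-decoder idea above can be sketched in a few lines. This is purely illustrative: simple linear maps stand in for the neural networks, and all dimensions and names are invented for the example.

```python
import numpy as np

# Toy sketch of the classic deepfake architecture: one SHARED encoder
# compresses any face into a small latent code, while each identity
# has its OWN decoder. Swapping a face means encoding face A, then
# decoding that latent code with B's decoder -- the "wrong" decoder.
# (Random linear maps stand in for trained neural networks.)

rng = np.random.default_rng(0)
FACE_DIM, LATENT_DIM = 16, 4          # invented sizes for illustration

encoder = rng.standard_normal((LATENT_DIM, FACE_DIM))    # shared by both faces
decoder_a = rng.standard_normal((FACE_DIM, LATENT_DIM))  # "trained" on face A
decoder_b = rng.standard_normal((FACE_DIM, LATENT_DIM))  # "trained" on face B

def encode(face):
    # Compress a face into its latent representation
    return encoder @ face

def swap_to_b(face_a):
    # Feed A's latent code into B's decoder: out comes a "face B"
    # rendering that keeps A's pose and expression
    return decoder_b @ encode(face_a)

face_a = rng.standard_normal(FACE_DIM)
fake_frame = swap_to_b(face_a)
print(fake_frame.shape)  # (16,)
```

In a real deepfake pipeline, this swap is repeated on every frame of the target video, which is why the mapping must be done meticulously frame by frame.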
From simple face-swapping, deepfake technology has undergone numerous cycles of advancement, with increasingly sophisticated AI algorithms being developed.
Deepfakes are no longer limited to videos; they extend to audio as well.
In essence, deepfake can create “voice skins” or “voice clones” of the speaker in a video, just like the one featuring PM Lee.
But whether the output audio closely resembles the National Day Rally speech voice we are accustomed to hearing is up to the individual listener.
Now, a wide array of users, from academic and industrial researchers to amateur enthusiasts, visual effects studios, porn producers, and, unfortunately, scammers, are utilising deepfake technology.
As PM Lee noted, the deployment of deepfake technology to spread disinformation is likely to escalate.
And, true enough, in just the past few months, we’ve witnessed a surge of convincing deepfake videos, including those of Taylor Swift and Donald Trump speaking Mandarin, and most recently, a deepfake video of Deputy Prime Minister Lawrence Wong promoting an investment scam circulating on Facebook and Instagram.
Watch this video to find out more about Deepfake:
How to Spot A Deepfake Video?
Executive chairman of AI Singapore, Professor Mohan Kankanhalli, told Channel News Asia that it is important for regulators to have a thorough understanding of deepfake technology.
According to Mr Hia, there are identifiable signs of deepfakes.
For instance, the lighting in a deepfake video may be inconsistent, the speaker might not blink or may blink excessively, or the speaker’s lips may not be synchronised with their speech.
Additionally, inconsistencies such as a commonly-used phrase being omitted or unnecessarily included can be tell-tale signs.
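The blinking cue above is one that automated detectors also use. A common heuristic is the “eye aspect ratio” (EAR), which drops sharply when the eye closes; a video where the EAR never dips can be suspicious. Below is a minimal sketch, assuming you already have per-eye landmark points from a face-landmark library (the landmark ordering, the 0.2 threshold, and the 2-frame minimum are conventional choices, not fixed rules).

```python
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, in the common
    dlib-style ordering. The ratio of vertical to horizontal eye
    openness falls sharply during a blink."""
    eye = np.asarray(eye, dtype=float)
    v1 = np.linalg.norm(eye[1] - eye[5])   # vertical distance 1
    v2 = np.linalg.norm(eye[2] - eye[4])   # vertical distance 2
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal distance
    return (v1 + v2) / (2.0 * h)

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks in a per-frame EAR series: a blink is a run of
    at least `min_frames` consecutive frames below `threshold`.
    A count of zero over a long clip is a possible deepfake sign."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:   # blink still in progress at clip end
        blinks += 1
    return blinks

# Toy per-frame series: eyes open (~0.3) with one two-frame blink (~0.1)
series = [0.31, 0.30, 0.10, 0.09, 0.29, 0.30]
print(count_blinks(series))  # 1
```

Real detectors combine several such signals, since newer deepfake models have learned to fake blinking too.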
Deepfake technology isn’t solely malicious; it also serves practical, helpful purposes.
Voice-cloning deepfakes can restore voices to people who lose them due to illness.
Deepfake videos can also bring life to galleries and museums.
Therefore, completely prohibiting this technology might be implausible.
As PM Lee says, what is important is to stay vigilant and discerning online.