Business World

On Zoom, ‘You’re on mute’ is now ‘Are you real?’

- By Parmy Olson BLOOMBERG OPINION

IS THE BOSS who’s giving you an order real or just realistic? Deepfakes are now taking Zoom calls to another level of awkwardness, by making us question whether our co-workers are genuine. A finance worker in Hong Kong transferred more than $25 million to scammers after they posed as his chief financial officer and other colleagues on a video conference call, marking perhaps the biggest known corporate fraud using deepfake technology to date. The worker had been suspicious about an e-mail requesting a secret transaction, but the scammers looked and sounded so convincing on the call that he sent the money.

Corporate IT managers have spent more than a decade trying, often fruitlessly, to train office workers to spot phishing e-mails and resist the urge to click on dodgy attachments. Often hackers and fraudsters need just one person out of hundreds to inadvertently download the malware needed to tunnel into a corporate network. With AI-powered video tools, they’re moving into territory we have considered safe, underscoring how quickly deepfake technology has developed in just the last year. While it sounds like science fiction, such elaborate frauds are now relatively easy to set up, ushering us into a new age of skepticism.

The fraud in Hong Kong almost certainly used real-time deepfakes, meaning that the fake executive mirrored the scammer as they listened, talked and nodded during the meeting. According to David Maimon, a criminology professor at Georgia State University, online fraudsters have been using real-time deepfakes on video calls since at least last year for smaller-scale fraud including romance scams.

Maimon posted a video to LinkedIn (https://tinyurl.com/2bax7xd7) showing a demo from developers who are selling deepfake video tools to potential fraudsters. In it, you can see the real image of a man on the left and his fake persona on the right, a beautiful young woman scamming the male victim in the middle.

This is uncharted territory for most of us, but here’s what the Hong Kong victim could have done to spot the deepfake, and what we’ll all need to do in the future for sensitive video calls:

1. Use visual cues to verify who you’re talking to.

Deepfakes still can’t do complex movements in real time, so if in doubt, ask your video conference counterpart to write a word or phrase on a piece of paper and show it on camera. You could also ask them to pick up a nearby book or perform a unique gesture, like touching their ear or waving a hand, all of which can be difficult for deepfakes to replicate convincingly in real time.

2. Watch the mouth. Look out for discrepancies in lip syncing or weird facial expressions that go beyond a typical connection glitch.

3. Employ multi-factor authentication. For sensitive meetings, consider adding a secondary check via email, SMS, or an authenticator app to make sure the participants are who they claim to be.

4. Use other secure channels.

For critical meetings that will involve sensitive information or financial transactions, you and the other meeting participants could verify your identities through an encrypted messaging app like Signal, or confirm decisions such as financial transactions through those same channels.

5. Update your software.

Make sure that you’re using the latest version of your video conferencing software in case it incorporates security features to detect deepfakes. (Zoom Video Communications did not reply to questions about whether it plans to make such detection technology available to its users.)

6. Avoid unknown video conferenci­ng platforms.

Especially for sensitive meetings, use well-known platforms like Zoom or Google Meet that have relatively strong security measures in place.

7. Look out for suspicious behavior and activity.

Some strategies stand the test of time. Be wary of urgent requests for money, last-minute meetings that involve big decisions, or changes in tone, language or a person’s style of speaking. Scammers often use pressure tactics, so be wary of any attempt to rush a decision, too.

Some of these tips could go out of date over time, especially the visual cues. As recently as last year, you could spot a deepfake by asking the speaker to turn sideways so you could see them in profile. Now some deepfakes can convincingly move their heads from side to side.

For years fraudsters have hacked into the computers of wealthy individuals, hoovering up their personal information to help them get through security checks with their bank. But at least in banking, managers can create new processes to force their underlings to tighten up security. The corporate world is far messier, with an array of different approaches to security that allow fraudsters to simply cast their nets wide enough to find vulnerabilities.

The more people wise up to the possibility of fakery, the less chance the scammers will have. We’ll just have to pay the price as the discomfort of conference calls becomes ever more agonizing, and the old Zoom clichés about your peers being on mute morph into requests for them to scratch their noses.

