- Scammers impersonated Binance chief communications officer Patrick Hillman to deceive start-ups seeking a listing on Binance.
- The scammers pulled it off by running a real-time deepfake in Zoom meetings with start-up representatives.
- The scam came to light when a start-up representative reached out to Patrick Hillman to thank him for his help.
For decades, we have worried about photoshopped images trying to fool us. Then came the video age, and clever edits made us believe things that never happened. As AI/ML technologies matured, it became almost impossible to tell whether an image is real. Still, making an imaging AI work flawlessly is a hard task: you need to feed it many, many images, and it still demands a lot of specialized computing power.
Zoom meeting with real-time deepfake
A new scamming operation has successfully targeted the cryptocurrency industry. Scammers decided to impersonate Binance's chief communications officer, Patrick Hillman, to deceive crypto start-ups that want to be listed on Binance. Normally, those start-ups apply for a Binance listing by filing an application form and then waiting for the results, a process that can be both time-consuming and costly.
The scammers found ways to contact start-up representatives and held Zoom meetings with them. In those meetings, they used a deepfake avatar of Patrick Hillman to “help them get listed on Binance”. The scam was discovered after one of the start-up representatives contacted Hillman to thank him for his help.
Requires a lot of processing power and time
While this scam may look simple, from a technical standpoint it is a big deal. Machine-learning algorithms can indeed put another face on a person; even mobile phones can do it at a basic level for fun. However, producing a convincing, realistic deepfake is much, much harder than it sounds.
First, you need to gather tons of images of the target person to “feed” the algorithm, which is a very time-consuming process by itself. After that, the AI goes through its “learning” (training) phase, which also takes a long time. Then you need a huge amount of processing power for the following steps:
- Locating the exact position of the face to be masked (in this case, the scammer’s), frame by frame
- Detecting every single muscle movement in the scammer’s face, frame by frame
- Applying those muscle movements to the target person’s (Hillman’s) face image, frame by frame
- Compositing the final image of the target’s (Hillman’s) face onto the exact location of the impersonator’s face, frame by frame
Those four steps are very heavy tasks and require a lot of computing power even for pre-recorded video. Making them work in real time during a Zoom meeting is far harder still.
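To make the real-time constraint concrete, here is a minimal back-of-the-envelope sketch in Python. The stage names mirror the four steps above, but the per-stage latencies are purely hypothetical illustrative numbers, not measurements of any real deepfake system: at 30 fps, the whole pipeline must finish in about 33 ms per frame.

```python
# Hypothetical per-frame latencies (in milliseconds) for the four stages
# described above. These numbers are illustrative only.
STAGE_COST_MS = {
    "locate_face": 8.0,      # find the scammer's face in the frame
    "track_muscles": 12.0,   # extract facial-expression landmarks
    "reenact_target": 15.0,  # render the target's face with those expressions
    "composite": 5.0,        # paste the rendered face back into the frame
}

def max_fps(stage_costs_ms: dict) -> float:
    """Frames per second achievable if the stages run sequentially."""
    frame_ms = sum(stage_costs_ms.values())
    return 1000.0 / frame_ms

total_ms = sum(STAGE_COST_MS.values())
print(f"per-frame cost: {total_ms:.0f} ms")          # 40 ms with these numbers
print(f"max throughput: {max_fps(STAGE_COST_MS):.1f} fps")  # 25.0 fps, short of 30
```

Even with these optimistic made-up figures, the sequential pipeline misses the 30 fps budget, which is why real-time deepfakes typically need powerful GPUs and heavily optimized models.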
Hiding the possible artifacts
In this case, we can assume that the scammers ran a relatively low-quality deepfake, but the low-quality video of a typical Zoom call helped hide it. The scammers might even have intentionally throttled their bandwidth to degrade call quality and mask the deepfake’s artifacts. Either way, we will need to be far more careful, since machine-learning hardware is becoming more accessible by the day.
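A toy illustration of why low-quality streaming favors the scammer: lossy, low-bitrate video effectively averages neighboring pixels, which smooths out the sharp local discontinuities (seams) that deepfakes tend to produce. The sketch below uses a made-up one-dimensional “scanline” and block averaging as a crude stand-in for compression; it is not a model of any real codec.

```python
# Toy 1-D "scanline": smooth skin tone (0.5) with one sharp deepfake seam (1.0).
scanline = [0.5] * 16
scanline[8] = 1.0  # the artifact

def downsample(signal, factor):
    """Average blocks of samples -- a crude model of low-bitrate video."""
    return [
        sum(signal[i:i + factor]) / factor
        for i in range(0, len(signal), factor)
    ]

blurred = downsample(scanline, 4)
print(max(scanline) - min(scanline))  # original contrast of the seam: 0.5
print(max(blurred) - min(blurred))    # after "compression": 0.125
```

The seam's contrast drops by a factor of four after averaging, so on a grainy, low-bandwidth call the blending errors around a deepfaked face become much harder to spot.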
The number of start-ups that have been scammed and the amount of money the scammers have managed to get are currently unknown.