Patrick Hillmann, chief communications officer at the world’s largest crypto exchange, Binance, claims scammers made a deepfake of him to trick contacts into taking meetings.
Writing in a blog post titled “Scammers Created an AI Hologram of Me to Scam Unsuspecting Projects,” Hillmann claims that a “sophisticated hacking team” used video footage of interviews and TV appearances to create the fake. Says Hillmann: “Other than the 15 pounds that I gained during COVID being noticeably absent, this deep fake was refined enough to fool several highly intelligent crypto community members.”
The only direct evidence Hillmann offers for the claim is a screenshot of a conversation with an anonymous individual who claims to have had a Zoom call with Hillmann. Hillmann denies it, and his interlocutor responds: “they impersonated your hologram.”
Although there has been much discussion of the potential of deepfakes to impersonate people in video calls, there have been no definitively confirmed cases to date. Audio deepfakes have been used to impersonate people over the phone, and video deepfakes have been shared on social media to boost scams (a recent example used a deepfake of Elon Musk, a common target for impersonation in crypto scams). But it’s not clear if the technology in its most accessible form is yet sophisticated enough to sustain an impersonation during a live call. Indeed, experts say the simplest way to tell if you’re talking to a deepfake is to ask the individual to turn their head, as the machine learning models used to create deepfakes generally aren’t trained on footage of a face in profile.
Meanwhile, fear of deepfakes is far more widespread than confirmed attacks. In 2021, for example, European politicians claimed they’d been tricked by a deepfake video call of a Russian dissident. However, reporting by The Verge revealed that the incident was the work of Russian hoaxers who used only makeup and deceptive lighting to impersonate their target.
On the other hand, the world of cryptocurrency is certainly rife with scams based on impersonation. These are usually more low-tech, relying on stolen photos and videos to populate fake social media profiles, but given the highly technical communities that follow crypto, it’s not implausible that scammers might try their hand at a more sophisticated plot. It’s also certainly true that the potentially lucrative proceeds of crypto scams make individuals like Hillmann extremely attractive targets for impersonation. A deepfake of a crypto exec could be used to boost confidence in a scam project or to seed information that would move the market in a desired direction.
We’ve reached out to Hillmann to ask for more details about the incident and will update this story if we hear back.