When OpenAI released its AI video-generation app, Sora, in September, it promised that “you are in control of your likeness end-to-end.” The app lets users put themselves and their friends in videos through a feature called “cameos”: it scans a user’s face and runs a liveness check, using that data both to generate videos of the user and to confirm their consent when friends use their likeness on the app.
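As a rough illustration only (OpenAI has not published Sora’s internals, so every name and field below is hypothetical), a cameo-style consent gate would combine the liveness check with an explicit permission lookup before any video is generated:

```python
from dataclasses import dataclass

@dataclass
class CameoProfile:
    """Hypothetical record created when a user scans their face in the app."""
    user_id: str
    face_embedding: list[float]   # derived from the face scan
    liveness_verified: bool       # result of the liveness check
    allowed_requesters: set[str]  # friends permitted to use this likeness

def may_generate_cameo(requester_id: str, profile: CameoProfile) -> bool:
    """Allow generation only if the likeness owner passed a liveness check
    and has explicitly permitted this requester to use their cameo."""
    if not profile.liveness_verified:
        return False
    # Owners can always use their own likeness; friends need permission.
    return requester_id == profile.user_id or requester_id in profile.allowed_requesters

# Example: a friend who was never granted permission is rejected.
profile = CameoProfile(
    user_id="alice",
    face_embedding=[0.12, -0.45, 0.88],
    liveness_verified=True,
    allowed_requesters={"bob"},
)
assert may_generate_cameo("bob", profile) is True
assert may_generate_cameo("mallory", profile) is False
```

The weak point in any design like this is the liveness check itself: if that first gate can be fooled, the permission logic downstream offers little real protection.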
But Reality Defender, a company specializing in identifying deepfakes, says it was able to bypass Sora’s anti-impersonation safeguards within 24 hours. Platforms such as Sora give a “plausible sense of security,” says Reality Defender CEO Ben Colman, even though “anybody can use completely off-the-shelf tools” to pass authentication as someone else.
Real