John Russell • October 10, 2025

AI videos showed influencer Jake Paul coming out as gay. His reaction has been surprising.

Controversial influencer and boxer Jake Paul responded this week to a deluge of deepfake videos posted to social media that show his AI-generated likeness coming out as gay.

According to Futurism, the videos were made using OpenAI’s recently launched text-to-video generator Sora 2 by users of the company’s new invite-only social app. Several clips show Paul’s startlingly realistic likeness — which OpenAI calls a “cameo” — applying make-up and announcing “I’m gay” in an exaggeratedly camp manner. Another shows Paul wearing rainbow eye make-up and boxing gloves, while yet another depicts him in a skirt and a cropped cardigan.

All of the clips bear a watermark, though notably, the watermark only shows Sora’s name and logo and does not explicitly identify them as AI-generated. Many also include tell-tale mistakes like extra fingers or missing tattoos on Paul’s likeness.

[Embedded TikTok from @watch.the.content, captioned: “Jake came out finally!! (Made for fun Jake I know you’re married to a woman just trying to make laughs!) don’t sue. #Sora #fyp #ai #jakepaul #gay”]


Since Sunday, October 5, the real Paul has shared several TikTok posts stitching various Sora clips with his own reactions. The first showed his fiancée, Dutch speed skater Jutta Leerdam, expressing her disapproval of the deepfakes.

“It’s not funny,” Leerdam says in the clip. “People believe…”

[Embedded TikTok from @jakepaul, captioned: “#stitch with @watch the content I post pls these AI videos are making Jutta mad”]

Subsequent posts were more tongue-in-cheek. In one, the real Paul, speaking directly to the camera, says that “this AI’s getting out of hand…” He’s then interrupted by a voice off camera offering him a Celsius energy drink — one of the brands with which Paul has an endorsement deal — which he enthusiastically accepts, abruptly adopting a stereotypically camp affect similar to the one depicted in the Sora-generated videos.

[Embedded TikTok from @jakepaul, captioned: “#stitch with @interstellastudios ai is getting out of hand @CelsiusOfficial #celsiusbrandpartner”]

Two other clips show the real Paul applying make-up. In one, he responds to a fake get-ready-with-me video in which his deepfake likeness says he’s “rolling with” the New York Giants. The real Paul asserts that “everybody knows” he would “go with” the Philadelphia Eagles over the Giants, while conceding that Giants quarterback Jaxson Dart is “really cute” and promoting his own sports betting platform. In the other, he deadpans that “the AI stuff” is affecting his relationships with businesses and that he intends to sue people spreading video “of me doing s**t that I would literally never, ever do” — all while applying make-up.

While he appears to be taking the videos in stride, Paul, who endorsed Donald Trump’s candidacy during the 2024 presidential election, has not addressed what Futurism described as the “problematic undercurrents of homophobia” in the AI-generated clips.

It’s unclear whether Paul himself made it possible for users to create the posts. OpenAI’s Sora 2 safety document claims the company has “guardrails” in place to ensure that users’ audio and image likenesses are used only with their consent via the platform’s “cameos.”

“Only you decide who can use your cameo, and you can revoke access at any time. We also take measures to block depictions of public figures (except those using the cameos feature, of course). Videos that include your cameo—including drafts created by other users—are always visible to you. This lets you easily review and delete (and, if needed, report) any videos featuring your cameo. We also apply extra safety guardrails to any video with a cameo, and you can even set preferences for how your cameo behaves,” Sora 2’s safety doc reads.

Despite the company’s reassurances, PCMag reported last week that OpenAI has admitted that Sora 2’s safeguards failed to prevent the tool from creating sexually explicit deepfakes featuring a real person’s likeness 1.6% of the time.

And in an October 2 X post, journalist Taylor Lorenz claimed that a “stalker” had used Sora 2 to create AI deepfakes of her. Lorenz said the platform allowed her to block the unapproved content using her likeness, but as Futurism notes, it’s unclear whether Lorenz opted in to the Sora social platform in the first place.

