Deepfake accounts of people with Down syndrome are being used to push OnlyFans links—and the internet is not OK with it.
A disturbing new AI trend is turning disability into a fetish—literally. Fake influencers, made to look like people with Down syndrome, are being created with artificial intelligence and used to promote OnlyFans content. These aren’t real people. They’re digital fabrications, carefully designed to look disabled and to appear relatable, innocent, and emotionally engaging.
But behind the soft filters, smiles, and captions about self-love lies a shocking marketing trick. These AI “models” are being passed off as real individuals with disabilities—yet their purpose is to draw clicks to adult content. According to CyberNews, the accounts are being run anonymously and are using “AI-generated faces of people with Down syndrome to lure in viewers and direct them to OnlyFans links.”
Fake Faces, Real Profits
These AI influencers have been spotted across platforms like Instagram and TikTok. They usually feature images of what appear to be young adults with Down syndrome, posing in swimwear, bedroom outfits, or selfies with soft, aesthetic lighting. The red flags? Look closely and you'll see blurry fingers, unnatural jewelry, and subtle facial oddities: classic signs of AI-generated content.
Disability advocates are calling this trend harmful, not helpful. Instead of promoting actual inclusion or visibility, the OnlyDown scam uses disability as bait. These fake influencers borrow stereotypical visual traits of Down syndrome without any connection to the actual community. They're playing a role for clicks, not telling a real story. They're not people; they're made by machines. Worse, their captions often include lines about confidence and empowerment.
“The creators behind these fake personas are exploiting emotional vulnerability and curiosity to funnel traffic to adult platforms.”
Who’s Behind This?
As MSN reported, some of the adult links don't even lead to pages related to the AI personas. The personas are simply bait, an attention trap to drive up traffic and profit, which means more realistic fake personas are only ever a few clicks away. So far, there's no evidence that any real person with Down syndrome is involved or has consented to the use of their likeness or condition. Some accounts even claim to be “run by caregivers,” adding another layer of fake emotional depth to the scam.
One of the most talked-about examples is @mariadopari, an account that presented its subject as a woman with Down syndrome. The Instagram page, now unavailable, had over 148,000 followers and offered links to OnlyFans and Fanvue accounts. While the Instagram page has vanished, the persona's X account is still online, continuing to drive traffic through a fantasy that many argue is both fake and harmful.
Public Outrage and Industry Silence
Online, the backlash came fast. Social media users quickly called out the OnlyDown accounts as “gross,” “disrespectful,” and “beyond messed up.” Many demanded that platforms like Instagram and TikTok do more to monitor and remove deceptive AI content—especially when it targets vulnerable communities.
Disability activists have also spoken out, saying this is not inclusion—it’s exploitation. Using fake disabilities to promote adult content not only misrepresents the community but can make it harder for real creators with disabilities to be taken seriously.
Meanwhile, OnlyFans has yet to issue a public statement, even as its name is now tied to the OnlyDown scandal. Many in the adult content space argue that this kind of clickbait harms legitimate creators and further fuels public mistrust of the industry. As LatinTimes summed up in their coverage:
“The incident highlights serious concerns about AI abuse, especially when vulnerable identities are used without permission for profit.”
When AI Crosses the Line
Still, not all AI in adult content is problematic. In fact, some creators are using it to enhance creativity, improve their workflow, and build more personalized fan experiences. Machine learning has opened the door to remarkable innovations and AI tools, but it has also opened Pandora's box.
The OnlyDown scandal proves how quickly this technology can be twisted into something dark. Disability isn't a costume, and it's not a theme for engagement farming. It's real. Faking it for fetish clicks is a betrayal of everything the disability rights movement has fought for.

The danger isn't just what's happening now; it's what could come next. Today, it's Down syndrome. Tomorrow, it could be any condition, community, or identity faked and farmed for profit. With deepfakes becoming harder to detect and easier to produce, it's up to platforms, and the public, to demand ethics before algorithms.