Elon Musk plans to fast-track a product to monetize adult content on Twitter, which he more or less confirmed Saturday afternoon. To be clear, Musk’s apparent plans—the latest get-rich-quick scheme from the richest man on the planet—have nothing to do with backing sex workers. Choosing to expand adult content at a moment of heightened scrutiny surrounding sex work and queer people is risky, particularly amid reports that Musk plans to remove protections for trans people, a population that disproportionately overlaps with sex workers, on the platform.
The move toward monetization also threatens to ruin a refuge: Since the US Federal Bureau of Investigation confiscated Backpage in April 2018, three days before President Trump signed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) into law, Twitter has become the only major social media platform to tolerate sex workers. Even in the absence of direct payouts, Twitter has long been a safe haven for sex workers (adult content creators as well as in-person providers) in an increasingly puritanical digital landscape. But in order for the monetization to work, Twitter must overhaul its content moderation practices and intensify them, in direct contrast to Musk’s oath to protect “free speech.”
More likely is that monetization will be done in haste and without addressing issues raised by one of the company’s internal teams sensitive to the complexities of adult content and moderation. This is in part because, as of Friday, teams like the Machine Learning, Ethics, Transparency, and Accountability (META) team no longer exist. This move threatens to further jeopardize sex workers and especially trans sex workers, who are already at an astronomically increased risk of violence that continues to rise with each new piece of anti-trans legislation.
A badly executed attempt to monetize adult content could compound the fears, amid a growing moral panic that sex workers and queer adults are sexually deviant and thereby threatening to minors, lending credence to the belief that we have no place on a mainstream social media platform that welcomes users as young as 13. And such a move by Musk threatens to change the experience of everyone on the site.
ONE OF MANY ways homophobia, transphobia, and whore-phobia—the systemic oppression of sex workers—overlap is that we are all perceived as threats to children. Online sex work is not immune from this bias. Earlier this year, Casey Newton and Zoë Schiffer reported in The Verge that Twitter had been developing an “OnlyFans-style” subscription project, but efforts were stymied by fears over child sexual exploitation material (CSEM, sometimes referred to as child sexual abuse material, or CSAM). A contingent of in-house researchers, dubbed the Red Team, determined that Twitter couldn’t safely roll out the project, titled Adult Content Monetization (ACM), before getting a handle on CSEM, which they believed ACM would exacerbate. The project was tabled in May, only a few weeks after Musk’s offer to buy Twitter for $44 billion.
The Red Team’s findings hinge on two misconceptions about sex work: First, that it, together with human trafficking, exists on a continuum encompassing “the sex trade,” and second, that adult-content platforms do not enforce aggressive moderation policies.
It’s true that CSEM is near-ubiquitous on the internet; its existence has long been used as a cudgel by religious right-wing groups like Exodus Cry and the National Center on Sexual Exploitation (NCOSE, formerly Morality in Media) against the presence of sex workers on the internet. But all online platforms that identify CSEM are required by law to report their findings to the National Center for Missing and Exploited Children (NCMEC). In their report, for example, Newton and Schiffer note that “in 2019, Mark Zuckerberg boasted that the amount Facebook spends on security features exceeds Twitter’s entire annual revenue.”
Yet the claim that adult content is inextricably linked with child abuse is demonstrably untrue. The NCOSE names OnlyFans specifically as enabling “child sexual abuse material, sex trafficking, rape videos, and a host of other crimes”—but the numbers tell a different story. In its latest annual report, the NCMEC found that PornHub, for example, reported 9,029 instances of CSEM in 2021, a figure dwarfed by Twitter’s 86,666. Facebook, despite