Co-authored with the excellent Joshua Neves, Aleena Chia and Ravi Sundaram, Technopharmacology is just out with University of Minnesota Press in the In Search of Media series. The book explores the close relations of media technologies to pharmaceuticals and pharmacology, calls for expanding media theoretical inquiry by attending to the biological, neurological, and pharmacological dimensions of media, and centers on emergent affinities between big data and big pharma. My section focuses on diagnoses of online porn addiction and makes an argument for attending to the excitements that make the self. The book will also be out in open access through Meson Press. This was fun to make.
Happy news! Our Intimacy in Data-Driven Culture consortium got funding for 2022–2025 from the Strategic Research Council at the Academy of Finland. During the second funding period we’ll continue to probe vulnerabilities connected to datafication among different groups of people with a cross-disciplinary research team at the University of Turku, Tampere University, Aalto University and Åbo Akademi University. As PI, I am feeling very, very fortunate.
Please join us on Zoom, or in person, should you be in Sydney:
Tue, 26 Apr 2022 • 06:00PM – 07:00PM AEST (10:00 – 11:00 AM CEST)
Online / Social Sciences Building Seminar Room 210, University of Sydney
Online registration link: https://uni-sydney.zoom.us/webinar/register/WN_Im57SHm9RqCCjG7fd5Fh9g
Despite the significance of sexuality in people’s lives, sex is a topic of constant contestation. This panel asks why sex, and particularly mediated depictions of sex, is so often termed objectionable. Why are female nipples zoned out from social media? Why is porn framed as a social problem? Join us as our experts discuss what is really at stake in platform regulation of explicit content.
Chair: Professor Kane Race (University of Sydney)
Participants: Professor Kath Albury (Swinburne University of Technology), Professor Alan McKee (UTS), Professor Susanna Paasonen (University of Turku/Hunt-Simes Visiting Chair @SSSHARC, University of Sydney)
In-person seating is limited, so please send RSVPs to email@example.com to ensure your spot.
Until the end of April, I am Visiting Professor at the University of Sydney’s Faculty of Arts and Social Sciences as Hunt-Simes Visiting Chair in Sexuality Studies. My visit includes PhD workshops, a public lecture (tba), as well as collaboration with the local research community and my most excellent host, Professor Kane Race, broadly on the topic of sexuality and social media content regulation. Very exciting.
Issue 2/2021 of WestEnd is out, with the special theme of “Pornografie. (Un-)Sittlichkeit und Geschlecht” edited by Juliane Rebentisch and Kerstin Stakemeier and including my piece, “Pornokreuzzüge und emotionale Plattformpolitik”. I remain very enthusiastic about my year publishing in a language I don’t speak in any meaningful way. Here are the abstract and the full text in English.
Porn crusades and affective platform politics
Online pornography forms a ubiquitous part of online culture even as its ready and abundant availability continues to fuel social concerns and campaigns aimed at curbing it. Focusing on the recent campaign of US-based journalist, Nicholas Kristof, against a leading video aggregator site, Pornhub, this article examines the logic and politics involved in “the deplatforming of sex”—that is, the expansive removal of nudity and sexual content from online platforms. It argues that Kristof’s campaign, in targeting online payment system providers in particular, represents a shift in anti-pornography activism towards infrastructural interventions aiming to delimit porn sites’ techno-material conditions of operation. As such, it speaks of broader platform politics where regulatory practices specific to the US impact the sexual expression of users on a global scale.
In December 2020, Pornhub, the globally leading porn video aggregator site, suspended access to nine million videos, amounting to the majority of its content. The action was a response to the public attention generated by a New York Times exposé opinion article by the Pulitzer-winning journalist Nicholas Kristof, titled “Children of Pornhub”. Setting out to reveal the dark side of the platform as one “infested with rape videos”, the article dramatically claimed that the site monetizes “child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags” (Kristof 2020). In addition to dwelling on the stories of abused, tortured and trafficked women, Kristof reiterated some well-known problems in the operating principles of Pornhub, a platform long critiqued for building its business model on piracy, having lax moderation practices and responding slowly to complaints about illegal content and requests to remove it (e.g., Auerbach 2014; Grant 2020). When voiced by sex workers, these critiques have had little effect. The case was different with the NYT article.
Starting from Kristof’s lobbying against Pornhub and other platforms trading in commercial sex, this article explores the logic, goals and ramifications of such campaigns within the broader context of “the deplatforming of sex” (Molldrem 2018), namely the increasingly vigilant removal of nudity and sexual content from online platforms. This involves what David M. Halperin (2017, 3) calls “the war on sex”: a cumulative effect of many independent initiatives targeting sex, and especially forms of sex arousing “disapproval on moral, aesthetic, political, or religious grounds” in the United States (see also Halperin 2017, 6; Race 2018, 172–173).
Out with it
Porn video aggregator sites broadly emulate the operating principles of YouTube which has, since 2005, largely defined the principles of online video sharing. With the exception of XVideos and xHamster, the most popular of these (e.g., Pornhub, Redtube, YouPorn) are owned by the same company, MindGeek. Writing on YouTube, Tarleton Gillespie (2010, 352) maps out the notion of platform in four different senses of the term: “computational, something to build upon and innovate from; political, a place from which to speak and be heard; figurative, in that the opportunity is an abstract promise as much as a practical one; and architectural, in that YouTube is designed as an open-armed, egalitarian facilitation of expression”.
In a political and figurative sense, to have a platform means being heard and seen, having the possibility of gaining an audience and potentially impacting culture and society. Conversely, to deplatform means to silence by removing someone’s or something’s access to a channel through which they can be heard and gain an audience. In networked media, deplatforming occurs on diverse levels: by removing user accounts or entire groups (Rogers 2020), by banning content categories (Pilipets and Paasonen 2020) and enforcing such bans through moderation, or by impacting the technical or economic infrastructures necessary for the platform’s operability. The deplatforming of sex in social media operates through in-platform laws such as community standards. In a dramatic example, Tumblr decided to ban nudity and sexual depiction in 2018, these having previously formed a large part of its content (Cho 2018). The platform was popular among sexual and gender nonconforming communities who lost access to networks, resources and archives built over a decade, further adding to their marginalisation (Byron 2019; Molldrem 2019). The third form of deplatforming targeting the infrastructural conditions of an application, site or service – deplatforming in a computational sense – became a topic of debate following Amazon’s decision to remove the Parler app favoured by Trump supporters from its web hosting service in January 2021.
The solutions that Kristof suggested for fixing Pornhub’s problems were partly the same as those that sex worker activists had long been calling for: that, in order to curb piracy and illegal content, only verified users should be able to post videos, downloads should be prohibited, and content moderation and reporting practices improved. To use Gillespie’s terms, these suggestions cut through Pornhub as a political, figurative and architectural platform. Pornhub claims to have complied with all these modifications (see Pornhub 2020). Kristof, however, further suggested cutting the platform’s ties to payment infrastructures: “I don’t see why search engines, banks or credit card companies should bolster a company that monetizes sexual assaults on children or unconscious women”. Visa and Mastercard, alarmed by the negative publicity and looming PR damage, moved quickly to sever their ties with the platform, so that it cannot currently accept credit cards. On 16 April 2021, Kristof continued his project with another NYT opinion article, “Why Do We Let Corporations Profit from Rape Videos?”, targeting XVideos and calling for both credit card companies and search engines to cut it off.
As Sarah T. Roberts (2019) details in her ethnographic study of commercial content moderation, social media platforms would be rife with materials of torture, both animal and human, were not armies of low-paid employees tasked with weeding it out. For while much of visual content moderation happens through automated, algorithmic means, distinctions pertaining to authenticity and context are hard for machines to make (the format of video posing its specific sets of challenges). The work within the “cesspool” of social media is largely concerned with the brand management of these platforms, seldom comes with sufficient mental health support and entails notable emotional and psychological stress (Roberts 2019, 116–123, 151–154). Kristof (2020), however, framed the problem of traumatizing content as specific to Pornhub, so that its moderators became both victims of MindGeek and villains facilitating the sexual abuse of children.
The shortage of content moderation resources on porn aggregator sites is an acute concern, yet similar work at Google and Facebook has been discussed as no less “soul-crushing” in making employees “soak up the worst of humanity” (Chen 2014). ISIS beheading videos, documentations of sexual and other violence and footage shot by white extremists on shooting sprees have all been available on mainstream social media before being flagged or removed by commercial content moderators (Gillespie 2018, 9; Parks 2019). Facebook (2021) reports taking action on five million incidents of child nudity and exploitation in the first three months of 2021, catching 98.9% of the content before it was reported by users. As porn video aggregator sites’ principles of operation are to a large extent similar to those of social media platforms, their problems in content moderation are also similar, even as their content policies drastically differ.
The spaces for sexual display and communication have been growing increasingly narrow on social media since the passing of the 2018 “Allow States and Victims to Fight Online Sex Trafficking Act” (FOSTA) and “Stop Enabling Sex Traffickers Act” (SESTA) bills in the US Senate with overwhelming bipartisan support. As exceptions to Section 230 of the United States Communications Decency Act, which protected online services from liability for the content posted by users, FOSTA-SESTA has redefined online platforms as publishers responsible for content aiding sexual solicitation. This has resulted in broad removal of content connected to commercial sex that has nothing to do with trafficking, mainly since no distinction is drawn between consensual and non-consensual sex work (Reynolds 2020): consequently, online advertisements for sexual services have disappeared, as have social media groups and threads where sex workers shared tips on filtering clients, sexual health resources and managing their careers independently (Blunt and Wolf 2020; Paasonen et al. 2019, 133; Tripp 2020). Since US-based social media platforms are globally used, the legislation has broad resonances, also in countries where sex work is legal.
These transformations have impacted content moderation well beyond the realm of commercial sex. As pre-emptive measures, social media companies have tightened content policies since the liabilities of weeding out too little by far overshadow the commercial benefits involved in hosting sexual content – this having always been difficult to monetize, as advertisers are unwilling to place their ads next to depictions of nudity and sex (Pilipets and Paasonen 2020). Facebook and Instagram have opted for horizontal content bans pertaining to nudity, sexual display and solicitation, deplatforming sex down to the visibility of female nipples and nude buttocks, users inquiring after each other’s interest in having sex, and the uses of eggplant and peach emojis in a sexual context (for a longer discussion on the deplatforming of sex in social media, see Paasonen 2021).
Meanwhile, FOSTA-SESTA is argued to have had little impact on trafficking while curbing sex workers’ access to information resources and failing to protect them (Tripp 2020). In her critique, Lura Chamberlain (2019, 2206) defines the law as “deeply flawed”, arguing that its threats to criminalize significant categories of protected speech have already led to a documented chilling effect on speech due to its gross misunderstanding of the interaction between sex work and sex trafficking. Long in the planning, FOSTA-SESTA built on a 2017 ban on commercial sex advertising targeting Backpage.com (Goldman 2019). Summing up the impact of FOSTA-SESTA and the closing down of Backpage’s sex advertising, the Hacking//Hustling sex worker community report points out that there is no evidence of it having “done anything to prevent sexual labor exploitation. Our research shows that this law has actually put people in more precarious financial situations that actually make individuals more vulnerable to trafficking, as well as decreasing access to previously established channels of communication used to protect sex workers against violence.” (Blunt and Wolf 2020, 35.)
Targeting Backpage in a 2017 NYT opinion piece, Kristof called it “the odious website where most American victims of human trafficking are sold” and argued that SESTA “was crafted exceedingly narrowly to target only those intentionally engaged in trafficking children” and hence, contrary to criticism, had nothing to do with narrowing the freedom of expression online, or with limiting the rights of sex workers. In the light of empirical evidence, this is patently untrue. As in his previous work and later campaigns against Pornhub and XVideos, Kristof liberally conflated all commercial sex work with forced and involuntary labour, used the sexual abuse of minors as his affective rhetorical focus and accused tech companies such as Google of being allies of sex traffickers in order to undermine their critiques of FOSTA-SESTA (e.g., Barnes 2019; see also Kristof 2009). As a rhetorical strategy, the conflation of sex work and trafficking has been highly influential for two decades, cutting through and bringing together the Christian right, abolitionist feminists and governmental actors (Weitzer 2007, 449). This strategy remains knowingly blind to the presence and agency of sex workers as anything other than victims of abuse, delimiting their possibilities to impact policy, as well as obscuring their different agendas, positions and experiences, both locally and internationally (Bernstein 2019).
Given the impact of US internet governance on users across the globe, initiatives such as FOSTA-SESTA go well beyond regional concerns. This also means that campaigns such as Kristof’s, basically consisting of opinion articles published in one US newspaper and a flow of tweets aiming to impact policy, matter internationally since these policies alter the terms and conditions of online platforms used by billions of people around the world. What may seem – or in fact, be – a moral panic in the US can impact the livelihood of people doing online sex work in Germany, just as it can impact the ways in which social media users can, or cannot, exchange sexual content ranging from sex education resources to historical photographs or titillating selfies, or sexually relate to one another on these platforms.
The association of porn with violence against women has, of course, been key to feminist initiatives that have, since the 1970s, framed pornography as both a symbol and documentation of male violence justifying the sexual objectification, dehumanization and subjugation of women (e.g., Griffin 1981; Dworkin 1989; Kappeler 1986; Long 2012). This line of argumentation has drawn causal connections between porn and sexual violence, as in Robin Morgan’s famous 1974 slogan, “porn is the theory, rape is the practice”. Largely originating from the US, anti-pornography feminism continues to have international influence.
Premised on porn production and consumption being harmful to women both individually and collectively, anti-porn feminism has focused on critiques of patriarchal power relations in the framework of binary gender, so that forms of pornography not including women, or made by women for other women, by people not conforming to a gender binary, or simply not fitting the patterns of critique, are either absent or interpreted as offering further proof of patriarchal sexual politics. In its focus on women’s abuse by men, this line of argumentation operates with a deeply heteronormative logic which, while seldom acknowledged, becomes generalised as a framework for sexual fantasies and the work of porn (Thompson 2015; Paasonen et al. 2020, 40–41). Like Kristof’s campaigns, anti-pornography feminism paints a binary universe, both moral and gendered, where porn and sex work lack female agency and help to bolster male hegemony. There is no room for considerations of porn as a site of sexual experimentation or expression, or for sexual desires and fantasies of the unruly, queer and kinky kind. This speaks of the persistent presence of sexual hierarchies of the kind that Gayle Rubin (1989, 281) identified at the early stages of the feminist sex wars as separating “good sex” (heterosexual, married, monogamous, procreative, non-commercial, private, vanilla) from the bad (homosexual, unmarried, promiscuous, non-procreative, commercial, public and kinky).
Since anti-porn feminism’s critique is categorical, it approaches the genre as a singular entity with aligning intentions, aesthetics, politics and economies, firmly placing it in the realm of “bad sex”. As the genre becomes thus homogenised, its inner diversity and fragmentation evaporates from view so that it is impossible to grasp the work or products of contemporary porn – and, consequently, to understand much of what is being discussed (see Paasonen 2011). These critiques also tend to be disinterested in the views of women working in porn, unless they are speaking against the industry, hence excluding their concerns connected to sexual health, income or control over work conditions. The notion of the porn industry, largely coined in the 1980s and 1990s, fails to describe contemporary forms of production involving studios of various sizes, amateurs and semi-amateurs, independent producers and animators aiming to make their products seen on online platforms even as the dominance of video aggregator sites, combined with the invisibility of porn work on social media, means that such visibility is by no means easy to achieve. In her critique of Kristof’s Pornhub campaign, journalist Melissa Gira Grant (2020) points out how,
“For years, porn performers have tried to draw attention to the exploitation at the heart of the tubesite business model—YouTube clones, which now dominate an online porn ecosystem that, not long ago and like much of online media, once offered independent creators more control over their work. Those days are all but over in porn, and the large companies behind websites like Pornhub have drained money out of independent porn, not just by pirating their work but by nearly monopolizing the business. Pornhub’s parent company owns porn-production companies, too, ones that some performers who might otherwise speak out also need to rely on for work. In turn, that has resulted in less work, lower wages, and less control for performers. In monopolies, particularly in industries that operate with little independent oversight and a nonunion labor force, abuse proliferates.”
There is much to critique in Pornhub’s and MindGeek’s impact on porn work and production culture, which has contributed to something of a collapse in the studio system that allowed for longer contracts, and to the emergence of a gig economy of financial precarity, while also narrowing the financial viability of independent producers and distributors (Berg 2021; Paasonen et al. 2019, 44, 59–60). As Grant argues, campaigns for credit card companies and PayPal to halt payments to Pornhub nevertheless do little to amend the situation. Rather, they hurt sex workers and other content producers who depend on the platform for their income. Attacks on Pornhub as a sex trafficking hub also miss the point in that not only do users post child abuse material on mainstream social media platforms but the majority of reported child sexual abuse material is shared either on the dark web or through encrypted messaging apps like WhatsApp (e.g., Burgess 2021; Kleinman 2021). There are, however, no campaigns to date targeting the Facebook-owned WhatsApp, used by two billion people, as a child sexual trafficking platform.
A liberal journalist, Kristof is careful to distinguish his critiques of rape videos from arguments that porn is an engine of rape culture, targeting platforms for sharing illegal content instead. In other words, by framing his project as being not about pornography but about rape, he rhetorically detaches it from those aiming to ban pornography in more categorical terms. At the same time, his Pornhub article promotes the efforts of Traffickinghub, a campaign run by Exodus Cry, a religious right organization aiming at “the abolition of the sex trade, including prostitution and porn, by means of the criminal law” (Grant 2020). Kristof then appears to have intimate kinship with networked anti-pornography initiatives bringing together conservative groups resisting sex education, LGBTQ+ rights, and reproductive and abortion rights (Grant 2020). Framing these organizations as anti-trafficking (Weitzer 2007) has helped to neutralize them so that they can receive funding for their diverse actions: arguing to protect the rights of women with anti-trafficking campaigns, they in fact campaign against women’s sexual rights, operating internationally. Meanwhile, Kristof’s journalistic status gives him an aura of objectivity of the kind inaccessible to activists labelled either radical feminist or conservative Christian. With some two million Twitter followers, his political platform is notable: some of this platform status is evident in the Pornhub article taking up the entire front page of the NYT Sunday Review section.
During the Reagan presidency, high-profile radical feminists Andrea Dworkin and Catharine MacKinnon aligned their initiatives with those of Christian conservative coalitions, even as their gendered and sexual politics were fully incompatible (see Vance 1997). A similar alignment is taking place between anti-pornography feminist initiatives and conservative lobbying groups – as well as in Kristof’s alignment with Exodus Cry. Attending to these connections, commentators have been quick to identify Kristof’s attack on Pornhub as a moral crusade (Grant 2020).
His argumentation makes use of visceral examples – such as Backpage advertising “a 13-year-old whose pimp had tattooed his name on her eyelids” (Kristof 2017) – and excerpts from survivors of abuse. These operate as textual equivalents of anti-pornography feminist slide shows aiming at negative affective responses for a political effect (Gentile 2010, 85–92). Feminist anti-pornography activism has, both historically and in the present, made use of negative affect in arguing for the nefarious impact of porn, associating it with feelings of hurt, sadness, anger, frustration, sorrow, fear and nausea (e.g., Griffin 1981; Dworkin 1989; 2000). This was particularly true of Dworkin, whose work is undergoing something of a revival with the publication of Last Days at Hot Slit (2019), a collection of her writings. Within the cultural context of #MeToo and the fight over reproductive rights in the US, many find her emotional prose, fury at the way things are, and the firmness of her political stance resonant (Paasonen et al. 2020, 46). At the same time, her clarity of argumentation comes with ample simplification and a sexual normativity within the framework of binary gender that fits ill with considerations of sexual and gender diversity.
Accounts of negative affect connected to porn, in the variations they have taken from the 1970s to the current day, from feminist texts to Christian fundamentalist ones and to Kristof’s reporting, anchor political argumentation in gut reactions in order to bestow on them a visceral sense of authenticity and acuteness. They operate affectively by putting “the body behind our words” so that words can become “something more than mere words” (Miller 1997, 181). Activism building on the power of feeling (anger, sorrow, disgust) together can be powerful in bridging the personally felt with the collective and the societal (Protevi 2009). At the same time, these forms of affective address work to efface diversity within the aesthetics, sexual routines, bodies, genders, sexualities, economies, politics and ethics connected to porn so as to frame it as a singular entity and object assumedly evoking uniform responses. In other words, not only do the cultural objects and practices of pornography become homogenized but so do the presumed ways of experiencing them. All this sets clear limits to how porn can be approached, conceptualized, analysed and known – which, of course, is what these campaigns aim at.
What’s in a word?
The boundaries of porn as a genre have never been set, and they have grown ever more ephemeral in the course of digital and networked production, distribution and consumption involving a plethora of actors, governance practices, financial and political interests. Porn is an umbrella term for practices, aesthetics and economies that may share little similarity with one another across space and time. In order for analyses and critiques of porn to be efficient, these need to be specific, founded in empirical evidence and attuned to the distinctions among the actors and materials addressed.
As we have been communicating through networked means, often unable to connect flesh-to-flesh during the COVID-19 pandemic, the importance of mediated forms of sexual relating for wellbeing has grown strikingly evident. It is crucial not to understand such relating in narrow terms as an extension of extant intimate relationships: it is also a realm of sexual play, experimentation and pleasure involving the (mediated) bodies of virtual strangers through webcams, OnlyFans accounts, porn clips, hookup app profiles, and beyond. As sites of play for some, these are sites of work for others, in ways blurring any clear divisions between the two notions (Paasonen 2018, 31). In any case, they are detached from reproductive goals and attuned toward discoveries in what one can sexually enjoy, like and prefer and, consequently, what or who one’s sexual self may be. Such “unpredicted forms of experience” (Warner 2000, 185) can alter one’s understanding of sexuality and desire as “a new sensation, an unusual mood, a previously inconceivable way of relating” comes about (Race 2009, 186). Unexpected incidents happen in encounters with other people, just as they do with mediated images and sounds – porn included. To consider porn in this vein as affording potentially startling encounters opens up alternative ways of thinking about its affective power and potential.
Within the current cultural conjuncture, it is however also necessary to reconsider what it means to label cultural objects as pornographic to start with: this necessity is pertinent in terms of securing spaces for sexual expression and relating through networked means. People creating sexual media do not necessarily see it as porn even as it can hold great personal importance as a means of exploration and reflection. Sexual depiction and visibility are key to the making and maintenance of gender and sexual nonconforming communities, just as it can be key to self-discovery and social relating (Molldrem 2018). At the same time, social media platforms classify all displays of nudity as offensive and categorically remove them, so as to protect their own brands and the commercial interests of advertisers (Tiidenberg and van der Nagel 2020, 46–47). As sexual content is being zoned to specific sites and as both feminist and Christian right organizations are pushing for closing down these sites’ access to payment systems, it remains crucial to ask whose interests are being served, how, and in whose name.
Carefully contextual analyses of sexual media production, distribution and use are necessary for shifting the foci of public debate so that sexual rights on online platforms are not merely understood in the negative sense as freedoms from (being harassed and abused) but equally as positive freedoms to (express and enjoy sexuality), without the one outweighing or cancelling out the other (Spišák et al. 2021). Such a step also necessitates acknowledging, and working through, the complexities in how people of diverse gender identifications and sexual orientations make use of sexual media, and how online platforms – political, figurative, computational and architectural – and their governance shape the forms that sexual sociability can take. Simplified moralistic and ideological takes on what sexual exchanges and bodily displays mean or who produces them do much more harm than good when it comes to the sexual rights of self-expression, pleasure and knowledge.
Auerbach, David. 2014. Vampire Porn: MindGeek is a Cautionary Tale of Consolidating Production and Distribution in a Single, Monopolistic Owner. Slate, 23 October, http://www.slate.com/articles/technology/technology/2014/10/mindgeek_porn_monopoly_its_dominance_is_a_cautionary_tale_for_other_industries.html.
Barnes, Leslie. 2019. False Representations of Sex Workers. AUReporter 49 (3): https://reporter.anu.edu.au/false-representation-sex-workers.
Berg, Heather. 2021. Porn Work: Sex, Labor, and Late Capitalism. Chapel Hill: The University of North Carolina Press.
Bernstein, Elizabeth. 2019. Brokered subjects: Sex, Trafficking, and the Politics of Freedom. Chicago: University of Chicago Press.
Blunt, Danielle and Ariel Wolf. 2020. Erased: The Impact of FOSTA-SESTA & The Removal of Backpage. https://hackinghustling.org/erased-the-impact-of-fosta-sesta-2020/
Burgess, Matt. 2021. Police Caught One of the Web’s Most Dangerous Paedophiles. Then Everything Went Dark. Wired, 12 May, https://www.wired.co.uk/article/whatsapp-encryption-child-abuse.
Byron, Paul. 2019. “How could you write your name below that?” The queer life and death of Tumblr. Porn Studies 6 (3): 336–349.
Chamberlain, Lura. 2019. FOSTA: A Hostile Law with a Human Cost. Fordham Law Review 87 (5): 2171–2211.
Chen, Adrian. 2014. The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed. Wired, October 23, https://www.wired.com/2014/10/content-moderation/.
Cho, Alexander. 2018. Default publicness: Queer youth of color, social media, and being outed by the machine. New Media & Society 20 (9): 3183–3200.
Dickson, EJ. 2020. OnlyFans Creators and Sex Workers are Getting ‘Purged’ from TikTok. Rolling Stone, December 17. https://www.rollingstone.com/culture/culture-features/onlyfans-sex-workers-tiktok-purge-banned-1101928/.
Dworkin, Andrea. 1989. Pornography: Men Possessing Women. 2nd edition. New York: E. P. Dutton.
Dworkin, Andrea. 2000. Pornography and Grief (1987). In Drucilla Cornell (ed.), Feminism & Pornography. Oxford: Oxford University Press, 39–44.
Dworkin, Andrea. 2019. Last Days at Hot Slit: The Radical Feminism of Andrea Dworkin. Cambridge, MA: Semiotext(e).
Fabbri, Thomas. 2019. Why is Instagram Deleting the Accounts of Hundreds of Porn Stars? BBC Trending, 24 November. https://www.bbc.co.uk/news/blogs-trending-50222380.
Gentile, Kathy Justice. 2010. Sexing the Look in Popular Visual Culture. Newcastle upon Tyne: Cambridge Scholars Publishing.
Gillespie, Tarleton. 2010. The Politics of “Platforms.” New Media & Society 12 (3): 347–364.
Gillespie, Tarleton. 2018. Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. New Haven: Yale University Press.
Goldman, Eric. 2019. The Complicated Story of FOSTA and Section 230. First Amendment Law Review 17: 279–293.
Grant, Melissa Gira. 2020. Nick Kristof and the Holy War on Pornhub. The New Republic, December 10. https://newrepublic.com/article/160488/nick-kristof-holy-war-pornhub.
Griffin, Susan. 1981. Pornography and Silence: Culture’s Revenge Against Nature. New York: Harper & Row.
Halperin, David M. 2017. Introduction: The War on Sex. In The War on Sex, ed. David M. Halperin and Trevor Hoppe. Durham, NC: Duke University Press, 1–61.
Kappeler, Susanne. 1986. The Pornography of Representation. Minneapolis: University of Minnesota Press.
Kleinman, Zoe. 2021. Child Sexual Abuse: Four Held in German-Led Raid on Huge Network. BBC News, 3 May, https://www.bbc.com/news/world-europe-56969414.
Kristof, Nicholas. 2009. If This Isn’t Slavery, What Is? The New York Times, 3 January, https://www.nytimes.com/2009/01/04/opinion/04kristof.html.
Kristof, Nicholas. 2017. Google and Sex Traffickers Like Backpage.com. The New York Times, 7 September, https://www.nytimes.com/2017/09/07/opinion/google-backpagecom-sex-traffickers.html.
Kristof, Nicholas. 2020. Children of Pornhub. The New York Times, 4 December.
Kristof, Nicholas. 2021. Why Do We Let Corporations Profit from Rape Videos? The New York Times, 16 April.
Long, Julia. 2012. Anti-Porn: The Resurgence of Anti-Pornography Feminism. London: Zed Books.
Miller, William Ian. 1997. The Anatomy of Disgust. Cambridge, MA: Harvard University Press.
Molldrem, Stephen. 2018. Tumblr’s Decision to Deplatform Sex Will Harm Sexually Marginalized People, Wussy, December 6. https://www.wussymag.com/all/2018/12/6/tumblrs-decision-to-deplatform-sex-will-harm-sexually-marginalized-people.
Paasonen, Susanna. 2011. Carnal Resonance: Affect and Online Pornography. Cambridge, MA: MIT Press.
Paasonen, Susanna. 2018. Many Splendored Things: Thinking Sex and Play. London: Goldsmiths Press.
Paasonen, Susanna. 2021. Intime Abhängigkeiten, fragile Verbindungen, entsexualisierte Plattformen. Sexuologie 28 (1–2).
Paasonen, Susanna, Kylie Jarrett and Ben Light. 2019. NSFW: Sex, Humor, and Risk in Social Media. Cambridge, MA: MIT Press.
Paasonen, Susanna, Feona Attwood, Alan McKee, John Mercer and Clarissa Smith. 2020. Objectification: On the Difference Between Sex and Sexism. London: Routledge.
Parks, Lisa. 2019. Dirty Data: Content Moderation, Regulatory Outsourcing, and The Cleaners. Film Quarterly 73 (1): 11–18.
Pilipets, Elena and Susanna Paasonen. 2020. Nipples, Memes, and Algorithmic Failure: NSFW Critique of Tumblr Censorship. New Media & Society. https://journals.sagepub.com/doi/pdf/10.1177/1461444820979280.
Pornhub. 2020. Our Commitment to Trust and Safety, https://www.pornhub.com/blog/11422.
Protevi, John. 2009. Political Affect: Connecting the Social and the Somatic. Minneapolis: University of Minnesota Press.
Race, Kane. 2009. Pleasure Consuming Medicine: The Queer Politics of Drugs. Durham, NC: Duke University Press.
Race, Kane. 2018. The Gay Science: Intimate Experiments with the Problem of HIV. New York, NY: Routledge.
Reynolds, Chelsea. 2020. “Craigslist is Nothing More than an Internet Brothel”: Sex Work and Sex Trafficking in US Newspaper Coverage of Craigslist Sex Forums. The Journal of Sex Research, https://doi.org/10.1080/00224499.2020.1786662.
Roberts, Sarah T. 2019. Behind the Screen: Content Moderation in the Shadows of Social Media. New Haven: Yale University Press.
Rogers, Richard. 2020. Deplatforming: Following Extreme Internet Celebrities to Telegram and Alternative Social Media. European Journal of Communication 35 (3): 213–229.
Rubin, Gayle. 1989. Thinking Sex. In Carol S. Vance (ed.), Pleasure and Danger: Exploring Female Sexuality. London: Pandora, 267–319.
Spišák, Sanna, Tommi Paalanen, Susanna Paasonen, Elina Pirjatanniemi and Maria Vihlman. 2021. Social Networking Sites’ Gag Order: Commercial Content Moderation’s Adverse Implications for Fundamental Sexual Rights and Wellbeing. Social Media + Society, DOI: 10.1177/20563051211024962.
Thompson, Jay David. 2015. Invisible and everywhere: Heterosexuality in anti-pornography feminism. Sexualities 18 (5–6), 750–764.
Tiidenberg, Katrin and Emily van der Nagel. 2020. Sex and Social Media. Melbourne: Emerald Publishing.
Tripp, Heidi. 2020. All Sex Workers Deserve Protection: How FOSTA/SESTA Overlooks Consensual Sex Workers in an Attempt to Protect Sex Trafficking Victims. Penn State Law Review 124, 219–246.
Vance, Carole S. 1997. Negotiating Sex and Gender in the Attorney General’s Commission on Pornography. In Roger N. Lancaster and Micaela di Leonardo (eds.), The Gender/Sexuality Reader: Culture, History, Political Economy. New York: Routledge, 440–452.
Warner, Michael. 2000. The Trouble with Normal: Sex, Politics, and the Ethics of Queer Life. Cambridge, MA: Harvard University Press.
Weitzer, Ronald. 2007. The Social Construction of Sex Trafficking: Ideology and Institutionalization of a Moral Crusade. Politics & Society 35 (3): 447–475.
I am delighted to be one of the three scholars elected for a 2022 Hunt-Simes Visiting Chair in Sexuality Studies position at the Sydney Social Sciences and Humanities Advanced Research Centre (SSSHARC), University of Sydney. If in the spring we are living in a world where people fly long, long distances, I’ll have the pleasure of working with Kane Race and the rest of the excellent Sydney team on sexual expression and social media platform governance. Very much honored to be in the company of Jen Gilbert and Srila Roy.
I recently published an article, “Intime Abhängigkeiten, fragile Verbindungen, entsexualisierte Plattformen”, in the journal Sexuologie, based on a talk I gave last spring at the Intimacy talk series hosted by ICI Berlin and the Schwules Museum on the deplatforming of sex in social media (available here as video). There have been multiple variations of the talk since 2018 or so, which have fed into and drawn on different pieces I’ve written and co-written here and there. Here’s the full English article version:
As networked communications have grown infrastructural to how everyday lives are managed and lived, social media have come to operate as infrastructures of intimacy that play important roles in how people come together, stay in contact, maintain distances and proximities, and possibly fall out. At the same time, the increasingly aggressive “deplatforming of sex” from social media in the name of unspecified notions of safety involves the effacement of sexual imagery and sexual communication from the palette of exchanges available to social media users. Exploring the different ways in which sexual content is valorised and considered to lack value, and focusing on the logic of social media community standards in particular, this article asks what kinds of bodies and bodily interactions become marked as desirable, safe or risky within them; what kinds of value sex holds on these platforms; and what is at stake in social media assembling a platformed sociability void of sex.
Keywords: social media, sexuality, content moderation, sociability, sexual rights
Intimate dependencies, fragile connections, sexless platforms
Not least in the course of the ongoing COVID-19 pandemic, during which much of mundane sociability has shifted online, networked media have grown infrastructural in how everyday lives are managed. While online connections were only two decades ago something of an add-on to the navigation of friendships and sexual relations, they have since grown part of the mundane maintenance and management of social proximities and distances, attachments and desires. In our particular historical conjuncture of a global pandemic where social distancing measures strongly limit physical mobility and potential ways of connecting both within specific regions and across them, online traffic has soared: the social media market leader Facebook alone has reported hundreds of millions of new users since March 2020, porn aggregator sites enjoy high user volumes, and the success of the content subscription service OnlyFans speaks of the appeal of novel platforms for sexual entertainment, monetization and exchange.
At the same time, social media platforms are increasingly engaged in what Stephen Molldrem (2018) identifies as “the deplatforming of sex”, namely the effacement and removal of sexual content and communication. As a general concept, deplatforming refers to the removal of platforms from individuals or groups who use them to disseminate their views and ideas to the broader public. The shapes that this takes vary from Amazon deplatforming the Parler messaging app used by Trump supporters storming the Capitol building in January 2021 by removing it from its web hosting service, to the banning and removal of individual user accounts, as in Trump being ousted from both Twitter and Facebook the same month. For its part, the deplatforming of sex means the removal of sexual exchanges from social media so that the spaces for different sexual cultures, practices, and communities shrink – a “de-sexing of platforms” that disproportionately hurts exchanges among sexually marginalised people (Molldrem, 2018). As Katrin Tiidenberg and Emily van der Nagel (2020, 46–47) argue, “critics broadly agree that this deplatformization of sex is a result of cold calculations cloaked in emotional, moralizing language” that tap into lingering moral panics concerning mediated sex “which have reinforced the dubious moral status of sex and popularized the trite diagnosis of technologically mediated sex as deviant.” Contributing to these debates, this article focuses on the violent frictions that occur between platform capitalism (Srnicek, 2018) and the forms that sexual networked exchanges are allowed to take on social media.
Lauren Berlant (2000, 4) maps out intimacy as “connections that impact on people, and on which they depend for living.” Understood in this vein, intimacy operates infrastructurally, as networks that our everyday lives – and even our very sense of self – depend upon. Given the infrastructural roles that networked communications play in everyday life, such dependencies are not limited to connections between people alone but encompass the technological, mediated environments where such connections unfold, and on which we depend for living. In other words, it is my premise that considerations of intimacy as connections and networks that matter need to extend to the infrastructural role of digital technologies in the functionality of personal, social, occupational, and collective lives (Paasonen, 2018; Wilson, 2016). Within all this, social media have grown central to how people relate: quotidian and habitual in their uses, these platforms are simultaneously banal and essential in the connections that they allow for.
Infrastructure refers to facilities and backbones, both material and organizational, which, in Ara Wilson’s (2016, 247) terms, “shape the conditions for relational life”. As infrastructures of intimacy, networked media depend on the backbone of the internet as a global computational communication network composed of servers, data cables, protocols, and many things beyond. This technological and very much material infrastructure has, with the introduction of broadband connections and the mobile internet, become a key component of everyday life, not least so in Northern Europe (e.g., Hallinan, 2021). As we are, in many cases, effectively constantly connected, the division between the online and the offline that much early Internet research operated with has become somewhat meaningless in describing the social practices of everyday life. Within the current historical conjuncture, data giants such as Facebook, Apple, Amazon and Google/Alphabet have broad and concrete power to both shape and monitor our online exchanges.
Blake Hallinan (2021, n.p.) points out that, during the past decade, “Facebook has scaled up, in terms of the number of users, and scaled out, in terms of integration into public life” so that it has gained infrastructural importance in how both personal and public lives are navigated and managed. Expanding its operations to data centres, software and investments in undersea cables, Facebook equally operates on the level of Internet infrastructure, making it “a communication infrastructure that simultaneously infrastructures significant aspects of our social, political, and economic lives” (Hallinan, 2021). My concern is specifically with the infrastructural roles that Facebook – together with other social media companies – plays in mundane sociability by framing sexuality as objectionable, problematic and sensitive.
Although the term social media suggests the opposite, all media are basically social. As Nancy Baym (2015, 1) argues, social media entails “the takeover of the social by the corporate” while also putting “the focus on what people do through platforms rather than critical issues of ownership, rights, and power”. Furthermore, social media structure and govern the shapes that platformed sociability takes – and while this issue has come into sharp focus during the global pandemic, it is not specific to the moment. My interests lie in the perceived value of sex, or the lack thereof, at a moment when social media services increasingly frame sexuality out of the kind of sociability they allow for. I argue that we are facing a corporate model of non-sexual sociability, of the unsexual social, and that this poses a problem in terms of sexual rights, particularly those pertaining to LGBTQ+ people. Within the model of the non-sexual corporate social, sex becomes flattened into objectionable and potentially obscene content, and visual content in particular becomes filtered and blocked in a horizontal vein free of contextual nuance. In the course of this, the political, cultural and social dimensions of sexuality are flattened and hollowed out. In what follows, I first unpack the overall rationale behind sex’s deplatforming before moving to address the conflicts between the economic value generated on social media platforms and the value that sex holds in individual and collective lives, and the overall stakes involved in its increasing zoning out of networked exchanges.
Zoning out sex
The visibility and accessibility of sexual content has been moderated on most social media services since their launch, even as users have been, and remain, creative in their sexual uses of these communication platforms (for an extended discussion, see Tiidenberg & van der Nagel, 2020). In addition to users flagging and reporting posts that they find offensive, commercial content moderators weed through masses of data, and automated machine learning techniques are broadly used to recognize sexual content. Such policing has grown ever more vigilant with the 2018 “Allow States and Victims to Fight Online Sex Trafficking Act” (FOSTA) and “Stop Enabling Sex Traffickers Act” (SESTA) bills, exceptions to Section 230 of the United States Communications Decency Act, which has kept online services immune from civil liability for the actions of their users. Arguably intended to curb sexual trafficking, FOSTA-SESTA has led to the removal of much content connected to commercial sex while impacting the availability of sexual content much more expansively. Facebook, never friendly to sexual content, has revised its community standards to delimit sexual communication and depictions of nudity and sexual activity. The microblogging service Tumblr, once home of diverse sexual subcultures, removed all nudity in its December 2018 “porn ban”, effacing communities of exchange that had developed over a decade (Byron, 2019; Paasonen et al. 2019, 62, 133; Tiidenberg, 2019; Tiidenberg & van der Nagel, 2020, 45–46; Pilipets & Paasonen, 2020).
Before the FOSTA-SESTA bills passed, Tumblr offered users the possibility to opt in, namely to agree to viewing sexual content. A similar model remains in place on Twitter. On Facebook, Instagram, TikTok and the current Tumblr, such content is strictly governed through community standards regulating the kind of content that users are allowed to post and share. These standards explicitly and forcefully exclude sexual content in the name of safety, emphasizing that their aim is to keep users safe from potentially harmful content. In Facebook’s community standards, for example, nudity and sexual communication are classified under “objectionable content” alongside violence and hate speech.
Sexual content is governed and moderated even more strictly than images of violence, while hate speech is moderated with some contextual consideration in connection with the freedom of expression. Whereas Facebook moderation guidelines address at some length the degrees of physical violence, blood and gore accepted, nudity (female nipples included) is horizontally forbidden, so that images of classic artworks, public statues, historical photographs, dick pics and links to gonzo porn are all similarly censored (Paasonen et al., 2019, 40–41). Within this logic, all nudity is understood as sexual, and therefore objectionable. Suggestions of sexual availability among consenting adults are equally forbidden in Facebook’s community standards, arguably since these can be seen as sexual soliciting. The ousting of sex from social media ranges from bans on nude selfies to sexual health resources, journalistic accounts and academic studies being classified as conflicting with community standards.
In the United States, the home of Facebook, the notion of community standards has long been used to determine the criteria for obscenity and obscene content unprotected by the First Amendment principles of freedom of expression. The 1957 Supreme Court ruling Roth v. United States introduced community standards as a test making it possible to judge whether, to “an average person,” a cultural representation appeals to prurient interest and is hence obscene. In the context of social media services with billions of users globally, community standards operate with an even more ephemeral notion of “some people”, remaining a problematic, or at least an opaque, concept in that anything like an average person is impossible to isolate or identify among Facebook’s 2.8 billion active users. These platforms identify themselves as community services, or simply as communities, yet their vast volume of use, combined with vast differences in what individual users are likely to find objectionable, offensive or obscene across cultural, religious and societal divides, means that there can be no implicit consensus on the matter.
Despite the corporate logic being clear as such, it remains important to ask why sex becomes framed as unsafe to start with. The most popular social media services in Europe are, with the exception of TikTok, of U.S. origin. The community standards of Facebook, Instagram and Tumblr can be interpreted in relation to Puritanism as an often invisible, yet impactful cultural sensibility concerning sex and sexuality. It is my suggestion that the conflation of sex with risk speaks of that which, following Raymond Williams (1977, 132–133), can be identified as a residual structure of feeling orienting “affective elements of consciousness and relationships” and generating specific kinds of sociability. Formed in the past, residual structures of feeling linger on, possibly unnoticed, while nevertheless attuning the rhythms of contemporary culture. They are, on the one hand, detached from dominant culture – just as Puritanism, as a historical ideology and practice, is detached from the neoliberal entrepreneurial capitalism of contemporary Silicon Valley – yet, on the other, they are already incorporated into it, giving rise to contradictory layers of culture. The residual remains actively present through “reinterpretation, dilution, projection, discriminating inclusion and exclusion” (Williams, 1977, 123), as it does more passively as reverbs and emphases. Understood as echoes implicit in social media’s community standards and codes of conduct, Puritan residual structures of feeling specific to the United States (although not limited to the country in question) operate with the premise that sex is dangerous and in need of control, rather than a source of enjoyment and wellbeing. (See also Paasonen et al., 2019, 169; Paasonen & Sundén, forthcoming).
Such echoes entail culturally specific attitudes towards sexuality and bodily display which, filtered through the tradition of puritanism into less strident yet persistent variations of prudishness, are catered as transparent norms of properness internationally through social media community standards. Sex, in this framework, implies risk to be fought off through tagging, flagging, automated filtering, human moderation, and combinations thereof – the humans thus employed often being offshore workers in countries with cheap labour costs (Roberts, 2019).
Social media content policies are, on the one hand, concerned with nudity in a horizontal manner. On the other hand, it is female and not male nipples that come attached to prurient interests as sites of titillation and obscenity (for an extended discussion, see Paasonen & Sundén, forthcoming). Content moderation policies come with gender bias, in addition to which the algorithms used in automated content filtering are primed and optimised to recognize certain shapes and patterns over others. As creations of specific cultural and social contexts, these techniques treat and valorise different bodies – and, by extension, embodied differences and identities – differently. There is bias, both racial and gendered, to how algorithms work: much of this depends on the datasets that they have been trained with (Noble, 2018).
Exploring literature on computer vision-based pornography filtering techniques built on such datasets, Robert W. Gehl, Lucas Moyer-Horner and Sara K. Yeo (2017, 530) argue that these operate with a highly limited set of assumptions, according to which “pornography is limited to images of naked women”, “sexuality is largely comprised of men looking at naked women” and “pornographic bodies comport to specific, predictable shapes, textures, and sizes.” In particular, the authors show that filtering software has been built primarily to recognise white, young female bodies as pornographic objects: the word “penis” did not feature once in the 102 articles that Gehl, Moyer-Horner and Yeo (2017, 536) analysed. Although penises have more recently entered the realm of algorithmic governance, gendered bias remains rife.
In 2019, the feminist newsletter Salty reported on a leaked Facebook and Instagram ad policy document defining the boundaries of acceptability in underwear and swimwear photos through references to a Victoria’s Secret advertising campaign. These images, comprising a biased dataset, were used to train not algorithms but human eyes, explaining “in twenty-two bullet points the way models can sit, dress, arch their backs, pose, interact with props” and “how see-through their underwear can be”. The algorithms discussed by Gehl and his co-authors focused on young, white, and thin female bodies as sites of porn. Facebook’s policy document was concerned with very similar bodies when drawing boundaries around the acceptably sexy and the potentially obscene – the difference being that these bodies operated as markers of desirable, profitable content. As Salty’s writers explain, this biased training sample discriminates against queer and women-run accounts posting body positive and gender nonconforming content that departs from the narrow body aesthetics of Victoria’s Secret, a brand known for its glossy and sexualised displays of femininity (see also Madden et al., 2018).
Within all this, naked or semi-naked female bodies remain the markers of obscenity and potential disgust alike, except when conforming to narrow, commercially defined aesthetic norms of appropriate sexiness. From the perspective of Facebook, things seen as sexy or “cheekily” acceptable hold value in attracting advertisers and user engagement; things considered inappropriately sexy, cheeky and offensive do not. Within such estimations of value, drastically different scales of worth, importance and productivity emerge and clash. Meanwhile, community standards offer a discursive space for pulling these together. The 1973 U.S. Supreme Court test on obscenity (in Miller v. California) expanded the criteria of community standards to considerations of whether the work examined “lacks serious literary, artistic, political or scientific value” (Hudson, 2018), the overall idea being that obscene content lacks all of these kinds of value and hence, as simple trash, does not merit protection in the name of freedom of expression.
If this legal definition identifies obscenity as worthless, the adaptation of community standards in social media content regulation expands the same logic to all content posted and shared by users. In practice, these norms are far from transparent, ambivalent in their wording, and often plain opaque to those whose content ends up being removed. While residually Puritan social media content policies mark sexuality – from sexual representation to other forms of sexual communication and knowledge – as lacking in importance and social value, the situation is considerably more complex from the perspective of users, for whom content that becomes filtered out, flagged or blocked can hold great personal and social value irreducible to its monetization. This importance directly correlates with the perceived importance of sexuality in people’s personal and social lives.
In her research on the mid-1990s 2D graphical chat space The Palace, Sal Humphreys (2008) points out the fragility of queer sociability on commercial online platforms. Once a vibrant community, the queer Palace room she studied was abruptly disbanded and destroyed as the site owners simply switched off the server, leaving users unable to find one another as the room had been their only node of mutual connection. Humphreys describes the experience of logging into what had become “a different Palace” some months later:
“Palace, our ‘home’ has disappeared – the owners took the server offline. We have all been wandering aimlessly from Palace to Palace in search of each other for weeks now. Sometimes we find some of us. We gather in a room in someone else’s Palace and rekindle the warmth, but it is not the same. We need our own place back again. We have started to use the Palace we are in now, but it isn’t queer and there are lots of others here too.” (Humphreys, 2008, n.p.)
Social media platforms, similarly to the queer Palace, can just disappear, as did the Twitter-owned short-form video service Vine in 2016. They can equally radically alter their content policies and other in-platform laws without much prewarning so that sexual and gender nonconforming networks are cut off, as happened on Tumblr in 2018 (Byron, 2019; Tiidenberg, 2019; Pilipets & Paasonen, 2020). Announcing the ban on nudity and sexual content, Tumblr CEO Jeff D’Onofrio (2018) spoke of the intention to make the platform a “better, more positive” place for its community members. This statement was met with large degrees of hostility, sarcasm and sadness. Given Tumblr’s role in the formation of queer sociability and in the maintenance and creation of networks connected to a broad spectrum of sexual subcultures, the content ban did not merely alter the platform’s possible uses but, through the removal of NSFW blogs, did away with social connections and resources.
Some of the content removed pertained to specific sexual likes, as Tumblr blogs had gathered together self-shooters, kinksters, niche porn aficionados, queers, and combinations thereof (Molldrem, 2018; Ashley, 2019; Engelberg & Needham, 2019; Tiidenberg, 2016; Ward, 2019). For some users, Tumblr was a site of sexually explicit fan art and, for others, an archive of counterhegemonic sexual content, a network of connections and exchanges, and a site of queer knowledge on issues ranging from mental health to penile reconstructions. Within two weeks of Tumblr announcing its new content policy, users, some of them comparing the platform to a home space facing immediate destruction, would “no longer be able to express themselves through sharing images of their bodies, or the bodies of others, or a range of other sexual content, regardless of whether ‘adult content’ was their reason for taking up residence on Tumblr” (Byron, 2019, 345).
If we understand social media as infrastructures of intimacy in the sense of entailing connections upon which we depend for living – not entirely, yet to different degrees – it follows that being cut off from these, or these disappearing, partly reorganizes one’s ability to relate to others, to be and to act. There is fragility to relating through these platforms as the terms and conditions under which this occurs are not ours to control, or hardly even to impact. As Humphreys (2008) points out, “Publishers and owners most often reserve the right to exclude anyone from their sites for any or no reason. They mostly refuse any form of accountability for their decisions and invoke the rights of property to back their claims.”
Writing on Tumblr before the “porn purge”, Alexander Cho (2018) points to it being the preferred platform for queer youth of colour due to both its lenient content policy and its lack of publicness by design. Contra Facebook’s real name policy, which allows the service to extract “robust and verifiable user data for monetization” by insisting that users operate with their legal name, Tumblr allows for multiple blogs under different aliases, none of which need to be connected to a singular name that other users could search for (Cho, 2018, 3183). As Cho points out, platform capitalism, in the case of both Facebook’s name policy and Google ID, can have drastically negative outcomes for queer youth in the form of outing and homophobic surveillance. In not allowing for anonymity or aliases, Cho argues, Facebook puts vulnerable users at risk. For its part, Facebook identifies aliases with fake accounts entailing the risk of bullying, spamming and trolling while also claiming to protect the safety of its users by governing the visibility of sexual content available to them. Safety, then, works in mysterious, or at least convoluted, ways.
As Cho (2018, 3196) further argues, “the very factors that make Tumblr a non-default-public space for counter-hegemonic expression are the ones that impede the targeted value-extractive mechanics of platform capital”. Without sex, Tumblr’s economic model, never profitable as such, virtually collapsed: three months after the ban on sexual content, user traffic had dropped by nearly 30 per cent, and the service was soon sold for a fraction of its former price. With Tumblr, sexual content was key to popularity but not to financial success. While much could be said of the reasons for the content ban and its consequences, suffice to say that it involved drastic misreadings of the service’s key userbase and the value that these people placed on sexual exchanges. These misreadings speak to the incompatibility in the scales of value placed on social media platforms and the exchanges they afford.
Off with context
Announcing Tumblr’s porn ban, D’Onofrio (2018) argued that, “Bottom line: There are no shortage of sites on the internet that feature adult content. We will leave it to them and focus our efforts on creating the most welcoming environment possible for our community.” In doing so, D’Onofrio conflated all sexual content circulated on the site with the genre of pornography – a genre most commonly associated with “adult content”. This swift rhetorical move effectively did away with all contextual nuance within such content, as well as specificities in how, why and what sexual content is circulated and consumed, in what social settings and on what online platforms. This move both built on and further supported a broad division drawn between social media platforms (defined as safe, non-sexual, welcoming, community-oriented) and porn sites (standing for the unsafe, that which is best excluded from social media or otherwise filtered out). This rift continues to grow ever more pronounced, so that social media companies may classify virtually all kinds of sexual content as comparable with, or as freely falling into, the category of “porn”. This horizontal, simplified classification is in place by design: companies such as Facebook aim for their algorithmic and human content moderation practices to recognize, flag and block offensive content in uniform ways, independent of cultural context or geographical location.
On the one hand, it is easy for algorithms to spot nudity. On the other hand, it is virtually impossible for them to make sense of the (cultural, social, temporal or political) contexts within which such nudity is featured. It is also difficult for Facebook to make sense of consent in order to draw lines around solicited and unsolicited exchanges, for example. Consequently, a nude photo sent to a lover in a private message as part of sexual play can result in the user being banned, as this is treated similarly to sexual harassment by a random stranger. There is no choice for users to opt in, or consent, to viewing sexual content since, according to community standards, “some people” may be sensitive to this. Focused on the pictorial properties of visual content, moderation guidelines sidestep the notion of artistic value as impossible for the company to make judgements on: while this impossibility is a fact, the policy simply results in art featuring nudity being classified as offensive. As context disappears, images shared from a BBC news site can become flagged as offending community standards on nudity and sexual activity – as with Facebook tributes to the actor Burt Reynolds at the time of his death, many of which made use of his famous nude 1972 Cosmopolitan centrefold, and subsequently became blocked.
The deplatforming of sex, as addressed in this article, has significant repercussions in terms of what can be done with sexual media, how we can relate to one another on diverse platforms, as well as the shapes that sexual lives and pleasures can take. The effacement of sexual content and its conflation with pornography further evokes the question of how we make sense of sexual media genres, their cultural positions and social uses. A broad range of visual sexual practices plays out on online platforms, from networked masturbation sessions on Skype or Snapchat to WhatsApp sexting, chats in hook-up apps and more or less playful exchanges of nude selfies via social media backchannels and direct messaging, even if these go against community standards. The people engaging in such routines do not necessarily perceive their media production as being pornographic, or its outcomes as having much to do with porn as largely consumed on aggregator sites in the shape of video clips. Meanwhile, such productions can be experienced as erotic, sexual, intense, libidinal, fun, disturbing, visceral and important. It is of course the case that platforms do exist, and are constantly launched, for the purposes of sexual display and connecting – FetLife alone having been in operation since 2008, and diverse hook-up apps being largely centred around sexual exchanges. It is nevertheless my argument that the expansive, horizontal ousting of sex from social media flattens out possible ways of relating, gaining sexual knowledge and advancing public debates connected to sexual cultures, lifestyles, preferences, identities, professions and ethics.
Denying the personal and social value of such exchanges, I further argue, does damage to ways of understanding sociability and the kinds of social engagements that draw people to these platforms to start with. Perhaps self-evidently, it also downplays the role that networked media play as and within infrastructures of intimacy. The deplatforming of sex has detrimental effects on sexual cultures in blocking access to communication platforms, archival spaces and networking options, and in doing away with the means of presenting and enjoying bodies. Following the logic of deplatforming, the post-porn ban Tumblr, with no more dick pic galleries, no porn gifs, no amateur porn drawings and no explicit kink, is just safer, more welcoming and, by extension, better. The content moderation practices set in place for policing community standards are designed to keep us all safe, even as this safety goes against our explicit wants, needs and desires. At the same time, this figure of safety, both normative and spectral, causes harm: consider, for example, the consequences that the ousting of gender and sexual nonconforming blogs has had for the possibility of people coming together, finding things, and possibly figuring things out, or the risks of real-name policies for the lives of queer youth, as discussed by Cho. And even as the value of sex – the value of people expressing desire to get it on with one another, or to explore specific interests – is denied, sex certainly continues to hold value in social media as ubiquitous “sexiness” conforming to gendered body norms.
Social media services have strongly defended the freedom of expression when accused of not sufficiently moderating vitriolic political commentary or the racist, homophobic and misogynistic overtones of their users. Their community standards, following U.S. legal definitions of obscene content as both falling outside the freedom of expression and lacking in social and cultural value, conflate sexual content with offensiveness so that these can be filtered out. At the same time, the notion of sexual rights, articulated since the 1990s in response to feminist, queer and transgender activism, firmly frames the issue of sexual expression as a human right key to wellbeing (Albury, 2017). Understood through the prism of sexual rights, the policing of sexual expression in social media is in conflict with human rights (for an extended discussion, see Spišák et al., 2021). This is a particularly pressing concern as social distancing measures have boosted social media traffic. Networked forms of doing sex have expanded just as the deplatforming of sex has accelerated, making evident the tension between a data economy ruled by U.S.-specific notions of appropriate content and behaviour, and the centrality of sexual exchanges in people’s lives.
Arguments made for sexual rights in response to social media content policies are perhaps unlikely to undo the overall logic of risk and safety that these are premised on, yet they can allow for shifting the focus of debate to the diverse forms of value that sex holds in people’s lives beyond the logic of monetization upon which social media operate. The value of sex is intensely personal, as that which contributes to the making of the self, just as it is political as a matter of alliance, identification and advocacy. The deplatforming of sex produces social media sociability devoid of sexuality and, in so doing, effaces connections that impact us, and on which we depend for living.
Albury, K. 2017. Just Because It’s Public Doesn’t Mean It’s Any of Your Business: Adults’ and Children’s Sexual Rights in Digitally Mediated Spaces. New Media & Society 19(5), 713–725.
Ashley, V. 2019. Tumblr Porn Eulogy, Porn Studies 6 (3), 359–362.
Baym, N.K. 2015. Social Media and the Struggle for Society. Social Media + Society 1 (1), 1–2, DOI: 10.1177/2056305115580477.
Berlant, L. 2000. Intimacy: A Special Issue. In Berlant, L. (Ed.), Intimacy. Chicago, IL: University of Chicago Press, 1–8.
Byron, P. 2019. “How Could You Write Your Name Below That?” The Queer Life and Death of Tumblr. Porn Studies 6 (3), 336–349.
Cho, A. 2018. Default Publicness: Queer Youth of Color, Social Media, and Being Outed by the Machine. New Media & Society 20 (9), 3183–3200.
D’Onofrio, J. 2018. A Better, More Positive Tumblr. https://staff.tumblr.com/post/180758987165/a-better-more-positive-tumblr.
Engelberg, J. & Needham, G. 2019. Purging the Queer Archive: Tumblr’s Counterhegemonic Pornographies, Porn Studies 6 (3), 350–354.
Gehl, R.W., Moyer-Horner, L. & Yeo, S.K. 2017. Training Computers to See Internet Pornography: Gender and Sexual Discrimination in Computer Vision Science. Television & New Media 19 (6), 529–547.
Hallinan, B. 2021. Civilizing Infrastructure. Cultural Studies, online before print. DOI:10.1080/09502386.2021.1895245
Hudson, D.L., Jr. 2018. Obscenity and Pornography. In: The First Amendment Encyclopedia, https://mtsu.edu/first-amendment/article/1004/obscenity-and-pornography.
Humphreys, S. 2008. Commodifying Social Relations: Affective Economies Online. Unpublished Presentation at the Networks of Desire Seminar, Helsinki Collegium for Advanced Studies, October 9.
Madden, S., Janoske, M., Briones Winkler, R. & Harpole, Z. 2018. Who Loves Consent? Social Media and the Culture Jamming of Victoria’s Secret. Public Relations Inquiry 7 (2), 171–186.
Molldrem, S. 2018. Tumblr’s Decision to Deplatform Sex Will Harm Sexually Marginalized People, Wussy, December 6. https://www.wussymag.com/all/2018/12/6/tumblrs-decision-to-deplatform-sex-will-harm-sexually-marginalized-people.
Noble, S.U. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.
Paasonen, S. 2018. Infrastructures of Intimacy. In: Andreassen, R., Nebeling Petersen, M., Harrison, K. & Raun, T. (Eds.), Mediated Intimacies: Connectivities, Relationalities and Proximities. London: Routledge, 103–116.
Paasonen, S., Jarrett, K. & Light, B. 2019. NSFW: Sex, Risk, and Humor in Social Media. Cambridge, MA: MIT Press.
Paasonen, S. & Sundén, J. Forthcoming. Objectionable Nipples: Puritan Data Politics and Sexual Agency in Social Media. In: Keilty, P. (Ed.), Queer Data. New York: Routledge.
Pilipets, E. & Paasonen, S. 2020. Nipples, Memes and Algorithmic Failure: NSFW Critique of Tumblr Censorship. New Media & Society, online before print, https://doi.org/10.1177/1461444820979280.
Roberts, S.T. 2018. Digital Detritus: “Error” and the Logic of Opacity in Social Media Content Moderation. First Monday 23 (3), https://journals.uic.edu/ojs/index.php/fm/article/view/8283/6649.
Roberts, S.T. 2019. Behind the Screen. Content Moderation in the Shadow of Social Media. New Haven: Yale University Press.
Salty. 2019. Exclusive: Victoria’s Secret Influence on Instagram’s Censorship Policies. November 22, https://saltyworld.net/exclusive-victorias-secret-influence-on-instagrams-censorship-policies/.
Spišák, S., Pirjatanniemi, E., Paalanen, T., Paasonen, S. & Vihlman, M. 2021. Social Networking Sites’ Gag Order: Commercial Content Moderation’s Adverse Implications for Fundamental Sexual Rights and Wellbeing. Social Media + Society.
Srnicek, N. 2018. Platform Capitalism. Cambridge: Polity.
Tiidenberg, K. 2016. Boundaries and Conflict in a NSFW Community on Tumblr: The Meanings and Uses of Selfies. New Media & Society 18 (8), 1563–1578.
Tiidenberg, K. 2019. Playground in Memoriam: Missing the Pleasures of NSFW Tumblr. Porn Studies 6 (3): 363–371.
Tiidenberg, K. & van der Nagel, E. 2020. Sex and Social Media. Bingley: Emerald.
Ward, J. 2019. Tumblr Tributes and Community. Porn Studies 6 (3): 355–358.
Williams, R. 1977. Marxism and Literature. Oxford: Oxford University Press.
Wilson, A. 2016. The Infrastructure of Intimacy. Signs: Journal of Women in Culture and Society 41 (2): 247–280.
Our short piece on shadowbanning, Sex in the Shadows of Celebrity, written together with the wonderful Dr Carolina Are, is out on OA with Porn Studies as part of a forthcoming special issue on the deplatforming of sex in social media. Here’s the abstract:
Shadowbanning is a light censorship technique used by social media platforms to limit the reach of potentially objectionable content without deleting it altogether. Such content does not go directly against community standards so that it, or the accounts in question, would be outright removed. Rather, these are borderline cases – often ones involving visual displays of nudity and sex. As the deplatforming of sex in social media has accelerated in the aftermath of the 2018 FOSTA/SESTA legislation, sex workers, strippers and pole dancers in particular have been affected by account deletions and/or shadowbanning, with platforms demoting, instead of promoting, their content. Examining the stakes involved in the shadowbanning of sex, we focus specifically on the double standards at play allowing for ‘sexy’ content posted by or featuring celebrities to thrive while marginalizing or weeding out posts by those affiliated with sex work.
Our article with Sanna Spišák, Elina Pirjatanniemi, Tommi Paalanen and Maria Vihlman, with the whopper of a name, “Social Networking Sites’ Gag Order: Commercial Content Moderation’s Adverse Implications for Fundamental Sexual Rights and Wellbeing”, is just out with Social Media + Society (OA). It’s one outcome of our ongoing project, Intimacy in Data-Driven Culture, and the abstract goes like this:
This article critically investigates the reasoning behind social media content policies and opaque data politics operations regarding sexual visual social media practices and sexual talk, asking what is at stake when social media giants govern sexual sociability on an international scale. Focusing on Facebook, in particular, this article proposes an alternative perspective for handling various expressions of sexuality in social media platforms by exploring the wide-ranging ramifications of community standards and commercial content moderation policies based on them. Given that sexuality is an integral part of human life and as such protected by fundamental human rights, we endorse the freedom of expression as an essential legal and ethical tool for supporting wellbeing, visibility, and non-discrimination. We suggest that social media content policies should be guided by the interpretive lens of fundamental human rights. Furthermore, we propose that social media content policies inclusive of the option to express consent to access sexual content are more ethical and just than those structurally erasing nudity and sexual display.