
disquieting, unfit social bodies

Our book project, #NSFW, is moving more into the writing phase with Kylie Jarrett and Ben Light, which is exciting indeed. So here’s a little something towards a section on nudity and social media.



On February 9, 2016, visual artist Illma Gore published a pastel pencil painting of Donald Trump in the nude, titled “Make America Great Again,” on her Facebook page with the accompanying slogan, “Because no matter what is in your pants, you can still be a big prick.”

Inspired by the highly public, and broadly discussed, allusions to Trump’s penis size during the Republican presidential primary debates, the drawing featured the candidate with genitalia of markedly modest size. The image soon began circulating on other social media platforms from Instagram to Tumblr, Twitter, and Snapchat. By February 11, it was featured with NSFW warnings in media outlets such as The Huffington Post (“Artist Imagines What Donald Trump Looks Like Naked And It Ain’t Pretty (NSFW)”) and The Daily Dot (“Realistic nude painting of Donald Trump will make you gouge your eyes out”), with many other stories to follow.

Very quickly after posting the initial image, Gore’s Facebook account was temporarily suspended for violating the service’s community standards, and it has been blocked numerous times since. According to these standards:

“We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous or satirical purposes. Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed.” (https://m.facebook.com/communitystandards/?section=1)


Illma Gore’s portrait of Trump in the nude did not in fact violate these terms, as she had covered Trump’s genital area with a black block (Hoffman 2016). On March 3, she published an eBay listing for the piece, only to have it taken down a few days later for violating the service’s policy on images of nudity, according to which “frontal nudity is allowed in Art categories when the item is considered fine art, such as Michelangelo’s David, vintage pin-up art, Renaissance-style paintings, and nude cherubs.” Works not fitting these parameters should be listed in the Adult Only Category. (http://pages.ebay.com/help/policies/adult-only.html.) Within a week, Gore had been threatened with lawsuits from Trump’s team and risked having her Facebook account permanently blocked. (Voon 2016.)

In its visceral display of folding naked celebrity flesh, the image itself is well suited to news outlets and click sites alike. News items on the acts of blocking, banning, and potential censorship connected to it—and ones related to Facebook in particular—invested the incident with a different kind of sticky attention value. In the course of all this, the image gained notable virality as people shared news items, most of them featuring an uncensored version of the portrait that consequently found its way to extensive distribution on Facebook. To further fuel its dispersion, Gore put a high-resolution image file on her website for free downloading. By April 2016, the portrait was on display at the Maddox gallery in London, priced at £1m (http://www.theguardian.com/us-news/2016/apr/17/nude-donald-trump-painting-illma-gore-lawsuits). All this testifies to the intermeshing of attention value and monetary value through and within the fast speeds of social media platforms, online news outlets, links, and clicks, as debated in the context of the attention economy, even if this is not the key issue right here.

The image in question, situated in the realm of arts and politics alike, is one in a long, constantly accumulating stream of incidents testing and challenging the community standards of Facebook. At the same time, the incident is also somewhat exceptional in that it focused on the nude male body of a celebrity politician. Other controversies to date have predominantly focused on images of female bodies, as in the context of breastfeeding and breast cancer. In 2012, a group of women gathered in front of Facebook headquarters in a collective breastfeeding protest opposing the company’s policy of flagging and removing images of nursing. In a 2013 controversy, over 20,000 people signed a Change.org petition protesting the removal of images of mastectomy as obscene (and hence as comparable to pornographic content). Facebook community standards have since undergone revisions. According to the March 2016 version,

“We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous or satirical purposes.”


Nudity occupies tricky terrain on the borderlines between the SFW and the NSFW, especially since the criteria regulating obscenity are, due to the largest social media platforms’ country of origin, overwhelmingly defined according to standards particular to U.S. culture. Art remains a space where nudity fails to be automatically conflated with obscenity, yet terms of service routinely map out its confines and discontents in notably conservative terms. On eBay, this involves fencing off the obscene in contrast to Renaissance-style aesthetics and cherubs even while the normative boundaries of art are transgressed by including vintage pin-ups within the category. For its part, Facebook recognizes paintings and sculptures but not digitally generated images as art, hence articulating the boundaries of art in highly medium-specific terms.

Image recognition software has been developed since the 1990s for automatically filtering out pornographic images (e.g. Ries and Lienhart 2014; Iqbal et al. 2016), yet Gore’s blocked Facebook access most likely owes to other users reporting the image as offensive, and to company representatives screening the content agreeing with them (cf. Summers 2009). As a mechanism for reporting offensive content, flagging, as a “technical rendition of ‘I object,’” is based on user participation in the reification of social media sites’ community norms (Crawford and Gillespie 2014, 2, 5).

Kate Crawford and Tarleton Gillespie (2014, 2) show how flagging has become a ubiquitous mechanism of governance with a somewhat complex relation to community sentiment, consisting as it does of interactions between “users, platforms, humans, and algorithms, as well as broader political and regulatory forces.” These are themselves inseparable from moral concerns and corporate strategies, and routinely result in opaque decisions detached from broader negotiations or articulations of concern over what is acceptable, offensive, or controversial (Crawford and Gillespie 2014, 4, 10).

In another March 2016 incident, Mari Corry’s photo showing her breastfeeding in a park was flagged as offensive. In protest, she uploaded a breastfeeding photo where the baby’s head was covered with a print featuring a Victoria’s Secret model, hence commenting on double standards concerning women’s breasts and the multiple purposes of their public visibility (Wallwork 2016). The previous month, Rowena Kincaid, terminally ill with breast cancer, uploaded a picture of her symptoms in order to help other women self-diagnose. Since the image included her nipple, it was soon flagged for violating community standards. She then reposted the image with a smiley drawn over the nipple. As was the case with Corry’s and Gore’s images, news of flagging fuelled the images’ social circulation across diverse platforms. Flagging is a sign of objection, but it also involves more complex social dynamics and exchanges, many of which are inseparable from increases in attention value:

“Flags get pulled as a playful prank between friends, as part of a skirmish between professional competitors, as retribution for a social offense that happened elsewhere, or as part of a campaign of bullying or harassment—and it is often impossible to tell the difference. Flagging is also used to generate interest and publicity, as in the cases where YouTube put a racy music video behind an age barrier and promoters decried the “censorship” with mock outrage. The fact that flagging can be a tactic not only undercuts its value as a “genuine” expression of offense, it fundamentally undercuts its legibility as a sign of the community’s moral temperature.” (Crawford and Gillespie 2014, 11.)


The persistent hunt for offensive areolas and micro-penises in the incidents discussed above is connected to broader boundary maintenance between SFW and NSFW platforms. While Twitter, and Tumblr in particular, broadly accommodate sexually explicit content, most social media platforms expend considerable effort to remove it. This, combined with the increased centralization of ownership among dominant actors (such as Google and Facebook), has drastically reframed the operating possibilities of companies trading in specifically pornographic content. In his coverage of the adult app store Mikandi, Wired reporter Cade Metz notes that

“with the rising power of companies like Apple and Google and Facebook, the adult industry doesn’t drive new technology. In many respects, it doesn’t even have access to new technology. The big tech companies behind the big platforms control not only the gateway services (the iPhone app store, Google Search, the Facebook social network) but the gateway devices (the iPhone, Android phones, Google Chromecast, the Amazon Fire TV, the Oculus Rift virtual reality headset). And for the most part, they’ve shut porn out. Besides, these giants now drive new technology faster than services like Mikandi or Pornhub ever could.” (Metz 2015.)


Porn sites—the market leader Pornhub being here an obvious example—are currently much more likely to emulate the technical solutions and revenue models of social media platforms than the other way around. Pornographic aggregator sites from RedTube to YouPorn and XTube have all copied their design and operating principles from YouTube. Meanwhile, those modelled after other social outlets, such as Fuckbook and Snatchly, “the Pinterest of Porn,” have not caught on in a similar way.

All this is in stark contrast to the situation some two decades ago, given the degree to which the needs of the porn industry drove the development of Web solutions. Gaming and online shopping only picked up towards the end of the 1990s, and for quite a while pornography remained one of the few forms of content that users would pay for. Consequently, safe credit card processing systems, streaming video technologies, and hosting services, as well as practices such as banner advertisement and pop-ups, were first developed for and used on porn sites. The role of porn as a driving force in dot-com enterprise has clearly since passed. While pornographic content still quickly migrates to new technical platforms and media formats, its position is crucially different in the context of social media than in the Web cultures of the 1990s.

Pornographic content, or even less sexually explicit nudity, is weeded out from platforms such as YouTube or Facebook through flagging and automated blocking alike: it is not possible to share links directly to pornographic content. Adult entertainment companies have a public social media presence, yet this mainly involves sticking to the non-explicit, and hence failing to represent the brands’ core features, in order to accommodate diverse terms of service. In addition to key social media companies warding off pornography, Google, YouTube’s owner, also bans pornography from its advertisements:

“The AdWords policies on adult sexual services (…) will be updated in late June 2014 to reflect a new policy on sexually explicit content. Under this policy, sexually explicit content will be prohibited, and guidelines will be clarified regarding promotion of other adult content. The change will affect all countries. We made this decision as an effort to continually improve users’ experiences with AdWords.” (https://support.google.com/adwordspolicy/answer/4271759?hl=en&ref_topic=29265)


Search engines have long filtered out pornography from their freely published listings of most popular search terms, hence adding to the position of porn as a public secret of sorts. Search engines also filter out sexually explicit content in diverse ways, which, in practice, makes it difficult for users to find what they are searching for. While this could, for evident reasons, be considered the deliberate design of a poorer service, such measures are articulated and motivated as improvements in user experience (as in the AdWords policy statement above).

Google has long provided a “SafeSearch” option that filters out all, or at least most, hits to adult content. As that which SafeSearch filters out, sexually explicit content becomes framed as unsafe and risky. Should the user choose to disable SafeSearch, Google will “provide the most relevant results for your search and may include explicit content when you search for it.” Searches on pornography can be openly tracked through Google Trends, testifying to their perennial and even increasing popularity, yet without breaking down these trends into actual numbers of searches. All in all, the uneasy visibility of nudity and pornographic content in Web searches and on social media platforms speaks of a gap between the ideal, normative figure of a user—in accordance with which the services’ “community standards” are crafted—and the diverse, inappropriate interests and unruly titillations of empirical, actual people that routinely veer towards the NSFW.



Filed under cultural studies, internet research, media studies, porn studies

Bit on Networked Affect

I recently wrote a short entry on “Networked Affect” based on our recent edited collection for the forthcoming Posthuman Glossary put together by Rosi Braidotti and Maria Hlavajova. And here it is! In its non-copyedited glory.


Counter to rationalised conceptualisations of network media as an issue of information management, retrieval and exchange, online communications are not merely about storing and sharing data but also about the spread, attachment, amplification and dissipation of affective intensity that helps to shape and form connections and disconnections between different bodies. These proximities and distances, again, may intermesh and layer with sexual titillation, political passions or the creation of monetary value alike.

As the capacity of bodies to affect and be affected by one another (e.g. Spinoza 1992; Massumi 2015), affect cuts across, and joins together, bodies human and nonhuman, organic and machine, material and conceptual – bodies of flesh and those of thought (Deleuze 1988: 127; Gatens 2000). Following Spinoza (1992), bodies and their capacities are constantly shaped and modified in their encounters with the world and the other bodies inhabiting it: such encounters may then increase or diminish, affirm or undermine their life forces and potential to act. The notion of networked affect (Paasonen, Hillis and Petit 2015) is a means to address these interconnections as the circulation and oscillation of intensity in the framework of online communication that involves a plethora of actors, from individual users to more or less emergent collective bodies, devices, platforms, applications, companies, files and threads.

Addressing affect as networked positions it as something always already in-between bodies: as something that emerges in encounters between them, shapes these encounters, and animates the bodies involved. Instead of being articulated as an issue of individual capacity or property, affect, understood as networked, is that which makes things matter, gathers attention and, possibly, adds to the individual sense of liveliness as intensity that reverberates with personal embodied histories, orientations and values (Ahmed 2004; Cho 2015). Such a framing does not situate networked affect as either visceral gut reactions specific to the human or as nonhuman pre-personal potentiality. Rather, it allows for an examination of how intensities shape our ubiquitous networked exchanges, how they circulate, oscillate, and become registered as sensation as bodies pass from one state to another.

As Jodi Dean (2010; 2015) argues, the uses of social media are driven by a search for affective intensity that orients and provokes the interest and curiosity of users as they move across platforms, click on links, share and comment, searching for a shiver of interest, amusement, anger or disgust. Intensity, or what Dean discusses as the drive, is what propels these movements across sites and applications. What users encounter on social media platforms, however, are not only other people but equally image and video files, animated GIFs, emojis, comments, algorithms, information architecture and routines of data mining. Although their parameters are of human design, these nonhuman factors curate the shapes that our sociability may take, what we can see and in what kinds of constellations on these platforms – and, perhaps to a degree, how we may feel about these interactions. Sarah Kember and Joanna Zylinska therefore argue that, ‘It is not simply the case that “we” – that is, autonomously existing humans – live in a complex technological environment that we can manage, control, and use. Rather, we are – physically and ontologically – part of the technological environment, and it makes no more sense to talk of us using it, than it does of it using us’ (2012: 13).

Tero Karppi (2015: 225) points out how Facebook, the currently dominant social networking site, aims to provide ‘happy accidents’ through algorithms that are set to render visible things that users may not know to expect or actively search for. Similarly to the ‘like’ buttons, such designed serendipity aims at affective modulation, or amplification (Massumi 2015: 31), in the positive register. The controversial Facebook emotional manipulation study of 2012, conducted by a team of psychologists from Cornell, encapsulates much of this. The experiment involved the news feeds of 689,003 Facebook users, and analysis of some three million posts consisting of 122 million words, without the users’ explicit informed consent (Kramer, Guillory and Hancock 2014). The research team tweaked the algorithms selecting the content visible in users’ news feeds and manipulated them to show more or less positive or negative posts. The overall aim was to assess how this affected the users’ emotional states. Their hypothesis – and finding – was that ‘emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness’ (Kramer, Guillory and Hancock 2014: 8788).

Without further unpacking the limitations or conceptual nuances of this specific study here, it points to the centrality of affective modulation in and for the operating principles of much commercial network media – from social networking sites to online newspapers and clickbait. In other words, affective modulation is inbuilt in, and central to, the production of value as ‘dependent on a socialised labour power organised in assemblages of humans and machines exceeding the spaces and times designated as “work”’ (Terranova 2006: 28). As forms of affective labour, this value production involves the manipulation of affects, social networks, and forms of community alike (Hardt and Negri 2001: 293; also Coté and Pybus 2007). This is an issue of ‘the corporeal and intellectual aspects of the new forms of production’ where ‘labor engages at once with rational intelligence and with the passions or feeling’ (Hardt 2007: xi). Not only do social media ‘produce and circulate affect as a binding technique’ (Dean 2015: 90) to attract returning and loyal users, but affective stickiness is equally intimately tied to the production of monetary value.

Network media involves both personal and collective affective economies (Ahmed 2004) linked to memories, feelings, attachments, monetary value, politics, professions and fleeting titillations. Explorations of networked affect as the fuel for action help in mapping out how online platforms, exchanges and devices matter, as well as that which they affect – the purposes they are harnessed to and the outcomes that they facilitate. Here, any clear binary divides between the rational and the affective, the human and the nonhuman or the user and the instrument used are guaranteed to break down.


Ahmed, S. (2004), The Cultural Politics of Emotion, Edinburgh: Edinburgh University Press.

Cho, A. (2015), ‘Queer Reverb: Tumblr, Affect, Time’, in K. Hillis, S. Paasonen and M. Petit (eds), Networked Affect, Cambridge: MIT Press, 43–58.

Coté, M. and J. Pybus (2007), ‘Learning to Immaterial Labour 2.0: MySpace and Social Networks’, Ephemera: Theory and Politics in Organization 7(1): 88–106.

Dean, J. (2010), Blog Theory: Feedback and Capture in the Circuits of the Drive, Oxford: Polity.

Dean, J. (2015), ‘Affect and Drive’, in K. Hillis, S. Paasonen and M. Petit (eds), Networked Affect, Cambridge: MIT Press, 89–100.

Deleuze, G. (1988), Spinoza: Practical Philosophy, R. Hurley (trans), San Francisco: City Lights Books.

Gatens, M. (2000), ‘Feminism as “Password”: Re-thinking the “Possible” with Spinoza and Deleuze’, Hypatia 15(2): 59–75.

Hardt, M. (2007), ‘Foreword: What Affects Are Good For’, in P. Ticineto Clough and J. Halley (eds), The Affective Turn: Theorizing the Social, Durham: Duke University Press, ix-xiii.

Hardt, M. and A. Negri (2001), Empire, Cambridge: Harvard University Press.

Karppi, T. (2015), ‘Happy Accidents: Facebook and the Value of Affect’, in K. Hillis, S. Paasonen and M. Petit (eds), Networked Affect, Cambridge: MIT Press, 221–234.

Kember, S. and J. Zylinska (2012), Life after Media: Mediation as a Vital Process, Cambridge: MIT Press.

Kramer, A.D.I., J.E. Guillory and J.T. Hancock (2014), ‘Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks’, Proceedings of the National Academy of Sciences of the United States of America 111(24): 8788–8790.

Massumi, B. (2015), The Politics of Affect, Cambridge: Polity.

Paasonen, S., K. Hillis and M. Petit (2015), ‘Introduction: Networks of Transmission: Intensity, Sensation, Value’, in K. Hillis, S. Paasonen and M. Petit (eds), Networked Affect, Cambridge: MIT Press, 1–24.

Spinoza, B. (1992), The Ethics, Treatise on the Emendation of the Intellect and Selected Letters, S. Feldman (ed), S. Shirley (trans), Indianapolis: Hackett.

Terranova, T. (2006), ‘On Sense and Sensibility: Immaterial Labour in Open Systems’, in G. Cox, J. Krysa and A. Lewin (eds), Curating, Immateriality, Systems: On Curating Digital Media, New York: Autonomedia, 27–36.



Filed under affect theory, media studies

Girls and sexual role-play online

Based on Silja Nielsen’s excellent MA research involving a survey with 1269 Finnish female respondents aged 11–18, our article, “Pervy role-play and such*: Girls’ experiences of sexual messaging online,” co-authored with Silja and Sanna Spisak, is now out with the Sex Education journal (online before print), as part of a special issue on the evolving role of media in sex education, edited by Alan McKee, Sara Bragg and Tristan Taormino. While providing an overview of the survey findings in general, the article focuses on girls’ positive accounts of sexual role-play and messaging.


Filed under media studies, porn studies

Networked Affect is out!

Based on a series of workshops at the 2011 Association of Internet Researchers conference in Seattle, Networked Affect is now out from MIT Press. Edited by Ken Hillis, Michael Petit and myself, it also includes essays by James Ash, Alex Cho, Jodi Dean, Melissa Gregg, Kylie Jarrett, Tero Karppi, Stephen Maddison, Jussi Parikka, Jennifer Pybus, Jenny Sundén and Veronika Tzankova. The book explores the intersections of internet research and theories of affect from a range of perspectives — from the queer reverbs of Tumblr to the gift economies of Facebook, nonhuman agencies of code, digital materialities of Steampunk and the political affect of Turkish sexual confession sites. So glad it’s finally out.


Filed under affect theory, media studies