Our book project, #NSFW, is moving more into the writing phase with Kylie Jarrett and Ben Light, which is exciting indeed. So here’s a little something towards a section on nudity and social media.
On February 9, 2016, visual artist Illma Gore published a pastel pencil painting of Donald Trump in the nude, titled “Make America Great Again,” on her Facebook page with the accompanying slogan, “Because no matter what is in your pants, you can still be a big prick.”
Inspired by the highly public, and broadly discussed, allusions to Trump’s penis size during the Republican presidential primary debates, the drawing featured the candidate with genitalia of markedly modest size. The image soon began circulating on other social media platforms from Instagram to Tumblr, Twitter, and Snapchat. By February 11, it was featured with NSFW warnings in media outlets such as The Huffington Post (“Artist Imagines What Donald Trump Looks Like Naked And It Ain’t Pretty (NSFW)”) and The Daily Dot (“Realistic nude painting of Donald Trump will make you gouge your eyes out”), with many other stories to follow.
Very quickly after posting the initial image, Gore’s Facebook account was temporarily suspended for violating the service’s community standards, and it has been blocked numerous times since. According to these standards:
“We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous or satirical purposes. Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed.” (https://m.facebook.com/communitystandards/?section=1)
Illma Gore’s portrait of Trump in the nude did not in fact violate these terms, as she had covered Trump’s genital area with a black block (Hoffman 2016). On March 3, she published an eBay listing for the piece, only to have it taken down a few days later for violating the service’s policy on images of nudity, according to which “frontal nudity is allowed in Art categories when the item is considered fine art, such as Michelangelo’s David, vintage pin-up art, Renaissance-style paintings, and nude cherubs.” Works not fitting these parameters should be listed in the Adult Only Category. (http://pages.ebay.com/help/policies/adult-only.html.) Within a week, Gore had been threatened with lawsuits by Trump’s team and risked having her Facebook account permanently blocked. (Voon 2016.)
The image itself is well suited to news outlets and click sites alike in its visceral display of folding naked celebrity flesh. News items on the acts of blocking, banning, and potential censorship connected to it—and to Facebook in particular—invested the incident with a different kind of sticky attention value. In the course of all this, the image gained notable virality as people shared news items, most of them featuring an uncensored version of the portrait that consequently found its way into extensive distribution on Facebook. To further fuel its dispersion, Gore put a high-resolution image file on her website for free download. By April 2016, the portrait was on display at the Maddox Gallery in London, priced at £1m (http://www.theguardian.com/us-news/2016/apr/17/nude-donald-trump-painting-illma-gore-lawsuits). All this testifies to the intermeshing of attention value and monetary value through and within the fast speeds of social media platforms, online news outlets, links, and clicks, as debated in the context of the attention economy, even if this is not the key issue here.
The image in question, situated in the realm of arts and politics alike, is one in a long, constantly accumulating stream of incidents testing and challenging the community standards of Facebook. At the same time, the incident is also somewhat exceptional in that it focused on the nude male body of a celebrity politician. Other controversies to date have predominantly focused on images of female bodies, as in the contexts of breastfeeding and breast cancer. In 2012, a group of women gathered in front of Facebook headquarters in a collective breastfeeding protest opposing the company’s policy of flagging and removing images of nursing. In a 2013 controversy, over 20,000 people signed a Change.org petition protesting the removal of mastectomy images as obscene (and hence as comparable to pornographic content). Facebook community standards have since undergone revisions. According to the March 2016 version,
“We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous or satirical purposes.”
Nudity occupies tricky terrain on the borderline between the SFW and the NSFW, especially since the criteria regulating obscenity are, owing to the largest social media platforms’ country of origin, overwhelmingly defined according to standards particular to U.S. culture. Art remains a space where nudity is not automatically conflated with obscenity, yet terms of service routinely map out its confines and discontents in notably conservative terms. On eBay, this involves fencing off the obscene against Renaissance-style aesthetics and cherubs, even while the normative boundaries of art are transgressed by including vintage pin-ups within the category. For its part, Facebook recognizes paintings and sculptures, but not digitally generated images, as art, hence articulating the boundaries of art in highly medium-specific terms.
Image recognition software for automatically filtering out pornographic images has been developed since the 1990s (e.g. Ries and Lienhart 2014; Iqbal et al. 2016), yet Gore’s blocked Facebook access most likely owed to other users reporting the image as offensive, and to the company representatives screening the content agreeing with them (cf. Summers 2009). As a mechanism for reporting offensive content, flagging, as a “technical rendition of ‘I object,’” is based on user participation in the reification of social media sites’ community norms (Crawford and Gillespie 2014, 2, 5).
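Early automated filters of the kind referenced above typically relied on simple skin-tone heuristics rather than today’s machine-learned classifiers. A minimal sketch of that approach (the RGB rule and the flagging threshold below are illustrative values drawn from widely circulated skin-detection heuristics, not from any of the systems cited here):

```python
def is_skin(r, g, b):
    """Classic explicit RGB skin-colour rule (illustrative thresholds)."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def looks_nsfw(pixels, skin_threshold=0.4):
    """Flag an image as potentially NSFW when the share of skin-coloured
    pixels exceeds a (hypothetical) threshold."""
    if not pixels:
        return False
    skin_count = sum(1 for p in pixels if is_skin(*p))
    return skin_count / len(pixels) > skin_threshold
```

Such heuristics notoriously misfire on portraits, beach photographs, and indeed nude paintings, which is one reason platforms pair automated detection with user flagging.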
Kate Crawford and Tarleton Gillespie (2014, 2) show how flagging has become a ubiquitous mechanism of governance with a somewhat complex relation to community sentiment, consisting as it does of interactions between “users, platforms, humans, and algorithms, as well as broader political and regulatory forces.” These are themselves inseparable from moral concerns and corporate strategies, and routinely result in opaque decisions detached from broader negotiations or articulations of concern over what is acceptable, offensive, or controversial (Crawford and Gillespie 2014, 4, 10).
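The governance mechanism Crawford and Gillespie describe can be caricatured as a threshold rule: user flags accumulate against a piece of content until it is escalated to human review, whose outcome remains opaque to the flaggers. A toy sketch, with a hypothetical class name and threshold rather than any platform’s actual logic:

```python
from collections import Counter

class FlagGovernance:
    """Toy model of flag-driven moderation (all values hypothetical)."""

    def __init__(self, review_threshold=3):
        self.review_threshold = review_threshold
        self.flags = Counter()    # post id -> number of user flags
        self.review_queue = []    # posts escalated to human moderators

    def flag(self, post_id):
        """Register one user's 'I object'; escalate once, at the threshold."""
        self.flags[post_id] += 1
        if self.flags[post_id] == self.review_threshold:
            self.review_queue.append(post_id)
```

What happens to posts in the review queue is decided off-stage, which is precisely the opacity the critique targets.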
In another March 2016 incident, Mari Corry’s photo showing her breastfeeding in a park was flagged as offensive. In protest, she uploaded a breastfeeding photo in which the baby’s head was covered with a print featuring a Victoria’s Secret model, hence commenting on the double standards concerning women’s breasts and the multiple purposes of their public visibility (Wallwork 2016). The previous month, Rowena Kincaid, terminally ill with breast cancer, uploaded a picture of her symptoms in order to help other women self-diagnose. Since the image included her nipple, it was soon flagged for violating community standards. She then reposted the image with a smiley drawn over the nipple. As was the case with Corry’s and Gore’s images, news of the flagging fuelled the images’ social circulation across diverse platforms. Flagging is a sign of objection, but it also involves more complex social dynamics and exchanges, many of which are inseparable from increases in attention value:
“Flags get pulled as a playful prank between friends, as part of a skirmish between professional competitors, as retribution for a social offense that happened elsewhere, or as part of a campaign of bullying or harassment—and it is often impossible to tell the difference. Flagging is also used to generate interest and publicity, as in the cases where YouTube put a racy music video behind an age barrier and promoters decried the “censorship” with mock outrage. The fact that flagging can be a tactic not only undercuts its value as a “genuine” expression of offense, it fundamentally undercuts its legibility as a sign of the community’s moral temperature.” (Crawford and Gillespie 2014, 11.)
The persistent hunt for offensive areolas and micro-penises in the incidents discussed above is connected to broader boundary maintenance between SFW and NSFW platforms. While Twitter, and Tumblr in particular, broadly accommodate sexually explicit content, most social media platforms go to considerable effort to remove it. This, combined with the increasing centralization of ownership among dominant actors (such as Google and Facebook), has drastically reframed the operating possibilities of companies trading specifically in pornographic content. In his coverage of the adult app store Mikandi, Wired reporter Cade Metz notes that
“with the rising power of companies like Apple and Google and Facebook, the adult industry doesn’t drive new technology. In many respects, it doesn’t even have access to new technology. The big tech companies behind the big platforms control not only the gateway services (the iPhone app store, Google Search, the Facebook social network) but the gateway devices (the iPhone, Android phones, Google Chromecast, the Amazon Fire TV, the Oculus Rift virtual reality headset). And for the most part, they’ve shut porn out. Besides, these giants now drive new technology faster than services like Mikandi or Pornhub ever could.” (Metz 2015.)
Porn sites—the market leader Pornhub being an obvious example—are currently much more likely to emulate the technical solutions and revenue models of social media platforms than the other way around. Pornographic aggregator sites from RedTube to YouPorn and XTube have all copied their design and operating principles from YouTube. Meanwhile, those modelled after other social outlets, such as Fuckbook and Snatchly, “the Pinterest of Porn,” have not taken off in a similar fashion.
All this is in stark contrast to the situation some two decades ago, given the degree to which the needs of the porn industry drove the development of Web solutions. Gaming and online shopping only picked up towards the end of the 1990s, and for quite a while pornography remained one of the few forms of content that users would pay for. Consequently, safe credit card processing systems, streaming video technologies, and hosting services, as well as practices such as banner advertisements and pop-ups, were first developed for and used on porn sites. The role of porn as a driving force in dot-com enterprise has clearly since passed. While pornographic content still quickly migrates to new technical platforms and media formats, its position is crucially different in the context of social media than it was in the Web cultures of the 1990s.
Pornographic content, and even less sexually explicit nudity, is weeded out from platforms such as YouTube and Facebook through flagging and automated blocking alike: it is not possible to share links directly to pornographic content. Adult entertainment companies maintain a public social media presence, yet this mainly involves sticking to the non-explicit, and hence failing to represent the brands’ core features, in order to accommodate diverse terms of service. In addition to key social media companies warding off pornography, Google, YouTube’s owner, also bans pornography from its advertisements:
“The AdWords policies on adult sexual services (…) will be updated in late June 2014 to reflect a new policy on sexually explicit content. Under this policy, sexually explicit content will be prohibited, and guidelines will be clarified regarding promotion of other adult content. The change will affect all countries. We made this decision as an effort to continually improve users’ experiences with AdWords.” (https://support.google.com/adwordspolicy/answer/4271759?hl=en&ref_topic=29265)
Search engines have long filtered out pornography from their freely published listings of most popular search terms, hence adding to the position of porn as a public secret of sorts. Search engines also filter out sexually explicit content in diverse ways, which, in practice, makes it difficult for users to find what they are searching for. While this could, for evident reasons, be considered deliberate design of a poorer service, such measures are articulated and motivated as improvements in user experience (as in the AdWords policy statement above).
Google has long provided a “SafeSearch” option that filters out all, or at least most, hits to adult content. As that which SafeSearch filters out, sexually explicit content becomes framed as unsafe and risky. Should the user choose to disable SafeSearch, Google will “provide the most relevant results for your search and may include explicit content when you search for it.” Searches on pornography can be openly tracked through Google Trends, testifying to their perennial and even increasing popularity, yet without breaking these trends down into actual numbers of searches. All in all, the uneasy visibility of nudity and pornographic content in Web searches and on social media platforms speaks of a gap between the ideal, normative figure of a user—in accordance with which the services’ “community standards” are crafted—and the diverse, inappropriate interests and unruly titillations of empirical, actual people that routinely veer towards the NSFW.