Tuesday, September 13, 2016

Nuance, culture, society and Facebook

On September 9th, 2016, Aftenposten editor Espen Egil Hansen published a video addressing Mark Zuckerberg. In it he explained why you can't treat all images the same. The picture he was talking about is the famous one that quite possibly turned the tide of the coverage of the Vietnam War: a nine-year-old girl running naked down a road, flanked by armed soldiers, her back burned in a napalm attack. The author Tom Egeland had published this image in connection with a discussion of strong and shocking press photography. Facebook deleted his posts. Others posted it. It got deleted. Egeland posted it over and over again, and was eventually shut out of Facebook. A Norwegian expert on freedom of speech, Anine Kierulf, made a long post with several different nude images, some from art, some photographs, and asked Facebook which of them were acceptable. The whole thing got deleted. The editor of Nettavisen, an online newspaper, wrote about the case on Facebook and was blocked as well. Why? It's a nude picture of a child, and as we know, Facebook really has problems with nudity. Ask any woman who has posted a picture of her own breasts, whether with a slightly erotic overtone, while breastfeeding, or to discuss scars after breast surgery. Even the chests of women with no breasts at all, both removed in surgery, are too daring for Facebook.

Hansen's rant against Facebook caught the attention of several large newspapers. The Guardian wrote about it, Time Magazine wrote a piece and made a video about it, and The Washington Post let us all know that Facebook had changed its mind. It didn't hurt the cause that Facebook also deleted Norway's Prime Minister Erna Solberg's post when she shared the picture.

This case is very interesting from several different angles. Questions kept popping up in my Facebook feed (very meta, that). Is Facebook just a platform, or also a news provider, and as such, what kind of editorial responsibility does it have? At what point does a private platform turn into a public one? Is Facebook now so big that governments should look into how it practices freedom of speech, and should it be subject to the same kind of scrutiny and discussion about censorship and freedom of speech that nation states and national media are subject to? This case is a very good starting point for those discussions, and they will be revisited in media research for years to come.

My take on it today, however, concerns the importance of education: cultural, historical, social. So far we have no algorithm that can recognize the kind of nuance needed to distinguish the picture of a woman bravely sharing her post-op scars from pornographic titillation. To a human being with the least sensitivity to images, context, stance and position, the difference is very, very clear. And unless we accept that there is a difference, and that this difference is important, we will never be able to develop better tools, nor to educate the humans who sit in the key positions where that kind of evaluation is made. The technology will keep being stupid, and our application of it will be even more so, to the point of being dangerously oppressive. Because oppression is what we get in a society that refuses to acknowledge nuance, context and shifting circumstances.

In short: this is why we need humanists and social scientists in tech-related workplaces, schools, education and research. Somebody needs to understand the difference between a revolutionary war photo and kiddie porn. It is, clearly, much harder than we thought.
