Yesterday an interesting issue made the front page of the Norwegian Aftenposten (at http://www.aftenposten.no/kultur/Aftenposten-redaktor-om-snuoperasjonen–En-fornuftig-avgjorelse-av-Facebook-604237b.html) and for those whose Norwegian is a little rusty, there is an English version at https://www.theguardian.com/technology/2016/sep/08/facebook-mark-zuckerberg-napalm-girl-photo-vietnam-war.
It is something we have seen before. From a technical point of view the initial flag is likely to have been made electronically, but the added blame becomes clear in the quote “Egeland was subsequently suspended from Facebook. When Aftenposten reported on the suspension – using the same photograph in its article, which was then shared on the publication’s Facebook page – the newspaper received a message from Facebook asking it to “either remove or pixelize” the photograph”, which shows that this is an entirely different matter. This is now a censoring engine that is out of control. The demand to ‘either remove or pixelize’ does not cut it, especially when it concerns a historical photo that won a Pulitzer Prize.
I am actually considering that there is more in play. You see, the Atlantic (at http://www.theatlantic.com/technology/archive/2016/05/facebook-isnt-fair/482610/) said it in May when it published “Facebook Doesn’t Have to Be Fair. The company has no legal obligation to be balanced—and lawmakers know it“, which is the title and subtitle, and as such the story is told; politicians like John Thune have experienced how a social network can drown out whatever it wants (within reason). So when you see something trending on Facebook, you must comprehend that it is not an algorithm at work: contracted people guide its creation and, as the Atlantic quotes, “routinely suppressed conservative news“. Yet this goes further than just censorship and news. As the editor of Aftenposten argues (and others with him), Mark Zuckerberg has now become the most powerful editor in the world. He now has nothing less than a sworn duty to uphold freedom of speech to a certain degree, especially when relying on algorithms that are unlikely to cut the mustard on their current track. It also sits uneasily next to the part the Atlantic gave us with the subtitle “The company has no legal obligation to be balanced—and lawmakers know it”, showing Sheryl Sandberg in a ‘who gives a fuck‘ pose. You see, at present Facebook has over 1.7 billion active users, and the acts Facebook stands accused of negatively impact well over 50% of that active user base. Norway might be small, but Facebook is learning that it packs a punch, and when we add India to the mix, the percentage of people alienated by Facebook’s censoring goes up by a lot. So even as blanket rules are applied, their application is proving more and more offensive to too many users, and this level of censorship could hurt the one bottom line every social media site has: its number of users.
So as Mark Zuckerberg tries to build appeal in Asia, he needs to realise that catering to one more nation could have drastic consequences for the users he thinks he already has. Now, we understand that some level of censorship is needed, yet the correct application of it seems to be going the wrong way. Of course this could still all go south, and we would have to get used to logging in to 顔のブック or 脸书; even चेहरे की किताब (all roughly ‘face book’ in Japanese, Chinese and Hindi) is not out of the question. So is that what Zuckerberg needs? I know the US is scared shitless in many ways of that happening, so perhaps overseeing a massive change to the world of censoring is now an important issue. Espen Egil Hansen said it nearly all when he stated that “a troubling inability to “distinguish between child pornography and famous war photographs”, as well as an unwillingness to “allow space for good judgement”” is at the heart of the matter. In that regard, the issue of “routinely suppressing conservative news” remains. When you censor 50% of your second largest user base, it is no longer just a case of free speech or freedom of expression; it becomes a potential case of discrimination, which could have even further-reaching consequences. Even as we sit now, there are lawsuits in play, and the one from Pamela Geller, a person who only seems to be taken seriously by Breitbart News, is perhaps the most striking of all. Her claim (at http://www.breitbart.com/tech/2016/07/13/pamela-geller-suing-facebook/), with the quote “My page “Islamic Jew-Hatred: It’s In the Quran” was taken down from Facebook because it was “hate speech.” Hate speech? Really? The page ran the actual Quranic texts and teachings that called for hatred and incitement of violence against the Jews.”, is a dangerous one. It is dangerous because it sits in the same place as the Vietnam photo: the fact that this is a published religious book makes it important, and the fact that the book is quoted makes it accurate.
The Blaze (at http://www.theblaze.com/stories/2016/01/05/an-israeli-group-created-fake-anti-israel-and-anti-palestinian-facebook-pages-guess-which-one-got-taken-down/) goes one step further and conducted an experiment. The resulting quote is “The day the complaint was filed, the page inciting against Arabs was shut down. The group received a Hebrew language message from Facebook that read, according to a translation via Shurat HaDin, “We reviewed the page you reported for containing credible threat of violence and found it violates our community standards”; the page inciting against Jews was left active.” This indicates that Facebook has a series of issues. One cannot help but wonder whether this is merely bias, or the economic footprint the Muslim world carries when measured against 8 million Israelis, or perhaps against the roughly 16 million Jews globally. With the Aftenposten event, Facebook seems to have painted itself into a corner, and if that reading is correct, several lawsuits could soon force Facebook into a rigorous evaluation and reorganisation of several of its internal and external departments.
Because if content is the cornerstone of social media, the need to keep a clear view of freedom of expression and freedom of speech becomes even more important. For a product that depends on growth, that should have been obvious from the start.
There is however a side that is not addressed by anyone. You might get the idea when you see the Guardian quote “News organizations are uncomfortably reliant on Facebook to reach an online audience. According to a 2016 study by Pew Research Center, 44% of US adults get their news on Facebook. Facebook’s popularity means that its algorithms can exert enormous power over public opinion“. The fact is that Facebook might soon be hiding behind those ‘algorithms‘ as it builds a defence relying on defamation law. In this example I will use the Defamation Act 2005 (Australian law), where section 32 reads:
32 Defence of innocent dissemination
(1) It is a defence to the publication of defamatory matter if the defendant proves that:
(a) the defendant published the matter merely in the capacity, or as an employee or agent, of a subordinate distributor, and
(b) the defendant neither knew, nor ought reasonably to have known, that the matter was defamatory, and
(c) the defendant’s lack of knowledge was not due to any negligence on the part of the defendant.
(2) For the purposes of subsection (1), a person is a “subordinate distributor” of defamatory matter if the person:
(a) was not the first or primary distributor of the matter, and
(b) was not the author or originator of the matter, and
(c) did not have any capacity to exercise editorial control over the content of the matter (or over the publication of the matter) before it was first published.
By relying on algorithms, Facebook could now possibly skirt the issue, yet this can only happen if certain elements fall away; in addition, the algorithm would then become part of the case and debate, muddying the waters further still.
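The structure of that defence is worth spelling out: all three limbs of section 32(1) must hold together, and a platform that exercises editorial control fails the “subordinate distributor” test in section 32(2)(c) before the other limbs are even reached. A minimal sketch of that conjunction (the field names are my own, purely illustrative, not legal terminology):

```python
from dataclasses import dataclass


@dataclass
class Publication:
    # Illustrative fields only; the names are mine, not the Act's.
    was_subordinate_distributor: bool   # s 32(1)(a): merely distributed, did not originate or edit
    knew_or_should_have_known: bool     # s 32(1)(b): knowledge that the matter was defamatory
    ignorance_due_to_negligence: bool   # s 32(1)(c): lack of knowledge caused by own negligence


def innocent_dissemination_defence(p: Publication) -> bool:
    """All three limbs of s 32(1) must be satisfied for the defence to hold."""
    return (p.was_subordinate_distributor
            and not p.knew_or_should_have_known
            and not p.ignorance_due_to_negligence)


# A platform that actively curates or removes content arguably exercises
# editorial control, so it is not a "subordinate distributor" under
# s 32(2)(c) and the defence falls away regardless of the other limbs:
curating_platform = Publication(False, False, False)
print(innocent_dissemination_defence(curating_platform))  # False
```

This is exactly why the algorithm-versus-human question matters to Facebook: human moderation that removes war photographs looks a lot like editorial control, which would knock out limb (a) on its own.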
Hansen does hit the nail on the head with the issues he raises, like “geographically differentiated guidelines and rules for publication”, “distinguish[ing] between editors and other Facebook users”, and a “comprehensive review of the way you operate”. He is not wrong, yet I have to raise the following:
In the first, when you decide to rely on “geographically differentiated guidelines and rules for publication”, you also include rules on who you publish to. This is the first danger for Facebook: its granularity could fall away to some extent, and Facebook advertising is all about global granularity. It is a path Zuckerberg would be very unwilling to tread; open and global are his ticket to some of the largest companies. When this comes into play, players like Coca-Cola and Mars could soon find the beauty of moving some of their advertising funds away from Facebook and towards Google AdWords. I am decently certain that Google would not oppose that move any day soon.
In the second, “distinguish[ing] between editors and other Facebook users” is only part of the path. You see, once we start classifying the user, Facebook could end up having to classify a little too much, making any such distinction an additional worry in regard to discrimination. Twitter faced that mess recently when a certain picture from one newspaper was allowed and another was not. That, and the fact that a woman named Molly Wood (her actual name) was not allowed to use her name as her Facebook name, is a matter for another day.
In the third, the issue of a “comprehensive review of the way you operate” is very much in play. The cases Facebook has faced regarding content and privacy are merely the tip of the iceberg. We can all agree that when it is about sex crimes, people tend to notice, I am speculating mostly because of the word ‘sex’. So when I saw a June reference (at http://www.mrctv.org/blog/facebook-censuring-international-stories-about-rapes-muslim-refugees) stating that Facebook removed a video by Ingrid Carlqvist for the Gatestone Institute, in which she reports a 1,500% increase in rapes in Sweden, I wondered why this had not made the front page of every newspaper in every nation with free speech. The Gatestone Institute is a not-for-profit international policy think tank chaired by former UN Ambassador John Bolton, so not some kind of radicalised front.
In that regard, is any kind of censoring even acceptable?
This case is more apt than you might think when you consider the quotes involved, even if I cannot vouch for the publishing site. We see “Facebook may have been incited to censor this story by a new European Union push in cooperation with Facebook, Twitter, and Google to report incidents of racism or xenophobia to the authorities for criminal prosecution”, supported by the EU text “In order to prevent the spread of illegal hate speech, it is essential to ensure that relevant national laws transposing the Council Framework Decision on combating racism and xenophobia are fully enforced by Member States in the online as well as the in the offline environment. While the effective application of provisions criminalising hate speech is dependent on a robust system of enforcement of criminal law sanctions against the individual perpetrators of hate speech, this work must be complemented with actions geared at ensuring that illegal hate speech online is expeditiously reviewed by online intermediaries and social media platforms, upon receipt of a valid notification, in an appropriate time-frame. To be considered valid in this respect, a notification should not be insufficiently precise or inadequately substantiated“, which was followed by “No matter why Facebook decided to remove Ingrid Carlqvist’s personal page, it doesn’t lessen the fact that this is another example of their political censorship, and their desire to place political correctness over freedom of the press and freedom of expression”.
Now this part has value and weight for the following reason: when we consider the earlier notion of Facebook relying on algorithms, the European Commission (at http://europa.eu/rapid/press-release_IP-16-1937_en.htm) gives us ‘is expeditiously reviewed by online intermediaries and social media platforms, upon receipt of a valid notification, in an appropriate time-frame‘, which could imply that an algorithm will not be regarded as one of the online intermediaries. That means the human element remains, so Facebook cannot rely on the innocent dissemination defence of the Defamation Act, meaning it could end up in hot water in several countries soon enough.
As parting words, let Facebook take heed of the words of Steven Spielberg: “There is a fine line between censorship and good taste and moral responsibility“.