Information fiduciaries, trending topics, and digital gerrymandering – notes on gatekeepers, intermediaries and corporate social responsibility

[Image: Gerrymander vote, elements via https://pixabay.com/en/ballot-election-vote-1294935/ and http://www.dailykos.com/story/2013/2/13/1185175/-Three-Different-Universes-of-Redistricting-Massachusetts-Plus-Revisiting-the-Original-Gerrymander]

In the kind of serendipitous conjunction characteristic of intersecting internet feeds, I came across three related stories almost simultaneously. First, from Silicon Republic:

Facebook denies political bias in hullabaloo over trending topics

Facebook has been forced to deny claims that it employed contractors to manipulate and suppress stories in trending topics of interest to Conservatives in the US.

If there is a problem here, and if there is a legal solution to this problem, then that’s where the second and third stories come in. Via Steve Hedley’s Private Law Theory blog, I read about Jack Balkin’s article “Information Fiduciaries and the First Amendment” 49 UC Davis Law Review 1183 (2016) (UCDLR pdf | SSRN preprint | Balkinization blog post):

… online service providers and cloud companies who collect, analyze, use, sell, and distribute personal information should be seen as information fiduciaries toward their customers and end-users. Because of their special power over others and their special relationships to others, information fiduciaries have special duties to act in ways that do not harm the interests of the people whose information they collect, analyze, use, sell, and distribute. These duties place them in a different position from other businesses and people who obtain and use digital information. And because of their different position, the First Amendment permits somewhat greater regulation of information fiduciaries than it does for other people and entities.

Via an article on the possible influence which Google and Facebook may have upon elections, I learn that Jonathan Zittrain has proposed using this concept to address the problems with digital gerrymandering which I discuss here and here and which Facebook has denied it is doing in editing its trending topics feed. Reports claiming that it was doing so sparked outcry on social media and a possible investigation in the US Congress. It may not be long before there are legal proceedings, perhaps either a class action in the US, or an enforcement action by a European regulator, or both. In any such case, Balkin’s arguments referred to here, Frank Pasquale’s arguments in his prescient The Black Box Society, and Laidlaw’s arguments referred to in an earlier post, are all likely (in the word of a current US political candidate) to figure bigly. Let’s watch this space, especially if that space is the Facebook trending topics feed.

Updates (various): Here’s a selection of further articles on the issues raised in this post that I have subsequently found, and found to be useful:

Social Network Algorithms Are Distorting Reality By Boosting Conspiracy Theories (11 May 2016): “The algorithms that drive social networks are shifting the reality of our political systems—and not for the better. … Algorithms, network effects, and zero-cost publishing are enabling crackpot theories to go viral. The filter bubble — the idea that online recommendation engines learn what we like and thus keep us only reading things we agree with — has evolved.”

Facebook news selection is in hands of editors not algorithms, documents show (12 May 2016): “Guardian Exclusive: Leaked internal guidelines show human intervention at almost every stage of its news operation, akin to a traditional media organization.”

A Bold New Scheme to Regulate Facebook (12 May 2016): “In an unpublished paper that he is writing with Jack Balkin, a Constitutional law professor at Yale Law School, Zittrain recommends that certain massive repositories of user data—like Apple, Facebook, and Google—be offered a chance to declare themselves “information fiduciaries.” An information fiduciary would owe certain protections to its users, closing the “disconnect between the level of trust we place in [online services] and the level of trust in fact owed to us,” according to the paper. The key to this idea? Facebook might opt into this regulation itself.”

Does social media have a censorship problem? (12 May 2016) Ryan McChrystal for Index on Censorship:

Companies like Facebook and Twitter aren’t obliged to host users’ posts, but their efforts to filter our feeds nonetheless seem at odds with the values of free expression. … “Legally we have no right to be heard on these platforms, and that’s the problem,” Jillian C. York, director for international freedom of expression at the Electronic Frontier Foundation, tells Index on Censorship. “As social media companies become bigger and have an increasingly outsized influence in our lives, societies, businesses and even on journalism, we have to think outside of the law box.”

Transparency rather than regulation may be the answer. Back in November 2015, York co-founded Online Censorship, a user-generated platform to document content takedowns on six social media platforms (Facebook, Twitter, Instagram, Flickr, Google+ and YouTube), to address how these sites moderate user-generated content and how free expression is affected online.

Social Media Finds New Role as News and Entertainment Curator (15 May 2016): “Facebook has editors? It does, and it isn’t alone. Most major social media platforms have, in recent years, amassed editorial teams of their own, groups that select, tame and fill gaps in the material produced by users and media companies.”

Here is the news – but only if Facebook thinks you need to know (15 May 2016) John Naughton in the Observer:

Before the internet, our problem with information was its scarcity. Now our problem is unmanageable abundance. So now the scarce resources are attention and time, over which a vicious war has broken out between traditional media and the internet-based upstarts. … Any algorithm that has to make choices has criteria that are specified by its designers. And those criteria are expressions of human values. … If Facebook wants to become a conduit for news, then it has to recognise that it has moved into a different sphere and has acquired new responsibilities. And the publishers who suck up to it should remember Churchill’s definition of appeasement: it’s the process of being nice to a crocodile in the hope that it will eat you last.

Facing the Facebook facts: algorithms are biased too (16 May 2016): “The current scandal reveals a gnawing fear as the realisation dawns that Facebook wields unprecedented power, swiftly becoming the world’s pre-eminent publisher on a scale that was impossible to conceive of just a few years ago. We are just beginning to get a sense of what that power looks like, and we are understandably uncomfortable with what we are faced with.”

I worked on Facebook’s Trending team – the most toxic work experience of my life (17 May 2016): “A former contractor says that while the social media company did not impose political bias upon news ‘curators’, she and other employees were subject to poor management, intimidation and sexism that left them feeling voiceless”.

Algorithms, clickworkers, and the befuddled fury around Facebook Trends (18 May 2016): “So algorithms are in fact full of people and the decisions they make.”

Facebook woos US conservatives with cordial campus tour (19 May 2016): a “meeting between Mark Zuckerberg and Republicans [was] aimed at defusing [the] row over liberal bias [at Facebook]”.

The Real Bias Built In at Facebook (19 May 2016) Zeynep Tufekci in the New York Times:

… algorithms aren’t neutral. … they “optimize” output to parameters the company chooses, crucially, under conditions also shaped by the company. On Facebook the goal is to maximize the amount of engagement you have with the site and keep the site ad-friendly. … This setup, rather than the hidden personal beliefs of programmers, is where the thorny biases creep into algorithms, … The newsfeed algorithm also values comments and sharing.

The first step forward is for Facebook, and anyone who uses algorithms in subjective decision making, to drop the pretense that they are neutral. Even Google, whose powerful ranking algorithm can decide the fate of companies, or politicians, by changing search results, defines its search algorithms as “computer programs that look for clues to give you back exactly what you want.”

But this is not just about what we want. What we are shown is shaped by these algorithms, which are shaped by what the companies want from us, and there is nothing neutral about that.
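Tufekci’s point about parameters is easy to make concrete. What follows is a purely illustrative sketch, not Facebook’s actual code: the Post structure, the engagement_score and rank_feed functions, and all of the weights are invented for this example. But any feed ranked this way will, by design, promote whatever its designers chose to count:

```python
# Hypothetical illustration only; not Facebook's algorithm.
# The weights are invented. The point is that someone chooses them,
# and that choice determines what "trends".

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    comments: int
    shares: int
    likes: int

# The company picks these parameters; changing them changes
# what every user is shown. Nothing about them is neutral.
WEIGHT_COMMENTS = 3.0
WEIGHT_SHARES = 2.0
WEIGHT_LIKES = 1.0

def engagement_score(post: Post) -> float:
    """Score a post by the engagement signals the designer chose to value."""
    return (WEIGHT_COMMENTS * post.comments
            + WEIGHT_SHARES * post.shares
            + WEIGHT_LIKES * post.likes)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by descending engagement score."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("measured policy analysis", comments=5, shares=2, likes=40),
    Post("lurid conspiracy theory", comments=80, shares=60, likes=10),
])
print([p.text for p in feed])  # engagement, not accuracy, decides the order
```

Change any of the three weights and a different story tops the feed; the bias lives in the parameters the company picks, which is exactly Tufekci’s point.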

Eventually, in a blogpost published on 23 May 2016, Facebook came round:

Response to Chairman John Thune’s letter on Trending Topics

By Colin Stretch, Facebook General Counsel

Last week we met with Chairman of the U.S. Senate Commerce Committee John Thune to describe our investigation in response to anonymous allegations of political bias in our Trending Topics feature and to discuss the preliminary results of our work. Today, we sent Chairman Thune a follow up letter setting out our findings and conclusions.

… suppressing political content or preventing people from seeing what matters most to them is directly contrary to our mission and our business objectives and the allegations troubled us deeply. We are proud of the platform and community we have created, and it is important to us that Facebook continues to be a platform for all ideas. … [However, as] part of our commitment to continually improve our products and to minimize risks where human judgment is involved, we are making a number of improvements to Trending Topics … These improvements and safeguards are designed not only to ensure that Facebook remains a platform that is open and welcoming to all groups and individuals, but also to restore any loss of trust in the Trending Topics feature.

Penultimately, “Taming the beasts”, The Economist’s 28 May 2016 take on “Regulating technology companies”, puts this controversy into a broader context: “European governments are not alone in wondering how to deal with digital giants. … Regulators still have much to learn about how to deal with platforms. But they have no choice but to get more expert.”

Finally, Jane Bambauer, “The Relationships between Speech and Conduct” (2016) 49 UC Davis Law Review —, via the Private Law Theory blog: “In this essay, I begin to roll out the implications of Balkin’s relational approach to free speech … in his new article, Information Fiduciaries, …”.

And, finally finally, Martin Moore, “Imagine if Google or Facebook took a line on the EU referendum” (Conversation | Inforrm’s Blog):

The scale and reach of the tech giants and the growing civic roles they play make it inevitable that governments – democratic and otherwise – will respond. … Yet most of these responses are destined to fail. This is for three reasons: democratic governments have not yet adequately defined the problem with these tech giants that they are trying to solve. They are using legislation and policy approaches unsuited to dealing with these tech organisations and their products. And they do not have a vision of where they would like a future digital society to end up.