Category: Cyberlaw

Reality and Illusion in EU Data Transfer Regulation post-Schrems

On the first anniversary of the judgment of the Court of Justice of the European Union in Case C-362/14 Schrems, Professor Christopher Kuner (pictured left), Professor of Law at the Vrije Universiteit Brussel, will give a public lecture on

Reality and Illusion in EU Data Transfer Regulation post-Schrems

The lecture will be held in the Neill Theatre, Trinity Long Room Hub, Trinity College Dublin, on Thursday 6 October 2016, at 1:00pm.

In Case C-362/14 Schrems v Data Protection Commissioner [2015] ECR I-nyr (Grand Chamber, 6 October 2015), the Court of Justice of the European Union invalidated the EU-US Safe Harbour arrangement allowing personal data to be transferred to the US. The judgment is a landmark in the Court’s data protection case law, and illustrates the tension between the high level of legal protection for data transfers in EU law and the illusion of protection in practice. The judgment has undermined the logical consistency of the other legal bases for data transfer besides the Safe Harbour, and reactions to it have largely been based on formalism or data localization measures that are unlikely to provide real protection. Schrems also illustrates how many legal disagreements concerning data transfers are essentially political arguments in disguise. The EU and the US have since agreed on a replacement for the Safe Harbour (the EU-US Privacy Shield), the validity of which will likely be tested in the Court. It is crucial for data transfer regulation to go beyond formalistic measures and legal fictions, in order to move regulation of data transfers in EU law from illusion to reality.

Professor Christopher Kuner is a leading expert on the law of data protection and, in particular, the law governing the international transfer of data. He is Professor of Law and Co-Chairman of the Brussels Privacy Research Hub at the Vrije Universiteit Brussel and Senior Privacy Counsel in the Brussels office of Wilson Sonsini Goodrich & Rosati. He is also a Visiting Professor in the Department of Law in the London School of Economics and Political Science, an associate professor in the Law Faculty of the University of Copenhagen, and an affiliated lecturer and Honorary Fellow of the Centre for European Legal Studies of the University of Cambridge. He is the author of Transborder Data Flows and Data Privacy Law (OUP, 2013) and Editor-in-Chief of International Data Privacy Law (also published by OUP).

Attendance is free, and all are welcome, but booking is essential, so please register at eventbrite.

The lecture is organised by the Ethics & Privacy Working Group of the ADAPT Centre, TCD, in conjunction with the Trinity Long Room Hub, TCD School of Law, TCD School of Religions, Peace Studies and Theology, TCD Library and DCU Institute of Ethics.

Some forthcoming legislation on the administration of justice, cybercrime, education, intellectual property, and privacy

Government Buildings by night, Merrion Square, Dublin; image via Wikipedia

Government Chief Whip Regina Doherty has announced the Government’s Legislation Programme for the Autumn Session 2016 (pdf). It is a considerable update of the programme published last June (pdf) when the government came into office.

The June programme had the feel of a holding document, published to get a new government to the Summer Recess. This programme has a far more substantial feel about it, published to demonstrate the government’s confidence in its capacity to promote and enact legislation.

After the publication of the June programme, I examined proposed legislation from the Department of Education and Skills (here; and see also here), the Department of Jobs, Enterprise and Innovation (here; and see also here and here), and the Department of Justice and Equality (here and here). Under those headings, very little has changed. But there are some notable additions, not least of which is the Interception of Postal Packets and Telecommunications Messages (Regulation) (Amendment) Bill. All we are told is that work is underway on a Bill to “amend various pieces of legislation in respect of electronic communications”. There is no further explanation. This is probably the Bill to provide for further covert surveillance of electronic communications promised by the Minister earlier this Summer. It is also likely to cover incoming requests from overseas for access to data held in Ireland. It may also include preparatory work for the response to the investigations being carried out by retired Chief Justice John Murray and retired Supreme Court judge Nial Fennelly. However, at present, this is just speculation, so we shall have to wait and see what the Department has in mind.

As to the administration of justice, priority legislation to be published by the Department of Justice and Equality this session includes a Bill to make provision for periodic payment orders to replace lump sum damages, and a (hastily-promoted?) Bill to establish the long-awaited Judicial Council. Indeed, that Bill is expected to undergo pre-legislative scrutiny this session, as is a Bill to replace the Judicial Appointments Advisory Board with a new Judicial Appointments Commission – indeed, the cabinet agreed yesterday to bring forward the heads of such a Bill by November. All of these developments are very welcome – provided that the Appointments Bill permits legal academics to apply for appointment to the bench, especially at appellate level. It would not be difficult to draft the necessary legislative provisions, and there is no reason in principle not to do so.

As to cybercrime, first, the busy Department of Justice and Equality is promoting the Criminal Justice (Offences Relating to Information Systems) Bill 2016, to implement Directive 2013/40/EU on attacks against information systems. It is on the Dáil Order Paper, awaiting Second Stage. Second, in the ‘I’ll believe it when (if) I see it’ category is the long-promised and almost long-forgotten Cybercrime Bill to give effect to the Council of Europe Convention on Cybercrime 2001. Yes, you read that right, it’s a 2001 Convention. It is 15 years old, which is a lifetime online.

As to education, legislation envisaged at some stage from the Department of Education, but probably not in this session, includes the Higher Education (Reform) Bill and the longer-threatened Universities (Amendment) Bill (critiqued here, here, here, and here). And the Technological Universities Bill 2015 remains on the Dáil Order Paper, awaiting Committee stage.

As to intellectual property, pre-legislative scrutiny is expected shortly on the Knowledge Development Box (Certification of Inventions) Bill. Heads of a Bill to amend Article 29 of the Constitution to recognise the Agreement on a Unified Patent Court were approved on 23 July 2014, though, in the light of Brexit, a cautious approach for the time being may mean that other Bills progress ahead of it from the Department of Jobs, Enterprise and Innovation to the Oireachtas. Finally, the Copyright and Related Rights (Amendment) (Miscellaneous Provisions) Bill has been “referred to committee” for pre-legislative scrutiny. This is presumably the Joint Committee on Jobs, Enterprise and Innovation. However, the Bill is not in the pre-legislative scrutiny list for this session, so we probably won’t see it in committee before Christmas.

As to privacy, the most important piece of legislation mentioned in the Programme is the Data Protection Bill, to transpose EU Directive 2016/680 and give full effect to the General Data Protection Regulation (Regulation 2016/679). Heads are expected before the end of 2016 (but I’m not holding my breath). A Data Sharing and Governance Bill will be published and sent for pre-legislative scrutiny, to mandate and facilitate lawful data-sharing and data-linking for all public bodies, and a Health Information and Patient Safety Bill will go further in the context of health information. In both cases, the drafting will be tricky, not least because the Bills will have to be compliant with the decision of the Court of Justice of the European Union in Case C-201/14 Bara. The Criminal Records Information System Bill and the Passenger Name Record Bill implement EU obligations. However, in the case of the latter, since there is a challenge before the CJEU in respect of a related measure, a cautious approach for the time being may mean that other Bills progress ahead of it from the Department of Justice and Equality to the Oireachtas.

Finally, it is heartening to see that work has commenced on a Bill to remove blasphemy from the Constitution, and interesting to see active proposals to establish an Electoral Commission and to reduce from 30 years to 20 years the period after which records are transferred to the National Archives.


Harmful Communications and Digital Safety

The Law Reform Commission has today published its long-awaited Report on Harmful Communications and Digital Safety (pdf). It contains 32 recommendations for reform, and includes a draft Harmful Communications and Digital Safety Bill to implement them. In my view, the most important recommendation is the proposal to establish a statutory Digital Safety Commissioner, modelled on comparable offices in Australia and New Zealand. The Commissioner’s function would be to promote digital safety, including an important educational role to promote positive digital citizenship among children and young people. The Commissioner’s role would also include publication of a statutory Code of Practice on Digital Safety, to set out nationally agreed standards on the details of an efficient take-down procedure. As the Law Reform Commission explains:

Under the proposed statutory system, individuals would initially apply directly to a social media site to have harmful material removed in accordance with agreed time-lines: this is similar to the statutory system in place in Australia. If a social media site did not comply with the standards in the Code of Practice, the individual could then appeal to the Digital Safety Commissioner, who could direct a social media site to comply with the standards in the Code. If a social media site did not comply with the Digital Safety Commissioner’s direction, the Commissioner could apply to the Circuit Court for a court order requiring compliance.

The Commission also recommends two new criminal offences, and reforms of existing offences. The new offences relate to publication of intimate images without consent (revenge porn (section 4 below), upskirting and down-blousing (section 5 below)). The reforms relate to the existing offences of harassment, and of sending threatening and intimidating messages, in both cases to ensure that the relevant offences fully capture the most serious types of online harassment, stalking and intimidation. Reflecting a campaign that is gaining increasing momentum in the UK, the Commission also recommends that, in any prosecution for these offences, the privacy of the victim should be protected (though he or she should also be able to waive his or her anonymity). The details of the new offences in the draft Bill are as follows:

Distributing intimate image without consent, or threatening to do so, with intent to cause harm
4(1) A person commits an offence where he or she, without lawful authority or reasonable excuse and in the circumstances referred to in subsection (2), by any means of communication distributes or publishes an intimate image of another person (in this section referred to as the other person) without the consent of the other person, or threatens to do so.
(2) The circumstances are that the person who distributes or publishes the intimate material, or who threatens to do so, does so where—
(a) he or she, by his or her act or acts intentionally or recklessly seriously interferes with the other person’s peace and privacy or causes alarm, distress or harm to the other person, and
(b) his or her act or acts is or are such that a reasonable person would realise that the act or acts would seriously interfere with the other person’s peace and privacy or cause alarm, distress or harm to the other person. …

Taking or distributing intimate image without consent
5(1) A person commits an offence where he or she, without lawful authority or reasonable excuse and in the circumstances referred to in subsection (2), by any means of communication takes, or distributes or publishes an intimate image of another person (in this section referred to as the other person) without the consent of the other person.
(2) The circumstances are that the person who takes, or distributes or publishes the intimate material does so where he or she, by his or her acts seriously interferes with the other person’s peace and privacy or causes alarm, distress or harm to the other person. …


Internet defamation and the liability of intermediaries (Muwema v Facebook part 1)

1. Introduction
The liability of internet intermediaries for defamatory posts on their platforms was central to the decision of Binchy J in Muwema v Facebook Ireland Ltd [2016] IEHC 519 (23 August 2016). A Ugandan lawyer objected to allegedly defamatory posts on a pseudonymous Facebook account, and Binchy J gave an order requiring Facebook to identify the account-holder. However, he declined to grant injunctions requiring Facebook either to remove the posts or to prevent the material in them from being re-posted, on the grounds that Facebook could rely on the defence of innocent publication in section 27 of the Defamation Act 2009 (also here).

On the other hand, in the earlier Petroceltic International plc v Aut O’Mattic A8C Ireland Ltd (High Court, unreported, 20 August 2015, amended 8 September 2015; noted here (pdf) and here (pdf)) (see Irish Independent | Irish Times) Baker J not only gave an order requiring the defendant to identify an account-holder but also granted an injunction requiring the defendant to remove allegedly defamatory posts from a blog hosted on its site. Baker J simply made the relevant orders, whereas Binchy J handed down a full judgment explaining that section 27 was the reason why he refused to award the injunctions against the defamatory posts. Whilst his judgment is likely to be influential, it is nevertheless striking that another judge took the opposite approach, so the matter cannot be regarded as fully settled, and it is appropriate to analyse his reasoning in at least a little detail.

In Muwema, the plaintiff sought three orders against Facebook: (i) to identify the person or persons behind the pseudonymous account, (ii) to take down the allegedly defamatory posts, and (iii) to prevent any similar posts from being reposted. The plaintiff obtained the first, but not the second or third. In this post, I will discuss the issues in this sequence, but I will postpone a detailed analysis of section 27 to the next post.

Information fiduciaries, trending topics, and digital gerrymandering – notes on gatekeepers, intermediaries and corporate social responsibility

In the kind of serendipitous conjunction characteristic of intersecting internet feeds, I came across three related stories almost simultaneously. First, from Silicon Republic:

Facebook denies political bias in hullabaloo over trending topics

Facebook has been forced to deny claims that it employed contractors to manipulate and suppress stories in trending topics of interest to Conservatives in the US.

If there is a problem here, and if there is a legal solution to this problem, then that’s where the second and third stories come in. Via Steve Hedley‘s Private Law Theory blog, I read about Jack Balkin‘s article “Information Fiduciaries and the First Amendment” 49 UC Davis Law Review 1183 (2016) (UCDLR pdf | SSRN preprint | Balkinization blog post):

… online service providers and cloud companies who collect, analyze, use, sell, and distribute personal information should be seen as information fiduciaries toward their customers and end-users. Because of their special power over others and their special relationships to others, information fiduciaries have special duties to act in ways that do not harm the interests of the people whose information they collect, analyze, use, sell, and distribute. These duties place them in a different position from other businesses and people who obtain and use digital information. And because of their different position, the First Amendment permits somewhat greater regulation of information fiduciaries than it does for other people and entities.

Via an article on the possible influence which Google and Facebook may have upon elections, I learn that Jonathan Zittrain has proposed using this concept to address the problems with digital gerrymandering which I discuss here and here and which Facebook has denied it is doing in editing its trending topics feed. Reports claiming that it was doing so sparked outcry on social media and a possible investigation in the US Congress. It may not be long before there are legal proceedings, perhaps either a class action in the US, or an enforcement action by a European regulator, or both. In any such case, Balkin’s arguments referred to here, Frank Pasquale’s arguments in his prescient The Black Box Society, and Laidlaw’s arguments referred to in an earlier post, are all likely (in the word of a current US political candidate) to figure bigly. Let’s watch this space, especially if that space is the Facebook trending topics feed.


Vote Google No 1? Gatekeepers, intermediaries and Corporate Social Responsibility – a footnote

In my earlier post Vote Zuckerberg No 1? Gatekeepers, intermediaries and Corporate Social Responsibility, I adverted once again to the considerable control that large private companies, such as Facebook and Google, can exert over the flow of information, and noted some stories in which Facebook denied seeking to influence the voting intentions of its customers.

Now comes a story by Rory Cellan-Jones (the BBC’s Technology Correspondent) on the BBC News Magazine website which sees Google denying the same allegation:

Six searches that show the power of Google

… So let’s look at the power of Google via six searches.

Search 1: How does Google search work?

Search 2: Trout flies

Search 3: Hotels Tallinn

Search 4: Can Google affect the result of an election
But is there a risk that its market dominance and the sheer power of the Google algorithm could even determine who rules us?

Any search around this topic will throw up articles quoting Dr Robert Epstein, a psychologist at the American Institute for Behavioural Research. He says his research showed that where candidates or parties appeared in search results it could influence elections. “It will shift the opinions of undecided people so dramatically that just being higher in search rankings can win someone an election.”

Image of search for 'can google affect the result of an election' via BBC news magazine

Challenged as to whether Google engineers would really tweak the algorithm to favour one candidate, he says that wouldn’t surprise him, and there was always scope for a rogue employee to do that.

But he says that the worst possibility was that their algorithm could do it. “The computer program is always going to put things into an order and in every election it’s almost certainly going to put one candidate ahead of another.” And that, he says, means elections are being decided by the algorithm.

Google has described Dr Epstein’s research as a “flawed conspiracy theory” and says it has never changed search rankings to manipulate user sentiment.

David Auerbach, a technology columnist at Slate who worked on search at Google for five years, agrees that there is no conscious manipulation going on. But he does see something called emergent bias, where even if Google is producing what it regards as relevant results, “there’s no guarantee that those results are objective because people don’t necessarily think the most objective sources are the most relevant sources.”

Search 5: Mario Costeja Gonzalez

Search 6: Unprofessional Hair

Cellan-Jones concluded that the fact that information is delivered to us by algorithms, which even Google’s engineers don’t fully understand, made him just a little uneasy. And he explored that unease in The Force of Google, a programme on BBC Radio 4 which was broadcast last Tuesday evening and is currently available here on the BBC iPlayer Radio.

Vote Zuckerberg No 1? Gatekeepers, intermediaries and Corporate Social Responsibility

Two things I read today resonated with one another.

First, from The Signal and the Noise (The Economist Special Report on Technology and Politics; pdf; p6):

… online giants, such as Facebook and Google, … know much more about people than any official agency does and hold all this information in one virtual place. It may not be in their commercial interest to use that knowledge to influence political outcomes, as some people fear, but they certainly have the wherewithal. …

Second, from Gizmodo:

Facebook has declared it will never use its product to influence how people on the platform vote. Earlier today, Gizmodo reported that employees had asked Mark Zuckerberg to answer the question, “What responsibility does Facebook have to help prevent President Trump in 2017?” in an internal poll.

In a statement to the Hill and Business Insider, Facebook said:

Voting is a core value of democracy and we believe that supporting civic participation is an important contribution we can make to the community. We encourage any and all candidates, groups, and voters to use our platform to share their views on the election and debate the issues. We as a company are neutral — we have not and will not use our products in a way that attempts to influence how people vote.

In the earlier Gizmodo story, UCLA law professor Eugene Volokh explained that Facebook has no legal responsibility to give an unfiltered view of what’s happening:

Facebook can promote or block any material that it wants. Facebook has the same First Amendment right as the New York Times. They can completely block Trump if they want … or promote him.

I have discussed on this blog the considerable control that large private companies, such as Facebook and Google, can exert over the flow of information (see, eg, here | here | here | here | here). Concerns about the tricky issue of the transparency of the algorithms used by such companies to control money and information have recently led the FTC to establish the Office of Technology Research and Investigation:

The Office … is located at the intersection of consumer protection and new technologies … and its work supports all facets of the FTC’s consumer protection mission, including issues related to privacy, data security, connected cars, smart homes, algorithmic transparency, emerging payment methods, fraud, big data, and the Internet of Things.

In her superb new book Regulating Speech in Cyberspace: Gatekeepers, Human Rights and Corporate Responsibility (Cambridge UP, 2015; summary), Emily Laidlaw argues that these digital developments need a new system of human rights governance that takes account of private power, in particular by incorporating principles of corporate social responsibility. In the context of whether to seek to influence the voting intentions of its customers, Facebook has undertaken to do the responsible thing. But what about existing algorithms which seek accuracy at the expense of diversity in political viewpoints visible to the users? Worse, what about other companies that might not be as scrupulous or responsible? In such circumstances, we could do worse than to take Laidlaw’s prescription on board.

The privations of privacy: from dystopia to dysaguria

I spoke yesterday evening at a Data Protection Day event in Trinity College Dublin. The theme was “What does the Internet say about you?”. It was organized by the Information Compliance Office and the Science Gallery in Trinity. Jessica Kelly of Newstalk introduced and chaired the event. You can download audio of my talk here (via SoundCloud), and you can download slides for my talk here (via SlideShare).

I was full of my usual caffeine-deprived doom about the challenges which technology poses for privacy. Jeanne Kelly, a partner in the Dublin solicitors’ firm of Mason Hayes & Curran, spoke about the still-pending EU Data Protection Regulation. Conor Flynn, principal of Information Security Assurance Services, spoke about our digital footprints. And Sinéad McSweeney, Director of Public Policy for Europe, the Middle East and Asia, at Twitter, talked about Twitter’s foundational commitments to freedom of expression and individual privacy. The evening was recorded for podcast, and I’ll blog about those presentations when the podcast is available.

In this post, I want to mention one point which I made near the end of my talk. I coined a new word – the last word in the title to this blogpost. In his speech on leaving the US Presidency in January 1961, Eisenhower warned against the growing power of the military-industrial complex. In modern surveillance terms, we might term this the security-corporate complex. And we already have a word for when the military/security state goes bad, as illustrated in George Orwell’s 1984, Aldous Huxley’s Brave New World, Ray Bradbury’s Fahrenheit 451, and Anthony Burgess’ A Clockwork Orange. That word is “dystopia”.

However, we don’t have a word for when the industrial/corporate society goes bad, as illustrated in Dave Eggers’ The Circle (cover pictured top left). I think it’s beyond time we had one, and the derivation of “dystopia” provides a guide. It was coined as a counterpoint to “utopia”, devised by Thomas More, initially to describe “nowhere”, from the Greek “ou” = “not” and “topos” = “place”, and now to describe the “perfect state”, from “eu” meaning “good”, and “topos” meaning “place”. Reflecting this derivation, John Stuart Mill devised “dystopia” as “frightening state”, from the Greek “dys” meaning “bad”, and (again) “topos” meaning “place”.

I suggest that we need a word for “frightening company”, and that we can devise one by following the lead provided by More and Mill. Let’s keep “dys” as the prefix, and look for a suitable word to which to add it. Greek provides “aguris”, which means “crowd” or “group” (or “gathering”, “assembly” or “marketplace”, and which has already lent other words to modern English). Hence, from “dys” meaning “bad”, and “aguris” meaning “crowd” or “group”, I suggest “dysaguria”, as a noun meaning “frightening company”, and “dysagurian” as the adjective to describe that company.

Indeed, “dysaguria” is the perfect noun and “dysagurian” is the perfect adjective to describe the eponymous company in Dave Eggers’ The Circle. It’s not in the same league as Orwell, or Huxley, or Bradbury, or Burgess. But it does raise very important questions about what could possibly go wrong if one company controlled all the world’s information. In the novel, the company operates according to the motto “all that happens must be known”; and one of its bosses, Eamon Bailey, encourages everywoman employee Mae Holland to live an always-on (clear, transparent) life according to the maxims “secrets are lies”, “sharing is caring”, and “privacy is theft”. Eggers’s debts to dystopian fiction are apparent. But, whereas writers like Orwell, Huxley, Bradbury, and Burgess were concerned with totalitarian states, Eggers is concerned with a totalitarian company. However, the noun “dystopia” and the adjective “dystopian” – perfect though they are for the terror of military/security authoritarianism in 1984, or Brave New World, or Fahrenheit 451, or A Clockwork Orange – do not to my mind encapsulate the nightmare of industrial/corporate tyranny in The Circle. On the other hand, “dysaguria” as a noun and “dysagurian” as an adjective, in my view, really do capture the essence of that “frightening company”.

And now, finally armed with an appropriate word, I can sum up the theme of my talk yesterday evening. In much the same way that we must be vigilant against a military/security dystopian future, we must also be on our guard against an industrial/corporate dysagurian one.