On the first anniversary of the judgment of the Court of Justice of the European Union in Case C-362/14 Schrems, Professor Christopher Kuner, Professor of Law at the Vrije Universiteit Brussel, will give a public lecture on
Reality and Illusion in EU Data Transfer Regulation post-Schrems
The lecture will be held in the Neill Theatre, Trinity Long Room Hub, Trinity College Dublin, on Thursday 6 October 2016, at 1:00pm.
In Case C-362/14 Schrems v Data Protection Commissioner ECR I-nyr (Grand Chamber, 6 October 2015), the Court of Justice of the European Union invalidated the EU-US Safe Harbour arrangement allowing personal data to be transferred to the US. The judgment is a landmark in the Court’s data protection case law, and illustrates the tension between the high level of legal protection for data transfers in EU law and the illusion of protection in practice. The judgment has undermined the logical consistency of the other legal bases for data transfer besides the Safe Harbour, and reactions to it have largely been based on formalism or data localization measures that are unlikely to provide real protection. Schrems also illustrates how many legal disagreements concerning data transfers are essentially political arguments in disguise. The EU and the US have since agreed on a replacement for the Safe Harbour (the EU-US Privacy Shield), the validity of which will likely be tested in the Court. It is crucial for data transfer regulation to go beyond formalistic measures and legal fictions, in order to move regulation of data transfers in EU law from illusion to reality.
The June programme had the feel of a holding document, published to get a new government to the Summer Recess. This programme has a far more substantial feel about it, published to demonstrate the government’s confidence in its capacity to promote and enact legislation.
As to the administration of justice, priority legislation to be published by the Department of Justice and Equality this session includes a Bill to make provision for periodic payment orders to replace lump sum damages, and a (hastily-promoted?) Bill to establish the long-awaited Judicial Council. Indeed, that Bill is expected to undergo pre-legislative scrutiny this session, as is a Bill to replace the Judicial Appointments Advisory Board with a new Judicial Appointments Commission – indeed, the cabinet agreed yesterday to bring forward the heads of such a Bill by November. All of these developments are very welcome – provided that the Appointments Bill permits legal academics to apply for appointment to the bench, especially at appellate level. It would not be difficult to draft the necessary legislative provisions, and there is no reason in principle not to do so.
As to education, legislation envisaged at some stage from the Department of Education, but probably not in this session, includes the Higher Education (Reform) Bill and the long-threatened Universities (Amendment) Bill (critiqued here, here, here, and here). And the Technological Universities Bill 2015 remains on the Dáil Order Paper, awaiting Committee stage.
As to privacy, the most important piece of legislation mentioned in the Programme is the Data Protection Bill, to transpose the EU Directive 2016/680 and give full effect to the General Data Protection Regulation (Regulation 2016/679). Heads are expected before the end of 2016 (but I’m not holding my breath). A Data Sharing and Governance Bill will be published and sent for pre-legislative scrutiny, to mandate and facilitate lawful data-sharing and data-linking for all public bodies, and a Health Information and Patient Safety Bill will go further in the context of health information. In both cases, the drafting will be tricky, not least because the Bills will have to be compliant with the decision of the Court of Justice of the European Union in Case C-201/14 Bara. The Criminal Records Information System Bill and the Passenger Name Record Bill implement EU obligations. However, in the case of the latter, since there is a challenge before the CJEU in respect of a related measure, a cautious approach for the time being may mean that other Bills from the Department of Justice and Equality progress ahead of it to the Oireachtas.
Finally, it is heartening to see that work has commenced on a Bill to remove blasphemy from the Constitution, and interesting to see active proposals to establish an Electoral Commission and to reduce the period for the transfer of records to the National Archives from 30 years to 20 years.
Security and freedom: do we need both?
And can we enjoy both without the pursuit of one jeopardising the other?
Prof Haker is a member of the European Group on Ethics in Science and New Technologies (EGE), which advises the European Commission on ethical issues. On 20 May 2014, the EGE submitted to the Commission their Opinion no 28 on “Ethics of Information and Communication Technologies”. In an era where rapid advances in telecommunications and computing have enabled the data of billions of citizens around the globe to be tracked and scrutinized on an unprecedented scale, the Opinion aims to provide a reference point for the Commission regarding the ethics of security and surveillance measures.
Building upon the Opinion, in this lecture, Prof Haker considers the tensions between security and freedom. Having been broadened (notably by the UN) to describe the general human need and right to secure one’s well-being, the concept of ‘security’ has recently been narrowed down again, in light of terrorist and criminal activities. One area of concern is the expansion of surveillance technologies. The ‘re-turn’ to the narrow security concept has been framed as a necessary ‘trade off’ between security and freedom.
Prof Haker will consider whether this ‘framing’ is appropriate, whether surveillance undermines trust as a condition for social cooperation, and whether surveillance technologies will affect us both in our personal relationships and in our presence in the public sphere. And she will argue that a new ‘social contract’ is needed that not only readjusts the political control of individuals but also critically examines the role of companies promoting security and surveillance technologies in comparison with other socio-economic efforts to create security for human beings.
Professor Hille Haker holds the Richard McCormick S.J. Chair of Ethics at Loyola University Chicago. She is a member of the EGE. From 2003 to 2005, she was Associate Professor of Christian Ethics at Harvard University, Cambridge, USA, and from 2003 to 2009 Chair of Moral Theology and Social Ethics at the University of Frankfurt, Germany.
I want Britain to be the best place to raise a family. … Where children are allowed to be children. … Protecting the most vulnerable in our society; protecting innocence; protecting childhood itself. … I will do whatever it takes to keep our children safe.
… viewing graphic and violent pornographic material online is extremely harmful to children and we believe strongly that introducing such filters in Ireland is an option worth at least some serious consideration.
In my previous post, I argued that, as a matter of principle, the controversial American anti-Islamic video should not be censored. The most obvious form of censorship comes from government action, such as legislation banning speech, but that does not arise in this case. Less obvious, but no less insidious, was the White House request to Google to re-consider whether the video breached YouTube rules. This was not a formal ban, and Google declined to take the video down in the US, but it did block access to it in Egypt and Libya. This raises two important questions about the structure of free speech. First, in the online world, where most of us access the internet through a range of intermediaries, government censorship does not necessarily need to target the disfavoured speech; it need only target the intermediaries. Very few US companies would feel able to decline a request like that from the White House, and Google are to be commended for standing firm in those circumstances. Second, these intermediaries now have a great deal of practical power over online expression – not only can they be co-opted by government as agents of state censorship, but they also have the capacity to act as censors in their own right, as Google did in their unilateral action to block access in the Middle East.
Such intermediaries are effectively gatekeepers: they enable – and control – our access to online information, and this raises profound issues of principle about the role of intermediary gatekeepers in the structure of free speech, about which I have written on this blog (here | here | here). At present, such intermediary gatekeepers are all private entities, operating to their own rules, and it is not at all clear how they can be made accountable to their users or the wider public for their private actions. Given the practical, social and legal issues that arise in policing content in such a quasi-public sphere (pdf), it has been argued that search engines and other intermediaries should have public interest obligations, perhaps by analogy with common law duties that govern public utilities (pdf). In particular, free speech norms should not only be about protecting speakers against a heavy-handed state but also about protecting speakers and readers against heavy-handed intermediary gatekeepers.
What happens when people with a presence in cyberspace (really) die? Does the presence become an absence? What do their survivors do about their activities in cyberspace? How do they deal with online assets, or even discover real-world assets that may be locatable only online? How do estate trustees and executors carry out their legal duties with respect to these assets, and the liabilities as well? Many people get most of their bills in electronic form. How can an executor see to paying the debts of the deceased? …
Whilst the Special Rapporteur’s conclusions are nuanced in respect of blocking sites or providing limited access, he is clear that restricting access completely will always be a breach of article 19 of the International Covenant on Civil and Political Rights, the right to freedom of expression.
But not everyone agrees with the United Nations’ conclusion. Vinton Cerf, a so-called “father of the internet” and a Vice-President at Google, argued in a New York Times editorial that internet access is not a human right:
The best way to characterize human rights is to identify the outcomes that we are trying to ensure. These include critical freedoms like freedom of speech and freedom of access to information — and those are not necessarily bound to any particular technology at any particular time. Indeed, even the United Nations report, which was widely hailed as declaring Internet access a human right, acknowledged that the Internet was valuable as a means to an end, not as an end in itself.
First of all, and perhaps most importantly, I didn’t like the headline, which stated baldly and boldly that ‘Internet Access is not a Human Right’. Regardless of whether you agree or disagree with that statement, the piece said a great deal more than that …
Secondly, I think the point that he makes leading to this headline, and to his conclusions, reflects a particularly US perspective on ‘human rights’ – a minimalist approach which emphasises civil and political rights and downplays (or even denies) economic and social rights amongst others. … We need to be very careful about the assumptions we make about any human right – and that, in practice, many of what we consider to be human rights are instrumental, qualified, or contextual rather than absolute, pure and simple.
… another thing that disappoints me about Cerf’s Op Ed piece [is that he] doesn’t mention privacy, he doesn’t mention freedom from censorship, he doesn’t mention freedom from surveillance – I wish he would, because next after access these are the crucial enablers to human rights, to use his terms.
I wish Cerf had seen the excellent presentation at AALS on cyberlaw and the internet kill switch, which was organized by Annemarie Bridy and included fellow bloggers Rob Heverly, Michael Froomkin, and Jack Balkin. As Balkin noted, “new school censorship” is constantly shifting; Cerf’s confidence that abstract categories like “freedom of speech” could identify it all is more blinkered than the rapporteur’s endorsement of concrete modes of realizing communicative autonomy. Heverly drew on the literature of cyborgs to demonstrate how intimately connected personal identities can be with the machines and technologies in which they are embedded. As Julie Cohen argues, we are “networked selves,” and need “greater control over the boundary conditions that govern flows of information to, from, and about” us.