Neither a pretty face nor a beautiful game — of football pitches, data protection impact assessments, artificial intelligence, facial recognition, and closed-circuit television surveillance

[Image: CCTV at a Chinese playing pitch]

I read this morning that, in a wide-ranging letter to Congress on racial justice reform, IBM CEO Arvind Krishna wrote that IBM will no longer offer “general purpose facial recognition or analysis software”. Of course, “general purpose” is doing a lot of work in that sentence. But let’s see where it goes. [Update]: Two days later, Amazon followed with a one-year moratorium on police use of their facial recognition technology, to give Congress enough time to implement appropriate rules. Again, let’s see where this goes. [End update]

These developments reminded me of a recent local story about Dublin City Council. Not content with seeking to post freeze-frame closed-circuit television (CCTV) images of people dumping their rubbish in litter black-spots, in the hope of shaming them or others into desisting from doing so in the future, now they want CCTV cameras with facial-recognition capabilities. Nearly three months ago (in the world just before lockdown) Sean Finnan reported in the Dublin Inquirer [with added links]:

Council Installed Cameras with Facial Recognition on Football Pitch

Before the refurbishment of the football pitch at Bluebell Road in the west of the city, it was an anti-social blackspot, says Michael O’Shea, chairman of Inchicore Athletic FC. … After a refurbishment in November 2018 for Dublin City Council, a new artificial astroturf pitch was installed, as well as floodlighting and a new high-tech closed-circuit television (CCTV) security system. … Installed on the football pitch are Hikvision cameras, embedded with what is called “deep learning” technology, AI that trains on people visiting the pitch. … The technology is not something that the council should be using without carrying out a Data Protection Impact Assessment, says Elizabeth Farries of the Irish Council for Civil Liberties. … [And I said that there] may be a good reason for the cameras for purposes of security … But whether the use of this specific technology is proportionate should have been analysed through a Data Protection Impact Assessment …

A data protection impact assessment (DPIA) can be used to identify and mitigate any data protection risks arising from a new project. Under Article 35 of the General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR) and section 84 of the Data Protection Act 2018 (DPA18), a DPIA is required where the processing of personal data “is likely to result in a high risk to the rights and freedoms of natural persons”. Such an assessment is particularly necessary where the processing is by means of CCTV. The Council did tell Finnan that a DPIA was “being undertaken retrospectively”, but that savours very much of locking the stable door after the horse has bolted. The whole point of a DPIA is to prevent the horse from bolting in the first place. Indeed, the GDPR and DPA18 both expressly require that it be carried out “prior to the processing” (emphasis added). Doing so retrospectively is as much a breach as not doing so at all.
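For the programmers in the audience, the screening step can be sketched as a simple decision function. To be clear, this is only my own illustrative shorthand for the triggers in Articles 35(1) and 35(3) GDPR; the field names are mine, it is not an official tool, and it is certainly not legal advice:

```python
from dataclasses import dataclass

@dataclass
class Processing:
    """Shorthand description of a proposed processing operation.

    The field names are my own labels for the GDPR triggers;
    they are illustrative, not statutory terms.
    """
    systematic_profiling_with_legal_effects: bool  # Art 35(3)(a)
    large_scale_special_category_data: bool        # Art 35(3)(b)
    systematic_monitoring_public_area: bool        # Art 35(3)(c), e.g. CCTV
    likely_high_risk_to_rights: bool               # Art 35(1) general test

def dpia_required(p: Processing) -> bool:
    """Return True if a DPIA must be done *before* processing begins."""
    return (
        p.systematic_profiling_with_legal_effects
        or p.large_scale_special_category_data
        or p.systematic_monitoring_public_area
        or p.likely_high_risk_to_rights
    )

# The Bluebell pitch cameras, on the facts reported above:
bluebell = Processing(
    systematic_profiling_with_legal_effects=False,
    large_scale_special_category_data=True,   # biometric facial data
    systematic_monitoring_public_area=True,   # CCTV over a public pitch
    likely_high_risk_to_rights=True,
)
assert dpia_required(bluebell)  # so the DPIA had to come first, not after
```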

In any event, that assessment seems to be bearing some fruit. A few weekends ago, Peter O’Dwyer reported in the Business Post:

DCC to cease using ‘blacklisted’ CCTV firm

Chinese firm Hikvision, whose cameras use facial recognition technology, has been implicated in connection with human rights violations and data protection concerns

Dublin City Council will no longer use CCTV cameras manufactured by Hikvision, a controversial Chinese firm, after data protection concerns were raised over their installation. … A DCC spokeswoman told the Business Post that the assessment was being finalised and that the Hikvision cameras had been replaced by “conventional CCTV cameras” several weeks ago. …

This is a good thing. The machine-learning artificial intelligence (AI) technology in the cameras raises serious legal (pdf) and ethical issues that regulators such as the EU Commission are only beginning to address, harness and regulate. In particular, the fact that the AI is being trained to perform facial recognition raises significant data protection concerns, which prudence suggests are better avoided. For example, as long ago as 2003, the Article 29 Working Party (WP29) raised concerns about the compatibility of biometric analysis in general with data protection principles (see Working document on biometrics (pdf)); and, in 2012, WP29 raised concerns about the compatibility of facial recognition in particular with data protection principles (see Opinion 02/2012 on facial recognition in online and mobile services (pdf)). More recently, the European Data Protection Supervisor characterised facial recognition technology as “A solution in search of a problem?”. And the Commission Nationale de l’Informatique et des Libertés (the CNIL; the French data protection authority) has highlighted the regulatory, technological, ethical and societal risks associated with this technology.
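It is worth pausing on what such a camera actually does: it detects a face, reduces it to a numerical template (an “embedding”), and compares that template against stored ones. The sketch below is purely conceptual; the embed function is a toy stand-in of my own, not any vendor’s real model. But it illustrates why every passer-by is biometrically processed, whether or not they are on any watchlist:

```python
import numpy as np

def embed(face_pixels: np.ndarray) -> np.ndarray:
    """Toy stand-in for a trained face-embedding network.

    A real camera runs a deep neural network here; this deterministic
    placeholder exists only so the sketch runs end to end.
    """
    rng = np.random.default_rng(seed=int(face_pixels.sum()) % (2**32))
    v = rng.standard_normal(128)
    return v / np.linalg.norm(v)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def matches_watchlist(face_pixels: np.ndarray,
                      watchlist: list,
                      threshold: float = 0.6) -> bool:
    """Every face in view is embedded and compared (that is, biometrically
    processed) before the system decides whether it is 'of interest'."""
    template = embed(face_pixels)
    return any(cosine_similarity(template, enrolled) >= threshold
               for enrolled in watchlist)

# Even a passer-by on no watchlist is biometrically processed:
passerby = np.zeros((64, 64))            # stand-in for a cropped face image
watchlist = [embed(np.ones((64, 64)))]   # one enrolled "banned" face
print(matches_watchlist(passerby, watchlist))  # False, but only after matching
```

The point of the final two lines is the legal one: the “no match” answer is itself the product of processing the passer-by’s biometric data.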

It is clear beyond doubt that facial images are biometric data; and, since the processing of biometric data “for the purpose of uniquely identifying a natural person [by] such [means] as facial images” (see Article 4(14) GDPR) is the processing not merely of personal data but of special category data (see section 2 DPA18), facial recognition is prohibited by Article 9(1) GDPR unless one of the specific grounds outlined in Article 9(2) GDPR and Chapter II DPA18 applies. Furthermore, according to Article 35(3)(b) GDPR, a DPIA is mandatory in such cases. However, quite frankly, it is very difficult to fit public facial recognition CCTV into one of those specified grounds. The least unlikely is Article 9(2)(g), which permits processing of special category data where it

… is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject

On 13 June 2019, the Danish football club Brøndby IF announced that, with the approval of the Datatilsynet (the Danish data protection authority), automated facial recognition technology would be deployed to identify and exclude persons who have been banned from attending their matches. The Datatilsynet relied directly upon the “reasons of substantial public interest” in Article 9(2)(g) GDPR; but there is no additional provision incorporating that Article into Danish law, so it seems to me that there is no applicable “Member State law” to justify the Datatilsynet’s approval to Brøndby. In Ireland, the relevant “Member State law” is DPA18, which permits such processing where it “is carried out in accordance with regulations” made under section 51(3) DPA18. However, no regulations relating to facial recognition have been made pursuant to that section. Until they are, it would seem that Article 9(2)(g) GDPR cannot be relied upon as a lawful basis for processing special category biometric facial recognition data.
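That chain of reasoning can be compressed into a few lines. Again, the booleans below are my own labels for the limbs of Article 9(2)(g); this is a sketch of my argument, not of the law itself:

```python
def art_9_2_g_available(
    substantial_public_interest: bool,
    member_state_law_basis: bool,          # in Ireland: s.51(3) DPA18 regulations
    proportionate_with_safeguards: bool,
) -> bool:
    """Article 9(2)(g) GDPR: all three limbs must be satisfied."""
    return (
        substantial_public_interest
        and member_state_law_basis
        and proportionate_with_safeguards
    )

# Ireland today: no s.51(3) regulations on facial recognition have been made,
# so the second limb fails whatever view one takes of the other two.
print(art_9_2_g_available(True, False, True))  # False
```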

Moreover, even if there is a lawful basis for such processing, it is even more difficult to see how such technology can respect the principles of data protection by design and by default (Article 25 GDPR; section 76 DPA18), let alone comply with the additional obligations that arise where the processing leads to profiling and automated decision making (Article 22 GDPR; section 57 DPA18).

As a consequence of these kinds of problems, as recently as January of this year, the EU Commission was considering the introduction of a ban of 3 to 5 years on the use of facial recognition technology in public places (this drew mixed responses from the tech world: it won the backing of Alphabet Chief Executive Sundar Pichai but got a cool response from Microsoft President Brad Smith). Although it subsequently decided not to press ahead with that proposal, the Commission still considers that automated facial recognition breaches the GDPR. For example, in the first fine imposed by the Datainspektionen (the Swedish data protection authority) for breach of the GDPR, a school in Skellefteå which trialled facial recognition to keep track of student attendance was fined 200,000 SEK (c.€20,000) for failing to do an adequate DPIA and for unlawfully processing sensitive biometric data (closer to home, a Kilkenny school subsequently shelved plans to use facial recognition software to monitor students’ attendance).

The data protection concerns relating to the cameras at the football pitch have been mitigated by removing the facial-recognition element from the CCTV. However, they have not been entirely alleviated – CCTV footage of identifiable individuals amounts to personal data for the purposes of data protection law; a DPIA is still necessary; and the GDPR and DPA18 continue to apply. Consequently, the Council must tread softly here. It doesn’t want to end up with a fine and adverse headlines like the Swedish school. As Mark Tighe reported in the Sunday Times a few weekends ago, Tusla, the child and family agency, became the first body to be fined in Ireland for a data protection breach (see also Irish Times | Irish Legal News). A second Tusla fine followed within the week; and more fines are on their way. The sooner the Council publishes the overdue DPIA for the remaining CCTV at the football pitch, the better.

As that example demonstrates, the use of facial recognition technology is increasingly widespread, notwithstanding that the GDPR and DPA18 place significant restrictions on its use. It is particularly helpful in the context of law enforcement; but, even there, the fundamental rights of data subjects must be respected (see, eg, R (Bridges) v Chief Constable of South Wales Police [2020] 1 WLR 672, [2020] 1 All ER 864, [2019] EWHC 2341 (Admin) (04 September 2019)). Indeed, since February 2020, the Metropolitan Police in London has been deploying live facial recognition technology on an operational basis. The technology is also helpful in the commercial context; but, especially there, the fundamental rights of data subjects must be respected (see, eg, Shara Monteleone “Privacy and Data Protection at the time of Facial Recognition: towards a new right to Digital Identity?” (2012) 3(2) European Journal of Law and Technology 168; Elias Wright “The Future of Facial Recognition is not Fully Known: Developing Privacy and Security Regulatory Mechanisms for Facial Recognition in the Retail Sector” (2019) 29(2) Fordham Intellectual Property, Media and Entertainment Law Journal 611). Again, the technology is also helpful in the employment context; and again, especially there, the fundamental rights of data subjects must be respected (see, eg, Doolin v Data Protection Commissioner [2020] IEHC 90 (21 February 2020); see also Lilian Edwards, Laura Martin & Tristan Henderson “Employee Surveillance: The Road to Surveillance is Paved with Good Intentions” (2018; SSRN)).

In Case C-212/13 Ryneš v Úřad pro ochranu osobních údajů (ECLI:EU:C:2014:2428; CJEU, Fourth Chamber; 11 December 2014) (see also Case C-345/17 Buivids (ECLI:EU:C:2019:122; CJEU, Second Chamber; 14 February 2019); compare Doolin (above)), the Court of Justice of the European Union held that CCTV amounts to the processing of personal data, which requires justification or exception pursuant to data protection law. By way of example of such justification or exception, in Case C-708/18 TK v Asociaţia de Proprietari bloc M5A-ScaraA (ECLI:EU:C:2019:1064; CJEU, Third Chamber; 11 December 2019) (compare Bridges (above)), the Court accepted that CCTV could be installed and used for the purposes of “ensuring the safety and protection of individuals and property”. These cases emphasise that the processing must have a lawful basis, must be necessary and proportionate, and must respect the fundamental rights and freedoms of the data subjects. These are the kinds of things that the Council’s belated DPIA must assess. I look forward to its publication almost as much as I look forward to the safe resumption of football in Bluebell and elsewhere.