The iceberg that is camera surveillance and data privacy
News and information from the Advent IM team.
From Head of Client Development, Derek Willins
A recent complaint filed with the ICO by Big Brother Watch (BBW) reminded me of the iceberg that is camera surveillance and data privacy. The accused is Southern Co-op. The charge is that it is using “biometric cameras” (Live Facial Recognition technology, or LFR) and thereby “infringing the data rights of a significant number of UK data subjects”.
The backdrop to the Southern Co-op complaint is the UK’s appetite for surveillance cameras in public and private places. In 2020 there were an estimated 5.2 million of them, and once you add internal systems, domestic and doorbell cameras, bodycams and drones, the true figure is anyone’s guess. The tension between privacy and 24/7 public monitoring is a perennial source of discomfort.
UK retailers have been at the forefront of adopting surveillance technology in their businesses. According to a June 2022 survey of 500 retailers by Secure Redact, 94% already have video surveillance technology in-store, such as wireless surveillance, facial recognition, queue monitoring or body cameras, and most intend to add to it. However, the survey also found that 43% of respondents reported having been fined for a breach of GDPR rules on video surveillance, and 37% of those claimed to have paid the equivalent of 2% of their annual turnover. This high level of fines suggests that the rapid adoption of the technology has outpaced risk assessment. The problem retailers are tackling is twofold: rising theft and assaults on staff, coupled with a lack of police and judicial interest in the culprits. Police will usually only respond to shoplifting where the value exceeds £200, and an individual often has to be recorded offending many times before fines or custodial sentences are handed out. The frustration retailers feel is understandable, but implementing new systems badly only makes the problem worse.
The technology at issue in the Southern Co-op case is Live Facial Recognition (LFR). The ICO explains it as follows: “LFR is a type of FRT (Facial Recognition Technology) that often involves the automatic collection of biometric data. This means it has greater potential to be used in a privacy-intrusive way. LFR is used in surveillance systems and is directed toward everyone in a particular area rather than specific individuals. It has the ability to capture the biometric data of all individuals passing within range of the camera indiscriminately. Data is collected in real-time and potentially on a mass scale. There is often a lack of awareness, choice or control for the individual in this process”.
The live data is then matched against a library of persons of interest (raising its own questions: where does this library come from, and how is it managed?), and anyone matched is immediately flagged for scrutiny. However, the matching process can be inaccurate, leading to false accusations and an erosion of trust. For example, in July 2020 the National Institute of Standards and Technology (NIST) published independent assessments designed to measure matching bias in AI systems. Across the 189 algorithms tested, it found that facial recognition technologies showed racial bias, with women of colour misidentified most often. NIST also concluded that even the best facial recognition algorithms studied could not correctly identify a mask-wearing person nearly 50% of the time.
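To make that false-positive risk concrete, here is a minimal sketch of the threshold-based matching step at the heart of an LFR system. It is illustrative only: the watchlist, the embeddings and the 0.8 threshold are hypothetical stand-ins under assumed conditions, not any vendor’s actual implementation.

```python
import numpy as np

# Hypothetical watchlist: person-of-interest ID -> stored face embedding.
# Real systems derive embeddings from a face-recognition model; random
# vectors stand in here purely for illustration.
rng = np.random.default_rng(seed=42)
WATCHLIST = {f"poi_{i}": rng.normal(size=128) for i in range(3)}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings, in [-1, 1]; higher = more alike."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_if_matched(live_embedding: np.ndarray, threshold: float = 0.8):
    """Compare one live capture against every watchlist entry.

    Returns (person_id, score) if the best score clears the threshold,
    else None. The threshold embodies the trade-off discussed above:
    lower it and more innocent passers-by are wrongly flagged (false
    positives); raise it and genuine matches slip through (false negatives).
    """
    scores = {pid: cosine_similarity(live_embedding, emb)
              for pid, emb in WATCHLIST.items()}
    best_id = max(scores, key=scores.get)
    return (best_id, scores[best_id]) if scores[best_id] >= threshold else None

# A random "passer-by" is very unlikely to clear the threshold...
print(flag_if_matched(rng.normal(size=128)))   # -> None (almost always)
# ...while a noisy re-capture of a listed face usually will.
noisy = WATCHLIST["poi_0"] + rng.normal(scale=0.1, size=128)
print(flag_if_matched(noisy))                  # -> ('poi_0', ~0.99)
```

Even in this toy version, who gets flagged turns on the choice of threshold, the quality of the watchlist, and variation in lighting, pose or masks, which is precisely why accuracy and bias feature so prominently in the regulator’s concerns.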
At the heart of all these issues is the individual’s right to have their biometric (facial) data respected and processed lawfully. As the ICO puts it, the introduction of technology like LFR “can at times, risk creating an accountability gap where controllers rely on vendors’ products and may not understand the detail of how the system operates or fully appreciate their legal obligations”. Controllers seeking to deploy LFR must comply with all relevant parts of the UK GDPR and the DPA 2018. This includes the data protection principles set out in UK GDPR Article 5: lawfulness, fairness, transparency, purpose limitation, data minimisation, storage limitation, security and accountability. Controllers must also enable individuals to exercise their rights. Falling short of these obligations can result in ICO enforcement action, including fines.
So, what are the key data protection issues that controllers need to consider, and be comfortable with, before adopting LFR? Following several investigations, the ICO has identified them in its June 2021 Opinion; in summary, controllers must:
- identify a specified, explicit and legitimate purpose for using LFR;
- identify a valid lawful basis, plus an appropriate condition for processing special category biometric data (and, where relevant, criminal offence data);
- demonstrate that the processing is necessary and proportionate to that purpose;
- carry out a Data Protection Impact Assessment (DPIA) before deployment, as the processing is likely to result in high risk;
- ensure fairness and transparency, including the statistical accuracy of the system and mitigation of bias;
- apply data minimisation, storage limitation and appropriate security to the biometric data collected;
- maintain accountability, with clear policy documents and governance around the watchlist and any human review of matches.
Together, these requirements mean that where LFR is used for the automatic, indiscriminate collection of biometric data in public places, there is a high bar for its use to be lawful. The leading case so far is R (Bridges) v South Wales Police (2020), in which the Court of Appeal held that automated facial recognition technology used by South Wales Police breached data protection law and Article 8 of the European Convention on Human Rights. The judgment did not rule out the use of facial recognition technologies altogether, provided sufficient data protection impact assessments are carried out and thorough policy documents are in place.
How Southern Co-op will fare remains to be seen. If it can demonstrate that the adoption of LFR in its 35 stores is proportionate, and meets the high standard the ICO will apply to the issues above, it should be on solid ground.
This remains an important and evolving area. Technology will keep advancing and being adopted, and unprepared controllers who are over-reliant on vendors can easily run into trouble. Advice from independent experts like Advent IM will help ensure that risk is minimised.
Sources: ICO Opinion, “The use of live facial recognition technology in public places”, 18 June 2021; Secure Redact Retail Survey, 2022; “Racial Discrimination in Face Recognition Technology”, Harvard SITN, https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/, accessed 05.08.22.