Legal Trouble: Is facial recognition a violation of human rights?

In a ruling on Tuesday, three Court of Appeal judges found that the police's use of facial recognition had been unlawful. The court ruled in favour of Ed Bridges, a civil rights campaigner, in a privacy dispute over the use of facial recognition technology by the police, upholding three of his five grounds of appeal.

The case was built on the argument that an invasion of privacy causes distress. On that basis, Bridges challenged South Wales Police in court, arguing that the force's use of automatic facial recognition (AFR) had caused him “distress” and affected his wellbeing.

The campaigner was scanned and identified while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018. Although similar automated technology is routinely used to help resolve criminal cases, these uses of facial recognition caused the campaigner considerable distress.

The judges accepted the case presented by the campaigner that “too much discretion is currently left to individual police officers.”

Under the current law and accepted practice, “It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed.” In other words, there are no clear boundaries as to who can be surveilled and tracked, or on what grounds.

The court also noted “that the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason they do not have the necessary quality of law”.

In their ruling, Master of the Rolls Sir Terence Etherton, President of the Queen’s Bench Division Dame Victoria Sharp and Lord Justice Singh found that the use of AFR was proportionate under human rights law, as its potential benefits outweighed the impact on Mr Bridges. However, they also held that the data protection impact assessment of the scheme was deficient and that the force had not done all it could to verify that the AFR software “does not have an unacceptable bias on grounds of race or sex”.

Having said that, the judgment also noted that there was no clear evidence that the software was in fact biased on grounds of race or sex.

The judges concluded that “as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias.”
