British Cops Will Scan Every Fan’s Face at the Champions League Final
South Wales Police is piloting facial recognition technology at one of the world’s biggest sporting events in an effort to crack down on crime and fan violence.
The Champions League final will be held at Cardiff’s Principality Stadium on June 3rd this year, and as thousands of fans make their way to the main event, very few of them will realise that their faces have already been scanned, analysed and compared against a police database of over half a million “persons of interest”.
Although the technology has drawn criticism from fans, British police remain undeterred, and the system will be deployed on the day of the game at Cardiff’s main train station as well as around the stadium itself, which sits in the heart of the city’s retail centre.
The cameras will scan the faces of an estimated 170,000 visitors, plus many thousands more who will be in the vicinity on a busy Saturday evening. These images will then be compared in real time against 500,000 custody images stored in the police information and records management system, alerting officers to anyone who might be of “interest”.
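The matching step described above typically works by converting each face into a numeric embedding and comparing it against the enrolled database under a similarity threshold. The sketch below is an illustration of that general technique, not South Wales Police’s actual system; the embedding size, threshold value, and toy gallery are all assumptions.

```python
import numpy as np

def match_face(probe, gallery, threshold=0.6):
    """Compare one probe face embedding against a gallery of enrolled
    embeddings using cosine similarity; return indices scoring above
    the threshold, plus all similarity scores."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = gallery @ probe          # cosine similarity per gallery face
    return np.where(scores >= threshold)[0], scores

# Toy gallery of 4 enrolled faces (128-d embeddings, random for illustration)
rng = np.random.default_rng(0)
gallery = rng.normal(size=(4, 128))
# A probe that is a slightly noisy copy of enrolled face 2
probe = gallery[2] + rng.normal(scale=0.05, size=128)
hits, scores = match_face(probe, gallery)
```

In a real deployment the gallery would hold hundreds of thousands of embeddings, and the threshold choice directly trades false alerts against missed matches.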
Although this is new for the Principality Stadium, it isn’t the first time such technology has been used in the UK: Automated Facial Recognition (AFR) was employed during the Notting Hill Carnival in 2016.
In light of increasing security challenges, and the concerns raised by the recent attack on the Borussia Dortmund team bus before their Champions League quarter-final, this seems a sensible approach to take, although opponents still question what the technology means for civil liberties.
There is also the question of how effective AFR really is. Critics claim the technology remains flawed in many ways: subjects who are not facing the camera, or whose faces are obscured, simply drop out of the analysis, and the system’s real benefits are only enjoyed in highly controlled environments with high-quality equipment.
Nor does the technology come cheap: multiple cameras are often required to deliver the best results, which soon puts the solution beyond the reach of all but the very wealthy.
The accuracy of such systems has also been questioned in the US. During a House Committee on Oversight and Government Reform hearing, the committee revealed findings from the Government Accountability Office that algorithms used by the FBI were inaccurate 14 percent of the time and were more likely to misidentify black people.
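Even a small per-face error rate adds up at crowd scale. A rough back-of-envelope calculation makes the point; the 0.1 percent false-match rate used here is an illustrative assumption, not a figure from the hearing or from the Cardiff deployment.

```python
# Back-of-envelope: expected false alerts in a crowd-scale deployment.
crowd = 170_000            # estimated faces scanned on match day (from the article)
false_match_rate = 0.001   # ASSUMED 0.1% chance a face wrongly matches the watchlist
expected_false_alerts = crowd * false_match_rate
print(f"Expected false alerts: {expected_false_alerts:.0f}")
```

Even under that optimistic assumption, officers would have to triage on the order of a couple of hundred false alerts in a single evening.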
The current limitations of AFR were also highlighted by a Met Police statement on the Notting Hill Carnival operation, which led to zero AFR-assisted arrests: the system failed to identify a single person of interest, despite a total of 454 arrests being made over the course of the carnival.
So the jury is still out on AFR. While I believe human rights and civil liberties must be protected at all costs, I also recognise that in a world of heightened tensions and security concerns, technology can still play a part in making our day-to-day lives safer, and where possible we should explore anything that can help us achieve that aim.