You will now have your face scanned by live recognition cameras at London Bridge station
New facial recognition cameras have been plugged in at one of the UK’s busiest train stations, monitoring millions of people.
Live facial recognition cameras have been switched on at London Bridge station as part of the British Transport Police's trial of the technology.
The tech, which uses artificial intelligence, will scan faces and compare them against a list of serious criminals at the station, which saw over 54 million passengers last year.
If the system hits a match, it will send out an alert to an officer, who will manually review it and make further checks to decide if the person is a suspect, the force said.
During the trial, passengers who don’t want to be scanned by the cameras will be given an alternative route to avoid them.
Privacy and civil liberties campaigners have warned against the rollout of the tech, saying its use by police forces across the country is currently not monitored.
Big Brother Watch described the ‘mass biometric surveillance’ of people as ‘disturbing.’
Chief Superintendent Chris Casey from BTP said the trial, designed to test how the technology performs in a railway setting, is part of the force's commitment to 'using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offences, helping us keep the public safe.'
He said: ‘The cameras work by scanning faces and comparing them to a watchlist of offenders wanted for serious offences.
‘If there’s a match, then the system generates an alert. An officer will review it and carry out further checks to determine if the person is a suspect and if they need to take further action.
‘People who prefer not to enter the recognition zone will have alternative routes available and images of anyone not on the authorised database will be deleted immediately and permanently.
‘We want to make the trial as effective as it can be and we welcome your feedback. You can scan the QR codes on the posters and tell us your thoughts.’
Other trials at further stations will be announced before they happen, BTP said.
Facial recognition cameras have also been rolled out by supermarkets and councils in a bid to crack down on anti-social behaviour.
But the use of facial recognition has proven controversial after the cameras misidentified people, including a Londoner who was kicked out of a Sainsbury’s after staff using the cameras mistook him for a criminal.
Big Brother Watch’s Matthew Feeney said: ‘We all want train passengers to travel safely, but subjecting law-abiding passengers to mass biometric surveillance is a disproportionate and disturbing response.
‘Facial recognition technology remains unregulated in the UK and police forces, including British Transport Police, are writing their own facial recognition rules, including those governing how they use the technology and who they place on watchlists.
‘It is especially concerning that British Transport Police are moving ahead with facial recognition deployments before the Home Office has finished its consultation on a legal framework for police use of facial recognition technology.
‘The use of this technology is especially offensive in a democracy where neither the public nor Parliament has ever voted on its use.
‘Sadly, the UK stands out among democracies when it comes to the widespread use of live facial recognition. The Government must take immediate steps to rein in police use of this technology.’
Live facial recognition ‘used 4 million times last year’
Big Brother Watch's Silkie Carlo and anti-knife crime activist Shaun Thompson have challenged the Met Police's use of live facial recognition (LFR) tools after Thompson was mistakenly identified when passing a camera van in 2024.
Lawyers representing the pair told a High Court hearing last month that the police’s use of LFR is increasing ‘exponentially.’
The Met Police used facial recognition 231 times and scanned around 4 million faces last year, Dan Squires KC said during the judicial review of the extension of facial recognition across other forces.
There are plans to extend the cameras’ use, although consultation on it is still underway.
Ruth Ehrlich, a director of external relations at Liberty, a UK-based human rights advocacy group, told Metro: ‘The Government is undermining its own consultation on facial recognition technology by continuing to roll out these powerful surveillance tools while that consultation is still underway.
‘Facial recognition enables police to track and monitor people as they go about their daily lives. Its use to date has been deeply flawed with children wrongly placed on watchlists, and Black people put at greater risk of being wrongly identified.
‘This will have created real harm to people’s lives and is the consequence of handing complex, powerful technology to police who lack the expertise to govern it safely.
‘The Government must halt the rapid rollout of facial recognition technology, ensure safeguards are in place to protect each of us, and prioritise our rights. They must also learn the lessons of the past and ensure that, before handing police further AI tools, a system of strong guardrails is in place and one that puts the rights and privacy of the British public at its core, with genuine transparency and meaningful oversight.’
How do live facial recognition cameras work?
The process starts by identifying a face in a still image or video – picking out which pixels make up a face and which are the body, background or something else.
It then maps the face – for example, by measuring the distances between certain features – to create a 'numerical expression' for an individual.
This can then be quickly compared to large databases to try to find a match from faces that have already been mapped.
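The matching step described above can be sketched in code. This is a simplified illustration, not BTP's actual system: the embeddings, names, and threshold below are invented for demonstration, and real systems derive much longer vectors from deep neural networks.

```python
# Illustrative sketch of watchlist matching with face embeddings.
# All data here is hypothetical; real embeddings have 128+ dimensions.
import math

def face_distance(a, b):
    """Euclidean distance between two face embeddings (lower = more similar)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def check_watchlist(embedding, watchlist, threshold=0.6):
    """Return the closest watchlist match within the threshold,
    or None if there is no match (the image would then be deleted)."""
    best_name, best_dist = None, float("inf")
    for name, ref in watchlist.items():
        d = face_distance(embedding, ref)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Hypothetical watchlist of mapped faces.
watchlist = {
    "suspect_a": [0.1, 0.9, 0.3, 0.5],
    "suspect_b": [0.8, 0.2, 0.7, 0.1],
}

print(check_watchlist([0.12, 0.88, 0.31, 0.52], watchlist))  # close to suspect_a
print(check_watchlist([0.5, 0.5, 0.9, 0.5], watchlist))      # too far from both -> None
```

A match here only flags a candidate; as the article notes, an officer would still review any alert manually before acting on it.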