Peers ask experts about UK police use of AI and facial recognition technology • The Register
Members of the House of Lords are examining the role of technologies such as AI and facial recognition in modern policing and law enforcement.
The Justice and Home Affairs Committee announced in May that it planned to investigate the matter, and heard a first round of oral evidence in June from legal experts as it familiarized itself with the subject.
Today it was the turn of Professor Michael Wooldridge – head of the Department of Computer Science at the University of Oxford – to be questioned by peers.
He echoed many of the concerns raised previously and cited the example of a computer system that tells custody officers whether a person should be held in a police cell or released, based on a plethora of data including their criminal history.
His concerns were not so much about the technology itself as the possibility that it might lead some officers to “abdicate [their] responsibility” by becoming too dependent on it.
Wooldridge warned there was a need to better understand AI and what it can – and can’t – do, and he urged peers not to “humanize” it.
“It’s not like human intelligence,” he said, adding that the technology is “fragile” and can fail in unexpected ways.
AI can, for example, mislabel or misidentify images, he added. While that is a minor problem on social media, it could have serious consequences in the criminal justice system in areas such as facial recognition.
At this early stage the committee is content to gather evidence, but the complexity of the task it faces is already clear.
Speaking in June as the inquiry began hearing oral evidence [PDF], Carole McCartney, professor of law and criminal justice at Northumbria University, explained that “one of the big criticisms of technologies … is their lack of scientific basis” and stressed the importance of meticulous scrutiny.
“The House of Lords recently released a report finding that the underlying basic research is often lacking, and there is nothing new here; very often a lot of these technologies will be based on very basic science, and that’s dangerous,” she said.
She supported her argument by citing the example of the automated facial recognition technology used by South Wales Police.
“Basically, they put the technology out in the wild and just watch it, and they call it ‘a trial’. That is not how scientific trials work,” she said.
When asked whether the benefits of the technology outweigh the pitfalls, McCartney replied: “It depends.”
“One of the difficulties in this area is that there is no silver bullet,” she added. “There is no technology that will solve domestic violence or allow us to successfully predict burglaries 100% of the time.”
In a separate briefing last month, the committee also heard from members of the West Midlands Police, who gave their views on how technology can aid law enforcement and the detection of crime. Speaking behind closed doors, officers provided an update on plans for the National Data Analytics Solution, which aims to put advanced analytics in the hands of law enforcement.
Last week, the role of facial recognition technology was also in the spotlight in the United States, after the House Judiciary Committee heard testimony about how it is used by law enforcement there.
Dr Cedric Alexander, a former member of President Barack Obama’s Task Force on 21st Century Policing, highlighted the minefield facing lawmakers by explaining how, on the one hand, facial recognition technology can promote justice and “even save lives” – but not if that means sacrificing constitutional rights.
The hearing coincided with calls from US civil rights activists for retailers to stop using facial recognition technology, citing privacy concerns and fears it could lead to people being wrongly arrested. ®