Overhaul of UK police tech needed to prevent abuse


A House of Lords investigation has found that the use of artificial intelligence (AI) by UK police could undermine human rights and further exacerbate existing inequalities without adequate safeguards, supervision and precautions.

Following a 10-month inquiry into the use of advanced algorithmic technologies by UK police, including facial recognition and various crime “prediction” tools, the Lords Home Affairs and Justice Committee (HAJC) described the situation as “a new Wild West”, lacking strategy, accountability and transparency from top to bottom.

In a report published on 30 March 2022, the HAJC stated: “The use of advanced technologies in the application of the law poses a real and present risk to human rights and to the rule of law. Unless this is acknowledged and addressed, the potential benefits of using advanced technologies may be outweighed by the harm that would occur and the distrust it would create.”

In the case of “predictive policing” technologies, the HAJC noted a “vicious cycle” and their tendency to “reinforce pre-existing patterns of discrimination”, because they direct police patrols to low-income, already over-policed areas on the basis of historical arrest data.

“Due to the increased police presence, it is likely that a higher proportion of the crimes committed in those areas will be detected than in areas that are not over-policed. The data will reflect this increased detection rate as an increased crime rate, which will be fed into the tool and embed itself into the next set of predictions,” it said.
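The feedback loop the committee describes can be sketched in a few lines. The following simulation is purely illustrative and not drawn from the report: the detection rates, starting counts and the rule of sending the extra patrol to the area with the most recorded crime are all assumptions chosen to make the mechanism visible.

```python
# Illustrative sketch of the "vicious cycle": two areas with identical true
# offending, but a prediction tool sends the extra patrol to whichever area
# has the higher recorded crime count, so an initial skew in historical
# arrest data feeds on itself.

TRUE_OFFENCES = 100          # offences per area, per period (identical by design)
DETECTION_PATROLLED = 0.5    # fraction of offences detected in the patrolled area
DETECTION_UNPATROLLED = 0.1  # fraction detected elsewhere (e.g. public reports)

recorded = {"A": 30, "B": 20}  # historical arrest data: area A already over-policed

for period in range(10):
    # The tool flags the area with the most recorded crime as "high crime".
    target = max(recorded, key=recorded.get)
    for area in recorded:
        rate = DETECTION_PATROLLED if area == target else DETECTION_UNPATROLLED
        # Detections are logged as "crime", feeding the next round of predictions.
        recorded[area] += TRUE_OFFENCES * rate

print(recorded)  # area A's recorded "crime rate" pulls ever further ahead of B's
```

Despite identical true offending, area A's head start in the arrest data means it is patrolled every period, so its recorded crime grows five times faster than area B's, which is exactly the self-reinforcing pattern the report warns about.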

On facial recognition, the other key algorithmic technology being deployed by police, the report said it could have a chilling effect on protests, undermine privacy and lead to discriminatory results.

“The use of advanced technologies in the application of the law poses a real and present risk to human rights and to the rule of law. Unless this is acknowledged and addressed, the potential benefits of using advanced technologies may be outweighed by the harm that would occur and the distrust it would create”

HAJC Report

“While we found great enthusiasm about the potential of advanced technologies in law enforcement, we did not detect a similar commitment to any thorough evaluation of their efficacy,” the HAJC report said.

It stated that “there is no minimum scientific or ethical standard that an AI tool must meet before it can be used in the criminal justice field”, and that the vast majority of public bodies involved in the development and deployment of these technologies lack the expertise and resources to properly evaluate the tools.

“As a result, we run the risk of deploying technologies that may be unreliable, inconsistent or unsuitable for the task at hand,” the HAJC said, adding that there is a need to “immediately streamline and improve governance” because, “as it stands, users are in effect making it up as they go along”.

The committee’s conclusion was in line with the comments of Karen Yeung, an interdisciplinary professor of law, ethics and informatics at Birmingham Law School, who told the HAJC in October 2021 that police had begun to use new technologies “simply because we can … without clear evidence” about their efficacy or effects.

This includes the use of facial recognition without “very rigorous” testing, as well as of crime prediction tools such as the Met Police’s Gangs Matrix and Durham Constabulary’s Harm Assessment Risk Tool.

HAJC chair Baroness Hamwee, summing up the committee’s evidence – 55 written contributions and 20 witnesses interviewed – said: “We had a strong impression that these new tools were being used without questioning whether they would always produce a justified result. Is ‘the computer’ always right? It was a different technology, but look at what happened to hundreds of Post Office managers.”

The HAJC report makes a number of recommendations on how to address the concerns raised by its investigation. These include the establishment of a single national body to set minimum scientific standards for the use of new technologies by law enforcement bodies, to certify every new technological solution against these standards, and to regularly audit their deployment.

This national body should be set up on an independent statutory basis, with its own budget and the power to impose moratoriums.

Questionable procurement practices and transparency

With regard to the purchase of new technologies, the HAJC noted a series of “questionable selling practices” stemming from conflicts of interest between police forces, which are obliged under the Public Sector Equality Duty (PSED) to consider how their policies and practices could be discriminatory, and private sector suppliers, who often wish to protect their intellectual property and trade secrets.

“We heard of companies refusing to engage constructively with clients, such as police forces, on grounds of confidentiality. [The Birmingham Law School’s] Yeung was concerned that some technology providers may invoke intellectual property rights to make ‘empty promises’ about the representativeness of training data, hiding it from their customers, outside reviewers and the courts.

“The Metropolitan Police Service also informed us of ‘vendors reluctant to share information, citing reasons of business confidentiality’.”

In August 2020, the use of live facial recognition (LFR) technology by South Wales Police (SWP) was ruled unlawful by the Court of Appeal, as the force had not complied with its PSED.

The decision noted that the manufacturer in that case – the Japanese biometrics firm NEC – did not provide details of its system to SWP, meaning the force could not fully assess the technology and its effects.

“For reasons of commercial confidentiality, the manufacturer is not willing to divulge the details so that it could be tested. That may be understandable, but in our view it does not enable a public authority to discharge its own, non-delegable, duty under section 149,” the ruling said.

To deal with these and other procurement issues, the HAJC recommended that, while forces should be free to procure any technical solution certified by the national body, additional support should be provided to enable them to become “skilled customers” of new technologies.

“Pre-deployment certification would, in itself, give them assurance about the quality of the products they are buying,” the report said, adding that enhanced procurement guidelines were also required. Local and regional ethics committees should also be established on a statutory basis to examine whether the proposed and actual uses of any given technology are “lawful, necessary and proportionate”.

On the transparency front, the HAJC noted that there were currently “no systemic obligations” on law enforcement bodies to disclose information about their use of advanced technologies, and said a “duty of candour” should be established, along with a public register of police algorithms, so that regulators and the general public alike can understand how new tools are being deployed.

Clear law needed

Speaking to Computer Weekly, the HAJC’s Hamwee said committee members were “shocked and concerned” as they began to understand the extent to which advanced technologies are deployed in the justice system, and were left deeply worried about the implications for human rights and civil liberties.

“We couldn’t work out who was responsible – we identified more than 30 bodies with some sort of role, and we missed a few – which suggested that if things went wrong, it would be almost impossible to hold anybody accountable,” she said. “And if things went wrong, they could go horribly wrong – you could be convicted and even imprisoned on the basis of evidence you do not understand and cannot challenge.”

Hamwee said that while the committee recognised that AI could bring “considerable benefits”, for example in efficiency and new ways of working, the final decision should always be made by a human being, and a new law is needed to control how the technologies are used by UK police.

“I doubt any committee member would think that new laws are the answer to everything, but we need legislation – with a register of the algorithms used in relevant tools, regulation by a national body, and certification of each tool, as its basis,” she said. “Readers of Computer Weekly will not be fazed by the technology, but for many people it is often a matter of ‘the computer says so’. Stricter standards will mean the public can trust how the police, in particular, use advanced technologies, as they are used now and as they may be in the future.”

“I doubt any committee member would think that new laws are the answer to everything, but we need legislation. Stricter standards will mean the public can trust how the police use advanced technologies, as they are used now and as they may be in the future”

Baroness Hamwee, HAJC

The HAJC therefore also recommended that “the government bring forward primary legislation which embodies general principles, and which is supported by detailed regulations setting minimum standards”, because “this approach would strike the right balance between concerns that an overly prescriptive law could stifle innovation and the need to ensure the safe and ethical use of technologies”.

Computer Weekly contacted policing minister Kit Malthouse for comment on the findings of the investigation, but received no response.

Malthouse has previously stated, during a webinar on the challenges and future of policing, that the acquisition and use of digital technologies would be a major priority going forward, and told the HAJC in January 2022 that the use of new technologies by police should be tested in the courts rather than defined by new legislation, which he argued could “inhibit innovation”.

This is in line with previous government claims about police technology. For example, in response to a July 2019 Science and Technology Committee report, which called for a moratorium on police use of live facial recognition technology until a proper legal framework was in place, the government claimed in March 2021 – after a two-year delay – that there was “already a comprehensive legal framework for the management of biometrics, including facial recognition”.

Paul Wiles, the former commissioner for the retention and use of biometric material, also told the Science and Technology Committee in July 2021 that the use of biometric technologies was currently governed by a “general legal framework”, and that their pervasive nature and rapid proliferation meant a more explicit legal framework was needed.

In March 2022, the Strategic Review of Policing in England and Wales reaffirmed that technology would play a key role in policing going forward, but also warned of the need for greater ethical scrutiny to ensure public trust.

Although the review focused on policing as a whole – noting the need for “root and branch reform” to address the current crisis in public confidence – many of its 56 recommendations dealt specifically with the role of technology.

One of the review’s recommendations was for the Home Office to bring forward legislation to introduce a clear duty for police forces.
