The U.S. Department of Justice has failed to convince a group of U.S. lawmakers that state and local law enforcement agencies are not awarded federal grants to buy AI-based “policing” tools known to be inaccurate, if not prone to exacerbating the biases long observed in U.S. policing.
In a letter to the Justice Department, first obtained by WIRED, seven members of Congress wrote that the information they pried from the agency only fueled their concerns about the department's police grant program. Nothing in the responses so far, the lawmakers say, indicates that the government has bothered to investigate whether the agencies it funded were purchasing discriminatory policing software.
“We urge you to suspend all Department of Justice grants for predictive policing systems until the department can ensure that grant recipients do not use such systems in a manner that has a discriminatory impact,” the letter reads. The Justice Department previously acknowledged that it did not track whether police departments used funds awarded under the Edward Byrne Memorial Justice Assistance Grant Program to purchase so-called predictive policing tools.
The lawmakers, led by Sen. Ron Wyden, D-Ore., say the Justice Department is required by law to “periodically review” whether grant recipients comply with Title VI of the Civil Rights Act. The department, they note, is prohibited from funding programs shown to discriminate on the basis of race, ethnicity, or national origin, whether that outcome is intentional or not.
Independent investigations by news organizations have found that popular “predictive” policing tools trained on historical crime data often reproduce long-standing biases, offering law enforcement at best a veneer of scientific validity while perpetuating the over-policing of predominantly Black and Latino neighborhoods. An October headline in The Markup states bluntly, “Predictive police software is terrible at predicting crime.” The article details how the publication's researchers examined 23,631 police crime predictions and found they were accurate about 1 percent of the time.
“Predictive policing systems rely on historical data skewed by falsified crime reports and disproportionate arrests of people of color,” Wyden and the other lawmakers wrote, predicting, as many researchers have, that the technology serves only to create “dangerous” feedback loops. Biased predictions, the letter says, “are being used to justify unwarranted stops and arrests in minority areas,” which further biases statistics on where crime occurs.
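The mechanism the lawmakers describe can be made concrete with a toy simulation. The sketch below is purely hypothetical, not any vendor's actual algorithm: the neighborhood names, rates, and the greedy patrol rule are assumptions chosen for illustration. It models two neighborhoods with identical true crime rates, where crime is only recorded where officers are sent, and officers are sent wherever the data shows the most crime.

```python
# Hypothetical sketch of the "dangerous feedback loop" the letter describes.
# Crime is only recorded where officers patrol, and patrols go wherever the
# recorded data shows the most crime. Names and rates are made up.

import random

random.seed(0)

TRUE_RATE = {"Northside": 0.05, "Southside": 0.05}  # identical true crime rates
recorded = {"Northside": 12, "Southside": 10}        # slightly skewed history

for day in range(3650):  # ten years of daily patrol decisions
    # "Prediction": patrol the neighborhood with the most recorded incidents.
    target = max(recorded, key=recorded.get)
    # Incidents are only observed (and recorded) where the patrol goes.
    if random.random() < TRUE_RATE[target]:
        recorded[target] += 1

print(recorded)
# Typical output: {'Northside': ~190, 'Southside': 10}. The initial skew
# hardens into "evidence" that crime happens only in Northside, while the
# identical crime in Southside goes unrecorded.
```

Even in this stripped-down version, the model never corrects itself: because no data is ever collected in the under-patrolled neighborhood, the statistics can only confirm the original bias.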
Senators Jeffrey Merkley, Ed Markey, Alex Padilla, Peter Welch, and John Fetterman also co-signed the letter, as did Representative Yvette Clarke.
The lawmakers have called for the president's next report on policing and artificial intelligence to examine the use of predictive policing tools in the United States. “The report should assess the accuracy and precision of predictive policing models across protected classes, their interpretability, and validity,” they wrote, adding that it should also address the limits on assessing these tools' risks that arise from the developing companies' lack of transparency.
If the Department of Justice wants to continue funding the technology after that assessment, the lawmakers say, it should at least establish an “evidence standard” for determining which predictive models are discriminatory, and deny funding to all models that fail to meet it.