If you’re tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
It’s becoming familiar: get arrested, go to trial, and find out the government’s star witness is a half-baked algorithm built in a Silicon Valley basement, rubber-stamped by a judge who can’t program a VCR. This time it’s in New Jersey, where the prosecution in State v. Miles is taking this pastime to new levels of absurdity.
In the state’s latest criminal-justice-meets-cyberpunk farce, Tybear Miles stands accused of the 2021 killing of Ahmad McPherson. The central piece of evidence? Not fingerprints. Not an eyewitness. A facial recognition hit from a system so secretive that the government won’t even tell the defense what it is, how it works, or whether it’s more accurate than a drunk dart throw.
The prosecution insists it has Miles nailed, thanks to a confidential informant who claimed that “Fat Daddy” was the killer. The cops then poked around Instagram, pulled some photos, fed them into a facial recognition system, and out popped Tybear Miles. The algorithm gave its blessing, the informant nodded in agreement, and the case was sealed.
Except there’s one hitch: the defense wants to actually see how the magic sausage was made. Understandable, since getting sent to prison based on a piece of software’s hunch isn’t exactly the gold standard of due process. Defense lawyers have asked for access to the guts of the system: error rates, database quality, testing protocols, anything that might show the difference between science and snake oil.
The state’s answer? A firm, resounding “no.” Apparently, revealing how this software works would compromise law enforcement tactics.
Cue the cavalry. Civil liberties watchdogs who’ve seen this sci-fi courtroom rerun one too many times filed a joint brief basically shouting: you can’t have a fair trial if the evidence comes from a black box!
“Facial recognition searches involve multiple components and steps that each introduce a significant possibility of misidentification,” the brief warns, in what might be the most understated way to say “this stuff screws up a lot.”
The civil liberties gang is pointing to a recent New Jersey appellate decision in State v. Arteaga, where the court wisely concluded that if a computer is accusing someone of a crime, we might want to know whether it graduated from MIT or flunked out of Clippy’s Academy for Glitchy Algorithms.
That ruling laid the groundwork for Miles’ team to ask for the same transparency. You’d think that would be obvious. Instead, we’re having a legal showdown to determine whether a man’s freedom hinges on trade secrets and corporate NDAs.
This isn’t only a Jersey problem. Across the country, cops have been quietly using facial recognition tech like it’s a cheat code in a video game, without bothering to tell judges, juries, or the people getting locked up because a computer said so.
In that context, the Miles case is less of an anomaly and more of a litmus test. If the New Jersey Supreme Court decides to side with secrecy, it won’t just gut one man’s defense. It’ll further enshrine the idea that algorithmic evidence is above scrutiny.
The post Guilty by Algorithm appeared first on Reclaim The Net.
Author: Christina Maas