Chicago Man Wrongly Imprisoned Because Of Artificial Intelligence?
An artificial intelligence program used in crime investigations may have led to a Chicago man being falsely accused of murder.
Michael Williams spent almost a year in jail before prosecutors asked a judge to dismiss his case for lack of evidence. The 65-year-old had been charged with first-degree murder over the shooting of a man inside his car. Now he is suing the city of Chicago for relying on an unreliable artificial intelligence system called ShotSpotter as critical evidence against him. The MacArthur Justice Center, a human rights advocacy group based at Northwestern University, claims the city’s police relied solely on the technology and failed to pursue other leads in the investigation.
The lawsuit seeks damages from the city for mental anguish, loss of income, and legal bills on behalf of Williams, who still suffers from a tremor in his hand that developed while he was locked up. The filing also details the case of Daniel Ortiz, who was arbitrarily arrested and jailed by police responding to a ShotSpotter alert. Additionally, the suit seeks class-action status for any Chicago resident stopped on the basis of the alerts, the Associated Press reports.
The MacArthur Justice Center has also requested a court order barring the use of ShotSpotter in the nation’s third-largest city. Speaking about his ordeal, Williams said that although he’s free, he doesn’t think he will ever recover from the effects of what happened to him. “Like the shaking with my hand, I constantly go back to the thought of being in that place,” he told the AP. “I just can’t get my mind to settle down.” Asked for comment, the city’s law department said it had not yet been served with the complaint.
If the lawsuit succeeds, Chicago may have to halt all ShotSpotter activity. That would be tricky, since the city quietly renewed its $33 million contract with the company last year. According to The Byte, the case could mark an inflection point in the rollout of AI-assisted policing. This isn’t the first time ShotSpotter has come under scrutiny: in 2021, a report by the MacArthur Justice Center claimed 89% of the tech’s alerts lacked any on-site evidence of a gun-related crime.
That same year, an investigation by Vice cited the horrific death of an unarmed Black 13-year-old who was shot by Chicago police responding to a ShotSpotter alert; the report accused the software of racial discrimination. The company isn’t named as a defendant in the new 103-page filing, but the lawsuit claims its algorithm-powered technology is flawed. The suit also says the city’s decision to place most of its gunshot-detection sensors in predominantly Black and Latino neighborhoods is racially discriminatory.
Meanwhile, Engadget notes that police never even established a motive for Williams, who was accused of shooting 25-year-old Safarian Herring while giving him a ride home from a police brutality protest. All they had was soundless security footage of a vehicle, along with the ShotSpotter alert. ShotSpotter, for its part, has disputed the inaccuracy and racial-bias claims, insisting that its system has a 97% aggregate accuracy rate for real-time detections across all customers. The company also says its sensor placement is based on objective historical data on shootings and homicides.