NYCLU sues ICE over changes to immigrant risk assessment algorithm – The Verge

The New York Civil Liberties Union has filed a lawsuit against US Immigration and Customs Enforcement over changes to the agency’s Risk Classification Assessment (RCA) software, which is used when ICE detains someone for deportation proceedings. The tool provides an algorithm-based recommendation on whether an immigrant should be detained in jail until they can see a judge — or whether they can instead be released on bond in the interim.

But the NYCLU claims that in 2017, ICE altered the RCA so that it overwhelmingly started to favor detention over release. Detention became the default recommendation even for people with little or no criminal history. “The program now automatically recommends that all immigrants be detained,” the union wrote in a blog post. “An ICE officer reviewing their case has to manually override the recommendation.”

These days, ICE is barely releasing anyone on bond. Reuters reported that the number of immigrants with no criminal history whom ICE booked into detention tripled from 2016 to 2017, to more than 43,000. This fall, the total number of people in ICE detention each day reached its highest point ever.

And immigrants aren’t just waiting a few days or a week to get in front of a judge for their next opportunity for release. The NYCLU notes that “the median wait time is now 80 days” in the New York City area — up from 42 days last year.

ICE refused to expedite a FOIA request, and the NYCLU says its attempts to gather more information on modifications made to the tool have been met with “radio silence.” The NYCLU is turning to litigation because “the public needs to know more about how ICE makes decisions about whether or not to release people, and why so many who are eligible for release continue to be detained.”

In its complaint filed with the US District Court for the Southern District of New York, the NYCLU “seeks an order compelling ICE to conduct a thorough search for all responsive records, immediately process all located records, promptly disclose the requested records in their entirety and make copies available to the NYCLU, and award the NYCLU costs and reasonable attorneys’ fees incurred in this action.”

Aside from this ICE example, risk assessment algorithms are used by some prisons in deciding who should receive parole. States are trying to implement them under the premise that they could weed out racial bias and make criminal sentences fairer. Pennsylvania has been trying to come up with one for years — but now lawmakers want to scrap the whole project, as reported by The Philly Inquirer. Earlier this year, The Verge published a feature on the damage done by a bad health care algorithm that had cut care for unsuspecting individuals. Algorithms are helpful in many contexts, but there are clear arguments for why we shouldn’t leave every decision to them. In ICE’s case, the agency seems to have adjusted the risk assessment algorithm to reflect a more stringent immigration policy.