FEATURE-US prisons consider AI to analyze inmate phone calls
(Clarifies reference to analysis in paragraph 6) * Justice Department asked to study feasibility of AI prisoner-monitoring technology
* Prisoners' rights advocates fear bias, privacy violations * Software designed to prevent suicides and violent crimes
By David Sherfinski and Avi Asher-Schapiro WASHINGTON/LOS ANGELES, Aug 9 (Thomson Reuters Foundation) – For people like Heather Bollin, a 43-year-old woman in Texas engaged to a man who is currently incarcerated, constant surveillance is a fact of life: the three phone calls they share each day are monitored by prison authorities.
“We can never communicate without being watched,” she told the Thomson Reuters Foundation in a phone interview, asking that the prison where her fiancé is held not be named because she fears reprisals. US prisons could be getting high-tech help to keep tabs on what inmates are saying, after a key House of Representatives panel called for a report on the use of artificial intelligence (AI) to analyze inmate phone calls.
But prisoner advocates and inmates' families say relying on AI to interpret communications opens the system up to errors, misunderstandings and racial bias. The call for the Department of Justice (DOJ) to further explore the technology, with an eye to preventing violent crime and suicide, accompanies a spending bill of more than $81 billion to fund the DOJ and other federal agencies in 2022, which the House Appropriations Committee approved last month.
The technology can automatically transcribe inmate phone calls, analyze their communication patterns and flag certain words or phrases, including slang, that officials have pre-programmed into the system. A House Democratic aide said in an emailed statement that the panel had encouraged the DOJ “to engage with stakeholders while examining the feasibility of using such a system.”
Several state and local facilities across the country have already started using the technology, including in Alabama, Georgia and New York. The House panel wants the DOJ to examine the feasibility of using the technology at the federal level and to identify any gaps in the information it produces.
“It’s very disturbing – what if I say something wrong on a call?” said Bollin, who worries about accidentally getting her fiancé in trouble. “Could that be misinterpreted by this technology, and then he could be punished?” ‘EXPERIMENTAL SUBJECTS’
Privacy groups say the technology could amplify racial bias in the justice system and unfairly subject prisoners to unaccountable artificial intelligence. “This Congress should ban racist policing technology – it should not fund it,” said Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project (STOP), a New York-based advocacy group.
“People who have been caught up in the criminal justice system are constantly turned into test subjects for new technological systems.” Supporters push back on such criticism, saying the technology is an essential time-saving tool for law enforcement and does not target specific groups.
Oxford, Alabama Police Chief Bill Partridge said local forces have solved cold-case homicides after inmates were flagged talking on the phone about “actually committing the murder.” Partridge’s department is one of a handful of state agencies that have used software from LEO Technologies, a California-based company, which deploys natural language processing and transcription tools from Amazon Web Services (AWS) to process and flag inmate calls in near real time.
The police chief said the technology, called Verus, is particularly useful in preventing suicides. “I think if the federal government starts using it, it will prevent a lot of inmate deaths,” he said. Scott Kernan, CEO of LEO Technologies and former secretary of the California Department of Corrections and Rehabilitation, said the technology “saves lives both inside and outside the correctional environments we monitor.”
“Because we listen to all communication, we are not targeting any race, gender or protected group,” Kernan said. Specific public data on the number of calls flagged by Verus was not readily available.
SURVEILLANCE, PRIVACY ISSUES Prisoners have few legal protections with which to defend themselves if a machine wrongly flags them as a risk, said Bianca Tylek, founder of Worth Rises, a non-profit organization working on prison justice issues.
“I think the idea that a machine can hear and understand what a person is saying, and that it becomes some kind of tool in court, is ludicrous,” she said. Technology that transcribes voice conversations is flawed and has a particularly high error rate for Black voices, according to a 2020 study of five leading systems by researchers at Stanford University and Georgetown University.
“Speech-to-text technology is not in a place where it can be used to make these kinds of criminal justice decisions,” said Allison Koenecke, the study's lead author. The researchers found that Amazon’s automatic speech recognition software had an error rate for Black speakers nearly twice that for white speakers.
In a statement, an AWS spokesperson said the company’s Amazon Transcribe service was “very accurate,” but acknowledged that pronounced accents or poor audio quality can cause variations in individual words. They said the service “had never received any reports of abuse or misuse”, that it was inappropriate to attribute variations to a “single category” such as race, and that they were trying to identify potential areas for improvement.
In the United States, Black men are six times more likely to be incarcerated than white men, according to the Sentencing Project, a research group. Kentrell Owens, a computer scientist at the University of Washington who studies prison surveillance, said rigorous oversight of AI systems is essential.
“Before you implement technology that can affect people’s freedom, you need an independent assessment and audit of the tool to determine whether the technology is even helping you achieve your goals,” he said. AUTOMATING RACIAL PROFILING?
Advocates say the AI technology referred to in the spending bill report is an extension of existing prison surveillance technology developed by companies like Securus Technologies – a major provider of prison phone services. In a 2020 STOP report, Cahn examined the Securus platform, which uses automated speech recognition technology to record and analyze conversations for the New York State Department of Corrections and Community Supervision (DOCCS).
He said the software violated the privacy rights of prisoners and their families and had the potential to “automate racial profiling.” DOCCS said in a statement that it takes the safety of staff, visitors and incarcerated people “very seriously” and that it uses many tools in addition to the Securus software to achieve its goals.
A Securus spokesperson did not answer questions about Cahn’s report and potential biases, but said the company had not pushed for the AI language to be included in the House bill's report. For people like Bollin, the congressional initiative raises concerns that they will have to give up even more of their privacy in order to keep talking with their loved ones.
“We are supposed to be free people, we are not incarcerated,” she said. “But I feel like my rights are constantly being violated.”
(This story was not edited by Devdiscourse staff and is auto-generated from a syndicated feed.)