A criminal justice plan from presidential candidate Bernie Sanders would ban police from using facial recognition software.
The Democratic senator and 2020 hopeful also called for a pause on the use of algorithmic risk assessment tools in the criminal justice system until they are audited.
“We must ensure these tools do not have any implicit biases that lead to unjust or excessive sentences,” the Sanders proposal states.
An extensive ProPublica investigation in 2016 detailed how algorithmic tools used to predict which defendants would reoffend were flawed and biased.
Facial recognition can be used to identify people from live video feeds, often by comparing facial features with a database of faces, such as mugshots.
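At a high level, such systems convert each detected face into a numerical signature and compare it against signatures stored in a database. The sketch below illustrates that idea using the open-source face_recognition Python library; the file paths, names, and matching threshold are illustrative assumptions, not details of any police system.

```python
# A minimal sketch of database-style face matching, using the open-source
# face_recognition library. Paths and the tolerance value are placeholders.
import face_recognition

# Build a small "database" of known faces (e.g., booking photos).
known_encodings = []
known_names = []
for name, path in [("person_a", "mugshots/person_a.jpg"),
                   ("person_b", "mugshots/person_b.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip images where no face was detected
        known_encodings.append(encodings[0])
        known_names.append(name)

# Encode faces found in a still frame taken from a video feed.
frame = face_recognition.load_image_file("frame_from_feed.jpg")
for unknown in face_recognition.face_encodings(frame):
    # Compare against every known face; a lower distance means a closer match.
    matches = face_recognition.compare_faces(known_encodings, unknown, tolerance=0.6)
    distances = face_recognition.face_distance(known_encodings, unknown)
    for name, is_match, dist in zip(known_names, matches, distances):
        if is_match:
            print(f"Possible match: {name} (distance {dist:.2f})")
```

Real deployments work at far larger scale and with live video rather than single frames, but the core step is the same comparison of a probe face against a gallery of stored faces.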
“Police use of facial recognition software is the latest example of Orwellian technology that violates our privacy and civil liberties under the guise of public safety and it must stop,” Sarah Ford, a Bernie Sanders campaign spokesperson, told CNN Business on Monday.
“Bernie is proud to join cities like San Francisco in banning the use of this technology for policing and, as president, will enact a nationwide ban on facial recognition software for policing, including at the state and local levels,” Ford added.
In May, San Francisco banned police and local government departments from using facial recognition technology — becoming the first city in the United States to do so. Somerville, Massachusetts, and Oakland, California, have also banned the use of such technology.
San Francisco Supervisor Aaron Peskin, who introduced the city’s bill, told CNN Business earlier this year that the technology is “so fundamentally invasive” that it shouldn’t be used.
While San Francisco banned the use of facial recognition systems by local authorities, federally controlled facilities, like San Francisco International Airport, are exempt.
In July, The Washington Post reported on how federal agencies like the FBI and ICE were using the technology.
The American Civil Liberties Union is concerned the technology can be biased and inaccurate and could disproportionately impact women and people of color. There are no federal guidelines to limit or standardize the technology’s use.
Supporters of police use of the technology say it can be an important tool in fighting crime.
“This technology allows law enforcement agencies to compare images of hundreds of thousands of individuals, which saves time and agency resources,” John Mirisch, the mayor of Beverly Hills, wrote in June.