UN decries use of ‘big data’ collected via AI to reinforce racial bias

FILE PHOTO: DC National Guard military police officers look on as demonstrators rally near the White House against the death in Minneapolis police custody of George Floyd, in Washington, D.C., U.S., June 1, 2020. REUTERS/Jonathan Ernst/File Photo

GENEVA (Reuters) – Police and border guards must combat racial profiling and ensure that their use of “big data” collected via artificial intelligence does not reinforce biases against minorities, United Nations experts said.

Sellers of algorithmic profiling systems, which public bodies and private firms often use to screen job applicants, must be regulated to prevent the misuse of personal data in ways that perpetuate prejudice, they said.

“It’s a rapidly developing technological means used by law enforcement to determine, using big data, who is likely to do what. And that’s the danger of it,” Verene Shepherd, a member of the UN Committee on the Elimination of Racial Discrimination, told Reuters.

“We’ve heard about companies using these algorithmic methods to discriminate on the basis of skin colour,” she added, speaking from Jamaica.

Shepherd, a historian, led the 18 independent experts in drafting a “general recommendation” to the 182 countries that have ratified a binding international treaty prohibiting racial discrimination.

Minorities and activists have complained about the growing use of artificial intelligence, facial recognition and other new technologies, she said.

“It’s widely used in the United States of America, and we’ve had complaints from black communities in the European Union as well. And Latin America where people of African descent and indigenous people complain about profiling,” Shepherd said, citing Brazil and Colombia. “These are the hotspots where we hear about cases of profiling being more prevalent.”

Protests against racism and police brutality erupted across the United States after George Floyd, an African-American, died in May when a Minneapolis police officer knelt on his neck for nearly nine minutes.

Many police forces use “predictive” profiling systems that prompt identity checks, traffic stops and searches based on past arrest data for a neighbourhood, Shepherd said.

The committee recommends compensation for people who have been targeted, she said, adding: “If they live to tell the tale, by the way, because we know sometimes it ends up badly.”

(Reporting by Stephanie Nebehay; Editing by Nick Macfie)
