Thursday, September 29, 2022

Can robots be more or less racist?

Kuldeep Singh

Protests against structural racism in the United States appear to have had a lasting effect. Artificial intelligence (AI) developers have now set out to confront the problem of racist bias in many of their technologies. Here is a look at the strategies they are using to change the course of robots and AI.

Technology that reproduces racist prejudices has caused a lot of controversy in recent times, mainly in the United States. An investigation of the subject by The New York Times notes that in Dallas in 2016, a police robot killed a person for the first time; the man was black. The device carried a bomb to where he was and detonated it. The following year, police in Maine used similar technology to do exactly the same thing.

Now there are robots that use facial recognition, that can predict people’s actions, or that can decide on their own whether to fire “non-lethal” projectiles.

After the first case, the NYT recalls, some robotics researchers raised concerns, arguing that the robots used by police bomb squads are marketed as tools to safely dispose of bombs, not to kill people. The same applies to other technologies.

“The question is: Do we as roboticists want to make it easier for the police to do what they’re doing now?” Tom Williams of the Colorado School of Mines asked the NYT. “I live in Denver, and this summer during the protests I saw the police tear-gassing people a few blocks away from me.”

But how technology is used is not the only problem. Many of the algorithms in use are biased against black people and others who do not resemble their designers: wealthy, healthy white men. Over the last decade, facial recognition technologies have repeatedly been shown to work better on white people’s faces than on other people’s.

“It is disconcerting that robot peacekeepers, including police and military robots, will, at some point, be given increased freedom to decide whether to take a human life, especially if problems related to bias have not been resolved,” wrote Ayanna Howard, a robotics researcher at Georgia Tech, and Jason Borenstein, a colleague from the same university’s school of public policy, in the journal Science and Engineering Ethics, according to the NYT.

Not everything is lost

AI and robotics researchers are lobbying to change the way technologies are developed and used. The organization Black in Computing released a statement and proposal that, according to the NYT, have been signed by nearly 200 black computer scientists and more than 400 allies (whether they are black academics in other fields or non-black people working in related areas), to refuse to work with or for law enforcement agencies.

They allege that the technologies they help “create to benefit society are also disrupting black communities through the proliferation of racial profiling.” And so they declare: “Without justice, there are no robots.”

The statement also calls for reforms, including ending the harassment of black students by campus police officers and addressing the fact that black people receive constant reminders that others believe they don’t belong.

There are also other proposals. University of Alabama professor Chris S. Crawford told the NYT that he believes the problem of bias could be solved if the people at the table designing the technology better resembled the American population.

A similar point was made to the NYT by Odest Chadwicke Jenkins, a robotics and AI researcher at the University of Michigan: “The bigger issue is, really, representation in the room — in the research lab, in the classroom, and the development team, the executive board.” He insisted that ethical discussions must be rooted in that first fundamental question of civil rights.

Others consider that refusing to work with law enforcement would be a mistake. Such is the case of Williams, and of Cindy Bethel, director of the Social, Therapeutic and Robotic Systems Lab at Mississippi State University. According to the NYT, Bethel believes that robots can make police work safer for both officers and civilians.

Bethel and her team are now developing a robot equipped with night-vision cameras that would allow officers to scan a room before entering it. She thinks this would make encounters safer for everyone. “Everyone is safer when there isn’t the element of surprise, when police have time to think,” she said.

“If external people who have ethical values aren’t working with these law enforcement entities, then who is?” Howard told the NYT. “When you say ‘no,’ others are going to say ‘yes.’”

Whatever strategy developers take, the fact remains that, for the first time, many roboticists and technologists have signed statements assuming responsibility for addressing racism from within their laboratories. “I think the protests in the street have really made an impact,” Jenkins told the NYT.
