Perhaps it is no accident that Yann LeCun, director of Facebook AI Research and one of the pioneers of artificial intelligence, is a white man. According to a new report, most of those working in the field are white and male: the share is around 80%, and reaches 90% at Google. Facebook fares slightly better, where women at least make up 15% of AI staff.
According to researchers at the AI Now Institute of New York University – the New York-based organization that studies the social implications of artificial intelligence and authored the analysis – the data is worrying.
Machines, in fact, analyze reality from their creators' point of view and can inherit their prejudices. And that is not all: the lack of diversity within companies also undermines the effectiveness of their products. An MIT Media Lab study, for example, found that facial recognition systems identify light-skinned individuals very well but often make mistakes when asked to do the same with people of color.
"The problem of the lack of diversity in the artificial intelligence sector is well documented and affects every workplace," the researchers write. "One need only look at the offices, the universities, the disparities in recruitment and promotion, and even the systems on the market, which reflect and amplify these prejudices – which is why we find ourselves talking once again about biological determinism."
The institute's researchers complain that the policies adopted so far have had no effect. The report notes, for example, that initiatives to bring more women into the field remain very limited and tend to benefit only white employees, while their Black colleagues remain excluded. Furthermore, companies often continue to address machine biases one case at a time, without a broader, shared strategy.
According to the team, it would be more effective to acknowledge the problem – something that has not yet been done – and act accordingly, hiring people of different ethnicities and giving them the opportunity to advance when they deserve it. Companies should also commit to transparency, publishing reports that record who is hired, who is promoted, and how individual employees are paid: a measure that would foster corporate accountability and help firms monitor their progress on diversity.
Finally, the machines themselves should continue to be tested, and their social impact evaluated. "These systems must be fair, not just safe," the report's authors write.