International Women’s Day provides an opportunity to reflect on a pressing problem in technology: gender bias in artificial intelligence (AI) systems. The tech industry has, admittedly, become aware of this problem and taken action, but there is still a great deal to do. AI systems are increasingly woven into our daily lives, from healthcare to lending, yet the need to eliminate bias remains urgent. Organizations must listen to women’s contributions and create platforms where women’s perspectives can be heard.
Generative AI such as ChatGPT has fueled interest in AI technology. Unfortunately, many of these systems still contain built-in biases that are undoing decades of progress in the representation of women. A recent study from the University of Washington showed that the AI art generator Stable Diffusion tends to produce images of light-skinned men when asked for a picture of a “person.” As similar tools become more deeply embedded in our lives, we risk a future that looks whiter and more male. To prevent this, organizations must give women a leading role in combating gender bias, for example in testing and monitoring AI systems and in identifying problems with them.
Managers should not wait for government regulation but take action themselves. They must listen to the women working in their organizations to create a future in which AI advances, rather than hinders, gender equality. Lenovo’s Women in AI group gives women a forum to speak freely about these topics; more organizations should follow suit. Now is the time for women to make their voices heard.
Developing unbiased AI
To design truly unbiased systems, gender bias must be considered from the outset and women must be included at every level. Building gender-diverse teams is critical, and companies need to screen AI for bias as thoroughly as they screen it for privacy and security issues. For a high-stakes technology like AI, it is worth broadening user testing so that the people testing the product are genuinely different from the people who designed it. Teams can also borrow from the experience of developing technology for people with disabilities, which helps in understanding and combating other forms of discrimination.
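What that screening might look like in practice will vary by product, but here is a minimal sketch, assuming a scikit-learn-style model with a predict method, a pandas test set that records a self-reported gender attribute, and a threshold the team has agreed on. The column names, threshold value, and bias_release_gate helper are illustrative, not an established Lenovo process; the idea is simply that a demographic-parity check can sit in the release pipeline next to the privacy and security gates.

    import pandas as pd

    def demographic_parity_gap(model, test_df, feature_cols, group_col="gender"):
        """Largest difference in positive-prediction rates between groups."""
        preds = model.predict(test_df[feature_cols])
        rates = pd.Series(preds, index=test_df.index).groupby(test_df[group_col]).mean()
        return float(rates.max() - rates.min())

    # Illustrative release gate: treat a large gap like a failed security scan.
    GAP_THRESHOLD = 0.05  # example value; each team has to set its own bar

    def bias_release_gate(model, test_df, feature_cols):
        gap = demographic_parity_gap(model, test_df, feature_cols)
        if gap > GAP_THRESHOLD:
            raise RuntimeError(f"Bias screen failed: demographic parity gap of {gap:.2f}")
        return gap

The specific metric matters less than the habit: the check runs automatically on every release, against a test set that a diverse group of testers helped assemble.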
Organizations cannot rest on their laurels. In the past, gender bias in AI was often clearly visible; think of women being offered lower credit limits by an algorithm. As AI spreads into areas such as education, however, newer and less obvious forms of discrimination may arise, for example a lack of support for women pursuing STEM education.
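Catching these less obvious forms requires ongoing monitoring rather than a one-off audit. As a minimal sketch, assuming decision logs can be loaded as a pandas table with hypothetical gender and credit_limit columns, a recurring job could compare average outcomes per group and flag skew for human review:

    import pandas as pd

    def average_limit_per_group(decisions: pd.DataFrame) -> pd.Series:
        """Average granted credit limit per gender group in the decision log."""
        return decisions.groupby("gender")["credit_limit"].mean()

    def flag_skew(decisions: pd.DataFrame, max_ratio: float = 1.10) -> bool:
        """Flag for review if one group's average exceeds another's by more than 10% (illustrative)."""
        means = average_limit_per_group(decisions)
        skewed = means.max() > max_ratio * means.min()
        if skewed:
            print("Review needed: average credit limits differ across groups")
            print(means.round(2))
        return skewed

A flag like this does not prove discrimination, but it gives the people reviewing the system something concrete to investigate.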
Make sure employees are heard
Every company is at the beginning of its AI journey and is taking the first steps to understand what “good” AI should look like. The starting point must be communication and awareness. Organizations need to ensure that employees – and especially women – feel comfortable expressing their opinions.
It is therefore important to create a “central point” where problems can be reported, and a culture in which women feel heard is just as crucial. Naturally, the problems that are raised must also be acted on. At Lenovo, for example, concerns were recently raised about a female avatar, which sparked an important, broader discussion about how women should be represented in AI. As with product testing, the broader and more diverse the discussion about possible gender bias, the better.
Collaborate
Collaboration, both within the technology industry and with external organizations, is key to addressing this problem in the long term. Companies should share best practices and partner with educational organizations to provide training on how to address bias in AI.
Organizations need to pursue a long-term vision and work with technical schools, raising awareness of gender bias among the people who will build tomorrow’s AI systems. It also remains important to get girls and young women interested in STEM fields, so that more women enter the workplace and the teams that design AI systems. According to Gartner, women make up only 26% of the IT workforce, and that needs to change quickly if we are to deliver unbiased AI systems.
An unbiased future
AI bias is not a simple problem, nor one that can be solved quickly, so it is important to strive for progress rather than perfection. It is also not a problem that a single committee can solve: everyone in the organization needs to be involved, and potential biases should be discussed at the start of every AI project. It is equally crucial that women have a voice and are heard at all levels. Only by looking at the problem from all perspectives can it be addressed in the long term. By making concrete progress now, companies can pave the way for a fairer and more diverse AI-driven future.
This is a post from Ada Lopez, Senior Manager, Product Diversity Office at Lenovo