
Ethics and artificial intelligence: Women and LGBT+ people are a minority in AI teams

  • May 5, 2022


There is a gender gap in technology, and it affects the development of artificial intelligence

When developing algorithms, it is important to ensure that the data feeding these systems is diverse. Otherwise, artificial intelligence built on that information can repeat and reinforce biases. It is therefore increasingly important to consider ethics when producing this kind of technology.
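To illustrate the kind of check a development team can run on its own data, the sketch below computes a simple selection-rate gap (a demographic-parity measure) over hypothetical model decisions. The group labels, decisions and threshold interpretation are invented for the example and are not drawn from the IBM or UNESCO studies.

```python
# Minimal sketch (hypothetical data): check whether a model's positive
# decisions are distributed evenly across demographic groups.

from collections import defaultdict

# Each record: (group label, model decision). Purely illustrative values.
decisions = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 1), ("group_b", 0), ("group_b", 0),
]

totals = defaultdict(int)
positives = defaultdict(int)
for group, decision in decisions:
    totals[group] += 1
    positives[group] += decision

# Selection rate per group: share of records that received a positive decision.
rates = {group: positives[group] / totals[group] for group in totals}
gap = max(rates.values()) - min(rates.values())

print("selection rates:", rates)
print("demographic parity gap:", gap)  # a large gap suggests the data or model encodes bias
```

A check like this does not explain why a gap exists, but it makes disparities visible early enough for a team to examine its training data and design choices.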

The point is that, while companies today are aware of this situation, certain obstacles and limitations still have to be overcome to achieve more inclusive technologies.

According to a recent IBM study conducted with Oxford Economics, 68% of the organizations surveyed acknowledge that a diverse and inclusive workplace is important for reducing bias in artificial intelligence.

Despite this, the findings indicate that AI teams are substantially less diverse than their organizations’ wider workforces: 5.5 times less inclusive of women and 4 times less inclusive of LGBT+ people.

“Because many companies today use artificial intelligence algorithms in their business, they face growing internal and external demands to make these algorithms fair, secure and reliable. However, the industry has made little progress in incorporating AI ethics into its practice,” said Jesús Mantas, Global Managing Partner at IBM Consulting.

The report, based on a survey of 1,200 executives in 22 countries and 22 industries, does not go into the reasons why AI teams at these companies are less diverse; but this information can be analyzed in light of other data on the subject.

According to a UNESCO report, only 33% of women in higher education worldwide choose a scientific or technological career. The international organization specified that only 3% of female students in higher education choose information and communication technologies, while 5% choose science, mathematics and statistics.

Meanwhile, 8% choose engineering, manufacturing and construction, and 15% choose careers related to health and well-being, such as medicine or nursing.

There are many obstacles along these educational paths, ranging from gender stereotypes to the family responsibilities and prejudices that women face when choosing a field of study, the UNESCO report notes.

Stereotypes are built through advertisements, publications and stories that are repeated in society and in the media. Various researchers have noted that the arrival of the personal computer in the 1980s, and its popularization through advertising as a product tied to the male world, created patterns of exclusion of women.

Algorithms are used in a wide range of fields, including justice, finance and health (iStock)

Market research conducted in the 1980s and 1990s showed that teachers and family members were more likely to encourage boys to study mathematics, science and technology. The industry aligned itself with that segment and reinforced the narrative that computing and technology were “a matter for men”, which discouraged more and more women and other groups from devoting themselves to these disciplines.

In addition, a study by researchers at the University of Michigan and the University of Philadelphia found that LGBT+ professionals in STEM (science, technology, engineering and mathematics) fields are more likely than their non-LGBT+ peers to experience career limitations, harassment and professional devaluation. They were also more likely to say they would leave STEM fields as a result.

The stereotypes we use create a horizon of opportunity and determine what is expected of each sex. “From these images we define how we perceive others and ourselves, and they predispose us to certain attitudes and decisions.”

“One of the reasons for algorithmic bias is the lack of diversity in the teams that develop artificial intelligence, and the second is those teams’ lack of training in gender issues. Often it is not enough to add more diversity; these developers also need to be trained so that they are aware of this situation and take a more inclusive approach,” said one specialist in conversation with Infobae.

He added: “It is important to emphasize that training in the social sciences, ethics and diversity should be included in careers related to the development of this type of technology, since algorithms affect many aspects of life in society.”

What to do in this situation

The first step toward change is acknowledging the unequal situation that exists in artificial intelligence teams, something that, as mentioned above, almost 7 out of 10 companies surveyed have already done. At the same time, 88% of Latin American leaders recognize the importance of ethical artificial intelligence.


The next step is to take concrete initiatives to add more diversity to the teams that develop the algorithms shaping different aspects of life. In this regard, IBM’s research recommends several actions for business leaders, including:

1. Take a multidisciplinary and collaborative approach: Ethical AI requires a holistic approach and a broad set of skills from all parties involved in the process. C-suite executives, designers, behavioral scientists, data scientists and AI engineers each have a different role to play in the journey toward trustworthy artificial intelligence.

2. Establish governance to operationalize AI ethics: Take a holistic approach to incentivizing, managing and governing AI decisions throughout their life cycle, from establishing the right culture for developing AI responsibly to product practices and policies.

3. Reach beyond the company for partners: Extend the approach by identifying and engaging AI-focused technology partners, academia, startups and other ecosystem partners to build “ethical collaboration.”



Source: Infobae
