Gender Inequity in Artificial Intelligence

For women, hurdles are everywhere. Despite the critical role women play in societies, unequal access to education, loans, jobs, healthcare, technology, and political discourse is commonplace, and the COVID-19 pandemic has made it worse.

Technological innovations like artificial intelligence (AI) promise to identify and close these gaps with a more data-driven, objective approach, yet they can ironically pose another hurdle for women. Too often, these digital systems inadvertently carry the same old analog gender biases.

Let’s take an example.

Imagine that you run a small woman-owned business from home. Excited by its early success, you apply for a loan to hire staff and rent more space. But you’re soon dejected: every bank approved you for a smaller loan than you requested, while a friend got his loan request approved in full. You’re surprised, because his assets and savings are similar to yours. The only obvious difference is that you’re a woman.

You learn that your credit application was evaluated not by a person, but by a machine. Banks use AI to assign a creditworthiness score to each applicant. Using a machine learning algorithm, the tool ‘learns’ patterns associated with higher creditworthiness from previous applicants’ data and their repayment histories. Because historically more men than women received loans, and thus had repayment records, the algorithm concluded that men were more creditworthy. Although banks turned to AI to make lending more objective and equitable, the results were the opposite.
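
To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python. All of the data is synthetic and every variable name is invented for this example; it is not drawn from any real lending system. A model trained on historically biased approval decisions reproduces that bias, scoring a woman lower than a man with identical finances.

```python
# Illustrative sketch only: a model trained on historically biased
# lending decisions reproduces that bias for new applicants.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic "historical" applicants: gender (1 = woman), with assets
# and savings drawn from the same distribution for everyone.
gender = rng.integers(0, 2, n)
assets = rng.normal(50, 10, n)
savings = rng.normal(20, 5, n)

# Past approvals depended on finances AND on gender (the historical
# bias): otherwise-identical women were approved less often than men.
logit = 0.05 * assets + 0.1 * savings - 1.5 * gender - 3.0
approved = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([gender, assets, savings])
model = LogisticRegression(max_iter=1000).fit(X, approved)

# Two new applicants with identical finances, differing only by gender.
man, woman = [0, 55, 25], [1, 55, 25]
print(model.predict_proba([man, woman])[:, 1])
# The woman's score comes out lower: the model has learned the
# historical pattern, not an objective measure of creditworthiness.
```

The point of the sketch is that nothing in the pipeline is malicious; the model simply fits the data it is given, and the data encodes the past.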

Smallholder farmers in Uganda test AI-enabled digital crop advisory tools as part of the Fall Armyworm Tech Prize / CABI_Invasives

This is not a hypothetical example. There are now-famous instances of the unintended consequences of AI, such as automated resume screeners rejecting women, facial recognition disproportionately failing for women, and algorithmic credit-scorers ranking women lower than men. AI tools are being tested and used in developing economies to derive insights and gain efficiencies across sectors, and as we rely more and more on them to give loans, diagnose diseases, triage medical care, and respond to humanitarian crises, we must work to prevent them from discriminating. There is both an opportunity and an urgency to build AI that is innovative and equitable, especially in developing countries.

Development actors are taking steps to address disparities, for example, using AI to close gender-related data gaps in child marriage. Through the WomenConnect Challenge, USAID is beginning to tackle algorithmic gender bias in lending and is committed to taking action on the fair development and use of AI more broadly, working with partners to create a report and online course to better integrate AI fairness in development.

But we know there are many more ways that bias manifests in AI. Contributors to these harmful outcomes can include unrepresentative datasets, predominantly male data science teams, cultural norms around gender, and local policies and practices around data, among many others. A practical first step toward recognizing such bias is to audit a system’s outputs for group disparity, as in the sketch below.
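
The following sketch, again with invented data, computes a simple disparate-impact ratio: the approval rate for women divided by the approval rate for men. The widely used “four-fifths rule” heuristic flags ratios below 0.8 as a warning sign. This is only one of many fairness metrics, and a low ratio signals a disparity to investigate rather than proof of discrimination.

```python
# Illustrative sketch only: auditing a system's decisions for group
# disparity with the disparate-impact ratio.
import numpy as np

def disparate_impact(decisions: np.ndarray, group: np.ndarray) -> float:
    """Approval rate of group 1 divided by approval rate of group 0."""
    rate_g1 = decisions[group == 1].mean()
    rate_g0 = decisions[group == 0].mean()
    return rate_g1 / rate_g0

# Hypothetical audit data: decisions (1 = approved) and group
# membership (1 = woman) for ten applicants.
decisions = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])
group     = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

ratio = disparate_impact(decisions, group)
print(f"Disparate-impact ratio: {ratio:.2f}")
# ~0.33 here, well below the 0.8 threshold: a clear flag to investigate.
```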

Women in the Dominican Republic participate in a focus group discussion about their experience with gender attitudes in the credit market. / Unica Mendez, IPA

Gender equity is critical to achieving AI fairness, and as we work to build an agenda for action, we want to hear from you. In recognition of Women’s History Month, we are shining a spotlight on inequitable gender outcomes related to AI in the developing world, and we need your help.

In the comments, or by highlighting and upvoting, please share your experiences and perspectives. You can comment anonymously, but please indicate which country you live in or are referring to. We are especially keen to hear from people in developing countries. The following questions are not meant to be exhaustive or prescriptive, but might help guide your responses.

What’s going wrong?

If you’ve experienced AI-related gender inequity, or are aware of AI use leading to it, please share what happened, how you recognized it, and what you think facilitated it. What barriers exist for gender-equitable AI tools in your context? Are there systems/laws/policies/norms contributing to AI-related gender inequity?

What’s going right?

Are there efforts by the private sector, government, civil society, or other stakeholders to recognize and promote equitable AI? Are there examples of mitigating a potentially gender-inequitable outcome?

What’s possible?

Do you have ideas for reducing AI’s harmful impact on women, especially in developing countries? What would it take to put your idea(s) into practice?

Who could make change in this space?

Which entities or organizations have influence in this space and can make meaningful contributions? Who should or shouldn’t have a bigger voice?

We see power in collectively sharing stories, but if you feel reluctant to openly share your experience, we invite you to reach out more privately to genderAI@usaid.gov.

The development community is committed to gender equality and women’s economic empowerment. We must ensure that using AI does not reverse the substantial gains of the past decades. Together, we can raise awareness about gender inequity in AI and take action to change course. Join us.

About the Authors

Shachee Doshi is an Emerging Technologies Advisor on the Strategy & Research team, and Meredith ‘Beth’ Perry is an Open Innovation Advisor, both within the Innovation, Technology, and Research Hub of USAID’s Bureau for Development, Democracy, and Innovation.