Why AI ethics should be top of your mind

Natalia Modjeska, Director, Research & Advisory Services (Data & Analytics)
Info-Tech Research Group

When speaking to clients about AI, especially ethics, fairness, and algorithmic/AI bias, I usually offer a list of resources for anyone wanting to go deeper. It includes books by Cathy O’Neil, Virginia Eubanks, Caroline Criado Perez, Jennifer Eberhardt, and Frank Pasquale, and videos by Kate Crawford and Arvind Narayanan. (See “Resources for Further Learning” at the bottom of this article.)

Have another look at the names in that list. Notice anything interesting? Neither did I, until a client pointed it out: most of these books and videos are by female authors! Is this subconscious bias at play, a female analyst favouring fellow female writers, researchers, and authors? Perhaps… Or maybe women, especially women of color and of non-Caucasian origin, are more acutely attuned to the injustices, discrimination, bias, and prejudice that permeate our societies and cultures?

Perhaps women write about these topics because we live in a male-dominated world? A world where many technologies have been invented by men, for men (or at least credited to men)? A world where so many of the products around us have been created with a “one-size-fits-all” approach, and that one size is the one that best fits men?

Caroline Criado Perez opens her book “Invisible Women” with this provocative statement: “Seeing men as the human default is fundamental to the structure of human society.” She continues: “It’s an old habit and it runs deep.” So deep, in fact, that we continue to rely on it while fundamentally reshaping our world through AI and its underlying technology, machine learning (ML).

AI is being used to automate many processes, workflows, and systems that make consequential decisions: who gets jobs, healthcare, or social assistance; who has access to opportunities in life such as loans or a good education; who stays in jail and who goes free.

Many of these systems, and the decisions they make, are unfortunately biased. Moreover, they disproportionately impact people of color, the elderly, vulnerable populations, and people from marginalized backgrounds. And women. And if you think that you may be spared, think again. Amazon’s sexist AI recruitment tool, the Apple Card example, and a recent UK exam debacle show that it could happen to any one of us, to our children, spouses, parents, siblings, and friends.

And this is just the beginning. As the scale, scope, speed, and degree of AI adoption and the resulting automation increase, we will see more examples of algorithmic discrimination, automated racism, sexism, misogyny, ageism, “deepfakes,” and large-scale manipulation of public opinion and human behavior (“nudging”). And because AI/ML depends on big data, that interdependency creates a toxic feedback cycle: biased systems lead to bad predictions, discrimination, and harm, which in turn generate more biased data that reinforces the system, making it even more biased and dangerous.
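To make that feedback cycle concrete, here is a minimal, purely illustrative Python sketch. The groups, rates, and the extra 10% penalty applied to the disadvantaged group are hypothetical assumptions, not data from any real system: a naive “model” that simply learns approval rates from a historically skewed decision log writes its own decisions back into that log, and the gap between the two groups widens with every retraining round.

```python
# Hypothetical simulation of the biased-data feedback cycle.
# Both groups are equally qualified, but the historical log records
# far fewer approvals for group "B".
import random

random.seed(42)

# Historical log of (group, approved) decisions.
log = [("A", random.random() < 0.70) for _ in range(500)] + \
      [("B", random.random() < 0.40) for _ in range(500)]

def learned_rate(history, group):
    """Naive 'model': the approval rate observed for a group in the log."""
    decisions = [approved for g, approved in history if g == group]
    return sum(decisions) / len(decisions)

for round_num in range(1, 6):
    rate_a = learned_rate(log, "A")
    rate_b = learned_rate(log, "B")
    print(f"round {round_num}: learned approval rate A={rate_a:.2f}, B={rate_b:.2f}")
    # New decisions mirror the learned rates; group B is additionally damped
    # by 10% (an assumed stand-in for extra scrutiny or harsher thresholds).
    # These decisions are appended to the log used for the next "retraining",
    # so the disparity compounds round after round.
    log += [("A", random.random() < rate_a) for _ in range(100)]
    log += [("B", random.random() < rate_b * 0.9) for _ in range(100)]
```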

Much of this damage is accidental. It arises because we use biased data, and because the majority of AI/ML developers, scientists, professors, and coders are male: young, white, North American men, to be more precise. Here are some stats:

  • Only 18% of data science roles are occupied by women [1]
  • 11% of data teams don’t have any women on them at all [1]
  • Only 18% of authors at leading AI conferences are women [2]
  • More than 80% of AI professors are men [2]
  • Women comprise only 15% of AI research staff at Facebook and 10% at Google [2]
  • Only 2.5% of the workforce at Google is black, and 4% at both Facebook and Microsoft [2]

[1] Bayern, 2019; [2] Myers West et al., 2019

And these numbers get even smaller if you look at them intersectionally (e.g., at black women)! I recently spoke with a black woman who works as a data solutions architect, another occupation that is predominantly male; black women comprise only about 2% of data architects.

Why does it matter? Because when we build systems and structures, including AI/ML, we bring with us individual cognitive biases, as well as cultural and societal biases, preconceptions, attitudes, perspectives, and experiences. We subconsciously inject them into the products we create. That is how we humans operate.

The situation is slowly changing as public awareness of algorithmic bias and discrimination increases, thanks to the tireless efforts of many female (and male) scientists, journalists, and authors. Alas, achieving some semblance of gender and racial balance will take time, because technology is a form of power, and AI is a form of power.

So perhaps it is not surprising, after all, that many books on data and AI ethics are written by women. “For too long we have positioned women as a deviation from standard humanity,” writes Caroline Criado Perez. “It’s time for a change in perspective. It’s time for women to be seen.”

It’s also time for women to act:

  • Learn about AI and algorithmic biases, what they are and how they emerge; how to adopt and use AI ethically; and how to build and deploy fair, transparent and responsible AI-driven products and services
  • Engage in discussion forums, public consultations, and the work of civil-society organizations and non-profits working to create ethical AI standards, certification and audits
  • Report AI discrimination and abuses to relevant bodies and investigative outlets, such as ProPublica
  • Ask your technology providers tough questions about how they ensure that their products and services are ethical and don’t discriminate
  • Ensure that your own AI development teams are knowledgeable and diverse

Whether you are a decision-maker, user, developer, researcher, or consumer, this knowledge will help you make more informed decisions, both professionally and as a citizen and consumer. Engagement should lead to better, fairer, more ethical and responsible AI systems that benefit us all. It is our duty to ourselves, our daughters, cousins, sisters, and friends, and to generations of women (and men) to come, to make the world more equitable.

Resources for Further Learning


You can hear more from Natalia at the Women in IT Virtual Summit Europe on 15th September 2020. You can view the agenda and register for your free place here.