How a bias in data could widen the gender gap

Think:Act Magazine

Munich Office, Central Europe
April 8, 2020

Caroline Criado Perez’s book Invisible Women calls for rethinking algorithms before it’s too late.

Data is the basis on which decisions are made, resources allocated and results measured. But what if it fails to represent large segments of the population, like … women? Leading thinkers are drawing attention to the hidden bias in the data we use to create our world.

We like to think of data as being objective, but the answers we get are often shaped by the questions we ask. When those questions are biased, the data is too: That's how Bill and Melinda Gates framed the problem of data bias in their 2019 annual letter. How much income did women in developing countries earn last year? How much property do they own? We do not know the answers to these questions because no one has thought to gather this data. Instead, the data gathered about women in developing countries focuses mainly on their reproductive health, reflecting the role society still assigns them first: that of wife or mother.

We live in an increasingly data-driven world, and that data influences personal, business and policy-making decisions. But what if the data we have fails to provide an accurate picture of the lives of half the population? In many countries we have come a long way since the first women's movements drew attention to the inequalities in society. Can it really be the case that we are still blind to so much unmet need?

In her book Invisible Women, published in 2019, British writer and feminist Caroline Criado Perez takes us on a journey through the modern world, highlighting where data gaps and bias exist in both developing and developed countries and revealing how these lead to detrimental outcomes for women. As Criado Perez shows, bias is present even in places where people assume it could not exist. It may not be malicious or even deliberate, but what we take for normal or standard often fails to recognize the needs of women, because too often, when we say human, we mean man.

Five ways to remove data bias

  1. The first step is to accept there is a problem. The next step is to collect gendered data. Then you can start to design evidence-based solutions around it.
  2. In design, do not prioritize technical parameters over the needs of the user. To find out what women need, consult them and listen to them.
  3. Women are usually in the minority in any decision-making body, so move away from majority-based toward unanimous decision-making.
  4. Accept that men interrupt more than women do, and that women are penalized if they behave in the same way. Allocate time limits for each person to speak.
  5. It is not good enough to encourage women to be more like men. Why should we accept that the way men do things is the correct way?

If data containing inherent bias is used in self-learning systems, the bias will be magnified.

Is everything gendered? Criado Perez opens her book with an example from Sweden, a country renowned around the world for its progressive attitudes on women's rights. In 2011 an initiative was introduced requiring officials to evaluate policies for gender bias. As they began the task, one official remarked that surely something as "straightforward" as snow clearing could not have a gender bias. But answering this question reveals the depth of the problem. Criado Perez shows how the way a public body decides to clear snow has a very different impact on men and women. To understand this, start by looking at how women and men generally use transportation. Men are still more likely to travel to work in the morning by car or train to an urban center, using major transportation routes designed specifically to support activity that is seen as economically worthwhile. That's why those routes get cleared of snow first.

Meanwhile, women are more likely to take children to nursery and school first, encumbered with pushchairs and schoolbags, often using local buses, before traveling to their place of work. What use is it to them if the major thoroughfares have been cleared of snow when the footpaths and side roads are still icy? If this sounds like little more than an inconvenience, consider the costs. Criado Perez says that data gathered in Sweden shows that women account for 69% of pedestrians injured during the winter months, often with fractures and dislocations. That's a cost for the whole of society, not just for those women.

"In order to design interventions that actually help women, first we need the data."

Caroline Criado Perez

Journalist and author of Invisible Women

From the way buildings and products are designed to drug development and testing, from how we assess merit and performance to the health and safety measures we implement, by drilling down to the granular details of everyday life, Criado Perez reveals the deep bias in the way we build and organize our societies. "Surely this isn't gendered?" is a question asked on the assumption that we have an ungendered standard or norm around which we design our institutions. But look more closely at that norm and compare it with the reality of women's lives, and it is clear that there is a divergence.

A huge amount of unmet need is the result – unmet need found in approximately half the population, with implications for both the public and private sectors. Unmet need can also be an opportunity for business – but discovering what those needs are and working out how to meet them requires a new approach. According to Gayna Williams, the former director of user experience at Microsoft and founder of the organization If She Can I Can, it requires a proactive effort to remove gender blindness. In her 2014 blog piece "Are you sure your software is gender-neutral?" she recommends that design teams trying to remove gender bias start by using "she" as the default pronoun, that a female customer be represented in demos and that feedback from women be explicitly sought. As Williams says, men should not assume they know the experiences, motivations and behaviors of their spouses, daughters or mothers; they should ask for their perspectives. Like Criado Perez, she also recommends that data be broken down by gender. Gender-neutral products do not happen by chance, she says.
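What "broken down by gender" buys a team can be shown in miniature. The sketch below is a hypothetical Python illustration (the data, the field names and the usability metric are invented for this example, not drawn from Williams or Microsoft) of how an aggregate figure can look unremarkable while the gender-specific figures diverge.

```python
from statistics import mean

# Hypothetical usability-test results: task completion time in seconds,
# tagged with the participant's gender.
results = [
    {"gender": "woman", "seconds": 48},
    {"gender": "woman", "seconds": 52},
    {"gender": "woman", "seconds": 55},
    {"gender": "man",   "seconds": 29},
    {"gender": "man",   "seconds": 31},
    {"gender": "man",   "seconds": 33},
]

# The aggregate figure looks unremarkable on its own...
print(f"Overall mean completion time: {mean(r['seconds'] for r in results):.0f}s")

# ...but breaking the same data down by gender exposes the gap.
for gender in ("woman", "man"):
    times = [r["seconds"] for r in results if r["gender"] == gender]
    print(f"{gender}: mean {mean(times):.0f}s over {len(times)} participants")
```

The same split works for any metric a team already tracks, from crash reports to feature adoption, but only if a gender field is collected in the first place.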

Caroline Criado Perez

British journalist Caroline Criado Perez, the author of Invisible Women, published her first book, Do It Like a Woman, in 2015. A feminist campaigner, she founded the Women's Room project for better representation of female experts in the media, and her campaigning led to the installation of a statue of suffragist leader Millicent Fawcett in Parliament Square, London, in 2018.

As algorithms take over more and more of the decision-making processes, it matters more than ever which assumptions we build our models on. If data containing inherent bias is used in self-learning systems, the bias will be magnified. This could affect areas of life ranging from job applications and health care to insurance premiums and credit ratings. Much of the information used to train algorithms is gendered, says Safiya Noble, associate professor and co-director of the UCLA Center for Critical Internet Inquiry. "We see gendered data collection in almost all of the mainstream platforms that are dependent upon collecting information about us to aggregate us into consumer segments for marketers and advertisers: Facebook, Instagram, Google Search and YouTube. These platforms make data profiles about us that deeply influence what we see in terms of search results, news, and products or services."
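What such magnification can look like is easy to sketch. The example below is a hypothetical Python illustration, not the pipeline of any platform named here: the data, the "cooking" label and the naive majority-vote model are all invented for the example. A two-thirds skew in the training labels becomes 100% of the model's output, because the model simply predicts the most common gender for each activity.

```python
from collections import Counter

# Hypothetical training labels: the activity shown in an image and the
# gender of the person shown. Two-thirds of "cooking" images show a woman.
training_labels = [("cooking", "woman")] * 200 + [("cooking", "man")] * 100

def fit_majority_model(labels):
    """A naive 'model' that predicts, for each activity, the gender it saw
    most often in training - the fallback a classifier relies on when no
    other feature is informative."""
    counts_by_activity = {}
    for activity, gender in labels:
        counts_by_activity.setdefault(activity, Counter())[gender] += 1
    return {activity: counts.most_common(1)[0][0]
            for activity, counts in counts_by_activity.items()}

model = fit_majority_model(training_labels)

# The skew in the training data: 67% of cooking images show a woman.
train_share = sum(1 for _, g in training_labels if g == "woman") / len(training_labels)
print(f"Women in training 'cooking' images: {train_share:.0%}")

# The skew in the model's output: it now labels every cooking image "woman".
predictions = [model[activity] for activity in ["cooking"] * 300]
print(f"Cooking images predicted 'woman': {predictions.count('woman') / len(predictions):.0%}")
# 67% in the data has become 100% in the output: the bias is magnified,
# not merely reproduced.
```

Real systems are far more sophisticated, but this tendency of models to lean toward the patterns that dominate their training data is one way a gap in the data becomes a wider gap in the decisions built on it.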

Noble believes that if we do not find a way to regulate algorithmically driven systems soon, we will see the normalization of a whole host of gendered and racially discriminatory systems that will then be difficult for us to intervene upon or change. And it goes far beyond snow clearing. "I think we should keep a very close watch on the credit, loan, investment and financial services industries where opaque algorithms decide who is credit-worthy, education-worthy, housing-worthy and, ultimately, who is opportunity-worthy," she says.

Mind the data gap

Most of our data is based on a norm that is male-centric. But a woman's day-to-day experience differs from a man's in many areas of life.

A day in the life of a female scientist

  1. Getting the kids to school
    Transport networks are designed to serve compulsory travel for economic activity – not care, which is what most women travel for. The networks tend to be radial: great for getting in and out of town, but not for traveling locally to drop off kids at school. Women do trip-chaining – stringing several short journeys together – and if there is no hopper fare, that means buying several tickets.
  2. Arrival at work
    The standard office temperature is based on the metabolic resting rate of the average 40-year-old, 70 kg man – offices are, on average, 5°C too cold for women. Personal protective equipment (PPE) like lab coats and overalls is designed around the male body – only 10% of women working in the energy sector wear PPE designed for women. The average smartphone is 5.5 inches – too big for the average woman to use one-handed.
  3. Paycheck
    Women earn between 31% and 75% less than men over their lifetimes. They are more likely to be in part-time work, which is generally paid less well. But we need the unpaid care work. "None of us, including business, could do without the invisible, unpaid work carers do," says Criado Perez. Pensions are increasingly based directly on past contributions, so women are penalized even further for the unpaid care work they do.
  4. After-work work
    Home again via school and shops, followed by unpaid work in the evening. Globally, women do three times the amount of unpaid care work men do. The failure to measure unpaid household services is the greatest gender data gap of all – it could account for up to 50% of GDP in high-income countries. Cuts in social care budgets simply shift the cost from the public sector to women, because the work still needs to be done. Of the cuts made in the UK following the 2008 financial crash, 86% fell on women.
  5. An evening out
    Automobile safety features are designed for the average male. As a result, a woman is 47% more likely to be seriously injured than a man in a car crash. Crash test dummies are based on the average male – 1.77 meters tall and weighing 76 kg.
  6. At the theater
    Why is public restroom space allocated on a 50/50 basis? Women take 2.3 times longer, go more often and often have children with them (a back-of-the-envelope calculation follows this list). In the developing world, the lack of public toilets means more than inconvenience – it causes health problems and puts women in danger. The UN says one in three women lack access to safe toilets. A typical Mumbai slum has six bathrooms for 8,000 women.
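The restroom arithmetic in point 6 can be made concrete. The calculation below is a hypothetical back-of-the-envelope sketch in Python: it assumes, purely for illustration, that equal floor space means an equal number of stalls and that a typical visit by a man takes two minutes, with the 2.3x factor taken from the article.

```python
# Hypothetical back-of-the-envelope calculation for point 6.
stalls_men = stalls_women = 10        # assume equal space gives equal stalls
visit_minutes_men = 2.0               # illustrative visit time for men
visit_minutes_women = 2.0 * 2.3       # women take 2.3 times longer (per the article)

# Throughput: how many people each side of the building can serve per hour.
served_per_hour_men = stalls_men * 60 / visit_minutes_men
served_per_hour_women = stalls_women * 60 / visit_minutes_women

print(f"Men served per hour:   {served_per_hour_men:.0f}")    # 300
print(f"Women served per hour: {served_per_hour_women:.0f}")  # 130

# With equal numbers of men and women arriving, the women's side serves
# fewer than half as many people per hour, so queues form there long before
# they do on the men's side. Equal waiting times would require allocating
# space roughly in proportion to visit time, not 50/50.
```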
