Rethinking Growth
Think:Act looks at smart growth, from efficiency and ideal group size to dark tourism, sports technology innovation at FC Barcelona and the data gender gap.
Data is the basis on which decisions are made, resources allocated and results measured. But what if it fails to represent large segments of the population, like … women? Leading thinkers are drawing attention to the hidden bias in the data we use to create our world.
We like to think of data as objective, but the answers we get are often shaped by the questions we ask. When those questions are biased, the data is too: That is how Bill and Melinda Gates framed data bias in their 2019 annual letter. How much income did women in developing countries earn last year? How much property do they own? We do not know, because no one has thought to gather the data. Instead, the data collected about women in developing countries focuses mainly on their reproductive health, reflecting the role society assigns them: that of wife or mother.
We live in an increasingly data-driven world. It influences personal, business and policy-making decisions. But what if the data we have is failing to provide an accurate picture of the lives of half the population? In many countries we have come a long way since the first women's movements drew attention to the inequalities in society. Can it really be the case that we are still blind to so much unmet need?
In her book Invisible Women, published in 2019, British writer and feminist Caroline Criado Perez takes us on a journey through the modern world, highlighting where data gaps and bias exist in both developing and developed countries and revealing how they lead to detrimental outcomes for women. As Criado Perez shows, bias appears even where people assume it could not exist. It may not be malicious or even deliberate, but what we take as normal or standard often fails to recognize the needs of women, because when we say human, we often mean man.
Is everything gendered? Criado Perez opens her book with an example from Sweden, a country renowned for its progressive attitudes on women's rights. In 2011 an initiative was introduced requiring officials to evaluate policies for gender bias. As they began the task, one official remarked that surely something as "straightforward" as snow clearing could not have a gender bias. But answering that question reveals the depth of the problem. Criado Perez shows how the way a public body decides to clear snow affects men and women very differently. To understand why, start with how men and women generally use transportation. It is still the case today that men are more likely to commute in the morning by car or train to an urban center, using major transportation routes designed to support activity seen as economically worthwhile. That is why those routes get cleared of snow first.
Meanwhile, women are more likely to take children to nursery and school first, encumbered with pushchairs and schoolbags, often using local buses, before traveling to their place of work. What use is it to them if the major thoroughfares have been cleared of snow while the footpaths and side roads are still icy? If this sounds like little more than an inconvenience, consider the costs. Criado Perez cites Swedish data showing that women account for 69% of pedestrians injured during the winter months, often suffering fractures and dislocations. That is a cost for the whole of society, not just for those women.
From the way buildings and products are designed to drug development and testing, from how we assess merit and performance to the health and safety measures we implement, Criado Perez drills down to the granular details of everyday life to reveal the deep bias in the way we build and organize our societies. "Surely this isn't gendered?" is a question asked on the assumption that there is an ungendered standard or norm around which we design our institutions. But look more closely at that norm and compare it with the reality of women's lives, and the divergence is clear.
The result is a huge amount of unmet need, found in roughly half the population, with implications for both the public and private sector. Unmet need can be an opportunity for business, but discovering what those needs are and working out how to meet them requires a new approach. According to Gayna Williams, the former director of user experience at Microsoft and founder of the organization If She Can I Can, it requires a proactive effort to remove gender blindness. In her 2014 blog piece "Are you sure your software is gender-neutral?" she recommends that design teams trying to remove gender bias start by using "she" as the default pronoun, that a female customer be represented in demos and that feedback from women be explicitly sought. As Williams says, men should not assume they know the experiences, motivations and behaviors of their spouses, daughters or mothers; they should ask for their perspectives. Like Criado Perez, she recommends that data be broken down by gender, as the sketch below illustrates. Gender-neutral products, she says, do not happen by chance.
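Breaking data down by gender is straightforward to act on in practice. The following is a minimal sketch in Python; the dataset, field names and scores are all hypothetical, invented for illustration rather than drawn from any real product. It shows how a healthy-looking aggregate metric can conceal exactly the kind of gap Williams warns about, one that only becomes visible once the same data is disaggregated.

```python
from statistics import mean

# Hypothetical product-feedback records; all values are invented
# for illustration. In a real pipeline these would come from a
# survey or analytics export that records respondent gender.
feedback = [
    {"gender": "female", "satisfaction": 3.1},
    {"gender": "female", "satisfaction": 2.8},
    {"gender": "female", "satisfaction": 3.3},
    {"gender": "male",   "satisfaction": 4.5},
    {"gender": "male",   "satisfaction": 4.2},
    {"gender": "male",   "satisfaction": 4.6},
]

# The aggregate looks acceptable ...
overall = mean(r["satisfaction"] for r in feedback)
print(f"overall: {overall:.2f}")          # overall: 3.75

# ... but disaggregating by gender reveals that the product
# works far better for one group than the other.
for g in ("female", "male"):
    scores = [r["satisfaction"] for r in feedback if r["gender"] == g]
    print(f"{g}: {mean(scores):.2f}")     # female: 3.07, male: 4.43
```

A team looking only at the overall score of 3.75 would see nothing to fix; the disaggregated view shows where the unmet need actually sits.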
British journalist and author of Invisible Women Caroline Criado Perez published her first book, Do It Like a Woman, in 2015. A feminist campaigner and founder of the Women's Room project for better representation of female experts in the media, her efforts led to the installation of the statue of suffragist leader Millicent Fawcett in Parliament Square, London, in 2018.
As algorithms take over more and more decision-making processes, it matters more than ever which assumptions we build our models on. If data containing inherent bias is used in self-learning systems, the bias will be magnified. This could affect areas of life ranging from job applications and health care to insurance premiums and credit ratings. Much of the information used to train algorithms is gendered, says Safiya Noble, associate professor and co-director of the UCLA Center for Critical Internet Inquiry: "We see gendered data collection in almost all of the mainstream platforms that are dependent upon collecting information about us to aggregate us into consumer segments for marketers and advertisers: Facebook, Instagram, Google Search and YouTube. These platforms make data profiles about us that deeply influence what we see in terms of search results, news, and products or services."
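How a self-learning system can magnify a bias rather than merely reproduce it can be made concrete with a deliberately simplified example. The sketch below is hypothetical throughout: the training records, the 60%/40% hiring rates and the majority-vote "model" are assumptions chosen to make the mechanism visible, not a description of any real system. A model that simply predicts the most likely outcome for each group turns a 20-point gap in the data into an absolute one in its decisions.

```python
from collections import Counter

# Hypothetical historical records: (gender, was_hired).
# The 60%/40% skew is an assumption for illustration.
training_data = (
    [("male", True)] * 60 + [("male", False)] * 40 +
    [("female", True)] * 40 + [("female", False)] * 60
)

# "Training": estimate P(hired | gender) from the biased data.
counts = Counter(training_data)

def p_hired(gender):
    hired = counts[(gender, True)]
    return hired / (hired + counts[(gender, False)])

# "Decision rule": predict the majority outcome for each group,
# as a classifier minimizing error on this data would.
def model_hires(gender):
    return p_hired(gender) >= 0.5

for g in ("male", "female"):
    print(f"{g}: P(hired)={p_hired(g):.2f} -> model hires: {model_hires(g)}")
# male: P(hired)=0.60 -> model hires: True
# female: P(hired)=0.40 -> model hires: False
```

The data showed a 60/40 gap; the model's decisions show 100/0. Thresholding a biased signal hardens a statistical tendency into a categorical rule, which is the magnification effect in miniature.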
Noble believes that if we do not find a way to regulate algorithmically driven systems soon, we will see "the normalization of a whole host of gendered and racially discriminatory systems that will then be difficult for us to intervene upon or change." And it goes far beyond snow clearing: "I think we should keep a very close watch on the credit, loan, investment and financial services industries where opaque algorithms decide who is credit-worthy, education-worthy, housing-worthy and ultimately, who is opportunity-worthy," she says.
Most of our data is based on a norm that is male-centric. But a woman's day-to-day experience differs from a man's in many areas of life.