Review in Hungarian here: Újegyenlőség
Have you ever looked into how large the average handspan is? Or how female and male handspans differ? If you play the piano, you may have noticed that keyboards are designed to fit the average male hand. So what, you may wonder. Well, the consequences can be measured with statistics, and so can the disadvantage: women suffer disproportionate work-related injuries, and among instrumentalists, keyboard players are most at risk. Not to mention the 'successful performers': injuries disqualify one from getting into that club, and yes, a larger hand span makes one more likely to get there. This is just one example of a problem we encounter almost everywhere: design bias based on the 'average man' assumption: buildings, offices, public toilets, air-conditioning settings, to name just a few.
The book unfolds with a sea of data, macro- and micro-level. The data is primarily UK- and US-focused, but it is systematically, within the limits of availability and space, contrasted and complemented with data from developing countries, with a special focus on the global South. It turns out that, despite differences in social and cultural contexts (which development and "aid" projects fail to take into account), subtle and explicit mechanisms of segregating women, or of neglecting women's needs and situations, are an overarching problem. In emergencies and disasters, shelters and refugee centers with gender-neutral, remote toilets and shared spaces evoke the same problems of violence and rape regardless of location: Sweden, the US, Lebanon, or India. The same goes for the series of examples on women's political participation: whatever electoral process they go through, and whether or not it favors their presence, harassment (verbal and sexual), silencing by institutional or situational means, and crowding women out of decision-making through informal ties are all present, in Argentina or New Zealand, in Afghanistan or Sweden. In politics, women are perceived as more aggressive than men for the same behavior.
The problem with data bias is that it hides the various ways women are being harmed; that is the focal point of the book. And the story continues. What is the solution? Collect gender-specific data in every sector and context. A further solution lies at the level of political representation: drawing on the case of the electoral system, Perez highlights possible points of intervention to bring more gender parity into the system. To back this argument, it would be even more powerful to learn from the cited best practices with timely social data. The reader gets hungry for data, being swept along by it throughout the book.
We need to rewire the system, as "The higher the socioeconomic status of women in a country the lower the sex gap in deaths" (p. 300).
Participation in politics is therefore important, but a more radical systemic change is needed, since the last decade shows an opposite trend.
Perez goes on to scrutinize, at the macro level, employment structures and social welfare systems, pointing out anomalies and drawing attention to the problem of GDP: it leaves out the contribution borne by women, especially as ever more responsibility for care, social provision, and education is shifted onto households. Moreover, the value of housework has only been estimated, never properly calculated.
In Science-Technology-Society (STS) scholarship, a growing strand looks into biases in machine design. One example: the screen-size fixation in smartphone production. The bigger-the-better principle seems driven by a men's-needs-first perspective: phones are designed for male hands, making one-handed use difficult for women; just think of trying to take a good photo. There are fields, however, which are gender-biased in the other direction: products designed for, or simply targeted at, women.
Guess what: AI is also gender-biased. It turns out that search engines, voice recognition, and translation services run on algorithms trained on corpora that associate some terms more closely with men than with women (as with doctor versus nurse, or programmer), which could do a woman out of a job: robots are involved in the interview process, and the selection of CVs is algorithm-based.
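To make the mechanism concrete, here is a minimal sketch of how such an association can be measured. The word vectors below are hand-made toy values, not a real corpus model; real systems learn vectors from billions of words, and the skew shown here mirrors the doctor/nurse/programmer pattern the book describes.

```python
import math

# Toy, hand-crafted word vectors: purely illustrative values, not learned
# from any real corpus. In a trained model, these coordinates would encode
# the contexts each word appeared in.
VECTORS = {
    "he":         [0.9, 0.1, 0.2],
    "she":        [0.1, 0.9, 0.2],
    "doctor":     [0.8, 0.3, 0.6],
    "nurse":      [0.2, 0.8, 0.6],
    "programmer": [0.85, 0.25, 0.7],
}

def cosine(a, b):
    """Cosine similarity: how closely two word vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def gender_lean(word):
    """Positive means closer to 'he', negative means closer to 'she'."""
    v = VECTORS[word]
    return cosine(v, VECTORS["he"]) - cosine(v, VECTORS["she"])

for w in ("doctor", "nurse", "programmer"):
    print(w, round(gender_lean(w), 3))
```

In this toy model, "doctor" and "programmer" lean toward "he" and "nurse" toward "she"; a CV-screening system built on such vectors inherits exactly that skew.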
The examples go on and on, all backed by relevant and available research findings.
The book turned how I see myself as a woman upside down. Instead of bringing me down, it gave me the discovery of "how it works" and "screw that: now I know it was not my fault". It brings examples of women just doing it, for example in the startup world, where women design well-informed, female-data-informed prototypes, have to face all-male investor boards, and still persist.
While it tells the story of structural barriers, it also goes beyond that and empowers the reader to go on.
The lack of data on women's bodies, even on the shape of the womb, is one of the most striking examples of how male-default the image of the world designed around us is.
Caroline Criado Perez: Invisible Women. Data Bias in a World Designed for Men. New York: Abrams Press, 2019.
Julianna Faludi, PhD, is a sociologist and writer. She is interested in the relationship of technology, society, and the arts, and in ethical consumption. Her background is in economic sociology, development studies, and the humanities. As a professor she has a track record of lecturing on innovation, branding, the arts, Russia studies, and sociology. Earlier she worked in regional development in various roles. Julianna has experience in broadcasting, giving talks, and writing. She speaks several languages: Russian, Italian, English, French, and Hungarian.
Julianna Faludi. All rights reserved. You may not take images or content or replicate any of the content from this site without written permission.