
A Petition for an Online Inclusivity Countdown

22 April 2024 | User Research

Reading time: 7 minutes

A few weeks ago, we attended an event hosted by Digital Media Women Rhein-Neckar and BPW Mannheim-Ludwigshafen, a Future Talk on the Digital Gender Gap with guest panellist Maren Heltsche (co-founder of speakerinnen.org and special representative for the policy field of digitalisation in the German Women’s Council).

During a one-hour discussion, Maren and fellow panellist Johanna Illgner (city councillor for the city of Heidelberg and co-founder of Plan W – Agentur für strategische Kommunikation) dove into why the Digital Gender Gap exists and discussed possible solutions. The discussion covered the gap within the workforce but also focused on the digital data gap, which led us to the question: if we can have an Online Accessibility Countdown, why can we not have an Online Inclusivity Countdown?

What is the Online Accessibility Countdown?

In short, the Online Accessibility Countdown counts down to the day a law requiring digital services to be accessible to people with disabilities takes effect. From online-accessibility-countdown.eu by designer Annika Brinkmann: “On June 28, 2025, the European Accessibility Act (EAA) comes into force. From then on, the websites of companies with more than ten employees and more than 2 million EUR in annual turnover, as well as websites published after that date, must be accessible.” You can learn more about the law and the requirements on Annika Brinkmann’s page. The Online Accessibility Countdown itself is not the topic of this post. Instead, we want to use it as a blueprint to explain why we believe it is only the beginning of a truly accessible and inclusive internet.

Why do we need an Online Inclusivity Countdown?

Argument #1: Progress is better than Regress.

The Digital Gender Gap has many facets. For one, there is a gap within the workforce. The percentage of women in tech varies depending on geographical location. According to www.womentech.net, it will take between 53 years (in Latin America and the Caribbean) and 189 years (in East Asia and the Pacific) to close this gap within the STEM and tech workforce. This directly affects the products and services being developed, as the perspective of the teams working on them can be one-sided.

While these numbers are shocking, our petition for an Online Inclusivity Countdown will focus on closing the Digital Gender Data Gap. The reason is that the rapid development of AI carries the imminent risk of a step backwards rather than an improvement in the inclusivity of digital services and products, which demands immediate action.

Currently, the data that algorithms rely on and that AIs are trained with can be biased, sexist, and therefore harmful or discriminatory towards marginalised groups. During the Future Talk session, Maren Heltsche rightly said that fair systems seem utopian in an unfair world with unfair data.

An additional problem: training data is not regulated, partly for economic reasons. Most AI companies’ business models and competitive edge would be nullified if it were. However, if it appears that we have simply agreed that international regulation is unachievable for the sake of progress and the success of existing AI companies, that is only half the truth. The real question is: which international body could be in charge of regulating the data?

If you believe that the consequences of this development will take some time to rear their ugly heads, you might want to check out a recent UNESCO report, which found evidence of regressive gender stereotypes in generative AI (www.unesco.org/en/articles/generative-ai-unesco-study-reveals-alarming-evidence-regressive-gender-stereotypes). That is the same AI kids are currently using to do their homework.

Argument #2: Missed opportunities lead to Regrets.

Currently, 27% of the people living in the EU have a disability. While not all people with disabilities are limited when interacting with digital products, the number of people affected by non-inclusive systems is high enough to have knocked some sense into the EU and brought the European Accessibility Act (EAA) into law.

Considering that women (51%) outnumber men (49%) in the EU, making them the majority of the population, it is surprising that an Online Inclusivity Countdown is not already in place. This means we are currently not even trying to regulate data models, even though they discriminate against the majority of people.

No matter what women, aka half the world’s population, mean to you in your job, be it as customers, employees, patients, or constituents: if you rely on discriminatory data when interacting with them, you WILL miss opportunities and make avoidable mistakes.

Argument #3: Discrimination is costly.

If discrimination against 50% or more of the population seems too weak an argument, money might be more convincing.

Financial fiascos such as Amazon’s AI recruiting tool (qz.com/1419228/amazons-ai-powered-recruiting-tool-was-biased-against-women) or the ArbeitsmarktchancenAssistenzsystem (AMAS) of Austria’s employment agency (algorithmwatch.org/en/austrias-employment-agency-ams-rolls-out-discriminatory-algorithm; Stefanie’s talk about this at WUD 2021: youtu.be/PndW3UR_p1s?si=uRhYHQJpB-DSrp2k&t=483) could have been prevented if the data had been evaluated and adjusted before the tools were developed.

You are wrong if you think this type of discrimination does not affect your budget. If you use the internet to market to your audience, your money might not reach the people you intend due to discriminatory algorithms. Learn more about that topic here: algorithmwatch.org/en/automated-discrimination-facebook-google.

We are sure there are many more disadvantages of discriminatory data in tech, and many more reasons for an Online Inclusivity Countdown, but listing them will only matter if we find ways to tackle the source of the problem: how do we design non-discriminatory systems and services when our tools are flawed?

Currently, the international consensus seems to be that data cleansing or supplementation and the regulation of training models are unachievable tasks, so we will have to develop our own solutions.

Designing Fair Systems in an Unfair World

#1 Get to Know Your Data

Constantly challenge your data, whether you are basing your design on analytical data or training your service with datasets. Find out how, when, and by whom the data was accumulated. Ask what type of data was collected and, maybe even more importantly, what was not collected. If you are familiar with analysing and evaluating website performance, think of this step as cleansing your analytics data of spam traffic. Skip it, and you might as well be flying blind.
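To make this audit concrete, here is a minimal sketch in Python with pandas. The file name, the column names (gender, source, collected_at), and the reference shares are hypothetical placeholders, not a prescribed schema; the point is simply to ask the three questions above of a real dataset: who is represented, how the data was gathered, and what is missing.

```python
import pandas as pd

# Hypothetical file and column names -- replace them with your own schema.
df = pd.read_csv("training_data.csv")

# 1. Who is in the data? Compare group shares against a reference
#    population (here: the rough EU split mentioned above).
reference = {"female": 0.51, "male": 0.49}
observed = df["gender"].value_counts(normalize=True)
for group, expected in reference.items():
    share = observed.get(group, 0.0)
    print(f"{group}: {share:.1%} in data vs. {expected:.1%} in population")

# 2. How, when, and by whom was the data accumulated?
print(df["source"].value_counts())                         # collection channels
print(df["collected_at"].min(), df["collected_at"].max())  # collection period

# 3. What was NOT collected? Missing values and absent categories
#    are findings in their own right, not just noise to drop.
print(df.isna().mean().sort_values(ascending=False).head())
```

Even an audit this small regularly surfaces gaps, such as a group that barely appears in the data, long stretches with no collection at all, or fields that were never filled in, before those gaps are baked into a design or a model.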

#2 Get to Know Your Users

User Research can help you create tailored solutions and complement or help challenge existing data. Understand that no data is neutral or unbiased – back it up with real insights by asking the people you are creating for. To prevent these insights from becoming another source of biased data, document them and explain how they lead to design decisions.
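One lightweight way to do this, sketched below in Python, is to record each insight together with its method, its participants, and the design decisions it informed. All field names and example values are hypothetical; the structure matters, not the tooling.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ResearchInsight:
    """One documented insight, with enough provenance to challenge it later."""
    statement: str     # what we learned
    method: str        # how we learned it
    participants: str  # who we asked -- and, implicitly, who we did not
    recorded_on: date  # when we learned it
    design_decisions: list[str] = field(default_factory=list)  # what it changed

# Example entry (values are illustrative):
insight = ResearchInsight(
    statement="Non-native speakers stumbled over the onboarding copy",
    method="Five moderated usability tests",
    participants="3 women, 2 men, ages 24-61, recruited from the customer base",
    recorded_on=date(2024, 4, 22),
    design_decisions=["Rewrote step 2 in plain language"],
)
print(insight.statement)
```

Because each insight carries its provenance, a future team can ask the same questions of your research that you asked of your data: who was consulted, how, and who was left out.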

#3 Diversify your Teams

“Every person is the result of something, the combination of nature and nurture and childhood and work and love and family. Every one of us has been moulded into the people we are today by the environments from which we sprang.” (“You Are What You Watch”, Walt Hickey, Workman Publishing, 2023) Different people, shaped by different environments, ask different questions and develop different solutions. Use that to your advantage.

While we can do no more than hope that the Online Inclusivity Countdown will one day start ticking, we won’t sit back and wait for that day! If you need help creating nuanced and inclusive designs, get in touch! The founding claim of The Geekettez was “We Design for Humans”. We have stayed true to that claim since 2012. #PowerToThePeople
