
Overcoming unconscious bias in tech

During May, our Women in Tech team collaborated with Microsoft, Google, Unilever and IBM to host an event exploring unconscious bias in tech. With breakout rooms offering deep dives into different aspects of this pervasive problem, the event was an occasion to share learnings and collaborate on tackling a challenge at the core of the tech industry. Here are our key takeaways:

What’s the problem with unconscious bias?

Many consider big data the most valuable and powerful tool in modern IT. Intelligently and automatically analysing vast pools of data helps businesses harness the information they’ve always gathered and use it to hone products and services, from improving medical treatments to powering the artificial intelligence (AI) that makes digital tools more useful to us all.

But when the dataset that fuels decisions, processes and services doesn’t represent the female experience, or the programming of automated analytics tools doesn’t correctly account for it, we can inadvertently create tangible disadvantages for women. For example, seatbelts, headrests and airbags in cars have been designed using data from trials on the male physique, phones are too big for the average female hand, and online recruitment tools have penalised women’s applications because they were trained on data from predominantly white male applicants.

The use of historic data to train AI and the fact that most developers are male mean we’re at risk of perpetuating established class-, race- and gender-based inequalities as the world digitises. We need to bring together underrepresented groups to share their voices, so real innovation can start.

Build equity into the design process

Creating responsible AI frameworks, such as setting a minimum percentage of women involved in designing AI solutions, will guide improvements. Alongside more advanced tools to anticipate and detect bias, such as Google’s What-If Tool and IBM’s AI Fairness 360, organisations should redesign human-driven processes to build equitable systems for the future. This can be done by creating diverse, empowered teams and focusing on user outcomes.
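
As an illustration, here’s a minimal sketch of how a team might audit a dataset with IBM’s open-source AI Fairness 360 toolkit before any model is trained. The hiring records below are hypothetical, invented purely for the example:

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric

# Hypothetical hiring records: 'sex' is the protected attribute
# (1 = male, 0 = female) and 'hired' is the binary outcome to audit.
df = pd.DataFrame({
    "sex":   [1, 1, 1, 1, 0, 0, 0, 0],
    "score": [0.9, 0.7, 0.6, 0.8, 0.9, 0.7, 0.6, 0.8],
    "hired": [1, 1, 0, 1, 1, 0, 0, 0],
})

dataset = BinaryLabelDataset(
    df=df,
    label_names=["hired"],
    protected_attribute_names=["sex"],
    favorable_label=1,
    unfavorable_label=0,
)

metric = BinaryLabelDatasetMetric(
    dataset,
    unprivileged_groups=[{"sex": 0}],
    privileged_groups=[{"sex": 1}],
)

# Disparate impact: ratio of favourable-outcome rates (unprivileged / privileged).
# Values below roughly 0.8 are a common warning sign of adverse impact.
print("Disparate impact:", metric.disparate_impact())

# Statistical parity difference: gap in favourable-outcome rates; 0 means parity.
print("Statistical parity difference:", metric.statistical_parity_difference())
```

On this toy data, women receive the favourable outcome at a third of the rate of men, so both metrics flag the imbalance well before any model is built on top of it.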

The more diverse your team, the better equipped they’ll be to understand the problem, spot bias and engage with underrepresented groups, as they’ll bring a variety of perspectives to the design process and challenge pre-existing assumptions. And by focusing on user outcomes, those diverse teams are more likely to achieve results that benefit all users fairly.

Take a systems thinking approach

Technology is only part of the issue. Organisations have been making progress implementing new AI ethics frameworks and putting in place engines to monitor bias and fairness across all AI. The challenge with AI is that it lacks the broader context of the systems it operates in, assuming everyone has equal status and focusing more on correlation than causation. As people, we live within multiple different systems, each shaping our experiences differently. Taking a systems thinking approach is an important next step: appreciating the way interconnected systems interact. Future research needs to address how to bring a diverse team together to understand the ways in which systemic inequities manifest themselves in AI, and to question whether these systems will be fit for purpose in the future.

Make the unconscious, conscious

To formulate the codes of conduct and best practices to root unconscious bias out of AI, we first need to understand it and bring the unconscious to the forefront of our minds. It’s only through recognising the existence and implications of a problem that we can act.

So, it’s critical to partner with different voices and create an open dialogue when building AI frameworks. And that means bringing together policy makers, tech experts and business leaders to diversify the AI field.

By achieving gender equality in tech, we can create more equality for everyone. It’s critical that all voices are heard in our AI-enabled future.
