A sensor that doesn’t react to Black hands but works fine for everyone else (Facebook Office’s ‘racist soap dispenser,’ 2017). A hiring algorithm that discriminates against women (Amazon, 2018). These are only a few examples of AIs that became unintentionally racist or sexist because they were trained on biased datasets.
The quality of an AI reflects the quality of the dataset it was trained on.
“Technology is one of the main drivers of change. With good tech, we can bring about social change and a better understanding of how we progress. This is particularly true for gender equality and bias, where improvement is needed for our society and socio-economic growth, to allow the next generation to thrive in a fair and equal world.” — Sofia Nabila Echadli, challenge partner and Founder & CEO of GEI LAB.
Ten teams participated in the 12-hour Hack the Bias hackathon on March 5th, organized by GEI and Codam. The participants were given a dataset containing the results of a company-wide survey on various aspects of its employees’ lives. The company that provided the data prided itself on having a diverse employee base and equal opportunities for all employees. It was time to see whether the company lived up to its diversity policies, by developing an algorithm that could detect discrimination.
“I had a wonderful time with so many interested and hungry-to-learn young talents! The students all showed innovative solutions and tackled the problem in a unique manner, which was great to see.” — Dr. Eva, IBM’s Advanced Analytics & AI practice leader and Chief Data Scientist for MEA, jury member in the hackathon.
“Throughout the hackathon, we could feel a warm, welcoming, and involved atmosphere from the mentors, judges, and other participants. Our team was made up of four people: two men and two women. Of these four, two are people of color. The passion of the organizers and participants made this more than a regular hackathon: you could instantly feel the energy, which brought the event to life.” — Duy Hoang, team member of AI Against Injustice, third-place winners of the hackathon, and Codam Coding College student.
The dataset revealed obvious differences in how employees were treated depending on their race, gender, or age.
The winning team, UnderestiMates, found that a person’s salary is affected by many factors, including their gender, educational background, and where they live.
They also found that the survey had been filled in primarily by men, that underrepresented subgroups were often less happy with their jobs than the men who responded, and that there was a significant discrepancy in how much information was provided about people depending on where they live.
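The kind of skew the team describes can be checked with a quick representation audit before any modeling. A minimal sketch, with group labels, counts, and the 30% threshold all invented for illustration:

```python
from collections import Counter

# Hypothetical respondent records; labels and counts are invented.
respondents = ["man"] * 70 + ["woman"] * 25 + ["non-binary"] * 5

counts = Counter(respondents)
total = sum(counts.values())

# Flag any group making up less than 30% of the dataset: a model
# trained on these records will mostly reflect the majority group.
underrepresented = {g: n / total for g, n in counts.items() if n / total < 0.30}
print(underrepresented)  # {'woman': 0.25, 'non-binary': 0.05}
```

An audit like this does not remove bias by itself, but it makes the dataset’s composition visible before a model is trained on it.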
“Overall, I enjoyed this hackathon because it gives you a boost when you’re able to create and present something in such a short time!” —
Maria Daan, team member UnderestiMates, Codam Coding College Student.
Second place went to No Bias, No Cry. They discovered that women were overall less satisfied with their jobs than men, and that non-straight employees earned less on average than their straight co-workers.
“One hour before the deadline, we still only had a hypothesis and nothing else to present. This hackathon’s expectations seemed a lot higher than what we were capable of doing in the timeframe: Sofia, the organizer, had expected us to create a tool, use machine learning, and do proper statistical analysis, yet we only had some basic data visualization. Then, thank god, thirty minutes before the deadline, we finally got our multi-variable linear regression analysis to work, and the results were interesting.” — Tijmen, team member of No Bias, No Cry, Codam Coding College student.
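A multi-variable linear regression like the one the team mentions can be sketched as follows. The column names and every number are invented for illustration; real survey data would need cleaning and encoding first.

```python
import numpy as np

# Hypothetical survey rows: (salary, years_experience, is_underrepresented).
# All values are invented for illustration only.
rows = [
    (52000, 2, 0), (61000, 4, 0), (70000, 6, 0), (79000, 8, 0),
    (48000, 2, 1), (56000, 4, 1), (64000, 6, 1), (72000, 8, 1),
]

y = np.array([r[0] for r in rows], dtype=float)
# Design matrix: intercept column, years of experience, group indicator.
X = np.array([[1.0, r[1], r[2]] for r in rows])

# Least-squares fit: salary ≈ intercept + per_year * experience + gap * group
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, per_year, gap = coef

# A negative `gap` means the model attributes lower pay to the group
# even at identical experience levels (here per_year ≈ 4250, gap ≈ -5500).
print(per_year, gap)
```

Regressing on several variables at once is what lets a gap be attributed to group membership rather than to a confounder such as experience.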
Third place went to AI Against Injustice, who discovered that risk-taking behavior was more likely to be rewarded in men than in women.
How can we ensure that AIs don’t have any biases?
“If the AI would have been trained to create salaries for people, it would have paid straight white men more than other groups.”
All groups concluded that although the company valued diversity and equal opportunities, it still seemed to fall short in practice in many areas.
If the dataset used to train an AI is not diverse enough, the AI will reflect that dataset’s biases. So, if this dataset had been used to train an AI, the AI would have skewed towards what men preferred most, due to their predominance in the dataset. Not only that: if the AI had been trained to set salaries, it would have paid straight white men more than other groups. Taking the biases found into account while training an AI can minimize their effect on its final behavior.
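This propagation can be made concrete with a toy predictor: a model whose coefficients are assumed to have been fit on biased data reproduces the pay gap for otherwise identical profiles. Every name and number below is invented for illustration.

```python
# Toy salary "model"; the coefficients stand in for values fit on a
# biased dataset and are invented for illustration.
def predict_salary(years_experience: float, is_underrepresented: int) -> float:
    base, per_year, group_gap = 44250.0, 4250.0, -5500.0
    return base + per_year * years_experience + group_gap * is_underrepresented

# Two profiles identical in every respect except group membership:
print(predict_salary(5, 0))  # 65500.0
print(predict_salary(5, 1))  # 60000.0
```

The gap in the training data survives as a gap in every prediction, which is exactly why the bias has to be addressed before or during training rather than after deployment.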
Recognizing your daily biases has never been more critical, whether you’re in tech or not. Only by working together can we solve the problem of biased software and business practices. Rather than creating AIs that perpetuate a dataset’s biases, we should develop software that finds biases in data and prevents them from reaching AIs.
“I’m very happy to have been part of this hackathon and to have met the wonderful judges and panelists, as well as the students. The students showed their passion, drive, and knowledge in their solutions.” — Andrew Schumer, jury member, Cybersecurity and Digital Transformation advisor to the UAE government and private sector.
“I feel that Codam students have a particular lineage of resilience based on their peer-to-peer learning, and that means they push outside their comfort zone daily.” — Challenge partner, Sofia Nabila Echadli, Founder & CEO GEI LAB.
Hackathons, meet-ups, and other business, academic, or community events are a great way to get together with our students. Get in touch with Victoria Ous, Codam’s Head of Partnership and Communication, at Victoria@codam.nl; we’d love to create magic together! ✨