Gender and racial discrimination in artificial intelligence


Gunay Kazimzade is a researcher at the Technical University of Berlin working on gender and racial bias in artificial intelligence systems.


GUNAY, WHY ARE WE EXPERIENCING GENDER AND RACIAL DISCRIMINATION IN THE FIELD OF ARTIFICIAL INTELLIGENCE?

Bias in AI-based systems has a variety of causes, which arise at different phases: data collection, data mining, data processing, and the training of ML models.

Often the data used to feed the models are not diverse enough to be suitable for training.

Thus, if you feed a deep learning algorithm mostly photos of light-skinned faces, your system will be unable to recognize people of colour. If you train a recruitment system on ten-year-old data in which women were underrepresented in specific positions, the system will eliminate female candidates during the recruitment process. As you can see, these biases are shaping society in a dramatic way, influencing it on economic, ethical and legal levels.
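The mechanism described above can be sketched in a few lines of code. This is a toy illustration, not anything from an actual face-recognition system: the 1-D "appearance" feature, the group names, and the 90/10 split are all invented for the demonstration. A nearest-centroid classifier trained on a face dataset dominated by one group learns a "face" prototype that sits far from the underrepresented group, which then gets misclassified.

```python
# Hedged sketch: how an imbalanced training set skews a nearest-centroid
# classifier. All feature values, counts and group labels are hypothetical.

def centroid(values):
    """Mean of a list of 1-D feature values."""
    return sum(values) / len(values)

def classify(x, centroids):
    """Assign x to the class whose centroid is nearest."""
    return min(centroids, key=lambda label: abs(x - centroids[label]))

# Invented 1-D "appearance" feature for a face detector.
# The "face" class is 90% group A (feature ~0.2) and only 10% group B
# (feature ~0.9); "non-face" backgrounds sit around 0.6.
faces = [0.2] * 90 + [0.9] * 10
non_faces = [0.6] * 100

centroids = {"face": centroid(faces), "non_face": centroid(non_faces)}

print(classify(0.2, centroids))  # group A face -> "face" (correct)
print(classify(0.9, centroids))  # group B face -> "non_face" (wrong)
```

Because the learned "face" centroid (≈0.27) is pulled toward the majority group, every group B face lands closer to the "non-face" centroid and is rejected, even though the model saw some group B examples. Rebalancing the training data moves the centroid and removes this failure mode.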


IS THIS ALSO CAUSED BY INSUFFICIENT REPRESENTATION OF WOMEN SPECIALISTS IN THE AI AND PROGRAMMING INDUSTRIES?

For sure it is. How underrepresented groups and minorities are captured in data always influences the predictions and decisions made by those systems.

Also, most of the currently available data is marked by a dominance of white, educated and wealthy men. This results from their high status in the social hierarchies of past and present and their greater investment in new technologies, and it is mirrored in their strong presence both in datasets and in technological decision-making processes.

WHAT NEGATIVE CONSEQUENCES CAN THIS TREND HAVE?

Bias and discrimination persist at the human level today; thus, AI mirrors the biases we already have in society. I see a huge hidden danger here: social determination, the creation of network bubbles, emotional manipulation, financial loss and the formation of an "elite" in the digital world. Giving power to the most powerful and eliminating the voice of "others" – that is where this takes us.