Researchers Create AI-Based Tool That Prevents Gender Discrimination


A new artificial intelligence (AI) tool for detecting unfair discrimination on the basis of race or gender has been created by researchers at Penn State and Columbia University.

Preventing unfair treatment of individuals on the basis of race, gender or ethnicity has been a long-standing concern of modern civilized societies. However, detecting such discrimination in decisions, whether made by human decision makers or by automated AI systems, can be extremely challenging.

This challenge is further complicated by the wide adoption of AI systems to automate decisions in many domains—including policing, consumer finance, higher education, business and so on.

Machine learning put to use

The team of researchers created an AI tool for detecting discrimination with respect to a protected attribute, such as race or gender, by human decision makers or AI systems. The tool is based on the concept of causality, in which one thing (a cause) brings about another (an effect).

Aria Khademi, a graduate student in information sciences and technology at Penn State, gave an example: the question ‘Is there gender-based discrimination in salaries?’ can be reframed as ‘Does gender have a causal effect on salary?’ or, in other words, ‘Would a woman be paid more if she were a man?’
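
In standard counterfactual notation (a minimal sketch, not necessarily the researchers’ own formulation), the reframed question asks whether changing gender alone, while holding everything else about a person fixed, would change the expected salary:

causal effect of gender on salary = E[ salary | do(gender = male) ] − E[ salary | do(gender = female) ]

A nonzero difference for otherwise identical individuals would be evidence of gender-based discrimination.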

Vasant Honavar, professor and Edward Frymoyer Chair of Information Sciences and Technology at Penn State, said that AI systems are trained on large amounts of data, and if those data are biased, they can skew the systems’ recommendations.

For example, if a company has historically never hired a woman for a particular type of job, then an AI system trained on that historical data will not recommend a woman for that job.

Since the answer to such a hypothetical question cannot be observed directly, the team’s tool uses sophisticated counterfactual inference algorithms to arrive at a best guess.

Khademi added that an intuitive way of arriving at a best guess of a fair salary for a female employee is to find a male employee who is similar to her with respect to qualifications, productivity and experience. Gender-based discrimination in salary can then be minimised by ensuring that similar men and women receive similar salaries.
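
As a rough illustration of that matching intuition, here is a minimal sketch in Python. It is not the team’s actual tool; the function name matched_gender_gap and the choice of features are invented for the example. Each woman is paired with the most similar man, and his salary is used as an estimate of her counterfactual salary.

import numpy as np

def matched_gender_gap(X, salary, is_female):
    # X: numeric features such as qualifications, productivity and experience
    # salary: observed salaries; is_female: boolean indicator per person
    X = np.asarray(X, dtype=float)
    salary = np.asarray(salary, dtype=float)
    is_female = np.asarray(is_female, dtype=bool)

    # Standardise features so no single attribute dominates the distance.
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)

    women, men = X[is_female], X[~is_female]
    men_salary = salary[~is_female]

    # For every woman, find the nearest man in feature space (brute force).
    dists = np.linalg.norm(women[:, None, :] - men[None, :, :], axis=2)
    nearest_man = dists.argmin(axis=1)

    # His salary serves as a best guess of her counterfactual salary.
    counterfactual = men_salary[nearest_man]
    gap = salary[is_female] - counterfactual
    return gap.mean()  # a negative average suggests women earn less than similar men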

The researchers tested their method using various types of available data, such as income data from the U.S. Census Bureau, to determine whether there is gender-based discrimination in salaries.
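
Continuing the sketch above, the same function could be applied to census-style income records; the file name and column names below are placeholders for illustration, not the actual Census Bureau data the researchers used.

import pandas as pd

df = pd.read_csv("census_income.csv")  # hypothetical file of individual income records
features = df[["education_years", "experience_years", "hours_per_week"]].to_numpy()
avg_gap = matched_gender_gap(features,
                             df["salary"].to_numpy(),
                             df["gender"].eq("Female").to_numpy())
print(f"Average salary gap relative to matched men: {avg_gap:,.0f}")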

Other applications

Honavar added that as data-driven artificial intelligence systems increasingly determine how businesses target advertisements to consumers, how police departments monitor individuals or groups for criminal activity, how banks decide who gets a loan, whom employers decide to hire, and how colleges and universities decide who gets admitted or receives financial aid, there is an urgent need for more such tools to be developed and implemented.