The negative consequences of web biases

UTA researchers combatting bias in web database applications

Wednesday, Sep 22, 2021 • Contact: Herb Booth

A computer science and engineering professor at The University of Texas at Arlington is using a grant from the National Science Foundation (NSF) to determine how to detect and eliminate biases in web database applications.

Gautam Das, left, and Shirin Nilizadeh" _languageinserted="true
Gautam Das, left, and Shirin Nilizadeh

Gautam Das, who is co-director of the Center for Artificial Intelligence and Big Data, received a three-year, $416,000 NSF grant to examine web applications, such as those on shopping sites, real estate sites, maps and other common database-reliant applications. These applications often produce biased results based on the data they collect, leading to a poorer experience for users and suboptimal choices.

Das is exploring how to mitigate the biases or redesign the applications to remove them. Shirin Nilizadeh, a UTA computer science assistant professor, is co-principal investigator. This study is part of a larger $1 million collaborative project with Abolfazl Asudeh of the University of Illinois at Chicago and H.V. Jagadish of the University of Michigan.

While the biases aren’t necessarily programmed into the algorithm, the factors that the algorithm uses to make its decisions—such as where a person lives and their past search history—might introduce biases.

When a user researches products on a shopping site, for example, the site returns a list of products it predicts the user will want. Based on the profile the site has built, the user might not be shown the quality of products he or she would expect. Real estate websites might steer users of certain ethnicities toward particular neighborhoods rather than toward those that better fit their preferences.

“These are necessarily closed systems,” Das said. “If you suspect something is wrong, you’d need to force companies to reveal their algorithms and data and scrutinize it, which is highly unlikely.

“Since you can’t look at a single query and a single response and see a pattern, we are going to try to observe the systems with a series of questions and answers, then reverse-engineer the algorithms so we can start to see trends that indicate the presence of biases.”
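
The probing strategy Das describes can be illustrated with a small black-box audit. The Python below is a minimal sketch, not the team's method: it assumes a hypothetical query_system(profile, query) helper standing in for the closed system, issues the same query under different simulated user profiles, and reports how much the top results overlap.

```python
# Minimal black-box audit sketch (hypothetical; the real systems, profiles,
# and metrics studied in the NSF project are not public).

def jaccard(a, b):
    """Overlap between two top-k result lists (1.0 = identical, 0.0 = disjoint)."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def audit(query_system, query, profiles, k=10):
    """Issue the same query under each simulated profile and compare top-k results."""
    results = {name: query_system(profile, query)[:k]
               for name, profile in profiles.items()}
    names = list(results)
    return {(a, b): jaccard(results[a], results[b])
            for i, a in enumerate(names) for b in names[i + 1:]}

if __name__ == "__main__":
    # Toy stand-in for the closed system: ranks items differently by ZIP code.
    def toy_system(profile, query):
        base = ["item%d" % i for i in range(20)]
        return base if profile["zip"] == "76019" else list(reversed(base))

    profiles = {"profile_A": {"zip": "76019"}, "profile_B": {"zip": "60607"}}
    print(audit(toy_system, "2-bedroom apartment", profiles))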

Das and his team hope to identify and characterize discrimination in web database applications and to enhance fairness through design and transparency without infringing on proprietary benefits.

As a cybersecurity researcher, Nilizadeh will focus in this project on identifying biases in the artificial intelligence-based anti-abuse systems used by collaborative content sites such as Yelp, YouTube and Twitter. She will also examine bias in how these systems detect and remove abusive content, such as fake reviews, malicious content and hate speech.

“Anti-abusive systems are closed systems,” Nilizadeh said. “While trying to understand the behavior of anti-abusive systems, we focus on problematic features that might bias the function toward a particular group of users.”

For example, the security literature has shown that malicious users tend to create fake accounts pretending to be women, so anti-abuse systems may use gender as a feature in their algorithms.

“As a result, a benign message sent by a female account might have a higher probability of being detected as malicious,” Nilizadeh said. “Similarly, for the hate speech detection algorithm on Twitter, the language of the tweets or the type of hate speech will be considered for fairness investigation.”
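
The disparity Nilizadeh describes can be measured directly: compare how often benign messages from different groups are incorrectly flagged. The sketch below is a hypothetical illustration with made-up data, not the project's actual evaluation; the group labels and records are placeholders.

```python
# Sketch of a per-group false positive rate check for an abuse classifier.
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, is_benign, was_flagged) tuples.
    Returns, per group, the fraction of benign messages flagged as malicious."""
    flagged = defaultdict(int)
    benign = defaultdict(int)
    for group, is_benign, was_flagged in records:
        if is_benign:
            benign[group] += 1
            if was_flagged:
                flagged[group] += 1
    return {g: flagged[g] / benign[g] for g in benign if benign[g]}

# Toy data: benign messages from "female" accounts are flagged more often.
records = [
    ("female", True, True), ("female", True, False), ("female", True, True),
    ("male", True, False), ("male", True, False), ("male", True, True),
]
print(false_positive_rates(records))  # -> {'female': 0.666..., 'male': 0.333...}
```

A gap like the one in this toy output is the kind of signal that would prompt a closer look at which features, such as gender, are driving the classifier's decisions.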

Hong Jiang, chair of UTA’s Computer Science and Engineering Department, said, “The research proposed by Dr. Das and his team will have high societal impact because as we increasingly immerse ourselves in a digital economy, we rely on user-facing web content to live our daily lives. As a result, any bias in the web, intentional or not, may have important and often negative consequences. Being able to detect and then warn web users of such biases will no doubt reduce, if not avoid, the negative impacts of these biases.”

- Written by Jeremy Agor, College of Engineering