When searching for information online, the results can vary widely from person to person. Jiqun Liu, an assistant professor in the School of Library and Information Studies in the University of Oklahoma’s College of Arts and Sciences, wants to improve online search with systems that account for users’ biases and return more balanced, more useful results.
Liu said an example of system bias in information retrieval is when results give preference to popular or established works, people or content, rather than potentially more relevant results.
“When searching for music, algorithms are biased toward popular artists, artists who are already established in the field,” he said. “New artists need opportunities to increase the impacts of their latest works. How do we increase the fairness in the exposure and information presentation?”
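To make the exposure problem concrete, here is a minimal, purely illustrative sketch of one generic way a ranker could trade relevance against prior exposure so that newer artists surface more often. The scoring formula, weights and toy data are assumptions for illustration; this is not Liu’s method.

```python
# Illustrative sketch only: a generic fairness-aware re-ranker, not Liu's approach.
# Assumed toy data: each item has a relevance score and a prior-exposure count.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    relevance: float       # estimated relevance to the query (0..1)
    prior_exposure: float  # how much exposure the artist has already received

def rerank(items, fairness_weight=0.3):
    """Order items by relevance, discounted for prior exposure.

    score = (1 - w) * relevance - w * normalized_prior_exposure
    A higher fairness_weight gives newer, less-exposed artists a better
    chance of appearing near the top of the ranking.
    """
    max_exposure = max(i.prior_exposure for i in items) or 1.0
    def score(i):
        return ((1 - fairness_weight) * i.relevance
                - fairness_weight * (i.prior_exposure / max_exposure))
    return sorted(items, key=score, reverse=True)

if __name__ == "__main__":
    catalog = [
        Item("established_artist", relevance=0.80, prior_exposure=9000),
        Item("new_artist", relevance=0.75, prior_exposure=50),
    ]
    for item in rerank(catalog):
        print(item.name)
```

With the weights above, the slightly less relevant but far less exposed artist is ranked first, which is the kind of exposure trade-off the quote describes.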
“People often act intuitively and are subject to systematic biases when making decisions under uncertainty due to their inability to calculate all the possible consequences of their choices, a fundamental cognitive phenomenon called bounded rationality,” Liu said. “Without proactive information supports, these decisions could be driven by misleading information, cognitive biases and heuristics and may result in significant deviations from desired outcomes.”
Liu received a $175,000 grant from the National Science Foundation to study users’ systematic biases as reflected in signals such as previous search queries, to better understand the relationships between search interactions and those biases, and to build bias-aware prediction models of search interactions. Using the results, Liu will then develop a scalable and potentially transformative approach to modeling users and their decision-making processes in interactive information retrieval.
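As a rough sense of what a “bias-aware” interaction model could look like, the sketch below trains a click predictor whose features include not only relevance and result position but also a signal meant to capture anchoring on a user’s previous query. The feature names, toy data and model choice are all assumptions made for illustration; they are not drawn from Liu’s project.

```python
# Hypothetical sketch of a "bias-aware" interaction model; the features and
# data are invented for illustration and do not describe Liu's actual models.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy interaction log: each row is one (user, result) impression.
# Features: [relevance_score, result_rank, anchoring_signal]
#   relevance_score  - system's estimate of topical relevance
#   result_rank      - position of the result on the page (position bias)
#   anchoring_signal - similarity of the result to the user's previous query,
#                      a rough proxy for anchoring on earlier searches
X = np.array([
    [0.9, 1, 0.8],
    [0.7, 2, 0.9],
    [0.6, 3, 0.1],
    [0.8, 5, 0.2],
    [0.4, 1, 0.7],
    [0.3, 8, 0.1],
])
clicked = np.array([1, 1, 0, 0, 1, 0])  # whether the user clicked

model = LogisticRegression().fit(X, clicked)

# Predict click probability for a new impression: a moderately relevant result,
# shown at rank 4, strongly anchored to the user's previous query.
print(model.predict_proba([[0.5, 4, 0.9]])[0, 1])
```

The point of the sketch is only that bias-related signals can sit alongside relevance in a prediction model; the grant work aims to develop far richer, theoretically grounded versions of that idea.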