
Propagation of Societal Gender Inequality by Internet Search Algorithms

Overview
Specialty: Science
Date: 2022 Jul 20
PMID: 35858360
Abstract

Humans increasingly rely on artificial intelligence (AI) for efficient and objective decision-making, yet there is increasing concern that algorithms used by modern AI systems produce discriminatory outputs, presumably because they are trained on data in which societal biases are embedded. As a consequence, their use by human decision makers may result in the propagation, rather than reduction, of existing disparities. To assess this hypothesis empirically, we tested the relation between societal gender inequality and algorithmic search output and then examined the effect of this output on human decision-making. First, in two multinational samples (n = 37, 52 countries), we found that greater nation-level gender inequality was associated with more male-dominated Google image search results for the gender-neutral keyword "person" (in a nation's dominant language), revealing a link between societal-level disparities and algorithmic output. Next, in a series of experiments with human participants (n = 395), we demonstrated that the gender disparity associated with high- vs. low-inequality algorithmic outputs guided the formation of gender-biased prototypes and influenced hiring decisions in novel scenarios. These findings support the hypothesis that societal-level gender inequality is recapitulated in internet search algorithms, which in turn can influence human decision makers to act in ways that reinforce these disparities.
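The country-level analysis described in the abstract amounts to a correlation between a nation's gender-inequality score and the gender composition of its image search results. The following is a minimal sketch of that kind of analysis in Python; the variable names and numbers are hypothetical illustrations, not the authors' data or code.

# Illustrative sketch of a country-level correlation analysis.
# Hypothetical inputs: a gender-inequality index per country and the share
# of male-presenting people in that country's top "person" image results.
# All names and values here are invented; this is not the authors' code.
from statistics import correlation  # Pearson r; available in Python 3.10+

inequality = [0.72, 0.41, 0.55, 0.33, 0.64]  # nation-level gender-inequality index
male_share = [0.81, 0.52, 0.66, 0.48, 0.74]  # male-coded fraction of top results

r = correlation(inequality, male_share)
print(f"Pearson r (inequality vs. male-dominated output): {r:.2f}")

Under this toy setup, a positive r would indicate that more unequal societies yield more male-dominated results for the gender-neutral query, consistent with the relationship reported above.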

Citing Articles

Quantifying behavior-based gender discrimination on collaborative platforms.

Vasarhelyi O, Vedres B. PNAS Nexus. 2025; 4(2):pgaf026.

PMID: 39925854. PMC: 11804011. DOI: 10.1093/pnasnexus/pgaf026.


Beyond the Screen: The Impact of Generative Artificial Intelligence (AI) on Patient Learning and the Patient-Physician Relationship.

Traylor D, Kern K, Anderson E, Henderson R. Cureus. 2025; 17(1):e76825.

PMID: 39897260. PMC: 11787409. DOI: 10.7759/cureus.76825.


Exploring condition in which people accept AI over human judgements on justified defection.

Yamamoto H, Suzuki T. Sci Rep. 2025; 15(1):3339.

PMID: 39870713. PMC: 11772686. DOI: 10.1038/s41598-025-87170-w.


Analyzing the impact of socioeconomic indicators on gender inequality in Sri Lanka: A machine learning-based approach.

Kularathne S, Perera A, Rathnayake N, Rathnayake U, Hoshino Y. PLoS One. 2024; 19(12):e0312395.

PMID: 39724101. PMC: 11671024. DOI: 10.1371/journal.pone.0312395.


Transmission of societal stereotypes to individual-level prejudice through instrumental learning.

Schultner D, Stillerman B, Lindstrom B, Hackel L, Hagen D, Jostmann N. Proc Natl Acad Sci U S A. 2024; 121(45):e2414518121.

PMID: 39485797. PMC: 11551433. DOI: 10.1073/pnas.2414518121.

