1
Algorithmic Glass Ceiling in Social Networks
The effects of social recommendations on network diversity
Ana-Andreea Stoica, Christopher Riederer, Augustin Chaintreau, Columbia University
Highlights of Algorithms, June 2018
Online social networks have been shown to mirror real-world biases, from workplace discrimination to gender and racial bias. These biases may be amplified by the social recommendation algorithms used on these networks, even though such algorithms are seemingly neutral. Our main questions are why this matters and how these algorithms amplify bias. It matters to us as a society, because there is already enough bias in the world, and to us as algorithm designers, because understanding the mechanism is the first step toward designing better algorithms.
2
Automating social decisions
In the age of big data, a large part of our social decisions has been automated, from the way we connect with people, to the way we receive news, to the job ads we see.
5
Mirroring social biases
Unfortunately, as our interactions move online, so do our biases. Online social networks have been shown to mirror real-world biases, from workplace discrimination to gender and racial bias. A distinction worth drawing: algorithms that introduce bias versus algorithms that reproduce bias. We focus on neither; our question is where algorithms amplify existing bias.
8
Automating social decisions
Recommendation systems. The automation of our social decisions filters the vast amount of information we see, showing us personalized content and suggestions. This is mostly done by recommendation algorithms, whose purpose is to provide personalization efficiently. But this must be balanced against the aforementioned bias: if a dataset contains bias, a recommendation algorithm will pick it up and mirror it, and in unfortunate cases make it worse.
9
Automating social decisions
Recommendation systems: efficiency and personalization on one side; human bias, social inequality, and discrimination on the other.
10
“The glass ceiling effect is the unseen, yet unbreakable barrier that keeps minorities and women from rising to the upper rungs of the corporate ladder, regardless of their qualifications or achievements.”
Here I will present our work on the glass ceiling effect in social networks and the role of algorithms and network structures in facilitating it. This effect has been extensively observed in the real world and, more recently, has been addressed formally on social networks as well. Our main point is that even seemingly neutral algorithms, with no intent to discriminate, can pick up patterns in the data and amplify them. We tackle this problem from two perspectives: observing the effect on an Instagram dataset that we collected, and proving it theoretically under certain conditions.
Federal Glass Ceiling Commission. "Solid Investments: Making Full Use of the Nation's Human Capital." US Department of Labor. Washington, DC: US Government Printing Office, 1995.
11
Glass ceiling effect. Only a handful of women make it to the C-suite. The percentage of female researchers decreases among more prominent researchers. On Twitter, males have more retweets and followers.
Data collected by LeanIn.Org and McKinsey & Co. from 134 global companies, surveying 34,000 men and women.
Shirin Nilizadeh et al. "Twitter's Glass Ceiling: The Effect of Perceived Gender on Online Visibility." AAAI Conference on Web and Social Media.
Avin, Chen, et al. "Homophily and the Glass Ceiling Effect in Social Networks." ITCS, ACM, 2015.
12
Empirical results on Instagram Data
A graph of likes and comments from public Instagram profiles, with nodes split by gender. We simulated two recommendation algorithms: Adamic-Adar and a random walk of length two. In both cases, the glass ceiling effect is amplified.
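The two simulated recommenders can be sketched in a few lines. This is an illustrative implementation with hypothetical function names, not the authors' code: Adamic-Adar scores a candidate link (u, v) by summing 1/log(deg(z)) over common neighbors z, while the length-two random walk scores v by the probability that a uniform two-step walk from u ends there.

```python
import math
from collections import defaultdict

def build_graph(edges):
    """Undirected adjacency sets from an edge list."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def adamic_adar_top(adj, u, k=3):
    """Rank non-neighbors v of u by the Adamic-Adar index:
    sum over common neighbors z of 1 / log(deg(z))."""
    scores = {}
    for v in adj:
        if v == u or v in adj[u]:
            continue
        s = sum(1.0 / math.log(len(adj[z]))
                for z in adj[u] & adj[v] if len(adj[z]) > 1)
        if s > 0:
            scores[v] = s
    return sorted(scores, key=scores.get, reverse=True)[:k]

def random_walk_two_top(adj, u, k=3):
    """Rank non-neighbors v of u by the probability that a uniform
    random walk of length two starting at u ends at v."""
    scores = defaultdict(float)
    for z in adj[u]:
        for v in adj[z]:
            if v != u and v not in adj[u]:
                scores[v] += 1.0 / (len(adj[u]) * len(adj[z]))
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Toy graph: node 3 shares two neighbors with node 0, node 4 shares one,
# so both recommenders rank 3 above 4 for node 0.
G = build_graph([(0, 1), (0, 2), (1, 3), (2, 3), (1, 4)])
```

Both methods favor candidates reachable through well-connected common neighbors, which is exactly how they can inherit and amplify a degree gap that already exists between groups.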
13
Model Ingredients for a glass ceiling effect:
Minority-majority: blue-label and red-label nodes, with the fraction of red nodes r < ½.
Rich-get-richer: nodes connect with probability proportional to degree.
Homophily: if the labels differ, the connection is accepted with probability ρ.
We compare organic growth against a recommendation model.
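The three ingredients above can be combined into a small simulation. This is a sketch under our own modeling assumptions (the paper's precise model may differ in details such as the seed graph and retry rule): each arriving node is labeled red with probability r, picks an attachment target with probability proportional to degree, and a cross-label pick is accepted only with probability ρ, otherwise redrawn.

```python
import random

def grow_network(n, r, rho, seed=0):
    """Sketch of biased preferential attachment with homophily.
    Each new node t attaches by one edge to an existing node chosen
    proportionally to degree; cross-label choices are accepted w.p. rho."""
    rng = random.Random(seed)
    labels = ["red" if rng.random() < r else "blue" for _ in range(n)]
    # Each edge contributes both endpoints to this list, so a uniform
    # draw from it is a degree-proportional draw over nodes.
    endpoints = [0, 1]                 # seed graph: edge between nodes 0 and 1
    degree = [1, 1] + [0] * (n - 2)
    for t in range(2, n):
        while True:
            target = rng.choice(endpoints)   # rich-get-richer
            if labels[t] == labels[target] or rng.random() < rho:
                break                        # homophily filter passed
        endpoints += [t, target]
        degree[t] += 1
        degree[target] += 1
    return labels, degree
```

Comparing the average degree of red versus blue nodes as r and ρ vary is a quick way to see the organic degree gap emerge before any recommender is added.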
14
Degree distribution: organic growth vs. recommendation model.
Theorem: For 0 < r < ½ and 0 ≤ ρ ≤ 1, for the graph sequences G(n) of the organic model and G′(n) of the recommendation model, the red and blue populations each exhibit a power-law degree distribution, with different coefficients for the two populations. So, again, we want to show that the degree distributions of the red and blue nodes follow power laws with different coefficients, and that the recommendation model widens this gap.
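The exact coefficient expressions are given in the paper. As an illustration only, with function names of our choosing, here is how such an exponent gap can be measured empirically from the red and blue degree sequences, using the standard maximum-likelihood estimator for a power-law tail.

```python
import math
import random

def powerlaw_mle(xs, xmin=1.0):
    """Maximum-likelihood estimate of alpha for a continuous power law
    p(x) ~ x^(-alpha), x >= xmin: alpha = 1 + n / sum(log(x / xmin))."""
    xs = [x for x in xs if x >= xmin]
    return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)

# Sanity check on synthetic data with known exponent alpha = 2.5,
# drawn by inverse-transform sampling: x = u^(-1/(alpha - 1)).
rng = random.Random(0)
sample = [(1.0 - rng.random()) ** (-1.0 / 1.5) for _ in range(10_000)]
```

Fitting this estimator separately to the red and blue degree sequences of a simulated network gives two exponents whose difference quantifies the gap the theorem describes.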
15
Key takeaways Seemingly neutral algorithms may reinforce inequality
There is a trade-off between prediction rate and fairness. A future direction is algorithm design with awareness of network structure.
16
Thank you! Web: