Ethics of personalized information filtering
Ansgar Koene, Elvira Perez, Christopher J. Carter, Ramona Statache, Svenja Adolphs, Claire O’Malley, Tom Rodden, and Derek McAuley
HORIZON Digital Economy Research, University of Nottingham

Overview
- Public/private data
- Privacy: expressed concerns vs. expressed behaviour
- Interim summary
- Conditions for consent

Information Overload!
[Infographic: estimated data production in 2012 (www.domo.com)]

Information services (e.g. internet search, news feeds)
- free-to-use => no competition on price
- lots of results => no competition on quantity
- Competition is on quality of service
- Quality = relevance = appropriate filtering
- Good information service = good filtering

Why personalized filtering?
John and Jane Average have:
- 2.43 children
- 0.47 dogs & 0.46 cats
- 0.67 houses & 0.73 cars
John and Jane Average do not exist: results based on population averages are crude approximations.
Personalized filtering is a natural step in the evolution of information services.

Personalized filter/recommender systems
- Content-based – results similar to past results the user liked
- Collaborative – results that similar users liked (people with statistically similar tastes/interests; see the sketch below)
- Community-based – results that people in the same social network liked (people who are linked on a social network, e.g. ‘friends’)

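To make the ‘collaborative’ flavour concrete, below is a minimal sketch of user-based collaborative filtering. The interaction matrix, users, and items are all hypothetical, and production systems typically use matrix factorization or learned models rather than this direct neighbour approach.

```python
# Minimal sketch of user-based collaborative filtering (illustrative only).
import numpy as np

# Hypothetical user-item interaction matrix: rows = users, columns = items,
# entries = implicit feedback (1 if the user liked the item).
interactions = np.array([
    [1, 0, 1, 1, 0],   # user 0
    [1, 0, 1, 0, 1],   # user 1 (statistically similar to user 0)
    [0, 1, 0, 0, 1],   # user 2
])

def cosine_similarity(a, b):
    """Similarity of two users' taste vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def recommend(user, k=2):
    """Score unseen items by similarity-weighted votes of other users."""
    sims = np.array([cosine_similarity(interactions[user], interactions[u])
                     for u in range(len(interactions))])
    sims[user] = 0.0                          # ignore the user themselves
    scores = sims @ interactions              # weighted item votes
    scores[interactions[user] > 0] = -np.inf  # drop items already seen
    return np.argsort(scores)[::-1][:k]

print(recommend(0))  # items user 0 has not seen, liked by similar users
```
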
Concerns regarding personalization
- Social consequences: self-reinforcing information filtering – the ‘filter bubble’ effect
- Privacy: personalization involves profiling of individual behaviour/interests
- Agency: the filtering algorithm decides which segment of the available information the user gets to see
- Manipulation: people’s actions/choices depend on the information they are exposed to

User profiling: privacy
User profiling involves mining of data about (illustrated below):
- past behaviour of the user interacting with the service
- user behaviour on other services
  o through ‘tracking cookies’
  o through data purchased from other services
- the social network of a user, mapped and monitored together with the behaviour of the people within it

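As an illustration of the kind of profile such mining produces, here is a hypothetical sketch that folds interaction events (first-party clicks plus tracking-cookie data) into a topic-interest profile; the event format and topic labels are invented for the example.

```python
# Illustrative sketch of mining a behavioural profile from interaction logs.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    topic_counts: Counter = field(default_factory=Counter)

    def record(self, event: dict) -> None:
        """Fold one interaction event (click, view, like) into the profile."""
        self.topic_counts[event["topic"]] += event.get("weight", 1)

    def interests(self, top_n: int = 3):
        """Predicted interests: the most frequent topics so far."""
        return self.topic_counts.most_common(top_n)

profile = UserProfile()
for event in [
    {"topic": "politics", "weight": 2},   # first-party click data
    {"topic": "sports"},
    {"topic": "politics"},                # third-party tracking-cookie data
]:
    profile.record(event)

print(profile.interests())  # [('politics', 3), ('sports', 1)]
```
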
User profiling: (un)informed consent
Informed consent for profile building:
- is buried in long, difficult-to-understand Terms & Conditions that users click ‘accept’ on, usually without reading them
- the same consent is applied for years without explicit renewal

User profiling: security issues
The profile summarizes user behaviour patterns; its purpose is to predict the interests of the user.
Access to this information can facilitate:
- Phishing
- Social engineering for hacking

Agency: user vs. algorithm
- Filter algorithms provide competitive advantage, so details about them are often trade secrets
- Users don’t know how the information they are presented with was selected => no real informed consent
- Service users have no ‘manual’ override for the settings of the information filtering algorithms
- It is difficult for service users to know which information they don’t know about because it was filtered

Manipulation: conflict of interest
- Information filtering, or ranking, implicitly manipulates choice behaviour
- Many online information services are ‘free-to-use’: the service is paid for by advertising revenue, not by users directly
- Potential conflict of interest: promote advertisements vs. match user interests (a toy illustration follows)
- Advertising inherently tries to manipulate consumer behaviour
- Personalized filtering can also be used for political spin, propaganda, etc.

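The conflict of interest can be made concrete with a toy ranking function (not any provider’s actual algorithm): a single weight trades off relevance to the user against revenue to the service, and turning it up silently reorders what the user sees.

```python
# Toy illustration of the conflict of interest in ranking.
def rank(results, ad_weight=0.0):
    """Sort results by a blend of relevance to the user and revenue to the
    service; ad_weight > 0 tilts the ranking toward paying content."""
    return sorted(
        results,
        key=lambda r: (1 - ad_weight) * r["relevance"] + ad_weight * r["revenue"],
        reverse=True,
    )

results = [
    {"title": "independent review", "relevance": 0.9, "revenue": 0.0},
    {"title": "sponsored result",   "relevance": 0.4, "revenue": 1.0},
]

print([r["title"] for r in rank(results, ad_weight=0.0)])  # user-first order
print([r["title"] for r in rank(results, ad_weight=0.6)])  # revenue-tilted order
```
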
Evidence of public concern
- 2011 FTC investigation of Google for search bias
- EU competition regulation vs. Google
- Netflix Prize competition de-anonymization

Manipulation: conflicts of interest
- User data is privacy sensitive; users need to know how it is handled
- There is a role for a regulating authority, but also for:
  - tools to probe filtering criteria -> black-box testing (see the sketch below)
  - a user-friendly testing kit for the general public -> RRI -> so people can decide for themselves if they are happy with a service

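A black-box probe needs no access to the algorithm itself: one can query the same service from differently trained synthetic profiles and measure how far the result lists diverge. The sketch below assumes a hypothetical query_service stand-in for a real HTTP client; the profiles and canned results are invented.

```python
# Sketch of black-box probing: query the service as different synthetic
# profiles and measure divergence of the returned rankings.
def query_service(query: str, profile_id: str) -> list[str]:
    """Placeholder: return the ranked result list the service shows to the
    given synthetic profile (would be an HTTP call in practice)."""
    canned = {
        "left_leaning":  ["a", "b", "c", "d"],
        "right_leaning": ["c", "e", "a", "f"],
    }
    return canned[profile_id]

def jaccard_overlap(xs: list[str], ys: list[str]) -> float:
    """1.0 = identical result sets, 0.0 = fully personalized (disjoint)."""
    s, t = set(xs), set(ys)
    return len(s & t) / len(s | t)

r1 = query_service("climate change", "left_leaning")
r2 = query_service("climate change", "right_leaning")
print(f"overlap: {jaccard_overlap(r1, r2):.2f}")  # low overlap => heavy filtering
```
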
Conclusion
- Personalized information filtering is a natural evolution of information services’ interaction with the user
- It raises issues relating to privacy and data protection
- Lack of transparency -> concerns over agency & manipulation
- Potential for covert manipulation
- RRI -> researchers developing recommender algorithms have a responsibility for:
  - identifying and studying the socio-psychological impact of personalized filtering;
  - helping people to understand and regulate the level of privacy intrusion they are willing to accept for personalized information filtering;
  - developing a methodology to probe the subjective ‘validity’ of the information that is provided to users based on their own interests;
  - engaging with corporate information service providers to reinforce ethical practices.

Call for research programme
Technical development of tools:
- A black-box testing kit for probing the characteristics of the user behaviour profiles used in recommender systems
- A recommendation bias detection system for identifying user behaviour manipulation
- A two-layer recommender architecture that de-couples the delivery of non-personalized information by service providers from a user-owned/controlled system for personalized ranking of the information (sketched after the diagram below)
Psycho-social research on the impact of personalized information filtering on:
- The general exploration-exploitation trade-off in action selection
- Attitudes towards trust and critical evaluation of information
Cybersecurity:
- Protection against mal-use of personalized recommender systems for phishing-related social engineering
Policy:
- Development of guidelines for responsible innovation and use of recommender systems, protecting users’ privacy and freedom of access to information
Public engagement:
- Develop educational material to help people understand how recommendations they receive from search engines and other recommender systems are filtered, so that they can better evaluate the information they receive

[Diagram: two-layer system – data collection by the service provider (layer 1), filtering by the user (layer 2); cf. Acquisti et al. (2009)]

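A minimal sketch of how such a two-layer system could look in code, assuming hypothetical ProviderLayer/UserLayer classes: the provider returns non-personalized results and a user-owned layer re-ranks them locally, so the behavioural profile never leaves the user’s device.

```python
# Sketch of the proposed two-layer architecture (class names hypothetical).
class ProviderLayer:
    """Layer 1: service provider; no user profile involved."""
    def search(self, query: str) -> list[dict]:
        return [  # non-personalized, e.g. ranked purely by global relevance
            {"title": "result A", "topic": "sports"},
            {"title": "result B", "topic": "politics"},
        ]

class UserLayer:
    """Layer 2: runs on the user's own device with a local profile."""
    def __init__(self, topic_weights: dict[str, float]):
        self.topic_weights = topic_weights  # stays under the user's control

    def rerank(self, results: list[dict]) -> list[dict]:
        return sorted(results,
                      key=lambda r: self.topic_weights.get(r["topic"], 0.0),
                      reverse=True)

provider = ProviderLayer()
local = UserLayer({"politics": 0.9, "sports": 0.1})
for r in local.rerank(provider.search("today's news")):
    print(r["title"])  # "result B" first: personalization happens locally
```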