Public Attitudes Towards Algorithmic Personalization and Use of Personal Data Online

Evidence from Germany, Great Britain, and the United States
dc.bibliographicCitation.article: 117
dc.bibliographicCitation.issue: 1
dc.bibliographicCitation.volume: 8
dc.contributor.author: Kozyreva, Anastasia
dc.contributor.author: Lorenz-Spreen, Philipp
dc.contributor.author: Hertwig, Ralph
dc.contributor.author: Lewandowsky, Stephan
dc.contributor.author: Herzog, Stefan M.
dc.date.accessioned: 2023-03-28T13:40:32Z
dc.date.available: 2023-03-28T13:40:32Z
dc.date.issued: 2021
dc.date.updated: 2023-03-28T09:52:37Z
dc.description.abstract: People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people’s attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people’s data privacy concerns and behavior using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: people across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: a large majority of respondents rated personalized services, on average, as more acceptable than the collection of the personal data and information those services require. This gap can be observed at both the aggregate and the individual level; across countries, between 64% and 75% of respondents showed it. Our findings suggest a need for transparent algorithmic personalization that minimizes the use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.
dc.identifier.doi: 10.1057/s41599-021-00787-w
dc.identifier.uri: http://resolver.sub.uni-goettingen.de/purl?fidaac-11858/2936
dc.language.iso: en
dc.relation.issn: 2662-9992
dc.relation.journal: Humanities and Social Sciences Communications
dc.rights: CC BY 4.0
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject.ddc: ddc:300
dc.subject.ddc: ddc:070
dc.subject.field: americanstudies
dc.subject.field: digitalhumanities
dc.subject.field: britishstudies
dc.subject.field: socialscience
dc.title: Public Attitudes Towards Algorithmic Personalization and Use of Personal Data Online
dc.title.alternative: Evidence from Germany, Great Britain, and the United States
dc.type: article
dc.type.version: publishedVersion
dspace.entity.type: Publication
Files
Original bundle
Name: s41599-021-00787-w.pdf
Size: 8.38 MB
Format: Adobe Portable Document Format