Abstract
The role of tradeoffs and moralization in the adoption of big data technologies
Public summary
Big data technologies have emerged in various domains such as criminal investigations, healthcare, hiring, and retail, and come with both benefits and costs. They can improve healthcare and help solve crimes faster, but continuous surveillance can also violate people's privacy, and biased algorithms can be discriminatory. Researchers and ethicists often voice concerns over the use of these technologies in language that indicates a moral attitude, yet people continue to use them despite these concerns. If people indeed hold moralized attitudes towards these technologies, such attitudes can be hard to change and can make people intolerant of those who disagree with them.
In my dissertation, using empirical research methods, I addressed the following questions: 1) How do people make tradeoffs when the benefits of these technologies are pitted against the costs? 2) How do they make these tradeoffs when others' privacy is violated? 3) To what extent do people hold moralized attitudes towards big data technologies? 4) How can moralized attitudes be changed?
Overall, I found that people place importance on both data protection and the benefits of the technology. However, people are willing to give up others' privacy as long as their own privacy is protected. Individual differences such as sensitivity to justice play a role in both tradeoffs and moralization: people with higher concerns for justice place more importance on privacy and hold more moralized attitudes towards the technologies. Individual differences in how people acquire privacy information are also associated with how they make tradeoffs.

Regarding the role of moralization, people vary in their moralized attitudes towards these technologies and show no consensus in moralizing them. Both moral and non-moral persuasive messages opposing the technologies change people's attitudes. However, moral messages increase moralization and lower people's willingness to compromise, whereas non-moral messages reduce levels of moralization. This finding suggests that moral messages, which are often used to persuade people, should be used cautiously, as they can have side effects such as increased moralization and a lower willingness to compromise.

The findings of this dissertation provide a foundation for understanding people's relationship with new big data technologies. They shed light on situations where people make tradeoffs not just for their own privacy protection but also when others are affected by these decisions. They contribute to theories of decision-making and moralization by applying existing theories to a new problem and simultaneously updating those theories.
| Original language | English |
|---|---|
| Qualification | Doctor of Philosophy |
| Supervisors/Advisors | |
| Award date | 20 May 2022 |
| Place of Publication | s.l. |
| Publisher | |
| Publication status | Published - 20 May 2022 |
Datasets
- Dissertation - The role of tradeoffs and moralization in the adoption of big data technologies - Chapter 3
Kodapanakkal, R. I. (Creator), DataverseNL, 28 Oct 2021
DOI: 10.34894/k97qwj, https://dataverse.nl/citation?persistentId=doi:10.34894/K97QWJ
Dataset