ABSTRACT
For several years, scholars have (for good reason) been largely preoccupied with worries about the use of artificial intelligence and machine learning (AI/ML) tools to make decisions about us. Only recently has significant attention turned to a potentially more alarming problem: the use of AI/ML to influence our decision-making. The contexts in which we make decisions--what behavioral economists call our choice architectures--are increasingly technologically laden. Which is to say, algorithms increasingly determine, in a wide variety of contexts, both the sets of options we choose from and the way those options are framed. Moreover, AI/ML makes it possible for those options and their framings--the choice architectures--to be tailored to the individual chooser: they are constructed on the basis of information collected about our individual preferences, interests, aspirations, and vulnerabilities, with the goal of influencing our decisions. At the same time, because we are habituated to these technologies, we pay them little notice. They are, as philosophers of technology put it, transparent to us--effectively invisible. I argue that this invisible layer of technological mediation, which structures and influences our decision-making, renders us deeply susceptible to manipulation. Absent a guarantee that these technologies are not being used to manipulate and exploit, individuals will have little reason to trust them.
Invisible Influence: Artificial Intelligence and the Ethics of Adaptive Choice Architectures