The Meaning of ‘Good Design’ in the Age of Smart Automation: Why Human-Centered Design Needs Ethics

Rodrigo Hernández Ramírez
http://orcid.org/0000-0002-8214-8185

Abstract

The increasing adoption of smart automation has improved people’s lives in several ways, but it has also brought a host of new problems, such as deskilling, the deepening of structural inequalities, new forms of exploitation, loss of privacy, and the curtailment of human liberties. This paper begins by assuming that such issues are the consequence of poor design and takes the opportunity to analyse, in turn, what “good design” should mean. Following insights from mediation theory and the philosophy of technology, it surveys the complexities inherent in automation and argues that Human-Centered Design (HCD) continues to endorse an instrumentalist conception of technology. The paper shows that such a conception of human–technology relations significantly limits designers’ capacity to approach design from a genuinely ethical standpoint. It concludes with a sketch of principles that HCD should incorporate to become a truly humanist and ethically minded design approach.

Keywords: Ethics, Instrumentalism, Mediation theory, Philosophy of design, Smart automation, User-Centered Design
