Moralizing Technology: Understanding and Designing the Morality of Things
University of Chicago Press, 2011
Christine Rosen has written a long and excellent review-essay in The New Republic on the recent book on the moral dimension of technology by Prof. Peter-Paul Verbeek of the University of Twente in the Netherlands.
Interaction designers ought to reflect on the fact that Verbeek locates morality not just in the human users of technology but in the interaction between us and our machines. In this affair, human beings no longer hold the autonomous upper hand when it comes to moral agency; rather, Verbeek argues, we should replace that notion with one that recognizes “technologically mediated intentions.”
In a world where new technologies seek to seduce us with the language of self-improvement, and where smart algorithms subconsciously bypass our emotional and cognitive “imperfections” in order to make us more efficient, those interested in behavioural change should be aware that this also breeds moral laziness and erodes individual freedom. “Freedom,” Verbeek says, “is a hollow promise in the absence of agency and choice.”
And all of us should be intrigued to read that Enlightenment principles of human autonomy are, according to Verbeek, “no longer sufficient grounds for moral thinking in an era whose technologies are as ubiquitous and powerful as our own.” Rosen also quotes Alex Pentland, who argues in Honest Signals, his book about sociometers: “We bear little resemblance to the idealized, rational beings imagined by Enlightenment philosophers. The idea that our conscious, individual thinking is the key determining factor of our behavior may come to be seen as foolish a vanity as our earlier idea that we were the center of the universe.”