Technology paternalism, ubicomp and the role of exceptions

In "Technology paternalism – wider implications of ubiquitous computing", S. Spiekermann and F. Pallas deal with how people can maintain control in environments that are supposed to be fully automated. They coined the term "technology paternalism" to describe the situation where "people may be subdued to machines’ autonomous actions". They take the example of a car that beeps when you don't fasten your seatbelt and show how such situations meet the same criteria as the ones that define paternalism:

"the definition of Technology Paternalism extends the general notion of paternalism with respect to two aspects: one is that actions are being taken autonomously by machines. The other one is that by their coded rules, machines can become ‘absolute’ forces and therefore may not be overrulable any more."

They discuss this tension in conjunction with Weiser's notion of calm computing:

"If machines are controlled, then they are not calm any more. There is a clear disaccord between the concept of disappearing technologies and the attempt to remain in control. Control premises attention and visibility whilst Ubicomp environments are designed to be invisible and seamlessly adaptive. Can this dissonance really ever be resolved?"

And of course there is a part about who is responsible for technology paternalism:

"Of course, this power does not lie in the hands of technology itself. Technology only follows rules implemented into it. Therefore, the question arises: who WILL be the real patrons behind Technology Paternalism if it were to become a reality? Who will decide about the rules, the ‘rights’ and ‘wrongs’ of every-day actions? (...) three groups as the potential patrons behind Technology Paternalism: engineers and marketers of Ubicomp technologies as well as regulators influencing application design. "

Why do I blog this? Some relevant issues regarding the notion of control in ubicomp. The authors finally come up with a series of recommendations. The one that strikes me as fundamental is the following: "there should be a general possibility to overrule ‘decisions’ made by technology and any exceptions from this should be considered very carefully". The notion of exception is a crucial issue that is often downplayed by many of the engineers I have talked to about autonomous technologies such as "intelligent fridges" or location-based services. Exceptions break the patterns and habits tracked by sensors, disrupt machine learning algorithms, and are eventually impediments to prediction-based systems that would send emergency messages to 911 because granny did not open her fridge for 2 weeks (because she unexpectedly decided to visit her grandson).