(This is a continuation of musings on Matthew Crawford’s Why We Drive. Read Part 1 here.)
Modern technology is opposed to the development of virtue.
Machinery that humans operate has been moving away from manual gauges and towards automated alerts. While the manual gauge - or even more so, the seat of the pants - demands the driver's attention, the automated alert does not. Instead it invites us to have faith that the alarm will sound when there is a problem. But what if it doesn't? Then the driver is thrust from leisurely couch-sitting into a disaster in which all the automation is falling apart.
Human factors research shows how making things too easy for people can backfire, because our "attentional capacity... shrinks to accommodate reductions in mental workload." This is especially worrying because it is hard to detect: the operator simply tunes out because he doesn't have enough to do. Further, a driver (or pilot) who is understimulated during routine operation is more likely to panic when overstimulated, as happens when dealing with a failure of the automation.
As much as I think we get overstimulated by media, Crawford is absolutely onto something here. In productive endeavors, especially those that use the hands, we are chronically UNDER-stimulated.
No wonder we want to watch TikTok while driving.
The cars aren't fun anymore - not just because we've gotten used to them, but because they demand less of us. Now, the Lord asks us to stay awake and keep watch. Is there a level of automation that encourages us not to?
The virtue of alertness is one front, but there are others. Machine thinking despises prudence - it writes prudence right out.
If there is an overall lesson to be learned from the human factors literature, perhaps it is this: automation has a kind of totalizing logic to it. At each stage, remaining pockets of human judgement and discretion appear as bugs that need to be solved.
And this becomes a political question as much as a technical one.
As a political reality, the contest for control between humans and computers often looks like no contest at all.
We have conditioned ourselves to expect this!
Wherever large groups assemble, there is an imperative to control every aspect of the environment and prescribe every move - every allowed use. Usually it is some private entity that does this, not the government. Worse, it becomes an unthought posture we adopt for ourselves, having been trained to think of ourselves as consumers of manufactured experiences rather than as rational creatures capable of dealing with the world in an unfiltered way.
And of course, the desire not to exercise virtue is at bottom laziness.
This is a laziness that the technocrats of our time are well aware of. Eric Schmidt (of Google) told the Wall Street Journal:
"One idea is that more and more searches are done on your behalf without you having to type... I actually think most people don't want Google to answer their questions. They want Google to tell them what they should be doing next."
We want rest and freedom. Of course, true freedom is not to be found through faith in machines, but in the development of virtue.
Qualities once prized, such as spiritedness and a capacity for independent judgement, are beginning to appear dysfunctional. Our machines, if they are to operate optimally, require deference. Perhaps what is required is an adaptation of the human spirit, to make it more smoothly compatible with a world run by a bureaucracy of machines. Or maybe we need to burn that house down.