InsightaaS: I had the good fortune to spend some time with a friend – a physician – last week. We talked about many different things, but one that stuck with me was his critique of automation in healthcare. Time spent typing, he said, is time that isn’t spent looking at patients, talking with them, forming an understanding of what might be wrong with them. The good doctor isn’t a Luddite – we also spent a fair amount of time talking about the need for more and better data on which to base understanding of what is and should be happening with respect to treatments (and physician measurement) – but he strongly believes that technology is and should be subordinate to human skills in diagnosis and care delivery.
Despite being most famous for an article entitled “IT Doesn’t Matter,” Internet/IT philosopher Nicholas Carr isn’t a Luddite, either. However, like my doctor friend, Carr is concerned with the “de-skilling” effect of expert systems. In a piece published in the Wall Street Journal, Carr argues that AI-based systems are eroding the skills of people – physicians, airplane pilots, architects – who should be amongst the best trained and most perceptive professionals in our society. His is a difficult argument to counter: for example, increased use of autopilots and “fly by wire” systems has made air travel safer, but “flying skills decay quite rapidly towards the fringes of ‘tolerable’ performance without relatively frequent practice,” and “even a slight decay in manual flying ability can risk tragedy. A rusty pilot is more likely to make a mistake in an emergency.” Similar trends are seen in other professions.
Is the right idea to ratchet up automation even further, or to look to another model? Carr is strongly persuaded that the latter is the better path. He argues that, rather than continue to rely on “technology-centered automation,” society would be better served if we deploy “human-centered automation,” in which “the talents of people take precedence.” In his article, Carr states that “Human-centered automation doesn’t constrain progress. Rather, it guides progress onto a more humanistic path, providing an antidote to the all-too-common, misanthropic view that venerates computers and denigrates people.”
Carr doesn’t offer a lot of sound guidance with respect to how this model will play out in the industries he discusses, and no guidance whatever with respect to how his goals can be aligned with the profit motives of the corporate leaders who make decisions on what form of automation their enterprises will pursue (and fund). He does make a concluding point that will resonate with many readers, however: “If we let our own skills fade by relying too much on automation, we are going to render ourselves less capable, less resilient and more subservient to our machines. We will create a world more fit for robots than for us.”
Artificial intelligence has arrived. Today’s computers are discerning and sharp. They can sense the environment, untangle knotty problems, make subtle judgments and learn from experience. They don’t think the way we think – they’re still as mindless as toothpicks – but they can replicate many of our most prized intellectual talents. Dazzled by our brilliant new machines, we’ve been rushing to hand them all sorts of sophisticated jobs that we used to do ourselves.
But our growing reliance on computer automation may be exacting a high price. Worrisome evidence suggests that our own intelligence is withering as we become more dependent on the artificial variety. Rather than lifting us up, smart software seems to be dumbing us down.
It has been a slow process. The first wave of automation rolled through U.S. industry after World War II, when manufacturers began installing electronically controlled equipment in their plants. The new machines made factories more efficient and companies more profitable. They were also heralded as emancipators. By relieving factory hands of routine chores, they would do more than boost productivity. They would elevate laborers, giving them more invigorating jobs and more valuable talents. The new technology would be ennobling.
Then, in the 1950s, a Harvard Business School professor named James Bright went into the field to study automation’s actual effects on a variety of industries, from heavy manufacturing to oil refining to bread baking. Factory conditions, he discovered, were anything but uplifting. More often than not, the new machines were leaving workers with drabber, less demanding jobs…