InsightaaS: Andrew McAfee is well known as a commentator on the business of, investment trends within, and social implications of IT. In this post from his "The Business Impact of IT" blog, he provides an interesting perspective on IT investment. A first set of data demonstrates that since 2000, the proportion of US GDP allocated to technology has been shrinking. Does that mean IT is less important than it once was? McAfee doesn't believe so, and he digs below the surface to explain why: a second data set shows that software investment ramped up heavily until 2000 and has remained more or less steady since, while a third compares software and total IT spending, demonstrating that software is becoming the dominant area of IT investment. McAfee's storyline ties these points together: "Hardware runs software," he says, "and it's software that runs things"; and while expenditures on software have been flat overall for the past 15 years, "open source software and the cloud and everything-as-a-service...have significantly lowered the bill for a given level of enterprise software capability, so I look at the graph above as pretty good evidence of constantly increasing demand for software." In the end, McAfee finds a disconnect between traditional measures of spending and the overall focus of businesses on IT: "It does feel to me like a sea change is taking place – that it's getting so much cheaper to acquire digital technologies that even if demand for them rises strongly in the future total spending on them might not."
My MIT colleague David Autor delivered a wonderful paper at the recent Jackson Hole Economic Policy Symposium about American job and wage patterns in recent decades, and their link to the computerization of the economy. I’ll say more later about his paper, which was one of the highlights of the event for me (sighting this moose was another one). For now I just want to highlight one graph that he included, and draw a couple of additional ones.
Autor included a chart very much like the one below, which tracks US corporate spending over time on digital stuff – hardware and software – as a percentage of GDP:
The most striking pattern in this graph is the sharp increase in the late 1990s and the steep falloff since then. We’re spending just about a full percentage point of GDP less on IT than we were fifteen years ago. This seems like a compelling prima facie case for believing that IT’s impact on the economy and the labor force is less than it was before the turn of the century...