IT: Visionaries and futurists

The field of IT and system administration is infested with visionaries. It’s a relatively new field; by the most generous definition, there have only been IT workers and sysadmins for 40 years. That makes computers and IT a sexy subject for people who feel they’re so utterly brilliant that they can make a living predicting the future, and they love to talk on and on about it.

There’s no shortage of visionary types. Lots of them end up in New York making and then losing piles of money. The financial markets are really nothing more than a huge gambling sport, in which every player thinks they understand the fiendishly complex game better than the other players. They’re right for a time, and then tragically wrong. And it’s no surprise they end up wrong, no matter how brilliant they are. After all, they’re making bets all over the place: one person is invested in Latin American mining interests, Chinese textiles, and European luxury goods; another might be in American software, Finnish timber, and Korean steel. How can anyone reasonably expect to understand all these industries well enough to even grasp the present, never mind guess what’s going to happen?

They certainly don’t. So instead, financial analysts build models. They abstract out the way they expect a given industry to work, and use that model to describe the characteristics of a company that should expect growth. They then use the model to find companies and investment opportunities that match those few magic qualities that make a sector successful, and bet their stakes — and other people’s money — on that. That’s all investing is: trying to distill what really matters in the face of the impossible task of fully understanding the wide world.

And it’s also why investing is perilous. Models are necessarily finite in their understanding; they capture only a subset of the data in an economy, and only a subset of all possible conditions. They work for a time, if at all, but never forever. When the economy shifts underneath the model, its assumptions may no longer hold, and investments based on it turn out to be bad. When that happens to a particularly popular model, we get a Wall Street meltdown, in which a whole lot of smart people look awfully stupid because at some point they forgot that their understanding of their own markets was woefully incomplete.

Technological futurists tend to be equally flawed, but they have the advantage that no one seems to care when they’re wrong, and people keep buying their books anyway. It helps when you don’t have a billion-dollar bet riding on your prophecy, I suppose. But lately Nick Carr has been the worst of the bunch, assuming the deanship of the analysts that Rob Enderle ceded when he hitched his wagon to Microsoft and Microsoft turned out not to be the leader of anything anymore.

Carr is fond of telling us that IT is dead, and that computing power will become yet another utility, like a power outlet or a hot water pipe: you plug anything into any power outlet in America and the expected thing happens. Nothing differentiates one type of power from another, or one company from another, besides cost. Power then becomes a platform for other innovations, but it’s nothing anyone should generate on their own.

Carr asserts that computing equals power, and then leaps further to say that a company can gain no competitive advantage by innovating in IT, just as companies no longer run their own water turbines and gasoline-powered generators. Any IT innovation will simply be copied by your competitors and nullified anyway. That means everyone has the same IT, and therefore it’s a commodity; you can’t compete on better electricity, and so you can’t compete on better IT. So by all means, outsource whenever you can, concentrate on your “core” business, whatever that means, and wait for the day when email and computing power are standard and live in the “cloud.” This magical “cloud” will take over and rule all.

When people ask me what the computing cloud means, I never have an answer, because I have no friggin’ clue. I think it means he assumes Google is going to do all the programming for me. That would be nice, but I doubt it.

First, Carr’s hypothesis assumes an equal level of competence among company managements. Anyone who’s ever had a boss can tell you otherwise on that one. It also assumes equal talent among workers; also untrue.

Finally, and most critically, Carr’s leading “insight” rests on a flawed assumption about what IT does. A lot of people see me and my compatriots as the tenders of the sacred shrine of the machine room; we know the rituals and incantations to Keep The Damn Things Running, and we should be measured by how well we do so. Lots of sysadmins see their role that way too. If that were the entirety of it, then yes, I suppose we could be outsourced to wherever you please, and computing would become a utility.

But keeping the damn things running is truly only the prerequisite to what we really do. My company, like most these days, is in the business of knowing things. Knowledge gets turned into value, and value gets turned into profit. These days companies also have far more data than fits inside people’s heads, so it lives in databases too vast to understand directly. That data needs to be processed and abstracted to derive its value. That’s what computing power does: it processes data into knowable pieces, taking the conditions of the world and turning them into insight. You don’t truly know how to use a computer’s potential until you know how to program it.
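To make that concrete, here is a toy sketch of the kind of distillation I mean. Nothing in it comes from a real system; the records, field names, and numbers are all invented for illustration. A few lines of code collapse a pile of raw records into a summary a person can actually act on.

    # Toy illustration: collapse raw order records into per-region revenue.
    # Every value here (regions, amounts, field names) is made up.
    from collections import defaultdict

    orders = [
        {"region": "east", "amount": 1200.00},
        {"region": "west", "amount": 340.50},
        {"region": "east", "amount": 87.25},
    ]

    revenue = defaultdict(float)
    for order in orders:
        revenue[order["region"]] += order["amount"]

    for region, total in sorted(revenue.items()):
        print(f"{region}: ${total:,.2f}")

Trivial, yes, but scale the list up to a billion rows and the principle is the same: someone has to write, run, and tend the code that turns data into knowledge.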

The sysadmin is the gatekeeper. Our job is to funnel the business’s need to understand and use the data it already has, together with its need to keep that data safe from disaster, secure from attack, and available on demand, into actual technology that performs those functions. Can that really be outsourced to an appliance? I doubt it; it’s a custom application. One type of database will not suit everyone’s needs, even within a narrow market. One type of communications medium may be perfect in one place and horrid in another. Everyone uses email in a standard way, but no one does real business computing in a standard way, because there’s no abstraction that makes it general for all purposes.

Software programming is hard. It’s deeply abstract, and it’s fiendishly difficult to manage shifting requirements and changing ideas of what the software should do. It’s hard because it’s an attempt to translate human desires — grey, ambiguous, difficult to communicate — into the concrete thought structures of software. Things that difficult and that intricate — and that tied and tailored to specific needs — are not easily made into commodities. Commodity implies one size fits all, but here the detailed complexity of most problems demands one-off solutions.

As long as one-off solutions are necessary — as long as the nature of what I do changes dramatically whenever I take a new job, even when the job description exactly matches the previous one’s — there’s no opportunity for commodity economics. Indeed, people who take Carr’s advice and rely on “the cloud” will find that information, their most important resource these days, becomes hard to find and harder to access.

It’s no mystery how this happens. Carr is a writer, not a sysadmin, and not an IT worker. He studies the problem but doesn’t live it, just like a financial analyst on Wall Street. He comes up with a model, applies it to the real problem, and generates headlines by making the counter-intuitive pronouncement that you can get something that’s always been difficult — computing — with no effort or investment. He himself probably uses entirely commodity applications, and can survive solely on a diet of Google mail, word processors, and the like. But understanding doesn’t come from that; understanding comes from integrating details together, and things that sound right are often proven wrong when they come into contact with reality.

Expertise comes from the bottom up: knowing which factors lie in wait to trip up a big plan with a small snag, and being able to recognize a brilliant synthesis of a thousand details. In a way, if Carr had really done the analysis, he never would have arrived at his model and thesis; the data and analysis needed for a more complete picture of the IT industry would have been difficult enough to assemble on computers that he might have recognized the role IT workers play in making exactly that kind of work easier.

So no, IT isn’t going away, nor should you want it to. IT cuts can sometimes look good on paper, but the resulting drag of inefficiency in the rest of your business can easily set in, leaving your business sluggish, unresponsive, and unaware even of what’s going on inside it, never mind in your market as a whole. The more expert your IT folks are in both the technology and your business itself, the more agile everything will be. But if that bridge between technology and the business is never built, and you assume that understanding your data is just a commodity, things will likely go very poorly.