During its own Sept. 29 Cloud Computing Summit, IDC made what at first glance sounded like a self-defeating statement: clouds, in the long run, are actually a more expensive option than a company running its own datacenter.
That, needless to say, got my attention.
The story came out today from Computerworld UK, which I invite you to read. It gives good coverage of the presentation made at the Summit by Matthew McCormack, Consultant, European Systems Group, IDC. Lots of good quotes. I wanted to see the numbers, though, and while trying to contact McCormack, I found just the information I was after.
IDC ran a model that analyzed what it would cost a large company to run 100 percent of its IT infrastructure entirely in the cloud versus in its own datacenter. It turns out that, according to IDC's estimates, after the first three years of a cloud IT operation, costs would exceed those of an optimally run datacenter, and that includes the datacenter's added cost of a 30 percent hardware refresh every three years (see slide 6). By the end of 10 years, costs for the cloud operation would hit £26M (US$41M), while the datacenter would only rack up £15M (US$24.6M).
Of course, if your datacenter is being run very poorly, cloud computing can save you money in both the short and long term (slide 7). A badly run datacenter could cost as much as twice what a cloud would over ten years.
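To make the shape of IDC's claim concrete, here is a minimal back-of-envelope sketch in Python. The specific annual figures are my own assumptions, calibrated only so the ten-year totals land near the article's £26M/£15M numbers and the crossover falls around year three; IDC's actual model inputs aren't public in this level of detail.

```python
# Illustrative back-of-envelope model of IDC's comparison (slide 6).
# All figures below are assumptions, loosely calibrated to the totals
# reported in the article (~GBP 26M cloud vs ~GBP 15M datacenter after
# 10 years); they are not IDC's actual inputs.

YEARS = 10
CLOUD_ANNUAL = 2.6     # flat cloud subscription, GBP millions/year (assumed)
DC_BUILD = 4.0         # up-front datacenter capex, GBP millions (assumed)
DC_ANNUAL = 0.74       # datacenter opex, GBP millions/year (assumed)
REFRESH_RATE = 0.30    # 30% hardware refresh, per the article
REFRESH_CYCLE = 3      # every three years, per the article

cloud_total, dc_total = 0.0, DC_BUILD
for year in range(1, YEARS + 1):
    cloud_total += CLOUD_ANNUAL
    dc_total += DC_ANNUAL
    if year % REFRESH_CYCLE == 0:
        dc_total += REFRESH_RATE * DC_BUILD  # periodic refresh cost
    print(f"Year {year:2d}: cloud GBP {cloud_total:5.1f}M  "
          f"datacenter GBP {dc_total:5.1f}M")
```

Run it and the cumulative cloud line crosses the datacenter line around year three, ending near £26M versus £15M at year ten, which is the pattern the IDC slides describe: the cloud's flat subscription eventually outweighs the datacenter's big up-front spend plus periodic refreshes.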
Clearly, the reality is somewhere in between: no company runs its own datacenter perfectly, but it's also unlikely that complete idiots would bungle a datacenter that long without some sort of course correction. It seems, then, that ultimately, whether the crossover comes at three years or five or beyond, cloud computing may not make long-term financial sense. Is that the case?
McCormack seems to think not. For one thing, he predicts in the Computerworld UK story “that cloud service costs will eventually fall, as more competitors enter the fray, and as suppliers act to improve market take-up.” This is key to the long-term growth of the cloud computing environment.
Right now, cloud computing is a lot like buying a hybrid car in the US was a few years ago. If you researched a hybrid car, invariably someone had done a cost-benefit analysis confirming that the value of the fuel savings would never equal the additional cost of buying the hybrid version of a particular car model versus the standard-engine version. Lately, though, things have changed. Gas prices have gone up and, for the most part, stayed up. Consumers are much more “green”-aware and want hybrids despite the cost. There are more hybrid models available now than there were a few years ago. All of these factors are driving up demand and driving down costs as car companies seek to stay competitive.
The same will hold true for cloud computing. Yes, right now you need to carefully weigh the costs of using cloud computing for your company. But in the near future it should become increasingly common to see the cloud as the better economic option as costs come down. Standards, like those around Linux, are key here, too. McCormack cautioned in his presentation that customers should avoid vendor lock-in, and that is exactly the sort of lock-in that projects like Red Hat's Deltacloud are trying to prevent.
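To illustrate the lock-in point, here is a rough sketch of the idea behind Deltacloud: one REST API fronting many providers, so your code doesn't change when the backend does. The endpoint URL, port, and JSON response shape below are assumptions drawn from my recollection of the project's docs, not verified against a running server; the credentials are placeholders.

```python
# Sketch of the Deltacloud idea: a single REST API in front of many
# cloud providers. Assumes a local Deltacloud server started with
# something like `deltacloudd -i ec2`; the port (3001) and JSON
# response layout are assumptions, not verified here.
import requests

API = "http://localhost:3001/api"

# The same request lists instances whether the driver behind the
# server is EC2, RHEV-M, or another provider -- that's the hedge
# against vendor lock-in.
resp = requests.get(f"{API}/instances",
                    headers={"Accept": "application/json"},
                    auth=("API_KEY", "API_SECRET"))  # provider credentials
resp.raise_for_status()

for inst in resp.json().get("instances", []):
    print(inst.get("id"), inst.get("state"))
```

The design point is that switching providers becomes a server-side driver change rather than a rewrite of every script that touches your infrastructure.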
So while the initial read might come across as off-putting, really this was just IDC injecting a bit of common sense into the hype around cloud computing. And common sense is never a bad idea.