When evaluating the pros and cons of moving to the cloud, one of the most obvious considerations is the cost of an on-premise data center versus a cloud-hosted one. Countless variables make the total cost of ownership unique to each business, but here are some high-level considerations.
On-premise hosting comes with large up-front fees for server hardware, whereas the cloud is a recurring monthly cost where you pay as you go. A less obvious factor is that on-premise hosting carries its own recurring costs, such as the power the servers draw and the physical office space they occupy. Depending on the size of your organization, these can be significant. However, the main cost consideration I want to address is downtime. This can be brought about by natural or man-made disasters: fires, floods, burst pipes, and so on. How much does downtime really cost? The answer might surprise you once you combine your total affected revenue per hour with your total labor cost per hour.
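To make that combination concrete, here is a minimal back-of-the-envelope sketch of the downtime math described above. All dollar figures are illustrative placeholders, not data from any real outage.

```python
def downtime_cost(revenue_per_hour: float, labor_cost_per_hour: float,
                  hours_down: float) -> float:
    """Estimate the cost of an outage: lost revenue plus idle labor."""
    return (revenue_per_hour + labor_cost_per_hour) * hours_down

# Example: $8,000/hr in affected revenue, $2,000/hr in labor,
# and a 3-day (72-hour) outage while replacement hardware arrives.
print(f"${downtime_cost(8_000, 2_000, 72):,.0f}")  # prints $720,000
```

Even modest hourly figures add up quickly once an outage stretches into days, which is exactly the scenario the next paragraph describes.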
When you’re replacing on-premise servers, you have to allot time for ordering, receiving, configuring, and installing the replacement. At that point you are looking at days, or in some cases even weeks.
When your critical workload is in the cloud, that conversation goes from days and weeks to seconds and minutes.
Not all cloud solutions are created equal, but features like co-location redundancy and high availability can guarantee 99.999% uptime.
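For a sense of what that "five nines" figure means in practice, here is a quick sketch converting the uptime percentage into allowed downtime per year (the constants are just the standard calendar arithmetic):

```python
# What 99.999% uptime allows in downtime over a year.
uptime = 0.99999
minutes_per_year = 365.25 * 24 * 60  # ~525,960 minutes in a year
allowed_downtime = (1 - uptime) * minutes_per_year
print(f"{allowed_downtime:.2f} minutes of downtime per year")  # ~5.26 minutes
```

Roughly five minutes a year, compared with the days or weeks of a hardware replacement cycle.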
This leads us perfectly into next week’s episode where we unpack more in the realm of continuity. Be sure to join us next week to continue the conversation.