I prefer the term utility computing to cloud computing. People outside of software development understand that electricity, water, and phone service are all utilities. Cloud computing is an attempt to deliver compute resources at utility prices. Before I define what utility computing is, I want to define what utility computing isn’t.
Some companies, such as IBM, are trying to do a "me too" with cloud computing. They use the term cloud computing because that amorphous word, cloud, does not have a clear meaning, and in doing so they confuse the computing public. Why? They are conflating their virtualization products with utility computing in order to muddy the market and continue making sales (note: most big iron and *nix vendors have EXCELLENT virtualization technology). Virtualization of compute resources is an old story that mainframe vendors have had working well since the 1960s. More recently, companies like VMware, Citrix, and Microsoft have made virtualization of compute resources commonplace. Virtualization lets a customer remain ignorant of what the underlying hardware is, while keeping control over the operating system that stores files, runs applications, authenticates users, and communicates with other operating systems. Virtualization lets an entity buy a high-powered machine and then load it up with operating system images that each get to pretend they are the only operating system on the hardware. Virtualization presents a simplified view of the hardware, including limited views into memory, CPU, and storage, and with virtualization one still worries about access to memory, CPU, and storage. When a vendor like IBM or Sun says you can own your cloud, they mean that you can write applications that demand to run as if in a cloud, but you still have to worry about having enough storage, CPU, and memory to get the job done. This space is important for lots of reasons, but it is different from utility computing.
Cloud computing is a mechanism to deliver compute and storage resources where the user does not need to know how those resources are provisioned, who else might be on the same hardware, or what underlying technology provides those services. The closest analogs to utility computing are other utilities: electricity, water, gas, cable TV, Internet, and cell phone carriers, to name a few. Utilities share a common split of responsibilities between the consumer and the provider.
- The consumer picks who provides the utility.
- The consumer is responsible for limiting consumption.
- The consumer views available resources as infinite.
- The provider procures resources to deliver the utility.
- The provider decides how the utility is delivered.
- The provider makes sure that one user does not adversely affect other users.
- The provider dictates the mechanisms used to consume the utility.
Virtualization does allow most of these characteristics to appear. It does not, however, allow a user to treat available resources as infinite, and it requires the consumer to also be a provider. If you rent a Windows Server or *nix instance as a virtual machine you can’t see, you have a utility operating system, not a utility compute resource. Your machine still has fixed storage and CPU.
There are many firms saying that they offer cloud computing; three are very well known.
Of the three, Amazon is the odd man out, offering a hybrid of utility virtual instances: you can spin up a Windows Server or Linux OS, but that instance can only get durable storage through the effectively infinite Simple Storage Service, S3.
Utility computing provides a benefit that virtualization does not: with utility computing, the professionals who handle data redundancy, keep servers running, and add compute power when needed become the provider’s problem, not yours. When your computing needs indicate that you need more or less, you just take what you need. There is no need to negotiate for extra compute utility or to keep those resources when they are not in use. Virtualization means you have the resources 24/7. Utility computing means you have the resources only when you need them; the rest of the time, you give them back. That is a strength of utility models. The owner of the resource can be creative about how to handle the demand flow by turning resources on and off. Consumers only have to worry about what they need now and how much they can afford to consume. The rest is automatic and transparent.
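The economics above can be sketched with a toy calculation. All the numbers here are made up for illustration: a hypothetical $0.10 per server-hour price and an invented daily demand curve with a midday spike. The point is the shape of the comparison, not the figures.

```python
# Toy cost comparison: fixed provisioning (virtualization: you pay for
# peak capacity 24/7) versus utility pricing (you pay only for what you
# consume each hour). Prices and demand are hypothetical.

HOURLY_RATE = 0.10  # hypothetical price per server-hour

# Hypothetical servers needed per hour over one day (demand spikes midday).
demand = [2] * 8 + [10] * 8 + [2] * 8

# Virtualization model: provision for peak demand, all day, every hour.
fixed_cost = max(demand) * len(demand) * HOURLY_RATE

# Utility model: pay only for the server-hours actually consumed.
utility_cost = sum(demand) * HOURLY_RATE

print(f"fixed provisioning:  ${fixed_cost:.2f}")
print(f"utility consumption: ${utility_cost:.2f}")
```

With this demand curve, fixed provisioning buys 240 server-hours to cover a peak of 10 servers, while the utility consumer pays for only the 112 server-hours actually used; the gap widens as demand gets spikier.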
Utility providers do make decisions about how you can consume: REST to access storage, Python/.NET/something else to write applications, which types of databases work (hint: a traditional RDBMS does not scale horizontally, and utilities know this), and which types of storage are available. These decisions help you create applications that take optimal advantage of the utility’s resources.
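To make "REST to access storage" concrete, here is a minimal sketch modeled on S3-style object storage, where every object is addressable by a plain HTTP URL and the HTTP verbs GET/PUT/DELETE map to read/write/remove. The bucket and key names are hypothetical, and a real request would also carry authentication headers.

```python
# Sketch of REST-style object addressing (S3 virtual-hosted style).
# Bucket and key names below are invented for illustration.

def object_url(bucket: str, key: str, host: str = "s3.amazonaws.com") -> str:
    """Build the REST URL for a stored object.

    GET on this URL reads the object, PUT writes it, DELETE removes it;
    no knowledge of the provider's underlying hardware is needed.
    """
    return f"https://{bucket}.{host}/{key}"

url = object_url("example-bucket", "reports/q1.csv")
print(url)  # https://example-bucket.s3.amazonaws.com/reports/q1.csv
```

Notice that the consumer never names a disk, a server, or an operating system; the URL is the whole interface, which is exactly the kind of consumption mechanism a utility dictates.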
This change in computing will require people to learn a different way to develop applications, and people will resist the changes until more successes happen. In the end, utility computing is going to succeed, and it will work hand in hand with virtualization.