In today's group meeting, we were discussing the security issues in the cloud computing paradigm. At the end of the meeting, I was confused about the difference between grid vs. cloud computing. Do they both refer to the same thing? Are they different? Or do they have some things in common? If it is the last case, what is common and what is different? So, I decided to look for the answer.
I am not familiar with grid computing, but Ian Foster et al.'s 2008 paper titled "Cloud computing and grid computing 360-degree compared" helped resolve some of the confusion I had in mind. This blog post is based on the material in that paper.
The following diagram shows the big picture of grid vs. cloud computing.
The following discussion is based on projects such as TeraGrid (grid computing) and commercially available offerings such as Amazon EC2 and Microsoft Azure (cloud computing).
How are the resources distributed?
To me, both are the same from a distributed-systems point of view: both try to reduce computing cost by using a distributed cluster of computers. However, the main difference appears to lie in how the two approaches work. It is safe to say that, from a user's point of view, cloud computing is a centralized model, whereas grid computing is a decentralized model in which the computation can occur over many administrative domains.
In cloud computing (at least from what I have seen so far), one party has control over the cluster of computers, whereas in grid computing there is no single controller that controls all the nodes in the cluster. In other words, cloud computing allows a user or organization to build a virtual organization on a third party's infrastructure, whereas grid computing tries to build a collaborative virtual organization that does not belong to any single entity.
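To make the contrast concrete, here is a toy sketch in Python. The class names and node lists are entirely hypothetical (not a real cloud or grid API); the point is only that the cloud provider allocates from one pool it fully controls, while the grid has to gather nodes from several administrative domains, each of which decides what to share.

```python
# Hypothetical toy model: a single-controller cloud vs. a grid spanning
# several administrative domains. Not a real API.

class CloudProvider:
    """One party controls every node in the cluster."""
    def __init__(self, nodes):
        self.nodes = list(nodes)  # all nodes in one administrative domain

    def provision(self, n):
        # The provider can allocate any of its own nodes directly.
        return self.nodes[:n]

class GridDomain:
    """One administrative domain in a grid; controls only its own nodes."""
    def __init__(self, name, nodes):
        self.name = name
        self.nodes = list(nodes)

    def contribute(self, n):
        # Each domain decides for itself how many nodes to share.
        return self.nodes[:n]

class Grid:
    """No central controller: work is gathered across domains."""
    def __init__(self, domains):
        self.domains = list(domains)

    def provision(self, n):
        # Collect nodes domain by domain until the request is met.
        allocated = []
        for d in self.domains:
            if len(allocated) >= n:
                break
            allocated.extend(d.contribute(n - len(allocated)))
        return allocated

cloud = CloudProvider(["c1", "c2", "c3", "c4"])
grid = Grid([GridDomain("uni-a", ["a1", "a2"]),
             GridDomain("lab-b", ["b1", "b2"])])
print(cloud.provision(3))  # ['c1', 'c2', 'c3']
print(grid.provision(3))   # ['a1', 'a2', 'b1']
```

Note how `Grid.provision` never touches a node directly: it can only ask each domain to contribute, which is the "no single controller" property in miniature.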
How does on-demand computation work?
Both have the notion of on-demand computation. However, grid computing is more of an incentive model (e.g., if you provide computation resources, you also get computation resources from others who have already joined the grid), whereas cloud computing has no such incentive model. Cloud computing is more of a utility model, like electricity consumption, where you pay for what you use. One could argue that both have some kind of utility model: in grid computing you trade your idle computation cycles, unused storage, etc. for other (same or different) resources available in the virtual organization, and in cloud computing you trade your money for the resources available from a cloud provider.
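The two "trading" models above can be sketched as a pair of one-line accounting functions. The rate and the hour counts here are made-up numbers, not real provider pricing; the sketch only shows that one model settles in money per unit consumed, while the other settles in contributed-versus-consumed resource credits.

```python
# Hypothetical accounting sketch for the two models (made-up numbers).

def utility_cost(hours_used, rate_per_hour):
    """Cloud-style pay-per-use: money traded for resources."""
    return hours_used * rate_per_hour

def incentive_balance(hours_contributed, hours_consumed):
    """Grid-style barter: cycles contributed earn the right to
    consume cycles elsewhere in the virtual organization."""
    return hours_contributed - hours_consumed

# A cloud user running 10 instance-hours at a hypothetical $0.10/hour:
print(utility_cost(10, 0.10))      # 1.0 (dollars)

# A grid participant who contributed 25 CPU-hours and consumed 10:
print(incentive_balance(25, 10))   # 15 (CPU-hours in credit)
```

In the utility model the balance can only ever go one way (you pay), while in the incentive model a negative balance would mean you have consumed more than you contributed to the grid.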
I don't claim that the above description is fully correct. I may have looked at the topic from a narrow point of view. Please feel free to voice your opinion.