Grid Computing Vs Cluster Computing

The components of a cluster are usually connected to each other through fast local area networks (LANs), with each node (a computer used as a server) running its own instance of an operating system. Computer clusters emerged from the convergence of several computing trends, including the availability of low-cost microprocessors, high-speed networks, and software for high-performance distributed computing.

Clusters are usually deployed to improve performance and availability over that of a single computer, while typically being much more cost-effective than single computers of comparable speed or availability.

Grid computing is the collection of computer resources from multiple locations to reach a common goal. The grid can be thought of as a distributed system with non-interactive workloads that involve a large number of files. What distinguishes grid computing from conventional high-performance computing systems such as cluster computing is that grids tend to be more loosely coupled, heterogeneous, and geographically dispersed.[1] Although a single grid can be dedicated to a particular application, a grid is commonly used for a variety of purposes. Grids are often constructed with general-purpose grid middleware software libraries.

Cluster computing is the base of all distributed computing paradigms: it aggregates resources locally and shares the load among them. Grid computing is the extended version of the cluster, in which resources are provisioned over the internet. Cloud computing sits on top of both; it provides more or less the same functionality, but delivers it in the form of services and bills for usage like a utility. Virtualization is a concept common to all three paradigms, though how it is implemented varies.

In cluster computing, a bunch of similar (or identical) computers are hooked up locally (in the same physical location, directly connected with very high-speed links) to operate as a single computer. The computers that make up the cluster are not operated independently as separate computers; as far as any software or other computer is concerned, a cluster looks essentially like one big computer.
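As a rough illustration of that "one big computer" idea, here is a minimal Python sketch. The task (`render_frame`) and the worker count are hypothetical stand-ins; a real cluster would use a job scheduler or an MPI library rather than a single machine's process pool, but the principle is the same: the caller submits work to one facade and the load is spread transparently.

```python
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_id):
    """Stand-in for a CPU-heavy task a cluster node would handle."""
    return sum(i * i for i in range(frame_id * 10_000))

if __name__ == "__main__":
    frames = range(1, 9)  # eight units of work
    # The pool plays the role of the cluster: the caller sees one
    # "computer", while the work is spread across the workers.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(render_frame, frames))
    print(results)
```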

In grid computing, the computers do not have to be in the same network; each one runs a program that allows its unused resources (usually processing time and memory) to be used by another computer on the network. The connections between the computers on the grid are relatively slow (Ethernet speeds) compared to the connections inside each computer, so processing tasks are broken up into independent chunks and sent out to different computers on the grid. When a computer finishes a chunk, it sends the results back to the server.
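The chunking step can be sketched as follows. This is a simplified, single-machine stand-in for a grid: the thread pool simulates remote machines, and the network transfer is elided, but it shows why independent chunks matter over slow links.

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_chunks(data, n_chunks):
    """Break the workload into independent pieces, one per grid node."""
    size = (len(data) + n_chunks - 1) // n_chunks
    return [data[i:i + size] for i in range(0, len(data), size)]

def process_chunk(chunk):
    """Stand-in for the work a remote grid node performs on its chunk."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000))
    chunks = split_into_chunks(data, n_chunks=4)
    # Because each chunk is independent, the slow links only cost
    # transfer time, not coordination time; partial results flow
    # back to the coordinating server to be combined.
    with ThreadPoolExecutor(max_workers=4) as grid:
        partials = list(grid.map(process_chunk, chunks))
    print(sum(partials))  # same answer as processing everything in one place
```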

Roughly speaking, on a grid a server logs in to a bunch of computers (the grid), sends them data and a program to run, and runs the program on those computers, which send the data back to the server when they are done.
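In practice, that log-in, send-a-program, collect-results loop is often done over SSH. Here is a minimal sketch; the hosts `node1` and `node2`, and the `worker.py` script assumed to exist on each of them, are hypothetical, and a real grid would use middleware for authentication, scheduling, and fault tolerance rather than raw SSH calls.

```python
import subprocess

HOSTS = ["node1", "node2"]  # hypothetical grid machines with passwordless SSH

def run_remote(host, chunk_id):
    """Log in to a grid node over SSH, run the program, capture its output."""
    result = subprocess.run(
        ["ssh", host, "python3", "worker.py", str(chunk_id)],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    # The coordinating server farms one chunk out to each node and
    # gathers whatever the remote programs print back.
    outputs = [run_remote(host, i) for i, host in enumerate(HOSTS)]
    print(outputs)
```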

In sum, a cluster is one large computer made up of small, similar computers, just as a RAID array is one large hard disk made up of small hard disks. A grid, by contrast, is a bunch of computers that make their unused resources available to select computers (often a single server) over a network.

Annie Steffi Sydney

The author is a computer science engineer and is currently pursuing an MBA. She has received many National Awards in different fields. She has a passion for writing, social service, and Bharatanatyam.