
Why is virtualization needed for cloud computing?

Can anyone explain why virtualization is needed for cloud computing? A single instance of IIS and Windows Server can host multiple web applications, so why do we need to run multiple instances of the OS on a single machine? How can this lead to more efficient utilization of resources? How can the virtualization overhead be worth it? Is it strictly a matter of economics - I have money to buy only 100 machines, so I run virtualization to pretend I have 1000 machines?

morpheus asked Aug 30 '10 05:08


People also ask

What is virtualization Why is it needed?

Virtualization is technology that lets you create useful IT services using resources that are traditionally bound to hardware. It allows you to use a physical machine's full capacity by distributing its capabilities among many users or environments.

How is virtualization used in cloud computing?

Virtualization is the fundamental technology that powers cloud computing. Virtualization is software that manipulates hardware, while cloud computing refers to a service that results from that manipulation. You can't have cloud computing without virtualization.


2 Answers

Virtualization is convenient for cloud computing for a variety of reasons:

  1. Cloud computing is much more than a web app running in IIS. Active Directory isn't a web app. SQL Server isn't a web app. To get the full benefit of running code in the cloud, you need the option to install a wide variety of services on the cloud nodes just as you would in your own IT data center. Many of those services are not web apps governed by IIS. If you only look at the cloud as a web app host, then you'll have difficulty building anything that isn't a web app.
  2. The folks running and administering the cloud hardware underneath the covers need ultimate authority and control to shut down, suspend, and occasionally relocate your cloud code to a different physical machine. If some bit of code in your cloud app goes nuts and runs out of control, it's much more difficult to shut down that service or that machine when the code is running directly on the physical hardware than it is when the rogue code is running in a VM managed by a hypervisor.
  3. Resource utilization - multiple tenants (VMs) execute on the same physical hardware, but with much stronger isolation from each other than IIS's process walls provide. Lower cost per tenant, higher income per unit of hardware.
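The economics in point 3 can be sketched with a toy cost model. All of the numbers below are made up for illustration: consolidating tenants as VMs spreads one server's cost across many tenants, and that saving easily outweighs a modest hypervisor overhead.

```python
# Toy consolidation-economics model (hypothetical numbers).

def dedicated_cost_per_tenant(host_cost: float) -> float:
    # One tenant per physical machine: the tenant bears the full cost.
    return host_cost

def virtualized_cost_per_tenant(host_cost: float, tenants: int,
                                overhead: float) -> float:
    # `overhead` is the fraction of capacity lost to the hypervisor,
    # so each host effectively supports slightly fewer tenants.
    return host_cost / (tenants * (1.0 - overhead))

print(dedicated_cost_per_tenant(200.0))                        # 200.0
print(round(virtualized_cost_per_tenant(200.0, 10, 0.10), 2))  # 22.22
```

Even charging a 10% overhead against capacity, ten tenants per host cuts the hardware cost per tenant by roughly a factor of nine.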
dthorpe answered Sep 19 '22 18:09


First of all, virtualization prevents user software from damaging the underlying system. Since users want the environment to work transparently - so that nodes can be added and removed seamlessly - those nodes need to be completely bulletproof: the software they run for users must not be able to make them unusable.
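The containment idea can be demonstrated at a smaller scale with ordinary OS resource limits - a rough analogy only, since a hypervisor enforces the same principle at whole-OS granularity. This sketch, assuming a Unix-like system, runs a "tenant" process under an address-space cap so that a runaway allocation kills the tenant while the host process carries on:

```python
import resource
import subprocess
import sys

LIMIT_BYTES = 512 * 1024 * 1024  # cap the "tenant" at 512 MiB of address space

def limit_memory():
    # Runs in the child just before exec: the cap applies to the tenant only.
    resource.setrlimit(resource.RLIMIT_AS, (LIMIT_BYTES, LIMIT_BYTES))

# The tenant tries to grab 2 GiB; the limit makes that fail inside the
# tenant process while the parent keeps running normally.
tenant = subprocess.run(
    [sys.executable, "-c", "x = bytearray(2 * 1024 ** 3)"],
    preexec_fn=limit_memory,
    capture_output=True,
)
print(tenant.returncode != 0)  # True: the rogue tenant was contained
```

A hypervisor gives the operator the same kind of outside-the-sandbox authority over an entire guest OS: it can cap, suspend, or kill a misbehaving VM without trusting anything running inside it.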

Other than that - yes, virtualization facilitates higher resource utilization as well as seamless deployment and migration of software between nodes. This lets you pay for the resources you actually use, lowering costs.

sharptooth answered Sep 17 '22 18:09