May 05, 2006

Virtualization

An advertorial from VMware said that the average CPU utilization for a corporate application running on its own OS/machine host is 5-10%. Virtualization, the pitch goes, can consolidate many such applications onto a single machine, reducing the amount of hardware a data center needs. They claim they can cut the number of machines by a factor of 10-15, which roughly squares with the arithmetic: at 5-10% utilization you could in principle pack 10-20 such apps onto one box.

I don't understand why these applications need to run by themselves on a machine. Why can't several applications run on a single OS/machine? Aren't processes good enough? If the OS requires so much configuration that it becomes difficult to run another app alongside, then the OS is flawed. If the app requires dedicated access to resources that conflict with other apps, then those resources are flawed. And if admins are too lazy to configure things to support multiple apps, then they are flawed. Virtualization sounds like a hack to avoid configuring multiple apps on a single OS.
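
To make that concrete, here's a rough sketch of what I mean by processes being good enough: two toy apps running side by side on one OS, each bound to its own port so they don't fight over the same resource. The app names and port numbers are invented for illustration.

    # Two toy "applications" sharing one OS/machine as ordinary processes.
    # Each binds to its own TCP port, so neither needs a dedicated machine.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from multiprocessing import Process

    def serve(name, port):
        class Handler(BaseHTTPRequestHandler):
            def do_GET(self):
                self.send_response(200)
                self.end_headers()
                self.wfile.write(f"hello from {name}\n".encode())
        HTTPServer(("localhost", port), Handler).serve_forever()

    if __name__ == "__main__":
        # Hypothetical apps; the ports are arbitrary and just need to differ.
        Process(target=serve, args=("app-one", 8001)).start()
        Process(target=serve, args=("app-two", 8002)).start()

The only "configuration" needed to keep them from conflicting is picking two different port numbers, which is exactly the kind of thing an admin should be expected to do.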

Posted by surana at 01:38 AM