Definition
Server Virtualization

A method used to maximize a business server's resources and data center processing workloads by partitioning a single physical server into multiple isolated virtual environments. This allows multiple applications and operating systems to run on a single server while also increasing the amount of traffic that server can handle.

Organizations operating a network that supports a large number of users often face increased strain on the processing capabilities of their servers. With server virtualization, companies can give a greater number of users access to business-critical information, applications and different operating systems. By virtually dividing an individual server into separate partitions, businesses can get more processing capacity out of existing hardware. Each partition acts as an isolated server, giving more users access to the variety of software hosted on that particular machine.

For this method to work well, proper security and resilience measures should be implemented. Providing failover support for virtualized partitions, so that workloads can be moved when one partition or server loses functionality, helps ensure that business-sensitive data and applications are not lost. Overall, server virtualization is an IT strategy that, when implemented with the proper security measures, can improve network performance and application delivery.
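The ideas above — carving a host's fixed resources into isolated partitions and failing a workload over to a healthy partition — can be illustrated with a small toy model. This is a hypothetical sketch for illustration only, not a real hypervisor API; all class and function names (`PhysicalServer`, `Partition`, `fail_over`) are invented for this example.

```python
from dataclasses import dataclass, field

@dataclass
class Partition:
    """One isolated virtual partition with its own OS and applications."""
    name: str
    os: str
    apps: list = field(default_factory=list)
    healthy: bool = True

class PhysicalServer:
    """Toy model of a single physical server divided into partitions."""
    def __init__(self, name: str, cpu_cores: int):
        self.name = name
        self.cpu_cores = cpu_cores
        self.partitions = []  # list of (Partition, cores) pairs

    def free_cores(self) -> int:
        return self.cpu_cores - sum(cores for _, cores in self.partitions)

    def add_partition(self, partition: Partition, cores: int) -> None:
        # Each partition receives a dedicated share of the host's cores,
        # so existing hardware serves several isolated workloads.
        if cores > self.free_cores():
            raise ValueError(f"not enough free cores on {self.name}")
        self.partitions.append((partition, cores))

def fail_over(failed: Partition, standby: Partition) -> None:
    # Move workloads from a failed partition to a healthy standby so
    # business-critical applications remain available.
    failed.healthy = False
    standby.apps.extend(failed.apps)
    failed.apps = []

# Divide one 16-core host into three isolated partitions.
host = PhysicalServer("srv-01", cpu_cores=16)
web = Partition("web", os="Linux", apps=["storefront"])
db = Partition("db", os="Linux", apps=["orders-db"])
standby = Partition("standby", os="Linux")
host.add_partition(web, cores=6)
host.add_partition(db, cores=6)
host.add_partition(standby, cores=4)

# Simulate the db partition losing functionality.
fail_over(db, standby)
print(standby.apps)  # the orders-db workload now runs on the standby
```

In a real deployment the hypervisor (e.g. KVM, Hyper-V or VMware ESXi) handles the resource allocation and failover orchestration; the sketch only mirrors the bookkeeping involved.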