
    Cloud Computing – Virtualization

    By now, you will have understood that file systems in the cloud are distributed across multiple storage devices and servers spread across the Internet. Managing this distribution is what Cloud Computing does. Data is never stored at just one location; because it is kept at multiple locations, if a server goes down or a computing resource fails, the backup system takes over automatically. Whatever resources you use are thus spread over this distributed file system, which is why resource-allocation algorithms are one of the critical components of Cloud Computing.

    Now let us look at the technologies that run in the background and make Cloud Computing tick:

    • Virtualization
    • SOA or Service Oriented Architecture
    • Grid Computing
    • Utility Computing

    Virtualization

    The primary technology that enables Cloud Computing is Virtualization. Virtualization lets one physical instance of a computing service or software application be shared across multiple customers; that is, it allows networking, storage, and server resources to be pooled and shared in real time. It does this by assigning a logical name to a physical resource and supplying a pointer to that physical resource when it is required. A single physical server can thus be partitioned into multiple logical servers, each of which functions like a physical one.
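
    The idea of assigning logical names to slices of one physical resource can be sketched in a few lines of Python. This is a hypothetical illustration (the class and method names are invented, not any real hypervisor API): one physical server's cores and RAM are carved into named logical servers.

```python
# Hypothetical sketch: one physical server partitioned into multiple
# logical servers, each drawing from the same pool of CPU cores and RAM.

class PhysicalServer:
    def __init__(self, cores, ram_gb):
        self.free_cores = cores
        self.free_ram_gb = ram_gb
        self.logical_servers = {}

    def create_logical_server(self, name, cores, ram_gb):
        """Assign a logical name to a slice of the physical resources."""
        if cores > self.free_cores or ram_gb > self.free_ram_gb:
            raise RuntimeError("insufficient physical resources")
        self.free_cores -= cores
        self.free_ram_gb -= ram_gb
        self.logical_servers[name] = {"cores": cores, "ram_gb": ram_gb}
        return self.logical_servers[name]

host = PhysicalServer(cores=16, ram_gb=64)
host.create_logical_server("web-vm", cores=4, ram_gb=8)
host.create_logical_server("db-vm", cores=8, ram_gb=32)
print(host.free_cores, host.free_ram_gb)  # 4 24 (remaining capacity)
```

    Each logical server behaves, from its user's point of view, like a machine of its own; the bookkeeping above is the part the user never sees.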

    Many organizations today provide virtualization services, Amazon, Microsoft, and VMware among them. These solutions are often cheaper and easier to implement, and they save time, because you use the provider's services instead of your own processing machines. Since these virtualization services provide an isolated virtual environment, users can configure applications however they want, just as they would on their own machines.

    Virtualization technology also provides a win-win scenario for software development and quality assurance teams. Developers can write and execute code across varied environments, and testers can simulate various testing scenarios and run them across those same environments.

    The three main areas where virtualization technology is used are:

    1. Virtualization of Storage
    2. Virtualization of Servers
    3. Virtualization of Networks

    Virtualization of Storage: Combines physical storage from numerous network storage devices into a single, centrally managed unit. Storage Area Networks (SANs) are a prime example of storage virtualization in use.
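
    A minimal sketch of this pooling idea, with invented names (not a real SAN API): several physical disks are presented as one logical pool, and the caller never learns which disk actually satisfies a request.

```python
# Hypothetical sketch: physical disks pooled into one centrally
# managed unit; physical placement is hidden from the user.

class StoragePool:
    def __init__(self, disk_sizes_gb):
        self.disks = [{"size": s, "used": 0} for s in disk_sizes_gb]

    @property
    def total_free_gb(self):
        # The user sees one number, not three separate disks.
        return sum(d["size"] - d["used"] for d in self.disks)

    def allocate(self, size_gb):
        """Place the request on whichever disk has room (first fit)."""
        for i, d in enumerate(self.disks):
            if d["size"] - d["used"] >= size_gb:
                d["used"] += size_gb
                return i  # which disk was chosen is an internal detail
        raise RuntimeError("pool exhausted")

pool = StoragePool([100, 200, 500])   # three physical disks
pool.allocate(80)
pool.allocate(150)
print(pool.total_free_gb)  # 570 GB left across the whole pool
```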

    Virtualization of Servers: Server virtualization masks server resources, such as processors, RAM, and the operating system, from users. It is intended to maximize resource sharing and reduce the load and complexity of user computing.

    Virtualization of Networks: Network virtualization seeks to make the best use of available capacity by dividing the network's bandwidth into individual channels. Each channel is independent of the others, and bandwidth can be allocated in real time to the particular server or system that needs it most.
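
    The channel model described above can be sketched as follows; again the names are hypothetical, not a real networking library. A link's total bandwidth is split into fixed-size channels, and free channels are handed in real time to whichever system asks for them.

```python
# Hypothetical sketch: bandwidth split into independent channels that
# can be assigned on demand to the system that needs them most.

class VirtualNetwork:
    def __init__(self, total_mbps, channel_mbps):
        self.channel_mbps = channel_mbps
        self.channels = [None] * (total_mbps // channel_mbps)  # None = free

    def assign(self, system, n_channels):
        """Grant `system` that many free channels; returns Mbps granted."""
        free = [i for i, owner in enumerate(self.channels) if owner is None]
        if len(free) < n_channels:
            raise RuntimeError("not enough free bandwidth")
        for i in free[:n_channels]:
            self.channels[i] = system   # each channel is independent
        return n_channels * self.channel_mbps

net = VirtualNetwork(total_mbps=1000, channel_mbps=100)
print(net.assign("video-server", 6))   # 600 Mbps carved out on demand
print(net.assign("backup-server", 2))  # 200 Mbps from what remains
```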


    Virtualization is the key that opens up cloud infrastructure, and it is so crucial to the Cloud because it decouples software from hardware. For example, a PC can use virtual memory to borrow extra memory from the hard disk. A hard disk offers far more space than the physical memory a system can ever possess, and while virtual memory is slower than actual RAM, the substitution works well when handled correctly. Similarly, there is software that can mimic an entire machine, so a single computer can do the work of many.
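
    The virtual-memory trick can be shown with a toy simulation (this is an illustration, not how a real operating system is written): physical RAM holds only a few pages, the process addresses more, and any page not in RAM is fetched from "disk" on demand, evicting the least-recently-used page.

```python
# Toy illustration of demand paging: more pages are addressed than
# fit in RAM, so some accesses "fault" and go to the slower disk.
from collections import OrderedDict

RAM_PAGES = 3          # small physical memory
faults = 0
ram = OrderedDict()    # page number -> contents, ordered by recency

def access(page):
    global faults
    if page in ram:
        ram.move_to_end(page)          # hit: just refresh recency
    else:
        faults += 1                    # miss: load the page from "disk"
        if len(ram) >= RAM_PAGES:
            ram.popitem(last=False)    # evict least-recently-used page
        ram[page] = f"contents of page {page}"

for p in [0, 1, 2, 0, 3, 0, 4]:        # process touches 5 distinct pages
    access(p)

print(faults)  # 5: two accesses were RAM hits, the rest hit the disk
```

    The slowdown on each fault is the price of pretending memory is bigger than it is; handled correctly, the illusion is worth it.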

    Copyright 1999- Ducat Creative, All rights reserved.