What Is Virtualization?

Virtualization was originally pioneered more than 40 years ago by IBM in order to distribute costly mainframe resources and run multiple applications. At the time, computers were far too expensive to be tied up computing single tasks. Virtualization allowed the host computer to run many more “virtual” computers at the same time—duplicates of the host computer. Pretty cool!

Eventually the cost of computing hardware fell, so distributing expensive mainframe resources through virtualization was no longer necessary. The decline in mainframe computing coincided with advances in personal computers (PCs). Personal computing meant that end users finally had the ability to run both their operating system and their applications locally, right at their desk, on their personal computer. So, in some respects, computing was becoming more and more "decentralized." Another viewpoint is that it became "hyper-decentralized," because nearly everyone had a computer right at their desk!

ABOUT THE CONTRIBUTING AUTHOR: PETER STREIBIG

Peter Streibig holds a bachelor of science degree in architecture from the University of Virginia and has over 10 years' experience in architecture. He manages and coordinates information and communication technology. His role is particularly challenging: incorporating emerging technologies in a high-profile and high-design firm. He also understands the importance of keeping the emphasis on architecture while maintaining a stable, approachable, and sophisticated (mixed-platform) computing environment.

By today's standards, early PCs were quite primitive. "Top-of-the-line" personal computers came with a 60 MHz processor and a 250 MB hard drive. Only 15 years later, the phone in your pocket probably has a 400 MHz processor (a more than 6× increase) and 8 GB of storage (a more than 30× increase in capacity).

As for today's personal computing, processor speeds once measured in megahertz (MHz) are now measured in gigahertz (GHz), and single-processor systems have given way to processors that may in turn contain multiple "cores." This means that a single-processor system may be composed of two or more independent cores. It's not uncommon to have computers with multiple processors. That's a lot of computing power. In many cases, it's a lot of underused computing power.

Did you know that you might have a computer with multiple processors, each of which contains multiple cores? That means you effectively have many potential computing environments. For example, if your computer contains two quad-core processors, you potentially have eight simultaneous computing environments!

You're probably beginning to understand why virtualization is making a comeback. Those 40-year-old computing principles are being applied to personal computing. A multicore CPU will allow applications to run simultaneously and faster (within a single operating system). Virtualization allows multiple operating systems to run concurrently on the same computer.

This is because each VM is allocated a share of the physical resources of the multiprocessor/multicore computer while "virtualizing" the other physical hardware resources required by an operating system. So each virtualized session contains its very own operating system—Windows, Linux, and so on. And within those virtualized operating systems, multiple applications are able to operate.

A lot of great applications run in Windows, but some very industry-specific applications run in Linux. Others may only run in Mac OS X (which is Unix based). Virtualization allows you to create a fantastic best-of-breed solution, where you get to choose not only the application but also the operating system for that application.

WHAT ABOUT MAC OS X?

In September 2009, Autodesk began officially supporting virtualization (including Mac OS X hardware) for a number of Autodesk products: AutoCAD, AutoCAD LT, Autodesk Inventor Professional, 3ds Max, and the Revit platform.

Although both VMware and Parallels can run Autodesk's Windows-based applications, Autodesk and Parallels signed agreements signifying that Parallels would be Autodesk's preferred virtual solution. It should also be noted that Autodesk extended official support to Boot Camp earlier in 2009.

Autodesk already supports five native OS X applications. So, who knows what's in store for Mac users. Maybe even native support for Revit!

Just imagine that not too long ago, radios were AM only, and then later FM, and eventually both AM and FM. Even the first TVs were VHF (channels 2–13) and then later UHF (channels 14–83) was added. But if you wanted AM and FM, or VHF and UHF, you had to buy two radios and two televisions! Well, that certainly seems unimaginable now. Yet now the same goes for personal computing. You no longer have to buy multiple computers to run multiple operating systems because you effectively already have multiple computers in your “one” computer. You're just not taking advantage of them!

Figure 24.1 illustrates this with four concurrent spaces in Mac OS X (which is being used as the host computer). In turn, this host computer is running two virtual machines. One VM is running Windows XP, which in turn is running Revit (upper right). The other VM is running Vista 64, which is running a yet-to-be-released cloud-based design aggregation, visualization, and collaboration platform (lower right). And at the same time, the upper and lower spaces on the left contain instant messaging sessions, email, web browser, and word processing applications (running on the host computer).

FIGURE 24.1 Multiple virtual machines running on a single host computer


To get started with virtualization, you'll need a computer with multiple processors (or a single processor with multiple cores). You'll also need enough memory to allocate to your host and your guest (or virtual) computers; 4 GB of RAM will do nicely. The number of real and virtual processors you have will help determine how many computing environments can run simultaneously. For example, if you have four processing cores and 8 GB of RAM (and you want to allocate at least 2 GB of RAM to each computer, real or virtual), then you'll be able to run up to four computers at the same time: one host computer and three guest computers.
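The back-of-the-envelope math above can be sketched as a quick calculation. This is just a rough sizing aid; the max_machines helper below is hypothetical, not part of any virtualization product, and real VM software imposes its own overhead and limits:

```python
def max_machines(total_ram_gb, ram_per_machine_gb, total_cores):
    """Estimate how many computers (1 host + N guests) can run at
    once, limited by both available RAM and processor cores."""
    by_ram = total_ram_gb // ram_per_machine_gb  # RAM-limited count
    return min(by_ram, total_cores)             # cores limit it too

# The example from the text: 4 cores, 8 GB RAM, 2 GB per machine.
total = max_machines(total_ram_gb=8, ram_per_machine_gb=2, total_cores=4)
guests = total - 1  # one of those machines is the host itself
print(total, guests)  # 4 machines total: 1 host + 3 guest VMs
```

Whichever resource runs out first (RAM or cores) sets the practical ceiling, which is why the text recommends generous memory.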

The Host Computer

The host computer is the physical machine along with its operating system (OS). Windows, Linux, and Mac OS X may all serve as the host operating system. The host computer simply hosts the virtual OSs, and those virtual operating systems in turn run their virtualized applications (this is the guest computer). Standard x86 hardware with Intel or AMD processors may host virtual machines.

The Guest Computer

The guest computer is simply a virtual machine. The VM comprises the virtual OS, applications, and even user files (documents, spreadsheets, and so on), all contained in a single file.

The guest may reside on the same hard drive as the host machine. But the guest machine may also be stored (and backed up) on another internal hard drive, an external hard drive, or other portable media.

Essentially, each VM exists as a separate, virtual disk image. And yet files on the host machine may be accessed by the guest machine, and vice versa. For example, a Microsoft Word document residing on a host machine (running OS X) may be accessed, opened, and saved by a guest VM running Windows XP and Microsoft Office. And the guest VM may contain files accessed by the host machine. It's really quite flexible.

As a result, it's important to keep in mind that the files that you need to access within a VM (or guest machine) need not reside “inside” the VM. Your files may reside on the host machine or even at a location that can be accessed by either the host or the guest, such as a LAN or WAN network drive or even an external hard drive.
