In Windows IT Pro's Calling All Virtualization Heroes contest, we asked you to tell us how your IT organization has used virtual machine (VM) technology in innovative ways to reap practical benefits. It was tough selecting among a number of noteworthy entries, but we found two standouts: first-place winner Jeff Gagne, who's an IT Specialist 2 with the Vermont Office of the Attorney General, and second-place winner Brad Kulick, a Senior Network Engineer for Camera Corner/Connecting Point. Jeff wins a copy of Microsoft Virtual Server 2005, and Brad wins a copy of Microsoft Virtual PC 2004. Congratulations! You can read Jeff and Brad's stories and more real-life virtualization tales on this page. Thanks to everyone who participated!
First-Place Virtualization Hero: Jeff Gagne
Because of the attacks on September 11, 2001, and a nasty flood that took place here in our state capital, where our main office is, we were told to have a disaster recovery plan in place in case our building became unusable. In a nearby city, the state IT folks had set up a site just for this scenario. The only issue was space. We couldn't move in several servers, not even temporarily, to get our users back online. We were given the possible availability of one slot, yes, a slot. We were not given floor space or rack space. Our thoughts were "There's no way." We thought about setting up our own facility somewhere else, but the budget crunchers said no. After I attended a Microsoft seminar about Microsoft Virtual Server 2005, I knew what to do. We would "copy" the existing servers using the Virtual Server 2005 software onto one highly powerful blade server. This would solve several issues: space, budget, and compliance with the disaster recovery plan. Yes, there would still be issues around syncing the servers every so often and maintaining proper backups. However, under the circumstances, this was the best option.
Runner-Up Virtualization Hero: Brad Kulick
One of my clients was facing a Sarbanes-Oxley audit and had a short time window to prepare for it. Four of the requirements of their audit were to:
1. Show how their software solutions are tested before being put into production.
2. Show that they can restore data from various servers from their backup rotation.
3. Show how antivirus software was tested, updated, and implemented.
4. Show how software updates were tested and implemented.
The client has 19 servers in production, and it wasn't feasible to purchase duplicate servers for a test environment. We solved the problem by purchasing two high-end desktops with enough resources to recreate the client's production environment by using VM technology. This was a cost-effective solution and met the client's requirements.
More Virtualization Hero Stories!
We have a cold (offline) disaster recovery co-location site. We virtualized one of our root Active Directory (AD) domain controllers (DCs) and one of our production AD DCs. Each night, those virtual DCs are paused and backed up to tape using an "encapsulated backup," which backs up the VM files entirely (as opposed to backing up from within the VM itself by using a backup agent). When we need to update the disaster recovery site, we restore the latest virtual DC backups to a host system there, and we instantly have a complete working copy of our production domain (mostly for testing purposes). We use this same scenario when we need to update our working lab with a new "updated" version of our production AD structure.
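The nightly "encapsulated backup" described above, copying the VM's entire file set as a unit rather than running a backup agent inside the guest, can be sketched roughly like this. The paths are illustrative, and the sketch assumes the guest has already been paused so its disk images are consistent; a Virtual Server VM's file set typically includes the .vhd disk image, .vmc configuration, and .vsv saved-state file.

```python
import shutil
from pathlib import Path

def encapsulated_backup(vm_dir, backup_dir):
    """Copy a paused VM's complete file set to a backup location so the
    guest can later be restored as a single working unit."""
    src = Path(vm_dir)
    dest = Path(backup_dir)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    # Only the files that make up the VM itself are copied; the guest
    # must be paused or shut down first so the images are consistent.
    for f in src.iterdir():
        if f.suffix.lower() in {".vhd", ".vmc", ".vsv"}:
            shutil.copy2(f, dest / f.name)
            copied.append(f.name)
    return sorted(copied)
```

Restoring at the disaster recovery site is then just the reverse copy onto a host system there, which is what makes the "instant working copy of the production domain" possible.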
We had a requirement to move small-scale application and test servers from old low-end PCs that were scattered around the computer rooms and the development team's offices onto server-class hardware with redundant hardware configuration. We purchased an HP blade server with 4GB of RAM and a pair of 72GB drives, which we mirrored. We then installed VMware ESX Server 2.0 on the blade server. We now have a total of 14 guest OSs, including a Windows NT 4.0 DC, Novell NetWare file server, and several Windows 2000 Server and Windows Server 2003 servers. The only reason we couldn't add more guests was because of a lack of disk space on the host. We were so impressed by VMware ESX Server that we built a second host server, which is also running on an HP BL20p. This time, though, we used a pair of 144GB drives, so we should be able to handle a lot more than 14 guest OSs on this hardware.
In my day job as a network manager, I use VMware to test software deployments, new scripts, and just about everything else that needs to be deployed or used on my network. I found the virtual computers to be an invaluable tool in testing, better than having several big cases on my desk(s). Now I have just one with plenty of RAM and CPU. The virtual computers proved most helpful when our email server died recently. I had just built a virtual server (Windows 2000), and I installed the email software on the virtual computer until we fixed the hardware problem on the main email server. The virtual server worked very well. At night I teach basic computer skills to adults. I use virtual computers to let the students play with the OSs. They can break whatever they want on the virtual computers and do no harm whatsoever. The virtual computers are great for showing them how to perform tasks such as adding and removing programs and hardware.
My first experiences with Microsoft Virtual Server date from its beta release. I immediately prototyped my company's new Exchange 2000 Server network, doing a complete migration from a copy of our production environment that was ported into the Virtual Server environment. I now use it in our production environment. Virtual Server enhances our production environment because we can add or delete subdomains faster, without additional hardware, keeping AD healthy. I've also used it for testing server configuration for tuning and performance. I've been able to build highly tuned configurations, correct all the issues, and create a set of templates for the various types of servers that I have to build. Virtual Server also has let me test and verify security tools, such as F.I.R.E. and other Linux-based bootable-CD utilities that I might benefit from. I've also used Virtual Server to install monitoring tools, such as Spotlight on Exchange, on a virtual session and let multiple users view it while I still perform my job functions. Virtual Server has also let me test DMZ configurations and various network routing configurations before putting them into production. In short, Virtual Server has saved me a lot of work.
Last summer I took a real leap of faith, and so far it's working. I manage a mixed network of PCs and Macs with two Windows 2000 Server systems and one Macintosh OS X server. In the past, I carried around a PC laptop and a Mac laptop and pulled whichever one I needed out of my bag. This summer I picked up Microsoft Virtual PC for Macintosh 6.1 and installed it on a new PowerBook with 1GB of RAM. So far, I've been able to manage all administrative tasks through VMs. I have Remote Desktop installed on a Windows XP Professional VM and can connect to any of the XP machines and the Win2K Server systems in the network. I have Mac OS X Server Admin tools on the virtual PC and can manage AD, run a computer management tool, and run Symantec Antivirus System Console. On the Mac side, I have Remote Desktop, with which I can connect to the users' Macs remotely, and Remote Desktop Connection for direct terminal access to the Windows servers and XP machines. I run the VM with 512MB of RAM (the upper limit of the settings). I use virtual switch settings so that the VM has its own IP address and is joined to the domain. I run Microsoft Office 2003 on the VM and Microsoft Office 2004 for Macintosh on the Mac platform. This gives me a good chance to help users coordinate between the two platforms. I also run Microsoft Visio and Microsoft TechNet on the VM. I installed Microsoft Streets & Trips to prove it would work, and then removed it because I don't need it daily. I've also used the beta version of Virtual Server 2005 running on my home XP machine to practice using Windows Server 2003. I have the media and intend to upgrade my two Win2K Server systems to Windows 2003. I'm currently deploying a Windows 2003 network at the new public library building. The only glitch so far is that Windows XP Service Pack 2 (SP2) crashes the VM. I have proved that to myself twice (with the resultant rebuild from scratch).
I can go as far as SP1 with all the critical patches and updates up to but not including SP2. Hopefully, the new version (7) will work better with XP SP2.
I have a very small business operating out of my backyard, so to speak. My computers are accessible when clients come to drop off their machines for repair AND when relatives bring their nosey kids over to visit. They like to play computer games, so I installed the trial version of Microsoft's virtual software and loaded up the original Windows 95, Microsoft BOB, and some old Arcade games! The kids have a ball, and I don't worry that they might ruin my regular configuration! Although I keep an image backup, this is so much simpler, and I still have my computer ready to work when I need it. Thank you, Microsoft!
I use virtualization for studying, disaster recovery, documentation, and in a test lab environment. I've found virtualization to be scalable, robust, and nondestructive. Microsoft Virtual Server 2005 lets me test applications on a virtual system that's isolated from the real network. In other words, I can, for example, deploy Windows XP SP2 in a test environment, then test a server that uses a specific application and see how XP SP2 affects it. My organization will be using Virtual Server for disaster recovery simulations and for documenting the disaster recovery procedures for all Windows Server 2003 servers. Virtual Server has become as critical as all our existing servers. The product enables my organization to rebuild all servers virtually and document the rebuild process. Developers can also use it to test their code without requiring domain administrator rights. The ROI for Virtual Server is good because I don't have to buy computer hardware. I'm also using Virtual Server 2005 to help me prepare for my MCSE 2003 certification.
We've used virtualization to create a duplicate of our production environment. Because of our very limited space, we couldn't create a physical duplicate environment. Our environmental manager approves of our virtualization solution because it requires less hardware and therefore consumes less power and requires less cooling. We're trying to consolidate our servers further onto VMs because of the "green" benefit.
We're currently using Microsoft Virtual Server 2005 to test different products, such as the upcoming Microsoft Office Live Communications Server 2005, and migration scenarios. We've also been running a product environment under Virtual Server 2005 that combines AD, Exchange Server 2003, SQL Server 2000, Microsoft SharePoint products and technologies, and several workstations on a single physical server. Also, some of our products are sold as VMs to our customers; after the project is launched, a virtual PC image is transferred to the customer. This process is automated so that all required software is installed in unattended mode to save time.
I am an IT instructor. I use virtual servers for many of my classes for various reasons. For example, one day I'll be teaching one topic, then the next morning I need to teach a different topic and sometimes a different version. Instead of reinstalling an OS, I either do an "undo hard drive changes" or pull a copy of the hard drive off of a CD-ROM or DVD. For travel purposes, virtualization is great because I can network many servers and versions of server OSs with only one laptop. It's a lot easier to carry one laptop instead of four! If I tile the virtual screens, I can see what's going on with more than one server at the same time on the projection screen. Geeks love it. It is great for testing. When finished, I just undo the changes.
We used a single physical fault-tolerant server with virtual server technology to host multiple Windows guest OSs running virtual TCP relays in support of a data center migration. The TCP relays initially targeted the local Web sites to intercept traffic, then were retargeted to the Web sites at the new data center location.
In our daily work, we often have to test and work with different OSs. For this reason, we use VMs to maintain a large number of OSs and different configurations for testing our customers' environments. For us, it's easy to simply delete a file and reinstall an OS without losing information, or just change the number of network cards, or the size of the disk, without having to spend money! We can also set up a lab with two virtual PCs to test secure communications and isolate it from the real network.
Our Microsoft Virtual Server technology solution allowed us to replace four existing test and development servers. This gave us much-needed real estate in our data center cabinets and reduced the server budget. Additionally, we're using Virtual Server to evaluate new software applications from different vendors. We have an automated baseline server build that installs in minutes onto our hardware running Virtual Server (this previously required 2–4 hours to prepare a server), which has made our network engineers true heroes.
During my 5-year stint with my last employer, we used an Extract-Transform-Load (ETL) process to move data from dBase to Ingres and back again using NFS as the transport protocol. The transformation logic was embedded as a callable option in the monolithic application running on the client side, so at first a client pen computer was used to perform this ETL during the night. This machine had no physical security, lived in the manager's office, and was hardly server class: 486, 16MB RAM, DOS 6.22. The process was prone to memory leakage and needed to be restarted regularly. Using virtualization, we solved the following administrative issues:
- Remote control. The process ran in a full-screen DOS graphics window, which precluded the use of remote control software such as pcAnywhere, Timbuktu, and VNC. Even Citrix MetaFrame refused to support full-screen DOS graphics over the network, although it supported full-screen DOS graphics over a direct serial connection, which we didn't have. By making use of virtualization, we were able to run the process in a VM encapsulated in Windows 2000 Server, making remote control a snap using native Windows Terminal Services (RDP). No other solution could have provided remote control of the process, given the ancient runtime environment used by the pen computers on the client side.
- Physical security. By virtualizing the DOS machine, we effectively moved the process into a secure machine room, thereby freeing the manager to concentrate on other, more important issues.
- Scheduled reboots. Virtualization permitted us to reboot the entire VM regularly to ameliorate the memory leaks.
- High availability. Ironically, this process was critical to the production environment and served as the link between the Customer Service and Warehouse Inventory departments. Tragically, the process failed regularly and caused delays of many hours. Moving the process into a VM was nice, but running the VM on hardware that supported RAID, memory interleaving, dual power supplies, UPS, environmental controls, and fire containment meant that the process was up close to 100 percent of the time, as compared with running on a 486 pen computer that had a PCMCIA hard disk and a power cable that was regularly kicked by the manager.
- Runapp. With the process running in an encapsulated VM, we could use the Runapp command from the Windows 2000 Resource Kit on the host OS, which meant that if the guest VM process crashed, Runapp would fire up the VM again with the appropriate parameters.
- Performance. A simple comparison of the 486 versus a 1.2GHz SMP Pentium III: the runtime of the process decreased from 50 minutes every hour to 4 minutes every hour. The ETL process ran once per hour, so we gained plenty of time in the hour window simply by using the greater processing power of the Dell PowerEdge versus the 486.
- Power consumption. Virtualization let us decommission the original PC and CRT monitor, saving kilowatt-hours. Also, the manager noticed that his office was cooler in the summer since the air conditioning didn't have to deal with the CRT heat.
- Code deployment. Any updates to the process (in the monolithic application) could be performed without a walk to the machine room, using inherent Windows file sharing and the folder sharing inherent in VMware. This is implied in remote control but saved our developer many a walk.
- Multiple runs in the hour window. Given the speediness of the virtualized process, if necessary we could run the process several times in the 1-hour window if inventory changed quickly and Customer Service needed updates.
- Scalability. Admittedly, the ETL process and the runtime environment are antiquated and would be better replaced by an Enterprise Service Bus (ESB) or SOA environment. However, and this is true for many organizations, the process is going to continue to serve us for a LONG time. How long has that AS/400 been hanging around your machine room in spite of the fact that you've paid millions for Oracle ERP? The ETL antique process can be scaled and coddled in modern technology, taking advantage of Moore's Law and all that implies.
- Reliable transport. The NFS stack in the guest DOS OS used UDP; the host OS (Win2K) put packets on the network using TCP. Therefore, the only UDP traffic was between the guest and the host OS, and packets traversed the actual network via TCP, improving the reliability of delivery.
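The Runapp pattern above, restart a crash-prone process whenever it exits, is a simple watchdog loop. A minimal sketch of the idea follows; the command and restart limit are illustrative, and a production watchdog would run indefinitely rather than stopping at a fixed count.

```python
import subprocess

def keep_alive(cmd, restart_limit=3):
    """Run cmd and, whenever it exits, start it again, Runapp-style.
    Restarts at most restart_limit times and returns the number of
    restarts performed (a real watchdog would loop until stopped)."""
    restarts = 0
    proc = subprocess.Popen(cmd)
    proc.wait()                      # block until the process exits
    while restarts < restart_limit:
        restarts += 1
        proc = subprocess.Popen(cmd)  # fire it up again
        proc.wait()
    return restarts
```

In the story above, the "command" would be the VM launcher with its parameters, so a crash of the guest process brought the whole VM back up automatically.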
I have several Microsoft and Citrix certifications, and I found that VM software was by far the least expensive way to set up a lab environment to help me learn and experience the many things I needed to get ready for certification exams. There was no other way to be able to "crash" and test solutions without adversely affecting a production network.
Microsoft Virtual Server has been a tremendous cost and space saver for the company I work for. During a recent upgrade of an entire multitiered business processing system, consisting of some 15 physical Windows servers, I implemented our entire development and test environment on several Windows 2000 VMs, saving the purchase of at least 10 physical servers. The VMs run on a single Compaq Proliant DL380 G3 server. The production environment runs on three physical servers. We are so delighted that we'll be migrating other development and test systems to VMs. Aside from the cost and space benefits, VMs also lend themselves to great flexibility. We can save VMs ahead of making changes to them and then roll them back in 10 to 15 minutes! Also, I have some base images saved in the form of .vhd files. Using the saved images, I can easily bring another server up, give it a new SID, rename it, and add it to our domain within about 15 minutes. Try doing that with a physical box! We are busy trying to win management over to the idea of using virtualization in our production environment, on a range of business-processing systems.
I find Microsoft Virtual Server incredibly useful in training and testing Network Load Balancing (NLB) and shared quorum cluster servers. Obviously, having access to specialized cluster hardware for training and testing purposes is almost impossible. This lets people who don't have access to clustering hardware gain experience and test scenarios without disrupting the production environment.
We've used Microsoft Virtual PC 2004 for some time for routine testing of beta software and such. We came up with a unique solution that uses virtualization to simulate control systems for power plants for training and testing purposes. Using a snapshot program, we capture ghost images of our control computer. We restore the image to a VM and tweak the hardware abstraction layer (HAL) as necessary to make it run. On one machine, we currently have six VMs running on one host that emulate the functions and graphics controls, in a networked environment, of all the control computers needed to run a power plant. It's a factory acceptance test (FAT) rig and training simulator in one box. Using an undo disk, if the training operator "breaks" any one part of the system, he can simply restart that VM. The system can be archived for later use. We use Virtual PC for other projects that let us use different software platforms without reconfiguring the entire host machine. Love Virtual PC!
I do a lot of support from home using a high-speed Internet connection and a VPN. However, I didn't want to put my home Windows XP machine on the domain at work because of the sharing problems it would cause with other PCs on my home network. I installed Microsoft Virtual PC, then installed Windows 2000 on a VM and put it on the domain. I can create Windows NT accounts and Exchange mailboxes as well as use Server Manager from home without incurring sharing problems at home. It has worked very well!
We're a training facility that uses a flexible learning model for our students. Resources, as always, have to be used in an efficient manner. We use Microsoft Virtual PC in two ways to streamline our operation and to maximize our time with our students. We've built virtual servers (Windows 2000 Server and Windows Server 2003) and saved those files on a centralized server so that our students can work through a TechLab on one machine. When a TechLab calls for a DC and a client machine, the student launches the virtual DC within the client Windows XP OS. This approach allows more students to use our hands-on lab equipment. The second benefit of virtual technology is the ability to deliver MicroLabs and technical demonstrations on the fly. Again, we've prebuilt specific server OSs (Win2K Server and Windows 2003) and can quickly configure a host machine to run one or more virtual server sessions. It provides our students with a highly flexible environment to meet their learning needs.
I've started using Microsoft Virtual PC in my classrooms to teach my students. I can build labs at home and burn the virtual PCs to a DVD, then load them on the students' PCs. This is a great time saver, especially when a student fails to follow instructions; I just have them close the VM (without saving their incorrect changes) and start over from a fresh starting point (saving quite a few hours per class of reloading a PQDI image). The students can run an entire network internally on their PCs and even do network load balancing, front-end/back-end Exchange Server 2003 scenarios, and much more, using far less hardware! I also used Virtual PC at my day job during a migration from Novell NetWare to Microsoft. Old Novell workstations were moved off of real hardware workstations to VMs that I could readily access from home while connecting to my work desktop. This was the perfect solution: I could trash the old workstation image and just back up the VM's files, which gave me a more up-to-date image and saved me many hours of PQDI imaging in return. When the migration was done, my last step was to just delete the Novell workstation VM. BTW, Virtual PC rocks!
To keep it simple, our agency has to test upcoming software against our real-world server distribution sites. Before any new software (e.g., patches, updates, fixes, or revisions) can be released into the server or workstation environment, the new software must be tested against all server and workstation setups. Microsoft Virtual Server has let us reduce the number of existing computers while maintaining all test scenarios. We've reduced engineer time by using autoscripts to reboot these Virtual Server test machines into each session and log any failures or errors that the new software generates. The bottom line: We've saved some time and reduced the number of test machines required to maintain a no-downtime, no-disruption environment.
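The failure-logging step of such an autoscript, scanning each session's output for errors after a test VM reboots, can be sketched as follows. The keywords and log format are illustrative assumptions, not the agency's actual script.

```python
def summarize_failures(log_lines):
    """Scan the output of an unattended test session and collect any
    lines that report an error or failure, as an autoscript might do
    after rebooting each Virtual Server test machine into a session."""
    keywords = ("error", "fail", "exception")  # assumed markers
    hits = []
    for lineno, line in enumerate(log_lines, start=1):
        lowered = line.lower()
        if any(k in lowered for k in keywords):
            hits.append((lineno, line.strip()))
    return hits
```

An engineer then only has to review the sessions that produced hits, which is where the time savings come from.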