Pioneering PC Provisioning
If your company is like most, your procedures for moving new PCs from the hardware manufacturer to end users and for refreshing existing PCs are necessarily labor-intensive manual processes. And the more PCs you support, the bigger the hit you take in related overhead costs. The traditional approach to provisioning—moving a PC from hardware vendor to IT staging to end user—can require 1 to 4 hours per machine. I was interested to hear that ManageSoft (http://www.managesoft.com) is pioneering a better approach to PC provisioning.
In a recent conversation with ManageSoft's Vice President of Worldwide Marketing Paul Cisternelli and Technology Evangelist Steve Klos, I learned about Self-Service Ordering, ManageSoft's automated software ordering process that uses an enterprise's existing infrastructure to provision new and replacement computers with customized software without IT assistance.
Self-Service Ordering provides an enterprise ordering portal. The process's dynamic rules-based engine enables automatic software download and installation on new PCs according to corporate standards and user role. For existing PCs, user data is automatically passed through rules that enforce licensing and security compliance and perform conflict analysis as part of a complete software refresh and upgrade that's fully user-managed. Self-Service Ordering can integrate with management software such as Microsoft Systems Management Server (SMS) for ongoing dynamic software management. ManageSoft estimates that Self-Service Ordering can produce savings of between $50 and $70 per machine over a manual provisioning process.
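ManageSoft hasn't published the internals of its rules engine, but the general idea—combine a corporate baseline with role-specific rules to produce a software manifest for each machine—can be sketched in a few lines. The role names and package names below are invented for illustration; this is not ManageSoft's implementation.

```python
# Minimal sketch of a rules-based provisioning decision: corporate
# standards plus the user's role determine the software manifest for
# a new PC. All names here are hypothetical examples.
CORPORATE_BASELINE = ["os-image", "antivirus", "vpn-client"]

ROLE_RULES = {
    "engineering": ["compiler-suite", "cad-viewer"],
    "sales": ["crm-client"],
}

def build_manifest(role):
    """Return the ordered list of packages to install for a given role."""
    return CORPORATE_BASELINE + ROLE_RULES.get(role, [])

print(build_manifest("sales"))
# → ['os-image', 'antivirus', 'vpn-client', 'crm-client']
```

A real engine would also run the licensing, security-compliance, and conflict-analysis rules mentioned above over the resulting manifest before anything is downloaded.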
Can You Find 80 Bits?
I typically think of identity as part of the OSI application layer because that's where authentication usually takes place. But when I'm configuring a firewall, I wonder why I can't just have it authenticate each TCP stream. I recently spoke with Trusted Network Technologies (TNT—http://www.trustednetworktech.com) about its I-Gateway product, which does just that—transparently. TNT's I-Host software runs on client computers and embeds a SID-based identity into the IP and TCP headers during the three-way handshake for I-Gateway authentication. I-Host stores part of the identity in the TCP Sequence Number field. Because Request for Comments (RFC) 793 says that the initial sequence number is clock-based or random, I-Host can insert part of the identity there transparently. But TCP sequence numbers are 32 bits, and TNT reports that it uses a total of 112 bits. If you can find 80 more transparently usable bits from the headers, please write to me at acarheden at windowsitpro.com. If you're confused, see the online version of this article (http://www.windowsitpro.com, InstantDoc 45261) for some helpful links.
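As a back-of-the-envelope exercise, here's a quick tally of header fields whose values the relevant RFCs treat as sender-chosen or unpredictable—the kind of places where bits might hide transparently. The candidate list is my speculation, not TNT's design, and as the arithmetic shows, the obvious candidates don't get you to 80.

```python
# Tally of TCP/IP header fields that RFCs leave sender-chosen or
# unpredictable, and which might therefore carry identity bits
# transparently. Field choices are my speculation, not TNT's design.
CANDIDATE_FIELDS = {
    "TCP initial sequence number (RFC 793)": 32,  # clock-based or random
    "IP identification (RFC 791)": 16,            # value left to the sender
    "TCP source port": 16,                        # ephemeral ports are arbitrary
    "TCP timestamp option (RFC 1323)": 32,        # low-order bits look like noise
}

TOTAL_NEEDED = 112  # bits TNT reports embedding in all
IN_SEQ_NUM = 32     # bits already carried in the sequence number

shortfall = TOTAL_NEEDED - IN_SEQ_NUM
remaining_candidates = sum(CANDIDATE_FIELDS.values()) - IN_SEQ_NUM
print(f"Still need {shortfall} bits; other candidates offer only {remaining_candidates}")
# → Still need 80 bits; other candidates offer only 64
```

Which is exactly why I'm asking readers where the rest might come from.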
The Evolution of Backup
When I first started working with computers, I imagined that they could do anything if only I knew the right command. Why couldn't I access every version of my data that ever existed? Experience has taught me that limiting factors in both hardware and software separate reality from what I wish my computer could do. I contemplated what I wanted from backup recently when Leonid Shtilman and Gil Rapaport, the CEO and vice president of marketing, respectively, for XOsoft (http://www.xosoft.com), briefed me about the company's WANSync and Enterprise Rewinder products. WANSync mirrors data between servers, and Enterprise Rewinder accesses mirrored data as it existed at any point in time. Users specify how much history Enterprise Rewinder should retain according to criteria such as time, space, or events—for example, a tape backup. Although hardware may preclude me from storing every version of my data that ever existed—at least, until I can afford that terabyte disk array—XOsoft's backup software has caught up with my imagination.
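XOsoft hasn't disclosed its internals, but the general "rewind" idea—journal every write, then replay the journal up to a chosen point to reconstruct earlier state—can be sketched briefly. This is purely illustrative, not Enterprise Rewinder's implementation.

```python
# Minimal sketch of point-in-time rewind: every write is journaled in
# order, and any earlier state can be rebuilt by replaying the journal
# up to the chosen point. Illustrative only; not XOsoft's design.
class RewindJournal:
    def __init__(self):
        self._ops = []  # (key, value) write operations, in order

    def write(self, key, value):
        self._ops.append((key, value))
        return len(self._ops)  # "point in time": position in the journal

    def state_at(self, point):
        """Replay writes up to and including `point` to rebuild the state."""
        state = {}
        for key, value in self._ops[:point]:
            state[key] = value
        return state

j = RewindJournal()
j.write("report.doc", "v1")
t = j.write("report.doc", "v2")
j.write("report.doc", "v3-corrupted")
print(j.state_at(t))  # the data as it existed before the corruption
# → {'report.doc': 'v2'}
```

A production system would bound the journal by the time, space, or event criteria described above, discarding history past the retention window.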
How Am I Supposed to Measure That?
When I was a technology consultant, I noticed two characteristics that often went hand-in-hand: insularity between IT departments and other business units, and poorly defined service level agreements (SLAs). SLAs are frequently defined in terms of an IT pro's notion of a functioning system (e.g., the ability to ping machines, the availability of certain vital services), often leaving users dissatisfied even when the SLAs are met. It's not that IT pros don't understand users' concerns but that they have no way to quantify the user experience in an SLA.
I recently discussed this problem with Indicative Software (http://www.indicative.com), an Agilent Technologies spin-off. Indicative End User Experience Manager is designed around measurements that are meaningful to end users. The company's product uses round-trip synthetic transactions to simulate the user experience across multiple components and multi-tiered systems. Indicative charges per measurement—a pricing strategy the company hopes will make its product attractive to smaller businesses as well as to the company's enterprise customer base.
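Indicative's agents aren't publicly documented, but the core of a round-trip synthetic transaction is simple to sketch: drive a scripted request through the full stack and time what a user would actually wait for. The function below is a generic illustration of the technique, not Indicative's product.

```python
# Sketch of one round-trip synthetic transaction: issue a scripted HTTP
# request and time the complete round trip, payload transfer included,
# because that's what the user actually waits for. Generic illustration
# only; not Indicative's agent.
import time
import urllib.request

def measure_transaction(url, timeout=10):
    """Return (success, elapsed_seconds) for one synthetic round trip."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # include the full payload transfer in the timing
            ok = resp.status == 200
    except OSError:
        ok = False  # DNS failure, refused connection, or timeout
    return ok, time.perf_counter() - start
```

A monitoring agent would run such transactions on a schedule against each tier—web front end, application server, database-backed pages—and compare the elapsed times against user-facing SLA thresholds rather than raw ping results.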