Understanding System Center 2012 Data Protection Manager


Enhanced backup and disaster recovery

According to Enterprise Strategy Group (ESG) research, the number-one IT spending priority in 2012 was improving data backup and recovery, tied with increased use of server virtualization. Interestingly enough, improving business continuity or disaster recovery (BC/DR) scored in the top 10 as well. There are two key reasons. First, the commoditization of virtualization has made many IT processes easier but has made backups more difficult. Second, data is growing faster than most organizations can manage it, and legacy backup solutions are struggling to keep up. Other factors include an ever-growing reliance on IT (which raises the priority of BC/DR) and the consumerization of IT (which creates new protection scenarios for privately owned endpoint devices). Add the growing complexity of backing up and recovering Microsoft workloads (e.g., Microsoft SQL Server, SharePoint, Exchange Server, Hyper-V, Windows Server file services), and you can understand why Microsoft started building its own data-protection solution.

Microsoft introduced Volume Shadow Copy Service (VSS) in 2003, but VSS helped only if backup and storage vendors chose to use its APIs. Microsoft needed to assure customers of a viable (and supported) backup-and-recovery capability, as part of sustaining (or raising) customer satisfaction and adoption of Windows Server. With supportability and workload adoption in mind, Microsoft released Data Protection Manager (DPM) 2006 in late 2005. This disk-to-disk protection capability was optimized for branch-office file servers and served to augment a legacy tape-based solution.

Eighteen months later, DPM 2007 brought application protection for Microsoft workloads, as well as added tape support. This release changed the dynamic with Windows-centric backup vendors.

DPM 2010 further enhanced workload and tape support and added client-node protection. This addition enabled Windows laptops to be backed up and recovered, even when connected via the Internet or disconnected from a local backup cache. In addition, the basic replication features for DR were augmented, and monitoring (via System Center Operations Manager) was enhanced. Arguably, with three versions and five years of experience in the market, DPM offered a credible midmarket backup solution, especially for customers who ran "pure Microsoft" environments. Then came DPM 2012.

System Center 2012 DPM

In April 2012, Microsoft announced general availability of Microsoft System Center 2012, which comprises numerous -- and in some cases, formerly separate -- products, one of which is DPM. DPM in System Center 2012 -- informally referred to as DPM 2012 -- has three complementary focus areas:

  • Evolve from a small-to-midsized business (SMB) and midmarket offering to something that's credible for large-scale enterprises (or at least for the Windows-based nodes within an enterprise).
  • Integrate and interoperate with the rest of the System Center family, not only for "better together" data-protection capabilities, but also to remain relevant within the Microsoft-driven management story.
  • Continue to evolve core features and workload support, refine experience, and resolve engineering issues.

Centralized Administration via Operations Manager

Easily the most noticeable and important enhancement in DPM 2012 is the addition of the Central Console, through System Center 2012 Operations Manager. Operations Manager has been monitoring DPM for a few years with varying success, but System Center 2012 provides not just complete monitoring but also management of the backup product.

Previously, a DPM administrator needed to maintain a Microsoft Excel spreadsheet detailing which production servers were protected by which DPM server. Any configuration, troubleshooting, or restoration requests required you to connect, via Terminal Server, to the console of a particular DPM server. Although these requirements weren't horrible in midsized organizations, they were painful enough that large enterprises, which likely owned the entire System Center suite license, would leave the DPM components on a shelf. System Center 2012 changes this situation, not only for DPM 2012 servers but also for DPM 2010 servers. That's right: By adding even one DPM 2012 server and an Operations Manager 2012 server, your existing DPM 2010 servers gain centralized management as well.


Management, not just monitoring. In the left pane of Operations Manager, you can expand what initially appears to be just another DPM management pack. From there, you can check the status of DPM servers, protected servers (including predefined subsets based on workloads such as SQL Server or Exchange), and alerts (e.g., tape media alerts, failed jobs, disk capacity issues). As is typical in Operations Manager, when you select a particular alert in the center pane, several other portions of the console change. Detailed information about the alert appears in the bottom-center portion of the console, and context-specific actions appear in the right pane, as shown in Figure 1.

Figure 1: DPM Server Alerts in Operations Manager Console

This is where things get good. Every Operations Manager management pack comes with a wealth of knowledge about symptoms, likely causes, and recommended resolution actions. The DPM pack is no exception. Whenever you click an alert in the top-center pane of the console, the bottom-center pane displays the known information about the alert, including suggested resolution actions (which are often direct hyperlinks, such as Restart Service). Along with the specific action listed in the Knowledge Base, the right pane of the Operations Manager console often offers actions, some of which (e.g., Ping Server) are generic and others of which (e.g., Modify Disk Allocation in Storage Pool) are specific to the platform being managed.

Focused troubleshooting. Imagine a DPM backup job failing because its storage pool has been maxed out (and autogrow hasn't been enabled). When the job fails, Operations Manager receives an alert -- likely several, because exhausted disk capacity can affect multiple jobs. Operations Manager bubbles up an alert to be serviced and notifies the Operations Manager administrator accordingly. At this point, you (as the administrator) might see the hyperlink in the Knowledge Base, or you might simply click Modify Storage Allocation in the right pane. In either case, the mini-wizard UI that was previously seen within the DPM console now pops up within Operations Manager. From there (i.e., without using any DPM UI or Terminal Services screen), you can change the storage allocation. After doing so, you can use another action in the right pane to Restart Backup Job, and you're done.

Before DPM 2012, you needed either to routinely check each DPM server's alert page (by connecting Terminal Services to each console) or to react after an early Operations Manager management pack alerted you (which also required a Terminal Services connection).

Most (though not all) common management tasks can now be performed as actions within the Operations Manager UI. For those (far fewer) times when you need to connect to a particular DPM server for specific troubleshooting tasks, you can also do so through the Operations Manager UI. Clicking a DPM server and selecting Connect to brings up a scoped-down DPM UI, in which the terminal session is created behind the scenes. This UI shows the tabs that are necessary to resolve the issue. The protection groups are filtered, and the Alerts and Jobs tabs are scaled back to show only the information that's related to the error or errors, as shown in Figure 2.

Figure 2: DPM UI Invoked from Operations Manager


Role-based management. Another benefit of using Operations Manager as the primary console for DPM is something that DPM administrators have clamored for: role-based management. Prior to DPM 2012, you needed to be a local administrator of each DPM server, and you often needed elevated privileges on every production server that you were protecting or recovering to.

With System Center 2012, the role-based management capabilities of Operations Manager can be used to scope a DPM administrator to one of several preconfigured or manually configured roles. Some roles or capabilities include the following:

  • Monitor backup jobs (only)
  • Define new protection policies
  • Restore data
  • Monitor and manage tape media
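As a rough sketch of what role scoping means in practice, the following Python model maps role names to sets of permitted actions and checks authorization against them. The role and action names here are my own illustration, not DPM's actual role definitions.

```python
# A minimal, hypothetical model of role-scoped administration. The role and
# action names are illustrative placeholders, not DPM's real role catalog.

ROLE_PERMISSIONS = {
    "Reporting Operator": {"monitor_jobs"},
    "Recovery Operator": {"monitor_jobs", "restore_data"},
    "Tape Operator": {"monitor_jobs", "manage_tape_media"},
    "DPM Admin": {"monitor_jobs", "restore_data", "manage_tape_media",
                  "define_protection_policies"},
}

def is_authorized(role: str, action: str) -> bool:
    """Return True if the given role is scoped to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_authorized("Recovery Operator", "restore_data"))   # True
print(is_authorized("Reporting Operator", "define_protection_policies"))  # False
```

The point of the model is the scoping itself: an operator who can only monitor jobs never receives the permissions needed to define policies or touch tape media.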

Figure 3 shows a few of the predefined roles and their permissions within DPM.

Figure 3: Table of Predefined DPM Roles Within Operations Manager UI

New DPM Interface and Workload Enhancements

With all the excitement about using Operations Manager as the primary UI for DPM, one might not initially notice that the DPM-specific UI also received an update. Using the same framework as the other System Center 2012 components, DPM 2012 uses what some refer to as the "Outlook" style: a prominent ribbon across the top, key operation areas in the left pane (including pane buttons in the lower left), and a context-sensitive right body window, as shown in Figure 4.

Figure 4: New DPM 2012 Interface

Each release of DPM continues to confirm Microsoft's commitment to offering a "best-of-breed" solution exclusively for Microsoft workloads, including Windows desktops and file servers, as well as application servers such as SQL Server, Exchange, SharePoint, and Hyper-V.


Enhanced Hyper-V protection. DPM 2012 takes the file-system filtering technology that it uses to protect other workloads' files and databases and applies the same approach to protecting Virtual Hard Disk (VHD) files. For those familiar with VMware vStorage, this filtering technology is similar to the VMware ESX Changed Block Tracking (CBT) function of monitoring and noting changed blocks as they occur. When it's time for a scheduled backup, DPM uses the change log to fetch and transmit only those blocks. As a result, backup of virtual machines (VMs) occurs much more quickly and with very little overhead. As I mentioned earlier, Microsoft has used this methodology since DPM 2006 but added it to Hyper-V, along with the enhancements in DPM 2012 and some behind-the-scenes updates to VSS, only as of Windows Server 2008 R2. Because this process is less I/O-intensive on the hypervisor and VMs, the enhancement also enables more frequent backups.
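The block-level approach can be sketched as a simple model: a filter records which blocks of a disk change, and the scheduled backup copies only those blocks. This is my simplified illustration of the general changed-block technique, not DPM's actual implementation.

```python
# A toy model of changed-block tracking (illustrative only, not DPM code):
# writes are noted in a change log, and an incremental backup transmits
# only the blocks recorded there.

class TrackedDisk:
    def __init__(self, num_blocks: int, block_size: int = 4096):
        self.blocks = [bytes(block_size) for _ in range(num_blocks)]
        self.dirty = set()  # indices of blocks changed since the last backup

    def write(self, index: int, data: bytes):
        self.blocks[index] = data
        self.dirty.add(index)       # the "change log" notes the block

    def incremental_backup(self) -> dict:
        # Transmit only the changed blocks, then clear the change log.
        delta = {i: self.blocks[i] for i in sorted(self.dirty)}
        self.dirty.clear()
        return delta

disk = TrackedDisk(num_blocks=1024)
disk.write(7, b"new data")
disk.write(42, b"more data")
delta = disk.incremental_backup()
print(sorted(delta))   # [7, 42] -- two blocks move, not the whole disk
```

Because only the noted blocks are read and sent, the backup avoids scanning the entire VHD, which is why the real mechanism imposes so little I/O on the hypervisor.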


Virtualized backup servers. Also related to virtualization and DPM, the backup server can now be virtualized. You could run a virtualized DPM 2010 server, but you sacrificed a few key capabilities, such as file-level recoveries of VMs. (Virtualized instances of Windows Server 2008 don't support mounting a VHD inside a VM because mounting a VHD was part of the Hyper-V role, which isn't available within a VM in Server 2008.) Because of enhancements in DPM 2012, along with its prerequisite of Server 2008 R2 (which has the native ability to mount a VHD), virtualized DPM 2012 servers can restore individual files or directories from a host-based backup of a VM.


Optimized SharePoint restores. Although DPM has almost always been able to offer single-file restores from a SharePoint farm, the process hasn't always been smooth. Every release of DPM gets better at restoring SharePoint, in part because of SharePoint's continued evolution.

  • DPM 2007 needed to use Microsoft Office SharePoint Server 2007's Recovery Farm option to restore an entire content database, which would then restore single files.
  • DPM 2010 no longer required a Recovery Farm because Microsoft SharePoint Server 2010 didn't require one. Instead, DPM could recover the content database to any instance of SQL Server on the intranet and then restore a file, although it still needed to recover the entire database first.
  • DPM 2012 restores the file only. It mounts the database within its backup storage pool, using its own running instance of SQL Server, and then plucks out the file for easy and fast restores.

What might have taken an hour in DPM 2010 (to recover the database first) now takes around 20 seconds in DPM 2012. Nice job!

Generic data source protection. Microsoft has always aspired for DPM to be the best for Microsoft backups and restores. With System Center 2012, Microsoft is reaching for DPM to be the best for Windows.

Although not a fully developed feature, DPM 2012 opens the door to protect other (non-Microsoft) workloads that run on Windows. Essentially, Windows already provides the core plumbing for any Windows-based application to be systematically protected through VSS functionality.

  • VSS requesters are components of the backup agent. These components request that an application prepare its data for protection.
  • VSS writers are components of the workload (e.g., SQL Server). These components receive VSS requests and then prepare the data for protection by performing functions such as flushing database transactions in memory, checkpointing databases, and so on.
  • VSS providers operate at the storage layer and are provided by either the hardware array or the software VSS provider within Windows.
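The three roles above can be sketched as a toy coordination model. The Python below is purely illustrative of the handshake (the class and method names are my assumptions, not the actual VSS API): the requester asks writers to quiesce, the provider snapshots the volume, and the writers resume.

```python
# A toy model of the VSS handshake (illustrative only, not the VSS API).

class Writer:               # e.g., SQL Server's VSS writer
    def __init__(self, name):
        self.name, self.frozen = name, False
    def freeze(self):       # flush in-memory transactions, checkpoint, etc.
        self.frozen = True
    def thaw(self):
        self.frozen = False

class Provider:             # hardware array or the in-box software provider
    def snapshot(self, volume):
        return f"shadow-copy-of-{volume}"

def request_backup(writers, provider, volume):
    """The requester (backup agent) coordinates the snapshot."""
    for w in writers:
        w.freeze()
    try:
        return provider.snapshot(volume)  # taken while data is consistent
    finally:
        for w in writers:
            w.thaw()

snap = request_backup([Writer("SQL Server")], Provider(), "D:")
print(snap)   # shadow-copy-of-D:
```

The essential design point survives even in this sketch: the requester never touches application internals; it only asks each writer to make its data consistent for the instant the provider captures the volume.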

With DPM 2012, Microsoft has started providing guidance that enables any application with a VSS writer to be visible for protection by the DPM backup agent. In addition, other application owners can create an XML file that describes the behaviors that an application should perform, essentially enabling the Windows file system's VSS writer to then protect the data.
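To illustrate the idea (and only the idea), such a descriptor might look something like the following. The element names here are invented for illustration and are not DPM's actual registration schema.

```xml
<!-- Hypothetical illustration only; these element names are invented
     and do not reflect DPM's real generic data source schema. -->
<Datasource name="ContosoApp">
  <PreBackup command="contoso.exe /flush" />    <!-- quiesce before snapshot -->
  <FilesToProtect path="C:\ContosoApp\Data\*" />
  <PostBackup command="contoso.exe /resume" />  <!-- resume normal operation -->
</Datasource>
```

The concept is that the application declares how to make its data consistent, and the file system's VSS writer handles the rest.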

It will be interesting to see whether other Windows-based applications will perform the minor (but incremental) work to deliver their own VSS writer (if they don't already) and script an XML file for this use. One might also hope that Microsoft will do some of the heavy lifting for strategic workloads that have previously precluded adoption of DPM because they weren't protectable, such as Oracle or IBM Lotus Notes.

Integration Across System Center

Operations Manager isn't the only component of System Center 2012 with which DPM is integrated, although some of the other integrations actually pass through Operations Manager to be realized. Without these integrations, DPM couldn't be a fully credible and enterprise-worthy member of the System Center family.


System Center 2012 Orchestrator runbooks. Like the automatable tasks of the other System Center components, DPM tasks can be scripted as activities within an automation runbook in System Center 2012 Orchestrator. If you're automating the import of data into a database, you might add a DPM task to create a recovery point (backup) immediately before the import. Later in the runbook, you can automate a restore to that recovery point if the data import is determined to be unsuccessful.
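The backup-before-import pattern can be sketched in plain Python rather than Orchestrator activities (the function names below are placeholders of my own, not DPM or Orchestrator APIs): take a recovery point before the risky operation, and roll back to it if the operation fails.

```python
# A sketch of the runbook pattern described above, with placeholder
# functions standing in for DPM activities (not real APIs).

def create_recovery_point(database):
    """Placeholder for a DPM 'create recovery point' activity."""
    return dict(database)            # simulate a point-in-time copy

def restore_recovery_point(database, recovery_point):
    """Placeholder for a DPM restore activity."""
    database.clear()
    database.update(recovery_point)

def import_data(database, rows, fail=False):
    if fail:
        raise RuntimeError("import failed")
    database.update(rows)

db = {"existing": 1}
rp = create_recovery_point(db)       # backup immediately before the import
try:
    import_data(db, {"new": 2}, fail=True)
except RuntimeError:
    restore_recovery_point(db, rp)   # roll back to the recovery point

print(db)   # {'existing': 1} -- the failed import was rolled back
```

In an actual runbook, each of these steps would be an Orchestrator activity; the structure -- snapshot, attempt, conditional restore -- is the same.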

Similarly, when combined with dynamic provisioning tasks in a Microsoft private cloud architecture, a runbook that's invoked to create a new VM (through its self-service portal) can tell DPM to create a protection policy for the data in the newly launched service.


System Center 2012 Service Manager tickets. Just like any other alert in Operations Manager, DPM alerts can result in System Center 2012 Service Manager tickets for resolution. Those tickets are cleared when Operations Manager determines that the alerts have been resolved (or have been marked to be ignored). Service Manager service requests can also be used to invoke data-protection activities such as creating a recovery point (backup) or initiating a restore. Those kinds of tasks are facilitated from either the canned actions within Operations Manager or the runbooks within Orchestrator.


System Center 2012 Configuration Manager deployments. One of the other new features in DPM is a more scalable (enterprise) way of deploying DPM agents through Configuration Manager. DPM 2012 works through Operations Manager and Configuration Manager to become aware of new protectable machines that are added to the managed domains of an intranet. When this happens, Configuration Manager can automatically deploy the DPM agent as it would any other software package.

New in DPM 2012 is an awareness, via policies, of which DPM server should protect which production machines. When Operations Manager detects a freshly installed agent, it uses the policies to determine which DPM server should protect that machine and creates a connection package within Configuration Manager. Configuration Manager deploys the package, whose simple script creates the glue from any particular machine to the correct DPM server. In doing this, DPM agent installations can be completely automated, from connection through the application of a protection policy.

The Bigger Picture

DPM 2012 is a credible part of the System Center 2012 feature set. With proper usage, it could be a boon to private cloud deployments, as well as a low- or no-cost (for organizations that already own System Center) solution that's ideal in branch offices. But few organizations will think of using it, perhaps because of a few missing features that the status quo backup solution recently added or because the organization's Microsoft representative hasn't demonstrated it yet.

System Center 2012 isn't just Operations Manager plus Systems Management Server and some other management tools thrown together. It really is a suite. Microsoft would like you to look at System Center as one product, based on licensing. But one look at the installation directories will assure you that although the bits might be highly interoperable, there are still multiple server back ends, agent installables, and management interfaces to be used. (One can only hope -- or assume -- that even further unification and consolidation will take place in the next System Center release.)

There are still a few features needed to bring DPM to parity with other Windows-centric solutions, particularly around deduplication. Windows Server 2012 will help, but its deduplication is still restricted to per-volume deduplication, which won't benefit DPM backup servers nearly as much as solutions that can use EMC Data Domain Boost, HP StoreOnce Catalyst, or NetApp optimized storage stacks. Today, backup leaders are innovating deduplication beyond just the disk in the backup servers' storage pool, to optimize transmission from the backup server or even the production application. DPM now has the management features and scale capabilities that larger enterprises need, but the lack of deduplication holds it back.

Choosing one's backup solution is about trust, often based on perceived product capability and company commitment to the solution. Ask your Microsoft direct sales engineer or Microsoft partner to demonstrate virtualization provisioning or automation and monitoring in System Center 2012. You'll be amazed at the capabilities. Ask for a demonstration of backup or DR, and you'll see that although DPM backup tasks are noticeably missing from many training and demo scenarios, DPM can still play a key role in your overall backup and protection solution.
