The fact that Microsoft has posted 469 recordings on their Channel 9 site from the recent TechEd North America event is a wonderful thing. I don’t particularly care for the scale of TechEd, and the chance to go back to a conference so soon after MEC (sessions from that event are also online) seemed like overkill, so I didn’t make the journey to Houston. But having the chance to consume selected content at my leisure is a very nice thing.
Given so many recordings from which to choose, there’s bound to be some good and bad in the mix - just like the conference itself, I guess. I greatly enjoyed watching Jeffrey Snover and Don Jones lead people through PowerShell; listening to some others waxing lyrical about the wonders of Exchange certification exams was less inspiring. Some of the sample questions were just odd: journal rules appear to be much more important to certification than they are in real life.
In any case, I took the opportunity to listen to Asaf Kashi cover Data Loss Prevention (DLP). This is an interesting technology that I have written about before, both to provide an overview of DLP as implemented in Exchange 2013 and to cover some specific details, such as the document fingerprinting feature introduced in Exchange 2013 SP1. But it’s always nice to listen to a pro, and I had missed his MEC session.
I took four major points away from the talk:
- A common team is now responsible for DLP across both Exchange and SharePoint. This makes a heap of sense and is in line with the trend for the Office servers to leverage each other’s capabilities much better. Using Search Foundation as a common search platform and the less-than-successful-so-far site mailboxes are other examples.
- SharePoint Online is about to implement DLP in the form of a crawler (see below). This is a component that “crawls” through content in SharePoint sites to perform a certain function. In this case, the crawler will be looking for documents that contain sensitive data types like credit card numbers or SSNs, just like Exchange examines messages for the same content. Selecting a crawler for the job makes sense because it provides much the same common chokepoint as the transport pipeline provides for Exchange. In other words, you can be sure that any document that contains sensitive data will be detected.
- Initially you’ll be able to include DLP queries in “cases” similar to those created today through SharePoint’s eDiscovery Center. Plans are in place to allow for more automatic processing so that, for instance, a document discovered to contain sensitive data might be quarantined until it can be reviewed and released by a compliance officer.
- A new Unified Compliance Center is coming that will allow administrators to perform compliance activities, including DLP, across multiple servers (Exchange and SharePoint to start). However, you’ll be able to continue using DLP features for individual workloads if you like, such as using EAC or EMS to configure DLP for Exchange.
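To give a flavour of what this kind of sensitive data detection involves, here’s a small sketch in Python. This is purely illustrative and not Microsoft’s implementation - the function names are mine - but it shows the typical two-step approach: a pattern match to find candidate strings, followed by a validation step (the Luhn checksum for card numbers) to cut down on false positives.

```python
import re

# Candidate patterns for two sensitive data types. These are simplified
# illustrations; a real DLP engine uses far more elaborate definitions.
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def find_sensitive_data(text: str) -> list:
    """Scan text and report (type, match) pairs for each hit."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):  # validation step weeds out random digits
            hits.append(("credit card", match.group()))
    for match in SSN_PATTERN.finditer(text):
        hits.append(("SSN", match.group()))
    return hits
```

Whether the engine runs as a transport agent over messages or as a crawler over documents, the classification step itself is essentially the same - which is presumably part of why a common team for both workloads makes sense.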
All of this seems very logical and worthwhile. I assume that the DLP features intended for SharePoint Online will find their way into an on-premises version before too long.
Asaf made one other point that I thought worth noting. He spoke about add-on compliance products that can be deployed with Exchange and emphasized that Microsoft incurs very little penalty to perform the deep content analysis in the transport pipeline to detect sensitive data types that might be controlled by policy. This is because the transport pipeline already has to examine content in messages and attachments as they pass through, so performing an extra check for sensitive data is not a big performance hit. The close integration of DLP into Exchange and the Outlook and OWA clients is Microsoft’s trump card when it comes to making a choice between compliance products. It looks like the same will soon be true for SharePoint. Time for Microsoft’s competitors to get their innovation hats on to figure out some new unique selling points!
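The amortization argument can be sketched in a few lines of Python. Everything here is hypothetical (the function names and message shape are mine, not Exchange’s), but it captures the point: the expensive step is extracting text from a message and its attachments, which the pipeline must do anyway, so each additional check - DLP included - is just one more cheap pass over text already in hand.

```python
import re

def extract_text(message: dict) -> str:
    """Stand-in for the costly parse of body plus attachments."""
    return " ".join([message.get("body", "")] + message.get("attachments", []))

def scan_message(message: dict, checks) -> dict:
    text = extract_text(message)  # pay the extraction cost once
    # ...then every registered check is a cheap pass over the same string
    return {name: check(text) for name, check in checks}

# Adding a DLP check to the list costs one more pass, not another parse.
checks = [
    ("contains_ssn", lambda t: bool(re.search(r"\b\d{3}-\d{2}-\d{4}\b", t))),
    ("mentions_confidential", lambda t: "confidential" in t.lower()),
]
```

A third-party product that sits outside the pipeline has to do its own extraction before it can run any checks, which is exactly the penalty Microsoft avoids.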
Follow Tony @12Knocksinna