In the past 6 months, all developers of distributed, Web-based systems have undoubtedly wondered about the introduction of ADO.NET and its new object model. The common perception is that ADO.NET will transform existing data-access code into something that resembles legacy code. In most cases, existing data-access code is part of recently built Windows Distributed interNet Applications Architecture (DNA) systems.
When Microsoft engineers sat down to design .NET's data access, they adapted ADO, and ADO.NET was the result. ADO emerged during a period of evolutionary client/server computing, and its goal was to offer new ways of working (e.g., disconnected recordsets, data shaping, output to XML, in-memory fabrication). ADO uses OLE DB to accomplish the physical data access, but one often underestimated and seldom cited element of OLE DB plays a fundamental role: the OLE DB Services.
As data travels to the client, the OLE DB Services intervene to represent that data in special ways (e.g., as hierarchically shaped, persisted, or disconnected snapshots), offering functions that aren't strictly part of the original, core ADO library. The recordset object is merely a COM wrapper for the results of a SQL query, and in ADO that result is a rather complex and bloated object.
Designers wanted to make more functionality available in ADO.NET—more disconnected functions, more reading features, and interactivity and integration with XML. Rather than using the recordset as the starting object and filling it with new services, the designers came up with a new object model, ADO.NET, that's aligned to ADO wherever possible and that lets you apply existing skills whenever possible. As a result, the learning curve for expert ADO developers isn't too steep, and the learning curve for data-access newbies is linear and not particularly long. Developers, both experienced and not, can study and work with an object model that's perfectly aligned with the computing world they're accustomed to and the applications they write.
I don't recommend importing and using ADO in .NET applications, but if you must, you should have good reasons for doing so. Here are two:
- You have a lot of working ADO code that you want to preserve
- You've invested heavily in business objects that use ADO to read and write data
If you have a lot of ADO code that you want to preserve, you must first determine where that code is located. Is the ADO code in Active Server Pages (ASPs) buried in plain-code blocks? Or is the code in COM+ business objects that return recordsets to the ASP presentation builder layer?
For performance and maintenance reasons, placing plain ADO code in ASPs is a bad idea. In such cases, the ADO code is nearly legacy code. You can adapt the code to work with .NET, but you'll encounter performance and coding issues.
You can import any COM object into .NET applications, and the ADODB library is no exception. From a command line or from within Visual Studio .NET, you can use a system tool, tlbimp.exe, to create a wrapper class that lets you call COM methods and properties through a .NET proxy. In that way, you can create a recordset object even in ASP.NET pages or Windows forms. However, tlbimp.exe isn't a lifesaver. Once you've created and populated the recordset, you still have to figure out how to display its content. With ASPs, you can use an endless series of Response.Write statements to output the UI tags, or you can build a big string to flush in one shot. This code is anything but .NET-compliant, yet it works fine.
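For example, running `tlbimp.exe` against the ADO type library (typically msado15.dll under the Common Files\System\ado folder) produces an interop assembly you can reference like any other .NET class library. The sketch below, which assumes that generated ADODB wrapper and a hypothetical Northwind connection string, opens a recordset through the COM proxy and builds the markup into one string to flush in a single shot:

```csharp
// Sketch only: assumes an interop assembly generated with something like
//   tlbimp msado15.dll /out:ADODB.dll
// and referenced from the project. The connection string is hypothetical.
using System;
using System.Text;
using ADODB;

class RecordsetDemo
{
    static void Main()
    {
        // Create and populate an ADO recordset through the .NET proxy.
        Recordset rs = new Recordset();
        rs.Open("SELECT CompanyName FROM Customers",
                "Provider=SQLOLEDB;Data Source=.;Initial Catalog=Northwind;" +
                "Integrated Security=SSPI;",
                CursorTypeEnum.adOpenStatic,
                LockTypeEnum.adLockReadOnly, 0);

        // Rather than an endless series of Response.Write calls, accumulate
        // the UI tags in one string and flush it in a single shot.
        var sb = new StringBuilder("<table>");
        while (!rs.EOF)
        {
            sb.AppendFormat("<tr><td>{0}</td></tr>",
                            rs.Fields["CompanyName"].Value);
            rs.MoveNext();
        }
        sb.Append("</table>");
        rs.Close();

        Console.Write(sb.ToString()); // in an ASP.NET page: Response.Write(...)
    }
}
```

This works, but every property and method call crosses the COM interop boundary, which is part of the performance price discussed below.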
You'll probably modify a lot of code in any of your ASP—now ASP.NET—pages to bring ADO into the .NET world, but you'll pay a performance price. Is it worth it?
Saving Business Objects
Business logic is the heart of a distributed system, and changing or rewriting it is always potentially harmful. A lot of business objects in existing DNA systems return ADO recordsets to the ASP presentation builder layer. In most cases, those binary, compiled components, written in VB or C++, run under the protective umbrella of COM+ and Microsoft Transaction Server (MTS). The only point of contact the components have with the rest of the system is the data types they return, often recordsets. This modularity provides a precious advantage for migration because you can leave the business logic unchanged and start upgrading ASPs to ASP.NET pages.
You design ASP.NET pages to take advantage of server controls and data binding, then simply call into the COM or COM+ objects and transform their output into ADO.NET objects. The data adapter object available in ADO.NET (starting with beta 2) can read from ADO recordsets and automatically populate a DataTable object. You take this object and bind it to a data-aware server control; you don't need to make any significant changes to the critical parts of the system. Best of all, you can build a new, more efficient UI using .NET's productivity tools.
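The bridging step can be sketched as follows. This assumes the .NET Framework OLE DB data adapter, whose Fill method accepts an ADO Recordset directly, plus an ADODB interop assembly; `GetCustomers()` is a hypothetical stand-in for your existing COM+ business object:

```csharp
// Sketch only: requires the .NET Framework System.Data.OleDb provider and
// an ADODB interop assembly. GetCustomers() represents your existing
// COM+ business object that returns an ADO recordset.
using System.Data;
using System.Data.OleDb;

class RecordsetBridge
{
    static DataTable ToDataTable(ADODB.Recordset rs)
    {
        var table = new DataTable("Customers");

        // OleDbDataAdapter.Fill accepts an ADO Recordset as its source and
        // copies the rows into the DataTable. Note that the adapter closes
        // the recordset when the Fill operation completes.
        new OleDbDataAdapter().Fill(table, rs);
        return table;
    }
}

// In the ASP.NET page, bind the result to a data-aware server control:
//   myDataGrid.DataSource = RecordsetBridge.ToDataTable(GetCustomers());
//   myDataGrid.DataBind();
```

The business tier keeps returning recordsets unchanged; only the presentation tier learns about DataTable objects and data binding.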