
.NET 4.0: A Framework All Grown Up or Just Getting Started?

New elements like parallelism make the .NET Framework 4.0 relevant to the future of software development

On April 12, 2010, Microsoft officially released .NET 4.0 to the public. In many ways, this version of the framework represents a mature product. But there are also signs of the future of .NET in this release: New directions in development are being implemented in .NET 4.0. Arguably the most important of these is parallelism.

To me, the key elements of the maturity of .NET 4.0 are additions like Client Profile and Side-by-Side Execution.

As frameworks age, they get big. Backward compatibility becomes the enemy of agility as old code is left in place for compatibility and new code gets added to create new functionality. Eventually the framework is so bloated with stuff you don't need that its size becomes an obstacle to adoption.

The .NET Framework Client Profile breaks down the .NET Framework into a package of features that a client needs, like WPF and MEF, leaving behind the things it doesn't need, like ASP.NET and MSBuild. This lightens the load for client deployment and surfaces the idea that the framework is not necessarily a monolithic whole, but a collection of classes for specific capabilities.
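As a concrete sketch of how an application opts into the Client Profile, a desktop app can declare it in its app.config via the `sku` attribute of the `<supportedRuntime>` element (the SKU string below is the documented value for the 4.0 Client Profile):

```xml
<!-- app.config: target the .NET Framework 4 Client Profile
     rather than the full framework -->
<configuration>
  <startup>
    <supportedRuntime version="v4.0"
                      sku=".NETFramework,Version=v4.0,Profile=Client" />
  </startup>
</configuration>
```

Visual Studio 2010 sets this for you when you choose the Client Profile as the target framework in project properties.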

Side-by-Side Execution is another great example of solving problems only mature frameworks have—applications made up of different elements that actually depend on different versions of the framework. With .NET 4.0, an application can actually load and start different versions of the .NET Framework in the same process.
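One place this shows up in practice is hosting older mixed-mode components inside a 4.0 process. A sketch of the configuration, assuming an app that needs both runtimes available in-process:

```xml
<!-- app.config: allow a CLR 2.0-era mixed-mode assembly to load
     inside a process whose primary runtime is CLR 4.0 -->
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0" />
    <supportedRuntime version="v2.0.50727" />
  </startup>
</configuration>
```

Before 4.0, a single process could host only one version of the CLR, so a dependency compiled against an older runtime could force the whole application onto it.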

The combination of Client Profile and Side-by-Side Execution points to a future that effectively solves the framework bloat problem by allowing a mix-and-match solution: Only classes needed by the application need to be installed and loaded on the PC in question, and there are effectively no conflicts between versions of classes.

So .NET 4.0 not only reflects maturity, but a graceful maturity that is finding ways to deal with the ravages of software age. Meantime, significant new elements are being added to the framework in the 4.0 release.

The addition of the Dynamic Language Runtime (DLR) is bringing a new generation of dynamic languages and language behavior to the Common Language Runtime (CLR). It is the DLR that lets languages like Python and Ruby become first-class citizens in .NET. Also, you're seeing the addition of dynamic objects to C# and Visual Basic.
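A minimal sketch of what the `dynamic` keyword buys you in C# 4.0: member lookup is deferred to run time through the DLR, so objects like `ExpandoObject` can grow members on the fly (the names `person`, `Name`, and `Greet` here are purely illustrative):

```csharp
using System;
using System.Dynamic;

class Program
{
    static void Main()
    {
        // With 'dynamic', the compiler defers member resolution to the DLR.
        dynamic person = new ExpandoObject();
        person.Name = "Ada";                                  // member added at run time
        person.Greet = (Func<string>)(() => "Hello, " + person.Name);

        // Both of these calls are bound at run time, not compile time.
        Console.WriteLine(person.Greet());
    }
}
```

The same run-time binding machinery is what lets IronPython and IronRuby objects flow into C# code as first-class values.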

The potential of dynamic languages in .NET is only beginning to be realized, and it is huge. But even that pales next to what is coming with parallelism.

Ultimately, the vision of software development must reflect the reality of hardware. The .NET Framework in its early days reflected a change in hardware: Memory had become so large and fast that it no longer made sense to carefully hand-craft code to manage memory. With .NET, memory management became part of the "plumbing" of your application. You declared variables and let them fall out of scope. All the memory management of the variables was handled by the framework.


The .NET Framework reflected the reality of PC hardware in 2000. Now it is changing to reflect the PC hardware of 2010 and beyond. Long gone are the days when the clock rate of processors doubled every couple of years. Instead, the solution to increasing performance is increasing the number of cores in the processor.

And this increase is about to jump dramatically.

Dual core and quad core processors are already common. What's coming over the next few years are PCs with 8, 16, 32, 64, even 256 cores or more. How will developers cope with this sort of hardware? It is reasonable for a developer to write code that can deliberately harness two or even four cores, and in a few specialized cases more. But it's crazy to think that we as developers can effectively write code for 256 cores ourselves. It's time for the tools to take over threading, the same way the tools took over memory management.

With tools like the Task Parallel Library (TPL), PLINQ, and the data structures for parallel programming in .NET 4.0, you can start to see a future where the developer is no longer responsible for declaring threads. In fact, in short order a developer declaring a thread will be considered a mistake. Instead, developers will be writing code that tells these tools what tasks they wish executed and what dependencies exist within those tasks—and the tools will sort out how many threads are actually involved in executing the tasks. Parallelism in .NET is soon to be something that just happens, much the same way that memory management just happens in .NET.
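A minimal sketch of that declare-the-work, not-the-threads style, using `Parallel.For` from the TPL and a PLINQ query (the loop body and the sum-of-squares query are just illustrative workloads):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        // TPL: describe the iterations; the runtime decides how many
        // threads to use and how to partition the work across cores.
        Parallel.For(0, 4, i =>
        {
            Console.WriteLine("processing chunk " + i);
        });

        // PLINQ: one AsParallel() call opts a LINQ query into
        // parallel execution; no explicit thread appears anywhere.
        long sumOfSquares = Enumerable.Range(1, 1000)
                                      .AsParallel()
                                      .Select(n => (long)n * n)
                                      .Sum();

        Console.WriteLine(sumOfSquares); // 333833500
    }
}
```

Note that nothing in this code names a thread, a lock, or a core count; the degree of parallelism is left to the runtime, which is exactly the division of labor the article describes.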

Much like the first versions of .NET, the early code will not be as simple or easy as we'd like. But it is the beginning, and things will get better over time. The movement to parallelism isn't a one-release feature: It represents a fundamental shift in how we write code.

Part of what makes multi-threaded, parallel development so challenging is that our existing tools weren't designed for it. Object-oriented development, with its heavy reliance on mutable state, struggles when code executes simultaneously. That's only fair: object-orientation, and the CLR built around it, were designed for one very fast processor, not the many-core world we're moving into today. The resurgence of functional languages is in part an effort to make parallel development easier.

Functional languages, depending on their implementation, of course, tend to restrict the mutability of state in a way that makes parallel execution easier. I see F# in Visual Studio 2010 as a move toward languages that can deal naturally with a world of massive parallel execution. In .NET 4.0, F# has no special parallel-execution abilities, but the groundwork is there for a future release.

We're in the early stages of development evolution toward massive parallelism. But in .NET 4.0, Microsoft has laid the foundation for a future where the average developer will harness dozens, if not hundreds, of CPU cores in their application. It will be .NET tools that make this not only possible, but virtually transparent to the developer, the same way that .NET has made memory management invisible to the developer.

When it comes to parallelism, the .NET Framework is just getting started.
