
WCF Proxies: To Cache or Not to Cache?

Managing Channel Factory and Channel Lifetime for WCF Applications

RELATED: "WCF and Client Proxies" and "WCF Service Instancing."

This month we'll explore managing the lifetime of WCF proxies in different architectural scenarios, and also cover related performance improvements released with .NET 3.0 SP1.

Calling a WCF service requires clients to construct a proxy, which really means constructing a channel factory and, ultimately, a strongly-typed channel for the service endpoint. Building the client's channel stack carries some overhead, which can impact multithreaded applications. This includes Windows Forms or WPF applications that present a user interface, and server-side applications that consume WCF services, such as ASP.NET pages and other WCF services. In this article I'll explore the appropriate optimizations for caching the channel factory or the entire channel, depending on the application architecture. In addition, I'll explain new features released with .NET 3.0 SP1 that now provide some built-in optimizations.


A Quick Note on .NET 3.0 SP1

.NET 3.0 SP1 was released during the same timeframe as Visual Studio 2008 and .NET 3.5 (November 2007). The service pack is actually a prerequisite to .NET 3.5 and, thus, is installed with .NET 3.5. You can also apply the service pack to your .NET 3.0 installations separately. A few assemblies are updated when the service pack is applied.

Here's a direct link to .NET 3.0 SP1 if you are not installing .NET 3.5. This also provides information on the core changes in the service pack under the KB Articles link.

There are several improvements in .NET 3.0 SP1, among them optimizations related to WCF proxies and lifetime management of the underlying channel factory. This is the feature I'll discuss in this article.


Channel Caching Options

Your options for channel caching in any application consist of the following:

  • No Caching: Create a new channel factory and channel for each call.
  • Channel Factory Caching: Cache the channel factory and use the same factory to create a new channel for each call.
  • Channel Caching: Cache the actual channel returned by the channel factory and use this for each call.

When you generate a proxy using SvcUtil, a class is generated that implements your service contract and inherits ClientBase. Consider the class shown in Figure 1 that implements the service contract ICounterServicePerCall.



[ServiceContract]
public interface ICounterServicePerCall
{
    [OperationContract]
    int IncrementCounter();
}

public partial class CounterServicePerCallClient :
    ClientBase<ICounterServicePerCall>,
    ICounterServicePerCall {...}

Figure 1: This class implements the service contract ICounterServicePerCall.

If you choose not to cache the proxy reference, creating a new proxy for each call, your code might look like this:

CounterServicePerCallClient proxy =
    new CounterServicePerCallClient(new NetTcpBinding(),
        new EndpointAddress("net.tcp://localhost:9000/PerCall"));



In .NET 3.0 (before SP1), this would create a new channel factory and channel each time. You can manually achieve the same result using ChannelFactory<T> instead of the generated proxy by calling the static CreateChannel method exposed by ChannelFactory<T> to create a new channel for each call:

ICounterServicePerCall proxy =
    ChannelFactory<ICounterServicePerCall>.CreateChannel(
        new NetTcpBinding(),
        new EndpointAddress("net.tcp://localhost:9000/PerCall"));

In both cases, you could opt to cache the returned proxy or channel reference to achieve channel caching, scoping the reference so the application can reuse it for multiple calls and possibly across multiple threads.

The only way to cache the channel factory separately from the channel before SP1 was to use ChannelFactory<T> to create the factory first, then explicitly use the cached factory to construct a new channel for each call (see Figure 2).

// scope the channel factory reference for the application's requirements
ChannelFactory<ICounterServicePerCall> factory =
    new ChannelFactory<ICounterServicePerCall>(new NetTcpBinding(),
        new EndpointAddress("net.tcp://localhost:9000/PerCall"));

// create the channel for each call
ICounterServicePerCall proxy = factory.CreateChannel();

// clean up the factory when the application no longer needs it
factory.Close();
Figure 2: Prior to SP1 the only way to cache the channel factory separate from the channel was to use ChannelFactory to create the factory, then explicitly use the cached factory to construct a new channel for each call.

Of course, you'd have to uniquely scope individual channel factory references if there are several (to ensure using the correct factory to create channels for each endpoint with which you communicate).

.NET 3.0 SP1 introduced a new most recently used (MRU) channel factory cache that is managed automatically as you use proxies that derive from ClientBase. This makes it possible to cache the channel factory while using a generated proxy, without having to write any caching code yourself. In fact, your code could construct a new proxy instance for each call without caching the reference; when the proxy is constructed again, the appropriate channel factory for the new proxy instance is used if a match is present in the MRU cache. I'll explore how this works in the next section, but the result is similar to what you might do if you manually cached the channel factory and used it to construct new channel instances for each call.
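As a sketch of this transparent reuse (the endpoint configuration name "netTcp" is hypothetical, assumed to be defined in the client's configuration file):

```csharp
// With .NET 3.0 SP1, both constructions below resolve to the same cached
// channel factory; only the channel itself is rebuilt for the second proxy.
CounterServicePerCallClient proxy1 = new CounterServicePerCallClient("netTcp");
proxy1.IncrementCounter();
proxy1.Close();

CounterServicePerCallClient proxy2 = new CounterServicePerCallClient("netTcp");
proxy2.IncrementCounter(); // channel factory reused from the MRU cache
proxy2.Close();
```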

Once you have a full grasp of the channel caching options available, you must think through what is appropriate for your application. In other words:

  • When is caching appropriate?
  • What level of caching is appropriate?

To answer these questions, one must always consider the application architecture. I'll provide some guidance on this based on Windows client- and server-side application scenarios in forthcoming sections.


Channel Factory Caching in SP1

The new channel caching feature released with .NET 3.0 SP1 is transparent to developers in the sense that channel factory caching in the MRU cache is handled automatically by ClientBase. You construct your proxy, which derives from ClientBase in the usual way. During construction, ClientBase checks to see if a channel factory has already been cached for the endpoint you are calling. If so, it will use that channel factory to construct the inner channel for the proxy. If not, it will construct a new channel factory for that endpoint, cache it for future use, then construct the inner channel for the proxy.

That's the high-level perspective, but there are some important details to consider in this process:

  • The MRU cache is a static member and lives in the application domain. If the application domain is recycled, the MRU cache will be cleared and the process of caching begins again.
  • The MRU cache is limited to 32 entries, thus less frequently used channel factories may be removed if the list grows beyond this for a particular application domain.
  • MRU caching behavior varies based on your choice of proxy constructor and other properties set on the proxy prior to opening the channel. It is very important to use the correct constructor if you want to leverage the MRU cache.
  • Entries in the MRU cache are keyed according to specific properties of the proxy as set by the constructor. It is important to understand which properties are used to key channel factories stored in the MRU cache.

There are several constructors supplied by ClientBase (see Figure 3).

// non-duplex constructors
protected ClientBase();
protected ClientBase(string endpointConfigurationName);
protected ClientBase(Binding binding, EndpointAddress remoteAddress);
protected ClientBase(string endpointConfigurationName, EndpointAddress remoteAddress);
protected ClientBase(string endpointConfigurationName, string remoteAddress);

// duplex constructors
protected ClientBase(InstanceContext callbackInstance);
protected ClientBase(InstanceContext callbackInstance, string endpointConfigurationName);
protected ClientBase(InstanceContext callbackInstance, Binding binding,
    EndpointAddress remoteAddress);
protected ClientBase(InstanceContext callbackInstance, string endpointConfigurationName,
    EndpointAddress remoteAddress);
protected ClientBase(InstanceContext callbackInstance, string endpointConfigurationName,
    string remoteAddress);

Figure 3: Constructors supplied by ClientBase.


Parameters to these constructors are used to either cache a new channel factory or find an existing channel factory reference. Constructor parameters used to cache the channel factory are: endpointConfigurationName, remoteAddress, and callbackInstance (the latter being relevant to DuplexClientBase proxies only). Constructors that accept a Binding parameter cannot be used to cache the channel factory, so if you construct your bindings programmatically, you can't take advantage of the MRU cache.

Aside from constructor choice affecting the use of the MRU cache, accessing certain properties of the ClientBase reference will also cause a matching MRU cache entry to be released. Specifically, the matching MRU cache entry is removed if you access the ChannelFactory, ClientCredentials, or Endpoint property of the proxy reference.

Channel factory references in the MRU cache are considered a unique match to a particular proxy reference based on the following:

  • The name of the configuration setting used to construct the proxy must match. If not provided, this value will be "*" internally, which means the proxy was using the default client endpoint in the configuration section.
  • The same remote address must be used, which means any parameters passed to initialize the EndpointAddress of the proxy, such as the address, listen URI, and other properties of the EndpointAddress type.
  • The same callback reference must be used for duplex proxies. That means passing the same reference, not a duplicate instance of the same callback type.
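To illustrate, consider this sketch (the endpoint configuration name "netTcp" is hypothetical): the first two constructions can share a single cached factory, while the third opts itself out by touching ClientCredentials:

```csharp
// Same configuration name and default address: eligible to share one
// MRU cache entry.
CounterServicePerCallClient proxyA = new CounterServicePerCallClient("netTcp");
CounterServicePerCallClient proxyB = new CounterServicePerCallClient("netTcp");

// Accessing ChannelFactory, ClientCredentials, or Endpoint removes the
// matching entry, so this proxy's factory is not served from the cache.
CounterServicePerCallClient proxyC = new CounterServicePerCallClient("netTcp");
proxyC.ClientCredentials.UserName.UserName = "user";
```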

Because the use of the MRU cache is transparent to the developer, there is very little reason not to leverage it when you can, particularly for Windows client applications. However, there are definitely some scenarios where this approach for channel factory caching is not practical or even possible. The next sections will look at some application scenarios and recommended approaches.


Windows Clients and Channel Caching

Windows clients built with Windows Forms or WPF typically create a proxy and cache it for the lifetime of the application. Figure 4 illustrates a client application that communicates with two services, Service A and Service B. Two proxies are constructed when the application begins; each is used for the lifetime of the application to communicate with their respective service endpoints. Because the services use PerCall instancing, each call from the client gets its own service instance, but a single channel factory and channel are constructed at the client to invoke service operations.

Figure 4: Channel caching for the lifetime of a Windows application.

Figure 4 assumes that you are caching the proxy reference that includes both the channel factory and channel. This would be equivalent to using ChannelFactory and caching the channel reference from CreateChannel.

If the channel has a transport session (named pipes, TCP, or HTTP with reliable sessions or secure sessions), it is possible for the channel to become faulted, possibly from a timeout or an uncaught exception. In this case, a new channel must be created at the client. With SP1, if the proxy is constructed with parameter values that match a channel factory in the MRU cache (see Figure 5), the overhead of recreating the client channel is reduced.
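A common recovery pattern can be sketched as follows (the endpoint configuration name "netTcp" is hypothetical); with SP1, the matching constructor parameters let ClientBase pull the existing factory from the MRU cache:

```csharp
if (proxy.State == CommunicationState.Faulted)
{
    // A faulted channel cannot be closed gracefully; abort to release it.
    proxy.Abort();

    // Reconstruct the proxy from configuration. Because the constructor
    // parameters match the cached entry, only the channel stack is rebuilt.
    proxy = new CounterServicePerCallClient("netTcp");
}
```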

Figure 5: Leveraging the MRU cache to construct a new proxy.

If the client application is multithreaded, channel caching is an absolute necessity, particularly if each thread is updating user interface components. If each thread constructs a new proxy to communicate with services, UI updates can become unbearably slow. If there are enough UI threads, even the MRU cache is not sufficient to improve this perception to end users. In this case, channel caching is the best possible solution where all threads share the same proxy or channel reference, as shown in Figure 6. Certainly, if something causes the channel to fault, the presence of the MRU cache is still useful for optimizing channel recreation.

Figure 6: Multiple threads sharing the same proxy reference.

Remember that in the presence of a transport session, even PerCall services must be set for ConcurrencyMode.Multiple to allow a single proxy to send multiple concurrent requests to the same channel.
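On the service side, that combination might look like this (the service class shown is a hypothetical implementation of the contract from Figure 1):

```csharp
[ServiceBehavior(InstanceContextMode = InstanceContextMode.PerCall,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class CounterServicePerCall : ICounterServicePerCall
{
    private int m_counter;

    public int IncrementCounter()
    {
        // PerCall instancing gives each call a fresh instance, while
        // Multiple concurrency permits concurrent calls arriving over a
        // single shared client channel.
        return ++m_counter;
    }
}
```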

There is at least one clear case where a Windows client application will not be able to leverage the MRU caching feature to optimize channel recreation. Because accessing the ClientCredentials property removes the matching MRU cache entry for the proxy, clients that supply credentials collected at run time will never use the MRU cache. For example, the following code sets UserName credentials, a typical requirement of Internet-facing WCF services:

CounterServicePerCallClient proxy =
    new CounterServicePerCallClient("netTcp2",
        new EndpointAddress("net.tcp://localhost:9000/PerCall"));
proxy.ClientCredentials.UserName.UserName = "user";
proxy.ClientCredentials.UserName.Password = "password";



As soon as the ClientCredentials property is accessed, previously cached MRU entries for the same configuration name and address are removed from the cache. Fortunately, Windows clients benefit the most from channel caching. The MRU cache, although useful in the case where channel recreation is necessary, will not make a significant difference unless the channel is consistently faulted across multiple client threads that rely on the channel.

To improve channel recreation performance you can create the channel factory directly using ChannelFactory<T>, set its Credentials property, and implement your own caching mechanism for the factory in the event channel recreation is required. The following code illustrates how to set credentials on the ChannelFactory<T> reference:

ChannelFactory<ICounterServicePerCall> factory =
    new ChannelFactory<ICounterServicePerCall>(new NetTcpBinding(),
        new EndpointAddress("net.tcp://localhost:9000/PerCall"));
factory.Credentials.UserName.UserName = "user";
factory.Credentials.UserName.Password = "password";

Figure 7 summarizes the typical usage of each caching option for Windows clients.

  • No Caching: Useful for one-off scenarios in a single-threaded client.
  • Channel Factory Caching using ChannelFactory<T>: Useful to optimize channel recreation when channel factory features such as credentials and binding configurations must be set at run time.
  • Channel Factory Caching using MRU Cache: Useful to optimize channel recreation when channel factory features are not set at run time. The need to set run-time credentials often reduces the value of this option.
  • Channel Caching using ChannelFactory<T> or ClientBase: The best option for multithreaded applications with frequent UI updates. ChannelFactory<T> should be used to optimize channel factory caching if run-time settings are required.

Figure 7: Windows client channel caching options and typical usage.


Server-side Clients and Channel Caching

Server-side clients are a much different animal from Windows applications when it comes to practical options for caching channel factories and channels. Figure 8 illustrates the architecture where multiple concurrent threads from an ASP.NET application or WCF service hosted in the Web server tier call a WCF service hosted in the application server tier.

Figure 8: Server-side clients consuming a WCF service from multiple concurrent threads.

This scenario is different from the Windows client scenario in the following ways:

  • There is an expectation that there will always be multiple concurrent threads accessing the same page or service operation. Each thread will require access to a client channel to call downstream services.
  • Calls may originate from different application users, thus each thread may need to pass information about that specific user to downstream services by setting channel credentials or custom application headers.
  • Application servers will likely be load balanced, which means that each thread should not have affinity to a particular server machine.

The characteristics of the server-side scenario limit your practical options for channel caching.

Caching a proxy or channel reference is simply not an option if you want to load balance calls from the Web server tier to the application server tier, unless the protocol used is HTTP without sessions (BasicHttpBinding). The typical protocol between Web and application servers, however, is TCP (NetTcpBinding), so this exception rarely applies. If you cache the channel, server affinity is established, which can crush your scalability goals for the application as a whole, even if it seems to improve performance when you test with only a few concurrent users.

Caching the channel factory can be a viable option to improve performance while still allowing the application to distribute calls to load-balanced servers, because a new channel is created for each thread. Caching the channel factory in a server-side scenario is unlikely to be able to leverage the MRU cache supplied by ClientBase, for two reasons: binding configurations are likely to be pulled from a common database entry to facilitate server-farm deployments, and credentials are likely to be set at run time on the proxy for each call.

Using ChannelFactory<T> you can still achieve channel factory caching with your own custom MRU cache. This still implies an important restriction: calls to the same service endpoint that share the channel factory must also share the same credentials. That means you can't pass different credentials for each thread calling application services from the Web server tier. One scenario where this is not an issue is if you use the same certificate or Windows credential to authenticate to downstream services. In this case, if you need to pass information about the authenticated user, you can use custom headers rather than a security token.
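A minimal sketch of such a custom factory cache, assuming all threads calling a given endpoint can share the same credentials (the class name FactoryCache and the credential values are hypothetical):

```csharp
using System.Collections.Generic;
using System.ServiceModel;

public static class FactoryCache
{
    // One factory per endpoint address. ChannelFactory<T> is thread-safe,
    // and CreateChannel produces a new channel per call, so calls can
    // still be distributed across load-balanced application servers.
    private static readonly Dictionary<string, ChannelFactory<ICounterServicePerCall>>
        s_factories = new Dictionary<string, ChannelFactory<ICounterServicePerCall>>();
    private static readonly object s_lock = new object();

    public static ICounterServicePerCall CreateChannel(string address)
    {
        ChannelFactory<ICounterServicePerCall> factory;
        lock (s_lock)
        {
            if (!s_factories.TryGetValue(address, out factory))
            {
                factory = new ChannelFactory<ICounterServicePerCall>(
                    new NetTcpBinding(), new EndpointAddress(address));

                // Shared credentials must be set once, before the factory
                // creates its first channel.
                factory.Credentials.UserName.UserName = "user";
                factory.Credentials.UserName.Password = "password";
                s_factories.Add(address, factory);
            }
        }
        return factory.CreateChannel();
    }
}
```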

In most cases, server-side clients simply pay the price of constructing a new channel for each call, without any caching benefits. This satisfies the need to scale out and provide the required credentials for each call using appropriate security tokens instead of the custom header workaround. Performance may be slower if compared side-by-side with each caching approach, but the benefits of horizontal scaling outweigh this concern. In most cases a multithreaded server application can withstand two to three WCF service calls without caching the channel factory or channel and still achieve performance benchmarks required by the application for the individual call. The overhead of the WCF plumbing is usually shadowed by the overhead of the application functionality. You should always benchmark your applications to establish an acceptable average time to complete each call. Figure 9 summarizes the typical usage of each caching option for server-side clients.

  • No Caching: Often necessary due to unique credential requirements for each thread.
  • Channel Factory Caching using ChannelFactory<T>: Useful to optimize channel creation for multiple threads that can share the same channel factory features, such as credentials and binding configurations.
  • Channel Factory Caching using MRU Cache: Not useful, because most server-side scenarios rely on run-time binding configurations and credential settings. Could be useful if cached in the context of a session.
  • Channel Caching using ChannelFactory<T> or ClientBase: Not useful, because most server-side scenarios require calls to be load balanced.

Figure 9: Server-side client channel caching options and typical usage.



After reading this article you should have a better understanding not only of the possible channel caching options available to you, but of what is practical to apply in Windows client and server-side client scenarios. While the MRU cache is a useful new feature, there are still cases where raw ChannelFactory caching will be necessary. In addition, don't be overly concerned if you can't cache your channel factory or channel in server-side scenarios, because the goal there should be overall scalability. It is when a user interface is involved, as with Windows clients, that caching becomes much more important to perceived performance, in particular with multithreaded clients.

Download the samples for this article at


Michele Leroux Bustamante is Chief Architect of IDesign Inc., Microsoft Regional Director for San Diego, and Microsoft MVP for Connected Systems. At IDesign Michele provides training, mentoring, and high-end architecture consulting services focusing on Web services, scalable and secure architecture design for .NET, federated security scenarios, Web services, and interoperability and globalization architecture. She is a member of the International .NET Speakers Association (INETA), a frequent conference presenter, conference chair for SD West, and is frequently published in several major technology journals. Michele also is on the board of directors for IASA (International Association of Software Architects), and a Program Advisor to UCSD Extension. Her latest book is Learning WCF (O'Reilly, 2007); see her book blog at Reach her at mailto:[email protected], or visit and her main blog at


Additional Resources

Wenlong Dong has a fantastic blog entry with additional details about the MRU cache:

Learning WCF (O'Reilly, 2007): (get all the book code here!)

Michele s blog:

