Friday, February 26, 2010

WCF Service with Multiple Site Bindings: automatic multiplexing

I’ve previously posted an article where I provide a solution for dealing with the limitation of hosting a WCF Service on an IIS site with multiple site bindings. Unfortunately such solution requires manually changing the configuration file and is not really portable, so needs to be kept in synch with any changes made to the site bindings themselves.

So I recently had an idea for a smarter implementation of a similar idea that would be much more portable and require no manual intervention when applied to different deployment environments. This implementation is also based on a custom ServiceHostFactory but doesn’t require any changes to your configuration file, instead it will figure out the adjustments and fix-up steps that need to be made do the service’s endpoint at runtime, based on the deployment environment so you don’t have to do that. Based on my previous post, it’s clear that the problem with having multiple site bindings is that WCF doesn’t know how to create physical absolute addresses for endpoints that have relative addresses. That’s not a very hard problem to solve, all we need to do is make sure that all endpoints have reasonable absolute addresses and we’re good to go. But the WCF hooks that are available to us to inject that code, are called too late in the game when all addresses have already been converted to absolute addresses or, a failure to do so (the notorious “This collection already contains an address with scheme http...") has already occurred.

The basic approach that I came up with is to make utilitarian calls to CreateServiceHost with some specially massaged inputs such that I can discover the form of the original address by inspecting the results. I then dispose of the ServiceHost (which I never open) and go ahead and build the real one and then I replace all the endpoints with ones that have the addresses I computed base on the information I gathered. Here’s some more detail on how this:

When the CreateServiceHost is invoked by the WCF runtime, the ServiceHostFactory will inspect the list of base addresses it gets passed. It will then go through the list and detect if any Uri scheme has more than a single base address in the list. If none are found, we have nothing to do and can call the base implementation. If, however, it finds one it will create a dummy address for that scheme and replace any additional base addresses with that dummy address, otherwise it will just keep the base address as-is. So, if you had the following 4 base addresses:

http://foo.com:8080/
http://localhost/
http://10.0.0.1:8181/
net.tcp://localhost/

this will result in the following list of 2 base addresses, a real one and a dummy one:

http://i-am-a-dummy-host/
net.tcp://localhost/

The code for this looks more or less like this:

// count base addresses per scheme and create a filtered collection where we skip all but the 1st address for each scheme
Dictionary<string, List<Uri>> allBaseAddresses = new Dictionary<string, List<Uri>>();
Dictionary<string, Uri> dummyBaseAddresses = new Dictionary<string, Uri>();
List<Uri> filteredBaseAddresses = new List<Uri>();
bool filtered = false;
foreach (Uri address in baseAddresses)
{
    if (!allBaseAddresses.ContainsKey(address.Scheme))
    {
        filteredBaseAddresses.Add(address);
        dummyBaseAddresses.Add(address.Scheme, address);
        allBaseAddresses.Add(address.Scheme, new List<Uri>());
    }
    else
    {
        // an address for this scheme was already added, that means we're going to filter them out
        UriBuilder ub = new UriBuilder(address.Scheme, MbaServiceHostFactory.DummyHost);
        dummyBaseAddresses[address.Scheme] = ub.Uri;
        filtered = true;
    }
    allBaseAddresses[address.Scheme].Add(address);
}
if (!filtered)
{
    // if no filtering occured we don't need to perform fixups, so call the base implementation
    return base.CreateServiceHost(constructorString, baseAddresses);
}

With the resulting collection, we now call the base CreateServiceHost implementation with the sole purpose of building the ServiceDescription which contains a collection of ServiceEndpoint instances. The code for this is trivial, it looks more or less like this:

// if filtering occured, we need to multiplex some of the endpoints, to find out which ones
// we create a temporary ServiceHost and feed it the dummy base addresses we just computed
ServiceHostBase dummyService = base.CreateServiceHost(constructorString, dummyBaseAddresses.Values.ToArray());

We now go through the list of ServiceEndpoint and whenever we hit an endpoint that has an address containing the dummy address, we are sure that this endpoint had a relative address in configuration so we remove this endpoint, reverse engineer its original relative address and replace it with a collection of endpoints that are all equal, except for their addresses. Such addresses are the result of combining the relative address with all the base addresses (which we remember from our initial inspection) that have a matching scheme. So, if we found 2 endpoints with the following addresses:

http://i-am-a-dummy-host/Service1.svc
http://i-am-a-dummy-host/Service1.svc/mex

we’d replace them with the following 6:

http://foo.com:8080/Service1.svc
http://localhost/Service1.svc
http://10.0.0.1:8181/Service1.svc
http://foo.com:8080/Service1.svc/mex
http://localhost/Service1.svc/mex
http://10.0.0.1:8181/Service1.svc/mex

All other endpoints remain as-is, and since at this point all endpoints have real absolute addresses, we’re ready for Open() and can just return this ServiceHost to WCF. The code for this is a little complex, I’ve decided to break it down in two phases, one where I just divide these endpoints in the two classes I’ve just described:

// now we simply go over the endpoints and figure out which ones need to be multiplexed by
// checking if they were bound to any of the dummy base addresses
List<ServiceEndpoint> singleEndpoints = new List<ServiceEndpoint>();
List<ServiceEndpoint> multiplexEndpoints = new List<ServiceEndpoint>();
foreach (ServiceEndpoint endpoint in dummyService.Description.Endpoints)
{
    List<Uri> bas = allBaseAddresses[endpoint.Binding.Scheme];
    if (bas != null && bas.Count > 1)
    {
        if (endpoint.ListenUri != null)
        {
            if (endpoint.ListenUri.Host == MbaServiceHostFactory.DummyHost)
            {
                multiplexEndpoints.Add(endpoint);
                continue;
            }
        }
        else if (endpoint.Address != null && endpoint.Address.Uri.Host == MbaServiceHostFactory.DummyHost)
        {
            multiplexEndpoints.Add(endpoint);
            continue;
        }
    }
    singleEndpoints.Add(endpoint);
}

The second phase, is preceded by the creation of the actual ServiceHost and clearing of the endpoints from its Description, this is trivial:

// now we can create the real ServiceHost, but we'll use the
// filteredBaseAddresses to guarantee exactly one address per scheme
ServiceHostBase service = base.CreateServiceHost(constructorString, filteredBaseAddresses.ToArray());
// for simplicity we'll completely rewrite the endpoint collection // so we don't have to break enumeration everytime we change the collection service.Description.Endpoints.Clear();

We then rebuild the collection of endpoints from scratch by simply adding the ones that required no change:

// add the endpoints that need no modification
foreach (ServiceEndpoint endpoint in singleEndpoints)
{
    service.Description.Endpoints.Add(endpoint);
}

And now, for each of the ones in the other set, we create multiple endpoints for each base addresses we had found with a matching scheme:

// multiplex the endpoints that correspond to multiple base addresses
foreach (ServiceEndpoint endpoint in multiplexEndpoints)
{
    List<Uri> bas = allBaseAddresses[endpoint.Binding.Scheme];
    int count = 0;
    foreach (Uri ba in bas)
    {
        ServiceEndpoint copy = new ServiceEndpoint(endpoint.Contract)
        {
            Address = endpoint.Address,
            Binding = endpoint.Binding,
            ListenUri = endpoint.ListenUri,
            ListenUriMode = endpoint.ListenUriMode,
            Name = endpoint.Name + "_" + count.ToString()
        };
        count++;
        foreach (IEndpointBehavior eb in endpoint.Behaviors)
        {
            copy.Behaviors.Add(eb);
        }
        if (endpoint.Address != null && endpoint.Address.Uri.Host == MbaServiceHostFactory.DummyHost)
        {
            EndpointAddressBuilder endpointAddressBuilder = new EndpointAddressBuilder(endpoint.Address);
            endpointAddressBuilder.Uri = MbaServiceHostFactory.GetNormalizedUri(ba, endpoint.Address.Uri.PathAndQuery);
            copy.Address = endpointAddressBuilder.ToEndpointAddress();
        }
        if (endpoint.ListenUri != null && endpoint.ListenUri.Host == MbaServiceHostFactory.DummyHost)
        {
            copy.ListenUri = MbaServiceHostFactory.GetNormalizedUri(ba, endpoint.ListenUri.PathAndQuery); ;
        }
        // make sure we're not re-adding an endpoint that already exists because it had an absolute Uri in config
        if (!singleEndpoints.Exists(se => se.Address.Uri == copy.Address.Uri))
        {
            service.Description.Endpoints.Add(copy);
        }
    }
}

For brevity I’ve left out the helper method (GetNormalizedUri) that this uses to build the absolute addresses that you can find in the solution code. All you need to do now is use this as your factory in your .svc file and you’re good to go. In the provided code, I’ve called this factory MbaServiceHostFactory (Mba: Multiple Base Addresses) and I’ve also implemented a MbaServiceHostFactory<T> for cases in which you want the base implementation to be something other than ServiceHostFactory, so if you want a WebServiceHostFactory as your base behavior you can either change MbaServiceHostFactory to inherit from it, or use MbaServiceHostFactory<WebServiceHostFactory> instead.

The WCF team is building an out of the box solution for this issue in the .NET 4.0 release, but if for whatever reason you can’t upgrade, I strongly recommend using this approach.

You can download the sources here.

Friday, February 5, 2010

Wire-first testing WebApiEnabled applications using in-proc hosting and HttpClient (2)

Part 2: using in-proc hosting

In this post I describe how to host your web application/service in the test infrastructure for wire-first testing.
I’ve discussed the client side of the world in part 1 of the post.

In the previous part of this post we didn’t go into the details of hosting the applications and services that we set out to test. In fact there are two main issues with the approach we described: 1) it doesn’t help us start/stop the service when we do a test run 2) when we want to debug our tests, we are debugging the HttpClient code, not the server side code which is what we are trying to test (BTW, this also breaks code coverage). This post describes an approach that addresses these two issues.

Some services framework, such as WCF, allow you to host services in your own process. You can easily write tests for such a service using a Visual Studio Test Project: I usually would have a TestClass with an AssemblyInitialize method where I start the service and and AssemblyCleanup method where I stop the service. This would look more or less like this:

[TestClass]
public class TestCommon {
    static ServiceHost service;

    [AssemblyInitialize]
    public static void AssemblyInit(TestContext context) {
        service = new WebServiceHost(typeof(Service), new Uri("http://localhost/Temporary_Listen_Address/Service"));
        service.Open();
    }

    [AssemblyCleanup]
    public static void AssemblyCleanup() {
        service.Close();
    }
}

Very often, though, your service in production will be hosted in IIS/ASP.NET which makes the approach I just described less desirable as the test environment would be farther away from your production environment, and if you actually have a hard dependency on ASP.NET like you would have if you’re using WebApiEnabled (or in WCF if your AspNetCompatibilityRequirementsMode is set to Required), the approach will not work at all.

While hosting your service on Cassini or on the local IIS server will bring you much closer to the production environment, it makes testing more tedious because you might have to manually deploy to and start/stop those hosts manually, it also makes debugging very time consuming as you’ll have to manually attach to different processes.

The approach I describe below alleviates this problem and, while not being perfect, strikes a good balance of fidelity, in terms of emulating the production environment, and convenience, in terms of fast, easy and integrated the experience is for running and debugging tests. The key to the proposed approach is to host ASP.NET in the same process as the one that is used to run the tests and to hook the start/stop in the same way as we did above for WCF.

Not everyone knows that the Cassini ASP.NET host that Visual Studio uses, is a reusable component that can also run outside of Visual Studio itself, all we need is to write some infrastructure code to hook it all of this together. We will start by adding a reference to one of the Cassini assemblies in the test project. In Visual Studio 2008 SP1, the assembly is called WebDev.WebHost.dll and while it is in the GAC, I doesn’t show up in Visual Studio’s list of assembly references by default. While there might be a way to change that, I just add it manually by opening the .csproj file and adding the following to the list of references:

<Reference Include="WebDev.WebHost" />

We can now use an approach similar the one I describe above for WCF. Instead of using the ServiceHost class, we will be using the Server class in the Microsoft.VisualStudio.WebHost namespace.

[TestClass]
public class TestCommon {
    public static int AspPort = 59341; // just pick one 
    public static string AspVirtualPath = "/MyMvcApp";
    public static string AspBaseAddress = "http://localhost:" + AspPort + AspVirtualPath + "/";
    static Server server;

    [AssemblyInitialize]
    public static void AssemblyInit(TestContext context) {
        server = new Server(TestCommon.AspPort, TestCommon.AspVirtualPath, hostedRoot, false, false);
        server.Start();
    }

    [AssemblyCleanup]
    public static void AssemblyCleanup() {
        server.Stop();
    }
}

Now we can also change our HttpClient to use a matching addess, so these are always in sync in case we decide to change them. In fact we normally just have the client be a member of the TestClass, so all TestMethods can share it w/out having to create one each time:

HttpClient client = new HttpClient(TestCommon.AspBaseAddress);

There is, however, extra complexity that has to do with the fact that ASP .NET needs a physical directory from which to load files, so we need to write code to “find it”. In fact, you might have observed that i am using a string hostedRoot that I didn’t even define, here’s the code that I use to compute that:

// make relative to where TestResults are dropped
string hostedRoot = Path.Combine(context.TestDir, @"..\..");
// make relative to where the MvcRestTest folder is located
hostedRoot = Path.Combine(hostedRoot, @"/* web app project folder */");
hostedRoot = new DirectoryInfo(hostedRoot).FullName;

I certainly am not happy about how it depends on the physical layout of the folders in your solution, but it’s the best I could come up with.

Unfortunately the complexity is not over yet: in order to make code coverage work, we will need to write code to propagate the instrumented binaries that Visual Studio copies to the TestResults folder, to the bin folder under the web application project folder, the code looks like:

string hostedLocation = Path.Combine(hostedRoot, "bin");
string location = Path.GetDirectoryName(typeof(TestCommon).Assembly.Location);
TestCommon.MergeBinaries(location, hostedLocation);

Where the (pretty naive) implementation for MergeBinaries, looks like this:

static void MergeBinaries(string location, string hostedLocation) {
    List<string> sources = new List<string>();
    sources.AddRange(Directory.GetFiles(location, "*.dll"));
    sources.AddRange(Directory.GetFiles(location, "*.exe"));
    sources.AddRange(Directory.GetFiles(location, "*.pdb"));
    foreach (string source in sources) {
        string destination = Path.Combine(hostedLocation, Path.GetFileName(source));
        bool overwrite = false;
        if (File.Exists(destination)) {
            FileInfo s = new FileInfo(source);
            FileInfo d = new FileInfo(destination);
            if (s.CreationTime <= d.CreationTime) {
                continue;
            }
            overwrite = true;
        }
        File.Copy(source, destination, overwrite);
    }
}

That’s pretty much it, now we can hit Run Test, Debug Test, look at code coverage and it should all work and we should get a very nice integrated experience.

In AssemblyInit you can also add checks for pre-requisites of your tests, so if anything wrong is detected, you can fail early and avoid getting bogus results. For example I add a check that verifies that my mdf and ldf files exist in the expected location and that they are writeable, otherwise when I run tests that update the data in the database, they would fail and I would be left wondering whether I just introduced a regression.

P.S.: though I say applications in the title, you are much more likely to do wire-first testing for services. That said, keep in mind that this approach will work for applications too. Also, a lot of what I discuss here is actually applicable to a larger class of services, beyond ones that are built using WebApiEnabled, such as WCF SOAP, WCF REST and ASMX services.

Wire-first testing WebApiEnabled applications using in-proc hosting and HttpClient (1)

Part 1: using HttpClient

In this post I describe how you can use HttpClient to write wire-first tests.
I’ll be discussing the service side of the world in part 2 of the post.

So you’ve added WebApiEnabled to your controller class, your unit tests continue to work but how do you know that the web api is also working as expected? Also, since your web api sets out to be compatible with an heterogeneous array of web clients, how do you verify that the wire format is the one I expect?

The approach I suggest here is to make use of the HttpClient API that is available for download on Codeplex as part of the WCF REST Starter Kit Preview 2 (get it here). The HttpClient API is an expressive and clean API that lets you write tests that are very concise, readable and easy to write and maintain.

So, let’s assume your controller looks like so:

[WebApiEnabled]
public class MoviesController : Controller {
    static List<Movie> Movies = new List<Movie> {
        new Movie { Id = 1, Title = "La vita รจ bella", Director = "Roberto Benigni", DateReleased = DateTime.Parse("20/12/1997") }, 
        new Movie { Id = 2, Title = "Avatar", Director = "James Cameron", DateReleased = DateTime.Parse("18/12/2009") }, 
        new Movie { Id = 3, Title = "The Godfather", Director = "Francis Ford Coppola", DateReleased = DateTime.Parse("20/12/1997") } 
    };

    // GET: /Movies/Details/5 
    public ActionResult Details(int id) {
        var movieToDisplay = (from m in MoviesController.Movies
                              where m.Id == id
                              select m).FirstOrDefault();
        if (movieToDisplay == null) {
            throw new HttpException((int)HttpStatusCode.NotFound, "No movie matching '" + id + "'");
        }
        return View(movieToDisplay);
    }
}

And suppose you’re hosting this at the “http://localhost/MyMvcApp” address (a lot more about this in Part 2), here’s what a test that verifies that asking for a non existent movie gets you a 404 response looks like:

[TestMethod]
public void Movies404() {
    HttpClient client = new HttpClient("http://localhost/MyMvcApp");
    using (HttpResponseMessage response = client.Get("Movies/9999")) {
        Assert.AreEqual(HttpStatusCode.NotFound, response.StatusCode);
    }
}

Because HttpClient is very easy to extend via extension methods, you can add one that lets you easily specify an Accept header, like so:

public static class HttpClientExtensions {
    public static HttpResponseMessage Get(this HttpClient client, string uri, string acceptContentType) {
        HttpRequestMessage request = new HttpRequestMessage("GET", uri);
        request.Headers.Accept.AddString(acceptContentType);
        return client.Send(request);
    }
}

And you can start testing the multi-format behavior that WebApiEnabled provides out of the box:

[TestMethod]
public void Movies404Xml() {
    HttpClient client = new HttpClient("http://localhost/MyMvcApp");
    using (HttpResponseMessage response = client.Get("Movies/9999", "text/xml")) {
        Assert.AreEqual(HttpStatusCode.NotFound, response.StatusCode);
        Assert.AreEqual("application/xml; charset=utf-8", response.Content.ContentType);
    }
}

And, of course, you can write more complex test cases that test the behavior of multiple related requests. Here’s an example testing that you can add an item usng the json format and then check that it was added using the xml format. It makes 3 subsequent requests:

[TestMethod]
public void MoviesCrudJsonXml() {
    MoviesComparer moviesComparer = new MoviesComparer();
    List<Movie> originalMovieList;
    HttpClient client = new HttpClient("http://localhost/MyMvcApp");
    using (HttpResponseMessage response = client.Get("Movies", "application/xml")) {
        Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
        Assert.AreEqual("application/xml; charset=utf-8", response.Content.ContentType);
        originalMovieList = response.Content.ReadAsDataContract<List<Movie>>();
    }
    string director = "Nichols";
    DateTime dateReleased = new DateTime(1967, 12, 21);
    string title = "The Graduate";
    Movie movieToInsert = new Movie { Director = director, DateReleased = dateReleased, Title = title };
    using (HttpResponseMessage response = client.Post("Movies/Create", HttpContentExtensions.CreateJsonDataContract(movieToInsert))) {
        Assert.AreEqual(HttpStatusCode.Created, response.StatusCode);
        Assert.AreEqual("application/json; charset=utf-8", response.Content.ContentType);
    }
    List<Movie> updatedMovieList;
    using (HttpResponseMessage response = client.Get("Movies", "application/json")) {
        Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
        Assert.AreEqual("application/json; charset=utf-8", response.Content.ContentType);
        updatedMovieList = response.Content.ReadAsJsonDataContract<List<Movie>>();
    }
    Movie insertedMovie = updatedMovieList.Except(originalMovieList, moviesComparer).SingleOrDefault();
    Assert.IsTrue(moviesComparer.Equals(movieToInsert, insertedMovie));
}

which uses the following comparer for the Movie class:

public class MoviesComparer : EqualityComparer<Movie> {
    public override bool Equals(Movie x, Movie y) {
        return x.Director == y.Director && x.Title == y.Title && x.DateReleased == y.DateReleased;
    }
    public override int GetHashCode(Movie obj) {
        return obj.Director.GetHashCode() ^ obj.Title.GetHashCode() ^ obj.DateReleased.GetHashCode();
    }
}

The technique used here is in fact the same technique we ended up using to write tests while developing the WebApiEnabled support itself. In the near future we’re looking at making the tests available for you to download.

Getting source code from a source server from Mdbg

I deal mostly withy managed code, and while Visual Studio is a great environment, I often find myself in situations where a lightweight, command-line style debugger is all I really need to get to what i want. Mdbg is such a debugger, with a 4 MB working set, it often fires up in just under a second.

Mdbg has support for symbol server but not for source server, and when you’re dealing with 6 released versions and all kinds of service packs and qfes, you really don’t want to go spelunking to find the source code. Since Mdbg supports extensions, I decided to write an extension that could find the source automatically using the DbgHelp APIs implemented by DbgHelp.dll (docs available here). I decided to share the code and some of the experience hoping it will be useful to others.

I based my implementation on the sample posted by Mike Stall (available here), and I decided to call my command “ls” for LoadSource, so I started off creating a Class Library project, added references to MdbgCore.dll, and started off with the following skeleton:

[CommandDescription(CommandName = "ls",
                    ShortHelp = "Load Sources from source server",
                    MinimumAbbrev = 2,
                    LongHelp = @"
Usage: ls [frames]
    load sources using the SRC* source server")]
        public static void LoadSourceCmd(string arguments) {
        }

The support for symbol server works just as one would expect, so in my command I assume that the user has already taken care of getting the right symbols loaded so the path information is already available. The API in MdbgCore is fairly straightforward, I didn’t need docs to find what I needed, just a few hops using intellisense.

What I do is look at the Path in the SourcePostion of the CurrentFrame, look it up using the FileLocator, and if I find that the file is not available, I will go and fetch it using the source server. If the source server returns a location the value of fileLocation and the contents of the FileLocator will be updated. Regardless, I then execute the Show command:

MDbgFrame currentFrame = CommandBase.Debugger.Processes.Active.Threads.Active.CurrentFrame;
stringsourceFile = currentFrame.SourcePosition.Path;
stringfileLocation = CommandBase.Shell.FileLocator.GetFileLocation(sourceFile);
if (!File.Exists(fileLocation)) {
    fileLocation = GetPathFromSourceServer(arguments, currentFrame, sourceFile);
    if (File.Exists(fileLocation)) {
        CommandBase.Shell.FileLocator.Associate(sourceFile, fileLocation);
} } IMDbgCommand show; stringargs; CommandBase.Shell.Commands.ParseCommand("sh", outshow, outargs); show.Execute(string.Empty);

Since DbgHelp is a native code library, calling from managed required me to write a bunch of tedious P/Invoke code, I will share that code but won’t say much about it as there isn’t anything very interesting about it. I also removed most of the error checking from the snippets below to make it more readable, don’t think I do that in real life, the attached code should have plenty of error checking!

DbgHelp requires a initialization, so I addded a static initializer that I call at the beginning of the GetPathFromSourceServer method so I can initialize if I haven’t already. Before initializing one can also set options using the SymSetOptions API, but other than that it’s just a straight call to SymInitializeW to which i pass the handle of the current process:

IntPtr hProcess = Process.GetCurrentProcess().Handle;
Dbghelp.SymInitializeW(hProcess, null, false);

After that, we need to make sure DbgHelp has loaded the module we’re finding sources for. I added a static cache where I track which modules I already loaded so I don’t end up loading a module twice. So I first check my cache and if it comes back empty, I will load the module with a call to SymLoadModuleExW and save the result in my cache:

MDbgModule module = currentFrame.Function.Module;
string symbolFile = module.SymbolFilename;
ulong baseAddress;
CorModule corModule = module.CorModule;
baseAddress = (ulong)corModule.BaseAddress;
uint size = (uint)corModule.Size;
string moduleName = Path.GetFileNameWithoutExtension(sourceFile);
baseAddress = Dbghelp.SymLoadModuleExW(hProcess, IntPtr.Zero, symbolFile, moduleName, baseAddress, size, (Dbghelp.MODLOAD_DATA*)null, 0);
moduleCache.Add(symbolFile, baseAddress);

At this point there really isn’t much left for us to do, our last API call is the real deal, the one that finds the source code and brings a copy of it down, and all of that work is just a simple call to SymGetSourceFileW:

StringBuilder path = new StringBuilder(MAX_PATH * 8);
Dbghelp.SymGetSourceFileW(hProcess, baseAddress, IntPtr.Zero, sourceFile, path, (uint)path.Capacity);
return path.ToString();

After playing with the new command, I discovered that my command and Visual Studio (which uses the same core implementation as DbgHelp) were putting files in different places and having two separate caches was really irritating me, so I decided to fix that. Unfortunately, after calling the SymSetHomeDirectoryW API to point DbgHelp to the Visual Studio location (which on my Win7 box happens to be “%LOCALAPPDATA%\SourceServer”), my files were being put under a subfolder of that same location, called “src”. Thanks to a suggestion from Pat Styles, I found that if the SYMOPT_FLAT_DIRECTORY option was set, then the “src” folder would not be appended when calling SymSetHomeDirectoryW. So I added these few lines before the call to SymInitializeW and my problem was solved:

Dbghelp.SymSetOptions(Dbghelp.SYMOPT.SYMOPT_FLAT_DIRECTORY);
string localApplicationData = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData) + @"\SourceServer";
Dbghelp.SymSetHomeDirectoryW(hProcess, localApplicationData);
Dbghelp.SymSetOptions((Dbghelp.SYMOPT)0);

You can download the sources here, they contain two sets of sln/csproj files that share the same code, you can use LoadSource2.0.sln with Visual Studio 2008, and LoadSource.sln Visual Studio 2010 (note that you can also just use msbuild.exe to build the LoadSource.dll, so Visual Studio isn’t required).

Thursday, August 27, 2009

Rest For ASP.NET MVC, integrating with Json.NET

Rest For ASP.NET MVC isolates the MVC developer from the concern of rendering data in machine readable formats for wire transmission. One nice thing about it is how easy it is to go in and customize or replace the handling for any given format. In this post I will walk you through replacing the built-in Json format handler (built on DataContractJsonSerializer from System.ServiceModel.Web.dll) with one that instead uses the Json.NET library (this project is hosted here on Codeplex).

First of all download and unzip Json.NET (I used the Beta 4 drop which you can find here).

Then download and unzip the Rest for ASP.NET MVC sample (you can find it here) and open the Product\Product.sln solution in Visual Studio.

We will be modifying the sdk\MovieApp\MovieApp_EdmSample project which already shows a similar technique but uses the JavaScriptSerializer, so make sure you can run that project.

image

If you set this project as default and hit F5, your browser should pop up and display the following page:

image

Now, if you had control over the Accept header, you could ask for a json representation of this same data (a collection of movies), but let’s just go and hit the Json button at the top right, this will return “application/json” content which the browser doesn’t know how to render, so let’s save it to disk and open it, it should look like this:

[{"Id":1,"Title":"Star Wars","Director":"Lucas","DateReleased":"\/Date(251884800000)\/","EntityState":2,"EntityKey":{"EntitySetName":"MovieSet","EntityContainerName":"MoviesDBEntities","EntityKeyValues":[{"Key":"Id","Value":1}],"IsTemporary":false}},{"Id":2,"Title":"Memento","Director":"Nolan","DateReleased":"\/Date(1038988800000)\/","EntityState":2,"EntityKey":{"EntitySetName":"MovieSet","EntityContainerName":"MoviesDBEntities","EntityKeyValues":[{"Key":"Id","Value":2}],"IsTemporary":false}},{"Id":3,"Title":"Pulp Fiction","Director":"Tarantino","DateReleased":"\/Date(982483200000)\/","EntityState":2,"EntityKey":{"EntitySetName":"MovieSet","EntityContainerName":"MoviesDBEntities","EntityKeyValues":[{"Key":"Id","Value":3}],"IsTemporary":false}},{"Id":4,"Title":"Raiders","Director":"Spielberg","DateReleased":"\/Date(251020800000)\/","EntityState":2,"EntityKey":{"EntitySetName":"MovieSet","EntityContainerName":"MoviesDBEntities","EntityKeyValues":[{"Key":"Id","Value":4}],"IsTemporary":false}}]

Now that we’ve verified that everything works, let’s close the browser and move to writing some code.

Open the Infrastructure folder and find the JavaScriptSerializerFormatHandler.cs file. Right-click it, then copy and paste it into the same folder; you should see the following:

[screenshot: the copied file next to JavaScriptSerializerFormatHandler.cs in Solution Explorer]

You will also start seeing a bunch of build errors because we now have duplicate copies of the same classes; we’ll fix that in a minute. Rename the file to JsonNetFormatHandler.cs and open it. Find and replace JavaScriptSerializer with JsonNet so we end up with a matching class name.

Now add a reference to the Json.NET library (we’ll need Newtonsoft.Json.dll) and replace the using statement for JavaScriptSerializer, “using System.Web.Script.Serialization;”, with the Json.NET equivalent, “using Newtonsoft.Json;”.

In the JsonNetFormatHandler we’ll need to modify the Deserialize method to use the JsonConvert.DeserializeObject API. This is pretty trivial: it’s a single method call. Your method should now look like the following:

public object Deserialize(ControllerContext controllerContext, ModelBindingContext bindingContext, ContentType requestFormat)
{
    string input = new StreamReader(controllerContext.HttpContext.Request.InputStream).ReadToEnd();
    return JsonConvert.DeserializeObject(input, bindingContext.ModelType);
}

Notice that the first line hasn’t changed. If Json.NET were to add an API overload that reads its input from a Stream rather than from a string, this whole method could be a single line of code!
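In fact, Json.NET’s lower-level JsonSerializer type can already consume a TextReader through JsonTextReader, so the intermediate string can be avoided today. A sketch of that variant, assuming those types are present in the Json.NET drop you downloaded (same signature as the method above):

public object Deserialize(ControllerContext controllerContext, ModelBindingContext bindingContext, ContentType requestFormat)
{
    // Stream the request body straight into Json.NET instead of buffering it as a string
    using (StreamReader reader = new StreamReader(controllerContext.HttpContext.Request.InputStream))
    using (JsonTextReader jsonReader = new JsonTextReader(reader))
    {
        return new JsonSerializer().Deserialize(jsonReader, bindingContext.ModelType);
    }
}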

To fix the serialization path, let’s move to the JsonNetActionResult.ExecuteResult() method, where we’ll need to clean up some code that we don’t need anymore and use the JsonConvert.SerializeObject API. This is also pretty trivial; your method is now a couple of lines shorter and looks like the following:

public override void ExecuteResult(ControllerContext context)
{
    Encoding encoding = Encoding.UTF8;
    if (!string.IsNullOrEmpty(this.ContentType.CharSet))
    {
        try
        {
            encoding = Encoding.GetEncoding(this.ContentType.CharSet);
        }
        catch (ArgumentException)
        {
            throw new HttpException((int)HttpStatusCode.NotAcceptable, string.Format(CultureInfo.CurrentCulture, "Format {0} not supported", this.ContentType));
        }
    }
    this.ContentType.CharSet = encoding.HeaderName;
    context.HttpContext.Response.ContentType = this.ContentType.ToString();
    string json = JsonConvert.SerializeObject(this.Data, Formatting.Indented);
    byte[] bytes = encoding.GetBytes(json);
    context.HttpContext.Response.OutputStream.Write(bytes, 0, bytes.Length);
}

Notice that the change is again tiny. If Json.NET were to add an API overload that writes the output to a Stream rather than converting to a string, this method could be further simplified! Also notice that I used Formatting.Indented, this isn’t required, but it has the nice side effect of helping us see a difference when we test this.
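The same is true on the write side: JsonSerializer can emit directly to a TextWriter through JsonTextWriter, which would let the last three lines of the method stream the response instead of building a string first. A sketch, again assuming those types are available in your Json.NET drop:

JsonSerializer serializer = new JsonSerializer();
using (StreamWriter streamWriter = new StreamWriter(context.HttpContext.Response.OutputStream, encoding))
using (JsonTextWriter jsonWriter = new JsonTextWriter(streamWriter))
{
    jsonWriter.Formatting = Formatting.Indented; // keep the readable output from above
    serializer.Serialize(jsonWriter, this.Data);
}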

The last thing we need to do is register this new handler as the one the system will use for Json. This is trivial: just open the Global.asax file and, in the MyFormatManager constructor, change the type used for jsonHandler from JavaScriptSerializerFormatHandler to JsonNetFormatHandler. The code will now look as follows:

public MyFormatManager()
{
    XmlFormatHandler xmlHandler = new XmlFormatHandler();
    JsonNetFormatHandler jsonHandler = new JsonNetFormatHandler();
    this.RequestFormatHandlers.Add(xmlHandler);
    this.RequestFormatHandlers.Add(jsonHandler);
    this.ResponseFormatHandlers.Add(xmlHandler);
    this.ResponseFormatHandlers.Add(jsonHandler);
}

We’re done. Let’s hit F5, and when we hit the Json button and save to disk, the data we get will look like the following:

[
  {
    "Id": 1,
    "Title": "Star Wars",
    "Director": "Lucas",
    "DateReleased": "\/Date(251884800000-0800)\/",
    "EntityKey": {
      "EntitySetName": "MovieSet",
      "EntityContainerName": "MoviesDBEntities",
      "EntityKeyValues": [
        {
          "Key": "Id",
          "Value": 1
        }
      ]
    }
  },
  {
    "Id": 2,
    "Title": "Memento",
    "Director": "Nolan",
    "DateReleased": "\/Date(1038988800000-0800)\/",
    "EntityKey": {
      "EntitySetName": "MovieSet",
      "EntityContainerName": "MoviesDBEntities",
      "EntityKeyValues": [
        {
          "Key": "Id",
          "Value": 2
        }
      ]
    }
  },
  {
    "Id": 3,
    "Title": "Pulp Fiction",
    "Director": "Tarantino",
    "DateReleased": "\/Date(982483200000-0800)\/",
    "EntityKey": {
      "EntitySetName": "MovieSet",
      "EntityContainerName": "MoviesDBEntities",
      "EntityKeyValues": [
        {
          "Key": "Id",
          "Value": 3
        }
      ]
    }
  },
  {
    "Id": 4,
    "Title": "Raiders",
    "Director": "Spielberg",
    "DateReleased": "\/Date(251020800000-0800)\/",
    "EntityKey": {
      "EntitySetName": "MovieSet",
      "EntityContainerName": "MoviesDBEntities",
      "EntityKeyValues": [
        {
          "Key": "Id",
          "Value": 4
        }
      ]
    }
  }
]

Overall pretty simple, yet powerful!

Friday, July 25, 2008

Hosting a WCF Service on an IIS site with Multiple Bindings

If you ever tried hosting a WCF Service on an IIS site with multiple bindings, you will be familiar with the error "This collection already contains an address with scheme http...". The workarounds you can find on the internet and the recent introduction of baseAddressPrefixFilters in WCF can help get past this error, but they won't allow you to receive requests on all the configured bindings, which is usually what the people I talked to actually want to do. So here's how to make your service accessible from all available bindings:

1) Use a custom Factory, here's what this looks like in your .svc file:

<%@ ServiceHost Language="C#" Debug="true" Service="MyService.MyService"
    Factory="MyService.MultipleHostsFactory" %>

2) The implementation, rather than selecting a single address from the list passed in, overrides it with a completely empty list:

class MultipleHostsFactory : ServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        return base.CreateServiceHost(serviceType, new Uri[] { });
    }
}

3) Your web.config will now need to specify the full list of absolute addresses you're effectively listening to. This needs to exactly match your IIS configuration or this won't work:

<configuration>
  <system.serviceModel>
    <services>
      <service name="MyService.MyService">
        <endpoint address="http://hiring.contoso.com/WcfMultipleHosts/test.svc" binding="basicHttpBinding" contract="MyService.IServiceContract" />
        <endpoint address="http://hiring.contoso.com.uk/WcfMultipleHosts/test.svc" binding="basicHttpBinding" contract="MyService.IServiceContract" />
      </service>
    </services>
  </system.serviceModel>
</configuration>

Done.

Be aware that anything that relies on base addresses will now also need to be converted to explicit full addresses. Here's an example of how you'd enable the httpGet metadata endpoint and the service debug help page:

<configuration>
  <system.serviceModel>
    <behaviors>
      <serviceBehaviors>
        <behavior name="diagnose">
          <serviceMetadata httpGetEnabled="true" httpGetUrl="http://hiring.contoso.com/WcfMultipleHosts/test.svc" httpsGetEnabled="false" />
          <serviceDebug httpHelpPageUrl="http://hiring.contoso.com.uk/WcfMultipleHosts/test.svc" httpsHelpPageEnabled="false" includeExceptionDetailInFaults="true" />
        </behavior>
      </serviceBehaviors>
    </behaviors>
    <services>
      <service behaviorConfiguration="diagnose" name="MyService.MyService">
        <endpoint address="http://hiring.contoso.com/WcfMultipleHosts/test.svc" binding="basicHttpBinding" contract="MyService.IServiceContract" />
        <endpoint address="http://hiring.contoso.com.uk/WcfMultipleHosts/test.svc" binding="basicHttpBinding" contract="MyService.IServiceContract" />
      </service>
    </services>
  </system.serviceModel>
</configuration>