Thursday, January 8, 2015

Stupid C# tricks - "clean room" code execution with multi-instance applications and remoting.

The Scenario:

Sometimes, libraries, native DLLs, or environment settings in your Windows application's running process can interfere with each other. 'Pure' .Net managed code is much less prone to conflict thanks to .Net's fantastic versioning, type system, strong naming, and separation via application domains. In real-world applications, though, you often need to interact with a lot of native code or APIs where you have no such protection. This blog post is about a simple way to keep native dependencies separate for different code within a single .Net desktop application.


I have some software I'm working on in C# that requires the use of an embedded web browser. I have a web application in progress that is HTML5 and Javascript heavy, and I'm writing a C# application that hosts the web app and interacts with it to do some "heavy lifting" things that are not possible inside a browser without plug-ins. All I'll say about it right now is that it's robotics based, I'm going to open-source it, and the web app part will be available to run from Anibit, but I'll also have the offline Windows application that will be easy-peasy to use and have some fancier features than the pure browser-based version. It will generate code, and the offline version will also compile and upload the result to your device.

The Problem:

I found that when I had the browser loaded via the GeckoFx library (a managed shim for Mozilla's "xulrunner" embedded Firefox), spawning the compiler as a child process would fail with a lot of cryptic errors. If I did not load the browser component, the spawned compiler worked fine. I'm pretty sure that some DLLs, environment settings, or something else in the process's memory were not playing well between xulrunner and gcc. Rather than spend forever tracking down the exact problem, which I probably would have had to fix by building my own xulrunner or gcc binaries (yuck), I came up with a nice work-around that gives me the best of both worlds.

The Solution:

On start-up in my application, before I have done anything, I launch a second instance of the application with special command-line parameters. The parameters tell the second instance that it should run in a "remote execution server" mode, and I also pass the Windows process ID of the parent to the server/child. The child process periodically checks to see if the parent process is still running, and exits if not.
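A minimal sketch of that launch-and-watchdog pattern might look like the following. The flag name and helper class here are made up for illustration; the actual project's switches will differ.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

static class CleanRoomLauncher
{
    // Hypothetical command-line switch; the real project's flag may differ.
    const string ServerModeFlag = "--remote-exec-server";

    // Parent side: relaunch this same executable in server mode,
    // passing our own process ID so the child can watch us.
    public static Process LaunchServerInstance()
    {
        string exePath = Process.GetCurrentProcess().MainModule.FileName;
        int parentId = Process.GetCurrentProcess().Id;
        return Process.Start(exePath, ServerModeFlag + " " + parentId);
    }

    // Child side: watch the parent process and exit when it goes away,
    // so we never leave an orphaned server instance running.
    public static void WatchParent(int parentId)
    {
        var watchdog = new Thread(() =>
        {
            try
            {
                Process parent = Process.GetProcessById(parentId);
                parent.WaitForExit();
            }
            catch (ArgumentException)
            {
                // Parent was already gone by the time we looked it up.
            }
            Environment.Exit(0);
        });
        watchdog.IsBackground = true;
        watchdog.Start();
    }
}
```

Blocking on `WaitForExit` in a background thread avoids the need for a polling timer, though a periodic `HasExited` check works just as well.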

The child process starts a server for a ".Net remoting" object. .Net remoting is one of the lesser-known/understood technologies of .Net, but it's fantastic if you're on a pure Microsoft technology stack. It makes remote procedure calls across applications or even machines super simple. Essentially, with some small configuration files and a little bit of support code, you can create a class whose methods automatically get executed in the child process. The calls can be synchronous, and parameters and return values are magically handled by the CLR. (Note that Microsoft advises against using the 'legacy' remoting APIs, and instead recommends using "Windows Communication Foundation". I find that for really simple situations, .Net remoting is a bit simpler to set up and use, and the remote interface is dynamically generated, so there's no endpoint API to maintain.)
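As a rough sketch of what that looks like in code (the class name, port, and URI here are arbitrary examples; the same setup can also be driven by the configuration files mentioned above):

```csharp
using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;

// The remoted class: deriving from MarshalByRefObject is what lets the
// CLR hand the parent a transparent proxy whose calls run in the child.
public class CompilerProxy : MarshalByRefObject
{
    public int RunCompiler(string arguments)
    {
        // In the child process, this is where you would actually spawn
        // the compiler. Parameters and the return value are marshaled
        // across the process boundary by the CLR.
        return 0; // placeholder exit code
    }
}

static class RemotingSetup
{
    // Arbitrary example values.
    const int Port = 9090;
    const string Uri = "CompilerProxy";

    // Child ("server") side: publish the object over a TCP channel.
    public static void StartServer()
    {
        ChannelServices.RegisterChannel(new TcpChannel(Port), false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(CompilerProxy), Uri, WellKnownObjectMode.Singleton);
    }

    // Parent ("client") side: obtain a proxy to the child's object.
    public static CompilerProxy Connect()
    {
        return (CompilerProxy)Activator.GetObject(
            typeof(CompilerProxy),
            string.Format("tcp://localhost:{0}/{1}", Port, Uri));
    }
}
```

From the parent, `RemotingSetup.Connect().RunCompiler(...)` reads like an ordinary method call, but the body executes in the child's "clean" process, away from xulrunner's native baggage.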

Rather than making a lot of diagrams or posting code to this blog, I've put a small demonstration C# project on Github. If this sounds like it could help you, feel free to use it for whatever you want. You can find it here.