My Mix11 Proposal
Thursday, January 27, 2011 6:51 AM

I only submitted one session for consideration to present at Mix this year. Unfortunately it was not shortlisted and is consequently not currently open to voting on the site.

However, I thought I might post it here to gain some feedback from others as to how I might refine or change it. Maybe it is just something people are not interested in and I have taken a wrong direction. Perhaps you might be interested in me presenting this at your user group?

At any rate, the session abstract is below, and I would love your feedback.

 

Title: There is no Web: HTML5 Offline capabilities

Abstract:

HTML 5 offers many new capabilities to the standardised web space. The majority of this session will focus on a few capabilities geared towards allowing your web applications to function when the web is not available or otherwise pre-occupied. For a lot of developers, this has not been much of a consideration because of the limited ability to deal with it, but with HTML 5 this can all change. We will look at how HTML 5 makes it easier to detect your offline/online status, and also look at a way to construct your application to take this into account and provide a seamless experience for your users. We will look at constructing components to provide better network status detection, offline storage approaches, integration with ASP.NET MVC, and to top it all off, will utilise the RavenDB (NoSQL) document storage engine just for some fun. If you want to see some of the techniques and pitfalls of HTML 5 offline capabilities, combined with ASP.NET MVC and NoSQL, then this talk is for you.

ScriptHelper–For MVC and WebForms projects
Thursday, November 18, 2010 4:38 PM

This issue might seem minor, but I always forget the exact names or number of script files I need to get some features working in MVC or WebForms. In addition, in my applications that require a specific client side feature, I might need a series of dependent scripts to make it work. Failing to include all of them often gives ambiguous errors, or the feature simply doesn't work.

To that end, I have created a ScriptHelper to allow me to express those dependencies as a singular name and have all the script dependencies (or CSS) emitted for me.

The library is currently hosted on bitbucket here http://bitbucket.org/glav/mvc-script-dependency-extension

The idea is that you can express your Javascript and CSS dependencies in a separate file and pretty much call it what you want. Then, in your pages, simply reference the dependencies by name and everything you need is pulled in for you.

The ScriptHelper I have written is a little raw, but you can define an XML file that expresses script names and their dependencies. The script helper then allows you to provide that single name and all dependent scripts are emitted in the page.

For example, in the page you can do:

<%= ScriptHelper.RequiresScript(ScriptName.jqueryValidateUnobtrusive) %>

Or

<%= ScriptHelper.RequiresScript("jQuery-validate-unobtrusive") %>

And you would get

<script type='text/javascript' src='/Scripts/jquery-1.4.1.js'></script>
<script type='text/javascript' src='/Scripts/jquery.validate.js'></script>
<script type='text/javascript' src='/Scripts/jquery.validate.unobtrusive.js'></script>

Additionally, you could do:

<%= ScriptHelper.RequiresScript("All") %>

And you would get the following emitted into your page:

<script type='text/javascript' src='/Scripts/jquery-1.4.1.js'></script>
<script type='text/javascript' src='/Scripts/MicrosoftAjax.js'></script>
<script type='text/javascript' src='/Scripts/MicrosoftMvcAjax.js'></script>
<script type='text/javascript' src='/Scripts/jquery.validate.js'></script>
<script type='text/javascript' src='/Scripts/jquery.validate.unobtrusive.js'></script>
<script type='text/javascript' src='/Scripts/MicrosoftMvcValidation.js'></script>

The dependency file looks something like:

<?xml version="1.0" encoding="utf-8" ?>
<Dependencies ReleaseSuffix="min" DebugSuffix="debug">
  <Dependency Name="jQuery-validate-unobtrusive" Type="js">
    <ScriptFile>~/Scripts/jquery.validate.unobtrusive.js</ScriptFile>
    <RequiredDependencies>
      <Name>jquery</Name>
      <Name>jQuery-validate</Name>
    </RequiredDependencies>
  </Dependency>

  <Dependency Name="jQuery-validate" Type="js">
    <ScriptFile>~/Scripts/jquery.validate.js</ScriptFile>
    <RequiredDependencies>
      <Name>jquery</Name>
    </RequiredDependencies>
  </Dependency>

  <Dependency Name="jQuery" Type="js">
    <ScriptFile>~/Scripts/jquery-1.4.1.js</ScriptFile>
  </Dependency>

<!-- rest of dependencies ... -->
</Dependencies>

The ReleaseSuffix and DebugSuffix also act to rename a dependency file that is output to the page. These are not mandatory but it is not uncommon to host debug versions of your scripts when doing development and release/minified versions of your scripts in production. So for example, if we have a script file dependency defined as Something.js, then it will be renamed to Something.min.js in release mode and Something.debug.js in debug mode when output to the page. If not specified, the emitted dependency is left as is.
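The renaming described above amounts to inserting the suffix just before the file extension. Here is a minimal sketch of that logic (the helper name is hypothetical, not the library's actual API):

```csharp
using System;

// Hypothetical helper illustrating the suffix renaming described above;
// the method name is an assumption, not part of the ScriptHelper API.
static string ApplySuffix(string fileName, string suffix)
{
    if (string.IsNullOrEmpty(suffix))
        return fileName; // no suffix configured: emit the file name as-is

    int dot = fileName.LastIndexOf('.');
    if (dot < 0)
        return fileName + "." + suffix; // no extension: just append

    // Insert the suffix before the extension, e.g. Something.js -> Something.min.js
    return fileName.Substring(0, dot) + "." + suffix + fileName.Substring(dot);
}

Console.WriteLine(ApplySuffix("Something.js", "min"));   // Something.min.js
Console.WriteLine(ApplySuffix("Something.js", "debug")); // Something.debug.js
```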

Finally, you can also specify multiple dependencies on the same line like so:

<%= ScriptHelper.RequiresScript("Microsoft-Mvc-Validation", "jQuery") %>

And all dependent scripts will be included for you without duplicating anything. Obviously the script names must be the same as those defined in the script dependency file. I have used a static class to hold these constants, and you could easily substitute your own as it's just strings.
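The de-duplication can be pictured as a depth-first walk over the dependency graph, emitting each script once, dependencies first. A rough sketch (names and structure are assumptions, not the library's actual implementation):

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch of duplicate-free dependency resolution; the data
// below mirrors the XML dependency file shown earlier in this post.
var dependsOn = new Dictionary<string, string[]>
{
    ["jQuery-validate-unobtrusive"] = new[] { "jQuery-validate" },
    ["jQuery-validate"] = new[] { "jQuery" },
    ["jQuery"] = new string[0],
};

var emitted = new HashSet<string>();
var output = new List<string>();

void Resolve(string name)
{
    if (!emitted.Add(name))
        return; // already emitted: skip duplicates
    foreach (var dep in dependsOn[name])
        Resolve(dep); // dependencies come out first
    output.Add(name);
}

// Requesting two dependencies that share "jQuery" still emits it only once.
Resolve("jQuery-validate-unobtrusive");
Resolve("jQuery");
Console.WriteLine(string.Join(", ", output));
// jQuery, jQuery-validate, jQuery-validate-unobtrusive
```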

It also supports expressing CSS dependencies. For example, with the following dependency definitions:

<Dependency Name="SiteWideStyle" Type="css">
  <ScriptFile>~/Content/Site.css</ScriptFile>
</Dependency>

<Dependency Name="HomeStyle" Type="css">
  <ScriptFile>~/Content/Home.css</ScriptFile>
  <RequiredDependencies>
    <Name>SiteWideStyle</Name>
  </RequiredDependencies>
</Dependency>

You can then include the following in the <head> section of your pages:

<%= ScriptHelper.RequiresScript("HomeStyle") %>

And you would get

<link href='/Content/Site.css' rel='stylesheet' type='text/css' />
<link href='/Content/Home.css' rel='stylesheet' type='text/css' />

Currently, the ScriptHelper I have written supports resolving the dependency file from a few local locations, but I plan on extending it to support remote locations via HTTP.

This type of thing can reduce multiple script includes to a single line in a web page or view. In addition, you could have dependency files centrally located on a remote server, either in the cloud or internal to your organisation, so that dependencies are always resolved via this file, which can be maintained and updated as new dependencies and/or later versions of the scripts become available or required.

There is also nothing stopping you from naming your dependencies in a more component-oriented way. For example, you might name a dependency “AjaxGrid”, and that may load in the jQuery core script, jQuery UI custom scripts, as well as some of your own custom scripts required to get this component working.

I have been using this library primarily with ASP.NET MVC projects, but it is just a set of static methods that could also be used within an ASP.NET Webforms application as well.

Feel free to leave comments/issues on the hosting site here http://bitbucket.org/glav/mvc-script-dependency-extension

Caching Architecture–Testability, Dependency Injection and Multiple Providers
Wednesday, October 13, 2010 9:52 PM

Note: Link to Wiki is here

Note: This article assumes familiarity with caching in .Net and dependency injection.

Update: Fixed broken links and updated some text based on feedback.

One of the things I have always stated is that caching is important to every application's performance. In many of the applications I design and work on, I like to introduce a caching layer from the outset, and ensure it is part of the vertical slice that I typically provide as an application blueprint.

Obviously we want to ensure that all this caching magic does not introduce hard dependencies on the underlying caching mechanism. For example, the ASP.NET caching engine is fantastic at what it does; however, if you write components that use it directly, they become hard to test, as you need to introduce lots of HttpContext references which are time-consuming and hard to mock out for testing purposes.

In order to prevent this, I like to define an ICache interface that defines our contract for working with a cache.

Something like this:

public interface ICache
{
   void Add<T>(string cacheKey, DateTime expiry, T dataToAdd) where T : class;
   T Get<T>(string cacheKey) where T : class;
   void Add(string cacheKey, DateTime expiry, object dataToAdd);
   object Get(string cacheKey);
   void InvalidateCacheItem(string cacheKey);
}

With that, I can write a simple pass through CacheAdapter that implements this interface, but passes all the calls through to the ASP.NET Cache. Any components can then work with the CacheAdapter rather than the ASP.NET cache directly. This means we can mock it out easily by mocking the interface. Testing then becomes much easier and our code is loosely coupled. The CacheAdapter may look something like this:

public class WebCacheAdapter : ICache
{
   private System.Web.Caching.Cache _cache;

   public WebCacheAdapter()
   {
      if (System.Web.HttpContext.Current != null)
         _cache = System.Web.HttpContext.Current.Cache;
      else
         throw new InvalidOperationException("Not in a web context, unable to use the web cache.");
   }

   public void Add<T>(string cacheKey, DateTime expiry, T dataToAdd) where T : class
   {
      if (dataToAdd != null)
         _cache.Add(cacheKey, dataToAdd, null, expiry, Cache.NoSlidingExpiration, CacheItemPriority.Normal, null);
   }

   public void Add(string cacheKey, DateTime expiry, object dataToAdd)
   {
      Add<object>(cacheKey, expiry, dataToAdd);
   }

   // rest of code omitted for brevity
}

So that’s all well and good. Even though I am primarily a web guy, I often design and work with desktop applications, which cannot use the ASP.NET cache. In addition, I may also want to use the awesome power of Windows AppFabric Cache for distributed caching. Furthermore, in a few scenarios it has been unclear whether distributed caching was an option due to infrastructure concerns, so I might want to use AppFabric caching but not be able to, and not know this until later in the project. Finally, I write this kind of code a lot, and I did not want to rewrite and specialise it for each project.

It would be nice if I could have all the caching options I may need, already abstracted out for me, easily selectable via configuration, and utilise interfaces with dependency injection for easy testing and loosely coupled applications (what a mouthful).

With that in mind, I have developed a simple caching architecture that I can introduce into all new projects and has the following features:

  • Provides an ICache interface and associated cache adapter class through which all cache engines are accessed. This includes the .Net 4 MemoryCache, ASP.NET Web cache, and Windows AppFabric cache.
  • Provides an enhanced CacheProvider class (which implements ICacheProvider) that allows strongly typed cache access and a simple easy to use consistent API.
  • Allows selection of which cache engine to use via configuration.
    • The currently supported cache mechanisms are
      • Memory
      • Web
      • AppFabric
  • Fully supports dependency injection with everything already wired up. My current organisation standardises on Microsoft's Unity for dependency injection, so that is what this library uses. A single call such as Container.Resolve<ICacheProvider>(); is all you need to access the cache mechanism.
  • Provides simple logging diagnostics (again via dependency injection, so it's easy to change) so that you can track what's going on.

Before I dive into the details, here is some example code to use this library and caching.

var cacheProvider = AppServices.Resolve<ICacheProvider>();

Console.WriteLine("Getting Some Data.");
var data = cacheProvider.Get<SomeData>("cache-key", DateTime.Now.AddSeconds(5), () =>
{
   Console.WriteLine("... => Adding data to the cache... ");
   var someData = new SomeData() { SomeText = "cache example1", SomeNumber = 1 };
   return someData;
});

The preceding code sample first resolves our cache provider using the Unity container. Note that this is not an ICache instance (which we could also resolve and use directly), but rather a higher level ICacheProvider instance. This provides a more advanced and easy to use API for caching.

Then we simply try and retrieve the item (of type SomeData) from the cache passing in a cache key, the expiry time of the cached data, and an anonymous function as the last argument.

The anonymous function is only called if the data is not present in the cache. The return data of the anonymous function is placed into the cache using the cache key and expiry date/time, then returned to the caller.
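The pattern behind this is simple: check the cache, and only invoke the delegate (and store its result) on a miss. A minimal sketch using a plain dictionary in place of a real cache (the library's actual types and signatures may differ):

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of the cache-or-add pattern; a plain dictionary stands in
// for the real cache, and expiry is ignored for brevity.
var cache = new Dictionary<string, object>();

T GetOrAdd<T>(string cacheKey, DateTime expiry, Func<T> getData) where T : class
{
    if (cache.TryGetValue(cacheKey, out var existing))
        return (T)existing;      // cache hit: the delegate is never called

    var item = getData();        // cache miss: build the data...
    cache[cacheKey] = item;      // ...and store it under the key
    return item;
}

var first = GetOrAdd("cache-key", DateTime.Now.AddSeconds(5), () => "built on first call");
var second = GetOrAdd("cache-key", DateTime.Now.AddSeconds(5), () => "never built");
Console.WriteLine(second); // still "built on first call"
```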

The library comes with some simple example code to allow you to get up and running very quickly. If you want to use it, you can download it from here.

For those who are interested in more detail, then read on.

The design of the library is fairly simple.

The solution itself consists of 4 projects.

Glav.CacheAdapter.Core contains all the necessary interfaces and an implementation of the MemoryCache adapter (which implements ICache). There are multiple cache adapters (memory, web and AppFabric) which implement ICache. The ICache interface has basic Get/Add/Remove methods to manipulate the cache. To provide a higher level, easier-to-use API without introducing these extra methods into ICache (which would force every cache adapter to implement them), there is also an ICacheProvider interface with enhanced methods that retrieve items from the cache and automatically insert them if the data items do not exist in the cache. Methods such as:

Get<T>(string cacheKey, DateTime expiryTime, GetDataToCacheDelegate getData)

Both Glav.CacheAdapter.Web and Glav.CacheAdapter.Distributed contain implementations of the ICache interface for ASP.NET and Windows AppFabric respectively.


Obviously, in order to use Windows AppFabric caching, you must have that installed on the machines that will utilise it. The library contains 2 core assemblies from Windows AppFabric that allow it to compile and reference the required functionality. If you enable AppFabric in the configuration of the library without it being installed and try to utilise caching, this will obviously fail.


Finally, to glue all this together, a project called Glav.CacheAdapter.CacheBootstrap registers the correct cache implementation in the service container based on the supplied configuration. The .Net 4 MemoryCache is the default if no configuration, or an unrecognised configuration, is supplied.
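The selection amounts to a switch on the configured value with a memory-cache fallback. A hypothetical sketch (the setting values and adapter names here are assumptions, not the library's actual code):

```csharp
using System;

// Hypothetical sketch of the bootstrapper's cache selection; the setting
// values and adapter names are assumptions, not the library's actual code.
static string SelectCacheAdapter(string configuredCacheType)
{
    switch ((configuredCacheType ?? string.Empty).ToLowerInvariant())
    {
        case "web":       return "WebCacheAdapter";
        case "appfabric": return "AppFabricCacheAdapter";
        case "memory":    return "MemoryCacheAdapter";
        default:          return "MemoryCacheAdapter"; // missing/unrecognised config
    }
}

Console.WriteLine(SelectCacheAdapter("AppFabric")); // AppFabricCacheAdapter
Console.WriteLine(SelectCacheAdapter(null));        // MemoryCacheAdapter
```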

So that’s it. Download it and give it a try if you want a nicely abstracted and pre-packaged cache solution. I would welcome any feedback.

ASP.NET MVC 3 and Custom Extensions
Sunday, August 8, 2010 3:03 PM

When playing with the latest ASP.NET MVC 3 Preview 1 bits, some people have mentioned their dislike of the .cshtml extension used for the “razor” view engine that comes with ASP.NET MVC 3 and also WebMatrix. Well there are a number of ways you can change this. For the purposes of learning and tinkering, I decided to try and register a new view engine using the new Dependency Injection support within ASP.NET MVC 3.

So, in the Global.asax.cs file I did this:

protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    IUnityContainer container = new UnityContainer();

    container.RegisterInstance<IControllerFactory>(new UnityControllerFactory(container));
    container.RegisterType<IViewEngine, TestViewEngine>();
   
    UnityMvcServiceLocator svcLocator = new UnityMvcServiceLocator(container);
    MvcServiceLocator.SetCurrent(svcLocator);

    RegisterGlobalFilters(GlobalFilters.Filters);
    RegisterRoutes(RouteTable.Routes);
}

Note: Previously (and this option is still available), you would register your custom view engine by adding it to the existing Engines collection like so:

ViewEngines.Engines.Add(new TestViewEngine());

My Custom view engine looked like this:

public class TestViewEngine : VirtualPathProviderViewEngine
{
    public TestViewEngine()
    {
        base.AreaViewLocationFormats = new string[] { "~/Areas/{2}/Views/{1}/{0}.glav", "~/Areas/{2}/Views/Shared/{0}.glav" };
        base.AreaMasterLocationFormats = new string[] { "~/Areas/{2}/Views/{1}/{0}.glav", "~/Areas/{2}/Views/Shared/{0}.glav" };
        base.AreaPartialViewLocationFormats = new string[] { "~/Areas/{2}/Views/{1}/{0}.glav", "~/Areas/{2}/Views/Shared/{0}.glav" };
        base.ViewLocationFormats = new string[] { "~/Views/{1}/{0}.glav", "~/Views/Shared/{0}.glav" };
        base.MasterLocationFormats = new string[] { "~/Views/{1}/{0}.glav", "~/Views/Shared/{0}.glav" };
        base.PartialViewLocationFormats = new string[] { "~/Views/{1}/{0}.glav", "~/Views/Shared/{0}.glav" };

    }
    protected override IView CreatePartialView(ControllerContext controllerContext, string partialPath)
    {
        return new CshtmlView(partialPath, "");
    }

    protected override IView CreateView(ControllerContext controllerContext, string viewPath, string masterPath)
    {
        return new CshtmlView(viewPath, masterPath);
    }
}

This view engine implementation looks for views with a ‘.glav’ extension and invokes the “razor” view (CshtmlView) to parse the document/page.

ASP.NET MVC 3 will use the MvcServiceLocator that we have supplied, and call the ‘GetAllInstances’ method when determining what classes implement IViewEngine so that it can invoke the correct view engine.

By default, ASP.NET MVC 3 has the System.Web.Mvc.WebFormViewEngine and the System.Web.Mvc.CshtmlViewEngine registered. We are adding a new custom view engine to the mix by registering it with the service locator that ASP.NET MVC uses ( MvcServiceLocator ). When ASP.NET MVC goes looking for a view engine it will use the MvcServiceLocator to get all instances of IViewEngine (via a call to ‘GetAllInstances’ ) in order to try and satisfy the request to process/render a particular view.

Now all this theory is good and well, and I thought it would work, however it didn’t. One of the issues with Unity (and that's what I was basing my testing on, since that was the example provided with ASP.NET MVC 3) is that the MvcServiceLocator.GetAllInstances method calls the ResolveAll method of the UnityContainer. ResolveAll *only* returns instances that have been registered by name, so I had to change one line of my code in the Global.asax.cs from:

container.RegisterType<IViewEngine, TestViewEngine>();

to

container.RegisterType<IViewEngine, TestViewEngine>("test");

and it all worked and my new engine was invoked. Apparently this is known behaviour with Unity and may be changed in future versions, but for now, this is how it works.
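The behaviour can be illustrated with a toy registry: the equivalent of ResolveAll returns only named registrations, so a default (unnamed) registration is invisible to it. This is an illustration of the concept only, not Unity's internals:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Toy illustration (not Unity itself) of the behaviour described above:
// "resolve all" returns only *named* registrations.
var namedRegistrations = new Dictionary<string, string>();
string defaultRegistration = null;

void Register(string implementation, string name = null)
{
    if (name == null)
        defaultRegistration = implementation;      // unnamed/default registration
    else
        namedRegistrations[name] = implementation; // named registration
}

IEnumerable<string> ResolveAll() => namedRegistrations.Values;

Register("TestViewEngine");              // unnamed: ResolveAll cannot see it
Console.WriteLine(ResolveAll().Count()); // 0

Register("TestViewEngine", "test");      // named: now ResolveAll returns it
Console.WriteLine(ResolveAll().Count()); // 1
Console.WriteLine(defaultRegistration);  // TestViewEngine
```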

Note: There are a few other ways of registering custom view engines and also associating file extensions with a particular view engine. This is just one way of doing it.

by Glav | 8 comment(s)
Fujitsu T900 and Fingerprint Sensor Problems
Sunday, June 27, 2010 6:00 PM

Not so long ago, I bought a Fujitsu T900 Notebook computer from TegaTech (many thanks for the great service and help I received from Hugo Ortega, who runs it).

Anyway, I bought a fairly stock system with 4Gb memory and a 320Gb 5400 rpm HDD. Later I updated to 8Gb memory and a 128Gb SSD with the 320Gb HDD as my secondary drive. This of course meant a full rebuild of Windows 7 and associated drivers and applications.

The rebuild went pretty smoothly but it soon became apparent that some things were not working. One of the most annoying was the fingerprint sensor. I had installed the drivers from the Fujitsu site a few times over.

I saw the ‘Authentic’ device in the Biometric section of Windows Device Manager, but when I went to ‘Biometric Devices’ in Control Panel it showed a simple message saying:

“There were no Biometric devices found on this computer. Try installing some drivers…” or some such thing. After much hunting around, removing the device, re-installing, etc., I finally got it working and thought I would post how I did it here for others who are interested, but also for myself so I don't have to remember what I did.

It's pretty simple though, so here goes:

1. Go to Windows Device Manager via Control Panel and delete/uninstall the device from the Biometric Devices section.

2. Don't reboot! I don't know why, but when I rebooted after uninstalling it and tried this, it didn’t work.

3. Right click on the very top node and select ‘Scan for Hardware Changes’.

Windows will then detect the device and search Windows Update for the drivers. After about 5 minutes it should find the driver and install it.

And that's it. Simple enough, but all the installing of drivers from the Fujitsu site didn’t seem to do anything. Perhaps it acted as part of the resolution, but it didn’t seem to make much difference.

At any rate, I am back to logging into Windows using my fingerprint.

Making WCF Output a single WSDL file for interop purposes.
Tuesday, March 16, 2010 10:42 PM

By default, when WCF emits a WSDL definition for your services, it can often contain many links to others related schemas that need to be imported. For the most part, this is fine. WCF clients understand this type of schema without issue, and it conforms to the requisite standards as far as WSDL definitions go.

However, some non-Microsoft stacks will only work with a single WSDL file and require that all definitions for the service(s) (port types, messages, operations, etc.) are contained within that single file. In other words, no external imports are supported. Some Java clients (to my working knowledge) have this limitation. This obviously presents a problem when trying to create services exposed for consumption and interop by these clients.

Note: You can download the full source code for this sample from here

To illustrate this point, let's say we have a simple service that looks like:

Service Contract

public interface IService1
{
    [OperationContract]
    [FaultContract(typeof(DataFault))]
    string GetData(DataModel1 model);

    [OperationContract]
    [FaultContract(typeof(DataFault))]
    string GetMoreData(DataModel2 model);
}

Service Implementation/Behaviour

public class Service1 : IService1
{
    public string GetData(DataModel1 model)
    {
        return string.Format("Some Field was: {0} and another field was {1}", model.SomeField,model.AnotherField);
    }
    public string GetMoreData(DataModel2 model)
    {
        return string.Format("Name: {0}, age: {1}", model.Name, model.Age);
    }
}

Configuration File

<system.serviceModel>
  <services>
    <service name="SingleWSDL_WcfService.Service1" behaviorConfiguration="SingleWSDL_WcfService.Service1Behavior">
      <!-- ...std/default data omitted for brevity..... -->
      <endpoint address="" binding="wsHttpBinding" contract="SingleWSDL_WcfService.IService1">
        .......
      </endpoint>
    </service>
  </services>
  <behaviors>
    <serviceBehaviors>
      <behavior name="SingleWSDL_WcfService.Service1Behavior">
        ........
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>

When WCF is asked to produce a WSDL for this service, it will produce a file that looks something like this (note: some sections omitted for brevity):

<?xml version="1.0" encoding="utf-8" ?>
<wsdl:definitions name="Service1" targetNamespace="http://tempuri.org/" xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" ...... namespace definitions omitted for brevity
  <wsp:Policy wsu:Id="WSHttpBinding_IService1_policy">
      ... multiple policy items omitted for brevity
  </wsp:Policy>
  <wsdl:types>
    <xsd:schema targetNamespace="http://tempuri.org/Imports">
      <xsd:import schemaLocation="http://localhost:2370/HostingSite/Service-default.svc?xsd=xsd0" namespace="http://tempuri.org/" />
      <xsd:import schemaLocation="http://localhost:2370/HostingSite/Service-default.svc?xsd=xsd3" namespace="Http://SingleWSDL/Fault" />
      <xsd:import schemaLocation="http://localhost:2370/HostingSite/Service-default.svc?xsd=xsd1" namespace="http://schemas.microsoft.com/2003/10/Serialization/" />
      <xsd:import schemaLocation="http://localhost:2370/HostingSite/Service-default.svc?xsd=xsd2" namespace="http://SingleWSDL/Model1" />
      <xsd:import schemaLocation="http://localhost:2370/HostingSite/Service-default.svc?xsd=xsd4" namespace="http://SingleWSDL/Model2" />
    </xsd:schema>
  </wsdl:types>
  <wsdl:message name="IService1_GetData_InputMessage">
      ....
  </wsdl:message>
  <wsdl:operation name="GetData">
     .....
  </wsdl:operation>
  <wsdl:service name="Service1">
     .......
  </wsdl:service>
</wsdl:definitions>

The above snippet from the WSDL shows the external links and references that are generated by WCF for a relatively simple service. Note the xsd:import statements that reference external XSD definitions which are also generated by WCF.

In order to get WCF to produce a single WSDL file, we first need to follow some good practices when it comes to WCF service definitions.

Step 1: Define a namespace for your service contract.

[ServiceContract(Namespace="http://SingleWSDL/Service1")]
public interface IService1
{
       ......
}

Normally you would not use a literal string and may instead define a constant to use in your own application for the namespace.

When this is applied and we generate the WSDL, we get the following statement inserted into the document:

  <wsdl:import namespace="http://SingleWSDL/Service1" location="http://localhost:2370/HostingSite/Service-default.svc?wsdl=wsdl0" /> 

All the previous imports have gone. If we follow this link, we will see that the XSD imports are now in this external WSDL file. Not really any benefit for our purposes.

Step 2: Define a namespace for your service behaviour

[ServiceBehavior(Namespace = "http://SingleWSDL/Service1")]
public class Service1 : IService1
{
      ......
}

As you can see, the namespace of the service behaviour should be the same as that of the service contract interface which it implements. Failing to do this will cause WCF to emit its default http://tempuri.org namespace all over the place and still generate import statements. The same is true if the namespaces of the contract and behaviour differ: if you define one and not the other, the defaults kick in and you’ll find extra imports generated.

While neither of the previous two steps causes fewer import statements to be generated, you will notice that the namespace definitions within the WSDL now have identical, well-defined names.

Step 3: Define a binding namespace

In the configuration file, modify the endpoint configuration to include a bindingNamespace attribute that is the same as the one defined on the service behaviour and service contract:

<endpoint 
    address="" 
    binding="wsHttpBinding" 
    contract="SingleWSDL_WcfService.IService1" 
    bindingNamespace="http://SingleWSDL/Service1">

However, this does not completely solve the issue. What this will do is remove the WSDL import statements like this one:

<wsdl:import namespace="http://SingleWSDL/Service1" 
location="http://localhost:2370/HostingSite/Service-default.svc?wsdl" /> 

from the generated WSDL.

Finally…. the magic….

Step 4: Use a custom endpoint behaviour to read in external imports and include in the main WSDL output.

In order to force WCF to output a single WSDL with all the required definitions, we need to define a custom WSDL Export extension that can be applied to any endpoints. This requires implementing the IWsdlExportExtension and IEndpointBehavior interfaces and then reading in any imported schemas, and adding that output to the main, flattened WSDL to be output. Sounds like fun right…..? Hmmm well maybe not.

This step sounds a little hairy, but it's actually quite easy thanks to some kind individuals who have already done this for us.

As far as I know, there are two readily available implementations that perform the import and “WSDL flattening”: WCFExtras, which is on CodePlex, and FlatWsdl by Thinktecture. Both do exactly the same thing with the imports and provide an endpoint behaviour; however, FlatWsdl does a little more work for us by providing a ServiceHostFactory that automatically attaches the requisite behaviour to our endpoints.

To use this in an IIS hosted service, we can modify the .SVC file to specify this new factory like so:

<%@ ServiceHost Language="C#" 
          Debug="true" 
          Service="SingleWSDL_WcfService.Service1" 
          Factory="Thinktecture.ServiceModel.Extensions.Description.FlatWsdlServiceHostFactory"  %>

Within a service application or another form of executable, such as a console app, we can simply create an instance of the custom service host and open it as we normally would, as shown here:

FlatWsdlServiceHost host = new FlatWsdlServiceHost(typeof(Service1));
host.Open();

And we are done. WCF will now generate one single WSDL file that contains all the WSDL imports and data/XSD imports.

You can download the full source code for this sample from here

Hope this has helped you.

Note: Please note that I have not extensively tested this in a number of different scenarios so no guarantees there.

Sydney Architecture User Group – WPF Architecture
Wednesday, February 24, 2010 12:22 PM

A bit short notice for a blog post, but eminent WPF expert Paul Stovell will be presenting on WPF architecture at the Sydney Architecture User Group on Thursday 25th February. (Full details here)

Here is the abstract:

.NET 4.0 will mark the fourth release of Windows Presentation Foundation, and the take up continues to rise as the platform matures. For architects, such a new technology means a new set of patterns, approaches and trade-offs that we need to understand when designing solutions. In this session, Paul will lead you on a guided tour of the WPF client application problem space. We will look at patterns for presentation, composition, navigation, and communication needs, as well as resource management, localization, and performance. We will also look at strategies for enforcing UI standards, maximising code leverage, and handling cross-cutting concerns. Bring questions!

Location and specific details are:

Title: Architecting solutions with Windows Presentation Foundation
Date/Time: Thursday 25/02/2010 06:00 PM
Where: Grace Hotel, Kiralee or Pinaroo Function Room, 77 York St, Sydney, NSW 2000

 

Hope to see you there!

.Net Performance Testing and Optimisation – Free eBook
Saturday, January 30, 2010 11:43 AM

This blog has been super quiet lately. This is mainly because I have been hard at work writing a book on .Net performance testing and optimisation. Well, I am happy to say that part 1 of this book is available as a free download from here.

The full book will follow shortly and will contain concrete examples of popular performance issues in .Net applications, what to watch out for, and how to mitigate them. Additionally, you’ll get performance tips for Internet Information Server, web apps, SQL, caching strategies and a whole lot more.

In part 1, you will learn how to setup a performance test rig, how it works, how to record tests, replay them, what metrics to collect, and how to analyse them.

In addition, you will find out how to automate all this. Finally, you will get insight into profiling an application to improve performance. This is the one-stop shop for performance testing and optimisation on the .Net platform.

Since it's free, download it now!

iPhone Apps using Microsoft .NET
Wednesday, December 2, 2009 8:22 AM

My good friend Wally McClure has written a short eBook on “Building iPhone and iPod touch Applications for the .NET/C# Developer with MonoTouch”.

You can check it out here

Given that the iPhone is a very popular phone, and that one of the development barriers for Microsofties like me has been the lack of .Net language support (Objective-C being the native development language), this can now be overcome with the help of Wally’s eBook.

It's small and concise, with topics covering class library support, deployment and debugging. I encourage you to check it out.

Sydney Architecture User Group – Next Meeting: Why Windows Azure is not just Generic Brand Web Hosting
Tuesday, November 10, 2009 2:46 PM

The Sydney Architecture User Group is having its second meeting this month on Thursday, 26th November. Full details can be found on our new (but very simple) website located here ( http://thesaug.org ). You can subscribe to the monthly email and also indicate your intention to come by selecting the RSVP option (which would be really nice if you did :-) )

Here is what the next Sydney Architecture meeting has in store. Hope to see you there.

Why Windows Azure is not just Generic Brand Web Hosting

Presenter: Nick Randolph
Date/Time: Thursday 26/11/2009 06:00 PM
Where: Grace Hotel, Function Room, 77 York St, Sydney, NSW 2000

Abstract

If you take only a glimpse at the offerings on the Windows Azure platform it may just appear to be a form of generic/home brand web hosting from Microsoft. However you’d be sorely mistaken as the platform is significantly different from not only traditional web hosting offerings but also from its competitors in the cloud computing space. In this session we will cover the unique offerings of cloud computing before looking at each of the components of the Windows Azure platform. Cloud computing is new, hot and sexy, but does that mean it’s right for you? Make sure you’re ready to interact and discuss the relative merits of building on the cloud.

Presenter Bio

Nick currently runs Built To Roam (http://www.builttoroam.com), which focuses on building rich mobile applications. Previously Nick was co-founder and Development Manager for nsquared solutions, where he led a team of developers building inspirational software using next wave technology. Prior to nsquared, Nick was the lead developer at Intilecta Corporation, where he was integrally involved in designing and building their application framework.
