Archives

Archives / 2008 / November
  • My durable WCF RESTful calculator

    A durable service in WCF is by definition a service that can persist all its internal state across calls in some durable storage. For every operation, the service state is retrieved from the storage, the operation is executed, and finally the state is persisted again in the storage. Therefore, there is no need to keep the service instance idle in memory while waiting for client calls. It is equivalent to a long-running session, which makes this feature ideal for long-running processes like workflows (in fact, workflow services are built on top of this feature).

    In order to create a durable service, WCF provides a "DurableService" attribute (a service behavior) that can be applied to a regular service definition. The service itself has to be either serializable or decorated with the DataContract and DataMember attributes so it can be serialized and stored in the persistent storage.

    The service activation, as in workflow services, is managed by the WCF context correlation mechanism. Once a service instance has been created, the client application has to propagate some context information (which includes the service instance id) in order to route all the new messages to the right service instance. Jesus has already discussed in more detail how this mechanism works in this post (although the post is a bit old and some names have changed since then, it is worth reading).

    For the purposes of this post, I decided to create a simple calculator example that exposes different operations through the classic HTTP verbs:

    [ServiceContract(Namespace = "http://Microsoft.WorkflowServices.Samples")]
    public interface ICalculator
    {
        [OperationContract()]
        [WebInvoke(Method = "POST")]
        int PowerOn();

        [OperationContract()]
        [WebInvoke(Method = "PUT", UriTemplate = "add")]
        int Add(int value);

        [OperationContract()]
        [WebInvoke(Method = "PUT", UriTemplate = "subtract")]
        int Subtract(int value);

        [OperationContract()]
        [WebInvoke(Method = "PUT", UriTemplate = "multiply")]
        int Multiply(int value);

        [OperationContract()]
        [WebInvoke(Method = "PUT", UriTemplate = "divide")]
        int Divide(int value);

        [OperationContract()]
        [WebInvoke(Method = "DELETE")]
        void PowerOff();
    }

    The implementation of this service is also quite straightforward.

    [Serializable]
    [DurableService]
    public class DurableCalculator : ICalculator
    {
        int _currentValue = 0;

        [DurableOperation(CanCreateInstance=true)]
        public int PowerOn()
        {
            return _currentValue;
        }

        [DurableOperation]
        public int Add(int value)
        {
            return (_currentValue += value);
        }

        [DurableOperation]
        public int Subtract(int value)
        {
            return (_currentValue -= value);
        }

        [DurableOperation]
        public int Multiply(int value)
        {
            return (_currentValue *= value);
        }

        [DurableOperation]
        public int Divide(int value)
        {
            return (_currentValue /= value);
        }

        [DurableOperation(CompletesInstance=true)]
        public void PowerOff()
        {
        }
    }

    As you can see, I decorated the service implementation with the "DurableService" and "DurableOperation" attributes to make this simple service a durable one.

    WCF only comes with built-in support for transferring the context information between the client and the service with HTTP cookies or SOAP headers. While cookies would be the natural mechanism for HTTP REST services, unfortunately they do not work as expected. The path that WCF uses for creating the cookies is relative, so the context manager throws the following exception on the client side:

    Unhandled Exception: System.Net.CookieException: An error occurred when parsing the Cookie header for Uri 'http://localhost:8080/DurableCalculator'. ---> System.Net.CookieException: The 'Path'='/DurableCalculator/PowerOn' part of the cookie  is invalid.

    As a workaround, we can use the custom context binding I created some weeks ago to exchange the context information as regular HTTP headers. The service configuration looks like this:

    <connectionStrings>
      <add name="DurableService" connectionString="Initial Catalog=SQLWorkflows;Data Source=.\SQLEXPRESS;Integrated Security=SSPI;"/>
    </connectionStrings>
    <system.serviceModel>
      <services>
        <service name="ServiceConsole.DurableCalculator" behaviorConfiguration="MyServiceBehavior">
          <endpoint address="" behaviorConfiguration="MyServiceBehavior" binding="webHttpContext" contract="ServiceConsole.ICalculator" />
        </service>
      </services>
      <behaviors>
        <endpointBehaviors>
          <behavior name="MyServiceBehavior">
            <webHttp />
          </behavior>
        </endpointBehaviors>
        <serviceBehaviors>
          <behavior name="MyServiceBehavior">
            <persistenceProvider type="System.ServiceModel.Persistence.SqlPersistenceProviderFactory, System.WorkflowServices, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" connectionStringName="DurableService"/>
          </behavior>
        </serviceBehaviors>
      </behaviors>
      <extensions>
        <bindingExtensions>
          <add name="webHttpContext" type="Microsoft.ServiceModel.Samples.WebHttpContextBindingCollectionElement, WebHttpContext, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
        </bindingExtensions>
      </extensions>
    </system.serviceModel>

    If you pay special attention to the configuration settings above, in addition to the binding configuration, I also included the "persistenceProvider" behavior for configuring the persistence provider that will serialize and store the service instance (in this case, the SQL Server provider).
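
    To give an idea of the interaction from the client side, here is a minimal sketch using HttpWebRequest. The header name "X-Context-InstanceId" is only an assumption for illustration; the actual header name depends on how the custom WebHttpContext binding serializes the context.

    // Minimal client sketch. Assumptions: the service listens at
    // http://localhost:8080/DurableCalculator and the custom binding exchanges the
    // durable context in a hypothetical "X-Context-InstanceId" HTTP header.
    using System;
    using System.IO;
    using System.Net;

    class CalculatorClient
    {
        static void Main()
        {
            // 1. POST to PowerOn creates a new durable service instance.
            var powerOn = (HttpWebRequest)WebRequest.Create("http://localhost:8080/DurableCalculator/PowerOn");
            powerOn.Method = "POST";
            powerOn.ContentLength = 0;
            string context;
            using (var response = (HttpWebResponse)powerOn.GetResponse())
            {
                context = response.Headers["X-Context-InstanceId"];
            }

            // 2. Subsequent calls propagate the context so WCF can load the same instance
            //    from the SQL persistence store.
            var add = (HttpWebRequest)WebRequest.Create("http://localhost:8080/DurableCalculator/add");
            add.Method = "PUT";
            add.ContentType = "text/xml";
            add.Headers["X-Context-InstanceId"] = context;
            using (var body = new StreamWriter(add.GetRequestStream()))
            {
                body.Write("<value>5</value>"); // illustrative serialization of the int argument
            }
            using (var response = (HttpWebResponse)add.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd()); // the running total returned by Add
            }
        }
    }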

    Now, thanks to this support, my calculator service will survive application and server restarts :). Download the complete code from this location.

    Read more...

  • Open Source alternatives in .NET for building RESTful services

    Usually all my posts about REST are about WCF or mention that technology in some way. Today, I decided to take a different approach and discuss some of the open source projects available today for building REST services: Resourceful and the Dream framework (both available for Mono as well).
    It is worth mentioning, however, that the WCF team has done an excellent job introducing the new Web Model in .NET 3.5; it has definitely helped a lot with the adoption of this kind of service on the .NET platform. In my opinion, there are still some aspects of WCF that could be improved:

    1. WCF services are hard to unit test. It is possible, but it requires some extra work. I already mentioned some techniques based on integration tests and mocks in this post, "Unit tests for WCF".
    2. Poor support for defining multiple resource representations/formats within a single operation definition.
    3. Any aspect you would like to add here?

    OK, I will now try to summarize some of the available features and implementation details in these two projects.

    Resourceful

    • Service definitions are totally imperative. Whereas a service definition (and its operations) in WCF is made declaratively through attributes (annotating classes with WCF attributes), the service definition in Resourceful is totally imperative; it has to be written in several lines of code.

    LocalApplicationDescription app = new LocalApplicationDescription();

    // get-user
    LocalApplicationMethod getUser = app.NewMethod("getUser", HttpMethod.Get, _usersController.GetUser);
    getUser.NewResponseRepresentation(MediaType.ApplicationXml);
    getUser.NewResponseRepresentation(MediaType.ApplicationExWwwFormUrlencoded);

    ApplicationResource userResource = app.NewResource("users/{username}", new TemplateParameter("username", "xsd:string"));
    app.Bind(userResource, getUser);

    The developer has to do two things: first, define the operation itself, specifying a friendly name along with the supported HTTP methods and resource representations; and afterwards, create a resource mapping ("users/{username}" in this case). {username} is a URI template placeholder, equivalent to the URI templates in WCF.

    The method signature for NewMethod is the following:

    NewMethod(string id, string name, Action<IRepresentationContext> handler)

    As you can see, the last argument is a delegate that points to the operation implementation. IRepresentationContext is equivalent to the WebOperationContext class in WCF; it contains all the runtime context settings that a service can use. This is actually better than WCF because IRepresentationContext can be mocked for unit tests.

    In the example above, _usersController is a simple class with the service implementation (this separation of concerns definitely helps a lot with unit testing). Some code for the GetUser operation implementation looks as follows:

    public void GetUser(IRepresentationContext context)
    {
        string username = context.TemplateParameters["username"];

        UserAccount user = this.Engine.FindUser(username);

        if (user == null)
        {
            this.RenderStatus(context, HttpStatus.NotFound);
            return;
        }

        // ... (the rest of the implementation renders the user representation)
    }

    • Support for WADL. The framework uses the service definition (the LocalApplicationDescription class in the example above) to publish all the available operations in the service.

     

    • Support for multiple resource representations in a single operation. This is possible because the framework does not support the concept of channels (or aspects) that can be plugged into the service to perform additional work; all the translation must be done in the operation implementation itself. The service operation can only get the resource representation as a stream from the IRepresentationContext (the same thing can be done in WCF if the input parameter for the operation method is a stream or a message):

    public void CreateUser(IRepresentationContext context)
    {
        if (context.Request.MediaType != MediaType.ApplicationXml)
        {
            this.RenderStatus(context, HttpStatus.UnsupportedMediaType);
            return;
        }

        UserAccount user = UserAccount.FromXml(new StreamReader(context.Request.GetEntityStream()));
        // ... (the rest of the implementation stores the new user)
    }

    In the example above, CreateUser only supports a POX (plain old XML) representation for the users, so it also performs the translation from XML to a user entity.
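
    For comparison, the same "manual negotiation" style is possible in WCF by declaring the operation parameter as a raw Stream and inspecting the incoming content type, as mentioned above. This is only a minimal sketch of the idea; the contract and type names here are mine, not taken from Resourceful or any WCF sample.

    using System.IO;
    using System.Net;
    using System.ServiceModel;
    using System.ServiceModel.Web;
    using System.Xml.Linq;

    [ServiceContract]
    public class UserService
    {
        [OperationContract]
        [WebInvoke(Method = "POST", UriTemplate = "users")]
        public void CreateUser(Stream body)
        {
            var request = WebOperationContext.Current.IncomingRequest;

            // Only accept the XML representation; reject anything else.
            if (request.ContentType == null || !request.ContentType.StartsWith("application/xml"))
            {
                WebOperationContext.Current.OutgoingResponse.StatusCode = HttpStatusCode.UnsupportedMediaType;
                return;
            }

            // The translation from the representation to an entity is done by hand.
            XDocument document = XDocument.Load(new StreamReader(body));
            // ... map the XML to a user entity and store it
        }
    }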

    • There is no support for channels or aspects to perform additional work before a message arrives at or leaves a service. There is, however, a special class for hosting service instances; it can be used from a console application or from IIS as an HTTP handler.

    Uri rootUri = new Uri("http://localhost:3000/v1/");

    BookmarkService service = new BookmarkService();

    HttpServer host = new HttpServer(rootUri);

    host.ReceiveWebRequest += delegate(HttpListenerContext context)
    {
        service.Process(new HttpListenerContextAdapter(context, rootUri));
    };

    host.Start();

    • Rich and fluent client API for consuming existing REST services. This is probably one of the nicest things about this framework; a lot of examples are provided for consuming well-known REST services on the web such as Amazon S3, ADO.NET Data Services, SimpleDB or Delicious, to name a few. This client API can be used from Silverlight applications as well.

    Dream Framework 

    • Service definitions are declarative, as in WCF. Two attributes are used: one for the service definition, "DreamService", and another for each operation in that service, "DreamFeature".

    [DreamService("Dream Tutorial Address-Book", "Copyright (c) 2006, 2007 MindTouch, Inc.",

          Info = "http://doc.opengarden.org/Dream_SDK/Tutorials/Address_Book",

          SID = new string[] { "http://services.mindtouch.com/dream/tutorial/2007/03/addressbook" }

        )]

        public class AddressBookService : DreamService {

            [DreamFeature("GET:firstname/{name}", "Get list of all addresses matching first name")]

            public Yield GetFirstName(DreamContext context, DreamMessage request, Result<DreamMessage> response) {

     

    I will not go into much detail here, but "DreamFeature" supports almost the same things as WCF and Resourceful: an HTTP verb and the URI template for the resource. For more information about how to create a Dream service from scratch, take a look at this page.

    • Every operation implementation should receive three arguments and return an enumerator. The arguments are basically the runtime context (Equivalent to WebOperationContext in WCF), the request message and a handler to send responses to the client.

    using Yield = System.Collections.Generic.IEnumerator<IYield>;

    public Yield GetFirstName(DreamContext context, DreamMessage request, Result<DreamMessage> response) {

    They implement a weird mechanism based on custom enumerators to execute callbacks on the service host. The service basically returns IYield objects representing callbacks using the C# "yield" keyword. The response is automatically sent to the client application when the service invokes "yield break".

    [DreamFeature("GET:addresses", "Get all addresses")]

    public Yield GetAddresses(DreamContext context, DreamMessage request, Result<DreamMessage> response) {

     

        // send back the entire address book

        lock(_addresses) {

            response.Return(DreamMessage.Ok(_addresses));

        }

        yield break;

    }

    • Support for long-running services with durable state; it's not clear to me, however, how they restore that state between calls. The fields must be annotated with the "DreamServiceState" attribute in order to use this feature.

    [DreamServiceState] private XDoc _addresses;

     

    Read more...

  • Using the WCF OAuth channel with an ADO.NET service

    As I promised in my previous post "OAuth Channel for WCF RESTful services", it is now time to show this new channel in action with a real service. To make this sample more interesting, I decided to base this implementation on an ADO.NET service that provides information about contacts.

    This post will be a kind of walk-through to demonstrate all the steps required to implement the ADO.NET service, and then the final integration with OAuth.

    1. Create a custom data source (IQueryable) implementation to use with the ADO.NET data service

    [DataServiceKey("Id")]

    public class Contact

    {

        public int Id { get; set; }

        public string Name { get; set; }

        public string Email { get; set; }

        public string Owner { get; set; }

    }

     

    public class ContactsData

    {

        static Contact[] _contacts;

     

        static ContactsData()

        {

            _contacts = new Contact[]{

              new Contact(){ Id=0, Name="Mike", Email="mike@contoso.com", Owner = "jane" },

              new Contact(){ Id=1, Name="Saaid", Email="Saaid@hotmail.com", Owner = "jane"},

              new Contact(){ Id=2, Name="John", Email="j123@live.com", Owner = "john"},

              new Contact(){ Id=3, Name="Pablo", Email="Pablo@mail.com", Owner = "john"}};

        }

     

        public IQueryable<Contact> Contacts

        {

            get { return _contacts.AsQueryable<Contact>(); }

        }

     

    }

    What I defined here is a simple Contact class representing the contact entity and a ContactsData class for the ADO.NET service data source. The service will automatically reflect the IQueryable properties in this data source class. The "DataServiceKey" attribute on top of the contact entity is required by ADO.NET Data Services to define an artificial primary key on custom classes (it took me some time to figure this out).
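
    Once hosted, ADO.NET Data Services exposes that IQueryable property through its standard URI conventions. Leaving the OAuth signing aside for a moment, here is a minimal sketch of how the entity set can be addressed (the host address is an assumption):

    using System;
    using System.Net;

    class ContactsClient
    {
        static void Main()
        {
            var client = new WebClient();

            // The Contacts property of ContactsData becomes an addressable entity set.
            string all = client.DownloadString("http://localhost/contacts.svc/Contacts");

            // Individual entities are addressed by the key declared with [DataServiceKey("Id")].
            string single = client.DownloadString("http://localhost/contacts.svc/Contacts(1)");

            Console.WriteLine(single);
        }
    }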

    2. Implement the ADO.NET data service

    public class contacts : DataService<ContactsData>
    {
        // This method is called only once to initialize service-wide policies.
        public static void InitializeService(IDataServiceConfiguration config)
        {
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        }

        [QueryInterceptor("Contacts")]
        public Expression<Func<Contact, bool>> OnQueryContact()
        {
            var name = Thread.CurrentPrincipal.Identity.Name;
            return c => c.Owner.Equals(name, StringComparison.OrdinalIgnoreCase);
        }
    }

    The "QueryInterceptor" in this service implementation basically filters the resulting contacts based on the authenticated user. As I showed in my previous post, the authentication is performed by the OAuth channel.

    3. Configure the OAuth WCF channel for the ADO.NET data service

    <%@ ServiceHost Language="C#" Factory="ExampleOAuthChannel.AppServiceHostFactory" Service="ADOServices.OAuth.contacts" %>

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Activation;
    using Microsoft.ServiceModel.Web;

    namespace ExampleOAuthChannel
    {
      class AppServiceHostFactory : ServiceHostFactory
      {
        protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
        {
            WebServiceHost2 result = new WebServiceHost2(serviceType, true, baseAddresses);
            result.Interceptors.Add(new OAuthChannel.OAuthInterceptor(
       ADOServices.OAuth.OAuthServicesLocator.Provider, ADOServices.OAuth.OAuthServicesLocator.AccessTokenRepository));
            return result;
        }
      }
    }

    Nothing new in this step; I only registered the OAuth interceptor in the WCF service host for the ADO.NET service.

    Download the complete example. (It includes a client application implementation as well)

    Read more...

  • OAuth channel for WCF RESTful services

    While OpenID and WS-Federation focus on delegating user identity (or a collection of identity claims), OAuth was designed to address a different and complementary scenario: the delegation of user authorization. In a few words, OAuth allows a client application to obtain the user's consent (as access tokens) for executing operations over private resources on the user's behalf.

    The analogy given by Eran Hammer-Lahav in his post "Explaining OAuth" captures very well what the specification tries to address:

    "Many luxury cars today come with a valet key. It is a special key you give the parking attendant and unlike your regular key, will not allow the car to drive more than a mile or two. Some valet keys will not open the trunk, while others will block access to your onboard cell phone address book. Regardless of what restrictions the valet key imposes, the idea is very clever. You give someone limited access to your car with a special key, while using another key to unlock everything else."

    Now, if we analyze the specification in more detail, we will see that the real purpose behind OAuth is to create a network of collaboration between applications. It will no longer be necessary to keep all our stuff in a single place; we can have, for instance, our pictures on one website, our contacts in another place, and a third application making use of them, all of these applications collaborating together.

    If you want to know more about how OAuth works, you should read the following posts

    When I initially said that OpenID and OAuth complement each other, I meant that the user can first be authenticated by an OpenID provider, and then redirected to the relying party to obtain his consent (authorization for consuming a private resource).

    Alex Henderson (aka Bittercoder) has written a pretty good OAuth library in .NET for implementing an OAuth consumer and service provider. The library is available here under an MIT license (do whatever you want with it), and it is very easy to use. Alex has definitely done a very good job.

    My WCF channel implementation for OAuth is built on top of his library; it basically transforms an OAuth token into a .NET security principal that can be used later within the service implementation. The channel is implemented as a RequestInterceptor, one of the new features introduced in the WCF REST Starter Kit. This interceptor basically captures the request at the channel level and performs all the validations required by OAuth. The following sample illustrates how the interceptors can be plugged into an existing service host (service.svc):

    <%@ ServiceHost Language="C#" Debug="true" Service="ExampleOAuthChannel.FeedService" Factory="ExampleOAuthChannel.AppServiceHostFactory"%>

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Activation;
    using Microsoft.ServiceModel.Web;

    namespace ExampleOAuthChannel
    {
      class AppServiceHostFactory : ServiceHostFactory
      {
        protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
        {
            WebServiceHost2 result = new WebServiceHost2(serviceType, true, baseAddresses);
            result.Interceptors.Add(new OAuthChannel.OAuthInterceptor(
       OAuthServicesLocator.Provider, OAuthServicesLocator.AccessTokenRepository));
            return result;
        }
      }
    }

    OAuthServicesLocator.Provider and OAuthServicesLocator.AccessTokenRepository are just part of the OAuth implementation.

    The interaction (and all the messages interchanged) between the consumer and the provider was very well summarized by Alex in his post "OAuth for beginners".

    The following code illustrates some of the functionality implemented in the OAuth interceptor:

    Message request = requestContext.RequestMessage;
    HttpRequestMessageProperty requestProperty = (HttpRequestMessageProperty)request.Properties[HttpRequestMessageProperty.Name];

    OAuthContext context = new OAuthContextBuilder().FromUri(requestProperty.Method, request.Headers.To);

    try
    {
        _provider.AccessProtectedResourceRequest(context);

        OAuthChannel.Models.AccessToken accessToken = _repository.GetToken(context.Token);

        TokenPrincipal principal = new TokenPrincipal(
            new GenericIdentity(accessToken.UserName, "OAuth"),
            accessToken.Roles,
            accessToken);

        InitializeSecurityContext(request, principal);
    }
    catch (OAuthException authEx)
    {
        XElement response = XElement.Load(new StringReader("<?xml version=\"1.0\" encoding=\"utf-8\"?><html xmlns=\"http://www.w3.org/1999/xhtml\" version=\"-//W3C//DTD XHTML 2.0//EN\" xml:lang=\"en\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xsi:schemaLocation=\"http://www.w3.org/1999/xhtml http://www.w3.org/MarkUp/SCHEMA/xhtml2.xsd\"><HEAD><TITLE>Request Error</TITLE></HEAD><BODY><DIV id=\"content\"><P class=\"heading1\"><B>" + HttpUtility.HtmlEncode(authEx.Report.ToString()) + "</B></P></DIV></BODY></html>"));

        Message reply = Message.CreateMessage(MessageVersion.None, null, response);
        HttpResponseMessageProperty responseProperty = new HttpResponseMessageProperty() { StatusCode = HttpStatusCode.Forbidden, StatusDescription = authEx.Report.ToString() };
        responseProperty.Headers[HttpResponseHeader.ContentType] = "text/html";
        reply.Properties[HttpResponseMessageProperty.Name] = responseProperty;
        requestContext.Reply(reply);

        requestContext = null;
    }

    It basically validates the OAuth ticket using the library written by Alex and initializes a new principal containing the ticket identity. If the ticket cannot be validated for some reason, it returns a friendly error to the consumer.
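
    To give an idea of how a service can consume that principal afterwards, here is a minimal sketch of a hypothetical operation; as the ADO.NET service example earlier in this page also shows, the identity initialized by the interceptor surfaces through Thread.CurrentPrincipal.

    using System.Net;
    using System.ServiceModel.Syndication;
    using System.ServiceModel.Web;
    using System.Threading;

    public class FeedService
    {
        public Atom10FeedFormatter GetFeed()
        {
            // The OAuth interceptor has already attached the principal to the request,
            // so standard IPrincipal checks work inside the operation.
            var principal = Thread.CurrentPrincipal;
            string user = principal.Identity.Name; // the user behind the OAuth access token

            if (!principal.IsInRole("reader")) // "reader" is an illustrative role name
            {
                WebOperationContext.Current.OutgoingResponse.StatusCode = HttpStatusCode.Forbidden;
                return null;
            }

            var feed = new SyndicationFeed { Title = new TextSyndicationContent("Feed for " + user) };
            return feed.GetAtom10Formatter();
        }
    }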

    UPDATE: Alex has now included the channel as part of the OAuth library. It is available at the following links,

    http://devdefined-tools.googlecode.com/svn/trunk/projects/oauth/DevDefined.OAuth.Wcf/

    http://devdefined-tools.googlecode.com/svn/trunk/projects/oauth/ExampleOAuthChannel/

    Coming next: "Using the WCF OAuth channel with an ADO.NET service" (the complete source code will be available as part of that post).

    Read more...

  • Routing to the right workflow service instance through URI templates (REST workflows part III)

    Continuing from my previous posts "REST and Workflow services play well together I" and "REST and Workflow services play well together II", in which I created a new binding to send the workflow context as an HTTP header, I've applied some modifications to that sample to also support routing through URI templates. This is very helpful for scenarios where client applications are not necessarily aware of the existence of a workflow service on the other end, so the context information is reconstructed from the resource URI using URI templates.

    For example, the URI template "/orders/{instanceId}" applied to the URI http://localhost:8000/orders/3C3DE415-71F4-4538-ADA6-87A735827DF6 will result in a match for the key "instanceId" with a value of "3C3DE415-71F4-4538-ADA6-87A735827DF6". We can then pass that instanceId value to the WF context correlation manager so it is able to locate and load the corresponding workflow service instance.
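
    The matching itself can be done with the System.UriTemplate class introduced in .NET 3.5. A minimal sketch of the extraction, outside of the binding, just to show the idea:

    using System;

    class UriTemplateSample
    {
        static void Main()
        {
            var template = new UriTemplate("orders/{instanceId}");
            var baseAddress = new Uri("http://localhost:8000/");
            var request = new Uri("http://localhost:8000/orders/3C3DE415-71F4-4538-ADA6-87A735827DF6");

            UriTemplateMatch match = template.Match(baseAddress, request);
            if (match != null)
            {
                // The bound variable becomes the workflow instance id handed to the context manager.
                Guid instanceId = new Guid(match.BoundVariables["instanceId"]);
                Console.WriteLine(instanceId);
            }
        }
    }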

    The configuration for the binding on the service side looks as follows:

    <webHttpContext>
      <binding name="myServiceBinding" contextMode="UriTemplate">
        <uriTemplates>
          <add name="orders" value="order/{instanceId}"></add>
          <add name="payments" value="payment/order/{instanceId}"></add>
        </uriTemplates>
      </binding>
    </webHttpContext>

    uriTemplates is a collection; you can configure there all the possible URIs that your workflow can handle. In this example, instanceId is the name of the variable used by WF to locate the workflow instance; other variable names can also be used. For example, "conversationId" can be used to continue the workflow in a specific ReceiveActivity (usually required when the workflow is waiting for some event in a parallel branch activity).

    We can now return the following links (representing the possible branches in the workflow) to the client:

    currentOrder.Next = new Next[] {
        new Next
        {
            Rel = "http://starbucks.example.org/payment",
            Uri = "http://localhost:8000/payment/order/" + WorkflowEnvironment.WorkflowInstanceId.ToString(),
        },
        new Next
        {
            Rel = "http://starbucks.example/order/update",
            Uri = "http://localhost:8000/order/" + WorkflowEnvironment.WorkflowInstanceId.ToString()
        }
    };

    The client application no longer needs to remember to send a context header in the next call to resume an existing workflow instance; the URI is enough to reconstruct the context information.

    Download the complete sample from this location

    Read more...

  • Conditional Puts in REST

    Conditional puts are a technique generally used in a REST architecture to inform clients about possible conflicts when multiple updates are performed simultaneously over the same resource version. It basically works as a first-write/first-win approach: a client can commit an operation only if the underlying resource has not changed in the meantime; otherwise it may receive an HTTP conflict error.

    As the "conditional get" approach I discussed some weeks ago, conditional puts are also based on the use of the headers "Last-Modified" and "Etag" (the If-Not-Modified and If-None-Match request headers are their flip side). It is used nowadays for example by the "ATOM publishing protocol", and there are some plans to support it in Amazon S3.

    Let's see how this approach works with a simple example of two clients (A and B) trying to update a resource R1.

    1. Client A performs a GET over R1 (version 1). The response given by the REST service will include the resource representation and an "ETag" header with the resource version, "1" in this case (Last-Modified could also be used).

    2. Client B performs a GET over R1 (version 1). It gets the same representation as client A.

    3. Client B modifies R1 and invokes a PUT on the server. This invocation includes the modified representation of the resource and an "If-Match" header with the current resource version (version 1). As a result of this modification, the server returns an HTTP OK and increments the resource version by one (version 2).

    4. Client A modifies R1 and tries to invoke a PUT on the server (this invocation also includes an "If-Match" header with the resource version "version 1"). The service detects that the resource has changed since the last GET, so it returns a conflict error (see the sketch after this list).
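
    A minimal sketch of what the service side of step 4 could look like in WCF. The in-memory resource and version tracking are mine, just for illustration; they are not taken from the Starter Kit sample:

    using System.Net;
    using System.ServiceModel;
    using System.ServiceModel.Web;

    [ServiceContract]
    public class ResourceService
    {
        // A single in-memory resource with its current version, for illustration only.
        static string _resource = "initial content";
        static string _etag = "1";

        [OperationContract]
        [WebInvoke(Method = "PUT", UriTemplate = "r1")]
        public void Update(string newContent)
        {
            var incoming = WebOperationContext.Current.IncomingRequest;
            var outgoing = WebOperationContext.Current.OutgoingResponse;

            // The client sends back the ETag it received in step 1 or 2.
            string condition = incoming.Headers[HttpRequestHeader.IfMatch];
            if (condition != _etag)
            {
                // Someone else updated the resource in the meantime (step 3): reject the write.
                outgoing.StatusCode = HttpStatusCode.Conflict;
                return;
            }

            _resource = newContent;
            _etag = (int.Parse(_etag) + 1).ToString(); // bump the version
            outgoing.Headers[HttpResponseHeader.ETag] = _etag;
        }
    }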

    If you want to know more about how to implement this technique with WCF, the new WCF REST Starter Kit includes a complete example "ConditionPut" that illustrates all the steps involved in the implementation of this scenario.

    Read more...

  • Adding documentation to WCF Restful services with the REST Starter Kit

    Automatic documentation is another cool feature introduced in the WCF REST Starter Kit. While documentation is an important aspect of the development process, unfortunately there is no standard mode or guidance yet about how it should be done for REST services. This new feature helps a little with this aspect of the process.

    As part of the starter kit, the WCF team has created a new [WebHelp] attribute that can be used to annotate existing web operations with human-readable descriptions. The [WebHelp] attribute (like the WebCache attribute) is an operation behavior implementation that receives a simple string describing the operation. The operation arguments are automatically reflected and shown as part of the service documentation.

    [WebInvoke(UriTemplate = "DoWork?json", ResponseFormat = WebMessageFormat.Json)]
    [WebHelp(Comment="This method returns a HTTP status code Conflict with a custom json error body")]
    [OperationContract]
    SampleResponseBody DoWorkJson(SampleRequestBody request);

    Now, the question is: once we have this attribute applied to all the web operations of an existing service, where is the documentation actually published? Well, this feature also requires the use of a new service host, "WebServiceHost2", that comes with the starter kit, which among other things publishes a new "/help" endpoint for the service. Therefore, the final documentation for a service will be available at "ServiceUri" + "/help", for example, http://localhost/MyService.svc/help.

     

    The result is an Atom feed published at the /help endpoint. Each entry in the feed provides documentation for one of the available web operations.
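
    Since the documentation is exposed as a regular Atom feed, it can even be consumed programmatically. A minimal sketch (the service address is an assumption):

    using System;
    using System.ServiceModel.Syndication;
    using System.Xml;

    class HelpFeedReader
    {
        static void Main()
        {
            using (var reader = XmlReader.Create("http://localhost/MyService.svc/help"))
            {
                SyndicationFeed help = SyndicationFeed.Load(reader);
                foreach (SyndicationItem operation in help.Items)
                {
                    // Each entry describes one web operation and its WebHelp comment.
                    Console.WriteLine(operation.Title.Text);
                }
            }
        }
    }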

    The new service host can be configured for an existing service by modifying the .svc file (if the service runs on IIS):

    <%@ ServiceHost Language="C#" Debug="true" Service="Service.Service" Factory="Microsoft.ServiceModel.Web.WebServiceHost2Factory"%>

    Read more...

  • Using SqlCache dependencies with the new WCF WebCache attribute (REST Starter KIT)

    Let's begin with a hypothetical example in which we want to publish a simple product catalog as an ATOM feed, based on the Products table of the Northwind sample database.

    The first step is to enable SQL cache notifications for the products database, which can be done with the following commands:

    1. Enable cache notifications for the database: aspnet_regsql [Credentials] -ed -d [DatabaseName]. For example, aspnet_regsql -S .\SqlExpress -E -ed -d Northwind

    2. Enable cache notifications for a specific table in that database: aspnet_regsql [Credentials] -et -d [DatabaseName] -t [TableName]. For example, aspnet_regsql -S .\SqlExpress -E -et -d Northwind -t Products

    Once we have enabled the ASP.NET SQL notifications for the table we want to use (Products in this example), the next step is to implement the service and configure it correctly to use the new WebCache behavior.

    The contract for the service is quite simple; it receives an optional categoryId parameter in case we want to filter the products:

    [ServiceContract]
    public interface IProductCatalog
    {
      [WebCache(Location = OutputCacheLocation.ServerAndClient, SqlDependency = "myDatabase:Products", VaryByParam = "categoryId")]
      [WebGet(UriTemplate = "?category={categoryId}")]
      [OperationContract]
      Atom10FeedFormatter GetProducts(int categoryId);
    }

    As you can see in the code above, I defined the new WebCache behavior as part of my operation contract. I also specified two conditions that affect the ASP.NET cache entries:

    1. A change is introduced in the Products table (the ASP.NET SQL dependency invalidates the cached entries).

    2. A different categoryId argument is specified in the query string (each value gets its own cache entry; see the short sketch after this list).
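
    As a quick illustration of the second point, two requests that differ only in the categoryId query string produce (and later hit) two separate cache entries. A minimal sketch, assuming the service is hosted at http://localhost/ProductCatalog.svc:

    using System;
    using System.Net;

    class CatalogClient
    {
        static void Main()
        {
            var client = new WebClient();

            // Each distinct categoryId value gets its own cached copy of the feed (VaryByParam),
            // and both copies are dropped as soon as the Products table changes (SqlDependency).
            string category1 = client.DownloadString("http://localhost/ProductCatalog.svc/?category=1");
            string category2 = client.DownloadString("http://localhost/ProductCatalog.svc/?category=2");

            Console.WriteLine(category1.Length + " / " + category2.Length);
        }
    }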

    The caching preferences for this service are configured in the application's web configuration file:

    <connectionStrings>
      <add name="myDatabase" connectionString="Data source=.\SQLExpress;Initial Catalog=Northwind;Trusted_Connection=yes"/>
    </connectionStrings>
    <system.web>
      <compilation debug="true"></compilation>
      <caching>
        <sqlCacheDependency enabled="true" pollTime="1000" >
          <databases>
            <add name="myDatabase" connectionStringName="myDatabase" />
          </databases>
        </sqlCacheDependency>
      </caching>
    </system.web>
    <system.serviceModel>
      <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
    </system.serviceModel>

    The ASP.NET compatibility mode is required in order to use this new caching mechanism in our REST service. The service implementation is quite normal; there was no need to add any extra code to support caching, which is great thanks to the new WebCache behavior.

    [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
    public class ProductCatalog : IProductCatalog
    {
        public Atom10FeedFormatter GetProducts(int categoryId)
        {
            var connectionString = ConfigurationManager.ConnectionStrings["myDatabase"].ConnectionString;

            var items = new List<SyndicationItem>();
            using (var connection = new SqlConnection(connectionString))
            {
                var command = new SqlCommand();
                if (categoryId == 0)
                    command.CommandText = "SELECT * FROM Products";
                else
                {
                    command.CommandText = "SELECT * FROM Products WHERE CategoryId = @categoryId";
                    command.Parameters.Add(new SqlParameter("@categoryId", categoryId));
                }

                command.Connection = connection;
                connection.Open();

                using (var reader = command.ExecuteReader(CommandBehavior.CloseConnection))
                {
                    while (reader.Read())
                    {
                        items.Add(new SyndicationItem()
                        {
                            Id = String.Format(CultureInfo.InvariantCulture, "http://products/{0}", (int)reader["ProductID"]),
                            Title = new TextSyndicationContent((string)reader["ProductName"]),
                            LastUpdatedTime = new DateTime(2008, 7, 1, 0, 0, 0, DateTimeKind.Utc),
                            Authors =
                            {
                                new SyndicationPerson()
                                {
                                    Name = "cibrax"
                                }
                            },
                            Content = new TextSyndicationContent(string.Format("Category Id {0} - Price {1}",
                                (int)reader["CategoryId"], (decimal)reader["UnitPrice"]))
                        });
                    }
                }
            }

            var feed = new SyndicationFeed()
            {
                Id = "http://Products",
                Title = new TextSyndicationContent("Product catalog"),
                Items = items
            };

            WebOperationContext.Current.OutgoingResponse.ContentType = "application/atom+xml";
            return feed.GetAtom10Formatter();
        }
    }

    If you are interested in knowing more details about the WebCache behavior implementation, my friend Jesus Rodriguez wrote an excellent post a few days ago.

    Read more...