March 2009 - Posts

Streaming large content such as media, images, or files is a common scenario for RESTful services. If a scenario like this is not well addressed on the service side, there is a high risk of exhausting server resources such as memory or CPU in a matter of seconds. This usually happens when the complete content is loaded into memory before it is transferred to the service consumer.

The WCF REST Starter Kit introduced a new mechanism for addressing this scenario, an “AdapterStream” utility class, which pushes small chunks of content to the client application as they are needed, so only small memory buffers are used.

The kit also comes with a sample, “PushStyleStreaming”, that shows this class in action:

Stream GetImage(string text)
{
    if (string.IsNullOrEmpty(text))
    {
        throw new WebProtocolException(HttpStatusCode.BadRequest, "text must be specified", null);
    }

    Bitmap theBitmap = GenerateImage(text);
    WebOperationContext.Current.OutgoingResponse.ContentType = "image/jpeg";
    return new AdapterStream((stream) => theBitmap.Save(stream, ImageFormat.Jpeg));
}

In the code above, the AdapterStream is used to transfer an image to a service consumer. As you can see, the class receives a lambda expression (an Action<T> delegate) in its constructor, which provides some flexibility in how the final stream sent to the client application is generated.

Here is another example, which uses a TextWriter to generate text content:

new AdapterStream((writer) =>
    {
        writer.WriteLine("You said: ");
        writer.WriteLine(text);
        writer.WriteLine("Didn't you?");
        writer.Flush();
    }, Encoding.UTF8);

In addition to this feature, you might also want better control over service usage by restricting the throttling settings. This is an advantage of REST services implemented on the WCF stack: other implementations, such as ASP.NET MVC, rely on ASP.NET to handle this aspect, where the settings apply only at the application-domain level (that is, for all services running in the same application).
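
As a rough illustration of per-host throttling (the factory name and the limits below are mine, not from the starter kit), the standard ServiceThrottlingBehavior can be applied from a custom service factory:

class ThrottledServiceHostFactory : ServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        ServiceHost host = new ServiceHost(serviceType, baseAddresses);

        // Find or add the standard throttling behavior (System.ServiceModel.Description)
        ServiceThrottlingBehavior throttling = host.Description.Behaviors.Find<ServiceThrottlingBehavior>();
        if (throttling == null)
        {
            throttling = new ServiceThrottlingBehavior();
            host.Description.Behaviors.Add(throttling);
        }

        throttling.MaxConcurrentCalls = 16;     // messages processed at the same time
        throttling.MaxConcurrentSessions = 10;  // concurrent channel sessions
        throttling.MaxConcurrentInstances = 26; // live service instances
        return host;
    }
}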

Posted by cibrax | 3 comment(s)

It’s very common when developing RESTful services to authenticate users against a proprietary user database. This is generally done with a combination of username and password through HTTP basic authentication. Unfortunately, basic authentication is tied to Windows accounts in IIS, which forces us to find alternatives or workarounds to support this scenario. WCF 3.5 made it possible to authenticate transport credentials with a custom UsernamePasswordValidator extension; however, this approach does not work for IIS-hosted services.

Dominick solved this problem with a module plugged directly into the ASP.NET pipeline that works like a charm, but it requires some additional WCF settings and a custom IAuthorizationPolicy to flow the user principal to the WCF service instance. His solution works for ASP.NET applications as well.

This problem can also be solved at a deeper level in the WCF transport model using a message interceptor. The message interceptor can receive a traditional membership provider in its constructor and use it later to authenticate the users. In addition, this message interceptor can automatically pass the user credentials to the WCF service instance.

Like any other message interceptor, it can be configured directly in the WCF service factory with no additional configuration. This can easily be done in the “.svc” file hosted in IIS:

<%@ ServiceHost Language="C#" Debug="true" Service="Service" Factory="AppServiceHostFactory" %>

class AppServiceHostFactory : ServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        WebServiceHost2 result = new WebServiceHost2(serviceType, true, baseAddresses);
        result.Interceptors.Add(new BasicAuthenticationInterceptor(
            System.Web.Security.Membership.Provider, "foo"));
        return result;
    }
}

BasicAuthenticationInterceptor is the message interceptor I built for this post. It receives the MembershipProvider and a default realm, which is returned together with a 401 (Unauthorized) HTTP error when the user cannot be authenticated.
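
The interceptor itself ships with the downloadable sample rather than with this post, but a rough sketch of the idea, assuming the starter kit’s RequestInterceptor base class (the names and details here are illustrative, and the code that flows the authenticated principal to the service instance is omitted), could look like this:

public class BasicAuthenticationInterceptor : RequestInterceptor
{
    private readonly MembershipProvider provider;
    private readonly string realm;

    public BasicAuthenticationInterceptor(MembershipProvider provider, string realm)
        : base(true) // process requests synchronously
    {
        this.provider = provider;
        this.realm = realm;
    }

    public override void ProcessRequest(ref RequestContext requestContext)
    {
        var httpRequest = (HttpRequestMessageProperty)
            requestContext.RequestMessage.Properties[HttpRequestMessageProperty.Name];
        string header = httpRequest.Headers[HttpRequestHeader.Authorization];

        if (header != null && header.StartsWith("Basic ", StringComparison.OrdinalIgnoreCase))
        {
            // The header payload is Base64("username:password")
            string[] credentials = Encoding.UTF8.GetString(
                Convert.FromBase64String(header.Substring(6))).Split(':');
            if (credentials.Length == 2 && provider.ValidateUser(credentials[0], credentials[1]))
                return; // authenticated; let the request reach the service
        }

        // Otherwise, challenge the client with a 401 and the configured realm
        Message reply = Message.CreateMessage(MessageVersion.None, null);
        var responseProperty = new HttpResponseMessageProperty { StatusCode = HttpStatusCode.Unauthorized };
        responseProperty.Headers[HttpResponseHeader.WwwAuthenticate] = "Basic realm=\"" + realm + "\"";
        reply.Properties[HttpResponseMessageProperty.Name] = responseProperty;
        requestContext.Reply(reply);
        requestContext = null; // stop processing; the service is never invoked
    }
}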

The example is available to download from here.

Posted by cibrax | 11 comment(s)

Doug Purdy and Chris Sells announced today at MIX the availability of two new DSLs for RESTful services: MUrl, for defining RESTful clients, and MService, for defining the service implementation.

More information about MUrl can be found on Doug’s blog, http://www.douglaspurdy.com/2009/03/20/murl-a-dsl-for-restful-clients/ . An example is also available to download from the Oslo Dev Center, http://msdn.microsoft.com/en-us/oslo/default.aspx

MService, on the other hand, is a DSL for defining or creating RESTful services. Doug just posted new information about it: http://www.douglaspurdy.com/2009/03/20/mservice-a-dsl-for-restful-services/

The MIX session where these new technologies were shown will be available soon at this location.

Enjoy.

Posted by cibrax | 1 comment(s)

Continuing my post “Brokered authentication for REST active clients”, today I will show how the client code can be simplified using the new HttpClient (WCF REST Starter Kit Preview 2) and some custom HTTP processing stages attached to its pipeline.

The first thing we have to do is implement a custom processing stage (a class that derives from HttpStage) to centralize all the logic needed to negotiate a SAML token from an existing STS.

The pipeline basically contains two kinds of stages: regular HTTP stages, which can be injected through the HttpClient.Stages collection, and a more specialized implementation, HttpWebRequestTransportStage, which runs last in the pipeline and has access to all the transport settings. The latter can only be replaced with a custom version of the HttpClient that overrides the protected method “CreateTransportStage”:

public class HttpClient : IDisposable
{
    protected virtual HttpStage CreateTransportStage();
}

Having said that, two possible options for implementing the token negotiation in a pipeline stage are:

1. A regular http stage that can be initialized with the STS address and the user credentials through the class constructor or a property setter.

2. A custom HttpWebRequestTransportStage and the corresponding HttpClient (FederatedHttpClient) implementation to return that stage.

From my point of view, the second approach seems to work better because the HttpClient instance does not get tied to the user credentials. This is the approach I will use for this example.

public class NegociateTokenStage : HttpWebRequestTransportStage
{
    private string stsUri = "";

    public NegociateTokenStage(string stsUri) : base()
    {
        this.stsUri = stsUri;
    }

    protected override void ProcessRequestAndTryGetResponse(HttpRequestMessage request, out HttpResponseMessage response, out object state)
    {
        string token = GetToken(stsUri, request.Uri.AbsoluteUri, this.Settings.Credentials);
        request.Headers.Add("Authorization", token);
        base.ProcessRequestAndTryGetResponse(request, out response, out state);
    }
}

The custom transport stage derives from the built-in transport stage “HttpWebRequestTransportStage” and adds some custom code in ProcessRequestAndTryGetResponse to negotiate the SAML token from the STS before the final service gets called (this is done in the GetToken method). After that, the SAML token gets passed to the final service through the Authorization HTTP header.

The custom implementation of the HttpClient is quite simple; it only returns our custom transport stage in the CreateTransportStage method:

public class FederatedHttpClient : HttpClient
{
    public string StsUri { get; set; }

    protected override HttpStage CreateTransportStage()
    {
        NegociateTokenStage stage = new NegociateTokenStage(this.StsUri);
        stage.Settings = this.TransportSettings;
        return stage;
    }
}

Now the client application can use our custom version of the HttpClient to consume the final service; only a few lines are required:

FederatedHttpClient client = new FederatedHttpClient { StsUri = "http://localhost:7481/STS/Service.svc/Tokens" };
client.TransportSettings.Credentials = new NetworkCredential("cibrax", "foo");
string response = client.Get("http://localhost:7397/RestServices/Service.svc/Claims").Content.ReadAsString();

The SAML negotiation is totally transparent to the client application; it does not even know that a SAML token exists. Sweet :).

The code is available to download at this location.

UPDATE: As John Lambert from the WCF team pointed out, a custom transport stage also needs to override BeginProcessRequestAndTryGetResponse and EndProcessRequestAndTryGetResponse to support async scenarios. I will try to update the example to override these methods soon. Thanks, John, for the feedback!

Posted by cibrax | 2 comment(s)

PollingAgent is a utility class for consuming REST services that implement conditional GETs. An instance of this class will periodically invoke the service at a predefined interval and fire an event on the client side when a new response is available to consume. It is internally layered on top of the HttpClient class, so all the pipeline infrastructure provided by that class is also available to this polling agent.

As I discussed in the post “Conditional Get in REST”, conditional GETs provide an effective and standard mechanism for caching information on the client side. The mechanism is based on the ETag/If-None-Match and Last-Modified/If-Modified-Since HTTP headers, and it is commonly used by feed readers to be notified about new items in a feed.
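
To recap what happens under the hood, here is a minimal sketch (mine, not the agent’s code) of a single conditional GET with plain HttpWebRequest: send the last known ETag, and only download the content when it has changed.

static string PollOnce(string uri, string etag)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    if (etag != null)
        request.Headers[HttpRequestHeader.IfNoneMatch] = etag;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // 200 OK: new content is available; remember the ETag for the next poll
            return response.Headers[HttpResponseHeader.ETag];
        }
    }
    catch (WebException ex)
    {
        HttpWebResponse response = ex.Response as HttpWebResponse;
        if (response != null && response.StatusCode == HttpStatusCode.NotModified)
            return etag; // 304 Not Modified: the cached copy is still valid
        throw;
    }
}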

This polling agent implementation internally keeps track of those headers, so it hides most of the conditional GET details from the client application.

public class PollingAgent : IDisposable
{
    public PollingAgent();

    public HttpClient HttpClient { get; set; }
    public bool IgnoreExpiresHeader { get; set; }
    public bool IgnoreNonOKStatusCodes { get; set; }
    public bool IgnoreSendErrors { get; set; }
    public TimeSpan PollingInterval { get; set; }

    public event EventHandler<ConditionalGetEventArgs> ResourceChanged;

    public void Dispose();
    public void StartPolling();
    public void StartPolling(Uri uri);
    public void StartPolling(Uri uri, EntityTag etag, DateTime? lastModifiedTime);
    public void StopPolling();
}

public class ConditionalGetEventArgs : EventArgs
{
    public ConditionalGetEventArgs();

    public HttpResponseMessage Response { get; set; }
    public Exception SendError { get; set; }
    public bool StopPolling { get; set; }
}

The public API is quite straightforward: a couple of methods to start/stop polling the REST service, a “ResourceChanged” event to get notified about changes, and some properties to configure the agent.
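
Based on that API, a typical usage could look like this (the URI and interval below are illustrative):

PollingAgent agent = new PollingAgent();
agent.PollingInterval = TimeSpan.FromMinutes(5);
agent.ResourceChanged += (sender, e) =>
{
    if (e.SendError != null)
    {
        e.StopPolling = true; // give up on transport errors
        return;
    }
    // A new representation of the resource is available
    string content = e.Response.Content.ReadAsString();
};
agent.StartPolling(new Uri("http://localhost/feed"));
// ... later ...
agent.StopPolling();
agent.Dispose();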

AtomPubClient is a specialized version of the HttpClient for consuming AtomPub services. It provides different overloads for creating, updating, getting, or deleting syndication entries through AtomPub.

As you can see, the WCF REST team is definitely doing a great job of facilitating the adoption of REST services in the platform.

Posted by cibrax | 3 comment(s)

HttpClient is a new utility class introduced in the WCF REST Starter Kit Preview 2 for consuming REST services. This new class is made up of three different parts:

1. A rich object model for manipulating HTTP request and response objects in a more natural way. This is done through the HttpRequestMessage and HttpResponseMessage classes.

2. Overloads and extension methods that reduce the number of lines of code required to consume a service. For example, the HttpClient class itself contains method overloads for sending messages to a service through the well-known HTTP verbs, such as GET, POST, PUT, DELETE, HEAD, or OPTIONS. The HttpRequestMessage and HttpResponseMessage classes also contain methods for serializing/deserializing the incoming/outgoing messages in different formats.

HttpClient client = new HttpClient();
var echo = client.Get(new Uri("http://localhost:1449/MyRestService/Service.svc/Echo")).Content.ReadAsString();

The sample code above sends an HTTP GET request to an “Echo” service and reads the response as a simple string.
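
The other verbs follow the same pattern. As a hedged sketch (assuming a Post overload that takes an HttpContent, in the style of the Get overload above):

HttpClient client = new HttpClient();
HttpContent body = HttpContent.Create("hello");
HttpResponseMessage response = client.Post(new Uri("http://localhost:1449/MyRestService/Service.svc/Echo"), body);
string echoed = response.Content.ReadAsString();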

3. An extensible execution pipeline made up of customizable steps or stages. The pipeline acts as a coordinator: each stage executes in turn and then returns control to the pipeline. This feature provides enough flexibility to perform additional work on the request/response messages, such as validation or caching, to name a few.

Every stage in the pipeline is represented by an HttpStage class. This class has two methods to override:

public class MyCustomStage : HttpStage
{
    protected internal override void ProcessRequestAndTryGetResponse(HttpRequestMessage request, out HttpResponseMessage response, out object state)
    {
    }

    protected internal override void ProcessResponse(HttpResponseMessage response, object state)
    {
    }
}

“ProcessRequestAndTryGetResponse” is executed first in the pipeline and allows some processing of the request message. This method can also return a response message; if that happens, all the stages that come after this one are skipped and the response is returned to the client application. As a result, the REST service does not get called, because the last stage in the pipeline is usually the one that sends the request over HTTP to the service. This approach is generally useful for caching or for mocking responses in unit tests.

The following example interrupts the pipeline execution and returns a string (“myfoo”) as the response message:

public class MyStage : HttpStage
{
    protected internal override void ProcessRequestAndTryGetResponse(HttpRequestMessage request, out HttpResponseMessage response, out object state)
    {
        response = new HttpResponseMessage
        {
            Content = HttpContent.Create("myfoo"),
            Method = "GET",
            StatusCode = System.Net.HttpStatusCode.OK,
            Uri = request.Uri
        };
        state = null;
    }

    protected internal override void ProcessResponse(HttpResponseMessage response, object state)
    {
    }
}

The “ProcessResponse” method is executed after a response message is found in the pipeline. As its name states, it allows additional processing of that response message.

A more specialized version of these stages is also provided by the starter kit, HttpProcessingStage:

public class MyCustomStage : HttpProcessingStage
{
    public override void ProcessRequest(HttpRequestMessage request)
    {
    }

    public override void ProcessResponse(HttpResponseMessage response)
    {
    }
}

This class derives from the standard “HttpStage” and adds some plumbing code to simplify the work of a developer interested only in inspecting or modifying the request or response messages. It works like a message inspector in WCF.
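
For instance, a stage that stamps every outgoing request with a custom header could look like this (an illustrative sketch; the names are mine, not from the kit’s samples):

public class ApiKeyStage : HttpProcessingStage
{
    private readonly string apiKey;

    public ApiKeyStage(string apiKey)
    {
        this.apiKey = apiKey;
    }

    public override void ProcessRequest(HttpRequestMessage request)
    {
        // Runs before the transport stage sends the request
        request.Headers.Add("X-ApiKey", apiKey);
    }

    public override void ProcessResponse(HttpResponseMessage response)
    {
        // Runs after a response is available; inspect it here if needed
        Console.WriteLine("{0} -> {1}", response.Uri, response.StatusCode);
    }
}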

These custom stages can be added to the HttpClient through the “Stages” collection:

client.Stages.Add(new MyStage());

If you want to start playing with this useful new class, go and grab the latest preview of the Starter Kit from CodePlex.

Posted by cibrax | 1 comment(s)

T4 is a powerful template engine for code generation that ships out of the box with Visual Studio. It is an evolution of T3, which was initially introduced a couple of years ago as part of the DSL toolkits and software factories.

Today, it is getting more attention from other product teams as well; for instance, the ASP.NET MVC and Entity Framework teams have recently announced that they will ship T4 templates as part of their products. That will provide a way to customize the code they generate from custom tools or Visual Studio item templates.

A cool thing is that you do not need any custom tooling to automate the code generation. You can simply add a T4 template to any Visual Studio project, rename it to use the extension “.tt”, and Visual Studio will do the rest.

One of the major pains, however, is the authoring experience for creating or modifying T4 templates in Visual Studio; there is no built-in support for doing that. The company where I currently work, Clarius Consulting, has made the best designer ever for authoring T4 templates within Visual Studio, the T4 Editor. Some of the features included with this designer are syntax coloring, IntelliSense, and code preview, to name a few.

This last weekend, while I was delayed a whole day in DC due to mechanical problems with one of the airplanes, I decided it was a good moment to start playing with this technology and do something productive with my time. The result was a T4 template for auto-generating a DTO (data transfer object) layer based on WCF data contracts from an existing entity model. Using DTOs is a common practice for transferring the state of different entities across service boundaries; they are frequently found in system designs that follow DDD principles.

Although the resulting code is practically useless as-is, it can easily be customized to support different scenarios (or, at least, it will give you an idea of how this can be done).

The structure of a T4 template is quite simple (and somewhat similar to an ASP.NET page without any controls): “<# #>” wraps multiple lines of code, “<#= #>” is for inline expressions, and the rest of the template is treated as literal text.
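
As a trivial illustration of that syntax (not part of the DTO template), the following template emits three lines of text:

<#@ template language="C#" #>
<#@ output extension=".txt" #>
<# for (int i = 0; i < 3; i++) { #>
Line number <#= i #>
<# } #>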

For this example I used a model with a few entities (an anemic domain model, I would say):

public class Employee
{
    public string Name { get; set; }
    public Employee Boss { get; set; }
    public Company Company { get; set; }
}

public class Company
{
    public string CompanyName { get; set; }
    public List<Employee> Employees { get; set; }
}

The template filters the types to include in the code generation process with a LINQ expression; in this case, all the entities in the "EntitiesToDTO.Entities" namespace.

<#
var entitiesNamespace = "EntitiesToDTO.Entities";

// Use another expression here to filter the entities
var typesToRegister = from t in LoadProjectAssembly(entitiesAssembly).GetExportedTypes()
                      where t.Namespace == entitiesNamespace && t.IsClass && !t.IsAbstract
                      select t;
#>

The resulting code is also quite straightforward; it includes a partial class that can be extended to support additional mappings.

[DataContract(Name="employee", Namespace="urn:EntitiesToDTO/Entities")]

public partial class EmployeeDTO   

{

    [DataMember(Name="name")]

    public System.String Name

    {

        get; set;

    }

    [DataMember(Name="boss")]

    public EmployeeDTO Boss

    {

        get; set;

    }

    [DataMember(Name="company")]

    public CompanyDTO Company

    {

        get; set;

    }

    public Employee MapTo(EmployeeDTO dto)

    {

        return GetMapper().MapTo(dto);

    }

    public static EmployeeDTO MapFrom(Employee entity)

    {

        return GetMapper().MapTo(entity);

    }

    public static EmployeeMapper GetMapper()

    {

        return new EmployeeMapper();  

    }

    public partial class EmployeeMapper

    {

        public EmployeeDTO MapTo(EntitiesToDTO.Entities.Employee entity)

        {

            var dto = new EmployeeDTO

            {  

                Name = entity.Name,

                Boss = EmployeeDTO.GetMapper().MapTo(entity.Boss),

                Company = CompanyDTO.GetMapper().MapTo(entity.Company),

            };

            DoMapping(dto, entity);

            return dto;

        }

        public EntitiesToDTO.Entities.Employee MapTo(EmployeeDTO dto)

        {

            var entity = new EntitiesToDTO.Entities.Employee

            {  

                Name = dto.Name,

                Boss = EmployeeDTO.GetMapper().MapTo(dto.Boss),

                Company = CompanyDTO.GetMapper().MapTo(dto.Company),

            };

            DoMapping(entity, dto);

            return entity;

        }

        partial void DoMapping(EntitiesToDTO.Entities.Employee fromEntity, EmployeeDTO toDto);

        partial void DoMapping(EmployeeDTO fromDto, EntitiesToDTO.Entities.Employee toEntity);

    }

The mapper (EmployeeMapper) is an additional class that handles the mappings between entities and DTOs; one mapper class is generated per DTO. As you can see in the code below, either the EmployeeDTO or the EmployeeMapper can be extended with a partial class to perform additional mappings. For example:

public partial class EmployeeDTO
{
    [DataMember(Name = "fullName")]
    public string FullName { get; set; }

    public partial class EmployeeMapper
    {
        partial void DoMapping(Employee fromEntity, EmployeeDTO toDto)
        {
            toDto.FullName = fromEntity.Name + " Foo";
        }
    }
}

The sample is available to download from here.

Posted by cibrax | 6 comment(s)

WCFMock, a mocking framework for WCF services. Not a very original name, but it was the first one that came to my mind :). If you have been following my blog for a while, you might have noticed that I discussed different approaches to unit testing WCF services in the past, here and here. One of the major pains you will find today when unit testing WCF services is the static operation context (OperationContext and WebOperationContext). If your service implementation relies on that context for doing something, you will have a hard time trying to test that functionality.

For instance, it is very common in WCF REST services to use the context to set or get HTTP status codes. With the current WCF bits, how can you unit test those services? The answer is WCFMock, a set of useful classes that will help you remove all the explicit dependencies on the operation context and still provide a good way to mock it from unit tests.

Let's see how WCFMock works in practice with a very simple example.

1. You have a WCF REST service that returns an Atom feed with a catalog of products:

[ServiceContract]
public interface IProductCatalog
{
    [WebGet(UriTemplate = "?category={category}")]
    [OperationContract]
    Atom10FeedFormatter GetProducts(string category);
}

public Atom10FeedFormatter GetProducts(string category)
{
    var items = new List<SyndicationItem>();
    foreach(var product in repository.GetProducts(category))
    {
        items.Add(new SyndicationItem()
        {
            Id = String.Format(CultureInfo.InvariantCulture, "http://products/{0}", product.Id),
            Title = new TextSyndicationContent(product.Name),
            LastUpdatedTime = new DateTime(2008, 7, 1, 0, 0, 0, DateTimeKind.Utc),
            Authors =
            {
                new SyndicationPerson()
                {
                    Name = "cibrax"
                }
            },
            Content = new TextSyndicationContent(string.Format("Category Id {0} - Price {1}",
                product.Category, product.UnitPrice))
        });
    }

    var feed = new SyndicationFeed()
    {
        Id = "http://Products",
        Title = new TextSyndicationContent("Product catalog"),
        Items = items
    };

    WebOperationContext.Current.OutgoingResponse.ContentType = "application/atom+xml";
    return feed.GetAtom10Formatter();
}

This service implementation relies on the WebOperationContext only to set the response content type, which is done in the following line:

WebOperationContext.Current.OutgoingResponse.ContentType = "application/atom+xml";

2. You now have to find a way to get rid of that dependency so you can unit test that method. Here is where WCFMock comes to the rescue. The first thing you have to do is define a new alias at the top of your class file:

using WebOperationContext = System.ServiceModel.Web.MockedWebOperationContext;

Optionally, you can wrap that statement in a conditional compilation directive:

#if DEBUG
using WebOperationContext = System.ServiceModel.Web.MockedWebOperationContext;
#endif

This is useful if, for instance, you want to use the mocked version in development and always use the WCF version in production. That's all; you do not need to touch your existing service implementation. Once you have defined that alias, the service is ready to be tested.

3. For testing the service, I will use Moq, a pretty good mocking framework created by my friend Cazzu.

[TestClass]
public class UnitTests
{
    [TestMethod]
    public void ShouldGetProductsFeed()
    {
        ProductCatalog catalog = new ProductCatalog(
            new InMemoryProductRepository(
                new List<Product>{
                    new Product { Id = "1", Category = "foo", Name = "Foo1", UnitPrice = 1 },
                    new Product { Id = "2", Category = "bar", Name = "bar2", UnitPrice = 2 }
                }));

        Mock<IWebOperationContext> mockContext = new Mock<IWebOperationContext> { DefaultValue = DefaultValue.Mock };
        IEnumerable<SyndicationItem> items;
        using (new MockedWebOperationContext(mockContext.Object))
        {
            var formatter = catalog.GetProducts("foo");
            items = formatter.Feed.Items;
        }

        mockContext.VerifySet(c => c.OutgoingResponse.ContentType, "application/atom+xml");
        Assert.AreEqual(1, items.Count());
        Assert.IsTrue(items.Any(i => i.Id == "http://products/1" && i.Title.Text == "Foo1"));
    }
}

Two pieces of code deserve special attention: the code for creating the mocked WebOperationContext and the code required for verifying the expectations.

using (new MockedWebOperationContext(mockContext.Object))
{
    var formatter = catalog.GetProducts("foo");
    items = formatter.Feed.Items;
}

That inserts the mockContext object into thread-local storage so it can be accessed later by the service implementation.

The test also verifies that the ContentType header was set with the correct value in the operation:

mockContext.VerifySet(c => c.OutgoingResponse.ContentType, "application/atom+xml");

As you can see, all the magic is done by the MockedWebOperationContext (there is also a MockedOperationContext to replace the OperationContext). The implementation of this class is quite simple:

public class MockedWebOperationContext : IDisposable
{
    [ThreadStatic]
    private static IWebOperationContext currentContext;

    public MockedWebOperationContext(IWebOperationContext context)
    {
        currentContext = context;
    }

    public static IWebOperationContext Current
    {
        get
        {
            if (currentContext == null)
            {
                return new WebOperationContextWrapper(WebOperationContext.Current);
            }
            return currentContext;
        }
    }

    public void Dispose()
    {
        currentContext = null;
    }
}

The Current property of this class first tries to get an IWebOperationContext from thread-local storage (which could have been set by the unit tests), and if none is available, it returns a wrapper around the original WCF context. As you can see, the overhead introduced by the class is minimal; it is just a quick lookup in the TLS.
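
The post does not show IWebOperationContext or WebOperationContextWrapper; a minimal sketch of what they might look like, reduced to the single member the test exercises (every name other than those two is my own guess), would be:

public interface IOutgoingWebResponseContext
{
    string ContentType { get; set; }
    // ... other members of OutgoingWebResponseContext as needed ...
}

public interface IWebOperationContext
{
    IOutgoingWebResponseContext OutgoingResponse { get; }
}

public class WebOperationContextWrapper : IWebOperationContext
{
    private readonly WebOperationContext context;

    public WebOperationContextWrapper(WebOperationContext context)
    {
        this.context = context;
    }

    public IOutgoingWebResponseContext OutgoingResponse
    {
        // Wrap the real (non-mockable) response context from System.ServiceModel.Web
        get { return new OutgoingWebResponseContextWrapper(context.OutgoingResponse); }
    }
}

public class OutgoingWebResponseContextWrapper : IOutgoingWebResponseContext
{
    private readonly OutgoingWebResponseContext response;

    public OutgoingWebResponseContextWrapper(OutgoingWebResponseContext response)
    {
        this.response = response;
    }

    public string ContentType
    {
        get { return response.ContentType; }
        set { response.ContentType = value; }
    }
}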

The project is now available to download from Codeplex at this location, http://wcfmock.codeplex.com/

Enjoy.

Posted by cibrax | 8 comment(s)

I have been thinking for a while about what could be a good way to support brokered authentication for active REST clients. Something I did not want to do was force the use of the WS-Trust active profile, which is in essence SOAP based.

Some of the quality attributes that are easy to achieve with REST services, such as simplicity, interoperability, and scalability, can definitely be affected by the introduction of an additional SOAP stack for negotiating an identity token. The WS-Trust passive requestor profile, on the other hand, was designed for dumb clients like web browsers, clients that do not have the capability to handle cryptographic material or the SOAP stack itself. This profile basically hides most of the WS-Trust details from client applications through a sequence of HTTP redirections, which could be helpful in this scenario for negotiating a token while still keeping the REST clients simple. However, as some user interaction is required, this profile is not suitable for consuming REST services from desktop applications or other active client applications.

If we take a deep look at the functionality provided by a Security Token Service (STS), it is no more than a service that handles the lifecycle of an identity token: it knows how to issue a token, renew it, and finally cancel it when it is no longer needed. If we look at these scenarios from a REST point of view, an identity token is just a resource, something that can be created, updated, or even deleted. Of course, there is no spec available yet for this scenario; all I will show here is just a possible implementation of a RESTful STS.

The mapping of supported WS-Trust actions to HTTP verbs for my RESTful STS is defined below:

  • Issue = POST, creates or issues a new token resource (a SAML token)
  • Renew = PUT, renews an existing token
  • Cancel = DELETE, cancels an existing token
  • GET, gets an existing token (there is no such thing in WS-Trust)

I left the "Validate" action out of this implementation.

What I created for this example is a REST facade layered on top of an STS implemented with the Geneva Framework. The service contract for a RESTful STS supporting that mapping looks like this:

[ServiceContract]
public interface IRestSts
{
    [OperationContract]
    [WebInvoke(UriTemplate="Tokens", Method="POST", RequestFormat=WebMessageFormat.Xml, ResponseFormat=WebMessageFormat.Xml)]
    RequestSecurityTokenResponse IssueToken(RequestSecurityToken request);

    [OperationContract]
    [WebInvoke(Method = "PUT", UriTemplate = "Tokens/{tokenId}", RequestFormat = WebMessageFormat.Xml, ResponseFormat = WebMessageFormat.Xml)]
    RequestSecurityTokenResponse RenewToken(string tokenId);

    [OperationContract]
    [WebInvoke(Method = "DELETE", UriTemplate = "Tokens/{tokenId}", RequestFormat = WebMessageFormat.Xml, ResponseFormat = WebMessageFormat.Xml)]
    void CancelToken(string tokenId);

    [OperationContract]
    [WebGet(UriTemplate = "Tokens/{tokenId}", RequestFormat = WebMessageFormat.Xml, ResponseFormat = WebMessageFormat.Xml)]
    RequestSecurityTokenResponse GetToken(string tokenId);
}

As I mentioned before, the client first has to acquire a token from the STS, which can be done with a regular HTTP POST containing a RequestSecurityToken message.

[Figure: Issue_REST]

The message embedded in the request body sent to the STS looks like this:

<RequestSecurityToken xmlns="http://schemas.xmlsoap.org/ws/2005/02/trust">
    <AppliesTo>https://localhost/MyService</AppliesTo>
    <TokenType>http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV1.1</TokenType>
</RequestSecurityToken>

And the corresponding response looks like this:

<RequestSecurityTokenResponse xmlns="http://schemas.xmlsoap.org/ws/2005/02/trust" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
    <Links>
        <Link>
            <href>http://localhost:7362/STSWindows/Service.svc/_8a6fc87b-7e6a-45c9-a479-20ea42113e40</href>
            <rel>self</rel>
            <type>application/xml</type>
        </Link>
    </Links>
    <RequestedSecurityToken>....</RequestedSecurityToken>
    <TokenType>http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV1.1</TokenType>
</RequestSecurityTokenResponse>

Both calls, the first one to get the token from the STS and the second one to invoke the service in the relying party, should be protected with transport security to avoid man-in-the-middle attacks.

In this sample, the STS uses basic authentication to authenticate the user trying to get access to the token. If the authentication succeeds, the STS implemented with Geneva will provide the necessary claims associated with that user.

The code on the client side to ask for a new token is quite simple:

static string GetToken(string address, string appliesTo, string username, string password)
{
    RequestSecurityToken request = new RequestSecurityToken
    {
        TokenType = "http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV1.1",
        AppliesTo = appliesTo
    };

    DataContractSerializer requestSerializer = new DataContractSerializer(typeof(RequestSecurityToken));
    WebRequest webRequest = HttpWebRequest.Create(address);
    webRequest.Method = "POST";
    webRequest.ContentType = "application/xml";
    webRequest.Credentials = new NetworkCredential(username, password);

    using (var st = webRequest.GetRequestStream())
    {
        requestSerializer.WriteObject(st, request);
        st.Flush();
    }

    WebResponse webResponse = webRequest.GetResponse();
    DataContractSerializer responseSerializer = new DataContractSerializer(typeof(RequestSecurityTokenResponse));
    using (var st = webResponse.GetResponseStream())
    {
        var response = (RequestSecurityTokenResponse)responseSerializer.ReadObject(st);
        return response.RequestedSecurityToken;
    }
}

It creates a new RequestSecurityToken message, provides the user credentials, and posts that information to the STS. The response from the STS is a RequestSecurityTokenResponse containing the issued token; that is what the method returns through response.RequestedSecurityToken.

Once the client gets the issued token from the response, it can include it as part of the request message to the relying party's service. For this sample, I decided to include the token in the "Authorization" header, which is a common mechanism for attaching authentication credentials to a request message for a REST service (basic authentication and other authentication mechanisms use the same approach):

WebRequest webRequest = HttpWebRequest.Create(address);
webRequest.Method = "GET";
webRequest.Headers["Authorization"] = token;

Now, the hard part: the relying party needs a way to parse the token and authenticate the user before calling the service implementation. Fortunately, the guys from the WCF REST Starter Kit have provided an excellent solution for this kind of scenario: message interceptors. What I did here was implement a message interceptor for SAML tokens, which internally uses the Geneva Framework to perform all the validations and parse the token. An easy way to inject message interceptors into a service implementation is through a custom service factory (zero-config deployment):

class AppServiceHostFactory : ServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        WebServiceHost2 result = new WebServiceHost2(serviceType, true, baseAddresses);
        result.Interceptors.Add(new MessageInterceptors.SamlAuthenticationInterceptor(new TrustedIssuerNameRegistry()));
        return result;
    }
}

The "TrustedIssuerNameRegistry" is a just a simple implementation of a Geneva "IssuerNameRegistry" provider that validates the issuer of the SAML token.

All this is, of course, transparent to the service implementation; it only receives a bunch of claims representing the user identity. Those claims can be accessed through the current user principal. In the code below, the service generates a feed with all the received claims.

IClaimsIdentity identity = (IClaimsIdentity)Thread.CurrentPrincipal.Identity;

var feed = new SyndicationFeed()
{
    Id = "http://Claims",
    Title = new TextSyndicationContent("My claims"),
};

feed.Items = identity.Claims.Select(c =>
    new SyndicationItem()
    {
        Id = Guid.NewGuid().ToString(),
        Title = new TextSyndicationContent(c.ClaimType),
        LastUpdatedTime = DateTime.UtcNow,
        Authors =
        {
            new SyndicationPerson()
            {
                Name = c.Issuer
            }
        },
        Content = new TextSyndicationContent(c.Value)
    }
);

The complete sample is available to download from here. Note: it uses the latest Geneva Framework bits (and also the X509 certificates included with the samples; just run the certificate setup file included with the framework).

Posted by cibrax | 8 comment(s)