Contents tagged with REST

  • Hosting your own Pub/Sub in the cloud with AppHarbor and Hermes

    As you might have read in my latest post, Hermes is one of our new pet projects at Tellago for doing Pub/Sub over http. The idea is simple, but still very useful for integration scenarios in the enterprise. The fact that Hermes is based entirely on Http and uses MongoDB, one of the best-known open source NoSQL databases, makes this project very appealing for the cloud as well. Many of the cloud platforms already provide MongoDB as a service that you can use in your applications hosted in the cloud.

    Read more...

  • Second round of Web Http Caching

    As I discussed in my previous post, web caching relies on specific headers that you need to use correctly on your services. That’s an http application protocol thing, and something that you can easily use in any application framework that treats Http as a first-class citizen. This means that you don’t need to implement anything fancy or rely exclusively on a specific caching technology or component for doing output caching (e.g. the ASP.NET Cache).
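    As a minimal sketch of the idea, using the HttpResponseMessage and CacheControlHeaderValue classes from the new Http object model (the 60-second lifetime is made up for the example), marking a response as cacheable is just a matter of setting the right header:

    ```csharp
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;

    class OutputCachingSketch
    {
        static void Main()
        {
            // Declare that any cache may store this response for 60 seconds.
            // No server-side caching component is involved; user agents and
            // intermediaries honor the header on their own.
            var response = new HttpResponseMessage();
            response.Headers.CacheControl = new CacheControlHeaderValue
            {
                Public = true,
                MaxAge = TimeSpan.FromSeconds(60)
            };

            Console.WriteLine(response.Headers.CacheControl);
        }
    }
    ```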

    Read more...

  • Http Message Channels in WCF Web Apis Preview 4

    The new WCF Web Apis Preview 4 released yesterday on wcf.codeplex.com introduced a new extensibility point for intercepting messages at the channel level. The name for this new feature is “Http Message Channels”, and the good thing is that you no longer need to rely on the REST Starter Kit request interceptors for doing the same thing. Actually, an Http Message Channel is more useful, as you can intercept either request or response messages, and you also get an Http message with all the context information you might need rather than a generic WCF message, which usually requires some additional processing.

    Read more...

  • Making your WCF Web Apis speak in multiple languages

    One of the key aspects of how the web works today is content negotiation. The idea of content negotiation is based on the fact that a single resource can have multiple representations, so user agents (or clients) and servers can work together to choose one of them.

    The http specification defines several “Accept” headers that a client can use to negotiate content with a server, and among them there is one for restricting the set of natural languages that are preferred as a response to a request, “Accept-Language”. For example, a client can set this header to “es” to specify that it prefers to receive the content in Spanish, or to “en” for English.
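    Accept-Language entries can also carry relative quality values, which a server can use to rank the client’s preferences. A small sketch of that ranking using the HttpRequestMessage header classes (the URL is hypothetical):

    ```csharp
    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Net.Http.Headers;

    class AcceptLanguageDemo
    {
        static void Main()
        {
            // A request preferring Spanish, falling back to English.
            var request = new HttpRequestMessage(HttpMethod.Get,
                "http://localhost/ProductCatalog/Products/1");
            request.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue("es", 1.0));
            request.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue("en", 0.8));

            // Pick the language with the highest quality value
            // (an absent q value counts as 1.0).
            var preferred = request.Headers.AcceptLanguage
                .OrderByDescending(l => l.Quality ?? 1.0)
                .First()
                .Value;

            Console.WriteLine(preferred); // es
        }
    }
    ```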

    However, there are certain scenarios where the “Accept-Language” header is just not enough, and you might want a way to pass the “accepted” language as part of the resource url as an extension. For example, “http://localhost/ProductCatalog/Products/1.es” returns all the descriptions for the product with id “1” in Spanish. This is useful for scenarios in which you want to embed the link somewhere, such as a document, an email or a page.

    Supporting both scenarios, the header and the url extension, is really simple in the new WCF programming model. You only need to provide a processor implementation for each of them.

    Let’s say I have a resource implementation as part of a product catalog I want to expose with the WCF web apis.

    [ServiceContract]
    [Export]
    public class ProductResource
    {
        IProductRepository repository;
     
        [ImportingConstructor]
        public ProductResource(IProductRepository repository)
        {
            this.repository = repository;
        }
     
        [WebGet(UriTemplate = "{id}")]
        public Product Get(string id, HttpResponseMessage response)
        {
            var product = repository.GetById(int.Parse(id));
            if (product == null)
            {
                response.StatusCode = HttpStatusCode.NotFound;
                response.Content = new StringContent(Messages.OrderNotFound);
            }
     
            return product;
        }
    }

    The Get method implementation in this resource assumes the desired culture will be attached to the current thread (Thread.CurrentThread.CurrentCulture). Another option is to pass the desired culture as an additional argument to the method, so my processor implementation will handle both options. This method also uses an auto-generated class for handling string resources, Messages, which is available in the different cultures that the service implementation supports. For example,

    Messages.resx contains “OrderNotFound”: “Order Not Found”

    Messages.es.resx contains “OrderNotFound”: “No se encontro orden”
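    The lookup relies on the standard culture fallback chain (for example, “es-AR” falls back to “es”, and then to the invariant culture). A rough sketch of that probing order, with a plain dictionary standing in for the compiled Messages resources (the Lookup helper is illustrative, not part of the framework):

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Globalization;

    class ResourceFallbackDemo
    {
        // Stand-in for the compiled resources; the empty culture name
        // represents the invariant Messages.resx.
        static readonly Dictionary<string, string> OrderNotFound = new Dictionary<string, string>
        {
            { "", "Order Not Found" },        // Messages.resx
            { "es", "No se encontro orden" }  // Messages.es.resx
        };

        // Walk the culture chain (e.g. "es-AR" -> "es" -> invariant),
        // the same probing order ResourceManager uses.
        public static string Lookup(CultureInfo culture)
        {
            for (var c = culture; ; c = c.Parent)
            {
                if (OrderNotFound.TryGetValue(c.Name, out var message))
                    return message;
                if (c.Equals(CultureInfo.InvariantCulture))
                    return null;
            }
        }

        static void Main()
        {
            Console.WriteLine(Lookup(new CultureInfo("es-AR"))); // No se encontro orden
            Console.WriteLine(Lookup(new CultureInfo("fr")));    // Order Not Found
        }
    }
    ```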

    The processor implementation below tackles the first scenario, in which the desired language is passed in the “Accept-Language” header.

    public class CultureProcessor : Processor<HttpRequestMessage, CultureInfo>
    {
        string defaultLanguage = null;
     
        public CultureProcessor(string defaultLanguage = "en")
        {
            this.defaultLanguage = defaultLanguage;
            
            this.InArguments[0].Name = HttpPipelineFormatter.ArgumentHttpRequestMessage;
            this.OutArguments[0].Name = "culture";
        }
     
        public override ProcessorResult<CultureInfo> OnExecute(HttpRequestMessage request)
        {
            CultureInfo culture = null;
                        
            if (request.Headers.AcceptLanguage.Count > 0)
            {
                var language = request.Headers.AcceptLanguage.First().Value;
                culture = new CultureInfo(language);
            }
            else
            {
                culture = new CultureInfo(defaultLanguage);
            }
     
            Thread.CurrentThread.CurrentCulture = culture;
            Messages.Culture = culture;
     
            return new ProcessorResult<CultureInfo>
            {
                Output = culture
            };
        }
    }
     
    As you can see, the processor initializes a new CultureInfo instance with the value provided in the “Accept-Language” header, and sets that instance on the current thread and on the auto-generated resource class with all the messages. In addition, the CultureInfo instance is returned as an output argument called “culture”, making it possible to receive that argument in any method implementation.
     
    The following code shows the implementation of the processor for handling languages as url extensions.
     
    public class CultureExtensionProcessor : Processor<HttpRequestMessage, Uri>
    {
        public CultureExtensionProcessor()
        {
            this.OutArguments[0].Name = HttpPipelineFormatter.ArgumentUri;
        }
     
        public override ProcessorResult<Uri> OnExecute(HttpRequestMessage httpRequestMessage)
        {
        var requestUri = httpRequestMessage.RequestUri.OriginalString;
 
        // Uri.Query already includes the leading "?"; strip it first so a
        // dot inside the query string is not mistaken for the extension.
        var query = httpRequestMessage.RequestUri.Query;
 
        if (query.Length > 0)
        {
            requestUri = requestUri.Substring(0, requestUri.Length - query.Length);
        }
 
        var extensionPosition = requestUri.LastIndexOf(".");
 
        if (extensionPosition > -1)
        {
            var extension = requestUri.Substring(extensionPosition + 1);
 
            requestUri = requestUri.Substring(0, extensionPosition) + query;
     
                var uri = new Uri(requestUri);
     
                httpRequestMessage.Headers.AcceptLanguage.Clear();
     
                httpRequestMessage.Headers.AcceptLanguage.Add(new StringWithQualityHeaderValue(extension));
     
                var result = new ProcessorResult<Uri>();
     
                result.Output = uri;
     
                return result;
            }
     
            return new ProcessorResult<Uri>();
        }
    }

    The last step is to inject both processors as part of the service configuration, as shown below,

    public void RegisterRequestProcessorsForOperation(HttpOperationDescription operation, IList<Processor> processors, MediaTypeProcessorMode mode)
    {
        processors.Insert(0, new CultureExtensionProcessor());
        processors.Add(new CultureProcessor());
    }

    Once you have configured the two processors in the pipeline, your service will start speaking different languages :).

    Note: Url extensions don’t seem to work in the current bits when they are used on a base address. As far as I could see, ASP.NET intercepts the request first and tries to route it to a registered ASP.NET Http Handler for that extension. For example, “http://localhost/ProductCatalog/products.es” does not work, but “http://localhost/ProductCatalog/products/1.es” does.

    Read more...

  • Authenticating clients in the new WCF Http stack

    About this time last year, I wrote a couple of posts about how to use the “Interceptors” from the REST Starter Kit for implementing several authentication mechanisms like “SAML”, “Basic Authentication” or “OAuth” in the WCF Web programming model. Things have changed a lot since then, and Glenn finally put in our hands a new version of the Web programming model that deserves some attention and that I believe will help us a lot to build more Http oriented services in the .NET stack. What you can get today from wcf.codeplex.com is a preview with some cool features like Http Processors (which I already discussed here), a new and improved version of the HttpClient library, dependency injection and better TDD support, among others.

    However, the framework still does not support a standard way of doing client authentication on the services (this is something I believe is planned for upcoming releases). For that reason, moving the existing authentication interceptors to this new programming model was one of the things I did in the last few days.

    In order to make authentication simple and easy to extend, I first came up with a model based on what I called “Authentication Interceptors”. An authentication interceptor maps to an existing Http authentication mechanism and implements the following interface,

    public interface IAuthenticationInterceptor
    {
        string Scheme { get; }
        bool DoAuthentication(HttpRequestMessage request, HttpResponseMessage response, out IPrincipal principal);
    }

    An authentication interceptor basically needs to return the http authentication scheme that it implements in the “Scheme” property, and implement the authentication mechanism in the “DoAuthentication” method. As you can see, this last method only relies on the HttpRequestMessage and HttpResponseMessage classes, making the testing of this interceptor very simple (there is no need for any black magic with the WCF context or messages).

    After this, I implemented a couple of interceptors for supporting basic authentication and brokered authentication with SAML (using WIF) in my services. The following code illustrates what the basic authentication interceptor looks like.

    public class BasicAuthenticationInterceptor : IAuthenticationInterceptor
    {
        Func<UsernameAndPassword, bool> userValidation;
        string realm;
     
        public BasicAuthenticationInterceptor(Func<UsernameAndPassword, bool> userValidation, string realm)
        {
            if (userValidation == null)
                throw new ArgumentNullException("userValidation");
     
            if (string.IsNullOrEmpty(realm))
                throw new ArgumentNullException("realm");
     
            this.userValidation = userValidation;
            this.realm = realm;
        }
     
        public string Scheme
        {
            get { return "Basic"; }
        }
     
        public bool DoAuthentication(HttpRequestMessage request, HttpResponseMessage response, out IPrincipal principal)
        {
            string[] credentials = ExtractCredentials(request);
            if (credentials.Length == 0 || !AuthenticateUser(credentials[0], credentials[1]))
            {
                response.StatusCode = HttpStatusCode.Unauthorized;
                response.Content = new StringContent("Access denied");
                response.Headers.WwwAuthenticate.Add(new AuthenticationHeaderValue("Basic", "realm=" + this.realm));
     
                principal = null;
     
                return false;
            }
            else
            {
                principal = new GenericPrincipal(new GenericIdentity(credentials[0]), new string[] {});
     
                return true;
            }
        }
     
        private string[] ExtractCredentials(HttpRequestMessage request)
        {
            if (request.Headers.Authorization != null && request.Headers.Authorization.Scheme.StartsWith("Basic"))
            {
                string encodedUserPass = request.Headers.Authorization.Parameter.Trim();
     
                Encoding encoding = Encoding.GetEncoding("iso-8859-1");
                string userPass = encoding.GetString(Convert.FromBase64String(encodedUserPass));
                int separator = userPass.IndexOf(':');
     
                string[] credentials = new string[2];
                credentials[0] = userPass.Substring(0, separator);
                credentials[1] = userPass.Substring(separator + 1);
     
                return credentials;
            }
     
            return new string[] { };
        }
     
        private bool AuthenticateUser(string username, string password)
        {
            var usernameAndPassword = new UsernameAndPassword
            {
                Username = username,
                Password = password
            };
     
            if (this.userValidation(usernameAndPassword))
            {
                return true;
            }
     
            return false;
        }
    }

    This interceptor receives in the constructor a callback, in the form of a Func delegate, for authenticating the user, and the “realm”, which is required as part of the basic authentication challenge. The rest is a general implementation of the basic authentication mechanism using standard http request and response messages.
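    For completeness, this is what travels in the Authorization header: the “Basic” scheme parameter is just “username:password” base64-encoded. The Encode/Decode helpers below are illustrative only; the iso-8859-1 encoding matches the one used by ExtractCredentials above.

    ```csharp
    using System;
    using System.Text;

    class BasicCredentialsDemo
    {
        // Build the parameter of an "Authorization: Basic ..." header.
        public static string Encode(string username, string password)
        {
            var bytes = Encoding.GetEncoding("iso-8859-1").GetBytes(username + ":" + password);
            return Convert.ToBase64String(bytes);
        }

        // The inverse operation, mirroring ExtractCredentials in the interceptor.
        public static string[] Decode(string parameter)
        {
            var userPass = Encoding.GetEncoding("iso-8859-1").GetString(Convert.FromBase64String(parameter));
            var separator = userPass.IndexOf(':');
            return new[] { userPass.Substring(0, separator), userPass.Substring(separator + 1) };
        }

        static void Main()
        {
            var parameter = Encode("alice", "secret");
            Console.WriteLine(parameter);      // YWxpY2U6c2VjcmV0
            var credentials = Decode(parameter);
            Console.WriteLine(credentials[0]); // alice
        }
    }
    ```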

    I also implemented another interceptor for authenticating a SAML token with WIF.

    public class SamlAuthenticationInterceptor : IAuthenticationInterceptor
    {
        SecurityTokenHandlerCollection handlers = null;
     
        public SamlAuthenticationInterceptor(SecurityTokenHandlerCollection handlers)
        {
            if (handlers == null)
                throw new ArgumentNullException("handlers");
     
            this.handlers = handlers;
        }
     
        public string Scheme
        {
            get { return "saml"; }
        }
     
        public bool DoAuthentication(HttpRequestMessage request, HttpResponseMessage response, out IPrincipal principal)
        {
            SecurityToken token = ExtractCredentials(request);
     
            if (token != null)
            {
                ClaimsIdentityCollection claims = handlers.ValidateToken(token);
     
                principal = new ClaimsPrincipal(claims);
     
                return true;
            }
            else
            {
                response.StatusCode = HttpStatusCode.Unauthorized;
                response.Content = new StringContent("Access denied");
     
                principal = null;
     
                return false;
            }
        }
     
        private SecurityToken ExtractCredentials(HttpRequestMessage request)
        {
            if (request.Headers.Authorization != null && request.Headers.Authorization.Scheme == "saml")
            {
                XmlTextReader xmlReader = new XmlTextReader(new StringReader(request.Headers.Authorization.Parameter));
     
                var col = SecurityTokenHandlerCollection.CreateDefaultSecurityTokenHandlerCollection();
                SecurityToken token = col.ReadToken(xmlReader);
     
                return token;
            }
     
            return null;
        }
    }

    This implementation receives a “SecurityTokenHandlerCollection” instance in the constructor. This class is part of WIF, and basically represents a collection of token handlers that know how to handle specific xml authentication tokens (SAML is one of them).

    I also created a set of extension methods for injecting these interceptors as part of a service route when the service is initialized.

    var basicAuthentication = new BasicAuthenticationInterceptor((u) => true, "ContactManager");
    var samlAuthentication = new SamlAuthenticationInterceptor(serviceConfiguration.SecurityTokenHandlers);
     
    // use MEF for providing instances
    var catalog = new AssemblyCatalog(typeof(Global).Assembly);
    var container = new CompositionContainer(catalog);
    var configuration = new ContactManagerConfiguration(container);
     
    RouteTable.Routes.AddServiceRoute<ContactResource>("contact", configuration, basicAuthentication, samlAuthentication);
    RouteTable.Routes.AddServiceRoute<ContactsResource>("contacts", configuration, basicAuthentication, samlAuthentication);

    In the code above, I am injecting the basic authentication and saml authentication interceptors in the “contact” and “contacts” resource implementations that come as samples in the code preview.

    I will use another post to discuss in more detail how brokered authentication with SAML works with these new WCF Http bits.

    The code is available to download in this location.

    Read more...

  • Http Processors in the WCF Web Programming Model

    The code drop recently released as part of wcf.codeplex.com introduced a new feature for injecting cross-cutting concerns through a pipeline into any existing service. The idea is simple: when you move to the http world, there are some aspects that you might want to move out of your operation implementation. A typical example is content negotiation, in which the http messages are serialized or deserialized into the specific entity or object graph expected by the service based on the “content-type” or “accept” http headers. You don’t want to have all that logic spread across all the service implementations, or to tie your operations to a specific format as happened with the previous model and the WebInvoke and WebGet attributes. For that kind of logic, it’s really useful to have the concern implemented in a specific class that you can test individually and inject into your service to support new message formats.
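    The content negotiation case can be sketched independently of the framework: given the client’s “accept” header and the formats the service supports, pick the supported media type with the highest quality value. The supported list and the Negotiate helper here are made up for the example:

    ```csharp
    using System;
    using System.Linq;
    using System.Net.Http;
    using System.Net.Http.Headers;

    class ContentNegotiationDemo
    {
        static readonly string[] Supported = { "application/xml", "application/json" };

        // Pick the supported media type the client prefers most
        // (an absent q value counts as 1.0).
        public static string Negotiate(HttpRequestMessage request)
        {
            var preferred = request.Headers.Accept
                .OrderByDescending(a => a.Quality ?? 1.0)
                .Select(a => a.MediaType)
                .FirstOrDefault(m => Supported.Contains(m));

            return preferred ?? Supported[0]; // fall back to a default format
        }

        static void Main()
        {
            var request = new HttpRequestMessage(HttpMethod.Get, "http://localhost/Products/1");
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/xml", 0.5));

            Console.WriteLine(Negotiate(request)); // application/json
        }
    }
    ```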

    Read more...

  • New IQueryable Support for Http Services in WCF

    One of the things that caught my attention when WCF Data Services was released a couple of years ago was the ability to use URI segments for passing queries directly to an underlying LINQ provider. This represented a very powerful feature for allowing clients to filter and manipulate a large data result set on the server side, with a simple API, without incurring any unnecessary performance penalty (the queries are performed and resolved in the data source itself most of the time) and without having to implement all that logic in the service itself.
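    Conceptually, query options in the URI compose operators over an IQueryable and let the provider resolve them at the data source. With the in-memory provider this boils down to something like the following (the data set is made up for the example):

    ```csharp
    using System;
    using System.Linq;

    class QueryableDemo
    {
        static void Main()
        {
            // What a query URI with filter/order/top options conceptually
            // maps to: operators composed against IQueryable and executed
            // by the underlying provider.
            IQueryable<int> products = Enumerable.Range(1, 100).AsQueryable();

            var page = products
                .Where(p => p > 10)   // $filter
                .OrderBy(p => p)      // $orderby
                .Take(3)              // $top
                .ToArray();

            Console.WriteLine(string.Join(",", page)); // 11,12,13
        }
    }
    ```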

    Read more...