September 2010 - Posts

Raffaele Rialdi, a security MVP from Italy, has just released a very cool tool for managing X509 certificates in Windows. X509 certificates have always represented a pain for most developers, as they are hard to deploy or configure correctly with the right permissions. A tool like this is absolutely needed when working with frameworks like WCF or WIF, which make extensive use of certificates.

These are some of the features that you can find in this initial version,

  1. Ability to create self-signed certificates from the UI by specifying all the different settings that you need (certificate name, expiration dates, password, certificate store).
  2. Ability to browse the different certificate stores and do things like checking the certificate details, inspecting the trust chain for a specific certificate, or changing its ACL permissions.

You will be able to find more details about the tool here.

Posted by cibrax

WS-Discovery is not only a mechanism for discovering service endpoint addresses at runtime, but also a way to query for specific service information and metadata. If you look at it from another standpoint, WS-Discovery provides access to a decentralized, short-lived service catalog that is available as long as the services are running. It is decentralized because every service exposes its own metadata, unless you use a WS-Discovery managed proxy, which acts as an intermediary and central location for service discovery. It is short-lived because it is only available while the service is running, and it is not something that clients can use at any time.

With all this, I am not saying that WS-Discovery is a good replacement for a service repository, which actually provides the opposite capabilities: centralized storage for service metadata at design time with support for rich querying.

However, they complement each other very well: the service can configure itself from the service repository, and expose metadata and runtime information to clients through WS-Discovery.

There are two types of metadata that you can configure in the “endpointDiscovery” behavior, scopes and extensions,

<endpointDiscovery enabled="true">
    <scopes>
      <add scope="urn:CRM"/>
    </scopes>
    <extensions>
      <Owner>Pablo Cibraro</Owner>
      <Metadata>http://localhost:8080/?wsdl</Metadata>
    </extensions>
</endpointDiscovery>

Scopes, which represent URIs, can be used by clients to filter the discovery results when sending the discovery probes. For example, given the configuration above, a client might only be interested in services that provide the “urn:CRM” scope, so this service will match that probe.

On the client side, you can specify the “scopes” as part of the “FindCriteria” argument passed to the DiscoveryClient,

var discovery = new DiscoveryClient(new UdpDiscoveryEndpoint());
 
var criteria = new FindCriteria(typeof(IHelloWorld));
criteria.Scopes.Add(new Uri("urn:CRM"));
 
var discoveryResponse = discovery.Find(criteria);

Then you have “extensions”, which are XML elements that provide additional information about the service. In the example above, the extensions are used to provide information about the service owner (the developer) and the service metadata endpoint, but you could extend this to any other service metadata or documentation.
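If you prefer to set this up in code instead of configuration, a rough equivalent could look like the following sketch. It uses the standard WCF 4.0 discovery types (EndpointDiscoveryBehavior, ServiceDiscoveryBehavior and UdpDiscoveryEndpoint from System.ServiceModel.Discovery), but the service type, binding and addresses are just assumptions for illustration,

// Sketch: hosting a service with the same scopes and extensions that the
// endpointDiscovery configuration above declares. HelloWorldService and the
// addresses are hypothetical; requires System.ServiceModel.Discovery and System.Xml.Linq.
var host = new ServiceHost(typeof(HelloWorldService), new Uri("http://localhost:8080/"));
var endpoint = host.AddServiceEndpoint(typeof(IHelloWorld), new BasicHttpBinding(), string.Empty);
 
var discoveryBehavior = new EndpointDiscoveryBehavior();
discoveryBehavior.Scopes.Add(new Uri("urn:CRM"));
discoveryBehavior.Extensions.Add(new XElement("Owner", "Pablo Cibraro"));
discoveryBehavior.Extensions.Add(new XElement("Metadata", "http://localhost:8080/?wsdl"));
endpoint.Behaviors.Add(discoveryBehavior);
 
// Make the endpoints discoverable over UDP multicast.
host.Description.Behaviors.Add(new ServiceDiscoveryBehavior());
host.AddServiceEndpoint(new UdpDiscoveryEndpoint());
 
host.Open();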

The service metadata endpoint (or MEX endpoint in WCF) is useful here as part of the extensions because WS-Discovery only returns the “service” endpoint address as part of the response to the probe messages.

The extensions are available on the client side as part of the discovery response message in the found endpoints.

var discovery = new DiscoveryClient(new UdpDiscoveryEndpoint());
 
var criteria = new FindCriteria(typeof(IHelloWorld));
 
var discoveryResponse = discovery.Find(criteria);
 
var endpoint = discoveryResponse.Endpoints.First();
var address = endpoint.Address;
 
foreach (var extension in endpoint.Extensions)
{
    Console.WriteLine(extension);
}
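Since the metadata address travels inside those extensions, one possible way to pull it out on the client is a simple LINQ to XML query. This is just a sketch that assumes the “Metadata” element used in the configuration above,

// Assumes the <Metadata> extension shown in the endpointDiscovery configuration
// above (requires System.Linq and System.Xml.Linq).
var metadata = discoveryResponse.Endpoints.First().Extensions
    .FirstOrDefault(e => e.Name.LocalName == "Metadata");
 
if (metadata != null)
{
    Console.WriteLine("Metadata endpoint: {0}", metadata.Value);
}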
Posted by cibrax | 11 comment(s)

As we announced last week, we are shipping a new Visual Studio plugin for generating service proxies as part of the SO-Aware SDK. The functionality is equivalent to what you find today in the “Add Service Reference” command, but the results are much better as you get a proxy that does not require any WCF configuration, and also knows how to resolve bindings and behaviors from the repository.

However, that plugin is only available for Visual Studio 2010, which means you need an alternative way to generate the same proxy if you are using an older version of Visual Studio or not using that development environment at all. This is where “swutil.exe” comes in to fill that gap.

This tool is equivalent to “svcutil.exe”, the one that comes with the .NET framework for generating WCF service proxies, but the result is a much more intelligent proxy that does not require any previous knowledge of WCF configuration.

The following arguments are supported by this new tool,

swutil.exe -help

Parameters:

-help          Prints the help screen.
-uri           Service Repository Uri
-version       Service Version Name
-category      Configuration Category
-out           Output file
-language      Language: cs or vb
-serializer    xml or datacontract
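For example, a typical invocation could look like the line below. The repository address, version name and output file are just illustrative values borrowed from the samples in this post, so adjust them to your own repository,

swutil.exe -uri http://localhost/SOAware/ServiceRepository.svc -version "EchoClaims(1.0)" -out EchoClaimsClient.cs -language cs -serializer datacontract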

The generated proxy derives from a specific SO-Aware base class, “ConfigurableClientBase<T>”, rather than the traditional “ClientBase<T>” that you find in WCF.

This base class gives you access to methods like “SetClientBehavior” or “SetDefaultBinding” that come in handy for automatically resolving binding and behavior configuration from the repository.

Those methods receive a string with the “binding” or “behavior” name, and automatically inject the equivalent WCF object into the WCF channel after resolving that name in the repository.

The following lines illustrate how you would use this new proxy to consume a service.

EchoClaimsClient proxy = new EchoClaimsClient(ServiceUri, "EchoClaims(1.0)", null, "CustomBinding_IEchoClaims");
proxy.SetClientBehavior("echoClaimsEndpointBehavior");
proxy.SetDefaultBinding("echoClaims_MutualCertificate");
 
var response = proxy.Echo();

That’s all, no configuration is required on the client side for consuming the service. All the magic happens in that proxy class :)

Posted by cibrax | 3 comment(s)

ASMX web services have been the favorite choice of many developers for building SOAP web services in .NET for a long time because of their simplicity. With ASMX web services, you get a web service up and running in a matter of seconds, as it does not require any configuration. The only thing you need to do is build the service implementation and the message contracts (XML serialization classes), and that's all. However, when you build a system as a black box with most of the configuration hardcoded and only a few extensibility points in mind, you will probably end up with something that is very easy to deploy and get running, but that can hardly be customized. That's what an ASMX web service is after all: you don't have an easy way to change the protocol versions, encoders or security, or to extend it with custom functionality (SOAP extensions are the only entry point for extensibility, and they work like message inspectors in WCF).

On the other hand, you have WCF, which is an extensible beast for building services, among other things. The number of extensibility points that you will find in WCF is extremely high, but the downside is that configuration also becomes extremely complex and a nightmare for most developers who only want to get their services up and running.

Fortunately, the WCF team has considerably improved the configuration experience in WCF 4.0, making it possible to run a service with almost no configuration. The approach they have taken for this version is to make everything work with no configuration, and give you the chance to override only what you actually need for a given scenario.

For instance, a WCF service that uses HTTP as transport behaves like an ASMX web service by default (it uses the basicHttpBinding with SOAP 1.1, text encoding, no security and WS-I Basic Profile 1.1 conformance) unless you change that. So, how can you create a new WCF service as you did before with ASMX? That's simple; you just need to follow these steps,

1. Create a new WCF service in Visual Studio

[Screenshot: adding a new WCF service project in Visual Studio]

2. Modify the service and data contract to expose the operations you actually need in the service.

 

// NOTE: You can use the "Rename" command on the "Refactor" menu to change the interface name "IService1" in both code and config file together.
[ServiceContract]
public interface IService1
{
    [OperationContract]
    string GetData(int value);

    [OperationContract]
    CompositeType GetDataUsingDataContract(CompositeType composite);

    // TODO: Add your service operations here
}

// Use a data contract as illustrated in the sample below to add composite types to service operations.
[DataContract]
public class CompositeType
{
    bool boolValue = true;
    string stringValue = "Hello ";

    [DataMember]
    public bool BoolValue
    {
        get { return boolValue; }
        set { boolValue = value; }
    }

    [DataMember]
    public string StringValue
    {
        get { return stringValue; }
        set { stringValue = value; }
    }
}

3. Optionally, enable the service metadata page for the service, so any client application can use this to generate the proxies.

<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior>
        <serviceMetadata httpGetEnabled="true"/>
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>

 

4. Optionally, enable the ASP.NET compatibility mode to use the ASP.NET security context (otherwise, the service will use the default security settings for the basicHttpBinding). That requires two additional steps: adding the “serviceHostingEnvironment” element to the existing serviceModel configuration,

<system.serviceModel>
  <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
  ...
</system.serviceModel>

 

And adding the following attribute to the service implementation,

[AspNetCompatibilityRequirements(RequirementsMode=AspNetCompatibilityRequirementsMode.Allowed)]
public class Service1 : IService1

That’s all you need to implement a new WCF service that behaves like a traditional ASMX web service. As you can see, no service or binding configuration was required. In addition, the behavior element does not have a name, so it applies to all the services running in the same host.
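To double check that nothing else is needed, a client can consume this service with just the default basicHttpBinding and a ChannelFactory. This is a minimal sketch, where the service address is hypothetical and IService1 is the contract defined in step 2,

// Minimal client sketch (requires a reference to System.ServiceModel).
// The address is hypothetical; IService1 is the contract from step 2.
var factory = new ChannelFactory<IService1>(
    new BasicHttpBinding(),
    new EndpointAddress("http://localhost/Service1.svc"));
 
IService1 client = factory.CreateChannel();
Console.WriteLine(client.GetData(42));
 
((IClientChannel)client).Close();
factory.Close();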

Posted by cibrax | 17 comment(s)

As Jesus mentioned in this post, SO-Aware provides three interfaces for managing the service repository. An OData API, in case you want to integrate third-party applications with the repository; OData is a pure HTTP API that can be easily consumed on any platform using a simple HTTP client library. The management portal, an ASP.NET MVC user interface layered on top of the OData API, and probably the one most people will use. And finally, a PowerShell provider that also mounts on top of the OData API to allow administrators to automate management tasks over the repository with scripting.

The SO-Aware PowerShell provider, in that sense, offers around 40 commands that enable simple management scenarios, like registering bindings or services, as well as more complex scenarios that involve testing services or sending alerts when a service is not working properly.

This provider can be registered as a snap-in in an existing script using the following command,

$snapin = get-pssnapin | select-string "SOAwareSnapIn"
if ($snapin -eq $null)
{
    Add-PSSnapin "SOAwareSnapIn"
}

Once you have registered the snapin, you can start using most of the commands for managing the repository.

The first and most important command is “Set-SWEndpoint”, which allows you to connect to an existing SO-Aware instance. This command receives the OData service location as its first argument, and it looks as follows,

Set-SWEndpoint -uri http://localhost/SOAware/ServiceRepository.svc

 

As a next step, you can start managing or querying data from the repository using the rest of the commands. For instance, the following example registers a new binding in the repository only if it has not been created already,

function RegisterBinding([string]$name, [string]$type, [string]$xml)
{
    $binding = GetBinding($name);
    if(!$binding)
    {
        Add-SWBinding -Name $name -BindingType $type -Configuration $xml
    }
}

function GetBinding([string]$name)
{
    $bindings = Get-SWBindings
    foreach($binding in $bindings)
    {
        if($binding.Name -eq $name)
        {
            return $binding
        }
    }
}

RegisterBinding "stsBinding" "ws2007HttpBinding" "<binding>
    <security mode='Message'>
        <message clientCredentialType='UserName' establishSecurityContext='false' negotiateServiceCredential='false'/>
    </security>
</binding>"
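Querying works in the same way. For instance, reusing the Get-SWBindings command from the helper function above, the following line lists the names of the bindings that are already registered (assuming Set-SWEndpoint has already been called),

# Lists the names of the bindings currently registered in the repository
Get-SWBindings | ForEach-Object { $_.Name }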

As you can see, this provider gives administrators in any organization a powerful toy they can use to manage services or governance aspects by leveraging their existing scripting knowledge.

Posted by cibrax | 2 comment(s)