I am writing an application that, among other cool things, lets users enter information about business contacts. My system keeps a history of contact names. In this posting I will show you how to use NHibernate mappings to keep these names in a logical order.
Here is a fragment of my class diagram. It is a working draft, so don’t take it as an example of a perfect solution. (To find out more about the class structure represented here, read my blog posting Modeling people and organizations: Class Party.)
Here is what I want to achieve. I took this screenshot from my working draft.
The problem here is the sort order of the valid-to dates. If I simply sort names by the valid-to date in descending order, then the current name (whose valid-to date is null, because we don’t know how long it will be valid) ends up last. I needed the order shown in the image above.
Here is how I achieved this order using an NHibernate mapping file and the SQL Server COALESCE function. The trick is simple: COALESCE replaces the null valid-to date with the current date, so the current name sorts to the top.
<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2">
  <class name="OM.Core.Contacts.Company, OM.Core" table="companies">
    <!-- id mapping and other details elided; Company and companies are illustrative -->
    <property name="RegistryCode" column="registry_code" />
    <!-- more properties here -->
    <bag name="Names" table="company_names" cascade="all"
         order-by="COALESCE(valid_to, GETDATE()) desc">
      <key column="party_id" />
      <one-to-many class="OM.Core.Contacts.CompanyName, OM.Core" />
    </bag>
  </class>
</hibernate-mapping>
I think this is a good solution for sorting, because the sorting is done in the database and we don’t have to write additional code in the DAL to get the collection sorted. Of course, it is not recommended to change (reassign, reorder) collections monitored by NHibernate. What I did here was simple: I just added the order-by attribute to the Names bag definition and let NHibernate do the rest.
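To make the effect visible in code, here is a minimal usage sketch. The Company class, the companyId variable and the open NHibernate ISession are assumptions made up for the example:

```csharp
// Assumes Company is mapped with the Names bag shown above
// and session is an open NHibernate ISession.
var company = session.Get<Company>(companyId);

// Names arrive from the database already ordered by
// COALESCE(valid_to, GETDATE()) DESC - current name first.
foreach (CompanyName name in company.Names)
{
    Console.WriteLine("{0} ({1} - {2})", name, name.ValidFrom, name.ValidTo);
}
```

No additional sorting code is needed on the client side; the order-by attribute is applied every time the collection is loaded.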
As a side product of some experiments, I wrote a simple LINQ query that matches the properties of two objects and returns those properties as a list. The code is pretty simple and I have packed it into a method for you.
C#
// Requires System.Collections.Generic, System.Linq and System.Reflection
public IList<PropertyInfo> GetMatchingProperties(object source, object target)
{
    if (source == null)
        throw new ArgumentNullException("source");
    if (target == null)
        throw new ArgumentNullException("target");

    var sourceType = source.GetType();
    var sourceProperties = sourceType.GetProperties();

    var targetType = target.GetType();
    var targetProperties = targetType.GetProperties();

    var properties = (from s in sourceProperties
                      from t in targetProperties
                      where s.Name == t.Name &&
                            s.PropertyType == t.PropertyType
                      select s).ToList();

    return properties;
}
VB.NET
Public Function GetMatchingProperties(ByVal source As Object, _
        ByVal target As Object) As IList(Of PropertyInfo)

    If source Is Nothing Then
        Throw New ArgumentNullException("source")
    End If
    If target Is Nothing Then
        Throw New ArgumentNullException("target")
    End If

    Dim sourceType = source.GetType()
    Dim sourceProperties = sourceType.GetProperties()

    Dim targetType = target.GetType()
    Dim targetProperties = targetType.GetProperties()

    Dim properties = (From s In sourceProperties _
                      From t In targetProperties _
                      Where s.Name = t.Name AndAlso _
                            s.PropertyType Is t.PropertyType _
                      Select s).ToList()

    Return properties
End Function
The method returns only those properties that match by both name and type. If both objects have a property called Sum, but with different types (let’s say float and decimal), then this property is not returned by the method. Of course, you can extend this method and make it much more clever if you like. The simple implementation given here has worked very well for me.
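As a quick usage sketch, here is how the returned properties can be used to copy values from one object to another. The Order and OrderDto classes are hypothetical, made up for this example:

```csharp
var order = new Order { Id = 1, Sum = 10.5m };
var dto = new OrderDto();

// Copy the values of all properties that match by name and type.
// property belongs to the source type, so we look up the
// corresponding property on the target type before setting it.
foreach (var property in GetMatchingProperties(order, dto))
{
    var value = property.GetValue(order, null);
    dto.GetType().GetProperty(property.Name).SetValue(dto, value, null);
}
```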
My last posting described how to read and write files located in Windows Azure cloud storage. In this posting I will show you how to do almost the same thing using PHP and the Windows Azure SDK for PHP. The purpose of this example is to show how simple it is to use Windows Azure storage services in your PHP applications.
Preparing for the example
I expect you already have everything needed for this example: a Windows Azure storage account and the Windows Azure SDK for PHP.
Before we start scripting let’s make sure that our environment is configured correctly:
- copy the folder named Microsoft to your script folder (or somewhere else where PHP’s file include functions can access it),
- let PHP display errors on the page if you prefer not to read log files,
- make sure you have the cURL extension enabled in php.ini.
Reading and writing cloud storage files
Our script does two simple operations: it reads one file from Windows Azure BLOB storage and writes another one there. Note that at the beginning of the script I turn on error reporting and set the content type to plain text. If there are any errors or warnings, this information is written out and formatted so it is easy to read.
<?php
// Display errors and warnings as easy-to-read plain text
error_reporting(E_ALL);
ini_set('display_errors', '1');
header('Content-Type: text/plain');

require_once 'Microsoft/WindowsAzure/Storage/Blob.php';

// Connect to Windows Azure cloud storage
// (replace <ACCOUNT> and <KEY> with your own values)
$client = new Microsoft_WindowsAzure_Storage_Blob(
    'blob.core.windows.net',
    '<ACCOUNT>',
    '<KEY>'
);

// Read file Data.xml from container called dataset
$localpath = getcwd() . '\Data.xml';
$client->getBlob('dataset', 'Data.xml', $localpath);

// Write file example.txt to container called dataset
$localpath = getcwd() . '\example.txt';
$result = $client->putBlob('dataset', 'example.txt', $localpath);
?>
A couple of notes. The Windows Azure SDK for PHP takes the cloud storage address as a raw host name. <ACCOUNT> – your cloud storage service name – is combined with the raw host name to build the correct cloud storage URL. This is handled by the library, so you don’t have to worry about it. <KEY> is the base64-encoded key that is generated when you create your BLOB storage service.
It is very easy to communicate with Windows Azure cloud storage services using PHP and the Windows Azure SDK for PHP. The code example above also shows one good thing: we wrote very basic, extremely simple code to get our work done – even complete beginners are able to use Windows Azure cloud storage in their scripts.
On the Windows Azure CTP some file system operations are not allowed: you cannot create or modify files located on the server’s hard disk. I have a small application that stores data in a DataSet, and I needed a place to hold this file. The logical choice is the cloud storage service, of course. In this posting I will show you how to read and write a DataSet to your cloud storage as an XML file.
Although my original code is more complex (there are other things you have to handle when using file-based storage in a multi-user environment), I give you here the main idea of how to write methods that handle cloud storage files. Feel free to modify my code as you wish.
Before we start
You need a Windows Azure storage account. If you don’t have one yet, go to this page and register. When your account is ready, you must create your cloud storage service in the Windows Azure portal. You also need Windows Azure Tools for Visual Studio. Now create a simple application and add a reference to StorageClient.dll (it ships with the Azure tools; you can search for it in the Program Files folder if you cannot find it quickly).
The structure of cloud storage is simple. At the root level you have your account. Under your account you have services: Table, Queue and Blob storage. We are using Blob storage in this example. Blob storage has containers at its root level; you can think of containers as independent root-level folders. Each container may contain one or more BLOBs – think of BLOBs as files. The screenshot here should illustrate it: _blob is the name of my cloud storage service (the name I use in my client software), dataset is the name of the container and Data.xml is my data file.
Connecting to cloud storage
As we are working with the same file in the same container, we need the container as a root-level object. We can easily write a method that connects to cloud storage and returns the container we need.
private BlobContainer GetContainer()
{
    // Replace <ACCOUNT> and <KEY> with your storage account name and key
    var account = new StorageAccountInfo(
        new Uri("http://blob.core.windows.net"),
        null,
        "<ACCOUNT>",
        "<KEY>"
    );

    var storage = BlobStorage.Create(account);
    var container = storage.GetBlobContainer("dataset");

    return container;
}
This method is private because I don’t want to expose cloud storage details to other classes.
Now let’s read our data file from cloud storage and load it into a DataSet. There is one little trick you should be aware of: take a look at the following method and notice that I have to handle the position of the stream pointer.
public void Load(DataSet data)
{
    var container = GetContainer();

    using (var mem = new MemoryStream())
    {
        var fileBlob = new BlobContents(mem);
        container.GetBlob("Data.xml", fileBlob, true);

        var stream = fileBlob.AsStream;

        // Required - stream pointer must be at position 0
        stream.Position = 0;

        data.ReadXml(stream);
    }
}
After changes, it is a good idea to write the data back to cloud storage at some point. Here is the method that saves a new file with fresh data to the cloud storage container.
public void Save(DataSet data)
{
    var container = GetContainer();

    var metadata = new NameValueCollection();
    metadata["FileName"] = "Data.xml";

    var properties = new BlobProperties("Data.xml")
    {
        Metadata = metadata,
        ContentType = "text/xml"
    };

    using (var mem = new MemoryStream())
    {
        // Serialize the DataSet to the memory stream as XML
        data.WriteXml(mem);

        var fileBlob = new BlobContents(mem.ToArray());
        container.CreateBlob(properties, fileBlob, true);
    }
}
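To put the pieces together, here is a hypothetical usage sketch. The DataSetStore class name wrapping the Load and Save methods is made up for this example:

```csharp
var store = new DataSetStore();
var data = new DataSet();

// Read Data.xml from the dataset container into the DataSet
store.Load(data);

// ... work with the data ...

// Write the modified DataSet back to cloud storage
store.Save(data);
```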
Using Windows Azure cloud storage and BLOBs is pretty simple if we use the right tools. StorageClient saved us some valuable time and we were able to write methods that are very easy for other programmers to use. You can easily change these methods so that account information is read from the application configuration and programmers can also specify the container and file they need from cloud storage.