May 2005 - Posts
Databases as Services typically are well encapsulated and contain business rules
Without discussing whether the kind of database design Barry is suggesting is "Service Oriented" or not, I think the idea of a well-encapsulated database with business logic is interesting. It follows Jim Gray's "put the computation near the data"
principle. It goes against the idea of the 'middle tier', where the business logic is performed to avoid loading the database server.
However, most 'modern' features of relational databases (referential integrity, joins) are performance-intensive. People who need to squeeze the most out of their databases avoid those features, but most of us use them because the performance impact is not that bad in our scenarios, even if it increases the load on the database server. It seems, then, that moving the logic to the database could be acceptable in some cases.
Going back to the databases-as-services matter, the main difficulty we have making this work with SQL Server 2000 is that services need to deal with hierarchical data, and T-SQL stored procedures suck at it. I cannot have a stored procedure that receives an Order. I need one for the Order header and another for the Order lines (I know, I could send the header and lines in a string and parse them, but that's what 'sucks' means). That is not encapsulation.
In SQL Server 2005 we could handle this in two ways. One would be to use UDTs, but I'm not sure I can have an Order UDT. The other option is to send the order as XML.
When you want to update an Order and you are sending XML, you need a way to tell the stored procedure which changes you made to the order (for example, that you added one line and deleted another), so sending a simple XML document with the order information won't be enough. You need to send out-of-band information that describes the operations done on the Order, and probably the previous values for the modified rows. This starts to look like a diffgram ;).
Having a stored procedure that receives a diffgram with the Order, applies business logic and updates the database looks like a good solution, and it seems it could fit Barry's requirements. The only issue is that you need a diffgram-friendly platform (or another XML standard, supported on multiple platforms, that can be used to exchange diffgrams) if you want an easy way to work with those stored procedures from a different platform.
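To make the idea concrete, here is a rough sketch of what such a diffgram could look like for an order where one line was modified (the OrderDataSet/OrderLine element names are made up for illustration; only the diffgr:/msdata: machinery is the standard DataSet diffgram format):

```xml
<diffgr:diffgram xmlns:msdata="urn:schemas-microsoft-com:xml-msdata"
                 xmlns:diffgr="urn:schemas-microsoft-com:xml-diffgram-v1">
  <OrderDataSet>
    <!-- current state of the changed row -->
    <OrderLine diffgr:id="OrderLine1" msdata:rowOrder="0"
               diffgr:hasChanges="modified">
      <Product>Widget</Product>
      <Quantity>3</Quantity>
    </OrderLine>
  </OrderDataSet>
  <diffgr:before>
    <!-- previous values for the modified row -->
    <OrderLine diffgr:id="OrderLine1" msdata:rowOrder="0">
      <Product>Widget</Product>
      <Quantity>1</Quantity>
    </OrderLine>
  </diffgr:before>
</diffgr:diffgram>
```

The diffgr:hasChanges attribute carries the operation (inserted/modified) and the diffgr:before section carries the previous values, which is exactly the out-of-band information the stored procedure needs.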
Last week we released a new DeKlarit version. We put a lot of effort in this one, and I’m quite proud of it.
The main new feature is support for Microsoft Enterprise Library. The rationale for doing it is outlined here (PDF).
One of DeKlarit's core features is the generation of the business logic layer. Authorization checks must be performed in that layer, so we are using the Security Application Block to accomplish it. The good thing about using Enterprise Library is that you can plug in any authentication/authorization provider you want.
DeKlarit also supports generating a presentation layer, and if you use the presentation layer generators, the checks are also performed there, so you won’t see the ‘Save’ option in the Customer form if you don’t have rights to perform that action.
We also use Enterprise Library for Caching, Configuration and Encryption.
Gaston built a cool VS.NET-integrated editor for DeKlarit add-ins, which are like the scripts used to create new code generation tools (i.e. the presentation layer ones). You can tell DeKlarit which VS.NET projects you want to create, the templates you want to apply and when you want to apply them. You can also edit the templates in VS.NET.
These features required a redesign of our underlying framework, but it was worth it.
On the other hand, there's a feature that was quite simple to implement, and I think it's very cool (I won't explain it here because it's well explained there ;). When you are writing the business logic layer, you can live without it; it just requires you to write code to retrieve the primary key using the natural key. But when you are generating the presentation layer, if your metadata model does not have that information, there's no way to generate the right form.
SQL Server 2005 has a new XML data type. When you store an XML value or send it over the wire, it is not serialized as a text XML representation, but as a binary XML representation.
The System.Data.SqlTypes.SqlXml type has a CreateReader() method that returns an XmlReader. This means you can pass it to any .NET object that takes an XmlReader without ever creating the XML text representation, saving memory and parsing time.
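As a minimal sketch of what that buys you, here is how a SqlXml value could be fed straight into an XmlDocument (the LoadOrder method name is mine, just for illustration):

```csharp
using System.Data.SqlTypes;
using System.Xml;

static XmlDocument LoadOrder(SqlXml sqlXml)
{
    // CreateReader() exposes the value directly as an XmlReader,
    // so no intermediate XML text string is ever built or re-parsed.
    XmlDocument doc = new XmlDocument();
    doc.Load(sqlXml.CreateReader());
    return doc;
}
```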
Suppose you want to write a stored procedure that returns an XML type as a ref parameter. You can write:
[Microsoft.SqlServer.Server.SqlProcedure]
public static void StoredProcedureOut(ref SqlXml sqlXml)
{
    string myXml = "<Customer>Peter</Customer>";
    sqlXml = new SqlXml(new XmlTextReader(new StringReader(myXml)));
}
The problem is that when you call that stored procedure from ADO.NET client code you cannot get the result as a SqlXml type. It's returned as a string. If you want to feed it to an object that needs an XmlReader, you need to parse the returned string's XML.
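To illustrate the extra step, this is roughly what the client ends up doing: the value comes back as text, so it has to be re-wrapped and re-parsed before anything that wants an XmlReader can use it (the @sqlXml parameter name matches the stored procedure above):

```csharp
using System.Data.SqlClient;
using System.IO;
using System.Xml;

static XmlReader ReadXmlOutput(SqlCommand command)
{
    // The output parameter arrives as a plain string, not a SqlXml,
    // so we pay the parsing cost the binary XML format was avoiding.
    string xmlText = (string)command.Parameters["@sqlXml"].Value;
    return new XmlTextReader(new StringReader(xmlText));
}
```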
IMHO this is a bad thing, but it seems to be because the ADO.NET team wants to enable non-.NET 2.0 clients to invoke these stored procedures.
Now, what if you really want to return a SqlXml from a stored procedure and you don't want to get an XML string?
It can be done by returning a resultset with only one row and one column to the client:
[Microsoft.SqlServer.Server.SqlProcedure]
public static void StoredProcedureResultset()
{
    string myXml = "<Customer>Peter</Customer>";
    SqlXml x = new SqlXml(new XmlTextReader(new StringReader(myXml)));
    SqlDataRecord rec = new SqlDataRecord(new SqlMetaData("xmlType", SqlDbType.Xml));
    rec.SetSqlXml(0, x);
    SqlContext.Pipe.Send(rec);
}
However, if you invoke this stored procedure and ask for the type returned by the server using GetFieldType() you will still receive 'System.String'! But if you read the field using GetSqlXml() then you will, finally, get the SqlXml value.
SqlCommand command1 = new SqlCommand("StoredProcedureResultset", con);
command1.CommandType = CommandType.StoredProcedure;
using (SqlDataReader reader = command1.ExecuteReader())
{
    reader.Read();
    Type type = reader.GetFieldType(0); // Returns System.String
    SqlXml value = reader.GetSqlXml(0); // It works!
}
We've been using FogBugz since 2.x, and we've just upgraded to 4.0.
The new version is quite nice. It looks good, and has small improvements in a lot of areas that make it more usable, but I was not really surprised by any new feature until I saw this one [http://www.fogcreek.com/FogBugz/images/50PctScreenshotTool.png]. Basically it's a taskbar application that lets you capture a screenshot, crop it, and attach it to an existing or new case. The taskbar application also has a 'New Case' option that takes you directly to the New Case page, which is also very handy.
However, I was not very pleased with their upgrade pricing strategy. Even though Joel wrote some articles about pricing that I enjoyed, their upgrade pricing sucks. Basically, each day that passed without us upgrading to the new version, the upgrade got more expensive.
Their support is $0.05 per user per day, and the only way to upgrade is to have a valid support contract. This means that if you did not renew support after the standard support period ended, you need to pay for it anyway in order to upgrade. So we ended up paying for around two years of support that we did not use.
Pablo Castro posted an interesting piece of information about ADO.NET 2.0 Batching.
The solution for implementing batching looks very good. The problem is that you can only use it if you are using SqlDataAdapter!
In previous Whidbey releases there was a SqlCommandSet class that was supposed to let you do batching from your own code. That class is still there but marked as internal.
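If you are really desperate, the internal class can still be reached through reflection. This is obviously unsupported and could break in any future build; a rough sketch, assuming the type name and Append method survive from the earlier public bits:

```csharp
using System;
using System.Data.SqlClient;
using System.Reflection;

// UNSUPPORTED: pokes at the now-internal SqlCommandSet by name.
Type commandSetType = typeof(SqlCommand).Assembly
    .GetType("System.Data.SqlClient.SqlCommandSet");
object commandSet = Activator.CreateInstance(commandSetType, true);
commandSetType.GetMethod("Append")
    .Invoke(commandSet, new object[] { mySqlCommand });
```

I would not ship this, but it shows the functionality is sitting right there behind the internal keyword.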
So, if you are using DataSets/SqlDataAdapter, then you can easily take advantage of the new batching features.
If you are using DataSets but not SqlDataAdapter, you can try using SqlDataAdapter when you need to take advantage of Batching.
If you are not using DataSets, then your batching options are much more limited. You need to concatenate commands and change parameter names, etc.
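For what it's worth, the concatenation approach looks something like this (the Customers table and @p0-style parameter names are made up for the example):

```csharp
using System.Data.SqlClient;
using System.Text;

// Hand-rolled batching: concatenate the statements and rename the
// parameters so they don't collide across statements.
static SqlCommand BuildBatch(SqlConnection con, string[] names)
{
    StringBuilder sql = new StringBuilder();
    SqlCommand cmd = new SqlCommand();
    cmd.Connection = con;
    for (int i = 0; i < names.Length; i++)
    {
        string p = "@p" + i;
        sql.Append("INSERT INTO Customers (Name) VALUES (" + p + ");");
        cmd.Parameters.AddWithValue(p, names[i]);
    }
    cmd.CommandText = sql.ToString();
    return cmd; // one round trip for all the inserts
}
```

It works, but you are doing by hand (and per command shape) what SqlCommandSet would do for you.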
I've posted a suggestion in MSDN Feedback Center to ask for a way to use the new batching without SqlDataAdapter. It's probably way too late in the Whidbey product cycle to change that, but let's see...