October 2007 - Posts
I found this neat (and free) trash destination for SSIS. It's really great as a development aid. It allows you to quickly terminate a data flow path, and does not require any configuration. It will consume the rows without any side effects, and prevents warnings or errors you may otherwise receive when executing the data flow.
You can read more and download it from here
Microsoft just announced its resources and tools investments in adopting SOA. Some of the tools in this set were obvious. As expected, the prototype project BizTalk Services will be available as a CTP very soon. I spoke about the huge potential of this project a while back, and it's nice to see it coming to light. BizTalk Services takes the concepts of an ESB and pushes them into the internet cloud - an ISB (Internet Service Bus), if you will.
“Oslo will enable a new class of applications that are connected and streamlined — from design through deployment — reducing complexity, aligning the enterprise and Internet, and simplifying interoperability and management.” - Robert Wahbe, Corporate Vice President of the Connected Systems Division
This vision will be delivered through several servers and tools:
- Server: Microsoft BizTalk Server "6" will continue to provide a core foundation for distributed and highly scalable SOA and BPM solutions, and deliver the capability to develop, manage and deploy composite applications.
- Services: BizTalk Services "1" will offer a commercially supported release of Web-based services enabling hosted composite applications that cross organizational boundaries. This release will include advanced messaging, identity and workflow capabilities.
- Framework: The Microsoft .NET Framework "4" release will further enable model-driven development with Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF).
- Tools: New technology planned for Visual Studio "10" will make significant strides in end-to-end application life-cycle management through new tools for model-driven design of distributed applications.
- Repository: There will also be investments in aligning the metadata repositories across the Server and Tools product sets. Microsoft System Center "5," Visual Studio "10" and BizTalk Server "6" will utilize a repository technology for managing, versioning and deploying models.
I think October 30, 2007 will be known as the day Microsoft went SOA. This might be my favorite release (this hour) confirming Microsoft's commitment to SOA.
"The Managed Services Engine (MSE) is one approach to facilitating Enterprise SOA through service virtualization. Built upon the Windows Communication Foundation (WCF) and the Microsoft Server Platform, the MSE was developed by Microsoft Services as we helped customers address the challenges of SOA in the enterprise.
The MSE fully enables service virtualization through a Service Repository, which helps organizations deploy services faster, coordinate change management, and maximize the reuse of various service elements. In doing so, the MSE provides the ability to support versioning, abstraction, management, routing, and runtime policy enforcement for Services."
I was working on an SSIS project to dump data from an Excel spreadsheet into a database table. However, the Excel file happened to be a beautifully formatted, colorful presentation of charts and graphs, subtotal rows and all sorts of interesting images. The actual data I needed to dump didn't even start until about 15 rows down and 6 columns across. My mind started wandering to some custom component that would skip rows, strip subtotal rows and basically rewrite the entire spreadsheet. But SSIS had better plans.
Thank goodness for the OpenRowset property on the Excel Source component. This property allows you to specify the range the source should read, in the format Sheet1$B15:Z2000. By specifying a range, I was able to ignore all the titles and clutter spread over the sheet and concentrate on just my data range. I thought skipping rows and ignoring headers and titles was going to add considerable hours to this project. I was wrong.
The next challenge was getting rid of the subtotal rows. Subtotal rows were distinguished by a cell containing the word "RESULT" with an empty cell next to it. I added a Conditional Split using ISNULL([expression]) to separate the "real data rows" from the "subtotal rows". After this, the import into an OLE DB Destination was straightforward.
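For anyone attempting the same trick, here is a minimal sketch of the two settings involved. The column names F1 and F2 are assumptions (the generic names the Excel driver typically assigns when a range has no header row); substitute whatever columns your source actually exposes.

Excel Source, OpenRowset property:
    Sheet1$B15:Z2000

Conditional Split, condition for a "Subtotal Rows" output (SSIS expression syntax):
    ISNULL([F2])

Rows that don't match fall through to the default output and continue on to the OLE DB Destination; the subtotal output can be left unattached, or pointed at the Trash Destination mentioned above.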
"The goal of Microsoft Codename Astoria is to enable applications to expose data as a data service that can be consumed by web clients within corporate networks and across the internet. The data service is reachable over regular HTTP requests, and standard HTTP verbs such as GET, POST, PUT and DELETE are used to perform operations against the service"
In its simplest form, it really seems like a web data access layer over your database. This also looks like it might be Microsoft's offering for REST-style web service developers. Semantic web, here we come!
There are quite a few components you'll need installed before you can start playing with it:
- VS 2008 Beta 2
- SQL Server 2005 or Express
- Astoria Sept 2007 CTP
- ADO.NET Entity Framework Beta 2 (Runtime)
- ADO.NET Entity Framework (Tools)
Once you've downloaded all the required components, getting the data services up and running is a piece of cake. You generate an Entity Data Model from your relational database, add a new "Web Data Service" to your ASP.NET application, and update a generic class to use your new entity model. For step-by-step instructions, check out the Using Microsoft Codename Astoria document that comes with the installation. In a matter of moments, you have your entire database wide open and exposed over HTTP. Hmmm... this is a little scary - more on this to come...
You can then access your tables through URIs; the URI itself is an expression that queries the database.
The expressions are fairly intuitive. Here are some examples:
<service>/Customers[ALFKI] - returns the customer with the key ALFKI.
<service>/Customers[ALFKI]/Orders - returns the orders for the customer with key ALFKI.
<service>/Customers[ALFKI]/Orders[OrderDate gt '1998-1-1'] - returns all orders placed after 1/1/1998 for that customer.
For a working example, check out the hosted Northwind data service here:
There are more complex expressions, including paging, sorting, and the ability to expand elements in an entity, which I'll cover in future posts.
The operators map 1 to 1 with the binary operators in SQL:
| Operator | Description | Example |
| --- | --- | --- |
| eq | Equal | /Customers[City eq 'London'] |
| ne | Not equal | /Customers[City ne 'London'] |
| gt | Greater than | /Orders[OrderDate gt '1998-5-1'] |
| gteq | Greater than or equal | /Orders[Freight gteq 800] |
| lt | Less than | /Orders[Freight lt 1] |
| lteq | Less than or equal | /Orders[OrderDate lteq '1999-5-4'] |
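Since a query is just a URI, you can exercise these services from any HTTP client, not only the browser. Below is a minimal C# sketch using WebClient; the service root URL is a placeholder for wherever your Web Data Service happens to be hosted, and the entity set and key mirror the first example above.

using System;
using System.Net;

class AstoriaQuerySketch
{
    static void Main()
    {
        // Placeholder root; point this at your own Web Data Service endpoint.
        const string serviceRoot = "http://localhost/northwind";

        // Same addressing scheme as the examples above: entity set, then key.
        string uri = serviceRoot + "/Customers[ALFKI]";

        using (WebClient client = new WebClient())
        {
            // A plain HTTP GET returns the entity's representation (XML by default in this CTP).
            Console.WriteLine(client.DownloadString(uri));
        }
    }
}

The other verbs called out in the quote above (POST, PUT, DELETE) map to inserts, updates and deletes against the same URIs.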
I would love to hear from some of the REST developers and get your feedback on Microsoft's direction with these data services. More to come on this exciting new technology soon, including more abstract concepts on what REST is, and what it means to your current development practices. Needless to say, I have my apprehensions about it already. It almost seems like a step backwards where business logic gets closer to the database and n-tier becomes 2-tier. What do you think?
I was reading an old post by Simon Guest about removing the intimacy to create services, and I thought he had a great analogy comparing real life to SOA solutions. I love these types of posts because they provide concrete examples that explain system interactions in a straightforward manner. Gregor Hohpe posted something similar in his Starbucks Does Not Use Two-Phase Commit post. It's a great read.
I would like to take Simon's post a few steps further. Continuing on his example, let's add a few more features such as:
1. Message Correlation
2. Message Expiration
3. Concurrent Orchestration
4. Policy-based security
When person A approaches person B to borrow $5, B must first make sure that he (1) knows who A is and (2) can trust A with this transaction. Even though A looks like who he says he is, and he even knows B's name and address, B has a strict "policy" that A must provide a state-issued ID (federated security) which B scans into his security manager to verify A's identity.
After his identity has been verified, B looks up A's record and current outstanding balance. B checks whether his business rule, which states that no one person can have an outstanding balance of more than $100, would be violated (business rules engine). A promises to return B's money in 7 days, and that's fine with B's business rule.
In B's orchestration engine (his brain), he creates a new long running transaction where he listens, waiting for A to return his money, on or before today + 7 days. He creates a reminder in his process manager (his BlackBerry) to alert himself and person A if he has not received the "money returned" message by that day. If no message comes in by then, a new process will be instantiated to mitigate the situation (the sendTonyToBreakLegs process), which receives a violatingEntityIdentifier message as its input message.
B is a very generous guy and lets a lot of people borrow money. He may have several long running transactions going on at any given time. In fact, person A could be going through some hard times and may have borrowed money several days in a row. Each transaction is assigned a unique value so that when A starts paying back, he can start applying it to the one closest to expiring.
When you look at daily life, it's really one big integration project. Orchestration, coordination, services, long running transactions and asynchronous operations are all around us. How well individuals deal with it depends largely on their internal processing engines, as well as their access to tools that specialize in managing these processes (BlackBerry, Outlook, etc.).
Unfortunately, ASP.NET 2.0 did not provide any validation controls for validating the state of a CheckBox, yet checkboxes are among the controls I use most often in web applications. Typical uses include requiring the user to choose at least x items in a CheckBoxList, or requiring the user to agree to some terms by checking a box confirming he or she read the agreement.
This article by Scott Mitchell has a great overview of how to create such a validation control and the source is available at the end so you can start using it in your application immediately.
The usage is extremely simple. For example, I needed to make sure that the user selected at least one checkbox before submitting a search. Here's the markup:
<skm:CheckBoxListValidator ID="chklistVal" ValidationGroup="search" ControlToValidate="chkServices"
runat="server" Display="None" ErrorMessage="Please select at least one service." />
There's been a lot of talk about the announcement at the ALT.NET conference about a new offering of an MVC framework for ASP.NET. This is very exciting news for any serious ASP.NET developer building large web applications. If you've ever used the Web Client Software Factory to build a web application, you know the advantages that building upon an MVC framework can bring.
I've been scouring the web for more details about this exciting announcement. I finally found what I was looking for: Scott Hanselman has posted the presentation and screencast from the ALT.NET conference on his blog. This is going to be awesome!
If you've ever done any kind of AJAX debugging, you may have used Fiddler to figure out exactly what is going over the wire. Fiddler is a neat little tool that lets you analyze the traffic between the browser and the application. It shows session information, cookies, and a lot more. One thing that's not clear right away is how to use it with the ASP.NET local development server.
If you run your typical ASP.NET application with the default server, you'll notice that Fiddler is not capturing any traffic. To remedy this situation, you can simply add a period ('.') to the end of "localhost" in the address bar (http://localhost.:8181/mywebapp). This will allow you to view your application’s traffic data with Fiddler.
After reading Jesus' blog post about ESBs, I got to thinking. At the end of the day, it's really about getting disparate systems to play nice together. We can't even get the countries of our own world to play nice together. Agreeing that everyone in the world speaks a common language doesn't mean we're going to play nice. Some will still need a Translator, or at least a Broker, to share the same idea across different implementations (think English-to-Spanish translator vs. German-to-Spanish). The trick is to see and embrace those differences, encapsulate them, and figure out a way to mitigate them as quickly and painlessly as possible. If you think a merger will ever be as simple as changing a configuration setting, creating a new Receive Location, and adding a Send Port, you've never been involved in a merger (even if you're using BizTalk).
To think there's a standard to anything is naive, and to imagine that those standards will survive the test of time is even sillier. Face it: in some development environments waterfall actually works and agile really fails. It's about the culture, skill set, management and so many other factors you can't even begin to imagine. It's about adaptability.
I always like to look to the business to solve our enterprise and technology problems (Eeeeh?). If you can get to the people closest to the business, and they describe a customer as X with Y accounts and an account as N with M funds, then you are getting a sense of the business/IT model. Everything else is just plumbing. As technologists, we love to find relations among everything, but we have to be forward-thinking enough to see that if the business says "we'll never have more than 100 accounts for each customer," we can decide not to enforce it in our entity models. You can enforce it somewhere else, but make it configurable ;-)
I think an ESB is a great step in the right direction for many companies facing an "IT identity crisis", if for nothing else than because it forces anyone adopting it to take a step back and look at their business domains. It forces us to establish a canonical model of the major business entities, which I believe is vital to the success of a business and IT strategy. People seem to take the word "canonical" to mean "never-changing, permanent, inflexible", but that is a misconception.
That being said, if you want to change what your company or industry considers the canonical model of what a "sales order" looks like, you'd better have a darn good reason. That reason needs to be signed off on by X, Y, and Z and it needs to be versioned and documented and syndicated to all other subsidiaries. It shouldn't be easy and you should need buy-in from several people making much larger paychecks than you.
If you're not in a position to create a canonical model of what a certain business entity looks like in your company, please don't bother playing with an ESB. It will only complicate your life when you find out that a sales order line item doesn't really do a "real-life" sales order line item justice. As a first step, I suggest stepping away from the technology and looking at (and understanding) your business domain. This is the key to a successful business/IT strategy, and it's the promise of SOA deployed properly. It's all about business driven development. Did I just coin that? Apparently not.