Continuing on the SOA front, I've recently been reading a version of the forthcoming book "Programming WCF Services" by Juval Lowy. While it is a great book on WCF internals, I was thunderstruck by Juval's assertion that services are the "next evolutionary step in the long journey from functions to objects to components to services". I disagreed so strongly that I sent comments to that effect to him and the publisher.
As described, Juval sees an application as being made up entirely of services, and I can see his reasoning. You see, Juval advocates interface-based development, which he claims (others would define the term differently) is "component-oriented programming". To him a "component" is an object that exposes an interface - one interacts with it through the "contract" exposed by that interface. It's a case of turning the advice "program to an interface, not an implementation" into an edict. So in his previous book "Programming .NET Components" he sees an application as being composed of "components".
Now one interacts with a service (in WCF) via a "contract". His extrapolation is therefore that a service is really a type of component, and hence a type of object - which is where his statement about the "next evolutionary step in the long journey from functions to objects to components to services" comes from.
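Juval's component-to-service parallel is easy to see in WCF itself, where the contract is literally an attributed interface. A minimal sketch, assuming a hypothetical IOrderService (the names here are illustrative, not from the book):

```csharp
using System.ServiceModel;

// The client sees only this interface - the "contract" -
// never the class that implements it.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    decimal GetOrderTotal(int orderId);
}

// The service implementation is an ordinary class behind the contract.
public class OrderService : IOrderService
{
    public decimal GetOrderTotal(int orderId)
    {
        // ... look up the order however the implementation chooses
        return 0m;
    }
}
```

Squint at the attributes and this is exactly his "component": an object reachable only through the contract its interface exposes.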
How would I explain the relationship between an "application" and a "service"?
A single application can exist either within a single context of execution within a platform (typically a process, though in .NET it could be an AppDomain), or it can be distributed over multiple contexts of execution. Within any context of execution, the world will continue to use object-oriented development to model the complexity of the domain and supporting infrastructure.
But the parts of an application that are spread over different contexts of execution need to communicate with each other. That extends beyond a single application to communication between applications. One can use any of the standard mechanisms of integration architecture: file sharing, shared database, remote procedure invocation, messaging - or just walk a floppy between them.
While Remote Procedure Invocation was formerly "de rigueur", messaging between "islands of functionality" or "services" using industry standards has become the fashionable mechanism for communication between those contexts of execution.
There seems to be a lot of discussion these days regarding the relationship between DDD (Domain Driven Design) and SOA (Service Oriented Architecture).
Udi Dahan says they are orthogonal, and via that post, Richard Campbell says they are opposites.
My answer is that they are unrelated.
DDD relates to the implementation of a Service. How a Service is implemented within a SOA is unimportant, and the adoption of DDD is just an implementation detail. The Service may be implemented in any manner - as a procedural script, functional program or as object-oriented code - in Logo, Perl, RPG or C# - who cares?
That being said, one thing I advocate is that a Service is not equivalent to an Application. A single Application may be distributed, and itself consist of a number of Services that communicate through messaging across AppDomain or process boundaries. Perhaps that is where most confusion lies.
I recently responded to a question on a newsgroup regarding what to
use for persisting data - XPO (an Object/Relational Mapper (ORM) tool) or Microsoft's ADO.NET. The question was from a Delphi developer new to .NET. For posterity, I've included my response below.
Updated 23-Jul-06: added inline links and a link summary
Fundamentally, ADO.NET is a library that supports access to data. In order to simplify development, ADO.NET includes infrastructure that supports Microsoft's implementation of the disconnected recordset pattern i.e. DataSets. That pattern is just one way to access data through ADO.NET, and is one that particularly suits simple RAD type applications driven through a tool like Visual Studio. You don't need to use that pattern, indeed, it is not suited to applications with complex domains - applications that include a business domain layer modeled through objects.
Most .NET applications that use DataSets have no domain layer. Rather, there is a triangular relationship between the presentation/logic layer, data access and the DataSet. Such applications are generally modeled in terms of data, not behavior - they are not object-oriented solutions, but data-centric ones.
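The disconnected-recordset style can be sketched in a few lines. This is a hedged example - the connection string, table and column names are placeholders, not from any real schema:

```csharp
using System.Data;
using System.Data.SqlClient;

// A data adapter fills the DataSet, and the UI binds straight to it -
// no domain objects anywhere in between.
SqlDataAdapter adapter = new SqlDataAdapter(
    "SELECT CustomerId, Name FROM Customers",
    "Server=.\\SQLEXPRESS;Database=Northwind;Integrated Security=true");

DataSet customers = new DataSet();
adapter.Fill(customers, "Customers");

// The presentation layer binds directly to the data, e.g.:
// customersGrid.DataSource = customers.Tables["Customers"];
```

That directness is exactly why it suits simple RAD applications, and exactly why it breaks down once behavior, not data, is the thing you need to model.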
Nevertheless, one can quickly get a simple application up and running through the visual tools, wizards and code generation found in tools like VS. Essentially just like you could in MS Visual Basic (version 3?) - which is no surprise as the majority of VS users come from a VB RAD background.
That does suit many simple applications, however, as domain complexity rises one needs to look to different patterns to connect the in-memory domain modeled through objects, to persisted data.
But, modeling complexity through objects in the domain layer has the problem that objects don't map well to most persisted data models - whether hierarchical, relational or other.
So the issue is how does one "map" objects to data?
One can look to a number of architectural patterns including Active Record (a la Ruby on Rails) through to data mapping through ORM (object/relational mapping) solutions.
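To make Active Record concrete, here is a hedged sketch (the class, table and column names are illustrative only): the object carries both its data and the SQL that persists it.

```csharp
using System.Data;

// Active Record: the domain object knows how to save itself.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }

    public void Save(IDbConnection connection)
    {
        using (IDbCommand command = connection.CreateCommand())
        {
            command.CommandText =
                "UPDATE Customers SET Name = @name WHERE Id = @id";

            IDbDataParameter name = command.CreateParameter();
            name.ParameterName = "@name";
            name.Value = Name;
            command.Parameters.Add(name);

            IDbDataParameter id = command.CreateParameter();
            id.ParameterName = "@id";
            id.Value = Id;
            command.Parameters.Add(id);

            command.ExecuteNonQuery();
        }
    }
}
```

An ORM like XPO or LLBLGen Pro inverts this: the mapping knowledge is lifted out of the domain class and into the mapping layer, which is what lets you model "without regard to persistence".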
XPO is an ORM tool. It purports to allow one to concentrate on modeling an application through objects, without regard to persistence. It is object-centric, unlike popular ORM tools like LLBLGen Pro, which is entity-centric and more suited to the enterprise, where a schema already exists.
"XPO vs MS ADO.NET for the Delphi Programmer"? The real key is that there is no silver bullet - different patterns and approaches suit different projects.
My advice? Learn and use both!
Links: Developer Express XPO | LLBLGen Pro | Active Record Pattern (1) (2) (3) | Ruby on Rails | ADO.NET | .NET Definition | ORM
It's generally acknowledged that bringing a non-trivial software project to a successful conclusion is problematic. Sadly, statistics confirm that few projects can be regarded as successful when one strictly applies the criteria that a successful project is one that is delivered on time, on budget and to specification. Delays, budget overruns and the delivery of solutions that aren't actually of value to stakeholders are common symptoms.
Most commonly, people point to issues with the determination and management of "requirements". Certainly, issues related to requirements engineering and the mismanagement of stakeholder expectations are at the core of most project failures. It's exceedingly rare that a project is jeopardised by an inability to solve a technical issue.
Why this tour of the dark side of our industry?
Well, I've recently been exposed to (yet another) large problematic project that well meets the above definition of failure. The reasons for that project's issues are standard - no communicated project charter (vision and scope), no management of requirements or expectations, no attention to control, process or quality, no competent leadership. Truly an ad hoc, anarchic CMM Level 1 (see PDF page 11) development environment, evidenced by the organisation's inability to ever deliver a single release to customers as advertised and scheduled.
My prescription in such circumstances? Replace those in positions of authority, and put in place competent professional management and technical resources that can adopt standard industry (best) practice.
What would you recommend?
Update: 30-Jul-06: It has been announced that new external senior management is to be brought in to oversee operations.
"BPM is SOA’s killer application, while SOA is BPM’s enabling infrastructure." - Ismael Ghalimi
Julian Bucknall (DevExpress) has written a great post on interface versus implementation inheritance, that also touches on developer skills.
For me, having developers produce loosely coupled objects that have a single responsibility and which support change over time, has been problematic on almost every project and team I have managed.
The notion that all developers are highly capable and motivated is an illusion. The reality is that it's really hard to find competent .NET developers who can write more than procedural code (within the guise of an object-oriented language). That perhaps simply reflects that so many .NET projects are simple RAD applications that really are just a window onto data - no complex domain at all.
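The point shows up even in trivial code. A hedged sketch (all names are illustrative): depending on an interface keeps a class loosely coupled and single-purpose, where implementation inheritance would weld it to one base class.

```csharp
// The caller depends only on the abstraction...
public interface INotifier
{
    void Notify(string message);
}

public class OrderProcessor
{
    private readonly INotifier notifier;

    // The concrete notifier is supplied from outside, so it can change
    // (email, SMS, a test stub) without touching this class.
    public OrderProcessor(INotifier notifier)
    {
        this.notifier = notifier;
    }

    public void Process()
    {
        // ... do the work, then
        notifier.Notify("Order processed");
    }
}

// ...while implementations can vary independently.
public class EmailNotifier : INotifier
{
    public void Notify(string message) { /* send an email */ }
}
```

Procedural code dressed in classes tends to skip exactly this step: one concrete class calling another, with no seam for change.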
SQL Server Express 2005 doesn't allow remote connections by default.
Some simple instructions for allowing remote connections can be found at ... http://www.datamasker.com/SSE2005_NetworkCfg.htm.
Also grab the free ExpressMaint console utility available at http://www.sqldbatips.com/showarticle.asp?ID=29. It provides some of the functionality of the sqlmaint utility - backups made easy!
My favorite book on ASP.NET is without doubt Fritz Onion's Essential ASP.NET With Examples in C#, but I've been wondering about an update for 2.0. Looks like the answer is that he'll be co-authoring Essential ASP.NET 2.0 with Keith Brown - can't wait.
On the book front, I'm currently reading:-
I'm about to order:-
One worthwhile ebook download, if you're interested in LLBLGen Pro (O/R Mapper), is Joseph Chancellor's Rapid C# Windows Development: Visual Studio 2005, SQL Server 2005, and LLBLGen Pro. It provides a good opportunity to review LLBLGen in a quick, easy read, and costs only US$8.75.
I actually buy few books that are .NET specific, preferring standard texts that cover practices crossing technologies. But Troelson, Onion and Richter's books are musts on your .NET bookshelf. Note that Troelson is really suited to already proficient C# developers.
I used to order all my books from Amazon, because even though I am in Australia, prices were cheaper than local retailers. But for the last year, I've been buying all my books online from bookware.com.au - well worth looking into if you're from Australia.
There has been some discussion recently on the Australian "dotnet" mailing list about the applicability of XP and agile development to large-scale projects. Nick Randolf questioned someone's comment that "most of the practices won't work in large, dispersed projects", and asked for good reasons why they would not.
I wrote in response ...
I like Barry Boehm's critical factors for weighing agile methods against a project - Size, Criticality, Dynamism, Personnel and Culture - and his assessment of agile methods against each:
Size: Well-matched to small products and teams. Reliance on tacit knowledge limits scalability.
Criticality: Untested on safety-critical products. Potential difficulties with simple design and lack of documentation.
Dynamism: Simple design and continuous refactoring are excellent for highly dynamic environments, but a source of potentially expensive rework for highly stable environments.
Personnel: Requires continuous presence of a critical mass of scarce Cockburn Level 2 or 3 experts. Risky to use non-agile Level 1B people.
Culture: Thrives in a culture where people feel comfortable and empowered by having many degrees of freedom - thriving on chaos.
It's now widely accepted that XP practices "must be adapted as necessary for projects that do not fit the 'small team' limits recommended by its founders" (http://www.thoughtworks.com/bad-smells-in-xp.pdf). In Australia, efforts to implement XP as a corporate methodology or in large projects tend to go the way of Citect (citect.com.au). A "guru" will preach "values" over "process", which rapidly go out the window as processes that manage risk and offer predictability to customers and stakeholders are re-introduced.
I think that's why there is movement away from XP, in projects unsuited to its sweet spot, to say Scrum, or one of the large-team variants of Crystal.