I just read an article ("Dealing with the 'Melted Cheese Effect'" by Maarten Mullender in collaboration with my fellow thinktect Christian Weyer) on how to prepare for change in SOA architectures. Although the article itself does not contain many new insights - it´s the start of an interesting-looking series, though - it tackles an important problem: How can you replace the moving parts in a machine running 24x7? We´ll see what Maarten has to say about that in his future postings.
What I´m grateful for, though, is a thought this article sparked in me. I suddenly realized the size of the change we´re talking about when we move from "ordinary applications" to "SOA systems". My thought was that this change is comparable to the switch from Newtonian physics to relativistic physics.
Newton and Galileo looked at physical systems that were so small that the speed of impulse was of (almost) no concern. Impulses travelled in zero time from origin to destination. Galileo was not even able to determine a particular speed of light. Likewise, up to a few years ago, most software systems were so small that "speed of impulse" was of little concern. Data and meta data (changes in interfaces and implementation) travelled in zero time.
Newtonian physics was replaced (or complemented) by relativistic physics. Relativistic physics is concerned with systems so large that the speed of impulse is important - and is even bound by the speed of light. Where Galileo might be compared to someone remotely controlling a model aircraft, Einstein might be compared to someone remotely controlling a Mars probe. While controlling your model aircraft, your steering impulses immediately have an effect on your plane. But whatever you know about your Mars probe is some 3 minutes old (at best). And whatever impulse you send to it will reach it only some 3 minutes in the future (at best). So, steering a Mars probe in real time does not make sense.
What does this mean for the world of software? Today much more software (if not most software) is so large that "speed of impulse" is of high concern to it. Data does not travel between servers and clients in zero time - and neither does meta data.
However, from the first RPC protocols on, this fundamental change in software "physics" was denied. RPC in all its variations (like DCOM, RMI etc.) and its incarnations (like EJB, MTS/COM+, .NET Remoting etc.) tried to hide the difference between steering a model aircraft and controlling a Mars probe. RPC et al. suggest you can use the same remote control panel for a model aircraft and a Mars probe. But in the end that´s of course a ridiculous suggestion.
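To make the difference in "physics" concrete, here´s a little sketch (in Python, with made-up function and message names - this is an illustration, not any particular RPC framework): the first style pretends the remote call is local and hides the latency inside it; the second style admits that the impulse travels with finite speed and treats request and reply as separate messages.

```python
import queue
import threading
import time

# "Newtonian" style: a remote call that pretends to be a local call.
# The caller blocks, and the network latency is hidden inside the call.
def rpc_get_price(product_id):
    time.sleep(0.1)  # latency, silently swallowed by the RPC layer
    return 42.0

price = rpc_get_price(7)  # looks local, isn´t

# "Relativistic" style: we acknowledge that impulses take time to
# travel, so we send a message now and pick up the reply later.
outbox = queue.Queue()
inbox = queue.Queue()

def service():
    # a hypothetical service at the other end of the wire
    msg = outbox.get()
    time.sleep(0.1)  # the impulse needs time to travel and be processed
    inbox.put(("price", msg["product_id"], 42.0))

threading.Thread(target=service, daemon=True).start()

outbox.put({"op": "get_price", "product_id": 7})
# ... the caller is free to do other useful work in the meantime ...
reply = inbox.get()  # the reply arrives some time later
print(reply)         # ('price', 7, 42.0)
```

The point of the second half is not the queues themselves but the shape of the program: the caller no longer assumes the answer is there "immediately", which is exactly the assumption RPC tries to preserve.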
How ridiculous this is now becomes apparent with SOA. Large numbers of developers educated in "Newtonian physics" try to develop software systems that are governed by "relativistic physics". And not only do they need to switch their thinking to new (or expanded or generalized) laws of "physics", they try to solve "relativistic problems" with tools from the "Galilean toolchest". That´s of course bound to fail.
The task of the SOA movement thus is:
- To make clear the distinction between "Newtonian physics" and "relativistic physics" in software development and its implications. We need to build systems that account for the finite velocity of impulses in the exchange of data.
- To provide tools, techniques and guidance on how best to deal with "relativistic physics", i.e. exchanging parts in a large system where changes, too, travel with finite velocity and thus don´t reach all areas at the same time.
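One consequence of changes travelling with finite velocity: while a new version of a service rolls out, old and new message formats coexist on the wire, so a consumer must tolerate both. Here´s a small sketch of that idea (the message shapes and the "version" field are assumptions of mine, not a standard):

```python
# A sketch of a version-tolerant reader: while a service upgrade
# propagates through the system, some senders still emit version 1
# messages (flat "customer" string) and some already emit version 2
# (structured "customer" dict). The reader accepts both and ignores
# fields it doesn´t know about.
def read_order(msg):
    version = msg.get("version", 1)  # v1 messages carry no version field
    if version >= 2:
        customer = msg["customer"]["name"]
    else:
        customer = msg["customer"]
    return {"customer": customer, "amount": msg["amount"]}

old_msg = {"customer": "Alice", "amount": 10}
new_msg = {"version": 2, "customer": {"name": "Alice", "vip": True}, "amount": 10}
print(read_order(old_msg) == read_order(new_msg))  # True
```

In "Newtonian" software you´d simply update caller and callee in one step; in a large "relativistic" system there is no such single step, so tolerance for in-flight variety has to be designed in.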
So far, programming components was about software running on a single machine. With SOA, programming components has moved up to programming components (services) for software running in a network. It´s important not to forget this similarity - CBD and SOA are both about components - but it´s also important to see where differences in "physics" demand different solutions.