April 2004 - Posts
[Disclaimer: These are my opinions and not a generalization of the industry as a whole; I can only comment on things I've read and seen. I've been a terrible predictor of the future in the past and will probably still be one in the future (this prediction included :-))]
Some trends I've been observing for a while, and a post I read on Weblogs @ ASP.NET, provoked the writing of this post. Since that was yesterday, it has already disappeared from the main page (I can't remember which blog it belonged to, and it doesn't seem I subscribe to it either) and Google hasn't indexed it yet, so I can't credit it.
[Update: This was the post I was referring to. Thanks to Julia Lerman for the link]
The small post (2-3 lines, or at least that's what I've retained :-)) wondered if SOA isn't being overhyped to the point that it will follow Hailstorm's path.
Although I think Hailstorm failed (was delayed? will come back?) due to a number of totally unrelated things:
- Privacy issues (notice the hubbub concerning Google's new Gmail service)
- People weren't ready to pay for services yet (free is still on a lot of mindsets :-))
- Data location (mainly an issue of trust, among other things); people seem to have some strange sense of possession :-)
- And some other things I don't wish to address, because I don't want to deviate from my topic more than I already have :-)
It is possible that SOA is so overhyped that after a point, people won't touch it with a ten-foot pole, because they have been burned by people who implemented solutions that had nothing to do with SOA but labelled them as such (and billed an amount directly proportional to the hype, with a ROI inversely proportional to it).
In this industry there was once (in the 70's and 80's) a popular saying: "Nobody ever got fired for buying IBM" (later replaced with other names, as the industry progressed and power shifts occurred). Basically it means no decision maker would be pinpointed later on if a project went badly. After all, he bought from the leader himself, so it must have been a good decision (or, classical cover-your-back scenario, it prevents being blamed for not having bought from the market leader when a project goes bad). In The Gorilla Game, Geoffrey Moore later stated that since the gorilla is the only one with a lasting play, it's the only safe choice (as a small player I tend to disagree, but he's the one writing the books :-)).
I'm starting to think that SOA is starting to suffer from the same problem. Don't get me wrong, I love SOA and I would like to see it become something that puts bread on my table in substantial amounts in the future, because I really believe in the power of this architecture; it's something that (when applied correctly) brings (IMHO) a lot of added value to a client and its information system. But I'm starting to think that all this overhype, misuse and abuse will do us more harm than good in the long run.
Projects will be approved (and expensive ones), they will be implemented, but surely the results will not be as good as when they were sold. This is because, as with all hype, there is more sizzle than substance. People slap a WebMethod attribute onto a method and call it SOA, totally disregarding the basic SOA tenets.
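A minimal sketch of what I mean, in language-neutral Python (all the class and method names here are invented for illustration). Wrapping an internal, fine-grained data-access call in a "web method" doesn't make it service-oriented; a service exposes a coarse-grained, message-based operation behind an explicit contract:

```python
# Anti-pattern: exposing an internal, fine-grained API directly as a "service".
# Slapping a web-method wrapper on these does not make it SOA: callers still
# need several chatty calls and knowledge of the internal data model.
class CustomerDao:
    def __init__(self):
        self._rows = {1: {"name": "Alice", "credit": 100}}

    def get_customer(self, cid):
        return self._rows[cid]

    def set_credit(self, cid, credit):
        self._rows[cid]["credit"] = credit


# Service-oriented alternative: one coarse-grained, message-based operation
# with an explicit contract, hiding the internal data model entirely.
class CustomerService:
    def __init__(self, dao):
        self._dao = dao

    def apply_credit_adjustment(self, request: dict) -> dict:
        """Contract: {'customer_id': int, 'delta': int} -> {'ok': bool, 'credit': int}"""
        row = self._dao.get_customer(request["customer_id"])
        new_credit = row["credit"] + request["delta"]
        if new_credit < 0:
            return {"ok": False, "credit": row["credit"]}
        self._dao.set_credit(request["customer_id"], new_credit)
        return {"ok": True, "credit": new_credit}
```

The syntax of both is equally easy to expose over SOAP; the difference is in the granularity and the contract, which is exactly what the "WebMethod = SOA" crowd disregards.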
This anecdote seems to be the other side of the coin.
Surely such high coverage of SOA will allow us geeks to play SOA bingo while attending meetings (when allowed), but apart from that, there isn't much fun in watching all this silver-bullet talk.
I could stay here all day, pounding my keyboard furiously, but I couldn't write it as eloquently or as clearly as Rockford Lhotka put it in SOA, dollar signs and trust boundaries. (Great soundbite: "The S in SOA is actually a dollar sign ($OA).") [Found via Ted Neward, and subscribed]
[Update: Found this interesting post on Dare Obasanjo's blog]
This isn't a real follow-up to my original post, but since the original post generated some responses, I would like to address them here.
I will skip the first one: Kaneboy, nothing personal, but the post is written in some Asian language; I fed it through Babelfish by trial and error, and it seems to be Chinese. :-)
Anyway, it's a small post, but it seems (if we trust Babelfish, and assuming I chose the right language :-)) to somehow agree that the middle tier is obsolete.
The second one argued that Yukon will be a platform. No doubts there; if it were up to Microsoft's wishes, even Solitaire would be a platform :-). [Ray Ozzie once wrote the best description of a platform I've ever read online.]
Ok, I'm exaggerating, but I think it's true up to a point: for MS, the more platforms the better, especially platforms that leverage and extend the Windows (and Office) franchise. If you have read Breaking Windows you can see how any threat (even an internal one) to the Windows franchise is handled.
A platform is also great for Microsoft (at least for now) because it allows the sharecroppers to benefit from it (not to mention how well it sells the Windows platform, thus increasing network externalities for MS products).
Although I think an MS culture change started a while ago, and it's morphing into a different (more open) company, it's not enough yet. I was excited about InfoPath as a platform until I understood that you had to have Office installed to use it. That's such an impediment that I don't see the InfoPath franchise flourishing as well as it could.
That said, I don't understand articles like this one: the Angry Coder is, well, angry at Microsoft for releasing SQL Server Reporting Services with a SQL Server dependency when technically there is no reason for such a dependency.
[Beware: take my view with a grain of salt. Obviously I'm not qualified to emit opinions on strategy and economic matters, but since that never stopped me in the past, I will emit them anyway. :-)]
" So why is it that Reporting Services only works with SQL Server? As far as I can tell, the database that stores the report formats is merely used as a data repository, and it can't be an OS and/or platform requirement because Oracle, MySQL and other database platforms can run on Windows. Did the Reporting Services team at Microsoft just not want to have to write PL/SQL stored procs for Oracle , or not want to bother figuring out how to port their SQL Server database to another DBMS? I wish I could find the answers to these questions. I'm curious."
You are right, there are no technical reasons, but there are business reasons, and Microsoft seems to be in the business of licensing (renting? :-)) things, the more the better. Here Microsoft is crossing the line between the users of a product and the ISVs that build upon the platform (they have a history of that): a free product benefits its users and goes against its partners (Crystal Reports is the first one I can remember in this particular case, but I'm sure there are a lot more). But let's assume for a moment that with this step MS isn't stepping on ISV toes.
Microsoft is using Reporting Services as a complement to sell SQL Server, because for MS that's all there is (regarding db systems, that is :-)). It wouldn't make any (business) sense to gain a few bucks selling a reporting product if that meant not gaining much more from a SQL Server license, or, heaven forbid, someone using Reporting Services to increase (or preserve) the market share of a competitor. (I bet Larry would have a blast presenting annual results, stating how many new sales had been generated by Reporting Services :-).) Another reason is probably MySQL, which is clearly a disruptive technology moving upmarket and becoming a contender against MS SQL Server. [Update: Why MySQL grew so fast]
[BTW, a new book on Reporting Services is being written, and the first 2 chapters can be downloaded for review here (via Jon Box)]
[The same reasoning can be applied to the ObjectSpaces question]
Anyway, this was just to illustrate that I too think Yukon will be a platform. At least for the people in the second group of my original post; the first ones don't care about the platform, only about its speed and stability. :-)
The third post is from Harry Pierson, and with this one I have some disagreements.
[Update: Ingo Rammer has added his 2 cents]
[Update: Harry has a follow-up]
Harry states that the middle tier will move into the database because Moore's Law is on our side. Ah, speed; yes, we like that, the more the better. But speed is all very nice, and when it comes to scalability raw CPU speed probably isn't the main factor anyway (I/O bandwidth, memory speed and, to a lesser degree, CPU cache coherence matter more).
The main problem that affects scalability is contention, resource contention, and as I stated here, the data seems to be the major contention point.
Adding a faster CPU means other tasks will wait a little less, but you will still have contention somewhere if you wish to preserve your data integrity.
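A toy sketch of this point in Python (the scenario is invented): updates that must preserve integrity serialize on a lock, so no matter how fast each individual worker runs, only one at a time can pass through the contended section. Faster CPUs shrink the wait, they don't remove it.

```python
import threading

balance = 0              # shared state whose integrity we must preserve
lock = threading.Lock()  # the contention point

def deposit(amount, times):
    """Each worker performs `times` deposits of `amount`."""
    global balance
    for _ in range(times):
        # However fast each thread is, only one at a time may pass this
        # point, or we get lost updates and corrupt the balance.
        with lock:
            balance += amount

# Four workers, 10,000 deposits of 1 each.
threads = [threading.Thread(target=deposit, args=(1, 10_000)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The lock makes the result deterministic (40000); it is also exactly
# what prevents the workload from scaling with more or faster workers.
print(balance)  # 40000
```

The same shape shows up in a database: row and page locks protect integrity, and they, not CPU cycles, end up bounding throughput on a hot piece of data.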
Surely for some systems a single box would be more than enough, but for those where it isn't, I would love to see Yukon support transparent data partitioning (I'm not talking about splitting data files among different spindles or filegroups), either in the same database or across several distributed databases (transparently).
At one point Harry says "It even makes sense to build a both an application and a messaging infrastructure directly into the database engine." I wonder: where does this leave Indigo?
Harry also says
"We also need better management. And not incrementally better, orders of magnitudes better. If you're going to replace a single BIG app with hundreds of independent services, incremental manageability improvements are not going to cut it. "
Agreed, and Google grokked this a long time ago; I really hope DSI will improve things in these matters.
Similar thoughts, also generated by Harry Pierson's post:
From Ted Neward and Ed Draper (both more thorough and eloquent than me).
Ah, the joys of bureaucracy.
"Process is not a substitute for talent, common sense, hard work, and good management"
-- Esther Derby
Nothing more to add, go read the entire post.
On a (probably) unrelated note: Why Individual Measurement is Bad. I totally agree, but I'm probably biased, since I tended to fill in the annual self-evaluation form (pro forma crap, to me) by repeating the previous year's entries. :-)
Let's get on the bandwagon of the page 23 meme (picked from)
"By buying a NeXT, we could justify my working on my long delayed hypertext project as an experiment in using the NeXT operating system and development environment"
-- From the fifth sentence on page 23 of Weaving the Web by Tim Berners-Lee
There were a lot of books equally near, so this one seemed appropriate because of this much-deserved award (the press release).
The meme says:
Grab the nearest book.
Open the book to page 23.
Find the fifth sentence.
Post the text of the sentence in your journal along with these instructions.
(watch how it is spreading through the blogosphere here or here)
Warning: rant ahead. :-)
I've been encountering a distressing number of people who seem to think that a WSDL is more than enough to integrate with another system.
Apparently the contract seems more than enough to use a few hundred services (different systems) with a few dozen calls each (some with dozens of parameters). Semantics seem to be of no importance whatsoever.
At a certain point I thought I had missed something: the silver bullet had been found while I was sleeping.
In order to find my sanity (or the proof of my insanity) I did some googling and found two interesting articles.
A recent article, which presents a nice overview of WSDL and its uses:
And this older one, which I included here because I loved the following paragraph, which proves I'm not insane after all. :-)
Like all IDLs, WSDL is strong on syntax and weak on semantics. Nonetheless, do not neglect this task as, at the end of the day, it is the semantics that matter; syntax merely serves to unlock them.
This makes me wonder if the Web services horn hasn't been tooted too loudly. So loudly that people have lost their grip on the basics. Simple things, such as: what does this method do, and what are its parameters?
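To make the "strong on syntax, weak on semantics" point concrete, here is a contrived sketch in Python (both functions and their behavior are invented). Any IDL, WSDL included, would describe these two operations identically, string in, float out, yet calling the wrong one silently gives wrong answers:

```python
def get_price(product_id: str) -> float:
    """Returns the unit price in EUROS, tax included."""
    return 12.10  # canned value, for illustration only

def get_price_usd(product_id: str) -> float:
    """Returns the unit price in US DOLLARS, tax NOT included."""
    return 10.00  # canned value, for illustration only

# A machine-readable contract sees both as (string) -> float.
# Nothing in it says which currency is returned, whether tax is
# included, or what a valid product_id even looks like. That semantic
# knowledge has to come from documentation and human agreement,
# which is exactly what the WSDL alone does not give you.
```

Multiply that ambiguity by a few hundred services with a few dozen calls each, and "just read the WSDL" stops being an integration strategy.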
A few weeks ago I was surfing the web and came across the article Kiss the Middle-tier Goodbye with SQL Server Yukon. I didn't have time to read it, but saved it for later. The title was intriguing, and after reading nothing more than the title I thought it might make a nice post; after a few thoughts about it while taking my morning shower, I had envisioned the structure of this post.
Now that I've read the article, it has nothing to do with what I envisioned, but since no thoughts (even if they aren't worth much) should be wasted, I decided to write it anyway. The article summarizes how we can store and manipulate XML directly in Yukon; interesting, but not what I expected. :-)
During my professional life I've encountered two schools of thought when dealing with databases (in no particular order):
- The ones who consider the database just a data store, a simple silo where you can store, delete and fetch data. No more, no less. All business logic should be kept somewhere else, and data access is normally written using code generation (at compile time, or generated automatically at run-time using some kind of abstraction), an O/R mapper, or some similar technology. Their reasoning:
  - By using the data store only for storage and keeping the business logic independent from it, we can later switch to another database incurring no costs (apart from the licensing ones and the data migration :-)).
  - It's easier to keep all business logic in one place: easier to write, easier to maintain and easier to understand.
  - More scalable systems: the processing is split across systems, and there is no need to overload the already most strained system, the database (unless the database is distributed, it is probably the most used single point and the biggest point of contention).
- The ones who believe that the database is not only suitable for storing data, but is also the most appropriate place to put business logic. Their arguments (and probably some others I don't know or don't remember):
  - Fewer roundtrips while executing queries.
  - By controlling transactions manually in the database, you can make lock management optimal (no holding locks while business logic is doing other things).
  - By manually controlling the queries you write (inside SPs, for sure) you can micro-optimize them. From my experience (YMMV) this is all fine and dandy, and you can extract every bit of performance from the database, but at a certain point it becomes unmanageable unless your functionalities are contained and orthogonal (almost no reuse of SPs).
  - The closer you are to the data, the faster you will be.
  - Since business validation is done near the data, it will be impossible to introduce incorrect data into the database (assuming there are no bugs in the validation code, of course).
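The two camps can be sketched side by side in a few lines of Python with the standard sqlite3 module (the orders schema and the "quantity must be positive" rule are invented for illustration). The first school validates in the application tier; the second pushes the rule into the database, where it holds even for callers that bypass the application code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Second school: the rule lives in the database, close to the data.
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER CHECK (qty > 0))"
)

# First school: the application tier validates before touching the store.
def insert_order(conn, qty):
    if qty <= 0:
        raise ValueError("quantity must be positive")
    conn.execute("INSERT INTO orders (qty) VALUES (?)", (qty,))

insert_order(conn, 5)  # passes both layers

try:
    # A caller that bypasses the application tier entirely...
    conn.execute("INSERT INTO orders (qty) VALUES (?)", (-1,))
except sqlite3.IntegrityError:
    pass  # ...is still caught by the CHECK constraint in the database.

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 1
```

Of course this only shows the integrity argument; it says nothing about which layer scales better, which is where the real fight is.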
I understand both sides; I've been on both sides of the fence, and I try to take a pragmatic approach, normally combining the best of both worlds.
But from what I've seen, people tend to be fundamentalist about it. They seem to be binary: it's either one or the other. I've seen some pretty convoluted state machines implemented in T-SQL in the name of performance, and apart from one or two people, trying to change anything almost invariably meant system breakage.
From my experience, on most systems the database seems to be the bottleneck (data-centric systems, that is; in my business I would say 100% of them :-)), and I confess that when I hear stories about people putting more things into the database I shiver, since it's something that is very difficult to scale (scaling vertically is expensive, and scaling horizontally is either architected up front or a nightmare to perform later).
For example, this post claims that Yukon will make the business logic tier irrelevant (I'm Yukon of Borg, resistance is irrelevant?). Can we afford that? Until we can scale and distribute a database across multiple machines (transparently, with a query optimizer that can produce plans for distributed queries automatically), I think we can wait a little before putting even more things into the database.
Don't get me wrong: exposing web services directly from Yukon will probably cut a lot of cruft and speed up development, but for that MS already has a wonderful tool, SQLXML, which is great for simple tasks (not to mention it can alleviate some SQL Server CPU load). I would probably still use .NET web services for more complicated things.
I think with Yukon things will remain pretty much the same. People on one side of the camp will now load the CLR into their database; the other side will continue on its way (either with manually written code, ObjectSpaces or something else).
I know I will continue my quest to find a nice db access code generator, will use the CLR in the database when appropriate, and will keep waiting and sighing for the theory of everything. :-)