In the pursuit of collaborative intelligence...

...Ed Daniel's software blog



Mark Shuttleworth launched the Ubuntu Foundation on July 1, 2005, following the first release of Ubuntu, a Debian-based operating system, in October 2004.  Today the latest version, known as "Feisty Fawn", has been released.

It has been quite a journey to observe. For the last couple of months I've been getting my hands dirty with the Fawn as it finished its final stages of testing, and in the process I've been learning what Mark's Ubuntu is all about.

In today's world of software it is more than apparent that success is not about what you know, since knowledge is often commoditised, but about what you do with what you know.  Leaders in business and beyond often talk of empowerment - how empowering the employee or manager enables their business to grow and outperform.

If this empowerment were taken for granted, what could one expect?  I often find myself sharing one particular thought in conversation these days: "it's about the wisdom of crowds in the age of participation".

My first glimpse of Linux came when I got my P3 Vaio dual-booting SuSE in 2000.  It led me to believe that, from an enthusiast's perspective, Linux was fun, but for desktop computing Microsoft gave me all I needed, along with a whole heap of support and an established, synergistic community of solution providers, all thanks to Microsoft's partner strategies and user community initiatives.  Today, thanks to the Ubuntu community and Mark's team at Canonical, this has been replicated for me in the Ubuntu world.

I pondered briefly on discussing the fact that this is all 'free' to me, as is much of what Microsoft provides, but I don't want to dwell on the significantly lower costs a community model incurs to deliver services compared with one that must fund all contributions.  What I'd prefer to focus on is what the advantages of the community model are and how these can compete with the existing one as delivered by Microsoft.  That is a blog post in itself, but for a taster you really need to read this.

Having witnessed first-hand the methods and processes that support the Ubuntu community, I believe this distribution has a very promising future ahead of it.  There are lots of blogs appearing from people who have decided to do a thorough investigation of moving from Windows to Ubuntu, and we appear to be coming to similar conclusions.

Computing just got fun again.  More than that, the rules of the IT industry have changed once more, as demonstrated by the success of Ubuntu as an operating system, community and organisation.

Perhaps Microsoft's missed opportunity, and its Achilles heel, will always be that today we know so much more about the value of community, the power it possesses and the desire to belong to what I believe identifies who we are - like our brand choices.

PS. The next version of Ubuntu is due in six months, as is the next release (v4) of KDE - the desktop environment I 'chose' to run on Ubuntu - which will include the Nepomuk project.  So I finally get what I've been waiting for: a semantic desktop computing experience.  I had hoped this would be possible with Longhorn's WinFS architecture, as promised by Microsoft at PDC in 2003, but I gave up waiting in 2006 when it became evident, with the rebranding from Longhorn to Vista, that restructuring was afoot and further delay was inevitable.

Multimedia-assisted requirements analysis can bring a spotlight on healthcare in the UK

Last night I attended a very interesting talk at the British Computer Society on open source technology and health care. Here are some links that were shared during the talk.

So why multimedia-assisted requirements analysis, you may ask?  During the Q&A I was keen to highlight that there currently exists no means for the technical community (open or otherwise) to actually 'get at' the software requirements of general practices (GPs).  It became apparent to me that one technique for requirements gathering would be to use multimedia, i.e. a handheld video recorder, to capture the 'working practices' at a GP.

This approach, I believe, has merit, as one of the common issues we face as architects/developers is translating working practices into requirements.  'Shooting' videos, doing some packaging with tools such as Celtx (you might wish to blur confidential information, for instance) and sharing these on sites such as YouTube would enable the technically illiterate to prepare content at zero cost (plus their time, of course) that we could use to derive and model solutions that were fit for purpose.

It would also bring an intense spotlight on the actual issues GPs face with the current tools they use to get their work done.  Does it really make sense for digital patient records to be printed out onto paper and re-entered into the exact same system when patients transfer from one GP to another?  Ludicrous is one word that comes to mind.

It came as no surprise that the speaker was flabbergasted at the continued cycle of broken promises relating to software projects that are never deployed due to the politics and ineptitude of the people in charge of delivering the work.  How long does one wait?  Ten-plus years is the average, it seems, and still no service.

What is perhaps the most depressing symptom of this poor performance is a totally disenfranchised community of NHS managers who are fearful of getting their hands dirty on an IT project because of the career risks it creates.  No one wants to be known for losing £2.5m (a value for example only) of budget on a failed IT project - and there have been many in the NHS - yet a few people have continued to profit from this fiasco.

Today we face an uphill struggle in trying to convince our government of the new era in software evolution.  We have a pressing need to innovate around the indemnification challenge: when someone like Mr. Granger engages with a provider, he wishes to ensure that the provider assumes liability.  Where this applies to software that is 'open' there seems to be an awful disconnect, fuelled by marketing spin.  The problem is that this is costing taxpayers millions of pounds, but no one listens because the debate is stuck in the realms of 'technical choices' as opposed to 'insightful strategy'.

For the entrepreneurs amongst you, there is an opportunity to create a brokerage that provides indemnification for local open source providers targeting healthcare providers in their region - one that supports many more players, enriching and delivering choice to GPs, than those with the might and budget to go after the bigger deals.

Comments welcomed, especially any links that corroborate or contradict these opinions

Pros and cons of software selling model "cheap product, expensive support"

I came across this LinkedIn Answer post today:

"Can anyone suggest a good set (or source) of pros and cons of "sell cheap/free, support for money" approach? Like Oracle do, for example. The software i'm talking about is for financial services industry, and quite expensive. I'm sure there is a lot of experts in selling strategies - would love to hear opinions, thanks."

Here are my thoughts on this...

At both the macro and micro levels of software architecture - from a business-ready solution that leverages operating systems, messaging and storage platforms, upon which a variety of applications sit, down to an individual software component, perhaps on a chip - the issue is one of 'software that just works' versus 'software that needs help'.

Whatever software you use, you make choices about where you enter the architecture and how you build upon and beneath its various intermingling layers; it's more a 3D ecosystem than a vertical 2D stack nowadays.

At each juncture where one piece of software depends on another, the risks involved are based on maintainability, resilience, security, scalability, interoperability and the measure of being fit for purpose.

Around these arguments one can align a business strategy that complements the resources required to achieve a successful solution - such requirements will involve capacity, expertise and knowledge.

Knowledge is fast becoming a commodity, open standards are driving integration. What is not a commodity is 'time' - so the pros/cons of selling cheap/free software and raising revenue through a support model must meet the value of 'time' the approach brings to the customer.

If your solution costs less time to develop, deploy, manage, integrate and evolve, and the overall lifecycle cost is competitive, then the proposition can stand up against any other.  At this point the customer should have a clear understanding of the costs/requirements and benefits/deliverables, and be able to identify the value that can be created or saved through implementation.
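As a rough sketch, the lifecycle-cost argument above reduces to simple arithmetic: total the cost of every stage, not just the licence. The stage names and figures below are entirely hypothetical - the point is only that a "cheap product, expensive support" proposition wins or loses on the whole-lifecycle sum.

```python
# Illustrative lifecycle-cost comparison (all figures hypothetical).
# The proposition with the lower total wins, regardless of where
# in the lifecycle the money is actually spent.

LIFECYCLE_STAGES = ["develop", "deploy", "manage", "integrate", "evolve"]

def lifecycle_cost(costs):
    """Total cost of ownership summed across all lifecycle stages."""
    return sum(costs.get(stage, 0) for stage in LIFECYCLE_STAGES)

# Hypothetical: free/cheap licence, support revenue loaded onto manage/evolve
open_support_model = {"develop": 40, "deploy": 10, "manage": 30,
                      "integrate": 15, "evolve": 20}

# Hypothetical: expensive licence folded into develop/deploy, cheaper support
licence_model = {"develop": 90, "deploy": 25, "manage": 10,
                 "integrate": 20, "evolve": 15}

print(lifecycle_cost(open_support_model))  # 115
print(lifecycle_cost(licence_model))       # 160
```

With these made-up numbers the support-based model is cheaper overall despite costing more in the manage/evolve stages - exactly the comparison the customer needs to be able to make.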

The most compelling aspect of using free and open source software is the speed of evolution - software is released early and often, and successful open source projects have thousands of expert developers participating to test, use and improve usability, functionality, design and so on.

Open source projects on both proprietary and non-proprietary operating systems already have free automated testing tools that have improved software development lifecycles, which means more people are writing better software - on top of the impact of the free peer-group knowledge sharing taking place on the internet.
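For a flavour of how cheap this kind of automated testing has become, here is a minimal sketch using Python's bundled unittest module (one of those freely available tools); the parse_version function is a made-up example, not taken from any real project.

```python
import unittest

def parse_version(s):
    """Split a dotted version string like '7.04' into integer parts."""
    return tuple(int(part) for part in s.split("."))

class ParseVersionTest(unittest.TestCase):
    def test_parts(self):
        # '7.04' parses to integers, so leading zeros don't matter
        self.assertEqual(parse_version("7.04"), (7, 4))

    def test_ordering(self):
        # tuple comparison gives correct version ordering for free
        self.assertLess(parse_version("6.10"), parse_version("7.04"))

# Run the suite programmatically rather than via unittest.main()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ParseVersionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A test file like this costs minutes to write and runs on every change - the kind of safety net that used to require expensive proprietary tooling.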

Moving your value proposition from product to service means you will need to be aware of all these aspects in order to provide a solution that competes with the rest of the marketplace - smaller open source projects therefore risk being inferior to proprietary solutions and can be dangerous to rely upon.

A blended approach is to build proprietary expert tools that leverage and integrate with both open source and proprietary software and are faster and superior to any current alternative - a variety of business models exist to facilitate how this can be executed.


Check out this product from Sourcefire which is an example of a blended proposition.

Java set free

This is quite a milestone for the IT industry and community as a whole. Big pay-off too for those who have been betting on this move.

After sustained pressure from various quarters - notably IBM - Sun will now fulfil the promise made by its new CEO, Jonathan Schwartz, to release Java under the GPL.

One thing I've not yet checked is whether the JVM is also being released under the GPL - that would be the icing on the cake for many eager software engineers out there!

More here.

I wonder if Mono will maintain an interest in the community now that this has occurred.

What will IBM do now that they've got what they wanted?

Who else cares?

Above are some of the thoughts going through my head right now. 

UPDATE 18/5/07: 

Does Sun's open-sourcing of Java have an impact on the way Google views Java as a development platform?

"It doesn't change how we're looking at it, but it does increase the utility of Java for us. So before they had released Java as GPL, we had signed a source code agreement with them where we could give them patches and bugs and all this other stuff—because we have a lot of fairly advanced Java development going on at the company. We have folks like Joshua Bloch working for us and he's a very prominent Java developer and he's involved in the Java Community Process very heavily.

So we always had a way of getting patches in and some features developed. So that was fine for us. But with it being open source, it's actually better for us in a lot of ways, because we can access certain parts of the code in ways we couldn't before. And we can fix them and offer those fixes up without as much ceremony around submitting those patches and features. We can say, OK, it's an open-source project so we can just release this stuff. That's incredibly freeing for us. So we were very happy to see them go GPL there." 

Mueller's "No Lobbyists As Such - The War over Software Patents in the European Union" now available
Florian Mueller, the founder of the award-winning campaign, has published his memoir-style book, "No Lobbyists As Such - The War over Software Patents in the European Union", on the Internet.

Over 377 pages, Mueller tells the story of the legislative process that ended in July last year with a landslide vote of the European Parliament against a proposed software patent directive.

This is an excerpt from the introduction to the book, in which Florian Mueller chronicles the events leading up to one of the most significant legislative decisions in European law this decade, and one of immense impact on our industry.

"On July 6, 2005, the world of politics turned upside down. Big money was dealt a blow.

The European Parliament threw out legislation that the world's largest IT companies badly wanted. Under the pretext of protecting inventors against plagiarists, it would have handed those giants sweeping powers over Europe's high-tech markets. An electronic roll-call vote thwarted the wicked plan in a matter of seconds, but that decision was preceded by years of intense fighting."

Readers of this blog will be aware that I follow this subject - I'll never forget that later that day, following my post to the blog, we suffered the London bomb attacks and lost Colin Morley.

Anyhow, I highly recommend you read this book.

Inspirations on Singularity and opportunities with Empathy
One of the areas that has kept my interest and curiosity in recent years is that of collective consciousness.  Having seen the power of online portals and how they improve collaboration, it was fascinating to watch these web presences evolve - document sharing, forums, text chat rooms, blogs and wikis, for example, becoming mainstream and accessible to the online community / global village through proprietary, free and open source software.

During this phase of portal evolution the value of search emerged - then came the revolution in participation architectures and tagging - otherwise known as folksonomy.

Recently Ray Kurzweil was featured in the news regarding the concept of the Singularity.  If you're not familiar with the Singularity then this link should help.  Until such time as artificial intelligence develops powerful empathy, there may be a growing requirement for a collective human consciousness in communities, geographic and non-geographic regions, societies, organisations and associations.

Leveraging the thought-leadership of George Por's community-intelligence I'd like to see a standard emerge for exchanging 'consciousness' between Communities of Practice.

My ideas have meandered over the years (please note I no longer participate with this organisation) but I'd like to see some sort of working group, at OpenBC for instance, establishing open standards around this type of thinking.  OpenBC is slowly acquiring the participation of some of Europe's most promising entrepreneurs and technologists who would be fundamental to achieving a European Collective Consciousness.

Once that has been achieved we could look at how it might scale to a peer-to-peer model of consciousness between communities of practice.  One model that could follow would see an organisation's various CoPs creating a shared consciousness, which in turn would be accessible via strategic partners.

OpenBC, itself, could be that organisation, so could Google, Microsoft and others.

This is not another expert system, nor a search engine, portal or social network, nor is it an attempt to usurp RSS and/or RDF, the drivers of much integration and interaction between disparate data on the web - in fact I'd anticipate it would create vocabularies of RDF.  What this would be is a modular, real-time mind, integral to decision making, delivering significant risk mitigation and impact awareness during decisioning by connecting the tangible with the intangible - powering automated decisioning with collective empathy.  That is something that could help e-Government govern better, for example.
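To make the RDF idea concrete, consciousness exchanged between Communities of Practice could be expressed as plain subject-predicate-object triples. The toy graph below uses an entirely hypothetical cop: vocabulary and made-up identifiers - no such standard exists, this is just a sketch of the shape such an exchange format might take.

```python
# A minimal in-memory triple store, RDF-style: each fact is a
# (subject, predicate, object) tuple. The 'cop:' vocabulary terms
# and all identifiers here are invented for illustration.

triples = {
    ("ed",          "cop:memberOf",      "openbc-cop"),
    ("alice",       "cop:memberOf",      "openbc-cop"),
    ("openbc-cop",  "cop:sharesInsight", "insight-42"),
    ("insight-42",  "cop:derivedFrom",   "alice"),
}

def objects(subject, predicate):
    """Query the graph: every object attached to a subject by a predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

print(objects("openbc-cop", "cop:sharesInsight"))  # {'insight-42'}
print(objects("ed", "cop:memberOf"))               # {'openbc-cop'}
```

Because each fact is self-describing, two communities could merge their graphs simply by taking the union of their triple sets - which is essentially the interoperability property that makes RDF attractive for this kind of exchange.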

What follows from this, for the commercially minded, is micro-transactions and their inherent value: how would you easily and comprehensively value transactions of consciousness?  Once again the theme of taming complexity emerges from this requirement.

I'd love to hear your views on this, learn about groups you'd think would be worth checking out and also links to any content you feel would support these enquiries.

Bruce Sterling in London

I recently had the good fortune to attend a talk by Bruce Sterling here in London, hosted by the New Statesman.  It was thanks to Dave Green, who publicised the event at NTK, that it came to my attention.

With the kind help of the New Statesman's online manager, Kathryn Corrick, I was sent an invite to join them at The Grouse and Claret, a charming pub that served Fursty Ferret: a fine ale and highly recommended ;-)

Bruce Sterling is rather well known, although I must confess I'd only just discovered him last year, when Jonathan, the chairman of one of my projects, remarked that something we'd discussed was very 'Bruce Sterling'.  I looked at him dumbfounded.  Jonathan enquired whether I read for pleasure, to which I replied "only on holidays, I don't have time", so he suggested I take a break from my research and thrust his copy of Bruce Sterling's book Heavy Weather into my hands.  I'll never forget that as I was reaching the climax of the book, a certain hurricane took a 90-degree turn and headed for New Orleans - spooky.

Arriving early I managed to get in a pint of the Ferret before taking my place in the room while the New Statesman team prepared for Bruce's talk.  I can't tell you how lucky I felt when Bruce and friends decided to sit with me and that's when I got my chance to take a photo of the famous chap reading the article about Paul Wolfowitz in the latest edition of the New Statesman.

I guess I'll be frank about the talk and say I was more than au fait with what Bruce spoke about, so there were no 'ahh' moments for me.  That said, there's nothing better than capturing an expert wordsmith's soundbites, so here are the ones I felt were worth noting down:

"They [Business] don't talk benefits" + "PR people talk about benefits" + "only to consumers"

In relation to Wikipedia-type communities and their architecture of participation:
"Free labour" + "Radical de-centralisation" + "Mass servicing of micro-markets"

Bruce was comfortable using the term "Web 2.0" and gave a great deal of respect to Tim O'Reilly for coining the phrase.  He remarked that a web 2.0 site had "minimum content to support the brand" and the focus is "You" the visitor, again remarking about 'free labour' in the context of the way tagging is carried out by the visitor using Flickr as a specific example.

Another good point about web 2.0 sites - obvious though it is as a key indicator of a successful web 2.0 company - is that as the site gets bigger it gets better: web 2.0 sites must be able to grow, i.e. increase in membership and participation, and those that do provide even more value than those that don't.

He noted that Google will probably offer a real-estate broking service following its recent acquisition of SketchUp.

Here are some of Bruce's web n.0 trends / change drivers:

  • Socially generated internet knowledge (which complements my stance on knowledge commoditisation)
  • Interactive chips - RFID technology (one of my current fascinations)
  • Real-time locative systems - Geo-location mapping etc.
  • Traceable objects
  • New search tools
  • Cradle-to-cradle recycling
  • 3D virtual objects / modelling - such as Google's SketchUp
  • Rapid prototyping
There were also some lovely words that followed, such as 'spime', 'metaverse', 'benevolent magic elves', 'ambient findability' and my favourite, 'fractal shape of the internet', which Bruce linked to intrinsic benefits.
This all resonates with much of my research so I had a wry smile on my face when he was talking to the audience.
I even got to ask Bruce a question about what his hopes and fears would be as we emerge towards a collective consciousness.  I felt he did not answer my question as well as I would have hoped, but I guess too many people have played that one out already in sci-fi literature.  It is, however, one of my key interests, certainly in relation to communities of practice, peer2peer technologies and facilitating a P2P3C: {CoP.Col.Con.0 : CoP.Col.Con.n}  ;-)

Fortunately, if this has whetted your appetite, there's a podcast of the talk (45 mins) and Q&A, courtesy of the New Statesman team, available here.



Lee's logo-fest of Web 2.0
Lee Wilkins has kindly put together this Web 2.0 logo montage - I couldn't resist borrowing it!
Do you know what each of them does?  You ought to!
My favourite is Podzinger, which licenses some smart voice technology to deliver its offering.
UPDATE: Many thanks to Jason Moon for his recommendation of this link: web2logo. At the site they've got a nice tag cloud to help you contextually navigate the logos according to the theme of the business behind the logo, simple yet smart :-).
Patent headaches
I've just caught sight of an article over at the Inquirer that is rather alarming, extract here:
"Balthaser Online says getting the patent means that it can license nearly any rich-media Internet application across a broad range of devices and networks.

It means that anyone who wants to use Flash, Flex, Java, Ajax, and XAML could face a licensing fee from Balthaser when their site goes up."

Posted: Feb 23 2006, 03:32 PM by Ed Daniel | with no comments
AfterMail acquired by Quest Software

"Quest Software has acquired AfterMail Limited. The quality of the AfterMail team as well as the depth and completeness of the AfterMail product were key factors in our decision to proceed with this acquisition. Quest is a mature, rapidly growing, worldwide company, with a broad range of products that can help you improve the performance and productivity of your enterprise IT.

The AfterMail development and support team remains intact and will be complemented by Quest’s global support organization.

David Waugh Vice President of Product Management
Infrastructure Management Solutions"

Well done Rod, Mike, Geoff, Tim and the rest of the AfterMail team.  It's amazing to think we sat in front of Rod only two years ago discussing strategy for creating a footprint in mainland UK and the rest of Europe.  We're delighted to have supported, along with several other AfterMail commercial partners, a viable and exciting proposition that has now led to the acquisition of the AfterMail business by Quest Software.

If you've been reading this blog for some time you'll remember that one of the criticisms voiced was the viability of AfterMail, considering the size and age of the organisation; now, with the backing of Quest, those reservations are no longer valid :-)
