My experiences with .Net
I've written a quick summary of my understanding of the core Azure platform at http://www.indiangeek.net/2008/11/14/windows-azure-distilled/
A few years ago I was a firm believer in the Rich Connected Client application model, which was based on running applications installed locally on the user's desktop. Since the Ajaxian explosion, the quality and quantity of Ajax-based web applications has continued to increase; applications like FaceBook have introduced new paradigms, while apps like Live Maps have made existing applications much more convenient and accessible. Today you have to argue really hard to even consider a desktop-based application for anything that is not computation-intensive (and even this category is questionable now; e.g., a few years back web-based movie editing apps would have been out of the question). So what is it that makes the web such a successful application platform?
- Uniform and simple model (web browser, URLs, click where the hand cursor appears) - Once a user learns the basics of working with one web application, that knowledge can be easily applied to other applications.
- Client platform independence - The decoupling of the server and client via an agreed contract (HTML+CSS+JS) means that the traditional problem of targeting various platforms with different APIs no longer exists on the client side.
- Machine independence - The user is no longer restricted to the machine on which the application was installed. This also results in a much simpler deployment model.
- Data independence - The user's data is now available on the network which means that not only can the user run the application from anywhere but can also access his data from anywhere.
- Full use of computing resources available locally - Having a powerful CPU and GPU seems like such a waste when all your applications have to be funnelled through the browser. So the next generation platform would allow access to the computing power available locally.
- Better integration with the local resources - This is sort of related to the point above, but would allow internet applications to access local disks, settings, registry etc.
- Better security model - Of course, all this has already been attempted with ActiveX and XPCOM, but the security models there were weak and non-intuitive to users; a better solution is needed.
I was thinking on the way to work today that Subversion would be a great tool for overcoming some of the difficulties associated with frequent deployments to the web servers. Here's how I see it working:
- Create a production/live build folder in your source tree and add it to the repository.
- Modify our build system to create the live builds in this folder and commit them to the repository.
- On the live server the site is deployed as a checkout of the live build folder.
- Once a build passes unit tests and QA, all we need to do to deploy is update the working copy on the live server. The big advantage here is that rollbacks are handled automatically, because we can always update back to a previous revision. You also get a nice history of all the updates to the live server.
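The steps above could be sketched roughly as follows. The repository URL and paths here are made up, and `run()` just echoes each command rather than executing it, since this is a sketch and not a script against a real server:

```shell
#!/bin/sh
# Hypothetical sketch of the svn-based deployment workflow described above.
# run() prints each command instead of executing it.
run() { echo "+ $*"; }

REPO=http://svn.example.com/repo

# 1. One-time: create the live-build folder in the source tree.
run svn mkdir "$REPO/trunk/live-build" -m "Add live build folder"

# 2. The build system commits each build into that folder.
run svn commit live-build -m "Live build r1234"

# 3. One-time on the live server: deploy the site as a checkout.
run svn checkout "$REPO/trunk/live-build" /var/www/site

# 4. Every deployment after that is just an update of the working copy.
run svn update /var/www/site

# 5. A rollback is simply an update to an earlier revision.
run svn update -r 1233 /var/www/site
```

Because the live site is itself a working copy, `svn log` on the server doubles as the deployment history.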
Having worked with programmers of extremely varied skill levels, I sometimes get the feeling that there is a big shortage of good programmers. But when I thought about it a little more, I realized that it's not so clear cut: some programmers have strong areas, and if you confine their tasks to those strong areas, they tend to deliver well. So I started thinking about all the lines along which we can evaluate a programmer; here's what I have so far...
Programmer Competency Matrix (the table is too big to fit in this blog post and needs a whole page of its own)
After having spent a whole afternoon on this, I realize that even this is not comprehensive. The matrix is also biased towards non-visual programmers, so a big majority of web devs will not be able to relate well to it, but I am tired and will come back to this at a later time.
I recently came across an old article that I had written for my company newsletter; it's always fun to discover old stuff that you've written and see how much your perception has changed since then. Copied verbatim below. This was a print article, which is why the links are not hyperlinked.
SearchMe - Still in beta, but I was able to get an account easily. Check out the screenshot of their search results.
For the past few days I've been investigating some memory leak issues in our desktop application. The problem showed up when we saw that opening new documents and then closing them didn't bring memory usage back down. Initial tests using vadump and Process Explorer confirmed that there was an issue, and so we developers started looking into it.
Last week all of us were baffled when one part of our application, which uploads files to an FTP server, suddenly stopped working. The strange thing was that the same build had been working without any issues for the past week. We looked at everything that could have gone wrong - server, configuration, code - but everything was set up fine and hadn't been changed. Also, interestingly, it stopped working for everyone except the developer who was responsible for the feature.
The first thing we did was enable detailed logging to see what was happening. The logs showed two problems:
- We were incorrectly formatting the path of the file to upload.
- After login, the .Net framework code was changing to the root folder of the FTP server, where it didn't have permission to upload the file.
Then I remembered that a critical hotfix for .Net 2.0 had been issued last week - could this be the issue? We verified that the developer's machine didn't have the hotfix while all the failing machines did: Strike 1! Next we uninstalled the hotfix from one of the machines and the FTP uploads started working: Strike 2!! Finally we fixed the incorrect formatting of the FTP URL and the issue was resolved on all machines, with or without the hotfix: Strike 3! Issue resolved!
The problem was that the hotfix changed the implementation of the FTP code inside the .Net framework, so that it behaved differently when passed an incorrectly formatted URL.
This was the first time I saw a working app fail because of the way an incorrect argument was handled by a newer version of the framework. It was a good learning experience though :) It also strengthens my belief in asserting all assumptions in code, because if we had asserted that the URL was in fact in the format we were expecting, this issue would never have happened in the first place.
I read a very interesting essay today - Hacknot - To Those About to Hack
that talks about why planning upfront always pays off in the long run. There is a very nice story in it that illustrates the value of upfront planning.