Archives / 2003 / October
  • MSBuild's dirty little secret...

    Something that Microsoft's been failing to note in all the hype over MSBuild is that Whidbey will not have MSBuild support in the C++ IDE.  I attended the MSBuild session, and this most certainly was not mentioned (a glaring omission, I believe).  It only came up in a casual conversation with an MS staffer: I mentioned how psyched I was about this tool and how it was going to make my life easier, and he got a chagrined look on his face and admitted that there was no C++ IDE support for MSBuild.

    He did note that the C++ IDE would support it in the future, but that future isn't Whidbey.

  • MSBuild

    I usually end up doing the build and release engineering on whatever project I am on, so I’m actually a bit excited by MSBuild.  True, it’s a nAnt copy, which is, in turn, a copy of Ant, but what is imitation if not the most sincere form of flattery? (I’m relatively certain that MS didn’t use this line in the anti-trust suit.)

    One of the complaints about VS.NET 2002 was its lack of support for a true ‘build’ environment.  One could no longer export to nmake, though that was never a preferred solution in the first place.  C# and VB were also considerably weakened by the fact that Visual Studio did not support pre- and post-build events, though VS.NET 2003 later added this ability for C# projects.  Elaborate hacks and an elegant VS add-on were invented by the community, but it wasn’t an ideal situation.  Several people used Java’s Ant to build .NET solutions, and the nAnt project was born to copy the idea of Ant to the .NET world.  While careful observers will point out that nAnt predates the release of Visual Studio.NET, the project has only recently gained momentum and widespread use.

    For those who aren’t familiar with either tool, both nAnt and Ant provide an XML-based declarative build environment, with the ability to easily create customizable add-ons.  Pretty cool stuff, and much easier to work with than the configure/make authoring common in the UNIX and gcc worlds.

    nAnt is a pretty nifty project, but it suffers from a major drawback.  Its VS.NET integration is tenuous at best, and the team that works on making this happen is really at the mercy of any potential changes to the .sln and .*proj files made by the Visual Studio teams.  Many teams that work with nAnt don’t rely on this (fr)agile integration, and instead maintain separate VS.NET and nAnt build solutions.  This is, of course, an ideal way to introduce build-time bugs when developers make a change (such as adding an assembly reference in Visual Studio) and don’t update the corresponding build file.

    Enter MSBuild.  Microsoft is making the promise that their development tools will generate build files that the MSBuild engine is able to consume.  In addition, these files will be easy to extend, so that you may create your own complex, automated build process completely supported in the Visual Studio environment.  Is there a task you could really use that Microsoft didn’t include?  Do you have to sound the system bell if there is a build failure?  Simply write your own Task by extending the Task class.  Sound familiar, nAnt users?
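    If you're curious what that might look like, here's a rough sketch in C#.  I'm assuming a Task base class with a bool Execute() override and a Log helper, along the lines of what was demoed; the exact names in the shipping bits may well differ, and RingBell is just a made-up example.

        // Rough sketch of a custom MSBuild task (assumed API: a Task base class
        // with a bool Execute() override and a Log helper; names may change
        // before Whidbey ships).
        using System;
        using Microsoft.Build.Utilities;

        public class RingBell : Task
        {
            private int count = 1;

            // Exposed as an attribute on the task element in the build file.
            public int Count
            {
                get { return count; }
                set { count = value; }
            }

            public override bool Execute()
            {
                for (int i = 0; i < count; i++)
                {
                    Console.Write('\a');   // sound the system bell
                }
                Log.LogMessage("Rang the bell {0} time(s).", count);
                return true;               // true means the task succeeded
            }
        }

    Presumably you'd then reference the assembly from your project file and call the task from a target, much the way nAnt wires up its custom tasks today.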

    In addition, MSBuild will ship with every OS starting with Longhorn.  Actually, they’ll be shipping the .NET compilers there as well.  Now my Mom can drop down to the command line and compile her latest VB project without running out and purchasing Visual Studio.NET.  Go, Mom!

    Expect tool vendors to start generating build environments that are fully compatible with, and extensible through, MSBuild as well.  If they don’t, insist that they do.  It’s going to make your life so much easier.

    Kudos to Microsoft for MSBuild.  Once again, “Opening” a proprietary format and making it extensible really paid off here.

  • Weblogger BoF

    Well, a few other people have posted about the Weblogger's BoF, with mostly good stuff to say.  Frankly, I was rather disappointed.  I realize that bloggers are by nature “Type A” personalities, but c'mon!  Clemens Vasters talked far too much about himself (really!).  He also spent far too much time gloating over how much Das Blog is better than .Text.  Perhaps it was in jest, perhaps not; it was difficult to tell.  It was also difficult to hear anyone else speak, as Vasters did not often yield the floor.

    I did enjoy listening to Marc Canter challenge the group; he had a few interesting ideas pertaining to blog-reviewing instead of blog-editorializing.  His main point was basically that many, many people will write a review of a product, movie, etc., but relatively few (only a few million!) people will blog-editorialize.  He suggested building solutions around this new potential blog “market.”

    Unfortunately, most of the conversation (nay, argument!) would have been better served over a few beers at a local bar than in front of a room full of people looking for something interesting in the technical state of blogdom.

    Robert did a good job of putting this together; I'm just sorry that it wasn't more.

  • Destination... PDC

    As most other people have mentioned, getting into LA today has been an adventure - a one-hour flight delay for me in Atlanta waiting for weather to clear, and then another hassle coming into LA because of the forest fires - apparently, the fires weren't a threat to LAX, but to a 'flight control facility' located elsewhere, and there weren't enough flight controllers to handle the traffic into the LA area.  Fortunately for me, my flight was carrying precious cargo (to be specific, we had human kidneys on board being transported for a transplant).  That gave us priority to land at LAX, and so here I am at the Wilshire Grand.  If anyone wants to ring me, just call the Wilshire (rm 1082).

  • Inside Information

    Chris Sells blogs on getting the inside scoop.  In particular, he points out that he occasionally used to ask friends inside Microsoft for info, after exhausting other reasonable means.

    Now, I'm just a regular Joe without 'friends' at MS, but I've noticed that every time I've contacted Microsoft with a reasonable question, the people on the other end have jumped through hoops to assist me.

    How many of you have complained about MSDN documentation?  I know that I have (my response is under the pseudonym 'jerdenn').  I've sent similar messages to Microsoft support on several occasions, and not only do I usually get a message back acknowledging the problem, but I also receive a detailed proposed solution (including code).

    A while back I was interested in the .NET Speech SDK, before the beta was publicly available.  I emailed the Speech SDK team, and they actually offered to send me an alpha version of the software.

    Really, I've always had a good experience in this realm - as long as you've got a legit purpose, people are going to go that extra mile.  Just don't expect them to do your homework for you.

  • Whidbey brings 64-bit .NET

    One of the often-overlooked advantages of the .NET platform, when compared to traditionally compiled code, is that JIT-ed code can be optimized at run time rather than at compile time.

    For example, let’s say I write a new program; we’ll call it super-notepad.  I make two versions of it, one in C++ and another in any .NET-capable language.  We’ll call the two programs NativeNotePad and .NETnotePad.

    With NativeNotePad, I can only take advantage of the CPU-specific optimizations that my particular compiler currently supports.  Why is this important?

    The 64-bit platform is in the near future.  Instead of spending a lot of time spelunking through code to re-engineer my memory manipulation and any pointer arithmetic I wasn’t CPU-agnostic in writing, I can compile once and run on any Windows platform.  It doesn't quite meet Java's promise of platform agnosticism, but it's a big step.
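    A trivial illustration of my own (nothing official): the same managed exe, compiled once to IL, picks up the pointer size of whichever CLR loads it, as long as pointer-sized values were declared as IntPtr rather than int.

        // Compiled once to IL; the pointer size is whatever the loading runtime provides.
        using System;

        class PointerSizeDemo
        {
            static void Main()
            {
                // 4 on a 32-bit runtime, 8 on a 64-bit runtime; no recompile needed,
                // provided pointer-sized values were declared as IntPtr, not int.
                Console.WriteLine("IntPtr.Size = {0} bytes", IntPtr.Size);
                Console.WriteLine("Running as a {0}-bit process", IntPtr.Size * 8);
            }
        }

    The native version, by contrast, gets whatever sizes its compiler baked in at build time.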

    With JIT-ed code, the vendor (Sun or Microsoft) makes an implicit promise that your code is portable into the future.  While Microsoft may promise that your native 32-bit applications will still run on a 64-bit platform, they will normally still have to hit a ‘thunking’ layer, with the associated steep performance impact.  Anyone still working with 16-bit applications and the NTVDM will completely understand.  With .NET, there should be no such thunking.  I would expect that .NET applications will run far faster than 32-bit apps in 64-bit land.  We’ll find out soon, as .NET 64-bit support will be available in Whidbey.

    Sometimes I really wonder why more Windows desktop developers aren’t working in .NET.  It would certainly save them a world of work porting their old applications to a new CPU architecture.

    For .NETnotePad, the move doesn’t even require a recompile.  It’s Microsoft’s job to make certain that 64-bit .NET “just works”.

  • Everywhere I look, there's Paul Wilson.

    I picked up a copy of Dino Esposito's “Programming Microsoft ASP.NET” yesterday - I've only read a bit of it, but it seems pretty good so far.  What was interesting to me was reading the credits in the foreword:

    "Next are the 24x7 people, available at all times, who in a way or the other, directly or indirectly, contributed tips, tricks, suggestions, advice, gotchas, or simply their own work that I used as a resource. They are, in no particular order, Jeff Prosise, Jason Clark, John Lam, Francesco Balena, Jeffrey Richter, Peter Debetta, Berni McCoy, John Robbins, Don Kiely, and Paul Wilson."

    As I've been reading the book, I see a few spots where Dino has been influenced by some of Paul's ideas.  It's got to be something to be someone who helps shape the way 'things are just done.'

    I also think it's quite an honor to be mentioned in the same breath as Richter and Prosise!  Congrats, Paul...
