~mkw

Average guy, above average luck...the blog of M. Keith Warren
New Rule and Old Rule

New Rule (for blogging on this site):

Posts which allude to some great technical silver bullet or really cool demo and say nothing more than 'it was cool' are not allowed. Unless you say something specific you are just making noise and frustrating those of us who aren't in the knowledge loop. Saying you cannot talk due to an NDA is even more ridiculous, because a basic tenet of secrecy would lead the keeper of such a secret to avoid divulging that they even know it.

(read: this sucks, I want to know what you guys know...)

Old Rule (for life and development):

Bullets will almost always kill you, or hurt really bad...even if they appear silver.

VS2005 RC is here

I just noticed that the Release Candidate for Visual Studio 2005 is on MSDN downloads!

Let the bandwidth drain begin!

UPDATE: It has suddenly disappeared...ARGHHH

OK, up and moving now @ 100K

Where is the real interface?

OK, I went and downloaded Vista Beta 1 and played with it for a while. There are some nice things, but all in all it is an incremental improvement over XP. My beef, though, is with the interface and the hype. For years I heard about this super secret 3D interface being developed; it was supposed to make OSX users jealous and make users want to run out and buy the product when released.

 

Someone please tell me that this is not it; is that seriously the best you've got? I work with OSX almost every day and tell people that Longhorn should have a UI that makes Cupertino shake, but this makes me look like an idiot.

 

Scoble? Someone? Please tell me the goods are still hiding in Redmond and you are going to pull some last second, “Oh yeah, here is the fancy stuff.”

 

 

443 <--> 80 - Seamlessly moving requests in and out of SSL

Sometimes you feel secure, sometimes you don’t. Better put, sometimes a page needs to be secured and sometimes it does not.

 

One of the things I wanted to do on a recent project was avoid unnecessary page encryption when the content did not require it. This may sound like a silly problem, but when you consider that in the logical click stream of a user they may go from a page with sensitive data to a non-sensitive page and then back and forth between pages that contain secret information, you can see where you are wasting cycles encrypting pages that don’t need it.

 

This seemed to me like a common problem and I expected that IIS would have an easy way to deal with it. While IIS does allow you to require SSL for a specific file, it does not fail with elegance. By that I mean that when you visit a page which requires SSL over a normal HTTP connection, you get a server error (HTTP status codes 403.4 or 403.5, I think) telling you the page must be viewed securely. For some users this is not a big deal, just make the change to the URL, but most people get really confused at this point; and heck, if the darn thing knew it needed to be secure then why not just become secure? Furthermore, considering this challenge outside my personal scope, the IIS route did not seem the best path because in lots of cases developers don’t have access to make IIS changes. So I ventured to find a way to make my application do this, quite sure that ASP.NET had some great, built-in functionality which would do it for me; after much searching I came to the conclusion that Request.IsSecureConnection is as good as it gets in the framework.

 

Other people have proposed solutions in the past; today I even ran across one which prompted me to write this. Matt Sollars has an excellent two-part article on Code Project which details his solution to this problem using HttpModules and extending the ASP.NET configuration.

 

I actually rolled a solution similar to Matt’s but was unhappy with its general complexity; I wanted something simple, and the problem scope seemed so limited that there had to be some way to achieve this in a relatively performant manner without having to write a lot of code.

 

OK, that is a lot of build up, now to the point…

 

I found a way, by extending the Page class, to automatically move people in and out of secure pages with as little as one line of code per page! Here is how you do it.

 

The first thing you will need to do is add some code to your base Page class; almost every single ASP.NET tips/tricks/good-practices/yada/yada/yada article tells you that you should extend System.Web.UI.Page with common functionality; if you are not doing this already, shame on you.
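If you are starting from scratch, here is a minimal sketch of that arrangement (BasePage and CheckoutPage are placeholder names of my own; only the inheritance chain matters):

      // Every page in the site inherits from a shared base class instead of
      // inheriting System.Web.UI.Page directly.
      public class BasePage : System.Web.UI.Page
      {
            // the field, property and method described below go here
      }

      // An individual page then derives from BasePage and picks up the shared behavior.
      public class CheckoutPage : BasePage
      {
      }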

 

To the base page class add a private boolean field to store the data indicating whether a page is secure:

 

            private bool _RequireSSL;

 

 

Also add a property which wraps this field:

 

      [Browsable(true)]
      [Description("Indicates whether or not this page should be forced into or out of SSL")]
      public virtual bool RequireSSL
      {
            get { return _RequireSSL; }
            set { _RequireSSL = value; }
      }

 

 

Note: You will notice that the property is decorated with a couple of attributes. The first, Browsable, tells VS.NET to show this property in the design-time Properties window, allowing you to set its value there; doing this can make things a bit easier and can even save you from writing the single line of code per page needed to implement the functionality, since setting the property in the designer effectively writes that code for you. The Description attribute tells VS.NET what text to show at the bottom of the Properties window when the property is selected.

 

 

Next, we are going to add the actual method which will do the magic. You will notice that this method has two other attributes. The first tells VS.NET to skip over this method when debugging; there is no need to step into it, it works. The second indicates that we only want this code to run when we have compiled with a SECURE compilation constant defined; this saves us from having to deal with SSL certificates and such on development machines, as we can define that constant only in the build configurations that will be deployed to an environment with a certificate, such as staging or production.
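As a quick aside before the method itself, here is a minimal sketch of how such a constant might be defined (the file name in the csc example is just a placeholder):

      #define SECURE   // must appear at the very top of the source file, before any code

      // Alternatively, add SECURE to the conditional compilation symbols of only
      // those build configurations that will run behind a certificate, e.g.:
      //    csc /define:SECURE BasePage.cs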

 

      [System.Diagnostics.DebuggerStepThrough()]
      [System.Diagnostics.Conditional("SECURE")]
      private void PushSSL()
      {
            const string SECURE = "https://";
            const string UNSECURE = "http://";

            //Force required pages into the secure channel
            if (RequireSSL && !Request.IsSecureConnection)
                  Response.Redirect(Request.Url.ToString().Replace(UNSECURE, SECURE));

            //Force non-required pages out of the secure channel
            if (!RequireSSL && Request.IsSecureConnection)
                  Response.Redirect(Request.Url.ToString().Replace(SECURE, UNSECURE));
      }

 

The logic here is quite simple: if the RequireSSL property is set to true and the request is not a secure connection, we need to perform a redirect. That redirect takes Request.Url, which is the full URL of the request, converts it to a string, replaces http:// with https:// and sends the user on to the https version of the page. The second conditional statement does the same thing in reverse, taking a user out of SSL if the page is not required to be secure. You could toy with the actual string replacement if you wish; for example, you might want to replace only the first few characters of the string, in case some form of "http" is embedded later in your URL (maybe you have other URLs inside your URL). That is up to you; for the sake of making it as easy to understand as possible I chose the simplest route.
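If you do want something more surgical than a blanket Replace, one hedged alternative (purely illustrative, not part of the solution above; SwitchScheme is a helper name of my own) is to rewrite only the scheme with System.UriBuilder:

      // Illustrative helper: swap only the scheme so an "http" that appears
      // later in the URL (say, inside a query string value) is left untouched.
      private static string SwitchScheme(Uri url, bool secure)
      {
            UriBuilder builder = new UriBuilder(url);
            builder.Scheme = secure ? "https" : "http";
            builder.Port = secure ? 443 : 80;   // assumes the default ports
            return builder.Uri.ToString();
      }

PushSSL could then redirect to SwitchScheme(Request.Url, RequireSSL) in both branches instead of doing the string replacement inline.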

 

Now we have our field, our property and our method; the only thing left is the implementation. To make this work for our pages we need to override OnInit in our page class…

 

      protected override void OnInit(EventArgs e)
      {
            base.OnInit(e);
            PushSSL();
      }

 

As you can see from the code above, we are really only adding to OnInit, not changing its behavior, since the first thing we do is call the base member. Our second line calls our method, which checks the page and moves it in or out of the secure channel as needed.

 

Now we have all of the pieces in place. To implement this on an actual page there is really only one line of code, which you can write yourself or let VS.NET write for you. An uninitialized boolean defaults to false, so unless you are trying to make a page secure there is no reason to do anything at all. In the case that you do need to make a page secure, set the RequireSSL property to true on the page; this should be done in the InitializeComponent method…

 

      private void InitializeComponent()
      {
            this.RequireSSL = true;
            //Other initialization code would be here also
      }

 

The setting of this property can also be achieved by pulling up the design time properties of your page, navigating to the Page member in the property drop down list and setting the property manually. This will write the line of code for you.

 

Normally this is where professional writers recap and wrap up, but I have pretty much said all there is to say. It works; it is not perfect, but it does the job. If you are into this kind of thing I would also suggest looking at Matt’s article and deciding which solution is best for you.
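For convenience, here are all of the pieces above pulled together into a single sketch of the base class (BasePage is a placeholder name; the members are exactly those shown earlier):

      using System;
      using System.ComponentModel;
      using System.Web.UI;

      public class BasePage : Page
      {
            private bool _RequireSSL;

            [Browsable(true)]
            [Description("Indicates whether or not this page should be forced into or out of SSL")]
            public virtual bool RequireSSL
            {
                  get { return _RequireSSL; }
                  set { _RequireSSL = value; }
            }

            protected override void OnInit(EventArgs e)
            {
                  base.OnInit(e);
                  PushSSL();
            }

            // Only runs in builds where the SECURE constant is defined
            [System.Diagnostics.DebuggerStepThrough()]
            [System.Diagnostics.Conditional("SECURE")]
            private void PushSSL()
            {
                  const string SECURE = "https://";
                  const string UNSECURE = "http://";

                  if (RequireSSL && !Request.IsSecureConnection)
                        Response.Redirect(Request.Url.ToString().Replace(UNSECURE, SECURE));

                  if (!RequireSSL && Request.IsSecureConnection)
                        Response.Redirect(Request.Url.ToString().Replace(SECURE, UNSECURE));
            }
      }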

 

related stuff to check out:

 

MSDN on Conditional Attributes

MSDN on the Browsable Attribute

MSDN on the Description Attribute

The Debugger Step Through Attribute


Matt Sollars' solution @ Code Project


HTTP Status Codes

Extending System.Web.UI.Page

 

 

AppleMatters: How Microsoft Will Die...

http://applematters.com/index.php/section/comments/423/

Sometimes when I read stuff like this I seriously think it was originally written for “The Onion”.

Having a point-by-point argument with this guy would be like beating up a kid in a wheelchair, but I thought I would bring up a few of the wackier points for good laughs.

“Shoddy software practices are forced on programmers due to incompetent managers which in turn produces the mess that is Longhorn.”
– Those terrible managers and shoddy practices, that’s the problem. Please tell me though, do you know any of these managers? Can you detail these bad practices? Have you even seen the “mess” that is Longhorn?

“Then on the server side Microsoft has finally realized that they are fighting a losing war.”
– Yeah and the fact that so many people are tossing Windows 2003 for other things just amplifies this…

“Now factor in the ITMS and how profitable it has been.”
– Forget the fact that Microsoft makes twice as much PROFIT in one month as ITMS has made in total revenue over its entire lifetime.

“News flash! Longhorn is going to be drastically overshadowed by Leopard and Macs running Intel.”
- So everyone will be excited for the 200 people who still use Macs, while no one will care that the 200 million+ Windows users will be getting an upgrade.

“Throw all of the current Windows code away. All of it. Everything from 9x to XP to Longhorn, everything has to go. It’s all crap and its time to jettison those reeking piles of poorly written, buggy code.”
– I love proclamations from people who expose themselves in this way; I wonder how many lines of code this guy has written (hint: HTML does not count).

“There is no need to make IE so deeply attached to your kernel. Bad things happen when you do stuff like that.”
– OK, can you even give me a salient definition of a kernel? IE has never been part of the kernel, it will not be in Longhorn either.

And the one I like best is his suggestion for what Microsoft should do; suggestion number one, top of the list...Admit Defeat!

Ha! I love a good laugh.

Will Atlas slow the move to smart clients…?

I have been building applications on the web for almost a decade now; real applications used every day by real people, not web sites you touch every once in a while. In my experience I have learned many things, but one important lesson can be summed up in two words: browsers suck.

We have taken browsers and forced their square asses through round holes, only to find that once we pushed them through there was another hole of a different shape waiting on the other side. Browsers are simply not an ideal platform for the presentation of data; they are certainly not ideal for the manipulation of that data. Don’t get me wrong, we have come a long way since IE4 blew Netscape out of the water and ushered in an era more friendly to developers, but let’s face it: the best web applications are poor facsimiles of the real thing. What are we trying to do but merely mimic Windows?

We get excited by things like OWA in Exchange 2K3 but when questioned about the excitement the answer tends to be something like “It is so much like Outlook 2003, it is really great.” Think about the absurdity of that, we are excited because it is a pretty good fake.

ScottGu today announced Microsoft’s attempt to up the AJAX ante with Atlas, and while it is exciting and applause-worthy (and something I will most certainly use), I question its effect on the long-term migration to a platform which is technically more elegant and financially (to MSFT) more fruitful. I am speaking about ‘smart clients’: Windows-based applications with a native understanding of the web, and specifically of web services. The value of the web is not how pretty we can make our HTML but the content described by it; this was difficult to explain to people, but the growth of RSS has made it clearer: the value is the data, not the presentation.

With the widespread adoption of managed runtimes like the .NET Framework and the Java runtime, some of the major challenges that gave rise to web-based applications are being answered. Browsers and HTML in general gave us platform-independent consistency. As the managed runtimes make their way onto nearly every new PC made, and the majority of actively used existing ones, this challenge is receding. The runtime is becoming the consistent bedrock that developers need as a target for application development. Another great advantage wrought by the browser revolution was the obviation of the need for software distribution. Versioning issues and the physical act of installation on a client machine presented incredible engineering challenges, and browsers simply did away with that. Today we are seeing more and more applications which update themselves, and the Windows Forms team has served up a great piece of technology with ClickOnce deployment, which will almost eliminate the original problem of distribution altogether.

If the value proposition of the web is distilled down to the broad availability of the value asset (the data), then one could argue our applications can and should move to a model that best exploits that data for useful purposes: a model that takes advantage of the power of the PC and the richness of the Windows user experience to give the user the best combination of data availability and manipulation. This is the course plotted by people building smart client applications, and it is most certainly the right course; why then are people still building new applications using methods that don’t make sense anymore? Will Atlas merely exacerbate the problem by taking us further down a road of “works well enough” and effectively slow the migration back to Windows-based applications?

My point is that we got in bed with browser-based applications because of problems that are gone now (or very close to gone); will the continual advance and introduction of technologies that make life “more like” the real thing only delay the move back to the real thing?

This is not to say that Scott’s team should not build Atlas, of course they should and they should do it in classic Microsoft style: better than everyone else.

OT: US Supreme Court: Litigious open season on P2P file sharing

Don't like getting into political issues, but this has an effect on software development...

http://www.washingtonpost.com/wp-dyn/content/article/2005/06/27/AR2005062700471_pf.html

Big ramifications, I am sure. Looking at the broader picture here, the US Supreme Court has said that software developers can be held liable when their software or service is used in an illegal manner, regardless of their purpose or intent in developing and providing it.

Sad...

The problem is with one's interpretation of intent. Souter says that someone who distributes something "with the object of promoting its use to infringe copyright..." is liable for the acts performed with the software. Fine, but who decides what the object promotes? Once again that lends vagueness to a ruling which can, and most certainly will, be abused by lawyers everywhere.

Couple this with the outrageous eminent domain decision that gives the government power to seize your private property for purposes other than the 'public use' and we have fulfilled the fear of Lincoln and resigned our government to an eminent tribunal.

OT: Microsoft and China

Microsoft censors words like Freedom and Democracy

http://biz.yahoo.com/ap/050614/china_microsoft.html?.v=3

I understand there are business considerations in all decisions like this, and likewise the ultimate responsibility is a fiduciary one to the investment community, but this really bothers me. All that talk about Microsoft and moral courage in the face of a state bill extending specific and special rights to a certain minority group already protected under general anti-discrimination laws; now a real test of courage comes along, and nothing.

Respect lost.

The irony of the matter

Apple moving to Intel processors...Microsoft picking up PowerPC

Am I asleep? The more I think about this the odder it seems. Seriously, tell me this 6 weeks ago and you are nuts.

Next up: Dodge and Chevy decide to trade engines, Viper and Corvette to make switch.

 

INTELligent Fruit...
The story of an Apple/Intel relationship is everywhere now, so I assume it is legit. Apple will, in some form, be moving to chips from Intel. But what does that really mean? All of the stories have been essentially the same: long on speculation and short on details. In mid-2006 the switch will begin; that is about all the detail that exists at this point. Jobs is supposed to tell more in his WWDC keynote this morning, but the speech will certainly create a flood of new questions.
 
I find it interesting that no report has specified something I think is a bigger issue here than the manufacturer: the architecture. No report that cites a 'source' specifies that the move involves x86; there is only speculative coupling with the tidbit that Apple has had OSX running on x86 in a lab for some time. That should not be news to anyone who has paid attention, as OSX is rooted in an x86 build based on BSD. The move also does not seem to make sense: if Apple is going x86, then why go with Intel when AMD is kicking their butt in so many technical categories...!?
 
Why do we automatically assume that Apple would toe the Intel line? Any agreement like this has give and take. In most cases that is going to be economic, but since we are all just guessing at this point, I will insist it is possible that Intel will manufacture a chip designed with an instruction set compatible with existing OSX apps. If they don't, could it also be possible that Intel will exercise its hypervisor virtualization technology to help the move? Maybe someone smarter than me could answer whether these things are possible.
 
Could it be possible that Apple will make and sell PCs? Forget their software, which has become a niche for the 13 people who use it; the bulk of their revenue is based on their hardware, which quite frankly is both aesthetically and technically excellent.
 
Could the chips be used as part of a new home entertainment appliance, an iPod for the living room?
 
Could the story just be a bunch of speculation run amok? The NYT reported on it, so truthfulness and bias need to be examined.
 
How many people will walk out of the keynote in protest? Childish I know but we are talking about Apple fans.
 
Scoble: Is it possible that Jobs saw a sneak peek of Longhorn and decided to abandon Apple's traditional business model in Mid 2006 (tentative Longhorn release timeframe!) and start selling Longhorn ready PCs!!!!!! (note to flamers: that was a joke)