December 2008 - Posts
This innovative use of Deep Zoom uses screenshots of sites powered by ViaTecla’s software solutions to form a picture of a Santa Claus.
I don’t know what’s wrong with my XP system (besides the fact that I’m still using it), but I couldn’t install .NET 3.5 without removing .NET 2.0, and then I couldn’t apply SP1 without removing .NET 3.0 and 3.5.
Fortunately, I was rescued by Aaron Stebner’s .NET Framework cleanup Tool – both times.
Use the mouse wheel to zoom in and out, and left-click and drag to move around.
Callbacks were introduced in ASP.NET 2.0 and are a simple mechanism for invoking page or control functionality without rendering the page and without the user noticing a postback.
For a page or control to handle callbacks, all it needs to do is implement the ICallbackEventHandler interface.
When the client calls back to the page or control, the initial state of the controls is posted along with the ID of the control being called upon in the __CALLBACKID field and the callback parameter in the __CALLBACKPARAM field.
It’s quite a simple procedure.
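For reference, here’s a minimal sketch of a page that handles callbacks (the class name and the echo logic are just illustrative):

```csharp
using System.Web.UI;

// Illustrative only: a page becomes callback-capable simply by
// implementing ICallbackEventHandler.
public partial class Callback1 : Page, ICallbackEventHandler
{
    private string result;

    // Called with the value posted in the __CALLBACKPARAM field.
    public void RaiseCallbackEvent(string eventArgument)
    {
        this.result = "echo: " + eventArgument;
    }

    // Called afterwards to produce the text sent back to the client.
    public string GetCallbackResult()
    {
        return this.result;
    }
}
```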
But what if you want to issue a callback server side?
In order for a request to be identified as a callback (IsCallback), the request must be a postback (IsPostBack) and the aforementioned fields must be present in the request’s post data. On the other hand, for a request to be considered a postback, either the level of nested server calls (Transfer or Execute) must be 0 (meaning the current request hasn’t made any Transfer or Execute calls), or the type of the page must be the same as that of the handler for the current request and the HTTP method must be POST.
Changing the HTTP method is (as far as I know) impossible. So, if the request is not already a POST, there’s no way to issue a callback.
Setting the post data is easier. All that’s needed is to override the page’s DeterminePostBackMode method (or do it in a page adapter) and return post data previously saved in a context item. Something like this:
protected override NameValueCollection DeterminePostBackMode()
{
    // Use the post data previously saved in a context item, if present.
    NameValueCollection postBackMode = Context.Items["callbackPostData"] as NameValueCollection;
    return (postBackMode != null) ? postBackMode : base.DeterminePostBackMode();
}
And issuing a callback looks something like this:
// Save the current handler so it can be restored afterwards.
IHttpHandler handler = this.Context.Handler;

// Build the post data for the callback. For the request to be treated as a
// callback, it must include the __CALLBACKID and __CALLBACKPARAM fields.
NameValueCollection postData = new NameValueCollection();
Context.Items["callbackPostData"] = postData;

// Compile and instantiate the target page and make it the current handler.
Page calledPage = (Page)PageParser.GetCompiledPageInstance("~/Callback1.aspx", this.Server.MapPath("~/Callback1.aspx"), this.Context);
this.Context.Handler = calledPage;

// Execute the page, capturing its output
// ('response' is assumed to be a control on the calling page).
StringWriter writer = new StringWriter();
Server.Execute(calledPage, writer, false);
this.response.Text = writer.ToString();

// Restore the original handler.
this.Context.Handler = handler;
You can find an implementation of a caller and a called page here.
Reading through the Typemock Insider blog, I came across this post from Gil Zilberfeld.
I myself tend to fall in Gil’s practice ("binary search" debugging), but I don’t think Kent Beck has the right solution.
Gil’s suggestion of using Isolator is tempting (I don’t miss an opportunity to use it), but still not my favorite one.
I prefer to use debug assertions. Debug assertions can be used when running a debug version of the application to pop up assertion messages, and, when running unit tests, to fail tests.
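For example (a made-up method; the invariant is only illustrative), a debug assertion documents an assumption exactly where it’s made:

```csharp
using System.Diagnostics;

public static class OrderProcessor
{
    // Illustrative only: the assertion states the method's assumption.
    // In a debug build a violation pops up the assertion dialog; under
    // a unit test with a suitable trace listener, it fails the test.
    public static decimal Total(decimal unitPrice, int quantity)
    {
        Debug.Assert(quantity > 0, "quantity must be positive");
        return unitPrice * quantity;
    }
}
```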
In order to use debug assertions in unit tests, a “special” trace listener is needed to make the test fail when its Fail method is called.
public class UnitTestTraceListener : global::System.Diagnostics.DefaultTraceListener
{
    public UnitTestTraceListener() : base()
    {
        this.Name = "UnitTest";

        // Prevent the assertion message box from popping up during test runs.
        this.AssertUiEnabled = false;
    }

    // Fail the current unit test whenever a debug assertion fails.
    public override void Fail(string message, string detailMessage)
    {
        Microsoft.VisualStudio.TestTools.UnitTesting.Assert.Fail("Debug.Assert Failed: " + message + " " + detailMessage);
    }
}
Now, all you need to do is register it.
Registering the trace listener can be done either in code or in configuration. In configuration, it looks like this:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <system.diagnostics>
    <trace>
      <listeners>
        <add name="UnitTest" type="UnitTestTraceListener"/>
      </listeners>
    </trace>
  </system.diagnostics>
</configuration>
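In code, one way is an MSTest assembly initializer (the class and method names here are mine); since Debug and Trace share the same listeners collection in the .NET Framework, adding the listener there routes failed assertions to the test framework:

```csharp
using System.Diagnostics;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class TestSetup
{
    // Runs once before any test in the assembly. Removing the default
    // listener prevents the assertion dialog; the unit-test listener
    // turns failed assertions into failed tests.
    [AssemblyInitialize]
    public static void RegisterListener(TestContext context)
    {
        Trace.Listeners.Remove("Default");
        Trace.Listeners.Add(new UnitTestTraceListener());
    }
}
```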
And if I’m using Isolator, I have to take into account the accesses made in the call to the Assert method. More fun for me.
If you were able to attend this session at PDC or Tech-Ed EMEA Developers, you were presented with a first class presentation of the future of C#, presented, respectively, by Anders Hejlsberg and Mads Torgersen.
For the near future (.NET 4.0) C# will have:
Dynamically Typed Objects
Optional and Named Parameters
Improved COM Interoperability
Co- and Contra-variance
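To make the list a bit more concrete, here’s roughly what optional and named parameters and dynamically typed objects will look like (the types and members below are made up for illustration):

```csharp
public class Document
{
    // Optional parameters: defaults in the signature let callers
    // omit trailing arguments.
    public string Print(int copies = 1, string printer = "default")
    {
        return copies + " copies to " + printer;
    }
}

public class Demo
{
    public static void Run()
    {
        Document document = new Document();

        // Named arguments: set just the parameters you care about.
        string a = document.Print(printer: "lobby");

        // Dynamically typed objects: member lookup deferred to run time.
        dynamic doc = document;
        string b = doc.Print(2);
    }
}
```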
A preview of the compiler as a service was shown, but that’s not for the .NET 4.0 / Visual Studio 2010 timeframe. Probably, not even for the next.
Starting with .NET 4.0, C# and Visual Basic will converge in terms of features and follow a path of co-evolution going into the future.
No! That doesn’t mean that XML literals will be in C# in the foreseeable future. What it means is that the above list also applies to Visual Basic.
Talking of Visual Basic evolution, the _ line continuation character has been retired. If you have any use for the underscore, please visit http://www.unemployedunderscores.com/.
(It might seem a bit late for this, but, lately, I’ve been having a lot on my mind. So here it goes.)
This was my first PDC. It was just as I had been told.
For those who don’t know, the PDC is all about the future. The near future (.NET 4.0 and Windows 7) and the further future (Windows Azure, “Oslo”, “Dublin”, “Geneva”).
Next year’s PDC (yes, apparently there’ll be one next year) will also be held in Los Angeles, from November 17 to 20, and (I suspect) will bring the commercial launch of the Azure Services Platform and a better story to tell about “Oslo”.
Tech-ED EMEA Developers, on the other hand, is more about the present and the near future. But, this year, attendees were able to have a sneak peek at what had been shown at the PDC.
Next year’s Tech-ED EMEA Developers will be held in Berlin from November 2 to 6. Probably, like in 2006, it will be the launch of .NET 4.0 and Visual Studio 2010.
And I intend to attend both.