June 2007 - Posts - Jon Galloway


Speaking on SubSonic at the San Diego Code Camp this Saturday

I'll be presenting a session at the San Diego Code Camp this Saturday (6/30/07) titled "Using SubSonic to build ASP.NET applications that are good, fast, and cheap". I'll do a quick overview of SubSonic in general, but spend most of the time building out a website. If you're interested in following along on your laptop, be sure to grab SubSonic 2.0.2 (the latest release) from CodePlex.

I got the bright and early 9 AM slot, but am convinced that coffee will see us through.

UPDATE: Files from my talk are available here.


The value of "good enough" technology

Twitter is Good Enough

Twitter drives all my tech-savvy friends crazy. We all agree that the idea - a simple mix of blog, chat, and IM - is a good one. However, the site does very little, and what it does, it does poorly - slow response, frequent outages, etc. Most developers figure they could write a "better Twitter" in a lazy afternoon, and some already have. Good idea, poor execution, and yet... it's good enough.

Twitter is good enough to keep its network and remain successful. Yes, they could be better, but their networked customer base isn't going to pick up and move to a new service all at once, and people are unlikely to move on their own. Twitter is good enough.

The Dishwasher, the Chinese Room, and a Subservient Chicken

Suppose I asked you to specify the very first dishwashing machine. It would be very tempting to define the process:

  1. The machine will pick up an article.
  2. The machine will wash the article.
  3. The machine will inspect the article for spots and re-wash if necessary.
  4. The machine will rinse the article.
  5. The machine will dry the article and place it on the drying rack.

In reality, though, dishwashers just spray a bunch of hot, soapy water on dishes in a rack and dry them with hot air. They don't deal with the dishes one at a time, which is part of the reason that they're small enough and cheap enough to fit in most kitchens.1

Imagine that you needed to design a system which acted on a small set of instructions which were written in Chinese. You don't speak Chinese. What would you do?

The "right" answer might be to hire some translators, or possibly set up some automated translation software. Those would work, but would probably be a little expensive. What if I told you that it was acceptable to have a small percentage of failures, especially at the beginning? In reality, most software is not 100% perfect, especially in early versions...

I didn't randomly pick this task. The Chinese Room is a famous thought experiment in artificial intelligence, which postulates that a person with a good enough "cheat sheet" in a sealed room could pretend to be conversant in Chinese, even though they didn't have any idea of what they were saying. That would work even better if the vocabulary and topics were confined.2

That's how the Subservient Chicken worked. The Subservient Chicken was an advertising campaign for Burger King's chicken sandwiches back in 2004. You could (and still can) give the chicken just about any order, and he'd do it. It was amazing... and yet, the chicken only performed about 300 actions. They were just the right actions to make it look like this crazy guy in a chicken suit could do just about anything you asked of him.

Good enough for a publicity stunt, but let's take it a step further. Let's say that we wanted the translator in the Chinese room to pretend to speak the entire Chinese language, and the Subservient Chicken to do absolutely anything you could think of. In considering our approach, remember that a certain amount of failure is acceptable.

In that case, I'd make sure that the initial feature set (Chinese vocabulary, set of chicken stunts, etc.) would be just enough to keep users coming back, putting more effort into making sure all failures were captured and immediately acted on. Yes, the system won't be perfect to start, but it will keep improving, and the improvements will be based on what users are asking for that we can't handle. And in this case, I believe a quick feedback loop would be better than trying to guess what users would want for the initial release. That's how I assumed the Subservient Chicken worked, by the way. I figured that they had some guy acting out things as they came in, only performing new stunts when they weren't already in the system. If you did that, you'd very quickly build up a library of the most common requests, and a small amount of effort would be required to keep the system growing.
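
If I were sketching that approach in code (purely hypothetical - I have no idea how the real campaign was built), it might be as simple as a lookup of known stunts plus a log of everything the system couldn't handle:

using System;
using System.Collections.Generic;

public class SubservientChickenDemo
{
    // The library of stunts we already know how to perform (names and clips are made up).
    private readonly Dictionary<string, string> knownStunts =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "dance", "chicken-dance.wmv" },
            { "jumping jacks", "jumping-jacks.wmv" },
            { "moonwalk", "moonwalk.wmv" }
        };

    // Requests we couldn't handle - the to-do list for whoever is in the suit.
    private readonly List<string> missedRequests = new List<string>();

    public string HandleRequest(string request)
    {
        string clip;
        if (knownStunts.TryGetValue(request.Trim(), out clip))
            return clip;

        // Capture the failure so the library grows where users actually push it.
        missedRequests.Add(request);
        return "confused-head-tilt.wmv";
    }
}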

Ever wonder how Google's spell correction system works? If you search for "mnokey", you'll see "Did you mean: monkey", right? For the most part, those corrections don't come from an intelligent system or even a dictionary; they come from tracking user behavior. Just about every time someone searches for "mnokey", they don't click on any search results and immediately search again for "monkey". Google tracks that and uses it as a good guess at a spell correction. The system learns from its users.
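
Here's a toy sketch of that feedback loop (my own simplification - nothing like Google's actual implementation): treat a "no clicks, immediate re-search" as a vote that the second query corrects the first.

using System;
using System.Collections.Generic;

public class SpellCorrectionLearner
{
    // votes["mnokey"]["monkey"] = how many times users who searched "mnokey"
    // clicked nothing and then immediately searched "monkey".
    private readonly Dictionary<string, Dictionary<string, int>> votes =
        new Dictionary<string, Dictionary<string, int>>(StringComparer.OrdinalIgnoreCase);

    // Call this when a search produced no clicks and was immediately followed by another query.
    public void RecordRetry(string originalQuery, string retryQuery)
    {
        Dictionary<string, int> retries;
        if (!votes.TryGetValue(originalQuery, out retries))
        {
            retries = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
            votes[originalQuery] = retries;
        }

        int count;
        retries.TryGetValue(retryQuery, out count);
        retries[retryQuery] = count + 1;
    }

    // "Did you mean" is simply the retry query users voted for most often.
    public string SuggestCorrection(string query)
    {
        Dictionary<string, int> retries;
        if (!votes.TryGetValue(query, out retries))
            return null;

        string best = null;
        int bestCount = 0;
        foreach (KeyValuePair<string, int> pair in retries)
        {
            if (pair.Value > bestCount)
            {
                best = pair.Key;
                bestCount = pair.Value;
            }
        }
        return best;
    }
}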

Everything's a feature

So, what features are up for discussion? Absolutely all of them.

  • Security is a feature
  • Reliability is a feature
  • Performance is a feature
  • Scalability is a feature
  • Object orientation is a feature
  • Test coverage is a feature
  • Data integrity is a feature
  • Data access methodology (e.g. only via stored procedure) is a feature
  • Etc., etc.

Some of those are obviously listed to provoke a reaction, but everything on the above list really is negotiable.

For instance, you might see security as non-negotiable, but Wikipedia is a pretty good example of a site with an intentionally weak security system which has become incredibly successful.

Reliability? NASA shoots for zero defect products. It's enormously expensive, and it's not always successful. Every major Internet service I use crashes or goes offline from time to time. Nobody's 100% reliable. If we say reliability is non-negotiable, we're fooling ourselves.

Performance and scalability? As Richard Campbell and others pointed out during the ASP.NET Scalability Panel, it's a big mistake to spend too much effort supporting millions of users when you don't have a hundred. Of course we should avoid irresponsible programming, but I recommend building working systems that can be tuned when needed rather than building elegant systems from day one.

I understand that many developers view that as sloppy programming, but I'd argue that reaction puts personal preference ahead of pragmatism:

I'm really into performance tuning - I wrote a section in our soon-to-be-released ASP.NET book focused on tuning slow ASP.NET applications, from caching to database profiling to index tuning. I enjoy writing SQL, and I think I'm really good at it. And yet, I'm a big fan of SubSonic, which generates dynamic SQL for ASP.NET data access. That's because, while I like hand-crafting SQL queries, I think it's usually an irresponsible waste of time. The better approach is to build a working application on a framework which allows me to hand tune when I need to, then wait until that time comes to get under the hood. It takes restraint to let facts rather than gut tell me where to spend my time, but it's the right thing to do.

So, everything on the list above (and more) is negotiable. Some are undoubtedly important in your application, but it's probably a mistake to try to get all of them perfect on the first release. Ship something, listen, then ship something better!

Network Effects and First Mover Advantage

So, Twitter shipped the first "social chat-IM-blog" service, and it's not perfect. Unless they're offline for days at a time or a competitor comes up with an absolutely stunning new feature, though, their network is secure because nobody wants to move to a new system without their friends. Inertia is on the side of an established network.

That's why it kills me to see Microsoft so late to the game with a lot of their Internet services. In many cases, they really are better services, but they're late enough to have a very uphill climb.

What I didn't say

I want to make it clear that I'm not saying programmers should write poor software. We all know that will happen without my saying so... What I am saying is that software teams should focus on shipping software that's "good enough" to be useful to their customers as quickly as they possibly can, then use customer feedback and actual system performance to drive what happens next.


1 I first heard this example studying for my old-school MCSD six or seven years ago. I no longer have the book and can't find the quote online, but I think it's a fantastic analogy.

2 I learned the value of limited vocabularies in my days as a submarine officer. We all got pretty used to communications under terrible conditions - sound powered telephones in the engine room, ship to ship radios, yelled verbal reports during fire and flooding casualties (some of which were drills, some of which weren't). The trick was that we were trained to use very specific terminology. That's great for several reasons - the speaker doesn't have to make any decisions on how to phrase their words, and the listener doesn't parse sentences so much as select from a few possible expected commands or responses. Hmm... this sounds like a separate post...

Silverlight content only prints in IE (for now)

Last night I made a simple Silverlight maze generator for my 6 year old daughter, who's really into mazes right now. When I tried to print the resulting mazes, I found that the Silverlight content was blank in Firefox (left), but worked in IE (right):

[Screenshots: Silverlight print preview in Firefox 2 (left, blank) and IE7 (right)]

The official word seems to be that printing Silverlight content isn't supported yet, but is being considered as a 1.1 feature. It sounds like some people are coming up with clever hacks to handle printing Silverlight content, but I'm hoping that print support is added to 1.1. In addition to being a nice application framework, Silverlight's vector-based content could make a great print format, which would eliminate the need for PDF downloads in many cases.

[Silverlight] "AG_E_RUNTIME_MANAGED_ACTIVATION" = You don't have Silverlight 1.1 installed

I'm thinking a better error message might be in order when folks try to view Silverlight 1.1 content with managed code and only have Silverlight 1.0 installed, but for now this is what you get (obviously the last three lines will vary depending on the actual XAML content):

Silverlight error message
ErrorCode: 2251
ErrorType: ParserError
Message: AG_E_RUNTIME_MANAGED_ACTIVATION
XamlFile: Page.xaml
Line: 9
Position: 9

Solution: Download the Silverlight 1.1 Alpha Plugin.


[SQL Server Analysis Services] - "Errors in the metadata manager" when restoring a backup

I had trouble restoring a SQL Server 2005 Analysis Services backup today due to "Errors in the metadata manager" messages:

The ddl2:MemberKeysUnique element at line 243, column 28420 (namespace http://schemas.microsoft.com/analysisservices/2003/engine/2) cannot appear under Load/ObjectDefinition/Dimension/Hierarchies/Hierarchy.
Errors in the metadata manager. An error occurred when instantiating a metadata object from the file, '\\?\C:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data\...

I'm still a relative rookie with SSAS, but I knew enough to suspect that the problem had nothing to do with the backup itself. My general programming M.O. is to try a few things before I start searching for solutions, but with SSAS errors I give rebuilding the cube a shot (if applicable) and then search.

Sure enough, others had seen this problem and traced it to SQL Server 2005 SP2 not being installed on the backup source or target. I checked and SP2 hadn't been installed on the target database server. Once I installed SP2 on the target server the restore went just fine.


[SubSonic] LoadFromPost method maps controls to object properties

Since SubSonic generates data access code and cuts way down on the repetitive grunt work, I've started to resent having to write any code at all. On a recent project, we found that since we weren't writing much data access or object mapping code, the majority of the code we had to write revolved around shuttling data between controls and object properties.

You know, code like this:

int productQuantity = 0;
if (int.TryParse(txtproductQuantity.Text.Trim(), out productQuantity))
    this.Product.Quantity = productQuantity; 

Binding code is easy enough, but the "unbinding" code on save is a bit of a pain. I talked to Rob about how to make that easier - maybe a SubSonic textbox with an attribute which would map it back to a SubSonic object and automatically handle the bind / unbind code. Rob told me about a feature I hadn't noticed before - LoadFromPost().

All SubSonic ActiveRecord objects inherit a LoadFromPost() method, as in: Product.LoadFromPost()

LoadFromPost finds all the controls whose names match the "Product" column names and initializes the object from them. It handles type checks, ASP.NET's nested control names like DataGrid1__ctl3_TextBox1, etc.
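
Here's roughly how that plays out in a save handler. This is just a sketch - the hidden field name is made up, and I'm assuming the usual SubSonic ActiveRecord Save() call:

protected void btnSave_Click(object sender, EventArgs e)
{
    // Load the existing record by primary key (assume the ID is stashed in a hidden field).
    Product product = new Product(int.Parse(hdnProductId.Value));

    // Map posted values from controls named after the Product columns back onto the object -
    // type conversion and nested control names are handled for us.
    product.LoadFromPost();

    // Persist the changes.
    product.Save();
}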

I noticed that the developers on this project who were new to SubSonic assumed they needed to use the query object to load a specific object from the database. The simplest method is to use the constructor overloads.

If you're loading by ID, you can just pass the ID in the constructor:

Product p = new Product(productId);

If you're looking it up by another column, you can specify the column name and value:

Product p = new Product(Product.Columns.ProductName, "Space Suit");

Keep in mind that the second constructor loads the first product WHERE ProductName = 'Space Suit'.


Calling an ASMX webservice from Silverlight? Use a static port.

The setup

Rob Conery recently posted on Creating a Web Service-Enabled Login Silverlight Control, which is probably a more important topic than many people realize right now. Since Silverlight code runs client side in the user's browser, many tasks like database access and user authentication require what is by definition a "web service" (even if it uses REST or some other, non-ASMX approach).
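
To give a flavor of what that looks like (a bare-bones stand-in, not Rob's actual code), an ASMX login service can be as simple as a [WebMethod] that wraps the Membership provider:

using System.Web.Services;
using System.Web.Security;

[WebService(Namespace = "http://tempuri.org/")]
public class LoginService : WebService
{
    // The Silverlight control posts credentials here; validation happens server side.
    [WebMethod]
    public bool Login(string userName, string password)
    {
        return Membership.ValidateUser(userName, password);
    }
}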

Along the way, Rob ran into an interesting issue. Being the wise man that he is, Rob knew that he faced a choice:

  1. Figure out an odd brain teaser dealing with undocumented alpha technologies
  2. Mention the odd brain teaser to Jon, who would likely get hooked and stay up all night figuring it out

Rob's a smart guy; you can guess what he chose...

The problem

Microsoft Silverlight Tools Alpha for Visual Studio codename "Orcas" Beta 1 adds a new project type to Orcas - you guessed it, the Silverlight project. It does a few things - it adds the necessary references, adds an "Add Silverlight Link" context menu icon to other projects in your solution, does some sort of magic to make sure the client side code compiles to ClientBin rather than Bin, and probably a lot of other important stuff.

The idea is that you create a solution with a Silverlight project, then create a separate web project, and then select "Add Silverlight Link..." on your web project.

Great, so here's the plan:

  1. We set up a Silverlight project and a Web (site or application) project
  2. We create a service in the web project
  3. We add a Silverlight link from the Web project to the Silverlight project
  4. We add a web reference to the Silverlight project, pointing to our webservice

Do all those intermingled references cause a problem? They can, if you don't set a static port for your web project (more on that later).

The main problem is that the ASP.NET Development Server (née Cassini) uses a random port by default, so when you add the webreference to your Silverlight project, it's added via the dynamic port that's been assigned to that web project. The problem there, of course, is that since the port is randomly selected, the webreference quickly gets out of sync with the webservice.

As I worked through the problem, I became convinced that the solution was to split the solution out into three projects - a main website, a Silverlight project, and a webservice website. That led to a very interesting issue: the Silverlight control runs under the website's Cassini port, the webservice runs under a different Cassini port, and Silverlight's security model prevents it from calling across to another port.

I'll try to say that again, this time in English. Let's say the main website is running on http://localhost:1000/Login.aspx and the webservice we want the Silverlight control to call is running on port 2000, as http://localhost:2000/LoginService.asmx.

The Silverlight BrowserHttpWebRequest sees different ports and throws the following exception:

"Cross domain calls are not supported by BrowserHttpWebRequest"

Really? I'm calling from localhost to localhost and I'm crossing domains? Yep. Reflector shows that pretty clearly - the IsCrossDomainRequest method compares on UriComponents.SchemeAndServer:

internal static bool IsCrossDomainRequest(Uri uri)
{
    string components = uri.GetComponents(UriComponents.SchemeAndServer, UriFormat.Unescaped);
    string text2 = HtmlPage.DocumentUri.GetComponents(UriComponents.SchemeAndServer, UriFormat.Unescaped);
    if (components.Equals(text2, StringComparison.OrdinalIgnoreCase))
    {
        return false;
    }
    return true;
}

And UriComponents.SchemeAndServer includes port.
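
A quick snippet (mine, not from Reflector) shows why two localhost URLs on different ports fail that check:

using System;

class CrossDomainCheckDemo
{
    static void Main()
    {
        Uri page = new Uri("http://localhost:1000/Login.aspx");
        Uri service = new Uri("http://localhost:2000/LoginService.asmx");

        // SchemeAndServer keeps the (non-default) port, so these two strings differ
        // and IsCrossDomainRequest() returns true.
        Console.WriteLine(page.GetComponents(UriComponents.SchemeAndServer, UriFormat.Unescaped));    // http://localhost:1000
        Console.WriteLine(service.GetComponents(UriComponents.SchemeAndServer, UriFormat.Unescaped)); // http://localhost:2000
    }
}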

Hmm...

Rob and I discussed several options:

  1. Use IIS rather than Cassini. This isn't ideal, since it requires manual setup and clutters up your IIS installation on your local machine, but since IIS provides a distinct URL for your project without requiring a port, your reference won't change.
  2. Proxy requests by using the browser's XmlHttpRequest object, which can (probably) make cross-domain calls.
  3. Don't use a webservice, and call back to an ASPX page using a simple REST interface. In this case, there's no webreference to manage.
  4. Some other crazy rubbish involving hosts file entries which made sense at 3 AM, but sounds ridiculous right now.

None of those seemed right to me, which is why it took me so long to finish this post. The simple solution is to use one website project and set a static port number:

[Screenshot: webservice project properties with a static port assigned]

In this case, the Silverlight webreference has a set port and doesn't get out of whack. More importantly, by using one site running under one static port, both the page and the webservice run under the same port and there's no cross-domain problem.


Safari on Windows - Browser testing just got a whole lot easier...

Funny, just last week I posted about using browsershots.org to see screenshots of your web application in a huge variety of browsers. Today, Apple announced Safari 3 runs on Windows.

Aside from screenshot services, there were a few other methods that pretty much worked in the past. The Swift browser runs on the open source WebKit layout engine which is also used by Safari, so at least in theory you could get a Safari-esque browser running on Windows. Otherwise, you needed to be running a virtual machine (and potentially ignoring license restrictions) to test IE and Safari on the same machine.

Now if we could just test IE6 rendering on Vista without a VM...

Failed Orcas Beta 1 install - Check for Office 2007 Beta Installer records

The Orcas Beta 1 install kept failing on my laptop with a non-specific error. The install log didn't say anything very helpful:

Microsoft Web Designer Tools: [2] Component Microsoft Web Designer Tools returned an unexpected value.
setup.exe: [2] ISetupComponent::Pre/Post/Install() failed in ISetupManager::InternalInstallManager() with HRESULT -2147023293.
VS70pgui: [2] DepCheck indicates Microsoft Web Designer Tools is not installed.

The detailed install logs are in the %temp% folder; mine was called SetupExe(2007060821502617D4).log. Towards the end, I found this line:

The 2007 Microsoft Office system does not support upgrading from a prerelease version of the 2007 Microsoft Office system. You must first uninstall any prerelease versions of the 2007 Microsoft Office system products and associated technologies.

I hadn't installed Office 2007 (Beta or final) on this machine, and the Add / Remove programs list didn't show anything with Office 2007. The Windows Installer Cleanup Utility found something, though:

[Screenshot: Windows Installer Cleanup Utility showing a leftover Office 2007 Beta entry]

Sure enough, I'd installed the Office 2007 Compatibility Pack (Beta), then upgraded to the release version when it came out. Apparently the beta entry hadn't been removed when I installed the release version. Removing the Windows Installer record allowed the install to continue.


browsershots.org - Test your site in a variety of browsers on Win, Mac, and Linux

A client needed some help with a display issue on Safari / Mac. Browsercam is a good solution and is reasonably priced, but for this simple issue I just needed to see the site in Safari / Mac and make sure I hadn't affected IE6 as well.

Browsershots is free and covers a huge variety of browsers on the three major platforms. The turnaround can be pretty slow (2 1/2 hours in my case), so make sure you set the "Maximum wait" to 4 hours.

With a free, easy to use service like this, there's no excuse not to expand your browser testing a bit.

browsershots.org
