Tales from the Evil Empire

Bertrand Le Roy's blog




March 2006 - Posts

Mix06 demo part 1: the accordion control

I thought I'd write a few posts about the controls I've created for Brad's Mix06 demo, which you can download from his blog:

Full source code of the demo: http://blogs.msdn.com/brada/archive/2006/03/23/559077.aspx
Accordion control source code: http://blogs.msdn.com/brada/archive/2006/03/29/563648.aspx

One of the most reusable pieces of the demo is the accordion control. Plus, I'm French so it looked like a natural way to start this series of posts.

Atlas accordion control
(This is a snapshot of the control, an image, not a live sample)

The control is a client-side Atlas control. It does pretty much what you expect, which is to show only one of its panes at a time. It has a nice transition effect when switching to a new pane. Here's how you use it...

As usual with Atlas client-side controls, the control itself does not do any rendering. This part is left to the page developer and is completely free-form, which gives you total control over the rendering and helps separate layout and semantics from behavior. So the first thing to do is to create a container element (a div will do) and put the different headers and content panes inside of that. For accessibility reasons, I recommend you use A tags in the headers. Divs are fine for the panes themselves.

<div id="accordion1">
  <a href="javascript:;">Pane 1</a>
  <div id="pane1">
    This is Pane 1 in the accordion control.
  </div>
  <a href="javascript:;">Pane 2</a>
  <div id="pane2">
    This is Pane 2 in the accordion control.
  </div>
  <a href="javascript:;">Pane 3</a>
  <div id="pane3">
    This is Pane 3 in the accordion control.
  </div>
</div>


The accordion control looks at the contents of its associated element and treats the child elements alternately as headers and content panes. So let's associate an accordion control with the div above:

<dice:accordion id="accordion1"/>


And... that's it, we're done.

One nice thing to note: if JavaScript is turned off in the user's browser, the contents of the accordion will simply show with all panes fully deployed.

In a future post, I'll explain how such a control is built and in particular how I included the animation effect.
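In the meantime, here's a rough, hypothetical sketch of the core idea (not the actual Atlas control code): the control keeps track of its panes and, when a header is clicked, makes the corresponding pane visible and collapses the others.

```javascript
// Hypothetical sketch of the accordion's core logic, not the actual
// Atlas implementation: make only the selected pane visible.
function selectPane(panes, selectedIndex) {
  for (var i = 0; i < panes.length; i++) {
    // The real control would toggle element styles here and run the
    // Glitz transition animation; this sketch only tracks visibility.
    panes[i].visible = (i === selectedIndex);
  }
  return panes;
}
```

Wiring this to the markup above would essentially mean attaching selectPane as a click handler on each header link.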

UPDATE: Here's the scaffolding you need in your page for the accordion control (or any Atlas control) to work... You need a script manager on the page, which will manage the necessary script inclusions. Here, the accordion needs the Glitz library for the animations so we're adding that:

<atlas:ScriptManager ID="ScriptManager1" runat="server">
  <Scripts>
    <atlas:ScriptReference ScriptName="AtlasUIGlitz" />
    <atlas:ScriptReference ScriptName="Custom" Path="~/ScriptLibrary/Dice.js" />
  </Scripts>
</atlas:ScriptManager>

Replace Dice.js with accordion.js if you're using the accordion-only version of the script file (dice.js contains the other controls used in Brad's demo).
If you're not using ASP.NET, you'll have to manually add <script> tags to your page that point to Atlas.js and the glitz script file.
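In plain HTML, that scaffolding would look something like this (the paths are hypothetical; point them at wherever you copied the Atlas script files):

```html
<!-- Hypothetical locations; adjust to where your Atlas scripts live -->
<script type="text/javascript" src="ScriptLibrary/Atlas.js"></script>
<script type="text/javascript" src="ScriptLibrary/AtlasUIGlitz.js"></script>
<script type="text/javascript" src="ScriptLibrary/accordion.js"></script>
```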

And of course the Atlas markup itself must be in a <script type="text/xml-script"> section:

<script type="text/xml-script">
  <page xmlns:script="http://schemas.microsoft.com/xml-script/2005" xmlns:dice="http://schemas.microsoft.com/xml-script/2005/dice">
    <components>
      <dice:accordion id="accordion1"/>
    </components>
  </page>
</script>

Note the namespace declaration here so that we can use the dice:accordion tag. I should have included that from the start, but I wanted to show only what's relevant to the sample. My mistake, it's corrected now.

UPDATE 2: The Atlas Control Toolkit now has a far more advanced Accordion control, which should be preferred over this one for real-world use.

XNA Framework
We’re not the only ones busy around here :). My friend Julien, along with other very smart people, is building the next-generation development system for Xbox and PC games: XNA. Part of this work is building an implementation of the .NET Framework for the Xbox 360, which means considerably reduced development costs for casual games and... who knows what's next? This is just going to be great, as it's going to lower the entry bar for game development. I hope this is the return of the time when people like me, with the help of a few friends, could develop a reasonably successful game. It brings back 20-year-old memories of building video games with my friend Fabien for the TI99 and the Atari 800...
Live from Mix 06

Mix is going really well. We have a much more diverse audience here than at the usual Microsoft dev conferences. I've never seen that many Mac laptops at an MS conference. We're getting absolutely great feedback from attendees. Here's a sample from a conversation: "If you had told me two years ago that I would be at a Microsoft conference one day I would have just laughed at you, but this is great, it seems like you guys actually like us."

Additionally, the March CTP of ASP.NET "Atlas" is online, you can download it and enjoy all the new stuff and bug fixes we put in there.

Plus, we have some documentation. About time, I know ;) .

Oh, and one more thing: you can build a site with Atlas today and go live with it. This doesn't mean that we're freezing the object model. We *are* going to continue developing the platform further and stuff is going to change, but we wanted to give our early adopters the green light to publish their stuff publicly on the web.


Ruby for .NET? This is going to be interesting...

There have been a few bridging projects in the past that aimed at using .NET libraries from Ruby. This is different: it's an actual implementation of the Ruby language for the .NET Common Language Runtime. Seeing how .NET finally seems to be more than OK with dynamic languages (see the amazing IronPython project), this could prove to be a very interesting turn of events. The open-source project, initiated by the Queensland University of Technology in Australia, is in its early phases, with only some of the standard Ruby libraries implemented. This is not very surprising: the most difficult part of porting a language is usually that the language itself is not enough. You also need its base class library if you want more than a familiar syntax and more than marginal usefulness. J# suffers from the same kind of problem with its class library.

Anyway, I thought I'd let you know. Check it out:

Did standards sterilize or enrich the web?

I know, I know, it is not very nice even to ask this question, especially coming from a Microsoftee. But really, I'd like to know what you think. We have had many discussions on this subject in the team and many of us have very sharply defined opinions one way or another.

On the one hand, standards have enabled the adoption of some innovations by all important browsers so that you can actually use them and not limit your audience to users of a specific browser, without having to write the same application three times.

On the other hand, it is nearly impossible for a web developer to use a new browser innovation right away, because it first needs to be standardized, adopted, and implemented by all browsers, and those implementations then need to be installed by a majority of users. That usually takes about five years. It's why we're getting the Ajax boom only now, whereas the underlying technology has been available for a very long time.

Of course, standards are not really responsible for that. But it gets worse: any browser innovation is likely to be severely criticized as not being standard. There's a chicken-and-egg problem here that's not trivial to solve.

Furthermore, if you look at some of the nicest things in Ajax applications, they are not part of any widely adopted standard. I'm thinking in particular about things like XmlHttp and online HTML editing for which standards may have been drafted but were not adopted: the browsers agreed on the APIs in the absence of standards.

So honestly, what do you think? Any bright ideas on how to have both standards and innovation in the Web browser space?

Update: some answers to your comments (by the way, thanks for their excellent quality)...

Michel, who was first to comment, pretty well sums it up.

Joakim made the very interesting point of accessibility. This is right on point. That's an area where the standard is absolutely necessary, but it falls short on a few things.
First, many web developers, if they know or care about the norm at all, tend to mark all the checkboxes and consider the job done. Unfortunately, that's not enough: experimentation is key to building a truly accessible site. Of course, that's a very expensive process.
Second, until very recently, browser support for essential accessibility features such as accessKey, tabIndex and focusability on all elements has been limited to IE (which has supported it for years). Firefox implemented it in 1.5 thanks to some code donated by IBM and I think Safari and Opera are still lagging behind on this (correct me if I'm wrong here). Screen readers are pretty much limited to Windows/IE.
I'm not even talking about JavaScript not being considered accessible: it in fact enables accessibility scenarios that would be impossible without it. So that's a standard requirement that's safe to ignore if done properly, even though you won't get to check that checkbox.
In Atlas, accessibility is a key goal. For example, if you set the tabIndex on a listView, it immediately becomes keyboard-accessible: you can tab to it, use arrow keys to navigate between items, select using the keyboard, etc. As we go, we're going to bake more out-of-the-box accessibility features into the controls. This is I think a great example of JavaScript enabling better accessibility.

Wesley comments about not implementing existing standards and about filters nicely extending them. Of course, when the standard exists it should be implemented. Now, there are quite a few cases where the implementation actually predates the standard, and the standard deliberately decides to do things differently from the existing implementation. That's fine if there is a good reason, and that's usually the case, but what is the existing implementation supposed to do? It can't break its existing customers (well, it can: it sometimes seems like Microsoft is the only software company that cares about that. Of course that's not true, but breaking customers is definitely not a nice thing to do and something your users will hate you for), so it needs a standards mode based on doctype (which may have future breaking changes) and a quirks mode that focuses on backwards compatibility, which is *exactly* what both IE and Firefox did.
Now about filters. That is a *great* example of something extremely useful, entirely optional (in the sense that a browser that doesn't know how to handle it can safely ignore it) and absolutely non-standard. The cursor style is another example. So everything looks nice and that's an innovation everybody should be happy about. Except that if you use this style and have the JavaScript console open in Firefox (maybe strict mode is also necessary to see that), you'll see that it considers any non-standard CSS attribute an error. Yes, an error, not even a warning. Google's home page throws errors in the console because of that. So that's clearly a case of: yeah, sure, standards, OK, but let's not go crazy about restricting what's possible to what the standard defined. There must always be an extensibility path. XHTML is great at that.

Steve makes the point that browser developers should work closer together, be it inside or outside standards bodies. Yep. It seems like this is starting to happen. He also asks what happened to JavaScript 2.0. Oh yes, I would also like to know. It would make our lives so much easier (in five years).

Joe gets constructive and proposes that vendors take on the task of providing better documentation for standards. That is a very good point: the W3C documents are just horrible to read. But of course, that's because they were not written to be reference documentation for Web developers; they were written as reference specifications for browser implementers. There definitely is a need for standard reference documentation aimed at Web developers (yes, there are efforts to do that here and there, but nothing really comprehensive). Maybe that should even be a joint effort from browser vendors.

Stan: it's a little sad that you would think that we don't know/care/understand about web standards. I especially liked this sentence in your post: "If you understood CSS and Style Sheets and some of the unbelievable things that you can do once you understand what a selector is I think that your view of standards would change". It is really symptomatic that you would assume that we don't.
You raise very specific points, so let me answer them.
Where did you see that our navigation controls do not have their scripts in separate js files? Menu and TreeView both have their logic encapsulated into their own files. The only javascript that we emit on the page is instance specific and thus can't be isolated into a separate file.
I'm with you on inline styles. This is a legacy of the time when not all browsers supported CSS. We now recommend that people use CssClass on their styles instead whenever possible. Visual Studio will make that a much more natural first choice in the Orcas timeframe.
On table tags, we've debated that a lot and it was just not possible to get consistent rendering across all supported browsers without them. I'm actually implementing a simpler menu control on my personal time (which is not a lot of time) that's based on list items, css and JavaScript. The thing is this does not conform to the same constraints as the ASP.NET menu. For example, it won't ensure consistent rendering across browsers, it will look funny out-of-the-box until you style it, the list goes on. My point is, that will certainly be a useful control, but it definitely is a different control and not the one we chose to implement, I think for good reasons.
It sure is really easy to implement a simple pure HTML+CSS menu. Now you tell me how to do the following with pure HTML+CSS and no JavaScript:
- delay the disappearance of a menu
- position the popouts so that they are always on the visible part of the page, including positioning it on the left or on the right of the previous one depending on the available space and the size of the new menu
- size the popouts so that they always fit in the visible part of the page and add scrollers if they don't
- keyboard accessibility (with arrow key navigation)
The list goes on. We did implement all that.
I agree that the master page ids are an annoyance but that is necessary as master pages are containers and we need to manage id collisions. That doesn't mean you can't put most of your javascript in a js file. You'll just need to initialize a data structure inline for your generic script to use, which should be a fairly small amount of script. I agree it's inconvenient and we're working on ways to make that simpler in Atlas.
The thing about the master page set in web.config not being handled well by the designer is a known limitation. It's one that I personally would have liked to see fixed in Whidbey but it was unfortunately cut. It's high on the list for Orcas.
Finally, the thing about XmlHttp is a bit of a cheap shot. Let's put things in perspective here. XmlHttp was invented by Microsoft, which implemented it (way before anybody else) as an ActiveX control at a time when ActiveX was the big new thing. Several years later, Mozilla decided it wasn't a bad idea and came up with its own implementation (compatible at the API level, which is a nice gesture). They don't have ActiveX (good for them, yes), so they went for a built-in object. Cool. We can't recall all copies of IE and change the existing implementation, can we? So what we do instead is implement it as a built-in object in IE7 as well. What's wrong with that (except for the delay between IE6 and IE7)?
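For context, the classic cross-browser instantiation of the era reflects exactly this history, with the native object tried first and the ActiveX control as the fallback (a sketch, not any particular library's code):

```javascript
// Cross-browser XMLHttpRequest creation, circa 2006 (sketch).
function createXmlHttp() {
  if (typeof XMLHttpRequest !== "undefined") {
    // Native object: Mozilla, Safari, Opera, and IE7
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    // IE6 and earlier expose it as an ActiveX control
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null;
}
```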
Update to this comment: Stan answered and clarified his initial comment. No hard feelings here; I think it was just an animated technical conversation.

Bob seems to be reasonably happy with IE. Wow!

Wilco makes the point that what we consider to be innovation on the web would hardly qualify as such in the desktop application space. Yup. I've had the occasion to say it before, if you can use truly rich client technology such as WPF, by all means do so. Ajax is about making the web better, not about making it better than your desktop.

Finally, I'd like to add a few things about other platforms that had different experiences with standards.
First, there's Java. In the Java world, which is a multi-vendor thing, a standards body is a necessary thing. Still, there are a fair number of complaints in the community about how the whole JSR process weighs on innovation, and most of the innovation happens in open-source projects (again, correct me if I'm wrong: that's my perception). Some of that ends up in JSRs, some just dies, and some is never standardized but still lives happily ever after.
Second, there's .NET. Being pretty much conceived as a single-vendor thing, it had no absolute need for a standard. Microsoft decided to submit the essential parts of the framework to ISO and ECMA anyway, which enabled great innovations like Mono. But only the smallest working set is really standardized. It's working pretty well, though, and innovation is flourishing.
And there's Flash. Until recently it's been a single-vendor, proprietary thing. Now that there's been some standardization and format opening, a lot of companies are innovating in this space (XAMLON is an amazing example).

Update 2: Rick has an excellent comment.

Update 3: Colin makes the point of standard extensibility, to which John answers that browser vendors should form committees to decide which features to implement. While it looks like a good idea, there are several problems with that. First, isn't that exactly what the standards bodies are supposed to be? And second, doesn't that just prevent any form of competition between browsers? If they are all the same, why not just pool our efforts and build a single browser? Wait, that would be a monopoly and would probably be even worse.

Can tidal power plants have an effect on the Earth's rotation?

I just read an interesting article on the project to build power plants that tap into tidal energy. It's really weird to see that article now because I was discussing that exact subject with Fabien on the Stevens Pass chairs last Saturday.

A couple comments:
- There is a power plant in France that works on this principle. It's been operating since 1966 and it's producing 550 million kWh a year. China operates eight similar plants, and Canada also has one.
- This energy is *NOT* renewable. It's basically gravitational potential energy. Those of you familiar with physics know about the principle of action and reaction, which in this case implies that tapping this energy would in return have an impact on the relative motions of the Earth and the Moon. Of course, this effect is very, very small and probably safe to ignore, but we do have a precedent: the Moon now always shows the same face to us because the dissipation of tidal energy into deformations of its crust quickly forced it into the minimal-energy position, the one where the tidal bulge in the rock always faces the direction of the tidal force. This is exactly analogous: the deformation was tapping the tidal energy, which slowed down the rotation of the Moon. How long would it take for the Earth? I didn't make any calculations, but I'm pretty sure it would be a huge number.
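For the curious, here's a rough order-of-magnitude check (my own numbers, not from the article). Approximating the Earth as a uniform sphere, its rotational kinetic energy is

```latex
E_{\mathrm{rot}} = \frac{1}{2} I \omega^2
  \approx \frac{1}{2}\left(\frac{2}{5} M R^2\right)\omega^2
  \approx \frac{1}{2}\,(9.7\times10^{37}\,\mathrm{kg\,m^2})\,(7.3\times10^{-5}\,\mathrm{s^{-1}})^2
  \approx 2.6\times10^{29}\,\mathrm{J}.
```

A plant producing 550 million kWh a year extracts about 2×10^15 J per year, so draining even a thousandth of that rotational energy at this rate would take on the order of 10^11 years. A huge number indeed.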


Virtually drive through Seattle and San Francisco using Virtual Earth

They're at it again. They've added an amazing feature to Virtual Earth:


At first it seems like we've had street-level photos like this for a while (pagesjaunes.fr is the oldest example I know of), but I've never seen anything like this, where you can actually grab the car, move it on the map and see the view from your car change in real time with amazing responsiveness. You can also drive with the keyboard. It's no PGR3, but it's damn impressive for a web application.

The duck's wake

This is the first translation I'm doing of one of my French science popularization blog posts.

When I was preparing my PhD thesis a few years ago, I was also running weekly hour-long preparation sessions with groups of three students to train them for the competitive entrance exams of the engineering schools. In these sessions, each student is given a problem that he must solve at a chalkboard.

The students were often brilliant and showed a great physical sense.

One of my favorite problems was the following: "the duck's wake". The student was supposed to figure out both the questions and the answers. Great exercise if you ask me and very revealing of the student's qualities or lack thereof. Here are some of the things you could say on this subject...

Let's simplify and suppose that the duck is a dimensionless point. When it moves on the water surface, it emits waves. Let's imagine these circular waves travel at an approximately constant speed (in reality they don't; more on that later). There are two possibilities: the duck can move slower than the waves, in which case the first wave will always be ahead of all the others, or the duck is "supersonic" and the waves will form a triangular wake. The first question you can ask yourself is to determine the relation between the angle of the wake and the speed of the duck. As could be expected, the faster the duck, the sharper the angle. The angle would be 180 degrees if the duck had the exact speed of the waves: the wavefront would be a line perpendicular to the direction of the duck and would move with it.
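To make that first question concrete (under the constant-wave-speed simplification): after a time t, a wave emitted at the start has radius ct while the duck has traveled vt, so the half-angle α of the wake satisfies

```latex
\sin\alpha = \frac{ct}{vt} = \frac{c}{v} \qquad (v > c),
```

which indeed gives a full angle of 2α = 180 degrees when v = c, and an ever sharper wake as v grows.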

The triangle that the waves' envelope creates is really a shockwave, similar to the one a supersonic plane emits. The supersonic bang is just the shockwave. Contrary to popular belief, the bang is not emitted by the plane as it passes the speed of sound; it is continually emitted as long as the plane travels faster than sound. The thing is, the places where the bang can be heard move at the speed of sound. What you hear when a supersonic plane flies by is thus, to summarize: silence as long as you're outside the sound cone, a bang as you enter it, and the noise of the plane after that.

The second question you could ask is about the distribution of energy in the shockwave. The result of the calculation is really surprising: the energy diverges and becomes infinite at the tip of the cone. This means that you would need infinite energy to go above the speed of sound. I suppose that's one reason why people used to think it was impossible. Of course, it's not really infinite in practice, just very expensive, because no plane or duck is a point.

So now we know that any object traveling faster than the waves it emits concentrates those waves into a shockwave that packs most of their energy into its envelope. This phenomenon is commonly observed for surface waves (a duck's or a boat's wake) as well as for sound (supersonic planes). Now, can it be observed for light waves?

A priori, no physical object can travel faster than the speed of light in a vacuum, so this phenomenon looks like something that would be out of the question. Nevertheless, light doesn't always travel in a vacuum. In any medium, light travels slower than the speed of light in a vacuum. It is thus possible to travel faster than light in a medium. The shockwave that the theory predicts does exist and is a commonly observed phenomenon called Cerenkov radiation. It is this radiation that is used in some neutrino detection devices.
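The same geometry gives the Cerenkov emission angle, this time measured between the particle's direction and the emitted light (the complement of the wake half-angle): in a medium of refractive index n, light travels at c/n, so for a particle of speed v = βc,

```latex
\cos\theta_C = \frac{c/n}{v} = \frac{1}{n\beta} \qquad \left(\beta > \frac{1}{n}\right).
```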

Neutrinos are subtle particles. They have a very low mass (it's only recently that we've discovered they have one at all), they carry no electrical charge, they don't participate in the strong force, and the only way they interact with anything is through improbable weak interactions (gravitation can be neglected for detection purposes, the neutrino masses being so small). Despite their being very common particles in the universe (billions of neutrinos go through us every second), they rarely interact with ordinary matter made of electrons, protons and neutrons, so their detection is very difficult. Some detectors use the Cerenkov effect. When a neutrino interacts inside the detector, a particle such as an electron can be emitted at a speed faster than that of light in the detector's medium. This particle emits light while decelerating (any accelerating charged particle emits electromagnetic waves). The resulting shockwave is then measured by detectors, from which we can deduce the trajectory of the charged particle, which in turn gives us the direction of the neutrino that created it.

This is how you can start from the duck's wake and end up discussing the detection of neutrinos. That's just one illustration of the extraordinary explanatory power of physics...

UPDATE: comments pointed out that the angle of the wake of the duck (or of a boat, or of whatever moves fast enough) does not depend on the speed of the object, as the good Lord Kelvin showed. It is constant at approximately 39 degrees. This is because the speed of the waves depends on the wavelength in such a subtle way as to cancel the dispersion that a single-wavelength wave would show. Our simplistic calculation is still perfectly valid for cases where such dispersion doesn't exist, such as Cerenkov radiation.
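For reference, Kelvin's result: for deep-water gravity waves the phase speed depends on the wavelength and the group velocity is half the phase velocity, and working through the geometry yields a wake half-angle that is independent of the boat's speed:

```latex
\alpha_K = \arcsin\frac{1}{3} \approx 19.47^\circ, \qquad 2\alpha_K \approx 39^\circ.
```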

More Posts