Archives / 2005 / May
  • FabriKam on MSDN

    A few weeks ago I ordered the FabriKam DVD. It never arrived so I guess my order was never processed or delivered. However you can get the FabriKam DVD from MSDN downloads now. It's under Tools, SDKs, and DDKs | Application Tools, SDKs, DDKs | FabriKam 3.1 and just under 2GB to download. Once downloaded you need to unzip the VPC image, which will create the .vmc and .vhd files for running under Virtual PC or Virtual Server 2005.

    I spent most of yesterday going through the VM and it looks pretty good. Lots of great source code, tons of documentation and a fully installed, working system out of the box to play with. It includes SQL Server, SharePoint, and BizTalk along with the solution files, source code, etc. Very slick. I did find some of the Web Parts were listed as being unsafe so I had to re-register them for some of the solutions to work, but that could have been me (I was running as Administrator rather than one of the FabriKam users).

    If you're looking for business solutions and ideas that use SharePoint, InfoPath, and BizTalk it's a great learning tool to start with.

    EDIT: Some people have mentioned they can't find the file. I only have a Universal MSDN account, so I'm not sure if they've made it available to Professional and other MSDN subscription levels, but you do need an MSDN subscription to get it. The path is listed above, starting under the Tools, SDKs, and DDKs menu in the MSDN downloads. You can try the path here but I'm not sure if it will work for everyone (you will need to sign in via Passport).

  • Adjusting the DisplaySize of your Fields

    Yesterday I mentioned I would post an entry on YAUSF (Yet-Another-Undocumented-SharePoint-Feature); however, I think this one will be around to stay.

    When you create a list you can specify the maximum length of Text fields. This is great and will show the user a warning message if they enter text that exceeds it. However it doesn't (always) adjust the length displayed on the screen, so most SharePoint lists look like this:

    Of course with a simple tweak in SCHEMA.XML you can have your SharePoint lists looking like this:

    The list for the first image is your typical field definition for the columns. Here's the snippet from SCHEMA.XML:

       <Field Name="Title" ReadOnly="TRUE" Required="FALSE" Hidden="TRUE"/>
       <Field Name="Name" DisplayName="Name" Type="Text" />
       <Field Name="Address" DisplayName="Address" Type="Text" />
       <Field Name="City" DisplayName="City" Type="Text" />
       <Field Name="Country" DisplayName="Country" Type="Text" />

    In SharePoint fields, there's a property called DisplaySize. Yeah, looking through the SDK you won't find it on the Field element in SCHEMA.XML. In fact, you won't find it in any SCHEMA.XML file anywhere (which might be why it's not in the SDK; unsupported perhaps?). You will find it, however, as a property on the SPField class. This might lead you to think that you have to write a custom program to adjust the display length of all your fields if you want to give your users something more. Not so.

    While I was digging around in FLDTYPES.XML yesterday and screwing around with the RenderPattern, another thing stood out. DisplaySize was a property being checked and, if it was there, it was used in the creation of the HTML tags output to the browser. So with a little tweak to SCHEMA.XML I added DisplaySize to a few fields like so:

       <Field Name="Title" ReadOnly="TRUE" Required="FALSE" Hidden="TRUE"/>
       <Field Name="Name" DisplayName="Name" Type="Text" DisplaySize="30" />
       <Field Name="Address" DisplayName="Address" Type="Text" />
       <Field Name="City" DisplayName="City" Type="Text" DisplaySize="20" />
       <Field Name="Country" DisplayName="Country" Type="Text" DisplaySize="15" />

    Now you get the much improved version of your list for your users to add and edit items in. This made sense as the property would alter the length of the field in the UI when it gets built in OWS.JS. Through the regular SharePoint interface you can specify the length of your fields. This will prevent the user from entering more than that number of characters in the field, but does it adjust the UI? It depends.

    If you create the fields as shown in the second image with the field lengths indicated, you'll get the display you see on the screen above. All done through the UI. If you don't specify a length, or you change a field so it is longer than 31 characters, it will stretch the UI display out to 50 characters wide (like the Address field above). 50 characters seems to be the max width, and while you can create fields that are 255 characters long, they'll only ever display 50 characters wide. Through the UI. If however you edit your SCHEMA.XML for the list like above and set the DisplaySize attribute to whatever you want, it will appear correctly no matter what the value is.

    So if you've been paying attention:

    • Creating columns through the UI only respects the length if it is less than 32 characters
    • Columns longer than 31 characters created through the UI will always fill out to 50 characters wide in the browser, no matter what the length
    • Specifying DisplaySize in SCHEMA.XML for a field will display correctly in the browser for any length

    Anyways, mildly bizarre and slightly odd. What you can do in site definition files and what your users can (normally) do through the UI aren't always the same.

  • Unlocking the elusive TelephonePattern... almost

    I spent a few hours this morning chasing down a new SharePoint unicorn. Unfortunately I haven't found the solution but it's close. Real close. My rule is that if I can't crack the secret in a few hours I'll post or email someone and see if someone can complete the work. 10,000 monkeys must be smarter than 1.

    Have you ever built a custom list and had a telephone field that looked like this:

    That's the normal way you have to define a telephone field. With a plain ordinary text field (or you could use a number field if you want). But there's no way to format this stuff for end users. Wouldn't it be great if you could have THIS!

    Well, you can. Or at least Microsoft was *thinking* about letting you have it. The above image isn't a custom web part. It's just a regular list but with a *secret* option. I just haven't figured out how to get it to completely work.

    Deep in the heart of a file called FLDTYPES.XML (one of those don't-touch-this-or-you-will-be-unsupported files) lies something just waiting to get out. FLDTYPES.XML defines all of those columns you can add to a list (Text, DateTime, Number, Calculated, Lookup, etc.) and it also determines the way the fields are rendered for the various forms (New, Edit, Display, etc.) through something called a RenderPattern. If you dissect the Text field's RenderPattern you'll find a switch on a property called Format. The default behaviour is to just write out some JavaScript to create a new TextField:

           <HTML><![CDATA[<SCRIPT>fld = new TextField(frm,]]></HTML>
            <Property Select="Name"/>
            <Property Select="DisplayName"/>

    This will result in writing out the following JavaScript fld = new TextField(frm, "Name", "DisplayName"); into the browser and through a method in OWS.JS it will create a text field like in Image 1 above. However if a property called Format is set to Telephone you get this instead:

    <Case Value="Telephone">
           <HTML><![CDATA[<SCRIPT>fld = new TelephonePattern(frm,]]></HTML>
            <Property Select="Name"/>
            <Property Select="DisplayName"/>

    This will result in writing out the following JavaScript fld = new TelephonePattern(frm, "Name", "DisplayName"); to the browser, which will create the second image above. Cool. However it doesn't work. So first we create our Text field in a custom list using the SCHEMA.XML and add our Format="Telephone" value:

       <Field Name="Title" DisplayName="Title" Required="TRUE"/>
       <Field Name="ContactTelephone" Type="Text" DisplayName="Telephone" Format="Telephone" />

    This gets us our form and the data entry looks good. There's even separate validators created for each part of the field so if you make the ContactTelephone field required, it will require the area code and phone number (doesn't need a country code). Saving the item will result in a nasty message from SharePoint:


    No such field name

    No field was found with that name. Check the name, and try again.


    So now what? If you take a look at OWS.JS you'll see that when it makes the call to create the TelephonePattern, it actually creates a bunch of TextFields. One for each part of the phone number (country code, area code, number). So I figured if SharePoint couldn't save it, it was because these were new fields that it didn't know how to map to the record in the list (after all, I only had one field, ContactTelephone). The code in OWS.JS seems to create new fields and give them the naming pattern of "TelephonePattern#countryCode:" + the internal name of the phone field. So I modified my list definition to create some hidden fields to hold those values like so:


       <Field Name="Title" DisplayName="Title" Required="TRUE"/>
       <Field Name="ContactTelephone" Type="Text" DisplayName="Telephone" Format="Telephone" />
       <Field Name="TelephonePattern#countryCode:ContactTelephone" Type="Text" DisplayName="Telephone" Hidden="TRUE" />
       <Field Name="TelephonePattern#nationalCode:ContactTelephone" Type="Text" DisplayName="Telephone" Hidden="TRUE" />
       <Field Name="TelephonePattern#number:ContactTelephone" Type="Text" DisplayName="Telephone" Hidden="TRUE" />

    Here I thought we were all set. I deployed the list and entered a new item. The field displayed correctly, validation happened and it saved! Nice. However when you look at the list (through something like SharePoint Explorer) you'll see the fields but they're all set to null.


    So that leaves us puzzled and moving on to other things. You can get the Telephone field to display correctly and save correctly with those fields in there. Just no values. As the Telephone format is not documented in the SDK, it's one of those things that is left up to the reader. If I use it, will it be supported? It's there in FLDTYPES.XML, which is a core file of SharePoint, but since it's not documented you might think it will be removed in a future release.
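    For reference, here's how I read the two render paths fitting together inside the Text field's RenderPattern. This is a reconstruction from the snippets above using CAML's Switch/Case/Default rendering elements; it's a sketch of the shape, not a verbatim copy of FLDTYPES.XML:

```xml
<!-- Reconstructed sketch of the Format switch in the Text field's
     RenderPattern. The Switch/Case/Default elements are standard
     CAML; the surrounding markup is abbreviated here. -->
<Switch>
  <Expr><Property Select="Format"/></Expr>
  <Case Value="Telephone">
    <HTML><![CDATA[<SCRIPT>fld = new TelephonePattern(frm,]]></HTML>
    <Property Select="Name"/>
    <Property Select="DisplayName"/>
  </Case>
  <Default>
    <HTML><![CDATA[<SCRIPT>fld = new TextField(frm,]]></HTML>
    <Property Select="Name"/>
    <Property Select="DisplayName"/>
  </Default>
</Switch>
```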


    Tomorrow I'll blog about another undocumented property which does work and I *think* should be there in the future. If anyone completes this investigation and gets the Telephone format working please drop me a note through the comments of this entry!

  • CAML and Date Formats

    Patrick beat me to the punch and posted about date formatting with CAML (it was sitting in my blog TODO pile, I *knew* I should have blogged last night). The date format alone works and will return values. I just wanted to add to his post that you can include a time in the date as well if you want to query for a particular time. For dates with time, the format of the query has to be:

       <FieldRef Name="EventDate" />
       <Value Type="DateTime">2005-05-27T16:00:00Z</Value>

    So this would return all items in a list that has a field called EventDate with a value before 4:00PM on Friday, May 27, 2005.
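    Wrapped up in its comparison and Where clause, the whole query would look something like this. The Lt (less than) operator here is my assumption from the "before" comparison described above, so treat the wrapper as a sketch:

```xml
<!-- Sketch of the full CAML query. The Where/Lt wrapper is assumed
     from the "before 4:00PM" semantics; only the FieldRef and Value
     lines come from the original snippet. -->
<Query>
  <Where>
    <Lt>
      <FieldRef Name="EventDate" />
      <Value Type="DateTime">2005-05-27T16:00:00Z</Value>
    </Lt>
  </Where>
</Query>
```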

  • Don't Modify your Output Path. Really.

    Today's read is an exercise in balance. The balance is between what Microsoft says and what Microsoft says about development, specifically creating Web Parts. Buried deep in the best practices around the build process is a quote: "Don't Alter the Build Output Path". The section that's relevant goes on to say:

    You might be tempted to alter the output paths of your projects in order to build to a single folder and then establish file references to that folder. Do not do this for the following reasons:

    • It causes the build process to fail with file lock errors when a referenced assembly exceeds 64 KB in size. This problem is likely to be fixed in a future version of Visual Studio .NET.
    • You can encounter another problem when you reference an assembly from a folder that is designated as your project's output folder. In this event, Visual Studio .NET cannot take a local copy of the assembly because the source and destination folders for the copy operation are the same. If you subsequently remove the reference from your project, Visual Studio .NET deletes the "local" working copy. Because the local folder is the same as the assembly's original folder, the original assembly is deleted.

    Any developer worth his salt and building Web Parts probably knows that buried deep in the SPS/WSS SDK is this little tidbit:

    Specify the build output path
    This task assumes that you are running Visual Studio .NET on a server running Windows SharePoint Services or SharePoint Portal Server. Setting the build output path to the C:\inetpub\wwwroot\bin folder will build your Web Part's assembly in the same location from which it will run on your server. If you are not running Visual Studio .NET on your computer, you can copy your Web Part's assembly to the folder C:\inetpub\wwwroot\bin folder on your server after building it.

    So on one hand we have Microsoft telling us not to change the output path, and on the other we have them telling us to do it. Now for most of you, changing it has never been a problem. You set it to c:\inetpub\wwwroot\bin and all is well. Your Web Part compiles and runs and you can do all kinds of things like debugging and good stuff like that. On a larger project this isn't true, so heads up to those about to approach the platform. Once airborne, it's an ugly trip back.

    On larger projects there can be lots of Web Parts. Our current project (a Contract Management System) has about 40 of them but even that might be considered small. However all the Web Parts are relatively thin (as they should be) and simply make calls to a Service Layer to retrieve Domain Objects and update the views. Nothing fancy and nothing complicated. All 40+ Web Parts compile to less than 64 KB. We have a single assembly that represents the Business and Data Access Layers. They're combined just because everything is sitting on the SharePoint server anyways, and the DAL is just a wrapper around accessing lists using the SharePoint Object Model so again it has to be local.

    It hasn't been a problem so far and we, as documented in the SDK, changed our output path to c:\inetpub\wwwroot\bin for development. Debugging and all that goodness abounds. The problem came this morning when I added a new assembly to handle communications with a third party system (SAP). SAP (via the .NET Connector) creates proxy classes for you and these can change frequently and have to be regenerated, so the assembly will change often. I didn't want to keep changing our Core assembly whenever SAP changed (because every Web Part references it) so it was split out. Herein lies the problem that surfaced. Now the system was referencing an assembly larger than 64 KB and, due to circular references in the assemblies, I couldn't avoid this. There were a few other things I tried with separating and combining classes but it's just not going to happen.

    So now it's back to resetting all 40+ projects to compile to bin\Debug again and creating an extra NAnt task to do an xcopy deployment to the c:\inetpub\wwwroot\bin directory for debugging. A bit of a PITA but it has to be done and it fixes the issue. As Microsoft says, "this will be fixed in a future version of Visual Studio .NET". I checked and compiled with Beta 2 and it seems to be fixed, but I can't fully verify it because I can't run Beta 2 against SharePoint just yet.

    Bottom line, follow the first rule and don't modify your Output path and use a build script to copy the files after a successful build. Even if your project is small today, someday it might grow. So do you want the pain now or later?
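    The post-build copy itself is nothing fancy. A minimal NAnt target along these lines does it; the target name, dependency, and paths here are illustrative, not lifted from our actual build file:

```xml
<!-- Illustrative NAnt target: copy freshly built assemblies from the
     project's normal output folder into the web application's bin
     folder after a successful build. Names and paths are made up. -->
<target name="deploy" depends="build">
  <copy todir="C:\inetpub\wwwroot\bin" overwrite="true">
    <fileset basedir="bin\Debug">
      <include name="*.dll" />
      <include name="*.pdb" />
    </fileset>
  </copy>
</target>
```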

  • ListFormWebPart and Tzunami SharePoint Designer

    Last Friday I posed the question of being able to create "more than SharePoint but less than InfoPath" input forms without having to resort to writing an entire facade on top of a SharePoint list. I didn't get any responses, but I did some digging into the WebPartPages namespace. Microsoft creates a raft of Web Parts that are used to build the SharePoint UI itself. The ListFormWebPart and ListViewWebPart are particularly interesting as you just point them at a list and they're supposed to handle all the heavy lifting of generating forms and views. And they do. All of the .aspx pages for lists use them, so that's how all the extra columns you add to lists and document libraries appear when you add/edit an item or document. The ListFormWebPart just automagically creates the form on the fly by interrogating the list, getting the fields and their datatypes, and creating the various controls (TextBox, CheckBox, Dropdown, Calendar, etc.) for you.

    Anyways, I thought I would be clever (I always think I'm clever when I do these things) and create one of these things (they're sealed so you can't inherit your own Web Part from them) and render it in my own Web Part. Before the render (in the pre-render event) I could just grab the contents of the Web Part and make some "on-the-fly" modifications (like joining up dropdowns, etc.). That idea is a bust because, as hard as I try, I can't do anything with the ListFormWebPart once it's been created. I can create it fine, set the list name and all that jazz, but it always seems to render an empty Web Part. Poking around inside Reflector didn't show me much either (is SharePoint really that proprietary that Microsoft has to obfuscate the code?) so I'm at a loss how to tap these classes. Even though they're sealed, I don't recall anywhere Microsoft saying you can't use them (some classes are marked for internal use only, but not these ones). Just use them for good, not evil. Anyways, the quest goes on and I'll continue to post my findings here as I try to discover a value for them outside of what SharePoint does with them.

    Tzunami SharePoint Designer. I spent most of my Memorial/Victoria Day looking at this new tool. Tzunami used to be the old K-Wise guys (there were some tools for SharePoint 2001 from them like K-Doc if I recall). The Designer is meant to be a tool where you can basically build up your entire portal structure offline, saving it in various iterations to a local file (after sucking down the initial site from a SharePoint server), then committing those changes back to SharePoint to create everything. The tool is just coming out and has some growing to do, but my initial reaction is pretty good. There are a couple of things that I noticed with this version:

    1. There's a Disconnect option but it seems to clear the project's contents. So you're really always connected to the server but it's not like being in the web interface where everything you do is committed immediately. With Designer you must invoke the commit through a menu which then hooks up and creates/updates all your lists and whatnot. Personally I think the model isn't quite right. I would rather be able to work completely offline and then sync up (say when I got back to the office after spending 300 hours on a plane ride designing my portal). Maybe that's a feature to come. It seems like it could support it.
    2. There are a bunch of features that are grayed out in this version like copy and move but I'm told they're coming.
    3. There are some elements of editing things like document libraries (like say exposing it on the Quick Launch) that are not present in the interface. Again, I'm told these are coming.

    The biggest thing that is mulling in my noodle is the fact that you can alter creation and modification dates on things like lists and list items. I need to do some digging to see how they're doing this because the properties through the Object Model are read-only so methinks they're doing something down at the database layer because I can't see how else they accomplish this. That bothers me for two reasons. First, never ever ever ever ever (did I say ever) touch the database directly. Period. Do not pass Go. Do not collect your versioned documents. There's just so much going on behind the scenes with transactions and logging and synchronization and other SQL stuff oh my. Even if you go through the documented stored procs Microsoft just won't support you and there are so many things that can go very, very bad doing this.

    Second, while there are a lot of people that jump up and scream when they try to migrate their data into SharePoint and generally rant about how it can't retain the original dates and such, I personally believe there's good reason why those things are read-only for developers in the Object Model. Basically it's a CYA thing. Yes, your developers do need access to be able to make changes to a site, but do you really want them to have the ability to alter that information once it's set? With all the SOX stuff going on, it's a good thing that when you write an item to a list you'll always know who created it and when it was last modified. Hold on, now with a single tool I can alter history! I can say "No, you didn't create that document on Monday May 13, 2001. Bob created it on Friday June 3, 2002." No, that's not a feature I want to enable. As well, there's a question of data integrity. SharePoint doesn't have a lot of it. I can delete a lookup value in a list or a user from Active Directory and, depending on the phase of the moon, the information may or may not be there later. However I could always rest assured that something silly would never happen, like a modification date being set earlier than a document's creation date. Well, that's out the window now with this tool, and if I did have any aging reports I created, those are probably going to take some explaining now.

    Anyways, if you're into cutting-edge tools and don't want to wait you can contact them to get a trial version. Like I said, there are some features missing and some features you might not want to have. The support guys are excellent and responsive so if you do have any questions they'll be happy to answer. The trial runs for something like 7 days and has a limited number of commits you can make to the server.

  • Conditional Mandatory Fields and Inter-form Filtering

    No, I'm not having that big neurotic breakdown you've all been expecting (although from the title one has to wonder). I'm on a quest. A quest for knowledge. A quest for an answer. Before I head off to dig into some Microsofties' gray matter, I thought I would throw these questions out there for anyone to pick at.

    SharePoint forms are pretty simple things. You define columns in a list and the ListFormWebPart will spit out a data entry form for all the fields, complete with the contract you created for each field (mandatory, html enabled, drop down list vs. checkboxes, etc.). This is great but it does have its limitations, namely two things you can do fairly easily with regular ASP.NET forms.

    The first is the idea of a conditional mandatory field (or mandatory conditional if you prefer). Suppose you want the user to enter a Company name but only if they check a checkbox on the form first. This is handled by a simple event handler (CheckedChanged on the checkbox) in the Web Form. However there's no interface in SharePoint to hook into the ListFormWebPart this way (well, none that I know of). So you either define the field as mandatory or not. The alternative is to build yourself an ASP.NET form in a custom Web Part (or use SmartPart with a User Control), do the coding yourself and then write the results out to your SharePoint list, but now you're in development land rather than configuration land and it's expensive.

    The second is filtering interdependent lists. There have been a few posts I've seen on this. Probably the best one I've seen is this one by Patrick Tisseghem on his blog. Suppose you have one list (Country) that needs to filter another (Province/State) which needs to filter yet another (City). Typical scenario. You don't want to show the list of every city to someone to pick from. You might also have a business condition where you don't want to present lists of information to some users based on some condition that's outside the scope of your SharePoint environment (say a corporate directory of who reports to whom). Anyways, again it's a fairly simple thing to do with regular ASP.NET by hooking into a change event on the list and rebinding the data to the lookup. Patrick's blog is a great tip, however it does demand that you know the Guid for the list and runtime things like that to make it work. If you're trying to bake a solution into something like a SCHEMA.XML for a list you usually end up running into dead ends (see my frustration with not being able to define Lookup fields in the list definition for more on this).

    Guids are a big part of SharePoint. Every site, every web, and every list has a Guid to identify it. The trouble with Guids is that they're only defined at runtime, meaning you either need to suck them out using CAML (which I think I've seen done in some posts by Ian Morrish before) or drop an albeit small but custom-written-nonetheless Web Part onto a page to get what you need. And to tie in with the "don't touch the system files in SharePoint or you'll be unsupported" thing that has reared its head recently, you won't be able to do this very well with stock lists or admin pages.

    Looking at the ListFormWebPart (and the other classes in the Microsoft.SharePoint.WebPartPages namespace) it may be possible to override the GetData and RenderWebPart methods to do something funky before it hits the page. Anyways, if anyone has some interesting ideas for solving these it would be good to hear about them. It may be that custom solutions (Web Parts) are what you need here, but that's sometimes overkill. I'll post whatever results come out of discussions with others as we come up with them. Cheers!

  • This wasn't the excuse note you were looking for

    Dear Employer,

    Please excuse Bil Simser from work on Thursday, May 19. He is not feeling well. Bil is at home in bed for the entire day, nursing what appears to be a serious hamster attack. Bil’s illness is in no way, shape or form related to the premiere of the final installment of the greatest story ever, which, coincidentally, premieres on the same date.

    While I can neither confirm nor deny that Bil has called my company, Geek Squad, asking to be set up with wireless access “in case of a space opera-related sick day”, know that if you do receive an e-mail from your prized employee today, it is most likely because he was wise enough to plan ahead in the event of illness.

    But as I mentioned before, Bil is at home, safely in bed, but reachable (in dire emergencies) by e-mail or cell.

    One more thing. Beginning at 12:00PM MST, Bil Simser will be unreachable for about two hours, thirteen minutes and eleven seconds. He will be feeling really bad at this time.

    This wasn’t the excuse note you were looking for,
    Robert Stephens
    Geek Squad — Chief Inspector

  • Someone changed my ONET.XML!

    Okay, so there's some buzz going around based on a recent Knowledge Base article (KB# 898631) from Microsoft on supported and unsupported scenarios with custom site definitions. Some people are upset at the scenarios and say that the reason why we're using sitedefs (vs .stp files) is that we can apply the changes against existing sites to "automagically" update a site. Heather Solomon (new to my SharePoint blogroll) has a great reference blog on what directories contain what files and where do they end up when a site/area gets created. Nice stuff and very handy.

    Anyways, here's some scenarios that I ran into a few times with some guidelines on how far you can push the sitedef envelope (even if it is unsupported).

    Modifying the default setups. This is completely unsupported, and while I do agree (after all, I wouldn't want someone to create a new portal and have things missing that should be there), it does create an age-old problem we've had with SharePoint. You can't create a new portal with custom areas. Sure you can create new areas by cloning existing ones, but since you can't modify the default setups, all portals will always start exactly the same for everyone. The real crux is for organizations that want to customize the My Site so all new My Sites will get something more "corporate" with the standard stuff. After all, I can create a new default team site by copying the STS directory and messing with all the options (say I don't want people to be able to create discussion areas, no problem). However I can't touch My Site because it's part of the base system and thus unsupported.

    There's another kicker to modifying default setups. Like I said with the Portal, these are default setups. I don't recall seeing any hardcoded values anywhere that depend on these things being there, but if I was supporting a product I sure wouldn't want someone calling me if they messed with my control pages. That's like screwing with a .NET assembly, removing bits and pieces of it, and recompiling it back in the hope it still works. The only thing is that you don't need to decompile the schema files that make up a SharePoint site because they're all there in front of you, just a flick of Notepad away from becoming your worst enemy. So okay, I'll buy not modifying the default setups, but for the love of all that is holy, can we at least modify My Site and have a choice of which portal definition we use when creating a new portal?

    The other non-supported scenario is that of updating a sitedef once sites have been created. I'm on the fence with this because of doing a total fubar on an existing site (which I've done many times, it's not pretty). I can understand the need to put a stake in the ground and say this is unsupported. Like I said, I've messed up sites by doing a redeploy over top of existing ones and boy did it hurt. There were points where I *had* to open the thing up in FrontPage just to delete it. Modifying sites in-place (via an updated sitedef) is a bit of a sore spot because, as Serge van den Oever put it on his blog, the reason why we use sitedefs is so we can have more flexibility with the creation of a site (sans programming) and update it easily.

    Here's the scenario I ran into with doing this. Create a list and define a field to be a DateTime field (don't get me started on Lookup fields again). Now sometime later you decide to change the DateTime field to a Text field. Boom. You try to revisit a list that uses that field and edit it and you'll be in a world of hurt. I don't have the exact DON'T DO THIS list of what fields can and cannot be transformed from and to (anyone game to put one together?) but basically changing types can be bad. They sometimes can work. For example going from a numeric field (as long as you don't specify decimals, etc.) to Text is sometimes okay. Going from Text to, say, Choice sometimes works. Again, there are some scenarios that work and some that just plain put you in reboot-server land, you are screwed, do not pass Go. This is due to the fact that behind the scenes it's not a 1:1 mapping of the column you create to a column in the database, and all that metadata in all those tables sometimes just gets really, really confused.

    Adding new fields (or adding anything). This isn't so much of a problem. Adding new fields to SCHEMA.XML generally works with no ill effects (and I have yet to create a situation where it caused any). After all, you are just adding more metadata about the lists, so when a list renders its add/edit form it's just another field to display. The only drawback here is if you make a new field mandatory you can end up in a situation where your list's metadata isn't valid, because the field will just be blank for any existing items.
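    As a sketch of that last point, giving a new required field a default value keeps pre-existing (blank) items from tripping validation when they're next edited. The field name and default here are made up for illustration:

```xml
<!-- Hypothetical field added to an existing list's SCHEMA.XML.
     The Default element supplies a value so items created before
     this field existed don't fail the Required check on edit. -->
<Field Name="Region" DisplayName="Region" Type="Text" Required="TRUE">
  <Default>Unknown</Default>
</Field>
```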

    Of course, any of these scenarios, good, bad, or otherwise, are officially unsupported according to the KB article, so buyer beware. What's odd is that they do support modifying sites via FrontPage. While I can only mess up one site at a time with FP, I can really mess it up with a simple removal of some DHTML or Web Part Zone tags (read: make the site unrenderable to the point where you can't fix it). So that doesn't make sense to me.

    P.S. I also stumbled across a nasty gotcha recently around missing files and the execution of ONET.XML. In ONET.XML you can specify files to copy to your new site when it gets created (again, referring back to the body of this blog, this section of ONET.XML only gets executed once at site creation, hence why you can't update it and expect your sites to automatically be populated with new files that weren't there before). Anyway, in the Modules section of ONET you can specify files to copy and where to copy them to. This is great, but you'll get a nasty message as ONET is being executed when it hits a file it can't find. Yes, it's a file-not-found message in the new SharePoint Blue Screen of Death, which has practically no information at all. I hit this while we were creating new sites with dozens of custom lists, dozens more lists being generated, and even more dozens of files being copied. Which file was it that was not found? The FileIOException that gets thrown if you're doing this programmatically contains a property called, what do you know, FileName. So why can't the IIS log, the SharePoint log, or even the SharePoint page itself tell me what file it can't find? I had to hunt and peck through my entire system until I found the culprit. In any case, I closed the ticket with Microsoft and asked them to kindly add more informative error messages, especially when they know the context of the message. It's akin to buying something and your bank or the retailer saying "Not enough funds". So how much more do I need? I'm just looking for a little break here.
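    To illustrate what I mean (the module and file names below are made up for this example, not from our actual project), a Modules section in ONET.XML looks something like this; if any <File> entry points at a file that isn't on disk in the site definition's directory, site creation blows up without ever naming the file:

       <Modules>
          <Module Name="Default" Url="" Path="">
             <File Url="default.aspx" NavBarHome="True" />
             <!-- If oops.aspx doesn't exist on disk, boom: generic error, no filename -->
             <File Url="oops.aspx" />
          </Module>
       </Modules>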

  • WikiSharePoint!

    Came into the office this weekend (yeah, I need a life) and found a blog posting from last Thursday by Mart Muller. He's put together a set of Web Parts that provide a Wikipedia like experience but hosted on your SharePoint site.

    First there's the Search Results Web Part. This displays any hits on the term you search for using the Wiki Search Web Part. It has links to the content of the search results and will also automatically hyperlink phrases that match other Wiki entries. There's also a Tree View of the Wiki entries so you can navigate through preferred and variant terms. The whole thing is hosted in one custom list (a thesaurus) with logic behind it to parse out the terms and serve up the information all linked together with like terms. Adding an entry is easy and just like filling out any other SharePoint list.

    I installed this quickly and easily in our dev environment to give it a whirl. A couple of minutes later I created a site to host the web parts (you can also create an area on a portal and it works the same) and added a few terms. Simple to implement and workable. Even though Mart says this is beta, it seems to be pretty solid and usable and can only get better. Combine this with Jim Duncan's cBlog Templates and you've got yourself a pretty powerful set of services all hosted using SharePoint.

    There have been requests before about creating KBs or other such tools with SharePoint, and we've all told people it's doable but just needs a little work to bring things together. I can see this as a great start at a knowledge base in your organization. Just create an area on your Portal called Knowledge Base, drop the Web Parts on, and let everyone start adding terms. The great thing is that there are lots of ways to cross-link the information, and terms, once entered, will automatically hyperlink to other terms in their bodies, so it saves you having to manually create links. So check out the Web Parts here and let Mart know what you think.

  • The pain of creating lookup fields

    The SharePoint UI is great for end users. With very little training they can go off and create new custom lists and through the magic of a Lookup field they can create lookups into other lists and life is grand. For the developer though life is a rotten bag of apples when it comes to Lookup fields.

    There are two ways to create new fields in SharePoint sites: you can define them through Xml or create them programmatically. With the Xml definitions, it's a matter of copying the CUSTLIST definition (which is just a simple empty list with a single field, Title) to your new definition and adding fields. Here's the definition for a new text field:

       <Field Name="MyField" DisplayName="My Special Field" Type="Text" />

    Simple and easy. The <Field> tag is defined in SCHEMA.XML for your custom list and supports all the tags that are in the SDK documentation. Well, almost. The Lookup type is there and so if you wanted to define it you would think you do something like this:

       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" />

    The documentation says that the List and ShowField attributes can be used with this type. The List attribute is described as the internal name of the list (which we would assume is the list we want to look up values from). The ShowField attribute is the field name to display, used to override the default (Title) and display another field from the external list. There's also another attribute called FieldRef, which is the name of another field to which the field refers, such as for a Lookup field. All in all it's very confusing, but you would think you could do this:

       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" List="MyLookupList" />

    And if you don't want the lookup to use the Title Field in MyLookupList then you can use:

       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" List="MyLookupList" FieldRef="LookupFieldName" />

    So let's put this to test and have some real data. Let's create two custom lists called Employee and Department. Each entry in the Employee list has a Lookup field that points to the Name field in the Department list. Here's the Lookup field definition in our Employee list:

       <Field Name="Department" DisplayName="Department Name" Type="Lookup" List="Department" FieldRef="Name" />

    However if you create your lists you'll notice two things. First, if you add an item to your Employee list (the one with the Lookup field in it) you'll see there are no choices available for Department (even assuming you added values to that list first). Second, if you try to modify the Lookup field through the UI you get this nasty message:

    Guid should contain 32 digits with 4 dashes (xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx).

    So what gives? Simple. The List attribute, while documented as Text, is not the name of the list. It's the Guid of the list (in the form listed above). The problem, of course, is that Guids are unique and only known after they're generated. There's nothing in an Xml file (no matter how great the Xml file might be) that can dynamically retrieve the Guid. So Lookup fields, IMHO, can't be used in SCHEMA.XML because they have to reference the Guid of the list, and that's not known until the list is created first (feel free to jump in and correct me if I'm wrong).
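    For the record, here's what the Lookup field definition would have to look like to actually resolve (the Guid value below is obviously invented for illustration; you'd need the real Guid of the already-created list):

       <Field Name="MyLookupField" DisplayName="My Special Field" Type="Lookup" List="{29d2c7e7-97ad-40d8-abc1-6e1372e4ad1a}" ShowField="Title" />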

    Okay, if we can't use SCHEMA.XML to do this, we can write code. Yes, beautiful glorious code. If you have a Lookup field and you retrieve the raw text from it, it looks like this:

    42;#Information Services

    The 42 refers to (besides the answer to life, the universe, and everything) the ID of the item in whatever list you're looking up. When you retrieve the lookup value you get "42;#Information Services" which you're going to have to transform with a simple little RegEx call if you want to show it to a user.
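    As a quick sketch (my own throwaway code, not anything from the SDK), splitting the raw value apart is a one-liner with Regex (requires using System.Text.RegularExpressions):

       // Split "42;#Information Services" into its ID and display text
       Match m = Regex.Match("42;#Information Services", @"^(\d+);#(.*)$");
       int id = int.Parse(m.Groups[1].Value);    // 42
       string text = m.Groups[2].Value;          // "Information Services"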

    So now you're thinking if I can retrieve it and get "42;#Information Services" I should set it the same way right? Nope. What you need to do is set the Lookup field with the ID of the value it's looking up in the other list. Internally when you set that, SharePoint will do a join and retrieve the textual representation of the lookup information and save it for you.

    Okay, some code to explain all this. This assumes that the site is created with both an Employee and Department list. This snippet will:

    • Add a new Lookup field called Department to our Employee List
    • Fill in some imaginary Department Names
    • Fill in some imaginary Employees that report to various Departments

    private void CreateLookup()
    {
        SPSite site = new SPSite("http://localhost/sites/employee");
        SPWeb web = site.OpenWeb();

        // Get the Department List from the web for lookups
        SPList departmentList = web.Lists["Department"];

        // Get the Employee List from the web
        SPList employeeList = web.Lists["Employees"];

        // Add a new lookup field to the Employee list called Department
        // that will use the Department list for its values
        employeeList.Fields.AddLookup("Department", departmentList.ID, false);

        // Create 2 new departments in the Department list for lookups
        AddDepartment(departmentList, "Information Services");
        AddDepartment(departmentList, "Finance");

        // Now create 5 employees with lookups into each Department
        AddEmployee(employeeList, "Mickey Mouse", departmentList, "Information Services");
        AddEmployee(employeeList, "Goofy", departmentList, "Finance");
        AddEmployee(employeeList, "Donald Duck", departmentList, "Information Services");
        AddEmployee(employeeList, "Daisy Duck", departmentList, "Information Services");
        AddEmployee(employeeList, "Minnie Mouse", departmentList, "Information Services");

        // Cleanup and dispose of the web and site
        web.Dispose();
        site.Dispose();
    }

    private void AddDepartment(SPList list, string name)
    {
        SPListItem newDepartmentItem = list.Items.Add();
        newDepartmentItem["Title"] = name;
        newDepartmentItem.Update();
    }

    private void AddEmployee(SPList list, string name, SPList deptList, string deptName)
    {
        SPListItem newEmployeeItem = list.Items.Add();
        newEmployeeItem["Title"] = name;
        newEmployeeItem["Department"] = FindDepartmentByName(deptList, deptName);
        newEmployeeItem.Update();
    }

    private int FindDepartmentByName(SPList list, string name)
    {
        int itemId = 0;
        SPQuery query = new SPQuery();
        query.Query = "<Where><Eq><FieldRef Name='Title'/><Value Type='Text'>" + name + "</Value></Eq></Where>";
        SPListItemCollection items = list.GetItems(query);
        if (items.Count == 1)
            itemId = items[0].ID;
        return itemId;
    }
    The trick here is that you need to retrieve the ID of the item in the lookup list based on its name, then set that ID in the other list. This is done with a simple call to the GetItems method on the list we're searching. There are other ways to do this; for example, if you have a small list you can load it up into a Hashtable with the name as the key and the ID as the value. Whatever works for you, as the query call can be expensive; you wouldn't want to do this for everything, but if you just need it for a report or some data loading it's not too bad. Now when you look at your Employee record you'll see it's got a hyperlink to the Name field in the Department list. 15 seconds of work for a user in the UI, a couple of hours for you in Visual Studio. Enjoy.
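    For completeness, here's a sketch of that Hashtable approach (again, my own illustrative code, names invented): load the lookup list once, then resolve names with no further queries.

       // Cache the Department list once: Title -> item ID
       private Hashtable BuildDepartmentCache(SPList list)
       {
           Hashtable cache = new Hashtable();
           foreach (SPListItem item in list.Items)
           {
               cache[(string)item["Title"]] = item.ID;
           }
           return cache;
       }

       // Then in AddEmployee, instead of calling FindDepartmentByName:
       //    newEmployeeItem["Department"] = (int)cache[deptName];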

  • Get the GAT!

    If you start a lot of projects you might follow a structure or pattern to how your solution is put together. Mike Roberts has a great blog here on how ThoughtWorks generally sets up their project development tree. They also put together a simple tool to do this (Tree Surgeon). We follow a similar structure, but it's usually all done manually (usually by the lead developer or Solution Architect) at the start of the project. There are some variations on where things go, etc. and some things are driven by the demand of the application. For example if you're not accessing a database directly (like in a SharePoint application where the Database is the SharePoint Object Model) then you don't need a Database project with SQL/Oracle/etc. connections and scripts. Stuff like that makes it sometimes cumbersome to set things up and it usually takes a few hours to get everything just right and it will vary from person to person and group to group on preferences of how things are organized.

    If you follow the Patterns & Practices group over at Microsoft, you'll know they've been working on a lot of cool things. They consider themselves in the guidance business and not creating cool tools (this is the same group responsible for Enterprise Library, reference architectures and patterns and other neat stuff). Now imagine if there were a pattern you could follow based on policies and architecture your organization wants/needs/desires to adhere to. Imagine if you could just fill out a single Wizard page and have that entire solution generated for you in minutes. And that every project at your organization followed the same pattern no matter what the specifics of the project were so if you moved from project to project (or worked on multiple projects at the same time) you would immediately know where everything was and where it was supposed to go. The GAT will help you get there (and more).

    Tom Hollander has announced that the Guidance Automation Toolkit (GAT) is now available for download. This download is for Visual Studio 2005 Beta 2 and you'll also need the Guidance Automation Extensions installed (available on the same site). Think of the GAT as the old Visual Studio Enterprise Templates on steroids (and beyond). Anyone who built VS 2003 templates knows some of the pain and suffering involved. The GAT makes this easy(er), but it's more than just templates: you can implement policies and patterns, and the whole thing fits into the bigger Software Factories approach (although we're not sure how yet, I can guess a few things that could be done in this space). Anyways, if you're into it, Get the GAT!

  • CDI Technology Briefing in Calgary

    I'm off this morning for the better part of the day to hook up with Eli Robilard and CDI on their Technology Briefing Tour which hits Calgary today. Eli got in last night and we'll be chatting up on SharePointy type stuff. If you're registered with the class hope to see you there at the Westin Hotel today. Cheers!

    PS Wow, did I ever blabber on Monday about WOW and 64bit stuff. Okay, blackmail material if you ever want to embarrass me at the next presentation I give.

  • Drinking the 64bit Kool-Aid and knowing how it's made

    I had a painful but interesting experience the last couple of weeks. To set the stage, I currently run 6 working systems at home all networked and all with various purposes (Linux firewall, Programming box, Graphics rendering workstation, File Server, Games machine, etc.) My programming box (XP Pro) was feeling a little under the weather and there was a sale on at Memory Express for a 64bit Athlon CPU. With the 64bit version of XP released I thought it was a good enough reason to upgrade and try out life in the fast lane. Warning techie-geek speak ahead.

    So off I went and purchased an Athlon 3000+ with a Gigabyte GA-K8NS Ultra-939 Pro board. I grabbed an image of XP 64bit off of MSDN, burnt a copy, and proceeded to format a new drive for the OS. The install went surprisingly well and all I had to do was install some beta drivers for the sound card. Everything else worked perfectly (including the dual LAN and SATA controllers on the mobo) and the system was flying. My intent was to try out the whole 64bit development thingy, so after a lengthy install of Visual Studio 2003 (and 2005) I went ahead and tried some builds of larger systems I had, targeted specifically at x64. Again, everything went well, although debugging is hosed in 64bit. It seems the perf tools don't work on x64: the monitor was unable to start the kernel mode driver (VSPerfDrv.sys), and during sampling it would throw an error saying "Profiling WOW64 processes is not supported by this version of the profiling tools". I was also hoping to get Virtual PC (32bit) running on XP 64 to see how it performed and if I could do some silly things like run a 64bit OS (like Windows 2003 Server) in a VM hosted on a 64bit OS via a 32bit application (say that 3 times fast).

    However the biggest problem with x64 is the drivers. All drivers have to be 64bit (read: have to be, not should be) and while this is true for most of the big companies (ATI, etc.), others only have beta versions out or no support at all (the on-board sound driver was beta). There's a CD that comes with the Gigabyte board with some goodies on it (flash BIOS, CPU tuner, etc.) but it wouldn't run at all on a 64bit OS. Why would they package this with a 64bit motherboard? Surely they knew someone would run it on a 64bit OS, so wouldn't they have tested it? Basically it came down to having to use a very limited set of hardware until the drivers are available, and while the performance was there it really was hardly noticeable. I think if I were running Windows 2003 Server 64bit with SQL Server (a notorious CPU hog) that probably would have shown better results, but for a desktop OS I just don't see the value.

    In any case, I wasn't going to continue with the x64 version, so I flipped back to using my trusty 32bit Pro edition. Windows has this wonderful feature where, after enough significant change in the hardware occurs, it wants you to re-activate itself. A change of motherboard did that. Another problem is that with my MSDN subscription, I only have 5 activations available. Like Bill Gates and his wonderful "640K ought to be enough for anybody" statement, I'm sure the MSDN Licensing guys have the same attitude: nobody will ever need more than 5 activations on an OS. Probably true when you're running 1 machine. Run 5 and change out your hardware every month because you have nothing better to do with your life, and all bets are off. So now I was SOL with no more XP activations left. A few calls, some begging and whining, some digging, and I finally resorted to dropping back to XP Home since I hadn't used any licenses for it in my mini-Norad setup. Of course there was no way I was going to develop on an XP Home Edition box, so more shuffling, imaging, backups, ghosting, lots of coffee, and I finally have a setup that works. Took about 2 weeks in total but we're back and running all 32bit OSes again.

    Now to get to the "how it's made" part. Rory Blyth had an interesting blog about whether viewing the Windows source would actually help anyone. Being an MVP, we have (after giving up our first born and the location of the secret Kennedy CIA files) the ability to participate in the Shared Source Initiative from Microsoft, which would give us access to the codebase. However, I have a hard time thinking it would actually be beneficial to anyone (except *maybe* someone building embedded systems). One of the big arguments you get from the open source community is that Linux source code is available and Windows isn't, and if a company was in trouble using Linux they could just crack open the code to see what was going on and fix it. Okay, sorry, but I really can't buy that. I've been in the Linux codebase, and while it's somewhat organized, it's far from something you just crack open to fix. It's like anyone who might know something about cars being asked to rebuild an intake manifold (including all the extra gotchas like getting the timing of the system right, etc.). Yes, I really don't know much about cars, but I know more about operating systems and code, and I'm sure not going to start messing with kernels (Linux, Windows, or otherwise) to see how something works. It's not like tweaking an algorithm in a business application to see what number appears on the screen next. I think it's one of those myths: "Oh, if only I had the source code I could be so much more productive" or "I wish I knew what was really going on".

    Sure, I'd be the first guy to say I'd be curious to see the source code behind SharePoint, and with a copy of Reflector you can see how some things are done (just blur your eyes on the obfuscated parts), but it's really just curiosity over anything else. Take a peek at the Windows CE source code that Microsoft did release and tell me that you're a better person and it would have saved you all kinds of headaches "if you only knew". If anything I would see it as potentially a learning thing (and given some of the code I have seen, it might be a learning platform for how not to write code), but other than curiosity or a "how did they do it, because it works, so why not reuse it" attitude, I can't see a need for it. Give me a component that works and has sufficient documentation any day over ripping into the code to see how things work. I'd rather be a consumer of a serviceable part than know how the guy that wires the low-level stuff together did it.

    Intellectual property? Any corporation that produces a software product has invested oodles of cash into building it, but do you think for one minute that if a new "revolutionary" feature showed up in, say, Tiger (say a Networking Wizard, surprisingly similar to what Windows does) people wouldn't start looking and lawyering up to see the code that produced this miracle? There was a statement that Microsoft would lose its competitive edge if it released its source code. I don't know about you, but I can't imagine that happening from a release of source code. Microsoft's "competitive edge" is sheer quantity of installed systems. You get that with 90% of the desktop market, and just because source code is out doesn't mean others would start building better or more competitive systems. It doesn't take a genius to figure out how to build a feature just by looking at how someone else did it. That's how the masters have been creating works of art for thousands of years.

    Yes there's been many times I've been asked at work about how something works and the answer was "I don't know, I guess that's how Microsoft did it". Had we been able to see what was under the hood would we have been able to know what's going on? Not likely. It takes some serious skill just to know what's really working in your own application, let alone having hundreds of thousands of lines of source code that you didn't create to wade through. I have yet to see a Linux developer (again, unless they're building embedded systems) do end-to-end debugging into the heart of the lion because something wasn't working quite right. I doubt any Windows application developer would do the same and I don't buy the argument that the world would be a better place if the source code was available. I just don't see enough data points to convince me but your mileage may vary.

    Okay, enough of my soapbox. Back to SharePoint development this week.

    UPDATE: After waking up I realize that having access to the Windows source has absolutely nothing to do with XP 64bit Edition. I'm not sure why I thought there was any connection there at all. Chalk this up to "insane man blogging" syndrome.

  • VS2005 Class Designer and Exporting Images

    One of my favorite features with the Beta 2 update to Visual Studio 2005 was the ability to export your Class Diagrams as images. Yes, it's simple but you won't believe how useful it's been lately sucking code in from a 2003 project, automagically creating the Class Diagrams (thank you Microsoft!) and exporting it to communicate to the development team.

    Dmitriy Vasyura of the Visual Studio Class Designer Team has an excellent post on the ClassDesigner's WebLog that goes into great depths on preparing the diagram, making use of comment shapes to annotate the diagram, documentation scenarios, and of course exporting the images. Check out the post here for more info.

  • What is an Application in SOA?

    A colleague of mine recently turned me onto Clemens Vasters' blog (Thanks Rob!). Clemens' latest post was entitled "SOA" doesn't really exist, does it? It led to some interesting thought, but I'm not convinced that any architecture really exists in the true sense of the word when it comes to "SOA". I did agree with most of what he had to say, and that most people use "SOA" as a label for the engineering components of "SO", which really isn't what Service Oriented Architecture is all about.

    Another perspective to overlay on this discussion is the notion of an "application" in SOA. In SOA you have a federation of co-operating services that are orchestrated to achieve a well defined goal or objective. This is a completely different architectural model from what we have traditionally considered an "application" to be.
    Traditional application models grew from the notions of task automation that originated in the late 60s and 70s. They were narrow and deep in their scope, and typically bounded by an organization unit's accountabilities. This led to a vertically organized monolith of function all the way from the UI down to the database calls. This has turned out to be somewhat limiting from the current perspective of how businesses want to operate; companies are generally larger, more diverse, and re-organize more frequently than was common at the time these models were developed.
    When applying SOA it makes more sense to bundle services by affinity of capability rather than by "set of tasks" or "org unit accountabilities". In some ways this is similar to the data concept of organizing data by "subject area" rather than by application use. For example, if you think about the original implementation of SAP, it was focused on the needs of the Finance Department, with "transaction extensions" for other organization units. Some failings of this model were that SAP was an "intrusive burden" to org units other than Finance, and it couldn't support decentralized business models (just ask any of the decentralized companies that tried to implement it!). Today SAP is re-focusing and exposing its core services to be consumed by other applications.
    But what if the architects of SAP had applied SOA from the get-go? The actual architecture would have been defined as several aggregations of financial capabilities (like accounts, money movements, and so on) with appropriate service contracts. It would also have included a collection of capabilities needed by the finance functions of a typical organization, again with appropriate service contracts. It would have included some UI capabilities (Web Parts if they were really on the ball) that could consume these services (and indeed other services from different providers) and have the "appearance" of a traditional "application". However, as architects we would recognize that what the user thinks is the "application" is really just a thin facade over the orchestrations of functions they need. An important additional capability is to easily orchestrate other services that are not part of SAP (for example, a service with AMEX to get your credit card statements that can augment expense reports).
    So the idea of SOA is to design in this set of patterns and paradigms from the initiation of the concept. Thoughts? Feedback? Pizza?

  • Back at work after coughing up a lung and hey, check out SPDesigner

    I'm back at work after a few days off (of course, getting sick over the weekend is no fun). I thought my lungs were going to collapse on Friday so I headed home early and stayed in bed all weekend taking every conceivable legal cough/cold remedy known to mankind. After 72 hours of wandering in and out of consciousness I think I'm pretty much healed, so back to the grind.

    By way of Daniel McPherson and his blog I found a very cool new (beta) tool by James Milne called SharePoint Style Designer (or just SPDesigner or SPSkin, depending on where you look). It's basically a simple way to create your CSS files (one for SPS, one for a custom theme in a WSS site). Rather than plodding through the CSS files yourself with something like Top Style (my choice of CSS tool), James' tool breaks the sections down and gives you just what you need to configure. Very cool and very slick.

    You can view the tool online here. You can tweak the values in each category and view the resulting CSS. You can even work your style online and submit it for future versions of the Style Designer. I can't seem to find a way to download it and use it locally but it's currently in beta and perhaps not available. In any case, check it out if you're looking to change the look and feel of your SharePoint installations.