May 2007 - Posts

VSTS for Database Professionals on my current project
Monday, May 28, 2007 11:24 PM

On a current project I was interested in using VSTS for Database Professionals to see if it could add any value to database management. Traditionally that's been a very manual process, as Visual Studio has not had any real support for it.

I also knew that Nick Weinholt is due to present on all things data related, one of which is VSTS for DB Pros, at the next SDNUG meeting next Thursday, June 7th.

So I pinged Nick on MSN IM to get some of the early lowdown on the product. He quite liked it, and pointed out a few of its strengths: database schema comparisons (and by that I mean comparing one db schema to another db schema OR to the schema defined in your Db project - it's just another comparison), data generation, current data script generation, playing nice with source control tools, and full recognition of all the database aspects and objects.

So I installed it and started playing. All in all it looks very cool. Very script oriented, in that everything works against .SQL scripts. There are .SQL scripts that represent all the objects within your DB, and the final build produces a single .SQL script that is executed against your DB - or, in the case of deploying your Db project, it just gets executed on your behalf. This includes any data related scripts. That's right, everything. Think of the project as an offline image of your database in script form.

One thing I did notice is that it keeps a temporary connection to a database that it seems to use as a working copy. This database gets created when you open the project, and removed when you close it. I'm not sure exactly why, as I haven't had time to figure out if it's absolutely necessary yet.

The image here shows the Database project with a GUID after it, which is the temporary Db that DB Pro creates when I open the project.

I will say that it does take some getting used to. If you are going to use it, first install it, then download one of the many videos available from the DB Pro site that show what it's all about. Work through the videos with the tool open, then you can apply it to a project of your own. I have also used Red Gate's SQL Compare tool, and I think Red Gate's tool is definitely easier to use. Without thinking, I can use the Red Gate tool to compare a DB in a few minutes; VSTS DB Pro takes some getting used to. As to a proper comparison, VSTS DB Pro integrates fully with Visual Studio so it's onto a winner there, but a full comparison is not something I could comfortably do just yet. At any rate, I am pretty impressed thus far.

Web Client Software Factory and Enterprise Library. I want to Opt-in, not Opt-out.
Monday, May 28, 2007 12:34 AM

I am a fan of the Web Client Software Factory. It's a pretty good implementation of the MVP pattern within ASP.NET - pretty lean and clean, and it doesn't involve lots of effort to get the simple stuff done, unlike a lot of other MVC based implementations/frameworks.

One of the things I don't like about it is the automatic inclusion of the Enterprise Library pieces. I know that Entlib integration has been a requested feature of WCSF, but not all of us want it integrated all the time, by default. Ideally I'd like the recipe to prompt for whether that support/integration should be included.

Why?

Well, Entlib comes with a lot of weight. It's feature heavy, but also heavy in complexity, and it has a large dependency on its myriad of assemblies. Migrating from one version to another usually comes with a degree of pain, and on my current crop of projects that's not something I wish to entertain. Things like logging and exception management can be better handled by lightweight implementations specifically designed for my project/solution, in a simpler, leaner way. There is also nothing preventing me from adding Entlib back into the solution if I need to.

It's kind of like the WCF mantra where you need to "opt-in" to features, rather than having all the features there/on by default and needing to remove/disable them. Think DataContracts/DataMembers, where you need to explicitly mark a DataMember before it's included as part of the serialization (opt-in), rather than just marking a class as Serializable and having everything included (opt-out).
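For anyone who hasn't seen that opt-in model, here's roughly what it looks like (a minimal sketch - the Customer type and its members are just made-up names):

```csharp
using System.Runtime.Serialization;

// Opt-in: only members explicitly marked [DataMember] are serialized.
[DataContract]
public class Customer
{
    [DataMember]                  // opted in - part of the wire format
    public string Name;

    public string InternalNotes;  // no attribute - never serialized
}
```

Anything you don't explicitly mark simply doesn't go over the wire. That's the behaviour I'd like from the WCSF recipe too.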

There has been some discussion around the use and/or recommendation of Entlib on projects, internally on the Readify tech lists. A number are opposed to its use, others are in favor, and some sit on the middle ground in between. The point being: clearly it's not for everyone.

I want to Opt-in.

by Glav | with no comments
LiveWriter and Blog Styles
Tuesday, May 22, 2007 12:11 PM

If, like me, you like to use Microsoft LiveWriter, be aware that it does not work if you are using the Riviera style found within the admin section of this CommunityServer blog. I suspect the JavaScript must kill it.

If you select that style, not only will LiveWriter not be able to download your style, but you won't be able to post to this site using it. I selected another style and LiveWriter seems to work just fine.

VSTS and VSTestHost.Exe - not on talking terms
Tuesday, May 22, 2007 12:02 PM

Thought I would install some optional updates to try and resolve some issues I was having with Vista and Visual Studio 2005. I continually get the "VSTestHost.exe has stopped unexpectedly" error when trying to run any tests within Visual Studio. There don't seem to be too many instances of this around, as I can't find much beyond simple fix suggestions that haven't worked.

After some investigation, it seemed to be trying to load Microsoft.VisualStudio.QualityTools.Common.dll (as well as a few others) and not finding them. They exist in the Common7/IDE/PrivateAssemblies area, but the search that VS was doing seemed to be looking in the GAC. So I GAC'ed 'em, but things still die. A little differently though - now I get a MessageBox like the one shown below:

Just awesome.

If anyone has any suggestions I'd love to hear them.

Oh, by the way, as I alluded to at the start of the post, I tried some optional updates for Vista that I thought might help. Well, they didn't - they made it worse actually (the general system experience, that is), so I restored. Now things are really weird. Lots of settings lost, bookmarks gone, whole heaps of things. Don't trust System Restore....

Update:

I uninstalled just the Team Test components, then re-installed them, and it looks to be fixed. The uninstall and re-install process took ages, and I thought it might blow away some of my installed updates like SP1, the WCF Extensions, GAT Extensions and the like, but luckily all remained intact.

Silverlight Reflector
Sunday, May 20, 2007 10:47 PM

Just found a nice Silverlight tool by Ernie Booth to aid in my learning process: a plug-in for Reflector that allows you to view Silverlight code (JavaScript or managed).

Found via Mike Harsh’s blog.

It's easy to use and helps me a lot in finding out how things are done.

 

Silverlight and unsupported features
Sunday, May 20, 2007 10:32 PM

Like a lot of people lately, I have been playing with Silverlight (Alpha 1.1 version), trying to understand the nuts and bolts of it, and when that fails, just diving in and seeing how I go. I can’t say I am much of a WPF/XAML guru so things have been slow.

What is hard is just diving in and trying to implement stuff, then realising that I am trying to use an unsupported feature. A few examples:

I wanted to add some MouseOver effects to a particular Canvas (while lamenting the absence of any grid controls in Silverlight), and eventually realised that the "MouseEnter" routed event is not supported as an event trigger - only the "Loaded" event is. Kinda weird, I thought; a little painful but not too bad. It means you need to resort to code. In my case, I wanted to do some scale animations, so I needed to put those scale animations in the Resources section of my Canvas (which exists within my Page1.xaml file) like this:

<Canvas.Resources>
  <Storyboard x:Name="fullScreenIconEnlargeTimeline">
    <DoubleAnimationUsingKeyFrames BeginTime="00:00:00" Storyboard.TargetName="txtFullScreenIcon" Storyboard.TargetProperty="(UIElement.RenderTransform).(TransformGroup.Children)[0].(ScaleTransform.ScaleX)">
      <SplineDoubleKeyFrame KeyTime="00:00:00.3000000" Value="2.646"/>
    </DoubleAnimationUsingKeyFrames>
    <DoubleAnimationUsingKeyFrames BeginTime="00:00:00" Storyboard.TargetName="txtFullScreenIcon" Storyboard.TargetProperty="(UIElement.RenderTransform).(TransformGroup.Children)[0].(ScaleTransform.ScaleY)">
      <SplineDoubleKeyFrame KeyTime="00:00:00.3000000" Value="3.75"/>
    </DoubleAnimationUsingKeyFrames>
    <DoubleAnimationUsingKeyFrames BeginTime="00:00:00" Storyboard.TargetName="txtFullScreenIcon" Storyboard.TargetProperty="(UIElement.RenderTransform).(TransformGroup.Children)[3].(TranslateTransform.X)">
      <SplineDoubleKeyFrame KeyTime="00:00:00.3000000" Value="-65"/>
    </DoubleAnimationUsingKeyFrames>
    <DoubleAnimationUsingKeyFrames BeginTime="00:00:00" Storyboard.TargetName="txtFullScreenIcon" Storyboard.TargetProperty="(UIElement.RenderTransform).(TransformGroup.Children)[3].(TranslateTransform.Y)">
      <SplineDoubleKeyFrame KeyTime="00:00:00.3000000" Value="27.5"/>
    </DoubleAnimationUsingKeyFrames>
  </Storyboard>
</Canvas.Resources>

My Canvas object definition looked something like:

<Canvas x:Name="fullScreenCanvas" Width="157" Height="72" Canvas.Left="515" Canvas.Top="-7"
        RenderTransformOrigin="0.5,0.5" MouseLeftButtonUp="FullScreenClick"
        MouseEnter="FullScreenMouseEnter" MouseLeave="FullScreenMouseLeave">
  <!-- ... rest of definition continues ... -->

You’ll notice the MouseEnter="FullScreenMouseEnter" event definition. Now in my code behind (Page1.xaml.cs) I have this:

public void FullScreenMouseEnter(object sender, EventArgs e)
{
    fullScreenIconEnlargeTimeline.Begin();
}

Obviously it would be nicer to hook all this up in XAML (well, at least that's how I would like to do it), but that requires support for routed events other than Loaded, as I mentioned earlier.

I should also note that I am using a combination of Blend and Visual Studio (Orcas) to get the XAML how I want it. Timelines are easier within Blend (IMHO), but in a lot of other cases I feel more comfortable going into the XAML itself to edit. If I had XAML IntelliSense in Blend, I'd be much happier.

Another reason to use the AJAX Control Toolkit
Sunday, May 20, 2007 5:43 PM

Apart from the long list of great, free controls in the AJAX Control Toolkit, there are numerous other reasons to use it. One is the great set of support functions that come with the toolkit.

One I had to use recently allowed me to figure out the width of an element's border and easily deduce the content size within the element for some tricky positioning. While not a big deal, these things are incredibly fiddly and not very straightforward in the wonderful world of browser "standards".

In the AJAX Control Toolkit project, there is a Common folder which contains a "common.js" file with a large set of handy little script functions. Far too many to list here, but the ones I used on this occasion were:

var box = CommonToolkitScripts.getBorderBox(domElement);

and

var box = CommonToolkitScripts.getPaddingBox(domElement);

Each one returns a 'box' object that has the following properties:

.top         : top most PaddingSize/BorderWidth
.bottom      : bottom most PaddingSize/BorderWidth
.left        : left side PaddingSize/BorderWidth
.right       : right side PaddingSize/BorderWidth
.horizontal  : total horizontal PaddingSize/BorderWidth
               (right + left)
.vertical    : total vertical PaddingSize/BorderWidth
               (top + bottom)

And that's it. No worrying about different methods for different browsers.
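As an example of the sort of thing those helpers make easy, here's the kind of function I could knock together to get the content size of an element (a sketch only - getContentSize is my own made-up name, and it assumes the toolkit's CommonToolkitScripts is loaded on the page):

```javascript
// Deduce the content size of an element: take the rendered (offset) size
// and subtract the border and padding on each axis.
function getContentSize(domElement) {
    var border = CommonToolkitScripts.getBorderBox(domElement);
    var padding = CommonToolkitScripts.getPaddingBox(domElement);
    return {
        width: domElement.offsetWidth - border.horizontal - padding.horizontal,
        height: domElement.offsetHeight - border.vertical - padding.vertical
    };
}
```

No per-browser special casing needed - the toolkit hides all that.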

Again, those are just two of the functions that exist in this library - there are many more. Additionally, there is also a script file called 'Blocking.js' in the 'Compat' directory which contains gems like:

Sys.UI.DomElement.setVisible(element, true/false);

and

var booleanFlag = Sys.UI.DomElement.getVisible(element);

which you may recall existed in the prior Atlas CTP builds but have since been removed. The AJAX Control Toolkit has retained them because they are quite handy to use.

 

External Drive Pain. FTP to the rescue.
Sunday, May 20, 2007 5:33 PM

My 250GB USB external drive is just about ready to pack it in. When I plug it into a USB port, I get the popup asking what I want to do (browse, transfer files etc...). Then, not long into using it, the drive "disconnects" from the system, very quickly reconnects again, and I get the popup asking what I want to do as if I had just plugged it in. This is now happening ever more frequently (sometimes every 10 seconds, sometimes after a few minutes).

I can minimise it by plugging it into a USB 1.1 port, but it still happens (just a lot less). I have a lot of stuff on there I want to keep, as I use this as my "first step backup" (from here, it goes to DVD if the information is valuable enough to me).

The problem is, I can't copy anything of any size because it keeps dropping out. It's just like the USB plug being pulled out of the system and reconnected again. I have tried other USB devices with no issue (both 1.1 and 2.0).

So what to do.....

Well, here is what I am currently doing.

I want to get the data off the drive, so I have set up an FTP server that points to it. I am using SmartFTP to queue the downloads from this removable drive to a brand new USB external drive which is functioning beautifully. SmartFTP works well because it auto-resumes the downloads for me.

Is it slow? Oh yes.

Is it painful? Indeed.

Is it working? Remarkably well (albeit quite slow).

If I can bear the long duration it's going to take to get my important data off, then I am done and I can turf this dying drive. It's transferred 1GB of data so far, and I can hear the drive whirring as it disconnects and reconnects, which it has done about 10 times during that transfer.

Update: It appears this post and the comments associated with it disappeared, so here it is, reposted.

Update2: Fellow colleague Grant Holliday suggested I use Robocopy. I had completely forgotten about that one.

Update3: Whoever sent me the comment on enclosures was spot on (the comment has also gone - see the first update). It is the enclosure; however, the drive is still part of the problem. It's r-e-a-l slow and making weird noises. At least I can copy my data over a bit easier.

 

 

ASP.NET Podcast Show #91 - ASP.NET AJAX Integration with ASP.NET Services
Friday, May 11, 2007 6:50 PM

Wally has pumped out another show and actually has a number of good shows planned coming up. Hell, I even have one coming up soon (really, I do...), but onto the latest.

Wally posted an update to a previous show on AJAX with ASP.NET Services. This show has info about using Login, Profiles, and Roles in ASP.NET AJAX.

Subscribe

Original Url:  http://aspnetpodcast.com/CS11/blogs/asp.net_podcast/archive/2007/05/11/asp-net-podcast-show-91-asp-net-ajax-integration-with-asp-net-services.aspx

 

Show Notes:

  • Book #3 update.
  • Login.
  • Profiles.
  • Roles.

Presentation

Source Code

WCF Client Channel Pool - Improved Client Performance
Monday, May 7, 2007 12:22 AM

Not long ago, I posted about WCF client performance and some work I have been doing around improving that with a "Channel Pool" type implementation.

Well, it's finally ready for some public consumption. You can grab the code here (http://www.theglavs.com/DownloadItem.aspx?FileID=55). It's very "alpha" at this point and my testing has been limited, but it has been showing consistently better results than using a standard ClientBase proxy class in WCF.

So first a quick usage example:

public class MyProxy : ClientBasePool<ISomeInterface>, ISomeInterface
{
   public void MyInterfaceMethod(string s)
   {
      Channel.MyInterfaceMethod(s);
   }
}

And you use it as you normally would:

MyProxy prox = new MyProxy();
prox.MyInterfaceMethod("Hello");
prox.Close();

And that's it - the same way you would use a normal ClientBase proxy class.

Using this proxy class will typically yield better performance (by approximately 20%-50%) by optimising the client side of the communication process.

What it Does.

The ClientBasePool proxy class utilises a pool of pre-opened channels behind the scenes. Negotiating the WS-SecureConversation is expensive, so this class manages a pool of channels that have already done this negotiation beforehand, in the background on a separate, low priority thread.

The pool automatically gets refilled in the background (on a separate thread) as channels are removed from it. In its default configuration, the pool has a size of 50 and a maximum of 126. The pool is refilled when a "threshold" value is hit. By default this is half the pool size, i.e. 25. So when there are 25 or fewer channels in the pool, the process refills the pool with pre-opened channels.

Additionally, the channels are periodically checked to see when they were opened and whether they exceed a pre-defined time period. If so, they are closed and removed from the pool. This is also done in the background on a separate thread - think of it as the garbage collector process. This prevents clients using a channel from the pool that may have been opened half an hour ago, after which the security conversation is no longer valid (token expired) and the channel faults when used. This process pro-actively closes and removes end-of-life channels, and the refill process kicks in if required. By default, the channels have a "life" of 90 seconds and are considered end-of-life after that. The clean up process runs (by default) every 45 seconds to check the pool - half of the channel life period.
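To make the take/refill mechanics a bit more concrete, here's a very rough sketch of the idea. This is NOT the actual ClientBasePool source - the class name, the delegate for opening channels, and the threading are all simplified for illustration, and the expiry/cleanup thread is omitted:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Simplified illustration of a channel pool: Take() hands out a pre-opened
// channel and triggers a background refill when the pool runs low.
public class SketchChannelPool<T>
{
    private readonly Queue<T> _pool = new Queue<T>();
    private readonly object _sync = new object();
    private readonly Func<T> _openChannel; // does the expensive open/negotiate

    public int PoolSize = 50;
    public int RefillTrigger = 25;

    public SketchChannelPool(Func<T> openChannel)
    {
        _openChannel = openChannel;
    }

    public T Take()
    {
        lock (_sync)
        {
            if (_pool.Count <= RefillTrigger)
                ThreadPool.QueueUserWorkItem(delegate { Refill(); });
            // Fall back to opening one synchronously if the pool is empty.
            return _pool.Count > 0 ? _pool.Dequeue() : _openChannel();
        }
    }

    private void Refill()
    {
        while (true)
        {
            lock (_sync)
            {
                if (_pool.Count >= PoolSize) return;
            }
            T channel = _openChannel(); // negotiation happens off the caller's thread
            lock (_sync)
            {
                _pool.Enqueue(channel);
            }
        }
    }
}
```

The real implementation layers the lifetime tracking and cleanup thread on top of this basic shape.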

All these settings are configurable.

My previous post shows some performance tests I initially did. I also did a more recent one using a 20,000 call test for a standard ClientBase proxy, my ClientBasePool proxy, and a comparative test using a single proxy for all calls. The results are:

Normal Proxy (ClientBase): 6:31:647
My Channel Pool ClientPoolBase Proxy: 4:48:618
Single Proxy: 1:5:522

You can see that my ClientBasePool took 4 minutes and 48 seconds, compared to a standard proxy time of 6 minutes and 31 seconds. Not huge, but still a significant difference. Obviously, both times are far longer than using a single proxy only (without creating a new proxy for each set of service calls), which came in at 1 minute and 5 seconds.

Caveats

1. I have only tested this using the wsHttpBinding. I think the netTcp binding won't really need it, or won't benefit that much, because the protocol itself performs better. However, it may benefit somewhat; I just haven't tried it.

2. It's early days and my time is limited, so extensive testing has not been possible. If you use it and have an issue, let me know. I'd love to hear about it.

3. During idle times, this channel pool will be periodically closing channels that have expired and re-opening new ones in preparation for use. Obviously this is extra traffic where there might otherwise be none at all.

4. This is NOT faster than using one single proxy class that is held statically for all your service calls. If you can do that, and manage the timeout issues and so on, then that will easily be the fastest option.
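For completeness, a statically held proxy is along these lines. Again, just a sketch with made-up names - SharedProxy is not part of the download, and the FaultedCheck delegate stands in for a real check against ClientBase.State in WCF code:

```csharp
using System;

// A single, lazily created, statically held proxy instance.
// FaultedCheck stands in for testing the channel state
// (e.g. ClientBase<T>.State == CommunicationState.Faulted in real WCF code).
public static class SharedProxy<T> where T : class, new()
{
    private static readonly object _sync = new object();
    private static T _instance;

    // Replace with a real channel-state check when T is a WCF proxy.
    public static Func<T, bool> FaultedCheck = delegate { return false; };

    public static T Instance
    {
        get
        {
            lock (_sync)
            {
                // Recreate the proxy if it doesn't exist yet or has faulted.
                if (_instance == null || FaultedCheck(_instance))
                    _instance = new T();
                return _instance;
            }
        }
    }
}
```

The catch, as noted above, is that you then own the timeout and faulted-channel handling yourself.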

Configuration and Usage

You don't need to set anything explicitly to make this work, but you can change the settings to better suit your needs.

Namespace: System.ServiceModel.ChannelPool

Main Classes to use (among others):

ClientBasePool : The proxy class to use. It interacts with the ChannelPoolFactory and the ChannelPool to get the job done.

ChannelPool : The channel pool implementation for each channel. Normally you should not have to interact directly with this class.

ChannelPoolFactory : Takes care of instantiating and destroying ChannelPool instances for a channel/service interface. To initialise a channel pool and start off the processing threads, use:

ChannelPoolFactory<ISomeInterface>.Initialise();

To destroy a channel pool and terminate all background threads, use:

ChannelPoolFactory<ISomeInterface>.Destroy();

Configuration Options:

All configuration options exist in a class called (not surprisingly) Config. I haven't set up reading from a configuration file or anything like that; I'll let the specific implementation take care of that.

The properties in this class of relevance are:

PoolSize : Defaults to 50. The capacity of the channel pool (max 127). The refill threshold is automatically set to half of the pool capacity, so by default, when 25 channels or fewer exist in the pool, the refill process is triggered.

PoolRefillTrigger: Defaults to 25. This value determines how low the pool can get before a refill is triggered. When a new PoolSize is set, this value automatically gets adjusted to half of the PoolSize.

ChannelLifetime: Defaults to 90 seconds. Channels are closed and removed from the pool after this time.

CleanupInterval: Defaults to half of the ChannelLifetime, i.e. 45 seconds. When a new ChannelLifetime period is set, this value is automatically set to approximately half the ChannelLifetime period.

Examples:

System.ServiceModel.ChannelPool.Config.PoolSize = 100;
System.ServiceModel.ChannelPool.Config.ChannelLifetime = 300;

Final Notes:

There is a console application included in the source that I used for some testing. I chopped and changed it many times to try different tests, so it's a bit of a mess, but not too complex.

I have cleaned up the code somewhat, but there are still bits and pieces lying around. I'll get to them. Happy for people to point them out to me, though.

Love to hear any feedback.

Remember, grab it from here.
