Jim Jackson

Character Counts.
Do the right thing.

New Blog

So if anyone is still following me (what the heck were YOU thinking?) you can continue to my new blog. It's a WordPress site hosted with an ASP.NET provider, so it's a little slow, but I plan to transition it to Azure some time soon.

My current project is this: http://bit.ly/LfzfzG

My new site is here: http://www.axshon.net

My twitter name is @axshon 

Semper Fi

Jim

Entity Framework 4 and “New transaction is not allowed because there are other threads running in the session” Exception

I’ve been working through a server-side process with Entity Framework wherein an uploaded file must be updated after initial entry. I have a multi-stage process that goes out to multiple other affinity web services to get complementary data. Here is the phase 1 code:

using (EntityConn context = new EntityConn())
{
   var pts = from t in context.Points
             where t.Track.File.FileID == input
             select t;
   foreach (var pt in pts)
   {
      pt.ValueAdded =
         VendorMethodCall(pt.Lat, pt.Lon);
      pt.Loaded = true;
   }
   context.SaveChanges();
}

This works well except that VendorMethodCall() can take quite a bit of time, so I want to show progress. I do this with another web service call that checks the database for records with the Loaded bit field set to true. Oops. Nothing happens till the end, and it’s all transactional, so I go from 0% to 100% after a very long wait. That won’t do. Simple, just change the code to save the context after every method call. Like so:

using (EntityConn context = new EntityConn())
{
   var pts = from t in context.Points
             where t.Track.File.FileID == input
             select t;
   foreach (var pt in pts)
   {
      pt.ValueAdded =
         VendorMethodCall(pt.Lat, pt.Lon);
      pt.Loaded = true;
      context.SaveChanges();
   }
}

Turns out this does not work. The first time the context.SaveChanges() method is called I get an exception whose inner exception is: “New transaction is not allowed because there are other threads running in the session.” Turns out that the READ happens via a transaction in EF 4! While the foreach is still iterating, the query’s data reader holds the connection open, so SaveChanges() cannot start a new transaction on it. It may be an editable setting (optimistic vs. pessimistic locking?) but I cannot find the setting so I’m moving on to the workaround.

The first thing I tried was to batch up 25 records at a time on the premise that previous saves had not completed. That didn’t work, but in my final solution I did end up incorporating batching just so I could issue fewer individual statements. Most of the files I’m working with have upwards of 1,500 data points, so I’ll probably bump the batch size to 100 or so. At any rate, the secret here is to load your data into an array and iterate the array rather than the live result set from your LINQ to Entities statement.

using (EntityConn context = new EntityConn())
{
   var pts = from t in context.Points
             where t.Track.File.FileID == input
             select t;
   int iBatch = 0;
   var ptsList = pts.ToArray<Point>();
   foreach (var pt in ptsList)
   {
      pt.ValueAdded =
         VendorMethodCall(pt.Lat, pt.Lon);
      pt.Loaded = true;
      iBatch++;
      if (iBatch % 25 == 0)
      {
         context.SaveChanges();
         iBatch = 0;
      }
   }
   context.SaveChanges();
}

This works and does not throw an exception. I have not reviewed the SQL statements generated since my dev server for this project is hosted and I don’t have appropriate permissions. I will now have to try it locally to see if the traffic is at a permissible level.

On a side note, notice the LINQ to Entities statement that allows me to cycle through 3 (!!) tables to get to the records I want using a single statement. In normal T-SQL, I would have to either use inner joins or use IN clauses to get the values I was looking for. The generated SQL will turn out to be similar, but this dramatically simplifies the code.
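For comparison, the equivalent hand-written T-SQL would look something like this (the table and key names here are guesses based on the entity names; your actual schema will differ):

```sql
Select p.*
From dbo.Points p
Inner Join dbo.Tracks t On t.TrackID = p.TrackID
Inner Join dbo.Files f On f.FileID = t.FileID
Where f.FileID = @input
```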

Uploading a File to SQL Server via Silverlight, WCF and EF
Note: Some of the code presented here does not conform to standard security practices. The goal is to show the relevant methods to accomplish the requirements, not to present a production-ready solution.

The requirements:

·         Upload individual large files to SQL Server 2008

·         Use a standard Silverlight-enabled WCF service with no special plumbing

·         Respect the very small upload size limits in normal binary WCF transmissions

·         Use Entity Framework 4.0 against a model containing only tables

·         Do not use any special HTTP handlers

Here is what we're trying to accomplish: take a file submitted by the user and send it to the database regardless of how large it is. We must avoid timing out our WCF connection (keep the message size small) and we must have fail-over logic in place in case a single part of the file fails. Once a file has been completely uploaded, we want to perform other server-based operations on the file.

Let's get started by creating a new Silverlight Application called TestFileUpload and allow the template to also create a TestFileUpload.Web project.

Continue by creating two tables on the server. The first will store queued file parts while the second stores the final file. Note that FileData and FileDataPart columns are VarBinary(Max).

Data Model

In the solution add a new class library project called DBContext. Kill the Class1.cs file and add a new ADO.NET Entity Data Model. Call it FileUploadModel and allow the name of the connection and model to stay set at FileUploadTestEntities. Next, in the web site, open the web.config file and copy the connection string over from the DBContext project. Time to compile the solution.

The Web Project

Add a project reference to the DBContext project and rebuild the solution.

Add a new Silverlight-enabled WCF Service and call it Uploader.svc. In the new service, kill the DoWork() method. Note that at this stage your service is configured for binary message encoding and the size of your uploads and downloads is very limited. We'll leave this exactly as is and assume that you can only upload 8192 bytes per message. To make it a bit easier to calculate in your head, we'll cut that to 8000 bytes per message.
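To get a feel for the arithmetic: a file becomes ceil(size / 8000) parts, every part except possibly the last being exactly 8000 bytes. Here is a stand-alone sketch of that chunking (the Silverlight code later does the same thing with a LINQ group-by; Chunker is just an illustrative name):

```csharp
using System;
using System.Collections.Generic;

class Chunker
{
    const int PartSize = 8000;

    // Split a buffer into PartSize-byte chunks, keyed by ordinal.
    public static Dictionary<int, byte[]> Split(byte[] buffer)
    {
        var parts = new Dictionary<int, byte[]>();
        for (int offset = 0, ordinal = 0; offset < buffer.Length;
             offset += PartSize, ordinal++)
        {
            int len = Math.Min(PartSize, buffer.Length - offset);
            var chunk = new byte[len];
            Array.Copy(buffer, offset, chunk, 0, len);
            parts.Add(ordinal, chunk);
        }
        return parts;
    }

    static void Main()
    {
        var parts = Split(new byte[20000]);
        Console.WriteLine(parts.Count);      // 3 parts: 8000, 8000, 4000
        Console.WriteLine(parts[2].Length);  // 4000
    }
}
```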

In the UploadFile method, start with this:

using (FileUploadTestEntities context = new FileUploadTestEntities()) { }

Note the squiggly under the using keyword. This occurs because Visual Studio knows what a FileUploadTestEntities object is but not what it inherits from. Add a reference in your web project to System.Data.Entity and this should resolve.

In our web service we will have 3 methods, all returning integers:

UploadFile will take some basic file information and the first binary part of the file and return the new ID of the file record.

UploadFilePart is an iterative add method that takes the file ID, the position of the binary stream in the data and, of course, the binary part. It returns 0 if the insert fails, 1 if the data already exists and 2 if the record was successfully inserted or updated.

FinalizeFile will take all the parts of the data and process them into the header record then delete all the part data after a few simple checks. See notes at the end for a better method of accomplishing much of this logic. FinalizeFile will also be where you can call any additional processing methods.

The Silverlight Project

Next, we'll move on to the Silverlight application. Here is what we need.

·         A method and an interface to select a file and get the source stream of that file.

·         Logic to divide the stream into byte arrays.

·         A cyclic process to upload each element in the byte arrays.

We'll forego an MVVM pattern here with ICommand and/or RelayCommand implementations in order to keep it strictly about the problem space. It is advisable to take this entire logical process and put it into its own component.

Open the MainPage.xaml file, add a button and wire up its Click event. Add a service reference to the Silverlight application pointing to your WCF service. Set the service reference namespace to Upload.

In the code-behind for MainPage.xaml, we will add a few private fields to maintain the uploaded file's data while processing it. Critical here are the fileBuffer generic list of bytes, the fileParts dictionary of byte-sized (sic) chunks and the completedSections collection of type ObservableCollection<int>. The completedSections object is nice because it allows us to wire up events when the collection changes. The code is oversimplified in that it calls the Finalize method every time the collection changes but, for our purposes, it works.

In the constructor we will wire up the CollectionChanged event of completedSections. Every time this collection changes, this event will fire.

The button_click event will allow the user to select a file and then break the file into arbitrary chunks. Once this is done, we call StartFileUpload to kick off the process.

StartFileUpload gets the first value from fileParts and fires it asynchronously, passing in the first section of the file. Note that ClientProxy is a local property that returns an UploaderClient object that is prewired to fire all three of its return events.

client_UploadFileCompleted gets the return value from the client. If successful, we fire every object in fileParts using SendSection.

SendSection fires a single entry from fileParts based on an index value. The return of this method fires the client_UploadFilePartCompleted event.

client_UploadFilePartCompleted will return a value indicating whether or not the insert/update succeeded. If it succeeded, we add the index to completedSections which, in turn, fires the FinalizeFile method.

FinalizeFile will check to see if all parts of the file were sent (and successful) and ensures that the web service's FinalizeFile method is called only once. That method will return 0 if successful, -1 on failure, or the index of the offending record if a failure reason can be determined. If the FinalizeFile method fails, we try to send the failed section again.

Some notes for production use:

a)      You will need to security trim all your methods.

b)      Consider sending max file size, available file types and individual message size in an initialization call.

c)       This code could be dangerous in that a virus-infected file can be uploaded to your database and then opened by an unsuspecting user. Consider adding attributes to your uploaded files table to allow them to be quarantined and scanned.

d)      The Finalize method used here should be modified to include calls that send SQL directly to the database. I did not do that specifically to show that everything could be done with no SQL. The result is that the binary data is passed multiple times when all that is required is a call to DataLength() on the SQL side. As an aside, you MUST use DataLength() when testing the size of a VarBinary(Max) column in SQL Server. A call to Len() will not be reliable on this data type.

e)      When a particular part of a file fails to get into the database properly and we resend, consider adding logic to control the number of retries per section and total retries allowed.
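To expand on note (d): the size check can stay entirely on the SQL side by comparing DataLength() of the assembled column against the stored size. A sketch against the UploadedFiles table created later in this post:

```sql
-- DataLength() counts actual bytes; Len() is unreliable
-- on varbinary(max), so don't use it here.
Select FileID, FileSize, DataLength(FileData) As ActualBytes
From dbo.UploadedFiles
Where FileID = @FileID
  And FileSize <> DataLength(FileData)  -- rows that failed to assemble
```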

So here is the code for the WCF service:

[OperationContract]
public int UploadFile(string fileName, string fileType, 
        string fileExtension, Int64 fileSize, byte[] firstFileData)
{
        int ret = 0;
        using (FileUploadTestEntities context = new FileUploadTestEntities())
        {
                UploadedFile file = new UploadedFile();
                file.FileName = fileName;
                file.FileType = fileType;
                file.FileExtension = fileExtension;
                file.FileSize = fileSize;
                context.AddToUploadedFiles(file);
                context.SaveChanges();
 
                UploadedFilePart part = new UploadedFilePart();
                part.FileID = file.FileID;
                part.Ordinal = 0;
                part.FileDataPart = firstFileData;
                context.AddToUploadedFileParts(part);
                context.SaveChanges();
 
                ret = file.FileID;
        }
        return ret;
}
 
[OperationContract]
public int UploadFilePart(int fileID, int ordinal, 
        bool overwrite, byte[] fileData)
{
        // return values:
        // 0 = Not inserted
        // 1 = Already exists
        // 2 = Inserted or updated
 
        int ret = 0;
        // Upload a new file part to the database
        using (FileUploadTestEntities context = new FileUploadTestEntities())
        {
                // Check to be sure this part does not already exist.
                var foundPart = (from p in context.UploadedFileParts
                                        where p.FileID == fileID
                                        && p.Ordinal == ordinal
                                        select p).FirstOrDefault();
                if (foundPart != null && overwrite)
                {
                        foundPart.FileDataPart = fileData;
                        context.SaveChanges();
                        ret = 2;
                }
                else if (foundPart != null && !overwrite)
                {
                        ret = 1;
                }
                else // foundPart == null so ignore overwrite
                {
                        UploadedFilePart nextPart = new UploadedFilePart();
                        nextPart.FileID = fileID;
                        nextPart.Ordinal = ordinal;
                        nextPart.FileDataPart = fileData;
                        context.AddToUploadedFileParts(nextPart);
                        context.SaveChanges();
                        ret = 2;
                }
        }
        return ret;
}
 
[OperationContract]
public int FinalizeFile(int fileID)
{
        // File upload is complete; post all file data to the UploadedFiles table.
        using (FileUploadTestEntities context = new FileUploadTestEntities())
        {
                var totalPartSizes = from allParts in context.UploadedFileParts
                                                where allParts.FileID == fileID
                                                select allParts.FileDataPart;
 
                Int64 totalPartSize = 0;
                foreach (var sizePart in totalPartSizes)
                        totalPartSize += sizePart.Length;
 
                var totalAssignedSize = (from fileTest in context.UploadedFiles
                                                        where fileTest.FileID == fileID
                                                        select fileTest.FileSize).First();
 
                if (totalAssignedSize > totalPartSize)
                {
                        // The sizes do not match. 
                        // Find the first part that does not match the assigned size.
                        var missingParts = from p in context.UploadedFileParts
                                                        where p.FileID == fileID
                                                        orderby p.Ordinal
                                                        select p;
                               
                        int iTestOrdinal = 0;
                        foreach (var testPart in missingParts)
                        {
                                // Test for contiguous elements
                                if (testPart.Ordinal != iTestOrdinal)
                                        return iTestOrdinal;
                                // Test for size of the element as long as 
                                // it's not the last one.
                                if (iTestOrdinal != (missingParts.Count() - 1))
                                {
                                        if (testPart.FileDataPart.Length != 8000)
                                                return iTestOrdinal;
                                }
                                iTestOrdinal++;
                        }
                        // We didn't find the problem. Nothing to do but fail.
                        return -1;
                }
                else if (totalAssignedSize < totalPartSize)
                {
                        // There are too many parts. 
                        // Not much we can do here except fail.
                        return -1;
                }
 
                // The totals match; rebuild the full file from its parts.
                List<byte> allFileBytes = new List<byte>();
                       
                // Get the list of parts for this item
                var parts = from p in context.UploadedFileParts
                                where p.FileID == fileID
                                orderby p.Ordinal
                                select p;
 
                foreach (var part in parts)
                        allFileBytes.AddRange(part.FileDataPart.ToList<byte>());
 
                var file = (from f in context.UploadedFiles
                                where f.FileID == fileID
                                select f).FirstOrDefault();
 
                if (file != null)
                        file.FileData = allFileBytes.ToArray();
                context.SaveChanges();
       
                // Final test to be sure that the file updated.
                var finalTest = (from f in context.UploadedFiles 
                                        where f.FileID == fileID
                                        select new { f.FileSize, f.FileData }).First();
                if (finalTest.FileSize != finalTest.FileData.Length)
                        return -1;
 
                // Matching sizes detected. Go ahead and delete the parts.
                foreach (var part in parts)
                        context.UploadedFileParts.DeleteObject(part);
       
                context.SaveChanges();
 
                // We return zero because the first file part
                // was guaranteed by the fact that we received
                // a fileID in the very first call to the service.
                return 0;
        }
}

Here is the code-behind for MainPage.xaml.cs:

private List<byte> fileBuffer = null;
private FileInfo selectedFile = null;
private int fileID = 0;
private int sectionCount = 0;
private ObservableCollection<int> completedSections = 
        new ObservableCollection<int>();
private UploaderClient client = null;
private bool finalizedFile = false;
private Dictionary<int, List<byte>> fileParts;
 
public MainPage()
{
        InitializeComponent();
        completedSections.CollectionChanged += 
                delegate(object sender, NotifyCollectionChangedEventArgs e)
        {
                FinalizeFile();
        };
}
 
private void Button_Click(object sender, RoutedEventArgs e)
{
        OpenFileDialog openFileDialog = new OpenFileDialog();
        openFileDialog.Filter = "JPEG files|*.jpg";
        openFileDialog.Multiselect = false;
        if (openFileDialog.ShowDialog() == true)
        {
                try
                {
                        using (FileStream strm = openFileDialog.File.OpenRead())
                        {
                                selectedFile = openFileDialog.File;
                                using (BinaryReader rdr = new BinaryReader(strm))
                                {
                                        fileBuffer = 
                                                rdr.ReadBytes(
                                                (int)strm.Length).ToList<byte>();
                                }
                        }
                        if (fileBuffer != null)
                        {
                                fileParts = new Dictionary<int, List<byte>>();
                                var fileSections = from idx in 
                                        Enumerable.Range(0, fileBuffer.Count())
                                        group fileBuffer[idx] by idx / 8000;
 
                                sectionCount = fileSections.Count();
                                int ordinal = 0;
                                foreach (var section in fileSections)
                                {
                                        List<byte> itm = new List<byte>();
                                        foreach (var b in section)
                                                itm.Add(b);
                                        fileParts.Add(ordinal, itm);
                                        ordinal++;
                                }
                                StartFileUpload();
                        }
                }
                catch (Exception ex)
                {
                        MessageBox.Show(ex.Message);
                }
        }
}
 
private void StartFileUpload()
{
        byte[] msgBody = fileParts.First().Value.ToArray();
        ClientProxy.UploadFileAsync(
                selectedFile.Name, 
                "image", 
                selectedFile.Extension, 
                fileBuffer.Count(), 
                msgBody, 
                0);
}
 
private void client_UploadFileCompleted(
        object sender, UploadFileCompletedEventArgs e)
{
        if (e.Result == 0)
        {
                ResetAll();
                throw new 
                        NullReferenceException("The file insert failed.");
        }
        else
        {
                fileID = e.Result;
                completedSections.Add(0);
                for (int i = 1; i < sectionCount; i++)
                        SendSection(i, false);
        }
}
 
private void SendSection(int sectionKey, bool overWrite)
{
        List<byte> foundPart;
        if (fileParts.TryGetValue(sectionKey, out foundPart))
        {
                byte[] msgBody = foundPart.ToArray();
                ClientProxy.UploadFilePartAsync(
                        fileID, 
                        sectionKey, 
                        overWrite, 
                        msgBody, 
                        sectionKey);
        }
}
 
private void client_UploadFilePartCompleted(
        object sender, 
        UploadFilePartCompletedEventArgs e)
{
        if (e.Result != 0)
        {
                completedSections.Add(
                        (int)e.UserState);
        }
        else
        {
                completedSections.Remove((int)e.UserState);
                SendSection((int)e.UserState, true);  
        }
}
 
private void FinalizeFile()
{
        if (completedSections.Count() == 
                sectionCount && !finalizedFile)
        {
                finalizedFile = true;
                ClientProxy.FinalizeFileAsync(fileID);
        }
}
 
private void ResetAll()
{
        fileBuffer = null;
        selectedFile = null;
        fileID = 0;
        completedSections = 
                new ObservableCollection<int>();
        finalizedFile = false;
        fileParts = null;
}
 
private void client_FinalizeFileCompleted(
        object sender, 
        FinalizeFileCompletedEventArgs e)
{
        if (e.Result == 0)
        {
                MessageBox.Show("Finished uploading file.");
                ResetAll();
        }
        else if (e.Result == -1)
        {
                MessageBox.Show(
                        "Upload failed. Contact the system admin.");
        }
        else
        {
                completedSections.Remove(e.Result);
                finalizedFile = false;
                MessageBox.Show(
                        "Failure detected is at ordinal position " + 
                        e.Result.ToString() + 
                        ". Retrying this file section.");
                SendSection(e.Result, true);
        }
}
 
private UploaderClient ClientProxy
{
        get
        {
                if (client == null)
                {
                        client = new UploaderClient();
                        client.UploadFileCompleted += new 
                                EventHandler<UploadFileCompletedEventArgs>(
                                client_UploadFileCompleted);
                        client.UploadFilePartCompleted += new
                                EventHandler<UploadFilePartCompletedEventArgs>(
                                client_UploadFilePartCompleted);
                        client.FinalizeFileCompleted += new 
                                EventHandler<FinalizeFileCompletedEventArgs>(
                                client_FinalizeFileCompleted);
                }
                return client;
        }
}

And the script to create the database tables:

CREATE TABLE [dbo].[UploadedFiles]
(
    [FileID] [int] IDENTITY(1,1) NOT NULL,
    [FileName] [varchar](250) NOT NULL,
    [FileType] [varchar](50) NOT NULL,
    [FileData] [varbinary](max) NULL,
    [FileExtension] [varchar](50) NOT NULL,
    [FileSize] [bigint] NOT NULL,
 CONSTRAINT [PK_UploadedFiles]
 PRIMARY KEY CLUSTERED
  ([FileID] ASC)
 WITH (
  PAD_INDEX=OFF,
  STATISTICS_NORECOMPUTE=OFF,
  IGNORE_DUP_KEY=OFF,
  ALLOW_ROW_LOCKS=ON,
  ALLOW_PAGE_LOCKS=ON
 ) ON [PRIMARY]
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[UploadedFileParts]
(
    [FileID] [int] NOT NULL,
    [Ordinal] [int] NOT NULL,
    [FileDataPart] [varbinary](max) NOT NULL,
 CONSTRAINT [PK_UploadedFileParts]
 PRIMARY KEY CLUSTERED
  ([FileID] ASC, [Ordinal] ASC)
 WITH (
  PAD_INDEX=OFF,
  STATISTICS_NORECOMPUTE=OFF,
  IGNORE_DUP_KEY=OFF,
  ALLOW_ROW_LOCKS=ON,
  ALLOW_PAGE_LOCKS=ON
 ) ON [PRIMARY]
) ON [PRIMARY]
GO

 

SQL 2008 Geography - Combine Data Points into a Geography Line

I have encountered many instances where having the decimal values for coordinates in a line is more useful than having the Geography value for that same line. Likewise, in some instances a Geography value is more appropriate. Here is what I mean:

I need all points in a polygon on a map. For my purposes, it is much (MUCH) faster to determine the overall rectangle that surrounds the polygon, add a bit of latitude and longitude to all axes and search against the individual points. With an index on the decimal values this is fast and guarantees that I get everything in the polygon. I don't mind too much that I also get a bunch of stuff not in that polygon. I use this to find what is visible to the user, not to determine actual spatial relationships among points or between points and the polygon.

There are also times when I need to find distances and spatial relationship between points in a line. This is where the Geography type comes in and is both fast and accurate.

In the end, the Geography type seems best suited to calculations and the decimal values are better off used for searching.
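Here is a sketch of that bounding-box search (the PointID column and the padding amount are illustrative assumptions; the rectangle values would be computed from your polygon):

```sql
Declare @MinLat Decimal(9,6), @MaxLat Decimal(9,6),
        @MinLon Decimal(9,6), @MaxLon Decimal(9,6)
-- Bounding rectangle of the polygon, computed elsewhere
Select @MinLat = 47.5, @MaxLat = 47.7,
       @MinLon = -122.4, @MaxLon = -122.2

Declare @Pad Decimal(9,6)
Select @Pad = 0.01  -- the "bit of latitude and longitude" slop

Select PointID, Latitude, Longitude
From dbo.PointsTable
Where Latitude  Between @MinLat - @Pad And @MaxLat + @Pad
  And Longitude Between @MinLon - @Pad And @MaxLon + @Pad
```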

So, once you have a bunch of coordinates in a table (PointsTable) and have them ordered properly and referenced by a header value, here is how you can quickly turn the line into a Geography instance. I do this when I am adding records to the PointsTable and store the output in the header table. (HeaderID is the inbound parameter.) Note that @LineList must be seeded before you concatenate onto it (otherwise every append yields NULL); the IsNull() pattern below handles that and also adds the comma only between points, so there is no trailing separator to trim off with SubString afterward.

Declare @HeaderID Int
Select @HeaderID = 123

Declare @LineList VarChar(Max)
Declare @GeoVal Geography
Select @LineList = IsNull(@LineList + ', ', '') +
      Convert(VarChar(100), Longitude)
      + ' '
      + Convert(VarChar(100), Latitude)
From dbo.PointsTable
Where HeaderID = @HeaderID
Order By Ordinal

Select @LineList =
      'LINESTRING(' + @LineList + ')'
Select @GeoVal =
      Geography::STLineFromText(@LineList, 4326);
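With @GeoVal populated, the calculation side is straightforward. For example, in the same batch you could compute the track length in meters (STLength() returns meters for SRID 4326) or the distance to an arbitrary point (Geography::Point takes latitude first; the coordinates here are made up):

```sql
Select @GeoVal.STLength() As TrackLengthInMeters

Select @GeoVal.STDistance(
      Geography::Point(47.6, -122.3, 4326)) As MetersFromPoint
```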

 

Entity Framework ObservableCollection Add Method - Unexpected Behavior

The documentation for Collection&lt;T&gt;.Add specifically states that when calling the Add method, the new item is appended to the end of the collection. This is not necessarily true. I haven't tested it in all its incarnations yet, but it appears to be adding to the beginning (index 0) of my collection.

http://msdn.microsoft.com/en-us/library/ms132404(v=VS.100).aspx 

My Entity Framework Self-Tracking object contains a collection of items. In my tests I created a couple of dummy objects and inserted them into the database. I was surprised to find that they were in reverse order! The collection type, by the way, is System.Collections.ObjectModel.ObservableCollection<T> in a Silverlight 4 application.

When I looked at the values in the immediate window, sure enough the first item in the collection was the last item I added and so on up the chain. Now you should never count on sorting your database items based on an identity key (obviously) but for spot check purposes, when I'm looking directly at the raw data, I prefer to have my items put in the database in the order expected.

So now rather than calling the Add() method, I'm switching over to using the Insert() method. As a performance point though, wouldn't adding an item to the beginning of an array be inherently more processor intensive than adding it to the end?
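If you want to guarantee position regardless of what Add() decides to do, Insert() makes the index explicit. A minimal sketch, using a plain ObservableCollection&lt;int&gt; as a stand-in for the generated self-tracking collection:

```csharp
using System;
using System.Collections.ObjectModel;

class Program
{
    static void Main()
    {
        var items = new ObservableCollection<int>();

        // Insert at the current count to force "append" semantics
        // instead of relying on Add()'s placement.
        for (int i = 1; i <= 3; i++)
            items.Insert(items.Count, i);

        Console.WriteLine(string.Join(",", items));  // 1,2,3
    }
}
```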

 

Using the BackgroundWorker in a Silverlight MVVM Application

With Silverlight 4 and the Entity Framework you get a lot of work done on your behalf in terms of standard UI CRUD-style operations. Validations and I/O are pretty easy to accommodate out of the box. But sometimes you need to perform some long running tasks either on the client or on the server via service calls. To prevent your UI from hanging and annoying your users, you should consider placing these operations on a background thread. The BackgroundWorker object is the perfect solution for this. It is easy to use and easy to debug. There are, however, a few things you should understand about where and when the BackgroundWorker can provide feedback and/or manipulate the UI. Spending a few minutes building out the sample listed here should give you the foundation you need to implement very complex, very responsive scenarios.

Note that I used MVVM Light’s RelayCommand not because it’s required but because it’s simple and provides the functionality I required in other applications when binding commands to UI events other than Click. The ViewModelBase also automatically implements INotifyPropertyChanged which is nice.

Here is how the system should operate:

The user types a message into the text box and presses the button. The message is passed into the processor object on a background thread. The processor occasionally notifies the UI that status has changed and the UI refreshes accordingly. While the processor is executing, there should be no lag in the UI functionality. This could also be enhanced to allow the processor to handle in-transit updates to the processor’s payload from the UI but this is beyond the scope of this post.
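The core BackgroundWorker wiring for that flow looks roughly like this. This is a sketch, not the full view model built below; the console writes stand in for updating bound properties, and the loop stands in for LongRunningObject's real work:

```csharp
using System;
using System.ComponentModel;
using System.Threading;

class Processor
{
    public void Run(string message)
    {
        var worker = new BackgroundWorker();
        worker.WorkerReportsProgress = true;

        // DoWork runs on a background thread: never touch UI
        // elements here, only do the work and report progress.
        worker.DoWork += (s, e) =>
        {
            for (int pct = 0; pct <= 100; pct += 25)
            {
                Thread.Sleep(100);  // stand-in for real work
                worker.ReportProgress(pct, e.Argument);
            }
        };

        // In Silverlight/WPF these two events are marshaled back to
        // the UI thread via SynchronizationContext, so it is safe to
        // update bound properties from them.
        worker.ProgressChanged += (s, e) =>
            Console.WriteLine("{0}% - {1}", e.ProgressPercentage, e.UserState);

        worker.RunWorkerCompleted += (s, e) =>
            Console.WriteLine("Done.");

        worker.RunWorkerAsync(message);  // returns immediately; UI stays responsive
    }
}
```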

Getting Started

Start by opening Visual Studio 2010 and creating a new Silverlight Application. No need to do anything special here, no Navigation app, no MVVM Light app. Inside the Silverlight application (I named mine BackgroundTest) create three new folders:

·         Models

·         ViewModels

·         Views

Drag the MainPage.xaml from the root folder into the Views folder. This is not really required but it keeps things organized.

Right click on ViewModels and select Add Item => MvvmViewModel. Name it MainViewModel. If you don’t have MvvmViewModel as an item template, check out Laurent Bugnion’s site for details about the MVVM Light Toolkit. http://www.galasoft.ch/

Add two new classes to the Models folder:

·         LongRunningObject: This is our faux object that will take an arbitrary amount of time to process the ProcessItem it receives.

·         ProcessItem: This is a simple business object that implements INotifyPropertyChanged.

In the App.xaml file, add a reference to the ViewModels namespace of the current project and give it a prefix of “vm”:

xmlns:vm="clr-namespace:BackgroundTest.ViewModels"
 

In the Application Resources element (also in App.xaml) add a static reference to the MainViewModel:

<vm:MainViewModel x:Key="MainViewModel" />

 

By adding the MVVM Light ViewModel you should have automatically added references to the following DLLs. If they are not there, add them now:

·         GalaSoft.MvvmLight.Extras.SL4

·         GalaSoft.MvvmLight.SL4

Here is the Xaml of the MainPage:

<UserControl x:Class="BackgroundTest.MainPage"
      xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
      xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
      xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
      xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
      xmlns:i="clr-namespace:System.Windows.Interactivity;assembly=System.Windows.Interactivity"
      xmlns:cmd="clr-namespace:GalaSoft.MvvmLight.Command;assembly=GalaSoft.MvvmLight.Extras.SL4"
      xmlns:vm="clr-namespace:BackgroundTest.ViewModels"
      mc:Ignorable="d"
      DataContext="{Binding Source={StaticResource MainViewModel}}"
      Height="250" Width="500">
       
     <UserControl.Resources>
             <DataTemplate x:Key="ProcItemsTemplate">
                   <Grid>
                         <Grid.ColumnDefinitions>
                                <ColumnDefinition Width="65" />
                                <ColumnDefinition />
                                <ColumnDefinition Width="30" />
                         </Grid.ColumnDefinitions>
                         <TextBlock Grid.Column="0"
                                      Text="{Binding Path=Progress, StringFormat=\{0:P\}}" />
                         <TextBlock Grid.Column="1"
                                      Text="{Binding Path=CommandData}" />
                         <TextBlock Grid.Column="2"
                                      Text="{Binding Path=IsComplete}" />
                   </Grid>
             </DataTemplate>
      </UserControl.Resources>
 
       <Grid x:Name="LayoutRoot" Background="Beige">
             <Grid.RowDefinitions>
                   <RowDefinition Height="25" />
                   <RowDefinition Height="25" />
                   <RowDefinition />
             </Grid.RowDefinitions>
             <Grid.ColumnDefinitions>
                   <ColumnDefinition Width="140" />
                   <ColumnDefinition />
             </Grid.ColumnDefinitions>
              <TextBox Text="{Binding Path=DataText, Mode=TwoWay}"
                       Width="200" TextWrapping="Wrap" Grid.ColumnSpan="2" />
             <Button Content="Load this and wait..."
                      Grid.Row="1" Width="150" Grid.ColumnSpan="2"
                      Margin="0,2,2,0" >
                   <i:Interaction.Triggers>
                         <i:EventTrigger EventName="Click">
                                <cmd:EventToCommand
                                      Command="{Binding StartLongRunningExecCommand}"
                                       CommandParameter="{Binding Path=DataText}" />
                         </i:EventTrigger>
                   </i:Interaction.Triggers>
             </Button>
             <Rectangle Grid.Row="2" Fill="{Binding Path=CurrentColor}" />
             <StackPanel Orientation="Vertical"
                          HorizontalAlignment="Center"
                          VerticalAlignment="Top" Grid.Row="2">
                   <TextBlock Text="Red=Not Running" />
                   <TextBlock Text="Green=Running" />
                   <TextBlock Text="{Binding Path=ProcessorCount}"  />
             </StackPanel>
             <ScrollViewer Grid.Row="2" Grid.Column="1">
                   <ItemsControl
                          ItemsSource="{Binding Path=TokenList, Mode=TwoWay}"
                          ItemTemplate="{StaticResource ProcItemsTemplate}" />
             </ScrollViewer>
      </Grid>
</UserControl> 
 

This is pretty straightforward. The DataContext of the control is set to the key that you added previously in App.xaml. The binding statements will become more apparent after you dig into the ViewModel. Also, the Click event of the button is bound using triggers to show that this command could be fired from any event, not just a button click. The parameter of the command is set to the Text property of the text box.

There is absolutely no code behind other than InitializeComponent in the view’s constructor, which is exactly where we want to be.

Let’s now provide some meat to the business object.

If you fail to implement the INotifyPropertyChanged interface in this business object, updates to items in the list will not be refreshed in the UI, regardless of how many times and in how many places you raise the PropertyChanged event.
 

We have three properties: a progress percentage value (double), a string body and a boolean complete flag. In addition there is a private helper method that raises the PropertyChanged event for each property.

Here is the code:

using System.ComponentModel;

namespace BackgroundTest.Models
{
     public class ProcessItem : INotifyPropertyChanged
     {
          
          private double _progress = 0;
          public double Progress
           {
              get
              {
                   return _progress;
              }
              set
               {
                   if (_progress != value)
                   {
                        _progress = value;
                        RaisePropertyChanged("Progress");
                   }
              }
           }
     
          private string _commandData = string.Empty;
          public string CommandData
           {
              get
              {
                   return _commandData;
              }
              set
              {
                   if (_commandData != value)
                   {
                        _commandData = value;
                        RaisePropertyChanged("CommandData");
                   }
              }
          }
   
          private bool _isComplete = false;
          public bool IsComplete
          {
              get
              {
                   return _isComplete;
              }
              set
              {
                   if (_isComplete != value)
                   {
                        _isComplete = value;
                        RaisePropertyChanged("IsComplete");
                   }
              }
          }
  
          public event PropertyChangedEventHandler PropertyChanged;
          private void RaisePropertyChanged(string prop)
          {
              if (PropertyChanged != null)
              {
                   PropertyChanged(this, new PropertyChangedEventArgs(prop));
              }
          }
      }
} 

 

The Long Running Processor

Now let’s take a look at LongRunningObject. Remember that this is a pseudo object that simulates a long running call to a web service or some heavy client-side processing that must take place.

We have two public events that fire during processing to notify clients of progress and to state that processing is complete.

When the single public method fires, the events are thrown and a for loop with a sleep timer is used to simulate a long process. Once the loop is complete, the completion event is thrown with the appropriate payload.

Note that I’m throwing a new ProcessItem with each event. This is not strictly necessary but the goal is to indicate that anything can come back, not just the input parameter. Remember, this is occurring on a different thread, and care should be taken when passing references around in this manner.

Here is the code:

using System;
using System.Threading;
 
namespace BackgroundTest.Models
{
     public class LongRunningObject
     {
          public event Action<ProcessItem> WorkCompleted;
          public event Action<ProcessItem> WorkProgress;
          
          public void RunWorkAsync(ProcessItem inputData)
          {
              double iMax = 25;
              // Raise the starting progress event. Check for null
              // in case no handlers have been wired up.
              if (WorkProgress != null)
              {
                  WorkProgress(new ProcessItem()
                  {
                      Progress = 0,
                      CommandData = inputData.CommandData
                  });
              }

              for (double i = 0; i < iMax; i++)
              {
                  Thread.Sleep(1200);
                  double dProg = (i + 1) / iMax;
                  // Raise the current progress event
                  if (WorkProgress != null)
                  {
                      WorkProgress(new ProcessItem()
                      {
                          Progress = dProg,
                          CommandData = inputData.CommandData,
                          IsComplete = false
                      });
                  }
              }
              // Raise the completed event
              if (WorkCompleted != null)
              {
                  WorkCompleted(new ProcessItem()
                  {
                      Progress = 1,
                      CommandData = inputData.CommandData,
                      IsComplete = true
                  });
              }
          }
     }
}
 

That’s it for plumbing and faux code. Now let’s open the MainViewModel and get the real work done.

The ViewModel

First, we’ll add the properties required by the UI. We’ll need a property to hold the RelayCommand (StartLongRunningExecCommand), a string property to bind to the UI TextBox (DataText), a simple counter to track how many BackgroundWorker objects we have in process (ProcessorCount) and an ObservableCollection to store all the items being processed by the BackgroundWorker (TokenList). Each of these properties will raise the PropertyChanged event in the setter if the source value changes. This means that you should always update the value by setting the property, not hitting the storage field directly, if you want the UI to update its binding sources.
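The property declarations themselves aren’t listed in this post, so here is a minimal sketch of how they might look. The names come from the Xaml bindings and from the dataText and procCount fields referenced in StartProcess; treat the exact shapes as assumptions, not the downloadable listing.

```csharp
// Sketch only: property shapes inferred from the bindings.
public RelayCommand<string> StartLongRunningExecCommand { get; private set; }

private string dataText = string.Empty;
public string DataText
{
    get { return dataText; }
    set
    {
        if (dataText != value)
        {
            dataText = value;
            RaisePropertyChanged("DataText");
        }
    }
}

// Incremented and decremented by StartProcess as workers start and finish.
private int procCount = 0;
public string ProcessorCount
{
    get { return string.Format("Workers running: {0}", procCount); }
}

// The ItemsControl in the view binds to this collection.
private ObservableCollection<ProcessItem> tokenList =
    new ObservableCollection<ProcessItem>();
public ObservableCollection<ProcessItem> TokenList
{
    get { return tokenList; }
}
```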

We also need a local private variable to store the UI thread’s Dispatcher. This will be our means of updating the UI when progress is received from the BackgroundWorker object.

private Dispatcher currentDispatcher;
 

We’ll wire this up in the constructor:

currentDispatcher = App.Current.RootVisual.Dispatcher;
 

The actual work happens because, in the constructor, we also wire up the RelayCommand to a method:

StartLongRunningExecCommand = new RelayCommand<string>(p =>
     {
          StartProcess(p);
     });
 

The StartProcess method first declares the event handler delegates that will be wired up during processing:

DoWorkEventHandler workHandler = null;
RunWorkerCompletedEventHandler doneHandler = null;
Action<ProcessItem> longRunProgress = null;
Action<ProcessItem> longRunCompleted = null;
 

Then we set up the BackgroundWorker, assign its events and begin the work. Note that inside each delegate that should fire only once, we immediately unwire the event from the source object to prevent memory leaks from orphaned objects. This ensures the objects can be garbage collected. For events that fire multiple times, care should be taken to unwire those events upon final cleanup; here we do it when the WorkCompleted event fires from the business object. Additionally, note that we call the currentDispatcher.BeginInvoke method in various places. This allows the method passed in to be called on the original UI thread.

Here is the entire StartProcess method:

 
public void StartProcess(string commandData)
{
     // Events used during background processing
     DoWorkEventHandler workHandler = null;
     RunWorkerCompletedEventHandler doneHandler = null;
     Action<ProcessItem> longRunProgress = null;
     Action<ProcessItem> longRunCompleted = null;
  
     // Implementation of the BackgroundWorker
     var wrkr = new BackgroundWorker();
     wrkr.DoWork += workHandler =
          delegate(object oDoWrk, DoWorkEventArgs eWrk)
          {
              // Unwire the workHandler to prevent memory leaks
              wrkr.DoWork -= workHandler;
              LongRunningObject LongRun = new Models.LongRunningObject();
              LongRun.WorkProgress += longRunProgress =
                   delegate(ProcessItem result)
                   {
                        // Call the method on the UI thread so that we can get
                        // updates and avoid cross-threading exceptions.
                        currentDispatcher.BeginInvoke(
                             new Action<ProcessItem>(AddToken), result);
                   };
              LongRun.WorkCompleted += longRunCompleted =
                   delegate(ProcessItem result)
                   {
                        // Unwire all events for this instance
                        // of the LongRunningObject
                        LongRun.WorkProgress -= longRunProgress;
                        LongRun.WorkCompleted -= longRunCompleted;
                        currentDispatcher.BeginInvoke(
                             new Action<ProcessItem>(AddToken), result);
                   };
 
               // Events are wired for the business object,
               // this where we start the actual work.
              LongRun.RunWorkAsync(
                   new ProcessItem() { Progress = 0, CommandData = commandData });
          };
     wrkr.RunWorkerCompleted += doneHandler =
          delegate(object oDone, RunWorkerCompletedEventArgs eDone)
          {
               // Work is complete, decrement the counter
               // and kill the reference to the doneHandler.
              wrkr.RunWorkerCompleted -= doneHandler;
              procCount--;
              RaisePropertyChanged("ProcessorCount");
          };
      // This is where the actual asynchronous process will
      // start performing the work that is wired up in the
      // previous statements.
     wrkr.RunWorkerAsync();
     procCount++;
     RaisePropertyChanged("ProcessorCount");
}
 

The AddToken method is a simple synchronizer that either adds the new item to the UI-bound list or updates the progress value of an existing item. Once again, if you did not implement INotifyPropertyChanged in your business object, setting the itm.Progress and itm.IsComplete values would not update the UI.
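AddToken itself appears only in the downloadable listing, but a minimal sketch might look like this. I’m assuming items are matched on CommandData, and FirstOrDefault requires a using System.Linq directive; your matching logic may differ.

```csharp
// Runs on the UI thread because it is invoked via
// currentDispatcher.BeginInvoke from the worker callbacks.
private void AddToken(ProcessItem itm)
{
    // Look for an in-flight item with the same command text
    // (matching on CommandData is an assumption for this sketch).
    ProcessItem existing = TokenList.FirstOrDefault(
        t => t.CommandData == itm.CommandData && !t.IsComplete);
    if (existing == null)
    {
        // First progress report: add the item to the UI-bound list.
        TokenList.Add(itm);
    }
    else
    {
        // Later reports: update in place. Because ProcessItem
        // implements INotifyPropertyChanged, these setters
        // refresh the bound TextBlocks automatically.
        existing.Progress = itm.Progress;
        existing.IsComplete = itm.IsComplete;
    }
}
```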

That’s pretty much it. I’ve included the complete code listing for the Silverlight side here.

http://weblogs.asp.net/blogs/jimjackson/Files/BackgroundTest.zip

 
Silverlight Adventures - The Process So Far

I’m currently in the process of rebuilding my Silverlight application. The initial beta of this app was somewhat underwhelming in that the problems it solved were those I found most interesting, not necessarily those the beta testers were looking for resolutions to. So we head back to the drawing board, this time to build out something that benefits my users as well as provides me with exposure to real world problems that I may not encounter in my day job.

My employer has a number of MVPs, authors and early adopters in various technologies, mostly Microsoft focused. They get their experience through lots and lots of study, helping out on projects they are not officially a part of, writing and public speaking. I’m fortunate to be a part of such a talented team but I have found that for various reasons, writing, public speaking and dissecting toolkit bits are not really my bag. It’s just too hard to focus on the abstract without a real business problem to apply it to. Building out MuddyGPS.com has provided that focus. I hope that at some point it will make a few dollars and pay for the hosting but in the meantime, it’s a valuable tool for thinking through the architectural solutions for various problem spaces. I have at least 3 or 4 other ideas for ‘amazing’ Silverlight applications in areas where I have some extra-development experience. If only there were another 8 or 10 hours in the day, I’d be good to go!

So here are a few items I’m currently monkeying around with:

Securing Data from Authenticated and Anonymous Users

·         My trails are available only from my site. I want to keep everyone, including my users, from seeing that raw data. They should only be able to view the trails within a map extent via my viewer. They should not be able to ping my service directly and scrape my data.

·         Once authenticated, a user can hit my web services all they want. I haven’t looked too deeply into javascript injection for a while but I know it’s possible because I’ve watched my logs and found that at least one caller is ‘tiling’ my maps looking for everything in my system. Of course it didn’t work but it didn’t stop him from trying.

·         Anonymous users are easy but there are some steps you need to consider in setting up your site when it’s in a hosted environment to be sure it’s secure.

Asynchronously Loading Data To and From a Web Service

·         Once security was taken care of I found that loading 2,500 lines onto a map would cause the UI thread to hang long enough that I’d break into the code to see if something was busted.

·         Taking large data calls, breaking them up and recomposing the messages both on the client and server can take longer but providing the UI with a chance to redraw a small part of the screen at a time gives an animation effect that can be informative to the user, more interesting than a percent bar and give your user a more patient outlook on things.

·         Background threads are also fun but getting back to the UI update in a pattern-based operation can introduce some spaghetti-like coding habits that nobody wants to read or debug.

File Uploads and Downloads

·         The most important questions for file uploads are:

o   Do you upload the file as-is and process on the server?

o   How long will the entire upload operation take?

·         Processing a file on the server has advantages in that you have the file and can always go back and reprocess it. The downside is that if you are not using all the data from the file, you are consuming more bandwidth than necessary. You are also probably storing both the file and the data from that file, so you are dramatically increasing your server storage requirements. I’ve currently got about 370 GB of GPS data on my server and it’s growing all the time. If I put all of this in my hosted environment I’d be out of money pretty quickly.

·         For images being uploaded to a site, you also need to consider whether or not you want to shrink the image before upload. I’m not there yet so I don’t know what SL 4 has in store for me in that venue.

·         In considering the upload time for a file, you need to decide what kind of proxy your WCF service will use. A custom binding with BinaryEncoding will allow for very large downloads but I’ve been personally unable to increase the upload size past 8,192 bytes. Also, since upload speeds are almost always slower than downloads for residential broadband, you need to think about whether or not you should break the code out into multiple uploads and whether or not a Polling Duplex service that gives some feedback about progress is appropriate.

SQL Server Geography Types

·         It’s interesting to me that there is so much discussion of geo-this and geo-that at Microsoft but very few people are discussing how to effectively use the geography and geometry types in SQL Server. Perhaps it’s because they are not supported outside of SQL Server in EF, ADO.Net or pretty much anything else.

·         My experience so far is that you should store both the geographic raw data AND the compiled geography field when using these types. Each has its own place and together you can get a lot of information very quickly.

·         The biggest problem with geography types is their speed. When I’m looking for all points in a map extent, I cannot define a geographic index and then get everything in it. My users may only be viewing that extent for 10 seconds. It’s much easier to store my center points or individual lat/longs in decimal columns and then index those. Crazy fast and I still get the geography features once I have my base rowset to work with.

·         The general rule I follow is to use geography types to calculate but not to select.

Patterns and Practices

·         I’m pretty happy at this stage with MVVM Light from Laurent Bugnion of GalaSoft. It’s a very easy to use, free helper toolkit for both WPF and Silverlight. What it took a few days for me to get through my thick skull was that MVVM is a pattern for the client. That’s all. As a developer, you need to implement other patterns on the server (service) side. Probably a no-brainer for most but I’m old and set in my ways so it took a bit to sink in.

·         The challenge for me is still the Inversion of Control (IOC) stuff, TDD for Silverlight and mocking. Again, all in good time. It’s difficult to hold off forging ahead solving the cool problems and take a breath to build out some unit tests. The results are almost always a stronger platform though and I AM learning.

Styling

·         The greatest thing for me about Silverlight is the ability to provide an immersive environment where subtle cues guide a user through the process of doing whatever you want him to do. This is a blessing and a curse because all of these styling operations can take a lot (!!!) of time if you hand-code them, or much less time if you learn to use Expression Blend right the first time! Blend is a true designer’s tool and I’m no designer. I have a couple of friends who do a lot of graphic design and I’ve been trying to convince them to start working with Xaml. No luck yet but I really believe there will be a thriving employment market for top-shelf Xaml designers very soon.

·         The other problem with styling is deciding how much is too much. For instance, CompletIT has an amazing Silverlight presentation on their site but it’s literally so much that it’s distracting! There is no doubt that they know their stuff and as a design firm, perhaps that site shows the coolness. To me though I would rather see fewer effects and more usability. See a previous post about a good hammer…

So I’ve started blogging again. I think that I’m going to tinker around with an MVVM-friendly background worker for uploading large contents and see how it goes.

Probably the best thing about Silverlight for me is the fact that it's not BI, it's not Sharepoint, it's not CRM and it's not BPM. I had considered the shrink-wrap-framework-package-as-a-foundation-for-everything-under-the-sun to be the wave of the future. It still may be, but I have found that development job to be tedious and uninspiring. I have some experience in all those packages and am looking forward to seeing how an immersive RIA technology like Silverlight can make them even better. An effective UI provides the user with guidance to get their job done better and faster. Now THAT sounds like a project that would wake me up in the morning.

 

RIA Services - Iterate Items in EntityQuery Object

I’m trying to get EF working with SL3, SSL, RIA Services et al. It’s a long road and I know there are issues with any route you decide to take when building a business app in Silverlight 3. In building previous iterations of my little program I used straight WCF services and had issues with host headers when using SSL and the cross domain access file.

RIA Services is definitely easier to work with in terms of the plumbing. It sets up a new handler in the web.config and if you work out your EF model properly, the data access and communications portions go relatively quickly.

The issue I was working through this evening was that I am using the ESRI mapping control and I could not get it to databind the graphics layer to a collection of graphics objects bound to a data context query. The backup plan is to iterate the items in the data context query results. Good luck finding a helper post anywhere to write that Linq query! So here is one I finally found in a forum post on the RIA Services list. I lost the URL but here is the relevant code:

// Instance the list of objects we’ll use to
// build the map graphics.
List<MyMapPoint> lst = new List<MyMapPoint>();
// Query that gets the list of points from
// the database.
var queryhist = _MyContext.GetPointsQuery();
// Execute the query.
var qOp = _MyContext.Load(queryhist, (pts) =>
{
    foreach (var pt in pts.Entities)
    {
        lst.Add(
            new MyMapPoint(
                pt.PointID,
                pt.PointName,
                pt.Latitude,
                pt.Longitude));
    }
}, null);

I know I could probably serialize my query results directly into my list but I’m just not there yet (read: I don’t know how to do that!).
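For what it’s worth, that direct projection could look something like this inside the same Load callback. This is an untested sketch using the MyMapPoint type from above, and Select/ToList require a using System.Linq directive.

```csharp
var qOp = _MyContext.Load(queryhist, (pts) =>
{
    // Project the loaded entities straight into the list
    // instead of looping and adding one at a time.
    lst = pts.Entities
             .Select(pt => new MyMapPoint(
                 pt.PointID,
                 pt.PointName,
                 pt.Latitude,
                 pt.Longitude))
             .ToList();
}, null);
```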

The points of interest viewer/editor should be up fairly soon at www.muddygps.com. Some time thereafter, the GPS trail routing will also get rolling and then I can start fooling with the race interface. That will be the fun part! Incidentally, the app as it exists right now on the internet is being completely rewritten (again) so it should get a lot faster soon.

Another item up for bids is going to be trying to work out why the cross domain access policy isn’t working right. If I make the entire Silverlight 3 portion of the site SSL it works, but as soon as I make it SSL only for the RIA Services data calls, it makes squishing noises. Not going to worry about that one for now because the data access seems to be going relatively quickly for the size of data I’m throwing around.

I’m also building on some of Brad Abrams’ tutorials for SEO and deep linking. Hopefully that will get my site some traffic. Of course I can’t expect much at the moment since I’ve got a face but no content online.

Good Documentation…

Getting nationalized health care:

+ $3,000,000,000,000.00

Socializing the United States of America:

+ $10,000,000,000,000.00

Tax Hike to 'Evil Rich People' to raise less than 20% of funds needed for health care:

5.4%

Good documentation about how the new health care system will work:

Priceless

http://docs.house.gov/gopleader/House-Democrats-Health-Plan.pdf

Give ya two guesses where the rest of the funds for this travesty of liberty will come from but if you’re listening you’ll only need one.

Posted: Jul 15 2009, 12:28 PM by axshon
Small Person, Big Life

My friends and a few work associates know some of the details of my last trip to China to adopt my second daughter. While there my first adopted daughter, then 4 years old, got mysteriously sick. Very very sick. My wife and I spent a lot of time and obscene amounts of money playing the Ugly American role and advocating for our child in a medical and political system not set up with any specific value placed on an individual life. I love the Chinese people (all of them) and the Chinese culture is a part of my life for the sake of my children but I’m still in 1969 when it comes to Communist governments…

Nuff said about that. Suffice it to say that anyone ever who stands between me and my family will have problems much larger than they imagined possible. So when my kid gets sick and nobody knows what the hell is going on it’s incomprehensibly frustrating. You pray, you weep, you hold her hand and smile and say it’s going to get better soon even though you’re not at all sure it will.

So life being what it is, sometimes it doesn’t get better. Thus the reason for this post. The Spohrs family lost their fight last week for their daughter’s life. I don’t know them and I never will but I do feel a small part of their pain.

They have a raffle going to raise funds for the March of Dimes that you can find on their site but my reason for posting here is that Bill Simser is offering his MVP MSDN Premium Subscription with Team Suite for raffle. He has instructions on his blog about how to get involved. Cost is 10.00 per entry and it’s going to a damn good cause.

Please do this and please pass it on.

Posted: Apr 13 2009, 08:16 AM by axshon