Passing Multiple Records to a Stored Procedure in SQL Server

There are so many different ways to work with data that it can make your head spin. Regardless of the technology, one thing I always try to do when working with databases is minimize the number of calls that are made. I use ORM frameworks a lot, so I have to make sure my code doesn't end up making a ton of round trips to the database and being overly "chatty".

My company is currently working on a large Silverlight 3 application for a customer, and we have a requirement to allow CSV file uploads through the application. That's pretty straightforward to do with Silverlight, but a single CSV file can contain thousands of records, and there's a lot of custom functionality around each one. Initially I was going to handle the business rules and insertion for each row individually. Once I saw how many rows were involved, however, I backed off that approach, since a single file upload could trigger thousands and thousands of calls to the database, which certainly isn't efficient. Instead, I decided it would be better to pass all of the rows into a stored procedure and let it handle everything. That way I can make a single database call yet handle inserting (and updating and deleting, based upon the business rules) many records.

So how can you pass multiple records into a stored procedure? Back in "the day" I'd pass in delimited strings and then parse them. Although that technique probably wasn't the most efficient, it got the job done. With SQL Server 2005 and higher we have access to a much cleaner technique thanks to the XML data type and its XQuery support. The current project I'm working on relies on SQL Server 2005 behind the scenes, so that's what I'll focus on here. Note that SQL Server 2008 also allows .NET DataTable objects to be passed to a stored procedure when using table-valued parameters. See http://www.codeproject.com/KB/database/sqlserver2008.aspx for an overview of the table-valued parameter functionality.
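If you do happen to be on SQL Server 2008, the table-valued parameter route looks roughly like the sketch below. The type name, procedure name, and columns here are hypothetical (loosely modeled on the import data in this post), just to show the shape of the feature: you create a table type, then declare a READONLY parameter of that type.

```sql
-- Hypothetical table type mirroring the import rows (SQL Server 2008+)
CREATE TYPE JobMaterialImportType AS TABLE
(
    Area VARCHAR(250),
    Phase VARCHAR(250),
    WorkCode INT,
    WorkCodeTitle VARCHAR(250),
    Description VARCHAR(250),
    Quantity DECIMAL(18,2),
    TotalHours DECIMAL(18,2)
);
GO

-- Table-valued parameters must be declared READONLY
CREATE PROCEDURE ImportJobMaterialsTvp
    @JobMaterials JobMaterialImportType READONLY,
    @JobID INT
AS
BEGIN
    -- The parameter can be queried like any table variable
    SELECT Area, Phase, WorkCode, WorkCodeTitle,
           Description, Quantity, TotalHours
    FROM @JobMaterials;

    --Process the data
END
```

On the .NET side you'd fill a DataTable with matching columns and pass it as a SqlParameter with SqlDbType.Structured. Since this project targets SQL Server 2005, the XML approach below is what I actually used.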

For the import I can serialize the list of records to XML and then pass them into a stored procedure.  It appears that you can pass an XmlDocument object from .NET if the stored procedure takes an XML input parameter type.  However, I never got it to work properly doing that so I pass a string containing serialized XML data and then load it into the XML data type.  The stored procedure can then use XQuery to convert the XML into actual rows.  Here’s some code that handles parsing a flat file, converting the data into a List<JobMaterialImport>, serializing the list into XML and then passing the serialized XML data to a stored procedure named ImportJobMaterials.

public List<ImportJobMaterialsResult> ImportJobMaterials(int jobID, Stream stream, string coNumber, string coDesc, string xmlMapFile)
{
    //Convert flat file to List<JobMaterialImport> using converter and XML mapping file
    //Mapping file passed from FlatFileHandler.ashx in web project
    FlatFileToObjectConverter<JobMaterialImport> converter = new FlatFileToObjectConverter<JobMaterialImport>(stream, xmlMapFile);
    List<JobMaterialImport> imports = converter.ConvertToList();
    imports = imports.Take(imports.Count - 2).ToList(); //last rows are bogus in flat-file

    //Generate XML and pass to sproc
    using (StringWriter sw = new StringWriter())
    {
        XmlSerializer xs = new XmlSerializer(typeof(List<JobMaterialImport>));
        xs.Serialize(sw, imports);
        try
        {
            //Pass materials into sproc. XmlSerializer emits a utf-16 declaration
            //when writing to a StringWriter, and SQL Server will refuse to convert
            //a string with that declaration to XML, so swap it to utf-8 first.
            string xml = sw.ToString().Replace("utf-16", "utf-8");
            return this.DataContext.ImportJobMaterials(xml, jobID, coNumber, coDesc).ToList();
        }
        catch (Exception exp)
        {
            Logger.Log("Error in JobManagementRepository.ImportJobMaterials", exp);
        }
    }
    return null;
}


The XML that’s generated from the code above looks like the following.  Security Note: We’ll assume that appropriate measures have been taken to clean the data and ensure it doesn’t contain any bad stuff related to injection attacks. That’s especially important if you’ll be running any dynamic SQL in your stored procedure.  I’m not, but I thought I’d be a good citizen and point it out…always sanitize your data with some Clorox before using it. :-)

<ArrayOfJobMaterialImport xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <JobMaterialImport>
    <Area>BUILDING A</Area>
    <Phase>LIGHTING</Phase>
    <WorkCode>0</WorkCode>
    <WorkCodeTitle>Manually Assigned</WorkCodeTitle>
    <Description>4x1 1/2in. SQ BOX COMB KO</Description>
    <Quantity>2</Quantity>
    <TotalHours>10.46</TotalHours>
  </JobMaterialImport>
  <JobMaterialImport>
    <Area>BUILDING A</Area>
    <Phase>LIGHTING</Phase>
    <WorkCode>0</WorkCode>
    <WorkCodeTitle>Manually Assigned</WorkCodeTitle>
    <Description>#8x   3/4 P/H SELF-TAP SCREW</Description>
    <Quantity>4</Quantity>
    <TotalHours>0.28</TotalHours>
  </JobMaterialImport>
  <JobMaterialImport>
    <Area>BUILDING A</Area>
    <Phase>LIGHTING</Phase>
    <WorkCode>605</WorkCode>
    <WorkCodeTitle>Wiring and System Devices</WorkCodeTitle>
    <Description>1G TGL SWITCH PLATE - PLASTIC IVY</Description>
    <Quantity>2</Quantity>
    <TotalHours>0.89</TotalHours>
  </JobMaterialImport>
</ArrayOfJobMaterialImport>


Here’s what the stored procedure looks like:


CREATE PROCEDURE [ImportJobMaterials]
    @JobMaterialsXml AS VARCHAR(MAX),
    @JobID AS INT,
    @ChangeOrderNumber AS VARCHAR(10) = NULL,
    @ChangeOrderDescription AS VARCHAR(100) = NULL
AS
    BEGIN
        DECLARE @XML AS XML
        
        DECLARE @MaterialsTable TABLE
        (
            ID INT IDENTITY(1,1),
            Area VARCHAR(250),
            Phase VARCHAR(250),
            WorkCodeID INT,
            WorkCodeTitle VARCHAR(250),
            MaterialTitle VARCHAR(250),
            Quantity DECIMAL(18,2),
            TotalHours DECIMAL(18,2)
        )
        
        SELECT @XML = @JobMaterialsXml
        
        INSERT INTO @MaterialsTable (Area, Phase, WorkCodeID, WorkCodeTitle, MaterialTitle, Quantity, TotalHours)
        SELECT M.Item.query('./Area').value('.','VARCHAR(250)') Area,
               M.Item.query('./Phase').value('.','VARCHAR(250)') Phase,
               M.Item.query('./WorkCode').value('.','INT') WorkCodeID,
               M.Item.query('./WorkCodeTitle').value('.','VARCHAR(250)') WorkCodeTitle,
               M.Item.query('./Description').value('.','VARCHAR(250)') MaterialTitle,    
               M.Item.query('./Quantity').value('.','DECIMAL(18,2)') Quantity,    
               M.Item.query('./TotalHours').value('.','DECIMAL(18,2)') TotalHours
        FROM @XML.nodes('/ArrayOfJobMaterialImport/JobMaterialImport') AS M(Item)
        
        --Process the data                    
    END

Once the XML data comes in, it's converted into an XML data type using the SELECT @XML = @JobMaterialsXml syntax. The key part of the T-SQL code is the SELECT statement that grabs each value out of the XML by looking for specific child nodes (note that the node names in the query have to match the element names in the serialized XML exactly). If the values were stored in attributes instead, you'd do something like M.Item.value('@attributeName','DBType'). Each JobMaterialImport node in the XML is located by the @XML.nodes('/ArrayOfJobMaterialImport/JobMaterialImport') AS M(Item) code. What's nice about this approach is that a single call can be made to the database yet thousands of records can be processed. It's not optimal for every situation, but it was exactly what I needed and fairly straightforward to use.
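To make the attribute-based variation concrete, here's a small self-contained sketch. The XML shape here is hypothetical (my serializer generates child elements, not attributes), but it shows how the shredding query changes: you skip the query() step and read each attribute directly with value():

```sql
DECLARE @XML AS XML

SET @XML = '<Materials>
              <Material Area="BUILDING A" Phase="LIGHTING" Quantity="2" />
              <Material Area="BUILDING A" Phase="LIGHTING" Quantity="4" />
            </Materials>'

-- @attributeName pulls the value straight off each Material node
SELECT M.Item.value('@Area','VARCHAR(250)') Area,
       M.Item.value('@Phase','VARCHAR(250)') Phase,
       M.Item.value('@Quantity','DECIMAL(18,2)') Quantity
FROM @XML.nodes('/Materials/Material') AS M(Item)
```

This returns one row per Material node, just like the element-based version above returns one row per JobMaterialImport node.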
