The ultimate ASP.NET best practice - "it depends"

I previously blogged about a common conundrum in database design: deciding upon the table schema that best satisfies the demands of good programming practice, fast application performance, and generally accepted normalization.  Extending that concept, one of the major fallacies when learning ASP.NET is forgetting that there is more than one way to skin a cat.

In similar fashion, when people hear the oft-used term "best practices", they commonly don't realize that there are several ways to get something done, much less which one is best.  They read an article, book, or blog post, or watch a webcast, hear "best practices", and immediately gravitate toward the presented solution as the Holy Grail.  Then, when they read another such recommendation for the optimal way to program something, it often conflicts with the first, and the result is confusion.

It's largely the fault of those doing the teaching, who fail to make clear that not all projects are the same, and that the right situation calls for the right tool.

Take, for example, the myth of "proper" DataGrid paging.  When ASP.NET first hit the market, people loved the DataGrid's automatic paging capabilities, downloading data in large batches and iterating over a recordset page by page, a task that was a monumental undertaking in the days of ASP 3.0.  Of late, the popular practice has been to develop a custom paging scheme: setting the DataGrid's AllowCustomPaging property to true and wiring up logic to (presumably) buttons with which to navigate a set of data, large or small.  To support this, the recommended practice is to create a stored procedure that takes the page number and page size as arguments, returns the total record count as an output parameter, and sends back only the rows for the requested page.  It typically looks something like this (borrowed from O'Reilly's excellent "ASP.NET Cookbook" (2004)):

CREATE PROCEDURE GetPagedData
@PageNumber INT,
@PageSize INT,
@TotalRecords INT OUTPUT
AS
DECLARE @FirstRecordInPage INT
DECLARE @LastRecordInPage INT

-- calculate the first and last record numbers for the requested (zero-based) page
SELECT @FirstRecordInPage = @PageNumber * @PageSize + 1
SELECT @LastRecordInPage = @FirstRecordInPage + @PageSize

-- create a temporary table and place the data into it
CREATE TABLE #Book
(
    ID    INT    IDENTITY(1,1)    NOT NULL,
    BookID    INT    NOT NULL,
    Title    NVARCHAR(100)    NOT NULL,
    ISBN    NVARCHAR(50)    NOT NULL,
    Publisher    NVARCHAR(50)    NOT NULL
)

-- copy the data into the temporary table
INSERT INTO #Book (BookID,Title,ISBN,Publisher)
    SELECT BookID,Title,ISBN,Publisher FROM Book ORDER BY Title

-- get the rows required for the passed page
SELECT * FROM #Book WHERE ID >= @FirstRecordInPage AND ID < @LastRecordInPage

-- get the total number of records in the table
SELECT @TotalRecords = COUNT(*) FROM Book
GO
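
To put the procedure in context, here is a rough sketch of how it might be called from code-behind and wired to a DataGrid.  The page class, the BookGrid control, the BindPage method, and the connection string are placeholders of my own, not code from the Cookbook:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI.WebControls;

public class PagedBookList : System.Web.UI.Page
{
    // Placeholders: the DataGrid declared in the .aspx page and a connection string.
    protected DataGrid BookGrid;
    private const string ConnectionString = "server=...;database=...;";

    // Fetches one page of rows via GetPagedData and binds it to the grid.
    private void BindPage(int pageIndex, int pageSize)
    {
        DataSet page = new DataSet();
        int totalRecords;

        using (SqlConnection connection = new SqlConnection(ConnectionString))
        {
            SqlCommand command = new SqlCommand("GetPagedData", connection);
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@PageNumber", SqlDbType.Int).Value = pageIndex;
            command.Parameters.Add("@PageSize", SqlDbType.Int).Value = pageSize;
            SqlParameter total = command.Parameters.Add("@TotalRecords", SqlDbType.Int);
            total.Direction = ParameterDirection.Output;

            SqlDataAdapter adapter = new SqlDataAdapter(command);
            adapter.Fill(page, "Book");           // output parameter is populated after Fill
            totalRecords = (int)total.Value;
        }

        BookGrid.AllowPaging = true;
        BookGrid.AllowCustomPaging = true;        // the database, not the grid, does the paging
        BookGrid.PageSize = pageSize;
        BookGrid.VirtualItemCount = totalRecords; // lets the grid render the correct pager
        BookGrid.CurrentPageIndex = pageIndex;
        BookGrid.DataSource = page.Tables["Book"];
        BookGrid.DataBind();
    }
}

A real page would call BindPage from Page_Load and again from the grid's PageIndexChanged event handler as the user navigates.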


People then flocked to develop methods like this and used them ad nauseam in their projects, even for recordsets containing only tens of records.  Granted, this minimizes the total data held in memory, but it nonetheless requires a trip to the database every time someone invokes the paging routine.  The alternative would be to pull ALL of the records into a disconnected DataSet object at the onset and then programmatically store that data via the .NET Framework's Cache API, assigning it an expiration or dependency.  This fundamentally results in a single, albeit heavier, database call, but bypasses the need for any repetitive calls to the database.
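
As a rough sketch of that caching approach (the cache key, the ten-minute expiration, the BookGrid control, and the connection string are arbitrary placeholders, assuming a Book table like the one above):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI.WebControls;

public class CachedBookList : System.Web.UI.Page
{
    // Placeholders: the DataGrid declared in the .aspx page and a connection string.
    protected DataGrid BookGrid;
    private const string ConnectionString = "server=...;database=...;";

    // Binds the grid against a DataSet held in the ASP.NET cache,
    // hitting the database only when the cached copy has expired.
    private void BindBooks()
    {
        DataSet books = (DataSet)Cache["AllBooks"];
        if (books == null)
        {
            books = new DataSet();
            using (SqlConnection connection = new SqlConnection(ConnectionString))
            {
                SqlDataAdapter adapter = new SqlDataAdapter(
                    "SELECT BookID, Title, ISBN, Publisher FROM Book ORDER BY Title",
                    connection);
                adapter.Fill(books, "Book");
            }

            // Ten minutes is an arbitrary choice; a CacheDependency could be
            // used instead if the data changes on an unpredictable schedule.
            Cache.Insert("AllBooks", books, null,
                DateTime.Now.AddMinutes(10),
                System.Web.Caching.Cache.NoSlidingExpiration);
        }

        BookGrid.AllowPaging = true;              // the grid pages the in-memory data itself
        BookGrid.PageSize = 10;
        BookGrid.DataSource = books.Tables["Book"];
        BookGrid.DataBind();
    }
}

With the full table cached, the DataGrid's built-in paging works against the in-memory data, and a PageIndexChanged handler only needs to update CurrentPageIndex and rebind.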

But can one safely assume either solution is optimal without knowing how frequently the underlying data will be accessed?  Or how rapidly it will be modified (if at all)?  Or whether the total number of records can be known beforehand?  Or whether it can be accurately predicted how far into the recordset a user will navigate, meaning will only the first 50 records out of a collection of 1,000 ever be accessed?  These are all questions that need to be considered when determining what truly is the "best" solution.

Other debates over performance-conscious data access that have been ongoing since ASP.NET reached the public include whether to use a DataList, Repeater, or DataGrid for display, and whether images should be stored on disk or in a database.

And the bottom-line answer?  In terms of best practices?  IT DEPENDS.  This very concept was reinforced by ASP.NET laureate Jeff Prosise in ASP.NET Pro Magazine when dealing with the image storage debate.  Such a simple answer goes a long way, but it's too commonly forgotten.  There is no better mousetrap for all environments.

Too often solutions are presented as the "best" way without mentioning a practice's overall impact in the grander scope of a project.  It's the mark of a mature programmer to know the best way to do something within the context of one's own work.  I'm not recommending everyone start throwing caution to the wind and take up developmental vigilantism...just more responsible instruction.

Let's remember to emphasize best-fit solutions as well as best programming practices when teaching those just grasping the concepts.  And for those consuming the information the rest of us put forth...take it, learn from it, apply it, and extend it.  Know that there are several ways to do the same thing in Microsoft development, and figure out which one is the best fit within the scope of your apps.

3 Comments

  • The idea that with best practices there are few universal truths is something I think I mentioned several times in my book, and in fact right up front in the preface. The hardcore academic types seem to want to kick you in the nuts when you say it, but it's so true, the bit about skinning a cat (though I've never understood why you'd want to skin a cat).

  • Excellent comment!

    And an excellent subject to prove it; that's one of the most abused subjects in ASP.NET (the paging/sorting example with the DataGrid). I heard a Microsoft consultant destroying some friends for using text SQL in their data tiers. Of course, stored procedures provide another level of abstraction, blah, blah, blah, but creating a temporary table can sometimes create an even larger overhead than not using the optimized DataAdapter.

    Keep 'em coming!!!

  • Hi Jeff,

    I'm looking forward to giving your forthcoming book a spin and doing a review on it! When does it come out again?

Comments have been disabled for this content.