Archives

Archives / 2005 / February
  • "ASP.NET 1.1 Insider Solutions" is a work of art...

    ...and I haven't even read the damn thing yet.  My Amazon order arrived at my office this morning, and I marveled at the book's design.  I've played with the samples from www.daveandal.com before, but I never expected the title to be of such high production quality.  The pages are a thinner version of the glossy paper that's more common to Flash and PhotoShop books, and it sits well.

    Plus, the book has contributions by a friend of mine, Mr. XML himself, Dan Wahlin, so it's a sure-fire winner.

    I'll have a review posted in a couple of days, but this is a trophy book.  Shoot, I'll probably need to order another just to keep in mint condition.

    Read more...

  • Creating a blog-like chronological list to track user posts by month

    One of the cool things everyone likes about blogs is that they let you navigate through a user's submissions chronologically, usually by month.  Most, like .TEXT, even show how many posts a user made in each month.
    It's a little feature people really gravitate toward, and it makes blogs easy and fun to use.

    Here's a T-SQL stored procedure I wrote that returns a recordset you can easily use in your apps to provide this type of user experience.  I use it in the "Familiar Faces" community photo gallery app I built and run on my site.  It basically uses some of SQL Server's date-specific system functions to group and display field data of type SMALLDATETIME.

    When bound to a list control (in this case a DataList), it renders a month-by-month archive list, with each entry showing the month, year, and that month's post count.



    CREATE PROCEDURE GetPersonalGalleryArchive
    (
        @UserID INT
    )
    AS
    SELECT
        DATENAME(mm, DateOfPost) + ' ' + DATENAME(yy, DateOfPost) AS [MonthYear],
        COUNT(BlogPostID) AS [MonthlyPosts],
        CONVERT(SMALLDATETIME, CONVERT(VARCHAR(4), MONTH(DateOfPost)) + '/1/' + CONVERT(VARCHAR(6), YEAR(DateOfPost))) AS [LinkDate]
    FROM BlogPosts
    WHERE UserID = @UserID
    GROUP BY
        DATENAME(mm, DateOfPost) + ' ' + DATENAME(yy, DateOfPost),
        CONVERT(SMALLDATETIME, CONVERT(VARCHAR(4), MONTH(DateOfPost)) + '/1/' + CONVERT(VARCHAR(6), YEAR(DateOfPost)))
    ORDER BY [LinkDate] DESC
    GO

    Just FYI....it uses the following database table schema:

    -- this table is used by the SPROC
    CREATE TABLE BlogPosts
    (
    BlogPostID    INT    IDENTITY(1,1)    PRIMARY KEY    NOT NULL,
    UserID        INT     NOT NULL, -- links to a membership table
    DateOfPost SMALLDATETIME DEFAULT GETDATE() NOT NULL
    )

    You could also easily bind the recordset to a vertical-reading DataList or Repeater, for the blog effect that we're so used to.  :)
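
    If it helps, here's a rough sketch of how I wire the recordset up to a DataList from the code-behind.  The connection string field and the DataList's ID are just placeholders - swap in your own:

    using System.Data;
    using System.Data.SqlClient;

    private void BindArchiveList(int userID)
    {
        using ( SqlConnection conn = new SqlConnection(connectionString) )
        {
            SqlCommand cmd = new SqlCommand("GetPersonalGalleryArchive", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@UserID", SqlDbType.Int).Value = userID;

            SqlDataAdapter da = new SqlDataAdapter(cmd);
            DataTable archive = new DataTable();
            da.Fill(archive);

            // each row gives "February 2005"-style text, that month's post count,
            // and a date you can use to build the archive hyperlink
            ArchiveDataList.DataSource = archive;
            ArchiveDataList.DataBind();
        }
    }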

    Have fun!

    Read more...

  • Benefits of mixing MIME types in podcast feeds

    Something theoretical/academic crossed my mind this evening - what the (dis)advantage(s) would be, as a content provider, of mixing one's podcasted multimedia content into the same RSS feed as one's text.  Specifically, how wise would it be to have one's multimedia and textual content integrated into a single XML-based pull channel?


    I've seen many hybrid feeds that contain text-only articles, as well as <ENCLOSURE> tags containing URIs to MP3s for audio content.  But then again, I've also noted that several well-known, well-established content providers make the effort to distinctly segregate their feeds by MIME type, keeping audio and text both available, but completely separate.


    In my own case, for the past six months I've been supporting an XML feed that serves up textual versions of my station's news stories, and I'll soon be adding MP3 audio of various content we generate.  In that light, you could consider this a "migration" project of sorts.  It certainly wouldn't be too painful, developmentally, to add the incoming audio content to our current RSS feed - it's just another DataTable to iterate through in a DataSet.  It would be nice and clean, and perform beautifully.
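
    To illustrate what I mean (the table and column names here are purely hypothetical), the audio items would simply pick up an extra <ENCLOSURE> element when the feed is written out, alongside the usual elements the text stories already get:

    using System.Data;
    using System.Xml;

    private void WriteItems(XmlTextWriter writer, DataTable stories, DataTable audioClips)
    {
        // text-only news stories, exactly as the feed serves them today
        foreach ( DataRow row in stories.Rows )
        {
            writer.WriteStartElement("item");
            writer.WriteElementString("title", (string)row["Headline"]);
            writer.WriteElementString("link", (string)row["Url"]);
            writer.WriteElementString("description", (string)row["Summary"]);
            writer.WriteEndElement();
        }

        // podcast items: same structure, plus an enclosure pointing at the MP3
        foreach ( DataRow row in audioClips.Rows )
        {
            writer.WriteStartElement("item");
            writer.WriteElementString("title", (string)row["Title"]);
            writer.WriteStartElement("enclosure");
            writer.WriteAttributeString("url", (string)row["Mp3Url"]);
            writer.WriteAttributeString("length", row["FileSizeBytes"].ToString());
            writer.WriteAttributeString("type", "audio/mpeg");
            writer.WriteEndElement();  // enclosure
            writer.WriteEndElement();  // item
        }
    }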


    What I'm wondering is how practical it would be not only for aggregators already subscribing to my stuff, but also for smaller apps like personal web pages, and for the podcast applications that will subscribe in the future.  At this point, I'm admittedly (and unjustifiably) assuming that the majority of podcast apps, being audio channels by nature, will ignore <ITEM> nodes in an RSS feed that lack an <ENCLOSURE>.  I'm also going to assume that any audio encountered will get an icon or download link of some sort in the major aggregators.


    Alas, I think my previous conclusion of "it depends" is still the best advice here, considering the context, scale, scope and target audience of an application - so it's time to start playing.

    Read more...

  • The best software modeling/project management tool you've never heard of: PowerPoint

    I've found that some of the simplest tools are often the most obvious (can you say Occam's Razor???).

    As an example, when I can avoid using super high-end tools for UML modeling and/or Gantt charts in my software project management, I use PowerPoint.  It's something I've already got no matter what PC I'm on, and minus the automatic code generation, it does the same thing - visual summaries of the work I'm to do.

    Don't get me wrong, I enjoy other niche tools, but sometimes less really is more.

    Read more...

  • NBA embraces Metallica for All-Star Weekend

    One of the things I'm really enjoying about NBA All-Star Weekend is that the good people in Denver apparently really love metal music.  I'm writing this literally seconds after the start of the 3-point shooting competition, for which the booming low-end of Metallica's classic dirge "For Whom the Bell Tolls" resonated throughout the Pepsi Center during the player introductions.  Also, during last night's Rookie Challenge, the event coordinators announced both the Rookie and Sophomore teams to the infinitely-looped main riff from "Enter Sandman".

    Sweet.

    Read more...

  • Working on custom progress bar control for mobile ASP.NET apps

    I'm spending a good part of this President's Day weekend setting up my station's new podcast feeds.  They basically feature comedy clips from our radio DJs, audio newscasts, and selected excerpts from the sportstalk radio show I host.  Seeing how positively so many people responded to the launch of our RSS newsfeeds, I'm excited about this new venture.  I personally know a ton of people locally who got iPods for Christmas.

    Read more...

  • Do APA citation standards account for non-printable new media?

    I'm waxing idiotic tonight, so just bear with me…


    Further to a post I put up about citing one's own blog posts in academic work, I recall the APA writing/citation standards just barely embracing e-mail correspondence.  How about, for the sake of argument, if something noteworthy were revealed to me in a chatroom?  Or over SMS?  Or in a P2P instant messaging app?


    I mean, if the APA is tolerant enough to accept verbal, non-documented conversations as a citable medium, shouldn't media that are tangible, but questionably reproducible, be worth a nod?

    Read more...

  • How goofy is it to cite your own blog work in an academic paper?

    I'm starting my doctoral work pretty soon, and something just dawned on me…how arrogant/vain/self-indulgent/downright silly would it be to cite one's own weblog posts as references when writing an academic work?  It would certainly be acceptable under what I remember of APA standards, albeit a little strange for an instructor validating a student's work to see, within a list of "resources cited," multiple instances of the student's own documented thoughts.


    Years ago when I was getting my MBA, I'd be one of the few in my classes whose papers cited a majority of sources from web URLs.  But these were the days before the rise of the almighty blog, when reliable news and academic sources were what got used.  In the years since, I've copied some of my better works over to my blog, both because the topics I wrote about are still relevant and because I'd like an easily accessible place to find them in case I ever need to bring them up suddenly.


    My dissertation is likely going to be a continuation/extension of my Master's thesis - an examination of the variance in consumer psychology between traditional and Internet-based product marketing.  (I know, I know…YAWN!)  As such, I've got a body of existing work with lots of things I can draw from.


    But how weird would this be?  I guess as in all things, moderation would be key.

    Read more...

  • On the hunt for ASP.NET apps and Podcasting

    I've been looking around for good, solid examples of people developing Podcasting applications using ASP.NET.  I'm curious to see how people are getting their content into multimedia formats.

    If anyone knows of any good samples, please share.

    Read more...

  • The ultimate ASP.NET best practice - "it depends"

    I previously blogged about a common conundrum in database design: deciding upon the best table schema to satisfy the demands of good programming practice, fast application performance, and generally-accepted normalization.  Extending this concept, one of the major fallacies when learning ASP.NET is that people forget there's more than one way to skin a cat.

    When it comes to the oft-used term "best practices," people similarly don't realize that there are several ways to get something done, much less a single best way.  They read an article, book, or blog post, or watch a webcast, hear "best practices," and immediately gravitate toward the presented solution as the Holy Grail.  And when they read another such recommendation for the optimal way to program something, it often conflicts with the first, and is therefore confusing.

    It's largely the fault of those doing the teaching, who fail to convey that not all projects are the same, and that the right situation calls for the right tool.

    Take, for example, the notion of "proper" DataGrid paging.  When ASP.NET first hit the market, people loved the DataGrid's automatic paging capabilities, and downloaded data in large bunches, iterating over a recordset page by page - a task that was a monumental undertaking in the days of ASP 3.0.  Of late, the popular practice has been to develop a custom paging scheme, setting the boolean DataGrid.AllowCustomPaging property to true, and wiring up logic to (presumably) buttons with which to navigate a set of data, large or small.  To support this, the recommended practice is to create a stored procedure that takes the page number and page size as arguments, and returns the total record count as an output parameter.  This is typically done like so (borrowed from O'Reilly's excellent "ASP.NET Cookbook" (2004)):

    CREATE PROCEDURE GetPagedData
    @PageNumber INT,
    @PageSize INT,
    @TotalRecords INT OUTPUT
    AS
    DECLARE @FirstRecordInPage INT
    DECLARE @LastRecordInPage INT

    -- calculate the first and last rows of the requested page (assumes a zero-based page number)
    SELECT @FirstRecordInPage = @PageNumber * @PageSize + 1
    SELECT @LastRecordInPage = @FirstRecordInPage + @PageSize

    -- create a temporary table and place the data into it
    CREATE TABLE #Book
    (
        ID    INT    IDENTITY(1,1)    NOT NULL,
        BookID    INT    NOT NULL,
        Title    NVARCHAR(100)    NOT NULL,
        ISBN    NVARCHAR(50)    NOT NULL,
        Publisher    NVARCHAR(50)    NOT NULL
    )

    -- copy the data into the temporary table
    INSERT INTO #Book (BookID,Title,ISBN,Publisher)
        SELECT BookID,Title,ISBN,Publisher FROM Book ORDER BY Title

    -- get the rows required for the passed page
    SELECT * FROM #Book WHERE ID >= @FirstRecordInPage AND ID < @LastRecordInPage

    -- get the total number of records in the table
    SELECT @TotalRecords = COUNT(*) FROM Book
    GO
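
    For what it's worth, a page would typically consume a procedure like that along these lines.  This is just a sketch (not from the Cookbook), with the connectionString field assumed to be defined elsewhere in the class:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    private DataSet GetPage(int pageNumber, int pageSize, out int totalRecords)
    {
        using ( SqlConnection conn = new SqlConnection(connectionString) )
        {
            SqlCommand cmd = new SqlCommand("GetPagedData", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@PageNumber", SqlDbType.Int).Value = pageNumber;
            cmd.Parameters.Add("@PageSize", SqlDbType.Int).Value = pageSize;

            // the sproc reports the total record count through an output parameter,
            // which drives how many pager buttons get rendered
            SqlParameter total = cmd.Parameters.Add("@TotalRecords", SqlDbType.Int);
            total.Direction = ParameterDirection.Output;

            DataSet ds = new DataSet();
            using ( SqlDataAdapter da = new SqlDataAdapter(cmd) )
            {
                da.Fill(ds);
            }

            totalRecords = (int)total.Value;
            return ds;
        }
    }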


    People then flocked to develop methods like this and used them ad nauseam in their projects, even for recordsets containing only tens of records.  Sure, this minimizes the total data stored in memory, but it nonetheless requires a database visit every time someone calls the paging routine.  The alternative would be to pull ALL the records into a disconnected DataSet object at the onset, and then programmatically store that data via the .NET Framework's Cache API, assigning it some expiration condition.  This would result in a single, albeit heavy, database call, but would bypass the need for any repetitive trips back to the database.
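
    A bare-bones sketch of that caching approach might look like the following, assuming it lives in a Page and leans on a GetFreshData()-style helper that fills a DataSet from a stored procedure (like the one in my post on the using keyword, further down this page).  The cache key, sproc name and 20-minute expiration are arbitrary:

    using System;
    using System.Data;
    using System.Web.Caching;

    private DataSet GetAllBooks()
    {
        // the Cache property is available here because this sits in a Page class
        DataSet ds = (DataSet)Cache["AllBooks"];

        if ( ds == null )
        {
            // one heavy hit against the database...
            ds = GetFreshData("GetAllBooks");

            // ...then no more trips until the cached copy expires
            Cache.Insert("AllBooks", ds, null,
                DateTime.Now.AddMinutes(20), Cache.NoSlidingExpiration);
        }

        return ds;
    }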

    But can one safely assume either solution is optimal without knowing how frequently the underlying data will be accessed?  Or how rapidly it will be modified (if at all)?  Or whether the total number of records can be known beforehand?  Or whether it can be predicted how far into the recordset a user will likely navigate - will only the first 50 records out of a collection of 1,000 ever be accessed?  These are all questions that need to be considered when determining what truly is the "best" solution.

    Other such performance-conscious data access debates that have been ongoing since the public got its hands on ASP.NET include whether to use a DataList, Repeater or DataGrid for display, and whether images should be stored on disk or in a database.

    And the bottom-line answer?  In terms of best practices?  IT DEPENDS.  This very concept was reinforced by ASP.NET laureate Jeff Prosise in ASP.NET Pro Magazine when dealing with the image storage debate.  Such a simple answer goes a long way, but it's too commonly forgotten.  There is no single better mousetrap for all environments.

    Too often solutions are presented as the "best" way without mentioning the overall impact of a practice in the grander scope of a project.  It's the mark of a mature programmer to know the best way to do something within one's own work.  I'm not recommending everyone throw caution to the wind and take up developmental vigilantism...just more responsible instruction.

    Let's remember to teach best-fit solutions as well as best programming practices to those just grasping the concepts.  And for those consuming the information the rest of us put forth...take it, learn from it, apply it, and extend it.  Know that there are several ways to do the same thing in Microsoft development, and figure out which one is the best fit within the scope of your apps.

    Read more...

  • Steve Schofield joins ORCSWeb

    I was happy to learn today that Steve Schofield, a fellow Microsoft MVP for ASP.NET, joined the team at ORCSWeb.  I've been a web hosting client of theirs for more than 4 years, having moved my company's site to them about 18 months ago from Interland.

    I've interacted with Steve several times on the ASPFriends/ASPAdvice mailing lists, and he's good people.  He also joins a team of really good folks who care about their clients and are generally nice, in addition to being top-notch technically.

    Congrats, Steve!

    Read more...

  • Suggestion for ADO.NET 2.0: better SqlCommandBuilder support

    I've long been a fan of the .NET Framework's SqlCommandBuilder class in theory, if not fully in practice.  I appreciate the idea it was designed around - automatically generating potentially complex SQL for CRUD operations which, when used with the SqlDataAdapter.Update() method, makes modifying database-born content very simple - even though (not unexpectedly) many in the community aren't as liberal in their support of it.
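
    For anyone who hasn't played with it, the basic single-table pattern looks roughly like this (the table name and connection string are illustrative):

    using System.Data;
    using System.Data.SqlClient;

    private void UpdateBooks(DataSet ds, string connectionString)
    {
        using ( SqlConnection conn = new SqlConnection(connectionString) )
        {
            SqlDataAdapter da = new SqlDataAdapter("SELECT * FROM Book", conn);

            // the builder derives the INSERT/UPDATE/DELETE commands for us...
            SqlCommandBuilder builder = new SqlCommandBuilder(da);

            // ...so pushing the DataSet's changes back is a one-liner
            da.Update(ds, "Book");
        }
    }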

    A project I worked on today joined several database tables together, which instantly led me to rule out using a SqlCommandBuilder and opt for manually building a dynamic SQL string with SqlParameters.  I tried to trick the Framework into supporting an UPDATE operation by creating a View over data originating in 5 different tables on the DB.  No dice - I still got the "dynamic SQL generated cannot come from multiple base tables" error.  Darn.

    I honestly think the concept is cool, but in my own development it's rarely usable, as the chance that I'm actually querying a single table with "SELECT * FROM Table" is rare-to-nil.

    That having been said, I think a few changes would be great for the next version of ADO.NET (I admittedly haven't kept up with its new features as well as I should):

    • More SqlCommandBuilder constructor overloads - supporting selectable parameters so that the SQL generated by the object itself can contain a selectable number of fields, rather than the long, drawn-out UPDATE statements that test the nullability state of each and every field in a recordset, improving performance.  Perhaps implementing a signature like SqlCommandBuilder(SqlParameter[] sql).
    • Support JOINed tabular data - I know this is a big one, but I think it would be cool to support even simple multi-table data.  As evident above, even a simple JOIN kills any use of a SqlCommandBuilder for me.  And I know I'm not alone.
    • Be able to support JOINed tabular data derived from a database-side View - if the Framework is smart enough to recognize that a View utilizes 2 or more base tables, then why not bake in a feature that could provide UPDATE capability to those tables?  At least for the simpler JOIN operations.
    In all, I'd like to see the SqlCommandBuilder become a much more reliable part of .NET than just the obligatory lesson in what can be done - and then actually deliver on it.

    Read more...

  • What the hell ever happened to "30 minutes or it's free"?

    One of the things these days that irks me to no end is the lack of quality delivery service by pizza companies.  I remember when the "30 minutes or it's free" marketing scheme forced the sub-industry to revamp their strategies and get their shit together.  Nowadays, I'll be lucky if Pizza Hut and/or Domino's gets me even the simplest of orders within an hour.  Damn.

    Read more...

  • The recurring problem with database normalization

    I've come to learn that the main (and possibly eternal) argument when trying to fully optimize and normalize a database design comes in cases where you have one-to-many relationships.  Specifically, whether to base the design on a single wide table, with every possible element getting its own column, or on a narrow table with one row per element, giving you multiple rows per user.  And thinking ahead, how will this affect administrative tasks, like batch-updating data for a user?

    For example, in an NCAA March Madness management app I'm working on today, this conundrum once again reared its ugly head.  I pondered having a single DB table with N columns, where N is the number of responses in a given set of data (here, 63 - the total number of games in the tournament), letting the user submit all her choices with a single INSERT statement.  This would result in 1 row with 63 fields, just for games.

    The alternative would be to have one row in the table for each selectable element in the form, linked to a specific user - meaning N rows, and meaning that administrative work would run an UPDATE statement N times (again, 63 in this example).  It's assumed that in most contest-oriented apps like sports pool managers and fantasy games, the load is on the user end, with potentially thousands of people submitting forms that run SQL statements over and over, occupying processing power on the server.  But what about post mortem administration, wherein game results might need to be updated en masse?  At what cost to performance?

    At first glance, it would seem more advantageous to use the former, populating an entire row, albeit a large one.  I've actually used each method in my designs in the past, and this theme pops up each and every time I get a project with multiple relations.  But a friend once gave me some very simple and sound advice - the latter is preferable, as this type of repetitive operation is just the thing computers were designed to do.
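
    To put that advice in concrete terms, here's a sketch of what the post mortem admin work looks like under the one-row-per-element design (the table and column names are hypothetical):

    using System.Data;
    using System.Data.SqlClient;

    private void PostGameResults(int[] gameIDs, int[] winningTeamIDs, string connectionString)
    {
        using ( SqlConnection conn = new SqlConnection(connectionString) )
        {
            SqlCommand cmd = new SqlCommand(
                "UPDATE TournamentGames SET WinningTeamID = @Winner WHERE GameID = @GameID", conn);
            cmd.Parameters.Add("@Winner", SqlDbType.Int);
            cmd.Parameters.Add("@GameID", SqlDbType.Int);

            conn.Open();

            // one small UPDATE per game - 63 of these is trivial work for SQL Server
            for ( int i = 0; i < gameIDs.Length; i++ )
            {
                cmd.Parameters["@Winner"].Value = winningTeamIDs[i];
                cmd.Parameters["@GameID"].Value = gameIDs[i];
                cmd.ExecuteNonQuery();
            }
        }
    }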

    Something to keep in mind, based on the application of your application.  :)

    Read more...

  • Free ASP.NET 1.x app download: "Familiar Faces" community photo gallery

    Second only to my company's news content, our free community photo gallery, Familiar Faces, is the most popular part of my site, getting over 137,000 page impressions per day.  We now get several hundred new images per day and support several thousand users.  It's kind of an offshoot of Xanga and MySpace, giving non-techies a centralized place to store their images.  I basically developed an application in ASP.NET 1.x to automate the gathering and display of images sent in by our TV viewers, showing them in a collection and also giving each registered member their own personal photo scrapbook.

    The main advantage is speed: users upload images to my database via an ASP.NET WebForm (only .JPGs of 200K or less are accepted), and each image is then run through a specialized process that reduces its quality setting and proportionately resizes it to a fixed dimension, so the images all come out nice and uniform, and don't take forever and a day to load.  It's this programmatic standardization that makes this app a winner for us.  (I mentioned this process in a previous blog, "Emulating PhotoShop Actions programmatically using GDI+".)

    The app doesn't require much in the way of web.config settings (I only store a database connection string there) or global.asax application settings, so it's pretty much plug-and-play, and it's very scalable, being meant to support potentially thousands of members.  Some of the key features:

    • registered users upload images via a WebForm
    • images are displayed using a paged DataList control, which performs significantly better than paging through a DataGrid
    • the membership system it uses is very simple, and is included with the app
    • all T-SQL scripts are included for all stored procedures
    • a user control displays the most active members, by the number of images they've posted
    • a user control displays the images posted by each user on their personal photo scrapbook
    • comments can be left for each image, which then e-mails the image's owner to alert them
    • a stored procedure limits the number of images a user may upload each week (the default is 5)
    • images are stored in a SQL Server database and served via a custom HTTP handler, which caches images at the server programmatically via the .NET Cache API, greatly improving performance (see the sketch after this list)
    • an administrative page displays all the submitted images so that a site admin can preview them before approving them (in the event inappropriate, duplicate, or otherwise undesirable images are posted)
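
    Here's a stripped-down sketch of how that handler works - the query string parameter, table name and cache policy shown here are simplified stand-ins for the production code:

    using System;
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;
    using System.Web;
    using System.Web.Caching;

    public class ImageHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            string imageID = context.Request.QueryString["id"];
            string cacheKey = "img-" + imageID;

            // serve from the cache if we've already fetched this image
            byte[] imageBytes = (byte[])context.Cache[cacheKey];

            if ( imageBytes == null )
            {
                using ( SqlConnection conn = new SqlConnection(
                    ConfigurationSettings.AppSettings["ConnectionString"]) )
                {
                    SqlCommand cmd = new SqlCommand(
                        "SELECT ImageData FROM Images WHERE ImageID = @ImageID", conn);
                    cmd.Parameters.Add("@ImageID", SqlDbType.Int).Value = int.Parse(imageID);

                    conn.Open();
                    imageBytes = (byte[])cmd.ExecuteScalar();
                }

                // keep the bytes around for 10 minutes of inactivity
                context.Cache.Insert(cacheKey, imageBytes, null,
                    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(10));
            }

            context.Response.ContentType = "image/jpeg";
            context.Response.BinaryWrite(imageBytes);
        }
    }
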
    I'll post the code if anyone's interested.  It's nothing new, but it is a fun and interesting case study in doing some fairly involved ASP.NET programming for a service people will love using.

    Check out KUAM.COM's Familiar Faces ASP.NET app: http://www.kuam.com/familiarfaces

    Read more...

  • Leveraging C#'s "using" keyword for optimal data access

    More and more specialized, niche-market books dealing with best-practice programming for ASP.NET v1.x have been published over the last 6 months.  As such, many of the recommendations can get confusing, and at times contradictory, in laying out how to write good, clean, scalable, high-performance code.  One of the major areas is data access.

    One thing that people always ask me is how to properly leverage C#'s dual-purpose using keyword when making data calls.  Specifically, how to wrap code properly so that open or unused object references won't be left to the whim of the .NET garbage collector.

    First, a bit of background: using in C# can be used in two ways:

    • as a shortcut to typing long namespaces used in code
    • as a means of properly and automatically closing and disposing any object references implementing the IDisposable interface
    For the latter, here's an example, in which a private helper method returns a DataSet object.  I typically write data access code wherein data is pulled from a SQL Server database through a stored procedure within a library like so:

    using System.Data;
    using System.Data.SqlClient;

    private DataSet GetFreshData(string sprocName)
    {
        using ( SqlConnection conn = new SqlConnection() )
        {
            using ( SqlDataAdapter da = new SqlDataAdapter() )
            {       
                da.SelectCommand = new SqlCommand();
                da.SelectCommand.CommandText = sprocName;
                da.SelectCommand.CommandType = CommandType.StoredProcedure;
                da.SelectCommand.Connection = conn;

                DataSet ds = new DataSet();

                try
                {
                    da.SelectCommand.Connection.Open();
                    da.Fill(ds);
                    da.SelectCommand.Connection.Close();
                }
                catch
                {
                    return null;
                }
                finally
                {
                    // do other things...calling Close() or Dispose() on the
                    // SqlConnection or SqlDataAdapter objects isn't necessary,
                    // as it's taken care of by the nested "using" statements
                }
               
                return ds;
            }
        }
    }

    Read more...

  • Coaches wearing gym shoes on the sidelines?

    Has anyone noticed (undoubtedly, you have) - but more importantly, does anyone know why - NCAA head basketball coaches wore gym shoes along with their usual garb on the sidelines Saturday night?  I first caught Cincinnati strategist Bob Huggins rockin' what appeared to be some white Nikes, along with his usual slick get-up, against Charlotte.  In similar fashion, Coach K wore running shoes against Georgia Tech.

    I assume this was to show collective support for some shared cause the NCAA or the coaches took up, but what that is escapes me.  Anyone got any ideas?

    Read more...

  • Maybe less is more? (aka, "I'm shortening my resume")

    To my surprise/chagrin, perhaps corporate recruiters, like many women I've come to know over the years, aren't immediately impressed with length or girth (draw your own conclusions).  After 4 years of unsuccessfully sporting an exhaustive resume in excess of 3 pages, profiling my experience in what is apparently excruciatingly-annoying detail, I've taken a step back and recently cropped the shit out of it, whittling it down to a single, manageable page.

    I also drafted a relatively unimposing cover letter, compared to the grammatically heavy behemoth I used to tote that would make Shakespeare blush.

    Who knows...maybe this will work.  Nothing else has yet.  I've got to find some way off this rock.

    Read more...

  • Cracking the iPod "shuffle" randomizing algorithm

    I'm tragically neither a computer science nor a mathematics major, but I am a software developer (I have a BBA in Marketing and an MBA).  So while I can bang out some heavy code, the major concepts and academic principles often escape me - or are so far above my ability to comprehend that I just don't bother paying attention.  That having been said, from a programmatic standpoint, one thing I've really enjoyed on my iPod Mini is the "shuffle" feature, which not surprisingly plays stored songs or grouped playlists out of their assigned order.

    While it's not a gimmick unavailable on tens of thousands of other media-savvy consumer products, the shuffling feature is particularly intriguing because of the way it randomizes digital music.  I've noticed that the shuffle option not only plays my MP3s out of the order in which they're stored, but also apparently groups random tracks by artist.  For example, if I had 1,000 MP3s stored, 30 of which were by Pantera, when the randomly-ordered playlist got to the first Pantera song, that song and the next 29 would all be Pantera tracks before the others played.  Interesting little feature, I think.

    I've managed to replicate this "grouped randomization" functionality in .NET by playing with a populated DataTable, scrambling the default ordering of the items, and then grouping them by artist (or some other appropriate criterion).  Using the Random class, DataRows are removed from the collection after they've been "played".
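
    Here's a rough sketch of that approach - strictly my own approximation in .NET 1.x collections, not Apple's actual algorithm, and the column names are whatever your DataTable happens to use:

    using System;
    using System.Collections;
    using System.Data;

    public class GroupedShuffle
    {
        private static Random rng = new Random();

        // the tracks table is expected to have "Artist" and "Title" columns
        public static ArrayList Shuffle(DataTable tracks)
        {
            // collect each artist's rows into its own bucket
            Hashtable buckets = new Hashtable();
            ArrayList artists = new ArrayList();

            foreach ( DataRow row in tracks.Rows )
            {
                string artist = (string)row["Artist"];
                if ( !buckets.ContainsKey(artist) )
                {
                    buckets[artist] = new ArrayList();
                    artists.Add(artist);
                }
                ((ArrayList)buckets[artist]).Add(row);
            }

            ArrayList playlist = new ArrayList();

            // pick artists in random order, removing each one once it's been "played"
            while ( artists.Count > 0 )
            {
                int i = rng.Next(artists.Count);
                string currentArtist = (string)artists[i];
                artists.RemoveAt(i);

                // play that artist's tracks back-to-back, themselves in random order
                ArrayList songs = (ArrayList)buckets[currentArtist];
                while ( songs.Count > 0 )
                {
                    int j = rng.Next(songs.Count);
                    playlist.Add(((DataRow)songs[j])["Title"]);
                    songs.RemoveAt(j);
                }
            }

            return playlist;
        }
    }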

    But, onto the meat of my argument: does anyone out there more skilled in the academics of math and/or computer science than I know if this is an existing algorithm?  Is there a name for it?

    Read more...