Roy has a really nice article
that addresses the problem of testing data access code. The only thing is that
it is missing a really important step if you are using objects like views,
stored procedures, user-defined functions, rules, etc. I've been bitten often
enough in the past by ignoring this issue that I now make sure I pick an
approach rather than hoping it just works.
A common approach is to have a single shared instance of the database to test
against. However, there are numerous issues with this approach, not the least of
which is the need to have alternate data sets for different tests. What happens
when more than one person tries to run the tests at the same time?
The next choice is to give each developer their own database instance,
either on their own machine or on a shared server. Either way, each developer
has exclusive access to their own database. Now the problem becomes keeping
all those schemas and data sets up to date.
One choice is to use the restore command to quickly make a known database
available for testing. But how do you ensure your views/stored
procedures/user-defined functions/etc. are up to date? Since backup/restore
deals with all objects in the database, a restore will bring in potentially
old versions of your views, stored procedures, etc., so you need some scheme
to make sure these are refreshed.
The benefit of using restore is that it is relatively fast. This technique
works really well if you don't have any database objects other than tables,
such as when using an object-relational mapper or some other dynamic SQL
generation technique. Other issues to watch out for are the management of
database backups and making sure table definition changes get propagated to
all the backups.
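As a rough sketch of that restore-then-refresh idea, the snippet below uses SQLite (a stand-in for a full server like SQL Server, where a file copy plays the role of RESTORE). All the names and scripts here are made up for illustration: the point is that after the "restore", the views are dropped and re-created from the current source-controlled definitions so stale objects can't survive.

```python
# Restore-then-refresh sketch: file copy stands in for RESTORE DATABASE,
# then views are rebuilt from current scripts so old versions can't linger.
import os
import shutil
import sqlite3
import tempfile

workdir = tempfile.mkdtemp()
baseline = os.path.join(workdir, "baseline.db")  # hypothetical backup file
testdb = os.path.join(workdir, "test.db")

# One-time: build the "backup", containing tables and a possibly stale view.
conn = sqlite3.connect(baseline)
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);
    INSERT INTO orders (total) VALUES (10.0), (20.0);
    CREATE VIEW big_orders AS SELECT * FROM orders WHERE total > 15;
""")
conn.close()

# Per test run: the "restore" is just a fast file copy...
shutil.copy(baseline, testdb)

# ...but any views/procedures in the backup may be out of date, so drop
# and re-create them from the current source-controlled scripts.
conn = sqlite3.connect(testdb)
conn.executescript("""
    DROP VIEW IF EXISTS big_orders;
    CREATE VIEW big_orders AS SELECT * FROM orders WHERE total > 5;
""")
rows = conn.execute("SELECT COUNT(*) FROM big_orders").fetchone()[0]
conn.close()
print(rows)  # both orders pass the updated view's filter
```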
If you are using database objects other than tables, the restore command
doesn't provide enough benefit to offset the hassle of making sure the stored
procedures, views, etc. are up to date. In this case the best option is
creating a new database and loading all your schema objects and sample data
before running your tests. The advantages here are testing your installation
process and ensuring that everything is guaranteed to be up to date. The
disadvantages are the speed of initializing the database and a larger number
of source data files to manage.
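A minimal sketch of the create-from-scratch approach, again using SQLite as a stand-in; the schema and sample data are inlined here, but in practice they would come from your source-controlled creation scripts.

```python
# Create-from-scratch sketch: every test run builds a brand-new database
# from the current schema scripts plus sample data, so nothing can be stale.
import sqlite3

SCHEMA = """
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE VIEW customer_names AS SELECT name FROM customers;
"""
SAMPLE_DATA = [("Alice",), ("Bob",)]

def fresh_database():
    """Build a pristine database with current schema and sample data."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)  # fails fast on any broken object definition
    conn.executemany("INSERT INTO customers (name) VALUES (?)", SAMPLE_DATA)
    conn.commit()
    return conn

# Each test gets its own pristine instance, so tests can't interfere.
conn = fresh_database()
names = [r[0] for r in
         conn.execute("SELECT name FROM customer_names ORDER BY name")]
print(names)
```

Because the schema scripts run on every initialization, a reference to a dropped table or a missing dependency breaks the test run immediately instead of lurking in an old backup.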
Some techniques for speeding up the database initialization are:
- Use BCP to load data
- Use multiple threads to run schema creation scripts
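BCP itself is a SQL Server command-line tool, but the effect it buys you, loading rows in bulk rather than one insert at a time, can be sketched with SQLite's `executemany` as a rough analogue:

```python
# Bulk-load sketch: compares row-at-a-time inserts with a bulk load
# (executemany), the same idea that makes BCP fast for test data sets.
import sqlite3

rows = [(i, f"name-{i}") for i in range(50_000)]

def load(bulk):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
    if bulk:
        # One prepared statement executed over the whole data set.
        conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
    else:
        # One round trip per row -- the slow path bulk loading avoids.
        for row in rows:
            conn.execute("INSERT INTO t VALUES (?, ?)", row)
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return count

count_bulk = load(bulk=True)
count_slow = load(bulk=False)
print(count_bulk, count_slow)  # same data either way; bulk is just faster
```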
My preference is definitely the final one, as it seems to eliminate all the
missing-database-object errors and dependencies on database objects that no
longer exist, and it ensures you think about initializing/upgrading the
database before the end of the project.
Note that when you are upgrading an existing system you can combine the last
two options: restore an existing backup of the current system, then apply the
upgrade before testing.
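That combined flow might look like the following sketch (SQLite again standing in for the real server, with a file copy as the restore; the table names and the v1-to-v2 upgrade script are invented for illustration):

```python
# Upgrade-testing sketch: restore a backup of the current production
# schema, apply the upgrade script, then run the tests against the result.
import os
import shutil
import sqlite3
import tempfile

workdir = tempfile.mkdtemp()
prod_backup = os.path.join(workdir, "prod_v1.db")  # hypothetical backup
testdb = os.path.join(workdir, "upgrade_test.db")

# Simulated backup of the system as deployed today (schema version 1).
conn = sqlite3.connect(prod_backup)
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO users (name) VALUES ('existing user');
""")
conn.close()

# "Restore" the backup, then apply the v1 -> v2 upgrade before testing.
shutil.copy(prod_backup, testdb)
conn = sqlite3.connect(testdb)
conn.executescript("ALTER TABLE users ADD COLUMN email TEXT;")

# The tests can now verify both the new schema and that existing data
# survived the upgrade intact.
cols = [c[1] for c in conn.execute("PRAGMA table_info(users)")]
existing = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
conn.close()
print(cols, existing)
```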
Do you have any other techniques you use for testing data access code?