[tip] Use the Database Publishing Wizard to script your table data

Jeff Atwood recently asked "Is your database under version control?"  Well, is it? It's not as easy as it sounds. Until now, there wasn't a good, scriptable solution to the problem. I'll run through a few of the options I've looked at in the past, but let's jump right to the punchline:

The Database Publishing Wizard can script data as schema, and it's got a command line mode.

The Problem

Most of the applications I work on run against SQL Server databases. Jeff hit the nail on the head when he said that your database is an integral part of your application. I'll take it a step further and say that your data is often an integral part of your application. Many databases include significant amounts of reference data that is as much a part of your software release as your config files - for instance, a DotNetNuke site without the correct version of the database is just plain shot.

I'm not talking about database backups, which are an important part of your production environment. Database backups are a more appropriate solution to safeguarding your transactional data, but they don't integrate at all with the rest of your application's version control.

I previously posted a script to check a database (MDF) file into Subversion. It works, but it's far from ideal. It's just a binary file, so while you've got history, you can't track differences. No good.

I've used SQL-Scripter, GenDbScripts (had to fix a few bugs), and Enisey Data Scripter. They all work, but none of them was scriptable. 

I set out to see how easy it would be to write something in PowerSMO that would script out the table schema and data. It should have been easy, since SMO has a special "execution mode" that records all commands and outputs the resulting script. Heck, David Hayden has one post showing C# sample code to script a database schema, and another using SMO's Transfer.TransferData() to copy all data from one database to another. No problem, I'll just mash them together to enable CaptureSql and TransferData, and I'm set...

Yeah, not really. I converted it to a PowerShell script. It worked great... but it only scripted the schema, not the data. It looks like TransferData doesn't go through script, so it's not picked up by that cool CaptureSql thing. (In case you care, that PowerShell script is in the footnotes.)

The Solution (well, pretty much...)

Then Scott Guthrie mentioned the new Database Publishing Wizard in an extensive post on deploying a SQL database to a remote hosting environment. Scott's post was pretty beefy and covered the SQL Server Hosting Toolkit, so you might have missed the beautiful screenshots below the fold. The Database Publishing Wizard makes it easy to script your schema and data from within Visual Studio:


The more exciting thing to me is that the Database Publishing Wizard can be run from the command line:

sqlpubwiz script ( -C connection_string | -d local_database_name ) (output_file) [switches]


 Output options:

  [-f] :                       Overwrite existing files
  [-noschemaqualify] :         Output script does not qualify object names with
                             schema name.
  [-schemaonly] :              Only script schema
  [-dataonly] :                Only script data
  [-targetserver] :            Specifies what version of SQL Server the script
                             should target.  Valid versions are "2000", "2005".
                             Default Value: 2005
  [-q] :                       Suppress output
  [-nodropexisting] :           Default Value: False

 Local connection related:

  [-C connection_string] :     Connection string
  [-d local_database_name] :   Database name to script
  [-U local_user_name] :       Username
  [-P local_password] :        Password
  [-S local_server_name] :     Server name


Oops, we're missing two things before my quest is complete:

  1. We need an option to specify which tables to script. I know that complicates things a bit, but we gotta have it. Maybe a parameter that points to a text file containing a list of tables to be scripted. That's important because we want to script and version our reference data, but we don't want to script our transactional data, for a few reasons (most importantly, we don't want to overwrite transactional data, but also because transactional data is usually the most subject to large growth).
  2. Why do the drop scripts need to check sys.objects for SQL 2005 databases and sysobjects for SQL 2000 databases? Why not use the INFORMATION_SCHEMA views so the same script works on both? That's the subject of another post, I guess...
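Until a table-list option exists, one hedged workaround is to run the wizard with -dataonly and then post-process the output, keeping only the INSERT statements for tables named in a text file. This is a simplistic sketch: it assumes each INSERT fits on one line and that table names appear as [dbo].[TableName], which is how the wizard emits them by default; the function and file names are my own invention:

```python
def filter_script_to_tables(script_lines, wanted_tables):
    """Keep only the lines that insert into one of the wanted tables."""
    wanted = {t.lower() for t in wanted_tables}
    kept = []
    for line in script_lines:
        lowered = line.lower()
        if lowered.startswith("insert"):
            if any("[%s]" % t in lowered for t in wanted):
                kept.append(line)
        # Non-INSERT lines (SET IDENTITY_INSERT, GO, etc.) are dropped
        # here for brevity; a real filter would keep the ones belonging
        # to a kept table's batch.
    return kept

if __name__ == "__main__":
    # Hypothetical usage: tables.txt holds one reference table name per line.
    with open("tables.txt") as f:
        tables = [t.strip() for t in f if t.strip()]
    with open("MyDatabase_Data.sql") as f:
        kept = filter_script_to_tables(f.read().splitlines(), tables)
    with open("MyDatabase_ReferenceData.sql", "w") as f:
        f.write("\n".join(kept))
```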

Oh, and that PowerSMO script I mentioned. It'll script the schema, but not the data:

$databaseToScript = "AdventureWorks"
$serverName = "(local)\SQL2005"

#USES POWERSMO - http://www.pluralsight.com/dan/samples/PowerSMO.zip

#SETUP VOODOO
$executionPolicy = get-ExecutionPolicy
set-ExecutionPolicy "Unrestricted"
.\InitPowerSMO.ps1
cls

#CONNECT TO DATABASE
$server = SMO_Server $serverName
$server.ConnectionContext.SqlExecutionModes = [Microsoft.SqlServer.Management.Common.SqlExecutionModes]"CaptureSql" #ExecuteAndCaptureSql
$database = $server.Databases[$databaseToScript]

$t = SMO_Transfer($database)
$t.CopyAllObjects = $true
$t.DropDestinationObjectsFirst = $true
$t.CopySchema = $true
$t.CopyData = $true #Doesn't get scripted
$t.DestinationServer = $serverName
$t.DestinationDatabase = "just_a_dummy_value"
$t.Options.IncludeIfNotExists = $true
$t.CreateTargetDatabase = $true #Doesn't help
#$t.TransferData()

"Generating script"
$commands = $t.ScriptTransfer()

"Writing to file"
$scriptFile = "{0}_Script.sql" -f $databaseToScript
$commands | out-File $scriptFile
"Done"

#CLEANUP
$server = ''
set-ExecutionPolicy -executionPolicy $executionPolicy


  • Another data scripter:

  • EasySQL Deploy is a similar data scripter. It uses an innovative way to pass the table list to the command-line interface: first you generate a project with all the options you need, then you pass the project name as a command-line option.

  • Please, can anyone advise?

    The scripter runs fine, but it won't script the database if even a single object is encrypted. It should exclude that encrypted object instead of skipping every object in the database.

    Can anyone advise me how to exclude the encrypted object, instead of excluding the entire database?

  • How do I execute the script after publishing? It's producing errors, like that it doesn't contain a CREATE DATABASE statement.

  • I'm having a problem with SqlPubWiz scripting the CREATE VIEW statements before the CREATE TABLE statements those views reference. Obviously this doesn't work, since those tables aren't created until later in the script. I've seen other people hit this problem but have yet to find a solution other than manually rearranging the output file.

    Does anybody have a solution to this problem? Does anybody know WHY on earth it would create the views before the tables? I can see NO reason to create the views first and every reason to create them last.
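One hedged workaround for the batch-ordering problem above: split the generated script on the GO batch separator and sort so that CREATE TABLE batches come before CREATE VIEW batches. This is a rough sketch under simplifying assumptions (batches separated by a GO on its own line, no procs or functions that depend on views being last), not a general fix:

```python
def reorder_batches(script_text):
    """Move CREATE TABLE batches ahead of CREATE VIEW batches.

    Relies on Python's stable sort, so batches with the same rank keep
    their original relative order.
    """
    batches = [b.strip() for b in script_text.split("\nGO\n") if b.strip()]

    def rank(batch):
        upper = batch.upper()
        if "CREATE TABLE" in upper:
            return 0          # tables first
        if "CREATE VIEW" in upper:
            return 2          # views last
        return 1              # everything else in between

    batches.sort(key=rank)
    return "\nGO\n".join(batches) + "\nGO\n"
```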


Comments have been disabled for this content.