Here is what SQLMetal generates when you have a column in your database that allows nulls:

private System.Nullable<int> _DiscountTypeID;
This means that when you have a null value in the database, the value returned by the Linq to SQL engine is a .NET null, not a crazy DateTime.MinValue or some other such nonsense. Nice and intuitive.
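A quick sketch of what that means for consuming code; the entity and property names here are illustrative, not from the generated file:

```csharp
// 'order' is an entity whose DiscountTypeID column allows NULL, so the
// generated property is int? (System.Nullable<int>).
if (order.DiscountTypeID.HasValue)
{
    int id = order.DiscountTypeID.Value;  // safe: we checked HasValue first
    // ... look up the discount type ...
}

// Or collapse a database NULL to a default with the null-coalescing operator:
int idOrDefault = order.DiscountTypeID ?? 0;
```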
For a nested entity, you get this:
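The generated snippet didn't survive the formatting here, but it's along these lines: a nested entity comes through as an EntityRef<T> backing field with an association property over it (the member names below are illustrative, and the real generated setter is a bit more involved):

```csharp
private EntityRef<DiscountType> _DiscountType;

[Association(Name = "DiscountType_Order", Storage = "_DiscountType",
             ThisKey = "DiscountTypeID", IsForeignKey = true)]
public DiscountType DiscountType
{
    get { return this._DiscountType.Entity; }
    set { this._DiscountType.Entity = value; }
}
```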
This also handles null referenced values as nulls. So far so good.
But what happens when a column in your database does not allow nulls when you generate the dbml file but does allow them at run time? This could be the difference between your dev and stage databases. As with the missing tables from my previous post, the code will compile but you'll get the following exception:
"The null value cannot be assigned to a member with type System.DateTime which is a non-nullable value type."
So with Linq to SQL, we've really just pushed the inherent problems of keeping databases in sync to another level. You get compile-time checking against your schema, not against the actual database. This is not a terrible thing in my opinion; it's just a gotcha that you need to be aware of and handle in your code.
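If you know a column can be null in some environments, one hedge is to mark it nullable in the dbml yourself before the code file is regenerated. A hypothetical dbml fragment (the DueDate column name is made up):

```xml
<!-- Setting CanBeNull="true" makes the generated property
     System.Nullable<System.DateTime> instead of System.DateTime. -->
<Column Name="DueDate" Type="System.DateTime" DbType="DateTime" CanBeNull="true" />
```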
So it turns out the simplest way to update your dbml schema is to just delete the existing files and recreate them with SQL Metal. Then, when you include the updated dbml file in your VS 2008 project, you will get new .dbml.layout and .designer.cs files. Of course, you'll have to redo your layout if you'd customized it. If you can live with that, you're safe. If you need to keep that layout, continue reading.
Just regenerating with SQL Metal will overwrite the dbml file but will not touch the other two files. This means your layout will be OK, but you're not out of the woods. If the name of an object in the underlying structure has changed, the old object is removed from the layout and the newly named object is placed in the default position (usually top left). The new object will NOT appear in your IntelliSense because the .designer.cs file still has the old name. Code referencing the old object name will still compile, but at run time it will throw a SQL "object not found" exception:
Error Code: -2146232060
Error Message "Invalid object name 'your object name'."
Error Number: 208
So you still want to refresh the schema and keep your designer layout? You have an easy option:
Exclude the dbml file from your project (automatically excludes the other two files).
Delete the .designer.cs file.
Rerun your SQLMetal command.
Re-include the dbml file.
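The middle two steps from a VS 2008 command prompt might look like this (the server, database and file names are placeholders):

```
:: Step 2: delete the generated code file.
del MyDatabase.designer.cs

:: Step 3: regenerate the dbml in place; the .dbml.layout file is untouched.
sqlmetal /server:localhost /database:MyDatabase /dbml:MyDatabase.dbml

:: Steps 1 and 4 (excluding and re-including the dbml file) are done in
:: Solution Explorer, not on the command line.
```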
Now the .designer.cs file is regenerated from the dbml and all intellisense and compile-time checking will work. The .cs file contains all the appropriately named items. You also still have your original designer layout but any new objects are placed in the default location and old objects have been removed.
Now if you're modifying your dbml by adding associations etc., you are going to need a compare tool. I use Scooter Software's Beyond Compare. Simple, cheap and reliable. Generate your dbml file to a different location and run the compare utility to "import" all your modifications to the new file. Then perform the steps listed above.
So I'm looking at all the new stuff that was just released and the things that are going to be released soon. It's a good time to be a programmer in the Microsoft way as long as you are aligning yourself appropriately with the stars.
So we now have an actual complex pattern, one you can read about from years gone by, implemented in a framework we can use right now. Pretty darn cool and perhaps a step in the right direction.
This is my personal holy grail. I know and love to write SQL, but it's time-consuming, often boring, and not exactly well known by all programmers. I've interviewed some very senior developers over the last few years who don't know a foreign key constraint from a primary key. Using EF, if it performs, will allow us to truly design and work with data based on the business problem at hand, not on the current storage mechanism. Conversely, we will be able to design the data storage in the most efficient way possible. I watched an "expert-to-expert" video on Channel 9 over the weekend and found myself saying "Exactly! That's what I was trying to do!"
Linq and Lambda Expressions
Another set of tools in the toolbox. Remember that lambda expressions don't need Linq, but Linq is built on lambda expressions. If you learn lambdas you will have a good means of understanding Linq. This may also be a barrier to entry, but I think it's worth learning. Learn it now, because if you run into a lambda in someone else's code and you can't read it, you'll be at a dead stop. Really.
WCF, WF, WPF, Silverlight, WSS, MOSS Etc.
There are just so many different variables out there right now with the latest technologies that will allow a good developer to design and implement a real solution quickly and have it work out of the gate. I'm really looking forward to understanding how all of these will coalesce into a united whole.
In the past, developers would spend a great deal of time trying to make things talk nice to each other. Plumbing and pattern identification. This problem is really going away quickly if you are in the MS paradigm.
So my job next year is going to look very different from my job last year. This is a good thing. It keeps me interested and forces me to excel in new areas. Keeps me from becoming a "crotchety old programmer". Seems like there is always more to learn, but now the more is interesting and not necessarily a re-solving of an old problem. I can concentrate on the business instead of the plumbing. It also keeps Barnes & Noble and Amazon in business...
SQLMetal.exe is the command line tool that will allow you to very quickly spin up the object model you'll need to connect from your SQL Server database via Linq. Here are the basics of using SQLMetal.
Take a look at this text file, the output from executing SQLMetal.exe from the VS 2008 command prompt with no parameters. It's pretty straightforward, but knowing the details ahead of time will help you set yourself up properly.
This database contains a relational dataset with a few tables, a simple DML stored procedure, a view, a table-valued UDF and a scalar-valued UDF. In my next posting we'll look more deeply at what you can do with these kinds of objects and how to properly address them.
SQL Metal Commands
The following commands will be executed to show you what we've got. Here is the actual text of what I ran for each example. (The commands are formatted for easier reading. When actually executing, everything should be on one line.)
1. Create an xml file in dbml format but without an extension so you can peek at it in VS without it opening a designer.
2. Create a basic dbml file to view and manipulate in the VS designer.
3. Same as #2 except with the pluralize switch to see what that looks like.
4. Make the classes generated serializable and include functions, views and stored procedures.
5. Create just a code file, no designer.
6. Create a code file and a mapping file but no designer.
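The actual command text didn't make it into this copy, so here are representative reconstructions, one per example. The server, database, file and namespace names are all placeholders:

```
:: 1. dbml-format xml without an extension, so VS opens it as plain text:
sqlmetal /server:. /database:LinqDemo /dbml:Example1

:: 2. Basic dbml for the designer:
sqlmetal /server:. /database:LinqDemo /dbml:Example2.dbml /namespace:Example2 /context:Example2DataContext

:: 3. Same as #2, pluralized:
sqlmetal /server:. /database:LinqDemo /dbml:Example3.dbml /pluralize /namespace:Example3 /context:Example3DataContext

:: 4. Serializable classes plus functions, views and stored procedures:
sqlmetal /server:. /database:LinqDemo /dbml:Example4.dbml /serialization:unidirectional /functions /views /sprocs

:: 5. Code file only, no designer:
sqlmetal /server:. /database:LinqDemo /code:Example5.cs /namespace:Example5 /context:Example5DataContext

:: 6. Code file plus an external mapping file:
sqlmetal /server:. /database:LinqDemo /code:Example6.cs /map:Example6.map /namespace:Example6 /context:Example6DataContext
```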
I created new file output names, namespaces and datacontext object names for each example. This allows me to have all of them in the same project at the same time. If I think of it and have time, I'll fire some Linq code up for each example and see what kind of IL is generated for each. I expect it to be the same but it would be nice to know for sure.
When you create a dbml file, only that file is created by SQL Metal. Opening or including the file in VS 2008 will then create the .layout and .designer.cs files necessary to lay it out. This is great because you can make your layout nice and easy to understand and not have to worry about recreating it every time you have to regenerate the file. If you, like me, use SQL Server's data diagrams, you know the frustration of not knowing exactly when SQL Server will decide to 'rework' your layout for you. This is the file that I reformatted with example #2. Note the table-for-table matching with what's in the SQL database.
Pluralizing your model notes from MSDN: "When you use the /pluralize option with the Northwind sample database, note the following behavior. When SqlMetal makes row-type names for tables, the table names are singular. When it makes DataContext properties for tables, the table names are plural. Coincidentally, the tables in the Northwind sample database are already plural. Therefore, you do not see that part working. Although it is common practice to name database tables singular, it is also a common practice in .NET to name collections plural."
Here is an example of the output from using the /pluralize switch.
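The sample output is missing from this copy, but the shape of the change is easy to sketch. With /pluralize, a table named Order produces a singular row class and a plural DataContext property (names illustrative):

```csharp
// Row type stays singular:
public partial class Order
{
    // ... one property per column ...
}

// DataContext property goes plural:
public System.Data.Linq.Table<Order> Orders
{
    get { return this.GetTable<Order>(); }
}
```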
Tables, Views, Stored Procedures and Functions
Note here that including views, sprocs and functions will let you see views and operate on them like tables, but it will not include table-valued functions in your layout, although you can still use them as tables in your Linq queries. Here is a snip of that code:
var DueDates = from d in db.FnGetDueDates() select d;
dataGridView1.DataSource = DueDates;
To Map or Not
Next, the difference between creating a map file (#6) and not creating one (#5) is that without the map, the tool creates an internal System.Data.Linq.Mapping.MappingSource object (concrete type AttributeMappingSource), adds two additional constructors to the DataContext, and adds attributes directly to all objects and properties as appropriate so that they can 'self-map' into the database. See the attached solution for details. By creating the mapping xml file, you take these items out of the code file, so if you want to reset any of the mappings, you can. This is a bit closer to an abstraction model, but not quite there in my opinion.
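For the mapped version, the generated DataContext gains constructors that take a MappingSource, so loading the xml map at run time is one extra line. The context and file names below follow example #6 and are assumptions:

```csharp
using System.Data.Linq.Mapping;

// Load the external mapping file generated by /map:
XmlMappingSource mapping = XmlMappingSource.FromUrl("Example6.map");

// Pass it to one of the extra constructors SQLMetal generated:
var db = new Example6DataContext(connectionString, mapping);

// Editing Example6.map now remaps objects without recompiling the code file.
```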
When you run the project, you'll get a button and a data grid. You can fire it up and see the customer list come back with only the connection string in app.config. Pretty cool. No ADO.NET code or objects anywhere! You can also replace the Linq query and binding with what you see above and bind to a UDF.
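For reference, the button's click handler amounts to something like this; the context name and the Customers table are assumptions based on the sample database:

```csharp
private void button1_Click(object sender, EventArgs e)
{
    // The connection string comes from app.config; no ADO.NET objects in sight.
    var db = new LinqDemoDataContext();

    var customers = from c in db.Customers
                    orderby c.Name
                    select c;

    dataGridView1.DataSource = customers.ToList();
}
```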
In my next post, I'll spend some time showing how to address each type of object in different ways. Maybe we'll be populating collections or just directly binding to other bits and pieces of the UI that will of course have to expand a bit...
So here's my current plan for firing up these new technologies. As background, I gave a 15 minute presentation at a user group last night on the generalities of data abstraction and although I think I did ok, I also think that there are a lot more questions than answers in the developer community right now with regard to how to get data back into SQL Server through Linq.
I'm going to start out doing just Linq, as I think it's a little deeper than EF and more appropriate as a starting point. I'm also concerned about how well EF will perform in the enterprise, but that's for another day.
So I'm going to take the database I built for this presentation and put some data in it. Here's the current plan.
1. Explain where to find SQLMetal.exe and how to use it to generate an entity layer.
2. Take a look at normalized data in a real world scenario that includes things like null values and empty foreign keys (candidates for left join in SQL).
3. Figure out how to send an anonymous type to a business object so that I don't have to store the Linq query in my business object.
4. Get data from a business object back into the database for an update via Linq to SQL.
5. Get new data into the database from a business object via Linq to SQL.
6. Build and deploy a web part that allows the insert/update/viewing of all this data in VS 2008.
Wherever I find another blog or article that helps me move ahead I'll post that too.
Not sure if I'll have time to do all this, as I believe a new project is on my horizon, but I'll do as much as I can. I'm also not sure how much of this has been done elsewhere or whether it's even worthwhile. I think it will be a fun evolution though.
Along with part 1, I'll post a script to create and populate the database I'll be using.