Migrating Production Database with Entity Framework Code First

The title of this post is a bit of a misnomer: I will definitely not be showing you how to run migrations against a production database, but rather the proper way to get the necessary schema changes applied to it.

Entity Framework Code First is an extremely simple and powerful development tool, to be sure. You get these things called migrations that let you easily update your local development database with changes to your models, and you can even enable automatic migrations, making schema changes almost entirely frictionless. This is great for development, because your database schema is in a constant state of flux, and the less you have to worry about keeping it in sync with your code, the more time you can spend actually developing your application.
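
For reference, automatic migrations are opt-in. When you first enable migrations from the Package Manager Console, you can turn them on with a switch on EF's Enable-Migrations command (whether you actually want them on is, of course, the subject of the rest of this post):

> enable-migrations -EnableAutomaticMigrations

This sets AutomaticMigrationsEnabled to true in the generated Configuration class, and you can flip it back to false there whenever you want the manual, reviewable behavior back.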

However, when it comes time to deploy your shiny new application, new developers are often confused about why their production database isn't updated at the same time. This is by design: the idea of something silently altering a production database schema from afar is absolutely terrifying, if not outright forbidden by most enterprise change policies.

That still leaves the question, though: just how do you update your production database schema? There are a number of ways to do that, and I'll show you a few here.

Using update-database -Script

First and foremost, you can take advantage of the update-database PowerShell command's ability to generate SQL. If you add the -Script switch to the command, then instead of actually applying the migration to your local development database, it will generate a SQL script that can be used to apply those changes. Then, after you have your SQL, you can run update-database again, without the -Script switch, to actually apply the changes to your development database. That would look something like:

> add-migration MyNewMigration
[Console output of the command]
> update-database -Script
[Console output of the command]
> update-database
[Console output of the command]
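
The generated script is plain T-SQL. As a rough illustration only (the table, columns, and migration ID below are made up, and the exact history-table columns vary by EF version), it contains your schema changes followed by an insert into EF's migration history table:

CREATE TABLE [dbo].[Customers] (
    [Id] [int] NOT NULL IDENTITY,
    [Name] [nvarchar](max),
    CONSTRAINT [PK_dbo.Customers] PRIMARY KEY ([Id])
)
INSERT INTO [__MigrationHistory]([MigrationId], [ContextKey], [Model], [ProductVersion])
VALUES (N'201406150000000_MyNewMigration', ...)

That history insert matters: it's how EF knows the migration has been applied, so make sure it runs against production along with the schema changes themselves.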

However, if you did this every time you added a migration, you'd end up with quite a few SQL scripts that would need to be applied by the time you're ready to deploy, and a very perturbed DBA if he or she had to apply all of them one at a time, in order.

If you're going to use this approach, you should also roll up your migrations. What this means is that you mark the point at which you started development and roll your database back to that point. If this is an entirely new application, that means rolling back all the way through your zero, or initial, migration, such that you end up with an empty database. If you were doing development on an existing application, you roll your local database back to the last migration that existed before you started adding new ones. Then, delete all of the individual migration files you added during development. Finally, create a new migration. This new migration will look at your code's current state, compare it with the database as it was before you started (since you rolled back to that point), and include all of the changes necessary to get from there to your current model. Here's the play-by-play view:

  1. Roll back your local development database

    Roll back to a specific migration

    > update-database -TargetMigration:"MigrationName"

    Roll back all migrations

    > update-database -TargetMigration:0
  2. Delete all of the migration files you just rolled back from your project

  3. Create a new migration

    > add-migration MigrationName
  4. Generate SQL for necessary schema changes

    > update-database -Script
  5. Apply all schema changes back to local development database

    > update-database

Now you have just one migration representing all of the work you did, which you can use to generate SQL to apply against your production database. Typically, you'd then hand this off to your DBA, if you have one, or give it a good once-over yourself if you don't, to ensure that you understand everything that will happen and that it all looks good. Then you or your DBA would follow the normal procedure for applying these changes based on whatever change management policies are in place. (Typically that involves a database backup, a test run to ensure the script completes without error, and a final deployment of the schema changes, which might even be made against a copy of the database that is then swapped with the live one.)
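
As an aside, if you're on Entity Framework 6, you can sidestep the roll-up dance entirely: update-database can generate a single idempotent script covering every migration from the very beginning:

> update-database -Script -SourceMigration:$InitialDatabase

The resulting script checks the migration history table before applying each migration, so it can safely be run against a database in any state. You'd still hand it off for review and apply it through the same change management process.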

Using the Visual Studio SQL Server Data Tools

If you have the SQL Server Data Tools (SSDT) installed in Visual Studio, or are using a version of Visual Studio that includes them by default (2013), then you can use an approach very similar to the above without all the rolling back and re-migrating. However, it should be noted that this approach has the potential to apply schema changes to the production database automatically. While I definitely don't recommend doing that, the mere possibility might make this particular method frowned upon at your organization. You'll have to work it out with whoever is responsible for databases where you work.

In the Server Explorer pane in Visual Studio, expand the Data Connections tree, and then expand the tree for your local development database connection. This opens a connection to that database, which then allows you to right-click it and choose Browse in SQL Server Object Explorer. (This particular option does not appear until you are connected to the database. You can also get there directly through the SQL Server Object Explorer pane, but it's sometimes difficult to discern which entry is your local development database.)

Once there, right-click your local development database's entry and choose Schema Compare.... In the resulting tab in Visual Studio, your local database will be pre-selected as the source, and there's another drop-down menu on the right to select a target database. For this, you will choose your production database (or perhaps a staging database, which might be a bit safer). The goal is just to get a connection to something that represents the current state of your production database's schema. From this menu, either choose an existing connection or create a new one via Select Target....

Once you've selected the target connection, the Compare button at the top left of the tab's toolbar will light up. Click it (or use the keyboard shortcut, Shift+Alt+C) and Visual Studio will determine what changes would need to be made to make the target database (production or a facsimile thereof) match the source database (your local development one). Once it completes its work, you can choose to update the target in place, which I don't recommend, or, better, generate SQL via the button that looks like a scroll in the tab's toolbar. If you hover over it, the tooltip should say Generate Script. You can also use the keyboard shortcut Shift+Alt+G, if you go for that kind of thing.
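
If you'd rather not click through Visual Studio, the same SSDT bits include a command-line tool, SqlPackage.exe, that can produce an equivalent upgrade script. As a rough sketch (the server, database, and file names here are placeholders, not real ones), you extract a .dacpac snapshot of your development database, then script the changes needed to bring the target up to match it:

> SqlPackage /Action:Extract /SourceConnectionString:"Server=(localdb)\v11.0;Database=MyAppDev;Integrated Security=true" /TargetFile:MyAppDev.dacpac
> SqlPackage /Action:Script /SourceFile:MyAppDev.dacpac /TargetConnectionString:"Server=StagingServer;Database=MyApp;Integrated Security=true" /OutputPath:upgrade.sql

As with Schema Compare, nothing here touches the target database; you just end up with upgrade.sql to review.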

The end result is the same as the previous method: you end up with a SQL script that you should hand off to your DBA, or thoroughly inspect yourself, if you don't have a DBA. When you're confident, apply the changes to your production database in the way dictated by your organization's change management policies.

Alternative Options

  1. Data-Tier Applications

    These are honestly still a bit like black magic to me. I don't fully understand them or how to work with them, but I do know that you can extract one from an existing database, and that they can be versioned such that you can deploy "upgrades" to databases you have in production. If you have a DBA familiar with these things, they can be a very powerful and safe way to migrate your database schema changes.

  2. Third-party tools

    There are many vendors such as Red Gate that offer solutions to migrate schema changes. I haven't used any of these personally, but again, if you have a DBA in your organization, you should discuss what methods are available to you with that person or group and come up with a plan for your organization.


Update: it seems there's been some debate in the comments about whether it's perfectly okay to run migrations against a production database. Let me start off by saying, first and foremost, that if you're running migrations against a production database, even using WebDeploy, I can almost guarantee you aren't in an enterprise environment. Perhaps some of my advice here is a bit cautious, but that's what's been instilled in me by years of following enterprise best practices. In my wilder and headier days, I made direct alterations to production databases, and 99 times out of 100, sure, everything went smoothly. But I also remember the times when I, or another developer in the organization, jacked up the production database entirely. And let me tell you, that is a nightmare scenario. Like I said: wilder and headier days. Now I adhere to good change management policies, and I don't have to worry about whether the changes I'm making will send me into a panic-filled stretch of minutes, or even hours, of trying to restore a database while angry clients yell at me. To each his own.

That said, there are a couple of direct points I'd like to address. The first is a common refrain that I wish people would just stop with already: Microsoft has some feature or another that lets me do it, so obviously they think it's fine. Okay, if that's good enough for you, go for it. Personally, I've worked with Microsoft products long enough to know that half of what goes into them is added by the marketing team. They're features with a big bold star: "Our latest and greatest version will make you X times more productive, as you can now directly deploy website changes and migrate your production database all in one. Woohoo!" Yeah, and then you do that, something inevitably goes wrong, your database is borked, clients are yelling, and who's going to be there to help you out? Microsoft? Nope. You. You and you alone will have to fix it. Hey, maybe if you're a small shop, something like WebDeploy is fine. I'm sure most of the time it will work just great - maybe all the time. And in that scenario here or there where it doesn't, it's probably not a huge deal, as nothing is all that mission critical. However, step into something like the financial, oil, or healthcare sectors (the primary employers of developers here in Houston), or even just e-commerce or anything where downtime equals lost revenue, and more likely than not, using something like WebDeploy is flat-out restricted. Even if it isn't, the first time you screw something up by using it, your company won't be yelling at Microsoft for having a bug in their software. No, they'll be chewing out your posterior region, or kicking it out the door.

The second point is essentially that this is the way other systems, such as Ruby on Rails, work. Well, first, I would say that's how they can work. I'd bet dollars to donuts that a top-tier RoR shop is emphatically not running automatic migrations against their clients' production databases, but rather implementing a much safer deployment process than that. If any RoR, Django, etc. shops out there with true name recognition want to chime in and disagree, please proceed; I'd welcome discussing why you're not worried about this. I have a feeling I won't hear from anyone, though. Second, it's exactly because of this lackadaisical approach to development that RoR, Django, etc. really had to fight to gain acceptance in the enterprise. Invariably, they had to add more controls and safeguards than they originally had, and even then, acceptance was often a compromise: accepting some potential for harm in exchange for more rapid development. Even then, though, I'd assume change management policies went into place immediately to restrict the use of some of these more automated features.

A saying I'm fond of reciting is "just because you can shoot yourself in the foot doesn't mean you should". Microsoft is fond of handing out guns with the safeties off. So be smart with the power, and please don't shoot yourself or anyone else.
