I’m a big fan of code that writes code – most of the time. For instance, whenever you use the “templates” feature in SQL Server Management Studio (SSMS) or the Maintenance Wizard, you’re using code that writes other code. There’s even a trick of writing Transact-SQL (T-SQL) code that in turn creates other code.
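A minimal sketch of that trick – T-SQL that writes more T-SQL. This query builds an `ALTER INDEX ... REBUILD` statement for every user table in the current database (the output is text to review and run, not something executed automatically):

```sql
-- T-SQL that writes T-SQL: generate an index-rebuild statement
-- for each user table in the current database.
-- Illustrative sketch only; inspect the generated statements before running them.
SELECT 'ALTER INDEX ALL ON '
       + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
       + ' REBUILD;' AS RebuildStatement
FROM sys.tables AS t
JOIN sys.schemas AS s
    ON t.schema_id = s.schema_id;
```

Copy the result set into a query window, read it, then execute it – the same pattern SSMS templates and the Maintenance Wizard use under the covers.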
But there is a class of code that writes code that I’m more cautious about. Whenever a program “automatically” generates a database schema, I begin to get nervous. No, I’m not talking about Entity Relationship Diagram (ERD) tools such as those from Quest and Embarcadero, I’m talking about things like NHibernate and other coding paradigms that “abstract” the database layer away from the developer.
I have two reasons why I’m not impressed with these programs and paradigms.
First, they do not take the entire solution into account. As data professionals we learn our platform (whether that’s XML, flat files, SQL Server, Oracle, IBM, whatever) and we study how each of its features maps to a complete solution. I might choose to use Replication, Service Broker, Clustering, FileStream, or any number of features to completely remove the need for code in that area. And by making those choices, I change the design of the database to fit the solution. The abstraction tools don’t – they just spit out the same Data Definition Language (DDL) statements they know how to create, without thinking about maintenance, speed, reliability or anything else.
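To make the contrast concrete, here is a hypothetical example (table, column, and filegroup names are all invented for illustration). The first block is the kind of bare DDL a generation tool typically emits; the second is what a solution-aware design of the same table might look like, with explicit data types, filegroup placement, and an index chosen for a known query pattern:

```sql
-- What a schema-generation tool typically emits (names hypothetical):
CREATE TABLE Orders (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT,
    OrderDate DATETIME
);

-- What a solution-aware design might look like instead:
CREATE TABLE dbo.Orders (
    OrderId    INT IDENTITY(1,1) NOT NULL,
    CustomerId INT NOT NULL,
    OrderDate  DATETIME2(0) NOT NULL,      -- precision chosen deliberately
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderId)
) ON [FG_Orders];                          -- hypothetical filegroup, placed
                                           -- for I/O and maintenance reasons

-- Supporting index for the (assumed) dominant query pattern:
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_OrderDate
    ON dbo.Orders (CustomerId, OrderDate);
```

None of the choices in the second block are exotic – but every one of them requires knowing the solution, which is exactly what the tool doesn’t have.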
When I mention this to the developer, they say “just put that in later” – and that’s the beginning of woes for the data professional. Most of the time you can’t put things of a fundamental nature in later. In some cases, it’s a complete tear-down and re-write of the entire database. Very painful, and something you never want to experience.
The second main reason that I am skeptical about tools that automatically create DDL statements is that they don’t do a good job. Once again, this comes down to the holistic view – the tool has no way to take everything into account, including the data patterns, so it has no idea how far to normalize, where to place files and filegroups, what kind of indexes (if any) are needed, or when to choose between a natural and a surrogate key.
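Take that last question as an example. A common hand-designed compromise (all names here are hypothetical) is to use a surrogate key for joins while still enforcing the natural business key with a unique constraint – a judgment call no generation tool can weigh for you:

```sql
-- Surrogate key for narrow, stable joins; natural key still enforced.
CREATE TABLE dbo.Product (
    ProductId INT IDENTITY(1,1) NOT NULL,  -- surrogate key
    SKU       VARCHAR(20)  NOT NULL,       -- natural business key
    Name      NVARCHAR(100) NOT NULL,
    CONSTRAINT PK_Product PRIMARY KEY CLUSTERED (ProductId),
    CONSTRAINT UQ_Product_SKU UNIQUE (SKU) -- keeps the natural key honest
);
```

Whether this is right depends on how the business uses SKUs – whether they change, how wide they are, how often they’re joined on. That’s information that lives with the data professional and the business analyst, not in the ORM’s mapping file.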
Is all this just because I’m a DBA, and anal about my databases? No. It’s a business problem, because these tools continue to separate the DBA from the developer, and both from the business analyst. And it’s the business that suffers when developers (or DBAs) take shortcuts. I wonder how much time and money businesses waste rewriting databases because of shortcuts taken with poor tools.
So let’s do the hard work – let’s let the business requirements dictate the solution, rather than the other way around. Even for the “little” databases, which of course never stay that way.
Now, does this mean I hate NHibernate or the Entity Framework? Not at all! Just work with your data professionals instead of without them. You’ll find that by bringing the best practices of these ORM tools together with a well-designed database, you’ll deliver a solution to the business that is fast, reliable and safe. Isn’t that what we all want?
I couldn't agree more, but I will also go ahead and say that I hate NHibernate with a passion! It's like Java, it's good for a science experiment and educational purposes, but it should NEVER be used in a Production environment, no matter how big or small.