I am interested in people's opinions on surrogate keys: should they be generated automatically by the DBMS, or generated by the application code itself?
I have a code generation tool that lets me target many different platforms and languages, e.g. one week I could be targeting a MySQL database via Java and the next week SQL Server.
I am keen to implement surrogate keys (where applicable). In theory I can implement these at the database level, using AUTO_INCREMENT in MySQL or IDENTITY in SQL Server, and other techniques for other databases.
Alternatively, I can write my own surrogate server that hands out a unique key (across the whole system or per table) and use that value on insert.
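A minimal sketch of such an application-level surrogate server, assuming per-table sequences held in memory (the class and method names are illustrative, not from any library):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical in-memory surrogate server: one monotonically
// increasing sequence per table name, thread-safe via AtomicLong.
public class SurrogateServer {
    private final ConcurrentHashMap<String, AtomicLong> sequences =
            new ConcurrentHashMap<>();

    // Returns the next surrogate for the given table (1, 2, 3, ...).
    public long nextId(String table) {
        return sequences
                .computeIfAbsent(table, t -> new AtomicLong(0))
                .incrementAndGet();
    }
}
```

A production version would also need to seed the counters from the database on startup, or reserve blocks of ids, so that values survive restarts.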
Advantages of the DBMS-served approach are:
Speed (key generation happens inside the database)
Advantages of the surrogate server at the application layer are:
I can track the last-used surrogate for an insert when I have foreign key dependencies (e.g. a parent/child transaction)
I can implement it generically and stay database agnostic
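The parent/child point can be sketched as follows, with an illustrative in-memory counter standing in for the surrogate server: because the application generates the key itself, the parent's surrogate is known before any insert is issued, so the child rows can carry the foreign key without a round trip to fetch a DB-generated value.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

// Illustrative parent/child insert with an application-generated
// surrogate: the parent key exists before the rows are built.
public class ParentChildDemo {
    static final AtomicLong orderSeq = new AtomicLong(0);

    record OrderLine(long orderId, String sku) {}

    public static List<OrderLine> buildLines(String... skus) {
        long orderId = orderSeq.incrementAndGet(); // parent key, pre-insert
        List<OrderLine> lines = new ArrayList<>();
        for (String sku : skus) {
            lines.add(new OrderLine(orderId, sku)); // FK already known
        }
        return lines;
    }
}
```

With DBMS-generated keys the same flow needs an extra step after the parent insert (e.g. retrieving the generated value) before the children can be written.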
Just wondering what others think? Opinions and suggestions welcome.
I think you pretty well have it covered.
It is tempting to write applications so that they are database generic and can run on any platform. The sacrifice is all the proprietary functionality specific to each individual platform, and for an enterprise application that sacrifice is often dear. Plus, in my 15 years of consulting, I can literally count on one hand the number of applications I have seen that actually used that cross-platform capability.
If it's not practically useful, then it's practically useless.