I am trying to write a generic stored procedure that returns the data type of a given column in a given table in any database.
I would like to place this sp in master.
The input parameters are @TableName sysname and @ColName sysname.
The output parameter is @data_type varchar(128).
select @data_type = data_type from information_schema.columns where table_name = @TableName and column_name = @ColName
When debugging this I always get 0 rows (@data_type remains '').
One thought is that I can add an input parameter for the database, say @DbName sysname,
but I am having trouble updating the SELECT statement to be something like
...from @DbName + '.' + information_schema.columns.
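You can't concatenate a variable directly into the FROM clause like that; object names aren't parameterizable in a static SELECT. One way around it (a sketch only, assuming SQL Server 2000-era T-SQL, with a hypothetical procedure name) is to build the three-part name with dynamic SQL and return the value through sp_executesql's OUTPUT parameter support:

```sql
-- Sketch: look up a column's data type in another database via dynamic SQL.
-- The procedure name sp_GetColumnType is made up for illustration.
CREATE PROCEDURE sp_GetColumnType
    @DbName    sysname,
    @TableName sysname,
    @ColName   sysname,
    @data_type varchar(128) OUTPUT
AS
BEGIN
    DECLARE @sql nvarchar(4000)

    -- Only the database name is concatenated (guarded with QUOTENAME);
    -- the table and column names stay as real parameters.
    SET @sql = N'SELECT @data_type = DATA_TYPE FROM '
             + QUOTENAME(@DbName) + N'.INFORMATION_SCHEMA.COLUMNS'
             + N' WHERE TABLE_NAME = @TableName AND COLUMN_NAME = @ColName'

    EXEC sp_executesql @sql,
        N'@TableName sysname, @ColName sysname, @data_type varchar(128) OUTPUT',
        @TableName, @ColName, @data_type OUTPUT
END
```

Then something like EXEC master.dbo.sp_GetColumnType 'pubs', 'authors', 'au_lname', @t OUTPUT (pubs/authors used purely as an example) should work regardless of which database you call it from, since the database name is spelled out in the query itself.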
When you say "data type of a given col in a given table in any database", are you looking to test this column/table across all databases, or just in one database? In your example, the information_schema.columns view would seem sufficient - why do you want to create a stored procedure?
I want to use this functionality from various places, so my thinking is that I want to call my own routine (the view may change someday - then I will only have to update my sp instead of hunting down every select from information_schema).
The SELECT as given worked fine UNTIL I moved the sp it was in from my db to master - then it suddenly returned 0 rows.
I am guessing that when I debug my sp from Query Analyzer it uses master as the context and then executes - could that be true?
Also, I named my sp sp_blah (note the sp_ prefix), so it should execute in the context of the current db (that was my thinking, anyway).
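My understanding (worth verifying on your server) is that the sp_ prefix changes where the procedure is *found*, but a plain reference to INFORMATION_SCHEMA.COLUMNS inside a procedure stored in master can still resolve against master's own view, which would explain the 0 rows. A quick way to test that theory is a throwaway proc that reports both:

```sql
-- Sketch: create this in master, then EXEC it from another database.
-- If cols_seen reflects master's columns while current_db shows the
-- calling database, the view reference is resolving in master.
CREATE PROCEDURE sp_whichdb
AS
    SELECT DB_NAME() AS current_db,
           COUNT(*)  AS cols_seen
    FROM INFORMATION_SCHEMA.COLUMNS
```

If that is what's happening, dynamic SQL with an explicit database name (as discussed above for @DbName) sidesteps the resolution issue entirely.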
Originally posted by rnealejr
FYI - you can use sp_MSforeachdb to loop through all databases.
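For reference, sp_MSforeachdb substitutes each database name for the ? token in the command you pass it. A sketch (note that sp_MSforeachdb is undocumented and unsupported; the column name here is just an example):

```sql
-- Sketch: report every database's columns with a given name.
-- '?' is replaced by each database name in turn.
EXEC sp_MSforeachdb
    'SELECT ''?'' AS db, TABLE_NAME, COLUMN_NAME, DATA_TYPE
     FROM [?].INFORMATION_SCHEMA.COLUMNS
     WHERE COLUMN_NAME = ''au_lname'''
```

Note the doubled single quotes for literals inside the outer string, and the brackets around ? so database names with odd characters still parse.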
Also, I would recommend creating your own information_schema view based on your requirements rather than a stored procedure.
How would you do that, when you would have to cross catalogs?