We have a window whose values are populated from two or three tables. The front end is in PowerBuilder.
For example, the window is laid out as follows:

Column Name   Column Value   Document Value   Confirmation Ind   Userid   Date/time
1) Interest   20             Title            Y                  abc      ...
2) Income     3000           Payment          N                  abc      ...

In this fashion the window has 20 fields. The Column Value, Document Value, and Confirmation Ind columns are editable.
When the user clicks Save, each of these values needs to be updated in the table if it already exists, or inserted if it does not, along with the user id and timestamp for each row.
A stored procedure will have to be written for this. To get all the values into it, I would have to pass them as input parameters, which means 20 rows x 5 columns = 100 input parameters.
The first time the user saves, there is no data in the table, so all the values are inserted and all 100 parameters carry values. On subsequent saves only the values the user changed need to be updated, so only a few parameters carry values while the rest are spaces.
But is it feasible, or good practice, to give a stored procedure 100 input parameters?
Is there any other way the values can be passed to the stored procedure?
The table structure is as follows:

Column Name      Char
Column Value     Integer
Document Value   Char
Confirm Ind      Char
User Id          Integer
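One common alternative to a single 100-parameter call is to give the procedure one set of 5 parameters and call it once per edited row, with the insert-or-update decision inside the procedure. The sketch below shows that per-row upsert logic in Python with an in-memory SQLite table; the table and column names (`window_values`, `column_name`, etc.) are assumptions, not the actual schema, and the real version would live in a stored procedure called from the PowerBuilder script.

```python
import sqlite3

# Hypothetical backing table mirroring the window; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE window_values (
        column_name  TEXT PRIMARY KEY,
        column_value INTEGER,
        doc_value    TEXT,
        confirm_ind  TEXT,
        user_id      TEXT,
        updated_at   TEXT
    )
""")

def save_row(column_name, column_value, doc_value, confirm_ind, user_id):
    """Update the row if it exists, otherwise insert it -- the same logic
    a 5-parameter stored procedure would carry, called once per edited row
    instead of once with 100 parameters."""
    cur = conn.execute(
        """UPDATE window_values
           SET column_value = ?, doc_value = ?, confirm_ind = ?,
               user_id = ?, updated_at = datetime('now')
           WHERE column_name = ?""",
        (column_value, doc_value, confirm_ind, user_id, column_name))
    if cur.rowcount == 0:
        # No existing row matched, so this is the first save: insert.
        conn.execute(
            "INSERT INTO window_values VALUES (?, ?, ?, ?, ?, datetime('now'))",
            (column_name, column_value, doc_value, confirm_ind, user_id))

save_row("Interest", 20, "Title", "Y", "abc")
save_row("Income", 3000, "Payment", "N", "abc")
save_row("Interest", 25, "Title", "Y", "abc")   # second save updates in place
print(conn.execute(
    "SELECT column_name, column_value FROM window_values "
    "ORDER BY column_name").fetchall())
# → [('Income', 3000), ('Interest', 25)]
```

This also sidesteps the "spaces mean unchanged" convention: rows that were not edited are simply never sent.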
Yes, it is feasible for a stored procedure to take a large number of parameters. I have just written a stored procedure that inserts a single row into a table and takes 80 parameters.
With respect to good practice: when calling the stored procedure, supply the values by name rather than by position; this will help debugging.
Although I know it is possible to use the DataWindow update feature of PB to call a stored procedure, I personally prefer to generate the SQL within the script and then execute the generated string. Again, I find this simpler to debug.
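The generate-and-execute approach described above can be sketched as follows. In PowerBuilder this would be string concatenation followed by a dynamic execute; this Python sketch only shows the shape of the generated statement, with hypothetical table and column names, and deliberately quotes values inline the way concatenated SQL does (real code must escape or parameterize values to avoid SQL injection).

```python
def build_update(table, col_name, col_value, doc_value, confirm_ind, user_id):
    """Assemble an UPDATE statement as a plain string, the way a script
    that generates SQL and executes the resulting text would.
    All identifiers here are hypothetical."""
    return (
        f"UPDATE {table} "
        f"SET column_value = {col_value}, "
        f"doc_value = '{doc_value}', "
        f"confirm_ind = '{confirm_ind}', "
        f"user_id = '{user_id}', "
        f"updated_at = CURRENT_TIMESTAMP "
        f"WHERE column_name = '{col_name}'"
    )

sql = build_update("window_values", "Interest", 25, "Title", "Y", "abc")
print(sql)
```

The debugging advantage the answer mentions comes from being able to log or inspect `sql` exactly as the database will see it before executing it.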