I have a functioning system that loads data from an external source.

We need to validate and trace that information and where it ends up being placed.

I have one table that contains the packed data in one long field and a unique "Trace Number" in another field.

The data from that long field is parsed and distributed into about 150 tables, depending on which specific data items it contains. This is done by ~100,000 lines of stored procedures.
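To make the layout concrete, the input side is roughly like this (I'm sketching in Oracle-style SQL throughout since the process works with rowids; all names and types here are made up, not the real ones):

    -- Hypothetical shape of the input/staging table.
    CREATE TABLE staging_input (
      trace_number  VARCHAR2(30),  -- unique per incoming record
      packed_data   CLOB           -- the one long packed field the procedures parse
    );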

One "input" record can be split out and insert new rows into several tables.
one input record may update one or several tables.
one input record may re-update a row it just modified or inserted.

The current system does not read the "Trace Number".

The intent is to test the current process, so extensive changes to the existing code are out of the question.

In general the approach I'd like to see is this:

Modify the common procedure that first reads the record so that it loads the "Trace Number" into a 'Global Variable'.

Add post-insert/update triggers to all of the tables that log to one common table: the current rowid, the table name, a timestamp, and that "Trace Number" from the 'Global Variable'.

Since The "Trace Number" It's not one of the columns in the tables themselves I need a way to "Share" that number from the first step
in this process with an automated trigger called from the database engine itself.

There is only one process that runs this load and only one copy will ever
be running at any point in time.

What is the best way to do it?