Your question/answer series with Metro292008 regarding text data, Access recordsets, and Excel arrays was very well done. (Ref http://www.dbforums.com/showthread.php?t=1627146) I have a similar puzzlement that is not about text but about a range of cells in an Excel spreadsheet needing to go into an Access database. It's slightly different, but mainly the same question.
I need to have a 250-row by 30-column worksheet range transferred to an Access database once every minute. Each column corresponds to a field in the database, so there is a one-to-one correspondence between the fields and the column headers. I can thus ignore row 1, which contains the column names.
There is a DDE datastream updating the data in the cells. Imagine we are measuring temperature, valve position, and so on, and a server is putting the data into an Excel spreadsheet in real time. The cell values change as the DDE datastream delivers data to the spreadsheet, and every minute I want to take a snapshot and deposit it into an Access database. So, every minute, I need to flush this data to an Access database. Precisely every minute. The instant the minute ticks over, the data in the cells needs to be reset to all zeros, and then one minute's worth of accumulation via the DDE datastream starts all over again. Then, precisely one minute later, I need to flush the range of cells to the database again. Over and over and over again.
Can I write a 250 by 30 array in a single statement to an Access database once every minute?
Or do I need to loop through the rows and write them one at a time? If I have to write them one at a time, I need a loop that writes all 250 rows within the minute, so it seems to me I'll have to offset the writes by 1/250th of a minute. Staggered writes 1/250th of a minute apart look harder to code than a single write of the array in a single statement, but I'm one of those odd birds whose job never required me to use Excel before, so I don't know enough to know which approach has the smallest overhead.
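For what it's worth, there is a way to move the whole range in one SQL statement rather than looping: connect ADO to the Access database and run an INSERT INTO ... SELECT that reads the range through the Jet Excel ISAM. This is only a sketch under assumed names — the table `tblSnapshots`, the database path `C:\data\log.mdb`, the workbook path `C:\data\live.xls`, and the sheet/range are all placeholders, and the field order in the table must match the column order A:AD. Note that Jet reads the workbook file on disk, not the live in-memory sheet, so the workbook has to be saved first.

```vba
' Sketch only: single-statement flush of a 250x30 range into Access.
' tblSnapshots, the .mdb path, and the .xls path are assumptions.
Sub FlushRangeInOneStatement()
    Dim cn As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\data\log.mdb"

    ' Jet's Excel ISAM reads the file on disk, so save the workbook first.
    ThisWorkbook.Save

    ' HDR=No because we start at row 2 and skip the header row ourselves.
    cn.Execute "INSERT INTO tblSnapshots " & _
               "SELECT * FROM [Excel 8.0;HDR=No;Database=C:\data\live.xls]" & _
               ".[Sheet1$A2:AD251]"

    cn.Close
    Set cn = Nothing
End Sub
```

The catch for your use case is the Save: forcing a file save every minute while DDE is pouring data in may be more disruptive than a row loop, so treat this as one option to benchmark, not the definitive answer.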
If I could expand this to 750 by 30, that would be ideal, so are there memory issues I need to take into consideration? Do I need to set the array and the recordset variables to "" after each write, for example?
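To make the timing and cleanup questions concrete, here is a hedged sketch of the other approach: snapshot the range into a Variant array with one assignment, zero the cells immediately so the next minute's accumulation can begin, then write the array out row by row through an ADO recordset, and release everything before the next run. Every name here (sheet, range, database path, table) is an assumption, and note the snapshot-then-clear pair is not truly atomic with respect to DDE updates.

```vba
' Sketch: minute-by-minute snapshot/flush using Application.OnTime.
' Sheet1!A2:AD251, C:\data\log.mdb, and tblSnapshots are assumed names.
Dim NextRun As Date

Sub ScheduleFlush()
    NextRun = Now + TimeSerial(0, 1, 0)     ' rearm for one minute from now
    Application.OnTime NextRun, "FlushSnapshot"
End Sub

Sub FlushSnapshot()
    Dim data As Variant, rng As Range
    Set rng = Worksheets("Sheet1").Range("A2:AD251")

    data = rng.Value                        ' one-statement snapshot to an array
    rng.Value = 0                           ' reset cells for the next minute

    Dim cn As Object, rs As Object, r As Long, c As Long
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\log.mdb"
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "tblSnapshots", cn, 1, 3        ' adOpenKeyset, adLockOptimistic

    For r = 1 To UBound(data, 1)            ' the write itself is a loop,
        rs.AddNew                           ' but it runs from memory, not
        For c = 1 To UBound(data, 2)        ' from the live worksheet
            rs.Fields(c - 1).Value = data(r, c)
        Next c
    Next r
    rs.Update

    ' Address the memory question: close and release, don't just set to "".
    rs.Close: cn.Close
    Set rs = Nothing: Set cn = Nothing
    Erase data

    ScheduleFlush                           ' schedule the next minute's flush
End Sub
```

On your cleanup question: in VBA you release object variables with `Set x = Nothing` (not `""`), and free a dynamic array with `Erase`. At 750 x 30 Variants the array is still small by modern standards, so memory should not be the limiting factor; the per-row recordset writes will dominate the minute budget.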
I'm just a newbie to Excel struggling to figure out the best way to take data collected in an Excel spreadsheet in real time and deposit it, in one-minute slices, into a database for group querying & process-control alerts.
My programming experience is mostly in Pick, so you can imagine how odd it feels to work in Excel LOL.
This is really a separate issue, so I'm going to split this subject off on its own thread.
How often does DDE update the spreadsheet? It will take VB a second or two to connect to the spreadsheet, read the data (using ADO), clone the spreadsheet, then delete all the data in the recordset. What happens if, while you're killing the existing data, but before the spreadsheet is updated, DDE adds another record?
Is the DDE app smart enough to know that the data has been moved, so it will start back at row 2 again?
Could you not use DDE to update Access directly? I haven't done that in forever, but I'm sure that it can be done. It would almost certainly be a better fit than what you're doing. As poor as Access can be when multiple users start monkeying with the data simultaneously, Excel is worse. DDE was intended to be used in conjunction with Excel as a means of presenting the data to the users, and for analysis of data offline. It is NOT a good tool for real-time collection/processing, unless it's for a batch-type process (and fewer than 65K rows).
Last edited by loquin; 04-04-08 at 03:10.