I am looking into writing a Perl script that could have a hash of several hundred entries. I am a bit nervous at the prospect of soaking up that much memory because a) I have never done it before, and b) I am not certain Perl will perform well at that level. Has anyone played around with hashes of up to or over 500 entries? Or should I redesign now, before it is too late?
A hash of several hundred key/value pairs should be nothing to worry about unless you are using a computer with very little memory. I've played around with hashes with thousands of key/value pairs and performance was not a problem as far as that goes.
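To put some numbers on it, here is a quick sketch that builds a hash an order of magnitude larger than the one you describe and exercises it. The key names and sizes are made up for illustration; hash lookups stay roughly constant-time regardless of entry count, so the only real cost is memory, which for a few hundred entries is trivial on any modern machine.

```perl
use strict;
use warnings;

# Build a hash with 5_000 key/value pairs -- ten times the "several
# hundred" in question -- then look entries up. Keys are invented.
my %lookup;
for my $i (1 .. 5_000) {
    $lookup{"key_$i"} = $i * 2;
}

print scalar(keys %lookup), " entries\n";   # 5000
print $lookup{"key_1234"}, "\n";            # 2468
```

If you ever do want to see the actual memory footprint, the Devel::Size module from CPAN can report the total size of a hash, but for this scale it is not worth worrying about.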
I'm not suggesting that it is a good idea to build a hash that big, I'm just pleading guilty to having done it...
It was a Perl script to create a cross-reference for an outrageous number of stored procedures doing all kinds of truly perverse things (cross-database and sometimes even cross-server access). The script took a long time to run, but it was tracking a truly horrific number of references to an unholy number of objects, so it was way better than trying to do it manually.
I am hoping this script only runs once per week. Looks like I am doing something similar to what you were doing. This is the second incarnation of my Online Data Dictionary. I am just trying to collect up all the tables, columns, and maybe indexes in this version. Stored procedures and views are still not being considered yet; I am just not sure there is demand for them yet.
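For a data dictionary like that, a hash of hashes is a natural fit: one top-level entry per table, each holding its columns and indexes. Here is a minimal sketch under that assumption; the table, column, and index names are invented purely for illustration.

```perl
use strict;
use warnings;

# Hypothetical data-dictionary structure: table name => metadata.
# All names below are made up for the example.
my %dictionary = (
    customers => {
        columns => { id => 'INT', name => 'VARCHAR(100)' },
        indexes => [ 'pk_customers' ],
    },
    orders => {
        columns => { id => 'INT', customer_id => 'INT' },
        indexes => [ 'pk_orders', 'ix_orders_customer' ],
    },
);

# Walk the structure, e.g. to render the dictionary page.
for my $table (sort keys %dictionary) {
    my $t = $dictionary{$table};
    printf "%s: %d columns, %d indexes\n",
        $table,
        scalar(keys %{ $t->{columns} }),
        scalar(@{ $t->{indexes} });
}
```

Even with hundreds of tables, a structure like this stays small and fast, and adding stored procedures or views later is just another key per table.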