Alright, I'm trying to figure out the best way to set up this table: whether one massive table is the right approach, or whether breaking it up would be better.
I have a 999x999 record matrix. It maps the first 3 digits of a ZIP code to a corresponding number, i.e. a zone chart.
Now I'm wondering: would it be best to set up one giant table with three columns (origin ZIP / destination ZIP / zone), or would breaking it into 10 tables run faster? Right now one giant table runs fine, but under heavy usage I don't think it would perform as well as it could. There are actually two results for each zone, so it could end up being 2 million records.
I'm going to take a wild guess that you don't need to be optimizing anything at this stage of development.
In general (since I don't know what DBMS you're using), a read-only lookup table with a simple primary key will be served by an index — typically a B-tree, sometimes a hash table — under shared read locks, so concurrent lookups don't block each other. Hash lookups run in constant time, O(1), and B-tree lookups in O(log n), which for 2 million rows is about 21 comparisons. Either way, looking up against 200 or 200 thousand or 2 billion records takes essentially the same time per query. Splitting it into multiple tables is probably an unnecessary headache for you and makes it more difficult for the DBMS to optimize.
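To make that concrete, here's a minimal sketch of the single-table layout with a composite primary key on the origin/destination prefix pair. The table and column names are my own invention, and I'm using SQLite just for illustration; any DBMS will build an index from the `PRIMARY KEY` declaration and resolve each lookup through it:

```python
import sqlite3

# One lookup table keyed on the (origin, destination) prefix pair.
# The composite primary key gives the DBMS a single index to search.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE zone_chart (
        origin_prefix TEXT NOT NULL,   -- first 3 digits of origin ZIP
        dest_prefix   TEXT NOT NULL,   -- first 3 digits of destination ZIP
        zone          INTEGER NOT NULL,
        PRIMARY KEY (origin_prefix, dest_prefix)
    )
""")
conn.execute("INSERT INTO zone_chart VALUES ('606', '900', 8)")

# A point lookup: the index resolves this without scanning the table.
row = conn.execute(
    "SELECT zone FROM zone_chart WHERE origin_prefix = ? AND dest_prefix = ?",
    ("606", "900"),
).fetchone()
print(row[0])  # → 8
```

If you really do have two results per pair, you'd either add a second zone column or widen the key — but the single-table shape stays the same.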
But don't take my word for it: generate some test data and some test scripts to see which configuration runs faster. It'd be nice if you posted the scripts and results.