I have a large table (~2 million rows) that is under heavy traffic from other queries.
Let's say I have an indexed column "name", and an external (non-IT) source gives me a file with 200,000 names whose attributes I need to fill in. I guess that writing a script which translates this into the query:
SELECT attr1,attr2,... FROM t1 WHERE name in ('x1','x2',...,'x200000');
is quite inefficient, isn't it? The response time would be huge.
Also, writing a script that loops 200,000 times, querying:
SELECT attr1,attr2,... FROM t1 WHERE name='xn';
is not good practice either.
Does anyone know the best practice for this case? I note that the names are not already in a second table, so a plain JOIN can't be used. Would creating a table with just one column, "name", filling it from the file, and then using a JOIN be the right solution?
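To make it concrete, here is a minimal sketch of what I have in mind, using Python's sqlite3 just for illustration (the table name t1, the columns, and the lookup table name are placeholders, and the real data would of course be much larger):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in for the big table (~2 million rows in reality).
cur.execute("CREATE TABLE t1 (name TEXT, attr1 TEXT, attr2 TEXT)")
cur.execute("CREATE INDEX idx_t1_name ON t1 (name)")
cur.executemany(
    "INSERT INTO t1 VALUES (?, ?, ?)",
    [("x1", "a1", "b1"), ("x2", "a2", "b2"), ("x3", "a3", "b3")],
)

# Names parsed from the external file (200,000 in reality).
names = ["x1", "x3"]

# Bulk-load the names into a one-column temporary lookup table...
cur.execute("CREATE TEMP TABLE lookup_names (name TEXT PRIMARY KEY)")
cur.executemany("INSERT INTO lookup_names VALUES (?)", [(n,) for n in names])

# ...then fetch all the attributes with a single join on the indexed column,
# instead of one giant IN list or 200,000 separate queries.
rows = cur.execute("""
    SELECT t1.name, t1.attr1, t1.attr2
    FROM lookup_names
    JOIN t1 ON t1.name = lookup_names.name
""").fetchall()
```

Is this the direction I should take, or is there a better pattern for this kind of bulk lookup?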