I have a pretty long text file that I need to get into a database. The fields are: line 1: name, line 2: address, line 3: city, and line 4: phone number. What would be the easiest way to get this into a database? Would it be easier if I made them comma-delimited?
Technically the file can be read either way. I'd imagine re-formatting the text file would be a hassle, so my suggestion is to read the text file line by line, building up SQL INSERT statements and executing them.
// Read the text file line by line using fgets().
// Note: fgets() returns a string of up to length - 1 bytes read from the file
// pointed to by handle. Reading ends when length - 1 bytes have been read,
// on a newline (which is included in the return value), or on EOF, whichever
// comes first. If no length is specified, it defaults to 1k, or 1024 bytes.
$handle = fopen('data.txt', 'r');  // 'data.txt' is a placeholder for your file
while (($name = fgets($handle, 4096)) !== false) {
    $address = fgets($handle, 4096);
    $city    = fgets($handle, 4096);
    $phone   = fgets($handle, 4096);
    // trim the trailing newlines, build the INSERT statement, and execute it
}
fclose($handle);
I had to do this with a bunch of text files (newspaper articles).
I used this from the command line:
LOAD DATA LOCAL INFILE 'files.txt' INTO TABLE media FIELDS TERMINATED BY '\r\n%%\r\n' LINES TERMINATED BY '\n%%%\n' (date,type,headline,text);
The \r\n matches a carriage return followed by a line feed, i.e. a Windows-style line break.
You can use whatever you want to separate the fields, such as commas, but since the articles themselves contained commas I chose two percent signs on a line of their own.
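Given the terminators in the command above, a single record in files.txt would look like this (the field values here are made up for illustration):

```
12-20-04
%%
news
%%
Some headline
%%
The text of the article, with commas, goes here.
%%%
```

Each %% line separates two fields, and the %%% line marks the end of one record.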
Just make sure the file files.txt is in the same directory that you're running mysql from.
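To come back to the original comma-delimited question: if you do convert the file to one comma-separated record per line, the command gets simpler. Something like the following should work, assuming a table named contacts with those four columns (the names are hypothetical):

```sql
LOAD DATA LOCAL INFILE 'contacts.csv' INTO TABLE contacts
  FIELDS TERMINATED BY ','
  LINES TERMINATED BY '\n'
  (name, address, city, phone);
```

Just watch out for commas inside the field values themselves; that is exactly why I avoided commas for the articles.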
Last edited by Chris J.; 12-20-04 at 01:55.
Reason: clarify separating lines