I have a big database with over 250,000 descriptions.
To avoid slow searches with queries like `LIKE '%searchword_here%'`, I thought of making a separate table with the keywords.
So I have 2 tables:
- table1, with the ids and the original descriptions:
description_id 1: description: word1,word2,word3,word4
description_id 2: description: word1,blabla2,word3,blabla4
description_id 3: description: blabla1,word2,word3,blabla4
- table2, with the separate words extracted from the descriptions in table1, and the matching description ids separated with a |
a short example:
id 1: word word1 description_ids: 1|2
id 2: word word2 description_ids: 1|3
id 3: word word3 description_ids: 1|2|3
id 4: word word4 description_ids: 1
id 5: word blabla2 description_ids: 2
id 6: word blabla1 description_ids: 3
id 7: word blabla4 description_ids: 2|3
with a primary key on id and an index on the word
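To make the layout concrete, here is a minimal sketch of the two tables as I have them, using SQLite purely for illustration (the table and index names are my own labels, and the real database engine may differ):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE table1 (
    description_id INTEGER PRIMARY KEY,
    description    TEXT                  -- comma-separated words
);
CREATE TABLE table2 (
    id              INTEGER PRIMARY KEY,
    word            TEXT,
    description_ids TEXT                 -- pipe-separated list, e.g. '1|2|3'
);
CREATE INDEX idx_table2_word ON table2 (word);
""")
```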
When searching, the query looks up the search words in table2, gets the description_ids, and then fetches the complete descriptions from table1.
Searching now goes very fast.
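The two-step lookup can be sketched like this (SQLite with made-up sample data for illustration; the `search` helper and names are mine, not the real code):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE table1 (description_id INTEGER PRIMARY KEY, description TEXT)")
con.execute("CREATE TABLE table2 (id INTEGER PRIMARY KEY, word TEXT, description_ids TEXT)")
con.execute("CREATE INDEX idx_word ON table2 (word)")
con.executemany("INSERT INTO table1 VALUES (?, ?)",
                [(1, "word1,word2,word3,word4"),
                 (2, "word1,blabla2,word3,blabla4")])
con.execute("INSERT INTO table2 VALUES (1, 'word1', '1|2')")

def search(word):
    # Step 1: indexed lookup of the word in table2.
    row = con.execute("SELECT description_ids FROM table2 WHERE word = ?",
                      (word,)).fetchone()
    if row is None:
        return []
    # Step 2: split the pipe-separated ids and fetch the full descriptions.
    ids = [int(i) for i in row[0].split("|")]
    placeholders = ",".join("?" * len(ids))
    return con.execute(
        f"SELECT description_id, description FROM table1 "
        f"WHERE description_id IN ({placeholders})", ids).fetchall()
```

This avoids the `LIKE '%…%'` scan entirely: only the indexed word column is searched.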
But inserting new words into table2 and/or updating its records, or rebuilding it from scratch, costs a lot of time as table2 grows bigger and bigger with more words and ids.
How can I make building table2 from scratch more efficient and faster?
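For comparison, here is one way I imagine the bulk rebuild could work (this is an assumption on my part, not my current code): read table1 once, aggregate word → description_ids in memory, then write table2 with a single bulk insert inside one transaction, and create the word index only after loading, instead of updating table2 row by row while it grows:

```python
import sqlite3
from collections import defaultdict

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE table1 (description_id INTEGER PRIMARY KEY, description TEXT)")
con.executemany("INSERT INTO table1 VALUES (?, ?)", [
    (1, "word1,word2,word3,word4"),
    (2, "word1,blabla2,word3,blabla4"),
    (3, "blabla1,word2,word3,blabla4"),
])

# Aggregate the whole inverted index in memory first.
index = defaultdict(list)                      # word -> [description_id, ...]
for desc_id, description in con.execute(
        "SELECT description_id, description FROM table1 ORDER BY description_id"):
    for word in description.split(","):
        index[word].append(desc_id)

con.execute("CREATE TABLE table2 (id INTEGER PRIMARY KEY, word TEXT, description_ids TEXT)")
with con:                                      # one transaction for the whole load
    con.executemany(
        "INSERT INTO table2 (word, description_ids) VALUES (?, ?)",
        [(w, "|".join(map(str, ids))) for w, ids in index.items()],
    )
con.execute("CREATE INDEX idx_word ON table2 (word)")  # index after bulk load
```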