I'm programming a website that has to store (possibly tens of) thousands of records for every user.
With 1,000+ users the table could quickly grow to millions of records.
Will this affect query times drastically, and would it be better to use a separate table for every user? Or will loading times not be a problem with a single table holding millions of records (with proper indexing, obviously)?
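To make the single-table option concrete, here is a minimal sketch using Python's built-in `sqlite3` (SQLite stands in for whatever engine you actually use; the table and column names are invented for illustration). The point is that with a B-tree index on the user column, "all records for one user" is an indexed search whose cost grows only logarithmically with total table size, not a full scan:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE records (
        id      INTEGER PRIMARY KEY,
        user_id INTEGER NOT NULL,
        payload TEXT
    )
""")
# One index turns "all records for a user" into a B-tree search
# instead of a scan over every row in the table.
conn.execute("CREATE INDEX idx_records_user ON records (user_id)")

# 100,000 rows spread over 1,000 users (100 rows each).
conn.executemany(
    "INSERT INTO records (user_id, payload) VALUES (?, ?)",
    ((i % 1000, f"payload {i}") for i in range(100_000)),
)

# Ask the query planner how it will execute the per-user query:
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM records WHERE user_id = ?", (42,)
).fetchall()
print(plan[0][-1])  # e.g. "SEARCH records USING INDEX idx_records_user (user_id=?)"

count = conn.execute(
    "SELECT COUNT(*) FROM records WHERE user_id = ?", (42,)
).fetchone()[0]
print(count)  # 100
```

The same principle holds in MySQL, PostgreSQL, etc.: one well-indexed table generally scales fine into the millions of rows, and the planner output is how you verify the index is actually being used.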
You need to look more closely at your application and how it will interact with the data in the database. For example, is the data read-only, i.e. will it never change once written? And will users access only the current year's data, or all of it?
On the flip side, how stable is the schema? Could table changes happen in the future? Imagine having to apply every schema change to thousands of tables.
If you are going to be managing large amounts of data, have a look at partitioning, which is designed for exactly this. But you need a good partitioning policy to get the most out of the feature, and for that you need to fully understand your application and what happens to the data.
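Real partitioning (e.g. MySQL's `PARTITION BY RANGE`) is handled by the engine, but the idea can be sketched by hand with the stdlib `sqlite3` module, since SQLite has no native partitioning. In this sketch (all names invented), rows are routed to a physical table chosen by a partition key — the event year, assuming a "current year vs. old years" access pattern — while the application still treats tickets as one logical dataset:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
YEARS = (2011, 2012, 2013)  # assumed partition key: event year

# One physical table per partition; a real engine does this
# bookkeeping for you behind a single logical table.
for year in YEARS:
    conn.execute(
        f"CREATE TABLE tickets_{year} "
        f"(id INTEGER PRIMARY KEY, email TEXT, year INTEGER)"
    )

def insert_ticket(email, year):
    # Route the row to the partition its key selects.
    # (year comes from the trusted YEARS tuple, so the f-string
    # table name is safe here; never interpolate user input.)
    conn.execute(
        f"INSERT INTO tickets_{year} (email, year) VALUES (?, ?)",
        (email, year),
    )

def tickets_for_year(year):
    # A query that filters on the partition key touches only one
    # partition ("partition pruning"); a query that doesn't would
    # have to scan them all.
    return conn.execute(f"SELECT email FROM tickets_{year}").fetchall()

insert_ticket("a@example.com", 2012)
insert_ticket("b@example.com", 2012)
insert_ticket("c@example.com", 2013)
print(len(tickets_for_year(2012)))  # 2
```

This is why the partitioning policy matters: it only pays off if your common queries filter on the chosen key.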
An example: a website where party organisers can add their parties. People who subscribe on that website (entering a name and e-mail address) get a mail with a ticket. The organisers can download the e-mail addresses and names of everyone who got a ticket (per party, or for all the parties they've held that year or in previous years).
Because it's a website, adding records (with name and e-mail) has to go smoothly, even for the 10-millionth record, and downloading the names/e-mails per party should be fast as well.
Only it's not actually a ticket website, and there are 2,000+ "party organisers" with roughly a thousand "tickets" per party.
It's not very likely that the tables will have to be changed.
So I'm wondering what's best: one table for all the tickets, or a separate table per organiser?
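For comparison, the single-table layout for this exact case can be sketched with stdlib `sqlite3` (schema and names are my own invention, not a prescription). One composite index covers both download shapes: its leftmost column answers "all tickets for an organiser", and the full pair answers "tickets for one party":

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tickets (
        id           INTEGER PRIMARY KEY,
        organiser_id INTEGER NOT NULL,
        party_id     INTEGER NOT NULL,
        name         TEXT,
        email        TEXT
    )
""")
# A composite index is usable by any query that filters on a
# leftmost prefix of its columns.
conn.execute(
    "CREATE INDEX idx_tickets_org_party ON tickets (organiser_id, party_id)"
)

conn.executemany(
    "INSERT INTO tickets (organiser_id, party_id, name, email) "
    "VALUES (?, ?, ?, ?)",
    [(1, 10, "Ann", "ann@example.com"),
     (1, 10, "Bob", "bob@example.com"),
     (1, 11, "Cid", "cid@example.com"),
     (2, 20, "Dee", "dee@example.com")],
)

# Download shape 1: one organiser, one party.
per_party = conn.execute(
    "SELECT name, email FROM tickets "
    "WHERE organiser_id = ? AND party_id = ?",
    (1, 10),
).fetchall()

# Download shape 2: everything the organiser has ever run.
all_parties = conn.execute(
    "SELECT name, email FROM tickets WHERE organiser_id = ?", (1,)
).fetchall()

print(len(per_party), len(all_parties))  # 2 3
```

With one table per organiser you would instead need 2,000+ identical tables, and both schema changes and cross-organiser queries become painful, which is the trade-off the answer above is pointing at.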