I need some database design guidance. I work with PostGIS, and I have an application that will receive lots of georeferenced files every day. Each file contains data for many points. Within a few years the application will probably have around a terabyte of point data stored.
I see two ways to design this:
1. Two tables: one for “uploaded_files” and one for “points” (one uploaded file to N points). I would then have to partition the points table, maybe by month …
2. Or I can create one table per file, which would mean thousands of tables within a few years.
Which approach is better for my application? Is there any problem with having thousands of tables in Postgres? Is there a better way to do this that I'm not thinking of?
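In case it helps, here is roughly what I have in mind for option 1 (table and column names are just placeholders; I'm assuming PostgreSQL 10+ declarative range partitioning and a PostGIS point geometry):

```sql
-- Placeholder sketch of option 1: one parent table for files,
-- one partitioned table for points.
CREATE TABLE uploaded_files (
    id          bigserial PRIMARY KEY,
    filename    text NOT NULL,
    uploaded_at timestamptz NOT NULL DEFAULT now()
);

-- Points partitioned by month on a timestamp column.
CREATE TABLE points (
    file_id     bigint NOT NULL REFERENCES uploaded_files (id),
    captured_at timestamptz NOT NULL,
    geom        geometry(Point, 4326) NOT NULL
) PARTITION BY RANGE (captured_at);

-- One partition per month, e.g.:
CREATE TABLE points_2024_01 PARTITION OF points
    FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

-- Spatial index on each partition.
CREATE INDEX ON points_2024_01 USING gist (geom);
```

I would create a new partition (and its GiST index) each month, either ahead of time or from a scheduled job.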