I have created a table of months and years from 2012 to 2015 (48 entries) that I am joining, via a Cartesian join, with a table of 150,000 subscriptions in order to project monthly revenue (calculated daily). The resulting query should be 7.2 million rows (150,000 x 48).
It is set up so that the initial 150k import table is right- and left-joined to tables defining renewal rates, etc., so that I have all of the data I need to calculate revenue. This intermediate query, called [Data], has many nested IIf statements to hammer out when and where the various renewal rates come into play, the largest of which is nested 10 levels deep.
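To give a feel for the structure without posting the full (very long) SQL, here is a simplified, hypothetical sketch of the kind of nesting involved; all of the table and field names here are invented:

```sql
SELECT s.SubID,
       IIf(s.SubType = "Annual",
           IIf(s.StartDate < #1/1/2013#, r.Rate2012,
               IIf(s.StartDate < #1/1/2014#, r.Rate2013, r.RateDefault)),
           r.MonthlyRate) AS AppliedRate
FROM Subscriptions AS s
LEFT JOIN RenewalRates AS r ON s.PlanID = r.PlanID;
```

The real expressions are like this but nested up to 10 deep, which is why I didn't paste them verbatim.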
The [Data] query is then joined with a Cartesian join against the table of months/years, called [DaysSQL], to create the final revenue set, named [Committed Revenue]. This [Committed Revenue] query also has a number of fairly complex IIf statements that it uses to create the end result.
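In outline (again simplified, with the complex IIf logic omitted and the revenue expression invented for illustration), the final query looks roughly like this:

```sql
SELECT d.MonthStart,
       q.SubID,
       q.AppliedRate * d.DaysInMonth AS Revenue
FROM [Data] AS q, [DaysSQL] AS d;
-- No join condition, so every [Data] row pairs with every [DaysSQL] row:
-- 150,000 x 48 = 7.2 million output rows.
```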
When I run the [Committed Revenue] query constrained to show only 2012 revenue, it works fine, albeit slowly. My problem is that when I try to run the query with all of the constraints off, it tells me that the result of the query is too large (over 2 GB) and it stops working.
The design of this database was supposed to be such that I could import this 150k-row file from a CSV on a regular basis and see how our revenue is impacted. I consider it imperative that everything be calculated from start to finish based on the original import.
Any ideas how to get this to run better? I would post the SQL, but the IIf statements are very long and make it essentially unreadable. My only thought is to have VBA materialize the [Data] query as a table, which I would expect to speed things up substantially, but if there is some way to bypass the 2 GB limit or restructure the database, I'd be interested to hear it.
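If it helps, the materialization idea I have in mind is just a make-table query along these lines (where `DataTable` is a name I'd pick for the staging table), which VBA would re-run after each import:

```sql
SELECT q.*
INTO DataTable
FROM [Data] AS q;
```

That way [Committed Revenue] would Cartesian-join against a stored table instead of re-evaluating all the nested IIf logic per row, though I don't know whether that alone gets me under the 2 GB ceiling.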
Thanks in advance for any ideas you guys come up with.