Before discussing the maximum datafile size, you should look at the query that needed so much temp space.
Sometimes it's reasonable to expand, but in my experience 98% of these queries are sort operations in sort/merge joins doing Cartesian products, because someone forgot a join predicate in a large query ...
If it's not a rogue query, here is the answer to your question:
The maximum size of a datafile is determined by 3 factors:
1.) the maximum file size of the operating system (if it's NTFS on your 64-bit Windows box, it's 16 TB - 64 KB)
2.) whether your TEMP tablespace is a BIGFILE or SMALLFILE tablespace (a standard installation uses SMALLFILE tablespaces)
3.) the block size of the tablespace (2 KB - 32 KB; the default is 8 KB, but 16 KB is quite common)
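You can check both the block size and the BIGFILE/SMALLFILE status directly in the data dictionary. A minimal sketch, assuming your temporary tablespace is named TEMP (adjust the name for your installation):

```sql
-- Block size and BIGFILE flag of the TEMP tablespace,
-- from the standard DBA_TABLESPACES dictionary view
SELECT tablespace_name,
       block_size,
       bigfile
  FROM dba_tablespaces
 WHERE tablespace_name = 'TEMP';
```

The BIGFILE column returns YES or NO, which tells you which of the two formulas below applies.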
The maximum size of a SMALLFILE datafile is:
2^22 * <blocksize_in_bytes>
which is 32 GB for an 8 KB block size and 64 GB for a 16 KB block size.
If it's a BIGFILE tablespace:
2^32 * <blocksize_in_bytes>
which translates to 32 TB for 8 KB tablespaces and 64 TB for 16 KB datafiles (but remember the NTFS limit of 16 TB, and be aware that a BIGFILE tablespace is limited to 1 datafile).
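As a sanity check, the arithmetic above can be reproduced with a short script (a sketch; the helper name is mine, the constants 2^22 and 2^32 are the per-datafile block limits quoted above):

```python
# Maximum Oracle datafile size = max_blocks_per_file * block_size.
# A SMALLFILE datafile addresses 2^22 blocks; a BIGFILE datafile addresses 2^32.

def max_datafile_bytes(block_size: int, bigfile: bool = False) -> int:
    """Return the theoretical maximum datafile size in bytes."""
    max_blocks = 2**32 if bigfile else 2**22
    return max_blocks * block_size

GB = 2**30
TB = 2**40

for bs in (8 * 1024, 16 * 1024):
    small = max_datafile_bytes(bs)
    big = max_datafile_bytes(bs, bigfile=True)
    print(f"{bs // 1024}K blocks: SMALLFILE {small // GB} GB, BIGFILE {big // TB} TB")
```

Running it prints 32 GB / 32 TB for the 8 KB block size and 64 GB / 64 TB for 16 KB, matching the figures above (before the OS-level NTFS cap is applied).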
You may want to read up on BIGFILE tablespaces and Oracle Database Limits.