Hello,

I have written a program which inserts rows (with a "COPY ... FROM STDIN" command) into a table defined as below:

CREATE TABLE dataflow
(
num_id_cycle int4,
ipproto int2,
ipLocal inet NOT NULL,
direction char,
ipExtern inet NOT NULL,
portLocal int4,
portExtern int4,
tcpFlags smallint,
incTraffic int8,
outTraffic int8,
incPkts int8,
outPkts int8,
firstTimestamp TIMESTAMP WITHOUT TIME ZONE,
lastTimestamp TIMESTAMP WITHOUT TIME ZONE,
ipoutsideCountryId int2,

PRIMARY KEY (num_id_cycle,ipLocal,ipExtern,ipproto,portLocal,portExtern),
constraint fk_num_cycle FOREIGN KEY (num_id_cycle) REFERENCES collect_cycle (num_id_cycle)
);
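
For reference, each batch is loaded with something roughly like the following (the column list is shown only as an illustration matching the table definition above; my program streams the rows over stdin):

COPY dataflow (num_id_cycle, ipproto, ipLocal, direction, ipExtern,
               portLocal, portExtern, tcpFlags, incTraffic, outTraffic,
               incPkts, outPkts, firstTimestamp, lastTimestamp,
               ipoutsideCountryId)
FROM STDIN;
-- followed by the tab-separated rows on stdin, ended with "\."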

My program inserts several thousand rows every quarter of an hour, and every 24 hours it runs:

DELETE FROM dataflow WHERE num_id_cycle < xy;
-- where xy is the num_id_cycle of the data that is 24 hours old

VACUUM ANALYZE dataflow;
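
(To clarify what xy is: it is the boundary cycle id, i.e. roughly the result of a query like the one below; the column name cycle_start is only a placeholder for illustration, not necessarily my real schema.)

SELECT min(num_id_cycle)
FROM collect_cycle
WHERE cycle_start > now() - interval '24 hours';  -- cycle_start: placeholder column name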

Thus, the database never contains more than 24 hours of data. However, the disk keeps filling up, and after a few months the PostgreSQL server crashes because there is no more disk space.

Why does the database size keep increasing, while the amount of data it contains stays constant?

Thanks in advance. Cheers