I currently want to fetch data from SQLite and insert it into Postgres. I know there's a feature like COPY, but the situation is that the SQLite databases capture data every day, and I have to fetch only the new data and insert it into Postgres. I wrote something like this:
import glob
import os
import sqlite3

# logs is a list of log identifiers; cur/conn are the psycopg2 cursor and connection (setup not shown)
for i in logs:
    for name in glob.glob('/mnt/log_%s.sq3.*' % i):
        if os.path.isfile(name):
            rows = sqlite3.connect(name).execute("SELECT * FROM log")
            for cnt, row in enumerate(rows, 1):
                cur.execute("INSERT INTO user VALUES (%s,%s)", row)
                if cnt % 1000 == 0:  # commit every 1000 rows
                    conn.commit()
            conn.commit()
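For the "new data only" part, one idea I have is to remember the last imported rowid per file and select only past it. This is a rough sketch (fetch_new_rows is a hypothetical helper, persisting last_rowid is omitted, and it assumes the log table is append-only so rowid only grows):

import sqlite3

def fetch_new_rows(path, last_rowid):
    """Return rows added since the previous run, identified by rowid."""
    conn = sqlite3.connect(path)
    # SQLite assigns an implicit, increasing rowid to ordinary tables
    return conn.execute(
        "SELECT rowid, * FROM log WHERE rowid > ? ORDER BY rowid",
        (last_rowid,),
    ).fetchall()

The rowid of the last returned row would then be stored somewhere (e.g. a small bookkeeping table in Postgres) and passed back in on the next daily run.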
There are actually thousands of SQLite files, approximately 10-20 million rows in total, and it takes more than 6 hours to insert the data into Postgres. Is there any way I can optimise this and load the data faster?
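For comparison, this is the kind of COPY-based load I have in mind, using psycopg2's copy_expert. The connection string and file name are placeholders, and it assumes the data contains no tabs, newlines, or NULLs (COPY's text format would need those escaped):

import io
import sqlite3
import psycopg2

pg = psycopg2.connect("dbname=mydb")       # placeholder connection string
cur = pg.cursor()

# Dump every row from one SQLite file into an in-memory tab-separated buffer
buf = io.StringIO()
src = sqlite3.connect("/mnt/log_x.sq3.1")  # placeholder file name
for row in src.execute("SELECT * FROM log"):
    buf.write("\t".join(str(col) for col in row) + "\n")
buf.seek(0)

# Load the whole buffer in a single COPY instead of millions of INSERTs
# ("user" is quoted because it is a reserved word in Postgres)
cur.copy_expert('COPY "user" FROM STDIN', buf)
pg.commit()

Is COPY from an in-memory buffer like this the way to go, or is there a better approach?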