How to avoid data loss in Python PostgreSQL bulk insertion


I want to insert data into PostgreSQL in batches of 1000 records (bulk insertion) to make it fast and keep the load on the DBMS low. My code:

cursor.execute("insert bar(first_name,last_name) values ('david', 'bar')") cursor.execute("insert bar(first_name,last_name) values ('david2', 'bar2')") cursor.execute("insert bar(first_name,last_name) values ('david3', 'bar3')") .... etc connection.commit() 

As you can see, I commit the changes only at the end, which saves me a lot of time compared to committing after every insert query. The problem is that if one query fails for some reason (e.g. invalid data), all the queries fail to take effect and I lose the data. Is there any way to keep the insertion fast and avoid data loss at the same time?
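
For reference, here is the same batching idea written out with psycopg2's executemany (a sketch only; the connection parameters are placeholders):

import psycopg2

# Placeholder connection parameters; adjust for your environment.
connection = psycopg2.connect(dbname="mydb", user="myuser", password="secret")
cursor = connection.cursor()

rows = [("david", "bar"), ("david2", "bar2"), ("david3", "bar3")]

# One batched call, one commit at the end: fast, but a single bad row
# aborts the whole transaction and none of the rows are saved.
cursor.executemany(
    "INSERT INTO bar (first_name, last_name) VALUES (%s, %s)",
    rows,
)
connection.commit()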

It depends on your requirements, of course. Depending on your transaction needs, I would recommend one of the following options:

1. Using savepoints (subtransactions):

begin;
savepoint sp;
insert ...;
release savepoint sp;
savepoint sp;
insert ...;
/* if you're getting an error */
rollback to savepoint sp;
commit;
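
From Python (assuming psycopg2, which the code in your question suggests), you can issue the savepoint commands yourself around each insert. A minimal sketch; the connection parameters and rows are placeholders:

import psycopg2

connection = psycopg2.connect(dbname="mydb", user="myuser", password="secret")
cursor = connection.cursor()

rows = [("david", "bar"), ("david2", "bar2"), ("david3", "bar3")]

for first_name, last_name in rows:
    # Mark a savepoint before each insert so a failure only rolls back
    # that single row instead of aborting the whole transaction.
    cursor.execute("SAVEPOINT bulk_sp")
    try:
        cursor.execute(
            "INSERT INTO bar (first_name, last_name) VALUES (%s, %s)",
            (first_name, last_name),
        )
    except psycopg2.Error:
        cursor.execute("ROLLBACK TO SAVEPOINT bulk_sp")
    else:
        cursor.execute("RELEASE SAVEPOINT bulk_sp")

# All rows that did not fail are committed together.
connection.commit()

Per-row savepoints add some overhead; a common compromise is to set a savepoint per chunk of rows rather than per row.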

2. Using autocommit:

set autocommit on;
insert ...
insert ...
insert ...
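
Depending on the client, set autocommit on may be a client-side setting (in psql it is \set AUTOCOMMIT on) rather than a server command; in psycopg2 the equivalent is the connection's autocommit attribute. A minimal sketch with placeholder connection parameters:

import psycopg2

connection = psycopg2.connect(dbname="mydb", user="myuser", password="secret")
connection.autocommit = True  # every statement is committed on its own
cursor = connection.cursor()

rows = [("david", "bar"), ("david2", "bar2"), ("david3", "bar3")]

for first_name, last_name in rows:
    try:
        cursor.execute(
            "INSERT INTO bar (first_name, last_name) VALUES (%s, %s)",
            (first_name, last_name),
        )
    except psycopg2.Error:
        # A failing row is simply skipped; rows inserted before it
        # are already committed and stay in the table.
        pass

Note that this reintroduces a commit per insert, which is exactly the overhead you were trying to avoid, so for large batches the savepoint option is usually the faster one.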
