Question about parallel processing for bulk inserts into DB

Hi,

I am trying to insert a few hundred thousand records, and this is crashing Python. Can I implement parallel processing? I have read about it, but I don't know how to implement it.

cursor.execute(("insert into tbl_query_detail(hash_value,file_name,package_name,QUERY_NUMBER,\
author_name,FULL_QUERY,BASE_QUERY_INDICATOR) values \
(A.NEXTVAL,:file_name,:package_name,:QUERY_NUMBER,:author_name,:FULL_QUERY,:BASE_QUERY_INDICATOR )"),(filename,Result_list[0],Result_list[ResultListIndex],'abc',Result_list[ResultListIndex + 1], Result_list[ResultListIndex +2]))


My code extracts queries from source code and inserts them into the database, which comes to almost 100,000 inserts. This is a one-time activity, so I am not bothered about performance; I am more worried about making sure Python does not hang partway through and that everything actually gets inserted into the DB. I had been using DCOracle2 on Unix, where it was working fine. I have now shifted to Windows and am using cx_Oracle as the DB-API.
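
From what I have read, cx_Oracle's executemany() with commits in batches might be the right approach. Here is a rough sketch of what I am considering; the connection details, the batch size, and my guess at the layout of Result_list are all placeholders:

import cx_Oracle

# Placeholder connection details -- substitute real ones.
connection = cx_Oracle.connect("user", "password", "host/service_name")
cursor = connection.cursor()

# Same statement as above, rewritten with positional binds for executemany().
sql = """insert into tbl_query_detail
         (hash_value, file_name, package_name, QUERY_NUMBER,
          author_name, FULL_QUERY, BASE_QUERY_INDICATOR)
         values (A.NEXTVAL, :1, :2, :3, :4, :5, :6)"""

# Assumed layout: Result_list[0] holds the package name, followed by
# (query_number, full_query, base_query_indicator) triples.
rows = []
for i in range(1, len(Result_list) - 2, 3):
    rows.append((filename, Result_list[0], Result_list[i], 'abc',
                 Result_list[i + 1], Result_list[i + 2]))

# Insert in chunks and commit after each, so a crash partway through
# does not throw away everything inserted so far.
BATCH_SIZE = 500
for start in range(0, len(rows), BATCH_SIZE):
    cursor.executemany(sql, rows[start:start + BATCH_SIZE])
    connection.commit()

Would something like this avoid the hang, or is parallel processing still worth trying?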


What is the best strategy to follow here? Please suggest.
 
