Hi All,
We build an in-memory query object and populate it with <cfset QueryAddRow(qryMasterFile) />.
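(For context, the in-memory query is built roughly along these lines; the column names here are placeholders, not our real field list:)

<cfset qryMasterFile = QueryNew("req_id,my_date", "integer,date") />
<cfloop array="#sourceRows#" index="row">
    <!--- sourceRows is a placeholder for wherever the raw data comes from --->
    <cfset QueryAddRow(qryMasterFile) />
    <cfset QuerySetCell(qryMasterFile, "req_id", row.req_id) />
    <cfset QuerySetCell(qryMasterFile, "my_date", row.my_date) />
</cfloop>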
When it is time to insert the data into the Oracle DB, we do:
<cftransaction action = "begin">
<cftry>
<cfquery name="deleteMasterFile" datasource="#application.str_dsn#">
DELETE FROM master_file
WHERE my_id = <cfqueryparam cfsqltype="cf_sql_integer" value="#arguments.my_id#">
</cfquery>
<cfoutput query = "qryMasterFile">
<cfquery name = "insertData" datasource = "#application.str_dsn#">
INSERT INTO MASTER_FILE ( list of our fields)
VALUES (sq_my_id.nextVal,
<cfqueryparam cfsqltype="cf_sql_integer" value="#arguments.req_id#" />,
<cfqueryparam cfsqltype="cf_sql_date" value="#qryMasterFile.DATE#"> <!--- ...and the rest of the fields --->
)
</cfquery>
</cfoutput>
<cfcatch type = "any">
<cftransaction action = "rollback" />
<cfset result['success'] = false />
<cfset result['errors'].msg = "Error occurred inserting data. #cfcatch.message# #cfcatch.detail#" />
</cfcatch>
</cftry>
</cftransaction>
As you can see, we make a round trip to the DB for every record. I was checking online and found that I can do bulk inserts with periodic commits, but I want to be able to roll back if any error occurs.
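For reference, the kind of bulk pattern I found looks roughly like this: a single INSERT ... SELECT with a UNION ALL row per record, so many rows go over in one statement. This is only a sketch with placeholder column names, and it is untested; for 100K rows the loop would also need to be chunked (e.g. a few hundred rows per statement) to stay under Oracle's bind-variable limits. The sequence is applied in the outer SELECT because Oracle restricts NEXTVAL inside set-operator subqueries.

<cfquery name="insertData" datasource="#application.str_dsn#">
    INSERT INTO master_file (my_id, req_id, my_date) <!--- placeholder column list --->
    SELECT sq_my_id.nextVal, t.req_id, t.my_date
    FROM (
        <cfloop query="qryMasterFile">
            <cfif qryMasterFile.currentRow GT 1>UNION ALL</cfif>
            SELECT
                <cfqueryparam cfsqltype="cf_sql_integer" value="#arguments.req_id#"> AS req_id,
                <cfqueryparam cfsqltype="cf_sql_date" value="#qryMasterFile.DATE#"> AS my_date
            FROM dual
        </cfloop>
    ) t
</cfquery>

Would something like this, wrapped in the same cftransaction, be the right direction?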
How can I improve the Oracle insert performance for 100K rows?
Thanks in advance!