
How to insert huge data (about 30 million records) into a TABLE faster?

Status
Not open for further replies.

dj77

Programmer
Apr 8, 2002
2
HK
Hi,
I am trying to insert 30 million rows from a flat file into a table. I have tried various approaches, but all of them take a long time.
1) I loaded the flat file into a temp table (direct load), and then my procedure applies the business logic and loads the rows into the main table.
The updates are surprisingly fast. I have dropped the PK and FK on the main table, so the insert should be faster, but in 14 hours I have managed only about 26% of the data. The total data size is 1.3 GB.
Is there a faster method?
I have checked the next extents (65 KB); should I increase them?
What about PCTINCREASE or any other parameters that might affect this?

I have also tried a BRI (before-row-insert) trigger. It has an amazing burst initially, but as the main table grows the load starts to crawl; row-by-row processing is slower anyway.

Please give me suggestions.

--dj
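
For what it's worth, the temp-table-to-main-table step described above can often be sped up with a direct-path insert. A minimal sketch, assuming hypothetical table names `stage_tab` and `main_tab`; the business-logic transformation is left as a placeholder:

```sql
-- Assumed names: stage_tab (the staging table loaded from the flat file),
-- main_tab (the target). NOLOGGING skips most redo generation for the
-- direct-path insert; take a backup afterwards, since the loaded blocks
-- cannot be recovered from the redo logs.
ALTER TABLE main_tab NOLOGGING;

-- The APPEND hint requests a direct-path insert: rows are written in new
-- blocks above the high-water mark, bypassing the buffer cache, instead of
-- being inserted one row at a time.
INSERT /*+ APPEND */ INTO main_tab
SELECT col1, col2  -- apply the business-logic transformations here
FROM stage_tab;

COMMIT;

-- Re-create the dropped PK/FK constraints after the load completes.
```

A set-based `INSERT ... SELECT` like this avoids the per-row overhead that makes trigger-based or row-by-row procedural loading crawl as the table grows.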
 
Use a natively compiled application program, such as C with embedded SQL.
If intelligently implemented, it is a lot faster...
 
Hi,
Are you talking about something like Pro*C? I don't have that compiler.
But I found the solution after some digging.
Basically, increasing the initial extent to roughly the size of the table helps a lot, and the next extents can be grown in larger chunks.
This should work faster, definitely.
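
A minimal sketch of that extent sizing, assuming the hypothetical table name `main_tab` and dictionary-managed tablespaces; the sizes below are illustrative, not measured:

```sql
-- Pre-size the table so the load does not pause repeatedly to allocate
-- thousands of small extents. Sizes are illustrative only.
CREATE TABLE main_tab (
  id    NUMBER,
  data  VARCHAR2(100)
)
STORAGE (
  INITIAL     1300M   -- roughly the full 1.3 GB data size, allocated up front
  NEXT        100M    -- large uniform chunks in case the estimate falls short
  PCTINCREASE 0       -- keep every subsequent extent the same size
);

-- For an existing table, INITIAL cannot be changed, but NEXT and
-- PCTINCREASE can:
ALTER TABLE main_tab STORAGE (NEXT 100M PCTINCREASE 0);
```

With PCTINCREASE 0 and a large NEXT, extent allocation happens a handful of times during the load instead of tens of thousands of times at the original 65 KB size.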
 