
Database Issue


smerck

Programmer
Jan 16, 2006

Hi,
We are currently working on Mercator 6.7. We have a performance requirement that says loading into the database must become five times faster. We accept several input formats (XML, X12, EDIFACT); these are converted to a flat file, loaded into the database, and later recreated in their original formats. We have created a map that maps the entire data as one row and loads it into the database. It works, but it is only about two times faster than the current process; we need to achieve five times faster.
Anybody with any new ideas is welcome.
Thanks,
smerck
 
Download 8.0 and run the map profiler on a copy of the map.
Use dstxpage.exe to tune the map.

Think about using burst mode and options like commit-by-statement.

Are you running this in the Event Server? What OS? What database? Are you using the latest native connectivity or ODBC?

You have to determine where the bottleneck is. If it is your network connection, then no amount of map or DB tweaking will help.
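
As a rough sketch (the file, host, account, and script names here are hypothetical), you can time the network hop and the database work separately:

    # 1. Network: time a raw copy of the flat file to the database host.
    time scp transactions.dat oracle@dbhost:/tmp/
    # 2. Database: time the inserts run directly on the DB host via a
    #    sqlplus script, so the network is out of the picture.
    time sqlplus user/pass @insert_test.sql

If step 1 dominates, tune the network, not the map.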




BocaBurger
<===========================||////////////////|0
The pen is mightier than the sword, but the sword hurts more!
 
Hi Boca,
To answer the questions you asked:
"Are you running this in the Event Server? What OS? What database? Are you using the latest native connectivity or ODBC?"
Yes, we are running this in the Event Server, on Unix, with Oracle 9i, and no, we are not using ODBC.


Hi eyetry,
The files each contain nearly 25k transactions.

Thanks,
smerck
 
You could use the Oracle bulk loader (SQL*Loader).
See thread BulkLoad OCI through Mercator
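
For reference, a minimal SQL*Loader control file for a one-transaction-per-row flat file might look something like this (the table and column names are made up for illustration):

    -- Minimal control file; table and column names are hypothetical.
    LOAD DATA
    INFILE 'transactions.dat'
    APPEND
    INTO TABLE transactions
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (txn_id, txn_type, txn_payload CHAR(4000))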
 
25k transactions could be nothing if each transaction is 1 KB; that works out to only about 25 MB. If each is 5 MB, that is a different story: over 100 GB in total.

If the bulk loader does not work out, see my other suggestions.



BocaBurger
<===========================||////////////////|0
The pen is mightier than the sword, but the sword hurts more!
 
From my experience, if your load is 10 MB or less, using the DBID and the Oracle ODBC driver is reasonably fast. Once loads get much bigger, it is really more efficient to use a bulk loader.

Keep in mind, though, that what you gain in performance you lose in flexibility; i.e., sqlldr commits regularly and will fail based on the settings in your loader script (fail on 1/5/50/100 errors, etc.). Loading via the DBID/ODBC lets you choose whether or not to commit on failure. If leaving bad data in your tables isn't an issue and you have large amounts of data to load, use sqlldr.
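
To illustrate those knobs (the account, control file, and log names below are placeholders), the error threshold and commit interval are both set when sqlldr is invoked:

    # Abort after 50 rejected rows; commit every 5000 rows (conventional path).
    sqlldr userid=scott/tiger control=transactions.ctl log=transactions.log \
        errors=50 rows=5000

Raise rows= and the commit overhead drops, at the cost of more uncommitted rows lost if the load fails partway.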

 