
EDI Integration Time


NJAnalyst (MIS, US), May 6, 2003
We have issues with large integrations taking an inordinate amount of time to process. We have a distribution once a week of around 2,700 orders, with an average of 107 lines per order, that takes around 18 hours to integrate. We also have other customers that average 300 orders a day with 100 lines per order. We have two companies that allow us to run multiple integrations simultaneously, and we are talking to a couple more retailers that could double our inbound/outbound traffic. If anybody out there has come up with a workaround or solution to this issue, please let me know. Also, we do not currently have Flex but would consider the purchase if it would facilitate this project. Thank you for any response that I receive.

Jonathan Nelson
Business Systems Analyst
Horizon Group, USA
 
I think your processing time sounds excessive by at least double. I'll do some research for you with a client of mine that uses a lot of EDI integration: not your volume, but substantial. Their integration times are better than yours on a ratio basis, and they are using a NW 5.1 server with a 400 MHz processor over TCP/IP (not state of the art by any means). Why don't you list your hardware/network/version/database particulars while I check out my known sites? Maybe you need to dedicate a processor on the hardware to this activity to speed it up?

Are you aggressively purging your files? Those EDI inbound files, like the capture and audit files, become behemoths in short order. I am constantly initializing them at my big EDI site, as they degrade performance on the whole server when they get too large. If you are adding almost 300,000 records at a crack to the active order files, you could be running into a network traffic limitation, a physical limitation of your swap file, or something like that.
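
If you want to confirm which files have bloated before initializing anything, a quick query against SQL 2000's system tables will rank your tables by row count. This is just a sketch, and 'OECAPFIL' below is a placeholder, so substitute whatever your capture and audit files are actually named:

    -- Rank user tables by row count so the bloated EDI files stand out
    -- (SQL Server 2000: indid 0 = heap, indid 1 = clustered index).
    SELECT o.name AS table_name, i.rows AS row_count
    FROM sysobjects o
    JOIN sysindexes i ON i.id = o.id AND i.indid IN (0, 1)
    WHERE o.xtype = 'U'
    ORDER BY i.rows DESC

    -- Then check the disk space a suspect file is using.
    -- 'OECAPFIL' is a placeholder name, not necessarily the real one.
    EXEC sp_spaceused 'OECAPFIL'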
 
Hello MacolaHelp,

The server is a Compaq ProLiant ML570 with dual 1 GHz processors, 3 GB of RAM, Windows 2000 SP2, SQL Server 2000, and 10/100 NICs over TCP/IP. The workstations are both P4 1.8 GHz machines with 512 MB of RAM.

We purge monthly, but we have tried initializing the capture file and the next integration showed no improvement. We have two companies in Macola: one is 13 GB and one is 54 GB. We also have a 40-user license and around 35 active daily users. We have talked about throwing more hardware at it but would like to find out whether that will help before making such a large investment.

The next thing we are going to try is splitting the app server and SQL Server onto separate boxes. We have another dual box with similar specs that is currently used only as our Gentran server, and it is showing low utilization. Any ideas or experience that could help would be greatly appreciated. Thank you for your time.




Jonathan Nelson
Business Systems Analyst
Horizon Group, USA
 
I was just at my major EDI site today, and as it turns out their pure EDI line-item volume is not even close to yours. Based on our calculations of today's orders, we think 18 hours for 300K records (your 2,700 orders at 107 lines each works out to roughly 289,000 lines) is pretty good.

How about your iminvtrx file? That would get huge quickly given your order volume. Since a bunch of files are updated during the integration process, you really should have plenty of free space on the server to handle the swap file.

Have you experimented with moving the swap file onto a different partition with lots of free space?

Can you break the inbound file into smaller chunks and run from more client workstations simultaneously?

How about your transaction logs for SQL? Is there enough room to save them, and how often are you running your backup maintenance plan during the day?
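
For what it's worth, here is roughly what I would run to see whether the log itself is the problem. WITH TRUNCATE_ONLY is valid on SQL 2000 but breaks the log backup chain, so only use it if you can take a full backup afterward; 'data01' and 'data01_log' are placeholder database and logical log file names:

    -- How full is each database's transaction log? (SQL Server 2000)
    DBCC SQLPERF(LOGSPACE)

    -- If the log has ballooned, truncate and shrink it outside the
    -- integration window, then take a full backup to restart the chain.
    BACKUP LOG data01 WITH TRUNCATE_ONLY
    DBCC SHRINKFILE (data01_log)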

My understanding is that SQL will happily use all the hardware you throw at it, but I would suggest soliciting some network expert opinions on that.
 
Have you determined whether the bottleneck is at the server or over the network? If it is over the network, you might try gigabit networking. If it is on the server, hardware is so cheap these days that I would throw dual 3 GHz processors with a full boat of RAM and Ultra SCSI 160 (or whatever they're at now) at it. That would probably get you down to at least overnight. I have a client, although not on Macola, that has a job processing close to a million records on a SQL Server, and we got it down to six hours on a similar setup using 2.2 GHz Athlon MP processors and 2 GB of RAM. If we had used Intel Xeons we would probably have done a little better, but the AMD stuff is a lot cheaper and is really hot, especially compared to the Pentium 4s. Personally, I think 1 GHz processors are not enough animal to handle what you're doing.

 
Purging the capture file and the SDQ log file will help. However, be aware that on some 856 and 810 formats Macola gathers data from the capture file; if you purge the capture file, that info will no longer be available.

Other users may also delay the process. If a user is entering an order and has the OECTLFIL in use, the EDI integration process will wait until that person is done before continuing. If possible, only run the process at night when no one is using Macola, and break the orders up into smaller batches so they can be processed overnight.
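
If you suspect another user is holding things up mid-integration, the standard SQL 2000 procedures will show who is blocking whom:

    -- Run in Query Analyzer while the integration is crawling (SQL 2000).
    EXEC sp_who2    -- the BlkBy column shows which spid is blocking yours
    EXEC sp_lock    -- lists the locks each spid currently holds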

good luck
 