
Downloading Many Columns from an iSeries file very slow

Status
Not open for further replies.

navshiv (MIS), Feb 15, 2002, 91, US
Hi
I have a very simple SSIS package which truncates a SQL table and downloads a file from the AS400...that's it! However, compared with our DTS package it is very slow (it takes roughly twice as long).

I have managed to narrow it down to the number of columns being downloaded from the file: 87. I know that sounds like a lot, but the real question is why SSIS is so much slower at this than DTS. I changed the source data flow component from a SELECT * to SELECT column1, column2, etc. With 50 columns it ran very fast, but as I added more columns the package started to slow down.

I've played about with the DefaultBufferMaxRows and DefaultBufferSize properties but I'm still not getting any improvement.
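For anyone else tuning those two properties: as I understand it, SSIS caps each pipeline buffer by both DefaultBufferMaxRows and DefaultBufferSize divided by the estimated row width, so a wide 87-column row can quietly reduce the rows per buffer. A rough back-of-envelope sketch (the buffer defaults are the usual SSIS 2005 values, and the per-column width is an assumption, not measured from the real file):

```python
# Rough model of SSIS buffer sizing: rows per buffer is limited by
# BOTH DefaultBufferMaxRows and DefaultBufferSize / bytes-per-row.
# The defaults below are the usual SSIS 2005 values.

def rows_per_buffer(row_bytes,
                    default_buffer_size=10485760,   # 10 MB default
                    default_buffer_max_rows=10000):
    """Estimated rows SSIS fits in one buffer: the smaller of the
    row-count cap and how many rows fit in the byte budget."""
    return min(default_buffer_max_rows, default_buffer_size // row_bytes)

# Hypothetical row: 87 columns averaging 20 bytes each (an assumption).
row_bytes = 87 * 20  # 1740 bytes per row

print(rows_per_buffer(row_bytes))       # narrow-ish rows
print(rows_per_buffer(row_bytes * 2))   # double-width rows, e.g. Unicode
```

The point of the sketch: doubling the row width (for example when every column arrives as Unicode) halves the rows per buffer, which is one way a wide source can slow a transfer even when the buffer properties look generous.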

I'm fairly new to SSIS so I'm not really sure where else to look.

The source connection uses the Native OLE DB\IBM DB2 UDB for iSeries IBMDASQL OLE DB Provider. I have tried other connection types, including ODBC, but still saw no improvement.

Any ideas at all?
 
What is the total size for a single row? Playing with the buffer settings won't help unless you really understand what they do and how they work. Are there any other tasks in the package? How much available memory is on the server? What is the Disk IO? Is your package spooling buffers?


Creating fast SSIS packages requires some understanding of what is actually happening and how SSIS interacts with both the hardware and the database.

Most things done in DTS can and will be faster in SSIS if built correctly.

Paul
---------------------------------------
Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
Thanks for the posts. I've done a new test with a new data flow using the ADO.NET ODBC provider and this ran a lot faster (took half the time). Not sure why it didn't work in my first round of testing.

The only thing is that I have to put a Data Conversion between the source and destination, as ODBC seems to be reading everything in as Unicode strings. I can't really change the SQL columns to NVARCHAR, unfortunately.
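For what it's worth, what that Data Conversion step is doing is essentially a DT_WSTR to DT_STR code-page conversion. A minimal Python sketch of the idea (the target code page and the sample strings are assumptions for illustration):

```python
# Sketch of what converting Unicode (DT_WSTR) columns to single-byte
# strings (DT_STR) involves: encoding to a target code page.
# Characters with no mapping in that code page are where such
# conversions can fail or truncate data.

def to_code_page(value, code_page="cp1252"):
    """Encode a Unicode string into the given single-byte code page.
    Raises UnicodeEncodeError if a character has no mapping there."""
    return value.encode(code_page)

print(to_code_page("PRODUCT-01"))   # plain ASCII converts cleanly
print(to_code_page("caf\u00e9"))    # é has a cp1252 mapping (0xE9)
try:
    to_code_page("\u0151")          # ő has no cp1252 mapping
except UnicodeEncodeError:
    print("no mapping in target code page")
```

This is also why the conversion is cheap when the data really is plain ASCII: every character maps one-to-one and nothing can fail.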

I'm going to run a couple more tests and I'll update this post with how I get on.
 
The good news is that, performance-wise, your Data Conversion task has very little overhead. I haven't used the ADO.NET ODBC source, but have you looked at the advanced properties? Some data sources allow you to specify the data type of your output columns.

Right-click on the task and select Show Advanced Editor.

Paul
---------------------------------------
Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
Hi

I did try changing the data type of the output columns, but I keep getting an error (for security reasons I have renamed the source file to 'FILE' in the message below):

Error at Download SOURCEFILE to Product_Temp [SOURCEFILE [1]]: The data type of output columns on the component "SOURCEFILE" (1) cannot be changed.

Error at Download SOURCEFILE to Product_Temp [FILE [1]]: System.Runtime.InteropServices.COMException (0xC020837D)
at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.SetOutputColumnDataTypeProperties(Int32 iOutputID, Int32 iOutputColumnID, DataType eDataType, Int32 iLength, Int32 iPrecision, Int32 iScale, Int32 iCodePage)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostSetOutputColumnDataTypeProperties(IDTSManagedComponentWrapper90 wrapper, Int32 iOutputID, Int32 iOutputColumnID, DataType eDataType, Int32 iLength, Int32 iPrecision, Int32 iScale, Int32 iCodePage)

I had 2 jobs, one with an OLE DB connection and one with ODBC, scheduled on SQL Server, but they both failed because of password encryption errors or something. I think that's a whole new issue! I am pretty convinced that ODBC will resolve my performance issue, but I do have a concern about the credentials for the source system being saved.

I'm still confused as to what I should set the ProtectionLevel property to.

Also, do any of you use the ODBC connection? Do you tend to construct the connection string using the 'Build...' button, or do you just select the DSN from the drop-down list and then enter the user ID/password? Should these credentials be saved within the package so that they are picked up when the package is scheduled in SQL Server Agent?
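On the connection-string question, one alternative to picking a DSN from the drop-down is a DSN-less keyword string. A minimal Python sketch of assembling one (the driver name, host, and keyword names here are assumptions; check your installed iSeries ODBC driver's documentation for the real ones):

```python
# Sketch: assemble a DSN-less ODBC connection string from parts.
# The driver name, system name, and keywords below are placeholders,
# not verified against any particular iSeries ODBC driver version.

def build_conn_str(driver, system, uid, pwd=None):
    """Build a key=value;... ODBC connection string. Leaving pwd out
    mirrors not persisting the password inside the package."""
    parts = {"Driver": "{%s}" % driver, "System": system, "Uid": uid}
    if pwd is not None:
        parts["Pwd"] = pwd
    return ";".join("%s=%s" % kv for kv in parts.items())

print(build_conn_str("iSeries Access ODBC Driver", "AS400HOST", "ETLUSER"))
```

On the ProtectionLevel side: the default, EncryptSensitiveWithUserKey, encrypts the password with the key of the user who built the package, so it typically fails to decrypt when SQL Server Agent runs the job under a different account, which would match the password-encryption failures described above. Common workarounds are EncryptSensitiveWithPassword, or DontSaveSensitive combined with supplying the password at run time (for example via a package configuration).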

Sorry for all the questions. We haven't been put on a training course, and teaching yourself can be tricky when you have specific questions like these.

Thanks for any help.
 
