
When to stage? Improving poor performance.


smn198 (Technical User)
Oct 29, 2004
I've got a fairly simple package that takes a file, unpivots it and then sorts it. The problem is that this is taking forever. The file contains about 1.8 million rows and 9 columns (including the pivot column), so after unpivoting we end up with roughly 14.4 million rows. It uses stacks of memory (our test and development servers only have 4GB, and production only 12GB). Would I be better off staging the data to a temp database and doing the unpivot and sort there? Are there any free articles out there that cover this sort of thing, or do I just need to use the good old trial and error approach?
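
Something like this is what I had in mind for the staging route (the table and column names below are just placeholders for my real ones): load the raw file into a staging table, then let SQL Server do the unpivot and sort rather than the data flow.

    -- Placeholder objects: dbo.RawStage holds the 1.8 million raw rows,
    -- KeyCol is the row key and Col1..Col8 are the columns being unpivoted.
    SELECT  KeyCol,
            Attr,
            Val
    FROM    dbo.RawStage
    UNPIVOT (Val FOR Attr IN (Col1, Col2, Col3, Col4, Col5, Col6, Col7, Col8)) AS u
    ORDER BY KeyCol, Attr;

That way the sort happens in the database engine, which can spill to tempdb, instead of inside the package's memory.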

Many thanks,
Steve
 