
Handling Very Large Strings 2


RobHudson

Programmer
Apr 30, 2001
Hi

I have an app that is generating *very* large strings - anything from a few KB to potentially hundreds of MB in size.

Needless to say, it becomes incredibly slow when the string gets bigger.

I am using a StringBuilder.

Does anybody know of a really efficient way of dealing with such large strings?

Would writing the StringBuilder to a temp file at stages be a faster way of doing it?
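
One way to do the staged-write idea is to flush each chunk straight to the temp file as it is built, so the in-memory builder never grows beyond a fixed size. A minimal sketch, assuming the rows come from a DataTable; BuildRowText and DumpTable are made-up names standing in for your own code:
Code:
using System;
using System.Data;
using System.IO;
using System.Text;

class StreamingDump
{
    // Made-up helper (not from this thread): formats one row as its output text.
    static string BuildRowText(DataRow row)
    {
        StringBuilder line = new StringBuilder();
        for (int i = 0; i < row.Table.Columns.Count; i++)
        {
            if (i > 0) line.Append(", ");
            line.Append(row[i]);
        }
        return line.ToString();
    }

    // Builds the output in small chunks and flushes each chunk to a temp file,
    // so memory use stays roughly flat however large the final output gets.
    static void DumpTable(DataTable table, string tempPath)
    {
        StringBuilder chunk = new StringBuilder();
        using (StreamWriter writer = new StreamWriter(tempPath, false))
        {
            foreach (DataRow row in table.Rows)
            {
                chunk.Append(BuildRowText(row));
                chunk.Append(Environment.NewLine);

                // flush roughly every 64 KB so the builder never gets huge
                if (chunk.Length > 64 * 1024)
                {
                    writer.Write(chunk.ToString());
                    chunk.Length = 0;
                }
            }
            writer.Write(chunk.ToString()); // write whatever is left
        }
    }
}
Reading the file back (or just pointing the user at it) then becomes a separate, much cheaper step.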

Cheers
Rob
 
Hi JurkMonkey :)

It's kind of a long story - but I am looping through all the data in a table and building INSERT statements so that I have an INSERT statement for each row of data in the table.

The string output is then shown on the form.

It's equally slow whether I write directly to the output control or build the string in the StringBuilder first.
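
A side note on the control being just as slow: one common culprit is appending to the control's Text property inside the loop, which copies the whole existing text on every assignment. A minimal sketch of the two patterns, assuming a WinForms TextBox and a made-up BuildInsert helper:
Code:
using System;
using System.Data;
using System.Text;
using System.Windows.Forms;

class DisplayHelper
{
    // Made-up helper: stands in for whatever builds one INSERT statement.
    static string BuildInsert(DataRow row)
    {
        return "insert into [mytable] values ('" + row[0] + "')"; // naive, illustration only
    }

    // Slow pattern: every += re-copies the control's entire existing text.
    static void ShowSlow(TextBox outputBox, DataTable table)
    {
        foreach (DataRow row in table.Rows)
        {
            outputBox.Text += BuildInsert(row) + Environment.NewLine;
        }
    }

    // Faster pattern: build the whole thing once, assign it to the control once.
    static void ShowFast(TextBox outputBox, DataTable table)
    {
        StringBuilder sb = new StringBuilder();
        foreach (DataRow row in table.Rows)
        {
            sb.Append(BuildInsert(row));
            sb.Append(Environment.NewLine);
        }
        outputBox.Text = sb.ToString();
    }
}
Even the faster version still hits the control's own limits once the text runs into hundreds of MB, which is why the file-based suggestions below are the better fix.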

Cheers
Rob
 
RobHudson said:
INSERT statements so that I have an INSERT statement for each row of data in the table. The string output is then shown on the form.
The first part I can see; the second I don't. If hundreds of MB of strings are displayed to a user, there is too much information to process and anomalies will be missed.
I would argue there is no purpose in displaying thousands of insert statements to the user (admin or end user). Summary info, absolutely - maybe even a drill-down to the detail (beyond the current scope) - but not all the data at once.

Say there are 100,000 records, which equals 100,000 insert statements. Break them into 1,000-record chunks and append each chunk to a text file.
Code:
using System;
using System.Data;
using System.IO;
using System.Text;

int totalRecordCount = GetTotalRecordCount(); // however you get the number of records
int currentRecordCount = 0;

// column list written out once and reused for every statement
string start = "insert into [mytable] ([column1],[column2],[columnN])";

StringBuilder sql = new StringBuilder();

DataTable results;
while (currentRecordCount < totalRecordCount)
{
   using (results = FetchNext1000Records())
   {
      foreach (DataRow row in results.Rows)
      {
         // add further columns the same way, up to [columnN]
         sql.AppendFormat("{0} values ({1}, {2}, {3})", start, row[0], row[1], row[2]);
         sql.AppendLine();
      }
   }

   // append this chunk to the file, then reset the builder for the next chunk
   using (TextWriter writer = File.AppendText(@"c:\sql\records.sql"))
   {
      writer.Write(sql.ToString());
   }
   sql.Length = 0;
   currentRecordCount += results.Rows.Count;
}
In the end your text file will be huge, so maybe writing the chunks to multiple files would work better:
Code:
   string filename = string.Format(@"c:\sql\records_{0}-{1}.sql", currentRecordCount, currentRecordCount + results.Rows.Count);
   using (TextWriter writer = File.CreateText(filename))
   {
      writer.Write(sql.ToString());
   }
   sql.Length = 0;
   currentRecordCount += results.Rows.Count;
There are other options for dumping a db: you can ship logs, detach dbs, and use bulk inserts. Not knowing your exact needs, though, they may not work for you.
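
If bulk inserts do fit, a minimal sketch using SqlBulkCopy; the connection string and destination table name are placeholders, and it assumes a SQL Server target with the source rows available in a DataTable (or fed in chunks):
Code:
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    // Pushes the rows of a DataTable into the destination table directly,
    // instead of generating one INSERT statement per row.
    static void CopyTable(DataTable source, string destConnectionString)
    {
        using (SqlBulkCopy bulk = new SqlBulkCopy(destConnectionString))
        {
            bulk.DestinationTableName = "[mytable]";   // placeholder table name
            bulk.BatchSize = 1000;                     // send 1,000 rows per batch
            bulk.WriteToServer(source);
        }
    }
}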

Jason Meckley
Programmer
Specialty Bakers, Inc.
 
What exactly are you displaying the results in? A textbox doesn't handle that many lines!

What you may want to consider is breaking out each table into its own separate file.
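
A minimal sketch of that per-table split, assuming the tables arrive in a DataSet; BuildInsert is again a made-up stand-in for whatever builds one statement (it wraps values in quotes but does no escaping, purely for illustration):
Code:
using System.Data;
using System.IO;
using System.Text;

class PerTableDump
{
    // Made-up helper: builds one naive INSERT for a row (no escaping of values).
    static string BuildInsert(DataRow row)
    {
        StringBuilder values = new StringBuilder();
        for (int i = 0; i < row.Table.Columns.Count; i++)
        {
            if (i > 0) values.Append(", ");
            values.Append("'").Append(row[i]).Append("'");
        }
        return "insert into [" + row.Table.TableName + "] values (" + values + ")";
    }

    // Writes each table in the DataSet to its own .sql file instead of one giant string.
    static void DumpAllTables(DataSet data, string folder)
    {
        foreach (DataTable table in data.Tables)
        {
            string path = Path.Combine(folder, table.TableName + ".sql");
            using (StreamWriter writer = new StreamWriter(path, false))
            {
                foreach (DataRow row in table.Rows)
                {
                    writer.WriteLine(BuildInsert(row));
                }
            }
        }
    }
}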

 
Thanks to you both :)

It sounds like going down the file route is the way to go.

Yes, it is displaying to a textbox, so I'm guessing I will need to put some kind of checking in there as well!
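
For that checking, a minimal sketch that only shows the first part of the output in the textbox and points at the full file; the size cap and names are arbitrary placeholders:
Code:
using System;
using System.Windows.Forms;

class PreviewHelper
{
    const int MaxPreviewChars = 100000; // arbitrary cap: roughly 100 KB of preview text

    // Shows at most MaxPreviewChars of the generated SQL in the textbox and
    // notes where the full output file was written.
    static void ShowPreview(TextBox outputBox, string sql, string fullFilePath)
    {
        if (sql.Length <= MaxPreviewChars)
        {
            outputBox.Text = sql;
        }
        else
        {
            outputBox.Text = sql.Substring(0, MaxPreviewChars)
                + Environment.NewLine
                + "... output truncated; full script saved to " + fullFilePath;
        }
    }
}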

Cheers
Rob
 