kev1597770
Programmer
I have text files of 500MB+ that I need to process line by line. Currently I am using the basic code structure below, and the process takes a few hours.
I tried using threads to process parts of the same source file, and I tried reading characters from the file into a buffer of a few thousand characters. Both of these approaches slowed the process down.
What is the most efficient C# algorithm to process large files?
//########################################################
using (StreamReader sr = new StreamReader(sourcePath))
using (StreamWriter sw = new StreamWriter(destinationPath))
{
    string strFileLine;
    while ((strFileLine = sr.ReadLine()) != null)
    {
        // Perform some parsing and formatting on strFileLine
        sw.WriteLine(...); // actual output elided
    }
}
//########################################################
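For reference, the buffered-read attempt looked roughly like this (a minimal sketch from memory; the 4096 buffer size and the line-splitting details are illustrative, not the exact code I ran):
//########################################################
char[] buffer = new char[4096]; // illustrative size; I tried a few thousand chars
using (StreamReader sr = new StreamReader(sourcePath))
using (StreamWriter sw = new StreamWriter(destinationPath))
{
    int charsRead;
    string carryOver = string.Empty; // partial line left over from the previous read
    while ((charsRead = sr.Read(buffer, 0, buffer.Length)) > 0)
    {
        string chunk = carryOver + new string(buffer, 0, charsRead);
        int start = 0, newlineIndex;
        // Emit every complete line found in this chunk.
        while ((newlineIndex = chunk.IndexOf('\n', start)) >= 0)
        {
            string line = chunk.Substring(start, newlineIndex - start).TrimEnd('\r');
            // Perform some parsing and formatting on line, then write it out.
            sw.WriteLine(line);
            start = newlineIndex + 1;
        }
        carryOver = chunk.Substring(start); // keep the trailing partial line
    }
    if (carryOver.Length > 0)
        sw.WriteLine(carryOver); // final line with no trailing newline
}
//########################################################
This performed worse than plain ReadLine for me, which is why I am asking what approach actually scales.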