Thank you, Trojan and Kevin, for giving me a good lesson on comparison using hashes. However, I have some concerns about memory and hashes. I want to use the same structure shown below to load all my data into hashes, so I can compare the incoming data before I insert or update it.
I have 15,000 rows of incoming data to compare against 1,000,000 rows of existing data, both loaded into memory the same way for the comparison. I have 8 GB of memory on my box. How come I get "Out of Memory!" while running the code? Does anyone know the best way to handle a comparison of this much data in Perl? Or what language is the best choice for this case?
Code:
# Column order from the select: $fields[0] = studentID,
# $fields[1] = studentName, $fields[2] = courseID.
my $statement = "select studentID, studentName, courseID from student_table";
if ($db->Sql($statement))
{
    # ..doing error handling
}
while ($db->FetchRow())
{
    my @fields = $db->Data();
    # lookup: studentName => studentID => courseID
    $tempids{$fields[1]}{$fields[0]}{$fields[2]} = 1;
    # reverse lookup: studentName => courseID => studentID
    $rev_tempids{$fields[1]}{$fields[2]}{$fields[0]} = 1;
    # comma-separated course list per (studentName, studentID)
    $tempidstrings{$fields[1]}{$fields[0]} .= "$fields[2],";
}
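One direction I have been considering (untested, just a sketch) is to keep only the 15,000-row incoming set in memory as a single flat hash keyed on "studentID|studentName|courseID", and then stream the 1,000,000 existing rows past it, so the big table never has to be held as three nested hashes at once. The incoming_table name below is only a placeholder for wherever the new data comes from, and I am assuming the same $db object and Sql/FetchRow/Data calls as above.
Code:
# Untested sketch: keep only the small incoming set in memory.
my %incoming;   # "studentID|studentName|courseID" => 1

# Load the 15,000 incoming rows into one flat hash.
if ($db->Sql("select studentID, studentName, courseID from incoming_table"))
{
    # ..error handling as above
}
while ($db->FetchRow())
{
    my ($id, $name, $course) = $db->Data();
    $incoming{"$id|$name|$course"} = 1;
}

# Stream the 1,000,000 existing rows; never hold them all at once.
if ($db->Sql("select studentID, studentName, courseID from student_table"))
{
    # ..error handling as above
}
while ($db->FetchRow())
{
    my ($id, $name, $course) = $db->Data();
    # Row already exists, so no insert/update is needed for it.
    delete $incoming{"$id|$name|$course"};
}

# Whatever is left in %incoming is new or changed and needs an insert/update.
Would something like this avoid the out-of-memory problem, or is there a better pattern (or a better language) for it?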
Thanks,
Lucas