I have a large MySQL table with 4 million rows, and I need to perform an operation on each row. I am using Perl with DBI.
When I use the following code on a small table, it works just fine, but on the large table it gets stuck on line 3. As soon as it executes line 3, my hard drive starts churning and never stops; it never reaches line 4. My diagnosis follows the code fragment below:
-------------------------------------code fragment begin
1 my $sql = 'select myColumn from myTable';
2 my $sth = $dbh->prepare($sql) || die("prepare failed: " . $dbh->errstr);
3 $sth->execute() || die("execute failed: " . $sth->errstr);
4 my $colValue;
5 $sth->bind_columns(\$colValue);
6 while ($sth->fetch()) {
7     # do something with $colValue
8 }
-------------------------------------code fragment end
I think this is because the MySQL client is trying to buffer all 4 million rows in memory. Is there a way, perhaps a switch somewhere, to make it process one row at a time instead? I have tried adding SQL_NO_CACHE to my SELECT statement, but that made no difference at all.
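While reading the DBD::mysql documentation I also came across a mysql_use_result attribute, which sounds like it might do what I want. The sketch below shows how I understand it would be used (the connection details are placeholders, not my real values), but I have not been able to confirm this is the right approach:

-------------------------------------code fragment begin
use strict;
use warnings;
use DBI;

# Placeholder connection details.
my $dbh = DBI->connect('DBI:mysql:database=myDatabase', 'myUser', 'myPassword',
                       { RaiseError => 1 });

# mysql_use_result tells the driver to stream rows from the server as
# they are fetched (mysql_use_result) instead of buffering the entire
# result set on the client first (mysql_store_result).
my $sth = $dbh->prepare('select myColumn from myTable',
                        { mysql_use_result => 1 });
$sth->execute();

my $colValue;
$sth->bind_columns(\$colValue);
while ($sth->fetch()) {
    # do something with $colValue
}
-------------------------------------code fragment end

If I am reading the docs correctly, the trade-off is that the connection stays tied up until every row has been fetched, which I think I can live with.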
On a related note, when I was trying to export the four million rows using the mysql command-line client, I had a similar problem, but there I found the -q switch (from the manual: -q or --quick means "Don't cache result, print it row by row.").
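For reference, the export invocation that worked looked roughly like this (the user, database, and output file names are placeholders):

-------------------------------------code fragment begin
mysql -q -u myUser -p -e 'select myColumn from myTable' myDatabase > myColumn.txt
-------------------------------------code fragment end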
I hope there is something similar for the Perl problem above. Or am I going about this completely wrong? I am a newbie to MySQL.