Maybe not so much a MySQL question, but perhaps someone knows a trick:
I have a MySQL database with a lot of tables (over 20,000). Yes, I really need this many: each table represents a category, and I need to run detailed searches against each category.
The issue is with inserts. I use Perl DBI for this.
Normally I use placeholders, i.e.
$str = $dbh->prepare("INSERT INTO $tablename VALUES (?,?,?,...)");
once at the beginning, and then, in the loop that reads the data,
$str->execute(@values);
Now the problem is that the data arrive in random order,
so I do not know a priori which table each row belongs to.
So what I would like to do is
$str = $dbh->prepare("INSERT INTO ? VALUES (?,?,...)");
and then
$str->execute($tablename, @values);
but of course that does not work: placeholders can only bind values, not identifiers such as table names.
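Since the table name cannot be a placeholder, a common middle ground is to interpolate the (safely quoted) table name into the SQL and prepare handles lazily, caching each one the first time its table is seen. A minimal sketch, assuming $dbh is an open DBI handle and three columns per table (adjust the placeholder list to the real column count):

```perl
# Sketch: lazily prepare and cache one statement handle per table.
my %sth;    # table name => prepared statement handle

sub insert_row {
    my ($tablename, @values) = @_;
    # quote_identifier guards against unusual characters in the name
    my $quoted = $dbh->quote_identifier($tablename);
    $sth{$tablename} ||= $dbh->prepare(
        "INSERT INTO $quoted VALUES (?,?,?)"
    );
    $sth{$tablename}->execute(@values);
}
```

DBI's built-in $dbh->prepare_cached($sql) does essentially the same caching, keyed on the SQL string, so only the tables actually encountered ever get a handle.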
It looks to me like I have two solutions. Either, for each entry read, do
push @{$hash{$tablename}}, [@values];
and at the end, for each hash key, create $str and run the inserts;
or prepare some 20,000 $strs at the very start,
save them in a hash, and then execute $str{$tablename}.
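If memory allows, the first option (buffer rows per table, then insert in one pass) can be sketched like this; read_next_entry is a hypothetical reader returning a table name followed by the row values, and $dbh is assumed connected:

```perl
# Sketch of the batching approach: buffer rows per table, insert at the end.
my %rows;    # table name => arrayref of row arrayrefs

while (my ($tablename, @values) = read_next_entry()) {
    push @{ $rows{$tablename} }, [@values];
}

for my $tablename (keys %rows) {
    my $quoted       = $dbh->quote_identifier($tablename);
    my $placeholders = join ',', ('?') x @{ $rows{$tablename}[0] };
    my $sth = $dbh->prepare("INSERT INTO $quoted VALUES ($placeholders)");
    $sth->execute(@$_) for @{ $rows{$tablename} };
}
```

Wrapping each per-table loop in a transaction ($dbh->begin_work ... $dbh->commit) usually speeds bulk inserts up considerably.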
Any other tricks?
Thanks, svar