
using perl to mass-insert mysql data files


svar (Programmer)
Aug 12, 2001
I am using MySQL 4.01 (with InnoDB).
I have a large application producing a few files every 10 minutes or so. These files (say file.sql) are of the form

set autocommit=0;
INSERT INTO TABLE1(field1,field2,..,fieldn) VALUES (val1,val2,...,valn);
.....
commit;

Obviously I can have a Perl script parse this file line by line and do the inserts via DBI.
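A bare-bones sketch of that DBI route might look like the following (the connection details are placeholders, and it assumes one statement per line, as in file.sql above):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# placeholder connection details -- adjust to your setup
my $dbh = DBI->connect("DBI:mysql:database=DATABASE", "user", "password",
                       { RaiseError => 1, AutoCommit => 0 });

open(my $fh, "<", "file.sql") or die "cannot open file.sql: $!\n";
while (my $line = <$fh>) {
    chomp $line;
    # skip the "set autocommit=0;" and "commit;" lines;
    # AutoCommit => 0 and the final commit below cover those
    next unless $line =~ /^INSERT/i;
    $dbh->do($line);
}
close $fh;

$dbh->commit;       # one commit for the whole file
$dbh->disconnect;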
However, I am trying to do this another way, partly because this is a large file, but mostly because it looks cleaner. I am trying to do it via

system("mysql -u user -ppassword DATABASE < file.sql");

However, I also want the following: if the bulk insert succeeds, I want to record in a file DONES that file.sql was correctly imported, and then delete that file; if it fails, I want to record in a file ERRORS that file.sql failed to import. However, something like
system("mysql -u user -ppassword DATABASE < file.sql > OUT");
open(OUTF, "OUT") or die "cannot open OUT\n";
my @msg = <OUTF>;
if (@msg) { ... write to ERRORS }
else { ... write to DONES }
does not work, because OUT does not contain the ERROR message; instead I get the "welcome to mysql..." stuff.

Any ideas how to do this?
Thanks, svar
 
Frankly, *I* would recommend having a program do the inserts, but I think you want to use

2>error_output_filename

to capture what mysql writes to stderr,
instead of

>normal_user_output_filename

which captures only stdout.

You can even have both if you want.
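For instance, a minimal sketch along those lines (the credentials, database, and file names are the placeholders from your post; checking system()'s return value is an extra safeguard, since mysql exits nonzero on failure):

#!/usr/bin/perl
use strict;
use warnings;

my $file = "file.sql";   # placeholder name from your post

# mysql writes errors to stderr, so redirect stdout and stderr separately
my $rc = system("mysql -u user -ppassword DATABASE < $file > OUT 2> ERRS");

# read whatever landed on stderr
open(my $efh, "<", "ERRS") or die "cannot open ERRS: $!\n";
my @errors = <$efh>;
close $efh;

if ($rc != 0 or @errors) {
    # import failed: note it in ERRORS, keep the file for inspection
    open(my $log, ">>", "ERRORS") or die "cannot open ERRORS: $!\n";
    print $log "$file failed to import:\n", @errors;
    close $log;
}
else {
    # import succeeded: note it in DONES and delete the file
    open(my $log, ">>", "DONES") or die "cannot open DONES: $!\n";
    print $log "$file was correctly imported\n";
    close $log;
    unlink $file or warn "could not delete $file: $!\n";
}

Checking $rc as well as the contents of the stderr file is worthwhile: stderr can also pick up harmless warnings, so the exit status is the more reliable signal of an actual failure.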

Steve
 
