
Script that databases in Oracle 8i hangs


leeBen

Programmer
Feb 13, 2002
7
US
I have a script that runs a while() loop and sleeps for 4 seconds between iterations. It reads files from a directory that is continuously fed small text files; the script reads each one in, parses it (the data is weather observations), then stores it in an Oracle 8i database.
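
The overall shape is roughly this (the directory name and parse_and_store() are stand-ins, not the real code):

#!/usr/bin/perl -w
use strict;

# 'c:/data/obs' and parse_and_store() are placeholders for the real names.
my $obs_dir = 'c:/data/obs';

while (1) {
    for my $file (glob "$obs_dir/*") {
        # read the file, parse the observations, insert into Oracle
        # parse_and_store($file);
    }
    sleep 4;    # pause before scanning the directory again
}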

Unfortunately, after about four hours (this length of time can vary dramatically) the script simply stops. Most of the time I can Ctrl+Break out of the loop and restart the script, but sometimes I have to kill the command shell (it is running on a Windows 2000 Pro PIII 700+ MHz machine) to restart.

I saw the thread by Kathy1 and re-installed ActiveState 5.6.1 but the problem persists. Are there any other 'tricks of the trade' to keep this type of application running?
 
Ok.

The script that's running all of the time - I would try and cut down the work it does, drastically.

What about splitting your system into two scripts?

The first script would run all of the time, it would look like this:

while (1) {
    system('gather_data.pl');
    sleep(4);
}

Nice and simple; not much chance for it to gather errors as it goes.

The second script, which would do all of the real work, is called gather_data.pl; most of what you have now would go in here.
Mike
michael.j.lacey@ntlworld.com
Email welcome if you're in a hurry or something -- but post in tek-tips as well please, and I will post my reply here as well.
 
I will certainly try that.

Question... do errors eat up memory? For example, if I get an "uninitialized string" error, does it use memory and then not release it? I ask because I had several of these, which I subsequently cleared up, and the run time increased. I still get one of these errors every so often.
 
That's very difficult to say. Perl shouldn't eat memory as it runs; variables are created and destroyed fairly reliably as scripts run.

Does the size of the Perl.exe process increase as the script runs?
Mike
michael.j.lacey@ntlworld.com
Email welcome if you're in a hurry or something -- but post in tek-tips as well please, and I will post my reply here as well.
 
Actually, yes. I sat and watched exactly that the other day (a lot like watching paint dry) and saw that it grew in (apparently) 4KB chunks. I realize the chunk size might be as much a Task Manager-ism as anything. Anyway, I noticed the size only grew when some errors went by. Unfortunately, the errors and the data went by faster than I could catch them... then it stopped occurring. This was late on Friday and I haven't had the chance to watch the paint dry again.
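
A way to catch those before they scroll away is to redirect warnings to a timestamped log file. A minimal sketch; the log file name here is made up:

# Append every warning to a log file with a timestamp, so it can be
# read later instead of scrolling off the console.
open(WARNLOG, '>>parser_warnings.log')
    or die "can't open warning log: $!";

$SIG{__WARN__} = sub {
    print WARNLOG scalar(localtime), " - ", $_[0];
};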

What about DBI errors, like insert statements that fail? I have my insert statements inside an eval block and I occasionally see a "data too long for column" type error. This may be the one I saw the growth with, but I didn't see it recur before I went home... any idea?
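
For reference, the usual DBI shape for that sort of eval-wrapped insert is to turn on RaiseError so a failed execute dies, then trap and log it. A sketch only; the connection string, table, and columns are placeholders:

use DBI;

# RaiseError makes DBI die on a failed execute, so the eval traps it
# and the script can log the offending row instead of carrying on blind.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 })
    or die $DBI::errstr;

my $sth = $dbh->prepare('insert into obs (station, temp_f) values (?, ?)');

my ($station, $temp_f) = ('KABC', 72);    # example data only
eval {
    $sth->execute($station, $temp_f);
};
if ($@) {
    warn "insert failed for station '$station': $@";
}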
 
I'm doing something similar myself at the moment, watching SQL/Server count how many rows it has in a table.... (about 4.8 million, but I need to know better than "about", unfortunately)

I don't know whether DBI errors leak memory; I doubt it, frankly. You occasionally see a bug posted on the DBI mailing list about memory leaks, but it's not common there.

If you think you've found a DBI bug then the mailing list (dbi-users@isc; subscribe by sending mail to dbi-users-subscribe@perl.org) is the place to go.

Before you do that though....

Make sure you can reproduce the problem.
Make sure you're using the latest version of everything
Make sure you can reproduce the problem.
Write a nice small script that demonstrates the problem.
Make sure you can reproduce the problem.

I wouldn't bother, though; it's not the friendliest mailing list in the world. I'd just work around the problem.
Mike
"Experience is the comb that Nature gives us after we are bald."

Is that a haiku?
I never could get the hang
of writing those things.
 
Ok, I tried splitting up the script (which works nicely), but unfortunately the 'sticking' still occurs. I have determined the problem is not memory, since the size of perl.exe never seems excessive (it loads at 10MB) and may or may not grow before sticking.

I have placed a bunch of little 'tracer' print lines at various points in the script to see where the failure occurs and to see if the failure is always at the same spot...stay tuned.
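
A detail that helps with tracer lines like these is to turn on autoflush and timestamp them, so the last line printed really is the last thing the script did before sticking. A small sketch:

$| = 1;    # autoflush STDOUT so tracer lines appear the moment they print

sub trace {
    print scalar(localtime), " - ", $_[0], "\n";
}

# example use around a suspect call:
trace('about to call execute');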

Lee
 
How annoying....

Ok. If you comment out everything to do with the DBI in your script (so that it just scans the directory, formats data for your neat SQL statements but then doesn't actually do anything) does it still fail? I hope that it still fails because that will mean the error is in your script - where it is under your control...

And the subtle approach (not).... Have you considered using the Windows 'at' utility to schedule the job rather than having a while loop?

And -- can you tell yet where the script is hanging?

And - A question I should have asked right at the start.

Is the Oracle database ok when the Perl process has "stuck"? Can you insert new records, using sqlplus, whilst the Perl process is stuck?
Mike
"Experience is the comb that Nature gives us after we are bald."

Is that a haiku?
I never could get the hang
of writing those things.
 
Do you keep a connection with the server the entire time or do you only create a connection when you have new information to add?
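
To illustrate the difference, the per-run approach connects and disconnects inside the worker script on every cycle rather than holding one handle open for hours. A hedged sketch; the connection string and credentials are placeholders:

use DBI;

# Connect at the top of each run of the worker script...
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 })
    or die "connect failed: $DBI::errstr";

# ... insert this cycle's observations here ...

# ... and disconnect before the script exits, so a stale connection
# can never carry over into the next cycle.
$dbh->disconnect;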
 
Problem: You 'occasionally see a "data too long for column" type error.'

Solution: In your insert statement, use Oracle's substr() function to limit the length of the data going into the table columns. E.g.:

insert into blah (mycolumn) values (substr(myvariable, 1, 44));  -- mycolumn is length 44
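
The same guard can be applied on the Perl side before the bind, if that is easier than changing the SQL. A sketch; the 44 is just the example column width, and $sth stands for the prepared insert handle:

my $myvariable = 'some observation text that might be longer than the column';

# Perl's substr is zero-based: keep only the first 44 characters before
# binding, so the value can never exceed the column width.
my $trimmed = substr($myvariable, 0, 44);
# $sth->execute($trimmed);   # $sth as prepared elsewhere in the script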

Sorry, I am not a Perl programmer (mostly Java and PL/SQL), so I don't know what database access from Perl is like. Are you using SQL embedded in Perl, or are you doing procedure calls? If ODBC/Perl database connectivity is anything like JDBC, then throwing away the reference to the statement might help: in Java you create a new CallableStatement object and only use it once. Excuse me for stating the obvious, but make sure you commit at the end of the processing script.

In Perl, is it possible to launch one script from another as a totally separate process?

pfist
 

You can create a separate process in Perl with fork()... at least kind of a totally separate process. The parent will want to wait() unless you ignore the child's signal; otherwise you'll get zombies, ooooohhhhhhhhhh. Other "things" are inherited too; I have one that runs under the same name as the parent. Also note the difference between system and exec.
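
A hedged sketch of that fork/exec arrangement (written Unix-style; on Win32 fork is emulated, so behaviour can differ):

# Tell Perl to reap children automatically so they never become zombies.
$SIG{CHLD} = 'IGNORE';

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: exec replaces this child process with the worker script,
    # so nothing after this line runs in the child.
    exec('perl', 'gather_data.pl') or die "exec failed: $!";
}

# Parent carries on here without waiting for the child.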
 
Well, here goes...

I have narrowed the 'sticking' point to two places:
1. the 'prepare' statement
2. the 'commit' statement
...the prepare is usually the sticking point.

Here is the flow of the script: I have a small script that runs in a while loop, repeatedly calling the primary workhorse script and then sleeping for three seconds.

The workhorse script makes a connection to the Oracle 8i database, or dies if it can't. Then I create a statement handle which I will later reuse for each observation I store. Next, I glob the directory which holds all of the observation bulletins to parse. One at a time, each file is read and parsed out (this works without a hitch). Once the data for one location is parsed out (some files have more than one location in them, others have only one), I call execute (with the list of variables holding the data) using the previously created handle. (Previously I created the handle at this point, but I read somewhere it is better to create it once and reuse it.) In the previous version the script usually stuck at this point. Now it sticks occasionally at the commit statement (I had turned AutoCommit off earlier, trying to see if forcing a commit would avoid the sticking). I have since turned AutoCommit back on and eliminated the commit statement. Unfortunately, I ran out of daylight and had to head home... so I'm not sure if this latest fix fixed it.

Anyway, once the location data has been stored, the script either proceeds to the next observation in the same file or to the next file. Once all the files are done, the database connection is closed, then control returns to the main script... which sleeps for 3 seconds and is back at it.
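
For reference, a stripped-down sketch of that prepare-once, execute-many flow; the table, columns, directory, and parse_file() are placeholders rather than the actual schema:

use DBI;

my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 })
    or die "connect failed: $DBI::errstr";

# Prepare once, then reuse the handle for every observation.
my $sth = $dbh->prepare(
    'insert into obs (station, obs_time, temp_f) values (?, ?, ?)');

for my $file (glob 'c:/data/obs/*') {
    for my $obs (parse_file($file)) {     # parse_file() is hypothetical
        eval {
            $sth->execute($obs->{station}, $obs->{obs_time}, $obs->{temp_f});
        };
        warn "insert failed in $file: $@" if $@;
    }
}

$dbh->disconnect;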

To answer Mike's questions:
1. As far as I can tell, when it doesn't actually make the database insert, the script doesn't have a problem... in fact it is very fast at doing nothing :)
2. AT... no, I haven't. I was warned the Windows scheduling thing is flaky.
3. I have not noticed any problems... though I have not specifically checked either. Usually when I restart the process it runs without problem... for a while.


Question: Can you outrun the DBI? Like make too many calls in too short a time? Our database is on a Dell 6450 quad-processor with 2GB of RAM.

Question 2: With the pre-made prepare statements (the ones with '?' for the values), how do you insert an Oracle 'to_date' function... or can you?

Pfist...will try the substr() thing tomorrow.

Regards,
Lee
 

insert into blah_table ( mydatecolumn )
values ( to_date( ? , 'YYYYMMDD HH24:MI:SS' ) );

I would expect that the ? is where a string scalar variable goes, and that the format specified in the second argument of to_date would match the format of the data.
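
In DBI terms that would look something like this; the table and column names are invented, and the bound value is just a string in the format the to_date mask expects:

# $dbh is a connected DBI handle, as elsewhere in the thread.
my $sth = $dbh->prepare(q{
    insert into blah_table (mydatecolumn)
    values (to_date(?, 'YYYYMMDD HH24:MI:SS'))
});

$sth->execute('20020213 14:30:00');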
 
...meanwhile...back at the ranch...
I found the error causing the 'too long for column' error... fixed it.
Pfist, the date thing worked... fixed it.
The script runs blazing fast now...right up to when it sticks.

The sticking occurs either at an execute statement, a commit statement, or when calling a specific sub... the saga continues...

Lee
 
All,
I have modified the script to use the Win32::Process module to launch the parsing script. The logic goes: wait up to 60 seconds for the parser to finish; if it does not finish, kill it. This is safe enough, as the parser never runs more than a few seconds when operating right (not sticking). In this configuration the script has run undisturbed for nearly a week.
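
For anyone reading along, a hedged sketch of that watchdog arrangement using Win32::Process; the paths and the 60-second limit are illustrative:

use Win32::Process;

my $proc;
Win32::Process::Create(
    $proc,
    'C:\\Perl\\bin\\perl.exe',     # full path to the perl executable
    'perl gather_data.pl',         # command line for the parser
    0,                             # don't inherit handles
    NORMAL_PRIORITY_CLASS,
    '.',                           # working directory
) or die "could not launch the parser\n";

# Wait up to 60 seconds; Wait() returns false on timeout, so kill it.
$proc->Kill(1) unless $proc->Wait(60 * 1000);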

While it is nice the script runs, I still wish I knew how to make the 'sticking' stop.

Question: Is there a way to tell Perl to keep trying to make a DBI call if it hangs the first time? I ask because the problem is occurring only at the DBI interaction points.
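
There is no built-in DBI retry switch that I know of; the usual workaround is to wrap the call in a retry loop, though a truly hung call will not return to the eval, so a watchdog like the Win32::Process one above is still the real safety net. A sketch; $sth and @values stand in for the real handle and data:

my $max_tries = 3;
my $done      = 0;

for my $try (1 .. $max_tries) {
    eval {
        $sth->execute(@values);
        $done = 1;
    };
    last if $done;
    warn "execute attempt $try failed: $@";
    sleep 2;    # brief pause before trying again
}
warn "giving up after $max_tries attempts\n" unless $done;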
 
:) good for you leeBen, glad you've worked around it

Can you make the DBI fail with a nice short script?
Mike
"Experience is the comb that Nature gives us after we are bald."

Is that a haiku?
I never could get the hang
of writing those things.
 