
batch file sqlplus execution


eyetry

Programmer
Oct 2, 2002
I'm executing a .bat file from a map rule that looks like this:

=VALID(GET("BAT","-CMD c:\batch_trun_table.bat -TRACE -AUDIT "), FAIL("TRUNCATE TEMP TABLE: " + LASTERRORMSG()))

batch file =

sqlplus user/pw@sid @c:\path\truncat_temp_table.sql

sql file =

truncate table temp_table_name

It works once, but after that it seems to fail with rc=3.

What's going on, and better yet, how do I resolve the issue?
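
For reference, one thing that can make the rc from GET("BAT", ...) opaque is the batch file not passing SQL*Plus's exit status back. A minimal sketch of the .bat file, assuming the BAT command reports the batch file's own exit code back to VALID():

rem batch_trun_table.bat -- sketch only; command and paths as in the post above
sqlplus user/pw@sid @c:\path\truncat_temp_table.sql
rem hand SQL*Plus's exit status back so the map rule sees a meaningful rc
exit /b %ERRORLEVEL%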
 
Why would you do it in a batch file in the first place? Better to use DBLOOKUP/DBQUERY, or run a map with the query as the input card.
 
The result code is probably because you haven't closed the database connection within sqlplus.
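
A sketch of what that would look like inside the .sql file, assuming the goal is for SQL*Plus to disconnect and return a usable exit code (the WHENEVER line is an extra assumption so that a failed truncate surfaces as a non-zero rc):

REM truncat_temp_table.sql -- sketch only
WHENEVER SQLERROR EXIT FAILURE
TRUNCATE TABLE temp_table_name;
EXIT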
 
janhes, I tried using a DBLOOKUP to process the file, and it was taking over 6 hours due to the size of the file. So I created a temp table, loaded all of the relevant data into it, and then executed the map with queries that select just the data needed. That reduces the time from hours to minutes.
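
Roughly, the staging flow described above looks like this on the database side; every object and column name here is a placeholder rather than the real schema:

REM sketch of the staging flow, placeholder names throughout
REM 1. clear the staging table before each run
TRUNCATE TABLE temp_table_name;
REM 2. load the relevant rows from the large file into temp_table_name
REM 3. the subsequent maps then query only what they need, for example:
SELECT key_col, data_col FROM temp_table_name WHERE key_col = 'example_key';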

I thought it might be a connection issue, so I put a 'quit' after the truncate, but I still seem to have the issue.

Any suggestions?

eyetry

PS: I know it sounds ugly, and it probably is, but I feel like I've exhausted the other possibilities.
 
Never mind. Sometimes I'm just thick.

 
On my way into work I finally understood your earlier suggestion: DBLOOKUP("truncate......","......"). I must be too close to the forest to see the trees.
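
In other words, something along these lines, reusing the error handling from the original rule; the second argument is left as a placeholder because it depends on the .mdq/database reference already set up for the map:

=VALID( DBLOOKUP("TRUNCATE TABLE temp_table_name", "<your .mdq / database reference>"), FAIL("TRUNCATE TEMP TABLE: " + LASTERRORMSG()) )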

Not sure why I didn't just do it that way the first time, but I guess it just takes some of us a bit longer. I was already using the same .mdq for the related table in a subsequent map, so.......

My "over 6 hours" comment was about trying to use the DBLOOKUP to solve a problem related to the same data. The new method of using the temp table resolves all of the data issues in 20 minutes, from the initial read of the large file through loading it into the permanent tables.

I'm learning the DB side on the fly, so I'm struggling with some of these things. But now that I see how to handle this type of issue, I should be able to turn similar problems around in a day or two instead of a month.....

Thanks again!
 
Just for clarity, you can do most SQL things through DBLOOKUP.
If you want to do a bulk load into Oracle, you can use the Oracle bulk loader through a batch file; it's in the examples for version 8.
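
For the bulk-load route, a sketch of such a batch file; the control and log file names are invented for illustration, and the control-file layout shown in the comments is only an example:

rem load_temp_table.bat -- sketch only, placeholder file names
rem the .ctl control file describes the flat-file layout, e.g.:
rem   LOAD DATA INFILE 'c:\path\large_file.dat'
rem   TRUNCATE INTO TABLE temp_table_name
rem   FIELDS TERMINATED BY ',' (key_col, data_col)
sqlldr userid=user/pw@sid control=c:\path\load_temp_table.ctl log=c:\path\load_temp_table.log
rem pass the loader's exit status back to the caller
exit /b %ERRORLEVEL%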
 
Yeah, I know. I'm just having brain spasms on this one; I'm doing things in Oracle I've never done before, and I'm not Oracle trained, so I'm learning on the fly.

We are getting a large file from a TPA that processed fine. Then we learned that there were some problems with their data, which I took care of in a few seconds. Then we needed to split the file up. Then...... and now.... Every time I resolve a problem and confirm there aren't additional issues, QA and the users sign off. It goes to production, and within a few days another problem is found. Not sure why they didn't know about this last one, because it's the worst and has taken me a few weeks of mucking around. We've been working with the TPA for two months trying to get them to clean up the file. They can't seem to get anything right, so I'm now doing it for them. So now, after removing all of those issues, I've removed about 75% of the inbound data.

Just want to be done with it because I have other things starting to back up on me.
 
