You guys are awesome!
Both solutions worked just dandy! I guess my mind was stuck in the wrong type of loop. I will leave it a mystery as to which solution I ended up using in the end, but you both deserve stars for quick and accurate replies.
Thanks again.
I am trying to get an accurate row count for all tables owned by a user at a specific moment in time. Querying the user_tables view seemed like the ideal solution... just run DBMS_STATS, and then query the view. However, since the num_rows column in dba_tables and user_tables only seems...
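In outline, the approach being described would look something like this (a hedged sketch, run as the schema owner in SQL*Plus; note that num_rows is only as current as the last statistics gathering):

```sql
-- Sketch: refresh optimizer statistics for the current schema,
-- then read the row counts recorded by that gathering.
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => USER);

SELECT table_name, num_rows
FROM   user_tables
ORDER  BY table_name;
```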
I'm stuck trying to figure out how to easily take a specific delimited field of values and populate another similar table that has the values broken out (see below). This forum was very helpful when I had to process data in the other direction, but I have not been able to figure out the...
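Outside the database, the reshaping being asked about looks like this (a hypothetical sketch with made-up data; inside Oracle 9i it would typically be done with SUBSTR/INSTR in PL/SQL):

```shell
# Hypothetical data: one row whose second field holds comma-delimited values.
# The goal is one output row per value -- the key repeated for each value.
echo '42|red,green,blue' |
awk -F'|' '{
    n = split($2, vals, ",")          # break the delimited field apart
    for (i = 1; i <= n; i++)
        print $1 "|" vals[i]          # re-emit the key with each single value
}'
# prints:
# 42|red
# 42|green
# 42|blue
```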
Same problems using UTF8.
As I continue to dig in further, I realized that the data is strictly just binary values, so the statement that they are "Unicode" is quite probably incorrect and misleading. I apologize for the inaccuracy; this is my first time dealing with any data other than...
I am trying to load some variable-length data using SQL*Loader. My control file is included. The only catch is that the 2 bytes that identify the logical record length are little-endian binary (Unicode) data. For example, for those 2 bytes to represent a record length of 101 characters, the...
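As a sanity check on what those two length bytes mean, here is a small sketch (the file name is invented scratch data) that decodes a little-endian two-byte value with od:

```shell
# The bytes 0x65 0x00, read little-endian, should give 101 -- the record
# length from the example above.  /tmp/len2.bin is throwaway demo data.
printf '\145\000' > /tmp/len2.bin                     # write 0x65 0x00

lo=$(od -An -tu1 -j0 -N1 /tmp/len2.bin | tr -d ' ')   # low-order byte  (101)
hi=$(od -An -tu1 -j1 -N1 /tmp/len2.bin | tr -d ' ')   # high-order byte (0)

echo $(( lo + 256 * hi ))                             # prints 101
```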
I've posted and received a nice solution to an issue I had with SQL Server (see thread thread183-1190923), however now I'm faced with the same issue on Oracle 9i. Here is a summary, the solution for SQL Server, and my attempt in Oracle. Seems like it should work.....
/* I'm trying to get...
I used your advice and code sample, and the load works like a charm! Thank you very much. I made a slight modification, so I have included that here. This script was able to load about 33,000 individual files.
# load_doc_text.sh
for i in `cat srcpath_sample.dat`
do
cat << !! > load_doc.ctl...
I'll play around with that idea a bit and let you know how it goes. I've got approx. 33,000 separate .dat files, so as long as I'm creating a ctl file that exists only long enough for the loop to process it, and then deleting it, I think this could work.
I am using Solaris. I am generating the list of 'infile' lines using 'find', then running those results through sed to format each as a valid infile line, then concatenating that with a pre-built 'append into table...' section to create a single ctl, similar to my example file, with all infile names...
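The find/sed half of that pipeline might look roughly like this (directory and file names here are invented for illustration):

```shell
# Hypothetical sketch: list every .dat file under a scratch source tree
# and rewrite each path as a SQL*Loader INFILE clause.  The result would
# then be concatenated with the pre-built 'append into table...' section.
mkdir -p /tmp/docs_demo && : > /tmp/docs_demo/090005.dat   # demo data only

find /tmp/docs_demo -name '*.dat' |
    sed "s|.*|INFILE '&'|"
# prints: INFILE '/tmp/docs_demo/090005.dat'
```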
I am attempting to load many files using multiple INFILE parameters in a single SQL*Loader control file. I want to include in one of the data fields the name of the current .dat file being processed (see "DOC_NAME" below). Where '090005.DAT' is hard-coded on that line, the current "INFILE"...