
SQLLDR adding a century to pre-1950 dates


LarrySteele
Programmer
May 18, 2004
I've encountered an issue I've never seen before and I couldn't find a solution through Google.

I'm using SQLLDR to bring in data from some text files, nothing particularly special. The files are CSV with double quotes where needed. The inbound data appears good.

The problem is that for dates prior to 1950, Oracle is adding a century. So for a date before 1950, such as "2/6/1948" in the inbound file, Oracle stores "2/6/2048" in the table. For dates in 1950 or later, such as "4/24/1973", Oracle correctly stores "4/24/1973". This is consistent across all date columns and all rows.

Here's the header portion of my SQLLDR control file:

[tt]LOAD DATA
INFILE '...'
REPLACE
INTO TABLE "FOO"."BAR"
FIELDS TERMINATED BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
([/tt]

Here's one of the lines where I'm bringing in a date field:
[tt]birthdate "decode:)birthdate,'#EMPTY',null,to_date:)birthdate,'MM/DD/YYYY'))",
[/tt]

Any idea what would cause this and how to correct it?

Thanks in advance,
Larry
 
Add an example of the input CSV. Don't type it in; cut and paste from the actual data. Feel free to mask any personally identifiable information.

Bill
Lead Application Developer
New York State, USA
 
The behavior you are seeing is the default behavior for DATE fields with only two-digit years. I don't think you can change the default behavior with an option setting, so you will need to add the century to your dates in code.
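If you control the load, one sketch of what I mean (the column name and mask here are just illustrative, borrowed from your post) is to give the field an explicit DATE mask right in the control file so no two-digit RR interpretation is ever involved:

[tt]-- illustrative field entry for a SQL*Loader control file;
-- an explicit four-digit mask sidesteps the RR century rules entirely
birthdate DATE "MM/DD/YYYY",[/tt]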



==================================
advanced cognitive capabilities and other marketing buzzwords explained with sarcastic simplicity
 
I found something else while searching (I use duckduckgo.com as my search engine, not Google).

Oracle uses a special calculation when populating the century by default. If the two-digit year value is between 50 and 99, Oracle will consider the 20th century the default (i.e. 1950 – 1999). If the two-digit year value is between 00 and 49, Oracle will consider the 21st century the default (i.e. 2000 – 2049). The characters “DD-MON-RR” are used to denote the characteristics of the default date. This set of characters is called a date picture. The “DD” characters represent the day of the month. The “MON” characters represent the month in the format of “APR”. The “RR” characters represent a two-digit year and tell Oracle to use the formula above for determining the default century. This format is the default. If the “YY” characters are used, the default century will always be the current century.

To display more than the default date picture components, the TO_CHAR function must be used. This function changes the default date picture. The date picture is also used to tell Oracle the format of any non-default input dates. This function is discussed in the next section. You might also note that the dates in our practice database range from the 1800s to the 2000s. If you do not put a century component in the arguments for your WHERE clauses, Oracle will place a “20” as the century by default. This may be the cause of some unexpected results.
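To make the RR-versus-YYYY difference concrete, here's a small query you can run in SQL*Plus (the date literals are made up, not from your file). Running it today, an RR mask drops a two-digit 48 into 2048, while an explicit four-digit mask keeps 1948:

[tt]SELECT TO_CHAR(TO_DATE('02/06/48',   'MM/DD/RR'),   'MM/DD/YYYY') AS rr_result,
       TO_CHAR(TO_DATE('02/06/1948', 'MM/DD/YYYY'), 'MM/DD/YYYY') AS yyyy_result
  FROM dual;
-- rr_result: 02/06/2048    yyyy_result: 02/06/1948[/tt]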

==================================
advanced cognitive capabilities and other marketing buzzwords explained with sarcastic simplicity


 
Thanks for your response, Bill.

Here's a line of data from my CSV file with personally identifiable information replaced.

[tt]11111111,LName,FName,,Street,,City,ST,State,11111,CNTRY,Country,1-1234567890123,0,,,1,2/6/1948,6/29/1987,2/28/2014,Text,T,Text,,,,,,,,,6/29/1987,ABC,ABC,,,,,,,,,,,,,6/29/1987,,,,,,3/1/2014[/tt]

All dates are imported into Oracle correctly except for "2/6/1948," which Oracle stores as "2/6/2048."

Here's another line of data from my CSV file...

[tt]22222222,LName,FName,,Street,,City,ST,State,22222,CNTRY,Country,1-123456789123,,,,1,6/18/1917,4/30/1937,6/30/1982,Text,R,Text,1,Text,,,,,,,4/30/1937,TXT,Text,,,,,,,,,,,,,4/30/1937,,,,,,7/1/1982[/tt]

In this case, the only dates that imported correctly were the two 1982 dates. All the rest were imported as "20nn."

Any thoughts?

Thanks in advance,
Larry
 
Thank you for your response, John.

You are correct that Oracle is acting as if it's only receiving RR rather than YYYY. That's what's baffling to me. The dates include the century (see examples from my original post plus the actual sample lines I was assembling as you were posting).

I'm also specifying the inbound data format of "MM/DD/YYYY".
 
Looks like I found the root cause: it appears to have been the decode function.

Back to Google, this time I included "decode" in the criteria and noticed a number of results relating SQLLDR, incorrect dates, and DECODE. So I modified my control file thusly:

[tt]birthdate "to_date:)birthdate,'MM/DD/YYYY')",[/tt]

That brought the correct date in.

I then cleared that value from the record to see how an empty string would be handled and it correctly brought it in as a null date. Apparently I didn't need to convert empty strings to null using the Decode function as I had been.
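For what it's worth, the explanation I pieced together (I can't say for certain this is exactly what Oracle was doing, but it fits) is that DECODE returns a character value when its first result is NULL, so the TO_DATE result was being implicitly converted back to a string using the session's default DD-MON-RR format and then back to a DATE, which is where the century got lost. If I ever need the '#EMPTY' handling again, keeping TO_DATE as the outermost call should avoid that round trip:

[tt]birthdate "to_date(nullif(:birthdate,'#EMPTY'),'MM/DD/YYYY')",[/tt]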
 
That's great. Is there a reason that you aren't using external tables? They are so much easier to use.
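In case it helps, here's a rough sketch of what I mean; the directory object, table, and column names are all made up rather than taken from your schema, and the column list is trimmed to a few fields:

[tt]-- one-time setup: a directory object pointing at the folder holding the CSV
CREATE OR REPLACE DIRECTORY data_dir AS '/path/to/csv';

-- external table over the CSV; the file is queried in place, no SQLLDR run needed
CREATE TABLE bar_ext (
  emp_id    NUMBER,
  last_name VARCHAR2(50),
  birthdate DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
    (
      emp_id,
      last_name,
      birthdate CHAR(10) DATE_FORMAT DATE MASK "MM/DD/YYYY"
    )
  )
  LOCATION ('bar.csv')
)
REJECT LIMIT UNLIMITED;

-- then loading is just an INSERT ... SELECT
INSERT INTO "FOO"."BAR" (emp_id, last_name, birthdate)
  SELECT emp_id, last_name, birthdate FROM bar_ext;[/tt]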

Bill
Lead Application Developer
New York State, USA
 