
MAX(CONVERT(CONVERT(CONVERT...))) 2


kjv1611

Hi folks,

I haven't been very active here on Tek-Tips of late. I've been very busy in every area of life, so there's hardly been time to even check in more than every few days, it seems.

Well, at work, I'm trying to clean up some SQL code as well as some other things, in a new department I'm working in.

I've already got some cleanup done, but lots more to do.

Anyway, I've got one type of TIME conversion that is apparently necessary in a given query, and I'm hoping there's a way to do it that's a little easier on the server, so it'll run faster.

Here's the current SQL:
Code:
SELECT	 r.Account
		,MAX(r.adate + CONVERT(datetime,CONVERT(decimal(6,6), '.' + CONVERT(varchar(50),r.atime)))) AS LastUpdate
FROM	StatusTable r
WHERE	r.active_date < '2010-09-30'

The Account field contains account numbers in a varchar format.

Apparently, with the way this table was designed, you have to combine the DATE field with the TIME field to get the full datetime stamp. For the life of me, I can't figure that one out, as most everything else I've seen at work has just one datetime field, so everything is covered. This one might have something to do with the data originally coming out of an Oracle database, but I do not know that for certain.

My thought is this: is there not some way I could get the same results without putting such a heavy load on the server? I mean, with this (unless I'm wrong), it appears we have to convert the time value several times, then add it to the date, and THEN get the maximum value for each Account... before a particular date.
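Just to make sure I'm reading those nested CONVERTs right, here's my step-by-step take on what the expression does, using 23593 from the sample data purely as an illustration:
Code:
-- My reading of the nested CONVERTs, one step at a time, for a sample atime of 23593
SELECT   CONVERT(varchar(50), 23593)                              AS Step1 -- '23593'
        ,'.' + CONVERT(varchar(50), 23593)                        AS Step2 -- '.23593'
        ,CONVERT(decimal(6,6), '.' + CONVERT(varchar(50), 23593)) AS Step3 -- 0.235930 (a fraction of a day)
        ,CONVERT(datetime,
            CONVERT(decimal(6,6), '.' + CONVERT(varchar(50), 23593))) AS Step4 -- 1900-01-01 05:39:44 and change
        ,CONVERT(datetime, '2010-09-08')
            + CONVERT(datetime,
                CONVERT(decimal(6,6), '.' + CONVERT(varchar(50), 23593))) AS Step5 -- 2010-09-08 05:39:44 and change

If I'm reading it right, that's three conversions plus a datetime add on every single row before the MAX ever gets applied.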

So, is it just me, and perhaps I am not understanding something, or is there a better way?

Any thoughts?

Below is a sample of the data in the table I'm attempting to query from (adjusted, of course, to protect the innocent):
[TT] Account Date Time
11111 9/8/2010 23593
77777 9/14/2010 23593
88888 9/15/2010 25071
99999 9/12/2010 25071
10101 9/21/2010 25071
10102 9/5/2010 25078
10105 9/13/2010 25078
10106 9/7/2010 25131
10107 9/25/2010 25131
10111 9/4/2010 25895
10112 9/20/2010 25895
10117 9/13/2010 25895[/TT]

And here is an example of what the returned data looks like:
[TT] account LastUpdate
31312 5/14/08 8:30 AM
31351 6/21/06 9:29 AM
135131 11/30/04 6:51 AM
51313 3/24/08 9:29 AM
3131 3/24/08 9:29 AM
513131 2/25/09 10:01 AM[/TT]

Thanks for any thoughts, suggestions, references.
 
That is a pretty bizarre way of storing time.

Can I assume that the time column is an int?

What happens to the time values when the time is less than a 5-digit number? Your code (which I assume you inherited) assumes the time is 5 digits, because it blindly puts a decimal point in front of it, so 25100 would return the same value as 251. Unless, of course, the time column is a varchar, which would be truly sad indeed.
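Just to illustrate the decimal-point issue with the two values I mentioned:
Code:
-- both expressions come back as 0.251000, so two different times collapse into the same value
SELECT   CONVERT(decimal(6,6), '.' + CONVERT(varchar(50), 25100)) AS FiveDigits
        ,CONVERT(decimal(6,6), '.' + CONVERT(varchar(50), 251))   AS ThreeDigits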

Anyway... I came up with an alternative method for combining the date and time. You'll notice that there is a "magic" number involved (which I don't like), but you'll also notice that there is only one (implicit) conversion and one date add. This is bound to give you better performance. Lastly, notice that the results are not exactly the same. I am seeing some differences in the milliseconds. To see what I mean, run the following code in a query window.

Code:
  Declare @Temp Table(Account VarChar(20), aDate DateTime, aTime Int)
  Insert Into @Temp Values('11111','9/8/2010' ,   23593)
  Insert Into @Temp Values('77777','9/14/2010',   23593)
  Insert Into @Temp Values('88888','9/15/2010',   25071)
  Insert Into @Temp Values('99999','9/12/2010',   25071)
  Insert Into @Temp Values('10101','9/21/2010',   25071)
  Insert Into @Temp Values('10102','9/5/2010' ,   25078)
  Insert Into @Temp Values('10105','9/13/2010',   25078)
  Insert Into @Temp Values('10106','9/7/2010' ,   25131)
  Insert Into @Temp Values('10107','9/25/2010',   25131)
  Insert Into @Temp Values('10111','9/4/2010' ,   25895)
  Insert Into @Temp Values('10112','9/20/2010',   25895)
  Insert Into @Temp Values('10117','9/13/2010',   25895)

SELECT  r.Account
        -- original approach: three CONVERTs turn the int time into a fraction of a day, added to the date
        ,r.adate + CONVERT(datetime,CONVERT(decimal(6,6), '.' + CONVERT(varchar(50),r.atime))) AS OriginalMethod
        -- alternative: a single DATEADD at 864 milliseconds per unit of aTime
        ,DateAdd(millisecond, aTime * 864.0, aDate) As NewMethod
FROM    @Temp r

You should notice that sometimes the data is off by 3 milliseconds, and sometimes it is completely accurate. I think this is probably a rounding issue. You should evaluate this pretty hard to see whether you can tolerate that level of error.
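If the rounding guess is right, I suspect it comes down to datetime's own precision: datetime only stores time to roughly 1/300th of a second, so milliseconds get snapped to .000/.003/.007 increments, and the two expressions can land on neighboring ticks. You can see the snapping by itself with something like this:
Code:
-- datetime rounds milliseconds to .000 / .003 / .007 increments, so .352 is stored as .353
SELECT CONVERT(datetime, '2010-09-08 05:39:44.352')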

For the magic number....

I noticed that you were effectively dividing your time by 100,000 and then adding that number of days to your date. There are 86,400,000 milliseconds in a day (24 hours * 60 minutes per hour * 60 seconds per minute * 1000 milliseconds per second).

86,400,000 / 100,000 = 864
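To make that concrete with one of the sample rows (atime = 23593):
Code:
-- 23593 / 100,000 of a day = 23593 * 864 milliseconds = 20,384,352 ms = 05:39:44.352 past midnight
SELECT DateAdd(millisecond, 23593 * 864.0, CONVERT(datetime, '2010-09-08'))  -- 2010-09-08 05:39:44.353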



-George
Microsoft SQL Server MVP
My Blogs
SQLCop
"The great things about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
The time data looks like the number of seconds past midnight to me. It's kind of hard to tell, since the returned data doesn't match the sample data in the original post.


There were (are) lots of systems that used similar time formats (0 being midnight and 86399 being 23:59:59). Also, I'm assuming that we're missing the
Code:
Group By r.Account
clause in the original post - otherwise the MAX wouldn't work.
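For reference, I'm guessing the original query looked something along these lines, with the grouping still in place:
Code:
SELECT   r.Account
        ,MAX(r.adate + CONVERT(datetime, CONVERT(decimal(6,6), '.' + CONVERT(varchar(50), r.atime)))) AS LastUpdate
FROM     StatusTable r
WHERE    r.active_date < '2010-09-30'
GROUP BY r.Account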

I'm pretty sure (again assuming that it's the number of seconds past midnight) that George's example would work perfectly if you remove the magic number of 864 and replace it with 1000. i.e.
Code:
DateAdd(millisecond, aTime * 1000.0, aDate) As NewMethod

Greg
People demand freedom of speech as a compensation for the freedom of thought which they seldom use. Kierkegaard
 
Thanks for your posts so far... I'll get to reading the rest, but to start with, I meant to include the data types for the fields.

gmmastros said:
Can I assume that the time column is an int?
Pretty close. The actual data type designation is:
decimal(6,0)

I'll get back with you both after this. And yes, I agree it's a bizarre way of storing time. I really wish it were like most data sources here at work - just one field for both date and time. If you need just the date or just the time, you can pull out that part. That makes it easier for sorting, using greater than, less than, etc. (at least I think so).
 
gmmastros said:
What happens to the time values when the time is less than a 5-digit number? Your code (which I assume you inherited) assumes the time is 5 digits, because it blindly puts a decimal point in front of it, so 25100 would return the same value as 251. Unless, of course, the time column is a varchar, which would be truly sad indeed.

Yes, I did inherit this code. It's part of a big heap of stuff that I'm currently trying to sort through, verify its validity, and correct and tweak where I can.

The last person probably put it together in a hurry a few months ago, before they left the company.
 
gmmastros said:
Unless, of course, the time column is a varchar, which would be truly sad indeed.
You know what's funny? I just found a date column in a different table, in a different database, and was thinking a similar thought: why in the world is it in a varchar format rather than datetime?! But I'm giving the benefit of the doubt; I'm hoping it's because of some crossover from Oracle to MS SQL. I have no Oracle experience of my own to base that on - I'm just tossing it out as a guess.
 
re: missing Group By r.Account

Yes, when I was cleaning it up I ended up dropping that whole grouping line, but it was indeed there. Thanks for the catch.

Re: using the DateAdd function - it indeed LOOKS to me like it would run much faster as well. I'll see if I can find out from one of the other folks here, who might know something about it, whether both of your thoughts on the time are correct. If so, I'll be super-duper happy for sure!

Regardless, I've learned some with this thread for sure.

I'll post back with the final decision and all.

By the way, I ran the code this way to include the original method and both variations of the new one. None of the methods match exactly. I'll keep asking and digging, and post back with whatever I find on this.

The SQL:
Code:
 Declare @Temp Table(Account VarChar(20), aDate DateTime, aTime Int)
  Insert Into @Temp Values('11111','9/8/2010' ,   23593)
  Insert Into @Temp Values('77777','9/14/2010',   23593)
  Insert Into @Temp Values('88888','9/15/2010',   25071)
  Insert Into @Temp Values('99999','9/12/2010',   25071)
  Insert Into @Temp Values('10101','9/21/2010',   25071)
  Insert Into @Temp Values('10102','9/5/2010' ,   25078)
  Insert Into @Temp Values('10105','9/13/2010',   25078)
  Insert Into @Temp Values('10106','9/7/2010' ,   25131)
  Insert Into @Temp Values('10107','9/25/2010',   25131)
  Insert Into @Temp Values('10111','9/4/2010' ,   25895)
  Insert Into @Temp Values('10112','9/20/2010',   25895)
  Insert Into @Temp Values('10117','9/13/2010',   25895)

SELECT  r.Account
        ,r.adate + CONVERT(datetime,CONVERT(decimal(6,6), '.' + CONVERT(varchar(50),r.atime))) AS OriginalMethod
        ,DateAdd(millisecond, aTime * 864.0, aDate) As NewMethod864
        ,DateAdd(millisecond, aTime * 1000.0, aDate) As NewMethod1000
FROM    @Temp r

Results:
[TT]
Account OriginalMethod NewMethod864 NewMethod1000
11111 2010-09-08 05:39:44.350 2010-09-08 05:39:44.353 2010-09-08 06:33:13.000
77777 2010-09-14 05:39:44.350 2010-09-14 05:39:44.353 2010-09-14 06:33:13.000
88888 2010-09-15 06:01:01.343 2010-09-15 06:01:01.343 2010-09-15 06:57:51.000
99999 2010-09-12 06:01:01.343 2010-09-12 06:01:01.343 2010-09-12 06:57:51.000
10101 2010-09-21 06:01:01.343 2010-09-21 06:01:01.343 2010-09-21 06:57:51.000
10102 2010-09-05 06:01:07.390 2010-09-05 06:01:07.393 2010-09-05 06:57:58.000
10105 2010-09-13 06:01:07.390 2010-09-13 06:01:07.393 2010-09-13 06:57:58.000
10106 2010-09-07 06:01:53.183 2010-09-07 06:01:53.183 2010-09-07 06:58:51.000
10107 2010-09-25 06:01:53.183 2010-09-25 06:01:53.183 2010-09-25 06:58:51.000
10111 2010-09-04 06:12:53.280 2010-09-04 06:12:53.280 2010-09-04 07:11:35.000
10112 2010-09-20 06:12:53.280 2010-09-20 06:12:53.280 2010-09-20 07:11:35.000
10117 2010-09-13 06:12:53.280 2010-09-13 06:12:53.280 2010-09-13 07:11:35.000[/TT]
 
Right - like I said in my post, some of the times are off by 3 milliseconds. I'm reasonably certain that my method will be faster than the original method. If you are willing to tolerate the 3-millisecond difference (which is probably a result of rounding), then you should use my method.

-George
Microsoft SQL Server MVP
My Blogs
SQLCop
"The great things about standards is that there are so many to choose from." - Fortune Cookie Wisdom
 
Yeah, I don't fully understand the whole thing I'm looking at yet, but I do totally agree. I sent an email to one guy here at work who might know what I'm talking about in reference to the table fields. I've not heard back from him yet.

But my purpose with this query, I think, is just to pull the maximum combined value of the Date and Time fields. So the exact value isn't terribly important, I don't think, and I seriously doubt a few milliseconds will matter.
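If the DateAdd approach checks out with the folks here, I'm thinking the cleaned-up query would end up looking something along these lines (just a sketch for now, reusing the table and column names from the original):
Code:
SELECT   r.Account
        ,MAX(DateAdd(millisecond, r.atime * 864.0, r.adate)) AS LastUpdate
FROM     StatusTable r
WHERE    r.active_date < '2010-09-30'
GROUP BY r.Account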

Thanks a lot for that information. I'd put 2 stars on it if I could! [bigsmile]
 