Hi All,
Maybe it's just because of Tax Shock, but I have to confess being totally confused by this thread. I've worked with BCD numbers for over 20 years, and thought I "got it".
First, let's start with the decimal number 1273.
In an ASCII string, it would be represented internally as "1273":
3 1 3 2 3 7 3 3
0011 0001 0011 0010 0011 0111 0011 0011
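For anyone who wants to double-check those bytes, here's a quick Python sketch (just illustration, not from the original thread):

```python
# Each character of the string "1273" is stored as its ASCII code:
# '1' = 0x31, '2' = 0x32, '7' = 0x37, '3' = 0x33.
for ch in "1273":
    print(ch, hex(ord(ch)), format(ord(ch), "08b"))
```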
In binary, it would be represented internally as 100 1111 1001:
0100 1111 1001
(1 x 1024 + 1 x 128 + 1 x 64 + 1 x 32 + 1 x 16 + 1 x 8 + 1 x 1)
In hexadecimal, it would be represented internally as 4F9:
4 F 9
0100 1111 1001
(4 x 256 + 15 x 16 + 9 x 1) or
(4 x 16**2 + 15 x 16**1 + 9 x 16**0)
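Both of those expansions are easy to verify in Python (same bit pattern, read two ways — a verification sketch, not part of the original post):

```python
n = 1273
print(bin(n))  # the binary representation: bits 1024, 128, 64, 32, 16, 8, 1
print(hex(n))  # the same bits grouped into nibbles: 4, F, 9

# Sum of powers of two for the set bits:
assert 1024 + 128 + 64 + 32 + 16 + 8 + 1 == n

# Hex digit expansion:
assert 4 * 16**2 + 15 * 16**1 + 9 * 16**0 == n
```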
In BCD, it would be represented internally as 1273:
1 2 7 3
0001 0010 0111 0011
With, depending on implementation, possibly a sign digit on the end.
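A minimal sketch of unsigned packed BCD (no sign digit; the helper name `to_bcd` is mine, not from the thread):

```python
def to_bcd(n):
    """Pack the decimal digits of n into 4-bit nibbles, one digit per nibble."""
    bcd = 0
    for digit in str(n):
        bcd = (bcd << 4) | int(digit)
    return bcd

print(hex(to_bcd(1273)))             # reads back as 0x1273 -- one digit per nibble
print(format(to_bcd(1273), "016b"))  # 0001 0010 0111 0011
```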
So, where does the number 10032 come in? I'm not trying to be obtuse about this; I'm just confused. If one of you can help me understand, I would appreciate it.
BTW, SJA, I ran your code and got 2730???
Thanks,
Paul