
Converting an int * to a char * doesn't work

Infinity88 (Programmer)
Jun 29, 2001

The problem is I have an int array, and I need to convert it to a char array so that I can get one byte at a time from it. This is what I tried:

Code:
int *i = new int[10];
i[0] = 1;
i[1] = 2;
i[2] = 3; //etc

char *c = (char *) i;
After doing this, this is what I would expect:
Code:
c[0] == 0
c[1] == 0
c[2] == 0
c[3] == 1
c[4] == 0
c[5] == 0
c[6] == 0
c[7] == 2
Because each int uses 4 bytes. HOWEVER, this is what I actually get:
Code:
c[0] == 1
c[1] == 0
c[2] == 0
c[3] == 0
c[4] == 2
c[5] == 0
c[6] == 0
c[7] == 0
c[8] == 3
As if the first 3 bytes were thrown away. This doesn't make any sense to me. Why is the char* being aligned to the least-significant bits of the first (and only the first) int????? It's driving me nuts. VC++ 6.0 w/ SP5.

HELP
 
Ahhh... it is an ENDIAN problem. Intel and HP represent data differently. HP stores the bytes from left to right (most-significant byte first), as you would assume, but you are using an Intel-based platform, where the bytes are actually stored in reverse order.

example

int x = 255;

// HP (big-endian): most-significant byte first
   1        2        3        4
00000000|00000000|00000000|11111111

// INTEL (little-endian): least-significant byte first
   1        2        3        4
11111111|00000000|00000000|00000000

If you search the web for "byte swapping" you will see examples of this.

The way byte swapping works here is that byte 1 is swapped with byte 4 and byte 2 is swapped with byte 3.
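
In code, a quick sketch of that swap for a 32-bit value might look something like this (the function name is just for illustration):

Code:
// Swap byte 1 with byte 4 and byte 2 with byte 3 of a 32-bit value.
unsigned int swap32(unsigned int v)
{
    return ((v & 0x000000FF) << 24) |   // byte 4 -> byte 1
           ((v & 0x0000FF00) << 8)  |   // byte 3 -> byte 2
           ((v & 0x00FF0000) >> 8)  |   // byte 2 -> byte 3
           ((v & 0xFF000000) >> 24);    // byte 1 -> byte 4
}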

Hope this helps, and post if you need more explanation.

Matt
 
Each number (1-4) in the diagram above represents one byte.
 
UGH... thanks for bringing me back to sanity. But I'm not sure knowing the problem helps me much. I'm writing a DirectX application, and the array I'm receiving is the memory space of a DirectDraw surface. Because of the nature of the problem, I won't know until run time whether I have an array of bytes, words, or dwords, so I deal with it by using a byte array and changing my logic accordingly.

This causes problems... off the top of my head:

1) How would I detect whether the bytes are forward or backward (big-endian or little-endian, I guess, is the proper term) short of writing out an int or something and checking it?
2) Swapping must be dirt slow... the array I'm dealing with could have a few megs' worth of bytes in it.
3) I need to write directly to the array I receive, so even if I swapped the bytes, I would then have to re-swap them to write them back to the source, which would be twice as slow.

There must be a better way :(
 
Swapping may not be required. Do you know the data type? If it is an array of ints then you're all set... just use an int and the platform will take care of interpreting the value. This holds true for any case.
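
For example, something along these lines (a plain C++ sketch, nothing DirectX-specific; the array here just stands in for the surface memory you get back):

Code:
#include <iostream>
using namespace std;

int main()
{
    // Stand-in for the surface memory; in the real case this would be
    // the pointer you get back when you lock the surface.
    unsigned int storage[2] = { 1, 2 };
    void* surfaceMemory = storage;

    // Access it through a pointer of the pixel's real width and the CPU
    // applies its own byte order for you; no swapping needed.
    unsigned int* dwords = static_cast<unsigned int*>(surfaceMemory);
    dwords[0] = 255;                                 // write a value directly
    cout << dwords[0] << " " << dwords[1] << endl;   // prints "255 2"
    return 0;
}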

There could be a way around it though.

struct my_int
{
    union byteswap
    {
        char c[4];
        int x;
    } my_value;

    my_int& operator=(int x)
    {
        // Store the value byte by byte, least-significant byte first,
        // so c[0]..c[3] always hold a little-endian layout.
        my_value.c[0] = x >> 0 & 0xFF;
        my_value.c[1] = x >> 8 & 0xFF;
        my_value.c[2] = x >> 16 & 0xFF;
        my_value.c[3] = x >> 24 & 0xFF;
        return *this;
    }

    // Lets you read the (possibly byte-swapped) value back as an int.
    operator int() const { return my_value.x; }
};


Just a side note... this was off the top of my head, so double-check it, but the shifts should be multiples of 8 (0, 8, 16 and 24), one per byte. I just wrote this up quick.

Usage

my_int x;
x = 7;
cout << "byteswapped value = " << (int)x;

matt

 
That's a great idea... although I don't necessarily have long ints; they could be short ints or chars. But that's easy to allow for. I could assume that it is always little-endian (Intel) and just write my bytes backwards, which would be fine UNLESS there are a lot of machines out there that are big-endian. Are there??
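
For instance, a rough sketch of writing a pixel least-significant byte first, with the width (1, 2, or 4 bytes) only known at run time (writePixel and bytesPerPixel are made-up names for illustration):

Code:
// Write one pixel value into the byte array, least-significant byte first
// (little-endian order); bytesPerPixel is 1, 2, or 4 at run time.
void writePixel(unsigned char* dest, unsigned long value, int bytesPerPixel)
{
    for (int i = 0; i < bytesPerPixel; ++i)
    {
        // dest[0] gets the lowest byte, dest[1] the next one, and so on
        dest[i] = (unsigned char)((value >> (8 * i)) & 0xFF);
    }
}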

What I really need to do is find out how DirectX programmers deal with this problem.

Thanks for your help though, you've been a life-saver. I'm not sure how I ever would have discovered the cause of the problem on my own (believe me I tried)
 
Heh... no problem. DirectX, as far as I know (I have not ventured into the game-development side of C++), is Windows-only and therefore should only run on little-endian hardware.

Matt
 
What is the problem?

struct intchar
{
    union
    {
        int x;
        char xx[4];
    };
};

John Fill
ivfmd@mail.md
 
The problem is big-endian vs. little-endian. If you're not seeing a problem, I would assume you're probably working on a Unix-based system.

Try this out:

intchar i;
i.x = 255;   // the union is anonymous, so its members are accessed directly

Then look at i.xx[3]: is it 0 or -1? If i.xx[3] is -1, you are working on a big-endian machine. If, however, i.xx[0] is -1 (and xx[3] is 0), you are working on a little-endian machine.
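
Put together as a complete check, it would look something like this (a quick sketch; with the anonymous union the members are reached straight through the struct):

Code:
#include <iostream>
using namespace std;

struct intchar
{
    union
    {
        int x;
        char xx[4];
    };
};

int main()
{
    intchar i;
    i.x = 255;

    // 255 fits entirely in the lowest byte, so whichever end of the array
    // it lands on tells you the byte order of the machine.
    if (i.xx[0] != 0)
        cout << "little-endian (Intel-style)" << endl;
    else
        cout << "big-endian (HP-style)" << endl;
    return 0;
}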

Matt
 