Themuppeteer
Programmer
Hello,
I have a C# client that gets a string from a text field.
This client calls a web service with the string as an argument.
(The web service is written in C++ with gSOAP.)
When the web service is called, it prints the length of the received string on the screen.
Now when I call my service with a single Japanese character as the argument, the length of the string the web service receives is 1. It should be 2, right (because a Unicode character is 16 bits)? Apparently the client cuts off the Unicode characters at 8 bits.
What I want to do to solve this is convert my Unicode string to another string (twice as long) with each Unicode character split into two 8-bit halves.
Who can help me? Can anyone give me functions/ideas/tips?
Thanks a lot.
Greetz,
NOSPAM_themuppeteer@hotmail.com (for mails, remove the NOSPAM_)
Don't eat yellow snow...and don't mess with your fstab!