AtomicChip
Programmer
K... I've been trying to understand this for a while now, but it's REALLY starting to tick me off... What I want to be able to do is send the hex byte 0xFF through a UDP socket...
private const string CMD_INFO = "\xff\xff\xff\xff\x00";
Socket sock = new Socket ( AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp );
IPAddress ipServer = IPAddress.Parse ( "192.168.0.3" );
IPEndPoint epServer = new IPEndPoint ( ipServer, 27015 );
sock.Connect ( (EndPoint) epServer );
sock.SetSocketOption ( SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, 25 );
sock.Send ( System.Text.ASCIIEncoding.ASCII.GetBytes ( CMD_INFO ) );
When this sends, though, and I look at the packet on the wire, the output is 3f 3f 3f 3f 00. The \x00 comes through fine, but each \xff is changed to 3f. Am I encoding this wrong, or is it something else that I'm not seeing?
-----------------------------------------------
"The night sky over the planet Krikkit is the least interesting sight in the entire universe."
-Hitch Hiker's Guide To The Galaxy
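For anyone hitting the same thing: the ASCII encoder only covers characters 0x00-0x7F, and any character outside that range is replaced with its fallback character '?' (hex 3F) — which is exactly why every \xff arrives as 3f. A minimal sketch of one way around it, skipping the string round-trip entirely and sending the raw bytes (same hypothetical address/port as the original post):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class UdpInfoQuery
{
    static void Main()
    {
        // Define the command as raw bytes; no text encoding is involved,
        // so 0xFF survives untouched.
        byte[] cmdInfo = { 0xFF, 0xFF, 0xFF, 0xFF, 0x00 };

        Socket sock = new Socket(AddressFamily.InterNetwork,
                                 SocketType.Dgram, ProtocolType.Udp);
        IPEndPoint epServer =
            new IPEndPoint(IPAddress.Parse("192.168.0.3"), 27015);
        sock.Connect(epServer);
        sock.SetSocketOption(SocketOptionLevel.Socket,
                             SocketOptionName.ReceiveTimeout, 25);

        // Wire bytes: ff ff ff ff 00
        sock.Send(cmdInfo);
    }
}
```

If you must keep the data in a string, an 8-bit single-byte encoding such as Latin-1 (`Encoding.GetEncoding(28591)`) maps \x00-\xff one-to-one, but a plain `byte[]` is the more direct fit for a binary protocol.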