Hi all,
I'm having an unusual problem with a C# console application. I've piped the standard output of a legacy, unmanaged application into my app's standard input, and I'm reading the data from the old application with Console.Read().
The issue is that Console.Read() appears to be changing the integer value of the incoming characters! I think it's translating between the "extended ASCII" code page the other program writes and Unicode: the bytes really do arrive one at a time, but Console.Read() maps some values above 127 to much larger Unicode code points (156 becomes 339, for example). The incoming data isn't just text, either; it includes binary information (images, etc.), so this is a real problem.
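Incidentally, the numbers line up with the Windows-1252 code page: byte 0x9C (156) is 'œ' there, and 'œ' is U+0153 (decimal 339) in Unicode, which would explain the conversion I'm seeing. A quick sketch that reproduces the mapping (assuming Encoding.GetEncoding(1252) is available, as it is on .NET Framework):

```csharp
using System;
using System.Text;

class Cp1252Demo
{
    static void Main()
    {
        // Byte 156 (0x9C) in the Windows-1252 code page is 'œ',
        // which is U+0153 (decimal 339) in Unicode.
        Encoding cp1252 = Encoding.GetEncoding(1252);
        char c = cp1252.GetChars(new byte[] { 156 })[0];
        Console.WriteLine((int)c); // prints 339
    }
}
```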
Having given up on making Console.Read() work, I'm trying to use ReadConsole() imported directly from the unmanaged Win32 API. Its prototype looks like this:
Code:
BOOL ReadConsole(
HANDLE hConsoleInput, // handle of a console input buffer
LPVOID lpBuffer, // address of buffer to receive data
DWORD nNumberOfCharsToRead, // number of characters to read
LPDWORD lpNumberOfCharsRead, // address of number of characters read
LPVOID lpReserved // reserved
);
Here's what I've tried that isn't working:
Code:
[DllImport("kernel32", SetLastError=true)]
static extern bool ReadConsole(
    IntPtr hConsoleInput,        // console input handle (matches GetStdHandle's IntPtr return)
    ref int lpBuffer,            // receives the data read
    int nNumberOfCharsToRead,    // number of characters to read
    ref int lpNumberOfCharsRead, // receives the number of characters read
    IntPtr lpReserved);          // reserved; must be IntPtr.Zero
[DllImport("kernel32", SetLastError=true)]
static extern System.IntPtr GetStdHandle(int nStdHandle);
...
int c = 0; // Assigned to ensure the object is really created.
int l = 0;
System.IntPtr h;
h = GetStdHandle(-10); // STD_INPUT_HANDLE is -10 (3 is not stdin).
ReadConsole(h, ref c, 1, ref l, System.IntPtr.Zero);
...
I've also P/Invoked a Win32 SetConsoleMode() the same way, and that one was successful: it affects the framework's built-in Console.Read() as well (I used it to remove buffering).
This is a pretty odd and long question, so I know it's a long shot, but I thought I'd ask since I've run out of ideas. Thanks a million to anyone who gives it a shot!
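Edit: A couple of fallback ideas I'm weighing, in case they help frame the question. I've read that ReadConsole() requires an actual console input handle and can fail when stdin is redirected from a pipe, which may be what's biting me here. The other idea is to skip Console.Read() entirely and pull raw bytes from the underlying stream via Console.OpenStandardInput(), which (as far as I can tell) does no code-page translation at all. A sketch of that approach:

```csharp
using System;
using System.IO;

class RawStdin
{
    // Reads every byte from a stream with no text decoding whatsoever.
    public static byte[] ReadAllRaw(Stream input)
    {
        using (var buffer = new MemoryStream())
        {
            int b;
            while ((b = input.ReadByte()) != -1)
                buffer.WriteByte((byte)b);
            return buffer.ToArray();
        }
    }

    static void Main()
    {
        // Console.OpenStandardInput() exposes the untranslated stdin
        // stream, so each byte (0-255) arrives exactly as the legacy
        // program wrote it -- byte 156 stays 156.
        byte[] data = ReadAllRaw(Console.OpenStandardInput());
        Console.Error.WriteLine(data.Length + " bytes read");
    }
}
```

That would sidestep the Unicode conversion entirely, since no char type is ever involved.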