Guest_imported
I'm adapting a pretty old program to run in a session-oriented Client-Server mode over the Internet. A typical Client-Server session may last an hour or so. At the beginning of a session, the Client logs on and the server uses ShellExecute to start an instance of our backend process for that session, assigning it a unique session ID. The backend process stays alive for the duration of the session. The assigned session ID is included in every subsequent request from the Client, and the server uses the ID to route the request to the associated instance of the backend process.

The server will reside on an NT 4.0 machine. Assume we can get a state-of-the-art machine with as much disk and RAM as we need.

So does anyone have any clues as to how many concurrent copies of the backend process we'll be able to start and run before the system starts thrashing itself to death or NT runs into some internal limitation? The actual processing done by the backend is minuscule, and requests are generated in response to user actions at the screen, with a lot of think-time between them. Thus, the real processing load is going to be in the overhead of managing hundreds (thousands? more?) of concurrent instances of the backend process.

Because this is an older program with 350,000 lines of code, a rewrite to use a more state-of-the-art approach is probably not in the cards.

Thanks.