lichtjiang
Programmer
I have a server written in Sun J2SE. It uses a Unix domain socket in the standard way: it listens for connection requests in the background all the time. After accepting a request, it creates an object that takes care of some job, passes it to a Thread object, and starts that thread:
while (true) {
    UnixSocket client;
    try {
        client = server.accept();
        // hand the new connection off to its own worker thread
        ClientHandler ch = new ClientHandler(client, this);
        Thread t = new Thread(ch);
        t.start();
    } catch (IOException e) {
        // currently ignored
    }
} // while
I have the following questions. Please help me. Thanks!
- How do I terminate this server gracefully on Unix/Linux? Right now I simply send it a TERM signal with "kill -TERM pid" (I didn't write any JNI to handle signals). See the first sketch after these questions for what I have in mind by "gracefully".
- How do I terminate each connection session created after t.start() and free its memory? This may be a stupid question. "ClientHandler" does all its work in "run()". After "run()" reaches its end point (either an early exit or the last statement finishing), the "ClientHandler" terminates. Is this correct? What bothers me is that the memory used by a connection session (an instance of "ClientHandler" and related objects) does not seem to be released. Before the server handled any connection, memory usage was about 25 MB. While it was running a "ClientHandler", usage was about 170 MB, and it stayed at roughly that number after the handler's "run()" finished. I also called "Runtime r = Runtime.getRuntime(); r.gc();" to request garbage collection at the end of "run()" and just above "client = server.accept()" (see the second sketch below), but it does not seem to help.
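To clarify what I mean by "gracefully" in the first question: one idea I have been considering (but have not actually wired up) is a JVM shutdown hook, so that the TERM signal gets a chance to stop the accept loop before the process dies. This is only a rough sketch of the idea; "server" is my listening socket object, and close() is assumed to be whatever its cleanup method is:

    // Sketch only: a hook the JVM runs when it receives SIGTERM.
    Runtime.getRuntime().addShutdownHook(new Thread() {
        public void run() {
            try {
                server.close();   // stop accepting new connections (assumed cleanup method)
            } catch (Exception e) {
                // nothing useful to do while shutting down
            }
        }
    });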
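To be concrete about the second question, this is roughly where the gc() requests sit (a simplified sketch; the real per-connection work done by "ClientHandler" is elided):

    // Inside ClientHandler, simplified.
    public void run() {
        // ... handle the client's requests on the "client" socket ...

        // explicit GC request at the end of run(), as described above
        Runtime r = Runtime.getRuntime();
        r.gc();
    }

    // And in the accept loop, just before blocking again:
    Runtime.getRuntime().gc();
    client = server.accept();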
Any ideas or suggestions? Thanks!