2005/09/28

[Troubleshooting] Too many open files

Q.

<2005/9/27 上午09時32分13秒 GMT+08:00> <Critical> <WebLogicServer> <BEA-000204>
<Failed to listen on port 7001, failure count: 2,204, failing for 31,398
seconds, java.net.SocketException: Too many open files>


A.
Broadly speaking, this problem can stem from OS settings, or from the application failing to close streams correctly when doing I/O, so that streams reading the file system stay open until the number of files the system allows is exhausted. Likewise, if the application opens sockets and never closes them properly, the same situation occurs. The statements that close these I/O streams and sockets must always go inside a finally block, to guarantee they are executed even when an exception is thrown.
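As a minimal sketch of the close-in-finally pattern described above (the file name and helper method are hypothetical, for illustration only):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class SafeRead {
    // Read the first byte of a file, releasing the file descriptor
    // even if read() throws. Hypothetical helper, for illustration.
    static int readFirstByte(String path) throws IOException {
        FileInputStream in = new FileInputStream(path);
        try {
            return in.read();
        } finally {
            in.close(); // runs on both the normal and the exception path
        }
    }

    public static void main(String[] args) throws IOException {
        // Create a temporary file to read from, closing it the same way.
        File tmp = File.createTempFile("demo", ".txt");
        FileOutputStream out = new FileOutputStream(tmp);
        try {
            out.write('A');
        } finally {
            out.close();
        }
        System.out.println(readFirstByte(tmp.getPath())); // prints 65 ('A')
        tmp.delete();
    }
}
```

Without the finally blocks, an exception between open and close would leak one descriptor per call, which is exactly how a long-running server drifts toward "Too many open files".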

On Solaris, we may raise the per-process file-descriptor limit by increasing the rlim_fd_max value in /etc/system (e.g. set rlim_fd_max=4096) and enlarging the default open-file number with ulimit -n 4096.
However, if we still find that the open-file limit for the java process hosting WebLogic Server cannot exceed 1024,
edit '${WL_HOME}/common/bin/commEnv.sh':
COMMENT OUT the last line (the resetFd call) and add a ulimit statement (e.g. ulimit -n 8192) to the startup scripts.
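Put together, the three changes above might look like this (a sketch assembled from the text; exact file contents and limits will vary by installation):

```shell
# /etc/system -- raise the hard per-process fd limit (reboot required)
set rlim_fd_max=4096

# In the WebLogic startup script, before the JVM is launched:
ulimit -n 8192

# ${WL_HOME}/common/bin/commEnv.sh -- comment out the final line:
# resetFd
```

The commEnv.sh edit matters because its last line resets the descriptor limit, silently undoing whatever ulimit value the startup script set.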
