running out of file descriptors

I am writing a multi-threaded application that services hundreds of
remote connections for data transfer.  Several instances of this
program run simultaneously.  The problem is that whenever the total
number of active user connections (the cumulative count of open
sockets across all process instances) reaches about 700, the system
appears to run out of file descriptors.  I have tried raising the
open-files limit both with "ulimit -n" and with the setrlimit()
facility; neither seems to help.  For now I have to limit the system
to 2 process instances allowing no more than 256 connections each.
In that configuration the server will run for days without failure
until I stop it.  If I add a third process or restart one of the
processes with a higher connection limit, bad things start happening
at about 700 open sockets.

Thanks in advance to anyone who can help.

-- 
Bryan
<><
--
To unsubscribe from this list: send the line "unsubscribe linux-c-programming" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
