Re: running out of file descriptors

On Sun, Feb 15, 2009 at 9:48 PM, Bryan Christ <bryan.christ@xxxxxxxxx> wrote:
> I am writing a multi-threaded application which services hundreds of
> remote connections for data transfer.  Several instances of this
> program are run simultaneously.  The problem is that whenever the
> total number of active user connections (the cumulative total of all
> open sockets across all process instances) reaches about 700, the
> system appears to run out of file descriptors.  I have tried raising
> the open files limit via "ulimit -n" and by using the setrlimit()
> facility.  Neither of these seems to help.  I am currently having to
> limit the number of processes running on the system to 2 instances,
> allowing no more than 256 connections each.
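
Just as background: setrlimit() can only raise the soft limit up to
the process's hard limit (rlim_max); asking for more fails with EPERM
for an unprivileged process, which can make it look like it "doesn't
help".  A minimal sketch of checking both limits and bumping the soft
limit up to the hard ceiling:

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;

    /* Read the current limits on open file descriptors. */
    if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }
    printf("soft: %llu  hard: %llu\n",
           (unsigned long long)rl.rlim_cur,
           (unsigned long long)rl.rlim_max);

    /* Raise the soft limit to the hard ceiling; going any higher
     * fails with EPERM unless the process is privileged. */
    rl.rlim_cur = rl.rlim_max;
    if (setrlimit(RLIMIT_NOFILE, &rl) != 0) {
        perror("setrlimit");
        return 1;
    }
    return 0;
}

If the hard limit itself is the cap, it has to be raised from outside
the process.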

Have you tried editing /etc/security/limits.conf (or the equivalent
file on your system) to increase the maximum number of open files?

Perhaps something like:

*              -       nofile         524288

is what you want?
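
Note that limits.conf is typically read by pam_limits at login, so the
new ceiling only applies to sessions started after the change; log in
again and check with "ulimit -Hn" before restarting the application.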

joe
--
To unsubscribe from this list: send the line "unsubscribe linux-c-programming" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
