> PG *will* close the connection immediately if it receives any
> notification of connection drop from the client. If it's not seeing
> one, that suggests something wrong in your network infrastructure.
> I'd suggest fixing the problem rather than kluging the symptom.

I have a W2K server which also runs Exchange and other applications. I haven't seen any problems with the other applications.

Every morning 8 persons from different locations over the internet each open a single TCP connection to this Postgres 8.1.3 server using ODBC. They close my application at midnight.

Every day some clients randomly receive an error from the ODBC driver:

SQL state: 08S01
Error number: 27
Message: "Error while executing the query; server closed the connection unexpectedly
This probably means the server terminated abnormally before or while processing the request."

In fewer cases the error message is "Could not send query to backend;Could not send query to backend".

In this case my application sends a disconnect command to the ODBC driver and re-connects. The re-connect is OK. After the first error I see that there are 2 Postgres processes running for that user.

Clients are using ADSL connections over phone lines maintained by the phone company, and I have no control over this.

Due to a design requirement my application checks for duplicate connections, and those duplicate connections prevent it from working.

Any idea what I should change?

> You can reduce the TCP timeout settings if you are using PG 8.1 and an
> operating system that supports it (I have no idea if Windows does or not).
> I wouldn't recommend trying to make it "immediate" since then any
> network instability breaks your application. 5 minutes or so might be
> reasonable though. As you've noticed, the default timeouts are usually
> upwards of an hour. (You should however ask yourself if you really know
> more about TCP than the authors of the TCP specifications do.)

My queries do not take more than 5 minutes. In case of a connection loss the application re-connects automatically.

What are reasonable values for the 3 TCP_KEEPALIVE_* parameters in this case?

Andrus.
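
P.S. Would something like the following in postgresql.conf be reasonable? The values below are only a guess on my part, and I am assuming the Windows build of 8.1 actually honors all three settings:

    # postgresql.conf -- illustrative values, not a recommendation
    tcp_keepalives_idle = 300       # seconds of idle time before the first keepalive probe (0 = OS default)
    tcp_keepalives_interval = 30    # seconds between unanswered probes (0 = OS default)
    tcp_keepalives_count = 5        # unanswered probes before the connection is considered dead (0 = OS default)

If I understand the mechanism correctly, with values like these a silently dropped client connection would be detected after roughly 300 + 5*30 = 450 seconds, i.e. about 7.5 minutes instead of the default of an hour or more.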