Hi all,

I'm working on a problem at the moment where I have some data that I need to get from a proprietary system into a web page. I was thinking of using PostgreSQL as a middleman to store the data, e.g.:

- C++ app reads data from the proprietary system and writes it into a temp table in PostgreSQL
- ASP.NET web service reads data from the temp table in PostgreSQL and generates HTML

I already have a database that I'm using for other parts of the web site, so I thought I'd just add an extra table that looks like this:

CREATE TABLE "DataExchange" (
    "DataExchangeID" serial NOT NULL PRIMARY KEY,
    "Name" text NOT NULL UNIQUE,
    "Value" integer NOT NULL,
    "UpdateTime" timestamp without time zone
);

This temp table will probably contain up to 10000 records, each of which could be changing every second (the data is coming from a real-time monitoring system). On top of this, I've then got the ASP.NET app reading the updated data values every second or so (the operators want to see the data as soon as it changes).

I was going to do some performance testing to see how well it would work, but thought I'd ask the question here first: I know that the number of records isn't a problem, but how about the frequency of updates/reads? Is 10000 updates/reads a second considered a lot in the PostgreSQL world, or will it handle it easily?

Regards,
Andrew

--
Sent via pgsql-general mailing list (pgsql-general@xxxxxxxxxxxxxx)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
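[Editorial note: for reference, a minimal sketch of the statements each side of the setup described above might issue against the "DataExchange" table. The parameter placeholders ($1, $2) and the upsert form are assumptions, not something from the original post, and INSERT ... ON CONFLICT needs PostgreSQL 9.5 or later; on older versions a plain UPDATE keyed on "Name" (with a separate INSERT for names not yet present) does the same job.]

-- Writer side (the C++ app): upsert one changed value per statement.
-- $1 = name of the data point, $2 = its new value.
-- ON CONFLICT relies on the UNIQUE constraint on "Name" in the table above.
INSERT INTO "DataExchange" ("Name", "Value", "UpdateTime")
VALUES ($1, $2, now())
ON CONFLICT ("Name")
DO UPDATE SET "Value" = EXCLUDED."Value",
              "UpdateTime" = now();

-- Reader side (the ASP.NET app): poll only the rows that changed since
-- the previous poll; $1 = timestamp recorded at the last poll.
SELECT "Name", "Value", "UpdateTime"
FROM "DataExchange"
WHERE "UpdateTime" > $1;

Grouping many such upserts into one transaction per second, rather than committing each row separately, is the usual way to keep commit overhead manageable at this kind of rate.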