On 01/06/2010 08:49 PM, Greg Smith wrote:
> Yan Cheng Cheok wrote:
>> The time taken to perform the measurement per unit is on the order of
>> ~30 milliseconds. We need to record the measurement result for every
>> single unit. Hence, the time taken to record the measurement result
>> must be far less than a millisecond, so that it has nearly zero impact
>> on the machine speed (if not, the machine needs to wait for the
>> database to finish writing before performing the measurement on the
>> next unit).
>
> Saving a piece of data to a hard disk permanently takes a few
> milliseconds. As pointed out already, exactly how many depends on the
> drive, but it's probably going to be 8ms or longer on your system.
> There are a few options here:
>
> 3) Write the data to a flat file. Periodically import the results into
> the database in a batch.
>
> If you're OK with the possibility of losing a measurement in the case
> of a system crash, then you should just write measurements to a series
> of flat files, then have another process altogether (one that isn't
> holding up the machine) load those files into the database. The fact
> that it

At my last company we built a system for use in semiconductor/flat-panel
display equipment and faced a very similar issue -- namely, we needed to
collect 40+ parameters at 6 kHz from one source, and another 200+ at
6 kHz from a second source (and then sync them so they could be properly
analyzed).

Our solution was similar to #3, except we didn't bother with the flat
file. We basically had a C++ process "catch" the incoming stream of data
into a memory buffer, and periodically bulk copy it into the database
using the libpq COPY functions (see:
http://www.postgresql.org/docs/8.4/interactive/libpq-copy.html). A rough
sketch is in the P.S. below.

HTH,

Joe
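
P.S. In case a concrete example helps, here is a minimal sketch of the
buffer-and-COPY idea, written in C++ against the libpq C API linked
above. The table name "measurement" and its two columns are made up for
illustration, and error handling is abbreviated. Link with -lpq.

#include <libpq-fe.h>
#include <cstdio>
#include <string>

// Stream one batch of buffered rows into the table with COPY.
// The rows are already in COPY text format: tab-separated columns,
// one newline-terminated line per row.
static bool flush_batch(PGconn *conn, const std::string &rows)
{
    PGresult *res =
        PQexec(conn, "COPY measurement (unit_id, value) FROM STDIN");
    if (PQresultStatus(res) != PGRES_COPY_IN) {
        std::fprintf(stderr, "COPY failed: %s", PQerrorMessage(conn));
        PQclear(res);
        return false;
    }
    PQclear(res);

    // One round trip pushes the whole batch, however many rows it holds.
    PQputCopyData(conn, rows.data(), static_cast<int>(rows.size()));
    PQputCopyEnd(conn, nullptr);   // nullptr = no error, finish the copy

    // COPY produces a final command result that must be consumed.
    bool ok = true;
    while ((res = PQgetResult(conn)) != nullptr) {
        if (PQresultStatus(res) != PGRES_COMMAND_OK)
            ok = false;
        PQclear(res);
    }
    return ok;
}

int main()
{
    PGconn *conn = PQconnectdb("dbname=test");
    if (PQstatus(conn) != CONNECTION_OK) {
        std::fprintf(stderr, "connect failed: %s", PQerrorMessage(conn));
        return 1;
    }

    // In the real system a collector thread appends rows to this buffer
    // at acquisition speed and flush_batch() runs periodically; here we
    // just fake two rows.
    std::string batch = "1\t0.123\n2\t0.456\n";
    flush_batch(conn, batch);

    PQfinish(conn);
    return 0;
}

The point is that the per-row cost on the collection side is just an
append to an in-memory buffer; the per-row disk latency is amortized
across the whole batch in a single COPY.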