Re: how to handle a big table for data log

On Tue, Jul 20, 2010 at 9:51 PM, kuopo <spkuo@xxxxxxxxxxxxxx> wrote:
Let me make my problem clearer. The requirement is to log data from a set of objects consistently. For example, an object may be a mobile phone that reports its location every 30 s. To record its historical trace, I create a table like:
CREATE TABLE log_table
(
  id         integer NOT NULL,                   -- object identifier (e.g. a particular phone)
  data_type  integer NOT NULL,                   -- which measurement this row holds
  data_value double precision,                   -- the measured value
  ts         timestamp with time zone NOT NULL,  -- when the value was reported
  CONSTRAINT log_table_pkey PRIMARY KEY (id, data_type, ts)
);
In my location log example, the field data_type could be longitude or latitude.
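
To make the row layout concrete, each 30 s report ends up as two rows, one per data_type. A minimal sketch (the id value 42 and the codes 1 = longitude, 2 = latitude are made up for illustration):

INSERT INTO log_table (id, data_type, data_value, ts) VALUES
  (42, 1, 121.5654, now()),  -- phone 42, longitude
  (42, 2,  25.0330, now());  -- phone 42, latitude

-- Reconstruct phone 42's historical trace, newest first.
SELECT ts, data_type, data_value
FROM   log_table
WHERE  id = 42
ORDER  BY ts DESC, data_type;

The composite primary key keeps duplicate reports for the same object, measurement, and timestamp out of the table.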


I saw GridSQL in action many moons ago, managing a massive database log table. From memory, the configuration was four database servers holding a cumulative 500M+ records, and queries were running in under 5 ms. It may be worth a look.

http://www.enterprisedb.com/community/projects/gridsql.do

Greg
