On Mon, Jan 04, 2010 at 05:12:56PM -0800, Yan Cheng Cheok wrote:
> Measurement table will have 24 * 50 million rows in 1 day
> Is it efficient to design that way?
>
> **I wish to have super fast write speed, and reasonable fast read speed from the database.**

When writing software there's (almost) always a trade-off between development time and resulting performance. If you want the best performance, I'd go for a table per "unit type", but this obviously requires more implementation effort to maintain all these tables.

The data rates you're talking about mean that you're going to have to put quite a bit of effort into performance; the simple EAV-style solution you suggested isn't going to scale very well. I'd guess you're talking about a minimum of 70GB of data per day for your initial suggestion, whereas a table per unit type would take that down to about 10% of this.

> Or shall I make use of PostgreSQL Array facility?

That may help a bit, but read performance is going to be pretty bad.

--
  Sam  http://samason.me.uk/
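
To make the comparison above concrete, here is a rough sketch of the three layouts being discussed. The table and column names are made up for illustration; the real columns would depend on your unit types.

    -- EAV-style layout from the question: one row per (sample, parameter)
    -- pair, so each measurement turns into many narrow rows
    CREATE TABLE measurement_eav (
        unit_id     integer     NOT NULL,
        captured_at timestamptz NOT NULL,
        parameter   text        NOT NULL,
        value       numeric
    );

    -- Table per unit type: one wide row per sample, with a fixed set
    -- of columns for that type's parameters
    CREATE TABLE measurement_flow_meter (
        unit_id     integer     NOT NULL,
        captured_at timestamptz NOT NULL,
        flow_rate   numeric,
        temperature numeric,
        pressure    numeric
    );

    -- Array variant from the question: all readings packed into one row,
    -- with the array position identifying the parameter
    CREATE TABLE measurement_array (
        unit_id     integer     NOT NULL,
        captured_at timestamptz NOT NULL,
        readings    numeric[]
    );

The per-type table records each parameter as a column in the catalogue rather than repeating its name and the per-row overhead on every reading, which is where most of the space and write-bandwidth saving comes from. The array form also avoids repeating parameter names, but individual values can no longer be indexed or filtered as ordinary columns, which is why read performance suffers.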