Ashley M. Kirchner wrote:
Someone's going to tell me to go buy a book, I just know it. I'll
ask anyway:
I'm starting to log weather data to a database and I'm trying to
figure out the best way to create the tables. Reports come in every
minute of every hour, 24 hours a day. Eventually, I'd like to do some
calculations on the statistics, displaying daily values (which can be
broken down to hourly), but also daily and monthly averages.
To me, it doesn't make sense to dump everything into one big table,
but I can't figure out the best way to break it down either. Keep in
mind that the only data I have is what comes in for that minute. The
daily averages I have to calculate myself (later). But I can't see one
large table being very efficient when it comes to calculating that
stuff.
So, how should I break the tables down? Create a new table every day
(20061219_data, 20061220_data, etc.) and insert all the values into it?
Or break it down per value (temp_table, humidity_table, etc.) and
insert daily data into them?
'Scuse me for re-opening an old thread, but apart from the other
suggestions you could use table partitioning.
If you're using mysql, see
http://dev.mysql.com/doc/refman/5.1/en/partitioning.html
Note that partitioning only arrived in version 5.1.
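For the record, a minimal sketch of what that looks like in MySQL 5.1,
assuming a hypothetical weather_log table with one row per minute (the
table and column names here are made up):

```sql
-- One row per minute; partitioned by month so old data can be
-- dropped or scanned a month at a time.
CREATE TABLE weather_log (
    reading_at  DATETIME NOT NULL,
    temp_c      DECIMAL(5,2),
    humidity    DECIMAL(5,2)
)
PARTITION BY RANGE (TO_DAYS(reading_at)) (
    PARTITION p200612 VALUES LESS THAN (TO_DAYS('2007-01-01')),
    PARTITION p200701 VALUES LESS THAN (TO_DAYS('2007-02-01')),
    PARTITION pmax    VALUES LESS THAN MAXVALUE
);
```

A nice side effect: expiring an old month is
ALTER TABLE weather_log DROP PARTITION p200612;
which is far cheaper than a big DELETE.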
If you're using postgresql, see
http://www.postgresql.org/docs/8.2/interactive/ddl-partitioning.html
It has been available there for a while now.
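In PostgreSQL (as of 8.2) there's no dedicated partitioning syntax;
it's done with table inheritance plus CHECK constraints. Roughly,
again with a hypothetical weather_log schema:

```sql
-- Parent table holds no data itself; one child table per month.
CREATE TABLE weather_log (
    reading_at  timestamp NOT NULL,
    temp_c      numeric(5,2),
    humidity    numeric(5,2)
);

CREATE TABLE weather_log_y2007m01 (
    CHECK (reading_at >= DATE '2007-01-01'
       AND reading_at <  DATE '2007-02-01')
) INHERITS (weather_log);

-- With constraint exclusion on, the planner skips child tables
-- whose CHECK constraint rules them out of a query's date range.
SET constraint_exclusion = on;
```

Inserts have to be routed to the right child table (via a rule, a
trigger, or the application itself); the docs linked above walk
through the options.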
Using something else? Err, find their docs :)
The links provided will explain things better than I can.
But if you're only getting 1 new record per minute, then I'd just chuck
it all in one table - as Jochem said, that's only roughly 500k rows per
year, not that many.
Indexing it might be a bit fiddly, but that depends on the queries
you're running.
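For completeness, the one-big-table approach is just this (hypothetical
names again; the index that comes with the primary key on the timestamp
is what keeps the date-range queries cheap):

```sql
CREATE TABLE weather_log (
    reading_at  DATETIME NOT NULL PRIMARY KEY,
    temp_c      DECIMAL(5,2),
    humidity    DECIMAL(5,2)
);

-- Daily averages for December 2006, computed on the fly:
SELECT DATE(reading_at) AS day,
       AVG(temp_c)      AS avg_temp,
       AVG(humidity)    AS avg_humidity
  FROM weather_log
 WHERE reading_at >= '2006-12-01'
   AND reading_at <  '2007-01-01'
 GROUP BY DATE(reading_at)
 ORDER BY day;
```

Monthly averages are the same query grouped by something like
DATE_FORMAT(reading_at, '%Y-%m'). (That's MySQL syntax; in PostgreSQL
you'd use date_trunc instead.)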
--
Postgresql & php tutorials
http://www.designmagick.com/
--
PHP General Mailing List (http://www.php.net/)