Juan,

* Juan Pereira (juankarlos.openggd@xxxxxxxxx) wrote:
> - The schema for this kind of data consists of several arguments -latitude,
> longitude, time, speed. etc-, none of them is a text field.

I would think you might want *some* text fields, for vehicle
identification, as a separate table about trucks.

> - The database also should create a table for every truck -around 100
> trucks-.

As mentioned elsewhere, you're probably fine with one table, but if it
becomes a problem you can always partition it up and have one view
across all of them (make sure to set up your constraints correctly and
enable constraint_exclusion if you go that route).  You could then
have, say, 10 tables, with 10 trucks in each (there is a rough sketch
of this at the end of this mail).

> - There won't be more than 86400 * 365 rows per table -one GPS position
> every second along one year-.

As mentioned, you might want to eliminate duplicate entries; there is
no sense in storing information that can be trivially derived.

> - There won't be more than 10 simultaneously read-only queries.

While this is good to know, I kind of doubt it's accurate, and more
important is the number of simultaneous writers.  I'm assuming 100,
but is that correct?

> The question is: Which DBMS do you think is the best for this kind of
> application? PostgreSQL or MySQL?

Given the list you posted to, you're likely to get a lot of PostgreSQL
recommendations, and presumably something similar from a MySQL list.
I would recommend that you not pick a solution based on the number of
responses you get, but rather on what you're most comfortable with and
understand best.  If there is a learning curve either way, I think
PostgreSQL would be the better choice.

If you're thinking about what to have your application support, you
might consider trying to support both.  Doing that from the beginning
is usually best, since you'll develop your system at a high enough
level to mitigate the differences (syntax, performance, etc.) between
the databases.

As an aside, and I don't know where the MySQL community is on this,
but we have the US Census TIGER Shapefile data set loaded into
PostgreSQL with PostGIS, along with a geocoder that works with it.
We should have a complete packaged solution for loading and indexing
it soon.  That's a fairly large, free data set of all streets,
addresses, etc. in the US, with lat/long information.

	Thanks,

		Stephen
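In case a concrete example helps, here is a rough sketch of the kind
of layout I had in mind, using inheritance-based partitioning with
CHECK constraints so constraint_exclusion can prune the partitions
that don't match.  All table and column names here are made up for
illustration; adjust them to fit your application.

    -- Per-vehicle information (the text fields) lives in its own table.
    CREATE TABLE trucks (
        truck_id    integer PRIMARY KEY,
        plate       text NOT NULL,
        description text
    );

    -- Parent table for the per-second GPS samples.
    CREATE TABLE positions (
        truck_id    integer NOT NULL REFERENCES trucks,
        recorded_at timestamptz NOT NULL,
        latitude    double precision NOT NULL,
        longitude   double precision NOT NULL,
        speed       real,
        PRIMARY KEY (truck_id, recorded_at)
    );

    -- Child partitions, e.g. 10 trucks per table.  The CHECK
    -- constraints are what lets constraint_exclusion skip partitions.
    CREATE TABLE positions_01
        (CHECK (truck_id BETWEEN  1 AND 10)) INHERITS (positions);
    CREATE TABLE positions_02
        (CHECK (truck_id BETWEEN 11 AND 20)) INHERITS (positions);
    -- ... and so on up to positions_10.

    SET constraint_exclusion = on;

    -- Selecting from the parent reads across all partitions, but with
    -- constraint_exclusion on, this only scans positions_01:
    SELECT latitude, longitude, speed
      FROM positions
     WHERE truck_id = 7
       AND recorded_at >= '2009-06-01'
       AND recorded_at <  '2009-06-02';

Two caveats with this approach: primary keys, indexes, and foreign
keys are not inherited by the child tables, so they need to be created
on each child as well, and new rows have to be routed to the correct
child either by the application or by a trigger on the parent table.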