On Nov 26, 2007 9:34 PM, Elvis Henríquez <henriquez.elvis@xxxxxxxxx> wrote:
> Hello everybody.
>
> Rather than asking for a technical detail or writing about a problem,
> I'm asking for up-to-date case studies involving PostgreSQL.
>
> The company where I'm currently working is migrating some apps; one of
> them requires a data warehouse, and I'm proposing PostgreSQL, but they're
> thinking of Oracle, as the system has one table (among others) with 20
> fields and more than 19 million records, and this exists for 2
> subcompanies, which are currently in separate databases, but the
> migration project involves joining the two subcompanies' data.
>
> This amount of data is the result of 8 years of usage, and the data
> growth rate has increased in the last two years.

That's actually pretty small.  Where I work we have a data warehouse of
similar design (a few large tables, a few small lookup tables).  It has
86,840,447 rows and takes up 44 Gigs of space.  It sits on a single-CPU
box with a 4-disk RAID-10 and runs queries covering anywhere from a few
minutes to a few days' worth of monitoring data.

Sequentially scanning the whole main table takes 621 seconds or so (10+
minutes).  We add 150 to 200k rows a day to it.  Selecting a day's worth
of data takes ~350 ms; a week's worth takes 2 to 10 seconds depending on
how much is cached.

This is a small database for either Oracle or PostgreSQL.  Talk your
bosses into giving PostgreSQL a try if you can.  You should be able to
build a 20-million-row test database in an afternoon or so, so it's not
like you'll be dedicating thousands of man-hours to testing it.
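If it helps get started, here's a rough sketch of how such a test database
could be generated with generate_series() and then timed with EXPLAIN
ANALYZE.  The table name, columns, and value distributions are all made up
for illustration; adjust them to mirror the real 20-column table:

    -- Hypothetical stand-in for the warehouse fact table.
    CREATE TABLE warehouse_test (
        id          bigserial PRIMARY KEY,
        recorded_at timestamptz NOT NULL,
        device_id   integer NOT NULL,
        reading     numeric
    );

    -- Generate ~20 million rows spread over 8 years of timestamps.
    INSERT INTO warehouse_test (recorded_at, device_id, reading)
    SELECT now() - (random() * interval '8 years'),
           (random() * 1000)::int,
           random() * 100
    FROM generate_series(1, 20000000);

    -- Index the timestamp so day/week range queries don't seq scan,
    -- then refresh planner statistics.
    CREATE INDEX warehouse_test_recorded_at_idx
        ON warehouse_test (recorded_at);
    ANALYZE warehouse_test;

    -- Time a "one day's worth of data" query.
    EXPLAIN ANALYZE
    SELECT count(*), avg(reading)
    FROM warehouse_test
    WHERE recorded_at >= now() - interval '1 day';

That should be enough to get ballpark numbers comparable to the ones above
on whatever hardware you have lying around.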