On Fri, 2006-07-07 at 22:41 +0000, lanczos@xxxxxxxxxx wrote:
> > [mailto:pgsql-general-owner@xxxxxxxxxxxxxx] On Behalf Of Adrian Klaver
> >
> > I guess the solution depends on what is a 'large amount of data'. The
> > most time-consuming part is going to be converting the single data
> > elements at the top of each sheet into multiple elements. I would
> > create columns for the data in the sheet. At the same time I would
> > order the columns to match the database schema. Then it would be a
> > matter of cut and paste to fill the columns with the data. The event
> > ids could be renumbered using Excel's series generator to create a
> > non-repeating set of ids. If the amount of data was very large it
> > might pay to create some macros to do the work. Once the data was
> > filled in you would have a couple of choices. One, as mentioned by
> > Ron, would be to use OpenOffice v2 to dump the data into the database.
> > The other would be to save the data as CSV and use the psql \copy
> > command to move the data into the table.
> >
> > On Friday 07 July 2006 09:40 am, Parang Saraf wrote:
>
> Everything you described is familiar to me, except the OpenOffice v2
> dump - could you explain it in more detail, please? I tried to do it
> many times, without success.
>
> Thank you
>
> Tomas

Does the "OpenOffice v2 dump" convert the date correctly when exporting
into PostgreSQL? A date in an .xls file is exported to CSV as a serial
number, which is tricky to convert back to a date. This is what I use:

date_pli('epoch'::date, date_num::integer - 25569) AS date_fmt

The number "25569" is a fudge factor; it can differ between dumps but
stays consistent within a single dump. I usually adjust it and compare
the result to the value shown in Excel until I get a match.
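
For anyone following the CSV + \copy route Adrian describes, here is a
minimal sketch of the load step (the table layout and file name are made
up for illustration; match the column order to your spreadsheet):

    -- Staging table whose column order matches the CSV exported from Excel
    CREATE TABLE raw_events (
        event_id integer,
        date_num numeric,   -- Excel date serial number, converted below
        amount   numeric
    );

    -- Run in psql; \copy reads the file on the client side, so no
    -- server filesystem access is needed
    \copy raw_events FROM 'events.csv' WITH CSV HEADER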
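
To put the conversion in context: 'epoch'::date is 1970-01-01, and 25569
is the Excel 1900-date-system serial number for that day, so the
subtraction turns the serial into days since the epoch. A sketch against
the hypothetical staging table above:

    -- date_pli() is the catalog function behind the date + integer
    -- operator, so the plain + form below is equivalent
    SELECT event_id,
           date_pli('epoch'::date, date_num::integer - 25569) AS date_fmt
    FROM raw_events;

    SELECT event_id,
           'epoch'::date + (date_num::integer - 25569) AS date_fmt
    FROM raw_events;

One likely reason the factor differs between dumps: workbooks saved with
Excel's 1904 date system (the old Mac default) need 24107 instead of
25569.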