Thank you for your response! I tested the split command and ran into another problem: each of the resulting files doesn't have the "COPY" header. As I said earlier, one table is more than 30GB and I need to import it into another server (Linux), but via DVD media, because there is no network connection. What is the best way to do this? Any suggestions?

If there is no other way, how can I put the "header" at the beginning of a file without opening it in an editor? With the "cat >>" command I can only append to the end. Could you help me?

Once again, thank you!

--- Scott Marlowe <scott.marlowe@xxxxxxxxx> wrote:

> On Dec 17, 2007 2:06 PM, A.Burbello
> <burbello3000@xxxxxxxxxxxx> wrote:
> > I consider this a good way to transport the
> > files, because I have a table that is more than
> > 30GB. But if the OS is Windows, I can't split the
> > files, because postgres doesn't have this
> > feature! It would be a good option if postgres
> > had it natively.
> >
> > Well, I will try this:
> > eg: $ pg_dump postgres -U postgres -f split.txt |
> > split --bytes=10m
>
> http://gnuwin32.sourceforge.net/packages/coreutils.htm
>
> for now. I used these back in the day (NT4.0 SP4 or
> so) and they worked a charm back then. Heck, even
> ln worked (in a manner of speaking) back then.
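
P.S. Here is a sketch of what I plan to try next, assuming GNU coreutils (split, cat) is available on both machines; the chunk size and the dump_part_ prefix are only examples. Note that the "-f split.txt" in my command quoted above sends the dump to a file, so nothing reaches the pipe; it has to be dropped for split to receive anything:

    $ pg_dump -U postgres postgres | split --bytes=4000m - dump_part_
      # writes dump_part_aa, dump_part_ab, ... of ~4GB each,
      # small enough to fit on a single-layer DVD

    # after copying the chunks to the Linux server:
    $ cat dump_part_* | psql -U postgres postgres
      # the shell glob sorts the suffixes aa, ab, ... back into
      # order, so psql sees one continuous dump stream

If the chunks are only ever concatenated back together like this, the missing "COPY" header in each individual file should not matter, since psql never sees the files one by one.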
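
And in case a header really does need to go at the beginning of a file, one way to do it without an editor is to write a new file and rename it (header.txt and dump_part_aa here are hypothetical names):

    $ cat header.txt dump_part_aa > dump_part_aa.new
    $ mv dump_part_aa.new dump_part_aa
      # cat concatenates in argument order, so the header
      # ends up in front; "cat >>" only ever appends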