Re: Insert 1 million data

Hi Sreejith,

On Tue, Dec 29, 2020 at 10:56 AM Sreejith P <sreejith@xxxxxxxxxxxxx> wrote:

Thanks Rohit.

 

After upgrading the volume, I am getting the following error. It is almost the same as the previous one.

 

We have increased the backup volume and run the job again. When I reach 900 thousand records, I get almost the same error again.

 

  • Do I need to turn off autovacuum?
  • Shall I increase maintenance_work_mem?

If you're tight on space, my recommendation would be to run the inserts in small batches (say 10,000 at a time); a rough sketch is below. Don't turn off autovacuum, ever :-)
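Something along these lines, untested and with the names made up: it assumes your rows sit in a hypothetical staging table "staging" with a unique, positive bigint "id" column, going into "target". COMMIT inside a DO block needs PostgreSQL 11+ and must be run outside an explicit transaction:

    -- Sketch only: "staging", "target" and "id" are assumptions.
    DO $$
    DECLARE
        last_id bigint := 0;   -- assumes ids start above 0
    BEGIN
        LOOP
            -- copy one 10,000-row slice, keyed on id rather than OFFSET
            WITH moved AS (
                INSERT INTO target
                SELECT *
                FROM staging
                WHERE id > last_id
                ORDER BY id
                LIMIT 10000
                RETURNING id
            )
            SELECT max(id) INTO last_id FROM moved;

            EXIT WHEN last_id IS NULL;  -- nothing left to copy
            COMMIT;                     -- one transaction per batch
        END LOOP;
    END $$;

Keying on id instead of OFFSET keeps each batch cheap, and committing between batches caps how much WAL and lock state any single transaction has to hold.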

That being said, if you're suffering this way while merely populating your database, my inclination would be to move it, along with its logs, to a disk with more space. Your server has no room to grow, and you'll run into more dramatic failures very quickly.
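If a second disk is an option, a tablespace is one way to shift the heavy table's storage onto it. A sketch, assuming a hypothetical mount point /mnt/bigdisk/pgdata that exists, is empty, and is owned by the postgres OS user; note this moves table data only, while the WAL stays in pg_wal under the data directory:

    -- Sketch only: the path and the table name "target" are assumptions.
    CREATE TABLESPACE bigdisk LOCATION '/mnt/bigdisk/pgdata';

    -- rewrites the whole table onto the new disk, holding an
    -- ACCESS EXCLUSIVE lock for the duration
    ALTER TABLE target SET TABLESPACE bigdisk;

Moving the WAL itself means relocating the whole data directory (or symlinking pg_wal), which has to be done with the server stopped.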

My two cents' worth...
--
Olivier Gautherot
 
