Re: problem with extraction of .tgz file on Redhat AS 3.0

Hi Donald,

Thanks for the reply. Does this mean that the .tgz will get corrupted if the datafile is in use?

Thx & Rgds
Girish

O'Neill, Donald (US - Deerfield) wrote:

Some Oracle process could still have some open files. I would do a
'fuser -am' on the partition just to make sure there are no open files
before tarring. You could also do a 'tar -cjvf', which uses bzip2 instead
of gzip.
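The check described above could be scripted along these lines. This is only a rough sketch: the function names and example paths are illustrative, not from the thread, and `fuser` option behavior varies slightly between systems (`-m` selects the filesystem check; the email's `-a` merely makes the report verbose).

```shell
#!/bin/sh
# no_open_files DIR: succeed only if no process has files open on the
# filesystem containing DIR. fuser -m exits 0 when it finds any users,
# so the result is negated.
no_open_files() {
    ! fuser -m "$1" >/dev/null 2>&1
}

# backup DIR ARCHIVE: refuse to archive while files are still open on
# DIR's filesystem, then write a bzip2-compressed tarball.
# -j selects bzip2 compression (use -z for gzip instead).
backup() {
    no_open_files "$1" || {
        echo "processes still have files open under $1; aborting" >&2
        return 1
    }
    tar -cjf "$2" -C "$(dirname "$1")" "$(basename "$1")"
}

# Example (hypothetical paths):
#   backup /u01/oradata /backup/oradata.tar.bz2
```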


-----Original Message-----
From: redhat-list-bounces@xxxxxxxxxx
[mailto:redhat-list-bounces@xxxxxxxxxx] On Behalf Of Girish N
Sent: Monday, December 20, 2004 1:56 AM
To: General Red Hat Linux discussion list
Subject: Re: problem with extraction of .tgz file on Redhat AS 3.0

Hi Linus,

Thanks again for the reply. As mentioned earlier, I will reschedule the .tgz dump to a local mount point and check again.

Thanks & Regards
Girish

C. Linus Hicks wrote:

On Mon, 2004-12-20 at 11:07 +0530, Girish N wrote:

Hi Linus,

Thanks for the reply,

1. The datafile in question is only 1 GB.
2. This is a low-end server with 2 GB of memory, and the backup is scheduled to run at 4 AM, when there is no memory resource crunch.

The corruption seems to be very inconsistent: one day the .tgz is fine, while the next day the .tgz file gets corrupted. I am planning to reschedule the .tgz backup to one of the local mount points instead of the SAN hard disk and then check again.

You have commented "Not with the symptoms you have"; does that mean that this may be one of the reasons for the file corruption?


Having an inconsistent datafile will not cause the kind of corruption
you are getting in the tgz file. If you back up (by whatever means) a
datafile that is in an inconsistent state, then the result of restoring
that file will be a datafile in an inconsistent state, not a problem
with the restore. The reason tar complained was the gzip error: when
gzip hit the error, it was unable to continue ungzipping the file and
sent EOF to tar.

This means the error will be corruption either during gzip or while
writing to disk. That suggests a hardware problem, perhaps in memory or
in writing to the SAN. Trying a local disk rather than the SAN is a
good idea. You might also try running memtest on this machine. Having
no memory resource crunch at the time of the backup doesn't really mean
much, but I would expect other files to show the same symptom if memory
is the problem.

http://www.memtest86.com/
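Since the corruption only shows up at restore time, one way to catch it earlier is to verify each archive right after writing it. A small sketch of such a check (the function name is mine, not from the thread):

```shell
#!/bin/sh
# verify_tgz FILE: test a .tgz immediately after the backup writes it,
# so a bad write (memory, disk, SAN) is caught at backup time rather
# than at restore.
verify_tgz() {
    # gzip -t tests the integrity of the compressed stream without
    # extracting anything
    gzip -t "$1" 2>/dev/null || {
        echo "$1: gzip stream corrupt" >&2
        return 1
    }
    # tar -tzf walks the full member list; it fails on truncated or
    # damaged archive data
    tar -tzf "$1" >/dev/null 2>&1 || {
        echo "$1: tar structure damaged" >&2
        return 1
    }
    echo "$1: OK"
}
```

Running this from the same cron job that creates the archive would have flagged the bad .tgz the same night instead of on the next restore attempt.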


--
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list
