git reset --hard should not irretrievably destroy new files

If you `git add new_file; git reset --hard`, new_file is gone forever.
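
To see it without risking real work, here is a minimal reproduction in a
throwaway repo (the mktemp path is just for safety; any empty directory works):

cd "$(mktemp -d)" && git init -q
git commit -q --allow-empty -m 'initial'
echo data > new_file
git add new_file
git reset --hard    # new_file is removed from the working tree
ls new_file         # ls: cannot access 'new_file': No such file or directory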

This is totally what git says it will do on the box, but it caught me out.

It might seem a little less stupid if I explain what I was doing: I was
breaking apart a chunk of work into smaller changes:

git commit -a -m 'tmp'           # You feel pretty safe now, right?
git checkout -b backup/my-stuff  # Not necessary, just a convenience
git checkout -
git reset HEAD^                  # mixed reset: everything is unstaged but intact
git add new_file
git add -p                       # also not necessary, but distracting
git reset --hard                 # decided to copy from the backed-up diff instead
# boom. new_file is gone forever


Now, again, this is exactly what git says it's going to do, and what I
did was pretty stupid, but that file is gone for good, and it feels bad.

Everything that was committed is safe, and the other untracked files in
my local directory are also fine, but that particular file is
permanently destroyed. This is the first time I've lost something since I
discovered the reflog a year or two ago.
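
(One aside, in case it helps anyone who lands here in the same state:
`git add` writes the file's contents as a blob into .git/objects, and a
hard reset doesn't delete that blob. Until git gc prunes unreachable
objects, something like this may get the bytes back:

git fsck --lost-found      # writes dangling blobs into .git/lost-found/other/
ls .git/lost-found/other/  # each file here is a blob, named by its hash

The filename is lost either way, since only the blob was stored, but the
contents may survive.)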

The behaviour that would make the most sense to me (personally) would be
for a hard reset to unstage new files, but I'd be nearly as happy if a
commit were added to the reflog when the reset happens. I can probably
arrange something like that myself with an alias now that I've been
bitten; a rough sketch follows.
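
For the record, this is the kind of self-defence I mean, assuming a
hypothetical alias name "hreset" (everything else is stock git plumbing):
it snapshots the index as a commit on a backup ref before resetting, so
anything staged stays reachable.

# Hypothetical "hreset" alias: snapshot the index, then hard-reset.
# write-tree stores the staged state as a tree, commit-tree wraps it
# in a commit, and update-ref keeps that commit reachable.
git config alias.hreset '!t=$(git write-tree) && c=$(git commit-tree "$t" -p HEAD -m "pre-reset index snapshot") && git update-ref refs/backup/pre-reset "$c" && git reset --hard'

This only protects what is in the index (which is exactly what bit me,
since new_file was staged), and it records a plain ref rather than a
reflog entry, but after an accidental hreset,
git checkout refs/backup/pre-reset -- new_file would bring the file back.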

If there's support for this idea but no-one is keen to write the code, let
me know and I could have a crack at it.

Cheers,

Julian de Bhál



