On Mon, May 15, 2017 at 11:47:14PM -0400, Jeffrey Walton wrote:

> > The fact that you think it is expected is immaterial. Git doesn't
> > know (or care) how you made the files different from HEAD, so it
> > looks like a damage to it.
>
> 'git pull' fails and its expected, but 'git pull -f' is supposed to
> succeed. That's what -f is supposed to do.

Well, no. "pull -f" does something else, and is documented as such.

> Is there a way to add intelligence to Git so that it sees they are the
> _exact_ same file, and it stops bothering me with details of problems
> that don't exist?
>
> It seems like adding the intelligence is a good enhancement. A version
> control tool has to do three things: check-out, check-in, and
> determine differences. Its not doing a good job of determining
> differences considering they are the exact same file.

AFAICT there are basically two changes we could consider here:

  1. Some kind of --force option to git-merge and git-pull that just
     overwrites files, regardless of content. That's not much better
     than "git reset --hard && git merge", but I suppose it might save
     the state of files that wouldn't otherwise be affected by the
     merge. We already have something similar for "checkout -f".

  2. Right now the verify_uptodate() check happens deep in unpack-trees,
     which doesn't actually know what the merge result is going to be
     for that file. In some cases (like yours) the three-way result is
     trivial, but in others it requires doing an actual content-level
     merge. But in theory we could get the entire merge result and only
     then decide whether to write it in place (after comparing to the
     on-disk contents).

     I suspect that covering the latter would take some major surgery
     to the way that the merge code works. The trivial cases could
     probably be handled inside unpack-trees.

Neither seems totally unreasonable to me. But without working patches,
there's not much to discuss.

-Peff
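
For what it's worth, the "reset and merge" sequence that (1) would be
competing with looks roughly like this (a sketch only; "origin/master"
is just a stand-in for whatever you are actually merging):

  # discard _every_ local modification to tracked files, even in files
  # the merge would never touch; that bluntness is what a hypothetical
  # "merge --force" could avoid
  git fetch origin
  git reset --hard HEAD
  git merge origin/master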