Re: Query remote repository files, blobs

Shawn O. Pearce wrote:
Bill Lear <rael@xxxxxxxxxx> wrote:
On Saturday, December 19, 2009 at 12:02:02 (+0100) Johannes Schindelin writes:
On Sat, 19 Dec 2009, Shakthi Kannan wrote:
...
I am able to query for a list of remote heads and tags. I would like to
know if it is possible to query for information on remote files or
blobs?
This has been discussed a number of times, but we cannot allow that for security reasons. A blob might contain confidential information, in which case the branch has to be rewritten and force-pushed. However, that does not make the blob go away; it only makes it unreachable, until the next garbage collection kicks in (which you typically cannot control).
Hmm, I thought this had been addressed by git in a different way (removing
confidential information).  A company will not be satisfied that its
proprietary information is "unreachable" in your software repository.
They want absolute assurance that the information is completely
removed.

Have I remembered wrongly --- is this still not possible with git?

It's still possible, but you have to wipe out the reflog record(s)
that referenced the object, repack to evict it from the pack files,
and run `git prune --expire=0` to wipe out the object immediately.
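For concreteness, a rough sketch of that sequence (the object id is a
placeholder, and exactly which reflogs matter depends on which refs ever
pointed at the rewritten history):

	# drop reflog entries that may still reference the old commits
	git reflog expire --expire=now --all
	# repack everything reachable and delete the old packs
	git repack -a -d
	# remove any loose copy of the now-unreachable object right away
	git prune --expire=0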

We already support dumping back random commits via upload-archive, if
it's enabled in the daemon, and I think a lot of people do turn it on.
There is no validation that the requested tree-ish is reachable.
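For example, with upload-archive enabled on the daemon, something along
these lines hands back the tree of any commit whose id you happen to
know (host, project, and id are made up here):

	git archive --remote=git://example.com/project.git <commit-sha1> | tar -tv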

I think gitweb winds up doing the same thing: it doesn't actually
try to validate that the object is reachable, it just serves whatever
it was asked for, as long as it's present in the repository.
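For instance, a gitweb URL of this shape (hypothetical host and
project) returns the raw blob for a bare object id, reachable or not:

	http://example.com/gitweb.cgi?p=project.git;a=blob_plain;h=<blob-sha1>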


I'm getting some mild suggestions over here at $DAY_JOB to implement
shallow clones by lazily downloading large blobs on demand.
We've resisted doing this in git because of the reachability test
Dscho mentioned above... but many people skip that anyway due
to gitweb and upload-archive being enabled, which is making me
start to question who is broken... upload-pack, for not being more
willing to serve arbitrary content, or gitweb/upload-archive, for
not validating that their requests are reachable.
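For what it's worth, the reachability test those services would have to
run is roughly this (illustrative only, with a placeholder id; walking
every object reachable from every ref is the expensive part):

	# succeed only if <sha1> is reachable from some ref
	git rev-list --objects --all | cut -d' ' -f1 | grep -q <sha1>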

The security argument hasn't held for a while if any of the following are enabled on the server (a sketch of why follows the list):
	Gitweb
	http transport
	rsync transport
	ftp transport
	alternates (the alternate repositories are the ones at risk)
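As an illustration, over the dumb http transport any object can be
pulled straight out of the object database by id, with no reachability
check at all (hypothetical host and path; loose objects come back
zlib-deflated):

	curl http://example.com/project.git/objects/<first-2-hex>/<remaining-38-hex>
	curl http://example.com/project.git/objects/info/packs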
