I have noticed that when the term 'museum quality' is used in this type of discussion, it usually ends up meaning 'apply some unknown and unknowable criteria that purport to ensure the data will still be readable when the planet fries to a crisp as the sun expires'. Museums don't have any experience of keeping digital documents for a century. Most have people who have considered the problem, but so have we. This is not an area where it makes sense to defer to the opinions of unspecified experts in another field. It is one thing if we have someone from that community with ideas who can engage and assist; deferring to the opinions of unspecified experts is quite another, and deferring to those opinions as recounted second hand is worse still.

For example, an objection repeatedly raised against changing the RFC format was the fear that future generations might forget how to read HTML (or whatever). There are certainly some in the museum and archive community who raise that as a concern, but I don't think you will find anyone in that community who would say it is an insurmountable one. Right now there are people recovering the CEEFAX (and other teletext) data that was broadcast on the BBC and ITV from old VHS tapes. This is quite a feat, as the tapes were never designed to provide the necessary bandwidth, but people are doing reasonably well at it even so.

In short, just work out ways to preserve the bits. If you keep the bits, you can be reasonably confident that the technology will be there to read them. For IETF purposes, keeping the bits means making sure that all the data is backed up in multiple physical locations, in ways that make it highly unlikely that a common failure (or attack) would compromise all of them. Put in those terms, this looks like a networking problem to me.
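
To make that concrete, here is a minimal sketch of what 'keeping the bits' could look like in practice: periodically fetch each replica, hash it, and flag any location whose copy has diverged from the majority. The mirror URLs and document name are hypothetical placeholders, not real IETF infrastructure.

    #!/usr/bin/env python3
    """Cross-check replicas of an archive file held at independent locations.

    Sketch only: the mirror URLs below are hypothetical placeholders.
    """
    import hashlib
    import urllib.request
    from collections import Counter

    # Hypothetical mirrors in separate physical locations / failure domains.
    MIRRORS = [
        "https://mirror-us.example.org/rfc-archive/rfc9999.txt",
        "https://mirror-eu.example.org/rfc-archive/rfc9999.txt",
        "https://mirror-ap.example.org/rfc-archive/rfc9999.txt",
    ]

    def sha256_of(url: str) -> str:
        """Download one replica and return its SHA-256 digest."""
        with urllib.request.urlopen(url, timeout=30) as resp:
            h = hashlib.sha256()
            for chunk in iter(lambda: resp.read(64 * 1024), b""):
                h.update(chunk)
        return h.hexdigest()

    def main() -> None:
        digests = {}
        for url in MIRRORS:
            try:
                digests[url] = sha256_of(url)
            except OSError as exc:  # unreachable mirror: report it, keep going
                print(f"UNREACHABLE {url}: {exc}")

        if not digests:
            print("No replica could be read -- the bits are at risk.")
            return

        # Treat the majority digest as the reference; dissenters need repair.
        reference, _ = Counter(digests.values()).most_common(1)[0]
        for url, digest in digests.items():
            status = "OK" if digest == reference else "DIVERGED"
            print(f"{status:10s} {url}  sha256={digest[:16]}...")

    if __name__ == "__main__":
        main()

Something like this would be run from each site on a schedule, so that no single location is responsible for noticing that another has drifted or gone dark; that is exactly the independence of failure modes the backup scheme is meant to provide.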