Re: Re: IETF Journal - November 2017

I think this demonstrates an instance of an important and widely felt
use case. It is also a problem that the IETF cannot solve on its own,
nor can any single institution or even any single country (I met a
traveler in an antique land...).

Back in the 1990s, disk space was expensive and archiving technology
was primitive. Today disk is cheap, but the ratio of signal to noise
is vastly lower. And it is not just digital media that are affected
this way: any picture from the 1800s is likely to be an important
historical document, but we don't need to keep every selfie taken with
a cell phone.

There are basically two approaches to preserving anything: one is to
have a single hypersecure vault and protect it; the second is to have
a vast number of copies and a social infrastructure that replaces
copies when some are lost.
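
To make the second model concrete, here is a toy sketch of the repair
loop such a federation would run (the names and data model are my own
invention, not any existing system): count the surviving copies of
each object and re-copy from any remaining holder whenever the count
drops below a target.

    import random

    # Toy data model: holders[obj] = set of node names holding a copy.
    holders: dict[str, set[str]] = {
        "doc-A": {"node1", "node2", "node3"},
        "doc-B": {"node2"},                       # nearly lost
    }
    NODES = {"node1", "node2", "node3", "node4", "node5"}
    TARGET = 3                                    # desired number of copies

    def repair(holders: dict[str, set[str]]) -> None:
        """Re-replicate any object with fewer than TARGET surviving copies."""
        for obj, nodes in holders.items():
            while len(nodes) < TARGET and nodes < NODES:
                src = next(iter(nodes))                     # any surviving copy
                dst = random.choice(sorted(NODES - nodes))  # a node lacking it
                nodes.add(dst)      # in reality: fetch from src, verify, store
                print(f"copy {obj}: {src} -> {dst}")

    repair(holders)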

Consider the Long Now and Danny Hillis's clock. Sure, you can build a
clock that will run for a thousand years (maybe). But the way we have
maintained an unbroken time sequence of increasing accuracy since the
1950s is through a federation of national labs. There is no single
source of time; there are many sources that produce COORDINATED time.


The problem is not really preserving the bits. The problem is that the
bits are liable to get lost. If you don't know you have it, you might
as well not have it. Atoms fall victim to the same problem.

What we need to solve this problem is some consortium of interests who
can provide a modest level of resources (a few machines, a few TB of
storage, some network capacity) and some sort of gating mechanism to
make sure that we are applying those resources to preserve as much
signal as possible.

The technical requirements are not great: Merkle Tree, HashGraph,
Wingardium Leviosa. The real obstacle is establishing the social
infrastructure to get the project to critical mass. If one person ran
a node, the project could die at any moment. If a dozen people ran
nodes, we would probably have a large enough critical mass for some
people to think it worthwhile taking a copy for their archives.
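
To show how little machinery is involved, here is a minimal Merkle
root computation in Python (a sketch with names of my own choosing):
each node publishes a single root hash, and anyone holding a copy of
the archive can recompute it to detect loss or tampering.

    import hashlib

    def h(data: bytes) -> bytes:
        """SHA-256, a typical choice of node hash."""
        return hashlib.sha256(data).digest()

    def merkle_root(leaves: list[bytes]) -> bytes:
        """Compute a Merkle root over a list of archived blobs."""
        if not leaves:
            return h(b"")
        level = [h(leaf) for leaf in leaves]
        while len(level) > 1:
            if len(level) % 2:        # duplicate the last node on odd levels
                level.append(level[-1])
            level = [h(level[i] + level[i + 1])
                     for i in range(0, len(level), 2)]
        return level[0]

    # Two archive nodes holding the same three documents agree on one root.
    docs = [b"rfc1.txt", b"selfie.jpg", b"ietf-journal-2017-11.pdf"]
    print(merkle_root(docs).hex())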

There are in fact many folk who are archiving parts of the Web; they
are just not coordinated, or even aware of each other.

The secondary set of issues has to do with copyright and IPR. But
those are fixable once you lay out the requirements.


I have been building something of the sort for managing Mathematical
Mesh profiles:

http://www.prismproof.org/Documents/draft-hallambaker-jbcd-container.html
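
Roughly in the spirit of that design (an illustrative sketch only, not
the actual JBCD frame format), an append-only container chains each
entry to the digest of its predecessor, so altering any earlier entry
breaks every digest after it:

    import hashlib, json

    def append_entry(log: list[dict], payload: str) -> None:
        """Append a payload, chained to the previous entry's digest.
        (Illustrative only; not the actual JBCD frame format.)"""
        entry = {"payload": payload,
                 "prev": log[-1]["digest"] if log else ""}
        entry["digest"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        log.append(entry)

    def verify(log: list[dict]) -> bool:
        """Recompute the chain; any altered entry breaks verification."""
        prev = ""
        for e in log:
            body = {"payload": e["payload"], "prev": e["prev"]}
            good = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["digest"] != good:
                return False
            prev = e["digest"]
        return True

    log: list[dict] = []
    append_entry(log, "profile-update-1")
    append_entry(log, "profile-update-2")
    assert verify(log)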




On Thu, Nov 9, 2017 at 7:59 PM, Zhenghaomian <zhenghaomian@xxxxxxxxxx> wrote:
> Personally, I have been collecting one physical copy at each IETF
> meeting since last year, so it would be ‘sudden’ if I lost this
> opportunity.
>
>
>
> Is it possible to reduce the number of paper copies instead of
> cancelling the publication outright? Thank you for considering this.
>
>
>
> Best wishes,
>
> Haomian
>
>
>
> From: ietf [mailto:ietf-bounces@xxxxxxxx] On Behalf Of Ted Lemon
> Sent: November 10, 2017 8:47
> To: Brian E Carpenter <brian.e.carpenter@xxxxxxxxx>
> Cc: IETF discussion list <ietf@xxxxxxxx>
> Subject: Re: IETF Journal - November 2017
>
>
>
> On Nov 10, 2017, at 3:19 AM, Brian E Carpenter <brian.e.carpenter@xxxxxxxxx>
> wrote:
>
> When the IETF Journal started, it took over as the medium for this
> requirement.
> Are we happy that this will become an on-line only publication?
>
>
>
> Yes.   Waste of paper otherwise.
>
>




