RE: Monopolization in the Internet

Vittorio said: making sure that new technologies do not assume or prompt a limited number of operators, and if possible include whatever it takes for people to implement them in free software, deploy their own new operations or even self-host their services as a user

This is all good, but it has been tried, and in many domains it has not stopped concentration (DNS, CDNs, DDoS protection, to name just the simplest).

Too many services have massive economies of scale in a ruthlessly competitive market, and those economies overwhelm the small improvements that can be made in openness.

J.

 

From: ietf <ietf-bounces@xxxxxxxx> On Behalf Of Vittorio Bertola
Sent: Monday, November 15, 2021 10:51 AM
To: Vasilenko Eduard <vasilenko.eduard@xxxxxxxxxx>; Stewart Bryant <stewart.bryant@xxxxxxxxx>
Cc: ietf@xxxxxxxx; iesg@xxxxxxxx
Subject: Re: Monopolization in the Internet

 

 

On 15/11/2021 10:29, Vasilenko Eduard <vasilenko.eduard@xxxxxxxxxx> wrote:

 

 

>Listening to Geoff Huston's well-thought-out predictions on the future of the Internet: https://youtu.be/cx2G5QxS9Eo

From 1:01:00.

 

IMHO, only a “stationary bandit” (I mean “Government”, by the definition from https://en.wikipedia.org/wiki/Mancur_Olson) could save the Internet from monopolization. Only a brute-force approach would work here, like https://en.wikipedia.org/wiki/Standard_Oil.

 

Geoff, good news: it is never too late for a government to fix this problem. They just need to spot it.

Of course, they would fix it in a way that not many would like.

Indeed, the last few years have seen a sharp turn in public Internet policies even in "Western" countries, with much stronger regulation in the making to address the centralization that has happened in the last 10-15 years and the resulting issues of economic power, taxation, competition, privacy, democracy and content moderation. This is also due to a change in the perceived threat model; while we used to focus only on attacks from within the network, now most of the problems come from the behaviour of the endpoints (not the human ones, the software ones).

 

How this should affect Internet standardization is a discussion that apparently still needs to be had. IMHO this community seems to be quite late in accepting, or even realizing, the problem compared to others. In the end, technical standardization alone cannot solve centralization issues, but it could at least make sure not to worsen them, e.g. by making sure that new technologies do not assume or prompt a limited number of operators, and if possible include whatever it takes for people to implement them in free software, deploy their own new operations or even self-host their services as a user.

 

    -- vb.

