Re: Suggestion: use static linking instead of dynamic linking just as how rust and golang do.




Hello,

I am not experienced in Haskell, but I wanted to point out that the
storage argument is not as simple as it seems.

Storage still matters in 2024: if you run an environment with a lot of
software using the same libraries, you do not want them duplicated over
and over again. But it is also bandwidth, which is not cheap; it costs
the mirrors a lot of money to provide fast links for users to download
binaries.

I can give three examples of programming languages which bundle or
statically link:

- Go: Go binaries tend to be relatively large once all the
  dependencies are compiled in.
- Rust: I wonder how many times rustls or tokio is duplicated on a
  system with a large amount of Rust software. Yes, I am aware Rust
  binaries are stripped down, but they still tend to be large bundles.
- Java: two words, "uber JAR".

Now, I believe this comes down to pragmatism, and you could argue that
it is more pragmatic to just statically link everything. But if package
maintainer(s) are willing to maintain Haskell libraries, surely it
would be less pragmatic to then go back on that and rewrite all the
packages to be statically linked.

Another concern, which affects Java and probably Rust and Go as well,
has no "good" solution: all these libraries have their own licenses,
and some of them, such as MIT, include a copyright notice which must
be credited. If each Haskell dependency is packaged independently, its
license is clearly listed and installed; but in a statically linked
package, you would surely need to go through each dependency, check
its license, and then add it to the license array.
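To make that bookkeeping concrete, here is a minimal Python sketch of
what a maintainer would have to automate: walking a vendored dependency
tree and mapping each bundled dependency to its license file. The
directory layout and file names are assumptions for illustration, not
any packaging tool's actual behaviour.

```python
# Minimal sketch: map each vendored dependency to its license file.
# Layout assumption: vendor_dir/<dependency>/<LICENSE-like file>.
import os

def collect_licenses(vendor_dir):
    """Walk a vendored source tree and map each dependency to its license file."""
    licenses = {}
    for dep in sorted(os.listdir(vendor_dir)):
        dep_dir = os.path.join(vendor_dir, dep)
        if not os.path.isdir(dep_dir):
            continue
        # Common license file names; a real tool would need a longer list.
        for name in ("LICENSE", "LICENSE.txt", "COPYING"):
            path = os.path.join(dep_dir, name)
            if os.path.isfile(path):
                licenses[dep] = path
                break
    return licenses
```

Every dependency missing from the result would need manual review,
which is exactly the per-package effort that independent packaging
already does once, centrally.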

I discussed this with Artafinde in the past regarding Java uber JARs,
after a developer raised the question that one of their programs
packaged in Arch Linux only listed its own license, and not the
licenses of the libraries it used. Artafinde made it clear that it is
upstream's responsibility to provide a file with the dependency
licenses.

I am aware Java is JIT compiled and thus linking does not strictly
apply, but it has the same issues as static linking, as it comes under
the umbrella of "dependency bundling".

Here's a big one: with dynamic linking, if the ABI remains the same
you don't need to rebuild; with static linking you MUST rebuild every
dependent package each time a dependency is bumped.
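The rebuild cost grows with the number of reverse dependencies. A toy
sketch of that fan-out (the package names and the graph are invented
for illustration): bumping one statically linked library forces a
rebuild of everything that links it, directly or transitively.

```python
# Toy sketch: with static linking, bumping one library forces a rebuild
# of every (transitive) reverse dependency. Names are invented.

def rebuild_set(depends_on, bumped):
    """Return every package that transitively statically links `bumped`."""
    to_rebuild = set()
    changed = True
    while changed:
        changed = False
        for pkg, deps in depends_on.items():
            if pkg in to_rebuild:
                continue
            # Rebuild if it links the bumped library, or links something
            # that itself must be rebuilt.
            if bumped in deps or deps & to_rebuild:
                to_rebuild.add(pkg)
                changed = True
    return to_rebuild

graph = {
    "pandoc":     {"aeson", "text"},
    "shellcheck": {"aeson"},
    "xmonad":     {"x11"},
    "aeson":      {"text"},
}
```

With dynamic linking and a stable ABI, bumping "text" in this invented
graph rebuilds one package; with static linking it rebuilds three.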

Another thing: Arch Linux is rolling release. One issue I saw when I
briefly packaged Rust code is that some developers never bump their
dependency versions; you could have a vulnerable version of rustls,
for example, which is never bumped and then needs patching. Wouldn't
Haskell have a similar issue? If upstream doesn't keep their
dependencies up to date, the Arch team would need a patchset to keep
the dependencies up to date, and then rebuild the software each time
there is an issue with a dependency. At that point they are tracking
the dependencies anyway, so why not just package them separately?
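Tracking pinned dependencies means roughly this kind of audit: scan the
versions pinned in a lock file against known-bad versions. This is a
rough Python sketch of the idea, not any real auditing tool; the lock
file contents and the advisory data below are made up.

```python
# Rough sketch: flag pinned dependencies in a Cargo.lock-style file whose
# exact version appears in an advisory list. All data below is made up.

def parse_lock(text):
    """Extract (name, version) pairs from [[package]] stanzas."""
    packages, name = [], None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith('name = "'):
            name = line.split('"')[1]
        elif line.startswith('version = "') and name:
            packages.append((name, line.split('"')[1]))
            name = None
    return packages

def vulnerable(packages, advisories):
    """Return pinned packages whose exact version is listed in an advisory."""
    return [(n, v) for n, v in packages if v in advisories.get(n, ())]

lock = '''
[[package]]
name = "rustls"
version = "0.20.0"

[[package]]
name = "tokio"
version = "1.35.1"
'''

advisories = {"rustls": {"0.20.0"}}  # made-up advisory data
```

If the distribution has to run something like this for every bundled
package anyway, it is already doing per-dependency tracking, which was
the point above.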

So is it truly more pragmatic or faster?

Not a staff member, just found this thread interesting. Sorry if I
derailed the original purpose with my own curiosity :)

Take care,
-- 
Polarian
GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@xxxxxxxxxxxx


