UNIX maximalism and minimalism

ckester
Nixers
The latest nixers newsletter linked to an article with the same title as this thread.

After reading it, I'm prompted to ask: is the trend to rewrite so many utilities in Rust or Go an example of minimalism, or of bloat?

I'm tempted to say the latter, because I don't care for the toolchains they bring along. Any time I look at the git repository for some tempting project and see references to cargo, I say no thank you. I dislike that almost as much as cmake.

(It's taken me years to grudgingly accept autotools. But it's still a strike against any program that uses them.)

And maybe I'm misunderstanding or overreacting, but golang apparently has a worrisome habit of pulling dependencies out of the cloud. Um, no, I'm not comfortable with that either.
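(To be fair, I gather you can force offline builds by vendoring. If I understand the tooling correctly, something like

    go mod vendor           # copy every dependency into ./vendor
    go build -mod=vendor    # build from the vendored copies only
    GOPROXY=off go build    # or refuse network fetches outright

keeps everything local. But the fact that reaching out to the network is the default still bothers me.)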

What do my fellow nixers think? Am I just a crotchety old C programmer who likes simple makefiles, everything statically linked, and binaries preferably under a megabyte after stripping (under 500K even better)?
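For illustration, here's roughly the kind of makefile I have in mind. A minimal sketch: the tool name is made up, and -static assumes a libc that ships usable static libraries (musl is friendlier here than glibc):

    CC      = cc
    CFLAGS  = -O2 -Wall
    LDFLAGS = -static

    mytool: mytool.c
            $(CC) $(CFLAGS) $(LDFLAGS) -o mytool mytool.c
            strip mytool

    clean:
            rm -f mytool

Nothing to configure, nothing to fetch, and the result is one self-contained file.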
venam
Administrators
For me, that particular article sat somewhere between satire and seriousness, but it was still a fun read.

(21-02-2025, 04:41 PM)ckester Wrote: Is the trend to rewrite so many utilities ... minimalism, or of bloat?
As with most things, I guess it depends on what people mean by minimalism and bloat. For me, minimalism means that when I look at a process tree I'm able to know everything that's in there. But it also means not having to jump through hoops and juggle five balls just to get the thing running; I'd rather have a solid, straightforward solution.

That also doesn't mean I dismiss trying new stuff; on the contrary, there are a few of the new utilities I enjoy very much, like jq, fzf, and ripgrep.

(21-02-2025, 04:41 PM)ckester Wrote: Any time I look at the git repository for some tempting project and see references to cargo, I say no thank you.
(21-02-2025, 04:41 PM)ckester Wrote: preferably less than a megabyte after stripping (less than 500K even better)?
I have a similar issue with projects that have gigantic build systems, or where the final binary is 100+ MB. That's too much of a bubble: the devs live in a world of fast fiber internet and laptop-of-the-year hardware, which just isn't the case in most places. It hinders the adoption of such tools, and it pushes them in a direction where only the people who can install them can actually contribute and propose ideas on how to improve them.

Is that "bloat"? I'm not sure; I think it depends on how many other tools you already have on your system that depend on that build environment. But if compiling a single binary requires you to pull 1 GB of dependencies and build system, I'd say that in that case the size can be part of "bloat" too.

I'm also interested in others' opinions.
jkl
Long time nixers
Most software can be compiled locally, and I usually recommend that my users build my tools themselves instead of just fetching a precompiled binary. Some of it depends on the runtime: I've been doing a lot in Common Lisp recently, and Common Lisp is not exactly lightweight when it comes to built binaries, because each one contains a whole Common Lisp environment, even when compressed.

But they're still static binaries, which is good. YMMV.
ckester
Nixers
An often overlooked virtue of small statically-linked binaries is that they can be used in a shell pipeline or script without unacceptably degrading performance due to the time needed to load them.

Larger statically-linked binaries obviously take more time to load.

Those using shared libraries or runtime environments are even worse, unless other programs using the same libraries or environment have already been loaded into memory. That caveat is one of the reasons for my longstanding complaint about what I call "vanity libraries": libraries whose authors distribute them as shared libs, but which are only ever used by those authors' own programs, and almost never at the same time. In other words, they're not really "shared" at all; they just waste time locating the library, loading it, and resolving the references to the code therein.
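(If you're curious what that costs, glibc's dynamic loader can report its own overhead. Running a program as

    LD_DEBUG=statistics ./someprogram

prints the startup and relocation time spent in the loader; "someprogram" is just a stand-in for whatever you want to measure.)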

Sorry for the rant!

EDIT: yes, I know there are ways to map files into memory without actually copying the data until it's needed. But I think my point still stands, because the larger the code, the more likely a cache miss becomes. (Especially if the code is poorly organized with respect to locality.)
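For anyone unfamiliar with what I mean: here's a minimal C sketch of demand paging via mmap(2). The file name is made up; the point is that mmap itself copies nothing, and the kernel reads a page from disk only when it's first touched:

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = open("somebinary", O_RDONLY);  /* hypothetical file */
        struct stat st;
        if (fd < 0 || fstat(fd, &st) < 0)
            return 1;

        /* Set up the mapping; no file data is copied here. */
        unsigned char *p = mmap(NULL, st.st_size, PROT_READ,
                                MAP_PRIVATE, fd, 0);
        if (p == MAP_FAILED)
            return 1;

        /* Touching this byte faults in exactly one page. */
        printf("first byte: 0x%02x\n", p[0]);

        munmap(p, st.st_size);
        close(fd);
        return 0;
    }

Lazy loading helps with startup, but executing the code still has to pull it through the caches, which is where size and locality bite.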