Yeeeeaaaah, HN, Reddit, and endless IRC channels, Discord servers, and forums have utterly ripped this article apart… here’s a quick summary:
In 2022, a standard library should at least contain the following packages:
Dear God, no! In short: putting those things in the standard library means you can no longer iterate on them; if you care about backwards compatibility at all, the API is then stuck forever. On top of that, no single implementation of those is acceptable for all purposes. It’s utterly inane to even suggest.
I don’t like centralized package repositories. They add complexity and obfuscation, all while supply chain attacks are increasing.
A useful ecosystem is, well, useful. Besides, you are not beholden to crates.io at all: you can run your own package server (and even mirror things into it statically from crates.io or elsewhere), or vendor dependencies into your repo (cargo can do that for you), among other options. But having a good default is good.
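For example, stock cargo can copy every dependency into the repository with `cargo vendor`, which prints a source-replacement config along these lines (a sketch; the exact output can vary by version):

```toml
# .cargo/config.toml -- tell cargo to use the vendored copies instead of crates.io
[source.crates-io]
replace-with = "vendored-sources"

[source.vendored-sources]
directory = "vendor"
```

After that, builds read everything from the `vendor/` directory, so nothing depends on crates.io being reachable.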
Forcing people to, as the article says, follow the Go model (centralized discovery but decentralized distribution) is horrible from a maintenance standpoint: a random git repo you depend on can suddenly vanish, there’s no guarantee of long-term availability, and so on. And that’s not even mentioning the other horrors of the Go model, which the article doesn’t even scratch the surface of.
Due to how modules and packages work in Rust, I create dependency cycles more often than with some other languages.
…it’s not possible to have dependency cycles between crates in Rust (cargo rejects them outright), soooo… wut?
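To be precise: cargo refuses cyclic dependencies between crates at resolution time, while *modules inside one crate* can reference each other freely, so the complaint likely conflates the two. A minimal sketch with two hypothetical sibling modules:

```rust
// Two sibling modules that call into each other -- perfectly legal within one crate.
mod geometry {
    pub fn area(w: u32, h: u32) -> u32 {
        crate::checks::nonzero(w) * crate::checks::nonzero(h)
    }
}

mod checks {
    pub fn nonzero(n: u32) -> u32 {
        // References the sibling module right back -- still fine.
        if n == 0 { crate::geometry::area(1, 1) } else { n }
    }
}

fn main() {
    println!("{}", geometry::area(3, 4)); // prints 12
}
```

What you cannot do is add `crate_a` to `crate_b`’s `[dependencies]` and vice versa; cargo reports the cycle and stops.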
I think that Go got it right: modules are scoped by folder instead of files.
Yeah… Rust modules are scoped by file, so you actually know which file the thing you’re using lives in, without needing to jump all over the place.
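A minimal sketch of that mapping (the `settings` module is hypothetical, and it is written inline only so the example fits in one file):

```rust
// In a real project `settings` would live in src/settings.rs, pulled in from
// src/main.rs with `mod settings;` -- the item path mirrors the file path.
mod settings {
    pub fn load() -> &'static str {
        "verbose=false"
    }
}

fn main() {
    // Reading the path `settings::load` tells you exactly which file to open.
    println!("{}", settings::load());
}
```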
Again, I think that Go nailed it: Using the case (lowercase for private, uppercase for public) of the first letter of the identifier is perfect for lazy developers like me.
Another good freaking God, no. It doesn’t matter whether Erlang does it or Go does it: encoding visibility in the case of an identifier’s first letter is stupid, just outright. For one thing, in Go something is either public or private; there are no other visibility modifiers (of which Rust has many).
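A quick sketch of Rust’s visibility levels, using hypothetical `api`/`handlers` modules:

```rust
mod api {
    pub mod handlers {
        pub fn status() -> &'static str { "ok" }            // visible everywhere
        pub(crate) fn metrics() -> &'static str { tag() }   // this crate only
        pub(super) fn reset() -> &'static str { "reset" }   // `api` and below only
        fn tag() -> &'static str { "42" }                   // this module only
    }

    pub fn admin_reset() -> &'static str {
        handlers::reset() // fine: `api` is the parent of `handlers`
    }
}

fn main() {
    println!("{}", api::handlers::status());  // `pub`
    println!("{}", api::handlers::metrics()); // `pub(crate)`: same crate, so fine
    println!("{}", api::admin_reset());       // `reset` only reachable through `api`
    // api::handlers::reset(); // would not compile: not visible outside `api`
    // api::handlers::tag();   // would not compile: private to `handlers`
}
```

None of those distinctions can be expressed by upper-casing a letter.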
Yet, I don’t think that lexical lifetimes are the answer.
I’m far from an expert in this field; still, from a programmer’s perspective, I would love to see a mix of compile-time lifetime analysis, Automatic Reference Counting (like in the Lobster programming language), and manual memory management (marked as unsafe) when extreme performance is needed. How to solve leaks due to cycles? I’m not sure. But I’m sure that I no longer want to see lifetime annotations pollute our code ever
At least the person admits they’re not an expert here, because it’s obvious they’ve never had to do long-lived system maintenance. They also make the baffling assumption, common among people who don’t know better, that ownership is for handling memory. It’s not: it’s for handling all resources. Yes, that includes memory, but memory is a surprisingly small part of what it manages. In Go you still have to manage all those other resources manually; the only thing Go handles for you is memory, via a (very poor) GC, and no other resource.
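A minimal sketch of ownership managing a non-memory resource (the `Connection` type is hypothetical; its `Drop` stands in for closing a socket, releasing a lock, flushing a file, and so on):

```rust
use std::cell::RefCell;

struct Connection<'a> {
    name: &'static str,
    log: &'a RefCell<Vec<String>>, // records teardown so we can observe drop order
}

impl Drop for Connection<'_> {
    fn drop(&mut self) {
        // Real code would close a socket / release a lock here -- deterministically.
        self.log.borrow_mut().push(format!("closing {}", self.name));
    }
}

fn run() -> Vec<String> {
    let log = RefCell::new(Vec::new());
    let db = Connection { name: "db", log: &log };
    {
        let _cache = Connection { name: "cache", log: &log };
    } // `_cache` leaves scope: its resource is released right here, no GC involved
    log.borrow_mut().push("still working".into());
    drop(db); // release `db` explicitly (it would also drop at end of scope)
    log.into_inner()
}

fn main() {
    for line in run() {
        println!("{line}"); // closing cache / still working / closing db
    }
}
```

The compiler guarantees the teardown runs exactly once, at a known point, for any resource, which is the part a GC does not give you.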
Thus features only accumulate, and complexity compounds over time.
Which is why Rust ships a new edition roughly every three years that can deprecate and change things in backwards-incompatible ways, with an automatic migration tool (`cargo fix --edition`) included with the toolchain.
Related to features bloat: Who is in charge of refusing new features added to the language to avoid its collapse?
That would indeed be Rust’s governance teams and the RFC process, which are among the best-organized of any language anywhere, with a very well-defined process.
Among lots of other… interesting things people have said. It is exceedingly obvious that this article was written by someone who has not done long-term maintenance of backend services in industry.