My friend from university sends me his Rust code snippets sometimes. Ngl it looks like a pretty cool language.
There was also that tldr reimplementation in Rust that is a gazillion times faster than the original.
I really want to give it a try but I have executive dysfunction and don’t have any ideas of what I could use it for.
The main issue I have with Rust is the lack of a Rust ABI for shared libraries, which makes big dependencies shitty to work with. Another is that a lot of the big, nearly ubiquitous libraries don’t have great documentation; what gets put up on crates.io is insufficient to quickly get an understanding of the library. It’d also be nice if the error messages coming out of rust-analyzer were as verbose as what the compiler gives you. Other than that it’s a really interesting language with a lot of great ideas. The iterator paradigm is really convenient, and the way enums work leads to really expressive code.
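(For anyone who hasn’t written Rust, the enum + iterator combination being praised looks roughly like this; Shape and total_area are made-up names, purely for illustration.)

```rust
// Shape and total_area are made-up names for illustration only.
enum Shape {
    Circle { radius: f64 },
    Rect { width: f64, height: f64 },
}

fn total_area(shapes: &[Shape]) -> f64 {
    shapes
        .iter()
        .map(|shape| match shape {
            // The match is exhaustive: add a new variant and the compiler
            // points at every place that needs updating.
            Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
            Shape::Rect { width, height } => width * height,
        })
        .sum()
}

fn main() {
    let shapes = vec![
        Shape::Circle { radius: 1.0 },
        Shape::Rect { width: 2.0, height: 3.0 },
    ];
    println!("total area: {}", total_area(&shapes));
}
```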
Documentation is generally considered one of the stronger points of Rust libraries. Crates.io is not a documentation site; you want https://docs.rs/ for that, though it’s generally linked from crates.io. A lot of bigger crates also have their own online books for the more in-depth stuff. It’s not that common to find a larger crate with bad documentation.
One specific example I encountered was ndarray. I couldn’t figure out how to make a function take an array and an array slice without rewriting the function for both types. This could be because I’m a novice with the language, but it didn’t seem obvious. I ended up giving up after digging through the docs for a few hours and went back to C++.
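For reference, the usual way to handle this in ndarray (as far as I understand its docs) is to make the function generic over the storage type, since owned arrays and views/slices are all ArrayBase with different storage parameters. A minimal sketch, with sum_elements as a made-up example function and ndarray added as a dependency:

```rust
use ndarray::{array, ArrayBase, Data, Ix1};

// Works for Array1<f64>, ArrayView1<f64>, and slices of either,
// because all of them are ArrayBase<S, Ix1> for some storage type S.
fn sum_elements<S>(a: &ArrayBase<S, Ix1>) -> f64
where
    S: Data<Elem = f64>,
{
    a.sum()
}

fn main() {
    let owned = array![1.0, 2.0, 3.0];
    let view = owned.view(); // an ArrayView1<f64>, i.e. a non-owning slice
    assert_eq!(sum_elements(&owned), 6.0);
    assert_eq!(sum_elements(&view), 6.0);
}
```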
As someone who has worked in software for 30 years, deploying complicated software, shared libraries are a mistake. You think you get the benefit of size and easy security upgrades, but due to deployment hell you end up using Docker, and now your deployment has actually added a whole OS in size and you need to do security upgrades for that OS instead of just your application. I use Rust for some software now, and I build it with musl, and I’m struck by how small things get relative to the regular deployment; it feels like magic that I no longer get glibc incompatibility issues.
Maybe tackle that deployment hell instead of band-aiding it with docker?
He is. By using statically linked binaries.
Technically this is conflating two things: bundling dependencies and static/dynamic linking. But since you have to bundle your dependencies to use static linking, and there’s little point in dynamic linking if you bundle your dependencies… most of the time they are synonymous.
Exceptions are things like plugins, but that’s pretty rare.
Maybe for your use cases that’s OK, but there are many situations where the size and ease of upgrading provided by shared libraries is worthwhile. For example it would suck to need to push a 40+ GB binary to a fleet of systems with a poor or unreliable internet connection. You could try to mitigate this sort of thing by splitting the application up into microservices, but that adds complexity, and isn’t always a viable tradeoff if maximizing compute efficiency is also a concern.
I’m not so sure that dynamic libraries always reduce the size. Especially with libraries that are linked by a single binary.
With static libraries, you can conditionally compile only the features you’re gonna use. With dynamic libraries, however, the whole library must be compiled.
EDIT: just to clarify, I’m not saying that static libraries always result in less size. I’m saying that it’s not a black and white issue.
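To make the conditional-compilation point concrete: code behind a disabled cargo feature is simply never compiled, so it can’t add to a statically linked binary. A rough sketch, where "pretty" is a hypothetical feature name you’d declare under [features] in Cargo.toml:

```rust
// "pretty" is a hypothetical feature declared in Cargo.toml.
// When the feature is off, this function (and anything it pulls in)
// is not compiled, so it never ends up in the statically linked binary.
#[cfg(feature = "pretty")]
pub fn render(items: &[String]) -> String {
    items.join(",\n    ")
}

// Fallback used when the feature is disabled.
#[cfg(not(feature = "pretty"))]
pub fn render(items: &[String]) -> String {
    items.join(",")
}
```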
Why not just use the C ABI?
And what libraries are you referring to? Almost all the ones I’ve used have fantastic docs.
In my understanding, you can’t interface with the C ABI without using an unsafe block.
I think there are some crates that wrap the unsafe code for you, e.g. https://github.com/rodrimati1992/abi_stable_crates/ (I haven’t ever tried it).
You can just use an unsafe block, though. Or make a thin wrapper of safe functions, each of which is just an unsafe block around the C ABI call.
Even if rust had a stable ABI, you would still need that unsafe block.
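For what it’s worth, the “thin wrapper” suggestion usually looks something like this; c_mylib_scale is a made-up C function used only to show where the single unsafe block lives:

```rust
// c_mylib_scale is a hypothetical C function, e.g. declared in a C header as:
//     double c_mylib_scale(double x, double factor);
extern "C" {
    fn c_mylib_scale(x: f64, factor: f64) -> f64;
}

// Thin safe wrapper: the only unsafe block lives here, and callers of
// `scale` never have to write `unsafe` themselves.
pub fn scale(x: f64, factor: f64) -> f64 {
    // SAFETY: we assume the hypothetical C function has no preconditions
    // beyond receiving two valid f64 values.
    unsafe { c_mylib_scale(x, factor) }
}
```

That way callers get a plain safe function, and the unsafe surface stays in one auditable spot.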
fn executive() {}
Of course its rewrite is nearly infinitely faster than the original JavaScript.
oh
lol
didn’t cross my mind that someone would make a CLI program in js
I mean, I’ve done it, but I am a registered dunce cap owner.
Well, so is that guy.