C++ is even worse, due to templates and the so-called most vexing parse. Initializing with {} mitigated the latter somewhat, but came with its own set of woes
In practice, type inference in Rust is not a problem since the language is so strongly typed. In fact, it is more strongly typed than both C and C++, and will force you to cast values explicitly in cases where C and C++ will happily mess up your variables without warning. The absence of type inference would also be a major pain, since nested types such as iterators can get quite complex and very verbose. If you've programmed using older C++ standards, then you know this pain
I believe that it is useful in a few places. cppreference.com mentions templates as one case:
Trailing return type, useful if the return type depends on argument names, such as
`template<class T, class U> auto add(T t, U u) -> decltype(t + u);` or is complicated, such as in `auto fpif(int) -> int (*)(int)`
The syntax also matches that of lambdas, though I'm not sure that adding another way of specifying regular functions actually makes the language more consistent, since most code still uses the old style.
Additionally, the return type is looked up in the scope of the function, meaning that you can do
auto my_class::my_function() -> iterator { /* code */ }
instead of
my_class::iterator my_class::my_function() { /* code */ }
which is kinda nice
With Rust you save 1 char, and gain needing to skip a whole line to see what type something is.
Honestly, the Rust way of doing things feels much more natural to me.
You can read it as

- Define a function,
- with the name `getoffmylawn`,
- that takes a `Lawn` argument named `lawn`,
- and returns a `bool`
Whereas the C function is read as

- Do something with a `bool`? Could be a variable, could be a function, could be a forward declaration of a function,
- whatever it is, it has the name `getoffmylawn`,
- there's a `(`, so all options are still on the table,
- ok, that's a function, since it takes a `Lawn` argument named `lawn`, that returns a `bool`
Amusingly, modern C++ allows you to copy the rust signature nearly 1:1:
auto getoffmylawn(Lawn lawn) -> std::optional<Teenager> {
return lawn.remove();
}
At this point the people complaining about Rust at every opportunity have become more annoying than the "rewrite it in Rust" people ever were
The article is about an internal kernel API: They can easily rename those, since they are not exposed to user-space.
But you seem to be talking about the kill command and/or the kill function, that are part of the POSIX and C standards, respectively. Renaming either would break a shit-ton of code, unless you merely aliased them. And while I agree that kill is a poor name, adding non-standard aliases doesn't really offer much benefit
I set the timestamps of my music to its original release date, so that I can sort it chronologically... OK, I don't actually do that, but now I'm tempted
How the fuck can it not recover the files?
Undeleting files typically requires low-level access to the drive containing the deleted files.
Do you really want to give an AI, the same one that just wiped your files, that kind of access to your data?
I'm surprised that you didn't mention Zig. It seems to me to be much more popular than either C3 or D's "better C" mode.
It is “FUD” if you ask why it’s still const by default.
I'd be curious if you could show any examples of people asking why Rust is const by default being accused of spreading "FUD". I wasn't able to find any such examples myself, but I did find threads like this one and this one, that were both quite amiable.
But I also don't see why it would be an issue to bring up Rust's functional-programming roots, though as you say the language did change quite a lot during its early development, and before release 1.0. IIRC, the first compiler was even implemented in OCaml. The language's Wikipedia page goes into more detail, for anyone interested. Or you could read this thread in /r/rust, where a bunch of Rust users try to bury that sordid history by bringing it to light
Makes memory unsafe operations ugly, to “disintensivise the programmer from them”.
From what I've seen, most unsafe rust code doesn't look much different compared to safe rust code. See for example the Vec implementation, which contains a bunch of unsafe blocks. Which makes sense, since it only adds a few extra capabilities compared to safe rust. You can end up with gnarly code of course, but that's true of any non-trivial language. Your code could also get ugly if you try to be extremely granular with unsafe blocks, but that's more of a style issue, and poor style can make code in any language look ugly.
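For illustration, here's a tiny sketch (a hypothetical function, not the actual std code) of the pattern Vec and the slice types use: a safe, bounds-checked wrapper around an unchecked core, where the unsafe block reads just like safe code:

```rust
// Sketch of the "safe wrapper, unchecked core" pattern.
fn first_half(v: &[u8]) -> &[u8] {
    let mid = v.len() / 2;
    // SAFETY: `mid <= v.len()`, so the range is always in bounds.
    unsafe { v.get_unchecked(..mid) }
}

fn main() {
    assert_eq!(first_half(&[1, 2, 3, 4]), &[1, 2]);
    assert_eq!(first_half(&[]), &[] as &[u8]);
}
```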
Has a pretty toxic userbase
At this point it feels like an overwhelming majority of the toxicity comes from non-serious critics of Rust. Case in point, many of the posts in this thread
Like, one of the issues that Linus yelled at Kent about was that bcachefs would fail on big endian machines. You could spend your limited time and energy setting up an emulator of the powerPC architecture, or you could buy it at pretty absurd prices — I checked ebay, and it was $2000 for 8 GB of ram…
It's not that BCacheFS would fail on big endian machines, it's that it would fail to even compile, and therefore impacted everyone who had it enabled in their build. And you don't need actual big endian hardware to compile something for that arch: Just now it took me a few minutes to figure out what tools to install for cross-compilation, download the latest kernel, and compile it for a big endian arch with BCacheFS enabled. Surely a more talented developer than I could easily do the same, and save everyone else the trouble of broken builds.
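Roughly these steps, for anyone curious (on a Debian-ish system; package names and config defaults may differ elsewhere):

```shell
# Install a big-endian PowerPC cross-toolchain.
sudo apt install gcc-powerpc64-linux-gnu

# From a kernel source tree: configure for 64-bit big-endian PowerPC,
# enable bcachefs, then build just that directory as a compile check.
make ARCH=powerpc CROSS_COMPILE=powerpc64-linux-gnu- ppc64_defconfig
./scripts/config --enable BCACHEFS_FS
make ARCH=powerpc CROSS_COMPILE=powerpc64-linux-gnu- olddefconfig
make ARCH=powerpc CROSS_COMPILE=powerpc64-linux-gnu- -j"$(nproc)" fs/bcachefs/
```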
ETA: And as pointed out in the email thread, Overstreet had bypassed the linux-next mailing list, which would have allowed other people to test his code before it got pulled into the mainline tree. So he had multiple options that did not necessitate the purchase of expensive hardware
One option is to drop standards. The Asahi developers were allowed to just merge code without being subjected to the scrutiny that Overstreet has been subjected to. This was in part due to having stuff in rust, and under the rust subsystem — they had a lot more control over the parts of Linux they could merge too. The other was being specific to macbooks. No point testing the mac book-specific patches on non-mac CPU’s.
It does not sound to me like standards were dropped for Asahi, nor that their use of Rust had any influence on the standards that were applied to them. It is simply as you said: What's the point of testing code on architectures that it explicitly does not and cannot support? As long as changes that touch generic code are tested, then there is no problem, but that is probably the minority of changes introduced by the Asahi developers
Loop labels are rare, but they lead to much simpler/clearer code when you need them. Consider how you would implement this kind of loop in a language without loop labels:
In C/C++ you'd need to do something like
Personally, I wouldn't call it ugly, either, but that's mostly a matter of taste