It’s also just a huge fallacy. He’s saying that people just choose to not write memory safe code, not that writing memory safe code in C/C++ is almost impossible. Just look at NASA’s manual for writing safe C++ code. It’s insanity. No one except them can write code that’s safe and they’ve stripped out half the language to do so. No matter how hard you try, you’re going to let memory bugs through with C/C++, while Rust and other memory safe languages have all but nullified a lot of that.
As someone who is in the aerospace industry and has dealt with safety-critical code with NASA oversight, it’s a little disingenuous to pin NASA’s coding standards entirely on attempting to make things memory safe. It’s part of it, yeah, but it’s a very small part. There are a ton of other things that NASA is trying to protect against.
Plus, Rust doesn’t solve the underlying problem that NASA is looking to prevent by banning the C++ standard library. Part of it is DO-178 compliance (or lack thereof); the other part is that dynamic memory has the potential to cause all sorts of problems on resource-constrained embedded systems. Statically analyzing dynamic memory usage is virtually impossible, and testing for it gets cost-prohibitive real quick, so it’s just easier to blanket-ban the STL.
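To make that concrete (just a rough sketch of the general idea, not actual flight code): even in Rust, the embedded-friendly approach is to skip the heap entirely and use fixed-capacity storage, so worst-case memory use is known at compile time. The same discipline the STL ban enforces in C++ still applies.

```rust
// Hypothetical sketch: a fixed-capacity buffer with no heap allocation,
// so worst-case memory use is known at compile time.
struct FixedBuf<const N: usize> {
    data: [u32; N], // storage reserved up front, never grows
    len: usize,     // how many slots are currently in use
}

impl<const N: usize> FixedBuf<N> {
    fn new() -> Self {
        Self { data: [0; N], len: 0 }
    }

    // push fails instead of allocating more memory at runtime
    fn push(&mut self, value: u32) -> Result<(), u32> {
        if self.len == N {
            return Err(value); // caller must handle the "full" case explicitly
        }
        self.data[self.len] = value;
        self.len += 1;
        Ok(())
    }
}

fn main() {
    let mut samples: FixedBuf<16> = FixedBuf::new(); // capacity fixed at compile time
    for i in 0..20 {
        if samples.push(i).is_err() {
            // a Vec would silently reallocate here; with a fixed buffer the
            // overflow is an explicit, testable condition
            break;
        }
    }
    println!("stored {} samples", samples.len);
}
```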
Also, writing memory-safe code honestly isn’t that hard. It just requires a different approach to problem solving that, like any other design pattern, is easy once you learn it and get used to it.
The CVE list would disagree with you.
I’d’ve thought Ada would play a larger role in aerospace, because of these issues, among others.
This statement is kinda ironic after you just said it’s “easier” to just ban the entire STL and dynamic memory allocation as a whole. You already left the domain of what most consider “easy” quite a while ago.
OP said NASA thinks it’s easier.
Do you have a link?
https://en.wikipedia.org/wiki/The_Power_of_10%3A_Rules_for_Developing_Safety-Critical_Code
and their 40-page coding standard document. https://ntrs.nasa.gov/api/citations/20080039927/downloads/20080039927.pdf https://ntrs.nasa.gov/citations/20080039927
and their software safety handbook. https://standards.nasa.gov/standard/nasa/nasa-gb-871913
all 389 pages of it https://standards.nasa.gov/sites/default/files/standards/NASA/Baseline/0/nasa-gb-871913.pdf
That first link is about a document from 2006, while C++ became a lot safer with C++11 in 2011. It’s much easier to write safe C++ now, if you follow current guidelines:
https://isocpp.github.io/CppCoreGuidelines/
Yeah the standards for safe C++ haven’t changed, no matter how much the language changes.
I would say the standard has changed. The current guidelines require you to use features that didn’t exist before C++11. C++11 was a huge change and it made C++ a lot nicer. The updates since then have generally been improvements but more incremental than revolutionary.
Dude, the Core Guidelines are like 2000 pages; C++ is a meme language.
Printing from HTML to PDF in a browser gave 708 pages, so maybe half that if printed like a book (less whitespace, etc.). About like a Rust textbook. Still a lot, I guess.
Well I’m old so I need a larger font size
Seems very reasonable, apart from an arbitrary number of assertions required per function
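For reference, the rule in question asks for a minimum of two runtime assertions per function. The Power of 10 rules are written for C, but the idea looks roughly like this (a hypothetical sketch in Rust, not anything from the document):

```rust
// Hypothetical sketch of the "assertions per function" idea: check
// preconditions on entry and postconditions before returning, so a
// violated assumption fails loudly instead of corrupting state.
fn scale_reading(raw: u16, gain: u16) -> u32 {
    // precondition: a gain of zero would make the result meaningless
    assert!(gain > 0, "gain must be non-zero");

    let scaled = raw as u32 * gain as u32;

    // postcondition: result must stay within the assumed calibrated range
    assert!(scaled <= 1_000_000, "scaled reading out of range");
    scaled
}

fn main() {
    println!("{}", scale_reading(512, 4));
}
```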
It’s also a fallacy that Rust code is memory safe. I audited a couple of large Rust projects and found that they both had tens of unsafe constructs. I presume other projects are similar.
You can’t use “unsafe” and then claim that your program is memory safe. It may be “somewhat safe-ish”, but claiming that your code is safe because you carefully reviewed your unsafe sections leaves you on the same shaky ground as C++, where they also claim that they carefully review their code.
I don’t have the data, but I don’t think it’s wild to assume that most Rust programs have 0-1 unsafe blocks in total, except for special cases like FFI.
Even if your Rust project has thousands of unsafe blocks, it is still safer than C++, which is effectively one giant unsafe block. You only have to carefully review the parts marked “unsafe”; in C++ you have to carefully review the whole codebase.
Also, because unsafe blocks are explicitly declared, you know which parts of the code require extra care, and if you encounter a memory bug, doing Ctrl+F “unsafe” will quickly point you at the likely root cause.
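For example (a minimal sketch, not from any real project): the unsafe part is a couple of lines behind a safe wrapper, and those lines are the entire audit surface.

```rust
// Minimal sketch: the only code that can violate memory safety is the
// small block marked `unsafe`, and the safe wrapper around it states
// the invariant that makes it sound.
fn first_element(slice: &[i32]) -> Option<i32> {
    if slice.is_empty() {
        return None;
    }
    // SAFETY: we just checked that the slice is non-empty, so index 0
    // is in bounds. This comment plus the check above are the whole
    // audit surface; callers of `first_element` stay in safe Rust.
    Some(unsafe { *slice.get_unchecked(0) })
}

fn main() {
    assert_eq!(first_element(&[7, 8, 9]), Some(7));
    assert_eq!(first_element(&[]), None);
}
```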
Yes, that’s the difference between “safer” and “actually safe”.