X-Git-Url: https://git.ralfj.de/web.git/blobdiff_plain/eb7a37abac8444a4f33aa2e83b2314ea3594fc83..dfbd19b865b636c19e788fa9e07be3829a521ff4:/ralf/_posts/2019-07-14-uninit.md

diff --git a/ralf/_posts/2019-07-14-uninit.md b/ralf/_posts/2019-07-14-uninit.md
index 85c8dcb..7af373f 100644
--- a/ralf/_posts/2019-07-14-uninit.md
+++ b/ralf/_posts/2019-07-14-uninit.md
@@ -36,7 +36,7 @@ Here is an example to demonstrate why "random bit pattern" cannot describe unini
 use std::mem;

 fn always_returns_true(x: u8) -> bool {
-    x < 150 || x > 120
+    x < 120 || x == 120 || x > 120
 }

 fn main() {
@@ -45,22 +45,22 @@ fn main() {
 }
 {% endhighlight %}
 `always_returns_true` is a function that, clearly, will return `true` for any possible 8-bit unsigned integer.
-After all, *every* possible value for `x` will be less than 150 or bigger than 120.
-A quick loop [confirms this](https://play.rust-lang.org/?version=stable&mode=release&edition=2018&gist=58168009e601a2f01b08981f907a473c).
-However, if you [run the example](https://play.rust-lang.org/?version=stable&mode=release&edition=2018&gist=da278adb50142d14909df74ea1e43069), you can see the assertion fail.[^godbolt]
+After all, *every* possible value for `x` will be either less than 120, equal to 120, or bigger than 120.
+A quick loop [confirms this](https://play.rust-lang.org/?version=stable&mode=release&edition=2018&gist=65b690fa3c1691e11d4d45955358cdbe).
+However, if you [run the example](https://play.rust-lang.org/?version=stable&mode=release&edition=2018&gist=812fe3c8655bfedcea37bb18bb70a945), you can see the assertion fail.[^godbolt]

-[^godbolt]: In case this changes with future Rust versions, [here is the same example on godbolt](https://godbolt.org/z/JX4B4N); the `xor eax, eax` indicates that the function returns 0, aka `false`. And [here is a version for C++](https://godbolt.org/z/PvZGQB); imagine calling `make_true(true)` which *should* always return `true` but as the assembly shows will return `false`.
+[^godbolt]: In case this changes with future Rust versions, [here is the same example on godbolt](https://godbolt.org/z/9G67hP); the `xor eax, eax` indicates that the function returns 0, aka `false`. And [here is a version for C++](https://godbolt.org/z/TWrvcq).

 ## What *is* uninitialized memory?

 How is this possible?
-The answer is that, in the "abstract machine" that is used to specify the behavior of our program, every byte in memory cannot just have a value in `0..256` (this is Rust/Ruby syntax for a left-inclusive right-exclusive range), it can also be "uninitialized".
+The answer is that, in the "abstract machine" that is used to specify the behavior of our program, every byte in memory cannot just have a value in `0..256` (this is Rust syntax for a left-inclusive right-exclusive range), it can also be "uninitialized".
 Memory *remembers* if you initialized it.
 The `x` that is passed to `always_return_true` is *not* the 8-bit representation of some number, it is an uninitialized byte.
 Performing operations such as comparison on uninitialized bytes is [undefined behavior]({% post_url 2017-07-14-undefined-behavior %}).
 As a consequence, our program has undefined behavior, so we should not be surprised that it acts "weirdly".

-Of course, there is a reason for this undefined behavior.
+Of course, there is a reason for this undefined behavior; there is a reason the "abstract machine" is defined the way it is.
 Compilers don't just want to annoy programmers.
 Ruling out operations such as comparison on uninitialized data is useful, because it means the compiler does not have to "remember" which exact bit pattern an uninitialized variable has!
 A well-behaved (UB-free) program cannot observe that bit pattern anyway.
@@ -122,7 +122,7 @@ The Rust abstract machine *does* make a distinction between "relaxed" and "rele
 After all, x86 does not have "uninitialized bytes" either, and still our example program above went wrong.

 Of course, to explain *why* the abstract machine is defined the way it is, we have to look at optimizations and hardware-level concerns.
-But without an abstract machine, it is very hard to ensure that all the optimizations a compiler performs are consistent---in fact, both [LLVM](https://bugs.llvm.org/show_bug.cgi?id=35229) and [GCC](https://gcc.gnu.org/bugzilla/show_bug.cgi?id=65752) suffer from miscompilations caused by combining optimizations that all seem fine in isolation, but together cause incorrect code generation.
+But without an abstract machine, it is very hard to ensure that all the optimizations a compiler performs are consistent---in fact, both [LLVM](https://bugs.llvm.org/show_bug.cgi?id=35229) and [GCC](https://gcc.gnu.org/bugzilla/show_bug.cgi?id=65752) suffer from miscompilations caused by combining optimizations that all seem [fine in isolation, but together cause incorrect code generation]({% post_url 2020-12-14-provenance %}).
 The abstract machine is needed as an ultimate arbiter that determines which optimizations can be safely combined with each other.
 I also think that when writing unsafe code, it is much easier to keep in your head a fixed abstract machine as opposed to a set of optimizations that might change any time, and might or might not be applied in any order.

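As an illustrative aside (a minimal sketch, not part of the diff above, and assuming the standard library's `MaybeUninit` API rather than the post's `mem::uninitialized()`): the assertion from the example becomes well-defined once the byte is actually written before it is read.

{% highlight rust %}
use std::mem::MaybeUninit;

fn always_returns_true(x: u8) -> bool {
    x < 120 || x == 120 || x > 120
}

fn main() {
    // Reserve storage without claiming it is initialized.
    let mut x = MaybeUninit::<u8>::uninit();
    // Write a value; only now does the abstract machine consider the byte initialized.
    x.write(150); // 150 is an arbitrary example value
    // Sound: `assume_init` is only called after the write above.
    let x: u8 = unsafe { x.assume_init() };
    assert!(always_returns_true(x));
}
{% endhighlight %}

The design point is the same one the diff's prose makes: the abstract machine tracks whether a byte is initialized, so a read is only justified by an actual prior write, not by whatever bit pattern the hardware happens to hold.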