Minimum Rust Binary Size
Posted by Alyssa Riceman on
Consider this very simple Rust program:
fn main() {
    ()
}
When compiled into a Windows executable in debug mode (target: x86_64-pc-windows-msvc), the resulting file is 135 KB. In release mode, 132 KB.
When compiled into a Linux binary in debug mode (target: x86_64-unknown-linux-gnu), the resulting file is 3.4 MB. In release mode, 3.3 MB.
When compiled into a Mac binary in debug mode (target: x86_64-apple-darwin), the resulting file is 419 KB. In release mode, 414 KB.
I wish I knew enough about reading binaries to get anything useful out of those files in a hex editor, because those file sizes, and especially the differences between them, are interesting. What is all this machine code that runs just to start a program and immediately exit without doing anything? What differences between the systems (or between the compilers targeting those systems) lead to such dramatic disparities in how much code that takes? I don’t know, and I lack the skills necessary to find out, but I predict that the curiosity is going to keep gnawing at me now that I’ve discovered this.
Not All Operating Systems’ Time Measurements Have Nanosecond Precision
Posted by Alyssa Riceman on
Rust is normally pretty good about cross-system support. However, I discovered yesterday that, at least within a certain narrow domain, its cross-platform support ends up being somewhat limited.
The SystemTime struct serves as a convenient way to get timestamps for things. And, under normal circumstances, run on Linux, it offers nanosecond precision.
However, I discovered the hard way, while writing some tests, that said precision is heavily platform-dependent. Run on macOS, it only has microsecond precision; on Windows, tenth-of-microsecond precision. The nanosecond precision on Linux is, as it turns out, the exception rather than the rule, dependent on the precision of the system call it relies on under the hood.
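The difference is easy to see for yourself by inspecting the sub-second component of a timestamp. The following is a minimal sketch; the helper name subsec_nanos_now is my own, and the comments about which values appear on which platform reflect the observations above, not anything the standard library guarantees.

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Nanosecond component of the current system time.
fn subsec_nanos_now() -> u32 {
    SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock is set before the Unix epoch")
        .subsec_nanos()
}

fn main() {
    // On Linux this can be any value in 0..1_000_000_000; per the
    // observations above, on macOS it comes back as a multiple of
    // 1_000 (microseconds), and on Windows as a multiple of 100
    // (tenths of microseconds).
    println!("subsec nanoseconds: {}", subsec_nanos_now());
}
```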
I’m not sure where the gap in precision between operating systems comes from. It’s not a matter of different underlying hardware: on a Linux VM hosted on my Mac, the nanosecond precision is there. It’s purely a difference between the operating systems themselves, in terms of how their respective system-time-retrieval functions work. And I don’t know why Microsoft and Apple wouldn’t offer nanosecond-precision time-checking.
But the fact of the matter is that they don’t, or at least not in any manner convenient enough for the Rust standard library’s developers to have taken advantage of it. Anyone planning on writing a program which expects nanosecond precision should be accordingly cautious.
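One way to be cautious in tests is to compare timestamps with a tolerance at least as coarse as the coarsest platform granularity, rather than for exact equality. A minimal sketch, with roughly_equal as my own illustrative helper and the one-microsecond tolerance chosen to match the macOS granularity noted above:

```rust
use std::time::{Duration, SystemTime};

/// Compare two timestamps while tolerating platform-dependent clock
/// granularity: treat them as equal if they differ by less than one
/// microsecond (the coarsest granularity among the platforms above).
fn roughly_equal(a: SystemTime, b: SystemTime) -> bool {
    // duration_since returns Err when `b` is later than `a`; the
    // error still carries the (positive) difference.
    let diff = match a.duration_since(b) {
        Ok(d) => d,
        Err(e) => e.duration(),
    };
    diff < Duration::from_micros(1)
}

fn main() {
    let t = SystemTime::now();
    println!("t roughly equals itself: {}", roughly_equal(t, t));
}
```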