• 0 Posts
  • 5 Comments
Joined 2 years ago
Cake day: August 2nd, 2023

  • I’m curious why 16-bit support is being dropped. Too much additional codebase complexity for such a small use case, or are there technical reasons it’s difficult to support in a 64-bit environment that somehow don’t exist in a 32-bit one? Or is it simply not implemented yet due to a lack of dev time/interest in the feature?

    I know 16-bit programs are incredibly niche these days, but I’d be way more comfortable with enterprises running their ancient software in a secure, up-to-date WINE environment as opposed to an actual Windows 3.x one with its nonexistent security. Even in an isolated VM, that kind of setup is one misconfiguration away from disaster.


  • The Unix epoch problem is completely unrelated to whether a program is 32-bit. The architecture determines the maximum addressable memory space, not the size of individual types. You could easily define and use a 128-bit type in a 16-bit environment, for example; the first sketch below shows the idea.

    The epoch problem is simply the result of a bad design call made a long time ago, one that proved foundational and incredibly difficult to change once it’d become an entrenched standard. They could have made timestamps 64-bit at the time, and probably would have if they’d known their work would survive the several decades it’d take for that decision to come back to bite them. The second sketch below shows exactly when the 32-bit counter runs out.
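
    To illustrate that type width is independent of architecture, here’s a minimal C sketch of a 128-bit unsigned integer built from two fixed-width 64-bit halves. The `u128` type and `u128_add` helper are hypothetical names chosen for illustration; any C99 compiler that provides `uint64_t` can build this, regardless of the target’s pointer width.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* A 128-bit unsigned integer built from two fixed-width 64-bit
     * halves. Nothing here depends on the CPU's pointer width:
     * uint64_t is 64 bits on any target that provides it, whether
     * the machine is 16-, 32-, or 64-bit. */
    typedef struct {
        uint64_t lo;
        uint64_t hi;
    } u128;

    /* Add two u128 values, propagating the carry from the low half. */
    static u128 u128_add(u128 a, u128 b) {
        u128 r;
        r.lo = a.lo + b.lo;
        r.hi = a.hi + b.hi + (r.lo < a.lo); /* carry if low half wrapped */
        return r;
    }

    int main(void) {
        u128 a = { .lo = UINT64_MAX, .hi = 0 }; /* 2^64 - 1 */
        u128 one = { .lo = 1, .hi = 0 };
        u128 sum = u128_add(a, one);            /* expect 2^64: hi=1, lo=0 */
        printf("hi=%llu lo=%llu\n",
               (unsigned long long)sum.hi,
               (unsigned long long)sum.lo);
        return 0;
    }
    ```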
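
    And a minimal sketch of the rollover itself: a signed 32-bit seconds-since-1970 counter tops out at 2^31 − 1 seconds, which `gmtime` can translate into the famous 2038 date. This assumes a platform where `time_t` is already 64-bit, so the 32-bit maximum converts losslessly.

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        /* The classic Unix timestamp: signed seconds since
         * 1970-01-01 00:00:00 UTC. A 32-bit signed counter tops
         * out at 2^31 - 1 = 2147483647 seconds. */
        int32_t max32 = INT32_MAX;

        /* On a platform with 64-bit time_t this conversion is
         * lossless, so gmtime can show where the old limit falls. */
        time_t t = (time_t)max32;
        struct tm *utc = gmtime(&t);
        if (utc) {
            char buf[64];
            strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
            printf("32-bit time_t runs out at: %s\n", buf);
            /* prints 2038-01-19 03:14:07 UTC */
        }

        /* A 64-bit counter pushes the limit out by roughly
         * 292 billion years, far past any practical horizon. */
        printf("64-bit limit: %lld seconds\n", (long long)INT64_MAX);
        return 0;
    }
    ```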