• BatmanAoD@programming.dev · 10 hours ago

    Without one, the run time system must assign some semantics to the source code, no matter how erroneous it is.

    That’s just not true; as the comment above points out, Python also has no separate compilation step and yet it did not adopt this philosophy. Interpreted languages were common before JavaScript; in fact, most LISP variants are interpreted, and LISP is older than C.

    Moreover, even JavaScript sometimes throws errors, because some code is simply not syntactically valid, or has no valid semantics even in a language as permissive as JavaScript.

    So Eich et al. absolutely could have made more things invalid, despite the risk that end-users would see the resulting error.
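    For instance, Python happily parses an expression like 1 + "1", but it still refuses to assign it a meaning; it raises an error at run time instead (a quick sketch of standard behavior):

        # Python has no separate compilation step, but it does not
        # invent semantics for a nonsensical operation: it raises.
        try:
            print(1 + "1")            # no defined meaning for int + str
        except TypeError as exc:
            print("TypeError:", exc)  # e.g. unsupported operand type(s) for +: 'int' and 'str'

        # Outright invalid syntax is rejected too, just as in JavaScript:
        # >>> 1 +        -> SyntaxError: invalid syntax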

    • bss03@infosec.pub · 9 hours ago

      Python also has no separate compilation step and yet it did not adopt this philosophy

      Yes. It did. It didn’t assign exactly the same semantics, but it DOES assign a run-time semantics to min().

      • BatmanAoD@programming.dev · edited · 9 hours ago

        I’m addressing the bit that I quoted, saying that an interpreted language “must” have valid semantics for all code. I’m not specifically addressing whether or not JavaScript is right in this particular case of min().

        …but also, what are you talking about? Python throws a TypeError if you call min() with no arguments.
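        For example (a quick sketch of standard CPython behavior):

            # Python's built-in min() has no default value to fall back on,
            # so an empty call raises rather than returning something like Infinity.
            try:
                min()
            except TypeError as exc:
                print("TypeError:", exc)  # e.g. "min expected at least 1 argument, got 0"

        JavaScript’s Math.min(), by contrast, returns Infinity when called with no arguments.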