• frezik@midwest.social · 1 year ago

    Years ago, older C programmers told me you don’t know C unless you use dynamic memory management. I ended up rarely writing any C, but when I do, it’s usually on microcontrollers where dynamic memory management isn’t even supported out of the box.

    Joke’s on you, greybeards!

  • EatATaco@lemm.ee · 1 year ago

    I was an embedded developer for years, working on critical applications that could not go down. While I preferred to avoid dynamic memory allocation, since static allocation was much less risky, there were certainly times when it just made sense or was the only way.

    One was when we were reprogramming the device, which was connected to an FPGA that also required reprogramming. You couldn’t hold both the FPGA binary and the new device binary in memory at once, but there was plenty of space for each one individually. So: allocate space for the FPGA binary, program it, free that space, allocate space for the new processor code, then verify and flash.
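    That flow can be sketched roughly as below. Every function name and image size here is a hypothetical stand-in (the post doesn’t give any); real code would talk to the actual FPGA and flash hardware.

```c
/* Sketch of the two-phase reprogramming flow described above.
 * All routines and sizes are hypothetical stand-ins. */
#include <stdlib.h>
#include <stddef.h>

#define FPGA_IMAGE_SIZE (96u * 1024u)  /* each image fits alone...  */
#define FIRMWARE_SIZE   (112u * 1024u) /* ...but not both at once   */

/* Placeholder I/O routines (assumed, not from the original post). */
static void load_fpga_image(unsigned char *buf, size_t n)  { (void)buf; (void)n; }
static void program_fpga(const unsigned char *buf, size_t n) { (void)buf; (void)n; }
static void load_firmware(unsigned char *buf, size_t n)    { (void)buf; (void)n; }
static int  verify_firmware(const unsigned char *buf, size_t n) { (void)buf; (void)n; return 0; }
static void flash_firmware(const unsigned char *buf, size_t n) { (void)buf; (void)n; }

int reprogram_device(void)
{
    /* Phase 1: the FPGA bitstream gets the scarce RAM first. */
    unsigned char *buf = malloc(FPGA_IMAGE_SIZE);
    if (!buf)
        return -1;
    load_fpga_image(buf, FPGA_IMAGE_SIZE);
    program_fpga(buf, FPGA_IMAGE_SIZE);
    free(buf);  /* release it so the second image can fit */

    /* Phase 2: reuse the freed space for the new processor code. */
    buf = malloc(FIRMWARE_SIZE);
    if (!buf)
        return -1;
    load_firmware(buf, FIRMWARE_SIZE);
    if (verify_firmware(buf, FIRMWARE_SIZE) != 0) {
        free(buf);
        return -1;
    }
    flash_firmware(buf, FIRMWARE_SIZE);
    free(buf);
    return 0;
}
```

    The point of the malloc/free pair is that the two images can share the same scarce RAM sequentially, even though neither phase knows the other’s exact size at compile time.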

    What am I missing? Have things changed?

    • owenfromcanada@lemmy.world · 1 year ago

      I’d effectively gain the advantage of dynamic allocation by using a union (or just a generic unsigned char buffer[16384], used twice). Mostly the same thing as a malloc.
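      A minimal sketch of that static-buffer alternative: one fixed region, sized for the larger of the two uses, reused for both phases. The union members and sizes are illustrative, not from the post.

```c
/* One statically allocated block reused for both images --
 * no malloc, no heap at all. Names and sizes are hypothetical. */
#include <stddef.h>

union scratch {
    unsigned char fpga_image[16384]; /* phase 1: FPGA bitstream      */
    unsigned char firmware[16384];   /* phase 2: new processor code  */
};

static union scratch scratch;

/* Both accessors hand out the same bytes, just under different names. */
unsigned char *fpga_buffer(void)     { return scratch.fpga_image; }
unsigned char *firmware_buffer(void) { return scratch.firmware; }

size_t scratch_size(void) { return sizeof(union scratch); }
```

      The trade-off versus malloc is that the buffer’s worst-case size is committed at link time, which is exactly why it’s popular on systems where the heap is forbidden.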

    • bus_factor@lemmy.world · 1 year ago

      Using a file system is much less bad than dynamically allocating memory, at least as long as you keep a predefined set of files.

      • Troy@lemmy.ca · 1 year ago

        I hate to alarm you, but… what is a file system except dynamically allocated memory? ;)