(Please don’t lob rocks at me. I love Python.)

  • m_f@discuss.online
    2 days ago

    To be fair, Python is just glue for code written in lower level languages when it comes to AI

        • abbadon420@lemm.ee
          2 days ago

          I’ve never played with FORTRAN, but I’ve done some linear algebra with Matlab. Matlab was interesting for its native handling of matrices. What makes FORTRAN so good at linear algebra?

          • mkwt@lemmy.world
            14 hours ago

            Matlab’s syntax for matrices actually derives from Fortran. There’s a lot of flexibility in Fortran’s array features for

            • multidimensional arrays
            • arrays of indeterminate and flexible length
            • vectorized operations on arrays without explicitly writing loops.

            Because Fortran does not have pointers in the C sense, the Fortran compiler is free to make several optimizations that a C compiler can’t. Compiled Fortran is often faster than C code that does the same thing.

          • lime!@feddit.nu
            1 day ago

            the main thing that makes fortran preferable to C is the way it handles arrays and vectors. due to different pointer semantics, they can be laid out more efficiently in memory, meaning fewer operations need to be done for a given calculation.

            • LeninOnAPrayer@lemm.ee
              1 day ago

              Interesting. Is this a fundamental limitation of C, or is it just preferable and easier to use FORTRAN when implementing it?

              Meaning, could the same performance be achieved in C, but the most optimized libraries are already written, so why bother? Or can C simply not achieve the memory optimization at all?

              • lime!@feddit.nu
                1 day ago

                you can get the same performance by using the restrict keyword in C.

                basically, C allows pointer aliasing while fortran does not, which means C programs need to be able to handle cases where a value might be accessed through multiple pointers. fortran doesn’t have to, so a lot of accesses can be optimized into immediates, or loops unrolled without guards.

                restrict is a pinky-promise to the compiler that no overlapping takes place, i.e. that a value will only be accessed from one place. it’s basically rust ownership semantics without enforcement.

    • Lucy :3@feddit.org
      2 days ago

      Does one even have to actually write Python code, except for frontends? I’d assume you just load the model, weights and maybe training data into pytorch/tensorflow.

      • chicken@lemmy.dbzer0.com
        2 days ago

        Doesn’t seem to be the case, some popular servers:

        And then of course, talking to these servers can be done in any language that has a client library for them, or that even just handles network requests, although Python is a nice choice. Possibly the process of training models is heavier on Python dependencies than inference is; I haven’t actually done anything with that, though.
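
        For a concrete picture, here is a minimal sketch of the “just handles network requests” case, assuming a locally running OpenAI-compatible inference server (e.g. llama.cpp’s llama-server); the URL and model name are placeholders:

          # minimal sketch: query a hypothetical local, OpenAI-compatible
          # inference server over plain HTTP (URL and model name are placeholders)
          import requests

          resp = requests.post(
              "http://localhost:8080/v1/chat/completions",
              json={
                  "model": "local-model",
                  "messages": [{"role": "user", "content": "Say hello in one sentence."}],
              },
              timeout=60,
          )
          resp.raise_for_status()
          print(resp.json()["choices"][0]["message"]["content"])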

  • SpaceNoodle@lemmy.world
    2 days ago

    It sure made sense forty years ago. And I’d bet that the examples in that book are more AI than today’s LLMs.

  • massive_bereavement@fedia.io
    2 days ago

    I have this one! It’s probably at my folks’ place; I’ll definitely put it behind my chair so people can see it during video calls.

    • curbstickle@lemmy.dbzer0.com
      2 days ago

      Python is phenomenal for prototyping IMO.

      Once you need performance, it’s best to use another language (even partially).

      But quickly banging out a concept, to me, is the big win for python.

      • lunarul@lemmy.world
        2 days ago

        But quickly banging out a concept, to me, is the big win for python.

        For me the best language for quickly banging out a concept has always been the one I’m most familiar with at the moment.

      • sping@lemmy.sdf.org
        2 days ago

        Once you need performance

        If you need more performance. Many things just don’t.

    • Fabian@lemmy.zip
      2 days ago

      As far as I know, many Python libraries that need performance are mainly written in C++.

    • Rose@slrpnk.net (OP)
      2 days ago

      …It’s okay. I’ve programmed in far, far worse languages. …It’s got its advantages. It’s got its problems. 🤷🏻‍♀️

      Edit: If you need a serious answer: much like BASIC, it’s a language often used in teaching programming. In that sense, I guess it’s much better than BASIC. You can, like, actually use it for real-world applications. If you’re using BASIC for real-world applications in this day and age, something has gone really wrong.

      • TheRealKuni@midwest.social
        2 days ago

        If you’re using BASIC for real-world applications in this day and age, something has gone really wrong.

        Visual Basic is essentially the same as C# if they’re both working with the .NET framework, if I recall correctly.

        But yes.

    • ProgrammingSocks@pawb.social
      2 days ago

      Python is a tradeoff between ease of development and performance. If you do things the “normal” way (i.e. no Cython), your programs will oftentimes severely underperform compared with something written in a lower-level language. Even Java outperforms it.

      But, you can shit out a program in no time. Or so I’ve been told. Python is pretty far from the things I’m interested in programming so I haven’t touched it much.
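
      As a rough sketch of that gap (not a real benchmark; the numbers depend entirely on your machine), timing a hand-written loop against the C-implemented built-in sum() already shows it:

        # rough sketch: a hand-written Python loop vs. the C-implemented
        # built-in sum(); absolute numbers vary, the gap is the point
        import timeit

        def loop_sum(n):
            total = 0
            for i in range(n):
                total += i
            return total

        n = 1_000_000
        print("python loop :", timeit.timeit(lambda: loop_sum(n), number=10))
        print("built-in sum:", timeit.timeit(lambda: sum(range(n)), number=10))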

    • GreenKnight23@lemmy.world
      2 days ago

      good is subjective; it depends on the opinions of the group.

      objectively, Python is a smoldering pile of trash waiting to completely ignite. it does have one thing going for it though.

      it’s not JavaScript.

    • Eager Eagle@lemmy.world
      2 days ago

      Python is great, but it’s so forgiving that it’s easy to write garbage code if you’re not very proficient and don’t use the right tools with it.

      The only objectively bad (major) thing against it is speed. Not that it matters much for most applications, though, especially considering that most number-crunching tasks will use libraries that have their critical path written in a systems language:

      numpy, pandas, polars, scikit-learn, pytorch, tf, spacy; all of them use another language for the CPU-intensive work, so it really doesn’t matter much that you’re using Python at the surface.
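
      A minimal sketch of what that looks like in practice (assuming numpy is installed): the vectorized call pushes the loop down into numpy’s compiled code instead of running it in the interpreter.

        # the same reduction in pure Python vs. numpy; numpy runs the
        # loop in compiled code, Python only dispatches the call
        import numpy as np

        data = list(range(1_000_000))
        arr = np.array(data, dtype=np.int64)

        total_py = sum(x * x for x in data)   # loop runs in the interpreter
        total_np = int((arr * arr).sum())     # loop runs in compiled code

        assert total_py == total_np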

    • qaz@lemmy.world
      2 days ago

      It’s okay, but it’s a bit slow and dynamic typing in general isn’t that great IMO.

      • sping@lemmy.sdf.org
        2 days ago

        Dynamic typing is shit. But type annotations plus CI checkers can give you the same benefits in most cases.
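
        A minimal sketch of what that looks like, assuming mypy (or any comparable checker) run in CI, e.g. as `mypy example.py`:

          # annotated code that a checker such as mypy flags before it ever runs
          def mean(values: list[float]) -> float:
              return sum(values) / len(values)

          mean([1.0, 2.0, 3.0])   # fine
          mean("not a list")      # mypy: incompatible argument type "str"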

      • JeremyHuntQW12@lemmy.world
        2 days ago

        It doesn’t have dynamic typing FFS, variables are typed. You mean declarations.

        You can’t have statically typed objects, because they are of indeterminate length.

        • lime!@feddit.nu
          1 day ago

          it is a dynamically typed language, but it’s not a weakly typed language.
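
          A quick sketch of that distinction: dynamic means a name can be rebound to a value of any type, strong means the values themselves aren’t silently coerced.

            # dynamic typing: the same name can be rebound to different types
            x = 42           # x refers to an int
            x = "forty-two"  # now x refers to a str; no declaration needed

            # strong typing: values are not silently coerced between types
            "1" + 1          # raises TypeError instead of producing "11" or 2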

      • boonhet@lemm.ee
        9 hours ago

        Python itself might not be, but all the AI shit runs on GPUs, so it’s CUDA or OpenCL or whatever underneath.
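
        A minimal sketch of what that looks like from the Python side (assuming a CUDA-enabled PyTorch build): the Python is just dispatch, and the matmul itself runs as CUDA kernels on the GPU.

          # Python only dispatches; the matmul runs as CUDA (cuBLAS) kernels
          # when a CUDA-enabled PyTorch build and a GPU are available
          import torch

          device = "cuda" if torch.cuda.is_available() else "cpu"
          a = torch.randn(1024, 1024, device=device)
          b = torch.randn(1024, 1024, device=device)
          c = a @ b
          print(c.device, c.shape)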