• 0 Posts
  • 36 Comments
Joined 3 years ago
Cake day: June 14th, 2023

  • I’d be happy if Plasma looked a bit more like WinNT. Completely functional, all the information there at a glance. Nothing hidden away in hamburger menus, no guessing about what you can and can’t click on. Does what it needs to, then gets out your way. The best-designed that Windows has ever been.


  • addie@feddit.uk to Programmer Humor@programming.dev · Senior devs... · 8 days ago

    Abstraction is not very compatible with concurrency, so as well as your beautiful abstract API, you also need some ‘cut through the layers’ functions to return the underlying classes you need to synchronise on. Now you have a right mess that’s incredibly hard to understand, infuriating to debug, and impossible to refactor. The best you can do is put another layer of abstraction on top. Repeat every six months.






  • systemd-networkd is installed by default on Arch, integrates a bit better with the rest of SystemD, doesn’t have so many VPN surprises, and the configuration is a bit more obvious to me - a few config files rather than NetworkManager’s “loads of scripts” approach. Small niggles rather than big issues.

    Really, I just don’t want duplication of services - more stuff to keep up-to-date. And if I’ve got SystemD anyway, might as well use it…
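    A minimal sketch of what that looks like, assuming a single wired interface (the interface name and file name here are examples - check `networkctl list` for yours):

    ```ini
    # /etc/systemd/network/20-wired.network
    [Match]
    Name=enp3s0

    [Network]
    DHCP=yes
    ```

    Enable it with `systemctl enable --now systemd-networkd systemd-resolved` and that’s the whole network stack.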


  • NetworkManager dependencies can now be disabled at build time…

    Nice. It was a damned nuisance that Cinnamon brought its own network stack with it. All my headless servers and my Plasma gaming desktop use systemd-networkd, which meant that my Cinnamon laptop needed different configuration. Now they can all be the same.

    Hopefully the new release will bash a few of the remaining Wayland bugs; Plasma is great but I prefer Cinnamon for work, and it’s just too buggy for gaming on a multi-monitor setup at the moment.



  • Java’s biggest strength is that “the worst it can be” is not all that bad, and refactoring tools are quite powerful. Yes, it’s wordy and long-winded. Fine, I’d rather work with that than other people’s Bash scripts, say. And just because a lot of Java developers have no concept of what memory allocation means, and are happy to pull in hundreds of megabytes of dependencies to do something trivial, then allocate fucking shitloads of RAM for no reason, doesn’t mean that you have to.

    There is a difference in microservices between those set up by a sane architect:

    • clear data flow and pragmatic service requirements
    • documented responses and clear failure behaviour
    • pact server set up for validation in isolation
    • entire system can be set up with eg. a docker compose file for testing
    • simple deployment of updates into production and easy rollback

    … and the CV-driven development kind by people who want to be able to ‘tick the boxes’ for their next career move:

    • let’s use Kubernetes, those guys earn a fortune
    • different pet language for every service
    • only failure mode is for the whole thing to freeze
    • deployment needs the whole team on standby and we’ll be firefighting for days after an update
    • graduate developers vibe coding every fucking thing and it getting merged on Claude’s approval only

    We mostly do the second kind at my work; a nice Java monolith is bliss to work on in comparison. I can see why others would have bad things to say about them too.
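    The ‘entire system from one file’ point in the first list can be sketched like this (service and image names are invented for illustration):

    ```yaml
    # docker-compose.yml - spin the whole system up locally for testing
    services:
      orders:
        image: example/orders-service:latest
        ports:
          - "8080:8080"
        depends_on:
          - postgres
      postgres:
        image: postgres:16
        environment:
          POSTGRES_PASSWORD: dev-only
    ```

    One `docker compose up` and the lot is running locally; `docker compose down` throws it all away again.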


  • Apart from being slow, having discoverability issues, not being able to combine filters and actions so that you frequently need to fall back to shell scripts for basic functionality, it being a complete PITA to compare things between accounts / regions, advanced functionality requiring you to directly edit JSON files, things randomly failing and the error message being carefully hidden away, the poor audit trail functionality to see who-changed-what, and the fact that putting anything complex together means spinning so many plates that Terraform’ing all your infrastructure looks like the easy way; I’ll have you know there’s nothing wrong with the AWS Console UI.
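    The ‘fall back to shell scripts’ point in practice: comparing anything across regions means a loop over the CLI, because the console only shows one region at a time (the regions listed here are examples):

    ```shell
    # Show instance IDs and states across several regions side by side -
    # something the console UI can't do.
    for region in eu-west-1 eu-west-2 us-east-1; do
      echo "== $region =="
      aws ec2 describe-instances \
        --region "$region" \
        --query 'Reservations[].Instances[].[InstanceId,State.Name]' \
        --output text
    done
    ```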


  • Yeah. You know the first time you install Arch (btw), and you realise you’ve not installed a working network stack, so you need to reboot from the install media, remount your drives, and pacstrap the stuff you forgot on again? Takes, like, three minutes every time? Imagine that, but you’ve got a kernel compile as well, so it takes about half an hour.
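    The recovery dance described above is only a handful of commands from the live ISO (device names are examples - adjust to your partition layout):

    ```shell
    # Remount the half-installed system from the live environment
    mount /dev/sda2 /mnt          # root partition (example device)
    mount /dev/sda1 /mnt/boot     # boot/EFI partition (example device)

    # Install whatever you forgot the first time, e.g. a network stack
    pacstrap /mnt networkmanager
    arch-chroot /mnt systemctl enable NetworkManager
    ```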

    Getting Gentoo to the point where it’ll boot to a useful command line took me a few hours. A worthwhile learning experience, though: you come to understand how boot / the initramfs / init and the core utilities all work together. Compiling the kernel is actually quite easy; understanding all the options is probably a lifetime’s work, but the defaults are okay. Setting some build flags and building the ‘Linux core’ is just a matter of watching it rattle by; it doesn’t take long.

    Compiling a desktop environment, especially a web browser, takes hours, and at the end, you end up with a system with no noticeable performance improvements over just installing prebuilt binaries from elsewhere.

    Unless you’re preparing Linux for eg. embedded, and you need to account for basically every byte, or perhaps you’re just super-paranoid and don’t want any pre-built binaries at all, then the benefits of Gentoo aren’t all that compelling.


  • Only has the functionality that you need, everything is obviously in its place. Runs incredibly quickly without using a lot of resources, and then gets out your way when you’re trying to do stuff. No settings hidden away because they might confuse novice users. No bullshit shoehorned in by managers.

    Apart from the ugly font rendering, this might be as good as the Windows UI ever got. WinNT looks the same and has almost incomparable stability improvements, but only if you’ve the right hardware to run it. WinXP starts the downhill slide with ‘appearance over functionality’ and the hot mess of the control panel.

    I could live with how OP has things set up here; my own copy of Plasma doesn’t look a million miles from this.


  • I’m in this photo and I don’t like it.

    More specifically, my programming background is in industrial automation and I’d like to add some more ‘robust and flexible’ algorithms to CoolerControl so I can control my system fans / temperature better, but it’s written in a mix of TypeScript and Rust.

    I’ve spent 20 years programming hard real-time Z80 assembly and know quite a few higher-level languages. (Although I prefer the lower-level ones.) Not those ones, however, so it’s not just a couple of hours’ work to raise a PR against that project. Going to need to crack some books.


  • It’s not a million miles away, but it’s still got some problems. The ‘extract archive’ functionality seems to do it for me; I think it must be trying to pop up a (nested?) file chooser, but it causes a session crash instead.

    Cinnamon legacy for getting work done, and KDE Wayland for playing games, for me. It’d be nice to go 100% Cinnamon, though, for sure.


  • I understand that things have changed a bit since I first moved over to Linux - moving from Red Hat Linux to Ubuntu ‘Warty Warthog’ was such a revelation in overall user-friendliness and usability, back in the day. But upgrading my graphics card from an NVidia one to an AMD was a similar change. I might have only just installed the base operating system and a desktop environment, and haven’t got around to a web browser yet, but I’ve already got full hardware-accelerated graphics - that’s crazy.

    Most distros now make the NVidia drivers a complete non-issue, I think? My 6600XT is requiring just a few too many compromises on new games, so I’ll need something new too, sooner or later. I used to hold off on graphics card upgrades until I could get something twice as good, so that it was a noticeable upgrade, but I could buy a pretty decent second-hand car for the price of the ones which are ‘twice as good’ now.

    An upgrade from a 1050 Ti shouldn’t be such a problem. Well done on keeping it alive so long - I had a GeForce GTX 970 that would have been a similar age, but it let out its magic smoke years ago.


  • Really, it’s a misuse of language to describe elementary particles as having ‘wave/particle duality’. If you ask them a wave-like question, they give a wave-like answer. If you ask them a particle-like question, they give a particle-like answer. But that doesn’t mean they’re a combination of the two; just means that our everyday understanding of big things isn’t suitable for describing small things.

    We know that general relativity and quantum mechanics can’t both be quite right. They have enormous predictive power, but they don’t overlap, which means we can’t model things where they’re equally important - the Big Bang and black holes, for instance. “Higher dimensions” is the string theory way of trying to reconcile them - it might be right. But a theory isn’t scientific if it doesn’t make predictions you can test, and string theory hasn’t been very productive in that so far. Amazing maths, though; it has been great for expanding our knowledge there.


  • Moved my father-in-law from Windows 10 to Mint.

    Biggest problem was all his ‘documents’, which were Office 365 web links rather than actual documents. Linux presents them as the URLs that they really are. They open just fine, though, and can be exported as real local docs for LibreOffice etc.

    Security and privacy were the main selling points for him. He’d done some reading and thought that Mint was among the best choices for a newstart who just wants everything to work; no interest in playing games or anything. I agreed that it was the most solid choice. I use Arch btw myself, but wouldn’t recommend that for beginners.


  • addie@feddit.uk to Linux@programming.dev · *Permanently Deleted* · 4 months ago

    Centrally managed repositories help a lot, here. Linux users tend not to download random software off of sketchy websites; it’s all installed and kept up to date via the package manager.

    Yes, Linux malware and viruses exist, and we shouldn’t pretend otherwise. The usual reason for installing Linux virus scanners is that you’re hosting a file/email server and you want to keep infected files away from Windows users, though.
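    That file/email-server case usually just means something like ClamAV pointed at the shared directories (the path here is an example):

    ```shell
    # Refresh the virus signatures, then scan a share recursively,
    # reporting only the infected files
    freshclam
    clamscan --recursive --infected /srv/samba
    ```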


  • Isn’t the default installation of Ubuntu to BTRFS? In which case, you should have an @ subvolume with Ubuntu that’s mounted to /, and an @home subvolume that’s mounted to /home.

    Make a new subvolume, install a new operating system into it, and choose that subvolume in the bootloader; you should be able to have Ubuntu and ‘your favourite OS’ (I use Arch btw) living side by side with the same home directory.
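    A sketch of that, assuming the @/@home layout described above (device and subvolume names are examples):

    ```shell
    # From a live environment: mount the top of the btrfs filesystem,
    # above the @ and @home subvolumes
    mount -o subvolid=5 /dev/sda2 /mnt

    # Create a root subvolume for the second OS and install into it
    btrfs subvolume create /mnt/@arch

    # Then point the new boot entry at it with the kernel parameter:
    #   rootflags=subvol=@arch
    # and mount the shared @home at /home in both systems.
    ```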