• 0 Posts
  • 39 Comments
Joined 1 year ago
Cake day: December 6th, 2024

  • Try Lutris or Heroic Launcher: both wrap around Wine (and everything else needed to run Windows games on Linux, such as DXVK) and manage the whole process for you, with only a few games needing config tweaks to run (and the fraction of games like that is no worse there than it is in Steam).

    I use both Steam and Lutris, and in my experience Steam is not at all a good launcher for anything other than games from the Steam store, mainly because it is less configurable: it doesn’t directly expose the tools you need to fix those few games that won’t just run, and it limits the launch options you can tweak. Lutris, on the other hand, follows the unofficial open source credo of making pretty much everything configurable (it defaults to the best configuration for each game, but it definitely gives you more than enough rope to hang yourself with).

    Steam is very popular because of the Steam store’s market dominance, so tons of people swear by it (never having used anything else), but it’s not actually the greatest option for anything but Steam games, and even for those it’s sometimes worse than getting the same game from GOG and using Lutris or Heroic, mainly because the DRM in the non-GOG version of some games interferes with running them on Linux.







  • On a serious note, having used Linux on and off since the 90s (aah, Slackware, how I miss installing you from floppies … not), Linux has, IMHO, actually been desktop ready for ages (though definitely not in the days of Slackware, when configuring X was seriously interesting for a geek and pretty much an impossible barrier for everybody else).

    The problem has always been applications having only Windows builds, not Linux builds, rather than desktop Linux distros being an inferior desktop experience to Windows (well, not once GNOME and KDE emerged and made things like configuring your machine possible via GUIs; the age of RTFM and editing text files on the command line before that wasn’t exactly friendly for non-techies).

    In other words, from maybe the late 00s onwards the problem was mainly “network effects” (in the business sense of “apps are made for Windows because that’s where the users are, and users go for Windows because that’s where the apps are”) rather than the “desktop” experience.

    The almost unassailable advantage of Windows, thanks to pretty much just network effects, was something most of us Linux fans were aware of since way back.

    What happened in the meantime to make Linux more appealing “on the desktop” was mainly on the app availability side: OpenOffice (later LibreOffice and derivatives) providing an Office-style suite on Linux, the movement from locally hosted apps to web-hosted apps meaning that a lot of PC usage was really just browser usage, Wine improving by leaps and bounds and making more and more Windows applications run on Linux (most notably, and also thanks to DXVK, games), and so on.

    Personally I think Linux has been a superior experience on the server side since the late 90s and, aside from the lack of Linux versions of the most commonly used non-OS applications, a superior experience on the desktop since the 00s.



  • At times that shit is pretty much the opposite of what should be done.

    Fail Fast is generally a much better way to handle certain risks, especially those around parts of the code which have certain expectations when the code upstream calling them (or even other systems sending them data) gets changed and breaks those expectations: it’s much better to just get “BAAM, error + stack trace” the first time you run the code with those upstream changes than to have stuff silently fail to work properly and only find out about it when the database in Production starts getting junk stored in certain fields, or some other high impact problem.

    You don’t want to silently suppress error reporting unless the errors are expected and properly dealt with as part of the process (say, network errors). You want the code to validate early that certain things coming in from the outside (be it other code or, even more importantly, other systems) meet certain expectations (say, check that a list of things which should never be empty is in fact not empty) and immediately complain about it.

    I’ve lost count of how many times following this strategy has saved me from a small stupid bug (sometimes not even in my system) snowballing into something much worse because of code silently ignoring that something is not as it’s supposed to be.
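    A minimal sketch of what that fail-fast validation looks like in practice (names and fields here are made up for illustration, not from any real system):

```python
from dataclasses import dataclass

@dataclass
class OrderBatch:
    """Hypothetical payload arriving from an upstream system."""
    customer_id: str
    items: list

def process_batch(batch: OrderBatch) -> int:
    # Fail fast: complain immediately about broken expectations
    # instead of letting junk propagate into storage downstream.
    if not batch.customer_id:
        raise ValueError("customer_id must not be empty")
    if not batch.items:
        raise ValueError("items list should never be empty here")
    # ... real processing would only happen past this point ...
    return len(batch.items)
```

    If an upstream change starts sending an empty list, the very first run blows up with a stack trace pointing at the real cause, rather than quietly writing garbage to Production.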


  • In some situations shader pre-caching makes things worse rather than better: for example, on my machine Borderlands 2 would take 10 minutes to update shaders at pretty much every game start, even though I have a Gbps Internet connection.

    Eventually it turned out that you really shouldn’t be running “Latest Proton” with it, as any update to Proton would trigger a full update of the shader cache (or worse, local generation, which on my machine took hours). Of course, information about that shit was nowhere to be found, nor did the default configuration of that game under Linux pin it to a specific Proton version.

    Switching shader pre-caching off also solved the problem, but to avoid the situation you described, of “shader translation at the time the shader is first used” causing unexpected slowdowns at a bad time, once I figured out that it was using “Latest Proton” that was triggering full shader cache downloads, I switched it all to shader pre-caching with a specific, fixed Proton version.

    All this to say that the way Steam does shader pre-caching isn’t a silver bullet: for some games it makes them near unplayable by default until you figure out the specific configuration changes needed (and, with at best many minutes before each game start actually succeeds, trial and error is a very slow and frustrating way to figure out what’s going on and how to fix it).





  • This is really just a driver which sends a bunch of bytes via I2C to a microcontroller.

    I2C is a very standard way of communicating with digital integrated circuits at low speed, so this is not specific to the microcontroller used on Synology NAS devices (which is actually a pretty old and simple one), much less specific to driving LEDs.

    So whilst technically this specific Linux driver ends up controlling LEDs on a very specific device, the technique used in it is way more generic than that, and can be used to control just about any functionality sitting behind a digital integrated circuit that exposes an I2C control interface, be it one that hardcodes it or one which, like this one, is a microcontroller that itself implements it in code.

    All this to say that this is a bit bigger than just “LED driver”.
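    To illustrate, sending “a bunch of bytes via I2C” from user space boils down to something like the sketch below. The device address, register, and payload framing are entirely made up, and the bus object is a stand-in for a real one (e.g. Linux’s /dev/i2c-* via the smbus interface) so the sketch runs without hardware:

```python
class FakeI2CBus:
    """Stand-in for a real I2C bus; records writes instead of
    touching hardware, mirroring the smbus-style block-write call."""
    def __init__(self):
        self.writes = []

    def write_i2c_block_data(self, addr, register, data):
        # A real bus would clock these bytes out to the device here.
        self.writes.append((addr, register, list(data)))

LED_CTRL_ADDR = 0x40   # made-up 7-bit device address
REG_LED_STATE = 0x01   # made-up register on the microcontroller

def set_led(bus, led_index, on):
    # One byte selects the LED, one byte carries on/off; it is the
    # microcontroller's firmware that decides what the payload means.
    bus.write_i2c_block_data(LED_CTRL_ADDR, REG_LED_STATE,
                             [led_index, 1 if on else 0])

bus = FakeI2CBus()
set_led(bus, 2, True)
```

    The same pattern, with a different register map, drives fans, buzzers, or anything else the chip’s firmware exposes over I2C.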


  • Yeah, that’s much better.

    Personally I detest not understanding what’s going on when following a guide to do something, so I really dislike recipe style.

    That said, I mentioned recipes because recipes meant to be blindly followed are the style of guide which has the lowest possible “required expertise level” of all.

    I suppose a playbook properly done (i.e. a dumbed down step-by-step “do this” guide, but with side annotations that are clearly optional reading, explaining what’s going on for those who have the higher expertise levels needed to understand them) can have as low a “required expertise level” as a plain recipe whilst being a much nicer option, because people who know a bit more can get more from it than they could from just a dumbed down recipe.

    That said, it has to be structured so that it’s really clear that those “explanation bits” are optional reading for the curious who have the knowhow to understand them, otherwise it risks scaring off less skilled people who would actually be able to successfully do the task by blindly following the step-by-step recipe part of it.


  • For “all documentation” to “cater to all levels” it would have to explain to people “how do you use a keyboard” and everything from there upwards, because there are people at that level hence it’s part of “all levels”.

    I mean, your own example of good documentation starts with an intro of “goals” saying:

    “Visual Studio (VS) does not (currently) provide a blank .NET Multi-platform Application User Interface (MAUI) template which is in C# only. In this post we shall cover how to modify your new MAUI solution to get rid of the XAML, as well as cover how to do in C# code the things which are currently done in XAML (such as binding). We shall also briefly touch on some of the advantages of doing this.”

    For 99% of people almost all of that is about as understandable as Greek (except for Greek people, for whom it’s about as understandable as Chinese).

    I mean, how many people out there in the whole world (non-IT people, as illustrated in the actual article linked by the OP) do you think know what the hell “Visual Studio”, “.NET”, “Multi-platform Application User Interface”, “template”, “C#”, “XAML”, or “binding” (in this context) are?

    I mean, if IT knowledge were a scale of 1 to 10 with 10 the greatest, you’re basically thinking it’s “catering to all levels” when an explanation for something that is level 8 knowledge (advanced programming) has a baseline required level of 7 (programming). Throw this at somebody who “knows how to use Excel”, which is maybe level 4, and they’ll be totally lost, much less somebody who only knows how to check their e-mail using a browser without even properly understanding the concept of “browser” (like my father), which is maybe level 2 (he can actually use a mouse and keyboard, otherwise I would’ve said level 1).

    I think you’re so far beyond the average person in your expertise in this domain that you don’t even begin to suspect just how little of our domain the average person knows compared to a mere programmer.


  • The more advanced the level of knowledge on something, the more foundational knowledge somebody has to have to even begin to understand things at that level.

    It would be pretty insane to include, in a tutorial for something at a higher level of expertise, all the foundational knowledge needed to get to that level of expertise, just so that an absolute beginner could understand what’s going on.

    Imagine if you were trying to explain something mathematical that required using integrals and you started with “There is this symbol, ‘1’, which represents a single item, and if you bring another single item, this is called addition, for which we use the symbol ‘+’, and the count of entities when you have one single entity and ‘add’ another single entity is represented by the symbol ‘2’. There is also the concept of equality, which means two mathematical things represent the same and for which the symbol we use is ‘=’. Writing this with mathematical symbols: ‘1 + 1 = 2’”, and built the explanation up from there all the way to integrals before you could even start to explain what you wanted to explain in the first place.

    That said, people can put it in “recipe” format, a set of steps to be blindly followed without understanding, but even there some minimal foundational knowledge is required. Consider a cooking recipe: have you ever seen one that explains how to weigh ingredients, or what “boiling” or “baking” is?

    So even IT “recipes” especially designed for those with a much lower level of expertise than the one required to actually understand what’s going on still have some foundational knowledge required to actually execute the steps of the recipe.

    Last but not least, I get the impression that most people who go to the trouble of writing about how to do something prefer to give explanations rather than recipes, because there’s some enjoyment in teaching something to others, which you get when you explain it but seldom from merely providing a list of steps for others to blindly follow without understanding.

    So, if one wants to do something way above the level of expertise one has, look for “recipe” style things rather than explanations (the foundational expertise required to execute recipes is way lower than the one required to understand explanations), and expect that there are fewer recipes out there than explanations. Further, if you don’t understand what’s in a recipe then your expertise is below even the base level of that recipe (for example, if somebody writes “enter so and so in the command prompt” and you have no fucking clue what a “command prompt” is, you don’t meet the base requirements to even blindly follow the recipe), so either seek recipes with an even lower base level or try and learn those base elements.

    Further, don’t even try to understand the recipe if your expertise level is well below what you’re trying to achieve: sorry, but you’re not going to get IT’s “integrals” stuff if your expertise is at the level of understanding “multiplication”.



  • Try Lutris: it integrates with the GOG store, so it will fetch and install the games from there, with proper scripts to configure Wine so that the game just works with no extra configuration (i.e. it works like the Steam launcher does for the Steam store).

    I believe Heroic Launcher does the same, but I’ve just settled on using Lutris and Steam so never got around to testing Heroic.


  • Kinda reminds me of this game one plays in theatre, which is to play the status (you’re given a number between 1 and 10, with 1 having the lowest social status and 10 the highest, and you try to act as such a person).

    Alongside the whole chin-down to chin-up thing, people tend to move faster and more confidently the higher the status. But the reality is that whilst, going up the scale in a professional environment, the higher the status the more busy and rushed people seem, the truly highest status people (the 10s) don’t rush at all. As I put it back then (this was in the UK): “the Queen doesn’t rush, because for everybody the right time for the Queen to be somewhere is when she’s there, even if it’s not actually so, hence she doesn’t need to rush”.

    There was also some cartoon making the rounds many years ago about how people at a company looked depending on their social status, where you started with the unkempt, shabbily dressed homeless person who lived outside the building, and as you went up the professional scale people got progressively better dressed, into suits and such, and then all of a sudden there was a big switch, as the company owner at the top dressed as shabbily as the homeless person.