



Why not?
There’s a technical difference. On a single drive, GRUB (or any other modern bootloader) can handle multiple OSes coexisting on the same boot chain; Windows, of course, doesn’t like this. On different drives, it’s the UEFI firmware that chooses which drive to boot from, regardless of which bootloader each one has. Here, Windows doesn’t get a say, and it is less likely to break anything.
Historically, the first case was called dual booting; the second wasn’t. If the same result is achieved, maybe the distinction doesn’t matter anymore. In the old days, though, only one disk was allowed to hold the Master Boot Record: Device 0 on the primary IDE bus. Consumer PCs were limited to two IDE buses, with a Device 0 and Device 1 each, and only that one hard drive on the primary bus could carry the MBR. Nowadays multi-disk boot capability is much easier to get in hardware thanks to EFI system partitions (around since the mid-2000s), but it used to be necessary to fiddle with the MBR even when the OSes were on different disks.
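Since the distinction above hinges on UEFI vs. legacy MBR booting, here’s a minimal sketch (assuming Linux and Python; the function name is mine) of how to check which mode your firmware actually booted in. The kernel exposes `/sys/firmware/efi` only on UEFI boots:

```python
import os

def booted_via_uefi() -> bool:
    # The Linux kernel creates /sys/firmware/efi only when the firmware
    # handed off in UEFI mode; on a legacy BIOS/MBR boot it is absent.
    return os.path.isdir("/sys/firmware/efi")

if __name__ == "__main__":
    mode = "UEFI" if booted_via_uefi() else "legacy BIOS/MBR"
    print(f"Boot mode: {mode}")
```

On a UEFI machine you can then list and reorder the firmware’s boot entries with the `efibootmgr` tool, which is the mechanism that picks a drive without any MBR fiddling.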
It is an important distinction because dual booting, as a concept, almost always exists in relation to Windows. If you have two, three, or more Linux OSes on the same disk drive, nobody calls it dual booting; it’s just booting and choosing your distro, since bootloaders like GRUB are multi-boot by default.
So, yeah, maybe it is dual booting as well, but it isn’t what the term originally meant. It’s just Windows wasting space on a quarantined disk, which I still prefer.
Don’t dual boot. Instead, invest in two drives and dedicate one fully to each OS. Way less headache and far more control, and it’s easier to keep Windows oblivious to Linux’s existence so it doesn’t fuck with it.


Anti-cheat, even kernel-level anti-cheat, has worked on Linux for a very long time. Some of the most popular products used by AAA studios have supported it for years. The studios just intentionally refuse to enable it on Linux.
Take Genshin Impact, for example. It literally has an internal flag that instantly closes the game if it detects it’s running on Linux. There’s no technical limitation stopping any of those big multiplayer titles from working; the publishers just don’t want them to.
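To illustrate the kind of OS gate described above, here’s a hypothetical sketch (not Genshin’s actual code, which isn’t public) of how trivially a launcher could refuse to run on Linux, using only Python’s standard library:

```python
import platform
import sys

def running_on_linux() -> bool:
    # platform.system() reports "Linux", "Windows", or "Darwin".
    return platform.system() == "Linux"

def launch_game() -> None:
    # Hypothetical hard gate: bail out before the game even starts.
    if running_on_linux():
        sys.exit("Unsupported platform")
    print("launching...")
```

The point being that such a check is a deliberate one-liner, not a technical limitation.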


Meh, sure, it was an operational loss for Sony. But the conditions were so different between the PS3 and the Steam Machine that it’s very hard to compare them. First of all, Linux on the PS3 never actually worked well. It was janky and required a ton of workarounds and hacks, not really a viable desktop PC. The famous computation clusters were built by universities and technology enthusiasts; the processing units were too niche for day-to-day use, with virtually no consumer software for them.
Second, Sony got pushed into a higher manufacturing cost than planned because of a Blu-ray shortage and the rising costs of its unique silicon manufacturing. Some say it ran more than 100% over expectations. And I still remember people in gaming circles complaining that it was too expensive.
Third, speaking of Blu-ray, the PS3 was far too ambitious technologically not to be a prime target for this type of scalping: first commercial Blu-ray player, first HDMI output, a “supercomputer for the living room” vision. If anything, it was the cheap-Blu-ray angle that drove scalping and shortages, not the OtherOS capability.
I still think it is an unfounded concern with the Steam Machine. Valve has already said it won’t be sold at a loss. It has no specialized technological advancement in particular; it’s a mid-range entry PC at most. Having worked with many IT and procurement teams, it’s just not a very attractive proposal. It will be seen as a gaming toy, and no exec wants to buy toys for employees.


Nor do I know which point you’re making, because so far you’re arguing about things I haven’t said. Either you lack the reading skills or you’re arguing in bad faith. And I extremely dislike wasting my time explaining myself to someone who is intentionally misreading just to be contrarian.


Way to take the comment out of context and build a strawman. I was reiterating something I said in more detail in a comment higher up. Companies do buy expensive laptops; I said so. Mind you, the Steam Machine is, emphatically, not a laptop.


Not enough to cause a shortage. As I said, no business will pay more than a couple hundred for a desk PC. If they need more juice, the Steam Machine won’t be it. It’s more of an enthusiast or mid-level content-creator machine.


That just wouldn’t happen unless the Steam Machine costs less than $300. That’s usually the most a corporation is willing to pay for bulk mini PCs, which is all they want for clerk desks. Information workers get laptops with Dell or HP embossed on the lid. Workstations for high-end design or video editing need way more juice than the Steam Machine can deliver; those are bought to order from professional boutiques, or they just buy Apple. Also, no administrator will sit on the Steam store page buying one machine at a time. They like their bulk purchases, and Valve can simply refuse anyone buying hundreds of machines. Beyond that, corporations don’t just want the PC; they want tech support, advanced warranty schemes, and so on, usually sold as a per-seat subscription. All things Valve simply won’t provide. It won’t even register as an option for businesses.
This is an unfounded concern.


It’s Fedora-based. If you want to develop on it, it supports containerized workflows. There’s a DX version explicitly designed for developers.


Finished means it’s feature-complete according to the specification and feature-frozen. It says nothing about bugs. Bugs are ethereal qualities, subject to opinion and the criteria chosen for triage. Sudo is finished: it does what it’s meant to do. Does it do it bug-free? For the most part, yes. That doesn’t mean there are no bugs left, but no new bugs are expected to be introduced by active development. Any bugs that surface, as has been the case for a long time now, will be old bugs that simply hadn’t been discovered yet.


That’s where the adapt part comes in.
I had a friend who collected CRTs and VHS players right at the turn from DVD to Blu-ray. He didn’t argue for killing LCDs, HD video, or CDs. He didn’t write to Sony to complain that he couldn’t find VHS at Walmart anymore, or that his hyper-specific CC format didn’t work on DVD exactly the way it did on VHS. He accepted that tech culture had shifted and that to keep his hobby up he had to take on a lot of the upfront work of keeping old tech alive. He learned to repair old CRTs and VHS players and keeps them running for libraries, even collaborating to digitize particularly niche historical content.
I understand and agree. Anyone with a super-specific use case that means they still need X11: go ahead, no one is stopping them. But complaining about or trashing Wayland on that basis is asinine. Every paradigm change breaks someone’s workflow; that’s impossible to avoid. The responsible thing to do is adapt, either with new tools and resources or with a slight change in workflow. They act like people are taking away their toy, when in reality it’s just adding to the pile of available toys. But they’re upset because their toy is old and won’t get repaired anymore, while the new toy is slightly different but a bit easier to clean and repair, so they get mad at the other kids for playing with it, ignoring that the new toy doesn’t make the old toy disappear.
I love your metaphor because it is exactly the kind of pedantry that is usually at play with X11 vs Wayland.
“I can’t take an electric Uber because it has an effective range of less than 400 miles!”
Who the fuck takes an Uber to a destination over four hours away?
A normal person rents a car, takes a bus, catches a train, or buys a plane ticket. Ain’t no one hailing an Uber for a long trip to another city. But that’s exactly the kind of complaint you get from people obsessively clinging to X11. They have a hyper-specific use case or workflow that almost no one else shares.
Oh, my bad. Misread it the other way around. Disregard the comment, then. That said, I disagree with the other commenter. Such a combination of GPUs is not rare at all; it was even frequently recommended to PC builders five or more years ago.


Is it something specific to a particular model or combination of those? Because I run a laptop with an integrated AMD GPU and a discrete NVIDIA GPU (not that rare, actually) and Wayland works flawlessly. Games use XWayland without any issues when necessary.
I used to have my reservations a year or so ago, but Wayland has grown in leaps and bounds over the last couple of years. It is much more ready now.


Canonical, leading the charge towards enshittification of Linux. Who would’ve guessed this 20 years ago?


Their CEO supports a company that bombs kids in Gaza. Should I say more? Because there’s more.
There are three types of NVIDIA failures on Linux:
A- The niche thing that doesn’t work for the group of people who use it.
B- The specific card model that doesn’t work.
C- The distro that for some reason is a nightmare to install the drivers.
Each cause individually affects relatively few people, but all together it’s way more than AMD. Hence the difference.
Also, if you hit a type A failure, there’s a chance it will eventually get fixed. But with type B you’re out of luck: there’s a non-zero chance your card will never work.
Type C comes down entirely to user error and distro effort, but fixing it won’t help with types A and B. If NVIDIA fails you, whether you can install the drivers on your distro or not is irrelevant.