Not OP, but here are my reasons: I want my apps to be able to talk to each other, so Flatpak is just in the way. Also, I don't see the point of immutable distros. I could boot off of btrfs snapshots years ago; immutability gives me absolutely nothing of value either.
An AI is not a script. You can know what a script does; neural networks don't work that way. You train them and hope you picked the right dataset so they hopefully learn what you want them to learn. You can't test it. You can know that it works sometimes, but you also know that it will sometimes not work, and there's jack shit you can do about it. A couple of gigabytes of floating-point numbers is not decipherable to anyone.
Enjoying it is a different issue. You probably enjoy it because it's more difficult, which is perfectly valid reasoning.
So? Someone invented current LLMs too. Nothing like them existed before either. If they vibe-coded with them, they'd still be producing slop.
Coding an LLM is very, very easy. What's not easy is having all the data, hardware, and cash to train it.
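To illustrate the point: here's a toy character-level bigram model in Python. This is a made-up sketch, nowhere near a real transformer, and `train_bigram`/`generate` are names I invented. The code itself is trivial; whether the output is any good depends entirely on what you feed it.

```python
from collections import defaultdict
import random

def train_bigram(text):
    # "Training" is just counting which character follows which.
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    # Sample the next character in proportion to how often it
    # followed the current one in the training text.
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        nxt = counts.get(out[-1])
        if not nxt:
            break
        chars, weights = zip(*nxt.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

model = train_bigram("the quick brown fox jumps over the lazy dog")
print(generate(model, "t", 10))
```

Scale the same idea up (longer context, learned weights instead of counts) and you're in LLM territory; the hard part was never the code, it's the corpus and the GPUs.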
I like the analogy.
Given that expert systems are pretty much just a big ball of if-then statements, he might be considered to have written the app. Just with way more extra steps.
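For anyone who hasn't seen one: a forward-chaining expert system can be sketched in a dozen lines of Python (hypothetical names, not any particular shell like CLIPS). Rules fire until no new facts can be derived; that's the whole trick.

```python
def run_rules(facts, rules):
    # Apply if-then rules until no new facts are added (forward chaining).
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Each rule: (set of required facts, fact to conclude).
rules = [
    ({"has_fur", "gives_milk"}, "mammal"),
    ({"mammal", "eats_meat"}, "carnivore"),
]
print(run_rules({"has_fur", "gives_milk", "eats_meat"}, rules))
```

Unlike a neural network, every conclusion here can be traced back to the exact rules that produced it, which is why you could reasonably say the rule author "wrote" the behavior.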
Exactly: you can only really verify the code if you were capable of writing it in the first place.
And it's an old, well-known fact that reading code is much harder than writing it.
The "target" is to get useful software out; the AI is the tool. In this example, the AI is the gun: the tool used to achieve the goal.
Anyone can make an improvised hammer. Stick a rock or a piece of metal on a stick. But that doesn’t make them carpenters, even though they made their own tools.
And he still wouldn't understand its output, because, as we clearly see, he doesn't even try to look at it.
So? Some of the people pushing out AI slop would be perfectly capable of writing their own LLM out of widely available free tools. Contrary to popular belief, LLMs are not complex pieces of software, just extremely data-hungry. That does not mean they magically understand the code when the LLM spits something out.
Yes, because that would still mean they didn't code the app.
"Killing is bad!" "But what if the murderer 3D-printed his own gun?"
If they build software using mainly AI-generated code, then they're a bad coder.
I know one person who does this. It's simple: Apple discontinued the iPod Touch, so he had no choice but to get an iPhone without a SIM.