I never understand this obsession with “bloat” when you can buy a 1 TB SSD for € 50.
Or you can’t buy one, if you’re not well-off enough or you’re in the wrong country. For example, in my country a 1TB SSD costs at least $85, and a salary of $2,000 is considered very successful, at the upper limit.
bro a 256 GB SSD here costs 200+
That’s wild. I just bought several recently for $20 ea
That sounds insane. Are computer parts in general that much more expensive than in other countries?
yeah
Do you live in North Korea?
Of course, how else would I use Lemmy?
It’s not about storage. It’s about complexity getting back at you, for example not knowing what caused a problem because multiple programs are stepping on each other’s feet.
For me it’s not about the size, it’s about the understanding. I’d really like to understand what everything on my system does and why it’s there. It seems impossible with modern systems. Back in the '90s I needed a secure email relay - it had lilo, kernel, init, getty, bash, vi, a few shell utils (before busybox…), syslogd and sendmail. I’m not sure any more as it was a long time ago, but I think I even statically linked everything so there was no libc. I liked that system.
For me it was a problem of update frequency and how long updates took. Once I got rid of my Flatpaks and moved to stable Firefox, I update once a week instead of daily, and it takes seconds instead of minutes. Probably also solvable with auto-updates.
Bloat is more about performance.
Bloat multiplies when you have to back it up.
You realize you don’t have to back up the actual “bloated” programs, just maybe their configs and any files those programs generate that you’d like to keep, right?
That’s committing the cardinal sin of cherry-picking your backup contents. You may end up forgetting to include things you didn’t know you needed until restore time, and you’re creating a backup that is cumbersome to restore. Always remember: you should really be designing a restore strategy rather than a backup strategy.
As a general rule I always back up the filesystem wholesale, optionally excluding things I’m 100% sure I don’t need, and keep multiple copies (dailies and monthlies going some time back), so I always have a complete reference of what my system looked like at a particular point in time. If push comes to shove I can always revert to a previous state by wiping the filesystem and copying one of the backups back onto it.
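To make that concrete, here’s a rough Python sketch of such a wholesale snapshot job. It assumes rsync is available; the source, destination, and exclude paths are placeholders, not my actual setup:

```python
#!/usr/bin/env python3
"""Minimal sketch of a whole-filesystem snapshot backup with dated copies."""
import datetime
import subprocess
from pathlib import Path

SOURCE = "/"                      # back up the filesystem wholesale
DEST = Path("/mnt/backup")        # placeholder backup volume
EXCLUDES = [                      # only things we're 100% sure we don't need
    "/proc/*", "/sys/*", "/dev/*", "/run/*", "/tmp/*", "/mnt/*",
]

def snapshot() -> None:
    DEST.mkdir(parents=True, exist_ok=True)
    target = DEST / f"daily-{datetime.date.today().isoformat()}"
    # Hardlink unchanged files against the newest previous snapshot, so each
    # daily costs little extra space but is still a complete, restorable tree.
    previous = sorted(DEST.glob("daily-*"))
    cmd = ["rsync", "-aAX", "--delete"]
    cmd += [f"--exclude={e}" for e in EXCLUDES]
    if previous:
        cmd.append(f"--link-dest={previous[-1]}")
    cmd += [SOURCE, str(target)]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    snapshot()
```

Because every snapshot is a full tree, restoring is just copying one of them back; there’s no picking through partial archives at the worst possible time.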
This seems to happen across all platforms.
What I find interesting is that no one asks about the quality of the code, nor do they seem concerned about the dependencies, but they do care about that one package/app/program of any size that they see and don’t immediately know why it’s there.
So you have a folder and need to find a specific file in it. Would it be faster to find the file when the folder contains 5 entries or 500?
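A toy Python sketch of that point (the counts mirror the example above; real filesystems cache aggressively, so treat this as a demonstration of the linear scan cost, not a benchmark):

```python
import os
import tempfile
import timeit

def scan_for(name: str, directory: str) -> bool:
    # os.scandir walks entry by entry; searching for a name that isn't
    # there forces a full pass over the directory.
    return any(entry.name == name for entry in os.scandir(directory))

for count in (5, 500):
    with tempfile.TemporaryDirectory() as d:
        for i in range(count):
            os.mkdir(os.path.join(d, f"folder-{i}"))
        elapsed = timeit.timeit(lambda: scan_for("needle", d), number=1000)
        print(f"{count:>3} entries: {elapsed:.4f}s per 1000 scans")
```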
Snaps still take longer to load even with that.
It’s not always about storage. It can also mean more processes draining the battery, more attack vectors, etc.