Sounds like you haven’t done it in a while. It has the Calamares installer now.
I haven’t tried it out yet for some reason, but I’d start here https://youtu.be/qlfm3MEbqYA
Going to be interesting to see Fox News shift gears into full-time demonization of her.
I don’t see this word in the article. I’m guessing OP changed the title or perhaps CNN did after the fact. Nothing surprises me anymore.
I once went for two 128 MB RAM sticks (256 MB total) with lower CAS timings instead of two 256 MB sticks at slower speeds, because I thought 512 MB was insane overkill. I realized how wrong I was when trying to play the Star Wars Galaxies MMORPG: when a lot of people were on screen it started swapping to disk. Look up the specs for an IBM Aptiva, the first computer my parents bought, and you’ll understand how 512 MB could seem like a lot.
Now my current computer has 64 GB (most gaming computers went for 32 GB at the time I built it). My workstation at work has 128 GB, which really isn’t even enough for some workloads we have that use a lot of in-memory cache… And large servers can have multiple TB of RAM. My mind has been blown multiple times.
I’m pretty convinced from watching every season of Alone that catabolysis was likely the main factor rather than eating berries.
Why does “Why did you switch from*” have different options for each distro? Thought it was funny that only NixOS had a “toxic people” option. Guessing it’s due to recent drama.
Wake me up when nixpkgs issues decline significantly from 5k+ due to AI.
I share your concern about over-dependence, but for different reasons.
There’s a lot of metadata GitHub has that’s not backed up in the git repo: for example, pull requests, issues, projects, milestones, wikis, etc.
There appear to be third-party tools that can back this metadata up using GitHub’s API. I wonder if anyone bothered. Kinda doubt it.
After the Google Cloud UniSuper incident, I think it’s clear that no org is immune from accidentally deleting everything.
which somehow belong to Microsoft and is not good for privacy.
It’s hard for me to guess exactly what your concern is. There’s nothing stopping you from using an anonymous account on GitHub to contribute, or just connecting to it via Tor / VPN.
If it’s mostly about sticking to principles, then you might like https://guix.gnu.org/ more.
There are also these projects you could contribute to:
I think they’re all using Forgejo or some FOSS git forge. At least I think Aux is still planning to move to Forgejo. However, I’m not aware of anyone still planning to fork nixpkgs.
Most of the survey was about AI, it seemed.
Here are some more anti-patterns: https://gist.pother.ca/2e5817a37b1229ea1930/
All my homies use TempleOS
Running
We need a cocaine tortoise horror movie.
I thought it was https://upload.wikimedia.org/wikipedia/en/c/c3/Nothingremastered.jpg at first
I dodged that bullet then. I do have a Bosch dishwasher. It’s fairly new and I like it so far.
Hmm, not sure. That’s what I was going to call it at first. I couldn’t find much either way, but I did find this, which uses “ephemeral” to describe it. Perhaps I should rename it.
I was wondering, if I am using a large package like TexLiveFull, how do I make sure nix doesn’t delete large packages after I close the shell? I also don’t want this package to be available in my global environment, as I don’t need to use it outside vscode.
There are a bunch of tools that solve this problem.
https://github.com/direnv/direnv/wiki/Nix
In the link above, check out the table in the “Some factors to consider” section. However, note that it hasn’t been updated since May 30, 2022.
Many of those tools don’t depend on direnv, if you don’t need its functionality. Personally, I use direnv and enable nix-direnv using these options:
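The exact options aren’t shown above; a minimal sketch, assuming home-manager (the NixOS module exposes equivalent options), would be:

```nix
# Minimal sketch, assuming home-manager; these are not necessarily the author's exact options.
programs.direnv = {
  enable = true;            # install direnv and hook it into the shell
  nix-direnv.enable = true; # use nix-direnv's faster, cached Nix integration
};
```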
Here’s an example of how I use direnv with nix-direnv.
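The concrete example isn’t included above, so here is a hedged sketch of what such a project can look like; the system, the texliveFull attribute, and the file contents are assumptions rather than the author’s actual setup:

```nix
# flake.nix sketch (texliveFull is texlive.combined.scheme-full on older nixpkgs)
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
  outputs = { self, nixpkgs }:
    let pkgs = nixpkgs.legacyPackages.x86_64-linux;
    in {
      devShells.x86_64-linux.default = pkgs.mkShell {
        packages = [ pkgs.texliveFull ]; # large package stays out of the global environment
      };
    };
}
```

With a one-line .envrc containing “use flake”, nix-direnv also registers the dev shell as a garbage-collection root, so nix-collect-garbage won’t delete the large package between sessions, which is what the original question was asking about.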
Edit: damn, overwrote what I wrote for the first question with a response to the second question. Thank goodness for the automatic file backups I have set up in Emacs.
Another question is how to keep my packages up to date. I don’t do serious development work, thus I typically prefer my packages and dev tools to be on the latest version. I prefer to have as little management of this as possible. Ideally, every time I start up a nix shell, the package manager would grab the latest version of the package if possible, without requiring additional interaction from me. Is this possible?
Definitely sounds like you should look into using https://direnv.net/. Once you direnv allow the directory, as soon as you enter it direnv will create a per-project isolated development environment.
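Roughly (the path here is just illustrative):

```sh
cd ~/projects/my-project   # hypothetical project containing an .envrc
direnv allow               # one-time approval; afterwards the environment loads automatically on cd
```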
Then in the .envrc file you could have something like:
```sh
nix flake update
use flake
```
That’s if you’re using nix flakes, which also implies you’re using git.
However, without flakes you could use a tool like:
And run its update command from the .envrc.
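For example, assuming niv as the pinning tool (the original doesn’t name one), the .envrc could look like:

```sh
# .envrc sketch for a non-flake project, assuming niv-managed pins
niv update   # refresh the pinned sources in nix/sources.json
use nix      # then load shell.nix through direnv's Nix integration
```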
Or if you don’t want to use direnv, then perhaps run an update command from the nix shellHook:
```nix
shellHook =
  ''
    echo "Hello shell"
    export SOME_API_TOKEN="$(cat ~/.config/some-app/api-token)"
  '';
```
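The snippet above is the generic shellHook example; a hedged sketch of the update variant as a full shell.nix, again using niv purely as an illustration, might look like:

```nix
# shell.nix sketch; the update tool is an assumption, substitute whatever manages your pins
{ pkgs ? import <nixpkgs> { } }:
pkgs.mkShell {
  packages = [ pkgs.niv ];
  shellHook = ''
    niv update   # refreshed pins take effect the next time the shell is entered
  '';
}
```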
Sorry, I’m not sure about your last question.
Edit: If you’re using git and a forge like GitHub, then you could use a GitHub action to automate the update and create a PR, such as https://github.com/DeterminateSystems/update-flake-lock
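A sketch of such a workflow, adapted from that action’s documented usage (check its README for the current inputs before relying on it):

```yaml
# .github/workflows/update-flake-lock.yml (sketch)
name: update-flake-lock
on:
  workflow_dispatch:        # allow manual runs
  schedule:
    - cron: '0 0 * * 0'     # run weekly
jobs:
  lockfile:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: DeterminateSystems/nix-installer-action@main
      - uses: DeterminateSystems/update-flake-lock@main
```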
Personally, for projects I use direnv + flakes and that GitHub action above, but I can understand if you don’t want to mess with learning git.
Yes