• 4 Posts
  • 23 Comments
Joined 1 year ago
Cake day: June 17th, 2023

  • I once went for two lower-CAS-latency 128 MB RAM sticks (256 MB total) instead of two 256 MB sticks with slower timings, because I thought 512 MB was insane overkill. I realized how wrong I was when playing the Star Wars Galaxies MMORPG: whenever a lot of people were on screen, it started swapping to disk. Look up the specs for an IBM Aptiva, the first computer my parents bought, and you’ll understand how 512 MB could seem like a lot.

    Now my current computer has 64 GB (most gaming computers went for 32 GB at the time I built it). My workstation at work has 128 GB, which really isn’t even enough for some workloads we have that use a lot of in-memory cache… And large servers can have multiple TB of RAM. My mind has been blown multiple times.

  • I share your concern about over-dependence, but for different reasons.

    There’s a lot of metadata on GitHub that isn’t backed up in the git repo itself: for example, pull requests, issues, projects, milestones, wikis, etc.

    There appear to be third-party tools that can back this metadata up using GitHub’s API. I wonder if anyone has bothered. Kinda doubt it.
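
    For a sense of what that involves, here’s a rough sketch of pulling some of that metadata yourself via the REST API (OWNER/REPO and the token path are placeholders; a real backup tool would also handle pagination, comments, wikis, and so on):

    # fetch issues, pull requests, and milestones as JSON
    curl -sH "Authorization: Bearer $(cat ~/.github-token)" \
      "https://api.github.com/repos/OWNER/REPO/issues?state=all&per_page=100" > issues.json
    curl -sH "Authorization: Bearer $(cat ~/.github-token)" \
      "https://api.github.com/repos/OWNER/REPO/pulls?state=all&per_page=100" > pulls.json
    curl -sH "Authorization: Bearer $(cat ~/.github-token)" \
      "https://api.github.com/repos/OWNER/REPO/milestones?state=all" > milestones.json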

    After the Google Cloud UniSuper incident, I think it’s clear that no org is immune from accidentally deleting everything.

    “which somehow belong to Microsoft and is not good for privacy”

    It’s hard for me to guess exactly what your concern is. There’s nothing stopping you from using an anonymous account on GitHub to contribute, or just connecting to it via Tor or a VPN.

    If it’s mostly about sticking to principles, then you might like https://guix.gnu.org/ more.

    There are also these projects you could contribute to:

    I think they’re all using forgejo or some other FOSS git forge. At least, I think aux is still planning to move to forgejo. However, I’m not aware of anyone still planning to fork nixpkgs.

  • Another question is how to keep my packages up-to-date. I don’t do serious development work, so I typically prefer my packages and dev tools to be on the latest version, with as little management of this as possible. Ideally, every time I start up a nix shell, the package manager will grab the latest version of the package if possible without requiring additional interaction from me. Is this possible?

    Definitely sounds like you should look into using https://direnv.net/. Once you run direnv allow in a directory, direnv will load a per-project, isolated development environment as soon as you enter it.

    Then, in the .envrc file, you could have something like:

    nix flake update   # refresh flake.lock so inputs point at the latest versions
    use flake          # load the dev shell defined by the flake
    

    This assumes you’re using nix flakes, which also implies you’re using git.
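
    To wire it up the first time (assuming direnv is installed and hooked into your shell):

    # from the project root, after creating the .envrc above
    direnv allow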

    However, without flakes you could use a tool like:

    And run their update command from the .envrc.
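
    As a rough sketch, a non-flake .envrc might look like this (the first line is a placeholder for whatever tool’s update command you end up using; use nix is direnv’s non-flake loader for shell.nix/default.nix):

    # .envrc without flakes
    your-pin-tool update   # placeholder: substitute the update command of your chosen tool
    use nix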

    Or, if you don’t want to use direnv, then perhaps run an update command from the nix shellHook, which runs every time the shell is entered. A generic shellHook looks something like:

    shellHook =
      ''
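        # example setup commands; an update command could be added here as well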
        echo "Hello shell"
        export SOME_API_TOKEN="$(cat ~/.config/some-app/api-token)"
      '';
    

    Sorry, I’m not sure about your last question.

    Edit:

    If you’re using git and a forge like GitHub, then you could use a GitHub Action to automate the update and create a PR, such as https://github.com/DeterminateSystems/update-flake-lock
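
    Roughly speaking, that action automates something like the following on a schedule (a hand-wavy sketch using the gh CLI; the branch name and PR title are arbitrary):

    git switch -c update-flake-lock
    nix flake update                     # bump flake.lock to the latest inputs
    git commit -am "Update flake.lock"
    git push -u origin update-flake-lock
    gh pr create --title "Update flake.lock" --body "Automated flake input update"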

    Personally, for my projects I use direnv + flakes and that GitHub Action above, but I can understand if you don’t want to mess with learning git.