• 5 Posts
  • 12 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • foolsh_one@sh.itjust.works (OP) to Science Memes@mander.xyz · Lol!!!

    If the horizon of the universe is like the horizon of a black hole, then energy loss through Hawking radiation, via the conversion E=mc^2, simply implies that mass is lost from the universe over time. If we extrapolate this energy/mass loss over time for every mass in the universe, the distance between their surfaces grows as a relative change with the exponentially decreasing mass, directly correlating with the dark phenomena we observe as a geometric quantum event.
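
    For anyone who wants the bookkeeping spelled out, here is a minimal numeric sketch using the standard Hawking power formula for a black hole of mass M as a stand-in; treating the universe's horizon the same way is of course the speculative part, and the solar-mass example is only an illustration:

        import math

        # Standard Hawking result for a Schwarzschild black hole of mass M:
        #   P = hbar * c^6 / (15360 * pi * G^2 * M^2)
        # E = mc^2 then turns the radiated energy into a mass-loss rate dM/dt.
        # Applying this to the horizon of the whole universe is the speculative
        # step described above, not established physics.

        HBAR = 1.054571817e-34   # J*s
        C = 2.99792458e8         # m/s
        G = 6.67430e-11          # m^3 kg^-1 s^-2

        def hawking_power(mass_kg):
            """Radiated power in watts."""
            return HBAR * C**6 / (15360 * math.pi * G**2 * mass_kg**2)

        def mass_loss_rate(mass_kg):
            """dM/dt in kg/s implied by E = mc^2."""
            return -hawking_power(mass_kg) / C**2

        solar_mass = 1.989e30  # kg
        print(f"P     = {hawking_power(solar_mass):.3e} W")
        print(f"dM/dt = {mass_loss_rate(solar_mass):.3e} kg/s")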





  • Well, OpenERP or Openbravo are what I would have recommended ten years ago, but due to their commercialization they aren’t really relevant any longer. If I were setting this up for myself, I would probably use Redmine plus a plugin that adds invoicing functionality. I wouldn’t call it simple for a first-timer to pull off, but once Redmine is mastered you will find it very extensible and customizable to any particular project’s needs.




  • I would like to add to this: I expect the jargon and equations to fly far over the heads of most of the general public, so I will try to sum up the entire process here:

    1. My initial idea was that the horizon of our universe is “leaking” energy, just as a black hole’s horizon does. If that is true, then according to E=mc^2 the universe’s mass should decrease over time.

    2. I had earlier made the connection that dark matter and dark energy may be a geometric phenomenon, and wrote a thought experiment outlining the idea. Take two observers and keep them stationary in space, then in some fantastic way shrink them both at the same rate at the same time without the observers knowing. From their point of view they are moving away from each other: they stay relatively the same size with respect to one another while the distance between them increases as their mass, and thereby their volume, decreases.

    3. Together, these two basic ideas seemed to say that as the universe loses energy from its horizon like a black hole, the atoms and point masses inside it all lose a tiny amount of mass, and thereby each point’s volume decreases a tiny amount. This causes the distance between the particles to increase ever so slightly. Summed over every particle in the universe, this effect gives us the expansion of distance we call dark energy. Taken together with relativity and the laws of motion, the same effect produces a dark-matter signal as well (a toy numerical sketch follows this list).

    4. Lastly, gravity was already thought to probably be quantum in nature, so digging further revealed this interesting, if somewhat novel, view of the universe and everything in it.
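
    Here is the toy numerical sketch referred to in step 3. It is an illustration only: the shrink rate and the separation are made-up numbers, and all it shows is that a fixed coordinate gap looks like a growing distance to observers who are themselves shrinking.

        import math

        # Two observers sit at a fixed coordinate separation. Both shrink at the
        # same exponential rate without noticing. Measured in units of their own
        # (shrinking) size, the gap between them appears to grow over time.
        # k is an arbitrary assumed shrink rate, not a fitted physical value.

        k = 0.01          # fractional shrink per step (assumed)
        gap = 100.0       # fixed coordinate distance between the observers
        size = 1.0        # each observer's size, identical for both

        for step in range(6):
            apparent_gap = gap / size   # the distance the observers themselves measure
            print(f"step {step}: size={size:.4f}  apparent gap={apparent_gap:.2f} observer-lengths")
            size *= math.exp(-k)        # both shrink together, unnoticed by either one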

    I have some of my previous papers hosted on ResearchHub; this link leads to a post with further information about the initial research that led to this paper, and I continue to use ResearchHub for further work.

    I hope this additional information has helped you in some way to understand this idea I call special gravity. Cheers!





  • I completely agree with every word; it was the observations of dark energy and dark matter alone that led me in this direction.

    At one time I tried to describe them with an unknown fifth dimension, but later realized that was only an abstraction. Perhaps black holes and universes really do share this property of evaporation, which, if so, would have interesting consequences.

    I have a thought experiment to go along with the paper; if you’d like to see it, go to https://madhakker.com/ and scroll down one post. It is from when I was trying the fifth-dimension angle, but it does a good job of describing a dark-matter-like signal in terms of a changing mass.



  • Also, since you’re asking about multi-GPU: I have a few other cards stuffed in my backplane. The GeForce GTX 1050 Ti has 4GB of vram and is comparable to the P40 in performance. I have split a larger 33B model across the two cards. Splitting a large model is of course slower than running on one card alone, but it is much faster than CPU (even with 48 threads). However, speed when splitting depends on the speed of the pci-e bus, which for me is limited to gen 1 speeds for now. If you have a faster/newer pci-e standard, you’ll see better results than I do.
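
    For what it’s worth, here is roughly what that two-card split looks like through the llama-cpp-python bindings (a sketch only, assuming a CUDA build; the model path and the ~24:4 split ratio are placeholders rather than my exact settings):

        from llama_cpp import Llama  # assumes llama-cpp-python built with CUDA support

        # Split one quantized 33B model across the P40 (24GB) and the 1050 Ti (4GB).
        llm = Llama(
            model_path="models/33b-model.Q4_K_M.gguf",  # placeholder path
            n_gpu_layers=-1,            # offload every layer to the GPUs
            tensor_split=[0.86, 0.14],  # roughly proportional to 24GB vs 4GB of vram
            n_ctx=2048,
            n_threads=8,
        )

        out = llm("Q: What is a Tesla P40 good for?\nA:", max_tokens=64)
        print(out["choices"][0]["text"])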




  • I have a P40 I’d be glad to run a benchmark on, just tell me how. I have Ooba and llama.cpp installed on Ubuntu 22.04; the machine is a Dell R620 with 2 x 12-core 3.5GHz Xeons (2 threads per core, for 48 threads) and 256GB of RAM @ 1833MHz, plus a 20-slot pci-e gen 1 backplane. The speed of the pci-e bus might impact the loading time of the large models, but it seems not to affect the speed of inference.

    I went for the P40 for its cost per GB of vram; speed was less important to me than being able to load the larger models at all. Including the fan and fan coupling I’m all in at about $250 per card. I’m planning on adding more in the future; I too suffer from too many pci-e slots.

    I don’t think the CUDA version will become an issue anytime soon, but that day is coming, to be sure.
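
    As for the benchmark itself, something as simple as timing a fixed-length generation and reporting tokens per second would work for me; a rough sketch with the llama-cpp-python bindings (the model path and token count are placeholders):

        import time
        from llama_cpp import Llama  # assumes a CUDA build of llama-cpp-python

        # Time one generation of a fixed number of tokens and report tokens/second.
        llm = Llama(
            model_path="models/13b-model.Q4_K_M.gguf",  # placeholder path
            n_gpu_layers=-1,   # keep the whole model on the P40
            n_ctx=2048,
        )

        prompt = "Write a short paragraph about the Tesla P40."
        start = time.perf_counter()
        out = llm(prompt, max_tokens=128)
        elapsed = time.perf_counter() - start

        generated = out["usage"]["completion_tokens"]
        print(f"{generated} tokens in {elapsed:.1f}s -> {generated / elapsed:.2f} tokens/s")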