- cross-posted to:
- singularity@lemmy.fmhy.ml
- technews@radiation.party
cross-posted from: https://lemmy.intai.tech/post/72919
Parameter count:
GPT-4 is more than 10x the size of GPT-3. We believe it has a total of ~1.8 trillion parameters across 120 layers. Mixture of Experts - Confirmed.
OpenAI was able to keep costs reasonable by using a mixture-of-experts (MoE) model. They use 16 experts within the model, each with ~111B parameters for the MLP, and 2 of these experts are routed to per forward pass. The experts alone account for roughly 16 × 111B ≈ 1.78T parameters, which is most of the ~1.8T total.
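For anyone curious what top-2 expert routing looks like in code, here is a minimal PyTorch sketch. Only the 16-expert / top-2 shape comes from the post above; the class names, layer sizes, and softmax gating scheme are illustrative assumptions, not details from the leak.

```python
# Toy sketch of top-2 mixture-of-experts (MoE) routing.
# Sizes are tiny placeholders, nothing like GPT-4's actual dimensions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Expert(nn.Module):
    """One feed-forward (MLP) expert."""
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_model, d_hidden),
            nn.GELU(),
            nn.Linear(d_hidden, d_model),
        )

    def forward(self, x):
        return self.net(x)

class Top2MoE(nn.Module):
    """Routes each token to the 2 highest-scoring of n_experts experts."""
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 16, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(Expert(d_model, d_hidden) for _ in range(n_experts))
        self.router = nn.Linear(d_model, n_experts)  # gating network
        self.top_k = top_k

    def forward(self, x):                       # x: (tokens, d_model)
        scores = self.router(x)                 # (tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)    # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e           # tokens sent to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: 8 tokens of dimension 64, 16 experts, 2 active per token.
moe = Top2MoE(d_model=64, d_hidden=256)
tokens = torch.randn(8, 64)
print(moe(tokens).shape)  # torch.Size([8, 64])
```

The point of the routing is that only 2 of the 16 expert MLPs run per token, so compute per forward pass is far lower than the total parameter count suggests.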
Related Article: https://lemmy.intai.tech/post/72922
They are the right ones. Should be a tweet archive and a blog post
Well that’s weird because the first takes me to a shitpost with a picture of cake, and the second to a shitpost about sucking your dentist’s fingers…
ewwww lol
Are you using an app or the web? The links should point to the intai instance, which works fine for me, but I don’t know what various clients will do with those links.
I’m using Connect, so that could explain it! Thanks. I’ll see if I can figure it out because this is really interesting to me, but the dentist post is not! Haha!