I don’t think that you can do the current XL models with 8GB, even for low-resolution images. Maybe with --lowvram or something.
I’ve got a 24GB RX 7900 XT and would render higher resolution images if I had the VRAM – yeah, you can sometimes sort of get a similar effect by upscaling in tiles, but it’s not really a replacement. And I am confident that even if they put a consumer card out with 128GB, someone will figure out some new clever extension that does something fascinating and useful…as long as one can devote a little more memory to it…
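For what it's worth, the tiled-upscale trick mentioned above boils down to covering the image with overlapping tiles so each one fits in VRAM, then blending the seams. A minimal sketch of the tile layout in plain Python (tile and overlap sizes are illustrative, not from any particular extension):

```python
def tile_starts(size, tile=512, overlap=64):
    """1-D start offsets for overlapping tiles covering [0, size).

    Each tile is `tile` pixels wide and overlaps its neighbor by
    `overlap` pixels; the last tile is pinned to the image edge.
    """
    if size <= tile:
        return [0]
    step = tile - overlap
    starts = list(range(0, size - tile, step))
    starts.append(size - tile)  # pin final tile to the edge
    return starts

# Tile a 1024x1024 upscale target into 512px tiles with 64px overlap.
xs = tile_starts(1024)
ys = tile_starts(1024)
boxes = [(x, y, x + 512, y + 512) for y in ys for x in xs]
print(len(boxes), "tiles:", boxes[:3], "...")
```

Each tile gets processed independently (so peak VRAM is bounded by the tile size, not the full image), which is also why it's not a true replacement: the model never sees global context across tiles.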
And then Stable Diffusion showed up
I'm getting away with my 8GB for now.
It's the language/text stuff that really needs like 30GB GPUs.
I do XL all the time, at about 30-45 seconds per image. 8GB is surprisingly enough for SDXL, and I run ~7GB models with 3-6 LoRAs on top.
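The rough math on why 8GB can squeak by: at fp16, every parameter costs 2 bytes, so the weights of SDXL's ~2.6B-parameter UNet alone come to a bit under 5GB, leaving a few GB for the VAE, text encoders, and activations. A quick back-of-envelope sketch (parameter counts are approximate):

```python
def weights_gb(params_billion, bytes_per_param=2):
    """Approximate memory for model weights alone, in GiB.

    bytes_per_param=2 assumes fp16/bf16; use 4 for fp32.
    """
    return params_billion * 1e9 * bytes_per_param / 1024**3

# SDXL UNet is roughly 2.6B parameters
print(f"SDXL UNet at fp16: ~{weights_gb(2.6):.1f} GB")
print(f"Same UNet at fp32: ~{weights_gb(2.6, 4):.1f} GB")
```

This also shows why fp32 is a non-starter on 8GB cards: the same weights double to nearly 10GB before anything else is loaded.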