Beefy can mean different things to different people too. I have a mobile 1660 Ti and it can generate images in decent time (about 40 seconds for a 20-iteration image from a prompt).
I'm slightly lacking in VRAM though; something with 8 GB of VRAM would let you use most models.
Yeah, that's fair enough on the wording.
I'm rocking a 3070 and an 11th-gen i7, but only 16 GB of RAM.
Still pretty quick, IMO.
Took longer for my browser to download the image than it took for you to generate it. :)
Fun fact: it can run on as little as 2 GB of VRAM! It works out of the box with the --lowvram flag, and with some extra fiddling with extensions you can even generate high-resolution stuff.
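For anyone scripting outside the webui (where --lowvram goes in the launch arguments), here's a rough idea of what an equivalent low-VRAM setup looks like with the diffusers library. This is just a minimal sketch, not the webui's actual code; the model id, prompt, and filenames are placeholders.

```python
# Minimal sketch of a low-VRAM Stable Diffusion setup using diffusers.
# Analogous to (not the same as) the webui's --lowvram mode, which keeps
# VRAM usage down by only holding parts of the model on the GPU at a time.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # placeholder model id
    torch_dtype=torch.float16,         # half precision roughly halves VRAM use
)

# Move submodules to the GPU only while they're needed (biggest VRAM saver).
pipe.enable_sequential_cpu_offload()

# Compute attention in slices instead of all at once to cut peak memory.
pipe.enable_attention_slicing()

image = pipe("a mountain cabin at sunrise", num_inference_steps=20).images[0]
image.save("cabin.png")
```

It's noticeably slower than keeping everything on the GPU, but it's how cards with only a couple of GB of VRAM can still get images out.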