"Intel’s Arc A770 and A750 were decent at launch, but over the past few months, they’ve started to look like some of the best graphics cards you can buy if you’re on a budget. Disappointing generational improvements from AMD and Nvidia, combined with high prices, have made it hard to find a decent GPU around $200 to $300 — and Intel’s GPUs have silently filled that gap.
They don’t deliver flagship performance, and in some cases, they’re just straight-up worse than the competition at the same price. But Intel has clearly been improving the Arc A770 and A750, and although small driver improvements don’t always make a splash, they’re starting to add up."
If you can bear the terrible drivers, consider a used Nvidia card. They can be decent deals for gaming as well.
This is so the way. I'm using a used Tesla P40 in a Linux server for AI stuff. Card goes hard.
What happened to the drivers for the old cards to make them bad?
Crashes, broken adaptive sync, general display problems and, most importantly, stutter. I'm running a driver version from about a year ago on my 1070 Ti because every time I try to update, some game starts to stutter and I have to use DDU (Display Driver Uninstaller) and try multiple versions until I find one that doesn't have that problem.
About 2-3 weeks ago, an update also badly worsened LLM performance on RTX 30- and 40-series cards. There were a lot of reports on Reddit; not sure if they've fixed it yet.
My default advice for any issue on r/techsupport that could be Nvidia-driver-related has been to run DDU and install a version from 3-6 months ago, and that has worked shockingly well.
That reminds me, have the r/techsupport mods migrated to lemmy yet? Their explanation of the whole reddit issue was great, so I don’t think they’ll want to stay on there.
Anyways, back to the topic. Since OP also mentioned ROCm, I'm assuming he uses Linux for that. The Nvidia drivers on Linux are pretty much unusable because of all the glitches and instabilities they cause. Nvidia is a giant meme in the Linux community because of this.
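For anyone on Linux who found a driver version that works and wants to keep the package manager from "helpfully" upgrading it, apt pinning does the trick on Debian/Ubuntu. Rough sketch below; the exact package name and version string depend on your distro and which driver branch you installed, so treat these as placeholders:

```
# /etc/apt/preferences.d/nvidia-pin
# Pin the working driver branch so apt upgrade won't replace it.
# "nvidia-driver-535" and "535.*" are examples; substitute your own.
Package: nvidia-driver-535 libnvidia-*-535
Pin: version 535.*
Pin-Priority: 1001
```

A priority above 1000 means apt will even downgrade to the pinned version if something newer slipped in. On Arch-based distros the equivalent is adding the driver packages to `IgnorePkg` in pacman.conf.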