No.
If you look back at Steam's hardware survey from when more people were doing it, it still wasn't that popular, with only 2-4% of Steam users running SLI/CF. The added expense, extra heat, and limited game support never added up to a good return on your investment.
It would probably be more if it actually worked in anything new…
I’m glad it’s gone - we need to trend towards efficiency, not brute force.
I used CrossFire with a 5870+5970 back in ~2010-2011. It wound up being nice when Bitcoin mining started to kick off, but for gaming it was awful.
I stubbornly tried CrossFire again in 2015 with dual R9 290 GPUs. It was incredibly broken and caused some very frustrating issues, like random audio pops and stutters. The problems were really common back in the day with no decent solutions. I wound up returning the second 290 in under 24 hours and picked up a 980 Ti instead.
I have zero plans of trying multi-GPU for gaming ever again.
Back in around 2006 or so I used to post on Toms a lot, and we always had the “New Build Advice” threads where the OP wanted to “future proof” with an SLI/CF build. After mentioning the pros (few) and cons (more) of SLI/CF, I always pointed out that the numbers show you either go multi-GPU out of the gate or do a whole new build before you ever buy that additional video card. So instead of the high-wattage (and often mediocre) PSU, full-tower case, and overpriced mobo that you will likely never need, why not get a better CPU and/or GPU that you can actually use out of the gate?
FYI, My character list is back and I can finally post on my main toon.
It’s really interesting comparing stuff from that era to today - even the most inefficient CPUs out there (Intel, I’m looking at you) pale in comparison to the amount of raw power our GPUs of old used to pull. For example, a fully overclocked 10900K pulls maybe 300W at full render load, whereas your typical gaming CPU, a Ryzen 5 3600, will pull maybe 50-60W under a gaming load.
A 2006-era high-end GPU would pull nearly 300w all on its own!
Today (well, as of right now) our most powerful consumer GPU, the 2080 Ti, comes in at sub-400W total system power while gaming. With more modest hardware, a <300W total system power gaming load is pretty much the norm.
An OC'd AMD Phenom II X4 955/965 (I had the 955) and ATI 4870X2 (AMD decided to do the CF for you) could pull almost 500W. Today a Ryzen 9 3900X and RTX 2080 Ti will pull under 400W while gaming, as you stated (about a 100W difference). Add in more efficient power supplies that offer 3x the warranty, and temps/power demands have certainly changed for the better.
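The arithmetic behind that comparison can be sketched quickly. A minimal back-of-envelope tally, using the published TDPs of the parts named above plus a rough allowance for the rest of the system (the board/RAM/drive estimates are my own assumptions, and real gaming draw varies from TDP):

```python
# Rough power-budget comparison of the two builds discussed above.
# GPU/CPU figures are published TDPs; "rest of system" is an assumed estimate.
builds = {
    "2009 build": {
        "Phenom II X4 955": 125,        # W, TDP
        "Radeon HD 4870 X2": 286,       # W, TDP (dual-GPU card)
        "rest of system (est.)": 75,    # assumption
    },
    "2019 build": {
        "Ryzen 9 3900X": 105,           # W, TDP
        "RTX 2080 Ti": 250,             # W, TDP (reference spec)
        "rest of system (est.)": 40,    # assumption
    },
}

for name, parts in builds.items():
    total = sum(parts.values())
    print(f"{name}: ~{total} W")
```

Under those assumptions the 2009 build lands near 500W and the 2019 build under 400W, which lines up with the roughly 100W gap mentioned above.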
What cards are those? Top-tier GPUs from 2006 like the GeForce 7950 GX2 or Radeon X1950 XTX had under 200W TDP. Even NVIDIA’s notoriously power-hungry GTX 480 released in 2010 operated at 250W, and the dual-GPU GTX 295 at close to 300W.
Modern GPUs use more power than their predecessors. My 2080 Ti out of the box would suck up 300W with the factory OC, but I’ve dialed the power limit down to a more reasonable 250W.
The cards with the highest TDP in recent enough history have been dual-GPU cards. The GTX 590, for example, had a whopping 365W TDP.
Thankfully, on the whole everything is still much more efficient across the board.
I was thinking of the GeForce GTX 280 - total system power draw in reviews was over 300W.
Later, the GTX 480 stood out as a high-power-draw GPU.
Guessing ray tracing will go the way of the dodo soonish too? After all, it's still a really inefficient way to build out graphics, is it not?