Explicit Multi-GPU Support added to Shadowlands!

Not necessarily. If your favorite games support it, then yes, it’s worth it. So far mine do: Shadow of the Tomb Raider, Doom Eternal, World of Warcraft, etc.

I run a 4K monitor capable of native 120Hz… so with only one graphics card, I come nowhere close to my monitor’s refresh rate, and running two graphics cards is pretty much the only viable option to hit 120fps @ 4K at ultra quality settings in current games.

But if you’re still living in the past and running 1080p or 1440p and/or have a low refresh rate monitor, then yeah, it’s a pointless thing to do and you’ll gain zero benefit.

1 Like

DO IT~! share your before and after fps :D!

Also because I’m curious whether mixing two different GPU vendors still works in 2020, and whether it works in WoW, since unfortunately I don’t have an AMD card on hand to try myself.
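For what it’s worth, the reason mixing vendors can work at all is that with DX12 explicit multi-GPU the game itself enumerates every adapter in the system, instead of relying on the driver’s SLI/CrossFire pairing. Here’s a rough sketch of what that enumeration looks like using standard DXGI/D3D12 calls (not WoW’s actual code, just an illustration):

```cpp
// Minimal sketch: list every adapter in the system and whether it can do D3D12,
// regardless of vendor. Build with the Windows 10 SDK.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, adapter.ReleaseAndGetAddressOf()) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip the software/WARP adapter

        // Passing nullptr for the device just tests feature-level support without creating one.
        const bool d3d12ok = SUCCEEDED(D3D12CreateDevice(
            adapter.Get(), D3D_FEATURE_LEVEL_11_0, __uuidof(ID3D12Device), nullptr));

        wprintf(L"Adapter %u: %s (VendorId 0x%04X) - D3D12 capable: %s\n",
                i, desc.Description, desc.VendorId, d3d12ok ? L"yes" : L"no");
    }
    return 0;
}
```

In principle any adapter that passes the D3D12 check is usable by the application, whatever its VendorId; whether a given game actually chooses to spread work across more than one is entirely up to the game.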

It won’t fit in my PC :cry: it smacks into my audio card.

Take the audio card out for now… for science.
Use the built-in motherboard sound in the meantime :stuck_out_tongue: or no sound at all.

I’ll have to do it a bit later. My brother came back and is now using his PC.

Well, today I received a 3080, and for the hell of it decided to stick one of the 3070s in with it.

https://i.imgur.com/wvuouRN.jpg

The 3080 alone was giving me ~200 fps at graphics setting 7 in game with RTX off @ 4K, and after sticking in the 3070 that jumped up to ~250 fps :slight_smile: Of course this was just for science, as I no longer plan to keep my system in SLI; the single 3080 alone achieves what I wanted, 4K@120 at a decent graphics quality setting.

Well science time is over, now it’s time to take that 3070 out and cable manage…

3 Likes

Only useful if your screen’s refresh rate can keep up (not saying that you don’t know this, but lots of people don’t).

Yep, my 4K monitor is 120Hz, so the only way to achieve that at the time was with two 3070s, since a single 3070 didn’t come anywhere close to maxing out the monitor’s refresh rate.

It also would have been very useful with my last monitor, a 1440p 240Hz Samsung Odyssey G9, as my single 2080 Ti at the time only maintained around 180 fps, which is nowhere near that monitor’s full capabilities.

2 Likes

With the terrible luck I have had with AMD cards, I generally stick with Nvidia. I had two RX 580s: one didn’t work out of the box and the other caught fire 17 days after purchase. Good ole Best Buy and their 15-day return policy would not budge on it, so I had to send the card back to XFX. During that time I bought a GTX 1650 and haven’t had a bit of trouble with it. I am still stuck with the RX 580 the manufacturer sent back, so I put it in my son’s computer. The thing is loud, and I hate AMD’s drivers. A lot of older games don’t work right on the AMD card; they run too fast. Even trying the frame rate limit option, I still couldn’t get them to work with it.

I also tried upgrading to an AMD card 20 years ago, and the game I was playing at the time didn’t even work with it. So I took it back, got an Nvidia card, and had no issues.

Honestly, it’s probably just bad luck/timing on my part, but I generally try to stick with Nvidia products. Also, as you can probably tell, I don’t buy top-end hardware either.

2 Likes

Exactly this.

Unfortunately, the lay person is just wooed by some simpleton graph or benchmark, or hears all the tech reviewers claiming that Intel is turning into a dumpster fire because of Ryzen, etc… But in reality, Intel just shrugs it off because gaming PCs are only a minuscule fraction of their business… It’s the reason supercomputers the world over still use Intel CPUs and not AMD’s to this day, even though AMD has EPYC CPUs. Even though on the surface it may appear that AMD is beating Intel, in reality AMD is far behind and lacking in features, especially the proprietary/intellectual property that Intel holds the keys to, which is vital to business, scientific, and research computing…

Nvidia is in the same boat as Intel… Their feature set, IP, and proprietary features are so far ahead of AMD that regardless of whether the AMD 6000 series is faster than the Nvidia 30 series in games, it won’t have any impact in the professional/medical/scientific/AI/business space, where the real money is.

Plus there’s the fact that Nvidia can churn out a new product line faster and more easily to retake the lead at a moment’s notice, unlike AMD, which takes a year or more to develop a new product line… The 3080 Ti is rumored to launch in January, priced to take on the 6900 and outperform it… It will be no different than when AMD launched the 5000 series: Nvidia will announce their next products right after AMD releases cards, steal their thunder, and cause potential AMD customers to wait…

1 Like

Waste of money. Just need one decent card.

Not necessarily, especially if you SLI previous generations… Let’s say you currently have a 2080 Ti and happen to have your old 980 sitting in your closet… you can pop that older card into your system and may see a performance boost.

In other words, if you have a second graphics card lying around collecting dust… you may have a free performance boost waiting for you. Plus, you can score a previous-generation graphics card for cheap these days… the 10-series Nvidia cards are virtually worthless now that the 30 series is here, and 900-series cards cost pennies…

So it may be worth trying out to see if it works for you or not.

2 Likes

I have a 970 collecting dust. I may give it a shot and see if it’s worth the extra wattage strain on my power supply.
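If it helps, a quick back-of-the-envelope sum is usually enough to sanity-check the PSU side. Something like the sketch below, where every wattage figure is an assumed reference TDP or a rough guess, not your actual system (the 3080 is just a stand-in for whatever your main card is):

```cpp
// Back-of-the-envelope PSU headroom check -- every number below is an
// assumption (reference TDPs / rough guesses), not a measurement.
#include <cstdio>

int main() {
    const int psu_watts      = 750; // assumed PSU rating
    const int main_gpu_watts = 320; // e.g. an RTX 3080 reference TDP (assumed main card)
    const int old_gpu_watts  = 145; // GTX 970 reference TDP
    const int cpu_watts      = 125; // assumed CPU TDP
    const int rest_of_system = 75;  // board, RAM, drives, fans -- rough guess

    const int load = main_gpu_watts + old_gpu_watts + cpu_watts + rest_of_system;
    std::printf("Estimated load: %d W of %d W (%.0f%% of the PSU)\n",
                load, psu_watts, 100.0 * load / psu_watts);
    return 0;
}
```

If the estimate lands close to the PSU’s rating, the second card probably isn’t worth the strain; with plenty of headroom left over, it’s basically free to try.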

Badly optimized? All but the most recent expansions can run on a toaster at playable framerates.

1 Like

1660 Super runs the game fine, unless you want 100+ FPS at 4K.

That’s exactly what I want :stuck_out_tongue: I’m running an LG CX 48" OLED 4K monitor @ 120Hz here, so maintaining a consistent 120+ fps without having to lower the game’s quality settings is my ultimate goal.

1 Like

I use a 2070 Super. More than enough for WoW.

1 Like

I guess I am getting old, since I am OK with 1080p never going below 60 FPS with max settings.

It’s one of those things where, unless you experience it, you’re fine because you don’t know what you’re missing out on…

I was in the same boat a few years ago with my 1080p 60Hz Asus monitor that I used to overclock to 72Hz; all was fine in the world… It wasn’t until I got my first 1440p 144Hz monitor that things changed… I knew I could never go back to 1080p, since for a while I did use my old 1080p panel as a second monitor, and it was a night and day difference in quality. Then the same thing happened again going from 1440p to 4K.

I’m surprised that more people aren’t talking about this. If it really works, it will certainly make things easier for me.

In the past I’ve run WoW on:
ATI/AMD 4850 Crossfire (2 cards)
ATI/AMD 4870x2 Quad-Crossfire (2 dual-GPU cards)
Nvidia GTX 680 SLI (2 cards)
Nvidia GTX 680 Triple SLI (3 cards)

I had a great experience with all of those setups.

I’m using a single RTX 2080 at the moment. Seeing the new GPU launches from Nvidia and AMD, and the hassle of actually trying to find one in stock, made me a bit sad. It reminded me that if SLI worked in DirectX 12, I would probably just buy a second 2080 right now instead.

Maybe that plan will work after all.