Nvidia RTX Launch Suggests AMD May Have A Shot At High End Of GPU Market With Big Navi

Just wanted to see what you folks thought about this.

I thought this was interesting:

Assuming AMD’s 1.5x performance per watt claim holds, Big Navi should be about 1.5x the performance of current Navi at the same power. Navi is currently at about the same performance level as previous generation Nvidia 2070 Super. The implication here is that Big Navi could slot in between 3070 and 3080 at the same power level as current Navi. However, if AMD pulls the same trick as Nvidia and increases the power envelope, Big Navi could pull up closer to or a bit above 3080. If AMD does achieve that, it will be quite the coup as AMD has not participated at the high end of the market in many years.

While it is difficult to tell where AMD will land due to many variables involved in estimating performance, it is fair to say that AMD now seems to be in the hunt at the high end of the gaming GPU market.

Even if AMD cannot deliver at the 3080 level, it would be adding the key Ray Tracing technology to its products and reducing the functional gap with Nvidia.

Source: https://seekingalpha.com/article/4372391-nvidia-rtx-launch-suggests-amd-may-shot-high-end-of-gpu-market-big-navi
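To make the article’s back-of-envelope math concrete, here’s a rough sketch in Python. The power figures are illustrative placeholders of my own (not AMD specs), and the linear scaling with power is a deliberately crude assumption:

```python
# Back-of-envelope projection from the article's 1.5x perf-per-watt claim.
# All baseline numbers here are illustrative placeholders, not benchmarks.

navi_perf = 1.0          # normalize current Navi (~2070 Super level) to 1.0
navi_power_w = 225       # assumed board power for a 5700 XT-class card
ppw_gain = 1.5           # AMD's claimed RDNA 2 performance-per-watt uplift

# Same power envelope: performance scales with the perf/watt gain alone.
same_power = navi_perf * ppw_gain
print(f"Big Navi at {navi_power_w} W: ~{same_power:.2f}x current Navi")

# Raised power envelope (the "Nvidia trick"): also scale by the power
# ratio. This naive linear scaling ignores that perf/watt usually falls
# as clocks are pushed, so treat it as an optimistic upper bound.
raised_power_w = 300
raised = navi_perf * ppw_gain * (raised_power_w / navi_power_w)
print(f"Big Navi at {raised_power_w} W: ~{raised:.2f}x current Navi (optimistic)")
```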


The more competition between AMD and Nvidia (and Intel), the better it is for us consumers. How game developers handle ray tracing will be a major factor for both companies (fortunately or unfortunately).

Although I’m somewhat of an AMD GPU fan myself, I’m not overly confident they can pull off brute forcing their way to performance. The last two generations, Vega and Navi, have both had issues taming their thermals, and their last attempt at brute forcing (the Radeon VII, which was Vega) was almost an utter failure.

I think they’ll still be competitive in the sub-$500 market, but I’m not convinced they’ll offer competition above that.

I’m not sure we can use 1st gen Navi as an example. It wasn’t fully developed RDNA, and I think the thermal issues were largely a holdover from GCN. So far as we know, Big Navi will be the first release featuring RDNA exclusively (no GCN mixed in), so perhaps that is where the big jump will come from.

They have also had the added development time with RDNA itself as a result of the Navi release (I almost think the first Navis were a stopgap, meant to show they were still pushing for GPU relevance). My opinion on the future of Radeon hasn’t changed though, even with the Ampere release: if Radeon can’t make a card that can at least compete with Nvidia’s current three-SKU lineup, they will never catch up, and should relegate themselves to the budget gamer market and just put their eggs there, making the best low- and mid-grade GPUs they can for the sub-$500 or $400 market.

There comes a point where people just won’t believe them. Ryzen gave the public a glimpse of hope that AMD could come out of their competitors’ shadow and stand on their own. The RVII should have done that for the GPU segment, heck… the 5700 XT should have done that (they did, but barely, with the previously mentioned issues). The GPU division doesn’t yet have the goodwill to miss a target.

I’ll just settle for a stable video driver where I don’t have to start playing with settings just to get it working.

I haven’t had much of a problem - except I’ve pretty much had to apply an undervolt/overclock/fan curve adjustment to both my Vega 64 and my 5700 XT to get them to perform well.

That’s kind of an AMD thing at this point, though. I’m used to it.

If AMD is able to drop some cards comparable to the mid-to-high-end 30xx line, that’d be great, because AMD GPUs generally play much, much more nicely with non-Windows operating systems. Nvidia is fine if you’re an exclusive Windows user or the machine in question is a dedicated game box, but there are plenty of niches outside of those where they’re dodgier.

Wow, so much good feedback … where to start???

This is one of the things I’m feeling positive about: AMD closing the gap on GPUs and on CPU single-threaded performance. Heck, I personally/subjectively prefer Intel/Nvidia, but this is good news for everyone (I grudgingly upgraded to the 3600 because it fit my use case so well, and it has been great). The closer it gets, the better for us. I’m not sure there will be any reason for me to upgrade my 2070 since I only play WoW - my son on the other hand … hmmmm…

No wonder you and my predecessor got along so well… :wink:

Seriously though, even if they just close the gap a bit more on the high end, it will be good. If they can slot in as the article suggests all hell is going to break loose! (well not really, but the market’s reaction will be interesting to watch)

Perhaps that’s why we’re not hearing much. Sometimes these leaks are strategic - clearly there is a reason so much accurate information leaked in the 1-2 weeks prior to the Nvidia event.

:woman_facepalming:
This reputation does not help - I think it scares many people away. Most people want to get in the car, turn the key and drive - they don’t want to have to tinker under the hood.

I did not know this. Do you use Linux? I wonder how big that market is…

Occasionally, yes, though it’s not my main OS (at least for now). The reason why AMD works better under Linux is because AMD open sourced their Linux drivers, which means two things:

  • The drivers can be included with Linux distributions
  • The community can prioritize and fix bugs, performance issues, etc. on their own

These advantages apply to Intel iGPUs as well, since Intel has also open sourced their Linux drivers.

By contrast, Nvidia’s secretive and controlling tendencies mean that they not only have not open sourced their Linux drivers, but are actively uncooperative with Linux kernel and graphics developers. They’re bullies of a sort in that community, expecting the Linux desktop graphics stack to be developed to suit Nvidia’s whims rather than adapting their drivers to Linux’s graphics stack. Linus Torvalds, creator and active maintainer of Linux, famously flipped the bird at Nvidia on camera because of how much of a nightmare they are to work with.

For end users, this means that machines with Intel iGPUs and AMD GPUs get proper drivers right out of the box and more or less “just work” across a wider variety of setups (dual monitors, differing DPIs between monitors, etc.). If your machine has an Nvidia GPU, the best Linux ships with is a reverse-engineered, community-made driver with a literal performance cap, because Nvidia cards throttle themselves when run with unapproved drivers. You’re forced to download and install Nvidia’s closed-source drivers if you want decent performance, through a process that differs depending on the flavor of Linux you’re running, and even then you should expect weird behavior and bugs if your hardware/software setup is even slightly unusual, because for Nvidia, Linux desktop users are an afterthought at best.
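If you’re curious which driver a given Linux box is actually using, here’s a quick sketch (my own, purely illustrative) that wraps the standard `lspci -k` command in Python and prints the kernel driver bound to each GPU - typically `amdgpu` for AMD out of the box, `nouveau` for the reverse-engineered Nvidia driver, or `nvidia` once the proprietary package is installed:

```python
# Quick sketch: show which kernel driver is bound to each GPU by parsing
# `lspci -k`. Output formatting varies a bit across distros, so this is
# illustrative rather than robust.
import subprocess

def gpu_kernel_drivers():
    """Yield (device, driver) pairs for VGA/3D controllers."""
    out = subprocess.run(["lspci", "-k"], capture_output=True, text=True).stdout
    device = None
    for line in out.splitlines():
        if line and not line[0].isspace():
            # New PCI device entry; remember it only if it's a GPU.
            device = line if ("VGA" in line or "3D controller" in line) else None
        elif device and "Kernel driver in use:" in line:
            yield device, line.split(":", 1)[1].strip()

if __name__ == "__main__":
    for dev, drv in gpu_kernel_drivers():
        print(f"{dev}\n  -> {drv}")
```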

Apple’s falling out with Nvidia isn’t entirely unrelated. Where AMD shares its GPU driver code with Apple and allows Apple to make platform- and even device-specific tweaks and bug fixes for macOS AMD drivers (which makes perfect sense; fine-tuning is Apple’s whole schtick), Nvidia refuses to do this, stemming from that same secretive, controlling attitude mentioned earlier. As a result, if you build a hackintosh with an AMD GPU, you’ll get full support and the right drivers out of the box, whereas Nvidia GPUs are entirely unsupported.

Nvidia is brilliant technically but they’re really not a nice company to work with.


The last Nvidia GPU I purchased was a 1060 3GB mini, and a 650 Ti before that. And they weren’t really for my personal use.

They were all doorbuster sales, too. If I have a choice and performance isn’t too heavily weighted vs. value, I usually choose AMD.

To be fair, that 650 Ti is a workhorse GPU. Had one for a while. Wasn’t amazing, but it never said no to me.

It’s in the computer I gave to my mom.

i5-4570, 8GB DDR3-1600 RAM, 128GB SSD, 500GB HDD, 650 Ti 2GB.

Pot, let me introduce you to the kettle.

She will have the highest refresh rate vs the rest of her digital canasta group! Well done.


Well, before Covid my son would use it to play games on when he came over. So there’s that.

I had an i5-4440 with a GTX 645 … it did its job faithfully for many years! The 645 went to my sister, and the 4440 is connected to the TV with my 1050 Ti in it. This is why I tend to upgrade rigs and not parts; there is always someone who needs it, or some other use for it!

I bet those Mahjong tiles look sharp! :slightly_smiling_face:

I mean, that was kinda my point: in that respect Apple and Nvidia are actually pretty similar, so naturally they’d clash.

I do think it’s a questionable practice to keep driver code locked up, though, regardless of the company in question. If the hardware could be used without drivers, keeping them closed off would be fine, but drivers are as critical a part of a GPU as the transmission is in a car, and as such should be open. So at least in that particular area, AMD and Intel are doing better than Nvidia.

It sounds to me like Nvidia has made a calculated decision that protecting their IP/code is likely to have a greater impact on their profitability than any increase in revenue they would see from sharing it. Even if that’s true now, I wonder how long it will stay true if they keep losing market share. I would hazard a guess they have some very well-paid people looking at this very thing.

Oh no doubt.

One of the bigger factors is likely that, under the current driver distribution model, Nvidia can charge a massive markup on their Quadro GPUs because those cards get access to the more stable, less buggy workstation fork of Nvidia’s drivers, even though Quadro cards are often just consumer GTX/RTX cards with better-binned chips and sometimes extra CUDA cores. If the driver code were open, many prosumers and entry-to-mid-level workstation customers could make do with much cheaper consumer cards and community-built Quadro drivers.


So this just kinda got me thinking: what do you all think the longevity will be for the RTX 20 series?
I hope it’s not the case that they’ll be obsolete; I just got one several months ago.