RIP Intel?

Welcome to business. While I don’t agree with rising costs as a consumer, I also find it very interesting that people are angry about prices before they’ve seen performance graphs. Zen 3 is a major architectural improvement over Zen 2.

People have long considered AMD to be the “cheaper” option and have always justified poorer performance with the pricing; now that performance finally seems to be on top, they want it for less too.

Can’t please em.

Do people also forget AMD bundles a perfectly normal CPU cooler as well? That right there is like 40 bucks.

That’s about to come to an end.

Oh yeah, you’re right, only the 5600X will come with one. Did they say that on the stream or was it afterwards?

It’s not that good. I can name well over a dozen coolers at $40 or less that are better than the AMD bundled cooler, ranging from the Pure Rock 2 at $40 down to the Gammaxx 400 at $25. The bundled coolers are OK, but if AMD took $40 back off their CPU price (heck, even $25) I would gladly just take the CPU.

Rocket Lake is still another 14nm part. I wonder how hot and power hungry it will be.

Apparently it’s a new architecture, despite being on 14nm.

We’ll have to wait, I suppose.

It might be good, it might not. Same goes for Zen 3, for which all we have at present is the manufacturer’s claims.

Definitely not anytime soon. I definitely will get 5900X first.

RTX 3080 still nowhere to be found… Maybe I should say screw it and wait for the 20GB version. lol

Personally I am probably still going to go with AMD again on GPUs - The 6900XT (which is allegedly what we saw earlier in the month) looks promising and even though I’d like to have competent Ray Tracing, I don’t think it’s big enough yet to be a meaningful purchasing factor.

I am really, really unhappy with the handling of the 3000 series: from its hype as a $699 unicorn part, to Nvidia shifting sales of their FE cards to Best Buy (USA only, leaving the rest of the globe out of luck), to the imminent launch of the 20GB models, with the cards ultimately settling in as the $1000 parts they were originally intended to be.

AMD might not always deliver on performance, but their prices have always at least been competitive.

We’ll see in a month or two. Not a long wait. I doubt Nvidia will charge $1000 for a 3080 20GB though, especially if Big Navi is on their heels when it launches.

It depends. If they don’t do FE cards anymore, and don’t do an FE 20GB model, how much will AIB 20GB models cost?

Let’s presume there’s a $100 price increase by doubling the amount of memory.

Presently, there’s only a handful of $699 AIB 10gb cards.

Most land somewhere in between $750 and $800. Some are as high as $860.

The handful of $699 cards become $800.

Most in between will be $850-900. The high end models will be $960.

I can totally see the average price of these cards landing closer to the $1000 mark than the $700 mark.
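The back-of-envelope math above can be sketched in a few lines; the 10GB prices below are a sample matching the figures quoted in this thread, and the flat $100 memory premium is this thread’s assumption, not a confirmed MSRP.

```python
# Hypothetical 20GB AIB pricing, assuming a flat $100 bump over
# today's 10GB cards (thread speculation, not confirmed MSRPs).
aib_10gb_prices = [699, 750, 780, 800, 860]  # sample of current listings
memory_premium = 100

aib_20gb_estimates = [p + memory_premium for p in aib_10gb_prices]
print(aib_20gb_estimates)  # [799, 850, 880, 900, 960]

average = sum(aib_20gb_estimates) / len(aib_20gb_estimates)
print(f"average estimate: ${average:.0f}")  # average estimate: $878
```

Even with only the handful of $699 cards in the sample, the average lands well above the $699 launch figure.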

Basic versions probably will hit $800. $850-900 for OC and “upgraded” ones. It will still depend on Big Navi price as well.

Still leans closer to the $1000 end for most cards.

I don’t think Big Navi is going to actually perform as well as the 3080, and definitely not the 3090 (despite leaks from that MLID guy). Maybe in a few games, but not most.

The 6900XT will probably be price competitive with the 3080 current price, but will be slower overall but with more memory.

If so, I’ll take it.

If you don’t have to get one ASAP, I’d wait. Ampere is built on a process designed for mobile chips, not GPUs. Supposedly, new GPU bins are getting upwards of 2.1GHz. Also, allegedly, AMD’s CEO is cutting off heads if they have massive driver problems this time.

There is going to be a pricing war on GPUs coming in late 2020/early 2021. For WoW a 3080 10GB is fine, but for something like Crysis Remastered we’re already seeing more than 10GB of active VRAM usage.

The preview from AMD already shows one Big Navi card just 5% below the 3080 in Nvidia-favored titles. I wouldn’t rule out AMD for gaming this time around until we see the cards. It’s looking promising so far.

Time will tell.

Yeah, I can wait. My 1080 Ti is more than enough for 1440p. Only reason for me to get RTX 3080 is proper ray tracing performance.

In my reality, it doesn’t really matter how well the 3080 or 3090 perform once the price gets too close to $1000.

I just can’t justify personally spending that much on a GPU.

I would rather get a $500 GPU today and another $500 GPU in two years, and have two capable GPUs to pass down to my family members.

The $699 promises were already on the edge of my comfort level, but now that I know that was mostly a lie I am pretty much over it.

Ray tracing on a TSMC 7nm process would have been wonderful on the 3000-series cards, but Ampere on Samsung had to cut corners due to less die space. Games with a low rasterization load have wonderful ray tracing (e.g. Minecraft). On something like WoW, the performance drops are pretty rough. WoW’s image quality with ray tracing also isn’t great right now.

If you’re looking for good ray tracing, I suspect the next-gen Hopper and RDNA3 will probably be more exciting on that front.

Well, as a person that has used Citrix and AMD products many times in the past: AMD pretends they can keep up… they never do. Fine, we can wait until the 5950X comes out… and find out it’s not any better than an i7 10600KF. The current highest AMD CPU is BARELY faster than a mid-range i7… so unless the 5950X comes with some huge jolt like Back to the Future, I doubt it will come close to matching the i9… they’re just playing catch-up.

You know you can test that feature right now in WoW right?

After months of anticipation, Ray Tracing Shadows is finally enabled on the World of Warcraft Shadowlands beta, providing enhanced shadows.
Blizzard

WoW’s ray tracing implementation uses DXR 1.1, not DXR 1.0.

To be able to use DXR 1.1, you have to be on the most recent version of Windows 10 (the 20H1 release, 2004) and have DXR 1.1 capable graphics drivers.
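Windows 10 version 2004 (20H1) corresponds to build 19041, so the OS side of that requirement can be gated on the reported build number. This is a rough sketch; the helper name is my own, and real detection would also have to query the graphics driver (e.g. D3D12’s raytracing-tier feature check), which this does not do.

```python
import platform

# Windows 10 version 2004 (20H1) is build 19041; DXR 1.1 requires
# at least this build plus DXR 1.1-capable graphics drivers.
DXR_1_1_MIN_BUILD = 19041

def build_supports_dxr_1_1(build_number: int) -> bool:
    """Return True if the given Windows 10 build is new enough for DXR 1.1.

    This only checks the OS side; driver capability must be checked
    separately.
    """
    return build_number >= DXR_1_1_MIN_BUILD

if platform.system() == "Windows":
    # platform.version() looks like "10.0.19041" on Windows 10 2004.
    build = int(platform.version().split(".")[-1])
    print("DXR 1.1 capable OS:", build_supports_dxr_1_1(build))
```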

I have the beta; it works fine using a standard 5600 XT… so much for needing the high-end cards to support it. It’s supported NOW. It only needs Windows 10 with current drivers…