RTX 3090 Reviews Are Out

Since we did this for the 3080, why not do it again? It would seem that the 3080 is a much better buy when you look at performance and cost.
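A quick back-of-the-envelope to show what I mean. The relative performance and MSRP figures below are rough numbers based on the reviews linked underneath; treat them as illustrative, not exact benchmark data:

```python
# Rough perf-per-dollar comparison using approximate 4K relative
# performance and Founders Edition MSRPs. Ballpark assumptions only.
cards = {
    "RTX 3080": {"relative_perf": 100, "msrp_usd": 699},
    "RTX 3090": {"relative_perf": 112, "msrp_usd": 1499},  # roughly 10-15% faster at 4K
}

for name, c in cards.items():
    print(f"{name}: {c['relative_perf'] / c['msrp_usd']:.3f} relative perf per dollar")

# ~2.1x the price for ~1.1x the performance: the 3080 wins easily
# on value for pure gaming.
```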

Edit - My personal conclusion: I’ll take a look at the 3070 and the new AMD cards, but unless one of them offers an incredible value proposition (or another crazy Black Friday sale!), I’ll probably stick with my 2070 and wait for the 40x0s. From the look of things, high-refresh 4K gaming may be an affordable reality in a couple of years. Maybe by then WoW will have more advanced ray tracing effects as well.

Tom’s Hardware:
https://www.tomshardware.com/news/nvidia-geforce-rtx-3090-review

TechPowerUp (Zotac Review):
https://www.techpowerup.com/review/zotac-geforce-rtx-3090-trinity/

I’ll upgrade to RDNA2 when the cards launch, if they’re good. Mostly so I can retire the lowest GPU in my family stack (1060 3GB) and everyone gets an upgrade.

I just spent a bunch of money on some kitchen gear so that satisfied my “gotta spend now!” itch.

The Kitchenaid pan set I’ve wanted has been out of stock since March so you’re lucky.

Ok, now this is a conversation we can really sink our teeth into! What did you get?

I’ve been thinking about a bread machine. We bake bread in the oven (much easier now that we have a stand mixer), and will keep doing that for Italian/French breads, but my sister and my friend have been telling me how great their bread makers are for loaves. Any experience with these? I kind of like the idea of putting in the ingredients, setting the timer and waking up to fresh bread.

I think you’re supposed to stay on the site and press F5 until it shows up. :wink:

I got this:

https://www.bestbuy.com/site/combo/stand-mixers/3d9d4534-2a3a-4906-ab5d-a484c8b69336

and this

https://www.amazon.com/3-Piece-Attachment-KitchenAid-Stainless-Accessory/dp/B0721M32GH

gonna be makin sausage, pasta, and dough

Our stand mixer is a KitchenAid - they’re great… Enjoy!

This counts as ‘hardware’, so we’re in the correct forum.

“Neat” comes to mind when I see that card and its price tag… kind of the same way I feel when I pass the FC Kerbeck dealer nearby (that’s a luxury and exotic car dealership: Ferrari, Lamborghini, Maybach, higher-end Mercedes, etc.)…

Neat… awesome stuff I can’t afford.

:slight_smile:

This may seem like an unfair take, but when I see a card that big with that exotic a cooling system, I can only conclude that this type of development is unsustainable long term. It just seems impractical. I know that as PC gaming enthusiasts we’re not supposed to care about practicality, but a $1,700 graphics card that takes up three slots and needs special braces to support it seems like a brute-force approach that is bound to be problematic.

I think I’ll leave this kind of progress to content creators and industrial tasks. It’s not a matter of can I afford it or not, but does it make sense? To me it’s like buying a whole cake when you just want a slice. Such a waste.

If nothing else, the 3090 stretches the ATX PC form factor to its limits. If cards like it are going to be the future, motherboards, cases, etc. need to change to accommodate them properly.

Lol, great response! I understand the sentiment.

The only thing I would add is that state-of-the-art today does help to inform us of where we might be tomorrow.

It is similar for me in that I only play WoW and it would be overkill. My son is an avid gamer, but this is way out of his price range (he’s still in school) and more than I’d be willing to gift.

There have already been people posting case modifications! :slight_smile:

The reality for most people is that GPU performance beyond, say, RTX 2060 level is excess. Given that the benchmark for “acceptable performance” set by the gaming consoles is dynamic resolution at 30fps, almost any modern hardware can achieve that.
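To put rough numbers on that gap (the refresh targets here are just the common ones, my own picks):

```python
# Per-frame time budgets at common targets. The console baseline
# (dynamic-res 30fps) allows ~5x the frame time of a 144Hz target,
# before even counting 4K's extra pixels.
targets_hz = {"console baseline": 30, "standard PC": 60, "high refresh": 144}

for name, hz in targets_hz.items():
    print(f"{name}: {hz}fps -> {1000 / hz:.1f} ms per frame")
```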

You shouldn’t feel any type of guilt for not wanting to spend $1500 on a graphics card for a kid.

The 3080s from the cheaper manufacturers are crashing at the moment. They work, just don’t overclock.

They’re crashing because Nvidia was too secretive about their drivers and the AIBs didn’t get the chance to test them properly. It’s not just the “cheap” cards; even EVGA is affected.

https://www.igorslab.de/en/what-real-what-can-be-investigative-within-the-crashes-and-instabilities-of-the-force-rtx-3080-andrtx-3090/

I got my wife one of these earlier this year. It makes tasty food😃

It has less to do with drivers (I won’t say nothing to do with them) and almost everything to do with AIBs trying to save a buck (not unreasonable when you consider they expect to shift possibly tens of thousands of units long term). The use of cheaper capacitors is fine and all until they get pushed too hard, such as on cards which will, by default, boost to 2GHz and beyond. If you stick below that you’re golden, but if you try to pass it you’re rolling the dice.
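If you want a feel for why the capacitor type matters at those clocks, here’s a toy impedance model. All the capacitance/ESR/ESL values below are generic numbers I picked for illustration, not the actual parts on any of these boards:

```python
import math

# Toy model: impedance of one capacitor with parasitics,
# Z = sqrt(ESR^2 + (2*pi*f*ESL - 1/(2*pi*f*C))^2).
def impedance(f_hz, c_farads, esr_ohms, esl_henries):
    w = 2 * math.pi * f_hz
    return math.sqrt(esr_ohms**2 + (w * esl_henries - 1 / (w * c_farads))**2)

# Illustrative component values (assumptions, not any board's real BoM).
poscap = dict(c_farads=470e-6, esr_ohms=0.010, esl_henries=1.5e-9)  # bulk polymer cap
mlcc = dict(c_farads=47e-6, esr_ohms=0.003, esl_henries=0.4e-9)     # single ceramic cap

for f in (1e5, 1e6, 1e7, 1e8):
    zp, zm = impedance(f, **poscap), impedance(f, **mlcc)
    print(f"{f:>8.0e} Hz: POSCAP {zp * 1000:7.2f} mOhm | MLCC {zm * 1000:7.2f} mOhm")

# The POSCAP wins on bulk storage at low frequencies; the MLCC stays
# lower-impedance at the fast transients a 2GHz+ boost produces. Which
# is why the mix of the two (and where they sit) matters more than
# either type being "cheap".
```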

From that article it looks as though FEs are unaffected, ASUS’ TUF lines should be more than adequate, and MSI might have some lines that aren’t affected as much (though may still be susceptible to faults at a higher speed).

Regarding EVGA, though:
https://forums.evga.com/Message-about-EVGA-GeForce-RTX-3080-POSCAPs-m3095238.aspx

Only review samples were affected, due to being based on pre-production cards. Production cards never shipped with the full 6 POSCAP layout.

Nvidia provided them with a “spec” for the reference design. That design was flawed.

Not really. It sounds like it works perfectly if you keep to the reference specs - it’s just the ones that come factory overclocked (overclocking = outside of spec) that cause problems. Though I have heard of at least one user who claimed that their reference clocked card exhibited similar faults, which may or may not be because of the same issue (it could also just have bad caps).

I would argue that the reference design is exactly what’s been demonstrated with the FE (which, by all accounts, is unaffected), in the same way that the reference design for the predecessors was the same as the FE. But there’s always the possibility that Nvidia deviated from the reference for these ones. So I’m still leaning towards AIBs trying to save a buck as the root cause.
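For what it’s worth, if you want to keep a card under that ~2GHz boost ceiling while this shakes out, the clocks can be pinned from software. A minimal sketch, assuming a recent driver and admin/root rights; the 210/1900MHz range is my own guess at a safe value, not vendor guidance:

```python
import subprocess

# Pin the GPU core clock range so boost can't push past ~1.9GHz.
# 210/1900 MHz are illustrative values; adjust to taste.
subprocess.run(["nvidia-smi", "--lock-gpu-clocks=210,1900"], check=True)

# To restore default boost behaviour afterwards:
# subprocess.run(["nvidia-smi", "--reset-gpu-clocks"], check=True)
```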

The FEs this time aren’t reference designs, just like last generation. Nvidia hasn’t used “reference” designs for their own-brand products since the 10 series.

And since the AIBs were not provided with working drivers, all they could do was power-on and thermal testing before going into full production to hit the launch deadline.

They had no way to know what would happen with Nvidia’s drivers, and it seems only ASUS really took the extra step of engineering beyond the reference spec ahead of testing. Even EVGA used some combination of POSCAPs on their most expensive models.

I am blaming Nvidia solely on this one. The AIBs can be judged by how they handle this after the fact.

More information from people who are smarter than me:

https://www.reddit.com/r/hardware/comments/izmi1k/ampere_poscapmlcc_counts/g6k2h0a?utm_source=share&utm_medium=web2x&context=3

https://www.reddit.com/r/hardware/comments/j09yj5/poscap_vs_mlcc_what_you_need_to_know/?utm_source=share&utm_medium=web2x&context=3

tl;dr: POSCAPs are not inherently worse or cheaper than MLCCs; the fault lies primarily with Nvidia for not allowing the AIBs to test configurations with the correct drivers, and less so with the AIBs themselves.

All of this is why I am not, and will likely never be, an early adopter. I’m an occasional over-spender, but I’ll wait until they work the kinks out. :slight_smile: