So true. I actually had to talk a customer down from buying some super expensive parts because he wanted to be able to get 150+ FPS in most games. I had to explain to him that if he wasn’t going to upgrade his monitor, he was just wasting his money.
He was going to be doing this on a 32-inch 1440p 60Hz monitor.
Because you like to tinker and see if that one part gives you an extra percentage point or two of difference. On customer builds it’s not worth the headache, so you go with the tried-and-true parts.
It was a rhetorical question for the most part, meant to drive home the importance of leaps in performance when upgrading versus imperceptible increments.
From everything I’m seeing, an overclocked 8700K isn’t really holding back a 3090 much, if at all, at 1440p in most games.
And I’m not going to be buying anything close to a 3090 on this (or any) platform in the near future.
Unless SAM or Microsoft DirectStorage becomes a huge deal in the next two years or so and makes Zen 3/PCIe 4.0 important, a 6800 XT/3080 seems like a great fit for 1440p high-refresh gaming paired with my CPU in most games.
It would take a bottleneck big enough to make, say, a 6800 XT drop to the level of a 6800, or a 3080 drop to a 3070, to warrant a CPU upgrade. Or even some significant gradient in between. From the data available, I don’t think that’s generally the case. And you’ll always have at least one component stronger than the other anyway.
I’m usually not that guy who pairs things poorly, but in the past I’ve overspent on a speculative gamble (an 8700K in 2017 paired with an RX 580 placeholder; it paid off).
Right now, if I had nothing, I wouldn’t get an Intel system (no PCIe 4.0) or a Zen 2 system unless it was heavily discounted.
But that said, if I had an 8700K or better, or an R5 3600 or better, I wouldn’t really consider upgrading anyway until GPUs become stronger than the 3080.
I had to change my post from “definitely not that guy who pairs things poorly” to “usually” because I have absolutely made some dumb decisions (justifiable in my own head at the time, for sure, but often lacking forethought in retrospect).
I think one thing people always get too obsessed with is “bottlenecks,” that nebulous term that somehow seems absolute and incorporeal at the same time.
There’s always something imbalanced in a build somewhere; I suppose it’s just finding the balance that makes the most sense at the time to mitigate it as much as possible.
Bottlenecks and future-proofing are the two terms I always hear from people asking for build advice or in rate-my-build type posts. You are correct: all PCs have a bottleneck, and if you really want to future-proof, then build a PC in the future. I try to ask people what their plan is. If you have no plan, then you end up building someone else’s PC.
What kind of performance do you want, and for how long? What parts will deliver that performance? Does your budget cover that performance? If it does, then go ahead and build. If it doesn’t, you need to re-evaluate the performance, the timeline, the budget, or some combination of the three.
You both are right and have made some good points.
I’m thinking it may be smartest to just do an i9-10900K video/build next. I really do love that processor. Plus, Intel chips are fun to OC and tweak. The fact that I can get one for $485 is pretty nice too, $65 cheaper than the Ryzen 9 5900X. I also need to consider that since the new Ryzens are a new architecture, they will probably be buggy for the next three months, while Intel will work out of the box.
AMD’s PCIe 4.0 benefit is kind of useless to me and most gamers, considering my gaming SSD is an 8TB SATA drive. Only my main C drive would be M.2, which defeats the purpose; only M.2 drives can use it. PCIe 4.0 on graphics cards? The RTX 3090 only uses about 55% of PCIe 3.0 bandwidth, so it’s kind of pointless there at the moment too. It won’t be the standard for roughly another four years, at which point I’ll upgrade anyway. So I’m really not missing anything. As an added bonus, I have a slight upgrade path with Intel, since 11th gen will be LGA 1200 as well.
What difference does that make if the majority of my games are running off my 8TB SATA SSD? See my point? That’s where all my games and video editing files are.
To use PCIe 4.0, your games/files have to be stored on an M.2 drive; only M.2 (NVMe) drives can use PCIe 4.0, man. Intel knows most gamers are running off large SSDs that are NOT M.2, which is why they were in no rush to jump on PCIe 4.0. So the only thing I’d be getting it on is my C drive. I mean, “Yay, my C drive boots up three seconds sooner and I can launch Chrome faster?” lol
See my point about it? Most gamers are in the same situation I am. Few are in the financial position to buy 4TB M.2 drives.
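For anyone who wants rough numbers behind this back-and-forth, here’s a quick back-of-envelope sketch comparing theoretical interface ceilings. These are approximate spec-sheet maximums only; real drives and actual game load times won’t come anywhere near saturating them.

```python
# Back-of-envelope comparison of theoretical sequential bandwidth ceilings.
# Approximate spec maximums; real-world drives and games land well below these.
interfaces_gbps = {
    "SATA III SSD":       0.6,   # ~600 MB/s usable on a 6 Gb/s link
    "PCIe 3.0 x4 NVMe":   3.9,   # ~985 MB/s per lane x 4 lanes
    "PCIe 4.0 x4 NVMe":   7.9,   # double Gen3 per lane
    "PCIe 3.0 x16 (GPU)": 15.8,
    "PCIe 4.0 x16 (GPU)": 31.5,
}

baseline = interfaces_gbps["SATA III SSD"]
for name, gbs in interfaces_gbps.items():
    print(f"{name:20s} ~{gbs:5.1f} GB/s  ({gbs / baseline:4.1f}x SATA)")
```

Point being, a SATA drive never sees any of the Gen3-to-Gen4 jump, which is basically the argument being made here.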
Well obviously the idea is to use the fastest interface for playing games… It’s not my fault you got an 8TB SSD LOL (how much was that anyway? Jesus lol).
A lot of gamers probably have a single 1TB NVMe SSD or a 1TB SATA SSD. If the technology became useful and gamers had the capability, it’s likely they’d move to it; gamers that don’t won’t care about the difference anyway.
I would say most users do not have an 8TB SSD. Like, less than 1%.
I use both M.2 NVMe and SATA SSDs for games… That said, I’d move the games that could utilize the technology to the NVMe.
And if I upgraded to a pcie4 system, I’d grab a pcie4 ssd.
I don’t even have 8TB of total SSD storage in my house, and I have 5 PCs.
$300, because it’s not an M.2. Non-M.2 SSDs are dirt cheap. It’s so large because I’m also a video editor on top of being a gamer. I have about 2.5TB of games stored on it.
Tariffs and high demand for parts from the OEM builders. Our tariff went from 8% to 25%, and we pass those costs on to the consumer. We also had to bid against some of the big-box stores to get our stuff made in time for the holidays. We’ve had containers coming in every week since late August, when in past years it was all in by late August.