I know this post is a few days old but here’s my two cents.
OLED’s incredible and there’s nothing on the market that will match it for gaming.
Not a big fan of anti-glare especially when OLED can be had without it.
Burn-in is still a real issue for OLED gaming, especially with static elements like WoW's UI. Asus is offering a 2-year burn-in warranty, but you could probably buy from Best Buy and add a warranty if it's a concern.
I play on a high-end VA panel and I'm really happy with it, and I've owned several OLED TVs to compare.
I can’t speak to the OLED aspect, but I absolutely love having a 43" 4K monitor, a relatively inexpensive Gigabyte FV43U. I usually play in a 2K window actually, with browser windows and videos or whatever around the sides. It’s awesome. I resize my WoW window depending on how much I care about whatever I’m watching! I used two 1080p monitors side by side before this setup, but I like this much better, though I couldn’t say why. Maybe the flexibility.
Agree 100%. I'm using the 42" LG C2. The 42" LG C2 and C3 BOTH have dual firmwares installed: one firmware when used as a TV and a different firmware/interface when used as a monitor. It's the main TV/monitor I'd look at for that reason. You can snag a C3 for $800 and a C2 for $500-600.
DISCLAIMER: Keep in mind these are 4K monitors. It's going to really push that 3090 if you play AAA titles like Cyberpunk, etc. So you'd need to replace your CPU with the 5800X3D, which is a drop-in replacement. ATM you are CPU bottlenecked at 4K with the 5900X.
Yeah, he just needs to upgrade his CPU too. He is bottlenecked ATM. The 5800X3D is leaps and bounds ahead of the 5900X in gaming. The difference was massive, as opposed to the 7700X vs 7800X3D, which was a nothing sandwich.
That may be next year; my point was that the 5800X3D is cheap and will alleviate his bottleneck for now. If he's spending $800+ on a monitor, he won't mind an extra $200 or so to do that.
Sadly this originates from one source, and he is not a trustworthy one. It also all revolves around one game, which cares more about CPU frequency and RAM speed than subtimings or game cache. So there is some truth to this. I will go into detail.
Ironically BOTH WoW and Final Fantasy also have this issue, and SWTOR to some degree. It mainly has to do with the 1% lows rather than average FPS. Their game engines prefer CPU frequency and raw RAM speed over RAM subtimings and CPU cache. This is why the 13900K blows the 7800X3D out of the water in games like WoW, COD, and Final Fantasy, mainly in 1% lows, which is what controls how "smooth" the game feels in heavily crowded areas like raid boss fights, world bosses, Superblooms, etc.
WoW does not have a built-in benchmark to reliably test this. Final Fantasy does, so I'll use it as the example here. Look at the chart: you will notice the 1% lows are higher on ALL Intel CPUs even if the average FPS isn't. Like I said, in MMOs your 1% lows matter the most, because they determine your frame drops during raid boss fights, world bosses, Superblooms, etc. So yes, in MMOs the "AMDIP" is real. But the root cause is CPU clock frequency and just raw RAM speed, and Intel leads in that area.
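Since a lot of this thread hinges on 1% lows vs average FPS, here's a minimal sketch of how that metric is usually computed from a frame-time capture. The function names and sample numbers are just illustrations, and I'm using the common "average of the slowest 1% of frames" definition; some capture tools report the 99th-percentile frame time converted to FPS instead.

```python
def one_percent_low(frame_times_ms):
    """1% low in FPS: average of the slowest 1% of frames, inverted.

    One common definition; some tools instead invert the
    99th-percentile frame time."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)         # the slowest 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

def average_fps(frame_times_ms):
    # Total frames divided by total capture time, in FPS.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Hypothetical captures: nearly the same average, very different stutter.
steady = [10.0] * 1000                    # flat 10 ms frames
spiky  = [9.9] * 990 + [25.0] * 10        # a 25 ms spike every ~100 frames

print(average_fps(steady), one_percent_low(steady))   # ~100 FPS both ways
print(average_fps(spiky), one_percent_low(spiky))     # ~99 FPS avg, 40 FPS 1% low
```

This is the whole point about "smoothness": the spiky capture barely loses any average FPS, but its 1% low tanks, and that's what you feel in a crowded raid.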
I know some system builders. They said stay away from AMD when it comes to workstations. Recently a local buyer was already unhappy after getting a 7950X for his workstation; the RAM support is crap.
You are mostly right, but I did explain accurately what causes the "AMDIP". Everyone foams over benchmark charts like lemmings, but they're missing half the story because they tend to ignore the 1% lows. This is HUGE in MMOs. It's why my buddy on the 7800X3D gets microstutters during Superbloom while I get a smooth-as-butter experience on the 13900K. My 1% lows are nowhere near as low as his, so my gameplay in heavily crowded areas with a LOT of input data feels smoother than his. Even though he has a slightly higher average FPS than me, my gaming experience in games like WoW will always feel smoother than his.
Yeah, which is why I've been thinking about 48GB of DDR5-8000 CL40 for my 13900K. Sadly, the price of the board plus RAM is hard to justify. I might do it on next gen though.
Also, my 13900K is running with "Optimized for Ryzen" RAM. Rock stable.
32GB vs 48GB will make zero difference in WoW. WoW drinks up RAM speed like water in the desert, however, so RAM speed will make a difference. Work applications are a different story. Will it change your average FPS? No. But DDR5-8000 will give you better 1% lows, aka fewer frame drops in crowded areas like raids.
In that case 32GB vs 48GB will make a difference in WoW. The more stuff you do in the background, the more your "glass gets full of water". A full glass = less FPS and worse (lower) 1% lows, aka a horrible experience. I know you already get this, but I am posting it so others reading this thread understand the science and reasons behind it.
This is the main reason I can't justify buying an 8-core CPU in 2024, even if it does have a large game cache (7800X3D). All it takes is a few streaming apps and other things running on a second monitor to bog down those 8 cores really fast.
It was a selling point for 14th gen, but the 14900K is a reprogrammed, pushed-to-the-max 13900K. You have zero headroom for undervolting; the 200MHz boost on P-cores comes at the cost of heat. This is why many people complain undervolting does nothing for temps on the 14900K.
I have a buddy at Intel who got me 3 14900K processors to test, all with different silicon quality. Undervolting did next to nothing for temps; they ALWAYS hit 90C+ in stress tests, compared to the 13900K, which has no problems with undervolting. Meanwhile my OC'd + undervolted 13900K boosts to 6.0 on P-cores and 4.6 on E-cores, and in a stress test it only pulls 274W, with an OC mind you. That is insane. It outperforms the 14900K in every way: temps, power usage, performance numbers. I wouldn't recommend the 14900K to anyone, even if it was cheaper than the 13900K.
Basically the 14900K just about HAS to be power limited, which nukes performance. A simple undervolt won't work because of how it is programmed to behave.