Looking At An ASUS ROG Swift 41.5” 4K OLED Gaming Monitor (PG42UQ)

My coworker gushes over his new OLED monitor, and now I’m looking for one myself.

ASUS ROG Swift 41.5” 4K OLED Gaming Monitor (PG42UQ) - UHD (3840 x 2160), 138 Hz, 0.1 ms, HDMI 2.1, True 10-bit, DCI-P3 98%, G-SYNC Compatible, DisplayPort, USB, Console Ready, Remote Control, Anti-Glare.

This is my first leap into 4K gaming. I love the size and don’t mind the price. Can anyone think of anything better?

Current rig:

AMD Ryzen 9 5900X 12-Core Processor with an NVIDIA GeForce RTX 3090 Ti.

P.S. Sorry for the double posts.

If I wanted a big screen, I would stay away from monitors. They are extremely overpriced. I’m going for an LG C3 TV instead.


I know this post is a few days old, but here’s my two cents.

OLED is incredible, and there’s nothing on the market that will match it for gaming.

I’m not a big fan of anti-glare coatings, especially when OLED can be had without one.

Burn-in is still a real issue for OLED gaming, especially with static elements such as WoW’s UI. ASUS is offering a two-year warranty on burn-in, but you could probably buy from Best Buy and add a warranty if it’s a concern.

I play on a high-end VA panel and I’m really happy with it, and I’ve owned several OLED TVs to compare.

I can’t speak to the OLED aspect, but I absolutely love having a 43" 4K monitor, a relatively inexpensive Gigabyte FV43U. I usually play in a 2K window actually, with browser windows and videos or whatever around the sides. It’s awesome. I resize my WoW window depending on how much I care about whatever I’m watching! I used two 1080p monitors side by side before this setup, but I like this much better, though I couldn’t say why. Maybe the flexibility.

Agree 100%. I’m using the 42" LG C2. The 42" LG C2 and C3 BOTH have dual firmware installed: one firmware when used as a TV and a different firmware/interface when used as a monitor. It’s the main TV/monitor I’d look at for that reason. You can snag a C3 for $800 and a C2 for $500-600.

DISCLAIMER: Keep in mind these are 4K monitors. It’s going to really push that 3090 Ti if you play AAA titles like Cyberpunk, etc. So you’d need to replace your CPU with the 5800X3D, which is a drop-in replacement. ATM you are CPU bottlenecked at 4K with the 5900X.


The LG C4 is coming as well. It doesn’t look like a big upgrade though.

https://www.whathifi.com/reviews/lg-c4


Yeah, he just needs to upgrade his CPU too. He is bottlenecked ATM. The 5800X3D is leaps and bounds ahead of the 5900X in gaming. The difference was massive, as opposed to the 7700X vs the 7800X3D, which was a nothing sandwich.


I would rather wait for Intel LGA 1851 and go for the latest CPU with 48GB of DDR5-8000 CL40 instead.


That may be next year. My point was that the 5800X3D is cheap and will alleviate his bottleneck for now. If he’s spending $800+ on a monitor, he won’t mind an extra $200 or so to do that.


One year with the 5900X was enough to convince me to stop using AMD. Also, the multiple news stories about AMD blips since my swap to the 12900K tell me to avoid AMD.

They prioritize pushing performance over quality control these days.

Sadly, this originates from one source, and he is not a trustworthy one. It also all revolves around one game, which cares more about CPU frequency and RAM speed than subtimings or game cache. So there is some truth to this. I will go into detail.

Ironically, BOTH WoW and Final Fantasy also have this issue, and SWTOR too to some degree. It mainly has to do with the 1% lows rather than average FPS. Their game engines prefer CPU frequency and raw RAM speed over RAM subtimings and CPU cache. This is why the 13900K blows the 7800X3D out of the water in games like WoW, CoD, and Final Fantasy, mainly in the 1% lows, which are what control “how smooth” the game feels in heavily crowded areas like raid boss fights, world bosses, Superblooms, etc.

WoW does not have a built-in benchmark to reliably test this. Final Fantasy does, so I will leave it here as an example. Look at the chart: you will notice the 1% lows are higher on ALL Intel CPUs even if the average FPS isn’t. Like I said, in MMOs your 1% lows matter the most, because they determine your frame drops during raid boss fights, world bosses, Superblooms, etc. So yes, in MMOs the “AMDIP” is real. But the root cause is CPU clock frequency and just raw RAM speed, and Intel leads in that area.
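
Since 1% lows keep coming up in this thread: for anyone unfamiliar, here’s a minimal sketch of how they’re computed from a frame-time log. The sample data is made up, and the “average the slowest 1% of frames” convention shown is just one common definition; capture tools differ slightly.

```python
# Minimal sketch of computing average FPS and 1% low FPS from frame
# times in milliseconds. Capture tools (PresentMon, CapFrameX, etc.)
# export logs like this; the sample data below is hypothetical.
import statistics

def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    # One common convention: sort frame times, average the slowest 1%,
    # and convert that back to an FPS figure.
    worst = sorted(frame_times_ms, reverse=True)
    slowest_1pct = worst[:max(1, len(worst) // 100)]
    low_fps = 1000 / statistics.mean(slowest_1pct)
    return avg_fps, low_fps

# 99 frames at ~8.3 ms (~120 FPS) plus a single 50 ms stutter frame:
sample = [8.3] * 99 + [50.0]
avg, low = fps_stats(sample)
print(f"average: {avg:.0f} FPS, 1% low: {low:.0f} FPS")
# -> average: 115 FPS, 1% low: 20 FPS
# The average barely moves, but the 1% low collapses, which is why a
# game can feel choppy in raids despite a high average FPS.
```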


Not one source.

We Exploded the AMD Ryzen 7 7800X3D & Melted the Motherboard (youtube.com)
Why I switched back to Intel… (youtube.com)
The Issue with CPU Reviews… 13 x Ryzen 5 7600 compared (youtube.com) (Yes, they sell potato chips too!)

INTEL & AMD both lied! 👉 REAL WORLD power consumption is MESSED UP (youtube.com)

I know some system builders. They said to stay away from AMD when it comes to workstations. Recently, someone local was already unhappy after buying a 7950X for his workstation. The RAM support is crap.

You are mostly right, but I did accurately explain what causes the “AMDIP”. Everyone foams over benchmark charts like lemmings, but they are missing half the story because they tend to ignore the 1% lows. This is HUGE in MMOs. This is why my buddy on the 7800X3D gets microstutters during the Superbloom while I get a smooth-as-butter experience on the 13900K. My 1% lows are nowhere near as low as his, so my gameplay in heavily crowded areas with a LOT of input data feels smoother than his. Even though he has a slightly higher average FPS than me, my experience in games like WoW will always feel smoother.


Yeah, which is why I’ve been thinking about 48GB of DDR5-8000 CL40 for my 13900K. Sadly, paying for a new board and RAM doesn’t justify the price. I might do it on the next gen though.

Also, my 13900K is running with “Optimized for Ryzen” RAM. Rock stable.

32GB vs 48GB will make zero difference in WoW. However, WoW drinks up RAM speed like water in the desert, so RAM speed will make a difference. Work applications are a different story. Will it change your average FPS? No. But DDR5-8000 will give you better 1% lows, aka fewer frame drops in crowded areas like raids.
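
For a rough sense of what the jump to DDR5-8000 buys on paper, here’s a back-of-the-envelope peak-bandwidth calculation. These are theoretical maximums for a dual-channel setup; real-world gains are smaller.

```python
# Back-of-the-envelope peak bandwidth for dual-channel DDR5:
# transfers/sec * 8 bytes per 64-bit transfer * number of channels.
def ddr5_peak_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    """Theoretical peak bandwidth in GB/s for a DDR5 kit."""
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for kit in (5600, 6000, 8000):
    print(f"DDR5-{kit}: {ddr5_peak_gbs(kit):.1f} GB/s")
# DDR5-5600:  89.6 GB/s
# DDR5-6000:  96.0 GB/s
# DDR5-8000: 128.0 GB/s
```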


Add dual streaming, multiple browser windows, video playback, and music to the mix.


In that case, 32GB vs 48GB will make a difference in WoW. The more stuff you do in the background, the more your “glass fills up with water”. A full glass = lower FPS and worse 1% lows, aka a horrible experience. I know you already get this, but I am posting it so others reading this thread understand the science and the reasons behind it.

This is the main reason I can’t justify buying an 8-core CPU in 2024, even if it does have a large game cache (7800X3D). All it takes is a few streaming apps and other things running on a second monitor to bog down those 8 cores really fast.
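
If anyone wants to see how full their own “glass” is while their usual background apps run, it’s easy to check. A minimal sketch, assuming the third-party psutil package is installed (pip install psutil):

```python
# Quick look at current RAM usage and per-core CPU load; run this
# with your normal gaming + streaming background apps open.
import psutil

mem = psutil.virtual_memory()
print(f"RAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB "
      f"({mem.percent:.0f}% used)")

# Per-core load over a 1-second sample; several cores pegged near 100%
# is the "bogged down" scenario described above.
for core, pct in enumerate(psutil.cpu_percent(interval=1, percpu=True)):
    print(f"core {core}: {pct:.0f}%")
```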


Yup, the main reason I’m avoiding the 7800X3D is what I do on my PC. The cache doesn’t work properly with my workload.

Also, Intel APO is coming for 13th gen… with WoW support!


That was a selling point for 14th gen. But the 14900K is a reprogrammed and pushed-to-the-max 13900K. You have zero headroom for undervolting; the 200 MHz boost on the P-cores comes at the cost of heat. This is why many people complain that undervolting does nothing for temps on the 14900K.

I have a buddy at Intel who got me three 14900K processors to test, all with different silicon quality. Undervolting did next to nothing for temps; they ALWAYS hit 90°C+ in stress tests, compared to the 13900K, which has no problems with undervolting. Meanwhile, my OC’d + undervolted 13900K boosts to 6.0 GHz on the P-cores and 4.6 GHz on the E-cores. In a stress test my 13900K, with an OC mind you, only pulls 274W. That is insane. It outperforms the 14900K in every way: temps, power usage, performance numbers. I wouldn’t recommend the 14900K to anyone, even if it were cheaper than the 13900K.

Basically, the 14900K just about HAS to be power limited, which nukes performance. A simple undervolt won’t work because of how it is programmed to behave.

Take a look


I can do a 0.07 V undervolt on my chip.
