Your rig + performance reports, let's have it, people

I was going to upgrade and build a new 7800X3D rig until they started blowing up.

Alienware laptop m17 R5 :

  • CPU : AMD Ryzen 7 6800H (8-Core/16 Thread, 20MB Cache, up to 4.7 GHz max boost)
  • GPU : NVIDIA GeForce RTX 3070 Ti 8GB GDDR6
  • RAM : 16 GB DDR5 4800MHz
  • SSD : 1TB M.2 PCIe NVMe
  • Screen : 17.3" QHD 2560x1440 165Hz

Ran like a breeze on ultra settings.

I was going to get a 5000-series Ryzen if my CPU bottlenecked too hard, but it appears my Kaby Lake from 6 years ago still performs. They changed chipsets after 7th gen, so I would have to get a new mobo with a new CPU; I'm just glad I don't need to spend more money yet.

Just finished building an updated system, played the game over the weekend:

i9-12900K w/ MSI 360R V2 AIO
MSI PRO Z790-P WiFi
32GB DDR5 5600
Windows 11 Pro
1TB Kingston Fury NVMe (game installed there)
Used Asus RTX 2070 OC (yes, older, but it works)
Corsair AX1200i PSU
Cooler Master HAF ATX case

Ran it with default settings; it defaulted to one notch below ultra, I think. Most enhancements were auto-enabled. The only change I made was to vsync, since I was using two different screens, so I adjusted the FPS cap to match.
(I take that back, I seem to recall I couldn't adjust that at some point. I think it worked at first, then it changed later.)

On my 4K Samsung 46" TV, it ran at 25-35 FPS at 4K resolution with the default settings. Smooth enough, but it had horrible screen tearing (due to the forced windowed mode).

Then I tried my G-Sync Asus Predator 24" HD monitor. It ran at 144 FPS solid as a rock, smooth as glass, and stayed pretty comfortable temp-wise.

I think the hottest the GPU hit was about 180°F for the hot spot, but it averaged less. It ran hotter when trying to support the 4K setup.
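For anyone who thinks in Celsius, 180°F works out to roughly 82°C, which is a pretty normal hot-spot reading under load. Quick sanity check of the conversion (plain Python, nothing game-specific):

    # Fahrenheit -> Celsius, standard conversion formula
    def f_to_c(temp_f: float) -> float:
        return (temp_f - 32) * 5 / 9

    print(round(f_to_c(180), 1))  # 82.2 -- the reported hot-spot peak in Celsius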

I was going to swap in the RTX 3060 12GB during the weekend but never got around to it. I know those two cards are not that far apart performance-wise depending on the application, though.

Anyway, for me it ran great; I played for several hours on Saturday with no issues at all. Internet was even through Starlink, with no dropouts or weird latency issues.

Also did 3 battles against the world boss and won all 3. For the first one I was only at level 15. Played Necro, and really didn't have any issues with it either.

All in all, of the 3 different betas for this game I have now participated in, this was the best round.

Hopefully the launch goes even better.

Game on.

First beta: Core i7-4790K, GTX 980, 16 GB DDR3, SATA SSD. Was running 1080p / 60 FPS locked / medium settings.

Server Slam: Core i7-4790K, RTX 2070, 16 GB DDR3, SATA SSD. Running 1440p / 60-110 FPS (depending on whether I use DLSS) / ultra settings. And the icing on the cake: a new LG UltraGear OLED monitor.

CPU: i9 13900
RAM: 32 GB DDR5 6400
GPU: RTX 4090
Monitor: Samsung Neo G9 49" super ultrawide
PSU: 1000W

Settings maxed
Stable 165 FPS except the 30 FPS cutscenes
No issues (lag, rubber banding, etc)
Didn't really pay attention to temps, but I have a Corsair CPU cooler and I never noticed the on-screen temp go above 40-50°C.


During this beta they will.
Come release (IIRC), Blizzard will add/enable the RTX features of the 20/30/40-series cards, and the game will then look different (reflections, etc.).

i7-7700K @ 3.60 GHz, 16 GB RAM, RTX 3060

Played the game at 2560x1440, 144 Hz, with high settings and DLSS on Performance; had a quite stable 144 FPS. At worst it would dip to 120-130 FPS in some rare instances.
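Worth noting for anyone comparing numbers: DLSS Performance mode renders internally at roughly half the output resolution per axis before upscaling (the exact ratio can vary slightly by title, so treat this as an estimate), which is a big part of why a 3060 can hold 144 FPS at 1440p. Rough math:

    # DLSS Performance mode renders at ~50% of the output resolution per axis
    out_w, out_h = 2560, 1440
    scale = 0.5  # Performance preset (Quality is ~0.667)
    print(f"internal render: {int(out_w * scale)}x{int(out_h * scale)}")  # 1280x720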

12700k
EVGA 3090 FE
32GB DDR5 6000
PCIE5 NVME
Played on Ultra 4K without any lag, stutter, etc. It worked so well I didn't even check whether the FPS reached the 144 Hz my monitor runs at.

i5-8400
RTX 3060 3GB

I didn't touch any settings, didn't even look at them. The game felt fine except for somewhat long load times after using a town portal.

Kinda old PC

i7-7700
16GB RAM
GTX 1060 6GB
Windows 10

Playing at 1080p on High settings. Stable 60 fps

3080ti
5600x
32gb ram

Ultra settings at 1440p without DLSS: 160 FPS avg. Some FPS drops and minor stuttering in towns after portalling.

Kinda mid-tier PC, but:
Ryzen 5 5600
16gb DDR4 ram
nvidia gtx 1660 super
M.2 Samsung 980 NVME

Medium graphics, felt no stutters except at one point when I left the east side of Kyovashad… my PC isn't really worth mentioning here tbh.

5800x3d
rx 6800
3440x1440

Everything maxed. Mostly 100-141 (locked to 141) fps during gameplay, 80+ in Kyovashad. Didn’t really look during the world boss but it probably dipped below 100 if I had to guess.

Maxed, ultra textures, uncapped background / foreground

1440p widescreen

4090 FE
12th Gen i7
32 GB RAM

(99% FPS numbers)

165 FPS at 140% render scale, no DLSS (rough pixel math below)
283 FPS at 100% render scale, no DLSS
400 FPS with DLSS 3

53-55°C
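For anyone curious why the 140% number is so much lower than the 100% one: if the percentage is a per-axis render scale (that's my assumption about how the slider works, I haven't verified it), 140% means pushing roughly twice the pixels per frame, which is in the same ballpark as the drop from 283 to 165 FPS. Rough sketch, assuming a 3440x1440 ultrawide (the ratio is the same at 2560x1440):

    # Pixel count at 100% vs 140% render scale, assuming the % applies per axis
    base_w, base_h = 3440, 1440  # assumed "1440p widescreen" resolution
    for scale in (1.0, 1.4):
        w, h = int(base_w * scale), int(base_h * scale)
        print(f"{int(scale * 100)}%: {w}x{h} = {w * h / 1e6:.1f} MP")
    # 1.4**2 = 1.96x the pixels, so roughly double the per-frame work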

i7-8700K with a mild OC
3070 Ti
32 GB @ 4000
Nothing special M.2
Forced 3440x1440
Ultra settings
DLSS set to Performance

Solid 120 FPS (manually capped, as I was playing on a 75" 4K 120 Hz TV; never cared to see what it would run uncapped)

Temps peaked at 64°C, oddly, on both CPU and GPU.

I did push full 4K res with the same settings for s's and g's, but obviously 8 gigs of VRAM just doesn't quite cut it. Ran at a solid 90 for a few minutes until D4 just started wrecking house.

Intel 8700K CPU (standard OC, from 3700 to 4700 MHz)
16 GB 3000 MHz DDR4
Nvidia 1080 Ti GPU (minor OC)

/ / /

/ UW 1440p native resolution (DLSS is too much of a hit to quality imo).
/ 60 FPS stable (towns included). (You can set it to well over 60 FPS, but you'd have to take a huge quality sacrifice for 120 FPS and it's not at all consistent.)
/ High textures.
/ x16 anisotropic filtering and anti-aliasing set to high.

Ahah only Nvidia guys here.

My rig is a NUC8i7HNK:

  • Core i7-8705G @ 3.10 GHz
  • 32 GB DDR4 @ 2400 MHz
  • M.2 500 GB SSD (Samsung 970 EVO) connected via PCIe 3.0 x4
  • Radeon Vega M GL (≈ GTX 660)
  • 3440x1440 Samsung Odyssey G5
  • Windows 11

Worked pretty well last beta, with a constant 50-60 FPS on the lowest settings and texture resolution at 50% instead of 100%. It was playable but very ugly, with lots of frame drops in Kyovashad.

I bought an RX 6700 XT for €340 ($370) that I received Friday morning, before the Server Slam started. It's connected to the NUC with an R43SG adapter in my empty M.2 NVMe slot, powered by a 350 W PSU (Corsair VS350).
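For anyone wondering about the eGPU link: an M.2 slot wired to PCIe 3.0 x4 (I'm assuming the spare slot is wired x4 like the SSD one) tops out around 3.9 GB/s per direction, versus ~15.8 GB/s for the x16 slot a desktop card would get. The game obviously still runs fine over it, but that's the ceiling to keep in mind. Back-of-the-envelope numbers, using standard PCIe 3.0 figures rather than anything measured on my setup:

    # PCIe 3.0: 8 GT/s per lane with 128b/130b encoding
    def pcie3_gb_per_s(lanes: int) -> float:
        return lanes * 8e9 * (128 / 130) / 8 / 1e9  # -> GB/s per direction

    print(f"x4  (M.2 / R43SG):       {pcie3_gb_per_s(4):.2f} GB/s")
    print(f"x16 (full desktop slot): {pcie3_gb_per_s(16):.2f} GB/s")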

Ultra settings without FSR 2.0 got me a solid 80+ FPS, and 60 in Kyovashad, with a 90°C junction temp.

And with high settings and FSR 2.0 set to Quality, I was running 110-120 FPS smoothly (some micro stutter here and there). I also tried playing with the AMD software settings (custom config set to Gaming). My goal is to reach a constant 120 FPS with the highest settings, but probably not all ultra.

(I have some coil whine when the GPU is busy, but my PSU is crap, $11, so I'm not gonna complain ahah, and with my Sony WF-1000XM4 wireless headphones it's not a concern.)

i5-13600k
32 GB DDR5 5600
MSI Ventus 3x RTX 4070 ti
1440p, 144 Hz, g-sync

Maxed-out settings at 1440p, but had it limited to 144 FPS. The FPS rarely dropped below 144, except occasionally when loading some areas. Very smooth, no stutters.

What kind of monitor do you have, and what's your default refresh rate?

And for everyone having fun with a medium rig like me,
I think that will change once they implement ray tracing in the game.
But I really don't see a whole lot of difference when looking at YouTube game demos.
The only place I've seen a really astounding difference is the 3DMark demos.