Yep this is pretty huge news for us cloud gaming enjoyers.
I was wondering how they'd manage 240Hz, but basically they're going to use the AV1 codec, which is much more efficient, so you get noticeably better video quality even at lower bitrates.
What surprises me is that AMD Link has done 144Hz since 2021, with way fewer requirements or BS, while Nvidia only did 120Hz in 2022.
Which makes me wonder if this is actually progress, or whether it's just the switch to AV1 itself that enables it, plus the two requirements (G-Sync and Reflex), which btw are gimmicky as heck.
And their pricing isn't that great. I would rather invest in a rig, laptop, or console for a year or two and/or use the Steam, Xbox, or AMD Link counterparts.
That would be more flexible, and I'd be less hostage to an overpriced service that doesn't support worldwide use very well, and to Nvidia proprietary stuff that often fades into oblivion.
The service is too costly to be "interesting". I mean, at $200 per year, in one or two years you could buy some nice hardware instead, which makes it less desirable really fast. Tbh, if it were like $100 a year I could see an advantage, but it's $200 plus the associated cost of needing a second device and a supported screen, and the application uses Reflex for that anyway.
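Rough napkin math on that, as a quick sketch. The $200/year is the subscription price mentioned above; the $400 hardware budget is just a made-up figure for illustration:

```python
# Cumulative subscription cost vs. a one-time hardware purchase.
# $200/year is the subscription price from the comment above;
# the $400 hardware budget is an assumed figure for illustration.
SUBSCRIPTION_PER_YEAR = 200
HARDWARE_ONE_TIME = 400

def subscription_cost(years: int) -> int:
    """Total paid to the streaming service after the given number of years."""
    return SUBSCRIPTION_PER_YEAR * years

def breakeven_years() -> int:
    """First year at which the subscription has cost as much as the hardware."""
    years = 1
    while subscription_cost(years) < HARDWARE_ONE_TIME:
        years += 1
    return years

print(subscription_cost(2))   # 400 -- two years of streaming
print(breakeven_years())      # 2  -- the one-time purchase matches cost by year two
```

So under that (assumed) hardware price, the subscription catches up to a one-time purchase in about two years, and everything after that is pure extra cost.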
I would rather buy a device already capable of that and save money in the long run. Heck, you can achieve it with some laptops already, and laptops in general have gotten really light and thin.
Feels like game streaming still doesn't make much sense outside your own rig. Paying for something you could do for free, if you already own a capable device, seems pointless to me, at least right now with this pricing and these requirements. Who would invest in a streaming service that requires specific gear, instead of a nice rig that's agnostic to the "specific" monitor, when you could already play on your smart TV without hassle via AMD Link, Steam, or the Xbox plans?
Feels like more "illusory" marketing tbh. I hope things get better, but this pricing and these requirements don't make much sense to me.
G-Sync is proprietary tech that isn't always fully compatible with FreeSync, and you're at the mercy of Reflex, which is gimmicky as heck, to actually get that "experience".
Surprisingly, no. My current system doesn't, anyway. I do use a custom fan curve on the GPU via Afterburner, but I've never had any heat problems. Under heavy load I usually max out under 70°C.
As for noise, it's not really noticeable even with my headset off, and I use the Arctis Pro w/ GameDAC… I can't hear anything with those on anyway.
Most PCs these days can run really cool tbh; if you undervolt and set fan speeds low, it's barely audible, sometimes inaudible.
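For anyone unfamiliar: a custom fan curve like the Afterburner one mentioned above is just a mapping from GPU temperature to fan speed, interpolated between a few set points. A minimal sketch, with hypothetical curve points (not my actual settings):

```python
# A fan curve maps GPU temperature (°C) to fan speed (%).
# These points are hypothetical, purely for illustration.
CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]  # (temp °C, fan %)

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate fan % between the curve points;
    clamp to the first/last point outside the curve's range."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))  # 27.5 -- halfway between the 30°C and 50°C points
print(fan_speed(60))  # 47.5 -- halfway between the 50°C and 70°C points
```

The point of the low, flat section at the bottom is exactly the quiet-running behavior described above: under light load the fans barely spin, and they only ramp up when the card is actually stressed.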
My rig has:
RX 6900 XT (OEM variant, similar to the RX 6950 XT): 40-60°C, but up in the 80s on really high settings.
GTX 1080, mostly for work or proprietary stuff like CUDA-reliant code or Nvidia-only tech/features.
R7 5800X at 4375MHz (locked).
48GB RAM at 3400MHz.
And the hottest component sits around 60-65°C with fans spinning at around 700-900 RPM, at a room temp of around 30-32°C.
You can also put it somewhere you wouldn't even be near and stream to your TV, smartphone, tablet, or even a laptop.
Laptops these days have really nice builds. Some hardware can handle most esports games at 120-180fps without audible noise, since most of them use the keyboard deck to help cool the heatsink and have more evenly distributed exhaust designs.
If you dislike noise, you can also place it in another room, for example. PCs only generate a ton of heat when they're being stressed. My rig only makes some noise when I push past 300fps, and it only gets annoying above 400fps, and that's at really high settings and resolution.