Is it worth playing in 16:10 (if it's an option)?

I am wondering if it's worth playing in 16:10 if that is an available option, as all my previous laptops used 16:9 and the laptop I just recently got has a 16:10 display.

Yes, it is worth it. (And yes, it is an option, unless you have a wonky driver issue on your PC.)

Source: been gaming on the slightly taller 16:10 format since 2005. Overwatch plays on it just fine.

What does it change lol

Using an example: the standard 1080p resolution is 1920x1080 pixels, which has a ratio of 16:9.

The 16:10 equivalent would be 1920x1200 pixels: the same number of pixels width-wise, but 120 more vertically.
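If you want to sanity-check that arithmetic yourself, here's the same math as a throwaway Python snippet. It's just ratio division, nothing Overwatch-specific, and the `aspect` helper is only made up for the example:

```python
# Plain arithmetic check of the ratios above; nothing game-specific.
def aspect(width, height):
    return width / height

print(aspect(1920, 1080))   # ~1.778 -> 16:9
print(aspect(1920, 1200))   # 1.6    -> 16:10
print(1200 - 1080)          # 120 extra rows of pixels on the 16:10 panel
```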

I've honestly never played in 16:10 before, I'm not even sure what the exact difference is.

One of two things happens when you game in 16:9 mode on a 16:10 monitor:

  1. You see black bars on the top and bottom of your screen, or
  2. Your image is slightly stretched vertically.

Is either of those options acceptable to you? No? Then play on your screen’s native format (16:10).
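For the curious, here's a rough sketch of that first case (letterboxing): render the 16:9 image at full width, then pad the leftover rows with black. This is only the geometry, not how the game's actual scaler is implemented, and `letterbox_bars` is a made-up helper name:

```python
# Letterboxing a 16:9 image onto a 16:10 panel: render at full width,
# pad the leftover rows with black bars (geometry only, not OW's actual scaler).
def letterbox_bars(panel_w, panel_h, content_ratio=16 / 9):
    content_h = round(panel_w / content_ratio)      # rows the 16:9 image needs
    leftover = panel_h - content_h                  # rows left over on the panel
    return leftover // 2, leftover - leftover // 2  # (top bar, bottom bar) in pixels

print(letterbox_bars(1920, 1200))  # (60, 60)
print(letterbox_bars(2560, 1600))  # (80, 80)
```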

I don’t think it even works properly in OW.
At least it does nothing on a 16:9 display.

16:10 is normally a narrower view, so it should leave black bars on the left and right on wider displays.
If you try a wider (16:9) view while using a 16:10 monitor, the black bars will be on the top and bottom instead of on the sides.

From a competitive standpoint, 16:9 lets you see more of the playfield, but it is not a big deal if you are using 16:10. You will just have to look around a bit more.
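To illustrate the "see more of the playfield" point, here's generic Hor+ camera math: hold the vertical FOV fixed and derive the horizontal FOV from the aspect ratio. I'm not claiming Overwatch computes its FOV exactly this way, and the 70-degree value is just an arbitrary example:

```python
import math

# Generic Hor+ camera math: hold vertical FOV fixed and widen horizontal FOV
# with the aspect ratio. Illustration only; NOT a claim about how Overwatch
# handles FOV, and 70 is just an arbitrary example vertical FOV.
def horizontal_fov(vertical_fov_deg, aspect):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

VFOV = 70
print(round(horizontal_fov(VFOV, 16 / 9), 1))    # 16:9  -> wider horizontal view
print(round(horizontal_fov(VFOV, 16 / 10), 1))   # 16:10 -> a bit less side to side
```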

I don't think I see bars, as I have the game set to 2560x1600. My game could be set to 16:10, but I would have to check my settings when I log in.

I can think of no better way to explain what the render ratio does than this:

2560/1600 = 1.6 (a 16:10 aspect ratio), so all your monitor's pixels will be in use and nothing gets stretched. I don't remember off the top of my head if Overwatch is like some games where you can set your monitor to use your native 2560x1600 resolution yet force a non-native ratio. If that option is available, then on a 16:10 screen it would do one of the following:

  1. (most likely case) internally render the game at 16:9 (2560x1440), which is fewer pixels than your monitor needs to display the image correctly, then interpolate nearby pixels to stretch the image vertically to fill the missing 160 vertical pixels, resulting in a fuzzier image that looks like it was stretched tall.
  2. -- or --
  3. (less likely case) internally render the game at 16:9 (2560x1440), which is fewer pixels than your monitor needs to display the image correctly, then merely fill the missing 160 vertical pixels with junk data, most likely solid black: 80 pixels at the very top of the screen and 80 pixels at the very bottom, resulting in:
    • annoying black bars instead of the image filling the monitor entirely,
    • no loss of image quality,
    • slightly faster render time than your monitor's native resolution (due to internally having less work to process!).
  4. -- or --
  5. I suppose you could also internally render at a resolution greater than your monitor's native resolution. Math time:

    Note the standard conversion math:
    1600 * 16/10 = 2560

    And…
    2560 * 9/16 = 1440
    2560 * 10/16 = 1600

    We could try scaling up by rendering more horizontal pixels than we need and then throwing them away to fit your monitor's 16:10 aspect ratio, assuming we hold the number of vertical pixels constant.

    Or we could increase the number of vertical pixels if we wanted to hold the number of horizontal pixels constant:
    2560 * 11/16 = 1760

    Under this 16:11 ratio, we could display it on your screen by throwing away 160 vertical pixels' worth of rendering, giving you a nice clean 2560x1600 image (at increased VRAM and GPU processing cost, resulting in unnecessarily lower FPS than if you had simply rendered at 2560x1600 internally and displayed 2560x1600 natively on your monitor).

    If we wanted to be extra wasteful, we could render at a 16:9 ratio at a resolution greater than your native 2560x1600, and throw away both horizontal and vertical pixels, giving you the same image quality as if you had rendered natively, but costing you a serious loss of FPS. (By the way, you can do this sort of ratio-preserving scaling in the internal renderer while keeping the output resolution; the option for it is called "Resolution scaling." It's practically pointless to set it to anything other than 1.0: values greater than 1 result in the aforementioned performance loss with no noticeable image quality gain, and values less than 1 result in increased performance at a loss of image quality.) There's a rough pixel-count comparison of these scenarios below.
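To put rough numbers on those scenarios and on the resolution-scale point, here's a back-of-the-envelope comparison in Python. It's only arithmetic on the resolutions discussed above (and it assumes the scale factor applies per axis, which is my assumption about the convention), not a model of how the renderer actually works; the helper names are made up for the example:

```python
# Back-of-the-envelope pixel counts for the scenarios above.
# Pure arithmetic on the resolutions discussed; not a model of OW's renderer.
NATIVE_W, NATIVE_H = 2560, 1600   # 16:10 panel

def pixels(w, h):
    return w * h

native = pixels(NATIVE_W, NATIVE_H)
cases = {
    "native 2560x1600 (16:10)":        native,
    "stretch/letterbox 2560x1440":     pixels(2560, 1440),   # the 16:9 internal render
    "overdraw 2560x1760, crop 160 px": pixels(2560, 1760),   # the wasteful 16:11 idea
}
for name, count in cases.items():
    print(f"{name}: {count:,} px ({count / native:.0%} of native work)")

# "Resolution scaling": multiply each axis by the scale factor before the image
# is fitted back to the screen (assumption about the exact convention).
for scale in (0.75, 1.0, 1.5):
    w, h = round(NATIVE_W * scale), round(NATIVE_H * scale)
    print(f"render scale {scale}: internal {w}x{h} = {pixels(w, h):,} px")
```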

Of course, knowing Blizzard these days, I guess we could always argue that the render ratio option is just broken entirely and doesn't actually do anything at all! :clown_face: