Fastest possible visible feedback?

Uhmmmmm. As far as I’m aware, any input automation for any reason is bannable if detected, so I was considering techniques like that off-limits.

“Automated Input” is directed more towards playing the game for you. Otherwise, macros are fine. Besides, in this case, I was only using a single keypress macro for testing - to hold a key for a precise amount of time.

An example of a bannable macro is using one to control a spray pattern, such as Soldier's burst firing. Macros to change weapon or - in my case - push to talk are 100% legal.

I would be pressing 2 keys by necessity (I want to compare, between heroes, the time from mouse click to the ‘firing primary’ event).

To use a key press macro then, I would be pressing 2 buttons simultaneously - the mouse press and the control keypress.

2 actions with one macro is the red line they’ve drawn.

I don’t really understand where the second key press comes in. Isn’t it just pressing and releasing the same button?

Annyyyywayy, let’s move on…

I have read the post and I think you might be going about it slightly the wrong way. I don’t think you need to do this visually. I have a question:

How is the OBS timer displayed?

Is it just a timer that starts from 0 when you press the button?

Or is it a millisecond timestamp (https://currentmillis.com) of when you pressed the button?

Also:

This is something you should consider if you haven’t already. The time it takes your physical button press to reach the server will vary from person to person. (Obviously I don’t know the specifics of your test, so I can’t be sure whether you’re going to use the data in other environments.)

1st key press is mouse1, which is what I’m testing, so it can’t be my timer control. 2nd key press is any unbound key, pressed for timer control.

I don’t have a timer in OBS right now. I have an overlay that lights up when I click the mouse. I’m sure I could add an ongoing timer overlay, and probably one that starts on mouse click… OBS has a lot of plugins.

The results would still be recorded in a 60fps video though. (I haven’t seriously investigated if I can record at 144fps, but that would be more accurate so I’ll look into it)

If nothing else, running a timer overlay would eliminate the human error of miscounting how many times ‘step forward’ was pressed.

This is made especially difficult simply because of that jump you’re also trying to measure between the client and the server lol. Because of this, I think you will have to do it somewhat visually after all lol.

So. Here’s my idea for this:

Firstly, you’ll need a timer overlay that starts counting up in milliseconds when you press your primary fire button.
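
(I’m not sure of the cleanest way to build that overlay - there may already be an OBS stopwatch plugin that fits. As a rough fallback you could window-capture a little helper script like the hypothetical one below; it just waits for the first left click and prints a millisecond counter. The pynput package, and whether a global mouse hook behaves well alongside a fullscreen game, are assumptions on my part.)

# Hypothetical helper, not an OBS feature: a console stopwatch that starts on the
# first left mouse click; window-capture the console into your OBS scene.
# Assumes the third-party pynput package is installed (pip install pynput).
import time
from pynput import mouse

start = None

def on_click(x, y, button, pressed):
    global start
    if pressed and button == mouse.Button.left and start is None:
        start = time.perf_counter()  # begin timing on the first left press

with mouse.Listener(on_click=on_click):
    while True:  # Ctrl+C to stop
        if start is not None:
            elapsed_ms = (time.perf_counter() - start) * 1000
            print(f"\r{elapsed_ms:8.0f} ms", end="", flush=True)
        time.sleep(0.001)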

Then create 3 rules:
On Going - Global
Conditions {
    // No Conditions
}
Actions {
    // Z counts up at a rate of 1 per second (toward an arbitrarily large destination),
    // so Z * 1000 is your game-side clock in milliseconds.
    ChaseGlobalAtRate(GlobalVariable(Z), 100000, 1)

    // Make sure you index these HUD elements in this order, it'll make things easier.
    CreateHUD( GlobalVariable(Z) * 1000 )  // 1st element: current game time in ms
    CreateHUD( GlobalVariable(A) )         // 2nd element: game time when the button press was detected
    CreateHUD( GlobalVariable(B) )         // 3rd element: game time when the damage was detected
}

On Going - Per-Player
Conditions {
    IsHoldingButton(PrimaryFire)  // Don't use IsUsingPrimaryFire()
}
Actions {
    // Stamp A with the current game time (in ms) the moment the press is detected.
    SetGlobalVariable(A, GlobalVariable(Z) * 1000)
}

On Going - Player Took Damage Event  // You could also try "Dealt Damage" and see if that gives different results
Conditions {
    // No Conditions
}
Actions {
    // Stamp B with the current game time (in ms) the moment the damage is detected.
    SetGlobalVariable(B, GlobalVariable(Z) * 1000)
}

Now when you record, you’ll have 4 pieces of information:

  1. Your client-side OBS button press timer - hopefully in milliseconds.
  2. The current time in the Overwatch instance that will always be counting up (Global Variable Z).
  3. The time at which the button press was detected (Global Variable A).
  4. The time at which the damage was detected (Global Variable B).

When you watch back the recording, the first step is to note the game time (Global Variable Z).
As soon as Global Variable A (the second HUD element) changes from 0, your OBS timer reading at that moment is the (very rough) time it took your press to reach the server. We’ll call this value X.
Then:

  • Global Variable B - Global Variable A is the time taken to deal the damage on the server.
  • Add this value to X and that is your total time (see the quick worked example below).
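
A quick worked example of that arithmetic (numbers are purely illustrative, not measurements):

# Illustrative Python - plug in whatever you read off your recording.
X = 60       # ms on your OBS timer when Global Variable A first changes from 0
A = 41000    # 2nd HUD element: game time (ms) when the press was detected
B = 41100    # 3rd HUD element: game time (ms) when the damage was detected

server_side = B - A      # time from press to damage on the server: 100 ms
total = X + server_side  # rough total from physical click to damage: 160 ms
print(server_side, total)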

This is a lot, I know lol. You might find that this doesn’t give you the accuracy you want, but unfortunately, I don’t think there’s much more than this you can do. :confused:

Feel free to add me if you want to talk through DM.

Have you locked the game to 60fps as well? Input latency is directly related to framerate, so that’s important - otherwise you will see very varied results.

I don’t actually need step 4/variable B. I’ve separately tested the time it takes from ‘is firing primary’ to ‘damage received’. It is in fact 0 for all hit-scan heroes, which is good since that’s the definition of hit-scan. I mean, I guess I could leave that part of it in anyway.

Anyway, I’ve now got an OBS layout working with a millisecond-accurate stopwatch overlay next to the mouse overlay. I’ll just have to try to get a continuous timer on the HUD text, and I should be good to go.

Thanks for the suggestions!

I haven’t… I’m running at ~200fps rendered / 144fps displayed. I’m not sure why it would matter as long as my frame rate is stable, but for the sake of solid data I can do the test at a locked frame rate (probably at 144fps for higher accuracy… it does look like recording at 144fps is pretty easy to do … err, but having not tried it yet, I don’t know that it won’t tank in-game frame rate below 144 which would be … problematic… so maybe 60, heh).

The first rule I listed will give you that functionality - specifically these 2 actions: the ChaseGlobalAtRate and the first CreateHUD (the one displaying Global Variable Z * 1000).

Let me know how it goes for you. Good luck!

If you’re recording at one framerate and rendering at another, lower one, you may end up not recording the first few frames when you click or when you get the response.

If you start displaying a timer instead it becomes much less of a problem.

This seems to work pretty well. I have timers in both OBS and the Workshop now using your technique; I then register the time of the firing action as a copy of Z.

I have spreadsheet lines like (all times in ms):

local (click) | remote (click) | firing registered (remote) | loc/remote offset | firing registered (local) | DeltaT, click to firing | hero
62550 | 65835 | 65980 | 3285 | 62695 | 145 | mccree
64380 | 67659 | 67788 | 3279 | 64509 | 129 | mccree
65830 | 69115 | 69244 | 3285 | 65959 | 129 | mccree
78550 | 81818 | 81979 | 3268 | 78711 | 161 | ashe
80240 | 83546 | 83675 | 3306 | 80369 | 129 | ashe
87400 | 90697 | 90842 | 3297 | 87545 | 145 | ashe

Pretty sure that Ashe line is just an outlier, probably due to the low local/remote offset at that frame. I should probably use an average offset instead anyway - the jitter in the offset is almost certainly jitter in what the Workshop displays. (Or is the data more compelling if I just do enough samples to iron out the outliers and use the offset values for each sample? Hmmmmm. Probably doesn’t matter, the kind of people I’m trying to convince aren’t going to believe me anyway… w/e, that’s part of analysis, not experimental method, so I can just run it both ways - see the quick sketch below. Why not.)
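
(For reference, “run it both ways” is basically just this - a rough sketch using the three McCree rows from the table above:)

# Per-sample offset vs. average offset, using the McCree rows from the table above.
rows = [
    # (local click, remote at click, remote firing registered) - all in ms
    (62550, 65835, 65980),
    (64380, 67659, 67788),
    (65830, 69115, 69244),
]

# offset = Workshop clock minus OBS clock, read on the frame of the click
avg_offset = sum(remote - local for local, remote, _ in rows) / len(rows)

for local, remote, fired in rows:
    per_sample = (fired - (remote - local)) - local  # offset taken from that same frame
    averaged = (fired - avg_offset) - local          # same calculation, but with the mean offset
    print(f"{per_sample:5.0f} ms vs {averaged:7.1f} ms")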

Anyway, I’ll clean up the presentation a little (I’ve got trailing 10ths and 100ths of milliseconds, and an entirely unused field in my HUD text because I was adapting existing code), and do a run with 10 or so shots from each of Ashe and McCree, but initial results look like there’s no real difference.

Oh, also, my machine cannot handle both playing and recording at 144fps. :frowning:

Did a test of 60fps, and good lord does that feel awful. Also no wonder I used to get motion-sick from too much fps gaming with that, it’s like watching the world through a strobe light or something. (I didn’t really feel the difference much going to 144fps, but now I see I can never go back :laughing:)

Turn your graphics all the way down: render scale, FOV and resolution. This only affects the actual rendered world, not the HUD/interface, so those will still be crisp.

Unfortunately, I only know this because that’s how I play OW right now xD. GPU died a couple weeks back so it’s integrated graphics and locked to 30fps for me lol :man_shrugging:

Hmm, good point. I already play on pretty low settings, but I do have 100% render scale and a wide FOV, and probably not lowest possible settings (e.g., I’m sure I have some anti-aliasing and not the worst textures).

Oh, that sucks. I don’t know if I could play at 30fps, I think it would make me motion-sick after 20 minutes or so. (Even at 60 fps, I kind of needed to take breaks from time to time.)

I haven’t even tried to actually play because I know what the result will be haha. I just sit in the Workshop, mostly, and even then it’s not a completely stable 30 fps lol. icry