The standard test for periodicity is to take a Fourier transform of the data and look for peaks, but that is difficult here because of the missing values. If I interpolate the missing data as you did, I see periodicity. If I replace the missing values with zeroes, I see no periodicity. Since we are trying to determine whether there is periodicity at all, we want to be careful not to use analysis methods that can distort the answer. From this test, I learn that interpolation is dangerous in this case, and may make you see things that aren't there.
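To illustrate the hazard, here is a small sketch using synthetic data (not your actual SR numbers): a pure-noise "differential" series with games knocked out at random, run through an FFT twice, once with the gaps linearly interpolated and once with the gaps zero-filled. The two spectra disagree even though the underlying signal is identical, which is exactly the warning sign.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real data: pure noise (no periodicity),
# with roughly 30% of games missing.
n = 200
diff = rng.normal(0, 25, n)
missing = rng.random(n) < 0.3
observed = diff.copy()
observed[missing] = np.nan

# Option A: linearly interpolate across the gaps before the FFT.
idx = np.arange(n)
interp = observed.copy()
interp[missing] = np.interp(idx[missing], idx[~missing], observed[~missing])

# Option B: replace the gaps with zeroes before the FFT.
zeroed = np.nan_to_num(observed, nan=0.0)

spec_interp = np.abs(np.fft.rfft(interp - interp.mean()))
spec_zero = np.abs(np.fft.rfft(zeroed - zeroed.mean()))

# Same underlying signal, different gap-filling choice, different spectra:
# the "peaks" you see depend on the preprocessing, not just the data.
print(np.max(np.abs(spec_interp - spec_zero)))
```

Interpolation in particular acts like a low-pass filter, so it can smear noise into smooth, periodic-looking structure.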
A different test is to use an autocorrelation function (which measures whether the differential at game A is correlated with the differential at game A + x), and that is more forgiving of missing data. But in this case there is not nearly enough good data, and the error bars came out far bigger than any putative signal.
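A minimal sketch of why autocorrelation tolerates gaps: at each lag you only average over the pairs of games where both values exist, so missing games simply drop out rather than needing to be invented. (The function name and the toy alternating series below are my own illustration, not your data.)

```python
import numpy as np

def nan_autocorr(x, lag):
    """Autocorrelation at a given lag, using only pairs of games where
    both values are present; missing (NaN) games just drop out."""
    a, b = x[:-lag], x[lag:]
    ok = ~np.isnan(a) & ~np.isnan(b)
    if ok.sum() < 2:
        return np.nan
    a, b = a[ok] - a[ok].mean(), b[ok] - b[ok].mean()
    return np.sum(a * b) / np.sqrt(np.sum(a**2) * np.sum(b**2))

# Toy example: a perfectly alternating win/loss differential with some
# games missing still shows strong negative lag-1 correlation.
x = np.array([1.0, -1.0] * 50)
x[::7] = np.nan
print(nan_autocorr(x, 1))  # strongly negative, close to -1
```

The catch, as above, is that each lag's estimate is built from fewer surviving pairs, so with sparse data the per-lag error bars blow up and swamp any real signal.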
If you want to know more, I can go through this more carefully and show my work, but I want to be sure that you are interested before I do the significant amount of work required.
As a highly relevant aside, note:
This says that even if the game were buggy (or it were hard to find fair matches in your SR/platform/region) and were fat-fingering the win/loss scale, you would get more/less SR per game to compensate, so there should be no net bias against you.