Thanks haha! Well, I am just a nerd. I’m not right on everything either. But I try 
Regarding the resolution question: sort of both. The idea that “increasing resolution reduces the CPU requirement” is only a half-truth.
Think of framerate as a two-part game.
Extremely unscientific and simplistic explanation with made up numbers:
The CPU can process frames as fast as it’s able to, but the GPU can only render as many frames as the CPU feeds it, up to its own limit.
Increasing resolution mostly stresses the GPU; the CPU isn’t affected nearly as much by the bump in resolution.
If at 1080p, your CPU is capable of processing 150 frames, but your GPU is capable of rendering 200 frames, then you will only see 150 frames because you are CPU-limited. In this case, a CPU upgrade could help you utilize more of your GPU’s capability.
If at 1440p, your CPU is capable of processing 140 frames, and your GPU is capable of rendering 150 frames, then you will only see 140 frames because you are still CPU-limited, but much less so. The GPU is able to be more fully utilized because you’re asking more of it.
If at 4k, your CPU is capable of processing 120 frames, but your GPU is only capable of rendering 100 frames, then you will only see 100 frames because you are now GPU-limited.
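If you like code, the whole model boils down to a min(). Here’s a toy Python sketch of the scenarios above, using the same made-up numbers (the function name is just mine for illustration):

```python
def effective_fps(cpu_fps, gpu_fps):
    # you see whichever component runs out of headroom first
    return min(cpu_fps, gpu_fps)

# (resolution, frames the CPU can process, frames the GPU can render)
scenarios = [("1080p", 150, 200), ("1440p", 140, 150), ("4k", 120, 100)]

for res, cpu, gpu in scenarios:
    limiter = "CPU" if cpu < gpu else "GPU"
    print(f"{res}: {effective_fps(cpu, gpu)} fps ({limiter}-limited)")

# 1080p: 150 fps (CPU-limited)
# 1440p: 140 fps (CPU-limited)
# 4k: 100 fps (GPU-limited)
```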
But let’s say you had a weaker processor that could only process 120 frames at 1080p, 110 frames at 1440p, and 100 frames at 4k. Let’s say your GPU could do 200, 150, and 120 respectively at those resolutions.
You could run at 1080p or 1440p, but you would be stuck at 120fps/110fps while your GPU had headroom for 200/150. Or you could run at 4k and get an actual 100fps: 100 frames from the CPU against the 120 the GPU could render. The 4k option gets the most benefit out of both components. Increasing resolution didn’t actually increase FPS at all; it just balanced the load and gave you better utilization.
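Same toy model for the weaker CPU, this time also printing a crude GPU-utilization figure (delivered frames divided by what the GPU could have rendered; again made-up numbers from above):

```python
# (resolution, frames the weaker CPU can process, frames the GPU can render)
scenarios = [("1080p", 120, 200), ("1440p", 110, 150), ("4k", 100, 120)]

for res, cpu, gpu in scenarios:
    fps = min(cpu, gpu)
    print(f"{res}: {fps} fps, ~{fps / gpu:.0%} of the GPU used")

# 1080p: 120 fps, ~60% of the GPU used
# 1440p: 110 fps, ~73% of the GPU used
# 4k: 100 fps, ~83% of the GPU used
```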
But say you had an even weaker CPU, one that could only process 60 frames at any resolution. Increasing the resolution won’t give you more frames past 60, and getting a faster graphics card won’t help either.
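In min() terms: min(60, anything) is still 60.

```python
print(min(60, 200))  # 1080p with a monster GPU: still 60 fps
print(min(60, 120))  # 4k with the same GPU: still 60 fps
```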
So the whole “higher resolution is less demanding on the CPU” thing really depends, and it doesn’t mean you can get away with a poor CPU.
There are other nuances, like 1% and 0.1% lows, and scenarios where you will have low FPS (to varying degrees) no matter what GPU or CPU you have. There’s also the idea of deliberately aiming for lower GPU utilization (thermals, noise). But those are another story; this is a very basic overview.
So to answer your question, you need both, but as long as your CPU isn’t totally terrible, GPU is generally more impactful on your experience.
Cliffs: if you upgrade to a 2k or 4k display and your GPU is very strong, your CPU will be less of a bottleneck, but you won’t get more frames than you would have gotten playing at 1080p. For 2k, I’d recommend something like a 5700 XT or 2070 Super, and the limitations of your CPU will be less obvious, not counting the possibility that you’re running too many programs at the same time.