First off: comparing screen resolutions on their own is pointless, because resolution is only one of three variables that determine whether a difference is perceptible. You also need to include viewing distance and screen size. There are plenty of handy charts out there showing the "breakpoints" where a change in resolution is or isn't perceptible to someone with 20/20 vision at a specific distance on a specific screen size.
Here's an example:
So let's say you're using a 50" screen and viewing it from under 2 feet away (a moderately large screen used up close as a desktop PC gaming setup). As you can see on the chart, you can still perceive a difference well past Ultra HD; in fact, IIRC, at under 2 feet on a large screen you can still easily tell 4K from 8K, and it's only around 8K that the returns start to level off.
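As a rough back-of-the-envelope check (not a substitute for those charts), you can estimate pixels per degree of visual angle from the screen size, resolution, and viewing distance, and compare it against the ~60 pixels-per-degree figure commonly quoted for 20/20 vision. The 16:9 assumption and the 60 ppd threshold here are illustrative simplifications, not values pulled from any specific chart:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in):
    """Rough horizontal pixels per degree for a 16:9 screen viewed
    straight on, centered, from distance_in inches away."""
    # Screen width from the diagonal, assuming a 16:9 panel.
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    # Total horizontal visual angle the screen subtends.
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / angle_deg

# The 50" at ~2 feet (24") case from above.
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: ~{pixels_per_degree(50, px, 24):.0f} pixels/degree")

# ~60 pixels/degree is the figure usually quoted for 20/20 acuity; 4K lands
# well under that here, which is why the 4K -> 8K step is still visible,
# while 8K comes out above it, which is where the returns start to level off.
```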
And at extremely close distances on small screens, the requirement goes even higher: it's theorized that even a benchmark well above a theoretical "16K" would still be needed. That case applies specifically to VR headsets, which have already started surpassing 8K and are still well below the necessary benchmarks.
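The VR case can be sketched the same way: spread a headset's per-eye horizontal resolution across its field of view and the pixels-per-degree number comes out far below a desktop monitor's. The FOV, the 60 ppd target, and the pixel counts below are all rough illustrative assumptions, not specs of any particular headset:

```python
# Illustrative assumptions only: ~100 degree horizontal FOV per eye and a
# ~60 pixels/degree target for 20/20 acuity. Neither is a real headset spec.
FOV_DEG = 100
TARGET_PPD = 60

print(f"Needed: ~{FOV_DEG * TARGET_PPD} horizontal pixels per eye")  # ~6000

# Hypothetical per-eye panel widths, to show how far typical pixel counts
# fall short of that target once they're stretched across a wide FOV.
for px in (2000, 3000, 4000):
    print(f"{px} px over {FOV_DEG} deg -> ~{px / FOV_DEG:.0f} pixels/degree")
```

Double that per-eye figure across two eyes, or push the FOV and acuity target higher, and you land in the "above 16K" ballpark mentioned above.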
Or, on the other end of the spectrum, take a massive screen like the one in a movie theatre, viewed from only 20-30 feet away (so first-row seats): you'll still be able to perceive the difference between 4K and 8K.
However, as you get farther from the screen, that fine detail becomes impossible to resolve and you start to have trouble distinguishing the two, so if you're sitting in the back row of a movie theatre, you probably wouldn't be able to perceive a difference at all.
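You can also turn that around and solve for the distance at which a given resolution crosses the ~60 pixels-per-degree mark; past that distance, stepping up to the next resolution stops being resolvable. The screen width and the 60 ppd threshold below are illustrative assumptions, not real theatre numbers:

```python
import math

def max_useful_distance(screen_width_ft, horizontal_px, target_ppd=60):
    """Rough distance (feet) beyond which this screen's pixel density
    exceeds target_ppd, i.e. a higher resolution stops being resolvable.
    Exact inversion of the visual-angle formula, straight-on view."""
    # Total visual angle the screen must subtend for its ppd to equal target_ppd.
    angle_deg = horizontal_px / target_ppd
    # Invert angle = 2 * atan(width / (2 * distance)) and solve for distance.
    return screen_width_ft / (2 * math.tan(math.radians(angle_deg / 2)))

# Hypothetical theatre screen ~60 ft wide showing 4K content.
d = max_useful_distance(60, 3840)
print(f"Past ~{d:.0f} ft, 4K vs 8K stops being distinguishable")  # ~48 ft

# Front-row seats at 20-30 ft are well inside that range; back rows are well past it.
```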
As for frame rates: the human eye does not perceive light as a "frame rate", and it can perceive differences even as high as 120 fps, but how noticeable a given difference is depends on a myriad of things, like what color the light is, how bright it is, etc. You have a much easier time perceiving it with bright, high-contrast content. That's why the commonly cited example is moving the mouse cursor on a white background: a black pattern on a white background is extremely easy for the eye to pick up, and in a case like that you can tell the difference well above 120 fps.
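One way to see why the cursor test is so sensitive: a fast-moving cursor is drawn at a series of discrete positions, and the gap between successive positions only shrinks as the frame rate climbs. The sweep speed here is a made-up number purely for illustration:

```python
# Illustrative only: a cursor swept across the screen at ~3000 px/s.
CURSOR_SPEED_PX_PER_S = 3000

for fps in (60, 120, 240, 480):
    gap_px = CURSOR_SPEED_PX_PER_S / fps
    print(f"{fps:3d} fps -> cursor jumps ~{gap_px:.1f} px between frames")

# At 60 fps the black cursor shows up as widely spaced copies against the
# white background (a very high-contrast pattern the eye picks up easily);
# even at 240+ fps the gaps are still several pixels wide, which is why
# differences stay visible well above 120 fps in this kind of test.
```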
And as others mentioned, whether it's in your periphery or your focal vision matters a tonne too, along with other random factors like how tired you are, how strained your eyes are, how good your eyesight is, whether you're paying attention and looking for it, etc.
But the human eye doesn't see "frames" of data. It constantly moves around your field of view taking multiple snapshots of information, and your brain pieces that data together into a conceptual idea of what you're seeing. Half of what you think you're seeing isn't actually there; it's "guesswork" your brain used to fill in what it thinks ought to be there.
It's why things like optical illusions exist. If you realized just how much of what you think you see is really just your brain guessing to fill in the gaps, you'd be surprised.
For example: if you were staring at a white screen and, for 1/120th of a second, it flickered to all black and back to white, you would absolutely, without a doubt, notice that.
But if the screen flickers back and forth between "white" and "ever so slightly off-white" 120 times per second, you'd likely not perceive it at all.
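The difference between those two cases is contrast, not just timing. A rough way to put numbers on it, using arbitrary placeholder luminance values:

```python
def michelson_contrast(lum_a, lum_b):
    """Michelson contrast between two alternating luminance levels."""
    return abs(lum_a - lum_b) / (lum_a + lum_b)

# Arbitrary illustrative luminance values (think cd/m^2).
white, black, off_white = 250.0, 0.5, 245.0

print(f"white <-> black:     {michelson_contrast(white, black):.2f}")      # ~1.00
print(f"white <-> off-white: {michelson_contrast(white, off_white):.2f}")  # ~0.01

# A single full-contrast black frame is a huge transient the visual system
# is tuned to catch; a ~1% modulation flickering at 120 Hz is far too fast
# and too faint to register, so it just blends into a steady white.
```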
And then you get into the REALLY crazy stuff, like pupil dilation responses, the fact that your brain automatically performs color correction over time (which is what causes the afterimages you see after staring at a light for a while), and even motion adaptation (which is what causes that "waterfall" effect you get if you watch the credits of a movie scroll for a long while and then look away).
There are so many different mechanisms working together in our eyesight that trying to simplify it down to "can humans perceive X fps or not" just isn't possible.