This page demonstrates how the framerate/refresh rate actually affects what you see on a computer monitor or TV.
Many console users, and uninformed PC users, keep saying things like:

- "30 fps is cinematic"
- "60 fps is enough"
- "120 fps is useless"
- "The eye sees no difference above X fps"
Is this really true?
[Live FPS counter]
(I'm assuming you have a decent enough browser to view the demo above, and a decent enough computer to run it stably at your monitor's refresh rate.)
You are probably running this at ~60 fps.
Do you think that 60 fps is enough?
What you are seeing now is this:

What you see in real life is more similar to this:


So, is 60 fps enough?

We're talking about an application of the Nyquist–Shannon sampling theorem: saying "N fps is enough" is not true for any N, however big, because the displayed sequence will stop looking realistic as soon as some object moves faster than a certain speed.

That speed is determined by the framerate and the resolution. When it is exceeded, temporal aliasing occurs: the moving object skips pixels, teleporting from one place to another on your screen.
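To make this concrete, here is a minimal sketch (the function name and numbers are my own, not part of this page): at a given framerate, an object moving at some speed in pixels per second jumps speed/fps pixels between consecutive frames.

```ts
// How many pixels an object jumps between two consecutive frames.
function pixelsPerFrame(speedPxPerSec: number, fps: number): number {
  return speedPxPerSec / fps;
}

// Example: a ball crossing a 1920 px wide screen in half a second
// moves at 3840 px/s. At 60 fps it teleports 64 px every frame:
console.log(pixelsPerFrame(3840, 60)); // 64
// At 120 fps the jump halves, but it is still far from 1 px/frame:
console.log(pixelsPerFrame(3840, 120)); // 32
```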

This is evident in the example above: on a normal setup with a big screen, the green circle teleports by 4-5 cm every frame, and it's pretty difficult for the human brain to believe that this is a moving object.
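As a rough sanity check (my numbers, picked for illustration): a ball crossing a ~110 cm wide TV in about 0.4 seconds, rendered at 60 fps, jumps 110 / (0.4 × 60) ≈ 4.6 cm between frames, right in that 4-5 cm range.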
This is not a matter of persistence of vision or other vision tricks, as 99% of people say. The only problem is making our brain believe that we are staring at a moving object, and not at a bunch of photos. If you play Super Mario, or another slow game, at 30 fps, you may not feel much discomfort, because the difference between one frame and the next is pretty small (apart from jumps).
If you play Sonic (to fit the example above), or another fast-paced game, at 30 fps, you are going to have a bad time: the difference from one frame to the next is much more evident, you'll start losing immersion and important information about the game, and it will become harder for your brain to track what's happening in the game world, and thus harder and less comfortable for you to play.
Another edge case: if you look at your clock's hour hand, 1 fps is enough. Nobody complains "my clock lags", because the movement is so small you wouldn't notice it; it's smaller than the minimum distance your eye can resolve.
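For scale (again, illustrative numbers): the tip of a 10 cm hour hand travels a 2π × 10 ≈ 63 cm circle in 12 hours, about 0.0015 cm per second, so even sampled at 1 fps each frame moves it by far less than a pixel.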

So, what is the speed beyond which you start losing information?


Here it is.
This circle is moving one pixel at a time.
Everything on your monitor that moves faster than this is actually skipping pixels, losing information, aliasing, lagging, etc...
Seriously
That might not seem like a big deal, but ideally you'd want to keep every object's movement under 1 px/frame (not counting spatial antialiasing, which effectively increases your render resolution); otherwise you'll end up teleporting things around and losing information.
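Turned around, the 1 px/frame rule gives a minimum framerate for a given object speed. A small sketch (my own illustration, not code from this page):

```ts
// Minimum framerate so an object moving at `speedPxPerSec`
// never jumps more than `maxPxPerFrame` pixels between frames.
function minFps(speedPxPerSec: number, maxPxPerFrame = 1): number {
  return speedPxPerSec / maxPxPerFrame;
}

// A cursor flicked across a 1920 px monitor in 0.2 s moves at 9600 px/s:
console.log(minFps(9600));     // 9600 fps for pixel-perfect motion
console.log(minFps(9600, 10)); // 960 fps if you tolerate 10 px jumps
```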

This is skipping a pixel every frame

This is skipping 5 pixels every frame

This is skipping 10 pixels every frame
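All of the demos above boil down to the same loop: move a ball a fixed number of pixels, then draw the next frame. A minimal Canvas sketch (assuming a page with a `<canvas id="demo">` element; the id is my own) looks like this:

```ts
const canvas = document.getElementById("demo") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;
const pxPerFrame = 5; // try 1, 5 or 10 to reproduce the examples above
let x = 0;

function frame(): void {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.beginPath();
  ctx.arc(x, canvas.height / 2, 20, 0, 2 * Math.PI);
  ctx.fillStyle = "green";
  ctx.fill();
  x = (x + pxPerFrame) % canvas.width; // jump pxPerFrame pixels per frame
  requestAnimationFrame(frame);        // fires at your monitor's refresh rate
}
requestAnimationFrame(frame);
```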

For the examples above, your brain and your visual system fill in the gaps, making a teleporting ball appear as a continuous movement. This is NOT realistic: try following the ball below with your eyes, or look at something still and watch the ball stutter around. That sucks; it's not like a real moving object at all.

Motion blur can be added, but then you'll end up with a blurred moving thing, which is still not what you want.
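One cheap way to fake motion blur in a sketch like the earlier one (again, just an illustration) is to fade the previous frame instead of clearing it, replacing the clearRect call:

```ts
// Instead of ctx.clearRect(...): partially cover the previous frame,
// assuming the page background is white.
ctx.fillStyle = "rgba(255, 255, 255, 0.3)";
ctx.fillRect(0, 0, canvas.width, canvas.height);
// The ball now leaves a fading trail that reads as blur:
// smoother-looking, but the moving object itself is no longer sharp.
```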

Conclusion:
Everyone who says "anything over X fps is useless" is completely wrong, and playing a game at 25-30 fps, as on modern "next gen" consoles, is the worst gaming experience ever: it breaks your immersion and your accuracy, and everyone who denies this is simply misinformed. The same goes for those who say "120 fps is useless": the difference exists, and it can be seen and distinguished by everyone.

FPS > GFX