So wait…what’s the frame rate of our eyes?

What’s the frame rate of the human eye? Spoiler alert! There is none. But there is. Kind of. It’s weird. Movies, video games, television, your
computer, and your lights? All these trick the
human eye into believing that something that is
only there intermittently, is actually continuous. For most modern
movies, 24 unique frames per second are each flashed two or three times (48 to 72 flashes per second), in order to give
the illusion of continuous movement. For televisions, that number is 24, 25, 30, 50, or 60 unique frames,
depending on how the program was captured, and where in the world you’re watching it.
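As a quick sanity check on those numbers, the flash rate of a film projector is just the frame rate times the number of shutter blades, since each blade interrupts the light once per frame. A minimal sketch of that arithmetic (the 24 fps and blade counts are the values discussed in this video):

```python
# Sanity check: projector flash rate = frame rate x shutter blades,
# because each blade interrupts the light once per frame shown.

def flashes_per_second(frame_rate: int, blades: int) -> int:
    """Total screen flashes per second for a given frame rate and
    a shutter that shows each frame `blades` times."""
    return frame_rate * blades

# 24 fps cinema with double- and triple-bladed shutters:
print(flashes_per_second(24, 2))  # 48
print(flashes_per_second(24, 3))  # 72
```

This is why 24 unique frames can still reach the eye as 48 or 72 flashes per second.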
Computer screens, depending on the type, can flash between 0 and 120
times per second, while your overhead lights flicker at a
rate of 50 or 60 times per second. The concept of frames is
so fundamental to how displays and cameras work that the next obvious question is: what
about the human eye? How many unique frames per second does
it see? Zero. It sees zero. Or infinite, depending on the way that
you look at it. Let’s take a look at how the eye works. Inside the eye there are five different
kinds of nerve cells: photoreceptor cells, bipolar cells, ganglion cells, retinal horizontal cells, and amacrine cells. Bipolar cells,
horizontal cells, and amacrine cells are primarily responsible for the eye’s
reflexes, such as movement tracking, iris compensation, activating and
deactivating rods and cones, and grouping signals for easier
transmission. Photoreceptor cells are our rods and cones, cells that
respond to either the intensity of light, or the intensity of light at specific
wavelengths. Our cones let us see color and detail in brightly illuminated environments, and
are concentrated in the fovea centralis, the dense region of cells directly
opposite the lens that forms the center of our vision. The
rods are more sensitive than cones, but indiscriminate to color, and only
see brightness. These are much more spread out across
the retina, and form the bulk of our peripheral vision. They also let us see at night, and in poorly lit
situations. Both cones and rods communicate with
the other nerve cells through what’s called a graded
potential: changes in light striking the outer segment cause the cell to either
absorb or release sodium or potassium, keeping the ion
concentration at its synapse roughly proportional to the amount of light
hitting the cell. In other words, the cones and rods see
the world continuously, in what can be described as a low-latency, infinite frame rate. But this gradient isn’t transmitted
directly to the brain. Instead, it’s carried by bipolar and
amacrine cells to the retinal ganglion cells at the back
of the eye, whose axons form the optic nerve. The ganglion cells continually measure the
gradient, and transmit changes in the gradient to the brain through action potentials: a quick spike
in the charge at the nerve’s synapse. The rate of nerve firing communicates the
information as a sort of pulse frequency modulation, with
higher frequencies meaning that the gradient is getting stronger, and lower frequencies meaning that the
gradient is getting weaker. The visual center of the brain processes
these signals from the eye to create one unified perception of the
world, called the visual field. If your field of view – the stuff that’s
in front of you – isn’t changing much, then your brain doesn’t update the visual
field, meaning that its frame rate is essentially zero. Collectively, the eyes and the vision
center of the brain are grouped together into what’s called
the human visual system, or HVS. And since the parts don’t work
independently, what we should be looking at is the
visual system as a whole, and not just the parts. So, understanding the
basics of how the HVS works lets us ask some better questions about
frame rates: Question 1: How many unique frames do we
need to present to the human eye in one second, in
order for the HVS to perceive continual, fluid motion? The answer to this question was
discovered pretty quickly at the end of the 19th century with the advent of cinema. Both the Thomas Edison
Corporation and the Lumière Brothers found that a frame rate of 16 frames
per second was about the minimum you could get
away with to trick the brain into seeing continuous motion. Question 2: How many flashes of light per
second are needed in order for the HVS to perceive the light as continuous? Once again the answer
comes from the early days of cinema, and the same two pioneering companies. The
Lumière Brothers noticed that at 16 frames per second, the flashing of the screen was unbearable.
So they designed a double-bladed and then later a
triple-bladed shutter, in order to flash each frame more than
once. At these higher flash rates, the flicker was almost unnoticeable. Similarly, Thomas Edison is said to have
observed that 46 flashes per second was the
minimum needed to keep the audience comfortable and reduce eye strain. Later his company
was involved in research for the emerging alternating current electrical systems,
and determined that 48 flashes of light per second was
the absolute minimum, in order for humans to
see the light as continuous. Question 3: What’s the shortest amount of
time a flash of light needs to be in order to be perceived? The
answer to this question comes to us straight from outer space!
Narrator: Yes, it came from outer space… Mission Control: Three, two, one… Mission Control: … zero. All engines running, commence, lift-off! We have lift-off! Astronauts on the Apollo missions to the
moon, who had left the protection of the Earth’s magnetosphere, reported seeing spots, stars, streaks, and clouds of light, flashing about once
every three minutes. While we’re not sure of the exact mechanism
of interaction with the human visual system, we do know that the flashes were caused
by high-energy cosmic rays – particles traveling at near the speed of
light. That means the actual flash durations lasted on the order of femtoseconds. That’s
ten to the minus fifteen seconds. But coming up with a fixed value is
actually kinda impossible, since it depends on how much brighter the
flash is than the ambient conditions. A brighter
flash needs less time to be noticed. Question 4: What is the shortest amount of
time a period of darkness has to last in order to be perceived?
A 2009 study, presented to the National Institutes of
Health, looked at the degenerative effects of aging on the senses. To measure the
degeneration of vision, they measured how long a dark
gap had to be in order for each participant to be able
to see it. Using a group of younger participants as
a control, they found that the mean time for this group was only 18 milliseconds. For the older group
it was still only 22! This translates to a flicker-free flash
rate of between 45 and 55 hertz. Some other participants were still able
to identify a blank period of only two milliseconds! That’s a refresh
rate equivalent of 500 hertz! But on the other hand… When we blink, we black out our field of view for about a third of a second: between 300 and
400 milliseconds. And our brain just filters it out! Unless
you’re focusing on the fact that you’re blinking. Like you’re doing right now… Question 5:
How long do we need to see a scene in order to be able to identify it?
Now we’re getting to the crux of what we really want to know. A 2014 study published in the journal
Attention, Perception, &amp; Psychophysics, looked at the rapid recognition of images
using a test that went something like this:
In the following set of pictures, is there a boat? Did that group of pictures contain a
wedding cake? No, there was no boat, and yes, there was a
wedding cake. By asking questions both before and after a group of pictures, the researchers were able to control for
both image recognition, and image retention in the brain. They
found that their subjects could identify images shown to them in as little as 13
milliseconds, or about 75 hertz, with statistical accuracy beyond
simple chance. The researchers wanted to go further, but
that was the limit on the hardware they were using! And that brings us to the last question
of the bunch, probably the question that brings most of us to ask about the
human visual system to begin with. Question 6: Are there any
advantages to higher frame rates in cinema, television, video games, or
virtual reality? I hope by now that you can see the answer
is… yes, especially in applications that benefit
from a high level of detail, like video games, virtual reality, and
sports broadcasting. But even traditional cinema can benefit
from higher frame rates, allowing for larger and brighter screens. When you bring motion to a larger
screen, you run the risk of motion separation, where the brain
sees a double image, because the jump between frames is
too large to read as continuous movement. This is
less of a problem at higher frame rates. Additionally, the brightness of projected
images in traditional cinema is limited, in order to reduce our
perception of flicker. But when developing the Showscan
format in the early nineteen eighties, Douglas Trumbull found that higher frame
rates, especially 60 frames per second, removed this brightness limitation. In
the same set of experiments, Trumbull found that, when all other
factors were controlled for, the audience had a greater
emotional connection with the scene at 72 frames per second than at
any other frame rate, which suggests that even though we don’t consciously see
them, we’re still subtly aware of the period between frames. And
removing them, or minimizing them, gives a much greater level of realism.
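The motion-separation problem described above can be put in rough numbers: an object panning across the visual field jumps speed divided by frame rate between consecutive frames, and big jumps read as double images rather than motion. This is an illustrative sketch with an assumed pan speed, not a figure from the video:

```python
# Rough sketch of motion separation: between two consecutive frames, a
# panning object jumps (pan speed / frame rate) degrees of visual angle.
# The 30 deg/s pan speed below is an assumed, illustrative value.

def jump_per_frame(pan_speed_deg_per_s: float, fps: float) -> float:
    """Degrees of visual angle an object moves between two frames."""
    return pan_speed_deg_per_s / fps

pan = 30.0  # assumed fast pan: 30 degrees of visual angle per second
for fps in (24, 48, 60, 120):
    print(f"{fps:>3} fps -> {jump_per_frame(pan, fps):.2f} deg jump per frame")
```

At 24 fps the jump is five times larger than at 120 fps, which is why higher frame rates make large, bright screens easier to watch.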
On the other hand… 24 frames per second cinema does
evoke a specific look, a certain stylization that helps the
viewer to suspend their disbelief. Certain costumes, sets, and effects that
hold up under 24 frames per second cinema, don’t look so good in higher frame rates.
And yes, I’m looking at you, Peter Jackson’s The
Hobbit: An Unexpected Journey. But at the same time, it’s not really
fair to point out the problems of the first film done in this way. It does add a level of realism that, I personally believe, will
benefit cinema in the end. But in order for it to get there we
have to look at the hurdles and overcome them, rather than just
saying “it’s too hard”. So, all things considered, that’s the
frame rate of the human eye. If you’re hoping for a simple answer, I
really hate to disappoint you. I’m disappointed too. But the
real answer is far more interesting than a single
number! For Tech Laboratories, I’m Tech Adams, saying Keep Thinking, and
thanks for watching! Don’t forget to like this video and
subscribe to Tech Laboratories for more mind-blowing videos on science and technology!
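All of the thresholds quoted in the video relate a duration to an equivalent rate through a simple reciprocal, which is a quick way to sanity-check the numbers above (note that 13 ms works out to about 77 Hz, which the video rounds to 75):

```python
# Every duration-to-rate conversion in the video is a reciprocal:
# a gap of t milliseconds corresponds to a rate of 1000 / t per second.

def equivalent_hz(gap_ms: float) -> float:
    """Refresh rate whose frame period equals the given gap, in Hz."""
    return 1000.0 / gap_ms

# Gap durations are the ones quoted in the video.
for label, gap_ms in [("younger group mean gap", 18),
                      ("older group mean gap", 22),
                      ("best participants' gap", 2),
                      ("image-recognition limit", 13)]:
    print(f"{label}: {gap_ms} ms -> about {equivalent_hz(gap_ms):.0f} Hz")
```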


  1. But honestly, in a video game on a normal screen, after around 50-75 FPS you can't really SEE a difference.

  2. Lights are actually flashing at 100 or 120 Hz, due to the voltage being applied crossing zero twice per cycle

  3. Actually, lights flicker at 100 or 120 times a second, when running off a 50 or 60 Hz mains AC supply, respectively. Each half cycle of the AC results in one cycle of light.

  4. This was interesting. When developing early cinema one of the concerns was that a faster frame rate meant that you needed more film. This would drive up costs for both the film stock itself and result in having to ship more reels of film to the theaters.

    Strobe lights are fun for playing with frames / flashes. It is also usually possible to tell if a moving object was either filmed or directly captured with video. The human eye seems to prefer the series of snapshots for moving things over the way video image sensors work. With direct video, progressive scanning and interlaced scanning have peculiar artifacts.

  5. TECHOOPS [0:55] Houselights on 50/60 cycles-per-second AC (Hz) flash 100/120 per-second and flicker on submultiples-thereof depending on nonlinearities, fluorescent ballast leakage…
    TECHOOPS [2:26] Low-latency maybe, but heavily filtered: What's the gradient time-constant?!
    TECHBUTS: What about overall bandwidth…color pixel rate (e.g. 4:2:2 specifications)…etc.
    TECHANDS: And then get into detail-resolution vs. gradient-resolution…and their, rates…
    P.S. Did you know that the very-center-point of the fovea is a rod (B&W) not a cone (color)…

  6. Wow, TY, before this video I thought the human eye could only see at roughly 24 to 25fps. It turns out, "photo receptor cells are our rods and cones" is as difficult to write as it is to speak! 🙂

  7. For framerate, the higher the better, but yes if you take cost of calculating and displaying you come up with a workable compromise..
    Things to consider that are often ignored, you mentioned the jerkiness at 24/30 frames per sec, that's a big limitation when filming scenes, and in interactive media it is a bit jarring. Also with sample and hold you'll likely end up watching a blurry mess even at 60hz. Those are bigger factors to perception than the limit to continuous motion or the minimum amount of time where you can see a variation in brightness.

  8. To me and this is my opinion. The frame rate to my eyes are 60fps but converted to 30fps. Let me explain.  You know how youtube can't play 60fps videos in Non HD but it looked smoother than a 30fps normal video with similar videoplay. Let this work out a little bit. I sometimes see my hand change to a different frame.

  9. Any super high frame rate has to take into account our AC power frequency. For 60 Hz, a 120 FPS would match each pulse of a neon and fluorescent lights. In Europe, that would be 100 FPS to match their 50 Hz power. Then again, 300 FPS would match both systems.

  10. Answer: We don't see in frame rate, or HFR, or whatever the fuck. We don't see in FR. That question has been asked many times before and has already been debunked due to the flabbergasting nature in of itself.

  11. Fighter pilots have been recorded spotting 1/255th of a frame (255 frames per second), and they could give a rough estimate as to what they've seen. It's a test every pilot needs to undergo.
    Other tests with Air Force pilots have shown that they could identify the plane in a picture that was flashed for only 1/220th of a second.

  12. I'd had a conversation with a friend when we were about to watch the last of the Hobbit films. It was about EXACTLY this very subject: 10:18

  13. When I have 6 stars Online on the Xbox One it's unbearable I think I get about 12 fps and sometimes I get killed because of the pauses the frames skips… really can't afford a good PC.

  14. This is one of the best science oriented videos I've ever seen on youtube. It had high quality production and just the right amount of jargon and detail to instill curiosity about the subject for further study while still providing a coherent, sensible answer.

  15. A few years ago, when 60fps became popular, many people actually said that the human eye can't see the difference

  16. 120hz removes the unattainable fantasy in movies for me; they become too similar to a news broadcast beyond 60hz

  17. That was an AMAZING video that answered ALL the questions I had on the subject. I have always known I was super sensitive to frame rate but didn't understand why there was such varieties between humans. I also read elsewhere that you can train your neuron system to perceive higher frame rates and that some fighter pilots are trained to have a sensitivity up to 255FPS.

    With that in mind, what I would like to be able to do is to train my brain to TEMPORARILY REDUCE my FPS sensitivity so that when I watch cinema/IMAX movies I don't get so bothered by the really obvious visible motion separation I experience, at least until the world upgrades to higher frame rate projections/screens etc.

  18. It's very strange, our visual cortex only fires 15 times a second yet it can register 60 fps… Shit just got weird.

  19. Very good explanation. I still can't put my finger on why I don't like video shot at higher frame rates though. With the gear I have, I'm limited to 60fps if I'm willing to shoot at 720p. I just don't like the look of it though. It should look more life like but doesn't. To me it looks more fluid than real life even when shot with a tripod. My thinking is it has to do with the frames themselves and the shutter speed of 1/125th of a second. That combined with what I've been conditioned to see from personal exposure over my life time. I would be interested in seeing some HD video shot at 120fps on a 120Hz display to see if it has the same odd look to it.

  20. it doesn't film in ∞ frames per second, silly. If it did, that would be an infinite amount of data for our brain to pass over every nothingth of a second.

  21. Great video. Back in the old days we played Quake anywhere from 10 fps and up depending on the hardware you had. The in-game default (or tic) was actually 72fps, which we achieved when the first Voodoo 3D accelerator cards arrived on the scene in about 1997. What a revolution that was! Soon we were at 100-120-160 – really as much as your hardware and CRT-monitors could muster.

    Personally, I could notice a distinct difference between 120 and 160fps/Hz. But I could never see a real difference between 160 and 200. So, I gathered that on my particular setup, using my particular game and display hardware and config, my perception limit was somewhere between 160 and 200 fps. Although in the end I settled on 120fps for practical reasons (resolution, synch, and just felt 'good enough').

    Oh and using old-fashioned CRTs we had no input lag or latency, but our old serial mouse was limited to 30Hz. When PS2 arrived it had a 60Hz limit AFAIR, but hacks soon came around that bumped that up to 200Hz. Later USB mouse hacks did 500Hz or even 1000Hz, but 500Hz became the standard in the high-end QW gaming community.
    Oh those were the days.. 🙂

  22. Interestingly skirts around the issue of the soap opera effect of higher frame rates. Exposes that Mr TechLaboratories brings his preferences/prejudices in to his vids. Unsub.

  23. One question that wasn't answered, and is very interesting, has to do with the speed at which the images reach the brain. It was touched on as recognition speed, and was roughly 13ms if I remember correctly. Now imagine driving in a race car: at what speed does it become irrelevant what you see at, say, 15 yds in front of you, since the speed now matches the speed of recognition, and what you see has already happened? For argument's sake, say it's 13ms; that speed would be roughly 400kph (250mph). Given the reaction time of a hooned racecar driver, even if something happened 50 yds away, by the time his brain had sent signals to his muscles to react to it, it would be too late. Excellent video by the way.
