Why all professional gamers care about Input Lag – What it is, how we define it, and how minimizing it can make you a much better player.

How do we define “Input Lag” in our reviews?

Most computer engineers would define a display’s input lag as the time between when the graphics card outputs an image and when that image is displayed on the screen. There are other factors that constitute “lag,” which we will discuss on this page, along with which ones you should be most concerned about when looking to purchase a display or device for gaming. 

It is worth noting that the display tests measure a different type of lag than the peripheral device tests, and that the phrase “Input Lag” has no formally defined measurement among computer scientists. Display input lag is concerned with the time required for the pixels to show an image after it has been processed and sent by the graphics card. This is different from the manufacturer’s quoted pixel response time (such as GTG, grey-to-grey, or BTW, black-to-white – more on both in a bit), which measures how quickly the pixels can change from one color to another. Pixel response times are what affect things like motion blurring and ghosting – also an important measure for gamers needing maximum performance in fast-paced games.  

The amount of lag varies greatly depending on the display, which is why there is a noted difference between manufacturers’ gaming displays and their normal ones for casual/business use. It has become commonplace for manufacturers to design gaming displays around the fastest response times and lowest input lag possible for the type of panel being used. The difference in lag comes down to the internals of a monitor, such as its internal circuit board and scaling chips, which determine its signal processing time and therefore its quoted input lag and responsiveness. Many manufacturers take active measures to reduce input lag in their gaming displays as much as possible, with many offering modes that bypass the scaling chips along with other options to reduce input lag further. 

Short answer: input lag is caused by a combination of two things – the signal processing delay caused by the monitor’s internal electronic board, and the response time of the pixels. 

How do we determine our quoted Input Lag time?  

“Input Lag” is made up of two parts. When a signal is sent from the GPU to a screen, the instant it hits the port on the back of the monitor/TV it begins what we define as “Input Lag”. From there, the signal passes through two measurable stages of lag:

  • Signal Processing Lag (the time the monitor takes to process the image after receiving it from the GPU). 
  • Pixel Responsiveness Lag (the time it takes for the pixels to change to show the image after it has been processed by the monitor’s internal circuit board). 

Because the pixel responsiveness time quoted by the manufacturer usually turns out to be quite accurate (after comparing our tests to their quoted times), on InputLag.com we focus on reporting that number.

How are our ratings determined?  

We only quote the manufacturer’s given response time, but to judge whether or not that figure is good, please see the chart below:

For 240Hz Monitors: 

  • Excellent) Less than 4.17ms / 1 frame lag at 240Hz – Would be great for competitive gaming at 240 FPS.
  • Good) A lag of 4.17 – 8.3ms / One to two frames at 240Hz – Still very good for pro gaming compared to lower refresh rate monitors.
  • Decent) A lag of more than 8.3ms / more than 2 frames at 240Hz – Still better than most TVs, though not suitable for competitive gaming at 240 FPS.

For 144Hz Monitors: 

  • Excellent) Less than 6.94ms / 1 frame lag at 144Hz – Would be great for competitive gaming at 144 FPS.
  • Good) A lag of 6.94 – 13.9ms / One to two frames at 144Hz – Still very good for pro gaming compared to lower refresh rate monitors.
  • Decent) A lag of more than 13.9ms / more than 2 frames at 144Hz – Still better than most TVs, though not suitable for competitive gaming at 144 FPS.

For 120Hz Monitors: 

  • Excellent) Less than 8.3ms / 1 frame lag at 120Hz – Would be great for competitive gaming at 120 FPS.
  • Good) A lag of 8.3 – 16.7ms / One to two frames at 120Hz – Still very good for gaming compared to lower refresh rate monitors.
  • Decent) A lag of more than 16.7ms / more than 2 frames at 120Hz – Still better than most TVs, though not suitable for competitive gaming at 120 FPS.

For 60Hz Monitors: 

  • Excellent) Less than 16.7ms / 1 frame lag at 60Hz – Great for gaming at 60 FPS.
  • Good) A lag of 16.7 – 33.3ms / One to two frames at 60Hz – Good enough for casual gaming.
  • Poor) A lag of more than 33.3ms / more than 2 frames at 60Hz – Not suitable for serious gaming at 60 FPS.

For 30Hz Monitors (not considered gaming monitors, really): 

  • Excellent) Less than 33.3ms / 1 frame lag at 30Hz – Good enough.
  • Good) A lag of 33.3 – 66.6ms / One to two frames at 30Hz – Noticeable at 30 FPS.
  • Poor) A lag of more than 66.6ms / more than 2 frames at 30Hz – Very noticeable/distracting lag. Early LCDs from the 1990s may have been this bad, but you won’t find them on the market now. 

I think you see the pattern here. 1 second = 1000 milliseconds. So, if a monitor is displaying a game at 144 frames per second, you can do the math to work out how long each frame lasts and therefore how much lag is acceptable.  

At 144Hz (and assuming a constant 144 FPS output from the GPU) you are getting: 
-144 frames per 1 second 
-144 frames per 1000 ms 
-1 frame every 6.94ms (1000 ÷ 144 = 6.944…, rounded)
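
If you want to see that arithmetic spelled out, here is a minimal sketch in Python. The tier names and thresholds simply restate the charts above; none of this is an official formula.

```python
# A sketch of the arithmetic behind the charts above.

def frame_time_ms(refresh_hz: float) -> float:
    """How long a single frame lasts at a given refresh rate."""
    return 1000.0 / refresh_hz

def rate_lag(lag_ms: float, refresh_hz: float) -> str:
    """Classify a lag figure the same way the charts on this page do."""
    frame = frame_time_ms(refresh_hz)
    if lag_ms < frame:
        return "Excellent (under 1 frame)"
    if lag_ms <= 2 * frame:
        return "Good (1 to 2 frames)"
    return "Decent/Poor (more than 2 frames)"

for hz in (240, 144, 120, 60, 30):
    print(f"{hz}Hz: 1 frame = {frame_time_ms(hz):.2f}ms")
# 240Hz: 4.17ms, 144Hz: 6.94ms, 120Hz: 8.33ms, 60Hz: 16.67ms, 30Hz: 33.33ms

print(rate_lag(5.0, 240))  # Good (1 to 2 frames) - one frame at 240Hz is only 4.17ms
print(rate_lag(5.0, 144))  # Excellent (under 1 frame) - the same 5ms fits inside a 144Hz frame
```

The same 5ms of lag lands in a different tier depending on the refresh rate, which is exactly why the charts are split up the way they are.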

If the monitor has one frame of lag, it won’t really be noticeable because 144 frames are being shown every second. However, on a monitor running at only 30 FPS, constantly losing one frame to input lag is going to be much more noticeable, and will put you at a significant disadvantage.  

Why don’t you use a Lag Tester tool such as Leo Bodnar’s lag tester? 

There are a few main reasons for this, the most notable being the lack of support for higher refresh rates. Leo’s tool is limited to 1080p and 60Hz, while many of the top gaming monitors are way ahead of this, pushing 240Hz nowadays. It would not be accurate to test an expensive top-shelf gaming monitor with a native 240Hz refresh rate using a tool that can only measure up to 60Hz. It forces you to measure the input lag of that 240Hz monitor at only 60Hz, which means the fastest it can possibly score is 16.67ms. Wait, what – the maximum visible display lag for 60Hz is 16.67ms? Yes. 

For instance, if a 240Hz monitor is rated “Excellent @ 240Hz”, it will still measure as “Excellent” at 60Hz – the 60Hz test simply cannot resolve anything finer. The refresh rate of a monitor has a direct impact on its input lag. A 60Hz monitor, for example, will never show a visible input lag below 16.67ms, because at 60Hz the screen is refreshed every 16.67ms (refreshed down to the bottom of the screen – see the next question for an explanation). So if the overall input lag time is really 15ms, it doesn’t matter, because the lag is shorter than the time it takes the screen to refresh the image, and it can’t be visibly measured. A 120Hz display halves that window to 8.3ms, and a 240Hz display further reduces it to 4.17ms. 
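
A simplified way to model this (our own simplification – real scan-out is continuous, but it captures why a 15ms figure disappears at 60Hz):

```python
import math

def visible_lag_ms(true_lag_ms: float, refresh_hz: float) -> float:
    """Simplified model: a change only becomes visible on a refresh, so the
    measurable lag snaps up to the next multiple of the frame time."""
    frame = 1000.0 / refresh_hz
    return max(1, math.ceil(true_lag_ms / frame)) * frame

print(round(visible_lag_ms(3, 60), 2))    # 16.67 - a genuinely fast 3ms display still reads 16.67ms at 60Hz
print(round(visible_lag_ms(15, 60), 2))   # 16.67 - and is indistinguishable from a 15ms display
print(round(visible_lag_ms(3, 240), 2))   # 4.17  - at 240Hz the same 3ms display resolves down to 4.17ms
```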

This is a huge distinction that needs to be made. The web is full of misinformation concerning quoted “Input Lag” times, which often only measure the time it takes for the pixels to change from black to white – more an indication of pixel response time than the true “input lag” gamers want to know about for an advantage.  

Should Leo Bodnar ever release an updated testing tool aimed at TVs and monitors, complete with support for 240Hz readings and fixes for the problems mentioned above, we would be one of his first customers.  

What is the Refresh Rate aka the Frame Rate?  

This is important to understand to get a good grasp of how monitors work and why there is always some variation in reported lag times. It also explains how some sites can quote an Input Lag time that is faster than the screen’s refresh time. LCD monitors refresh from top to bottom, as seen in the video below. 

High Speed Video of LCD Refresh:

Most people using Leo’s tool test the monitor and report the number from the middle of the screen, something that SMTT does as well. This is why the maximum visible display lag for 60Hz is 16.67ms, yet it often gets reported as lower for gaming displays running at 60Hz. If you test a 60Hz display with an exact input lag of 16.67ms (the time to refresh down to the last pixels at the bottom of the LCD panel), it should take roughly (just an estimate) 16.67ms / 2 = 8.335ms to produce a reading in the middle of the display.  
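
To put rough numbers on that top-to-bottom scan, here is a small sketch. It assumes the panel refreshes evenly from the first row to the last and ignores the vertical blanking interval, so treat the outputs as estimates.

```python
def time_to_row_ms(row: int, total_rows: int, refresh_hz: float) -> float:
    """Roughly when a given row is refreshed within one frame, assuming an
    even top-to-bottom scan and ignoring the vertical blanking interval."""
    return (row / total_rows) * (1000.0 / refresh_hz)

# A 1080-row panel at 60Hz:
print(round(time_to_row_ms(540, 1080, 60), 2))    # ~8.33ms to reach the middle of the screen
print(round(time_to_row_ms(1080, 1080, 60), 2))   # ~16.67ms to reach the bottom row
```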

Can you explain to me the different types of Input Lag? 

To start, let’s first visualize how a signal would travel through the computer to eventually be shown on the display: 

Output from Mouse/Keyboard/Controller -> Signal processed by Computer and Game -> Signal sent out to screen through HDMI/other cable (about 1/100th the speed of light [scienceline.ucsb.edu/getkey.php?key=2910]) -> monitor processes signal and sends to panel for display -> pixels on panel change to show image as intended.  

What is highlighted in red above (the monitor processing the signal and the pixels changing) is what we are measuring for the “Input Lag” of a display (TVs & Monitors). A tool that measures the screen going from white to black can only visibly capture those two stages combined, and only down to the limit of the screen’s refresh rate.  

What is highlighted in green is what we have the least amount of control over – it’s up to your gaming computer/console to determine how quickly the signal gets processed. See here for more info. 

What is highlighted in blue is what I consider to be Physical Input Lag – the time it takes for your mouse/keyboard/controller to send the signal to your computer. There is significant room for variation here, which is why this category is also so important to gaming.  

Why do most manufacturers use GTG and not BTW pixel responsiveness times?  

Simply put, GTG is a more realistic representation of how the pixels behave under normal gaming conditions. Grey is a unique color from a technical perspective: it sits halfway between white and black, meaning all of a pixel’s color channels are activated to some degree from one frame to the next. That differs from BTW, which tests the time it takes for all the color channels to go from fully off (black) to fully on (white).  

The SMTT tool also provides the ability to test the response time between any two colors, such as blue to green. That is a decently realistic representation of responsiveness compared to BTW, but GTG still wins for measuring responsiveness in real-world play. Think about any game you enjoy playing, pause it on a single frame, and look at all the pixel colors. Most likely it will be a wash of different colors, even in dark games. But when you dig into which colors are actually being shown, most of them are much greyer than you would think. Download a tool such as Digital Color Meter (comes with all Macs), or Digital Colorimeter for Windows, and sample different pixels of that frame to see for yourself. Even in brightly colored animated games, like South Park or something from Nintendo, the “bright” colors you see are nowhere near as vibrant as what the pixels can actually produce. It’s this distinction that needs to be made.  
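
If you would rather script this than click around with a color picker, here is a minimal sketch in Python. It assumes the Pillow library is installed and that you have saved a paused frame as frame.png – both the library choice and the filename are just our assumptions for the example.

```python
# Requires Pillow:  pip install Pillow
from PIL import Image, ImageStat

# Load a screenshot of a paused game frame (hypothetical filename).
img = Image.open("frame.png").convert("RGB")

# Average value of each channel across every pixel, on a 0-255 scale.
r, g, b = ImageStat.Stat(img).mean
print(f"Average R/G/B: {r:.0f} / {g:.0f} / {b:.0f}")

# Averages clustered around the middle of the range (rather than near 0 or
# 255) mean the frame is, on average, much closer to grey than to pure
# black or pure white.
```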

When you are testing BTW responsiveness, the white you see is all of the color channels activated 100%, and with black, they are activated 0% – with the backlight on for both, of course. This is a useful test, but it’s not something you will experience often while gaming (unless you are playing a strobe-light game or something).  

When you are testing GTG responsiveness, you are going between two different shades of grey – usually from the color channels being activated 40% to 60%, or 33% to 66%. This produces a more realistic test of what you will experience while gaming.  

In short, we are trying to measure the average color of all the pixels going from one frame to the next. No responsiveness test will be perfect, but any time you average many colors at once, grey is the best representation of the result.  

I heard something about Input Lag not mattering, since the human eye has its own lag of about 250ms?

This is absolutely false. It’s true the human eye has its own lag, and then there is the lag associated with reacting to what the eye sees (such as moving your thumb on a controller), but all of this takes place after the input lag described so far on this page. Factoring in this “human eye lag”, the full chain looks something like this:

  • Press a button on your controller, signal travels to computer ->
  • computer GPU/game processes your signal and sends the output image ->
  • signal travels through the cable to your monitor ->
  • monitor processes the signal (signal processing lag) and tells the pixels what to do ->
  • pixels change to show the image on the next refresh (pixel responsiveness lag) ->
  • human eye sees the change and recognizes what is going on (brain processing lag) ->
  • your thumb moves to react to the change you see on the screen (reaction time lag).

As you can see, the Input Lag of a display does matter. Any improvement in lag – whether it’s from the controller, the signal processing, the refresh rate, the computer itself, or even wearing the proper glasses – puts you at an advantage over someone who is on inferior equipment.
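
To put rough numbers on that chain, here is a minimal sketch. Every figure in it is purely illustrative – round numbers made up for the example, not measurements of any particular setup.

```python
# Purely illustrative numbers, in milliseconds - not measurements.
lag_chain_ms = {
    "controller/mouse (physical input lag)": 5,
    "computer/GPU/game processing": 20,
    "cable propagation": 0.001,
    "monitor signal processing": 5,
    "pixel response": 4,
    "eye + brain processing": 100,
    "reaction (moving your thumb)": 150,
}

total = sum(lag_chain_ms.values())
display = lag_chain_ms["monitor signal processing"] + lag_chain_ms["pixel response"]
print(f"Whole chain: ~{total:.0f}ms, of which the display contributes {display}ms")
# The display is a small slice of the chain, but unlike your eyes and
# reflexes, it is a slice you can actually buy your way out of.
```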

That being said, there is something to be said about the lag from human to human. Some of us are better at recognizing what is being displayed on a screen; some of us are better at reacting to the change on a screen. This is why a pro player will beat a casual player 10 times out of 10 in a fast-paced game – they have been practicing this skill, and may even have some sort of natural-born advantage (debatable). Either way, will shaving 10ms of lag off your game make you win every time? Probably not. But will shaving 10ms of lag off your game help you win more? Definitely.

What about cable speed – does this play a role in lag?

This is the question with the most technical jargon to comprehend. Data in a cable (in the form of an electrical signal) travels at about 1/100th the speed of light (source). So for any digital cable, we are talking about nanoseconds of lag – billionths of a second. The lag is so small, and so insignificant, that it makes no difference whatsoever to a professional gamer. The other components of Input Lag can be measured in thousandths of a second, so shaving off 10ms there is a whole hundredth of a second (it makes a difference). But cable speed is the most insignificant part of input lag by a large margin. What’s important to take away is that the monitor you purchase will typically include a cable that supports the highest refresh rate and resolution possible for that display. So this section shouldn’t matter too much, but for those interested, here is the information anyway.
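
Before the specs, a quick back-of-the-envelope check on that claim, taking the 1/100th-the-speed-of-light figure at face value and assuming a 2 m desktop cable:

```python
SPEED_OF_LIGHT_M_S = 299_792_458
signal_speed_m_s = SPEED_OF_LIGHT_M_S / 100   # the ~1/100th-of-c figure quoted above
cable_length_m = 2                            # a typical desktop cable (our assumption)

delay_s = cable_length_m / signal_speed_m_s
print(f"{delay_s * 1e9:.0f} ns ({delay_s * 1e3:.6f} ms)")
# ~667 ns, i.e. about 0.0007ms - thousands of times smaller than a single
# frame even at 240Hz (4.17ms).
```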

HDMI (Digital)

HDMI 1.0/1.1: Max resolution 1080p at 60Hz

HDMI 1.2/1.2a: Adds support for 1440p at 30Hz

HDMI 1.3-1.4b: Natively adds support for 1080p at up to 144Hz, 1440p at up to 75Hz, or 4K at 30Hz.

Adds support for 1080p at up to 240Hz (Y′CBCR with 4:2:0 subsampling), 1440p at up to 144Hz (Y′CBCR with 4:2:0 subsampling), 4K at up to 75Hz (Y′CBCR with 4:2:0 subsampling), or 5K at 30Hz (Y′CBCR with 4:2:2 subsampling).

HDMI 2.0-2.0b: Natively adds support for 1080p at up to 240Hz, 1440p at up to 144Hz, 4K at up to 60Hz, or 5K at 30Hz.

Adds support for 1440p at 240Hz (Y′CBCR with 4:2:0 subsampling), 4K at 75Hz (Y′CBCR with 4:2:2 subsampling), 4K at 120Hz (Y′CBCR with 4:2:2 subsampling), 5K at 60Hz (Y′CBCR with 4:2:0 subsampling), or 8K at 30Hz (Y′CBCR with 4:2:0 subsampling).

HDMI 2.1: Natively adds support for 1440p at up to 240Hz, 4K at up to 144Hz, 5K at up to 60Hz, or 8K at up to 30Hz.

Adds support for 4K at 240Hz (with Display Stream Compression (DSC)), 5K at 120Hz (with Display Stream Compression (DSC)), or 8K at up to 120Hz (with Display Stream Compression (DSC)).

DVI-D, A, or I (Digital or Analog or Both)

2560×1600 at 60Hz (dual-link)

VGA (Analog)

1920×1080 at up to 70Hz.

DisplayPort (Digital)

1.0-1.1a: Natively supports 1080p at up to 144Hz, 1440p at up to 75Hz, or 4K at 30Hz.

1.2-1.2a: Natively supports 1080p at up to 240Hz, 1440p at up to 165Hz, 4K at up to 120Hz, or 5K at 30Hz.

With chroma subsampling enabled, supports 1440p at 240Hz (Y′CBCR with 4:2:2 subsampling), 4K at 120Hz (Y′CBCR with 4:2:2 subsampling), 5K at 60Hz (Y′CBCR with 4:2:2 subsampling), or 8K at 30Hz (Y′CBCR with 4:2:2 subsampling).

1.3: Natively supports 1440p at up to 240Hz, 4K at up to 120Hz, 5K at up to 60Hz, or 8K at 30Hz.

With chroma subsampling enabled, supports 4K at 144Hz (Y′CBCR with 4:2:2 subsampling), 4K at 240Hz (Y′CBCR with 4:2:0 subsampling), 5K at 120Hz (Y′CBCR with 4:2:0 subsampling), or 8K at 60Hz (Y′CBCR with 4:2:0 subsampling).

1.4-1.4a: Natively supports the same specs as 1.3. Adds optional support for DSC to allow 4K at up to 240Hz, 5K at up to 120Hz, or 8K at 60Hz. With DSC and Y′CBCR 4:2:2 subsampling enabled, it allows for 5K at 240Hz and 8K at 120Hz. With DSC and Y′CBCR 4:2:0 subsampling enabled, it allows for 8K at 144Hz.
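
For the curious, the reason each cable generation tops out where it does comes down to raw bandwidth. Here is a rough sketch of the uncompressed data rate a given mode needs; it ignores blanking intervals and link encoding overhead, so real links need noticeably more headroom than these numbers suggest.

```python
def uncompressed_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
    """Approximate uncompressed video data rate in Gbit/s (8-bit RGB by
    default). Ignores blanking intervals and link encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"1080p @ 60Hz:  {uncompressed_gbps(1920, 1080, 60):.1f} Gbit/s")    # ~3.0
print(f"1440p @ 144Hz: {uncompressed_gbps(2560, 1440, 144):.1f} Gbit/s")   # ~12.7
print(f"4K @ 120Hz:    {uncompressed_gbps(3840, 2160, 120):.1f} Gbit/s")   # ~23.9
print(f"8K @ 60Hz:     {uncompressed_gbps(7680, 4320, 60):.1f} Gbit/s")    # ~47.8
# Chroma subsampling (4:2:2 / 4:2:0) and DSC exist precisely to squeeze
# rates like these into the bandwidth each cable generation actually has.
```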