
1080p and the Acuity of Human Vision

by Joe Cornwall last modified April 02, 2007
[Image: 1080p resolution]

"1080p provides the sharpest, most lifelike picture possible."   "1080p combines high resolution with a high frame rate, so you see more detail from second to second."  This marketing copy is largely accurate.  1080p can be significantly better that 1080i, 720p, 480p or 480i.  But, (there’s always a "but") there are qualifications.  The most obvious qualification: Is this performance improvement manifest under real world viewing conditions?  After all, one can purchase 200mph speed-rated tires for a Toyota Prius®.  Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however!  In the consumer electronics world we have to ask a similar question.  I can buy 1080p gear, but will I see the difference?  The answer to this question is a bit more ambiguous.

Measuring Human Vision

To fully understand the implications of high resolution and high definition, we must first explore the limitations of human vision.  The Dictionary of Visual Science defines visual acuity as "acuteness or clearness of vision, especially form vision, which is dependent on the sharpness of the retinal focus within the eye, the sensitivity of the nervous elements, and the interpretative faculty of the brain."  Simply put, our eyes have a resolution limit.  Image resolution beyond our ability to see it is simply an academic exercise; it can play no real part in improving the viewing experience.  Unlike hearing, visual acuity is unambiguous and relatively simple to measure.

Vision is measured using a few different tools.  The most familiar is called the Snellen chart.  Using this tool, an optometrist or physician asks you to read the "letters" on the chart from a standardized distance of twenty feet (six meters in countries that use the metric system).  The smallest line that can be read accurately defines the acuity of vision, which is expressed in a quasi-fractional manner.  20/20 means that a subject can read, from the prescribed twenty feet away, the line that defines normal vision.  20/10 means that same subject can read, from a distance of twenty feet, the line that a subject with "normal" vision could only read from ten feet.  20/10 vision is therefore twice as good as 20/20.  Similarly, 20/40 is half as good, with the subject being able to read at twenty feet what someone with normal vision could read at forty.
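
Expressed numerically, a Snellen score is simply the ratio of the two distances.  Here is a minimal Python sketch of that arithmetic (the function name and sample values are illustrative, not from the article):

    # Snellen notation as a ratio: D1/D2 means the subject reads at D1 feet
    # what a "normal" (20/20) observer reads at D2 feet.
    def snellen_to_decimal(test_distance_ft, normal_distance_ft):
        """20/20 -> 1.0 (normal), 20/10 -> 2.0 (twice as good), 20/40 -> 0.5."""
        return test_distance_ft / normal_distance_ft

    for denom in (10, 20, 40):
        print(f"20/{denom} -> {snellen_to_decimal(20, denom):.2f}x normal acuity")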

The next part of the puzzle is applying this understanding to a video display or another image composed of discrete elements.  Whether the eye can resolve the individual elements of an image depends on the angle they subtend: that angle is directly proportional to the size of the elements and inversely proportional to the distance from which they are viewed.  This relationship is best expressed in degrees.

It's common knowledge that people have a finite field of view, which is normally described by its upper limit.  Technically, this is the angular extent of the observable world that is seen at any given moment.  Roughly put, we can see things that fall within a known angle, with the apex being our nose.  Staring straight ahead, the average person has a stereoscopic field of view of about 100 degrees (not including peripheral vision, which extends the field to nearly 180 degrees).  In a similar manner, we have a lower limit to our field of view.  Scientists express this as an angle as well, but because that angle is less than a degree, we have to use the language of engineering and describe it in minutes of arc.

Everyone knows from high school geometry that a circle contains 360 degrees (360°).  For angles smaller than one degree we use arcminutes and arcseconds.  An arcminute is equal to one sixtieth (1/60) of a degree.  "Normal" visual acuity is considered to be the ability to recognize an optotype (a letter on the Snellen chart) when it subtends 5 minutes of arc.  We can most certainly see objects below this level, as this figure describes only our ability to recognize a very specific shape.  Taking this a step further, we find that the lower limit of "resolution" for average eyes equates to roughly one half the limit of acuity.  In other words, the average person cannot distinguish two spots (pixels, if you will) separated by less than about 2 arcminutes of angle.
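
To make this concrete, the angle subtended by a single pixel can be computed from the screen size and viewing distance and compared against the 2-arcminute figure above.  A rough Python sketch follows; the 50-inch diagonal and 8-foot distance are illustrative assumptions, not figures from the article:

    import math

    def pixel_arcminutes(diagonal_in, distance_ft, h_px=1920, v_px=1080):
        """Angle subtended by one pixel of a 16:9 panel, in arcminutes."""
        aspect = h_px / v_px                                  # 16:9 for 1080p
        height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # screen height
        pixel_in = height_in / v_px                           # one pixel pitch
        angle_rad = 2 * math.atan(pixel_in / (2 * distance_ft * 12))
        return math.degrees(angle_rad) * 60

    # Example: a 50" 1080p panel viewed from 8 feet subtends well under
    # 2 arcminutes per pixel, so adjacent pixels blend together.
    print(f"{pixel_arcminutes(50, 8):.2f} arcmin per pixel")

By this rough calculation, a pixel on a 50-inch 1080p panel subtends well under one arcminute at 8 feet; its pitch only approaches the 2-arcminute threshold when the viewer sits within roughly three feet of the screen.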


Recent Forum Posts:

agabriel posts on October 07, 2007 12:51
I should preface my post by saying I haven't had a chance to view 1080p; my TV runs in 720p mode and for the most part I like it. I also don't run brand new equipment.

So this weekend I upgraded my receiver. I went from a Denon 3801 to a 3803; like I said, I don't run new stuff and I wanted upconversion. I notice two differences, one of which is clearer audio and video (this really surprised me). I could never make out all of the audio in the Comcast Bengals commercial; I can now. The video also appears clearer and a bit brighter. So, with that said, I think the differences will become more apparent in the next few years. HDMI, despite being a spec for a few years, is in my opinion still young and only starting to come into its own. I think it takes the manufacturers and circuit designers a few iterations to really get it right. I think the differences I'm experiencing between the 3801 and the 3803 are perfect examples of this.

So, I think both sides of this topic are correct. I think with the right hardware people will see a difference.

Maybe I'm way off base, but that's my impression.

Anthony
Einomies posts on October 07, 2007 09:52
Just google "moire focus test for monitor tweaking"

I'm apparently not allowed to post URLs.
Kolia posts on October 06, 2007 08:39
Nice post Einomies.

Any idea where such a pattern can be found for testing my display?
Einomies posts on October 05, 2007 22:06
While discussing the ability of the human eye to see pixels at various distances, has anyone actually considered that humans have two eyes, and that visual acuity tests usually have you use only one eye?

We also have a brain that can compare the image from two slightly offset eyes and basically construct an image that is sharper and more detailed than what the individual eye can theoretically see. In fact, the brain will do it with one eye only, by constantly moving the eye around and constructing an image from the "snapshots" it has taken from different spots.

It's like the technique they use in amateur astronomy. You take one telescope, one standard webcam, and 50 photos of the moon, and let the computer calculate a high-res combination from the photographs. You can recover sub-pixel-size details by comparing the differences between the images - as long as they're not all taken from exactly the same spot, so there are differences between the pixels.

These theoretical figures don't really tell us what we can or cannot see when we set out to actually look at something.

One way of testing your personal viewing distance would be to display a moiré test grid on your LCD. It's a chessboard pattern with alternating black and white pixels.

A proper LCD with proper scaling won't have any problems showing that grid exactly the way it is. Here's the trick: when you look at it and move your head back and forth, you start seeing halo-like objects. These are actually the moiré patterns of your own retina. Move your head back until you stop seeing them and the image turns to a dull grey without any pattern.

That's where your real limit is. Though you might notice that if you look at the screen, you can still see the grid pattern in places. That's your two eyes and brain at work.
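
A pattern like the one Einomies describes is straightforward to generate. Below is a minimal Python sketch (the Pillow imaging library, the 1920x1080 size, and the output filename are assumptions on our part; match the size to your display's native resolution):

    from PIL import Image

    W, H = 1920, 1080  # set to your display's native resolution
    img = Image.new("L", (W, H))  # 8-bit greyscale
    # Alternate 0 (black) and 255 (white) in a one-pixel checkerboard.
    img.putdata([255 * ((x + y) & 1) for y in range(H) for x in range(W)])
    img.save("checkerboard.png")

Displayed 1:1 with no scaling, the grid should dissolve into a uniform grey at the distance where the eye can no longer resolve individual pixels.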
Jedi2016 posts on June 29, 2007 20:43
Mine falls right into the "benefit of 1080p starts to become noticeable" range, which sounds about right, because I have noticed differences between 720p and 1080p video (of the same source). While I can't necessarily make out more "detail" in the 1080p picture, it just looks sharper overall. When watched back-to-back, the 720p looks just a little soft by comparison.

That's what I see, anyway.