If the human eye were a digital camera, its “data sheet” would say that it has a pixel density of 60 pixels/degree at the fovea (the part of the retina where visual acuity is highest). This is called eye-limiting resolution. It means that if an image with 3600 pixels (60 x 60) fell on a 1° x 1° area of the fovea, a person would not be able to tell it apart from an image with 8100 pixels (90 x 90) falling on the same 1° x 1° area.
Note 1: the 60 pixels per degree figure is sometimes expressed as “1 arc-minute per pixel”. Not surprisingly, an arc-minute is an angular measurement defined as 1/60th of a degree.
Note 2: this kind of calculation is the basis for what Apple refers to as a “retina display”: a screen that, when held at the right distance, would generate this kind of pixel density on the retina.
If you have VR goggles, you can calculate the pixel density – how many pixels per degree they present to the eye – by dividing the number of pixels in a horizontal display line by the horizontal field of view provided by the eyepiece. For instance, the Oculus DK1 (yes, I know that was quite a while ago) had 1280 x 800 pixels across both eyes, so 640 x 800 pixels per eye. With a monocular horizontal field of view of about 90 degrees, it had a pixel density of 640 / 90, just over 7 pixels/degree. Not to pile on the DK1 (it had many good things, though resolution was not one of them), but 7 pixels/degree is the linear pixel density. When you think about it in terms of pixels per unit area, it is not just 8.5 times worse than the human eye (60 / 7 ≈ 8.5) but actually a lot worse: 8.5 x 8.5, which is over 70 times.
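The DK1 arithmetic above can be sketched as a quick calculation (the numbers come from the text; the function name is illustrative, and the exact gap figures differ slightly from the rounded 60 / 7 in the prose because 640 / 90 is about 7.1, not 7):

```python
# Linear pixel density: horizontal pixels per eye divided by monocular FOV.
def pixels_per_degree(pixels_per_eye: int, fov_degrees: float) -> float:
    return pixels_per_eye / fov_degrees

EYE_LIMIT = 60.0  # approximate eye-limiting resolution, pixels/degree

dk1 = pixels_per_degree(640, 90)   # Oculus DK1: ~7.1 pixels/degree
linear_gap = EYE_LIMIT / dk1       # ~8.4x worse than the eye, linearly
areal_gap = linear_gap ** 2        # ~71x worse per unit of visual area

print(f"DK1: {dk1:.1f} px/deg, "
      f"{linear_gap:.1f}x linear gap, {areal_gap:.0f}x areal gap")
```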
The following table compares pixel densities for some popular consumer and professional HMDs:
|HMD|Horizontal Pixels (per eye)|Approximate Horizontal Field of View (degrees per eye)|Approximate Pixel Density (pixels/degree)|
|---|---|---|---|
|Sensics zSight 1920|1920|60|32.0|
Higher pixel density allows you to see finer details – read text; see the grain of the leather on a car’s dashboard; spot a target at a greater distance – and in general contributes to an increasingly realistic image. Historically, one of the things that separated professional-grade HMDs from consumer HMDs was that the professional HMDs had higher pixel density.
Let’s simulate this using the following four images. Let’s assume that the first image, taken from Unreal Engine’s Showdown demo, is shown at full 60 pixels/degree density. We can then re-sample it at half the pixel density – simulating 30 pixels/degree – and then half again (resulting in 15 pixels/degree) and half again (7.5 pixels/degree). Notice the stark differences as we go to lower and lower pixel densities.
Full resolution (simulating 60 pixels/degree)
Half resolution (simulating 30 pixels/degree)
Simulating 15 pixels/degree
Simulating 7.5 pixels/degree
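The re-sampling above can be sketched in a few lines. This is a minimal illustration, not the exact method used to produce the images: it block-averages 2x2 pixel groups, halving the linear pixel density at each step, on a synthetic patch standing in for a 1° x 1° area at 60 pixels/degree:

```python
import numpy as np

def halve_resolution(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels, halving the linear pixel density."""
    h, w = img.shape[:2]
    h2, w2 = h - h % 2, w - w % 2          # crop to even dimensions
    img = img[:h2, :w2]
    return img.reshape(h2 // 2, 2, w2 // 2, 2, -1).mean(axis=(1, 3))

# Synthetic 60x60 RGB patch: one degree by one degree at 60 pixels/degree.
full = np.random.default_rng(0).random((60, 60, 3))

densities = {60: full}
for d in (30, 15):                          # another halving would give 7.5
    densities[d] = halve_resolution(densities[d * 2])

for d, img in densities.items():
    print(f"{d} px/deg -> {img.shape[0]} x {img.shape[1]} pixels")
```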
Higher pixel density for the visual system is not the same as higher pixel density for the screen because pixels on the screen are magnified through the optics. The same screen could be magnified differently with two different optical systems resulting in different pixel densities presented to the eye. It is true, though, that given the same optical system, higher pixel density of pixels on the screen does translate to higher pixel density presented to the eye.
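To make the optics point concrete, here is a hypothetical sketch: the same 1920-pixel-wide display viewed through two different eyepieces (the FOV numbers are illustrative, not from any particular product):

```python
# One screen, two hypothetical optical systems with different magnification.
SCREEN_PIXELS_HORIZONTAL = 1920  # horizontal pixels per eye on the display

def eye_pixel_density(screen_pixels: int, fov_degrees: float) -> float:
    """Pixel density presented to the eye, in pixels/degree."""
    return screen_pixels / fov_degrees

# Narrower-FOV eyepiece: the same pixels span fewer degrees.
print(eye_pixel_density(SCREEN_PIXELS_HORIZONTAL, 60))   # 32.0 px/deg
# Wider-FOV eyepiece: the same pixels spread over more degrees.
print(eye_pixel_density(SCREEN_PIXELS_HORIZONTAL, 100))  # 19.2 px/deg
```

The same screen thus yields very different pixel densities at the eye depending on the optics, which is exactly the distinction the paragraph above draws.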
As screens get better and better, we will get increasingly closer to eye-limiting resolution in the HMD and thus to essentially photo-realistic experiences.
The post Understanding Pixel Density and Eye-Limiting Resolution appeared first on Sensics.