Resolving (or complicating) the iPhone resolution debate

On Tuesday I posted that the iPhone 4’s display falls far short of the resolving power of the human retina. But yesterday over at the Bad Astronomy blog, Phil Plait argued the opposite:

Jobs wasn’t falsely advertising the iPhone’s capabilities at all. … But a lot of people read the headlines and it taints their view; someone reading that article may be more likely to think Jobs, once again, has overblown a product to excite people. He didn’t.

Plait was talking about an article on Wired, which cited display expert Raymond Soneira, who argued that 477 ppi would be necessary to match the human retina at a distance of 12 inches.

Meanwhile, I was arguing for 1060 ppi, at 10 inches! Who’s right?

In a way, we all are. Again, I think the Clarkvision site has the clearest explanation of the research:

Blackwell (1946) derived the eye’s resolution, which he called the critical visual angle as a function of brightness and contrast. In bright light (e.g. typical office light to full sunlight), the critical visual angle is 0.7 arc-minute (see Clark, 1990, for additional analysis of the Blackwell data). The number above, 0.7 arc-minute, corresponds to the resolution of a spot as non-point source. Again you need two pixels to say it is not a point, thus the pixels must be 0.35 arc-minute (or smaller) at the limit of visual acuity.

In other words, at 0.7 arc-minutes you can tell that a light source is not a point, so a 0.7-arc-minute pixel would be too big. Therefore, Clark argues, for a pixel to match the resolution of the retina it must be half that size: 0.35 arc-minutes. Soneira, meanwhile, says a pixel need only be 0.6 arc-minutes. That makes some sense as far as it goes: 0.6 arc-minutes is below the 0.7-arc-minute threshold, so such a pixel is indistinguishable from a point, and on his account you don’t have to go as small as 0.35 arc-minutes.
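
To see how these angular thresholds translate into pixel densities, here’s a rough sketch of the arithmetic in Python (nothing more than the small-angle conversion from arc-minutes and viewing distance to pixels per inch; the exact outputs depend on how you round the thresholds):

    import math

    def ppi(arcmin_per_pixel, distance_inches):
        """Pixels per inch needed for one pixel to subtend the given angle."""
        theta = math.radians(arcmin_per_pixel / 60.0)  # arc-minutes -> radians
        return 1.0 / (distance_inches * math.tan(theta))

    print(ppi(0.6, 12))   # Soneira's criterion at 12 inches: ~477 ppi
    print(ppi(0.35, 10))  # Clark's criterion at 10 inches: ~980 ppi

Soneira’s 477 ppi figure falls straight out of the 0.6-arc-minute criterion at 12 inches. Clark’s 0.35-arc-minute criterion at 10 inches lands near 1000 ppi, the same ballpark as the 1060 figure I argued for; either way, it’s roughly triple the iPhone 4’s 326 ppi.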

But Clark cites other research offering converging evidence that we can distinguish pairs of objects down to a separation of 0.7 arc-minutes. That means single objects (pixels) need to be half that size. Plait counters with a Wikipedia link putting the limit at 1.2 arc-minutes. Who’s right?

It’s hard to say, but in any case, these studies all measure something very different from what you actually use a display for. They test your ability to see lines, dots, or other very simplified figures. Why not study actual pictures, or actual text?

In fact, Clark did exactly that. He had viewers sort photographs printed at 150, 300, and 600 ppi, and they could reliably order them from lowest to highest resolution. If viewers can tell a 300 ppi print from a 600 ppi one, they are detecting detail finer than the iPhone 4’s 326 ppi. So under realistic viewing conditions, 326 ppi clearly isn’t the highest resolution the human eye can detect.
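
Another way to frame it: given the iPhone 4’s 326 ppi, how far away would you have to hold the screen before a single pixel shrinks below each proposed threshold? A quick sketch, inverting the same small-angle conversion as above:

    import math

    IPHONE_PPI = 326  # iPhone 4 display density

    def min_viewing_distance(arcmin_per_pixel, ppi=IPHONE_PPI):
        """Distance (inches) beyond which one pixel subtends less than the angle."""
        theta = math.radians(arcmin_per_pixel / 60.0)
        return (1.0 / ppi) / math.tan(theta)

    for arcmin in (1.2, 0.6, 0.35):
        print(arcmin, round(min_viewing_distance(arcmin), 1))
    # ~8.8 in at 1.2 arc-min (Plait), ~17.6 in at 0.6 (Soneira), ~30.1 in at 0.35 (Clark)

On Plait’s 1.2-arc-minute threshold, the display is “retina” at a typical 10- to 12-inch viewing distance; on Soneira’s or Clark’s, you’d have to hold it at 18 or 30 inches, respectively.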

Calling the new iPhone display the “Retina Display,” therefore, is an exaggeration at best.


3 Responses to Resolving (or complicating) the iPhone resolution debate

  1. Daniel says:

    Hey Dave,
    I posted a link to this article on Phil’s blog too, because my big problem with his post is the erroneous use of 20/20 as average human visual acuity. 20/20 is Snellen’s cut-off for normal vision (if it’s better than that, there’s no need to attempt correction). Elliott, Yang, and Whitaker (1995, link: http://bit.ly/aYfJ8N) reported an average visual acuity of 20/15 for 19- to 24-year-olds, and average acuity was better than 20/20 up to age 75. Figure 2 in that paper compares their results to four other studies; across all of them, average visual acuity was better than 20/20 up to about age 55.

    For everyone under 55, Jobs’s claim is an exaggeration.

    And Phil’s post is spreading the misuse of 20/20 as average visual acuity (see the Engadget and Slashdot posts). *sigh*

  2. Bob Calder says:

    Dave, when we print a 600 dpi image, what gets laid down on the paper is anything but 600 of anything. The paper is incredibly coarse at that resolution, and the printing device isn’t up to much either. I don’t know how small the actual dot size can get, but I have prepared enough images for print to doubt that print technology offers a solution to your human-eye resolution problem.

  3. dave says:

    Good points, Bob. But we’re actually talking about a 1200 × 2400 dpi printer printing a 600 ppi digital image. What viewers could distinguish was a print made from the 600 ppi original versus one made from the 300 ppi original.

    I agree, though, that print isn’t ideal. Better might be to optically reduce the pixels of a digital display (since we don’t currently have displays at such high resolutions) and see where the threshold of detection lies.

    In either case, though, the question is, “can the human eye tell the difference?” The answer, in the case of printed 600 ppi images, is yes.
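
    And on Daniel’s point above: here’s a quick sketch of what 20/15 acuity implies. It assumes the textbook Snellen definition (20/20 vision resolves detail 1 arc-minute wide, so 20/15 resolves 15/20 = 0.75 arc-minutes) and, generously, counts a single pixel as the finest resolvable detail rather than applying the factor-of-two argument from the post:

        import math

        def snellen_arcmin(test_distance_ft, letter_distance_ft):
            """Minimum angle of resolution, in arc-minutes, for a Snellen score.
            Assumes the standard definition: 20/20 resolves 1 arc-minute."""
            return letter_distance_ft / test_distance_ft

        def ppi_needed(arcmin, distance_inches):
            # Pixels per inch at which one pixel subtends the given angle
            theta = math.radians(arcmin / 60.0)
            return 1.0 / (distance_inches * math.tan(theta))

        acuity = snellen_arcmin(20, 15)      # 20/15 vision: 0.75 arc-minutes
        print(ppi_needed(acuity, 12))        # ~382 ppi at 12 inches, above 326

    So even on that most generous reading, average 20/15 acuity out-resolves 326 ppi at a foot away.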
