I've heard various claims about just how fine the resolution of the human eye is. Wikipedia gives a theoretical maximum of 1.2 arcminutes (60 arcmin = 1 degree = 1/360th of a circle) with a practical limit around 1.7 arcmin; true 20/20 vision is the ability to recognize patterns at a 1 arcmin line width.
(That said, there are certain situations where the human eye can detect much finer detail. Stars only 0.1 arcseconds (1/600 arcminute) across can be seen with the naked eye because of their high brightness. Lines against contrasting backgrounds are also more visible; in clear air, one can make out a 2-inch (5 cm) wide power line from a distance of several miles. At 2 miles (3.2 km), this corresponds to a resolution of about 1/20 arcminute, or 3 arcseconds.)
Assuming the resolution of the human eye is 1 arcminute for normal images, then, what resolution does a computer screen need? I personally sit 24 inches from my computer screen (laptop, on my lap); at that distance, one minute of arc is equal to 0.18 millimeters. Given that in the worst-case scenario you need two rows of RGB pixels to produce one color pixel, this means a pixel size of 0.09 mm - equal to about 280 pixels per inch.
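A quick back-of-the-envelope check in Python (the `lifelike_ppi` helper and its parameters are my own names, nothing standard):

```python
import math

def lifelike_ppi(viewing_distance_in, arcmin=1.0, rgb_rows=2):
    """Pixels per inch needed so one perceivable color pixel spans
    `arcmin` minutes of arc, assuming `rgb_rows` pixel rows per color
    pixel (2 is the worst case used above)."""
    # Length subtended by one arcminute at the viewing distance, in inches
    arcmin_size = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return rgb_rows / arcmin_size

print(round(lifelike_ppi(24)))  # ~286 ppi, close to the ~280 figure above
```

Doubling the viewing distance halves the required density, which is why TV numbers later come out so much lower.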
My personal display is about 7.5" by 12" - equal to 2100 by 3360 pixels at this "lifelike" resolution. (In reality, it's 800 by 1280 - 38% of the resolution.) The largest displays right now are usually 1920 x 1200 pixels, and about 24" by 15". A life-resolution image on one of these giant desktop monitors would have to be 4200 by 6720 pixels - a whopping 28 megapixels. (All file sizes given here assume JPEG compression; multiply by about 4 for PNG.)
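To reproduce those pixel counts, a sketch (display sizes taken from the paragraph, 280 ppi from the estimate above):

```python
displays = [
    ("laptop",  12.0,  7.5),   # width x height in inches
    ("desktop", 24.0, 15.0),
]
PPI = 280  # the "lifelike" pixel density estimated above

for name, w_in, h_in in displays:
    w_px, h_px = round(w_in * PPI), round(h_in * PPI)
    print(f"{name}: {w_px} x {h_px} px = {w_px * h_px / 1e6:.1f} MP")
# laptop: 3360 x 2100 px = 7.1 MP
# desktop: 6720 x 4200 px = 28.2 MP
```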
Which, in terms of cameras nowadays, isn't that much. You can find 14-megapixel cameras for under $300, and Canon offers a 10-megapixel camera for $100. (Nikon will sell you a top-of-the-line DSLR with 24.5 MP if you have four figures to spare.) But the problem lies in the file size. A good estimate is one megabyte per three megapixels for an image with a reasonable amount of detail - so that giant 4200x6720 image will be almost 10 megabytes. (70 of them will fill a CD.) Even the laptop-size image is nearly 2.5 megabytes.
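The file-size rule of thumb (about 1 MB per 3 MP of JPEG, as estimated above) in code:

```python
def jpeg_mb(width_px, height_px, mp_per_mb=3):
    """Rough JPEG file size via the ~1 MB per 3 MP rule of thumb."""
    return width_px * height_px / 1e6 / mp_per_mb

print(f"desktop: {jpeg_mb(6720, 4200):.1f} MB")  # ~9.4 MB
print(f"laptop:  {jpeg_mb(3360, 2100):.1f} MB")  # ~2.4 MB
```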
(It should be mentioned that, sitting 10 feet from a TV, 0.5 mm pixels suffice for reality. Few HDTVs offer pixels this small, but any laptop does. A life-resolution image for such a 60" non-widescreen (36" by 48") TV would be only 1800x2400 - within reach of almost any digital camera on the market - and just 1.5 megabytes in size, equal to 90 MB/s for 60 fps HDTV. 1080p HDTV is just 1080 pixels tall, or only about 60% of the resolution of life.)
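The TV bitrate works out the same way; a sketch of the arithmetic (a raw stream of JPEG-sized frames under the ~1 MB per 3 MP rule of thumb, which lands slightly under the rounded figure above):

```python
w_px, h_px, fps = 2400, 1800, 60        # life-resolution 60" 4:3 TV
mb_per_frame = w_px * h_px / 1e6 / 3    # ~1 MB per 3 MP of JPEG
print(f"{mb_per_frame:.2f} MB/frame, "
      f"{mb_per_frame * fps:.0f} MB/s at {fps} fps")
# 1.44 MB/frame, 86 MB/s at 60 fps
```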
So, this brings us to the real question: what is the resolution of life? Well, let's assume the human field of view is 180 degrees by 180 degrees (it's actually a bit smaller, but this makes the numbers easier). From left to right, you'll need 10800 arcminutes - 10800 pixels. Thus, at best, a human can view a 10800 x 10800 image. Any image larger than that is unnecessary unless it can be zoomed; higher resolution cannot be detected by the human eye.
Now, recall that that's on a half-sphere. If half the circumference is 10800 arcminutes - or 10800 pixels - then the radius would be 3438 pixels. For a hemisphere, surface area A = 2πr², or about 74.3 megapixels.
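For the skeptical, the hemisphere arithmetic spelled out:

```python
import math

half_circumference = 180 * 60        # 180 degrees at one pixel per arcminute
r = half_circumference / math.pi     # pi * r equals half the circumference
area = 2 * math.pi * r ** 2          # hemisphere surface area, in pixels
print(f"r = {r:.0f} px, A = {area / 1e6:.1f} MP")
# r = 3438 px, A = 74.3 MP
```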
That's the magic number, then: 74.3 megapixels - the full resolution of human vision.
The Boston Projection