Two and a half years ago or so, I wrote a "First Impressions" article about the Canon D-60 for Michael Reichmann's Luminous-Landscape web site. The general theme of the article was that, with the D-60, digital cameras had pretty well caught up, quality-wise, with 35 mm colour film. Indeed, I felt that photos taken with the D-60 held some of the qualities of medium format colour film images. At the time, there were still many photographers who were yet to be convinced that digital technology could ever be as good as 35 mm film. There are fewer doubters today. 6-Megapixel (and higher) SLRs have essentially taken over the niche once occupied by 35 mm film SLRs, and even a piece of the medium format market.
Now, two and a bit years later, we have affordable 8-Megapixel cameras. Where is photography going?
Before talking about the future, let's try to understand the situation today.
Digital SLRs with APS-sized sensors (approximately 15 by 22 mm) are generally giving us the quality we previously expected from full-frame (24 by 36 mm) colour film. Resolutions are comparable, and the noise/grain factor leans in favour of the smaller digital sensor. Although ultimate resolution is comparable, there are differences in the character of that resolution. Film exhibits a gradual reduction of contrast with increasing resolution, and it is very difficult to put an absolute upper limit on its resolution: if we are prepared to accept a low level of contrast, we can claim higher resolution. Digital sensors are different. There is an absolute upper limit on resolution set by the fixed spacing of the photo-sensitive sites on the sensor chip. There can be no resolution higher than that allowed by the physical design of the chip. Below that limit, contrast can be quite high - higher than for film - but above that limit there is nothing. This means our digital images often demonstrate a clear characteristic that tells us "this is a digital image". If we have a scene with gradually increasing detail in some direction - a picket fence extending into the distance, for example - we see good rendition of detail up to some point, and then there is a sharp transition into visual mush. Suddenly the individual pickets disappear and we have a uniform grey, solid fence. Similar effects can be seen with grass, tree bark and shingled roofs. Personally, I find this effect slightly disturbing. I prefer the way film handles detail.
Here's an example of the Picket Fence Effect mentioned above. We can plainly see the vertical bars of the fence to the right of the two men standing together, but not to the left. (This is a pixel-for-pixel portion of a photo taken with a Canon S50.)
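That sharp cutoff is just the Nyquist limit set by the pixel spacing. As a rough sketch (the pixel count and sensor width below are typical of a 6-Megapixel APS-sized sensor, not the specification of any particular camera), the finest detail a sensor can record is one line pair per two pixels:

```python
# Rough sketch: the Nyquist limit of a digital sensor.
# A sensor cannot record detail finer than one line pair per two pixels.
# The numbers below are typical of a 6-Megapixel APS-sized sensor,
# not the specification of any particular camera.

def nyquist_lp_per_mm(pixels_across: int, sensor_mm: float) -> float:
    """Highest recordable spatial frequency, in line pairs per mm."""
    pixel_pitch_mm = sensor_mm / pixels_across
    return 1.0 / (2.0 * pixel_pitch_mm)

limit = nyquist_lp_per_mm(pixels_across=3000, sensor_mm=22.0)
print(f"Nyquist limit: about {limit:.0f} lp/mm")
# Detail coarser than this can be rendered at high contrast;
# detail finer than this collapses into visual mush.
```

Film, by contrast, has no such hard ceiling: contrast simply falls off gradually as the detail becomes finer.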
Similarly, digital noise possesses a character different from film grain. It took me a while to notice it, but having once identified colourful digital noise in an image, it was easy to find it again. I can still see it in EOS-20D images at the lowest ISO setting. In this case, however, I prefer the digital noise to film grain. And camera manufacturers have worked hard to minimize this effect. The Canon EOS 20D offers significantly less digital noise than did the D-60.
Colour rendition is another factor where film and digital sensors differ. I'm not so much concerned with the accuracy of the rendition of large colour areas as I am with the rendition of small coloured detail. This is the area where I used to feel that medium format had 35 mm beaten hands down. Not only did medium format images deliver a smooth rendition of colour, they also held the colour of small details well. I don't know why, but film seems to have difficulty determining the colour of small objects. Typical digital sensors have an obvious technical difficulty here: with most sensors it takes a minimum of three sensor sites to determine the colour of a pixel. The exception is, of course, the three-layer "Foveon" sensor used in cameras like the Sigma SD9/10. Nevertheless, the digital sensor seems generally better than film in this area. And this advantage, along with the noise issue, gives 6-Megapixel digital images some of their medium-format cleanliness. There can be other related problems for digital sensors; usually they show up as moire effects. The problem arises when the optical system has higher resolution than the sensor and the subject holds fine detail. Digital camera designers try to minimize moire effects using some form of anti-aliasing (intentional blurring) filter immediately in front of the sensor.
I used to wonder if it was the film or the lens that resulted in differences between 35 mm and medium/large format photography. I guess now it is pretty obvious that it is the film characteristics that account for the difference. The same 35 mm lenses are now providing us with images exhibiting some medium-format character using an image area smaller than full-frame 35 mm film.
So, it looks as if digital sensors, area for area, are better than traditional film. Just how great this advantage is can be demonstrated by the point-and-shoot digital cameras. Whereas the digital SLRs we have been discussing use APS-sized sensors, the point-and-shoot cameras typically use much smaller sensors (8 by 11 mm) to generate images with 5 Megapixels, or 10 by 13.5 mm sensors to generate 8 Megapixels. Apart from a noise penalty, these images are surprisingly good: approaching the quality of the digital SLRs. These tiny sensors correspond to what we used to call "subminiature" format, as exemplified by the Minox subminiature camera (10 by 14 mm). But the digital sensors are producing images of a quality far superior to traditional subminiature film cameras. Indeed, the resolution of these digital sensors is roughly three times that of generally available colour films. This is not the only reason for the resurgence of the subminiature format, however. The precision required of the traditional camera and photographic enlarging printer increases significantly as the camera becomes smaller. Whereas wood construction - with physical dimensions that change with humidity - is adequate for large format cameras, even sheet-metal construction was found to be inadequately stable for 35 mm cameras: 35 mm camera bodies need to be cast metal or some other stable material. And a subminiature camera would need something even more stable in order to be optically competitive with 35 mm cameras. Keeping film flat and in the proper location has been a problem for all formats. The same factors also apply to the enlarging printers. It is not surprising that subminiature cameras could compete only where small camera size was important, and then only with a penalty in picture-taking performance. The digital sensor changes that, especially when coupled with electronic (automatic or manual) focusing.
If one is using the sensor itself to determine focus, it really does not matter that the camera body might change dimensions a bit from hour to hour or day to day: we can compensate. We simply optimize the focus for the conditions that currently exist. As for printing, with the digital camera the same printer will work for all formats. Once the image is captured digitally, the precision needed for the mechanics of the printer depends upon print size, not camera size. With digital cameras, all formats are created equal when it comes to printing the final image. So, for the digital subminiature camera, four problems have been resolved: we have a sensor with resolution higher than traditional film; minor dimensional-stability problems can be compensated; so can sensor location; and the mechanical precision required for print reproduction is the same as for any other format.
You may have noticed that in much of this discussion I have been careful to address my comments to colour film. Colour film has always suffered from resolution inferior to what could be achieved with black and white film. Generally, colour film has had about one-third the resolution of B&W. Traditional B&W film can still out-perform the current digital SLRs. But note that the point-and-shoot cameras are using sensors with resolutions comparable to B&W films!
Another important factor in digital imaging is the flexibility permitted by digital image editing. Today, using computer applications like Photoshop, we can correct inappropriate perspective, correct colours and, in the latest version, correct for some aspects of lateral colour aberration (by slightly changing the image magnification applied to each of the three primary colours). Other image editors can correct for barrel and pincushion distortion.
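To illustrate the magnification trick (this is a sketch of the principle only, not Photoshop's actual algorithm; the function names and the crude nearest-neighbour resampling are my own simplifications), the red and blue channels are resampled at slightly different magnifications so that their detail lines up with the green channel:

```python
import numpy as np

def scale_channel(channel: np.ndarray, scale: float) -> np.ndarray:
    """Resample one colour channel about the image centre by a small
    magnification factor, using nearest-neighbour sampling."""
    h, w = channel.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.indices((h, w))
    # Source coordinates: divide by the scale so the output is magnified.
    src_y = np.clip(np.round(cy + (ys - cy) / scale), 0, h - 1).astype(int)
    src_x = np.clip(np.round(cx + (xs - cx) / scale), 0, w - 1).astype(int)
    return channel[src_y, src_x]

def correct_lateral_ca(rgb: np.ndarray, r_scale: float, b_scale: float) -> np.ndarray:
    """Rescale the red and blue channels relative to green so that
    coloured fringes at image edges collapse back onto the detail."""
    out = rgb.copy()
    out[..., 0] = scale_channel(rgb[..., 0], r_scale)
    out[..., 2] = scale_channel(rgb[..., 2], b_scale)
    return out
```

In practice the scale factors differ from 1 by only a fraction of a percent, and real implementations interpolate rather than snap to the nearest pixel.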
So that is the present; what does this mean for the future?
Well, the most obvious thing is that 16-Megapixel full-frame sensors still fall short of what is in principle possible. If we scale up those point-and-shoot sensors to full-frame 35 mm size, we have something like 50 Megapixels. That would be roughly 25 Megapixels for an APS-sized sensor. These numbers represent "large-format" quality, but in a 35 mm sized camera! And pixel densities can probably go even higher. Never mind that those point-and-shoot sensors currently have higher noise levels than larger sensors: when this many pixels make up the image, the noise will probably be too fine-grained to be noticeable in typical images.
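The arithmetic here is simple area scaling of the pixel density, using the approximate sensor dimensions quoted above:

```python
# Scale the pixel density of an 8-Megapixel point-and-shoot sensor
# (about 10 by 13.5 mm) up to full-frame 35 mm (24 by 36 mm).
# Dimensions are the approximate figures quoted in the text.

def scaled_megapixels(mp: float, from_mm: tuple, to_mm: tuple) -> float:
    """Megapixels at the same pixel density on a different sensor size."""
    area_from = from_mm[0] * from_mm[1]
    area_to = to_mm[0] * to_mm[1]
    return mp * area_to / area_from

full_frame = scaled_megapixels(8.0, (10, 13.5), (24, 36))
print(f"Full-frame 35 mm at point-and-shoot density: about {full_frame:.0f} Megapixels")
```

The same function applied to an APS-sized sensor gives a correspondingly smaller, but still very large, pixel count.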
Based on this simple analysis, I expect that "large format" will eventually mean the equivalent of full-frame 35 mm, and we will see miniaturized (SLR-sized) cameras and lenses with the optical flexibility of the view camera. Never mind that the 'ground-glass image' will be so small; we will view the image on a computer screen or some such electronic viewer anyway. Other cameras will use smaller sensors and yet produce images comparable to those of medium format cameras today - both B&W and colour.
With such high-resolution sensors, antialiasing and moire effects - as well as the mushiness I see in today's digital images - will be gone. The lens and depth-of-focus will be the factors limiting fine detail in the image, not the sensor.
Lens designs will probably change to take advantage of the flexibility offered by image editing software - software that might even be embedded in the camera so that its use is transparent to the user. Distortion can be permitted in the lens so long as its characteristics are predictable and can be reversed in the image editing process. Some types of colour aberration might also be permitted, but I suspect that this will be a minor factor. The lens has to deal with the full spectrum of colours, not just the red, green and blue used in subsequent processing. Or, to put it another way, each sensor, whether red-, green- or blue-sensitive, actually responds to colours over a significant band of the spectrum. Within each of those bands, the chromatic aberrations will still need to be controlled optically. Of course, we could see sensor systems using more than three narrower-band sensor types. That would make software colour aberration correction more viable.
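To sketch how predictable distortion can be reversed (using the common single-coefficient radial model purely as an illustration - real lenses need more elaborate models), software simply inverts the known mapping from ideal image radius to distorted image radius:

```python
# Sketch of reversing known barrel/pincushion distortion in software.
# Single-coefficient radial model: r_distorted = r * (1 + k * r**2),
# with r measured from the image centre (k < 0: barrel, k > 0: pincushion).
# This is an illustrative model, not any specific camera's correction.

def distort(r: float, k: float) -> float:
    """Where the lens actually places a point of ideal radius r."""
    return r * (1 + k * r * r)

def undistort(r_d: float, k: float, iters: int = 20) -> float:
    """Invert the radial model by fixed-point iteration
    (converges quickly for the small k values of real lenses)."""
    r = r_d
    for _ in range(iters):
        r = r_d / (1 + k * r * r)
    return r

k = -0.05                      # mild barrel distortion
r_true = 0.8                   # ideal radius, in normalized units
r_seen = distort(r_true, k)    # where the lens puts the point
r_back = undistort(r_seen, k)  # software recovers the ideal radius
print(f"ideal {r_true:.4f} -> distorted {r_seen:.4f} -> corrected {r_back:.4f}")
```

The key requirement is exactly the one stated above: the distortion must be predictable, so that the inverse mapping is known.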
The SLR concept of using separate optical paths for viewing and image capture requires matched precision along both the taking optical path and the viewing path. A full-time, high-resolution viewing system using the sensor itself permits significant reductions in mechanical precision and permits simpler in-camera manual or optical compensation. We could, for example, see cameras automatically using the Scheimpflug principle to deliver extended depth-of-field. Tilting the sensor is the easiest way to accomplish this, but is not as 'correct' as tilting or swinging the lens. But then the distortion correction system will automatically put things right anyway.
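To give a feel for the numbers (a sketch using the "hinge rule" for tilted lenses - the plane of sharp focus rotates about a hinge line a distance J = f / sin(tilt) below the lens; the focal length and tilt chosen here are merely illustrative):

```python
import math

# Hinge rule for a tilted lens: with the lens tilted downward by angle
# alpha, the plane of sharp focus rotates about a "hinge line" lying a
# distance J = f / sin(alpha) below the lens. The 90 mm focal length
# and 3 degree tilt below are illustrative values only.

def hinge_distance_m(focal_mm: float, tilt_deg: float) -> float:
    return (focal_mm / 1000.0) / math.sin(math.radians(tilt_deg))

j = hinge_distance_m(90, 3)
print(f"Hinge line about {j:.2f} m below the lens")
```

Even a tilt of a few degrees places the plane of sharp focus usefully - for instance, lying along the ground below a camera held at eye level - which is why automatic Scheimpflug focusing is an attractive prospect.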
Thus the electronic viewfinder is probably here to stay - for a while, anyway. But I expect we will see flexible electronic viewfinders that offer the option to view with fast response or with high resolution - or perhaps some combination of features: a small area with high resolution and a larger surrounding area with fast response.
Motion compensation/anti-vibration? Of course. We have it today; it will just get better.
GPS location, time, look-direction logging of images? Of course. It's really not a technical problem to do it today. We could couple this with automatic panorama generation.
How about 3-D imaging? Well, it will all depend upon consumer demand. It would not be a problem to do it today.
The short answer is "no". Film based on light-sensitive silver-halide chemistry will continue to have specialized applications. If capturing more than 50 or 100 Megapixels' worth of photographic information is important, then film may be the economic solution. Please recognize, however, that it is very difficult to generate a plane image containing that many independent bits of information. Diffraction effects limit the ultimate resolving capability of a lens. For higher resolving power we need a larger lens aperture. But larger apertures result in more limited depth of field, so using a large-aperture lens usually means the image has lots of places that are out of focus (and so hold little detail to capture). Maximum overall image detail involves a compromise. Unless the subject being photographed is essentially a flat plane, it becomes very hard to generate that diffraction-limited level of detail throughout the image. Where the subject lies along a plane, film will continue to provide a useful high-detail solution. For general picture-taking of three-dimensional subject matter, I think digital photography will predominate. So, yes, for most of us, film has a very limited life.
But photography has never been a single-process activity. From the very beginning we have had alternative methods to choose from in creating the image. And that will continue.
Undoubtedly yes. Technology will continue to evolve over time. Colour filters and layered photo sensors do not constitute the only ways to determine colour. One could imagine future sensors with micro-spectrometers at each pixel. Or it might be possible to introduce intentional colour aberrations in the lens that somehow allow pixel colour to be determined by how the image is distributed over the pixels. The ultimate imaging technology in science fiction seems to be full-colour, full-motion holography. That may yet be possible. I can't tell you how, but I expect it will be possible. So, yes, there is plenty of opportunity for change yet to come.
Photography is all about producing images that represent the world around us. The technical process by which that image is accomplished is secondary. The process may well be fun, or tedious, or expensive, or easy, and so the process is an important part of the photographic experience for the photographer. The photograph's user or viewer, however, is usually ignorant of the efforts or emotions of the photographer.
Is the information on this web site still relevant? Yes. The issues dealt with on this web site concern the basic optical principles at work in all forms of photographic apparatus. Format choice and imaging technology are secondary issues. About the only issue dealt with in The INS and OUTs of FOCUS that is affected by modern digital photography is the matter of film flatness. With solid-state imaging chips being rigid devices, their ability to distort physically is minimal. I suspect that there can be thermally induced warping of the chip, but I expect it is quite minor compared to the flexibility and curl of traditional photographic films. There have been cameras with intentionally curved film planes, and there may be some advantage in having our solid-state imaging chips curved in some circumstances. For now, however, imaging chips are made on large flat wafers by methods that are essentially photographic. Making curved-surface imaging chips would be difficult and expensive. So the issue of "film flatness" essentially disappears for digital cameras as we currently know them. As implied earlier, the Scheimpflug rules governing focus are still at work. There could be some deviation from Scheimpflug's principles in future if lenses are allowed to deviate significantly from the ideal rectilinear, flat-field model.
There is a general issue that as the image (camera) format is reduced, depth-of-field increases. This assumes the working f-numbers are similar. Countering this trend is the fact that as the physical aperture decreases in size, diffraction effects increase and so the ability to capture detail decreases for very small lens apertures. Many of the digital point-and-shoot cameras today will not stop down beyond f/8 or so. There is a reason for that: the image would soften considerably if the lens were to stop down much further. As the camera format becomes smaller, depth of field generally increases, but if we want to retain all the detail we can get, the depth-of-field issue will not go away. What it comes down to is that photographically useful lens apertures are generally in the 1 to 5 mm range. At smaller apertures detail is degraded; at larger apertures depth-of-field becomes limited. Of course we often use limited depth of field to isolate a subject and telephoto lenses with apertures of 100 mm and more are useful for special purposes.
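A rough Rayleigh-criterion sketch shows why small cameras stop down no further than about f/8 (the 550 nm wavelength and the f-numbers chosen are illustrative):

```python
# Rough diffraction sketch (Rayleigh criterion, green light at 550 nm):
# the diameter of the diffraction blur spot at the sensor is about
# 2.44 * wavelength * f-number, regardless of format.

WAVELENGTH_MM = 0.00055  # 550 nm, expressed in mm

def blur_spot_um(f_number: float) -> float:
    """Diffraction blur spot diameter at the sensor, in micrometres."""
    return 2.44 * WAVELENGTH_MM * f_number * 1000.0

for n in (2.8, 5.6, 8, 16):
    print(f"f/{n}: blur spot about {blur_spot_um(n):.1f} um")
# At f/8 the spot is already about 10.7 um - several pixels wide on a
# small point-and-shoot sensor, so stopping down further softens the image.
```

Because the blur spot at a given f-number is the same for every format, small sensors with tiny pixels run into the diffraction wall at much lower f-numbers than large ones, which is consistent with the 1 to 5 mm physical-aperture range suggested above.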
Much of the subject matter on this web site deals with how the Scheimpflug principle really works. So it is that I see a certain irony in the development of digital photo-editing and page layout techniques. Theodor Scheimpflug's motivation for analysing the optics of tilted lenses was to achieve predictable image distortion: he wanted to translate oblique aerial images into accurate maps. One of his most difficult tasks was to devise an optical system that would stretch an image in one direction only. This is an image distortion we see all too often today as layout artists re-size photographs without holding down the shift key! What was difficult for Scheimpflug happens all too easily by accident today!