Friday, January 5, 2007
The Digital vs. Film debate has been ongoing since the advent of digital photography. Early in the debate, there was a question as to which medium was better. That debate isn’t really the issue anymore. Rather, what I find most amusing are the discussions regarding the 35mm film equivalent to digital in terms of megapixels (MP). I’ve seen wild ranges of values, talk of a general consensus, and so on. This article attempts to explore the “megapixel” issue in more detail.
Which is better, film or digital?
I’ll get right to the point and say that all things being equal, digital is clearly better. By “all things being equal”, I mean that film comes in multiple formats (35mm, medium format, large format such as 4×5, etc.). Likewise, it’s fair to say that digital has not completely eliminated the need for film, as there is no real digital equivalent to the high end film formats. However, for the other 99.999% (an estimate, of course) of the photographer population, we’re really only interested in 35mm film. With the quality of today’s digital cameras, there is really no need to continue using 35mm film based equipment. Some older photographers like the look (including the imperfections) of film based prints, but even this can be simulated with digital filters. Of course, I suppose it’s not fair to just proclaim one medium as being better than another without qualifying the reasons behind the claim. In no particular order, digital is better than film because:
1. The development process: There are so many ways things can go wrong with making prints from film. Digital processing is much more consistent. Only the most skilled photographers can get the maximum benefit of film. In practice, the average photographer does much better with digital.
2. Workflow: To get the best image possible, film has to be digitized anyway in order to tweak the color, sharpness, attempt to reduce the noise, etc. In these cases, you’re at the mercy of the quality of your scanner. In terms of convenience, digital is already “digitized”, though those shooting in RAW format will do some development. The convenience of sorting, storing, searching and transferring digital files cannot be matched by film.
3. Much more accurate colors.
4. Better dynamic range through RAW file processing.
5. Far less noise and much more useful ISO range.
6. Film costs money. Digital storage is comparatively cheap.
7. Film degrades over time. Files can be duplicated exactly.
8. Immediate feedback of the quality of the picture taken. This just isn’t possible with film based cameras. Many professional photographers readily admit that learning on a digital camera can save years of trial and error based experimentation with film based cameras.
What is the digital equivalent to 35mm film in megapixels?
This is the more difficult question to answer, because there is no quick and easy way to measure it accurately. Worse, most photography “experts” are by no means qualified scientists. Many feel that their credentials in photography (which is essentially an art form) also qualify them to answer more scientific questions. People feel the need to quote respected photographers on what is essentially their “opinion” on the matter. As such, for 35mm film, I’ve seen ranges from 5 megapixels up to and even beyond 20 megapixels.
Why such a large range and who is right?
One of the main reasons there is such a large range is film itself. That is, not all film is created equal. Film advocates tend to quote numbers from the highest quality, highest priced, finest-grained films available. That’s fine, but in reality the average film purchased is probably the cheapest. In theory, the very fine-grained 35mm films have the equivalent of about 20 megapixels. So, 35mm film = 20 megapixels, right? Film advocates would like to stop there and say yes, but the reality is far different. The film advocates who think 35mm film is equivalent to 20MP apparently aren’t familiar with the Modulation Transfer Function (MTF). In layman’s terms, this refers to the cumulative effect of the lens, the film, the scanner, the sharpening algorithms, etc. In short, the theoretical 20MP quickly gets cut in half to about 10MP. More detail about MTF can be found at the link below:
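The MTF idea can be sketched numerically: each stage in the imaging chain has its own transfer factor, and the system’s overall factor is the product of them all. The component values below are purely hypothetical assumptions chosen to illustrate the shape of the argument, not measured data:

```python
def effective_megapixels(theoretical_mp, component_mtfs):
    """Scale a theoretical megapixel figure by the combined system MTF.

    Resolution is linear while megapixels measure area, so the
    combined (linear) MTF factor is applied once per dimension,
    i.e. squared.
    """
    system_mtf = 1.0
    for mtf in component_mtfs:
        system_mtf *= mtf  # each imperfect stage erodes resolving power
    return theoretical_mp * system_mtf ** 2

# Hypothetical chain: lens, film, scanner (illustrative numbers only)
chain = [0.9, 0.85, 0.92]
print(round(effective_megapixels(20, chain), 1))  # -> 9.9
```

Even with fairly generous per-component factors, the multiplication quickly drags a theoretical 20MP down to roughly half, which is the gist of the MTF argument above.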
This is where the industry “consensus” comes in. The general consensus is that in terms of resolution only, 35mm film is close to 10MP on digital cameras. That would be fine if resolution were the only factor involved with image quality, but that’s not the case. The four major factors that impact image quality are resolution, noise/grain, dynamic range and color. Digital cameras have several times the signal to noise ratio of film cameras. Even among digital cameras, quality is not the same. Digital SLRs have a much higher signal to noise ratio than “point and shoot” digital cameras because of their larger (and more expensive) internal sensors. Likewise, a 6MP digital SLR will produce better quality images than an 8MP point and shoot digital camera.
I’ve read many articles in magazines and read many web based articles on the topic. I’ve come to the conclusion that most photographers are not scientists and likewise don’t really understand the dynamics that come into play when making such comparisons. While I encourage everyone to read as much information as they can on the topic, ultimately you’ll have to come to your own conclusion. There is one web site that I can recommend on this topic. It’s certainly one of the few that have any scientific credibility behind it.
Of particular interest is a graph which shows the range of resolution on the vertical axis and ISO values on the horizontal axis. The quality of film photography quickly drops off across the ISO range. What’s the point of enlarging a photo if it’s going to look like crap? Just because film has potential resolution doesn’t mean the images will be visually appealing if they are grainy and noisy. I’ll present my own conclusions, which are a combination of my own observations and my research on the topic, in the conclusion paragraph below.
Also, when researching the topic, I’ve found that most of the articles are old and mostly out of date. That is, film is relatively stagnant in terms of improvements. By comparison, digital imaging technology has been progressing at a very fast pace. For example, the first consumer level digital camera was the Apple Quicktake 100, released in February, 1994. It would be a joke to compare that to film. Clearly film was far superior. For more digital photography history, see the link below.
By 2001, 5MP cameras were coming onto the scene that posed a serious challenge to 35mm film capabilities. By 2003, there were heated debates on the topic as digital cameras from several vendors were exceeding 35mm film capabilities. As of this writing, December 2006, nobody with an ounce of credibility can deny that digital photography has long since eclipsed the capabilities of 35mm film based photography. In fact, higher end Canon DSLRs are producing images that match or rival medium format film cameras.
How much is enough?
In the printing business, the rule of thumb for high quality output is that you want to keep your images in the 200 – 300 ppi (pixels per inch) range. Essentially, based on printing technology, nothing above 300ppi can be realized in terms of quality. Anything under 200ppi is where you might start to notice a drop in quality. So, for the best quality images, how much resolution is needed? A 4” x 6” photo at 300ppi would require a 2.16MP image: (300 × 4) × (300 × 6) = 1200 × 1800 = 2,160,000 pixels, or 2.16MP.
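The arithmetic above generalizes to any print size: multiply width and height in pixels, then divide by a million. A short sketch (the print sizes are just examples):

```python
def required_megapixels(width_in, height_in, ppi=300):
    """Megapixels needed for a print at a given pixel density (ppi)."""
    pixels = (width_in * ppi) * (height_in * ppi)
    return pixels / 1_000_000

print(required_megapixels(4, 6))    # 4x6 at 300ppi  -> 2.16
print(required_megapixels(8, 10))   # 8x10 at 300ppi -> 7.2
```

Note that an 8×10 at the same 300ppi already needs over 7MP, which is why larger prints are where extra megapixels actually pay off.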
So, why would you want more resolution? More resolution allows you to crop a portion of your image and still make a reasonably sized print out of it. The resolution chart above also explains why digital photography really took off at the consumer level once affordable digital cameras were available in the 2 – 3 MP range. Most amateur photographers rarely have a need for anything but 4×6 prints.
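The crop headroom argument is easy to quantify: cropping keeps only a fraction of the frame’s area, and megapixels scale with area. A minimal sketch, with the 6MP starting point and 50% crop chosen purely as examples:

```python
def megapixels_after_crop(start_mp, keep_fraction):
    """Megapixels remaining after cropping to a fraction of the frame area."""
    return start_mp * keep_fraction

# A 6 MP image cropped to half its area still has 3 MP --
# comfortably above the ~2.16 MP a 300ppi 4x6 print needs.
print(megapixels_after_crop(6, 0.5))  # -> 3.0
```

In other words, extra megapixels buy you cropping room: you can discard half the frame and still make a full-quality small print.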
Based on my own observations and research into the many factors of image quality, I’ve come to the following conclusion: There is no exact megapixel equivalent between 35mm film and digital media. Image quality varies based on the quality of the film used; the ISO setting for the picture taken; not just the number of megapixels, but also the size of the digital sensor; the signal to noise ratios, etc. That said, there are a few generalizations that can be made with regard to the 35mm film and digital mediums.
1. With “regular” film, shooting at a standard ISO 200 on a film based point and shoot camera is equivalent to about a 5MP point and shoot digital camera.
2. On one impractical extreme, under the most ideal lighting, ISO 50, best quality film possible, 35mm film can achieve a quality somewhere in the 10+ MP digital equivalent range.
3. On the other impractical extreme, at ISO 1600 and with lesser quality film, 35mm film degrades to somewhere in the 2 – 4 MP digital equivalent range.
4. Overall, across a range of settings, a decent quality digital camera in the 6 – 8 MP range will almost always beat 35mm film based cameras in terms of the overall quality of image produced.
Since there are many factors which influence image quality, it’s not hard to understand why there is no one single magic answer as to how many megapixels are equivalent to 35mm film. Likewise, regardless of how many people repeat the same “consensus”, understand there is no single answer. Any source that tries to peg 35mm film into a single, precise megapixel equivalent is simply incorrect.