Re: Diffraction, Landscapes and Calculator

Discussion forum for Tawbaware's PTAssembler software, Helmut Dersch's Panorama Tools software and any other photography related software


Re: Diffraction, Landscapes and Calculator

Post by woods » Sun Aug 07, 2011 9:21 am

I first wrote these words at the end of another thread, back on June 28, 2011. No one replied to my post there, perhaps because that thread started so long ago, so I'm re-posting here instead:


I'm having trouble reconciling these two things:

1.) Max Lyons's first post in this thread:


2.) Daniel Browning's first post in this thread: ... p?t=747761

In the same vein, I'm having trouble reconciling certain conclusions I (perhaps mistakenly) draw from

1.) Mr. Lyons's lens equivalence / diffraction calculator ( ) and

2.) the pictures in Mr. Browning's first post, above:

Does the Lyons calculator indicate that a photograph taken with a 21-MP full-frame sensor at f/14 will resolve no more detail than a photograph taken with a 12-MP full-frame sensor, all other things being equal (same sensor quality, same filter, etc.)? Does it (also? instead?) indicate that a picture shot at f/14 with a 21-MP full-frame sensor will look the same at native resolution as it does after being down-rezzed to an 11.4-MP file and then up-rezzed back to 21 MP? Neither of these claims is made in so many words by Max Lyons or by others in the threads I mention; rather, they are my attempts at drawing practical conclusions from the calculator. Am I misunderstanding it?

What confuses me is that the photos shot at f/16 (viewable in the Browning post, above) seem to show the 21-MP full-frame sensor resolving more detail than the 12-MP sensor, even when both are exposed with the same lens at f/16. Granted, this is not an "all other things being equal" situation: the 21-MP sensor is arguably of higher quality, with a better filter; I don't know what algorithm Mr. Browning used to up-rez the crop of the 12-MP picture; and I don't know what sharpening was applied to the two images, or whether it was applied equally. But based on the implications of the Lyons calculator, I would expect these two sensors to resolve roughly the same amount of detail at f/16 (indeed, even at f/14).

I know that my understanding is incomplete and that I am probably overlooking some important considerations. I mean this post not at all as a challenge to either Mr. Lyons or Mr. Browning, but as a question, a plea to both of them for help in understanding their claims. It may be that they do not even disagree with one another. So I'm asking for help in seeing how their claims or conclusions about diffraction-limited systems are compatible.
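For what it's worth, here is how I've been trying to sanity-check the scenario myself, a rough sketch only. The 550 nm wavelength, the 36 mm sensor width, and the 3:2 aspect ratio are my assumptions for illustration, not anything taken from the calculator:

```python
import math

WAVELENGTH_MM = 550e-6  # assumed mid-spectrum green light, in millimetres

def airy_diameter_mm(f_number):
    """Diameter of the central Airy disk: d = 2.44 * wavelength * N."""
    return 2.44 * WAVELENGTH_MM * f_number

def pixel_pitch_mm(megapixels, width_mm=36.0, aspect=1.5):
    """Pixel pitch of a full-frame sensor with the given megapixel count."""
    horizontal_pixels = math.sqrt(megapixels * 1e6 * aspect)
    return width_mm / horizontal_pixels

for mp in (21, 12):
    for n in (14, 16):
        ratio = airy_diameter_mm(n) / pixel_pitch_mm(mp)
        print(f"{mp} MP full frame at f/{n}: Airy disk spans {ratio:.1f} pixels")
```

By this crude measure, the Airy disk at f/16 covers roughly three pixels on the 21-MP sensor and about two and a half on the 12-MP one, so both sensors are oversampling the blur spot to some degree. Perhaps that is why the question of which sensor "wins" depends so much on which resolution criterion one adopts.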

With gratitude to all,



Re: Diffraction, Landscapes and Calculator

Post by maxlyons » Mon Aug 08, 2011 11:01 pm


I think there are others who contribute here who are much more expert on this matter than I am. But here is what I understand to be true (I'm sure someone will correct me if this isn't right). Diffraction puts a limit on the finest detail that can be resolved: as the aperture is stopped down, the finest detail that can be resolved becomes coarser. My calculator (the green line) shows the minimum size that a point of light will cover on the sensor, because of the blurring due to diffraction.
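The quantity behind that green line can be sketched in a few lines. This is only an illustration: the 550 nm figure is an assumed mid-spectrum wavelength, not necessarily the value the calculator itself uses:

```python
# Sketch of the diffraction blur spot: the diameter of the Airy disk
# that a point of light spreads into, d = 2.44 * wavelength * N.
WAVELENGTH_MM = 550e-6  # assumed mid-spectrum wavelength (550 nm)

def airy_diameter_mm(f_number):
    return 2.44 * WAVELENGTH_MM * f_number

for n in (4, 5.6, 8, 11, 16, 22):
    print(f"f/{n}: {airy_diameter_mm(n) * 1000:.1f} microns")
```

Note that the blur-spot diameter is simply proportional to the f-number: doubling the f-number doubles the size of the smallest spot a point of light can make on the sensor.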

Of course, all of this is theory, and there are a lot of simplifying assumptions baked into these calculations...the two largest (in my opinion) being:

1. The use of the wavelength of some intermediate light color (rather than the full spectrum of visible light).
2. Ignoring the fact that most digital cameras use a Bayer filter to convert the output from a "monochromatic" sensor to a color image.

I believe that both of these simplifying assumptions can make a big impact on the real-world performance of today's cameras. In practice (i.e. looking at images produced by my own camera and lenses), my experience leads me to believe that the actual diffraction-limited apertures may be somewhat smaller than those my calculator predicts, which may explain why you still see an increase in detail between different-megapixel sensors even at small (i.e. "diffraction limited") apertures. In any event, if you are concerned about the performance of your own camera and lenses, we used to have a contributor on this forum whose signature read "a test is worth one thousand expert opinions". I think that is wise advice!

Here are a couple of other useful links that may help clarify the issue: ... graphy.htm



Re: Diffraction, Landscapes and Calculator

Post by Growing » Sun Aug 21, 2011 1:45 am

I'm trying to use Max's diffraction calculator ( ) for video on a Canon 7D, but I can't find some of the vital details:

* What are the dimensions of the sensor in the various video modes? (They are somewhat cropped from the full sensor, and the Canon 60D/600D have additional "zoom" modes.)

* I believe that rows of the sensor are skipped (leading to moiré problems). But are the horizontal pixels also skipped, or are they averaged?

* What is a suitable criterion for in-focus versus out-of-focus for video? For still photos, one uses a circle of confusion based upon some hypothetical print viewing distance, but video is not printed. Does a circle of confusion <= 1 pixel correspond to in-focus, and a circle of confusion >= 2 pixels to blurred?

* Does contrast-detection-focusing (in still and/or video) utilise the FullHD video feed (and thus is limited to the 2M pixel video resolution), or does it somehow peek at all 18M pixels (that lie within the focus region)?

Presumably, since the lens can resolve far more than FullHD, the sensor suffers from moiré if the circle of confusion is << 1 pixel. If one wants maximum sharpness without moiré, one should select an aperture such that the minimum circle of confusion is around 1 pixel. I calculate the video pixel spacing to be 0.0117mm, so 2 pixels is close to your recommended CoC for cropped sensors of 0.019mm for still images. That would tend to lead to similar calculations for depth of field etc., which doesn't quite feel right to me.

So how should one calculate DoF for DSLR video?
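Not an authoritative answer, but one way to sketch it is to plug a video-sized CoC into the standard thin-lens DoF formulas. Here I use your own figure of 0.0117 mm (one 1080p pixel on the 7D) as the CoC; the focal length, aperture, and subject distance are arbitrary example values:

```python
def hyperfocal_mm(f, n, c):
    """Hyperfocal distance for focal length f, f-number n, CoC c (all mm)."""
    return f * f / (n * c) + f

def dof_limits_mm(f, n, c, s):
    """Near and far limits of acceptable focus at subject distance s (mm)."""
    h = hyperfocal_mm(f, n, c)
    near = s * (h - f) / (h + s - 2 * f)
    far = s * (h - f) / (h - s) if s < h else float("inf")
    return near, far

# Example: 50mm lens at f/5.6 focused at 3 m, CoC = one video pixel (0.0117mm).
near, far = dof_limits_mm(f=50.0, n=5.6, c=0.0117, s=3000.0)
print(f"DoF: {near / 1000:.2f} m to {far / 1000:.2f} m")
```

Since the CoC appears (almost) linearly in the hyperfocal distance, choosing c = 1 pixel rather than 2 roughly halves the depth of field, so the criterion question you raise directly scales the answer.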

