It turns out calibrating AF Microadjustment is actually pretty difficult. Sure, there’s lots of information on the web about how you can do it with a piece of paper, but there are so many variables involved that a static paper target doesn’t really cut it for useful real-world results.
At Reikan, I’ve been looking at focus calibration for a little over 5 years now. Over that time, I’ve made a lot of assumptions about how a camera’s autofocus system works and often found that – after testing – those assumptions aren’t quite right. And there’s almost no way you’ll get detailed information out of camera manufacturers – they’re more tight-lipped than the NSA!
FoCal came about because I wanted to test various ideas about autofocus systems, but a lot of the testing was difficult or impossible to do manually because it took too long or you couldn’t touch the camera during the testing. I keep finding more things to test and analyse, hence the addition of lots of interesting tests in FoCal Pro.
Recently, thanks to a discussion with a long-time and well-respected user of FoCal, I took some time to look in detail at just how the spectrum of the light illuminating the subject can affect the autofocus results – i.e. how does colour affect autofocus? – and the findings so far are quite surprising.
In the FoCal Lab we have performed a lot of tests with various light sources to give the general recommendations for testing, but it’s only recently that I’ve taken the opportunity to test with some more unusual monochromatic light sources (the sort that it’s not recommended to calibrate against!). Following on from this, I’ve also played with some custom demosaicing algorithms for raw image data to be able to analyse focus results for red, green and blue lighting and also get light spectral quality information.
It’s important to point out here that any results shown below are very early data. There’s a lot more testing to do in order to properly understand the interaction between the various parts of the autofocus and imaging systems when faced with different light sources. So… this post is just for interest – there will be more concrete findings to follow.
The Autofocus and Imaging Systems and their response to Colour
It’s important to have in your mind that the camera Phase Detect AF system and the Imaging system are independent.
I use the term “AF system” because I’m not talking about just the AF sensor, but everything that goes into getting what the camera considers an “in focus” image on the sensor. This includes the main mirror and its cushioning, the secondary mirror, the AF sensor itself, the lens optics and motors, the algorithms used to determine focus quality, the lens drive and – in a lot of newer cameras – the metering sensor as well. ALL of these components have an input into how focus is achieved and how accurate and fast it is, so you can’t discount any aspect of the system when analysing general results captured from the imaging sensor.
So, you have to be a bit careful about how results are interpreted. When you light the subject with a monochromatic red light, the results analysed at the imaging sensor don’t absolutely tell you what the AF sensor saw.
The imaging sensor consists of millions of photodiodes which are sensitive to everything from around 400nm (blue/violet) through to 1µm or above (well into infra-red). The whole sensor is covered with an infrared blocking filter and a colour-filter array composed of red, green and blue dye panels which cover the individual photodiodes. The characteristics of these dyes are different for different manufacturers – letting different amounts of various colours of light through. So, for example, the “red pixels” will be sensitive mainly to red but will also show some signal if exposed to a bright pure green light, as the dye is not a perfect bandpass filter.
To try to see what happens with different lighting, I used a specially modified version of FoCal which allows analysis of the data from the red, green and blue photosites independently. I could then run the AF Consistency test across a range of AF Microadjustment values and see where the best focus was for each colour, in this instance relative to the uncalibrated AF position (i.e. the position that the camera/lens would focus with an AF Microadjustment value of 0).
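To make the idea concrete, here’s a minimal sketch of how raw Bayer data can be split into per-colour planes and scored for sharpness. This is purely illustrative – it assumes an RGGB mosaic layout (which varies by camera) and uses a simple gradient-energy metric, not FoCal’s actual analysis:

```python
import numpy as np

def split_bayer_rggb(raw):
    """Split a Bayer-mosaic array into per-colour planes.

    An RGGB layout is assumed here for illustration; real cameras use
    various layouts, so check your sensor's CFA pattern first.
    """
    r  = raw[0::2, 0::2].astype(float)
    g1 = raw[0::2, 1::2].astype(float)
    g2 = raw[1::2, 0::2].astype(float)
    b  = raw[1::2, 1::2].astype(float)
    g = (g1 + g2) / 2.0  # average the two green photosites per cell
    return r, g, b

def sharpness(plane):
    """Simple gradient-energy sharpness score (not FoCal's algorithm)."""
    gy, gx = np.gradient(plane)
    return float(np.mean(gx**2 + gy**2))
```

Running `sharpness` on each plane across a range of AF Microadjustment values gives one focus curve per colour channel, which is the kind of per-channel comparison shown below.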
I tested with 2 different light sources in a number of configurations:
– Monochromatic Lighting: individual red, green and blue single-wavelength lights
– Pseudo-white Lighting: the red, green and blue lights all illuminating the target together, giving a visual “white” but actually composed of 3 very distinct monochromatic light sources
– Daylight: a fairly balanced broad-spectrum white light from diffused daylight
For each of the tests, the percentage of total energy received in the red, green and blue channels is shown giving an indication of the light component levels (as the camera sensor sees them).
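The channel-energy percentage itself is a simple calculation over the per-colour planes – a rough proxy for the light’s spectral balance as the sensor sees it. A sketch (assuming black-level-subtracted, linear raw data):

```python
import numpy as np

def channel_energy_percent(r, g, b):
    """Percentage of total signal energy in each colour plane.

    Assumes linear, black-level-subtracted raw data; the result only
    reflects the spectrum as filtered by the sensor's colour dyes.
    """
    totals = np.array([p.sum() for p in (r, g, b)], dtype=float)
    return 100.0 * totals / totals.sum()
```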
Below are a series of graphs showing the results from the testing. Bear in mind that this is an early set of tests and there will be lots more information coming, as well as improvements to the analysis and results presentation.
A few notes on the results:
- The results are normalised to the peak measured result (this will have a value of 1.0)
- The vertical lines show the predicted best-focus position for each colour (if there isn’t a vertical line, it means there wasn’t enough light of that colour to determine the result)
- The distance shown in brackets is the depth-of-focus offset (i.e. the focus point offset at the sensor) from a zero adjustment.
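As a rough illustration of how those curves are presented, here’s how one might normalise a sharpness curve to its peak and estimate a best-focus adjustment value. The parabolic fit near the peak is an assumed, illustrative choice – it is not FoCal’s actual estimator:

```python
import numpy as np

def best_focus(adjust_values, scores):
    """Normalise a focus curve to a peak of 1.0 and estimate best focus.

    Fits a quadratic through the peak sample and its neighbours to
    interpolate a best-focus AF Microadjustment value (illustrative only).
    """
    adjust_values = np.asarray(adjust_values, dtype=float)
    scores = np.asarray(scores, dtype=float)
    norm = scores / scores.max()  # peak measured result becomes 1.0
    i = int(np.argmax(norm))
    lo, hi = max(i - 1, 0), min(i + 2, len(norm))
    if hi - lo < 3:
        return float(adjust_values[i]), norm  # peak at an edge: no fit
    a, b_, _ = np.polyfit(adjust_values[lo:hi], norm[lo:hi], 2)
    return float(-b_ / (2.0 * a)), norm
```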
Canon EF 40mm f/2.8 STM
RGB Combined – “Pseudo-white”
Canon EF 24-70mm f/2.8L
Canon EF 85mm f/1.2L
Interpreting the Results
Although I have a lot of theories and practical evidence of certain behaviours, I’m not even going to try to explain the results, for a few reasons:
Firstly, the results above come from an immature test. There’s no autofocus consistency validation, and whilst the tests were repeated to check result validity, there could reasonably be some errors which may affect the finer details of the interpretation.
Secondly, there’s not enough data to draw any serious conclusions. We can observe that it certainly looks like the perfect focus position for red, green and blue light isn’t in the same place, which could result in some image softening in some channels (and this could well be down to axial chromatic aberration – where each wavelength of light is focused on a different plane parallel to the sensor plane).
Thirdly, there’s a fair chance that the camera manufacturers implement a correction to autofocus position based on spectral content as measured by the AF (and metering) sensor. If this is the case, there’s not going to be any linear change as you adjust the amount of red, green and blue, and it would be very difficult (or most likely impossible) to determine what’s actually happening in the complete focus system.
For now, the best conclusion from this is to make sure you calibrate your camera and lenses under the lighting conditions you’re most likely to shoot in. For example, if you calibrate under warm incandescent lighting, you may find that your focus isn’t optimal for clear blue daylight shooting.
The results are really exciting because they start to quantify some detailed effects as a result of the AF system and lens design, and will go towards helping get more accurate calibration for a wider range of shooting conditions.
This data will be available in FoCal 2 (for FoCal Pro users) – you’ll be able to automatically determine best focus for various colours as well as getting an idea of the spectral content of your source lighting. There are a few other features which will help make more use of this from a calibration point of view, but I’ll leave that to another post.