Mirrorless AF Calibration – Part 3: On-Sensor Phase-Detect Issues

In part 2, we covered the operation of the AF sensor, so now it’s time to look at some potential problems at the sensor that may affect the sharpness of our final images.

This post will specifically look at issues relating to the phase-detection process occurring on the image sensor, and how the measurement of the focus error (how out-of-focus our point of interest is) may be affected.

The autofocus sensor is just one part of the overall Autofocus System, however, and to see how sensor issues can result in soft images, you’ll want to continue through the journey in the other posts.

About this Series

This post is part of a series looking at on-sensor phase-detect autofocus, the whole autofocus system, and why mirrorless autofocus might still need some calibration.

The full list of posts is:

On-Sensor Phase Detect

[Part 2 covered how the On-Sensor Phase Detect system works in a fair bit of detail, with a step-by-step guide through the effect of the sensor microlenses and pixel masking. It’s worth reading that before heading on with this post if you haven’t already, as it should provide a good background for how the AF sensor actually works]

Each pixel on an image sensor is a complicated little thing! In the middle is a photodiode – the actual light-sensitive component – and around this are a few transistors to process the signal from the photodiode and allow access to each pixel by the electronics at the edge of the sensor.

Diagram of a single pixel (sensel) on a camera image sensor

The electronics take up space around the photodiode, reducing the total area that’s sensitive to light for each pixel, and this is where microlenses come into play. They sit on top of the whole pixel, capturing light from a larger area and focusing it on the photodiode, improving the efficiency of the whole sensing element significantly.

Underneath the microlens is a colour filter, and for pixels used in phase-detect autofocus, there will be a mask covering part of the underside of the microlens.

Masked pixels for phase-detection

Now we have a clearer view of the construction of a single sensing element, we can start to look at places where errors may be introduced into autofocus operations.

Mask Placement

Take the example of the Nikon Z9 – the sensor is 35.9mm across, and covering this distance is 8,280 pixels, giving a pixel size of 4.3µm (microns, or millionths of a metre). For phase-detect autofocus pixels, the mask covers half of the microlens in order to see light from a specific direction.
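As a quick sanity check, the pixel pitch quoted above follows directly from the sensor width and the horizontal pixel count. A back-of-the-envelope sketch in Python:

```python
# Quick arithmetic check of the pixel pitch quoted above,
# using the sensor width and horizontal pixel count from the text.
sensor_width_mm = 35.9
pixels_across = 8280

pitch_um = sensor_width_mm * 1000 / pixels_across  # millimetres -> micrometres
print(f"Pixel pitch: {pitch_um:.2f} um")  # ~4.34 um, i.e. the 4.3 um quoted above
```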

Exploded view showing phase-detect mask

Now, what might happen if the mask isn’t in quite the right place?

Incorrect masks: 40% mask (left), and 60% mask (right)

I ran some simulations of shifting the mask by 1/10th of the pixel width in each direction – around 400nm (nanometres, or billionths of a metre). And here’s a picture of the results…

View at the sensor with incorrect mask placement

This is the view the sensor will see of our far-focus target, looking through the right- and left-masked microlenses (if you don’t understand this image, read Part 2 – On Sensor Phase Detect).

The green arrow indicates the focus offset direction and amount, which is the information we need from the autofocus sensor.

The top pair has the mask perfectly covering 50% of the lens, whereas the other two are shifted slightly to one side or the other (covering either 40% or 60%). And remember, we’re talking about a tiny, tiny offset in the mask placement of around 1/125th the thickness of a human hair!

The important thing to notice in the image above is that the lengths of the 3 green arrows are different. This represents the AF sensor’s measurement of the amount of defocus, or “how out of focus” the lens is, and with fractionally incorrect mask placement on the sensor, the results are significantly different.

Incorrect mask placement can change the amount of defocus reported by the AF sensor.
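To make the arrow-length change concrete, here’s a deliberately simplified toy model (my own illustration, not the real sensor geometry): treat each masked AF pixel as sampling one half of the microlens aperture, with the measured defocus scaling with the separation (the “baseline”) between the centroids of the two unmasked halves.

```python
# Toy 1D model of mask-placement error (an illustration only):
# the reported defocus scales with the separation between the
# centroids of the two unmasked halves of the aperture.
def aperture_centroid(mask_fraction):
    """Centroid of the unmasked part of a unit-width aperture,
    with the mask covering [0, mask_fraction)."""
    return (mask_fraction + 1.0) / 2.0

def measured_defocus(true_defocus, right_mask=0.5, left_mask=0.5):
    # Nominal baseline: both masks cover exactly half the aperture.
    nominal = aperture_centroid(0.5) - (1.0 - aperture_centroid(0.5))
    # Actual baseline once mask-placement errors are included.
    actual = aperture_centroid(right_mask) - (1.0 - aperture_centroid(left_mask))
    return true_defocus * actual / nominal

print(measured_defocus(1.0))                  # 1.0 (perfect masks)
print(measured_defocus(1.0, right_mask=0.4))  # ~0.9 (40% mask shortens the arrow)
print(measured_defocus(1.0, right_mask=0.6))  # ~1.1 (60% mask lengthens it)
```

In this toy model, a mask covering 40% instead of 50% shortens the effective baseline by 10%, so the reported defocus shrinks by the same factor, mirroring the different arrow lengths in the simulation results.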

Microlens Placement and Shape

The microlenses on top of the pixels used to guide light onto the smaller photodiode can be produced in a number of different ways. In one, cylinders of material can be deposited above the pixels and then melted into shape, and in another, the whole array may be pressed from a master a bit like a vinyl record.

Back to our Nikon Z9 – what if some of those 46 million tiny lenses mounted on top of the sensor aren’t in quite the right place or aren’t quite the right shape?

One microlens is in the wrong place, and one is the wrong shape (form error)

From our autofocus perspective, a shift in the position of the microlens could change the view seen by masked pixels, and have a similar effect as shown above with an incorrect mask position.

An error in the lens shape (known as a lens form error) can lead to stray light being projected onto adjacent pixels, adding noise to autofocus measurements and reducing their accuracy. A lens that’s too flat, as shown above, may fail to capture light rays correctly for reliable phase-detect operation, leading to erroneous results for that pixel.

Microlens form errors can change the amount of defocus reported by the AF sensor.

Dead Pixels

Dead pixels are a fact of life when you’re dealing with millions of sensing elements on a single sensor. But what happens if a masked autofocus sensing pixel is dead?

To be honest, almost certainly nothing. The AF pixels don’t operate on their own – the autofocus measurement is the result of measuring light at hundreds of individual masked pixels – so if one or two of them are “dead”, then rather than returning subtly incorrect information, they return nothing at all. And nothing is pretty easy to ignore.
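As a sketch of why dead AF pixels are so benign, here’s a toy average over a few hundred simulated pixel readings (the numbers are invented purely for illustration):

```python
import random

# Illustrative sketch: a dead AF pixel returns nothing and is simply
# skipped, while hundreds of live pixels still give a solid average.
random.seed(1)
true_defocus = 5.0  # arbitrary units, invented for illustration
readings = [true_defocus + random.gauss(0, 0.5) for _ in range(400)]

# Kill a couple of pixels: they report nothing (None), not a wrong value.
readings[10] = None
readings[250] = None

live = [r for r in readings if r is not None]
estimate = sum(live) / len(live)
print(f"{len(live)} live pixels, estimated defocus ~ {estimate:.2f}")
```

Dropping two samples out of four hundred barely moves the average; a subtly wrong value from every pixel, as in the mask and microlens cases above, is a different story.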

It’s far more of an issue for the above cases, where the AF Sensor reports some information but it’s not quite accurate.

Effect on Autofocus

I’ve shown a few potential causes of manufacturing-related sensor issues that wouldn’t affect image quality but may affect results from individual autofocus sensor pixels. 

Autofocus as a process doesn’t use a single pixel, though. Instead, many pixels are used to form lines, and the focus position is determined by looking at edges of objects across the line. So problems with single pixels don’t matter, do they…?

Image sensors often use lines of masked pixels for phase-detection
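As an illustration of the idea, here’s a minimal sketch (my own simplified model, not any camera’s actual algorithm) of recovering a focus offset by sliding one masked view’s line signal over the other until an edge lines up:

```python
import math

# Simplified model: the left- and right-masked pixels along a line see
# the same edge, displaced apart by an amount proportional to the defocus.
def edge(x, position, width=2.0):
    """A soft dark-to-light edge centred at `position`."""
    return 1.0 / (1.0 + math.exp(-(x - position) / width))

n = 200
true_shift = 6  # pixels of disparity between the two views
left_view = [edge(i, 100 - true_shift / 2) for i in range(n)]
right_view = [edge(i, 100 + true_shift / 2) for i in range(n)]

def sse(shift):
    """Sum-of-squares mismatch with the left view displaced by `shift`."""
    total = 0.0
    for i in range(n):
        j = i - shift
        if 0 <= j < n:
            total += (left_view[j] - right_view[i]) ** 2
    return total

# The best-matching displacement is the measured disparity.
best = min(range(-20, 21), key=sse)
print(f"Measured disparity: {best} pixels")  # recovers the 6-pixel shift
```

Because the result comes from the whole line, not one pixel, a single bad sample has little leverage; many slightly-biased samples are what matter.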

We’re talking about random errors across millions of pixels, so it’s fairly likely that quite a few pixels will be marginally affected by mask placement or microlens issues. And remember, these issues don’t typically affect image quality, so unless you’re looking at the raw phase-detect data directly from the sensor under test conditions, you’ll never know exactly how the pixels are affected.

And the result of each pixel having just a tiny, random bit of error is an increase in the uncertainty of the result.

Don’t forget, these tiny errors are unchanging – they’re part of the manufacturing process and aren’t affected by vibration, temperature or general wear-and-tear. And this is important:

The slightly erroneous results being fed into the Autofocus Processor will always be wrong in the same way.

It’s not inconceivable that, after the full autofocus process has run, these tiny, fixed sensor errors might cause a fixed offset to the focus position… exactly the sort of thing that AF Fine-tune is there to fix.
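A tiny simulation makes the distinction concrete (illustrative numbers only): random per-pixel noise shrinks as more pixels are averaged, but a fixed bias passes straight through the average.

```python
import random

# Sketch: random per-pixel noise averages away over many pixels,
# but a fixed manufacturing bias survives the averaging, leaving
# a constant focus offset of the kind AF fine-tune can correct.
random.seed(2)
true_defocus = 5.0   # arbitrary units, invented for illustration
fixed_bias = 0.3     # hypothetical constant error from mask/microlens issues
n_pixels = 1000

readings = [true_defocus + fixed_bias + random.gauss(0, 0.5)
            for _ in range(n_pixels)]
estimate = sum(readings) / n_pixels
print(f"True defocus {true_defocus}, estimate {estimate:.2f}")
# The random noise shrinks with more pixels; the 0.3 bias does not.
```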

There’s a lot more of the whole Autofocus System to look at before we draw any solid conclusions, though.

Next up, Part 4 and the beast that is The Lens – this one’s got loads of pretty pictures!

2 comments on “Mirrorless AF Calibration – Part 3: On-Sensor Phase-Detect Issues”

  • Andy Miller says:

    Do cameras that use a Bayer sensor design perform AF poorly when the subjects are pure blue or red — so effectively only 1/4 of the AF sensors provide useable data for the AF processor to use?

    • Interestingly, in tests we’ve done, the answer is no. Typically, the PDAF masks are used on the green pixels as there are twice as many, and getting the green channel in focus results in a perceptually sharper image to the human eye. But if the subject is lit with pure blue or red light, AF still works (at least on later cameras like the 5D Mark IV using live view, and the Nikon Z-series mirrorless cameras), which suggests that (a) a mix of red, green and blue pixels may be used for PDAF, and/or (b) the colour filters are not all that specific, and enough red/blue gets through the green filter for AF to work, or (c) there is no colour mask on the PDAF pixels at all. The latter certainly happens for some sensors, but generally on sensors a lot smaller than DSLR size, as it’s difficult to lay down filter arrays with blank spaces.

