Pentax K-3 -- A Geek's Guide to On-Demand Low-Pass Filtering

by Dave Etchells

Pentax revolutionizes low-pass filtering. In a genuinely groundbreaking move, Pentax has developed a solution for variable, on-demand low-pass (anti-aliasing) filtering in digital cameras, the first implementation being in its new K-3 DSLR. This is such an important development that we're going to devote a little time to explaining how it works, and why it's so significant.

Low-pass filters, aka anti-aliasing (AA) filters, are an important but widely misunderstood part of digital imaging. Recently, there's been a move afoot in the photo industry to eliminate them, which we at IR consider ill-advised. They're very necessary in some situations, yet in others they needlessly reduce resolution and sharpness. Clearly, what's needed is a way to have a low-pass filter when you need it, and to do away with it when you don't. That's exactly what Pentax has just made possible for the first time, in the new K-3 DSLR.

Some background. Before getting into the details of what Pentax has done, it'd be useful to explain a bit about the role of low-pass filters in digital imaging, and just why they're so necessary.

Ultimately, what it all comes down to is accurately reproducing the scene in front of a digital camera, without artifacts such as jaggies and false colors in the image. At the root of the issue is something called the Nyquist-Shannon Sampling Theorem (the Sampling Theorem, for short). That's a mouthful, and there's a lot of complicated math behind it, but the bottom line is that you need to sample an analog signal at a rate at least twice the highest frequency it contains.
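If you'd like to see what that means in practice, here's a tiny Python sketch of our own making (the 70 Hz tone and the two sample rates are arbitrary numbers picked for illustration) showing how a signal sampled at less than twice its frequency masquerades as a lower, phantom frequency:

```python
# Our illustration: fold a frequency back into the 0..Nyquist band, the way
# aliasing does when a signal is sampled too slowly.
def apparent_frequency(signal_hz, sample_rate_hz):
    nyquist = sample_rate_hz / 2.0
    folded = signal_hz % sample_rate_hz
    return folded if folded <= nyquist else sample_rate_hz - folded

signal_hz = 70.0   # the highest frequency present in our make-believe "scene"
for rate_hz in (200.0, 100.0):
    print(f"sampled at {rate_hz:3.0f} Hz -> appears as "
          f"{apparent_frequency(signal_hz, rate_hz):3.0f} Hz")
# sampled at 200 Hz -> appears as  70 Hz  (200 >= 2 x 70, so it's faithful)
# sampled at 100 Hz -> appears as  30 Hz  (undersampled: a phantom alias)
```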

Wait a minute: Frequency? Who said anything about frequency? Aren't frequencies just something we worry about in audio applications?

While most of us tend to think of frequency in the context of sound, the same concept applies to images as well, in the form of spatial frequencies. To understand, think of a grid of black and white lines. The spacing between the lines corresponds to the grid's spatial frequency. When the lines are fine and close together, that represents a higher spatial frequency; when they're coarse and farther apart, the spatial frequency is lower. The illustration at right shows examples of low and high spatial frequencies.

The Sampling Theorem says that to accurately reproduce an image like this, we need at least two pixels for every line pair at the finest pitch we want to record. If we have too few, we'll see aliasing, most obviously as moiré: broad, often swirling bands of tone or sometimes color that are visible in the image but not present in the subject.

We've all probably seen moiré patterns at one time or another, perhaps on television when someone on-camera is wearing a herringbone-patterned suit or other fabric. You might also have seen color artifacts in your own digital images as well, where very fine, high-contrast details were present in the original scene.

The solution to problems like this is to insert a low-pass filter into the optical chain, before the image is digitized by the sensor. As its name suggests, a low-pass filter only lets through spatial frequencies lower than a certain value. If you set the cutoff at the spatial frequency corresponding to two pixel pitches (that is, half the sensor's sampling frequency), you should be able to eliminate aliasing. (In actuality, you may need a lower cutoff frequency than that to avoid color artifacts, because same-colored samples in the color filter array of most cameras are spaced further apart than the underlying sensor pixels. Only the Foveon sensors used in cameras made by Sigma Corporation entirely eliminate color artifacts, because those sensors produce full red/green/blue data at every pixel location. They can still be subject to luminance artifacts, though.)
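To make that a bit more concrete, here's a rough Python/NumPy sketch of our own (the grating pitch, sampling step and Gaussian blur are arbitrary stand-ins, not measurements from any real sensor) showing why the filtering has to happen before the pixels sample the image:

```python
# Our own rough demo: sample a fine grating with and without a low-pass
# filter applied first, then compare the contrast of the sampled result.
import numpy as np
from scipy.ndimage import gaussian_filter

size, step = 512, 8                    # "scene" width and the sampling pitch
x = np.arange(size)
# A grating slightly finer than the sampling pitch: classic moiré territory.
scene = np.tile(0.5 + 0.5 * np.sin(2 * np.pi * x / (0.92 * step)), (size, 1))

raw_samples = scene[::step, ::step]                               # no filtering
lpf_samples = gaussian_filter(scene, sigma=step)[::step, ::step]  # LPF first

# The unfiltered samples show bold phantom banding the scene never contained;
# the pre-filtered samples correctly render detail too fine to resolve as a
# nearly uniform grey.
print("banding contrast, no LPF:", round(float(raw_samples.std()), 3))
print("banding contrast, LPF on:", round(float(lpf_samples.std()), 3))
```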

The shot above is an excellent example of color aliasing, caused by the fine thread patterns in the model's outfit. In this particular case, the problem wouldn't be too difficult to eliminate in Photoshop, perhaps using a hue brush to remove the offending colors. Subjects with varying hues or larger-scale detail patterns of their own can make removing color artifacts very difficult or impossible. As noted in the text below, this was shot with an Olympus OM-D E-M1, but could as easily have come from any camera lacking a low-pass filter (or having one that's too weak to properly do its job).

The images above show an example of what this can look like in a digital image. These are from shots taken with an Olympus E-M1, but they could as easily have come from any other camera lacking a low-pass filter, such as the Nikon D800E, Nikon D7100, Sony RX1R, or most any medium-format digital camera back. You can clearly see the swirling color patterns in the fabric of the model's dress; take our word for it, the dress itself had no such colors in it. You can see luminance moiré patterns as well.

While clever software can do a surprising amount to eliminate artifacts like these, there will invariably be situations where they're just unavoidable. Once they're there, they can be nearly impossible to get rid of, depending on the subject in question.

This is at the heart of why IR believes eliminating LPFs entirely is a bad idea. You might get away with it 95% of the time, but the remaining 5% can make your life as a photographer truly miserable, with hours of Photoshop work needed to eliminate the visible effects. When aliasing does appear, you'll sorely regret the filter's absence.

On-demand LPF! This is where Pentax comes in, with tech that delivers LPF when you want it, and none when you don't.

The concept isn't entirely new: Nikon recently filed a Japanese patent application for a switchable low-pass filter, although that design isn't without limitations. Neither is Pentax's solution, for that matter, but in our view it has a good bit more to offer.

Let's take a look at how conventional LPF designs work, how Nikon's newly-patented approach works, and then what Pentax is doing. First, conventional LPF systems.

Conventional Low-Pass Filters. As noted above, the job of a low-pass filter is to eliminate too-high spatial frequencies by applying a very controlled blurring to the image before it reaches the sensor's surface. Normally, this is done with slices of birefringent material that double the image slightly, by a controlled amount, in both the horizontal and vertical directions. The illustration below (taken from our Nikon D800E review) shows how this works in practice.

 

Basically, an initial low-pass filter doubles the image horizontally, a "wave plate" reorients the polarization of the light exiting the first LPF, and then a second LPF doubles the image vertically. The final result is an image that's been blurred by a carefully controlled amount in both vertical and horizontal directions. If you blur enough, you'll completely eliminate luminance or chroma moiré patterns and jaggies.
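For the programmers in the audience, here's a small NumPy sketch of ours that mimics the two plates numerically; the one-pixel split distance is purely an assumption for illustration, but it shows how two successive doublings add up to a tiny 2x2 averaging kernel:

```python
# Our numerical stand-in for the two birefringent plates: each one averages
# the image with a copy of itself displaced by one pixel along one axis.
import numpy as np

def birefringent_split(img, axis):
    return 0.5 * (img + np.roll(img, 1, axis=axis))

rng = np.random.default_rng(0)
image = rng.random((6, 6))                          # a stand-in "scene"

after_plate_1 = birefringent_split(image, axis=1)            # horizontal doubling
after_plate_2 = birefringent_split(after_plate_1, axis=0)    # then vertical

# The net result is four quarter-strength copies of the image, offset by one
# pixel horizontally, vertically and diagonally: in other words, a 2x2
# averaging kernel applied at the split distance.
four_copies = 0.25 * (image
                      + np.roll(image, 1, axis=1)
                      + np.roll(image, 1, axis=0)
                      + np.roll(np.roll(image, 1, axis=1), 1, axis=0))
assert np.allclose(after_plate_2, four_copies)
```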

Of course, the catch is that, in the process of eliminating moiré, you've also significantly reduced image sharpness. This explains why camera makers have been progressively weakening LPFs in recent years, and have lately begun eliminating them entirely.

Nikon's adjustable-LPF patent. Nikon made news recently (late August, 2013) with a Japanese patent filing for an adjustable low-pass filter. Nikon's approach inserts a liquid-crystal layer into the optical chain, between the first and second low-pass filter elements. When it's turned off, the liquid-crystal layer rotates the plane of polarization of the light passing through it, with the result that the split image from the first element is recombined when it hits the second, producing no LPF action at all. When it's turned on, the polarization is unchanged, so the second birefringent element splits the image further, producing a low-pass filter effect.

With the liquid-crystal layer disabled, it rotates the polarization of the light, and the second birefringent element simply undoes what the first one did.
When the liquid-crystal layer is activated, the polarization remains the same, so the second element just spreads the split image further.

The diagrams above (courtesy of the Japan Patent Office) show what was just described: On the left, the liquid crystal -- the layer we've colored green -- rotates the polarization of the light by 90 degrees, so that the second birefringent element simply recombines the split image back into a single one again. (This is how the LPF system in the Nikon D800E works; a second element undoes what the first one did. The result is that there's no LPF function, without having changed the optical path compared to a standard D800.)

On the right, the liquid crystal is activated so that there's no change in polarization, and the second filter doesn't recombine the images, but rather shifts the doubled image even further, resulting in an LPF function in the horizontal direction. (Note that the entire system above would have to be duplicated in the vertical direction, in order to produce the combined up/down-left/right doubling of a conventional optical LPF, as in the first diagram above.)
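If it helps to keep the two states straight, here's a toy Python model of our own; the half-strength copies, unit split distance and displacement direction are our simplifications, not details taken from the patent, but the bookkeeping shows how the same second element can either recombine the image or split it further:

```python
# Our toy bookkeeping model of the switchable LPF: track where the two
# half-strength copies of a single point of light end up.  Positions are in
# units of the birefringent split distance.
def split_positions(lc_rotates_polarization):
    # Element 1: the 'e'-polarized copy is displaced by +1, the 'o' copy isn't.
    rays = [{"pos": 0.0, "pol": "o"}, {"pos": 1.0, "pol": "e"}]
    for ray in rays:
        if lc_rotates_polarization:           # the liquid crystal swaps o <-> e
            ray["pol"] = "e" if ray["pol"] == "o" else "o"
        if ray["pol"] == "e":                 # element 2 displaces 'e' rays by +1
            ray["pos"] += 1.0
    return sorted(ray["pos"] for ray in rays)

# LC off (it rotates the polarization): both copies land on the same spot,
# so the image is recombined and there's no LPF effect.
print(split_positions(lc_rotates_polarization=True))    # [1.0, 1.0]
# LC energized (no rotation): the copies end up twice as far apart -> LPF on.
print(split_positions(lc_rotates_polarization=False))   # [0.0, 2.0]
```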

The Nikon patent also describes using this technique to switch between heavier LPF action for video work and a lighter LPF for still capture.

It certainly seems that this could all work, and it'd be a pretty revolutionary advancement in its own right if it were put into practice. But there's no free lunch here; the liquid crystal layer will necessarily introduce some light loss, and to have a full, two-dimensional LPF function, you'd need two of these setups, further reducing light transmission.

Pentax - A mechanical LPF! Now comes Pentax, with an entirely new approach to the whole question of low-pass filtering. A mechanical method!

Pentax, along with Olympus and Sony, has technology for sensor-based image stabilization. These systems shift the sensor back and forth by microscopic amounts to compensate for camera motion, keeping the image in a stable position on the sensor's surface. Basically, they undo blur that would otherwise occur.

But wait a minute: If the IS system can undo blur when we don't want it, why not have it create blur when we want it?

That's exactly how the anti-aliasing scheme in the new Pentax K-3 works. During the exposure, the anti-shake actuators oscillate the sensor assembly in either a circular or linear pattern, shifting it back and forth by a pixel or less in each direction. It thus does pretty much exactly what a conventional LPF system does, but without all the optical complexity -- and it can be turned on or off at will. The amount of motion could even be varied to change the strength of the resulting LPF function. (As far as we know, this latter ability isn't implemented in the K-3, but it should be pretty trivial to add. Firmware upgrade, anyone?)
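To get a feel for what that motion does to the image, here's a back-of-the-envelope Python sketch of our own; the one-pixel orbit radius and the crude nearest-pixel binning are assumptions made for illustration, since Pentax hasn't published the exact motion profile:

```python
# Our back-of-the-envelope model: bin a point source's position over one full
# orbit of the sensor into a tiny point-spread function (PSF).
import numpy as np

def circular_aa_psf(radius_px=1.0, grid=5, samples=3600):
    psf = np.zeros((grid, grid))
    centre = grid // 2
    for theta in np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False):
        col = int(round(centre + radius_px * np.cos(theta)))
        row = int(round(centre + radius_px * np.sin(theta)))
        psf[row, col] += 1.0
    return psf / psf.sum()

print(np.round(circular_aa_psf(), 2))
# Each point of light spreads into a little ring: roughly 1/6 of its energy in
# each neighboring pixel along the axes and 1/12 in each diagonal neighbor,
# the same kind of gentle, controlled sub-pixel blur a birefringent LPF
# produces optically.
```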

According to the briefing we received, the Pentax K-3's mechanical anti-alias system has two different modes of operation, one in which it only moves back and forth along a single axis (called Type 1), and the other in which it oscillates with a circular motion (Type 2). At press time, we don't know the reason for the single-axis mode, as it'll only provide low-pass filtering in one direction.

This illustration shows the Pentax K-3's sensor assembly, which "floats" inside the chassis, its precise position controlled via electromechanical actuators (the rectangular, copper-colored coils located to each side and below the sensor itself). These can move the sensor assembly with sub-pixel accuracy. Normally used for Pentax's "SR" shake reduction feature, in the K-3, the sensor-position actuators can also produce the very slight, deliberate blurring needed to emulate a conventional optical low-pass filter.

 

Whenever we hear about a radical new technology like this, the first thing we ask is "what are the tradeoffs?" In the case of Pentax's mechanical LPF technology, our first question was what range of shutter speeds it would work over. In order to work properly, the entire sensor assembly must complete a full "orbit" of its circular motion during whatever time the shutter is open. We thus expected that there'd be some fairly low maximum shutter speed at which it could operate.

AA Simulator Example 1. This animation cycles through Off, Type 1 and Type 2 settings, showing the anti-aliasing effect on one of our Color Starburst targets. To see the full-res versions, click on: K3hVFA_AA0.JPG, K3hVFA_AA1.JPG and K3hVFA_AA2.JPG on our thumbnails page.

When we asked, though, we were astonished to learn that it was fully functional up to shutter speeds of 1/1000 second, and "partially functional" even higher! This means that the entire sensor assembly has to be oscillating up and down and left to right at something over a thousand times per second. (See the next section below for the results of some further analysis, though, which seems to indicate that the system is only oscillating at about 500 Hz.) While it only has to move a very slight amount in each direction (a pixel or less, which in this case means just 4 microns, or about 15 hundred-thousandths of an inch), we were still surprised that that large a moving mass could be made to oscillate that rapidly. This is something on the order of 50x faster than needed for normal shake-reduction operation, so perhaps Pentax has beefed up the electromagnetic actuator system for the K-3's sensor assembly. (We also suspect they tuned the system to have a natural resonance at the operating frequency of the AA-simulation system.)
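The arithmetic behind that expectation is simple enough to put in a few lines of Python; the 500 Hz figure is just the drive frequency we measured, as described below:

```python
# If the sensor must complete at least one full orbit while the shutter is
# open, the drive frequency has to be at least 1 / (exposure time).
for shutter_s in (1 / 20, 1 / 500, 1 / 1000):
    required_hz = 1.0 / shutter_s
    verdict = "within" if required_hz <= 500 else "beyond"
    print(f"1/{1 / shutter_s:.0f} s exposure needs >= {required_hz:.0f} Hz, "
          f"{verdict} the reach of a 500 Hz drive")
# 1/20 s exposure needs >= 20 Hz, within the reach of a 500 Hz drive
# 1/500 s exposure needs >= 500 Hz, within the reach of a 500 Hz drive
# 1/1000 s exposure needs >= 1000 Hz, beyond the reach of a 500 Hz drive
```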

Another key feature of this Pentax approach to anti-aliasing is that you can vary the amount of blurring pretty arbitrarily. Again, this doesn't appear to be implemented in the Pentax K-3, but the potential is certainly there. Imagine: A smoothly-variable low-pass filter that lets you dial in exactly the amount of filtering you need for different subjects!

AA Simulator Example 2. This animation also cycles through the three settings, showing the blurring effect on the finer elements in our USAF Resolution target (magnified 200%).

Ultimately, it might even be possible to extend this system to video recording as well, though that's not supported in the K-3. You'd need to do something to get rid of the actuator noise, either by using an external mic, or perhaps with some clever signal processing. Given that the AA system's noise frequency is so well-defined (see the audio analysis in the following section), it would be pretty straightforward to knock it out of the camera's audio: a very narrow notch filter could remove it with little or no impact on the rest of the audio spectrum.
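Purely as an illustration of what we mean (this is our own sketch, not anything Pentax has implemented), here's how such a notch filter might look in SciPy; the 48 kHz sample rate and the test tones are arbitrary choices:

```python
# Our own sketch of a narrow 500 Hz notch filter; Q sets how narrow the
# rejection band is.
import numpy as np
from scipy.signal import filtfilt, iirnotch

sample_rate_hz = 48_000                 # assumed audio sample rate
hum_hz = 500.0                          # the AA drive frequency we measured
b, a = iirnotch(w0=hum_hz, Q=30.0, fs=sample_rate_hz)

# Stand-in "soundtrack": two speech/ambience tones plus the 500 Hz hum.
t = np.arange(0, 1.0, 1.0 / sample_rate_hz)
audio = (np.sin(2 * np.pi * 300 * t)
         + np.sin(2 * np.pi * 2000 * t)
         + 0.5 * np.sin(2 * np.pi * hum_hz * t))

cleaned = filtfilt(b, a, audio)         # zero-phase filtering of the clip

spectrum = np.abs(np.fft.rfft(cleaned))
freqs = np.fft.rfftfreq(len(cleaned), 1.0 / sample_rate_hz)
print("residual at 500 Hz:", spectrum[np.argmin(np.abs(freqs - hum_hz))])
print("level at 2 kHz    :", spectrum[np.argmin(np.abs(freqs - 2000.0))])
# The 500 Hz component all but vanishes, while the 2 kHz tone is untouched.
```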

Tradeoffs? The natural question to ask at this point is: what are the tradeoffs? There doesn't seem to be much of a restriction on shutter speed; is there anything else?

Before getting to the negatives, we should note that Pentax's approach has a big advantage over Nikon's in that there's no light loss associated with it, since there's no added liquid-crystal layer, or even a birefringent element anywhere to be found. As we've pointed out, it's also completely variable, whereas the Nikon system's behavior is governed by fixed-size birefringent elements.

When it comes right down to it, about the only downside we see is the sound that the camera generates when the system is active; this is a mid-frequency hum, almost exactly 500 Hz by our measurements. We thought that the frequency of the oscillation might vary depending on the shutter speed in use, but our experiments with a prototype unit showed no change from 1/20 to 1/1,000 second, suggesting that the system is in fact using a natural oscillatory frequency of the sensor/actuator system. The AA system only cranks up while the shutter is open, so the hum is very short in duration, and largely masked by the shutter sound itself, although it's more noticeable when the camera is in continuous-shooting mode. Our prototype sample seemed to disable the function at shutter speeds below 1/20 second, but that could easily change (along with other aspects we've reported on here) when the final firmware is released.

For those interested, here's a link to an MP3 of the Pentax K-3's shutter sound, with the anti-aliasing system enabled.

The image above shows the sound waveform from the K-3's shutter when operating at 1/20 second. You can clearly see the sounds associated with the initial press of the shutter button, the mirror rising, the AA/LPF system cranking up, the shutter opening and closing again, and finally the mirror lowering.

The waveform above shows just the portion of that cycle during which the AA system is operating. We've amplified the signal a fair bit here, so you can see the frequency produced by the AA system itself more clearly.

Finally, here's a spectrum plot of the portion of the waveform where the AA system can be heard without interference from other sounds. You can see the sharp peak from the AA system, right around 500 Hz. (It's 502 Hz in this plot, but others with different sample lengths came out right at about 500 Hz; the exact frequency at which the spike is displayed will vary slightly depending on the length of the sequence analyzed, the type of windowing function used, etc.)
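For the curious, here's a small Python illustration of our own showing why the displayed peak wanders a little: an FFT can only report frequencies on a grid spaced at the sample rate divided by the record length, so the peak snaps to the nearest bin. The 48 kHz rate and the record lengths below are arbitrary examples:

```python
# Our own illustration of FFT bin spacing: the reported peak can only land on
# a frequency grid spaced (sample rate / record length) apart.
import numpy as np

sample_rate_hz = 48_000
true_hz = 500.0

for n_samples in (1_000, 4_800, 19_200):
    t = np.arange(n_samples) / sample_rate_hz
    tone = np.sin(2 * np.pi * true_hz * t) * np.hanning(n_samples)
    freqs = np.fft.rfftfreq(n_samples, 1.0 / sample_rate_hz)
    peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(tone)))]
    print(f"{n_samples:6d} samples: bins every {sample_rate_hz / n_samples:4.1f} Hz,"
          f" peak reported at {peak_hz:5.1f} Hz")
# With a short record the bins are 48 Hz apart, so a true 500 Hz tone shows up
# at the nearest bin (480 Hz); with longer records the bins get finer and the
# reported peak converges on the true 500 Hz.
```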

As a side note, we're a little surprised that the system's operating frequency was 500 Hz rather than 1 kHz or higher, given that Pentax claims full effectiveness up to shutter speeds of 1/1,000 second. It may work well enough that fast, but from what we've seen in the audio signature of the system, it looks to us as though it should only be fully effective up to 1/500 second. We'll experiment with this a little once we receive a camera body running final firmware.

The sound artifact is doubtless why Pentax hasn't tried to use the new AA system for video operation, where it would appear prominently in the soundtrack from in-camera mics, and can even be heard fairly readily from anywhere near the camera in a quiet environment. We have a couple of thoughts on this, though. First, the movement of the sensor assembly could be a good bit slower in video mode, which would both reduce the amplitude of the mechanical vibration and shift it to a much lower frequency, where it might be less obtrusive. The second consideration is that the sensor noise is very consistent and has a very well-defined central frequency. As noted earlier, this would make it relatively trivial to remove from the audio track with a little signal processing. Given this, we don't think it'll be too long before we see this technology applied to video as well.

It's of course possible that the system just won't quite work as advertised, but we'll know that pretty soon, as Pentax USA has said that they expect to receive final firmware for the camera fairly shortly after launch. (Stay tuned...)

Who can follow? While we're sure Pentax has this new AA approach well-protected with patents, there's always the possibility of lucrative licensing arrangements. Given that, it's interesting to speculate on the extent to which this new AA technology might propagate throughout the industry. As we mentioned above, both Olympus and Sony have sensor-based image stabilization tech, so it seems that either could pretty easily adopt an approach like this, if they were able to license the concept from Pentax. Among these three companies, we could see quite a few cameras sporting this technology at some point in the future, and the capabilities of this approach wouldn't be easily replicable by Canon or Nikon.

But what about lens-based stabilization systems? Couldn't this same approach be applied to them as well? Conceptually, there's no reason why you couldn't implement an AA system like this using lens-based IS technology. The idea would be the same: use the IS element to shift the image slightly on the focal plane during the exposure. The devil would very much be in the details, though, as the actuators for the in-lens IS elements would have to be much more robust than most are currently, to achieve operation fast enough to permit reasonable shutter speeds. That could be prohibitive from a weight or power standpoint. The amount of motion would also have to be precisely controllable, and would need to vary with the size of the pixels in the camera the lens was being used with. All this makes us think it's unlikely that we'll see lenses incorporating this AA concept, but we'd be the last ones to definitively rule anything like that out.

Bottom line. It's not often that we label an imaging technology "revolutionary", but if ever one deserved to be called such, Pentax's selectable anti-aliasing filter technology is it. It's a fundamentally different approach to anti-aliasing, and one that appears to have surprisingly few downsides associated with it. For the first time ever, there's a camera on the market that lets the end-user decide when they want a low-pass filter, and when they don't, with little apparent penalty regardless of their choice. And it was Pentax that brought it to us.
