Sony’s new Real-Time Tracking Autofocus: A Breakthrough?


posted Tuesday, February 5, 2019 at 7:45 PM EDT


Though the unveiling of the Sony A6400 stole most of the news cycle surrounding Sony's recent announcements, what may have the longer-lasting effect on the photo industry is one piece of technology inside the A6400: Object Recognition Tracking, the foundation of Sony's new Real-Time Tracking autofocus. Also coming to the Sony A9, this new advancement in autofocus, which we had the opportunity to test, is Sony's showcase for why it believes it is so far ahead of the competition. The message was clear: while others struggle to get Eye-AF into their cameras, Sony has already iterated on that concept and developed a brand-new system unlike anything seen before.

Arguably, this is the same kind of evolution we saw when Sony first debuted Eye-AF. And like that first swing at Eye-AF, Real-Time Tracking isn't quite perfect just yet, at least on the A6400. That said, it shows considerable promise. No, more than that: Real-Time Tracking can completely change how photographers shoot, altering parts of the process and how they mentally prepare for a given shooting situation. That is not an overstatement; it really does have that kind of impact.

Real-Time Tracking takes the existing eye detection found in Eye-AF and combines it with pattern detection, depth information and color data. When it works, it can fully isolate a subject in a scene and follow them for as long as the photographer needs. The experience varies, however, depending on which camera you're using.

Last week, for example, we shot with a Sony A9 running a beta version of the v5.0 firmware that adds Real-Time Tracking, and the experience was eye-opening. As a note, we are aware that high-end DSLRs like the Nikon D5 and Canon 1DX II can lock onto a subject in one location and follow it around the frame; they do so by relying on depth information and extrapolating from detected motion. Nikon also integrates color and tone information from its RGB exposure sensor for subject tracking. That data is nowhere near as high-resolution as the color and tone data Sony has to work with, but the idea of using color and tone together with depth has been around for a while. Sony seems to be taking it to the next level, though, especially when combined with Eye-AF information.
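
To make that cue-fusion idea a little more concrete, here is a minimal, purely illustrative sketch of how eye, color and depth signals might be blended to decide which region of a frame belongs to the tracked subject. This is not Sony's actual algorithm; every name, weight and threshold below is an assumption invented for illustration.

    # Illustrative only: a toy fusion of the cues described above (eye detection,
    # color similarity, depth consistency). All weights and thresholds are invented.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        region_id: int
        eye_confidence: float     # 0..1, from an eye/face detector
        color_similarity: float   # 0..1, match against the subject's stored color signature
        depth_consistency: float  # 0..1, agreement with the subject's last known distance

    def fuse_score(c: Candidate, w_eye=0.5, w_color=0.3, w_depth=0.2) -> float:
        """Weighted blend of the cues; the eye cue dominates when it is present."""
        return (w_eye * c.eye_confidence
                + w_color * c.color_similarity
                + w_depth * c.depth_consistency)

    def pick_subject(candidates, min_score=0.4):
        """Return the best-scoring region, or None if tracking should count as lost."""
        best = max(candidates, key=fuse_score, default=None)
        if best is None or fuse_score(best) < min_score:
            return None  # nothing trustworthy; the camera would start to re-acquire
        return best

    # Example: our player (region 1) versus a similarly colored background player (region 2).
    frame = [
        Candidate(region_id=1, eye_confidence=0.8, color_similarity=0.9, depth_consistency=0.9),
        Candidate(region_id=2, eye_confidence=0.1, color_similarity=0.7, depth_consistency=0.4),
    ]
    print(pick_subject(frame).region_id)  # -> 1

The only point of the sketch is the one described above: the eye cue dominates whenever it is available, while color and depth similarity keep the score from collapsing when it is not.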



Instead of doing the work yourself, you can trust the camera to handle the focus calculations while you, the photographer, are free to reframe and move the camera around dynamically. This works best when you assign Real-Time Tracking activation to a separate button (like back-button AF) while the camera is set to the Wide focus area. That way, the whole sensor is used to watch the entire scene, and you can frame anywhere within it. As the shooter, I could follow the action at a shorter focal length and then activate tracking when I wanted to follow just one person in that action, moving the camera any way I pleased without ever losing focus on my subject. The entire range of the telephoto zoom, from its widest to its longest focal length, was open to me in any framing. It was freeing.

A note on this: in this configuration you're relying on the camera to identify the subject based on eye, face, shape, depth and so on, which means the desired subject needs to be somewhere central or otherwise isolated when you start tracking. If you're using Real-Time Tracking AF by itself (which is less dynamic but more deliberate), you can manually select anywhere on screen to start the tracking process; in Wide area AF mode, though, you need to find a way to initially isolate your intended subject so the camera can lock onto it. Once locked, however, the camera does an excellent job of holding that lock for long periods of time.

I want to show you a perfect example of where Real-Time Tracking made a huge difference, going beyond what Eye-AF alone can do and letting me make sure I got the shot even though I'm nowhere close to being a sports photographer.

In this sequence, which makes up all of the images in the remainder of this article, I barely moved my camera at all; instead, I activated Real-Time Tracking to follow a basketball player as he moved from the top of the key up to and through a dunk. That single action took well over 20 frames and lasted less than three seconds in real time, but this sample of images from the burst should show why the technology is so powerful.


In the first frame, the subject is in focus, locked on the eyes, on the far left of my viewfinder. As he moves to the basket, he fully turns around to where his eyes are hidden from me and I can only see the back of his head. After he completes a full rotation, he moves quickly from that position upwards to the hoop. By the time he has completed the motion, he is now fully on the center-right side of my viewfinder.


At no point during this move does the A9 lose track of him or what about him is important: his face. Even when he turns, the back of his head is perfectly in focus and as soon as his eyes return to the frame, tracking re-acquires his eyes and follows him fully through the action.

You might think this is nothing special, since Eye-AF is already very powerful and would theoretically give you the same result. After all, Eye-AF is the dominant technology keeping this whole series in focus to begin with, right?


Well, not exactly. You see, when the basketball player turns his back fully to the camera, Eye-AF no longer has a subject. If that were all the camera had available, it would drop AF and start to seek. Odds are high that it would find the other player in the background and attempt to lock on there. Regardless, it wouldn't easily be able to continue to follow the same player without an eye to lock on to. 

That is why Sony decided it needed to build Real-Time Tracking. At its best, Real-Time Tracking works hand-in-hand with Eye-AF to follow anyone through any sort of situation. That is the goal, and it is why Real-Time Tracking and Eye-AF are not separate functions in the autofocus menu: they are designed to be used in tandem.
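
To illustrate the hand-off just described, here is a small, hypothetical sketch of how an eye-first, tracking-second priority might be ordered. Again, this is not Sony's firmware logic; the function, names and thresholds are our own assumptions.

    # Hypothetical sketch of the Eye-AF / Real-Time Tracking hand-off described above.
    # When the eye disappears (the player's back is turned), focus falls back to the
    # tracked object region instead of re-seeking. Names and thresholds are invented.

    def choose_focus_target(eye_box, tracked_box, eye_confidence, track_confidence):
        """Prefer the detected eye; otherwise stay with the tracked subject region."""
        EYE_MIN = 0.6    # assumed confidence needed before trusting the eye detection
        TRACK_MIN = 0.3  # assumed confidence below which tracking counts as lost

        if eye_box is not None and eye_confidence >= EYE_MIN:
            return eye_box       # Eye-AF takes over whenever the eye is visible
        if tracked_box is not None and track_confidence >= TRACK_MIN:
            return tracked_box   # the back of the head, the jersey, etc. holds the lock
        return None              # only now would the camera drop AF and start to seek

    # Example: mid-dunk, eyes hidden, tracking still confident -> focus stays put.
    print(choose_focus_target(eye_box=None, tracked_box=(410, 120, 80, 80),
                              eye_confidence=0.0, track_confidence=0.85))
    # -> (410, 120, 80, 80)

The order matters: the eye is always preferred when it is visible, but losing the eye alone is never enough to make the camera start hunting.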


Sony did, however, separate them in one case: video. For reasons that they could not say (that's a direct quote: "we cannot say."), Sony does not allow Eye-AF to be active while shooting video. However, you can use Real-Time Tracking in video, which fully relies on Sony's "AI" technology to track and follow subjects.

On the A9, the experience is pretty spot on. In multiple situations I was able to follow a single basketball player for long periods of time without ever losing focus on them. The A9 was already very good about locking focus on one subject even when someone or something passed between it and the camera, but it's even more impressive now with this next-generation tracking.


This kind of computation must take a considerable amount of horsepower inside the camera, though, and as a result the shooting experience on the A9 was wholly different from that on the A6400.

We'll be going into more detail on the A6400 in our full first field test, because we're still deciding how we feel about its autofocus performance with Real-Time Tracking. The reason we are still formulating our opinions two weeks later is that the shooting experience with the A6400 was very different from that with the A9, and not just because the A9 shot more frames. Mainly, it felt like the A6400 lacked the horsepower to flawlessly track, display the results of that tracking (that is, update the display to show which AF points were in use), capture the images and write the data all at the same time. There is likely a bottleneck at the sensor-readout level: the A9 has a far more advanced sensor than the A6400, thanks to its stacked design, with sensor, memory and processing circuitry all integrated into the sensor package itself.

The result is a somewhat choppy experience on the A6400. Without getting too deep into it here, there were situations where it did not acquire or hold focus as well as the A9, its display was not quite right (the view in the EVF often showed incorrect autofocus points as the camera struggled to keep the display in line with the AF points it was actually using), and video focus performance dipped. With the A9, the lock very rarely wavered away from the intended subject. With the A6400, the camera appeared to lose tracking quite frequently (although in many cases it may actually have still been locked, with only the display of the AF points lagging badly), and attempts to re-acquire subjects with object recognition were sometimes very challenging. So while the A9 had almost no problem staying locked even without Eye-AF backing up Real-Time Tracking, the A6400 struggled quite a bit more.

That's to be expected, though. The A6400 is an entry-level camera, while the A9 is the most advanced camera Sony has ever made. There was bound to be a difference in performance, and the A9 is clearly the best example right now of what can be accomplished with Sony's new technology. But the A6400's struggles show that this new tech takes quite a bit of hardware prowess to function at its peak, so don't automatically expect every Sony camera that gets this feature to perform at its best until you've seen test results.

Sony is going to keep iterating on this technology and improving it, just as it did with Eye-AF to make that feature as good as it is today. From what we saw on the A9, which was running a very early beta of the firmware, this is an extremely impressive first shot at "AI" autofocus from Sony, and it strongly positions the company as one of the early innovators here, alongside Olympus and its first attempt in the E-M1X. No one asked for this feature, but watch it become as popular and as in-demand as Eye-AF has become as the years go by.


[A side note: While Sony calls its new tech "AI" autofocus, it's not clear to what extent it actually uses the kind of deep-learning technology people usually mean when they say "AI" these days. That sort of capability is clearly coming, but "AI" has become a marketing buzzword with no clear definition, and we're not sure that what we're seeing in Sony's Real-Time Tracking really qualifies for the label.]