Image stabilisation: eliminating camera shake
iPhone cameras are surprisingly good at cutting out wobbles – but how?
The Steadicam is a Hollywood staple, but not much good for home movies. When cine cameras became small enough to hold in one hand, they began to change the way we recorded family memories, as well as freeing movie directors from the pedestals on which bulky filming rigs were wheeled around.
But there was a visible problem: while most people could hold a stills camera – well, still – for long enough to take a photo, shooting moving pictures resulted in what became known as ‘shakycam’.
While the producers of gritty dramas shrugged and called it a style, audiences – including your in-laws – were inclined to grumble that they felt seasick and couldn’t see what was going on. In the mid-1970s, cameraman Garrett Brown invented a body-mounted, counterweighted gimbal system that used tensioned springs to transfer the operator’s movements to the camera without the jitter. Marketed as the Steadicam, it became a Hollywood staple, but a rig costing tens of thousands of dollars and requiring weeks of training didn’t offer much hope for the quality of home movies.
In the 1990s, SLR camera makers introduced stabilised lenses that used gyroscopic sensors to detect small movements and electromagnets to move a lens element to compensate. Originally designed to prevent blur due to camera shake in still photos, the same technology – generally referred to as optical image stabilisation (OIS) – can be used for video, although the results are less predictable. More recently, mirrorless camera makers have begun applying stabilisation to the image sensor. This can work with unstabilised lenses and potentially allows larger movements to be eliminated.
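To get a feel for the correction involved, here’s a minimal sketch in Python (the function name and figures are illustrative, not from any real OIS firmware): a gyroscopic sensor reports angular velocity, integrating one sample gives a tilt angle, and in a simple pinhole model the lens must shift sideways by roughly the focal length times the tangent of that angle to keep the image steady.

```python
import math

def lens_shift_mm(angular_velocity_dps, dt_s, focal_length_mm):
    """Lateral lens shift that compensates one gyroscope sample.

    Integrates an angular velocity reading (degrees per second) over
    dt_s seconds to get a tilt angle, then converts it to the sideways
    displacement that keeps the projected image stationary in a simple
    pinhole model: shift is roughly focal_length * tan(angle).
    """
    angle_rad = math.radians(angular_velocity_dps * dt_s)
    return focal_length_mm * math.tan(angle_rad)

# A 5 degrees/s hand wobble sampled at 1kHz, with a 4.25mm lens (a
# typical phone focal length), needs well under a micrometre of
# correction per sample - one reason the electromagnets can be so tiny.
shift = lens_shift_mm(5.0, 0.001, 4.25)
```

The tiny per-sample displacement helps explain how the whole mechanism fits inside a phone camera module.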
Stabilisation is described as operating in two to six axes of movement. The full six comprise the three translational axes – X, Y and Z, along which the camera moves left or right, up or down, and backwards or forwards – plus the three angular axes, around which it tilts (pitches), yaws or rolls. Miniaturised systems like those in iPhones and small drones use servo motors, while bulkier gimbals use smoother brushless motors.
As digital sensors took over from film, electronic stabilisation also became feasible. The camera’s software compares each frame to the last to find shifts representing small unintentional movement or vibration, then shifts the pixels back. The same technique can be used in post-production using features like Final Cut Pro X’s Stabilization and Adobe Premiere Pro’s Warp Stabilizer.
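The frame-comparison idea can be sketched in a few lines of Python with NumPy. This is an illustrative brute-force version, not what any real camera app ships: it searches a small range of offsets for the shift that best aligns the new frame with the previous one, then rolls the pixels back.

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Find the (dy, dx) translation that best aligns curr to prev,
    by brute-force search over small offsets, scored with the sum of
    squared pixel differences."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(curr, (-dy, -dx), axis=(0, 1))
            err = np.sum((shifted.astype(float) - prev.astype(float)) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def stabilise(prev, curr):
    """Undo the estimated unintentional shift between two frames."""
    dy, dx = estimate_shift(prev, curr)
    return np.roll(curr, (-dy, -dx), axis=(0, 1))

# Shake a random test frame by (1, 2) pixels, then recover it.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (32, 32))
shaken = np.roll(frame, (1, 2), axis=(0, 1))
restored = stabilise(frame, shaken)
```

Real implementations use much faster techniques such as phase correlation, and crop the frame edges rather than wrapping pixels around as `np.roll` does – which is why digitally stabilised footage is slightly zoomed in.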
Digital stabilisation can’t quite match the results of its optical counterpart: it doesn’t prevent motion blur within each frame, and larger movements expose the rolling shutter effect. Because the sensor records each row of pixels in turn, there’s a time difference between the top and bottom of the frame, which distorts moving objects and creates a wobbling ‘jelly’ effect when the camera itself moves.
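The geometry of that ‘jelly’ effect is easy to model. In this sketch (the function and figures are assumptions for illustration), each scanline is read a fixed interval after the previous one; panning during readout then shifts each row sideways in proportion to when it was captured, shearing vertical edges into slanted ones.

```python
def rolling_shutter_skew(rows, line_time_s, pan_speed_px_s):
    """Per-row horizontal offset when panning during sensor readout.

    Assumes each scanline is read line_time_s after the one above it,
    so row r is captured at time r * line_time_s; a pan at
    pan_speed_px_s pixels/second shifts that row sideways by
    pan_speed_px_s * r * line_time_s pixels.
    """
    return [pan_speed_px_s * r * line_time_s for r in range(rows)]

# 1,080 rows read over roughly 30ms while panning at 500 pixels/s:
# the bottom row ends up about 15 pixels sideways of the top row,
# so a lamppost in shot leans visibly.
skew = rolling_shutter_skew(1080, 0.03 / 1080, 500.0)
```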
Even so, because it doesn’t require any special camera hardware, only processing power, digital stabilisation has become commonplace in phones, helped along by digital signal processor (DSP) chips. It’s the reason why clips from your iPhone look a lot steadier than those from old film reels or videotapes. But optical stabilisation can do even better – if it can be squeezed in.
First for phone cameras
Nokia’s Lumia 920, released in 2012, was the first phone camera with optical image stabilisation, which helped both stills and video. But the 920 was criticised for being big and heavy, and as phones got slimmer and lighter, finding room for physical stabilisers got harder. It was with the launch of 2014’s oversized 6 Plus that the iPhone first offered OIS. An Apple patent application filed that year (granted in 2017 as US9591221B2) describes a coil mounting system that uses tiny electromagnets to shift the lens. Initially this was only optimised for still images, but with the next year’s 6s Plus it was applied to video as well.
Subsequent Plus models and the iPhone X use similar systems. In the 7 Plus and 8 Plus, only the wide-angle lens is optically stabilised; the iPhone X was the first with dual OIS, covering both rear cameras. One catch is that the OIS system can’t be turned off when it isn’t required. Users have found that with external mounts that already eliminate shake, such as a drone or handheld gimbal, Apple’s OIS makes the camera more susceptible to small vibrations rather than less. Some have resorted to locking the iPhone’s lens in place with a magnet. This could affect the hardware over time, although iPhone accessories commonly incorporate magnets, so the risk is likely small.
In 2016, Chinese manufacturer Oppo announced SmartSensor, a system for sensor stabilisation in phones, claiming it’s more precise and energy-efficient than lens-based OIS. We haven’t seen it catch on so far, though.
Optical image stabilisation in iPhones attempts to do the same job as this Steadicam rig, with tiny motors.
Apple’s 2014 patent application illustrates how magnets and springs are fitted around the lens to shift it in response to movement detected by gyroscopic sensors.
Both of the iPhone X’s rear cameras have OIS. A longer focal length magnifies shake, so the telephoto has seven electromagnets instead of four.
In this iFixit.com photo of the iPhone 8 Plus’ cameras, you can see the extra mount on the wide-angle camera (right). Sam Lionheart/iFixit CC BY-NC-SA