iPad&iPhone user

How to: Take great photos with the Camera app

Boost your photography skills with Glenn Fleishman's tips


When it comes to photography with the new iPhones, all the attention goes to the hardware and the improved sensors that can reduce noise and produce photos with better definition in more circumstances. But let's not overlook the software: iOS 11 offers at least one advantage to photographers across all recent iPhone models. And all of these features require rethinking how you frame a shot, when you press the shutter, and how you use natural light.

Live Photo captures can now be manipulated and converted into video or a new kind of still in a few ways through Photos. This works for all iPhones and iPads that support the Live Photo option.

Apple has offered high dynamic range (HDR) capture for years, which synthesizes a single image from a rapid-fire series of different exposures. But in the latest release, Apple is confident enough about the quality on the iPhone 8 and 8 Plus that it no longer retains the 'normal' photo by default. (You can re-enable this option.)

The iPhone 8 and 8 Plus also had a seemingly small hardware upgrade to the Quad-LED True Tone flash system that makes flash photographs look fantastically better than with previous models, approaching something that previously required a separate flash on a mirrorless or DSLR camera. Finally, the iPhone 8 Plus enhances Portrait mode with new studio lighting options that work best when you spend a little time finding the optimal background and light conditions.

Live Photo gets a kick

Live Photo seemed like a gimmick at first, but iOS 11 finally makes it into something worth experimenting with. Live images are still captured the same way: tapping the Live button in the Camera app (if it's not already enabled and showing yellow) captures a total of three seconds of images before and after the point at which you tap the shutter-release button.
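
For readers curious about what happens under the hood, apps request Live Photo captures through Apple's AVFoundation framework. The Swift sketch below is an illustration only: it assumes a camera session that is already configured and running, and the class name and temporary file path are mine, not Apple's.

import AVFoundation

// A minimal sketch of Live Photo capture, assuming a configured and running
// AVCaptureSession with this photo output attached. Names are illustrative.
final class LivePhotoCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func captureLivePhoto() {
        // Live Photo capture has to be switched on before shooting, and only
        // works on devices and sessions that support it.
        guard photoOutput.isLivePhotoCaptureSupported else { return }
        photoOutput.isLivePhotoCaptureEnabled = true

        let settings = AVCapturePhotoSettings()
        // The movie component (the seconds around the shutter press) is
        // written to a temporary file you supply.
        settings.livePhotoMovieFileURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")

        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // The still image component arrives here.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        _ = data // save alongside the movie file via PhotoKit to form the Live Photo
    }

    // The movie component arrives here once recording around the shutter ends.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingLivePhotoToMovieFileAt outputFileURL: URL,
                     duration: CMTime, photoDisplayTime: CMTime,
                     resolvedSettings: AVCaptureResolvedPhotoSettings,
                     error: Error?) {
        // Pair outputFileURL with the still's data when saving the asset.
    }
}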

Once captured, you can edit in Photos in iOS by selecting the image and then swiping up. This reveals the four available options for Effects: Live, Loop, Bounce, and Long Exposure. You can then tap Edit for additional controls, including changing the key image or trimming the long exposure range. Likewise, in Photos 3 for macOS High Sierra, you double-click a Live Photo, click Edit, and then see the options in a pop-up menu at the bottom alongside the trim control.

The Live option retains the original crop of your photo as you saw it in the Camera preview. Switching to any of the other three modes, however, crops to a greater or lesser degree on each side to make sure the same area of the photo appears, and is stable, across all the frames, so the resulting video or exposure doesn't jump all over the place or show jagged, uncaptured edges.

This requires some planning, as you can't capture directly in those modes; you have to visualize the crop while shooting. If you're not trying to grab a unique moment, like a live event or a rare bird, you can shoot and then use Photos in iOS to view the effect and get a better sense of how it will be handled. Then you can shoot the same scene again, or repeatedly, to nail down what you want.

Long Exposure is a particularly interesting and difficult mode, because there's no way to adjust the speed or duration of a Live Photo. As a consequence, only elements in a photo that move at a certain rate relative to the capture will create a long exposure that feels like it has a purpose.

You also need to consider camera movement. With better lighting, a faster shutter speed and larger aperture reduce the effects of camera movement, especially with optical stabilization on any iPhone equipped with it. In lower light conditions, a slower exposure already amplifies or blurs movement, making the long exposure much less crisp. For particular shots, using a monopod or tiny tripod could dramatically improve the effect for long exposure, as well as for Bounce and Loop videos.

I shot a number of Live Photo exposures at the Seattle Japanese Garden, a park near my home, as it's full of light and dark, and moving and still water. Photos of a rapid trickle of water, a slower-moving area, and a large pool each produce different results: the rapidly moving water becomes a haze above rocks; the slower-moving rivulet shows whirlpools and motions; the pool's ripples disappear and it becomes almost supernaturally still.
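
The Long Exposure effect essentially averages the frames of the Live Photo, so moving water smears into a haze while static rocks reinforce and stay sharp. Here's a rough Swift illustration of that idea using Core Image; it is not Apple's implementation, and it assumes you've already pulled the Live Photo's video frames into an array of CIImages.

import CoreGraphics
import CoreImage

// Averages a stack of frames to simulate a long exposure. Illustrative only;
// `frames` is assumed to already hold the Live Photo's video frames.
func averageFrames(_ frames: [CIImage]) -> CIImage? {
    guard let first = frames.first else { return nil }
    let weight = CGFloat(1.0 / Double(frames.count))

    // Scale a frame's colour channels by 1/N so the summed stack averages out.
    func scaled(_ image: CIImage) -> CIImage {
        let filter = CIFilter(name: "CIColorMatrix")!
        filter.setValue(image, forKey: kCIInputImageKey)
        filter.setValue(CIVector(x: weight, y: 0, z: 0, w: 0), forKey: "inputRVector")
        filter.setValue(CIVector(x: 0, y: weight, z: 0, w: 0), forKey: "inputGVector")
        filter.setValue(CIVector(x: 0, y: 0, z: weight, w: 0), forKey: "inputBVector")
        return filter.outputImage!
    }

    // Add the scaled frames together; static areas reinforce, moving ones blur.
    var accumulated = scaled(first)
    for frame in frames.dropFirst() {
        let add = CIFilter(name: "CIAdditionCompositing")!
        add.setValue(scaled(frame), forKey: kCIInputImageKey)
        add.setValue(accumulated, forKey: kCIInputBackgroundImageKey)
        accumulated = add.outputImage!
    }
    return accumulated
}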

Long Exposure effectively makes a photo appear to have lower resolution, and zooming in reveals artifacts instead of detail. As a result, it's better to use these for online display at smaller sizes than as 'hero' images that might fill a browser window or for printing.

HDR by default

I've been shooting with Apple's HDR since it was introduced, but it's only the change in the iPhone 8 and 8 Plus to retain only the HDR image by default that has had me re-evaluate the way I frame, adjust exposure, and react to pictures with an extended dynamic range.

The iPhone can’t yet preview HDR, although I imagine that’s in the future, possibly requiring

more advanced image processing hardware or a second wide-angle lens. (The future of smartphone photograph­y is probably more than two lenses.) The Camera app shows only a single exposure, rather than a rolling combinatio­n of different exposures as in a post-processed HDR.

To get the best results for non-spontaneous moments, you should shoot and immediately look at the results to see how well the range was captured, especially when very dark or light areas are in the frame. A daytime sky, for instance, will almost always look blown out in the Camera preview, but the HDR result nearly always shows a fair amount of tonal detail.
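
Apple doesn't open the Camera app's HDR pipeline to other software, but the underlying idea, merging several bracketed exposures of the same scene, can be sketched in Swift with AVFoundation. The snippet below simply requests under-, normally and over-exposed frames; it assumes a configured, running capture session, and the merge step is left out.

import AVFoundation

// Requests an exposure bracket (the raw material for an HDR merge).
// Assumes `photoOutput` belongs to a configured, running session; combining
// the returned frames into one wide-tonal-range image is not shown.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // Under-, normally and over-exposed frames, two stops apart.
    let biases: [Float] = [-2.0, 0.0, 2.0]
    let bracketed = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }
    guard bracketed.count <= photoOutput.maxBracketedCapturePhotoCount else { return }

    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // no RAW component
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
        bracketedSettings: bracketed)

    photoOutput.capturePhoto(with: settings, delegate: delegate)
    // Each frame is delivered to photoOutput(_:didFinishProcessingPhoto:error:).
}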

The Camera app doesn’t always shoot HDR: it only does so when it detects sufficient detail will be blown out or shifted to black. I’ve noticed that HDR in iOS 11 tends to be darker than I expect, especially when I’ve tapped to set exposure in the frame and then taken the picture, as the HDR synthesis overrides an exposure setting. This means with some shots, you’ll need to plan for post-capture tonal adjustment. However, because HDR provides more tonal range, you have more ‘space’ out of which to carve the right balance without blowing out or filling in details. Shooting with flash on an iPhone 8 and 8 Plus I confess I haven’t used the flash on an iPhone for years, until now. With every new model, I’ll shoot some tests and, dissatisfi­ed, flip the switch from Auto to Off and leave it there. Tipped by pictures taken by Matthew Panzarino, the editor in chief of TechCrunch (and once a full-time profession­al photograph­er), I gave the new

‘Quad-LED True Tone with Slow Sync’ a chance with an iPhone 8 Plus.

Slow sync is an old technique that effectively combines flash and a slow shutter speed. This illuminates a subject in the foreground with the flash, while the longer exposure time captures enough light from the background to show detail. Previously, the iPhone used the flash simply to flood a scene, exposing the shot only as long as necessary to capture the closest foreground subject.

This new feature, which essentially has no controls in Camera to modify, turns otherwise poor night shots that would be grainy or blurry into vibrant ones with crisp detail. The deeper sensors in the iPhone 8 and 8 Plus (and X) cameras help here, too, as they reduce noise in low-light conditions.
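
Slow sync isn't something developers can switch on; the Camera app applies it automatically on supporting hardware. But the principle, a longer shutter plus a burst of flash, can be approximated in Swift with AVFoundation. This is a rough sketch with illustrative values, assuming `device` is the active camera and `photoOutput` belongs to a running session.

import AVFoundation

// Approximates the slow-sync idea: hold the shutter open longer for the dim
// background, then fire the flash to lift the nearby subject out of the dark.
// Illustrative only; the exposure duration must fall within the device's limits.
func captureFlashWithLongShutter(device: AVCaptureDevice,
                                 photoOutput: AVCapturePhotoOutput,
                                 delegate: AVCapturePhotoCaptureDelegate) throws {
    try device.lockForConfiguration()
    let duration = CMTime(value: 1, timescale: 4) // roughly a quarter of a second
    device.setExposureModeCustom(duration: duration,
                                 iso: device.activeFormat.minISO,
                                 completionHandler: nil)
    device.unlockForConfiguration()

    let settings = AVCapturePhotoSettings()
    if device.isFlashAvailable {
        settings.flashMode = .on // the flash exposes the foreground subject
    }
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}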

The right scenes work well with slow sync, especially if you have a person, animal, or object roughly 3m away from you and a background that recedes much further. For best results, you need a tripod or monopod, or a way to brace yourself to avoid motion, and the foreground subject has to be relatively (but not perfectly) still. The background can be still or have motion.

The flash also seems to work better in my testing even when it's not dark indoors or out, providing some additional fill in the foreground without flooding the entire scene. Image stabilization pairs neatly with slow sync, because it makes it easier to capture a sharp photo in low light even with a slower shutter.

Setting up for Portrait Lighting

Our colleague at Macworld, Adam Patrick Murray, also a professional photographer, ran down head-to-head iPhone 7 Plus and 8 Plus comparisons of photographic strengths, such as colour and clarity. But I have a few additional tips after shooting in a number of informal settings, like a restaurant and a small conference.

The two-camera iPhones capture images simultaneously, and then iOS calculates a depth map by identifying the diverging location of the same objects in receding planes. There's some machine learning as part of this that assists in object recognition, because simple divergence wouldn't allow for full boundary detection.

Apple’s depth map calculates nine layers as separate planes. Portrait mode uses on-screen cues to have you place your subject in the correct distance from the camera to be in one of those planes, but distinct enough from the background to distinguis­h the receding planes behind them.

For the various lighting effects available only on the iPhone 8 Plus, machine learning assists more than in the original Portrait option introduced on the iPhone 7 Plus (and still available on both models). It adds a fake low light as a kind of visual effect on top of a portrait to create the sense of studio lighting.

I’ve found in a variety of lighting and background conditions that you have to think like an AI about those planes to mentally preview how the camera will analyse them. Portrait works worse against a plain, uncluttere­d background, because there’s less to pick up on to separate depths into planes. A busier background actually works better, especially one that has varying lighting on it.

For Stage Light (colour and mono), iOS uses some facial identification to ensure the right framing. At an event, I was trying to take a Stage Light photo of a colleague who has a marvellous Mohican. She was in profile to show it off, and we couldn't get the camera to recognize her. I had her shift to look head on, and it immediately worked in the same setting. However, in another setting, I was able to get a profile of a friend without any such problem.

Fellows with little hair, like me, tend to fare badly, too. In photos a colleague has taken of me with an 8 Plus, and in ones I've taken of others with cropped or absent hair, the feature unkindly removes even more of our follicles, sometimes leaving us with lumpy heads.

Lighting matters quite a bit. In darker settings, Portrait can produce an image that looks blocky rather than crisp, as it’s making up for the telephoto lens’s relatively slow f-stop by dropping in digitally zoomed detail from the wide-angle lens. With enough light, though, the Portrait is crisp. I captured two friends in the same restaurant, and one was perfectly detailed and the other a blended blur.

Two captures in the same venue a few feet apart: at left, crisp and detailed; at right, digitally zoomed and blurry

Even on a large outdoor shot at night, the slow sync flash created a less noisy image, while allowing a foreground highlight

HDR shots tend to capture images a little darker than I expect in iOS 11, often requiring some adjustment

You can combine flash and Long Exposure to get interesting night-time effects

Long Exposure via Live Photo used to capture four different flows of water at the Seattle Japanese Garden
