How to: Take great photos with the Camera app

Boost your photography skills with Glenn Fleishman’s tips

iPad & iPhone User - How To

When it comes to photography with the new iPhones, all the attention goes to the hardware and the improved sensors that can reduce noise and produce photos with better definition in more circumstances. But let’s not overlook the software: iOS 11 offers at least one advantage to photographers across all recent iPhone models. And all these features require rethinking how you frame a shot, time your shutter press, or use natural light.

Live Photo captures can now be manipulated and converted into video or a new kind of still in a few ways through Photos. This works for all iPhones and iPads that support the Live Photo option.

Apple has offered high-dynamic range (HDR) capture for years, which synthesizes a single image from a rapid-fire series of different exposures. But in the latest release, Apple is confident enough about the quality on the iPhone 8 and 8 Plus that it no longer retains the ‘normal’ photo by default. (You can re-enable this option.)
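Apple doesn’t document how its HDR synthesis works, but the underlying idea of merging bracketed exposures can be sketched in a few lines. The weighting curve and the pixel values below are invented for illustration; this is a toy model of exposure fusion, not Apple’s pipeline:

```python
import math

# Each "exposure" is a list of pixel brightness values in 0.0-1.0.
# Well-exposed pixels (near mid-grey) get a high weight; blown-out
# highlights and crushed shadows get very little say in the result.

def well_exposedness(p, mid=0.5, sigma=0.2):
    """Gaussian weight peaking at mid-grey (hypothetical curve)."""
    return math.exp(-((p - mid) ** 2) / (2 * sigma ** 2))

def fuse(exposures):
    """Per-pixel weighted average across the bracketed exposures."""
    fused = []
    for pixels in zip(*exposures):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights)
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

under = [0.05, 0.10, 0.45]   # dark frame keeps highlight detail
normal = [0.20, 0.50, 0.90]
over = [0.60, 0.95, 1.00]    # bright frame keeps shadow detail

result = fuse([under, normal, over])
```

Each fused pixel lands between the darkest and brightest capture of it, pulled toward whichever exposure rendered that spot closest to mid-grey, which is why HDR keeps detail at both ends of the tonal range.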

The iPhone 8 and 8 Plus also had a seemingly small hardware upgrade to the Quad-LED True Tone flash system that makes flash photographs look fantastically better than with previous models, approaching something that previously required a separate flash on a mirrorless or DSLR camera. Finally, the iPhone 8 Plus enhances Portrait mode with new studio lighting options that work best when you spend a little time finding the optimal background and light conditions.

Live Photo gets a kick

Live Photo seemed like a gimmick at first, but iOS 11 finally makes it into something worth experimenting with. Live images are still captured the same way: tapping the Live button in the Camera app (if it’s not already enabled and showing yellow) captures a total of three seconds of images before and after the point at which you tap the shutter-release button.

Once captured, you can edit in Photos in iOS by selecting the image and then swiping up. This reveals the four available options for Effects: Live, Loop, Bounce, and Long Exposure. You can then tap Edit for additional controls, including changing the key image or trimming the long exposure range. Likewise in Photos 3 for macOS High Sierra, you double-click a Live Photo, click Edit, and then see options in a popup menu at the bottom alongside the trim control.

The Live option retains the original crop of your photo as you saw it in the Camera preview. Switching to any of the other three modes, however, crops to a greater or lesser degree on each side to make sure the same area of the photo appears and is stable across all the frames, so the resulting video or exposure doesn’t jump all over the place or show jagged uncaptured edges.
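The reason for that crop is simple geometry: once each frame is shifted into alignment, only the area covered by every frame can be kept. A minimal sketch, with per-frame pixel offsets standing in for whatever motion estimate iOS actually computes:

```python
# Hypothetical model: each frame's camera shake is reduced to a single
# (dx, dy) translation in pixels. After aligning all frames, the usable
# output is the rectangle that every shifted frame still covers.

def common_crop(width, height, offsets):
    """Return (left, top, right, bottom) of the region shared by all frames."""
    left = max(0, *(dx for dx, _ in offsets))
    top = max(0, *(dy for _, dy in offsets))
    right = min(width, *(width + dx for dx, _ in offsets))
    bottom = min(height, *(height + dy for _, dy in offsets))
    return left, top, right, bottom

# A 100x100 burst where the camera drifted a few pixels between frames:
crop = common_crop(100, 100, [(0, 0), (5, -3), (-2, 4)])
```

The bigger the shake, the smaller the shared rectangle, which is why bracing the phone preserves more of the original framing in Loop, Bounce, and Long Exposure.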

This requires some planning, as you can’t capture in those modes. You have to visualize the crop while shooting. If you’re not trying to grab a unique moment, like a live event or a rare bird, you can shoot and then use Photos in iOS to view the effect and get a better sense of how the crop will be handled. Then you can shoot the same scene again, or repeatedly, to nail down what you want.

Long Exposure is a particularly interesting and difficult mode, because there’s no way to adjust the speed or duration of a Live Photo. As a consequence, only elements in a photo that move at a certain rate relative to the capture will create a long exposure that feels like it has a purpose.
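At its simplest, the effect amounts to blending the Live Photo’s frames together: anything static reinforces itself, anything moving smears into a blur. A minimal sketch with invented greyscale values (Apple’s actual blend is surely more sophisticated):

```python
# Toy long exposure: average a burst of equally-sized greyscale frames.
# Static pixels stay sharp; changing pixels converge on a smooth blur.

def long_exposure(frames):
    """Average a list of frames pixel by pixel."""
    n = len(frames)
    return [sum(pixels) / n for pixels in zip(*frames)]

# Three frames: pixel 0 is a static rock, pixel 1 is rippling water.
frames = [
    [0.30, 0.10],
    [0.30, 0.90],
    [0.30, 0.50],
]
blended = long_exposure(frames)  # rock stays 0.30, water settles at 0.50
```

This is also why motion speed matters: water that barely moves across three seconds averages to nearly the same image, while very fast motion averages to a featureless haze.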

You also need to consider camera movement. With better lighting, a faster shutter speed and larger aperture reduce the effects of camera movement, especially with optical stabilization on any iPhone equipped with it. In lower light conditions, a slower exposure already amplifies or blurs movement, making the long exposure much less crisp. For particular shots, using a monopod or tiny tripod could dramatically improve the effect for long exposure, as well as for Bounce and Loop videos.

I shot a number of Live Photo exposures at the Seattle Japanese Garden, a park near my home, as it’s full of light and dark, and moving and still water. Photos of a rapid trickle of water, a slower-moving area, and a large pool each produce different results: the rapidly moving water becomes a haze above rocks; the slower-moving rivulet shows whirlpools and motions; the pool’s ripples disappear and it becomes almost supernaturally still.

Long Exposure effectively makes a photo appear to have lower resolution, and zooming in reveals artifacts instead of detail. As a result, it’s better to use these for online display at smaller sizes than as ‘hero’ images that might fill a browser window or for printing.

HDR by default

I’ve been shooting with Apple’s HDR since it was introduced, but it’s only the change in the iPhone 8 and 8 Plus to retain only the HDR image by default that has had me re-evaluate the way I frame, adjust exposure, and react to pictures with an extended dynamic range.

The iPhone can’t yet preview HDR, although I imagine that’s in the future, possibly requiring more advanced image processing hardware or a second wide-angle lens. (The future of smartphone photography is probably more than two lenses.) The Camera app shows only a single exposure, rather than a rolling combination of different exposures as in a post-processed HDR.

To get the best results for non-spontaneous moments, you should shoot and immediately look at the results to see how well the range gets captured, especially when very dark or light areas are in the frame. A daytime sky will almost always look blown out in the Camera preview, but the HDR result usually retains a fair amount of tonal detail.

The Camera app doesn’t always shoot HDR: it only does so when it detects that sufficient detail will be blown out or shifted to black. I’ve noticed that HDR in iOS 11 tends to be darker than I expect, especially when I’ve tapped to set exposure in the frame and then taken the picture, as the HDR synthesis overrides an exposure setting. This means with some shots, you’ll need to plan for post-capture tonal adjustment. However, because HDR provides more tonal range, you have more ‘space’ out of which to carve the right balance without blowing out or filling in details.

Shooting with flash on an iPhone 8 and 8 Plus

I confess I haven’t used the flash on an iPhone for years, until now. With every new model, I’ll shoot some tests and, dissatisfied, flip the switch from Auto to Off and leave it there. Tipped by pictures taken by Matthew Panzarino, the editor in chief of TechCrunch (and once a full-time professional photographer), I gave the new ‘Quad-LED True Tone with Slow Sync’ flash a chance with an iPhone 8 Plus.

Slow sync is an old technique that effectively combines flash and a slow shutter speed. This illuminates a subject in the foreground with the flash, while the longer exposure time captures enough light from the background to show detail. Previously, the iPhone only used flash to flood a scene and take a shot exposed only as long as necessary to capture the closest foreground image.
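The arithmetic behind slow sync is worth seeing once. Ambient light accumulates for the whole shutter time, while the flash pulse is so brief that the subject’s exposure barely depends on shutter speed. The numbers below are illustrative, not Apple’s actual exposure values:

```python
# Back-of-the-envelope model of slow sync (all values hypothetical).
# Ambient contribution scales linearly with shutter time; the flash is a
# fixed pulse, scaled by how much of it reaches a given distance
# (effectively zero for a far background).

def relative_exposure(ambient, flash, shutter_s, flash_reach):
    return ambient * shutter_s + flash * flash_reach

ambient, flash = 2.0, 0.5

# Ordinary fast flash shot at 1/60s: the background gets almost no light.
fast_bg = relative_exposure(ambient, flash, 1 / 60, flash_reach=0.0)

# Slow sync at 1/4s: the same flash lights the subject, but the background
# collects 15x more ambient light, so its detail survives.
slow_bg = relative_exposure(ambient, flash, 1 / 4, flash_reach=0.0)
```

That 15x background gain is exactly the difference between a black void behind your subject and a night scene with visible detail, and it’s why the technique demands a braced camera: everything ambient-lit is being exposed at that slow shutter speed.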

This new feature, which has essentially no controls in Camera to modify, turns otherwise poor night shots that would be grainy or blurry into vibrant ones with crisp detail. The deeper sensors in the iPhone 8 and 8 Plus (and X) cameras help here, too, as they reduce noise in low-light conditions.

The right scenes work well with slow sync, especially if you have a person, animal, or object roughly 3m away from you and a background that recedes far further. For best results, you need a tripod or monopod, or a way to brace yourself to avoid motion, and the foreground subject has to be relatively (but not perfectly) still. The background can be still or have motion.

The flash also seems to work better in my testing even when it’s not dark indoors or out, providing some additional fill in the foreground without flooding the entire scene. Image stabilization pairs neatly with slow sync, because it makes it easier to capture a sharp photo in low light even with a slower shutter.

Setting up for Portrait Lighting

Our colleague at Macworld, Adam Patrick Murray, also a professional photographer, ran down head-to-head iPhone 7 Plus and 8 Plus comparisons on photographic strengths, like colour and clarity. But I have a few additional tips after shooting in a number of informal settings, like a restaurant and a small conference.

The two-camera iPhones capture images simultaneously, and then iOS calculates a depth map by identifying the diverging location of the same objects in receding planes. There’s some machine learning as part of this that assists in object recognition, because simple divergence wouldn’t allow for full boundary detection.
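That divergence trick is classic stereo vision: a nearby object shifts more between the two lenses than a distant one, and depth falls out of that shift (the disparity). The focal length and lens spacing below are invented, not the iPhone’s calibration:

```python
# Pinhole stereo relation: depth = focal_length * baseline / disparity.
# focal_px is the focal length in pixels, baseline_mm the distance
# between the two lenses; both values here are hypothetical.

def depth_from_disparity(disparity_px, focal_px=2800.0, baseline_mm=10.0):
    """Convert a pixel shift between the two views into a depth in mm."""
    return focal_px * baseline_mm / disparity_px

near = depth_from_disparity(56.0)   # large shift: subject at 500mm
far = depth_from_disparity(7.0)     # small shift: background at 4000mm
```

The inverse relationship also explains why the machine learning assist matters: at distance, a whole range of depths collapses into fractions of a pixel of disparity, so geometry alone can’t cleanly separate a subject’s outline from what’s behind it.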

Apple’s depth map calculates nine layers as separate planes. Portrait mode uses on-screen cues to have you place your subject at the correct distance from the camera to be in one of those planes, but distinct enough from the background to distinguish the receding planes behind them.
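Turning a continuous depth map into nine layers is a simple quantization step. Apple hasn’t published how it slices the range; the equal-width binning below is an invented stand-in that just shows the idea:

```python
# Hypothetical sketch: assign each depth value to one of nine layers by
# slicing the scene's depth range into equal-width bins.

def to_planes(depths, n_planes=9):
    """Map each depth to a plane index from 0 (nearest) to n_planes-1."""
    lo, hi = min(depths), max(depths)
    width = (hi - lo) / n_planes or 1.0   # guard against a flat scene
    return [min(int((d - lo) / width), n_planes - 1) for d in depths]

depths = [0.5, 0.6, 2.0, 4.5, 8.9, 9.0]   # metres, nearest to farthest
planes = to_planes(depths)
```

Note how the two nearest depths land in the same plane: that is the failure mode Portrait mode’s on-screen cues guard against, since a subject sharing a plane with clutter can’t be cleanly separated from it.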

For the various lighting effects only available on the iPhone 8 Plus, machine learning assists more than in the original Portrait option in the iPhone 7 Plus, which is still available there and in the 8 Plus. It adds fake lighting as a kind of visual effect onto a portrait to create the sense of studio lighting.

I’ve found in a variety of lighting and background conditions that you have to think like an AI about those planes to mentally preview how the camera will analyse them. Portrait works worse against a plain, uncluttered background, because there’s less to pick up on to separate depths into planes. A busier background actually works better, especially one that has varying lighting on it.

For Stage Light (colour and mono), iOS uses some facial identification to ensure the right framing. At an event, I was trying to take a Studio Light photo of a colleague who has a marvellous Mohican. She was in profile to show it off, and we couldn’t get the camera to recognize her. I had her shift to look head on, and it immediately worked in the same setting. However, in another setting, I was able to get a profile of a friend without any such problem.

Fellows with little hair like myself tend to fare badly, too. In photos a colleague has taken of me with an 8 Plus, and that I’ve taken of others with cropped or absent hair, the feature unkindly removes even more of our follicles, sometimes leaving us with lumpy heads.

Lighting matters quite a bit. In darker settings, Portrait can produce an image that looks blocky rather than crisp, as it’s making up for the telephoto lens’s relatively slow f-stop by dropping in digitally zoomed detail from the wide-angle lens. With enough light, though, the Portrait image is crisp. I captured two friends in the same restaurant, and one was perfectly detailed and the other a blended blur.

Two captures in the same venue a few feet apart: at left, crisp and detailed; at right, digitally zoomed and blurry

Even on a large outdoor shot at night, the slow sync flash created a less noisy image, while allowing a foreground highlight

HDR shots tend to capture images a little darker than I expect in iOS 11, often requiring some adjustment

You can combine flash and Long Exposure to get interesting night-time effects

Long Exposure via Live Photo used to capture four different flows of water at the Seattle Japanese Garden
