Q&A FACE-AGEING TECH HELPS HUNT FOR MISSING PEOPLE
How the University of Bradford professor Hassan Ugail is improving the accuracy of algorithms for ageing photos
WHAT ARE YOU going to look like in ten, 20 or 30 years? Professor Hassan Ugail knows. His team at the University of Bradford’s Centre for Visual Computing has developed a machine-learning algorithm to age photos of missing people, taking into account not only their facial features but also their background, culture, lifestyle and even how their family ages.
Making an accurate image of a person who has been missing for years can help track them down, but previous models have mostly taken facial features alone into account, assuming all humans age in the same way. Professor Ugail instead used machine learning to teach his system the different ways in which we age, improving accuracy.
As a test case, Ugail’s team progressed photos of Ben Needham, who went missing from the Greek island of Kos in 1991, in the hope that more accurate images might help track down the vanished toddler. We spoke to Professor Ugail to understand how the system works.
■ What’s new about your technique?
We’ve used a database of 30,000 people with their ages, backgrounds and culture so the computer learns how humans age and how to individualise it. All humans age in a certain way, though there are differences. We were interested in modelling what those trends might be, mathematically and algorithmically.
Facial ageing isn’t a new topic; we’re not the first people to look at this problem. There have been algorithms available for ageing for decades… the cutting edge before us was the assumption that all humans age in a very similar way, which is obviously wrong – there might be people who constantly drink or take drugs… they will age very fast. So there’s lifestyle, culture, habit – things like that.
Our programme tries to take those differences into account. That means we take in not only a person’s face but also their siblings, parents and grandparents, to get a better result. We’ve developed a nonlinear model that’s individualised. It’s not 100% accurate – computers can never be 100% accurate [with such predictions] – but it’s more accurate than previous algorithms.
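The individualised model itself isn’t described in detail here, but the core idea – a nonlinear population ageing trend scaled by per-person factors – can be sketched in a few lines. Everything below (the saturating curve, the `lifestyle` and `heredity` multipliers) is an illustrative assumption for this article, not the Bradford team’s actual algorithm:

```python
import math

def apparent_age(age, lifestyle=1.0, heredity=1.0):
    """Toy individualised ageing model.

    A nonlinear (saturating) population trend is scaled by
    per-person multipliers: lifestyle > 1.0 means faster apparent
    ageing (e.g. heavy drinking), heredity reflects how the
    person's family tends to age. Purely illustrative.
    """
    trend = 60.0 * (1.0 - math.exp(-age / 40.0))  # sublinear in age
    return trend * lifestyle * heredity
```

The saturating exponential is one simple way to make the model nonlinear: doubling chronological age less than doubles the predicted change, which a straight-line model could not express.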
■ How do you test the accuracy of an algorithm?
We would take a picture of you and recreate what you would have looked like at about four or five [years of age], then take an actual photo of you from that age and compare the two. If there’s minimal difference – for example, if it passes through facial recognition as you – then we know our algorithm is pretty accurate. That’s how we can test it: by running the algorithm backwards.
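This back-testing loop – regress a photo to childhood, then check whether face recognition still matches it to a real childhood photo – can be mocked up with feature vectors and cosine similarity. The vectors and the 0.8 threshold below are made-up placeholders for whatever embeddings a real face-recognition model would produce:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def passes_as_same_person(predicted, actual, threshold=0.8):
    """The accuracy test: does the regressed image's embedding
    match the real childhood photo's embedding closely enough
    that a recognition system would call them the same person?"""
    return cosine_similarity(predicted, actual) >= threshold
```

In a real pipeline the embeddings would come from a trained face-recognition network, and the match threshold would be calibrated on labelled pairs; only the comparison step is shown here.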
■ Who will use this?
We’re talking to various policing and charity organisations, and we’ve done some examples with Ben Needham. Interestingly, our results are quite different from what the police have produced so far, and we think ours are much more accurate. We’re beginning to have discussions [on other cases], mainly for finding missing people, though there are other applications. We’ve done work with the BBC for entertainment purposes, looking at a photo of yourself when you’re ten years older, but the main application lies in finding missing people.
■ Will you make an app so the rest of us can see our future?
We’re computer scientists, so we tend not to develop software. If a company is interested in working with us to develop this, we’d be interested in it, but it’s not really what we do. We’re here to solve unsolved problems.
■ What’s next?
We want to create an emotional analysis to predict people’s emotions in real time. That uses simple things like blink rate. Counting blink rate through a computer tells you a lot about a person’s state – the more you blink, the more tired you are, for example. We’re also looking at the face to identify what diseases or illnesses a person might have. This sort of machine learning applied to the face is a very powerful way to predict many things.
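Blink counting from video reduces to detecting closed-to-open transitions in a per-frame eye-state signal. A minimal sketch, assuming some upstream detector has already classified each frame as eye-open or eye-closed (the detector itself is not shown):

```python
def count_blinks(eye_closed):
    """Count completed blinks in a per-frame boolean signal
    (True = eye closed in that frame)."""
    blinks, prev = 0, False
    for closed in eye_closed:
        if prev and not closed:  # eye reopens: one completed blink
            blinks += 1
        prev = closed
    return blinks

def blinks_per_minute(eye_closed, fps):
    """Convert a blink count over len(eye_closed)/fps seconds
    into a per-minute rate, the signal Ugail describes using
    as a simple fatigue indicator."""
    return count_blinks(eye_closed) * 60.0 * fps / len(eye_closed)
```

A fatigue estimate would then be a matter of comparing the measured rate against a calibrated baseline, since blink rate rises with tiredness.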