EDGE

Post Script

Jon McKellan, creative director, Observation

-

The prototype for Observation was put together at the beginning of 2016, and quickly attracted interest. Having secured a publisher, developer No Code busied itself during the wait for contracts to be finalised by making Stories Untold, partly as a test bed for creative and technical ideas that would help inform its approach to Observation. Here, studio founder and creative director Jon McKellan discusses how to design computers for console, and finding the balance between narrative control and player freedom.

Although you’re mostly working with computers in Observation, it feels as if the interface was designed with consoles in mind. Was that the case?

We knew it was going to be console right from the start, so we wanted to make sure everything felt accessible. There were a lot of things we had to think about, because we had a similar issue on Stories Untold with the text input, which just doesn’t translate to console. We didn’t want to get caught in that trap where there’s clearly a better platform to play on because its controls are much more suited. So designing each mechanic was about making sure things felt right on a controller. I’m a big fan of sitting on the couch and enjoying the game the way you would watch a film, so making sure that can still be done comfortably was a big deal.

Was first-person exploration always part of the plan?

Yeah, we always planned to have the spheres as part of the gameplay loop, and that they would get introduced gradually. It plays into the idea that to begin with, you feel restricted as an AI. And as SAM starts to evolve, so does the control scheme, and so do your options. So it was always on the cards to give you this limited set of views, and then at certain points to break you out of that and give you some more freedom to find stuff that the cameras can’t see. And vice versa.

By contrast, you’re happy to take control away from the player at certain times, whether it’s cutscenes or locking into a specific camera feed…

We did play around a lot with that, thinking about how much autonomy the player should have, even during our narrative moments. And what we found is that it’s quite a narratively dense game in certain places – where if you miss what’s being said, it may make the next bit more tricky, or you’re not going to understand quite so well what’s happened. When we were doing playtests, people would just start moving the camera around and looking at random objects and not listening to what was being said. At one point, one of our testers said, ‘So what am I supposed to do?’ And we’re like, ‘Well, she just told you, but you were looking at a packet of food somewhere.’ It was a bit of a compromise, really – either we let players have total freedom, but with a much higher chance that they won’t get what they’re supposed to do or what’s happening, or we take a little bit of that freedom away, which makes the next section a little bit easier to get through. And so we went for that option.

It’s about narrative pacing, too, isn’t it? Too much freedom and the story loses some of its momentum.

Yeah. I mean, we didn’t want it to be explicitly linear the entire time. Once you get to the start of the second act at the main central area of the station, you’re given freedom to look around in any camera and solve some of these individual problems that crop up, while looking for audio logs and things like that – that’s where we give you the freedom to explore at your own pace. It’s very difficult to tell a direct story and make sure people get it, while giving them the freedom to do anything they want. Even big open-world games like Red Dead and The Witcher… as soon as you start a narrative mission, you’re locked into those cutscenes and those boundaries. So even in the biggest titles, that seems to be the way to go: when learning what’s happening in the story is the most important part, you focus things in and make sure that’s the experience people get from it. Then when that becomes secondary to the puzzle solving, you let go and let them have fun that way.

“It was always on the cards to give you this limited set of views, and then at certain points to break you out of that”

There’s almost a latent Metroid influence in places, in the sense that SAM has lost his powers…

Absolutely. One of the tropes throughout all AI-based fiction is that the AI can start to improve itself or build upon itself, and that’s one of the big things to be frightened of. And so we wanted to get some layer of that in – that you’re creatively trying to solve a problem and gradually improving yourself as part of that process.

Can you tell us about the casting process?

I was the UI guy on Alien Isolation, and met Kez on that shoot. When we started the project, we had a casting director who started looking for people. We had three different people do voice work at different stages. I’d kept in touch with Kez on Facebook, and we were coming up to some internal milestone just over a year ago. We needed some dialogue, and we couldn’t get the actress we had been using temporarily. So I messaged Kez and said, ‘Have you got a studio? Can you help us out?’ And she did it. And it was perfect. The answer was under our noses the whole time. It helped that she and Anthony [Howell], who plays SAM, had worked together, because Anthony was Samuels in Alien. So they were aware of each other, and I’d heard those voices together before, and that made it easier to direct the dialogue.
