
Ben Minto on Star Wars Battlefront’s sound design: “We put a sound in with one in a million chance of it playing”

Ben Minto, former audio director and sound designer at EA Dice and now company director and supervising sound designer at Sweet Justice Sound, reflects on bringing the sound of Star Wars Battlefront to life, the fans’ favourite sonic moments, his pet peeves in game sound, and how immersive audio is changing the gaming landscape.

You were the sound director for Star Wars Battlefront at EA Dice, which has elements that we all recognise as being a part of the franchise’s sonic universe. Where did you start in terms of mapping out a sonic foundation for the landscape in order to design the sound?

The interesting thing is the first Battlefront came out just before The Force Awakens, so it was this idea of – let's go back to the original trilogy from the ‘70s and early ‘80s. I think I was one of the few people on the team who was actually born before the original Star Wars films came out!

There's always this idea of: how do you up-res things? How do you take things that were obviously recorded on tape – maybe in mono, without a lot of variety – take those roots, and then expand them to bring up the fidelity overall?

We were super fortunate early on to go over and visit Skywalker Sound at the ranch, to ask Ben Burtt and Matt Wood how they would approach it, and whether we could have access to the film stems and the original recordings.

If we asked for material and they had it, they would supply it to us. So we had these ingredients – the original, iconic Star Wars sounds – to start from, but we needed to expand and build upon them to cover all the areas needed in the game.

The titles I was working on before were the Battlefield games – modern military hardware, but grounded in a world where sound propagates realistically, where there's destruction, and where people interact with their environments.

So if you take your starting ingredients – in that case military hardware – and replace them with Star Wars’ iconic sounds, you can use a very similar process: “Okay, here's the blaster sound, but here's how we ground the blaster in the world; here's how we affect the sound of the blaster with distance so it stays readable as a blaster that's about 50 metres away, inside a building”.
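As a purely illustrative sketch of that kind of distance treatment – not DICE's actual pipeline, and with every name and value invented – the usual idea is attenuation plus an increasingly dull top end, with a pre-rendered “distant” layer faded in so the sound still reads as a blaster:

```cpp
#include <algorithm>

// Hypothetical distance model for a blaster shot: quieter and duller with
// distance, with a distant layer crossfaded in for readability.
// All constants are illustrative placeholders, not tuned values.
struct DistanceTreatment {
    float gain;        // linear gain on the close-up layer
    float lowpassHz;   // low-pass cutoff simulating air absorption
    float distantMix;  // blend amount for the pre-rendered distant layer
};

DistanceTreatment treatmentForDistance(float metres) {
    // Inverse-distance attenuation, clamped so close shots don't blow up.
    float gain = std::min(1.0f, 10.0f / std::max(metres, 1.0f));
    // Full bandwidth up close, rolling off to ~2 kHz by 200 m.
    float lowpassHz = std::max(2000.0f, 20000.0f - metres * 90.0f);
    // Crossfade to the distant layer between 25 m and 100 m.
    float distantMix = std::clamp((metres - 25.0f) / 75.0f, 0.0f, 1.0f);
    return {gain, lowpassHz, distantMix};
}
```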

People said, “Don't you feel really constrained?” And actually that was great. It's like, “Okay, it has to sound like the films” – you don't have to reinvent anything, you don't have to come up with a brand new lightsaber sound. What you have to try and do is make the game sound like the film, which obviously is a challenge in itself!

What you have to try and do is make the game sound like the film.

Did Battlefront see you visit any weird and wonderful places all in the name of capturing some interesting sounds?

Yeah, I was super fortunate. On the first one, we covered four locations in the Star Wars universe: Endor, Tatooine, Hoth, and then a planet which is part of the lore, but not part of the films as such.

I did end up in the Redwood Forest to record ambiences for Endor, and I had two weeks recording in Iceland. The original Hoth scenes were shot in Norway; I went up to the top of Sweden in the middle of winter and got most of those sounds there. For EA Dice's 20th anniversary, they flew the entire studio to Dubai for a week, and I got to spend some time in the desert recording the material needed for Tatooine.

There've been such great books about how Ben Burtt did a lot of the original sounds – and we obviously tried to recreate a lot of those as well. I went to a guy-wire – the one where you hit it and it goes “twaaang” – to try and recreate the blasters. 

Also, because we weren't allowed to kill R2-D2, we had to come up with a new voice for the R-series droid that sits in the back of your X-wing when you're playing.

So I read how Ben Burtt did it originally. I was actually on paternity leave for a while, so I was basically coaxing my son Atticus to make lots of cooing sounds, which I then processed in the same way Ben did for the original film. So yeah, I put my son in the game, which was pretty good. He'll appreciate that one day!

Did you slip the famous Wilhelm scream in there somewhere?

Of course! You have to. The way it works in the game is you have a bank of screams – I think there are 100, from all the different voice actors. What you want to try and do is find the right conditions.

So if a person falls from above 10 metres and reaches a certain speed, then there's a chance they do the Wilhelm scream.

There is another one as well: the famous Tarzan scream that Chewie does in Return of the Jedi, and again in The Clone Wars. We put that one in with, I think, a one in a million chance of it playing – so somebody would have heard it somewhere. I’ve never heard it!
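A minimal sketch of that condition-plus-probability gate might look like the following. All names are hypothetical, and the 5% Wilhelm chance is a made-up placeholder – only the fall-height condition and the one-in-a-million figure come from the interview:

```cpp
#include <random>

// Hypothetical death-scream selector: rare easter eggs are gated by a
// probability roll (and, for the Wilhelm, by the right kind of death).
enum class Scream { Normal, Wilhelm, ChewieTarzan };

Scream pickDeathScream(float fallHeightMetres, std::mt19937& rng) {
    std::uniform_real_distribution<double> roll(0.0, 1.0);

    // One-in-a-million chance on any death, per the interview.
    if (roll(rng) < 1.0 / 1000000.0)
        return Scream::ChewieTarzan;

    // Wilhelm only fits a long fall; the 5% chance is a placeholder.
    if (fallHeightMetres > 10.0f && roll(rng) < 0.05)
        return Scream::Wilhelm;

    return Scream::Normal;  // otherwise pick from the ~100-scream bank
}
```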

We put Chewie's Tarzan scream in with one in a million chance of it playing.

Fans are so invested in game sound design; in Battlefront’s trailer someone in the comments picked up that the music matches perfectly with blaster shots and explosions at one point. Are you surprised by fans’ attention to detail when it comes to game sound?

Not really, because say you make a film and it's two or three hours long, and people say, “I've seen that film 10 times”. That's 30 hours. With some of these online multiplayer titles, people put in hundreds of hours, sometimes even thousands. 

Playing games isn't a passive experience – it's active. The soundscape is giving you vital information about what's going on in the game, so people usually listen deeper – they're always trying to pull information out.

We do read the comments and things like that. In the first game we had Darth Vader, and when he went below 10% health, somebody suggested that we switch over to his laboured breathing from the end of Return of the Jedi – spoilers! – where they take his helmet off and he's wheezing away.

It's like, “Wow! Such a great idea, and very simple to implement”. So we added that in the next patch, and it carried over to the next title as well.

We had to come up with a new voice for the R-series droid in the back of your X-wing.

What are some of Battlefront fans’ favourite sounds in the game?

In Star Wars there was a sound for a thing called a Thermal Imploder, which is a device a bit like the sonic mines that come out of the back of – if I get this wrong, please don't crucify me! – Jango Fett’s ship in the prequels.

It sort of drops, but the great thing about it from a sound point of view is that it sucks all the sound in before it detonates. We knew it had to sound a bit like the sonic mine, but we had to record some other elements as well, like a type of plucked-string sound.

Ben Burtt, who made the original sound, has never said how he did it. I did ask him once, and he said, “Can't I keep some secrets?” Sure you can! But I think we did it justice – and it was the most talked about sound in the game. 

People have ripped it and made ringtones, memes, all that kind of thing. So it's nice when a really large audience recognises the sounds.

I did end up in the Redwood Forest to record ambiences for Endor.

PS5 has Tempest Audio and Xbox Series X and S utilise Dolby Atmos. How do advancements in immersive audio in gaming affect your approach to sound design?

Battlefront one was actually the first title to ship with Dolby Atmos. The interesting thing in games is that if there is an object above you, we attach the sound to the object – so a helicopter flying overhead already has the height information on it. What we have to do for traditional formats is compress that height information down and play it on a horizontal plane, whether that's stereo, 5.1 or 7.1.

So for us, it was almost like removing a part of the end process – the sound already existed there and there were a few things we did just to increase the amount of information in the height field.
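To make that fold-down concrete, here's a hedged sketch – not the Frostbite implementation, with invented types and conventions – of keeping a 3D object's position intact for height-capable outputs and projecting it onto the listener plane otherwise:

```cpp
#include <cmath>

// Hypothetical fold-down: keep elevation for Atmos/Tempest-style outputs,
// otherwise project onto the horizontal plane while preserving distance,
// so attenuation and panning still behave. Convention: y forward, z up.
struct Vec3 { float x, y, z; };

Vec3 positionForOutput(Vec3 p, bool outputHasHeight) {
    if (outputHasHeight)
        return p;  // the object renderer uses the elevation directly

    float dist   = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
    float planar = std::sqrt(p.x * p.x + p.y * p.y);
    if (planar < 1e-4f)
        return {0.0f, dist, 0.0f};  // directly overhead: park it in front
    float scale = dist / planar;    // stretch so distance is preserved
    return {p.x * scale, p.y * scale, 0.0f};
}
```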

The first implementation we did was Atmos over HDMI – the format where you feed it into an amp, which then breaks it out into the discrete speakers. After that, because there were different systems on Sony, Xbox and PC, we actually developed our own 3D solution as well – with Peter Bliss, who's based at Criterion in the UK – so we could be platform-agnostic.

This helps at the mixing stage, because we just mix to that one target, and it can then be translated onto any platform we work on down the line. So the EA Dice titles currently running on the Frostbite engine use that 3D system, and it translates to the Tempest system and to Atmos as well.

The soundscape is giving you vital information about what's going on in the game, so people usually listen deeper.

Has it been a steep learning curve to adapt to immersive game sound design?

The most interesting thing is the change from channel-based to object-based thinking. When you're mastering channel-based audio, a 5.1 stream means six channels to master at the same time.

We had 64 objects, and we reserved twelve of them for the traditional speaker placement of a 7.1.4 bed. All of a sudden you've got 52 objects left, and you have to master them in some way.

So if you want to filter them, instead of being able to just filter, say, six channels, you have to filter each object, which obviously gets very expensive from a DSP point of view, because then you're running 52 processing units instead of six. Just getting the computational budget to be able to do that was quite a challenge.
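As a toy illustration of that cost – not actual engine code – a channel-based master runs one filter per speaker channel, while an object-based master needs one per object:

```cpp
#include <vector>

// Toy one-pole low-pass; a real mastering filter would be far heavier,
// which is exactly why multiplying the instance count hurts.
struct OnePoleLowpass {
    float a = 0.1f;     // coefficient derived from the cutoff
    float state = 0.0f;
    float process(float in) { state += a * (in - state); return state; }
};

// Channel-based 5.1 master: six filter instances.
std::vector<OnePoleLowpass> channelMaster(6);

// Object-based master: one instance per dynamic object (52 in Minto's
// example), so the same EQ move costs roughly nine times the DSP.
std::vector<OnePoleLowpass> objectMaster(52);
```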

Initially, there was the question of how you handle reverb. It's quite easy to use a surround reverb or something like that, but at runtime, if there's an object and that object is cast into the reverb space, where do you mix the reverb to? It was really quite fundamental questions like that that we had to work around.
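The interview doesn't say which answer DICE landed on, but one common pattern is to send each object into a shared reverb bus and render the diffuse wet signal onto the static bed rather than per object – sketched here with hypothetical types:

```cpp
// Hypothetical routing: objects contribute a mono send to a shared reverb
// bus; the wet signal is later decoded onto the fixed 7.1.4 bed objects,
// so only the dry signal is positioned per object.
struct AudioObject {
    float sendLevel;  // how much of this object feeds the reverb
};

void routeObject(const AudioObject& obj, const float* dry,
                 float* reverbBus, int frames) {
    for (int i = 0; i < frames; ++i)
        reverbBus[i] += dry[i] * obj.sendLevel;
    // The dry buffer carries on to the object renderer with its 3D
    // position; the reverb bus is mixed into the bed after processing.
}
```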

One of the most interesting things, and I think a lot of creative people get scared about this, is the use of AI.

What are some of the biggest developments in game sound design you've seen since you've been doing the job?

There always seem to be struggles with each console generation. Initially it was things like the amount of sound RAM you had: on the first titles I worked on you had about half a megabyte, then it went to two megabytes, then to eight. These days we have around half a gigabyte, so it doesn't really become an issue anymore, because you have systems just for pulling in the sounds you need and then replacing them with new material once they've played.
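A hedged sketch of that load-and-replace idea – the loader API here is invented for illustration, not any real engine's interface:

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical streaming bank: sounds are pulled into RAM when needed and
// released once they've played, so a fixed budget is reused continuously.
class StreamingSoundBank {
public:
    explicit StreamingSoundBank(std::size_t budgetBytes) : budget_(budgetBytes) {}

    const std::vector<uint8_t>* acquire(const std::string& name) {
        auto it = resident_.find(name);
        if (it == resident_.end()) {
            std::vector<uint8_t> data = loadFromDisk(name);  // placeholder I/O
            if (used_ + data.size() > budget_)
                return nullptr;  // over budget: a real system would evict here
            used_ += data.size();
            it = resident_.emplace(name, std::move(data)).first;
        }
        return &it->second;
    }

    // Called when playback finishes: hand the memory back to the budget.
    void release(const std::string& name) {
        auto it = resident_.find(name);
        if (it != resident_.end()) {
            used_ -= it->second.size();
            resident_.erase(it);
        }
    }

private:
    std::vector<uint8_t> loadFromDisk(const std::string&) { return {}; }
    std::unordered_map<std::string, std::vector<uint8_t>> resident_;
    std::size_t budget_;
    std::size_t used_ = 0;
};
```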

Then the next challenge was the amount of disk space. So if you started to do 5.1 music or have lots and lots of music, it was about how much you needed to compress it.

I think the challenges these days are that a lot of the systems we work on can be quite complex, and it's actually maintaining those. In early games I may have had to work on, say, 10 weapons, whereas now a game may ship with 200. If you think it takes a week to do a weapon, well, I haven't really got 200 weeks available! That's a huge amount of time.

So you have to build systems that use more shared content and are clever in how they build things – especially thinking towards live service: if you need to add 10 more weapons, you can't always say, “Okay, that's going to be an extra 10 weeks' worth of work”.
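A tiny sketch of that shared-content idea, with invented names: each new weapon only brings its unique layer, and everything else is reused across the arsenal.

```cpp
#include <string>
#include <vector>

// Hypothetical data-driven weapon sound: one unique recording per weapon,
// assembled with layers shared across the whole arsenal.
struct WeaponSound {
    std::string uniqueShotLayer;            // the per-weapon material
    std::vector<std::string> sharedLayers;  // mechanics, tails, reflections
};

WeaponSound makeBlaster(const std::string& shotLayer) {
    // Adding weapon #201 means authoring shotLayer, not a full sound set.
    return {shotLayer, {"mech_generic", "tail_exterior", "tail_interior"}};
}
```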

The other challenge is that team sizes have grown absolutely huge. On games I worked on in the past, I was almost a one-person sound team – I could do the sound design, implementation and mixing, obviously using external VO and music talent as well.

Whereas on a game Sweet Justice has recently worked on, I think the audio team – including the internal team at Sony and all the additional developers – was way over 100 people.

Ben Burtt, who made the original sound, has never said how he did it.

Are you highly sensitive to noticing mistakes, or things you would never do, in the audio design of games that you haven’t worked on?

Yes! It's just pet peeves and things that don't feel right – things like sound propagation through the world. The simplest ones are usually problems with variety: you might only have one or two versions of a sound effect, which is very common in old games, so you almost get the “clip clop, clip clop, clip clop” as somebody takes footsteps.
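The standard fix for that, sketched below with illustrative names, is to record several variations and pick randomly without repeating the one that just played:

```cpp
#include <random>

// No-repeat random selection: choose from N footstep variations,
// excluding the index that played last, so "clip clop" loops can't form.
int pickFootstepVariation(int variationCount, int lastPlayed, std::mt19937& rng) {
    if (variationCount < 2)
        return 0;  // the one-or-two-sample case Minto complains about
    if (lastPlayed < 0 || lastPlayed >= variationCount) {
        std::uniform_int_distribution<int> full(0, variationCount - 1);
        return full(rng);  // first play: nothing to exclude
    }
    std::uniform_int_distribution<int> dist(0, variationCount - 2);
    int pick = dist(rng);
    if (pick >= lastPlayed)
        ++pick;  // shift past the excluded index
    return pick;
}
```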

One thing I notice is when people use pitch-shifting on metal, because to me it's almost saying that the size of the object is changing – like getting different pitches out of different-sized bells. So when I hear that, I'm like, “Oh, no, you could have done it a different way” – or don't even bother altering the pitch.

I've got a huge list of them, but a lot of the work I do towards the end of a game is trying to find all these things and smooth them out.

What would you say the future looks like for game sound design?

I think one of the most interesting things at the moment, and I think a lot of creative people get scared about this, is the use of AI and content generation with AI as well. 

Will I always need to go out and record fresh footsteps for every title I work on, or will a model eventually exist that can work out “here's a footstep on clay, on grass, in puddles” and actually generate that at runtime?

I think it's not going to replace our jobs, but it's going to change what our jobs are. Some of the work we do will be handled by it, and we will almost become the people asking the questions of the AI – the orchestrators, or that sort of conductor of it.

Over time, obviously AI will learn more and be able to take more of that work on board and then what we do will shift again as well.

There are systems that are already able to do voice synthesis based on either previous recordings of a voice actor or from a script, or actually take a performance from a different person and then map that to an AI voice as well. 

Voice is probably one of the first that's going to be easy-ish to do from an audio point of view, then probably music, because that's better understood. And then I think sound design and mixing later on.

When he went below 10% health somebody suggested that we switch over to his laboured breathing.

Tell us about Sweet Justice Sound – how different is your role there from your role at EA Dice?

The thing that brought me here was that there were some similarities. The interesting thing for me about Sweet Justice was that I'd worked with both of the original founders at different times in my career. Chris Sweetman I'd worked with early on, and he's a long-term friend. Then later, at EA Dice, Sam Justice joined us as a junior working on Battlefield 4.

They were both starting their journeys as freelance sound designers at the same time. It was like, maybe you should get together: you're called Chris Sweetman, and you're called Sam Justice – just start a company called Sweet Justice and help each other out!

That was about seven years ago, and they've grown the company, taken on really amazing projects and delivered astounding audio. Towards the end of the pandemic I checked in with them, both being peers, and they just said, “Why don't you come and join us and help us run the company?” So I said, “Yeah, sure!”

In early games I may have had to work on 10 weapons, whereas now a game may ship with 200 weapons.

What is Sweet Justice Sound working on at the moment?

Since I've been at Sweet Justice, they've shipped things like God of War Ragnarök, and worked on Halo, Horizon Forbidden West, Dead Space, Returnal – a tonne of awesome games.

Another main area I've been looking at is a new offshoot of Sweet Justice called Sweege Tech. As well as Sweet Justice doing the traditional cinematic and linear work, there's also the game implementation side – and we've now got a new division that will develop tech to help us with game audio.

Tell us about this new game audio tech…

Initially, we're working with Epic's UE5 MetaSounds technology. Why it's super interesting to us is that it's what's called a graph-based audio engine. Visually, it looks a little bit like Max/MSP or Pure Data, or even a modular synth. It's a visual form of programming, in a way.

My 15 years of experience at EA Dice was based on an engine called Frostbite, which has a very similar way of handling audio – this graph-based system – whereas the current and previous generations of audio tools generally available on the market aren't based on that.

So for me, it's coming from an area where that was proprietary tech that only a few people across the world could use, whereas now Epic have decided to introduce it in MetaSounds, which is going to be available to everybody.

We actually developed our own 3D solution so we could be platform-agnostic.

EA Dice has over 300 Genelec speakers and is upgrading two Atmos rooms to The Ones for LCR, with the rest on GLM. We’re guessing you’re very familiar with Genelecs for your game sound design work?

Very much so. When I started at EA Dice, they had Genelecs from about 2008 and then we upgraded them with the SAM system.

I think most studios ended up on 8330s and 8340s by the time I was there, and then when we moved to Atmos, we started drilling holes in the ceiling and sticking some more up there. They're heavy speakers. It's a bit scary having those above your head to be honest!

Battlefront one was actually the first title to ship with Dolby Atmos.

What is it about Genelec that is such a crucial part of your sound design work on these games and how important is that when you’re working so closely on intricately mixed game sound design?

The first thing when working on something is just the confidence that what you're hearing is correct, and that you're not going to take it somewhere else and be surprised – like, “There's too much bass in this”, or “The high end is mushy”, or there's a TV tone coming through.

They need to be transparent, but they also need to be accurate as well so that you can just get on with the work and not worry. If I go somewhere else, the same thing has to translate between different rooms.

We don't typically work with a shared platform like Pro Tools or Reaper or another DAW, the way the post-production industry does. Our shared space is obviously the game engine. If I make something in my room, somebody else should be able to play it in their room and it sounds the same – there should be no surprises.

When you have many different contributors towards the game, where there's a final mix that needs to be done, you will need to adhere to certain standards regarding loudness and frequency content.

I think I stuck out like a bit of a sore thumb when I joined EA Dice, because I turned up with my Dynaudio BM6As – which I'd travelled around with for quite a while – and everybody else was on Genelecs.

So there was always a bit of a translation thing. But I got to know the Genelec people in Sweden, and they came by the office one day – EA Dice was buying lots and lots of Genelecs and upgrading the rooms – and they left me a pair of The Ones and said, “How would you like to try these?”

It was a pair of 8341s and I loved them because for some reason they remind me of the Pet Shop Boys [laughs]. I don't know why, maybe just the white speaker with the black bit in the middle. That's so Pet Shop Boys! They just looked so different and a bit alien.

I was like, “Yeah, sure, I'll plug them in”. I did the whole thing of going through denial like, “Sure, they’re probably 5% better. They sound different”. But then it made me start to question my previous setup and where I was sitting and everything like that. 

I went through a whole period of doubt and then eventually was just like, “Okay, you're not taking these back. Sorry, they’re staying here with me”. I've been a fan ever since.

If I make something in my room, somebody else should be able to play it and it sounds the same – there should be no surprises.

Do you have Genelecs in your home studio as well?

Yes, I started with a stereo pair of 8341s and then as I've expanded into a new room, I did LCR with 8351s – so the next size up – working on things like explosions and guns. 

I try not to monitor too loud, but occasionally you do need to, so I like having the extra power of the 8351s. For the sub, they did recommend the 7380, but I went for the 7370 in the end.