Ready player one

Jim Fowler (JF): I come from quite a varied musical background: I studied Jazz performance at Leeds College of Music but after that moved into writing rather than performing, doing lots of theatre work in Newcastle. My wife and I actually set up our own touring children’s theatre company and I wrote some plays. It was very interesting as I was writing songs for children to listen to, rather than to sing. You don’t have to patronise kids so I could expose them to crazy styles. I then studied sound design and music at Bournemouth under Professor Stephen Deutsch who was an excellent teacher. The experience was really immersive and there was lots of interaction with other disciplines such as animation and TV.

One thing that stayed with me was the concept that the process of making sound doesn’t really matter; it’s the end result that people hear. My first video games job was transcribing songs for SingStar. This wasn’t before audio-to-MIDI conversion existed, but it was actually quicker to do it by hand because we just needed the tune and we didn’t always manage to get an a cappella. The longest part of that process was always breaking up the lyrics to fit the melodies. Foreign songs were a real challenge. During that time I was always pestering Alistair Lindsay [Head of Audio – Sony Interactive] for more games music jobs. I actually ended up running the whole music content team, which was a great crash course in games development.

Ready player two

Joe Thwaites (JT): It’s actually kind of a similar story for me, as I was also at Leeds uni doing music for film and television, followed by a sound design master’s at Bournemouth. After that I moved into freelancing, with a stint in post-production and sound engineering. I’d met Alistair [Lindsay] at the Game Developers Conference during the ‘GANG Composers Challenge’, where composers submit a piece of music and sit on a panel to chat about it. He was in the audience, and two years later he had a music production assistant job available which I jumped at. That was my foot in the door. The first big project for me was “Book of Spells”, which needed some pretty crazy music systems, so I was designing those and writing temp music simultaneously to test the programming. When it came time to implement the final music, the development team liked what I’d done and wanted more of it. Luckily Jim was there to help me out and we’ve been writing together ever since.

The multiplayer game

JF: We’re working on a bunch of things at the moment but the main thing is London Studio’s “Blood & Truth”, a VR action game. The music is based around the idea of combining the sounds of modern London with traditional scoring, so that’s resulted in a combination of grime music and live orchestra. We recorded finished orchestral pieces upfront and then brought grime artists into the studio to collaborate. The cool thing about working in-house for PlayStation is that it allows us to be involved in the game-making process from very early on. It gives us the freedom to be a bit more experimental, designing the music systems and writing the music itself. Lots of planning is done in advance and the team is just upstairs, so there’s a great deal of collaboration and not many arguments. Sometimes the sound designers ask us to change musical elements as they conflict with FX, but otherwise it’s pretty amicable.

JT: A good example is “Book of Spells”, which has an insanely complex set of rules for the music. There’s no way that the music system for those games could’ve been built separately from the composing process. It wasn’t a traditional score but was written like a journey through classical music, starting with early romantic and stepping forward in time with each new chapter of the book. It seemed a great idea when we came up with it, but we soon realised that you can’t write authentic music from those eras without key modulations and tempo changes, both of which are huge challenges for interactive music. I built a system that kept track of home key, tempo and modulations, so when you turned a page the score would keep up and change accordingly. There were some transitions that had something like 20 different available options. That’s the sort of thing that would be impossible to do without a composer who’s intimately linked to the programming side.

JF: Although cymbal rolls and tremolo swells cover a multitude of sins!
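
To make the idea concrete, here is a minimal sketch (in Python) of the kind of state-tracking transition picker Joe describes. It is not the actual Wonderbook system – every class, field name and threshold below is an illustrative assumption.

```python
# Illustrative sketch only: the engine remembers what key and tempo the score is in
# right now, and on a page turn picks a pre-written transition whose starting
# key/tempo match before handing over to the next cue.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Transition:
    from_key: str     # key the transition expects to leave from, e.g. "E flat major"
    to_key: str       # key it modulates to
    from_tempo: int   # BPM it expects to leave from
    to_tempo: int     # BPM it arrives at
    to_cue: str       # the destination cue (chapter) it leads into
    audio_file: str   # the stem that actually gets played

class AdaptiveScoreState:
    def __init__(self, key: str, tempo: int, transitions: List[Transition]):
        self.key = key
        self.tempo = tempo
        self.transitions = transitions

    def on_page_turn(self, next_cue: str) -> Optional[Transition]:
        """Choose a transition that fits what is playing right now."""
        candidates = [t for t in self.transitions
                      if t.to_cue == next_cue
                      and t.from_key == self.key
                      and abs(t.from_tempo - self.tempo) <= 4]
        if not candidates:
            return None  # fall back to the cymbal roll / tremolo swell Jim jokes about
        chosen = candidates[0]
        # Update the tracked state so the *next* page turn starts from the right place.
        self.key, self.tempo = chosen.to_key, chosen.to_tempo
        return chosen
```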

First, KONTAKT

JT: Our primary source of instruments is within the KONTAKT engine. It’s so easy to work with that you just forget you’re using it. That’s what we use to build the core of our demos prior to orchestration.

JF: I often dive into the editing side of KONTAKT to change things like pitch bend. I use it for making notes slide at the same time as FX patches. I use a lot of Orchestral String FX from Dynamic Sound Sampling, so it’s cool to tie those together. It’s nice because their sections are laid out across the keyboard so you can combine them in interesting ways. Our template of instruments really varies with each project. At the moment we’re using lots of Spitfire strings, and brass from Sample Modelling, EastWest and VSL. I often use Cinesamples Piano in Blue, which is quite a jazzy sound, but for writing I prefer New York Grand, which has a nice bright, clean timbre. Evolve Mutations and Damage were used extensively for ambient transitional stuff on PlayStation VR Worlds.

JT: For the grime elements of “Blood & Truth” we’re using MASSIVE. It’s a good starting point, using sample pack presets and tweaking them to our liking.

Mass effects

JF: I’ve used all sorts of Native Instruments effects in projects over the years. The RC48 reverb gets a lot of use, especially on orchestral instruments. When we’re making demos we use it to emulate how the final product will sound once it’s mixed. We know that our mixing engineer [Jake Jackson] will probably use the Lexicon on the final mix, so using the RC48 at the demo stage gives a good approximation. For the final mix we use the multi-out functions of KONTAKT to bus tracks out in groups, then to RC48 and often Altiverb, followed by a little EQ and compression from FabFilter plugins and some mastering with iZotope Ozone. Guitar Rig gets all sorts of action. It’s great to mess with sounds and so easy to throw on ludicrously dangerous electric effects, cut bits out, make new loops and play around. It’s just a very reliable and realistic signal chain, although we do sometimes replace the effects with real-life amps. We might use some extra UAD or Soundtoys plugins after the fact too. In fact, the demos for the “London Heist” music used Guitar Rig before the re-record at Air Studios. The fact that it’s a non-destructive signal path is very useful. Music for games is iterative and lots of things are happening simultaneously, so it’s great to be able to go back to a dry guitar if you need to.

JT: I actually used The Finger for “Danger Ball” to explore how we could create stutter and filter effects, essentially mocking up concept audio so we could demo our ideas to the game team early in development. Then we recreated those techniques with the in-game engine so they react in real time to the gameplay.

Hard wearing

JF: Pretty much everything is in the box. We run the studio on a ‘dustbin’ Mac Pro with 64GB of RAM and four SSDs attached: one for project files and the other three for samples. The audio interface is a Universal Audio Apollo. My master keyboard is a Doepfer LMK2 built into the desk with a separate pitch and mod wheel – the best controller keyboard I’ve ever had. I have a couple of other controllers, like a MIDI Fighter Twister, which is basically a 16-knob assignable unit. I also use a standard mechanical computer keyboard which is rigged up with ControllerMate to convert button presses to CC data for key switching – really basic but effective. Currently the studio is only set up for stereo mixing as I’m still working a few things out. If I need a 5.1 output I’ll often use Halo Upmix to convert the stereo mix. We get orchestral recordings back from the sessions as 5.1 mixes anyway, so that’s handy. When it comes to our DAW, it’s a bit of a long story. When the team came together after SingStar we were all using different DAWs, but it made sense to just stick to one. We all knew how to use Cubase in some form – it was just the easiest learning curve for everybody.

JT: I was on Logic 9, and then it changed to Logic X and I heard people were complaining about that, so I thought learning Cubase properly would be just as easy. Now I really prefer Cubase and I find Steinberg an approachable company. They’re trying to integrate more game tools into their software. There is one thing I’d like to see in the future: at the moment we have to use markers to indicate when the audio changes, but when it doesn’t work you have to go back into the DAW and rebuild the cues. It would be nice to be able to do all of that in one place, so that’s on my wish list.
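
As an aside, here is a rough sketch of the button-press-to-CC idea Jim mentions for key switching. His rig uses ControllerMate on the Mac; this illustrative stand-in assumes the Python mido and pynput packages plus a virtual MIDI port, and the CC number and key mapping are invented for the example.

```python
# Illustrative sketch only: forward computer-keyboard presses as MIDI CC messages
# that a sampler patch could use for keyswitching. Requires `mido` (with the
# python-rtmidi backend) and `pynput`.
import mido
from pynput import keyboard

ARTICULATION_CC = 32                                           # hypothetical CC number
KEY_TO_VALUE = {"1": 0, "2": 32, "3": 64, "4": 96, "5": 127}   # hypothetical mapping

port = mido.open_output("KeyswitchBridge", virtual=True)       # virtual MIDI output

def on_press(key):
    char = getattr(key, "char", None)   # special keys have no .char; ignore them
    if char in KEY_TO_VALUE:
        port.send(mido.Message("control_change",
                               control=ARTICULATION_CC,
                               value=KEY_TO_VALUE[char]))

with keyboard.Listener(on_press=on_press) as listener:
    listener.join()                      # run until the listener is stopped
```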

A waiting game

JT: We’re always trying to combine the music and the interactivity, giving them equal weight. Some game soundtracks really shy away from big thematic music because it’s just easier to dip in and out of ambient cues. We always try and include themes and when they hit at the right point it’s a great gaming experience. We want it to be so seamless that you don’t notice what’s going on and sometimes we even make the game wait for the music.

JF: With the system we’re working on at the moment you can play the game stealthily or at a faster pace and the music changes accordingly. It should feel like someone’s captured your playthrough of the game, taken it away and then scored the action. It’s a fun thing to try and achieve without compromising musicality – like a big 3D jigsaw.
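
A minimal sketch of the two ideas above – a gameplay “intensity” value blending stealth and action stems, and changes that only land on the next barline so the game effectively waits for the music. The stem names, curves and thresholds are assumptions for illustration, not the actual Blood & Truth system.

```python
# Illustrative sketch only: blend layered stems from a gameplay intensity value and
# quantise any change to the next barline so transitions stay musical.
import math

def layer_gains(intensity: float) -> dict:
    """Map intensity (0.0 = stealth ... 1.0 = full action) to per-stem gains."""
    intensity = max(0.0, min(1.0, intensity))
    return {
        "stealth_pads": math.cos(intensity * math.pi / 2),  # equal-power crossfade
        "action_drums": math.sin(intensity * math.pi / 2),
        "brass_theme":  1.0 if intensity > 0.8 else 0.0,    # big theme only at the peak
    }

def next_barline(now_in_beats: float, beats_per_bar: int = 4) -> float:
    """Beat position of the next barline – i.e. make the game wait for the music."""
    return math.ceil(now_in_beats / beats_per_bar) * beats_per_bar
```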

Sounds in space

JF: It’s a really interesting time for music and effects, especially when it comes to VR games. We deliver a 5.1 score and essentially we have the ability to place that sound wherever we like in the game world, but it has to work with the environment. It’s possible to track the player’s head movements, so we can use binaural processing to change what the player hears as they move in the space. The fun part is that we know where the player is looking and can change the music accordingly. For example, in the “Ocean Descent” VR experience, as you look up at the surface of the water the low frequencies are scooped out and more sparkly layers are added. When you look down into the depths we introduce low gurgles to give a sense of dread. It shouldn’t draw attention to itself.

JT: Having said that, we still want the experience to be quite filmic so we tend not to move the main score around too much unless there’s a specific source in the game world, like a radio or speakers. The sound FX are moving about a lot more than the music.
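
A minimal sketch of the gaze-driven layering Jim describes for “Ocean Descent”: the player’s head pitch drives a blend between a bright surface layer and low “depths” elements, plus a gentle low-shelf cut as you look up. The parameter names and curves are assumptions, not the shipped code.

```python
# Illustrative sketch only: derive music layer levels from where the player is looking.
def gaze_layer_mix(head_pitch_degrees: float) -> dict:
    """head_pitch_degrees: +90 = looking straight up at the surface, -90 = straight down."""
    up = max(0.0, min(1.0, head_pitch_degrees / 90.0))     # 0..1 as you look up
    down = max(0.0, min(1.0, -head_pitch_degrees / 90.0))  # 0..1 as you look down
    return {
        "score_low_shelf_gain_db": -6.0 * up,  # scoop the low end near the surface
        "sparkle_layer_gain": up,              # shimmering high layers fade in
        "low_gurgles_gain": down,              # sense of dread in the depths
    }
```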

Play testing the competition

JT: I like to think that we’re pushing the boundaries more than most, but there are definitely some people doing great work. Harmonix’s “DropMix” is such a cool concept with some amazing tech behind it.

JF: For me that was super-fun to play from a geeky point of view, trying to figure out the programming behind it and I was in awe of the huge amount of work that it must’ve involved.

JT: What about the licensing? How they got that game signed off I’ll never know. I just think that with any game, the careful combining of music and sound is so inspiring – Dead Space and Alien: Isolation also do that really well.

Live streaming

JF: I’ve spent the last three months producing the orchestrations for PlayStation in Concert and it’s been amazing to do the arrangements for that, especially as there are no budgetary constraints. It was so much fun because it’s just great music to listen to, so I’m excited that it’s being performed live in front of an audience. Some of the music wasn’t orchestral originally so I’ve revisited that. There will be a few surprises from the PS1 era, re-orchestrated of course, and composer Jessica Curry is presenting.

Photo credits: Kristina Sälgvik

PlayStation in Concert takes place at The Royal Albert Hall on May 30th.
Tickets are available here.