English-born producer/engineer Adam Moseley got his start at London’s Trident Studios, where he worked with the likes of Phil Ramone, Tom Dowd, Mutt Lange and Steve Lillywhite. Now settled in Downtown Los Angeles, he operates a studio, owns the Accidental Muzik label and is an instructor at UCLA.

Having just finished an EP with Manchester artist Zoe Violet, he wants to talk studios and sound.

‘My main work set-up is Pro Tools,’ he begins. ‘I run Ableton through it via ReWire, with Apogee Quartet converters and Barefoot monitors. A little bit of outboard – Joe Meek and Trident EQ, and a Studio Projects VTB mic preamp that I love on DI and vocals. I overdub guitars and keyboards, and I’ve got an Akai MPD32 for drum pads and beats and stuff. I do guitars here and vocals here if I need to, but basically everything I do at this stage is in-the-box, whether it’s for a record or film.’

Among his recent film and TV work is The Big Wedding, with Robert De Niro, and True Blood for HBO. He has also been mixing for film and TV with Lisbeth Scott and Matthias Weber, and has been recording and mixing a project with John Cale. The diversity of his work reflects lessons learned through these and many other projects, as well as his time at Trident, London’s The Boat Studios and New York’s Sorcerer Sound, among other assignments.

‘What I’ve found over the years in studios – and I’ve worked in some of the top studios in lots of different countries – is that the hardest thing is getting a room that’s true and getting a sound that’s true to my own acoustic perception and design,’ he says. ‘Whatever you’re hearing, when you take it away, when you send it to mastering, when you go somewhere else, when you listen in the car, it should be what you expect it to sound like. That’s one of the biggest challenges.

‘It’s not a given science; it takes a lot of time to set up a room and get it right. Monitoring is your only reference as a producer or mixer or engineer. It’s the only reference that you have.

‘When I left Trident, when I started going out into the big world on my own, I would be on a board that I didn’t know and thinking, this sound needs a certain frequency. I’d go to that frequency and I’d be adding it or subtracting it, and my ear wouldn’t be getting what I thought it should be getting – like someone had pulled the rug away from under my feet. It’s the scariest feeling. I learned to adapt to that. I would always take my speakers with me, ones that were tried and tested. By now I’ve got quite a few reference songs that over the decades I’ve heard in different studios, different systems, and I know how they should sound.’

It is with this in mind that Moseley has recently equipped his own facility with Barefoot loudspeakers.


‘I’ve been working with the MicroMain45 monitors for about a month, and really love the definition in the high mid-range and the bottom end,’ he reports. ‘I always love a lot of bottom end in my mixes, but the hardest thing on most speakers is to get that definition in the mid-range, because that’s where you hear a lot of the reflections in your reverbs and a lot of the delays.

‘I get into such finite detail of positioning and delays, and delays that go into reverbs. Maybe you don’t hear the delay, you just hear the reverb, and the reverb has reflections, but reflections have to cut off before the next part comes in, so my listening experience has to be absolutely precise.

‘I always EQ the delay differently from the original signal so it isn’t bouncing back within the same frequencies. A lot of stuff happens in the high mids, and a lot of your reverbs, a lot of your spatials happen up there. That’s been one of my most enjoyable experiences with these speakers – I can really get my definition, get my placement, and hear back what I’m hearing in my head. Speakers that are absolutely true give the real proof of concept that I’m going for, and the visualisation of spatial events.’
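Moseley’s habit of EQ’ing the delay differently from the source can be sketched as a feedback delay whose repeats pass through a one-pole low-pass filter, so the echoes sit in a duller range than the dry signal. A minimal illustration in Python – the filter coefficient, delay time and mix values here are arbitrary placeholders, not his settings:

```python
import numpy as np

def filtered_delay(dry, delay_samples, feedback=0.4, mix=0.3, lp_coeff=0.3):
    """Feedback delay whose repeats are low-pass filtered, so the
    echoes occupy a duller frequency range than the dry signal."""
    out = np.zeros(len(dry))
    buf = np.zeros(delay_samples)   # circular delay line
    lp_state = 0.0                  # one-pole low-pass filter memory
    idx = 0
    for n, x in enumerate(dry):
        echo = buf[idx]
        # EQ only the delayed signal: simple one-pole low-pass
        lp_state += lp_coeff * (echo - lp_state)
        out[n] = x + mix * lp_state
        buf[idx] = x + feedback * lp_state
        idx = (idx + 1) % delay_samples
    return out

# A click through the delay: the dry click passes untouched,
# the repeats come back filtered and progressively quieter.
click = np.zeros(2000)
click[0] = 1.0
wet = filtered_delay(click, delay_samples=441)
```

Because each repeat passes through the filter again, successive echoes get progressively darker – the repeats stop competing with the dry signal in the high mids, which is the point he is making.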

Picture this

‘When I started at Trident, they had huge floor-to-ceiling monitors that looked like a wardrobe. We had those, which we’d listen to at ear-bleed level for Kiss albums and Rush albums and so on, and Auratones, the little cube boxes. Sometimes someone from BBC Radio 1 or another radio station would bring in a little mono speaker that they would swear was the same as the one the BBC used. You’d check your mixes back through it and make sure it was okay, but you couldn’t mix by that as your end result. You have to make the most glorious sonic picture you can, and then reference certain smaller things – reference on computer speakers and such – but you’ve got to make it the best it can possibly be. If it sounds good on that, then it’ll sound good anywhere.’

But now the sound must also be considered in the context of image.

‘Whether it’s for a TV series or a feature movie, I’m looking at picture. I’m getting the concept in the visual, and working out how I can not just play the performance of the music and bring the best out of the music, but also adapt it to the visual to work with the flow and the mood,’ he agrees. ‘I also have sound design tracks that I’m mixing with as well as dialogue, so I have to be aware of the whole picture.’

Moseley’s aim is to map the visuals with an aural 3D experience.

‘Over the years I’ve searched to get more of a 3D dimension to music, to take instruments and set them back in the distance, then bring them closer, dryer, reducing any effects or reverb, to really get 3D dynamics happening. That works whether it’s an orchestral arrangement or whether I’m working with loops or doing a drop in a song, cutting out the drums and changing the shift of positioning of things. It’s what I call shape-shifting in the mix – I want to use the width and the height and the depth of the mix, but also get things to go behind the speakers or pop out in front, to move out and create a real 3D experience – sound that comes out and surrounds you.

‘These days, all music ends up pretty much as MP3s that people listen to through computer speakers or earbuds, but I don’t believe you can go into making your music with that end listener situation in mind. You’ve got to be aware of it, but you still have to do the best possible sonic job that you can do and make the sound as glorious and beautiful and deep and rich and dynamic as you can. Oddly enough, I use great speakers, and then I’ve got my earbuds coming off the same Apogee converters, so I often check the mixes on my earbuds afterwards because that’s how they’re going to be heard – except it won’t be heard at the quality that I’m listening to here, it will be heard as an MP3.’
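The near/far ‘shape-shifting’ Moseley describes – pushing a part back wetter and quieter, pulling it forward dry – amounts to automating wet/dry balance and level together. A crude sketch of such a distance control, assuming a reverb return is available (the gain law and automation curve here are illustrative, not his processing):

```python
import numpy as np

def place_in_depth(dry, wet_reverb, distance):
    """Crossfade a part between its dry signal and a reverb return.
    distance: array in [0, 1] per sample; 0 = up front (all dry),
    1 = far back (all reverb, and quieter overall)."""
    d = np.clip(distance, 0.0, 1.0)
    gain = 1.0 - 0.5 * d                         # distant sources are also quieter
    return gain * ((1.0 - d) * dry + d * wet_reverb)

# Automate a part from the back of the mix to the front over one second.
sr = 44100
t = np.linspace(0, 1, sr)
part = np.sin(2 * np.pi * 220 * t)               # stand-in for an instrument
fake_reverb = np.roll(part, 2000) * 0.5          # stand-in for a real reverb return
ride = np.linspace(1.0, 0.0, sr)                 # far -> near
out = place_in_depth(part, fake_reverb, ride)
```

In a real session the dry/wet curves would be drawn as automation and the reverb would be a proper send return, but the principle – distance as a single control riding level and wetness together – is the same.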

The sound of progress


From his roots in the heyday of big-room recording studios and the height of vinyl record sales, Moseley has seen record companies get greedy over CD profits, the adoption of data compression for portable music players and the destruction of old distribution channels by music download websites. With audio quality on the back foot, and a lot of music being produced in bedroom studios, how is he approaching his recording projects?

‘I work on all different kinds of music,’ he says. ‘I’ve just finished an EP with Zoe Violet, produced and written by Rie Sinclair. Very Brit synth-pop. The vocal treatments are different. The vocal effects are different. The beat has to be up and foremost and in your face and just driving, but then all the synths – there’s so much sound design and different texturing going on, where certain instruments bleed into the next thing and the other instrument takes over. I have to leave myself room and space to reintroduce the instrument. Sometimes I’ll use panning to bring instruments in and then spread them out to allow space for the next event to come in as the first one spreads away, or I’ll use panning to move something to a side and create space for the next musical event…

‘Before I can start, I have to know exactly where I’m going with a song. I have to almost draw a picture of the song, of the shapes of the instruments, their positioning, and really explore the session and see what exists and where I can take it.’
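Riding a part across the stereo field to clear space, as Moseley describes, is at its simplest constant-power pan automation – the pan position becomes a curve over time rather than a fixed setting. A generic sketch (pan laws and curve shapes are a matter of taste, and none of these values are his):

```python
import numpy as np

def autopan(mono, pan):
    """Constant-power pan. pan is an array in [-1 (hard left), +1 (hard right)],
    one value per sample, so the position can move during the part."""
    theta = (np.asarray(pan) + 1.0) * np.pi / 4.0   # map -1..1 to 0..pi/2
    left = np.cos(theta) * mono
    right = np.sin(theta) * mono
    return np.stack([left, right])                  # shape (2, n)

# Sweep a part from centre to hard left over its duration,
# clearing the right side for the next musical event.
sr = 44100
sig = np.ones(sr)                     # stand-in for an instrument track
sweep = np.linspace(0.0, -1.0, sr)    # centre -> hard left
stereo = autopan(sig, sweep)
```

The cosine/sine law keeps perceived loudness roughly constant as the part moves, which is why the movement reads as repositioning rather than a fade.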

In contrast, a hybrid pop/classical album with opera singer Nathan Pacheco placed very different demands on Moseley’s recording skills.

‘Some of the songs had 120 tracks of orchestration, then lots of tracks of choir, lots of programming, lots of bass synth, lots of really great percussive elements,’ he explains. ‘The mixes were very involved and took a long time. In that situation, as when I’m working with a film that’s got an orchestra, I work from the score. I’m reading the score, listening to the music, getting to know the session, getting to know the interplay between the instruments and understand the arrangement, understand which instruments are the lead, which are the response, which are the harmonics, which reinforce the melody, which smooth out the dynamic.

‘I literally learn the session, every single part. If it’s 120 tracks or more, then I learn every single track so that I can work out how to pace the mix, who should have the foreground, who will be in the background, when to switch that perspective, and when to maximise. First of all, it’s just understanding the arrangement, and then bringing in my mixing concept and visualisation.

‘The whole thing is about reading the arrangement and using what I’ve learned over the years from mixing various records, different styles of music, bringing it all together and really getting the absolute most I can get out of it.’

Concurrently, Moseley is working on a second EP with LA indie rock band Hurry Death, produced by Johnny Watt, which again calls on his ‘shape-shifting’ technique.

‘It’s a completely different sonic approach but there’s a really interesting situation on one of the songs, ‘The Hurt Happened Here,’ where the guitars are double-tracked,’ he says. ‘They start playing an arpeggiated line on the lower strings, and as it develops the line gets busier and moves up to higher notes. The interesting thing is that on the lower notes the guitars sum more in the middle, but as they move up the arpeggio it spreads out, which leaves space for another instrument to come through. Then that instrument goes into the background and into a delayed reverb and fades away in time for the focus to come back to the guitars again, and then the vocals drop in.

‘It’s a constant movement of positioning – not so much that you disorientate people. Once you’ve introduced a sound – or, in classical music, a theme on an instrument – and the ear has heard it, you can drop the volume, because the ear recognises the pattern; it doesn’t have to be so loud once the ear has identified it. You can reduce it to give yourself space to bring something else in.’


The technique crops up again with a band from Australia called Braves. ‘It’s pop, but it’s very, very different,’ Moseley says. ‘When I first heard some of the demos and rough mixes and was asked to mix the project, I had to ask myself where can I go with this that I haven’t gone before, sonically, spatially, visually, and conceptually?

‘I came up with a few ideas of how certain sounds will morph into different sounds, then go further away and come back in a different position. You don’t want to confuse the listener, but it’s more than just balancing and shaping and positioning. There’s so much more movement you can do within the parts, just like in classical, where your attention is constantly being shifted between maybe the cello or the violas or the first violins and the seconds, and then maybe the oboes will do a response that works with the violas and the flutes will come in. In my opinion, Beethoven is still one of the greatest music producers and mixers of all time.

‘What I like to do in music is to just build and build on a sonic picture. With Braves, I want to “boldly go somewhere where I haven’t gone before”, and I think I know where that is.’

At UCLA, Moseley is keen to share this awareness with a new generation of musicians and engineers.

‘When I’m talking to my students at UCLA in the Extension programme, the thing I push most is that music is about arrangement – the arrangement of musical notes into parts, and then how you arrange those parts within your sonic picture. When you go back and study how music producers came about, all the great music producers were arrangers. Quincy Jones, Phil Ramone, Phil Spector, Brian Wilson, George Martin, Arif Mardin – all the great original producers of modern-day music started as arrangers, and that is something that I push and push on my students.

‘It doesn’t matter what your parts are, how great they are. If the parts are interfering with each other, it will kill your dynamics. If the sonics aren’t right, it’s not going to be exciting. If you’re not holding back on your higher frequencies and punching out at certain times with them, you’re not playing your mix to the full. That’s the most important thing – understanding arrangement, not just of your musical parts but also of your sonics, your frequencies, your stereo width, your depth, and just how to use the whole picture. I think that’s the essence of it all.’
