Evolution and movement
One crucial difference between acoustic and electronic sounds is that it's possible to design electronic instruments which don't 'move'. If you program a basic waveform synth patch, fix the filter position and create an amplifier envelope with an immediate attack and full sustain, the note won't evolve or move for as long as you hold it down. This reliability can be very useful, but in other contexts the lack of movement can be disconcerting, even boring. Whenever you're designing textures 'behind' the dominant elements of a track, try to make them move. Movement comes in many forms and can mean different things but, broadly, if a sound develops from a starting point in terms of its volume, tone, space or stereo position and goes on a journey through one or more of those parameters, it will feel as though it's evolving.

Acoustic instruments can't help but move. Even if you ask a violinist to 'hold' a note at a fixed pitch and volume, they will run out of bow and have to restart it. The same is true of wind and brass players, who run out of breath; even percussionists gently rolling their sticks on a timpani will vary the dynamics of their performance, producing subtle undulations. These imperfections are human nature, an inevitable consequence of our muscles responding to the instruments we play, so it's no surprise that software instruments which build synth sounds from 'organic' starting points (like Spectrasonics' Omnisphere) are so popular. Even if you sample and loop a section of a performance played on an acoustic instrument, these undulations, rises and falls will keep the sound evolving and, as a result, somehow more engaging to the listener.

So whenever you're building background textures, look for ways to keep things moving: either by layering acoustic sounds alongside electronic ones, or by employing LFOs to 'interrupt' tone, volume, stereo width or other parameters, keeping sounds on their toes. Even subtle evolution can be hugely powerful, come the final mix.
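To make the idea concrete, here's a minimal sketch (not from the original text) of the difference between a 'static' patch and one whose volume is gently 'interrupted' by a slow LFO. The function names and parameter values are invented for illustration; real synths do this with modulation routings rather than code.

```python
import math

SAMPLE_RATE = 44100  # samples per second

def static_note(freq, seconds):
    """A fixed-amplitude sine: instant attack, full sustain -- it never 'moves'."""
    n = int(SAMPLE_RATE * seconds)
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

def lfo_note(freq, seconds, lfo_rate=0.5, depth=0.4):
    """The same sine, but a slow LFO modulates its volume, producing the
    gentle undulation of a bowed note or a rolled timpani."""
    n = int(SAMPLE_RATE * seconds)
    out = []
    for i in range(n):
        t = i / SAMPLE_RATE
        # the LFO sweeps the gain between (1 - depth) and 1.0
        gain = 1.0 - depth * 0.5 * (1 + math.sin(2 * math.pi * lfo_rate * t))
        out.append(gain * math.sin(2 * math.pi * freq * t))
    return out
```

Measure the peak level of each signal in two different 100 ms windows and the static note reads the same everywhere, while the LFO version rises and falls; the same trick applies equally to filter cutoff or stereo pan, at subtler depths.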