PC Pro

DICK POUNTAIN

Our resident guitarist employs a pair of robot musicians to accompany him on drums and bass, and the result is music to his ears


My Christmas present to myself this year was a guitar, an Ibanez AS73 Artcore. This isn’t meant to replace my vintage MIJ Strat but rather to complement it in a jazzier direction. 50-odd years ago, I fell in love with blues, ragtime and country finger-picking, then slowly gravitated toward jazz via Jim Hall and Joe Pass, then to current Americana fusionists like Bill Frisell, Charlie Hunter and Julian Lage (none of whom I’m anywhere near skill-wise).

It’s a long time since I was in a band and I play mostly for amusement, but I can’t escape the fact that those idols all work best in a trio format, with drums and bass. My rig includes a Hofner violin bass, drum machine and looper pedal to record and replay accompaniments, and I have toyed with apps such as Band-in-a-Box, or playing along to Spotify tracks, but none of these is satisfactory – too rigid, no feedback. Well, I’ve mentioned before in this column my project to create human-sounding music by wholly programmatic means. The latest version, which I’ve named “Algorhythmics”, is written in Python and is getting pretty powerful. I wonder if I could use it to write myself a robot trio?

Algorhythmics starts out using native MIDI format, treating pitch, time, duration and volume data as four separate streams, each represented by a list of ASCII characters. In its raw form, this data just sounds like a hellish digital music box; the challenge is to devise algorithms that inject structure, texture, variation and expression.
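To make the four-stream idea concrete, here’s a toy sketch of what such a representation might look like. The encoding (a character’s ord() value standing in for its MIDI number) and all the names are my own illustration, not Algorhythmics’ actual format; start times are derived from the durations rather than stored as a fourth character list, purely for brevity.

```python
# Three of the four streams as ASCII character lists; ord() decodes them.
pitch    = list("<>@C")  # ord() -> 60, 62, 64, 67: C4, D4, E4, G4
duration = list("xxxx")  # ord('x') = 120 MIDI ticks each
volume   = list("P@P@")  # ord() -> 80, 64, 80, 64: alternating accents

def decode(stream):
    """Map a character stream back to its MIDI numbers."""
    return [ord(c) for c in stream]

def events(pitch, duration, volume):
    """Merge the streams into (start, note, length, velocity) events,
    deriving each start time from the durations so far."""
    starts, t = [], 0
    for d in decode(duration):
        starts.append(t)
        t += d
    return list(zip(starts, decode(pitch), decode(duration), decode(volume)))

print(events(pitch, duration, volume))
# first event: (0, 60, 120, 80)
```

Played raw, a stream like this is exactly the “hellish digital music box” of the column – every layer of variation described below works by rewriting these lists before they become MIDI events.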

I’ve had to employ five levels of quasi-random variation to achieve something that sounds remotely human. The first level composes the data lists themselves by manipulating, duplicating, reversing, reflecting and seeding them with randomness. The second level employs two variables I call “arp” (for arpeggio) and “exp” (for expression) that alter the way notes from different MIDI tracks overlap, to control legato and staccato.
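One plausible way an “exp” variable could control legato and staccato is by scaling each note’s duration relative to the gap before the next onset. The parameter name follows the column, but this particular formula is my own guess at the mechanism:

```python
# A guessed illustration of the "exp" (expression) idea: scaling how much
# each note fills the gap to its successor.
def apply_expression(starts, durations, exp=1.0):
    """Stretch or shrink each duration toward/away from the next onset.

    exp < 1.0 -> staccato (notes end early, leaving gaps)
    exp = 1.0 -> each note lasts exactly until the next one
    exp > 1.0 -> legato (notes overlap their successors)
    """
    out = []
    for i, start in enumerate(starts):
        gap = (starts[i + 1] - start) if i + 1 < len(starts) else durations[i]
        out.append(int(gap * exp))
    return out

starts    = [0, 480, 960]
durations = [480, 480, 480]
print(apply_expression(starts, durations, exp=0.5))  # staccato: [240, 240, 240]
```

Varying exp per track, or per phrase, is what stops every voice from articulating identically.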

A third level produces melodic structure by writing functions called “motifs” to encapsulate short tune fragments, which can then be assembled like Lego blocks into bigger tunes with noticeably repeating themes. Motifs alone aren’t enough, though: if you stare at wallpaper with a seemingly random pattern, you’ll notice where it starts to repeat, and the ear has this same ability to spot (and be bored by) literal repetition.
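The motifs-as-functions idea might look something like this toy rendering – the fragment shapes and names here are invented for illustration, but the Lego-block assembly is the point:

```python
# Short tune fragments as functions returning pitch lists.
def motif_rise(root):
    """A three-note ascending fragment starting at `root`."""
    return [root, root + 2, root + 4]

def motif_fall(root):
    """A three-note descending fragment starting at `root`."""
    return [root, root - 1, root - 3]

def assemble(*fragments):
    """Chain fragments into one melody, like Lego blocks."""
    return [note for frag in fragments for note in frag]

tune = assemble(motif_rise(60), motif_fall(67), motif_rise(60))
print(tune)  # [60, 62, 64, 67, 66, 64, 60, 62, 64]
```

Because motif_rise(60) appears twice, a listener hears a repeating theme – which is exactly why the next level has to keep that repetition from becoming literal.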

Level four has a tool called “vary” that subtly alters the motifs inside a loop at each pass, and applies tables of chord/scale relations (gleaned from online jazz tutorials and a book about Béla Bartók) to harmonise the fragments. Level five is the outer loop that generates the MIDI output, in which blocks of motifs are switched on and off under algorithmic control, like genes being expressed in a string of DNA.
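Here is a minimal sketch of what a “vary” tool could do on each pass of the loop – the probability, the step sizes and the seeded generator are my own assumptions, and the chord/scale harmonisation step is omitted entirely:

```python
import random

# On each pass, each note of a motif may be nudged a whole step up or
# down, so repeats of the motif never sound quite literal.
def vary(motif, rng, chance=0.3):
    """Return a copy of `motif` with each note possibly shifted a step."""
    return [p + rng.choice([-2, 2]) if rng.random() < chance else p
            for p in motif]

rng = random.Random(42)  # seeded so every run of this sketch is repeatable
motif = [60, 62, 64, 67]
for bar in range(3):
    print(vary(motif, rng))  # three slightly different takes on the motif
```

Seeding the generator matters: it keeps the output reproducible while you tweak everything else, which is handy when you’re debugging a robot rhythm section.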

So my robot jazz trio is a Python program called Tribot that generates improvised MIDI accompaniments – for acoustic bass and General MIDI drum kit – and plays them into my Marshall amplifier. The third player is me, plugged in on guitar. The General MIDI drum kit feels sparse, so I’ve introduced an extra drum track using instruments such as woodblock, Taiko drum and melodic tom. Tribot lets me choose tempo, key and scale (such as major, minor, bop, blues and chromatic) through an Android menu interface, and my two robot colleagues will improvise away until halted. QPython lets me save new Tribot versions as clickable Android apps, so I can fiddle with its internal works as ongoing research.

It’s still only a partial solution, because although the drummer and bass player “listen” to one another – they have access to the same pitch and rhythm data – they can’t “hear” me, and I can only follow them. In one sense, this is fair enough as it’s what I’d experience playing with much better live musicians. At brisk tempos, Tribot sounds like a Weather Report tribute band on crystal meth, which makes for a good workout. But my ideal would be what Bill Frisell once talked about (pcpro.link/306bill): a trio that improvise all together, leaving “space” for each other.

That’s possible in theory, using a MIDI guitar like a Parker or a MIDI pickup for my Artcore. I’d need to make Tribot work in real time – it currently saves MIDI to an intermediate file – and merge in my guitar’s output translated back into Algorhythmic data format, meaning the drummer and bass could “hear” me and adjust their playing to fit.

A final fantasy would be to extend Tribot to control an animated video of cartoon musicians. I won’t have sufficient steam left to do either; maybe I’ll learn more just trying to keep up with my robots. dick@dickpountain.co.uk


Dick Pountain is editorial fellow of PC Pro. He recommends you watch this five-minute YouTube video – pcpro.link/306lage – before reading this column.
