3D World

Blood + Chrome

Synthwave artist Blood + Chrome tells 3D World how he used Blender to create his first ever CG music video


We take a behind-the-scenes look at the creation of a synthwave music video

Since 2014 Basil Murad has worked as a freelance CG designer and animator. In that time he has created a number of diverse projects as Blood + Chrome, all of which are unified by his distinctive synthwave aesthetic. Murad works primarily as a 3D generalist, animator, motion designer, illustrator, and graphic designer. Each project helps him to develop skills that are relevant to his interests in design, animation, film, and interactive entertainment. His drive and inspiration are drawn from his never-ending desire to develop his know-how and apply it to new projects.

3D World caught up with Murad to find out more about his distinctive synthwave style, and how he created the stunning CG video for synthwave duo Alex & Tokyo Rose’s single Affliction.

How did you get started in CG?

I became interested in digital art as a kid around the early to mid-90s. I started at the most basic level with Microsoft Paintbrush (later known as Paint). I used to subscribe to a PC magazine that would include software and demos on a CD, and in one issue they included a program called Imagine 2.0. It was my first try with 3D modelling software, but at the time I could barely grasp how to use it so I didn’t get very far. I then took a long hiatus from art and it wasn’t until around 2010 that I got back into 3D with Blender. I learned by watching tutorials online, which were not as ubiquitous as they are today, but it didn’t take long for me to get the hang of things and also begin to enjoy the process.

How did you develop your unique synthwave style?

Around the same time as I was getting into Blender, a friend sent me a link to his synthwave playlist. I was quickly fascinated with this music and soon discovered that a fledgling community had developed around it. I communicated with various producers and eventually started to create art and logos for them. Having been a child during the 80s I drew heavily from much of the imagery I was exposed to at that time, including cartoons, films, music videos, advertising, TV station graphics, etc. Eventually, over the course of five years, I expanded into motion design and animation.

How did the Affliction music video project come about?

I was commissioned by the owner of NewRetroWave (an online network dedicated to the genre), in collaboration with Alex & Tokyo Rose, to create a music video for their upcoming single Affliction. I had been doing work for NRW for a couple of years by this time, but nothing quite on this scale. This was in fact my first go at an entire music video, but I felt I had gained enough know-how from previous projects and was excited to get started.

What was the collaboration with Alex & Tokyo Rose like?

The first thing I did was put together a mood board to convey the overall look and feel I was going to attempt. There was some discussion early on about following a narrative set by the two artists, but I was able to convince them to basically let me do my own thing. It was important to them that the character Akuma be a focal point, but everything else I made up myself. After listening to the song a bunch of times, I jotted down a summary of how the video would go, describing in basic terms the action, marking the times that scenes would change, and so on.

Can you tell us about the relationship between the music and the visuals?

The relationship between the two stems from the imagery I envisioned as I listened to the song over and over. The instrumentation and the lyrics inspired the sequences in different ways, informing the look and feel and the subject matter. The timing of things is directly tied to the music, and keeping everything in sync was in the forefront of the process as I made sure to marry the visuals to the sound. I guess it would be akin to scoring a film, except the other way around, which is how I assume most music videos are made. I also made a conscious effort to sync what was happening in the video to the beat as well, which offered a certain degree of challenge in the editing process.
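For a rough sense of how a beat grid can be carried into Blender (this is an illustrative sketch, not Murad’s documented method, and the BPM value is an assumption), a few lines of Python can drop a timeline marker on every beat:

```python
import bpy

scene = bpy.context.scene

# Assumed tempo for illustration; the real value comes from the track itself.
BPM = 100
FRAMES_PER_BEAT = scene.render.fps * 60.0 / BPM

frame = float(scene.frame_start)
beat = 1
while frame <= scene.frame_end:
    # One named marker per beat makes it easy to snap cuts and keyframes to the music.
    scene.timeline_markers.new("beat_{:03d}".format(beat), frame=round(frame))
    frame += FRAMES_PER_BEAT
    beat += 1
```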

Why was Blender your tool of choice?

I had been using Blender for several years by this point, so it was really for no other reason than just being intimately familiar with the software. This was also around the time when version 2.8 was in Alpha or Beta, and I was eager to use Eevee to render the video. With Eevee I could render very quickly on my own machine without relying on render farms. I familiarised myself with the limitations of the renderer and initially found them to be within acceptable parameters, though I did run into a few issues further along that had me questioning my choice.

“KEEPING EVERYTHING IN SYNC WAS IN THE FOREFRONT OF THE PROCESS AS I MADE SURE TO MARRY THE VISUALS TO THE SOUND”

What technical and artistic challenges did the project present?

I had to eliminate a lot of things I wanted to include, like more nuanced interactions between the principal characters, a more elaborate holographic display emitted by the console, and more detailed destruction at the end. This was either because the running time of the video wasn’t long enough, or my skills as an animator were still too limited.

On the technical side, I ran into issues with Blender crashing often since it was still in a development phase, as well as working around limitations with the Eevee renderer. For instance, when using procedural textures as bump maps, the result would be very obviously jagged, so I ended up rendering everything at double the final resolution to try and mitigate that problem. I suppose I could have solved it by baking those textures, but I wanted to retain the freedom of editing and tweaking the texture right up to the end.
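For illustration, that workaround boils down to a couple of render settings (the base resolution here is an assumption, not taken from the project):

```python
import bpy

scene = bpy.context.scene

# Hypothetical base output size; the final video is downscaled from a render
# done at 200% to soften the jagged look of procedural bump maps in Eevee.
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.resolution_percentage = 200  # renders 3840x2160, scaled down in the edit
```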

For the destruction sequence in the latter part of the video, I used a separate build of Blender which included a Fracture Modifier – it’s made by the same folks who did the great FLIP Fluids add-on. The build was only available for Blender 2.79, and so I had to export the sections I wanted to destroy from the original 2.8 project file, apply the destruction, bake it, and re-import it back in.
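The interview doesn’t say which exchange format he used for that round trip; purely as a hypothetical sketch, Alembic is one way to move a baked mesh sequence between the two Blender versions (file paths here are placeholders):

```python
import bpy

# Step 1 (in the 2.8 project): export only the selected objects that need destroying.
bpy.ops.wm.alembic_export(filepath="/tmp/to_fracture.abc", selected=True)

# Step 2 happens in the separate 2.79 Fracture Modifier build: fracture, simulate,
# bake, then export the baked result the same way (e.g. to /tmp/fractured_bake.abc).

# Step 3 (back in the 2.8 project): import the baked destruction as a cached sequence.
bpy.ops.wm.alembic_import(filepath="/tmp/fractured_bake.abc")
```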

Artistically speaking, the main challenge was to keep the look and quality consistent throughout, as well as dealing with the inevitable disappointment that came with having to make compromises. I would consider keeping up morale and confidence part of the artistic process, and that was a constant struggle all the way up to rendering the final file.

Can you tell us about working on the numerous reflections seen in the music video?

The reflections were challenging, especially on the rider’s helmet at the beginning. I wanted the street lights to reflect across his helmet as we panned around him, but screen space reflections wouldn’t be able to show anything that wasn’t there in frame. I considered doing a separate reflection pass with Cycles and compositing that in later, but I wanted to see if I could figure it out with Eevee, so what I ended up doing was parenting a reflection probe to the character, and baking the reflection for every frame in which the helmet was clearly visible. I also made sure the probe encapsulated the bike since it was also reflective. Not the most elegant solution, or even a sensible one, and I considered trying to write a script that would automate this process, but I just didn’t have the time to spend learning how to do that.
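He mentions considering a script to automate that per-frame bake; a minimal sketch of the idea in Blender’s Python API might look like this (the frame range and output path are assumptions):

```python
import bpy

scene = bpy.context.scene

# Assumed frame range for the shots where the helmet is clearly visible.
HELMET_FRAMES = range(1, 121)
OUTPUT_PATTERN = "//renders/helmet_shot/frame_{:04d}.png"  # hypothetical path

for frame in HELMET_FRAMES:
    scene.frame_set(frame)

    # Re-bake Eevee's light cache so the reflection probe parented to the
    # character picks up the street lights from this frame's position.
    bpy.ops.scene.light_cache_bake()

    # Render the frame with the freshly baked reflections.
    scene.render.filepath = OUTPUT_PATTERN.format(frame)
    bpy.ops.render.render(write_still=True)
```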

Lighting is crucial to the synthwave aesthetic. How did you approach it in the video?

My approach to lighting was to just keep things dark and mysterious, and to emphasise the colour scheme. There’s also this running theme of figures silhouetted against a bright backdrop. Some of this occlusion by absence of light was there to hide some of the imperfections of certain meshes, but I could get away with it because it was thematically appropriate. I used a custom-made HDRI map with the colours I wanted, and pretty much used the same map across the whole project, which helped keep the look consistent. I used volume shaders throughout the whole video in various forms, and placed lights around them to get the different effects I was after.

For example, when inside the pyramid, the red glow underneath the platform was achieved with a red emissive material, several area lights directed upward, and cubes with volume shaders that had some animated procedural noise to mimic rising steam. My favourite use of the volume shader was the clouds just underneath the floating structure in the sky and how they react to the beam of light.
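As a rough sketch of that kind of setup (not the production file; names and values are made up, and it assumes Blender 2.81+ where the Mapping node’s Location is a socket), a “steam” volume material could be wired up like this:

```python
import bpy

# Hypothetical volume material: a Principled Volume whose density is driven by
# a noise texture whose coordinates scroll over time, so the pattern appears to rise.
mat = bpy.data.materials.new("steam_volume")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
volume = nodes.new("ShaderNodeVolumePrincipled")
noise = nodes.new("ShaderNodeTexNoise")
mapping = nodes.new("ShaderNodeMapping")
coords = nodes.new("ShaderNodeTexCoord")

noise.inputs["Scale"].default_value = 3.0
links.new(coords.outputs["Object"], mapping.inputs["Vector"])
links.new(mapping.outputs["Vector"], noise.inputs["Vector"])
links.new(noise.outputs["Fac"], volume.inputs["Density"])
links.new(volume.outputs["Volume"], output.inputs["Volume"])

# Keyframe the mapping's Z location so the noise pattern drifts upward like rising steam.
loc = mapping.inputs["Location"]
loc.default_value[2] = 0.0
loc.keyframe_insert("default_value", index=2, frame=1)
loc.default_value[2] = -5.0
loc.keyframe_insert("default_value", index=2, frame=250)
```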

There’s some impressive animation in the video. Can you tell us about your approach to animating and any particularly challenging sequences?

I took the opportunity on this project to improve my animation skills, down to building my own rigs, which took me a while to really wrap my head around. Knowing what I do now, I’d build the rigs differently, making them simpler and less prone to some of the deformation issues I had, especially with regards to more extreme poses. What I ended up with was enough for what’s seen in the video though, and I learned a lot in the process. One particularly challenging sequence for me was the walk cycle of the biker character as he walks across the catwalk and climbs up the stairs to the console. I didn’t know how to make sure his feet were planted on the surface without gliding around, so I tracked them to two empty objects which I animated in alternating steps in the direction of his walk, and moved the armature independently forward or upward, lifting and rolling the foot controls appropriately. Syncing all this was done manually and was quite finicky.
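One way to pin feet to empties like that (the bone and object names below are invented; the actual rig in the video is Murad’s own) is with Copy Location constraints on the foot controls:

```python
import bpy

rig = bpy.data.objects["biker_rig"]  # hypothetical armature name
foot_targets = {"foot_L": "step_empty_L", "foot_R": "step_empty_R"}

for bone_name, empty_name in foot_targets.items():
    pbone = rig.pose.bones[bone_name]
    con = pbone.constraints.new('COPY_LOCATION')
    con.target = bpy.data.objects[empty_name]

# The empties are then keyframed in alternating steps along the walk direction,
# while the armature object itself is moved forward or upward independently.
```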

When it came to camera movement, the sequence that starts with the descent into the city and ends with closing in on the biker took me a while to get right. I wanted it to be a smooth continuous movement, timed with the music, and have it read clearly as an establishing shot of the environment and the character. I previsualised this part with low-poly objects without materials applied to speed things up. I had most of the main elements tracked to spline objects, including the camera, which made it much easier to manage timing and smoothness.
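A minimal version of that spline-tracked camera (the curve name and frame timing are assumptions) uses a Follow Path constraint with an animated offset:

```python
import bpy

cam = bpy.data.objects["Camera"]
path = bpy.data.objects["city_descent_path"]  # hypothetical Bezier curve

con = cam.constraints.new('FOLLOW_PATH')
con.target = path
con.use_fixed_location = True  # position is driven by offset_factor (0..1 along the curve)

# Keyframe the offset so the descent lands on the right musical cues.
con.offset_factor = 0.0
con.keyframe_insert("offset_factor", frame=1)
con.offset_factor = 1.0
con.keyframe_insert("offset_factor", frame=400)
```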

After adding the completed meshes, one thing that I struggled with a bit was the flapping of the biker’s jacket. I spent a lot of time wrestling with a cloth simulation and could never come close to a good enough result, so I decided to try a combination of armature-based animation and shape keys. I added a couple of bones for the jacket that moved up and down, and made some shape keys with the jacket in a couple of different states that I would interpolate between rapidly. It worked rather well.
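A simplified sketch of that rapid shape-key alternation (the object, key names and timing are placeholders) could be keyframed like this:

```python
import bpy

# Hypothetical jacket mesh with two pre-sculpted shape keys, "flap_up" and "flap_down".
jacket = bpy.data.objects["biker_jacket"]
keys = jacket.data.shape_keys.key_blocks

STEP = 4   # frames between swaps; arbitrary value
frame = 1
for i in range(40):
    # Alternate which key dominates every few frames to fake wind catching the jacket.
    up = 1.0 if i % 2 == 0 else 0.0
    keys["flap_up"].value = up
    keys["flap_down"].value = 1.0 - up
    keys["flap_up"].keyframe_insert("value", frame=frame)
    keys["flap_down"].keyframe_insert("value", frame=frame)
    frame += STEP
```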

I used the Animation Nodes add-on for the shot of the skeleton forming, which I couldn’t have done otherwise. I wanted to use it a lot more extensively, but I decided to fall back on the standard Blender tools such as modifiers, animated procedural materials, and manual keyframing since I was already comfortable enough with those. It’s all just a mishmash of trial and error, compromises and happy accidents.

Finally, when it came to the more abstract visuals, I basically used several modifiers stacked on top of each other to break, bend and twist meshes around to approximate something akin to a psychedelic trip. The nice thing about this approach is how experimental it is, and I think for me it was the most enjoyable part of the process.
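To give a flavour of that stacked-modifier approach (the values are arbitrary and the target is simply whatever dense mesh is active), Simple Deform and Displace modifiers can be layered like this:

```python
import bpy

obj = bpy.context.active_object  # any densely subdivided mesh

# Stacked deformers, evaluated top to bottom, bend and then twist the mesh.
bend = obj.modifiers.new("bend", 'SIMPLE_DEFORM')
bend.deform_method = 'BEND'
bend.angle = 1.2

twist = obj.modifiers.new("twist", 'SIMPLE_DEFORM')
twist.deform_method = 'TWIST'
twist.angle = 2.0

# A Displace modifier driven by a procedural texture breaks the surface up further.
tex = bpy.data.textures.new("warp_noise", 'CLOUDS')
disp = obj.modifiers.new("warp", 'DISPLACE')
disp.texture = tex
disp.strength = 0.5
```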

The incredible music video for Affliction can be viewed at youtube.com/watch?v=kmm2bej0xla
