Unveiling the Magic: Ultra Music Festival, Miami 2018
Ultra Music Festival is a technology lover’s Disneyland. Every year, the month of March has all eyes on Miami: artists showcasing fresh sounds and music, visual artists showcasing fresh content and mad skills, festival organizers showcasing new heights of production design, and fans showcasing new levels of outrageous clothing. The festival has it all.[fold][/fold]
A close look at the Main Stage over the years tells us one thing: It is a visual artist’s wildest dreams come true. With super complex rigs, a robust mega structure to hold it all and processing power to tickle our nerd parts, it is always a really well-designed canvas where visual artists can come and show off. And man, do they come all guns blazing.
This year’s stage was remarkably symmetric, not just left to right but also top to bottom. We especially liked the ‘L’-shaped LED mesh columns that were used to create a massive X. Arranged to make full use of the Z axis, the ‘L’ shape ensured there were no gaps in the LED, no matter what angle you view the stage from.
A small, almost unnoticeable detail, but it makes all the difference in the world for an LED-heavy stage. The countless pixel strips and a massive lighting and laser inventory added the cherry on top.
In our Ultra series of blog posts, we look at what went down in making the enormous Main Stage a reality in this, Ultra’s 20th year. To start off, our go-to person is, of course, Vello Virkhaus, resident VJ at Ultra Worldwide for over 15 years.
A genius in anything related to pixels and visual production, Vello and V Squared Labs are real pioneers and have been an integral part of pushing Ultra to the technological heights it has achieved.
Thank you for doing this, Vello! Let’s get right into it.
How long have you been on this crazy ride with Ultra?
I have been associated with Ultra as a VJ and visual art talent resource for close to 20 years. Time flies when you are having fun!
Over these past 20 years I have seen so much growth in the visual arts community, having met visual artists from around the world, touring with the Ultra family.
Talk to us about this growth. What has stuck with you the most?
During the formative years of Ultra, I would VJ from opening till closing, performing 12-hour sets back to back with very few guest VJs. It was an epic journey, with wild weather thrown in, to say the least. I used to have management deliver artist visuals on DVDs and VHS tapes. Artists like Armin and Paul van Dyk would provide me with visual loop DVDs, some containing logos on blue backgrounds so I could key them out on my analog switcher and layer them over additional elements. Most of my show was standard definition and was still partially mixed off tapes.
Looking back to these early days feels like the dark ages now. This is just an example of the remarkable growth that has occurred. We have come such a long way in our technology and methods of expression. Look at the Armin show now, and see how much the visual art and music scene has exploded globally.
Another visible sign of growth has been the increased scale of the production and audience attendance. There was a time when rave culture was illegal and not commercially acceptable. There was a time when Ultra had only one 16:9 LED screen on stage as a backdrop with two flanking IMAG screens stage right and left. Maybe 100 tiles max. Now the main stage is 1587 LED tiles. WHOA. I still remember protesting for the Right to Dance and VJ’ing the night away at illegal warehouse raves in Chicago. Look at us now.
That is an incredible story. Look at us now, indeed.
So, circa 2018: let’s talk about the creative process behind the stage and content.
For content creation, I typically get animated materials from Luis Torres, Ultra's animation lead. The process involves a technical review and then a final delivery. I usually get 10-15 pixel-mapped animations which I incorporate into all the change-over looks throughout the day. This process has been established and in effect for many years now.
I also program custom generative effects using the Macro Editor/Panel Effects and Chasers/Tracers systems in Crescent Sun. The combination of pixel-mapped content layered with architectural, generative panel-based visuals makes for some good looks, and lots of possibilities.
I also curate content from my visuals library, along with integrating artist visuals for every Ultra. My approach has always been to group content into large theme banks, and to make sure I have enough stylistic variation in the main project to hop across aesthetic choices rapidly.
For Ultra Miami 2018, I used Resolume 6 along with Crescent Sun as the primary media server package. I use Resolume to quickly trigger columns of preset mixes, running 8 layers x 200 columns of content for my primary deck. I capture Resolume into Crescent Sun via a Magewell capture card, along with IMAG cameras. The combination of the two different programs (Touch & Resolume) has always been my jam.
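For anyone curious what “triggering columns of preset mixes” can look like from the outside, here is a minimal, hypothetical sketch in Python that steps through Resolume columns over OSC. The IP, port, address pattern and timing are our assumptions for illustration, not Vello’s actual setup; check the OSC addresses your own Resolume install reports before relying on them.
[code]
# Minimal, hypothetical sketch: stepping through Resolume columns over OSC.
# Assumes Resolume's OSC input is enabled on port 7000 and that the
# column-connect address follows the pattern shown below; verify the exact
# addresses in your own Resolume install before relying on them.
import time
from pythonosc.udp_client import SimpleUDPClient

RESOLUME_IP = "127.0.0.1"   # machine running Resolume (placeholder)
RESOLUME_OSC_PORT = 7000    # Resolume's OSC input port (configurable)

client = SimpleUDPClient(RESOLUME_IP, RESOLUME_OSC_PORT)

def trigger_column(index: int) -> None:
    """Connect every clip in the given column (1-based index)."""
    client.send_message(f"/composition/columns/{index}/connect", 1)

# Walk through the first 8 of the ~200 preset columns, one change-over
# every 8 beats at 128 BPM.
for column in range(1, 9):
    trigger_column(column)
    time.sleep(60 / 128 * 8)
[/code]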
When it comes to all things stage design, Richard Milstein is the mastermind behind Ultra’s recent stages. I know he currently collaborates closely with Ray Steinman and The Activity/Patrick Deirson for rendering, production and lighting design.
As the House VJ, I really enjoy the canvas I am provided to paint on.
And how much LED went into this colossal canvas?
A LOT!
The video was 50 ft (about 15 m) high by about 150 ft (45 m) wide.
Most of the stage was made up of 8 mm tiles. The columns and top fingers were 37 mm.
Between the upstage video wall, UMF logo, DJ booth and IMAG screens there were around 1,240 x 8 mm tiles, plus 350 x 37 mm tiles for the columns and top fingers.
Wow. That’s close to 1,000 sq m (10,000 sq ft) of LED. Massive, massive.
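For the number-crunchers, here is a quick back-of-the-envelope sketch of what those tile counts translate to in pixels. The 500 mm square tile size is our assumption (a common form factor), not a confirmed spec, so treat the totals as rough.
[code]
# Back-of-the-envelope pixel estimate for the tile counts quoted above.
# The 500 mm x 500 mm tile size is an assumption (a common form factor),
# so treat the totals as ballpark figures only.
TILE_SIZE_MM = 500

def pixels_per_edge(pitch_mm: float) -> int:
    """Pixels along one edge of a square tile at the given pixel pitch."""
    return int(TILE_SIZE_MM / pitch_mm)

def total_pixels(tile_count: int, pitch_mm: float) -> int:
    """Total pixels for a batch of square tiles of the same pitch."""
    return tile_count * pixels_per_edge(pitch_mm) ** 2

fine = total_pixels(1240, 8)    # 8 mm tiles: upstage wall, UMF logo, booth, IMAG
coarse = total_pixels(350, 37)  # 37 mm tiles: columns and top fingers

print(f"8 mm tiles:  ~{fine / 1e6:.1f} megapixels")
print(f"37 mm tiles: ~{coarse / 1e6:.2f} megapixels")
[/code]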
You guys seemed to process all of that like child’s play.
For processing, AG provided the Barco E2 as the primary screen management tool. It easily handled the complex mix of HDMI/DisplayPort inputs and outputs.
The E2 also made it possible to set up multiple preview monitors so guest VJs could view their signals and check pixel maps before going live. The E2 has very low latency, one frame or less for progressive sources.
When our international vendors do not have Barco technology, we also integrate the Analog Way Ascender units.
Running one of the biggest festival stages around the world, you obviously face an onslaught of guest VJs running timecode, live feeds and inputs to LED processors. How do you handle the madness?
Overall, we have been trying to consolidate and group 1920 HD outputs into 4k signals to reduce the number of cables needed at FOH, per setup. There are usually 3-4 VJ positions at each Ultra around the world, which have ethernet for TC, audio PGM and HD-SDI for camera feeds as well as inputs to processors.
My solution to the FOH craziness was to create an Ultra VJ rider that we can distribute to the various production vendors around the world. This, combined with effective artist rider review and solid team communication, makes the experience flow smoothly. With the rider and emails, our teams know exactly what everyone needs and wants, and can provide enough runs to accommodate all artists well ahead of time.
Coordinating it is definitely tough, and could not be done without the help of Ray Steinman and the awesome production team from Ultra.
When you’re pushing so many outs and running systems continuously in the heat, your equipment must be top-notch.
What gear do you recommend?
For my hardware, I run a custom-built Windows 10 PC inside a Cooler Master case.
I chose this setup as it can accommodate a full-size graphics card, a hot-swap drive bay, an M.2 slot and a Corsair high-performance liquid CPU cooler. With this configuration, I can run 3 x 4K outputs, no problem, using the Titan X Pascal card.
This portable setup stays cool in the most demanding circumstances, like Ultra China, when it was 100+ degrees outside during opening sets. While my laptop slows to a crawl, this rig keeps on rocking.
What was your biggest challenge this time round? Any words of advice for our budding VJs out there?
For me, the biggest challenge this year was using the experimental VJ software Crescent Sun along with the new version of Resolume 6. I went a little crazy on clip loading with over 200 columns. Figured, hey, it’s 64-bit and can take it. Then it took five minutes to load my show file. Oh boy!
The best way to overcome challenges for me involves as much practice as possible and pushing through it with good old elbow grease ;-)
As for advice, don't ever take things personally, it's a very challenging business.
True that. Wise words.
Thanks Vello for talking to us. It has been a pleasure. Congratulations on your many accomplishments and on being such a boss-man. Here is to creating a lot more magic and pixel pushing together :)
See you guys in the next blog post. Until then, go grab that elbow grease.
Credits:
Ultra Creative Director & Production Designer: Richard Milstein
Ultra Animation Lead: Luis Torres
Ultra Production Director: Ray Steinman
2018 Main Stage LED & Tech: AG Lighting & Sound
2018 Main Stage Video Engineer: James Watral
Lighting Designer / LD: Patrick Deirson and The Activity
Main Stage MC: Voice of Dance Music, Damian Pinto
Imaginex VJ assist / system programmer: Eric Mintzer
Photos: Eric Mintzer, Rudgr.com & Vello Virkhaus
KBK - Awakening The Beast
If you are into live visuals and motion graphics, KBK Visuals is where it’s at.
Founded under the name Kijkbuiskinderen in 2003, they’re super multidisciplinary, with backgrounds in graphic design, animation, graffiti, photography, music et cetera. A collective of specialists, they expertly tie together the unending realm of visual art, installation, stage design, motion graphics and automation. It’s all about video and how video can relate to its surroundings. Having worked with the biggest artists at the coolest festivals, they have already gone down in new-age history.
[fold][/fold]
They changed their name to KBK Visuals in 2012. Today, KBK comprises 10 people who have it all covered, be it management, 2D or 3D animation, operating, programming or research. Most of them carry multiple responsibilities. They often seek collaborations with other artists as well, not necessarily VJ-related ones.
This brings us to the Dutch mega-event Awakenings.
Since 1997, Awakenings has been dropping relentless one-night parties and festivals. This year they hit 20, and to celebrate, the Easter rampage from April 13th to 16th at the Gashouder was, for lack of a better word, lit.
This Awakenings gig had robotic arms with LED screens on them. Just so cool. And, like all things cool, super complex.
We caught up with the guys from KBK Visuals to talk about the gig.
Thanks for doing this, guys.
It is pretty evident how KBK and Awakenings have grown together to create concert history. Since when & how has KBK been associated with Awakenings?
KBK has been associated with Awakenings for almost 10 years now. The first shows from 2008 until 2010 were in collaboration with Eyesupply, with KBK taking the task over fully in 2010. Over the course of the years Awakenings shows have become bigger and more elaborate, and we have grown with them. At this point we have provided visuals for well over 100 Awakenings shows.
The Gashouder is as beautiful as it is legendary. What do you keep in mind while designing shows here?
The Gashouder is a beautiful venue in and of itself. With its origins as a gasworks, it’s easily recognizable by its circular shape, tall ceiling and industrial appearance. The fact that it’s cylindrical allows stage designers to go nuts, and especially during Awakenings a lot of thought is put into creating an experience that is as breathtaking and immersive as possible. They never shy away from complex and often massive LED setups. Whenever a new edition comes along, we try to pay as much attention as we can to our relation to the lights and lasers, so we can create a show that works aesthetically as well as melting faces.
Tell us about your rig at Awakenings.
For pretty much all editions of Awakenings we use one or more heavy duty laptops running Resolume Arena to act as servers. At the moment we are using custom built Clevo laptops. These laptops capture HD inputs from other laptops, mixers and SDI camera feeds via a capture device that is supported by Resolume, like the Blackmagic Ultrastudio 4K. The servers act as a kind of routing system and are used to distribute and chase the content across the various screens, as well as overlaying logos and additional graphics.
With this setup we can use the huge LED rig to its full potential, and it’s flexible enough that we can change things on the fly. This is incredibly valuable, as it’s not unusual for ideas to come up during the show, and Resolume allows us to quickly make changes and add things to the set.
When you do something groundbreaking, the challenges are groundbreaking too. Tell us about your biggest challenges on the Easter gig and how you overcame them.
Every new edition of Awakenings provides new challenges, and we often get inspired by the technological curveballs that get thrown at us.
When Awakenings told us they were going to hire some industrial robotic arms and put LED screens on them, we knew we had to do something creative with them to take it to the next level.
The arms are basically repurposed factory tools, and their motion is pre-programmed and triggered live by a dedicated operator. We wanted to figure out a way to translate that motion to the visuals by using the rotation of the screens on the arms, and we did.
Merijn came up with the idea to stick iPhones to the back of the screens and use them to read out the phone’s accelerometer.
He approached our MaxMSP specialist Sem, who took the idea further and wrote a patch that translated the phone’s tilt information to X/Y/Z rotation in 3D space and transmitted it to a laptop running Max at front of house.
Sem created some custom visuals that made use of this data, which were then captured by our main Resolume server and sent to the screens on the robot arms.
The whole project was conceived and executed within a matter of days.
For those of you who want to dig a bit deeper into this, here is a bit of technical babble, as described by KBK: “The software that’s generating the live rendered physics was built with Max 7. Each of the three robo screens has its own compiled version of a patch that renders the physics in real time, based on the angle of the iPhones. Each patch has 3 scenes and individual parameters like color selection of the lights and size of the objects. Every patch has a Spout output, and there is one patch combining all three Spout outputs to be sent to the main Resolume computer via TCPSpout. There it’s imported as a Spout input and scaled to fit the composition. All the apps are connected over a local network via UDP. This way it’s also possible to control all the individual apps with one controller on another computer. The laptop running the apps was tucked away, so this came in very handy. There’s one central control app (called OSCBOT) not only controlling the physics renders but also controlling Resolume dashboard links with the angles from the iPhones. This way we can rotate visuals in the opposite direction to the rotation of the robot arms, so it looks like the visuals are standing still while the robots are moving.”
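To make the counter-rotation idea concrete, here is a small, hypothetical Python sketch of the same plumbing: listen for a tilt angle coming in over OSC and send the inverted, normalised value on to a Resolume dashboard link. It is not KBK’s actual patch (theirs was built in Max 7), and the two OSC addresses are placeholders you would replace with whatever your phone app and your Resolume OSC map actually expose.
[code]
# Hypothetical counter-rotation plumbing in Python (KBK's real version was a
# Max 7 patch). Receive a robo-screen's tilt angle over OSC, invert and
# normalise it, and forward it to a Resolume dashboard link so the clip
# rotation cancels out the physical rotation of the arm.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)        # Resolume's OSC input

def on_tilt(address: str, angle_degrees: float) -> None:
    # Map -180..180 degrees to a 0..1 parameter range, flipped so the visual
    # appears to stand still while the arm moves.
    normalised = (-angle_degrees % 360) / 360.0
    # Placeholder address: copy the real one from Resolume's OSC shortcuts.
    resolume.send_message("/composition/dashboard/link1", normalised)

dispatcher = Dispatcher()
dispatcher.map("/robot/1/tilt", on_tilt)             # address the phone sends to (assumed)

# Listen for incoming tilt messages from the phone / Max patch.
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
[/code]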
Just amazing. You guys do make complicated rigs seem like child's play :)
We put a lot of effort into creating input maps that give us a comprehensive view of the available screens and allow us to effectively distribute footage across them. After we receive the stage designs and info on the LED setup, we think about where we want to place our footage during the show: whether we want to see it full scale across multiple screens, duplicated, or a bit of both. We build our input map with as much flexibility as possible in mind, while trying to maintain a composition of manageable size.
Needless to say a powerful computer allows for larger scale compositions, which can in turn open up new possibilities as far as scaling your content goes. In the case of large setups the screens are often spread across multiple outputs. We often use Datapath X4 display wall controllers to split our signal across multiple outputs if we really need to cover a lot of pixels.
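As a rough illustration of that last step, here is a minimal sketch of the bookkeeping behind splitting one large canvas across a 2x2 wall controller such as the Datapath X4: each physical output simply gets its own crop rectangle out of the single frame the server renders. The resolutions are made up for the example, not taken from KBK’s show files.
[code]
# Minimal sketch of the bookkeeping behind a 2x2 wall-controller split:
# each physical output gets one crop rectangle out of the single large
# frame the media server renders. Resolutions are illustrative only.
from dataclasses import dataclass

@dataclass
class Crop:
    name: str
    x: int
    y: int
    width: int
    height: int

def quadrant_crops(canvas_w: int, canvas_h: int) -> list[Crop]:
    """Split a canvas into four equal crops, one per controller output."""
    w, h = canvas_w // 2, canvas_h // 2
    return [
        Crop("out-1 (top-left)", 0, 0, w, h),
        Crop("out-2 (top-right)", w, 0, w, h),
        Crop("out-3 (bottom-left)", 0, h, w, h),
        Crop("out-4 (bottom-right)", w, h, w, h),
    ]

for crop in quadrant_crops(3840, 2160):
    print(f"{crop.name}: {crop.width}x{crop.height} at ({crop.x}, {crop.y})")
[/code]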
Are there any tips you would like to give budding visual artists out there? Hardware, software or life related?
We started small, doing all kinds of underground shows and developing our skills from the ground up. A lot of that DIY mentality hasn’t changed.
We still try out strange ideas (like sticking iPhones to the back of LED screens) and are people who generally enjoy solving the puzzles that a show can present us in a creative way. Quality comes with practice, and practice involves failure.
Also, make sure to drink a lot of water, and bananas are a healthy source of potassium and vitamin C.
Yum. We're going to get cracking on those bananas right away, all sorts of pun intended.
Thanks for talking to us, guys!
KBK is: Freek Braspenning, Merijn Meijers, Luuk Meuffels, Tristan Gieler, Eva Imming, Marike Verbiest, Elisa Zaros, Chanon Satthum, Kees van Duyn and Sem Schreuder.
Check out more of their work on their website, Vimeo and Instagram.
Cosmic Connection with Total Unicorn
“On a breezeless day, in total darkness, on the underside of the planet, somewhere in Austin, The Protocorn pulled a laptop from a rock and called itself Single Unicorn. Drawn to the roiling lumps of the sound waves, other corns appeared: an impossible trio. The Unicorn became Total.“

What do you get when you put a composer, visual artist and choreographer together? Cool content.
Throw in a unicorn… and things get cosmic.[fold][/fold]
Enter Total Unicorn.
A performance-based group with multiple twists: Composer Lyman churns out some serious grooves. Consciousness-expanding, hallucinogenic, dope.
Visual specialist Stephen backs the music up with some crazy illustration. Layers, colors, dimensions, all of it synced to make sense in different parallel universes.
Lindsey, the choreographer, perfectly rounds out the performance with resounding energy and elysian moves. Man, these Unicorns know how to tune in.

We caught up with these guys to talk about their show and the world they live in, and to see if we could bring some of that awesome sauce back to Mother Earth.
It all began when, according to Lyman, Stephen had an idea about the visual delivery of Aesthetic Acid, that you could dose yourself or others with beauty itself, and asked him to illustrate that idea with music.
“He then proposed we do this live, to dose entire audiences with our Aesthetic Acid, on a mass or semi-mass scale. Thus was born the idea of Total Unicorn. Lindsey (Unicorn Pink) happened upon one of these happenings, and offered to add a dance physicality to things.”

Total Unicorn say they were complete when the three dimensions of vision, sound and dance were present in equal quantities. But the journey begins with the music.

Says Stephen, “The music usually starts the conversation. Often times, a moist concept forms in the dark. The witch reads the coffee grounds, the oracle orates, etc. Nowadays, story is critical when we’re making new material. We like pieces that introduce a new character, or that can elaborate on our backstory. I like to make apocryphal stuff about us. Like, if you’re paying attention, there’s a universe to discover.”

Check out some of their work here:
When asked about his process of creating the audio, Lyman says it starts very simply with a beat, sequence or sample that catches his ear. It then gets run through various electronic and virtual processes- which adds a pulse and makes the sound breathe. Companion sounds are then added and with EQ, compression and other “nurturing” elements, the whole breathes as one. “There is no premeditated idea of what should be the sound. The sound comes to life, and tells me what it needs, and I attempt to deliver those goods to the best of my ability.”

About his visual design, Stephen has a pretty cool process. He loves creating narratives. These ideas only evolve when he plays around with contributing “items”.

“I have these huge libraries of rotoscoped images, 3d models and vector artworks that I’ve created over the years. I synthesize things with those, to get warmed up. Also, I like to do image research at the beginning of each new piece, and explore an obscure art movement, subculture, mythology, memes, mimes and different times”

“I love creating costumes from vegetables and auto parts, and I love modeling environments in 3D. I could spend a day just working on some filigree or other tiny detail. I suppose that I could sketch things out, and not waste so much time, but I would miss out on the spontaneous hexplosions.”
Hardware & software-wise, these guys have a lot of components coming together to create their vision.
Lyman recommends Ableton, which, of all the DAWs out there, is what he uses to build and indulge in the complete mutation of sound. Pro Tools is occasionally used as well. He uses a lot of secret and not-so-secret plug-in mutation tools (the special sauce) which help break down and reassemble the component parts into music. Analog sources (Moog Sub-37, Realistic MG-1, Elektron Analog RYTM, little Korg boxes, etc.) meet digital sources (Native Instruments, Sugar Bytes, Glitchmachines, et al) and “hilarity ensues.”


On the visual side, Stephen uses the Adobe suite and Cinema 4D to produce animations. Some of it is cel or 2D, a lot is 3D, and they occasionally shoot live action on green screen. Stephen then distills that in After Effects and renders it for Resolume.

“There, it’s chewed up and broken down into digestible proteins. Resolume adds the presence, the performative quality, and the seasoning. Effects are added, and clips are broken up and cue points are programmed. If mapping is needed, if audio-reactivity is required, then a sacrifice is made, and the ancients are satisfied."

Resolume, as per Stephen, is the most straightforward, reliable and affordable tool for performing live projections. “I encourage a lot of my motion graphics friends starting out with live visuals to give it a go. The DXV codec is one of its strongest attributes, in my opinion. It makes performing with it, across operating systems, very stable.”

“I have actually done shows with artists using other applications and watched them crash or get bogged down with their clips at critical moments, while Resolume powers through. It’s insane how many layers of HD I can mix with in real-time on a laptop. Also, the interface is so intuitive. It’s really easy to visualize my composites when I can see thumbnails, and the layering is reminiscent of most editing programs, so that felt very familiar right away. I can label everything and plot my set left to right. It feels very natural, like writing.“

Lindsey has a “library” of moves of her own. They are essentially things that look cool in the mirror. “I then combine those with other moves from memes, GIFs and Youtube videos. Finally, I season with aimless flopping, slo-mo sequences, and feigned distress.”

When they perform live, Lyman runs Ableton Live with an APC40 controller, a vintage Memory Man and a Red Panda Particle for a few delays and effects, and the Moog Sub Phatty for sequences, all running through a Native Instruments Komplete Audio 6. He also just picked up an Elektron Octatrack: “so once I get my head around that universe, I think the music will migrate into the boxes and the computer will stay at home.”
Stephen’s setup is partially a holdover from a time when he was using slower laptops. One laptop runs Arena, used to drive the projections on the larger screen behind the band. The second runs Avenue and drives their smaller, free-standing vertical screens.
“I still have this Livid Block from several years ago which just keeps on ticking. That triggers effects, clips and layer mixes, and I use the knobs for time-scrubbing clips on the fly. I also use this old Korg drum pad to trigger video cues so I can play them rhythmically. And there is a game controller (or whatever other type of USB controller I can get my hands on) to trigger clips on the smaller machine. I run this old Max patch that converts the controller signals to MIDI (or OSC), kind of like a primitive version of OSCulator.”
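Stephen’s version is an old Max patch, but the same “controller in, MIDI out” idea is easy to sketch. Below is a rough, hypothetical Python equivalent using pygame and mido; the note numbers and polling rate are arbitrary choices, and the VJ software would then MIDI-learn the resulting notes as usual.
[code]
# Rough Python equivalent of the "game controller in, MIDI out" idea
# (Stephen's real version is an old Max patch). Button and note numbers
# are arbitrary; the VJ software then MIDI-learns the notes as usual.
import time
import mido
import pygame

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)      # first controller found
pad.init()

midi_out = mido.open_output()          # default MIDI output port
BASE_NOTE = 36                         # button 0 -> note 36, button 1 -> 37, ...
previous = [0] * pad.get_numbuttons()  # remember the last state of each button

while True:
    pygame.event.pump()                # refresh controller state
    for i in range(pad.get_numbuttons()):
        state = pad.get_button(i)
        if state and not previous[i]:
            midi_out.send(mido.Message("note_on", note=BASE_NOTE + i, velocity=127))
        elif previous[i] and not state:
            midi_out.send(mido.Message("note_off", note=BASE_NOTE + i))
        previous[i] = state
    time.sleep(0.005)                  # ~200 Hz poll, easy on the CPU
[/code]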
Lindsey has the most interesting piece of equipment of all. A liquid spine with all the attached accoutrements.

Essentially believing in freestyle, Stephen creates the visuals to the track. But when they perform, everything is triggered live by his living hands. :)
They decided, early on, not to have everything triggered automatically by one machine. Says Stephen, “The performance is essential. Rehearsal, repetition, regeneration. I mean, there are moments of improvisation, but we like to keep it tight.”
Lindsey’s dancing is a combination of choreography and improvisation. “On one hoof it’s important for me to know when the musical changes occur, but on the other hoof I need to keep it loose enough to be able to adapt to different situations as they occur. “
When asked where Total Unicorn draws inspiration from, Stephen said, “From everything and everywhere. It’s all blended in the hyperspindle: the part of the brain filled with the softest, richest, most flavorful cream. Our cream is abundant, allowing the spindle to swell.”
Hmm… Until we see you again, ponder this:

"The Unicorn is perceived as this fanciful, sweet creature, but it’s really kind of terrifying. A Unicorn can be anything to anybody, and since few have seen one in person it gives us a freedom to be who we are, without fixed ideas or boundaries."
Photo Credits: Allison Turrell, Celesta Danger and Maye Marley
Where Wolves Roam
Wolves are slightly eccentric, mega-talented visual artists who are making their presence felt across the globe. With fabulous skills to boast of, Joshua Dmello and Jash Reen have been racing fast to the front of the (visual) pack.

They’ve worked with setups of all sizes, from tiny to large to omg-so-massive, and have delivered to the hilt over and over again, be it with projection or LED.
[fold][/fold]
From Sunburn to EDC to Beyond Wonderland, and Noisia to Nucleya to Flux Pavilion, Wolves have definitely carved a niche for themselves and are well on their way to achieving their dream of world domination.


When you look through their work, what is striking is their creativity in LED mapping.
This blog post looks to cover some of their best art, understand their psyche & give you a grand display of some of their prized possessions.
Thanks for doing this, you guys!
Tell us a little something about where it all began for you. At what point did you realize visuals was your calling?
It pretty much started when we met in high school and played a whole lot of video games. We were so obsessed with films, comic books and games growing up that, after trying to chase the proverbial ‘calling’ in the real world (Josh has a background in his dad’s light and sound business, and Jash as an audio engineer and journalist), we somehow snapped back to playing video games again. Only we’ve got way bigger screens now and a lot more people are watching.

What do you prefer: Projection or LED? And why?
LED. We respect and admire a lot of projection mapping setups in the art installation space. But we reached a point of exhaustion with upholding that medium in larger music arenas and festivals. It felt like we were alienating the larger elements of production, like the lighting rigs, lasers, stage FX and of course the performing artists themselves. You can’t have them performing at the same capacity if you have to worry about the lights overshadowing the projections.
So we took the fundamentals of projection mapping and chopped and screwed our input and output maps to fit the most luminous and stubborn of LED surfaces. Every show’s a new challenge and we never repeat a setup twice.
Tell us the process you follow for pixel mapping.
It starts with us coming up with a stage design, which we either conceptualize from scratch or which is given to us by the client or festival. Once we get the tile sizes and resolutions, we figure out a way to create the most effective pixel map: one that can fit custom content as well as existing 1080p content that will look correct moving across the entire stage. Doing a 5-6 hour music festival is not feasible on custom content alone. If you have an accurate output map as well, you can do most of your mapping from the hotel room itself, aside from a few on-site tweaks.
Resolume offers mapping and playout of content on the same platform; very few other packages offer the same ease of access and user-friendly interface. The snapping features, keyboard shortcuts and the ability to set virtual displays make mapping in Arena a breeze.
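To give one tiny, concrete piece of the maths behind “fitting existing 1080p content” onto odd LED shapes, here is a hedged Python sketch of a cover-fit calculation: how much a stock 1920x1080 clip has to be scaled to cover a surface with a very different aspect ratio, and how much of it gets cropped away. The surface resolutions are invented for the example, not taken from Wolves’ maps.
[code]
# Hedged sketch of a cover-fit calculation: scale a 1920x1080 clip so it
# fully covers an LED surface, then report how much overflows (gets cropped).
# Surface resolutions below are invented for illustration.
def cover_fit(surface_w: int, surface_h: int, clip_w: int = 1920, clip_h: int = 1080):
    """Return the scale factor plus the horizontal/vertical pixels cropped off."""
    scale = max(surface_w / clip_w, surface_h / clip_h)
    scaled_w, scaled_h = clip_w * scale, clip_h * scale
    return scale, scaled_w - surface_w, scaled_h - surface_h

for name, w, h in [("fascia strip", 3072, 256), ("side column", 384, 1536)]:
    scale, crop_x, crop_y = cover_fit(w, h)
    print(f"{name}: scale x{scale:.2f}, crop {crop_x:.0f} px horizontally, {crop_y:.0f} px vertically")
[/code]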

From the Imperial guard to Nucleya’s crazy stage to the current tour rig with Flux, how do you come up with different LED designs? Is there a brief you follow? Or do you adapt as per the content that you visualize?
There’s never a hardcore brief in play, rather an intense few weeks of conversation between us and the artists/clients we’re working with. Our inspirations personally come from larger-than-life cinematic universes: think Guillermo del Toro explaining Pacific Rim for the first time to a boardroom of studio execs and you know the kind of take we go in swinging with. Often we get so invested in bringing our content to life that it becomes part of larger narratives we rarely get to talk about in a field like live visuals, but it definitely helps us tie it all together. We can give you three examples:
1) When Nucleya’s team approached us for his Bass Yatra Tour, we had around 600-700 sq ft of LED available and that many different versions of the LED layout to explore. We did a few days of pure illustration work along these layouts. Sometimes things just don't click and we break it up and start again. There’s a moment where the practicality of having this layout on stage meets the insane war demon sketch you’ve been rooting for from the start, and then we all sleep well at night.


2) Some of the bigger stages we’ve done like the ‘Outer Realm’ stage at Beyond Wonderland in Los Angeles throw a real curve ball at us. The stage had multiple narrow arches extending from the stage over the audience with very little room for seamless content.

We took it upon ourselves to create two worlds, loosely themed as An Enchanted Forest and A Lost Ship, and somehow adapt them to the whole spread of LED. Keep in mind this was for a stage hosted by the incredible Bassrush crew, and we knew things were about to get heavy. Those pleasant environments gave way to a forgotten labyrinth of cogs and pipes powered by a mechanical eye and a set of makeshift Icarus wings, and it all fit!


3) Fast forward to the Flux Pavilion tour and we’re working with a limited itinerary. The production company and label (Bigabox Productions and Circus Records) stock their LED in-house and had to scale it down to a realistic number of panels that would tour safely across all cities and fit in every venue. When they gave us the LED layout and pixel map, we were determined to follow through and not fall back on a 16:9 screen.


The other half of the conversation was with Flux Pavilion himself who had a very distinct vision of these worlds he wanted us to create for different sections of his set. It really pushed us to get a collective vision across that would fit on four sections of LED (one being the fascia) and still be immersive enough to draw audiences in every night.

We ended up with one hell of a ride: a quirky intro feature that draws people from a scenic British landscape into the strange worlds of Flux Pavilion. At one section of the set, we have a bionic ship carrier take over the LED and transport him between these worlds. At another, a mad-scientist rig of electricity pylons takes over the screens and charges up flags made of coloured electricity. It gets weirder as we go on, and we’re pretty much developing these worlds as we go along on tour. Every night has had a great response and we can’t wait to see what the show turns into.

Tell us about your content creation process, from scratch to the final render. What software do you use, and how much time does one clip take, on average?
We throw ideas back and forth to settle on a base LED design or projection surface. Once that’s finalized, we start to sketch over it. Not only does this help control our ideas, it also ensures we’re creating something that will function at every step.

The sketches are then scanned, digitized and modelled in a 3D realm. We then gather up as many parts as possible to see what we can play around with in terms of animation. We start off with Illustrator to vectorize the sketches and then move on to Cinema 4D for the 3D modeling and animation. After that, it’s taken into After Effects for final compositing and animation.
We have a great team on board that makes doing all of this such a breeze. They work round the clock and deliver spot on content. Shout out to them!


It can take anywhere from one day to around three or four days, depending on how complex the clip is in terms of 3D animation, texturing and lighting.
Let's take a walk through your studio. What hardware do you use, what is your most prized possession and what would you like to change/ upgrade?


Like most VJs out there, we’ve realized that MacBook Pros just don’t cut it anymore. The heating issues, the performance of the AMD cards, and throw in the ‘Donglepocalypse’, and it’s a no-brainer. We’ve switched to Windows laptops for our tour machines.
We're currently using the Octane II from PC Specialist.

Specs:
Intel Core i7-6700K quad-core processor
32 GB RAM
GeForce GTX 1070
1 TB Samsung SSD + 1 TB Seagate hybrid drive
For display outs it has 2 x Mini DisplayPort, 1 x HDMI and 1 x USB-C/Thunderbolt 3.
For our live feed needs we use the Magewell HDMI-to-USB 3.0 capture device.
We’re using multiple Windows servers for content generation, with dual GTX 1080 GPUs, an Intel i7-5960X CPU, 32 GB of RAM and 2 TB of SSD storage in each machine.
Our most prized possession would have to be the TMNT statues we have based on art by James Jean.

[attachment=0]imageedit_1_5038307659.jpg[/attachment]
What process do you follow live? Do you prefer freestyling or is Midi/ SMPTE your friend?
We’re firm believers in doing everything live. We make our content in layers with alpha so we can explore different versions easily and no two sets end up being alike. It keeps the set interesting for us and helps us play it out with an instrument just like any performing artist would.
It also helps us build upon the content on the road and add more layers to it as our heads come up with something. If we go a bit too far we use Resolume’s crazed set of essential effects to roll out of a situation and come back with another banger. True story.
Is there anything more you would like to talk about?
We're playing around with expanding the Wolves brand because a lot of people seem to connect with it. It's all non-profit and DIY at this stage. Our first step was starting a merch line that we want to promote within an immediate community of artists before it makes its way out to a larger group of people. We started handing pieces out ourselves at shows: backstage, outside venues, on long tours and of course to all the wonderful crew we've had a chance to work with at front of house. It's like marking our path through the trenches of the creative industry with a sort-of-cult symbol rather than it becoming a household brand.

For people in India who'd like to represent, we've partnered with Redwolf to stock the latest 'MK-II' designs online: http://www.redwolf.in/wolves
We're also toying around with the tag 'New Wolf Order' to host a series of video content. At present, we're running with a vlog called Transmissions.
It's really rough, surreal first-person edits of what it's like to be at a Wolves show. Josh and I often argue that some of the videos lose the point entirely, haha, so hopefully we'll promote it better soon and start featuring more from a technical standpoint. If there's an audience for that, we're game.
Thanks guys for taking the time out to talk to us and giving us all these cool details.
Next up for Wolves: The Basspod Stage at EDC, Las Vegas.
Howl at y’all there.

Photo Credits: BRXVN. Follow him on Facebook & Instagram
Visual Pixation with Rick Jacobs
Our quest for excellence in the visual space has now brought us to Rick Jacobs of RJB Visuals.

Currently touring with Nicky Romero, and the man behind the operation and visual design of his entire show, Rick does work that is epic on a massive scale. [fold][/fold]

What do we love about him?
He makes great, heavily detailed content which is then displayed perfectly in sync with what Nicky is doing on stage. I personally love the magnitude and depth with which he portrays infinity, space and its inexplicable wonders.


We reached out to Rick to talk to us and shed some light on the great work he is doing.
What is touring with Nicky like? When did this great journey begin & how would you say you have grown with it?
It started four years ago; my first show with Nicky was Ultra 2013, on the Main Stage. I was so nervous, with everybody at home watching: my friends, my family. Before that, I had only ever VJ’d at clubs with a single output. So, for Ultra, I brought two laptops to handle multiple outputs, being the newbie I was back then ;)
Nicky and the team were impressed with that first show and invited me to tour with them. I chose to finish school first, because there were just three months left. I graduated as a game designer and developer, and missed my graduation ceremony because I went straight to Vegas to tour with Nicky.

When I finished the tour I started RJB Visuals and teamed up with my brother Bob, who was studying game art. Our teamwork was put to the test immediately: we needed to conceptualize and create a 5-minute intro visual in 3 days!
Nowadays, we plan 1 month for an intro - this has become kind of our signature.
Here are links to some intros: Novell & Omnia
It’s been a really awesome journey so far; Nicky and the team trust Bob and me with the whole visualization of the show. When I started, they more or less had just the Guy Fawkes mask, so I had the freedom to design and develop a whole new visual style for his shows, which was really great!
Here is a sneak peek into the latest intro for Nicky:


You and the LD do a great job live. How much of a role does SMPTE play in this & how much is freestyle?
The first 2 years that I toured with Nicky, we didn’t have an LD. After that, Koen van Elderen joined the team and I couldn’t have been happier! The guy is great, he programs really fast and we come up with new things while we are doing the show. We just understand each other immediately.
The whole show is freestyle; we never use SMPTE. It keeps us focused. Also, I don’t link all visuals to songs. One day a song has these visuals, the next day you’ll see something different - it depends on what colors Koen and I yell at each other.

For all lyrics I use cue points so as soon as I hear Nicky mixing in a track with vocals I’ll ready up the visual and start cueing it.
From on-point strobes, to perfect transitions, to super color changes - there’s gotta be a lot of concentration, communication & practice involved between you and Koen.
Like I said, Koen and I are just really on the same page. We make up new stuff during the show and remember it for the next show. We normally don’t receive a playlist or a lot of info on his set, so we often get some nice surprises and have to come up with something along the way.
It usually goes something like this: “If you take the TISKS, I’ll take the BOEMS.” “Sure thing.” Or whenever there are really powerful accents in a song, we look at each other and ask “do you want to take these or shall I take them?” Haha!
It’s fun to change stuff around now and then.


Also, at each outro of a song we turn to each other and one of us will say the next color, and we change it at the same time when the song changes over. Or, if it’s a familiar song with its own visuals, we both already know what to do, or I make hand gestures of the visual that is coming up next so he will know the color. Sometimes I will be stone-faced, visualizing a sea with my hands, and he will know which visual is coming up.
What are your favorite effects & features on Resolume, that you would encourage VJs to incorporate into their show?
Mostly my effects are small flashy clips linked to buttons on my MIDI panel, but my knobs are linked to various screenshake/twitch/strobe effects. Mainly all sorts of small effects to highlight certain melodies or accentuate the bass.
What brief/thought process do you follow while designing content for the show? We see a whole range from nature to space to waterfalls to abstract.
We try to create contrast between drops and breaks by changing the color scheme, style and pace, while at the same time trying to keep the transitions as fluid as possible. Break visuals for Nicky Romero's show are often desaturated/black-and-white realistic-looking visuals, while the drop visuals are full of flashing neon colors and abstract shapes loosely based on the realistic-styled visual. Putting these completely different styles together in one song works as a great crowd booster.

The risk of mixing these completely different styles after each other is that it could lead to too harsh a transition. We're not big fans of fade-ins, so several visuals have an actual intro clip that will autopilot to the next clip, which is a loop. They're sort of a 'special fade-in' to a loop, starting off black and having the visual's scene unfold in a smooth transition.
Here are some Intro Clips:



Talk to us about your go-to software & hardware (both for content creation & operation).
Most of our content is created in Cinema4D with the Octane renderer. For all the intros we use Autodesk Maya; since we have a history in game design and development, we were pretty used to working in Maya or 3ds Max at school. It just has a few more options to get that exact look you want for the intro.
When we started creating short visual loops we soon realized Cinema4D is much more straightforward for creating visuals. For post we use After Effects. And, of course, for VJing: Resolume!
As for hardware, I’m using an MSI GT73VR 6RF Titan Pro and the Akai APC40MK2.
Tell us about your studio. What gear is an absolute must have, and what would you like to change/ upgrade?
My studio isn’t that great actually, haha; we have a small office in Limburg at my parents’ place. One of our employees is also from Limburg, so half of the week we’re working in Limburg and the other half in Utrecht.

We have a small setup in my apartment in Utrecht, my brother lives with me so it’s easy to work from home. In the near future we’re planning to get an office as we’re expanding and looking for new people to work with.
As for an upgrade, I really need more render power, haha - with all this 4K visual content, rendering is a nightmare.


Any words of advice for budding visual artists out there?
Less is more! Don’t layer 6 completely different visuals on top of each other and mix them every beat. It can become chaos really easily. Also black is your friend, leave enough black space to make your visual really pop out.
Is there anything else you would like to talk about? We would love to hear it.
Our most recent development is that we’re starting a collaboration called Visual Lab with Hatim. Hatim was the reason I started VJ’ing for Nicky, and over the years we built a great bond, as he is Nicky Romero’s tour/production manager.
As probably all of us here know, talking and arranging gigs/assignments is the least fun part of our job, so it seems like a great idea to have someone do that for us. It also seems like the next big step for our company and will lead to us hiring more talented VJs and content creators.
Also, recently we’ve been working on a more generic visual pack we would like to sell on Resolume.
It’s interesting creating visuals that are not for your own use, because normally we create pretty specific visuals for certain parts of the show. Now we need to forget about that and create visuals that can be used by anyone in any situation. It’s good practice. I think we have come up with a pretty cool mix of modern-styled visuals and classic kaleidoscopic visuals for your enjoyment. :)


And, on a last note, we are working on a VR horror escape room game in between all the visual work. Got to keep those university skills going! :D
If you’re interested we’ll post something about it on our social media in the future.

*Shudders* Oooh this gave us chills.
Thanks for doing this interview Rick. We all look forward to those visual packs from you, and wish you so much success with Visual Lab.
With skills like that, you’re miles ahead already :)
Check out some more of Rick’s work here
Credits:
Rick and Bob, RJB Visuals + Visual Lab
Follow them on: Instagram & their website
Make Some Noisia
Dutch electronic powerhouse Noisia has been rocking the planet with their latest album ‘Outer Edges’.

Photo by Diana Gheorghiu
It was a wait. But one that was truly worth it. Essentially a concept album, they pushed the boundaries on this one by backing it up with a ‘concept tour’.
An audio-visual phenomenon with riveting content, perfect sync & melt-yo-face energy, the Outer Edges show is one that could not escape our dissection.
[fold][/fold]
We visited Rampage, one of the biggest Drum & Bass gigs in the world, & caught up with Roy Gerritsen (Boompje Studio) & Manuel Rodrigues (DeepRED.tv), on video and lighting duty respectively, to talk about the levels of excellence the Noisia crew has achieved with this concept show.
Here is a look at Diplodocus, a favorite amongst bass heads:
Video by Diana Gheorghiu
Thanks for doing this guys! Much appreciated.
What exactly is a concept show and how is preparation for it different from other shows?
When Noisia approached us they explained they wanted to combine the release of their next album “Outer Edges” with a synchronized audio visual performance. It had been 6 years since Noisia released a full album so you can imagine it was a big thing.
Together, we came up with a plan to lay the foundation for upcoming shows. We wanted to focus on developing a workflow and pipeline to create one balanced and synchronized experience.
Normally, all the different elements within a show (audio, light, visual, performance) focus on their own area. There is one general theme or concept and everything then comes together in the end - during rehearsals.
We really wanted to create a show where we could focus on the total picture, and develop a workflow where we could keep refining the show and push the concept across all the different elements in a quick and effective way, without overlooking the details.
What was the main goal you set out to achieve as you planned the Outer Edges show?
How long did it take to come together, from start to end?
We wanted to create a show where everything is 100% synchronized and highly adaptable, having one main control computer which connects to all elements within the show in a perfectly synchronized way. This setup gave us the ability to find a perfect balance and narrative between sound, performance, lights and visuals. Besides that, we wanted a modular and highly flexible show, making it easy and quick to adapt or add new content.
We started with the project in March 2016 and our premiere was at the Let It Roll festival in Prague (July 2016).
The show is designed in such a way that it has an “open-end”. We record every show and because of the open infrastructure we are constantly refining it on all fronts.

What are the different gadgets and software you use to achieve that perfect sync between audio/video & lighting?
Roy: Back in the day, my graduation project at the HKU was a VJ mix tool where I used the concept of “cue-based” triggering. Instead of the widely used timecode synchronization, where you premix all the content (the lights and the video tracks), we send a MIDI trigger for every beat and sound effect. This saves a lot of time in the content creation process.
The edit and mixdown of the visuals are basically done live on stage instead of in After Effects. This means we don't have to render out 8-minute video clips and can focus on only a couple of key visual loops per track. (Every track consists of about 5 clips which get triggered directly from Ableton Live using a custom MIDI track.) Inside Ableton we group a couple of extra designated clips so they all get triggered at the same time.
For every audio clip we sequence separate midi clips for the video and lighting, which get played perfectly in sync with the audio. These midi tracks then get sent to the VJ laptop and Manuel's lighting desk.
We understand you trigger clips in Resolume from Ableton Live using the extremely handy Max for Live patches?
Yes, we sequence a separate MIDI track for each audio track. We divided the audio track up into 5 different elements (beats, snares, melody, fx, etc.), which correspond to 5 video layers in Resolume.
When a note gets triggered, a Max for Live patch translates it to an OSC message and sends it off to the VJ laptop. The OSC messages get caught by a small tool we built in Derivative’s TouchDesigner. In essence this tool translates the incoming messages into OSC messages which Resolume understands - basically, operating Resolume automatically with the triggers received from Ableton.
This way of triggering video clips was the result of an experiment by Martin Boverhof and Sander Haakman during a performance at an art festival in Germany a couple of years ago. Only two variables are being used: triggering video files and adjusting the opacity of a clip. We were amazed how powerful these two variables are.
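To give a rough idea of what such a translation layer boils down to, here is a minimal Python sketch using the python-osc library. It is not the actual TouchDesigner tool: the incoming /trigger and /opacity addresses are made-up placeholders for whatever the Max for Live patch sends, and the outgoing addresses follow the /composition/layers/N/... pattern of recent Resolume Arena versions, so check the OSC address list of your own version.
[code]
# Minimal sketch of an Ableton -> Resolume trigger bridge (Python + python-osc).
# Assumptions: the Max for Live patch sends "/trigger <layer> <clip>" and
# "/opacity <layer> <value>" (hypothetical addresses), and Resolume Arena is
# listening for OSC on port 7000 with its /composition/layers/... addresses.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)  # the VJ laptop running Resolume

def on_trigger(address, layer, clip):
    # Variable 1: launch (connect) clip M on layer N
    resolume.send_message(f"/composition/layers/{int(layer)}/clips/{int(clip)}/connect", 1)

def on_opacity(address, layer, value):
    # Variable 2: drive the layer opacity, 0.0 - 1.0
    resolume.send_message(f"/composition/layers/{int(layer)}/video/opacity", float(value))

dispatcher = Dispatcher()
dispatcher.map("/trigger", on_trigger)
dispatcher.map("/opacity", on_opacity)

# Listen for the messages coming from the Max for Live patch
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
[/code]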



Regarding lighting, we understand the newer Chamsys boards have inbuilt support for MIDI/ timecode. What desk do you use?
Manuel: To drive the lighting in the Noisia - Outer Edges show I use a ChamSys lighting desk. It is a very open environment: you can send MIDI, MIDI Show Control, OSC, timecode (LTC & MTC), UDP, serial data and of course DMX & Art-Net to the desk.
ChamSys support is extremely good and the software version is 100% free. Compared to other lighting desk manufacturers, the hardware is much cheaper.
A lighting desk is still much more expensive than a MIDI controller.
It might look similar, as both have faders and buttons, but the difference is that a lighting desk has a brain.
You can store, recall and sequence states - something which is invaluable for a lighting designer and is now happening in videoland more and more.
I have been researching how to bridge the gap between Ableton Live and ChamSys for 8 years.
This research has led me to M2Q, short for Music-to-Cue, which acts as a bridge between Ableton Live and ChamSys. M2Q is a hardware solution designed together with Lorenzo Fattori, an Italian lighting designer and engineer. M2Q listens to MIDI messages sent from Ableton Live and converts them to ChamSys Remote Control messages, providing cue triggering and playback intensity control.
M2Q is a reliable, easy and fast lighting sync solution. It enables non-linear lighting sync.
When using Timecode it is impossible to loop within a song, do the chorus one more time or alter the playback speed on the fly. One is basically limited to pressing play.
Because our lighting sync system is MIDI based, the artist on stage has exactly the same freedom that Ableton audio playback offers.
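If you want to tinker with the same MIDI-to-cue idea in software, here is a minimal sketch built on the mido library. To be clear, this is not M2Q and it does not speak the real ChamSys Remote Control protocol; the desk address, MIDI port name and the outgoing message format are placeholders that only illustrate the note-to-cue mapping logic.
[code]
# Minimal MIDI-to-cue sketch (Python + mido). NOT the real M2Q: the desk
# address, port name and outgoing text format are placeholders; an actual
# bridge would speak the ChamSys Remote Control protocol instead.
import socket
import mido

DESK_ADDR = ("192.168.1.50", 6553)                        # assumed desk IP/port
NOTE_TO_CUE = {60: (1, 1.0), 61: (1, 2.0), 62: (2, 1.0)}  # MIDI note -> (playback, cue)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

with mido.open_input("Ableton Lighting Out") as port:     # assumed MIDI port name
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0 and msg.note in NOTE_TO_CUE:
            playback, cue = NOTE_TO_CUE[msg.note]
            sock.sendto(f"GO,{playback},{cue}".encode(), DESK_ADDR)
        elif msg.type == "control_change":
            # CC value doubles as playback intensity, scaled 0-127 -> 0-100%
            level = round(msg.value / 127 * 100)
            sock.sendto(f"LEVEL,{msg.control},{level}".encode(), DESK_ADDR)
[/code]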
Do you link it to Resolume?
ChamSys has a personality file (head file) for Resolume, and this enables driving Resolume as a media server from the lighting desk. I must confess that I have been considering switching to Resolume for some time now, as it is a very cost-effective and stable solution compared to other media server platforms.
Video by Diana Gheorghiu
Tell us about the trio’s super cool headgear. They change color, strobe, are dimmable. How?!
The LED suits are custom designed and built by Élodie Laurent; they are basically 3 generic LED parcans and have similar functionality.
They are connected to the lighting desk just as the rest of the lighting rig and are driven using the same system.
Fun fact: These are the only three lights we bring with us so the Outer Edges show is extremely tour-able.


The Noisia content is great in its quirkiness. Sometimes we see regular video clips, sometimes distorted human faces, sometimes exploding planets, mechanical animals - what’s the thought process behind the content you create? Is it track-specific?
The main concept behind this show is that every track has its own little world in the Outer Edges universe. Every track stands on its own and has a different focus on style and narrative.
Nik (one third of Noisia & art director) put together a team of 10 international motion graphic artists, and together we took on the visualization of the initial 34 audio tracks. Cover artwork, video clips and general themes from the audio tracks formed the base for most of the tracks.

Photo by Diana Gheorghiu

Photo by Diana Gheorghiu
The lighting & video sync is so on point, we can’t stop talking about it. It must have taken hours of studio time & programming?
That was the whole idea behind the setup.
Instead of synchronizing everything in the light and video tracks, we separated the synchronizing process from the design process, meaning that we sequence everything in Ableton and on the content side Resolume takes care of the rest. Updating a VJ clip is just a matter of dragging a new clip into Resolume.
This also resulted in Resolume being a big part of the design process (instead of only serving as a media server, as it normally would).
During the design process we run the Ableton set and see how clips get triggered; if we don't like something we can easily replace the video clip with a new one or adjust, for instance, the scaling inside Resolume.
Some tracks which included 3D rendered images took a bit longer, but there is one track, “Diplodocus”, which took 30 minutes to make from start to finish. It was originally meant as a placeholder, but after seeing it synchronized we liked the simplicity and boldness of it and decided to keep it in the show.
Here is some more madness that went down:
Video by Diana Gheorghiu
Is it challenging to adapt your concept show into different, extremely diverse festival setups? How do you output the video to LED setups that are not standard?
We mostly work with our rider setup consisting of a big LED screen in the back and LED banner in front of the booth, but in case of bigger festivals we can easily adjust the mapping setup inside Resolume.
In the case of Rampage we had the extra challenge of coming up with a solution to run 7 full HD outputs.

Photo by Diana Gheorghiu
Normally Nik controls everything from the stage and we have a direct video line to the LED processor. Since all the connections to the LED screens were located at the front of house, we used 2 laptops positioned there.
It was easy to adjust the Ableton Max for Live patch to send the triggers to two computers instead of one, and we wrote a small extra tool that sends all the MIDI controller data from the stage to the FOH (to make sure Nik was still able to operate everything from the stage).
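The tool itself isn't described in detail, but as a rough illustration, a stage-to-FOH MIDI forwarder can be as small as the Python sketch below, using the mido library and plain UDP. The controller port name and FOH addresses are hypothetical.
[code]
# Minimal stage-to-FOH MIDI forwarder sketch (Python + mido + UDP).
# Port name and FOH laptop addresses are hypothetical.
import socket
import mido

FOH_LAPTOPS = [("10.0.0.11", 9001), ("10.0.0.12", 9001)]  # both FOH machines
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

with mido.open_input("APC40 mkII") as controller:          # assumed controller port
    for msg in controller:
        data = bytes(msg.bytes())                           # raw MIDI bytes of the message
        for addr in FOH_LAPTOPS:
            sock.sendto(data, addr)                         # mirror to every FOH laptop

# On each FOH laptop a matching receiver would bind UDP port 9001 and replay the
# bytes into a virtual MIDI port for Resolume to pick up, e.g. (rtmidi backend):
#   out = mido.open_output("Stage Controller", virtual=True)
#   out.send(mido.Message.from_bytes(packet))
[/code]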
Talk to us about some features of Resolume that you think are handy and would advise people out there to explore.
Resolume was a big part of the design process in this show. Using it almost like a little After Effects, we stacked up effects until we reached our preferred end result. We triggered scaling, rotation, effects and opacity using the full OSC control Resolume offers. This makes it super easy to create spot-on synchronized shows with a minimal amount of pre-production.
This in combination with the really powerful mapping options makes it an ideal tool to build our shows on!
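To give a small taste of what that OSC control looks like from the outside, here is a hedged little sketch that pulses a layer's opacity on every beat. The address follows the /composition/layers/N/video/opacity pattern of recent Arena versions and assumes Resolume's default OSC input port, so double-check your own OSC preferences before running it.
[code]
# Hedged sketch: pulse a Resolume layer's opacity on every beat over OSC.
# Assumes Resolume's default OSC input port (7000) and the address pattern of
# recent Arena versions - check the OSC preferences of your own setup.
import time
from pythonosc.udp_client import SimpleUDPClient

resolume = SimpleUDPClient("127.0.0.1", 7000)
BPM = 174                        # typical drum & bass tempo
beat = 60.0 / BPM

for _ in range(64):              # run for 64 beats as a demo
    resolume.send_message("/composition/layers/3/video/opacity", 1.0)  # hit on the beat
    time.sleep(beat * 0.25)
    resolume.send_message("/composition/layers/3/video/opacity", 0.2)  # ease off between beats
    time.sleep(beat * 0.75)
[/code]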
What a great interview, Roy & Manuel.
Thanks for giving us a behind-the-scenes understanding of what it takes to run this epic show, day after day.
Noisia has been ruling the Drum & Bass circuit for a reason: thumping, fresh & original music along with a remarkable show - what else do we want?
Here is one last video for a group rage:
Video by Diana Gheorghiu
Rinseout.
Credits:
Photo credits, Noisia setup: Roy Gerritsen
Adhiraj, Refractor, for the on-point video edits.
Jammin' at the FOH with the A-Team
Bas Scheij & Angelo Isenia are popularly known as the A-Team.
Main men at the Front of House for Afrojack, they are widely considered amongst the best lighting & video team in the electronic music space.
Bas
Picture: RUDGR
Angelo

After closely following them live multiple times we have reached the conclusion that they have one mind.
We, here at Resolume, like to call them the Human Timecode.
[fold][/fold]
In this blogpost we will try to delve into the workings of this destructive duo & see if we can squeeze some secrets outta them. *wink*
Thanks for doing this guys!
Give us a small background of yourselves.
How did you guys get into this & at what point did you realize- “this is it”?
Bas: It all started for me 21 years ago. I worked as a stagehand for several Dutch bands.
From day one I was inspired by lights and soon I realized this was it for me. I have worked for rental companies as a light technician, project manager and operator. In 2009 I became a freelance operator and continued touring with bands, operating festivals, and joined The Art of Light for a while.
In January 2017, I started my new venture, BASZ design & live operating.
Angelo: For me it started when I was studying at HKU school of Arts around 10 years ago. My good friend and classmate at the time, Cheverno Leiwakabessy, introduced me to VJ’ing. We were geeking out a lot back in the days trying new stuff and learning new software.
I had that ‘it’ moment when we started VJ’ing for Eyesupply. We kept doing bigger and bigger shows. That feeling of standing in the front of house running a show is an adrenaline rush for me! From then on things developed pretty quickly. We were hired by Kevin, Carlo and Sander (Eyesupply/250K) to do what we love for a living.
I had my second ‘it’ moment last year when I decided to start freelancing and started my own company, Dvizion. I did this as a means of challenging myself to get better and cooperate with more people around me.
For how long have you both been working together? Was it love at first gig?
Bas: We have been working as a duo since the summer of 2013, when André Beekmans (The Art of Light) and Sander Reneman (250K) introduced me to the team to do advance programming for André, who was Afrojack’s main LD in those days. Later on, he asked me to succeed him as LD for Afrojack.
We soon found out that there was a great vibe and a synergy between Angelo, Afrojack and me, as a team.
Angelo: We recently talked about this. The years fly by so fast. It has been almost 4 years since we were brought together for the Afrojack shows. We had a really good vibe from the beginning. What is important is that we could always tell each other if we didn’t like something, which has only kept pushing us over the years.
Picture: Sunburn Festival
While on lighting & video duty, it is important to strike a balance. Not let one overpower the other. Do you agree? How do you keep that balance?
Bas: I totally agree. One of the main rules for us as a team is to keep the right balance between video and lights and let one not overpower the other. LED walls are very bright nowadays which makes it hard to keep the right balance in output between lights and video, live and for television. We mostly achieve the balance by controlling the brightness for video, a useful tool in Resolume.
Angelo: I totally agree, this is something we talk about constantly. We call it the 100%-limit theory, as in my video can go up to 100% and lights too. But, you never want to go over 100% together. During the show we tell each other “Hey, not 200%” so we instantly know we are pushing it too much. There is still much to learn but at least this way we try to balance it out. Also, what is really important is communication during the show.
What process do you follow during lighting design & content creation? And then for operation?
Bas: For festival shows we use the design which is made by the festival designer, and for solo shows, we use our rider design. For the operational part, we work very closely with Afrojack as he really knows what he wants for video and lights.
When we, Angelo and I, receive the songs he’s about to play, we start with analyzing each song, decide which colors we’re going to use and create solo moments for video and lights. Once this is done, Angelo and I practice a little & tell each other very specifically what to do and when. After this it’s show time! At the end of each show we try to find improvements we can implement in the next show.
Angelo: The visuals are made not only by me but also by Eyesupply. We worked on a moodboard consisting of mech robots, manga designs and technical overlays. Over the course of 5 years I remixed and edited a lot, adding more abstract content. It all comes down to running shows, sitting down to talk about it and going back to the drawing board to adjust where necessary.
Picture: MTV Crashes Plymouth
I remember a great conversation we had about “Music sensibility” and about how understanding rhythm & time is key. Can you tell us about this?
Bas: I think this is one of the characteristics you need to have as an LD to become a real professional. Afrojack decides on the flow, which songs he’s about to play and sometimes plays new tracks without telling us beforehand! “To keep them focused”, as he always says.
It’s important to feel the groove of the music to find out when the new track comes into the mix, etc. To react to a break or accent in the music, timing is key to making the difference. I think this is something you can’t learn - it's something you have or you don't. It is important to hit the button or fader at the right moment to give that extra dimension to the show.
Angelo: Having a sense of rhythm is key to operating, in my opinion. People know me for being too caught up trying to catch every beat and drop, it’s like a blessing and a curse :-) For me it’s really important to mix the video according to the music because everything we do is live.
Here is a great video of Bas & Angelo slaying programming @ the Main Stage, Ultra Miami 2016:
And, here is a video of the outcome:
Everything you do is freestyle. No SMPTE. No MIDI. No Timecode. Any tips for achieving such levels of tightness in transitions & color changes?
Bas: Timing is key. When both of you understand rhythm and timing, you’ll have the basis for tight transitions and color changes as a team. We always use intercoms during the show. This is very helpful for calling breaks and accents. Also, we spend a lot of time listening to the songs.
Angelo: Communication. We communicate constantly during the show, although we’ve been doing it for years so we can also sense what the other one will transition to. We also like to nerd-talk about it after the shows. We ask each other “what if we do this or that”. Then, it’s back to the hotel (or home), hook up my gear and try to incorporate those ideas into the show.
Picture: PHOTORIK @ Amsterdam Music Festival 2015
A metal-head and a Salsa lover, what are your thoughts on the direction electronic music is heading - visually?
Bas: Most of the time a DJ is performing on his own on very big stages in front of thousands of people, which means that you have to reinforce the show visually. Stages are getting bigger nowadays and the technology innovates constantly, so visually electronic shows will grow without limits.
Angelo: Haha, it’s been a long time since I danced! I must admit I never listen to EDM outside of work. I think visually there is much more to explore and improve upon. There are many more artists out there doing bigger and better shows than us, so for me it’s always a drive to get better and better. I really look up to those trying new stuff and pushing the limits - take the Eric Prydz show, for example. Since I started VJ’ing I have always wondered how we can make the show more interactive - between the Artist and the Audience. This is something I would like to explore in the future.
What are your go to weapons for destruction?
Bas: For Afrojack shows I prefer to work on GrandMA 2, because of its endless possibilities and ease of operation. Nowadays it’s easy to connect Resolume to the GrandMA, which helps us push boundaries to innovate the show.
Angelo: I almost read “mass destruction” ;) For me it’s Resolume Arena hands down. I have been using it since version 2 and really like the simplicity, while it still has really powerful features like the Advanced Output. We are really close with the Resolume team and it’s a pleasure working out new ideas with them. For crazy new ideas, Joris is my go-to guy at Resolume HQ. My MIDI controller of choice is the Akai APC40 mkII. It is the controller that comes closest to having all the bells and whistles I want. I would still like a better controller; maybe someday I can design my own :)
Picture: Amsterdam Music Festival 2016
Any advice you would like to give budding “FOH artists” out there?
Bas: Stay motivated to learn new techniques and stimulate creativity by fueling your passion.
Angelo: My advice is relax and enjoy it. It’s ok to make mistakes. I made a lot of mistakes in the past but in the end it all comes down to learning from that. Heck, I still make some mistakes haha. Also, always be friendly to your FOH buddies. In my opinion everyone in the FOH is equal so try to work together to make the best show possible.
Any last words before we let you continue being awesome?
Bas: I want to thank you guys, Shipra and Joris, for the interview. And, of course, Resolume for their support and collaboration, keep it up! Special thanks to Afrojack, 250K and The Art of Light.
Angelo: I really want to thank you guys for this fun interview, it made me think and reminisce about the past. Also, I would like to thank the Resolume guys for always being so cool with us and having a listening ear to our crazy ideas. I must also say that I wouldn't be standing here if it wasn't for Eyesupply and Afrojack. This only pushed me to perform better each show.
Ah. What a great interview this was. Straight from the heart and so much to learn.
Love your FOH buddies. Aw.
Show us some love now, guys. Come on.
Picture: Angelo's iPhone, Times Square, NYC
Thanks. V much.
Main men at the Front of House for Afrojack, they are widely considered amongst the best lighting & video team in the electronic music space.
Bas
Angelo
After closely following them live multiple times we have reached the conclusion that they have one mind.
We, here at Resolume, like to call them the Human Timecode.
[fold][/fold]
This blogpost we will try to delve into the workings of this destructive duo & let’s see if we can squeeze some secrets outta them. *wink
Thanks for doing this guys!
Give us a small background of yourselves.
How did you guys get into this & at what point did you realize- “this is it”?
Bas: It all started for me 21 years ago. I worked as a stagehand for several Dutch bands.
From day one I was inspired by lights and soon I realized this was it for me. I have worked for rental companies as a light technician, project manager and operator. In 2009 I became a freelance operator and continued touring with bands, operating festivals, joined The Art of Light for a while.
This January 2017, I started my new venture BASZ design & live operating
Angelo: For me it started when I was studying at HKU school of Arts around 10 years ago. My good friend and classmate at the time, Cheverno Leiwakabessy, introduced me to VJ’ing. We were geeking out a lot back in the days trying new stuff and learning new software.
I had that ‘it’ moment when we started VJ’ing for Eyesupply. We kept doing bigger and bigger shows. That feeling of standing in the front of house running a show is an adrenaline rush for me! From then on things developed pretty quickly. We were hired by Kevin, Carlo and Sander (Eyesupply/250K) to do what we love for a living.
I had my second ‘it’ moment last year when I decided to start freelancing and started my own company, Dvizion. I did this as a means of challenging myself to get better and cooperate with more people around me.
For how long have you both been working together? Was it love at first gig?
Bas: We are working as a duo since the summer of 2013, when André Beekmans (The Art of Light) and Sander Reneman (250K) introduced me to the team doing programming in advance for André, who was Afrojack’s main LD those days.Later on, he asked me to succeed him as LD for Afrojack.
We soon found out that there was a great vibe and a synergy between Angelo, Afrojack and I, as a team.
Angelo: We recently talked about this. The years fly by so fast. It has been almost 4 years since we were brought together for the Afrojack shows. We had a really good vibe from the beginning. What is important is that we could always tell each other if we didn’t like something which only kept pushing us these years.
While on lighting & video duty, it is important to strike a balance. Not let one overpower the other. Do you agree? How do you keep that balance?
Bas: I totally agree. One of the main rules for us as a team is to keep the right balance between video and lights and let one not overpower the other. LED walls are very bright nowadays which makes it hard to keep the right balance in output between lights and video, live and for television. We mostly achieve the balance by controlling the brightness for video, a useful tool in Resolume.
Angelo: I totally agree, this is something we talk about constantly. We call it the 100%-limit theory, as in my video can go up to 100% and lights too. But, you never want to go over 100% together. During the show we tell each other “Hey, not 200%” so we instantly know we are pushing it too much. There is still much to learn but at least this way we try to balance it out. Also, what is really important is communication during the show.
What process do you follow during lighting design & content creation? And then for operation?
Bas: For festival shows we use the design which is made by the festival designer, and for solo shows, we use our rider design. For the operational part, we work very closely with Afrojack as he really knows what he wants for video and lights.
When we, Angelo and I, receive the songs he’s about to play, we start with analyzing each song, decide which colors we’re going to use and create solo moments for video and lights. Once this is done, Angelo and I practice a little & tell each other very specifically what to do and when. After this it’s show time! At the end of each show we try to find improvements we can implement in the next show.
Angelo: The visuals are made not only by me but also by Eyesupply. We worked on a moodboard consisting of mech robots, manga designs and technical overlays. During the course of 5 years I remixed and edited a lot, adding more abstract content. It all comes down to running shows, sit down to talk about it and go back to the drawing board to adjust where necessary.
I remember a great conversation we had about “Music sensibility” and about how understanding rhythm & time is key. Can you tell us about this?
Bas: I think this is one of the characteristics you need to have as an LD to become a real professional. Afrojack decides on the flow, which songs he’s about to play and sometimes plays new tracks without telling us beforehand! “To keep them focused”, as he always says.
It’s important to feel the groove of the music to find out when the new track comes into the mix etc. To react on a break or accent in the music, timing is key to make the difference. I think this is something you can’t learn. It's something you have or not.It is important to hit the button or fader on the right moment to give the extra dimension to the show.
Angelo: Having a sense of rhythm is key to operating, in my opinion. People know me for being too caught up trying to catch every beat and drop, it’s like a blessing and a curse :-) For me it’s really important to mix the video according to the music because everything we do is live.
Here is a great video of Bas & Angelo slaying programming @ the Main Stage, Ultra Miami 2016:
And, here is a video of the outcome:
Everything you do is freestyle. No SMPTE. No MIDI. No Timecode. Any tips for achieving such levels of tightness in transitions & color changes?
Bas: Timing is key. When both understand rhythm and timing, you’ll have the base for tight transitions and color changes as a team. We always use intercoms during show. This is very helpful to call breaks and accents. Also, we spend a lot of time listening to the songs.
Angelo: Communication. We communicate constantly during the show. Although we’ve been doing it for years so we can also sense what the other one will transition to. We also like to nerd-talk about it after the shows. We ask each other “what if we do this or that”. Then, it’s back to the hotel (or home), hook up my gear and try to incorporate those ideas into the show.
A metal-head and a Salsa lover, what are your thoughts on the direction electronic music is heading - visually?
Bas: Most of the time a DJ is performing on his own on very big stages in front of thousands of people, which means that you have to reinforce the show visually. Stages are getting bigger nowadays and technique innovates constantly; so visually electronic shows will grow without limits.
Angelo: Haha it’s been a long time I danced! I must admit I never listen to EDM outside of work. I think visually there is much more to explore and improve upon. There are many more artists out there doing bigger and better shows than us so for me it’s always a drive to get better and better. I really look up to those trying new stuff and pushing the limits, take the Eric Prydz’ show, for example. Since I started VJ’ing I have always wondered how we can make the show more interactive - between the Artist and the Audience. This is something I would like to explore in the future.
What are your go to weapons for destruction?
Bas: For Afrojack shows I prefer to work on GrandMA 2, because of its endless possibilities and ease of operation. Nowadays it’s easy to connect Resolume to the GrandMA, which helps us push boundaries to innovate the show.
Angelo: I almost read “mass destruction” ;) For me it’s Resolume Arena hands down. I have been using it since version 2 and really like the simplicity, while having really powerful features like the Advanced Output. We are really close with the Resolume team and it’s a pleasure working out new ideas with them. For crazy new ideas is Joris my go-to guy at Resolume HQ. My MIDI controller of choice is the Akai APC40 mkII. This is the closest controller having all the bells and whistles I want. I still would like to have a better controller, maybe someday I can design my own :)
Any advice you would like to give budding “FOH artists” out there?
Bas: Stay motivated to learn new techniques, and stimulate your creativity by fueling your passion.
Angelo: My advice is relax and enjoy it. It’s ok to make mistakes. I made a lot of mistakes in the past but in the end it all comes down to learning from that. Heck, I still make some mistakes haha. Also, always be friendly to your FOH buddies. In my opinion everyone in the FOH is equal so try to work together to make the best show possible.
Any last words before we let you continue being awesome?
Bas: I want to thank you guys, Shipra and Joris, for the interview. And, of course, Resolume for their support and collaboration, keep it up! Special thanks to Afrojack, 250K and The Art of Light.
Angelo: I really want to thank you guys for this fun interview; it made me think and reminisce about the past. I would also like to thank the Resolume guys for always being so cool with us and lending a listening ear to our crazy ideas. I must also say that I wouldn’t be standing here if it wasn’t for Eyesupply and Afrojack. They have only pushed me to perform better with each show.
Ah. What a great interview this was. Straight from the heart and so much to learn.
Love your FOH buddies. Aw.
Show us some love now, guys. Come on.
Thanks. V much.
Taking the World by Storm (Part 2)
Hello all you video junkies. This one's just for you.
It took a while to digest the awesomeness, but Part 2 of "Taking the world by Storm" is here.
So, quick recap?
Gig: Storm Festival, 2016, Shanghai
Epic stage:

Video: 400 square meters of LED, 7.9 mm pitch, 10 processors.
[fold][/fold]
Brandon Chaung, the local VJ on site, talked us through the whole process. And it is intense.
So, sit back... relax... a Storm is brewing.
What computers did you use for the show?
I used both PC and Mac.
I like the PC because it is powerful and easy to upgrade, especially the graphics card (MXM 3.0b) and storage, which are both essential for running Resolume.
I replaced my optical drive with an SSD for the new custom footage for the show (I have more than 40 decks in my composition).
I like the Mac because of the great onboard audio quality. It’s also less of a hassle for audio playback and MIDI mapping when using Resolume with other applications at the same time.
I switched between two laptops with a Barco Encore switcher.
It required four HD outputs to cover all the LED panels we had.
Both the circle screen and the one behind the DJ booth are split across two outputs, which makes it important to have synchronized outputs.

The 4kTwo display controller provided by Flux studio did a great job.
It is a Chinese brand with features similar to, but fewer than, the Datapath X4 that other VJs brought to the show.
Talk to us about Resolume, maybe you have heard of it? *grins*
Resolume Arena is my first choice for media server.
It runs great on both operating systems. I get exactly the same experience while VJing, no matter if I’m on a PC or a Mac, and I can switch between the two without thinking.
Another great tool that is worth mentioning is Chaser by Joris. Woo hoo!
I find it super useful when I use it for switching between different mapping settings and even masking.
Some VJs use Madmapper or mapio to switch mapping. I prefer doing this with Resolume.
Normally, I apply two Chaser FFGL plugins on each layer. One is for switching between different output mappings.
So, after some setup in the Advanced Output and Chaser, during the show I can just pick the footage I want, set the screen I want it to show on through Chaser (using steps) and boom, it’s on!
And I can mix different layers with different mappings without losing blend modes.
Also, what I see in the preview window can be very close to what I get on the actual screen.
A second Chaser plugin, applied in the effects chain (sometimes I don’t need it, and maybe in the future it will support polygons other than triangles), masks out the unwanted parts of the layer that show up on screens I don’t want them on.
For me it’s better than applying crops and adjusting XY positions on different layers, because I can just reuse the slices from the Advanced Output.
This technique is very useful for the circle screen in this show.
Can you give us details about how the LED was mapped?
I did the mapping by starting with numbers.
Counting the pixels, modules, actual width and length.
Then, it became like a math exam in high school or a puzzle to solve. The goal was to make the best use of every pixel of every HD output.
Try to find the most efficient combination for each slice and, at the same time, think about how to run the Cat 5 cable through every module with as few cables as possible.


This is the front view of all the LEDs. Below the name of each slice is the number of modules, followed by the pixels in width and height. The circle screen is cut into 8 slices across two HD outputs. The main screen is spread across two HD outputs.
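To get a feel for the kind of puzzle Brandon is describing, here is a toy Python sketch that checks whether a set of slices fits inside a single 1920x1080 output using a simple shelf-packing pass. The slice names and sizes are invented for illustration; they are not the actual Storm pixel map.

# Toy check: do these slices fit in a single 1920x1080 output?
OUT_W, OUT_H = 1920, 1080

slices = [
    ("circle_top", 480, 540),
    ("circle_bottom", 480, 540),
    ("main_left", 960, 540),
    ("imag_right", 640, 360),
]

def shelf_pack(slices, out_w, out_h):
    x = y = shelf_h = 0
    placed = []
    for name, w, h in sorted(slices, key=lambda s: -s[2]):  # tallest first
        if x + w > out_w:                 # row is full, start a new shelf
            x, y, shelf_h = 0, y + shelf_h, 0
        if w > out_w or y + h > out_h:
            return None                   # this output can't hold everything
        placed.append((name, x, y))
        x += w
        shelf_h = max(shelf_h, h)
    return placed

layout = shelf_pack(slices, OUT_W, OUT_H)
print("fits in one output:" if layout else "needs another output", layout)

The real job is messier than this, of course, since cable runs and module boundaries matter too, but the core of it is exactly this kind of pixel-budget arithmetic.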
Next, we come to the pixel maps for the four outputs.




I actually quite enjoy this process.
The Advanced Output of Resolume Arena is pretty handy when solving the puzzle. The fixed snapping across screens in Arena 5.1 saved me a lot of time.
Then it’s time to match the Advanced Output with the pixel map (thanks to the new feature for importing a .png into the Advanced Output). After adding a few masks and adjusting everything to fit the 4K output, the basic setup is pretty much done.
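That .png import is handy because the reference pixel map can be generated programmatically. Purely as an illustration (the slice coordinates below are placeholders, not the real Storm layout), here is a small Python/Pillow sketch that renders a labelled pixel map image you could load into the Advanced Output as a guide.

from PIL import Image, ImageDraw

# Placeholder slice rectangles (x0, y0, x1, y1); not the actual Storm layout.
slices = {
    "circle_top": (0, 0, 480, 540),
    "circle_bottom": (480, 0, 960, 540),
    "main_left": (960, 0, 1920, 540),
    "imag_right": (0, 540, 640, 900),
}

img = Image.new("RGB", (1920, 1080), "black")
draw = ImageDraw.Draw(img)
for name, (x0, y0, x1, y1) in slices.items():
    draw.rectangle([x0, y0, x1 - 1, y1 - 1], outline="white")
    draw.text((x0 + 8, y0 + 8), f"{name} {x1 - x0}x{y1 - y0}", fill="white")
img.save("output_1_pixelmap.png")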
The Output side looks like this:

The Input side looks like this:

Now comes the most interesting part, Chaser.
I added another virtual screen at the bottom just for Chaser slices.
These slices are just for Chaser programming; they don’t actually output anything.
From these slices you can see it’s all in a 1920 x 1080 ratio, except the center triangle used for custom footage. This also shows how I scale and position the footage (most of my footage is in HD).
This is one of the mappings where I want the HD footage focused only on the circle screen. Notice that it also covers the IMAG screen; this is what gets masked out by the second Chaser plugin.
Then, I create another sequence to pick the screen I want to preserve, so it functions like a mask. Here, I picked the circle. Note that both sequences have the same number of steps.
In this picture you can see the result in the preview window. I put another layer of lines in a different mapping, set the opacity to half and used difference as the blend mode, so you can see the blend modes still work like a charm.
Then, I assign the step parameters of both Chaser plugins to one fader or knob on my MIDI controller, so I can switch really fast.
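Because both sequences have the same number of steps, a single controller value resolves to the same step in the mapping sequence and in the mask sequence. The sketch below illustrates that arithmetic with Python and mido; the CC number and step count are guesses, and in practice this is all done with Resolume’s own MIDI mapping rather than an external script.

import mido

NUM_STEPS = 8      # assumed step count, the same in both Chaser sequences
FADER_CC = 48      # hypothetical CC number of the fader on the controller

def cc_to_step(value, num_steps=NUM_STEPS):
    # Map a 0-127 controller value onto a step index shared by both sequences.
    return min(num_steps - 1, value * num_steps // 128)

with mido.open_input() as port:    # first available MIDI input port
    for msg in port:
        if msg.type == "control_change" and msg.control == FADER_CC:
            step = cc_to_step(msg.value)
            print(f"mapping sequence -> step {step}, mask sequence -> step {step}")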
This is how I arranged the mapping for this show.
Of course, I still use Chaser to create bumps like it was designed for.
In my mind, I feel there must be many other creative ways in Resolume to fulfill my imagination - about how my visuals should look, how I can respond to the music the moment I hear it, or, when the screens and cues get complicated, how to do it in a simple way.
I’m glad that, so far, Resolume has never let me down.
*Blushes* Thanks for your great words, Brandon!
Quick question- a lot of VJs have been complaining about Macs overheating. Was this a problem during the show?
Not on the 1st day because it was cloudy.
But on the 2nd day, right before the show, I found my MacBook lagging and it was exposed to direct sunlight.
After a reboot and change of position, it came back to normal. Other than this, it was all good during this show.
I think it is not only a problem for the MacBook; my PC has it too, it just reacts differently.
The overheating can cause Resolume to crash on my Windows laptop.
So, extra fans for both the PC and the MacBook are a must-have for most of my outdoor events.
Finally, here is a list of equipment that was used during the show:
MSI GT72-2QE Laptop with-
CPU: Intel Core i7 4980HQ @ 2.8GHz
RAM: 32GB DDR3L
Graphics: NVIDIA GTX 980M 8GB GDDR5
Storage: MSI Super RAID 4x 128GB SSD, 512GB Samsung EVO SSD
Akai APC40 mkII
Magewell HDMI USB3.0 capture device
4kTwo display controller x 2
Windows 10
Resolume Arena 5.1.1
Apple Macbook Pro Retina (Mid-2012)
CPU: 2.7Ghz Intel Core i7
RAM: 16GB 1600MHz DDR3
Graphics: NVIDIA GeForce GT 650M 1GB VRAM
Storage: Apple SSD SM512E
OS X 10.11.6 El Capitan
Resolume Arena 5.0.2
With this, we come to an end to this two-part extensive coverage of Storm Festival, Shanghai.
It feels great to see the new features we develop put to use, in multiple different ways. Sometimes, even in ways we didn't fathom while developing them :)
Thank you to 250K and the entire crew for doing such a great job at the festival and then educating us about it, in these interviews.
Until we see you again- go try Chaser like Brandon explained. Go on now, get moving.
250K- Taking the World by Storm
Massive rigs.
Immersive content.
Path-breaking stage productions.
What a great time to be alive!
We certainly think so, and our quest for “a big production to dissect” landed us in the eye of the Storm @ Shanghai.

[fold][/fold]
On design and production duty for Storm festival 2016 were super imaginative creative specialists- 250K.
They have been slowly and steadily taking over the world, one stage at a time.
After epic shows like The Flying Dutch, Ground Zero and Armin Van Buuren’s tours, Storm Festival 2016 was 250K's most recent conquest.
We got Dennis de Klein to take a break from basking in the glory of a great show (naww!) to talk us through the setup and tech specifics.
From starting out as a stage design intern with 250K to becoming Creative Project Manager six years later, Dennis has come a long way.
For Storm Festival, he managed the project from start to end, working in close association with the designers and the Creative Director of 250K- Sander Reneman.
Dennis’ most important responsibility was to ensure the original design was brought to life in the best way possible.

He did a good job right?
So, the Set was 60 meters wide x 20 meters deep x 36 meters high. Whew!
Productions of this scale need some detailed and on-point planning.
They probably worked on the design for months, right?
The development of the set-design, from initial idea to a final 3D drawing, took the 250k team one month.

Then, the 3D drawing was translated into technical drawings and detailed decor plans- to be able to create the set design as efficiently and as close to the original design as possible.

The load-in lasted for around two weeks, and the load-out was finalized in about five days.
We, here at Resolume, love the stage concept! Can you talk us through what the stage depicts?
Storm Festival, a concept created by A2Live, tells the story of the Actaurians, travelers from outer space who have come to Earth to find like-minded people to live and collaborate with.
This year, the story focuses on ‘The Impact’, the first contact between Actaurians and humans.

The (3D) logo of Storm Festival is actually depicted as the mothership of the Actaurians in the artwork and trailers.
To reenact ‘The Impact’ within the set design, the mothership has landed in it, making the connection with the Earth.
It is a representation of both worlds colliding into one merged structure, where the logo and organic shapes represent the Actaurians, and the solid stage platform expresses humanity.
How do you, as designers, incorporate a balance between set fabrication, LED and lighting?
As designers, we focus on finding the perfect mix between representing the brand identity of the promoter/event and the artists’ technical rider requirements.
Keeping this in mind, we are constantly looking to take the set design to the next level, to create something that isn’t already out there and to challenge all disciplines.
For example, if we design a specific set of video panels, it needs to be positioned in a logical location, needs to be functional for décor visual content and artist visual content and needs to blend in with the look & feel of the set design.
It is also about contrast, where video, lighting & decor feel balanced when looking at the set design. We closely collaborate with lighting designers, video operators and the decor fabrication company.
The lighting inventory seems massive!
For this set design, we collaborated closely with Daniel Richardson, the lighting designer and operator for the past editions of Storm Festival.

For this set design, he created a lighting design that incorporates over:
140 beams
225 quad LED bars
60 spots &
60 blinders, just to name a few.
Fun fact: all the lighting on the stage is hanging (except for the fixtures on the deck).
In terms of set fabrication, what material did you use to create the set and the mammoth logo in the center?
Structurally, the logo is created using truss; it is a geometrical shape that can be recreated with truss sections and corners.
The facades of the pyramid are supported by a custom welded steel frame, to which the wooden (grey) panels are connected.
The wooden panels are painted with two types of finishes, to give them that look & feel and to add the sharp edge that accentuates the logo.
The inside of the pyramid is covered with semi-transparent white fabric, to transform it into a giant lightbox.
The pyramid’s truss and frame are attached to a large scaffolding wall that is part of the whole set design.

Let’s talk video...
For the whole set design, we used around 400 square meters of 7.9 mm LED. The LED tiles were split across 10 processors, of which 4 controlled the central circle screen. Most of the lower LED screens are stacked on a deck, and the top circle is supported by the back scaffolding.
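For a rough sense of scale, here is a quick back-of-the-envelope calculation of what 400 square meters at a 7.9 mm pitch means in pixels. It is an approximation only, since real panels come in fixed module sizes, but it squares nicely with the four HD outputs Brandon mentions in Part 2.

# Back-of-the-envelope pixel count for 400 m2 of 7.9 mm pitch LED.
pitch_mm = 7.9
area_m2 = 400

pixels_per_meter = 1000 / pitch_mm        # ~126.6 pixels per running meter
pixels_per_m2 = pixels_per_meter ** 2     # ~16,000 pixels per square meter
total_pixels = area_m2 * pixels_per_m2    # ~6.4 million pixels

hd_frames = total_pixels / (1920 * 1080)
print(f"{total_pixels / 1e6:.1f} Mpx, roughly {hd_frames:.1f} full HD frames")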

What challenges did you face producing this Stage in China and how did you overcome them?
There were two main challenges that had to be overcome:
First, a language barrier.
Most of the crew would only speak Chinese, so it was difficult to get a message across. It can be quite hard to translate the technical terms I am used to in Dutch into English and then into Chinese. For this, sketching, gestures and the 3D model we had created were of great use.
Second, the overall approach is different.
I’m not saying it is good or bad, just different from what we are used to in a production in Europe or the USA.
The level of customizing the set design and adding details on site, instead of off site, was a lot higher.
There was a strong focus on bringing the detail of the original 3D render into reality. In addition, the materials used were different.
The basis is still scaffolding and trussing, but the measurements were different than what we’re used to.
It is not a difficulty, but it is something that has to be taken into account when designing a set for a different market or region.

With this, we come to the end of Part 1 of our coverage of Storm Festival.
Thanks for talking to us Dennis. Kudos on a great show!
In Part 2, we will plunge into the video details of Storm festival 2016- so get your geek on.
Credits:
Gil Wadsworth and the whole team of A2Live for the opportunity to create 250k's first set design in China;
Daniel Richardson for his great lighting design, for operating the lights during the show and for assisting as a translator between English and Chinese;
Atilla Meijs from Corrino for introducing 250K to A2Live
You can also visit 250K & Storm Festival
Photo Credits: Dennis de Klein & Storm Festival
Mad About Madeon
Madeon is a French electronic producer who uses gadgets and technology like they’re an extension of his very being.
With an on-stage setup that baffles even the best in the business, this 22-year-old producer has gotten to where he is because of his focus on the audio-visual aspects of a performance as a single unit.
His stage setup should be trademarked. It’s a diamond with arrow-like shapes on either side.
All made of LED.

Geometric.
Symmetric.
Minimalist.
We, here at Resolume, couldn’t pass on the chance of understanding his rig and how he perfectly triggers his visuals to the music, live.
Thanks very much for speaking to us Hugo!
[fold][/fold]
First things first, the answer many have been curious to know, can you explain your live setup to us? All the gadgets you use and their purpose?
The show is run on two laptops which are on stage with me.
One runs the audio side of things in Ableton and sends MIDI through ethernet in real time to a second, dedicated video laptop running Resolume.
I have two Novation Launchpads to play musical parts and modify existing stems, one Novation Launch Control XL to handle some additional FX and general controls (including tempo), and a Korg SV-1 keyboard.
There is also a Xone K2 plugged into Resolume to control some video effects.

You do a great job of syncing your visuals to the music. Can you explain to us how you do this with Resolume?
All of the audio clips and parts in Ableton are grouped with matching MIDI clips that trigger videos and effects in Resolume.
Everything is done in real time, which is really useful as it means I can easily edit the video show between shows by simply changing the MIDI score.
It also means that I can improvise, extend or shorten a section, with the knowledge that the video show will keep up.
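In other words, the notes in the Ableton MIDI clips are MIDI-mapped to clip triggers on the Resolume side. As a purely illustrative sketch (the port name and note numbers below are hypothetical, and in the real rig the MIDI comes from Ableton over the network rather than a script), this is roughly what one of those trigger streams looks like in Python with mido.

import time
import mido

# Hypothetical port name; in Hugo's rig the MIDI comes from Ableton
# over ethernet, not from a script like this.
port = mido.open_output("To Resolume")

# Each note is assumed to be MIDI-mapped to a clip in Resolume's preferences.
cue_notes = [36, 37, 38]   # e.g. intro, drop and breakdown visuals
for note in cue_notes:
    port.send(mido.Message("note_on", note=note, velocity=127))
    time.sleep(2.0)        # hold this visual for a couple of seconds
    port.send(mido.Message("note_off", note=note))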
We have noticed some LED strips being used in your setups. Do you control DMX fixtures with Resolume as well?
No, we haven't explored this yet but I'm looking forward to it! At the moment, all of the fixtures are triggered manually (no timecode, shoutout to Alex Cerio!)
We really like the pixel mapping of the buttons on your Launchpads. Tell us about this.
This is a simple MIDI score sent to the Launchpad to animate it. Novation kindly built custom Launchpads for me with unique firmware features enabling me to switch between this type of "animation" mode and a regular functioning mode seamlessly.

Audio-visual unity is so important to you - sometimes the content even looks like the Launchpad. It’s gotta be intentional?
Absolutely! For the 2012 shows, there were sections where the screen matched the Launchpad completely. There were also pixel-style grid animations that ran completely in real time, with 64 layers in Resolume, one for each of the 64 pads, and each pad corresponding to a different MIDI note. Very fun to program!
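Both tricks boil down to note messages: a MIDI score lights the pads, and the same grid of notes can drive 64 Resolume layers. Here is a hedged Python/mido sketch that sweeps a lit row down an 8x8 grid; the port name, note layout and color velocity are assumptions, and Hugo’s custom-firmware Launchpads will differ.

import time
import mido

out = mido.open_output("Launchpad")   # hypothetical port name

def pad_note(row, col):
    # Assumed grid layout: many Launchpad models address pads as row*16 + col;
    # other models (and custom firmware) use different numbering.
    return row * 16 + col

# Sweep a lit row down the 8x8 grid; velocity doubles as a color index.
for row in range(8):
    for col in range(8):
        out.send(mido.Message("note_on", note=pad_note(row, col), velocity=60))
    time.sleep(0.15)
    for col in range(8):
        out.send(mido.Message("note_off", note=pad_note(row, col)))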

What thought process do you go through while creating visuals in your studio? What software do you use? How long does it take for you to prepare a render/ clip?
I work with a number of companies on making the content for the show but I make the content for about a third of the show.
I mostly use After Effects. I'm not very familiar with 3D software, so I make 3D animations in AE polygon by polygon, which is quite excruciating!
I like to keep making content on tour as new ideas occur to me, it's always a work in progress.

Give us a rundown of your studio equipment. What is an absolute must-have? What would you like to change/upgrade?
A great computer has to be the most indispensable gear.
Whenever I upgrade, my production style always seems to adapt to use more plugins until I reach the limit again, it's constant frustration!
A zero-latency, unlimited-resources dream computer would be the best imaginable upgrade.
Why did you pick Resolume over the other software available out there?
Resolume reminded me a lot of the audio software I was already familiar with.
It's intuitive and powerful, the effects are extremely usable and the latest updates in Arena 5 added mapping options that enabled my latest "diamond/chevron" LED setup.

With this, we come to the end of this interview.
Thanks so much for taking the time to do this, Hugo. We are all very grateful.
Our hunger for technology and the things you can do with it has been duly satiated. For now.
Time to go try all of this out now, eh? :)
You can check out Madeon's work here:
Photo Cred: Charles Edouard Dangelser