
Artist Profile: Richard De Souza // Manoeuvre

Say what you will about Facebook, it's an amazing medium to get in touch with people. You can get the skinny on what's happening on the other side of the world, and see things you would otherwise never know about. Case in point: Richard De Souza. I got connected to Richard about a year ago, and since then my newsfeed has been filling up with one jaw-dropping stage design after another. Seems like the Australian continent, besides spiders, is hosting buckets of raw talent as well.

Who are you and how do you spend your day?

I am the director of MNVR (Manoeuvre), an Australian-based new media company primarily focused on concert visuals and interactive work. I'm always working on something. If I'm not in production mode, I'm either VJing or developing custom VJ tools. I love tinkering and I spend most of my free time progressing my art and process.

01_Neon.jpg
[fold][/fold]

How and when did you get started in this whole VJ circus?

I’ve had a love of music for as long as I can remember, though my strengths have always been with the visual arts. In 1999 The Big Day Out, a national touring festival here in Australia, put out a callout for video artists. A friend and I threw a reel together and were selected to do visuals for the Big Day Out’s Boiler Room. That summer we were VJing in front of thousands of people for what was then Australia’s largest festival - I was hooked. It was all Betacam and VHS back then, cueing tapes and mixing cams. VJing at The Big Day Out became my summer hobby for many years and I would spend all my spare time working on show content. Live visuals in the festival dance scene were new here. There were no rules and few points of reference, so I felt free to do what I liked. It was hugely exciting being there so early in the Australian scene, and it has given me the opportunity to operate visuals for the likes of Aphex Twin, Justice, Avicii and The Bloody Beetroots, to name a few, at various points in their careers.

The industry has changed so much from the early days. Visuals have become a respected and essential part of live performance. As more electronic music festivals took off I made the leap from the architectural industry to full time professional VJ, 3D artist and content producer.




You do a lot of amazing custom tour content. What's your process for coming up with the overall identity of the show and the look and feel of each song individually?

It's always about representing the artist to the best of my ability - put the show first - it sounds simple but many people get that wrong. Vision is there to strengthen or amplify the experience of the music; it should never detract or take over. My process varies on a show-to-show basis. Sometimes minimal visuals and different techniques should be employed but I decide this by getting to know the artist.

The best shows are either a great pairing of musician with like-minded visual artist, or an environment that allows symbiosis. It’s really just about gauging the intent of the artist and their music, taking the time to listen and study their person, catalogue, performances and media. If you have spent some time with an artist, it's pretty easy to translate their personality and music visually. Music is highly emotive anyway; it already carries a ton of thematic content to draw from.

Riser-Graveyard-A-Export_00033-1.jpg
When dealing with single tracks I always consider the hero shot. This is generally a compositionally strong or thematic element that identifies the peak moments of a track. By placing these hero moments throughout the set you can avoid motion graphics run-on, where an hour of mograph blends into a single moment. The idea is that the audience will identify or recall key imagery attached to a particular song, strengthening the emotional impact of the show. If you orchestrate these peaks and troughs you avoid a visually flat show.

You also do incredible stage designs. What's your stage design process like? Do you have ideas for specific visuals and design a stage around that? Do you also design the light plan and overall stage deco?

I've done a fair bit of concept design over the years. With my background in architectural visualisation and animation I can flesh out the initial concept to a point where I can sell an idea pretty easily. I’ve also been fortunate enough to work on some big stages at festivals like Stereosonic and Future Music Festival so I have an understanding of what works and what doesn’t.

I tend to think in 3D regardless. Much of my VJ content is 3D animation so stage design for me is just another structural element in a larger meta 3D composition.



I work alongside some very talented lighting designers and production crew that flesh out my initial concept designs. In a larger team environment it’s a matter of letting everyone play to his or her strengths. When you have skilled people around you and everyone has input and ownership of the creative process, that’s the environment in which you get the best results.

You've got a lot of nifty TouchDesigner responsive stuff going on. Especially 'California' on the Peking Duk tour intrigues me to no end. Can you elaborate a bit on that?

The idea behind the Peking Duk show is that it's a post-EDM punk rock show. Everything in the show follows that simple design ethos - from visual design to playback and performance. Visually it’s supposed to look like a live comic book mashup of punk t-shirt illustrations and rock poster art. Instead of using the usual whiteouts and strobes, it has a bunch of animated visual stabs and toon effects that sync up with the different visual tracks. I really wanted it to be a departure from the generic stock mograph look that many shows have.

Peking Duk (Adam Hyde & Reuben Styles) are very active on stage and perform with a rotating cast of collaborators, guest vocalists and musicians. The show is never the same twice but is always a hell of a party so the visuals needed to reflect that.

Peking-Duk-Melbourne-Forum-10.jpg

The show is split up into key tracks that Peking Duk are likely to play for which I have accompanying visuals, and then freeform sections that I adapt to on the fly.

California was initially developed in TouchDesigner to carry the freeform sections of the show, but it is now used throughout. Basically I'm using TouchDesigner to do audio analysis live, then I render a trigger-based texture palette of sprite animations. They are very simple: solid pulses, gradient ramps and animated noise that respond to audio. I send these rendered textures over to Resolume via Spout and position them behind alpha-channelled layered animations using Visution Mapio on a layer and column basis. I then launch column compositions instead of running a solely clip-based show.
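As a rough illustration of the trigger logic behind audio-reactive stabs like these (this is not Richard's actual TouchDesigner network; the class name, threshold and cooldown values are all made up for the sketch), an audio level crossing a threshold on a rising edge can fire a visual hit like this:

```python
# Minimal sketch of threshold-based audio triggering, similar in spirit
# to what an audio-analysis network in TouchDesigner might do.
# All names and values here are illustrative.

class StabTrigger:
    """Fires a visual 'stab' when the audio level crosses a threshold
    on a rising edge, with a cooldown so one peak fires only once."""

    def __init__(self, threshold=0.6, cooldown_frames=2):
        self.threshold = threshold
        self.cooldown_frames = cooldown_frames
        self._cooldown = 0
        self._was_below = True

    def update(self, level):
        """level: current audio amplitude, 0.0-1.0. Returns True on trigger."""
        if self._cooldown > 0:
            self._cooldown -= 1
        fired = False
        if level >= self.threshold and self._was_below and self._cooldown == 0:
            fired = True
            self._cooldown = self.cooldown_frames
        self._was_below = level < self.threshold
        return fired

trig = StabTrigger(threshold=0.6, cooldown_frames=2)
levels = [0.1, 0.2, 0.9, 0.95, 0.3, 0.1, 0.8]   # per-frame audio levels
hits = [trig.update(l) for l in levels]
print(hits)  # [False, False, True, False, False, False, True]
```

The rising-edge check is what keeps a sustained loud passage from re-firing the stab every frame; only a fresh peak triggers it.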

The generative element gives the show added life. Having an automated element sitting behind what I’m doing live broadens the scope of the show and creates a much tighter visual experience overall.

MNVR_TV_PEKING_DUK_SOCAL.gif

VJ'ing at a show like Godskitchen must be very different from regular VJ work. How does it differ and how do you keep a show like that fresh for a full night?

Designing for a show like Godskitchen you have multiple show arcs. Not only do you have multiple artist set arcs, but you also have a larger show arc to take into account. It’s important to consider how a night builds, with the introduction of different elements as the night progresses, from the video rig, to lighting, pyro and lasers.

I usually have some thematic show content woven into the video design that is either audio reactive, real-time or real-time 3D. These framing elements are usually persistent throughout the whole show and are treated like a video light rig rather than just a screen. These design elements tie into the other show elements like artist intros to create a larger show story.



What has your overall software and hardware journey been like? Do you tend to stick to a particular set of tools or do you switch around a lot? You also design a lot of your own software, right?

I started in 3D computer graphics. By the late ‘90s I was working on SGI machines doing real-time 3D interactive environments in VRML. It was definitely a revelation - real-time 3D, graphics acceleration, interactive and web deployment but there were many things you just couldn’t do because the hardware and software weren’t up to it.

My interests haven’t really changed but hardware and technology have caught up. The only limitations now are time, resources and budget.

I jumped on laptop VJing as soon as I could. Originally I was pushing out 3 layers of 320x240 on Resolume 2, which seems ridiculous now, but given the tape-to-tape alternative, at the time it was revolutionary. Resolume is still my core go-to performance application.

I’m a bit of a control scheme nut. I have a somewhat ridiculous midi controller collection and I'm always working on new control schemes. My TouchDesigner work creating custom performance interfaces is just an extension of that. I’ve always wondered what touch screens or motion controls like Leap Motion would be like in live situations so I created my own custom tools to integrate these elements into shows. Having my own tools removes many technical limitations and allows me to focus on the creative. For me creative is always at the forefront but sometimes there are technical hurdles to navigate first.

Gods-Kitchen-Melbourne-Nathan-Doran-Photography.jpg


How does Resolume fit into all this?

Resolume has been hugely influential as it’s the first real-time compositional environment I used. I’ve always used it like a real-time version of Photoshop or After Effects. The immediacy of feedback that Resolume gives is essential for my creative process. It has influenced many of the software and workflow directions I’ve taken over the years.

I actually run through compositional design sketches in Resolume, creating collages of ideas and forms from video elements and effects that I then round-trip back through the preproduction process to create final artwork.

Even when working in TouchDesigner I pass the output through Resolume via Spout. Resolume’s video playback, effects stack, layer routing, and mapping are just so extremely powerful and flexible I can’t imagine creating a show without those tools.

Who is currently blowing your mind in your field?

I have always loved UVA (United Visual Artists); their work for Massive Attack, in particular the 100th Window tour, was inspirational and influenced me to follow live visuals as a career path. Their blending of sculptural architecture, vision as lighting and considered space is fantastic. To this day they still do beautiful, political and thoughtful work.

Where can we find you online?

I can be found online at http://www.manoeuvre.tv

On Tour with Zedd: Gabe Damast

Working for Resolume, we're lucky enough to see some of the most amazing VJ talent in action. One such person is Gabe Damast, whose live show for Zedd blew me away. Gabe is a true VJ and seldom do we see a show this tight and in sync with the music. And most amazing of all, it's pure VJ skill, no SMPTE or other tricks.

Take a look at the video for an idea of how Gabe rocks it, and then read on below for what he has to say about all this.



[fold][/fold]

How did you start VJ'ing?

My introduction to the world of VJing came through music. I grew up in the San Francisco Bay Area playing saxophone and piano in a couple different Jazz and Funk bands, and as my love for electronic music developed I got into beat making, record producing, and sound engineering. I spent years learning basically every major production software and set up a small studio in my parents' basement where I'd record myself and my musician friends goofing off, and sometimes the sessions would turn into actual songs.

At the end of college, a friend of mine showed me Resolume, which was really the first time I was exposed to any visual performance software. I remember a lot of things clicked for me all at once. Coming from a background using Ableton Live and FL Studio, Resolume felt like a very user-friendly video version of the DAWs I was familiar with. It wasn't long before I got ahold of a projector and started working on my first VJ sets in my tiny dark bedroom late at night. At first I would use found footage and VJ clips from Vimeo, but I eventually got into Cinema 4D and After Effects and started making my own video content, some of which is being used in the Zedd show currently!

productionclub-zedd-truecolorstour-worldwide-2015-20.jpg

Can you tell us a bit more about the Zedd tour? How does such a tour get organised when it comes to the stage design, the content, the operating of video, lights and laser? Who does what?

True Colors - the latest arena tour we did with Zedd - all started more than two years ago with scribbles on a paper napkin. Many artists will hire a specific designer to conceptualize a stage production, but from the very beginning the Zedd touring team has been extremely close-knit, and we always share roles and creative ideas freely. Zedd likes to be incredibly close to pretty much every aspect of his live show, so many of the crucial design decisions would happen in a group discussion during a meal at an airport, or a van ride on the way to a music festival. Our lighting director Stevie Hernandez would create renderings of different ideas in Vectorworks pretty much in real time, which helped different ideas evolve and change.

Video content has always been the central focus of the Zedd show (and I'm NOT just saying that because I'm a VJ!!). For the True Colors Tour we wanted to give fans the most immersive experience possible, so the design we landed on was pretty much a giant 84 foot wide LED wall, framed with all sorts of light fixtures, lasers, and special effects. We were able to use an LED wall that was fully 4K in width - a dream come true for any pixel pusher. It's been really exciting to watch the rapid development of LED technology in recent years. Bigger walls, higher resolutions, soon I'm sure we're going to be watching shows in retina quality! In the five months leading up to the start of the tour, we worked closely with Beeple (Mike Winkelmann) to create the bulk of the new show visuals, rendered in stunning 4418x1080 resolution. Scott Pagano and I also contributed to the content push, which enabled me to curate an entirely new Zedd visual show from our previous tour.

Read more about Production Club's process here: http://www.productionclub.net/work/truecolors



The thing that stands out most to me is how video, laser and light play the accents in the music as a team, almost like a band. Is this something that you practice?

"Practicing" is always a tricky subject in the world of live production. The cost of renting enough gear to do a proper rehearsal is so high that it only really makes sense surrounding a tour where the costs are being spread over a few months. We were lucky to have two weeks of rehearsals before our tour rolled out, where we built the full size production in a sweaty, cavernous warehouse in Las Vegas, and Zedd, myself, Ken (our tour manager AND laser operator), and Stevie (lights) spent 12+ hours a day listening to music and creating unique looks for each song Zedd wanted to play during the tour. We brought in a mobile studio for Zedd to use, and each day would usually begin with us brainstorming visual ideas, and then taking breaks where me and Stevie could program the looks, and Zedd could work on musical edits and tweaks. It was hard to leave the rehearsal space at the end of the day because we were getting so much done!

It's all live right, no SMPTE? What would you say to people that are just starting out and are looking to get a tight sync like that?

No SMPTE! Every single video clip, strobe hit, and pyro shot are all cued live. That's why our rehearsals took so long. I have a lot of respect for people who put together time coded shows, and there are a lot of things you can do with that kind of sync that just aren't possible with live triggering, but for me, realtime performance is the only way I like to work. Music is what drives the visuals, and Zedd always DJs live, so there is a certain level of authenticity that is communicated by including some human error into the visual performance.

Whenever someone asks me how they should get into VJing, I always tell them to start by understanding music. You can definitely be a technical person and excel in the visual performance world, but in order to deliver an on-time show (with no timecode) you really have to learn music and rhythm. If you have good timing, and understand the basics of music theory, you can put on an amazing show even with the worst video content on the smallest screens.

productionclub-zedd-truecolorstour-worldwide-2015-08.jpg

What gear are you bringing with you? Is it easy to deal with airport customs?

For a normal fly-in show, I use a MacBook Pro Retina with three MIDI controllers: two Traktor Kontrol F1s and a Midi Fighter 3D. My whole kit fits nicely in a Pelican 1510 carry-on case, and if customs ever tries to hassle me I just say "it's for making computer music!!!" and they always leave me alone. Flying around with three laptops sometimes raises a few eyebrows, but I've never gotten seriously held up (yet! *knock on wood*)

How does Resolume fit into all this?

Resolume's simple layout makes it SUPER easy to organize our visual show. I always try to think about telling a story through our video content, and all of my Resolume compositions are arranged in a timeline that I navigate around depending on what songs are being played. Since everything is live, choosing a media server that allowed for quick re-organization was really important to me. Add in the first class customer service from the Resolume team, and it's a no brainer!

productionclub-zedd-truecolorstour-worldwide-2015-09.jpg

Where can we find you online?

You can find my work on the web at http://www.gabedamast.com

or on other platforms:
vimeo: https://vimeo.com/user5953855
behance: https://www.behance.net/gabedamast

Combining video, lights and laser: Rebel Overlay at Hydra

When we released Arena 5 with the added options for DMX output, we were expecting you guys to do amazing things with it. Still, Spencer Heron from Rebel Overlay recently sent us a video that exceeded our expectations and then some.

Their setup has video, lights and laser, all controlled by a single instance of Arena. Combined with Rebel Overlay's trademark minimal design, it looks downright amazing.



Read more about Rebel Overlay and how they made this happen below.

[fold][/fold]

Who and what is Rebel Overlay and where can we find you online? 

Rebel Overlay is a London based visual design company, we specialize in live visuals, content creation and creative visual setups for events worldwide. We’re formed of three core members, Spencer Heron, Philip Rust and Ben Field.

You can find our work online at http://www.rebeloverlay.com, http://www.vimeo.com/rebeloverlay and http://www.facebook.com/reblovrla

How did you start VJ'ing?

Rebel Overlay started back in around 2005, while I was studying Film Production at the Bournemouth Screen Academy, UK. I needed more of a creative outlet for my ideas and video experimentations. Back then the VJ scene in the UK was building up momentum, and experimenting with visualizing audio was a popular medium I had a lot of interest in. My first real love of visuals came when I would be glued to the television at 3am on the weekends, watching a TV series on terrestrial television produced by Addictive TV, titled ‘Mixmasters’. It was a late night music show that featured audio-visual and DJ:VJ mixes produced by hundreds of different music labels and visual artists worldwide.

I wasn’t able to translate the audio and visual relationship much into my studies, so I formed good relationships with event programmers and began doing projection and television displays at my friends' events. Since then I have moved to the city, and the demand for visuals was enough that I could continue to grow the company; we’ve been working in the field ever since.

DSC_1331.JPG
Can you tell us a bit more about your setup at Hydra?

In the video we’re using 70x RGB pixel battens, a 12mm LED screen, 24x green point lasers and 4x RGB sweeper beam moving head lights. Everything runs from inside the new Resolume Arena 5 software. We’re running the LED screen straight over DVI, while all the other fixtures are controlled using a Showjockey 16-way Art-Net to DMX box across 16 universes.

We create animation sequences, e.g. simple flashing boxes, and they are used to control the lighting fixtures. DMX works in values of 0–255. So, to translate that into video, black is 0 and full brightness is 255, and that’s how we work with the ranges of each lighting fixture. The moving-head sweep beam lights only have one axis, but are split into two sections per light, which is fairly simple to control in theory but quite difficult to find a working solution for without using more layers. For this, we used a gradient source so we have control of the brightness on two sides of a solid layer. We mapped the DMX value for the lights tilting to either side of this gradient, which means we can use Resolume’s automation controls to sync in the movement, or map the movement to a fader on our MIDI controllers.

The lasers work on a simple threshold: any value over 250 switches them on. The pixel battens act like a low resolution video screen, so we can extend the content from the LED screen out over the heads of the audience.
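A minimal sketch of this video-to-DMX translation may help. The 0–255 mapping and the over-250 laser threshold are as described above; the function names and the sample frame of pixel values are purely illustrative:

```python
# Sketch of driving DMX fixtures from video pixels: black (0.0) maps to
# DMX 0, full brightness (1.0) to DMX 255. The laser threshold matches
# the behaviour described in the interview; everything else is made up.

def brightness_to_dmx(brightness):
    """Map normalised brightness (0.0 = black, 1.0 = full) to DMX 0-255."""
    b = max(0.0, min(1.0, brightness))   # clamp out-of-range values
    return round(b * 255)

def laser_on(dmx_value, threshold=250):
    """Lasers are binary: any DMX value over the threshold switches them on."""
    return dmx_value > threshold

frame = [0.0, 0.5, 0.99, 1.0]            # sampled pixel brightnesses
dmx = [brightness_to_dmx(b) for b in frame]
lasers = [laser_on(v) for v in dmx]
print(dmx)     # [0, 128, 252, 255]
print(lasers)  # [False, False, True, True]
```

With this scheme, any clip or effect that animates brightness in the composition automatically animates the rig; a flashing white box becomes a strobing batten.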

DSC_1090.JPG
What's Hydra? Does the name have any significance?

The Hydra is an expertly curated series of parties in London, running primarily throughout autumn each year. Created in 2012 as the brainchild of former head of programming at The End club, Ajay Jayaram, and off-location event pioneer Dolan Bergin, The Hydra has built up an inimitable reputation in the UK and further afield. Hence the name from Greek mythology: a “beast with many heads”. The Hydra also rears its head at other events across the year in London and overseas.

The Hydra’s uncompromising mission is to collaborate with the finest artists, record labels and collectives from around the world, representing them against a backdrop of superlative sound and production and bringing the very best underground music to London’s doorstep.

DSC_1052.JPG
How does Resolume fit into all this?

I began using Resolume software around the launch of Avenue. I was drawn to the simplicity of it all and the linear layout, where I could have the option of loading all my usable media into a single composition. The ability to add effects and manipulate a clip, then duplicate it and add more manipulation, and still be able to call that content up when required was an extremely valuable asset for us.

The latest Resolume 5.0 release is the backbone of the production we do for The Hydra. We’ve invested a lot in equipment, displays and lighting fixtures over the last year with the idea that we can control most elements of a show live, with all the lighting and video elements working together in perfect time with Resolume’s BPM tracker, all running from a single laptop. It’s a fairly new and unique approach to controlling fixtures, using an animated composition as the trigger. We simply design our show layouts in 3D, and export image masks to work from and animate on top of. In Resolume we then map all our fixtures in the Advanced Output of Arena, sending our DMX over Art-Net and the video via DVI.
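For the curious, the ArtDmx packet that carries one universe of DMX over Art-Net (UDP port 6454) has a fairly simple layout. This is a hedged sketch based on the public Art-Net specification, not Resolume's internals, and the universe and channel values are made up:

```python
import struct

# Build an ArtDmx packet: up to 512 channels of one DMX universe,
# sent as a UDP datagram. Layout per the public Art-Net spec;
# universe and channel data below are illustrative.

def artdmx_packet(universe, dmx_data, sequence=0):
    data = bytes(dmx_data[:512])
    if len(data) % 2:                      # DMX payload length must be even
        data += b"\x00"
    return (
        b"Art-Net\x00"                     # fixed 8-byte ID string
        + struct.pack("<H", 0x5000)        # OpCode ArtDmx, little-endian
        + struct.pack(">H", 14)            # protocol version 14, big-endian
        + bytes([sequence, 0])             # sequence number, physical port
        + struct.pack("<H", universe)      # 15-bit port address, little-endian
        + struct.pack(">H", len(data))     # payload length, big-endian
        + data                             # the DMX channel values, 0-255 each
    )

pkt = artdmx_packet(universe=3, dmx_data=[255, 0, 128, 64])
# Send with: socket.socket(AF_INET, SOCK_DGRAM).sendto(pkt, (node_ip, 6454))
```

An Art-Net-to-DMX box like the Showjockey unit mentioned above listens for these datagrams and clocks the payload out as a standard DMX frame on the matching output.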

It’s the scenography element that most interests us today. We want to design shows that aren’t just focused on the screens and the visual / VJ element, but on how everything else, from lasers to moving heads, is incorporated into the design.

DSC_1152.JPG
Why did you choose Res over a regular lighting desk? Don't you like endless rows of faders and illegible touchscreens?

Coming solely from a video background, DMX is still very new to us; we started with absolutely no experience with DMX at all, so it was quite a learning curve to get our heads around. Lighting desks confuse the life out of me; I don’t really know where to start. On many international events I would work closely with an incredibly good lighting designer named Ross Chapple (who now works with the likes of Eric Prydz), and he gave some invaluable assistance in the DMX world.

Lighting desks and operators are obviously the right way to do things, and we’re not trying to overlook their use. It’s just that for what we are doing on the smaller scale, it makes a lot of sense to keep everything self-contained, which is kind of the mirror image of what we’re seeing with lighting desks now incorporating more elements of video. Our preferred approach simply comes from a video background, using the Arena servers to push the pixels.

You're one of the first to push the DMX integration this far. How is it being an early adopter? 

The DMX integration has really stepped up our game. Having just one software application running keeps the processing down and means a single machine can handle what we’re throwing at it. It’s been a fairly simple process getting to grips with the advanced pixel mapping features; being used to video and pixel mapping with previous versions of Arena, the addition of DMX follows the same principles in the Advanced Output, and that’s made the transition an easy one. The latest 5.0.2 release is almost there with all the features we will ever need. We’re regularly running a full 16 universes alongside up to five DVI outputs on DualHead and TripleHead2Gos on a single machine, and Arena is handling things very well!

How was the lead-up / tech setup? Smooth sailing or does it still give you nightmares?

The tech setup for The Hydra was initially a difficult process. The animation, layout and processing of things is fairly straightforward, with the main difficulty coming from creating a show that can be set up in a short amount of time and taken down the next morning. The Hydra venue is a photography studio by day and an event space by night, and that’s where things become more time-consuming: building a system that can be installed and taken down quickly and efficiently. This also limits our creativity, as we have so many ideas for large scale, impressive setups, but the budgets and time constraints mean we have to work in a more realistic manner.

Anything else?

The next Hydra show happens on April 29th and will see the Dutch heavyweight festival Dekmantel come to London and showcase the likes of Joy Orbison, Kyle Hall and Marcellus Pittman.

DSC_1106.JPG

Touring Latin America - Viaje Tour Ricardo Arjona

The Viaje Tour Ricardo Arjona has been called the most successful Latin tour of 2014-2015 by Pollstar and Billboard, with an attendance of more than 1.7 million people.



The man behind the visuals on this tour is Camilo Mejia, also known as VJ Bastard. Read more about what he has to say on the touring experience below.
[fold][/fold]
We have been touring for a year using Resolume with no issues. We've been to Argentina (8 cities, 25 shows and we're going back for at least 5 shows more in November), Mexico (16 cities, 35 shows), Uruguay, Paraguay, Panama, Costa Rica (2 shows), Chile (7 shows), Puerto Rico (5 shows), USA (13 cities, and we're going back next month to 8 cities), Ecuador (3 cities), Venezuela (5 cities), Guatemala (2 shows, 3 cities), El Salvador, Honduras (2 cities), Nicaragua, Colombia (5 cities) and we are waiting for the confirmation of the Europe tour.

RicardoArjona_1.jpg
I have been using Resolume since 2.4.1, and have a good 15 years of experience playing with video.

I was called for this tour in May of 2014 as visualist and video engineer. We made rehearsals for a month in Mexico, during which we decided that the perfect system for our tour would be Resolume Arena.

First of all there is the stability. I've played HUGE clips, from 2 GB to 70 or 80 GB, with no issues, so I know I will not have any problem with that. Because we don't run with a backup signal, that's a serious point.

Second of all, we have a lot of stuff on our tour. Props (cars, trains, bikes, chairs…), backline, consoles, screens: everything is travelling with us. As you may notice, the screens on the tour are huge, and they are the first thing that we prepare for the show. Build-up time is around 4 hours for the screens alone. With other systems it's easy for things to go missing, so the portability is really important for us.



The setup that we use for the tour consists of 436 modules of 6mm pitch LED screen. Resolume runs on a Mac Pro 12-core 2.7 GHz with 64 GB of DDR3 RAM, 1 TB of storage, and dual AMD FirePro D700 GPUs with 6 GB of VRAM each.

The outputs are set up as: one for the main screen, one for the “leeds” or totems by the sides, one for the backing for the musicians (a moving door), and one for the tunnels. The full comp is 2115 px x 1080 px, with no scalers, Folsoms or anything; I go straight to the processors with dual links and that's it.
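As a sketch of how a single wide canvas like this gets carved into per-output regions, consider a slice map. Only the 2115x1080 canvas size comes from the interview; the four rectangles below are purely illustrative, not the tour's real slices:

```python
# Hypothetical slice map for a 2115x1080 composition feeding four outputs.
# Only the canvas size is from the interview; the regions are made up.

CANVAS = (2115, 1080)

# (name, x, y, width, height) in composition pixels
SLICES = [
    ("main screen", 0,    0, 1280, 1080),
    ("side totems", 1280, 0, 335,  1080),
    ("moving door", 1615, 0, 300,  1080),
    ("tunnels",     1915, 0, 200,  1080),
]

def validate(slices, canvas):
    """Check every slice stays inside the canvas."""
    w, h = canvas
    for name, x, y, sw, sh in slices:
        assert 0 <= x and x + sw <= w, name
        assert 0 <= y and y + sh <= h, name
    return True

print(validate(SLICES, CANVAS))  # True
```

Each rectangle corresponds to a crop of the full comp that is routed to one physical output, so one render pass feeds every surface at once.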
RicardpArjona_2.jpg
I play the show with an Akai APC40 (the older version), and some of the songs have SMPTE sync sent from Pro Tools. A Blackmagic capture card is used to capture an HD-SDI signal that is used in some cues to show the musicians and other live shots.


Read more about Camilo at his website: http://www.vjbastard.com or check out his work on Vimeo at https://vimeo.com/visualbastard

Rocking Out and Getting Your Geek On: Negrita!

We spoke with Marino Cecada, an Italian visual designer who has been doing some out-of-the-box work for various pop and rock acts. Where most rock shows visually rely on simple live camera feeds, Marino uses Arena and some custom FFGL wizardry to take things to the next level.

Negrita-ilGioco.jpg
[fold][/fold]
Tell us about yourself
I live and work in Italy, and since 2006 I have been working in the video production business, in the beginning as a cameraman and editor. In 2007, together with two colleagues, I worked on our first visual project for concerts: it was the "Soundtrack" tour for Elisa, a famous Italian singer.

There were no live cameras for that tour. We had PVC screens on which our videos were projected, each video made especially for each song. Over the years, I have been more and more attracted to video art and, generally speaking, everything regarding video production in the music and concert world.

Some of the first interactive video installations I made were with Quartz Composer. I used it to work on music video clips, and I collaborated on other tours, on which I have always supervised the visuals and recently also direction and broadcast.

Then, in 2012, while preparing for a tour in the US with Il Volo, my colleague Steve Sarzi proposed working with Arena. I had never heard of it, but after using it with Steve for a couple of hours, we had already set up the basics for the project I had in mind and, most of all, it worked! I was impressed by how fast and easy the software is to use. Also, it immediately read SMPTE signals, which was extremely important as I usually work with pre-made videos which are synced.

Tell us about your last work
The last job was for Negrita, an Italian rock group with a 30-year career. For spring 2015 they had an indoor stadium tour in mind. The lighting and stage designer, Jo Campana, conceived a very essential stage: the background was made of three big LED screens, 5.60 meters tall and 4.20 meters wide, occupying 16 meters in length.

Because of the space given to them, a lot of the show was centered on what was happening on these screens. The idea we had was to mainly use live footage and to avoid tracks with simply a video in the background. The live images were conceived as a graphic element in support of the set/scenic design, so the important thing was that what we were filming had to be processed and filtered to give a different interpretation for each track.

Being a live broadcast, the result was a sequence of live videos that followed the dynamics and the energy of what was going on on stage. No pre-created video could have given the same feeling. The only part we left unprocessed was the very last song, in which all the lights were turned on and the show came back to a more "earthly" environment.

Tell us about the video system you used
To realize what we had in mind, Telemauri, a video rental company Steve and I work closely with, gave us three cameras with operators, three remote-controlled cameras, plus some small ones placed on stage, one of which was on the drum kit.

All cameras were routed into an ATEM switcher, which was extremely versatile: it let us independently send four SDI and four other input signals to the computer running Arena.

What came out of Arena went directly to a video matrix and from there to the screens. Our output was a single full HD signal; the mapping of the three screens was done directly in Arena, deciding what should go on each screen. I prefer to keep some of Arena's controls in manual mode, like the master fader, so we connected a Samson Graphite MF8 controller to the computer.
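Since the three screens were fed from one full HD output, the mapping step boils down to assigning each screen its own region of the output canvas. A minimal sketch of that arithmetic (equal slices and these pixel sizes are our assumptions for illustration, not the actual show spec, which depends on the LED pitch):

```python
# One full-HD output carved into side-by-side regions, one per LED
# screen -- the same idea as mapping slices of the output in Arena.
OUT_W, OUT_H = 1920, 1080

def slice_regions(n_screens, out_w=OUT_W, out_h=OUT_H):
    """Split the output canvas into n equal vertical slices (x, y, w, h)."""
    w = out_w // n_screens
    return [(i * w, 0, w, out_h) for i in range(n_screens)]

for i, (x, y, w, h) in enumerate(slice_regions(3), start=1):
    print(f"screen {i}: x={x}, size {w}x{h}")
```

The video matrix then only has to route each region to the right screen; all the creative decisions stay inside the composition.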

Did you use any particular effects?
One of the aspects that made us choose Arena over other, more "prestigious" media servers is that through FFGL we could develop our own plugins. In fact, on the previous tour, Elisa's "L'Anima Vola", we created plugins for a choreography in which the singer moved across the stage while her movements were repeated several times on the screens to create a trail.

Elisa-Anima vola.png - Elisa tour 2014

Another plugin I enjoy, developed together with Davide Mania (an FFGL programmer I have been working with for years), is one we named ScrumblePict. I use it often: it allows us to have copies of a signal without using more of Arena's clip slots. These copies can be moved, rotated, scaled and cropped, letting us create different templates every time.
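The move/rotate/scale operations on those copies are plain 2D affine transforms. Purely as an illustration of what happens to each copy's corner points (the function and parameter names are ours, not the plugin's):

```python
import math

def transform_point(x, y, dx=0.0, dy=0.0, angle=0.0, scale=1.0):
    """Scale and rotate a point about the origin, then translate it --
    the per-copy transform a ScrumblePict-style effect applies."""
    c, s = math.cos(angle), math.sin(angle)
    return (scale * (x * c - y * s) + dx,
            scale * (x * s + y * c) + dy)

# one source corner, drawn as three differently placed copies
corner = (100.0, 0.0)
copies = [
    transform_point(*corner),                       # untouched
    transform_point(*corner, dx=200.0, scale=0.5),  # half size, shifted right
    transform_point(*corner, angle=math.pi / 2),    # rotated 90 degrees
]
```

Applying a handful of these transforms to the same live signal is what makes every template look different without consuming extra clip slots.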

Elisa-Labyrinth.jpg - Elisa tour 2014

Antonacci.jpg - Biagio Antonacci tour 2014

Could you show us some of the graphic styles used for the show?
As I mentioned, I very much enjoy working with image decompositions, so on this tour we also got busy breaking down and recomposing the signal that came through the switcher.

Negrita-radioConga.jpg - Negrita live, example of the ScrumblePict effect.

For other tracks, we took advantage of the edge detection and pixel screen effects.

Negrita-Atomo.jpg

Another fantastic aspect of Arena is the possibility to use DXV files with the alpha channel. In this way we can create moving masks for live inputs.
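Conceptually, an alpha-channel clip used as a moving mask is ordinary per-pixel alpha blending over the live input. A minimal single-channel sketch (values in 0-255, alpha in 0-1; the function name is ours):

```python
def apply_mask(live, background, alpha):
    """Standard alpha blend: alpha=1.0 shows the live input fully,
    alpha=0.0 shows only the background underneath."""
    return live * alpha + background * (1.0 - alpha)

# a "moving mask" is just this alpha value varying per pixel, per frame
print(apply_mask(200, 20, 0.75))  # 155.0
```

With a DXV clip carrying the alpha channel, the mask animates on its own while the live camera feed plays inside it.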

Negrita-Love.jpg - Negrita live, mask with alpha channel and live input inside

More info and other works and productions at http://www.editart.it

Artist profile: Ghostdad

A breath of fresh air in the saturated landscape of abstract EDM visuals, Ghostdad aka Ryan Sciaino caught our eye running the impressive visuals for Porter Robinson's Worlds tour. After spending a morning scouring the Interwebs for concert footage, we figured we just might as well get in touch with the legend himself.

DM_SH01.jpg - Porter Robinson - Worlds, still image courtesy of Invisible Light Network
[fold][/fold]
How did you get started in the VJ game? When did you discover Resolume?

I grew up playing music, eventually DJing, and then creating visuals for my own music. At some point digging for records and samples turned into digging for found footage from VHS tapes and dollar store DVDs. That was around the time internet video was becoming popular too, so I’d comb YouTube and archive.org for weird stuff as well.

I went to college for computer music and started learning Max/MSP while I was there. I built a video sampler I could use to switch through clips while DJing, but eventually amped it up to take on the road with my band WIN WIN when we were doing a synced-up DJ/VJ set. It was sort of a monster with cue points and BPM sync and effects, so programming it got pretty intense!
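The BPM-sync part of a sampler like that reduces to converting elapsed time into beats and picking a cue point per beat. A rough sketch of the idea (not the actual Max/MSP patch; names and cue times are ours):

```python
def beat_position(elapsed_s, bpm):
    """Fractional number of beats elapsed since the clock started."""
    return elapsed_s * bpm / 60.0

def cue_for_beat(elapsed_s, bpm, cue_points):
    """Jump to a cue point on every beat, cycling through the list."""
    beat = int(beat_position(elapsed_s, bpm))
    return cue_points[beat % len(cue_points)]

cues = [0.0, 1.5, 3.0, 4.5]          # cue times (s) within the clip
print(cue_for_beat(1.0, 120, cues))  # beat 2 -> cue 3.0
```

Feed the same clock to both the DJ and VJ sides and the clip jumps land on the beat, which is what made the synced DJ/VJ set possible.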



When I started working with Porter I knew I needed something faster and easier to throw new content into on the fly. I was making lots of new looks to layer up with other clips and logos etc. It seemed like Resolume could handle anything I threw at it, and the triggering was the fastest I’d ever used, making it really fun to jam with.

Who are some of the artists that inspired you early on? Who is knocking your head back currently?

I listened to a lot of indie rock in high school and Cornelius was one of my favourite artists from Japan. I was lucky enough to see him do his Point show in NYC. I had never seen anything like it. I grew up watching music videos and even got into film and video art, so I was used to seeing music and video together, but never with live music in person like that. The content and degree of sync were incredible. It really blew my mind.



I’m a pretty big fan boy of artists who use multimedia in a conceptual way but also keep it really clean design wise like Ryoji Ikeda or Rafael Rozendaal. I’ve found more and more of my Vimeo likes being taken up by things that have been featured at http://ghosting.tv. And I definitely try to check out other artists when I’m at festivals too. I saw Bassnectar at Buku in New Orleans a few weeks ago and that was an awesome show.

You have a very varied but distinct style. From anime characters to mayan mysticism to abstract glitch to low-poly geometry to ponies, the list goes on and on. Where does it all come from? Didn't your mom tell you not to spend so much time on the internet?

No actually! We didn’t have the internet when I was growing up! We got a connection by the time I was in high school but it was maybe dial-up speed at best. I’m a little older than what I consider to be the first real “internet generation”, so when things got really high speed and dazzling it made me feel like a stranger in a strange land. There was so much amazing stuff happening on Tumblr or Vimeo or Second Life that I just wanted to check it all out. I get sucked down the rabbit hole online pretty easily, especially when I want to find out more about a genre or an artist or a meme. Some design trends I see online do remind me of things I grew up with, like 8-bit video games or low-poly 3D graphics, so maybe that makes me think “I can do that!”

Tigerlily2.png - Visuals for DJ Tigerlily, courtesy of Ryan

What caught my eye about the Porter Robinson 'Worlds' content is that it almost seems to be cinematic, in that it seems to be telling a story. Now our minds will always create a narrative with what we see, but is this an experience you consciously set out to create?

I think Porter’s goal is to evoke a feeling rather than tell a specific story. There’s definitely a tendency to connect what you’re seeing on the screen and create a story in your mind, but that’s also the process that pulls you in and allows you to really feel it. In programming the show we give you every audio/video/lighting cue we can for the theme and timing and mood, and as a result I think viewers get to paint their own story and put themselves into it in the process.

FF_SH01.jpg - Porter Robinson - Worlds, still image courtesy of Invisible Light Network

The Worlds tour content is a collaborative effort, with you playing content created by a larger group of visual artists. Who are the people that you've been working with and how has it been working with them?

We worked with Invisible Light Network on the animated looks you see in the show. They’re based in NY also and had about 9 or 10 illustrators and animators working on their team. We were also able to grab additional content from some of Porter’s music videos like Flicker by Coyote Post.

I made content for the show as well and Porter was super in touch with everyone throughout the creation process. It was a lot of different footage to wrangle in Premiere, but I spent a week with Porter before the start of the Worlds tour where we really figured out the visual flow and style of the show while putting it all together. Porter has a tremendous amount of vision when it comes to his music which is totally inspiring.

So when looking at Youtube videos from your shows, I came across this: https://www.youtube.com/watch?v=AdotsHAzfVA. It looks like someone has been re-creating the content he saw at the show. How do you like them apples?

Yah we just saw that also! I think fans are dying to take home a piece of the show and it’s really cool they’d go so far as to recreate it from the bits of media that are floating around out there. I think that excitement starts with Porter’s music though since there are practically whole new versions of songs from the album in the live set.

Someone even put the entire set together from cell phone footage taken at shows with homemade recreations of the live music. Okay and here’s where it gets really crazy, someone even started building the live rig into a 3D game engine: https://youtu.be/kq3TcMxpcV4

The expanded presentation of Worlds as an album is what makes it special, but I think the live and communal aspects are still super important. Maybe someday we’ll all be able to log into an MMO and experience something similar but even that won’t be able to beat being there in person experiencing the show with other fans. My guess is there will be a complete version of the Worlds show you can watch at home someday but for now we try to keep certain things exclusive to the live set so you have to show up and get the full experience.


Hopefully this crowd video from the Youtubes captures some of that live experience!

Recently you've been playing with Unity to make realtime visuals. What's the main thing that makes realtime more fun than pre-rendered?

Render time is never fun and playing video games is always fun, right? I’ve never been very patient with 3D software. A lot of the 3D stuff I work on has a lo-fi video game aesthetic as well, so it’s sort of a no-brainer to start throwing stuff into Unity. I jump in and out of Blender as well, but I figured if I’m going to put my time into learning a 3D environment I wanted it to be real time.

edit Copy 09_changes_140817.00_46_21_15.Still004 copy.jpg - Porter Robinson - Worlds, still image courtesy of Ryan

Alex, my band mate in WIN WIN, is way more under the hood with Blender and rendered some really weird stuff for our last music video. We really liked the effect of video footage height-mapped to a mesh; the objects came out really smooth and organic looking, in part thanks to some render time:



What are the main stumbling blocks you run into when working in realtime as opposed to keyframing everything? What about the liberating moments of the freedom it offers?

Scripting is something I wrestle with. It’s great that objects can do what you want in real time but you still have to tell them what to do! The benefit of course being you can see those changes instantly, and tweak it endlessly.

Controlling things in real time keeps me a little more engaged and expressive. I think coming from a music background makes that important to me.

Tigerlily3.png - Visuals for DJ Tigerlily, courtesy of Ryan

Do you like to control things in realtime during the show? Or is the appeal more during the creative process?

It’s been great to get the chance to do both this year, by which I mean both VJing live and spending time editing and programming a show. There are always things that will look better when edited ahead of time, but even in a show like Worlds I leave myself a few things to do by hand. Sometimes that’s so I can follow what Porter’s doing live, but it’s also for me to feel more involved in the performative aspect of the show. I still play guitar and keys, so I don’t want to let go of that live aspect of playing visuals like an instrument, too.

People seem to get really excited when discussing realtime vs rendered, some people even get militant about it. You seem to switch seamlessly between both. Do you think one or the other has more potential? Are they mutually exclusive? Where would you like to see visuals heading in the next five years?

Part of my thinking about learning realtime is definitely about the future. Ideally real time processing will catch up to how good it can look when rendering. I don’t mind it looking a little rough around the edges for now if I can play it to the beat.

Ryan is a prolific internet user, so you can catch him in a variety of digital media. Get started down the rabbit hole at his website: http://www.djghostdad.com/

Gorgeous & Playful Interactive Projection Mapping

Dalziel-Pow is a London-based agency with over 30 years' experience in brand and retail design, passionate about creating retail experiences that are unique and engaging for the customer.

A few days ago, they sent us the link to their amazing new project:

http://www.dalziel-pow.com/news/interactive-animations-retail-design-expo

They were kind enough to expand a little about how the project works:
[fold][/fold]
It's a giant screen-printed illustration turned into a touch-sensitive MIDI controller using conductive ink.
It's proving massively popular in the retail world and we're getting plenty of publicity for it.

And rightly so! The playful artwork looks gorgeous, and the interactive projection elements complement it really well.

Using Resolume has been a great help getting the project up and running:

The ability to play multiple layers at once was really what drew us to the software - very handy and, more importantly, very stable. With a powerful enough machine we must have been playing something like 30 layers simultaneously when we were stress testing!

The MIDI mapping feature meant we could trigger content through the picture itself, using conductive ink from Bare Conductive: http://www.bareconductive.com/
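On the software side, each inked touch area ends up as an ordinary MIDI note that can be mapped to a clip trigger. A hypothetical sketch of that mapping (the note numbers and clip names are made up; 0x90 is the standard MIDI note-on status byte, not anything specific to this installation):

```python
# Hypothetical note-to-clip map; the real installation's assignments differ.
NOTE_TO_CLIP = {60: "bird", 62: "flower", 64: "cloud"}

def handle_midi(status, note, velocity):
    """On a note-on (status 0x90) with non-zero velocity, return the
    name of the clip to trigger; otherwise ignore the message."""
    if status == 0x90 and velocity > 0:
        return NOTE_TO_CLIP.get(note)
    return None

print(handle_midi(0x90, 62, 100))  # flower
```

Because the illustration only has to emit notes, any touch board that speaks MIDI can drive the projected content with no custom integration.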

Be sure to check out their other work at http://www.dalziel-pow.com/

Dream On: Rocking It Out with Aerosmith

One of the great things about hosting Resolume workshops is you get to meet so many amazing people from all over the world.

One such amazing person is Danny Purdue.

After joining us for a session last year, he showed us the impressive work he was doing for Light Nightclub in Las Vegas. Soon after that, we got word he would be running Resolume for the visuals on the Aerosmith world tour. Live visuals are common in the EDM and club scenes, but still a relatively new thing at rock shows, so of course we had to get the lowdown on this.

Here's the interview with Danny himself.
[fold][/fold]

ResolumeSetupSmall.jpg

Who are you and how the heck did you land a job with Aerosmith? What other work have you done?

I’ve spent most of the last ten years touring and producing live video at concerts. I started out running camera and editing, then eventually got into directing and more of the overall show design. On the side I had an interest in VJing, and the two paths crossed when I directed the live video for Swedish House Mafia's “One Last Tour” in 2012. The camera shots needed to be stylized to complement the LED visuals, so I integrated Resolume with a broadcast switcher to mix effects with my line cut. Creatively it was really fun and being out there inspired me to pursue VJing more seriously.

When that tour was over I headed to Las Vegas for a residency at two clubs opening in Mandalay Bay. One was Daylight Beach Club, a straightforward VJ gig, and the other was Cirque du Soleil’s Light Nightclub, an ambitious project to combine the theatrics of Cirque with EDM and the nightclub environment. Light has a massive LED installation, wireless HD cameras, motion tracking equipment for generative visuals, and custom content built for the architecture of the room. It took a lot of talented people to bring all the pieces together and make Light successful.

In March I got a call about putting together a Resolume demo for the upcoming Aerosmith tour. It sounded like a cool opportunity, so I went over to LA and worked out a formula similar to the Swedish House rig with Arena running inline with broadcast video equipment. A few days later Steven Tyler came by, I demoed some looks, then we spent a couple hours trying out all kinds of effects using recordings from previous shows. He liked what he saw and asked me to join the tour.

Why was Resolume chosen over other media servers?

The choice to use Resolume came from Steven. I was pretty surprised he knew about it, and even more so when we first met that he had actually downloaded the demo, gone through footage on the website, and rattled off names of the animators he liked. The man does his homework. After seeing what VJs were doing with Resolume, Steven was excited to use the large palette of effects to create visual moments in Aerosmith's show.

We didn’t have production rehearsals before the tour, so the immediate benefit of Resolume was rapidly developing ideas. Instead of a lengthy creative process with renderings and reviews, we knew a server running Arena could achieve whatever visual treatments we came up with on the road.

How is operating a rock show different from an EDM style event?

The main difference on this project was using visuals and effects to accent a show rather than drive it. What fans want to see at an Aerosmith concert isn’t graphics, it’s these rock icons playing their instruments and Steven Tyler’s wild stage presence. So it was a video-first approach where several elements had to be right for a visual moment to be effective.

After Steven and I developed a concept, I worked with our lighting designer Cosmo Wilson, video director Jeff Claire, and the camera crew to sort out the practical side of things like colors, spotlights, and filming angles. It was a much different environment than VJing at a rave where your content is the show and you’re more in control of the ambience.

How was your deck structured?

I ended up using a single deck, mostly because it simplified my workflow with live inputs. Rather than having a lot of device source clips, I stuck with two and used layer routers to get the signal wherever I needed it in the composition. For one of the keying effects this routing allowed me to send an upper layer back down in the render order, which is a feature that’s hard to appreciate until you need it.

The deck was mostly effect clips and a small selection of content. Out of seven layers total, two were essentially fixed tools and the other five gave me plenty of room to stack up each look in a single column. One feature of Resolume I had rarely used for VJing but came to rely on for this project was Composition MIDI mapping. It saved a lot of time by not having to remap as I shuffled things around and tried different orders of effects.

What effects did you use to create the different looks for the songs?

Each look was a combination of multiple effects with the most significant parameters linked to dashboard controls. Here are a few of my favorites.

One of the first looks we created was for “Eat the Rich,” which started with an upbeat, tribal-esque breakdown of percussion and vocals. Steven walked downstage, faced the camera, and did some crazy dance moves with an effect we called "Shiva.” It was based on Delay RGB with some color strobing, and I had a couple knobs controlling distortion effects based on his moves that particular night.

EatTheRich.jpeg

The trick with this Edge Detection effect was keeping detail on Steven so it didn’t look gaudy, then I added the glare to give it a more elegant feel. This one really popped during “Dream On” when Steven stood on top of the piano with his hair and clothes blowing in the wind from the stage fans. The song opened with a clip of rolling clouds and those fit nicely with this look too.

PurpleGlow.jpg

The most challenging cue each show was called “Face Melt,” where a combination of keying effects made Steven's skin translucent to reveal twisting (Cosmic Strings!) graphics. Most of the time we used this at the beginning of "Sweet Emotion” under moody ultraviolet light, which is incredibly tough to key against. I had presets that got it close and dialed in the rest by hand to make sure content only played over him and didn't spill out over the rest of the image. This look was part of my original Resolume demo for Steven.
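A key like that decides, per pixel, how transparent the subject becomes based on its brightness. The show used Arena's keying effects with hand-dialed presets; purely as an illustration of the principle, a luma key with a soft threshold might look like this (the threshold and softness values are ours, not the show's):

```python
def luma(r, g, b):
    """Approximate Rec. 709 luma from 0-255 RGB components."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def key_alpha(r, g, b, threshold=60.0, softness=20.0):
    """0.0 below the threshold (fully keyed out), 1.0 above
    threshold + softness, with a linear ramp in between."""
    y = luma(r, g, b)
    return min(1.0, max(0.0, (y - threshold) / softness))

print(key_alpha(0, 0, 0))        # 0.0 -- dark pixel keyed out
print(key_alpha(255, 255, 255))  # 1.0 -- bright pixel kept
```

The softness ramp is what needs hand-dialing under changing stage light: too hard and the edge flickers, too soft and the graphics spill past the subject.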

FaceMelt.jpg

What were the technical setup and signal flow like?

As VJs we’re often confined to prosumer gear that fits in a carry-on case. Not here. My equipment and the main video system were provided by Screenworks NEP out of LA, giving me access to considerable resources from the beginning of the project. During the system build I was able to pull broadcast-grade hardware and converters off the shelf, test them out, and get exactly what I needed. Having a lab of sorts to experiment with integration was a real luxury. Once the spec was complete, our tech-savvy video engineer went through each piece of gear from camera to screen and shaved off every millisecond of latency possible.

My rig was located backstage with the rest of video world since I needed access to several HD sources and quick communication with the video crew. Resolume ran on a 2013 Mac Pro and captured two discrete video signals using a Blackmagic DeckLink Duo. The card took any combination of cameras and switcher feeds based on my selection with remote panels connected to the main system router. Resolume’s output passed through an Image Pro for genlock and HD-SDI conversion, then went back to the central router so we could place it upstream or downstream of anything else in the signal flow to the LED screen.

For control I used Livid Instruments’ CNTRL:R. It has both regular and “endless” knobs, small drum pads, plenty of RGB buttons, long-throw faders, and a layout that works well with how I operate. Everyone of course has their own cup of tea when it comes to MIDI controllers, but when Resolume is open I almost always have the CNTRL:R plugged in too.

The heart of the video system was a Ross Vision, a high end broadcast switcher with all kinds of mixing, keying, and automation abilities. We had one look driven by the Vision that was a grid of nine different 1080 HD sources with no drop in frame-rate or performance. For another song we had switcher buttons triggering sequences of comic book styled playback based on which band member and camera angle were being shown, then a layer of effected live video from Resolume was keyed into a panel to match the look. Top-notch hardware opens the door to some pretty imaginative workflows.

VideoWorld.jpg

Where do you see Resolume fitting in to a crowded scene of media servers and VJ software?

What originally got me into Resolume is its simplicity and intuitiveness, which let me focus on being creative. This is particularly important when you’re working with a high-profile artist whose time is very valuable. In a creative session you have to quickly translate ideas into repeatable cues, so you need a fast and flexible workflow. There is always time to go back and get technical with optimizing effect stacks, layering, and MIDI mapping. What doesn’t work is a rock star tapping his foot waiting on you to configure something.

One of Resolume’s best advantages that seems to either be overlooked or taken for granted is that it’s cross-platform. It’s important to me that no matter what tools and hardware I want to use, I don’t have to worry about changing the main piece of software I use to operate a show. Especially with Syphon and now Spout, a lot just comes down to user preference and project needs.

Looking forward, I’m excited to see how Resolume tackles new trends like it did with projection mapping. Things like timeline-based control data and DMX output are readily available using third party apps, but the process could be simpler.

Resolume is still a new tool to the industry as a whole and has a lot of room to grow beyond the EDM scene. As more artists embrace interactive technologies, generative show elements, and live content operators, having a powerful creative hub that can adapt to different workflows is key. Before this project I wouldn’t have expected Aerosmith to be part of this conversation, and was pleasantly surprised that even rock legends are riding the new wave of visual art.

See more of Danny's work at http://programfeed.com/

Media Selectah Workshop Brings People Together

Sometimes it's good to remember life is not all about optimized GPU codecs, gigantic LED walls and how many outputs you can run off a single computer. Sometimes life is about using visual art to transform spaces. And if we're really lucky, we can also use it to transform people.

Recently we received an email from Roxanna VJ Thai that reminded us of exactly that. And we'd like to share that letter with you all as well.

[fold][/fold]

Hello Resolume friends!

I wanted to share my recent experience, taking video and multimedia education to another level.

MediaSelectah_1.jpg
I was invited to give a multimedia workshop in Reynosa, one of the cities here in México where violence has taken over as a result of the drug war. I took the chance and tried to put fear aside.

MediaSelectah_2.jpg
It was an intensive one-week workshop called "LET US PLAY!", all about new media and exploring realtime video possibilities, revisiting artists, techniques, tools, etc. The participants were introduced to Resolume Avenue as the main tool to develop their individual projects.

MediaSelectah_3.jpg
With the support of Media Selectah, we opted to close the workshop with an in-situ video projection intervention, for which we designed a creative proposal and realized it on the same day. The venue was a big, white, enclosed room used for conventions. We had four "little projectors" (2K lumens each), but as always we did some magic, using them to the maximum and creating a smaller 3D video space within. Every participant developed their idea into a three-minute performance, and then they all took part in a jam improvisation where random music was played and they had to VJ to it.

MediaSelectah_5.jpg
They could draw on crowdsourced material, to which everybody had contributed videos and images in a collective way.

This workflow was inspired by the Surrealists' Exquisite Corpse. We have also run this workshop in Monterrey, Culiacán (Sinaloa) and San Luis Potosí, and it is always a new and different experience.

We love video and multimedia; it is our passion at Media Selectah! So we use it to transform spaces and people, and to open up new possibilities for these hybrid media explorations.

MediaSelectah_7.jpg
As a VJ artist (Vj Thai), it has been very difficult to do what I like for so many years without giving up, due to the lack of spaces and recognition for visuals as part of everything that has to do with music culture and beyond. So we have worked hard to open up new spaces for video in art and entertainment in México, by curating and producing multimedia happenings around the north of the country, mainly in Monterrey.

This is our website:

http://www.mediaselectah.com

http://www.vjthai.mediaselectah.com

And facebook fan page:

https://www.facebook.com/mediaselectah

You can see more pics here.

Pictures by Carlos Limas, Laura Flores, Tochirock Gallegos, Arabella Medrano and VjThai for Media Selectah.

thank you

_*_

Thai.

Project Showcase: Landschapslumen by Beeldjutters

Between all the EDM LED screen madness and the onslaught of geometric projection mapping, sometimes it's nice to see something completely different.

Such a thing is Landschapslumen by the Dutch artist collective Beeldjutters:


Dwaallumen (pilot).
[fold][/fold]
Our name is Beeldjutters, and we have a long-term project called Landschapslumen. We create images which we use to transform a landscape into a surreal spectacle. The first version of Landschapslumen was shown at the Dutch Oerol Theatre Festival in 2010. Following a successful try-out year, the installation returned for the 2011 edition of the festival in a large-scale version, with 14 projectors projecting onto the sand dunes near the sea. Since then, we have projected on forests, snow, palm trees and other landscapes. Back in those days we used to work with Wings Platinum; nowadays, we mainly work with Resolume Arena and sometimes old-school DVD players. Resolume gives us more freedom to experiment and to see instant results.



Sneeuwlumen

For our latest project, Dwaallumen, we need this freedom because we need to create a dialogue between the projected images and the dancers, musicians and actors: for example, adding extra arms and hands in the shadows of the musician, or double reversed silhouettes of the dancers, or even projecting a fish with the head of an actor flying through the landscapes. Our work stands in stark contrast to the techno bombast that you usually see in the clubs. Most of the time we don't use advanced 3D mapping, but focus on narrative images. However, we do use the mapping options very often to place the images in our landscapes and create the desired composition.

Come and experience Dwaallumen on 3 October in Utrecht. Tickets are available at our website. In December, Landschapslumen will be shown at the Amsterdam Light Festival. For more info, please visit http://www.landschapslumen.nl (mainly in Dutch, sorry!)