Visual Artists, Musicians, and VJs
Maxwell Crabill, Peter Franko
Concept and Proof
Research
Coding and Testing
Empowering visual artists to improvise alongside musicians.
For a coding exercise, I sought to inject an additional layer of artistry and control into music visualization. The Visual Instrument gives digital artists the power to live-paint visual landscapes with brushes that react to music.
I took the traditional concept of the visual music synthesizer and combined it with a generative painting program. The program features three unique brushes, and two separate canvas clearing functions that each play differently with the brush algorithms.
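To make the idea of a music-reactive brush concrete, here is a minimal sketch of the mapping involved. The class and method names (`MusicBrush`, `strokeWeightFor`) are illustrative assumptions, not the project's actual API; the linear mapping mirrors what Processing's built-in `map()` does.

```java
// Hypothetical sketch: map a normalized audio amplitude (0.0-1.0)
// to a brush stroke weight. Names here are illustrative only.
public class MusicBrush {
    private final float minWeight;
    private final float maxWeight;

    public MusicBrush(float minWeight, float maxWeight) {
        this.minWeight = minWeight;
        this.maxWeight = maxWeight;
    }

    // Linear map from amplitude [0,1] to stroke weight,
    // clamped so out-of-range samples can't blow up the brush.
    public float strokeWeightFor(float amplitude) {
        float a = Math.max(0f, Math.min(1f, amplitude));
        return minWeight + a * (maxWeight - minWeight);
    }

    public static void main(String[] args) {
        MusicBrush brush = new MusicBrush(2f, 24f);
        System.out.println(brush.strokeWeightFor(0f)); // quiet passage: 2.0
        System.out.println(brush.strokeWeightFor(1f)); // loud peak: 24.0
    }
}
```

In a real Processing sketch the amplitude would come from an audio analyzer each frame; the point is only that each brush owns its own mapping from sound to visual parameters, which is what lets the three brushes feel distinct against the same song.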
All code up to 2020 was written in Processing. All prototypes built after 2020 are built in TouchDesigner.
How might we use code to allow visual artists and illustrators to synthesize their work with music as it's being played?
Processing, a visual coding language, was to be our only tool. With this exercise, we intended to familiarize ourselves with the processes, constraints, and demands of coding.
Despite their close association, marrying illustration with music is challenging: music is temporal, reactive, and improvisational, while traditional artwork follows a drawn-out, process-to-payoff arc.
To prepare for my solution, I investigated existing ways to create dynamic visuals:
An incredibly popular and affordable VJ choice. Enables the rapid swapping of video clips and the application of preset effects. High fidelity, low potential for detail sculpting and improvisational content.
100% analog solution. High fidelity, abstract images created by mixing dyes and chemicals over projectors. Highly reactive and improvisational. Low potential for detail and sculpting, as hands and instruments block the image capture.
Particle-emitter software that, when cleverly rigged, enables artists to "paint" custom particle effects onto the canvas. High potential for individual expression, but requires immense amounts of preparation.
Once I confirmed that the use and control of multiple brushes was indeed possible within Processing, I drafted ideas based on what I knew could be accomplished from other code I had seen. I wanted to create a wide variety of textures the artist could use to recreate the feel of many different songs and sounds.
I created multiple sketches of potential variations on the concept. The first sketch features a GUI that lets users switch brush symmetry on and off and select brushes on screen. A synthesizer runs in the background, operating on the same unified color palette as the chosen brush. Other variations included a user-controlled procession of background images and videos, taking a cue from Resolume, allowing for more storytelling capabilities.
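The symmetry toggle can be illustrated with a small sketch. The class name `SymmetryBrush` and its methods are hypothetical stand-ins, assuming the simplest implementation: when symmetry is on, every brush point is mirrored across the canvas's vertical center line.

```java
// Hypothetical illustration of a brush-symmetry toggle.
// With symmetry off, one point is painted; with it on, the point
// is mirrored across the canvas's vertical axis.
public class SymmetryBrush {
    private final int canvasWidth;
    private boolean symmetry = false;

    public SymmetryBrush(int canvasWidth) {
        this.canvasWidth = canvasWidth;
    }

    public void setSymmetry(boolean on) {
        symmetry = on;
    }

    // Returns every point to paint for one input point.
    public int[][] pointsFor(int x, int y) {
        if (!symmetry) {
            return new int[][] { { x, y } };
        }
        return new int[][] { { x, y }, { canvasWidth - x, y } };
    }

    public static void main(String[] args) {
        SymmetryBrush brush = new SymmetryBrush(800);
        brush.setSymmetry(true);
        for (int[] p : brush.pointsFor(100, 300)) {
            System.out.println(p[0] + "," + p[1]); // original and mirrored point
        }
    }
}
```

In the actual sketch the GUI button would flip this flag, and every brush algorithm would paint each returned point, so the same drawing gesture produces symmetric strokes without the brushes themselves needing to know about symmetry.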
Let's collab, bro
I'm grateful to currently have artistic collaborators helping to bring this dream to fruition. Our goal is to enhance analog liquid light (after admitting to ourselves that nature renders visuals infinitely more efficiently than computers) with digital flourishes and control methods, some of which are explored in this case study. I've chosen to keep the most exciting innovations under wraps for now, out of respect for my partners.