I created the concept, design, and implementation for The Dandy Warhols’ video “Motor City Steel”. With the help of editor Dan Scofield, I used green screen footage of the band in combination with OpenFrameworks, camera vision, and a number of photo and video sources to create an animated vision of the story of “Travis and Ricki” and their attempts at love. The video received over 15,000 views in its first week and more than 150,000 views in total, and marked the launch of the band’s 25th anniversary tour.
Technologies Used: OpenFrameworks, Camera Vision, Face Tracking
Soft Screen is a responsive architectural partition designed for the main lobby space of the offices of a media company in California. The project, which measures 120’ wide and 40’ tall, features over 30,000 LEDs hidden behind a fuzzy acoustical membrane. The overall effect is a seamless integration of the analog architecture of the lobby space and the digital platform of the company it represents. By standing on medallions distributed throughout the lobby, visitors trigger interactive features within the wall, making it their own experience for a few moments.
The Meditation Chambers – Arches, Skydome & Horizon – are a series of approximately 100 square foot spaces completed for a tech company in San Francisco.
They provide an intense immersive experience: a concentration of color, light and sound in spaces made specifically for workplace escape, both mental and physical. They feel larger than they appear, a byproduct of their unassuming exteriors, compressed entry sequences, and the visual interplay of light and sound within.
Each room is a built illustration of the expansive magic that punctuates daily life for those who manage to slow down and open themselves to quiet contemplation and subtle shifts of the environment.
Photos: Tom Harris
Technologies Used: Arduino, Addressable LEDs, MP3 Shield
How does a media company create separation between spaces? How can a wall both display video content and become invisible when turned off? How does a constantly changing company use a screen without it looking immediately dated? Screen Play is an immersive and interactive architectural partition designed for a tech office in California. Its purpose is multifold but ultimately quite simple: when a design can remove the frame from a screen, from the imagery it contains, the technology fades and only the video remains.
Screen Play proposes that by breaking the image into many vertical screens, what we see can be boundless, transparent, and relevant; it can be a partition, a video wall, and, with reflective materials, disappear altogether.
Photos: Tom Harris
Brookfield Place Luminaries is an iconic holiday light installation designed by the LAB at Rockwell Group. As Lead Interactive Developer on the project, I designed the software architecture, capacitive cube interaction, and the content for the 2016 and 2017 seasons. The installation has been widely praised in publications such as The New York Times, Architectural Digest, and Inhabitat.
Technologies Used: DMX, Processing
Based on an original design by Hecho, Inc., I collaborated with Dan Scofield and Tucker Viemeister to create two iconic LED walls for the Brooklyn nightclub Baby's All Right.
Technologies Used: Addressable LEDs, Processing
As Lead Interactive Developer, I architected and designed the technology platform for a mosaic "wall" of 86 Samsung devices for the Green Room at the 86th Academy Awards. The devices ran a custom Android app that synchronized their output and allowed unified content transitions across the full array.
Technologies Used: Android, Java, Processing
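The details of that synchronization are proprietary, but one common way to achieve unified transitions across many devices is to agree on a shared reference clock and schedule each transition at an absolute timestamp. The sketch below is a hypothetical Python illustration of that idea (the class name, offsets, and timestamps are all invented for the example), not the actual app code:

```python
import time


class SyncedScheduler:
    """Hypothetical sketch: fire content transitions at shared reference-clock
    timestamps, so devices with drifting local clocks switch together."""

    def __init__(self, clock_offset_s=0.0):
        # Offset of this device's local clock from the shared reference
        # clock (local minus reference), e.g. estimated from a time server.
        self.clock_offset_s = clock_offset_s

    def reference_now(self, local_now=None):
        """Convert local time to estimated reference time."""
        local_now = time.time() if local_now is None else local_now
        return local_now - self.clock_offset_s

    def should_transition(self, scheduled_ref_time, local_now=None):
        """True once the shared schedule time has been reached."""
        return self.reference_now(local_now) >= scheduled_ref_time


# Two devices whose local clocks disagree still switch at the same moment.
device_a = SyncedScheduler(clock_offset_s=+1.5)   # clock runs 1.5 s fast
device_b = SyncedScheduler(clock_offset_s=-0.25)  # clock runs 0.25 s slow

t0 = 1000.0  # transition scheduled at reference time 1000.0
assert not device_a.should_transition(t0, local_now=999.0 + 1.5)
assert not device_b.should_transition(t0, local_now=999.0 - 0.25)
assert device_a.should_transition(t0, local_now=1000.0 + 1.5)
assert device_b.should_transition(t0, local_now=1000.0 - 0.25)
```

Scheduling by absolute reference time, rather than broadcasting "go now" messages, keeps transitions aligned even when network delivery latency varies per device.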
I was the Lead Interactive Developer for the Rockwell Group technology team that implemented multiple galleries for the Hudson Yards Experience Center. This included software architecture, coordination of multiple large screen displays, inter-room communication, dynamic DMX LED lighting, and more.
Technologies Used: Node.js, DMX, HTML5, CSS
In 2012, I worked with Tucker Viemeister and JCDecaux on a winning technology bid for Los Angeles International Airport's Bradley Terminal. The work included ideation and brainstorming, as well as the formulation of a comprehensive document outlining the proposal.
The Docks is a site-specific installation created by the Rockwell Group LAB. It uses LED panels and Surface tablets to create an interactive "dock" setting, where users can "make" virtual fish and flowers. As the Lead Interactive Developer on The Docks, I planned the software architecture and wrote four custom applications: a tablet-based app for creating virtual fish and flowers, a geometrically mapped animation of the river, animated trees with changing seasons, and a real-time camera vision engine.
Technologies Used: Processing, Spacebrew, Java, Camera Vision
My first project with Rockwell Group LAB, the DigiTree uses custom hanging enclosures to turn a large ficus tree into an interactive social network. Via a local photo booth, or through Twitter hashtags, users can see their content appear on screens in the tree in real time. As an Interactive Developer, my roles on the project included planning the software architecture, building a custom real-time content server, and managing a third-party developer in the creation of a custom CMS.
Technologies Used: Java, Node.js, Spacebrew
Since 2012, I have worked on various occasions with the band Lucius to create reactive projections to accompany their music. The work has involved audio reactivity and projection mapping, and has appeared in music videos, as well as at live performances at New York’s Mercury Lounge and Bowery Ballroom.
Video: Don’t Just Sit There
Video: Turn It Around
Technologies Used: Processing, OpenFrameworks, MadMapper
While working at Rockwell Group, I helped to create a hand-wired LED array for their concrete tray design for Wallpaper: Handmade 2016. Driven by programmable LEDs and a Teensy microcontroller, the tray could display multiple patterns corresponding to predetermined glassware locations.
Technologies Used: Teensy, Addressable LED Strips
I worked with Arisohn and Murphy to create multiple prototypes for an artist’s installation in New York. Both prototypes involved site-specific movement and presence detection, one using a light-based sensor, and the other using camera vision. Both implementations then talked to DMX controllers to trigger a spotlight when presence was detected.
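The presence-to-spotlight pipeline these prototypes share can be sketched in a few lines. The version below is a simplified Python illustration assuming grayscale frame differencing for the camera-vision variant and a single DMX intensity channel for the spotlight; the thresholds, channel number, and function names are all invented for the example:

```python
def presence_from_frame_diff(prev, curr, pixel_threshold=30, count_threshold=50):
    """Naive camera-vision presence test: count pixels whose grayscale value
    changed by more than pixel_threshold between two frames, and report
    presence when enough pixels changed. Thresholds here are assumptions."""
    changed = sum(1 for p, c in zip(prev, curr) if abs(p - c) > pixel_threshold)
    return changed >= count_threshold


def spotlight_dmx(presence, channel=1, on_level=255):
    """Build a (channel, value) pair for a DMX dimmer: full intensity when
    presence is detected, blackout otherwise. DMX values run 0-255;
    the channel assignment is an assumption for this sketch."""
    return (channel, on_level if presence else 0)


# An empty scene leaves the spotlight dark; a large change lights it.
empty_scene = presence_from_frame_diff([0] * 100, [0] * 100)
occupied_scene = presence_from_frame_diff([0] * 100, [100] * 100)
assert spotlight_dmx(empty_scene) == (1, 0)
assert spotlight_dmx(occupied_scene) == (1, 255)
```

The light-sensor variant would simply replace `presence_from_frame_diff` with a threshold on the sensor reading; the DMX side stays the same.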
I created GridMusic as a real-time interactive sound installation. Using a virtual grid overlaid on a public space, the system interprets movement in each node of the grid. Each node is also associated with a given sine wave: when movement occurs, the volume of that wave is increased, and when the node is idle, it is decreased. The result is a sonic landscape that represents the movement throughout the space, and changes throughout the day.
Technologies Used: Java, Camera Vision, MIDI, OpenCV
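The core movement-to-volume behavior described above can be sketched compactly. This is a hypothetical Python reduction of the idea (the original was Java; the attack and decay rates and class name here are assumptions, not the installation's actual parameters):

```python
class GridNode:
    """One cell of the virtual grid: a sine wave whose volume rises
    while movement is detected in the cell and decays while it is idle."""

    def __init__(self, freq_hz, attack=0.2, decay=0.05):
        self.freq_hz = freq_hz   # the sine wave assigned to this node
        self.volume = 0.0        # current gain, clamped to [0.0, 1.0]
        self.attack = attack     # gain added per frame with movement (assumed)
        self.decay = decay       # gain removed per idle frame (assumed)

    def update(self, movement_detected):
        """Advance one frame of the camera-vision loop."""
        if movement_detected:
            self.volume = min(1.0, self.volume + self.attack)
        else:
            self.volume = max(0.0, self.volume - self.decay)


# A node swells while someone moves through its cell, then fades out.
node = GridNode(freq_hz=440.0)
for _ in range(10):
    node.update(True)
assert node.volume == 1.0          # saturates at full volume
node.update(False)
assert abs(node.volume - 0.95) < 1e-9  # begins to decay when idle
```

Summing every node's sine wave at its current volume yields the shifting sonic landscape: crowded regions of the grid swell, empty ones fall silent.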
Sine Beats is a live music performance and immersive experience. In it, I combine sound and light, using custom controllers to mix six pairs of closely tuned sine waves and their corresponding light sources. The volume of the two waves, and their proximity, creates an unmistakable "beating" effect that forms the backbone of the piece. As the volume and beating build, each controller also adjusts the strobing of a halogen lamp, to create a visual effect that mimics the generated audio in real time.
Technologies Used: Arduino, PureData
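The beating effect has a simple mathematical basis: summing two sines of nearby frequencies is identical to a carrier at their average frequency inside a loudness envelope that pulses at the difference frequency. The Python check below demonstrates the identity numerically; the 220 Hz / 223 Hz pair is an illustrative assumption, not the frequencies used in the piece:

```python
import math


def two_sine_mix(f1, f2, t):
    """The raw sum of two sine waves, as a pair of oscillators produces."""
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)


def beat_envelope(f1, f2, t):
    """The slow loudness envelope: by the sum-to-product identity,
    sin(a) + sin(b) = 2 * cos((a - b) / 2) * sin((a + b) / 2),
    the mix equals this envelope times a carrier at the mean frequency."""
    return 2 * math.cos(math.pi * (f1 - f2) * t)


# Example pair (an assumption): 220 Hz against 223 Hz beats 3 times a second.
f1, f2 = 220.0, 223.0
assert abs(f1 - f2) == 3.0  # beat rate in Hz

# Numeric check that mix == envelope * carrier at several instants.
for t in [0.0, 0.01, 0.1, 0.25]:
    carrier = math.sin(math.pi * (f1 + f2) * t)
    assert abs(two_sine_mix(f1, f2, t) - beat_envelope(f1, f2, t) * carrier) < 1e-9
```

That difference-frequency envelope is what each controller could also map onto a lamp's strobe rate, so the light pulses at the same rate the audience hears the beating.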
In the Headroom projects I explored the synthesis of two subjects' facial features in real time. The first (Mk. I) used digital cameras mounted on helmets, with a custom controller to adjust resolution in real time. The second (Mk. II) was a collaboration with Luis Violante, and explored the same principles, but with laser-cut, two-sided mirrors.
Technologies Used: OpenFrameworks, Arduino
My collaborator Dan Scofield and I conceived of an outdoor interactive soundscape that changed in real time based on movement within the space. Using multiple cameras for movement detection in a custom software platform, as well as a large-scale surround array routed through Max/MSP, we deployed the installation in a public garden as part of The Switched On Garden.
Technologies Used: Camera Vision, Processing, Max/MSP
Between 2006 and 2009 I was a Software Design Engineer for Dolby's Digital Cinema platform in the ShowStore group. Key responsibilities included adding support for GPIO, 3D Content, and Serial Automation.
Technologies Used: Java, C, PostgreSQL