I created the concept, design, and implementation for The Dandy Warhols’ video “Motor City Steel”. With the help of editor Dan Scofield, I combined green-screen footage of the band with OpenFrameworks, camera vision, and a number of photo and video sources to create an animated vision of the story of “Travis and Ricki” and their attempts at love. The video received over 15,000 views in its first week and more than 150,000 views in total, and launched the band’s 25th anniversary tour.
Technologies Used: OpenFrameworks, Camera Vision, Face Tracking
I worked with Office Of Things on the design and development of a lobby renovation at YouTube's headquarters in San Bruno, CA. My work included software for video playback on three separate segments of a large-scale LED wall, and an interactive feature utilizing five cameras throughout the space. I also created a software management suite to dynamically display and schedule videos in real time, both on this installation and on the earlier Screen Play project in a separate building on the same campus.
I worked with Arisohn and Murphy to create multiple prototypes for an artist’s installation in New York. Both prototypes involved site-specific movement and presence detection, one using a light-based sensor and the other using camera vision. Both implementations then drove DMX controllers to trigger a spotlight when presence was detected.
After Office Of Things was tasked with representing "YouTube as a material" in a multi-story installation, I worked with the team on the design and development of the project that became Screen Play. Using LED panels, real-time video mapping, virtual filters, and dichroic glass, we created a wall-based installation that deconstructs YouTube content and represents it as a novel abstraction.
Photos: Tom Harris
After successfully implementing a sound and light "meditation" at YouTube in Sunnyvale, Office Of Things was enlisted to design a second generation of the unique spaces. I worked with their team to streamline the software and implement multiple new LED patterns for the ~15-minute meditations.
Photos: Tom Harris
Technologies Used: Arduino, Addressable LEDs, MP3 Shield
While working at Rockwell Group, I helped create a hand-wired LED array for their concrete tray design for Wallpaper: Handmade 2016. Using programmable LEDs and a Teensy microcontroller, the tray could display multiple patterns corresponding to predetermined glassware locations.
Technologies Used: Teensy, Addressable LED Strips
Brookfield Place Luminaries is an iconic holiday light installation designed by the LAB at Rockwell Group. As Lead Interactive Developer on the project, I designed the software architecture, the capacitive cube interaction, and the content for the 2016 and 2017 seasons. The installation has been praised in publications such as The New York Times, Architectural Digest, and Inhabitat.
Technologies Used: DMX, Processing
Based on an original design from Hecho, Inc., I collaborated with Dan Scofield and Tucker Viemeister to create two iconic LED walls for the Brooklyn nightclub Baby's All Right.
Technologies Used: Addressable LEDs, Processing
As Lead Interactive Developer, I architected and designed the technology platform for a mosaic "wall" of 86 Samsung devices in the Green Room at the 86th Academy Awards. The devices ran a custom Android app that synchronized their output and allowed for unified content transitions across devices.
Technologies Used: Android, Java, Processing
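One way the cross-device synchronization described above can work is clock-driven playback: if every device shares a clock and the same playlist, each one computes the current clip and offset locally, so transitions line up without a frame-by-frame master. The sketch below is illustrative only, not the production Android code; the function name and clip durations are assumptions.

```python
# Hypothetical sketch of clock-driven content sync: each device derives its
# playback position from a shared epoch, so all 86 transition together.
def playback_position(playlist_durations, epoch, now):
    """Return (clip_index, seconds_into_clip) for a looping playlist."""
    total = sum(playlist_durations)
    t = (now - epoch) % total          # position within the loop
    for i, d in enumerate(playlist_durations):
        if t < d:
            return i, t
        t -= d

durations = [30.0, 45.0, 15.0]         # three clips, seconds each (illustrative)
# 100 s after the shared epoch: 100 % 90 = 10 s into clip 0.
print(playback_position(durations, 0.0, 100.0))  # (0, 10.0)
```

Because each device does this computation independently against a common clock, no device-to-device messaging is needed during steady-state playback.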
I was the Lead Interactive Developer for the Rockwell Group technology team that implemented multiple galleries for the Hudson Yards Experience Center. This included software architecture, coordination of multiple large screen displays, inter-room communication, dynamic DMX LED lighting, and more.
Technologies Used: Node.js, DMX, HTML5, CSS
The Docks is a site-specific installation created by the Rockwell Group LAB. It uses LED panels and Surface tablets to create an interactive "dock" setting where users can "make" virtual fish and flowers. As the Lead Interactive Developer on The Docks, I planned the software architecture and wrote four custom applications: a tablet-based app for creating virtual fish and flowers, a geometrically mapped animation of the river, animated trees with changing seasons, and a real-time camera vision engine.
Technologies Used: Processing, Spacebrew, Java, Camera Vision
My collaborator Dan Scofield and I conceived of an outdoor interactive soundscape that changed in real time based on movement within the space. Using multiple cameras for movement detection in a custom software platform, as well as a large-scale surround array routed through Max/MSP, we deployed the installation in a public garden as part of The Switched On Garden.
Technologies Used: Camera Vision, Processing, Max/MSP
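A common way to drive a soundscape like this from cameras is frame differencing: compare each frame to the previous one and treat the fraction of changed pixels as a movement amount. The sketch below is an illustration of that general technique, not the project's actual Processing code; the frames here are plain grayscale pixel lists rather than real camera input.

```python
# Illustrative frame-differencing motion detector of the kind a camera-based
# soundscape could use to scale its audio in real time.
def motion_amount(prev_frame, frame, threshold=16):
    """Fraction of pixels whose brightness changed by more than `threshold`."""
    changed = sum(
        1 for a, b in zip(prev_frame, frame) if abs(a - b) > threshold
    )
    return changed / len(frame)

still = [100] * 64                   # 8x8 frame, no movement
moved = [100] * 32 + [180] * 32      # half the frame changed
print(motion_amount(still, moved))   # 0.5
```

The resulting value per camera region can then be mapped to gain or effect parameters in the audio engine.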
In the Headroom projects I explored the synthesis of two subjects' facial features in real time. The first (Mk. I) used digital cameras mounted on helmets, with a custom controller to adjust resolution in real time. The second (Mk. II), a collaboration with Luis Violante, explored the same principles but with laser-cut, two-sided mirrors.
Technologies Used: OpenFrameworks, Arduino
I created GridMusic as a real-time interactive sound installation. The system overlays a virtual grid on a public space and interprets movement in each node of the grid. Each node is also associated with a sine wave: when movement occurs, the volume of that wave increases, and when the node is idle, it decreases. The result is a sonic landscape that represents the movement throughout the space and changes throughout the day.
Technologies Used: Java, Camera Vision, MIDI, OpenCV
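The per-node behavior described above can be sketched as a simple attack/decay rule on each node's gain. This is a minimal illustration under assumed parameters (motion normalized to [0, 1], arbitrary attack and decay rates), not the installation's actual Java code.

```python
# Hypothetical sketch of a GridMusic node: movement pushes the gain of the
# node's sine wave up; idleness lets it fall back toward silence.
class GridNode:
    def __init__(self, freq_hz, attack=0.2, decay=0.05):
        self.freq_hz = freq_hz   # sine wave assigned to this node (Hz)
        self.volume = 0.0        # current gain, clamped to 0..1
        self.attack = attack
        self.decay = decay

    def update(self, motion):
        """Raise volume in proportion to motion; decay it when idle."""
        if motion > 0:
            self.volume = min(1.0, self.volume + self.attack * motion)
        else:
            self.volume = max(0.0, self.volume - self.decay)
        return self.volume

# A 3x3 grid of nodes, each mapped to its own illustrative frequency.
grid = [GridNode(220.0 + 55.0 * i) for i in range(9)]
grid[4].update(1.0)   # movement detected in the center cell: gain rises
grid[4].update(0.0)   # cell now idle: gain decays
```

Summing each node's sine wave at its current gain yields the shifting sonic landscape described above.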
Sine Beats is a live music performance and immersive experience. In it, I combine sound and light, using custom controllers to mix six pairs of closely tuned sine waves and their corresponding light sources. The volume of the two waves in each pair, and the proximity of their frequencies, creates an unmistakable "beating" effect that forms the backbone of the piece. As the volume and beating build, each controller also adjusts the strobing of a halogen lamp, creating a visual effect that mimics the generated audio in real time.
Technologies Used: Arduino, PureData
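The beating effect at the heart of the piece follows from basic trigonometry: summing two sines at f1 and f2 gives 2·cos(π(f1−f2)t)·sin(π(f1+f2)t), an amplitude envelope pulsing at |f1 − f2| Hz. The sketch below illustrates that relationship; the frequencies are examples, not the actual tuning of the piece.

```python
import math

# Two closely tuned sine waves produce audible "beats" at the difference
# of their frequencies.
def beat_frequency(f1, f2):
    """Perceived beats per second between two close sine waves (Hz)."""
    return abs(f1 - f2)

def pair_sample(f1, f2, t):
    """Instantaneous sum of the two sine waves at time t (seconds)."""
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

# Two waves 3 Hz apart beat three times per second.
print(beat_frequency(220.0, 223.0))  # 3.0
```

Widening or narrowing the gap between the two frequencies on a controller therefore speeds up or slows down the beating, which is what the halogen strobing mirrors.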
Since 2012, I have worked on various occasions with the band Lucius to create reactive projections to accompany their music. The work has involved audio reactivity and projection mapping, and has appeared in music videos, as well as at live performances at New York’s Mercury Lounge and Bowery Ballroom.
Video: Don’t Just Sit There
Video: Turn It Around
Technologies Used: Processing, OpenFrameworks, MadMapper
My first project with Rockwell Group LAB, the DigiTree uses custom hanging enclosures to turn a large ficus tree into an interactive social network. Via a local photo booth, or through Twitter hashtags, users can see their content appear on screens in the tree in real time. As an Interactive Developer, my roles on the project included planning the software architecture, building a custom real-time content server, and managing a third-party developer in the creation of a custom CMS.
Technologies Used: Java, Node.js, Spacebrew
In 2012, I worked with Tucker Viemeister and JCDecaux on a winning technology bid for Los Angeles International Airport's Bradley Terminal. The work included ideation and brainstorming, as well as the formulation of a comprehensive document outlining the proposal.
Between 2006 and 2009 I was a Software Design Engineer for Dolby's Digital Cinema platform in the ShowStore group. Key responsibilities included adding support for GPIO, 3D Content, and Serial Automation.
Technologies Used: Java, C, PostgreSQL