A new search ‘vehicle’ for the Library of Congress’ unfathomably massive photo archive, this experiment uses the analogy of a road trip as a way of searching these collections. As you plot a route on the map, photos taken along that route are displayed. My hope is to enable serendipitous connections between photos from different times, collections, and populations that would otherwise never appear together within the archive. Featured on Glitch.
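The core mechanic, displaying only photos taken along a plotted route, can be sketched as a proximity filter: keep any geotagged photo that falls within some distance of the route's line segments. This is a minimal illustration, not the project's actual code; coordinates are treated as planar and the names and threshold are invented for the example.

```python
# Hypothetical sketch of route-based photo search: filter geotagged
# photos to those near a polyline route. Planar geometry for simplicity;
# all names and values here are illustrative, not from the project.

def dist_point_segment(p, a, b):
    """Distance from point p to line segment ab (all (x, y) tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def photos_along_route(photos, route, threshold):
    """photos: list of (name, (x, y)); route: list of (x, y) waypoints."""
    hits = []
    for name, p in photos:
        if any(dist_point_segment(p, route[i], route[i + 1]) <= threshold
               for i in range(len(route) - 1)):
            hits.append(name)
    return hits

route = [(0, 0), (10, 0)]
photos = [("barn", (5, 0.5)), ("bridge", (5, 4.0))]
print(photos_along_route(photos, route, threshold=1.0))  # → ['barn']
```

A real implementation over the Library's archive would use geodesic distances and a spatial index rather than a linear scan, but the filtering idea is the same.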
This project, made in collaboration with Itay Niv, was built for Data through Design’s 2019 exhibition “celebrating tangible and multimedia expressions of New York City’s Open Data.” The project reimagines routes through the city as tracks on a musical sequencer, with the city’s trees and urban elements (e.g. subway stops, wifi access points) as notes on these tracks. Using an interactive map displayed on a supersized touchscreen, audience members are immersed in a synesthetic experience of the once-familiar landscape.
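The mapping from city to sequencer can be sketched as quantization: each urban element's position along a route is snapped to one of a fixed number of sequencer steps. This is an illustrative sketch, not the exhibition's code; the distances, route length, and step count are invented for the example.

```python
# Hypothetical sketch of the route-to-sequencer mapping: points along a
# route (e.g. trees, at given distances in meters) are quantized onto
# the steps of a 16-step sequencer track. All values are illustrative.

def route_to_track(distances, route_length, steps=16):
    """Return a list of booleans: True where an element triggers a note."""
    track = [False] * steps
    for d in distances:
        # Snap each element's position to its nearest-preceding step.
        step = min(int(d / route_length * steps), steps - 1)
        track[step] = True
    return track

tree_distances = [5.0, 48.0, 51.0, 90.0]  # meters along a 100 m route
track = route_to_track(tree_distances, route_length=100.0)
print([i for i, on in enumerate(track) if on])  # → [0, 7, 8, 14]
```

Each category of urban element (trees, subway stops, wifi access points) would get its own track, so a single route yields several parallel sequencer lanes played together.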
This ongoing project explores new methods of visualization for a museum setting, begun during a Fellowship at Terreform One, a nonprofit architecture and urban design group based in the New Lab at the Brooklyn Navy Yard. I was particularly interested in breaking down the barrier between scientific research and science communication, and in giving the viewing public a sense of agency in the process of scientific discovery.
This project explores real-time interactive world-building in virtual reality. Users speak and objects appear around them. They ask kindly and mountains move. Basically, it turns them into the gods of their own terrible little worlds. Voice control allows virtual reality to feel a little closer to our reality, in which spoken words have meaning and merit responses. This experience allows us to consider what might happen when virtual reality advances enough to feel truly real: will we all succumb to the temptations of complete control and live our lives in VR?
This mask was 3D scanned using a technique called photogrammetry. Many (>150) photos are taken of the mask from different angles, and depth information is extracted from these photos using specialized software. The result is a cloud of several million points plotted in 3D space, from which a ‘mesh’ describing the mask’s surface can be created. The print seen here is a ‘texture map’: a 2D image which, when wrapped around the surface of the model, describes its color and brightness at every point.
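The texture-map idea can be shown in a few lines: each point on the mesh surface carries (u, v) coordinates in [0, 1] that index into the 2D image, and rendering looks up the color at those coordinates. This is a minimal nearest-neighbor sketch of the concept, not the photogrammetry software's implementation; the tiny 2x2 “image” is invented for the example.

```python
# Hypothetical sketch of texture-map sampling: a (u, v) coordinate on
# the mesh surface is looked up in a 2D image. Nearest-neighbor only;
# real renderers typically use bilinear or trilinear filtering.

def sample_texture(image, u, v):
    """Nearest-neighbor lookup of (u, v) in a 2D image.

    image: list of rows of RGB tuples; u, v in [0, 1], with v = 0 at
    the top row (a common texture convention, assumed here).
    """
    h, w = len(image), len(image[0])
    x = min(int(u * w), w - 1)  # clamp so u = 1.0 stays in bounds
    y = min(int(v * h), h - 1)
    return image[y][x]

# A 2x2 stand-in texture: white, red / green, blue.
texture = [
    [(255, 255, 255), (255, 0, 0)],
    [(0, 255, 0), (0, 0, 255)],
]
print(sample_texture(texture, 0.9, 0.9))  # → (0, 0, 255)
```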
While modern life is full of job, family, academic, and interpersonal responsibilities, “Take the Money and Run” is a game that allows anyone to live out their dreams of dropping out of the rat race and traveling the world (in a safe and much less expensive way).
This project (made in collaboration with Yifan Liu) was an attempt to create an interactive two-person version of the classic arcade game “Skee-Ball.” In updating this common game, we wanted to allow two players to play directly against one another, rather than just adjacent to one another.
Produced short film “Around the Corner,” shot in NYC and the surrounding area. Organized a 3-day shoot and helped manage expenses, equipment pickup and drop-off, transportation for actors and crew, and permitting with the NYC Department of Film and Television as well as the NYC Parks Department.
After building my CNC Router, I wanted to see if I could make three-dimensional relief maps. I set about downloading elevation data from the USGS National Elevation Dataset and writing a small script to convert the raw data into a format and scale that worked reasonably well with my 3D modeling software. After converting this data into 3D models, I used CAM software to create g-code tool paths for my CNC Router.
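The conversion step above, turning a grid of elevation samples into something a 3D modeling package can open, can be sketched as triangulating the grid into an ASCII STL heightmap. This is an illustrative sketch, not the project's actual script; the grid is synthetic and the scale factors are invented, where a real run would read USGS samples and tune the scales to the stock material.

```python
# Hypothetical sketch: triangulate a 2D grid of elevations into an
# ASCII STL heightmap that CAD/CAM software can import. The grid and
# scale factors are illustrative, not the project's real data.

def grid_to_stl(grid, xy_scale=1.0, z_scale=1.0):
    """Turn a 2D list of elevations into ASCII STL text."""
    rows, cols = len(grid), len(grid[0])

    def vertex(i, j):
        return (j * xy_scale, i * xy_scale, grid[i][j] * z_scale)

    facets = []
    for i in range(rows - 1):
        for j in range(cols - 1):
            a = vertex(i, j)
            b = vertex(i, j + 1)
            c = vertex(i + 1, j)
            d = vertex(i + 1, j + 1)
            facets.append((a, b, c))  # each grid cell becomes
            facets.append((b, d, c))  # two triangles
    lines = ["solid heightmap"]
    for tri in facets:
        # Normals left trivial; most CAD tools recompute them on import.
        lines.append("  facet normal 0 0 1")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append(f"      vertex {x:.3f} {y:.3f} {z:.3f}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid heightmap")
    return "\n".join(lines)

# Synthetic 3x3 elevation grid standing in for real USGS samples.
elevations = [[0, 1, 0], [1, 2, 1], [0, 1, 0]]
stl_text = grid_to_stl(elevations, xy_scale=10.0, z_scale=2.0)
print(stl_text.splitlines()[0])  # → solid heightmap
```

From here, CAM software can load the STL and generate roughing and finishing tool paths for the router.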