Sagar Mohite is a computational artist and engineer based in New York City. His work explores combining principles of design and the computational sciences to generate visualizations.

Posts from the Film / Photo / Audio Category

Oakwood Beach Returns to the Wild

At its peak, Hurricane Sandy, the largest Atlantic hurricane on record, spanned over 1,150 miles – roughly the driving distance between New York and Miami. As it began nearing the coastline, pouring down rain and rattling windows, schools closed, public transit shut down, and flood-zone residents evacuated their homes. The full force of the storm hit New York City on October 29, 2012, with winds gusting to 80 miles per hour.

Oakwood Beach, a low-lying hamlet on the south shore of Staten Island, was no match for the superstorm. By the time the winds and waters receded, leaving mud-filled homes in their wake, three residents were dead. Dozens more would be homeless for months.

During the year that followed, some residents chose to return and rebuild. But now, two years since the storm, the state is in the midst of implementing a buy-out program designed to convince all residents to leave their homes. Officials have closed on 276 buy-out sales in Oakwood Beach and two nearby towns, at a cost of $112 million; another 200 applications are pending.

In Oakwood Beach, once an oceanfront sanctuary for working class families, just a handful of stubborn holdouts remain. They live among empty lots and boarded windows; 47 neighborhood houses have already been demolished.

We visited Oakwood Beach around Sandy’s anniversary to talk to the remaining residents. They are surrounded: by nature, as animals and wetlands reclaim the land, and by government, as officials prepare for a world in which Oakwood Beach no longer exists. For now, the town sits in limbo.

Footage shot with a DJI Phantom 2 and a GoPro Hero4 with a gimbal attachment.

Time in the Sky

A responsive documentary piece showcasing a diverse set of people, all experiencing the High Line on one shared day.

The main frame opens with a split screen: the top half is a timelapse of the sun rising and setting over the bay, while the lower half shows feet walking across the High Line. Notice the variety of shoes and the snippets of conversation as the sky transforms. This footage loops until a viewer lingers; a sensor then cues a portrait photograph of a visitor and an audio clip, drawn from a diverse set of High Line visitors revealing their favorite time of day.

The following is a sneak peek at what resulted from over 9 hours of footage, hundreds of photographs, several lines of code, and over 40 personal interviews with people relishing the park in the sky.

This experimental film experience was designed in Fall 2013 at ITP in New York City, together with my super teammates, Rosalie and Amanda.

Made with OpenFrameworks and Arduino. Read more on the development process.

This was the very first storyboard we made for the project. These initial sketches served as a basis for our work right till the end.

All credits to Rosalie for the slick sketches!

The type of response we needed from the film was largely intrinsic rather than an explicit “interaction.” It was supposed to behave like a mirror: viewers would not have to perform a specific action or gesture to make the screen respond. We chose this to keep the film experience organic, rather than turning it into an interactive app or game.

Proximity Sensors

Ultrasonic ping sensors would serve our purpose gracefully, so we decided to use three of them, one placed below each of the small videos playing on the screen. The responsiveness we settled on was subtle: as a viewer approaches one area of the screen, the timelapse slows down until it comes to a halt, and the area in front of them is replaced by a randomly selected shot of a particular time on the High Line, accompanied by a random clip from our pool of one-line audio interviews.
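The slowdown behavior can be sketched as a simple distance-to-speed mapping. This is a minimal, hypothetical sketch rather than the project's actual OpenFrameworks code; the function names and the near/far thresholds are assumptions for illustration.

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical sketch: map a viewer's distance (cm) to a timelapse
// playback speed. Far away -> full speed (1.0); as the viewer
// approaches, the footage slows linearly until it halts at nearDist.
float playbackSpeed(float distanceCm, float nearDist = 50.0f, float farDist = 300.0f) {
    float t = (distanceCm - nearDist) / (farDist - nearDist);
    return std::clamp(t, 0.0f, 1.0f); // 0 = halted, 1 = normal speed
}

// A halted timelapse cues the swap to a random shot and audio clip.
bool shouldSwapToPortrait(float distanceCm) {
    return playbackSpeed(distanceCm) == 0.0f;
}
```

In the real piece, the speed value would drive the video player's rate each frame, so the transition feels continuous rather than switch-like.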

Basic Setup

Three ultrasonic ping sensors connected to an Arduino; OpenFrameworks reading the sensor values from the Arduino and controlling the video and audio playback for the 5 HD videos and over 45 audio clips; a large 40+” 1080p display for the output.
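On the OpenFrameworks side, the readings arrive over serial. A minimal sketch of the parsing step, assuming the Arduino prints the three distances as one comma-separated line per update (the format and function name are hypothetical, not the project's actual code):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Hypothetical sketch: parse one serial line from the Arduino into the
// three ping readings (cm). Assumes the Arduino sketch prints them as
// comma-separated values, e.g. "142,87,301".
std::vector<int> parsePingLine(const std::string& line) {
    std::vector<int> readings;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ',')) {
        readings.push_back(std::stoi(field)); // one distance per field
    }
    return readings;
}
```

Each parsed value would then drive the playback state of the video directly above its sensor.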


The ultrasonic pings used to interfere with each other because of how close together they were placed, and this added an extra computational cost since we chose to handle the filtering on the computer. We decided that a bigger screen with switches instead of proximity sensors would be a better option, because computational power was a scarce resource: 4-5 high-definition videos had to be switched and modified in real time based on the sensor values, without dropping a frame.
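One common way to soften crosstalk between closely spaced ultrasonic sensors in software is a small median filter per sensor, so a single spurious spike from a neighboring ping is rejected. This is one possible approach, sketched hypothetically; the source doesn't specify which filtering the project actually used.

```cpp
#include <algorithm>
#include <deque>
#include <vector>

// Hypothetical sketch: keep the last few readings per sensor and
// report the median, so an isolated crosstalk spike is discarded.
class MedianFilter {
public:
    explicit MedianFilter(size_t window = 5) : window_(window) {}

    int update(int reading) {
        history_.push_back(reading);
        if (history_.size() > window_) history_.pop_front();
        std::vector<int> sorted(history_.begin(), history_.end());
        std::sort(sorted.begin(), sorted.end());
        return sorted[sorted.size() / 2]; // median of recent readings
    }

private:
    size_t window_;
    std::deque<int> history_;
};
```

The trade-off is latency: a larger window rejects more noise but makes the response feel sluggish, which matters when the filtered value is driving video playback every frame.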

It’s experimental. Erik Satie’s Gymnopédie No. 1 and Gnossiennes blended with some pennies, nickels, dimes and quarters. The idea was to express the ambivalence people have towards ‘money’. It was great working with Karam Byun, especially the fact that he recorded the sounds in the library!
