Info

Sagar Mohite is a computational artist and engineer based in New York City. His work explores the combination of design principles and the computational sciences to generate visualizations.


Colorpress / Data Portraits of Text

Colorpress is an online (link to be posted soon) and printed collection of grammatical portraits of various books, short stories, poems and other textual works.

These generative data portraits are pixelated representations of all the words in a book, color-coded by various grammar rules. We also encode occurrences of various characters and special phrases in each book.

We started out with the aim of creating a portrait of an entire book, to see what it would look like if each word were replaced by a pixel. The resulting images look like fingerprints of the original texts.
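The core mapping can be sketched in a few lines of JavaScript. This is a toy illustration, not the actual Colorpress code: the miniature part-of-speech lookup stands in for a real tagger such as RiTa.js, and the palette is entirely made up.

```javascript
// Hypothetical color palette: one color per grammar category.
const PALETTE = {
  noun: "#e63946",
  verb: "#457b9d",
  adjective: "#f4a261",
  other: "#cccccc",
};

// Miniature stand-in tagger -- a real build would call something like RiTa's
// part-of-speech tagging instead of this hand-rolled lookup.
const posTag = (word) => {
  const nouns = new Set(["wizard", "stone", "owl"]);
  const verbs = new Set(["ran", "flew", "said"]);
  const adjectives = new Set(["strange", "grey"]);
  const w = word.toLowerCase();
  if (nouns.has(w)) return "noun";
  if (verbs.has(w)) return "verb";
  if (adjectives.has(w)) return "adjective";
  return "other";
};

// Turn a text into a flat list of pixel colors, one pixel per word.
function colorpress(text) {
  return text
    .split(/\s+/)
    .filter(Boolean)
    .map((word) => PALETTE[posTag(word)]);
}

console.log(colorpress("The strange owl flew"));
// ["#cccccc", "#f4a261", "#e63946", "#457b9d"]
```

Each color in the output array becomes one pixel; drawing the array row by row onto a canvas produces the fingerprint-like image.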

J. K. Rowling’s Harry Potter and the Philosopher’s Stone is the first printed Colorpress book. This book was created using the Canvas API and RiTa.js.

Request a portrait in print here.

[gview file="http://www.seaflection.com/wp-content/uploads/2014/12/HP1_colorpress_compressed.pdf"]


Clara and Sagar

Camouflage – a dystopic self portrait created from point cloud data

Surreal, Science, Fiction, Dystopia, Monotonicity, Multiplicity, Trigonometry, Machines, Authority, Trap, Illusion and Camouflage were all the words I had in mind before making this portrait. This abstract self-portrait grew out of those themes.

How it works

Given any scene, the background color is a cosine function of the pixel data of my photo feed in the x-direction, while it follows the tangent curve in the y-direction. The blinking and other effects occur when the tangent function approaches infinity (NaN). There is no post-processing in the video other than cutting and arranging scenes in Premiere Pro.
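A minimal sketch of that color mapping, in JavaScript for readability (the original runs in C++/openFrameworks, and the 0.01 scale factors here are made-up parameters):

```javascript
// x, y: pixel coordinates; t: a time/brightness phase term (assumed).
function backgroundColor(x, y, t) {
  // Cosine in the x-direction: smooth, always inside [1, 255].
  const r = Math.round(Math.cos(x * 0.01 + t) * 127 + 128);
  // Tangent in the y-direction: blows up near its asymptotes, so the value
  // must be clamped to the displayable range -- that clamp is the "blink".
  const raw = Math.tan(y * 0.01 + t) * 127 + 128;
  const g = Math.round(Math.min(255, Math.max(0, raw)));
  return [r, g, 0];
}

console.log(backgroundColor(0, 0, 0)); // [255, 128, 0]
```

Near an asymptote (e.g. y * 0.01 close to π/2) the tangent term swings to huge values and the clamp snaps the channel to 0 or 255, which is what reads as blinking in the video.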

Written in C++ using OpenFrameworks and Kinect. Sounds from freesound.org.

+ high-res version

The aesthetics of responsive and contextual search

Rethinking Responsive

A major goal of this experiment was to rethink the meaning of the word responsive when it comes to user interfaces and experiences. A casual mention of the term today usually comes attached to a frame of reference: "devices" and "screens". We gauge responsiveness with reference to device and screen sizes. The apparent diversity of today's devices and products is what supports this way of designing and measuring responsiveness.

If, however, we step back and look at the scene from a macro perspective, it becomes apparent that this system breaks down if screens and devices are taken out of the picture. Considering the rate at which we are technologizing ourselves, a future without traditional displays and screens is one of many plausible futures. What then happens to responsive? Can the way a machine interacts with us still be responsive?

I am more interested in the way a machine interacts with us than in the antithesis. This is because I tend to respect the way we interact with machines (with our limited yet not-fully-understood intellect), and I believe it is they who need to learn, not us.

Responsiveness should thus be considered with reference to people and context instead of devices and screens. Because people are here to stay while devices and screens aren’t.

The Funnel

The funnel is a conceptual solution that I will explore in this rant: a search system that exists in a quantified-self scenario.

The funnel is a middleware concept that takes in the following inputs from a search client:
1. Monitors (digital self + environment)
◾ Personal: calendar, call log, SMS threads, emails.
◾ Public/Community: news, traffic (Waze), etc.

2. Trackers (physical self + environment)
◾ Personal: activity trackers, health trackers, GSRs, EEGs, GPS, etc.
◾ Public/Community: weather, temperature, humidity, air quality.

3. Raw data from search.

It processes (funnels) this information to produce the following outputs:
◾ Context-sensitive data results.
◾ Context-sensitive instructions for the requester to compute the interface.
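The shape of the Funnel can be sketched as a pure function over those three inputs. Everything below is hypothetical: the field names, the toy relevance scoring, and the interface hints are my own stand-ins, not a specification.

```javascript
// Funnel as middleware: monitors + trackers + raw search results in,
// context-ranked results + interface instructions out.
function funnel({ monitors, trackers, rawResults }) {
  // Flatten all monitor/tracker readings into lowercase context signals.
  const signals = [...Object.values(monitors), ...Object.values(trackers)]
    .flat()
    .map((s) => String(s).toLowerCase());

  // Toy scoring: count how many context signals mention one of a result's tags.
  const scored = rawResults
    .map((r) => ({
      ...r,
      score: signals.filter((s) => r.tags.some((t) => s.includes(t))).length,
    }))
    .sort((a, b) => b.score - a.score);

  return {
    results: scored, // context-sensitive data results
    uiHints: { layout: scored.length > 3 ? "list" : "cards" }, // instructions for the requester
  };
}

const out = funnel({
  monitors: { personal: ["Lunch at Noodle Bar"], public: [] },
  trackers: { personal: ["low energy"], public: ["raining"] },
  rawResults: [
    { name: "Sushi place", tags: ["sushi"] },
    { name: "Noodle delivery", tags: ["noodle", "delivery"] },
  ],
});
console.log(out.results[0].name); // "Noodle delivery"
```

The point of the sketch is the signature, not the scoring: the client hands over raw results plus pipes to its monitors and trackers, and gets back both data and rendering instructions.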

Why funnel?
The choice of the term "funnel" over the more popular "filter" was a deliberate one.

The idea is to seek relevance instead of eliminating irrelevance.

The subtle difference is that eliminating or filtering things out may not always leave behind what was originally sought.

Use Case 1.

Let's get back to screens for some sample illustrations of how Funnel might work in screen-based search queries.

This example is set in a world where browser-based UIs have taken prominence, and at a time like 2014 when the only common sensor available to Funnel is the location sensor. The following images sum up the results from Bing for the query "nearest atm" with and without Funnel.

BING WITHOUT FUNNEL (Actual results)


BING WITH FUNNEL

Discover Redesigned

Notice the account information? (It seems Tom Riddle has good credit.)
Q: Where did it come from?
A: Pipes.

Concept of Pipes

A pipe channels the output from one command to another. Symbolically denoted as “|” in many programming environments.

An extrapolation of this concept, applied to the way native apps work, can help simplify inter-app communication. We start by assuming that an app is just a teensy bit sophisticated command that performs a specific task. Pipes essentially bridge Funnel to these apps. An open pipe channels the output from one app to another (in this case, Funnel). Pipes are opened when the contextualizing engine needs information from them to personalize the results.
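If an app is just a sophisticated command, a pipe is just function composition. A minimal sketch, with hypothetical names throughout (the "Mint-like" app and its balance field are invented for illustration):

```javascript
// A pipe channels each app's output into the next, shell-style: a | b | c.
const pipe = (...apps) => (input) => apps.reduce((data, app) => app(data), input);

// Toy apps: a "Mint-like" balance reader, then a Funnel-like contextualizer.
const mint = (query) => ({ query, balance: 4200 });
const contextualize = ({ query, balance }) =>
  `${query} (personalized: balance $${balance})`;

// Opening a pipe from Mint to Funnel is then just composing the two.
const search = pipe(mint, contextualize);
console.log(search("nearest atm"));
// "nearest atm (personalized: balance $4200)"
```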

In the above case, a pipe was opened from Mint to Funnel.

Where does Funnel live?

Funnel lives in the encrypted personal cloud of a user. It is owned and operated entirely by the user, eliminating the uninformed use of sensitive personal data.

Funnel is a search-engine agnostic data cruncher that keeps context in mind. Funnel is not a search engine.

Use Case 2.

I search for "dinner suggestions" with Funnel enabled. Funnel is assumed to know what I like, what I ate for lunch today, what my budget is, whether I'm too tired to cook, and so on. It knows this by opening a couple of pipes to relevant sensors and apps.

Context Redesigned

Use Case 3.

I search for my "car keys". Funnel opens pipes to my "tracked objects" inventory, which enables it to search for physical things.

Things Redesigned

So what is Funnel?

It’s a data-crunching, fact-based-guesstimating middleware engine that provides users with deeper contextual results for search queries.

Lyric videos are trending these days. I like watching them more than their music-video counterparts because they often display the exacting grace of good typography coupled with kinetic effects. The final project for CEI is going to be a piece of generative animation that creates a lyric video for a given song.

I began by illustrating Yann Tiersen’s Amelie title score [Repository].

The idea is to pick punch words from every line and animate them with kinetic typography and other motion shapes that are beat- and tempo-sensitive.