With a macro camera lens and two machine-learning JavaScript libraries, I turned microbes into musicians - if only they were aware of their talent!
The project began with inspiration from an artist's "music", moved through a nostalgic flashback to Osmosis Jones and a fascination with the Microscopy Lab at RISD's Nature Lab, and ultimately arrived at the translation of microbial activity into music. Below are the microbiomes I observed and a video demonstrating the project.
I first read of Bartholomäus Traubeck's "Years" project, in which he translated a slice of a tree into a playable vinyl record. Using computer vision, he was able to "read" the tree's rings and map the data to musical notes - I was in awe.
I thought about where else computer vision could be used to produce music, but I didn't find an intersection that interested me until I re-watched Osmosis Jones with my friends (it's a movie about the human microbiome).
I felt energized by Chris Rock's voice and the movie's soundtrack.
With access to the RISD Nature Lab's Microscopy Lab, I began to collect specimens and research potential micro-organisms and their movement. Below is a sample of brine shrimp hatchlings observed under a high-power microscope.
Brine shrimp hatchlings moved too quickly and were too opaque, which made it hard for the object-recognition algorithm to distinguish them from the backdrop, so I had to seek something less active and less visually saturated.
Below is a paramecium from a water sample taken from the Providence Canal. It's just dark enough to stand out against the spotlight, and its movements were significantly slower. I went forward with recording videos of paramecium movement to feed into COCO-SSD, a JavaScript object-recognition model.
With the customizability of the COCO-SSD tracking algorithm, I could integrate the Tone.js library to write a program that plays sound based on an object's location.
By dividing the computer's vision into four quadrants, I could assign an instrument to each, controlling each instrument's volume by the microbe's location within its quadrant (a sketch of this mapping follows below). Click on each box to hear the sample assigned to it, or hear the whole compilation here.
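For readers curious how that mapping might look in code, here is a minimal sketch assuming the two libraries named above; the sample file names and decibel values are placeholders, not the project's actual assets.

```js
import * as Tone from 'tone';

// One looping sample per quadrant, all starting effectively silent.
const players = [0, 1, 2, 3].map(i =>
  new Tone.Player(`samples/quadrant${i}.mp3`).toDestination());
players.forEach(p => { p.loop = true; p.autostart = true; p.volume.value = -60; });

// COCO-SSD reports bbox as [x, y, width, height]; use the box centre
// to decide which quadrant of the video frame the microbe occupies.
function quadrantOf([x, y, w, h], videoW, videoH) {
  const cx = x + w / 2, cy = y + h / 2;
  return (cx < videoW / 2 ? 0 : 1) + (cy < videoH / 2 ? 0 : 2);
}

// Fade every quadrant toward silence, then fade up the quadrants
// that currently contain a detected microbe.
function updateVolumes(predictions, videoW, videoH) {
  players.forEach(p => p.volume.rampTo(-60, 0.5));
  for (const pred of predictions) {
    players[quadrantOf(pred.bbox, videoW, videoH)].volume.rampTo(-6, 0.5);
  }
}
```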
Looking ahead, there is potential for this to become a web application that puts a fun spin on biology education. What if the experience could show students how micro-organism species differ, simply through their phone cameras? What if sound artists could use their natural biome to create music or audio experiences? What if it could help biologists identify microbiomes in a whole new medium?
With sufficient time and resources, these concepts could be realized, though training the ML algorithm would be an arduous process.
IxD
UX
Discursive Design
Exhibition Design
Thesis complete – view the interactive prototype from the exhibition here: https://raceagainstai.github.io/
More documentation coming soon.
_______________________________________________________________________________
For my thesis project, I'm currently designing an interactive exhibition/experience in which participants view conceptual visualizations of the future of work through narratives. Through my academic exchange at the University of Arts Berlin and lectures from DFKI (the German Research Center for Artificial Intelligence), I researched AI ethics and legislation in the European Union.
Utilizing these resources and this location, I began to understand the regulatory process that the European Commission is undergoing to control artificial intelligence and work automation. In combination with research in America, I will produce a future-cast experience.
Below is my project plan. More documentation coming soon.
Clips are pieced together, like parts of a puzzle, to create one reflective narrative that mirrors the visitor's answers to the survey - details ranging from marital status to hopes and fears about automation.
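As a purely illustrative sketch of that puzzle-piece assembly (the question names, answers, and clip files below are invented, not the exhibition's actual content), the mapping could look something like this:

```js
// Hypothetical mapping: each survey question pairs every possible answer
// with a clip, and the visitor's answers pick one clip per question.
const clipMap = {
  maritalStatus:     { single: 'clips/solo.mp4',  married: 'clips/family.mp4' },
  automationOutlook: { hopeful: 'clips/hope.mp4', fearful: 'clips/fear.mp4' },
};

// Assemble the visitor's narrative in survey order.
function buildNarrative(answers) {
  return Object.keys(clipMap).map(q => clipMap[q][answers[q]]);
}

buildNarrative({ maritalStatus: 'married', automationOutlook: 'fearful' });
// -> ['clips/family.mp4', 'clips/fear.mp4']
```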
Take a peek at two sample clips below:
Below is the full project plan. I present a shorter version to community collaborators and when requesting exhibition/installation permits for public spaces.
IxD
Computer Vision
Machine Learning
Discursive Design
What if microbial activity could be mapped into music?
I will be designing a web application that allows people to observe microbial activity with just their smartphone camera and a plastic lens attachment. The libraries I'm currently using are COCO-SSD (object detection) and Tone.js (sound).
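As a rough sketch of how those two pieces could fit together in the browser (the element id and loop structure here are illustrative, not the app's actual code):

```js
import * as cocoSsd from '@tensorflow-models/coco-ssd';
import * as Tone from 'tone';

async function main() {
  // Feed the phone camera into a <video> element (id is illustrative).
  const video = document.getElementById('scope');
  video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
  await video.play();

  await Tone.start();                 // browsers require a user gesture before audio
  const model = await cocoSsd.load(); // pre-trained COCO-SSD detector

  async function loop() {
    // Each prediction: { bbox: [x, y, width, height], class, score }
    const predictions = await model.detect(video);
    // ...hand predictions to the sound-mapping code...
    requestAnimationFrame(loop);
  }
  loop();
}

document.body.addEventListener('click', main, { once: true });
```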
More documentation coming soon.
Proof of concept (midterm):
Using openFrameworks to demonstrate functionality.
Recording micro-organism activity in order to test ofxCv's blob-detection and contour-tracking features. The goal is to store up to 10 cells (micro-organisms) from the field of view in a vector, then attach a sound sample to each cell. You can see the basic structure of how cells are tracked and stored in the screenshot at right.
Contour-tracking of micro-organisms from a river-water culture (ofxCv)
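The midterm build itself is C++ (openFrameworks/ofxCv), but the cell-storage structure described above can be sketched language-agnostically; here it is in JavaScript for consistency with the rest of this write-up, with illustrative names throughout:

```js
import * as Tone from 'tone';

const MAX_CELLS = 10;
const cells = new Map(); // tracker label -> its assigned looping sample

// Called each frame with the persistent labels the contour tracker reports.
function onTrackedCells(labels) {
  for (const label of labels) {
    if (cells.has(label) || cells.size >= MAX_CELLS) continue;
    // A new cell entered the field of view: give it its own sound sample.
    const player = new Tone.Player(`samples/cell${cells.size}.mp3`).toDestination();
    player.loop = true;
    player.autostart = true;
    cells.set(label, player);
  }
  // Stop and drop cells that have left the frame.
  for (const [label, player] of cells) {
    if (!labels.includes(label)) {
      player.stop();
      cells.delete(label);
    }
  }
}
```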
Collaborating with the RISD Nature Lab for organism observation: observing baby and adult sea monkeys (brine shrimp)
Working with the RISD Nature Lab, I'm observing micro-organisms from various aquacultures for this project.