Micro-challenge I
Last updated
Our project explores the agency of food and its role in human collaboration. We have created an interactive workspace where sourdough, equipped with sensors, determines who may work with it based on a hidden algorithm. Two participants approach the table, and the sourdough evaluates their compatibility using pH, humidity, and air quality sensors. If "chosen", they are guided to a central workspace where they knead, shape, and prepare the dough for fermentation together, symbolizing their shared collaboration with a living, but non-human, ingredient. This project challenges the perception of food as passive matter. Instead, it frames food as an active participant in decision-making and creation, driving an experience without human agency.
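The selection logic is intentionally opaque to participants, so the exact algorithm is not documented here. As a rough illustration of the idea, the sketch below (in Python, with invented weights, bands, and a threshold that are our assumptions, not the real criteria) folds the three sensor readings into a single "compatibility" score:

```python
# Hypothetical sketch of the hidden "compatibility" check: combine pH,
# humidity, and air-quality readings into one score and compare it to a
# threshold. All weights, ranges, and the threshold are invented for
# illustration -- the real selection logic is deliberately kept opaque.

def normalize(value, low, high):
    """Clamp a raw sensor reading into the 0..1 range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def compatibility_score(ph, humidity_pct, air_quality):
    # Sourdough thrives around pH 3.5-5.0; score distance from that band.
    ph_fit = 1.0 - normalize(abs(ph - 4.25), 0.0, 3.0)
    # Moderate-to-high ambient humidity keeps the dough workable.
    hum_fit = normalize(humidity_pct, 30.0, 80.0)
    # Lower air-quality readings (cleaner air) score higher here.
    air_fit = 1.0 - normalize(air_quality, 200.0, 1000.0)
    return 0.4 * ph_fit + 0.35 * hum_fit + 0.25 * air_fit

def is_chosen(ph, humidity_pct, air_quality, threshold=0.6):
    """True if the dough 'accepts' the participants for this reading."""
    return compatibility_score(ph, humidity_pct, air_quality) >= threshold
```

A weighted sum like this is only one of many ways the dough's "decision" could be computed; part of the point of the piece is that participants cannot reverse-engineer it from the outside.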
Mohit Chopra: Concept development, research, storytelling, narrative integration, and visualization.
Kevin Enriquez: Technical development, fabrication, coding, and visualization.
Hardware & Components:
pH sensors
Humidity sensors (air and dough)
Arduino microcontroller
Overhead projector
Projector (for TouchDesigner visuals)
3D-printed components
Laser-cut wooden/acrylic parts
Software:
Arduino IDE (for microcontroller programming)
TouchDesigner (for real-time visualization)
Fusion 360 and Rhino 8 (for 3D modeling)
Fabrication Tools:
3D printer
Laser cutter
Manual fabrication tools
Contextual Concept: Food is the "universal language." In this "human-centric" world, where the human decides what it eats and where it eats, what if we were to flip the table, literally? What if, instead of the human deciding what food it might want to eat, the food were to decide whether it deems the human compatible with it?
Concept Integration: We aligned our research interests, food design and storytelling, to create a shared vision: a dining experience where the non-human interface, i.e. food, decides which humans may share and start their stories through the sensory abilities of a culinary conversation starter.
Brainstorming & Selection: We wanted to take a deep dive into human-nonhuman interaction and how humans react and interact when they are not the ones in charge. The idea explores the possibility of starting a conversation through non-human agencies, like food, which could turn a difficult or boring situation into a dynamic "compatibility" test, or even a fun simulation of ifs, maybes, and "fuck it".
Food was always the focal point for communicating something. We were clear on the message we wanted to convey to viewers of the project, but we weren't certain how to transform this abstract concept into a working prototype. We explored several routes via sketching, clustering, and mind mapping before settling on food as an active agent in collaboration.
Iteration Based on Feedback: Given how abstract the core idea was, we wanted to create a project that not only serves as a "talking point" but becomes the start of future conversations that might not have existed otherwise. After discussions with peers and advisors, we refined our framing from "elevating" a dining experience to sharing a "culinary journey", so that the idea may be understood, and taken up, by a variety of people, not just us.
Developing Functionality: We mapped out the user journey, using Rhino 8 and Fusion 360 with 3D printing to give our vision physical life, while TouchDesigner was used to create visuals that guide users through the selection and their further actions. While we mapped out the physical system, the technical side was driven by the Arduino and its sensors (humidity and temperature), so that the human selection could be made on the food's terms. To summarize the functionality:
Digital Fabrication: The structure was modelled in Fusion 360/Rhino 8, ensuring precise measurements for 3D printing and laser cutting.
Electronics & Coding: We researched circuits, developed a wiring diagram, and coded the Arduino to process sensor data.
Visual Narrative Design: Using TouchDesigner, we designed real-time projections that visually responded to sensor inputs.
Physical Assembly: 3D-printed and laser-cut parts were assembled, integrating the sensors, lighting, and microcontroller.
Refinement & Testing: We calibrated the sensors, adjusted the visuals, and iterated based on test runs.
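On the electronics side, the Arduino simply streams its sensor readings over serial, and the visualization side parses each line before reacting. The line format below is our own invention for this sketch (the actual sketch's output format may differ); in TouchDesigner this parsing would live in a Script DAT, but the function is plain Python:

```python
# Hypothetical host-side parser for the Arduino's serial stream. We assume
# (this format is illustrative, not documented above) that the sketch prints
# one comma-separated line per reading, e.g.:
#     ph:4.10,hum:62.0,air:350

def parse_reading(line):
    """Turn 'ph:4.10,hum:62.0,air:350' into a dict of float readings."""
    reading = {}
    for field in line.strip().split(","):
        key, _, raw = field.partition(":")
        reading[key.strip()] = float(raw)
    return reading
```

Keeping the wire format this simple made it easy to calibrate the sensors: we could watch the raw stream in the Arduino IDE's serial monitor and in TouchDesigner at the same time.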
System Diagram & Assembly Sketches: We created a flowchart detailing sensor inputs and activation triggers, along with sketches of the table. This helped us refine how the system responded to participants, ensuring seamless interaction between the sensors and the visual elements. The process also revealed potential issues in real-time data processing, allowing us to adjust the design before fabrication.
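The flowchart's activation triggers can be summarized as a small state machine: the table idles until participants are present, samples the sensors, and either invites the pair to the central workspace or resets. The state names and events below are illustrative simplifications of the diagram, not its full detail:

```python
# Minimal sketch of the activation flow from our system diagram. The real
# flowchart has more intermediate states; these names are illustrative.

IDLE, SENSING, CHOSEN, DECLINED = "idle", "sensing", "chosen", "declined"

def step(state, event):
    """Advance the interaction state machine on a single event string."""
    transitions = {
        (IDLE, "participants_present"): SENSING,
        (SENSING, "score_above_threshold"): CHOSEN,
        (SENSING, "score_below_threshold"): DECLINED,
        (CHOSEN, "session_done"): IDLE,
        (DECLINED, "participants_left"): IDLE,
    }
    # Events that don't apply to the current state are ignored.
    return transitions.get((state, event), state)
```

Writing the flow out this way is what exposed the real-time data-processing issues mentioned above: an event arriving in the wrong state has to be ignored rather than crash or derail the visuals.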
The culinary experience is designed to spark dialogue on food agency and decision-making. Rather than humans choosing how to manipulate food, the dough itself acts as a gatekeeper, assessing and selecting its artisans. The experience is intentionally opaque, leaving participants questioning what factors influenced the selection process. This ambiguity adds to the conversation about food's autonomy and our relationship with it.
Early versions of the project considered more direct control over the selection criteria, but we later embraced unpredictability to reinforce the idea that food, as an active agent, operates beyond human logic. This unpredictability strengthens the theme of food agency, making its "choices" feel more autonomous and adding to the "mystification" of the food. We also adjusted the visual representation of "aliveness" through multiple iterations, refining how the projector illustrated the dough's decision-making.
Expanding the system to include additional biometric inputs (e.g., pH, pulse, or CO₂ levels) to further personalize the selection process, allowing the dough to respond to participants' physiological states and create an even more dynamic interaction.
Exploring different fermented food mediums with unique sensory responses, such as kombucha or miso. For example, kombucha's acidity and carbonation levels could dynamically shift in response to human touch, creating a different form of interaction compared to the elasticity and hydration of sourdough.
Refining real-time visualization techniques by improving responsiveness to sensor data, enhancing aesthetic cohesion, and increasing interactivity to create a more immersive and engaging storytelling experience.
Enhancing physical interaction by integrating robotic or kinetic elements in the form of the dining table's response.
This project serves as a prototype for rethinking food's role in human interactions, providing an engaging and provocative experience that challenges traditional perspectives.