House of Desktop Dining
Website and Streaming, Machine Learning
Research sponsored by SNAP and led by Jenny Rodenhouse
Role: data collection, model training, HTML/CSS/JS coding
Dining Room Entrance
Visit the dining room located at
The House of Desktop Dining is a digital restaurant that you visit and eat in with your computer. Automatically generating a table for two, the restaurant uses selected visual and audio data from top mukbang performers to prepare a set of food pairings personalized to your body's posture and mouth sounds.
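The pairing step can be sketched as a lookup over the classifiers' top predictions. Teachable Machine models return an array of `{ className, probability }` objects from `predict()`; everything else below (the class names and the pairing table) is an illustrative assumption, not the project's actual code.

```javascript
// Hypothetical sketch: map the top posture class and the top
// mouth-sound class to a food pairing. The class names and the
// PAIRINGS table are invented for illustration.

// Pick the highest-probability class name from a prediction array.
function topClass(predictions) {
  return predictions.reduce((best, p) =>
    p.probability > best.probability ? p : best
  ).className;
}

// Invented pairing table keyed by "posture|mouthSound".
const PAIRINGS = {
  "lean-in|slurp": ["ramen", "cold noodles"],
  "lean-in|crunch": ["fried chicken", "pickled radish"],
  "upright|slurp": ["soup", "rice"],
  "upright|crunch": ["chips", "salad"],
};

// Combine both model outputs into one personalized pairing,
// falling back to a default table setting for unseen combinations.
function pairFoods(posturePreds, soundPreds) {
  const key = `${topClass(posturePreds)}|${topClass(soundPreds)}`;
  return PAIRINGS[key] || ["rice", "water"];
}
```

In the live site, the two prediction arrays would come from separate image and audio models running against the webcam and microphone.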
Questions we kept in mind include: How do machines see the physical world? What does that mean? How does machine vision affect human activities and performance?
Tools: Teachable Machine, RunwayML
Training Process and the Evolution of the Machine's Vision of a Burger
Audio and Visual Data Collection
Through observation and discussion, our team found that popular mukbang videos feature typical types of food, and that the performers (the YouTubers) use exaggerated postures, movements, and sounds to highlight the food and the eating process. We therefore collected our data from mukbang videos on YouTube, exported and separated the frames and sound, and trained a model for each typical kind of mukbang food and its corresponding gestures and sounds.
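The labeling step described above can be sketched as grouping the exported frame and audio files into per-food-class training sets before they are uploaded to Teachable Machine. The filename convention (`<food>_<index>.<ext>`) is an assumption made for illustration, not the project's actual naming scheme.

```javascript
// Hypothetical sketch: bucket exported frame/audio filenames into
// per-class training sets, assuming names like "burger_0123.png" or
// "ramen_0007.wav". The naming convention is invented for illustration.
function groupByFoodClass(filenames) {
  const groups = {};
  for (const name of filenames) {
    const match = name.match(/^([a-z]+)_\d+\.(png|jpg|wav)$/);
    if (!match) continue; // skip files that don't follow the convention
    const food = match[1];
    (groups[food] ||= []).push(name);
  }
  return groups;
}
```

Each resulting bucket would then become one class in a Teachable Machine image or audio project.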