Live Trace


In this interactive experience we are interested in enabling quick input actions on Google Glass. The application allows users to trace an object or region of interest in their live view. The Live Trace app demonstrates the effectiveness of gestural control for head-mounted displays.


Back to the Desktop


In this project, we construct a virtual desktop centered around the smartphone display with the surface around the display opportunistically used for input. We use a 3-pixel optical time-of-flight sensor, Mime, to capture hand motion.


Biomimicry II - Frog Vision


How do frogs see? Why do they stay still for most of the day? This project explores frog vision and offers people a unique, immersive experience of seeing the world as a frog does, using an Oculus Rift and a Kinect.


Biomimicry I - How Bats Hunt


It is known that bats use echolocation to perceive the world and to precisely spot moving insects in the air when hunting. This project is an attempt to replicate that experience for humans by spatially arranging speakers inside each volumetric earpiece.


Webpage designed by Hye Soo Yang

Fabrication - How to Make (Almost) Anything


Using the various fabrication tools and machines in the machine shop, numerous hands-on projects built from scratch produced interesting results that previously could only be imagined.




While a number of ferrofluid-based applications exist, none of them offers a tangible interface. AnimaFluid introduces a touchable interface that creates interactive buttons using textures generated from ferrofluid and electromagnets.



Designing a new tool that borrows the affordances of different objects can be cumbersome if one does not have a clear idea of how it will look. As a quick solution, Vmorph creates blended shapes of two objects based on the distance between them.
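The distance-driven blending can be sketched as a simple linear interpolation between two vertex sets, with the physical distance mapped to a blend weight. This is an illustrative sketch, not the actual Vmorph code; all names and the clamping scheme are assumptions.

```python
def morph(shape_a, shape_b, distance, max_distance):
    """Blend two shapes (lists of (x, y) vertices) by a weight derived
    from the distance between the two objects. Hypothetical sketch:
    the real system may use a different mapping or 3D meshes."""
    t = max(0.0, min(1.0, distance / max_distance))  # clamp weight to [0, 1]
    return [((1 - t) * ax + t * bx, (1 - t) * ay + t * by)
            for (ax, ay), (bx, by) in zip(shape_a, shape_b)]

# A unit square halfway morphed toward a diamond:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
diamond = [(0.5, -0.5), (1.5, 0.5), (0.5, 1.5), (-0.5, 0.5)]
halfway = morph(square, diamond, distance=5.0, max_distance=10.0)
```

At zero distance the result is exactly the first shape; at or beyond `max_distance` it is exactly the second.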


Make Your Own Phone


Phones today are extremely complicated, and it is therefore very difficult to understand how they actually work. During this workshop, makers were challenged to solder basic phone components together and get the result working just like any other mobile phone.


Hand Gesture Recognition & Augmented Reality


As the input area on mobile devices shrinks with smaller form factors, and as the limited input options on wearable devices are constraining, this project uses hand gestures to turn the entire volume around the device into an input space.

Light-field Camera iPhone Application




Computational Photography


Computationally combining, manipulating and transforming digital photographs not only helps create unique images but also enables edge detection and refocusing on objects at different depths.
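Edge detection, one of the operations mentioned above, can be sketched with the classic Sobel operator: two 3x3 kernels estimate the horizontal and vertical intensity gradients, and their magnitude marks edges. A minimal pure-Python sketch (the coursework may well have used a different method or library):

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude of a grayscale image
    (a list of rows of intensities) using the Sobel operator.
    Border pixels are left at zero for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Horizontal gradient: right column minus left column, weighted.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            # Vertical gradient: bottom row minus top row, weighted.
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge produces a strong response along the boundary:
step = [[0, 0, 255, 255]] * 4
edges = sobel_magnitude(step)
```

Flat regions yield zero; the step between columns produces the maximum response.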


User-customizable Gaming Environment


Mobile games are mostly designed by professional game engineers and designers. Gaming props, characters and settings must be predefined, and users have little say in them. In this iPhone game application, users can create their own gaming environment using images from their photo library.

Computational Pointillism


The most representative pointillist masterpiece, A Sunday Afternoon on the Island of La Grande Jatte by Seurat, can be rendered in 3D space with spheres. When viewed straight on from the front, the individual spheres come together to form the whole piece.
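The core mapping behind this idea is simple: each pixel of the painting becomes a colored sphere placed on a plane facing the viewer. A hedged sketch of that pixel-to-sphere conversion, with illustrative field names and spacing (the real renderer's representation is not documented here):

```python
def pixels_to_spheres(img, spacing=1.0, radius=0.4):
    """Map each pixel of an RGB image (rows of (r, g, b) tuples) to a
    sphere on the z = 0 plane. Viewed head-on, the spheres fuse back
    into the picture; off-axis, they separate into individual dots."""
    spheres = []
    for y, row in enumerate(img):
        for x, color in enumerate(row):
            spheres.append({
                "center": (x * spacing, -y * spacing, 0.0),  # image y grows down
                "radius": radius,
                "color": color,
            })
    return spheres

# A 2x2 image becomes four spheres laid out on a grid:
tiny = [[(255, 0, 0), (0, 255, 0)],
        [(0, 0, 255), (255, 255, 0)]]
dots = pixels_to_spheres(tiny)
```

A 3D engine would then draw each record as a sphere at `center` with the given `radius` and `color`.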


Python Image Program


Pixels come together to make up a larger image and are barely visible from a distance. This Python image program converts each pixel into a letter, secretly embedding stories in images. For example, a picture of a renowned person itself becomes his or her Wikipedia biography.
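The pixel-to-letter substitution can be sketched as follows: walk the image in scan order, assigning each pixel the next character of the story (cycling when the text runs out), and keep the pixel's intensity so that rendering the characters at that brightness preserves the picture from afar. This is a sketch of the idea, not the original program:

```python
def embed_text(gray_img, text):
    """Pair each pixel of a grayscale image (rows of intensities in
    0-255) with one character of a story, cycling through the text.
    A renderer would draw each character at its pixel's brightness."""
    chars, k = [], 0
    for row in gray_img:
        line = []
        for intensity in row:
            line.append((text[k % len(text)], intensity))  # (char, brightness)
            k += 1
        chars.append(line)
    return chars

# A 2x2 portrait carrying the text "ADA":
portrait = [[10, 200], [128, 30]]
lettered = embed_text(portrait, "ADA")
```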


Computer Graphics - Metro City


Metro City is a virtual 3D world built in an attempt to understand how 3D modeling software works behind the scenes. It is written in Python and uses the OpenGL API to create 3D shapes and lighting and to map various textures.

Engineering - Hopping Robot & Fire Fighting Robot


The hopping rabbit robot was a project aimed at creating a unique forward-moving mechanism driven by two simple motors. Integrating the motors with wheels and four light sensors allowed the robot to locate a light source and put it out.

Projects 2011-2015

Hand Puppetry 


Inspired by hand puppetry, in which finger joints move different parts of a puppet, this project animates a virtual model with hand gestures whose finger-joint information is captured through a Leap Motion controller.


Play Virtual Chess with Your Hand  


Moving 3D chess pieces in a virtual environment is challenging. For this reason, many chess games are designed in 2.5D space and offer unnatural ways of interacting. This project enables 3D gestural interaction in the gameplay using a compact 3D camera that captures hand information.

Relighting the Scene in Augmented Reality


Augmented objects in augmented reality often look disconnected from the scene because of the fixed lighting inherent to their 3D meshes. In this mini project, the user relights an augmented object by controlling the direction of light with a fingertip.
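The relighting step can be sketched with the standard Lambertian diffuse model: the shaded intensity is proportional to the cosine between the surface normal and the light direction, here taken from the fingertip ray. A minimal sketch under that assumption; the project's actual shading model is not documented here:

```python
import math

def relight(normal, fingertip_dir, albedo=1.0):
    """Lambertian diffuse shading: intensity = albedo * max(0, cos(theta)),
    where theta is the angle between the surface normal and the light
    direction, here supplied by the user's fingertip (hypothetical input)."""
    nx, ny, nz = normal
    lx, ly, lz = fingertip_dir
    n_len = math.sqrt(nx*nx + ny*ny + nz*nz)
    l_len = math.sqrt(lx*lx + ly*ly + lz*lz)
    cos_theta = (nx*lx + ny*ly + nz*lz) / (n_len * l_len)
    return albedo * max(0.0, cos_theta)  # light from behind contributes nothing

# Light head-on fully illuminates; light from behind gives black:
front = relight((0, 0, 1), (0, 0, 1))
behind = relight((0, 0, 1), (0, 0, -1))
```

As the fingertip ray sweeps across the scene, re-evaluating this per vertex (or per fragment) updates the augmented object's shading to match the chosen light direction.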


Variable Reality


This augmented reality application, developed for the Oculus Rift DK1 with two webcams, is meant to bring together the advantages of a physical book and those of a digital book to enable a unique reading experience.


Live Trace II


As a continuation of Live Trace I, quick gestural input actions are performed on two distinct wearable devices (see-through Vuzix glasses and Google Glass) to cover both indoor and outdoor use cases.


Variable Reality II


A wearable augmented reality system focused on creating a unique on-the-go reading experience that combines the readily accessible nature of digital books with the favorable physical spatiality of a paper book.