Prof. Hiroshi Ishii

Tangible Bits

Tangible User Interfaces

Tangible Bits: The relationship between the center and periphery of awareness in physical space, and between graspable media and ambient media.

Content Description

Sampler of Tangible User Interface Design
1 Pinwheels
Pinwheels is an example of an ambient display that spins in a wind of digital information (bits). The spinning pinwheels allow people to feel the flow of bits representing human activities or happenings in the natural world in their peripheral vision while they concentrate on other activities (such as conversation) in the foreground.
An astronomer following the activity of the solar corona could install these pinwheels in his or her home in order to monitor solar winds in the background. Being peripherally aware of subtle changes in solar activity leading up to significant events could help the astronomer time periods of intensive observation. The basic concept is to make solar winds of ionized particles, and all kinds of other information flows, perceptible in architectural space as a “wind” driving old-fashioned pinwheels. Current graphical user interfaces display most information as pixels on a screen, requiring the user’s conscious attention; as such, they are foreground media. But our capacity to recognize and process information is exhausted when we face too much data in the foreground, leading to information overload.
Ambient displays, such as spinning pinwheels, help to solve this problem by representing continuous information flows as continuous physical phenomena in the background so the user can be aware of them peripherally.
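To make the mapping concrete, the sketch below shows one way such an ambient display could be driven in software: a slowly polled data value (here, a hypothetical solar-wind speed reading) is linearly converted into a pinwheel rotation speed. The function names, value ranges, and polling period are illustrative assumptions, not the actual Pinwheels implementation.

```python
# A minimal sketch of an ambient mapping: a slowly polled data value drives
# the rotation speed of a motorized pinwheel. Names, ranges, and the polling
# period are illustrative assumptions, not the actual Pinwheels implementation.
import time

def to_rpm(value, lo, hi, rpm_min=5.0, rpm_max=120.0):
    """Linearly map a sensor reading in [lo, hi] to a pinwheel speed in RPM."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return rpm_min + t * (rpm_max - rpm_min)

def ambient_loop(read_solar_wind, set_motor_rpm, period_s=60.0):
    """Quietly update the pinwheel in the background at a slow, unobtrusive rate."""
    while True:
        speed_km_s = read_solar_wind()                     # hypothetical data source
        set_motor_rpm(to_rpm(speed_km_s, lo=300, hi=800))  # ~300-800 km/s is a typical range
        time.sleep(period_s)
```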
http://tangible.media.mit.edu/projects/ICC_Exhibition/pinLarge.htm

2 inTouch
inTouch is a project that explores new forms of interpersonal communication through touch. inTouch uses force-feedback technology to create the illusion that people – separated by distance – are actually interacting with shared physical objects (Distributed Shared Physical Objects). The “shared” object is a haptic link between geographically distributed users, creating a channel for physical expression over distance.
Each of the two identical inTouch devices uses three freely rotating rollers. Force-feedback technology synchronizes each roller to the corresponding roller on the distant mechanism; when one inTouch roller is moved, the corresponding roller on the other inTouch also moves. If the movement of one roller is resisted, the corresponding roller also feels resistance. They are, in a sense, connected by a stiff “digital spring.” Two distant users can play through touch, moving the rollers to feel each other’s presence. inTouch demonstrates a unique interface that has no boundary between “input” and “output” (the wooden rollers are force displays as well as input devices). The sense of touch plays a critical role: information can be sent and received simultaneously through one’s hand.
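As a rough illustration of that “digital spring,” the sketch below computes the torque each side might apply to its roller so that the two rollers track one another. The stiffness and damping constants and the function interface are assumptions made for illustration, not the actual inTouch control code.

```python
# A minimal sketch of a stiff spring coupling between two remote rollers.
# Constants and the interface are assumed; this is not the inTouch firmware.

K_SPRING = 8.0    # coupling stiffness (torque per radian of angle difference), assumed
B_DAMP   = 0.05   # damping term to keep the coupled system stable, assumed

def coupling_torque(theta_local, theta_remote, omega_local, omega_remote):
    """Torque applied to the local roller so it tracks its remote counterpart.

    If the remote roller turns, the local roller is pulled along; if the local
    user resists, the same spring term is what the remote user feels as resistance.
    """
    return K_SPRING * (theta_remote - theta_local) - B_DAMP * (omega_local - omega_remote)
```

Each site would run the same law symmetrically at a high update rate, which is what makes the link feel like a single shared object rather than a command-and-response channel.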
Past communication media (such as video telephony) tried to reproduce the voice or the image of the human face as realistically as possible in order to create the illusion of “being there.” inTouch takes the opposite approach by making users aware of the other person without explicitly embodying him or her. We think that inTouch creates a “ghostly presence.” By seeing and feeling an object move in a human fashion on its own, we imagine a ghostly body. The concept of the ghostly presence provides us with a different approach to the conventional notion of telepresence.
http://tangible.media.mit.edu/projects/ICC_Exhibition/inLarge.htm

3 curlybot
curlybot is an educational toy that records and plays back physical motion. When the user takes hold of curlybot and moves it around on a flat surface, it remembers how it has been moved. When it is then released, it replays the movement with all the intricacies of the original, including every pause, acceleration, and tremor of the user's hand. It was designed to help children develop geometrical thinking and as a medium for lyrical expression.
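The record-and-replay cycle can be pictured with the short sketch below: wheel positions are sampled while the toy is held and moved, then served back through the motors once it is released. The class, sampling rate, and hardware calls are hypothetical, not curlybot's actual firmware.

```python
# A hypothetical sketch of curlybot's record-and-replay cycle. The class,
# sampling rate, and hardware calls are illustrative, not the actual firmware.
import time

class GestureRecorder:
    def __init__(self, sample_hz=100):
        self.dt = 1.0 / sample_hz
        self.samples = []                 # (left_wheel, right_wheel) encoder positions

    def record(self, read_encoders, is_held):
        """Sample wheel positions at a fixed rate while the toy is held and moved."""
        self.samples = []
        while is_held():
            self.samples.append(read_encoders())
            time.sleep(self.dt)

    def replay_once(self, drive_wheels_to):
        """Servo the wheels through the recorded trajectory, pauses and tremors included."""
        for left, right in self.samples:
            drive_wheels_to(left, right)
            time.sleep(self.dt)
```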
Phil Frei created the curlybot concept and completed the industrial and interaction design in late 1998. With the support of Victor Su for electronic circuit design and prototype construction, the first prototype was completed in the spring of 1999. The force-feedback technology used for real-time simultaneous communication in inTouch is used in curlybot for the recording and playback of non-simultaneous gestures.
This project has significance in terms of both interface design and the use of computers for educational purposes. As a tangible interface it blurs the boundary between input and output (similar to inTouch): curlybot itself is both an input device to record gestures and a physical display device to reenact them. By allowing users to teach curlybot gestures through hand and body motions, curlybot enables a strong connection between body and mind not obtainable from anything expressed on a computer screen.
From an educational standpoint, curlybot opened new horizons as a toy that may help children acquire mathematical concepts. It is programmed not on a computer screen but simply by moving it around in physical space, demonstrating the power of “programming by gesture.”
http://tangible.media.mit.edu/projects/ICC_Exhibition/curlyLarge.htm

4 Urp
Urp is a tangible urban-planning workbench based on the “I/O Bulb” concept originally developed by Dr. John Underkoffler in 1998. The “I/O bulb” creates high-resolution, bi-directional light flows. It collects photons from physical surfaces and uses knowledge about a particular domain, such as urban planning, to interpret the light patterns. It then responds with digitally controlled light output, which is projected back onto the physical space.
In Urp, physical architectural models are placed on a table illuminated with “I/O bulbs,” and shadows are cast according to a computer simulation. By adjusting the clock, it is possible to track the movement of shadows and sun reflections over the course of a day. In addition, air currents around the buildings are rendered visible, and a wind gauge can be used to measure the wind speed at any point. Using “I/O bulbs” to project real-time computer simulations onto physical models makes it possible to understand and directly manipulate digitally rendered urban spaces in a world that is contiguous with one’s own body.
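As an illustration of what the digital shadow implies computationally, the sketch below uses a deliberately crude solar model to turn the simulated clock time into a sun elevation and azimuth, and from those the tabletop offset of a building's shadow. It only shows the kind of calculation involved, not the actual Urp code.

```python
# A deliberately crude, illustrative solar model and shadow projection;
# not the actual Urp simulation.
import math

def sun_angles(hour):
    """Very rough sun position: elevation peaks at noon, azimuth sweeps east to west."""
    elevation_deg = max(5.0, 70.0 * math.sin(math.pi * (hour - 6.0) / 12.0))
    azimuth_deg = 180.0 + 15.0 * (hour - 12.0)       # degrees clockwise from north
    return math.radians(elevation_deg), math.radians(azimuth_deg)

def shadow_tip_offset(building_height, hour):
    """Tabletop (east, north) offset of a building's shadow tip at the given clock time."""
    elevation, azimuth = sun_angles(hour)
    length = building_height / math.tan(elevation)   # low sun -> long shadow
    # The shadow falls on the side of the building opposite the sun.
    return (-length * math.sin(azimuth), -length * math.cos(azimuth))
```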
When designing tangible interfaces, it is important to consider which elements should be given physical form and which elements should be rendered as digital images. The key to a successful interface lies in hiding the boundary between the digital and physical worlds. The digital shadows (video projections) cast by the physical models in Urp represent one solution to this problem.
If we were to replace all of the hundreds and thousands of light bulbs in an architectural space with I/O bulbs, what kind of interaction design would be possible? The I/O bulb, as the core concept in this project, has demonstrated the potential for new digital interactions that occur not only on the tabletop, but within architectural space itself.
http://tangible.media.mit.edu/projects/ICC_Exhibition/luminLarge.htm

5 bottles
Through the seamless extension of physical affordances and the metaphor of bottles, this project explores interface transparency. Humans have used glass bottles for thousands of years. The basic concept uses glass bottles as both containers and minimalist interfaces to digital information.
Just as we naturally open and close lids to access bottles’ physical contents, in this project users open and close lids to access digital information. A wide variety of contents (including music, weather reports, and stories) have been developed to test the concept.
The “bottle-as-interface” concept began as a “weather forecast bottle,” which Ishii envisioned as a present for his mother. Upon opening the weather bottle, she would be greeted by the sound of singing birds if the next day’s weather was forecast to be clear; the sound of rainfall would indicate impending rain. Such an interface would be consistent with her everyday interactions with her familiar physical environment, such as opening a bottle of soy sauce. She had never clicked a mouse or typed a URL in her life, but she had opened soy sauce bottles thousands of times.
In late 1998, Ishii and Rich Fletcher expanded this idea into “musicBottles” and began the project. They used sensor technology developed by Dr. Joe Paradiso and collaborated with different designers, engineers and artists to create a custom table and bottles with special electromagnetic tags.
Three sets of bottles – each with different content: classical, jazz, and techno music – were designed and built.
In June 2000, this project received the IDEA 2000 Silver Prize (the 2000 International Industrial Design Excellence Awards competition).

We also developed custom wireless sensing technology for this project. An antenna coil attached to the underside of the table creates a magnetic field above the table. A custom electronic circuit detects disturbances in this magnetic field that are caused by the placement and opening of tagged bottles. The system then executes musical programs for each bottle (e.g. opening one bottle plays a piano) and controls the patterns of colored LED light projected onto the table.
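A minimal sketch of such a control loop, assuming hypothetical event, audio, and lighting interfaces rather than the installation’s actual software, might look like this:

```python
# A minimal sketch of a bottles-style control loop: tag events from the sensing
# table are turned into audio layers and LED patterns. The event format, track
# names, and audio/lighting calls are hypothetical, not the installation's code.

TRACKS = {
    "bottle_1": "piano.wav",      # e.g. opening this bottle releases the piano part
    "bottle_2": "strings.wav",
    "bottle_3": "voice.wav",
}

def handle_event(event, audio, lights):
    """React to one open/close event detected as a disturbance of the table's field."""
    track = TRACKS.get(event.bottle_id)
    if track is None:
        return
    if event.kind == "opened":
        audio.unmute(track)                          # the part plays while the bottle is open
        lights.set_pattern(event.bottle_id, active=True)
    elif event.kind == "closed":
        audio.mute(track)                            # closing the lid silences it again
        lights.set_pattern(event.bottle_id, active=False)

def run(sense_events, audio, lights):
    for event in sense_events():                     # blocking stream of tag events
        handle_event(event, audio, lights)
```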
This project uses a combination of artistic and technological techniques to support emotional interactions that are fundamentally different from those of conventional, function-centric interfaces.
http://tangible.media.mit.edu/projects/ICC_Exhibition/bottlesLarge.htm

(H. Ishii)