University of Calgary Interactions Lab

January 29, 2007

Last week, I had the opportunity to visit the Interactions Lab at the University of Calgary. The ilab conducts research in the fields of human-computer interaction, computer-supported co-operative work (CSCW), and information visualization. On Wednesday they held a demo day, where they showed off their projects and facilities to other researchers at U of C, people from industry, and prospective students like me.

Phidgets are probably the most famous thing to come out of the ilab (well, I’d heard of them before, anyways). Phidgets — short for “physical widgets” — are easy-to-program USB devices for interacting with the “real world”: e.g. servos, sliders, motion sensors, temperature sensors, and buttons. Saul Greenberg demoed some of the work that is being done with “shared phidgets” — phidgets that are networked.
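To give a sense of how simple the programming model is, here’s a rough sketch of reading a Phidget button from Python. I’m basing it on the Phidget22 Python bindings as I understand them (not necessarily the exact library the lab was using), so treat it as illustrative rather than gospel:

```python
from Phidget22.Devices.DigitalInput import DigitalInput

# Called whenever the button's state changes.
def on_state_change(channel, state):
    print("Button pressed!" if state else "Button released!")

button = DigitalInput()
button.setOnStateChangeHandler(on_state_change)
button.openWaitForAttachment(5000)  # wait up to 5 seconds for the device

input("Press Enter to exit...\n")
button.close()
```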

There were several jaw-dropping demos on the interactive displays. One of them was Uta Hinrichs’s Interface Currents, a technique that makes it easier to access workspace items on a large, interactive display. The main workspace is in the center of the display, while various objects travel in a “current” around the outside of the screen. A very cool idea and a beautiful demo.

[Image: Interface Currents]

Edward Tse showed his work in multi-user and multi-modal interaction. If you’re not sure what that means, picture this: Ed pulled up Google Earth on the DiamondTouch interactive table. He could speak commands into a microphone (“fly to Calgary”) and manipulate the map using natural touch gestures. For example, zooming was done with the same pinch gesture used on the iPhone. Ed also showed similar applications with World of Warcraft and The Sims. He’s got lots more information (including videos) on his website.

[Image: World of Warcraft]

Also on one of the big screens, Michael Nunes demoed Video Traces, which is “a visualization system that allows detailed exploration of a large video stream”. It was a really neat system that let you look at video from a surveillance camera and quickly identify and drill down to areas of interest, like when someone entered a room.

The interactive displays seem to be central to a lot of the work being done at the ilab, but there were many other interesting demos. Rob Diaz-Marino’s project uses motion detection to create ambient sounds in the environment. For example, movement in one area of a room could trigger the sound of waves crashing on the beach, while another area might be linked to the sound of seagulls.
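I don’t know the details of how Rob’s system is actually built, but the core idea (regions of a room mapped to sounds, triggered by motion) could be sketched something like this. Everything here is made up for illustration: the region names, the sound files, and the detect_motion stand-in.

```python
import random
import time

# Hypothetical mapping from room regions to ambient sound files.
REGION_SOUNDS = {
    "near the window": "waves.wav",
    "by the doorway": "seagulls.wav",
}

def detect_motion():
    """Stand-in for a real motion sensor: returns a region with movement, or None."""
    return random.choice(list(REGION_SOUNDS) + [None])

def play(sound_file):
    # A real system would hand this to an audio library; printing is just a placeholder.
    print("playing", sound_file)

while True:
    region = detect_motion()
    if region is not None:
        play(REGION_SOUNDS[region])
    time.sleep(1)
```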

One of the most fun projects I saw belonged to Jim Young. Jim is a PhD student working with Dr. Ehud Sharlin on human-robot interaction, and he’s exploring the use of cartoon expressions to communicate the state of robots. I thought this was a really interesting approach. It reminded me of Malcolm Gladwell’s Blink, which I read recently, where he discussed how humans are adapted to make very sophisticated judgements based on facial expressions. I really like the idea of exploring new interfaces that take advantage of our natural cognitive abilities. I think that this is something we’re really going to need, as we have more and more devices clamoring for our attention.

All in all, I had a great visit to the ilab. I left very impressed and inspired. Thanks to Saul Greenberg and Sheelagh Carpendale for bringing me there for the demo day, to Ed for helping show me around and telling me lots about the program and the school, and to everyone else at the ilab for being friendly and helpful!