







INTERVIEW by LAUREN RIGNEY | PHOTOGRAPHY courtesy of CHRIS HARRISON

 

Picture this: you need to make a phone call, but instead of reaching for your cell phone, you open the palm of your hand. The numbers are displayed on your hand’s surface, and you dial by tapping the digits on your own skin. As long as Chris Harrison continues his research, soon enough we’ll be making calls, taking notes, and browsing the web on any flat surface — yes, including our own bodies. As a graduate student at Carnegie Mellon University’s Human-Computer Interaction Institute, Chris has already helped develop a couple of prototypes that change the way people interact with computers. His OmniTouch system turns any surface into a touch screen and has already become an internet sensation with its viral demonstration video on YouTube.

 

 

What are you studying at Carnegie Mellon University right now?

 

My PhD will be in human-computer interaction.

 

Are you from Pittsburgh?

 

No. Born in England, raised in New York City. I moved from the U.K. when I was six months old.

 

The OmniTouch video has been posted on YouTube for just under two months, and it’s already at 1.2 million views. Were you surprised by how quickly this video took off?

 

It was definitely a little unexpected. My projects involve technology that spans everything from immediate next steps all the way to the very futuristic, and this project in particular was quite far out; it’s almost sci-fi. I think that caught people’s imaginations.

 

Obviously smartphones and touchscreen tablets have really changed the way we do a lot of things. And being able to have touchscreen-like functionality on anything — your hands, walls, notepad, whatever — I think is a really powerful idea.

 

The people I got a lot of emails from, the people who put a little more thought into it than a one-line YouTube comment, were saying, “I can see that in ten years’ time this is going to be super small and twice as good.” Those people really understood the implications, and I think it’s the implications of this project that are really exciting.

 

Okay, so, Chris, what are the implications of OmniTouch?

 

Well, probably the most straightforward implication is that anything you can do on your smartphone today, you could conceivably do in the palm of your hand tomorrow. It’s not even in your pocket, it’s right there on your body — it’s getting really intimate.

 

What do you see as the near future for OmniTouch?

 

I can’t comment on anything specific, because it is an active project at Microsoft. What I can say is the logical next step would be miniaturization. And then I think the next big step is making it more palatable — so rather than having something shoulder-mounted, perhaps it could be integrated into the back of a cell phone. Or perhaps we can build this into a little thing you clip on your pocket or on the strap of your backpack.

 

Looking around on your site, it’s obvious that you’ve been doing a lot of other research and projects. Is OmniTouch taking up the majority of your time and focus right now?

 

Actually, I’ve already moved on to new projects, though on-body computing is a continuing thread of research. There was also another project that came out at the same time as OmniTouch, called TapSense. With mobile devices, there are two fundamental problems: one is that they’re small, so you inherently have less area to do things. The other side of the input coin is richness. How can we make the interactions on that small device as rich as possible? One example of increased richness is multi-touch. TapSense is a project trying to increase touchscreen richness. It basically listens to the sound of your finger’s impact, and from that it can tell which part of your finger was used. For example, you can hit with your knuckle, your nail, the tip of your finger, or the pad of your finger, and the device knows the difference.

 

The problem is not that mobile phones are slow, or that humans are slow. The problem is that we can’t communicate with each other fast enough. So my research is trying to figure out a way to remove that human-computer bottleneck.

 

You mentioned that TapSense uses the sound of your finger parts?

 

If you tap with your fingernail versus your knuckle, you can hear, even with the unaided ear, that they sound very different. Basically, a microphone in the device listens to this sound.
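To give a rough sense of how this kind of acoustic classification can work, here is a minimal sketch in Python. It is not the TapSense implementation; the synthetic tap signals, the spectral features, and the k-nearest-neighbor classifier below are all illustrative assumptions.

# Illustrative sketch only: classify taps by which part of the finger struck
# the surface, using a coarse spectral feature and a k-NN classifier.
# The frequencies, features, and classifier are assumptions for demonstration,
# not the actual TapSense design.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

SAMPLE_RATE = 44100  # Hz
TAP_LEN = 1024       # samples per tap snippet (~23 ms)

def synthetic_tap(kind, rng):
    """Generate a toy tap: a decaying burst whose dominant frequency
    differs by finger part (toy numbers; knuckles sound lower than nails)."""
    freq = {"knuckle": 300.0, "pad": 800.0, "nail": 2500.0}[kind]
    t = np.arange(TAP_LEN) / SAMPLE_RATE
    envelope = np.exp(-t * 80.0)
    return envelope * np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(TAP_LEN)

def features(signal):
    """Coarse spectral signature: log-magnitude FFT averaged into 16 bands."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bands = np.array_split(spectrum, 16)
    return np.log1p(np.array([b.mean() for b in bands]))

rng = np.random.default_rng(0)
kinds = ["knuckle", "pad", "nail"]
X = np.array([features(synthetic_tap(k, rng)) for k in kinds for _ in range(50)])
y = np.array([k for k in kinds for _ in range(50)])

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print(clf.predict([features(synthetic_tap("nail", rng))]))  # expected: ['nail']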

 

Watching the video on OmniTouch, I was thinking about all the fun, potential uses for this. When you were playing around with it, was there anything really interesting or funny that you discovered?

 

One thing that was very intriguing: we were running the user study for OmniTouch, and we had budgeted ten minutes for each participant to practice using the system. It turns out people didn’t really need any training at all; no one used those ten minutes. Within two minutes they were like, “Okay, let’s go. I get it.”

 

What kind of phone do you use in your personal life?

 

I have an Android phone.

 

Do you have a tablet you use?

 

I don’t actually have a tablet. I just have my laptop and it goes everywhere with me.

 

What are your plans post-PhD? What do you see yourself doing?

 

I’m leaning slightly towards academia, towards becoming a professor. I’ve had the opportunity to mentor a lot of undergraduates at CMU, and this semester I’m teaching a class on prototyping. I’ve really enjoyed the experience, so I’d be very excited to continue that. That being said, there are also some fantastic industrial labs, which are almost like academic institutions. The one that comes to mind is Microsoft Research.

 

How can Daily BR!NK readers help contribute to your success?

 

First and foremost, we are always looking for excited interns and undergraduates. There are a lot of different opportunities at CMU, programs you can apply to. If people are interested, have enthusiasm, and think they have the skill sets to tackle projects like this, we’re always interested in hearing from them.

 

Is there anything we left out?

 

The only other thing I’d add is something I touched on before, about bringing computation closer to the user. There’s a huge literature on wearable computers. If you went back to the ’80s, people had all these notions of wearing a head-mounted computer with a keyboard on your belt; you were basically a kind of cyborg. I’m going anti-cyborg. I want to remove, or at least hide, the computing. I want to augment the human form, not obscure it or replace it. We’re so familiar with our bodies; we get way more training with our bodies than with any other device. It’s actually the perfect union between technology and the body, a very nice marriage between two worlds that hasn’t ever really existed before.

 

 



 








