May 24, 2012

Must See: MIT Creates Impressive Augmented Reality Object Manipulation System [video]

I wrote a few months back that augmented reality needed to prove itself. While I still think this is the case, I’m happy to report that top minds are working on just that. Some genius folks from MIT have created ‘T(ether)’, an amazing system which allows a user to interact with an augmented reality world by reaching out and manipulating it with their hands.

This sort of thing is best explained with video. Thankfully, the MIT folks have put together a great video showing a bit about how T(ether) works:

Here’s the technical description straight from the project page:

T(ether) is a novel spatially aware display that supports intuitive interaction with volumetric data. The display acts as a window affording users a perspective view of three-dimensional data through tracking of head position and orientation. T(ether) creates a 1:1 mapping between real and virtual coordinate space allowing immersive exploration of the joint domain. Our system creates a shared workspace in which co-located or remote users can collaborate in both the real and virtual worlds. The system allows input through capacitive touch on the display and a motion-tracked glove. When placed behind the display, the user’s hand extends into the virtual world, enabling the user to interact with objects directly.

As you can see, the system works very well, even if it’s just a prototype. The hand-tracking is extremely responsive, and with it the user can easily create, manipulate, and even animate objects in a virtual world.
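The project page doesn’t go into implementation details, but the 1:1 mapping described above makes the core of this kind of interaction easy to sketch: positions reported by the tracked glove are used directly as virtual-world coordinates, so grabbing a virtual object is essentially a bounds check against the hand’s real position. Here’s a rough, hypothetical Python sketch of one tracking frame of that idea (none of the names below come from T(ether) itself):

```python
# Hypothetical sketch of direct manipulation under a 1:1 real/virtual mapping.
# Not the MIT implementation -- just an illustration of the concept.
from dataclasses import dataclass, field

@dataclass
class Cube:
    center: tuple          # position in the shared room/virtual frame (metres)
    size: float = 0.1      # edge length
    held: bool = False

@dataclass
class Scene:
    cubes: list = field(default_factory=list)

def inside(point, cube):
    """True if a tracked point lies within the cube's axis-aligned bounds."""
    return all(abs(p - c) <= cube.size / 2 for p, c in zip(point, cube.center))

def update(scene, hand_pos, pinch):
    """One tracking frame: a pinch inside a cube grabs it; a held cube follows the hand."""
    for cube in scene.cubes:
        if cube.held:
            if pinch:
                cube.center = hand_pos   # cube follows the gloved hand
            else:
                cube.held = False        # releasing the pinch drops it in place
        elif pinch and inside(hand_pos, cube):
            cube.held = True
            cube.center = hand_pos
    return scene

# The glove and the cube live in the same metre-scale coordinate frame,
# so no conversion between "real" and "virtual" positions is needed --
# that is the 1:1 mapping the project page describes.
scene = Scene(cubes=[Cube(center=(0.0, 1.2, 0.5))])
update(scene, hand_pos=(0.01, 1.21, 0.52), pinch=True)
```

A real system would of course layer rendering, the head-tracked “window” view, gestures for creating and animating objects, and networking for remote collaboration on top of a loop like this.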

Unfortunately, T(ether) requires an expensive and immobile infrared tracking system (the same kind used for motion capture in video games and movies) in order to work; this isn’t a portable or affordable setup by any stretch of the imagination.

Still, what they have here is an impressive proof-of-concept which makes it very easy to imagine practical applications of such technology.

I’ve emailed the folks responsible for T(ether) about using an HMD instead of an iPad as the display interface. It would seem much more natural to me to simply look around through a head-tracked HMD than to hold a heavy iPad as a window into the augmented-reality world. The touch interface is, of course, on the iPad, but with a little elbow grease it could simply be projected into the augmented world itself, eliminating the need for the iPad altogether. I’ll update this article if I get a response.

As soon as I started watching them create and place cubes, my mind went to Minecraft. Pair T(ether) with an omni-directional treadmill, an HMD, and Minecraft, and you’d be able to stroll around a virtual world, physically hold a virtual cube, and place it wherever you like; this would be way more immersive than simply right-clicking to place a block. I’ve built many a structure in Minecraft, but to place each cube with my bare hands would be truly incredible and a big step in the direction of immersive VR gaming.

Gotta give credit where credit is due. The folks behind T(ether) are as follows: Matthew Blackshaw (@mblackshaw), Dávid Lakatos (@dogichow), Hiroshi Ishii, and Ken Perlin.


3 Responses to “Must See: MIT Creates Impressive Augmented Reality Object Manipulation System [video]”


  1. May 25, 2012 at 10:01 am

    It’s good as a demo but it’s really just highlighting how inconvenient everything in VR is without an HMD.

  2. May 25, 2012 at 10:05 am

    RE: T(ether) + Minecraft
    Take a look at this video which puts MC’s block “disadvantage” to good use:

