In Nov 2010 this blog was closed (read why). To keep up to date with me now, visit www.chrisoshea.org and follow @chrisoshea on twitter. Thanks
Pixelsumo is a blog about interaction, with an emphasis on play, installation, video game culture, playgrounds and toys. Written by Chris O'Shea.
Posted February 11th 2006 under Installations, Multi-touch, Surfaces, Tangible
Jeff Han
Back in August last year I posted a project by Jeff Han that demonstrated how a multi-touch surface could work using internal reflection of infrared light and camera tracking. Since then he has been working on a number of demos with collaborators, including many from Philip Davidson, two from Casey Muller and one from Ilya Rosenberg. These demos are great, especially the modular synth patch (it would be good to see a video of this alone). View here.
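To make the camera-tracking side concrete: in an FTIR setup, fingers pressed against the acrylic frustrate the internal reflection and show up to the camera as bright blobs, which software then thresholds and reduces to touch points. A minimal sketch of that step (synthetic frame and threshold value are my own illustrative assumptions, not Jeff's actual code):

```python
import numpy as np

def find_touches(frame, threshold=200):
    """Return (x, y) centroids of bright blobs in an IR camera frame.

    Each connected region above the threshold is treated as one touch.
    """
    bright = frame > threshold
    visited = np.zeros_like(bright, dtype=bool)
    touches = []
    h, w = bright.shape
    for y in range(h):
        for x in range(w):
            if bright[y, x] and not visited[y, x]:
                # Flood-fill one blob to collect its pixels
                stack, pixels = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and bright[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                touches.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return touches

# Synthetic 64x64 frame with two bright "fingertips"
frame = np.zeros((64, 64))
frame[10:14, 20:24] = 255
frame[40:44, 50:54] = 255
print(find_touches(frame))  # two (x, y) centroids
```

A real system would also track blobs across frames to follow moving fingers, but the per-frame detection is this simple in principle.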
Is this from Apple?
No. Many sites have reported that this is a new Apple product, but the release of these videos simply coincided with news of Apple's patent applications for multi-touch interface gestures. CreateDigitalMusic has a good roundup of the patent.
The project by Jeff couldn’t easily become a commercial product in its current state, as the set-up (rear projection, cameras etc) is more suited to interactive installations than to an off-the-shelf package.
What about Lemur?
JazzMutant’s Lemur is a touch-screen interface, mainly used by performers/musicians to control various sound applications via OSC/MIDI. There are two main differences between these two projects:
1) Display: The Lemur lets you create an interface from pre-defined panels, buttons and sliders, whereas Jeff’s allows you to program software that reacts to input and then overlay this on the screen. This enables an intrinsic link between the input and output, rather than a separate connection.
2) Portability: The Lemur is obviously a lot more portable and self-contained, but at the same time is perhaps only suitable for one user at a time due to its size. The Lemur uses electronic sensing of touch, whereas the first uses camera tracking, which contributes to the amount of space required.
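As an aside on what "control via OSC" actually involves: an OSC message is just a null-padded address string, a type-tag string and big-endian arguments, usually sent over UDP. A hedged sketch of packing one fader value (the address `/fader/1` and port are made-up examples, not the Lemur's actual namespace):

```python
import struct

def osc_pad(s: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per OSC 1.0."""
    s += b"\x00"
    return s + b"\x00" * (-len(s) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Pack an OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(floats)).encode())
    for v in floats:
        msg += struct.pack(">f", v)  # big-endian float32
    return msg

packet = osc_message("/fader/1", 0.5)
# A controller would typically send this with something like:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 8000))
print(packet)
```

The point is that the wire format is trivial; what the Lemur adds is the physical control surface that generates these messages.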
I am not comparing these two projects from a competitive standpoint, as I believe both have different objectives. Jeff is exploring interaction methods using multiple and collaborative inputs, while the Lemur is a self-contained package that gives musicians multiple controls using one or two hands.
SmartSkin
Back in 2002 Jun Rekimoto created SmartSkin, a multi-touch surface using electronic capacitive sensing (a grid of wires that detect small electrical currents in our bodies, kind of like a theremin) and top down projection. Due to the wire mesh, this doesn’t easily allow for rear projection. Unfortunately this work doesn’t seem to have progressed, as Jun has been working on lots of other great projects. [movie]
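To make the capacitive-grid idea concrete: each wire crossing reports a signal level that rises as a hand approaches, and a touch position can be estimated by interpolating across the grid. A toy sketch using a weighted centroid (a simplification of Rekimoto's actual interpolation; the grid size and readings are invented):

```python
import numpy as np

# Toy 4x4 grid of capacitance readings; higher = hand closer to that crossing.
grid = np.array([
    [0.0, 0.1, 0.0, 0.0],
    [0.1, 0.8, 0.3, 0.0],
    [0.0, 0.3, 0.1, 0.0],
    [0.0, 0.0, 0.0, 0.0],
])

def estimate_position(grid):
    """Weighted-centroid estimate of the touch position in grid coordinates."""
    ys, xs = np.indices(grid.shape)
    total = grid.sum()
    return (xs * grid).sum() / total, (ys * grid).sum() / total

x, y = estimate_position(grid)
print(f"touch near ({x:.2f}, {y:.2f})")
```

Because the estimate interpolates between wires, the positional resolution can be finer than the wire spacing, which is what makes a fairly coarse mesh usable as a touch surface.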
TactaPad
Back in June I found TactaPad, which demonstrated multi-touch input used like a mouse. No more information has been released on this potential product as yet. From the images, my guess at how this works is either a) a camera above the hands, or b) infrared light projected downwards with light levels measured through the base. I may be completely wrong :)
Comments?
I really like the first example by Jeff, as it is easy to build (compared to the others) for installations, and he has demonstrated how such applications can be used. Although camera tracking and infrared illumination are nothing new, I’ve not seen frustrated total internal reflection being used in this way.
As far as commercial products go, combine electronic sensing like SmartSkin or the Lemur into a flat-screen monitor, allow programmers to access the input data, and then display their visuals on the surface. This would be the ultimate consumer product for multi-touch. Perhaps Apple are working on such a device, or could Lemur have plans in this area?
If you know of any other multi-touch devices and research, or would like to comment on the above, feel free to post.
Comments
(February 12th 2006)
Here’s a commercial product (not as sophisticated as the Lemur): http://www.thinkmig.com/stcpics.html
(February 14th 2006)
[…] Multi-Touch Interaction Roundup A summary of current multi-touch / gesture interfaces. (tags: tangible_computing information_visualization) […]
(February 28th 2006)
Another product, discontinued but which you can find somewhat regularly on ebay, is the fingerworks iGesture – see http://www.fingerworks.com/ . There’s one on ebay right now, actually, just search on ebay for “igesture”. They go for anywhere between $200 and $300, depending on how desperate the buyer is. I absolutely love it, because they provided a wonderful SDK that makes it easy to get the raw data out of it, and you can connect multiple of them to a single computer simultaneously – i.e. for both hands. The device is extremely responsive, keeps track of fingers fairly well, and gives you the raw data (including the area covered by fingers, giving pseudo-pressure) in a very convenient form. Heartily recommended for DIY folks. I added iGesture support to keykit, if you are interested in a MIDI-capable system that can use it. …Tim…
(March 15th 2006)
just discovered this one… seems to be one of the most promising new developments…
http://www.esa.int/esaCP/SEM60JYEM4E_index_0.html
(March 23rd 2006)
yet another one, by microsoft…
http://research.microsoft.com/~awilson/papers/ICMI%202004%20TouchLight.pdf
(May 28th 2006)
The Exploratorium had a system, maybe 10 years ago, that used internal reflection and a rear-mounted camera in a similar way. The surface was kept wet to improve the effect of contact.
(September 12th 2006)
multitouch gone mass-production
Last week I was visiting the IFA fair on its last day. All in all it must have been heaven for electronics sales people, since every big player on the market was showing off their newest gadgets. Yet for the consumer, unless he or she is a total computer…
(November 30th 2006)
Noteworthy but unmentioned in your roundup are two sadly defunct but very real products:
The MTC Express from Tactex was not only bundled with useful interface libraries, but also capable of pressure sensitivity, something lacking from the capacitive technologies (although Han’s system may be capable of it to some degree). It was based on a grid of optical fibers. Apparently Tactex has retained the patent and is interested in licensing, but probably not willing to try another venture into the consumer market.
Also interesting is the “iGesture” series of capacitive multi-touch pads. Apparently their maker went under, but they still surface on eBay occasionally. One of them was a programmable keyboard, and they all had some degree of gesture recognition built into their firmware.
(September 23rd 2010)
Hi everyone!!
I need help with my multi-touch Lemur! I’m currently working on a project that requires me and my group to build a platform for the Lemur program… however, we are supposed to use a projector and infrared LEDs to project the screen… If anyone can send me any messages with advice on making these two components work, it would be greatly appreciated :]