Canesta showed off a demo at the International Consumer Electronics Show in Hitachi’s booth. Basically, the 3-D depth camera detects your movements, and Hitachi used it to create a gesture-controlled TV: no remote control needed.
This video from the 2009 International Consumer Electronics Show shows how you can control your TV using hand gestures detected by the 3-D camera atop the TV. Softkinetics does the software, a Swiss company does the depth camera, and Orange Vallee will deploy it in its interactive TV network.
Best Of Show Award & Best UI design at CEATEC 2008.
New remote controller concept from Panasonic R&D (San Jose Lab) featuring a dual click-pad, hand detection and on-screen user interface.
UI snapshots and award ceremony at CEATEC 2008.
This is again, like the Hitachi TV (here), a very good example of good gesture recognition combined with excellent interaction design and a good graphical user interface (GUI). The three elements need to be combined, it would seem, to get the right kind of gestural interaction. The iPhone works that way as well: good touch-gesture recognition, good interaction design (the way the gestures translate into computer actions) and a good GUI (one that invites, or ‘affords’, the right sort of gestures).
RBB TODAY
http://www.rbbtoday.com/news/feature/ceatec2008/
This looks like it is actually heading in the right direction. The gestures appear well implemented, as could be expected from the boys at GestureTek. And the use of the Canesta Vision chips (more here) appears to be very effective as well. There is a decent review of this Hitachi TV over at Take a Plunge…
The TV uses single-chip 3-D sensors provided by Canesta and software created by GestureTek.
Canesta’s sensors in the TV collect a 3-D image of everything in the room. The depth information helps the TV distinguish your actual hand from a hand printed on your T-shirt, or from any other object in the room. It can also tell different people apart, and it recognizes your hand when you stick it out to control the TV.
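To see why depth matters here, consider this hedged little sketch: a printed hand lies on the same depth surface as the shirt it is printed on, whereas a real hand stuck out toward the TV is measurably closer. The function name and the 0.3 m threshold below are my own illustrative assumptions, not Canesta’s actual algorithm.

```python
# Toy depth-threshold segmentation (assumptions, for illustration only):
# pixels markedly closer than the body are candidate "outstretched hand"
# pixels; a hand printed on a T-shirt stays at body depth and is ignored.
import numpy as np

def find_outstretched_hand(depth_map, body_distance_m):
    """depth_map: 2-D array of per-pixel distances in metres.
    Returns a boolean mask of pixels at least 0.3 m nearer than the body."""
    return depth_map < (body_distance_m - 0.3)

# toy example: a 4x4 scene with the body at 2.0 m and a hand at 1.5 m
scene = np.full((4, 4), 2.0)
scene[1:3, 1:3] = 1.5
print(find_outstretched_hand(scene, 2.0))
```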
The gestures are simple and culturally sensitive. GestureTek’s software makes it easy for users to control the TV with their movements, and alternate methods of controlling the TV remain available.
A user of the new Hitachi TV set can call up the control bar with just a wave of the hand; from there the vocabulary is small (a rough sketch of such a gesture-to-command mapping in code follows the list):
- Spin the wrist – activate the scroll wheel
- Swipe left or right – browse options
- Two hands – switch to a different function
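For those who like to see these things spelled out, here is a minimal sketch of what such a gesture-to-command mapping might look like in software. The gesture labels and TV actions are my own assumptions; Hitachi and GestureTek have not published their actual pipeline.

```python
# Hypothetical gesture-to-command dispatch (labels and actions assumed).
GESTURE_ACTIONS = {
    "wave": "show_control_bar",
    "wrist_spin": "activate_scroll_wheel",
    "swipe_left": "previous_option",
    "swipe_right": "next_option",
    "two_hands": "switch_function",
}

def dispatch(gesture):
    """Translate a recognized gesture label into a TV command."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

if __name__ == "__main__":
    for g in ["wave", "swipe_left", "shrug"]:
        print(g, "->", dispatch(g))
```

The nice thing about such a small vocabulary is that the recognition layer only has to emit a handful of discrete symbols; the interaction design decides what each symbol means.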
As you can see in this next video, they created a wonderful GUI, an interface to go with the gestures. You are not left gesturing in thin air; no, you get good on-screen feedback about your gestures. This greatly resembles the old PlayStation EyeToy (see here), also made by GestureTek.
A computer-vision-based hand gesture recognition system that replaces the mouse with simple hand movements, developed at the School of Computing, Dublin City University, Ireland.
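To give a feel for how this kind of system works, here is a toy sketch in Python with OpenCV: track the largest skin-coloured blob seen by the webcam and drag the mouse pointer along with it. This is emphatically not the DCU system, just the general idea; the colour thresholds are rough assumptions that would need calibration, and the cursor control uses the third-party pyautogui package.

```python
# Toy hand-as-mouse sketch (OpenCV 4.x), NOT the DCU system:
# segment skin-coloured pixels, take the largest contour's centroid,
# and map it to screen coordinates.
import cv2
import numpy as np
import pyautogui  # third-party package that moves the cursor

SCREEN_W, SCREEN_H = pyautogui.size()
LOWER = np.array((0, 40, 60), dtype=np.uint8)     # crude HSV skin range
UPPER = np.array((25, 180, 255), dtype=np.uint8)  # (assumed, needs tuning)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)
        m = cv2.moments(hand)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            h, w = mask.shape
            # map image coordinates to screen coordinates
            pyautogui.moveTo(cx / w * SCREEN_W, cy / h * SCREEN_H)
    cv2.imshow("mask", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```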
Sometimes the future of gesture recognition becomes clearer by examining an application that will definitely NOT hit the market running. Why on earth would anyone prefer to wave their hands in the air and click on empty space with an index finger instead of feeling a solid mouse underneath their hand? I just don’t get it. If it’s supposed to be a technology showcase, then okay, they managed to get something up and running, bravo!
I think that, generally speaking, people are enthusiastic about human-computer interaction if it feels good, because it’s usable (effective, efficient, economic), pleasing to the senses, or in some other way beneficial to their concerns. I imagine this virtual ‘mousing’ is none of the above. Maybe if they changed it to a pistol gesture, where you shoot with your thumb, it would get slightly better. But I would have to be able to launch a quick barrage of shots, say 4 or 5 per second, for this to be of any use in a first-person shooter game. There’s a nice challenge for you, guys 🙂
I noticed a flurry of gesture patents that mentioned a ‘portable multifunction device’. That’s patentspeak for iPhone. The patents were all from Apple Inc. Well done, Apple. That’s how you manage a patent portfolio. Philips and IBM used to be the masters of this line of completely covering an area with a barrage of patents. It will give Apple something to negotiate with in future business deals with other vendors.
Here they all are as far as I could tell:
- PORTABLE MULTIFUNCTION DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR INTERPRETING A FINGER GESTURE ON A TOUCH SCREEN DISPLAY (WO 2008/086302)
- PORTABLE ELECTRONIC DEVICE SUPPORTING APPLICATION SWITCHING (WO 2008/086298)
- SYSTEM, METHOD, AND GRAPHICAL USER INTERFACE FOR INPUTTING DATE AND TIME INFORMATION ON A PORTABLE MULTIFUNCTION DEVICE (WO 2008/086073)
- APPLICATION PROGRAMMING INTERFACES FOR GESTURE OPERATIONS (WO 2008/085848)
- MULTI-TOUCH GESTURE DICTIONARY (WO 2008/085784)
- GESTURE LEARNING (WO 2008/085783)
- PORTABLE MULTIFUNCTION DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR INTERPRETING A FINGER SWIPE GESTURE (WO 2008/085770)
- PORTABLE ELECTRONIC DEVICE, METHOD AND GRAPHICAL USER INTERFACE FOR DISPLAYING INLINE MULTIMEDIA CONTENT (WO 2008/085747)
- PORTABLE MULTIFUNCTION DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR TRANSLATING DISPLAYED CONTENT (WO 2008/085744)
- OVERRIDE OF AUTOMATIC PORTRAIT-LANDSCAPE ROTATION FOR A PORTABLE MULTIFUNCTION DEVICE WITH ACCELEROMETERS (WO 2008/085741)
- METHOD, SYSTEM, AND GRAPHICAL USER INTERFACE FOR VIEWING MULTIPLE APPLICATION WINDOWS (WO 2008/085739)
- METHOD, SYSTEM, AND GRAPHICAL USER INTERFACE FOR PROVIDING WORD RECOMMENDATIONS (WO 2008/085737)
- Somewhat earlier this year: DELETION GESTURES ON A PORTABLE MULTIFUNCTION DEVICE (WO 2008/030975)
- SOFT KEYBOARD DISPLAY FOR A PORTABLE MULTIFUNCTION DEVICE (WO 2008/030974)
- PORTABLE ELECTRONIC DEVICE PERFORMING SIMILAR OPERATIONS FOR DIFFERENT GESTURES (WO 2008/030972)
- EMAIL CLIENT FOR A PORTABLE MULTIFUNCTION DEVICE (WO 2008/030970)
- PORTABLE ELECTRONIC DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR DISPLAYING STRUCTURED ELECTRONIC DOCUMENTS (WO 2008/030879)
- PORTABLE MULTIFUNCTION DEVICE, METHOD, AND GRAPHICAL USER INTERFACE FOR CONFIGURING AND DISPLAYING WIDGETS (WO 2008/030875)
- PORTABLE ELECTRONIC DEVICE FOR PHOTO MANAGEMENT (WO 2008/030779)
- PORTABLE ELECTRONIC DEVICE FOR INSTANT MESSAGING (WO 2008/030776)
- 2007: UNLOCKING A DEVICE BY PERFORMING GESTURES ON AN UNLOCK IMAGE (WO 2007/076210)
- 2006: GESTURES FOR TOUCH SENSITIVE INPUT DEVICES (WO 2006/020305)
Who will be able to argue with this patent portfolio? Who will be able to claim that the things Apple has patented were already invented elsewhere? Who will be able to maintain that gestures are not technical inventions but natural human communicative actions? Who will pay the lawyers to fight these fights?
Here it all is in a fashion that is easier to digest than sifting through 22 patents.
I think Apple has won this fight before it could even get started.
Control a Beamed PowerPoint Presentation with Gestures
These students appear to have created a gesture-based application that we also considered about four years ago. I know IBM and Philips were interested in this sort of application. So, well done, guys! And an excellent presentation too. I think they made the best of it, given a difficult application.
Why is a presentation system a difficult application? Well, someone who is presenting will usually gesture while talking. Those gestures are directed at the audience, not at the presentation software, so the first task of such a system is to discriminate between them: what is meant for me and what is meant for the audience? Furthermore, a presenter may also fidget during his talk, which shouldn’t be interpreted as a gesture either. Unfortunately, it is unclear whether these students addressed these issues.
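One simple way to attack that discrimination problem, and I stress that this is my own sketch rather than anything the students built, is to make the presenter ‘arm’ the system first: a movement only counts as a command if the hand has dwelled in a designated command zone for a moment. The zone and the dwell time below are invented for illustration.

```python
# Hedged sketch of a dwell-to-arm gate for presenter gestures.
import time

DWELL_SECONDS = 0.8                  # hand must rest this long (assumed)
COMMAND_ZONE = (0.8, 0.0, 1.0, 0.3)  # normalized x1, y1, x2, y2 (assumed)

class GestureGate:
    def __init__(self):
        self.entered_at = None

    def _in_zone(self, x, y):
        x1, y1, x2, y2 = COMMAND_ZONE
        return x1 <= x <= x2 and y1 <= y <= y2

    def update(self, x, y, now=None):
        """Feed normalized hand positions; returns True once the hand has
        dwelled long enough, arming the system to treat the next movement
        as a command instead of ordinary gesturing at the audience."""
        now = time.monotonic() if now is None else now
        if not self._in_zone(x, y):
            self.entered_at = None
            return False
        if self.entered_at is None:
            self.entered_at = now
        return now - self.entered_at >= DWELL_SECONDS
```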
The things they did do seem well designed. I like the calibration they came up with: it creates a connection between the user’s physical environment and the camera he must address; it grounds the interaction. The subsequent examples of the functionality they built in are less impressive. The forward-back commands are okay, but the drawing and highlighting are not very valuable in my opinion: people in the audience can already see that you are pointing at something, so there is little need to do more. But maybe these are first steps that need a bit more maturity in their interaction design to become useful.
On the whole, excellent work.
Ninja Strike, a killer application for gesture recognition?
This is certainly an interesting development. Previously we have seen mobile phones using motion and acceleration sensors for gesture control (see here and here). There have also been applications where the camera was used simply to capture optical flow: something in front of the camera is moving or turning in direction A, therefore the phone must be moving or turning in direction A + 180 degrees (here). In this case the gesture recognition appears to go a step further: at least the hand appears to be extracted from the image. Or does it simply assume all movement is the hand? And is the position of the motion then perhaps categorized into left-middle-right? Maybe the velocity is calculated as well, but I don’t think so.
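For reference, the optical-flow trick mentioned above is easy to sketch, and so is the left-middle-right guess. This is a generic illustration using OpenCV’s Farneback dense flow, not eyeSight’s actual code.

```python
# Generic optical-flow sketch (not eyeSight's code): the scene appears
# to move opposite to the camera, so inverting the mean flow gives the
# phone's own motion; summing flow energy per column guesses where the
# movement happened (left, middle or right).
import cv2
import numpy as np

def phone_motion(prev_gray, curr_gray):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    dx, dy = flow[..., 0].mean(), flow[..., 1].mean()
    return -dx, -dy  # direction A observed => camera moved A + 180 degrees

def motion_zone(prev_gray, curr_gray):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    energy = np.linalg.norm(flow, axis=2).sum(axis=0)  # motion per column
    third = energy.size // 3
    zones = [energy[:third].sum(), energy[third:2 * third].sum(),
             energy[2 * third:].sum()]
    return ["left", "middle", "right"][int(np.argmax(zones))]
```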
Update: I do like the setup of how people can hold their phone with the camera in one hand, throw with the other and check their virtual throw on the display. The virtual throwing hand on the display is more or less in the same position as your physical hand, which I think is nice.
EyeSight is a techno start-up, founded in 2004 in the Kingdom of Heaven (Tel Aviv), aspiring to use nothing but Air and a Camera to achieve a divine interaction between true techno-believers and their mobile phones. They prophesy that their technology will ‘offer users, including those who are less technologically-adept, a natural and intuitive way to input data, play games and use their mobile phone for new applications’. Heaven on Earth. Mind you, nothing is carved in stone these days. Besides, human nature and intuition are all too often deified anyway. Human nature is what usually gets us into trouble (not least in the Middle East).
Anyway, one of their angels called Amnon came to me in the night bearing the following message:
Hello Jeroen,
First Allow me to introduce myself. I’m Amnon Shenfeld, RND projects manager for eyeSight Mobile Technologies.
I’ve been following (and enjoying) your BLOG reports for a while, and I thought that the following news from my company, eyeSight Mobile Technologies, may make for an interesting post.
eyeSight has just launched “Ninja Strike”, an innovative mobile game featuring a unique touch-free user interface technology we call eyePlay™. Allow me to provide some background information about eyeSight, eyePlay and Ninja Strike: I’m sure you are aware of the popularity and attention innovative user interfaces are getting since the introduction of Apple’s iPhone and Nintendo’s Wii… My company’s vision is to bring this technology into the mobile market, and our first products are focused on changing the way mobile gamers play. Our new game, “Ninja Strike”, does exactly this.
You play a ninja warrior with Ninja Stars as your primary weapon. Your stars are thrown by making a throwing motion in front of the phone’s camera. Much like training in real life, during the game you will learn how to throw your weapon correctly, and improve your aim. Your enemies, the evil Kurai ninjas, will also gain strength as the game advances…
Looking forward to hear from you, I hope to see a new post in your blog soon, you’ve been quiet for a while… J
Amnon Shenfeld
Amnon, will you heed my calls? Have you answers to my burning questions above?
Here is another aspiring wannabe HCI star in the gesture firmament: the Gesture Watch.
Activate! The Gesture Watch has five infrared sensors, four of which sense any hand motion that occurs above the watch. If the user is wearing the watch on his left hand, he can move his right hand over the watch in an up or down, left or right, or circular motion. Different combinations of these movements communicate an action to the watch. (source)
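Here is a speculative sketch of how four sensors laid out around a watch face might decode such motions: the order in which the sensors fire gives the direction of the hand. This is my own guess at the logic, not the inventors’ actual firmware.

```python
# Speculative decoding of swipe direction from sensor firing order
# (sensor layout and gesture names are assumptions).
SENSOR_ORDER_TO_GESTURE = {
    ("W", "E"): "swipe_right",
    ("E", "W"): "swipe_left",
    ("N", "S"): "swipe_down",
    ("S", "N"): "swipe_up",
}

def decode(activations):
    """activations: chronologically ordered IDs of the sensors that saw
    the hand, e.g. ["W", "E"] for a left-to-right swipe."""
    if len(set(activations)) == 4:   # all four fired: treat as circular
        return "circle"
    return SENSOR_ORDER_TO_GESTURE.get(tuple(activations[:2]), "unknown")
```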
Why do such applications receive so much credit in the various tech news sites and magazines? The only thing happening is that a couple of engineers have put together a neat device that can do a trick. It’s not commercially available, there are no real users yet, there is no positive market feedback. There is only a vague promise of solving a vague problem.
Discovery Channel: It won’t be long now before all electronic devices go “nano,” and shrink to the size of a frosted mini-wheat square. You won’t know whether to turn it on or eat it. But the real question is: How do you press those teeny buttons?
I know that writing an opening line can be hard, but this one has fallen straight from the sky onto the willing imagination of Tracy Staedter (the reporter in question). Did she not notice the big display on the iPhone? People may not want tiny devices at all, because they need displays. And yes, they may also require decent buttons on their devices. In other words, the premises of the promises are promiscuous (sorry, couldn’t resist); reporters are trading in their objective reflection for a nice soundbite.
Two guys in Australia have given humanity the ultimate killer application for gesture recognition:
Just the thing we needed, really. I am going to throw my remote away as soon as I can get this little gem of technology: something that solves the giant problems we are having with TV remote controls (and replaces them with a whole new set of problems).
I think about twenty problem scenarios popped up simultaneously in my head fighting for priority. But I am just too lazy to type them all in. Instead I will just shrug this one off and save myself the calories.