
Category: Robots

Human-Robot Interaction, Social Robotics

Robo ONE competition

Day one of the ROBO-ONE 14 competition in Yokohama featured champion OmniZero.7 doing his usual outstanding job of surprising and delighting both the judges and the crowds. For more information visit Robots Dreams at http://www.robots-dreams.com


ROBO ONE is a competition between robots with a jury. Kind of a beauty pageant, I guess.

Asimo dancing

Here, four Asimo robots perform a really nice choreography. Quite entertaining, though not because of how they interact with humans. It is entertaining to see how someone managed to build a robot with the right movement parameters and then program it to dance this way. One could also admire the aesthetics of the movements, or of the synchronization.

Actroid DER2

Here is a robot that appears to be quite social. But is it just a script that she/it is going through, or does she also interact with people? Perhaps this is a typical example of a robot that is made to ‘appear human’ in what she does, but not in how she observes other humans and interacts with them.

Odd Robots

I am trying to put together a plan to work on social robotics. If I look at the following collection of odd robots I suddenly get a sense of urgency. The only reason for most of these robots is that they are entertaining.

Nadia Magnenat-Thalmann at the FG2008

One of the more interesting lectures at the FG2008 conference was a keynote speech delivered by Nadia Magnenat-Thalmann, director of the MIRALab in Geneva. She talked about Communicating with a Virtual Human or a Robot that has Emotions, Memory and Personality. She went far beyond the simplistic notion of expressing ‘the six basic emotions’ and talked about how mood, personality and relationships may affect our facial expressions.

Example of MIRALab's facial expression techniques
The talk by Magnenat-Thalmann focused on facial expression. (source)

By coincidence I got an invitation to write a paper for another conference, the Conference on Computer Animation and Social Agents (CASA 2009), organized by Anton Nijholt, Nadia Magnenat-Thalmann, and others. It is run by people from the University of Twente but held in Amsterdam. Call for papers: deadline February 2009.

Nadia also mentioned a researcher at Utrecht University called Arjan Egges. He got his PhD at the MIRALab and is now working on “the integration of motion capture animation with navigation and object manipulation”.

Wii-Gesture Control for Robots

Here is a guy called roschler demonstrating how to (learn to) control an i-Sobot, the world’s smallest humanoid robot, with Wii gestures instead of a complicated remote control. Because the robot’s routines are mostly gestures, you can create commands for them through imitation or iconicity. Actions will probably often have to be simplified, since I do not expect people to want to do an actual somersault to tell the robot to do one. Also, for certain actions or scripts I imagine the gestures will become arbitrary and not really intuitive, but probably still easier to use than the alternative RC.


Robot control, another good niche for gesture recognition?

i-SOBOT by ThinkGeek: World’s Smallest Fully Articulated Humanoid Robot
Buy an i-SOBOT at Amazon for about $250
Watch a fight: i-SOBOT vs. Godzilla
More about Robodance, a.k.a. Robosapiens Dance Machine, which can also be used to control WowWee Robots.
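The idea described above, mapping a simplified, iconic Wii gesture onto a whole robot routine, can be sketched in a few lines. Everything here is invented for illustration: the axis thresholds, the gesture labels, and the command names are not from roschler’s software or the i-SOBOT’s actual protocol.

```python
# Hypothetical sketch: turning Wii-remote accelerometer readings into
# i-SOBOT-style routine commands. Thresholds, labels, and commands are
# all invented for illustration, not the real i-SOBOT interface.

def classify_gesture(samples):
    """Classify a short window of (x, y, z) accelerometer samples
    into a coarse gesture label using simple peak thresholds."""
    peak_x = max(abs(s[0]) for s in samples)
    peak_y = max(abs(s[1]) for s in samples)
    peak_z = max(abs(s[2]) for s in samples)
    if peak_z > 2.0 and peak_z >= peak_x and peak_z >= peak_y:
        return "chop"    # sharp downward swing
    if peak_x > 1.5 and peak_x >= peak_y:
        return "swipe"   # sideways flick
    if peak_y > 1.5:
        return "lift"    # upward motion
    return "rest"

# Iconic mapping: the gesture resembles the routine it triggers,
# simplified so the user never performs the full action themselves.
GESTURE_TO_COMMAND = {
    "chop": "KARATE_CHOP",
    "swipe": "WAVE",
    "lift": "STAND_UP",
    "rest": None,        # no command sent
}

def command_for(samples):
    return GESTURE_TO_COMMAND[classify_gesture(samples)]
```

A sideways flick would then trigger the robot’s full wave routine, which is exactly the kind of simplification discussed above: the user’s gesture only hints at the action, and the robot fills in the rest.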

The ‘gezellige’ robot

There is a robot that I have fallen in love with. I never saw him but only read a story about him in a newspaper. That leaves me free to project my hopes and desires onto this unwitting machine. His name is Hall Object and, as a robot, it has no practical use whatsoever. Or does it? It makes an impressive but otherwise dull hall of a building more gezellig. My colleague Elif was reminded of a rabbit called Nabaztag. But while the makers of the world’s first artificial, smart rabbit are doing everything they can to make sure that your Nabaztag is functional as well as cute, Hall Object’s sole purpose is to be in a good or bad mood and react (or not) to other people in the hall:

[Hall Object] can decide to be in a certain mood and act accordingly. When it picks up signals through its sensors – from people passing by, for instance – it can come toward you, showing affection, or it can turn away or ignore you and keep to himself.

Does he socialize more easily than me? (source)

Since the 26th of October, Hall Object has lived in the hall of the NPS/VARA building on the Mediapark in Hilversum. It is a work of art by Studio Job. I think it illustrates the only real function robots and other AI gadgets have at the moment: a social function. We find them funny, amazing, or cuddly. We project emotions onto them, or even attitudes or intentions. And Hall Object is the perfect object to project things onto, because he is blank. An empty thing, doing just enough to be noticed, and leaving spectators free to see and think what they want.

Let us link this to Robot Asimo, who applies gesture technology for social functioning: if you wave at him, he waves back (see video). It is simple but effective. A little bit of acknowledgement of our human existence immediately sparks our imagination: “If it can see that I am here, it may have an attitude toward me. He may be watching me. He might react to what I will do. He may not like it? etc. etc.”

Asimo responds to several gestures (source Plyojump.com) and events he picks up:

  • Asimo follows a person, then stops when it hears a command and sees a hand gesture.
  • Asimo watches a person point to where it is supposed to go, confirms by speaking, and walks over.
  • Listening to two speakers, Asimo swivels its head to face the person who just spoke.
  • Encountering two moving people, Asimo stops walking to let them pass, then resumes walking.
  • Seeing two stationary people, Asimo walks around them to its destination.
  • When a person waves, Asimo waves back.
  • With two people speaking, Asimo only listens to the one it recognizes.
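The demoed behaviors listed above amount to a small perception-action policy: a perceived event triggers one canned response. As a minimal sketch (the event and action names are my own labels for the bullets above, not Honda’s software):

```python
# Hypothetical sketch of Asimo's demoed perception-action pairs as a
# simple dispatch table. Event and action names are invented labels
# for the behaviors listed above, not Honda's actual API.

RESPONSES = {
    "voice_command_and_hand_gesture": "stop_following",
    "pointing_gesture": "confirm_and_walk_to_target",
    "speaker_changed": "turn_head_to_speaker",
    "people_moving_across_path": "pause_until_clear",
    "people_standing_in_path": "walk_around",
    "person_waves": "wave_back",
    "unrecognized_speaker": "ignore",
}

def respond(event):
    # Anything outside the demo repertoire gets no reaction at all.
    return RESPONSES.get(event, "no_action")
```

The table makes the social point visible: a handful of recognized events is enough to give spectators the impression of being seen and reacted to.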

In my opinion, this is the only viable application of gesture recognition technology I can foresee for the near future, apart from some niche applications and motion sensing in gaming. If a robot catches my gestures and my speech (or even my emotions) it can start to live in the same world as I do. I will no longer have to sit down and enter his realm.

Frankly, now I am in doubt. Should I visit Hall Object or stay away? I live in Hilversum, so he is only a short bike ride away. But it seems I can only lose from this encounter. Will my wonderful illusions of a gezellige robot survive the confrontation with an actual machine, with the many flaws it will inevitably display upon close inspection? I’ll keep you posted…

The Evolutionary Edge of Imitation

Quite a few scientists and psychologists are excited by the discovery of mirror neurons. What is a mirror neuron? A mirror neuron fires both when you perform an action and when you observe the same action performed by another. The neuron “mirrors” the behavior of others, as though you were acting yourself. Why do we and other apes have this mirror system? It is speculated that we use it to understand the actions of other people, and to learn new skills by imitation.

Do you want to feel your mirror neurons at work in a game of mind reading? Try guessing Rooney’s intentions over here.

But when it comes to imitation one wonders who benefits most from it? Animal or Man? It seems that at least one gesturing cat makes the most of it.


Will Garfield benefit from his gestural abilities in an evolutionary sense as well?
Will Jon be able to use his mirror neurons to understand Garfield’s intentions better next time? (source)

And then again it may not be a matter of out-evolving other animal species. We may have to go up against the machines one day. At Honda (maker of Asimo) and ATR they are equipping robots with abilities to read minds and imitate gestures.

Victory for the machines? (source)

At least for the machine it is clear how it accomplishes the task. It scans your brain with MRI. That brings us back to humans and their mirror neurons. How does it work? We do not scan other people’s brains. We merely have our eyes.

I believe that we see what we want to see as much as what is actually shown. We do not read minds but project our own minds onto others. So do our mirror neurons inform the visual system and the rest of the brain (and body?) what to see? Or does my visual system communicate directly with some unknown human-motion-perception bits and pieces? Pieces that are as much about perception as they are about motor production?

It would be very interesting to see what happens to firing mirror neurons in cases of misjudged intentions. Suppose we think we see someone about to hit another man, whereas he was actually just scratching his armpit (for want of a nicer example). Would the right mirror neurons fire, because it is but the low-level motor programs associated with the actual postures and movements that are mirrored? Or would the wrong mirror neurons fire, because they are under the control of our higher ‘mind projecting’ powers?

I thought I saw a terrorist
Acting suspiciously
So my neurons fired first
Triggered unhappily

Asimo Gestures

Asimo was in the news again (in Manila), and I thought an update would be nice.

“howwa YOU dowin?” (can Asimo look a girl up and down?)

The little guy is currently equipped with some gesture recognition (he can wave back) and other computer vision capabilities. He has always had a bunch of synthesized gesture routines in his repertoire. When they made him speak they made him do co-speech gestures. I guess otherwise there would be nothing to watch, since he has no lips to move nor eyebrows to raise.

Asimo is making a name for himself on stages everywhere. Rumour even has it he is getting pretentious and is plotting a coup against Mickey Mouse.
