The British are doing it again. Leading the world to a better place. This time it concerns books, or stories might be a better word, in sign language, BSL to be exact. I wonder why they keep calling them books? Although my hearing kids sometimes listen to ‘Spoken Books’ on CD, hmm. Thanks to Gavin Howard for the link.
MyBSLbooks: Welcome to myBSLbooks.com – the World’s first free online library of signed books. We are delighted to share with you a range of popular children’s books, available for the first time in British Sign Language. This site offers D/deaf children, their families and schools wider access to their favourite stories in the preferred language of the Deaf Community.
Well, the site only contains about eight DVDs so far. And it is hardly a library, since it doesn’t cover any books published by anyone else, and I don’t know whether lending instead of buying is an option. The site is copyrighted by Lexicon/Signstream, so I guess they somehow own it.
Come to think of it, the Dutch site Vi-Taal – De Gebarenwinkel has had a similar offering out there for years, and also offers a lot of other sign language goodies. And the Nederlands Gebarencentrum has a few DVDs as well. But well done all the same, you wonderful Britons.
Arend Harteveld died at the age of 50 on Sunday 7 September 2008. Much too soon and entirely unexpectedly, he was struck down by a failure of the blood circulation. Arend was a good man and a well-respected colleague at Delft University of Technology. My thoughts go out to his family, especially to his mother, who lived with him and whom he was taking care of.
Arend contributed to much of the research on which I hope to base my thesis, and these last years would not have been the same without him. He bore quite a burden in providing, more or less on his own, support for many courses and many labs, a burden he used to share with three support colleagues who all left as a result of reorganisations. Meanwhile, his main interest was to work on research projects himself, and I found his contributions, both in creating software for experiments and data analysis and in discussing the design of the experiments, very valuable. Arend always quickly grasped the ideas behind experiments and had a knack for pointing out flaws in the experimental design.
Arend also maintained a website that shows some of his technical prowess; it is now kept up by one of his radio amateur friends. On it, Arend writes about radio and measuring equipment and about chirps, subjects on which he also occasionally gave lectures. From personal experience I know that if a subject gripped him he wouldn’t rest until he understood it fully, as happened during our collaboration, for example with capturing response times on a laptop. He tried out the several clocks of the PC and its processor and experimentally tested the delays and the variance in those delays. As a radio amateur, a passion he picked up in his teens, he was known as PA1ARE. And now, as his brother-in-law said during the farewell ceremony: “PA1ARE is voorgoed uit de lucht” (“PA1ARE is off the air for good”).
Ruud de Wild, the barely disguised saviour of our ears?
Ruud de Wild is a DJ (and uomo universale) who hosts prime-time shows on Dutch radio. He switched to Q-Music in 2007 and this is one of their recent ads. Some people are complaining now, but it is still running. There is a billboard poster too:
In the news, the Brabants Dagblad writes (translated from Dutch): “On posters and in TV spots, De Wild is depicted in a way reminiscent of a Christ figure. His T-shirt shows a heart surrounded by rays, just as Christ is sometimes portrayed.”
It took me a few moments to dig up this image of Christ that looks very similar:
The group of MobileASL researchers at the University of Washington features in a local news bulletin. They have been working for a few years now on efficiently transmitting ASL video over a channel with limited bandwidth. The idea is to enable mobile videophony, which has been the holy grail of mobile applications for quite some time already.
Personally, I am not convinced that specific technology for the transmission of sign language video will really have an impact. Here are a few reasons. Bandwidth will increase anyway, with costs going down. Processing capacity in phones will increase. Videophony is an application that is desirable for many, not just signers. In other words, there is already a drive towards videophony that will meet the requirements for signing. Furthermore, I am not sure which requirements are specific to sign language. People talk and gesture too, and I imagine they would want that to come across in videophony as well. Finally, signers can and do adjust their signing to, for example, webcams. Does the technology address a real problem?
“The team tried different ways to get comprehensible sign language on low-resolution video. They discovered that the most important part of the image to transmit in high resolution is around the face. This is not surprising, since eye-tracking studies have already shown that people spend the most time looking at a person’s face while they are signing.”
Would this not be true for any conversation between people?
On the positive side: perhaps this initiative for signers will pay off for everyone. It wouldn’t be the first time that designs for people with specific challenges actually addressed problems everyone had to some degree.
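For the technically curious, here is a minimal sketch of the region-of-interest idea from the quote above. It is emphatically not MobileASL’s actual codec, which as far as I know works inside a proper video encoder; it merely illustrates spending more bits on the face region than on the rest of the frame, using OpenCV’s stock face detector. The quality settings and the idea of sending the face as separate high-quality patches are my own assumptions for illustration.

```python
import cv2

# Stock Haar cascade face detector that ships with OpenCV.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def encode_frame(frame, face_quality=90, background_quality=20):
    """Encode one BGR frame: a cheap JPEG of the whole frame, plus
    high-quality JPEG patches for every detected face region."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Heavily compressed background: the entire frame at low quality.
    _, background = cv2.imencode(
        ".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, background_quality])

    # High-quality crops around each face, kept with their coordinates so a
    # receiver could paste them back over the cheap background.
    face_patches = []
    for (x, y, w, h) in faces:
        _, patch = cv2.imencode(
            ".jpg", frame[y:y + h, x:x + w],
            [cv2.IMWRITE_JPEG_QUALITY, face_quality])
        face_patches.append(((int(x), int(y), int(w), int(h)), patch))

    return background, face_patches
```

Note that nothing in this sketch is specific to sign language: the same trick would help any low-bandwidth video call where the face matters most, which is precisely my point above.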
Hello my dear readers. Perhaps you missed it, but this website was down for almost two weeks. There were some technical difficulties and in the end I more or less started anew, with a less than perfect backup of the content. So, it is possible that pages are missing or that links malfunction. If you would be so kind as to report such problems, it would be highly appreciated. At least the website has been upgraded with new blogging software and a new look.
I noticed a flurry of gesture patents that mention a ‘portable multifunction device’. That’s patentspeak for iPhone. The patents were all from Apple Inc. Well done, Apple. That’s how you manage a patent portfolio. Philips and IBM used to be the masters of this approach of completely covering an area with a barrage of patents. It will give Apple something to negotiate with in future business deals with other vendors.
Who will be able to argue with this patent portfolio? Who will be able to claim that the things Apple has patented were already invented elsewhere? Who will be able to maintain that gestures are not technical inventions but natural human communicative actions? Who will pay the lawyers to fight these fights?
Here it all is in a fashion that is easier to digest than sifting through 22 patents.
I think Apple has won this fight before it could even get started.
Although the story of David Healy’s flute gesture is getting a little moldy, it has generated enough discourse to deserve another mention here. The interesting thing about this flute gesture is how it is part of the history of the sectarian conflict in Northern Ireland. Sensitive Catholic Irish republicans will get inflamed over the gesture while others have no idea what the problem is.
These flute bands on Orangist marches are what the gesture refers to.
Get a glimpse of the triumphalist nature of these marches
By coincidence I am currently reading ‘The Irish War’ by Tony Geraghty. He sketches a long and messy conflict that has gone on for more than 300 years. It is clear that these marches are of an inflammatory nature, and therefore a gesture that refers to them is inflammatory as well. It is not just a merry band of flute-playing men. They celebrate Orangist Protestant dominance in Northern Ireland at the expense of the Catholic part of the population.
The conflict carried over to a Scottish football fixture called ‘the Old Firm’, between Rangers (Protestant) and Celtic (Catholic); see this nice historical overview by the BBC. Many Irish people moved to Scotland and brought the conflict with them. Paul Gascoigne made the mistake of making this gesture while he played for Rangers and paid a heavy fine of 20,000 pounds.
Paul Gascoigne made the same flute gesture during an Old Firm match (Picture: BBC News)
David Healy was not playing for the Rangers, in fact I don’t think he ever did, but he is known as a Rangers fan. He is from Northern Ireland and he plays in their national side. However, in this game Healy was playing for Fulham (an English club) in a friendly match against Celtic, which sets the context for the gesture. Healy was ‘provoked’ by the Celtic fans who knew his sympathies and chanted ‘where were you on The Twelfth’ (a reference to an important march on the twelfth of July). In response, he seems to have made this gesture somewhat jokingly. The strange thing is that he seems to be escaping the sort of fine Gascoigne got. Why is that? Was Gazza perceived as doing it to inflame Celtic supporters whereas Healy was just fooling around? I think many people will take it more seriously than that. As always happens with sportsmen making inappropriate gestures, Healy is now apologizing and his club is investigating. It wouldn’t surprise me if a fine came soon.
Update: I think an important difference between Healy and Gascoigne is that the latter played for the Rangers who were at that time trying to defuse a tense situation. Gascoigne’s gesture was hurting that effort.
Control a Beamed Powerpoint Presentation with Gestures
These students appear to have created a gesture-based application that we also considered about four years ago. I know IBM and Philips were interested in this sort of application. So, well done, guys! And an excellent presentation too. I think they made the best of it, given a difficult application.
Why is a presentation system a difficult application? Well, a presenter will usually gesture while talking. Those gestures are directed at the audience and not at the presentation software. So, the first task of such a system is to discriminate between those gestures: what is meant for me and what is meant for the audience? Furthermore, a presenter may also fidget during his talk, which shouldn’t be interpreted as a gesture. Unfortunately, it is unclear whether these students addressed these issues.
The things they did do seem to be designed well enough. I think I like the calibration they designed: it creates a connection between the user’s physical environment and the camera he must address. It grounds the interaction. The subsequent examples of the functionality they built in are less impressive. The forward and back commands are okay, but the drawing and highlighting are not very valuable in my opinion. People in the audience can see that you are pointing at something, so there is perhaps little need to do more. But maybe these are first steps that need a bit more maturity in their interaction design to become useful.
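To make the discrimination problem a bit more concrete, here is a hypothetical sketch, my own guess rather than the students’ design, of the simplest possible gate: only treat hand movements as commands once the tracked hand has dwelt inside a small calibrated corner of the camera image for a moment. The CommandGate name, the calibration box and the dwell time are all invented for illustration.

```python
import time

class CommandGate:
    """Feed in a hand position per frame; report True only while the hand has
    dwelt inside a calibrated 'command box' long enough to count as intentional."""

    def __init__(self, command_box, dwell_seconds=0.5):
        self.command_box = command_box      # (x0, y0, x1, y1) from calibration
        self.dwell_seconds = dwell_seconds  # how long the hand must stay put
        self._entered_at = None

    def update(self, hand_position):
        x, y = hand_position
        x0, y0, x1, y1 = self.command_box
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if not inside:
            # Hand is back in 'talking to the audience' territory.
            self._entered_at = None
            return False
        if self._entered_at is None:
            self._entered_at = time.monotonic()
        return time.monotonic() - self._entered_at >= self.dwell_seconds
```

Anything smarter, such as actually telling audience-directed gestures from fidgeting, would of course need more than a dwell timer, which is exactly why I call this a difficult application.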
Ninja Strike, a killer application for gesture recognition?
This is certainly an interesting development. Previously we have seen mobile phones using motion and acceleration sensors for gesture control (see here and here). There have also been applications where the camera was used simply to capture optical flow: something in front of the camera is moving or turning in direction A, therefore the phone is moving or turning in the opposite direction, A + 180 degrees (here). In this case the gesture recognition appears to go a step further, and at least the hand appears to be extracted from the image. Or does it simply assume all movement is the hand? And is the position of the motion then perhaps categorized into left-middle-right? Maybe the velocity is calculated as well, but I don’t think so.
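To make my guesswork explicit, here is a rough sketch of the kind of pipeline I am imagining; it is not eyeSight’s actual algorithm, about which I know nothing. It computes dense optical flow between two frames, treats the dominant motion as the throwing hand, and bins its horizontal position into left, middle or right. The motion threshold and the three equal bins are assumptions of mine.

```python
import cv2
import numpy as np

def classify_throw(prev_gray, curr_gray, motion_threshold=2.0):
    """Guess where a throwing motion happened: 'left', 'middle', 'right',
    or None if nothing moved enough between two grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    magnitude = np.linalg.norm(flow, axis=2)   # per-pixel motion strength
    moving = magnitude > motion_threshold
    if not moving.any():
        return None

    # Crude stand-in for 'hand extraction': the centroid of all moving pixels.
    ys, xs = np.nonzero(moving)
    centre_x = xs.mean() / curr_gray.shape[1]  # normalised to 0..1

    if centre_x < 1 / 3:
        return "left"
    if centre_x < 2 / 3:
        return "middle"
    return "right"
```

If the game really does no more than something like this, then "all movement is the hand" is indeed the working assumption, which would fit a phone’s limited processing budget.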
Update: I do like the setup of how people can hold their phone with the camera in one hand, throw with the other and check their virtual throw on the display. The virtual throwing hand on the display is more or less in the same position as your physical hand, which I think is nice.
EyeSight is a techno start-up, founded in 2004 in the Kingdom of Heaven (Tel Aviv), aspiring to use nothing but Air and a Camera to achieve a divine interaction between true techno-believers and their mobile phones. They prophesy that their technology will ‘offer users, including those who are less technologically-adept, a natural and intuitive way to input data, play games and use their mobile phone for new applications’. Heaven on Earth. Mind you, nothing is carved in stone these days. Besides, human nature and intuition are all too often deified these days anyway. Human nature is what usually gets us into trouble (not least in the Middle East).
Anyway, one of their angels called Amnon came to me in the night bearing the following message:
Hello Jeroen,
First, allow me to introduce myself. I’m Amnon Shenfeld, R&D projects manager for eyeSight Mobile Technologies.
I’ve been following (and enjoying) your BLOG reports for a while, and I thought that the following news from my company, eyeSight Mobile Technologies, may make for an interesting post.
eyeSight has just launched “Ninja Strike”, an innovative mobile game featuring a unique touch free user interface technology we call eyePlay™. Allow me to provide some background information about eyeSight, eyePlay and Ninja Strike: I’m sure you are aware of the popularity and attention innovative user interfaces have been getting since the introduction of Apple’s iPhone and Nintendo’s Wii… My company’s vision is to bring this technology into the mobile market, and our first products are focused on changing the way mobile gamers play. Our new game, “Ninja Strike”, does exactly this.
You play a ninja warrior with Ninja Stars as your primary weapon. Your stars are thrown by making a throwing motion in front of the phone’s camera. Much like training in real life, during the game you will learn how to throw your weapon correctly, and improve your aim. Your enemies, the evil Kurai ninjas, will also gain strength as the game advances…
Looking forward to hearing from you, I hope to see a new post in your blog soon, you’ve been quiet for a while… :)
Amnon Shenfeld
Amnon, will you heed my calls? Have you answers to my burning questions above?