Maybe you could apply hand gesture recognition to understanding sign language for people with hearing impairments. It would be interesting if the robot could understand and translate the language.
Yes, for people with hearing impairments or people who use wheelchairs, this idea could work, but the robot would need to be quite capable: it should understand a command from a small sign and then go and carry it out. Thanks, Nur, this could be an idea, but it is a little time-consuming.
In reinforcement learning you teach by rewarding; I mean rewards and punishments expressed as approval and disapproval. Some gestures could mean total disapproval and others could mean "acceptable", and the scenarios can be extended.
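Just to make the idea concrete, here is a minimal sketch of how recognized gestures could be mapped to scalar rewards inside a tabular Q-learning update. The gesture labels, reward values, and action names are illustrative assumptions, not a fixed scheme.

# Hypothetical sketch: human gestures as the reward signal for tabular Q-learning.
GESTURE_REWARD = {
    "thumbs_up": 1.0,      # full approval
    "ok_sign": 0.5,        # acceptable
    "thumbs_down": -1.0,   # total disapproval
}

def q_update(q_table, state, action, next_state, gesture,
             alpha=0.1, gamma=0.9, actions=("left", "right", "stop")):
    """One Q-learning step where the reward comes from a recognized gesture."""
    reward = GESTURE_REWARD.get(gesture, 0.0)  # unknown gesture -> neutral feedback
    best_next = max(q_table.get((next_state, a), 0.0) for a in actions)
    old = q_table.get((state, action), 0.0)
    q_table[(state, action)] = old + alpha * (reward + gamma * best_next - old)

The same mapping could be extended with more gesture classes or graded reward values as the interaction scenarios grow.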
For the pedagogical case you could check the works that Hatice Kose participated in. In some of her works they observed interactions between a humanoid robot and children. Her works include topics such as "Drumming with a Humanoid Robot: Results from Human-Robot Interaction Studies" and "Learning by imitation and implementation of sign language gestures by a humanoid robot".
Look at the product Gestigon makes. It works with a depth sensor. Put that on your robot, make sure it can sense at the range at which you intend to interact with people, and it will work. You can also consider vision-based solutions if range is an issue.
Thanks, Victor. Should the implementation then be in OpenCV, or are other languages or libraries supported for working with a depth sensor?
Suppose the robot can only understand gestures for numbers; do we then need a screen on the robot to show the number after recognizing the gesture, or should it do something specific for every number, for example?
Yes, OpenCV is a great free software solution that can accept both depth and regular camera images, and it provides lots of vision building blocks.
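As a starting point for the number-gesture case, here is a rough Python/OpenCV sketch that counts raised fingers in a regular camera frame using skin segmentation and convexity defects. The HSV range and depth threshold are assumptions you would need to tune for your lighting and camera.

# Rough sketch: count fingers from a webcam frame with OpenCV.
import cv2
import numpy as np

def count_fingers(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Approximate skin-color range in HSV; adjust for lighting and skin tone.
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)          # assume largest blob is the hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Each deep convexity defect roughly corresponds to a gap between two fingers.
    gaps = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] > 10000)
    return min(gaps + 1, 5)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("fingers:", count_fingers(frame))
cap.release()

A depth camera would mainly change the segmentation step (thresholding on distance instead of skin color); the contour and defect analysis stays essentially the same.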
As for showing feedback, a display interface would be useful, and it would also be handy for other visual feedback. Audio feedback, having the robot repeat the numbers, could also be considered, but it may be disruptive to communication if it interrupts the speaker.
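A small sketch of both options, drawing the recognized number on a display window and optionally speaking it. The pyttsx3 text-to-speech library is just one offline option I am assuming here; any speech output the robot already has would do.

# Hypothetical feedback sketch: visual overlay plus optional spoken number.
import cv2

def show_number(frame, number, speak=False):
    cv2.putText(frame, str(number), (30, 80),
                cv2.FONT_HERSHEY_SIMPLEX, 2.5, (0, 255, 0), 4)
    cv2.imshow("robot feedback", frame)
    cv2.waitKey(1)
    if speak:
        import pyttsx3                 # assumed TTS backend, swap as needed
        engine = pyttsx3.init()
        engine.say(str(number))
        engine.runAndWait()

Keeping the spoken feedback optional (or triggered only after the speaker pauses) would help avoid interrupting the person signing.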