Human-Robot Interaction through Behaviour-Based Modelling
2025-06-28 16:32:58 - Adil Khan
Project Area of Specialization: Robotics

Project Summary
Human-robot interaction through behaviour-based modelling is the study of interaction between humans and robots. In this project we build a robot that interacts with humans, using a Leap Motion sensor for gesture interaction and a camera for visual interaction. The robot has three modes:
1. Button Control mode
2. Gesture Control mode
3. Human Tracking Mode
The mode can be changed either with the buttons or by gestures. Interaction with the robot happens through a 7-inch capacitive touchscreen LCD, which also displays recognized faces and expressions.
The most important feature of the robot is that it can convert a deaf person's sign language into voice or text. We are also writing a research paper in this domain.
Project Objectives
• The robot can convert sign language to voice or text.
• It can interact with humans through gestures or facial expressions.
• It can track humans.
• It has 2-DOF arms to perform small tasks.
• It is a mobile robot.
• It performs face recognition and expression recognition.
• It can act as a personal assistant.
Project Implementation Method
First, the hardware is built from metal sheets; the base carries two high-power DC geared motors with tyres and two caster wheels. A Leap Motion sensor is used to recognize gestures, and we use Python to work with it. Two types of gestures are used:
1. Dynamic
2. Static
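As a minimal sketch of how the static/dynamic split above could be made, the snippet below classifies a window of palm positions (as would be sampled from Leap Motion frames) by average inter-frame displacement. The helper name and the threshold value are illustrative assumptions, not the project's actual code.

```python
# Classify a gesture window as static or dynamic from palm positions
# sampled over consecutive Leap Motion frames (one (x, y, z) per frame).
def classify_gesture(palm_positions, step_threshold=20.0):
    """Return 'static' if the palm barely moves, else 'dynamic'.
    step_threshold is in the same units as the positions (mm for Leap)."""
    if len(palm_positions) < 2:
        return "static"
    # Sum Euclidean displacement between consecutive frames.
    total = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(palm_positions, palm_positions[1:]):
        total += ((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z2 - z1) ** 2) ** 0.5
    mean_step = total / (len(palm_positions) - 1)
    return "dynamic" if mean_step > step_threshold else "static"
```

A dynamic gesture (e.g. a swipe) would then be matched against its trajectory, while a static gesture would be matched on hand pose alone.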
For sign language conversion, we collected a database from 50 persons and then applied machine learning algorithms to recognize the gestures of deaf people correctly.
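The recognition step can be sketched as a nearest-neighbour match of a live gesture feature vector against the labelled database; the feature layout and labels below are hypothetical stand-ins for whatever features the actual classifier uses.

```python
# 1-nearest-neighbour sign lookup: match a live feature vector against a
# labelled gesture database by squared Euclidean distance.
def nearest_sign(features, database):
    """database: list of (label, feature_vector) pairs. Returns the label
    of the closest stored gesture."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(database, key=lambda item: sq_dist(features, item[1]))[0]

# Toy database with made-up feature vectors for two signs.
sign_db = [("hello", [0.9, 0.1, 0.0]),
           ("thanks", [0.1, 0.8, 0.2])]
```

In practice one would train a proper classifier (e.g. an SVM) on the 50-person database, but the matching idea is the same.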
We use OpenCV to recognize faces and expressions. Face recognition is done with the LBPH algorithm, while expressions are recognized with the Fisherfaces algorithm. We built a database of the group members to recognize faces, while for expressions we use a standard dataset. A graphical interface written in Python shows all results on the touch LCD. Commands are sent to the motors over a Bluetooth module, and a Monster Moto shield driver runs the motors. Data fusion is applied in Python to combine all of these components.
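To make the LBPH step concrete, here is a standalone sketch of its core idea: each pixel gets an 8-bit Local Binary Pattern code from its 3x3 neighbourhood, and the codes are pooled into a histogram that serves as the face descriptor. OpenCV's `cv2.face.LBPHFaceRecognizer_create()` does this internally (plus grid-wise pooling and histogram comparison); this pure-Python version is only illustrative.

```python
# Local Binary Patterns: the descriptor underlying LBPH face recognition.
def lbp_code(img, r, c):
    """8-bit LBP code for interior pixel (r, c) of a 2-D grayscale
    image given as a list of lists of intensities."""
    center = img[r][c]
    neighbours = [img[r-1][c-1], img[r-1][c], img[r-1][c+1], img[r][c+1],
                  img[r+1][c+1], img[r+1][c], img[r+1][c-1], img[r][c-1]]
    code = 0
    for bit, value in enumerate(neighbours):
        if value >= center:          # neighbour at least as bright as center
            code |= 1 << bit         # set the corresponding bit
    return code

def lbp_histogram(img):
    """256-bin histogram of LBP codes over all interior pixels; two face
    images are compared by comparing these histograms."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    return hist
```

Fisherfaces, used for expressions, instead projects whole images onto discriminant directions (LDA over PCA), which is why it needs a labelled training dataset.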
Benefits of the Project
The project can provide various benefits to society. The robot can serve and interact with customers in restaurants, where the visual interface on the LCD is especially useful, and it can guide people in shopping malls. Our research opens a new way to help deaf people: the robot can recognize a deaf person's sign language and display what they are trying to say on the LCD. It also opens the door to further research in the domain of human-robot interaction, and it can serve as a personal assistant in homes.
Technical Details of Final Deliverable
The final deliverable supports complete interaction with the robot. Gesture interaction is useful both for deaf people and for hearing people who want to understand sign language. When a person comes in front of the robot, it detects the person and their expression, shows them on the LCD, and greets them accordingly. The person can then select different options on the LCD and the robot acts accordingly; a machine learning algorithm is used for every task. If the deaf-person option is selected, the robot converts the person's sign language into voice and also shows it on the LCD so that a hearing person can understand easily. If tracking mode is selected, the robot tracks the person and follows them. The person can also use buttons on a mobile phone or laptop to control the robot and make it perform simple tasks with its arms and base, and the robot can provide information from the internet. All of this runs through a user interface so that the interaction remains smooth, whether in a market or at home.
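The three-mode control flow described above can be sketched as a small dispatcher. The mode names follow the proposal; the per-mode handlers are placeholders standing in for the real button, gesture, and tracking pipelines.

```python
# Top-level mode dispatcher for the robot's three interaction modes.
MODES = ("button", "gesture", "tracking")

class RobotController:
    def __init__(self):
        self.mode = "button"  # assumed default mode on startup

    def set_mode(self, mode):
        # Mode changes come from the LCD buttons or a recognized gesture.
        if mode not in MODES:
            raise ValueError("unknown mode: %s" % mode)
        self.mode = mode

    def step(self, sensor_input):
        """Run one control cycle, routing the input to the handler for
        the current mode. The string outputs are placeholders for the
        real motor/speech commands."""
        if self.mode == "button":
            return "motor_cmd:" + sensor_input   # forward button command
        elif self.mode == "gesture":
            return "gesture:" + sensor_input     # classified Leap gesture
        else:
            return "track:" + sensor_input       # follow the detected human
```

Keeping mode selection in one place like this makes it easy to switch modes from either the touchscreen or a gesture, as the proposal requires.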
Final Deliverable of the Project: Hardware System
Core Industry: Others
Other Industries: Education, Medical, Food
Core Technology: Robotics
Other Technologies: Artificial Intelligence (AI), Augmented & Virtual Reality, Big Data
Sustainable Development Goals: Good Health and Well-Being for People; Quality Education; Industry, Innovation and Infrastructure

Required Resources

| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| Leap Motion sensor | Equipment | 1 | 12000 | 12000 |
| Microsoft LifeCam Cinema | Equipment | 1 | 7000 | 7000 |
| 7-inch touch LCD | Equipment | 1 | 6500 | 6500 |
| Robot body with arms design | Equipment | 1 | 20000 | 20000 |
| High-power DC gear motors | Equipment | 2 | 1000 | 2000 |
| Caster wheels | Equipment | 2 | 150 | 300 |
| Simple wheels | Equipment | 2 | 100 | 200 |
| Tiva C microcontroller | Equipment | 1 | 3400 | 3400 |
| Monster Moto shield | Equipment | 1 | 850 | 850 |
| Bluetooth module | Equipment | 1 | 450 | 450 |
| Dry batteries | Equipment | 2 | 900 | 1800 |
| Other expenses | Miscellaneous | 1 | 2000 | 2000 |
| Total (in Rs) | | | | 56500 |