Human Robot Interaction through Behavioural based Modelling


2025-06-28 16:32:58 - Adil Khan

Project Title

Human Robot Interaction through Behavioural based Modelling

Project Area of Specialization

Robotics

Project Summary

Human-robot interaction through behavioural-based modelling is the study of interaction between humans and robots. In this project we built a robot that interacts with humans, using a Leap Motion sensor for gesture interaction and a camera for visual interaction. The project has three modes:

1. Button Control mode

2. Gesture Control mode

3. Human Tracking Mode

The mode can be changed either with buttons or with gestures. Users interact with the robot through a 7-inch capacitive touch-screen LCD, which also displays recognized faces and expressions.

The most important feature of our robot is that it can convert the sign language of a deaf person into voice or text. We are also writing a research paper in this domain.

Project Objectives

• Convert sign language to voice or text.

• Interact with humans through gestures and facial expressions.

• Track humans.

• Perform small tasks with 2-DOF arms.

• Operate as a mobile robot.

• Perform face recognition and expression recognition.

• Serve as a personal assistant.

Project Implementation Method

First, the hardware body is built from metal sheets. The base has two high-power DC geared motors with tyres and two caster wheels. A Leap Motion sensor is used to recognize gestures, and we use Python to work with it. Two types of gestures are used:

1. Dynamic

2. Static
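The static/dynamic split above can be sketched as a simple check on palm motion across frames. This is a hypothetical illustration, not the project's actual code: it assumes the Leap Motion frames have already been reduced to a list of (x, y, z) palm positions, and the threshold value is made up.

```python
def classify_gesture(palm_positions, motion_threshold=20.0):
    """Label a gesture 'static' or 'dynamic' from palm positions.

    palm_positions: list of (x, y, z) tuples, one per frame (mm).
    motion_threshold: total path length (mm) above which the gesture
    counts as dynamic. The value is illustrative, not from the project.
    """
    path_length = 0.0
    for (x0, y0, z0), (x1, y1, z1) in zip(palm_positions, palm_positions[1:]):
        path_length += ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
    return "dynamic" if path_length > motion_threshold else "static"

# A hand held nearly still vs. a hand sweeping sideways:
still = [(0.0, 100.0, 0.0)] * 30
sweep = [(float(i * 5), 100.0, 0.0) for i in range(30)]
print(classify_gesture(still))  # static
print(classify_gesture(sweep))  # dynamic
```

A real implementation would pull palm positions from the Leap Motion SDK frame stream and tune the threshold per gesture vocabulary.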

For sign-language conversion we collected a dataset from 50 people and then applied machine-learning algorithms to recognize the gestures of deaf users correctly.
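The text does not name the specific algorithm, so as one illustration only, a nearest-neighbour classifier over fixed-length gesture feature vectors (e.g., flattened fingertip coordinates from the recorded dataset) could look like:

```python
import math

def nearest_neighbor_sign(sample, training_data):
    """Classify a gesture feature vector by its nearest labelled example.

    sample: list of floats (e.g., flattened fingertip coordinates).
    training_data: list of (feature_vector, sign_label) pairs built from
    the recorded dataset. Both the features and labels here are toy
    stand-ins, not the project's real data.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best_label, _ = min(
        ((label, dist(sample, vec)) for vec, label in training_data),
        key=lambda pair: pair[1],
    )
    return best_label

# Toy 2-D feature vectors standing in for real Leap Motion features:
train = [([0.0, 0.0], "hello"), ([10.0, 10.0], "thanks")]
print(nearest_neighbor_sign([1.0, 0.5], train))  # hello
```

In practice one would normalize features per hand size and compare several classifiers on the 50-person dataset before choosing one.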

We used OpenCV to recognize faces and expressions. Face recognition is done with the LBPH algorithm, while expressions are recognized with the Fisher Face algorithm. We built a dataset of group members to recognize faces correctly, while for expressions we use a standard dataset. A graphical interface written in Python shows all results on the touch LCD. Commands are sent to the motors over a Bluetooth module, and a Monster Moto Shield driver runs the motors. Data fusion is applied in Python to combine all of these components.
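The exact Bluetooth command protocol is not specified in the text; as a hedged sketch, the Python side might map high-level actions onto single-byte commands that the microcontroller decodes for the motor driver. The byte values and serial settings below are assumptions, not taken from the project.

```python
# Hypothetical single-character command protocol for the drive motors.
MOTOR_COMMANDS = {
    "forward": b"F",
    "backward": b"B",
    "left": b"L",
    "right": b"R",
    "stop": b"S",
}

def encode_command(action):
    """Translate a high-level action into the byte sent over Bluetooth."""
    try:
        return MOTOR_COMMANDS[action]
    except KeyError:
        raise ValueError(f"unknown action: {action!r}")

# On the robot these bytes would be written to the serial port bound to
# the Bluetooth module, e.g. with pyserial (port name is illustrative):
#   import serial
#   port = serial.Serial("/dev/rfcomm0", 9600)
#   port.write(encode_command("forward"))
print(encode_command("forward"))  # b'F'
```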

Benefits of the Project

The project can provide various benefits to society. The robot can serve and interact with customers in restaurants, where its visual interface on the LCD is especially useful, and it can guide people in shopping malls. Our research opens a new way to help deaf people: the robot can recognize their sign language and show on the LCD what a deaf person is trying to say. It also opens the door to new research in the domain of human-robot interaction, and it can serve as a personal assistant in homes.

Technical Details of Final Deliverable

In the final deliverable, users will have complete interaction with the robot. Gesture interaction is especially useful for deaf people, and for hearing people who want to understand them. When a person comes in front of the robot, it detects the person and their expression, shows them on the LCD, and greets the person accordingly. The person can then select different options on the LCD, and the robot acts accordingly; a machine-learning algorithm is used for every task. If the deaf-person option is selected, the robot converts the person's sign language into voice and also shows it on the LCD so that a hearing person can understand easily. If tracking mode is selected, the robot tracks the human and follows behind them. The person can also use buttons on a mobile phone or laptop to control the robot and make it perform simple tasks with its arms and base, and it can provide information from the internet. All of this runs through a user interface, so that the person enjoys smooth interaction with the robot, whether in a market or at home.
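The three-mode behaviour described above amounts to a dispatcher: the LCD (or a gesture) picks a mode, and the corresponding subsystem runs. This is a minimal sketch with placeholder handlers standing in for the real subsystems.

```python
# Placeholder handlers; in the robot these would drive the real
# button, gesture, and tracking subsystems.
def button_control():
    return "running button control"

def gesture_control():
    return "running gesture control"

def human_tracking():
    return "running human tracking"

MODES = {
    1: button_control,   # 1. Button Control mode
    2: gesture_control,  # 2. Gesture Control mode
    3: human_tracking,   # 3. Human Tracking mode
}

def select_mode(choice):
    """Run the handler for the mode chosen on the LCD or via gesture."""
    handler = MODES.get(choice)
    if handler is None:
        raise ValueError(f"unknown mode: {choice}")
    return handler()

print(select_mode(3))  # running human tracking
```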

Final Deliverable of the Project: Hardware System
Core Industry: Others
Other Industries: Education, Medical, Food
Core Technology: Robotics
Other Technologies: Artificial Intelligence (AI), Augmented & Virtual Reality, Big Data
Sustainable Development Goals: Good Health and Well-Being for People; Quality Education; Industry, Innovation and Infrastructure

Required Resources
Item Name                     Type            No. of Units   Per Unit Cost (Rs)   Total (Rs)
Leap Motion sensor            Equipment       1              12000                12000
Microsoft LifeCam Cinema      Equipment       1              7000                 7000
7-inch touch LCD              Equipment       1              6500                 6500
Robot body with arms design   Equipment       1              20000                20000
High-power DC gear motors     Equipment       2              1000                 2000
Caster wheels                 Equipment       2              150                  300
Simple wheels                 Equipment       2              100                  200
Tiva C microcontroller        Equipment       1              3400                 3400
Monster Moto Shield           Equipment       1              850                  850
Bluetooth module              Equipment       1              450                  450
Dry batteries                 Equipment       2              900                  1800
Other expenses                Miscellaneous   1              2000                 2000
Total (Rs)                                                                        56500
