Head Movement and Eyeball Tracking using Machine Learning and Computer Vision
2025-06-28 16:32:51 - Adil Khan
Project Area of Specialization: Artificial Intelligence

Project Summary: Head pose is used to approximate a user's line of sight for real-time image rendering and interaction in most 3D visualization applications that use head-mounted displays. Eye-gaze detection and tracking have been an active research field in recent years, as they add convenience to a variety of applications. Despite the amount of research across different technologies, researchers are still searching for robust methods that work effectively in various applications. In this project, we propose an AI-based software system to detect head and eyeball movement. First, the system will take a video, or images extracted from real-time video. After the dataset is collected, different machine learning algorithms will be applied for training. Computer-vision techniques will also be applied to identify the pupil of the eye and the position of the head with the help of an Artificial Neural Network (ANN). In this way, an individual taking an exam at home, or an employee working from home, can be monitored conveniently by an organization.
Project Objectives: To implement a system that identifies human activity on the basis of eyeball movement and head movement. The system will determine the user's position and assess the attentiveness of workers using datasets such as videos and images.
Project Implementation Method: First, video-streaming software will be created to implement the detection module. It contains multiple grids of streams, and the data needed for detection (head and eyeball) will be collected from this software.
The project consists of two parts: head movement detection and eye movement detection. The working of each is described separately below.
EYEBALL TRACKING:
COLLECTION OF DATA:
The first step in eyeball tracking is to obtain a real-time video, or images extracted from real-time video, to serve as the dataset on which the analysis is performed.
CREATION OF FRAMES:
After collecting the dataset, we create frames from the video, or assemble a frame sequence from a set of images.
DETECTION OF PUPIL:
Once the frames are created, the first task is to detect the pupil. This step is critical: if the pupil is not identified correctly, the entire subsequent analysis is unreliable.
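One minimal way to sketch the pupil-detection step: since the pupil is normally the darkest region of an eye image, thresholding for the darkest pixels and taking their centroid gives a crude position estimate. Real systems typically use stronger detectors (e.g. Hough circles or a trained model); this baseline and its `dark_fraction` parameter are illustrative assumptions only.

```python
import numpy as np


def detect_pupil(eye_gray, dark_fraction=0.05):
    """Estimate the pupil centre (x, y) in a grayscale eye image.

    The pupil is usually the darkest region of the eye, so we take the
    darkest `dark_fraction` of pixels and return their centroid.
    """
    flat = eye_gray.ravel()
    k = max(1, int(flat.size * dark_fraction))
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th darkest intensity
    ys, xs = np.nonzero(eye_gray <= threshold)
    return float(xs.mean()), float(ys.mean())
```

The centroid is robust to a few stray dark pixels, which is why it is preferred here over simply taking the single darkest pixel.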
TRACKING OF PUPIL:
Once the pupil is detected successfully, we must determine its movement. Pupil tracking is performed using an algorithm such as a particle filter.
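A particle filter of the kind mentioned above can be sketched as a predict/update/resample loop over candidate 2D pupil positions. The motion and measurement noise values below are illustrative assumptions, not tuned parameters from this project.

```python
import numpy as np

rng = np.random.default_rng(0)


def particle_filter_step(particles, weights, measurement,
                         motion_std=3.0, meas_std=5.0):
    """One predict/update/resample cycle of a 2D particle filter.

    particles:   (N, 2) array of candidate pupil positions
    weights:     (N,) normalized importance weights
    measurement: observed (x, y) pupil centre in the current frame
    """
    # Predict: random-walk motion model for the pupil between frames.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: reweight by the Gaussian likelihood of the measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2.0 * meas_std ** 2))
    weights = weights / weights.sum()
    # Resample: multinomial resampling to avoid weight degeneracy.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))
    estimate = particles.mean(axis=0)
    return particles, weights, estimate
```

Running this step once per frame, with the detected pupil centre as the measurement, yields a smoothed trajectory that tolerates occasional detection errors.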
TERMINATION OR ITERATION:
Since a video contains many frames, the above steps are repeated for each frame, and the process terminates once the last frame has been analyzed.
HEAD MOVEMENT DETECTION:
COLLECTION OF DATA:
The first step in head movement detection is the same as in eyeball tracking: a dataset (video or images) is needed for further computation.
DETECTION OF INITIAL POSE:
The second step is to detect the initial static pose of the head. As in eyeball tracking, where a static starting point is established first, here we detect the static position of the head before tracking begins.
TRACKING POSITION OF HEAD:
After the initial pose is detected successfully, we track the position of the head. Since the head moves continuously, its motion must be followed frame by frame using an algorithm such as 3D model-based tracking.
Nowadays, many individuals around the world are working from home, and many students are attending classes online, so the need to assess their attention and interest is growing. During a live online session, many students or employees pretend to be working when they are not. During an assessment, identifying each student's position is very difficult because there may be hundreds of students, and watching every person individually is impractical.
There is therefore a need for technology that makes this online work more reliable, and our project addresses exactly that. It identifies whether a student or employee is attentive, and it also determines their position by analyzing the head in each stream or image. The project is not limited to online work; it is also beneficial in other fields. For example, a disabled person could control many devices using only their eyes. The project can thus be used as an extension in many fields.
The final deliverable will consist of video software that detects eye and head movement.
Final Deliverable of the Project: Software System
Core Industry: IT
Other Industries:
Core Technology: Artificial Intelligence (AI)
Other Technologies:
Sustainable Development Goals: Quality Education, Decent Work and Economic Growth, Partnerships to Achieve the Goal

Required Resources:

| Item Name | Type | No. of Units | Per Unit Cost (in Rs) | Total (in Rs) |
|---|---|---|---|---|
| Smartphone (for good camera) | Equipment | 2 | 20000 | 40000 |
| Proposal Printing | Miscellaneous | 2 | 50 | 100 |
| Report Printing | Miscellaneous | 3 | 50 | 150 |
| Total (in Rs) | | | | 40250 |