EEG-Based Human Emotion Recognition Using Deep Learning
Differently abled people are part of every community, and it is crucial to make them feel included. This project focuses on people with mental disabilities and disorders, particularly those who are paralyzed: such patients are diagnosed by neurologists with the help of EEG machines, which can give them a way to express their sentiments and feelings.
Human emotion of this kind can be detected with the help of EEG signals, but few people are aware of the technical work required to obtain reliable results. To break the communication barrier between these patients and the people around them, and to enhance their mobility, a system is required that recognizes these signals and helps those who cannot express themselves otherwise.
That is why we are building a system in which the machine extracts signals from the human brain and provides the output as an emotion appropriate to the situation.
The EEG headset will be placed on the paralyzed person; it will extract the human emotion from the recorded signals and represent it in the form of text and an emoji.
This is how we will take our input. This input, together with the DEAP dataset from Kaggle, will pass through a convolutional neural network and our pre-trained CNN model. After post-processing of the data, the results will be shown in the form of text and an emoji.
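As a sketch of how the DEAP dataset can be read in, the snippet below loads one subject's preprocessed recording. The file path is a placeholder, and the binary-valence threshold of 5.0 is an assumption about how labels might be discretized; the array shapes reflect the published DEAP preprocessed format.

```python
import pickle
import numpy as np

# Hypothetical path to one DEAP subject file; adjust to wherever the
# preprocessed dataset is stored.
DEAP_FILE = "data_preprocessed_python/s01.dat"

def load_deap_subject(path):
    """Load one subject's preprocessed DEAP recording.

    Each .dat file is a pickled dict with:
      'data'   -> array of shape (40 trials, 40 channels, 8064 samples)
      'labels' -> array of shape (40 trials, 4) holding valence, arousal,
                  dominance and liking ratings on a 1-9 scale.
    """
    with open(path, "rb") as f:
        subject = pickle.load(f, encoding="latin1")
    return subject["data"], subject["labels"]

def binarize_valence(labels, threshold=5.0):
    # Simple binary target (an assumption): positive (1) vs negative (0)
    # valence, taken from the first label column.
    return (labels[:, 0] > threshold).astype(np.int64)
```

The `encoding="latin1"` argument is needed because the DEAP files were pickled under Python 2.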
The basic objective of our project is to identify human emotions from EEG signals using state-of-the-art deep learning models.
Emotions can be detected through various devices, but one objective is to explore and research reasonable devices for capturing and collecting the desired EEG waves.
Our project will perform preprocessing and feature extraction and then train deep learning classifiers to recognize specific emotions.
It aims to accurately predict the emotional state of a subject watching various video scenes, given their EEG readings, using a recurrent neural network.
Our project will work for every individual, and the device does not have to be reconfigured for new users.
We will use the free GPU on Google Colab and Anaconda (Jupyter Notebook) for training the models. The neural network used in our project is a convolutional neural network, and the model is a human emotion detector that adopts sophisticated techniques, including deep learning-based or shallow machine learning-based approaches, applied either to raw signals or to combined extracted features to recognize the exact emotion.
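A minimal sketch of the kind of convolutional network described above, applied to the time axis of multi-channel EEG, could look like the following. The channel count, trial length, class count, filter sizes, and layer widths here are illustrative assumptions, not the project's actual tuned architecture.

```python
import tensorflow as tf

NUM_CHANNELS = 32   # assumed number of EEG electrodes
NUM_SAMPLES = 8064  # assumed samples per trial (DEAP preprocessed length)
NUM_CLASSES = 4     # assumed number of emotion classes

def build_cnn(num_channels=NUM_CHANNELS, num_samples=NUM_SAMPLES,
              num_classes=NUM_CLASSES):
    # A small 1D CNN over time; all hyperparameters are placeholders.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(num_samples, num_channels)),
        tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
        tf.keras.layers.MaxPooling1D(4),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

On Colab, the same code runs unchanged on the free GPU once a GPU runtime is selected.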
First of all, we import the libraries used in this program. NumPy and pandas are used for data manipulation and handling, while matplotlib and seaborn are used for data visualization; these make it easier to understand the data we are working with. Finally, we import TensorFlow and several functions from sklearn, such as train_test_split, confusion_matrix, and classification_report, which support the machine learning itself.
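The imports described above look like this in the notebook, followed by a toy split to show how train_test_split is used; the toy arrays are illustrative only.

```python
# Core imports used throughout the notebook.
import numpy as np                 # numerical arrays
import pandas as pd                # tabular data handling
import matplotlib.pyplot as plt    # plotting
import seaborn as sns              # statistical visualization
import tensorflow as tf            # deep learning framework

from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, classification_report

# Quick sanity check on a toy dataset: hold out 20% for testing.
X = np.arange(20).reshape(10, 2)
y = np.arange(10)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
```

After training, confusion_matrix and classification_report take the true and predicted labels and summarize per-class performance.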
It is important to use a recurrent neural network here because the values in the data are not independent; rather, each value is tied to the previous and following values. We used a Gated Recurrent Unit (GRU), which is a somewhat simpler recurrent architecture.
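A GRU-based classifier in Keras might be sketched as below; the two-layer arrangement and the layer sizes are assumptions for illustration, not the project's exact configuration.

```python
import tensorflow as tf

def build_gru(timesteps, features, num_classes):
    # Stacked GRU over the EEG time series; layer widths are placeholders.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(timesteps, features)),
        tf.keras.layers.GRU(64, return_sequences=True),  # keep full sequence
        tf.keras.layers.GRU(32),                         # summarize to a vector
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Because the GRU consumes the sequence in order, temporal dependencies between consecutive EEG samples are captured directly rather than being discarded.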

It all starts with human-machine interaction: when the human brain connects with the EEG headset, the interfacing between them yields the results we need.
Emotions originate in the brain, and when their expression is blocked they can cause many problems. When the machine connects with the brain, it acquires the data coming from the brain, and that data is then preprocessed accordingly.
After that, features are extracted that indicate what the person is feeling at the moment, and a classifier assigns the emotion.
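One common way to extract such features from EEG is to compute spectral power in the standard frequency bands; the band boundaries below follow the usual EEG convention, while the use of a plain FFT (rather than, say, Welch's method) is a simplifying assumption for illustration.

```python
import numpy as np

# Standard EEG frequency bands in Hz.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Mean spectral power of one EEG channel in each band.

    `signal` is a 1-D array of samples; `fs` is the sampling rate in Hz.
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}
```

Computed per channel and per trial, these band powers form a compact feature vector that a classifier can consume instead of the raw signal.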

The first stage covers how we retrieve data from the dataset, which has already been recorded from users. The data will be recorded by the EEG headset and a convolutional neural network will be applied to it; in turn, it will reach our fully trained system, which we will pre-train. At the end of this first stage, the features relevant to our requirements will be recognized.
In the second stage, input will be taken from the EEG headset to create and record the dataset. The headset will be connected to our system; it will collect brainwave signals and interpret them to provide information on the mental state of a person, with the implementation of a virtual reality environment in different applications.
The desired results will be sent to post-processing, and the output will be shown in the form of text or emojis.
For the conversion itself, we will use a different methodology. We will use Python, along with the Keras, TensorFlow, pandas, and NumPy libraries; these libraries help us retrieve the desired output from the given dataset and meet the requirements for detecting emotions.
Our project will help the paralyzed community of the world. It will be most effective in hospitals and psychology departments. It can also be used for security purposes, allowing simple detection of a person's feelings at a specific moment without asking them. It can further help people make better decisions, improve their focus and performance, manage stress, and adopt healthier and more productive working styles.
After this project, those suffering from full-body paralysis or psychological problems will be able to express their emotions with up to 93% assurance.
The system is easy to use, with few complications.
Our final product will rely more on hardware than on software. The person has to wear the EEG headset until all the signals are extracted successfully from the brain, after which the software analyzes the readings and provides the emotion corresponding to the signals.
For this process, we will be using a dataset together with Jupyter Notebook, Google Colab, and Google Drive as our data repository, and we will be using a convolutional neural network as well as an RNN (GRU).
The EEG headset will be used to extract signals from the brain, which will be taken as input to create and record the dataset, and the system will provide the output emotion in the form of text and an emoji shown on the individual's monitor screen.
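The final text-plus-emoji display step could be as simple as the mapping below. The emotion labels, their ordering, and the chosen emojis are all hypothetical; the real mapping depends on how the training labels were encoded.

```python
import numpy as np

# Hypothetical label order and emoji mapping for the on-screen output.
EMOTIONS = ["happy", "sad", "angry", "relaxed"]
EMOJIS = {"happy": "😊", "sad": "😢", "angry": "😠", "relaxed": "😌"}

def display_emotion(probabilities):
    """Turn the model's softmax output into the on-screen text + emoji."""
    idx = int(np.argmax(probabilities))  # most likely class
    label = EMOTIONS[idx]
    return f"{label} {EMOJIS[label]}"
```

This keeps the presentation layer decoupled from the model: any classifier that emits class probabilities can drive the same display.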
| Item Name | Type | No. of Units | Per Unit Cost (Rs) | Total (Rs) |
|---|---|---|---|---|
| Sichiray Mind wave EEG Headset | Equipment | 1 | 37,000 | 37,000 |
| Total (Rs) | | | | 37,000 |