- Open Access
- Authors: Raghav Somani, Harsh Oza, Ishika Singhania, Shrihari Kulkarni
- Paper ID: IJERTV11IS050175
- Volume & Issue: Volume 11, Issue 05 (May 2022)
- Published (First Online): 28-05-2022
- ISSN (Online): 2278-0181
- Publisher Name: IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Attention Calculator for Assessment of Students' Attention Span and Concentration Level
Raghav Somani1, Harsh Oza2, Ishika Singhania3, Shrihari Kulkarni4
Department of Information Technology, Government College of Engineering, Amravati – 444601,
Maharashtra, India
Abstract: Understanding students' attention span and the types of behavior that may indicate a lack of attention is important for understanding and enhancing the dynamics of a lecture. The main goal of this project is to create an application that provides information to both teachers and students. Student engagement in such matters can lead to desirable academic outcomes such as critical thinking and better grades in the subject.
Features and Pose Extraction: One immediate feature that comes to mind is eye tracking, but eye tracking tends to suffer from low-resolution images. Hence we use head position, which contributes strongly to the overall gaze direction. Although head position alone is already highly accurate for attention detection, pairing it with another technique vastly improves the results. Students who are paying attention normally react to a stimulus in the same way; that is, students whose motions are synchronized with the majority are paying attention. An example of this synchronization is when the class looks down to write because the teacher instructs them to.
The output of this project is twofold:
- The first is a graph showing a time-based evaluation of the student, which gives the most attentive and least attentive times.
- The second gives data for the duration of a whole lecture/month (for multiple lectures of the same subject). This consists of pie charts based on the number of days the student attends the lectures and his/her behavior during them.
This project can revolutionize education by providing guidance and feedback not only to teachers on how to improve their teaching, but to students as well on how to improve their behavior and, eventually, their academic performance. Our project provides visual feedback to teachers regarding the average level of students' attention, and provides counseling to students regarding their behavior during class.
Keywords: Attention, Drowsiness, Eye, Head, Mouth, Ratio, Track.
INTRODUCTION
Attention Calculator is a web application that can motivate students to understand and behave better and help teachers assess students' behavior. The application takes frames from the web camera and assesses them to determine the type of behavior. It then compares this with an ideal behavior and gives out the result.
The result of this application is a line graph showing the hourly variation in a student's attention and a pie chart giving the overall output for the duration of one lecture.
It helps in understanding and improving the dynamics of a lecture. Teachers can use this application to understand what is required to make lectures more interesting so that students are more attentive.
To assess students' behavior, this app captures the face from the web camera and uses certain methods to form the attention graphs and charts.
We use three methods for attention span calculation:
- Eye aspect ratio calculation
- Mouth aspect ratio calculation
- Head position tracking
Using these methods, we calculate the number of times the student was drowsy, the number of times he/she yawned, and the average head position over the span of one lecture. This gives us the student's total attention span in a lecture with high accuracy. This information is then made available to teachers for the assessment of student behavior, so that they can improve lectures if required, make them more interesting, and help students understand topics better.
RELATED WORK
DLIB'S 68 FACIAL LANDMARK DETECTION IN PYTHON:
In this project, we use Dlib's 68-point model to locate the facial features and then apply algorithms to calculate the eye aspect ratio and mouth aspect ratio and to track the head position.
First, the human face is detected in the whole image using a face detector. This technique returns the coordinates of the four corners of a bounding rectangle. The face lies inside this box, so we now have a face, as shown in the figure below.
Fig: Face detected and placed inside a green box
Now we have the coordinates of the four corners of the face, as shown in the figure. Next, we use the 68-point model to locate fine features on the face such as the eyebrows, lips, eyes and jawline.
There are two main steps to detecting face landmarks in an image:
Face detection: Face detection is the first step; it locates a human face and returns a rectangle as (x, y, w, h).
Face landmark detection: After getting the location of a face in the image, the landmark predictor works inside that rectangle and returns the locations of features such as the eyes and lips.
The figure below shows the 68 points detected on a face.
Fig: 68 points on a human face
As shown in the figure, each eye is denoted by six points. These points are then used to calculate the eye aspect ratio and, from it, drowsiness.
Fig: Facial Landmark location
The facial landmark detection returns all the required features of a human face. It gives us an array with the coordinates of all the points on the face. This array can then be used to build the algorithm that calculates attention and checks whether the person in question is attentive at any given time.
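Assuming the landmarks arrive as a plain list of 68 (x, y) coordinate pairs (e.g. converted from the output of Dlib's shape predictor), the regions used in this project can be sliced out by the standard index ranges of the 68-point model. The helper name below is hypothetical:

```python
# Standard index ranges of the Dlib 68-point facial landmark model
# (0-based; figures of the model usually label the points 1-68).
RIGHT_EYE = slice(36, 42)  # points 37-42 in 1-based figures
LEFT_EYE = slice(42, 48)   # points 43-48
MOUTH = slice(48, 68)      # outer and inner lip points
JAWLINE = slice(0, 17)     # points 1-17, used here for head position

def extract_features(landmarks):
    """Split a 68-point landmark list into the regions used here.

    `landmarks` is a list of 68 (x, y) tuples.
    """
    return {
        "right_eye": landmarks[RIGHT_EYE],
        "left_eye": landmarks[LEFT_EYE],
        "mouth": landmarks[MOUTH],
        "jawline": landmarks[JAWLINE],
    }

# Example with dummy coordinates:
dummy = [(i, i) for i in range(68)]
parts = extract_features(dummy)
print(len(parts["left_eye"]), len(parts["mouth"]), len(parts["jawline"]))
# prints: 6 20 17
```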
METHODOLOGY
Checking Eye Position
The application first checks whether the student's eyes are visible. If yes, it calculates the position of all six points in a 2D plane, as shown in the figure.
Fig: Points placement for an eye
The Dlib model returns the coordinates of all six points in the form of an array, which is then used to calculate the eye aspect ratio.
The following formula is used to calculate the EAR (Eye Aspect Ratio) from the six eye points p1–p6, where p1 and p4 are the horizontal eye corners:

EAR = (|p2 − p6| + |p3 − p5|) / (2 · |p1 − p4|)
After calculating the EAR, the application checks whether the ratio is less than or equal to 0.25; if so, it checks the head position.
This EAR helps in deciding whether the student is drowsy or attentive.
An EAR greater than 0.25 suggests that the student is fully attentive. Although this is a good method to check attention, if used alone its accuracy is low: the eyes are a small part of the whole face, and at low resolution this leads to low accuracy in attention calculation.
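A minimal sketch of the EAR computation described above, assuming the six points are ordered with p1/p4 as the horizontal corners, p2/p3 on the upper lid and p6/p5 on the lower lid; the 0.25 threshold is the value used in this paper:

```python
import math

def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).

    `eye` is a list of six (x, y) tuples ordered p1..p6.
    """
    p1, p2, p3, p4, p5, p6 = eye
    vertical = math.dist(p2, p6) + math.dist(p3, p5)
    horizontal = math.dist(p1, p4)
    return vertical / (2.0 * horizontal)

EAR_THRESHOLD = 0.25  # at or below this, check head position

# Exaggerated open eye: large vertical lid gap relative to width.
open_eye = [(0, 0), (1, 2), (3, 2), (4, 0), (3, -2), (1, -2)]
print(eye_aspect_ratio(open_eye))  # 1.0, well above the threshold

# Nearly closed eye: tiny vertical gap -> EAR drops below 0.25.
closed_eye = [(0, 0), (1, 0.2), (3, 0.2), (4, 0), (3, -0.2), (1, -0.2)]
print(eye_aspect_ratio(closed_eye))  # 0.1
```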
Therefore, we will use head position tracking to increase accuracy.
Tracking Head Position
The application checks the head position if the eyes are not clearly visible. For checking the head position, Dlib's model provides 17 points on the jawline that can be used to calculate the head position and find the direction in which the student is looking.
If the student is looking down to write something asked by the teacher, the application could mark him or her as drowsy, which would lower accuracy. To avoid this, we have added a feature: if the application finds that one student does not have a proper head position, it checks the head positions of all the other students in the room. If the majority of students have the same position, nobody is marked as drowsy; otherwise, the student is marked as drowsy for that time instant.
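The majority check described above can be sketched as a small per-frame function. The label strings and the strict-majority rule are illustrative assumptions, not the paper's exact implementation:

```python
from collections import Counter

def flag_drowsy(head_positions):
    """Given one head-position label per student for a single frame,
    return the set of student indices to mark as drowsy at this instant.

    If a strict majority of the class shares a non-forward position
    (e.g. everyone looking down to write), nobody is flagged -- the
    motion is treated as synchronized with the lecture.
    """
    counts = Counter(head_positions)
    majority_label, majority_count = counts.most_common(1)[0]
    if majority_label != "forward" and majority_count > len(head_positions) / 2:
        return set()  # synchronized class-wide motion, e.g. note-taking
    return {i for i, pos in enumerate(head_positions) if pos != "forward"}

# One student looks down while the rest face forward -> flagged.
print(flag_drowsy(["forward", "forward", "down", "forward"]))  # {2}
# The whole class looks down to write -> nobody is flagged.
print(flag_drowsy(["down", "down", "down", "forward"]))        # set()
```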
The head position can be tracked using Euler angles. The Euler angles are three angles introduced by Leonhard Euler to describe the orientation of a rigid body with respect to a fixed coordinate system. We use the 17 jawline point coordinates to calculate them.
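As a sketch of the underlying math only: in practice a pose solver (such as OpenCV's solvePnP fed with the landmark coordinates) produces a rotation, and the Euler angles are then read off the 3x3 rotation matrix. The convention below is the common ZYX decomposition, with gimbal-lock handling omitted:

```python
import math

def rotation_to_euler(R):
    """Recover Euler angles (x, y, z) in degrees from a 3x3 rotation
    matrix R (a list of three row lists), assuming no gimbal lock.
    For a head, x/y/z correspond roughly to pitch (nod), yaw (turn
    left/right) and roll (tilt).
    """
    sy = math.hypot(R[0][0], R[1][0])
    x = math.atan2(R[2][1], R[2][2])
    y = math.atan2(-R[2][0], sy)
    z = math.atan2(R[1][0], R[0][0])
    return tuple(math.degrees(a) for a in (x, y, z))

# A head turned 30 degrees about the vertical (y) axis:
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
R_yaw = [[c, 0, s], [0, 1, 0], [-s, 0, c]]
print(rotation_to_euler(R_yaw))  # approximately (0.0, 30.0, 0.0)
```

A yaw far from zero then indicates the student is looking away from the screen.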
Mouth Aspect Ratio Calculation
One effect that drowsiness has on humans is yawning. Yawns are one of the more reliable ways of checking whether the person in question is attentive.
When we are bored or tired, we just don't breathe as deeply as we usually do. As this theory goes, our bodies take in less oxygen because our breathing has slowed. Therefore, yawning helps us bring more oxygen into the blood and move more carbon dioxide out of the blood.
Yawning also suggests that the student is not attentive. Hence, in this application, we calculate the mouth aspect ratio to see whether the student is yawning.
This increases the accuracy of the attention span calculation and gives better results. Yawning also suggests disinterest, which can help teachers understand which topics disinterest students; if a topic is important, the teacher can help students understand it in a way that avoids boredom.
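The mouth aspect ratio can be computed the same way as the EAR. Implementations differ in which lip landmarks they pair up, so the eight-point inner-lip ordering and the 0.6 yawn cutoff below are illustrative assumptions:

```python
import math

def mouth_aspect_ratio(mouth):
    """MAR = average vertical lip gap / horizontal mouth width.

    `mouth` is a list of eight (x, y) tuples ordered p1..p8 with
    p1/p5 the mouth corners, p2-p4 on the upper lip and p8-p6 on
    the lower lip (mirroring the six-point eye layout).
    """
    p1, p2, p3, p4, p5, p6, p7, p8 = mouth
    vertical = (math.dist(p2, p8) + math.dist(p3, p7) + math.dist(p4, p6)) / 3
    horizontal = math.dist(p1, p5)
    return vertical / horizontal

MAR_YAWN_THRESHOLD = 0.6  # hypothetical cutoff; tune per camera setup

# Wide-open, yawn-like mouth: tall gap relative to width.
yawn = [(0, 0), (1, 3), (2, 3), (3, 3), (4, 0), (3, -3), (2, -3), (1, -3)]
print(mouth_aspect_ratio(yawn) > MAR_YAWN_THRESHOLD)  # True

# Closed mouth: MAR stays far below the threshold.
closed = [(0, 0), (1, 0.1), (2, 0.1), (3, 0.1), (4, 0), (3, -0.1), (2, -0.1), (1, -0.1)]
print(mouth_aspect_ratio(closed) > MAR_YAWN_THRESHOLD)  # False
```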
RESULTS:
For the output of this application, we have a line graph and pie charts for yawning, head position and overall attention. A graph describing the concentration levels of the students during the lecture is drawn with time on the X-axis and the levels of yawning, drowsiness and changing head position on the Y-axis.
This graph gives the changing attention of a student over the span of a lecture. It informs the teacher of the times when most students were drowsy and when most were attentive.
This graph also helps in studying students' nature and attention span.
The consolidated data generated from this graph is shown in the pie charts. These pie charts give the overall attention span of students during one or multiple lectures, which also helps in calculating attendance.
The pie charts are as follows:
- Time for which the student was drowsy or not drowsy
- Time for which the student was looking at the screen or not looking at the screen
- Time for which the student was yawning or not yawning
- Time for which the student was active or inactive on the tab
The line graph and pie charts provide hour-wise and lecture-wise data, which teachers can then use to calculate scores and attendance. This will improve students' behavior in lectures and help them understand better.
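The per-frame labels behind each pie chart can be consolidated with a small aggregation step. The function and label names below are illustrative:

```python
from collections import Counter

def pie_fractions(frame_labels):
    """Turn a sequence of per-frame state labels (one per sampled
    frame of the lecture) into the fraction of lecture time spent in
    each state -- the numbers behind one pie chart.
    """
    counts = Counter(frame_labels)
    total = len(frame_labels)
    return {state: count / total for state, count in counts.items()}

# One label per sampled frame over a (tiny) 10-frame window:
drowsiness = ["alert"] * 7 + ["drowsy"] * 3
print(pie_fractions(drowsiness))  # {'alert': 0.7, 'drowsy': 0.3}
```

The same function serves each chart (drowsiness, screen gaze, yawning, tab activity) by feeding it the corresponding label stream.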
Fig: Drowsiness Detection Graph
Fig: Position Detection Graph
Fig: Yawn Detection Graph
Fig: Data in the form of Pie charts
CONCLUSION:
Thus, this attention span calculator can be used by students and teachers to improve teaching and learning.
It is useful for students to understand ideal behavior and work to improve themselves.
It is useful for teachers to assess students based on their attention span and help them improve.
ACKNOWLEDGMENT:
Prof. S. A. Lohi, Dept. of Information Technology (I.T.), Government College of Engineering, Amravati, has been a source of support and guidance to the authors throughout this research.
We are also obliged to our Head of Department (Dept. of Information Technology), Prof. A. W. Bhade, for giving us this great opportunity. We are grateful for their cooperation during the period of our assignment.
REFERENCES
[1] S. Sathasivam, A. K. Mahamad, S. Saon, A. Sidek, M. M. Som and H. A. Ameen, "Drowsiness Detection System using Eye Aspect Ratio Technique," 2020 IEEE Student Conference on Research and Development (SCOReD), 2020, pp. 448-452, doi: 10.1109/SCOReD50371.2020.9251035.
[2] B. Alshaqaqi, A. S. Baquhaizel, M. E. Amine Ouis, M. Boumehed, A. Ouamri and M. Keche, "Driver drowsiness detection system," 2013 8th International Workshop on Systems, Signal Processing and their Applications (WoSSPA), 2013, pp. 151-155, doi: 10.1109/WoSSPA.2013.6602353.
[3] W. Deng and R. Wu, "Real-Time Driver-Drowsiness Detection System Using Facial Features," IEEE Access, vol. 7, pp. 118727-118738, 2019, doi: 10.1109/ACCESS.2019.2936663.
[4] X. Ren, J. Ding, J. Sun and Q. Sui, "Face modeling process based on Dlib," 2017 Chinese Automation Congress (CAC), 2017, pp. 1969-1972, doi: 10.1109/CAC.2017.8243093.
[5] C.-H. Lin, W.-H. Wu and T.-N. Lee, "Using an Online Learning Platform to Show Students' Achievements and Attention in the Video Lecture and Online Practice Learning Environments," Educational Technology & Society, vol. 25, no. 1, 2022, pp. 155-165. https://www.jstor.org/stable/48647037.
[6] K. Arai and R. Mardiyanto, "Real time blinking detection based on Gabor filter," International Journal of Recent Trends in Human Computer Interaction (IJHCI), vol. 1, no. 3, 2010, pp. 33-45.