- Open Access
- Authors : Y. Sai Chandu, Shaik. Fathimabi
- Paper ID : IJERTCONV9IS05045
- Volume & Issue : ICRADL – 2021 (Volume 09 – Issue 05)
- Published (First Online): 27-03-2021
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Epilepsy prediction using Deep Learning
Y. Sai Chandu, Shaik. Fathimabi
VR Siddhartha Engineering College
Abstract – Epilepsy is one of the world's most common neurological diseases. The early prediction of approaching seizures impacts epileptic patients' lives. The goal is to develop and experimentally assess a method for seizure detection combining heart rate and movement. Seizure prediction can increase independence and permit preventive treatment for patients with epilepsy. Although mobile seizure detection has already been attempted, there are still no approaches that reliably identify diverse seizure types in the home environment. This project presents a proof-of-concept for a seizure prediction framework that is accurate, fully automated, patient-specific, and tunable to an individual's needs, even though a challenge lies in the variety of symptoms of certain seizure types. This work describes a project that aims to recognize seizures with the help of sensors and to set up a networking infrastructure to exchange clinical data between the relevant actors. The project uses deep learning for the detection of epileptic seizures and demonstrates the application using the example of detecting generalized tonic-clonic seizures from acceleration data gathered by the sensors. Additionally, this work shows that deep learning gives better results compared with the machine learning algorithms. It is observed that the LSTM deep learning model has higher accuracy compared with the three models used in machine learning.
Keywords: epilepsy, epileptic seizure, deep learning, machine learning, tunable, tonic-clonic seizure
I. INTRODUCTION
The illness in which patients suffer seizures caused by a disorder of brain function is named epilepsy [1]. While over five million people around the globe are diagnosed with epilepsy [2], in the US around 3,000,000 patients are affected by epilepsy. Epilepsy is the third most common brain disorder [3]. Meanwhile, there are some potential causes of epilepsy, one of which is a molecular change that brings about abnormal neuronal behaviour or migration of neurons. Despite the fact that the underlying cause of epilepsy remains unknown, an early diagnosis is often valuable for treating epilepsy. Epilepsy patients are treated with drugs or surgery [4]. Nonetheless, these techniques are not completely effective. Unfortunately, seizures that cannot be treated medically limit the active life of the patient. In these cases, patients cannot work independently or carry out some activities. This leads to social isolation and financial difficulties.
The early prediction of epileptic seizures guarantees enough time before one happens; it is extremely helpful because the attack may be avoided by medication. Epileptic seizures have four different states: the preictal state, which appears before the seizure starts; the ictal state, which begins with the onset of the seizure and ends with the attack; the postictal state, which begins after the ictal state; and the interictal state, which begins after the postictal state of the first seizure and ends before the start of the preictal state of the next seizure.
FIGURE. 1: Brain states in a typical epileptic EEG recording
The Centers for Disease Control and Prevention (CDC) describe epilepsy as "a common brain condition that causes repeated seizures."
Epilepsy is a disorder of the brain characterized by repeated seizures. A seizure is usually defined as a sudden alteration of behavior due to a temporary change in the electrical functioning of the brain.
Normally, the brain continuously generates tiny electrical impulses in an orderly pattern. These impulses travel along neurons, the network of nerve cells in the brain, and throughout the whole body via chemical messengers called neurotransmitters.
In epilepsy the brain's electrical rhythms tend to become imbalanced, leading to recurrent seizures. In patients with seizures, the normal electrical pattern is disrupted by sudden and synchronized bursts of electrical energy that may briefly affect their consciousness, movements, or sensations.
SEIZURE CLASSIFICATION:
FIGURE 2: Seizure Classification
Moreover, seizures may be anticipated by identifying the beginning of the preictal state. A partial seizure happens when the epileptic activity occurs in one part of a person's brain. There are two subtypes of partial seizure:
Simple partial seizure: During this type of seizure, the person is conscious. In general, they are also aware of their surroundings, even while the seizure is in progress.
Complex partial seizure: During this type, the seizure impairs a person's consciousness. They will typically not remember the seizure. If they do, their memory of it will be vague.
GENERALIZED SEIZURE:
A generalized seizure happens when the epileptic activity affects both halves of the brain. The person will usually lose consciousness while the seizure is in progress.
There are several subtypes of generalized seizure, including:
Tonic-clonic seizures: Perhaps the best-known type of generalized seizure, tonic-clonic seizures cause loss of consciousness, body stiffness, and shaking. Specialists formerly called these grand mal seizures.
Absence seizures: Formerly called petit mal seizures, these involve brief lapses of consciousness in which the person appears to be staring into space. Absence seizures often respond well to treatment.
Tonic seizures: In tonic seizures, the muscles become stiff, and the person may fall.
Atonic seizures: A loss of muscle tone makes the person drop abruptly.
Clonic seizures: This subtype causes rhythmic, jerking movements, frequently in the face or in one arm or leg.
Myoclonic seizures: This subtype makes the upper body or legs suddenly twitch or jerk.
Secondary generalized seizure: A secondary generalized seizure happens when the epileptic activity begins as a partial seizure but spreads to both halves of the brain. As this seizure progresses, the person will lose consciousness.
C. EPILEPSY VERSUS SEIZURES
Seizures are the fundamental symptom of epilepsy. In fact, Johns Hopkins Medicine defines epilepsy as having "at least two unprovoked seizures."
Some people may have a single seizure, or they may experience seizures that are not due to epilepsy.
It is even possible for specialists to misdiagnose non-epileptic seizures as epilepsy. However, non-epileptic seizures do not come from abnormal electrical activity in the brain; the reasons for these can be physical, emotional, or psychological. There are also different types of seizure, which can vary among individuals with epilepsy. In two people with epilepsy, for example, the condition may look different. Thus, the CDC describes epilepsy as a spectrum disorder.
Warning devices: some devices can monitor seizures and alert caregivers, potentially benefiting treatment and preventing sudden unexpected death in epilepsy (SUDEP). A small 2018 study including 28 participants, the results of which appeared in a scientific journal, compared one such multimodality device, the NightWatch, to a bed sensor. The NightWatch identified 85% of all severe seizures, compared with 21% for the bed sensor, and it only rarely missed a serious attack. Almost 70% of SUDEP cases happen during sleep, according to one 2017 investigation. This suggests that there may be potential benefits of using accurate nighttime warning systems.
IS IT CONTAGIOUS?
Anybody can develop epilepsy, but it is not contagious. A 2016 review of research highlighted some misconceptions and stigma about epilepsy, including the false belief that epilepsy can transmit between people.
The review authors note that people with lower education levels and socioeconomic status had a high rate of misconceptions, as did people who did not know anyone with epilepsy. Therefore, interventions and other educational efforts can be valuable to reduce stigma around epilepsy and increase understanding of the condition. According to the CDC, people are at greater risk of SUDEP if they have had epilepsy for a long time, or if they have regular seizures. Following these steps can help reduce the risk of SUDEP:
- Taking all doses of antiseizure medication.
- Limiting alcohol intake.
- Getting sufficient sleep.
SYMPTOMS THAT INDICATE A SEIZURE IS IN PROGRESS INCLUDE:
A person can experience both focal and generalized seizures at the same time, or one can occur before the other. The symptoms can last anywhere from a few seconds to 15 minutes per episode.
In some cases, symptoms occur before the seizure itself. These can include:
- a sudden feeling of fear or anxiousness, a feeling of being sick to the stomach, dizziness, a change in vision, a jerky movement of the arms and legs that may cause the person to drop things;
- an out-of-body sensation, a headache, loss of consciousness followed by confusion, uncontrollable muscle spasms, drooling or frothing at the mouth, falling, having a strange taste in the mouth;
- clenching the teeth, biting the tongue, having sudden, rapid eye movements, making unusual noises such as grunting, losing control of bladder or bowel function, having sudden mood changes.
It is important to consult a doctor if any of these symptoms happen more than once.
The following conditions may cause symptoms similar to those above, so some people can mistake them for those of epilepsy:
1. high fever with epilepsy-like symptoms
2. fainting
3. narcolepsy, or recurring episodes of sleep during the day
4. cataplexy, or periods of extreme muscle weakness
5. sleep disorders, nightmares, and panic attacks
6. fugue state, a rare psychiatric condition in which a person forgets details about their identity
7. psychogenic seizures, or seizures with a psychological or psychiatric cause
II. LITERATURE SURVEY
Efficient epileptic seizure prediction based on deep learning: Hisham Daoud et al. proposed four deep learning-based models for the purpose of early and accurate seizure prediction, taking into consideration the data processing. The seizure prediction problem is formulated as a classification task between interictal and preictal brain states, in which a true alarm is considered when the preictal state is detected within the predetermined preictal period. They aim at automatic extraction of the most important features by developing deep learning-based algorithms without any preprocessing. A Multi-Layer Perceptron is applied to the raw EEG recordings as a simple architecture of multiple trainable hidden layers, then a Deep Convolutional Neural Network (DCNN) is employed to learn the discriminative spatial features between interictal and preictal states. In another proposed model, a Bidirectional Long Short-Term Memory (Bi-LSTM) Recurrent Neural Network is concatenated to the DCNN to perform the classification task. An Autoencoder (AE) based semi-supervised model is proposed and pre-trained using a transfer learning technique to enhance the model optimization and converge faster. The computation time needed to extract these features depends on the method complexity and is considered another challenge, especially in real-time applications. Motivated by these challenges and due to the importance of early and accurate seizure prediction, they developed deep learning-based seizure prediction algorithms that combine the feature extraction and classification stages into one automated framework [5].
Epileptic Seizures Prediction Using Machine Learning Methods
Syed Muhammad Usman et al. proposed SVM, Naive Bayes, and KNN models. The Support Vector Machine is chosen as the classifier because of its superior performance in terms of sensitivity. After the selection of the classifier, the model is applied for the prediction of epileptic seizures. In the future, preprocessing of the EEG signal can be further improved to achieve an increased sensitivity of seizure prediction. Other preprocessing methods may be tried, including hybrid preprocessing methods and those that include adaptive window sizes. Moreover, an online system for the prediction of epileptic seizures can also be developed [6].
Smartphone applications for seizure management: Puneet Singh Pandher et al. examined the different features of seizure-management applications. The majority of apps were available on the iPhone or Android operating systems, with just one app found on BlackBerry and none on Windows Mobile or Nokia-Symbian. This is not surprising considering that iPhone and Android currently hold the majority of smartphone market share, accounting for 92.3 percent of all smartphone shipments in the first quarter of 2013. It is also worth noting that while the number of epilepsy apps has grown steadily since their debut on the market, the total number is still considerably smaller than that of other health and fitness apps available on these platforms. However, with the continued rise in smartphone ownership, and the increasing recognition of smartphones as an innovative way of delivering healthcare to patients, this number is expected to keep rising rapidly in the future [7].
An IoT Platform for Epilepsy Monitoring and Supervising: Paula M. Vergara et al. use a partitioning and offloading algorithm. This study proposes a platform for monitoring and supervising epileptic patients, focused on the two main epilepsy types: focal myoclonic and epileptic absence seizures. The majority of existing approaches lack several remarkable factors: developing ergonomic approaches, supporting daily life, providing economically affordable solutions, introducing storage of the sampled data and providing intelligent CC services, introducing real-time response, or considering multiple services on the MCC side. This study addresses the design of an IoT platform for epilepsy seizure detection and monitoring considering each of these factors [8].
Biosensors for epilepsy management: state-of-art and future aspect
Shivani et al. use nanomaterials (nanomembranes, nanotubes, and nanodendrimers). It is the field of nanotechnology alone that may provide an integrated solution for epilepsy cure, remediation, and diagnostic developments. Collaboration among technological giants (Apple, Nike, Hexoskin, Neuron), research institutions (Rice, MIT, and others), and various discussion platforms such as Epilepsy Talk is required during this active period, for example a combination of Garmin wearables' data capture and the actigraph analytical platform, which gives a powerful monitoring system. Smart nano-sensor-enabled apparel, headsets, bracelets, and other daily wearables remove the stigma and pain related to complex monitoring techniques. The development of newer devices and apps that build on already patented products, software, and data analysis platforms could give caregivers, doctors, and patients a sense of relief [9].
F. Deep Learning and Big Data in Healthcare
Sergio Munoz-Romero et al. analyze medical images. The application of Big Data and Deep Learning in healthcare includes areas such as individual disease diagnosis, disease prognosis, disease prevention and prediction, and also designing tailored health treatments based on lifestyle. One of the most famous example applications in medical care is the "Precision Medicine Initiative" project, which was promoted by US President Obama in 2015. IBM Watson for healthcare can incorporate clinical, pathological, and genetic characteristics so as to propose standardized clinical pathways and individualized treatment recommendations from those characteristics. The Enlitic company (San Francisco, CA, USA) was able to improve diagnostic accuracy in less time and at a reduced cost (when compared to traditional diagnostic methods), utilizing DL networks to analyze medical images (such as X-rays and MRIs). Another exemplary use is Google Flu Trends, which was able to predict more than double the proportion of doctor visits for influenza-like illness than the Centers for Disease Control and Prevention by using surveillance reports from laboratories across the United States [10].
III. PROPOSED APPROACH
A. Epilepsy Prediction Using Machine Learning
1. K-Nearest Neighbors (KNN): KNN is one of the first models that people learn about among scikit-learn's classification models. The model classifies an instance based on the k examples that are nearest to it. For example, if k = 3 and all three of the closest examples are of the positive class, then the instance would be classified as class 1. If two out of the three closest examples are of the positive class, then the instance would have a 66% likelihood of being classified as positive.
K-nearest neighbors is a supervised ML algorithm used for both regression and classification problems. Generally implemented for pattern recognition, this algorithm first stores and identifies the distance between the query and all inputs in the data using a distance function, selects the k specified inputs closest to the query, and outputs:
The most frequent label (for classification)
The average value of the k nearest neighbors (for regression)
Real-world uses of this algorithm include: fingerprint detection, credit scoring, predicting the stock market, analyzing money laundering, bank insolvencies, and currency exchange rates.
FIGURE 3: K-Nearest Neighbours
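As an illustration, the following is a minimal scikit-learn sketch of the KNN workflow described above (not the authors' exact code); the feature matrix X and binary seizure label y are random placeholders standing in for the 178-sample EEG rows described in the dataset section, and the 70/30 split mirrors the one used in the results.

```python
# Minimal KNN sketch (not the authors' exact code). X and y are random
# placeholders for the 178-point EEG rows and binary seizure labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 178))      # placeholder EEG feature matrix
y = rng.integers(0, 2, size=1000)     # placeholder labels: 1 = seizure, 0 = non-seizure

# 70% training / 30% validation, as in the test cases reported later
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=42)

knn = KNeighborsClassifier(n_neighbors=3)   # k = 3, matching the example above
knn.fit(X_train, y_train)
print("KNN validation accuracy:", accuracy_score(y_val, knn.predict(X_val)))
```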
2. Naive Bayes (NB): The Naive Bayes classifier uses Bayes' theorem to perform classification. It assumes that if the features are not related to each other, then the probability of seeing the features together is simply the product of the probabilities of each feature occurring. It finds the likelihood of the instance being classified as positive, given all the different combinations of features. The model is often flawed because the "naive" part of the model assumes all features are independent, which is not the case most of the time.
In simpler terms, it helps find the probability of event A occurring, given that event B has occurred.
Naive Bayes works best for:
- Filtering spam messages
- Recommendation systems, for example, Netflix
- Classifying a news article as technology, politics, or sports
- Sentiment analysis on social media
FIGURE 4: NAIVE BAYES
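A similar hedged sketch for Naive Bayes follows; again X and y are placeholders, and GaussianNB is one reasonable choice for continuous EEG features (the paper does not state which Naive Bayes variant was used).

```python
# Minimal Gaussian Naive Bayes sketch; the NB variant is an assumption.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 178))      # placeholder EEG features
y = rng.integers(0, 2, size=1000)     # placeholder binary labels

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=42)

nb = GaussianNB()                     # assumes conditionally independent features
nb.fit(X_train, y_train)
print("Naive Bayes validation accuracy:", nb.score(X_val, y_val))
```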
3. Random Forest (RF): Since decision trees are likely to overfit, the random forest was created to reduce that. Many decision trees make up a random forest model. A random forest consists of bootstrapping the dataset and using a random subset of features for each decision tree to reduce the correlation between trees, thereby lessening the likelihood of overfitting. We can gauge how good the random forest is by using the "out-of-bag" data that was not used for any tree to test the model. Random forests are also often preferred over a single decision tree because the model has lower variance; consequently, the model can generalize better.
Step 1: Create a bootstrapped dataset. We are permitted to pick the same sample more than once.
Step 2: Build a decision tree using the bootstrapped dataset.
Step 3: Repeat Step 1 by creating another bootstrapped dataset and building another tree. Ideally, build more than 100 trees.
Random forest applications can be found in:
- Fraud detection for bank accounts and credit cards
- Identifying and predicting the drug sensitivity of a medicine
- Identifying a patient's disease by analyzing their medical records
- Predicting estimated loss or profit while buying a particular stock
FIGURE 5: RANDOM FOREST
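The bootstrapping and out-of-bag evaluation described above can be sketched with scikit-learn as follows; the data are placeholders and the hyperparameters are only illustrative.

```python
# Minimal random forest sketch illustrating bootstrapping and out-of-bag scoring.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 178))      # placeholder EEG features
y = rng.integers(0, 2, size=1000)     # placeholder binary labels

rf = RandomForestClassifier(
    n_estimators=100,   # "ideally, build more than 100 trees"
    bootstrap=True,     # each tree sees a bootstrapped sample of the rows
    oob_score=True,     # score on the out-of-bag rows each tree never saw
    random_state=42,
)
rf.fit(X, y)
print("Out-of-bag accuracy estimate:", rf.oob_score_)
```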
B. Algorithm for Epilepsy Using Deep Learning
LONG SHORT-TERM MEMORY (LSTM)
Neural networks are a set of algorithms that closely resemble the human brain and are designed to recognize patterns. They interpret sensory data through a kind of machine perception, labeling, or clustering of raw input. They recognize numerical patterns, contained in vectors, into which all real-world data (images, sound, text, or time series) must be translated. Artificial neural networks are composed of a large number of highly interconnected processing elements (neurons) working together to solve a problem. An ANN usually involves a large number of processors operating in parallel and arranged in tiers. The first tier receives the raw input information, analogous to the optic nerves in human visual processing. Each successive tier receives the output from the tier preceding it, rather than from the raw input, in the same way that neurons farther from the optic nerve receive signals from those closer to it. The last tier produces the output of the system.
A recurrent neural network (RNN) is a generalization of a feed-forward neural network that has an internal memory. An RNN is recurrent because it performs the same function for every input of data, while the output for the current input depends on the past computation. After producing the output, it is copied and sent back into the recurrent network. To make a decision, it considers the current input and the output that it has learned from the previous input.
Unlike feed-forward neural networks, RNNs can use their internal state (memory) to process sequences of inputs. This makes them applicable to tasks such as unsegmented, connected handwriting recognition or speech recognition. In other neural networks, all the inputs are independent of each other. But in an RNN, all the inputs are related to one another.
First, the network takes x(0) from the input sequence and outputs h(0), which together with x(1) is the input for the next step. Similarly, h(1) from the next step is, together with x(2), the input for the step after that, and so on. This way, it keeps remembering the context while training.
FIGURE 6: AN UNROLLED RECURRENT NEURAL NETWORK
The formula for the current state is
h_t = f(h_{t-1}, x_t)
Applying the activation function:
h_t = tanh(W_{hh} h_{t-1} + W_{hx} x_t)
where W is the weight, h is the single hidden vector, W_{hh} is the weight at the previous hidden state, W_{hx} is the weight at the current input state, and tanh is the activation function, which implements a non-linearity that squashes the activations to the range [-1, 1].
Output:
y_t = W_{hy} h_t
where y_t is the output state and W_{hy} is the weight at the output state.
Advantages of Recurrent Neural Network
An RNN can model a sequence of data so that each sample can be assumed to be dependent on previous ones.
Recurrent neural networks are even used with convolutional layers to extend the effective pixel neighbourhood.
Disadvantages of Recurrent Neural Network
Gradient vanishing and exploding problems.
Training an RNN is a very difficult task.
It cannot process very long sequences when using tanh or ReLU as the activation function.
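The recurrence given above, h_t = tanh(W_{hh} h_{t-1} + W_{hx} x_t) with y_t = W_{hy} h_t, can be made concrete with a small NumPy sketch; the weights below are random and untrained, so this only illustrates how the hidden state carries context from step to step.

```python
# Untrained vanilla-RNN forward pass illustrating the recurrence above.
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, output_dim, seq_len = 1, 8, 1, 5

W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # previous hidden state
W_hx = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # current input
W_hy = rng.normal(scale=0.1, size=(output_dim, hidden_dim))  # output

x_seq = rng.normal(size=(seq_len, input_dim))   # e.g. 5 consecutive EEG samples
h = np.zeros(hidden_dim)                        # h_0

for x_t in x_seq:
    h = np.tanh(W_hh @ h + W_hx @ x_t)          # hidden state keeps the context
    y_t = W_hy @ h                              # output at this time step

print("final hidden state:", h)
print("final output:", y_t)
```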
What is Long Short Term Memory (LSTM)?
Long Short-Term Memory (LSTM) networks are a modified version of recurrent neural networks, which makes it easier to remember past data in memory. The vanishing gradient problem of RNNs is resolved here. LSTM is well-suited to classify, process, and predict time series given time lags of unknown duration. It trains the model by using backpropagation. In an LSTM network, three gates are present:
FIGURE 7: LSTM GATES
Input gate: discovers which values from the input should be used to modify the memory. The sigmoid function decides which values to let through (0 or 1), and the tanh function gives weightage to the values that are passed, deciding their level of importance, ranging from -1 to 1.
Forget gate: discovers which details should be discarded from the block. This is decided by the sigmoid function. It looks at the previous state h_{t-1} and the current input x_t and outputs a number between 0 (omit this) and 1 (keep this) for each number in the cell state C_{t-1}.
Output gate: the input and the memory of the block are used to decide the output. The sigmoid function decides which values to let through (0 or 1), and the tanh function gives weightage to the values that are passed, deciding their level of importance, ranging from -1 to 1, which is multiplied by the output of the sigmoid.
Bi-LSTM is used in two proposed models as the back-end classifier that works on the feature vector generated by the DCNN. The proposed network consists of a single bidirectional layer that predicts the class label at the last time instance after processing all the EEG segments, as shown in Figure 8. We chose the number of units, the dimensionality of the output space, to be 20. The dropout regularization technique is utilized to avoid overfitting. The dropout is applied to the input and the recurrent state with factors of 10% and 50%, respectively. The sigmoid activation function is used for prediction of the EEG segment class, and RMSprop is selected for optimization.
FIGURE 8: The unrolled Bidirectional LSTM Network
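A minimal Keras sketch of the bidirectional LSTM classifier described above is given below; the layer size, dropout factors, sigmoid output, and RMSprop optimizer follow the text, while the input shape of 178 time steps with one channel is an assumption based on the dataset section (the paper's DCNN front-end is omitted).

```python
# Hedged Keras sketch of the Bi-LSTM back-end described above.
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Bidirectional, LSTM, Dense

model = Sequential([
    Input(shape=(178, 1)),                          # one EEG segment per sample (assumed)
    Bidirectional(LSTM(20,                          # 20 units, as stated in the text
                       dropout=0.1,                 # 10% dropout on the input
                       recurrent_dropout=0.5)),     # 50% dropout on the recurrent state
    Dense(1, activation="sigmoid"),                 # seizure vs. non-seizure
])
model.compile(optimizer="rmsprop",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```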
IV. DESCRIPTION OF DATASETS AND TOOLS
We collected the data from Kaggle.
The original dataset from the reference consists of 5 different folders, each with 100 files, with each file representing a single subject/person. Each file is a recording of brain activity for 23.6 seconds.
The corresponding time series is sampled into 4097 data points. Each data point is the value of the EEG recording at a different point in time. So we have a total of 500 individuals, each with 4097 data points over 23.6 seconds.
We divided and shuffled every 4097 data points into 23 chunks; each chunk contains 178 data points for 1 second, and each data point is the value of the EEG recording at a different point in time.
So now we have 23 x 500 = 11500 pieces of information (rows); each row contains 178 data points for 1 second (columns), and the last column represents the label y ∈ {1, 2, 3, 4, 5}.
The response variable is y in column 179, and the explanatory variables are X1, X2, ..., X178. Let us look more precisely at the y column and the importance of its values for our case.
y contains the category of the 178-dimensional input vector. Specifically, y ∈ {1, 2, 3, 4, 5}:
5 – eyes open: the EEG signal of the brain was recorded while the patient had their eyes open.
4 – eyes closed: the EEG signal was recorded while the patient had their eyes closed.
3 – recording of EEG activity from the healthy brain area, after identifying where the region of the tumor was located in the brain.
2 – EEG recorded from the area where the tumor was located.
1 – recording of seizure activity.
All subjects falling in classes 2, 3, 4, and 5 are subjects who did not have an epileptic seizure. Only subjects in class 1 have epileptic seizures.
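A hedged loading sketch is shown below; the CSV file name and column names (X1–X178 and y) are assumptions about the downloaded Kaggle file, and the label is binarized so that class 1 (seizure) becomes 1 and classes 2–5 become 0.

```python
# Load the Kaggle EEG dataset and binarize the 5-class label (assumed file/column names).
import pandas as pd

df = pd.read_csv("Epileptic Seizure Recognition.csv")   # 11500 rows expected

X = df.loc[:, "X1":"X178"].values        # 178 EEG readings per one-second chunk
y = (df["y"] == 1).astype(int).values    # 1 = seizure, 0 = non-seizure (classes 2-5)

print(X.shape, y.shape)
print("seizure rows:", y.sum(), "non-seizure rows:", len(y) - y.sum())
```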
Python is a popular programming language. It was created by Guido van Rossum, and released in 1991.
It is used for:
web development (server-side), software development, mathematics,
system scripting.
NumPy arrays are stored in one continuous place in memory, unlike lists, so processes can access and manipulate them very efficiently. This behavior is called locality of reference in computer science.
This is the main reason why NumPy is faster than lists. It is also optimized to work with the latest CPU architectures.
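A small timing sketch illustrates the point: the same elementwise operation runs much faster on a contiguous NumPy array than on a Python list (exact numbers depend on the machine).

```python
# Compare an elementwise multiply on a Python list versus a NumPy array.
import timeit
import numpy as np

data_list = list(range(1_000_000))
data_array = np.arange(1_000_000)

list_time = timeit.timeit(lambda: [x * 2 for x in data_list], number=10)
numpy_time = timeit.timeit(lambda: data_array * 2, number=10)

print(f"list comprehension: {list_time:.3f}s, NumPy vectorized: {numpy_time:.3f}s")
```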
Matplotlib is a plotting library for the Python programming language and its numerical mathematics extension NumPy. It provides an object-oriented API for embedding plots into applications using general-purpose GUI toolkits such as Tkinter, wxPython, Qt, or GTK+. SciPy makes use of Matplotlib.
Matplotlib is mainly deployed for basic plotting. Visualization using Matplotlib generally consists of bars, pies, lines, scatter plots, and so on. Seaborn, on the other hand, provides a variety of visualization patterns. It uses less syntax and has attractive default themes.
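As an example of the kind of plot shown later in Figure 9 (seizure versus non-seizure counts), here is a hedged Seaborn sketch; the DataFrame and its 'seizure' column are illustrative placeholders, not the paper's actual data.

```python
# Count plot of seizure (1) versus non-seizure (0) rows; placeholder data.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.DataFrame({"seizure": [0] * 9200 + [1] * 2300})   # placeholder class counts

sns.countplot(x="seizure", data=df)
plt.title("Count of seizure (1) versus non-seizure (0) segments")
plt.show()
```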
Python is a great language for doing data analysis, primarily because of the fantastic ecosystem of data-centric Python packages. Pandas is one of those packages and makes importing and analyzing data much easier. The pandas head() method is used to return the top n (5 by default) rows of a data frame or series.
Head & Tail
head() returns the first n rows (observe the index values). The default number of elements to display is five, but you may pass a custom number.
tail() returns the last n rows (observe the index values). This function returns the last n rows from the object based on position. It is useful for quickly verifying data, for example, after sorting or appending rows.
The describe() function computes a summary of statistics pertaining to the DataFrame columns. This function gives the mean, std, and IQR values, and it excludes the character columns, giving a summary of the numeric columns. The include argument specifies which columns should be considered for summarizing; it takes a list of values and is 'number' by default.
loc gets rows (or columns) with particular labels from the index. iloc gets rows (or columns) at particular positions in the index (so it only takes integers). ix usually tries to behave like loc but falls back to behaving like iloc if a label is not present in the index.
Pandas reset_index() is a method to reset the index of a DataFrame. The reset_index() method sets a list of integers ranging from 0 to the length of the data as the index. Its level parameter (an int, string, or list) selects which levels to remove from the index, and its drop parameter (a Boolean) adds the replaced index column back to the data if set to False.
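The pandas calls described above can be exercised on the EEG table as in the sketch below; the DataFrame here is a random placeholder with the assumed X1–X178 and y columns.

```python
# Exercise head(), tail(), describe(), loc/iloc and reset_index() on placeholder data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(11500, 178)),
                  columns=[f"X{i}" for i in range(1, 179)])
df["y"] = rng.integers(1, 6, size=len(df))

print(df.head())        # first 5 rows by default
print(df.tail(3))       # last 3 rows
print(df.describe())    # mean, std and quartiles of the numeric columns

by_label = df.loc[0:4, ["X1", "X2"]]    # label-based selection
by_position = df.iloc[0:5, 0:2]         # position-based selection (same cells here)

shuffled = df.sample(frac=1.0, random_state=42).reset_index(drop=True)  # fresh 0..n-1 index
```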
Keras is a neural network library, while TensorFlow is an open-source library for a number of various tasks in machine learning. TensorFlow provides both high-level and low-level APIs, while Keras provides only high-level APIs. Keras is built in Python, which makes it far more user-friendly than TensorFlow.
TensorFlow
TensorFlow is an open-source software library for machine learning across a range of tasks. It is a symbolic math library, and it is also used as a system for building and training neural networks to detect and decipher patterns and correlations, analogous to human learning and reasoning. It is used for both research and production at Google, often replacing its closed-source predecessor, DistBelief. TensorFlow was developed by the Google Brain team for internal Google use. It was released under the Apache 2.0 open source license on 9 November 2015.
TensorFlow provides a Python API as well as C++, Haskell, Java, Go, and Rust APIs.
A tensor can be represented as a multidimensional array of numbers. A tensor has a rank and a shape; the rank is its number of dimensions and the shape is the size of each dimension.
All data in TensorFlow is represented as tensors. It is the sole data structure: tf.float32, tf.float64, tf.int8, tf.int16, tf.int32, tf.int64, tf.uint8, …
TensorFlow programs consist of two discrete sections: a graph is created in the construction phase, and the computational graph is run in the execution phase, which is a session.
TensorBoard
TensorFlow provides functions to debug and optimize programs with the help of a visualization tool called TensorBoard.
TensorFlow creates the necessary data during its execution. The data are stored in trace files.
TensorBoard can be viewed from a browser using http://localhost:6006/
We can run an example program of the kind sketched below, and it will create the directory "output". We can then run TensorBoard: tensorboard --logdir output
which will start a web server: TensorBoard 0.1.8 at http://marvin:6006 (Press CTRL+C to quit)
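The original example program is not reproduced in this text; a minimal stand-in that follows the construction/execution pattern and writes trace files to the "output" directory (using the TF 1.x-style API through tf.compat.v1) might look like this:

```python
# Hedged stand-in for the example program: build a graph, run it in a session,
# and write the graph to "output" for TensorBoard.
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# construction phase: define the graph
a = tf.constant(3.0, name="a")
b = tf.constant(4.0, name="b")
total = tf.add(a, b, name="total")

# execution phase: run the graph in a session and write trace files
with tf.compat.v1.Session() as sess:
    writer = tf.compat.v1.summary.FileWriter("output", sess.graph)
    print(sess.run(total))            # prints 7.0
    writer.close()

# afterwards: tensorboard --logdir output   ->  http://localhost:6006/
```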
V. RESULTS AND OBSERVATIONS
A. Test Case Results via Machine Learning
KNN WITH 70% TRAINING, 30% VALIDATION
KNN WITH 80% TRAINING, 20% VALIDATION
NAIVE BAYES WITH 70% TRAINING, 30% VALIDATION
NAIVE BAYES WITH 80% TRAINING, 20% VALIDATION
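A hedged sketch of how these 70/30 and 80/20 test cases could be generated for KNN and Naive Bayes is shown below; X and y are placeholders for the real EEG features and binary seizure labels, so the printed accuracies are not the paper's results.

```python
# Evaluate KNN and Naive Bayes at 70/30 and 80/20 splits; placeholder data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 178))
y = rng.integers(0, 2, size=1000)

for test_size in (0.3, 0.2):                      # 70/30 and 80/20 validation splits
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=test_size, random_state=42)
    for name, clf in [("KNN", KNeighborsClassifier()),
                      ("Naive Bayes", GaussianNB())]:
        clf.fit(X_tr, y_tr)
        print(f"{name}, {int((1 - test_size) * 100)}% training:",
              round(clf.score(X_val, y_val), 3))
```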
B. Test Case Results via Deep Learning
Deep Learning: Long Short-Term Memory (LSTM). Note: 0 = number of non-seizures, 1 = number of seizures.
FIGURE 9: Count of no. of seizures versus non-seizures
FIGURE 10: No. of samples versus UV
FIGURE 11: No. Of Parameters
The above table gives information regarding the total parameters, trainable parameters, and non-trainable parameters.
FIGURE 12: ACCURACY
OBSERVATIONS:
From the above three methods, we identified that the accuracy of Naive Bayes is higher compared to Random Forest and KNN.
VI. CONCLUSION AND FUTURE STUDY
A. Conclusion
Comparing both machine learning and deep learning, the accuracy of deep learning using LSTM is higher.
B. Future Study
To gather more information regarding IoT devices that help to detect seizures, and to work on ideas for developing a particular app useful for patients.
REFERENCES
[1] U. R. Acharya, S. V. Sree, G. Swapna, R. J. Martis, and J. S. Suri, "Automated EEG analysis of epilepsy: a review," Knowledge-Based Systems, vol. 45, pp. 147–165, 2013.
[2] R. S. Fisher, W. Van Emde Boas, W. Blume et al., "Epileptic seizures and epilepsy: definitions proposed by the International League Against Epilepsy (ILAE) and the International Bureau for Epilepsy (IBE)," Epilepsia, vol. 46, no. 4, pp. 470–472, 2005.
[3] L. E. Hebert, P. A. Scherr, J. L. Bienias, D. A. Bennett, and D. A. Evans, "Alzheimer disease in the US population: prevalence estimates using the 2000 census," JAMA Neurology, vol. 60, no. 8, pp. 1119–1122, 2003.
[4] M. Guenot, "Surgical treatment of epilepsy: outcome of various surgical procedures in adults and children," Revue Neurologique, vol. 160, no. 5, pp. S241–S250, 2004.
[5] H. Daoud and M. Bayoumi, "Efficient epileptic seizure prediction based on deep learning."
[6] "Epileptic seizures prediction using machine learning methods," vol. 2017, Article ID 9074759, https://doi.org/10.1155/2017/9074759.
[7] "Smartphone applications for seizure management," Health Informatics Journal, vol. 22, no. 2, pp. 209–220, 2016, DOI: 10.1177/1460458214540906, jhi.sagepub.com.
[8] "An IoT platform for epilepsy monitoring and supervising," Journal of Sensors (Hindawi), vol. 2017, Article ID 6043069, 18 pages, https://doi.org/10.1155/2017/6043069.
[9] "Biosensors for epilepsy management: state-of-art and future aspects," Sensors, vol. 19, 1525, 2019, doi:10.3390/s19071525, www.mdpi.com/journal/sensors.
[10] "Epileptic seizure prediction using big data and deep learning: toward a mobile system," published December 12, 2017, DOI: https://doi.org/10.1016/j.ebiom.2017.11.032.
[11] R. S. Fisher et al., "ILAE official report: a practical clinical definition of epilepsy," Epilepsia, vol. 55, no. 4, pp. 475–482, Apr. 2014.
[12] World Health Organization, Neurological Disorders: Public Health Challenges. World Health Organization, 2006.
[13] C.-Y. Chiang, N.-F. Chang, T.-C. Chen, H.-H. Chen, and L.-G. Chen, "Seizure prediction based on classification of EEG synchronization patterns with on-line retraining and post-processing scheme," 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, pp. 7564–7569, 2011.
[14] "Epilepsy prevalence, incidence and other statistics," Joint Epilepsy Council, Leeds, UK, 2005.
[15] E. Bou Assi, D. K. Nguyen, S. Rihana, and M. Sawan, "Towards accurate prediction of epileptic seizures: a review," Biomed. Signal Process. and Control, vol. 34, pp. 144–157, Apr. 2017.
[16] R. S. Fisher, W. van Emde Boas, W. Blume, et al., "Epileptic seizures and epilepsy: definitions proposed by the International League Against Epilepsy (ILAE) and the International Bureau for Epilepsy (IBE)," Epilepsia, vol. 46, pp. 470–472, 2005.
[17] World Health Organization, "Epilepsy factsheet," http://www.who.int/mediacentre/factsheets/fs999/en/index.html (accessed 4 August 2013).
[18] Joint Epilepsy Council of Australia, "A fair go for people living with epilepsy in Australia," http://www.epinet.org.au/downloads/File/
[19] NCGC, "The epilepsies: the diagnosis and management of the epilepsies in adults and children in primary and secondary care," http://www.nice.org.uk/nicemedia/live/13635/57784/57784.pdf (2012).
[20] CDC, "Living well with epilepsy II," http://www.cdc.gov/epilepsy/pdfs/living_well_2003.pdf (2003).
[21] M. J. England, C. Liverman, A. M. Schultz, et al., Epilepsy Across the Spectrum: Promoting Health and Understanding. Washington, DC: National Academies Press, 2012.
[22] D. Gilstrap, "Ericsson mobility report," http://www.ericsson.com/res/docs/2013/ericsson-mobility-report-june-2013.pdf (2013).
[23] Gartner, "Gartner says annual smartphone sales surpassed sales of feature phones for the first time in 2013," http://www.gartner.com/newsroom/id/2665715 (accessed 25 April 2014).
[24] mobiThinking, "Global mobile statistics 2014 part A: mobile subscribers; handset market share, mobile operators," http://mobithinking.com/mobile-marketing-tools/latest-mobile-stats/a#smartphoneforecast (accessed 25 April 2014).
[25] Healthcare Global, "Use of mobile health care apps on the rise," http://www.healthcareglobal.com/health-care_technology/use-of-mobile-health-care-apps-on-the-rise (accessed 1 August 2013).
[26] R. S. Fisher, W. Van Emde Boas, W. Blume et al., "Epileptic seizures and epilepsy: definitions proposed by the International League Against Epilepsy (ILAE) and the International Bureau for Epilepsy (IBE)," Epilepsia, vol. 46, no. 4, pp. 470–472, 2005.
[27] J. Engel Jr., "A proposed diagnostic scheme for people with epileptic seizures and with epilepsy: report of the ILAE task force on classification and terminology," Epilepsia, vol. 42, no. 6, pp. 796–803, 2001.
[28] J. R. Villar, P. Vergara, M. Menendez, E. De La Cal, V. M. Gonzalez, and J. Sedano, "Generalized models for the classification of abnormal movements in daily life and its applicability to epilepsy convulsion recognition," International Journal of Neural Systems, vol. 26, no. 6, 2016.
[29] E. B. Petersen, J. Duun-Henriksen, A. Mazzaretto, T. W. Kjar, C. E. Thomsen, and H. B. D. Sorensen, "Generic single-channel detection of absence seizures," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS '11), pp. 4820–4823, Boston, Mass, USA, September 2011.
C. Pradhan, S. Sinha, K. Thennarasu, and T. Jagadisha, Quan- titative analysis of heart rate variability in patients with absence epilepsy, NeurologyIndia, vol. 59, no. 1, pp. 2529, 2011.
-
S. Beniczky, T. Polster, T. W. Kjaer, and H. Hjalgrim, Detec- tion of generalized tonic- clonic seizures by a wireless wrist accelerometer: a prospective, multicenter study, Epilepsia, vol. 54, no. 4, pp. e58e61, 2013.
-
D. Callegari, E. Conte, T. Ferreto et al., EpiCare – a home care platform based on mobile cloud computing to assist epilepsy diagnosis, in Proceedings of the 4th International Confer- ence on Wireless Mobile Communication and Healthcare, MOBIHEALTH 2014, pp. 148151, November 2014.
-
P. Lokhande and T. Mote, Epilepsy monitoring and analysis system using android platform, International Journal of Sci- ences and Research, vol. 5, no. 7, pp. 6064,2016.
-
P. Meghana and V.V.Sravani, Mobile application based detection of seizures using inertial measurement units and emergency care, International Journal on Recent and Innovation Trends in Computing and Communication, vol. 3, no. 11, pp. 63886391, 2015.
S. Ramgopal, S. Thome-Souza, M. Jackson et al., Seizure detection, seizure prediction, and closed-loop warning systems in epilepsy, Epilepsy and Behavior, vol. 37, pp. 291307, 2014.
[36] S. Deshmukh and R. Shah, "Computation offloading frameworks in mobile cloud computing: a survey," in Proceedings of the 2016 IEEE International Conference on Current Trends in Advanced Computing, ICCTAC 2016, pp. 1–5, March 2016.
[37] N. Fernando, S. W. Loke, and W. Rahayu, "Mobile cloud computing: a survey," Future Generation Computer Systems, vol. 29, no. 1, pp. 84–106, 2013.
[38] M. Othman, S. A. Madani, S. U. Khan et al., "A survey of mobile cloud computing application models," IEEE Communications Surveys and Tutorials, vol. 16, no. 1, pp. 393–413, 2014.
[39] P. V. Ravi, Step by Step Treatment of Epilepsy.
[40] Recent Advancement in Epilepsy, PPT by Dr. Helal Uddin Ahmed, NIMHANS.