Determining the Optimal Time Interval for AF Classification from ECG Signal by Machine Learning

Suttirak Duangburong, Busaba Phruksaphanrat and Sombat Muengtaweepongsa (Thammasat University, Thailand)

Atrial Fibrillation (AF) is the most common cardiac arrhythmia. AF patients should receive urgent treatment to reduce the risk of death. A classification model is needed to help doctors diagnose AF so that they can make an effective treatment plan for the patient. The electrocardiogram (ECG) records biological signals influenced by the autonomic regulation of the heart and is an effective means of revealing abnormalities in a patient's heart. Current classification models for ECG signals are mostly developed from PhysioNet datasets, which may not be suitable for local patients. Therefore, in this research, ECG signals from local patients were used to train and construct an AF classification model by machine learning, a popular classification technique. However, the appropriate time interval for AF classification by machine learning has not been deeply investigated. In this research, both non-AF and AF signals were divided into segments of 2.5, 2.0, and 1.5 minutes. Significant features, including the R peak, RR intervals, F-wave, HRV, heart rate, and sample entropy (SampEn), were extracted from lead II of the ECG. Machine learning was then used to find the most suitable time interval and classification model. Finally, the 2.5-minute interval was the most appropriate length, with the highest performance achieved by an ensemble (bagged tree) model and a tree (fine tree) model at 100% ACC, SE, SP, and TPR, and 0% FPR. The proposed time interval and classification models can be deployed as a decision tool to assist cardiac physicians in the diagnosis of AF patients.
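The pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, the synthetic RR-interval data, and all function names are assumptions, and only a subset of the paper's features (heart rate, RR statistics, and a naive SampEn) is computed. A bagged decision tree from scikit-learn stands in for the "ensemble (bagged tree)" model.

```python
# Hypothetical sketch of the abstract's pipeline: segment an ECG trace into
# fixed time intervals, extract RR-based features, and train a bagged-tree
# classifier. Names, parameters, and data are illustrative, not from the paper.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

FS = 250  # assumed ECG sampling rate in Hz

def segment(signal, minutes=2.5, fs=FS):
    """Split a 1-D ECG trace into non-overlapping fixed-length segments."""
    n = int(minutes * 60 * fs)
    return [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]

def sample_entropy(x, m=2, r_frac=0.2):
    """Naive SampEn(m, r) for short sequences (O(n^2), demo only)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def count_matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
        return np.sum(d <= r) - len(t)  # exclude self-matches
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log((a + 1e-12) / (b + 1e-12))  # guarded against zero counts

def rr_features(r_peaks, fs=FS):
    """Heart rate and RR-interval statistics from R-peak sample indices."""
    rr = np.diff(r_peaks) / fs               # RR intervals in seconds
    hr = 60.0 / rr.mean()                    # mean heart rate in bpm
    return [hr, rr.mean(), rr.std(), sample_entropy(rr)]

# Synthetic demo data: AF is mimicked by an irregular RR series (high
# variability), non-AF by a regular one. Real inputs would be R peaks
# detected in each ECG segment.
rng = np.random.default_rng(0)
def fake_peaks(irregular):
    spread = 0.25 if irregular else 0.02
    rr = np.clip(rng.normal(0.8, spread, size=120), 0.3, 1.5)
    return np.cumsum(rr * FS).astype(int)

X = np.array([rr_features(fake_peaks(af)) for af in [0] * 30 + [1] * 30])
y = np.array([0] * 30 + [1] * 30)

clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                        random_state=0)
clf.fit(X, y)
```

In practice the features would be extracted per segment length (2.5, 2.0, 1.5 minutes) and the models compared on held-out data to select the interval, as the abstract describes.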

Journal: International Journal of Simulation: Systems, Science and Technology (IJSSST), Vol. 22

Published: Jun 14, 2021

DOI: 10.5013/IJSSST.a.22.01.16