Friday, December 18, 2020

Big Data in Preclinical ECG Alterations Research

The synergistic interaction of different disciplines, including biomedicine, mathematics, computer science, statistics, and engineering, addresses the ever more pressing demand for early diagnosis in the management of various diseases with significant social impact. Cardiovascular diseases, which still constitute one of the most important public health problems, demand increasingly technological approaches. Big data volumes, managed with efficient computational methods, therefore represent a promising and reachable route to increasingly efficient, early, and personalized diagnostic tools. Electrocardiography is the standard diagnostic method for detecting cardiac electrical activity. Despite technological advances since its debut in 1895 by Willem Einthoven [1] and the development of the 12-lead system in 1942 by Goldberger [2-3], it has changed little. Recently, Deserno and Marx began to develop a methodology for computational ECG analysis based on big data volumes that cannot be inspected conventionally but require efficient computational methods. The same authors, however, identified an urgent need to revisit ECG reading, focusing research mainly on data analysis and modelling [4].

The contribution of the different layers of cardiac microfibers is of fundamental importance for understanding the complex mechanisms of three-dimensional myocardial deformation. In this regard, there is an urgent need to develop more efficient, advanced technology for preclinical research. Over the years, mouse models have proved to be a valuable tool for understanding the different mechanisms by which normal cardiac electrical activity and its propagation become altered [5]. While classical studies of the electrocardiographic tracing are fundamental for understanding the causes and mechanisms underlying arrhythmias and related diseases, the information they provide generally does not reveal, in a preventive manner, the onset of cardiac damage before it is established. It is therefore necessary to establish new parameters to be applied in preclinical cardiovascular diagnosis. In light of this, the advent of Big Data, which collects highly complex information characterized by volume, variety, and velocity, is providing a valuable contribution to the control, comparison, and management of large data sets [6].

All this could be applied to the prediction, prevention, management, and treatment of cardiovascular diseases in order to calculate and register new indices and obtain numbers and statistics in tabular form: for example, the continuous recording of onset and offset times, of the P, Q, R, S, and T amplitudes, and of the PQRST time intervals. From the intersection of these numerical data, new indices can be created to quantitatively analyze the global and regional function of the ventricular and atrial myocardium, as occurs for echocardiography with speckle tracking or in the case of P-wave dispersion (PWD) [7], now an established non-invasive marker of atrial fibrillation risk. Further modules could automatically extract the calculated data in order to generate new information of interest, such as QT/RR (QT interval vs. RR interval), QT/time plots (ideal for pharmacokinetics), or RR/time plots (for heart-rate variation). The numerical data thus obtained can be extracted in blocks of interest to generate ever new specific indices: no longer only the cardiac waves deriving from the resultant of the various vectors of cardiac deformation, but tables of data and diagrams carrying nearly unlimited information from microscopic variations of cardiac activity. The aim is to identify possible malfunction of the heart muscle as early as possible.
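As a minimal sketch, assuming fiducial points (wave onsets, offsets, and R-peak times) have already been extracted by an ECG delineation algorithm, two of the indices named above could be tabulated as follows; the input arrays and example timestamps are hypothetical stand-ins for a delineator's output.

import numpy as np

def p_wave_dispersion(p_onsets, p_offsets):
    # P-wave dispersion (PWD): difference between the longest and
    # shortest P-wave duration (classically measured across leads).
    durations = np.asarray(p_offsets) - np.asarray(p_onsets)
    return durations.max() - durations.min()

def qt_rr_pairs(q_onsets, t_offsets, r_peaks):
    # Per-beat (QT, RR) pairs for a QT/RR plot; the i-th RR interval
    # is paired with the QT interval of the following beat.
    qt = np.asarray(t_offsets) - np.asarray(q_onsets)
    rr = np.diff(np.asarray(r_peaks))
    return np.column_stack([qt[1:], rr])

# Toy timestamps in seconds, standing in for real detections.
r_peaks = np.array([0.80, 1.62, 2.41, 3.25])
print(qt_rr_pairs(r_peaks - 0.04, r_peaks + 0.32, r_peaks))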

Through the use of Big Data it will be possible to reach ever more personalized, targeted, and effective diagnostic paths. In this context, Machine and Deep Learning methodologies offer a great opportunity to achieve radical innovation in the medical field related to cardiac diseases. Until a few years ago, the study of physiological signals, including the ECG, could count on statistical and mathematical approaches grouped under the classical methodologies of Signal Theory. Starting from these methodologies, further analytical tools have been defined to enhance the analysis of physiological signals. These tools are based on “hand-crafted” indicators, such as QT/RR, and are able to provide important information on the dynamics of the analyzed signals. The classical approaches of Signal Theory and Image Processing, improved by the adoption of hand-crafted indicators, have significantly advanced the study of physiological signals and medical images, making these algorithms valuable advisors for the physician. However, these methods present some limitations due to a statistical bias induced by the heuristic hand-crafted indicators, which leaves such models unable to determine the discriminating power of those indicators. With the advent of modern Machine and Deep Learning techniques, the above-mentioned limits have been largely overcome, making it possible today to develop advanced algorithms able to characterize physiological signals appropriately and to accurately estimate the risk of contracting certain diseases. With specific reference to the cardiac field, the latest-generation algorithms for the analysis of ECG signals are composed of hybrid strategies that include mathematical models, hand-crafted indicators, and Machine Learning methods. This makes it possible to construct an ECG signal study strategy that exploits the advantages of each methodology used. An interesting example of this approach is reported in [8]: a hybrid pipeline (composed of Machine Learning methods and nonlinear mathematical models) is described that is able to stabilize, filter, and analyze the ECG signal and, simultaneously, the PPG (photoplethysmographic) signal. Through a nonlinear Reaction-Diffusion mathematical model and a correlation study between the first derivative of the PPG signal and the dynamics of the corresponding ECG signal, ECG patterns that are stable and consistent with the PPG signal (and therefore with the corresponding cardiac phases) are discriminated from those presenting abnormal features.
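To make the hybrid strategy concrete, the sketch below feeds hand-crafted indicators into a standard Machine Learning classifier; the feature set, the synthetic data, and the risk labels are illustrative assumptions and do not reproduce the pipeline of [8].

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical per-recording hand-crafted indicators: mean QT/RR ratio,
# P-wave dispersion, and SDNN (a heart-rate-variability index).
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)  # toy labels: 0 = low risk, 1 = high risk

model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=200, random_state=0))
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())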

The processing of both the ECG and PPG signals was designed by observing the timing of the patient’s heart rhythms. To achieve this result, a specific artificial intelligence system based on a properly configured Cellular Neural Network framework was used, making compliance with the cardiac timing possible. The method showed excellent results. Once again, the use of a hybrid pipeline consisting of mathematical models, hand-crafted indicators, and Machine Learning systems makes it possible to adequately characterize the activity of the cardio-circulatory system and to obtain indices of cardiac functionality that, in a non-invasive way, can feed predictive models of cardiovascular risk and concretely prevent physical damage deriving from abnormal cardiac activity. In conclusion, it is therefore desirable to begin a complete numerical decoding of the traditional electrocardiographic signal, through the joint use of appropriate hand-crafted indicators together with modern Machine Learning systems capable of extracting further features, so as to identify new parameters of early dysfunction and ensure targeted intervention in the treatment of cardiovascular diseases.
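As a much-simplified stand-in for the ECG/PPG consistency check described above, the sketch below flags windows in which the ECG correlates poorly with the lag-aligned first derivative of the PPG. The Cellular Neural Network stage and the Reaction-Diffusion model of [8] are not reproduced, and the window length and correlation threshold are illustrative assumptions.

import numpy as np
from scipy.signal import correlate, correlation_lags

def flag_inconsistent_windows(ecg, ppg, fs, win_s=2.0, thr=0.3):
    dppg = np.gradient(ppg)  # first derivative of the PPG signal
    # Estimate the global ECG-to-PPG lag from the full cross-correlation
    # and align the PPG derivative to the ECG.
    xc = correlate(ecg - ecg.mean(), dppg - dppg.mean(), mode="full")
    lag = correlation_lags(len(ecg), len(dppg), mode="full")[xc.argmax()]
    dppg = np.roll(dppg, lag)
    n = int(win_s * fs)
    flags = []
    for start in range(0, len(ecg) - n + 1, n):
        # Windowed Pearson correlation between ECG and aligned dPPG/dt;
        # a weak correlation marks the window as a candidate abnormality.
        r = np.corrcoef(ecg[start:start + n], dppg[start:start + n])[0, 1]
        flags.append((start / fs, r < thr))  # (window start in s, inconsistent?)
    return flags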

