Worldwide healthcare delivery trends are undergoing a subtle paradigm shift--patient centered services as opposed to provider centered services and wellness maintenance as opposed to illness management. In this paper we present a Tele-Healthcare project TIDE--Tele-Healthcare Information and Diagnostic Environment. TIDE manifests an 'intelligent' healthcare environment that aims to ensure lifelong coverage of person-specific health maintenance decision-support services--i.e., both wellness maintenance and illness management services--ubiquitously available via the Internet/WWW. Taking on an all-encompassing health maintenance role--spanning from wellness to illness issues--the functionality of TIDE involves the generation and delivery of (a) Personalized, Pro-active, Persistent, Perpetual, and Present wellness maintenance services, and (b) remote diagnostic services for managing noncritical illnesses. Technically, TIDE is an amalgamation of diverse computer technologies--Artificial Intelligence, Internet, Multimedia, Databases, and Medical Informatics--to implement a sophisticated healthcare delivery infostructure.
Recently, Artificial Intelligence (AI) has been used widely in the medicine and healthcare sector. Classification and prediction constitute a major field of AI, in particular machine learning. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions of the outcomes of their patients' diseases. In addition to accuracy, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and healthcare are critically reviewed. Furthermore, the most prominent machine learning methods are explained, and the confusion between statistical approaches and machine learning is clarified. A review of related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.
Recently, Wireless Body Area Networks (WBANs) have received significant attention in research and product development due to the growing number of sensor-based applications in the healthcare domain. The design of an efficient and effective Medium Access Control (MAC) protocol is one of the fundamental research themes in WBAN. Static on-demand slot allocation for patient data is the main approach adopted in the design of MAC protocols in the literature, without considering the type of patient data, specifically its level of severity. This degrades the performance of MAC protocols in terms of effectiveness and traffic adjustability in realistic medical environments. In this context, this paper proposes a Traffic Priority-Aware MAC (TraPy-MAC) protocol for WBAN. It classifies patient data into emergency and non-emergency categories based on severity. The threshold-value-aided classification considers a number of parameters, including type of sensor, body placement location, and data transmission time, when allocating dedicated slots to patient data. Emergency data are not required to undergo contention, and slots are allocated by giving due importance to the threshold value of the vital-sign data. Contention for slots is made efficient in the case of non-emergency data by considering the threshold value in slot allocation. Moreover, slot allocation to emergency and non-emergency data is performed in parallel, resulting in a performance gain in channel assignment. Two algorithms, namely Detection of Severity on Vital Sign data (DSVS) and ETS Slots allocation based on the Severity on Vital Sign (ETS-SVS), are developed for calculating threshold values and resolving channel assignment conflicts, respectively. Simulations are performed in ns2, and the results are compared with state-of-the-art MAC techniques.
Analysis of the results attests to the benefit of TraPy-MAC over state-of-the-art MAC protocols in channel assignment in realistic medical environments.
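The threshold-based severity classification described above can be illustrated with a short sketch. This is not the authors' DSVS/ETS-SVS implementation; the sensor types, weights, and threshold value below are hypothetical stand-ins.

```python
# Illustrative sketch only: classify patient data into emergency and
# non-emergency traffic via a severity threshold, loosely following the
# idea described in the abstract. All weights and names are hypothetical.

def severity_score(sensor_type, body_location, tx_time_ms):
    """Combine hypothetical per-parameter weights into a severity score."""
    sensor_weight = {"ecg": 0.9, "spo2": 0.7, "temperature": 0.3}
    location_weight = {"chest": 0.8, "wrist": 0.5, "ankle": 0.3}
    # A shorter permissible transmission time suggests more urgent data.
    time_weight = 1.0 / max(tx_time_ms, 1)
    return (sensor_weight.get(sensor_type, 0.1)
            + location_weight.get(body_location, 0.1) + time_weight)

def classify_packet(sensor_type, body_location, tx_time_ms, threshold=1.2):
    """Emergency data skips contention; non-emergency data contends for slots."""
    score = severity_score(sensor_type, body_location, tx_time_ms)
    return "emergency" if score >= threshold else "non-emergency"

print(classify_packet("ecg", "chest", 10))          # high-severity vital sign
print(classify_packet("temperature", "ankle", 500)) # routine reading
```

In the full protocol, packets classified as emergency would receive dedicated slots without contention, while non-emergency packets would contend using the same threshold value.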
The security effectiveness of users' behaviors is becoming a top priority of Health Information Systems (HIS). In the first step of this study, 'Self-Efficacy in Information Security' (SEIS) and 'Security Competency' (SCMP) were identified, through a review of previous studies, as important factors in transforming HIS users into the first line of defense in security. Subsequently, a conceptual model for HIS security effectiveness was proposed based on these factors. This quantitative study then used structural equation modeling to examine the proposed model with survey data collected from a sample of 263 HIS users from eight hospitals in Iran. The results show that SEIS is one of the important factors for cultivating good end-user behaviors toward HIS security effectiveness. However, SCMP appears to be a feasible alternative for providing SEIS. This study also confirms the mediating effect of SEIS on the relationship between SCMP and HIS security effectiveness. The results of this paper can be used by HIS and IT managers to implement their information security processes more effectively.
This study presents a systematic literature review of access control for electronic health record systems to protect patients' privacy. Articles from 2006 to 2016 were extracted from the ACM Digital Library, IEEE Xplore Digital Library, Science Direct, MEDLINE, and MetaPress using broad eligibility criteria, and chosen for inclusion based on analysis of ISO 22600. Cryptographic standards and methods were left outside the scope of this review. Three broad classes of models are being actively investigated and developed: access control for electronic health records, access control for interoperability, and access control for risk analysis. Traditional role-based access control models are extended with spatial, temporal, probabilistic, dynamic, and semantic aspects to capture contextual information and provide granular access control. Maintenance of audit trails and facilities for overriding normal roles to allow full access in emergency cases are common features. Access privilege frameworks utilizing ontology-based knowledge representation for defining the rules have attracted considerable interest, due to the higher level of abstraction that makes it possible to model domain knowledge and validate access requests efficiently.
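The extended role-based models surveyed above (temporal constraints, audit trails, emergency override) can be illustrated with a minimal sketch. The roles, permissions, and duty hours below are invented for illustration, not taken from any reviewed system.

```python
# Minimal sketch of a role-based access check extended with a temporal
# constraint and a break-glass (emergency override) path. Roles,
# permissions, and hours are hypothetical examples.

ROLE_PERMISSIONS = {
    "physician": {"read_record", "write_record"},
    "nurse": {"read_record"},
}
ON_DUTY_HOURS = range(8, 18)  # temporal constraint: 08:00-17:59

audit_trail = []  # overrides must be logged for later review

def check_access(role, action, hour, emergency=False):
    if emergency:
        # Break-glass: allow full access but record it in the audit trail.
        audit_trail.append((role, action, hour, "EMERGENCY_OVERRIDE"))
        return True
    return action in ROLE_PERMISSIONS.get(role, set()) and hour in ON_DUTY_HOURS

print(check_access("nurse", "read_record", 9))                  # permitted
print(check_access("nurse", "write_record", 9))                 # denied
print(check_access("nurse", "write_record", 2, emergency=True)) # allowed, audited
```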
The growing worldwide population has increased the need for technologies, computerised software algorithms and smart devices that can monitor and assist patients anytime and anywhere and thus enable them to lead independent lives. The real-time remote monitoring of patients is an important issue in telemedicine. In the provision of healthcare services, patient prioritisation poses a significant challenge because of the complex decision-making process it involves when patients are considered 'big data'. To our knowledge, no study has highlighted the link between 'big data' characteristics and real-time remote healthcare monitoring in the patient prioritisation process, as well as the inherent challenges involved. Thus, we present comprehensive insights into the elements of big data characteristics according to the six 'Vs': volume, velocity, variety, veracity, value and variability. Each of these elements is presented and connected to a related part in the study of the connection between patient prioritisation and real-time remote healthcare monitoring systems. Then, we determine the weak points and recommend solutions as potential future work. This study makes the following contributions. (1) The link between big data characteristics and real-time remote healthcare monitoring in the patient prioritisation process is described. (2) The open issues and challenges for big data used in the patient prioritisation process are emphasised. (3) As a recommended solution, decision making using multiple criteria, such as vital signs and chief complaints, is utilised to prioritise the big data of patients with chronic diseases on the basis of the most urgent cases.
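The recommended multi-criteria prioritisation can be illustrated with a small sketch: each patient receives an urgency score from vital signs and chief complaint, and the queue is sorted so the most urgent case comes first. All thresholds and weights here are hypothetical and not clinical guidance.

```python
# Hypothetical sketch of multi-criteria patient prioritisation.
# Vital-sign thresholds and complaint weights are illustrative only.

def urgency(patient):
    score = 0
    if patient["heart_rate"] > 120 or patient["heart_rate"] < 50:
        score += 2  # tachycardia or bradycardia
    if patient["spo2"] < 92:
        score += 3  # low oxygen saturation
    score += {"chest pain": 3, "dizziness": 1}.get(patient["complaint"], 0)
    return score

patients = [
    {"id": "A", "heart_rate": 80,  "spo2": 97, "complaint": "dizziness"},
    {"id": "B", "heart_rate": 130, "spo2": 90, "complaint": "chest pain"},
    {"id": "C", "heart_rate": 72,  "spo2": 95, "complaint": "cough"},
]
queue = sorted(patients, key=urgency, reverse=True)
print([p["id"] for p in queue])  # most urgent first
```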
Parkinson's disease (PD) is a progressive neurodegenerative disorder that has affected a large part of the population to date. Symptoms of PD include tremor, rigidity, slowness of movement and vocal impairment. To develop an effective diagnostic system, a number of algorithms have been proposed, mainly to distinguish healthy individuals from those with PD. However, most previous works were based on binary classification, with the early PD stage and the advanced ones treated equally. Therefore, in this work, we propose a multiclass classification with three classes of PD severity level (mild, moderate, severe) and healthy control. The focus is to detect and classify PD using signals from wearable motion and audio sensors based on the empirical wavelet transform (EWT) and empirical wavelet packet transform (EWPT), respectively. The EWT/EWPT was applied to decompose both speech and motion data signals up to five levels. Next, several features were extracted after obtaining the instantaneous amplitudes and frequencies from the coefficients of the decomposed signals by applying the Hilbert transform. The performance of the algorithm was analysed using three classifiers - K-nearest neighbour (KNN), probabilistic neural network (PNN) and extreme learning machine (ELM). Experimental results demonstrated that the proposed approach can differentiate PD from non-PD subjects, including their severity level, with classification accuracies of more than 90% using EWT/EWPT-ELM on signals from motion and audio sensors respectively. Additionally, classification accuracy of more than 95% was achieved when EWT/EWPT-ELM was applied to the integrated information from both signals.
The non-stationary and multi-frequency nature of biomedical signals makes the use of time-frequency distributions (TFDs) for their analysis inevitable. Time-frequency analysis provides simultaneous interpretation in both the time and frequency domains, enabling comprehensive explanation, presentation and interpretation of electrocardiogram (ECG) signals. The diversity of TFDs and the specific properties of each type show the need to determine the best TFD for ECG analysis. In this study, a performance evaluation of five TFDs in terms of ECG abnormality detection is presented. The detection criteria are based on features extracted from the most important ECG signal component (the QRS complex) to detect normal and abnormal cases. This is achieved by estimating the magnitude of its energy concentration using the TFDs. The TFDs analyse ECG signals in one-minute intervals instead of the conventional time-domain approach that analyses per beat or per frame containing several beats. A total of 18 long-term records from the MIT-BIH Normal Sinus Rhythm ECG database, sampled at 128 Hz, have been analysed. The tested TFDs include the Dual-Tree Wavelet Transform, Spectrogram, Pseudo Wigner-Ville, Choi-Williams, and Born-Jordan distributions. Each record is divided into one-minute slots, an approach not considered previously, and analysed. The sample periods (slots) are randomly selected ten-minute intervals from each record. The result, with 99.44% detection accuracy over 15,735 ECG beats, shows that the Choi-Williams distribution is the most reliable for heart problem detection, especially in automated systems that provide continuous monitoring over long durations.
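The underlying detection idea, measuring how strongly a signal's energy concentrates in time-frequency, can be illustrated with a plain DFT spectrogram, a far simpler TFD than Choi-Williams. The frame length and test signal below are illustrative, not the study's configuration.

```python
# Illustrative sketch of an energy-concentration measure computed from a
# plain DFT spectrogram (stdlib only). Real TFDs such as Choi-Williams
# are considerably more involved.
import cmath, math

def dft(frame):
    n = len(frame)
    return [sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def energy_concentration(signal, frame_len=16):
    """Ratio of the strongest spectral bin's energy to total energy,
    averaged over non-overlapping frames."""
    ratios = []
    for start in range(0, len(signal) - frame_len + 1, frame_len):
        spectrum = [abs(c) ** 2 for c in dft(signal[start:start + frame_len])]
        ratios.append(max(spectrum) / sum(spectrum))
    return sum(ratios) / len(ratios)

# A pure tone concentrates energy in one bin pair; noisy input spreads it.
tone = [math.sin(2 * math.pi * 4 * t / 16) for t in range(64)]
print(energy_concentration(tone))  # close to 0.5 (two conjugate bins)
```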
In real-time medical systems, biometric technology plays a significant role in authentication systems because it verifies people's identities through their biometric features. Biometric technology provides crucial properties of biometric features that can support the process of personal identification. The storage of biometric templates in a central database makes them vulnerable to attack, which can also occur during data transmission. Therefore, developing an alternative protection mechanism becomes important. On this basis, this study provides a detailed analysis of the extant literature (2013-2018) to identify the taxonomy and research distribution. Furthermore, this study also seeks to ascertain the challenges and motivations associated with biometric steganography in real-time medical systems, and to provide recommendations that can enhance the efficient use of real-time medical systems in biometric steganography and its applications. A review of articles on human biometric steganography in real-time medical systems obtained from three main databases (IEEE Xplore, ScienceDirect and Web of Science) is conducted according to an appropriate review protocol. Then, 41 related articles are selected by applying exclusion and inclusion criteria. The majority of the reviewed studies were conducted in the field of data-hiding (particularly steganography) technologies. In this review, various steganographic methods that have been applied to different human biometrics are investigated. Thereafter, these methods are categorised according to the taxonomy, and the results are presented on the basis of human steganography biometric real-time medical systems, testing and evaluation methods, significance of use, and applications and techniques.
Finally, recommendations on addressing the challenges associated with data hiding are provided to enhance the efficiency of using biometric information processed in any real-time medical authentication system. These recommendations are expected to be immensely helpful to developers, company users and researchers.
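As a concrete illustration of the simplest data-hiding technique in the reviewed family, the following sketch embeds message bits into the least significant bits (LSB) of cover bytes. Production biometric steganography uses far more robust transform-domain methods; the cover and secret values here are made up.

```python
# Minimal LSB (least-significant-bit) steganography sketch.
# Illustrative only: real systems hide biometric templates with
# transform-domain embedding and error correction.

def embed(cover_bytes, message_bits):
    """Hide one message bit in the LSB of each cover byte."""
    stego = list(cover_bytes)
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & 0xFE) | bit  # clear LSB, then set it
    return stego

def extract(stego_bytes, n_bits):
    """Recover the hidden bits from the first n_bits stego bytes."""
    return [b & 1 for b in stego_bytes[:n_bits]]

cover = [200, 13, 76, 255, 0, 91, 142, 33]
secret = [1, 0, 1, 1]
stego = embed(cover, secret)
print(extract(stego, 4))  # recovers the hidden bits
```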
Electrocardiography (ECG) sensors play a vital role in the Internet of Medical Things, helping to monitor the electrical activity of the heart. ECG signal analysis can improve human life in many ways, from diagnosing diseases among cardiac patients to managing the lifestyles of diabetic patients. Abnormalities in heart activity lead to different cardiac diseases and arrhythmias. However, some cardiac diseases, such as myocardial infarction (MI) and atrial fibrillation (Af), require special attention due to their direct impact on human life. The classification of flattened T wave cases of MI in ECG signals, and how similar these cases are to ST-T changes in MI, remain open issues for researchers. This article presents a novel contribution to classifying MI and Af. To this end, we propose a new approach called deep deterministic learning (DDL), which works by combining predefined heart activities with fused datasets. In this research, we used two datasets. The first dataset, Massachusetts Institute of Technology-Beth Israel Hospital, is publicly available; we exclusively obtained the second dataset from the University of Malaya Medical Center, Kuala Lumpur, Malaysia. We first applied predefined activities to each individual dataset to recognize patterns between ST-T change and flattened T wave cases, and then used a data fusion approach to merge both datasets in a manner that delivers the most accurate pattern recognition results. The proposed DDL approach is a systematic stage-wise methodology that relies on accurate detection of R peaks in ECG signals, time-domain features of ECG signals, and fine-tuning of artificial neural networks. The empirical evaluation shows high accuracy (up to 99.97%) in pattern matching of ST-T changes and flattened T waves using the proposed DDL approach. The proposed pattern recognition approach is a significant contribution to the diagnosis of special cases of MI.
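The first DDL stage, accurate R-peak detection, can be illustrated with a toy threshold detector. This is not the paper's method (practical detectors such as Pan-Tompkins add filtering and adaptive thresholds); the waveform below is synthetic.

```python
# Illustrative sketch of threshold-based R-peak detection on a toy
# waveform, plus one basic time-domain feature (the R-R interval).

def detect_r_peaks(signal, threshold=0.6):
    """Return indices of local maxima above a fixed threshold."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]):
            peaks.append(i)
    return peaks

# Toy "ECG": two beats with R waves at indices 3 and 9.
ecg = [0.0, 0.1, 0.3, 1.0, 0.2, -0.1, 0.0, 0.2, 0.4, 0.9, 0.1, 0.0]
peaks = detect_r_peaks(ecg)
print(peaks)                       # [3, 9]
rr_interval = peaks[1] - peaks[0]  # samples between beats
```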
Blood leucocyte segmentation in medical images is viewed as a difficult process due to the variability of blood cells in shape and size and the difficulty of determining the location of the leucocytes. Manual analysis of blood tests to recognize leukocytes is tedious, time-consuming and error-prone because of the varied morphological components of the cells. Segmentation of medical imagery is considered a difficult task because of the complexity of the images, and also because no leucocyte model exists that entirely captures the probable shapes of each structure, incorporates cell overlapping, and accounts for the wide variety of blood cells in shape and size, the various elements influencing the outer appearance of blood leucocytes, and the low contrast of static microscope images, with further issues arising from noise. We propose a strategy for segmenting blood leucocytes in static microscope images that combines three established computer vision techniques: image enhancement, support vector machine based segmentation, and filtering out non-ROI (region of interest) regions on the basis of local binary patterns and texture features. Each of these techniques is adapted to the blood leucocyte segmentation problem, so the resulting method is considerably more robust than its individual components. Finally, we assess the framework by comparing its output with manual segmentation. The findings of this study yield a new approach that automatically segments blood leucocytes and identifies them in static microscope images. Initially, the method uses a trainable segmentation procedure and a trained support vector machine classifier to accurately identify the position of the ROI. After that, filtering based on histogram analysis is proposed to discard non-ROI regions and choose the right object.
Finally, the blood leucocyte type is identified using texture features. The performance of the proposed approach was evaluated by comparing the system against manual examination by a gynaecologist using diverse scales. A total of 100 microscope images were used for the comparison, and the results showed that the proposed solution is a viable alternative to the manual segmentation method for accurately determining the ROI. We evaluated blood leucocyte identification using the ROI texture (LBP feature). The identification accuracy of the technique is about 95.3%, with 100% sensitivity and 91.66% specificity.
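The reported figures (accuracy, sensitivity, specificity) follow from standard confusion-matrix formulas; the sketch below shows those formulas with made-up counts, not the study's actual data.

```python
# Standard classification metrics computed from confusion-matrix counts.
# The counts here are invented examples for illustration.

def metrics(tp, tn, fp, fn):
    accuracy    = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate (recall)
    specificity = tn / (tn + fp)   # true-negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = metrics(tp=48, tn=44, fp=4, fn=4)
print(f"accuracy={acc:.3f} sensitivity={sens:.3f} specificity={spec:.3f}")
```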
This paper proposes a robotic Transesophageal Echocardiography (TOE) system concept for catheterization laboratories. Cardiovascular disease causes one third of all global mortality. TOE is utilized to assess cardiovascular structures and monitor cardiac function during diagnostic procedures and catheter-based structural interventions. However, the operation of TOE is subject to various conditions that may negatively affect performance, the health of the cardiac sonographer and patient safety. Taken together, these factors evince the potential of robot-assisted TOE. Hence, a careful integration of clinical experience and Systems Engineering methods was used to develop a concept and physical model for TOE manipulation. The motion of the different actuators of the fabricated motorized system has been tested. It is concluded that the developed medical system, counteracting the combined disadvantages, represents a progressive approach to cardiac healthcare.
One of the major issues in time-critical medical applications using wireless technology is the size of the payload packet, which is generally designed to be very small to improve the transmission process. Using small packets to transmit continuous ECG data is still costly. Thus, data compression is commonly used to reduce the huge amount of ECG data transmitted through telecardiology devices. In this paper, a new ECG compression scheme is introduced to ensure that the compressed ECG segments fit into the available limited payload packets, while maintaining a fixed compression ratio (CR) to preserve the diagnostic information. The scheme automatically divides the ECG block into segments while keeping the other compression parameters fixed. It adopts the discrete wavelet transform (DWT) to decompose the ECG data, a bit-field preserving (BFP) method to preserve the quality of the DWT coefficients, and a modified run-length encoding (RLE) scheme to encode the coefficients. The proposed dynamic compression scheme showed promising results, with a percentage packet reduction (PR) of about 85.39% at low percentage root-mean-square difference (PRD) values of less than 1%. ECG records from the MIT-BIH Arrhythmia Database were used to test the proposed method. The simulation results showed promising performance that satisfies the needs of portable telecardiology systems, such as limited payload size and low power consumption.
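Two of the building blocks named above, wavelet decomposition and run-length encoding, can be illustrated in miniature with a one-level Haar transform and a plain RLE. The paper's BFP quantisation and modified RLE are considerably richer; the sample values below are arbitrary.

```python
# Illustrative sketch: one-level Haar DWT plus a simple run-length
# encoder. Smooth signals yield zero-heavy detail coefficients, which
# is what makes RLE effective after the transform.

def haar_dwt(x):
    """One decomposition level: pairwise averages (approximation)
    and pairwise differences (detail)."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def rle(values):
    """Encode runs of repeated values as [value, count] pairs."""
    out = []
    for v in values:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

approx, detail = haar_dwt([4, 4, 8, 8, 6, 2, 0, 0])
print(approx, detail)  # smooth trend and zero-heavy details
print(rle(detail))     # zeros collapse into short runs
```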
New and ground-breaking real-time remote monitoring in triage and priority-based sensor technology used in telemedicine has significantly bounded and dispersed communication components. To examine these technologies and provide researchers with a clear vision of this area, we must first be aware of the approaches utilised and the existing limitations in this line of research. To this end, an extensive search was conducted to find articles dealing with (a) telemedicine, (b) triage, (c) priority and (d) sensors, in order to comprehensively review related applications and establish a coherent taxonomy of these articles. The ScienceDirect, IEEE Xplore and Web of Science databases were checked for articles on triage and priority-based sensor technology in telemedicine. The retrieved articles were filtered according to the type of telemedicine technology explored. A total of 150 articles were selected and classified into two categories. The first category includes reviews and surveys of triage and priority-based sensor technology in telemedicine. The second category includes articles on the three-tiered architecture of telemedicine. Tier 1 represents the users. Sensors acquire the vital signs of the users and send them to Tier 2, the personal gateway, which uses local area network protocols or a wireless body area network. Medical data are sent from Tier 2 to Tier 3, the healthcare provider in medical institutes. Then, the motivation for using triage and priority-based sensor technology in telemedicine, the issues obstructing its application and the development and utilisation of telemedicine are examined on the basis of the findings presented in the literature.
This paper presents a new approach to prioritize the 'large-scale data' of patients with chronic heart diseases by using body sensors and communication technology during disasters and peak seasons. An evaluation matrix is used for emergency evaluation and large-scale data scoring of patients with chronic heart diseases in a telemedicine environment. However, one major problem in the emergency evaluation of these patients is establishing a reasonable threshold for patients with the most and least critical conditions. This threshold can be used to detect the highest and lowest priority levels when all the patients' scores are identical during disasters and peak seasons. A practical study was performed on 500 patients with chronic heart diseases and different symptoms, and their emergency levels were evaluated based on four main measurements: electrocardiogram, oxygen saturation sensor, blood pressure monitoring, and a non-sensory measurement tool, namely, a text frame. Data alignment was conducted for the raw data and the decision-making matrix by converting each extracted feature into an integer. This integer represents the patient's state at the triage level based on medical guidelines, so that features from different sources can be combined in one platform. The patients were then scored based on a decision matrix by using multi-criteria decision-making techniques, namely, the integrated multi-layer analytic hierarchy process (MLAHP) and the technique for order performance by similarity to ideal solution (TOPSIS). For subjective validation, cardiologists were consulted to confirm the ranking results. For objective validation, mean ± standard deviation was computed to check the accuracy of the systematic ranking. This study provides scenarios and checklist benchmarking to evaluate the proposed and existing prioritization methods. Experimental results revealed the following.
(1) The integration of TOPSIS and MLAHP effectively and systematically solved the patient triage and prioritization problems. (2) In subjective validation, the first five patients assigned to the doctors were the most urgent cases that required the highest priority, whereas the last five patients were the least urgent cases and were given the lowest priority. In objective validation, scores significantly differed between the groups, indicating that the ranking results were consistent. (3) For the first, second, and third scenarios, the proposed method exhibited an advantage over the benchmark method with percentages of 40%, 60%, and 100%, respectively. In conclusion, patients with the most and least urgent cases received the highest and lowest priority levels, respectively.
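TOPSIS, one of the two MCDM techniques combined above, can be sketched compactly: normalise the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. The matrix and weights below are hypothetical, not the 500-patient data.

```python
# Compact TOPSIS sketch: rank alternatives by closeness to the ideal
# solution. The decision matrix and weights are hypothetical examples.
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: score of alternative i on criterion j.
    benefit[j]: True if larger is better for criterion j."""
    ncols = len(weights)
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[row[j] / norms[j] * weights[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((row[j] - ideal[j]) ** 2 for j in range(ncols)))
        d_neg = math.sqrt(sum((row[j] - worst[j]) ** 2 for j in range(ncols)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores

# Three patients scored on two urgency criteria (higher = more urgent).
scores = topsis([[7, 9], [3, 2], [5, 5]], weights=[0.6, 0.4], benefit=[True, True])
ranking = sorted(range(3), key=lambda i: scores[i], reverse=True)
print(ranking)  # most urgent patient first
```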
Hospital scheduling presents huge challenges for the healthcare industry. Various studies have been conducted in many different countries with a focus on both elective and non-elective surgeries. There are important variables and factors that need to be taken into consideration. Different methods and approaches have also been used to examine hospital scheduling. Notwithstanding the continuous changes in modern healthcare services and, in particular, hospital operations, consistent reviews and further studies are still required. The importance of hospital scheduling has become even more critical as the trade-off between limited resources and overwhelming demand becomes more evident. This situation is even more pressing in a volatile country where shootings and bombings occur in public areas. Hospital scheduling for elective surgeries in a volatile country such as Iraq is therefore often interrupted by non-elective surgeries arising from war-related incidents. Hence, this paper addresses this issue by proposing a hospital scheduling model with a focus on the neurosurgery department. The aim of the model is to maximize utilization of the operating room while concurrently minimizing surgery idle time. The study focused on the neurosurgery department of Al-Shahid Ghazi Al-Hariri hospital in Baghdad, Iraq. In doing so, a mixed-integer linear programming (MILP) model is formulated in which interruptions by non-elective surgeries are incorporated into the main elective-surgery model. A computational experiment is then carried out to test the model. The results indicate that the model is feasible and can be solved in reasonable time. Nonetheless, its practicality is further tested as the problem size and the computation times grow. Application of heuristic methods is the way forward to ensure better practicality of the proposed model. Finally, the potential benefits of this study and the proposed model are discussed.
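The scheduling objective, maximising operating-room utilisation (equivalently, minimising idle time), is knapsack-like and can be illustrated by brute force on a toy instance. This is not the paper's MILP formulation; the case durations and session length below are invented.

```python
# Illustrative brute-force version of the objective: choose elective
# surgeries to maximise operating-room utilisation within a fixed
# session. A MILP solver replaces this enumeration at realistic sizes.
from itertools import combinations

def best_schedule(durations, session_minutes):
    """Return the chosen case indices and the remaining idle time."""
    best, best_used = (), 0
    for r in range(1, len(durations) + 1):
        for combo in combinations(range(len(durations)), r):
            used = sum(durations[i] for i in combo)
            if used <= session_minutes and used > best_used:
                best, best_used = combo, used
    return best, session_minutes - best_used

durations = [120, 90, 60, 45, 30]  # hypothetical case lengths (minutes)
chosen, idle = best_schedule(durations, session_minutes=240)
print(chosen, idle)  # a full session leaves zero idle time here
```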
The increasing demand for Android mobile devices and blockchain has motivated malware creators to develop mobile malware that compromises the blockchain. Although the blockchain is secure, attackers have managed to gain access to the blockchain as legal users, thereby compromising important and crucial information. Examples of mobile malware include root exploits, botnets, and Trojans, and root exploits are among the most dangerous malware. A root exploit compromises the operating system kernel to gain root privileges, which attackers then use to bypass security mechanisms, gain complete control of the operating system, install other types of malware on the device, and finally steal victims' private keys linked to the blockchain. To maximize the security of blockchain-based medical data management (BMDM), it is crucial to investigate the novel features and approaches contained in root exploit malware. This study proposes to use the bio-inspired method of particle swarm optimization (PSO), which automatically selects the exclusive features that contain the novel Android Debug Bridge (ADB) features. This study also adopts boosting (AdaBoost, RealAdaBoost, LogitBoost, and MultiBoost) to enhance the machine learning prediction that detects unknown root exploits, and scrutinizes three categories of features: (1) system commands, (2) directory paths and (3) code-based features. The evaluation gathered from this study suggests a marked accuracy value of 93% with LogitBoost in the simulation. LogitBoost also helped to predict all the root exploit samples in our developed system, the Root Exploit Detection System (RODS).
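The PSO mechanism the study applies to feature selection can be illustrated on a one-dimensional toy objective. This is a generic continuous PSO, not the study's binary feature-mask variant; the inertia and acceleration coefficients are textbook defaults.

```python
# Minimal particle swarm optimisation (PSO) sketch. The swarm simply
# minimises f(x) = (x - 3)^2; the binary feature-selection variant
# follows the same velocity/position update rule on bit masks.
import random

def pso(f, n_particles=10, iters=50, seed=1):
    rng = random.Random(seed)
    pos = [rng.uniform(-10, 10) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]              # each particle's best-known position
    gbest = min(pbest, key=f)   # swarm-wide best-known position
    for _ in range(iters):
        for i in range(n_particles):
            # inertia + cognitive pull (pbest) + social pull (gbest)
            vel[i] = (0.7 * vel[i]
                      + 1.5 * rng.random() * (pbest[i] - pos[i])
                      + 1.5 * rng.random() * (gbest - pos[i]))
            pos[i] += vel[i]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i]
        gbest = min(pbest, key=f)
    return gbest

print(round(pso(lambda x: (x - 3) ** 2), 2))  # converges near 3.0
```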
The burden on healthcare services around the world has increased substantially in the past decades. The quality and quantity of care have to increase to meet surging demands, especially among patients with chronic heart diseases. The expansion of information and communication technologies has led to new models for the delivery of healthcare services in telemedicine. Therefore, mHealth plays an imperative role in the sustainable delivery of healthcare services in telemedicine. This paper presents a comprehensive review of healthcare service provision. It highlights the open issues and challenges related to the use of real-time fault-tolerant mHealth systems in telemedicine. The methodological aspects of mHealth are examined, and three distinct and successive phases are presented. The first discusses the identification process for establishing a decision matrix based on a crossover of 'time of arrival of patient at the hospital/multi-services' and 'hospitals' within mHealth. The second phase discusses the development of a decision matrix for hospital selection based on the MAHP method. The third phase discusses the validation of the proposed system.
This study systematically reviews prior research on the evaluation and benchmarking of automated acute leukaemia classification tasks. The review relies on three reliable search engines: ScienceDirect, Web of Science and IEEE Xplore. A research taxonomy developed for the review considers a wide perspective on automated detection and classification of acute leukaemia research and reflects usage trends in the evaluation criteria in this field. The developed taxonomy consists of three main research directions in this domain and involves two phases. The first phase includes all three research directions. The second demonstrates all the criteria used for evaluating acute leukaemia classification. The final set of studies includes 83 investigations, most of which focused on enhancing the accuracy and performance of detection and classification through proposed methods or systems. Few efforts were made to address evaluation issues. According to the final set of articles, three groups of articles represent the main research directions in this domain: 56 articles highlight proposed methods, 22 articles involve proposals for system development and 5 papers centre on evaluation and comparison. The other side of the taxonomy includes 16 main and sub-criteria for evaluation and benchmarking. This review highlights three serious issues in the evaluation and benchmarking of multiclass classification of acute leukaemia, namely, conflicting criteria, evaluation criteria and criteria importance. It also identifies the weakness of benchmarking tools. To solve these issues, multicriteria decision-making (MCDM) analysis techniques are proposed as effective recommended solutions in the methodological aspect. This methodological aspect involves a proposed decision support system based on MCDM for evaluation and benchmarking, to select suitable multiclass classification models for acute leukaemia.
The proposed support system has three sequential phases. Phase One presents the identification procedure and the process for establishing a decision matrix based on a crossover of evaluation criteria and acute leukaemia multiclass classification models. Phase Two describes the development of the decision matrix for selecting acute leukaemia classification models based on the integrated best-worst method (BWM) and VIKOR. Phase Three entails the validation of the proposed system.
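The ranking half of the BWM-VIKOR pipeline can be sketched compactly: VIKOR scores each alternative by a weighted compromise of group utility and individual regret, where a lower Q is better. The decision matrix, criteria, and weights below are hypothetical, not the review's actual values.

```python
# Compact VIKOR sketch for benefit criteria (larger raw score = better).
# Decision matrix and weights are hypothetical examples.

def vikor(matrix, weights, v=0.5):
    """Return Q values (lower = better) for each alternative (row)."""
    ncols = len(weights)
    best  = [max(row[j] for row in matrix) for j in range(ncols)]
    worst = [min(row[j] for row in matrix) for j in range(ncols)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
                 for j in range(ncols)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    return [v * (S[i] - s_min) / (s_max - s_min)
            + (1 - v) * (R[i] - r_min) / (r_max - r_min)
            for i in range(len(matrix))]

# Three classifier models scored on two benefit criteria (e.g. accuracy, F1).
q = vikor([[0.95, 0.93], [0.90, 0.88], [0.92, 0.94]], weights=[0.6, 0.4])
print(q.index(min(q)))  # index of the best-ranked model
```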