Diagnostic radiology is a core and integral part of modern medicine, supporting primary care physicians in disease diagnosis, treatment and therapy management. Recent standard healthcare procedures have benefitted immensely from contemporary information technology, which has transformed how diagnostic data are acquired, stored and shared for efficient and timely diagnosis of disease. The connected health network was introduced as an alternative to the ageing traditional healthcare model, improving hospital-physician connectivity and clinical collaboration. This modern approach has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, yet the challenge of successfully encrypting both image and textual data persists. Furthermore, keeping the encryption-decryption time of medical datasets within a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secure radiology-diagnostic data framework for a connected health network using a high-performance GPU-accelerated Advanced Encryption Standard (AES). The study was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patients' notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports. The framework not only accurately encrypts and decrypts medical image datasets, but also successfully encrypts and decrypts textual data in Microsoft Word, Microsoft Excel and Portable Document Format files, the conventional formats for documenting medical records. The entire encryption and decryption procedure was achieved at a low computational cost using regular hardware and software resources, without compromising either the quality of the decrypted data or the security level of the algorithm.
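As an illustration of the encryption step only, the following is a minimal CPU-based sketch using PyCryptodome; it is not the authors' GPU-accelerated implementation, and the file names, key size and AES-GCM mode are assumptions made for the example.

```python
# A minimal CPU-based sketch of the AES encrypt/decrypt step using PyCryptodome.
# This is not the authors' GPU-accelerated pipeline; file names are hypothetical
# and AES-GCM is chosen here only because it also provides integrity checking.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

def encrypt_file(in_path: str, out_path: str, key: bytes) -> None:
    """Encrypt any byte stream (MR/CT image, DOCX, XLSX, PDF) with AES-GCM."""
    data = open(in_path, "rb").read()
    cipher = AES.new(key, AES.MODE_GCM)
    ciphertext, tag = cipher.encrypt_and_digest(data)
    with open(out_path, "wb") as f:
        f.write(cipher.nonce + tag + ciphertext)   # 16-byte nonce + 16-byte tag

def decrypt_file(in_path: str, out_path: str, key: bytes) -> None:
    blob = open(in_path, "rb").read()
    nonce, tag, ciphertext = blob[:16], blob[16:32], blob[32:]
    cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
    plaintext = cipher.decrypt_and_verify(ciphertext, tag)  # raises if tampered
    with open(out_path, "wb") as f:
        f.write(plaintext)

if __name__ == "__main__":
    key = get_random_bytes(32)                     # AES-256 key
    encrypt_file("brain_mr_slice.dcm", "brain_mr_slice.enc", key)
    decrypt_file("brain_mr_slice.enc", "brain_mr_slice.out.dcm", key)
```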
Composite service design modeling is an essential process of the service-oriented software development life cycle, in which candidate services, composite services, operations and their dependencies must be identified and specified before their design. However, systematic service-oriented design modeling of composite services is still in its infancy, as most existing approaches model atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling service-oriented design that increases reusability and decreases system complexity while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of a service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is suitable for composite service design modeling of distributed embedded real-time systems as well as enterprise software development, it is applied to a smart-home case study. The results of the case study not only demonstrate the applicability of ComSDM but can also be used to validate its complexity and reusability. They also guide future research on design quality measurement, such as using the ComSDM method to measure the quality of composite service designs in service-oriented software systems.
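As a hedged illustration of the graph-based representation idea, the sketch below captures services and their dependencies as a directed graph; the smart-home service names and the fan-in/fan-out counts are hypothetical stand-ins, not the ComSDM formalism or its quality metrics.

```python
# A hedged sketch of representing a composite service design as a directed graph,
# in the spirit of a graph-based design representation. Service names are
# hypothetical smart-home examples; the counts below are simple stand-ins,
# not the paper's formal quality measures.
import networkx as nx

design = nx.DiGraph()
# Nodes: candidate services; edges: "depends on / invokes" relationships.
design.add_edges_from([
    ("ClimateControl", "TemperatureSensor"),
    ("ClimateControl", "HVACActuator"),
    ("SecurityMonitor", "DoorSensor"),
    ("SecurityMonitor", "NotificationService"),
    ("MorningRoutine", "ClimateControl"),      # composite service reusing others
    ("MorningRoutine", "LightingControl"),
])

for svc in design.nodes:
    fan_out = design.out_degree(svc)   # how many services this one depends on
    fan_in = design.in_degree(svc)     # how many compositions reuse it
    print(f"{svc}: fan-out={fan_out}, fan-in (reuse)={fan_in}")

# In this toy view, a composite service is simply a node with outgoing dependencies.
composites = [s for s in design.nodes if design.out_degree(s) > 0]
print("Composite services:", composites)
```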
Developing a software program to manage data in a general practice setting is complicated. Vision Integrated Medical System is an example of an integrated management system that was developed by general practitioners, within a general practice, to offer a user-friendly system with multitasking capabilities. The present report highlights the reasons behind the development of this system and how it can assist day-to-day practice.
The parallelisation of big data is emerging as an important framework for large-scale parallel data applications such as seismic data processing. Seismic datasets are so large and complex that traditional data processing software is incapable of dealing with them. For example, implementing parallel processing in seismic applications to improve processing speed is complex in nature. To overcome this issue, a simple technique is needed that provides parallel processing for big data applications such as seismic algorithms. In our framework, we used Apache Hadoop with its MapReduce function. All experiments were conducted on the RedHat CentOS platform. Finally, we studied the bottlenecks and improved the overall performance of the system for seismic algorithms (stochastic inversion).
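As a hedged illustration of the MapReduce pattern (not the paper's stochastic-inversion job), the following Hadoop Streaming mapper and reducer average a per-trace value; the record layout is hypothetical.

```python
# mapper.py -- a minimal Hadoop Streaming sketch (hypothetical record layout:
# "trace_id<TAB>amplitude" per line). It only illustrates the map step of
# emitting key/value pairs; it is not the paper's seismic workload.
import sys

for line in sys.stdin:
    parts = line.strip().split("\t")
    if len(parts) != 2:
        continue
    trace_id, amplitude = parts
    print(f"{trace_id}\t{amplitude}")
```

```python
# reducer.py -- aggregates values per trace_id (mean amplitude). Hadoop Streaming
# delivers the mapper output to the reducer sorted by key.
import sys
from itertools import groupby

def keyed_lines():
    for line in sys.stdin:
        key, value = line.strip().split("\t")
        yield key, float(value)

for trace_id, group in groupby(keyed_lines(), key=lambda kv: kv[0]):
    values = [v for _, v in group]
    print(f"{trace_id}\t{sum(values) / len(values):.4f}")
```

Such scripts are typically submitted through the Hadoop Streaming jar with its -mapper and -reducer options, which lets Hadoop distribute the map and reduce phases across the cluster.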
Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of the requirements that need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed, no existing method or approach considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach that integrates the prioritization of functional and nonfunctional requirements. Applying the proposed approach produces two separate prioritized lists of functional and nonfunctional requirements. The effectiveness of the approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our approach outperforms AHP and HAM in terms of actual time consumption while preserving the quality of its results, with a high level of agreement with the results produced by the other two approaches.
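Since AHP is one of the baselines, the following is a minimal sketch of the AHP weighting step: deriving a priority vector from a pairwise comparison matrix via the normalized-column-mean approximation. The requirements and judgment values are hypothetical.

```python
# A hedged sketch of the AHP step used as a baseline: derive requirement weights
# from a pairwise comparison matrix. The requirements and judgments are
# hypothetical; the normalized-column-mean approximation of the principal
# eigenvector is used for brevity.
import numpy as np

requirements = ["login", "search", "export report", "audit trail"]
# pairwise[i, j] = how much more important requirement i is than requirement j
pairwise = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
])

column_sums = pairwise.sum(axis=0)
normalized = pairwise / column_sums          # each column now sums to 1
weights = normalized.mean(axis=1)            # approximate principal eigenvector

for req, w in sorted(zip(requirements, weights), key=lambda x: -x[1]):
    print(f"{req}: {w:.3f}")
```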
The vector evaluated particle swarm optimisation (VEPSO) algorithm was previously improved by incorporating nondominated solutions for solving multiobjective optimisation problems. However, the obtained solutions neither converged close to the Pareto front nor distributed evenly over it. Therefore, in this study, the concept of multiple nondominated leaders is incorporated to further improve the VEPSO algorithm: multiple nondominated solutions, each best at a respective objective function, are used to guide particles in finding optimal solutions. The improved VEPSO is evaluated by the number of nondominated solutions found, generational distance, spread, and hypervolume. The results of the conducted experiments show that the proposed VEPSO significantly improves on the existing VEPSO algorithms.
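As a hedged sketch of the leader-selection idea only (not the full VEPSO algorithm), the code below extracts the nondominated archive from a set of objective vectors and picks, per objective, the archive member that is best at that objective.

```python
# A hedged sketch of the leader-selection idea: keep the nondominated archive
# and, for each objective, pick the archive member best at that objective to
# guide the corresponding sub-swarm. This is not the full VEPSO algorithm.
import numpy as np

def nondominated(objs: np.ndarray) -> np.ndarray:
    """Return a boolean mask of nondominated rows (minimization assumed)."""
    n = objs.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        dominates_i = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        if np.any(dominates_i):
            mask[i] = False
    return mask

# Hypothetical 2-objective values for 6 particles.
objs = np.array([[1.0, 9.0], [2.0, 7.0], [3.0, 8.0],
                 [4.0, 3.0], [6.0, 2.0], [8.0, 6.0]])
archive = objs[nondominated(objs)]

# One leader per objective: the archive member best (smallest) at that objective.
leaders = [archive[np.argmin(archive[:, m])] for m in range(archive.shape[1])]
print("Nondominated archive:\n", archive)
print("Leaders per objective:", leaders)
```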
Voting is an important operation in the multichannel computation paradigm and in the realization of ultrareliable, real-time control systems, arbitrating among the results of N redundant variants. Such systems include N-modular redundant (NMR) hardware systems and diversely designed software systems based on N-version programming (NVP). Depending on the characteristics of the application and the type of voter selected, voting algorithms can be implemented for either hardware or software systems. In this paper, a novel voting algorithm is introduced for real-time fault-tolerant control systems, appropriate for applications in which N is large. Its behavior was then evaluated in a software implementation under different scenarios of error injection on the system inputs. The results, analyzed through plots and statistical computations, demonstrate that the novel algorithm does not have the limitations of some popular voting algorithms such as the median and weighted voters; moreover, it is able to significantly increase the reliability and availability of the system, by 2489.7% and 626.74%, respectively, in the best case, and by 3.84% and 1.55%, respectively, in the worst case.
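The abstract does not give the new algorithm's details, so the following sketch shows only one of the baseline voters it is compared against: a classic median voter over N redundant module outputs.

```python
# A hedged sketch of a classic median voter over N redundant module outputs --
# one of the baseline voters the abstract mentions, not the paper's new
# algorithm (whose details are not given in the abstract).
from statistics import median

def median_vote(outputs: list[float]) -> float:
    """Arbitrate among N redundant results by taking the median value."""
    if not outputs:
        raise ValueError("no module outputs to vote on")
    return median(outputs)

# Hypothetical 5-modular-redundant readings, one of them erroneous.
readings = [10.1, 10.2, 55.0, 10.0, 10.3]
print(median_vote(readings))   # 10.2 -- the single erroneous value is masked
```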
The Unified Modeling Language (UML) is the most popular and widely used object-oriented modelling language in the IT industry. This study investigates the extent to which UML can be expanded to model crosscutting concerns (aspects) in support of AspectJ. Through a comprehensive literature review, we identify and extensively examine the available Aspect-Oriented UML modelling approaches and find that the existing Aspect-Oriented design modelling approaches using UML do not provide a framework for a comprehensive Aspectual UML modelling approach, and that adequate Aspect-Oriented tool support is lacking. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The results of the focus group evaluation confirm all the hypotheses put forward regarding the proposed approach. The proposed approach provides a comprehensive set of Aspectual UML structural and behavioural diagrams, which are designed and implemented based on a comprehensive and detailed set of AspectJ programming constructs.
For medical applications, the efficiency and transmission distance of wireless power transfer (WPT) are always the main concerns. Research has shown that impedance matching is one of the critical factors in dealing with this problem; however, little work has taken both the source and load sides into consideration. Matching both sides is crucial to achieving optimum overall performance, and the present work proposes a circuit-model analysis for design and implementation. The proposed technique was validated against experiment and software simulation. Results show an improvement in transmission distance of up to 6 times, and the efficiency at this transmission distance improved by up to 7 times compared with the impedance-mismatched system. The system demonstrated a near-constant transfer efficiency over an operating range of 2 cm to 12 cm.
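The abstract does not specify the matching network, so the following is only a hedged illustration of two-sided matching using a simple L-section between two purely resistive impedances; the frequency and resistance values are hypothetical.

```python
# A hedged illustration of source/load matching: the abstract does not give the
# matching topology, so this computes a simple L-section match between two
# purely resistive impedances. Frequency and resistances are hypothetical.
import math

def l_match(r_source: float, r_load: float, freq_hz: float):
    """Series-L / shunt-C L-network matching the lower resistance up to the higher one."""
    r_high, r_low = max(r_source, r_load), min(r_source, r_load)
    q = math.sqrt(r_high / r_low - 1.0)
    x_series = q * r_low            # series reactance on the low-R side
    x_shunt = r_high / q            # shunt reactance on the high-R side
    series_l = x_series / (2 * math.pi * freq_hz)
    shunt_c = 1.0 / (2 * math.pi * freq_hz * x_shunt)
    return q, series_l, shunt_c

# Hypothetical values: 50-ohm source, 5-ohm coil load, 6.78 MHz operating frequency.
q, L, C = l_match(r_source=50.0, r_load=5.0, freq_hz=6.78e6)
print(f"Q = {q:.2f}, series L = {L*1e6:.3f} uH, shunt C = {C*1e12:.0f} pF")
```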
New techniques are presented for Delaunay triangular mesh generation and element optimisation. Sample points for triangulation are generated through mapping (a new approach). These sample points are then triangulated by the conventional Delaunay method. The resulting triangular elements are optimised by addition, removal and relocation of mapped sample points (element nodes). The proposed techniques (generation of sample points through mapping for Delaunay triangulation, and mesh optimisation) are demonstrated using Mathematica software. Simulation results show that the proposed techniques are able to form meshes consisting of triangular elements with an aspect ratio of less than 2 and a minimum skewness of more than 45°.
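As a hedged sketch of the conventional Delaunay step with simple element-quality checks, the code below triangulates random sample points (not points generated by the proposed mapping approach); the edge-ratio and minimum-angle measures are common choices that may differ from the authors' exact aspect-ratio and skewness definitions.

```python
# A hedged sketch of the conventional Delaunay step plus simple element-quality
# checks. The sample points here are random, and the quality measures below are
# common definitions that may differ from the authors' exact metrics.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
points = rng.random((50, 2))
tri = Delaunay(points)

def triangle_quality(p):
    """Return (longest/shortest edge ratio, minimum internal angle in degrees)."""
    a, b, c = p
    e0 = np.linalg.norm(b - a)   # side AB
    e1 = np.linalg.norm(c - b)   # side BC
    e2 = np.linalg.norm(a - c)   # side CA
    # Law of cosines for each internal angle.
    angles = np.degrees([
        np.arccos(np.clip((e0**2 + e2**2 - e1**2) / (2 * e0 * e2), -1, 1)),  # at A
        np.arccos(np.clip((e0**2 + e1**2 - e2**2) / (2 * e0 * e1), -1, 1)),  # at B
        np.arccos(np.clip((e1**2 + e2**2 - e0**2) / (2 * e1 * e2), -1, 1)),  # at C
    ])
    edges = np.array([e0, e1, e2])
    return edges.max() / edges.min(), angles.min()

ratios, min_angles = zip(*(triangle_quality(points[s]) for s in tri.simplices))
print(f"worst edge ratio: {max(ratios):.2f}, worst minimum angle: {min(min_angles):.1f} deg")
```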
The Internet Engineering Task Force provides a network-based mobility management solution, called Proxy Mobile IPv6 (PMIPv6), to execute handover in heterogeneous networks on the network side. In this data article, data collected during horizontal and vertical handovers of video communication under the PMIPv6 mobility protocol are presented. The handover data are gathered using several measurement factors, namely latency, jitter, cumulative measurements, and peak signal-to-noise ratio, under network simulation software, for both horizontal and vertical handovers [8].
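As a hedged illustration of how two of the listed factors could be computed from captured traces, the sketch below calculates jitter (here taken as the mean absolute difference of successive packet delays) and PSNR between transmitted and received frames; the sample arrays are hypothetical.

```python
# A hedged sketch of computing two of the listed handover metrics from captured
# data: jitter (mean absolute difference of successive packet delays) and PSNR
# between transmitted and received video frames. The sample arrays are
# hypothetical; the simulator traces would be used in practice.
import numpy as np

def jitter_ms(delays_ms: np.ndarray) -> float:
    return float(np.mean(np.abs(np.diff(delays_ms))))

def psnr_db(reference: np.ndarray, received: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((reference.astype(np.float64) - received.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

delays = np.array([42.0, 44.5, 43.0, 90.0, 47.5])     # delay spike around the handover
frame_tx = np.random.default_rng(1).integers(0, 256, (144, 176), dtype=np.uint8)
noise = np.random.default_rng(2).integers(-4, 5, frame_tx.shape)
frame_rx = np.clip(frame_tx.astype(int) + noise, 0, 255)
print(f"jitter = {jitter_ms(delays):.2f} ms, PSNR = {psnr_db(frame_tx, frame_rx):.1f} dB")
```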
A description is given of a numerical integration method for calculating the mean kidney dose from a Co-57 external radiation source. Based on this theory, a computer program was written. An initial calculation of the kidney volume shows that the method has good accuracy. For the mean kidney dose, the method gives a satisfactory result, since the calculated value lies within the acceptable range of the central-axis depth dose.
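As a heavily hedged illustration of the numerical-integration idea, the sketch below estimates the volume of an ellipsoidal kidney model and its relative mean dose by summing over a voxel grid; the ellipsoid dimensions, organ depth and exponential depth-dose function are assumptions, not the paper's Co-57 data.

```python
# A heavily hedged illustration of the numerical-integration idea: estimate the
# volume of an ellipsoidal "kidney" and its relative mean dose by summing over a
# voxel grid. The ellipsoid dimensions and the exponential depth-dose function
# are hypothetical; the paper's own geometry and Co-57 depth-dose data would be used.
import numpy as np

dx = 0.1                                   # voxel size in cm
a, b, c = 5.5, 3.0, 2.0                    # hypothetical ellipsoid semi-axes (cm)
depth_of_centre = 7.0                      # hypothetical depth of the organ centre (cm)
mu = 0.12                                  # hypothetical effective attenuation (1/cm)

x = np.arange(-a, a, dx) + dx / 2
y = np.arange(-b, b, dx) + dx / 2
z = np.arange(-c, c, dx) + dx / 2
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
inside = (X / a) ** 2 + (Y / b) ** 2 + (Z / c) ** 2 <= 1.0

dose = np.exp(-mu * (depth_of_centre + Z))        # relative dose vs. depth
volume = inside.sum() * dx**3                     # numerical volume in cm^3
mean_dose = dose[inside].mean()                   # equal voxels -> volume-weighted mean

analytic_volume = 4.0 / 3.0 * np.pi * a * b * c
print(f"numerical volume = {volume:.1f} cm^3 (analytic {analytic_volume:.1f} cm^3)")
print(f"relative mean dose = {mean_dose:.4f}")
```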
In this paper, we examine the effectiveness of the quarter-sweep iteration concept on the conjugate gradient normal residual (CGNR) iterative method, using composite Simpson's (CS) and finite difference (FD) discretization schemes to solve Fredholm integro-differential equations. For comparison purposes, the Gauss-Seidel (GS) method and the standard full-sweep and half-sweep CGNR methods, namely FSCGNR and HSCGNR, are also presented. To validate the efficacy of the proposed method, several analyses were carried out, such as computational complexity and percentage reduction, for the proposed and existing methods.
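The following is a minimal sketch of the CGNR iteration itself (conjugate gradient applied to the normal equations); the quarter-sweep ordering and the composite Simpson / finite-difference discretization are not reproduced, and a random well-conditioned matrix stands in for the discretized operator.

```python
# A hedged sketch of the CGNR iteration (conjugate gradient applied to the
# normal equations A^T A x = A^T b). The quarter-sweep ordering and the
# CS/FD discretization of the integro-differential equation are not reproduced;
# a random well-conditioned system stands in for the discretized operator A.
import numpy as np

def cgnr(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros(A.shape[1])
    r = b - A @ x
    z = A.T @ r                      # residual of the normal equations
    p = z.copy()
    zz = z @ z
    for _ in range(max_iter):
        w = A @ p
        alpha = zz / (w @ w)
        x += alpha * p
        r -= alpha * w
        z = A.T @ r
        zz_new = z @ z
        if np.sqrt(zz_new) < tol:
            break
        p = z + (zz_new / zz) * p
        zz = zz_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 60)) + 10 * np.eye(60)   # well-conditioned test matrix
b = rng.standard_normal(60)
x = cgnr(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```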
This paper deals with the interface-relevant activity of a vehicle-integrated intelligent safety system (ISS) that includes an airbag deployment decision system (ADDS) and a tire pressure monitoring system (TPMS). A program is developed in LabWindows/CVI, using C, for prototype implementation. The prototype is primarily concerned with the interconnection between hardware objects such as a load cell, web camera, accelerometer, TPM tire module and receiver module, DAQ card, CPU card and a touch screen. Several safety subsystems, including image processing, weight sensing and crash detection systems, are integrated, and their outputs are combined to yield intelligent decisions regarding airbag deployment. The integrated safety system also monitors tire pressure and temperature. Testing and experimentation with this ISS suggest that the system is unique, robust, intelligent, and appropriate for in-vehicle applications.
Complex engineering systems are usually designed to last for many years. Such systems will face many uncertainties in the future; hence their design and deployment should not be based on a single scenario, but should incorporate flexibility. Flexibility can be incorporated in system architectures in the form of options that can be exercised in the future when new information is available. Incorporating flexibility comes, however, at a cost. To evaluate whether this cost is worth the investment, a real options analysis can be carried out. This approach is demonstrated through a case study of a previously developed static system-of-systems for maritime domain protection in the Straits of Malacca. This article presents a framework for dynamic strategic planning of engineering systems using real options analysis and demonstrates that flexibility adds considerable value over a static design. In addition, it is shown that Monte Carlo analysis and genetic algorithms can be successfully combined to find solutions in a case with a very large number of possible futures and system designs.
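As a hedged toy example of the real options idea, the sketch below values a single expansion option by Monte Carlo simulation of uncertain demand; all cash flows and the demand model are hypothetical, and the genetic-algorithm search over designs is not included.

```python
# A hedged toy example of valuing flexibility with Monte Carlo simulation: a
# system can later be expanded if demand turns out high. All cash flows, the
# demand model and the expansion cost are hypothetical; the paper additionally
# couples this kind of valuation with a genetic algorithm to search designs.
import numpy as np

rng = np.random.default_rng(42)
n_scenarios = 100_000

demand = rng.lognormal(mean=0.0, sigma=0.5, size=n_scenarios)   # uncertain future demand
base_capacity, expansion_capacity = 1.0, 1.0
value_per_unit, expansion_cost = 100.0, 40.0

# Static design: serve demand up to the base capacity only.
static_payoff = value_per_unit * np.minimum(demand, base_capacity)

# Flexible design: exercise the expansion option only in scenarios where it pays off.
expanded_payoff = value_per_unit * np.minimum(demand, base_capacity + expansion_capacity) - expansion_cost
flexible_payoff = np.maximum(static_payoff, expanded_payoff)

option_value = flexible_payoff.mean() - static_payoff.mean()
print(f"expected value of flexibility over the static design: {option_value:.1f}")
```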
In recent years, many studies have used speech-related features for speech emotion recognition; however, recent work shows that there is a strong correlation between emotional states and glottal features. In this work, Mel-frequency cepstral coefficients (MFCCs), linear predictive cepstral coefficients (LPCCs), perceptual linear predictive (PLP) features, gammatone filter outputs, timbral texture features, stationary wavelet transform based timbral texture features, and relative wavelet packet energy and entropy features were extracted from emotional speech (ES) signals and their glottal waveforms (GW). Particle swarm optimization based clustering (PSOC) and wrapper based particle swarm optimization (WPSO) were proposed to enhance the discerning ability of the features and to select the discriminating features, respectively. Three different emotional speech databases were used to evaluate the proposed method, and an extreme learning machine (ELM) was employed to classify the different types of emotions. The results of the conducted experiments show that the proposed method significantly improves speech emotion recognition performance compared to previous works published in the literature.
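As a minimal sketch of the ELM classifier named in the abstract, the code below trains a random hidden layer with least-squares output weights on synthetic feature vectors that stand in for the extracted speech and glottal features; the PSOC and WPSO stages are not reproduced.

```python
# A minimal sketch of the Extreme Learning Machine classifier: a random hidden
# layer followed by least-squares output weights. The synthetic features below
# stand in for the MFCC/LPCC/glottal features; the PSO-based clustering and
# feature-selection stages of the paper are not reproduced.
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y_onehot, n_hidden=200):
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                       # random hidden-layer activations
    beta = np.linalg.pinv(H) @ y_onehot          # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Synthetic 3-class "emotion" data: 300 samples, 40-dimensional feature vectors.
X = rng.standard_normal((300, 40))
labels = rng.integers(0, 3, 300)
X += labels[:, None] * 0.8                       # shift class means to make them separable
y_onehot = np.eye(3)[labels]

W, b, beta = elm_train(X[:240], y_onehot[:240])
accuracy = np.mean(elm_predict(X[240:], W, b, beta) == labels[240:])
print(f"held-out accuracy on synthetic data: {accuracy:.2f}")
```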
We present ClicO Free Service, an online web service based on Circos, which provides a user-friendly, interactive web-based interface with configurable features to generate Circos circular plots.
Popular text-matching software generates a percentage of similarity - called a "similarity score" or "Similarity Index" - that quantifies the matching text between a particular manuscript and content in the software's archives, on the Internet and in electronic databases. Many evaluators rely on these simple figures as a proxy for plagiarism and thus avoid the burdensome task of inspecting the longer, detailed Similarity Reports. Yet similarity scores, though alluringly straightforward, are never enough to judge the presence (or absence) of plagiarism. Ideally, evaluators should always examine the Similarity Reports. Given the persistent use of simplistic similarity score thresholds at some academic journals and educational institutions, however, and the time that can be saved by relying on the scores, a method is arguably needed that encourages examining the Similarity Reports but still also allows evaluators to rely on the scores in some instances. This article proposes a four-band method to accomplish this. Used together, the bands oblige evaluators to acknowledge the risk of relying on the similarity scores yet still allow them to ultimately determine whether they wish to accept that risk. The bands - for most rigor, high rigor, moderate rigor and less rigor - should be tailored to an evaluator's particular needs.
The use of open source software in health informatics is increasingly advocated in the literature. Although there is no clear evidence of the superiority of current open source applications in the healthcare field, the number of open source applications available online is growing and they are gaining greater prominence. This repertoire of open source options is of great value to anyone planning to adopt an electronic medical/health record system, whether selecting an existing application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support, both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects.
Biometric verification methods have gained significant popularity in recent times, which has brought about their extensive usage. In light of theoretical evidence surrounding the development of biometric verification, we propose an experimental multi-biometric system for laboratory testing. First, the system was designed to identify and verify a user through the hand contour and the blood flow (blood stream) at the upper part of the hand. Next, we detail the hardware and software solutions for the system. A total of 40 subjects agreed to take part in data generation, which produced 280 hand images. The core of this paper lies in evaluating the individual metrics, which are assessed by comparing the frequencies of the two fault types against the EER (Equal Error Rate) values. The lowest value was measured for the modified Hausdorff distance (MHD) metric. Furthermore, for the verified biometric characteristics (Hamming distance and MHD), appropriate and suitable metrics were proposed and tested to optimize system precision. The EER value for the designed multi-biometric system in this work was found to be 5%, which shows that consolidating metrics increases the precision of the multi-biometric system. The algorithms used in the proposed multi-biometric device show that the individual metrics exhibit significant accuracy but perform better when consolidated, with a few shortcomings.
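As a hedged sketch of the modified Hausdorff distance used as one of the comparison metrics, the code below computes it between two hypothetical hand-contour point sets; the full acquisition and score-fusion pipeline is not reproduced.

```python
# A hedged sketch of the modified Hausdorff distance (Dubuisson & Jain) applied
# to two hypothetical hand-contour point sets. The full verification pipeline
# (segmentation, blood-flow imaging, score fusion) is not reproduced here.
import numpy as np

def modified_hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """A and B are (n, 2) and (m, 2) arrays of contour points."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)   # pairwise distances
    return max(d.min(axis=1).mean(), d.min(axis=0).mean())

rng = np.random.default_rng(3)
enrolled = rng.random((120, 2)) * 100                    # stored hand-contour template
probe = enrolled + rng.normal(0, 1.5, enrolled.shape)    # same hand, slight noise
impostor = rng.random((120, 2)) * 100                    # unrelated contour

print(f"genuine score:  {modified_hausdorff(enrolled, probe):.2f}")
print(f"impostor score: {modified_hausdorff(enrolled, impostor):.2f}")
```

A lower score indicates a closer match, so a verification threshold between the typical genuine and impostor scores would separate the two cases in this toy setting.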