Diagnostic radiology is a core and integral part of modern medicine, paving the way for primary care physicians in disease diagnosis, treatment and therapy management. Recent standard healthcare procedures have benefitted immensely from contemporary information technology, which has transformed how diagnostic data are acquired, stored and shared for efficient and timely diagnosis of disease. The connected health network was introduced as an alternative to the ageing traditional healthcare system, improving hospital-physician connectivity and clinical collaboration. This modern approach has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, yet the challenge of successfully encrypting both image and textual data persists. Furthermore, keeping the encryption-decryption time of medical datasets within a considerably lower computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secure radiology-diagnostic data framework for the connected health network using a high-performance GPU-accelerated Advanced Encryption Standard (AES). The framework was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patient notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports.
Significantly, the framework not only accurately encrypts and decrypts medical image datasets, but also successfully encrypts and decrypts textual data in Microsoft Word, Microsoft Excel and Portable Document Format files, the conventional formats for documenting medical records. The entire encryption and decryption procedure was achieved at a low computational cost using regular hardware and software resources, without compromising either the quality of the decrypted data or the security level of the algorithm.
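The key observation behind encrypting images and reports with one pipeline is that every medical artifact (image pixels, DOCX, XLSX, PDF) is ultimately a byte string to a symmetric cipher. The toy sketch below illustrates that round-trip structure only; the SHA-256 keystream is a stand-in for AES in counter mode, is not secure, and is not the study's GPU-accelerated implementation.

```python
# Toy illustration only: a SHA-256-derived keystream stands in for AES-CTR.
# NOT secure and NOT the study's GPU AES; it only shows why one symmetric
# pipeline covers both image bytes and textual medical reports.
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Encryption and decryption are the same XOR with the keystream."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# The same call handles "image" bytes and report text alike:
report = "Patient note: no acute intracranial abnormality.".encode()
ciphertext = xor_cipher(b"demo-key", b"nonce-01", report)
assert xor_cipher(b"demo-key", b"nonce-01", ciphertext) == report
```

In a real deployment the keystream generation is exactly the part that parallelizes well on a GPU, since each counter block is independent.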
Composite service design modeling is an essential process in the service-oriented software development life cycle, where candidate services, composite services, operations and their dependencies must be identified and specified before they are designed. However, systematic service-oriented design modeling for composite services is still in its infancy, as most existing approaches model atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling service-oriented design so as to increase reusability and decrease system complexity while keeping service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is suitable for composite service design modeling of distributed embedded real-time systems as well as enterprise software development, it is applied in a smart-home case study. The results of the case study not only confirm the applicability of ComSDM but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service designs in service-oriented software systems.
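To make the graph-based idea concrete, a composite service design can be represented as a directed dependency graph from which simple quality proxies are read off. This is a generic sketch, not the ComSDM formalism itself; the class, service names and the two metrics are illustrative assumptions.

```python
# Hypothetical sketch (not ComSDM itself): a composite service design as a
# directed graph, with coupling measured as the edge count and reuse of a
# candidate service measured as its fan-in.
from collections import defaultdict

class ServiceDesignGraph:
    def __init__(self):
        self.deps = defaultdict(set)   # composite service -> services it calls

    def add_dependency(self, composite: str, component: str) -> None:
        self.deps[composite].add(component)

    def coupling(self) -> int:
        """Total number of dependency edges in the design."""
        return sum(len(targets) for targets in self.deps.values())

    def fan_in(self, service: str) -> int:
        """How many composites reuse this service (a reusability proxy)."""
        return sum(service in targets for targets in self.deps.values())

# Smart-home-flavoured example (names invented for illustration):
g = ServiceDesignGraph()
g.add_dependency("ClimateControl", "TemperatureSensor")
g.add_dependency("ClimateControl", "Heater")
g.add_dependency("EnergyMonitor", "TemperatureSensor")
assert g.coupling() == 3
assert g.fan_in("TemperatureSensor") == 2
```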
Due to budgetary deadlines and time-to-market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of the requirements that need to be considered first during the software development process. To achieve a high-quality software system, both functional and nonfunctional requirements must be taken into consideration during prioritization. Although several requirements prioritization methods have been proposed, no particular method or approach considers both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach that integrates the prioritization of functional and nonfunctional requirements. Applying the proposed approach produces two separate prioritized lists of functional and nonfunctional requirements. The effectiveness of the approach has been evaluated through an empirical experiment comparing it with two state-of-the-art approaches, the analytic hierarchy process (AHP) and the hybrid assessment method (HAM). Results show that our approach outperforms AHP and HAM in actual time consumption while preserving the quality of the results at a high level of agreement with the results produced by the other two approaches.
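For readers unfamiliar with the AHP baseline used in the comparison, the core computation can be sketched in a few lines: a pairwise-comparison matrix over requirements is reduced to a normalized priority vector, here via the common geometric-mean approximation. The 2x2 matrix is an invented example, not data from the paper.

```python
# Minimal AHP priority sketch (geometric-mean method). The comparison
# matrix is illustrative: requirement A is judged 3x as important as B.
import math

def ahp_priorities(matrix):
    """Return normalized priority weights from a pairwise-comparison matrix."""
    geo_means = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

weights = ahp_priorities([[1.0, 3.0],
                          [1.0 / 3.0, 1.0]])
assert abs(weights[0] - 0.75) < 1e-9
assert abs(weights[1] - 0.25) < 1e-9
```

The prioritized list is then simply the requirements sorted by these weights, which is also where the time cost of AHP comes from: the number of pairwise judgments grows quadratically with the number of requirements.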
The vector evaluated particle swarm optimisation (VEPSO) algorithm was previously improved by incorporating nondominated solutions for solving multiobjective optimisation problems. However, the obtained solutions neither converged close to the Pareto front nor distributed evenly over it. Therefore, in this study, the concept of multiple nondominated leaders is incorporated to further improve the VEPSO algorithm: multiple nondominated solutions that are best at their respective objective functions are used to guide particles in finding optimal solutions. The improved VEPSO is evaluated by the number of nondominated solutions found, generational distance, spread, and hypervolume. The results of the conducted experiments show that the proposed VEPSO significantly improves on the existing VEPSO algorithms.
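Of the quality indicators listed above, generational distance (GD) is the easiest to state concretely: the average distance from each obtained solution to its nearest point on the true Pareto front, in its simple mean-distance form. The points below are illustrative, not from the experiments.

```python
# Generational distance in its simple "mean distance to the nearest
# true-front point" form; lower is better (0 means all solutions lie
# exactly on the front).
import math

def generational_distance(solutions, pareto_front):
    return sum(
        min(math.dist(s, p) for p in pareto_front) for s in solutions
    ) / len(solutions)

front = [(0.0, 1.0), (1.0, 0.0)]          # known true front (illustrative)
found = [(0.0, 1.0), (1.0, 1.0)]          # one exact hit, one point 1.0 away
assert abs(generational_distance(found, front) - 0.5) < 1e-12
```

Spread and hypervolume complement GD: GD measures convergence to the front, while the other two capture how evenly and how widely the front is covered.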
Voting is an important operation in the multichannel computation paradigm and in the realization of ultrareliable, real-time control systems, arbitrating among the results of N redundant variants. These systems include N-modular redundant (NMR) hardware systems and diversely designed software systems based on N-version programming (NVP). Depending on the characteristics of the application and the type of voter selected, voting algorithms can be implemented for either hardware or software systems. In this paper, a novel voting algorithm is introduced for real-time fault-tolerant control systems, appropriate for applications in which N is large. Its behavior was then implemented in software under different scenarios of error injection on the system inputs. Evaluations through plots and statistical computations demonstrate that this novel algorithm does not have the limitations of some popular voting algorithms, such as median and weighted voting; moreover, it increases the reliability and availability of the system by 2489.7% and 626.74%, respectively, in the best case, and by 3.84% and 1.55%, respectively, in the worst case.
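The abstract does not specify the novel algorithm itself, but the two popular baselines it is compared against can be sketched: a median voter and an inexact weighted-average voter over N redundant channel outputs. The input values are invented for illustration.

```python
# Sketches of two classic software voters for N redundant inputs
# (baselines mentioned above, not the paper's novel algorithm).
import statistics

def median_voter(values):
    """Median voter: tolerates up to (N-1)//2 erroneous channels."""
    return statistics.median(values)

def weighted_average_voter(values, eps=1e-9):
    """Weight each value by the inverse of its squared summed distance
    to the other values, so outlying channels are attenuated."""
    weights = []
    for v in values:
        d = sum(abs(v - u) for u in values)
        weights.append(1.0 / (d * d + eps))
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Five channels, one faulty:
inputs = [10.0, 10.1, 9.9, 10.0, 55.0]
assert median_voter(inputs) == 10.0
assert abs(weighted_average_voter(inputs) - 10.0) < 1.0   # outlier damped
```

The example also shows the weakness the paper targets: the weighted voter is still pulled toward the faulty channel, while the median ignores magnitude information entirely.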
The Unified Modeling Language (UML) is the most popular and widely used object-oriented modelling language in the IT industry. This study investigates the extent to which UML can be extended to model crosscutting concerns (aspects) in support of AspectJ. Through a comprehensive literature review, we identify and extensively examine all available Aspect-Oriented UML modelling approaches, finding that the existing Aspect-Oriented design modelling approaches using UML do not provide a framework for a comprehensive Aspectual UML modelling approach, and that adequate Aspect-Oriented tool support is lacking. This study also proposes a set of Aspectual UML semantic rules and attempts to generate AspectJ pseudocode from UML diagrams. The proposed Aspectual UML modelling approach is formally evaluated using a focus group to test six hypotheses regarding performance; a "good design" criteria-based evaluation to assess the quality of the design; and an AspectJ-based evaluation as a reference measurement-based evaluation. The focus group evaluation confirms all the hypotheses put forward regarding the proposed approach, which provides a comprehensive set of Aspectual UML structural and behavioral diagrams, designed and implemented on the basis of a comprehensive and detailed set of AspectJ programming constructs.
In recent years, many research works using speech-related features for speech emotion recognition have been published; however, recent studies show a strong correlation between emotional states and glottal features. In this work, Mel-frequency cepstral coefficients (MFCCs), linear predictive cepstral coefficients (LPCCs), perceptual linear predictive (PLP) features, gammatone filter outputs, timbral texture features, stationary wavelet transform based timbral texture features, and relative wavelet packet energy and entropy features were extracted from the emotional speech (ES) signals and their glottal waveforms (GW). Particle swarm optimization based clustering (PSOC) and wrapper based particle swarm optimization (WPSO) were proposed to enhance the discerning ability of the features and to select the discriminating features, respectively. Three different emotional speech databases were utilized to gauge the proposed method, and an extreme learning machine (ELM) was employed to classify the different types of emotions. The experimental results show that the proposed method significantly improves speech emotion recognition performance compared to previous works published in the literature.
This paper deals with the interface-relevant activity of a vehicle-integrated intelligent safety system (ISS) that includes an airbag deployment decision system (ADDS) and a tire pressure monitoring system (TPMS). A program was developed in LabWindows/CVI, using C, for prototype implementation. The prototype is primarily concerned with the interconnection between hardware objects such as a load cell, web camera, accelerometer, TPM tire module and receiver module, DAQ card, CPU card and a touch screen. Several safety subsystems, including image processing, weight sensing and crash detection, are integrated, and their outputs are combined to yield intelligent decisions regarding airbag deployment. The integrated safety system also monitors tire pressure and temperature. Testing and experimentation with this ISS suggest that the system is unique, robust, intelligent, and appropriate for in-vehicle applications.
Complex engineering systems are usually designed to last for many years and will face many uncertainties in the future. Hence the design and deployment of these systems should not be based on a single scenario but should incorporate flexibility. Flexibility can be built into system architectures in the form of options that can be exercised in the future when new information is available. Incorporating flexibility, however, comes at a cost; to evaluate whether this cost is worth the investment, a real options analysis can be carried out. This approach is demonstrated through a case study of a previously developed static system-of-systems for maritime domain protection in the Straits of Malacca. This article presents a framework for dynamic strategic planning of engineering systems using real options analysis and demonstrates that flexibility adds considerable value over a static design. In addition, it is shown that Monte Carlo analysis and genetic algorithms can be successfully combined to find solutions in a case with a very large number of possible futures and system designs.
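The Monte Carlo side of such an analysis can be sketched in miniature: simulate many demand futures and compare a static design against a flexible one that pays a small option premium now but can expand later only when expansion pays off. All numbers and the demand model below are invented toys, not the maritime case study.

```python
# Toy real-options valuation by Monte Carlo (hypothetical numbers, not the
# maritime domain protection case study).
import random

def expected_value(flexible: bool, n_runs: int = 20000, seed: int = 7) -> float:
    """Mean payoff over simulated demand futures."""
    rng = random.Random(seed)
    premium = 2.0        # up-front cost of designing in the option
    expand_cost = 20.0   # cost of exercising the expansion later
    total = 0.0
    for _ in range(n_runs):
        demand = max(0.0, rng.gauss(30.0, 25.0))  # uncertain future demand
        payoff = min(demand, 30.0)                # static capacity serves up to 30
        if flexible:
            extra = max(0.0, demand - 30.0)       # demand only expansion can serve
            payoff += max(0.0, extra - expand_cost)  # exercise only if worthwhile
            payoff -= premium
        total += payoff
    return total / n_runs

v_static = expected_value(flexible=False)
v_flexible = expected_value(flexible=True)
assert v_flexible > v_static   # here the premium is below the option's expected payoff
```

The point of the sketch is the asymmetry: the option is exercised only in favourable futures, so with a cheap enough premium the flexible design dominates even though it costs more in every scenario up front.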
The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of open source applications available online is growing and they are gaining greater prominence. This repertoire of open source options is of great value for any future planner interested in adopting an electronic medical/health record system, whether selecting an existing application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects.
Covering as much as 25% to 35% of the development cost, software testing is an integral part of the software development lifecycle. Despite its importance, current software testing practice is still based on highly manual processes, from the generation of test cases (e.g. from specifications) to the actual execution of the tests. These manually generated tests are sometimes executed using ad hoc approaches, typically requiring the construction of a test driver for the particular application under test. In addition, test engineers are under pressure to test an increasing amount of code in order to meet market demand for more software functionality. While there is a significant proliferation of testing tools and research prototypes in the market, many of them do not provide the level of abstraction and automation required by test engineers. To address some of the aforementioned issues, an automated testing tool called SFIT was developed based on Java technology. This paper describes the development, implementation and evaluation of SFIT. Two case studies, involving the robustness assessment of an adder module and of a Linda-based distributed shared memory implementation, demonstrate the applicability of SFIT as a helpful automated testing tool.
A new method to construct the distinct Hamiltonian circuits in complete graphs, called the Half Butterfly Method, is presented. The Half Butterfly Method uses the concept of isomorphism in developing the distinct Hamiltonian circuits, and some theoretical results are presented in the course of developing the method.
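The target count for any such construction is classical: the complete graph K_n has (n-1)!/2 distinct Hamiltonian circuits once rotations and reflections are identified. A brute-force check (not the Half Butterfly Method itself) confirms this for small n:

```python
# Brute-force count of distinct Hamiltonian circuits in K_n, identifying
# rotations (by fixing vertex 0 first) and reflections (by canonicalizing
# each cycle against its reverse). Expected count: (n-1)!/2.
from itertools import permutations
from math import factorial

def count_hamiltonian_circuits(n: int) -> int:
    seen = set()
    for perm in permutations(range(1, n)):
        cycle = (0,) + perm                               # fixes rotations
        reflected = cycle[0:1] + cycle[:0:-1]             # same cycle, reversed
        seen.add(min(cycle, reflected))                   # fixes reflections
    return len(seen)

assert count_hamiltonian_circuits(4) == factorial(3) // 2 == 3
assert count_hamiltonian_circuits(5) == factorial(4) // 2 == 12
```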
Nowadays, applications for tracking moving objects are commonly used in various areas, especially in computer vision. Many tracking algorithms have been introduced, and they are divided into three groups: generative trackers, discriminative trackers and hybrid trackers. One of these methods is the Tracking-Learning-Detection (TLD) framework, an example of a hybrid tracker in which the generative and discriminative trackers are combined. In TLD, the detector consists of three stages: patch variance, an ensemble classifier and a K-Nearest Neighbor classifier. In the second stage, the ensemble classifier depends on simple pixel comparisons and is therefore likely to fail to offer a good generalization of the appearances of the target object in the detection process. In this paper, an Online Sequential Extreme Learning Machine (OS-ELM) was used to replace the ensemble classifier in the TLD framework. In addition, different types of Haar-like features were used for the feature extraction process instead of raw pixel values. The objectives of this study are to improve the classifier in the second stage of the TLD detector by using Haar-like features as input to the classifier, and to obtain a more generalized detector in the TLD framework by using an OS-ELM based detector. The results show that the proposed method performs better on Pedestrian 1 in terms of F-measure and also offers good performance in terms of Precision in four out of six
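The Haar-like features mentioned above are cheap to evaluate thanks to a summed-area (integral) image: any rectangle sum becomes four lookups, and a two-rectangle Haar feature is the difference of two adjacent sums. The sketch below shows that mechanism on an invented patch; it is not the paper's exact feature set.

```python
# Integral image plus a horizontal two-rectangle Haar-like feature
# (left-half sum minus right-half sum). Patch values are illustrative.
def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]   # zero-padded for easy lookups
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = img[y][x] + ii[y][x + 1] + ii[y + 1][x] - ii[y][x]
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of img[y:y+h][x:x+w] in O(1) via four lookups."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect_horizontal(ii, x, y, w, h):
    """Left half minus right half (w must be even)."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

patch = [[1, 1, 5, 5],
         [1, 1, 5, 5]]                 # dark left half, bright right half
ii = integral_image(patch)
assert haar_two_rect_horizontal(ii, 0, 0, 4, 2) == 4 - 20   # = -16
```

Vectors of such features, rather than raw pixels, then form the input to the OS-ELM classifier in the modified second stage.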
The Space Vector Pulse Width Modulation (SVPWM) method is widely used as a modulation technique to drive a three-phase inverter. It is an advanced, computationally intensive pulse width modulation (PWM) algorithm for the three-phase voltage source inverter. Compared with other PWM techniques, SVPWM is easier to implement and is therefore the most preferred technique. A mathematical model for SVPWM was developed using MATLAB/Simulink software. In this paper, an interface between MATLAB/Simulink and the three-phase inverter using an Arduino Uno microcontroller is proposed: the Arduino Uno generates the SVPWM signals for a Permanent Magnet Synchronous Motor (PMSM). This work consists of software and hardware implementations. Simulation was performed in MATLAB/Simulink to verify the effectiveness of the system and to measure the percentage of Total Harmonic Distortion (THD). The results show that the SVPWM technique is able to drive the three-phase inverter with the Arduino Uno.
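One building block of any SVPWM implementation is locating the reference voltage vector in one of the six 60-degree sectors of the space vector hexagon, from its alpha-beta components. The sketch below shows only that sector-selection step, with the common convention that sector 1 starts at 0 degrees; it is not the paper's full Simulink/Arduino implementation.

```python
# Sector determination for SVPWM: map the reference vector's angle in the
# alpha-beta plane to one of six 60-degree sectors (sector 1 = 0..60 deg).
import math

def svpwm_sector(v_alpha: float, v_beta: float) -> int:
    angle = math.degrees(math.atan2(v_beta, v_alpha)) % 360.0
    return int(angle // 60.0) + 1

assert svpwm_sector(1.0, 0.1) == 1      # ~5.7 degrees
assert svpwm_sector(-1.0, 0.1) == 3     # ~174 degrees
assert svpwm_sector(0.0, -1.0) == 5     # 270 degrees
```

The subsequent steps (dwell-time calculation for the two adjacent active vectors and the zero vectors) are what make SVPWM computationally heavier than carrier-based PWM.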
The most commonly applied approach in seating ergonomics is the concept that the seat must fit the sitter. Understanding population anthropometry is necessary because, in the mass vehicle market, a single seat should fit a large portion of the population. This research proposes automotive seat fit parameters based on representative Nigerian anthropometric data, to ensure an optimum fit between vehicle seats and their occupants, as well as adequate accommodation. Anthropometric data of 863 Nigerians were captured, with special emphasis on the dimensions applicable to automotive seat design, and a comparison was made between the data obtained and those of five other countries. The proposed dimensions include: seat cushion width (475 mm); seat cushion length (394 mm); seat height (340 mm); seat lateral location (583 mm); seat back height (480 mm); seat back width (427 mm); armrest height (246 mm); headrest height (703 mm); armrest surface length (345 mm); backrest width at thoracic level (524 mm); seat adjustment (186 mm); backrest width at lumbar level (475 mm); and distance between armrests (475 mm). A comparison was also made between the proposed dimensions and those recommended by four other scholars for other populations. Finally, an ergonomic automotive seat suitable for the Nigerian population was designed in AutoCAD 2016 based on the proposed dimensions.
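Fit dimensions of this kind are typically derived from population percentiles, for example sizing a width-type dimension to accommodate roughly the 95th percentile of the relevant body measure. The sketch below shows that percentile logic only; the sample values are invented and the paper's exact statistical procedure is not reproduced here.

```python
# Nearest-rank percentile of a sample, as used (conceptually) to size a
# width-type seat dimension. Sample values are illustrative, not the
# study's measured data.
def percentile(values, p):
    """Nearest-rank percentile (p in 0..100) of a sample."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100.0 * len(ordered)) - 1))
    return ordered[k]

hip_breadths_mm = [360, 372, 381, 390, 398, 405, 412, 420, 433, 455]
design_width = percentile(hip_breadths_mm, 95)   # accommodate ~95% of sitters
assert design_width == 455
```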
Information-Centric Networking (ICN) is expected to be a favorable, deployable future Internet paradigm. ICN intends to replace the current IP-based model with a name-based, content-centric model, aiming to provide better security, scalability, and content distribution. However, it is a challenging task to conceive how ICN can be linked with another fast-emerging paradigm, the Vehicular Ad hoc Network (VANET). In this article, we present an overview of the ICN-based VANET approach along with its contributions and research challenges. In addition, the connectivity issues of the vehicular ICN model are discussed in relation to other emerging paradigms, such as Software Defined Networking (SDN), cloud computing, and edge computing. Moreover, ICN-based VANET research opportunities, in terms of security, mobility, routing, naming, caching, and fifth-generation (5G) communications, are covered at the end of the paper.
This paper describes an approach for improving the accuracy of memory-based collaborative filtering, based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). Recommender systems are used to filter the huge amount of data available online according to user-defined preferences. Collaborative filtering (CF) is a commonly used recommendation approach that generates recommendations based on correlations among user preferences. Although several enhancements have increased the accuracy of memory-based CF through improved similarity measures for finding successful neighbors, there has been less investigation into prediction score methods, in which rating/preference scores are assigned to items that have not yet been selected by a user. A TOPSIS solution for evaluating multiple alternatives against more than one criterion is proposed as an alternative to prediction score methods for evaluating and ranking items based on the results from similar users. The recommendation accuracy of the proposed TOPSIS technique is evaluated by applying it to various common CF baseline methods, which are then used to analyze the MovieLens 100K and 1M benchmark datasets. The results show that CF based on the TOPSIS method is more accurate than baseline CF methods across a number of common evaluation metrics.
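The TOPSIS ranking idea described above can be sketched compactly: normalize the decision matrix, form ideal and anti-ideal points, and score each alternative by its relative closeness to the ideal. The tiny ratings matrix is invented for illustration; the paper's exact integration with CF neighbors is not reproduced.

```python
# Minimal TOPSIS sketch: score alternatives (rows) over criteria (columns)
# by closeness to the ideal point. Here the "criteria" are ratings from
# two similar users (illustrative values).
import math

def topsis_scores(matrix, benefit=None):
    n_alt, n_crit = len(matrix), len(matrix[0])
    benefit = benefit or [True] * n_crit       # True = higher is better
    # Vector-normalize each criterion column.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    nm = [[matrix[i][j] / norms[j] for j in range(n_crit)] for i in range(n_alt)]
    cols = list(zip(*nm))
    ideal = [max(c) if b else min(c) for b, c in zip(benefit, cols)]
    anti = [min(c) if b else max(c) for b, c in zip(benefit, cols)]
    scores = []
    for row in nm:
        d_pos, d_neg = math.dist(row, ideal), math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient in [0,1]
    return scores

ratings = [[5.0, 4.0], [2.0, 2.0], [4.0, 5.0]]  # three candidate items
scores = topsis_scores(ratings)
assert scores[1] == min(scores)                 # poorly rated item ranks last
```

Items are then recommended in decreasing order of closeness coefficient, replacing the usual weighted-average prediction score.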
The generalized and improved (G'/G)-expansion method is a powerful and advantageous mathematical tool for establishing abundant new traveling wave solutions of nonlinear partial differential equations. In this article, we investigate a higher-dimensional nonlinear evolution equation, namely the (3+1)-dimensional modified KdV-Zakharov-Kuznetsev equation, via this method. Solutions are found in hyperbolic, trigonometric and rational function form involving additional parameters; some of the constructed solutions coincide with results obtained by other authors when certain parameters take special values, while others are new. The numerical results described in the figures were obtained with the aid of the commercial software Maple.
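For orientation, the structural ansatz standard in the (G'/G)-expansion literature (the paper's specific solutions are not reproduced here) seeks the travelling-wave reduction u(ξ) as a polynomial in (G'/G), where G satisfies a linear second-order ODE:

```latex
u(\xi) = \sum_{i=0}^{N} a_i \left(\frac{G'(\xi)}{G(\xi)}\right)^{i},
\qquad G''(\xi) + \lambda\, G'(\xi) + \mu\, G(\xi) = 0 .
```

The sign of the discriminant λ² − 4μ then yields hyperbolic (λ² − 4μ > 0), trigonometric (λ² − 4μ < 0), or rational (λ² − 4μ = 0) solution families, matching the three forms reported above.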
A system for computerising histopathology records, developed in-house using dBASE IV on IBM-compatible microcomputers in a local area network, is described. The software package uses a horizontal main menu bar with associated pull-down submenus as the interface between machine and user, and is very easy to use. The package provides options for selecting databases by year, entering and editing records, browsing data, making multi-characteristic searches and retrievals, printing data, and maintaining databases, which includes backing up and repairing corrupted databases.