Displaying publications 1 - 20 of 27 in total

  1. Tao H, Rahman MA, Al-Saffar A, Zhang R, Salih SQ, Zain JM, et al.
    Work, 2021;68(3):853-861.
    PMID: 33612528 DOI: 10.3233/WOR-203419
    BACKGROUND: Nowadays, workplace violence is recognised as a mental health hazard and considered a crucial topic. The collaboration between robots and humans is increasing with the growth of Industry 4.0. Therefore, the first problem that must be solved is human-machine security. Ensuring the safety of human beings is one of the main aspects of human-robot interaction. This is not just about preventing collisions within a shared space between human beings and robots; it includes all possible means of harm to an individual, from physical contact to unpleasant or dangerous psychological effects.

    OBJECTIVE: In this paper, a Non-linear Adaptive Heuristic Mathematical Model (NAHMM) has been proposed for the prevention of workplace violence using secure Human-Robot Collaboration (HRC). HRC is an area of research with a wide range of demands, future scenarios, and potential economic influence. HRC is an interdisciplinary field of research that encompasses cognitive sciences, classical robotics, and psychology.

    RESULTS: The robot can thus make the optimal decision between actions that expose its capabilities to the human being and take the best steps given the knowledge that is currently available to the human being. Further, the ideal policy can be measured carefully under certain observability assumptions.

    CONCLUSION: The system is shown on a collaborative robot and is compared to a state-of-the-art security system. The device is experimentally demonstrated, and the new system is evaluated both qualitatively and quantitatively.

    Matched MeSH terms: Heuristics
  2. Nurul Nadia Nordin, Lee, Lai Soon
    MyJurnal
    Facility Layout Problem (FLP) is an NP-hard problem concerned with the arrangement of facilities so as to minimize the distance travelled between all pairs of facilities. Many exact and approximate approaches with extensive applicability have been proposed to deal with this problem. This paper studies the fundamentals of some well-known heuristics and metaheuristics used in solving FLPs. It is hoped that this paper will encourage researchers to pursue in-depth studies of FLPs, looking into more specific interests such as equal or unequal FLPs.
    Matched MeSH terms: Heuristics
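A classic improvement heuristic for equal-area FLPs of the kind surveyed above is pairwise exchange over a quadratic-assignment formulation. The sketch below is illustrative only (toy flow and distance matrices, not any listed paper's exact formulation): total cost is flow between each facility pair times the distance between their assigned locations, and the best cost-reducing facility swap is applied until none remains.

```python
import itertools

def layout_cost(assign, flow, dist):
    """Total material-handling cost: flow between facilities i and j
    times the distance between their assigned locations."""
    n = len(assign)
    return sum(flow[i][j] * dist[assign[i]][assign[j]]
               for i in range(n) for j in range(n))

def pairwise_exchange(assign, flow, dist):
    """Repeatedly apply cost-reducing facility-pair swaps until no swap
    improves the layout (a local optimum of the exchange neighbourhood)."""
    assign = list(assign)
    improved = True
    while improved:
        improved = False
        best = layout_cost(assign, flow, dist)
        for i, j in itertools.combinations(range(len(assign)), 2):
            assign[i], assign[j] = assign[j], assign[i]   # try a swap
            c = layout_cost(assign, flow, dist)
            if c < best:
                best, improved = c, True                  # keep the swap
            else:
                assign[i], assign[j] = assign[j], assign[i]  # undo it
    return assign, best
```

Like all local-search heuristics for this NP-hard problem, the result is a local optimum; metaheuristics (tabu search, simulated annealing) are typically layered on top to escape it.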
  3. Masroor K, Jeoti V, Drieberg M, Cheab S, Rajbhandari S
    Sensors (Basel), 2021 Apr 22;21(9).
    PMID: 33922288 DOI: 10.3390/s21092943
    The bi-directional information transfer in optical body area networks (OBANs) is crucial at all three tiers of communication, i.e., intra-, inter-, and beyond-BAN communication, which correspond to tier-I, tier-II, and tier-III, respectively. However, the provision of uninterrupted uplink (UL) and downlink (DL) connections at tier-II (inter-BAN) is extremely critical, since these links serve as a bridge between tier-I (intra-BAN) and tier-III (beyond-BAN) communication. Any negligence at this level could be life-threatening; therefore, enabling quality-of-service (QoS) remains a fundamental design issue at tier-II. Consequently, to provide QoS, a key requirement is to ensure link reliability and communication quality by maintaining a nearly uniform signal-to-noise ratio (SNR) within the coverage area. Several studies have reported the effects of transceiver-related parameters on OBAN link performance; nevertheless, the implications of changing transmitter locations on SNR uniformity and communication quality have not been addressed. In this work, we consider a DL scenario and analyze how the placement of light-emitting diode (LED) lamps can improve SNR uniformity, regardless of the receiver position. Subsequently, we show that, using the principle of reciprocity (POR) with the transmitter-receiver positions switched, the analysis is also applicable to the UL, provided that the optical channel remains linear. Moreover, we propose a generalized optimal placement scheme along with a heuristic design formula to achieve uniform SNR and illuminance for the DL using a fixed number of transmitters, and compare it with an existing technique. The study reveals that the proposed placement technique reduces the fluctuations in SNR by 54% and improves the illuminance uniformity by up to 102% compared with the traditional approach. Finally, we show that, for very low luminous intensity, the SNR values remain sufficient to maintain a minimum bit error rate (BER) of 10^-9 with the on-off keying non-return-to-zero (OOK-NRZ) modulation format.
    Matched MeSH terms: Heuristics
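The closing BER figure can be reproduced under a common textbook convention for OOK-NRZ with additive Gaussian noise, BER = Q(sqrt(SNR)); the paper's exact SNR definition may differ, so this is only an illustrative sketch. A BER of 10^-9 then corresponds to a Q-factor of about 6, i.e. an SNR of roughly 36 (about 15.6 dB).

```python
import math

def q_func(x):
    """Gaussian tail probability: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_ook_nrz(snr_linear):
    """BER of OOK-NRZ with an optimal threshold under Gaussian noise,
    using the common convention BER = Q(sqrt(SNR)) (SNR linear, not dB)."""
    return q_func(math.sqrt(snr_linear))

def min_snr_db_for_ber(target_ber):
    """Smallest SNR (in dB, searched in 0.1 dB steps) whose BER meets
    the target."""
    snr_db = 0.0
    while ber_ook_nrz(10 ** (snr_db / 10)) > target_ber:
        snr_db += 0.1
    return round(snr_db, 1)
```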
  4. Albowarab MH, Zakaria NA, Zainal Abidin Z
    Sensors (Basel), 2021 May 12;21(10).
    PMID: 34065920 DOI: 10.3390/s21103356
    Various aspects of task-execution load balancing in Internet of Things (IoT) networks can be optimised using intelligent algorithms provided by software-defined networking (SDN). These load balancing aspects include makespan, energy consumption, and execution cost. While past studies have evaluated load balancing from one or two aspects, none has explored the possibility of simultaneously optimising all aspects, namely, reliability, energy, cost, and execution time. For the purposes of load balancing, implementing multi-objective optimisation (MOO) based on meta-heuristic searching algorithms requires assurances that the solution space will be thoroughly explored. Optimising load balancing provides decision makers not only with optimised solutions but also with a rich set of candidate solutions to choose from. Therefore, the purposes of this study were (1) to propose a joint mathematical formulation to solve load balancing challenges in cloud computing and (2) to propose two multi-objective particle swarm optimisation (MP) models: distance-angle multi-objective particle swarm optimization (DAMP) and angle multi-objective particle swarm optimization (AMP). Unlike existing models that only use crowding distance as a criterion for solution selection, our MP models probabilistically combine both crowding distance and crowding angle. More specifically, we only selected solutions that had more than a 0.5 probability of higher crowding distance and higher angular distribution. In addition, binary variants of the approaches were generated based on a transfer function, denoted by binary DAMP (BDAMP) and binary AMP (BAMP). After using MOO mathematical functions to compare our models, BDAMP and BAMP, with the state-of-the-art standard models, BMP, BDMP and BPSO, they were tested using the proposed load balancing model. Both tests proved that our DAMP and AMP models were far superior to the state-of-the-art standard models, MP, crowding distance multi-objective particle swarm optimisation (DMP), and PSO. Therefore, this study enables the incorporation of meta-heuristics in the management layer of cloud networks.
    Matched MeSH terms: Heuristics
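The crowding-distance criterion that the DAMP/AMP models combine with a crowding angle is the standard NSGA-II quantity: for each objective, a solution is credited with the normalised gap between its neighbours, and boundary solutions get infinity so they are always retained. A minimal sketch (the angular component and the probabilistic combination described in the abstract are not reproduced here):

```python
def crowding_distance(front):
    """NSGA-II crowding distance for a list of objective vectors.
    Boundary points on each objective receive infinity; interior points
    accumulate the normalised gap between their sorted neighbours."""
    n, m = len(front), len(front[0])
    d = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        d[order[0]] = d[order[-1]] = float('inf')
        if hi == lo:
            continue                      # degenerate objective: skip
        for r in range(1, n - 1):
            i = order[r]
            d[i] += (front[order[r + 1]][k] - front[order[r - 1]][k]) / (hi - lo)
    return d
```

Selection then prefers less-crowded (larger-distance) solutions, which is what keeps the Pareto front well spread.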
  5. Nisar K, Sabir Z, Zahoor Raja MA, Ibrahim AAA, Mahmoud SR, Balubaid M, et al.
    Sensors (Basel), 2021 Sep 30;21(19).
    PMID: 34640887 DOI: 10.3390/s21196567
    In this study, a numerical computation heuristic for the environmental and economic system is presented using the artificial neural network (ANN) structure together with the capabilities of the heuristic global-search genetic algorithm (GA) and the quick local-search interior-point algorithm (IPA), i.e., ANN-GA-IPA. The environmental and economic system depends on three elements: the execution cost of control standards, the elimination costs of emergencies through new technical diagnostics, and the competence of the system of industrial elements. These three elements form a nonlinear differential environmental and economic system. The optimization of an error-based objective function is performed using the differential environmental and economic system and its initial conditions.
    Matched MeSH terms: Heuristics*
  6. Ahmad Z, Jehangiri AI, Ala'anzy MA, Othman M, Umar AI
    Sensors (Basel), 2021 Oct 30;21(21).
    PMID: 34770545 DOI: 10.3390/s21217238
    Cloud computing is a fully fledged, mature and flexible computing paradigm that provides services to scientific and business applications in a subscription-based environment. Scientific applications such as Montage and CyberShake are organized scientific workflows with data- and compute-intensive tasks, and also have some special characteristics. These characteristics include the tasks of scientific workflows that are executed in terms of integration, disintegration, pipeline, and parallelism, and thus require special attention to task management and data-oriented resource scheduling and management. Tasks executed in a pipeline are considered bottleneck executions, the failure of which results in an entirely futile execution; this requires fault-tolerance-aware execution. Tasks executed in parallel require similar instances of cloud resources, and thus cluster-based execution may improve system performance in terms of makespan and execution cost. Therefore, this research work presents a cluster-based, fault-tolerant and data-intensive (CFD) scheduling strategy for scientific applications in cloud environments. The CFD strategy addresses the data intensiveness of the tasks of scientific workflows with cluster-based, fault-tolerant mechanisms. The Montage scientific workflow is used as a simulation case study, and the results of the CFD strategy were compared with three well-known heuristic scheduling policies: (a) MCT, (b) Max-min, and (c) Min-min. The simulation results showed that the CFD strategy reduced the makespan by 14.28%, 20.37%, and 11.77%, respectively, as compared with the three existing policies. Similarly, the CFD strategy reduces the execution cost by 1.27%, 5.3%, and 2.21%, respectively, as compared with the three existing policies. In the case of the CFD strategy, the SLA is not violated with regard to time and cost constraints, whereas it is violated by the existing policies numerous times.
    Matched MeSH terms: Heuristics
  7. Dey A, Chattopadhyay S, Singh PK, Ahmadian A, Ferrara M, Senu N, et al.
    Sci Rep, 2021 Dec 15;11(1):24065.
    PMID: 34911977 DOI: 10.1038/s41598-021-02731-z
    COVID-19 is a respiratory disease that causes infection in both lungs and the upper respiratory tract. The World Health Organization (WHO) has declared it a global pandemic because of its rapid spread across the globe. The most common way to diagnose COVID-19 is real-time reverse transcription-polymerase chain reaction (RT-PCR), which takes a significant amount of time to produce a result. Computer-based medical image analysis is more beneficial for the diagnosis of such a disease, as it can give better results in less time. Computed Tomography (CT) scans are used to monitor lung diseases including COVID-19. In this work, a hybrid model for COVID-19 detection has been developed, which has two key stages. In the first stage, we have fine-tuned the parameters of pre-trained convolutional neural networks (CNNs) to extract features from the COVID-19 affected lungs. As pre-trained CNNs, we have used two standard CNNs, namely GoogleNet and ResNet18. Then, we have proposed a hybrid meta-heuristic feature selection (FS) algorithm, named Manta Ray Foraging based Golden Ratio Optimizer (MRFGRO), to select the most significant feature subset. The proposed model is implemented over three publicly available datasets, namely the COVID-CT, SARS-COV-2, and MOSMED datasets, and attains state-of-the-art classification accuracies of 99.15%, 99.42% and 95.57%, respectively. The obtained results confirm that the proposed approach is quite efficient when compared with the local texture descriptors used for COVID-19 detection from chest CT-scan images.
    Matched MeSH terms: Heuristics
  8. Haliza Abd. Rahman, Arifah Bahar, Norhayati Rosli, Madihah Md. Salleh
    Sains Malaysiana, 2012;41:1635-1642.
    Non-parametric modeling is a method which relies heavily on data and is motivated by smoothness properties in estimating a function; it involves spline and non-spline approaches. The spline approach consists of regression splines and smoothing splines. A regression spline with a Bayesian approach is considered in the first step of a two-step method for estimating the structural parameters of a stochastic differential equation (SDE). The selection of the knot and the order of the spline can be done heuristically based on a scatter plot. To overcome the subjective and tedious process of selecting the optimal knot and order of the spline, an algorithm was proposed. A single optimal knot is selected out of all the points, with the exception of the first and last data points, as the one which gives the least value of the Generalized Cross Validation (GCV) criterion for each order of spline. The method is illustrated using observed data of opening share prices of Petronas Gas Bhd. The results showed that the Mean Square Errors (MSE) for the stochastic model with parameters estimated using the optimal knot for 1,000, 5,000 and 10,000 runs of Brownian motion are smaller than those of the SDE models with parameters estimated using a heuristically selected knot. This verified the viability of the two-step method in the estimation of the drift and diffusion parameters of SDEs, with the improvement of a single knot selection.
    Matched MeSH terms: Heuristics
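The single-knot selection step can be sketched as follows: for each interior data point, fit a linear spline (order 2, truncated-power basis 1, x, (x - knot)+) by least squares and keep the knot with the smallest GCV. The GCV form used here, (RSS/n) / (1 - p/n)^2 with p basis functions, is a common simplified variant; the paper's exact criterion and spline orders may differ.

```python
def solve_normal_equations(X, y):
    """Least squares via the normal equations (X^T X) b = X^T y, solved
    with Gaussian elimination; fine for a handful of parameters."""
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    for c in range(p):                       # forward elimination with pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            b[r] -= f * b[c]
    beta = [0.0] * p
    for c in reversed(range(p)):             # back substitution
        beta[c] = (b[c] - sum(A[c][k] * beta[k] for k in range(c + 1, p))) / A[c][c]
    return beta

def gcv_for_knot(xs, ys, knot):
    """Linear spline with one knot: basis 1, x, (x - knot)+.
    GCV = (RSS / n) / (1 - p / n)^2 with p = 3 basis functions."""
    X = [[1.0, x, max(x - knot, 0.0)] for x in xs]
    beta = solve_normal_equations(X, ys)
    rss = sum((y - sum(c * v for c, v in zip(row, beta))) ** 2
              for row, y in zip(X, ys))
    n, p = len(xs), 3
    return (rss / n) / (1.0 - p / n) ** 2

def best_single_knot(xs, ys):
    """Search the interior data points (first and last excluded, as in
    the paper's algorithm) for the knot minimising GCV."""
    return min(xs[1:-1], key=lambda k: gcv_for_knot(xs, ys, k))
```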
  9. Mohamed M. GahGah, Juhari Mat Akhir, Abdul Ghani M. Rafek, Ibrahim Abdullah
    Sains Malaysiana, 2009;38(6):827-833.
    The aim of this study is to investigate the factors that cause landslides in the area along the new road between Cameron Highlands and Gua Musang. Landslide factors such as lineaments have been extracted from remote sensing data (Landsat TM imagery) using ERDAS software. A soil map has been produced using field work and laboratory analysis, and the lithology, roads, drainage pattern and rainfall have been digitized using ILWIS software, together with the slope angle and elevation from the Digital Elevation Model (DEM). All these parameters, which are vital for landslide hazard assessment, have been integrated into a geographical information system (GIS) for further data processing. Weighting of these landslide-relevant factors according to their influence on landslide occurrence has been carried out using the heuristic method. The results of this combination, through a modified ‘index overlay with multi-class maps’ model, were used to produce a landslide hazard zonation map. Five classes of potential landslide hazard have been derived, as follows: very low hazard zone 17.27%, low hazard zone 39.35%, medium hazard zone 25.1%, high hazard zone 15.35% and very high hazard zone 2.93%. The results from this work have been checked against the landslide inventory using interpretation of available aerial photos and field work, and show that slope and elevation have the most direct effect on landslide occurrence.
    Matched MeSH terms: Heuristics
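The heuristic index-overlay combination can be illustrated as a weighted sum of factor class ratings per map cell, binned into the five hazard zones. The weights, ratings and thresholds below are purely illustrative, not the study's calibrated values.

```python
def hazard_index(ratings, weights):
    """Weighted index overlay for one map cell: each factor's class
    rating (e.g. 1-5 for slope, lithology, rainfall, ...) times its
    expert-assigned weight, normalised by the total weight."""
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

def hazard_class(index, thresholds=(1.8, 2.6, 3.4, 4.2)):
    """Bin a cell's index into the five zones used in the paper
    (very low .. very high); the thresholds here are illustrative."""
    labels = ['very low', 'low', 'medium', 'high', 'very high']
    for t, label in zip(thresholds, labels):
        if index < t:
            return label
    return labels[-1]
```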
  10. Ser LL, Shaharuddin Salleh, Nor Haniza Sarmin
    Sains Malaysiana, 2014;43:1263-1269.
    In this paper, a model called the graph partitioning and transformation model (GPTM), which transforms a connected graph into a single-row network, is introduced. The transformation is necessary in applications such as the assignment of telephone channels to caller-receiver pairs roaming in cells of a cellular network on a real-time basis. A connected graph is transformed into its corresponding single-row network for assigning the channels to the caller-receiver pairs. The GPTM starts with a linear-time heuristic graph partitioning to produce two subgraphs with higher densities. The optimal labeling for the nodes is then formed based on the simulated annealing technique. Experimental results support our hypothesis that the GPTM efficiently transforms a connected graph into its single-row network.
    Matched MeSH terms: Heuristics
  11. Khowaja K, Salim SS, Asemi A
    PLoS One, 2015;10(7):e0132187.
    PMID: 26196385 DOI: 10.1371/journal.pone.0132187
    In this paper, we adapted and expanded a set of guidelines, also known as heuristics, for evaluating the usability of software so that they are appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, whereas the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen's set with the modified set of heuristics, with each group evaluating two interactive systems. Nielsen's heuristics were assigned to the control group while the experimental group was given the modified set of heuristics, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of the 5 new heuristics and the impact of the 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain if the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system.
    Matched MeSH terms: Heuristics*
  12. Al-Saiagh W, Tiun S, Al-Saffar A, Awang S, Al-Khaleefa AS
    PLoS One, 2018;13(12):e0208695.
    PMID: 30571777 DOI: 10.1371/journal.pone.0208695
    Word sense disambiguation (WSD) is the process of identifying an appropriate sense for an ambiguous word. With the complexity of human languages, in which a single word can yield different meanings, WSD has been utilized by several domains of interest such as search engines and machine translation. The literature shows a vast number of techniques used for the process of WSD. Recently, researchers have focused on the use of meta-heuristic approaches to identify the best solutions that reflect the best sense. However, the application of meta-heuristic approaches remains limited and thus requires efficient exploration and exploitation of the problem space. Hence, the current study aims to propose a hybrid meta-heuristic method that consists of particle swarm optimization (PSO) and simulated annealing to find the global best meaning of a given text. Different semantic measures have been utilized in this model as objective functions for the proposed hybrid PSO. These measures consist of the JCN and extended Lesk methods, which are combined effectively in this work. The proposed method is tested using three benchmark datasets (SemCor 3.0, SensEval-2, and SensEval-3). Results show that the proposed method has superior performance in comparison with state-of-the-art approaches.
    Matched MeSH terms: Heuristics
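The simulated-annealing half of such a hybrid can be sketched over a toy sense inventory: maximise the total pairwise relatedness of the chosen senses, occasionally accepting worse assignments at high temperature to escape local optima. The relatedness table below is a stand-in for the JCN/extended-Lesk measures, and the PSO component is omitted.

```python
import math
import random

def sa_disambiguate(senses_per_word, relatedness, steps=2000, t0=1.0, seed=0):
    """Simulated annealing over word-sense assignments: maximise total
    pairwise relatedness of the chosen senses. `relatedness` maps a
    frozenset of two sense ids to a score (a toy stand-in for the
    JCN / extended Lesk measures)."""
    rng = random.Random(seed)

    def score(assign):
        return sum(relatedness.get(frozenset((a, b)), 0.0)
                   for i, a in enumerate(assign) for b in assign[i + 1:])

    cur = [rng.choice(s) for s in senses_per_word]
    cur_score = score(cur)
    best, best_score = list(cur), cur_score
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9       # linear cooling schedule
        i = rng.randrange(len(cur))
        old = cur[i]
        cur[i] = rng.choice(senses_per_word[i])  # propose a new sense
        new_score = score(cur)
        if (new_score >= cur_score
                or rng.random() < math.exp((new_score - cur_score) / t)):
            cur_score = new_score                # accept (maybe downhill)
            if cur_score > best_score:
                best, best_score = list(cur), cur_score
        else:
            cur[i] = old                         # reject, undo the move
    return best, best_score
```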
  13. Jayaprakasam S, Abdul Rahim SK, Leow CY, Ting TO
    PLoS One, 2017;12(5):e0175510.
    PMID: 28464000 DOI: 10.1371/journal.pone.0175510
    Collaborative beamforming (CBF) with a finite number of collaborating nodes (CNs) produces sidelobes that are highly dependent on the collaborating nodes' locations. The sidelobes cause interference and affect the communication rate of unintended receivers located within the transmission range. Nulling is not possible in an open-loop CBF since the collaborating nodes are unable to receive feedback from the receivers. Hence, overall sidelobe reduction is required to avoid interference in the directions of the unintended receivers. However, the impact of sidelobe reduction on the capacity improvement at the unintended receiver has never been reported in previous works. In this paper, the effect of peak sidelobe (PSL) reduction in CBF on the capacity of an unintended receiver is analyzed. Three meta-heuristic optimization methods are applied to perform PSL minimization, namely the genetic algorithm (GA), particle swarm optimization (PSO) and a simplified version of the PSO called the weightless swarm algorithm (WSA). An average reduction of 20 dB in PSL alongside a 162% capacity improvement is achieved in the worst-case scenario with the WSA optimization. It is discovered that PSL minimization in the CBF provides capacity improvement at an unintended receiver only if the CBF cluster is small and dense.
    Matched MeSH terms: Heuristics
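The quantity being minimised, the peak sidelobe of the collaborative beampattern, can be computed directly from the node positions. A sketch assuming far-field conditions, a mainlobe steered to phi = 0 and an illustrative fixed mainlobe exclusion region (the papers' optimizers would evaluate this as their fitness function):

```python
import cmath
import math

def array_factor(pos, wavelength, phi, phi0=0.0):
    """Normalised far-field array factor of N collaborating nodes at
    2-D positions `pos`, phase-aligned toward steering angle phi0."""
    k = 2 * math.pi / wavelength
    s = sum(cmath.exp(1j * k * (x * (math.cos(phi) - math.cos(phi0))
                                + y * (math.sin(phi) - math.sin(phi0))))
            for x, y in pos)
    return abs(s) / len(pos)

def peak_sidelobe_db(pos, wavelength, samples=3600, mainlobe_width=0.2):
    """Peak sidelobe level in dB relative to the (unit) mainlobe,
    scanning phi outside a fixed mainlobe exclusion region (radians)."""
    peak = max(array_factor(pos, wavelength, 2 * math.pi * i / samples)
               for i in range(samples)
               if mainlobe_width < 2 * math.pi * i / samples
               < 2 * math.pi - mainlobe_width)
    return 20 * math.log10(peak)
```

A GA/PSO/WSA-style optimizer would perturb `pos` (within each node's allowed region) to minimise `peak_sidelobe_db`.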
  14. Madni SHH, Abd Latiff MS, Abdullahi M, Abdulhamid SM, Usman MJ
    PLoS One, 2017;12(5):e0176321.
    PMID: 28467505 DOI: 10.1371/journal.pone.0176321
    Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for the application of heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for solving a task assignment problem of a particular nature is difficult, since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
    Matched MeSH terms: Heuristics*
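Of the six rule-based heuristics compared, Min-min is representative of how they operate on an expected-time-to-compute (ETC) matrix: repeatedly pick the task whose best-case completion time is smallest and bind it to that machine. A minimal sketch (toy ETC values, not the paper's workloads):

```python
def min_min_schedule(etc):
    """Min-min heuristic. etc[t][m] = expected time of task t on
    machine m. Repeatedly pick the unscheduled task with the smallest
    minimum completion time and assign it to that machine; returns the
    task->machine assignment and the makespan."""
    ready = [0.0] * len(etc[0])              # machine ready times
    unscheduled = set(range(len(etc)))
    assignment = {}
    while unscheduled:
        # best machine (minimum completion time) for each pending task
        best = {t: min(range(len(ready)), key=lambda m: ready[m] + etc[t][m])
                for t in unscheduled}
        t = min(unscheduled, key=lambda t: ready[best[t]] + etc[t][best[t]])
        m = best[t]
        ready[m] += etc[t][m]
        assignment[t] = m
        unscheduled.remove(t)
    return assignment, max(ready)
```

Max-min differs only in picking the task with the *largest* minimum completion time first, which tends to balance long tasks better.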
  15. Zamli KZ, Din F, Ahmed BS, Bures M
    PLoS One, 2018;13(5):e0195675.
    PMID: 29771918 DOI: 10.1371/journal.pone.0195675
    The sine-cosine algorithm (SCA) is a new population-based meta-heuristic algorithm. In addition to exploiting sine and cosine functions to perform local and global searches (hence the name sine-cosine), the SCA introduces several random and adaptive parameters to facilitate the search process. Although it shows promising results, the search process of the SCA is vulnerable to local minima/maxima due to the adoption of a fixed switch probability and the bounded magnitude of the sine and cosine functions (from -1 to 1). In this paper, we propose a new hybrid Q-learning sine-cosine-based strategy, called the Q-learning sine-cosine algorithm (QLSCA). Within the QLSCA, we eliminate the switching probability. Instead, we rely on the Q-learning algorithm (based on the penalty and reward mechanism) to dynamically identify the best operation during runtime. Additionally, we integrate two new operations (Lévy flight motion and crossover) into the QLSCA to facilitate jumping out of local minima/maxima and enhance the solution diversity. To assess its performance, we adopt the QLSCA for the combinatorial test suite minimization problem. Experimental results reveal that the QLSCA is statistically superior with regard to test suite size reduction compared to recent state-of-the-art strategies, including the original SCA, the particle swarm test generator (PSTG), adaptive particle swarm optimization (APSO) and the cuckoo search strategy (CS) at the 95% confidence level. However, concerning the comparison with discrete particle swarm optimization (DPSO), there is no significant difference in performance at the 95% confidence level. On a positive note, the QLSCA statistically outperforms the DPSO in certain configurations at the 90% confidence level.
    Matched MeSH terms: Heuristics*
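The baseline SCA update that the QLSCA modifies (by replacing the fixed switch probability with Q-learning) follows Mirjalili's formulation: each agent moves toward or around the best-so-far position using r1*sin(r2) or r1*cos(r2) scaled steps, with r1 decreasing over time. A sketch on a continuous test function (the paper applies the algorithm to discrete test-suite minimisation instead):

```python
import math
import random

def sca_minimise(f, bounds, pop=20, iters=200, a=2.0, seed=1):
    """Core sine-cosine algorithm: agents move toward/around the
    best-so-far position P with a fixed 0.5 sine/cosine switch
    probability (the weakness the QLSCA addresses)."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(pop)]
    best = min(X, key=f)[:]
    for t in range(iters):
        r1 = a - t * a / iters                  # linearly decreasing step size
        for x in X:
            for d in range(dim):
                r2 = rng.uniform(0, 2 * math.pi)
                r3 = rng.uniform(0, 2)
                step = abs(r3 * best[d] - x[d])
                if rng.random() < 0.5:          # fixed switch probability
                    x[d] += r1 * math.sin(r2) * step
                else:
                    x[d] += r1 * math.cos(r2) * step
                lo, hi = bounds[d]
                x[d] = min(max(x[d], lo), hi)   # clamp to the search box
            if f(x) < f(best):
                best = x[:]
    return best, f(best)
```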
  16. Rehman MZ, Zamli KZ, Almutairi M, Chiroma H, Aamir M, Kader MA, et al.
    PLoS One, 2021;16(12):e0259786.
    PMID: 34855771 DOI: 10.1371/journal.pone.0259786
    Team formation (TF) in social networks exploits graphs (i.e., vertices = experts and edges = skills) to represent a possible collaboration between the experts. These networks lead us towards building cost-effective research teams irrespective of the geolocation of the experts and the size of the dataset. Previously, large datasets were not closely inspected for large-scale distributions and relationships among the researchers, resulting in algorithms failing to scale well on the data. Therefore, this paper presents a novel TF algorithm for expert team formation called SSR-TF, based on two metrics, communication cost and graph reduction, that will become a basis for future TF algorithms. In SSR-TF, the communication cost finds the possibility of collaboration between researchers. The graph reduction scales the large data down to only the appropriate skills and experts, resulting in real-time extraction of experts for collaboration. This approach is tested on five organic and benchmark datasets, i.e., UMP, DBLP, ACM, IMDB, and Bibsonomy. The SSR-TF algorithm is able to build cost-effective teams with the most appropriate experts, resulting in the formation of more communicative teams with high expertise levels.
    Matched MeSH terms: Computer Heuristics
  17. Hasiah Mohamed@Omar, Rohana Yusoff, Azizah Jaafar
    MyJurnal
    Heuristic Evaluation (HE) is used as the basis for developing a new technique to evaluate the usability of educational computer games, known as Playability Heuristic Evaluation for Educational Computer Games (PHEG). PHEG was developed to identify usability problems and accommodates five heuristics, namely interface, educational elements, content, playability and multimedia. In the HE process, usability problems are rated based on a severity score, followed by presentation of a mean value. The mean value is used to determine the level of usability problems; however, in some cases this value may not be accurate because it ignores the most critical problems found in a specific part. In developing PHEG, a new quantitative approach was proposed for analyzing usability problem data. The number of sub-heuristics for each heuristic involved was taken into account in calculating the percentage for each heuristic. Functions to calculate critical problems were also introduced. An evaluation of one educational game that was still in development was conducted, and the results showed that most of the critical problems were found in the educational elements and content heuristics (57.14%), while the fewest usability problems were found in the playability heuristic. In particular, the mean value in this analysis can be used as an indicator in identifying critical problems for educational computer games.
    Matched MeSH terms: Heuristics
  18. Mohd Khairol Anuar Mohd Ariffin, Masood fathi, Napsiah Ismail
    MyJurnal
    Assembly line balancing is well known in mass production systems, but this problem is non-deterministic polynomial-time (NP)-hard, even for a simple straight line. Although several heuristic methods have been introduced and used by researchers, knowing and using an effective method for solving these types of problems in less computational time has a considerable place in the area of the line balancing problem. In this research, a new heuristic approach, known as the critical node method (CNM), was introduced and tested by solving several test problems available in the literature for straight assembly lines. Finally, the obtained results are compared with 9 other heuristic rules on several performance measures. It is concluded that the proposed CNM is better than the rest on all the measures.
    Matched MeSH terms: Heuristics
  19. Mohd. Shareduwan Mohd. Kasihmuddin, Mohd. Asyraf Mansor, Saratha Sathasivam
    MyJurnal
    Swarm intelligence is a research area that models populations of swarms that are able to self-organise effectively. Honey bees that gather around their hive with a distinctive behaviour are another example of swarm intelligence. In fact, the artificial bee colony (ABC) algorithm is a swarm-based meta-heuristic algorithm introduced by Karaboga in order to optimise numerical problems. 2SAT can be treated as a constrained optimisation problem which represents a problem by using clauses containing 2 literals each. Many current researchers represent their problems by using 2SAT. Meanwhile, the Hopfield neural network incorporated with the ABC has been utilised to perform randomised 2SAT. Hence, the aim of this study is to investigate the performance of the solutions produced by HNN2SAT-ABC and compare them with those of the traditional HNN2SAT-ES. Both algorithms were examined using Microsoft Visual Studio 2013 C++ Express software. The detailed comparison of the performance of the ABC and ES in performing 2SAT is discussed based on the global minima ratio, Hamming distance, CPU time and fitness landscape. The results obtained from the computer simulation depict the beneficial features of the ABC compared with ES. Moreover, the findings have significant implications for the choice of an alternative method to perform 2SAT.
    Matched MeSH terms: Heuristics
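The 2SAT fitness that both the bee colony (ABC) and exhaustive search (ES) variants optimise is the number of satisfied two-literal clauses. A sketch with an exhaustive baseline standing in for ES (the Hopfield network energy function and bee-colony details are omitted):

```python
from itertools import product

def satisfied_clauses(clauses, assignment):
    """Count satisfied 2SAT clauses. A clause is a pair of literals:
    +i means variable i is true, -i means variable i is negated
    (variables numbered from 1). Satisfying every clause corresponds
    to a global-minimum-energy state of the Hopfield network."""
    def lit_true(lit):
        value = assignment[abs(lit) - 1]
        return value if lit > 0 else not value
    return sum(1 for a, b in clauses if lit_true(a) or lit_true(b))

def best_assignment(clauses, n_vars):
    """Exhaustive baseline (feasible for small n, like the ES
    comparator): an assignment maximising satisfied clauses."""
    return max(product([False, True], repeat=n_vars),
               key=lambda assign: satisfied_clauses(clauses, assign))
```

An ABC-style search would replace the exhaustive `max` with employed/onlooker/scout bees proposing bit-flips and keeping the fittest assignments.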
  20. Mohamad-Matrol AA, Chang SW, Abu A
    PeerJ, 2018;6:e5579.
    PMID: 30186704 DOI: 10.7717/peerj.5579
    Background: The amount of plant data such as taxonomical classification, morphological characteristics, ecological attributes and geological distribution in textual and image forms has increased rapidly due to emerging research and technologies. Therefore, it is crucial for experts as well as the public to discern meaningful relationships from this vast amount of data using appropriate methods. The data are often presented in lengthy texts and tables, which make gaining new insights difficult. The study proposes a visual-based representation to display data to users in a meaningful way. This method emphasises the relationships between different data sets.

    Method: This study involves four main steps which translate text-based results from Extensible Markup Language (XML) serialisation format into graphs. The four steps include: (1) conversion of ontological dataset as graph model data; (2) query from graph model data; (3) transformation of text-based results in XML serialisation format into a graphical form; and (4) display of results to the user via a graphical user interface (GUI). Ontological data for plants and samples of trees and shrubs were used as the dataset to demonstrate how plant-based data could be integrated into the proposed data visualisation.

    Results: A visualisation system named plant visualisation system was developed. This system provides a GUI that enables users to perform the query process, as well as a graphical viewer to display the results of the query in the form of a network graph. The efficiency of the developed visualisation system was measured by performing two types of user evaluations: a usability heuristics evaluation, and a query and visualisation evaluation.

    Discussion: The relationships between the data were visualised, enabling users to easily infer knowledge and correlations between data. The results from the user evaluation show that the proposed visualisation system is suitable for both expert and novice users, with or without computer skills. This technique demonstrates the practicability of using a computer-assisted tool to provide cognitive analysis for understanding relationships between data. Therefore, the results benefit not only botanists, but also novice users, especially those interested in knowing more about plants.

    Matched MeSH terms: Heuristics