Displaying publications 1 - 20 of 27 in total

  1. Zamli KZ, Din F, Ahmed BS, Bures M
    PLoS One, 2018;13(5):e0195675.
    PMID: 29771918 DOI: 10.1371/journal.pone.0195675
    The sine-cosine algorithm (SCA) is a new population-based meta-heuristic algorithm. In addition to exploiting sine and cosine functions to perform local and global searches (hence the name sine-cosine), the SCA introduces several random and adaptive parameters to facilitate the search process. Although it shows promising results, the search process of the SCA is vulnerable to local minima/maxima due to the adoption of a fixed switch probability and the bounded magnitude of the sine and cosine functions (from -1 to 1). In this paper, we propose a new hybrid Q-learning sine-cosine-based strategy, called the Q-learning sine-cosine algorithm (QLSCA). Within the QLSCA, we eliminate the switching probability. Instead, we rely on the Q-learning algorithm (based on the penalty and reward mechanism) to dynamically identify the best operation during runtime. Additionally, we integrate two new operations (Lévy flight motion and crossover) into the QLSCA to facilitate jumping out of local minima/maxima and enhance the solution diversity. To assess its performance, we adopt the QLSCA for the combinatorial test suite minimization problem. Experimental results reveal that the QLSCA is statistically superior with regard to test suite size reduction compared to recent state-of-the-art strategies, including the original SCA, the particle swarm test generator (PSTG), adaptive particle swarm optimization (APSO) and the cuckoo search strategy (CS) at the 95% confidence level. However, concerning the comparison with discrete particle swarm optimization (DPSO), there is no significant difference in performance at the 95% confidence level. On a positive note, the QLSCA statistically outperforms the DPSO in certain configurations at the 90% confidence level.
    Matched MeSH terms: Heuristics*
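    The Q-learning operator-selection mechanism described above can be illustrated with a minimal Python sketch. The operator names mirror the abstract, while the reward values, learning rate, discount factor and exploration rate are illustrative assumptions rather than the paper's settings.
      import random

      # Candidate search operations the agent chooses between at each iteration.
      OPERATIONS = ["sine", "cosine", "levy_flight", "crossover"]

      # Q-table with a single global state: one value per operation.
      q_table = {op: 0.0 for op in OPERATIONS}
      ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration

      def select_operation():
          """Epsilon-greedy choice of the next search operation."""
          if random.random() < EPSILON:
              return random.choice(OPERATIONS)
          return max(q_table, key=q_table.get)

      def update_q(op, improved):
          """Reward the operation if it improved the best test suite, otherwise penalise it."""
          reward = 1.0 if improved else -1.0
          best_next = max(q_table.values())
          q_table[op] += ALPHA * (reward + GAMMA * best_next - q_table[op])

      # Inside the optimisation loop (solution update and fitness evaluation omitted):
      #   op = select_operation()
      #   improved = apply_operation(op)   # hypothetical: True if the test suite shrank
      #   update_q(op, improved)
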
  2. Nisar K, Sabir Z, Zahoor Raja MA, Ibrahim AAA, Mahmoud SR, Balubaid M, et al.
    Sensors (Basel), 2021 Sep 30;21(19).
    PMID: 34640887 DOI: 10.3390/s21196567
    In this study, a numerical computation heuristic for the environmental and economic system is developed using the artificial neural network (ANN) structure together with the global search capability of the heuristic genetic algorithm (GA) and the quick local search of the interior-point algorithm (IPA), i.e., ANN-GA-IPA. The environmental and economic system depends on three categories: the execution cost of control standards, the elimination costs of new technical diagnostics for emergency values, and the competence of the system of industrial elements. These three elements form a nonlinear differential environmental and economic system. The optimization of an error-based objective function is performed using the differential environmental and economic system and its initial conditions.
    Matched MeSH terms: Heuristics*
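    The global-search-plus-local-refinement pattern that the entry above attributes to the GA and IPA stages can be sketched as follows. The objective function, population size and the coordinate-wise refinement step are illustrative stand-ins, not the authors' formulation of the environmental-economic system.
      import random

      def objective(x):
          # Placeholder error-based objective; the paper builds this from the
          # differential environmental-economic system and its initial conditions.
          return sum((xi - 0.5) ** 2 for xi in x)

      def global_search(dim=3, pop=30, gens=100, lb=0.0, ub=1.0):
          """Tiny real-coded GA: tournament selection, blend crossover, Gaussian mutation."""
          population = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(pop)]
          for _ in range(gens):
              new_pop = []
              for _ in range(pop):
                  a, b = random.sample(population, 2)
                  parent = min(a, b, key=objective)                    # tournament
                  mate = random.choice(population)
                  child = [(p + m) / 2 for p, m in zip(parent, mate)]  # blend crossover
                  child = [min(ub, max(lb, c + random.gauss(0, 0.05))) for c in child]
                  new_pop.append(child)
              population = new_pop
          return min(population, key=objective)

      def local_refine(x, step=0.01, iters=100):
          """Cheap coordinate search standing in for the interior-point refinement."""
          best = list(x)
          for _ in range(iters):
              for i in range(len(best)):
                  for delta in (-step, step):
                      trial = list(best)
                      trial[i] += delta
                      if objective(trial) < objective(best):
                          best = trial
          return best

      seed = global_search()
      print(local_refine(seed), objective(local_refine(seed)))
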
  3. Kishore DJK, Mohamed MR, Sudhakar K, Peddakapu K
    Environ Sci Pollut Res Int, 2023 Jul;30(35):84167-84182.
    PMID: 37358770 DOI: 10.1007/s11356-023-28248-8
    At present, photovoltaic (PV) systems help reduce the risk of global warming while generating electricity. However, a PV system faces numerous problems in tracking the global maximum peak power (GMPP) owing to the nonlinear nature of the environment, especially under partial shading conditions (PSC). To solve these difficulties, previous researchers have utilized various conventional methods in their investigations. Nevertheless, these methods exhibit oscillations around the GMPP. Hence, a new metaheuristic method, the opposition-based equilibrium optimizer (OBEO) algorithm, is used in this work to mitigate the oscillations around the GMPP. To establish its effectiveness, the proposed method is evaluated against other methods such as SSA, GWO, and P&O. According to the simulation outcomes, the proposed OBEO method provides the highest efficiency among all the compared methods: 95.09% in 0.16 s under dynamic PSC, 96.17% under uniform PSC, and 86.25% under complex PSC.
    Matched MeSH terms: Heuristics*
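    The opposition-based step that gives the OBEO algorithm its name can be sketched in a few lines: every random candidate is mirrored across the search bounds, and the better of the pair is kept. The one-dimensional duty-cycle interpretation and the fitness function below are illustrative assumptions.
      import random

      def opposite(candidate, lb, ub):
          """Opposition-based learning: mirror a candidate across the search bounds."""
          return [lb + ub - x for x in candidate]

      def opposition_init(pop_size, lb, ub, fitness):
          """Keep the better of each random candidate and its opposite."""
          population = []
          for _ in range(pop_size):
              x = [random.uniform(lb, ub)]
              x_opp = opposite(x, lb, ub)
              population.append(max(x, x_opp, key=fitness))   # maximise PV power
          return population

      # The fitness would be the measured PV power at a given converter duty cycle;
      # a made-up surrogate with a peak near 0.6 is used here.
      pop = opposition_init(10, 0.0, 1.0, fitness=lambda d: -(d[0] - 0.6) ** 2)
      print(pop)
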
  4. Ser LL, Shaharuddin Salleh, Nor Haniza Sarmin
    Sains Malaysiana, 2014;43:1263-1269.
    In this paper, a model called the graph partitioning and transformation model (GPTM), which transforms a connected graph into a single-row network, is introduced. The transformation is necessary in applications such as the assignment of telephone channels to caller-receiver pairs roaming in cells of a cellular network on a real-time basis. A connected graph is transformed into its corresponding single-row network for assigning the channels to the caller-receiver pairs. The GPTM starts with a linear-time heuristic graph partitioning to produce two subgraphs with higher densities. The optimal labeling for the nodes is then formed based on the simulated annealing technique. Experimental results support our hypothesis that the GPTM efficiently transforms a connected graph into its single-row network.
    Matched MeSH terms: Heuristics
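    The simulated-annealing labeling step mentioned above can be illustrated by searching for a node ordering that minimises the total edge span of a single-row arrangement. The cost function, cooling schedule and the toy graph are illustrative assumptions, not the GPTM's exact formulation.
      import math, random

      def edge_span(order, edges):
          """Total distance between connected nodes when laid out on a single row."""
          pos = {node: i for i, node in enumerate(order)}
          return sum(abs(pos[u] - pos[v]) for u, v in edges)

      def anneal(nodes, edges, t0=10.0, cooling=0.995, steps=5000):
          order = list(nodes)
          cost = edge_span(order, edges)
          best, best_cost = list(order), cost
          t = t0
          for _ in range(steps):
              i, j = random.sample(range(len(order)), 2)
              order[i], order[j] = order[j], order[i]          # propose a label swap
              new_cost = edge_span(order, edges)
              if new_cost < cost or random.random() < math.exp((cost - new_cost) / t):
                  cost = new_cost                               # accept the move
                  if cost < best_cost:
                      best, best_cost = list(order), cost
              else:
                  order[i], order[j] = order[j], order[i]       # reject: undo the swap
              t *= cooling
          return best

      print(anneal(["a", "b", "c", "d"], [("a", "c"), ("b", "d"), ("a", "d")]))
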
  5. Madni SHH, Abd Latiff MS, Abdullahi M, Abdulhamid SM, Usman MJ
    PLoS One, 2017;12(5):e0176321.
    PMID: 28467505 DOI: 10.1371/journal.pone.0176321
    Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used in addressing this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
    Matched MeSH terms: Heuristics*
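    As a concrete example of the rule-based heuristics compared above, a minimal Min-min scheduler is sketched below; it repeatedly maps the task with the smallest achievable completion time. The execution-time matrix is hypothetical.
      def min_min(etc):
          """Min-min scheduling.

          etc[t][m] is the estimated execution time of task t on machine m.
          Returns a task-to-machine mapping and the resulting makespan.
          """
          unscheduled = set(range(len(etc)))
          ready = [0.0] * len(etc[0])          # machine ready times
          mapping = {}
          while unscheduled:
              best_task, best_machine, best_ct = None, None, float("inf")
              for t in unscheduled:
                  for m, rt in enumerate(ready):
                      ct = rt + etc[t][m]      # completion time of task t on machine m
                      if ct < best_ct:
                          best_task, best_machine, best_ct = t, m, ct
              mapping[best_task] = best_machine
              ready[best_machine] = best_ct
              unscheduled.remove(best_task)
          return mapping, max(ready)

      # Hypothetical 4 tasks x 2 machines execution-time matrix.
      print(min_min([[3, 5], [2, 4], [6, 1], [4, 4]]))
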
  6. Odili JB, Noraziah A, Zarina M
    Comput Intell Neurosci, 2021;2021:6625438.
    PMID: 33986793 DOI: 10.1155/2021/6625438
    This paper presents a comparative performance analysis of several metaheuristics, namely the African Buffalo Optimization algorithm (ABO), Improved Extremal Optimization (IEO), Model-Induced Max-Min Ant Colony Optimization (MIMM-ACO), Max-Min Ant System (MMAS), Cooperative Genetic Ant System (CGAS), and the heuristic Randomized Insertion Algorithm (RAI), in solving the asymmetric Travelling Salesman Problem (ATSP). Quite unlike the symmetric Travelling Salesman Problem, there is a paucity of research on the asymmetric counterpart. This is disturbing because most real-life applications are actually asymmetric in nature. These six algorithms were chosen because they have posted some of the best results in the literature and because they employ different search schemes in attempting to solve the ATSP: the African Buffalo Optimization employs the modified Karp-Steele mechanism, Model-Induced Max-Min Ant Colony Optimization (MIMM-ACO) employs path construction with a patching technique, the Cooperative Genetic Ant System uses natural selection and ordering, the Randomized Insertion Algorithm uses the random insertion approach, and Improved Extremal Optimization uses the grid search strategy. After a number of experiments on 15 of the 19 popular but difficult ATSP instances in TSPLIB, the results show that the African Buffalo Optimization algorithm slightly outperformed the other algorithms in obtaining the optimal results, and at a much faster speed.
    Matched MeSH terms: Heuristics
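    The Randomized Insertion Algorithm named above follows the classic insertion idea: visit the cities in a random order and place each one where it increases the tour cost the least. The asymmetric cost matrix below is hypothetical.
      import random

      def random_insertion(cost):
          """Grow an ATSP tour by inserting each remaining city at its cheapest position."""
          n = len(cost)
          cities = list(range(n))
          random.shuffle(cities)
          tour = cities[:2]                       # start from two random cities
          for city in cities[2:]:
              best_pos, best_delta = 1, float("inf")
              for i in range(len(tour)):
                  a, b = tour[i], tour[(i + 1) % len(tour)]
                  delta = cost[a][city] + cost[city][b] - cost[a][b]   # note: asymmetric
                  if delta < best_delta:
                      best_pos, best_delta = i + 1, delta
              tour.insert(best_pos, city)
          return tour

      cost = [[0, 2, 9, 10],
              [1, 0, 6, 4],
              [15, 7, 0, 8],
              [6, 3, 12, 0]]
      print(random_insertion(cost))
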
  7. Hasiah Mohamed@Omar, Rohana Yusoff, Azizah Jaafar
    Heuristic Evaluation (HE) is used as a basis in developing a new technique to evaluate the usability of educational computer games, known as Playability Heuristic Evaluation for Educational Computer Games (PHEG). PHEG was developed to identify usability problems and accommodates five heuristics, namely interface, educational elements, content, playability and multimedia. In the HE process, usability problems are rated based on a severity score, followed by the presentation of a mean value. The mean value is used to determine the level of usability problems; however, in some cases this value may not be accurate because it ignores the most critical problems found in a specific part. In developing PHEG, a new quantitative approach was proposed for analyzing usability problem data. The number of sub-heuristics for each heuristic was taken into account in calculating the percentage for each heuristic. Functions to calculate critical problems were also introduced. An evaluation of one educational game that was still under development was conducted, and the results showed that most of the critical problems were found in the educational elements and content heuristics (57.14%), while the fewest usability problems were found in the playability heuristic. In particular, the mean value in this analysis can be used as an indicator in identifying critical problems for educational computer games.
    Matched MeSH terms: Heuristics
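    The quantitative analysis step described above, which weights each heuristic by its number of sub-heuristics and separately flags critical problems, can be sketched as follows. The heuristic names come from the abstract, while the rating counts, the 0-4 severity scale and the critical threshold are illustrative assumptions.
      # Severity ratings (0-4) recorded for the sub-heuristics of each PHEG heuristic.
      ratings = {
          "interface":            [2, 1, 0, 3],
          "educational_elements": [4, 3, 4],
          "content":              [3, 4],
          "playability":          [1, 0, 1, 0, 1],
          "multimedia":           [2, 1, 1],
      }

      CRITICAL = 3   # assumed threshold: severity >= 3 counts as a critical problem

      def critical_percentages(ratings):
          """Share of all critical problems contributed by each heuristic."""
          critical = {h: sum(1 for s in scores if s >= CRITICAL)
                      for h, scores in ratings.items()}
          total = sum(critical.values()) or 1
          return {h: 100.0 * c / total for h, c in critical.items()}

      def mean_severity(ratings):
          """Mean severity per heuristic, normalised by its number of sub-heuristics."""
          return {h: sum(scores) / len(scores) for h, scores in ratings.items()}

      print(critical_percentages(ratings))
      print(mean_severity(ratings))
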
  8. Jayaprakasam S, Abdul Rahim SK, Leow CY, Ting TO
    PLoS One, 2017;12(5):e0175510.
    PMID: 28464000 DOI: 10.1371/journal.pone.0175510
    Collaborative beamforming (CBF) with a finite number of collaborating nodes (CNs) produces sidelobes that are highly dependent on the collaborating nodes' locations. The sidelobes cause interference and affect the communication rate of unintended receivers located within the transmission range. Nulling is not possible in an open-loop CBF since the collaborating nodes are unable to receive feedback from the receivers. Hence, overall sidelobe reduction is required to avoid interference in the directions of the unintended receivers. However, the impact of sidelobe reduction on the capacity improvement at the unintended receiver has never been reported in previous works. In this paper, the effect of peak sidelobe (PSL) reduction in CBF on the capacity of an unintended receiver is analyzed. Three meta-heuristic optimization methods are applied to perform PSL minimization, namely the genetic algorithm (GA), particle swarm optimization (PSO) and a simplified version of the PSO called the weightless swarm algorithm (WSA). An average reduction of 20 dB in PSL alongside a 162% capacity improvement is achieved in the worst-case scenario with the WSA optimization. It is discovered that PSL minimization in the CBF provides capacity improvement at an unintended receiver only if the CBF cluster is small and dense.
    Matched MeSH terms: Heuristics
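    The peak-sidelobe-level objective that GA, PSO and WSA minimise in the entry above can be sketched by evaluating the far-field array factor of a random node cluster. The node layout, wavelength and the mainlobe exclusion window are illustrative assumptions.
      import cmath, math, random

      WAVELENGTH = 1.0
      K = 2 * math.pi / WAVELENGTH

      def array_factor(nodes, phi, phi0=0.0):
          """Normalised array factor of collaborating nodes phase-steered towards phi0."""
          total = sum(cmath.exp(1j * K * (x * (math.cos(phi) - math.cos(phi0))
                                          + y * (math.sin(phi) - math.sin(phi0))))
                      for x, y in nodes)
          return abs(total) / len(nodes)

      def peak_sidelobe_db(nodes, mainlobe_halfwidth=0.2):
          """Largest sidelobe (in dB) outside an assumed mainlobe window around phi0 = 0."""
          angles = [i * 2 * math.pi / 720 for i in range(720)]
          sidelobes = [array_factor(nodes, phi) for phi in angles
                       if min(phi, 2 * math.pi - phi) > mainlobe_halfwidth]
          return 20 * math.log10(max(sidelobes))

      # Hypothetical cluster of 16 randomly placed collaborating nodes.
      nodes = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(16)]
      print(peak_sidelobe_db(nodes))
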
  9. Mohd Khairol Anuar Mohd Ariffin, Masood fathi, Napsiah Ismail
    Assembly line balancing is well-known in mass production systems, but this problem is non-deterministic polynomial-time (NP)-hard, even for a simple straight line. Although several heuristic methods have been introduced and used by researchers, knowing and using an effective method that solves these types of problems in less computational time has a considerable place in the area of the line balancing problem. In this research, a new heuristic approach, known as the critical node method (CNM), was introduced and tested by solving several test problems available in the literature so as to solve straight assembly lines. Finally, the obtained results are compared with 9 other heuristic rules on several performance measures. It is concluded that the proposed CNM is better than the rest in all the measures.
    Matched MeSH terms: Heuristics
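    A minimal station-assignment heuristic of the kind the CNM is compared against can be sketched as below: fill each station with the longest precedence-feasible task that still fits the cycle time. The task times, precedence relations and cycle time are hypothetical, and every task is assumed to fit within the cycle time.
      def balance(times, precedence, cycle_time):
          """Greedy longest-task-first line balancing under precedence and cycle-time limits."""
          assigned, stations = set(), []
          while len(assigned) < len(times):
              station, load = [], 0.0
              while True:
                  ready = [t for t in times
                           if t not in assigned
                           and precedence.get(t, set()) <= assigned
                           and load + times[t] <= cycle_time]
                  if not ready:
                      break
                  task = max(ready, key=lambda t: times[t])   # longest eligible task first
                  station.append(task)
                  assigned.add(task)
                  load += times[task]
              stations.append(station)
          return stations

      times = {"A": 4, "B": 3, "C": 5, "D": 2, "E": 4}
      precedence = {"C": {"A"}, "D": {"A", "B"}, "E": {"C", "D"}}
      print(balance(times, precedence, cycle_time=8))
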
  10. Chee, L.P., Wan Alwi, S.R., Lim, J.S.
    Aggregate planning acts as a blueprint for all operational planning activities. Despite the substantial amount of research that has been done on methods to improve aggregate planning approaches, the industry is still at a loss when it comes to working on the tactical planning aspect, especially in aggregate production. Therefore, this research work aims to present a comprehensive and generalised framework that formulates a realistic batch production environment using an interactive Production Decision Support System. This system consists of an aggregate planning framework that combines a simulation model and a Pinch Analysis graphical approach to improve the effectiveness and efficiency of the decision-making process. The target is to allow operational opportunities to be captured at first sight and thus maximise organisational profit. The simplicity and practicality of this new Production Decision Support System are demonstrated through two illustrative examples in which a total of four heuristics were identified and turned into new strategies to avoid stock-out scenarios.
    Matched MeSH terms: Heuristics
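    The Pinch Analysis idea behind the framework above compares cumulative production against cumulative demand over the planning horizon, so a stock-out shows up wherever the supply curve drops below the demand curve. A minimal sketch with hypothetical monthly figures:
      def stock_out_periods(production, demand, opening_stock=0.0):
          """Return the periods where cumulative supply falls below cumulative demand."""
          cum_supply, cum_demand, shortages = opening_stock, 0.0, []
          for period, (p, d) in enumerate(zip(production, demand), start=1):
              cum_supply += p
              cum_demand += d
              if cum_supply < cum_demand:           # the curves cross: a stock-out
                  shortages.append((period, cum_demand - cum_supply))
          return shortages

      # Hypothetical monthly production plan and demand forecast.
      print(stock_out_periods(production=[100, 100, 100, 100],
                              demand=[80, 90, 140, 60]))
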
  11. Nurul Nadia Nordin, Lee, Lai Soon
    The Facility Layout Problem (FLP) is an NP-hard problem concerned with the arrangement of facilities so as to minimize the distance travelled between all pairs of facilities. Many exact and approximate approaches with extensive applicability have been proposed to deal with this problem. This paper studies the fundamentals of some well-known heuristics and metaheuristics used in solving FLPs. It is hoped that this paper will encourage researchers to undertake in-depth studies of FLPs with more specific interests, such as equal or unequal FLPs.
    Matched MeSH terms: Heuristics
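    The objective common to the FLP heuristics surveyed above is the total flow-weighted distance between facilities; a short function makes the quadratic-assignment form explicit. The flow and distance matrices are hypothetical.
      def layout_cost(assignment, flow, dist):
          """Quadratic-assignment-style cost: sum of flow(i, j) * distance(loc(i), loc(j)).

          assignment[i] is the location index given to facility i.
          """
          n = len(assignment)
          return sum(flow[i][j] * dist[assignment[i]][assignment[j]]
                     for i in range(n) for j in range(n))

      flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
      dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
      print(layout_cost([0, 1, 2], flow, dist))   # identity layout
      print(layout_cost([2, 0, 1], flow, dist))   # an alternative layout
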
  12. Mohamad-Matrol AA, Chang SW, Abu A
    PeerJ, 2018;6:e5579.
    PMID: 30186704 DOI: 10.7717/peerj.5579
    Background: The amount of plant data such as taxonomical classification, morphological characteristics, ecological attributes and geological distribution in textual and image forms has increased rapidly due to emerging research and technologies. Therefore, it is crucial for experts as well as the public to discern meaningful relationships from this vast amount of data using appropriate methods. The data are often presented in lengthy texts and tables, which make gaining new insights difficult. The study proposes a visual-based representation to display data to users in a meaningful way. This method emphasises the relationships between different data sets.

    Method: This study involves four main steps which translate text-based results from Extensible Markup Language (XML) serialisation format into graphs. The four steps include: (1) conversion of ontological dataset as graph model data; (2) query from graph model data; (3) transformation of text-based results in XML serialisation format into a graphical form; and (4) display of results to the user via a graphical user interface (GUI). Ontological data for plants and samples of trees and shrubs were used as the dataset to demonstrate how plant-based data could be integrated into the proposed data visualisation.

    Results: A visualisation system named plant visualisation system was developed. This system provides a GUI that enables users to perform the query process, as well as a graphical viewer to display the results of the query in the form of a network graph. The efficiency of the developed visualisation system was measured by performing two types of user evaluations: a usability heuristics evaluation, and a query and visualisation evaluation.

    Discussion: The relationships between the data were visualised, enabling the users to easily infer the knowledge and correlations between data. The results from the user evaluation show that the proposed visualisation system is suitable for both expert and novice users, with or without computer skills. This technique demonstrates the practicability of using a computer assisted-tool by providing cognitive analysis for understanding relationships between data. Therefore, the results benefit not only botanists, but also novice users, especially those that are interested to know more about plants.

    Matched MeSH terms: Heuristics
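    Step (3) of the pipeline above, turning XML-serialised query results into a graph, can be sketched with the standard library alone. The element and attribute names below are illustrative, not the system's actual ontology schema.
      import xml.etree.ElementTree as ET
      from collections import defaultdict

      SAMPLE = """
      <results>
        <relation subject="Shorea leprosula" predicate="hasHabit" object="tree"/>
        <relation subject="Shorea leprosula" predicate="foundIn" object="lowland forest"/>
        <relation subject="Hibiscus rosa-sinensis" predicate="hasHabit" object="shrub"/>
      </results>
      """

      def xml_to_graph(xml_text):
          """Build an adjacency map (node -> list of (predicate, node)) from <relation> elements."""
          graph = defaultdict(list)
          for rel in ET.fromstring(xml_text).iter("relation"):
              graph[rel.get("subject")].append((rel.get("predicate"), rel.get("object")))
          return graph

      for node, edges in xml_to_graph(SAMPLE).items():
          print(node, "->", edges)
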
  13. Haliza Abd. Rahman, Arifah Bahar, Norhayati Rosli, Madihah Md. Salleh
    Sains Malaysiana, 2012;41:1635-1642.
    Non-parametric modeling is a method which relies heavily on data and is motivated by smoothness properties in estimating a function; it involves spline and non-spline approaches. The spline approach consists of regression splines and smoothing splines. A regression spline with a Bayesian approach is considered in the first step of a two-step method for estimating the structural parameters of a stochastic differential equation (SDE). The selection of the knot and the order of the spline can be done heuristically based on the scatter plot. To overcome the subjective and tedious process of selecting the optimal knot and order of the spline, an algorithm was proposed. A single optimal knot is selected out of all the points, with the exception of the first and last data points, as the one giving the least value of the Generalized Cross Validation (GCV) for each order of spline. The use of the algorithm is illustrated using observed opening share prices of Petronas Gas Bhd. The results showed that the Mean Square Errors (MSE) for the stochastic model with parameters estimated using the optimal knot for 1,000, 5,000 and 10,000 runs of Brownian motions are smaller than those of the SDE models with parameters estimated using a knot selected heuristically. This verified the viability of the two-step method in the estimation of the drift and diffusion parameters of SDE with the improvement of a single knot selection.
    Matched MeSH terms: Heuristics
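    The single-knot search driven by Generalized Cross Validation can be sketched as below: each interior data point is tried as the knot of a degree-1 regression spline, and the knot with the lowest GCV score is kept. The linear-spline basis and the toy price series are illustrative assumptions, not the authors' exact model.
      import numpy as np

      def gcv_for_knot(t, y, knot):
          """Fit a degree-1 regression spline with one interior knot and return its GCV score."""
          basis = np.column_stack([np.ones_like(t), t, np.maximum(t - knot, 0.0)])
          coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
          residuals = y - basis @ coef
          n, p = len(y), basis.shape[1]
          rss = float(residuals @ residuals)
          return n * rss / (n - p) ** 2          # GCV for a fixed least-squares basis

      def best_knot(t, y):
          """Try every interior data point as the knot; keep the one with the lowest GCV."""
          candidates = t[1:-1]                   # exclude the first and last observations
          return min(candidates, key=lambda k: gcv_for_knot(t, y, k))

      # Toy stand-in for an observed share-price series.
      t = np.linspace(0.0, 1.0, 50)
      y = np.where(t < 0.4, 10 + 2 * t, 10.8 - 3 * (t - 0.4)) + np.random.normal(0, 0.05, t.size)
      print(best_knot(t, y))
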
  14. Ahmad Z, Jehangiri AI, Ala'anzy MA, Othman M, Umar AI
    Sensors (Basel), 2021 Oct 30;21(21).
    PMID: 34770545 DOI: 10.3390/s21217238
    Cloud computing is a fully fledged, mature and flexible computing paradigm that provides services to scientific and business applications in a subscription-based environment. Scientific applications such as Montage and CyberShake are organized scientific workflows with data- and compute-intensive tasks that also have some special characteristics. These characteristics include tasks of scientific workflows that are executed in terms of integration, disintegration, pipeline, and parallelism, and thus require special attention to task management and data-oriented resource scheduling and management. Tasks executed in a pipeline are considered bottleneck executions, the failure of which results in a wholly futile execution, and this requires fault-tolerant-aware execution. Tasks executed in parallel require similar instances of cloud resources, and thus cluster-based execution may upgrade system performance in terms of make-span and execution cost. Therefore, this research work presents a cluster-based, fault-tolerant and data-intensive (CFD) scheduling for scientific applications in cloud environments. The CFD strategy addresses the data intensiveness of the tasks of scientific workflows with cluster-based, fault-tolerant mechanisms. The Montage scientific workflow is considered in the simulation, and the results of the CFD strategy were compared with three well-known heuristic scheduling policies: (a) MCT, (b) Max-min, and (c) Min-min. The simulation results showed that the CFD strategy reduced the make-span by 14.28%, 20.37%, and 11.77%, respectively, compared with the three existing policies. Similarly, the CFD reduces the execution cost by 1.27%, 5.3%, and 2.21%, respectively, compared with the three existing policies. In the case of the CFD strategy, the SLA is not violated with regard to time and cost constraints, whereas it is violated by the existing policies numerous times.
    Matched MeSH terms: Heuristics
  15. Nantha YS
    Korean J Fam Med, 2017 Nov;38(6):315-321.
    PMID: 29209469 DOI: 10.4082/kjfm.2017.38.6.315
    A prescriptive model approach in decision making could help achieve better diagnostic accuracy in clinical practice through methods that are less reliant on probabilistic assessments. Various prescriptive measures aimed at regulating factors that influence heuristics and clinical reasoning could support the clinical decision-making process. Clinicians could avoid time-consuming decision-making methods that require probabilistic calculations. Intuitively, they could rely on heuristics to obtain an accurate diagnosis in a given clinical setting. An extensive literature review of cognitive psychology and medical decision-making theory was performed to illustrate how heuristics could be effectively utilized in daily practice. Since physicians often rely on heuristics in realistic situations, probabilistic estimation might not be a useful tool in everyday clinical practice. Improvements in the descriptive model of decision making (heuristics) may allow for greater diagnostic accuracy.
    Matched MeSH terms: Heuristics
  16. Mousavi SM, Naghsh A, Abu-Bakar SA
    J Digit Imaging, 2015 Aug;28(4):417-27.
    PMID: 25736857 DOI: 10.1007/s10278-015-9770-z
    This paper presents an automatic region of interest (ROI) segmentation method for the application of watermarking in medical images. The advantage of using this scheme is that the proposed method is robust against different attacks such as median, Wiener, Gaussian, and sharpening filters. In other words, this technique can produce the same result for the ROI before and after these attacks. The proposed algorithm consists of three main parts: suggesting an automatic ROI detection system, evaluating the robustness of the proposed system against numerous attacks, and finally recommending an enhancement part to increase the strength of the composed system against different attacks. Results obtained from the proposed method demonstrated the promising performance of the method.
    Matched MeSH terms: Heuristics*
  17. Khowaja K, Salim SS, Asemi A
    PLoS One, 2015;10(7):e0132187.
    PMID: 26196385 DOI: 10.1371/journal.pone.0132187
    In this paper, we adapted and expanded a set of guidelines, also known as heuristics, for evaluating the usability of software so that they are appropriate for software aimed at children with autism spectrum disorder (ASD). We started from the heuristics developed by Nielsen in 1990 and developed a modified set of 15 heuristics. The first 5 heuristics of this set are the same as those of the original Nielsen set, the next 5 heuristics are improved versions of Nielsen's, whereas the last 5 heuristics are new. We present two evaluation studies of our new heuristics. In the first, two groups compared Nielsen's set with the modified set of heuristics, with each group evaluating two interactive systems. Nielsen's heuristics were assigned to the control group while the experimental group was given the modified set of heuristics, and a statistical analysis was conducted to determine the effectiveness of the modified set, the contribution of the 5 new heuristics and the impact of the 5 improved heuristics. The results show that the modified set is significantly more effective than the original, and we found a significant difference between the five improved heuristics and their corresponding heuristics in the original set. The five new heuristics are effective in problem identification using the modified set. The second study was conducted using a system which was developed to ascertain whether the modified set was effective at identifying usability problems that could be fixed before the release of software. The post-study analysis revealed that the majority of the usability problems identified by the experts were fixed in the updated version of the system.
    Matched MeSH terms: Heuristics*
  18. Al-Saiagh W, Tiun S, Al-Saffar A, Awang S, Al-Khaleefa AS
    PLoS One, 2018;13(12):e0208695.
    PMID: 30571777 DOI: 10.1371/journal.pone.0208695
    Word sense disambiguation (WSD) is the process of identifying an appropriate sense for an ambiguous word. With the complexity of human languages, in which a single word can yield different meanings, WSD has been utilized in several domains of interest such as search engines and machine translation. The literature shows a vast number of techniques used for the process of WSD. Recently, researchers have focused on the use of meta-heuristic approaches to identify the best solutions that reflect the best sense. However, the application of meta-heuristic approaches remains limited and thus requires efficient exploration and exploitation of the problem space. Hence, the current study aims to propose a hybrid meta-heuristic method that consists of particle swarm optimization (PSO) and simulated annealing to find the global best meaning of a given text. Different semantic measures have been utilized in this model as objective functions for the proposed hybrid PSO. These measures consist of the JCN and extended Lesk methods, which are combined effectively in this work. The proposed method is tested on three benchmark datasets (SemCor 3.0, SensEval-2, and SensEval-3). Results show that the proposed method has superior performance in comparison with state-of-the-art approaches.
    Matched MeSH terms: Heuristics
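    One of the objective functions combined in the hybrid PSO above is the extended Lesk measure; a simplified gloss-overlap version is sketched below. The glosses are made up for illustration and no WordNet access is assumed.
      def lesk_overlap(context, gloss):
          """Count content words shared between a sentence context and a sense gloss."""
          stop = {"the", "a", "an", "of", "in", "on", "and", "to", "is", "that"}
          ctx = {w.lower().strip(".,") for w in context.split()} - stop
          gls = {w.lower().strip(".,") for w in gloss.split()} - stop
          return len(ctx & gls)

      def best_sense(context, senses):
          """Pick the sense whose gloss overlaps most with the context."""
          return max(senses, key=lambda s: lesk_overlap(context, senses[s]))

      senses = {   # hypothetical glosses for the ambiguous word "bank"
          "bank#1": "a financial institution that accepts deposits and lends money",
          "bank#2": "the sloping land beside a body of water such as a river",
      }
      print(best_sense("he sat on the bank of the river and watched the water", senses))
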
  19. Masroor K, Jeoti V, Drieberg M, Cheab S, Rajbhandari S
    Sensors (Basel), 2021 Apr 22;21(9).
    PMID: 33922288 DOI: 10.3390/s21092943
    Bi-directional information transfer in optical body area networks (OBANs) is crucial at all three tiers of communication, i.e., intra-, inter-, and beyond-BAN communication, which correspond to tier-I, tier-II, and tier-III, respectively. However, the provision of uninterrupted uplink (UL) and downlink (DL) connections at tier-II (inter-BAN) is extremely critical, since these links serve as a bridge between tier-I (intra-BAN) and tier-III (beyond-BAN) communication. Any negligence at this level could be life-threatening; therefore, enabling quality-of-service (QoS) remains a fundamental design issue at tier-II. Consequently, to provide QoS, a key requirement is to ensure link reliability and communication quality by maintaining a nearly uniform signal-to-noise ratio (SNR) within the coverage area. Several studies have reported the effects of transceiver-related parameters on OBAN link performance; nevertheless, the implications of changing transmitter locations on the SNR uniformity and communication quality have not been addressed. In this work, we undertake a DL scenario and analyze how the placement of light-emitting diode (LED) lamps can improve the SNR uniformity, regardless of the receiver position. Subsequently, we show that, using the principle of reciprocity (POR) with the transmitter-receiver positions switched, the analysis is also applicable to the UL, provided that the optical channel remains linear. Moreover, we propose a generalized optimal placement scheme along with a heuristic design formula to achieve uniform SNR and illuminance for the DL using a fixed number of transmitters, and compare it with an existing technique. The study reveals that the proposed placement technique reduces the fluctuations in SNR by 54% and improves the illuminance uniformity by up to 102% compared to the traditional approach. Finally, we show that, for very low luminous intensity, the SNR values remain sufficient to maintain a minimum bit error rate (BER) of 10^-9 with the on-off keying non-return-to-zero (OOK-NRZ) modulation format.
    Matched MeSH terms: Heuristics
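    The uniformity criterion optimised above can be illustrated by evaluating a line-of-sight Lambertian link budget over a grid of receiver positions and measuring the spread of received optical power (which the electrical SNR scales with). The room size, LED coordinates and optical parameters are illustrative assumptions, not the paper's design values.
      import math

      LAMBERTIAN_ORDER = 1          # m = 1 corresponds to a 60-degree LED semi-angle
      ROOM, HEIGHT = 5.0, 2.5       # 5 m x 5 m plane, 2.5 m transmitter-to-receiver height

      def received_power(leds, rx, tx_power=1.0):
          """Sum of line-of-sight Lambertian contributions from every LED at receiver rx."""
          total = 0.0
          for lx, ly in leds:
              d = math.sqrt((lx - rx[0]) ** 2 + (ly - rx[1]) ** 2 + HEIGHT ** 2)
              cos_angle = HEIGHT / d            # emission and incidence angles coincide here
              total += (tx_power * (LAMBERTIAN_ORDER + 1) / (2 * math.pi * d ** 2)
                        * cos_angle ** (LAMBERTIAN_ORDER + 1))
          return total

      def fluctuation(leds, grid=20):
          """Relative spread of received power across the plane: (max - min) / max."""
          step = ROOM / (grid - 1)
          powers = [received_power(leds, (i * step, j * step))
                    for i in range(grid) for j in range(grid)]
          return (max(powers) - min(powers)) / max(powers)

      # Compare one central lamp against four symmetrically placed lamps.
      print(fluctuation([(2.5, 2.5)]))
      print(fluctuation([(1.25, 1.25), (1.25, 3.75), (3.75, 1.25), (3.75, 3.75)]))
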
  20. Nor Hasnul Azirah Abdul Hamid, Normalina Ibrahim@Mat, Nurul Najihah Mustopa
    ESTEEM Academic Journal, 2020;16(2):51-64.
    The Student Information Management System (SIMS) is a computerized system for education that can be used to manage student information and data. PASTI An-Nur was chosen as a case study in developing the system. Several problems that PASTI An-Nur faces due to the implementation of a manual admission process were identified. The first problem is the paper-based registration form, which is prone to loss and misplacement and is less secure. In the payment process, a higher error rate arises when checking and calculating payments. The biggest downfall for PASTI An-Nur is the amount of space used to store all the students' files. These problems cause inefficiency as the world becomes computerized and data management becomes one of the most significant issues. The aim of developing the Preschool Management System (PRESIMS) is therefore to help the staff and teachers in managing the students' information. The Adapter Waterfall model was used in developing this system. Additionally, usability heuristics were used as a theory to guide the development of this system. The system has been tested with four (4) users and two (2) experts. The testing method for the users is the ISO/IEC 9126-4 approach to measure usability metrics, including efficiency, effectiveness, and satisfaction, whereas for the experts, a heuristic evaluation applying six (6) usability principles was used. The result of the testing is very satisfying, showing 75.5% efficiency, 83.33% effectiveness, and three (3) out of four (4) users very satisfied with the system. The result of the heuristic evaluation also shows a successful implementation of the system. The details of the results are discussed in this paper; the system is expected to meet the users' specifications and is ready to go live.
    Matched MeSH terms: Heuristics