
  1. Noor Ashikin Othman, Mohammad Khatim Hasan
    MyJurnal
    Simulating the Lotka-Volterra model with a numerical method requires researchers to apply tiny mesh sizes to obtain an accurate result. This approach nevertheless increases the complexity and burden on computer memory and consumes long computational times. To overcome these issues, we investigate and construct new two-step solvers that can simulate the Lotka-Volterra model using a bigger mesh size. This paper proposes three new two-step schemes to simulate the Lotka-Volterra model. A non-standard approximation scheme with a trimean approach was adopted: the nonlinear terms in the model are approximated via the trimean approach and the differential equation via non-standard denominators. Four sets of parameters were examined to analyse the performance of these new schemes. Results show that the new schemes provide better simulations for large mesh sizes.
    Matched MeSH terms: Computers
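The abstract above describes the schemes only in general terms; the exact trimean two-step form is not given. As a hedged illustration of the underlying idea (non-standard denominators plus a nonlocal treatment of the nonlinear terms), a minimal positivity-preserving non-standard finite-difference (NSFD) scheme for the Lotka-Volterra system x' = x(a - by), y' = y(-c + dx) might look like this; all parameter values are arbitrary, not taken from the paper:

```python
import math

def nsfd_lotka_volterra(a, b, c, d, x0, y0, h, steps):
    """Positivity-preserving semi-implicit NSFD scheme for the system
    x' = x(a - b*y), y' = y(-c + d*x); illustrative only."""
    phi = (math.exp(a * h) - 1.0) / a        # non-standard denominator
    x, y = x0, y0
    traj = [(x, y)]
    for _ in range(steps):
        x = (x + phi * a * x) / (1.0 + phi * b * y)    # prey update
        y = (y + phi * d * x * y) / (1.0 + phi * c)    # predator update (uses new x)
        traj.append((x, y))
    return traj

# large mesh size h = 0.5, where a naive explicit scheme may lose positivity
traj = nsfd_lotka_volterra(a=1.0, b=0.5, c=0.75, d=0.25, x0=2.0, y0=1.0, h=0.5, steps=200)
```

Because every term in each update is non-negative, the populations stay positive for any mesh size, which is the property that lets such schemes use coarse grids.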
  2. Mohammad Khatim Hasan, Shahrizan Mazlan
    MyJurnal
    Simulating the Lotka-Volterra model using a numerical method requires the researcher to apply tiny mesh sizes to arrive at an accurate solution. This approach increases the complexity and burden on computer memory and consumes long computational times. To overcome these issues, a new solver is used that can simulate the Lotka-Volterra model using a bigger mesh size. In this paper, prey and predator behaviour is simulated via the Lotka-Volterra model. We approximate the nonlinear terms in the model via a weighted-average approach and the differential equation via non-standard denominators. We provide three new schemes for the one-step method and simulate four sets of parameters to examine the performance of these new schemes. Results show that these new schemes perform better for large mesh sizes.
    Matched MeSH terms: Computers
  3. Kayode JS, Yusup Y
    Data Brief, 2018 Aug;19:798-803.
    PMID: 29900375 DOI: 10.1016/j.dib.2018.05.090
    A secondary dataset was generated from the Euldph-λ semi-automatic algorithm (ESA), developed to automatically compute various depths to the magnetic anomalies using a primary data set from gridded aeromagnetic data obtained in the study area. The Euler deconvolution technique (EDT) was adopted in the identification and definition of the magnetic anomaly source rocks in the study area. The aim is to use this straightforward technique to pinpoint magnetic anomalies at depths that substantiate the mineralization potential of the area. The ESA was integrated with the imaging function of the Oasis Montaj 2014 source parameter module from Geosoft® Inc. From the data, it could be summarized that, under similar tectonic processes during the deformation and metamorphic activities, the subsurface structures of the study area produce a corresponding trending form.
    Matched MeSH terms: Computers
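Euler deconvolution of the kind the EDT applies rests on Euler's homogeneity equation, (x - x0)∂T/∂x + (y - y0)∂T/∂y + (z - z0)∂T/∂z = N(B - T), solved by least squares over a data window for the source position (x0, y0, z0) and background field B given a structural index N. A minimal sketch, not the ESA implementation; the window data below are synthetic and constructed to be self-consistent:

```python
import numpy as np

def euler_deconvolution(x, y, z, T, dTdx, dTdy, dTdz, N):
    """Least-squares solve of the rearranged homogeneity equation
    x0*Tx + y0*Ty + z0*Tz + N*B = x*Tx + y*Ty + z*Tz + N*T
    over one data window."""
    A = np.column_stack([dTdx, dTdy, dTdz, N * np.ones_like(T)])
    rhs = x * dTdx + y * dTdy + z * dTdz + N * T
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return sol                                  # (x0, y0, z0, B)

# synthetic window consistent with a source at (3, 4, 2), background B = 10, N = 1
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 20), rng.uniform(0, 10, 20)
z = np.zeros(20)                                # observations on the surface
gx, gy, gz = rng.normal(size=(3, 20))           # field gradients
N, B = 1.0, 10.0
T = B - ((x - 3) * gx + (y - 4) * gy + (z - 2) * gz) / N
x0, y0, z0, B_est = euler_deconvolution(x, y, z, T, gx, gy, gz, N)
```

On real aeromagnetic grids this solve is repeated over a moving window, and poorly conditioned windows are rejected.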
  4. Mohamad-Matrol AA, Chang SW, Abu A
    PeerJ, 2018;6:e5579.
    PMID: 30186704 DOI: 10.7717/peerj.5579
    Background: The amount of plant data such as taxonomical classification, morphological characteristics, ecological attributes and geological distribution in textual and image forms has increased rapidly due to emerging research and technologies. Therefore, it is crucial for experts as well as the public to discern meaningful relationships from this vast amount of data using appropriate methods. The data are often presented in lengthy texts and tables, which make gaining new insights difficult. The study proposes a visual-based representation to display data to users in a meaningful way. This method emphasises the relationships between different data sets.

    Method: This study involves four main steps which translate text-based results from Extensible Markup Language (XML) serialisation format into graphs. The four steps include: (1) conversion of ontological dataset as graph model data; (2) query from graph model data; (3) transformation of text-based results in XML serialisation format into a graphical form; and (4) display of results to the user via a graphical user interface (GUI). Ontological data for plants and samples of trees and shrubs were used as the dataset to demonstrate how plant-based data could be integrated into the proposed data visualisation.

    Results: A visualisation system named plant visualisation system was developed. This system provides a GUI that enables users to perform the query process, as well as a graphical viewer to display the results of the query in the form of a network graph. The efficiency of the developed visualisation system was measured by performing two types of user evaluations: a usability heuristics evaluation, and a query and visualisation evaluation.

    Discussion: The relationships between the data were visualised, enabling the users to easily infer the knowledge and correlations between data. The results from the user evaluation show that the proposed visualisation system is suitable for both expert and novice users, with or without computer skills. This technique demonstrates the practicability of using a computer assisted-tool by providing cognitive analysis for understanding relationships between data. Therefore, the results benefit not only botanists, but also novice users, especially those that are interested to know more about plants.

    Matched MeSH terms: Computers
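Step 3 of the pipeline above (transforming XML-serialised query results into a graphical form) can be sketched with the Python standard library alone. The element and attribute names below are hypothetical, since the abstract does not give the actual serialisation schema:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical XML serialisation of a query result: node and edge elements.
XML_RESULT = """
<result>
  <node id="Shorea" label="genus"/>
  <node id="Shorea leprosula" label="species"/>
  <node id="lowland forest" label="habitat"/>
  <edge from="Shorea leprosula" to="Shorea" relation="memberOf"/>
  <edge from="Shorea leprosula" to="lowland forest" relation="foundIn"/>
</result>
"""

def xml_to_graph(xml_text):
    """Transform XML-serialised results into a graph model: a label map for
    the nodes and an adjacency list a network viewer could render."""
    root = ET.fromstring(xml_text)
    labels = {n.get("id"): n.get("label") for n in root.iter("node")}
    adjacency = defaultdict(list)
    for e in root.iter("edge"):
        adjacency[e.get("from")].append((e.get("relation"), e.get("to")))
    return labels, dict(adjacency)

labels, graph = xml_to_graph(XML_RESULT)
```

A GUI layer would then lay out `graph` as a network diagram, which is the role the plant visualisation system's graphical viewer plays.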
  5. Teh Sin Yin, Ong Ker Hsin, Soh Keng Lin, Khoo Michael Boon Chong, Teoh Wei Li
    Sains Malaysiana, 2015;44:1067-1075.
    The existing optimal design of the fixed sampling interval S2-EWMA control chart to monitor the sample variance of a process is based on the average run length (ARL) criterion. Since the shape of the run length distribution changes with the magnitude of the shift in the variance, the median run length (MRL) gives a more meaningful explanation about the in-control and out-of-control performances of a control chart. This paper proposes the optimal design of the S2-EWMA chart, based on the MRL. The Markov chain technique is employed to compute the MRLs. The performances of the S2-EWMA chart, double sampling (DS) S2 chart and S chart are evaluated and compared. The MRL results indicated that the S2-EWMA chart gives better performance for detecting small and moderate variance shifts, while maintaining almost the same sensitivity as the DS S2 and S charts toward large variance shifts, especially when the sample size increases.
    Matched MeSH terms: Computers
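The Markov-chain technique used for the MRLs can be sketched for the simpler textbook case of a two-sided EWMA chart monitoring a normal mean; the S2-EWMA chart for the variance follows the same Brook-Evans construction with a different in-region distribution. The λ and L values here are illustrative, not the paper's optimal designs:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def ewma_mrl(lam, L, shift=0.0, m=101):
    """Median run length of a two-sided EWMA chart for N(shift, 1) data,
    via a Markov-chain approximation with m transient states on (-L, L)."""
    w = 2.0 * L / m                          # width of each state interval
    mid = -L + (np.arange(m) + 0.5) * w      # state midpoints
    P = np.zeros((m, m))
    for i in range(m):
        drift = (1.0 - lam) * mid[i]         # carried-over part of the EWMA
        for j in range(m):
            hi = (mid[j] + w / 2.0 - drift) / lam
            lo = (mid[j] - w / 2.0 - drift) / lam
            P[i, j] = norm_cdf(hi - shift) - norm_cdf(lo - shift)
    surv = np.zeros(m)
    surv[m // 2] = 1.0                       # chart starts at Z0 = 0
    t = 0
    while surv.sum() > 0.5:                  # surv.sum() = P(run length > t)
        surv = surv @ P
        t += 1
    return t                                 # smallest t with P(RL <= t) >= 0.5

mrl_in_control = ewma_mrl(lam=0.1, L=2.7)
mrl_shifted = ewma_mrl(lam=0.1, L=2.7, shift=1.0)
```

The in-control MRL should be large and the out-of-control MRL small, which is exactly the comparison the paper's optimal design is built around.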
  6. Othman A. Karim, Crapper M, Ali K.H.M.
    The study of cohesive sediment in the laboratory gives rise to a number of instrumentation problems, especially in locating the mud bed, fluid mud and hindered settling layers and in measuring flow velocities. This paper describes the application of a medical diagnostic ultrasound technique in a cohesive sediment study conducted at the University of Liverpool, United Kingdom. The paper illustrates that the ultrasound technique creates a reasonably flexible environment for the study of the fluid mud phenomenon, in which bed formation and flow velocities can be measured easily, accurately and non-intrusively. This in turn will assist in the development of computer models to predict the environmental impact, siltation rates and dredging requirements in both new and existing port and harbour developments.
    Matched MeSH terms: Computers
  7. Abu Hassan Shaari Mohd Nor, Ahmad Shamiri, Zaidi Isa
    In this research we introduce an analysis procedure using the Kullback-Leibler information criterion (KLIC) as a statistical tool to evaluate and compare the predictive abilities of possibly misspecified density forecast models. The main advantage of this statistical tool is that we use censored likelihood functions to compute the tail minimum of the KLIC, to compare the performance of density forecast models in the tails. Use of the KLIC is practically attractive as well as convenient, given its equivalence to the widely used LR test. We include an illustrative simulation to compare a set of distributions, including symmetric and asymmetric distributions, and a family of GARCH volatility models. Our results on simulated data show that the choice of the conditional distribution appears to be a more dominant factor in determining the adequacy and accuracy (quality) of density forecasts than the choice of volatility model.
    Matched MeSH terms: Computers
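The censored likelihood idea, scoring a forecast density fully inside the tail region of interest while lumping the remaining probability mass outside it, can be sketched as follows. The densities and threshold are illustrative, not those of the paper's simulation:

```python
import numpy as np
from math import erf, sqrt

def norm_pdf(y, s=1.0):
    return np.exp(-(y / s) ** 2 / 2.0) / (s * np.sqrt(2.0 * np.pi))

def norm_cdf_scalar(y, s=1.0):
    return 0.5 * (1.0 + erf(y / (s * sqrt(2.0))))

def censored_log_score(y, pdf, cdf_at_r, r):
    """Censored likelihood score for the left-tail region (-inf, r):
    full log-density inside the region, the lumped mass log(1 - F(r)) outside.
    Mean score differences between two forecasts estimate their tail KLIC gap."""
    return np.where(y < r, np.log(pdf(y)), np.log(1.0 - cdf_at_r)).mean()

rng = np.random.default_rng(1)
y = rng.standard_normal(5000)        # data-generating process: N(0, 1)
r = -1.0                             # left-tail threshold
s_good = censored_log_score(y, lambda v: norm_pdf(v, 1.0), norm_cdf_scalar(r, 1.0), r)
s_bad = censored_log_score(y, lambda v: norm_pdf(v, 2.0), norm_cdf_scalar(r, 2.0), r)
```

Since the censored likelihood score is proper, the correctly specified N(0, 1) forecast should score higher in the tail than the misspecified N(0, 4) forecast.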
  8. Chamran MK, Yau KA, Noor RMD, Wong R
    Sensors (Basel), 2019 Dec 19;20(1).
    PMID: 31861500 DOI: 10.3390/s20010018
    This paper demonstrates the use of the Universal Software Radio Peripheral (USRP), together with the Raspberry Pi 3 B+ (RP3) as the brain (or decision-making engine), to develop a distributed wireless network in which nodes can communicate with other nodes independently and make decisions autonomously. In other words, each USRP node (i.e., sensor) is embedded with a separate processing unit (i.e., an RP3), which has not been investigated in the literature, so that each node can make independent decisions in a distributed manner. The proposed testbed in this paper is compared with the traditional distributed testbed, which has been widely used in the literature. In the traditional distributed testbed, there is a single processing unit (i.e., a personal computer) that makes decisions in a centralized manner, and each node (i.e., USRP) is connected to the processing unit via a switch. The single processing unit exchanges control messages with nodes via the switch, while the nodes exchange data packets among themselves over a wireless medium in a distributed manner. The main disadvantage of the traditional testbed is that, despite the network being distributed in nature, decisions are made in a centralized manner; hence, the response delay of the control message exchange is always neglected. The use of such a testbed is mainly due to the limited hardware and the monetary cost of acquiring a separate processing unit for each node. The experiment in our testbed has shown an increase in end-to-end delay and a decrease in packet delivery ratio due to software and hardware delays. The observed multihop transmission is performed using device-to-device (D2D) communication, which has been enabled in 5G. Therefore, nodes can communicate with other nodes via either: (a) direct communication with the base station at the macrocell, which helps to improve network performance; or (b) D2D communication, which improves spectrum efficiency, whereby traffic is offloaded from the macrocell to small cells. Our testbed is the first of its kind at this scale, and it uses the RP3 as the distributed decision-making engine incorporated into the USRP/GNU Radio platform. This work provides insight into the development of a 5G network.
    Matched MeSH terms: Computers; Microcomputers
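The end-to-end delay and packet delivery ratio reported in the experiment are straightforward to compute from send/receive logs. A minimal sketch with made-up timestamps:

```python
def link_metrics(sent, received):
    """Mean end-to-end delay and packet delivery ratio (PDR) from logs.
    sent/received: dicts mapping packet id -> timestamp in seconds."""
    delays = [received[p] - sent[p] for p in sent if p in received]
    mean_delay = sum(delays) / len(delays) if delays else float("nan")
    pdr = len(delays) / len(sent)
    return mean_delay, pdr

# hypothetical log of four packets, one of which (id 3) was lost
sent = {1: 0.00, 2: 0.10, 3: 0.20, 4: 0.30}
received = {1: 0.05, 2: 0.17, 4: 0.42}
mean_delay, pdr = link_metrics(sent, received)   # 0.08 s, 0.75
```

In the paper's comparison, the per-node RP3 processing adds software/hardware delay that shows up directly in these two metrics.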
  9. Noor Azrin Zainuddin, Shamsatuan Nahar, Norzarina Johari, Farah Suraya Md Nasrudin, Noraisyah Abdul Aziz, Nur Diana Zamani, et al.
    Jurnal Inovasi Malaysia, 2018;1(2):23-36.
    MyJurnal
    The use of technology in teaching and learning is increasingly synonymous with the existence of multiple online platforms. Online teaching and learning guides help lecturers and students to obtain a variety of information related to their specialization in the field of study. As all UiTM students at the Diploma level are required to take and pass the Entrepreneurial Basics course (ENT300), they need to produce an entrepreneurial project as one of the course evaluation components. However, the number of science- and technology-based entrepreneurship projects and products is still too small, judging by the project titles submitted each semester. The ENT300 Kiosk Science and Technology (KENTS) was developed specifically as a guide for students of the Faculty of Computer Science and Mathematics, and has been improved by expanding its scope to students of the Faculty of Engineering and the Faculty of Science at UiTM Johor. This more global scope lets KENTS provide specialized online guides for lecturers and students in the science and technology clusters. KENTS is a platform that can be used to realize the direction of higher education in Malaysia and to assist UiTM in producing holistic graduates with entrepreneurship skills. This online guide platform provides teaching and learning assistance through custom business templates, which are categorized into two types: system development and machine design. KENTS provides a search function over a compiled list of science-based entrepreneurial projects that helps lecturers and students find entrepreneurial ideas. The KENTS database stores student entrepreneurial project information as an e-learning platform that can be shared by lecturers and students globally.
    Matched MeSH terms: Computers
  10. Zhan Z, Wang C, Yap JBH, Loi MS
    Heliyon, 2020 Apr;6(4):e03671.
    PMID: 32382668 DOI: 10.1016/j.heliyon.2020.e03671
    This study aims to rationalise and demonstrate the efficacy of utilising a laser cutting technique in the fabrication of glulam mortise & tenon joints in timber frames. Trial-and-error experiments aided by a laser cutter were conducted to produce 3D timber mortise & tenon joint models. The two main instruments used were 3D modelling software and the laser cutter TH 1390/6090. Plywood was chosen because it could produce smooth and accurate cut edges whereby the surface could remain crack-free, and it could increase stability due to its laminated nature. Google SketchUp was used for modelling, and LaserCAD v7.52 was used to transfer the 3D models to the laser cutter because it is compatible with AI, BMP, PLT, DXF and DST templates. Four models were designed and fabricated, and the trial-and-error experiments proved that laser cutting could speed up the manufacturing process with superb quality and high uniformity. Precision laser cutting supports easy automation, produces a small heat-affected zone, minimises deformity, is relatively quiet and produces a low amount of waste. The LaserCAD software could not process 3D images directly but needed 2D images to be transferred, so layering and unfolding work was therefore needed. This study revealed a significant potential for rapid manufacturing of mortise & tenon joints with high quality and high uniformity through a computer-aided laser cutting technique, for wide application in the built environment.
    Matched MeSH terms: Computers
  11. Aznan A, Gonzalez Viejo C, Pang A, Fuentes S
    Sensors (Basel), 2021 Sep 23;21(19).
    PMID: 34640673 DOI: 10.3390/s21196354
    Rice quality assessment is essential for meeting high-quality standards and consumer demands. However, challenges remain in developing cost-effective and rapid techniques to assess commercial rice grain quality traits. This paper presents the application of computer vision (CV) and machine learning (ML) to classify commercial rice samples based on dimensionless morphometric parameters and color parameters extracted using CV algorithms from digital images obtained with a smartphone camera. An artificial neural network (ANN) model was developed using nine morpho-colorimetric parameters to classify rice samples into 15 commercial rice types. Furthermore, the ANN models were deployed and evaluated on a different imaging system to simulate their practical application under different conditions. Results showed that the best classification accuracy was obtained using the Bayesian Regularization (BR) algorithm of the ANN with ten hidden neurons, at 91.6% (MSE < 0.01) and 88.5% (MSE = 0.01) for the training and testing stages, respectively, with an overall accuracy of 90.7% (Model 2). Deployment also showed high accuracy (93.9%) in the classification of the rice samples. The adoption by the industry of rapid, reliable, and accurate methods, such as those presented here, may allow different morpho-colorimetric traits in rice to be incorporated alongside consumer perception studies.
    Matched MeSH terms: Computers
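Dimensionless morphometric parameters of the kind fed into the ANN can be computed from a segmented grain mask without any camera calibration, which is what makes them transferable across imaging systems. A minimal sketch on a synthetic mask; the specific nine descriptors used in the paper are not listed in the abstract:

```python
import numpy as np

def morphometrics(mask):
    """Two scale-free shape descriptors for one segmented rice grain
    (binary mask): bounding-box aspect ratio and extent (fill ratio)."""
    rows, cols = np.nonzero(mask)
    height = rows.max() - rows.min() + 1
    width = cols.max() - cols.min() + 1
    aspect_ratio = max(height, width) / min(height, width)
    extent = mask.sum() / (height * width)   # fraction of bounding box filled
    return aspect_ratio, extent

grain = np.zeros((20, 40), dtype=int)
grain[8:12, 5:35] = 1                        # elongated grain-like blob
ar, extent = morphometrics(grain)            # 7.5, 1.0
```

Descriptor vectors like this, concatenated with color statistics, would form the input features of the classification model.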
  12. Sabry AH, Hasan WZW, Ab Kadir M, Radzi MAM, Shafie S
    PLoS One, 2017;12(9):e0185012.
    PMID: 28934271 DOI: 10.1371/journal.pone.0185012
    The main tool for measuring system efficiency in homes and offices is the energy monitoring of household appliances' consumption. With the help of a GUI on a PC or smartphone, various applications can be developed for energy saving. This work describes the design and prototype implementation of a wireless PV-powered home energy management system under a DC-distribution environment, which allows remote monitoring of appliances' energy consumption and power rate quality. The system can be managed by a central computer, which obtains the energy data via XBee RF modules that access the sensor measurements of system components. The proposed integrated prototype framework is characterized by low power consumption, due to its small component count, and consists of three layers: an XBee-based circuit for the processing and communication architecture, a solar charge controller, and solar-battery-load matching layers. Six precise analogue channels for data monitoring are considered to cover the energy measurements. Voltage, current and temperature analogue signals were accessed directly from the remote XBee node and sent in real time with a sampling frequency of 11-123 Hz to capture possible power surges. The performance shows that the developed prototype proves the DC voltage matching concept and is able to provide accurate and precise results.
    Matched MeSH terms: Computers
  13. Zheng P, Belaton B, Liao IY, Rajion ZA
    PLoS One, 2017;12(11):e0187558.
    PMID: 29121077 DOI: 10.1371/journal.pone.0187558
    Landmarks, also known as feature points, are one of the important geometry primitives that describe the predominant characteristics of a surface. In this study we propose a self-contained framework to generate landmarks on surfaces extracted from volumetric data. The framework is designed as a three-fold pipeline comprising three phases: surface construction, crest line extraction and landmark identification. With volumetric data as input and landmarks as output, the pipeline takes in 3D raw data and produces a 0D geometry feature. In each phase we investigate existing methods, then extend and tailor them to fit the pipeline design. The pipeline is modularised so that each phase has a dedicated function. We extended the implicit surface polygonizer for surface construction in the first phase, developed an alternative way to compute the gradient of maximal curvature for crest line extraction in the second phase, and finally combined curvature information with the K-means clustering method to identify the landmarks in the third phase. The implementations were first carried out in a controlled environment, i.e. on synthetic data, as a proof of concept. The method was then tested on a small-scale data set and subsequently on a huge data set. Issues and justifications are addressed accordingly for each phase.
    Matched MeSH terms: Computers
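The third phase, combining curvature information with K-means, can be sketched with a plain K-means clustering that returns one landmark per cluster of crest-line points. This illustration clusters raw point positions only; the paper's incorporation of curvature information is left out:

```python
import numpy as np

def kmeans_landmarks(points, k, iters=50, seed=0):
    """Cluster crest-line points with K-means and return one landmark
    per cluster (the cluster centroid)."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids

# two well-separated clumps of synthetic crest-line points in 3D
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.1, (50, 3)), rng.normal(5.0, 0.1, (50, 3))])
landmarks = kmeans_landmarks(pts, k=2)
```

In practice k would be chosen per anatomical region, and points could be weighted by maximal curvature so landmarks gravitate toward the sharpest crest segments.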
  14. Purnamasari P, Amran NA, Hartanto R
    F1000Res, 2022;11:559.
    PMID: 36474997 DOI: 10.12688/f1000research.121674.2
    Background: This study aims to examine public sector auditors' tendency to use computer-assisted audit techniques (CAATs) in managing their audit work. Methods: A total of 400 questionnaires were distributed to auditors working in the public sector in Central Java, West Java, and East Java. Of these, 225 questionnaires were returned and completed. Structural Equation Modelling (SEM) and Partial Least Squares (PLS) were used to analyze the data. Results: The empirical findings reveal that performance expectation and facilitating conditions have encouraged auditors to use CAATs in their work. Further, there is a positive influence of the intention to use on the CAATs audit. This implies that auditors with such an intention will be more open to using CAATs optimally in achieving effective and efficient work. The utilization of CAATs in public services needs strong support from the government and positive attitudes from the auditors as the users of the system. Conclusion: This study covers the broad areas of Central Java, West Java, and East Java. The findings add to the literature on emerging markets, specifically on Indonesian government auditors' intention to use CAATs and the appropriateness of doing so. The use of CAATs helps to provide auditors with information on the auditees most heavily involved in corruption.
    Matched MeSH terms: Computers
  15. Zainurin SN, Wan Ismail WZ, Mahamud SNI, Ismail I, Jamaludin J, Ariffin KNZ, et al.
    Int J Environ Res Public Health, 2022 Oct 28;19(21).
    PMID: 36360992 DOI: 10.3390/ijerph192114080
    Nowadays, water pollution has become a global issue affecting most countries in the world. Water quality should be monitored to alert authorities on water pollution, so that action can be taken quickly. The objective of the review is to study various conventional and modern methods of monitoring water quality to identify the strengths and weaknesses of the methods. The methods include the Internet of Things (IoT), virtual sensing, cyber-physical system (CPS), and optical techniques. In this review, water quality monitoring systems and process control in several countries, such as New Zealand, China, Serbia, Bangladesh, Malaysia, and India, are discussed. Conventional and modern methods are compared in terms of parameters, complexity, and reliability. Recent methods of water quality monitoring techniques are also reviewed to study any loopholes in modern methods. We found that CPS is suitable for monitoring water quality due to a good combination of physical and computational algorithms. Its embedded sensors, processors, and actuators can be designed to detect and interact with environments. We believe that conventional methods are costly and complex, whereas modern methods are also expensive but simpler with real-time detection. Traditional approaches are more time-consuming and expensive due to the high maintenance of laboratory facilities, involve chemical materials, and are inefficient for on-site monitoring applications. Apart from that, previous monitoring methods have issues in achieving a reliable measurement of water quality parameters in real time. There are still limitations in instruments for detecting pollutants and producing valuable information on water quality. Thus, the review is important in order to compare previous methods and to improve current water quality assessments in terms of reliability and cost-effectiveness.
    Matched MeSH terms: Computers
  16. Inamdar MA, Raghavendra U, Gudigar A, Chakole Y, Hegde A, Menon GR, et al.
    Sensors (Basel), 2021 Dec 20;21(24).
    PMID: 34960599 DOI: 10.3390/s21248507
    Amongst the most common causes of death globally, stroke is one of the top three, affecting over 100 million people worldwide annually. There are two classes of stroke, namely ischemic stroke (due to impairment of blood supply, accounting for ~70% of all strokes) and hemorrhagic stroke (due to bleeding), both of which can result, if untreated, in permanently damaged brain tissue. The discovery that the affected brain tissue (i.e., the 'ischemic penumbra') can be salvaged from permanent damage, together with the burgeoning growth of computer-aided diagnosis, has led to major advances in stroke management. Abiding by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, we surveyed a total of 177 research papers published between 2010 and 2021 to highlight the current status of, and challenges faced by, computer-aided diagnosis (CAD), machine learning (ML) and deep learning (DL) based techniques for CT and MRI as prime modalities for stroke detection and lesion region segmentation. This work concludes by showcasing the current requirements of this domain, the preferred modality, and prospective research areas.
    Matched MeSH terms: Computers
  17. Ali S, Ghatwary N, Jha D, Isik-Polat E, Polat G, Yang C, et al.
    Sci Rep, 2024 Jan 23;14(1):2032.
    PMID: 38263232 DOI: 10.1038/s41598-024-52063-x
    Polyps are well-known cancer precursors identified by colonoscopy. However, variability in their size, appearance, and location makes the detection of polyps challenging. Moreover, colonoscopy surveillance and removal of polyps are highly operator-dependent procedures and occur in a highly complex organ topology. There exists a high missed-detection rate and incomplete removal of colonic polyps. To assist in clinical procedures and reduce miss rates, automated methods for detecting and segmenting polyps using machine learning have been developed in past years. However, the major drawback of most of these methods is their limited ability to generalise to out-of-sample unseen datasets from different centres, populations, modalities, and acquisition systems. To test this hypothesis rigorously, we, together with expert gastroenterologists, curated a multi-centre and multi-population dataset acquired from six different colonoscopy systems and challenged computational expert teams to develop robust automated detection and segmentation methods in a crowd-sourced endoscopic computer vision challenge. This work puts forward rigorous generalisability tests and assesses the usability of the devised deep learning methods in dynamic and actual clinical colonoscopy procedures. We analyse the results of the four top-performing teams for the detection task and the five top-performing teams for the segmentation task. Our analyses demonstrate that the top-ranking teams concentrated mainly on accuracy over the real-time performance required for clinical applicability. We further dissect the devised methods and provide an experiment-based hypothesis that reveals the need for improved generalisability to tackle the diversity present in multi-centre datasets and routine clinical procedures.
    Matched MeSH terms: Computers
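Segmentation entries in challenges of this kind are typically ranked by overlap metrics such as the Dice coefficient and intersection-over-union (IoU); the abstract does not name the exact metric suite used here, so this is a generic sketch:

```python
import numpy as np

def dice_and_iou(pred, truth):
    """Overlap metrics for binary segmentation masks:
    Dice = 2|A∩B| / (|A| + |B|),  IoU = |A∩B| / |A∪B|."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    dice = 2.0 * inter / (pred.sum() + truth.sum())
    iou = inter / np.logical_or(pred, truth).sum()
    return dice, iou

truth = np.zeros((10, 10), dtype=int)
truth[2:8, 2:8] = 1                       # 36-pixel ground-truth polyp
pred = np.zeros((10, 10), dtype=int)
pred[4:10, 4:10] = 1                      # shifted 36-pixel prediction
dice, iou = dice_and_iou(pred, truth)     # 16 overlapping pixels
```

Averaging such per-image scores across held-out centres is one simple way to quantify the generalisability gap the challenge exposes.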
  18. Ramesh M, Muthuraman A
    Curr Top Med Chem, 2021;21(32):2856-2868.
    PMID: 34809547 DOI: 10.2174/1568026621666211122161932
    Neuropathic pain occurs due to physical damage, injury, or dysfunction of neuronal fibers. The pathophysiology of neuropathic pain is highly complex. Therefore, an accurate and reliable prediction of appropriate hits/ligands for the treatment of neuropathic pain is a challenging process. However, computer-aided drug discovery approaches have contributed significantly to discovering newer hits/ligands for the treatment of neuropathic pain. Computational approaches such as homology modeling, induced-fit molecular docking, structure-activity relationships, metadynamics, and virtual screening have been cited in the literature for the identification of potential hit molecules against neuropathic pain. These hit molecules act as inducible nitric oxide synthase inhibitors, FLAT antagonists, TRPA1 modulators, voltage-gated sodium channel binders, cannabinoid receptor-2 agonists, sigma-1 receptor antagonists, etc. The sigma-1 receptor is a distinct type of opioid receptor, and several patents have been obtained for sigma-1 receptor antagonists for the treatment of neuropathic pain. These molecules were found to have a profound role in the management of neuropathic pain. The present review describes the validated therapeutic targets, potential chemical scaffolds, and crucial protein-ligand interactions for the management of neuropathic pain, based on the computational methodologies reported in the present and past decades. The study can help researchers discover newer drugs/drug-like molecules against neuropathic pain.
    Matched MeSH terms: Computers*
  19. Sudha R, Thiagarajan AS, Seetharaman A
    Pak J Biol Sci, 2007 Jan 01;10(1):102-6.
    PMID: 19069993
    The existing literature highlights that security is the primary factor determining the adoption of Internet banking technology. Secondary information on Internet banking development in Malaysia shows a very slow growth rate. Hence, this study aims to examine banking customers' perception of security concerns and Internet banking adoption, using information collected from 150 sample respondents. The data analysis reveals that customers are much concerned about security and privacy issues in the adoption of Internet banking, whether or not they have adopted it. Hence, it can be inferred that, to popularize the Internet banking system, there is a need to address security and privacy concerns among banking customers.
    Matched MeSH terms: Attitude to Computers*
  20. Liew TS, Vermeulen JJ, Marzuki ME, Schilthuizen M
    Zookeys, 2014.
    PMID: 24715783 DOI: 10.3897/zookeys.393.6717
    Plectostoma is a micro land snail restricted to limestone outcrops in Southeast Asia. Plectostoma was previously classified as a subgenus of Opisthostoma because of the deviation from regular coiling in many species of both taxa. This paper is the first of a two-part revision of the genus Plectostoma, and includes all non-Borneo species. In the present paper, we examined 214 collection samples of 31 species, and obtained 62 references, 290 pictures, and 155 3D models of 29 Plectostoma species, together with 51 COI sequences of 19 species. To work with such a variety of taxonomic data, and to represent it in an integrated, scalable and accessible manner, we adopted up-to-date cybertaxonomic tools. All the taxonomic information, such as references, classification, species descriptions, specimen images, genetic data, and distribution data, was tagged and linked with cyber tools and web servers (e.g. Lifedesks, Google Earth, and Barcoding of Life Database). We elevated Plectostoma from subgenus to genus level based on morphological, ecological and genetic evidence. We revised the existing 21 Plectostoma species and described 10 new species, namely, P. dindingensis sp. n., P. mengaburensis sp. n., P. whitteni sp. n., P. kayiani sp. n., P. davisoni sp. n., P. relauensis sp. n., P. kubuensis sp. n., P. tohchinyawi sp. n., P. tenggekensis sp. n., and P. ikanensis sp. n. All the synthesised, semantically tagged, and linked taxonomic information is made freely and publicly available online.
    Matched MeSH terms: Computers