Displaying publications 61 - 80 of 2380 in total

  1. Khalid R, Nawawi MK, Kawsar LA, Ghani NA, Kamil AA, Mustafa A
    PLoS One, 2013;8(4):e58402.
    PMID: 23560037 DOI: 10.1371/journal.pone.0058402
    M/G/C/C state-dependent queuing networks consider service rates as a function of the number of residing entities (e.g., pedestrians, vehicles, and products). However, modeling such dynamic rates is not supported in modern discrete-event simulation (DES) software. We designed an approach to cater for this limitation and used it to construct an M/G/C/C state-dependent queuing model in Arena software. Using the model, we evaluated and analyzed the impacts of various arrival rates on the throughput, the blocking probability, the expected service time and the expected number of entities in a complex network topology. Results indicate that for each network there is a range of arrival rates over which the simulation results fluctuate drastically across replications, causing the simulation and analytical results to exhibit discrepancies. Detailed results showing how closely the simulation results tally with the analytical results, in both tabulated and graphical forms, together with scientific justifications, have been documented and discussed.
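    The analytical side of such a model can be sketched compactly. The minimal Python sketch below assumes the standard Yuhaski-Smith style M/G/C/C state-dependent recursion with an exponential speed-decay f(n); it is not the authors' Arena implementation, and all parameter values are illustrative.

      import math

      def mgcc_state_dependent(lam, c, E_T1, beta, gamma):
          """Blocking probability and throughput for one M/G/C/C corridor whose
          normalised walking speed scales as f(n) = exp(-((n - 1) / beta) ** gamma)."""
          f = lambda n: math.exp(-(((n - 1) / beta) ** gamma))
          # Unnormalised state probabilities: p_n ∝ (lam * E_T1)^n / (n! * f(1)...f(n))
          weights = [1.0]
          prod_f = 1.0
          for n in range(1, c + 1):
              prod_f *= f(n)
              weights.append((lam * E_T1) ** n / (math.factorial(n) * prod_f))
          p0 = 1.0 / sum(weights)
          p_block = weights[c] * p0           # probability the corridor is full
          throughput = lam * (1.0 - p_block)  # accepted arrival rate
          return p_block, throughput

      print(mgcc_state_dependent(lam=2.0, c=100, E_T1=5.0, beta=60.0, gamma=2.0))

    Comparing such analytical figures against replicated simulation output is, in essence, the discrepancy check described in the abstract.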
  2. Sagadevan S, Chowdhury ZZ, Johan MRB, Khan AA, Aziz FA, F Rafique R, et al.
    PLoS One, 2018;13(10):e0202694.
    PMID: 30273344 DOI: 10.1371/journal.pone.0202694
    A cost-effective, facile hydrothermal approach was used for the synthesis of SnO2/graphene (Gr) nanocomposites. XRD patterns clearly confirmed the presence of the tetragonal crystal system of SnO2, which maintained its structure in both the pure and composite materials. The stretching and bending vibrations of the functional groups were analyzed using FTIR analysis. FESEM images illustrated the surface morphology and texture of the synthesized sample. HRTEM images confirmed the deposition of SnO2 nanoparticles over the surface of the graphene nanosheets. Raman spectroscopic analysis was carried out to confirm the in-plane blending of SnO2 and graphene inside the composite matrix. The photocatalytic performance of the synthesized sample was evaluated under UV irradiation using methylene blue dye. Incorporation of graphene into the SnO2 sample increased the photocatalytic activity compared with the pure SnO2 sample. The electrochemical properties of the synthesized sample were also evaluated.
  3. Al-Hada NM, Saion EB, Shaari AH, Kamarudin MA, Flaifel MH, Ahmad SH, et al.
    PLoS One, 2014;9(8):e103134.
    PMID: 25093752 DOI: 10.1371/journal.pone.0103134
    A facile thermal-treatment route was successfully used to synthesize ZnO nanosheets. The morphological, structural, and optical properties of the nanostructures obtained at different calcination temperatures were studied using various techniques. FTIR and XRD spectra together with EDX, SEM and TEM images confirmed the formation of ZnO nanosheets through calcination at temperatures between 500 and 650 °C. The SEM images showed the morphological structure of the ZnO nanosheets, which tended to crumble at higher calcination temperatures. The XRD and FTIR spectra revealed that the samples were amorphous at 30 °C but transformed into a crystalline structure during the calcination process. The average particle size and degree of crystallinity increased with increasing calcination temperature. The average particle sizes estimated from TEM images were about 23 and 38 nm for the lowest and highest calcination temperatures, i.e., 500 and 650 °C, respectively. The optical properties, determined by UV-Vis reflectance spectrophotometry, showed a decrease in the band gap with increasing calcination temperature.
  4. Yew SM, Chan CL, Lee KW, Na SL, Tan R, Hoh CC, et al.
    PLoS One, 2014;9(8):e104352.
    PMID: 25098697 DOI: 10.1371/journal.pone.0104352
    Dematiaceous fungi (black fungi) are a heterogeneous group of fungi present in diverse environments worldwide. Many species in this group are known to cause allergic reactions and potentially fatal diseases in humans and animals, especially in tropical and subtropical climates. This study represents the first survey of dematiaceous fungi in Malaysia and provides observations on their diversity as well as their in vitro response to antifungal drugs. Seventy-five strains isolated from various clinical specimens were identified by morphology as well as an internal transcribed spacer (ITS)-based phylogenetic analysis. The combined molecular and conventional approach enabled the identification of three classes of the Ascomycota phylum and 16 genera, the most common being Cladosporium, Cochliobolus and Neoscytalidium. Several of the species identified have not previously been associated with human infections. Among the 8 antifungal agents tested, the azoles posaconazole (96%), voriconazole (90.7%), ketoconazole (86.7%) and itraconazole (85.3%) showed in vitro activity (MIC ≤ 1 µg/mL) against the largest number of strains, followed by anidulafungin (89.3%), caspofungin (74.7%) and amphotericin B (70.7%). Fluconazole appeared to be the least effective, with only 10.7% of isolates showing in vitro susceptibility. Overall, almost half (45.3%) of the isolates showed reduced susceptibility (MIC >1 µg/mL) to at least one antifungal agent, and three strains (one Pyrenochaeta unguis-hominis and two Nigrospora oryzae) showed potential multidrug resistance.
  5. Tang TQ, Jan R, Shah Z, Vrinceanu N, Tanasescu C, Jan A
    PLoS One, 2024;19(4):e0297967.
    PMID: 38656969 DOI: 10.1371/journal.pone.0297967
    The infectious disease cryptosporidiosis is caused by the parasite Cryptosporidium. It is spread through the ingestion of contaminated water, food, or fecal matter from infected animals or humans. Control is difficult because the parasite may remain in the environment for a long period. In this work, we constructed an epidemic model for cryptosporidiosis infection in a fractional framework with strong and weak immunity concepts. In our analysis, we utilize the well-known next-generation matrix technique to evaluate the reproduction number of the recommended model, denoted by R0. Our results show that the disease-free steady state is locally asymptotically stable when R0 < 1 and becomes unstable otherwise. Our emphasis is on the dynamical behavior and the qualitative analysis of cryptosporidiosis. Moreover, the fixed-point theorems of Schaefer and Banach have been utilized to investigate the existence and uniqueness of the solution. We identify suitable conditions for the Ulam-Hyers stability of the proposed model of the parasitic infection. The impact of the determinants on the sickness caused by cryptosporidiosis is highlighted by examining the solution pathways using a novel numerical technique. A numerical investigation is conducted on the solution pathways of the system while varying various input factors. Policymakers and health officials are informed of the crucial factors pertaining to the infection system to aid in its control.
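    For reference, the next-generation matrix construction the abstract invokes can be written in standard (integer-order) notation; this is the generic recipe, not the authors' specific fractional system:

      \[
        \mathcal{R}_0 = \rho\!\left(F V^{-1}\right), \qquad
        F = \left[\frac{\partial \mathcal{F}_i}{\partial x_j}(E_0)\right], \qquad
        V = \left[\frac{\partial \mathcal{V}_i}{\partial x_j}(E_0)\right],
      \]

    where \mathcal{F} collects the new-infection terms, \mathcal{V} the remaining transition terms, E_0 is the disease-free equilibrium and \rho(\cdot) is the spectral radius; the disease-free state is locally asymptotically stable when \mathcal{R}_0 < 1.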
  6. Jafari H, Shohaimi S, Salari N, Kiaei AA, Najafi F, Khazaei S, et al.
    PLoS One, 2022;17(1):e0262701.
    PMID: 35051240 DOI: 10.1371/journal.pone.0262701
    Anthropometry derives from the Greek words "anthropo", meaning human, and "metry", meaning measurement. It is the science that deals with the dimensions of the body, including the sizes of its different parts, the range of motion and the strength of the muscles. Specific individual dimensions such as heights, widths, depths, distances, circumferences and curvatures are usually measured. In this article, we investigate the anthropometric characteristics of patients with chronic diseases (diabetes, hypertension, cardiovascular disease, heart attack and stroke) and identify the factors affecting these diseases and the extent of the impact of each, in order to support the necessary planning. We focus on cohort data for 10,047 qualified participants from Ravansar County. Machine learning provides opportunities to improve discrimination through the analysis of complex interactions between broad sets of variables. Among the chronic diseases in this cohort study, we used three deep neural network models for the diagnosis and prognosis of the risk of type 2 diabetes mellitus (T2DM) as a case study. In artificial intelligence tasks for medicine, imbalanced data is an important issue, and ignoring it leads to misleading evaluation results. The accuracy criterion is also inappropriate for this task, because a simple model that labels all samples as negative achieves high accuracy. Therefore, the evaluation criteria of precision, recall, AUC and AUPRC were considered. The importance of variables in general was then examined to determine which features contribute most to the risk of T2DM. Finally, a personalization capability was added, in which individual feature importance was examined. Using Shapley values, the model is tuned for each patient so that it can be used to predict that patient's T2DM risk. In this paper, we implement a full pipeline of data creation, data preprocessing, handling of imbalanced data, deep learning modelling, appropriate evaluation, feature importance and individual feature importance. The results demonstrate that the pipeline improves the diagnosis and prognosis of T2DM risk with personalization capability.
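    A minimal scikit-learn sketch of the evaluation point made above (why accuracy misleads on imbalanced data, and the metrics the abstract lists instead); the synthetic data and the logistic-regression stand-in are illustrative assumptions, not the authors' deep models or cohort data.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                                   roc_auc_score, average_precision_score)
      from sklearn.model_selection import train_test_split

      # Roughly 5% positive cases, mimicking a rare-outcome screening task.
      X, y = make_classification(n_samples=10000, weights=[0.95, 0.05], random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

      clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X_tr, y_tr)
      proba = clf.predict_proba(X_te)[:, 1]
      pred = clf.predict(X_te)

      print("accuracy :", accuracy_score(y_te, pred))            # inflated by the majority class
      print("precision:", precision_score(y_te, pred))
      print("recall   :", recall_score(y_te, pred))
      print("AUC      :", roc_auc_score(y_te, proba))
      print("AUPRC    :", average_precision_score(y_te, proba))  # informative under imbalance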
  7. Zheng P, Belaton B, Liao IY, Rajion ZA
    PLoS One, 2017;12(11):e0187558.
    PMID: 29121077 DOI: 10.1371/journal.pone.0187558
    Landmarks, also known as feature points, are one of the important geometric primitives that describe the predominant characteristics of a surface. In this study we propose a self-contained framework to generate landmarks on surfaces extracted from volumetric data. The framework is designed as a three-fold pipeline comprising three phases: surface construction, crest line extraction and landmark identification. With volumetric data as input and landmarks as output, the pipeline takes in 3D raw data and produces 0D geometric features. In each phase we investigate existing methods, then extend and tailor them to fit the pipeline design. The pipeline is modular, with a dedicated function in each phase. We extended the implicit surface polygonizer for surface construction in the first phase, developed an alternative way to compute the gradient of maximal curvature for crest line extraction in the second phase, and finally combined curvature information with the K-means clustering method to identify the landmarks in the third phase. The implementation was first carried out in a controlled environment, i.e. on synthetic data, as a proof of concept. The method was then tested on a small-scale data set and subsequently on a large data set. Issues and justifications are addressed accordingly for each phase.
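    A minimal sketch of the third phase only (landmark identification), assuming crest-line points and their maximal curvatures are already available from the earlier phases; the data are synthetic, the curvature weighting is arbitrary, and the K-means centroids simply stand in for landmarks.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      crest_points = rng.random((500, 3))           # x, y, z of crest-line samples
      k_max = rng.random(500)                       # maximal curvature per point

      # Cluster jointly on position and (up-weighted) curvature.
      features = np.column_stack([crest_points, k_max[:, None] * 10.0])
      km = KMeans(n_clusters=12, n_init=10, random_state=0).fit(features)
      landmarks = km.cluster_centers_[:, :3]        # keep spatial coordinates only
      print(landmarks.shape)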
  8. Baccini A, Petrovich E
    PLoS One, 2023;18(12):e0294669.
    PMID: 38157326 DOI: 10.1371/journal.pone.0294669
    Self-citations are a key topic in evaluative bibliometrics because they can artificially inflate citation-related performance indicators. Recently, self-citations defined at the largest scale, i.e., country self-citations, have started to attract the attention of researchers and policymakers. According to recent research, the anomalous trends in the country self-citation rates of some countries, such as Italy, have been induced by the distorting effect of citation-metrics-centered science policies. In the present study, we investigate the trends of country self-citations in 50 countries around the world over the period 1996-2019 using Scopus data. Results show that country self-citations have decreased over time for most countries. Twelve countries (Colombia, Egypt, Indonesia, Iran, Italy, Malaysia, Pakistan, Romania, the Russian Federation, Saudi Arabia, Thailand, and Ukraine), however, exhibit different behavior, with anomalous self-citation trends. We argue that these anomalies should be attributed to the aggressive science policies adopted by these countries in recent years, which are all characterized by direct or indirect incentives for citations. Our analysis confirms that when bibliometric indicators are integrated into systems of incentives, they can rapidly and visibly affect the citation behavior of entire countries.
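    For clarity, the quantity under study can be illustrated with a toy citation table: the country self-citation rate is the share of a country's incoming citations that originate from the same country. The data frame below is hypothetical, not the Scopus counts used in the paper.

      import pandas as pd

      cites = pd.DataFrame({
          "cited_country":  ["IT", "IT", "IT", "MY", "MY", "DE"],
          "citing_country": ["IT", "FR", "IT", "MY", "US", "FR"],
      })
      self_rate = (cites.assign(self_cite=cites.cited_country == cites.citing_country)
                         .groupby("cited_country")["self_cite"].mean())
      print(self_rate)   # share of incoming citations coming from the same country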
  9. Zamli KZ, Din F, Ahmed BS, Bures M
    PLoS One, 2018;13(5):e0195675.
    PMID: 29771918 DOI: 10.1371/journal.pone.0195675
    The sine-cosine algorithm (SCA) is a new population-based meta-heuristic algorithm. In addition to exploiting sine and cosine functions to perform local and global searches (hence the name sine-cosine), the SCA introduces several random and adaptive parameters to facilitate the search process. Although it shows promising results, the search process of the SCA is vulnerable to local minima/maxima due to the adoption of a fixed switch probability and the bounded magnitude of the sine and cosine functions (from -1 to 1). In this paper, we propose a new hybrid Q-learning sine-cosine-based strategy, called the Q-learning sine-cosine algorithm (QLSCA). Within the QLSCA, we eliminate the switching probability. Instead, we rely on the Q-learning algorithm (based on the penalty and reward mechanism) to dynamically identify the best operation during runtime. Additionally, we integrate two new operations (Lévy flight motion and crossover) into the QLSCA to facilitate jumping out of local minima/maxima and enhance the solution diversity. To assess its performance, we adopt the QLSCA for the combinatorial test suite minimization problem. Experimental results reveal that the QLSCA is statistically superior with regard to test suite size reduction compared to recent state-of-the-art strategies, including the original SCA, the particle swarm test generator (PSTG), adaptive particle swarm optimization (APSO) and the cuckoo search strategy (CS) at the 95% confidence level. However, concerning the comparison with discrete particle swarm optimization (DPSO), there is no significant difference in performance at the 95% confidence level. On a positive note, the QLSCA statistically outperforms the DPSO in certain configurations at the 90% confidence level.
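    A minimal sketch of the operator-selection idea described above: a Q-table rewards whichever operation improved the best fitness, replacing SCA's fixed switch probability. The reward values, learning parameters and toy success rates are illustrative assumptions, not the published QLSCA settings.

      import random

      OPS = ["sine", "cosine", "levy", "crossover"]
      Q = {op: 0.0 for op in OPS}
      alpha, gamma, eps = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate

      def select_op():
          if random.random() < eps:          # occasionally explore a random operator
              return random.choice(OPS)
          return max(Q, key=Q.get)           # otherwise exploit the best-valued operator

      def update(op, improved):
          reward = 1.0 if improved else -1.0             # penalty-and-reward mechanism
          Q[op] += alpha * (reward + gamma * max(Q.values()) - Q[op])

      # Toy loop: pretend "sine" improves the incumbent 70% of the time, others 40%.
      for _ in range(200):
          op = select_op()
          update(op, random.random() < (0.7 if op == "sine" else 0.4))
      print(Q)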
  10. Dehdasht G, Ferwati MS, Zin RM, Abidin NZ
    PLoS One, 2020;15(2):e0228746.
    PMID: 32023306 DOI: 10.1371/journal.pone.0228746
    Successful implementation of the lean concept as a sustainable approach in the construction industry requires the identification of critical drivers in lean construction. Despite this significance, the number of in-depth studies toward understanding the considerable drivers of lean construction implementation is quite limited. There is also a shortage of methodologies for identifying key drivers. To address these challenges, this paper presents a list of all essential drivers within the three aspects of sustainability (social, economic, and environmental) and proposes a novel methodology to rank the drivers and identify the key drivers for successful and sustainable lean construction implementation. In this regard, the entropy-weighted Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) was employed in this research. Subsequently, an empirical study was conducted within the Malaysian construction industry to demonstrate the proposed method. Moreover, sensitivity analysis and comparison with an existing method were used to validate the stability and accuracy of the achieved results. The significant results obtained in this study are as follows: presenting, verifying and ranking 63 important drivers; identifying 22 key drivers; and proposing a multi-criteria decision-making (MCDM) model of the key drivers. The outcomes show that the proposed method is an effective and accurate tool that can help managers make better decisions.
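    A minimal sketch of entropy-weighted TOPSIS on a toy decision matrix (drivers as rows, criteria as columns); the numbers are illustrative and every criterion is assumed to be a benefit criterion, unlike the full 63-driver survey data.

      import numpy as np

      X = np.array([[7., 5., 8.],
                    [6., 9., 4.],
                    [8., 6., 7.],
                    [5., 7., 9.]])                      # 4 drivers, 3 benefit criteria

      P = X / X.sum(axis=0)                             # column-wise proportions
      E = -(P * np.log(P)).sum(axis=0) / np.log(len(X)) # entropy per criterion
      w = (1 - E) / (1 - E).sum()                       # entropy weights

      V = w * X / np.linalg.norm(X, axis=0)             # weighted normalised matrix
      best, worst = V.max(axis=0), V.min(axis=0)        # ideal and anti-ideal solutions
      d_best = np.linalg.norm(V - best, axis=1)
      d_worst = np.linalg.norm(V - worst, axis=1)
      closeness = d_worst / (d_best + d_worst)          # relative closeness to the ideal
      print(np.argsort(-closeness))                     # driver indices, best first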
  11. Maktabdar Oghaz M, Maarof MA, Zainal A, Rohani MF, Yaghoubyan SH
    PLoS One, 2015;10(8):e0134828.
    PMID: 26267377 DOI: 10.1371/journal.pone.0134828
    Color is one of the most prominent features of an image and is used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance that can address issues like illumination variations, various camera characteristics and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN, built by employing a Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color across over seventeen existing color spaces. The Genetic Algorithm heuristic is used to find the optimal color component combination in terms of skin detection accuracy, while Principal Component Analysis projects the optimal Genetic Algorithm solution onto a less complex dimension. Pixel-wise skin detection was used to evaluate the performance of the proposed color space. We employed four classifiers, including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron, to generate the human skin color predictive model. The proposed color space was compared with several existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that, using the Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and a False Positive Rate of 0.0482, outperforming the existing color spaces in terms of pixel-wise skin detection accuracy. The results also indicate that among the classifiers used in this study, Random Forest is the most suitable for pixel-wise skin detection applications.
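    A minimal sketch of the PCA step only: projecting a candidate set of colour components down to a three-dimensional space. The candidate components here (RGB plus YCrCb-style transforms) and the random pixels are illustrative assumptions, not the GA-selected optimum reported in the paper.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      rgb = rng.integers(0, 256, size=(1000, 3)).astype(float)   # stand-in skin pixels
      r, g, b = rgb.T
      y  = 0.299 * r + 0.587 * g + 0.114 * b                     # luma
      cr = 0.713 * (r - y)
      cb = 0.564 * (b - y)
      candidate = np.column_stack([r, g, b, y, cr, cb])          # 6 candidate components

      skn = PCA(n_components=3).fit_transform(candidate)         # 3-D hybrid space
      print(skn.shape)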
  12. Ng KH, Ho CK, Phon-Amnuaisuk S
    PLoS One, 2012;7(10):e47216.
    PMID: 23071763 DOI: 10.1371/journal.pone.0047216
    Clustering is a key step in the processing of Expressed Sequence Tags (ESTs). The primary goal of clustering is to put ESTs from the same transcript of a single gene into a unique cluster. Recent EST clustering algorithms mostly adopt alignment-free distance measures, which tend to yield acceptable clustering accuracy within reasonable computational time. Although these clustering methods work satisfactorily on a majority of EST datasets, they have a common weakness: they are prone to delivering unsatisfactory clustering results when dealing with ESTs from genes of the same family. The root cause is that the distance measures they apply are not sensitive enough to separate these closely related genes.
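    One common alignment-free distance compares k-mer frequency profiles with a Euclidean metric; the abstract does not name the exact measure used, so the sketch below (toy sequences, k = 4) is illustrative only.

      from collections import Counter
      import math

      def kmer_profile(seq, k=4):
          counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
          total = sum(counts.values())
          return {kmer: c / total for kmer, c in counts.items()}

      def distance(p, q):
          keys = set(p) | set(q)
          return math.sqrt(sum((p.get(x, 0.0) - q.get(x, 0.0)) ** 2 for x in keys))

      est1 = "ATGGCGTACGTTAGCATCGATCGATCGGCTA"
      est2 = "ATGGCGTACGTTAGCATCGATCGATCGGCTT"   # near-identical transcript fragment
      est3 = "TTTTACCCGGGAAATTTCCCGGGAAATTTCC"   # unrelated sequence
      print(distance(kmer_profile(est1), kmer_profile(est2)))    # small distance
      print(distance(kmer_profile(est1), kmer_profile(est3)))    # larger distance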
  13. Biswas K, Nazir A, Rahman MT, Khandaker MU, Idris AM, Islam J, et al.
    PLoS One, 2022;17(1):e0261427.
    PMID: 35085239 DOI: 10.1371/journal.pone.0261427
    Cost and safety are critical factors in the oil and gas industry for optimizing wellbore trajectory, which is a constrained and nonlinear optimization problem. In this work, the wellbore trajectory is optimized using the true measured depth, well profile energy, and torque. Numerous metaheuristic algorithms have been employed to optimize these objectives by tuning 17 constrained variables, with notable drawbacks including decreased exploitation/exploration capability, local optima trapping, non-uniform distribution of non-dominated solutions, and inability to track isolated minima. The purpose of this work is to propose a modified multi-objective cellular spotted hyena algorithm (MOCSHOPSO) for optimizing true measured depth, well profile energy, and torque. To overcome the aforementioned difficulties, the modification incorporates cellular automata (CA) and particle swarm optimization (PSO). Adding CA enhances the SHO's exploration phase, while the SHO's hunting mechanism is modified with PSO's velocity update property. Several geophysical and operational constraints have been utilized during trajectory optimization, and data were collected from the Gulf of Suez oil field. The proposed algorithm was compared with standard methods (MOCPSO, MOSHO, MOCGWO) and showed significant improvements in terms of better distribution of non-dominated solutions, better searching capability, a minimum number of isolated minima, and a better Pareto optimal front. These improvements were validated by analysing the algorithms with statistical measures such as IGD, MS, SP, and ER. The proposed algorithm obtained the lowest values of IGD, SP and ER and the highest values of MS. Finally, an adaptive neighbourhood mechanism has been proposed, which showed better performance than fixed neighbourhood topologies such as L5, L9, C9, C13, C21, and C25. Hopefully, this newly proposed modified algorithm will pave the way for better wellbore trajectory optimization.
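    A minimal sketch of the PSO-style velocity update that the abstract says is grafted onto the spotted hyena hunting step; the inertia and acceleration coefficients are illustrative assumptions, and constraint handling on the 17 trajectory variables is omitted.

      import numpy as np

      rng = np.random.default_rng(0)
      w, c1, c2 = 0.7, 1.5, 1.5                  # inertia, cognitive, social weights

      def pso_velocity(v, x, p_best, g_best):
          r1, r2 = rng.random(x.shape), rng.random(x.shape)
          return w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)

      x = rng.random(17)             # 17 constrained trajectory variables (see abstract)
      v = np.zeros(17)
      p_best, g_best = x.copy(), rng.random(17)
      v = pso_velocity(v, x, p_best, g_best)
      x = x + v                      # position update; constraint handling omitted
      print(x.round(3))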
  14. Abbasi IA, Jan SU, Alqahtani AS, Khan AS, Algarni F
    PLoS One, 2024;19(1):e0294429.
    PMID: 38289970 DOI: 10.1371/journal.pone.0294429
    Cloud computing is vital in various applications, such as healthcare, transportation, governance, and mobile computing. A public cloud server must be secured against all known threats, because even a minor disturbance by an attacker severely threatens the whole system. A public cloud server faces numerous threats; an adversary can easily enter the server to access sensitive information, especially in the healthcare industry, which offers services to patients, researchers, labs, and hospitals in a flexible way with minimal operational costs. It is challenging to make such a system reliable and to ensure the privacy and security of a cloud-enabled healthcare system. In this regard, numerous security mechanisms have been proposed in past decades. These protocols either suffer from replay attacks, require three to four round trips, or incur high computation costs, which means that security is not balanced with performance. Thus, this work uses a fuzzy extractor method to propose a robust security scheme for a cloud-enabled healthcare system based on Elliptic Curve Cryptography (ECC). The security of the proposed scheme has been examined formally with BAN logic, ROM and ProVerif, and informally using pragmatic illustration and discussion of different attacks. The proposed security mechanism is analyzed in terms of communication and computation costs. Upon comparing the proposed protocol with prior work, it has been demonstrated that our scheme is 33.91% better in communication costs and 35.39% superior to its competitors in computation costs.
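    A minimal sketch of the ECC primitive only: an ECDH shared secret and a derived session key using the Python `cryptography` package. The paper's actual protocol additionally involves the fuzzy extractor, mutual authentication and message exchanges, none of which are shown here.

      from cryptography.hazmat.primitives.asymmetric import ec
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.kdf.hkdf import HKDF

      patient_key = ec.generate_private_key(ec.SECP256R1())     # patient's device
      server_key = ec.generate_private_key(ec.SECP256R1())      # cloud server

      shared_1 = patient_key.exchange(ec.ECDH(), server_key.public_key())
      shared_2 = server_key.exchange(ec.ECDH(), patient_key.public_key())
      assert shared_1 == shared_2                                # both sides agree

      session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                         salt=None, info=b"session").derive(shared_1)
      print(session_key.hex())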
  15. Shiraz M, Gani A, Ahmad RW, Adeel Ali Shah S, Karim A, Rahman ZA
    PLoS One, 2014;9(8):e102270.
    PMID: 25127245 DOI: 10.1371/journal.pone.0102270
    The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. The analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and offers a lightweight solution for computational offloading in MCC.
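    The offload-or-not decision such frameworks ultimately make can be sketched as a comparison of estimated local versus remote time and energy; the cost model and every number below are illustrative assumptions, not the proposed framework's logic.

      def should_offload(cycles, cpu_hz, power_local_w,
                         data_bits, bandwidth_bps, power_tx_w, cloud_speedup):
          t_local = cycles / cpu_hz                     # seconds to run on the SMD
          e_local = t_local * power_local_w             # joules spent locally
          t_remote = data_bits / bandwidth_bps + t_local / cloud_speedup
          e_remote = (data_bits / bandwidth_bps) * power_tx_w   # device mostly idles remotely
          return (t_remote < t_local) and (e_remote < e_local), t_local, t_remote

      offload, t_l, t_r = should_offload(cycles=5e9, cpu_hz=1.2e9, power_local_w=0.9,
                                         data_bits=4e6, bandwidth_bps=2e6,
                                         power_tx_w=1.3, cloud_speedup=10)
      print(offload, round(t_l, 2), round(t_r, 2))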
  16. Chen Z, Rajamanickam L, Cao J, Zhao A, Hu X
    PLoS One, 2021;16(12):e0260758.
    PMID: 34879097 DOI: 10.1371/journal.pone.0260758
    This study aims to solve the overfitting problem caused by insufficient labeled images in the automatic image annotation field. We propose a transfer learning model called CNN-2L that incorporates the label localization strategy described in this study. The model consists of an InceptionV3 network pretrained on the ImageNet dataset and a label localization algorithm. First, the pretrained InceptionV3 network extracts features from the target dataset that are used to train a specific classifier and fine-tune the entire network to obtain an optimal model. Then, the obtained model is used to derive the probabilities of the predicted labels. For this purpose, we introduce a squeeze and excitation (SE) module into the network architecture that augments the useful feature information, inhibits useless feature information, and conducts feature reweighting. Next, we perform label localization to obtain the label probabilities and determine the final label set for each image. During this process, the number of labels must be determined. The optimal K value is obtained experimentally and used to determine the number of predicted labels, thereby solving the empty label set problem that occurs when the predicted label values of images are below a fixed threshold. Experiments on the Corel5k multilabel image dataset verify that CNN-2L improves the labeling precision by 18% and 15% compared with the traditional multiple-Bernoulli relevance model (MBRM) and joint equal contribution (JEC) algorithms, respectively, and it improves the recall by 6% compared with JEC. Additionally, it improves the precision by 20% and 11% compared with the deep learning methods Weight-KNN and adaptive hypergraph learning (AHL), respectively. Although CNN-2L fails to improve the recall compared with the semantic extension model (SEM), it improves the comprehensive index of the F1 value by 1%. The experimental results reveal that the proposed transfer learning model based on a label localization strategy is effective for automatic image annotation and substantially boosts the multilabel image annotation performance.
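    A minimal sketch of the top-K label selection idea mentioned above, which avoids the empty label sets that a fixed probability threshold can produce; the label vocabulary, scores, threshold and K are illustrative, not outputs of the trained InceptionV3 model.

      import numpy as np

      vocab = np.array(["sky", "water", "tree", "people", "boat", "grass"])
      probs = np.array([0.34, 0.03, 0.21, 0.08, 0.30, 0.04])    # per-label scores

      threshold, K = 0.5, 3
      thresholded = vocab[probs >= threshold]                   # may be empty -> problem
      top_k = vocab[np.argsort(-probs)[:K]]                     # always returns K labels
      print(list(thresholded), list(top_k))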
  17. Lee YL, Islam T, Danaee M, Taib NA, MyBCC study group
    PLoS One, 2022;17(11):e0277982.
    PMID: 36409745 DOI: 10.1371/journal.pone.0277982
    Regular physical activity (PA) after a breast cancer diagnosis is associated with reduced mortality and better quality of life. In this prospective cohort study, we aimed to explore the trends of PA among breast cancer survivors over three years and identify factors associated with low PA. Interviews with 133 breast cancer patients were conducted at baseline and at one and three years after the diagnosis of breast cancer at University Malaya Medical Centre in Kuala Lumpur. Physical activity was measured using the Global Physical Activity Questionnaire. PA was categorised as active (≥ 600 MET-min/week) or inactive (<600 MET-min/week). We used the generalised estimating equation method to examine PA levels and factors affecting PA longitudinally. The survivors' mean age was 56.89 (±10.56) years; half were Chinese (50.4%), and 70.7% were married. At baseline, 48.1% of the patients were active, but the proportion of active patients declined to 39.8% at one year and 35.3% in the third year. The mean total PA decreased significantly from 3503±6838.3 MET-min/week to 1494.0±2679.8 MET-min/week (one year) and 792.5±1364 MET-min/week (three years) (p<0.001). Being three years after diagnosis (adjusted odds ratio [AOR]: 1.74, p = 0.021), Malay ethnicity (AOR: 1.86, p = 0.042) and being underweight (AOR: 3.43, p = 0.004) were significantly associated with inactivity. We demonstrated that breast cancer survivors in Malaysia had inadequate PA levels at diagnosis, which decreased over time. Thus, it is vital to communicate the benefits of PA on cancer outcomes and to continue encouraging breast cancer survivors to be physically active throughout the extended survivorship period, especially in the Malay ethnic group and among underweight patients.
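    A minimal sketch of a logistic generalised estimating equation of the kind described above, using statsmodels on a toy longitudinal data frame; the variable names, simulated effects and exchangeable working correlation are assumptions, not the study's actual data or model specification.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 120
      df = pd.DataFrame({
          "id": np.repeat(np.arange(n), 3),                    # 3 visits per survivor
          "time": np.tile([0, 1, 3], n),                       # years since diagnosis
          "malay": np.repeat(rng.integers(0, 2, n), 3),
          "underweight": np.repeat(rng.integers(0, 2, n), 3),
      })
      logit = -0.1 + 0.3 * df.time + 0.4 * df.malay + 0.8 * df.underweight
      df["inactive"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      model = smf.gee("inactive ~ time + malay + underweight", groups="id", data=df,
                      family=sm.families.Binomial(),
                      cov_struct=sm.cov_struct.Exchangeable()).fit()
      print(np.exp(model.params))                              # adjusted odds ratios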
  18. Gulzari UA, Khan S, Sajid M, Anjum S, Torres FS, Sarjoughian H, et al.
    PLoS One, 2019;14(10):e0222759.
    PMID: 31577809 DOI: 10.1371/journal.pone.0222759
    This paper presents the Hybrid Scalable-Minimized-Butterfly-Fat-Tree (H-SMBFT) topology for on-chip communication. The main aspects of this work are a description of the architectural design and characteristics, as well as a comparative analysis against two established indirect topologies, namely the Butterfly-Fat-Tree (BFT) and the Scalable-Minimized-Butterfly-Fat-Tree (SMBFT). Simulation results demonstrate that the proposed topology outperforms its predecessors in terms of performance, area and power dissipation. Specifically, it improves the link interconnectivity between routing levels, such that the number of required links is reduced. This results in reduced router complexity and shortened routing paths between any pair of communicating nodes in the network. Moreover, simulation results under synthetic as well as real-world embedded application workloads reveal that H-SMBFT can reduce the average latency by up to 35.63% and 17.36% compared to BFT and SMBFT, respectively. In addition, the power dissipation of the network can be reduced by up to 33.82% and 19.45%, while energy consumption can be improved by up to 32.91% and 16.83% compared to BFT and SMBFT, respectively.
  19. Rahman LF, Marufuzzaman M, Alam L, Sidek LM, Reaz MBI
    PLoS One, 2020;15(2):e0225408.
    PMID: 32023244 DOI: 10.1371/journal.pone.0225408
    A high-voltage generator (HVG) is an essential part of a radio frequency identification electrically erasable programmable read-only memory (RFID-EEPROM). An HVG circuit is used to generate a regulated output voltage that is higher than the power supply voltage. However, the performance of the HVG is degraded by high power dissipation, high ripple voltage and low pumping efficiency. Therefore, a regulator circuit consisting of a voltage divider, a comparator and a voltage reference is required to reduce the ripple voltage, increase the pumping efficiency and decrease the power dissipation of the HVG. In addition, a clock driving circuit consisting of a current-starved ring oscillator (CSRO) and a non-overlapping clock generator is required to drive the clock signals of the HVG circuit. In this study, the Mentor Graphics EldoSpice software package is used to design and simulate the HVG circuitry. The results showed that the designed CSRO dissipated only 4.9 μW at 10.2 MHz and that the phase noise was only -119.38 dBc/Hz at 1 MHz. Moreover, the proposed charge pump circuit was able to generate a maximum VPP of 13.53 V and dissipated a power of only 31.01 μW for an input voltage VDD of 1.8 V. After integrating all the HVG modules, the results showed that the regulated HVG circuit was able to generate a higher VPP of 14.59 V, while the total power dissipated was only 0.12 mW with a chip area of 0.044 mm2. Moreover, the HVG circuit achieved a pumping efficiency of 90% and reduced the ripple voltage to <4 mV. Therefore, the integration of all the proposed modules in the HVG ensures low-ripple programming voltages, higher pumping efficiency and lower power dissipation for EEPROMs, and can be used extensively in low-power applications such as non-volatile memory, radio frequency identification transponders and on-chip DC-DC converters.
  20. Pilcher NJ, Adulyanukosol K, Das H, Davis P, Hines E, Kwan D, et al.
    PLoS One, 2017;12(12):e0190021.
    PMID: 29284017 DOI: 10.1371/journal.pone.0190021
    Fisheries bycatch is a widespread and serious issue that leads to declines of many important and threatened marine species. However, documenting the distribution, abundance, population trends and threats to sparse populations of marine species is often beyond the capacity of developing countries because such work is complex, time consuming and often extremely expensive. We have developed a flexible tool to document spatial distribution and population trends for dugongs and other marine species in the form of an interview questionnaire supported by a structured data upload sheet and a comprehensive project manual. Recognising the effort invested in getting interviewers to remote locations, the questionnaire is comprehensive, but low cost. The questionnaire has already been deployed in 18 countries across the Indo-Pacific region. Project teams spent an average of USD 5,000 per country and obtained large data sets on dugong distribution, trends, catch and bycatch, and threat overlaps. Findings indicated that >50% of respondents had never seen dugongs and that 20% had seen a single dugong in their lifetimes despite living and fishing in areas of known or suspected dugong habitat, suggesting that dugongs occurred in low numbers. Only 3% of respondents had seen mother and calf pairs, indicative of low reproductive output. Dugong hunting was still common in several countries. Gillnets and hook and line were the most common fishing gears, with the greatest mortality caused by gillnets. The questionnaire has also been used to study manatees in the Caribbean, coastal cetaceans along the eastern Gulf of Thailand and western Peninsular Malaysia, and river dolphins in Peru. This questionnaire is a powerful tool for studying distribution and relative abundance for marine species and fishery pressures, and determining potential conservation hotspot areas. We provide the questionnaire and supporting documents for open-access use by the scientific and conservation communities.