McIntyre was the first to suggest the ranked set sampling (RSS) method for estimating the population mean. In this paper, we modify RSS to produce a new sampling method, namely two-stage ranked set sampling (TSRSS), for samples of size m=3k (k=1,2,...). TSRSS is suggested for estimating the population median in order to increase the efficiency of the estimators. TSRSS is compared to the simple random sampling (SRS), ranked set sampling (RSS), extreme ranked set sampling (ERSS), median ranked set sampling (MRSS) and balanced groups ranked set sampling (BGRSS) methods. It is found that TSRSS gives an unbiased estimator of the population median for symmetric distributions and is more efficient than SRS. It is also more efficient than RSS, ERSS, MRSS and BGRSS based on the same number of measured units. For the asymmetric distributions considered in this study, TSRSS has a small bias and a smaller variance than the SRS, RSS, ERSS, MRSS and BGRSS methods.
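The single-stage RSS procedure that McIntyre proposed can be sketched in a few lines. The sketch below is only the basic RSS design with perfect ranking (a simplifying assumption), used to estimate a median on a symmetric population; the paper's two-stage extension is not reproduced here.

```python
import random
import statistics

def ranked_set_sample(population, m, cycles=1, rng=None):
    """Single-stage ranked set sample of set size m.

    For each rank r = 1..m, draw m units at random, rank them
    (here by their true values, i.e. perfect ranking), and measure
    only the r-th ranked unit.  Repeat for the given number of cycles.
    """
    rng = rng or random.Random()
    sample = []
    for _ in range(cycles):
        for r in range(m):
            ranked_set = sorted(rng.sample(population, m))
            sample.append(ranked_set[r])   # measure only this unit
    return sample

# Symmetric population: the RSS sample median estimates the population median
rng = random.Random(42)
population = [rng.gauss(10.0, 2.0) for _ in range(10_000)]
rss = ranked_set_sample(population, m=3, cycles=5, rng=rng)
est = statistics.median(rss)
```

Only m units per set are actually measured, which is where the cost advantage over SRS comes from when ranking is cheap but measurement is expensive.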
It is commonly held that in vivo biological experimental models are concrete and non-fictional. This belief is primarily supported by the fact that in vivo studies involve biological models which are alive, and what is alive cannot be fictional. However, I argue that this is not always the case. The design of an experimental model could still render an in vivo model fictional because fictional elements and processes can be built into these in vivo experimental models. These fictional elements are essential parts of a credentialed fiction because the designs of in vivo experimental models are constrained by imaginability, conceivability, and credit-worthiness. Therefore, despite its fictionality, it is credible for an in vivo experimental model to stand in for the phenomenon of interest.
Research implementation methodology is an important element of any study. Good data
are obtained from a study that is carefully planned around an appropriate design, as well as the
approach used in the process of obtaining the data. The main objective of the proposed study is
to identify criteria for sustainable construction. Therefore, the right selection of study design and
implementation methodology is very important to ensure that the objectives are successfully achieved.
This manuscript presents a description of the design and implementation methodology used
in this study, namely content analysis, to meet the objective. Justification for the selected method is
also discussed.
Many sampling methods have been suggested for estimating the population median. In situations where the sampling units in a study can be ranked more easily than they can be quantified, ranked set sampling methods are found to be more efficient and cost-effective than simple random sampling. In this paper, the superiority of several ranked set sampling methods over simple random sampling is illustrated through a simulation study. In addition, some new research topics under ranked set sampling are suggested.
In this paper, we report on the chemical interaction between Pt subnano-clusters and graphene nanosheets (GNS). The aim of this research is to clarify the size effect of Pt clusters in 1-7 wt.% Pt/GNS. This is an experimental laboratory study. GNS was synthesized using a modified Hummers' method, and 1-7 wt.% Pt/GNS samples were prepared by the impregnation method. They were then analysed by TG/DTA, XRD, TEM and XPS. The results show that Pt clusters are well deposited on GNS (TG/DTA and TEM data), consistent with the XRD data. Weak and broad peaks appear at 2θ = 39°, indicating that Pt metal exists on GNS. The state of Pt was confirmed by XPS: the appearance of Pt 4f peaks shows that Pt metal interacts chemically with GNS. The size of the Pt clusters may affect the chemical properties of Pt/GNS catalysts.
Case reports can provide early information about new, unusual or rare diseases, newer treatment strategies, improved therapeutic benefits and adverse effects of interventions or medications. This paper describes the process that led to the development of the Preferred Reporting Items for Case reports in Endodontics (PRICE) 2020 guidelines through a consensus-based methodology. A steering committee was formed with eight members (PD, VN, BC, PM, PS, EP, JJ and SP), including the project leaders (PD, VN). The steering committee developed an initial checklist by combining and modifying the items from the Case Report (CARE) guidelines and Clinical and Laboratory Images in Publications (CLIP) principles. A PRICE Delphi Group (PDG) and PRICE Face-to-Face Meeting Group (PFMG) were then formed. The members of the PDG were invited to participate in an online Delphi process to achieve consensus on the wording and utility of the checklist items and the accompanying flow chart that was created to complement the PRICE 2020 guidelines. The revised PRICE checklist and flow chart developed by the online Delphi process were discussed by the PFMG at a meeting held during the 19th European Society of Endodontology (ESE) Biennial Congress in Vienna, Austria, in September 2019. Following the meeting, the steering committee created a final version of the guidelines, which was piloted by several authors during the writing of a case report. In order to help improve the clarity, completeness and quality of case reports in Endodontics, we encourage authors to use the PRICE 2020 guidelines.
There has been progress towards malaria elimination in the last decade. In response, WHO launched the Global Technical Strategy (GTS), in which vector surveillance and control play important roles. Country experiences in the Eliminating Malaria Case Study Series were reviewed to identify success factors on the road to elimination using a cross-case study analytic approach.
Since its introduction in 1995, nanoimprint lithography has been demonstrated in many studies to be a simple, low-cost and high-throughput process for replicating micro- and nanoscale patterns. Due to these advantages, nanoimprint lithography has been rapidly developed over the years as a promising alternative to conventional nanolithography processes, meeting the demands generated by recent developments in the semiconductor and flexible electronics industries and resulting in several variations of the process. Roll-to-roll (R2R) nanoimprint lithography (NIL) is the most sought-after technique because its high throughput suits industrial-scale application. In the present work, a general literature review of the various types of nanoimprint lithography processes, especially R2R NIL, and of the methods commonly adopted to fabricate imprint molds is presented to provide a clear view and understanding of the nanoimprint lithography technique as well as its recent developments.
One of the most widely used techniques for ligand-based virtual screening is similarity searching. This study adopts concepts from quantum mechanics to present a state-of-the-art molecular similarity method inspired by quantum theory. The representation of molecular compounds in a mathematical quantum space plays a vital role in the development of the quantum-based similarity approach. One of the key concepts of quantum theory is the use of complex numbers; hence, this study proposes three techniques to embed and re-represent molecular compounds in complex-number form. The quantum-based similarity method developed in this study depends on a complex pure Hilbert space of molecules and is called Standard Quantum-Based (SQB). Recall of retrieved active molecules was measured at the top 1% and top 5%, and a significance test was used to evaluate the proposed methods. The MDL Drug Data Report (MDDR), Maximum Unbiased Validation (MUV) and Directory of Useful Decoys (DUD) data sets, represented by 2D fingerprints, were used for the experiments. Simulated virtual screening experiments show that the effectiveness of the SQB method increased significantly, owing to the representational power of molecular compounds in complex-number form, compared to the Tanimoto benchmark similarity measure.
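The Tanimoto benchmark the SQB method is compared against has a compact definition on binary 2D fingerprints. A minimal sketch, with bit positions invented purely for illustration:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient for binary fingerprints given as sets of
    on-bit positions: |A intersect B| / |A union B|."""
    a, b = set(fp_a), set(fp_b)
    union = a | b
    if not union:          # two empty fingerprints: define similarity as 0
        return 0.0
    return len(a & b) / len(union)

# Hypothetical 2D fingerprints: bits 1 and 7 are shared,
# so similarity = 2 shared bits / 5 distinct bits = 0.4
query = {1, 4, 7, 9}
candidate = {1, 7, 12}
sim = tanimoto(query, candidate)
```

In a similarity search, every database compound is scored against the query this way and the top-ranked fraction (e.g. top 1% or 5%) is inspected for actives.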
A malaria eradication goal has been proposed, at the same time as a new global strategy and implementation framework. Countries are considering the strategies and tools that will enable progress towards malaria goals. The eliminating malaria case-study series reports were reviewed to identify successful programme management components using a cross-case study analytic approach.
Combinatorial test design is a test planning technique that aims to reduce the number of test cases systematically by choosing a subset of the test cases based on combinations of input variables. The subset covers all possible combinations of a given strength and hence tries to match the effectiveness of the exhaustive set. This reduction mechanism has been used successfully in software testing research as t-way testing (where t indicates the interaction strength of the combinations). Other systems may exhibit many similarities with this approach; hence, it could form an emerging application in different areas of research, and it has recently been applied successfully in a few of them. In this paper, we explore the applicability of the combinatorial test design technique to Fractional Order Proportional-Integral-Derivative (FOPID) controller parameter design for an automatic voltage regulator (AVR) system. Throughout the paper, we justify this new application theoretically and practically through simulations, and we report on first experiments indicating its practical use in this field. We designed different algorithms and adapted other strategies to cover all the combinations with an optimal and effective test set. Our findings indicate that combinatorial test design can find the combinations that lead to an optimum design. In addition, by increasing the strength of the combinations we can approach the optimum design: with only a 4-way combinatorial set, we obtain the effectiveness of an exhaustive test set. This significantly reduces the number of tests needed and thus leads to an approach that optimizes parameter design quickly.
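The reduction idea can be illustrated with a small greedy 2-way (pairwise) generator. This is a generic sketch, not the authors' strategy, and the three controller-gain levels below are hypothetical:

```python
import itertools

def pairwise_suite(params):
    """Greedy construction of a 2-way (pairwise) covering test suite.

    `params` maps each factor to its list of levels.  Every pair of
    levels from two different factors appears in at least one test.
    """
    names = list(params)
    # All (factor, level) pairs that must be covered at strength t=2.
    uncovered = {
        ((f1, v1), (f2, v2))
        for f1, f2 in itertools.combinations(names, 2)
        for v1 in params[f1]
        for v2 in params[f2]
    }
    candidates = list(itertools.product(*(params[n] for n in names)))
    suite = []
    while uncovered:
        # Pick the exhaustive-set row covering the most uncovered pairs.
        def gain(row):
            assign = dict(zip(names, row))
            return sum(
                ((f1, assign[f1]), (f2, assign[f2])) in uncovered
                for f1, f2 in itertools.combinations(names, 2)
            )
        best = dict(zip(names, max(candidates, key=gain)))
        uncovered -= {
            ((f1, best[f1]), (f2, best[f2]))
            for f1, f2 in itertools.combinations(names, 2)
        }
        suite.append(best)
    return suite

# Three hypothetical controller gains, three candidate levels each:
# exhaustive testing needs 27 runs, pairwise needs far fewer.
gains = {"Kp": [0.5, 1.0, 1.5], "Ki": [0.1, 0.4, 0.8], "Kd": [0.1, 0.2, 0.3]}
tests = pairwise_suite(gains)
```

Raising the strength from 2-way towards 4-way grows the suite but, as the paper reports, approaches the effectiveness of the exhaustive set.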
The well-known geostatistical variance-reduction method is commonly used to determine the optimal rain gauge network. The main problem in this method is determining the best semivariogram model to be used in estimating the variance. An optimal choice of the semivariogram model is an important point for a good data evaluation process. Three different semivariogram models, Spherical, Gaussian and Exponential, are used and their performances compared in this study. A cross-validation technique is applied to compute the errors of the semivariograms. Rainfall data for the period 1975-2008 from the existing 84 rain gauge stations covering the state of Johor are used. The results show that the Exponential model is the best semivariogram model and is chosen to determine the optimal number and locations of rain gauge stations.
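The three candidate models have standard closed forms. The sketch below uses a simplified fit-error criterion in place of full leave-one-out kriging cross-validation, and the nugget, sill and range values are illustrative only:

```python
import math

def spherical(h, c0, c, a):
    """Spherical semivariogram: reaches the sill exactly at range a."""
    if h == 0:
        return 0.0
    if h >= a:
        return c0 + c
    r = h / a
    return c0 + c * (1.5 * r - 0.5 * r ** 3)

def exponential(h, c0, c, a):
    """Exponential semivariogram: approaches the sill asymptotically."""
    return 0.0 if h == 0 else c0 + c * (1.0 - math.exp(-3.0 * h / a))

def gaussian(h, c0, c, a):
    """Gaussian semivariogram: parabolic behaviour near the origin."""
    return 0.0 if h == 0 else c0 + c * (1.0 - math.exp(-3.0 * (h / a) ** 2))

def fit_error(model, lags, observed, c0, c, a):
    """Mean squared error between a candidate model and empirical
    semivariogram values, used here to rank the three models."""
    return sum((model(h, c0, c, a) - g) ** 2
               for h, g in zip(lags, observed)) / len(lags)

# Synthetic empirical values generated from an exponential model, so the
# exponential fit wins by construction -- real rankings come from data.
lags = [5, 10, 20, 40, 80]
empirical = [exponential(h, 0.1, 1.0, 50.0) for h in lags]
errors = {m.__name__: fit_error(m, lags, empirical, 0.1, 1.0, 50.0)
          for m in (spherical, exponential, gaussian)}
best = min(errors, key=errors.get)
```

In the study proper, the error criterion is computed by cross-validation over the 84 Johor stations rather than against synthetic values.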
Differential equations are commonly used to model various types of real-life applications. The complexity of these models may often hinder the ability to obtain an analytical solution, so numerical methods were introduced to approximate the solutions. Initially, when developing a numerical algorithm, researchers focused on the key aspect of accuracy. As numerical methods have become more robust, accuracy alone is no longer sufficient; the pursuit of efficiency follows, which warrants reducing computational cost. The current research proposes a numerical algorithm for solving higher-order initial value ordinary differential equations (ODEs). The proposed algorithm is derived as a three-point block multistep method of Adams type (3PBCS) and is used to solve various types of ODEs and systems of ODEs. The selected ODEs range from linear to nonlinear, and from artificial to real-life problems. Results illustrate the accuracy and efficiency of the proposed three-point block method. The order, stability and convergence of the method are also presented in the study.
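The efficiency idea behind Adams-type multistep formulae, reusing derivative evaluations from previous steps instead of recomputing several per step, can be illustrated with the classical 3-step Adams-Bashforth method. This is a textbook relative of the paper's scheme, not the 3PBCS block method itself:

```python
def adams_bashforth3(f, t0, y0, h, steps):
    """3-step Adams-Bashforth method for y' = f(t, y).

    Needs two starter steps (done with RK4 here); after that, each
    step costs a single new evaluation of f, reusing the two previous
    ones -- the cost-saving idea shared by Adams-type block methods.
    """
    def rk4_step(t, y):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

    ts, ys = [t0], [y0]
    for _ in range(2):                      # two RK4 starter steps
        ys.append(rk4_step(ts[-1], ys[-1]))
        ts.append(ts[-1] + h)
    fs = [f(t, y) for t, y in zip(ts, ys)]
    for _ in range(steps - 2):
        # AB3 formula: y_{n+1} = y_n + h(23 f_n - 16 f_{n-1} + 5 f_{n-2})/12
        ys.append(ys[-1] + h * (23 * fs[-1] - 16 * fs[-2] + 5 * fs[-3]) / 12)
        ts.append(ts[-1] + h)
        fs.append(f(ts[-1], ys[-1]))
    return ts, ys

# Test problem y' = -y, y(0) = 1, whose exact solution is exp(-t)
ts, ys = adams_bashforth3(lambda t, y: -y, 0.0, 1.0, h=0.01, steps=100)
```

A block method advances several solution points (three, in 3PBCS) simultaneously per integration step, further amortizing the function evaluations.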
This volume has highlighted the many recent advances in tinnitus theory, models, diagnostics, therapies, and therapeutics. But tinnitus knowledge is far from complete. In this chapter, contributors to the Behavioral Neuroscience of Tinnitus consider emerging topics and areas of research needed in light of recent findings. New research avenues and methods to explore are discussed. Issues pertaining to current assessment, treatment, and research methods are outlined, along with recommendations on new avenues to explore with research.
Reverse micellar extraction (RME) has emerged as a versatile and efficient tool for the downstream processing (DSP) of various biomolecules, including structural proteins and enzymes, owing to its substantial advantages over conventional DSP methods. However, RME performance depends in a complex way on several parameters that influence the overall selectivity of the system; this justifies the need for optimization to obtain the highest possible extraction yields. Over the last two decades, many experimental design strategies for screening and optimizing RME have been described in the literature. The objective of this article is to review the different experimental designs and response surface methodologies currently used to screen and optimize RME systems for various types of biomolecules. Overall, this review provides a rationale for selecting appropriate screening or optimization techniques for the parameters associated with both forward and backward extraction during the RME of biomolecules.
Laboratory-based research studies are the most common form of research endeavour and make up the majority of manuscripts that are submitted for publication in the field of Endodontology. The scientific information derived from laboratory studies can be used to design a wide range of subsequent studies and clinical trials and may have translational potential to benefit clinical practice. Unfortunately, the majority of laboratory-based articles submitted for publication fail at the peer-review step because unacceptable flaws or substantial limitations are identified. Even when apparently well-conducted laboratory-based articles are peer-reviewed, they can often require substantial corrections prior to publication. It is apparent that some authors and reviewers may lack the training and experience to have developed a systematic approach to evaluating the quality of laboratory studies. Occasionally, even accepted manuscripts contain limitations that may compromise interpretation of the data. To help authors avoid manuscript rejection and correction pitfalls, and to aid editors/reviewers in evaluating manuscripts systematically, the purpose of this project is to establish and publish quality guidelines for authors to report laboratory studies in the field of Endodontology so that the highest standards are achieved. The new guidelines will be named 'Preferred Reporting Items for Laboratory studies in Endodontology' (PRILE). A steering committee was assembled by the project leads to develop the guidelines through a five-phase consensus process. The committee will identify new items as well as review and adapt items from existing guidelines. The items forming the draft guidelines will be reviewed and refined by a PRILE Delphi Group (PDG). The items will be evaluated by the PDG on a nine-point Likert scale for relevance and inclusion.
The agreed items will then be discussed by a PRILE face-to-face consensus meeting group (PFCMG) formed by 20 individuals to further refine the guidelines. This will be subject to final approval by the steering committee. The approved PRILE guidelines will be disseminated through publication in relevant journals, presented at congresses/meetings, and be freely available on a dedicated website. Feedback and comments will be solicited from researchers, editors and peer reviewers, who are invited to contact the steering committee with comments to help them update the guidelines periodically.
Predicting the number of defects in software at the method level is important. However, little research has focused on method-level defect prediction, and considerable effort is still required to demonstrate how it can be achieved for a new software version. In the current study, we analyse the relevant information obtained from the current version of a software product to construct regression models that predict the number of defects in a new version, using the variables of defect density, defect velocity and defect introduction time, which correlate considerably with the number of method-level defects. These variables also reveal a mathematical relationship between defect density and defect acceleration at the method level, further indicating that the increase in the number of defects and the defect density are functions of the defect acceleration. We report an experiment conducted on the Finding Faults Using Ensemble Learners (ELFF) open-source Java projects, which contain 289,132 methods. The results show correlation coefficients of 60% for defect density, -4% for defect introduction time, and 93% for defect velocity. These findings indicate that the average defect velocity correlates strongly with the number of defects at the method level. The proposed approach also motivates an investigation and comparison of the average performance of classifiers before and after method-level data preprocessing, and of the level of entropy in the datasets.
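The regression idea can be sketched with ordinary least squares on the single strongest predictor reported (defect velocity). The per-version numbers below are hypothetical, not ELFF data:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit of y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx          # slope: defects gained per unit of velocity
    a = my - b * mx        # intercept
    return a, b

# Hypothetical per-version data: (average defect velocity, defect count)
velocity = [0.5, 1.1, 1.6, 2.2, 2.9]
defects = [4, 9, 13, 18, 24]
a, b = fit_line(velocity, defects)

# Predicted defect count for a new version with velocity 2.5
predicted = a + b * 2.5
```

The study's actual models also include defect density and defect introduction time as regressors; with multiple predictors, the same least-squares idea is solved in matrix form.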