DESIGN: Paediatric dentistry journals ranked in the top five by h5-index in Google Scholar Metrics were selected. SRs with MA were searched independently by two reviewers in the PubMed and Scopus databases up to December 2017. Methodological quality was assessed using A MeaSurement Tool to Assess systematic Reviews (AMSTAR). Statistical significance was set at P
MATERIALS AND METHODS: Systematic reviews with meta-analyses in paediatric dentistry were searched in the PubMed and Scopus databases from inception to December 2017. Study selection, by title and abstract screening followed by full-text assessment, was performed independently by two reviewers. The quality of abstracts was assessed using the PRISMA for Abstracts (PRISMA-A) checklist, which comprises 12 items: one each for title and objective, three for methods, three for results, two for discussion, and two for other information. Median PRISMA-A scores were calculated and compared across article characteristics to evaluate the reporting quality of abstracts. Statistical significance was set at p
STUDY DESIGN AND SETTING: Forty-seven rehabilitation clinicians from 5 professions and 7 teams (Belgium, Italy, Malaysia, Pakistan, Poland, Puerto Rico, and the USA) reviewed 76 RCTs published in major rehabilitation journals, exploring 14 domains chosen through consensus and piloting.
RESULTS: The response rate was 99%. Inter-rater agreement was moderate to good. Twelve RCTs (16%) were unanimously considered clinically replicable, and none was unanimously considered not replicable. At least one item of "absent" information was identified by all participants in 60 RCTs (79%), and by at least 85% of participants in the remaining 16 (21%). The least well described information (8-19% "perfect") comprised two provider items (skills, experience) and two delivery items (cautions, relationships). The best described items (50-79% "perfect") were the classic methodological items covered by CONSORT (in descending order: participants, materials, procedures, setting, and intervention).
CONCLUSION: Clinical replicability must be considered when reporting RCTs, particularly for complex interventions. Classic methodological checklists such as CONSORT are not sufficient, and even the Template for Intervention Description and Replication (TIDieR) does not cover all requirements. This study supports the need for field-specific checklists.