Browsing by Author "Moher, David"
Now showing 1 - 4 of 4
- Compliance of clinical trial registries with the World Health Organization minimum data set: a survey (BioMed Central, 2009-07)
  Moja, Lorenzo P.; Moschetti, Ivan; Nurbhai, Munira; Compagnoni, Anna; Liberati, Alessandro; Grimshaw, Jeremy M.; Chan, An-Wen; Dickersin, Kay; Krleza-Jeric, Karmela; Moher, David; Sim, Ida; Volmink, Jimmy
  Background: Since September 2005, the International Committee of Medical Journal Editors has required that trials be registered in accordance with the World Health Organization (WHO) minimum data set in order to be considered for publication. The objective was to evaluate registries' and individual trial records' compliance with the 2006 version of the WHO minimum data set. Methods: A retrospective evaluation of 21 online clinical trial registries (international, national, specialty, pharmaceutical industry and local) from April 2005 to February 2007, and a cross-sectional evaluation of a stratified random sample of 610 trial records from the 21 registries. Results: Among the 11 registries that provided guidelines for registration, median compliance with the WHO criteria was 14 of 20 items (range 6 to 20). Between April 2005 and February 2007, six registries increased their compliance by an average of six data items. None of the local registry websites published guidelines on the trial data items required for registration. Slightly more than half (330/610; 54.1%, 95% CI 50.1%–58.1%) of trial records completed the contact-details fields, while 29.7% (181/610; 95% CI 26.1%–33.5%) completed the key clinical and methodological data fields. Conclusion: While the launch of the WHO minimum data set appears to have positively influenced registries towards better standardisation of approaches, individual registry entries remain largely incomplete. Initiatives to ensure quality assurance of registries and trial data should be encouraged.
Peer reviewers and editors should scrutinise clinical trial registration records to ensure consistency with WHO's core content requirements when considering trial-related publications.
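The 95% confidence intervals in the Results above can be reproduced with a normal-approximation (Wald) interval for a proportion. This is a minimal sketch, not necessarily the method the study authors used; it recovers the reported 50.1%–58.1% for the contact-details proportion (330/610).

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
    return p - z * se, p + z * se

lo, hi = wald_ci(330, 610)
print(f"{lo:.1%} - {hi:.1%}")  # prints 50.1% - 58.1%, matching the reported interval
```

For small samples or proportions near 0 or 1, a Wilson or exact (Clopper-Pearson) interval is usually preferred over the Wald approximation.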
- Essential items for reporting of scaling studies of health interventions (SUCCEED): protocol for a systematic review and Delphi process (BMC (part of Springer Nature), 2020-01-11)
  Gogovor, Amede; Zomahoun, Herve Tchala Vignon; Charif, Ali Ben; McLean, Robert K. D.; Moher, David; Milat, Andrew; Wolfenden, Luke; Prevost, Karina; Aubin, Emmanuelle; Rochon, Paula; Ekanmian, Giraud; Sawadogo, Jasmine; Rheault, Nathalie; Legare, France
  Background: The lack of a reporting guideline for studies of the scaling of evidence-based practices (EBPs) prompted the registration of the Standards for reporting studies assessing the impact of scaling strategies of EBPs (SUCCEED) with the EQUATOR Network. The development of SUCCEED will be guided by the main steps recommended for developing health research reporting guidelines. Methods: Executive committee. We established a committee composed of members of the core research team and an advisory group. Systematic review. The protocol was registered with the Open Science Framework on 29 November 2019 (https://osf.io/vcwfx/). We will include reporting guidelines or other reports that may include items relevant to studies assessing the impact of scaling strategies. We will search the following electronic databases from inception: EMBASE, PsycINFO, Cochrane Library, CINAHL and Web of Science. In addition, we will systematically search the websites of EQUATOR and other relevant organizations, and contact experts in the field of reporting guidelines. Study selection and data extraction will be conducted independently by two reviewers. A narrative analysis will be conducted to compile a list of items for the Delphi exercise. Consensus process. We will invite panelists with expertise in the development of relevant reporting guidelines, along with methodologists, content experts, patients/members of the public, implementers, journal editors and funders.
We anticipate that three rounds of web-based Delphi consensus will be needed to reach an acceptable degree of agreement. We will use a 9-point scale (1 = extremely irrelevant to 9 = extremely relevant). Participants' responses will be categorized as irrelevant (1–3), equivocal (4–6) or relevant (7–9). For each item, consensus is reached if at least 80% of the participants' votes fall within the same category. The list of items from the final round will be discussed at a face-to-face consensus meeting. Guideline validation. Participants will be authors of scaling studies. We will collect quantitative (questionnaire) and qualitative (semi-structured interview) data. Descriptive analyses will be conducted on the quantitative data and constant comparative techniques applied to the qualitative data. Discussion: Essential items for reporting scaling studies will contribute to better reporting of such studies and facilitate the transparency and scaling of evidence-based health interventions.
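The Delphi scoring rule described above (categorize 9-point votes, then require at least 80% of votes in one category) can be sketched directly; this is an illustrative implementation of the stated rule, not code from the protocol.

```python
def delphi_consensus(votes, threshold=0.80):
    """Classify 9-point Delphi votes and apply the 80% consensus rule.

    Categories follow the protocol: irrelevant (1-3), equivocal (4-6),
    relevant (7-9). Consensus is reached when at least `threshold` of
    the votes fall in the same category.
    """
    def category(score):
        if 1 <= score <= 3:
            return "irrelevant"
        if 4 <= score <= 6:
            return "equivocal"
        return "relevant"

    counts = {"irrelevant": 0, "equivocal": 0, "relevant": 0}
    for v in votes:
        counts[category(v)] += 1
    top = max(counts, key=counts.get)          # most-voted category
    reached = counts[top] / len(votes) >= threshold
    return top, reached

# Example: 9 of 10 panelists rate the item 7-9 (90% >= 80%, so consensus)
print(delphi_consensus([8, 9, 7, 8, 7, 9, 8, 7, 8, 5]))  # ('relevant', True)
```

Items that fail the rule in a round would be carried forward to the next Delphi round.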
- Reporting of health equity considerations in cluster and individually randomized trials (BMC (part of Springer Nature), 2020-04-03)
  Petkovic, Jennifer; Jull, Janet; Yoganathan, Manosila; Dewidar, Omar; Baird, Sarah; Grimshaw, Jeremy M.; Johansson, Kjell A.; Kristjansson, Elizabeth; McGowan, Jessie; Moher, David; Petticrew, Mark; Robberstad, Bjarne; Shea, Beverley; Tugwell, Peter; Volmink, Jimmy; Wells, George A.; Whitehead, Margaret; Cuervo, Luis G.; White, Howard; Taljaard, Monica; Welch, Vivian
  Background: The randomized controlled trial (RCT) is considered the gold-standard study design for informing decisions about the effectiveness of interventions. However, a common limitation is inadequate reporting of the applicability of the intervention and trial results for people who are "socially disadvantaged", which can affect policy-makers' decisions. We previously developed a framework for identifying health-equity-relevant trials, along with a reporting guideline for transparent reporting. In this study, we provide a descriptive assessment of health-equity considerations in 200 randomly sampled equity-relevant trials. Methods: We developed a search strategy to identify health-equity-relevant trials published between 2013 and 2015. We randomly sorted the 4316 records identified by the search and screened studies until 100 individually randomized controlled trials (RCTs) and 100 cluster randomized controlled trials (CRTs) were identified. We developed and pilot-tested a data extraction form, based on our initial work, to inform the development of our reporting guideline for equity-relevant randomized trials. Results: In total, 39 trials (20%) were conducted in a low- or middle-income country, and 157 trials (79%) conducted in a high-income country focused on socially disadvantaged populations (78% of CRTs, 79% of RCTs).
Seventy-four trials (37%) reported a subgroup analysis across a population characteristic associated with disadvantage (25% of CRTs, 49% of RCTs), with 19% of included studies reporting subgroup analyses across sex, 9% across race/ethnicity/culture, and 4% across socioeconomic status. No subgroup analyses were reported for place of residence, occupation, religion, education, or social capital. One hundred and forty-one trials (71%) discussed the applicability of their results to one or more socially disadvantaged populations (68% of CRTs, 73% of RCTs). Discussion: In this set of trials, selected for their relevance to health equity, data disaggregated for socially disadvantaged populations were rarely reported. We found that even when the data are available, opportunities to analyze health-equity considerations are frequently missed. The recently published equity extension of the Consolidated Standards of Reporting Trials (CONSORT-Equity) may help improve the delineation of hypotheses related to socially disadvantaged populations, as well as the transparency and completeness of reporting of health-equity considerations in RCTs. This study can serve as a baseline assessment of the reporting of equity considerations.
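The sampling step in the Methods above (randomly sort the search results, then screen sequentially until 100 RCTs and 100 CRTs are identified) amounts to quota sampling from a shuffled list. The sketch below illustrates that procedure under stated assumptions: the record structure, the `design` field, and the toy corpus are all hypothetical stand-ins, not the study's actual data or tooling.

```python
import random

def screen_until_quotas(records, quota=100, seed=42):
    """Randomly sort records, then screen in order until both design
    quotas are filled (hypothetical record fields, for illustration only)."""
    rng = random.Random(seed)
    shuffled = records[:]              # randomly sort the search results
    rng.shuffle(shuffled)
    sampled = {"RCT": [], "CRT": []}
    for rec in shuffled:
        design = rec["design"]         # assumed field: "RCT", "CRT", or other
        if design in sampled and len(sampled[design]) < quota:
            sampled[design].append(rec)
        if all(len(v) == quota for v in sampled.values()):
            break                      # stop screening once both quotas are met
    return sampled

# Toy corpus standing in for the 4316 records identified by the search
corpus = [{"id": i, "design": random.Random(i).choice(["RCT", "CRT", "other"])}
          for i in range(4316)]
sample = screen_until_quotas(corpus)
print(len(sample["RCT"]), len(sample["CRT"]))  # 100 100
```

In practice each screened record would also pass eligibility criteria before counting toward a quota; that step is omitted here for brevity.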
- Reporting of methodological studies in health research: a protocol for the development of the Methodological Study reporting Checklist (MISTIC) (BMJ Publishing, 2020-12)
  Lawson, Daeria O.; Puljak, Livia; Pieper, Dawid; Schandelmaier, Stefan; Collins, Gary S.; Brignardello-Petersen, Romina; Moher, David; Tugwell, Peter; Welch, Vivian A.; Samaan, Zainab; Thombs, Brett D.; Nørskov, Anders K.; Jakobsen, Janus C.; Allison, David B.; Mayo-Wilson, Evan; Young, Taryn; Chan, An-Wen; Briel, Matthias; Guyatt, Gordon H.; Thabane, Lehana; Mbuagbaw, Lawrence
  Introduction: Methodological studies (ie, studies that evaluate the design, conduct, analysis or reporting of other studies in health research) address various facets of health research including, for instance, data collection techniques, differences in approaches to analyses, reporting quality, adherence to guidelines and publication bias. As a result, methodological studies can help to identify knowledge gaps in the methodology of health research and strategies for improvement in research practices. Differences in methodological study names and a lack of reporting guidance contribute to a lack of comparability across studies and difficulties in identifying relevant previous methodological studies. This paper outlines the methods we will use to develop an evidence-based tool, the Methodological Study reporting Checklist (MISTIC), to harmonise naming conventions and improve the reporting of methodological studies. Methods and analysis: We will search for methodological studies in the Cumulative Index to Nursing and Allied Health Literature, Cochrane Library, Embase, MEDLINE and Web of Science, check reference lists and contact experts in the field. We will extract and summarise data on the study names, designs and reporting features of the included methodological studies.
Consensus on study terms and recommended reporting items will be achieved via videoconference meetings with a panel of experts, including researchers who have published methodological studies. Ethics and dissemination: The consensus study has been exempted from ethics review by the Hamilton Integrated Research Ethics Board. The results of the review and the reporting guideline will be disseminated at stakeholder meetings and conferences, in peer-reviewed publications, in requests to journal editors (to endorse the guideline or make it a requirement for authors), and on the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network and reporting guideline websites. Registration: We have registered the development of the reporting guideline with the EQUATOR Network and publicly posted this project on the Open Science Framework.