
An evaluation of stakeholder engagement in comparative effectiveness research: lessons learned from SWOG S1415CD: Supplementary material

Posted on 2022-11-15, 17:47 by Ari Bell-Brown, Kate Watabayashi, Karma Kreizenbeck, Scott D Ramsey, Aasthaa Bansal, William E Barlow, Gary H Lyman, Dawn L Hershman, Anne Marie Mercurio, Barbara Segarra-Vazquez, Jamie S Myers, John D Golenski, Judy Johnson, Robert L Erwin, Guneet Walia, Jeffrey Crawford, Sean D Sullivan

Aim: Stakeholder engagement is central to comparative effectiveness research, yet there are gaps in how successful engagement is defined. We used a framework developed by Lavallee et al., which defines criteria for effective engagement, to evaluate stakeholder engagement during a pragmatic cluster-randomized trial. Methods: Semi-structured interviews were developed from the framework and conducted to learn about members' experiences. Interviews were analyzed using a deductive approach for themes related to the effective engagement criteria. Results: Thirteen members participated and described respect for ideas, time to achieve consensus, access to information, and continuous feedback as areas of effective engagement. The primary criticism was a lack of diversity. Discussion: Feedback was positive, particularly on the themes of respect, trust, and competence, and led to the development of a list of best practices for engagement. The framework was successful for evaluating engagement. Conclusion: Standardized frameworks allow studies to formally evaluate their stakeholder engagement approach and develop best practices for future research.

Funding

Research reported in this manuscript was partially funded through a Patient-Centered Outcomes Research Institute (PCORI) Award (no. PCS-1402-09988) and through the National Cancer Institute (nos. 5U10 CA180819-03 and 5UG1CA189974).

Journal

Journal of Comparative Effectiveness Research
