Guidelines for reporting evaluations based on observational methodology

Cited by: 74
Authors
Portell, Mariona [1]
Anguera, M. Teresa [2]
Chacon-Moscoso, Salvador [3,4]
Sanduvete-Chaves, Susana [3]
Affiliations
[1] Univ Autonoma Barcelona, E-08193 Barcelona, Spain
[2] Univ Barcelona, E-08007 Barcelona, Spain
[3] Univ Seville, Seville, Spain
[4] Univ Autonoma Chile, Santiago, Chile
Keywords
Program evaluation; observational methodology; designs; low intervention; reporting guidelines; complex interventions; behavior; implementation; fidelity; criteria; time
DOI
10.7334/psicothema2014.276
Chinese Library Classification
B84 [Psychology]
Discipline code
04; 0402
Abstract
Background: Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies are hampered by the absence of specific guidelines such as those that exist for other evaluation designs. This lack of guidance threatens the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study was therefore to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. Method: The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and for studies with designs similar to observational studies, and proposals made by experts in observational methodology at scientific meetings. Results: We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. Conclusions: The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.
Pages: 283-289
Page count: 7