NICE process and methods

Appendix L NICE review format

Although there is no strict guidance on how an evidence review should be structured, it is important that it sets out as clearly as possible the information the public health advisory committees will need to inform their deliberations and recommendations.

The exact structure of the review should be agreed with the CPHE project team on a review-by-review basis. However, in general we would expect a review to report the following:

Executive summary

  • Brief summary of the aims and objectives, methods, main findings and conclusions.

  • All of the evidence statements and the related references.

Contents/structure for main report

1. Introduction

  • Context in which the review is set; this may include:

    • reference to the scope

    • epidemiological background

    • policy context

    • organisational context

    • theoretical perspectives

    • summary of the effectiveness review.

All of the above should be supported by current literature.

  • Aims and objectives of the review.

  • Research questions.

  • Operational definitions.

  • Identification of possible equality and equity issues.

  • Review team:

    • expertise (both in reviewing and subject area) and perspective brought to the review, for example:

      • researcher

      • professional/end user of guidance – clinician/practitioner from the health/social/local authority/private sector

      • target population – general public/patient/carer

    • roles in the review process

    • conflicts of interest.

2. Methodology

  • Identification of evidence – for example, databases, websites, search strategies, hand-searching, contacts with experts in the field and contacting authors.

  • Inclusion/exclusion criteria for the review – types of study, years, country, population, implementation process, moderation process.

  • Flow chart of the number of studies identified from different sources, the numbers excluded at different stages of the process and the reasons for exclusion.

  • Quality appraisal processes, including consistency checking within and between appraisers and moderation at the data extraction and analysis stages.

  • Software used for screening and coding of studies, data extraction, analysis and synthesis, and bibliography management.

  • Criteria for appraising applicability – for example, sample characteristics, context, and conceptual and theoretical focus.

  • Methods of synthesis and data presentation.

3. Findings

  • Overview of the studies for each research question and, where relevant, each sub-question, population and outcome.

  • Narrative summary and evidence statements for each question and, where relevant, each sub-question, population and outcome:

    • quality, quantity and consistency of evidence

    • applicability of the evidence.

  • Meta-analyses, if applicable.

4. Discussion

  • Findings set in context.

  • Implications of findings.

  • Limitations of, and gaps in, the evidence.

  • Limitations of the review and potential impact on findings.

5. Conclusion and recommendations

Appendices

  • Sample search strategies.

  • Bibliography of included studies.

  • Bibliography of excluded studies, with the reason for exclusion for each study.

  • Evidence tables.

  • Examples of methodology checklists used.