Shared learning database

Type and Title of Submission


Title:

Implementation of NICE guidance on Assessment of Febrile Children

Description:

NICE recommends that all febrile children should have their temperature, heart rate, respiratory rate and capillary refill time measured, and that their subsequent management be based on a traffic light system to identify those with potential serious illness. This project was conducted in a large GP surgery, specifically aiming to improve assessment using all four of the assessment criteria. The project went beyond conventional audit and used the principles of improvement science to make it easier for clinicians to do the right thing and harder to do the wrong thing, and, if they did forget, to create systems to spot and stop the omission.

Please note that this example was submitted to demonstrate implementation of CG47. This guideline was updated and replaced in May 2013 by CG160. The practice in this example remains consistent with the updated guidance.

Does the submission relate to the general implementation of all NICE guidance?

No

Does the submission relate to the implementation of a specific piece of NICE guidance?

Yes

Full title of NICE guidance:

CG160 - Feverish illness in children

Is the submission industry-sponsored in any way?

No


Description of submission


Aims and objectives

By the end of October 2011, 95% of children with febrile illness will be assessed in accordance with NICE guidelines.

Context

The baseline assessment showed that no child had all four components of the recommended clinical assessment recorded. The most frequently conducted and recorded assessment was temperature (just over 80% of the time), followed by capillary refill time (30%) and respiratory rate (under 20%); none of the children had their heart rate measured.

Methods

The first phase was a diagnostic phase to understand why the assessments were not taking place. This included a combination of a staff questionnaire, direct observation and "leadership walkarounds". This led to a real understanding of the reasons the assessment was not being conducted (using improvement science techniques known as the fishbone diagram and the 5 Whys).

On the basis of this, a series of interventions was considered and tested using PDSA (Plan-Do-Study-Act) methodology. The interventions included:
(1) Making the equipment available - a workplace organisation tool from Lean methodology known as 5S was used.
(2) A human factors intervention of a visible prompt on the tympanic thermometer just below the screen with a picture of the child and a checklist
(3) Mouse mats with a picture of the same child having her temperature checked and a table of the normal values were placed in each room
(4) The traffic light table was placed on the practice intranet for ease of reference
(5) A recording template was created on the clinical system making it easier to code and record the assessment
(6) An electronic algorithm was created on the clinical system (EMIS LV) so that, if a code suggesting a condition associated with fever (e.g. OTITIS MEDIA) was entered for a child, the system would check whether the four items had been recorded; if not, it would create a 'forcing' function reminding the clinician to conduct these components of the assessment and would then automatically call the data recording template (a sketch of this logic is given after this list)
(7) Those clinicians persistently not conducting all four features of the assessment would be sent a personalised postcard reminding them of the guidance.
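
To make intervention (6) concrete, below is a minimal sketch of the check-and-remind logic it describes. The actual forcing function was built within EMIS LV's own template and protocol tools; the Python function names, the trigger codes and the reminder behaviour shown here are illustrative assumptions only, not the system used in the practice.

# Hypothetical sketch of the forcing-function logic in intervention (6).
# Names, trigger codes and reminder behaviour are assumptions for illustration;
# the real implementation used EMIS LV's own protocol/template facilities.

REQUIRED_OBSERVATIONS = {"temperature", "heart_rate", "respiratory_rate", "capillary_refill_time"}
FEVER_ASSOCIATED_CODES = {"otitis media", "tonsillitis", "urinary tract infection"}  # example triggers only

def on_code_entered(diagnosis_code, recorded_observations):
    """Called when a clinician enters a diagnosis code for a child.

    If the code suggests a condition associated with fever and any of the four
    NICE assessment items are missing, remind the clinician and then open the
    data recording template.
    """
    if diagnosis_code.lower() not in FEVER_ASSOCIATED_CODES:
        return
    missing = REQUIRED_OBSERVATIONS - set(recorded_observations)
    if missing:
        remind_clinician(missing)        # the 'forcing' prompt listing missing items
        open_recording_template()        # auto-call the recording template

def remind_clinician(missing):
    print("Reminder: please record " + ", ".join(sorted(missing)))

def open_recording_template():
    print("Opening febrile child assessment template...")

# Example: otitis media coded, but only temperature recorded so far
on_code_entered("Otitis media", {"temperature"})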

The costs incurred, in addition to the author's time in conducting the improvement, were the printing of the postcards, reminder stickers and mouse mats, and the purchase of thermometers (maximum cost of 500.00).

Results and evaluation

Progress was monitored using another tool from improvement science, a statistical process control (SPC) chart. Each patient was given a score of 1, 2, 3 or 4 depending on how many components of the assessment had been conducted. Each patient was then plotted on the SPC chart, and the chart was annotated with the interventions. This clearly demonstrated that the interventions were leading to an improvement. In addition, a balancing measure of consultation length was recorded, as one of the anxieties was that conducting the assessments might increase consultation length. The data clearly demonstrated this was not the case.
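
As a rough illustration of the scoring and charting described above, the sketch below counts how many of the four components were recorded per consultation (the submission describes scores of 1 to 4; a simple count can also be zero) and plots the scores on an individuals-type SPC chart with a mean line and moving-range-based control limits, with one annotation standing in for an intervention. The data, the intervention label and the XmR-style limits are assumptions for illustration; the project's own charting software and figures are not described here.

# Minimal sketch, assuming invented data, of the scoring and SPC chart described above.
import matplotlib.pyplot as plt

COMPONENTS = ["temperature", "heart_rate", "respiratory_rate", "capillary_refill_time"]

def score(record):
    # Score = number of the four NICE assessment components recorded for this child
    return sum(1 for c in COMPONENTS if c in record)

# Illustrative sequence of consultations (each a set of recorded components)
consultations = [
    {"temperature"},
    {"temperature", "capillary_refill_time"},
    {"temperature", "respiratory_rate", "capillary_refill_time"},
    {"temperature", "heart_rate", "respiratory_rate", "capillary_refill_time"},
]
scores = [score(c) for c in consultations]

mean = sum(scores) / len(scores)
# Average moving range, used to set control limits on an individuals (XmR) chart
mrs = [abs(a - b) for a, b in zip(scores[1:], scores[:-1])]
mr_bar = sum(mrs) / len(mrs)
ucl, lcl = mean + 2.66 * mr_bar, mean - 2.66 * mr_bar

plt.plot(range(1, len(scores) + 1), scores, marker="o")
plt.axhline(mean, linestyle="--", label="mean")
plt.axhline(ucl, linestyle=":", label="control limits")
plt.axhline(lcl, linestyle=":")
plt.annotate("5S / thermometer sticker", xy=(2, scores[1]))  # annotate an intervention
plt.xlabel("Consultation (in time order)")
plt.ylabel("Components recorded (0-4)")
plt.legend()
plt.show()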

Key learning points

The use of improvement science principles is really helpful: it provides a systematic approach to improvement. Establishing the reasons or root causes is invaluable, as is the approach of testing out ideas. Regular and frequent measurement, presented graphically in the form of an annotated run chart, is a powerful engagement tool.



This page was last updated: 15 February 2012
