3 Evidence

The diagnostics advisory committee (section 7) considered evidence on rapid tests for group A streptococcal infections (strep A) in people with a sore throat from several sources. Full details are in the project documents for this guidance.

Clinical effectiveness

3.1

The external assessment group (EAG) did a systematic review to identify evidence on the clinical effectiveness of rapid tests for detecting strep A infection in people with a sore throat. Evidence on the following outcomes was of interest:

  • diagnostic performance

  • effect on prescribing behaviours and clinical outcomes

  • contribution to antimicrobial stewardship and onward transmission of infection.

3.2

The EAG found 38 studies that met the inclusion criteria:

  • 35 studies reported test accuracy data, 12 reported antibiotic prescribing behaviours (9 studies reported both outcomes), and none reported clinical outcomes such as morbidity, mortality or onward transmission rate.

  • 26 studies were reported in peer-reviewed journals (full-text articles), 3 in conference abstracts, 4 in Food and Drug Administration (FDA) documents and 5 in unpublished manufacturers' data.

3.3

Across studies, the prevalence of strep A ranged from 15% to 49%. There were no clear demographic or clinical patterns accounting for this variation, and no identified differences between primary and secondary care settings.

3.4

Populations in most studies did not fit the scope for this assessment. Only 2 studies included people with a Centor score of 3 or more, or a FeverPAIN score of 4 or more (that is, the people who would have a rapid test in current practice); both studies reported antibiotic prescribing behaviours only. There were 2 test accuracy studies that reported outcomes separately by Centor score. All other studies enrolled people with lower clinical scores than those in the scope, or did not use clinical scores as an inclusion criterion.

3.5

The relevant subgroups in the review included children (aged 5 to 14), adults (aged 15 to 75) and older people (aged over 75). However, age group definitions varied between studies. Only 2 studies met the age criterion for children and 2 studies met the age criterion for adults defined in the topic scope. There were no studies reporting data for the older population (aged over 75).

3.6

The quality of all 26 published accuracy studies was assessed using QUADAS‑2 criteria. All studies were considered at high risk of bias in at least 1 domain, and 13 studies were considered at high risk of bias in 2 or more domains. Studies reported in FDA documents or as unpublished manufacturer data could not be quality assessed because of the lack of information. The main applicability issue was that studies did not use clinical scoring tools, as described in section 3.4.

3.7

Of studies reporting antibiotic prescribing behaviours, the methodological quality of the 3 randomised controlled trials was fair, as assessed by the Cochrane risk of bias tool. No domains were considered at high risk of bias but 1 to 3 domains per study had unclear risk of bias. Of 9 cohort studies, 3 assessed hypothetical prescribing behaviours according to the prescribing guidelines and were not quality assessed. The remaining 6 cohort studies were assessed using the Joanna Briggs Institute Critical Appraisal Checklist for analytical cross-sectional studies. There was 1 study with unclear risk of bias in 1 domain, and 5 studies were at high risk of bias in 1 or more domains.

Evidence on diagnostic performance of rapid tests for strep A infections

3.8

Only 2 studies reported the diagnostic accuracy of the rapid tests in people who are more likely (FeverPAIN score of 2 or 3) or most likely (Centor score of 3 or 4, or FeverPAIN score of 4 or 5) to benefit from antibiotics. Accuracy data from these 2 studies are in table 3.

Table 3 Diagnostic accuracy of rapid tests in people who are more or most likely to benefit from antibiotics

| Citation | Test | Population | Setting | Centor threshold | Sensitivity, % (95% CI) | Specificity, % (95% CI) |
|---|---|---|---|---|---|---|
| Humair et al. (2006) | Alere TestPack Plus Strep A | Adults with a Centor score of 2 or more | Primary care in Switzerland | Centor 3 or more | 95 (89 to 98) | 94 (88 to 98) |
| Humair et al. (2006) | Alere TestPack Plus Strep A | Adults with a Centor score of 2 or more | Primary care in Switzerland | Centor 2 | 80 (63 to 92) | 96 (91 to 99) |
| Llor et al. (2011) | OSOM Strep A | Adults with a Centor score of 1 or more | Primary care in Spain | Centor 3 or more | 92 (76 to 98) | 96 (89 to 99) |
| Llor et al. (2011) | OSOM Strep A | Adults with a Centor score of 1 or more | Primary care in Spain | Centor 1 or 2 | 85 (55 to 98) | 93 (87 to 96) |

3.9

Most studies either included all patients with acute sore throat, without using clinical scoring tools, or used these tools at a lower threshold than in UK practice. Across these studies (any population or healthcare setting), accuracy data were available for 18 of the 21 tests:

  • There were no accuracy data for 3 tests: Strep A Rapid Test Strip (Biopanda), Biosynex Strep A Cassette test, and Bionexia Strep A Plus Cassette test.

  • Accuracy estimates for 8 tests (Strep A rapid test cassette [Biopanda], 5 NADAL Strep A tests, Alere i Strep A 2 test and Xpert Xpress Strep A test) were only available from unpublished manufacturer data or FDA reports.

  • Only 5 tests had data from 2 or more published studies (BD Veritor Plus System, QuikRead Go Strep A test kit, Alere i Strep A, OSOM Strep A test strip and Alere TestPack Plus Strep A cassette). Meta-analysis was possible for these tests.

3.10

Across studies, there was wide variation in the sensitivity (67.9% to 100%) and specificity (73.3% to 100%) of the rapid tests. There was wide variation in accuracy estimates even for the same test. For example, the sensitivity of the Alere TestPack Plus cassette ranged from 73% (95% confidence interval [CI] 45% to 92%) to 96% (95% CI 91% to 99%), and its specificity ranged from 86% (95% CI 81% to 91%) to 100% (95% CI 96% to 100%) across 10 studies. Data from manufacturer and FDA submissions consistently provided higher estimates of sensitivity and specificity than peer-reviewed studies.

3.11

Head-to-head comparison of the diagnostic accuracy of different tests was only reported in 4 studies. These studies suggested there is some variation in accuracy between tests. Because of the large degree of inter-study variability, it was not possible to compare the relative accuracy of different tests across different studies.

3.12

There were 3 studies that enrolled both adults and children, with separate accuracy data for each age group, allowing for a within-study comparison. These studies showed no clear trends in the diagnostic accuracy of the rapid tests between different age groups. In addition, there were 7 studies that enrolled adults only and 10 studies that enrolled children only. All other studies enrolled a mixed population of adults and children or did not report the age group.

3.13

No studies compared the diagnostic accuracy of the rapid tests in different healthcare settings. A total of 10 studies were done in primary care and 14 in secondary care; healthcare setting was not reported in the remaining studies. There were no studies done in a pharmacy setting.

3.14

Conflicting results between the rapid tests and microbiological culture of throat swabs were resolved using polymerase chain reaction (PCR) in 4 studies. A large proportion of conflicting results (both false positives and false negatives) tested positive with PCR. This suggests that the reference standard used in this assessment is not 100% accurate and may be under- or overestimating the accuracy of the rapid tests.

3.15

Rapid test failure rates were generally low, as reported in 5 studies:

  • Alere i Strep A: 0% and 2.8% (2 studies).

  • Alere TestPack Plus Strep A: 0.3% and 1.3% (2 studies).

  • Sofia Strep A FIA: 4.7% (1 study).

    The EAG noted that these differences could be because of environmental factors such as staff training rather than issues with the tests.

Evidence on antibiotic prescribing behaviour

3.16

The 3 randomised controlled trials reporting on antibiotic prescribing showed a decrease in antibiotic prescribing with the rapid tests:

  • In a UK study of adults and children aged 3 years or more with acute sore throat in primary care (Little et al. 2013), the rate of immediate prescribing was 10% (21 of 207 patients) in the control (delayed antibiotic) group, 16% (33 of 211 patients) in the clinical scoring tool (FeverPAIN) group, and 18% (38 of 213 patients) in the FeverPAIN plus rapid strep A (Alere TestPack Plus Strep A) test group. The rate of delayed prescribing was 79%, 41% and 23%, respectively. The rate of immediate or delayed prescriptions was lower with the rapid strep A test compared with the clinical scoring group, but the reported use of antibiotics was comparable between the groups (35% and 37% respectively, compared with 46% in the control group). Data on reported antibiotic use were only available for 80% of enrolled patients so should be interpreted with caution.

  • In a Spanish study of adults in primary care (Llor et al. 2011), 44% of people who had the OSOM Strep A test as well as the Centor tool had an antibiotic prescription, compared with 64% of people in the Centor only group.

  • In a Canadian study of adults in primary care (Worrall et al. 2007), antibiotics were prescribed for 58% of patients in the control group (usual care), 55% of people in the sore throat decision rules group (STDR; modified Centor), 27% of people in the rapid test group (Clearview Exact Strep A), and 38% of patients in the STDR plus rapid test group.

3.17

The before-and-after study by Bird et al. (2018) assessed antibiotic prescribing rates before and after introducing the McIsaac clinical scoring tool and a rapid strep A test (Bionexia Strep A) in a UK paediatric emergency department. After introducing this strategy, antibiotic prescribing rates decreased from 79% at baseline (October to November 2014) to 24% in the first year (August to November 2015) and 28% in the second year (September to November 2016). However, random annual fluctuations and seasonality could have confounded the results.

Cost effectiveness

Systematic review of cost-effectiveness evidence

3.18

The EAG found 3 cost-effectiveness studies for the rapid strep A tests. However, 2 of these studies only reported cost per person and did not report enough information for full data extraction or quality appraisal. The economic evaluation by Little et al. (2014) was considered high quality according to the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) checklist.

3.19

Little et al. (2014) did an economic analysis alongside a randomised controlled trial (reported in Little et al. 2013). The trial was based in UK primary care clinics, and included both adults and children aged 3 years or more with acute sore throat. Patients were randomised to targeted antibiotic use according to:

  • delayed prescribing

  • FeverPAIN clinical scoring tool

  • rapid strep A test (Alere TestPack Plus Strep A; used with FeverPAIN tool).

3.20

The economic analysis was from the NHS perspective and the time horizon was short (14 and 28 days), so long-term effects were not captured. The analysis included a cost-effectiveness analysis (cost per change in symptom severity) and a cost–utility analysis (cost per quality-adjusted life year [QALY]). QALYs were calculated using the mean EQ-5D scores from the 14‑day diary records, and were adjusted for differences in baseline characteristics.

3.21

In the cost–utility analysis, the delayed prescribing group was dominated by the FeverPAIN group for both time frames. The incremental cost-effectiveness ratio (ICER) for the rapid test compared with FeverPAIN was £74,286 per QALY gained for the 14‑day time frame and £24,528 per QALY gained for the 28‑day time frame. At £30,000 per QALY gained, the probabilities of each strategy being cost effective were 28%, 38% and 35% for delayed prescribing, FeverPAIN clinical score and the rapid test, respectively, for the 28‑day time frame.
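
For reference, the terms used here and in the EAG's analyses later in this section can be written out as follows. This is the standard cost-utility formulation, not anything specific to Little et al. (2014): a strategy is dominated when it costs more and produces fewer QALYs than a comparator, in which case no ICER is reported.

```latex
% General definitions (standard formulation, not study-specific)
\text{ICER}_{A \text{ vs } B} = \frac{C_A - C_B}{Q_A - Q_B}
\qquad \text{where } C = \text{mean cost and } Q = \text{mean QALYs for each strategy.}
```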

Economic analysis

3.22

The study by Little et al. (2014) included only 1 of the 21 rapid tests relevant to this assessment. Also, it only considered a primary care setting, and did not assess adults and children separately. Therefore, the EAG constructed 4 de novo economic models to assess the cost effectiveness of all relevant rapid tests in people with acute sore throat:

  • adults in primary care

  • adults in secondary care

  • children in primary care

  • children in secondary care.

3.23

Economic assessment for older people or for the pharmacy setting was not possible because of the lack of evidence.

Model structure
3.24

A decision tree was created to simulate the potential care pathways associated with using rapid tests and clinical scoring tools, compared with using clinical scoring tools only (current practice), in people with acute sore throat.

3.25

The economic analysis was from the UK NHS and personal social services perspective. A 1‑year time horizon was used to see the effect of rare but serious complications of strep A infection on costs and outcomes (a shorter time frame of 14 days was used in sensitivity analyses). No discounting was applied to costs and benefits because of the short time horizon. Because of the lack of published data, the models did not consider wider public health benefits such as the potential effect on antimicrobial stewardship or onward transmission rates.

Model inputs
3.26

A prevalence of 22.6% was used for adults, based on the study by Little et al. (2014). The study enrolled patients aged 3 years or older in UK primary care. For children, an estimate of 30.2% was assumed, based on the median of 3 non-UK studies of children in primary care.

3.27

The accuracy estimates for the Centor clinical scoring tool were taken from the meta-analysis by Aalbers et al. (2011), which assessed the Centor score for predicting strep A pharyngitis in adults (aged 15 years or older) in primary care. At a Centor threshold of 3 or more, sensitivity was estimated as 49% (95% CI 38% to 60%) and specificity as 82% (95% CI 72% to 88%). There were no studies reporting the accuracy of the FeverPAIN clinical scoring tool, so it could not be modelled.

3.28

The accuracy estimates for the rapid strep A tests were from the systematic literature review done by the EAG. The sensitivity of the rapid tests ranged from 68% to 100%, and the specificity from 79% to 100% (see table 4 and table 5). The estimates of accuracy based on unpublished manufacturers' data or FDA reports were consistently higher than the estimates from the published peer-reviewed studies. Therefore, the economic models based solely on manufacturers' test accuracy data should be interpreted with caution.

Table 4 Test accuracy data used in the economic model for adults in primary care

| Test name (manufacturer) | Sensitivity, % (95% CI) | Specificity, % (95% CI) | Data source |
|---|---|---|---|
| Clearview Exact Strep A cassette (Abbott) | 68 (54 to 80) | 95 (92 to 97) | 1 abstract |
| Clearview Exact Strep A dipstick (Abbott) | 68 (54 to 80) | 95 (92 to 97) | 1 abstract |
| BD Veritor Plus system group A Strep Assay cassette (Becton Dickinson) | 78 (67 to 87) | 90 (86 to 93) | 2 published studies |
| Strep A rapid test cassette (Biopanda Reagents) | 95 (90 to 98) | 98 (96 to 99) | 1 unpublished study [1] |
| Strep A rapid test dipstick (Biopanda Reagents) | 95 (90 to 98) | 98 (96 to 99) | No data [2] |
| NADAL Strep A test strip (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A cassette (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A plus cassette (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A plus test strip (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A scan test cassette (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| OSOM Strep A test strip (Sekisui Diagnostics) | 92 (76 to 98) | 96 (89 to 99) | 3 published studies |
| QuikRead Go Strep A test kit (Orion Diagnostica) | 100 (85 to 100) | 79 (60 to 92) | 1 published study |
| Alere TestPack Plus Strep A cassette (Abbott) | 95 (89 to 98) | 94 (88 to 98) | 1 published study |
| Bionexia Strep A plus cassette (Biomerieux) | No data | No data | No data |
| Bionexia Strep A dipstick test strip (Biomerieux) | 85 (74 to 92) | 91 (84 to 95) | 1 abstract |
| Biosynex Strep A cassette (Biosynex) | No data | No data | No data |
| Sofia Strep A FIA (Quidel) | 85 (81 to 89) | 95 (93 to 97) | 1 published study |
| Alere i Strep A (Abbott) [3] | 95 (74 to 100) | 97 (92 to 99) | 1 published study |
| Alere i Strep A 2 (Abbott) [4] | 98 (96 to 100) | 93 (91 to 95) | 1 FDA report |
| Cobas Strep A assay on Liat system (Roche Diagnostics) | 98 (93 to 100) | 93 (90 to 96) | 1 published study |
| Xpert Xpress Strep A (Cepheid) | 100 (99 to 100) | 94 (92 to 96) | 1 unpublished study [1] and 1 FDA report |

Notes:

1, Unpublished manufacturer data.

2, Assumed the same accuracy as the cassette version of the test.

3, Replaced by ID NOW Strep A.

4, Rebranded to ID NOW Strep A 2.

Table 5 Test accuracy data used in the economic model for children in primary care

| Test name (manufacturer) | Sensitivity, % (95% CI) | Specificity, % (95% CI) | Data source |
|---|---|---|---|
| Clearview Exact Strep A cassette (Abbott) | 68 (54 to 80) | 95 (92 to 97) | 1 abstract |
| Clearview Exact Strep A dipstick (Abbott) | 68 (54 to 80) | 95 (92 to 97) | 1 abstract |
| BD Veritor Plus system group A Strep Assay cassette (Becton Dickinson) | 76 (61 to 88) | 94 (89 to 97) | 1 published study |
| Strep A rapid test cassette (Biopanda Reagents) | 95 (90 to 98) | 98 (96 to 99) | 1 unpublished study [1] |
| Strep A rapid test dipstick (Biopanda Reagents) | 95 (90 to 98) | 98 (96 to 99) | No data [2] |
| NADAL Strep A test strip (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A cassette (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A plus cassette (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A plus test strip (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| NADAL Strep A scan test cassette (nal von minden GmbH) | 98 (92 to 100) | 98 (94 to 99) | 1 unpublished study [1] |
| OSOM Strep A test strip (Sekisui Diagnostics) | 94 (89 to 98) | 95 (91 to 98) | 1 published study |
| QuikRead Go Strep A test kit (Orion Diagnostica) | 80 (56 to 94) | 91 (72 to 99) | 1 published study |
| Alere TestPack Plus Strep A cassette (Abbott) | 86 (79 to 91) | 99 (97 to 100) | 1 published study |
| Bionexia Strep A plus cassette (Biomerieux) | No data | No data | No data |
| Bionexia Strep A dipstick test strip (Biomerieux) | 85 (74 to 92) | 91 (84 to 95) | 1 abstract |
| Biosynex Strep A cassette (Biosynex) | No data | No data | No data |
| Sofia Strep A FIA (Quidel) | 85 (81 to 89) | 95 (93 to 97) | 1 published study |
| Alere i Strep A (Abbott) [3] | 98 (95 to 100) | 96 (89 to 100) | 3 published studies |
| Alere i Strep A 2 (Abbott) [4] | 98 (96 to 100) | 93 (91 to 95) | 1 FDA report |
| Cobas Strep A assay on Liat system (Roche Diagnostics) | 98 (93 to 100) | 93 (90 to 96) | 1 published study |
| Xpert Xpress Strep A (Cepheid) | 100 (99 to 100) | 94 (92 to 96) | 1 unpublished study [1] and 1 FDA report |

Notes:

1, Unpublished manufacturer data.

2, Assumed the same accuracy as the cassette version of the test.

3, Replaced by ID NOW Strep A.

4, Rebranded to ID NOW Strep A 2.

3.29

Treatment-related probabilities and complication rates used in the models are in table 6.

Table 6 Treatment-related probabilities and complication rates

| Parameter | Mean | Standard error |
|---|---|---|
| GP practice | | |
| Proportion attending repeat GP consultation following group A streptococcal infection | 0.142 | 0.007 |
| Antibiotic prescribing probabilities | | |
| Probability of immediate prescription if Centor score is 3 or higher, or positive test | 1.00 | – |
| Probability of delayed prescription if Centor score is below 3 (current practice arm) | 0.51 | 0.026 |
| Probability of delayed prescription if negative test (intervention arm) | 0.267 | 0.014 |
| Probability of antibiotics used given delayed prescription | 0.46 | 0.023 |
| Probability of antibiotics used given immediate prescription | 1.0 | – |
| Complication rates following group A streptococcal infection | | |
| Probability of complication if antibiotics given (treated infection) | 0.013 | 0.0005 |
| Probability of complications if no antibiotics given (untreated infection) | 0.015 | 0.0007 |
| Proportion of complications that are non-suppurative (that is, rheumatic fever) | 0.0001 | – |
| Adverse effects of penicillin | | |
| Penicillin-induced rash | 0.02 | – |
| Penicillin-induced anaphylaxis | 0.0001 | – |

Note: Standard error figures derived from assuming upper and lower bound equal to 10% of the mean estimate.
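
The note above implies the standard errors were back-calculated from an assumed interval of ±10% around each mean. One plausible reading (an assumption, not stated explicitly in the source) is that this ±10% range was treated as an approximate 95% confidence interval, giving a standard error of roughly 5% of the mean, which is consistent with most of the tabulated values:

```latex
% Assumed reconstruction: plus/minus 10% of the mean treated as an approximate 95% interval
SE \approx \frac{0.1 \times \text{mean}}{1.96}
\qquad \text{for example, } \frac{0.1 \times 0.142}{1.96} \approx 0.007
\text{ (repeat GP consultation row in table 6).}
```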

3.30

The health impact of each pathway was expressed in QALYs. These were calculated by subtracting the disutilities associated with treated and untreated strep A infection, complications of strep A infection and adverse effects of penicillin (see table 7) over 1 year from the mean baseline utilities. The mean baseline utilities in the models were based on a general UK population: 0.863 for adults and 0.94 for people under 25 years (Kind et al. 1998). The latter is the closest age group to children and was therefore used as the baseline utility in the children's models. Mean disutilities were based on published literature (Neuner et al. 2003), which reported them as quality-adjusted life days; these were converted to utility decrements for the models.

Table 7 Utility decrements associated with strep A infection and complications

| Item | Mean quality-adjusted life days lost | Mean utility decrement used in the models | Standard error |
|---|---|---|---|
| Strep A infections | | | |
| Untreated infection | 0.25 | 0.000685 | 0.00005 |
| Treated infection | 0.15 | 0.000411 | 0.00003 |
| Strep A infection complications | | | |
| Peritonsillar abscess | 5 | 0.0137 | 0.0007 |
| Rheumatic fever | 76.5 | 0.209 | 0.011 |
| Adverse effects of penicillin | | | |
| Penicillin-induced anaphylaxis | 9 | 0.025 | 0.0013 |
| Penicillin-induced rash | 0.65 | 0.0017 | 0.0001 |

Notes:

Mean utility decrement used in the models, calculated by converting quality-adjusted life days to utilities.

Standard error figures derived from assuming upper and lower bound equal to 10% of the mean estimate.
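
The utility decrements in table 7 are consistent with dividing the quality-adjusted life days lost by the number of days in a year. This conversion (an inference from the tabulated figures rather than an explicitly stated formula) can be written as:

```latex
% Inferred conversion from quality-adjusted life days (QALDs) lost to a 1-year utility decrement
\text{utility decrement} \approx \frac{\text{QALDs lost}}{365}
\qquad \frac{0.25}{365} \approx 0.000685 \text{ (untreated infection)}, \quad
\frac{76.5}{365} \approx 0.209 \text{ (rheumatic fever).}
```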

3.31

Costs were calculated using 2017/18 prices. The total costs for each strategy (current practice and rapid tests) include GP consultations, antimicrobial therapy, and managing strep A infection-related complications and adverse effects of penicillin (see table 8).

Table 8 Treatment costs (2017/18 price year)

| Treatment cost | Mean | Standard error | Source |
|---|---|---|---|
| Antibiotics (phenoxymethylpenicillin 250 mg, 28‑tablet pack) | £0.91 | £0.046 | BNF 72 (2017) |
| Pain relief (paracetamol 500 mg, 32-tablet pack) | £0.74 | £0.037 | BNF 72 (2017) |
| GP consultation (9.22 minutes) | £37.40 | £1.91 | Personal Social Services Research Unit costs 2017 |
| Treatment costs, penicillin-induced rash (switch to erythromycin 500 mg) | £10.00 | £0.51 | BNF 72 (2017) |
| Treatment costs, penicillin-induced anaphylaxis [1] | £1,744.64 | £89.01 | Derived from Hex et al. 2017 |
| Treatment costs, abscess (tonsillectomy) | £1,571.28 | £80.00 | 2017 NHS reference costs |
| Treatment costs, acute rheumatic fever | £1,772.44 | £90.43 | 2017 NHS reference costs |

Note: 1, Based on expert opinion, costs of penicillin-induced anaphylaxis were assumed to be equivalent to the initial cost of treating sepsis, as derived from Hex et al. 2017.

3.32

Cost data were available for 14 of the 21 rapid tests in this assessment (see table 9). The cost of testing also accounted for:

  • Additional GP time needed to process the test, ranging from 5 to 12 minutes depending on the test.

  • Apportioned cost of analyser or test cassette reader (that is, cost of analyser or reader adjusted for its average life span and the average number of samples analysed); a worked illustration of this apportioning follows the list.

  • Cost of the microbiological culture of throat swabs (£8 per sample) to confirm negative test results, when needed.
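
As a minimal sketch of the apportioning in the second bullet point above (the symbols and example figures are hypothetical and are not taken from the assessment), the apportioned analyser cost per test can be thought of as the purchase price spread over the analyser's expected lifetime throughput:

```latex
% Hypothetical illustration: apportioning an analyser's purchase price over its lifetime throughput
\text{apportioned cost per test} = \frac{P_{\text{analyser}}}{L \times n}
\qquad \text{e.g. } \frac{\text{£}1{,}000}{5 \text{ years} \times 400 \text{ tests per year}} = \text{£}0.50
```

where P is the analyser purchase price, L its average life span in years and n the average number of samples analysed per year.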

Table 9 Test costs

| Test ID | Test name (manufacturer) | Cost | Test process time (minutes) | Throat culture needed |
|---|---|---|---|---|
| 1 | Clearview Exact Strep A cassette (Abbott) | £2.72 | 5 | Yes |
| 2 | Clearview Exact Strep A dipstick (Abbott) | £1.92 | 5 | Yes |
| 3 | BD Veritor Plus system group A Strep Assay cassette (Becton Dickinson) | Not known | Not known | Not known |
| 4 | Strep A rapid test cassette (Biopanda Reagents) | £0.82 | 5 | Yes |
| 5 | Strep A rapid test dipstick (Biopanda Reagents) | £0.64 | 5 | Yes |
| 6 | NADAL Strep A test strip (nal von minden GmbH) | £1.20 | 5 | No |
| 7 | NADAL Strep A cassette (nal von minden GmbH) | £1.40 | 5 | No |
| 8 | NADAL Strep A plus cassette (nal von minden GmbH) | £1.50 | 5 | No |
| 9 | NADAL Strep A plus test strip (nal von minden GmbH) | £1.30 | 5 | No |
| 10 | NADAL Strep A scan test cassette (nal von minden GmbH) | £1.96 | 5 | No |
| 11 | OSOM Strep A test strip (Sekisui Diagnostics) | Not known | Not known | Not known |
| 12 | QuikRead Go Strep A test kit (Orion Diagnostica) | £4.34 | 5 | Assumed yes [3] |
| 13 | Alere TestPack Plus Strep A cassette (Abbott) | £2.70 | 5 | Assumed no [4] |
| 14 | Bionexia Strep A plus cassette (Biomerieux) | Not known | Not known | Not known |
| 15 | Bionexia Strep A dipstick (Biomerieux) | Not known | Not known | Not known |
| 16 | Biosynex Strep A cassette (Biosynex) | Not known | Not known | Not known |
| 17 | Sofia Strep A FIA (Quidel) | Not known | Not known | Not known |
| 18 | Alere i Strep A (Abbott) [5] | Not known | Not known | Not known |
| 19 | Alere i Strep A 2 (Abbott) [6] | £22.94 | 5 | No |
| 20 | Cobas Strep A assay on Liat system (Roche Diagnostics) | £35 [7] | 6 | No |
| 21 | Xpert Xpress Strep A (Cepheid) | £4.25 [8] | 12 | Assumed yes [3] |

Notes:

Cost, includes apportioned cost of analyser or test cassette reader (that is, cost of analyser or reader adjusted for its average life span and the average number of samples analysed), when relevant.

Throat culture, confirmatory microbiological culture of throat swabs for negative results of rapid tests is needed, as specified in the information for use documents.

3, Not known whether confirmatory test is needed, assumed that it is.

4, Confirmatory testing warranted only if symptoms persist.

5, This test has been replaced by ID NOW Strep A 2 test.

6, Rebranded to ID NOW Strep A 2.

7, Average test selling price based on volume-based discounts (submitted by company during consultation; does not include apportioned analyser cost).

8, Based on the list price provided by the company and EAG's assumptions.

Base-case assumptions
3.33

The model was created for adults in primary care and then adapted for children and secondary care:

  • In current practice, antibiotic prescribing (immediate, delayed or no prescribing) is based on the Centor score.

  • In the rapid test cohort, people with a Centor score of 3 or more are offered the rapid test. Antibiotic prescribing decisions (immediate, delayed or no prescribing) are based on the test results.

  • Of people offered delayed prescription, 46% use their prescription.

  • Of people with strep A infection, 1.3% to 1.5% develop complications, depending on whether or not they had antibiotics.

  • People who take antibiotics are at risk of penicillin-related adverse effects (2% have penicillin-induced rash and 0.01% have penicillin-induced anaphylaxis).

  • When recommended by manufacturers, negative rapid strep A test results are followed up with microbiological culture of throat swabs to confirm the results.

3.34

The model for adults in secondary care was adapted from the adult primary care model by excluding the cost of the initial GP consultation. Also, it was assumed that all rapid tests could be done in the standard time allocated for secondary care appointments. The accuracy of rapid tests was assumed to be the same as in primary care (because of the lack of specific data in secondary care) except for 3 tests for which the sensitivity estimates from secondary care were available: OSOM Strep A test (94%), QuikRead Go Strep A test kit (87%) and the Alere TestPack Plus Strep A (90%). All other assumptions and inputs are the same as in the primary care model.

3.35

The model for children in primary care was adapted from the corresponding adult model by adjusting the prevalence of strep A infections from 22.6% to 30.2%, and using the accuracy estimates from studies in children whenever these were available (see table 5). The costs of treating peritonsillar abscess and related complications in children were assumed to be lower than in adults (£1,420.50 compared with £1,571.28), based on the NHS reference costs for both age groups.

3.36

The test accuracy data for children in secondary care were assumed to be the same as in primary care (because of the lack of specific data in secondary care), except for 3 tests for which the accuracy estimates from secondary care were available: OSOM Strep A test (test strip; sensitivity: 94%, specificity: 97%), QuikRead Go Strep A test kit (sensitivity: 87%, specificity: 78%), and Alere TestPack Plus Strep A (sensitivity: 77%, specificity: 97%).
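
To make the structure of these calculations concrete, the sketch below works through a highly simplified expected-value calculation over a decision tree of the kind described in sections 3.24 to 3.33. It is illustrative only: the pathway logic is a simplifying assumption (it models only the Centor 3 or more subgroup, and ignores GP consultation, repeat consultation and confirmatory culture costs and the secondary care adaptations), the function and parameter names are invented for the example, the example test accuracy and price are hypothetical, and the other inputs are taken from tables 6 to 8. It is not the EAG's model and will not reproduce the results in tables 10 and 11.

```python
"""Illustrative expected-value calculation over a simplified decision tree.

NOT the external assessment group's model. Assumes adults in primary care with
a Centor score of 3 or more: under current practice all of them get an
immediate antibiotic prescription, while in the testing strategy prescribing
follows the rapid test result. Inputs are taken from tables 6 to 8; the
example test accuracy and price are hypothetical.
"""

from dataclasses import dataclass

# --- inputs drawn from tables 6 to 8 (per-person values) --------------------
PREVALENCE = 0.226               # strep A prevalence in adults (section 3.26)
P_DELAYED_IF_NEGATIVE = 0.267    # delayed prescription offered after a negative test
P_USE_DELAYED = 0.46             # delayed prescriptions that are actually used
P_COMPLICATION_TREATED = 0.013
P_COMPLICATION_UNTREATED = 0.015
P_RHEUMATIC_FEVER = 0.0001       # proportion of complications that are rheumatic fever
P_RASH = 0.02                    # penicillin-induced rash, if antibiotics taken
P_ANAPHYLAXIS = 0.0001           # penicillin-induced anaphylaxis, if antibiotics taken

COST_ANTIBIOTICS = 0.91
COST_RASH = 10.00
COST_ANAPHYLAXIS = 1744.64
COST_ABSCESS = 1571.28
COST_RHEUMATIC_FEVER = 1772.44

BASELINE_UTILITY = 0.863         # adults, general UK population (section 3.30)
DISUTILITY_TREATED = 0.000411
DISUTILITY_UNTREATED = 0.000685
DISUTILITY_ABSCESS = 0.0137
DISUTILITY_RHEUMATIC_FEVER = 0.209
DISUTILITY_RASH = 0.0017
DISUTILITY_ANAPHYLAXIS = 0.025


@dataclass
class Strategy:
    name: str
    test_cost: float     # cost per person tested (0 for current practice)
    sensitivity: float   # probability a strep A case gets an immediate prescription
    specificity: float   # probability a non-case avoids an immediate prescription


def p_antibiotics(strategy: Strategy, has_strep: bool) -> float:
    """Probability of actually taking antibiotics, given true strep A status."""
    p_immediate = strategy.sensitivity if has_strep else 1.0 - strategy.specificity
    p_delayed_used = (1.0 - p_immediate) * P_DELAYED_IF_NEGATIVE * P_USE_DELAYED
    return p_immediate + p_delayed_used


def expected_cost_and_qalys(strategy: Strategy) -> tuple[float, float]:
    """Expected 1-year cost and QALYs per person entering the pathway."""
    cost, qalys = strategy.test_cost, BASELINE_UTILITY
    for has_strep, p_group in ((True, PREVALENCE), (False, 1.0 - PREVALENCE)):
        p_abx = p_antibiotics(strategy, has_strep)
        # antibiotic course and its adverse effects apply to anyone taking antibiotics
        cost += p_group * p_abx * (COST_ANTIBIOTICS
                                   + P_RASH * COST_RASH
                                   + P_ANAPHYLAXIS * COST_ANAPHYLAXIS)
        qalys -= p_group * p_abx * (P_RASH * DISUTILITY_RASH
                                    + P_ANAPHYLAXIS * DISUTILITY_ANAPHYLAXIS)
        if has_strep:
            # infection disutility and complication risk depend on treatment
            qalys -= p_group * (p_abx * DISUTILITY_TREATED
                                + (1.0 - p_abx) * DISUTILITY_UNTREATED)
            p_compl = (p_abx * P_COMPLICATION_TREATED
                       + (1.0 - p_abx) * P_COMPLICATION_UNTREATED)
            p_rf = p_compl * P_RHEUMATIC_FEVER
            p_abscess = p_compl - p_rf
            cost += p_group * (p_abscess * COST_ABSCESS + p_rf * COST_RHEUMATIC_FEVER)
            qalys -= p_group * (p_abscess * DISUTILITY_ABSCESS
                                + p_rf * DISUTILITY_RHEUMATIC_FEVER)
    return cost, qalys


if __name__ == "__main__":
    # current practice: Centor 3 or more, so everyone gets an immediate prescription
    current = Strategy("Current practice (Centor 3 or more)", 0.0, 1.0, 0.0)
    # hypothetical rapid test with illustrative accuracy and price (not a specific product)
    tested = Strategy("Rapid test after Centor 3 or more", 3.00, 0.95, 0.95)

    c0, q0 = expected_cost_and_qalys(current)
    c1, q1 = expected_cost_and_qalys(tested)
    print(f"{current.name}: £{c0:.2f}, {q0:.6f} QALYs per person")
    print(f"{tested.name}: £{c1:.2f}, {q1:.6f} QALYs per person")
    print(f"ICER: £{(c1 - c0) / (q1 - q0):,.0f} per QALY gained")
```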

Economic analysis results

Base-case results
3.37

In the base-case adult primary care model, current practice dominated 2 tests (that is, current practice was cheaper and more effective than the testing strategy): the Clearview Exact Strep A cassette and dipstick. The ICERs for the remaining 12 tests ranged from £1,353,677 to £6,059,081 per QALY gained, compared with current practice (see table 10). Costs and QALYs are presented per 1,000 people because the incremental QALYs per person were very small.
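
Because the results in table 10 are presented per 1,000 people, each ICER can be read directly as the ratio of the incremental cost to the incremental QALYs shown; the factor of 1,000 cancels out. For example (the result differs slightly from the tabulated £1,353,677 because the displayed inputs are rounded):

```latex
% Worked example from table 10 (NADAL Strep A test strip); the per-1,000 scaling cancels out
\text{ICER} = \frac{\text{£}5{,}248}{0.0038765 \text{ QALYs}} \approx \text{£}1{,}354{,}000 \text{ per QALY gained.}
```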

3.38

The results of the base-case adult secondary care model were in line with the results of the adult primary care model, but the ICERs were much lower (see table 11).

3.39

In both models for children, current practice dominated 4 tests: the Clearview Exact Strep A cassette and dipstick, QuikRead Go Strep A test kit and Alere TestPack Plus Strep A cassette (see table 11). In the children's primary care model, the ICERs for the remaining 10 tests ranged from £1,762,306 to £7,893,857 per QALY gained, compared with current practice. In the children's secondary care model, the ICERs for the remaining 10 tests ranged from £65,122 to £5,723,279 per QALY gained, compared with current practice.

Table 10 Base-case cost-effectiveness results: adult primary care model

| Test | Mean costs | Mean QALYs | Incremental costs | Incremental QALYs | ICER versus current practice |
|---|---|---|---|---|---|
| Current practice [2] | £49,147 | 859.82458955 | £0 | 0.0000000 | – |
| Clearview Exact Strep A cassette (Abbott) [3] | £56,180 | 859.82063008 | £7,033 | −0.0039595 | Dominated |
| Clearview Exact Strep A dipstick (Abbott) [3] | £55,980 | 859.82063008 | £6,833 | −0.0039595 | Dominated |
| Strep A rapid test cassette (Biopanda Reagents) [4] | £55,442 | 859.82769587 | £6,295 | 0.0031063 | £2,026,496 |
| Strep A rapid test dipstick (Biopanda Reagents) [4,5] | £55,397 | 859.82769587 | £6,250 | 0.0031063 | £2,012,006 |
| NADAL Strep A test strip (nal von minden GmbH) [4] | £54,394 | 859.82846603 | £5,248 | 0.0038765 | £1,353,677 |
| NADAL Strep A cassette (nal von minden GmbH) [4] | £54,444 | 859.82846603 | £5,298 | 0.0038765 | £1,366,577 |
| NADAL Strep A plus cassette (nal von minden GmbH) [4] | £54,469 | 859.82846603 | £5,323 | 0.0038765 | £1,373,029 |
| NADAL Strep A plus test strip (nal von minden GmbH) [4] | £54,419 | 859.82846603 | £5,273 | 0.0038765 | £1,360,126 |
| NADAL Strep A scan test cassette (nal von minden GmbH) [4] | £54,584 | 859.82846603 | £5,438 | 0.0038765 | £1,402,700 |
| QuikRead Go Strep A test kit (Orion Diagnostica) | £56,083 | 859.82810269 | £6,936 | 0.0035131 | £1,974,319 |
| Alere TestPack Plus Strep A cassette (Abbott) | £54,781 | 859.82751669 | £5,634 | 0.0029271 | £1,924,717 |
| Alere i Strep A 2 (Abbott) [4,6] | £59,837 | 859.82824206 | £10,691 | 0.0036525 | £2,926,915 |
| Cobas Strep A assay on Liat system (Roche Diagnostics) [7] | £63,868 | 859.82824206 | £14,722 | 0.0036525 | £4,030,533 |
| Xpert Xpress Strep A (Cepheid) [4] | £63,323 | 859.82854357 | £14,177 | 0.0039540 | £3,585,436 |

Notes: Cost-effectiveness analyses were not done for 7 tests that had no cost data (Bionexia Strep A plus cassette and Biosynex Strep A cassette had neither costs nor accuracy data available).

Mean costs, mean quality-adjusted life years, incremental costs and incremental quality-adjusted life years are per 1,000 individuals.

2, Clinical scoring based on Centor 3 or higher plus clinical assessment.

3, Based on the accuracy data presented in a conference abstract only.

4, Based on the accuracy data from the FDA or manufacturer's data.

5, Assumed equal accuracy to the cassette version of this test.

6, Rebranded to ID NOW Strep A 2.

7, Based on average selling price submitted by the company during consultation (based on volume-based discounts; without including apportioned analyser costs).

Table 11 Base-case cost-effectiveness results: other models (incremental cost-effectiveness ratio versus current practice)

| Test | Adults, secondary care | Children, primary care | Children, secondary care |
|---|---|---|---|
| Current practice [2] | – | – | – |
| Clearview Exact Strep A cassette (Abbott) [3] | Dominated | Dominated | Dominated |
| Clearview Exact Strep A dipstick (Abbott) [3] | Dominated | Dominated | Dominated |
| Strep A rapid test cassette (Biopanda Reagents) [4] | £392,342 | £2,992,743 | £517,066 |
| Strep A rapid test dipstick (Biopanda Reagents) [4,5] | £377,852 | £2,970,792 | £495,115 |
| NADAL Strep A test strip (nal von minden GmbH) [4] | £44,184 | £1,762,306 | £65,122 |
| NADAL Strep A cassette (nal von minden GmbH) [4] | £57,085 | £1,779,026 | £81,845 |
| NADAL Strep A plus cassette (nal von minden GmbH) [4] | £63,537 | £1,787,386 | £90,205 |
| NADAL Strep A plus test strip (nal von minden GmbH) [4] | £50,636 | £1,770,666 | £73,482 |
| NADAL Strep A scan test cassette (nal von minden GmbH) [4] | £93,211 | £1,825,846 | £128,662 |
| QuikRead Go Strep A test kit (Orion Diagnostica) | £12,700,432 | Dominated | Dominated |
| Alere TestPack Plus Strep A cassette (Abbott) | £335,358 | Dominated | Dominated |
| Alere i Strep A 2 (Abbott) [4,6] | £1,537,126 | £3,817,336 | £2,008,522 |
| Cobas Strep A assay on Liat system (Roche Diagnostics) [7] | £2,362,784 | £5,253,699 | £4,396,205 |
| Xpert Xpress Strep A (Cepheid) [4] | £504,287 | £4,396,205 | £574,900 |

Notes: Cost-effectiveness analyses were not done for 7 tests that had no cost data (Bionexia Strep A plus cassette and Biosynex Strep A cassette had neither costs nor accuracy data available).

Figures are based per 1,000 individuals.

2, Clinical scoring based on Centor 3 or higher plus clinical assessment.

3, Based on the accuracy data presented in a conference abstract only.

4, Based on the accuracy data from the FDA or manufacturer's data.

5, Assumed equal accuracy to the cassette version of this test.

6, Rebranded to ID NOW Strep A 2.

7, Based on average selling price submitted by the company during consultation (based on volume-based discounts; without including apportioned analyser costs).

Probabilistic sensitivity analysis
3.40

The results of the probabilistic sensitivity analysis mirrored the results of the deterministic base-case analysis in all models.

3.41

The probability of a rapid test being cost effective was 0 in all 4 models, regardless of the rapid test used.

Deterministic sensitivity analyses
3.42

A range of scenario analyses was done. For the adult primary care model, none produced ICERs that were around or below £30,000 per QALY gained, compared with current practice.

3.43

For the adult secondary care model, changing the rate of penicillin-induced anaphylaxis from 0.01% (Neuner et al. 2003) to 0.64% (Van Howe and Kusnier 2006), resulted in 6 rapid tests dominating current practice (that is, testing was cheaper and more effective than current practice). These were the 5 NADAL tests and Alere TestPack Plus Strep A. The ICERs for 4 tests (2 Clearview Exact Strep A tests and 2 Strep A rapid tests from Biopanda) decreased to around or below £30,000 per QALY gained, compared with current practice.

3.44

In addition, for the adult secondary care model, the ICERs for the 5 NADAL tests decreased to around or below £30,000 per QALY gained, compared with current practice, for the following assumptions:

  • changing the Centor threshold for starting antibiotics and testing to 2 or more (ICERs: £30,230 to £69,690 per QALY gained)

  • changing the Centor threshold for starting antibiotics and testing to 1 or more (ICERs: £22,220 to £56,190 per QALY gained)

  • lowering the prevalence of strep A infection to 10% (ICERs: £20,628 to £53,506 per QALY gained)

  • doubling the rate of penicillin-related rash to 4% (ICERs: £8,913 to £32,557 per QALY gained)

  • doubling the utility decrement of penicillin-induced rash (ICERs: £21,309 to £44,953 per QALY gained).

3.45

For the children's primary care model, no scenario analyses produced ICERs that were around or below £30,000 per QALY gained.

3.46

The scenario analyses for the children's secondary care model largely mirrored scenario analyses for the adult secondary care model, except that changing Centor threshold for starting antibiotics and testing to a score of 2 or more had no major effect on the ICERs.

3.47

In addition, several analyses favoured the testing strategies: all or some of the tests dominated by current practice in the base-case analyses were no longer dominated, although the ICERs remained around or above £100,000 per QALY gained. These analyses were:

  • doubling the complication rate of treated strep A infection to 2.6%

  • halving the complication rate of untreated strep A infection to 0.75% (children's primary care model only)

  • halving the utility decrement of untreated strep A infection

  • doubling the utility decrement of treated strep A infection

  • changing the accuracy estimates to the lower confidence limits for both the rapid test and Centor clinical scoring tool (children's primary care model only).

3.48

Several scenario analyses favoured current practice. In all 4 models, doubling the utility decrement associated with untreated strep A infection resulted in an additional 2 to 4 tests being dominated by current practice, compared with base-case results. These tests were the Strep A rapid test cassette and the test strip from Biopanda (all models), the Alere TestPack Plus Strep A cassette (adult primary and secondary care models) and the QuikRead Go Strep A test kit (adult secondary care model). In the adult secondary care model, the following assumptions also resulted in additional tests being dominated by current practice:

  • increasing the prevalence rate to 35.9%

  • halving the complication rate of treated strep A infection to 0.65%

  • doubling the complication rate of untreated strep A infection to 3%

  • halving the rate of penicillin-related rash to 1%

  • halving the utility decrement of treated strep A infection

  • halving the utility decrement of penicillin-induced rash

  • doubling the utility decrement of an abscess.