1 Recommendations

1.1 More research is needed on the following artificial intelligence (AI)‑derived software to analyse chest X‑rays alongside clinician review for suspected lung cancer in adults referred from primary care:

  • AI‑Rad Companion Chest X‑ray (Siemens Healthineers)

  • Annalise CXR (Annalise.ai)

  • Auto Lung Nodule Detection (Samsung)

  • ChestLink (Oxipit)

  • ChestView (Gleamer)

  • Chest X‑ray (Rayscape)

  • ClearRead Xray (Riverain Technologies)

  • InferRead DR Chest (Infervision)

  • Lunit INSIGHT CXR (Lunit)

  • Milvue Suite (Milvue)

  • qXR (Qure.ai)

  • Red dot (Behold.ai)

  • SenseCare‑Chest DR PRO (SenseTime)

  • VUNO Med‑Chest X‑Ray (VUNO).

1.2 Access to the technology should be through company, research or non-core NHS funding.

1.3 Centres already using AI‑derived software to review chest X‑rays in adults referred from primary care may continue to do so, but only under an appropriate evaluation framework and only alongside clinician review. These centres should also generate evidence on the outcomes described in recommendation 1.4.

More research

1.4 More research is needed on the following key outcomes:

  • the impact of the software on clinical decision making and the number of people referred to have a chest CT scan

  • how using the software affects healthcare costs and resource use

  • the impact on review and reporting time, and time to CT referral and diagnosis

  • the diagnostic accuracy of AI‑derived software alongside clinician review in detecting nodules or other abnormal lung features that suggest lung cancer

  • the diagnostic accuracy of using AI‑derived software alongside clinician review for identifying normal X‑rays with high confidence and the impact of this on work prioritisation and patient flow

  • the technical failure and rejection rates of the software

  • whether the software works in cases where it is hard to get high-quality images, for example in people with scoliosis or morbid obesity

  • whether the software works in groups that could particularly benefit, including people with multiple conditions, people from high-risk family backgrounds, and younger women who do not smoke (see section 3.12)

  • patient perceptions of using AI‑derived software.

Key gaps in the evidence

  • There is an unmet need for quicker reporting of chest X‑rays in people referred from primary care, particularly when this may support earlier detection of lung cancer. AI‑derived software shows promise and may help to reduce time to diagnosis and treatment by supporting clinician review of images and by prioritising images for review so that a CT scan can be done the same day. But there is insufficient evidence.

  • With the available evidence, it is not possible to assess the clinical and cost benefits or risks of using the technology in the NHS. So AI‑derived software should not be used for clinical decision making in the NHS until more evidence is available. Centres using it should generate evidence that will allow clinical and cost benefits to be assessed in the future.

  • There is no evidence to show how accurate software-assisted clinician review is at identifying lung abnormalities compared with clinician review alone in people referred for a chest X‑ray by their GP. Using this software could lead to lung cancer being missed, or to people having unnecessary CT scans, which can cause anxiety. This could also increase costs and overburden CT services.

  • In addition, software may not reduce, or may increase, the time radiologists or diagnostic radiographers spend reviewing and reporting chest X‑rays.

  • Prospective studies in a population referred from primary care that assess the software alongside clinician review would allow better understanding of the risks and benefits for patients and the healthcare system.

Overall, more evidence is needed on:

  • the diagnostic accuracy of AI‑derived software when used alongside clinician review to identify suspected lung cancer and to identify normal X‑rays

  • the risk and consequences of false-positive results, including the number of CT scans done

  • the risk and consequences of false-negative results

  • the impact the software has on the time a clinician takes to read and report a chest X‑ray.

The evidence generation plan gives further information on the prioritised evidence gaps and outcomes, ongoing studies and potential real-world data sources. It includes how the evidence gaps could be resolved through real-world evidence studies.

National Institute for Health and Care Excellence (NICE)