ESRA Updates

October 2022 | Issue 10

Artificial Intelligence for Ultrasound Scanning in Regional Anaesthesia

Eluned Fisher (Specialist Registrar, Aneurin Bevan University Health Board)
Steve Coppens (Co-editor of ESRA Updates, UZ Leuven, Belgium) @Steve_Coppens
James S Bowness (Consultant Anaesthetist, Aneurin Bevan University Health Board) @bowness_james

Declaration of Interest
JSB is a Senior Clinical Advisor to Intelligent Ultrasound, reporting research funding and honoraria.

Acknowledgements
The authors would like to thank Patrick Cullum (Intelligent Ultrasound, Cardiff, UK) for his help in producing the figures and video.


The human eye can distinguish around a million different colours,1 but only around 30 shades of grey,2 so why do we persist with viewing medical images in greyscale?

 

Introduction

Regional anaesthesia has traditionally been performed using anatomical landmarks to identify underlying structures, in addition to information gathered from patient symptoms (e.g., paraesthesia), to guide needle placement and injection.  Electrical stimulation was later introduced to elicit a motor or sensory response and improve nerve identification.3  Ultrasound image guidance, first described in 1989,4 now forms the basis of practice for most regional anaesthesia.5  Ultrasound-guided regional anaesthesia (UGRA) revolutionised peripheral nerve blockade, with improved success rates, faster onset, and reduced rates of complications including vascular injury and local anaesthetic systemic toxicity.5, 6

 

Challenges in Ultrasound Guidance

Despite the known advantages of UGRA, an inequality of patient access to such techniques persists, based to some extent on the availability of an anaesthetist with the required specialist skills.7  UGRA is undoubtedly operator-dependent,8 and the fundamental skills required in these techniques, in particular ultrasound scanning and needle-probe manipulation, contribute to this.  The ability to acquire and interpret optimal sonographic images requires many years of training and remains a barrier to successful delivery of UGRA.9  Unfortunately, the patients who stand to gain most from UGRA are often those in whom ultrasound scanning is most difficult: the obese, those with previous surgery or trauma, and patients with co-morbidities whose complications (e.g., oedema) can grossly alter the sonographic image.  In addition, interpretation may vary even amongst specialists in regional anaesthesia viewing the same ultrasound images.10, 11

Recent initiatives have aimed to improve delivery of UGRA through standardisation of practice.7, 12, 13  If these are to be successful, we must consider new technologies to augment ultrasound scanning such as bio-impedance needling14 and artificial intelligence (AI) image interpretation.9

 

Artificial Intelligence

Artificial intelligence encompasses techniques that enable computers to perform tasks usually associated with human intelligence.15  Machine learning (ML) is a branch of AI which allows computers to learn (i.e., improve performance of a given task with increased experience).9  One of the most common strategies of ML is deep learning (DL).  DL employs an artificial neural network to imitate the neural network of the human brain and is particularly suited to image recognition/analysis.  The artificial neurons are arranged in layers, which process data sequentially to produce a fine-grained interpretation of the image (Figure 1).  When training DL neural networks, data are presented to the network and statistical associations are made between the input data and the desired outcome (e.g., classifying an image as ‘dog’ or ‘cat’).  Over time, the system becomes adept at differentiating between images or features within an image such that, when deployed, it can autonomously differentiate between different classes in the data (e.g., dog vs cat).
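The layered processing described above can be illustrated with a deliberately minimal sketch. This is not the architecture of any clinical system; it is a toy feed-forward network with randomly initialised (untrained) weights, showing only how successive layers transform an input "image" into class probabilities (e.g., ‘dog’ vs ‘cat’).

```python
import numpy as np

# Toy illustration only: an untrained two-hidden-layer network.
# A real DL image model would be convolutional and trained on labelled data.
rng = np.random.default_rng(0)

def relu(x):
    # Non-linear activation applied between layers
    return np.maximum(0.0, x)

def softmax(x):
    # Convert raw output scores into probabilities that sum to 1
    e = np.exp(x - x.max())
    return e / e.sum()

image = rng.random(64)  # a flattened 8x8 greyscale "image"

# Weights and biases for two hidden layers and a 2-class output layer
W1, b1 = rng.normal(size=(32, 64)) * 0.1, np.zeros(32)
W2, b2 = rng.normal(size=(16, 32)) * 0.1, np.zeros(16)
W3, b3 = rng.normal(size=(2, 16)) * 0.1, np.zeros(2)

h1 = relu(W1 @ image + b1)     # layer 1: low-level features
h2 = relu(W2 @ h1 + b2)        # layer 2: higher-level features
probs = softmax(W3 @ h2 + b3)  # output: probability per class

print(probs)  # two class probabilities summing to 1
```

Training consists of adjusting the weight matrices so that, over many labelled examples, the output probabilities come to match the desired classification.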

Artificial intelligence for UGRA has been tried before – one of the authors worked on a forerunner project which aimed to develop reliable nerve identification with adaptive learning.  Although that project failed to achieve its objectives, such systems may benefit ultrasound scanning and image interpretation in UGRA, with the aim of making UGRA more accessible to clinicians and patients.  ScanNav™ Anatomy Peripheral Nerve Block (Intelligent Ultrasound, Cardiff, UK) is one such system (Figure 2).  It uses DL to produce a colour overlay on real-time B-mode ultrasound, which aims to draw the user’s attentional gaze to the structure(s) of interest (Figure 3 & Supplementary Video).  Early evidence suggests that the device may help experts in teaching UGRA and non-experts in learning or performing UGRA.11  The colour overlay aids non-experts in acquiring optimal ultrasound images and correctly identifying structures on those images.16  In one study, experts viewed 720 ultrasound scans and reported the highlighting to be accurate for 93.5% (1519/1624) of structures.  In addition, the experts judged, in their subjective opinion, that the highlighting would reduce the risk of complications of UGRA in 62.9–86.2% of scans viewed, and was likely to reduce the incidence of block failure in 81.2%.17  The device gained regulatory approval for clinical use in Europe (April 2021) and is currently under review for similar approval in the USA.  Other systems have been designed with similar functions, including Nerveblox (Smart Alfa Teknoloji San. Ve Tic AS, Ankara, Turkey),18 NerveTrack (Samsung Medison, Seoul, South Korea),19 and cNerve (GE Healthcare, Chicago, USA).20
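The colour-overlay principle itself is simple image compositing: wherever the model labels a pixel as a structure of interest, a colour tint is blended into the greyscale B-mode frame. The sketch below illustrates only this final blending step; the segmentation mask is synthetic here, whereas in a real system it would be produced per-frame by the DL model.

```python
import numpy as np

def overlay(grey_frame, mask, colour=(255, 200, 0), alpha=0.4):
    """Blend `colour` into an 8-bit greyscale frame wherever `mask` is True.

    grey_frame: 2-D uint8 array (the B-mode image)
    mask:       2-D boolean array of the same shape (model output, here synthetic)
    alpha:      blend strength of the colour tint
    """
    # Promote greyscale to RGB so a colour can be composited in
    rgb = np.stack([grey_frame] * 3, axis=-1).astype(np.float32)
    tint = np.array(colour, dtype=np.float32)
    # Alpha-blend the tint into the highlighted pixels only
    rgb[mask] = (1 - alpha) * rgb[mask] + alpha * tint
    return rgb.astype(np.uint8)

frame = np.full((4, 4), 100, dtype=np.uint8)  # uniform grey "frame"
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                         # pretend "nerve" region

out = overlay(frame, mask)
print(out.shape)  # (4, 4, 3): the greyscale frame promoted to RGB
```

Blending (rather than replacing) the pixels keeps the underlying greyscale texture visible beneath the highlight, so the operator's view of the anatomy is augmented rather than obscured.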

 

Limitations

Naturally, there is potential for error when using AI systems.  Complications may arise from device performance itself, or from operator dependence on the technology in place of acquiring the required procedural knowledge.  Expectations of AI are often exceedingly high, and some find initial limitations disappointing; however, new systems will emerge and existing ones are likely to improve.  As in other areas of clinical practice, current technology should be used to provide the operator with additional information, rather than acting as the decision maker in the process of UGRA.6

 

AI Beyond Ultrasound and Anaesthesia

Artificial intelligence will also be of benefit in other applications for anaesthesia, such as closed-loop feedback systems for propofol sedation,21 prediction of patient outcome,22 and potentially within robotic systems being designed for practical skills such as tracheal intubation.23  In some specialties, such as emergency medicine, doctors are often familiar with point-of-care ultrasound but perform UGRA infrequently.  AI support may enable such doctors to develop their skills in a standardised manner, giving patients faster and safer intervention and improving outcomes.

 

Summary

We are practising UGRA in a time of rapid technological advancement, so why do we limit our own practice with potentially outdated technology such as greyscale images?  As ultrasound machines evolve, with embedded high-performance AI image analysis, who knows what the future may bring – one can only dream of the possibilities!

 


Figures & Videos

Figure 1
Simplified schematic of the artificial neural network used by ScanNav Anatomy Peripheral Nerve Block

Figure 2
ScanNav Anatomy Peripheral Nerve Block

Figure 3
Examples of the AI colour overlay from ScanNav Anatomy Peripheral Nerve Block. ALM, adductor longus muscle; AS, anterior scalene; BPN, brachial plexus nerves (trunks/divisions); CPN, common peroneal (fibular) nerve; CTf, fascia overlying conjoint tendon; C5, C5 nerve root; C6, C6 nerve root; DCIA, deep circumflex iliac artery; ESM, erector spinae muscle group (and overlying muscles); FA, femoral artery; FI, fascia iliaca; H, humerus; I, ilium; IM, iliacus/iliopsoas muscle; McN, musculocutaneous nerve; MN, median nerve; MS, middle scalene; Pe, peritoneum and contents; Pl, pleura; R, first rib; RA, rectus abdominis muscle; RN, radial nerve; RSa, anterior layer of rectus sheath; RSp, posterior layer of rectus sheath; SaN, saphenous nerve/nerve complex; ScA, subclavian artery; SCM, sternocleidomastoid muscle; SM, sartorius muscle; TN, tibial nerve; TP, transverse process; UN, ulnar nerve; UT, upper trunk of the brachial plexus

 

Supplementary Video
A video showing AI-based highlighting for ultrasound scans of the axillary level brachial plexus, erector spinae plane, and adductor canal block regions.


References

  1. https://www.bbc.com/future/article/20150727-what-are-the-limits-of-human-vision, Accessed 8th November 2022.
  2. https://www.popsci.com/humans-can-only-distinguish-between-about-30-shades-gray/, Accessed 8th November 2022.
  3. Bowness, J. and A. Taylor, Ultrasound-Guided Regional Anaesthesia: Visualising the Nerve and Needle. Adv Exp Med Biol, 2020. 1235: p. 19-34.
  4. Ting, P.L. and V. Sivagnanaratnam, Ultrasonographic study of the spread of local anaesthetic during axillary brachial plexus block. Br J Anaesth, 1989. 63(3): p. 326-9.
  5. Neal, J.M., R. Brull, J.L. Horn, et al., The Second American Society of Regional Anesthesia and Pain Medicine Evidence-Based Medicine Assessment of Ultrasound-Guided Regional Anesthesia: Executive Summary. Reg Anesth Pain Med, 2016. 41(2): p. 181-94.
  6. Henderson, M. and J. Dolan, Challenges, solutions, and advances in ultrasound-guided regional anaesthesia. BJA Education, 2016. 16(11): p. 374-380.
  7. Turbitt, L.R., E.R. Mariano, and K. El-Boghdadly, Future directions in regional anaesthesia: not just for the cognoscenti. Anaesthesia, 2020. 75(3): p. 293-297.
  8. Marhofer, P. and G. Fritsch, Safe performance of peripheral regional anaesthesia: the significance of ultrasound guidance. Anaesthesia, 2017. 72(4): p. 431-434.
  9. Bowness, J., K. El-Boghdadly, and D. Burckett-St Laurent, Artificial intelligence for image interpretation in ultrasound-guided regional anaesthesia. Anaesthesia, 2021. 76(5): p. 602-607.
  10. Bowness, J., K. Turnbull, A. Taylor, et al., Identifying variant anatomy during ultrasound-guided regional anaesthesia: opportunities for clinical improvement. Br J Anaesth, 2019. 122(5): p. e75-e77.
  11. Bowness, J.S., K. El-Boghdadly, G. Woodworth, J.A. Noble, H. Higham, and D. Burckett-St Laurent, Exploring the utility of assistive artificial intelligence for ultrasound scanning in regional anesthesia. Reg Anesth Pain Med, 2022. 47(6): p. 375-379.
  12. Bowness, J.S., A. Pawa, L. Turbitt, et al., International consensus on anatomical structures to identify on ultrasound for the performance of basic blocks in ultrasound-guided regional anesthesia. Reg Anesth Pain Med, 2022. 47(2): p. 106-112.
  13. El-Boghdadly, K., M. Wolmarans, A.D. Stengel, et al., Standardizing nomenclature in regional anesthesia: an ASRA-ESRA Delphi consensus study of abdominal wall, paraspinal, and chest wall blocks. Reg Anesth Pain Med, 2021. 46(7): p. 571-580.
  14. O’Donnell, B.D. and F. Loughnane, Novel nerve imaging and regional anesthesia, bio-impedance and the future. Best Pract Res Clin Anaesthesiol, 2019. 33(1): p. 23-35.
  15. Drukker, L., J.A. Noble, and A.T. Papageorghiou, Introduction to artificial intelligence in ultrasound imaging in obstetrics and gynecology. Ultrasound Obstet Gynecol, 2020. 56(4): p. 498-505.
  16. Bowness, J., O. Varsou, L. Turbitt, and D. Burckett-St Laurent, Identifying anatomical structures on ultrasound: assistive artificial intelligence in ultrasound-guided regional anesthesia. Clin Anat, 2021. 34(5): p. 802-809.
  17. Bowness, J.S., D. Burckett-St Laurent, N. Hernandez, et al., Assistive artificial intelligence for ultrasound image interpretation in regional anaesthesia: an external validation study. Br J Anaesth, 2022.
  18. Gungor, I., B. Gunaydin, S.O. Oktar, et al., A real-time anatomy identification via tool based on artificial intelligence for ultrasound-guided peripheral nerve block procedures: an accuracy study. J Anesth, 2021. 35(4): p. 591-594.
  19. Link unavailable, Accessed 25.12.2021.
  20. https://www.gehealthcare.com/-/jssmedia/gehc/us/images/products/ultrasound/venue/republish/regional-anesthesia/brochure-regional-anesthesia-pocus-venue-family-jb20273xx.pdf?rev=-1, Accessed 8th November 2022.
  21. Lee, H.C., H.G. Ryu, E.J. Chung, and C.W. Jung, Prediction of Bispectral Index during Target-controlled Infusion of Propofol and Remifentanil: A Deep Learning Approach. Anesthesiology, 2018. 128(3): p. 492-501.
  22. Gabriel, R.A., B. Harjai, R.S. Prasad, et al., Machine learning approach to predicting persistent opioid use following lower extremity joint arthroplasty. Reg Anesth Pain Med, 2022. 47: p. 313-319.
  23. Hemmerling, T.M., R. Taddei, M. Wehbe, C. Zaouter, S. Cyr, and J. Morse, First robotic tracheal intubations in humans using the Kepler intubation system. Br J Anaesth, 2012. 108(6): p. 1011-6.
Topics: Artificial Intelligence, Ultrasound-guidance