Declaration of Interest
JSB is a Senior Clinical Advisor to Intelligent Ultrasound and reports receiving research funding and honoraria from the company.
The authors would like to thank Patrick Cullum (Intelligent Ultrasound, Cardiff, UK) for his help in producing the figures and video.
The human eye can distinguish around a million different colours,1 but only around 30 shades of grey,2 so why do we persist with viewing medical images in greyscale?
Regional anaesthesia has traditionally been performed using anatomical landmarks to identify underlying structures, supplemented by information gathered from patient symptoms (e.g., paraesthesia), to guide needle placement and injection. Electrical nerve stimulation was later introduced to elicit a motor or sensory response and improve nerve identification.3 Ultrasound image guidance, first described in 1989,4 now forms the basis of practice for most regional anaesthesia.5 Ultrasound-guided regional anaesthesia (UGRA) revolutionised peripheral nerve blockade, with improved success rates, faster onset, and reduced rates of complications including vascular injury and local anaesthetic systemic toxicity.5, 6
Challenges in Ultrasound Guidance
Despite the known advantages of UGRA, inequality of patient access to such techniques persists, based to some extent on the availability of an anaesthetist with the required specialist skills.7 UGRA is undoubtedly operator-dependent,8 and the fundamental skills required for these techniques, in particular ultrasound scanning and needle-probe manipulation, contribute to this. The ability to acquire and interpret optimal sonographic images takes many years of training to develop, and remains a barrier to successful delivery of UGRA.9 Unfortunately, the patients who stand to gain most from UGRA are often those in whom ultrasound scanning is most difficult – the obese, those with previous surgery or trauma, and patients with co-morbidities whose complications grossly alter the sonographic image (e.g., oedema). In addition, interpretation may vary even amongst specialist regional anaesthetists viewing the same ultrasound images.10, 11
Recent initiatives have aimed to improve delivery of UGRA through standardisation of practice.7, 12, 13 If these are to be successful, we must consider new technologies to augment ultrasound scanning such as bio-impedance needling14 and artificial intelligence (AI) image interpretation.9
Artificial intelligence is a field that enables computers to perform tasks usually associated with human intelligence.15 Machine learning (ML) is a branch of AI that allows computers to learn, i.e., to improve performance of a given task with increased experience.9 One of the most common strategies in ML is deep learning (DL), which employs an artificial neural network to imitate the neural networks of the human brain and is particularly suited to image recognition and analysis. The artificial neurons are arranged in layers that process data sequentially to produce a fine-grained interpretation of the image (Figure 1). When training a DL neural network, data are presented to the network and statistical associations are formed between the input data and the desired outcome (e.g., classifying an image as ‘dog’ or ‘cat’). Over time, the system becomes adept at differentiating between images or features within an image such that, when deployed, it can autonomously distinguish between different classes in the data (e.g., dog vs cat).
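The principle of learning from labelled examples can be sketched in miniature. The illustration below is not drawn from any clinical system: it uses synthetic two-dimensional data and a single artificial ‘neuron’ (a logistic unit) rather than a full deep network, but it shows the same mechanism – weights are adjusted iteratively so that classification accuracy improves with experience:

```python
import math
import random

# Illustrative only: one artificial neuron trained to separate two
# synthetic classes of 2-D points. Deep learning stacks many such
# units in layers, but the training principle is the same.

random.seed(0)

# Synthetic training set: class 0 clusters near (0, 0), class 1 near (2, 2).
data = [([random.gauss(0, 0.5), random.gauss(0, 0.5)], 0) for _ in range(50)] + \
       [([random.gauss(2, 0.5), random.gauss(2, 0.5)], 1) for _ in range(50)]

w = [0.0, 0.0]  # weights of the neuron
b = 0.0         # bias term

def predict(x):
    z = w[0] * x[0] + w[1] * x[1] + b
    z = max(-60.0, min(60.0, z))        # clamp to avoid overflow in exp()
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid activation

def accuracy():
    return sum((predict(x) > 0.5) == (y == 1) for x, y in data) / len(data)

before = accuracy()
for _ in range(200):                    # repeated 'experience' (epochs)
    for x, y in data:
        err = predict(x) - y            # gradient of the log-loss
        w[0] -= 0.1 * err * x[0]        # nudge weights to reduce error
        w[1] -= 0.1 * err * x[1]
        b    -= 0.1 * err

after = accuracy()
print(f"accuracy before training: {before:.2f}, after: {after:.2f}")
```

With the untrained weights the unit performs at chance; after repeated presentation of the labelled examples it separates the two classes reliably, mirroring the ‘improved performance with increased experience’ that defines machine learning.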
Artificial intelligence for UGRA has been attempted before – one of the authors worked on a forerunner project which aimed to develop reliable nerve identification with adaptive learning (https://rasimas.plri.de). Although that project failed to achieve its objectives, such systems may benefit UGRA in both ultrasound scanning and image interpretation, with the aim of making UGRA more accessible for clinicians and patients. ScanNavTM Anatomy Peripheral Nerve Block (Intelligent Ultrasound, Cardiff, UK) is one such system (Figure 2). It uses DL to produce a colour overlay on real-time B-mode ultrasound, which aims to draw the user’s attentional gaze to the structure(s) of interest (Figure 3 & Supplementary Video). Early evidence suggests that the device may help experts in teaching UGRA and non-experts in learning or performing UGRA.11 The colour overlay aids non-experts in acquiring optimal ultrasound images and correctly identifying structures on those images.16 In one study, experts viewed 720 ultrasound scans and reported the highlighting to be accurate for 93.5% (1519/1624) of structures. In addition, the experts concluded, in their subjective opinion, that the highlighting would reduce the risk of complications of UGRA in 62.9–86.2% of scans viewed and was likely to reduce the incidence of block failure in 81.2%.17 The device gained regulatory approval for clinical use in Europe in April 2021 and is currently under review for similar approval in the USA. Other systems with similar functions include Nerveblox (Smart Alfa Teknoloji San. Ve Tic AS, Ankara, Turkey),18 NerveTrack (Samsung Medison, Seoul, South Korea),19 and cNerve (GE Healthcare, Chicago, USA).20
Naturally, there is potential for error when using AI systems. Complications may arise from device performance or from operator dependence on the technology in place of acquiring the required procedural knowledge. Expectations of AI are often exceedingly high, and some find initial limitations disappointing; however, new systems will emerge and existing ones are likely to improve. As in other areas of clinical practice, current technology should be used to provide the operator with additional information, rather than acting as the decision maker in the process of UGRA.6
AI Beyond Ultrasound and Anaesthesia
Artificial intelligence is likely to benefit other applications in anaesthesia, such as closed-loop feedback systems for propofol sedation,21 prediction of patient outcomes,22 and potentially incorporation within robotic systems being designed for practical skills such as tracheal intubation.23 In some specialties, such as emergency medicine, doctors are often familiar with point-of-care ultrasound but perform UGRA infrequently. AI support may enable such doctors to develop their skills in a standardised manner, giving patients faster and safer intervention and improving outcomes.
We are practising UGRA in a time of rapid technological advancement, so why do we limit our own practice with potentially outdated technology such as greyscale images? As ultrasound machines evolve, with embedded high-performance AI image analysis, who knows what the future may bring – one can only dream of the possibilities!
Simplified schematic of the artificial neural network used by ScanNav Anatomy Peripheral Nerve Block
ScanNav Anatomy Peripheral Nerve Block
Examples of the AI colour overlay from ScanNav Anatomy Peripheral Nerve Block. ALM, adductor longus muscle; AS, anterior scalene; BPN, brachial plexus nerves (trunks/divisions); CPN, common peroneal (fibular) nerve; CTf, fascia overlying conjoint tendon; C5, C5 nerve root; C6, C6 nerve root; DCIA, deep circumflex iliac artery; ESM, erector spinae muscle group (and overlying muscles); FA, femoral artery; FI, fascia iliaca; H, humerus; I, ilium; IM, iliacus/iliopsoas muscle; McN, musculocutaneous nerve; MN, median nerve; MS, middle scalene; Pe, peritoneum and contents; Pl, pleura; R, first rib; RA, rectus abdominis muscle; RN, radial nerve; RSa, anterior layer of rectus sheath; RSp, posterior layer of rectus sheath; SaN, saphenous nerve/nerve complex; ScA, subclavian artery; SCM, sternocleidomastoid muscle; SM, sartorius muscle; TN, tibial nerve; TP, transverse process; UN, ulnar nerve; UT, upper trunk of the brachial plexus
A video showing AI-based highlighting for ultrasound scans of the axillary level brachial plexus, erector spinae plane, and adductor canal block regions.