EAU22 Press release: Machine learning goes with the flow
Under strict embargo: 01:00 CEST, Friday 01 July
An artificial intelligence (AI) algorithm trained to listen to patients passing urine can identify abnormal flows and could be a useful and cost-effective means of monitoring and managing urology patients at home. It is presented today at the European Association of Urology Annual Congress (EAU22) in Amsterdam.
The deep learning tool, Audioflow, performed almost as well as a specialist machine used in clinics and achieved results similar to those of urology residents in assessing urinary flow. The current study used sound recorded in a soundproofed environment, but the ambition is to create an app so patients can monitor themselves at home.
Lower urinary tract symptoms, problems related to the working of the bladder and the urethra, are common and affect an estimated 60% of men and 57% of women.
Uroflowmetry is an important tool for the assessment of patients with these symptoms, but it requires patients to attend outpatient visits and urinate into a funnel connected to a uroflowmeter, which records information about the flow. During the COVID-19 pandemic, access to clinics has been restricted, and even where patients can attend, the test can take a long time, with queues to use a single machine.
Dr Lee Han Jie, Prof. Ng Lay Guat and colleagues at Singapore General Hospital collaborated with colleagues in the engineering department of Singapore University of Technology and Design to develop an algorithm and recruited 534 male participants between December 2017 and July 2019 to train and validate it. Participants used the usual uroflowmetry machine in a soundproofed room and recorded their urination using a smartphone.
Using 220 recordings, the AI learned to estimate flow rate, volume, and time, which can indicate when there is an obstruction or the bladder is not working well. It was trained to listen to and analyse male urinary flow, which differs from that of women; a separate sample would be needed to learn to analyse female urination.
Results were compared to a conventional uroflowmetry machine and to a panel of six urology residents who separately graded the dataset. The AI agreed with conventional uroflowmetry for over 80% of recordings, and for identification of abnormal flows it achieved an 84% rate of agreement with the specialist urologists, comparable to the external panel of residents.
Dr Lee says: “There is a trend towards using machine learning in many fields, because clinicians do not have a lot of time. At the same time, particularly since the pandemic there is a shift towards telemedicine and less hospital-based care. We were keen to develop a way to monitor our patients to see how they are doing between hospital visits.”
“Our AI can outperform some non-experts and comes close to senior consultants,” he continues. “But the real benefit is having the equivalent of a consultant in the bathroom with you, every time you go. We are now working towards the algorithm being able to work when there is background noise in the normal home environment and this will make the true difference for patients.”
Audioflow will now be rolled out as a smartphone app via primary care physicians so it can be tested in the real world and learn from different datasets in different noise environments.
Christian Gratzke, Professor of Urology at University Hospital Freiburg (DE) and member of the EAU22 Scientific Congress Committee of Urology says: “Giving patients the ability to measure urinary flow at home is more comfortable for them and reduces time waiting in the clinic. This is a well-executed study with a significant number of patients and represents a promising approach to developing a portable app that can be used at home. I look forward to seeing the real-world results.”
Notes to editors:
Europe’s biggest urology congress will take place from 1-4 July 2022 in Amsterdam, The Netherlands. With nearly 1,300 abstracts presented and moderated live, the 37th Annual Congress of the European Association of Urology (EAU22) will be amongst Europe’s biggest medical congresses in 2022.
Clinicians, scientists, and patients will meet to discuss topics such as:
• Prostate cancer: new developments to improve treatments of the most common male cancer
• Urinary incontinence: a growing concern for the elderly population
• Practice changing treatments for both bladder and kidney cancer
• Prevention and treatment of urinary stones; 1 in 10 people (55 million adults in Europe) will form a stone at some point
• Special track for representatives of patient advocacy groups on Monday 4 July
…and many other conditions related to the male and female urinary tract system and male reproductive organs. Review the full scientific programme on the congress website.
Ruth Francis, Campus PR
Tel: +44 7968 262273
The abstract, Development and validation of a deep learning system for sound-based prediction of urinary flow, is presented at the European Association of Urology Annual Congress (EAU22) in Amsterdam on Sunday 3 July 2022.
A0845: Development and validation of a deep learning system for sound-based prediction of urinary flow
Introduction & Objectives
Uroflowmetry is an important tool for the assessment of patients with lower urinary tract symptoms (LUTS), but patients have to void unnaturally during outpatient visits, and accuracy may be limited by within-subject variation of urinary flow rates. Voiding acoustics appear to correlate well with uroflowmetry and show promise as a convenient home-based alternative for repeated monitoring of urinary flows. We aimed to evaluate the ability of a sound-based deep learning algorithm (Audioflow) to predict uroflowmetry parameters, generate flow patterns, and identify abnormal flow patterns.
Materials & Methods
In this prospective open-label study, 534 male participants were recruited at Singapore General Hospital between December 1, 2017 and July 1, 2019 for training and validation of the Audioflow algorithm. Patients voided into a uroflowmetry machine, and voiding acoustics were recorded using a smartphone in close proximity. Suboptimal recordings were excluded. The algorithm was trained using 220 paired uroflowmetry traces and voiding sounds, as well as a reference standard for normal/abnormal urinary flows created by two functional urologists. For validation, agreement between Audioflow's predictions of maximum flow rate (Qmax), average flow rate (Qave), voided volume (VV), and flow time (FT) and conventional uroflowmetry was evaluated. Sensitivity, specificity, and area under the receiver operating characteristic curve (AUC) were determined for the algorithm's ability to identify an abnormal urinary flow, and compared to a panel of six urology residents who separately graded the validation dataset.
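As an illustration of the evaluation metrics named above, the following is a minimal Python sketch, not the study's own code; the function name and the 0/1 label encoding (0 = normal flow, 1 = abnormal flow) are assumptions for the example:

```python
def binary_metrics(y_true, y_pred):
    """Sensitivity, specificity and overall agreement for binary
    normal (0) / abnormal (1) flow labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)   # abnormal flows correctly flagged
    specificity = tn / (tn + fp)   # normal flows correctly passed
    agreement = (tp + tn) / len(y_true)  # overall rate of agreement
    return sensitivity, specificity, agreement
```

The "rate of agreement" reported in the results corresponds to the `agreement` term here: the fraction of recordings where the algorithm's normal/abnormal grade matched the reference standard.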
Results
In the validation dataset (n=111), agreement between Audioflow and conventional uroflowmetry for Qmax, Qave, VV and FT was 0.80 (95% CI, 0.70–0.78), 0.81 (95% CI, 0.73–0.87), 0.86 (95% CI, 0.78–0.91), and 0.96 (95% CI, 0.94–0.98) respectively. Urinary flow patterns generated by the algorithm showed a high degree of closeness with conventional uroflowmetry, with a Fréchet distance of 3.89. For identification of abnormal flows, Audioflow achieved a high rate of agreement of 83.8% (95% CI: 77.5–90.1%) with the reference standard set by specialist urologists, which was comparable to that of the external panel of residents at 83.0% (95% CI: 76.9–90.3%). AUC was 0.892 (95% CI: 0.834–0.951), with a high sensitivity of 87.3% (95% CI: 76.8–93.7%) and specificity of 77.5% (95% CI: 61.1–88.6%).
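The Fréchet distance reported above quantifies how closely two curves track each other. As an illustration only (the abstract does not describe the authors' implementation), here is a sketch of the standard discrete variant via the Eiter–Mannila dynamic programme, assuming each flow curve is given as a list of (time, flow rate) points:

```python
import math

def discrete_frechet(p, q):
    """Discrete Fréchet distance between two polylines p and q,
    each a list of (x, y) points, via dynamic programming."""
    n, m = len(p), len(q)
    ca = [[0.0] * m for _ in range(n)]  # ca[i][j]: distance for prefixes p[:i+1], q[:j+1]
    ca[0][0] = math.dist(p[0], q[0])
    for i in range(1, n):  # first column: walk along p while staying at q[0]
        ca[i][0] = max(ca[i - 1][0], math.dist(p[i], q[0]))
    for j in range(1, m):  # first row: walk along q while staying at p[0]
        ca[0][j] = max(ca[0][j - 1], math.dist(p[0], q[j]))
    for i in range(1, n):
        for j in range(1, m):
            # advance along p, along q, or both, whichever keeps the leash shortest
            ca[i][j] = max(min(ca[i - 1][j], ca[i - 1][j - 1], ca[i][j - 1]),
                           math.dist(p[i], q[j]))
    return ca[n - 1][m - 1]
```

A smaller value means the algorithm-generated flow pattern stays closer to the conventional uroflowmetry trace at every point along the curve; identical curves give a distance of zero.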
Conclusions
The Audioflow algorithm is not only able to predict urinary flow parameters and generate flow patterns with a high degree of agreement with conventional uroflowmetry, but also achieves performance similar to urology residents in the identification of abnormal urinary flows. It can potentially provide a simple, cost-effective and repeatable means of monitoring urinary flow rates to enhance the management of LUTS. Further work is necessary to assess the clinical effect of the algorithm in the routine care of patients.