Development and Validation of a Deep Learning System for Sound-based Prediction of Urinary Flow
- Source :
- European Urology Focus
- Publication Year :
- 2022
Abstract
- Background: Uroflowmetry remains an important tool for the assessment of patients with lower urinary tract symptoms (LUTS), but its accuracy can be limited by within-subject variation of urinary flow rates. Voiding acoustics appear to correlate well with conventional uroflowmetry and show promise as a convenient home-based alternative for the monitoring of urinary flows.
- Objective: To evaluate the ability of a sound-based deep learning algorithm (Audioflow) to predict uroflowmetry parameters and identify abnormal urinary flow patterns.
- Design, setting, and participants: In this prospective open-label study, 534 male participants recruited at Singapore General Hospital between December 1, 2017 and July 1, 2019 voided into a uroflowmetry machine, and voiding acoustics were recorded using a smartphone in close proximity. The Audioflow algorithm consisted of two models: the first for the prediction of flow parameters including maximum flow rate (Qmax), and the second for the identification of abnormal urinary flow patterns. Lin's concordance correlation coefficient was used to evaluate the agreement between Audioflow predictions and conventional uroflowmetry for Qmax […].
- Results: A total of 331 patients were included for analysis. Agreement between Audioflow and conventional uroflowmetry for Qmax […].
- Conclusions: The results of this study suggest that a deep learning algorithm can predict uroflowmetry parameters and identify abnormal urinary voids based on voiding sounds, and it shows promise as a simple home-based alternative to uroflowmetry in the management of patients with LUTS.
- Patient summary: In this study, we trained a deep learning-based algorithm to measure urinary flow rates and identify abnormal flow patterns based on voiding sounds. This may provide a convenient, home-based alternative to conventional uroflowmetry for the assessment and monitoring of patients with lower urinary tract symptoms.
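The abstract describes Audioflow as two models, one regressing flow parameters such as Qmax from the voiding sound and one flagging abnormal flow patterns, but gives no architectural detail. The sketch below is a minimal illustration of such a two-model setup, assuming a log-mel-spectrogram input and a small CNN; every layer, size, and name here (small_cnn, flow_regressor, flow_pattern_classifier, the 64-band input) is a hypothetical stand-in, not the authors' implementation.

```python
# Hypothetical two-model pipeline in the spirit of Audioflow (architecture
# assumed, not taken from the paper).
import torch
import torch.nn as nn

def small_cnn(out_dim: int) -> nn.Module:
    # Small convolutional trunk over a (batch, 1, n_mels, time) spectrogram,
    # pooled to a fixed-size vector and mapped to out_dim outputs.
    return nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, out_dim),
    )

# Model 1 (assumed): regresses a flow parameter, e.g. Qmax in ml/s.
flow_regressor = small_cnn(out_dim=1)
# Model 2 (assumed): emits a logit for an abnormal vs. normal flow pattern.
flow_pattern_classifier = small_cnn(out_dim=1)

# Smoke test on a dummy log-mel spectrogram (batch=4, 64 mel bands, 128 frames).
spec = torch.randn(4, 1, 64, 128)
qmax_pred = flow_regressor(spec)                          # (4, 1) flow estimates
p_abnormal = torch.sigmoid(flow_pattern_classifier(spec)) # (4, 1) probabilities
print(qmax_pred.shape, p_abnormal.shape)
```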
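The agreement statistic named in the abstract, Lin's concordance correlation coefficient, penalizes both poor correlation and systematic bias between predictions and reference measurements: rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2). A minimal sketch of its computation follows; the function name and the sample Qmax values are illustrative, not data from the study.

```python
# Lin's concordance correlation coefficient between predicted and measured
# values (sample values below are made up for illustration).
import numpy as np

def lins_ccc(x, y):
    """rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))  # population covariance
    return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical example: predicted vs. measured Qmax (ml/s).
pred = [18.2, 9.5, 22.1, 14.8]
meas = [17.6, 10.2, 21.4, 15.5]
print(round(lins_ccc(pred, meas), 3))  # 1.0 means perfect concordance
```

Unlike Pearson's r, this statistic drops below 1 even for perfectly correlated predictions if they are shifted or scaled relative to the reference, which is why it suits method-agreement studies like this one.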
- Subjects :
- Urology
Details
- ISSN :
- 2405-4569
- Database :
- OpenAIRE
- Journal :
- European Urology Focus
- Accession number :
- edsair.doi.dedup.....761f93def47811d81fab77190cb7ab81