93 results for "Lee, John Aldo"
Search Results
2. Trapping of carvacrol by konjac glucomannan-potato starch gels: Stability from macroscopic to microscopic scale, using image processing
- Author
Lafarge, Céline, Journaux, Ludovic, Bonnotte, Aline, Lherminier, Jeannine, Lee, John Aldo, Le Bail, Patricia, and Cayot, Nathalie
- Published
- 2017
3. Semi-supervised t-SNE with multi-scale neighborhood preservation
- Author
Walter Serna-Serna, De Bodt, Cyril, Andres M. Alvarez-Meza, Lee, John Aldo, Verleysen, Michel, Alvaro A. Orozco-Gutierrez, UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique, Universidad Tecnológica de Pereira - Automatic Research Group, and Universidad Nacional de Colombia - Signal Processing and Recognition Group
- Subjects
Stochastic neighbor embedding, Neighborhood preservation, Data visualization, Semi-supervised learning, Dimensionality reduction
- Abstract
Unsupervised dimensionality reduction (DR) aims to preserve input data structure in a low-dimensional (LD) space based on neighborhood information. In contrast, supervised DR intends to improve the learning performance, i.e., classification and regression, in an LD representation. Unfortunately, obtaining the complete label outputs of a data set for real-world applications is hard. Here, we introduce a novel DR framework coupling both available class labels and input feature similarities to extend the well-known t-distributed Stochastic Neighbor Embedding (t-SNE) to semi-supervised scenarios. Our proposal, termed Semi-Supervised t-SNE (SS.t-SNE), properly fixes the widths of Gaussian neighborhoods to reveal the salient local and global data structures in an LD space. Indeed, our approach is presented as a generalization of the unsupervised and supervised versions of t-SNE. SS.t-SNE outperforms other semi-supervised DR methods in data visualization and classification tasks in LD embeddings. (See the sketch after this entry.)
- Published
- 2023
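As context for the entry above: SS.t-SNE generalizes the way t-SNE sets each Gaussian neighborhood width from a user-chosen perplexity. The following minimal Python sketch (illustrative only, not the authors' code; `gaussian_width` and the toy data are invented here) shows that standard perplexity-based binary search on the precision beta = 1/(2*sigma^2):

```python
# Illustrative sketch of the standard unsupervised t-SNE ingredient that
# SS.t-SNE generalizes: binary-search each Gaussian precision so that the
# neighborhood entropy matches a target perplexity.
import numpy as np

def gaussian_width(sq_dists, perplexity=30.0, tol=1e-5, max_iter=50):
    """Binary-search the precision beta = 1/(2*sigma^2) for one datum."""
    target_entropy = np.log(perplexity)
    beta, beta_min, beta_max = 1.0, 0.0, np.inf
    for _ in range(max_iter):
        p = np.exp(-beta * sq_dists)
        p_sum = p.sum()
        entropy = np.log(p_sum) + beta * (sq_dists * p).sum() / p_sum
        if abs(entropy - target_entropy) < tol:
            break
        if entropy > target_entropy:   # neighborhood too wide: increase beta
            beta_min, beta = beta, beta * 2 if np.isinf(beta_max) else (beta + beta_max) / 2
        else:                          # neighborhood too narrow: decrease beta
            beta_max, beta = beta, (beta + beta_min) / 2
    return p / p_sum                   # conditional similarities p_{j|i}

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
d2 = ((X[1:] - X[0]) ** 2).sum(axis=1)   # squared distances from point 0 to the rest
print(gaussian_width(d2, perplexity=20.0).shape)
```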
4. An individualized radiation dose escalation trial in non-small cell lung cancer based on FDG-PET imaging
- Author
Wanet, Marie, Delor, Antoine, Hanin, François-Xavier, Ghaye, Benoît, Van Maanen, Aline, Remouchamps, Vincent, Clermont, Christian, Goossens, Samuel, Lee, John Aldo, Janssens, Guillaume, Bol, Anne, and Geets, Xavier
- Published
- 2017
5. Comparing dynamics of fluency and inter-limb coordination in climbing activities using multi-scale Jensen–Shannon embedding and clustering
- Author
Herault, Romain, Orth, Dominic, Seifert, Ludovic, Boulanger, Jeremie, and Lee, John Aldo
- Published
- 2017
6. Validation of the mid-position strategy for lung tumors in helical TomoTherapy
- Author
Wanet, Marie, Sterpin, Edmond, Janssens, Guillaume, Delor, Antoine, Lee, John Aldo, and Geets, Xavier
- Published
- 2014
7. Performance of a hybrid Monte Carlo‐Pencil Beam dose algorithm for proton therapy inverse planning
- Author
Barragán Montero, Ana María, Souris, Kevin, Sanchez‐Parcerisa, Daniel, Sterpin, Edmond, and Lee, John Aldo
- Published
- 2018
8. Consistency in quality correction factors for ionization chamber dosimetry in scanned proton beam therapy
- Author
Sorriaux, Jefferson, Testa, Mauro, Paganetti, Harald, Bertrand, Damien, Lee, John Aldo, Palmans, Hugo, Vynckier, Stefaan, and Sterpin, Edmond
- Published
- 2017
9. Radiotherapy for head and neck tumours in 2012 and beyond: conformal, tailored, and adaptive?
- Author
Grégoire, Vincent, Jeraj, Robert, Lee, John Aldo, and O’Sullivan, Brian
- Published
- 2012
10. Gradient-based delineation of the primary GTV on FDG-PET in non-small cell lung cancer: A comparison with threshold-based approaches, CT and surgical specimens
- Author
Wanet, Marie, Lee, John Aldo, Weynand, Birgit, De Bast, Marc, Poncelet, Alain, Lacroix, Valérie, Coche, Emmanuel, Grégoire, Vincent, and Geets, Xavier
- Published
- 2011
11. Evaluation of the radiobiological impact of anatomic modifications during radiation therapy for head and neck cancer: Can we simply summate the dose?
- Author
Orban de Xivry, Jonathan, Castadot, Pierre, Janssens, Guillaume, Lee, John Aldo, Geets, Xavier, Grégoire, Vincent, and Macq, Benoît
- Published
- 2010
12. Assessment by a deformable registration method of the volumetric and positional changes of target volumes and organs at risk in pharyngo-laryngeal tumors treated with concomitant chemo-radiation
- Author
Castadot, Pierre, Geets, Xavier, Lee, John Aldo, Christian, Nicolas, and Grégoire, Vincent
- Published
- 2010
13. Evaluation of MVCT protocols for brain and head and neck tumor patients treated with helical tomotherapy
- Author
Vaandering, Aude, Lee, John Aldo, Renard, Laurette, and Grégoire, Vincent
- Published
- 2009
14. Comparison of 12 deformable registration strategies in adaptive radiation therapy for the treatment of head and neck tumors
- Author
Castadot, Pierre, Lee, John Aldo, Parraga, Adriane, Geets, Xavier, Macq, Benoît, and Grégoire, Vincent
- Published
- 2008
15. Nonlinear dimensionality reduction of data manifolds with essential loops
- Author
Lee, John Aldo and Verleysen, Michel
- Published
- 2005
16. Fast Multiscale Neighbor Embedding.
- Author
de Bodt, Cyril, Mulders, Dounia, Verleysen, Michel, and Lee, John Aldo
- Subjects
COST functions, DISTRIBUTION (Probability theory), NEIGHBORHOODS, BIG data, NEIGHBORS
- Abstract
Dimension reduction (DR) computes faithful low-dimensional (LD) representations of high-dimensional (HD) data. Outstanding performances are achieved by recent neighbor embedding (NE) algorithms such as t-SNE, which mitigate the curse of dimensionality. The single-scale or multiscale nature of NE schemes drives the HD neighborhood preservation in the LD space (LDS). While single-scale methods focus on single-sized neighborhoods through the concept of perplexity, multiscale ones preserve neighborhoods in a broader range of sizes and account for the global HD organization to define the LDS. For both single-scale and multiscale methods, however, their time complexity in the number of samples is unaffordable for big data sets. Single-scale methods can be accelerated by relying on the inherent sparsity of the HD similarities they involve. On the other hand, the dense structure of the multiscale HD similarities prevents developing fast multiscale schemes in a similar way. This article addresses this difficulty by designing randomized accelerations of the multiscale methods. To account for all levels of interactions, the HD data are first subsampled at different scales, making it possible to identify small and relevant neighbor sets for each data point thanks to vantage-point trees. Afterward, these sets are employed with a Barnes–Hut algorithm to cheaply evaluate the considered cost function and its gradient, enabling large-scale use of multiscale NE schemes. Extensive experiments demonstrate that the proposed accelerations are, statistically significantly, both faster than the original multiscale methods by orders of magnitude, and better at preserving the HD neighborhoods than state-of-the-art single-scale schemes, leading to high-quality LD embeddings. Public codes are freely available at https://github.com/cdebodt.
- Published
- 2022
17. Tensor factorization to extract patterns in multimodal EEG data
- Author
Mulders, Dounia, de Bodt, Cyril, Lejeune, Nicolas, Lee, John Aldo, Mouraux, André, Verleysen, Michel, European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique, and UCL - SSS/IONS/COSY - Systems & cognitive Neuroscience
- Subjects
Periodicity, Steady-states, Tensor factorization, Thermal perception, Canonical Polyadic Decomposition, EEG
- Abstract
Noisy multi-way data sets are ubiquitous in many domains. In neuroscience, electroencephalogram (EEG) data are recorded during periodic stimulation from different sensory modalities, leading to steady-state (SS) recordings with at least four ways: the channels, the time, the subjects and the modalities. Improving the signal-to-noise ratio (SNR) of the SS responses is crucial to enable their practical use. Supervised spatial filtering methods can be considered for this purpose to relevantly guide the extraction of specific activity patterns. Nevertheless, such approaches are difficult to validate with few subjects and can process at most two data ways simultaneously, the remaining ones being either averaged or considered independently despite their dependencies. This paper hence designs unsupervised tensor factorization models to enable the identification of meaningful underlying structures characterized in all ways of multimodal SS data. We show on EEG recordings from 15 subjects that such factorizations faithfully reveal consistent spatial topographies, time courses with enhanced SNR and subject variations of the periodic brain activity. (See the sketch after this entry.)
- Published
- 2019
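The entry above decomposes a 4-way EEG tensor. As an illustration only, here is a toy Canonical Polyadic Decomposition on random data, assuming the tensorly package; the tensor sizes are invented and nothing here reproduces the paper's models:

```python
# Minimal sketch of a rank-3 Canonical Polyadic Decomposition (CPD) on a toy
# "channels x time x subjects x modalities" tensor, using tensorly's parafac.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
# Toy EEG-like tensor: 32 channels, 200 time samples, 15 subjects, 2 modalities.
T = tl.tensor(rng.normal(size=(32, 200, 15, 2)))

weights, factors = parafac(T, rank=3, n_iter_max=100)   # CPTensor unpacks to (weights, factors)
for mode, F in zip(("channel", "time", "subject", "modality"), factors):
    print(mode, F.shape)   # one loading matrix per way, each with 3 components
```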
18. Class-aware t-SNE: cat-SNE
- Author
de Bodt, Cyril, Mulders, Dounia, Lopez Sanchez, Daniel, Verleysen, Michel, Lee, John Aldo, European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Neighborhood preservation, Neighbor embedding, Class labels, KNN accuracy, Dimensionality reduction, t-SNE
- Abstract
Stochastic Neighbor Embedding (SNE) and variants like t-distributed SNE are popular methods of unsupervised dimensionality reduction (DR) that deliver outstanding experimental results. Regular t-SNE is often used to visualize data with class labels in colored scatterplots, even if those labels are actually not involved in the DR process. This paper proposes a modification of t-SNE that employs class labels to adjust the widths of the Gaussian neighborhoods around each datum, instead of deriving those from a perplexity set by the user. The widths are fixed to concentrate a major fraction of the probability distribution around a datum on neighbors with the same class. This tends to shrink the bulk of the classes and to stretch their low-dimensional separation. Experimental results show that the proposed class-aware t-SNE (cat-SNE) outperforms regular t-SNE in KNN classification tasks carried out in the embedding. (See the sketch after this entry.)
- Published
- 2019
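A hedged sketch of the class-aware width rule the cat-SNE entry above describes: for each point, the Gaussian precision is increased until a fraction theta of the probability mass sits on same-class neighbors. The grid search and all names (`cat_sne_similarities`, the toy blobs) are illustrative assumptions, not the published implementation:

```python
# Toy class-aware neighborhood widths: shrink each sigma_i until at least a
# fraction theta of the Gaussian mass around point i lies on same-class neighbors.
import numpy as np

def cat_sne_similarities(X, y, theta=0.9, n_widths=50):
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    P = np.zeros((n, n))
    for i in range(n):
        mask = np.arange(n) != i
        same = (y[mask] == y[i])
        for beta in np.logspace(-3, 3, n_widths):      # precision grid instead of binary search
            p = np.exp(-beta * d2[i, mask])
            p /= p.sum()
            if p[same].sum() >= theta:                 # enough mass on same-class neighbors
                break
        P[i, mask] = p
    return P

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(4, 1, (30, 2))])
y = np.repeat([0, 1], 30)
P = cat_sne_similarities(X, y)
print(P.sum(axis=1)[:3])   # each row sums to 1
```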
19. Using planning CTs to enhance CNN-based bladder segmentation on Cone Beam CT
- Author
Brion, Eliott, Léger, Jean, Javaid, Umair, Lee, John Aldo, De Vleeschouwer, Christophe, Macq, Benoît, SPIE Medical Imaging 2019, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Segmentation, Cone Beam CT, Radiotherapy, Bladder, Convolutional Neural Networks
- Abstract
For prostate cancer patients, large organ deformations occurring between the sessions of a fractionated radiotherapy treatment lead to uncertainties in the doses delivered to the tumour and the surrounding organs at risk. The segmentation of those structures in cone beam CT (CBCT) volumes acquired before every treatment session is desired to reduce those uncertainties. In this work, we perform a fully automatic bladder segmentation of CBCT volumes with u-net, a 3D fully convolutional neural network (FCN). Since annotations are hard to collect for CBCT volumes, we consider augmenting the training dataset with annotated CT volumes and show that it improves the segmentation performance. Our network is trained and tested on 48 annotated CBCT volumes using a 6-fold cross-validation scheme. The network reaches a mean Dice similarity coefficient (DSC) of 0.801±0.137 with 32 training CBCT volumes. This result improves to 0.848±0.085 when the training set is augmented with 64 CT volumes. The segmentation accuracy increases with both the number of CBCT and CT volumes in the training set. As a comparison, the state-of-the-art deformable image registration (DIR) contour propagation between planning CT and daily CBCT available in RayStation reaches a DSC of 0.744±0.144 on the same dataset, which is below our FCN result. (See the sketch after this entry.)
- Published
- 2019
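The entry above reports Dice similarity coefficients (DSC). For reference, a minimal sketch of that metric on toy binary masks (the mask shapes are arbitrary):

```python
# Dice similarity coefficient: DSC = 2|A ∩ B| / (|A| + |B|), on boolean masks.
import numpy as np

def dice(pred, gt, eps=1e-8):
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + eps)

rng = np.random.default_rng(0)
gt = rng.random((64, 64, 32)) > 0.7          # toy 3D ground-truth mask
pred = gt.copy()
pred[:8] = ~pred[:8]                          # corrupt a slab to lower the overlap
print(round(dice(pred, gt), 3))
```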
20. Nonlinear projection with curvilinear distances: Isomap versus curvilinear distance analysis
- Author
Lee, John Aldo, Lendasse, Amaury, and Verleysen, Michel
- Published
- 2004
21. Introducing a probabilistic definition of the target in a robust treatment planning framework.
- Author
Buti, Gregory, Souris, Kevin, Montero, Ana Maria Barragán, Lee, John Aldo, and Sterpin, Edmond
- Subjects
ROBUST optimization, LUNGS, LUNG tumors, MATHEMATICAL optimization, PROTON therapy, STANDARD deviations
- Abstract
The 'clinical target distribution' (CTD) has recently been introduced as a promising alternative to the binary clinical target volume (CTV). However, a comprehensive study that considers the CTD together with geometric treatment uncertainties was lacking. Because the CTD is inherently a probabilistic concept, this study proposes a fully probabilistic approach that integrates the CTD directly in a robust treatment planning framework. First, the CTD is derived from a reported microscopic tumor infiltration model such that it explicitly features the probability of tumor cell presence in its target definition. Second, two probabilistic robust optimization methods are proposed that evaluate CTD coverage under uncertainty. The first method minimizes the expected value (EV) over the uncertainty scenarios, and the second method minimizes the sum of the expected value and standard deviation (EV-SD), thereby penalizing the spread of the objectives from the mean. Both EV and EV-SD methods introduce the CTD in the objective function by using weighting factors that represent the probability of tumor presence. The probabilistic methods are compared to a conventional worst-case approach that uses the CTV in a worst-case optimization algorithm. To evaluate the treatment plans, a scenario-based evaluation strategy is implemented that combines the effects of microscopic tumor infiltrations with the other geometric uncertainties. The methods are tested for five lung tumor patients, treated with intensity-modulated proton therapy. The results indicate that for the studied patient cases, the probabilistic methods favor the reduction of the esophagus dose but compensate by increasing the high-dose region in a low-conflicting organ such as the lung. These results show that a fully probabilistic approach has the potential to obtain clinical benefits when tumor infiltration uncertainties are taken into account directly in the treatment planning process. (See the sketch after this entry.)
- Published
- 2021
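A toy rendering of the two probabilistic objectives named in the entry above (EV and EV-SD), assuming precomputed per-scenario doses and a voxelwise tumor-presence probability; the array shapes, the quadratic penalty and all names are illustrative assumptions:

```python
# EV vs EV-SD on a toy CTD-weighted underdose cost, one value per scenario.
import numpy as np

rng = np.random.default_rng(0)
n_scen, n_vox = 20, 500
dose = rng.normal(60.0, 3.0, size=(n_scen, n_vox))   # dose [Gy] per scenario/voxel
p_tumor = rng.random(n_vox)                          # CTD: probability of tumor presence
prescription = 70.0

def underdose_cost(d):
    # quadratic underdose penalty, weighted by the probability of tumor presence
    return (p_tumor * np.maximum(prescription - d, 0.0) ** 2).mean()

costs = np.array([underdose_cost(d) for d in dose])  # one objective value per scenario
ev = costs.mean()                                    # EV: expected value over scenarios
ev_sd = ev + costs.std()                             # EV-SD: also penalize the spread
print(round(ev, 2), round(ev_sd, 2))
```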
22. Freeze-thaw stability of konjac glucomannan-potato starch gels: stability from macroscopic to microscopic scale, using image processing
- Author
Lafarge, Céline, Cayot, Nathalie, Ribourg, Lucie, Journaux, Ludovic, Bonnotte, Aline, Lherminier, Jeannine, Lee, John Aldo, and Le Bail, Patricia
- Subjects
Performance, [SDV]Life Sciences [q-bio], Freezing, food and beverages, Starch, Potato, Stability, Experimentation
- Abstract
Freeze-thaw (FT) stability is often used to assess the ability of a gel to withstand the damage induced by freezing; selected parameters such as drip loss, damage to structure, etc. can be used to assess the freeze tolerance of a gel. Konjac glucomannan (KGM) is a very specific hydrocolloid able to trap 100 times its weight in water; it has not been studied so far as an improver to enhance FT stability. The aim of the study was to show that the presence of a small quantity of konjac glucomannan (KGM) in potato starch suspension increased the stability of carvacrol antioxidant trapping. FT cycles were used to accelerate the ageing of the product and to assess its stability. In addition to drip loss determination, the stability of carvacrol trapping was evaluated by the quantification of carvacrol in the syneresis liquid. Microscopic and macroscopic scales were considered with microscopy. The moment of the addition of carvacrol and the presence of KGM both had an effect on the stability of carvacrol trapping and of the structure of the gel. KGM promoted amylose retrogradation but slowed down amylopectin retrogradation. The stability of potato starch gels can be improved by the addition of a small quantity of KGM, which showed a "cryoprotectant" behaviour. A new method to characterize the micro- and macrostructure from SEM image processing has also been proposed. The processing of microscopy images was done using Generalized Fourier Descriptors and allowed the characterization of each sample. The carvacrol addition lowered the physical stability of the gel, with larger pores and increased syneresis. On the contrary, the KGM addition increased the size of the pores but prevented the formation of very large pores and reduced syneresis. The most stable system was obtained by the addition of carvacrol at the end of heating, in a konjac glucomannan-potato starch gel.
- Published
- 2018
23. Compressive Sampling Approach for Image Acquisition with Lensless Endoscope
- Author
Guérit, Stéphanie, Sivankutty, Siddharth, Scotté, Camille, Lee, John Aldo, Rigneault, Hervé, Jacques, Laurent, international Traveling Workshop on Interactions between low-complexity data models and Sensing Techniques, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Compressive sampling, Biological imaging, Lensless endoscope, Inverse problem
- Abstract
The lensless endoscope is a promising device designed to image tissues in vivo at the cellular scale. The traditional acquisition setup consists of raster scanning, during which the focused light beam from the optical fiber sequentially illuminates each pixel of the field of view (FOV). The calibration step to focus the beam and the sampling scheme both take time. In this preliminary work, we propose a scanning method based on compressive sampling theory. The method does not rely on a focused beam but rather on the random illumination patterns generated by the single-mode fibers. Experiments are performed on synthetic data for different compression rates (from 10 to 100% of the FOV). (See the sketch after this entry.)
- Published
- 2018
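To make the compressive-sampling idea in the entry above concrete, here is a generic toy reconstruction: random patterns form a sensing matrix A, and a sparse signal is recovered from m < n measurements with plain ISTA. This is a standard textbook solver under invented dimensions, not the paper's acquisition model:

```python
# Toy compressive sampling: recover a k-sparse signal from m < n random
# measurements with ISTA (gradient step + soft-thresholding).
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8                       # pixels, measurements, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

A = rng.normal(0, 1.0 / np.sqrt(m), (m, n))   # random patterns, one per measurement
y = A @ x_true                                 # noiseless measurements

L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
lam, x = 0.01, np.zeros(n)
for _ in range(500):
    x = x - (A.T @ (A @ x - y)) / L            # gradient step on ||Ax - y||^2
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft-thresholding

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```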
24. Extensive assessment of Barnes-Hut t-SNE
- Author
de Bodt, Cyril, Mulders, Dounia, Verleysen, Michel, Lee, John Aldo, The European Symposium on Artificial Neural Networks, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Neighborhood preservation, Coranking, Barnes-Hut acceleration, Neighbor embedding, Dimensionality reduction, t-SNE
- Abstract
Stochastic Neighbor Embedding (SNE) and variants are dimensionality reduction (DR) methods able to foil the curse of dimensionality to deliver outstanding experimental results. Mitigating the crowding problem, t-SNE became an extremely popular DR scheme. Its quadratic time complexity in the number of samples is nevertheless unaffordable for big data sets. This motivates its Barnes-Hut (BH) acceleration for large-scale use. Although the latter is faster by orders of magnitude, few studies quantify its DR quality with respect to t-SNE. Extensive comparisons between t-SNE and its BH version are conducted using neighborhood preservation-based criteria. Both methods perform very similarly, suggesting the superiority of the BH scheme thanks to its reduced time complexity. (See the sketch after this entry.)
- Published
- 2018
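A small sketch of a neighborhood-preservation criterion of the kind used in the entry above: the average fraction of each point's K nearest high-dimensional neighbors that survive in the low-dimensional embedding (a single-scale Q_NX-style score). The linear "embedding" below is a crude stand-in for t-SNE output:

```python
# K-NN neighborhood preservation between a HD data set and its LD embedding.
import numpy as np

def knn_sets(X, K):
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)                # exclude self-neighbors
    return np.argsort(d, axis=1)[:, :K]

def q_nx(X_hd, X_ld, K=10):
    hd, ld = knn_sets(X_hd, K), knn_sets(X_ld, K)
    overlap = [len(set(h) & set(l)) for h, l in zip(hd, ld)]
    return np.mean(overlap) / K                # 1.0 = perfectly preserved K-NN sets

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
P = X @ rng.normal(size=(10, 2))               # crude linear "embedding" for the demo
print(round(q_nx(X, P, K=10), 3))
```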
25. Capturing Variabilities from Computed Tomography Images with Generative Adversarial Networks
- Author
Javaid, Umair, Lee, John Aldo, European Symposium on Artificial Neural Networks - ESANN'18, and UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie
- Abstract
With the advent of Deep Learning (DL) techniques, especially Generative Adversarial Networks (GANs), data augmentation and generation are quickly evolving domains that have raised much interest recently. However, DL techniques are data demanding and, since medical data are not easily accessible, they suffer from data insufficiency. To deal with this limitation, different data augmentation techniques are used. Here, we propose a novel unsupervised data-driven approach for data augmentation that can generate 2D Computed Tomography (CT) images using a simple GAN. The generated CT images have good global and local features of a real CT image and can be used to augment the training datasets for effective learning. In this proof-of-concept study, we show that our proposed solution using GANs is able to capture some of the global and local CT variabilities. Our network is able to generate visually realistic CT images, and we aim to further enhance its output by scaling it to a higher resolution and potentially from 2D to 3D.
- Published
- 2018
26. Perplexity-free t-SNE and twice Student tt-SNE
- Author
de Bodt, Cyril, Mulders, Dounia, Verleysen, Michel, Lee, John Aldo, The European Symposium on Artificial Neural Networks, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Heavy-tailed distributions, Perplexity-free, Multi-scale neighborhoods, Neighbor embedding, Dimensionality reduction, t-SNE
- Abstract
In dimensionality reduction and data visualisation, t-SNE has become a popular method. In this paper, we propose two variants to the Gaussian similarities used to characterise the neighbourhoods around each high-dimensional datum in t-SNE. A first alternative is to use t distributions like those already used in the low-dimensional embedding space; a variable degree of freedom accounts for the intrinsic dimensionality of data. The second variant relies on compounds of Gaussian neighbourhoods with growing widths, thereby suppressing the need for the user to adjust a single size or perplexity. In both cases, heavy-tailed distributions thus characterise the neighbourhood relationships in the data space. Experiments show that both variants are competitive with t-SNE, at no extra cost. (See the sketch after this entry.)
- Published
- 2018
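An illustrative take on the multi-scale compound of Gaussian neighbourhoods described in the entry above: similarities are averaged over neighborhood sizes that grow exponentially, removing the single perplexity parameter. The width rule and equal scale weights below are simplifying assumptions, not the published formulas:

```python
# Perplexity-free, multi-scale similarities: average normalized Gaussian
# similarities over exponentially growing neighborhood sizes.
import numpy as np

def multiscale_similarities(X, n_scales=5):
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                          # exclude self
    P = np.zeros((n, n))
    for s in range(n_scales):
        K = min(2 ** (s + 1), n - 1)                      # neighborhood size 2, 4, 8, ...
        sigma2 = np.sort(d2, axis=1)[:, :K].mean(axis=1)  # width from the K-NN distances
        Ps = np.exp(-d2 / (2.0 * sigma2[:, None]))
        Ps /= Ps.sum(axis=1, keepdims=True)
        P += Ps / n_scales                                # equal-weight compound of scales
    return P

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
print(multiscale_similarities(X).sum(axis=1)[:3])         # rows sum to 1
```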
27. Multi-organ Segmentation of Chest CT Images in Radiation Oncology: Comparison of Standard and Dilated UNet
- Author
Javaid, Umair, Dasnoy-Sumell, Damien, Lee, John Aldo, Advanced Concepts for Intelligent Vision Systems - ACIVS'18, and UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie
- Subjects
computed tomography, segmentation, multi-organ
- Abstract
Automatic delineation of organs at risk (OAR) in Computed Tomography (CT) images is a crucial step for treatment planning in radiation oncology. However, manual delineation of organs is a challenging and time-consuming task subject to inter-observer variability. Automatic organ delineation has been relying on non-rigid registrations and atlases. Lately, however, deep learning appears as a strong competitor, with specific architectures dedicated to image segmentation like UNet. In this paper, we first assess the standard UNet to delineate multiple organs in CT images. Second, we observe the effect of dilated convolutional layers in UNet to better capture the global context from the CT images and effectively learn the anatomy, which results in increased accuracy of organ delineation. We evaluate the performance of a standard UNet and a dilated UNet (with dilated convolutional layers) on four chest organs (esophagus, left lung, right lung, and spinal cord) from 29 lung image acquisitions, and observe that the dilated UNet delineates soft tissues, notably the esophagus and spinal cord, with higher accuracy than the standard UNet. We quantify the segmentation accuracy of both models by computing spatial overlap measures like the Dice similarity coefficient, recall & precision, and the Hausdorff distance. Compared to the standard UNet, the dilated UNet yields the best Dice scores for soft organs, whereas for the lungs both models have the same delineation accuracy: 0.84±0.07 vs 0.71±0.10 for the esophagus, 0.99±0.01 vs 0.99±0.01 for the left lung, 0.99±0.01 vs 0.99±0.01 for the right lung and 0.91±0.05 vs 0.88±0.04 for the spinal cord. (See the sketch after this entry.)
- Published
- 2018
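A minimal PyTorch sketch of the dilated convolutions discussed in the entry above: dilation enlarges the receptive field at constant parameter count and spatial resolution (layer sizes are arbitrary):

```python
# Standard vs dilated 3x3 convolution: same output size, same parameter count,
# but the dilated kernel covers a 5x5 receptive field.
import torch
import torch.nn as nn

x = torch.randn(1, 1, 64, 64)                                    # one CT-like slice
standard = nn.Conv2d(1, 8, kernel_size=3, padding=1)             # receptive field 3x3
dilated = nn.Conv2d(1, 8, kernel_size=3, padding=2, dilation=2)  # receptive field 5x5

print(standard(x).shape, dilated(x).shape)      # same spatial size: (1, 8, 64, 64)
print(sum(p.numel() for p in standard.parameters()),
      sum(p.numel() for p in dilated.parameters()))   # identical parameter counts
```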
28. Improvement of kilovoltage intrafraction monitoring accuracy through gantry angles selection.
- Author
Vander Veken, Loïc, Dechambre, David, Michiels, Steven, Cohilis, Marie, Souris, Kevin, Lee, John Aldo, and Geets, Xavier
- Published
- 2020
29. Towards fast and robust 4D optimization for moving tumors with scanned proton therapy.
- Author
Buti, Gregory, Souris, Kevin, Montero, Ana Maria Barragán, Lee, John Aldo, and Sterpin, Edmond
- Subjects
MONTE Carlo method, ROBUST optimization, PROTON therapy, LUNG cancer, TUMORS, SCHEDULING
- Abstract
Purpose: Robust optimization is becoming the gold standard for generating robust plans against various kinds of treatment uncertainties. Today, most robust optimization strategies use a pragmatic set of treatment scenarios (the so‐called uncertainty set) consisting of combinations of maximum errors of each considered uncertainty source (such as tumor motion, setup and image‐conversion errors). This approach presents two key issues. First, a subset of the considered scenarios is unnecessarily improbable, which could potentially compromise plan quality. Second, the resulting large uncertainty set leads to long plan computation times, which limits the potential of robust optimization as a standard clinical tool. In order to address these issues, a method is introduced which is able to preselect a limited set of relevant treatment error scenarios. Methods: Uncertainties due to systematic setup errors, image‐conversion errors and respiratory tumor motion are considered. A four‐dimensional (4D) equiprobability hypersurface is defined, which takes into account the joint probabilities of the above‐mentioned uncertainty sources. Only scenarios that lie on the predefined 4D hypersurface are considered, guaranteeing statistical consistency of the uncertainty set. In this regard, twelve scenarios are selected that cover maximum spatial displacements of the tumor during breathing. Subsequently, additional scenarios are considered (sampled from the aforementioned 4D hypersurface) in order to cover any estimated residual range errors. Two different scenario‐selection procedures were tested: (a) the maximum displacements (MD) method, which only considers twelve scaled maximum displacement scenarios, and (b) the maximum displacements and residual range (MDR) method which, in addition to the scaled maximum displacement scenarios, considers additional maximum range uncertainty scenarios. The methods were tested for five lung cancer patients by performing comprehensive Monte Carlo robustness evaluations. Results: A plan computation time gain of 78% is achieved by applying the MD method, whilst obtaining a target robustness of D95 larger than 95% of the prescribed dose for the worst‐case scenario. Additionally, the MD method has the potential to be fully automatic, which makes it a promising candidate for fast automatic planning workflows. The MDR method produced plans with excellent target robustness (D99 larger than 95% of the prescribed dose, even for the worst‐case scenario), whilst still obtaining a significant plan computation time gain of 57%. Conclusions: Two scenario‐selection procedures were developed which achieved significant reduction of plan computation time and memory consumption, without compromising plan quality or robustness. (See the sketch after this entry.)
- Published
- 2019
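A schematic toy version of the scenario-selection idea in the entry above: instead of combining maximum errors, scenarios are drawn on one equiprobability hypersurface of a Gaussian uncertainty model, i.e. at a fixed Mahalanobis distance from the nominal setting. The 4-axis error model and all numbers are invented for illustration:

```python
# Draw error scenarios on a Gaussian equiprobability hypersurface:
# random directions scaled to a fixed Mahalanobis radius.
import numpy as np

rng = np.random.default_rng(0)
sigma = np.array([2.0, 2.0, 2.0, 1.5])   # std-dev per error axis (mm, mm, mm, %range)
radius = 2.0                             # Mahalanobis radius of the hypersurface

def scenarios_on_hypersurface(n):
    u = rng.normal(size=(n, 4))
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # random directions on the unit sphere
    return radius * u * sigma                        # scale back to physical units

S = scenarios_on_hypersurface(12)
maha = np.linalg.norm(S / sigma, axis=1)             # all equal to the chosen radius
print(S.shape, np.allclose(maha, radius))
```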
30. Human-centered machine learning through interactive visualization
- Author
Sacha, Dominik, Sedlmair, Michael, Leishi Zhang, Lee, John Aldo, Weiskopf, Daniel, North, Stephen, and Keim, Daniel
- Abstract
The goal of visual analytics (VA) systems is to solve complex problems by integrating automated data analysis methods, such as machine learning (ML) algorithms, with interactive visualizations. We propose a conceptual framework that models human interactions with ML components in the VA process, and makes the crucial interplay between automated algorithms and interactive visualizations more concrete. The framework is illustrated through several examples. We derive three open research challenges at the intersection of ML and visualization research that will lead to more effective data analysis.
- Published
- 2016
31. Blind Deconvolution of PET Images using Anatomical Priors
- Author
Guérit, Stéphanie, González, Adriana, Bol, Anne, Lee, John Aldo, Jacques, Laurent, iTWIST 2016, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
blind deconvolution, anatomical prior, total variation, PET imaging, inverse problem
- Abstract
Images from positron emission tomography (PET) provide metabolic information about the human body. They present, however, a spatial resolution that is limited by physical and instrumental factors often modeled by a blurring function. Since this function is typically unknown, blind deconvolution (BD) techniques are needed in order to produce a useful restored PET image. In this work, we propose a general BD technique that restores a low resolution blurry image using information from data acquired with a high resolution modality (e.g., CT-based delineation of regions with uniform activity in PET images). The proposed BD method is validated on synthetic and actual phantoms. (See the sketch after this entry.)
- Published
- 2016
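For contrast with the blind setting of the entry above, here is the classical non-blind baseline: Richardson-Lucy deconvolution with a known Gaussian PSF on a toy uniform-activity phantom, implemented directly with scipy (the PSF and phantom are assumptions, not the paper's data):

```python
# Non-blind Richardson-Lucy deconvolution with a known Gaussian PSF.
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(size=11, sigma=2.0):
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def richardson_lucy(blurred, psf, n_iter=30, eps=1e-12):
    est = np.full_like(blurred, blurred.mean())       # positive initial estimate
    psf_flip = psf[::-1, ::-1]
    for _ in range(n_iter):
        ratio = blurred / (fftconvolve(est, psf, mode="same") + eps)
        est *= fftconvolve(ratio, psf_flip, mode="same")
    return est

rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[24:40, 20:44] = 1.0                           # toy uniform-activity region
psf = gaussian_psf()
blurred = fftconvolve(phantom, psf, mode="same")
restored = richardson_lucy(blurred, psf)
print(float(np.abs(restored - phantom).mean()) < float(np.abs(blurred - phantom).mean()))
```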
32. OC-0265: Efficient implementation of random errors in robust optimization for proton therapy with Monte Carlo
- Author
Barragan Montero, Ana Maria, Souris, Kevin, Sterpin, Edmond, Lee, John Aldo, ESTRO annual meeting (35), and UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie
- Published
- 2016
33. Image Deconvolution by Local Order Preservation of Pixels Values
- Author
Guérit, Stéphanie, Jacques, Laurent, Lee, John Aldo, 2016 European Signal Processing Conference, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Blind deconvolution, PET imaging, local constraints, Inverse problem, Deconvolution, Regularization (physics), Radiation oncology, Computer vision, Image restoration, Image resolution
- Abstract
Positron emission tomography is more and more used in radiation oncology, since it conveys useful functional information about cancerous lesions. Its rather low spatial resolution, however, prevents accurate tumor delineation and heterogeneity assessment. Post-reconstruction deconvolution with the measured point-spread function can address this issue, provided it does not introduce undesired artifacts. These usually result from inappropriate regularization, which is either absent or making too strong assumptions about the structure of the signal. This paper proposes a deconvolution method that is based on inverse problem theory and involves a new regularization term that preserves local pixel value order relationships. Such regularization entails relatively mild constraints that are directly inferred from the observed data. This paper investigates the theoretical properties of the proposed regularization and describes its numerical implementation with a primal-dual algorithm. Preliminary experiments with synthetic images are presented to compare quantitatively and qualitatively the proposed method to other regularization schemes, like TV and TGV.
- Published
- 2016
34. Geometrical homotopy for data visualization
- Author
Peluffo Ordoñez, Diego Hernan, Lee, John Aldo, Verleysen, Michel, Alvarado-Pérez, Juan C., UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Abstract
This work presents an approach allowing for an interactive visualization of dimensionality reduction outcomes, based on an extended view of conventional homotopy. The pairwise functional derived from a simple homotopic function can be incorporated within a geometrical framework in order to yield a bi-parametric approach able to combine several kernel matrices. Therefore, users can establish the mixture of kernels in an intuitive fashion by only varying two parameters. Our approach is tested using kernel alternatives for conventional methods of spectral dimensionality reduction such as multidimensional scaling, locally linear embedding and Laplacian eigenmaps. The provided mixture represents every single dimensionality reduction approach, and helps users find a suitable representation of the embedded data.
- Published
- 2015
35. Automatic dose painting workflow: from tumor segmentation to optimization
- Author
Barragan Montero, Ana Maria, Lee, John Aldo, Sterpin, Edmond, 3rd ESTRO Forum 2015, and UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie
- Published
- 2015
36. Molecular Imaging-Guided Radiotherapy for the Treatment of Head-and-Neck Squamous Cell Carcinoma: Does it Fulfill the Promises?
- Author
Grégoire, Vincent, Thorwarth, Daniela, and Lee, John Aldo
- Abstract
With the routine use of intensity modulated radiation therapy for the treatment of head-and-neck squamous cell carcinoma allowing highly conformal dose distributions, there is an increasing need for refining both the selection and the delineation of gross tumor volumes (GTV). In this framework, molecular imaging with positron emission tomography and magnetic resonance imaging offers the opportunity to improve diagnostic accuracy and to integrate tumor biology, mainly related to the assessment of tumor cell density, tumor hypoxia, and tumor proliferation, into the treatment planning equation. Such integration, however, requires a deep comprehension of the technical and methodological issues related to image acquisition, reconstruction, and segmentation. Until now, molecular imaging has had a limited value for the selection of nodal GTV, but there is increasing evidence that both FDG positron emission tomography and diffusion-weighted magnetic resonance imaging have a potential value for the delineation of the primary tumor GTV, affecting dose distribution. With the apprehension of the heterogeneity in tumor biology through molecular imaging, growing evidence has been collected over the years to support the concept of dose escalation/dose redistribution using a planned heterogeneous dose prescription, the so-called "dose painting" approach. Validation trials are ongoing, and in the coming years one may expect to position the dose painting approach in the armamentarium for the treatment of patients with head-and-neck squamous cell carcinoma.
- Published
- 2018
37. A study of individualized radiation dose escalation in non-small cell lung cancer based on FDG-PET imaging.
- Author
Wanet, Marie, Delor, Antoine, Hanin, François-Xavier, Ghaye, Benoît, Van Maanen, Aline, Remouchamps, Vincent, Clermont, Christian, Goossens, Samuel, Lee, John Aldo, Janssens, Guillaume, Bol, Anne, and Geets, Xavier
- Subjects
CLINICAL trials, COMPARATIVE studies, DEOXY sugars, DOSE-response relationship (Radiation), LUNG cancer, LUNG tumors, RESEARCH methodology, MEDICAL cooperation, COMPUTERS in medicine, RADIATION doses, RADIOPHARMACEUTICALS, RADIOTHERAPY, RESEARCH, POSITRON emission tomography, PILOT projects, EVALUATION research, TREATMENT effectiveness, PATIENT-centered care
- Published
- 2017
38. Nonlinear Dimensionality Reduction With Missing Data Using Parametric Multiple Imputations.
- Author
de Bodt, Cyril, Mulders, Dounia, Verleysen, Michel, and Lee, John Aldo
- Subjects
DATA reduction, GAUSSIAN mixture models, DATA distribution, COST functions
- Abstract
Dimensionality reduction (DR) aims at faithfully and meaningfully representing high-dimensional (HD) data in a low-dimensional (LD) space. Recently developed neighbor embedding DR methods lead to outstanding performances, thanks to their ability to foil the curse of dimensionality. Unfortunately, they cannot be directly employed on incomplete data sets, which have become ubiquitous in machine learning. Discarding samples with missing features prevents the computation of their LD coordinates and deteriorates the treatment of the complete samples. Common missing-data imputation schemes are not appropriate in the nonlinear DR context either. Indeed, even if they model the data distribution in the feature space, they can, at best, enable the application of a DR scheme to the expected data set. In practice, one would instead like to obtain the LD embedding with the closest cost function value on average with respect to the complete data case. As the state-of-the-art DR techniques are nonlinear, the latter embedding results from minimizing the expected cost function on the incomplete database, not from considering the expected data set. This paper addresses these limitations by developing a general methodology for nonlinear DR with missing data, directly applicable with any DR scheme optimizing some criterion. In order to model the feature dependences, an HD extension of Gaussian mixture models is first fitted on the incomplete data set. It is afterward employed under the multiple-imputation paradigm to obtain a single relevant LD embedding, thus minimizing the cost function expectation. Extensive experiments demonstrate the superiority of the suggested framework over alternative approaches.
- Published
- 2019
39. Radiation dose escalation based on FDG-PET driven dose painting by numbers in oropharyngeal squamous cell carcinoma: a dosimetric comparison between TomoTherapy-HA and RapidArc.
- Author
Differding, Sarah, Sterpin, Edmond, Hermand, Nicolas, Vanstraelen, Bianca, Nuyts, Sandra, de Patoul, Nathalie, Denis, Jean-Marc, Lee, John Aldo, and Grégoire, Vincent
- Subjects
DRUG dosage, FLUORODEOXYGLUCOSE F18, POSITRON emission tomography, SQUAMOUS cell carcinoma, CANCER treatment, COMPARATIVE studies, CLINICAL trials, DEOXY sugars, EXPERIMENTAL design, HEAD tumors, RESEARCH methodology, MEDICAL cooperation, COMPUTERS in medicine, NECK tumors, RADIATION doses, RADIATION measurements, RADIOPHARMACEUTICALS, RADIOTHERAPY, RESEARCH, EVALUATION research, OROPHARYNGEAL cancer
- Abstract
Purpose: Validation of dose escalation through FDG-PET dose painting (DP) for oropharyngeal squamous cell carcinoma (SCC) requires randomized clinical trials with large sample sizes, potentially involving different treatment planning and delivery systems. As a first step of a joint clinical study of DP, a planning comparison was performed between TomoTherapy HiArt® (HT) and Varian RapidArc® (RA). Methods: The planning study was conducted on five patients with oropharyngeal SCC. Elective and therapeutic CTVs were delineated based on anatomic information, and the respective PTVs (CTVs + 4 mm) were prescribed a dose of 56 (PTV56) and 70 Gy (PTV70). A gradient-based method was used to automatically delineate the external contours of the FDG-PET volume (GTVPET). Variation of the FDG uptake within the GTVPET was linearly converted into a prescription between 70 and 86 Gy. A dilation of the voxel-by-voxel prescription of 2.5 mm was applied to account for geometric errors in dose delivery (PTVPET). The study was divided in two planning phases, aiming at maximizing target coverage (phase I) and lowering doses to OAR (phase II). A Quality-Volume Histogram (QVH) assessed conformity with the DP prescription inside the PTVPET. Results: In phase I, for both HT and RA, all plans achieved comparable target coverage for PTV56 and PTV70, respecting the planning objectives. A median value of 99.9% and 97.2% of all voxels in the PTVPET received at least 95% of the prescribed dose for RA and HT, respectively. A median value of 0.0% and 3.7% of the voxels in the PTVPET received 105% or more of the prescribed dose for RA and HT, respectively. In phase II, no significant differences were found in OAR sparing. Median treatment times were 13.7 min for HT and 5 min for RA. Conclusions: Both HT and RA can generate similar dose distributions for FDG-PET-based dose escalation and dose painting in oropharyngeal SCC patients. (See the sketch after this entry.)
- Published
- 2017
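A toy version of the linear uptake-to-dose conversion used in the entry above: uptake at or below the median maps to 70 Gy, and the maximum uptake maps to 86 Gy. Purely illustrative; a clinical dose-painting prescription involves many more safeguards:

```python
# Linear FDG-uptake-to-dose conversion between a 70 Gy base and an 86 Gy peak.
import numpy as np

def uptake_to_dose(suv, d_min=70.0, d_max=86.0):
    ref = np.median(suv)
    frac = np.clip((suv - ref) / (suv.max() - ref), 0.0, 1.0)
    return d_min + frac * (d_max - d_min)           # voxel-by-voxel prescription [Gy]

rng = np.random.default_rng(0)
suv = rng.gamma(shape=4.0, scale=2.0, size=1000)    # toy FDG uptake values in the GTV
dose = uptake_to_dose(suv)
print(round(dose.min(), 1), round(dose.max(), 1))   # 70.0 ... 86.0
```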
40. Fast multipurpose Monte Carlo simulation for proton therapy using multi- and many-core CPU architectures.
- Author
Souris, Kevin, Lee, John Aldo, and Sterpin, Edmond
- Subjects
PROTON therapy, RADIOTHERAPY treatment planning, CENTRAL processing units, MONTE Carlo method, COPROCESSORS
- Abstract
Purpose: Accuracy in proton therapy treatment planning can be improved using Monte Carlo (MC) simulations. However, the long computation time of such methods hinders their use in clinical routine. This work aims to develop a fast multipurpose Monte Carlo simulation tool for proton therapy using massively parallel central processing unit (CPU) architectures. Methods: A new Monte Carlo code, called MCsquare (many-core Monte Carlo), has been designed and optimized for the last generation of Intel Xeon processors and Intel Xeon Phi coprocessors. These massively parallel architectures offer the flexibility and the computational power suited to MC methods. The class-II condensed history algorithm of MCsquare provides a fast and yet accurate method of simulating heavy charged particles such as protons, deuterons, and alphas inside voxelized geometries. Hard ionizations, with energy losses above a user-specified threshold, are simulated individually, while soft events are regrouped in a multiple scattering theory. Elastic and inelastic nuclear interactions are sampled from ICRU 63 differential cross sections, thereby allowing for the computation of prompt gamma emission profiles. MCsquare has been benchmarked against the GATE/Geant4 Monte Carlo application for homogeneous and heterogeneous geometries. Results: Comparisons with GATE/Geant4 for various geometries show deviations within 2%–1 mm. In spite of the limited memory bandwidth of the coprocessor, simulation time is below 25 s for 10^7 primary 200 MeV protons in average soft tissues using all Xeon Phi and CPU resources embedded in a single desktop unit. Conclusions: MCsquare exploits the flexibility of CPU architectures to provide a multipurpose MC simulation tool. Optimized code enables the use of accurate MC calculation within a reasonable computation time, adequate for clinical practice. MCsquare also simulates prompt gamma emission and can thus also be used for in vivo range verification.
- Published
- 2016
41. Methodology for adaptive and robust FDG-PET escalated dose painting by numbers in head and neck tumors.
- Author
Differding, Sarah, Sterpin, Edmond, Janssens, Guillaume, Hanin, François-Xavier, Lee, John Aldo, and Grégoire, Vincent
- Abstract
Objective. To develop a methodology for using FDG-PET/CT in adaptive dose painting by numbers (DPBN) in head and neck squamous cell carcinoma (HNSCC) patients. Issues related to noise in PET and treatment robustness against geometric errors are addressed. Methods. Five patients with locally advanced HNSCC scheduled for chemo-radiotherapy were imaged with FDG-PET/CT at baseline and 2–3 times during radiotherapy (RT). The GTVPET was segmented with a gradient-based method. A double median filter reduces the impact of noise in the PET uptake-to-dose conversion. Filtered FDG uptake values were linearly converted into a voxel-by-voxel prescription from 70 (median uptake) to 86 Gy (highest uptake). A PTVPET was obtained by applying a dilation of 2.5 mm to the entire prescription. Seven iso-uptake thresholds led to seven sub-levels compatible with the TomoTherapy HiArt® Treatment Planning System. Planning aimed to deliver a median dose of 56 Gy and 70 Gy in 35 fractions on the elective and therapeutic PTVs, respectively. Plan quality was assessed with quality volume histograms (QVH). Plans were generated at each time point, with a total of 3–4 plans for each patient. Deformable image registration was used for automatic contour propagation and dose summation of the 3 or 4 treatment plans (MIMvista®). Results. GTVPET segmentations were performed successfully until week 2 of RT but failed in two patients at week 3. QVH analysis showed high conformity for all plans (mean V(Q = 0.95) = 93%; mean V(Q = 1.05) = 3.9%; mean QF = 2.2%). Good OAR sparing was achieved while keeping high plan quality. Conclusion. Our results show that adaptive FDG-PET-based escalated dose painting in patients with locally advanced HNSCC is feasible while respecting strict dose constraints to organs at risk. Clinical studies must be conducted to evaluate toxicities and tumor response of such a strategy.
- Published
- 2016
42. Generalized kernel framework for unsupervised spectral methods of dimensionality reduction.
- Author
Peluffo-Ordonez, Diego H., Lee, John Aldo, and Verleysen, Michel
- Published
- 2014
43. Effect of high hydrostatic pressure on extraction of B-phycoerythrin from Porphyridium cruentum: Use of confocal microscopy and image processing.
- Author
Tran, Thierry, Denimal, Emmanuel, Lafarge, Céline, Journaux, Ludovic, Lee, John Aldo, Winckler, Pascale, Perrier-Cornet, Jean-Marie, Pradelles, Rémi, Loupiac, Camille, and Cayot, Nathalie
- Abstract
The aim of the study was to extract B-phycoerythrin from Porphyridium cruentum while preserving its structure. High hydrostatic pressure treatments were chosen as the extraction technology. Different methods were used to observe the effects of the treatment: spectrophotometry and confocal laser scanning microscopy followed by image processing analysis. Image processing led to the generation of masks used for the identification of three clusters: intracellular, extracellular and intercellular. All methods showed that high hydrostatic pressure treatments between 50 and 500 MPa failed to extract B-phycoerythrin from Porphyridium cruentum cells. The fluorescence emission was negatively impacted by high hydrostatic pressure treatment from 400 MPa for the extracellular and intercellular clusters and from 500 MPa for the intracellular cluster. These results suggest that high pressure treatments could induce the denaturation of B-phycoerythrin in all clusters, but with different intensities depending on the cluster. Highlights: • 5 min at 500 MPa treatment had no effect on cell morphology. • Extracellular and intercellular clusters showed fluorescence decrease from 400 MPa. • Intracellular cluster showed fluorescence decrease from 500 MPa. • Intracellular cluster may play a protective role towards B-phycoerythrin.
- Published
- 2019
44. Adaptive functional image-guided IMRT in pharyngo-laryngeal squamous cell carcinoma: Is the gain in dose distribution worth the effort?
- Author
Castadot, Pierre, Geets, Xavier, Lee, John Aldo, and Grégoire, Vincent
- Subjects
SQUAMOUS cell carcinoma, PHARYNGEAL cancer, LARYNGEAL cancer treatment, CANCER radiotherapy, RADIATION doses, ANATOMICAL variation, MAGNETIC resonance imaging, CANCER treatment
- Abstract
Background and purpose: The planning process in radiotherapy (RT) typically involves the acquisition of a unique set of CT images – and eventually of functional images – which is used for delineation of target volumes (TV) and organs at risk (OAR) and for dose calculation. Restricting the delineation and dose calculation solely to pre-treatment images is an oversimplification, as they are only a snapshot of the patient's anatomy. The objectives of the present study were (1) to assess the consequences of anatomic modification on dose distribution for both TVs and OARs; (2) to assess the potential benefit of adaptive strategies using Helical Tomotherapy (HT); and (3) to compare CT-based and FDG-PET-based adaptive planning strategies. Materials and methods: Ten patients with H&N SCC were imaged before and during concomitant chemo-RT using CT and FDG-PET acquisitions after a mean dose of 14.2, 24.5, 35.0 and 44.9 Gy. Simultaneous integrated boost IMRT planning was performed using HT. We compared (1) the planned dose distribution, (2) the delivered dose distributions, which took into account the impact of anatomical modifications on dose distribution, and (3) the adaptive dose distributions after replanning to take into account the anatomic modifications and the anatomic or functional GTV shrinkage. Results: There was an increase between the planned and the delivered high-dose volumes, which correlated with the slope of the GTV shrinkage. The adaptive high-dose volumes were significantly smaller than the delivered ones. The difference between the adaptive and the delivered high-dose volume also correlated with the slope of the GTV shrinkage. For both parotid glands combined, the delivered Dmean showed a statistical trend toward an increase of 4.4% compared to the planned Dmean. For the ipsilateral parotid glands, there was a correlation between the Dmean gain and the slope of the GTV shrinkage when an adaptive planning was used. For the oral cavity, the adaptive Dmean was 10% smaller than the delivered one. For the PRV around the spinal cord, there was an increase of about 4.5% between the delivered and the planned D2%. The adaptive planning translated into a decrease in D2% of 7.2%. The differences between the delivered and planned D2% and between the adaptive and the delivered D2% were correlated with the slope of the GTV shrinkage. For the CTVproph and PTVproph coverage, the adaptive strategy induced a better dose conformation. No significant difference was observed in the various figures of merit between PET-based and CT-based isodose distributions. Conclusions: The dose distribution that is actually delivered to patients significantly differs from what was planned because of anatomic modifications. Adaptive multi-modality IMRT is feasible in H&N tumors and could compensate for these modifications and improve dose distribution. Some useful surrogate criteria or "flags" are, however, needed to identify patients who might benefit from an adaptive strategy. The optimal adaptive strategy still needs to be defined, and prospective studies will have to be conducted to address the safety and the clinical impact of such approaches on patient outcome.
- Published
- 2011
45. Accurate assessment of proton therapy treatments: fast Monte Carlo dose engine and extensive robustness tests
- Author
Souris, Kevin, UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie, UCL - Faculté de pharmacie et des sciences biomédicales, Lee, John Aldo, Sterpin, Edmond, Gregoire, Vincent, Vynckier, Stefaan, Geets, Xavier, Cortina, Eduardo, Lin, Liyong, and Knopf, Antje
- Subjects
Treatment robustness, Monte Carlo, Proton therapy
- Abstract
Radiation therapy is one of the main treatments in cancer care. It consists in irradiating the tumor while limiting the toxicity associated with the exposure of healthy tissues. Proton therapy is an emerging radiation delivery modality which has the potential to spare healthy tissues better than conventional radiotherapy treatments. However, this new modality is much more sensitive to treatment uncertainties, such as patient anatomy changes. To address the lack of robustness in proton therapy, this thesis provides accurate treatment preparation tools, such as fast Monte Carlo dose calculation and robust planning methods. Furthermore, a comprehensive and realistic treatment robustness verification tool was developed in order to assess the sensitivity of the treatment plan to uncertainties. By combining these tools, proton therapy could be delivered more safely, improving the treatment outcome for the patient. All tools developed during this thesis are released open source and are already used in several institutions for research and clinical purposes. (BIFA - Biomedical and Pharmaceutical Sciences) -- UCL, 2018
- Published
- 2018
46. Analysis of the Financial Times ranking 'master in management' with machine learning
- Author
-
Jansen, Arthur, UCL - Ecole polytechnique de Louvain, Lee, John Aldo, and Vrins, Frédéric
- Subjects
Machine learning ,Ranking ,Dimensionality reduction - Abstract
University rankings nowadays play a major role in many students' choice of their future schools. Nonetheless, these rankings often remain quite opaque: not all data are made available, the methodology behind the rankings is not well defined, etc. One of the main rankings centred on business schools is the "Master in Management" ranking from the Financial Times. This work aims to study the relevance of this ranking and its possible flaws. Several techniques are applied, such as a robustness analysis to assess the sturdiness of the ranking in the face of uncertainties, and dimensionality reduction to visualize the data set in two or three dimensions and thus reveal local neighbourhoods of schools. Finally, from these analyses, potential improvements to the Financial Times ranking are discussed, ranging from the design of a new ranking from scratch to the addition of visualization tools to enhance the informative character of the ranking. Master [120] : ingénieur civil en informatique, Université catholique de Louvain, 2017
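As an illustration of the dimensionality-reduction step mentioned in this abstract, here is a minimal sketch using scikit-learn's t-SNE on a synthetic stand-in for the ranking table; the data shape and parameter values are assumptions, not the thesis setup.

import numpy as np
from sklearn.manifold import TSNE

# Synthetic stand-in for the ranking table: rows = schools, columns = criteria.
rng = np.random.default_rng(42)
criteria = rng.normal(size=(100, 16))

# Embed the schools in 2-D; nearby points should correspond to schools
# that score similarly across the ranking criteria.
embedding = TSNE(n_components=2, perplexity=30.0, random_state=42).fit_transform(criteria)
print(embedding.shape)  # (100, 2)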
- Published
- 2017
47. Determining meaningful locations in user's life from raw data logged by smartphones and wearables
- Author
-
De Droogh, Joachim, UCL - Ecole polytechnique de Louvain, and Lee, John Aldo
- Subjects
density ,machine learning ,GPS ,smartphone ,clustering - Abstract
Many applications on smartphones and other wearables use a localization system to track their users. We are interested in giving meaning to GPS traces (latitude / longitude / timestamp) by detecting locations that are meaningful to a user. The purpose of this master thesis is to find an algorithm and a set of input parameters that give the most satisfactory results for extracting meaningful locations from datasets. This master thesis was realized in collaboration with SONY. We first build an intuitive solution from scratch, without help from the specialized literature. We enrich a public dataset by computing the speed between successive observations, keeping only the observations with a speed lower than 2 km/h. We display the observations on maps with Google's tools, but this solution is limited to public datasets. We also try to obtain a first set of meaningful locations with Excel sheets by grouping observations within a chosen radius; this could be a solution, but it does not scale: each observation must successively be compared to all the others. The concept of "density" is then used. Unsupervised machine learning helps to determine clusters in the datasets by grouping data based on similarity. DBSCAN is such an algorithm: it defines a cluster as a maximal set of density-connected points, based on the ε-neighborhood and a minimum number of points per cluster. OPTICS is a more advanced version of DBSCAN in which the density can differ from one cluster to another. K-MEANS is another type of algorithm, which creates K clusters so as to minimize the squared Euclidean distance from each point to the mean of its cluster. Combining OPTICS with K-MEANS, we build a program that tries several combinations of these algorithms with different parameter sets (input values) on the five datasets from SONY. We collect the result sets and perform validations to keep only realistic results. We then switch to supervised learning by using the ground truths as input for the validations. We assign a score based on a time criterion (the percentage of common duration within the clusters of the ground truths) and a distance criterion (the proximity of the cluster centers to the ground truths). The final score is the product of both percentages. Each cluster center should be a meaningful location. The highest score is obtained by the following combination: run OPTICS, then reapply OPTICS on the same data without the outliers. This highest score is achieved on only one dataset; the other datasets score much lower. We then apply the geometric mean, per user, over the results of each combination of algorithms. The best algorithm is OPTICS followed by K-MEANS: the cluster centers are determined by K-MEANS, based on the number of clusters found by OPTICS. The most appropriate parameters from the proposed parameter sets are also determined. Master [60] en sciences informatiques, Université catholique de Louvain, 2017
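As an illustration of the pipeline described above, here is a minimal sketch (not the thesis code) of the OPTICS-then-K-MEANS combination with scikit-learn; the coordinates, parameter values, and cluster layout are invented for the example.

import numpy as np
from sklearn.cluster import OPTICS, KMeans

# Synthetic GPS observations (latitude, longitude); in the thesis these come
# from the SONY datasets, already filtered to speeds below 2 km/h.
rng = np.random.default_rng(7)
home = rng.normal([50.85, 4.35], 0.001, size=(200, 2))
work = rng.normal([50.88, 4.70], 0.001, size=(200, 2))
noise = rng.uniform([50.80, 4.30], [50.90, 4.80], size=(40, 2))
points = np.vstack([home, work, noise])

# Step 1: OPTICS finds density-based clusters; label -1 marks outliers.
labels = OPTICS(min_samples=20).fit_predict(points)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)

# Step 2: discard the outliers and let K-MEANS place one center per OPTICS
# cluster; each center is a candidate meaningful location.
centers = KMeans(n_clusters=n_clusters, n_init=10).fit(points[labels != -1]).cluster_centers_
print(centers)

Running OPTICS first removes low-density noise and fixes the number of clusters, so K-MEANS only has to position the centers.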
- Published
- 2017
48. Robust, accurate and patient-specific treatment planning for proton therapy
- Author
-
Barragan Montero, Ana Maria, UCL - SSS/IREC/MIRO - Pôle d'imagerie moléculaire, radiothérapie et oncologie, UCL - Faculté de pharmacie et des sciences biomédicales, Lee, John Aldo, Sterpin, Edmond, Lecouvet, Frédéric, Geets, Xavier, Orban de Xivry, Jonathan, Verellen, Dirk, Oelfke, Uwe, and Reynaert, Nick
- Subjects
Cancer treatment planning ,Proton therapy - Abstract
The survival statistics for cancer patients treated with radiation therapy using photon beams show that many treatments fail due to poor tumour local control (TLC). A potential solution would be to increase the target dose, but this often entails high toxicity in the nearby healthy tissue. Unlike photons, protons release most of their energy at the end of their path (the so-called Bragg peak), reducing the dose to healthy tissue, which might be the key to safe dose escalation. Moreover, functional images can reveal spatial heterogeneity in the tumour radioresistance pattern and therefore allow for improved targeting (dose painting) and possibly enhanced TLC. In this context, special care must be taken to accurately model the possible uncertainties in the position of the Bragg peak, since they can strongly deteriorate treatment quality, especially in very heterogeneous dose distributions such as those of dose painting plans. In this thesis, we investigated the use of the most advanced techniques, such as Monte Carlo dose calculation and robust optimization, to ensure accurate and robust planning for (dose-painted) proton therapy treatments. (BIFA - Sciences biomédicales et pharmaceutiques) -- UCL, 2017
- Published
- 2017
49. Non-rigid deformation of treatment plans for on-line adaptive protontherapy
- Author
-
Schepmans, Geoffrey, UCL - Ecole polytechnique de Louvain, Sterpin, Edmond, and Lee, John Aldo
- Subjects
Treatment Planning ,Radiotherapy ,Radiation Dose ,Adaptive Planning ,Proton Radiotherapy ,Pencil Beam Scanning ,Non-Rigid Deformation ,Radiation Dose Distribution ,Monte Carlo simulation ,Cancer - Abstract
External radiotherapy is a major component of cancer treatment. Its objective is to deliver a high dose of radiation to tumors while minimizing the dose delivered to healthy tissues. Protons have the ability to deliver a peak of energy (the "Bragg peak") at a precise location that depends on the atomic composition and densities of the tissues in their path, making them highly suitable for radiotherapy. Because the anatomy of the patient changes continuously during the course of a treatment that may last more than 30 days, the position of the Bragg peak is subject to uncertainties, which may have dramatic consequences for the patient. The goal of this master thesis is to tune the treatment to day-to-day anatomical changes by performing so-called "adaptive planning". We propose an original strategy in which the treatment plan, optimized before the start of the treatment and approved by a radiation oncologist, is adjusted to the new anatomy. Three test cases have been designed: 1) anatomical changes around a tumor with unmodified position and shape; 2) tumor displacement; and 3) tumor reduction. A plan is evaluated by analysing the dose-volume histogram of the dose predicted by a Monte Carlo algorithm. Except for the tumor reduction case, our adapted plans manage to produce a dose close to the desired one. In conclusion, non-rigid deformation of treatment plans could be used in addition to other methods to perform efficient adaptive proton therapy. Master [120] : ingénieur civil biomédical, Université catholique de Louvain, 2016
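As an illustrative aside, the plan evaluation step mentioned above relies on a cumulative dose-volume histogram (DVH); the sketch below computes one from a synthetic array of per-voxel doses and is not the thesis code.

import numpy as np

def cumulative_dvh(doses, bins=100):
    # Cumulative DVH: for each dose level, the fraction of the structure's
    # volume receiving at least that dose.
    levels = np.linspace(0.0, doses.max(), bins)
    volume = np.array([(doses >= d).mean() for d in levels])
    return levels, volume

# Synthetic per-voxel doses (Gy), standing in for a Monte Carlo prediction.
doses = np.random.default_rng(1).gamma(shape=9.0, scale=6.0, size=50_000)
levels, volume = cumulative_dvh(doses)
print(f"D2% ~ {levels[volume <= 0.02][0]:.1f} Gy")  # dose to the hottest 2% of volume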
- Published
- 2016
50. A comparative study on automatic treatment planning for online adaptive proton therapy of esophageal cancer: Which combination of deformable registration and deep learning planning tools performs the best?
- Author
-
Draguet C, Populaire P, Vera MC, Fredriksson A, Haustermans K, Lee JA, Barragan Montero AM, and Sterpin E
- Abstract
Objective: To demonstrate the feasibility of integrating fully-automated online adaptive proton therapy strategies (OAPT) within a commercially available treatment planning system and underscore what limits their clinical implementation. These strategies leverage existing deformable image registration (DIR) algorithms and state-of-the-art deep learning (DL) networks for organ segmentation and proton dose prediction.
Approach: Four OAPT strategies featuring automatic segmentation and robust optimization were evaluated on a cohort of 17 patients, each undergoing a repeat CT scan. (1) DEF-INIT combines deformably registered contours with template-based optimization. (2) DL-INIT, (3) DL-DEF, and (4) DL-DL employ a nnU-Net DL network for organ segmentation and a controlling-ROIs-guided DIR algorithm for iCTV segmentation. DL-INIT uses this segmentation alongside template-based optimization, DL-DEF integrates it with a dose-mimicking (DM) step using a reference deformed dose, and DL-DL merges it with DM on a reference DL-predicted dose. All strategies were evaluated on both the manual contours and the contours used for optimization, and compared with manually adapted plans. Key dose-volume metrics, such as iCTV D98%, are reported.
Main results: iCTV D98% was comparable between manually adapted plans and all strategies in nominal cases, but dropped to 20 Gy in worst-case scenarios for a few patients per strategy, highlighting the need to correct segmentation errors in the target volume. Evaluations on optimization contours showed minimal relative error, with some outliers, particularly for the template-based strategies (DEF-INIT and DL-INIT). DL-DEF achieves a good trade-off between speed and dosimetric quality, showing a passing rate (iCTV D98% > 94%) of 90% when evaluated against 2, 4, and 5 mm setup errors, and of 88% when evaluated against a 7 mm setup error. While the template-based methods are more rigid, DL-DEF and DL-DL have potential for further enhancement with proper tuning of the DM algorithm.
Significance: Among the investigated strategies, DL-DEF and DL-DL demonstrated promising results, with OAPT implemented within 10 minutes, and significant potential for further improvement.
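As an illustrative aside (not the study's code), the sketch below mirrors the passing-rate criterion quoted in the results, assuming the 94% threshold refers to 94% of the prescribed dose; all dose scenarios are synthetic.

import numpy as np

def d98(doses):
    # D98%: dose received by at least 98% of the target volume
    # (the 2nd percentile of the per-voxel doses).
    return float(np.percentile(doses, 2.0))

def passing_rate(scenario_doses, prescription):
    # Fraction of robustness scenarios in which iCTV D98% stays above
    # 94% of the prescription, mirroring the criterion quoted above.
    return float(np.mean([d98(d) > 0.94 * prescription for d in scenario_doses]))

rng = np.random.default_rng(3)
scenarios = [rng.normal(60.0, 1.5, 10_000) for _ in range(50)]  # synthetic setup-error scenarios
print(f"passing rate: {passing_rate(scenarios, prescription=60.0):.0%}")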
- Published
- 2024
- Full Text
- View/download PDF