271,442 results for "Wells, A"
Search Results
2. A Post-Occupancy Study of Nature-Based Outdoor Classrooms in Early Childhood Education
- Author
Dennis, Jr., Samuel F., Wells, Alexandra, and Bishop, Candace
- Published
- 2023
3. HIV pre-exposure prophylaxis (PrEP) should be free across Canada to those meeting evidence-based guidelines
- Author
Gaspar, Mark, Tan, Darrell H.S., Lachowsky, Nathan, Hull, Mark, Wells, Alex, Sinno, Jad, Espinosa, Oscar Javier Pico, and Grace, Daniel
- Published
- 2022
4. Connecting Primary Care Providers in Free Clinics with Specialists Via Telehealth: A Pilot Program with Three Miami Clinics
- Author
Anderson, Frederick, Wells, Alan L., Levine, Lisa Bard, Stumbar, Sarah E., Maurer, Michael, and Gwynn, Lisa
- Published
- 2021
- Full Text
- View/download PDF
5. 'It's Just Good Teaching': Black Educators Respond to the So-Called 'Anti-Critical Race Theory' Backlash in K-12 Schools
- Author
Leana Cabral, Siettah Parks, and Amy Stuart Wells
- Abstract
As sociologists of education, we're deeply concerned about the growing censorship in our schools and the attack on teaching the truth about our history and present-day inequality. We also recognize how an educational past mired in antiblack practices and policies remains with us today and thus why teachers are still faced with navigating censorship and constraints on what they know are critical and proven pedagogies. This article explores the continued need for "fugitive" practices to employ educational models that de-center Eurocentric narratives and center Black or other marginalized cultures and ways of knowing. We argue that educators committed to antiracist teaching can learn from the legacy of the art of Black teaching and how it was subversively taken up and put into practice by Black teachers over time (Gay, 2002; Givens, 2021; Walker, 2018).
- Published
- 2024
6. On-Farm Hog Processing Demonstration for Teenage Exhibitors: Blending Academic, Laboratory, and Farm-Based Learning
- Author
Katherine A. Wells, Chris L. Bruynis, and Lyda G. Garcia
- Abstract
COVID-19 challenges induced a bottleneck in the U.S. meatpacking industry. Ohio Extension identified the need and responded by creating a three-step hands-on training for teenage junior fair exhibitors. An Ohio Extension meat scientist and graduate students assisted in demonstrating an on-farm hog harvest and processing event, in collaboration with a local Extension office, for 4-H and FFA teenagers. To add a practical perspective, a local hog-producing and harvesting family was asked to assist with the event. An online post-survey reflected 90-100% gains in five educational areas, and 100% of respondents said they would attend a similar event in the future and recommend it to a friend.
- Published
- 2024
7. Book review
- Author
Wells, Alastair
- Published
- 2023
8. AI-Machine Learning-Enabled Tokamak Digital Twin
- Author
Tang, William, Feibush, Eliot, Dong, Ge, Borthwick, Noah, Lee, Apollo, Gomez, Juan-Felipe, Gibbs, Tom, Stone, John, Messmer, Peter, Wells, Jack, Wei, Xishuo, and Lin, Zhihong
- Subjects
Physics - Computational Physics, Physics - Plasma Physics
- Abstract
In addressing the Department of Energy's April 2022 announcement of a Bold Decadal Vision for delivering a Fusion Pilot Plant by 2035, associated software tools need to be developed for the integration of real-world engineering and supply chain data with advanced science models that are accelerated with Machine Learning. An associated research and development effort is introduced here, with promising early progress on the delivery of a realistic Digital Twin Tokamak. It has benefited from accelerated advances by Princeton University's innovative near-real-time AI deep-learning simulators, accompanied by technological capabilities from NVIDIA Omniverse, an open computing platform for building and operating applications that connect with leading scientific computing visualization software. Working with the CAD files for the GA/DIII-D tokamak, including equilibrium evolution, as an exemplar tokamak application in Omniverse, the Princeton-NVIDIA collaboration has integrated modern AI/HPC-enabled near-real-time kinetic dynamics to connect and accelerate state-of-the-art synthetic HPC simulators that model fusion devices and control systems. The overarching goal is to deliver an interactive scientific digital twin of an advanced MFE tokamak that enables near-real-time simulation workflows built with Omniverse, eventually opening doors to new capabilities for generating clean power for a better future.
- Published
- 2024
9. Estimand-based Inference in Presence of Long-Term Survivors
- Author
Tai, Yi-Cheng, Wang, Weijing, and Wells, Martin T.
- Subjects
Statistics - Methodology
- Abstract
In this article, we develop nonparametric inference methods for comparing survival data across two samples, which are beneficial for clinical trials of novel cancer therapies where long-term survival is a critical outcome. These therapies, including immunotherapies or other advanced treatments, aim to establish durable effects. They often exhibit distinct survival patterns such as crossing or delayed separation and potentially leveling-off at the tails of survival curves, clearly violating the proportional hazards assumption and rendering the hazard ratio inappropriate for measuring treatment effects. The proposed methodology utilizes the mixture cure framework to separately analyze the cure rates of long-term survivors and the survival functions of susceptible individuals. We evaluate a nonparametric estimator for the susceptible survival function in the one-sample setting. Under sufficient follow-up, it is expressed as a location-scale-shift variant of the Kaplan-Meier (KM) estimator. It retains several desirable features of the KM estimator, including inverse-probability-censoring weighting, product-limit estimation, self-consistency, and nonparametric efficiency. In scenarios of insufficient follow-up, it can easily be adapted by incorporating a suitable cure rate estimator. In the two-sample setting, besides using the difference in cure rates to measure the long-term effect, we propose a graphical estimand to compare the relative treatment effects on susceptible subgroups. This process, inspired by Kendall's tau, compares the order of survival times among susceptible individuals. The proposed methods' large-sample properties are derived for further inference, and the finite-sample properties are examined through extensive simulation studies. The proposed methodology is applied to analyze the digitized data from the CheckMate 067 immunotherapy clinical trial.
- Published
- 2024
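The nonparametric estimator in the abstract above is described as a variant of the Kaplan-Meier product-limit estimator. As background, a minimal textbook Kaplan-Meier implementation can be sketched as follows; this is the standard estimator, not the authors' location-scale-shift variant, and the function and variable names are illustrative:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of the survival function S(t).

    times  : observed follow-up times
    events : 1 if the event occurred at that time, 0 if censored
    Returns a list of (event time, S(t)) pairs.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])  # sort by follow-up time
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = 0
        removed = 0
        # Handle ties: process every subject observed at time t together
        while i < n and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk  # product-limit update
            curve.append((t, surv))
        at_risk -= removed  # deaths and censorings both leave the risk set
    return curve
```

Censored observations (event indicator 0) reduce the risk set without contributing a step, which is why the curve only records times at which events occur.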
10. Coalitions of AI-based Methods Predict 15-Year Risks of Breast Cancer Metastasis Using Real-World Clinical Data with AUC up to 0.9
- Author
Jiang, Xia, Zhou, Yijun, Wells, Alan, and Brufsky, Adam
- Subjects
Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Neural and Evolutionary Computing, Quantitative Biology - Quantitative Methods
- Abstract
Breast cancer is one of the two cancers responsible for the most deaths in women, with about 42,000 deaths each year in the US. That over 300,000 breast cancers are newly diagnosed each year suggests that only a fraction of the cancers result in mortality. Thus, most women undergo seemingly curative treatment for localized cancers, but a significant fraction later succumb to metastatic disease, for which current treatments are only temporizing for the vast majority. The current prognostic metrics are of little actionable value for 4 of the 5 women seemingly cured after local treatment, and many women are exposed to morbid and even mortal adjuvant therapies unnecessarily, with these adjuvant therapies reducing metastatic recurrence by only a third. Thus, there is a need for better prognostics to target aggressive treatment at those who are likely to relapse and spare those who were actually cured. While there is a plethora of molecular and tumor-marker assays in use and under development to detect recurrence early, these are time-consuming, expensive, and still often un-validated as to actionable prognostic utility. A different approach would use large-data techniques to determine clinical and histopathological parameters that would provide accurate prognostics using existing data. Herein, we report on machine learning, together with grid search and Bayesian networks, to develop algorithms that present an AUC of up to 0.9 in ROC analyses, using only extant data. Such algorithms could be rapidly translated to clinical management as they do not require testing beyond routine tumor evaluations.
- Published
- 2024
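For context on the metric reported in the abstract above: the AUC of a ROC analysis equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case (the Mann-Whitney identity). A minimal, generic sketch with made-up data, not the paper's pipeline:

```python
def auc(scores, labels):
    """ROC AUC via the Mann-Whitney identity: the fraction of
    (positive, negative) pairs ranked correctly, ties counted as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.9, as reported in the abstract, would mean that 90% of positive-negative pairs are ordered correctly by the model's risk scores.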
11. Trustworthy and Responsible AI for Human-Centric Autonomous Decision-Making Systems
- Author
Dehghani, Farzaneh, Dibaji, Mahsa, Anzum, Fahim, Dey, Lily, Basdemir, Alican, Bayat, Sayeh, Boucher, Jean-Christophe, Drew, Steve, Eaton, Sarah Elaine, Frayne, Richard, Ginde, Gouri, Harris, Ashley, Ioannou, Yani, Lebel, Catherine, Lysack, John, Arzuaga, Leslie Salgado, Stanley, Emma, Souza, Roberto, Santos, Ronnie de Souza, Wells, Lana, Williamson, Tyler, Wilms, Matthias, Wahid, Zaman, Ungrin, Mark, Gavrilova, Marina, and Bento, Mariana
- Subjects
Computer Science - Artificial Intelligence
- Abstract
Artificial Intelligence (AI) has paved the way for revolutionary decision-making processes which, if harnessed appropriately, can contribute to advancements in various sectors, from healthcare to economics. However, its black-box nature presents significant ethical challenges related to bias and transparency. AI applications are hugely impacted by biases, presenting inconsistent and unreliable findings that lead to significant costs and consequences, highlighting and perpetuating inequalities and unequal access to resources. Hence, developing safe, reliable, ethical, and Trustworthy AI systems is essential. Our team of researchers, part of the Transdisciplinary Scholarship Initiative within the University of Calgary, conducts research on Trustworthy and Responsible AI, including fairness, bias mitigation, reproducibility, generalization, interpretability, and authenticity. In this paper, we review and discuss the intricacies of AI biases, definitions, methods of detection and mitigation, and metrics for evaluating bias. We also discuss open challenges with regard to the trustworthiness and widespread application of AI across diverse domains of human-centric decision making, as well as guidelines to foster Responsible and Trustworthy AI models., Comment: 44 pages, 2 figures
- Published
- 2024
12. Dynamical Accretion Flows -- ALMAGAL: Flows along filamentary structures in high-mass star-forming clusters
- Author
Wells, M. R. A., Beuther, H., Molinari, S., Schilke, P., Battersby, C., Ho, P., Sánchez-Monge, Á., Jones, B., Scheuck, M. B., Syed, J., Gieser, C., Kuiper, R., Elia, D., Coletta, A., Traficante, A., Wallace, J., Rigby, A. J., Klessen, R. S., Zhang, Q., Walch, S., Beltrán, M. T., Tang, Y., Fuller, G. A., Lis, D. C., Möller, T., van der Tak, F., Klaassen, P. D., Clarke, S. D., Moscadelli, L., Mininni, C., Zinnecker, H., Maruccia, Y., Pezzuto, S., Benedettini, M., Soler, J. D., Brogan, C. L., Avison, A., Sanhueza, P., Schisano, E., Liu, T., Fontani, F., Rygl, K. L. J., Wyrowski, F., Bally, J., Walker, D. L., Ahmadi, A., Koch, P., Merello, M., Law, C. Y., and Testi, L.
- Subjects
Astrophysics - Astrophysics of Galaxies, Astrophysics - Solar and Stellar Astrophysics
- Abstract
We use data from the ALMA Evolutionary Study of High Mass Protocluster Formation in the Galaxy (ALMAGAL) survey to study 100 ALMAGAL regions at $\sim$ 1 arcsecond resolution located between $\sim$ 2 and 6 kpc distance. Using ALMAGAL $\sim$ 1.3mm line and continuum data we estimate flow rates onto individual cores. We focus specifically on flow rates along filamentary structures associated with these cores. Our primary analysis is centered around position-velocity cuts in H$_2$CO (3$_{0,3}$ - 2$_{0,2}$), which allow us to measure the velocity fields surrounding these cores. Combining this work with column density estimates, we derive the flow rates along the extended filamentary structures associated with cores in these regions. We select a sample of 100 ALMAGAL regions covering four evolutionary stages, from quiescent to protostellar, Young Stellar Objects (YSOs), and HII regions (25 each). Using dendrogram and line analysis, we identify a final sample of 182 cores in 87 regions. In this paper, we present 728 flow rates for our sample (4 per core), analysed in the context of evolutionary stage, distance from the core, and core mass. On average, for the whole sample, we derive flow rates on the order of $\sim$10$^{-4}$ M$_{sun}$yr$^{-1}$ with estimated uncertainties of $\pm$50%. We see increasing differences in the values among evolutionary stages, most notably between the less evolved (quiescent/protostellar) and more evolved (YSO/HII region) sources, as well as an increasing trend as we move further away from the centre of these cores. We also find a clear relationship between the flow rates and core masses, $\sim$M$^{2/3}$, which is in line with the result expected from the tidal-lobe accretion mechanism. Overall, we see increasing trends in the relationships between the flow rate and the three investigated parameters: evolutionary stage, distance from the core, and core mass., Comment: 11 pages, 11 figures, accepted for publication in A&A
- Published
- 2024
13. Deep Learning: a Heuristic Three-stage Mechanism for Grid Searches to Optimize the Future Risk Prediction of Breast Cancer Metastasis Using EHR-based Clinical Data
- Author
Jiang, Xia, Zhou, Yijun, Xu, Chuhan, Brufsky, Adam, and Wells, Alan
- Subjects
Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Neural and Evolutionary Computing, Quantitative Biology - Quantitative Methods
- Abstract
A grid search, at the cost of training and testing a large number of models, is an effective way to optimize the prediction performance of deep learning models. A challenging task concerning grid search is time management: without a good time-management scheme, a grid search can easily become a mission that will not finish in our lifetime. In this study, we introduce a heuristic three-stage mechanism for managing the running time of low-budget grid searches, along with the sweet-spot grid search (SSGS) and randomized grid search (RGS) strategies for improving model prediction performance, in predicting the 5-year, 10-year, and 15-year risk of breast cancer metastasis. We develop deep feedforward neural network (DFNN) models and optimize them through grid searches. We conduct eight cycles of grid searches by applying our three-stage mechanism and the SSGS and RGS strategies, and perform various SHAP analyses, including unique ones that interpret the importance of the DFNN-model hyperparameters. Our results show that grid search can greatly improve model prediction. The grid searches we conducted improved the risk prediction of 5-year, 10-year, and 15-year breast cancer metastasis by 18.6%, 16.3%, and 17.3%, respectively, over the average performance of all corresponding models we trained using the RGS strategy. We not only demonstrate best model performance but also characterize grid searches from various aspects, such as their capability of discovering decent models and the unit grid-search time. The three-stage mechanism worked effectively: it made our low-budget grid searches feasible and manageable, and at the same time helped improve model prediction performance. Our SHAP analyses identified both clinical risk factors important for the prediction of future risk of breast cancer metastasis and DFNN-model hyperparameters important to the prediction of performance scores.
- Published
- 2024
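The exhaustive grid-search loop that the abstract above builds its time-management scheme around can be sketched in a few lines. This is a generic illustration, not the paper's SSGS/RGS machinery: the hyperparameter names mirror common DFNN settings and the scoring function is a toy stand-in for "train a model and return its validation score".

```python
from itertools import product

# Hypothetical hyperparameter grid (illustrative names, not the paper's).
grid = {
    "learning_rate": [1e-3, 1e-2],
    "hidden_units": [32, 64, 128],
    "dropout": [0.0, 0.5],
}

def toy_score(params):
    # Stand-in for "train a DFNN and return its validation score".
    return (1.0
            - abs(params["learning_rate"] - 1e-2)
            - params["dropout"]
            + params["hidden_units"] / 1000.0)

def grid_search(grid, score):
    """Exhaustively evaluate every hyperparameter combination."""
    keys = list(grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        s = score(params)
        if s > best_score:
            best_params, best_score = params, s
    return best_params, best_score

best, best_val = grid_search(grid, toy_score)  # evaluates 2 * 3 * 2 = 12 models
```

The combinatorial blow-up is visible even here: the number of models is the product of the per-hyperparameter option counts, which is what makes the running-time management the abstract describes necessary for realistic grids.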
14. Towards Semantic Markup of Mathematical Documents via User Interaction
- Author
Vrečar, Luka, Wells, Joe, and Kamareddine, Fairouz
- Subjects
Computer Science - Computation and Language, Computer Science - Information Retrieval, 68V30
- Abstract
Mathematical documents written in LaTeX often contain ambiguities. We can resolve some of them via semantic markup using, e.g., sTeX, which also has other potential benefits, such as interoperability with computer algebra systems, proof systems, and increased accessibility. However, semantic markup is more involved than "regular" typesetting and presents a challenge for authors of mathematical documents. We aim to smooth out the transition from plain LaTeX to semantic markup by developing semi-automatic tools for authors. In this paper we present an approach to semantic markup of formulas by (semi-)automatically generating grammars from existing sTeX macro definitions and parsing mathematical formulas with them. We also present a GUI-based tool for the disambiguation of parse results and showcase its functionality and potential using a grammar for parsing untyped $\lambda$-terms., Comment: Submitted to the CICM 2024 conference, due to be published in Volume 14960 of Springer's Lecture Notes in Computer Science
- Published
- 2024
15. Facets in the Vietoris--Rips complexes of hypercubes
- Author
Briggs, Joseph, Feng, Ziqin, and Wells, Chris
- Subjects
Mathematics - Algebraic Topology
- Abstract
In this paper, we investigate the facets of the Vietoris--Rips complex $\mathcal{VR}(Q_n; r)$, where $Q_n$ denotes the $n$-dimensional hypercube. We are particularly interested in those facets which are somehow independent of the dimension $n$. Using Hadamard matrices, we prove that the number of different dimensions of such facets is a super-polynomial function of the scale $r$, assuming that $n$ is sufficiently large. We also show that the $(2r-1)$-th dimensional homology of the complex $\mathcal{VR}(Q_n; r)$ is non-trivial when $n$ is large enough, provided that a Hadamard matrix of order $2r$ exists.
- Published
- 2024
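The Hadamard matrices central to the abstract above can be generated, for orders that are powers of two, by Sylvester's classical doubling construction. The following is standard material for experimenting with these objects, not the paper's argument:

```python
def sylvester_hadamard(n):
    """Hadamard matrix of order n (n a power of 2) via Sylvester doubling:
    H_{2k} = [[H_k, H_k], [H_k, -H_k]], starting from H_1 = [[1]]."""
    assert n >= 1 and n & (n - 1) == 0, "order must be a power of 2"
    H = [[1]]
    while len(H) < n:
        H = ([row + row for row in H] +
             [row + [-x for x in row] for row in H])
    return H

def is_hadamard(H):
    """Check the defining property H H^T = n I: entries are +-1 and
    distinct rows are pairwise orthogonal."""
    n = len(H)
    for i in range(n):
        for j in range(n):
            dot = sum(H[i][k] * H[j][k] for k in range(n))
            if dot != (n if i == j else 0):
                return False
    return True
```

Sylvester's construction only covers orders $2^k$; the existence of a Hadamard matrix for every order divisible by 4 (the Hadamard conjecture) is open, which is why the abstract's homology statement is conditional on the order-$2r$ matrix existing.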
16. Enabling Contextual Soft Moderation on Social Media through Contrastive Textual Deviation
- Author
Paudel, Pujan, Saeed, Mohammad Hammas, Auger, Rebecca, Wells, Chris, and Stringhini, Gianluca
- Subjects
Computer Science - Computation and Language, Computer Science - Cryptography and Security
- Abstract
Automated soft moderation systems are unable to ascertain whether a post supports or refutes a false claim, resulting in a large number of contextual false positives. This limits their effectiveness, for example by undermining trust in health experts through warnings added to their posts, or by resorting to vague warnings instead of granular fact-checks, which desensitizes users. In this paper, we propose to incorporate stance detection into existing automated soft-moderation pipelines, with the goal of ruling out contextual false positives and providing more precise recommendations for social media content that should receive warnings. We develop a textual deviation task called Contrastive Textual Deviation (CTD) and show that it outperforms existing stance detection approaches when applied to soft moderation. We then integrate CTD into the state-of-the-art system for automated soft moderation, Lambretta, showing that our approach can reduce contextual false positives from 20% to 2.1%, providing another important building block towards deploying reliable automated soft moderation tools on social media.
- Published
- 2024
17. Galaxies and Their Environment at $z \gtrsim 10$ -- I: Primordial Chemical Enrichment, Accretion, Cooling, and Virialization of Gas in Dark Matter Halos
- Author
Hicks, William M., Norman, Michael L., Wells, Azton I., and Bordner, James O.
- Subjects
Astrophysics - Astrophysics of Galaxies
- Abstract
Recent observations made using the James Webb Space Telescope have identified a number of high-redshift galaxies that are unexpectedly luminous. In light of this, it is clear that a more detailed understanding of the high redshift, pre-reionization universe is required for us to obtain the complete story of galaxy formation. This study is the first in a series that seeks to tell the story of galaxy formation at $z \gtrsim 10$ using a suite of large-scale adaptive mesh refinement cosmological simulations. Our machine-learning-accelerated surrogate model for Population III star formation and feedback, StarNet, gives us an unprecedented ability to obtain physically accurate, inhomogeneous chemical initial conditions for a statistically significant number of galaxies. We find that of the 12,423 halos in the mass range of $10^6\,\,M_\odot < M_\mathrm{vir} < 10^9\,\, M_\odot$ that form in our fiducial simulation, $16\%$ are chemically enriched by Population III supernovae by $z\sim12$. We then profile and compare various cooling processes at the centers of halos, and find a complete absence of atomic cooling halos. All of our halos with central cooling gas are dominated by H$_2$ cooling, metal cooling, or a mixture of the two, even in the presence of a strong H$_2$-photodissociating Lyman-Werner background. We also find that gas accretion through the virial radius is not driven by cooling. We find that gas virialization in halos with $M_\mathrm{vir}\gtrsim10^7\,\,M_\odot$ is supported by bulk turbulent flows, and that thermal energy accounts for only a small fraction of the total kinetic energy. Because of this, the mean gas temperature is well below the virial temperature for these halos. We then compute the mass of gas that is available for Population II star formation, and infer star formation rates for each potential star-forming halo., Comment: 36 pages, 27 figures
- Published
- 2024
18. Sensitivity target for an impactful Higgs boson self coupling measurement
- Author
Bhattiprolu, Prudhvi N. and Wells, James D.
- Subjects
High Energy Physics - Phenomenology
- Abstract
We argue that a measurement of the Higgs boson self-coupling becomes particularly meaningful when its sensitivity is within 40% of its Standard Model value. This constitutes a target for a future impactful experimental achievement. It is derived from recently obtained results on how extreme the differences can be between effective field theory operator coefficients when they originate from reasonable theories beyond the Standard Model., Comment: 10 pages, 2 figures
- Published
- 2024
19. Compressive Electron Backscatter Diffraction Imaging
- Author
Broad, Zoë, Robinson, Alex W., Wells, Jack, Nicholls, Daniel, Moshtaghpour, Amirafshar, Kirkland, Angus I., and Browning, Nigel D.
- Subjects
Electrical Engineering and Systems Science - Image and Video Processing, Condensed Matter - Materials Science
- Abstract
Electron backscatter diffraction (EBSD) has developed over the last few decades into a valuable crystallographic characterisation method for a wide range of sample types. Despite these advances, issues such as the complexity of sample preparation, relatively slow acquisition, and damage in beam-sensitive samples, still limit the quantity and quality of interpretable data that can be obtained. To mitigate these issues, here we propose a method based on the subsampling of probe positions and subsequent reconstruction of an incomplete dataset. The missing probe locations (or pixels in the image) are recovered via an inpainting process using a dictionary-learning based method called beta-process factor analysis (BPFA). To investigate the robustness of both our inpainting method and Hough-based indexing, we simulate subsampled and noisy EBSD datasets from a real fully sampled Ni-superalloy dataset for different subsampling ratios of probe positions using both Gaussian and Poisson noise models. We find that zero solution pixel detection (inpainting un-indexed pixels) enables higher quality reconstructions to be obtained. Numerical tests confirm high quality reconstruction of band contrast and inverse pole figure maps from only 10% of the probe positions, with the potential to reduce this to 5% if only inverse pole figure maps are needed. These results show the potential application of this method in EBSD, allowing for faster analysis and extending the use of this technique to beam sensitive materials.
- Published
- 2024
20. AstroMLab 1: Who Wins Astronomy Jeopardy!?
- Author
Ting, Yuan-Sen, Nguyen, Tuan Dung, Ghosal, Tirthankar, Pan, Rui, Arora, Hardik, Sun, Zechang, de Haan, Tijmen, Ramachandra, Nesar, Wells, Azton, Madireddy, Sandeep, and Accomazzi, Alberto
- Subjects
Astrophysics - Instrumentation and Methods for Astrophysics, Astrophysics - Earth and Planetary Astrophysics, Astrophysics - Astrophysics of Galaxies, Astrophysics - Solar and Stellar Astrophysics, Computer Science - Artificial Intelligence, Computer Science - Computation and Language
- Abstract
We present a comprehensive evaluation of proprietary and open-weights large language models using the first astronomy-specific benchmarking dataset. This dataset comprises 4,425 multiple-choice questions curated from the Annual Review of Astronomy and Astrophysics, covering a broad range of astrophysical topics. Our analysis examines model performance across various astronomical subfields and assesses response calibration, crucial for potential deployment in research environments. Claude-3.5-Sonnet outperforms competitors by up to 4.6 percentage points, achieving 85.0% accuracy. For proprietary models, we observed a universal reduction, every 3 to 12 months, in the cost required to achieve a similar score on this particular astronomy benchmark. Open-source models have rapidly improved, with LLaMA-3-70b (80.6%) and Qwen-2-72b (77.7%) now competing with some of the best proprietary models. We identify performance variations across topics, with non-English-focused models generally struggling more in exoplanet-related fields, stellar astrophysics, and instrumentation-related questions. These challenges likely stem from less abundant training data, limited historical context, and rapid recent developments in these areas. This pattern is observed across both open-weights and proprietary models, with regional dependencies evident, highlighting the impact of training-data diversity on model performance in specialized scientific domains. Top-performing models demonstrate well-calibrated confidence, with correlations above 0.9 between confidence and correctness, though they tend to be slightly underconfident. The development of fast, low-cost inference for open-weights models presents new opportunities for affordable deployment in astronomy. The rapid progress observed suggests that LLM-driven research in astronomy may become feasible in the near future., Comment: 45 pages, 12 figures, 7 tables. Submitted to ApJ. Comments welcome. AstroMLab homepage: https://astromlab.org/
- Published
- 2024
21. Learning to Complement and to Defer to Multiple Users
- Author
Zhang, Zheng, Ai, Wenjie, Wells, Kevin, Rosewarne, David, Do, Thanh-Toan, and Carneiro, Gustavo
- Subjects
Computer Science - Computer Vision and Pattern Recognition, Computer Science - Artificial Intelligence
- Abstract
With the development of Human-AI Collaboration in Classification (HAI-CC), integrating users and AI predictions becomes challenging due to the complex decision-making process. This process has three options: 1) AI autonomously classifies, 2) learning to complement, where AI collaborates with users, and 3) learning to defer, where AI defers to users. Despite their interconnected nature, these options have been studied in isolation rather than as components of a unified system. In this paper, we address this weakness with the novel HAI-CC methodology, called Learning to Complement and to Defer to Multiple Users (LECODU). LECODU not only combines learning to complement and learning to defer strategies, but it also incorporates an estimation of the optimal number of users to engage in the decision process. The training of LECODU maximises classification accuracy and minimises collaboration costs associated with user involvement. Comprehensive evaluations across real-world and synthesized datasets demonstrate LECODU's superior performance compared to state-of-the-art HAI-CC methods. Remarkably, even when relying on unreliable users with high rates of label noise, LECODU exhibits significant improvement over both human decision-makers alone and AI alone., Comment: ECCV 2024
- Published
- 2024
22. Effect estimation in the presence of a misclassified binary mediator
- Author
Webb, Kimberly A. Hochstedler and Wells, Martin T.
- Subjects
Statistics - Methodology
- Abstract
Mediation analyses allow researchers to quantify the effect of an exposure variable on an outcome variable through a mediator variable. If a binary mediator variable is misclassified, the resulting analysis can be severely biased. Misclassification is especially difficult to deal with when it is differential and when there are no gold standard labels available. Previous work has addressed this problem using a sensitivity analysis framework or by assuming that misclassification rates are known. We leverage a variable related to the misclassification mechanism to recover unbiased parameter estimates without using gold standard labels. The proposed methods require the reasonable assumption that the sum of the sensitivity and specificity is greater than 1. Three correction methods are presented: (1) an ordinary least squares correction for Normal outcome models, (2) a multi-step predictive value weighting method, and (3) a seamless expectation-maximization algorithm. We apply our misclassification correction strategies to investigate the mediating role of gestational hypertension on the association between maternal age and pre-term birth., Comment: 44 pages, 5 figures, 6 tables
- Published
- 2024
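The identifiability condition in the abstract above (sensitivity plus specificity greater than 1) is the same one behind the classical Rogan-Gladen correction for a misclassified binary proportion. As a much simpler cousin of the paper's methods, and assuming sensitivity and specificity are known (the paper itself avoids needing gold-standard labels), the correction can be sketched as:

```python
def rogan_gladen(observed_prev, sensitivity, specificity):
    """Correct the prevalence of a misclassified binary variable.

    observed_prev : proportion classified positive by the error-prone measure
    Requires sensitivity + specificity > 1, the same identifiability
    condition the mediation-correction methods rely on.
    """
    denom = sensitivity + specificity - 1.0
    assert denom > 0, "correction is not identified when se + sp <= 1"
    corrected = (observed_prev + specificity - 1.0) / denom
    # Clamp to [0, 1]: sampling noise can push the raw estimate outside.
    return min(1.0, max(0.0, corrected))
```

For example, if the true prevalence is 0.3, a classifier with sensitivity 0.9 and specificity 0.8 yields an observed prevalence of 0.3(0.9) + 0.7(0.2) = 0.41, and the correction recovers 0.3.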
23. How We Built Cedar: A Verification-Guided Approach
- Author
Disselkoen, Craig, Eline, Aaron, He, Shaobo, Headley, Kyle, Hicks, Michael, Hietala, Kesha, Kastner, John, Mamat, Anwar, McCutchen, Matt, Rungta, Neha, Shah, Bhakti, Torlak, Emina, and Wells, Andrew
- Subjects
Computer Science - Software Engineering
- Abstract
This paper presents verification-guided development (VGD), a software engineering process we used to build Cedar, a new policy language for expressive, fast, safe, and analyzable authorization. Developing a system with VGD involves writing an executable model of the system and mechanically proving properties about the model; writing production code for the system and using differential random testing (DRT) to check that the production code matches the model; and using property-based testing (PBT) to check properties of unmodeled parts of the production code. Using VGD for Cedar, we can build fast, idiomatic production code, prove our model correct, and find and fix subtle implementation bugs that evade code reviews and unit testing. While carrying out proofs, we found and fixed 4 bugs in Cedar's policy validator, and DRT and PBT helped us find and fix 21 additional bugs in various parts of Cedar.
- Published
- 2024
24. Contested Bodies: Pregnancy, Childrearing, and Slavery in Jamaica by Sasha Turner, and: The Politics of Reproduction: Race, Medicine, and Fertility in the Age of Abolition by Katherine Paugh (review)
- Author
Wells, Andrew
- Published
- 2020
25. Material Phenomenology by Michel Henry (review)
- Author
Wells, Adam
- Published
- 2020
26. Confounded or Controlled? A Systematic Review of Media Comparison Studies Involving Immersive Virtual Reality for STEM Education
- Author
Alyssa P. Lawson, Amedee Marchand Martella, Kristen LaBonte, Cynthia Y. Delgado, Fangzheng Zhao, Justin A. Gluck, Mitchell E. Munns, Ashleigh Wells LeRoy, and Richard E. Mayer
- Abstract
A substantial amount of media comparison research has been conducted in the last decade to investigate whether students learn Science, Technology, Engineering, and Mathematics (STEM) content better in immersive virtual reality (IVR) or more traditional learning environments. However, a thorough review of the design and implementation of conventional and IVR conditions in media comparison studies has not been conducted to examine the extent to which specific affordances of IVR can be pinpointed as the causal factor in enhancing learning. The present review filled this gap in the literature by examining the degree to which conventional and IVR conditions have been controlled on instructional methods and content within the K-12 and higher education STEM literature base. Thirty-eight published journal articles, conference proceedings, and dissertations related to IVR comparison studies in STEM education between the years 2013 and 2022 were coded according to 15 categories. These categories allowed for the extraction of information on the instructional methods and content characteristics of the conventional and IVR conditions to determine the degree of control within each experimental comparison. Results indicated only 26% of all comparisons examined between an IVR and conventional condition were fully controlled on five key control criteria. Moreover, 40% of the comparisons had at least one confound related to instructional method and content. When looking at the outcomes of the studies, it was difficult to gather a clear picture of the benefits or pitfalls of IVR when much of the literature was confounded and/or lacked sufficient information to determine if the conditions were controlled on key variables. Implications and recommendations for future IVR comparison research are discussed.
- Published
- 2024
27. Safety of treating acute pulmonary embolism at home: an individual patient data meta-analysis.
- Author
-
Luijten, Dieuwke, Douillet, Delphine, Luijken, Kim, Tromeur, Cecile, Penaloza, Andrea, Hugli, Olivier, Aujesky, Drahomir, Barco, Stefano, Bledsoe, Joseph, Chang, Kyle, Couturaud, Francis, den Exter, Paul, Font, Carme, Huisman, Menno, Jimenez, David, Kabrhel, Christopher, Kline, Jeffrey, Konstantinides, Stavros, van Mens, Thijs, Otero, Remedios, Peacock, W, Sanchez, Olivier, Stubblefield, William, Valerio, Luca, Vinson, David, Wells, Philip, van Smeden, Maarten, Roy, Pierre-Marie, and Klok, Frederikus
- Subjects
Clinical decision-making ,Early discharge ,Emergency care ,Outpatient care ,Pulmonary embolism ,Humans ,Pulmonary Embolism ,Acute Disease ,Home Care Services ,Hemorrhage ,Male ,Female ,Anticoagulants ,Randomized Controlled Trials as Topic ,Prospective Studies ,Aged ,Natriuretic Peptide ,Brain ,Middle Aged - Abstract
BACKGROUND AND AIMS: Home treatment is considered safe in acute pulmonary embolism (PE) patients selected by a validated triage tool (e.g. simplified PE severity index score or Hestia rule), but there is uncertainty regarding the applicability in underrepresented subgroups. The aim was to evaluate the safety of home treatment by performing an individual patient-level data meta-analysis. METHODS: Ten prospective cohort studies or randomized controlled trials were identified in a systematic search, totalling 2694 PE patients treated at home (discharged within 24 h) and identified by a predefined triage tool. The 14- and 30-day incidences of all-cause mortality and adverse events (combined endpoint of recurrent venous thromboembolism, major bleeding, and/or all-cause mortality) were evaluated. The relative risk (RR) for 14- and 30-day mortalities and adverse events was calculated in subgroups using a random effects model. RESULTS: The 14- and 30-day mortalities were 0.11% [95% confidence interval (CI) 0.0-0.24, I2 = 0] and 0.30% (95% CI 0.09-0.51, I2 = 0). The 14- and 30-day incidences of adverse events were 0.56% (95% CI 0.28-0.84, I2 = 0) and 1.2% (95% CI 0.79-1.6, I2 = 0). Cancer was associated with increased 30-day mortality [RR 4.9; 95% prediction interval (PI) 2.7-9.1; I2 = 0]. Pre-existing cardiopulmonary disease, abnormal troponin, and abnormal (N-terminal pro-)B-type natriuretic peptide [(NT-pro)BNP] at presentation were associated with an increased incidence of 14-day adverse events [RR 3.5 (95% PI 1.5-7.9, I2 = 0), 2.5 (95% PI 1.3-4.9, I2 = 0), and 3.9 (95% PI 1.6-9.8, I2 = 0), respectively], but not mortality. At 30 days, cancer, abnormal troponin, and abnormal (NT-pro)BNP were associated with an increased incidence of adverse events [RR 2.7 (95% PI 1.4-5.2, I2 = 0), 2.9 (95% PI 1.5-5.7, I2 = 0), and 3.3 (95% PI 1.6-7.1, I2 = 0), respectively].
CONCLUSIONS: The incidence of adverse events in home-treated PE patients, selected by a validated triage tool, was very low. Patients with cancer had a three- to five-fold higher incidence of adverse events and death. Patients with increased troponin or (NT-pro)BNP had a three-fold higher risk of adverse events, driven by recurrent venous thromboembolism and bleeding.
- Published
- 2024
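The subgroup comparisons in the abstract above rest on two standard epidemiological quantities: an incidence proportion with its confidence interval, and a relative risk between groups. A minimal sketch of both follows; the counts are hypothetical, not the study's data, and the published analysis pooled patient-level data with a random effects model rather than this crude normal approximation.

```python
import math

def incidence_ci(events: int, n: int, z: float = 1.96):
    """Incidence proportion with a normal-approximation 95% confidence interval."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(p - z * se, 0.0), p + z * se

def relative_risk(events_exposed: int, n_exposed: int, events_ref: int, n_ref: int) -> float:
    """Relative risk: incidence in the exposed group over incidence in the reference group."""
    return (events_exposed / n_exposed) / (events_ref / n_ref)

# Hypothetical counts, e.g. 3 deaths among 2694 home-treated patients.
p, lo, hi = incidence_ci(3, 2694)
print(f"incidence {100 * p:.2f}% (95% CI {100 * lo:.2f}-{100 * hi:.2f})")
print(f"RR {relative_risk(4, 200, 8, 2000):.1f}")  # (4/200) / (8/2000) = 5.0
```

The normal approximation is unreliable at very low event counts (the lower bound is clipped at zero here), which is one reason individual-patient-data pooling is preferred in practice.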
28. Improving laboratory animal genetic reporting: LAG-R guidelines.
- Author
-
Teboul, Lydia, Amos-Landgraf, James, Benavides, Fernando, Birling, Marie-Christine, Brown, Steve, Bryda, Elizabeth, Bunton-Stasyshyn, Rosie, Chin, Hsian-Jean, Crispo, Martina, Delerue, Fabien, Dobbie, Michael, Franklin, Craig, Fuchtbauer, Ernst-Martin, Gao, Xiang, Golzio, Christelle, Haffner, Rebecca, Hérault, Yann, Hrabe de Angelis, Martin, Lloyd, Kevin, Magnuson, Terry, Montoliu, Lluis, Murray, Stephen, Nam, Ki-Hoan, Nutter, Lauryl, Pailhoux, Eric, Pardo Manuel de Villena, Fernando, Peterson, Kevin, Reinholdt, Laura, Sedlacek, Radislav, Seong, Je, Shiroishi, Toshihiko, Smith, Cynthia, Takeo, Toru, Tinsley, Louise, Vilotte, Jean-Luc, Warming, Søren, Wells, Sara, Whitelaw, C, Yoshiki, Atsushi, and Pavlovic, Guillaume
- Subjects
Animals ,Animals ,Laboratory ,Guidelines as Topic ,Reproducibility of Results ,Research Design ,Animal Experimentation ,Biomedical Research - Abstract
The biomedical research community addresses reproducibility challenges in animal studies through standardized nomenclature, improved experimental design, transparent reporting, data sharing, and centralized repositories. The ARRIVE guidelines outline documentation standards for laboratory animals in experiments, but genetic information is often incomplete. To remedy this, we propose the Laboratory Animal Genetic Reporting (LAG-R) framework. LAG-R aims to document animals' genetic makeup in scientific publications, providing essential details for replication and appropriate model use. While verifying complete genetic compositions may be impractical, better reporting and validation efforts enhance the reliability of research. LAG-R standardization will bolster reproducibility, peer review, and overall scientific rigor.
- Published
- 2024
29. Materials descriptors for advanced water dissociation catalysts in bipolar membranes
- Author
-
Sasmal, Sayantan, Chen, Lihaokun, Sarma, Prasad V, Vulpin, Olivia T, Simons, Casey R, Wells, Kacie M, Spontak, Richard J, and Boettcher, Shannon W
- Subjects
Macromolecular and Materials Chemistry ,Chemical Sciences ,Physical Chemistry ,Engineering ,Materials Engineering ,Affordable and Clean Energy ,Nanoscience & Nanotechnology - Abstract
The voltage penalty driving water dissociation (WD) at high current density is a major obstacle in the commercialization of bipolar membrane (BPM) technology for energy devices. Here we show that three materials descriptors, that is, electrical conductivity, microscopic surface area and (nominal) surface-hydroxyl coverage, effectively control the kinetics of WD in BPMs. Using these descriptors and optimizing mass loading, we design new earth-abundant WD catalysts based on nanoparticle SnO2 synthesized at low temperature with high conductivity and hydroxyl coverage. These catalysts exhibit exceptional performance in a BPM electrolyser with low WD overvoltage (η_wd) of 100 ± 20 mV at 1.0 A cm^-2. The new catalyst works equivalently well with hydrocarbon proton-exchange layers as it does with fluorocarbon-based Nafion, thus providing pathways to commercializing advanced BPMs for a broad array of electrolysis, fuel-cell and electrodialysis applications.
- Published
- 2024
30. Posttraumatic Stress Disorder After Spontaneous Coronary Artery Dissection: A Report of the International Spontaneous Coronary Artery Dissection Registry.
- Author
-
Sumner, Jennifer, Kim, Esther, Wood, Malissa, Chi, Gerald, Nolen, Jessica, Grodzinsky, Anna, Gornik, Heather, Kadian-Dodov, Daniella, Wells, Bryan, Hess, Connie, Lewey, Jennifer, Tam, Lori, Henkin, Stanislav, Orford, James, Wells, Gretchen, Kumbhani, Dharam, Lindley, Kathryn, Gibson, C, Leon, Katherine, and Naderi, Sahar
- Subjects
PTSD ,SCAD ,health status ,sleep ,trauma ,treatment ,Female ,Humans ,Male ,Middle Aged ,Coronary Angiography ,Coronary Vessel Anomalies ,Coronary Vessels ,Registries ,Risk Factors ,Stress Disorders ,Post-Traumatic ,Vascular Diseases - Abstract
BACKGROUND: Myocardial infarction secondary to spontaneous coronary artery dissection (SCAD) can be traumatic and potentially trigger posttraumatic stress disorder (PTSD). In a large, multicenter, registry-based cohort, we documented prevalence of lifetime and past-month SCAD-induced PTSD, as well as related treatment seeking, and examined a range of health-relevant correlates of SCAD-induced PTSD. METHODS AND RESULTS: Patients with SCAD were enrolled in the iSCAD (International SCAD) Registry. At baseline, site investigators completed medical report forms, and patients reported demographics, medical/SCAD history, psychosocial factors (including SCAD-induced PTSD symptoms), health behaviors, and health status via online questionnaires. Of 1156 registry patients, 859 patients (93.9% women; mean age, 52.3 years) completed questionnaires querying SCAD-induced PTSD. Nearly 35% (n=298) of patients met diagnostic criteria for probable SCAD-induced PTSD in their lifetime, and 6.4% (n=55) met criteria for probable past-month PTSD. Of 811 patients ever reporting any SCAD-induced PTSD symptoms, 34.8% indicated seeking treatment for this distress. However, 46.0% of the 298 patients with lifetime probable SCAD-induced PTSD diagnoses reported never receiving trauma-related treatment. Younger age at first SCAD, fewer years since SCAD, being single, unemployed status, more lifetime trauma, and history of anxiety were associated with greater past-month PTSD symptom severity in multivariable regression models. Greater past-month SCAD-induced PTSD symptoms were associated with greater past-week sleep disturbance and worse past-month disease-specific health status when adjusting for various risk factors. CONCLUSIONS: Given the high prevalence of SCAD-induced PTSD symptoms, efforts to support screening for these symptoms and connecting patients experiencing distress with empirically supported treatments are critical next steps. 
REGISTRATION: URL: https://www.clinicaltrials.gov; Unique identifier: NCT04496687.
- Published
- 2024
31. The Stochastic Occupation Kernel Method for System Identification
- Author
-
Wells, Michael, Lahouel, Kamel, and Jedynak, Bruno
- Subjects
Statistics - Machine Learning ,Computer Science - Machine Learning ,Electrical Engineering and Systems Science - Systems and Control - Abstract
The method of occupation kernels has been used to learn ordinary differential equations from data in a non-parametric way. We propose a two-step method for learning the drift and diffusion of a stochastic differential equation given snapshots of the process. In the first step, we learn the drift by applying the occupation kernel algorithm to the expected value of the process. In the second step, we learn the diffusion given the drift using a semi-definite program. Specifically, we learn the diffusion squared as a non-negative function in an RKHS associated with the square of a kernel. We present examples and simulations., Comment: 8 pages, 3 figures
- Published
- 2024
32. An Initial Investigation of Language Adaptation for TTS Systems under Low-resource Scenarios
- Author
-
Gong, Cheng, Cooper, Erica, Wang, Xin, Qiang, Chunyu, Geng, Mengzhe, Wells, Dan, Wang, Longbiao, Dang, Jianwu, Tessier, Marc, Pine, Aidan, Richmond, Korin, and Yamagishi, Junichi
- Subjects
Computer Science - Computation and Language ,Electrical Engineering and Systems Science - Audio and Speech Processing - Abstract
Self-supervised learning (SSL) representations from massively multilingual models offer a promising solution for low-resource language speech tasks. Despite advancements, language adaptation in TTS systems remains an open problem. This paper explores the language adaptation capability of ZMM-TTS, a recent SSL-based multilingual TTS system proposed in our previous work. We conducted experiments on 12 languages using limited data with various fine-tuning configurations. We demonstrate that the similarity in phonetics between the pre-training and target languages, as well as the language category, affects the target language's adaptation performance. Additionally, we find that the fine-tuning dataset size and number of speakers influence adaptability. Surprisingly, we also observed that using paired data for fine-tuning is not always optimal compared to audio-only data. Beyond speech intelligibility, our analysis covers speaker similarity, language identification, and predicted MOS., Comment: Accepted to Interspeech 2024
- Published
- 2024
33. TDCOSMO. XVI. Measurement of the Hubble Constant from the Lensed Quasar WGD$\,$2038$-$4008
- Author
-
Wong, Kenneth C., Dux, Frédéric, Shajib, Anowar J., Suyu, Sherry H., Millon, Martin, Mozumdar, Pritom, Wells, Patrick R., Agnello, Adriano, Birrer, Simon, Buckley-Geer, Elizabeth J., Courbin, Frédéric, Fassnacht, Christopher D., Frieman, Joshua, Galan, Aymeric, Lin, Huan, Marshall, Philip J., Poh, Jason, Schuldt, Stefan, Sluse, Dominique, and Treu, Tommaso
- Subjects
Astrophysics - Cosmology and Nongalactic Astrophysics ,Astrophysics - Astrophysics of Galaxies - Abstract
Time-delay cosmography is a powerful technique to constrain cosmological parameters, particularly the Hubble constant ($H_{0}$). The TDCOSMO collaboration is performing an ongoing analysis of lensed quasars to constrain cosmology using this method. In this work, we obtain constraints from the lensed quasar WGD 2038-4008 using new time-delay measurements and previous mass models by TDCOSMO. This is the first TDCOSMO lens to incorporate multiple lens modeling codes and the full time-delay covariance matrix into the cosmological inference. The models are fixed before the time delay is measured, and the analysis is performed blinded with respect to the cosmological parameters to prevent unconscious experimenter bias. We obtain $D_{\Delta t} = 1.68^{+0.40}_{-0.38}$ Gpc using two families of mass models, a power-law describing the total mass distribution, and a composite model of baryons and dark matter, although the composite model is disfavored due to kinematics constraints. In a flat $\Lambda$CDM cosmology, we constrain the Hubble constant to be $H_{0} = 65^{+23}_{-14}\, \rm km\ s^{-1}\,Mpc^{-1}$. The dominant source of uncertainty comes from the time delays, due to the low variability of the quasar. Future long-term monitoring, especially in the era of the Vera C. Rubin Observatory's Legacy Survey of Space and Time, could catch stronger quasar variability and further reduce the uncertainties. This system will be incorporated into an upcoming hierarchical analysis of the entire TDCOSMO sample, and improved time delays and spatially-resolved stellar kinematics could strengthen the constraints from this system in the future., Comment: 8 pages, 5 figures, 3 tables; accepted for publication in Astronomy & Astrophysics
- Published
- 2024
34. CHEOPS in-flight performance: A comprehensive look at the first 3.5 years of operations
- Author
-
Fortier, A., Simon, A. E., Broeg, C., Olofsson, G., Deline, A., Wilson, T. G., Maxted, P. F. L., Brandeker, A., Cameron, A. Collier, Beck, M., Bekkelien, A., Billot, N., Bonfanti, A., Bruno, G., Cabrera, J., Delrez, L., Demory, B. -O., Futyan, D., Florén, H. -G., Günther, M. N., Heitzmann, A., Hoyer, S., Isaak, K. G., Sousa, S. G., Stalport, M., Turin, A., Verhoeve, P., Akinsanmi, B., Alibert, Y., Alonso, R., Bánhidi, D., Bárczy, T., Barrado, D., Barros, S. C., Baumjohann, W., Baycroft, T., Beck, T., Benz, W., Bíró, B. I., Bódi, A., Bonfils, X., Borsato, L., Charnoz, S., Cseh, B., Csizmadia, Sz., Csányi, I., Cubillos, P. E., Davies, M. B., Davis, Y. T., Deleuil, M., Demangeon, O. D. S., Derekas, A., Dransfield, G., Ducrot, E., Ehrenreich, D., Erikson, A., Fariña, C., Fossati, L., Fridlund, M., Gandolfi, D., Garai, Z., Garcia, L., Gillon, M., Chew, Y. Gómez Maqueo, Gómez-Muñoz, M. A., Granata, V., Güdel, M., Guterman, P., Hegedüs, T., Helling, Ch., Jehin, E., Kalup, Cs., Kilkenny, D., Kiss, L., Kriskovics, L., Lam, K. W. F., Laskar, J., Etangs, A. Lecavelier des, Lendl, M., Pina, A. Lopez, Luntzer, A., Magrin, D., Miller, N. J., Contreras, D. Modrego, Mordasini, C., Munari, M., Murray, C. A., Nascimbeni, V., Ottacher, H., Ottensamer, R., Pagano, I., Pál, A., Pallé, E., Pasetti, A., Pedersen, P., Peter, G., Petrucci, R., Piotto, G., Pizarro-Rubio, A., Pollacco, D., Pribulla, T., Queloz, D., Ragazzoni, R., Rando, N., Rauer, H., Ribas, I., Sabin, L., Santos, N. C., Scandariato, G., Schanche, N., Schroffenegger, U., Scutt, O. J., Sebastian, D., Ségransan, D., Seli, B., Smith, A. M. S., Southworth, R., Standing, M. R., Szabó, M. Gy., Szakáts, R., Thomas, N., Timmermans, M., Triaud, A. H. M. J., Udry, S., Van Grootel, V., Venturini, J., Villaver, E., Vinkó, J., Walton, N. A., Wells, R., and Wolter, D.
- Subjects
Astrophysics - Instrumentation and Methods for Astrophysics ,Astrophysics - Earth and Planetary Astrophysics - Abstract
CHEOPS is a space telescope specifically designed to monitor transiting exoplanets orbiting bright stars. In September 2023, CHEOPS completed its nominal mission and remains in excellent operational conditions. The mission has been extended until the end of 2026. Scientific and instrumental data have been collected throughout in-orbit commissioning and nominal operations, enabling a comprehensive analysis of the mission's performance. In this article, we present the results of this analysis with a twofold goal. First, we aim to inform the scientific community about the present status of the mission and what can be expected as the instrument ages. Secondly, we intend for this publication to serve as a legacy document for future missions, providing insights and lessons learned from the successful operation of CHEOPS. To evaluate the instrument performance in flight, we developed a comprehensive monitoring and characterisation programme. It consists of dedicated observations that allow us to characterise the instrument's response. In addition to the standard collection of nominal science and housekeeping data, these observations provide input for detecting, modelling, and correcting instrument systematics, discovering and addressing anomalies, and comparing the instrument's actual performance with expectations. The precision of the CHEOPS measurements has enabled the mission objectives to be met and exceeded. Careful modelling of the instrumental systematics allows the data quality to be significantly improved during the light curve analysis phase, resulting in more precise scientific measurements. CHEOPS is compliant with the driving scientific requirements of the mission. Although visible, the ageing of the instrument has not affected the mission's performance., Comment: Accepted for publication in Astronomy and Astrophysics
- Published
- 2024
35. Low-Mass Galaxy Interactions Trigger Black Hole Activity
- Author
-
Mićić, Marko, Irwin, Jimmy A., Nair, Preethi, Wells, Brenna N., Holmes, Olivia J., and Eames, Jackson T.
- Subjects
Astrophysics - Astrophysics of Galaxies ,Astrophysics - Cosmology and Nongalactic Astrophysics - Abstract
The existence of high-$z$ over-massive supermassive black holes represents a major conundrum in our understanding of black hole evolution. In this paper, we probe from the observational point of view how early Universe environmental conditions could have acted as an evolutionary mechanism for the accelerated growth of the first black holes. Under the assumption that the early Universe is dominated by dwarf galaxies, we investigate the hypothesis that dwarf-dwarf galaxy interactions trigger black hole accretion. We present the discovery of 82 dwarf-dwarf galaxy pairs and 11 dwarf galaxy groups using the Hubble Space Telescope, doubling existing samples. The dwarf systems span a redshift range of 0.13$<$z$<$1.5, and a stellar mass range of 7.24$<$log(M$_*$/\(M_\odot\))$<$9.73. We performed an X-ray study of a subset of these dwarf systems with Chandra and detected six new AGN, increasing the number of known dwarf-dwarf-merger-related AGN from one to seven. We then compared the frequency of these AGN in grouped/paired dwarfs to that of isolated dwarfs and found a statistically significant enhancement (4$\sigma$-6$\sigma$) in the interacting sample. This study, the first of its kind at the lowest mass scales, implies that the presence of a nearby dwarf neighbor is efficient in triggering black hole accretion. These results open new avenues for indirect studies of the emergence of the first supermassive black holes., Comment: 19 pages, 5 figures, 4 tables. Accepted for publication in the Astrophysical Journal Letters
- Published
- 2024
36. Patient-Specific Real-Time Segmentation in Trackerless Brain Ultrasound
- Author
-
Dorent, Reuben, Torio, Erickson, Haouchine, Nazim, Galvin, Colin, Frisken, Sarah, Golby, Alexandra, Kapur, Tina, and Wells, William
- Subjects
Electrical Engineering and Systems Science - Image and Video Processing ,Computer Science - Computer Vision and Pattern Recognition - Abstract
Intraoperative ultrasound (iUS) imaging has the potential to improve surgical outcomes in brain surgery. However, its interpretation is challenging, even for expert neurosurgeons. In this work, we designed the first patient-specific framework that performs brain tumor segmentation in trackerless iUS. To disambiguate ultrasound imaging and adapt to the neurosurgeon's surgical objective, a patient-specific real-time network is trained using synthetic ultrasound data generated by simulating virtual iUS sweep acquisitions in pre-operative MR data. Extensive experiments performed in real ultrasound data demonstrate the effectiveness of the proposed approach, allowing for adapting to the surgeon's definition of surgical targets and outperforming non-patient-specific models, neurosurgeon experts, and high-end tracking systems. Our code is available at: https://github.com/ReubenDo/MHVAE-Seg., Comment: Early accept at MICCAI 2024 - code available at: https://github.com/ReubenDo/MHVAE-Seg
- Published
- 2024
37. Some Notes on the Sample Complexity of Approximate Channel Simulation
- Author
-
Flamich, Gergely and Wells, Lennie
- Subjects
Computer Science - Information Theory ,Computer Science - Machine Learning ,68P30 ,G.3 ,E.4 - Abstract
Channel simulation algorithms can efficiently encode random samples from a prescribed target distribution $Q$ and find applications in machine learning-based lossy data compression. However, algorithms that encode exact samples usually have random runtime, limiting their applicability when a consistent encoding time is desirable. Thus, this paper considers approximate schemes with a fixed runtime instead. First, we strengthen a result of Agustsson and Theis and show that there is a class of pairs of target distribution $Q$ and coding distribution $P$, for which the runtime of any approximate scheme scales at least super-polynomially in $D_\infty[Q \Vert P]$. We then show, by contrast, that if we have access to an unnormalised Radon-Nikodym derivative $r \propto dQ/dP$ and knowledge of $D_{KL}[Q \Vert P]$, we can exploit global-bound, depth-limited A* coding to ensure $\mathrm{TV}[Q \Vert P] \leq \epsilon$ and maintain optimal coding performance with a sample complexity of only $\exp_2\big((D_{KL}[Q \Vert P] + o(1)) \big/ \epsilon\big)$., Comment: Accepted as a spotlight paper at the first 'Learn to Compress' Workshop@ ISIT 2024. V2: corrected some typos and simplified Appendix C
- Published
- 2024
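To get a sense of the scale of the sample-complexity bound quoted in the abstract above, $\exp_2\big((D_{KL}[Q \Vert P] + o(1)) \big/ \epsilon\big)$, here is a minimal sketch that drops the $o(1)$ term; the divergence value of 2 bits is illustrative, not from the paper.

```python
def approx_sample_complexity(d_kl_bits: float, eps: float) -> float:
    """Leading behaviour of the bound 2 ** ((D_KL[Q||P] + o(1)) / eps), o(1) term dropped."""
    return 2.0 ** (d_kl_bits / eps)

# Illustrative: with D_KL = 2 bits, halving the tolerance eps squares the sample count.
for eps in (0.5, 0.25, 0.125):
    print(f"eps = {eps}: about {approx_sample_complexity(2.0, eps):g} samples")
# eps = 0.5 -> 2**4 = 16; eps = 0.25 -> 2**8 = 256; eps = 0.125 -> 2**16 = 65536.
```

The exponential dependence on $1/\epsilon$ is the point: tightening the total-variation tolerance is far more expensive than increasing the divergence budget.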
38. Intersection Types via Finite-Set Declarations
- Author
-
Kamareddine, Fairouz and Wells, Joe
- Subjects
Computer Science - Logic in Computer Science ,Mathematics - Logic ,03, 68 ,F.4 - Abstract
The lambda-cube is a famous pure type system (PTS) cube of eight powerful explicit type systems that include the simple, polymorphic and dependent type theories. The lambda-cube only types Strongly Normalising (SN) terms but not all of them. It is well known that even the most powerful system of the lambda-cube can only type the same pure untyped lambda-terms that are typable by the higher-order polymorphic implicitly typed lambda-calculus Fomega, and that there is an untyped lambda-term U' that is SN but is not typable in Fomega or the lambda-cube. Hence, neither system can type all the SN terms it expresses. In this paper, we present the f-cube, an extension of the lambda-cube with finite-set declarations (FSDs) like y\in{C1,...,Cn} : B which means that y is of type B and can only be one of C1,..., Cn. The novelty of our FSDs is that they allow us to represent intersection types as Pi-types. We show how to translate and type the term U' in the f-cube using an encoding of intersection types based on FSDs. Notably, our translation works without needing anything like the usual troublesome intersection-introduction rule that proves a pure untyped lambda-term M has an intersection of k types using k independent sub-derivations. As such, our approach is useful for language implementers who want the power of intersection types without the pain of the intersection-introduction rule., Comment: To appear in Wollic 2024
- Published
- 2024
39. A High-order Arbitrary Lagrangian-Eulerian Virtual Element Method for Convection-Diffusion Problems
- Author
-
Wells, H.
- Subjects
Mathematics - Numerical Analysis - Abstract
A virtual element discretisation of an Arbitrary Lagrangian-Eulerian method for two-dimensional convection-diffusion equations is proposed, employing an isoparametric Virtual Element Method to achieve higher-order convergence rates on curved-edged polygonal meshes. The proposed method is validated with numerical experiments in which optimal $H^1$ and $L^2$ convergence rates are observed. This method is then successfully applied to an existing moving mesh algorithm for implicit moving boundary problems, in which higher-order convergence is achieved.
- Published
- 2024
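Optimal-convergence claims of the kind made in the abstract above are conventionally checked by estimating the observed order $p$ from errors on successively refined meshes, assuming $e \approx C h^p$. A minimal sketch with hypothetical error values (not taken from the paper):

```python
import math

def observed_order(h_coarse: float, h_fine: float, e_coarse: float, e_fine: float) -> float:
    """Observed convergence order from e ~ C * h**p on two meshes:
    p = log(e_coarse / e_fine) / log(h_coarse / h_fine)."""
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# Hypothetical L2 errors for a method expected to be second order under mesh halving.
hs = [0.2, 0.1, 0.05]
errs = [4.0e-3, 1.0e-3, 2.5e-4]
for h0, h1, e0, e1 in zip(hs, hs[1:], errs, errs[1:]):
    print(f"h {h0} -> {h1}: observed order p = {observed_order(h0, h1, e0, e1):.2f}")  # 2.00 each
```

An error that drops by a factor of four when the mesh size halves gives p = 2, matching the optimal $L^2$ rate of a second-order scheme.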
40. Isoparametric Virtual Element Methods
- Author
-
Cangiani, Andrea, Dedner, Andreas, Hubbard, Matthew, and Wells, Harry
- Subjects
Mathematics - Numerical Analysis - Abstract
We present two approaches to constructing isoparametric Virtual Element Methods of arbitrary order for linear elliptic partial differential equations on general two-dimensional domains. The first method approximates the variational problem transformed onto a computational reference domain. The second method computes a virtual domain and uses bespoke polynomial approximation operators to construct a computable method. Both methods are shown to converge optimally, a behaviour confirmed in practice for the solution of problems posed on curved domains.
- Published
- 2024
41. Absolute dimensions of solar-type eclipsing binaries. NY Hya: A test for magnetic stellar evolution models
- Author
-
Hinse, T. C., Baştürk, O., Southworth, J., Feiden, G. A., Tregloan-Reed, J., Kostov, V. B., Livingston, J., Esmer, E. M., Yılmaz, Mesut, Yalçınkaya, Selçuk, Torun, Şeyma, Vos, J., Evans, D. F., Morales, J. C., Wolf, J. C. A., Olsen, E. H., Clausen, J. V., Helt, B. E., Lý, C. T. K., Stahl, O., Wells, R., Herath, M., Jørgensen, U. G., Dominik, M., Skottfelt, J., Peixinho, N., Longa-Peña, P., Kim, Y., Kim, H. -E., Yoon, T. S., Alrebdi, H. I., and Zotos, E. E.
- Subjects
Astrophysics - Solar and Stellar Astrophysics - Abstract
The binary star NY Hya is a bright, detached, double-lined eclipsing system with an orbital period of just under five days with two components each nearly identical to the Sun and located in the solar neighbourhood. The objective of this study is to test and confront various stellar evolution models for solar-type stars based on accurate measurements of stellar mass and radius. We present new ground-based spectroscopic and photometric as well as high-precision space-based photometric and astrometric data from which we derive orbital as well as physical properties of the components via the method of least-squares minimisation based on a standard binary model valid for two detached components. Classic statistical techniques were invoked to test the significance of model parameters. Additional empirical evidence was compiled from the public domain; the derived system properties were compared with archival broad-band photometry data enabling a measurement of the system's spectral energy distribution that allowed an independent estimate of stellar properties. We also utilised semi-empirical calibration methods to derive atmospheric properties from Str\"{o}mgren photometry and related colour indices. Data was used to confront the observed physical properties with classic and magnetic stellar evolution models., Comment: 34 pages, 19 figures, 13 tables, (accepted for publication in A&A)
- Published
- 2024
42. Frogs, hats and common subsequences
- Author
-
Briggs, Joseph, Parker, Alex, Schwieder, Coy, and Wells, Chris
- Subjects
Mathematics - Combinatorics ,Mathematics - Probability ,05A19, 60J10 - Abstract
Write $W^{(n)}$ to mean the $n$-letter word obtained by repeating a fixed word $W$ and let $R_n$ denote a uniformly random $n$-letter word sampled from the same alphabet as $W$. We are interested in the average length of the longest common subsequence between $W^{(n)}$ and $R_n$, which is known to be $\gamma(W)\cdot n+o(n)$ for some constant $\gamma(W)$. Bukh and Cox recently developed an interacting particle system, dubbed the frog dynamics, which can be used to compute the constant $\gamma(W)$ for any fixed word $W$. They successfully analyzed the simplest case of the frog dynamics to find an explicit formula for the constants $\gamma(12\cdots k)$. We continue this study by using the frog dynamics to find an explicit formula for the constants $\gamma(12\cdots kk\cdots 21)$. The frog dynamics in this case is a variation of the PushTASEP on the ring where some clocks are identical. Interestingly, exclusion processes with correlated clocks of this type appear to have not been analyzed before. Our analysis leads to a seemingly new combinatorial object which could be of independent interest: frogs with hats!, Comment: 29 pages, 7 figures
- Published
- 2024
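The constant $\gamma(W)$ described in the abstract above can also be estimated naively by Monte Carlo, averaging the longest-common-subsequence length between $W^{(n)}$ and random words $R_n$; this sketch uses the textbook LCS dynamic program, not the frog dynamics the paper analyzes.

```python
import random

def lcs_length(a: str, b: str) -> int:
    """Classic O(|a||b|) dynamic program for the longest common subsequence length."""
    prev = [0] * (len(b) + 1)
    for ca in a:
        cur = [0]
        for j, cb in enumerate(b, 1):
            cur.append(prev[j - 1] + 1 if ca == cb else max(prev[j], cur[-1]))
        prev = cur
    return prev[-1]

def gamma_estimate(word: str, n: int, trials: int = 10, seed: int = 0) -> float:
    """Monte Carlo estimate of gamma(W): mean LCS(W^(n), R_n) / n over random words R_n."""
    rng = random.Random(seed)
    alphabet = sorted(set(word))
    w_n = (word * (n // len(word) + 1))[:n]  # W repeated and truncated to n letters
    total = sum(lcs_length(w_n, "".join(rng.choices(alphabet, k=n))) for _ in range(trials))
    return total / (trials * n)

print(f"gamma(12) is roughly {gamma_estimate('12', 200):.3f} at n = 200")
```

For fixed n this only approximates the limit constant (the error is the o(n)/n term); the frog dynamics is what yields exact formulas such as those for $\gamma(12\cdots k)$.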
43. Demonstration of weighted graph optimization on a Rydberg atom array using local light-shifts
- Author
-
de Oliveira, A. G., Diamond-Hitchcock, E., Walker, D. M., Wells-Pestell, M. T., Pelegrí, G., Picken, C. J., Malcolm, G. P. A., Daley, A. J., Bass, J., and Pritchard, J. D.
- Subjects
Quantum Physics ,Physics - Atomic Physics - Abstract
Neutral atom arrays have emerged as a versatile platform towards scalable quantum computation and optimization. In this paper we present the first demonstrations of weighted graph optimization on a Rydberg atom array using annealing with local light-shifts. We verify the ability to prepare weighted graphs in 1D and 2D arrays, including embedding a five-vertex non-unit disk graph using nine physical qubits and demonstration of a simple crossing gadget. We find common annealing ramps leading to preparation of the target ground state robustly over a substantial range of different graph weightings. This work provides a route to exploring large-scale optimization of non-planar weighted graphs relevant for solving real-world problems., Comment: 7 pages, 5 figures. Methods: 3 pages, 2 figures
- Published
- 2024
44. Is it time to start moving soil microbial fuel cell research out of the lab and into the field?
- Author
-
Taylor, Stephen, Jaliff, Laura, Wells, George, and Josephson, Colleen
- Subjects
Biological Sciences ,Industrial Biotechnology ,Affordable and Clean Energy ,Microbial fuel cell ,Soil ,Soil microbial fuel cell ,Environmental Sciences - Abstract
Soil microbial fuel cells (SMFCs) function as bioelectrochemical energy harvesters that convert electrons stored in soil organic matter into useful electrical energy. Broadly, an SMFC comprises three essential components: an anode buried in the soil (the negative terminal), a colony of exoelectrogenic microorganisms residing on this anode, and a cathode (the positive terminal). As the exoelectrogens respire, they release electrons to the anode, which acts as an external receptor. These released electrons then flow through a load (e.g. a resistor), connecting the anode and cathode. Though minuscule, the electrical power produced by SMFCs has a number of potential applications such as sustaining low-power embedded electronics, pollutant remediation, or as a bio-sensing proxy for soil qualities and microbial activity. This discussion aims to emphasize the potential of SMFCs in addressing real-world environmental issues and to generate interest in the larger scientific community for broad interdisciplinary research efforts, particularly in field deployments.
- Published
- 2024
45. Invasive Assessment of Coronary Artery Disease in Clonal Hematopoiesis of Indeterminate Potential.
- Author
-
Heimlich, J Brett, Raddatz, Michael A, Wells, John, Vlasschaert, Caitlyn, Olson, Sydney, Threadcraft, Marcus, Foster, Kristoff, Boateng, Emmanuel, Umbarger, Kelsey, Su, Yan Ru, Roden, Dan M, Barker, Colin M, and Bick, Alexander G
- Subjects
Biomedical and Clinical Sciences ,Cardiovascular Medicine and Haematology ,Cardiovascular ,Stem Cell Research ,Atherosclerosis ,Genetics ,Heart Disease - Coronary Heart Disease ,Heart Disease ,2.1 Biological and endogenous factors ,Aetiology ,Good Health and Well Being ,atherosclerosis ,genetics ,heart failure ,mutation ,observational cohort ,Medical Biotechnology ,Cardiorespiratory Medicine and Haematology ,Cardiovascular System & Hematology ,Cardiovascular medicine and haematology - Abstract
Background: Clonal hematopoiesis of indeterminate potential (CHIP) occurs due to acquired mutations in bone marrow progenitor cells. CHIP confers a 2-fold risk of atherosclerotic cardiovascular disease. However, there are limited data regarding specific cardiovascular phenotypes. The purpose of this study was to define the coronary artery disease phenotype of the CHIP population based on coronary angiography. Methods: We recruited 1142 patients from the Vanderbilt University Medical Center cardiac catheterization laboratory and performed DNA sequencing to determine CHIP status. Multivariable logistic regression models and proportional odds models were used to assess the association between CHIP status and angiography phenotypes. Results: We found that 18.4% of patients undergoing coronary angiography had a CHIP mutation. Those with CHIP had a higher risk of obstructive left main (odds ratio, 2.44 [95% CI, 1.40-4.27]; P=0.0018) and left anterior descending (odds ratio, 1.59 [1.12-2.24]; P=0.0092) coronary artery disease compared with non-CHIP carriers. We additionally found that a specific CHIP mutation, ten-eleven translocation 2 (TET2), has a larger effect size on left main stenosis than other CHIP mutations. Conclusions: This is the first invasive assessment of coronary artery disease in CHIP and offers a description of a specific atherosclerotic phenotype in CHIP wherein there is an increased risk of obstructive left main and left anterior descending artery stenosis, especially among TET2 mutation carriers. This serves as a basis for understanding enhanced morbidity and mortality in CHIP.
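For context on the reported effect sizes: a logistic-regression coefficient beta maps to an odds ratio via OR = exp(beta), with a 95% CI of exp(beta ± 1.96·SE). This sketch backs out beta and SE from the published left-main interval (the z-value is the standard normal quantile; the round-trip tolerance is an assumption):

```python
import math

def odds_ratio_ci(beta, se, z=1.959964):
    """Convert a logistic-regression coefficient to an OR with 95% CI."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Back out beta and its standard error from the left-main result
# reported above (OR 2.44, 95% CI 1.40-4.27).
beta = math.log(2.44)
se = (math.log(4.27) - math.log(1.40)) / (2 * 1.959964)
or_, ci_lo, ci_hi = odds_ratio_ci(beta, se)
```

The round trip reproduces the published interval to within rounding, since the reported OR is (to two decimals) the geometric midpoint of its CI.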
- Published
- 2024
46. Harm Reduction in the Field: First Responders’ Perceptions of Opioid Overdose Interventions
- Author
-
Fockele, Callan Elswick, Frohe, Tessa, McBride, Owen, Perlmutter, David L., Goh, Brenda, Williams, Grover, Wettemann, Courteney, Holland, Nathan, Finegood, Brad, Oliphant-Wells, Thea, Williams, Emily C., and van Draanen, Jenna
- Subjects
opioid use disorder ,opioid overdose ,naloxone ,buprenorphine ,HIV testing ,HCV testing ,emergency medical services - Abstract
Introduction: Recent policy changes in Washington State presented a unique opportunity to pair evidence-based interventions with first responder services to combat increasing opioid overdoses. However, little is known about how these interventions should be implemented. In partnership with the Research with Expert Advisors on Drug Use team, a group of academically trained and community-trained researchers with lived and living experience of substance use, we examined facilitators and barriers to adopting leave-behind naloxone, field-based buprenorphine initiation, and HIV and hepatitis C virus (HCV) testing for first responder programs. Methods: Our team completed semi-structured, qualitative interviews with 32 first responders, mobile integrated health staff, and emergency medical services (EMS) leaders in King County, Washington, from February–May 2022. Semi-structured interviews were recorded, transcribed, and coded using an integrated deductive and inductive thematic analysis approach grounded in community-engaged research principles. We collected data until saturation was achieved. Data collection and analysis were informed by the Consolidated Framework for Implementation Research. Two investigators coded independently until 100% consensus was reached. Results: Our thematic analysis revealed several perceived facilitators (ie, tension for change, relative advantage, and compatibility) and barriers (ie, limited adaptability, lack of evidence strength and quality, and prohibitive cost) to the adoption of these evidence-based clinical interventions for first responder systems. There was widespread support for the distribution of leave-behind naloxone, although funding was identified as a barrier. Many believed field-based initiation of buprenorphine treatment could provide a more effective response to overdose management, but there were significant concerns that this intervention could run counter to the rapid care model. Lastly, participants worried that HIV and HCV testing was inappropriate for first responders to conduct but recommended that this service be provided by mobile integrated health staff. Conclusion: These results have informed local EMS strategic planning, which will guide the rollout of process improvements in King County, Washington. Future work should evaluate the impact of these interventions on the health of overdose survivors.
- Published
- 2024
47. Transcobalamin receptor antibodies in autoimmune vitamin B12 central deficiency
- Author
-
Pluvinage, John V, Ngo, Thomas, Fouassier, Camille, McDonagh, Maura, Holmes, Brandon B, Bartley, Christopher M, Kondapavulur, Sravani, Hurabielle, Charlotte, Bodansky, Aaron, Pai, Vincent, Hinman, Sam, Aslanpour, Ava, Alvarenga, Bonny D, Zorn, Kelsey C, Zamecnik, Colin, McCann, Adrian, Asencor, Andoni I, Huynh, Trung, Browne, Weston, Tubati, Asritha, Haney, Michael S, Douglas, Vanja C, Louine, Martineau, Cree, Bruce AC, Hauser, Stephen L, Seeley, William, Baranzini, Sergio E, Wells, James A, Spudich, Serena, Farhadian, Shelli, Ramachandran, Prashanth S, Gillum, Leslie, Hales, Chadwick M, Zikherman, Julie, Anderson, Mark S, Yazdany, Jinoos, Smith, Bryan, Nath, Avindra, Suh, Gina, Flanagan, Eoin P, Green, Ari J, Green, Ralph, Gelfand, Jeffrey M, DeRisi, Joseph L, Pleasure, Samuel J, and Wilson, Michael R
- Subjects
Biomedical and Clinical Sciences ,Clinical Sciences ,Women's Health ,Autoimmune Disease ,Brain Disorders ,Dietary Supplements ,Nutrition ,Clinical Research ,Minority Health ,Neurosciences ,Biotechnology ,2.1 Biological and endogenous factors ,Neurological ,Humans ,Vitamin B 12 Deficiency ,Vitamin B 12 ,Autoantibodies ,Female ,Receptors ,Cell Surface ,Antigens ,CD ,Middle Aged ,Autoimmune Diseases ,Blood-Brain Barrier ,Male ,Biological Sciences ,Medical and Health Sciences ,Medical biotechnology ,Biomedical engineering - Abstract
Vitamin B12 is critical for hematopoiesis and myelination. Deficiency can cause neurologic deficits including loss of coordination and cognitive decline. However, diagnosis relies on measurement of vitamin B12 in the blood, which may not accurately reflect the concentration in the brain. Using programmable phage display, we identified an autoantibody targeting the transcobalamin receptor (CD320) in a patient with progressive tremor, ataxia, and scanning speech. Anti-CD320 impaired cellular uptake of cobalamin (B12) in vitro by depleting its target from the cell surface. Despite a normal serum concentration, B12 was nearly undetectable in her cerebrospinal fluid (CSF). Immunosuppressive treatment and high-dose systemic B12 supplementation were associated with increased B12 in the CSF and clinical improvement. Optofluidic screening enabled isolation of a patient-derived monoclonal antibody that impaired B12 transport across an in vitro model of the blood-brain barrier (BBB). Autoantibodies targeting the same epitope of CD320 were identified in seven other patients with neurologic deficits of unknown etiology, 6% of healthy controls, and 21.4% of a cohort of patients with neuropsychiatric lupus. In 132 paired serum and CSF samples, detection of anti-CD320 in the blood predicted B12 deficiency in the brain. However, these individuals did not display any hematologic signs of B12 deficiency despite systemic CD320 impairment. Using a genome-wide CRISPR screen, we found that the low-density lipoprotein receptor serves as an alternative B12 uptake pathway in hematopoietic cells. These findings dissect the tissue specificity of B12 transport and elucidate an autoimmune neurologic condition that may be amenable to immunomodulatory treatment and nutritional supplementation.
- Published
- 2024
48. Differentially co‐expressed myofibre transcripts associated with abnormal myofibre proportion in chronic obstructive pulmonary disease
- Author
-
Chiles, Joe W, Wilson, Ava C, Tindal, Rachel, Lavin, Kaleen, Windham, Samuel, Rossiter, Harry B, Casaburi, Richard, Thalacker‐Mercer, Anna, Buford, Thomas W, Patel, Rakesh, Wells, J Michael, Bamman, Marcas M, Hanaoka, Beatriz Y, Dransfield, Mark, and McDonald, Merry‐Lynn N
- Subjects
Medical Physiology ,Biomedical and Clinical Sciences ,Health Sciences ,Lung ,Genetics ,Human Genome ,Chronic Obstructive Pulmonary Disease ,Musculoskeletal ,Respiratory ,Humans ,Pulmonary Disease ,Chronic Obstructive ,Male ,Female ,Aged ,Middle Aged ,Muscle Fibers ,Skeletal ,Muscle ,Skeletal ,Transcriptome ,Gene Expression Profiling ,COPD ,fibre-type shift ,myofibre proportions ,sex differences ,skeletal muscle ,transcriptomics ,fibre‐type shift ,Physiology ,Clinical Sciences ,Human Movement and Sports Sciences ,Clinical sciences ,Allied health and rehabilitation science ,Sports science and exercise - Abstract
Background: Skeletal muscle dysfunction is a common extrapulmonary manifestation of chronic obstructive pulmonary disease (COPD). Alterations in skeletal muscle myosin heavy chain expression, with reduced type I and increased type II myosin heavy chain expression, are associated with COPD severity when studied in largely male cohorts. The objectives of this study were (1) to define an abnormal myofibre proportion phenotype in both males and females with COPD and (2) to identify transcripts and transcriptional networks associated with abnormal myofibre proportion in COPD. Methods: Forty-six participants with COPD were assessed for body composition, strength, endurance and pulmonary function. Skeletal muscle biopsies from the vastus lateralis were assayed for fibre-type distribution and cross-sectional area via immunofluorescence microscopy and RNA-sequenced to generate transcriptome-wide gene expression data. Sex-stratified k-means clustering of type I and IIx/IIax fibre proportions was used to define abnormal myofibre proportion in participants with COPD and contrasted with previously defined criteria. Single transcripts and weighted co-expression network analysis modules were tested for correlation with the abnormal myofibre proportion phenotype. Results: Abnormal myofibre proportion was defined in males with COPD (n = 29) as 22% type IIx/IIax fibres and in females with COPD (n = 17) as 12% type IIx/IIax fibres. Half of the participants with COPD were classified as having an abnormal myofibre proportion. Participants with COPD and an abnormal myofibre proportion had lower median handgrip strength (26.1 vs. 34.0 kg, P = 0.022), 6-min walk distance (300 vs. 353 m, P = 0.039) and forced expiratory volume in 1 s-to-forced vital capacity ratio (0.42 vs. 0.48, P = 0.041) compared with participants with COPD and normal myofibre proportions. Twenty-nine transcripts were associated with abnormal myofibre proportions in participants with COPD, with the upregulated NEB, TPM1 and TPM2 genes having the largest fold differences. Co-expression network analysis revealed that two transcript modules were significantly positively associated with the presence of abnormal myofibre proportions. One of these co-expression modules contained genes classically associated with muscle atrophy, as well as transcripts associated with both type I and type II myofibres, and was enriched for genetic loci associated with bone mineral density. Conclusions: Our findings indicate that there are significant transcriptional alterations associated with abnormal myofibre proportions in participants with COPD. Transcripts canonically associated with both type I and type IIa fibres were enriched in a co-expression network associated with abnormal myofibre proportion, suggesting altered transcriptional regulation across multiple fibre types.
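The sex-stratified k-means step can be illustrated with a minimal 1-D Lloyd's-algorithm sketch; this is not the authors' code, and the fibre percentages below are hypothetical:

```python
def abnormal_cutoff(proportions, iters=100):
    """1-D k-means (Lloyd's algorithm, k=2) seeded at the min and max.

    Returns the smallest value assigned to the high cluster, i.e. a
    data-driven 'abnormal' cutoff. Assumes not all values are identical.
    """
    lo, hi = min(proportions), max(proportions)
    for _ in range(iters):
        # Assign each value to the nearer of the two cluster centres.
        low = [v for v in proportions if abs(v - lo) <= abs(v - hi)]
        high = [v for v in proportions if abs(v - lo) > abs(v - hi)]
        nxt = (sum(low) / len(low), sum(high) / len(high))
        if nxt == (lo, hi):   # centres stopped moving: converged
            break
        lo, hi = nxt
    return min(high)

# Hypothetical type IIx/IIax percentages for one sex stratum.
males_iix = [5, 8, 10, 12, 25, 30, 35]
cutoff = abnormal_cutoff(males_iix)  # -> 25
```

Run separately per sex, the smallest value landing in the high cluster serves as that stratum's abnormal-proportion threshold, analogous in spirit to the per-sex cutoffs reported above.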
- Published
- 2024
49. Antibody discovery identifies regulatory mechanisms of protein arginine deiminase 4.
- Author
-
Zhou, Xin, Kong, Sophie, Maker, Allison, Remesh, Soumya, Leung, Kevin, Verba, Kliment, and Wells, James
- Subjects
Protein-Arginine Deiminase Type 4 ,Humans ,Catalytic Domain ,Cryoelectron Microscopy ,Models ,Molecular ,Antibodies ,Arthritis ,Rheumatoid ,Hydrolases ,Protein-Arginine Deiminases - Abstract
Unlocking the potential of protein arginine deiminase 4 (PAD4) as a drug target for rheumatoid arthritis requires a deeper understanding of its regulation. In this study, we use unbiased antibody selections to identify functional antibodies capable of either activating or inhibiting PAD4 activity. Through cryogenic-electron microscopy, we characterized the structures of these antibodies in complex with PAD4 and revealed insights into their mechanisms of action. Rather than steric occlusion of the substrate-binding catalytic pocket, the antibodies modulate PAD4 activity through interactions with allosteric binding sites adjacent to the catalytic pocket. These binding events lead to either alteration of the active site conformation or the enzyme oligomeric state, resulting in modulation of PAD4 activity. Our study uses antibody engineering to reveal new mechanisms for enzyme regulation and highlights the potential of using PAD4 agonist and antagonist antibodies for studying PAD4-dependency in disease models and future therapeutic development.
- Published
- 2024
50. Diagnosing Parkinson's disease and monitoring its progression: Biomarkers from combined GC-TOF MS and LC-MS/MS untargeted metabolomics.
- Author
-
Dahabiyeh, Lina, Nimer, Refat, Wells, Jeremiah, Abu-Rish, Eman, and Fiehn, Oliver
- Subjects
Biomarker ,Cysteine-S-Sulfate ,Diagnosis ,Metabolomics ,Neurodegenerative ,Parkinson's disease ,Xanthines
Parkinson's disease (PD) is a prevalent neurodegenerative disorder with a poorly understood etiology. An accurate diagnosis of idiopathic PD remains challenging, as misdiagnosis is common in routine clinical practice. Moreover, current therapeutics focus on symptomatic management rather than curing or slowing disease progression. Therefore, identifying potential PD biomarkers and better understanding the underlying disease pathophysiology are urgent needs. Herein, hydrophilic interaction liquid chromatography-tandem mass spectrometry (LC-MS/MS) and gas chromatography-time-of-flight mass spectrometry (GC-TOF MS) based metabolomics approaches were used to profile the serum metabolome of 50 patients with different stages of idiopathic PD (early, mid and advanced) and 45 age-matched controls. Levels of 57 metabolites, including cysteine-S-sulfate and N-acetyl tryptophan, were significantly higher in patients with PD than in controls, while 51 additional metabolites, including vanillic acid and N-acetylaspartic acid, were lower. Xanthines, including caffeine and its downstream metabolites, were lower in patients with PD relative to controls, indicating a potential protective role of caffeine and its metabolites against neuronal damage. Seven metabolites, namely cysteine-S-sulfate, 1-methylxanthine, vanillic acid, N-acetylaspartic acid, 3-N-acetyl tryptophan, 5-methoxytryptophol, and 13-HODE, yielded a ROC curve with high classification accuracy (AUC 0.977). Comparison between different PD stages showed that cysteine-S-sulfate levels increased significantly with advancing PD stage, while LPI 20:4 decreased significantly with disease progression. Our findings provide new biomarker candidates to assist in the diagnosis of PD and monitor its progression. Unusual metabolites such as cysteine-S-sulfate might point to therapeutic targets that could guide the development of novel PD treatments, such as NMDA antagonists.
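An AUC such as the 0.977 reported for the seven-metabolite panel has a rank interpretation: it is the probability that a randomly chosen PD sample scores higher than a randomly chosen control, i.e. the normalised Mann-Whitney U statistic. A minimal sketch with made-up panel scores:

```python
def roc_auc(pos_scores, neg_scores):
    """AUC as the normalised Mann-Whitney U: the probability that a
    positive (PD) score exceeds a negative (control) score; ties count 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Made-up composite panel scores: higher = more PD-like.
auc = roc_auc([0.92, 0.81, 0.74, 0.55], [0.60, 0.41, 0.33])  # -> 11/12
```

Only one of the twelve positive/negative pairs is misordered here (0.55 vs. 0.60), giving AUC = 11/12 ≈ 0.917; perfect separation would give 1.0.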
- Published
- 2024