159 results for "data analysis methods"
Search Results
2. The use of electronic health records for predictive modelling and cohort analysis in diabetes
- Author
-
McGurnaghan, Stuart John, McKeigue, Paul, and Spiliopoulou, Athina
- Subjects
electronic health records, data analysis methods, anonymised health records, diabetes, dapagliflozin, e-health records - Abstract
The increasing use of electronic, rather than paper-based, healthcare records provides enormous potential for health research. However, research-enabling such electronic health (e-health) records presents considerable challenges across several inter-connected domains, such as database design and construction, the creation of verifiable research pipelines, and the development and application of appropriate statistical methods, through to solutions for governance and privacy issues. In this thesis, I describe the development of a platform and demonstrate its application in risk prediction, pharmacoepidemiology and disease prevalence. The work in this thesis describes: i) the construction of a research platform based on e-health records from the total population of Scotland with diabetes; and ii) the use of this platform to address several questions about the epidemiology and pharmacoepidemiology of diabetes complications relevant to the care of people with diabetes. The thesis consists of two major themes: i) the technical development of the platform, which I designed as a software developer to enable the research to be achieved; and ii) the application of appropriate study design and statistical methods, as an emerging epidemiologist, to answer specific research questions. Chapter 1 contains an overview of some of the challenges and approaches taken in the construction and use of research data platforms from e-health records. The chapter focuses on the informatics theme of the thesis. The introductory background to the specific research questions is given within the question-specific chapters. Chapter 2 describes the research platform that I designed and implemented. It summarises the types of data available in the platform and describes the cohort of those with diabetes in Scotland. These data and participants are then drawn on in the subsequent specific research studies. As such, it forms the equivalent of an overarching methods chapter for the remainder of the thesis. The manuscript has been submitted for publication and at the time of thesis submission was under review. In Chapter 3, I describe work I conducted and published using the research platform to provide a contemporary snapshot of the prevalence of cardiovascular disease (CVD) and cardiovascular risk factors in all people with type 2 diabetes in Scotland. Cardiovascular disease is the leading cause of death and loss of life expectancy in both type 2 and type 1 diabetes, but the past decades have seen advances in the understanding of disease pathogenesis and its prevention. However, I found that prevalence remains high (about one-third of those with type 2 diabetes had already been diagnosed with CVD) and that two-thirds had suboptimal control of at least two modifiable risk factors for CVD. This demonstrated the ongoing impact of CVD in diabetes and delineated areas of unmet need with respect to known risk factor control. In Chapter 4, I demonstrate the use of the research platform for real-world pharmacoepidemiology. Specifically, I describe a study I conducted and published to test the effect of a new diabetes drug (dapagliflozin) on cardiovascular risk factors in people with type 2 diabetes. At the time of the study, it was unknown whether this drug would achieve the same effects in the real world as it had in the much more idealised setting of short-duration clinical trials.
The analysis found that reductions in HbA1c, SBP and BMI were equivalent to those seen in the clinical trials and, importantly, that these effects were sustained over the median 210-day follow-up. Chapter 5 describes the quantification of current CVD incidence rates in people with type 1 diabetes in Scotland and the use of the research platform to construct a CVD risk prediction model for people with type 1 diabetes. I then used the Swedish national diabetes register to validate the generalisability of the model. There is considerable debate among clinicians about the appropriate age and circumstances under which cardiovascular risk-modifying therapies (in particular, statins) should be prescribed in type 1 diabetes. Clinical guidelines assume a very high absolute rate of CVD from an early age and advocate basing treatment-initiation decisions either on the expected absolute rate over the ensuing decade or, in some cases, on lifetime risk. Yet contemporaneous data on the risks actually experienced are lacking. I found the current absolute rates to be much lower than the guidelines implicitly assume. I showed that under current guidelines >90% of those aged 20-40 years and 100% of those >=40 years with type 1 diabetes were eligible for statins, but it was not until age 65 upwards that 100% had a modelled risk of CVD >=10% in 10 years, the threshold for statin use in the general population. The CVD prediction model I constructed was well calibrated and achieved high predictive performance in both Scotland and Sweden. The results should prove useful in facilitating individualised discussions regarding appropriate prescribing and the rationale for prescribing. In Chapter 6, I describe my use of the research platform to rapidly address a sudden major challenge in diabetes, namely the SARS-CoV-2 pandemic. At the outset of the pandemic in 2020, a high preponderance of diabetes among those being admitted to hospital with COVID-19 was being reported. Yet most studies lacked a population denominator, so it was unclear how large the risks associated with diabetes really were, and whether the clinical risks (of hospitalisation, admission to a critical care facility and death) were predictable among those with diabetes. It was also unclear whether all those with these risks should be shielding. I found that by the end of the first wave of the pandemic the risk of severe COVID-19 was elevated 2.4-fold in type 1 diabetes and 1.4-fold in type 2 diabetes (much less than was generally thought). The cross-validated predictive model of severe COVID-19 retained 11 factors in addition to age, sex, diabetes type and duration, and had good predictive performance (C-statistic of 0.85). The study results were reported to government and diabetes stakeholder groups early in the pandemic and were used to inform shielding policy. Chapter 7 presents an overall discussion of the work of the thesis, the lessons learned and future work, focusing on the research platform development theme. The key findings and advantages of the work I have described in generating the data platform are: the importance of separating the analysis from the data; the ability to accommodate various data types from different data sources (flexible data input); longitudinal database formatting; the importance of an accurate metadata dictionary; a verifiable research pipeline; compliance with governance; and the ability to generate synthetic datasets.
The work in this thesis provides evidence of the feasibility and usefulness of harnessing e-health records for important and timely research that impacts on people with diabetes. It provides insights and exemplars of use that should be helpful for others in the field trying to develop such platforms for diabetes or other disease areas. Future work includes the development of a pharmacoepidemiology pipeline, allowing rapid analysis of safety and effectiveness outcomes, given a wide variety of exposures, and the ability to incorporate genetic and 'omics data.
- Published
- 2022
- Full Text
- View/download PDF
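For readers unfamiliar with the discrimination metric quoted in the thesis abstract above (a cross-validated C-statistic of 0.85), the following is a minimal, hypothetical sketch of how such a figure is typically computed with scikit-learn. The features, synthetic data and model choice are illustrative placeholders, not the thesis's actual pipeline or variables.

# Minimal sketch: cross-validated C-statistic (ROC AUC) for a binary risk model.
# Feature names and data are invented placeholders, not the thesis's real variables.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 6))                    # e.g. age, diabetes duration, HbA1c, SBP, BMI, eGFR
logit = 0.8 * X[:, 0] + 0.5 * X[:, 2] - 0.4 * X[:, 5]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # synthetic binary outcome (e.g. severe event)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print(f"cross-validated C-statistic: {auc.mean():.3f} +/- {auc.std():.3f}")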
3. Plasticity of Neuronal Interactions
- Author
-
Kristan, William B., Jr., Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
4. Significance of Spike Train Correlations
- Author
-
Palm, Günther, Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
5. Higher-Order Correlations and Synfire Chains
- Author
-
Grün, Sonja, Diesmann, Markus, Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
6. Spatio-Temporal Spike Patterns
- Author
-
Abeles, Moshe, Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
7. Repeating Patterns in Spike Trains and Functional Groups of Neurons
- Author
-
Dayhoff, Judith E., Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
8. Application of the Gravitational Clustering Method
- Author
-
Lindsey, Bruce, Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
9. Dynamics of Neuronal Interactions
- Author
-
Aertsen, Ad, Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
10. Neuronal Spike Trains and Stochastic Point Processes
- Author
-
Rotter, Stefan, Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
11. Improvements in the Analysis of Neuronal Interactions
- Author
-
Baker, Stuart N., Destexhe, Alain, Series Editor, Brette, Romain, Series Editor, Aertsen, Ad, editor, Grün, Sonja, editor, Maldonado, Pedro E., editor, and Palm, Günther, editor
- Published
- 2023
- Full Text
- View/download PDF
12. Improving the quality of reporting findings using computer data analysis applications in educational research in context
- Author
-
Patrick Ngulube
- Subjects
Computer packages for data analysis, Computer applications for data analysis, Educational research, Data analysis methods, Doctoral research, Methodological transparency, Science (General), Q1-390, Social sciences (General), H1-99 - Abstract
Data analysis is an important step in the research process as it influences the quality and standard of reporting research findings. Based on a review of the content of 255 doctoral theses, the use of computer applications for data analysis in educational research was assessed. It was feasible to assess how extensively used and accepted computer packages had become in educational research using an aspect of the diffusion of innovations theory as part of the conceptual framework. The results showed that the use of computer applications to analyse data was more prevalent among researchers using quantitative and mixed-methods research methodologies than among qualitative educational researchers. Educational researchers have not yet fully adopted innovative computer data analysis techniques in their research. It is evident that they use traditional technologies more than computer applications in their research. Name-dropping of the computer applications used, without employing the language or visualisation features provided by the applications, was rife. This article bridges the gap between methodological scholarship and the use of computer applications in data analysis. It illuminates the potential of computer software to enhance the quality of the reporting of findings. The article aims to contribute to improvements in the standard of research reporting and the attributes of graduates. The practical methodological advice in this article is aimed at guiding researchers who consider using computer packages in data analysis, irrespective of their methodological orientation. It stimulates debate on the use of computer applications in data analysis.
- Published
- 2023
- Full Text
- View/download PDF
13. Qualitative Auswertungsmethoden in digitalen Lernumgebungen: Ein Blended-Learning-Konzept im Praxistest [Qualitative analysis methods in digital learning environments: a blended-learning concept put to a practical test].
- Author
-
Ülpenich, Bettina
- Subjects
EDUCATIONAL resources, DATA analysis, DIGITAL learning, DIGITAL technology, HIGHER education, LEARNING, QUALITATIVE research, BLENDED learning, CLASSROOM environment - Abstract
This article reflects on the development, implementation, and testing of a blended learning course on qualitative data analysis methods as used in the social sciences. The course was created and tested at Heinrich-Heine-University Düsseldorf in 2022. The e-learning offering is an online-based course whose modules can be integrated into face-to-face seminars but can also accompany students' independent learning. In addition to teaching qualitative data analysis methods, the course aims to provide students with first insights into the practice of qualitative data analysis. Summing up first experiences from the practical use of the e-learning offering, the article also discusses the challenges and opportunities of teaching qualitative data analysis in digital learning environments. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
14. Magnetic Resonance Imaging-Based 4D Flow: The Role of Artificial Intelligence
- Author
-
Peper, Eva S., Kozerke, Sebastian, van Ooij, Pim, Schoepf, U. Joseph, Series Editor, De Cecco, Carlo N., editor, van Assen, Marly, editor, and Leiner, Tim, editor
- Published
- 2022
- Full Text
- View/download PDF
15. Estimation of hydrodynamic properties of a sandy-loam soil by two analysis methods of single-ring infiltration data
- Author
-
Bagarello Vincenzo, Caltabellotta Gaetano, and Iovino Massimo
- Subjects
soil hydrodynamic properties, beerkan infiltration run, data analysis methods, best methodology, wu1 method, Hydraulic engineering, TC1-978 - Abstract
Beerkan infiltration runs could provide an incomplete description of infiltration with reference to either the near steady-state or the transient stages. In particular, the process could still be in the transient stage at the end of the run, or some transient infiltration data might be lost. The Wu1 method and the BEST-steady algorithm can be applied to derive soil hydrodynamic parameters even under these circumstances. Therefore, a soil dataset could be developed using two different data analysis methods. The hypothesis that the Wu1 method and BEST-steady yield similar predictions of the soil parameters when they are applied to the same infiltration curve was tested in this investigation. For a sandy-loam soil, BEST-steady yielded higher saturated soil hydraulic conductivity, Ks, microscopic pore radius, λm, and depth of the wetting front at the end of the run, dwf, and lower macroscopic capillary length, λc, as compared with the Wu1 method. Two corresponding means differed by 1.2–1.4 times, depending on the variable, and the differences appeared overall to be moderate to relatively appreciable, that is, neither too high nor negligible in any circumstance, according to some literature suggestions. Two estimates of Ks were similar (difference of < 25%) when the gravity-driven vertical flow and the lateral capillary components represented 71–89% of total infiltration. In conclusion, the two methods of data analysis do not generally yield the same predictions of soil hydrodynamic parameters when they are applied to the same infiltration curve. However, it seems possible to establish the conditions under which the two methods give similar results.
- Published
- 2022
- Full Text
- View/download PDF
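Both data analysis methods compared above start from the late-time, near-linear portion of the cumulative infiltration curve. The sketch below is a generic illustration of extracting the steady-state slope and intercept by linear regression; it is not the Wu1 or BEST-steady formulation itself, and all numbers are invented.

# Generic sketch: estimate the steady-state slope (i_s) and intercept (b_s) from the
# late-time portion of a cumulative infiltration curve I(t). Data are invented.
import numpy as np

t = np.array([30, 60, 120, 240, 480, 720, 960, 1200], dtype=float)   # time, s
I = np.array([0.4, 0.7, 1.1, 1.8, 3.0, 4.1, 5.2, 6.3])               # cumulative infiltration, cm

late = t >= 480                              # assume near-steady conditions in the last points
i_s, b_s = np.polyfit(t[late], I[late], 1)   # slope = steady-state rate, intercept
print(f"i_s = {i_s * 3600:.2f} cm/h, b_s = {b_s:.2f} cm")
# Wu1 and BEST-steady then convert (i_s, b_s) into Ks and capillary terms using
# method-specific coefficients; see the paper for the actual relations.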
16. Machine learning within the Parkinson's progression markers initiative: Review of the current state of affairs.
- Author
-
Gerraty, Raphael T., Provost, Allison, Lin Li, Wagner, Erin, Haas, Magali, and Lancashire, Lee
- Subjects
DISEASE progression, BIOMARKERS, MACHINE learning, GENE expression, PARKINSON'S disease, DATA analysis, ALGORITHMS, NEURORADIOLOGY - Abstract
The Parkinson's Progression Markers Initiative (PPMI) has collected more than a decade's worth of longitudinal and multi-modal data from patients, healthy controls, and at-risk individuals, including imaging, clinical, cognitive, and 'omics' biospecimens. Such a rich dataset presents unprecedented opportunities for biomarker discovery, patient subtyping, and prognostic prediction, but it also poses challenges that may require the development of novel methodological approaches to solve. In this review, we provide an overview of the application of machine learning methods to analyzing data from the PPMI cohort. We find that there is significant variability in the types of data, models, and validation procedures used across studies, and that much of what makes the PPMI data set unique (multi-modal and longitudinal observations) remains underutilized in most machine learning studies. We review each of these dimensions in detail and provide recommendations for future machine learning work using data from the PPMI cohort. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
17. A hybrid approach to event reconstruction for atmospheric Cherenkov Telescopes combining machine learning and likelihood fitting.
- Author
-
Schwefer, Georg, Parsons, Robert, and Hinton, Jim
- Subjects
PARTICLE acceleration, PROBABILITY density function, ZENITH distance, MACHINE learning, ASTROPHYSICS - Abstract
The imaging atmospheric Cherenkov technique provides potentially the highest angular resolution achievable in astronomy at energies above the X-ray waveband. High-resolution measurements provide the key to progress on many of the major questions in high energy astrophysics, including the sites and mechanisms of particle acceleration to PeV energies. The huge potential of the next-generation CTA observatory in this regard can be realised with the help of improved algorithms for the reconstruction of the air-shower direction and energy. Hybrid methods combining maximum-likelihood-fitting techniques with neural networks represent a particularly promising approach and have recently been successfully applied for the reconstruction of astrophysical neutrinos. Here, we present the FreePACT algorithm, a hybrid reconstruction method for IACTs. In this, making use of the neural ratio estimation technique from the field of likelihood-free inference, the analytical likelihood used in traditional image likelihood fitting is replaced by a neural network that approximates the charge probability density function for each pixel in the camera. The performance of this improved algorithm is demonstrated using simulations of the planned CTA southern array. For this setup, FreePACT provides significant performance improvements over analytical likelihood techniques, with improvements in angular and energy resolution of 25% or more over a wide energy range and an angular resolution as low as 40″ at energies above 50 TeV for observations at 20° zenith angle. It also yields more accurate estimations of the uncertainties on the reconstructed parameters and significantly speeds up the reconstruction compared to analytical likelihood techniques while showing the same stability with respect to changes in the observation conditions. Therefore, the FreePACT method is a promising upgrade over the current state-of-the-art likelihood event reconstruction techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
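The neural ratio estimation trick mentioned in the abstract above can be illustrated on a toy problem: a classifier trained to distinguish jointly drawn (observation, parameter) pairs from pairs with shuffled parameters approximates the likelihood-to-evidence ratio. The sketch below is a self-contained toy example and is not the FreePACT implementation.

# Toy neural ratio estimation: a classifier separating joint samples (x, theta) from
# marginal samples (x, theta_shuffled); its odds approximate p(x|theta)/p(x).
# Synthetic 1-D example, not the FreePACT code.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n = 20000
theta = rng.uniform(-2, 2, size=n)                 # "true" parameter
x = theta + rng.normal(scale=0.5, size=n)          # observation ~ N(theta, 0.5^2)

joint = np.column_stack([x, theta])                       # label 1: matched pairs
marginal = np.column_stack([x, rng.permutation(theta)])   # label 0: broken pairs
X = np.vstack([joint, marginal])
y = np.r_[np.ones(n), np.zeros(n)]

clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, y)

def log_ratio(x_obs, theta_grid):
    """Approximate log p(x|theta) - log p(x) on a grid of theta values."""
    pairs = np.column_stack([np.full_like(theta_grid, x_obs), theta_grid])
    d = clf.predict_proba(pairs)[:, 1]
    return np.log(d / (1 - d))

grid = np.linspace(-2, 2, 81)
print("theta maximising the learned ratio for x=1.0:", grid[np.argmax(log_ratio(1.0, grid))])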
18. Machine learning within the Parkinson’s progression markers initiative: Review of the current state of affairs
- Author
-
Raphael T. Gerraty, Allison Provost, Lin Li, Erin Wagner, Magali Haas, and Lee Lancashire
- Subjects
machine learning, Parkinson’s Disease, multi-omic analyses, PD progression, data analysis methods, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
The Parkinson’s Progression Markers Initiative (PPMI) has collected more than a decade’s worth of longitudinal and multi-modal data from patients, healthy controls, and at-risk individuals, including imaging, clinical, cognitive, and ‘omics’ biospecimens. Such a rich dataset presents unprecedented opportunities for biomarker discovery, patient subtyping, and prognostic prediction, but it also poses challenges that may require the development of novel methodological approaches to solve. In this review, we provide an overview of the application of machine learning methods to analyzing data from the PPMI cohort. We find that there is significant variability in the types of data, models, and validation procedures used across studies, and that much of what makes the PPMI data set unique (multi-modal and longitudinal observations) remains underutilized in most machine learning studies. We review each of these dimensions in detail and provide recommendations for future machine learning work using data from the PPMI cohort.
- Published
- 2023
- Full Text
- View/download PDF
19. Spirality: A Novel Way to Measure Spiral Arm Pitch Angle.
- Author
-
Shields, Deanna, Boe, Benjamin, Pfountz, Casey, Davis, Benjamin L., Hartley, Matthew, Miller, Ryan, Slade, Zac, Abdeen, M. Shameer, Kennefick, Daniel, and Kennefick, Julia
- Subjects
FAST Fourier transforms ,GALACTIC evolution ,SHORT-term memory - Abstract
We present the MATLAB code Spirality, a novel method for measuring spiral arm pitch angles by fitting galaxy images to spiral templates of known pitch. Computation time is typically on the order of 2 min per galaxy, assuming 8 GB of working memory. We tested the code using 117 synthetic spiral images with known pitches, varying both the spiral properties and the input parameters. The code yielded correct results for all synthetic spirals with galaxy-like properties. We also compared the code's results to two-dimensional Fast Fourier Transform (2DFFT) measurements for the sample of nearby galaxies defined by DMS PPak. Spirality's error bars overlapped 2DFFT's error bars for 26 of the 30 galaxies. The two methods' agreement correlates strongly with galaxy radius in pixels and also with i-band magnitude, but not with redshift, a result that is consistent with at least some galaxies' spiral structure being fully formed by z = 1.2 , beyond which there are few galaxies in our sample. The Spirality code package also includes GenSpiral, which produces FITS images of synthetic spirals, and SpiralArmCount, which uses a one-dimensional Fast Fourier Transform to count the spiral arms of a galaxy after its pitch is determined. All code is freely available. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
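As background to the template-fitting idea above: for a logarithmic spiral r = r0·exp(theta·tan(phi)), the pitch angle phi is the slope of ln r against theta. The sketch below recovers it from synthetic traced arm points by a straight-line fit; Spirality itself works on whole galaxy images via template matching, so this is only an illustration of the underlying geometry.

# Recover the pitch angle of a logarithmic spiral r = r0 * exp(theta * tan(phi))
# from (theta, r) points by a straight-line fit in (theta, ln r) space.
# Synthetic example; not the Spirality template-fitting algorithm.
import numpy as np

phi_true = np.deg2rad(20.0)                     # pitch angle of the synthetic arm
theta = np.linspace(0, 3 * np.pi, 200)
r = 1.5 * np.exp(theta * np.tan(phi_true))
r *= 1 + 0.02 * np.random.default_rng(2).normal(size=r.size)   # small multiplicative noise

slope, _ = np.polyfit(theta, np.log(r), 1)      # fitted slope = tan(phi)
phi_est = np.rad2deg(np.arctan(slope))
print(f"recovered pitch angle: {phi_est:.2f} deg (true 20.00 deg)")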
20. SELECTED METHODS OF PROJECT AND DATA ANALYSIS.
- Author
-
GEMBALSKA-KWIECIEŃ, Anna
- Subjects
DATA analysis, DECISION making, ACQUISITION of data - Abstract
Purpose: Presentation of selected methods of project and data analysis – the paper describes how data on ongoing projects should be collected so that it can be used later by applying the appropriate methodology. Design/methodology/approach: Literature research on the subject was carried out. Findings: Having a methodology for data collection leads, in the long term, to the implementation of a system that allows the use of this methodology. The system should provide information to help make better decisions, reducing or eliminating the risk of project failure. Practical implications: Development of a methodology of data collection and analysis on this basis. Originality/value: The implemented projects are comparable with each other and, on this basis, it can be argued that identifying the risks that have occurred in the past, during the implementation of various stages of projects, can contribute to more effective risk management during the implementation of current and future projects. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
21. Real-time Recovery Efficiencies and Performance of the Palomar Transient Factory's Transient Discovery Pipeline
- Author
-
DeRose, J. [Stanford Univ., CA (United States)]
- Published
- 2017
- Full Text
- View/download PDF
22. Implications of z ~ 6 Quasar Proximity Zones for the Epoch of Reionization and Quasar Lifetimes
- Author
-
Mazzucchelli, Chiara [Max Planck Inst. for Astronomy, Heidelberg (Germany); Heidelberg Univ. (Germany)]
- Published
- 2017
- Full Text
- View/download PDF
23. A Search for Kilonovae in the Dark Energy Survey
- Author
-
Wester, W.
- Published
- 2017
- Full Text
- View/download PDF
24. Multi-dataset electron density analysis methods for X-ray crystallography
- Author
-
Pearce, Nicholas M., Kelm, Sebastian, Deane, Charlotte, Shi, Jiye, and von Delft, Frank
- Subjects
548, Data Analysis Methods, X-ray Crystallography, PanDDA - Abstract
X-ray crystallography is extensively deployed to determine the structure of proteins, both unbound and bound to different molecules. Crystallography has the power to visually reveal the binding of small molecules, assisting in their development in structure-based lead design. Currently, however, the methods used to detect binding, and the subjectivity of inexperienced modellers, are a weak-point in the field. Existing methods for ligand identification are fundamentally flawed when identifying partially-occupied states in crystallographic datasets; the ambiguity of conventional electron density maps, which present a superposition of multiple states, prevents robust ligand identification. In this thesis, I present novel methods to clearly identify bound ligands and other changed states in the case where multiple crystallographic datasets are available, such as in crystallographic fragment screening experiments. By applying statistical methods to signal identification, more crystallographic binders are detected than by state-of-the-art conventional approaches. Standard modelling practice is further challenged regarding the modelling of multiple chemical states in crystallography. The pervading modelling approach is to model only the bound state of the protein; I show that modelling an ensemble of bound and unbound states leads to better models. I conclude with a discussion of possible future applications of multi-datasets methods in X-ray crystallography, including the robust identification of conformational heterogeneity in protein structures.
- Published
- 2016
25. An Impact of Empirical Data Analysis in the World of Business Environment.
- Author
-
Swetha, Merla, E., Naresh, and Parakh, Santosh
- Subjects
DATA analysis, PERSONNEL management, DATA science, DECISION making, BUSINESS enterprises
- Published
- 2022
- Full Text
- View/download PDF
26. Fermi LAT Stacking Analysis of Swift Localized GRBs
- Author
-
Wood, K.
- Published
- 2016
- Full Text
- View/download PDF
27. Signs of magnetic acceleration and multizone emission in GRB 080825C
- Author
-
Axelsson, Magnus [KTH Royal Inst. of Technology, Stockholm (Sweden). Oskar Klein Center for CosmoParticle Physics; Tokyo Metropolitan Univ. (Japan). Dept. of Physics]
- Published
- 2016
- Full Text
- View/download PDF
28. Review of The Palgrave Handbook of Applied Linguistics Research Methodology, by Aek, Phakiti; Peter De Costa; Luke, Plonsky; & Sue, Starfield
- Author
-
Kioumars Razavipour
- Subjects
Applied linguistics, Research methodology, Data collection methods, Data analysis methods, Special aspects of education, LC8-6691, Language acquisition, P118-118.7 - Published
- 2020
- Full Text
- View/download PDF
29. CRITICAL REVIEW OF THE ENVIRONMENTAL INVESTIGATION ON SOIL HEAVY METAL CONTAMINATION.
- Author
-
ERSOY, A.
- Subjects
MULTIVARIATE analysis, ANALYSIS of heavy metals, SOIL pollution, ENVIRONMENTAL engineering, AGRICULTURAL processing, SOIL sampling - Abstract
Soil contamination by heavy metals has become a severe environmental issue in the world due to the rapid development of urbanisation, industrial, mining, agricultural and natural processes, and chemical compounds. Reliable, high-quality results quantify the adverse effects of these factors. A precise and cost-efficient study depends on adequate background research, a well-planned sampling design and strategy, quality data, and appropriate selection and implementation of analytical techniques and investigation. The investigation methods for heavy metal soil or land contamination drive decision making and remediation, which is very expensive. Therefore, this study offers a comprehensive and comparative review on data organisation and treatment; guidelines and legislation for heavy metals; and data analysis and investigation methods. The primary objectives of the review are to discuss the various stages involved in the investigation of heavy metals/land for site engineers and environmental scientists. Data analysis methods include exploring contamination indices, statistical and multivariate statistical analysis methods, interpolation techniques, geostatistical estimation, simulation, and combined methods. Strengths, weaknesses and the application scopes of these methods and the resulting models used are critical for success in environmental modelling. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
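One of the contamination indices commonly explored in reviews of this kind is Müller's geoaccumulation index, Igeo = log2(Cn / (1.5·Bn)), where Cn is the measured concentration and Bn the geochemical background. The small sketch below uses invented concentrations purely for illustration.

# Geoaccumulation index Igeo = log2(Cn / (1.5 * Bn)) for a few heavy metals.
# Concentrations and background values below are invented illustration values (mg/kg).
import numpy as np

samples = {"Pb": 85.0, "Cd": 1.2, "Zn": 310.0}       # measured concentrations Cn
background = {"Pb": 20.0, "Cd": 0.3, "Zn": 95.0}     # background values Bn

for metal, cn in samples.items():
    igeo = np.log2(cn / (1.5 * background[metal]))
    print(f"{metal}: Igeo = {igeo:.2f}")
# Conventional interpretation: Igeo <= 0 unpolluted, 0-1 unpolluted to moderately
# polluted, and so on up the Müller class scale.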
30. Spirality: A Novel Way to Measure Spiral Arm Pitch Angle
- Author
-
Deanna Shields, Benjamin Boe, Casey Pfountz, Benjamin L. Davis, Matthew Hartley, Ryan Miller, Zac Slade, M. Shameer Abdeen, Daniel Kennefick, and Julia Kennefick
- Subjects
data analysis methods, image analysis methods, galaxies, spiral galaxies, spiral arms, galaxy structure, Astronomy, QB1-991 - Abstract
We present the MATLAB code Spirality, a novel method for measuring spiral arm pitch angles by fitting galaxy images to spiral templates of known pitch. Computation time is typically on the order of 2 min per galaxy, assuming 8 GB of working memory. We tested the code using 117 synthetic spiral images with known pitches, varying both the spiral properties and the input parameters. The code yielded correct results for all synthetic spirals with galaxy-like properties. We also compared the code’s results to two-dimensional Fast Fourier Transform (2DFFT) measurements for the sample of nearby galaxies defined by DMS PPak. Spirality’s error bars overlapped 2DFFT’s error bars for 26 of the 30 galaxies. The two methods’ agreement correlates strongly with galaxy radius in pixels and also with i-band magnitude, but not with redshift, a result that is consistent with at least some galaxies’ spiral structure being fully formed by z=1.2, beyond which there are few galaxies in our sample. The Spirality code package also includes GenSpiral, which produces FITS images of synthetic spirals, and SpiralArmCount, which uses a one-dimensional Fast Fourier Transform to count the spiral arms of a galaxy after its pitch is determined. All code is freely available.
- Published
- 2022
- Full Text
- View/download PDF
31. The dynamics and energetics of radio-loud active galaxies
- Author
-
Harwood, Jeremy James
- Subjects
523.1, Radio galaxies, X-ray jets, active galaxies, spectral ageing, acceleration of particles, non-thermal radiation mechanisms, AGN, data analysis methods, radio continuum - Abstract
In this thesis, I use the new generation of radio interferometers along with X-ray observations to investigate the dynamics and energetics of radio-loud active galaxies, which are key to understanding AGN feedback and the evolution of galaxies as a whole. I present new JVLA observations of powerful radio galaxies and use innovative techniques to undertake a detailed analysis of these observations. I compare two of the most widely used models of spectral ageing, the Kardashev-Pacholczyk and Jaffe-Perola models, and also results of the more complex, but potentially more realistic, Tribble model. I find that the Tribble model provides both a good fit to observations and a physically realistic description of the source. I present the first high-resolution spectral maps of the sources and find that the best-fitting injection indices across all models take higher values than has previously been assumed. I present characteristic hot spot advance speeds and compare them to those derived from dynamical ages, confirming that the previously known discrepancy in speed remains present in older radio sources even when ages are determined at high spectral and spatial resolutions. I show that some previously common assumptions made in determining spectral ages with narrow-band radio telescopes may not always hold. I present results from a study of the powerful radio galaxy 3C223 at low frequencies with LOFAR to determine its spectrum on spatially small scales and tightly constrain the injection index, which I find to be consistent with the high values found at GHz frequencies. Applying this new knowledge of the low energy electron population, I perform synchrotron / inverse-Compton model fitting and find that the total energy content of the radio galaxy lobes increases by a factor greater than 2 compared to previous studies. Using this result to provide revised estimates of the internal pressure, I find the northern lobe to be in pressure balance with the external medium and the southern lobe to be overpressured. I go on to present the first large sample investigation of the properties of jets in Fanaroff and Riley type I radio galaxies (FR-I) at X-ray energies based on data from the Chandra archive. I explore relations between the properties of the jets and the properties of the host galaxies in which they reside. I find previously unknown correlations to exist, relating photon index, volume emissivity, jet volume and luminosity, and find that the previously held assumption of a relationship between luminosities at radio and X-ray wavelengths is linear in nature when bona fide FR-I radio galaxies are considered. In addition, I attempt to constrain properties which may play a key role in determination of the diffuse emission process. I test a simple model in which large-scale magnetic field variations are primarily responsible for determining jet properties; however, I find that this model is inconsistent with my best estimates of the relative magnetic field strengths in my sample.
- Published
- 2014
32. Selection and processing of calibration samples to measure the particle identification performance of the LHCb experiment in Run 2
- Author
-
Roel Aaij, Lucio Anderlini, Sean Benson, Marco Cattaneo, Philippe Charpentier, Marco Clemencic, Antonio Falabella, Fabio Ferrari, Marianna Fontana, Vladimir Vava Gligorov, Donal Hill, Tibaud Humair, Christopher Robert Jones, Oliver Lupton, Sneha Malde, Carla Marin Benito, Rosen Matev, Alex Pearce, Anton Poluektov, Barbara Sciascia, Federico Stagni, Ricardo Vazquez Gomez, and Yanxi Zhang
- Subjects
Experimental methods and data analysis methods, Data acquisition, Data analysis methods, Physics, QC1-999, Optics. Light, QC350-467, Descriptive and experimental mechanics, QC120-168.85 - Abstract
Abstract Since 2015, with the restart of the LHC for its second run of data taking, the LHCb experiment has been empowered with a dedicated computing model to select and analyse calibration samples to measure the performance of the particle identification (PID) detectors and algorithms. The novel technique was developed within the framework of the innovative trigger model of the LHCb experiment, which relies on online event reconstruction for most of the datasets, reserving offline reconstruction to special physics cases. The strategy to select and process the calibration samples, which includes a dedicated data-processing scheme combining online and offline reconstruction, is discussed. The use of the calibration samples to measure the detector PID performance, and the efficiency of PID requirements across a large range of decay channels, is described. Applications of the calibration samples in data-quality monitoring and validation procedures are also detailed.
- Published
- 2019
- Full Text
- View/download PDF
33. The Fermi Large Area Telescope on Orbit: Event Classification, Instrument Response Functions, and Calibration
- Author
-
Zimmer, S.
- Published
- 2012
- Full Text
- View/download PDF
34. Image Phenotyping of Spring Barley (Hordeum vulgare L.) RIL Population Under Drought: Selection of Traits and Biological Interpretation
- Author
-
Krzysztof Mikołajczak, Piotr Ogrodowicz, Hanna Ćwiek-Kupczyńska, Kathleen Weigelt-Fischer, Srinivasa Reddy Mothukuri, Astrid Junker, Thomas Altmann, Karolina Krystkowiak, Tadeusz Adamski, Maria Surma, Anetta Kuczyńska, and Paweł Krajewski
- Subjects
automated high-throughput plant phenotyping, barley, data analysis methods, drought stress, dynamic traits, Plant culture, SB1-1110 - Abstract
Image-based phenotyping is a non-invasive method that permits the dynamic evaluation of plant features during growth, which is especially important for understanding plant adaptation and the temporal dynamics of responses to environmental cues such as water deficit or drought. The aim of the present study was to use high-throughput imaging in order to assess the variation and dynamics of growth and development during drought in a spring barley population and to investigate associations between traits measured over time and yield-related traits measured after harvesting. The plant material comprised a recombinant inbred line population derived from a cross between European and Syrian cultivars. After placing the plants on the platform (28th day after sowing), drought stress was applied for 2 weeks. Top and side cameras were used to capture images daily that covered the visible range of the light spectrum, fluorescence signals, and the near infrared spectrum. The image processing provided 376 traits that were subjected to analysis. After 32 days of image phenotyping, the plants were cultivated in the greenhouse under optimal watering conditions until ripening, when several architecture and yield-related traits were measured. The applied data analysis approach, based on the clustering of image-derived traits into groups according to time profiles of statistical and genetic parameters, permitted the selection of traits representative for inference from the experiment. In particular, drought effects for 27 traits related to convex hull geometry, texture, proportion of brown pixels and chlorophyll intensity were found to be highly correlated with drought effects for spike traits and thousand grain weight.
- Published
- 2020
- Full Text
- View/download PDF
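The clustering of image-derived traits by their time profiles, described in the abstract above, can be sketched schematically as follows: standardise each trait's profile over imaging days, then cluster the profiles hierarchically. The trait matrix below is random placeholder data, not the barley measurements, and the clustering settings are illustrative.

# Schematic: cluster image-derived traits by the shape of their time profiles.
# Rows = traits, columns = imaging days; values are random placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(3)
n_traits, n_days = 376, 32
profiles = rng.normal(size=(n_traits, n_days)).cumsum(axis=1)   # fake daily trait values

z = zscore(profiles, axis=1)               # compare profile shapes, not absolute scales
Z = linkage(z, method="ward")              # hierarchical clustering of traits
groups = fcluster(Z, t=10, criterion="maxclust")
print("traits per cluster:", np.bincount(groups)[1:])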
35. What are the drivers of business demography and employment in the countries of the European Union?
- Author
-
Abdesselam, Rafik, Bonnet, Jean, and Renou-Maissant, Patricia
- Subjects
DEMOGRAPHY, EMPLOYMENT, INSTITUTIONAL environment, GOVERNMENT policy, DISCRIMINANT analysis - Abstract
The aim of this contribution is to establish a typology of European entrepreneurship countries with respect to variables related to entrepreneurial activity and economic development. Using a combination of multidimensional data analyses allows us to extend the concept of 'entrepreneurial regimes' and leads to the distinction of five such entrepreneurial regimes. Moreover, in order to better characterize these classes, a wide set of illustrative variables representative of national economic development, labour market functioning, and formal and informal institutional environments, as well as variables specific to the entrepreneurial population, are considered. Finally, discriminant analyses show that the five explanatory themes considered (Innovation, Employment, Formal Institutions, Entrepreneurship and Governance) differentiate the classes, and significantly explain the diversity of entrepreneurial regimes. These findings have important implications for the implementation of public policy, in order to promote entrepreneurial activity and reduce unemployment. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
36. Image Phenotyping of Spring Barley (Hordeum vulgare L.) RIL Population Under Drought: Selection of Traits and Biological Interpretation.
- Author
-
Mikołajczak, Krzysztof, Ogrodowicz, Piotr, Ćwiek-Kupczyńska, Hanna, Weigelt-Fischer, Kathleen, Mothukuri, Srinivasa Reddy, Junker, Astrid, Altmann, Thomas, Krystkowiak, Karolina, Adamski, Tadeusz, Surma, Maria, Kuczyńska, Anetta, and Krajewski, Paweł
- Subjects
BARLEY, DROUGHTS, PLANT adaptation, CULTIVATED plants, CONVEX geometry, VISIBLE spectra - Abstract
Image-based phenotyping is a non-invasive method that permits the dynamic evaluation of plant features during growth, which is especially important for understanding plant adaptation and the temporal dynamics of responses to environmental cues such as water deficit or drought. The aim of the present study was to use high-throughput imaging in order to assess the variation and dynamics of growth and development during drought in a spring barley population and to investigate associations between traits measured over time and yield-related traits measured after harvesting. The plant material comprised a recombinant inbred line population derived from a cross between European and Syrian cultivars. After placing the plants on the platform (28th day after sowing), drought stress was applied for 2 weeks. Top and side cameras were used to capture images daily that covered the visible range of the light spectrum, fluorescence signals, and the near infrared spectrum. The image processing provided 376 traits that were subjected to analysis. After 32 days of image phenotyping, the plants were cultivated in the greenhouse under optimal watering conditions until ripening, when several architecture and yield-related traits were measured. The applied data analysis approach, based on the clustering of image-derived traits into groups according to time profiles of statistical and genetic parameters, permitted the selection of traits representative for inference from the experiment. In particular, drought effects for 27 traits related to convex hull geometry, texture, proportion of brown pixels and chlorophyll intensity were found to be highly correlated with drought effects for spike traits and thousand grain weight. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
37. Analysis of 4C-seq data: A comparison of methods.
- Author
-
Zisis, Dimitrios, Krajewski, Paweł, Stam, Maike, Weber, Blaise, and Hövel, Iris
- Abstract
The circular chromosome conformation capture technique followed by sequencing (4C-seq) has been used in a number of studies to investigate chromosomal interactions between DNA fragments. Computational pipelines have been developed and published that offer various possibilities of 4C-seq data processing and statistical analysis. Here, we present an overview of four of such pipelines (fourSig, FourCSeq, 4C-ker and w4Cseq) taking into account the most important stages of computations. We provide comparisons of the methods and discuss their advantages and possible weaknesses. We illustrate the results with the use of data obtained for two different species, in a study devoted to vernalization control in Arabidopsis thaliana by the FLOWERING LOCUS C (FLC) gene and to long-range chromatin interactions in mouse embryonic stem cells. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
38. A Dynamic, Modular Intelligent-Agent Framework for Astronomical Light Curve Analysis and Classification
- Author
-
McWhirter, Paul R., Wright, Sean, Steele, Iain A., Al-Jumeily, Dhiya, Hussain, Abir, Fergus, Paul, Hutchison, David, Series editor, Kanade, Takeo, Series editor, Kittler, Josef, Series editor, Kleinberg, Jon M., Series editor, Mattern, Friedemann, Series editor, Mitchell, John C., Series editor, Naor, Moni, Series editor, Pandu Rangan, C., Series editor, Steffen, Bernhard, Series editor, Terzopoulos, Demetri, Series editor, Tygar, Doug, Series editor, Weikum, Gerhard, Series editor, Huang, De-Shuang, editor, Bevilacqua, Vitoantonio, editor, and Premaratne, Prashan, editor
- Published
- 2016
- Full Text
- View/download PDF
39. What are the drivers of eco-innovation? Empirical evidence from French start-ups.
- Author
-
Abdesselam, Rafik, Kedjar, Malia, and Renou-Maissant, Patricia
- Subjects
DISCRIMINANT analysis, NEW business enterprises, ENVIRONMENTAL education, SURVEYS, DATA analysis - Abstract
The purpose of this paper is to identify the drivers of eco-innovation in start-ups. Firstly, a discriminant analysis (DA) is applied to study what is distinctive about eco-innovative start-ups as compared to non-eco-innovative start-ups. Secondly, a typology of eco-innovative start-ups is developed using a hierarchical ascendant clustering (HAC). Analyses are carried out using original data from a survey of 120 eco-innovative and non-eco-innovative French start-ups. Discriminant analyses reveal that the founders of eco-innovative start-ups are differentiated by characteristics related to their environmental education and professional experience. Furthermore, eco-innovative start-ups are distinguished from the non-eco-innovative start-ups by voluntary environmental practices, such as the adoption of corporate social responsibility policies. Finally, we show that there is a diversity of profiles of eco-innovators. In fact, firms cluster into five main profiles and exhibit different eco-innovation drivers. We highlight that the different types of eco-innovators do not face the same difficulties in accessing funds. These findings have important implications for the implementation of public policy designed to promote eco-innovative activity, and they highlight the need to design policies that take into account the distinctive character of each profile. • The first objective of this study is to identify the factors differentiating eco-innovative start-ups from non-eco-innovative startups. • Three main factors are identified and discussed in light of previous literature. • The second objective of this paper is to identify the different profiles of eco-innovative start-ups. • Based on the hierarchical ascendant clustering analysis we identify five profiles that exhibit different eco-innovation drivers. • We discuss how public policy may adapt their support strategy according to the different profiles. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
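The pairing of a discriminant analysis (DA) with a hierarchical ascendant clustering (HAC), used in this and the related typology studies above, can be sketched generically with scikit-learn. The variables and data below are random placeholders, not the authors' survey data.

# Generic sketch: linear discriminant analysis to see what separates two groups,
# then agglomerative (hierarchical ascendant) clustering to profile one group.
# Data below are random placeholders standing in for coded survey variables.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import AgglomerativeClustering
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 8))                      # 120 start-ups, 8 coded variables
y = rng.integers(0, 2, size=120)                   # 1 = eco-innovative, 0 = not

Xs = StandardScaler().fit_transform(X)
lda = LinearDiscriminantAnalysis().fit(Xs, y)
print("variables ranked by |discriminant coefficient|:",
      np.argsort(-np.abs(lda.coef_[0])))

hac = AgglomerativeClustering(n_clusters=5, linkage="ward").fit(Xs[y == 1])
print("eco-innovator profile sizes:", np.bincount(hac.labels_))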
40. Table 2 Fallacy in Descriptive Epidemiology: Bringing Machine Learning to the Table
- Author
-
Christoffer Dharma, Rui Fu, and Michael Chaiton
- Subjects
machine learning, descriptive analysis, data analysis methods, alcohol use, sexual minority youth
There is a lack of rigorous methodological development for descriptive epidemiology, where the goal is to describe and identify the most important associations with an outcome given a large set of potential predictors. This has often led to the Table 2 fallacy, where one presents the coefficient estimates for all covariates from a single multivariable regression model, which are often uninterpretable in a descriptive analysis. We argue that machine learning (ML) is a potential solution to this problem. We illustrate the power of ML with an example analysis identifying the most important predictors of alcohol abuse among sexual minority youth. The framework we propose for this analysis is as follows: (1) Identify a few ML methods for the analysis, (2) optimize the parameters using the whole data with a nested cross-validation approach, (3) rank the variables using variable importance scores, (4) present partial dependence plots (PDP) to illustrate the association between the important variables and the outcome, (5) and identify the strength of the interaction terms using the PDPs. We discuss the potential strengths and weaknesses of using ML methods for descriptive analysis and future directions for research. R codes to reproduce these analyses are provided, which we invite other researchers to use.
- Published
- 2023
- Full Text
- View/download PDF
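A compressed sketch of the framework outlined in the abstract above (tune a learner with cross-validation, rank predictors by importance, then inspect partial dependence). The authors provide their own R code; this Python version on synthetic data is purely illustrative and not their analysis.

# Illustrative ML-based descriptive analysis: tune a random forest with CV,
# rank predictors by permutation importance, then plot partial dependence.
# Synthetic data; not the paper's R code or the survey variables it uses.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance, PartialDependenceDisplay
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = make_classification(n_samples=2000, n_features=12, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

search = GridSearchCV(RandomForestClassifier(random_state=0),
                      {"max_depth": [3, 6, None], "n_estimators": [200, 500]},
                      cv=5, scoring="roc_auc").fit(X_tr, y_tr)

imp = permutation_importance(search.best_estimator_, X_te, y_te,
                             n_repeats=20, random_state=0)
ranking = np.argsort(-imp.importances_mean)
print("predictors ranked by permutation importance:", ranking)

# Partial dependence plots for the two most important predictors (needs matplotlib)
PartialDependenceDisplay.from_estimator(search.best_estimator_, X_te, ranking[:2])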
41. Research Methodology
- Author
-
Zhao, Xianbo, Hwang, Bon-Gang, Low, Sui Pheng, Zhao, Xianbo, Hwang, Bon-Gang, and Low, Sui Pheng
- Published
- 2015
- Full Text
- View/download PDF
42. Methods for Characterising Microphysical Processes in Plasmas
- Author
-
Dudok de Wit, T., Alexandrova, O., Furno, I., Sorriso-Valvo, L., Zimbardo, G., Balogh, André, editor, Bykov, Andrei, editor, Cargill, Peter, editor, Dendy, Richard, editor, Dudok de Wit, Thierry, editor, and Raymond, John, editor
- Published
- 2014
- Full Text
- View/download PDF
43. A new laboratory test method for tire-pavement noise.
- Author
-
Ren, Wanyan, Han, Sen, Fwa, Tien Fang, Zhang, Jiahao, and He, Zhihao
- Subjects
TEST methods, SOUND pressure, TESTING laboratories, TIRES, NOISE, TRAFFIC cameras, AUTOMOBILE tires - Abstract
• A new laboratory test method for tire-pavement noise was introduced. • The tire-pavement contact time was identified by a camera and a weighing sensor. • Sound pressure levels and 1/3 octave band sound spectra were compared. • Five data analysis methods were attempted and compared with OBSI test results. • Suitable data analysis methods were recommended based on the comparison results. Few previous research results have been reported from laboratory tire-pavement noise test methods, especially on small laboratory specimens. This research aims to introduce and explore the feasibility of a new test method used with laboratory specimens. The proposed method measured the tire-pavement noise when a rolling tire from a sloping track hit a horizontal slab specimen of a given pavement mixture. A high-speed camera and a weighing sensor were utilized to identify the tire-pavement contact time and distance. The contact time was found to be 30.72 ms. The exact start time of contact was determined by analyzing the recorded sound signal. Altogether, five data analysis methods were employed. The analysis results were evaluated using Sound Pressure Level (SPL) and the 1/3 octave band spectrum. They were compared with the corresponding reference pavements from past research, and with on-site results. Finally, two methods were recommended. They may potentially characterize laboratory tire-pavement noise. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
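The comparisons above rest on standard acoustics quantities. As a small illustration, the overall sound pressure level of a sampled pressure signal is SPL = 20·log10(p_rms / p_ref) with p_ref = 20 µPa; the signal below is synthetic, not the test-rig data.

# Overall sound pressure level of a sampled acoustic pressure signal.
# SPL = 20*log10(p_rms / 20e-6 Pa). The signal below is synthetic.
import numpy as np

fs = 51200                                    # sampling rate, Hz
t = np.arange(0, 0.03072, 1 / fs)             # ~30.72 ms window, matching the reported contact time
p = 0.5 * np.sin(2 * np.pi * 1000 * t) + 0.1 * np.random.default_rng(5).normal(size=t.size)

p_rms = np.sqrt(np.mean(p ** 2))
spl = 20 * np.log10(p_rms / 20e-6)
print(f"overall SPL over the contact window: {spl:.1f} dB")
# A 1/3-octave spectrum would additionally band-pass filter p around the
# standard centre frequencies before computing each band's level.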
44. Assessing the influence of boundary conditions, driving behavior and data analysis methods on real driving CO2 and NOx emissions.
- Author
-
Varella, Roberto A., Faria, Marta V., Mendoza-Villafuerte, Pablo, Baptista, Patrícia C., Sousa, Luis, and Duarte, Gonçalo O.
- Abstract
The need to verify vehicle emissions in real-world operation led to the implementation of Real Driving Emissions (RDE) test procedures, effective since September 2017 for new Euro 6 cars following Commission Regulation (EU) 2016/427, which defines the RDE test conditions and data analysis methods to allow representative results. Main factors addressed by the regulation include the share of driving operation, ambient temperature range, altitude and elevation difference. However, RDE is still debatable, since not only the boundary conditions but also the evaluation methods and trip selection are being discussed, together with a carbon dioxide (CO2) regulation, which is planned to be implemented in the short term. Thus, this work focuses on analyzing the effect of different data measurement and analysis methods (i.e. cold-operation, road grade, trip selection and driving style) on CO2 and nitrogen oxides (NOx) emissions, based on 13 RDE tests performed in the Lisbon Metropolitan Area, Portugal. The tests were conducted by 2 drivers using 5 vehicles. Each driver performed 2 trips per vehicle, one in normal driving and the other in aggressive driving. A Portable Emissions Measurement System (PEMS) was used to collect 1 Hz data, which was compared and analyzed using the European Commission (EC) proposed method for RDE tests. Results show the effects of each parameter, such as the average difference between drivers (7% in CO2 and 55% in NOx emissions) and between aggressive and normal driving. For road grade, big oscillations happen during the slope profile, which impacts emissions during all trips. Considering cold-operation, CO2 and NOx emissions are, on average, ~25% and 55% higher, respectively, than in hot-operation. These results highlight the need for deeper studies on these factors to assure that RDE tests evolve to a more established certification procedure than laboratory certifications. Highlights: • Evaluate current European RDE data analysis methods • Assess boundary conditions and driving behavior influence on RDE tests • Quantify the impacts of these factors in terms of CO2 and NOx emissions • Driving behavior leads to variations of up to 7% in CO2 and 55% in NOx emissions • Cold-operation increases CO2 and NOx emissions by 25% and 55% compared to hot-operation [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
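As a highly simplified illustration of turning 1 Hz PEMS traces into distance-specific emission factors (g/km or mg/km), one can integrate the mass-flow signals and divide by the distance travelled. This is not the moving-averaging-window evaluation prescribed by Regulation (EU) 2016/427, and all numbers below are invented.

# Simplified distance-specific emissions from 1 Hz PEMS signals (invented data).
# This is NOT the regulatory moving-averaging-window method of (EU) 2016/427.
import numpy as np

rng = np.random.default_rng(6)
n = 3600                                       # one hour of 1 Hz samples
speed_kmh = np.clip(rng.normal(45, 20, n), 0, 130)
co2_gps = 2.0 + 0.04 * speed_kmh + rng.normal(0, 0.2, n)     # CO2 mass flow, g/s
nox_mgps = 5.0 + 0.5 * speed_kmh + rng.normal(0, 1.0, n)     # NOx mass flow, mg/s

distance_km = np.sum(speed_kmh / 3600.0)       # km covered (each sample lasts 1 s)
print(f"CO2: {np.sum(co2_gps) / distance_km:.0f} g/km")
print(f"NOx: {np.sum(nox_mgps) / distance_km:.0f} mg/km")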
45. Entrepreneurship, economic development, and institutional environment: evidence from OECD countries.
- Author
-
Abdesselam, Rafik, Bonnet, Jean, Renou-Maissant, Patricia, and Aubry, Mathilde
- Subjects
ENTREPRENEURSHIP, ECONOMIC development, ECONOMIC activity, FINANCIAL crises - Abstract
The purpose of this article is to establish a typology of entrepreneurship for OECD countries over the 1999-2012 period. Our aim is to draw a distinction between managerial and entrepreneurial economies, to identify groups of countries with similar economic and entrepreneurial activity variables, and to determine the economic and institutional drivers of entrepreneurial activities in each group. We show that the level of development, sectoral specialization, and institutional variables related to entrepreneurship, functioning of the labor market, and openness of the country are decisive to understand differences in entrepreneurship activity across countries. Results show that the pre-crisis period, from 1999 to 2008, is a period of growth favorable to entrepreneurship. The financial crisis involved a break in entrepreneurial dynamism, with agricultural economies withstanding the financial crisis better. The 2010-2012 period of recovery is a period of a sharp slowdown in entrepreneurial activity, during which the countries that are less dependent on the financial sector proved to be the most resilient in terms of entrepreneurial activity. Nevertheless, it is the advanced knowledge economies with developed financial markets, fewer institutional regulatory constraints, and greater scope for qualitative entrepreneurship that show lower unemployment rates. These findings have important implications for the implementation of public policy in order to promote entrepreneurial activity and reduce unemployment. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
46. Analysis and Model of Cortical Slow Waves Acquired with Optical Techniques
- Author
-
Marco Celotto, Chiara De Luca, Paolo Muratore, Francesco Resta, Anna Letizia Allegra Mascaro, Francesco Saverio Pavone, Giulia De Bonis, and Pier Stanislao Paolucci
- Subjects
slow wave activity, gcamp6f, wide-field microscopy, spatio-temporal dynamics, in vivo imaging, data analysis methods, toy-model simulation, Biology (General), QH301-705.5 - Abstract
Slow waves (SWs) are spatio-temporal patterns of cortical activity that occur both during natural sleep and under anesthesia and are preserved across species. Even though electrophysiological recordings have been widely used to characterize brain states, they are limited in spatial resolution and cannot target specific neuronal populations. Recently, large-scale optical imaging techniques coupled with functional indicators have overcome these restrictions, and new analysis pipelines and novel approaches to SW modelling are needed to extract relevant features of the spatio-temporal dynamics of SWs from these highly spatially resolved datasets. Here we combined wide-field fluorescence microscopy and a transgenic mouse model expressing a calcium indicator (GCaMP6f) in excitatory neurons to study SW propagation over the meso-scale under ketamine anesthesia. We developed a versatile analysis pipeline to identify and quantify the spatio-temporal propagation of the SWs. Moreover, we designed a computational simulator based on a simple theoretical model, which takes into account the statistics of neuronal activity, the response of fluorescence proteins and the slow-wave dynamics. The simulator was capable of synthesizing artificial signals that could reliably reproduce several features of the SWs observed in vivo, thus providing a calibration tool for the analysis pipeline. Comparison of experimental and simulated data shows the robustness of the analysis tools and their potential to uncover mechanistic insights into Slow Wave Activity (SWA).
- Published
- 2020
- Full Text
- View/download PDF
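The analysis pipeline summarized in the abstract above identifies and quantifies the spatio-temporal propagation of slow waves from wide-field calcium imaging. As a minimal sketch of the wavefront-timing idea only (not the authors' published pipeline), the snippet below estimates a per-pixel onset-time map from a hypothetical dF/F movie by robust threshold crossing; the array `movie`, the sampling rate `fs`, and all parameter values are assumptions for illustration.
```python
# Minimal sketch (not the authors' published pipeline): estimate per-pixel
# slow-wave onset times from a wide-field dF/F movie by robust threshold crossing.
# `movie` is a hypothetical array of shape (n_frames, ny, nx) holding dF/F values;
# `fs` is the acquisition rate in Hz.
import numpy as np

def onset_map(movie, fs, n_sigma=3.0):
    """Return an (ny, nx) map of first threshold-crossing times in seconds (NaN if none)."""
    n_frames, ny, nx = movie.shape
    traces = movie.reshape(n_frames, -1)                  # (n_frames, n_pixels)
    base = traces[: max(1, n_frames // 5)]                # first 20% of frames as baseline
    med = np.median(base, axis=0)
    mad = np.median(np.abs(base - med), axis=0) * 1.4826  # robust std estimate
    thr = med + n_sigma * np.maximum(mad, 1e-6)           # per-pixel threshold
    above = traces > thr                                  # (n_frames, n_pixels) boolean
    first = np.argmax(above, axis=0).astype(float)        # index of first crossing per pixel
    first[~above.any(axis=0)] = np.nan                    # pixels that never cross
    return (first / fs).reshape(ny, nx)

# Usage on synthetic data: a wave sweeping from left to right, imaged at 25 Hz.
fs = 25.0
t = np.arange(100) / fs
movie = np.zeros((100, 20, 30))
for x in range(30):
    movie[:, :, x] = (t[:, None] > 1.0 + 0.05 * x).astype(float)  # later onset per column
onsets = onset_map(movie, fs)
print(onsets[10, 0], onsets[10, 29])   # onset at the right edge comes ~1.4 s later
```
A spatial gradient of the resulting onset map would then give local estimates of propagation direction and speed, which is the kind of spatio-temporal feature such pipelines quantify.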
47. Multivariate Data Analysis Methods for the Interpretation of Microbial Flow Cytometric Data
- Author
-
Davey, Hazel M., Davey, Christopher L., Müller, Susann, editor, and Bley, Thomas, editor
- Published
- 2011
- Full Text
- View/download PDF
48. First hydrogen operation of NIO1: characterization of the source plasma by means of an optical emission spectroscopy diagnostic
- Author
-
Marco Cavenago, Ursel Fantz, Luca Vialetto, B. Zaniol, M. Barbisan, Roberto Pasqualotto, D. Wünderlich, C. Baltador, and Gianluigi Serianni
- Subjects
Characteristic signature ,Materials science ,Hydrogen ,Plasma parameters ,Collisional radiative model ,FOS: Physical sciences ,Plasma measurement ,Data analysis methods ,Light emission ,Negative ions ,Plasma diagnostics ,Ion ,Optics ,Spectroscopic data ,Physics::Plasma Physics ,Electric discharges ,Ion sources ,Optical emission spectroscopy ,Plasma simulation ,Spectrometers ,Hydrogen discharge ,Low-resolution spectrometer ,Molecular emissions ,Instrumentation ,Plasma ,Physics - Plasma Physics ,Plasma Physics (physics.plasm-ph) ,Radio frequency ,Atomic physics - Abstract
NIO1 is a compact and flexible radiofrequency H- ion source developed by Consorzio RFX and INFN-LNL. The aim of the experimentation on NIO1 is to optimize both the production of negative ions and their extraction and beam optics. In the initial phase of its commissioning, NIO1 was operated with nitrogen, but the source is now also regularly operated with hydrogen. To evaluate the source performance, an optical emission spectroscopy diagnostic was installed. The system includes a low-resolution spectrometer covering the 300-850 nm spectral range and a high-resolution (50 pm) one, to study the atomic and the molecular emissions in the visible range, respectively. The spectroscopic data were also interpreted by means of a collisional-radiative model developed at IPP Garching. Besides the diagnostic hardware and the data analysis methods, the paper presents the first plasma measurements across a transition to the full H mode in a hydrogen discharge. The characteristic signatures of this transition in the plasma parameters are described, in particular the sudden increase in the light emitted from the plasma above a certain power threshold. (A schematic line-intensity sketch follows this record.)
- Published
- 2022
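The diagnostic described above couples a low-resolution survey spectrometer with a high-resolution one and interprets line intensities with a collisional-radiative model. The snippet below is only a schematic illustration of the first step of such an analysis (background-subtracted integration of two Balmer line intensities from a survey spectrum); the arrays `wl` and `counts`, the window widths, and the synthetic spectrum are assumptions, not the NIO1 software or the IPP Garching model.
```python
# Schematic illustration only (not the NIO1 diagnostic software): integrate the
# Balmer-alpha (656.3 nm) and Balmer-beta (486.1 nm) line intensities from a
# low-resolution survey spectrum after subtracting a locally interpolated background.
# `wl` (nm) and `counts` are hypothetical arrays as a spectrometer might deliver them.
import numpy as np

def line_intensity(wl, counts, center_nm, half_width_nm=2.0, bg_width_nm=2.0):
    """Background-subtracted, integrated intensity of one emission line (counts * nm)."""
    dist = np.abs(wl - center_nm)
    in_line = dist <= half_width_nm
    in_bg = (dist > half_width_nm) & (dist <= half_width_nm + bg_width_nm)
    background = np.interp(wl[in_line], wl[in_bg], counts[in_bg])  # background under the line
    step = wl[1] - wl[0]                                           # assumes a uniform grid
    return np.sum(counts[in_line] - background) * step

# Usage on a synthetic spectrum: two Gaussian lines on a flat background.
wl = np.linspace(300.0, 850.0, 5000)
counts = (100.0
          + 4000.0 * np.exp(-0.5 * ((wl - 656.3) / 0.5) ** 2)
          + 1000.0 * np.exp(-0.5 * ((wl - 486.1) / 0.5) ** 2))
h_alpha = line_intensity(wl, counts, 656.3)
h_beta = line_intensity(wl, counts, 486.1)
print(f"H-alpha / H-beta intensity ratio: {h_alpha / h_beta:.2f}")  # ~4 for this toy spectrum
```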
49. Computation and conversion of brain pH values obtained with two algorithms of phosphorus magnetic resonance spectroscopy data analysis.
- Author
-
Cichocka, Monika
- Subjects
NUCLEAR magnetic resonance spectroscopy ,PHOSPHORUS ,ALGORITHMS ,MATHEMATICAL formulas ,PHOSPHOCREATINE - Abstract
The intracellular brain pH in phosphorus magnetic resonance spectroscopy is calculated from the chemical shift between inorganic phosphate and phosphocreatine using the Henderson-Hasselbalch equation. Researchers use various mathematical formulas with different parameters and, as a consequence, obtain different results for the same input data. The aim of this article was therefore to determine mathematical formulas that allow the pH values obtained by the most popular analysis methods to be converted into one another. To determine the relationships between the pH results and the applied mathematical formula, pH values were calculated with two algorithms over a range of theoretical chemical shift values, and the results were compared using appropriate t-tests. Mathematical formulas were then designed to simplify the conversion of pH values obtained by the two data analysis methods into one another. The pH values obtained in this way did not differ significantly from the pH values calculated directly from the given formula. The derived formulas make it possible to convert pH values without knowing the chemical shift between inorganic phosphate and phosphocreatine, based only on the final pH values obtained with one of the formulas. [ABSTRACT FROM AUTHOR] (An illustrative sketch of the underlying conversion follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
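The conversion problem described in the abstract above arises because different calibrations of the modified Henderson-Hasselbalch equation, pH = pKa + log10((d - d_min)/(d_max - d)), yield different pH values for the same Pi-PCr chemical shift d. The sketch below illustrates the underlying relationship by inverting one calibration and re-applying the other; the two parameter sets are illustrative assumptions, and this is not the article's closed-form conversion formula.
```python
# Illustrative sketch only: parameter values are assumptions, and this is not the
# article's closed-form conversion formula; it reproduces the underlying idea of
# mapping a pH value from one Henderson-Hasselbalch calibration to another.
import math

def ph_from_shift(delta_ppm, pka, delta_min, delta_max):
    """pH from the Pi-PCr chemical shift: pH = pKa + log10((d - d_min) / (d_max - d))."""
    return pka + math.log10((delta_ppm - delta_min) / (delta_max - delta_ppm))

def shift_from_ph(ph, pka, delta_min, delta_max):
    """Invert the equation to recover the chemical shift implied by a pH value."""
    r = 10.0 ** (ph - pka)
    return (delta_min + r * delta_max) / (1.0 + r)

def convert_ph(ph_a, params_a, params_b):
    """Convert a pH obtained with calibration A into the value calibration B would give."""
    return ph_from_shift(shift_from_ph(ph_a, *params_a), *params_b)

# Two hypothetical calibrations, given as (pKa, delta_min, delta_max) in ppm.
A = (6.77, 3.29, 5.68)
B = (6.75, 3.27, 5.69)
ph_a = ph_from_shift(4.75, *A)                 # pH from a measured shift of 4.75 ppm
print(round(ph_a, 3), round(convert_ph(ph_a, A, B), 3))
```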
50. Sensitivity of Characterizing the Heat Loss Coefficient through On-Board Monitoring: A Case Study Analysis
- Author
-
Marieline Senave, Staf Roels, Stijn Verbeke, Evi Lambie, and Dirk Saelens
- Subjects
characterization ,physical parameter identification ,heat loss coefficient ,on-board monitoring data ,data analysis methods ,sensitivity ,uncertainty ,case study analysis ,Technology - Abstract
Recently, there has been an increasing interest in the development of an approach to characterize the as-built heat loss coefficient (HLC) of buildings based on a combination of on-board monitoring (OBM) and data-driven modeling. OBM is hereby defined as the monitoring of the energy consumption and interior climate of in-use buildings via non-intrusive sensors. The main challenge faced by researchers is the identification of the required input data and the appropriate data analysis techniques to assess the HLC of specific building types, with a certain degree of accuracy and/or within a budget constraint. A wide range of characterization techniques can be imagined, from simplified steady-state models applied to smart energy meter data to advanced dynamic analysis models identified on full OBM data sets that are further enriched with geometric information, survey results, or on-site inspections. This paper evaluates the extent to which these techniques result in different HLC estimates. To this end, it performs a sensitivity analysis of the characterization outcome for a case study dwelling. Thirty-five unique input data packages are defined using a tree structure. Subsequently, four different data analysis methods are applied to these sets: the steady-state average method, linear regression, the energy signature method, and the dynamic AutoRegressive with eXogenous input (ARX) model. In addition to the sensitivity analysis, the paper compares the HLC values determined via OBM characterization to the theoretically calculated value and explores the factors contributing to the observed discrepancies. The results demonstrate that deviations of up to 26.9% can occur in the characterized as-built HLC, depending on the amount of monitoring data and prior information used to establish the interior temperature of the dwelling. The approach used to represent the internal and solar heat gains also proves to have a significant influence on the HLC estimate. The impact of the selected input data is larger than that of the applied data analysis method. (A minimal sketch of the two simplest estimators follows this record.)
- Published
- 2019
- Full Text
- View/download PDF
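Of the four characterization techniques compared above, the two simplest can be stated compactly: the steady-state average method estimates the HLC as the sum of heat inputs divided by the sum of indoor-outdoor temperature differences, while the regression variant takes the slope of heat input against that temperature difference. The snippet below is a minimal sketch of these two estimators on synthetic daily averages; the variable names, the constant indoor setpoint, and the noise model are assumptions, not the study's data or code.
```python
# Minimal sketch (assumed variable names, not the study's implementation) of two of
# the simpler characterization techniques, applied to daily averages: the steady-state
# average method and a linear regression of heat input against indoor-outdoor delta-T.
import numpy as np

def hlc_average(q_heat_w, q_gains_w, t_in_c, t_out_c):
    """Average method: HLC [W/K] = sum of heat inputs / sum of temperature differences."""
    return np.sum(q_heat_w + q_gains_w) / np.sum(t_in_c - t_out_c)

def hlc_regression(q_heat_w, q_gains_w, t_in_c, t_out_c):
    """Regress total heat input on delta-T; the slope is an HLC estimate [W/K]."""
    slope, intercept = np.polyfit(t_in_c - t_out_c, q_heat_w + q_gains_w, deg=1)
    return slope

# Synthetic winter data for a dwelling with a "true" HLC of 150 W/K.
rng = np.random.default_rng(0)
t_out = rng.normal(5.0, 4.0, size=60)         # daily mean outdoor temperature [degC]
t_in = np.full(60, 20.0)                      # assumed constant indoor setpoint [degC]
gains = rng.normal(500.0, 100.0, size=60)     # internal + solar gains [W]
q_heat = 150.0 * (t_in - t_out) - gains + rng.normal(0.0, 200.0, size=60)
print(round(hlc_average(q_heat, gains, t_in, t_out), 1),
      round(hlc_regression(q_heat, gains, t_in, t_out), 1))   # both close to 150
```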