289 results for "Graphical method"
Search Results
2. Novel Graphical Method for Data Presentation in Alcohol Systematic Reviews: The Interactive Harvest Plot.
- Author
- Foulds, James, Knight, Josh, Young, Jesse T, Keen, Claire, and Newton-Howes, Giles
- Subjects
- *STATISTICS, *GRAPHIC arts, *PERSONALITY disorders, *PSYCHOLOGY information storage & retrieval systems, *PUBLICATION bias, *META-analysis, *INFORMATION storage & retrieval systems, *MEDICAL databases, *MEDICAL information storage & retrieval systems, *CONFIDENCE intervals, *SYSTEMATIC reviews, *ACQUISITION of data, *TREATMENT effectiveness, *ALCOHOL drinking, *DESCRIPTIVE statistics, *DATA analysis, *RESEARCH bias, *MEDLINE
- Abstract
Aims To demonstrate a novel method for presenting and exploring data in systematic reviews of the alcohol literature. Methods Harvest plots are a graphical method for displaying data on the overall pattern of evidence from a systematic review. They can display the direction of effects and risk of bias within studies for multiple outcomes in a single graphical chart. Using data from our previous meta-analysis on the association between personality disorder and alcohol treatment outcome, we extended the application of harvest plots by developing an interactive online harvest plot application. Results Studies included in the review were heterogeneous in design. There were many different primary outcomes, and similar outcomes were often defined differently across studies. The interactive harvest plot allows readers to explore trends in the data across multiple outcomes, including the impact of within-study bias and year of publication. In contrast, meta-analysis on the same data was hampered by a lack of consistency in the way outcomes were measured, and incomplete reporting of effect sizes and their variance. This meant many studies included in the systematic review could not be meta-analysed. Conclusions Interactive harvest plots are a novel graphical method to present data from systematic reviews. They can supplement or even replace meta-analysis when the studies included in a systematic review use heterogeneous designs and measures, as is often the case in the alcohol literature. [ABSTRACT FROM AUTHOR]
- Published
- 2022
3. Bias in analytical chemistry: A review of selected procedures for incorporating uncorrected bias into the expanded uncertainty of analytical measurements and a graphical method for evaluating the concordance of reference and test procedures
- Author
- Robert Frenkel, Tony Badrick, and Ian Farrance
- Subjects
- Data Analysis, Computer science, Test procedures, Clinical Laboratory Techniques, Concordance, Biochemistry (medical), Clinical Biochemistry, Analytical chemistry, Process (computing), Probability density function, General Medicine, Reference Standards, Biochemistry, Expression (mathematics), Metrology, Bias, Measurement uncertainty, Humans, SIMPLE algorithm
- Abstract
The Evaluation of measurement data - Guide to the Expression of Uncertainty in Measurement (GUM) provides the framework for evaluating measurement uncertainty. The preferred GUM approach for addressing bias assumes that all systematic errors are identified and corrected at an early stage in the measurement process. We review some procedures for treating uncorrected bias and its inclusion into an overall uncertainty statement. When bias and its uncertainty are recognised as metrological states independent of scatter in the test results, the uncertainty of the reference and uncertainty of the bias can be equated. The net standard uncertainty of a test result is the root-sum-square of the standard uncertainty of the bias and the standard uncertainty of measurements on the test. Since an incomplete and therefore potentially erroneous formula is often used for estimating bias standard uncertainty, we propose an alternative calculation. We next propose a graphical method using a simple algorithm that quantifies the discrepancy between the results of a test measurement and the corresponding reference value, in terms of the percentage overlap of two probability density functions. We propose that bias should be corrected wherever possible and we illustrate this approach using the graphical method. Even though this review is focused principally on analytical chemistry and medical laboratory applications, much of the discussion is applicable to all areas of metrology.
- Published
- 2019
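A minimal numerical sketch of the two ideas in the abstract above: the root-sum-square net standard uncertainty, and concordance expressed as the percentage overlap of two probability density functions. All values are hypothetical illustrations, not numbers from the paper.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical standard uncertainties (same measurement units); illustrative only.
u_bias = 0.8   # standard uncertainty of the uncorrected bias
u_test = 1.1   # standard uncertainty of replicate measurements on the test

# Net standard uncertainty as the root-sum-square described in the abstract.
u_net = np.sqrt(u_bias**2 + u_test**2)

# Concordance as the percentage overlap of two normal PDFs, one centred on a
# hypothetical reference value and one on the test result.
ref, test = 50.0, 51.5
x = np.linspace(44.0, 58.0, 4001)
dx = x[1] - x[0]
overlap = np.sum(np.minimum(norm.pdf(x, ref, u_bias),
                            norm.pdf(x, test, u_test))) * dx
print(f"u_net = {u_net:.3f}; PDF overlap = {100 * overlap:.1f}%")
```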
4. On the Maximum Likelihood Estimators’ Uniqueness and Existence for Two Unitary Distributions: Analytically and Graphically, with Application
- Author
- Gadir Alomair, Yunus Akdoğan, Hassan S. Bakouch, and Tenzile Erbayram
- Subjects
- unitary distribution, graphical method, Cauchy–Schwarz inequality, simulation, data analysis, Mathematics, QA1-939
- Abstract
Unit distributions, exhibiting inherent symmetrical properties, have been extensively studied across various fields. A significant challenge in these studies, particularly evident in parameter estimation, is the existence and uniqueness of estimators; often it is difficult to demonstrate that a unique estimator exists. The major issue with maximum likelihood and other iterative estimator-finding methods is that they need an initial value to reach the solution. This dependency on initial values can lead to local extremes that fail to represent the global extremes, highlighting a lack of symmetry in solution robustness. This study applies a very simple estimation method for the unit Weibull and unit Burr XII distributions in which both attain the global maximum value. The findings from the obtained propositions demonstrate that the maximum likelihood and graphical methods are symmetrically similar. In addition, three real-world data applications show that the method works efficiently.
- Published
- 2024
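The graphical check the abstract describes can be imitated by scanning the log-likelihood over a parameter grid and confirming a single global maximum. The sketch below assumes one common parameterization of the unit Weibull density (the paper's exact form may differ) and uses simulated data.

```python
import numpy as np

rng = np.random.default_rng(1)

def uw_logpdf(x, alpha, beta):
    # Unit Weibull log-density under one common parameterization (an assumption
    # here): f(x) = (a*b/x) * (-ln x)^(b-1) * exp(-a * (-ln x)^b), 0 < x < 1.
    t = -np.log(x)
    return (np.log(alpha) + np.log(beta) - np.log(x)
            + (beta - 1) * np.log(t) - alpha * t**beta)

# Simulate by inverse transform: F(x) = exp(-alpha * (-ln x)^beta).
u = rng.uniform(size=500)
alpha_true, beta_true = 2.0, 1.5
x = np.exp(-(-np.log(u) / alpha_true) ** (1 / beta_true))

# Graphical uniqueness check: scan the log-likelihood over an (alpha, beta) grid
# and verify that a single global maximum appears.
alphas = np.linspace(0.5, 4.0, 120)
betas = np.linspace(0.5, 3.0, 120)
ll = np.array([[uw_logpdf(x, a, b).sum() for b in betas] for a in alphas])
i, j = np.unravel_index(ll.argmax(), ll.shape)
print(f"grid maximum at alpha = {alphas[i]:.2f}, beta = {betas[j]:.2f} (true: 2.0, 1.5)")
```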
5. A graphical method for assessing risk factor threshold values using the generalized additive model: the multi-ethnic study of atherosclerosis.
- Author
- Setodji, Claude, Scheuner, Maren, Pankow, James, Blumenthal, Roger, Chen, Haiying, and Keeler, Emmett
- Subjects
- *CORONARY disease, *CORONARY heart disease risk factors, *RISK assessment, *AGE factors in disease, *STATISTICAL correlation, *DECISION making, *EPIDEMIOLOGY, *RESEARCH methodology, *REGRESSION analysis, *RESEARCH funding, *STATISTICS, *LOGISTIC regression analysis, *DATA analysis, *INTER-observer reliability, *GENETICS
- Abstract
Continuous-variable dichotomization is a popular technique for estimating the effect of risk factors on health outcomes in multivariate regression settings. Researchers follow this practice to simplify data analysis, which it unquestionably does. However, the thresholds used to dichotomize those variables are usually ad hoc, based on expert opinion or on mean, median, or quantile splits, and can bias and underestimate the estimated effect of the risk factors on specific outcomes. In this paper, we suggest a semi-parametric method and visualization to improve threshold selection in variable dichotomization, while accounting for mixture distributions in the outcome of interest and adjusting for covariates. For clinicians, these empirically based thresholds of risk factors, if they exist, could be informative as the highest or lowest point of a risk factor beyond which no additional impact on the outcome should be expected. [ABSTRACT FROM AUTHOR]
- Published
- 2012
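The authors use a generalized additive model; as a lightweight stand-in with the same visual logic, a LOWESS smooth of a binary outcome against a continuous risk factor shows where the risk curve flattens, suggesting an empirically based threshold. The data and the flattening point below are synthetic assumptions, not the study's results.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Synthetic data: event risk rises with the factor up to ~60, then plateaus.
factor = rng.uniform(30, 90, 1500)
p = 0.05 + 0.25 / (1 + np.exp(-(factor - 60) / 4))
outcome = rng.binomial(1, p)

# A LOWESS smooth of the binary outcome gives a nonparametric risk curve;
# the point where the curve stops rising suggests a data-driven threshold.
sm = lowess(outcome, factor, frac=0.4)
plt.plot(sm[:, 0], sm[:, 1])
plt.xlabel("risk factor")
plt.ylabel("smoothed event probability")
plt.savefig("risk_curve.png")
```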
6. Graphical method predicts geopressures worldwide
- Author
- Eaton, B.
- Published
- 1976
7. A Graphical Method for Determining the Ideality of a Sedimenting Boundary
- Author
- Thomas M. Laue and David B. Hayes
- Subjects
- Analytical Ultracentrifugation, Radial position, business.industry, Computer science, Data analysis, Boundary (topology), The Renaissance, Boundary shape, business, Process engineering, Topology, Automation
- Abstract
Analytical ultracentrifugation can provide useful thermodynamic and hydrodynamic information for a wide variety of chemical systems. It is apparent that through the use of modern electronics and computers, analytical ultracentrifugation is undergoing a renaissance. A modernization of ultracentrifugation is underway. The automation of acquisition, reduction and analysis of data has been quite successful. However, what has been more difficult to automate is the art of determining how much information is accessible from an experiment. This is a problem that becomes apparent to anyone starting a project that involves analytical sedimentation.
- Published
- 1994
8. A graphical method to assess distribution assumption in group-based trajectory models.
- Author
- Elsensohn, Mad-Hélénie, Klich, Amna, Ecochard, René, Bastard, Mathieu, Genolini, Christophe, Etard, Jean-François, and Gustin, Marie-Paule
- Subjects
- HOMOSCEDASTICITY, CD4 antigen, HIV infections, LYMPHOCYTES, ANTIRETROVIRAL agents, GRAPH theory, LONGITUDINAL method, STATISTICS, DATA analysis, CD4 lymphocyte count
- Abstract
Group-based trajectory models have undergone rapid development in the analysis of longitudinal data in clinical research. In these models, the residuals are frequently assumed to be homoscedastic, but this assumption is not always met. We developed an easy-to-perform graphical method for assessing the homoscedasticity of the residuals, designed especially for group-based trajectory models. The method is based on drawing an envelope to visualize the local dispersion of the residuals around each typical trajectory. Its efficiency is demonstrated using data on CD4 lymphocyte counts in patients with human immunodeficiency virus put on antiretroviral therapy. Four distinct distributions that take into account increasing parts of the variability of the observed data are presented. Significant differences in group structures and trajectory patterns were found according to the chosen distribution. These differences might have large impacts on the final trajectories and their characteristics, and thus on potential medical decisions. With a single glance, the graphical criteria allow the choice of the distribution that best captures data variability and help deal with a potential heteroscedasticity problem. [ABSTRACT FROM AUTHOR]
- Published
- 2016
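A sketch of the envelope idea on synthetic data for a single trajectory group: draw running residual quantiles around the fitted trajectory, so that a widening band reveals heteroscedasticity at a glance. The quadratic trajectory and the quantile levels are illustrative assumptions, not the paper's specification.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)

# Synthetic single-group trajectory: quadratic mean with heteroscedastic noise.
t = np.tile(np.arange(0, 24), 100)          # 100 subjects, 24 time points each
mean = 300 + 15 * t - 0.3 * t**2
resid_sd = 20 + 2 * t                        # dispersion grows with time
y = mean + rng.normal(0, resid_sd)

# Envelope: per-time-point residual quantiles drawn around the fitted trajectory.
resid = y - mean
times = np.unique(t)
lo = np.array([np.quantile(resid[t == k], 0.05) for k in times])
hi = np.array([np.quantile(resid[t == k], 0.95) for k in times])
fit = 300 + 15 * times - 0.3 * times**2
plt.fill_between(times, fit + lo, fit + hi, alpha=0.3)   # widening band = heteroscedasticity
plt.plot(times, fit)
plt.savefig("envelope.png")
```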
9. A Complementary Graphical Method for Reducing and Analyzing Large Data Sets.
- Author
- Jing, X. and Cimino, J. J.
- Subjects
- DATA visualization, DATA modeling, INFORMATION filtering, TERMS & phrases, BIG data
- Abstract
The article reflects on several case studies demonstrating a method for setting and selecting thresholds to limit graph size while retaining important information. The study employed data visualization, data manipulation, and data computation, and discussed data models for different types of thresholds. It concludes that the filtering method reduces large graphs to a manageable size and that the graphical method for large data sets provides a summary based on computation over a hierarchical terminology.
- Published
- 2014
10. Mapping wind power density for Zimbabwe: a suitable Weibull-parameter calculation method.
- Author
- Hove, Tawanda, Madiye, Luxmore, and Musademba, Downmore
- Subjects
- *WIND power, *WEIBULL distribution, *PROBABILITY theory, *ESTIMATION theory, *DATA analysis
- Abstract
The two-parameter Weibull probability distribution function is versatile for modelling wind speed frequency distribution and for estimating the energy delivery potential of wind energy systems if its shape and scale parameters, k and c, are correctly determined from wind records. In this study, different methods for determining Weibull k and c from wind speed measurements are reviewed and applied at four sample meteorological stations in Zimbabwe. The appropriateness of each method in modelling the wind data is appraised by its accuracy in predicting the power density, using relative deviation and normalised root mean square error. Of the methods considered, the graphical method imitated the wind data most closely, followed by the standard deviation method. The Rayleigh distribution (k=2) is also generated and compared with the wind speed data. The Weibull parameters were calculated by the graphical method for fourteen stations at which hourly wind speed data were available. These values were then used, with the assistance of appropriate boundary layer models, to map wind power density at 50 m hub height for Zimbabwe. [ABSTRACT FROM AUTHOR]
- Published
- 2014
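The graphical method referred to here linearizes the Weibull CDF, ln(-ln(1 - F(v))) = k ln v - k ln c, and fits a straight line. A sketch on simulated wind speeds (all values illustrative, not Zimbabwean data):

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(3)
v = rng.weibull(2.0, 2000) * 6.0      # synthetic wind speeds: k=2, c=6 m/s

# Graphical method: sort speeds, assign plotting positions, and regress
# ln(-ln(1-F)) on ln(v); the slope is k and the intercept is -k*ln(c).
v_sorted = np.sort(v)
F = (np.arange(1, len(v) + 1) - 0.5) / len(v)
X = np.log(v_sorted)
Y = np.log(-np.log(1.0 - F))
k, intercept = np.polyfit(X, Y, 1)
c = np.exp(-intercept / k)

# Mean wind power density (W/m^2) implied by the fitted Weibull parameters.
rho = 1.225                            # air density, kg/m^3
P = 0.5 * rho * c**3 * gamma(1 + 3.0 / k)
print(f"k = {k:.2f}, c = {c:.2f} m/s, power density = {P:.0f} W/m^2")
```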
12. Experimental Investigation on the Elicitation of Subjective Distributions.
- Author
- Barrera-Causil, Carlos J., Correa, Juan Carlos, and Marmolejo-Ramos, Fernando
- Subjects
- ELICITATION technique, ESTIMATION theory, DATA analysis, CLUSTER analysis (Statistics), DISTRIBUTION (Probability theory)
- Abstract
Elicitation methods aim to build participants' distributions about a parameter of interest. In most elicitation studies this parameter is rarely known in advance, which hinders an objective comparison between elicitation methods. In two experiments, participants were first presented with a fixed random sequence of images and numbers, and subsequently their subjective distributions of the percentage of one of those numbers were elicited. Importantly, the true percentage was set in advance. The first experiment tested whether receiving instructions as to the elicitation method would assist in estimating a true value more accurately than receiving no instructions, and whether accuracy was determined by the numerical skills of the participants. The second experiment compared the elicitation method used in the first experiment with a variation of a graphical elicitation method. The results indicate that (i) receiving instructions as to the elicitation method does assist in producing estimates closer to a true percentage value, (ii) the level of numerical skills does not play a part in the accuracy of the estimation (Experiment 1), and (iii) although the average estimates of the betting and graphical methods are not significantly different, the betting method leads to more precise estimations than the graphical method (Experiment 2). Both studies featured statistical procedures (functional data analysis and a novel clustering technique) not considered in past research on the elicitation of subjective distributions. The implications of these results are discussed in relation to a recent key study. [ABSTRACT FROM AUTHOR]
- Published
- 2019
13. Application of Horton's infiltration model for the soil of Dediapada (Gujarat), India.
- Author
- Fadadu, M. H., Shrivastava, P. K., and Dwivedi, D. K.
- Subjects
- INFILTROMETERS, SOIL infiltration, IRRIGATION, DATA analysis
- Abstract
The design and evaluation of surface irrigation systems for a site requires reliable infiltration data, which can be provided by an infiltration model. In this study, Horton's infiltration model was estimated for the soil of a field at the College of Agricultural Engineering and Technology, Dediapada, Gujarat, using infiltration data obtained at several locations in the field with a double-ring infiltrometer. The decay constant of Horton's model was obtained using the graphical method and also by using a semi-log plot of t (time) vs. (f - fc), where f is the infiltration rate (mm/hr) and fc is the final constant infiltration capacity (mm/hr). The potential of the fitted model was evaluated by least-squares fitting against the observed infiltration data, and the model was used to estimate the infiltration rate (mm/hr) and cumulative infiltration (cm). The model for infiltration rate obtained by the semi-log plot method was i = 20 + 94e^(-1.02t), where i is the infiltration rate (mm/hr) and t is time (min). The coefficient of determination obtained when the model was applied to observations taken at various points in the field was found to be 0.96. It could therefore be inferred that Horton's infiltration model gives a reliable estimate of infiltration for the soil of Dediapada. [ABSTRACT FROM AUTHOR]
- Published
- 2018
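A sketch of the semi-log fitting step described above, using hypothetical infiltration readings rather than the Dediapada data: regressing ln(f - fc) on t gives the decay constant as the negative slope.

```python
import numpy as np

# Horton's model: f(t) = fc + (f0 - fc) * exp(-k*t). On a semi-log plot,
# ln(f - fc) versus t is a straight line with slope -k. All values below
# are hypothetical illustrations.
t = np.array([5, 10, 20, 30, 45, 60, 90, 120], dtype=float)        # minutes
f = np.array([102.0, 87.0, 65.0, 50.0, 36.5, 29.0, 22.7, 20.8])    # mm/hr
fc = 20.0                                                           # final constant rate

slope, intercept = np.polyfit(t, np.log(f - fc), 1)
k = -slope
f0 = fc + np.exp(intercept)

# Goodness of fit of the recovered model against the observations.
pred = fc + (f0 - fc) * np.exp(-k * t)
r2 = 1 - np.sum((f - pred) ** 2) / np.sum((f - f.mean()) ** 2)
print(f"f(t) = {fc:.0f} + {f0 - fc:.0f}*exp(-{k:.3f} t), R^2 = {r2:.3f}")
```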
14. A GRAPHICAL APPROACH FOR ANALYSES OF DATA THIN NON-PARAMETRIC CONTINUOUS VARIABLE OF BOTIA DARIO WITH R PROGRAMMING LANGUAGE.
- Author
- Sarker, Bhakta Supratim, Paul, Shyamal Kumar, Maruf, Kawser Kadir, Majumdar, Priyanka Rani, Azom, Golam, and Saha, Debasish
- Subjects
- PROGRAMMING languages, DATA analysis, GAUSSIAN function, BOX plots (Graphs), BIG data
- Abstract
Botia dario is categorized as endangered owing to a considerable drop in population over the past two decades, creating a data-thin condition in which nonparametric statistical methods are a superior alternative approach for data description. Small data sets necessitate graphical display, reducing the chance of data compression by numerical analyses. Monthly mass and length density estimates, location, and spread were compared in the R computing environment through charts and plots, proposing a graphical method for analyses of single discrete and continuous data. The novel method reveals the pattern for mass and length of B. dario: the modes and skews of the kernel density estimates suggest wide fluctuations during pre-monsoon months, whereas the spreads and locations of boxplots draped with dot-whiskers support the Gaussian kernel by pairwise comparisons. The boxplot widths, the notches of the boxplots, and the red dot-whiskers illustrate comprehensive variations. The novel method and its suggestive narratives invite inclusive use, within the stated limitations. [ABSTRACT FROM AUTHOR]
- Published
- 2022
15. Regional residual plots for assessing the fit of linear regression models
- Author
- Deschepper, E., Thas, O., and Ottoy, J.P.
- Subjects
- *REGRESSION analysis, *GRAPHICAL modeling (Statistics), *DATA analysis, *HYPOTHESIS
- Abstract
Abstract: An intuitively appealing lack-of-fit test to assess the adequacy of a regression model is introduced together with a graphical diagnostic tool. The graphical method itself includes a formal testing procedure, and it is particularly useful for detecting the location of lack-of-fit. The procedure is based on regional residuals, using subsets of the space of the independent variables. A simulation study shows that the proposed procedures in simple linear regression have power similar to that of some popular classical lack-of-fit tests. In the case of local departures from the hypothesized regression model, the new tests are shown to be more powerful. Therefore, when it becomes difficult to discriminate between systematic deviations and noise, regional residual plots are very helpful in formally locating areas of lack-of-fit in the predictor space. Data examples illustrate the ability of the new methods to detect and to locate lack-of-fit. [Copyright Elsevier]
- Published
- 2006
16. Applications of Bladder Cancer Data Using a Modified Log-Logistic Model.
- Author
- Kayid, Mohamed
- Subjects
- BLADDER cancer, STATISTICAL reliability, PREDICTION models, DATA analysis, INFORMATION science
- Abstract
In information science, modern and advanced computational methods and tools are often used to build predictive models for time-to-event data analysis. Such predictive models, based on previously collected patient data, can support decision-making and prediction of clinical outcomes. Therefore, a new simple and flexible modified log-logistic model is presented in this paper, and some of its basic statistical and reliability properties are discussed. A graphical method is also presented for determining whether data arise from the log-logistic model or from the proposed modified model. Several methods are applied to estimate the parameters of the presented model, and a simulation study is conducted to investigate the consistency and behavior of the discussed estimators. Finally, the model is fitted to two data sets and compared with some other candidates. [ABSTRACT FROM AUTHOR]
- Published
- 2022
17. Bootstrap Methods for Canonical Correlation Analysis of Functional Data.
- Author
- Haoyu Yu and Lihong Wang
- Subjects
- MOTION analysis, STATISTICAL correlation, FUNCTIONAL analysis, DATA analysis, GAIT in humans, KNEE
- Abstract
The bootstrap method is a very general resampling procedure for investigating the distributional property of statistics. In this paper, we present two bootstrap methods with the aim of studying the functional canonical components for functional data. The bootstrap I method constructs the bootstrap replications by resampling from the raw data, while the bootstrap II algorithm samples with replacement from the principal component scores. Simulation studies are conducted to examine the performance of the proposed bootstrap methods. The method is also applied to the motion analysis dataset, which consists of the angles formed by the hip and knee of each of 39 children over each child's gait cycle. Numerical simulations and real data analysis show the good performance of both bootstrap methods for functional canonical correlation analysis. Moreover, as measured by the mean error and mean squared error, the bootstrap II algorithm performs better in approximating sample canonical components than the bootstrap I method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
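The "bootstrap I" idea, resampling the raw paired observations, can be sketched for ordinary multivariate (rather than functional) data with scikit-learn's CCA; the data and the number of replications are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)

# Synthetic paired data sharing one latent component.
n = 200
z = rng.normal(size=n)
X = np.c_[z + rng.normal(0, 1, n), rng.normal(size=n)]
Y = np.c_[0.8 * z + rng.normal(0, 1, n), rng.normal(size=n)]

def first_canonical_corr(X, Y):
    # Correlation between the first pair of canonical variates.
    u, v = CCA(n_components=1).fit_transform(X, Y)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

# Bootstrap I-style resampling of raw paired observations.
stats = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    stats.append(first_canonical_corr(X[idx], Y[idx]))
lo, hi = np.percentile(stats, [2.5, 97.5])
print(f"r1 = {first_canonical_corr(X, Y):.3f}, 95% bootstrap CI ({lo:.3f}, {hi:.3f})")
```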
18. A new improved graphical and quantitative method for detecting bias in meta-analysis.
- Author
- Furuya-Kanamori, Luis, Barendregt, Jan J., and Doi, Suhail A.R.
- Subjects
- META-analysis, REGRESSION analysis, STATISTICS, DATA analysis, QUANTITATIVE research, RECEIVER operating characteristic curves, PUBLICATION bias, EVALUATION
- Abstract
Supplemental Digital Content is available in the text. Detection of publication and related biases remains suboptimal and threatens the validity and interpretation of meta-analytical findings. When bias is present, it usually differentially affects small and large studies, manifesting as an association between precision and effect size and therefore as visual asymmetry of conventional funnel plots. This asymmetry can be quantified, and Egger's regression is by far the most widely used statistical measure for quantifying funnel plot asymmetry. However, concerns have been raised about both the visual appearance of funnel plots and the sensitivity of Egger's regression to detect such asymmetry, particularly when the number of studies is small. In this article, we propose a new graphical method, the Doi plot, to visualize asymmetry, and a new measure, the LFK index, to detect and quantify asymmetry of study effects in Doi plots. We demonstrate that the visual representation of asymmetry was better for the Doi plot than for the funnel plot. The diagnostic accuracy of the LFK index in discriminating between asymmetry due to simulated publication bias and chance or no asymmetry was also better: the LFK index had areas under the receiver operating characteristic curve of 0.74–0.88 in simulations of meta-analyses with five, 10, 15, and 20 studies, whereas Egger's regression had lower values of 0.58–0.75 across the same simulations. The LFK index also had higher sensitivity (71.3–72.1%) than Egger's regression (18.5–43.0%). We conclude that the methods proposed in this article can markedly improve the ability of researchers to detect bias in meta-analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2018
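The Doi plot and LFK index are defined in the paper itself; as a reference point, the Egger's regression it is benchmarked against regresses the standardized effect on precision and reads asymmetry off the intercept. A sketch on synthetic studies:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Synthetic meta-analysis: 15 study effects with varying standard errors.
se = rng.uniform(0.05, 0.5, 15)
effect = 0.2 + rng.normal(0, se)      # add `+ 1.5 * se` to mimic small-study bias

# Egger's regression: standardized effect on precision; a nonzero intercept
# signals funnel plot asymmetry.
z = effect / se
precision = 1.0 / se
fit = sm.OLS(z, sm.add_constant(precision)).fit()
print(f"Egger intercept = {fit.params[0]:.2f}, p = {fit.pvalues[0]:.3f}")
```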
19. Design optimization of a novel elastomeric baffle magnetorheological fluid device.
- Author
- Elliott, Christopher M. and Buckner, Gregory D.
- Subjects
- MAGNETORHEOLOGICAL fluids, ELASTOMERS, COMPUTER simulation, DATA analysis, FINITE element method
- Abstract
This article details the design optimization of a novel magnetorheological fluid device, which incorporates a rolling elastomeric baffle to contain the fluid and minimize cost and complexity. The application considered here is an electronic joystick for a drive-by-wire construction machine, although designs for other semi-active applications could be similarly optimized. Performance is evaluated computationally and experimentally. A computationally efficient system model is developed and validated using finite element analysis to accurately predict the device’s magnetic flux characteristics. The design methodology is based on a variant of the firefly algorithm, which identifies promising initial designs, and conjugate gradient methods, which optimize these designs. A unique graphical method for interpreting the results of population-based firefly optimization is demonstrated. An experimental prototype, based on the optimized design, is tested and its data compared to simulation results. [ABSTRACT FROM AUTHOR]
- Published
- 2018
20. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.
- Author
- Lee Rodgers, Joseph, Beasley, William Howard, and Schuelke, Matthew
- Subjects
- DATA analysis, TIME series analysis, SOFTWARE support, GRAPHIC methods in statistics, EVENT history analysis
- Abstract
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is useful for supporting time series analyses in general and interrupted time series designs in particular. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential. [ABSTRACT FROM AUTHOR]
- Published
- 2014
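A rose diagram, the older of the two methods discussed above, is straightforward to sketch with a polar bar chart; the monthly counts below are simulated, and the RRose system itself is not used here.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(6)

# Synthetic monthly event counts with a seasonal cycle.
months = np.arange(12)
counts = rng.poisson(50 + 20 * np.sin(2 * np.pi * months / 12))

# Rose diagram: a circular histogram with one sector per month, so December
# and January sit next to each other instead of at opposite ends of a line plot.
theta = 2 * np.pi * months / 12
ax = plt.subplot(projection="polar")
ax.bar(theta, counts, width=2 * np.pi / 12, bottom=0.0, align="edge")
ax.set_xticks(theta)
ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
plt.savefig("rose.png")
```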
21. Study on the limiting conditions and calculation formula of BQ classification based on the graphical method.
- Author
- 赵 文, 金 陈, 路 博, and 杨 楠
- Subjects
- *COMPRESSIVE strength, *REGRESSION analysis, *DATA analysis, *GRAPHIC methods, *CLASSIFICATION
- Abstract
The calculation formula for the BQ (basic quality) classification index in GB 50218-1994 was modified in the revised Standard for Engineering Classification of Rock Mass (GB/T 50218-2014), but its two limiting conditions (the saturated uniaxial compressive strength of rock, Rc, and the rock mass integrity coefficient, Kv) were not modified at the same time. Through a large number of data studies, it is found that when the data are directly reduced by the Rc and Kv constraints, the calculated results are inconsistent with the qualitative classification results. The limiting conditions are therefore modified and analyzed on the basis of a simplified graphical method. Because the analysis shows that the Rc constraint has the wider effect, 54 sets of new data in the database are taken as samples, and a new calculation formula is established by regression analysis of the data reduced by the Rc constraint. The revised limiting conditions and the new calculation formula are then analyzed and demonstrated against actual engineering data. The results show that the revised limiting conditions and the proposed calculation formula improve the agreement between the qualitative and quantitative classifications, yielding a more accurate classification result. [ABSTRACT FROM AUTHOR]
- Published
- 2022
22. Process Capability Evaluation Using Capability Indices as a Part of Statistical Process Control.
- Author
- Benková, Marta, Bednárová, Dagmar, and Bogdanovská, Gabriela
- Subjects
- PROCESS capability, STATISTICAL process control, STATISTICAL hypothesis testing, MANUFACTURING industries, CAPABILITIES approach (Social sciences)
- Abstract
This study aims to highlight the importance of a systematic approach to process capability assessment and the importance of following a sequence of steps. Statistical process control provides several different ways of assessing process capability. This study evaluates the process capability of crown cap manufacturing through capability indices. In addition to calculating the indices, the evaluation involves extensive data analysis. Before calculating the capability indices, the assumptions for their correct selection and use were also verified. Several statistical tests were used to verify each assumption. The research value of the study lies in pointing out that not all tests led to the same conclusions. It highlights the importance of selecting the appropriate test type for the evaluated process quality characteristics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
23. A Theoretical Analysis of Meteorological Data as a Road towards Optimizing Wind Energy Generation.
- Author
- Orynycz, Olga, Ruchała, Paweł, Tucki, Karol, Wasiak, Andrzej, and Zöldy, Máté
- Subjects
- WIND power, WIND speed, ENERGY development, DATA analysis, AIR speed
- Abstract
The development of wind energy has been observed for many years. Both construction firms and the scientific community are analyzing new design solutions, atmospheric conditions, and the technical performance achieved. The main goal of this research is to evaluate the requirements that have to be met to design wind power stations that would be an optimal fit for the climatic conditions in Poland. This study combines the results of empirical studies on wind velocity distributions with the physical fundamentals of wind power station design, and models the relationships between wind velocity distributions observed in Poland and the technical requirements for wind power station design. The wind velocity distributions for various locations in Poland are determined and expressed in terms of Weibull distribution parameters. Theoretical computations concerning the dependence of wind power station output on wind speed and the physical properties of air are presented. Conclusions important for the design of power stations fitted to the atmospheric conditions in Poland are given. LabVIEW 2021 was used for computer modeling. [ABSTRACT FROM AUTHOR]
- Published
- 2024
24. Methods of determining optimal cut-point of diagnostic biomarkers with application of clinical data in ROC analysis: an update review.
- Author
- Hassanzad, Mojtaba and Hajian-Tilaki, Karimollah
- Subjects
- INFLAMMATORY bowel diseases, MONTE Carlo method, BLOOD sedimentation, CLINICAL medicine, DATA analysis
- Abstract
Introduction: An important application of ROC analysis is the determination of the optimal cut-point for biomarkers in diagnostic studies. This comprehensive review provides a framework for cut-point selection for biomarkers in diagnostic medicine. Methods: Several methods proposed for the selection of optimal cut-points are reviewed. The validity and precision of the proposed methods are discussed, and their clinical application is illustrated with a practical example of clinical diagnostic data on C-reactive protein (CRP), erythrocyte sedimentation rate (ESR), and malondialdehyde (MDA) for prediction of inflammatory bowel disease (IBD) using the NCSS software. Results: Our results in the clinical data suggested that for CRP and MDA, the calculated cut-points of the Youden index, Euclidean index, Product, and Index of Union (IU) methods were consistent in predicting IBD patients, while for ESR, only the Euclidean and Product methods yielded similar estimates. However, the diagnostic odds ratio (DOR) method provided more extreme values of the optimal cut-point for all biomarkers analyzed. Conclusion: Overall, the four methods, the Youden index, Euclidean index, Product, and IU, can produce quite similar optimal cut-points for binormal pairs with the same variance. The cut-point determined with the Youden index may not agree with the other three methods in the case of skewed distributions, while the DOR does not produce valid informative cut-points. Therefore, more extensive Monte Carlo simulation studies are needed to investigate the conditions of test result distributions that may lead to inconsistent findings in clinical diagnostics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
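Two of the reviewed cut-point rules, the Youden index and the Euclidean (closest-to-(0,1)) index, can be sketched directly from an ROC curve. The biomarker data below are simulated, not the CRP/ESR/MDA data from the review.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(7)

# Synthetic biomarker: diseased subjects score higher on average.
y = np.r_[np.zeros(300, dtype=int), np.ones(200, dtype=int)]
marker = np.r_[rng.normal(5, 1, 300), rng.normal(6.5, 1, 200)]

fpr, tpr, thr = roc_curve(y, marker)

# Youden index: maximize J = sensitivity + specificity - 1 = tpr - fpr.
youden_cut = thr[np.argmax(tpr - fpr)]
# Euclidean index: minimize the distance to the ideal corner (fpr=0, tpr=1).
euclid_cut = thr[np.argmin(np.sqrt((1 - tpr) ** 2 + fpr ** 2))]
print(f"Youden cut-point: {youden_cut:.2f}, Euclidean cut-point: {euclid_cut:.2f}")
```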
25. Should We Depend on Expert Opinion or Statistics? A Meta-Analysis of Accident-Contributing Factors in Construction.
- Author
- Antoniou, Fani, Agrafioti, Nektaria Filitsa, and Aretoulis, Georgios
- Subjects
- BUILDING sites, WORK-related injuries, GRADUATE students, DATA analysis
- Abstract
International research overflows with studies looking into the causes of construction accidents. Hundreds of studies by postgraduate students in the past 20 years focus on identifying and assessing risks contributing to accidents on Greek construction workplace sites. Many base their work on results from questionnaire surveys that collect the opinions of construction site professionals or on the analysis of data from actual accident records or statistics. Consequently, this study seeks to determine if the data source leads to differing conclusions by using two techniques to synthesize individual results and rank the accident-contributing factors investigated in the original studies. The first utilizes their relative importance index (RII) values, and the second uses their overall ranking index (ORI) to execute meta-analyses. The professional opinion concludes that factors related to operative behavior are the most significant accident-contributing factors. At the same time, actual accident statistics point to site risk factors of the construction process itself as the most important, indicating that expert opinion of Greek professionals should be considered in conjunction with data from actual accident records to provide the focus points for mitigation and assurance of safe construction sites in Greece. [ABSTRACT FROM AUTHOR]
- Published
- 2024
26. Improving Holistic Business Intelligence with Artificial Intelligence for Demand Forecasting.
- Author
- ALFURHOOD, BADRIA SULAIMAN, ALONAZI, WADI B., ARUNKUMAR, K., SANTHI, S., TAWFEQ, JAMAL FADHIL, RASHEED, TARIQ, and POOVENDRAN, PARTHASARATHY
- Subjects
- ARTIFICIAL intelligence, DEMAND forecasting, BUSINESS intelligence, DECISION making, DATA analysis, FORECASTING
- Abstract
A Business Intelligence Model (BIM) plays a vital role in forming strategy and taking correct data-based steps in the modern era to achieve better demand forecasting results. An effective decision-support structure that helps an organization conduct data analyses throughout the business process has remained a significant challenge. This research introduces the prediction of potential business demand with the help of artificial intelligence. Based on this intelligence technique, demand estimation, one of the company's major decision-making activities, is addressed through the Improving Holistic Business Intelligence Model (IHBIM). For demand prediction, raw data from the market are first gathered, and potential demand for sales/products is then predicted according to requirements using the IHBIM, drawing on data obtained from multiple sources. Further, artificial intelligence that gathers data from various modules and calculates product demand regularly, monthly, and quarterly has been integrated into the IHBIM. The simulation results show that the accuracy of the demand forecast is non-compromising. Furthermore, the model's performance is validated by comparing the projected results with accurate data and calculating the percentage error. [ABSTRACT FROM AUTHOR]
- Published
- 2024
27. Local Singularity Spectrum: An Innovative Graphical Approach for Analyzing Detrital Zircon Geochronology Data in Provenance Analysis.
- Author
- Wang, Wenlei, Pei, Yingru, Cheng, Qiuming, and Wang, Wenjun
- Subjects
- PROVENANCE (Geology), GEOLOGICAL time scales, ZIRCON, DATA analysis, AGE discrimination, SPECTRUM analysis
- Abstract
Detrital zircon geochronology plays a crucial role in provenance analysis, serving as one of the fundamental strategies. The age spectrum of detrital zircons collected from the sedimentary unit of interest is often compared or correlated with that of potential source terranes. However, biases in the age data can arise due to factors related to detrital sampling, analysis techniques, and nonlinear geological mechanisms. The current study reviewed two sets of detrital zircon datasets established in 2011 and 2021 to discuss the origins of the Tibetan Plateau. These datasets collected from different media effectively demonstrate a progressive understanding of provenance affinity among the main terranes on the Tibetan Plateau. This highlights issues regarding weak and unclear temporal connections identified through analyzing the age spectrum for provenance analysis. Within this context, a local singularity analysis approach is currently employed to address issues associated with unclear and weak provenance information by characterizing local variations in nonlinear behaviors and enhancing detection sensitivity towards subtle anomalies. This new graphical approach effectively quantifies temporal variations in detrital zircon age populations and enhances identification of weak provenance information that may not be readily apparent on conventional age spectra. [ABSTRACT FROM AUTHOR]
- Published
- 2024
28. The investigation on flexural performance of prestressed concrete-encased high strength steel beams.
- Author
- Jun Wang, Yurong Jiao, Menglin Cui, Wendong Yang, Xueqi Fang, and Jun Yan
- Subjects
- HIGH strength steel, FLEXURAL strength, DUCTILITY, BENDING strength, DATA analysis
- Abstract
This paper reports an experimental study on the flexural performance of prestressed concrete-encased high-strength steel beams (PCEHSSBs). To study the applicability of high-strength steel (HSS) in prestressed concrete-encased steel beams (PCESBs), one simply supported prestressed concrete-encased ordinary-strength steel beam (PCEOSSB) and eight simply supported PCEHSSBs were tested under a four-point bending load. The influence of steel strength grade, I-steel ratio, reinforcement ratio, and stirrup ratio on the flexural performance of such members was investigated. The test results show that increasing the I-steel grade and I-steel ratio can significantly improve the bearing capacity of the PCESB. Increasing the compressive reinforcement ratio of the PCEHSSB can effectively improve its bearing capacity and ductility, making full use of the performance of HSS in composite beams. Increasing the stirrup ratio yields only a small improvement in the load capacity of the test beams; adding shear connectors improves the ductile properties of the specimens, although it does not lead to a significant increase in the load capacity of the composite beams. Combined with the test data, a comprehensive reinforcement index considering the location of reinforcement was proposed to evaluate the crack resistance of the specimens, and the relationship between this index and the crack resistance of the specimens was given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
29. Hermite expansion and estimation of monotonic transformations of Gaussian data.
- Author
- Janicki, Ryan and McElroy, Tucker S.
- Subjects
- HERMITE polynomials, ESTIMATION theory, GAUSSIAN function, DATA analysis, MATHEMATICAL models
- Abstract
This paper describes a semiparametric method for estimating a generic probability distribution using a basis expansion in Hermite polynomials. We express the given distribution as a monotonic transformation of the Gaussian cumulative distribution function, expanded in a basis of Hermite polynomials. The coefficients in the basis expansion are functionals of the quantile function and can be consistently estimated to give a smooth estimate of the transformation function. For situations in which the estimated function is not monotone, a projection approach is used to adjust the estimated transformation function to guarantee monotonicity. Two applications are presented which focus on the analysis of model residuals. The first is a data example which uses the residuals from the 2012 Small Area Income and Poverty Estimates model; the Hermite estimation method is applied to these residuals as a graphical method for detecting departures from normality and to construct credible intervals. The second example analyses residuals from time series models for the purpose of estimating the variance of the mean and median and comparing the results to the AR-sieve. The paper concludes with a set of numerical examples illustrating the theoretical results. [ABSTRACT FROM AUTHOR]
- Published
- 2016
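One plausible numerical reading of the expansion described above (the paper's own estimator is based on functionals of the quantile function and includes a monotonicity projection, which this sketch omits): if X = g(Z) with Z standard normal, then g(z) = Q(Phi(z)), where Q is the quantile function of X, and the coefficients of g in probabilists' Hermite polynomials can be approximated by Monte Carlo with the empirical quantile function.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy.stats import norm
from math import factorial

rng = np.random.default_rng(8)
x = rng.lognormal(0.0, 0.5, 5000)     # sample from some "unknown" distribution

# He_k are orthogonal under the N(0,1) weight with squared norm k!, so
#   c_k = E[ g(Z) * He_k(Z) ] / k!,  with g(z) = Q(Phi(z)).
# Estimate the expectation by Monte Carlo, using the empirical quantile function.
z = rng.normal(size=200_000)
g_z = np.quantile(x, norm.cdf(z))      # empirical Q evaluated at Phi(z)

K = 6
coef = np.zeros(K + 1)
for k in range(K + 1):
    basis = np.zeros(k + 1)
    basis[k] = 1.0                      # coefficient vector selecting He_k
    coef[k] = np.mean(g_z * hermeval(z, basis)) / factorial(k)

# g_hat(z) = sum_k c_k He_k(z) is then a smooth estimate of the transformation.
print(np.round(coef, 3))
```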
30. Evaluation of a weighting approach for performing sensitivity analysis after multiple imputation.
- Author
- Rezvan, Panteha Hayati, White, Ian R., Lee, Katherine J., Carlin, John B., and Simpson, Julie A.
- Subjects
- ALGORITHMS, COMPUTER simulation, REGRESSION analysis, RESEARCH funding, STATISTICS, DATA analysis, STATISTICAL models
- Abstract
Background: Multiple imputation (MI) is a well-recognised statistical technique for handling missing data. As usually implemented in standard statistical software, MI assumes that data are 'Missing at random' (MAR), an assumption that in many settings is implausible. It is not possible to distinguish whether data are MAR or 'Missing not at random' (MNAR) using the observed data, so it is desirable to discover the impact of departures from the MAR assumption on the MI results by conducting sensitivity analyses. A weighting approach based on a selection model has been proposed for performing MNAR analyses to assess the robustness of results obtained under standard MI to departures from MAR.
Methods: In this article, we use simulation to evaluate the weighting approach as a method for exploring possible departures from MAR, with missingness in a single variable, where the parameters of interest are the marginal mean (and probability) of a partially observed outcome variable and a measure of association between the outcome and a fully observed exposure. The simulation studies compare the weighting-based MNAR estimates for various numbers of imputations in small and large samples, for moderate to large magnitudes of departure from MAR, where the degree of departure from MAR was assumed known. Further, we evaluated a proposed graphical method, which uses the dataset with missing data, for obtaining a plausible range of values for the parameter that quantifies the magnitude of departure from MAR.
Results: Our simulation studies confirm that the weighting approach outperformed the MAR approach, but it still suffered from bias. In particular, our findings demonstrate that the weighting approach provides biased parameter estimates, even when a large number of imputations is performed. In the examples presented, the graphical approach for selecting a range of values for the possible departures from MAR did not capture the true parameter value of departure used in generating the data.
Conclusions: Overall, the weighting approach is not recommended for sensitivity analyses following MI, and further research is required to develop more appropriate methods to perform such sensitivity analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2015
31. Exploring Exploratory Data Analysis: An Empirical Test of Run Chart Utility.
- Author
- Barsalou, Matthew, Saraiva, Pedro Manuel, and Henriques, Roberto
- Subjects
- DATA analysis, ROOT cause analysis, EVALUATION methodology
- Abstract
This paper explores Exploratory Data Analysis (EDA). Graphical methods are used to gain insights in EDA, and these insights can be useful for forming tentative hypotheses when performing a root cause analysis (RCA). The topic of EDA is well addressed in the literature; however, empirical studies of the efficacy of EDA are lacking. We therefore aim to evaluate EDA by comparing one group of students identifying salient features in a table against a second group attempting to identify salient features in the same data presented as a run chart, and then extracting relevant conclusions from the comparison. Two groups of students were randomly selected to receive data, either in the form of a table or as a run chart, and were tasked with visually identifying any data points that stood out as interesting. The number of correctly identified values and the time to find the values were both evaluated by a two-sample t-test to determine whether there was a statistically significant difference. The participants with a graph found the correct values that stood out in the data much more quickly than those that used a table; those using the table took much longer and failed to identify values that stood out. However, those with a graph also had far more false positives. Much has been written on the topic of EDA in the literature, but an empirical evaluation of this common methodology has been lacking; this paper confirms with empirical evidence the effectiveness of EDA. [ABSTRACT FROM AUTHOR]
- Published
- 2023
32. The Interval-Censored Biplot.
- Author
- Cecere, Silvia, Groenen, Patrick J. F., and Lesaffre, Emmanuel
- Subjects
- CENSORING (Statistics), INTERVAL analysis, PRINCIPAL components analysis, GRAPH theory, DATA analysis, REPRESENTATIONS of graphs, MATRICES (Mathematics)
- Abstract
The principal components biplot is a useful visualization tool for the exploration of a samples by variables data matrix. In several data analysis situations, the data values are interval censored so that only the interval of a data value is available, but not the value itself. For such data, we propose the interval-censored biplot (IC-Biplot), a new exploratory and graphical method that is an extension of the principal component analysis biplot. It provides not only a two-dimensional graphic representation of respondents and their attributes, but also point estimates for the data values that are constrained to be in their interval. Two applications of the IC-Biplot are discussed. The first application considers data on emergence times of permanent teeth focusing on the pattern of emergence. The IC-Biplot confirms rank orders suggested earlier in the literature. Goodness-of-fit measures show that the present model seems to fit these data very well. The second application discusses a regular sample by attribute matrix from the literature on characteristics of several types of oils. This article has online supplementary materials. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
33. The Half-Half Plot.
- Author
- Einmahl, John H.J. and Gantner, Maria
- Subjects
- DATA analysis, LIMIT theorems, GRAPHIC methods, NONPARAMETRIC statistics, REGRESSION analysis
- Abstract
The Half-Half (HH) plot is a new graphical method to investigate qualitatively the shape of a regression curve. The empirical HH-plot counts observations in the lower and upper quarter of a strip that moves horizontally over the scatterplot. The plot displays jumps clearly and reveals further features of the regression curve. We prove a functional central limit theorem for the empirical HH-plot, with rate of convergence . In a simulation study, the good performance of the plot is demonstrated. The method is also applied to two case studies. The proofs and one more case study are deferred to a supplement, which is available online. [ABSTRACT FROM AUTHOR]
- Published
- 2012
34. Improving the Functional Profile of the Students Through the Means of the Basketball Game.
- Author
- Nicoleta, Leonte, Cristiana, Porfireanu, Ofelia, Popescu, and Cristian, Ristea
- Subjects
- BASKETBALL games, DATA analysis, HIGHER education, STUDENTS
- Published
- 2019
35. Mark-mark scatterplots improve pattern analysis in spatial plant ecology.
- Author
- Ballani, Felix, Pommerening, Arne, and Stoyan, Dietrich
- Subjects
- SCATTER diagrams, ECOLOGISTS, SCOTS pine, SPATIAL data infrastructures, DATA analysis
- Abstract
Point process statistics provides valuable tools for many ecological studies, where 'points' are commonly determined to represent the locations of plants or animals and 'marks' are additional items such as species or size. In the statistical analysis of marked point patterns, various correlation functions are used, such as the mark variogram or the mark correlation function. Often the interpretation of these functions is not easy, and the non-spatial ecologist is in need of support. In order to make the analysis of spatial point patterns more accessible to ecologists, we introduced and tested a new graphical method, the mark-mark scatterplot. This plot visualises the marks of point pairs with inter-point distances r smaller than some small distance r_max. We tested the application of the mark-mark scatterplot by reconsidering three quite different tree patterns: a pattern of longleaf pine trees from the southern US which was strongly influenced by fires, a tropical tree pattern of the species Shorea congestiflora from Sri Lanka, and a Scots pine pattern from Siberia (Russia). The new method yielded previously undetected cause-effect information on mark behaviour at short inter-point distances and thus improved the analysis with mark correlation functions as well as complemented the information they provided. We discovered important new correlations in clusters of trees at close proximity. The application of the mark-mark scatterplot will facilitate the interpretation of point process summary statistics and will make point process analysis more accessible to ecologists not specialized in point process statistics.
Highlights:
• The mark-mark scatterplot fills an important gap in preparing and interpreting the analysis of spatial point patterns.
• We re-visited three published spatial data analyses and added significant new information.
• The mark-mark scatterplot allows a more comprehensive understanding and validation of spatial analyses.
• The new tool makes spatial summary statistics more intelligible and trustworthy for ecologists. [ABSTRACT FROM AUTHOR]
- Published
- 2019
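A sketch of the plot's construction on a synthetic marked pattern: collect all point pairs closer than r_max and scatter their marks against each other. The pattern, marks, and r_max value below are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial import cKDTree

rng = np.random.default_rng(9)

# Synthetic marked point pattern: tree locations with size marks.
xy = rng.uniform(0, 100, size=(400, 2))
marks = rng.gamma(5, 4, 400)

# All point pairs with inter-point distance below r_max.
r_max = 5.0
pairs = np.array(sorted(cKDTree(xy).query_pairs(r_max)))

# Mark-mark scatterplot: marks of the two members of every close pair,
# plotted symmetrically (each pair contributes (m_i, m_j) and (m_j, m_i)).
m1 = np.r_[marks[pairs[:, 0]], marks[pairs[:, 1]]]
m2 = np.r_[marks[pairs[:, 1]], marks[pairs[:, 0]]]
plt.scatter(m1, m2, s=8, alpha=0.5)
plt.xlabel("mark of first tree")
plt.ylabel("mark of second tree")
plt.savefig("mark_mark.png")
```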
36. Constrained polynomial fitting for recovery of regional gravity.
- Author
- Abokhodair, Abdulwahab A.
- Subjects
- POLYNOMIALS, MATHEMATICAL mappings, LEAST squares, SIMULATION methods & models, CONSTRAINTS (Physics), ROBUST control, DATA analysis
- Abstract
Isolation of a regional field from a Bouguer map has always been an ambiguous and troublesome problem. It is often argued that the ambiguity arises from lack of specific criteria under which the problem may be formulated. In this paper, I show that by adopting Skeels' definition of the regional field and its corollary, criteria needed to extract the field with minimum ambiguity may be developed. The definition and its corollary allow formulation of the regional field separation problem as a weighted (robustified) and constrained least-square fitting problem with constraints extracted directly from the Bouguer map. To emphasize the constraints, I formulate the problem from the perspective of prior information constrained by observational data. The new formalism offers several advantages: weighted fitting is more robust than ordinary least squares fitting, providing a simple mechanism to eliminate data outliers and reduce the undesirable influence of local gravity disturbances. Introducing constraints into the fitting procedure effectively reduces ambiguity and increases the resolution of the fitted regional field. Moreover, imposing conditions on the fitted regional field directly from the Bouguer map is tantamount to incorporating prior information about the underlying geology and structure of the area with minimum human subjectivity. The procedure was tested on simulated and actual data sets with excellent results. Indeed the test results indicate that with properly placed constraints, the regional field may be recovered in a manner that closely emulates the graphical method. [ABSTRACT FROM AUTHOR]
- Published
- 2011
37. Piecewise linear regression: A statistical method for the analysis of experimental adsorption data by the intraparticle-diffusion models
- Author
- Malash, Gihan F. and El-Khaiary, Mohammad I.
- Subjects
- *REGRESSION analysis, *ADSORPTION (Chemistry), *DIFFUSION, *SURFACE chemistry, *SEPARATION (Technology), *GRAPHICAL modeling (Statistics), *DATA analysis, *GRAPHIC methods
- Abstract
Abstract: The film-diffusion and intraparticle-diffusion models are widely used to analyze the mechanism of adsorption. The plots of these models often have a multi-linear nature, and in general the graphical method is employed to analyze the data, with the linear segments determined visually. This method suffers from subjectivity, and therefore its estimated diffusion parameters are not very reliable. An alternative statistical method, piecewise linear regression (PLR), is presented and applied to experimental data. The results demonstrate that the use of PLR is practical and leads to diffusion estimates that may be quite different from those of the graphical method. PLR also determines the exact time periods for each diffusion regime, which opens new possibilities for analyzing and understanding the mechanism of diffusion. In order to encourage the testing and application of PLR, an easy-to-use Microsoft® Excel™ spreadsheet is made available. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
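The core of the PLR approach in the entry above is choosing the breakpoint between linear segments by minimising the residual sum of squares rather than by eye. Here is a minimal two-segment sketch with a continuous "hinge" parameterisation; the synthetic sorption data and the single-breakpoint restriction are illustrative assumptions, not the authors' spreadsheet.

```python
# A minimal sketch of two-segment piecewise linear regression with the
# breakpoint chosen by minimising the residual sum of squares.
import numpy as np

def two_segment_fit(t, q):
    """Fit q ~ piecewise-linear in t with one breakpoint; return (tb, sse, beta)."""
    best = (None, np.inf, None)
    for k in range(2, len(t) - 2):  # candidate breakpoint indices
        tb = t[k]
        # Continuous hinge model: q = b0 + b1*t + b2*max(t - tb, 0)
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - tb, 0.0)])
        beta, *_ = np.linalg.lstsq(X, q, rcond=None)
        sse = np.sum((q - X @ beta) ** 2)
        if sse < best[1]:
            best = (tb, sse, beta)
    return best

# Synthetic intraparticle-diffusion-style data: qt vs sqrt(t) with two regimes.
rng = np.random.default_rng(1)
t = np.sqrt(np.linspace(1, 120, 60))  # sqrt(time) axis
q = np.where(t < 7, 2.0 * t, 14.0 + 0.3 * (t - 7)) + rng.normal(0, 0.3, 60)

tb, sse, beta = two_segment_fit(t, q)
print(f"estimated breakpoint at sqrt(t) = {tb:.2f}, SSE = {sse:.2f}")
```

Because the breakpoint is found by an exhaustive search over the data points, the fit is reproducible, unlike visual segmentation of the multi-linear plot.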
38. Development of multiple visualisation techniques in the ViSta program: an example of application to principal components analysis.
- Author
-
Ledesma, Rubén, Molina, J. Gabriel, Young, Forrest W., and Valero-Mora, Pedro
- Subjects
- *
PSYCHOLOGICAL research , *RESEARCH methodology , *GRAPHIC methods , *GEOMETRICAL drawing , *DATA analysis , *PRINCIPAL components analysis - Abstract
Multiple visualisation (MV) is a statistical graphical method rarely applied in data analysis practice, even though it provides useful features for this purpose. This paper: (1) describes the application of the MV graphical method; (2) presents a number of rules related to the design of an MV; (3) introduces a general outline for developing MVs and shows how MV may be implemented in the ViSta statistical system; (4) illustrates this strategy by means of an example of MV oriented to principal component analysis; and, finally, (5) discusses some limitations of using and developing MVs. [ABSTRACT FROM AUTHOR]
- Published
- 2007
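As a rough static illustration of the multiple-visualisation idea applied to PCA in the entry above, the sketch below places several coordinated views (scree plot, score plot, loading plot) in one figure. ViSta's interactive linking and brushing are not reproduced; the synthetic data and layout are assumptions for illustration only.

```python
# A minimal sketch of a multiple-visualisation layout for PCA: scree, scores
# and loadings side by side in a single figure.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))  # correlated data
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores, var = U * s, s**2 / (len(X) - 1)

fig, axes = plt.subplots(1, 3, figsize=(12, 3.5))
axes[0].bar(range(1, 6), var / var.sum())          # scree: explained variance
axes[0].set(title="Scree", xlabel="component", ylabel="prop. variance")
axes[1].scatter(scores[:, 0], scores[:, 1], s=10)  # observation scores
axes[1].set(title="Scores", xlabel="PC1", ylabel="PC2")
for k, (vx, vy) in enumerate(zip(Vt[0], Vt[1])):   # variable loadings
    axes[2].arrow(0, 0, vx, vy, head_width=0.02)
    axes[2].annotate(f"v{k+1}", (vx, vy))
axes[2].set(title="Loadings", xlabel="PC1", ylabel="PC2")
plt.tight_layout()
plt.show()
```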
39. Ability of ¹⁸F-FDG Positron Emission Tomography Radiomics and Machine Learning in Predicting KRAS Mutation Status in Therapy-Naive Lung Adenocarcinoma.
- Author
-
Zhang, Ruiyun, Shi, Kuangyu, Hohenforst-Schmidt, Wolfgang, Steppert, Claus, Sziklavari, Zsolt, Schmidkonz, Christian, Atzinger, Armin, Hartmann, Arndt, Vieth, Michael, and Förster, Stefan
- Subjects
ADENOCARCINOMA ,LUNG cancer ,STATISTICS ,GENETIC mutation ,CONFIDENCE intervals ,MACHINE learning ,RETROSPECTIVE studies ,MANN Whitney U Test ,COMPARATIVE studies ,RADIOPHARMACEUTICALS ,POSITRON emission tomography ,GENOMICS ,DEOXY sugars ,PREDICTION models ,STATISTICAL sampling ,DATA analysis ,LOGISTIC regression analysis ,RECEIVER operating characteristic curves - Abstract
Simple Summary: Approximately 26.1% of patients diagnosed with lung adenocarcinoma harbour a KRAS mutation, which is associated with a poorer prognosis. Recent advances in targeted therapy, specifically with sotorasib and MRTX849, have shown promise in targeting KRAS mutations. This retrospective study aimed to develop a clinical prediction model that combines clinical–pathological variables and radiomics derived from PET scans to assess the KRAS mutation status in patients with lung adenocarcinoma. This study utilised two different databases and randomly divided them into a training, a validation, and a testing dataset to build and evaluate the predictive performance of our model. Our retrospectively developed model demonstrates good predictive accuracy for determining the KRAS mutation status in lung adenocarcinoma patients. Objective: Considering the essential role of KRAS mutation in NSCLC and the limited experience of PET radiomic features in KRAS mutation, a prediction model was built in our current analysis. Our model aims to evaluate the status of KRAS mutants in lung adenocarcinoma by combining PET radiomics and machine learning. Method: Patients were retrospectively selected from our database and screened from the NSCLC radiogenomic dataset from TCIA. The dataset was randomly divided into three subgroups. Two open-source software programs, 3D Slicer and Python, were used to segment lung tumours and extract radiomic features from ¹⁸F-FDG-PET images. Feature selection was performed by the Mann–Whitney U test, Spearman's rank correlation coefficient, and RFE. Logistic regression was used to build the prediction models. AUCs from ROCs were used to compare the predictive abilities of the models. Calibration plots were obtained to examine the agreement between observed and predicted values in the validation and testing groups. DCA curves were used to check the clinical impact of the best model. Finally, a nomogram was obtained to present the selected model. Results: One hundred and nineteen patients with lung adenocarcinoma were included in our study. The whole group was divided into three datasets: a training set (n = 96), a validation set (n = 11), and a testing set (n = 12). In total, 1781 radiomic features were extracted from PET images. One hundred sixty-three predictive models were established according to each original feature group and their combinations. After model comparison and selection, one model, including wHLH_fo_IR, wHLH_glrlm_SRHGLE, wHLH_glszm_SAHGLE, and smoking habits, was validated with the highest predictive value. The model obtained AUCs of 0.731 (95% CI: 0.619~0.843), 0.750 (95% CI: 0.248~1.000), and 0.750 (95% CI: 0.448~1.000) in the training set, the validation set and the testing set, respectively. Results from calibration plots in the validation and testing groups indicated that there was no departure between observed and predicted values in the two datasets (p = 0.377 and 0.861, respectively). Conclusions: Our model combining ¹⁸F-FDG-PET radiomics and machine learning indicated a good predictive ability for KRAS status in lung adenocarcinoma. It may be a helpful non-invasive method to screen the KRAS mutation status of heterogeneous lung adenocarcinoma before selected biopsy sampling. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
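The entry above names a concrete selection-and-modelling pipeline (Mann–Whitney U filter, Spearman redundancy pruning, RFE, logistic regression, AUC). Below is a minimal sketch of that kind of pipeline with scikit-learn and SciPy; the synthetic classification data stands in for PET radiomics, and thresholds are illustrative assumptions, not the paper's settings.

```python
# A minimal sketch of a radiomics-style pipeline: univariate filter ->
# correlation pruning -> RFE -> logistic regression -> test-set AUC.
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=119, n_features=100, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# 1) Univariate filter: keep features that differ between the two groups.
keep = [j for j in range(X_tr.shape[1])
        if mannwhitneyu(X_tr[y_tr == 0, j], X_tr[y_tr == 1, j]).pvalue < 0.05]

# 2) Redundancy pruning: drop features highly Spearman-correlated with kept ones.
pruned = []
for j in keep:
    if all(abs(spearmanr(X_tr[:, j], X_tr[:, k])[0]) < 0.9 for k in pruned):
        pruned.append(j)

# 3) RFE down to a handful of predictors, then fit and evaluate.
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=4)
rfe.fit(X_tr[:, pruned], y_tr)
model = LogisticRegression(max_iter=1000).fit(rfe.transform(X_tr[:, pruned]), y_tr)
auc = roc_auc_score(y_te, model.predict_proba(rfe.transform(X_te[:, pruned]))[:, 1])
print(f"test AUC: {auc:.3f}")
```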
40. THE USE OF ADDED VARIABLE PLOTS IN REGRESSION MODELLING WITH SPATIAL DATA.
- Author
-
Haining, Robert
- Subjects
SPATIAL analysis (Statistics) ,REGRESSION analysis ,DATA analysis ,GRAPHIC methods for multivariate analysis ,GEOGRAPHICAL research ,AUTOCORRELATION (Statistics) - Abstract
A simple graphical method for developing regression models for spatially referenced data is presented. The method supplements formal testing procedures and provides the analyst with information that helps develop an understanding of data properties without the need for complex data manipulations. The paper includes a worked example and suggests directions for further work in this area. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
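An added variable (partial regression) plot, as used in the entry above, displays the residuals of the response on all predictors except one against the residuals of that predictor on the same set. Here is a minimal sketch; the synthetic data is illustrative, and spatial autocorrelation in the residuals is not modelled.

```python
# A minimal sketch of an added-variable (partial regression) plot.
import numpy as np
import matplotlib.pyplot as plt

def added_variable_plot(X, y, k):
    """X: (n, p) design without intercept; k: column to inspect."""
    others = np.delete(X, k, axis=1)
    Z = np.column_stack([np.ones(len(X)), others])  # intercept + other vars
    resid = lambda v: v - Z @ np.linalg.lstsq(Z, v, rcond=None)[0]
    ry, rx = resid(y), resid(X[:, k])               # both sets of residuals
    plt.scatter(rx, ry, s=12)
    plt.xlabel(f"residual of x[{k}] on other predictors")
    plt.ylabel("residual of y on other predictors")
    plt.show()

rng = np.random.default_rng(7)
X = rng.normal(size=(80, 3))
y = 1.0 + 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 80)
added_variable_plot(X, y, k=0)
```

The slope visible in the plot equals the multiple-regression coefficient of the inspected variable, which is what makes the display useful for judging whether a candidate variable should enter the model.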
41. Validation of a Modified Version of Interpersonal Reactivity Index for Medical Students Using Rasch Analysis.
- Author
-
Lee, Young-Mee, Park, Hyunmi, Shin, Hyoung Seok, and Huh, Sun
- Subjects
STATISTICS ,EVALUATION of medical care ,EMPATHY ,RESEARCH methodology evaluation ,MEDICAL students ,GOODNESS-of-fit tests ,PATIENT satisfaction ,PSYCHOMETRICS ,INTERPERSONAL relations ,QUESTIONNAIRES ,FACTOR analysis ,RESEARCH funding ,DATA analysis ,MEDICAL education - Abstract
Construct: Empathy has been accepted to interweave both cognitive aspects (the ability to put oneself in another person's place) and affective (or emotional) aspects, indicating an emotional reaction or response to another person's emotional state. Literature supports the positive influence of empathy on the doctor-patient relationship, patient satisfaction, and positive clinical outcomes. Background: Many studies have dealt with the development of empathy measurement tools for physicians and medical students. A frequently used empathy-measuring instrument for medical students is the "Interpersonal Reactivity Index" (IRI), which was designed to measure the multi-dimensional aspects of empathy in the general adult population. Most previous literature that validated the IRI for medical students has used factor analysis, whilst studies applying Rasch models have been limited. Our study aimed to investigate the psychometric properties of a modified version of the IRI for medical students using Rasch analysis. Approach: A total of 1,293 medical students from 15 medical schools in South Korea participated in an online questionnaire consisting of the 28 items of the Korean-translated version of the IRI. We applied exploratory factor analysis (EFA) using a polychoric correlation matrix to determine the optimal number of factors, followed by Rasch analysis and McDonald's Omega calculation. Findings: The adapted IRI-MS (IRI for medical students) consisted of 17 items in four dimensions: empathic concern (5), fictitious situation (4), perspective taking (4), and personal distress (4). The overall fit of the IRI-MS revealed an acceptable goodness-of-fit for all 17 items and a positive point measure correlation for all items. Reliability indices from the Rasch modeling and McDonald's Omega values of all four dimensions were satisfactory for research. We found the Wright-Andrich maps and category probability curves of the IRI's four dimensions to be less than optimal in measuring empathy levels with adequate precision. Conclusions: Rasch analysis of the IRI-MS fell short of demonstrating satisfactory validity in measuring the multidimensional nature of empathy in medical students. However, our study applying Rasch analysis may serve as groundwork for future studies that build on the shortcomings of our findings. Supplemental data for this article are available online at www.tandfonline.com/htlm. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
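The entry above rests on Rasch modelling, in which item difficulties and person abilities are estimated jointly. The sketch below shows a heavily simplified dichotomous Rasch fit by joint maximum likelihood with alternating Newton updates; the IRI uses polytomous Likert items, so this illustrates the Rasch idea only, under stated assumptions, and is not the paper's model.

```python
# A minimal sketch of dichotomous Rasch estimation via joint maximum
# likelihood: alternate Newton steps for person abilities and item difficulties.
import numpy as np

rng = np.random.default_rng(2)
n_persons, n_items = 300, 17
theta_true = rng.normal(0, 1, n_persons)   # person abilities
b_true = rng.normal(0, 1, n_items)         # item difficulties
p = 1 / (1 + np.exp(-(theta_true[:, None] - b_true[None, :])))
Y = (rng.random((n_persons, n_items)) < p).astype(float)

theta = np.zeros(n_persons)
b = np.zeros(n_items)
for _ in range(50):
    P = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    W = P * (1 - P)
    # Ability step (clipped to avoid divergence for extreme response patterns).
    theta = np.clip(theta + (Y - P).sum(axis=1) / W.sum(axis=1), -5, 5)
    P = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    W = P * (1 - P)
    b = b - (Y - P).sum(axis=0) / W.sum(axis=0)  # difficulty step
    b -= b.mean()                                # fix the scale origin

print("correlation with true difficulties:",
      np.round(np.corrcoef(b, b_true)[0, 1], 3))
```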
42. Three‐Dimensional Steady‐State Hydraulic Tomography Analysis With Integration of Cross‐Hole Flowmeter Data at a Highly Heterogeneous Site.
- Author
-
Luo, Ning, Zhao, Zhanfeng, Illman, Walter A., Zha, Yuanyuan, Mok, Chin Man W., and Yeh, T.‐C. Jim
- Subjects
GROUNDWATER flow ,HYDRAULIC conductivity ,FLOW meters ,TOMOGRAPHY ,MODEL validation ,HETEROGENEITY ,DATA analysis - Abstract
Hydraulic tomography (HT) has been shown to be a robust approach for the high‐resolution characterization of subsurface heterogeneity. However, HT can yield smooth estimates of hydraulic parameters when pumping tests and drawdown measurements are sparse, thus limiting the utility of characterization results in predicting groundwater flow and solute transport. To overcome this issue, this study integrates cross‐hole flowmeter measurements with HT analysis of steady‐state pumping/injection test data for the three‐dimensional (3‐D) characterization of hydraulic conductivity (K) at a highly heterogeneous glaciofluvial deposit site, which has not been previously attempted. Geostatistical inverse analyses of cross‐hole flowmeter data are conducted to yield preliminary estimates of K distribution, which are then utilized as initial K fields for steady‐state HT analysis of head data. Four cases combining three data types (geological information, cross‐hole flowmeter measurements, and steady‐state head data) for inverse modeling are performed. Model calibration and validation results from all cases are compared qualitatively and quantitatively to evaluate their performances. Results from this study show that (a) geostatistical inverse analysis of cross‐hole flowmeter data is capable of revealing vertical distributions of K at well locations and major high/low K zones between wells, (b) cross‐hole flowmeter data carry non‐redundant information on K heterogeneity compared to geological information and steady‐state head data, and (c) integration of flowmeter data improves characterization results in terms of revealing K heterogeneity details and predicting independent hydraulic test data. Therefore, this study demonstrates the usefulness of cross‐hole flowmeter data in augmenting HT surveys for improved K characterization in 3‐D. Key Points: • Geostatistical inverse analysis of cross‐hole flowmeter data is capable of revealing heterogeneity patterns of hydraulic conductivity • Cross‐hole flowmeter data carry non‐redundant information compared to structural information and head response data • Integration of cross‐hole flowmeter data with steady‐state hydraulic tomography analysis improves characterization results [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
43. Revisiting the analysis pipeline for overdispersed Poisson and binomial data.
- Author
-
Lee, Woojoo, Kim, Jeonghwan, and Lee, Donghwan
- Subjects
GRAPHICAL modeling (Statistics) ,TEST scoring ,DATA analysis - Abstract
Overdispersion is a common feature in categorical data analysis, and several methods have been developed for detecting and handling it in generalized linear models. The first aim of this study is to clarify the relationships among various score statistics for testing overdispersion and to compare their performances. In addition, we investigate a principled way to correct finite-sample bias in the score statistic caused by estimating regression parameters with restricted likelihood. The second aim is to reconsider the current practice for handling overdispersed categorical data. Although the conventional models are based on substantially different mechanisms for generating overdispersion, model selection in practice has not been well studied. We perform an intensive numerical study to determine which method is more robust to various overdispersion mechanisms. In addition, we provide some graphical tools for identifying the better model. The last aim is to reconsider the key assumption for deriving the score statistics. We study the meaning of testing overdispersion when this assumption is violated, and we analytically show the conditions under which it is not appropriate to employ the current statistical practices for analyzing overdispersed data. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
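To illustrate the kind of score statistic the entry above compares, here is a minimal sketch of a common score-type test for Poisson overdispersion (the Dean–Lawless form). It is one representative statistic, not necessarily the corrected version the paper proposes; the simulated negative-binomial counts are illustrative.

```python
# A minimal sketch of a score-type test for Poisson overdispersion:
# T = sum((y - mu)^2 - y) / sqrt(2 * sum(mu^2)), approx. N(0,1) under the null.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
x = rng.normal(size=n)
mu_true = np.exp(0.5 + 0.4 * x)
y = rng.negative_binomial(5, 5 / (5 + mu_true))  # overdispersed counts

X = sm.add_constant(x)
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
mu = fit.mu  # fitted Poisson means

# Large positive values of T indicate overdispersion relative to Poisson.
T = np.sum((y - mu) ** 2 - y) / np.sqrt(2 * np.sum(mu ** 2))
print(f"score statistic T = {T:.2f} (approx. N(0,1) under equidispersion)")
```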
44. Visual assessment of matrix‐variate normality.
- Author
-
Počuča, Nikola, Gallaugher, Michael P.B., Clark, Katharine M., and McNicholas, Paul D.
- Subjects
GAUSSIAN distribution ,DATA analysis - Abstract
Summary: In recent years, the analysis of three‐way data has become ever more prevalent in the literature. It is becoming increasingly common to analyse such data by means of matrix‐variate distributions, the most common of which is the matrix‐variate normal distribution. Although many methods exist for assessing multivariate normality, there is a relative paucity of approaches for assessing matrix‐variate normality. Herein, a new visual method is proposed for assessing matrix‐variate normality by means of a distance–distance plot. In addition, a testing procedure to be used in tandem with the proposed visual method is discussed. The proposed approach is illustrated via simulated data as well as an application to analysing handwritten digits. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
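A distance-based visual check of matrix-variate normality, in the spirit of the entry above, can compare squared Mahalanobis-type distances of each matrix observation against chi-square quantiles. The sketch below uses flip-flop estimates of the Kronecker-structured covariances; the procedure and data are illustrative assumptions and may differ from the paper's exact construction.

```python
# A minimal sketch of a distance-distance style plot for matrix-variate
# normality: d_i = tr(Sigma^-1 X_i' Psi^-1 X_i) ~ chi2(n*p) under normality.
import numpy as np
from scipy.stats import chi2
import matplotlib.pyplot as plt

rng = np.random.default_rng(11)
N, n, p = 200, 4, 3             # 200 observed 4x3 matrices
X = rng.normal(size=(N, n, p))  # matrix-variate normal, identity scale matrices
Xc = X - X.mean(axis=0)

# Flip-flop estimates of the row (n x n) and column (p x p) covariances.
Sigma = np.eye(p)
for _ in range(5):
    Psi = sum(M @ np.linalg.inv(Sigma) @ M.T for M in Xc) / (N * p)
    Sigma = sum(M.T @ np.linalg.inv(Psi) @ M for M in Xc) / (N * n)

Pi, Si = np.linalg.inv(Psi), np.linalg.inv(Sigma)
d = np.array([np.trace(Si @ M.T @ Pi @ M) for M in Xc])

q = chi2.ppf((np.arange(1, N + 1) - 0.5) / N, df=n * p)
plt.scatter(q, np.sort(d), s=10)
plt.plot(q, q, color="grey")  # reference line: sample vs. theory
plt.xlabel("chi-square quantiles"); plt.ylabel("ordered distances")
plt.show()
```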
45. On the Requirements for Inferring Aquifer‐Scale T and S in Heterogeneous Confined Aquifers.
- Author
-
Naderi, Mostafa and Gupta, Hoshin V.
- Subjects
AQUIFERS ,ARITHMETIC mean ,DATA analysis ,RADIUS (Geometry) - Abstract
We study the sensitivity of aquifer‐scale estimates of transmissivity (T) and storativity (S) to the variance and correlation length scale of aquifer heterogeneity, when such estimates are obtained by the traditional approach of analyzing pumping test data. We consider both constant‐rate and variable‐rate pumping tests, and a variety of Theis‐based solution methodologies (single‐ and multiple‐observation well methods, and interpreted single value or transient values for T and S parameters) applied in pumping test data analysis. Our results indicate that achieving reliable inference of effective T and S requires that pumping be continued until the radius of the test‐induced cone of depression exceeds a "representative length" that corresponds to ∼15 times the correlation length scale of the aquifer heterogeneity. Independent of solution type, pumping history, correlation length scale, magnitude of the sill, and location of observation well, the estimates for T will converge toward the geometric mean. For S, the estimation uncertainty is relatively large for observation wells that are close to the pumping well, but diminishes for observation wells located beyond the representative distance, resulting in convergence to the arithmetic mean. Given that these results have practical implications for how pumping tests should be carried out, we present a simple "rule‐of‐thumb" for estimating the correlation length scale of the heterogeneity of an aquifer. Key Points: • The T estimates for constant and heterogeneous S aquifers differ for every observation well; in contrast, the difference for S strongly depends on observation well distance • Pumping rate and history do not affect the inferred aquifer parameters; however, the Theis‐based solution type results in different interpretations • As a rule‐of‐thumb, temporal variability of T and S diminishes when the depression cone radius exceeds 15 times the correlation length scale plus the observation well distance [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
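The Theis solution underlying the analyses in the entry above gives drawdown as s = (Q / 4πT) · W(u) with u = r²S / (4Tt), where W is the exponential integral (well function). Here is a minimal sketch of evaluating it; the parameter values are illustrative only, not from the study.

```python
# A minimal sketch of the Theis drawdown solution for a confined aquifer.
import numpy as np
from scipy.special import exp1  # W(u) = exp1(u), the Theis well function
import matplotlib.pyplot as plt

Q = 0.01   # pumping rate, m^3/s
T = 1e-3   # transmissivity, m^2/s
S = 1e-4   # storativity, dimensionless
r = 50.0   # distance from pumping to observation well, m

t = np.logspace(1, 6, 200)         # time since pumping started, s
u = r**2 * S / (4 * T * t)
s = Q / (4 * np.pi * T) * exp1(u)  # drawdown at the observation well, m

plt.semilogx(t, s)
plt.xlabel("time (s)"); plt.ylabel("drawdown (m)")
plt.title("Theis drawdown at r = 50 m")
plt.show()
```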
46. Investigation of the compliance of offshore wind data with the Weibull distribution function using six different methods.
- Author
-
Kaplan, Y. A.
- Subjects
OFFSHORE wind power plants ,RENEWABLE energy sources ,WEIBULL distribution ,STATISTICAL errors ,DATA analysis - Abstract
The aim of this study is to investigate how compatible the Weibull Distribution Function (WDF) is with wind data in offshore regions. Determining potential offshore wind energy and making investments in this area have gained further significance today. Although many academic studies have been conducted on wind energy, offshore wind energy has received less attention. The compatibility between wind data and the WDF on land has been investigated in many studies, and the results have been evaluated. However, the compatibility of offshore wind data with the WDF has not been investigated sufficiently, and there are steps to be taken in this regard. In this study, a point in Iskenderun Gulf was selected to examine the compatibility of offshore wind data with the WDF. The study both determined the wind energy potential of the selected region and contributed to the literature. Six different methods were used to determine the parameters of the WDF, and their performances were then evaluated with different statistical error analysis tests. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
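To make the parameter-estimation step in the entry above concrete, here is a minimal sketch of two common ways to estimate the Weibull shape (k) and scale (c) from wind-speed data: maximum likelihood and the empirical (standard-deviation) method. These are typical of the methods usually compared; the paper's exact set of six may differ, and the simulated wind speeds are illustrative.

```python
# A minimal sketch of Weibull parameter estimation for wind-speed data.
import numpy as np
from scipy.stats import weibull_min
from scipy.special import gamma

rng = np.random.default_rng(13)
v = weibull_min.rvs(c=2.1, scale=7.5, size=5000, random_state=rng)  # wind speeds

# Method 1: maximum likelihood (location fixed at zero).
k_mle, _, c_mle = weibull_min.fit(v, floc=0)

# Method 2: empirical method from the mean and standard deviation:
# k ~ (sigma / mean)^(-1.086),  c = mean / Gamma(1 + 1/k).
k_emp = (v.std() / v.mean()) ** -1.086
c_emp = v.mean() / gamma(1 + 1 / k_emp)

print(f"MLE:       k = {k_mle:.3f}, c = {c_mle:.3f}")
print(f"Empirical: k = {k_emp:.3f}, c = {c_emp:.3f}")
```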
47. Integrative Analysis of Multi-Omics Data Based on Blockwise Sparse Principal Components.
- Author
-
Park, Mira, Kim, Doyoen, Moon, Kwanyoung, and Park, Taesung
- Subjects
DATA analysis ,MULTIPLE correspondence analysis (Statistics) ,GLIOBLASTOMA multiforme ,MULTICOLLINEARITY - Abstract
The recent development of high-throughput technology has allowed us to accumulate vast amounts of multi-omics data. Because even single omics data have a large number of variables, integrated analysis of multi-omics data suffers from problems such as computational instability and variable redundancy. Most multi-omics data analyses apply a single supervised analysis repeatedly for dimension reduction and variable selection. However, these approaches cannot avoid the problems of redundancy and collinearity of variables. In this study, we propose a novel approach using blockwise component analysis. This approach addresses the limitations of current methods by applying variable clustering and sparse principal component (sPC) analysis. Our approach consists of two stages. The first stage identifies homogeneous variable blocks and then extracts sPCs for each omics dataset. The second stage merges the sPCs from each omics dataset and then constructs a prediction model. We also propose a graphical method showing the results of sparse PCA and model fitting simultaneously. We applied the proposed methodology to glioblastoma multiforme data from The Cancer Genome Atlas. The comparison with other existing approaches showed that our proposed methodology is more easily interpretable and has comparable predictive power, with a much smaller number of variables. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
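Here is a minimal sketch of the two-stage idea described in the entry above: cluster variables into homogeneous blocks, extract a sparse principal component per block, then merge the components as predictors. The block assignment via k-means on correlation profiles and the final classifier are simplified placeholders, not the authors' algorithm.

```python
# A minimal sketch of blockwise sparse PCA followed by a prediction model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.decomposition import SparsePCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=150, n_features=60, n_informative=10,
                           random_state=0)

# Stage 1a: cluster variables (columns) by their correlation profiles.
blocks = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    np.corrcoef(X, rowvar=False))

# Stage 1b: one sparse PC per block.
comps = []
for b in range(4):
    Xb = X[:, blocks == b]
    comps.append(SparsePCA(n_components=1, random_state=0).fit_transform(Xb))
Z = np.hstack(comps)  # merged block components

# Stage 2: prediction model on the merged sPCs.
acc = cross_val_score(LogisticRegression(max_iter=1000), Z, y, cv=5).mean()
print(f"CV accuracy on {Z.shape[1]} block components: {acc:.3f}")
```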
48. Network rule extraction under the network formal context based on three-way decision.
- Author
-
Fan, Min, Luo, Shan, and Li, Jinhai
- Subjects
DECISION making ,DATA mining ,COVID-19 testing ,GRANULAR computing ,DATA analysis ,EIGENVALUES - Abstract
Knowledge discovery combined with network structure is an emerging field of network data analysis and mining. Three-way concept analysis is a method that fits the way humans handle uncertainty in decisions and analysis. When three-way concept analysis is placed in the context of a network, not only do the three-way rules need to be obtained, but the network characteristic values of these rules should be obtained as well, which is of great significance for concept cognition in networks. This paper combines complex network analysis with the formal context of three-way decision. Firstly, the network formal context of three-way decision (NFC3WD) is proposed to unify the two studies mentioned above into one data framework. Then, the network weaken-concepts of three-way decision (NWC3WD) and their corresponding sub-networks are studied. We can therefore not only find the network weaken-concepts but also obtain the average influence of each sub-network, as well as the influence differences within it. Furthermore, the concept logic of networks and the properties of its operators are put forward, which lays a foundation for designing the rule extraction algorithms. Subsequently, the bidirectional rule extraction algorithm and the reduction algorithm based on confidence degree are explored. These algorithms are applied to COVID-19 diagnosis examples, from which we can not only obtain diagnostic rules but also learn, through network eigenvalues, the importance within the network of the populations corresponding to these rules. Finally, experimental analysis shows the superiority of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
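The entry above builds on the derivation operators of formal concept analysis extended with a negative (three-way) operator. Here is a minimal pure-Python sketch of those operators on a toy binary context; the object and attribute names are hypothetical, and the network extension (influence measures on each concept's sub-network) is not reproduced.

```python
# A minimal sketch of positive and negative derivation operators on a small
# binary formal context, the building blocks of three-way concept analysis.
objects = ["o1", "o2", "o3", "o4"]
attrs = ["a", "b", "c", "d"]
I = {("o1", "a"), ("o1", "b"), ("o2", "b"), ("o2", "d"),
     ("o3", "c"), ("o3", "d"), ("o4", "a"), ("o4", "c")}

def up(objs):      # attributes shared by all objects in objs
    return {m for m in attrs if all((g, m) in I for g in objs)}

def down(ms):      # objects having all attributes in ms
    return {g for g in objects if all((g, m) in I for m in ms)}

def up_neg(objs):  # attributes shared by *no* object in objs (negative operator)
    return {m for m in attrs if all((g, m) not in I for g in objs)}

X = {"o1", "o4"}
print("positive intent:", up(X))      # attributes every member of X has: {'a'}
print("negative intent:", up_neg(X))  # attributes no member of X has: {'d'}
print("closure of positive intent:", down(up(X)))
```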
49. Characterization of Corn Silk Extract Using HPLC/HRMS/MS Analyses and Bioinformatic Data Processing.
- Author
-
Fougère, Laëtitia, Zubrzycki, Sandrine, Elfakir, Claire, and Destandau, Emilie
- Subjects
PLANT defenses ,CORN ,ELECTRONIC data processing ,PHYTOCHEMICALS ,DATA analysis ,SILK ,FLAVONOIDS ,PHENOL content of food - Abstract
In addition to having different biological activities of interest, corn silks play a role in the defense of plants. While benzoxamines and flavonoids have already been identified as molecules of plant defense and growth mechanisms, knowledge on the phytochemical composition of corn silk is lacking. Such knowledge would make it possible to better select the most effective varieties to improve resistance or bioactive properties. In this article, an approach was implemented to map a corn silk extract in two complementary ways. The first one involved working with UHPLC/HRMS data and Kendrick and van Krevelen plots to highlight homologous series of compounds, such as lipids from 17 to 23 carbons, monoglycosylated flavonoids from 21 to 24 carbons, diglycosylated flavonoids of 26 to 28 carbons and organic acids of 14 to 19 carbons. The second way was to analyze the sample in UHPLC/HRMS² and to plot mass spectral similarity networks with the GNPS platform and Cytoscape software to refine identification. By combining the information obtained, we were able to propose an identification for 104 detected molecules, including 7 nitrogenous, 28 lipidic and 67 phenolic compounds, leading to the first detailed phytochemical analysis of corn silk extract. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
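The Kendrick plot used in the entry above rescales measured masses so that a CH2 repeat unit has nominal mass 14; members of a homologous series then share the same Kendrick mass defect (KMD). Here is a minimal sketch of that computation; the example m/z values are illustrative, not from the paper.

```python
# A minimal sketch of the Kendrick mass-defect computation used to reveal
# homologous (CH2-repeat) series in high-resolution MS data.
import numpy as np

CH2 = 14.01565  # exact mass of a CH2 repeat unit

def kendrick(mz):
    """Return (Kendrick mass, Kendrick mass defect) for an array of m/z."""
    km = np.asarray(mz) * (14.0 / CH2)  # rescale so CH2 has nominal mass 14
    kmd = np.round(km) - km             # series members share the same KMD
    return km, kmd

mz = np.array([256.2402, 270.2559, 284.2715, 298.2872])  # fatty-acid-like masses
km, kmd = kendrick(mz)
for m, d in zip(mz, kmd):
    print(f"m/z {m:10.4f} -> KMD {d:+.4f}")
# Near-identical KMD values flag a homologous series differing by CH2 units.
```

Plotting KMD against nominal Kendrick mass lines up each homologous family horizontally, which is what makes the display effective for scanning thousands of HRMS features at once.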
50. Support vector regression (SVR) and grey wolf optimization (GWO) to predict the compressive strength of GGBFS-based geopolymer concrete.
- Author
-
Ahmed, Hemn Unis, Mostafa, Reham R., Mohammed, Ahmed, Sihag, Parveen, and Qadir, Azad
- Subjects
POLYMER-impregnated concrete ,COMPRESSIVE strength ,INORGANIC polymers ,KAOLIN ,PARTICLE swarm optimization ,PETROLEUM as fuel ,CONCRETE ,FLY ash - Abstract
Geopolymer concrete is an eco-efficient and environmentally friendly construction material. Various ashes have been used as the binder in geopolymer concrete, such as fly ash, ground granulated blast furnace slag, rice husk ash, metakaolin ash, and palm oil fuel ash. Fly ash has been the binder most commonly consumed to prepare geopolymer concrete composites. It is essential to have a 28-day resting period for the concrete to attain the compressive strength used in structural design. In the present investigation, several soft computing models were employed to build predictive models for forecasting the compressive strength of ground granulated blast furnace slag (GGBFS) concrete. A complete dataset of 268 samples was extracted from published research articles and analyzed to establish the models. The modeling process incorporated seven effective parameters: water content (W), temperature (T), water-to-binder ratio (w/b), ground granulated blast furnace slag-to-binder ratio (GGBFS/b), fine aggregate (FA) content, coarse aggregate (CA) content, and superplasticizer dosage (SP). Their effects on the compressive strength of GGBFS concrete were examined and measured utilizing various modeling techniques, viz. Linear Regression (LR), Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Support Vector Regression (SVR), Grey Wolf Optimization (GWO), Differential Evolution (DE), and Manta Ray Foraging Optimization (MRFO). The compressive strength of the training datasets was predicted using the SVR-PSO and SVR-GWO models, with reliable correlation coefficients of 0.9765 and 0.9522, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
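To illustrate the SVR-GWO combination in the entry above, here is a minimal sketch of grey wolf optimization tuning SVR hyperparameters (C, gamma) by cross-validated R². Synthetic regression data stands in for the concrete-mix dataset, and the search bounds, pack size, and iteration count are illustrative assumptions, not the authors' settings.

```python
# A minimal sketch of grey wolf optimization (GWO) tuning SVR hyperparameters.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=268, n_features=7, noise=10.0, random_state=0)
rng = np.random.default_rng(0)
lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])  # log10 bounds for C, gamma

def fitness(pos):
    C, gamma = 10.0 ** pos
    return cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3, scoring="r2").mean()

wolves = rng.uniform(lo, hi, size=(8, 2))  # wolf pack positions in log10 space
for it in range(15):
    scores = np.array([fitness(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(scores)[::-1][:3]]  # top three wolves
    a = 2.0 * (1 - it / 15)  # exploration coefficient shrinks over iterations
    for i in range(len(wolves)):
        new = np.zeros(2)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(2), rng.random(2)
            A, Cc = 2 * a * r1 - a, 2 * r2
            new += leader - A * np.abs(Cc * leader - wolves[i])
        wolves[i] = np.clip(new / 3.0, lo, hi)  # average pull toward the leaders

best = wolves[np.argmax([fitness(w) for w in wolves])]
print(f"best C = {10**best[0]:.3g}, gamma = {10**best[1]:.3g}, "
      f"CV R^2 = {fitness(best):.3f}")
```

Searching in log10 space keeps the update steps meaningful across the several orders of magnitude that C and gamma typically span.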