5,943 results on '"Performance measures"'
Search Results
2. Drivers, barriers, and challenges in NRevPAR and RevPAC adoption – Towards a revenue management adoption scale
- Author
BOO, Huey Chern, REMY, Detlev, and LEE, Kuan-Huei
- Published
- 2025
- Full Text
- View/download PDF
3. Optimization-based resource scheduling techniques in cloud computing environment: A review of scientific workflows and future directions
- Author
Kathole, Atul B., Vhatkar, Kapil, Lonare, Savita, and Kshirsagar, Aniruddha P.
- Published
- 2025
- Full Text
- View/download PDF
4. Analysis and optimization of machining parameters of AZ91 alloy nanocomposite with the Influences of nano ZrO2 through vacuum diecast process
- Author
R, Venkatesh, Hossain, Ismail, Mohanavel, V., Soudagar, Manzoore Elahi M., Alharbi, Sulaiman Ali, and Al Obaid, Sami
- Published
- 2024
- Full Text
- View/download PDF
5. Analytical models for flow time estimation of additive manufacturing machines.
- Author
Pastore, Erica, Alfieri, Arianna, Matta, Andrea, and Previtali, Barbara
- Subjects
TIME perception, PRODUCTION planning, PRODUCTION control, WORK in process, MACHINERY
- Abstract
The use of Additive Manufacturing (AM) technology has increased markedly in recent years. Because of its large differences from conventional technologies, the use of AM in production systems might call for new strategies in production planning and control. To this aim, this paper proposes analytical models to predict aggregate performance measures such as flow time, work in process, and production throughput for production systems characterised by Laser Powder Bed Fusion AM technology. These indicators could be used both in operations strategy development and in technology comparison. The proposed models differ in their level of detail and in the number of input parameters that need to be estimated. The results show that the level of detail of the model affects the analysis, leading to quite different values of the performance measures, especially in the case of highly saturated systems. A discussion of the applicability of the proposed models to other AM technologies shows whether and to what extent they can be applied. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. 2024 Update to the 2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure A Report of the American Heart Association/American College of Cardiology Joint Committee on Performance Measures
- Author
Writing Committee Members, Kittleson, Michelle M, Breathett, Khadijah, Ziaeian, Boback, Aguilar, David, Blumer, Vanessa, Bozkurt, Biykem, Diekemper, Rebecca L, Dorsch, Michael P, Heidenreich, Paul A, Jurgens, Corrine Y, Khazanie, Prateeti, Koromia, George Augustine, and Van Spall, Harriette GC
- Subjects
Biomedical and Clinical Sciences, Cardiovascular Medicine and Haematology, Heart Disease, Cardiovascular, Clinical Research, Health Services, Good Health and Well Being, Humans, Heart Failure, United States, American Heart Association, Cardiology, Adult, ACC/AHA Performance Measures, heart failure, performance measures, quality indicators, quality measures, Cardiorespiratory Medicine and Haematology, Public Health and Health Services, Cardiovascular System & Hematology, Cardiovascular medicine and haematology
- Abstract
This document describes performance measures for heart failure that are appropriate for public reporting or pay-for-performance programs and is meant to serve as a focused update of the "2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American College of Cardiology/American Heart Association Task Force on Performance Measures." The new performance measures are taken from the "2022 AHA/ACC/HFSA Guideline for the Management of Heart Failure: A Report of the American College of Cardiology/American Heart Association Joint Committee on Clinical Practice Guidelines" and are selected from the strongest recommendations (Class 1 or Class 3). In contrast, quality measures may not have as much evidence base and generally comprise metrics that might be useful for clinicians and health care organizations for quality improvement but are not yet appropriate for public reporting or pay-for-performance programs. New performance measures include optimal blood pressure control in patients with heart failure with preserved ejection fraction, the use of sodium-glucose cotransporter-2 inhibitors for patients with heart failure with reduced ejection fraction, and the use of guideline-directed medical therapy in hospitalized patients. New quality measures include the use of sodium-glucose cotransporter-2 inhibitors in patients with heart failure with mildly reduced and preserved ejection fraction, the optimization of guideline-directed medical therapy prior to intervention for chronic secondary severe mitral regurgitation, continuation of guideline-directed medical therapy for patients with heart failure with improved ejection fraction, identifying both known risks for cardiovascular disease and social determinants of health, patient-centered counseling regarding contraception and pregnancy risks for individuals with cardiomyopathy, and the need for a monoclonal protein screen to exclude light chain amyloidosis when interpreting a bone scintigraphy scan assessing for transthyretin cardiac amyloidosis.
- Published
- 2024
7. 2024 Update to the 2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American Heart Association/American College of Cardiology Joint Committee on Performance Measures
- Author
Kittleson, Michelle M, Breathett, Khadijah, Ziaeian, Boback, Aguilar, David, Blumer, Vanessa, Bozkurt, Biykem, Diekemper, Rebecca L, Dorsch, Michael P, Heidenreich, Paul A, Jurgens, Corrine Y, Khazanie, Prateeti, Koromia, George Augustine, and Van Spall, Harriette GC
- Subjects
Health Services and Systems, Health Sciences, Heart Disease, Health Services, Clinical Research, Cardiovascular, Good Health and Well Being, Humans, Heart Failure, Quality Indicators, Health Care, United States, Cardiology, American Heart Association, Treatment Outcome, Consensus, Quality Improvement, Outcome and Process Assessment, Health Care, AHA Scientific Statements, heart failure, performance measures, quality indicators, quality measures, Cardiorespiratory Medicine and Haematology, Public Health and Health Services, Cardiovascular System & Hematology, Cardiovascular medicine and haematology, Public health
- Abstract
This document describes performance measures for heart failure that are appropriate for public reporting or pay-for-performance programs and is meant to serve as a focused update of the "2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American College of Cardiology/American Heart Association Task Force on Performance Measures." The new performance measures are taken from the "2022 AHA/ACC/HFSA Guideline for the Management of Heart Failure: A Report of the American College of Cardiology/American Heart Association Joint Committee on Clinical Practice Guidelines" and are selected from the strongest recommendations (Class 1 or Class 3). In contrast, quality measures may not have as much evidence base and generally comprise metrics that might be useful for clinicians and health care organizations for quality improvement but are not yet appropriate for public reporting or pay-for-performance programs. New performance measures include optimal blood pressure control in patients with heart failure with preserved ejection fraction, the use of sodium-glucose cotransporter-2 inhibitors for patients with heart failure with reduced ejection fraction, and the use of guideline-directed medical therapy in hospitalized patients. New quality measures include the use of sodium-glucose cotransporter-2 inhibitors in patients with heart failure with mildly reduced and preserved ejection fraction, the optimization of guideline-directed medical therapy prior to intervention for chronic secondary severe mitral regurgitation, continuation of guideline-directed medical therapy for patients with heart failure with improved ejection fraction, identifying both known risks for cardiovascular disease and social determinants of health, patient-centered counseling regarding contraception and pregnancy risks for individuals with cardiomyopathy, and the need for a monoclonal protein screen to exclude light chain amyloidosis when interpreting a bone scintigraphy scan assessing for transthyretin cardiac amyloidosis.
- Published
- 2024
8. Sports Prediction for Cricket Match Using Grid Search and Extreme Gradient Boosting Classifier
- Author
Mahanta, Soumya Ranjan, Panda, Mrutyunjaya, Kacprzyk, Janusz, Series Editor, Gomide, Fernando, Advisory Editor, Kaynak, Okyay, Advisory Editor, Liu, Derong, Advisory Editor, Pedrycz, Witold, Advisory Editor, Polycarpou, Marios M., Advisory Editor, Rudas, Imre J., Advisory Editor, Wang, Jun, Advisory Editor, Kumar Udgata, Siba, editor, Sethi, Srinivas, editor, Ghinea, George, editor, and Kuanar, Sanjay Kumar, editor
- Published
- 2025
- Full Text
- View/download PDF
9. Multi-level Prediction of Financial Distress of Indian Companies Using Machine Learning
- Author
Fernandes, Reynal Lavita, Tantia, Veerta, Kacprzyk, Janusz, Series Editor, Novikov, Dmitry A., Editorial Board Member, Shi, Peng, Editorial Board Member, Cao, Jinde, Editorial Board Member, Polycarpou, Marios, Editorial Board Member, Pedrycz, Witold, Editorial Board Member, Hamdan, Allam, editor, and Braendle, Udo, editor
- Published
- 2025
- Full Text
- View/download PDF
10. Developing a multidimensional performance measurement framework for international construction joint ventures (ICJVs): the perspective of Ghana-hosted ICJVs' practitioners
- Author
Tetteh, Mershack Opoku, Chan, Albert P.C., Darko, Amos, Özorhon, Beliz, and Adinyira, Emmanuel
- Published
- 2024
- Full Text
- View/download PDF
11. COVID-19's Effect on the Technical Efficiency and Productivity of US Airlines: An Industry Sectoral Analysis
- Author
Laulederkind, Zoe and Peoples, James
- Published
- 2024
- Full Text
- View/download PDF
12. A bi-objective optimization mathematical model integrated with bulk arrival Markovian queueing system and machine learning in publishing industries.
- Author
Dashtakian-Nasrabad, Samane, Esmaeili-Qeshlaqi, Maryam, Alipour-Vaezi, Mohammad, Jolai, Fariborz, and Aghsami, Amir
- Subjects
MACHINE learning, PUBLISHING, MATHEMATICAL optimization, MATHEMATICAL models, RESEARCH personnel
- Abstract
In the publishing industry, it is vital to comprehend customers' needs and take their pleasure into account. Thus, publishing houses (PHs) must effectively address the necessity to increase profitability as well as minimize the waiting time of the printing requests in the queue. Although some researchers have studied publishing industries' management, there is still a lack of research modeling PHs as a practical queueing system that reflects their performance precisely. Moreover, most models have not considered the machine learning technique for predicting the circulation of books integrated with a queueing model and mathematical programming. Furthermore, no study has modeled the PH as an M[X]/M/1 queueing system to practically improve the system. To fill these gaps, this paper develops a novel bi-objective mathematical model with a bulk arrival queueing system and a machine learning technique to minimize the total cost and maximize the resource utilization rate. Also, different machine learning classifier algorithms are implemented to estimate the circulation for each publishing product. Owing to the model's bi-objective character, an enhanced LP-metrics method is presented. We have also examined a real-world case study to validate our approach and demonstrate its applicability to real-world issues. A 25% increase in profitability served as proof of the model's effectiveness. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
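The LP-metrics method mentioned in the abstract of result 12 combines two objectives (here, a cost to be minimized and a utilization rate to be maximized) into one scalar by measuring each objective's normalized deviation from its ideal value. The sketch below is only a generic illustration of that scalarization idea, not the authors' model; the candidate solutions, ideal/worst values, and weights are invented.

```python
def lp_metric(cost, util, cost_ideal, cost_worst, util_ideal, util_worst,
              w_cost=0.5, w_util=0.5, p=1):
    """Combine a min-cost and a max-utilization objective into one score.

    Each objective is normalized to [0, 1] as its relative deviation from the
    ideal (best) value; a lower combined score is closer to the ideal point.
    """
    dev_cost = (cost - cost_ideal) / (cost_worst - cost_ideal)
    dev_util = (util_ideal - util) / (util_ideal - util_worst)
    return (w_cost * dev_cost ** p + w_util * dev_util ** p) ** (1 / p)

# Hypothetical candidate solutions: (total cost, resource utilization rate).
candidates = [(120_000, 0.72), (135_000, 0.81), (150_000, 0.88)]
ideal_cost, worst_cost = 120_000, 150_000
ideal_util, worst_util = 0.88, 0.72
best = min(candidates,
           key=lambda s: lp_metric(s[0], s[1], ideal_cost, worst_cost,
                                   ideal_util, worst_util))
print("preferred trade-off:", best)
```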
13. Impact of stochastic vehicle generation on traffic microsimulation model output.
- Author
Hurtado-Beltran, Antonio, Pérez-Cruz, José R, and Madrigal-Arteaga, Víctor M
- Subjects
HIGHWAY capacity, TRAFFIC flow, VEHICLE models, ROADS, CALIBRATION
- Abstract
One of the most important characteristics of microsimulation is the ability to model the temporal and spatial nature of traffic demand. Every traffic microsimulation has a vehicle generation model that determines how and when the driver–vehicle units are introduced in the simulation. While microsimulation use has become increasingly popular, it is unclear how the vehicle generation options affect the final result. The purpose of this paper is to examine the stochastic component of the vehicle generation model and how it may affect the simulation output. Three scenarios, including the passenger car equivalent (PCE) model of the Highway Capacity Manual (HCM-7), were used for assessing stochastic volumes and their impacts on performance measures. Results indicate a statistically significant impact of the variability of stochastic volumes on the estimation of performance measures at the breakdown flow phase and synchronized flow phase for interrupted and uninterrupted flow conditions, respectively. This finding highlights the importance of reporting vehicle generation as a calibration parameter, enabling others to replicate the experiments. When utilizing microsimulations for the assessment of roadway/system performance, it is crucial for users to have a thorough understanding of demand generation. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
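The stochastic vehicle generation discussed in the abstract of result 13 can be pictured with a toy generator: for a target hourly volume, entry times are often drawn from an exponential headway distribution (Poisson arrivals), so the realized count in any single run deviates from the target. This is a generic sketch with an assumed demand value, not the generation models used in the paper.

```python
import random

def generate_entry_times(volume_per_hour, duration_s=3600.0, seed=None):
    """Draw stochastic vehicle entry times (seconds) using exponential headways."""
    rng = random.Random(seed)
    mean_headway = 3600.0 / volume_per_hour
    t, times = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_headway)  # next headway
        if t > duration_s:
            return times
        times.append(t)

# Target demand of 1,800 veh/h: the realized count varies run to run.
counts = [len(generate_entry_times(1800, seed=s)) for s in range(5)]
print("realized vehicles per run:", counts)
```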
14. Proposal of a Correlation Model Integrating FDRM and CLSCM Practices and Performance Measures: A Case Study from the Automotive Battery Industry in Brazil.
- Author
Marco-Ferreira, Antonio, Fidelis, Reginaldo, Fenerich, Francielle Cristina, Lima, Rafael Henrique Palma, De Andrade Junior, Pedro Paulo, and Horst, Diogo José
- Subjects
SUPPLY chain management, REVERSE logistics, BATTERY industry, POWER resources, DEMAND forecasting, SUPPLY chains, PRODUCTION planning
- Abstract
The field of closed-loop supply chain management (CLSCM) seeks to replace the linear flow of materials and energy with a cyclical model in which the outputs of the production system become inputs to the same system, thus closing the cycle of materials and energy within the supply chain. Current literature on CLSCM reports a wide variety of practices, and combining these practices with environmental performance measures is an ongoing challenge, mainly because results from these practices are often diffuse and linking them with performance results is not a straightforward task. This paper addresses this problem by proposing a model to prioritize CLSCM practices and performance measures. The correlation model integrating the fuzzy direct rating method (FDRM) and CLSCM practices and performance measures was tested in a real company that is part of a closed-loop supply chain that recycles lead obtained from automotive batteries in Brazil. The results allowed the identification of which management practices are more relevant to the organization by correlating their impact with performance measures. The most relevant practices for the company under study were demand forecasting, with 21.68% of relative importance, followed by reverse logistics practices (21.15%) and production planning and control (18.16%). Another relevant finding is that upstream performance measures account for 77.72% of the company's CLSCM performance. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
15. Business Sustainability and Its Effect on Performance Measures: A Comprehensive Analysis.
- Author
Pérez Estébanez, Raquel and Sevillano Martín, Francisco Javier
- Abstract
In recent years, businesses have faced growing pressures from stakeholders, including investors, customers, and regulators, to adopt sustainable practices. These pressures stem from the global focus on environmental, social, and governance (ESG) criteria and their association with risk management and corporate resilience. As a result, understanding the connection between sustainability and performance indicators, such as return on equity (ROE) and return on assets (ROA), is crucial to determine whether sustainable practices positively influence financial outcomes or primarily serve to address external expectations. This study seeks to bridge the gap between theoretical frameworks and empirical evidence by employing a rigorous methodological approach—Structural Equation Modeling (SEM)—to assess the impact of sustainability practices on key performance measures. The inclusion of a diverse range of industries from the US and Europe enhances the relevance of the findings, as it facilitates their generalization across developed economies where sustainability initiatives are highly prioritized. Our results are consistent with prior research demonstrating a positive relationship between sustainability and financial performance, particularly in high-development contexts over a medium-term period. These findings carry important implications for managers and policymakers, emphasizing that sustainability is not a compromise but a catalyst for economic and financial profitability. This study contributes to the literature by illustrating how sustainability can simultaneously advance ethical objectives and enhance financial performance, establishing it as a critical area of focus for both academics and practitioners. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
16. Measuring the production performance indicators for metal-mechanic industry: an LDA modeling approach.
- Author
Restrepo, Jorge Aníbal, Giraldo, Emerson Andres, and Vanegas, Juan Gabriel
- Subjects
MONTE Carlo method, BUSINESS losses, ANALYSIS of variance, OPERATIONAL risk, PRICE indexes
- Abstract
Purpose: This study proposes a novel method to improve the accuracy of overall equipment effectiveness (OEE) estimation in the metallurgical industry. This is achieved by modeling the frequency and severity of stoppage events as random variables. Design/methodology/approach: An analysis of 80,000 datasets from a metal-mechanical firm (2020–2022) was performed using the loss distribution approach (LDA) and Monte Carlo simulation (MCS). The data were further adjusted with a product price index to account for inflation. Findings: The variance analysis revealed supporting colleagues (59.8% of variance contribution), food breaks (29.8%) and refreshments (9.0%) as the events with the strongest influence on operating losses. Research limitations/implications: This study provides a more rigorous approach to operational risk management and OEE measurement in the metal-mechanical sector. The developed algorithm supports the establishment of risk management guidelines and facilitates targeted OEE improvement efforts. Originality/value: This research introduces a novel OEE estimation method specifically for the metallurgical industry, utilizing LDA and MCS to improve accuracy compared to existing techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
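The loss distribution approach with Monte Carlo simulation described in result 16 typically models the number of stoppage events per period with a frequency distribution and each event's loss with a severity distribution, then simulates the aggregate loss. The snippet below is a minimal generic sketch with invented Poisson/lognormal parameters, not the firm's fitted distributions.

```python
import numpy as np

def simulate_aggregate_loss(freq_mean=12.0, sev_mu=2.0, sev_sigma=0.8,
                            n_runs=10_000, seed=42):
    """Monte Carlo aggregate loss: Poisson event counts, lognormal severities."""
    rng = np.random.default_rng(seed)
    totals = np.empty(n_runs)
    for i in range(n_runs):
        n_events = rng.poisson(freq_mean)                 # stoppages this period
        severities = rng.lognormal(sev_mu, sev_sigma, size=n_events)  # loss per event
        totals[i] = severities.sum()
    return totals

losses = simulate_aggregate_loss()
print(f"mean loss: {losses.mean():.1f}, 95th percentile: {np.percentile(losses, 95):.1f}")
```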
17. Key performance indicators of hospital supply chain: a systematic review.
- Author
Fallahnezhad, Meysam, Langarizadeh, Mostafa, and Vahabzadeh, Afshin
- Abstract
Background: Performance measurement is vital for hospitals to become service-oriented, operate efficiently, attract customers, increase revenue, and improve both clinical and non-clinical outcomes, enabling them to succeed in the competitive healthcare sector. Key Performance Indicators (KPIs) play a crucial role in monitoring, assessing, and enhancing care quality and service delivery. However, identifying suitable KPIs for performance measurement can be challenging for hospitals due to a lack of comprehensive sources. Although many studies have explored KPIs, few have specifically addressed performance indicators within the hospital supply chain. Objectives: This systematic review seeks to identify and categorize the current knowledge and evidence concerning KPIs for the hospital supply chain. Methods: Seven bibliographic databases (PubMed, Scopus, Science Direct, Web of Science, Embase, ProQuest, and IEEE Xplore) were utilized in this research. The initial search identified 3661 articles; following a review of the titles, abstracts, and full texts, 32 articles were selected. Additionally, backward reference list checks were performed on the selected studies. Relevant studies were included based on the objectives, and data extraction was conducted using a form created in Word 2016. Results: A total of 64 KPIs for the hospital supply chain were identified. The performance indicators were categorized into financial (n = 37), managerial (n = 15), and clinical (n = 12) categories. Conclusions: This comprehensive review successfully identified 64 KPIs, highlighting their potential to advance clinical practice and enhance patient care in hospitals. Further research is essential to establish a standardized methodology for KPI development within the hospital supply chain. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Redefined Formula for Anterior Chamber Volume Calculation: Quantitative Analysis of Biometric Parameters Across Ocular Pathologies.
- Author
Zemitis, Arturs, Rizzuto, Vincenzo, Lavrinovica, Diana, Vanags, Juris, and Laganovska, Guna
- Subjects
AQUEOUS humor, INTRACLASS correlation, EXFOLIATION syndrome, VOLUME measurements, DEVELOPING countries
- Abstract
Purpose: This study evaluates the discrepancies between ACV measurements obtained from the Heidelberg Anterion and Zeiss IOLMaster 700 and investigates the significance of ACV and other ocular biometry parameters. Patients and Methods: To investigate intraocular fluid circulation, a robust formula was developed for ACV measurement using the Zeiss IOLMaster 700. A pilot study was conducted to validate this formula, which relied on WTW, CCT, and ACD. The formula used was ACV = (RAC)^2 × (CCD) × 1.51. ACV measurements showed a median of 155.38 (IQR = 131.15– 180.06) for the Heidelberg Anterion and 144.11 mm³ (IQR = 125.62– 159.81) for the Zeiss IOLMaster 700. The intraclass correlation coefficient (ICC) for ACV was 0.908, indicating excellent agreement between devices. Results: Intraocular fluid volume was significantly lower in eyes with PEXS compared to those without. Eyes with PEX had an ACV of 133 ± 28.3 mm³ versus 142 ± 30.7 mm³ in non-PEX eyes, a statistically significant difference (t (196) = − 2.09, p = 0.038, d = − 0.301). Significant differences were also observed in ACD and AL between PEX and non-PEX eyes, with PEX eyes showing reduced measurements. Conclusion: Our findings reveal that age-related changes in ACD and ACV are significant, with the redefined formula showing excellent agreement with AS-OCT methods. Eyes with PEX exhibit reduced ACD, ACV, and AL measurements. Additionally, an accessible method for ACV measurement, not relying on Pentacam or AS-OCT, would be valuable, particularly in developing countries, to facilitate broader clinical research. Plain Language Summary: Our study examines the differences in measuring aqueous humor volume in the eye using two devices, the Heidelberg Anterion and the Zeiss IOLMaster 700. Aqueous humor is the clear fluid in the front part of the eye, which circulates at a rate of 1.0% to 1.5% per minute. We refined a formula to calculate the volume of this fluid using the Zeiss IOLMaster 700, based on specific eye measurements. We found that the volume measurements from both devices were very similar, showing a strong agreement. The study also discovered that eyes with pseudoexfoliation syndrome (PEXS), a condition affecting the eye, had significantly lower fluid volume compared to eyes without the condition. This was also true for other eye measurements like anterior chamber depth (ACD) and axial length (AL). These findings are important because they show that simpler and more accessible methods for measuring fluid volume in the eye can be used effectively, especially in developing countries where advanced equipment may not be available. The study highlights the changes in eye measurements with age and how the new formula aligns well with existing methods. Overall, eyes with PEX show reduced measurements in various eye parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
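Taking the anterior chamber volume formula quoted in the abstract of result 18 at face value (ACV = RAC² × CCD × 1.51), a direct computation looks like the sketch below. The variable meanings and the sample biometry values are assumptions for illustration only; the paper should be consulted for the exact definitions of RAC and CCD.

```python
def anterior_chamber_volume(rac_mm: float, ccd_mm: float) -> float:
    """ACV in cubic millimetres, per the formula quoted in the abstract:
    ACV = (RAC)^2 * (CCD) * 1.51."""
    return rac_mm ** 2 * ccd_mm * 1.51

# Hypothetical inputs chosen so the result lands in the reported 125-180 mm^3
# range; they are not taken from the study.
print(f"ACV = {anterior_chamber_volume(rac_mm=5.9, ccd_mm=2.8):.1f} mm^3")
```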
19. Evaluation of Radial Basis Function Network and Supervised Machine Learning Methods on Brain Stroke Prediction Datasets.
- Author
AKBAŞ, Kübra Elif and DAĞOĞLU HARK, Betül
- Abstract
Objective: Supervised machine learning algorithms and neural networks are widely used classification methods in data mining. In this study, RBFN, one of the widely used supervised machine learning (SML) algorithms and neural network methods, was used according to the factors affecting the diagnosis of cerebral palsy, and it was aimed to evaluate their classification performance. Material and Method: The dataset is an open source dataset, and there are a total of 4981 people with and without stroke. This dataset is modeled with RBFN from neural networks with four algorithms commonly used in supervised machine learning decision tree (DT), random forest (RF), and K-nearest neighbor (K-NN) and support machine vector (SVM). Their performance was evaluated according to performance criteria. Results: The algorithms with the highest performance according to the accuracy criteria are DT (0.954), SVM (0.954), RBFN (0.954) and RF (0.953), respectively. The K-NN algorithm was found to be higher than other methods in terms of precision (0.061) and sensitivity (0.080). Conclusion: The performances of DT, RF, SVM and RBFN methods were found to be close to each other in terms of accuracy criteria. In the decision-making process, the correct classification performance of these four methods is higher than K-NN. [ABSTRACT FROM AUTHOR]
- Published
- 2024
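The side-by-side accuracy comparison of supervised classifiers described in result 19 follows a common pattern; the sketch below reproduces that pattern with scikit-learn on synthetic, imbalanced data (the RBFN is not a standard scikit-learn estimator and is omitted). It is a generic illustration, not the authors' pipeline or dataset.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for a stroke dataset (the real one has ~4,981 records).
X, y = make_classification(n_samples=1000, n_features=10, weights=[0.95],
                           random_state=0)

models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "K-NN": KNeighborsClassifier(),
    "SVM": SVC(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```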
20. Manufacturing bio-based fiber-reinforced polymer composites: process performance in RTM and VARI processes.
- Author
Kirschnick, Ulrike, Feuchter, Michael, Ravindran, Bharath, Salzmann, Moritz, Duretek, Ivica, Fauster, Ewald, and Schledjewski, Ralf
- Subjects
FIBER-reinforced plastics, COMPOSITE plates, COMPOSITE material manufacturing, MANUFACTURING processes, FIBROUS composites, TRANSFER molding
- Abstract
The utilization of bio-based materials for the manufacturing of fiber-reinforced polymer composites is gaining importance under the sustainability paradigm. The identification of suitable process parameters and limited process reproducibility remain among the major challenges to enhance the industrial application potential of bio-based composites. This is especially relevant, as the manufacturing process influences composite quality, economic performance and environmental impacts. This study compares Resin Transfer Molding and Vacuum Assisted Resin Infusion for two sets of process parameters in order to manufacture a composite plate consisting of a flax-fiber textile impregnated with a partially bio-based epoxy matrix. Process quality is described through statistical analysis of processing and composite properties, and performance in terms of process replicability and reliability using performance estimates. Processing parameters were selected to depict a range of manufacturing scenarios that were suitable for the selected bio-based material system from curing for 180 min at 60 °C to curing for 30 min at 100 °C. For an identical set of process conditions, Resin Transfer Molding outperforms Vacuum Assisted Resin Infusion in terms of tensile and flexural characteristics. Conversely, the latter shows the strongest fiber-matrix adhesion and the most homogeneous impregnation. Whereas manufacturing at lower temperature leads to positive effects on composite quality, higher processing temperature with shorter curing cycles achieve highest process performance in terms of Pp and Ppk indices. An additional annealing at 120 °C neither increases composite quality nor reduces manufacturing-induced variability. Results depend on processing differences and indicators to determine process performance, as well as methodological choices. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
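The process performance indices Pp and Ppk used in result 20 to compare the two manufacturing routes are computed from the specification limits and the overall mean and standard deviation of a quality characteristic. The following is a generic sketch with invented specification limits and sample data, not values from the study.

```python
import statistics

def pp_ppk(samples, lsl, usl):
    """Overall process performance indices from measured samples."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)            # overall (long-term) std dev
    pp = (usl - lsl) / (6 * sigma)
    ppk = min(usl - mu, mu - lsl) / (3 * sigma)
    return pp, ppk

# Hypothetical tensile-strength measurements (MPa) and specification limits.
strength = [248, 252, 250, 247, 253, 251, 249, 250, 252, 248]
print("Pp = %.2f, Ppk = %.2f" % pp_ppk(strength, lsl=240, usl=260))
```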
21. Impacts of Inventory Management Factors on Inventory Performance Measures: The Sri Lankan Wholesale Sector
- Author
L. P. Hasindu Shanilka and C. A. Kavirathna
- Subjects
factors of inventory management, performance measures, wholesale sector, pls-sem, Transportation and communications, HE1-9990
- Abstract
Inventory management is a crucial component of sustainability, enabling companies to reduce expenses, enhance cash flow, and increase profitability. Inventory makes up a majority of current assets in the wholesale sector, where inefficiencies are bound to happen, given the large-scale inventory. This research focuses on identifying the factors and performance measures of inventory management and evaluating their interrelationships focusing on the Sri Lankan wholesale sector. The factors identified through a systematic review and industry experts’ interviews were then categorized into key areas with a thematic analysis. The 22 factors identified were grouped into the categories of organization, facilities and equipment, and processes and practices. The 24 performance measures were grouped into operational, customer satisfaction, and environmental categories. A questionnaire survey was conducted with 126 managerial-level employees of wholesale organisations. The partial least squares structural equation modelling (PLS-SEM) method is used to validate the relationship between the variables and the moderation effect of firm size on these relationships. The findings reflect that factors of organisation and facilities and equipment, influence the performance measures. Organisational factors were the most significant in influencing all three performance categories. For most relationships, the firm size did not have a moderation effect. This research assists wholesale businesses in achieving excellence and a competitive edge in the Sri Lankan market while finding answers to the ongoing business issues in the wholesale sector.
- Published
- 2024
- Full Text
- View/download PDF
22. Comparison of Vestibular/Ocular Motor Screening (VOMS) and Computerized Eye-tracking to Identify Exposure to Repetitive Head Impacts.
- Author
Kontos, Anthony P, Zynda, Aaron J, and Minerbi, Amir
- Subjects
INSTITUTIONAL review boards, RECEIVER operating characteristic curves, MILITARY medical personnel, MILITARY personnel, HEAD injuries
- Abstract
Introduction Military service members (SMs) are exposed to repetitive head impacts (RHIs) in combat and training that are purported to adversely affect brain health, including cognition, behavior, and function. Researchers have reported that RHI from blast-related exposure may affect both vestibular and ocular function, which in turn may be related to symptomology. As such, an examination of the effects of RHI on exposed military SMs should incorporate these domains. To date, researchers have not compared groups of exposed special operations forces (SOF) operators on combined clinical vestibular/ocular and eye-tracker-based outcomes. Therefore, the primary purpose of this study was to compare participant-reported symptoms and performance on the Vestibular/Ocular Motor Screening (VOMS) tool with performance on the computerized RightEye tracking system between SOF operators exposed to blast-related RHI and healthy controls without blast-related exposure. In addition, the study aimed to compare subgroups of snipers and breachers exposed to RHI to controls on the preceding metrics, as well as identify a subset of individual (demographic) factors, participant-reported symptoms, and performance metrics on VOMS and RightEye that best identify SOF operators exposed to RHI from unexposed controls. Materials and Methods The study involved a cross-sectional design including 25 Canadian SOF SMs comprised of breachers (n = 9), snipers (n = 9), and healthy, unexposed controls (n = 7). The former 2 groups were combined into an RHI group (n = 18) and compared to controls (n = 7). Participants provided demographics and completed a self-reported concussion-related symptom report via the Military Acute Concussion Evaluation 2, the VOMS, and RightEye computerized eye-tracking assessments. Independent samples t -tests and ANOVAs were used to compare the groups on the outcomes, with receiver operating characteristic curve and area under the curve (AUC) analyses to identify predictors of blast exposure. This study was approved by the Defence Research Development Canada Human Research Ethics Committee and the Canadian Forces Surgeon General/Special Forces Command. Results The results from t -tests supported group differences for age (P = .012), participant-reported symptoms (P = .006), and all VOMS items (P range = <.001-.02), with the RHI group being higher than healthy controls on all variables. ANOVA results supported group differences among snipers, breachers, and controls for age (P = .01), RightEye saccades (P = .04), participant-reported total symptom severity (P = .03), and VOMS total scores (P = .003). The results of the receiver operating characteristic curve analyses supported age (AUC = 0.81), Military Acute Concussion Evaluation 2 participant-reported total symptom severity (AUC = 0.87), and VOMS total scores (AUC = 0.92) as significant predictors of prior blast exposure. Conclusions Participant-reported concussion symptoms, VOMS scores, and age were useful in identifying SOF operators exposed to RHI from controls. RightEye metrics were not useful in differentiating RHI groups from controls. Differences between snipers and breachers warrant further research. Overall, the findings suggest that VOMS may be a useful tool for screening for the effects of exposure to RHI in SOF operators. Future investigations should be conducted on a larger sample of military SMs, consider additional factors (e.g. RHI exposure levels, medical history, and sex), and include additional assessment domains (e.g. 
balance, cognitive, and psychological). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Analysis of M/M/1 queueing systems with negative customers and unreliable repairers.
- Author
Tian, Ruiling and Zhang, Yao
- Subjects
NASH equilibrium, GENERATING functions, CONSUMERS, PROBABILITY theory
- Abstract
In this article, we study the M/M/1 queueing systems with negative customers and unreliable repairers. The arrival of a negative customer causes the failure of the server, and the server is repaired immediately. The repairers may successfully repair the server with probability p, and the server continues to serve customers in the system. Otherwise, once the repair fails, the system will be cleared and all customers are forced to leave the system. By analyzing the quasi-birth-and-death process and using the probability generating function method, we obtain the steady-state probability and some performance measures of the system. Then, based on the reward-cost structure, we discuss the strategic behavior of customers and consider the equilibrium strategy. Finally, the effects of system parameters on the performance measures, individual benefit, and social benefit are analyzed by numerical examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
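As a point of reference for the performance measures discussed in result 23, the classical M/M/1 queue (without negative customers or server failures) has closed-form steady-state measures driven by the traffic intensity ρ = λ/μ. The sketch below computes only those baseline quantities; the paper's model with negative customers and unreliable repairers requires the quasi-birth-and-death analysis it describes.

```python
def mm1_measures(lam: float, mu: float) -> dict:
    """Steady-state measures of a plain M/M/1 queue (requires lam < mu)."""
    rho = lam / mu                                   # traffic intensity / utilization
    return {
        "utilization": rho,
        "mean_number_in_system": rho / (1 - rho),    # L
        "mean_time_in_system": 1 / (mu - lam),       # W (Little's law: L = lam * W)
        "prob_empty_system": 1 - rho,                # P0
    }

print(mm1_measures(lam=4.0, mu=5.0))
```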
24. Enhancement and characteristics study of parabolic trough solar collector by using magnesium oxide coating on solar tubes.
- Author
Selvam, Lokesh, Hossain, Ismail, Aruna, M., Venkatesh, R., Karthigairajan, M., Prabagaran, S., Mohanavel, V., Seikh, Asiful H, and Kalam, M. A.
- Subjects
PARABOLIC troughs, SOLAR collectors, SOLAR energy, HEAT radiation & absorption, OXIDE coating, HEAT transfer fluids, SOLAR radiation
- Abstract
In the modern era, various engineering applications utilize renewable solar energy, and recent prospects aim to enhance solar thermal collector efficiency through nanotechnology found to enhance solar performance. While using the parabolic trough collector, it found excellent solar conversion efficiency and attained the maximum temperature of the working fluid. Besides the intermittency due to weather conditions, the output performance will be reduced. This study aims to enhance the performance of parabolic trough solar collector by implementing magnesium oxide (MgO) coating over the tubes as 30, 20, and 10 µm particles blended with industrial black matt paint to prepare MgO-enhanced coating through the spray pyrolysis process for varying the nanoparticle size with constant thickness coating in the thermal performance of parabolic trough solar collector (PTC). The findings of this research demonstrate that particles with coating material significantly affect the thermal performance of PTC compared with non-coating. The 10 µm MgO coating featured solar collector exploited maximum heat transfer fluid temperature (81.2 °C), increased heat absorption behaviour (662.5 W), optimum thermal and exergy efficiency values of 78.9 and 69.5%, respectively, which is the optimum value rather than all others. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Analysis of the Mt/M/1 Queueing System with Impatient Customers and Single Vacation.
- Author
Yousefi, Ali, Pourtaheri, Reza, and Rad, Mohammad Reza Salehi
- Abstract
We consider an Mt/M/1 queueing system with impatient customers and a single vacation, assuming the customers' impatience is due to the server's vacation. In the context of non-stationary sinusoidal modeling, this paper introduces systems with exponential service times and periodic (sinusoidal) Poisson arrival processes. We studied a novel analysis of an Mt/M/1 model including simultaneous vacations and impatient customers alongside the relative amplitude changes. In addition, the pointwise stationary approximation has been computed by integrating over time the formula for the stationary performance measure with the arrival rate that applies at each point in time. The time-dependent probability generating functions and the corresponding steady-state results have been obtained explicitly. We focus on five performance measures: the expected number of customers waiting in the queue during vacation, the expected customer waiting time in the queue during vacation, the probability of the server being busy, the probability of the server being on vacation and the probability of customers' impatience. Finally, to evaluate the performance measure of queue length, we have conducted a sensitivity analysis by running a simulation for a specific set of parameters. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
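The pointwise stationary approximation mentioned in result 25 plugs the instantaneous arrival rate λ(t) into a stationary performance formula and averages the result over time. The sketch below applies that idea to the mean queue length of a plain M/M/1 with a sinusoidal arrival rate; it illustrates the approximation technique only, with made-up rates, and is not the paper's vacation-and-impatience model.

```python
import math

def lambda_t(t, lam_bar=4.0, amplitude=0.5, period=24.0):
    """Sinusoidal (periodic Poisson) arrival rate with relative amplitude 0.5."""
    return lam_bar * (1 + amplitude * math.sin(2 * math.pi * t / period))

def stationary_lq(lam, mu):
    """Stationary mean queue length of an M/M/1 (valid while lam < mu)."""
    rho = lam / mu
    return rho ** 2 / (1 - rho)

def psa_mean_queue(mu=8.0, period=24.0, n_steps=2400):
    """Pointwise stationary approximation: average the stationary formula
    evaluated at the arrival rate in effect at each instant."""
    dt = period / n_steps
    return sum(stationary_lq(lambda_t(i * dt), mu) for i in range(n_steps)) / n_steps

print(f"PSA mean queue length = {psa_mean_queue():.3f}")
```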
26. A Research Review and Perspective Toward Plant Leaf Disease Detection Using Image Processing Techniques.
- Author
Kindalkar, Amrita Arjun
- Subjects
MACHINE learning, LEAF color, TIME complexity, EVIDENCE gaps, DEEP learning
- Abstract
Plant Leaf Disease (PLD) detection is helpful for several fields like Agriculture Institute and Biological Research. The country’s economic growth depends on the productivity of the agricultural field. Recently developed models based on deep learning give more accurate and precise results over the detection and classification of PLD while evolving through image processing approaches. Many image-processing approaches are used for the identification and classification of PLD. The quality of agricultural products is mainly affected by several factors like fungi, bacteria, and viruses. These factors severely destroy the entire growth of the plant. Hence, some outperformed models are needed to detect and identify the severity level of plant diseases yet, the identification requires more time and has a struggle to identify the appropriate type of disease based on its symptoms. Therefore, several automatic detection and classification models are developed to avoid the time complexity. Computerized image processing approaches are utilized for crop protection, which analyzes the color information of leaves from the collected images. Hence, image processing techniques play an important role in the identification and classification of PLD. It gives more advantages by lowering the task of illustrating crops on large farms and detecting the leaf diseases at the initial stage itself based on the symptoms of the plant leaves. While implementing a new model, there is a need to study various machine and deep learning-based structures for PLD detection approaches. This research work provides an overview of various heuristic approaches, machine learning, and deep learning models for the detection and classification of PLD. This research work also covers the various constraints like PLD detection tools, performance measures, datasets used, and chronological review. Finally, the research work explores the research findings and also the research gaps with future scope. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. A Bayesian approach to determine sample size in Erlang single server queues.
- Author
Singh, Laxmipriya, Gomes, E. S., and Cruz, F. R. B.
- Subjects
INFERENTIAL statistics, BAYESIAN field theory, SAMPLE size (Statistics), NUMBER systems, CONSUMERS
- Abstract
An essential aspect of the practical application of Erlang single-server queues is the statistical inference of their parameters, notably traffic intensity. This metric is crucial as it serves as a fundamental performance indicator, allowing the derivation of other significant measures, including the mean queue size and the expected number of customers in the system. Additionally, it provides the proportion of time the queue system is occupied. This article explores algorithms for calculating sample sizes for these estimations, based on the number of customers who arrived during service periods, a highly intuitive approach, allowing data collection without concerns about correlations among data points. For this purpose, we have considered two forms of informative priors: Gauss hypergeometric and beta priors. Additionally, a non-conjugate and objective prior, known as Jeffreys prior, has been taken into account. We present tables with sample sizes for specific configurations and abacuses for more general configurations obtainable through approximate interpolations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
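The estimation setting in result 27, inferring the traffic intensity ρ of a single-server queue from the number of customers arriving during each service period, can be pictured with a small grid-based posterior. The sketch below assumes the standard exponential-service (M/M/1) result that the count of arrivals during one service time is geometric with success probability 1/(1+ρ); the Beta(2, 2) prior and the data are invented, and the paper itself works with Erlang service and Gauss hypergeometric, beta, and Jeffreys priors, so this is only an analogy to the inference idea.

```python
import numpy as np

def rho_posterior(counts, grid=np.linspace(0.01, 0.99, 99), a=2.0, b=2.0):
    """Grid posterior for traffic intensity rho from per-service arrival counts.

    Assumes exponential service, so each count n has
    P(n | rho) = (1 / (1 + rho)) * (rho / (1 + rho)) ** n,
    with a Beta(a, b) prior on rho."""
    counts = np.asarray(counts)
    log_like = np.array([np.sum(counts * np.log(r / (1 + r)) - np.log(1 + r))
                         for r in grid])
    log_prior = (a - 1) * np.log(grid) + (b - 1) * np.log(1 - grid)
    log_post = log_like + log_prior
    post = np.exp(log_post - log_post.max())
    return grid, post / post.sum()

# Hypothetical observed arrivals during 30 consecutive service periods.
obs = [0, 1, 0, 2, 1, 0, 0, 1, 3, 0, 1, 0, 0, 2, 1, 0, 1, 0, 0, 1,
       2, 0, 1, 0, 0, 1, 0, 2, 0, 1]
grid, post = rho_posterior(obs)
print(f"posterior mean of rho = {float(np.dot(grid, post)):.3f}")
```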
28. Transient and Steady-State Analysis of an M/PH2/1 Queue with Catastrophes.
- Author
Liu, Youxin, Liu, Liwei, Jiang, Tao, and Chai, Xudong
- Subjects
POISSON processes, BESSEL functions, CATASTROPHE modeling, TRANSIENT analysis, CONSUMERS
- Abstract
In the paper, we consider the PH2 distribution, which is a particular case of the PH distribution. In other words, the first service phase is exponentially distributed with service rate μ. After the first service phase, the customer can go away with probability p or continue the service with probability (1 − p) and service rate μ′. We study an M/PH2/1 queue model with catastrophes, which is regarded as a generalization of the M/M/1 queue model with catastrophes. Whenever a catastrophe happens, all customers are cleared immediately and the queueing system is empty. Customers arrive at the queueing system according to a Poisson process, and the total service duration has two phases. Transient probabilities and steady-state probabilities of this queueing system are obtained using the modified Bessel function of the first kind, the Laplace transform, and probability-generating function techniques. Moreover, some important performance measures of the system are derived. Finally, numerical illustrations are used to discuss the system's behavior, and conclusions and future directions for the model are given. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
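The two-phase service distribution described in result 28 (an exponential first phase with rate μ, then with probability 1 − p a second exponential phase with rate μ′) is easy to sample directly. The sketch below is a generic sampler with made-up rates, useful only for simulating or sanity-checking the service-time distribution, not the paper's transient analysis.

```python
import random

def ph2_service_time(mu: float, mu_prime: float, p: float, rng: random.Random) -> float:
    """Sample one PH2 service time: Exp(mu) phase, then with probability
    (1 - p) an additional Exp(mu_prime) phase."""
    t = rng.expovariate(mu)
    if rng.random() >= p:              # customer continues to the second phase
        t += rng.expovariate(mu_prime)
    return t

rng = random.Random(7)
samples = [ph2_service_time(mu=2.0, mu_prime=3.0, p=0.4, rng=rng) for _ in range(100_000)]
# The mean should approach 1/mu + (1 - p)/mu_prime = 0.5 + 0.2 = 0.7.
print(f"empirical mean service time = {sum(samples) / len(samples):.3f}")
```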
29. Proper choice of location cumulative sum control charts for different environments.
- Author
Nazir, Hafiz Zafar, Ikram, Muhammad, Amir, Muhammad Wasim, Akhtar, Noureen, Abbas, Zameer, and Riaz, Muhammad
- Subjects
STATISTICAL process control, MANUFACTURING processes, QUALITY control, QUALITY control charts, CUSUM technique, SEMICONDUCTORS
- Abstract
In the field of quality control, statistical process control (SPC) has its significance. The control charts are important tools of the SPC to observe the outputs of the production process. The cumulative sum (CUSUM) charting structure is one such technique that is designed to identify the medium and slight changes in the process parameters. In the literature, the assumption of normality is considered ideal for the control chart but in practice, many quality characteristics do not follow the assumption of normality. The current study proposes a class of CUSUM control charts for the location parameter of the process and investigates the performance of said location charts based on ranked set sampling (RSS) for quality characteristics having different environments. Different point estimators of location are considered in this study to develop the location control charts under normal, and a variety of non‐normal environments. The numerical results show that the newly designed schemes perform uniformly well than their competitors. A real‐life example linked with the manufacturing process is also provided for the practical implementation of the proposed scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
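A location CUSUM chart of the kind discussed in result 29 accumulates deviations of each location estimate from the target, resetting at zero, and signals when the cumulative sum crosses a decision interval. The sketch below implements the standard tabular two-sided CUSUM with generic reference value k and decision limit h; the study's ranked-set-sampling estimators and design constants are not reproduced here.

```python
def tabular_cusum(samples, target, k=0.5, h=5.0):
    """Two-sided tabular CUSUM for the process location.

    k is the reference (allowance) value and h the decision interval, both in
    the same units as the samples (commonly k = 0.5*sigma and h = 4-5*sigma)."""
    c_plus = c_minus = 0.0
    signals = []
    for i, x in enumerate(samples):
        c_plus = max(0.0, c_plus + (x - target) - k)
        c_minus = max(0.0, c_minus + (target - x) - k)
        if c_plus > h or c_minus > h:
            signals.append(i)
    return signals

# In-control readings followed by a small upward shift in the location.
data = [10.1, 9.8, 10.0, 10.2, 9.9, 10.0, 10.9, 11.1, 10.8, 11.2, 11.0, 10.9]
print("out-of-control signals at indices:", tabular_cusum(data, target=10.0, k=0.25, h=2.0))
```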
30. 2024 Update to the 2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American Heart Association/American College of Cardiology Joint Committee on Performance Measures.
- Author
Kittleson, Michelle M., Breathett, Khadijah, Ziaeian, Boback, Aguilar, David, Blumer, Vanessa, Bozkurt, Biykem, Diekemper, Rebecca L., Dorsch, Michael P., Heidenreich, Paul A., Jurgens, Corrine Y., Khazanie, Prateeti, Koromia, George Augustine, and Van Spall, Harriette G.C.
- Subjects
MEDICAL quality control, CARDIAC amyloidosis, HEART failure patients, HEART failure, MITRAL valve insufficiency
- Abstract
This document describes performance measures for heart failure that are appropriate for public reporting or pay-for-performance programs and is meant to serve as a focused update of the "2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American College of Cardiology/American Heart Association Task Force on Performance Measures." The new performance measures are taken from the "2022 AHA/ACC/HFSA Guideline for the Management of Heart Failure: A Report of the American College of Cardiology/American Heart Association Joint Committee on Clinical Practice Guidelines" and are selected from the strongest recommendations (Class 1 or Class 3). In contrast, quality measures may not have as much evidence base and generally comprise metrics that might be useful for clinicians and health care organizations for quality improvement but are not yet appropriate for public reporting or pay-for-performance programs. New performance measures include optimal blood pressure control in patients with heart failure with preserved ejection fraction, the use of sodium-glucose cotransporter-2 inhibitors for patients with heart failure with reduced ejection fraction, and the use of guideline-directed medical therapy in hospitalized patients. New quality measures include the use of sodium-glucose cotransporter-2 inhibitors in patients with heart failure with mildly reduced and preserved ejection fraction, the optimization of guideline-directed medical therapy prior to intervention for chronic secondary severe mitral regurgitation, continuation of guideline-directed medical therapy for patients with heart failure with improved ejection fraction, identifying both known risks for cardiovascular disease and social determinants of health, patient-centered counseling regarding contraception and pregnancy risks for individuals with cardiomyopathy, and the need for a monoclonal protein screen to exclude light chain amyloidosis when interpreting a bone scintigraphy scan assessing for transthyretin cardiac amyloidosis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. Performance Optimization and Injury Mitigation for Air Force Student Fighter Pilots.
- Author
Chayrez, Stephanie E, Acevedo, Anthony, Blake, Jared, Parrott, Christopher, Gerking, Timothy, Guthmann, Deborah, Jilek, Michelle, Dorcheus, Joshua, Zeigler, Zachary, Copeland, Clint, Gill, Haley, Smietana, Andrew, Price-Moore, Carolyn, Nores, Brittaney, and Scott, Ryan M
- Subjects
HEALTH care teams, STIFF-person syndrome, FIGHTER pilots, AIR bases, NECK pain
- Abstract
Introduction Military fighter aircrew report high rates of cervical pain and injury. There is currently no consensus regarding the best training methods for this population. Eglin Air Force Base (AFB) and Luke AFB have multidisciplinary teams specializing in aircrew training, performance, and injury mitigation. All student pilots (SPs) completing Basic Course training at these locations engage in an 8-week Spine Training Program (STP). The STP originated at Luke AFB in 2020 and was expanded to Eglin AFB in 2022. The primary aim of this study was to assess whether the STP led to significant changes in the performance measure studied, Cervical Endurance Hold (CEH). Further, this study aimed to determine if the CEH training effect was independent of location of STP administration. We hypothesized that SPs would exhibit statistically significant CEH training adaptations irrespective of base location. Materials and Methods Air Force F-16 and F-35 SPs from Luke AFB and Eglin AFB were actively enrolled in the Basic Course and participated in the standardized STP from 2020 to 2023. The CEH test was administered prior to (intake) and following (exit) the 8-week STP. SPSS for Windows version 29 software (IBM, Armonk, NY) was used to retrospectively analyze the data from this study. Participants were excluded if they were unable to perform the CEH test at intake or exit. The study was approved by the Air Force Research Laboratory Institutional Review Board and was performed in accordance with the ethical standards of the Declaration of Helsinki. Results One hundred and ninety-eight SPs (Luke AFB, males n = 170, females n = 12; Eglin AFB, males n = 16) completed the STP program. There was no significant difference between intake and exit concerning age, height, weight, % body fat, and fat-free mass at Luke AFB or Eglin AFB (P < 0.05). Statistically significant improvements in CEH were observed within all groups from intake to exit (P < 0.001). When considering all participants collectively, there was a notable 33.6% increase in CEH from intake to exit (P < 0.001) with an overall effect size of d = 1.14. When analyzing specific subgroups, females from Luke AFB experienced a significant 20.4% increase in CEH (P < 0.001, d = 1.14), males from Luke AFB exhibited a significant 34.5% increase (P < 0.001, d = 1.09), and males from Eglin AFB demonstrated a significant increase of 55.7% in CEH (P < 0.001, d = 1.97). Conclusions This retrospective analysis showed significant improvements in the CEH across all groups following the completion of the STP. Furthermore, CEH results from both bases exhibited a large effect size indicating a meaningful change was found between intake and exit regardless of training location. These preliminary study results should be interpreted with caution as a control group was unable to be established. In the future, a randomized control trial should be performed to test the STP used in this study against other STP programs. This may better inform experts on the best spine training methods for fighter aircrew. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
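The training-effect statistics reported in result 31 (percent change from intake to exit and Cohen's d) follow standard definitions; the sketch below computes both for hypothetical intake and exit cervical endurance hold times, using the pooled-standard-deviation form of d. The values are invented and the study may have used a different d variant.

```python
import statistics

def percent_change(pre, post):
    return 100.0 * (statistics.mean(post) - statistics.mean(pre)) / statistics.mean(pre)

def cohens_d(pre, post):
    """Cohen's d using the pooled standard deviation of the two measurements."""
    n1, n2 = len(pre), len(post)
    s1, s2 = statistics.stdev(pre), statistics.stdev(post)
    pooled = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(post) - statistics.mean(pre)) / pooled

# Hypothetical cervical endurance hold times in seconds (intake vs. exit).
intake = [55, 62, 48, 70, 58, 66, 52, 61]
exit_ = [78, 80, 66, 95, 75, 88, 70, 82]
print(f"change = {percent_change(intake, exit_):.1f}%, d = {cohens_d(intake, exit_):.2f}")
```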
32. Common Data Elements and Databases Essential for the Study of Musculoskeletal Injuries in Military Personnel.
- Author
Juman, Luke, Schneider, Eric B, Clifton, Dan, and Koehlmoos, Tracey Perez
- Subjects
INFORMATION resources management, DATA libraries, HOUSING management, MILITARY medicine, HEALTH of military personnel
- Abstract
Introduction Injuries are the leading cause of medical encounters with over 2 million medical encounters for musculoskeletal (MSK) conditions and over 700,000 acute injuries per year. Musculoskeletal injuries (MSKIs) are by far the leading health and readiness problem of the U.S. Military. The Proceedings of the International Collaborative Effort on Injury Statistics published a list of 12 data elements deemed necessary for injury prevention in the civilian population; however, there are no standardized list of common data elements (CDEs) across the DoD specifically designed to study MSKIs in the Military Health System (MHS). This study aims to address this gap in knowledge by defining CDEs across the DoD for MSKIs, establishing a CDE dictionary, and compiling other necessary information to quantify MSKI disease burden in the MHS. Materials and Methods Between November 2022 and March 2023, we conducted an environmental scan of current MSKI data metrics across the DoD. We used snowball sampling with active engagement of groups housing datasets that contained MSKI data elements to determine CDEs as well as information on readiness databases across the DoD containing up-to-date personnel information on disease, hospitalizations, limited duty days (LDDs), and deployability status for all military personnel, as well as MSKI-specific measures from the MHS Dashboard which tracks key performance measures. Results We identified 8 unique databases: 5 containing demographic and diagnostic information (Defense Medical Surveillance System, Medical Assessment and Readiness Systems, Military Health System Data Repository, Person-Data Environment, and Soldier Performance, Health, and Readiness Database); and 3 containing LDD information (Aeromedical Services Information Management System, eProfile, and Limited Duty Sailor Marines Readiness Tracker). Nine CDEs were identified: DoD number, sex, race, ethnicity, branch of service, rank, diagnosis, Common Procedural Terminology coding, and cause codes, as they may be captured in any database that is a derivative of the Military Health System Data Repository. Medical Assessment and Readiness Systems contained most variables of interest, excluding injury/place of region and time in service. The Limited Duty Sailor Marines Readiness Tracker contains a variable corresponding to "days on limited duty." The Aeromedical Services Information Management System uses the "release date" and "profile date" to calculate LDDs. The eProfile system determines LDDs by the difference between the "expiration date" and "approved date." In addition, we identified 2 measures on the MHS Dashboard. One measures the percentage of service members (SMs) who are on limited duty for longer than 90 days because of an MSKI and the other tracks the percentage of SMs that are not medically ready for deployment because of a deployment-limiting medical condition. Conclusions This article identifies core data elements needed to understand and prevent MSKIs and where these data elements can be found. These elements should inform researchers and result in evidence-informed policy decisions supporting SM health to optimize military force readiness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Barriers and Facilitators to Administering Burn Pit Registry Exams in VHA Facilities.
- Author
Chen, Patricia V, Christie, Israel C, Godwin, Kyler M, Han, Jaehwan, Jani, Nisha, Sotolongo, Anays, Ali, Asma, and Helmer, Drew A
- Subjects
MILITARY medical personnel, MEDICAL personnel, VETERANS' health, MEDICAL records, DUTY
- Abstract
Introduction The Veterans Health Administration (VHA) established the Airborne Hazards and Open Burn Pit Registry (AHOBPR) in 2014 to address exposure concerns for veterans who have served in military operations in Southwest Asia and Afghanistan. By 2021, over 236,086 veterans completed the online questionnaire and 60% requested an AHOBPR examination. Of those requesting an exam, only 12% had an exam recorded in their medical record. This article summarizes barriers and facilitators to delivering AHOBPR exams and shares lessons learned from facilities who have successfully implemented burn pit exams for veterans. Materials and Methods We (I.C.C and J.H.) constructed a key performance measure of AHOBPR examination (the ratio of examinations performed in facility over examinations assigned to a facility) to identify top performing facilities and then used stratified purposeful sampling among high-performing sites to recruit a diverse set of facilities for participation. We (P.V.C. and A.A.) recruited and interviewed key personnel at these facilities about their process of administering burn pit exams. Rapid qualitative methods were used to analyze interviews. Results The ratio of exams performed to exams assigned ranged from 0.00 to 14.50 for the 129 facilities with available information. Twelve interviews were conducted with a total of 19 participants from 10 different facilities. We identified 3 barriers: Unclear responsibility, limited incentives and competing duties for personnel involved, and constrained resources. Facilitators included the presence of an internal facilitator, additional staff support, and coordination across a facility's departments to provide care. Conclusions Gaps across many VHA facilities to provide AHOBPR exams may be understood as stemming from organizational issues related to clear delegation of responsibility and staffing issues. VHA facilities that wish to increase AHOBPR exams for veterans may need additional administrative and medical staff. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
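A minimal sketch of the facility-level key performance measure described in the record above (AHOBPR examinations performed divided by examinations assigned), used here to rank hypothetical facilities; the facility names and counts are invented for illustration:

```python
# Hypothetical counts per facility: (exams performed, exams assigned).
facility_counts = {
    "Facility A": (29, 4),
    "Facility B": (120, 480),
    "Facility C": (0, 35),
}

def exam_performance_ratio(performed: int, assigned: int) -> float:
    """Exams performed divided by exams assigned; 0.0 when none were assigned."""
    return performed / assigned if assigned else 0.0

# Rank facilities from highest to lowest ratio, mirroring the study's
# identification of top-performing sites for purposeful sampling.
ranked = sorted(
    ((name, exam_performance_ratio(*counts)) for name, counts in facility_counts.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, ratio in ranked:
    print(f"{name}: {ratio:.2f}")
```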
34. 2024 Update to the 2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American Heart Association/American College of Cardiology Joint Committee on Performance Measures.
- Abstract
This document describes performance measures for heart failure that are appropriate for public reporting or pay-for performance programs and is meant to serve as a focused update of the "2020 ACC/AHA Clinical Performance and Quality Measures for Adults With Heart Failure: A Report of the American College of Cardiology/American Heart Association Task Force on Performance Measures." The new performance measures are taken from the "2022 AHA/ACC/HFSA Guideline for the Management of Heart Failure: A Report of the American College of Cardiology/American Heart Association Joint Committee on Clinical Practice Guidelines" and are selected from the strongest recommendations (Class 1 or Class 3). In contrast, quality measures may not have as much evidence base and generally comprise metrics that might be useful for clinicians and health care organizations for quality improvement but are not yet appropriate for public reporting or pay-for-performance programs. New performance measures include optimal blood pressure control in patients with heart failure with preserved ejection fraction, the use of sodium-glucose cotransporter-2 inhibitors for patients with heart failure with reduced ejection fraction, and the use of guideline-directed medical therapy in hospitalized patients. New quality measures include the use of sodium-glucose cotransporter-2 inhibitors in patients with heart failure with mildly reduced and preserved ejection fraction, the optimization of guideline-directed medical therapy prior to intervention for chronic secondary severe mitral regurgitation, continuation of guideline-directed medical therapy for patients with heart failure with improved ejection fraction, identifying both known risks for cardiovascular disease and social determinants of health, patient-centered counseling regarding contraception and pregnancy risks for individuals with cardiomyopathy, and the need for a monoclonal protein screen to exclude light chain amyloidosis when interpreting a bone scintigraphy scan assessing for transthyretin cardiac amyloidosis. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. Deriving Verified Vehicle Trajectories from LiDAR Sensor Data to Evaluate Traffic Signal Performance.
- Author
-
Saldivar-Carranza, Enrique D. and Bullock, Darcy M.
- Subjects
LIDAR, DETECTORS, TRAFFIC signs & signals, PERFORMANCE evaluation
- Abstract
Advances and cost reductions in Light Detection and Ranging (LiDAR) sensor technology have allowed for their implementation in detecting vehicles, cyclists, and pedestrians at signalized intersections. Most LiDAR use cases have focused on safety analyses using its high-fidelity tracking capabilities. This study presents a methodology to transform LiDAR data into localized, verified, and linear-referenced trajectories to derive Purdue Probe Diagrams (PPDs). The following four performance measures are then derived from the PPDs: arrivals on green (AOG), split failures (SF), downstream blockage (DSB), and control delay level of service (LOS). Noise is filtered for each detected vehicle by iteratively projecting each sample's future location and keeping the subsequent sample that is close enough to the estimated destination. Then, a far side is defined for the analyzed intersection's movement to linear reference sampled trajectories and to remove those that do not cross through that point. The technique is demonstrated by using over one hour of LiDAR data at an intersection in Utah to derive PPDs. Signal performance is then estimated from these PPDs. The results are compared to those obtained from comparable PPDs derived from connected vehicle (CV) trajectory data. The generated PPDs from both data sources are similar, with relatively modest differences of 1% AOG and a 1.39 s/veh control delay. Practitioners can use the presented methodology to estimate trajectory-based traffic signal performance measures from their deployed LiDAR sensors. The paper concludes by recommending that unfiltered LiDAR data are used for deriving PPDs and extending the detection zones to cover the largest observed queues to improve performance estimation reliability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
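The noise-filtering step described in the record above (project each sample's future position and keep the next sample only if it lands close enough to that projection) can be sketched as follows; the constant-velocity projection and the 3 m threshold are simplifying assumptions for illustration, not the authors' exact parameters:

```python
import math

def filter_trajectory(samples, max_error_m=3.0):
    """Keep LiDAR samples whose positions agree with a projection from the
    previously accepted samples; drop the rest as likely detection noise.

    samples: list of (t_seconds, x_m, y_m) tuples ordered by time.
    """
    if len(samples) < 3:
        return list(samples)
    kept = [samples[0], samples[1]]  # bootstrap the velocity estimate
    for t, x, y in samples[2:]:
        (t1, x1, y1), (t2, x2, y2) = kept[-2], kept[-1]
        dt12 = max(t2 - t1, 1e-6)
        vx, vy = (x2 - x1) / dt12, (y2 - y1) / dt12
        dt = t - t2
        # Projected destination assuming constant velocity since the last kept sample.
        x_pred, y_pred = x2 + vx * dt, y2 + vy * dt
        if math.hypot(x - x_pred, y - y_pred) <= max_error_m:
            kept.append((t, x, y))
    return kept
```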
36. Manufacturing bio-based fiber-reinforced polymer composites: process performance in RTM and VARI processes
- Author
-
Ulrike Kirschnick, Michael Feuchter, Bharath Ravindran, Moritz Salzmann, Ivica Duretek, Ewald Fauster, and Ralf Schledjewski
- Subjects
Performance measures, composite manufacturing, bio-based materials, resin transfer molding, vacuum assisted resin infusion, annealing, Polymers and polymer manufacture, TP1080-1185, Automation, T59.5
- Abstract
The utilization of bio-based materials for the manufacturing of fiber-reinforced polymer composites is gaining importance under the sustainability paradigm. The identification of suitable process parameters and limited process reproducibility remain among the major challenges to enhancing the industrial application potential of bio-based composites. This is especially relevant, as the manufacturing process influences composite quality, economic performance, and environmental impacts. This study compares Resin Transfer Molding and Vacuum Assisted Resin Infusion for two sets of process parameters in order to manufacture a composite plate consisting of a flax-fiber textile impregnated with a partially bio-based epoxy matrix. Process quality is described through statistical analysis of processing and composite properties, and process performance, in terms of replicability and reliability, is described using performance estimates. Processing parameters were selected to depict a range of manufacturing scenarios suitable for the selected bio-based material system, from curing for 180 min at 60 °C to curing for 30 min at 100 °C. For an identical set of process conditions, Resin Transfer Molding outperforms Vacuum Assisted Resin Infusion in terms of tensile and flexural characteristics. Conversely, the latter shows the strongest fiber-matrix adhesion and the most homogeneous impregnation. Whereas manufacturing at the lower temperature has positive effects on composite quality, the higher processing temperature with shorter curing cycles achieves the highest process performance in terms of Pp and Ppk indices. An additional annealing step at 120 °C neither increases composite quality nor reduces manufacturing-induced variability. Results depend on processing differences and on the indicators used to determine process performance, as well as on methodological choices.
- Published
- 2024
- Full Text
- View/download PDF
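The process performance indices named in the record above follow the standard definitions Pp = (USL − LSL) / 6s and Ppk = min(USL − x̄, x̄ − LSL) / 3s, where s is the overall sample standard deviation. A short sketch, with specification limits and measurements invented for illustration:

```python
import statistics

def pp_ppk(measurements, lsl, usl):
    """Overall process performance indices from a sample of measurements."""
    mean = statistics.mean(measurements)
    s = statistics.stdev(measurements)  # overall (long-term) standard deviation
    pp = (usl - lsl) / (6 * s)
    ppk = min(usl - mean, mean - lsl) / (3 * s)
    return pp, ppk

# Hypothetical flexural strength data (MPa) and specification limits.
sample = [182.0, 175.5, 190.2, 185.1, 178.4, 188.0, 181.3, 184.7]
print(pp_ppk(sample, lsl=150.0, usl=210.0))
```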
37. Professional pharmacy Services' outcomes performance measurement: A narrative review
- Author
-
Lígia Reis and João Gregório
- Subjects
Community pharmacy, Professional pharmacy services, Health outcomes, Performance measures, Value in Health, Pharmacy and materia medica, RS1-441
- Abstract
Background: Professional pharmacy services are widely recognized for their role in promoting patient health and ensuring optimal medication therapy outcomes. Community pharmacies and pharmacists need to assess professional services' performance at the patient level and demonstrate their value to stakeholders. To do so, it is important to understand which outcome performance indicators are currently being used and how added value is demonstrated. Objective: To identify performance indicators that measure patients' outcomes and demonstrate the value of professional pharmacy services. Methods: A narrative review was performed based on a systematic search of the Pubmed and Scopus databases from the year 2000 onward. A manual search was also conducted in Google Scholar and Google.com. Inclusion criteria followed the PCC mnemonic, in which Population is "community pharmacies", Context is "pharmaceutical care, professional pharmaceutical services or pharmaceutical interventions", and Concept is "key performance indicators, or performance measures or clinical indicators". Papers in English, Spanish, or Portuguese were accepted. Results: All types of papers were included, adding up to a total of 12 papers. The publication of papers on this subject has increased in the last decade. The outcome indicators identified were based on different frameworks, mainly linked to quality, and were clearly outlined. Disease and therapy management were the most evaluated services. Indicators were identified across 8 different domains, corresponding predominantly to outputs rather than outcomes. Measurement is mainly conducted under the auspices of coalitions, alliances, governments, and payers, reflecting their perspectives and relying on easy-to-retrieve pharmacy data and information. Conclusions: A paradigm shift is needed so that performance indicators are based on more appropriate frameworks to measure patient-level outcomes and the value of professional pharmacy services. By providing robust evidence of the impact of pharmacist interventions on patient outcomes, community pharmacists can advocate for the integration, expansion, and recognition of pharmacist-led services within the broader healthcare system.
- Published
- 2024
- Full Text
- View/download PDF
38. Evaluation of operational transformations for smart manufacturing systems
- Author
-
Parhi, Shreyanshu, Kumar, Shashank, Joshi, Kanchan, Akarte, Milind, Raut, Rakesh D., and Narkhede, Balkrishna Eknath
- Published
- 2024
- Full Text
- View/download PDF
39. Revolutionizing performance measures and criteria for the facilities management industry in the UAE
- Author
-
Mawed, Mahmoud
- Published
- 2024
- Full Text
- View/download PDF
40. Development and validation of a new ICD-10-based screening colonoscopy overuse measure in a large integrated healthcare system: a retrospective observational study.
- Author
-
Adams, Megan, Kerr, Eve, Dominitz, Jason, Gao, Yuqing, Yankey, Nicholas, Mafi, John, Saini, Sameer, and May, Folsade
- Subjects
General practice, Healthcare quality improvement, Performance measures, United States, Humans, International Classification of Diseases, Retrospective Studies, United States Department of Veterans Affairs, Reproducibility of Results, Colonoscopy, Delivery of Health Care, Integrated
- Abstract
BACKGROUND: Low-value use of screening colonoscopy is wasteful and potentially harmful to patients. Decreasing low-value colonoscopy prevents procedural complications, saves patient time and reduces patient discomfort, and can improve access by reducing procedural demand. The objective of this study was to develop and validate an electronic measure of screening colonoscopy overuse using International Classification of Diseases, Tenth Edition codes and then apply this measure to estimate facility-level overuse to target quality improvement initiatives to reduce overuse in a large integrated healthcare system. METHODS: Retrospective national observational study of US Veterans undergoing screening colonoscopy at 119 Veterans Health Administration (VHA) endoscopy facilities in 2017. A measure of screening colonoscopy overuse was specified by an expert workgroup, and electronic approximation of the measure numerator and denominator was performed (electronic measure). The electronic measure was then validated via manual record review (n=511). Reliability statistics (n=100) were calculated along with diagnostic test characteristics of the electronic measure. The measure was then applied to estimate overall rates of overuse and facility-level variation in overuse among all eligible patients. RESULTS: The electronic measure had high specificity (99%) and moderate sensitivity (46%). Adjusted positive predictive value and negative predictive value were 33% and 95%, respectively. Inter-rater reliability testing revealed near perfect agreement between raters (k=0.81). 269 572 colonoscopies were performed in VHA in 2017 (88 143 classified as screening procedures). Applying the measure to these 88 143 screening colonoscopies, 24.5% were identified as potential overuse. Median facility-level overuse was 22.5%, with substantial variability across facilities (IQR 19.1%-27.0%). CONCLUSIONS: An International Classification of Diseases, Tenth Edition based electronic measure of screening colonoscopy overuse has high specificity and improved sensitivity compared with a previous International Classification of Diseases, Ninth Edition based measure. Despite increased focus on reducing low-value care and improving access, a quarter of VHA screening colonoscopies in 2017 were identified as potential low-value procedures, with substantial facility-level variability.
- Published
- 2023
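The validation statistics reported in the record above (sensitivity, specificity, PPV, and NPV against manual record review, plus Cohen's kappa for inter-rater reliability) follow standard definitions. A short sketch with invented confusion and agreement counts, not the study's data:

```python
def diagnostic_characteristics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from 2x2 confusion counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

def cohens_kappa(a_yes_b_yes, a_yes_b_no, a_no_b_yes, a_no_b_no):
    """Cohen's kappa for agreement between two raters on a yes/no judgment."""
    n = a_yes_b_yes + a_yes_b_no + a_no_b_yes + a_no_b_no
    observed = (a_yes_b_yes + a_no_b_no) / n
    p_yes = ((a_yes_b_yes + a_yes_b_no) / n) * ((a_yes_b_yes + a_no_b_yes) / n)
    p_no = ((a_no_b_yes + a_no_b_no) / n) * ((a_yes_b_no + a_no_b_no) / n)
    expected = p_yes + p_no
    return (observed - expected) / (1 - expected)

# Invented counts, for illustration only.
print(diagnostic_characteristics(tp=30, fp=20, fn=10, tn=440))
print(cohens_kappa(a_yes_b_yes=20, a_yes_b_no=3, a_no_b_yes=2, a_no_b_no=75))
```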
41. Deriving Verified Vehicle Trajectories from LiDAR Sensor Data to Evaluate Traffic Signal Performance
- Author
-
Enrique D. Saldivar-Carranza and Darcy M. Bullock
- Subjects
LiDAR, trajectory, traffic signal, performance measures, connected vehicle, Engineering (General). Civil engineering (General), TA1-2040
- Abstract
Advances and cost reductions in Light Detection and Ranging (LiDAR) sensor technology have allowed for their implementation in detecting vehicles, cyclists, and pedestrians at signalized intersections. Most LiDAR use cases have focused on safety analyses using its high-fidelity tracking capabilities. This study presents a methodology to transform LiDAR data into localized, verified, and linear-referenced trajectories to derive Purdue Probe Diagrams (PPDs). The following four performance measures are then derived from the PPDs: arrivals on green (AOG), split failures (SF), downstream blockage (DSB), and control delay level of service (LOS). Noise is filtered for each detected vehicle by iteratively projecting each sample’s future location and keeping the subsequent sample that is close enough to the estimated destination. Then, a far side is defined for the analyzed intersection’s movement to linear reference sampled trajectories and to remove those that do not cross through that point. The technique is demonstrated by using over one hour of LiDAR data at an intersection in Utah to derive PPDs. Signal performance is then estimated from these PPDs. The results are compared to those obtained from comparable PPDs derived from connected vehicle (CV) trajectory data. The generated PPDs from both data sources are similar, with relatively modest differences of 1% AOG and a 1.39 s/veh control delay. Practitioners can use the presented methodology to estimate trajectory-based traffic signal performance measures from their deployed LiDAR sensors. The paper concludes by recommending that unfiltered LiDAR data are used for deriving PPDs and extending the detection zones to cover the largest observed queues to improve performance estimation reliability.
- Published
- 2024
- Full Text
- View/download PDF
42. Circular supply chain implementation performance measurement framework: a comparative case analysis.
- Author
-
Lahane, Swapnil, Kant, Ravi, Shankar, Ravi, and Patil, Sachin K.
- Subjects
CIRCULAR economy, ANALYTIC hierarchy process, BUSINESS enterprises, ORGANIZATIONAL performance, BALANCED scorecard
- Abstract
Circular supply chain (CSC) has gained traction amongst academicians, practitioners, and policymakers across the world due to its wide range of sustainable benefits to business organizations. CSC amalgamates circular economy (CE) thinking into the supply chain operations of industry and improves the three sustainability dimensions of organizational performance. However, manufacturing organizations in developing economies are finding it difficult to measure the impact of CSC adoption on organizational performance. Therefore, this research aims to explore CSC performance measures and to develop a performance measurement framework for assessing the impact of CSC implementation on business organizational performance. This research proposes a modified balanced scorecard (BSC) based hybrid framework of Pythagorean fuzzy analytic hierarchy process (PF-AHP) and Pythagorean fuzzy weighted aggregated sum product assessment (PF-WASPAS) methods. The effectiveness of the proposed framework is validated through an empirical case study of an Indian manufacturing company. Further, the proposed framework is tested with three other Indian manufacturing companies, and their results are compared with the case company. The findings reveal that the overall performance of the empirical case company is 62.88% based on the defined set of performance measures, and the performance of the other three companies is 64.51%, 56.47%, and 52.43%, respectively. The outcomes of this study show that the proposed research framework is reliable, consistent, and robust from a circular perspective and that it offers an effective way to measure and benchmark the impact of CSC adoption on organizational performance. This research contributes to the knowledge of CSC management for achieving sustainability in the business environment. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
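The framework in the record above ranks companies with Pythagorean fuzzy AHP weights and PF-WASPAS. As a much simplified, non-fuzzy illustration of the underlying WASPAS aggregation only (crisp numbers; the criterion weights, normalized scores, and λ = 0.5 are invented), the joint score combines a weighted sum with a weighted product:

```python
def waspas_score(normalized_scores, weights, lam=0.5):
    """Classic (crisp) WASPAS: convex combination of the weighted-sum and
    weighted-product models over benefit-normalized scores in (0, 1]."""
    wsm = sum(w * x for w, x in zip(weights, normalized_scores))
    wpm = 1.0
    for w, x in zip(weights, normalized_scores):
        wpm *= x ** w
    return lam * wsm + (1 - lam) * wpm

# Hypothetical BSC-style performance measures, already normalized to (0, 1].
weights = [0.35, 0.25, 0.25, 0.15]  # e.g., AHP-derived criterion weights (assumed)
company_scores = {
    "Case company": [0.70, 0.62, 0.58, 0.55],
    "Company B": [0.66, 0.71, 0.60, 0.52],
}
for name, scores in company_scores.items():
    print(name, round(waspas_score(scores, weights), 4))
```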
43. On monitoring the standard deviation of log‐normal process.
- Author
-
Akhtar, Noureen, Abid, Muhammad, Amir, Muhammad Wasim, Riaz, Muhammad, and Nazir, Hafiz Zafar
- Subjects
*QUALITY control charts, *STANDARD deviations, *LOGNORMAL distribution, *SKEWNESS (Probability theory), *MONTE Carlo method, *MOVING average process
- Abstract
Control charts are widely used in the manufacturing and service sectors to track, regulate, and enhance process output. A manufacturing industry desires a control chart that has an effective structure and is sensitive enough to detect infrequent variations in the process. Generally, control charts are developed under the presumption that the quality variable under study is normally distributed; in actual applications, however, many processes have skewed distributions. The purpose of this study is to use moving average (MA) charts to track the dispersion of a log-normal distribution. The design of the proposals is developed, and their performance is assessed through run-length properties. The cumulative distributions of run length under in-control and out-of-control conditions are provided to give a broad view of the performance. The simulation findings show that when the value of the log-normal dispersion parameter is large, the proposed chart is more sensitive to changes in the dispersion. Additionally, an industrial application is given to illustrate the charts suggested in this research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
44. Queueing theory in research practice: generating probabilistic models for waiting lines [Teoría de colas en la práctica investigativa: generación de modelos probabilísticos para líneas de espera].
- Author
-
Ángel Burbano-Pantoja, Víctor Miguel, Valdivieso-Miranda, Margoth Adriana, and Burbano-Valdivieso, Ángela Saray
- Subjects
*LIBRARY cooperation, *PROBABILITY measures, *QUEUING theory, *ACADEMIC libraries, *DECISION making
- Abstract
The generation and application of probabilistic models are required to make assertive decisions based on data associated with user access and service times in various organizations. The objective was to analyze the waiting-line system of the central library of a university, using parameters designed to measure its performance. The methodology included quantitative techniques focused on a cross-sectional design. The information was collected with an observation grid by counting the number of students who entered the library in 10-minute periods to carry out consultation activities. Probability models were fitted to the data to describe the operation of the system. The results showed that the system operated stably, with short waiting times and appropriate service times. It was concluded that the analyzed library system achieved a high percentage of utilization (89.06%) by the students during the observed hours. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
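For a simple M/M/c waiting-line model of the kind referenced in the record above, the standard performance parameters follow from the arrival rate λ, the service rate μ, and the number of servers c. The sketch below uses invented rates (chosen only so that utilization lands near the 89% figure quoted in the abstract), not the paper's fitted model:

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """Standard M/M/c performance measures: utilization, queue length, waits."""
    a = lam / mu                      # offered load (erlangs)
    rho = a / c                       # server utilization, must be < 1
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    lq = p0 * a**c * rho / (factorial(c) * (1 - rho) ** 2)   # mean queue length
    wq = lq / lam                     # mean wait in queue
    return {"utilization": rho, "Lq": lq, "Wq": wq, "W": wq + 1 / mu, "L": lq + a}

# Hypothetical service point: 5.7 arrivals and 6.4 services per period, 1 server.
print(mmc_metrics(lam=5.7, mu=6.4, c=1))
```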
45. Robust weighted general performance score for various classification scenarios.
- Author
-
Pandey, Gaurav, Bagri, Rashika, Gupta, Rajan, Rajpal, Ankit, Agarwal, Manoj, and Kumar, Naveen
- Subjects
DATA modeling, CLASSIFICATION
- Abstract
Traditionally, performance measures such as accuracy, recall, precision, specificity, and negative predictive value (NPV) have been used to evaluate a classification model's performance. However, these measures often fall short of capturing different classification scenarios, such as binary or multi-class, balanced or imbalanced, and noisy or noiseless data. Therefore, there is a need for a robust evaluation metric that can assist business decision-makers in selecting the most suitable model for a given scenario. Recently, a general performance score (GPS) comprising different combinations of traditional performance measures (TPMs) was proposed. However, it indiscriminately assigns equal importance to each measure, often leading to inconsistencies. To overcome the shortcomings of GPS, we introduce an enhanced metric called the Weighted General Performance Score (W-GPS) that considers each measure's coefficient of variation (CV) and assigns weights to each measure based on its CV value. Using consistency as the criterion, we found that W-GPS outperformed GPS in the above-mentioned classification scenarios. Further, when considering W-GPS with different weighted combinations of TPMs, no single combination emerged as best across all scenarios. Thus, W-GPS offers the user the flexibility to choose the most suitable combination for a given scenario. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
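The W-GPS idea in the record above weights traditional performance measures by each measure's coefficient of variation; the abstract does not give the exact weighting rule, so the sketch below assumes a simple normalized inverse-CV weighting purely for illustration (the counts and the two-fold setup are also invented):

```python
import statistics

def traditional_measures(tp, fp, fn, tn):
    """Accuracy, recall, precision, specificity, and NPV from confusion counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "recall": tp / (tp + fn),
        "precision": tp / (tp + fp),
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),
    }

def weighted_general_score(per_fold_measures):
    """Hypothetical CV-based weighting: measures that vary less across folds
    (lower coefficient of variation) receive proportionally larger weights."""
    names = per_fold_measures[0].keys()
    means = {m: statistics.mean(f[m] for f in per_fold_measures) for m in names}
    cvs = {m: statistics.stdev(f[m] for f in per_fold_measures) / means[m] for m in names}
    inv = {m: 1.0 / (cvs[m] + 1e-9) for m in names}
    total = sum(inv.values())
    weights = {m: inv[m] / total for m in names}
    return sum(weights[m] * means[m] for m in names), weights

folds = [traditional_measures(40, 5, 6, 49), traditional_measures(38, 7, 8, 47)]
score, weights = weighted_general_score(folds)
print(round(score, 4), weights)
```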
46. LN: A Flexible Algorithmic Framework for Layered Queueing Network Analysis.
- Author
-
CASALE, GIULIANO, GAO, YICHENG, NIU, ZIFENG, and ZHU, LULAI
- Subjects
QUEUEING networks, MOMENTS method (Statistics), PROBABILITY theory
- Abstract
Layered queueing networks (LQNs) are an extension of ordinary queueing networks useful to model simultaneous resource possession and stochastic call graphs in distributed systems. Existing computational algorithms for LQNs have primarily focused on mean-value analysis. However, other solution paradigms, such as normalizing constant analysis and mean-field approximation, can improve the computation of LQN mean and transient performance metrics, state probabilities, and response time distributions. Motivated by this observation, we propose the first LQN meta-solver, called LN, that allows for the dynamic selection of the performance analysis paradigm to be iteratively applied to the submodels arising from layer decomposition. We report experiments where this added flexibility helps us to reduce the LQN solution errors. We also demonstrate that the meta-solver approach eases the integration of LQNs with other formalisms, such as caching models, enabling the analysis of more general classes of layered stochastic networks. Additionally, to support the accurate evaluation of the LQN submodels, we develop novel algorithms for homogeneous queueing networks consisting of an infinite server node and a set of identical queueing stations. In particular, we propose an exact method of moment algorithms, integration techniques for normalizing constants, and a fast non-iterative mean-value analysis technique. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
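Mean-value analysis, one of the solution paradigms named in the record above, can be illustrated for a closed, single-class product-form network consisting of queueing stations plus an infinite-server (think) node; the station demands and population below are invented:

```python
def mva(service_demands, think_time, n_jobs):
    """Exact single-class MVA for a closed product-form queueing network.

    service_demands: visit-weighted mean service demand at each queueing station.
    think_time: demand at the infinite-server (delay) node.
    Returns system throughput and per-station mean queue lengths at n_jobs.
    """
    q = [0.0] * len(service_demands)          # mean queue lengths at population 0
    x = 0.0
    for n in range(1, n_jobs + 1):
        # Residence time at each station: own service plus queueing behind the
        # jobs already present (arrival theorem).
        r = [d * (1 + q_k) for d, q_k in zip(service_demands, q)]
        x = n / (think_time + sum(r))         # system throughput
        q = [x * r_k for r_k in r]            # updated mean queue lengths
    return x, q

# Hypothetical two-station submodel with a delay (think) node.
throughput, queues = mva(service_demands=[0.050, 0.030], think_time=0.5, n_jobs=20)
print(round(throughput, 3), [round(v, 3) for v in queues])
```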
47. Influences of black nickel coating thickness on thermal behaviour of flat plate solar collector: performance evaluation.
- Author
-
Palanikumar, Ponnuswamy, Seikh, A. H., Kalam, Md Abul, and Venkatesh, R.
- Subjects
*NICKEL-plating, *SOLAR collectors, *HEAT transfer coefficient, *HEAT radiation & absorption, *SURFACE coatings, *COPPER tubes
- Abstract
The coating material adopted for finned copper absorber tubes plays a significant role and should have high absorptance and low emissivity. This study augmented flat plate collector (FPC) heat transfer through an absorber tube coated with black nickel blended with industrial black matt paint, with coating thicknesses of 0.3, 0.2, and 0.1 μm applied via spray pyrolysis. Based on a thermal mathematical model, the heat absorption, heat transfer coefficient, and thermal and exergy efficiency of the FPC were estimated, and the results were compared with a non-coated FPC. The best coating thickness exhibits peak absorptance (0.98) and low emissivity (0.097), leading to improved heat transfer behaviour, specifically for the 0.1 μm thin-film-coated FPC. Hence, the outlet temperature of the working fluid is approximately 96.1 °C when utilizing the 0.1 μm thin-film coating; this is the highest outlet temperature observed among the coated and non-coated absorber tubes. The FPC with the 0.1 µm thin black nickel coating recorded superior heat absorption and heat transfer coefficient values of 1689 W and 140.58 W m−2 K−1. Similarly, the average thermal and exergy efficiencies are about 74.3% and 54.9%, respectively, for the 0.1 μm thin-film-coated absorber, owing to its higher absorptance and lower emissivity (0.097). Moreover, the 0.1 μm thin-film coating exhibits the lowest entropy generation, about 0.00065 W K−1, indicating that, in addition to achieving higher outlet temperatures, it performs more efficiently and minimizes irreversible processes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
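The thermal efficiency figures in the record above follow from an energy balance on the collector: useful heat gain Q = ṁ·c_p·(T_out − T_in) divided by the incident solar power G·A. A small sketch with invented operating values, not the paper's measurements:

```python
def collector_thermal_efficiency(m_dot, cp, t_in, t_out, irradiance, area):
    """First-law thermal efficiency of a flat plate collector.

    m_dot: mass flow rate of the working fluid [kg/s]
    cp: specific heat of the working fluid [J/(kg K)]
    irradiance: global solar irradiance on the collector plane [W/m^2]
    area: collector aperture area [m^2]
    """
    q_useful = m_dot * cp * (t_out - t_in)   # useful heat gain [W]
    q_incident = irradiance * area           # incident solar power [W]
    return q_useful / q_incident

# Hypothetical water-heating test point.
eta = collector_thermal_efficiency(
    m_dot=0.015, cp=4186.0, t_in=30.0, t_out=50.0, irradiance=950.0, area=2.0
)
print(f"thermal efficiency ≈ {eta:.1%}")
```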
48. Prediagnostic evaluation of multicancer detection tests: design and analysis considerations.
- Author
-
Baker, Stuart G and Etzioni, Ruth
- Subjects
*TEST design, *SAMPLE size (Statistics), *OVARIAN cancer, *STATISTICAL sampling, *CANCER patients
- Abstract
There is growing interest in multicancer detection tests, which identify molecular signals in the blood that indicate a potential preclinical cancer. A key stage in evaluating these tests is a prediagnostic performance study, in which investigators store specimens from asymptomatic individuals and later test stored specimens from patients with cancer and a random sample of controls to determine predictive performance. Performance metrics include rates of cancer-specific true-positive and false-positive findings and a cancer-specific positive predictive value, with the latter compared with a decision-analytic threshold. The sample size trade-off method, which trades imprecise targeting of the true-positive rate for precise targeting of a zero-false-positive rate can substantially reduce sample size while increasing the lower bound of the positive predictive value. For a 1-year follow-up, with ovarian cancer as the rarest cancer considered, the sample size trade-off method yields a sample size of 163 000 compared with a sample size of 720 000, based on standard calculations. These design and analysis recommendations should be considered in planning a specimen repository and in the prediagnostic evaluation of multicancer detection tests. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
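The cancer-specific positive predictive value discussed in the record above follows from Bayes' rule once the true-positive rate, false-positive rate, and prevalence of preclinical cancer in the screened population are fixed; the values below are illustrative assumptions, not figures from the paper:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """PPV of a screening test via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values for a rare cancer over a 1-year follow-up window.
ppv = positive_predictive_value(sensitivity=0.40, specificity=0.995, prevalence=0.0005)
print(f"PPV ≈ {ppv:.1%}")  # compare against a prespecified decision-analytic threshold
```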
49. Efficient Breast Cancer Dataset Analysis Based on Adaptive Classifiers.
- Author
-
Kareem, Thikra Ali, Al-Ani, Muzhir Shaban, and Nejres, Salwa Mohammed
- Subjects
*BREAST cancer diagnosis, *EARLY diagnosis, *SUPPORT vector machines, *NAIVE Bayes classification, *DECISION making
- Abstract
Many algorithms have been used to diagnose diseases, with some demonstrating good performance while others have not met expectations. Making correct decisions with as few errors as possible is of the highest priority when diagnosing diseases. Breast cancer, being a prevalent and widespread disease, emphasizes the importance of early detection. Accurate decision-making regarding breast cancer is crucial for early treatment and achieving favorable outcomes. The percentage-split evaluation approach was employed, comparing performance metrics such as precision, recall, and F1-score. Kernel Naïve Bayes achieved 100% precision for breast cancer under the percentage-split method, while the Coarse Gaussian support vector machine achieved 97.2% precision in classifying breast cancer under 4-fold cross-validation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Reassessment of performance evaluation of EWMA control chart for exponential process.
- Author
-
Rasheed, Zahid, Zhang, Hongying, and Anwar, Syed Masroor
- Subjects
*QUALITY control charts, *URINARY tract infections, *MOVING average process
- Abstract
Memory-type control charts, such as the cumulative sum and exponentially weighted moving average (EWMA) charts, are widely used for tracking minor to moderate changes in process parameters. The literature presented an EWMA chart for monitoring exponential processes that assumed probability limits to determine both the in-control and out-of-control average run length (ARL). The use of probability limits for memory charts is an incorrect method. We therefore aim to correct the EWMA chart of Aslam et al. and propose an EWMA chart (denoted IEWMA_E) for the exponential process using the run-length (RL) method based on simulation instead of the probability-based method. The ARL, standard deviation of the RL, and median of the RL are used to evaluate the proposed chart's performance. For practical consideration of the proposed chart, a real-world example of a urinary tract infection is also presented. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
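A compact sketch of the simulation-based run-length evaluation described in the record above: an EWMA statistic of exponential observations is simulated until it crosses fixed control limits, and the ARL, standard deviation of the RL, and median RL are estimated from many replicated run lengths. The smoothing constant, control limits, and shift size are placeholders, not the authors' chart design:

```python
import random
import statistics

def ewma_run_length(theta, lam=0.2, lcl=0.6, ucl=1.5, z0=1.0, max_n=100_000):
    """Run length until the EWMA of exponential(theta) observations leaves (lcl, ucl).

    z0 is the in-control target at which the EWMA statistic is initialised.
    """
    z = z0
    for n in range(1, max_n + 1):
        x = random.expovariate(1.0 / theta)   # exponential observation with mean theta
        z = lam * x + (1 - lam) * z
        if not (lcl < z < ucl):
            return n
    return max_n

def run_length_profile(theta, reps=5_000, seed=7):
    random.seed(seed)
    rls = [ewma_run_length(theta=theta) for _ in range(reps)]
    return {
        "ARL": statistics.mean(rls),
        "SDRL": statistics.stdev(rls),
        "MDRL": statistics.median(rls),
    }

print("in-control:", run_length_profile(theta=1.0))
print("shifted:  ", run_length_profile(theta=1.5))   # upward shift in the process mean
```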