541,379 results for "Software"
Search Results
2. Give your WMS a boost: New types of third-party software solutions can enhance WMS in many ways, including gains in inventory accuracy, better tools for creating workflows and user interfaces for task execution, and simplified integration with mobile robots and other forms of automation
- Author
-
Michel, Roberto
- Subjects
Software ,Robotics industry ,Warehousing ,Robots ,Human resource departments -- Computer programs ,Warehouse stores ,Human resources management software ,Software quality ,Robot ,Business, general ,Business ,Engineering and manufacturing industries - Abstract
Warehouse management systems (WMS) remain the transactional and process management foundation for inventory control and order fulfillment in most DCs. In recent years, when Modern has surveyed readers about WMS [...]
- Published
- 2024
3. Gearing Up for In-House Production: By investing in a multiaxis-capable shop and utilizing simulation software for diagnostic checks, Techtronic Industries turned a four- to 10-week lead time into a one- to two-week lead time
- Author
-
Fields, Nathaniel
- Subjects
Mastercam/CNC Software Inc. ,Techtronic Industries Company Ltd. ,Synthetic training devices ,Engineering -- Computer programs ,Software ,Machine-tools ,Machinists' tools ,Software quality ,Engineering software ,Business ,Metals, metalworking and machinery industries - Abstract
Specialty gears--especially those with complex shapes or tolerances such as spur, helical, bevel, miter, spiral and pinion gears--have historically been components that Techtronic Industries (TTI) farmed out of its Anderson, [...]
- Published
- 2024
4. BUSINESS OPERATIONS: COMPUTER SYSTEMS BUYERS' GUIDE
- Subjects
K Systems Inc. ,Software ,Electronic components industry ,Customer service ,Support services ,Software quality ,Business ,Petroleum, energy and mining industries - Abstract
ABILIS ENERGY 1010 Sherbrooke west, 1900 Montreal Quebec, Canada J2X4J3 514-225-4145 ADD SYSTEMS 6 Laurel Dr. Flanders, NJ 07836 800-922-0972 E-mail: Coylej@addsys.com Primary Market: North America Contact Person: John Coyle, [...]
- Published
- 2024
5. Fabrication of Human Milk Fat Substitute: Based on the Similarity Evaluation Model and Computer Software.
- Author
-
Zhu, Huiquan, Zhao, Pu, Wang, Xiaodan, Wang, Yunna, Zhang, Shuwen, Pang, Xiaoyang, and Lv, Jiaping
- Subjects
- *
MILKFAT , *MILK substitutes , *BREAST milk , *RAPESEED oil , *COMPUTER software , *FAT substitutes , *FAT - Abstract
We aimed to obtain the optimal formula for a human milk fat substitute (HMFS) through a combination of software and an evaluation model, and further verified its practicability through an animal experiment. The results showed that a total of 33 fatty acid (FA) and 63 triglyceride (TAG) molecular species were detected in vegetable oils. Palmitic acid, oleic acid, linoleic acid, 18:1/16:0/18:1, 18:2/16:0/18:2, 18:1/18:1/18:1 and 18:1/18:2/18:1 were the main molecular species among the FAs and TAGs in the vegetable oils. Based on the HMFS evaluation model, the optimal mixed vegetable oil formula blended 21.3% palm oil, 2.8% linseed oil, 2.6% soybean oil, 29.9% rapeseed oil and 43.4% maize oil, with the highest score of 83.146. Moreover, there was no difference in the weight, blood routine indices or fecal calcium and magnesium concentrations of the mice between the homemade mixed vegetable oil (HMVO) group and the commercial mixed vegetable oil (CMVO) group, while nervonic acid (C24:1) and octanoic acid (C8:0) were absorbed easily in the HMVO group. Therefore, these results demonstrate that mixing different vegetable oils is feasible via a combination of computer software and an evaluation model, providing a new way to produce HMFS. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
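The similarity-evaluation idea in this abstract (scoring a candidate oil blend against a human-milk fatty-acid target) can be sketched roughly as follows. The per-oil fatty-acid profiles, the target profile, and the scoring rule are all illustrative assumptions; only the blend fractions come from the abstract.

```python
# Hypothetical sketch of a blend-evaluation model: score a vegetable-oil
# mixture against a human-milk fatty-acid target. All compositions and the
# scoring rule are illustrative, not the paper's data.

# Fatty-acid profiles (% of total FA) per oil: palmitic, oleic, linoleic.
OILS = {
    "palm":     [44.0, 39.0, 10.0],
    "linseed":  [ 5.0, 18.0, 16.0],
    "soybean":  [11.0, 23.0, 54.0],
    "rapeseed": [ 4.0, 62.0, 20.0],
    "maize":    [11.0, 28.0, 55.0],
}
TARGET = [22.0, 35.0, 15.0]  # illustrative human-milk reference profile

def blend_profile(fractions):
    """Weighted-average FA profile for a blend given {oil: fraction(%)}."""
    profile = [0.0, 0.0, 0.0]
    for oil, frac in fractions.items():
        for i, fa in enumerate(OILS[oil]):
            profile[i] += frac / 100.0 * fa
    return profile

def similarity_score(fractions):
    """100 minus the total absolute deviation from the target profile."""
    profile = blend_profile(fractions)
    deviation = sum(abs(p - t) for p, t in zip(profile, TARGET))
    return 100.0 - deviation

# The abstract's reported optimum (fractions sum to 100%):
best = {"palm": 21.3, "linseed": 2.8, "soybean": 2.6,
        "rapeseed": 29.9, "maize": 43.4}
print(round(similarity_score(best), 2))
```

With real data the evaluation model would score many candidate blends and keep the maximum, which is the search the paper automates in software.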
6. WormBase 2024: status and transitioning to Alliance infrastructure.
- Author
-
Sternberg, Paul W, Auken, Kimberly Van, Wang, Qinghua, Wright, Adam, Yook, Karen, Zarowiecki, Magdalena, Arnaboldi, Valerio, Becerra, Andrés, Brown, Stephanie, Cain, Scott, Chan, Juancarlos, Chen, Wen J, Cho, Jaehyoung, Davis, Paul, Diamantakis, Stavros, Dyer, Sarah, Grigoriadis, Dionysis, Grove, Christian A, Harris, Todd, and Howe, Kevin
- Subjects
- *
DATABASES , *GENOMICS , *INFORMATION resources , *BIOINFORMATICS , *CAENORHABDITIS elegans , *GENETICS - Abstract
WormBase has been the major repository and knowledgebase of information about the genome and genetics of Caenorhabditis elegans and other nematodes of experimental interest for over 2 decades. We have 3 goals: to keep current with the fast-paced C. elegans research, to provide better integration with other resources, and to be sustainable. Here, we discuss the current state of WormBase as well as progress and plans for moving core WormBase infrastructure to the Alliance of Genome Resources (the Alliance). As an Alliance member, WormBase will continue to interact with the C. elegans community, develop new features as needed, and curate key information from the literature and large-scale projects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Updates to the Alliance of Genome Resources central infrastructure.
- Author
-
Consortium, The Alliance of Genome Resources
- Subjects
- *
BIOLOGICAL models , *DATABASES , *COMPUTER software , *DATA mining , *DATABASE management , *DATA curation , *ARTIFICIAL intelligence , *INFORMATION resources , *FISHES , *PROFESSIONS , *MICE , *RATS , *INFORMATION services , *INFORMATION retrieval , *CAENORHABDITIS elegans , *INSECTS , *ONTOLOGIES (Information retrieval) , *MACHINE learning , *GENOMES , *GENETICS , *YEAST , *ANURA - Abstract
The Alliance of Genome Resources (Alliance) is an extensible coalition of knowledgebases focused on the genetics and genomics of intensively studied model organisms. The Alliance is organized as individual knowledge centers with strong connections to their research communities and a centralized software infrastructure, discussed here. Model organisms currently represented in the Alliance are budding yeast, Caenorhabditis elegans, Drosophila, zebrafish, frog, laboratory mouse, laboratory rat, and the Gene Ontology Consortium. The project is in a rapid development phase to harmonize knowledge, store it, analyze it, and present it to the community through a web portal, direct downloads, and application programming interfaces (APIs). Here, we focus on developments over the last 2 years. Specifically, we added and enhanced tools for browsing the genome (JBrowse), downloading sequences, mining complex data (AllianceMine), visualizing pathways, full-text searching of the literature (Textpresso), and sequence similarity searching (SequenceServer). We enhanced existing interactive data tables and added an interactive table of paralogs to complement our representation of orthology. To support individual model organism communities, we implemented species-specific "landing pages" and will add disease-specific portals soon; in addition, we support a common community forum implemented in Discourse software. We describe our progress toward a central persistent database to support curation, the data modeling that underpins harmonization, and progress toward a state-of-the-art literature curation system with integrated artificial intelligence and machine learning (AI/ML). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Energy Consumption and Carbon Emission Reduction in HVAC System of a Dynamic Random Access Memory (DRAM) Semiconductor Fabrication Plant (fab).
- Author
-
Liao, Pin-Yen, Lin, Tee, Ali Zargar, Omid, Hsu, Chia-Jen, Chou, Chia-Hung, Shih, Yang-Cheng, Hu, Shih-Cheng, and Leggett, Graham
- Subjects
- *
DYNAMIC random access memory , *SEMICONDUCTOR manufacturing , *ENERGY consumption , *CARBON emissions , *GREENHOUSE gas mitigation - Abstract
This study focuses on energy saving in a Taiwanese high-tech DRAM factory as the primary research subject. Operational parameters related to various facility systems and process equipment are first collected using the developed energy conversion factors (ECF) calculator. Moreover, innovative fab energy simulation (FES) software has been designed by Taipei Tech for high-tech fab energy consumption analysis. The annual energy consumption data for fabs can be calculated and then converted into carbon dioxide emissions using the power carbon emission coefficient provided by the Bureau of Energy, Ministry of Economic Affairs, Taiwan. In this study, five different energy-saving strategies were proposed, and the energy consumption and carbon emission distributions were evaluated to assess the benefits of the different techniques. The findings show that among the existing operational facilities, the use of an exhaust air conditioning unit with a reduced enthalpy-value setting and a lowered supply air temperature demonstrates the highest energy savings. This technique has the potential to reduce carbon emissions by approximately 623,158 kg of CO2 annually and operational costs by NT$6,005,764 (US$189,602). This can reduce the overall manufacturing cost and is also beneficial for the environment. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
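The core conversion the abstract describes, annual electricity use multiplied by a grid carbon emission coefficient, is simple to sketch. The coefficient below is an assumed placeholder, not the Bureau of Energy's official value.

```python
# Sketch of the energy-to-carbon conversion described above. The emission
# factor is an assumed placeholder, not the official Bureau of Energy
# coefficient.

EMISSION_FACTOR_KG_PER_KWH = 0.495  # assumed grid coefficient, kg CO2 per kWh

def annual_co2_kg(annual_kwh, factor=EMISSION_FACTOR_KG_PER_KWH):
    """Convert annual electricity use (kWh) into kg of CO2 emitted."""
    return annual_kwh * factor

# Under this assumed coefficient, a measure that trims 1,259,000 kWh per
# year would avoid roughly 623 tonnes of CO2 annually:
print(f"{annual_co2_kg(1_259_000):,.0f} kg CO2/year")
```

The FES software's role, as the abstract reads, is to produce the annual kWh figure per subsystem; the conversion to emissions is then a per-country coefficient lookup like the one above.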
9. Detection and quantification of Babesia species intraerythrocytic parasites by flow cytometry.
- Author
-
Vanderboom, Patrick M, Misra, Anisha, Rodino, Kyle G, Eberly, Allison R, Greenwood, Jason D, Morris, Heather E, Norrie, Felicity C, Fernholz, Emily C, Pritt, Bobbi S, and Norgan, Andrew P
- Subjects
- *
BABESIA , *FLOW cytometry , *MACHINE learning , *ERYTHROCYTES , *POLYMERASE chain reaction , *BLOOD parasites - Abstract
Objectives Recent work has demonstrated that automated fluorescence flow cytometry (FLC) is a potential alternative for the detection and quantification of Plasmodium parasites. The objective of this study was to apply this novel FLC method to detect and quantify Babesia parasites in venous blood and compare results to light microscopy and polymerase chain reaction methods. Methods An automated hematology/malaria analyzer (XN-31; Sysmex) was used to detect and quantify B. microti-infected red blood cells from residual venous blood samples (n = 250: Babesia positive, n = 170; Babesia negative, n = 80). As no instrument software currently exists for Babesia, qualitative and quantitative machine learning (ML) algorithms were developed to facilitate analysis. Results Performance of the ML models was verified against the XN-31 software using P. falciparum-infected samples. When applied to Babesia-infected samples, the qualitative ML model demonstrated an area under the curve (AUC) of 0.956 (sensitivity, 95.9%; specificity, 83.3%) relative to polymerase chain reaction. For valid scattergrams, the qualitative model achieved an AUC of 1.0 (sensitivity and specificity, 100%), while the quantitative model demonstrated an AUC of 0.986 (sensitivity, 94.4%; specificity, 100%). Conclusions This investigation demonstrates that Babesia parasites can be detected and quantified directly from venous blood using FLC. Although promising, opportunities remain to improve the general applicability of the method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
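The sensitivity and specificity figures quoted above come from a standard confusion-matrix calculation, sketched here with illustrative counts (not the study's data).

```python
# Minimal sketch of the performance metrics quoted above, computed from a
# confusion matrix. The counts are illustrative, not the study's data.

def sensitivity(tp, fn):
    """True-positive rate: fraction of infected samples flagged positive."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: fraction of uninfected samples flagged negative."""
    return tn / (tn + fp)

# Illustrative counts for a 250-sample run (170 positive, 80 negative):
tp, fn = 163, 7     # 163/170 detected -> sensitivity ~ 0.959
tn, fp = 67, 13     # 67/80 cleared    -> specificity ~ 0.838
print(round(sensitivity(tp, fn), 3), round(specificity(tn, fp), 3))
```

AUC, also reported in the abstract, additionally sweeps the decision threshold; the two rates above correspond to one operating point on that curve.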
10. Scaling and merging macromolecular diffuse scattering with mdx2.
- Author
-
Meisburger, Steve P. and Ando, Nozomi
- Subjects
- *
ELECTRONIC data processing , *MACROMOLECULAR dynamics , *DATA reduction , *PYTHON programming language , *CRYSTALLOGRAPHY , *ELECTRON density - Abstract
Diffuse scattering is a promising method to gain additional insight into protein dynamics from macromolecular crystallography experiments. Bragg intensities yield the average electron density, while the diffuse scattering can be processed to obtain a three‐dimensional reciprocal‐space map that is further analyzed to determine correlated motion. To make diffuse scattering techniques more accessible, software for data processing called mdx2 has been created that is both convenient to use and simple to extend and modify. mdx2 is written in Python, and it interfaces with DIALS to implement self‐contained data‐reduction workflows. Data are stored in NeXus format for software interchange and convenient visualization. mdx2 can be run on the command line or imported as a package, for instance to encapsulate a complete workflow in a Jupyter notebook for reproducible computing and education. Here, mdx2 version 1.0 is described, a new release incorporating state‐of‐the‐art techniques for data reduction. The implementation of a complete multi‐crystal scaling and merging workflow is described, and the methods are tested using a high‐redundancy data set from cubic insulin. It is shown that redundancy can be leveraged during scaling to correct systematic errors and obtain accurate and reproducible measurements of weak diffuse signals. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. HEIDI: an experiment‐management platform enabling high‐throughput fragment and compound screening.
- Author
-
Metz, A., Stegmann, D. P., Panepucci, E. H., Buehlmann, S., Huang, C.-Y., McAuley, K. E., Wang, M., Wojdyla, J. A., Sharpe, M. E., and Smith, K. M. L.
- Subjects
- *
SOFTWARE architecture , *LIGHT sources , *DATA management , *USER experience , *APPLICATION program interfaces , *ACQUISITION of data - Abstract
The Swiss Light Source facilitates fragment‐based drug‐discovery campaigns for academic and industrial users through the Fast Fragment and Compound Screening (FFCS) software suite. This framework is further enriched by the option to utilize the Smart Digital User (SDU) software for automated data collection across the PXI, PXII and PXIII beamlines. In this work, the newly developed HEIDI webpage (https://heidi.psi.ch) is introduced: a platform crafted using state‐of‐the‐art software architecture and web technologies for sample management of rotational data experiments. The HEIDI webpage features a data‐review tab for enhanced result visualization and provides programmatic access through a representational state transfer application programming interface (REST API). The migration of the local FFCS MongoDB instance to the cloud is highlighted and detailed. This transition ensures secure, encrypted and consistently accessible data through a robust and reliable REST API tailored for the FFCS software suite. Collectively, these advancements not only significantly elevate the user experience, but also pave the way for future expansions and improvements in the capabilities of the system. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. Rules as code and the rule of law: ensuring effective judicial review of administration by software.
- Author
-
Kennedy, Rónán
- Subjects
- *
JUDICIAL review , *RULE of law , *SEPARATION of powers , *SYSTEMS software , *LEGISLATION drafting - Abstract
This paper considers the possible benefits and substantial risks of 'Rules as Code', the parallel drafting of legislation and its codification in software, which has been the subject of attention from policy-makers and pilot studies in some jurisdictions. It highlights the long history of these approaches, and the challenges of ossification, mis-translation of rules, and separation-of-powers problems. It also examines in detail the Australian Pintarich case, which demonstrates the inadequacy of conventional judicial review of automated decision-making. It outlines some possible solutions to these issues — two 'internal' to development processes (greater transparency, and literate pair programming) and two 'external' (expanding the capacity of judicial review to look beyond a specific citizen/state interaction and consider the design and development of the controlling software system, and greater cross-disciplinary awareness by lawyers). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
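What "Rules as Code" means in practice can be illustrated with a toy example: a statutory eligibility rule drafted in parallel as prose and as executable logic. The rule below is wholly invented for illustration.

```python
# Toy illustration of 'Rules as Code': a (wholly invented) eligibility rule
# encoded as executable logic, with each branch traceable to a clause so
# that an automated decision can be reviewed against the drafted text.

def eligible_for_rebate(age, annual_income, is_resident):
    """Invented rule: residents aged 65+ with income under 30,000 qualify."""
    # s.1(a): the applicant must be a resident
    if not is_resident:
        return False
    # s.1(b): the applicant must have reached pension age
    if age < 65:
        return False
    # s.1(c): income must fall below the threshold
    return annual_income < 30_000

print(eligible_for_rebate(age=70, annual_income=25_000, is_resident=True))
```

The paper's concern is precisely the gap this toy hides: real statutes carry discretion and ambiguity that a branch like `s.1(c)` cannot capture, which is why it argues judicial review must reach the design of the controlling software, not just the individual decision.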
13. Use of Digital Tools for the Assessment of Food Consumption in Brazil: A Scoping Review.
- Author
-
Silva, Adriane dos Santos da, Brito, Flávia dos Santos Barbosa, Santos, Debora Martins dos, and Adegboye, Amanda Rodrigues Amorim
- Abstract
This is a scoping review on mapping the use of digital tools to assess food consumption in Brazil. Searches were carried out in nine electronic databases (Medline, Lilacs, Scopus, Embase, Web of Science, Science Direct, Ovid, Free Medical Journal and Crossref) to select studies published from October 2020 to December 2023. This review identified forty-eight digital tools in the 94 publications analyzed, the most frequent being web-based technologies (60%) and mobile devices (40%). Among these studies, 55% (n = 52) adopted a population-based approach, while 45% (n = 42) focused on specific regions. The predominant study design observed was cross-sectional (n = 63). A notable trend observed was the increasing frequency of validation studies in recent years. Although the use of digital tools in the assessment of food consumption in Brazil has grown in recent years, studies did not describe the process of creating and validating the tools, which would contribute to the improvement of data quality. Investments that allow the expansion of the use of the internet and mobile devices; the improvement of digital literacy; and the development of open-access tools, especially in the North and Northeast regions, are challenges that require a concerted effort towards providing equal opportunities, fostering encouragement, and delving deeper into the potential of digital tools within studies pertaining to food consumption in Brazil. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. INTELLIGENT CLASSROOM NOTE-TAKING APPLICATION SOFTWARE WITH HIGHER PERFORMANCE.
- Author
-
LI ZHUANG
- Subjects
NOTETAKING ,TEACHER-student relationships ,DATABASES ,CLASSROOMS ,DESIGN software - Abstract
In multimedia teaching, learners commonly miss important notes and fail to summarize and organize course content in a timely manner, resulting in lower learning efficiency. Aim: This paper designs an intelligent classroom note-taking application that combines traditional note-taking with the internet, based on a needs analysis of both teachers and learners. The software uses the single shot multibox detector (SSD) and MobileNet to build a network platform, and establishes a MySQL-based database. It has been developed using various intelligent algorithms and technologies, and includes modules for learning notes, searching, recognition, and recording. Through Testin and usage testing by learners and teachers, the proposed software has been shown to effectively recognize and record learning content, ensuring that the teacher's knowledge points are recorded and expanded, ultimately improving students' learning efficiency. Based on the current state of classroom note-taking, this paper discusses learners' awareness of note-taking, identifies the problems of note-taking in the classroom, and summarizes the design ideas and basic requirements of a classroom note-taking application. The design process for the application is proposed, and the design and development of the software are completed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Integration of GeoGebra Calculator 3D with Augmented Reality in Mathematics Education for an Immersive Learning Experience.
- Author
-
Iparraguirre-Villanueva, Orlando, Paulino-Moreno, Cleoge, Chero-Valdivieso, Henry, Espinola-Linares, Karina, and Cabanillas-Carbonell, Michael
- Subjects
MATHEMATICS education ,AUGMENTED reality ,CALCULATORS ,MATHEMATICS students ,MATHEMATICS ,EXPERIMENTAL groups ,SOFTWARE visualization - Abstract
The use of augmented reality (AR) with GeoGebra allows for the contextualization of mathematical operations in real-world situations. In this approach, the teacher presents questions or problems that students solve using visualization and experimentation software. The objective of this work is to evaluate the impact of integrating the GeoGebra 3D calculator with AR. The study employed a quasi-experimental method, comparing results between two groups, an experimental group (EG) and a control group (CG), with a population of 78 students. The study confirms that the GeoGebra 3D calculator with AR effectively enhances mathematical learning. Seventy percent of the students in the EG achieved an outstanding level of performance, while 30% reached an expected level; in addition, a positive attitude towards mathematics was observed in 100% of the students. In the CG, by contrast, 10% achieved the expected level of performance, 85% were in progress, and 5% were at the initial stage. These results demonstrate that using the GeoGebra 3D calculator with AR has a positive impact on mathematics learning. Finally, it was concluded that the GeoGebra 3D calculator with AR is very useful: it enhances the teaching and learning (TL) of mathematics and motivates students, making class sessions more dynamic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Contrast‐agent‐based perfusion MRI code repository and testing framework: ISMRM Open Science Initiative for Perfusion Imaging (OSIPI).
- Author
-
van Houdt, Petra J, Ragunathan, Sudarshan, Berks, Michael, Ahmed, Zaki, Kershaw, Lucy E, Gurney‐Champion, Oliver J, Tadimalla, Sirisha, Arvidsson, Jonathan, Sun, Yu, Kallehauge, Jesper, Dickie, Ben, Lévy, Simon, Bell, Laura, Sourbron, Steven, Thrippleton, Michael J, Johansen, Ole Gunnar, Orton, Matthew, Welch, Brian, Smith, David, and Wardlaw, Joanna M.
- Subjects
PERFUSION imaging ,OPEN scholarship ,MAGNETIC resonance imaging ,PERFUSION ,INSTITUTIONAL repositories - Abstract
Purpose: Software has a substantial impact on quantitative perfusion MRI values. The lack of generally accepted implementations, code sharing and transparent testing reduces reproducibility, hindering the use of perfusion MRI in clinical trials. To address these issues, the ISMRM Open Science Initiative for Perfusion Imaging (OSIPI) aimed to establish a community‐led, centralized repository for sharing open‐source code for processing contrast‐based perfusion imaging, incorporating an open‐source testing framework. Methods: A repository was established on the OSIPI GitHub website. Python was chosen as the target software language. Calls for code contributions were made to OSIPI members, the ISMRM Perfusion Study Group, and publicly via OSIPI websites. An automated unit‐testing framework was implemented to evaluate the output of code contributions, including visual representation of the results. Results: The repository hosts 86 implementations of perfusion processing steps contributed by 12 individuals or teams. These cover all core aspects of DCE‐ and DSC‐MRI processing, including multiple implementations of the same functionality. Tests were developed for 52 implementations, covering five analysis steps. For T1 mapping, signal‐to‐concentration conversion and population AIF functions, different implementations resulted in near‐identical output values. For the five pharmacokinetic models tested (Tofts, extended Tofts‐Kety, Patlak, two‐compartment exchange, and two‐compartment uptake), differences in output parameters were observed between contributions. Conclusions: The OSIPI DCE‐DSC code repository represents a novel community‐led model for code sharing and testing. The repository facilitates the re‐use of existing code and the benchmarking of new code, promoting enhanced reproducibility in quantitative perfusion imaging. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
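One of the pharmacokinetic models named in the abstract, the standard Tofts model, has a well-known closed form: C_t(t) = Ktrans · ∫₀ᵗ Cp(τ) · exp(−(Ktrans/ve)(t − τ)) dτ. The sketch below is an independent reference implementation via discrete convolution, not OSIPI repository code; the arterial input function is a toy curve.

```python
# Independent sketch of the (standard) Tofts model named above:
#   C_t(t) = Ktrans * integral_0^t Cp(tau) * exp(-(Ktrans/ve)*(t-tau)) dtau
# implemented as a discrete convolution. Not OSIPI repository code.
import numpy as np

def tofts(t, cp, ktrans, ve):
    """Tissue concentration from the AIF cp(t) by discrete convolution."""
    dt = t[1] - t[0]                          # assumes uniform sampling
    irf = ktrans * np.exp(-(ktrans / ve) * t)  # impulse response function
    return np.convolve(cp, irf)[: len(t)] * dt

t = np.arange(0, 5, 0.01)               # time in minutes
cp = np.exp(-t) * (t > 0.5)             # toy arterial input function
ct = tofts(t, cp, ktrans=0.25, ve=0.3)  # Ktrans in 1/min, ve unitless
print(round(float(ct.max()), 4))
```

The abstract's finding that different contributed implementations of such models disagree is plausible given choices like the integration scheme above (rectangle rule via convolution versus trapezoidal or matrix formulations).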
17. ProSeq4: A user‐friendly multiplatform program for preparation and analysis of large‐scale DNA polymorphism datasets.
- Author
-
Filatov, Dmitry A.
- Abstract
Preparation of DNA polymorphism datasets for analysis is an important step in evolutionary genetic and molecular ecology studies. Ever‐growing dataset sizes make this step time consuming, but few convenient software tools are available to facilitate processing of large‐scale datasets comprising thousands of sequence alignments. Here I report "processor of sequences v4" (proSeq4)—a user‐friendly multiplatform software package for the preparation and evolutionary genetic analysis of genome‐ or transcriptome‐scale sequence polymorphism datasets. The program has an easy‐to‐use graphical user interface and is designed to process and analyse many thousands of datasets. It supports over two dozen file formats and includes a flexible sequence editor and various tools for data visualization, quality control and the most commonly used evolutionary genetic analyses, such as NJ phylogeny reconstruction, DNA polymorphism analyses and coalescent simulations. Command-line tools (e.g. vcf2fasta) are also provided for easier integration into bioinformatic pipelines. Apart from molecular ecology and evolution research, proSeq4 may be useful for teaching, e.g. for visual illustration of the different shapes of phylogenies generated by coalescent simulations under different scenarios. ProSeq4 source code and binaries for Windows, MacOS and Ubuntu are available from https://sourceforge.net/projects/proseq/. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
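One of the DNA polymorphism analyses mentioned above, nucleotide diversity (π), is the average pairwise proportion of differing sites in an alignment. The sketch below is an independent illustration with toy sequences, not proSeq4 code.

```python
# Hedged sketch of nucleotide diversity (pi): the mean pairwise proportion
# of differing sites across sequences in an alignment. Toy data; this is
# an independent illustration, not proSeq4 code.
from itertools import combinations

def nucleotide_diversity(alignment):
    """Mean pairwise difference per site (assumes equal-length sequences)."""
    n_pairs, total = 0, 0.0
    for a, b in combinations(alignment, 2):
        diffs = sum(1 for x, y in zip(a, b) if x != y)
        total += diffs / len(a)
        n_pairs += 1
    return total / n_pairs

seqs = ["ACGTACGT",
        "ACGTACGA",
        "ACGAACGT"]
print(round(nucleotide_diversity(seqs), 4))  # three pairs, mean of 1/8, 1/8, 2/8
```

Real tools also handle gaps, ambiguity codes and sliding windows, which this sketch deliberately omits.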
18. GbyE: an integrated tool for genome widely association study and genome selection based on genetic by environmental interaction.
- Author
-
Liu, Xinrui, Wang, Mingxiu, Qin, Jie, Liu, Yaxin, Wang, Shikai, Wu, Shiyu, Zhang, Ming, Zhong, Jincheng, and Wang, Jiabo
- Subjects
- *
GENOMES , *KRONECKER products , *GENETIC markers , *CHROMOSOMES , *STATISTICAL power analysis , *GENOTYPE-environment interaction , *GENETIC correlations - Abstract
Background: The growth and development of organisms depend on genetic effects, environmental effects, and their interaction. In recent decades, many candidate additive genetic markers and genes have been detected using genome-wide association studies (GWAS). However, limited by computing power and practical tools, the interactive effects of markers and genes have not been revealed clearly, and these interactive markers are difficult to use in breeding and prediction, such as genomic selection (GS). Results: Through the Power-FDR curve, the GbyE algorithm detects more significant genetic loci at different levels of genetic correlation and heritability, especially at low heritability levels. The additive effect of GbyE exhibits high significance on certain chromosomes, while the interactive effect detects more significant sites on other chromosomes that were not detected in the first two parts. In prediction accuracy testing, under most combinations of heritability and genetic correlation, the prediction accuracy of GbyE is significantly higher than that of the mean method, regardless of whether the rrBLUP model or the BGLR model is used. The GbyE algorithm improves the prediction accuracy of the three Bayesian models BRR, BayesA, and BayesLASSO using information from genetic-by-environment interaction (G × E), increasing prediction accuracy by 9.4%, 9.1%, and 11%, respectively, relative to the mean value method. The GbyE algorithm is significantly superior to the mean method in the absence of a single environment, regardless of the combination of heritability and genetic correlation, especially in the case of high genetic correlation and heritability. Conclusions: This study therefore constructed a new genotype design model program (GbyE) for GWAS and GS using the Kronecker product, which is able to estimate the additive and interactive effects separately and clearly. The results showed that GbyE can provide higher statistical power for GWAS and greater prediction accuracy for GS models. In addition, GbyE gives varying degrees of improvement in prediction accuracy in three Bayesian models (BRR, BayesA, and BayesCpi). Whether phenotypes are missing in a single environment or in multiple environments, GbyE also makes better predictions for the inference population set. This study helps us understand the interactive relationship between the genome and the environment in complex traits. The GbyE source code is available at the GitHub website (https://github.com/liu-xinrui/GbyE). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
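The Kronecker-product design idea mentioned in the conclusions can be sketched as follows, as I read the abstract: expand an individuals-by-markers genotype matrix across environments so that additive and environment-specific interaction effects occupy separate columns. The construction below is an illustrative guess at that scheme, not GbyE's actual code.

```python
# Illustrative sketch (not GbyE code) of a Kronecker-product G x E design:
# expand an n x m genotype matrix across environments so additive and
# interaction effects get separate column blocks.
import numpy as np

def gxe_design(genotypes, n_env):
    """Return [additive | per-environment interaction] design matrix."""
    # Additive block: the same genotype codes repeated for every environment.
    additive = np.tile(genotypes, (n_env, 1))
    # Interaction block: Kronecker product of environment indicators with G,
    # giving each environment its own copy of the marker columns.
    interaction = np.kron(np.eye(n_env), genotypes)
    return np.hstack([additive, interaction])

G = np.array([[0, 1, 2],
              [2, 0, 1]])      # 2 individuals x 3 markers (0/1/2 allele dosage)
X = gxe_design(G, n_env=2)
print(X.shape)                 # 3 additive columns + 2*3 interaction columns
```

Fitting a linear model on such a matrix yields separate coefficient blocks for additive and interaction effects, which matches the abstract's claim that the two can be estimated separately.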
19. PyPop: a mature open-source software pipeline for population genomics.
- Author
-
Lancaster, Alexander K., Single, Richard M., Mack, Steven J., Sochat, Vanessa, Marian, Michael P., and Webster, Gordon D.
- Subjects
GENOMICS ,POPULATION genetics ,LINKAGE disequilibrium ,IMMUNOGENETICS ,SCIENTIFIC community - Abstract
Python for Population Genomics (PyPop) is a software package that processes genotype and allele data and performs large-scale population genetic analyses on highly polymorphic multi-locus genotype data. In particular, PyPop tests data conformity to Hardy-Weinberg equilibrium expectations, performs Ewens-Watterson tests for selection, estimates haplotype frequencies, measures linkage disequilibrium, and tests significance. Standardized means of performing these tests is key for contemporary studies of evolutionary biology and population genetics, and these tests are central to genetic studies of disease association as well. Here, we present PyPop 1.0.0, a new major release of the package, which implements new features using the more robust infrastructure of GitHub, and is distributed via the industry-standard Python Package Index. New features include implementation of the asymmetric linkage disequilibrium measures and, of particular interest to the immunogenetics research communities, support for modern nomenclature, including colon-delimited allele names, and improvements to meta-analysis features for aggregating outputs for multiple populations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Preparation of a Computer Software Program for the Feasibility Study of Livestock Enterprises.
- Author
-
MUNDAN, Durhasan and MUNDAN, İbrahim Talha
- Abstract
This study was carried out with the aim of developing a software program that enables breeders to decide easily during the preparation of feasibility studies for livestock enterprises. For this purpose, 63 enterprises in the Gaziantep and Sanliurfa provinces of Turkey were visited between 2021 and 2022, and all the data obtained were evaluated. The C# programming language was used to develop the software, and a Microsoft SQL Server database was used to store the data. The feasibility program performs productivity checks for enterprises and their personnel, and can be used easily by enterprises from small to large capacity. Cost calculations are not included in the program due to the economic conditions of the market. As a result, this program, which was prepared taking software engineering techniques into account, will provide great advantages and conveniences for enterprises. Risk factors will be determined and alternatives will be presented by this software program, which performs enterprise efficiency testing. It was concluded that this software will be a program that breeders can prefer, since it can be used on all computers and offers different alternatives in establishing enterprises. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Clinical validation of a deep-learning-based bone age software in healthy Korean children.
- Author
-
Hyo-Kyoung Nam, Winnah Wu-In Lea, Zepa Yang, Eunjin Noh, Young-Jun Rhie, Kee-Hyoung Lee, and Suk-Joo Hong
- Abstract
Purpose: Bone age (BA) is needed to assess developmental status and growth disorders. We evaluated the clinical performance of a deep-learning-based BA software to estimate the chronological age (CA) of healthy Korean children. Methods: This retrospective study included 371 healthy children (217 boys, 154 girls), aged between 4 and 17 years, who visited the Department of Pediatrics for health check-ups between January 2017 and December 2018. A total of 553 left-hand radiographs from 371 healthy Korean children were evaluated using a commercial deep-learning-based BA software (BoneAge, Vuno, Seoul, Korea). The clinical performance of the deep learning (DL) software was determined using the concordance rate and Bland-Altman analysis via comparison with the CA. Results: A 2-sample t-test (P<0.001) and Fisher exact test (P=0.011) showed a significant difference between the normal CA and the BA estimated by the DL software. There was good correlation between the 2 variables (r=0.96, P<0.001); however, the root mean square error was 15.4 months. With a 12-month cutoff, the concordance rate was 58.8%. The Bland-Altman plot showed that the DL software tended to underestimate the BA compared with the CA, especially in children under the age of 8.3 years. Conclusion: The DL-based BA software showed a low concordance rate and a tendency to underestimate the BA in healthy Korean children. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
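The evaluation described in the record above combines a concordance rate (agreement within a 12-month cutoff) with Bland-Altman analysis. A minimal Python sketch of those two computations (the BA/CA values below are made-up illustrations, not the study's data or the BoneAge vendor code):

```python
import statistics

def agreement(ba_months, ca_months, cutoff=12.0):
    """Bland-Altman bias, 95% limits of agreement, and concordance rate
    between estimated bone age (BA) and chronological age (CA), in months."""
    diffs = [b - c for b, c in zip(ba_months, ca_months)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)                 # sample SD of BA - CA
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
    rate = sum(abs(d) <= cutoff for d in diffs) / len(diffs)
    return bias, loa, rate

# Hypothetical BA/CA pairs: BA underestimates CA in younger children,
# mirroring the pattern the study reports.
ca = [60, 72, 84, 96, 120, 144, 168, 180]
ba = [40, 57, 70, 86, 114, 149, 176, 182]
bias, loa, rate = agreement(ba, ca)
```

A negative bias together with a modest concordance rate is exactly the pattern behind the study's conclusion that the software tends to underestimate bone age.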
22. slimr: An R package for tailor‐made integrations of data in population genomic simulations over space and time.
- Author
-
Dinnage, Russell, Sarre, Stephen D., Duncan, Richard P., Dickman, Christopher R., Edwards, Scott V., Greenville, Aaron C., Wardle, Glenda M., and Gruber, Bernd
- Subjects
- *
SIMULATION software , *BIOLOGISTS , *PACKAGING design , *GENOMICS , *DATA analysis , *LANDSCAPE assessment - Abstract
Software for realistically simulating complex population genomic processes is revolutionizing our understanding of evolutionary processes, and providing novel opportunities for integrating empirical data with simulations. However, the integration between standalone simulation software and R is currently not well developed. Here, we present slimr, an R package designed to create a seamless link between standalone software SLiM >3.0, one of the most powerful population genomic simulation frameworks, and the R development environment, with its powerful data manipulation and analysis tools. We show how slimr facilitates smooth integration between genetic data, ecological data and simulation in a single environment. The package enables pipelines that begin with data reading, cleaning and manipulation, proceed to constructing empirically based parameters and initial conditions for simulations, then to running numerical simulations and finally to retrieving simulation results in a format suitable for comparisons with empirical data – aided by advanced analysis and visualization tools provided by R. We demonstrate the use of slimr with an example from our own work on the landscape population genomics of desert mammals, highlighting the advantage of having a single integrated tool for both data analysis and simulation. slimr makes the powerful simulation ability of SLiM directly accessible to R users, allowing integrated simulation projects that incorporate empirical data without the need to switch between software environments. This should provide more opportunities for evolutionary biologists and ecologists to use realistic simulations to better understand the interplay between ecological and evolutionary processes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Earl Grey: A Fully Automated User-Friendly Transposable Element Annotation and Analysis Pipeline.
- Author
-
Baril, Tobias, Galbraith, James, and Hayward, Alex
- Subjects
EUKARYOTIC genomes ,DROSOPHILA melanogaster ,PIPELINE failures ,ANNOTATIONS ,MODULAR construction ,QUALITY control ,PRODUCTION standards - Abstract
Transposable elements (TEs) are major components of eukaryotic genomes and are implicated in a range of evolutionary processes. Yet, TE annotation and characterization remain challenging, particularly for nonspecialists, since existing pipelines are typically complicated to install, run, and extract data from. Current methods of automated TE annotation are also subject to issues that reduce overall quality, particularly (i) fragmented and overlapping TE annotations, leading to erroneous estimates of TE count and coverage, and (ii) repeat models represented by short sections of total TE length, with poor capture of 5′ and 3′ ends. To address these issues, we present Earl Grey, a fully automated TE annotation pipeline designed for user-friendly curation and annotation of TEs in eukaryotic genome assemblies. Using nine simulated genomes and an annotation of Drosophila melanogaster, we show that Earl Grey outperforms current widely used TE annotation methodologies in ameliorating the issues mentioned above while scoring highly in benchmarking for TE annotation and classification and being robust across genomic contexts. Earl Grey provides a comprehensive and fully automated TE annotation toolkit that provides researchers with paper-ready summary figures and outputs in standard formats compatible with other bioinformatics tools. Earl Grey has a modular format, with great scope for the inclusion of additional modules focused on further quality control and tailored analyses in future releases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. A Comparison of Smartphone-assisted and Computer Software Assisted Tracing with the Conventional Manual Method.
- Author
-
Kastury, Siddhartha, Kancherla, Pavan, Reddy, Sateesh Kumar, and Chalasani, Srikrishna
- Subjects
SMARTPHONES ,COMPUTER software ,LINEAR statistical models ,COMPUTER systems ,ORTHODONTICS - Abstract
Introduction: In view of the increased use of computers and mobile technologies in orthodontics, there is a need to evaluate the accuracy of commercially available cephalometric software. Aim: This study compares the accuracy of computer-assisted tracing and smartphone tracing with the manual method of tracing on the pre-treatment lateral cephalograms of orthodontic patients. Materials and Methods: A total of 100 cephalograms, both digital and hard copies, were collected from the records available in the department archives. Hard and soft copies were used to obtain values for 17 parameters for comparison between the groups (Manual, NemoCeph®, AutoCeph®, CephNinja®, OneCeph®). Results: There was no statistically significant difference between the groups for SNA, SNB, ANB, FMA, SN-GoGn, U1-NA (angular and linear), L1-NB (angular and linear), IMPA and the interincisal angle (p=0.798, 0.583, 0.895, 0.059, 0.140, 0.680, 0.161, 0.327, 0.906, 0.940, 0.789). A significant difference was seen for the parameters Eff-Mx and Mn lengths and their differential, L1-occlusal plane, WITS and NLA (p=0.000, 0.004, 0.018, 0.004, 0.025), suggesting that software tracing may be used instead of the manual method. Conclusions: All the software packages and applications (NemoCeph®, AutoCeph®, CephNinja®, OneCeph®) performed satisfactorily and can be used instead of manual tracing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
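The group comparisons behind the p-values reported in the record above are one-way ANOVAs. A minimal stdlib-only Python sketch (with hypothetical SNA angle measurements, not the study's data) shows how the F statistic is formed from between-group and within-group variation:

```python
import statistics

def oneway_f(groups):
    """One-way ANOVA F statistic for a list of measurement groups
    (e.g. the same cephalometric angle traced by different methods)."""
    all_values = [x for g in groups for x in g]
    grand_mean = statistics.fmean(all_values)
    k, n = len(groups), len(all_values)
    ss_between = sum(len(g) * (statistics.fmean(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - statistics.fmean(g)) ** 2
                    for g in groups for x in g)
    ms_between = ss_between / (k - 1)   # between-group mean square
    ms_within = ss_within / (n - k)     # within-group mean square
    return ms_between / ms_within

# Hypothetical SNA angles (degrees) traced manually and with two programs.
manual = [82, 81, 83]
prog_a = [82, 82, 84]
prog_b = [80, 81, 82]
f_stat = oneway_f([manual, prog_a, prog_b])
```

The resulting F statistic is then compared against an F(k-1, n-k) reference distribution to obtain a p-value; a non-significant result, as for most parameters in the study, indicates the tracing methods agree.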
25. GMOCU: Digital Documentation, Management, and Biological Risk Assessment of Genetic Parts.
- Author
-
Wagner, Christoph, Urquiza‐Garcia, Uriel, Zurbriggen, Matias D., and Beyer, Hannes M.
- Subjects
SYNTHETIC biology ,GENOME editing ,RISK assessment ,TRANSGENIC organisms ,MOLECULAR evolution ,DOCUMENTATION - Abstract
The continuous evolution of molecular biology and gene synthesis methods, paired with the ever-increasing potential of synthetic biology approaches and genome engineering toolkits, enables the rapid design of genetic bioparts and genetically modified organisms. Although various software solutions assist with specific design tasks and challenges, lab-internal documentation and ensuring compliance with governmental regulations on biosafety assessment of the generated organisms remain the responsibility of individual academic researchers. This results in inconsistent and redundant documentation regimes and a significant time and labor burden. GMOCU (GMO documentation) is a standardized, semi-automatic, user-oriented software approach, written in Python and freely available, that unifies lab-internal data documentation on genetic parts and genetically modified organisms (GMOs). It automates biological risk evaluations and maintains a shared, up-to-date inventory of bioparts for team-wide data navigation and sharing. GMOCU further enables data export into customizable formats suitable for scientific publications, official biosafety documents, and the research community. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. The Impact of Using Online Learning Software During the Covid-19 Pandemic in Higher Education.
- Author
-
Natalie, Regina Yoantika, Mahendika, Devin, Salong, Amjad, Shanshan Xu, and Aji, Lexi Jalu
- Subjects
COVID-19 pandemic ,HIGHER education ,LEARNING ,DISTANCE education ,EDUCATION software - Abstract
The covid-19 pandemic had a major effect on the smooth running of education, including higher education, so lecturers and students needed supporting media, such as software, to keep the learning process running smoothly. The use of software can make it easier for students to carry out the online learning process. This study aims to reveal the effect of using software on students carrying out the distance learning process. The researchers used quantitative methods, distributing questionnaires to students to observe the effect of software as a supporting medium in the learning process. The results show that software-based learning media were very helpful and had a good impact on students in the learning process during the covid-19 pandemic. The study concludes that education during the covid-19 pandemic was severely constrained because students could not interact directly with lecturers in the learning process; students and lecturers therefore needed media, such as software, to support smooth discussion, learning, and teaching in higher education. A limitation of this study is that the researchers did not obtain fully accurate data during data collection; future researchers are therefore encouraged to conduct the same research so that the results remain relevant when analyzing the use of software in higher education. The study also recommends that future researchers treat this topic as an ongoing discussion to obtain relevant and effective results, so that this research can become a benchmark for the influence of software as a learning medium during the covid-19 pandemic and support a learning process grounded in science and technology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Justification of E-Argumentation Software based on a Needs Analysis in Education Context.
- Author
-
GÜNEŞ, Erhan, ÜSTÜNDAĞ, Mutlu Tahsin, YAVUZALP, Nuh, and BAHÇİVAN, Eralp
- Subjects
NEEDS assessment ,SCIENCE education ,SOFTWARE development tools ,COMPUTER software ,TEACHER-student relationships - Abstract
Argumentation can be defined as a process in which claim, data, justification and supports, which are considered the basic building blocks of an argument, are connected together in a meaningful way. Especially in Science Education, argumentation method is known to have positive contributions to the learning-teaching processes. Today, there is much opportunity to integrate digital tools or software in argumentation processes for better learning outcomes. The literature points out the difficulties experienced by teachers and learners in the argumentation processes and emphasizes that digital tools or software can offer solutions to these problems. In this context, a wide variety of software is used to support argumentation processes in education more effectively and easily. The aim of this study is to examine existing argumentation software and to determine the features of a new "E-Argumentation" software, which is supposed to be a better and contemporary solution for argumentation processes, based on a needs analysis. Existing argumentation software is not rich in terms of multimedia usage and not compatible with group work, which is important in argumentation processes, or with three argumentation approaches in the literature. Furthermore, existing software has serious shortcomings in terms of usability and educational value. As a result, it is clear that there is a need for argumentation software which is compatible with current technologies, pedagogically useful, and has high level of usability and accessibility. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. A new GNSS-acoustic positioning software implementing multiple positioning functions considering nadir total delays.
- Author
-
Tomita, Fumiaki and Kido, Motoyuki
- Subjects
- *
MARKOV chain Monte Carlo , *SPEED of sound , *GEODETIC observations , *SUBDUCTION zones , *ARTIFICIAL satellites in navigation - Abstract
Global navigation satellite system-acoustic (GNSS-A) positioning is an important geodetic observation technique for detecting seafloor crustal deformation. After the 2011 Tohoku–Oki earthquake, GNSS-A observational networks were extended along various subduction zones, and observational systems have been improved, especially sea surface platforms, such as through the introduction of an unmanned vehicle, the Wave Glider. This development of GNSS-A observations has provided a large amount of observational data. Furthermore, GNSS-A positioning methods that consider the lateral heterogeneity of the sound speed structure were recently developed. It is therefore important to develop software that makes the latest GNSS-A positioning methods easy to adopt widely. However, there is currently only one open-source GNSS-A positioning software package, which may hinder the entry of various researchers into GNSS-A positioning analyses. Here, we developed a new GNSS-A positioning software package, henceforth called "SeaGap" (Software of enhanced analyses for GNSS-acoustic positioning), which executes various positioning methods, from the conventional kinematic positioning technique to the latest Markov chain Monte Carlo (MCMC)-based static positioning technique. We introduce their methodology and demonstrate its application to actual observational data. The software newly adds optional prior distributions on the unknown parameters expressing the heterogeneity of the sound speed structure to the MCMC-based static positioning method, and we also applied this new method to actual observational data. In addition to the positioning functions, the software contains various auxiliary functions, including drawing. The software is written in the "Julia" language and is distributed as open-source software. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
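The MCMC machinery behind the static positioning method in the record above can be illustrated with a minimal random-walk Metropolis sampler. This is a generic Python sketch of the idea, not SeaGap's Julia implementation; the one-dimensional toy posterior merely stands in for an unknown positioning parameter:

```python
import math
import random

def metropolis(log_post, x0, n_samples=6000, step=0.8, seed=1):
    """Random-walk Metropolis: draw samples from a 1-D posterior given
    only its log-density (up to an additive constant)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)            # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept w.p. min(1, ratio)
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy posterior: standard normal log-density. An optional prior on a
# parameter (e.g. a sound-speed-heterogeneity term) would simply add its
# log-density to this function.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
post_mean = sum(draws) / len(draws)
```

Posterior means and credible intervals for the position parameters are then read off the retained samples, which is what an MCMC-based static positioning yields in place of a single least-squares estimate.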
29. A Graphical Interface to Support Low-Flow Volatile Anesthesia: Implications for Patient Safety, Teaching, and Design of Anesthesia Information Management Systems.
- Author
-
Xie, James, Jablonski, Megan, Smith, Joan, and Navedo, Andres
- Subjects
- *
MEDICAL information storage & retrieval systems , *GRAPHICAL user interfaces , *PATIENT safety , *COMPUTER science , *SOFTWARE analytics , *INFORMATION science , *OPERATING rooms - Abstract
The article offers information on the implementation of iMDsoft's Metavision as Boston Children's Hospital's anesthesia information management system (AIMS) software in 2012. It discusses the development of a "low-flow" interface for the system by 2015 to enhance teaching and patient safety during the delivery of low flow anesthesia.
- Published
- 2024
- Full Text
- View/download PDF
30. DETERMINING THE MECHANISM FOR CALCULATING THE TENSION OF A WORKING CONVEYOR BELT DURING A CHANGE IN THE TRANSPORTATION LENGTH FROM MEDICAL MASK WASTE.
- Author
-
Gavryukov, Аlexandr, Kolesnikov, Mykhailo, Zapryvoda, Andrii, Lutsenko, Vadym, and Bondarchuk, Olga
- Subjects
CONVEYOR belts ,MEDICAL wastes ,MEDICAL masks ,BELT conveyors ,DEAD loads (Mechanics) ,DYNAMIC loads - Abstract
This paper examines the working process of a belt conveyor with a working drive that can change the length of transportation. The conveyor can be used for tunneling, the development of minerals in mines and quarries, and the transportation of materials in warehouses. The use of such a conveyor makes it possible to reduce the time needed for operations that increase or decrease the length of transportation and to exclude reloaders between the working equipment and the conveyor itself from the transport chain. It was established that when the length of transportation of a working conveyor changes, the static and dynamic load on the belt increases. The change in the static load of the belt on the drum of the mobile station depends on the speed of the mobile station and the speed of the belt generated by the conveyor drive. The dynamic load on the belt depends on the acceleration of the belt, which is related to the acceleration of the mobile station during the change in the length of the conveyor. For a working conveyor that changes the length of transportation, the static tension of the belt on the drum of the mobile station can increase by 1.1-1.4 times from its initial value. The dynamic load on the belt can increase significantly if the acceleration of the mobile station is not stretched over time and reaches large values. Based on the dependences derived in the current work, a calculation procedure is proposed for the design of a belt conveyor with a working drive that can change the length of transportation. The Mathcad software was applied to verify the design calculation procedure. The results make it possible to employ new design methods in the construction of competitive machines equipped with a belt conveyor with a variable length of transportation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
31. IMPLEMENTATION OF CLASS INTERACTION UNDER AGGREGATION CONDITIONS.
- Author
-
Kungurtsev, Oleksii and Komleva, Nataliia
- Subjects
RESEARCH implementation ,PROBLEM solving ,RESOURCE management - Abstract
The object of research is the implementation of relations between software classes. It is shown that when implementing the aggregation relationship between classes, errors may occur if more than one client class is present. Class interaction errors can be caused by one client class managing the attributes of the resource class in a way that is unacceptable to another client class, owing to invalid attribute values, state changes, method blocking, etc. To solve the problem, a special organization of the queue for client classes is proposed. A feature of the queue is the use of models of the client classes and the resource class. The model of a resource class provides an idea of its resources (attributes and methods) and how they are used. The client class model shows how much of these resources will be used by the client and how this will be done. This organization of the queue makes it possible to provide resources to the next client class only after checking its compatibility with the active client classes. In general, client classes have different types, which complicates the organization of the queue. Therefore, it is proposed to derive them from a base class that defines the interface for the queue. The problem of the interaction of the resource class with the queue is solved similarly: the proposed base class for the resource class also provides the necessary queue interface. Software was developed that automates the process of converting classes: analysis of the resource class, determination of the resource needs of client classes, and construction of the base classes. After the conversion is completed, the queue functions are supported. Verification of the study results showed that class conversion time was reduced by about three times, and the waiting time for access to resources during the operation of the queue by at least two times. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. DarSIA: An Open-Source Python Toolbox for Two-Scale Image Processing of Dynamics in Porous Media.
- Author
-
Nordbotten, Jan Martin, Benali, Benyamine, Both, Jakub Wiktor, Brattekås, Bergit, Storvik, Erlend, and Fernø, Martin A.
- Subjects
PYTHON programming language ,IMAGE analysis ,MANUFACTURING processes ,MIRROR images ,POROUS materials - Abstract
Understanding porous media flow is inherently a multi-scale challenge, where at the core lies the aggregation of pore-level processes to a continuum, or Darcy-scale, description. This challenge is directly mirrored in image processing, where pore-scale grains and interfaces may be clearly visible in the image, yet continuous Darcy-scale parameters may be what are desirable to quantify. Classical image processing is poorly adapted to this setting, as most techniques do not explicitly utilize the fact that the image contains explicit physical processes. Here, we extend classical image processing concepts to what we define as "physical images" of porous materials and processes within them. This is realized through the development of a new open-source image analysis toolbox specifically adapted to time-series of images of porous materials. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. SimpleNMR: An interactive graph network approach to aid constitutional isomer verification using standard 1D and 2D NMR experiments.
- Author
-
Hughes, Eric and Kenwright, Alan M.
- Abstract
Despite progress in computer automated solutions, constitutional isomer verification by NMR using one‐ and two‐dimensional data sets is still, in the main, a manual, user‐intensive activity that is challenging for a number of reasons. These include the problem of simultaneously keeping track of the information from a number of separate NMR experiments and the difficulty of another researcher subsequently verifying the assignments made without having to independently repeat the whole analysis. This paper describes a graphical interactive approach that overcomes some of these problems. By using concepts used to visualise graph networks, we have been able to represent the NMR data in a manner that highlights directly the link between the different NMR experiments and the molecule of interest. Furthermore, by making the graph networks interactive, a user can easily validate and correct the assignment and understand the decisions made in arriving at the solution. We have developed a usable proof‐of‐concept computer program, ‘simpleNMR’, written in Python to illustrate the ideas and approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
34. Don't Panic, Don't Get Angry, and Don't Let It Happen Again.
- Author
-
Toshio Kashiwagi
- Subjects
- *
TELECOMMUNICATION , *COMPUTER software , *INFRASTRUCTURE (Economics) - Abstract
NTT COMWARE supports the systems that make up NTT's telecommunications infrastructure and systems related to service operations from the software side. As a member of the NTT DOCOMO Group, it is also developing systems that will serve as the foundation for expanding the Group's business. We interviewed Toshio Kashiwagi, senior executive vice president and executive officer (chief information officer/chief digital officer) of NTT COMWARE, which is committed to providing new value to society beyond the social infrastructure of telecommunications, about the company's strengths as a group of software professionals and his attitude as a top executive. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. ABkPowerCalculator: An App to Compute Power for Balanced (AB)k Single Case Experimental Designs.
- Author
-
Batley, Prathiba, Thamaran, Madhav, and Hedges, Larry V.
- Subjects
- *
EXPERIMENTAL design , *RESEARCH personnel , *MULTILEVEL models , *MOBILE apps , *CELL phones - Abstract
Single case experimental designs are an important research design in behavioral and medical research. Although there are design standards prescribed by the What Works Clearinghouse for single case experimental designs, these standards do not include statistically derived power computations. Recently we derived the equations for computing power for (AB)k designs. However, these computations and the software code in R may not be accessible to the applied researchers who are most likely to want to compute power for their studies. Therefore, we have developed an (AB)k power calculator Shiny App that researchers can use with no software training. These power computations assume that the researcher would be interested in fitting multilevel models with autocorrelations or conducting similar analyses. The purpose of this software contribution is to briefly explain how power is derived for balanced (AB)k designs and to elaborate on how to use the Shiny App. The app works well not just on computers but also on mobile phones, without installing the R program. We believe this can be a valuable tool for practitioners and applied researchers who want to plan their single case studies with sufficient power to detect appropriate effect sizes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. InfrastructureModels: Composable Multi-infrastructure Optimization in Julia.
- Author
-
Bent, Russell, Tasseff, Byron, and Coffrin, Carleton
- Subjects
- *
NATURAL gas pipelines , *INFRASTRUCTURE (Economics) , *MATHEMATICAL programming , *DATA libraries , *ENVIRONMENTAL infrastructure , *ARTIFICIAL intelligence - Abstract
In recent years, there has been an increasing need to understand the complex interdependencies between critical infrastructure systems, for example, electric power, natural gas, and potable water. Whereas open-source and commercial tools for the independent simulation of these systems are well established, frameworks for cosimulation with other systems are nascent and tools for co-optimization are scarce—the major challenge being the hidden combinatorics that arise when connecting multiple-infrastructure system models. Building toward a comprehensive solution for modeling interdependent infrastructure systems, this work presents InfrastructureModels, an extensible, open-source mathematical programming framework for co-optimizing multiple interdependent infrastructures. This work provides new insights into methods and programming abstractions that make state-of-the-art independent infrastructure models composable with minimal additional effort. To that end, this paper presents the design of the InfrastructureModels framework, documents key components of the software's implementation, and demonstrates its effectiveness with three case studies on canonical co-optimization tasks arising in interdependent infrastructure systems. History: Accepted by Ted Ralphs, Area Editor for Software Tools. Funding: The work was funded by Los Alamos National Laboratory's Directed Research and Development project "The Optimization of Machine Learning: Imposing Requirements on Artificial Intelligence" and the U.S. Department of Energy's Office of Electricity Advanced Grid Modeling projects "Joint Power System and Natural Gas Pipeline Optimal Expansion Planning" and "Coordinated Planning and Operation of Water and Power Infrastructures for Increased Resilience and Reliability." This work was carried out under the U.S. DOE contract no. [DE-AC52-06NA25396]. 
Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information (https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2022.0118) as well as from the IJOC GitHub software repository (https://github.com/INFORMSJoC/2022.0118). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
37. Constructing one-dimensional supramolecular polymer structures using particle swarm optimization technique.
- Author
-
Ghosh, Arunima, Sahu, Rahul, and Reddy, Sandeep K.
- Subjects
- *
PARTICLE swarm optimization , *SUPRAMOLECULAR polymers , *POLYMER structure , *MATHEMATICAL optimization , *STACKING interactions , *LINEAR polymers - Abstract
In the realm of studying supramolecular polymers using computer simulations, the task of generating appropriate initial structures poses a significant challenge, primarily owing to the extensive range of potential configurations. In this study, we introduce StackGen, an open-source framework designed to efficiently create energy-optimized one-dimensional supramolecular polymer structures with minimal computational overhead. This tool utilizes the particle swarm optimization (PSO) algorithm in conjunction with a semiempirical quantum mechanical approach to identify low-energy supramolecular stack configurations from a diverse set of possibilities. These configurations result from the translational and rotational adjustments of adjacent molecules around monomers along various axes. The tool also considers various structural factors, including the presence of functional side groups and the extent of intermolecular π – π stacking interactions. Extensive testing across different molecules demonstrates StackGen's ability to produce low-energy structures with negligible computational costs. Additionally, the tool incorporates features for optimizing PSO hyperparameters in real-time, thus improving convergence. The tool provides a convenient means of generating structures suitable for both molecular simulations and quantum mechanical calculations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
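The particle swarm optimization (PSO) step at the core of the StackGen record above can be sketched in a few lines of Python. This is a generic, self-contained illustration, not the tool itself: the real framework couples PSO to a semiempirical quantum mechanical stacking energy, which is replaced here by a simple quadratic stand-in over hypothetical translation/rotation parameters:

```python
import random

def pso(f, dim, n_particles=20, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: minimize f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy "stacking energy": a sphere function standing in for the energy of a
# dimer configuration parameterized by translations and rotations.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=4)
```

In the real setting, each particle encodes the translational and rotational adjustments of a neighboring monomer, and `f` would be the semiempirical energy evaluation, so the swarm settles on low-energy stack configurations.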
38. An open-source toolbox for measuring vocal tract shape from real-time magnetic resonance images.
- Author
-
Belyk, Michel, Carignan, Christopher, and McGettigan, Carolyn
- Subjects
- *
VOCAL tract , *MAGNETIC resonance imaging , *LAUGHTER , *MACHINE learning , *HUMAN anatomy , *DEEP learning - Abstract
Real-time magnetic resonance imaging (rtMRI) is a technique that provides high-contrast videographic data of human anatomy in motion. Applied to the vocal tract, it is a powerful method for capturing the dynamics of speech and other vocal behaviours by imaging structures internal to the mouth and throat. These images provide a means of studying the physiological basis for speech, singing, expressions of emotion, and swallowing that are otherwise not accessible for external observation. However, taking quantitative measurements from these images is notoriously difficult. We introduce a signal processing pipeline that produces outlines of the vocal tract from the lips to the larynx as a quantification of the dynamic morphology of the vocal tract. Our approach performs simple tissue classification, but constrained to a researcher-specified region of interest. This combination facilitates feature extraction while retaining the domain-specific expertise of a human analyst. We demonstrate that this pipeline generalises well across datasets covering behaviours such as speech, vocal size exaggeration, laughter, and whistling, as well as producing reliable outcomes across analysts, particularly among users with domain-specific expertise. With this article, we make this pipeline available for immediate use by the research community, and further suggest that it may contribute to the continued development of fully automated methods based on deep learning algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
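The ROI-constrained tissue classification idea in the abstract above can be sketched as follows. This is not the authors' pipeline; the toy image, ROI mask, and threshold are hypothetical, and the point is only that simple thresholding is applied solely where the analyst has drawn a region of interest.

```python
def classify_in_roi(image, roi, threshold):
    """Binary tissue/air classification, constrained to a researcher-drawn ROI.

    image: 2-D list of pixel intensities; roi: same-shape list of 0/1 flags.
    Pixels outside the ROI are left unlabelled (None), so the analyst's
    domain expertise (where to look) is retained alongside the automation.
    """
    labels = []
    for img_row, roi_row in zip(image, roi):
        labels.append([
            (1 if px >= threshold else 0) if inside else None
            for px, inside in zip(img_row, roi_row)
        ])
    return labels

# Toy 3x4 "frame": bright tissue on the left, dark airway on the right;
# the ROI covers only the upper-left region of the frame
image = [[90, 85, 10,  5],
         [88, 80, 12,  8],
         [92, 87,  9,  4]]
roi   = [[1, 1, 1, 0],
         [1, 1, 1, 0],
         [0, 0, 0, 0]]
labels = classify_in_roi(image, roi, threshold=50)
```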
39. Experimental study on dimensional variations of 3D printed dental models based on printing orientation.
- Author
-
Perlea, Paula, Stefanescu, Cosmin, Dalaban, Madalina‐Georgiana, and Petre, Alexandru‐Eugen
- Subjects
- *
STATISTICAL software , *THREE-dimensional printing , *DENTAL technology , *STRUCTURAL stability - Abstract
Key Clinical Message: This research investigates the trueness and precision of 3D printing technology in dental applications, specifically focusing on dimensional variations observed in models printed at different angles. The methodology involved importing a dental model into slicing software, adjusting its orientation, and implementing support structures for stability. Subsequently, the model underwent 3D printing five times for each orientation using appropriate equipment and underwent post‐processing steps, including cleaning, washing, and UV‐light post‐curing. The printed models were then scanned using a specialized desktop scanner for further analysis. Accuracy assessment was carried out using dedicated software, employing an algorithm for precise alignment by comparing the scanned files. Color deviation maps were utilized to visually represent variations, aiming to evaluate how positioning during printing influences the trueness and precision of 3D‐printed dental models. Trueness and precision analyses involved the Shapiro–Wilk test for normality and a one‐way ANOVA to compare means of three independent groups, with statistical analyses conducted using IBM SPSS Statistics software. The color maps derived from 3D comparisons revealed positive and negative deviations, represented by distinct colors. Comparative results indicated that models positioned at 0° exhibited the least dimensional deviation, whereas those at 90° showed the highest. Regarding precision, models printed at 0° demonstrated the highest reproducibility, while those at 15° exhibited the lowest. Based on the desired level of precision, it is recommended that printed models be produced at an inclination angle of 0°. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
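The one-way ANOVA used in the study above compares mean deviations across printing orientations. A minimal sketch of the F statistic on invented deviation data (the study itself used IBM SPSS Statistics, not this code):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: how far each group mean sits from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical dimensional deviations (mm) for five prints at 0°, 15°, 90°
dev_0  = [0.02, 0.03, 0.02, 0.04, 0.03]
dev_15 = [0.06, 0.08, 0.07, 0.09, 0.08]
dev_90 = [0.12, 0.14, 0.13, 0.15, 0.13]
f_stat = one_way_anova_f([dev_0, dev_15, dev_90])
```

A large F (relative to the F distribution with k−1 and n−k degrees of freedom) indicates that orientation affects mean deviation, mirroring the study's finding that 0° and 90° differ.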
40. Software Effort Estimation Based on Ensemble Extreme Gradient Boosting Algorithm and Modified Jaya Optimization Algorithm.
- Author
-
Kumar, Beesetti Kiran, Bilgaiyan, Saurabh, and Mishra, Bhabani Shankar Prasad
- Subjects
- *
OPTIMIZATION algorithms , *BOOSTING algorithms , *MACHINE learning , *COMPUTER software industry , *COMPUTER software , *COMPUTER software development - Abstract
Software development effort estimation is regarded as a crucial activity for managing project cost, time, and quality, as well as for the software development life cycle. As a result, proper estimation is crucial to the success of projects and to lowering risks. Software effort estimation has drawn much research interest recently and remains a challenge for the software industry. When results are inaccurate, effort may be over- or under-estimated, which can disastrously affect project resources. Machine learning methods are becoming more and more prominent in the sector. Therefore, in this paper, we propose a Modified Jaya algorithm to improve the effectiveness of the estimation model; Modified JOA selects the ideal subset of components from an extensive feature collection. Then, the ensemble machine learning-based Enhanced Extreme Gradient Boosting algorithm and Ensemble Learning Machine approach are employed to estimate the software effort. The proposed methodologies are empirically assessed on the PROMISE SDEE repository. In this approach, applying machine learning techniques to the effort estimation process increases the likelihood that the time and cost estimates will be accurate. The proposed approach yields greater performance, and its key benefit is that it lowers the computational cost. This approach can also inspire the development of a tool that could reliably, effectively, and accurately estimate the effort required to complete different software projects. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
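The (unmodified) Jaya algorithm underlying the abstract above is compact to sketch, since it has no algorithm-specific hyperparameters: each candidate moves toward the population's best member and away from its worst. The sphere objective below is a hypothetical stand-in for an effort-estimation error surface; this is not the authors' Modified JOA.

```python
import random

def jaya_minimize(f, dim, bounds, pop_size=20, n_iter=300, seed=1):
    """Basic Jaya algorithm with greedy acceptance of improved candidates."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(n_iter):
        vals = [f(x) for x in pop]
        best = pop[min(range(pop_size), key=vals.__getitem__)]
        worst = pop[max(range(pop_size), key=vals.__getitem__)]
        for i, x in enumerate(pop):
            cand = [
                min(max(
                    xj + rng.random() * (bj - abs(xj))     # attraction to the best
                       - rng.random() * (wj - abs(xj)),    # repulsion from the worst
                    lo), hi)
                for xj, bj, wj in zip(x, best, worst)
            ]
            if f(cand) < vals[i]:        # keep the candidate only if it improves
                pop[i] = cand
    vals = [f(x) for x in pop]
    best = pop[min(range(pop_size), key=vals.__getitem__)]
    return best, f(best)

# Toy objective: minimise the sum of squares (optimum at the origin)
sphere = lambda x: sum(v * v for v in x)
best_x, best_val = jaya_minimize(sphere, dim=3, bounds=(-5.0, 5.0))
```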
41. Approaches to nonlinear curve fitting in laboratory medicine.
- Author
-
McPherson, Peter A C
- Subjects
- *
STATISTICAL models , *COMPUTERS , *DATA analysis , *MEDICAL informatics , *INDEPENDENT variables , *LABORATORIES , *TOXICOLOGY , *ENZYME-linked immunosorbent assay , *IMMUNOLOGY technique , *CLINICAL pathology , *CLINICAL chemistry , *AUTOMATION , *DATA analysis software , *CALIBRATION , *ALGORITHMS - Abstract
Nonlinear curve fitting is an important process in laboratory medicine, particularly with the increased use of highly sensitive antibody-based assays. Although the process is often automated in commercially available software, it is important that clinical scientists and physicians recognize the limitations of the various approaches used and are able to select the most appropriate model. This article summarizes the key nonlinear functions and demonstrates their application to common laboratory data. Following this, a basic overview of the statistical comparison of models is presented and then a discussion of important algorithms used in nonlinear curve fitting. An accompanying Microsoft Excel workbook is available that can be used to explore the content of this article. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
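One of the key nonlinear functions in immunoassay calibration is the four-parameter logistic (4PL). A minimal sketch of the classic logit-log linearisation, assuming the two asymptotes are already known (e.g. from the zero and maximal calibrators) and using hypothetical noise-free data; a full fit would estimate all four parameters iteratively, e.g. by Levenberg-Marquardt.

```python
import math

def fit_4pl_logit(xs, ys, a, d):
    """Fit slope b and midpoint c of a four-parameter logistic
        y = d + (a - d) / (1 + (x / c)**b)
    given known asymptotes a and d, via the logit-log transform:
        ln((a - y)/(y - d)) = b*ln(x) - b*ln(c)  -- a straight line in ln(x)."""
    u = [math.log(x) for x in xs]
    v = [math.log((a - y) / (y - d)) for y in ys]
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    # Ordinary least-squares slope and intercept of v against u
    slope = sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / \
            sum((ui - mu) ** 2 for ui in u)
    intercept = mv - slope * mu
    b = slope
    c = math.exp(-intercept / b)
    return b, c

# Noise-free synthetic calibration curve with known asymptotes
a_true, d_true, b_true, c_true = 2.0, 0.1, 1.2, 1.0
xs = [0.1, 0.3, 1.0, 3.0, 10.0]
ys = [d_true + (a_true - d_true) / (1 + (x / c_true) ** b_true) for x in xs]
b_hat, c_hat = fit_4pl_logit(xs, ys, a_true, d_true)
```

On noise-free data the transform recovers b and c exactly; with real assay noise the transform distorts the error structure, which is one reason the article's iterative algorithms are preferred in practice.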
42. Tumor Response Evaluation Using iRECIST: Feasibility and Reliability of Manual Versus Software-Assisted Assessments.
- Author
-
Ristow, Inka, Well, Lennart, Wiese, Nis Jesper, Warncke, Malte, Tintelnot, Joseph, Karimzadeh, Amir, Koehler, Daniel, Adam, Gerhard, Bannas, Peter, and Sauer, Markus
- Subjects
- *
BLOOD vessels , *COMPUTED tomography , *DIGITAL diagnostic imaging , *CLINICAL trials , *RETROSPECTIVE studies , *DESCRIPTIVE statistics , *MANN Whitney U Test , *QUANTITATIVE research , *METASTASIS , *COMPUTER-aided diagnosis , *RELIABILITY (Personality trait) , *INTER-observer reliability - Abstract
Simple Summary: Quantitative assessment of the therapy response in oncological patients undergoing chemo- or immunotherapy is becoming increasingly important not only in the context of clinical studies but also in clinical routine. To facilitate the sometimes complex and time-consuming oncological response assessment, dedicated software solutions, e.g., according to (i)RECIST, have been developed. Considering the higher degree of complexity of iRECIST, we investigated the benefits of software-assisted assessments compared to manual approaches with respect to reader agreement, error rate, and reading time. iRECIST assessments were more feasible and reliable when supported by dedicated software. We conclude that oncologic response assessment in clinical trials should be performed software-assisted rather than manually. Objectives: To compare the feasibility and reliability of manual versus software-assisted assessments of computed tomography scans according to iRECIST in patients undergoing immune-based cancer treatment. Methods: Computed tomography scans of 30 tumor patients undergoing cancer treatment were evaluated by four independent radiologists at baseline (BL) and two follow-ups (FU), resulting in a total of 360 tumor assessments (120 each at BL/FU1/FU2). After image interpretation, tumor burden and response status were either calculated manually or semi-automatically as defined by software, respectively. The reading time, calculated sum of longest diameter (SLD), and tumor response (e.g., "iStable Disease") were determined for each assessment. After complete data collection, a consensus reading among the four readers was performed to establish a reference standard for the correct response assignments. The reading times, error rates, and inter-reader agreement on SLDs were statistically compared between the manual versus software-assisted approaches. Results: The reading time was significantly longer for the manual versus software-assisted assessments at both follow-ups (median [interquartile range] FU1: 4.00 min [2.17 min] vs. 2.50 min [1.00 min]; FU2: 3.75 min [1.88 min] vs. 2.00 min [1.50 min]; both p < 0.001). Regarding reliability, 2.5% of all the response assessments were incorrect at FU1 (3.3% manual; 0% software-assisted), which increased to 5.8% at FU2 (10% manual; 1.7% software-assisted), demonstrating higher error rates for manual readings. Quantitative SLD inter-reader agreement was inferior for the manual compared to the software-assisted assessments at both FUs (FU1: ICC = 0.91 vs. 0.93; FU2: ICC = 0.75 vs. 0.86). Conclusions: Software-assisted assessments may facilitate the iRECIST response evaluation of cancer patients in clinical routine by decreasing the reading time and reducing response misclassifications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
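The response categories in the study above derive from sum-of-longest-diameter (SLD) thresholds. A simplified sketch of the underlying RECIST 1.1 target-lesion rules (partial response at a ≥30% decrease from baseline, progression at a ≥20% and ≥5 mm increase from nadir); iRECIST's confirmation layer (iUPD/iCPD) is deliberately omitted here, and non-target and new-lesion rules are not modelled.

```python
def classify_response(baseline_sld, nadir_sld, current_sld):
    """Simplified RECIST 1.1 target-lesion classification from SLDs in mm."""
    if current_sld == 0:
        return "CR"                       # complete response: all lesions gone
    # Progressive disease: >=20% increase over the nadir AND >=5 mm absolute
    if (current_sld - nadir_sld) >= max(0.20 * nadir_sld, 5.0):
        return "PD"
    # Partial response: >=30% decrease from the baseline SLD
    if (baseline_sld - current_sld) >= 0.30 * baseline_sld:
        return "PR"
    return "SD"                           # otherwise: stable disease

status = classify_response(baseline_sld=100, nadir_sld=60, current_sld=75)
```

Applying the rule set rather than recomputing percentages by hand is exactly the kind of mechanical step where the study found software assistance reduced misclassifications.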
43. Implementation of a software application in staging and grading of periodontitis cases.
- Author
-
Marini, Lorenzo, Tonetti, Maurizio S., Nibali, Luigi, Sforza, Nicola M., Landi, Luca, Cavalcanti, Raffaele, Rojas, Mariana A., and Pilloni, Andrea
- Subjects
- *
COMPUTER software , *STATISTICS , *PERIODONTITIS , *MOBILE apps , *DENTISTS , *HUMAN services programs , *INTER-observer reliability , *COMPARATIVE studies , *DESCRIPTIVE statistics , *SENSITIVITY & specificity (Statistics) , *DIAGNOSTIC errors ,RESEARCH evaluation - Abstract
Objectives: The purpose of this study was to assess the diagnostic accuracy and the inter‐rater agreement among general dentists when staging and grading periodontitis cases with the aid of a software application (SA) developed by the Italian Society of Periodontology and Implantology. Materials and methods: Ten general dentists were asked to independently assess 25 periodontitis cases using the SA. Accuracy was estimated using quadratic weighted kappa and examiners' percentage of agreement with a reference diagnosis provided by a gold standard examiner. Inter‐rater agreement was evaluated using Fleiss kappa statistics. Results: The overall case definition agreed with the reference diagnosis in 53.6% of cases. The agreements for each general dentist's pairwise comparisons against the reference definition were at least substantial in 100% of cases for stage, in 70% of cases for grade and in none of the cases for extent. Fleiss kappa was 0.818, 0.608, and 0.632 for stage, extent, and grade, respectively. The study recognized possible reasons that could lead to decreased accuracy using the SA. Conclusions: Supported by the SA, general dentists have reached substantial inter‐rater agreement and highly accurate assignments of stage and grade. However, complete case definitions were correctly diagnosed in slightly over half of the cases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
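The Fleiss kappa reported in the study above measures chance-corrected agreement among multiple raters. A self-contained sketch on hypothetical stage-count data (invented numbers, not the study's data):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for n raters over N subjects and k categories.

    ratings: one row per subject, each row a list of per-category counts
    (every row must sum to the same number of raters n)."""
    N = len(ratings)
    n = sum(ratings[0])
    k = len(ratings[0])
    # Overall proportion of assignments falling in each category
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    # Per-subject observed agreement, then its mean across subjects
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    P_bar = sum(P_i) / N
    # Expected agreement by chance
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical: 10 raters assigning 5 cases to one of 4 stages (counts per stage)
ratings = [
    [8, 2, 0, 0],
    [0, 9, 1, 0],
    [1, 1, 8, 0],
    [0, 0, 2, 8],
    [7, 3, 0, 0],
]
kappa = fleiss_kappa(ratings)
```

Values around 0.6-0.8, like the study's 0.818 for stage, are conventionally read as substantial-to-almost-perfect agreement.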
44. Setting up a biomodeling, virtual planning, and three-dimensional printing service in Uruguay.
- Author
-
Zabala-Travers, Silvina and García-Bayce, Andrés
- Subjects
- *
THREE-dimensional printing , *PERSONNEL management , *THREE-dimensional imaging , *COMPUTER-assisted image analysis (Medicine) - Abstract
Virtual surgical planning and three-dimensional (3-D) printing are rapidly becoming essential for challenging and complex surgeries around the world. An Ibero-American survey reported a lack of awareness of technology benefits and scarce financial resources as the two main barriers to widespread adoption of 3-D technologies. The Pereira Rossell Hospital Center is a publicly funded maternal and pediatric academic clinical center in Uruguay, a low-resource Latin American country, that successfully created and has been running a 3-D unit for 4 years. The present work is a step-by-step review of the 3-D technology implementation process in a hospital with minimal financial investment. References to training, software, hardware, and the management of human resources are included. Difficulties throughout the process and future challenges are also discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
45. Technical note: SpekPy Web--online x-ray spectrum calculations using an interface to the SpekPy toolkit.
- Author
-
Vorbau, Robert and Poludniowski, Gavin
- Subjects
X-ray spectra ,WEB-based user interfaces ,GRAPHICAL user interfaces ,PYTHON programming language ,USER interfaces ,COMPUTER software quality control - Abstract
Knowledge of the photon spectrum emitted from an x-ray tube is frequently needed in imaging and dosimetry contexts. As the spectrum characteristics are influenced by several parameters and routine measurement of a spectrum is often impractical, a variety of software programs have been developed over the decades for convenient calculations. SpekPy is a state-of-the-art software package containing several spectrum models, and was created to estimate photon spectra originating from x-ray tubes using a small set of input parameters (e.g., anode material, anode angle, tube potential, filtration, etc.). SpekPy is distributed as a Python toolkit and is available free of charge. The toolkit does, however, lack a graphical user interface, and a user is required to write a Python script to make use of it. In this work this limitation is addressed by introducing a web application called SpekPy Web: a graphical user interface together with an application programming interface (API). These developments both make the SpekPy spectrum models accessible to a broader set of users and increase the ease of use for existing users. SpekPy Web is hosted at: https://spekpy.smile.ki.se. The functionality of the software is demonstrated, using its API, by estimating first half-value layers (HVLs) for 15 standard beam qualities from the International Bureau of Weights and Measures (BIPM). The estimated HVLs were all found to be within 3.5% agreement when compared to experimental values, with an average calculation time of 2.5 s per spectrum. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
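The first half-value layer (HVL) that SpekPy Web reports can, for any discrete spectrum, be found by root-finding on the narrow-beam exponential attenuation model. The two-line toy spectrum and attenuation coefficients below are hypothetical and fluence-weighted for simplicity (a dosimetric HVL would weight by air kerma); this is not SpekPy's implementation.

```python
import math

def hvl(fluence, mu_per_mm):
    """First half-value layer (mm) of a discrete photon spectrum:
    the filter thickness t at which the transmitted intensity
    sum_i f_i * exp(-mu_i * t) drops to half its unattenuated value."""
    def transmitted(t):
        return sum(f * math.exp(-mu * t) for f, mu in zip(fluence, mu_per_mm))
    target = 0.5 * transmitted(0.0)
    lo, hi = 0.0, 100.0
    for _ in range(60):                  # bisection on the thickness t
        mid = 0.5 * (lo + hi)
        if transmitted(mid) > target:    # still too much transmission: go thicker
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy two-line spectrum with equal fluence; linear attenuation
# coefficients (1/mm) chosen so the soft line attenuates much faster
t_half = hvl(fluence=[1.0, 1.0], mu_per_mm=[0.30, 0.08])
```

Because the soft component is filtered out first (beam hardening), the polyenergetic HVL exceeds the monoenergetic value ln(2)/0.30 of the soft line alone.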
46. Special issue on software citation, indexing, and discoverability.
- Author
-
Katz, Daniel S. and Chue Hong, Neil P.
- Subjects
COMPUTER software ,TRAUMA registries ,INDEXING - Abstract
Software plays a fundamental role in research as a tool, an output, or even as an object of study. This special issue on software citation, indexing, and discoverability brings together five papers examining different aspects of how the use of software is recorded and made available to others. It describes new work on datasets that enable large-scale analysis of the evolution of software usage and citation, that presents evidence of increased citation rates when software artifacts are released, that provides guidance for registries and repositories to support software citation and findability, and that shows there are still barriers to improving and formalising software citation and publication practice. As the use of software increases further, driven by modern research methods, addressing the barriers to software citation and discoverability will encourage greater sharing and reuse of software, in turn enabling research progress. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Validation of an in-house developed therapeutic dosimetric software tool for the treatment of 177Lutetium-DOTATATE peptide receptor radionuclide therapy.
- Author
-
Van Wyk, Bronwin, Hasford, Francis, Nyakale, Nozipho, and Vangu, Mboyo-Di-Tamba
- Abstract
Background: Computer software for absorbed dose quantification has been used widely in nuclear medicine. Different software tools have been written to improve the dose assessment, especially in therapeutic nuclear medicine. Some software tools focus on computational phantom models from the International Commission on Radiological Protection (ICRP), whilst others focus on Monte Carlo simulated models. While many studies have investigated therapeutic nuclear medicine dosimetry, the authors have noticed that very few papers compare the therapeutic software tools to each other; hence a doctor of philosophy study was embarked on. The aim of our study was therefore to validate our in-house developed software tool Masterdose using the commercial software OLINDA/EXM 1.0 that was available in our department. Methods: Methodology was based on clinical patient data treated for neuroendocrine tumours with 177Lutetium (Lu)-DOTATATE at a South African hospital. All patients underwent the same SPECT acquisition protocol and were corrected for scatter, partial volume, collimator-detector response, gamma camera calibration and attenuation. Correction factors were applied to images to convert counts to activity. The first cycle of peptide receptor radionuclide therapy (PRRT) for 11 single photon emission computed tomography (SPECT) patients was compared on the Masterdose and OLINDA/EXM 1.0 software tools at 1, 24, 72 and 168 h. Cumulated activity and the absorbed dose were compared for the two software tools. The absorbed dose difference was then compared using statistical Bland-Altman analysis. Results: Masterdose and OLINDA/EXM 1.0 had different peptide receptor radionuclide therapy methodologies. This led to different results obtained for the software tools. Cumulated activities of Masterdose and DTK differed by 10.5% and 10.9% for the kidneys and tumours, respectively. On average, tumour absorbed doses were nine times those of the kidneys. Bland-Altman analysis showed a non-systematic difference between the two software tools. Conclusion: On average, the relative percentage difference between the cumulated activities and absorbed doses of the two software tools was 10.7%. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
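The Bland-Altman comparison used in the study above reduces to a bias (mean paired difference) and 95% limits of agreement. A sketch on hypothetical paired organ doses (invented numbers, not the study's data):

```python
def bland_altman(a, b):
    """Bland-Altman statistics for paired measurements from two methods:
    returns (bias, lower limit, upper limit), where the limits of
    agreement are bias +/- 1.96 * SD of the paired differences."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5  # sample SD
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical kidney absorbed doses (Gy) from two dosimetry tools
masterdose = [3.1, 2.8, 4.0, 3.5, 2.9, 3.7]
olinda     = [2.9, 2.6, 3.6, 3.4, 2.7, 3.3]
bias, lo_loa, hi_loa = bland_altman(masterdose, olinda)
```

A bias near zero with narrow limits suggests interchangeable tools; a non-systematic scatter of differences, as the study reports, shows up as points spread across the agreement band rather than a constant offset.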
48. Vector Accelerator Unit for Caravel.
- Author
-
Baungarten-Leon, Emilio Isaac, Ortega-Cisneros, Susana, Jaramillo-Toral, Uriel, Rodriguez-Navarrete, Francisco J., Pizano-Escalante, L., and Panduro, J. J. Raygoza
- Abstract
Caravel is an open-source project developed by Efabless for creating custom system-on-chips (SoCs). It includes the design of a configurable chip, development tools, and documentation. The Caravel SoC includes a RISC-V core with the Instruction Set Architecture RV32I. One of the key features of Caravel is its open-source nature. In this letter, a vector accelerator unit for the Caravel SoC template is presented to increase the RISC-V capabilities, allowing parallel data processing through 14 vector operations. The accelerator is based on 4 Arithmetic Logic Units connected directly to the RISC-V through the logic analyzer port and the general-purpose input/output (GPIO) port. The total area of this accelerator is less than 20% of the User Project Wrapper area, allowing the user to implement their custom designs in 8.4256 mm². [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Visualizing and Evaluating Microbubbles in Multiphase Flow Applications.
- Author
-
Najim, Safa A., Meerakaviyad, Deepak, Pun, Kul, Russell, Paul, Ganesan, Poo Balan, Hughes, David, and Hamad, Faik A.
- Subjects
MULTIPHASE flow ,MICROBUBBLES ,CHEMICAL processes ,WASTEWATER treatment ,LAMINAR flow ,VELOCITY measurements - Abstract
Accurate visualization of bubbles in multiphase flow is a crucial aspect of modeling heat transfer, mixing, and turbulence processes. It has many applications, including chemical processes, wastewater treatment, and aquaculture. A new software, Flow_Vis, based on experimental data visualization, has been developed to visualize the movement and size distribution of bubbles within multiphase flow. Images and videos recorded from an experimental rig designed to generate microbubbles were analyzed using the new software. The bubbles in the fluid were examined and found to move with different velocities due to their varying sizes. The software was used to measure bubble size distributions, and the obtained results were compared with experimental measurements, showing reasonable accuracy. The velocity measurements were also compared with literature values and found to be equally accurate. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
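That the bubbles "move with different velocities due to their varying sizes" is, in the low-Reynolds-number limit, the classical Stokes-law expectation: terminal rise velocity scales with diameter squared. The sketch below uses standard water properties as assumptions; the paper's software measures velocities from images, so this formula serves only as the theoretical baseline one would compare against.

```python
def stokes_rise_velocity(d, rho_l=998.0, rho_g=1.2, mu=1.0e-3, g=9.81):
    """Terminal rise velocity (m/s) of a small spherical bubble in the
    Stokes regime (Re << 1):  v = g * d**2 * (rho_l - rho_g) / (18 * mu).
    d: bubble diameter (m); rho_l, rho_g: liquid/gas densities (kg/m^3);
    mu: dynamic viscosity of the liquid (Pa*s)."""
    return g * d ** 2 * (rho_l - rho_g) / (18.0 * mu)

# Microbubbles of 20 um and 100 um diameter in water: the larger bubble
# rises (100/20)**2 = 25 times faster than the smaller one
v_20  = stokes_rise_velocity(20e-6)
v_100 = stokes_rise_velocity(100e-6)
```

For larger bubbles the Stokes assumption breaks down (inertia, shape oscillation, surfactant effects), which is precisely where image-based velocity measurement becomes more trustworthy than the formula.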
50. From project to platform: a case study on evolving the software development team
- Author
-
Coughlin, Daniel and Lush, Binky
- Published
- 2024
- Full Text
- View/download PDF