1,563 results for "Large-scale"
Search Results
2. Experimental investigations on normal mode nodes as support positions of a resonant testing facility for bending fatigue tests.
- Author
- Schramm, Clara, Birkner, Dennis, Schneider, Sebastian, and Marx, Steffen
- Subjects
- *RESONANT vibration, *STEEL pipe, *VIBRATION tests, *BEND testing, *VIBRATION isolation, *FATIGUE testing machines
- Abstract
Large‐scale fatigue testing is very important to research on scale effects, which occur in large cyclically loaded structures such as wind turbine towers. However, such experimental testing has a very high energy consumption. As an efficient alternative, this paper presents a new resonant testing facility for large‐scale specimens under cyclic bending loads. The facility works as a 4‐point bending test in which the specimen is supported at the nodes of its first normal bending mode, where theoretically no reaction forces occur. Two counter‐rotating imbalance motors with excitation frequencies near resonance generate a harmonic force acting on the specimen. Experimental trial fatigue tests on a steel pipe specimen were carried out to validate the new testing setup. A great decrease in the support forces was achieved by placing the supports at the normal mode nodes. Additionally, the behavior of the support forces under varying support positions and excitation frequencies was investigated. In summary, the resonant testing method combined with supports at the normal mode nodes offers an efficient and energy‐saving testing setup for large‐scale fatigue tests. Highlights: A resonant testing facility for large‐scale bending fatigue tests was developed and tested. Supports were placed in the nodes of the specimen's first normal bending mode. The influence of the support positions on the support forces was investigated. A significant reduction in support forces was achieved. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Autoencoder evolutionary algorithm for large-scale multi-objective optimization problem.
- Author
- Hu, Ziyu, Xiao, Zhixing, Sun, Hao, and Yang, He
- Abstract
Multi-objective optimization problems characterized by a substantial number of decision variables, which are also called large-scale multi-objective optimization problems (LSMOPs), are becoming increasingly prevalent. Traditional evolutionary algorithms may deteriorate drastically when tackling a large number of decision variables. For LSMOPs, the dimensionality of the decision variables needs to be reduced and the algorithm needs to be designed according to the principles of divide-and-conquer. The autoencoder evolutionary algorithm (AEEA) is proposed based on autoencoder dimensionality reduction, the grouping of decision variables, and the application of divide-and-conquer strategies. The proposed algorithm is compared with other classical algorithms. The experimental results show that AEEA achieves excellent convergence and diversity and still performs well on decision variables of higher dimensions. Finally, it is verified that the autoencoder improves the running time of the proposed algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
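The AEEA entry above rests on two ideas: compressing the decision variables with an autoencoder and searching the compressed space. The sketch below illustrates only that general pattern with a tiny linear autoencoder in NumPy; it is not the authors' algorithm, and every size, rate, and bound in it is an arbitrary placeholder.

```python
# Toy sketch: searching a large decision space through a linear autoencoder,
# loosely inspired by the AEEA idea above (not the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)
n_vars, n_latent, pop_size = 200, 10, 50            # assumed sizes, for illustration
pop = rng.uniform(-1.0, 1.0, (pop_size, n_vars))    # current population of decision vectors

# Train a linear autoencoder: x -> x @ W_enc -> h -> h @ W_dec -> x_hat
W_enc = rng.normal(0, 0.1, (n_vars, n_latent))
W_dec = rng.normal(0, 0.1, (n_latent, n_vars))
lr = 0.01
for _ in range(2000):
    h = pop @ W_enc
    recon = h @ W_dec
    err = recon - pop                               # reconstruction error
    grad_dec = h.T @ err / pop_size
    grad_enc = pop.T @ (err @ W_dec.T) / pop_size
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

# Generate offspring by mutating in the low-dimensional latent space,
# then decoding back to the original decision space.
latent = pop @ W_enc
offspring_latent = latent + rng.normal(0, 0.05, latent.shape)
offspring = np.clip(offspring_latent @ W_dec, -1.0, 1.0)
print(offspring.shape)                              # (50, 200)
```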
4. Using eDNA Sampling to Identify Correlates of Species Occupancy Across Broad Spatial Scales.
- Author
- McColl‐Gausden, Emily F., Griffiths, Josh, Weeks, Andrew R., and Tingley, Reid
- Subjects
- *PLATYPUS, *SPECIES distribution, *FARMS, *WATER sampling, *LAND use
- Abstract
Aim: Species presence–absence data can be time‐consuming and logistically difficult to obtain across large spatial extents. Yet these data are important for ensuring changes in species distributions are accurately monitored and are vital for ensuring appropriate conservation actions are undertaken. Here, we demonstrate how environmental DNA (eDNA) sampling can be used to systematically collect species occupancy data rapidly and efficiently across vast spatial domains to improve understanding of factors influencing species distributions. Location: South‐eastern Australia. Methods: We use a widely distributed, but near‐threatened species, the platypus (Ornithorhynchus anatinus), as a test case and undertake an environmentally stratified systematic survey to assess the presence–absence of platypus eDNA at 504 sites across 584,292 km² of south‐eastern Australia, representing ~37% of the species' extensive distribution. Site occupancy‐detection models were used to analyse how landscape‐ and site‐level factors affect platypus occupancy, enabling us to incorporate uncertainty at the different levels inherent in eDNA sampling (site, water sample replicate and qPCR replicate). Results: Platypus eDNA was detected at 272 sites (~54%), with platypuses more likely to occupy sites in catchments with increased runoff and fewer zero‐flow days, and sites with access to banks suitable for burrowing. Platypuses were less likely to occupy sites in catchments with a high proportion of shrubs and grasslands, or agricultural land use. Main Conclusions: These data provide an important large‐scale validation of the landscape‐ and site‐level factors influencing platypus occupancy that can be used to inform future conservation efforts. Our case study shows that systematically designed, stratified eDNA surveys provide an efficient means to understand how environmental characteristics affect species occupancy across broad environmental gradients. The methods employed here can be applied to aquatic and semi‐aquatic species globally, providing unprecedented opportunities to understand biodiversity status and change and provide insights for current and future conservation actions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
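The occupancy-detection model described in the platypus eDNA entry above stacks three levels of uncertainty: site occupancy, water-sample capture, and qPCR detection. The snippet below is a minimal sketch of how those levels combine into an overall probability of detecting the species at a site; psi, theta, and p follow the conventional occupancy-model notation, and the numeric values are illustrative, not estimates from the study.

```python
# Hedged sketch of the multi-level eDNA detection calculation described above.
# psi: probability a site is occupied
# theta: probability an occupied site's eDNA is captured in one water sample
# p: probability one qPCR replicate detects eDNA present in a sample
def prob_site_detection(psi: float, theta: float, p: float,
                        n_samples: int, n_qpcr: int) -> float:
    p_sample_positive = theta * (1.0 - (1.0 - p) ** n_qpcr)      # one sample yields >= 1 positive qPCR
    p_detect_given_occupied = 1.0 - (1.0 - p_sample_positive) ** n_samples
    return psi * p_detect_given_occupied

# Illustrative values only (not estimates from the paper):
print(round(prob_site_detection(psi=0.54, theta=0.7, p=0.8, n_samples=4, n_qpcr=3), 3))
```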
5. Large-Scale Green Method for Synthesizing Ultralong Uniform Tellurium Nanowires for Semiconductor Devices.
- Author
- Lyu, Zhiyi, Park, Mose, Tang, Yanjin, Choi, Hoon, Song, Seung Hyun, and Lee, Hoo-Jeong
- Subjects
- *SEMICONDUCTOR devices, *SEMICONDUCTOR synthesis, *NANOWIRES, *SUSTAINABILITY, *SEMICONDUCTORS, *THIN film transistors
- Abstract
This study presents a large-scale green approach for synthesizing ultralong tellurium nanowires with diameters around 13 nm using a solution-based method. By adjusting key synthesis parameters such as the surfactant concentration, temperature, and reaction duration, we achieved high-quality, ultralong Te NWs. These nanowires exhibit properties suitable for use in semiconductor applications, particularly when employed as channel materials in thin-film transistors, displaying a pronounced gate effect with a high on/off ratio of up to 10⁴ and a mobility of 0.9 cm² V⁻¹ s⁻¹. This study underscores the potential of solvent-based methods in synthesizing large-scale ultralong Te NWs as a critical resource for future sustainable nanoelectronic devices. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Evaluation of Ecological Environment Quality Using an Improved Remote Sensing Ecological Index Model.
- Author
- Liu, Yanan, Xiang, Wanlin, Hu, Pingbo, Gao, Peng, and Zhang, Ai
- Subjects
- *REMOTE-sensing images, *PRINCIPAL components analysis, *REMOTE sensing, *ARTIFICIAL satellites, *AIR quality
- Abstract
The Remote Sensing Ecological Index (RSEI) model is widely used for large-scale, rapid Ecological Environment Quality (EEQ) assessment. However, both the RSEI and its improved models have limitations in explaining the EEQ with only two-dimensional (2D) factors, resulting in inaccurate evaluation results. Incorporating more comprehensive, three-dimensional (3D) ecological information poses challenges for maintaining stability in large-scale monitoring, using traditional weighting methods like the Principal Component Analysis (PCA). This study introduces an Improved Remote Sensing Ecological Index (IRSEI) model that integrates 2D (normalized difference vegetation factor, normalized difference built-up and soil factor, heat factor, wetness, difference factor for air quality) and 3D (comprehensive vegetation factor) ecological factors for enhanced EEQ monitoring. The model employs a combined subjective–objective weighting approach, utilizing principal components and hierarchical analysis under minimum entropy theory. A comparative analysis of IRSEI and RSEI in Miyun, a representative study area, reveals a strong correlation and consistent monitoring trends. By incorporating air quality and 3D ecological factors, IRSEI provides a more accurate and detailed EEQ assessment, better aligning with ground truth observations from Google Earth satellite imagery. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
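The IRSEI entry above combines subjective (hierarchical-analysis) and objective (principal-component) factor weights under minimum entropy theory. The snippet below shows one common minimum-entropy-style combination, a renormalized geometric mean of the two weight vectors; the paper's exact formulation may differ, and the example weights are invented.

```python
# Sketch of a minimum-entropy-style combination of subjective and objective
# factor weights, as commonly used with improved-RSEI models (illustrative only).
import numpy as np

def combine_weights(w_subjective: np.ndarray, w_objective: np.ndarray) -> np.ndarray:
    w = np.sqrt(w_subjective * w_objective)   # geometric mean stays close to both sets of weights
    return w / w.sum()                        # renormalize so the combined weights sum to 1

w_ahp = np.array([0.30, 0.25, 0.20, 0.15, 0.05, 0.05])   # hypothetical subjective (AHP-style) weights
w_pca = np.array([0.22, 0.28, 0.18, 0.12, 0.10, 0.10])   # hypothetical PCA-derived objective weights
print(combine_weights(w_ahp, w_pca).round(3))
```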
7. Blocked out: reflections on the potential of intensive modes of teaching to enhance post-COVID-19 graduate employability in large-scale educational settings
- Author
- Laura Dixon and Valerie Makin
- Subjects
- Block teaching, Employability, Intensive delivery, Large-scale, Post-COVID-19, Social Sciences
- Abstract
Purpose – This paper explores the potential that block teaching offers to enhance employability in the context of large-scale classes. It suggests that block teaching, with its condensed structure, necessitates curriculum innovation, fosters participatory learning and peer-to-peer networking, and has been shown to increase student focus and enhance engagement and attainment, especially amongst diverse learners. As these are the same challenges that large-scale teaching faces, it is proposed that intensive modes of delivery could be scaled up in a way that may help to mitigate such problems as cohorts in business schools continue to increase in size. Design/methodology/approach – The paper is based on secondary research and provides an overview of literature that looks at block teaching, followed by that which explores the challenges of large-scale teaching contexts. It compares and contrasts the gaps in both to suggest a way that they could be combined. Findings – The paper provides key insights into changes in the contemporary landscape of teaching within UK business schools, which have seen increasingly large cohorts, and draws out the key strengths of intensive modes of delivery, which include helping students to manage their time effectively, encouraging curriculum innovation and the creation of participatory learning opportunities, as well as providing closer personal relationships between students and staff. Outlining some of the well-documented issues that can arise when teaching larger cohorts, the paper suggests that scaling up blocked delivery may offer a new way to help overcome them. Research limitations/implications – Because of the chosen research approach, the research results may lack generalisability. Therefore, researchers are encouraged to test the proposed propositions in large-scale teaching scenarios. Practical implications – This paper includes implications for the development of innovative modes of teaching in the context of large cohorts, an experience that is increasingly common amongst British business schools and beyond. Originality/value – This paper brings together two bodies of literature – that on intensive modes of teaching and that on large-scale teaching contexts – for the first time to show how the former may help to overcome some of the key issues arising in the latter.
- Published
- 2024
- Full Text
- View/download PDF
8. Managing EEG studies: How to prepare and what to do once data collection has begun
- Author
- Boudewyn, Megan A, Erickson, Molly A, Winsler, Kurt, Ragland, John Daniel, Yonelinas, Andrew, Frank, Michael, Silverstein, Steven M, Gold, Jim, MacDonald, Angus W, Carter, Cameron S, Barch, Deanna M, and Luck, Steven J
- Subjects
- Biological Sciences, Biomedical and Clinical Sciences, Psychology, Neurosciences, Clinical Research, Electroencephalography, Humans, Data Collection, Software, Research Design, EEG methods, guidelines, large-scale, multisite, protocol, recommendations, Medical and Health Sciences, Psychology and Cognitive Sciences, Experimental Psychology, Biological sciences, Biomedical and clinical sciences
- Abstract
In this paper, we provide guidance for the organization and implementation of EEG studies. This work was inspired by our experience conducting a large-scale, multi-site study, but many elements could be applied to any EEG project. Section 1 focuses on study activities that take place before data collection begins. Topics covered include: establishing and training study teams, considerations for task design and piloting, setting up equipment and software, development of formal protocol documents, and planning communication strategy with all study team members. Section 2 focuses on what to do once data collection has already begun. Topics covered include: (1) how to effectively monitor and maintain EEG data quality, (2) how to ensure consistent implementation of experimental protocols, and (3) how to develop rigorous preprocessing procedures that are feasible for use in a large-scale study. Links to resources are also provided, including sample protocols, sample equipment and software tracking forms, sample code, and tutorial videos (to access resources, please visit: https://osf.io/wdrj3/).
- Published
- 2023
9. Large-scale photonic inverse design: computational challenges and breakthroughs
- Author
- Kang Chanik, Park Chaejin, Lee Myunghoo, Kang Joonho, Jang Min Seok, and Chung Haejun
- Subjects
- large-scale, inverse design, computational challenges, Physics, QC1-999
- Abstract
Recent advancements in inverse design approaches, exemplified by their large-scale optimization of all geometrical degrees of freedom, have provided a significant paradigm shift in photonic design. However, these innovative strategies still require full-wave Maxwell solutions to compute the gradients concerning the desired figure of merit, imposing prohibitive computational demands on conventional computing platforms. This review analyzes the computational challenges associated with the design of large-scale photonic structures. It delves into the adequacy of various electromagnetic solvers for large-scale designs, from conventional to neural network-based solvers, and discusses their suitability and limitations. Furthermore, this review evaluates the research on optimization techniques, analyzes their advantages and disadvantages in large-scale applications, and sheds light on cutting-edge studies that combine neural networks with inverse design for large-scale applications. Through this comprehensive examination, this review aims to provide insights into navigating the landscape of large-scale design and advocate for strategic advancements in optimization methods, solver selection, and the integration of neural networks to overcome computational barriers, thereby guiding future advancements in large-scale photonic design.
- Published
- 2024
- Full Text
- View/download PDF
10. Beaconet: A Reference‐Free Method for Integrating Multiple Batches of Single‐Cell Transcriptomic Data in Original Molecular Space.
- Author
- Xu, Han, Ye, Yusen, Duan, Ran, Gao, Yong, Hu, Yuxuan, and Gao, Lin
- Subjects
- *TRANSCRIPTOMES, *BIOLOGICAL variation, *DATA integration
- Abstract
Integrating multiple single‐cell datasets is essential for the comprehensive understanding of cell heterogeneity. Batch effect is the undesired systematic variations among technologies or experimental laboratories that distort biological signals and hinder the integration of single‐cell datasets. However, existing methods typically rely on a selected dataset as a reference, leading to inconsistent integration performance using different references, or embed cells into uninterpretable low‐dimensional feature space. To overcome these limitations, a reference‐free method, Beaconet, for integrating multiple single‐cell transcriptomic datasets in original molecular space by aligning the global distribution of each batch using an adversarial correction network is presented. Through extensive comparisons with 13 state‐of‐the‐art methods, it is demonstrated that Beaconet can effectively remove batch effect while preserving biological variations and is superior to existing unsupervised methods using all possible references in overall performance. Furthermore, Beaconet performs integration in the original molecular feature space, enabling the characterization of cell types and downstream differential expression analysis directly using integrated data with gene‐expression features. Additionally, when applying to large‐scale atlas data integration, Beaconet shows notable advantages in both time‐ and space‐efficiencies. In summary, Beaconet serves as an effective and efficient batch effect removal tool that can facilitate the integration of single‐cell datasets in a reference‐free and molecular feature‐preserved mode. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Scalable computer interactive education system based on large-scale multimedia data analysis.
- Author
- Zhao, Jie, Liu, Taotang, and Li, Shuping
- Subjects
- INTERACTIVE computer systems, COMPUTER performance, DATA analysis, COMPUTER engineering, ONLINE education
- Abstract
Massive teaching resources cause serious efficiency problems for online teaching, and traditional online teaching models can even be inferior to traditional classroom teaching in terms of teaching effect. Based on this, this paper analyzes massive educational resources and builds a scalable computer interactive education system based on large-scale multimedia data analysis. Moreover, this paper defines the system roles according to the actual teaching situation and constructs the functional modules of the system structure. In addition, this paper uses computer simulation technology to analyze and improve interactive technology so that it becomes the core technology of the computer interactive education system, yielding an extensible interactive education system suited to the characteristics of online teaching. This also helps to monitor and assess the performance of the interactive educational system. Furthermore, this paper designs an experiment to evaluate the performance of the computer interactive education system, carried out from two aspects: interactive evaluation and teaching evaluation. The experimental results show that this system can effectively improve the quality of teaching. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. A Hybrid Parallel Processing Strategy for Large-Scale DEA Computation.
- Author
- Chang, Shengqing, Ding, Jingjing, Feng, Chenpeng, and Wang, Ruifeng
- Subjects
- PARALLEL processing, DATA envelopment analysis, TIME complexity, MESSAGE passing (Computer science), PARALLEL algorithms
- Abstract
Using data envelopment analysis (DEA) with large-scale data poses a big challenge to applications due to its computing-intensive nature. So far, various strategies have been proposed in academia to accelerate the DEA computation, including DEA algorithms such as hierarchical decomposition (HD), DEA enhancements such as restricted basis entry (RBE) and LP accelerators such as hot starts. However, few studies have integrated these strategies and combined them with a parallel processing framework to solve large-scale DEA problems. In this paper, a hybrid parallel DEA algorithm (named PRHH algorithm) is proposed, including the RBE algorithm, hot starts, and HD algorithm based on Message Passing Interface (MPI). Furthermore, the attribute of the PRHH algorithm is analyzed, and formalized as a computing time function, to shed light on its time complexity. Finally, the performance of the algorithm is investigated in various simulation scenarios with datasets of different characteristics and compared with existing methods. The results show that the proposed algorithm reduces computing time in general, and boosts performance dramatically in scenarios with low density in particular. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
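The PRHH entry above accelerates DEA by wrapping its linear programs in restricted basis entry, hot starts, and MPI-based hierarchical decomposition. The sketch below shows only the underlying per-DMU linear program, an input-oriented CCR envelopment model solved with SciPy on random data; none of the paper's acceleration machinery is implemented.

```python
# Minimal input-oriented CCR DEA efficiency score for one DMU via scipy.optimize.linprog.
# Not the paper's PRHH algorithm -- just the basic LP it parallelizes and accelerates.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X: np.ndarray, Y: np.ndarray, o: int) -> float:
    """X: (m_inputs, n_dmus), Y: (s_outputs, n_dmus), o: index of the DMU under evaluation."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                     # minimize theta
    # X @ lam <= theta * x_o   ->   -x_o * theta + X @ lam <= 0
    A_inputs = np.hstack([-X[:, [o]], X])
    b_inputs = np.zeros(m)
    # Y @ lam >= y_o           ->   -Y @ lam <= -y_o
    A_outputs = np.hstack([np.zeros((s, 1)), -Y])
    b_outputs = -Y[:, o]
    res = linprog(c,
                  A_ub=np.vstack([A_inputs, A_outputs]),
                  b_ub=np.r_[b_inputs, b_outputs],
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.x[0]

rng = np.random.default_rng(1)
X = rng.uniform(1, 10, (2, 20))   # 2 inputs, 20 DMUs (random demo data)
Y = rng.uniform(1, 10, (3, 20))   # 3 outputs
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```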
13. High-throughput Oxford Nanopore sequencing-based approach for the multilocus sequence typing analysis of large-scale avian Escherichia coli study in Mississippi
- Author
- Linan Jia, Mark A. Arick, Chuan-Yu Hsu, Daniel G. Peterson, Jeffrey D. Evans, Kelsy Robinson, Anuraj T. Sukumaran, Reshma Ramachandran, Pratima Adhikari, and Li Zhang
- Subjects
- avian Escherichia coli, Oxford Nanopore, large-scale, high-throughput, field study, Animal culture, SF1-1100
- Abstract
Avian pathogenic Escherichia coli (APEC) cause avian colibacillosis, and accurately distinguishing infectious isolates is critical for controlling their transmission. Multilocus sequence typing (MLST) is an accurate and efficient strain identification method for epidemiological surveillance. This research aimed to develop a fast and high-throughput workflow that simultaneously sequences the Achtman typing scheme's 7 housekeeping genes of multiple E. coli isolates using the Oxford Nanopore Technologies (ONT) platform for large-scale APEC studies. E. coli strains were isolated from poultry farms, the housekeeping genes were amplified, and amplicons were sequenced on an R9.4 MinION flow cell using the Nanopore GridION sequencer (ONT, Oxford, UK) following the initial workflow (ONT-MLST). Moreover, the workflow was revised by introducing large-scale DNA extraction and multiplex PCR into the ONT-MLST workflow and applied to 242 new isolates, 18 isolates from the previous workflow, and 5 ATCC reference strains using a Flongle flow cell on the Nanopore MinION Mk1C sequencer (ONT, Oxford, UK). Finally, the sequence type (ST) results of the 308 isolates collected from infected chickens and poultry farm environments were reported and analyzed. Data indicated that E. coli belonging to ST159, ST8578, and ST355 have the potential to infect multiple organs in broilers. In addition, zoonotic STs (ST69, ST10, ST38, and ST131) were detected from poultry farms. With the advantages of the high throughput of ONT, this study provides a rapid workflow for large-scale E. coli typing and identifies frequently isolated sequence types related to APEC infection in poultry.
- Published
- 2024
- Full Text
- View/download PDF
14. Statistical methods for survival analysis in large-scale electronic health records research
- Author
- Schmidt, James C. F.
- Subjects
- Statistical Methods, Survival analysis, Large-scale, Electronic Health Records research, thesis, Health sciences
- Abstract
The relative survival framework is a popular method for the estimation of a subject's survival, corrected for the effect of non-disease related causes of death. A comparison is made between the observed all-cause survival and the expected survival, derived from published population mortality rates known as life tables, often stratified by age, sex, and calendar year. Under certain assumptions, relative survival provides an estimate of net survival, survival in a hypothetical world where subjects can only die due to their disease. In order to interpret relative survival as net survival, other-cause mortality rates for subjects with the disease of interest must be the same as expected mortality rates. When interest lies in the relative survival of diseases with multiple shared risk factors, for example lung cancer, the use of standard life tables is unsuitable, requiring additional stratification by these risk factors. The primary aim of this research is to use a control population taken from large-scale linked electronic health records to adjust published life tables by comorbidity, and to investigate the impact of these and standard life tables on relative survival estimates. To achieve these research aims, bespoke software is developed to aid the management of large-scale health data, while investigations into mortality rates in the control population are undertaken, showing biased results when follow-up requirements form part of patient selection. Comorbidity-adjusted life tables are estimated using time-constant and time-updated exposures, and applied in a relative survival analysis in colorectal cancer, comparing groups defined by cardiovascular comorbidity status. This research extends concepts and methods previously developed to form novel approaches to the adjustment of background mortality data, taking into account the induced bias in the control population, and showing the importance of the use of correctly stratified life tables, with key implications for future studies investigating differential mortality rates.
- Published
- 2023
- Full Text
- View/download PDF
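The thesis abstract above builds on the standard relative-survival construction: observed all-cause survival divided by expected survival derived from (possibly comorbidity-adjusted) life-table rates. The function below is a deliberately simplified single-stratum sketch of that ratio, ignoring the Ederer-type weighting a real analysis would require; the rates are invented.

```python
# Simplified relative-survival sketch: observed survival over expected survival
# built from life-table mortality rates (one stratum, yearly intervals).
import math

def expected_survival(life_table_rates: list[float]) -> list[float]:
    """Cumulative expected survival from annual expected mortality rates (hazards)."""
    surv, cum_hazard = [], 0.0
    for rate in life_table_rates:
        cum_hazard += rate
        surv.append(math.exp(-cum_hazard))
    return surv

def relative_survival(observed: list[float], expected: list[float]) -> list[float]:
    return [o / e for o, e in zip(observed, expected)]

# Hypothetical 5-year example: observed all-cause survival vs. life-table expectation.
obs = [0.90, 0.81, 0.74, 0.69, 0.65]
exp = expected_survival([0.02, 0.022, 0.024, 0.026, 0.028])
print([round(r, 3) for r in relative_survival(obs, exp)])
```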
15. Principles and key technologies for the large-scale ecological utilization of coal gangue
- Author
- Zhenqi HU, Yanling ZHAO, and Zhen MAO
- Subjects
- coal gangue, large-scale, ecological utilization, pollution control, Geology, QE1-996.5, Mining engineering. Metallurgy, TN1-997
- Abstract
Coal is the main energy source in our country and the ballast stone of energy security. As an inevitable product of coal mining and coal washing, coal gangue has an annual output of more than 700 million tons and is in urgent need of large-scale, ecological utilization so that it does not remain a stumbling block to enterprise development. Based on analysis of the mechanism of ecological damage in large-scale utilization of coal gangue, the principle of large-scale ecological utilization of coal gangue was put forward, and concrete, environmentally safe solutions for large-scale ecological utilization are discussed from a technical perspective. Two key technologies for large-scale ecological utilization were put forward, namely in-situ contamination control with ecological restoration of acid coal gangue mountains, and ground filling with coal gangue. The results demonstrated that: ① The key to large-scale ecological utilization of coal gangue is the prevention and control of environmental contamination. Large-scale ecological utilization of coal gangue can be realized through evaluation of the availability and economy of gangue combined with environmental risk management. ② The ecological utilization of accumulated gangue mountains was realized by vegetation restoration based on in-situ contamination control, and an ecological utilization technology integrating pollution source diagnosis, fire prevention, pollution barriers, and vegetation restoration was developed. Based on analysis of the mechanism of pollution caused by the acid and heat produced by gangue oxidation, thermal infrared imaging coupled with surveying and mapping technology was used to locate deep burning points (oxidation points) in gangue mountains; an oxidation inhibitor coupled with fungicide and reducing bacteria was invented, covered with an inert material and rolled to exclude oxygen and prevent pollution; fire-fighting technology combining shotcrete fire control and grouting was adopted in spontaneous combustion areas; and a fire-proof vegetation restoration technology based on local and grass irrigation was put forward, which realized in-situ contamination control and ecological restoration of acid coal gangue mountains. ③ The ecological utilization of ground filling with coal gangue can be realized through feasibility analysis of the ecological utilization of ground filling, ecological utilization technology, and long-term monitoring with maintenance management. The keys to ecological utilization are contamination risk analysis for gangue material screening, necessity and feasibility analysis for ground filling site selection, and environmental protection measures throughout the filling process, including safety and environmental protection measures such as an anti-seepage barrier at the bottom of the site before filling, layered filling and soil profile reconstruction technologies for fire prevention and acid control during filling, and erosion control and vegetation restoration after filling. Large-scale ecological utilization of coal gangue not only solves the ecological and environmental problems caused by solid waste storage in mining areas, but also creates a new mode of ecological restoration in mining areas through the scientific, safe, and reasonable utilization of new and old gangue.
- Published
- 2024
- Full Text
- View/download PDF
16. High-Efficiency Dynamic Scanning Strategy for Powder Bed Fusion by Controlling Temperature Field of the Heat-Affected Zone
- Author
- Xiaokang Huang, Xiaoyong Tian, Qi Zhong, Shunwen He, Cunbao Huo, Yi Cao, Zhiqiang Tong, and Dichen Li
- Subjects
- Powder bed fusion, Efficiency, Large-scale, Spot size, Heat-affected zone (HAZ), Ocean engineering, TC1501-1800, Mechanical engineering and machinery, TJ1-1570
- Abstract
Improvement of fabrication efficiency and part performance is the main challenge for the large-scale powder bed fusion (PBF) process. In this study, a dynamic monitoring and feedback system for the powder bed temperature field using an infrared thermal imager was established and integrated into four-laser PBF equipment with a working area of 2000 mm × 2000 mm. The heat-affected zone (HAZ) temperature field was controlled by dynamically adjusting the scanning speed. Simultaneously, the relationship among spot size, HAZ temperature, and part performance was established. The fluctuation of the HAZ temperature in the four-laser scanning areas was decreased from 30.85 ℃ to 17.41 ℃, improving the consistency of the sintering performance of the produced large component. Based on the controllable temperature field, a dynamic adjustment strategy for laser spot size was proposed, by which the fabrication efficiency was improved by up to 65.38%. These results are of great significance for further industrial applications of large-scale PBF equipment.
- Published
- 2024
- Full Text
- View/download PDF
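The core control idea in the powder-bed-fusion entry above is to hold the heat-affected-zone temperature near a setpoint by adjusting scanning speed from thermal-imager feedback. The loop below is a generic proportional-control sketch of that idea, not the authors' controller; the gain, speed limits, and temperature readings are placeholders.

```python
# Generic proportional feedback sketch: adjust laser scan speed so the measured
# HAZ temperature tracks a setpoint (illustrative stand-in for the paper's system).
def update_scan_speed(speed_mm_s: float, t_measured: float, t_setpoint: float,
                      gain: float = 2.0, v_min: float = 200.0, v_max: float = 2000.0) -> float:
    # Too hot -> scan faster (less energy per unit length); too cold -> scan slower.
    new_speed = speed_mm_s + gain * (t_measured - t_setpoint)
    return min(max(new_speed, v_min), v_max)

speed = 800.0          # mm/s, arbitrary starting value
for t_haz in [195.0, 188.0, 181.0, 176.0, 173.0]:   # fake thermal-imager readings, degC
    speed = update_scan_speed(speed, t_haz, t_setpoint=175.0)
    print(round(speed, 1))
```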
17. CDSKNNXMBD: a novel clustering framework for large-scale single-cell data based on a stable graph structure
- Author
- Jun Ren, Xuejing Lyu, Jintao Guo, Xiaodong Shi, Ying Zhou, and Qiyuan Li
- Subjects
- scRNA-seq, Clustering, Large-scale, Imbalance ratio, Medicine
- Abstract
Background: Accurate and efficient cell grouping is essential for analyzing single-cell transcriptome sequencing (scRNA-seq) data. However, the existing clustering techniques often struggle to provide timely and accurate cell type groupings when dealing with datasets with large-scale or imbalanced cell types. Therefore, there is a need for improved methods that can handle the increasing size of scRNA-seq datasets while maintaining high accuracy and efficiency. Methods: We propose CDSKNNXMBD (Community Detection based on a Stable K-Nearest Neighbor Graph Structure), a novel single-cell clustering framework integrating a partition clustering algorithm and a community detection algorithm, which achieves accurate and fast cell type grouping by finding a stable graph structure. Results: We evaluated the effectiveness of our approach by analyzing 15 tissues from the human fetal atlas. Compared to existing methods, CDSKNN effectively counteracts the high imbalance in single-cell data, enabling effective clustering. Furthermore, we conducted comparisons across multiple single-cell datasets from different studies and sequencing techniques. CDSKNN is highly applicable and robust, and is capable of handling the complexities of diverse types of data. Most importantly, CDSKNN exhibits higher operational efficiency on datasets at the million-cell scale, requiring an average of only 6.33 min for clustering 1.46 million single cells, saving 33.3% to 99% of running time compared to existing methods. Conclusions: The CDSKNN is a flexible, resilient, and promising clustering tool that is particularly suitable for clustering imbalanced data and demonstrates high efficiency on large-scale scRNA-seq datasets.
- Published
- 2024
- Full Text
- View/download PDF
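The CDSKNNXMBD entry above is built around a k-nearest-neighbor graph followed by community detection. The sketch below shows that generic two-step pattern with scikit-learn and NetworkX on synthetic data; it omits the framework's partition-clustering stage and stability search, so it should not be read as the published method.

```python
# Generic kNN-graph + community-detection sketch (not the CDSKNNXMBD algorithm itself).
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
# Stand-in for a PCA-reduced expression matrix: 300 cells x 20 components, 3 rough groups.
cells = np.vstack([rng.normal(loc, 1.0, (100, 20)) for loc in (-3.0, 0.0, 3.0)])

# 1) Build a symmetric k-nearest-neighbor graph over cells.
knn = kneighbors_graph(cells, n_neighbors=15, mode="connectivity", include_self=False)
knn = knn.maximum(knn.T)                      # symmetrize the adjacency

# 2) Detect communities (clusters) on the graph.
graph = nx.from_scipy_sparse_array(knn)
communities = greedy_modularity_communities(graph)
labels = np.empty(len(cells), dtype=int)
for cluster_id, members in enumerate(communities):
    labels[list(members)] = cluster_id
print(len(communities), np.bincount(labels))
```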
18. A PDMS coating with excellent durability for large-scale deicing
- Author
- Tao Zhu, Yuan Yuan, Linbo Song, Xingde Wei, Huiying Xiang, Xu Dai, Xujiang Hua, and Ruijin Liao
- Subjects
- PDMS, Coating, Large-scale, Deicing, Durability, Mining engineering. Metallurgy, TN1-997
- Abstract
The icing of wind turbine blades leads to a decrease in output power, seriously jeopardizing the economic benefits and operational reliability of wind farms. Conventional deicing techniques require expensive equipment and consume a large amount of energy. Low-interfacial-toughness coatings, which require no energy input, are believed to be a highly promising passive deicing technology; however, their durability in service remains a challenge. Herein, low-interfacial-toughness PDMS coatings were prepared by physical blending. Through the optimization of added plasticizers and SiO₂, PDMS coatings with excellent large-scale deicing performance were obtained. The constant deicing force and ice adhesion strength were reduced to 14.69 N/cm and 12.63 kPa, respectively. Moreover, a systematic durability assessment of the PDMS coating was carried out to address the actual operating conditions of wind turbines. The results showed that the PDMS coating could withstand 200 icing/deicing cycles while maintaining a constant deicing force and ice adhesion strength of less than 50 N/cm and 30 kPa, respectively. After long-term thermal aging (21 days), UV irradiation (42 days) and salt spray corrosion (20 days), the PDMS coating still retained superior icephobicity and outstanding large-scale deicing performance. This work contributes to the research and development of low-interfacial-toughness materials for large-scale deicing applications on wind turbine blades.
- Published
- 2024
- Full Text
- View/download PDF
19. Insight into best practices: a review of long-term monitoring of the rocky intertidal zone of the Northeast Pacific Coast
- Author
- Kaplanis, Nikolas J
- Subjects
- Biological Sciences, Ecology, long-term monitoring, rocky intertidal zone, sampling design, Northeast Pacific Coast, ecology, large-scale, Oceanography, Geology
- Abstract
On the shores of the Northeast Pacific Coast, research programs have monitored the rocky intertidal zone for multiple decades across thousands of kilometers, ranking among the longest-term and largest-scale ecological monitoring programs in the world. These programs have produced powerful datasets using simple field methods, and many are now capitalizing on modern field-sampling technology and computing power to collect and analyze biological information at increasing scale and resolution. Considering its depth, breadth, and cutting-edge nature, this research field provides an excellent case study for examining the design and implementation of long-term, large-scale ecological monitoring. I curated literature and interviewed 25 practitioners to describe, in detail, the methods employed in 37 community-level surveys by 18 long-term monitoring programs on the Northeast Pacific Coast, from Baja California, México, to Alaska, United States of America. I then characterized trade-offs between survey design components, identified key strengths and limitations, and provided recommendations for best practices. In doing so, I identified data gaps and research priorities for sustaining and improving this important work. This analysis is timely, especially considering the threat that climate change and other anthropogenic stressors present to the persistence of rocky intertidal communities. More generally, this review provides insight that can benefit long-term monitoring within other ecosystems.
- Published
- 2023
20. Large‐scale assessment of genetic structure to assess risk of populations of a large herbivore to disease.
- Author
- Walter, W. David, Fameli, Alberto, Russo‐Petrick, Kelly, Edson, Jessie E., Rosenberry, Christopher S., Schuler, Krysten L., and Tonkovich, Michael J.
- Subjects
- *CHRONIC wasting disease, *WHITE-tailed deer, *PRINCIPAL components analysis, *HERBIVORES, *PHYSIOGRAPHIC provinces, *GENETIC variation
- Abstract
Chronic wasting disease (CWD) can spread among cervids by direct and indirect transmission, the former being more likely in emerging areas. Identifying subpopulations allows the delineation of focal areas to target for intervention. We aimed to assess the population structure of white‐tailed deer (Odocoileus virginianus) in the northeastern United States at a regional scale to inform managers regarding gene flow throughout the region. We genotyped 10 microsatellites in 5701 wild deer samples from Maryland, New York, Ohio, Pennsylvania, and Virginia. We evaluated the distribution of genetic variability through spatial principal component analysis and inferred genetic structure using non‐spatial and spatial Bayesian clustering algorithms (BCAs). We simulated populations representing each inferred wild cluster, wild deer in each state and each physiographic province, total wild population, and a captive population. We conducted genetic assignment tests using these potential sources, calculating the probability of samples being correctly assigned to their origin. Non‐spatial BCA identified two clusters across the region, while spatial BCA suggested a maximum of nine clusters. Assignment tests correctly placed deer into captive or wild origin in most cases (94%), as previously reported, but performance varied when assigning wild deer to more specific origins. Assignments to clusters inferred via non‐spatial BCA performed well, but efficiency was greatly reduced when assigning samples to clusters inferred via spatial BCA. Differences between spatial BCA clusters are not strong enough to make assignment tests a reliable method for inferring the geographic origin of deer using 10 microsatellites. However, the genetic distinction between clusters may indicate natural and anthropogenic barriers of interest for management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. Towards large-scale programmable silicon photonic chip for signal processing.
- Author
- Xie, Yiwei, Wu, Jiachen, Hong, Shihan, Wang, Cong, Liu, Shujun, Li, Huan, Ju, Xinyan, Ke, Xiyuan, Liu, Dajian, and Dai, Daoxin
- Subjects
- SIGNAL processing, OPTICAL computing, MICROWAVE photonics, OPTICAL dispersion, OPTICAL control, OPTICAL switching, OPTICAL communications, MICROWAVE filters
- Abstract
Optical signal processing plays a crucial part as a powerful engine for various information systems in practical applications. In particular, large-scale programmable chips for signal processing are highly desirable for their high flexibility, low cost and powerful processing. Silicon photonics, which has been developed successfully over the past decade, provides a promising option due to its unique advantages. Here, recent progress on large-scale programmable silicon photonic chips for signal processing in microwave photonics, optical communications, optical computing, quantum photonics as well as dispersion control is reviewed. In particular, we discuss the realization of high-performance building blocks, including ultra-low-loss silicon photonic waveguides, 2 × 2 Mach–Zehnder switches and microring resonator switches. The methods for configuring large-scale programmable silicon photonic chips are also discussed. Representative examples are summarized for applications in beam steering, optical switching, optical computing, quantum photonic processing as well as optical dispersion control. Finally, we give an outlook on the challenges of further developing large-scale programmable silicon photonic chips. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
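The review above singles out 2 × 2 Mach–Zehnder switches as a basic building block of programmable silicon photonic circuits. The short NumPy sketch below works through the standard idealized MZI transfer matrix (two 50:50 couplers around an internal phase shift) and checks the familiar cross/bar switching behaviour; this is textbook material rather than a model of the devices reported in the paper.

```python
# Idealized 2x2 Mach-Zehnder interferometer transfer matrix: coupler, phase shift, coupler.
import numpy as np

BS = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                  [1j, 1]])          # ideal 50:50 directional coupler

def mzi(theta: float) -> np.ndarray:
    phase = np.array([[np.exp(1j * theta), 0],
                      [0, 1]])                        # internal phase shifter on one arm
    return BS @ phase @ BS

for theta in (0.0, np.pi / 2, np.pi):
    T = mzi(theta)
    power = np.abs(T) ** 2                            # port-to-port power transfer
    print(f"theta = {theta:.2f}  cross = {power[1, 0]:.2f}  bar = {power[0, 0]:.2f}")
# theta = 0 routes all power to the cross port; theta = pi routes it to the bar port.
```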
22. The Effect of the Solution Flow and Electrical Field on the Homogeneity of Large-Scale Electrodeposited ZnO Nanorods.
- Author
- Zhao, Yanmin, Li, Kexue, Hu, Ying, Hou, Xiaobing, Lin, Fengyuan, Tang, Jilong, Tang, Xin, Xing, Xida, Zhao, Xiao, Zhu, Haibin, Wang, Xiaohua, and Wei, Zhipeng
- Subjects
- *NANORODS, *ZINC oxide, *ANTIREFLECTIVE coatings, *HOMOGENEITY, *TIN oxides
- Abstract
In this paper, we demonstrate the significant impact of the solution flow and electrical field on the homogeneity of large-scale ZnO nanorod electrodeposition from an aqueous solution containing zinc nitrate and ammonium nitrate, primarily based on the X-ray fluorescence results. The homogeneity can be enhanced by adjusting the counter electrode size and solution flow rate. We have successfully produced relatively uniform nanorod arrays on an 8 × 10 cm2 i-ZnO-coated fluorine-doped tin oxide (FTO) substrate using a compact counter electrode and a vertical stirring setup. The as-grown nanorods exhibit similar surface morphologies and dominant, intense, almost uniform near-band-edge emissions in different regions of the sample. Additionally, the surface reflectance is significantly reduced after depositing the ZnO nanorods, achieving a moth-eye effect through subwavelength structuring. This effect of the nanorod array structure indicates that it can improve the utilization efficiency of light reception or emission in various optoelectronic devices and products. The large-scale preparation of ZnO nanorods is more practical to apply and has an extremely broad application value. Based on the research results, it is feasible to prepare large-scale ZnO nanorods suitable for antireflective coatings and commercial applications by optimizing the electrodeposition conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. High-Efficiency Dynamic Scanning Strategy for Powder Bed Fusion by Controlling Temperature Field of the Heat-Affected Zone.
- Author
- Huang, Xiaokang, Tian, Xiaoyong, Zhong, Qi, He, Shunwen, Huo, Cunbao, Cao, Yi, Tong, Zhiqiang, and Li, Dichen
- Abstract
Improvement of fabrication efficiency and part performance is the main challenge for the large-scale powder bed fusion (PBF) process. In this study, a dynamic monitoring and feedback system for the powder bed temperature field using an infrared thermal imager was established and integrated into four-laser PBF equipment with a working area of 2000 mm × 2000 mm. The heat-affected zone (HAZ) temperature field was controlled by dynamically adjusting the scanning speed. Simultaneously, the relationship among spot size, HAZ temperature, and part performance was established. The fluctuation of the HAZ temperature in the four-laser scanning areas was decreased from 30.85 ℃ to 17.41 ℃, improving the consistency of the sintering performance of the produced large component. Based on the controllable temperature field, a dynamic adjustment strategy for laser spot size was proposed, by which the fabrication efficiency was improved by up to 65.38%. These results are of great significance for further industrial applications of large-scale PBF equipment. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. CDSKNNXMBD: a novel clustering framework for large-scale single-cell data based on a stable graph structure.
- Author
- Ren, Jun, Lyu, Xuejing, Guo, Jintao, Shi, Xiaodong, Zhou, Ying, and Li, Qiyuan
- Subjects
- *K-nearest neighbor classification, *PARALLEL algorithms, *MULTIPLE comparisons (Statistics)
- Abstract
Background: Accurate and efficient cell grouping is essential for analyzing single-cell transcriptome sequencing (scRNA-seq) data. However, the existing clustering techniques often struggle to provide timely and accurate cell type groupings when dealing with datasets with large-scale or imbalanced cell types. Therefore, there is a need for improved methods that can handle the increasing size of scRNA-seq datasets while maintaining high accuracy and efficiency. Methods: We propose CDSKNNXMBD (Community Detection based on a Stable K-Nearest Neighbor Graph Structure), a novel single-cell clustering framework integrating a partition clustering algorithm and a community detection algorithm, which achieves accurate and fast cell type grouping by finding a stable graph structure. Results: We evaluated the effectiveness of our approach by analyzing 15 tissues from the human fetal atlas. Compared to existing methods, CDSKNN effectively counteracts the high imbalance in single-cell data, enabling effective clustering. Furthermore, we conducted comparisons across multiple single-cell datasets from different studies and sequencing techniques. CDSKNN is highly applicable and robust, and is capable of handling the complexities of diverse types of data. Most importantly, CDSKNN exhibits higher operational efficiency on datasets at the million-cell scale, requiring an average of only 6.33 min for clustering 1.46 million single cells, saving 33.3% to 99% of running time compared to existing methods. Conclusions: The CDSKNN is a flexible, resilient, and promising clustering tool that is particularly suitable for clustering imbalanced data and demonstrates high efficiency on large-scale scRNA-seq datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. Scale‐up fabrication and advanced properties of recycled polyethylene terephthalate aerogels from plastic waste.
- Author
- Goh, Xue Yang, Deng, Xinying, Teo, Wern Sze, Ong, Ren Hong, Nguyen, Luon Tan, Bai, Tianliang, and Duong, Hai M.
- Subjects
- POLYETHYLENE terephthalate, PLASTIC scrap, MECHANICAL drawing, AEROGELS, BLENDED yarn, POLYVINYL alcohol
- Abstract
Traditional fabrication methods for aerogels are time consuming, toxic, and difficult to implement, making the production of aerogels expensive and severely limiting widespread adoption. Nonwoven technology is introduced to prepare fibers that can be used to create polymer‐based aerogels. It allows a continuous flow of fine fibers and eliminates the bottleneck of the fiber preparation phase of the fabrication process. Using recycled polyethylene terephthalate (rPET) fibers and polyvinyl alcohol, two types of rPET aerogels are successfully fabricated, namely lab‐scale and large‐scale aerogels, to investigate the effectiveness of the nonwoven process line for the fiber preparation step. Fibers prepared manually (lab‐scale aerogels) and with the aid of a fiber preparation production line (large‐scale aerogels) are characterized and compared. Both lab‐scale and large‐scale aerogels exhibited the required specifications of low density (12.6–45.9 and 13.2–43.7 mg/cm³, respectively) and high porosity (99.1%–96.7% and 99.0%–96.8%, respectively). Their thermal conductivity (23.4–34.0 and 23.2–31.9 mW/m⋅K, respectively) and compressive modulus (4.74–21.91 and 4.53–22.29 kPa, respectively) were also similar. The advantages of scaled preparation of fibers for aerogel manufacturing include higher throughput (the line can produce up to 60 kg/h), improved consistency of defibrillation, homogenous fiber blending, and accurate replication of laboratory‐made aerogel properties. This demonstrates the viability of using nonwoven technology to scale up to continuous production and bring down the production cost. Highlights: Scale-up production of aerogels using nonwoven technology. Improved preparation of aerogels through homogenous fiber blending. Preparation rate of up to 60 kg/h. High-porosity aerogels of up to 99%. Good thermal insulation of 23.2–31.9 mW/m⋅K. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
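The density and porosity figures in the rPET aerogel entry above are tied together by the usual relation porosity = 1 − ρ_bulk / ρ_skeletal. The quick check below reproduces the reported porosity range from the reported bulk densities, assuming a skeletal PET density of about 1380 mg/cm³ (a typical literature value, not stated in the abstract).

```python
# Quick consistency check of the aerogel figures above: porosity from bulk density.
# Assumes a skeletal PET density of ~1380 mg/cm^3 (typical literature value).
RHO_PET = 1380.0  # mg/cm^3, assumed skeletal density

def porosity(bulk_density_mg_cm3: float) -> float:
    return 1.0 - bulk_density_mg_cm3 / RHO_PET

for rho in (12.6, 45.9):   # lab-scale aerogel density range quoted in the abstract
    print(f"{rho} mg/cm^3 -> porosity {porosity(rho) * 100:.1f}%")
# Prints ~99.1% and ~96.7%, matching the porosity range reported in the abstract.
```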
26. A two-part alternating iteration power flow method based on dynamic equivalent admittance
- Author
- Tong Jiang, Hongfei Hou, Zhuocheng Feng, and Chang Chen
- Subjects
- Ill-conditioned, Large-scale, Power flow calculation, Equivalent admittance, Alternating iteration, Production of electric energy or power. Powerplants. Central stations, TK1001-1841
- Abstract
Power flow is an extensively used tool in various operations and planning of power systems. In this paper, we propose an efficient hybrid method for solving power flow problems in ill-conditioned power systems. This method is implemented in a framework of two-part alternating iteration. In Part I of this method, PV bus voltages are regarded as constant and PQ bus voltages are updated by the Z-bus Gauss method. In Part II of this method, PQ bus loads are represented as equivalent dynamic admittances, and PV bus voltage angles are updated by Newton's method. The values of PQ bus voltage magnitudes and PV bus voltage angles are passed between Part I and Part II. The proposed method is validated on well- and ill-conditioned systems and compared with several well-known power flow methods. Results show that the proposed method is robust and efficient in addressing the issues related to large-scale ill-conditioned power systems and is not significantly affected by the choice of initial guess.
- Published
- 2024
- Full Text
- View/download PDF
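The power-flow entry above alternates a Z-bus Gauss update for PQ buses with a Newton update for PV bus angles. The snippet below is not that two-part scheme; it is the classical Gauss–Seidel PQ-bus update on an invented two-bus network, included only to make the basic "update PQ voltages from the admittance matrix and injected power" step concrete.

```python
# Classical Gauss-Seidel power-flow update for PQ buses on a toy 2-bus system
# (slack bus 0, PQ load bus 1). Illustrates the basic voltage-update step only.
import numpy as np

y_line = 1.0 / (0.01 + 0.05j)                 # series admittance of the single line (p.u.)
Ybus = np.array([[y_line, -y_line],
                 [-y_line, y_line]])
S = np.array([0.0 + 0.0j, -(0.5 + 0.2j)])     # injected power; negative = load (p.u.)
V = np.array([1.0 + 0.0j, 1.0 + 0.0j])        # flat start; bus 0 is the slack

for _ in range(50):
    for i in [1]:                             # PQ buses only
        sum_yv = sum(Ybus[i, j] * V[j] for j in range(len(V)) if j != i)
        V[i] = (np.conj(S[i]) / np.conj(V[i]) - sum_yv) / Ybus[i, i]

print(abs(V[1]), np.degrees(np.angle(V[1])))  # converged PQ-bus voltage magnitude and angle
```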
27. Beaconet: A Reference‐Free Method for Integrating Multiple Batches of Single‐Cell Transcriptomic Data in Original Molecular Space
- Author
- Han Xu, Yusen Ye, Ran Duan, Yong Gao, Yuxuan Hu, and Lin Gao
- Subjects
- batch effects, large‐scale, molecular feature space, reference‐free, single‐cell datasets, Science
- Abstract
Integrating multiple single‐cell datasets is essential for the comprehensive understanding of cell heterogeneity. Batch effect is the undesired systematic variations among technologies or experimental laboratories that distort biological signals and hinder the integration of single‐cell datasets. However, existing methods typically rely on a selected dataset as a reference, leading to inconsistent integration performance using different references, or embed cells into uninterpretable low‐dimensional feature space. To overcome these limitations, a reference‐free method, Beaconet, for integrating multiple single‐cell transcriptomic datasets in original molecular space by aligning the global distribution of each batch using an adversarial correction network is presented. Through extensive comparisons with 13 state‐of‐the‐art methods, it is demonstrated that Beaconet can effectively remove batch effect while preserving biological variations and is superior to existing unsupervised methods using all possible references in overall performance. Furthermore, Beaconet performs integration in the original molecular feature space, enabling the characterization of cell types and downstream differential expression analysis directly using integrated data with gene‐expression features. Additionally, when applying to large‐scale atlas data integration, Beaconet shows notable advantages in both time‐ and space‐efficiencies. In summary, Beaconet serves as an effective and efficient batch effect removal tool that can facilitate the integration of single‐cell datasets in a reference‐free and molecular feature‐preserved mode.
- Published
- 2024
- Full Text
- View/download PDF
28. An overview of application-oriented multifunctional large-scale stationary battery and hydrogen hybrid energy storage system
- Author
- Yuchen Yang, Zhen Wu, Jing Yao, Tianlei Guo, Fusheng Yang, Zaoxiao Zhang, Jianwei Ren, Liangliang Jiang, and Bo Li
- Subjects
- Hybrid energy storage system, Battery, Hydrogen, Stationary, Large-scale, Multifunctional, Technology, Science (General), Q1-390
- Abstract
The imperative to address traditional energy crises and environmental concerns has accelerated the need for energy structure transformation. However, the variable nature of renewable energy poses challenges in meeting complex practical energy requirements. To address this issue, the construction of a multifunctional large-scale stationary energy storage system is considered an effective solution. This paper critically examines the battery and hydrogen hybrid energy storage systems. Both technologies face limitations hindering them from fully meeting future energy storage needs, such as large storage capacity in limited space, frequent storage with rapid response, and continuous storage without loss. Batteries, with their rapid response (90 %), excel in frequent short-duration energy storage. However, limitations such as a self-discharge rate (>1 %) and capacity loss (∼20 %) restrict their use for long-duration energy storage. Hydrogen, as a potential energy carrier, is suitable for large-scale, long-duration energy storage due to its high energy density, steady state, and low loss. Nevertheless, it is less efficient for frequent energy storage due to its low storage efficiency (∼50 %). Ongoing research suggests that a battery and hydrogen hybrid energy storage system could combine the strengths of both technologies to meet the growing demand for large-scale, long-duration energy storage. To assess their applied potentials, this paper provides a detailed analysis of the research status of both energy storage technologies using proposed key performance indices. Additionally, application-oriented future directions and challenges of the battery and hydrogen hybrid energy storage system are outlined from multiple perspectives, offering guidance for the development of advanced energy storage systems.
- Published
- 2024
- Full Text
- View/download PDF
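The hybrid-storage entry above contrasts batteries (high round-trip efficiency, ongoing standing losses) with hydrogen (lower storage efficiency, negligible standing loss). The back-of-the-envelope comparison below uses the abstract's approximate figures, treating the quoted 90 % as the battery's round-trip efficiency and >1 % as a daily self-discharge rate; that reading is my interpretation, not the paper's definition.

```python
# Rough comparison of energy returned after a storage period of d days,
# using approximate figures quoted in the abstract (interpretation hedged above).
def energy_returned(e_in_kwh: float, round_trip_eff: float,
                    daily_self_discharge: float, days: int) -> float:
    return e_in_kwh * round_trip_eff * (1.0 - daily_self_discharge) ** days

E_IN = 100.0  # kWh stored
for days in (1, 7, 30, 90):
    battery = energy_returned(E_IN, 0.90, 0.01, days)    # ~90% efficient, ~1%/day standing loss
    hydrogen = energy_returned(E_IN, 0.50, 0.0, days)    # ~50% efficient, negligible standing loss
    print(f"{days:3d} days: battery {battery:5.1f} kWh, hydrogen {hydrogen:5.1f} kWh")
# The battery wins for short, frequent cycling; hydrogen catches up for long durations
# (the crossover falls at roughly two months under these assumed numbers).
```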
29. A Heuristic Cutting Plane Algorithm For Budget Allocation of Large-scale Domestic Airport Network Protection
- Author
- Yan Xihong and Hao Shiyu
- Subjects
- airport security, large-scale, heuristic cutting plane algorithm, budget allocation, Electronic computers. Computer science, QA75.5-76.95
- Abstract
It is well known that airport security is an important component of homeland security, since airports are highly vulnerable to terrorist attacks. In order to improve the overall security of the domestic airport network, previous work studied the budget allocation of domestic airport network protection, establishing a minimax optimization model and designing an exact cutting plane algorithm to solve the problem. However, the exact algorithm cannot solve large-scale problems in an acceptable time. Hence, this paper designs a heuristic cutting plane algorithm for solving the budget allocation problem of large-scale domestic airport network protection. Finally, numerical experiments are carried out to demonstrate the feasibility and effectiveness of the new algorithm.
- Published
- 2024
- Full Text
- View/download PDF
30. Deformable Transformer and Spectral U-Net for Large-Scale Hyperspectral Image Semantic Segmentation
- Author
- Tianjian Zhang, Zhaohui Xue, and Hongjun Su
- Subjects
- Deep learning, hyperspectral remote sensing, large-scale, semantic segmentation, transformer, Ocean engineering, TC1501-1800, Geophysics. Cosmic physics, QC801-809
- Abstract
Remote sensing semantic segmentation tasks aim to automatically extract land cover types by accurately classifying each pixel. However, large-scale hyperspectral remote sensing images possess rich spectral information, complex and diverse spatial distributions, significant scale variations, and a wide variety of land cover types with detailed features, which pose significant challenges for segmentation tasks. To overcome these challenges, this study introduces a U-shaped semantic segmentation network that combines global spectral attention and deformable Transformer for segmenting large-scale hyperspectral remote sensing images. First, convolution and global spectral attention are utilized to emphasize features with the richest spectral information, effectively extracting spectral characteristics. Second, deformable self-attention is employed to capture global-local information, addressing the complex scale and distribution of objects. Finally, deformable cross-attention is used to aggregate deep and shallow features, enabling comprehensive semantic information mining. Experiments conducted on a large-scale hyperspectral remote sensing dataset (WHU-OHS) demonstrate that: first, in different cities including Changchun, Shanghai, Guangzhou, and Karamay, DTSU-Net achieved the highest performance in terms of mIoU compared to the baseline methods, reaching 56.19%, 37.89%, 52.90%, and 63.54%, with an average improvement of 7.57% to 34.13%, respectively; second, module ablation experiments confirm the effectiveness of our proposed modules, and deformable Transformer significantly reduces training costs compared to conventional Transformers; third, our approach achieves the highest mIoU of 57.22% across the entire dataset, with a balanced trade-off between accuracy and parameter efficiency, demonstrating an improvement of 1.65% to 56.58% compared to the baseline methods.
- Published
- 2024
- Full Text
- View/download PDF
31. Chain-Splitting-Solving-Splicing Approach to Large-Scale OFISP-Modeled Satellite Range Scheduling Problem
- Author
-
De Meng, Zhen-Bao Liu, Yu-Hang Gao, Zu-Ren Feng, Wen-Hua Guo, and Zhi-Gang Ren
- Subjects
Large-scale ,fixed interval scheduling ,combinatorial optimization ,dynamic programming ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
The NP-hard satellite range scheduling problem (SRSP), modeled as an operational fixed interval scheduling problem (OFISP), grows rapidly in dimension in large instances. This motivates a chain-splitting-solving-splicing (CSSS) approach to the OFISP-modeled SRSP under the curse of dimensionality. To improve performance on large-scale SRSP instances, the algorithm is designed around three stages. The chain-splitting (CS) procedure splits the original problem into small subproblems, which are solved by route-reduction-based dynamic programming (R-DP); a new scheduling element, the critical resource (CR), is introduced to speed up the search for the optimal subsolution of each subproblem. Finally, standard dynamic programming (S-DP) splices the subsolutions into a complete optimal solution. The CR-based subproblem solving is thus embedded in the chain splitting and splicing framework. Mathematical analysis shows that the chain-splitting procedure greatly curbs the exponential growth in running time. The efficiency of the proposed method is demonstrated on a large-scale real-world SRSP instance, where the CSSS approach outperforms several state-of-the-art algorithms and obtains optimal solutions within reasonable time for cases of up to 3000 jobs. To the best of our knowledge, little other research addresses the large-scale SRSP, which suggests the potential of this work as a benchmark for future comparisons.
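The abstract does not spell out the R-DP recursion; as background on the dynamic programming that fixed interval scheduling rests on, here is the classic single-resource weighted interval scheduling DP (sort by finish time, binary-search the latest compatible job). This is a textbook sketch, not the authors' CSSS algorithm, and the pass data are hypothetical.

```python
import bisect

def max_weight_schedule(jobs):
    """jobs: list of (start, finish, weight) for one resource; returns best total weight."""
    jobs = sorted(jobs, key=lambda j: j[1])                 # order by finish time
    finishes = [f for _, f, _ in jobs]
    n = len(jobs)
    dp = [0.0] * (n + 1)                                    # dp[k] = best weight using first k jobs
    for k in range(1, n + 1):
        start, _, weight = jobs[k - 1]
        p = bisect.bisect_right(finishes, start, 0, k - 1)  # jobs finishing no later than `start`
        dp[k] = max(dp[k - 1],                              # skip job k
                    dp[p] + weight)                         # take job k after a compatible prefix
    return dp[n]

# Hypothetical satellite passes: (start, end, priority)
passes = [(0, 3, 5.0), (2, 6, 6.0), (5, 8, 4.0), (7, 9, 3.0)]
print(max_weight_schedule(passes))                          # 9.0 -> first and third passes
```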
- Published
- 2024
- Full Text
- View/download PDF
32. Large-Scale Green Method for Synthesizing Ultralong Uniform Tellurium Nanowires for Semiconductor Devices
- Author
-
Zhiyi Lyu, Mose Park, Yanjin Tang, Hoon Choi, Seung Hyun Song, and Hoo-Jeong Lee
- Subjects
tellurium nanowires ,green synthesis ,semiconductor applications ,thin-film transistors ,large-scale ,Chemistry ,QD1-999 - Abstract
This study presents a large-scale green approach for synthesizing ultralong tellurium nanowires (Te NWs) with diameters around 13 nm using a solution-based method. By adjusting key synthesis parameters such as the surfactant concentration, temperature, and reaction duration, we achieved high-quality, ultralong Te NWs. These nanowires exhibit properties suitable for semiconductor applications, particularly when employed as channel materials in thin-film transistors, displaying a pronounced gate effect with a high on/off switching ratio of up to 10⁴ and a mobility of 0.9 cm² V⁻¹ s⁻¹. This study underscores the potential of solution-based methods for synthesizing large-scale ultralong Te NWs as a critical resource for future sustainable nanoelectronic devices.
- Published
- 2024
- Full Text
- View/download PDF
33. Evaluation of Ecological Environment Quality Using an Improved Remote Sensing Ecological Index Model
- Author
-
Yanan Liu, Wanlin Xiang, Pingbo Hu, Peng Gao, and Ai Zhang
- Subjects
ecological environments ,remote sensing ecological index ,large-scale ,3D ecological factors ,subjective and objective weights determination ,Science - Abstract
The Remote Sensing Ecological Index (RSEI) model is widely used for large-scale, rapid Ecological Environment Quality (EEQ) assessment. However, both the RSEI and its improved models explain the EEQ with only two-dimensional (2D) factors, resulting in inaccurate evaluation results. Incorporating more comprehensive, three-dimensional (3D) ecological information also poses challenges for maintaining stability in large-scale monitoring when traditional weighting methods such as Principal Component Analysis (PCA) are used. This study introduces an Improved Remote Sensing Ecological Index (IRSEI) model that integrates 2D (normalized difference vegetation factor, normalized difference built-up and soil factor, heat factor, wetness, difference factor for air quality) and 3D (comprehensive vegetation factor) ecological factors for enhanced EEQ monitoring. The model employs a combined subjective–objective weighting approach, utilizing principal components and hierarchical analysis under minimum entropy theory. A comparative analysis of IRSEI and RSEI in Miyun, a representative study area, reveals a strong correlation and consistent monitoring trends. By incorporating air quality and 3D ecological factors, IRSEI provides a more accurate and detailed EEQ assessment, better aligning with ground truth observations from Google Earth satellite imagery.
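The exact weighting scheme is not given in the abstract; as a loose illustration of combining a subjective and an objective weight vector under a minimum cross-entropy criterion (which yields a normalized geometric mean) and then scoring an index as a weighted sum, consider the sketch below. The factor names and numbers are hypothetical.

```python
import numpy as np

# Hypothetical weights for five ecological factors (e.g. greenness, dryness,
# heat, wetness, air quality): one subjective (expert/AHP-style) vector and
# one objective (data-driven, PCA/entropy-style) vector.
w_subjective = np.array([0.30, 0.20, 0.15, 0.20, 0.15])
w_objective = np.array([0.25, 0.25, 0.20, 0.15, 0.15])

# Minimizing the summed KL divergence to both vectors gives the normalized
# geometric mean as the combined weight.
combined = np.sqrt(w_subjective * w_objective)
combined /= combined.sum()

# Index value for one pixel with normalized factor scores in [0, 1].
factors = np.array([0.72, 0.40, 0.55, 0.63, 0.80])
index_value = float(combined @ factors)
print(np.round(combined, 3), round(index_value, 3))
```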
- Published
- 2024
- Full Text
- View/download PDF
34. THE END OF TRADITIONAL FOCUS GROUPS? SCALING UP QUALITATIVE RESEARCH QUICK, YET MAINTAINING DEPTH.
- Author
-
Anis, Azaleah Mohd and Olisa, Nadia
- Subjects
- *
FOCUS groups , *QUALITATIVE research , *GENERALIZABILITY theory , *DATA quality , *ARTIFICIAL intelligence - Abstract
The key benefits of qualitative research are rich insights and thick data. However, these may come at the cost of small sample sizes and low generalisability of findings. With traditional focus group discussions (FGDs), this could be addressed by conducting multiple groups, but doing so requires significant investment of time and manpower. It was therefore worth exploring methods to gather thick data quickly enough to effectively replace FGDs, increasing the number of respondents in a single session without increasing manpower or lengthening fieldwork, while maintaining data quality. This paper details the experience of running a pilot study of a 1.5-hour online discussion with N=103 respondents, designed to capture in-depth responses at scale. Using pre-programmed questions and artificial intelligence (AI) to provide instant visual analyses of responses and additional live probes to respondents, a full qualitative study was run with a larger sample in the same duration required for a typical FGD. The discussion was text-based: respondents could view and indicate their agreement or disagreement with what others had said without interacting directly. The data were compared with data from a previous study in which a total of 35 respondents across 5 FGDs discussed a similar topic, and both methodologies were analysed from the operational perspective of conducting research and gathering insights. While this methodology does not replace the traditional FGD, it has proven effective in scaling up qualitative research by gathering large amounts of qualitative data within a short duration, in real time. It has its limitations, primarily the inability to further nuance responses. Despite this, the pilot appears to be conceptually successful, as the AI generated valuable instant insights while the study was ongoing, particularly from open-ended (OE) responses. It may add value to specific use cases such as large-scale engagement studies, which require both breadth and scalability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
35. The Potential of AI-Driven Assistants in Scaled Agile Software Development.
- Author
-
Saklamaeva, Vasilka and Pavlič, Luka
- Subjects
AGILE software development ,SOFTWARE engineering ,ARTIFICIAL intelligence ,COMPUTER software development - Abstract
Scaled agile development approaches are now used widely in modern software engineering, allowing businesses to improve teamwork, productivity, and product quality. The incorporation of artificial intelligence (AI) into scaled agile development methods (SADMs) has emerged as a potential strategy in response to the ongoing demand for simplified procedures and the increasing complexity of software projects. This paper explores the intersection of AI-driven assistants within the context of the scaled agile framework (SAFe) for large-scale software development, as it stands out as the most widely adopted framework. Our paper pursues three principal objectives: (1) an evaluation of the challenges and impediments encountered by organizations during the implementation of SADMs, (2) an assessment of the potential advantages stemming from the incorporation of AI in large-scale contexts, and (3) the compilation of aspects of SADMs that AI-driven assistants enhance. Through a comprehensive systematic literature review, we identified and described 18 distinct challenges that organizations confront. In the course of our research, we pinpointed seven benefits and five challenges associated with the implementation of AI in SADMs. These findings were systematically categorized based on their occurrence either within the development phase or the phases encompassing planning and control. Furthermore, we compiled a list of 15 different AI-driven assistants and tools, subjecting them to a more detailed examination, and employing them to address the challenges we uncovered during our research. One of the key takeaways from this paper is the exceptional versatility and effectiveness of AI-driven assistants, demonstrating their capability to tackle a broader spectrum of problems. In conclusion, this paper not only sheds light on the transformative potential of AI, but also provides invaluable insights for organizations aiming to enhance their agility and management capabilities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
36. An efficient large‐scale whole‐genome sequencing analyses practice with an average daily analysis of 100Tbp: ZBOLT.
- Author
-
Li, Zhichao, Xie, Yinlong, Zeng, Wenjun, Huang, Yushan, Gu, Shengchang, Gao, Ya, Huang, Weihua, Lu, Lihua, Wang, Xiaohong, Wu, Jiasheng, Yin, Xiaoxu, Zhu, Rongyi, Huang, Guodong, Lu, Lin, Tang, Jingbo, Zheng, Yunping, Liu, Quan, Zhou, Xianqiang, Shan, Riqiang, and Wang, Bo
- Subjects
- *
NUCLEOTIDE sequencing , *SEQUENCE analysis , *BASE pairs , *GENOMICS , *ENERGY consumption , *GENETIC software - Abstract
Background: With the advancement of whole‐genome sequencing (WGS) technology, massively parallel sequencing (MPS) remains the mainstream approach owing to its accuracy, low cost, and high throughput. The development of analytical pipelines for MPS has therefore always been of great importance. Increasingly large population genomics studies, as a specific type of big data research, pose new challenges for analysis solutions. Results: Here, we introduce ZBOLT, a comprehensive analysis system that incorporates both software and hardware advancements, making it an appropriate choice for large‐scale population genomic studies that require extensive data processing. In this study, we first evaluate ZBOLT's calling accuracy using the Genome in a Bottle (GIAB) benchmark dataset. We then apply ZBOLT to a large‐scale population genomics study with 5,616 high-depth samples totaling 1.16 Pbp (petabase pairs). The results show that ZBOLT achieves exceptional efficiency and low energy consumption, processing 100 Tbp per day and using about 1 kWh per 100 Gbp of sequenced sample. Conclusion: This research serves as a valuable reference for analyzing sequencing data from large population cohorts and underscores the significant potential of ZBOLT in large‐scale population genomics studies. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
37. Decision variable contribution based adaptive mechanism for evolutionary multi-objective cloud workflow scheduling.
- Author
-
Li, Jun, Xing, Lining, Zhong, Wen, Cai, Zhaoquan, and Hou, Feng
- Subjects
WORKFLOW management systems ,WORKFLOW ,SCHEDULING ,CLOUD computing ,PRODUCTION scheduling - Abstract
Workflow scheduling is vital for cloud platforms to simultaneously minimize execution cost and makespan, since data dependencies among large-scale workflow tasks mean that the cloud workflow scheduling problem involves a large number of interacting decision variables. The cooperative coevolution approach has shown competitive performance in resolving large-scale problems by transforming the original problem into a series of small-scale subproblems. However, static transformation mechanisms cannot separate interacting decision variables, whereas random transformation mechanisms suffer from low efficiency. To tackle these issues, this paper suggests a decision-variable-contribution-based adaptive evolutionary cloud workflow scheduling approach (VCAES for short). Specifically, the VCAES includes a new estimation method to quantify the contribution of each decision variable to population advancement in terms of both convergence and diversity, and it dynamically classifies the decision variables according to their contributions during previous iterations. Moreover, the VCAES includes a mechanism to adaptively allocate evolution opportunities to each constructed group of decision variables, so that decision variables with a strong impact on population advancement receive more evolution opportunities and accelerate the population's approximation of the Pareto-optimal front. To verify the effectiveness of the proposed VCAES, we carry out extensive numerical experiments on real-world workflows and cloud platforms to compare it with four representative algorithms. The numerical results demonstrate the superiority of the VCAES in resolving cloud workflow scheduling problems. [ABSTRACT FROM AUTHOR]
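The abstract's concrete estimation and scheduling rules are not given, so the following is only a hypothetical, single-objective toy standing in for the multi-objective setting: it ranks decision variables by the fitness change a perturbation causes and then spends more local-search evaluations on the influential group.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    # Hypothetical scalarized objective standing in for the cost/makespan trade-off.
    return np.sum(x[:5] ** 2) + 0.01 * np.sum(np.abs(x[5:]))

def estimate_contributions(x, step=0.1):
    """Perturb each variable and record the fitness change it causes."""
    base = fitness(x)
    contrib = np.zeros(len(x))
    for i in range(len(x)):
        y = x.copy()
        y[i] += step
        contrib[i] = abs(fitness(y) - base)
    return contrib

x = rng.uniform(-1, 1, size=20)
contrib = estimate_contributions(x)
strong = np.argsort(contrib)[-5:]          # most influential variables
weak = np.argsort(contrib)[:-5]

# Adaptive allocation: the strong group gets more evolution opportunities.
for group, evals in ((strong, 200), (weak, 50)):
    for _ in range(evals):
        cand = x.copy()
        cand[group] += rng.normal(0, 0.05, size=len(group))
        if fitness(cand) < fitness(x):
            x = cand

print("final fitness:", round(fitness(x), 4))
```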
- Published
- 2023
- Full Text
- View/download PDF
38. Large‐scale assessment of genetic structure to assess risk of populations of a large herbivore to disease
- Author
-
W. David Walter, Alberto Fameli, Kelly Russo‐Petrick, Jessie E. Edson, Christopher S. Rosenberry, Krysten L. Schuler, and Michael J. Tonkovich
- Subjects
chronic wasting disease ,genetic structure ,large‐scale ,population assignment ,population structure ,white‐tailed deer ,Ecology ,QH540-549.5 - Abstract
Abstract Chronic wasting disease (CWD) can spread among cervids by direct and indirect transmission, the former being more likely in emerging areas. Identifying subpopulations allows the delineation of focal areas to target for intervention. We aimed to assess the population structure of white‐tailed deer (Odocoileus virginianus) in the northeastern United States at a regional scale to inform managers regarding gene flow throughout the region. We genotyped 10 microsatellites in 5701 wild deer samples from Maryland, New York, Ohio, Pennsylvania, and Virginia. We evaluated the distribution of genetic variability through spatial principal component analysis and inferred genetic structure using non‐spatial and spatial Bayesian clustering algorithms (BCAs). We simulated populations representing each inferred wild cluster, wild deer in each state and each physiographic province, total wild population, and a captive population. We conducted genetic assignment tests using these potential sources, calculating the probability of samples being correctly assigned to their origin. Non‐spatial BCA identified two clusters across the region, while spatial BCA suggested a maximum of nine clusters. Assignment tests correctly placed deer into captive or wild origin in most cases (94%), as previously reported, but performance varied when assigning wild deer to more specific origins. Assignments to clusters inferred via non‐spatial BCA performed well, but efficiency was greatly reduced when assigning samples to clusters inferred via spatial BCA. Differences between spatial BCA clusters are not strong enough to make assignment tests a reliable method for inferring the geographic origin of deer using 10 microsatellites. However, the genetic distinction between clusters may indicate natural and anthropogenic barriers of interest for management.
- Published
- 2024
- Full Text
- View/download PDF
39. Agile tailoring in distributed large-scale environments using agile frameworks: A Systematic Literature Review
- Author
-
Rafael Camara and Marcelo Marinho
- Subjects
Agile ,Distributed Software Development ,Large-scale ,Tailoring agile ,Scaling agile frameworks ,Agile frameworks ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
With the increasing adoption of agile methodologies by distributed software development teams, there is a need to adapt these practices for large-scale environments; however, the lack of specific guidance can make this process difficult. To evaluate how large-scale agile distributed teams adapt their practices to their specific contexts, this study conducts a Systematic Literature Review (SLR) of adaptations of agile methodologies in distributed software development teams operating in large-scale environments. The review identified 96 adapted practices from five agile frameworks (Scrum, Scaled Agile Framework (SAFe), Large Scale Scrum (LeSS), the Spotify model, and Disciplined Agile Delivery (DAD)) used in case studies published between 2007 and 2021. Scrum was the most commonly adapted framework with 32 customized practices, followed by SAFe (25), LeSS (17), the Spotify model (13), and DAD (9). The review provides insights into how these practices have been tailored to meet the needs of distributed teams in large-scale contexts, and the findings can guide organizations in adapting agile practices to their own contexts.
- Published
- 2024
- Full Text
- View/download PDF
40. Accelerate spatiotemporal fusion for large-scale applications
- Author
-
Yunfei Li, Liangli Meng, Huaizhang Sun, Qian Shi, Jun Li, and Yaotong Cai
- Subjects
Spatiotemporal fusion ,Accelerate ,Large-scale ,Physical geography ,GB3-5030 ,Environmental sciences ,GE1-350 - Abstract
Spatiotemporal fusion (STF) can provide dense satellite image series with high spatial resolution. However, most spatiotemporal fusion approaches are time-consuming, which seriously limits their applicability to large-scale areas. To address this problem, some efforts have been made to accelerate STF approaches with graphics processing units (GPUs), with dramatic effect; however, this strategy is hardware dependent, and suitable hardware may not always be available. In this paper, we develop a hardware-independent acceleration strategy, named AcSTF. The proposed AcSTF consists of two steps: medium-resolution STF (MSTF) and local-normalization-based fast fusion (LNFM). MSTF uses STF methods to raise the coarse-spatial-resolution images to a medium spatial resolution, while LNFM further refines the medium-spatial-resolution images to produce fine-spatial-resolution images. To test AcSTF, experiments are conducted with five commonly used STF approaches on two public Landsat-MODIS datasets. The experimental results indicate that AcSTF not only reduces the running time of current STF approaches by 87%–95%, but also preserves their qualitative and quantitative performance well. We then apply AcSTF to produce an intact 30 m image of the whole Ukraine mainland. Without any hardware acceleration, reconstructing the 30 m image takes 5.42 h on an ordinary central processing unit (CPU). Compared with the real Landsat image, the reconstructed image achieves remarkable qualitative and quantitative performance, demonstrating the practicability of AcSTF.
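The LNFM step itself is not described in detail in the abstract; as a hypothetical illustration of local-normalization-style refinement (matching block-wise mean and standard deviation of a medium-resolution prediction to a fine-resolution reference), consider this toy sketch. It is not the authors' LNFM.

```python
import numpy as np

def local_normalize(medium, fine_ref, block=16):
    """Match block-wise mean/std of `medium` to those of `fine_ref` (toy example)."""
    out = np.empty_like(medium, dtype=float)
    h, w = medium.shape
    for r in range(0, h, block):
        for c in range(0, w, block):
            m = medium[r:r + block, c:c + block].astype(float)
            f = fine_ref[r:r + block, c:c + block].astype(float)
            m_std = m.std() if m.std() > 1e-6 else 1.0
            out[r:r + block, c:c + block] = (m - m.mean()) / m_std * f.std() + f.mean()
    return out

# Hypothetical 128x128 images already resampled to a common grid.
rng = np.random.default_rng(1)
medium = rng.normal(0.4, 0.05, (128, 128))
fine_ref = rng.normal(0.5, 0.10, (128, 128))
print(local_normalize(medium, fine_ref).shape)
```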
- Published
- 2024
- Full Text
- View/download PDF
41. THE END OF TRADITIONAL FOCUS GROUPS?
- Author
-
Azaleah Mohd Anis and Nadia Olisa
- Subjects
Big Qual ,Focus Groups ,Saturation ,Large-scale ,General Works - Abstract
The key benefits of qualitative research are rich insights and thick data. However, some argue these come at the cost of small sample sizes and low generalisability of findings. With traditional focus group discussions (FGDs), this could be addressed by conducting multiple groups, but doing so requires significant investment of time and manpower. We explored methods to gather thick data quickly, aiming to increase the number of respondents without increasing manpower or lengthening fieldwork, while maintaining data quality. In this paper, we detail our experience running a pilot study of a 1.5-hour online discussion with N=100 respondents, designed to capture in-depth responses at scale. Using pre-programmed questions and artificial intelligence (AI) to provide instant visual analyses of responses and additional live probes to respondents, we ran a full qualitative study with a bigger sample in the same duration required for a typical FGD. The discussion was text-based: respondents could view and indicate their agreement or disagreement with what others had said without interacting directly with one another. The data were compared with data collected from a previous study on a similar topic, and both methodologies were analysed from the operational perspective of conducting research and gathering insights. While this methodology does not replace the traditional FGD, it has proven effective in scaling up qualitative research by gathering large amounts of qualitative data within a short duration, in real time. The methodology has its limitations, primarily the inability to further nuance responses. Despite this, the pilot study appears to be conceptually successful, as the AI generated valuable instant insights while the study was ongoing, particularly from open-ended (OE) responses. It may add value to specific use cases such as quick sensing, which require both breadth and scalability.
- Published
- 2024
- Full Text
- View/download PDF
42. Large-scale and high-resolution paddy rice intensity mapping using downscaling and phenology-based algorithms on Google Earth Engine
- Author
-
Liangli Meng, Yunfei Li, Ruoque Shen, Yi Zheng, Baihong Pan, Wenping Yuan, Jun Li, and Li Zhuo
- Subjects
Large-scale ,High-resolution ,Downscaling ,PRCI ,Phenology-based algorithms ,GEE ,Physical geography ,GB3-5030 ,Environmental sciences ,GE1-350 - Abstract
Accurate mapping of paddy rice cropping intensity (PRCI) is important for precision agriculture, water use management, and informed decision-making. Many PRCI mapping approaches have been developed and have achieved remarkable performance. However, large-scale and high-resolution PRCI mapping in southern China, especially in Hunan Province, remains challenging: optical data suffer from serious data loss owing to frequent cloud coverage, whereas radar data are usually affected by mountainous terrain. Multisource data fusion may mitigate data loss; however, many data fusion approaches cannot be used in large-scale applications because of their significant computational complexity. To achieve accurate PRCI mapping over large areas of southern China, we propose a novel large-scale downscaling method for multi-source data fusion and develop a new framework for PRCI mapping on Google Earth Engine. To test its performance, we mapped the 10 m PRCI of Hunan Province in 2020. The overall accuracy of the mapping was 90.70% with a kappa coefficient of 0.81. Furthermore, our estimated sown area was highly correlated with the statistical yearbook (R² = 0.84, RMSE = 11.47 kha). Our experimental results demonstrate the remarkable performance of the proposed framework for large-scale and high-resolution PRCI mapping in southern China.
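Phenology-based rice algorithms commonly flag the transplanting flood as dates when a water index exceeds a vegetation index and then count how many rice seasons occur in a year; the sketch below is a simplified, hypothetical illustration of that counting logic for one pixel, not the paper's downscaling framework.

```python
import numpy as np

def rice_cropping_intensity(ndvi, lswi, min_gap=6):
    """Count flooding/transplanting events in one year of index time series.

    ndvi, lswi: arrays of equal length (e.g. 8-day composites for one pixel).
    A flooding signal is flagged when LSWI >= NDVI; events separated by at
    least `min_gap` composites are counted as separate rice seasons (toy rule).
    """
    flooded = lswi >= ndvi
    events, last = 0, -min_gap
    for t in np.flatnonzero(flooded):
        if t - last >= min_gap:
            events += 1
        last = t
    return events

# Hypothetical double-season pixel: two flooding periods within the year.
t = np.arange(46)                                   # 46 eight-day composites
ndvi = 0.3 + 0.3 * np.sin(2 * np.pi * t / 23)       # two green-up cycles
lswi = ndvi + np.where(np.isin(t % 23, [0, 1, 2]), 0.2, -0.1)
print(rice_cropping_intensity(ndvi, lswi))          # 2
```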
- Published
- 2024
- Full Text
- View/download PDF
43. Experimental study on large-scale compression members strengthened with circumferential prestressed CFRP plate
- Author
-
Xiaoying Chen, Gang Yang, Jing Zhuo, Yonghui Zhang, Changrui Ren, Longsheng Qi, Hanchen Du, and Changming Bu
- Subjects
Circumferential tension ,Pre-stressed CFRP plate ,Large-scale ,Compression components ,The prestress loss ,Science (General) ,Q1-390 ,Social sciences (General) ,H1-99 - Abstract
There have been many research reports on strengthening small square columns with cross-sections of 200 mm–300 mm using prestressed carbon fiber-reinforced polymer (CFRP) materials, but few studies address bridge piers and the tower columns of cable-stayed bridges, whose cross-sections measure several meters or even tens of meters. Replacing the horizontal prestressed steel tendons in the anchorage zone of cable-stayed bridge tower columns with prestressed CFRP sheets not only facilitates construction and maintenance but also provides good fatigue resistance. In this approach, prestressed CFRP plates strengthen large tower columns by means of a dedicated device that tensions the CFRP plate wrapped around the surface of the member. A tensioning device and test pedestal based on WSGG (wave-shaped-gear-grip) anchor clamping of the CFRP plate were developed in this paper, and circumferential tensioning tests of the CFRP plate were conducted on the pedestal. The test results are as follows: (1) the developed device can circumferentially tension a single-layer CFRP plate to 0.5 f_tk of the material, reaching a tensile force of 60 kN, and generate effective confining pressure on a 2 m long composite compression component; (2) a calculation formula for the confining pressure generated by the circumferential prestressed CFRP sheet on the component was derived and verified, with a maximum error between calculated and experimental values within 5%; (3) when iron sheet serves as the interface medium between the CFRP plate and the compression component, the prestress loss of the CFRP plate is about 84% when tensioned at one end and 58%–60% when tensioned at both ends, so the effective prestress of the CFRP plate with an iron-sheet interface is relatively small; moreover, based on the distribution of compressive stress in the components and the effective pretension of the CFRP plate, tensioning at both ends is better than tensioning at one end; (4) the tensile stress of the CFRP plate along the member follows a cubic function when the tension force is 60 kN, from which it is deduced that the confining compressive stress generated by the CFRP plate on the member follows a quadratic distribution.
- Published
- 2024
- Full Text
- View/download PDF
44. An Integrated Framework for Geothermal Energy Storage with CO2 Sequestration and Utilization
- Author
-
Yueliang Liu, Ting Hu, Zhenhua Rui, Zheng Zhang, Kai Du, Tao Yang, Birol Dindoruk, Erling Halfdan Stenby, Farshid Torabi, and Andrey Afanasyev
- Subjects
Geothermal energy storage ,CO2 sequestration ,Carbon neutrality ,Large-scale ,CO2 utilization ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
Subsurface geothermal energy storage has greater potential than other energy storage strategies in terms of capacity scale and time duration. Carbon dioxide (CO2) is regarded as a potential medium for energy storage due to its superior thermal properties. Moreover, the use of CO2 plumes for geothermal energy storage mitigates the greenhouse effect by storing CO2 in geological bodies. In this work, an integrated framework is proposed for synergistic geothermal energy storage and CO2 sequestration and utilization. Within this framework, CO2 is first injected into geothermal layers for energy accumulation. The resultant high-energy CO2 is then introduced into a target oil reservoir for CO2 utilization and geothermal energy storage. As a result, CO2 is sequestered in the geological oil reservoir body. The results show that, as high-energy CO2 is injected, the average temperature of the whole target reservoir is greatly increased. With the assistance of geothermal energy, the geological utilization efficiency of CO2 is higher, resulting in a 10.1% increase in oil displacement efficiency. According to a storage-potential assessment of the simulated CO2 site, 110 years after the CO2 injection, the utilization efficiency of the geological body will be as high as 91.2%, and the final injection quantity of CO2 in the site will be as high as 9.529 × 10⁸ t. After 1000 years of sequestration, the supercritical phase dominates in CO2 sequestration, followed by the liquid phase and then the mineralized phase. In addition, CO2 sequestration accounted for by dissolution trapping increases significantly due to the presence of residual oil. More importantly, CO2 exhibits excellent performance in storing geothermal energy on a large scale; for example, the total energy stored in the studied geological body can provide the yearly energy supply for over 3.5 × 10⁷ normal households. Application of this integrated approach holds great significance for large-scale geothermal energy storage and the achievement of carbon neutrality.
- Published
- 2023
- Full Text
- View/download PDF
45. In vivo hippocampal subfield volumes in bipolar disorder—A mega‐analysis from The Enhancing Neuro Imaging Genetics through Meta‐Analysis Bipolar Disorder Working Group
- Author
-
Haukvik, Unn K, Gurholt, Tiril P, Nerland, Stener, Elvsåshagen, Torbjørn, Akudjedu, Theophilus N, Alda, Martin, Alnæs, Dag, Alonso‐Lana, Silvia, Bauer, Jochen, Baune, Bernhard T, Benedetti, Francesco, Berk, Michael, Bettella, Francesco, Bøen, Erlend, Bonnín, Caterina M, Brambilla, Paolo, Canales‐Rodríguez, Erick J, Cannon, Dara M, Caseras, Xavier, Dandash, Orwa, Dannlowski, Udo, Delvecchio, Giuseppe, Díaz‐Zuluaga, Ana M, Erp, Theo GM, Fatjó‐Vilas, Mar, Foley, Sonya F, Förster, Katharina, Fullerton, Janice M, Goikolea, José M, Grotegerd, Dominik, Gruber, Oliver, Haarman, Bartholomeus CM, Haatveit, Beathe, Hajek, Tomas, Hallahan, Brian, Harris, Mathew, Hawkins, Emma L, Howells, Fleur M, Hülsmann, Carina, Jahanshad, Neda, Jørgensen, Kjetil N, Kircher, Tilo, Krämer, Bernd, Krug, Axel, Kuplicki, Rayus, Lagerberg, Trine V, Lancaster, Thomas M, Lenroot, Rhoshel K, Lonning, Vera, López‐Jaramillo, Carlos, Malt, Ulrik F, McDonald, Colm, McIntosh, Andrew M, McPhilemy, Genevieve, Meer, Dennis, Melle, Ingrid, Melloni, Elisa MT, Mitchell, Philip B, Nabulsi, Leila, Nenadić, Igor, Oertel, Viola, Oldani, Lucio, Opel, Nils, Otaduy, Maria CG, Overs, Bronwyn J, Pineda‐Zapata, Julian A, Pomarol‐Clotet, Edith, Radua, Joaquim, Rauer, Lisa, Redlich, Ronny, Repple, Jonathan, Rive, Maria M, Roberts, Gloria, Ruhe, Henricus G, Salminen, Lauren E, Salvador, Raymond, Sarró, Salvador, Savitz, Jonathan, Schene, Aart H, Sim, Kang, Soeiro‐de‐Souza, Marcio G, Stäblein, Michael, Stein, Dan J, Stein, Frederike, Tamnes, Christian K, Temmingh, Henk S, Thomopoulos, Sophia I, Veltman, Dick J, Vieta, Eduard, Waltemate, Lena, Westlye, Lars T, Whalley, Heather C, Sämann, Philipp G, Thompson, Paul M, Ching, Christopher RK, Andreassen, Ole A, Agartz, Ingrid, and Group, ENIGMA Bipolar Disorder Working
- Subjects
Biological Psychology ,Pharmacology and Pharmaceutical Sciences ,Biomedical and Clinical Sciences ,Psychology ,Brain Disorders ,Mental Health ,Serious Mental Illness ,Neurosciences ,Biomedical Imaging ,Bipolar Disorder ,Mental health ,Genetics ,Hippocampus ,Humans ,Magnetic Resonance Imaging ,Neuroimaging ,ENIGMA Bipolar Disorder Working Group ,bipolar disorder subtype ,hippocampus ,large-scale ,lithium ,psychosis ,structural brain MRI ,Cognitive Sciences ,Experimental Psychology ,Biological psychology ,Cognitive and computational psychology - Abstract
The hippocampus consists of anatomically and functionally distinct subfields that may be differentially involved in the pathophysiology of bipolar disorder (BD). Here we, the Enhancing NeuroImaging Genetics through Meta-Analysis Bipolar Disorder working group, study hippocampal subfield volumetry in BD. T1-weighted magnetic resonance imaging scans from 4,698 individuals (BD = 1,472, healthy controls [HC] = 3,226) from 23 sites worldwide were processed with FreeSurfer. We used linear mixed-effects models and mega-analysis to investigate differences in hippocampal subfield volumes between BD and HC, followed by analyses of clinical characteristics and medication use. BD showed significantly smaller volumes of the whole hippocampus (Cohen's d = -0.20), cornu ammonis (CA)1 (d = -0.18), CA2/3 (d = -0.11), CA4 (d = -0.19), molecular layer (d = -0.21), granule cell layer of dentate gyrus (d = -0.21), hippocampal tail (d = -0.10), subiculum (d = -0.15), presubiculum (d = -0.18), and hippocampal amygdala transition area (d = -0.17) compared to HC. Lithium users did not show volume differences compared to HC, while non-users did. Antipsychotic or antiepileptic use was associated with smaller volumes. In this largest study of hippocampal subfields in BD to date, we show widespread reductions in nine of 12 subfields studied. The associations were modulated by medication use, and specifically the lack of differences between lithium users and HC supports a possible protective role of lithium in BD.
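As a loose sketch of the kind of linear mixed-effects model such a mega-analysis fits (one subfield volume regressed on diagnosis and covariates, with a random intercept per site), here is a statsmodels example; the file and column names are hypothetical and not the working group's actual pipeline.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format table: one row per participant, one column per
# subfield volume; column names are assumptions for illustration only.
df = pd.read_csv("hippocampal_subfields.csv")
# expected columns: ca1_volume, diagnosis (BD/HC), age, sex, icv, site

model = smf.mixedlm(
    "ca1_volume ~ diagnosis + age + sex + icv",  # fixed effects
    data=df,
    groups=df["site"],                           # random intercept per scanning site
)
result = model.fit()
print(result.summary())                          # diagnosis coefficient ~ case-control difference
```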
- Published
- 2022
46. Striving for Freedom in a Large-Scale Agile Environment with an Entrepreneurial Mindset of a Product Owner
- Author
-
Niva, Piret, Paasivaara, Maria, Hyrynsalmi, Sami, van der Aalst, Wil, Series Editor, Ram, Sudha, Series Editor, Rosemann, Michael, Series Editor, Szyperski, Clemens, Series Editor, Guizzardi, Giancarlo, Series Editor, Stettina, Christoph J., editor, Garbajosa, Juan, editor, and Kruchten, Philippe, editor
- Published
- 2023
- Full Text
- View/download PDF
47. Decision variable contribution based adaptive mechanism for evolutionary multi-objective cloud workflow scheduling
- Author
-
Jun Li, Lining Xing, Wen Zhong, Zhaoquan Cai, and Feng Hou
- Subjects
Cloud computing ,Workflow scheduling ,Multi-objective ,Evolutionary optimization ,Large-scale ,Electronic computers. Computer science ,QA75.5-76.95 ,Information technology ,T58.5-58.64 - Abstract
Abstract Workflow scheduling is vital for cloud platforms to simultaneously minimize execution cost and makespan, since data dependencies among large-scale workflow tasks mean that the cloud workflow scheduling problem involves a large number of interacting decision variables. The cooperative coevolution approach has shown competitive performance in resolving large-scale problems by transforming the original problem into a series of small-scale subproblems. However, static transformation mechanisms cannot separate interacting decision variables, whereas random transformation mechanisms suffer from low efficiency. To tackle these issues, this paper suggests a decision-variable-contribution-based adaptive evolutionary cloud workflow scheduling approach (VCAES for short). Specifically, the VCAES includes a new estimation method to quantify the contribution of each decision variable to population advancement in terms of both convergence and diversity, and it dynamically classifies the decision variables according to their contributions during previous iterations. Moreover, the VCAES includes a mechanism to adaptively allocate evolution opportunities to each constructed group of decision variables, so that decision variables with a strong impact on population advancement receive more evolution opportunities and accelerate the population's approximation of the Pareto-optimal front. To verify the effectiveness of the proposed VCAES, we carry out extensive numerical experiments on real-world workflows and cloud platforms to compare it with four representative algorithms. The numerical results demonstrate the superiority of the VCAES in resolving cloud workflow scheduling problems.
- Published
- 2023
- Full Text
- View/download PDF
48. Development of a large-scale multi-extrusion FDM printer, and its challenges
- Author
-
Md. Hazrat Ali, Syuhei Kurokawa, Essam Shehab, and Muslim Mukhtarkhanov
- Subjects
Large-scale ,3D printing ,Printing parameters ,Adhesion ,Temperature control ,Technology - Abstract
This study focuses on the development of a large-dimension 3D printer and its general challenges. Most fused deposition modeling (FDM) printers focus on printing small-scale parts because of the difficulty of printing large-scale objects with thermoplastic polymer filaments. A novel large-dimension multi-extrusion FDM printer was developed in our workshop and used to print several large objects, highlighting its prospects for developing large-scale products. The printer has a print bed of 900 mm × 1100 mm × 770 mm (length × width × height). There are many challenges to successfully printing large objects using FDM technology, and the experimental design elaborates on the challenges experienced while printing various large objects. In addition, the paper presents a qualitative analysis of the optimal process parameters in Section 4.5. Based on the experimental results, the key challenges are uneven bed temperature, bending of the print bed due to thermal effects, parts detaching from the bed surface due to insufficient adhesion, ambient temperature, and irregular filament feed. The experimental results also validate the key design specifications and their impact on enhancing large-scale 3D printing. The developed printer can print large-scale objects with five different thermoplastic materials using five individual extruders simultaneously, adding a new dimension of flexible automation to additive manufacturing (AM).
- Published
- 2023
- Full Text
- View/download PDF
49. Cell type-specific connectome predicts distributed working memory activity in the mouse brain
- Author
-
Xingyu Ding, Sean Froudist-Walsh, Jorge Jaramillo, Junjie Jiang, and Xiao-Jing Wang
- Subjects
working memory ,connectome ,interneurons ,large-scale ,computational model ,cell-types ,Medicine ,Science ,Biology (General) ,QH301-705.5 - Abstract
Recent advances in connectomics and neurophysiology make it possible to probe whole-brain mechanisms of cognition and behavior. We developed a large-scale model of the multiregional mouse brain for a cardinal cognitive function called working memory, the brain’s ability to internally hold and process information without sensory input. The model is built on mesoscopic connectome data for interareal cortical connections and endowed with a macroscopic gradient of measured parvalbumin-expressing interneuron density. We found that working memory coding is distributed yet exhibits modularity; the spatial pattern of mnemonic representation is determined by long-range cell type-specific targeting and density of cell classes. Cell type-specific graph measures predict the activity patterns and a core subnetwork for memory maintenance. The model shows numerous attractor states, which are self-sustained internal states (each engaging a distinct subset of areas). This work provides a framework to interpret large-scale recordings of brain activity during cognition, while highlighting the need for cell type-specific connectomics.
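A minimal, hypothetical illustration of the modeling style described above is a firing-rate network of several "areas" with a gradient of local recurrent strength and weak long-range coupling: after a brief input, only areas with strong enough recurrence sustain elevated activity. This toy is far simpler than the published multiregional model and all parameters are assumptions.

```python
import numpy as np

n_areas, dt, tau = 8, 1.0, 20.0                    # time step and rate time constant in ms
g_local = np.linspace(0.2, 1.2, n_areas)           # gradient of local recurrent strength
W_long = 0.1 / n_areas * (np.ones((n_areas, n_areas)) - np.eye(n_areas))  # weak coupling

def f(x):                                          # sigmoidal rate function
    return 1.0 / (1.0 + np.exp(-(x - 0.5) / 0.1))

r = np.zeros(n_areas)
for step in range(3000):                           # 3 s simulation
    t = step * dt
    stim = 1.0 if 200 <= t < 500 else 0.0          # brief distributed input, then silence
    drive = g_local * r + W_long @ r + stim
    r = r + dt / tau * (-r + f(drive))

# Areas whose recurrence is strong enough keep firing long after the stimulus ends.
print("persistent-activity areas:", np.flatnonzero(r > 0.5))
print("final rates:", np.round(r, 2))
```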
- Published
- 2024
- Full Text
- View/download PDF
50. CDSKNNXMBD: a novel clustering framework for large-scale single-cell data based on a stable graph structure
- Author
-
Ren, Jun, Lyu, Xuejing, Guo, Jintao, Shi, Xiaodong, Zhou, Ying, and Li, Qiyuan
- Published
- 2024
- Full Text
- View/download PDF