1,037 results for "sampling strategy"
Search Results
2. Spatiotemporal heterogeneity of lake carbon dioxide flux leads to substantial uncertainties in regional upscaling estimates
- Author
-
Qi, Tianci, Shen, Ming, Luo, Juhua, Xiao, Qitao, Liu, Dong, and Duan, Hongtao
- Published
- 2024
- Full Text
- View/download PDF
3. Exploring the influence of training sampling strategies on time-series deep learning model in hydrology
- Author
-
Yoon, Sunghyun and Ahn, Kuk-Hyun
- Published
- 2025
- Full Text
- View/download PDF
4. Rethinking Sampling for Music-Driven Long-Term Dance Generation
- Author
-
Truong-Thuy, Tuong-Vy, Bui-Le, Gia-Cat, Nguyen, Hai-Dang, Le, Trung-Nghia, Cho, Minsu, editor, Laptev, Ivan, editor, Tran, Du, editor, Yao, Angela, editor, and Zha, Hongbin, editor
- Published
- 2025
- Full Text
- View/download PDF
5. Research on enhancing the efficiency of food safety sampling inspections in China based on Pareto's law.
- Author
-
Li, Taiping, Luo, Yun, and Zhao, Tong
- Subjects
FOOD inspection, INSPECTION & review, FOOD quality, FOOD safety laws, FOOD testing - Abstract
BACKGROUND: Food safety is pivotal for public welfare and directly impacts consumer health. Food safety sampling inspections (FSSIs) are essential in detecting unqualified food products and non-compliant manufacturers, which form an integral part of government regulatory frameworks. However, given the constraints on budgetary resources, improving the efficiency of food safety sampling inspections (EFSSIs) remains a considerable challenge in China's food quality and safety supervision. This study aims to apply Pareto's law, starting from the examination of food sample testing items and major hazard types, to theoretically analyze methods for improving the EFSSIs. Following the theoretical analysis, the research employs provincial food sampling data from China in 2022 to empirically validate the proposed improvement strategies. RESULTS: The research findings indicate that applying Pareto's law significantly reduces the number of items that should be tested for each food subcategory, effectively lowering testing costs for each batch of food samples. Theoretically, employing Pareto's law in sampling inspections can increase the EFSSIs to 2.78 times the current observed level. Furthermore, empirical validation using food sampling data confirms that EFSSIs can be improved to 2.12 times the existing level, consistent with theoretical predictions. CONCLUSION: Implementing Pareto's law in FSSIs facilitates the detection of more unqualified food products and non-compliant manufacturers without additional financial burden, significantly enhancing the EFSSIs. This approach provides an innovative strategy for governments to bolster their food safety management efforts. © 2024 Society of Chemical Industry. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
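The Pareto-based item selection described above amounts to a greedy cutoff: rank test items by historical failure counts and keep only the vital few that account for a chosen share of all failures. The sketch below is an illustrative reconstruction with hypothetical item names and counts, not the paper's actual procedure or data.

```python
def pareto_test_items(failure_counts, coverage=0.8):
    """Greedy Pareto cutoff: keep the fewest test items whose historical
    failure counts reach `coverage` of all observed failures."""
    total = sum(failure_counts.values())
    selected, covered = [], 0
    for item, count in sorted(failure_counts.items(), key=lambda kv: -kv[1]):
        selected.append(item)
        covered += count
        if covered >= coverage * total:
            break
    return selected

# Hypothetical failure history for one food subcategory.
history = {"pesticide": 50, "heavy_metal": 25, "additive": 15, "microbial": 7, "label": 3}
print(pareto_test_items(history))  # ['pesticide', 'heavy_metal', 'additive']
```

Only three of the five items need testing to cover 80% of historical failures, which is the mechanism behind the reported efficiency gain.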
6. Petrous bones versus tooth cementum for genetic analysis of aged skeletal remains.
- Author
-
Zupanič Pajnič, Irena, Jeromelj, Tonja, and Leskovar, Tamara
- Subjects
CEMENTUM, BIOLOGICAL specimens, TOOTH roots, MEDICAL sciences, ANTHROPOMETRY - Abstract
A proper sampling strategy is important to obtain sufficient DNA for successful identification of aged skeletal remains. The petrous bone is the highest DNA-yielding bone in the human body. Because DNA extraction from the petrous bone is very destructive, the demand for other DNA sources is significant. When investigating aged skeletal remains, teeth are usually preserved, and recent studies have shown that DNA in teeth can be best preserved in the dental cementum that surrounds the surface of the tooth root. To extract DNA from the surface of the tooth root, a nondestructive method without grinding was used. Petrous bones and teeth from 60 archaeological adult skeletons were analyzed. The DNA yield, degree of DNA degradation, and STR typing success were compared, and the results showed higher DNA yield and higher amplification success in petrous bones, despite higher degradation of petrous bones' DNA. The greater success of petrous bones is associated with poorly preserved DNA in a quarter of the teeth analyzed. When teeth with badly preserved DNA were excluded from the statistical analysis, no differences in the success of STR loci amplification were observed even if DNA yield was higher in petrous bones, which can be explained by greater degradation of petrous bones' DNA. When teeth are well preserved, they can be used for genetically analyzing aged skeletal remains instead of petrous bones, and a rapid nondestructive extraction method can be applied to shorten the identification process and to physically preserve the biological specimen. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
7. CCIR: high fidelity face super-resolution with controllable conditions in diffusion models.
- Author
-
Chen, Yaxin, Du, Huiqian, and Xie, Min
- Abstract
Diffusion probabilistic models have demonstrated great potential in producing realistic-looking super-resolution (SR) images. However, realism doesn't necessarily guarantee that the SR images are faithful to the ground-truth high-resolution images. This paper develops a novel training-free framework, namely Iterative Refinement with Controllable Condition (CCIR), for face SR based on controllable prior conditions in diffusion models. The goal is to generate SR images that are both realistic and faithful to the ground truth by controlling the prior conditions. Our framework consists of a pre-trained SR network, a Local Implicit Image Function (LIIF), and a pre-trained diffusion model. The LIIF enhances the conditions provided by low-resolution images, while the diffusion model recovers fine details in the SR images. Notably, for the diffusion model, we propose a non-uniform low-pass filtering sampling strategy that dynamically adds controllable conditions to latent features during the sampling process. This strategy provides a flexible balance between fidelity and realism in SR images, enabling the restoration of highly similar SR images from the same low-resolution input with different noise samples. Extensive experiments on facial SR benchmarks demonstrate that CCIR outperforms state-of-the-art SISR methods in qualitative and quantitative assessments, particularly when magnifying very-low-resolution images or at high magnification factors. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. MSSA: Multi-Representation Semantics-Augmented Set Abstraction for 3D Object Detection.
- Author
-
Liu, Huaijin, Du, Jixiang, Zhang, Yong, Zhang, Hongbo, and Zeng, Jiandian
- Subjects
OBJECT recognition (Computer vision), POINT cloud, COMPUTER vision, POINT processes, SPINE - Abstract
Accurate recognition and localization of 3D objects is a fundamental research problem in 3D computer vision. Benefiting from transformation-free point cloud processing and flexible receptive fields, point-based methods have become accurate in 3D point cloud modeling, but still fall behind voxel-based competitors in 3D detection. We observe that the set abstraction module, commonly utilized by point-based methods for downsampling points, tends to retain excessive irrelevant background information, thus hindering the effective learning of features for object detection tasks. To address this issue, we propose MSSA, a Multi-representation Semantics-augmented Set Abstraction for 3D object detection. Specifically, we first design a backbone network to encode different representation features of point clouds, which extracts point-wise features through PointNet to preserve fine-grained geometric structure features, and adopts VoxelNet to extract voxel features and BEV features to enhance the semantic features of key points. Second, to efficiently fuse different representation features of keypoints, we propose a Point feature-guided Voxel feature and BEV feature fusion (PVB-Fusion) module to adaptively fuse multi-representation features and remove noise. At last, a novel Multi-representation Semantic-guided Farthest Point Sampling (MS-FPS) algorithm is designed to help set abstraction modules progressively downsample point clouds, thereby improving instance recall and detection performance with more important foreground points. We evaluate MSSA on the widely used KITTI dataset and the more challenging nuScenes dataset. Experimental results show that compared to PointRCNN, our method improves the AP of "moderate" level for three classes of objects by 7.02%, 6.76%, and 5.44%, respectively. Compared to the advanced point-voxel-based method PV-RCNN, our method improves the AP of "moderate" level by 1.23%, 2.84%, and 0.55% for the three classes, respectively. 
[ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
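The set-abstraction downsampling that MSSA's MS-FPS builds on is plain farthest point sampling; a minimal version is sketched below. The semantic-guided reweighting by foreground scores is the paper's contribution and is not reproduced here.

```python
import numpy as np

def farthest_point_sampling(points, k, start=0):
    """Plain FPS: repeatedly add the point farthest from the chosen set."""
    chosen = [start]
    dist = np.linalg.norm(points - points[start], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))       # farthest remaining point
        chosen.append(nxt)
        # Each point keeps its distance to the *nearest* chosen point.
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return np.array(chosen)

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [0.0, 5.0]])
print(farthest_point_sampling(pts, 3))  # [0 2 3]
```

FPS favours spatial spread, which is exactly why, without semantic guidance, it retains many background points; MS-FPS biases this choice toward likely foreground.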
9. Exploring Sampling Strategies and Genetic Diversity Analysis of Red Beet Germplasm Resources Using SSR Markers.
- Author
-
Wu, Xiangjia, Pi, Zhi, Li, Shengnan, and Wu, Zedong
- Subjects
BEETS, GERMPLASM, GENETIC variation, GENETIC distance, PARAMETERS (Statistics) - Abstract
By using 14 SSR primer pairs, we analyzed and compared the amplification results of 534 DNA samples from six red sugar beet germplasm resources under three treatments. These data were used to explore the sampling strategy for the aforementioned resources. Based on the sampling strategy results, 21 SSR primer pairs were used to evaluate the genetic diversity of 47 red sugar beet germplasm resources. The six population genetic parameters used for exploring the sampling strategy unveiled that individual plants within the population had a relatively large genetic distance. The genetic parameters Ne, I, and Nei's of the randomly mixed sampling samples increased rapidly between 10 and 30 plants before decreasing. Therefore, when SSR technology was used to analyze the genetic diversity of the red sugar beet germplasm resources, the optimal sampling gradient for each population was the adoption of a random single-plant mixed sampling sample of no less than 10 plants and no more than 30 plants. The 21 SSR primer pairs were used to detect genetic diversity in 30 random mixed samples of 47 resources. The average polymorphic information content (PIC) was 0.5738, the average number of observed alleles (Na) was 4.1905, the average number of effective alleles (Ne) was 2.8962, the average Shannon's information index (I) was 1.1299, the average Nei's gene diversity (Nei's) was 0.6127, and the average expected heterozygosity (He) was 0.6127. The genetic distance of the 47 germplasm resources ranged from 0.0225 to 0.551 (average: 0.316). According to the population structure analysis, the most suitable K value was six, which indicated the presence of six populations. Based on the clustering analysis results, the 47 germplasm resources were segregated into six groups, with clear clustering; some germplasm resources formed groups with close genetic relationships.
We here established a more accurate and scientific sampling strategy for analyzing the genetic diversity of red sugar beet germplasm resources by using SSR molecular markers. The findings provide a reference for collecting and preserving red sugar beet germplasms and protecting their genetic diversity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
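Of the genetic parameters listed above, Shannon's information index (I) has the simplest closed form; a per-locus sketch from allele counts (the counts here are made up, not the paper's data):

```python
import math

def shannon_index(allele_counts):
    """Shannon's information index for one locus: I = -sum(p_i * ln(p_i))."""
    total = sum(allele_counts)
    return -sum((c / total) * math.log(c / total) for c in allele_counts if c)

# Two equally frequent alleles give the two-allele maximum, ln(2).
print(round(shannon_index([30, 30]), 4))  # 0.6931
```

Averaging this index over loci and over sampling gradients is how curves like the 10-to-30-plant plateau described above are traced.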
10. Clothing Compatibility Prediction Based on Heterogeneous Graph Neural Networks (基于异质图神经网络的服装兼容性预测).
- Author
-
鲁鸣鸣, 郭清明, 张亚, and 易贤康
- Published
- 2024
- Full Text
- View/download PDF
11. Unsupervised machine learning with different sampling strategies and topographic factors for distinguishing between landslide source and runout areas to improve landslide inventory production
- Author
-
Jhe-Syuan Lai, Jun-Yi Huang, Hong-Mao Huang, and Yung-Chung Chuang
- Subjects
Clustering, landslide inventory, machine learning, sampling strategy, topographic factors, Environmental technology. Sanitary engineering, TD1-1066, Environmental sciences, GE1-350, Risk in industry. Risk management, HD61 - Abstract
This study derived 12 topographical and hydrological factors related to landslides from a 10-m digital elevation model. Three unsupervised machine learning algorithms were employed to distinguish between the features of landslide sources and runout areas for Typhoon Morakot. Two sampling strategies were designed in this study. The first strategy involved creating a data set by pairing landslide sources and runout areas on the basis of the size of polygonal areas recorded in the inventory and then exploring the effectiveness of feature separation with k-means++ as the primary method and EM (expectation maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) as supplementary methods. Because of practical challenges associated with the inability to determine whether remote sensing results correspond to landslide sources or runout areas, the first sampling strategy was used to test the feasibility of the adopted algorithms. In the second sampling strategy, the effectiveness of feature separation was tested by dividing polygon samples into several intervals on the basis of their sizes, thereby verifying the practicability of the adopted algorithms in real-world operations. This study verified that combining unsupervised machine learning algorithms with topographic factors facilitates the effective separation of landslide sources and runout areas, thereby enhancing the quality of landslide inventories and reducing the cost of landslide inventory creation. Moreover, it indicated that meter-scale topographic data yield classification results similar to those obtained from manually created landslide inventories based on centimeter-scale stereo aerial photos. The proposed method effectively balances the cost and quality of landslide inventories.
- Published
- 2024
- Full Text
- View/download PDF
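A minimal version of the study's first step, clustering terrain-factor vectors with k-means (k = 2) to separate two feature populations without labels, might look like this. The factor values are synthetic stand-ins, and the real study used k-means++ seeding with EM and DBSCAN as supplementary checks.

```python
import numpy as np

def kmeans_k2(X, iters=20, seed=0):
    """Bare-bones k-means with k = 2 on feature vectors X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=2, replace=False)]
    for _ in range(iters):
        # Assign each vector to its nearest center, then recompute centers.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == c].mean(axis=0) for c in (0, 1)])
    return labels

# Hypothetical (slope, curvature) vectors: steep source cells vs. gentle runout cells.
X = np.array([[35.0, 0.8], [38.0, 0.9], [36.0, 0.7],
              [8.0, 0.1], [6.0, 0.2], [7.0, 0.1]])
labels = kmeans_k2(X)
```

With well-separated factor distributions the two clusters recover the source/runout split, which is the feasibility the first sampling strategy was designed to test.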
12. Corner error reduction by Chebyshev transformed orthogonal grid
- Author
-
Zhang, Zebin, Jing, Shizhao, Li, Yaohui, and Meng, Xianzong
- Published
- 2024
- Full Text
- View/download PDF
13. Avoid non‐probability sampling to select population monitoring sites: Comment on McClure and Rolek (2023)
- Author
-
Jan Perret, Fabien Laroche, Guillaume Papuga, and Aurélien Besnard
- Subjects
biodiversity conservation, demography, occupancy, population dynamics, population monitoring, sampling strategy, Ecology, QH540-549.5, Evolution, QH359-425 - Abstract
Population monitoring programmes typically rely on sampling because it is impossible to survey all the sites within the study area. In such a situation, the general recommendation to obtain unbiased estimates of population trends is to select monitoring sites using probability sampling. However, site selection not based on probability sampling, such as selecting sites with the largest abundance of individuals at the beginning of the monitoring programme, is common in practice. Nevertheless, these methods carry the risk of obtaining biased trend estimates. Using simulations, McClure & Rolek (2023) investigated whether three non-probability sampling site selection methods can yield unbiased trend estimates under some specific conditions. For two of these methods, that is selecting high quality sites and selecting sites known to be occupied, the authors conclude that there is a major risk of obtaining biased trend estimates. For the third method, that is selecting sites with the largest initial abundance, they found conditions in which unbiased estimates can be obtained. They conclude that the general recommendation to use probability sampling should be revised. Here, we show that the authors' results, although perfectly correct, do not invalidate this recommendation. First, we point out that the authors made strong assumptions about the populations' functioning in their simulations, especially that inter-annual variance in abundance is similar for all sites, which is unlikely in most real populations. We show through simple simulations that even slightly relaxing this assumption invalidates the authors' results. We also point out that for most of the hypotheses made by the authors, it is generally not known at the beginning of a study whether they will be respected. Furthermore, the authors did not provide evidence that selecting sites based on high initial abundance leads to more precise trend estimates than probability sampling methods.
Therefore, neither the benefits nor the risks of this method are known. We conclude that until evidence is provided that abundance‐based site selection improves estimate precision and the situations in which it provides unbiased estimates are clearly identified, using probability sampling should remain the rule.
- Published
- 2024
- Full Text
- View/download PDF
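The core objection, that site-specific variability breaks the unbiasedness of abundance-based site selection, can be illustrated with a toy simulation (my own construction, not the authors' code): the true trend is zero, yet picking the initially most abundant sites produces a spurious decline through regression to the mean, while random site selection does not. All distributions and parameters below are invented for illustration.

```python
import numpy as np

def mean_estimated_trend(select_top, n_rep=300, n_sites=200, n_years=10, k=20, seed=1):
    """Average fitted trend across replicates; the true trend is always zero."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_years)
    slopes = []
    for _ in range(n_rep):
        mu = rng.gamma(2.0, 3.0, n_sites)                    # long-term site means
        noise = rng.lognormal(0.0, 0.6, (n_sites, n_years))  # site-by-year variability
        counts = rng.poisson(mu[:, None] * noise)
        if select_top:  # pick the k sites with the highest year-0 count
            sites = np.argsort(counts[:, 0])[-k:]
        else:           # probability (simple random) sampling of k sites
            sites = rng.choice(n_sites, k, replace=False)
        slopes.append(np.polyfit(t, counts[sites].mean(axis=0), 1)[0])
    return float(np.mean(slopes))

print(mean_estimated_trend(True))   # clearly negative: spurious decline
print(mean_estimated_trend(False))  # close to zero
```

Sites selected on a high year-0 count were partly selected on favourable noise, so their later counts regress downward even though every site's expected abundance is constant.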
14. A Comprehensive Comparison of Stable and Unstable Area Sampling Strategies in Large-Scale Landslide Susceptibility Models Using Machine Learning Methods.
- Author
-
Sinčić, Marko, Bernat Gazibara, Sanja, Rossi, Mauro, Krkač, Martin, and Mihalić Arbanas, Snježana
- Subjects
LANDSLIDE hazard analysis, MACHINE learning, DIGITAL elevation models, SUPPORT vector machines, RANDOM forest algorithms, LANDSLIDES - Abstract
This paper focuses on large-scale landslide susceptibility modelling in NW Croatia. The objective of this research was to provide new insight into stable and unstable area sampling strategies on a representative inventory of small and shallow landslides mainly occurring in soil and soft rock. Four strategies were tested for stable area sampling (random points, stable area polygon, stable polygon buffering and stable area centroid) in combination with four strategies for unstable area sampling (landslide polygon, smoothing digital terrain model derived landslide conditioning factors, polygon buffering and landslide centroid), resulting in eight sampling scenarios. Using Logistic Regression, Neural Network, Random Forest and Support Vector Machine algorithm, 32 models were derived and analysed. The main conclusions reveal that polygon sampling of unstable areas is an imperative in large-scale modelling, as well as that subjective and/or biased stable area sampling leads to misleading models. Moreover, Random Forest and Neural Network proved to be more favourable methods (0.804 and 0.805 AUC, respectively), but also showed extreme sensitivity to the tested sampling strategies. In the comprehensive comparison, the advantages and disadvantages of 32 derived models were analysed through quantitative and qualitative parameters to highlight their application to large-scale landslide zonation. The results yielded by this research are beneficial to the susceptibility modelling step in large-scale landslide susceptibility assessments as they enable the derivation of more reliable zonation maps applicable to spatial and urban planning systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. Genotyping of Autochthonous Rose Populations in the Netherlands for Effective Ex Situ Gene Conservation Management.
- Author
-
Buiteveld, Joukje, Smolka, Alisia, and Smulders, Marinus J. M.
- Subjects
SPECIES hybridization, GENETIC variation, MICROSATELLITE repeats, INDIGENOUS species, VEGETATIVE propagation, GENES - Abstract
Most wild rose species in the Netherlands belong to Rosa section Caninae (dogroses), with Rosa arvensis (section Synstylae) and Rosa spinosissima (section Pimpinellifoliae) as other indigenous species. All species are rare, often found in small populations or as scattered individuals, except for Rosa canina and Rosa corymbifera. Conservation strategies have been developed for these roses, with a focus on ex situ methods, including clonal archives and seed orchards, using vegetative propagation from the original shrubs. Efficient collection management aims at preservation of maximum genetic diversity with a minimum of duplicated genotypes. However, dogrose taxonomy is complex because of species hybridization, different ploidy levels, and their matroclinal inheritance due to Canina meiosis. They can also reproduce vegetatively through root suckers. In order to assess the genetic structure and the levels of genetic diversity and clonality within and among the wild rose populations in the Netherlands, we genotyped individuals in wild populations and accessions in the ex situ gene bank with 10 highly polymorphic microsatellite markers. The analysis revealed 337 distinct multilocus genotypes (MLGs) from 511 sampled individuals, with some MLGs shared across different species and sites. The genetic structure analysis showed distinct clusters separating non-dogrose species from the Caninae section. Geographic distribution of MLGs indicated both local and widespread occurrences. Redundancy analysis identified 152 distinct MLGs from 244 gene bank accessions, suggesting a 38% redundancy rate. Core collections were optimized to retain genetic diversity with minimal redundancy, selecting subsets of 20–40 individuals from different species groups. The study highlights the value of genetic characterization in guiding sampling strategies for dogroses. 
We propose a two-step approach that may be used to reveal clonality and redundancy and to optimize core collections of species that combine sexual and vegetative reproduction, to maximize genetic capture in ex situ gene banks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
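The redundancy figure quoted above (152 distinct MLGs among 244 gene bank accessions, about 38%) follows directly from counting distinct multilocus genotypes; a sketch with made-up marker profiles:

```python
def redundancy_rate(genotypes):
    """Fraction of accessions whose multilocus genotype duplicates one already seen."""
    return 1 - len({tuple(g) for g in genotypes}) / len(genotypes)

# Four hypothetical accessions, three distinct MLGs -> 25% redundancy.
mlgs = [(120, 134), (120, 134), (118, 130), (122, 136)]
print(redundancy_rate(mlgs))  # 0.25

# The paper's gene bank numbers: 1 - 152/244 rounds to the reported 38%.
print(round(1 - 152 / 244, 2))  # 0.38
```

In practice each tuple would hold the allele sizes at the 10 microsatellite loci, and duplicated MLGs flag clones or redundant accessions to drop from a core collection.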
17. Surrogate-based robust design optimization by using Chebyshev-transformed orthogonal grid.
- Author
-
Jing, Shizhao, Zhang, Zebin, and Meng, Xianzong
- Abstract
Surrogate-based aerodynamic robust design optimization uses a surrogate model to calculate the robustness indices, which strongly relies on the overall accuracy of the model, as well as the efficient exploration of the design space. For most surrogate modeling approaches, significant inaccuracies are often observed at the outlier region of the design space, where very few samples are spotted. A novel method using Chebyshev transformation is applied to re-allocate the orthogonal Latin hypercube sample set to alleviate the corner errors and eventually improve the overall accuracy. An inner Kriging model is developed using the sampling method, and robustness indices are calculated based on the subspaces adjacent to the sampling points. Subsequently, an outer robust model is constructed with the robustness indices as the target. Ultimately, a combination of the inner and outer models is utilized with the genetic algorithm to accomplish multi-objective robust optimization. Theoretical tests are undertaken for classic test functions, showing the advantage of the proposed approach. Based on this method, aerodynamic robust design optimizations are carried out on the RAE 2822 airfoil, for which the lift coefficient and drag coefficient are optimized for a given range of geometrical parameters. An increase of 1.94% in lift coefficient and a reduction of 2.53% in drag coefficient are achieved compared to the baseline design, without sacrificing the robust performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
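The corner-error idea rests on re-allocating uniformly spread samples through a cosine (Chebyshev-type) map so that points cluster near the domain boundary. The one-dimensional transform below is my reading of that mechanism, not the paper's exact formulation:

```python
import numpy as np

def chebyshev_map(u):
    """Map uniform samples u in [0, 1] to x = (1 - cos(pi * u)) / 2,
    which concentrates points near both ends of the interval."""
    return 0.5 * (1.0 - np.cos(np.pi * np.asarray(u, dtype=float)))

u = np.linspace(0.0, 1.0, 11)
x = chebyshev_map(u)
# Spacing near the boundary is much tighter than in the middle.
print(round(x[1] - x[0], 4), round(x[6] - x[5], 4))
```

Applied coordinate-wise to an orthogonal Latin hypercube, this pushes samples toward the corners of the design space, where surrogate models are otherwise starved of data.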
18. Web‐based application to guide sampling on salmon farms for more precise estimation of sea louse abundance.
- Author
-
Jeong, Jaewoon and Revie, Crawford W.
- Subjects
SALMON farming, WEB-based user interfaces, LEPEOPHTHEIRUS salmonis, WATCHFUL waiting, SAMPLE size (Statistics) - Abstract
Objective: Efficiently managing sea lice on salmon farms through active surveillance, crucial for lice abundance estimation, is challenging due to the need for effective sampling schemes. To address this, we developed an application that considers infestation levels, farm structure, and management protocols, enhancing the precision of sampling strategies for sea louse abundance estimation. Methods: Simulation-based methods are valuable for estimating suitable sample sizes in complex studies where standard formulae are inadequate. We introduce FishSampling, an open Web-based application tailored to determine precise sample sizes for specific scenarios and objectives. Result: The model incorporates factors such as sea lice abundance, farm pen numbers, potential clustering effects among these pens, and the desired confidence level. Simulation outcomes within this application provide practical advice on how to decide on the number of fish and pens to sample, under varying levels of assumed clustering. Conclusion: This approach can be used across the salmon aquaculture sector to improve sampling strategies for sea lice abundance estimation and balance surveillance costs against health objectives. Impact statement: The open-source application FishSampling enhances sea lice monitoring on salmon farms with a novel simulation-based approach for sample size determination. It offers precise estimates of sea lice abundance, crucial for regulatory purposes, and aids in the efficient allocation of sampling resources. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
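The simulation logic behind such a tool can be sketched as: draw pen-level means with a random effect to induce clustering, draw fish counts within pens, and report the spread of the resulting abundance estimates. All parameters below are hypothetical; FishSampling's actual model will differ.

```python
import numpy as np

def ci_halfwidth(n_pens, fish_per_pen, mean_lice=0.5, pen_sd=0.3, n_sim=500, seed=0):
    """Simulated 95% CI half-width for mean lice per fish under pen clustering."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_sim):
        pen_mu = mean_lice * rng.lognormal(0.0, pen_sd, n_pens)   # pen random effect
        counts = rng.poisson(pen_mu[:, None], (n_pens, fish_per_pen))
        estimates.append(counts.mean())
    return 1.96 * float(np.std(estimates))

# Same 200 fish in total: spreading them over more pens gives a tighter estimate
# because the pen-level random effect is averaged over more pens.
print(ci_halfwidth(10, 20), ci_halfwidth(4, 50))
```

Running such a sweep over candidate (pens, fish-per-pen) combinations is exactly the kind of advice the application automates.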
19. Investigation of Kriging-based SAEAs' metamodel samples for computationally expensive optimization problems.
- Author
-
Valadão, Mônica, Maravilha, André, and Batista, Lucas
- Abstract
Surrogate model assisted evolutionary algorithms (SAEAs) are strategies widely applied to deal with computationally expensive optimization problems (CEOPs). These methods employ metamodels to drive an evolutionary algorithm (EA) to promising design regions where new evaluations on the true-objective function must be performed. To do this, SAEAs are required to handle the challenge of training a metamodel to improve its predictions. The reliability of a metamodel is strongly related to the samples used for its training. Despite this, several SAEAs are proposed without concern for the sampling strategy employed. The ideal situation is to obtain a sample not far away from the solutions predicted on the metamodel. In this sense, the contribution/novelty of this paper is an investigative study comparing five strategies for defining the metamodel sample in a proposed SAEA Framework (SAEA/F). The SAEA/F uses a one-dimensional Ordinary Kriging (OK) metamodel, and an expected improvement (EI) merit function is applied to define on which solutions to spend the budget of true-function evaluations. In this investigation, each strategy is incorporated into SAEA/F and then used to solve a set of analytical functions of single-objective optimization problems. The computational results suggest that two of the five sampling strategies stand out as the best. The first strategy chooses those solutions with the lowest distance to the centroid of the population of solutions, and the second selects the newest solutions evaluated on the true-objective function. The results highlight the potential of these approaches for solving expensive optimization problems since they speed up the algorithm's convergence to improved solutions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
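The first winning strategy, training the Kriging metamodel on archive points nearest the population centroid, reduces to a distance sort; a schematic version (function and variable names are my own, not from the paper):

```python
import numpy as np

def centroid_nearest_sample(archive_X, archive_y, population_X, n_train):
    """Select the n_train evaluated points closest to the population centroid."""
    centroid = population_X.mean(axis=0)
    order = np.argsort(np.linalg.norm(archive_X - centroid, axis=1))
    idx = order[:n_train]
    return archive_X[idx], archive_y[idx]

# Archive of truly evaluated designs and their objective values.
archive_X = np.array([[0.0, 0.0], [1.0, 1.0], [4.0, 4.0], [5.0, 5.0]])
archive_y = np.array([0.0, 2.0, 32.0, 50.0])
pop_X = np.array([[0.5, 0.5], [1.5, 1.5]])  # current EA population, centroid (1, 1)
X_tr, y_tr = centroid_nearest_sample(archive_X, archive_y, pop_X, 2)
print(y_tr)  # objective values of the two archive points nearest (1, 1)
```

Keeping the training sample local to where the EA currently searches is what makes the resulting surrogate reliable exactly where its predictions are consumed.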
20. Improving global soil moisture prediction through cluster-averaged sampling strategy
- Author
-
Qingliang Li, Qiyun Xiao, Cheng Zhang, Jinlong Zhu, Xiao Chen, Yuguang Yan, Pingping Liu, Wei Shangguan, Zhongwang Wei, Lu Li, Wenzong Dong, and Yongjiu Dai
- Subjects
Soil moisture, Sampling strategy, K-means, Deep learning, Science - Abstract
Understanding and predicting global soil moisture (SM) is crucial for water resource management and agricultural production. While deep learning methods (DL) have shown strong performance in SM prediction, imbalances in training samples with different characteristics pose a significant challenge. We propose that improving the diversity and balance of batch training samples during gradient descent can help address this issue. To test this hypothesis, we developed a Cluster-Averaged Sampling (CAS) strategy utilizing unsupervised learning techniques. This approach involves training the model with evenly sampled data from different clusters, ensuring both sample diversity and numerical consistency within each cluster. This approach prevents the model from overemphasizing specific sample characteristics, leading to more balanced feature learning. Experiments using the LandBench1.0 dataset with five different seeds for 1-day lead-time global predictions reveal that CAS outperforms several Long Short-Term Memory (LSTM)-based models that do not employ this strategy. The median Coefficient of Determination (R2) improved by 2.36 % to 4.31 %, while Kling-Gupta Efficiency (KGE) improved by 1.95 % to 3.16 %. In high-latitude areas, R2 improvements exceeded 40 % in specific regions. To further validate CAS under realistic conditions, we tested it using the Soil Moisture Active and Passive Level 3 (SMAP-L3) satellite data for 1 to 3-day lead-time global predictions, confirming its efficacy. The study substantiates the CAS strategy and introduces a novel training method for enhancing the generalization of DL models.
- Published
- 2024
- Full Text
- View/download PDF
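The CAS idea, building each mini-batch from an equal number of samples per cluster so that no sample characteristic dominates a gradient step, can be sketched as a batch generator. The cluster labels here are synthetic, whereas the paper obtains them with unsupervised clustering such as k-means.

```python
import numpy as np

def cluster_balanced_batches(labels, batch_size, seed=0):
    """Yield index batches drawing batch_size // n_clusters samples per cluster."""
    rng = np.random.default_rng(seed)
    pools = [list(rng.permutation(np.flatnonzero(labels == c)))
             for c in np.unique(labels)]
    per = batch_size // len(pools)
    # Stop when any cluster can no longer contribute its share.
    while all(len(p) >= per for p in pools):
        yield np.concatenate([[p.pop() for _ in range(per)] for p in pools])

labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # two sample clusters
for batch in cluster_balanced_batches(labels, batch_size=4):
    print(np.sort(labels[batch]))  # always [0 0 1 1]
```

Feeding such batches to an LSTM trainer keeps rare regimes (e.g. high-latitude soils) represented in every gradient step, which is the mechanism behind the reported R2 gains.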
21. Unveiling sedimentary microbial communities: Restored mangrove wetlands show higher heterogeneity and network stability than natural ones
- Author
-
Kexin Zhang, Changzhi Chen, Dandan Long, Shuang Wang, Jiqiu Li, and Xiaofeng Lin
- Subjects
Bioindicator, Mangrove restoration, Network stability, Sampling strategy, Sedimentary microbial biodiversity, Ecology, QH540-549.5 - Abstract
Mangrove wetlands, characterized by high biodiversity and crucial ecological functions, have witnessed extensive restoration efforts in China in recent decades. Despite the successful recovery of standing vegetation, the alterations in below-ground microbial communities, vital engines for completing biogeochemical cycles in wetlands, remain poorly understood. Here, the prokaryotic and microeukaryotic communities in restored mangrove forests (RMF) and natural mangrove forests (NMF) in Shenzhen, China, home to the world's first International Mangrove Center, were investigated and compared. This work aimed to explore an appropriate sampling strategy for sedimentary microbial community biodiversity study and to evaluate heterogeneity and network stability in RMF and NMF. The results revealed that a "broad-coverage and sparse-spot" sampling strategy was conducive to ensuring a sufficient representation of microbial diversity in mangrove wetlands. Significant differences were found in the microbial community structures of RMF and NMF, with the relative abundances of Myxococcales and Cercozoa as potential biomarkers, respectively. RMF displayed higher prokaryotic richness, microbial community dissimilarity, and spatial turnover rate than NMF, indicating greater microbial diversity and spatial heterogeneity in RMF. Meanwhile, the RMF microbial network was characterized by lower complexity but higher modularity, robustness, and ratio of negative to positive cohesion. These results demonstrated that the microbiomes were shaped by stability constraints and that the RMF microbial network was more stable and resistant to environmental variations than that of NMF. Interestingly, the stability of these microbial networks appeared to depend on the topological roles of key species rather than on their abundance. Overall, this study revealed a relationship between the aboveground elements of restored and natural mangrove forests and their belowground microbial communities.
These findings provide references for comprehensive assessments of biodiversity, stability, and ecological restoration in mangrove ecosystems.
- Published
- 2024
- Full Text
- View/download PDF
22. Dependence of debris flow susceptibility maps on sampling strategy with data-driven grid-based model
- Author
-
Ning Jiang, Fenghuan Su, Ruilong Wei, Yu Huang, Wen Jin, Peng Huang, and Qing Zeng
- Subjects
Debris flow susceptibility ,Sampling strategy ,Weights of evidence ,Logistic regression ,Deep neural network ,Ecology ,QH540-549.5 - Abstract
Different sampling strategies produce varying sample data, which serve as the primary input and directly affect the accuracy of predictions in data-driven grid-based susceptibility models. This study analyzes the accuracy and variation of debris flow susceptibility maps (DFSMs) generated by various sampling strategies. The study area is the Yingxiu region in China, where six sampling strategies were applied, including three sampling locations (deposition area, runout area, and source area) and two sampling types (centroid and polygon) for the debris flow inventory. The effectiveness of 10 conditioning factors used to build the model was assessed by using Pearson correlation coefficient, variance inflation factor, and information gain ratio (IGR) techniques. We then used Weight of Evidence (WofE), Logistic Regression (LR), and Deep Neural Network (DNN) models to produce DFSMs and quantify their performance using the receiver operating characteristic curve (ROC), Accuracy (ACC), Precision, F1 score, and Recall. The results show that the WofE (AUC: 0.754–0.960), LR (AUC: 0.761–0.965), and DNN (AUC: 0.786–0.976) models all perform well, but the DFSMs and dominant factors depend strongly on sampling strategies, especially on sampling location. If the sample areas are excessively large and span across different factor class labels, or if there is a concentration of either large or small sample areas within a specific region, the results of centroid and polygon sampling strategies may differ or even be contradictory. We recommend: (1) determining sampling locations based on the research objectives to provide more accurate evaluation results; (2) selecting the sampling type by first considering the sample size. If the aforementioned conditions are not present, the quicker and more convenient centroid sampling strategy can be chosen; and (3) determining an appropriate sampling strategy and ensuring the accuracy of initial samples are paramount before producing DFSMs.
- Published
- 2024
- Full Text
- View/download PDF
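The models in the abstract above are ranked by AUC. As a reminder of what that number measures, here is a minimal sketch (synthetic scores, not the paper's data) of AUC as the Mann-Whitney probability that a randomly chosen landslide cell outscores a randomly chosen non-landslide cell:

```python
def auc(pos_scores, neg_scores):
    """AUC = P(score_pos > score_neg) + 0.5 * P(tie): the Mann-Whitney statistic."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical susceptibility scores for landslide (positive) and non-landslide (negative) cells.
pos = [0.9, 0.8, 0.75, 0.6]
neg = [0.4, 0.3, 0.6, 0.2]
print(auc(pos, neg))  # → 0.96875
```

An AUC of 0.5 is chance level, so the 0.754–0.976 values reported above indicate strong separation under all three models.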
23. Susceptibility evaluation of Wenchuan coseismic landslides by gradient boosting decision tree and random forest based on optimal negative sample sampling strategies
- Author
-
Yanhao GUO, Jie DOU, Zilin XIANG, Hao MA, Aonan DONG, and Wanqi LUO
- Subjects
random forest(rf) ,gradient boosting decision tree(gbdt) ,machine learning ,frequency ratio(fr) ,sampling strategy ,coseismic landslide ,landslide susceptibility mapping ,Geology ,QE1-996.5 ,Engineering geology. Rock mechanics. Soil mechanics. Underground construction ,TA703-712 - Abstract
Objective: Strong earthquake-induced landslides are characterized by large number, wide distribution and large scale, and seriously threaten people's lives and property. Landslide susceptibility mapping (LSM) can quickly predict the spatial distribution of prone areas, which is highly important for reducing the risk of post-earthquake disasters. However, in the studies of coseismic landslide LSMs, how to select negative landslide samples and integrate machine learning models to improve the evaluation accuracy still needs further investigation. Methods: In this study, the landslides induced by the Wenchuan earthquake in mountainous areas are selected as a case study. First, 10 landslide influencing factors, such as topography, geological environment, and seismic parameters, are selected to analyse the spatial distribution of landslides. Then, collinearity analysis is used to test for data redundancy, and negative (non-landslide) sample points are randomly selected in the extremely low susceptibility regions identified by the frequency ratio (FR) method. Finally, gradient boosting decision tree (GBDT), random forest (RF), and their optimal models are used to predict coseismic landslide susceptibility, conduct a comparative study of the models and carry out an accuracy assessment. Results: The results show that ① the spatial distribution of landslides is controlled by multiple factors, and ② the accuracy of the models is FR-RF(AUC=0.943)>FR-GBDT(AUC=0.926)>RF(AUC=0.901)>GBDT(AUC=0.856). ③ Selecting negative landslide samples in low susceptibility areas could significantly improve the accuracy of LSMs. Conclusion: The research results can provide a reference for selecting negative landslide samples and constructing evaluation models, as well as for providing theoretical support for post-earthquake disaster prevention and mitigation.
- Published
- 2024
- Full Text
- View/download PDF
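The frequency ratio used above to delimit the very-low-susceptibility regions from which negative samples are drawn has a simple form; a sketch on toy data (the function and class labels are illustrative, not from the paper):

```python
from collections import Counter

def frequency_ratio(classes, landslide):
    """FR per factor class: (share of landslide cells in the class) /
    (share of all cells in the class). FR << 1 marks low-susceptibility
    classes, from which negative (non-landslide) samples can be drawn."""
    total = len(classes)
    slides = sum(landslide)
    class_counts = Counter(classes)
    slide_counts = Counter(c for c, y in zip(classes, landslide) if y)
    return {c: (slide_counts.get(c, 0) / slides) / (class_counts[c] / total)
            for c in class_counts}

# Toy raster: slope class per cell and a 0/1 landslide indicator.
classes   = ["steep", "steep", "steep", "flat", "flat", "flat", "flat", "flat"]
landslide = [1,        1,       0,       0,      0,      0,      1,      0]
print(frequency_ratio(classes, landslide))
```

Here "steep" gets FR ≈ 1.78 (over-represented among landslides) while "flat" gets FR ≈ 0.53, so negative samples would preferentially come from the "flat" cells.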
24. Material Selection for Restoration of Genetic Diversity of Abies koreana on Mt. Jirisan in South Korea
- Author
-
Han-na Seo and Hyo-In Lim
- Subjects
expected heterozygosity ,genetic diversity ,population genetics ,sampling strategy ,simple sequence repeat ,sub-alpine coniferous ,Forestry ,SD1-669.5 - Abstract
A strategy is required for selecting appropriate materials for the restoration of Abies koreana on Mt. Jirisan, where the habitat of A. koreana is continuously shrinking. The current study aimed to analyze the genetic characteristics of A. koreana in three subpopulations (Banyabong, Byeoksoryeong, and Cheonwangbong) on Mt. Jirisan using 10 nuclear simple sequence repeat (nSSR) markers and to calculate the sampling distance for each subpopulation to avoid collecting genetically similar samples. Based on the calculated sampling distance, we proposed the size of a sample containing more than 95% of the alleles at a frequency greater than 0.05. AMOVA showed that the difference in genetic variation across subpopulations of A. koreana on Mt. Jirisan was small, approximately 3% of the total. Spatial genetic structure analysis results suggested that it would be appropriate to collect samples of the Banyabong subpopulation at intervals of 10 m or more when sampling A. koreana, whereas for the Byeoksoryeong and Cheonwangbong subpopulations, samples should be collected at intervals of 20 m or more. Results of random sampling of 5 to 30 individuals indicated that, by applying a 10 m distance within the Banyabong subpopulation, more than 95% of the total alleles with a frequency ≥ 0.05 were secured when more than 25 individuals were extracted. Therefore, as a restoration strategy for A. koreana on Mt. Jirisan, we proposed the collection of more than 25 samples, keeping a 10 m distance within the Banyabong subpopulation, which has a relatively high genetic diversity.
- Published
- 2024
- Full Text
- View/download PDF
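The sample-sizing logic above (how many individuals secure ≥95% of alleles with frequency ≥0.05) can be explored by resampling; a hedged sketch with a synthetic one-locus population, where all allele labels and frequencies are made up:

```python
import random

def allele_capture(genotypes, sample_size, common_alleles, trials=200, seed=1):
    """Fraction of 'common' alleles (population frequency >= 0.05) recovered,
    averaged over random subsamples of the given size."""
    rng = random.Random(seed)
    captured = 0
    for _ in range(trials):
        sample = rng.sample(genotypes, sample_size)
        seen = set(a for g in sample for a in g)
        captured += len(seen & common_alleles)
    return captured / (trials * len(common_alleles))

# Toy diploid population: each individual carries two alleles at one locus.
rng = random.Random(0)
pool = ["A"] * 40 + ["B"] * 30 + ["C"] * 20 + ["D"] * 10   # frequencies 0.4/0.3/0.2/0.1
genotypes = [(rng.choice(pool), rng.choice(pool)) for _ in range(100)]
common = {"A", "B", "C", "D"}
for n in (5, 15, 25):
    print(n, round(allele_capture(genotypes, n, common), 3))
```

With these toy frequencies the capture fraction climbs toward 1 as the sample grows, mirroring the paper's finding that about 25 individuals suffice at its recommended spacing.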
25. How do the landslide and non-landslide sampling strategies impact landslide susceptibility assessment? — A catchment-scale case study from China
- Author
-
Zizheng Guo, Bixia Tian, Yuhang Zhu, Jun He, and Taili Zhang
- Subjects
Landslide susceptibility ,Sampling strategy ,Machine learning ,Random forest ,China ,Engineering geology. Rock mechanics. Soil mechanics. Underground construction ,TA703-712 - Abstract
The aim of this study is to investigate the impacts of the sampling strategy of landslide and non-landslide samples on the performance of landslide susceptibility assessment (LSA). The study area is the Feiyun catchment in Wenzhou City, Southeast China. Two types of landslide samples, combined with seven non-landslide sampling strategies, resulted in a total of 14 scenarios. The corresponding landslide susceptibility map (LSM) for each scenario was generated using the random forest model. The receiver operating characteristic (ROC) curve and statistical indicators were calculated and used to assess the impact of the dataset sampling strategy. The results showed that higher accuracies were achieved when using the landslide core as positive samples, combined with non-landslide sampling from the very low zone or buffer zone. The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA, which provides a reference for subsequent researchers aiming to obtain a more reasonable LSM.
- Published
- 2024
- Full Text
- View/download PDF
26. Evaluation of the Simplified Method for the Assessment of Exposure to Organic Solvents in Paint and Thinner Industries
- Author
-
Chakroun, Radhouane, Pisello, Anna Laura, Editorial Board Member, Hawkes, Dean, Editorial Board Member, Bougdah, Hocine, Editorial Board Member, Rosso, Federica, Editorial Board Member, Abdalla, Hassan, Editorial Board Member, Boemi, Sofia-Natalia, Editorial Board Member, Mohareb, Nabil, Editorial Board Member, Mesbah Elkaffas, Saleh, Editorial Board Member, Bozonnet, Emmanuel, Editorial Board Member, Pignatta, Gloria, Editorial Board Member, Mahgoub, Yasser, Editorial Board Member, De Bonis, Luciano, Editorial Board Member, Kostopoulou, Stella, Editorial Board Member, Pradhan, Biswajeet, Editorial Board Member, Abdul Mannan, Md., Editorial Board Member, Alalouch, Chaham, Editorial Board Member, Gawad, Iman O., Editorial Board Member, Nayyar, Anand, Editorial Board Member, Amer, Mourad, Series Editor, Ksibi, Mohamed, editor, Negm, Abdelazim, editor, Hentati, Olfa, editor, Ghorbal, Achraf, editor, Sousa, Arturo, editor, Rodrigo-Comino, Jesus, editor, Panda, Sandeep, editor, Lopes Velho, José, editor, El-Kenawy, Ahmed M., editor, and Perilli, Nicola, editor
- Published
- 2024
- Full Text
- View/download PDF
27. Uncertainty quantification in the strain response of prestressed reinforced concrete structures using fractile based sampling.
- Author
-
Xingjian Wang, Strauss, Alfred, Randl, Norbert, and Bocchini, Paolo
- Subjects
- *
PRESTRESSED concrete , *REINFORCED concrete , *PRESTRESSED concrete beams , *LATIN hypercube sampling , *FINITE element method , *PREDICATE calculus , *CONCRETE beams - Abstract
The probabilistic evaluation of the capacity and response of structural systems is often done by probabilistic simulation. This approach relies on a considerable number of deterministic analyses to obtain an accurate estimation of the desired statistical characteristics. However, the need to perform a large number of analyses hinders the application of probabilistic simulation in practice, especially when the deterministic calculation of the structural response is computationally expensive in itself (e.g. nonlinear finite element analysis). To increase the appeal of probabilistic simulation, the number of analyses should be very small (i.e. ten or fewer). This study shows how different sampling techniques can be adopted to select a very small sample subset of analyses to be run, and how they can yield accurate results in estimating the shear capacity of prestressed reinforced concrete beams. Based on the results of the study, the Fractile Based Sampling (FBS) method emerges as a more promising sampling strategy than other sampling techniques, like Latin Hypercube Sampling (LHS). For the same small number of samples, the results show that FBS is preferable over LHS and other sampling techniques for capturing both the general distribution of the response and the tails of the distribution, which shows its usefulness in assessing probability of failure and reliability. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
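Both techniques compared above stratify the input distribution before inverting its CDF. A rough stdlib-only sketch: a standard LHS column, and a "fractile-based" scheme simplified to deterministic stratum midpoints (an assumption for illustration; the paper's FBS is more elaborate, and the Gaussian input parameter is invented):

```python
import random
from statistics import NormalDist

def latin_hypercube(n, rng):
    """One LHS column: one uniform draw inside each of n equal strata, shuffled."""
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    return u

def fractile_samples(n):
    """Deterministic fractile-based probabilities: stratum midpoints
    (a simplified stand-in for the paper's FBS scheme)."""
    return [(i + 0.5) / n for i in range(n)]

rng = random.Random(42)
dist = NormalDist(mu=30.0, sigma=3.0)          # e.g. a hypothetical concrete strength parameter
lhs = [dist.inv_cdf(p) for p in latin_hypercube(8, rng)]
fbs = [dist.inv_cdf(p) for p in fractile_samples(8)]
print([round(x, 2) for x in fbs])
```

The fractile set is deterministic and symmetric about the mean, which is why fractile-type schemes can pin down distribution tails with very few model runs, while LHS retains randomness within each stratum.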
28. Sampling the understory, midstory, and canopy is necessary to fully characterize native bee communities of temperate forests and their dynamic environmental relationships.
- Author
-
Cunningham-Minnick, Michael J., Roberts, H. Patrick, Milam, Joan, and King, David I.
- Subjects
TEMPERATE forests ,COMMUNITY forests ,FOREST canopies ,HONEY ,NONLINEAR analysis - Abstract
Introduction: Native bee communities of temperate forests are conventionally sampled from the understory, yet there is growing evidence that bee assemblages in forest canopies are distinct from those in the understory. Therefore, conventional approaches to quantify forest bee-habitat relationships may not comprehensively characterize forest bee communities. Methods: To examine this, we sampled bees 1–26 m from ground level at 5-m increments at 47 locations in forests located in western Massachusetts, USA. We evaluated bee abundance and species richness responses to a suite of environmental factors measured in the understory with linear and segmented regression comparing four bee sampling strategies: (1) understory sampling only, (2) understory and midstory, (3) understory and canopy, and (4) all strata combined. Results: We found that not sampling higher strata underestimated bee abundance and species richness, and linear models had less ability to explain the data when bees of higher strata were included. Among strategies, responses analyzed linearly differed in magnitude due to overall differences in abundance and species richness, but segmented regressions showed relationships with understory characteristics that also differed in slope, which would alter interpretation. Discussion: Collectively, our findings highlight the value of including vertically stratified sampling strategies throughout the flight season to fully characterize native bee and other pollinator communities of forests. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
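The segmented regressions mentioned above fit separate slopes on either side of a breakpoint. A minimal discontinuous ("broken-stick") variant with a grid-searched breakpoint, run here on synthetic data (a simplification of the authors' analysis; all names and numbers are made up):

```python
def ols(xs, ys):
    """Simple least-squares line: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / sum((a - mx) ** 2 for a in xs)
    return my - slope * mx, slope

def segmented_fit(x, y, candidates):
    """Grid-search a breakpoint c, fitting one line to x < c and one to x >= c;
    keep the split with the smallest residual sum of squares."""
    best = None
    for c in candidates:
        left = [(a, b) for a, b in zip(x, y) if a < c]
        right = [(a, b) for a, b in zip(x, y) if a >= c]
        if len(left) < 3 or len(right) < 3:
            continue
        i1, s1 = ols(*zip(*left))
        i2, s2 = ols(*zip(*right))
        rss = sum((i1 + s1 * a - b) ** 2 for a, b in left)
        rss += sum((i2 + s2 * a - b) ** 2 for a, b in right)
        if best is None or rss < best[0]:
            best = (rss, c, s1, s2)
    return best

# Synthetic 'abundance vs. understory covariate' with a slope change near x = 5.
x = [i * 0.1 for i in range(101)]
y = [2.0 * a if a < 5 else 10.0 + 0.5 * (a - 5) for a in x]
rss, c, slope_left, slope_right = segmented_fit(x, y, candidates=[2 + 0.1 * k for k in range(61)])
print(round(c, 1), round(slope_left, 2), round(slope_right, 2))
```

A slope that differs across the breakpoint, as in this toy fit, is exactly the kind of change in interpretation the abstract warns about when higher strata are excluded.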
29. The optimal CUSUM control chart with a dynamic non-random control limit and a given sampling strategy for small samples sequence.
- Author
-
Han, Dong, Tsung, Fugee, and Qiao, Lei
- Subjects
- *
CHANGE-point problems , *QUALITY control charts , *EARTHQUAKES , *SOIL sampling , *COMPUTER simulation - Abstract
This article proposes a performance measure to evaluate the detection performance of a control chart with a given sampling strategy for finite or small sample sequences and proves that the CUSUM control chart with a dynamic non-random control limit and a given sampling strategy can be optimal under the measure. Numerical simulations and real data for an earthquake are provided to illustrate that, for different sampling strategies, the CUSUM chart will have different monitoring performance in change-point detection. Among the six sampling strategies that use only part of the samples, the numerical comparison results illustrate that the uniform sampling strategy (uniformly dispersed sampling strategy) has the best monitoring effect. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
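For reference, the classical one-sided CUSUM recursion that this work builds on can be sketched as follows, here with a fixed control limit h rather than the paper's dynamic non-random limit, and with invented observations:

```python
def cusum(xs, mu0, k, h):
    """One-sided upper CUSUM: S_t = max(0, S_{t-1} + x_t - mu0 - k).
    Returns (statistics, index of first alarm where S_t > h, or None)."""
    s, stats, alarm = 0.0, [], None
    for i, x in enumerate(xs):
        s = max(0.0, s + x - mu0 - k)
        stats.append(s)
        if alarm is None and s > h:
            alarm = i
    return stats, alarm

# In-control mean 0; an upward shift begins at index 5.
xs = [0.1, -0.2, 0.0, 0.3, -0.1, 1.4, 1.2, 1.5, 1.7, 1.3]
stats, alarm = cusum(xs, mu0=0.0, k=0.5, h=2.0)
print(alarm)  # → 7 (alarm shortly after the shift)
```

The reference value k absorbs in-control noise (the statistic stays at 0 before the shift), and a sampling strategy in the paper's sense decides which of the xs are actually observed.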
30. Construction of a core collection of sour jujube based on phenotypic traits.
- Author
-
陈建华, 曲凯伦, 张云程, 孙永强, 李 彪, 康 莹, and 董胜君
- Abstract
Copyright of Journal of Shenyang Agricultural University is the property of Journal of Shenyang Agricultural University Editorial Department and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
31. Morris method supporting building optimization.
- Author
-
Vincze, Nándor, Horváth, Kristóf Roland, Kistelegdi, István, Storcz, Tamás, and Ercsey, Zsolt
- Subjects
SENSITIVITY analysis ,ENERGY consumption ,DATABASES ,DYNAMIC simulation - Abstract
As part of the energy design synthesis method, a complex dynamic building simulation database was created with the IDA ICE code for all family-house building configurations of a considered problem. In this paper, the annual heat energy demand output parameter serves as the basis of a building energy design investigation. Sensitivity analysis was performed using Morris' elementary effects method. From this sensitivity analysis of the output parameter, the most important input parameters influencing the buildings' energy efficiency can be identified, which can support further building designs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
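Morris' elementary effects method screens inputs by perturbing one factor at a time from random base points. A stdlib sketch on a toy surrogate (the function and its parameters are made up for illustration, not the IDA ICE model):

```python
import random

def morris_screen(f, n_inputs, n_traj=20, delta=0.25, seed=0):
    """Morris elementary-effects screening on [0,1]^k:
    EE_i = (f(x + delta*e_i) - f(x)) / delta over random base points.
    Returns (mu_star, sigma): mean |EE| and EE standard deviation per input."""
    rng = random.Random(seed)
    ees = [[] for _ in range(n_inputs)]
    for _ in range(n_traj):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_inputs)]  # keep x + delta in [0,1]
        fx = f(x)
        for i in range(n_inputs):
            xp = list(x)
            xp[i] += delta
            ees[i].append((f(xp) - fx) / delta)
    mu_star = [sum(abs(e) for e in es) / len(es) for es in ees]
    sigma = []
    for es in ees:
        m = sum(es) / len(es)
        sigma.append((sum((e - m) ** 2 for e in es) / len(es)) ** 0.5)
    return mu_star, sigma

# Toy 'heat demand' surrogate: strong linear effect of x0, an x1*x2 interaction, x3 inert.
f = lambda x: 5.0 * x[0] + 2.0 * x[1] * x[2] + 0.0 * x[3]
mu_star, sigma = morris_screen(f, 4)
print([round(m, 2) for m in mu_star])  # x0 dominates; x3 is inert
```

High mu_star flags influential inputs; high sigma relative to mu_star flags interactions or nonlinearity (as for x1 and x2 here), which is how the method ranks building design parameters.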
32. The Italian quick survey on the effects of the COVID‐19 health emergency on businesses: Sampling strategy and data editing.
- Author
-
Casciano, M. C. and Varriale, R.
- Abstract
During the first phase of the COVID‐19 pandemic, Istat performed the quick survey "Situation and perspectives of Italian enterprises during the COVID‐19 health emergency," with the aim of assessing the economic situation and the specific actions adopted by businesses to reduce the economic impacts of the emergency. To ensure continuity in the information flow and to analyze the temporal evolution of the observed phenomena, the survey was repeated in three different waves. The outcomes of each wave were released just 2 months after the launch of the survey. The present work analyses the characteristics of the sampling strategy and describes the complexity of the data editing process for a survey planned to produce estimates that ensure an acceptable level of accuracy with maximum timeliness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
33. Age of Information Joint Optimization for an Energy Harvesting Network With Erasure Channel
- Author
-
Qihang Qin and Hengzhou Ye
- Subjects
Age of information ,energy harvesting network ,sampling strategy ,scheduling strategy ,erasure channel ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
AoI (Age of Information), a measure of data freshness, is an important consideration in the optimal design of energy-harvesting IoT (Internet of Things) systems. Existing AoI studies tend to optimize only the base station scheduling process or only the source node update process, and rarely consider energy constraints and unreliable channels. Therefore, a joint optimization strategy is proposed for an energy-harvesting IoT scenario based on heterogeneous source nodes with erasure channels. Based on the energy constraint and channel quality, the base station selects the preferred source node for sample transmission according to the difference in AoI benefit and time cost brought by selecting different source nodes. The source node determines the sample collection moment by taking into account the impact of energy harvesting on sample collection and transmission, as well as the impact of sample collection on the AoI. Simulation analysis shows that the method achieves a better long-term average AoI in different scenarios than baseline methods.
- Published
- 2024
- Full Text
- View/download PDF
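The long-term average AoI optimized above is the area under a sawtooth: age grows at rate 1 and resets to the delivery delay when an update arrives. A minimal sketch with illustrative timestamps (assumes age 0 at t = 0):

```python
def average_aoi(events, horizon):
    """Time-average Age of Information for a sawtooth process.
    events: (delivery_time, generation_time) pairs, sorted by delivery time.
    Age grows at rate 1 and drops to (delivery - generation) on each delivery."""
    area, t, age = 0.0, 0.0, 0.0
    for delivered, generated in events:
        dt = delivered - t
        area += age * dt + 0.5 * dt * dt       # area under the rising ramp
        t, age = delivered, delivered - generated
    dt = horizon - t
    area += age * dt + 0.5 * dt * dt           # tail segment up to the horizon
    return area / horizon

# Updates generated at t = 0, 2, 5 and delivered at t = 1, 3, 7.
events = [(1.0, 0.0), (3.0, 2.0), (7.0, 5.0)]
print(average_aoi(events, horizon=8.0))  # → 2.375
```

Both levers studied in the paper show up here: the sampling strategy sets the generation times, and the scheduling strategy (with channel erasures) sets which deliveries happen and when.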
34. Exploring Sampling Strategies and Genetic Diversity Analysis of Red Beet Germplasm Resources Using SSR Markers
- Author
-
Xiangjia Wu, Zhi Pi, Shengnan Li, and Zedong Wu
- Subjects
red beet ,germplasm resource ,SSR ,sampling strategy ,analysis of genetic diversity ,Plant culture ,SB1-1110 - Abstract
Using 14 SSR primer pairs, we analyzed and compared the amplification results of 534 DNA samples from six red sugar beet germplasm resources under three treatments. These data were used to explore a sampling strategy for these resources. Based on the sampling strategy results, 21 SSR primer pairs were used to evaluate the genetic diversity of 47 red sugar beet germplasm resources. The six population genetic parameters used to explore the sampling strategy revealed that individual plants within a population had a relatively large genetic distance. The genetic parameters Ne, I, and Nei’s of the randomly mixed sampling samples increased rapidly between 10 and 30 plants before decreasing. Therefore, when SSR technology was used to analyze the genetic diversity of the red sugar beet germplasm resources, the optimal sampling gradient for each population was a random single-plant mixed sample of no fewer than 10 and no more than 30 plants. The 21 SSR primer pairs were used to detect genetic diversity in 30 random mixed samples of the 47 resources. The average polymorphic information content (PIC) was 0.5738, the average number of observed alleles (Na) was 4.1905, the average number of effective alleles (Ne) was 2.8962, the average Shannon’s information index (I) was 1.1299, the average Nei’s expected heterozygosity (Nei’s) was 0.6127, and the average expected heterozygosity (He) was 0.6127. The genetic distance of the 47 germplasm resources ranged from 0.0225 to 0.551 (average: 0.316). According to the population structure analysis, the most suitable K value was six, indicating the presence of six populations. Based on the clustering analysis results, the 47 germplasm resources were segregated into six groups, with obvious clustering and some germplasm resources having groups with close genetic relationships. 
We here established a more accurate and scientific sampling strategy for analyzing the genetic diversity of red sugar beet germplasm resources by using SSR molecular markers. The findings provide a reference for collecting and preserving red sugar beet germplasms and protecting their genetic diversity.
- Published
- 2024
- Full Text
- View/download PDF
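The diversity parameters reported above (Ne, I, Nei's/He) are simple functions of allele frequencies; a stdlib sketch with a hypothetical SSR locus (the frequencies are made up):

```python
from math import log

def shannon_index(freqs):
    """Shannon's information index: I = -sum(p * ln p)."""
    return -sum(p * log(p) for p in freqs if p > 0)

def expected_heterozygosity(freqs):
    """Nei's gene diversity / expected heterozygosity: He = 1 - sum(p^2)."""
    return 1.0 - sum(p * p for p in freqs)

def effective_alleles(freqs):
    """Effective number of alleles: Ne = 1 / sum(p^2)."""
    return 1.0 / sum(p * p for p in freqs)

# Allele frequencies at one hypothetical SSR locus.
p = [0.5, 0.3, 0.2]
print(round(shannon_index(p), 4),
      round(expected_heterozygosity(p), 4),
      round(effective_alleles(p), 4))
```

In the paper these statistics are averaged over loci for each random subsample, which is how the 10-to-30-plant sampling gradient is detected.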
35. Primer set evaluation and sampling method assessment for the monitoring of fish communities in the North‐western part of the Mediterranean Sea through eDNA metabarcoding
- Author
-
Sylvain Roblet, Fabrice Priouzeau, Gilles Gambini, Benoit Dérijard, and Cécile Sabourault
- Subjects
biodiversity assessment ,eDNA metabarcoding ,fish monitoring ,North‐western Mediterranean Sea ,PCR primer sets ,sampling strategy ,Environmental sciences ,GE1-350 ,Microbial ecology ,QR100-130 - Abstract
Abstract Environmental DNA (eDNA) metabarcoding appears to be a promising tool to survey fish communities. However, the effectiveness of this method relies on primer set performance and on a robust sampling strategy. While some studies have evaluated the efficiency of several primers for fish detection, it has not yet been assessed in situ for the Mediterranean Sea. In addition, mainly surface waters were sampled and no filter porosity testing was performed. In this pilot study, our aim was to evaluate the ability of six primer sets, targeting 12S rRNA (AcMDB07; MiFish; Tele04) or 16S rRNA (Fish16S; Fish16SFD; Vert16S) loci, to detect fish species in the Mediterranean Sea using a metabarcoding approach. We also assessed the influence of sampling depth and filter pore size (0.45 μm versus 5 μm filters). To achieve this, we developed a novel sampling strategy allowing simultaneous surface and bottom on‐site filtration of large water volumes along the same transect. We found that 16S rRNA primer sets enabled more fish taxa to be detected across each taxonomic level. The best combination was Fish16S/Vert16S/AcMDB07, which recovered 95% of the 97 fish species detected in our study. There were highly significant differences in species composition between surface and bottom samples. Filters of 0.45 μm led to the detection of significantly more fish species. Therefore, to maximize fish detection in the studied area, we recommend filtering both surface and bottom waters through 0.45 μm filters and using a combination of these three primer sets.
- Published
- 2024
- Full Text
- View/download PDF
36. Sampling the understory, midstory, and canopy is necessary to fully characterize native bee communities of temperate forests and their dynamic environmental relationships
- Author
-
Michael J. Cunningham-Minnick, H. Patrick Roberts, Joan Milam, and David I. King
- Subjects
sampling strategy ,native bee ,vertical gradient ,nonlinear analysis ,forest strata ,closed-canopy forest ,Evolution ,QH359-425 ,Ecology ,QH540-549.5 - Abstract
Introduction: Native bee communities of temperate forests are conventionally sampled from the understory, yet there is growing evidence that bee assemblages in forest canopies are distinct from those in the understory. Therefore, conventional approaches to quantify forest bee–habitat relationships may not comprehensively characterize forest bee communities. Methods: To examine this, we sampled bees 1–26 m from ground level at 5-m increments at 47 locations in forests located in western Massachusetts, USA. We evaluated bee abundance and species richness responses to a suite of environmental factors measured in the understory with linear and segmented regression comparing four bee sampling strategies: (1) understory sampling only, (2) understory and midstory, (3) understory and canopy, and (4) all strata combined. Results: We found that not sampling higher strata underestimated bee abundance and species richness, and linear models had less ability to explain the data when bees of higher strata were included. Among strategies, responses analyzed linearly differed in magnitude due to overall differences in abundance and species richness, but segmented regressions showed relationships with understory characteristics that also differed in slope, which would alter interpretation. Discussion: Collectively, our findings highlight the value of including vertically stratified sampling strategies throughout the flight season to fully characterize native bee and other pollinator communities of forests.
- Published
- 2024
- Full Text
- View/download PDF
37. Interest of the appropriate sampling strategy and analysis procedure for a representative assessment of the agricultural soil pollution level of the Pb-Zn mining site of Jbel Ressas (NE Tunisia).
- Author
-
Nasraoui, Rawya, Trifi, Mariem, Romdhan, Dalila Fkih, Charef, Abdelkrim, Fitouhi, Imen, Attia, Rafla, and Ayari, Jamel
- Subjects
AGRICULTURAL pollution ,SOIL pollution ,METAL tailings ,AGRICULTURE ,EUROPEAN communities ,SOIL texture ,TRACE elements - Abstract
The Jbel Ressas site is an ancient Pb-Zn mine located in the northwest of Tunisia. The long activity of the mine has produced a huge volume of waste (3 tailings) rich in bioavailable fractions of Pb+Zn+Cd, which have heavily polluted the surrounding agricultural soil. However, the results obtained from previous studies of this site are so variable that some authors have considered it unpolluted. It was suspected that the improper choice of sampling strategy and of the extraction procedures for evaluating the labile toxic trace element fractions were the sources of these differences. Therefore, the objective of this comparative study was to demonstrate the importance of the choice of sampling strategy and extraction procedure for a representative evaluation of mobile pollutant fractions. The critical step in any geochemical study is obtaining representative site samples while considering that each site is unique. Therefore, environmental concerns, soil texture, principal wind directions, land slopes and topography, runoff flow directions, and equal distances between the sampled plots were considered. Based on the mean total Pb+Zn+Cd concentrations obtained from samples collected in 2009, 2017, and 2021 (the present study) and previous results, the regularity and irregularity over time of the chronological evolution of the physicochemical parameters and the pollution status, respectively, were strong arguments supporting the choice of our sampling strategy. To select the appropriate extraction procedure for labile TTE fractions, the commonly used procedures, including those used in previous studies, were tested. Using simple-mode extractions, the previous authors extracted variable and even very small quantities of labile TTE compared to similar contexts, and considered that the polluted Jbel Ressas site presented no or a low environmental risk. 
However, the BCR (European Community Bureau of Reference) procedure used in the present studies leached high quantities of the available toxic trace elements, which confirmed the high pollution level of the site and its environmental threat. The significant correlations among soil physicochemical parameters and total and labile TTE contents of the present study, deduced from statistical analysis, confirmed that the selected sampling strategy and the selected extraction procedure provided representative samples of the site and a fairly accurate assessment of its pollution levels. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Preserving genetic diversity in Pinus tabuliformis breeding population through core collection development.
- Author
-
Yang, Boning, Wang, Huili, Xia, Qijing, El-Kassaby, Yousry A., and Li, Wei
- Abstract
The conservation of genetic diversity is a crucial aspect of forest tree breeding programs, necessitating strategies for its safeguard. Here, the extent of genetic diversity was assessed in 260 Chinese pine (Pinus tabuliformis Carr.) germplasm samples from five provenances using 24 SSR markers. We systematically compared various methods for constructing a core collection aimed at conserving genetic diversity, and the results revealed substantial genetic diversity within this germplasm collection. Extensive gene exchange was observed among four of the five sampled provenances, resulting in two genetically distinct groups. To construct the core collection, six different sampling strategies (PowerCore, Power marker_allele number, Power marker_gene entropy, Power marker_gene diversity, Corehunter, and genetic distance-based) and five different sampling sizes (ranging from 10 to 30%) were employed. Comparative analysis of genetic diversity parameters was conducted across the identified 26 subsets, utilizing the PowerCore strategy as the primary approach for capturing all allelic variation present in the core collection, which consisted of only 61 individuals. A supplementary collection of 20 individuals with high genetic variation was identified to provide a final core collection of 81 individuals, representing 31.2% of the initial collection. The constructed core collection effectively captured the genetic diversity present in the initial collection and serves as a valuable resource for preserving genetic richness within the breeding population. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
39. Data Presentation and Analysis
- Author
-
Muhamad, Goran M., Heshmati, Almas, Series Editor, and Muhamad, Goran M.
- Published
- 2023
- Full Text
- View/download PDF
40. Significance and Implications of Noise Mapping for Noise Pollution Control
- Author
-
Kumar, S., Chauhan, B. S., Garg, N., Angrisani, Leopoldo, Series Editor, Arteaga, Marco, Series Editor, Panigrahi, Bijaya Ketan, Series Editor, Chakraborty, Samarjit, Series Editor, Chen, Jiming, Series Editor, Chen, Shanben, Series Editor, Chen, Tan Kay, Series Editor, Dillmann, Rüdiger, Series Editor, Duan, Haibin, Series Editor, Ferrari, Gianluigi, Series Editor, Ferre, Manuel, Series Editor, Hirche, Sandra, Series Editor, Jabbari, Faryar, Series Editor, Jia, Limin, Series Editor, Kacprzyk, Janusz, Series Editor, Khamis, Alaa, Series Editor, Kroeger, Torsten, Series Editor, Li, Yong, Series Editor, Liang, Qilian, Series Editor, Martín, Ferran, Series Editor, Ming, Tan Cher, Series Editor, Minker, Wolfgang, Series Editor, Misra, Pradeep, Series Editor, Möller, Sebastian, Series Editor, Mukhopadhyay, Subhas, Series Editor, Ning, Cun-Zheng, Series Editor, Nishida, Toyoaki, Series Editor, Oneto, Luca, Series Editor, Pascucci, Federica, Series Editor, Qin, Yong, Series Editor, Seng, Gan Woon, Series Editor, Speidel, Joachim, Series Editor, Veiga, Germano, Series Editor, Wu, Haitao, Series Editor, Zamboni, Walter, Series Editor, Zhang, Junjie James, Series Editor, Yadav, Sanjay, editor, Chaudhary, K.P., editor, Gahlot, Ajay, editor, Arya, Yogendra, editor, Dahiya, Aman, editor, and Garg, Naveen, editor
- Published
- 2023
- Full Text
- View/download PDF
41. Repeatability and accuracy of various region-of-interest sampling strategies for hepatic MRI proton density fat fraction quantification
- Author
-
Hong, Cheng William, Cui, Jennifer Y, Batakis, Danielle, Xu, Yang, Wolfson, Tanya, Gamst, Anthony C, Schlein, Alexandra N, Negrete, Lindsey M, Middleton, Michael S, Hamilton, Gavin, Loomba, Rohit, Schwimmer, Jeffrey B, Fowler, Kathryn J, and Sirlin, Claude B
- Subjects
Biomedical and Clinical Sciences ,Clinical Sciences ,Biomedical Imaging ,Digestive Diseases ,Liver Disease ,Adult ,Humans ,Liver ,Magnetic Resonance Imaging ,Male ,Non-alcoholic Fatty Liver Disease ,Prospective Studies ,Protons ,Reproducibility of Results ,Young Adult ,Hepatic PDFF ,Repeatability ,Region-of-interest ,Sampling strategy ,Hepatic fat quantification ,Quantitative imaging biomarker ,QIB ,Nuclear Medicine & Medical Imaging ,Clinical sciences - Abstract
Purpose: To evaluate repeatability of ROI-sampling strategies for quantifying hepatic proton density fat fraction (PDFF) and to assess error relative to the 9-ROI PDFF. Methods: This was a secondary analysis in subjects with known or suspected nonalcoholic fatty liver disease who underwent MRI for magnitude-based hepatic PDFF quantification. Each subject underwent three exams, each including three acquisitions (nine acquisitions total). An ROI was placed in each hepatic segment on the first acquisition of the first exam and propagated to other acquisitions. PDFF was calculated for each of 511 sampling strategies using every combination of 1, 2, …, all 9 ROIs. Intra- and inter-exam intra-class correlation coefficients (ICCs) and repeatability coefficients (RCs) were estimated for each sampling strategy. Mean absolute error (MAE) was estimated relative to the 9-ROI PDFF. Strategies that sampled both lobes evenly ("balanced") were compared with those that did not ("unbalanced") using two-sample t tests. Results: The 29 enrolled subjects (23 male, mean age 24 years) had mean 9-ROI PDFF 11.8% (1.1-36.3%). With more ROIs, ICCs increased, RCs decreased, and MAE decreased. Of the 60 balanced strategies with 4 ROIs, all (100%) achieved inter- and intra-exam ICCs > 0.998, 55 (92%) achieved intra-exam RC
- Published
- 2021
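The 511 sampling strategies in the record above are simply all non-empty subsets of the nine hepatic ROIs (2^9 - 1 = 511). A minimal sketch of the enumeration, using hypothetical per-segment PDFF values rather than study data:

```python
from itertools import combinations

def subset_strategies(roi_pdff):
    """Enumerate every non-empty subset of the hepatic ROIs and return
    each subset's mean PDFF together with its absolute error relative
    to the all-ROI reference mean."""
    reference = sum(roi_pdff) / len(roi_pdff)
    results = []
    for k in range(1, len(roi_pdff) + 1):
        for combo in combinations(range(len(roi_pdff)), k):
            est = sum(roi_pdff[i] for i in combo) / len(combo)
            results.append((combo, est, abs(est - reference)))
    return results

# Hypothetical per-segment PDFF values (%) for one subject
pdff = [10.2, 11.5, 12.8, 9.9, 13.1, 12.0, 11.2, 10.8, 12.4]
strategies = subset_strategies(pdff)
print(len(strategies))  # 511 = 2**9 - 1 sampling strategies
```

Ranking the strategies by error against the 9-ROI mean reproduces the qualitative finding that MAE shrinks as more ROIs are included.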
42. Managing work flow in high enrolling trials: The development and implementation of a sampling strategy in the PREPARE trial.
- Author
-
Pogorzelski, David, Nguyen, Uyen, McKay, Paula, Thabane, Lehana, Camara, Megan, Ramsey, Lolita, Seymour, Rachel, Goodman, J Brett, McGee, Sheketha, Fraifogl, Joanne, Hudgins, Andrea, Tanner, Stephanie L, Bhandari, Mohit, Slobogean, Gerard P, Sprague, Sheila, PREP-IT Investigators Executive Committee:, Steering Committee, Adjudication Committee, Data and Safety Monitoring Committee, Research Methodology Core, Patient Centred Outcomes Core, Orthopaedic Surgery Core, Operating Room Core, Infectious Disease Core, Military Core, McMaster University Methods Center, University of Maryland School of Medicine Administrative Center, University of Maryland School of Pharmacy, The PATIENTS Program, PREP-IT Clinical Sites: Lead Clinical Site (Aqueous-PREP and PREPARE), Aqueous-PREP and PREPARE, Aqueous-PREP, PREPARE, and PREP-IT Investigators Executive Committee
- Subjects
PREP-IT Investigators Executive Committee: ,Steering Committee ,Adjudication Committee ,Data and Safety Monitoring Committee ,Research Methodology Core ,Patient Centred Outcomes Core ,Orthopaedic Surgery Core ,Operating Room Core ,Infectious Disease Core ,Military Core ,McMaster University Methods Center ,University of Maryland School of Medicine Administrative Center ,University of Maryland School of Pharmacy ,The PATIENTS Program ,PREP-IT Clinical Sites: Lead Clinical Site ,Aqueous-PREP and PREPARE ,Aqueous-PREP ,PREPARE ,PREP-IT Investigators Executive Committee ,Cluster crossover ,Pragmatic ,Sampling ,Sampling framework ,Sampling strategy ,Work flow - Abstract
Introduction: Pragmatic trials in comparative effectiveness research assess the effects of different treatment, therapeutic, or healthcare options in clinical practice. They are characterized by broad eligibility criteria and large sample sizes, which can lead to an unmanageable number of participants, increasing the risk of bias and affecting the integrity of the trial. We describe the development of a sampling strategy tool and its use in the PREPARE trial to circumvent the challenge of unmanageable work flow. Methods: Given the broad eligibility criteria and high fracture volume at participating clinical sites in the PREPARE trial, a pragmatic sampling strategy was needed. Using data from PREPARE, descriptive statistics were used to describe the use of the sampling strategy across clinical sites. A Chi-square test was performed to explore whether use of the sampling strategy was associated with a reduction in the number of missed eligible patients. Results: 7 of 20 clinical sites (35%) elected to adopt a sampling strategy. There were 1539 patients excluded due to the use of the sampling strategy, which represents 30% of all excluded patients and 20% of all patients screened for participation. Use of the sampling strategy was associated with lower odds of missed eligible patients (297/4545 (6.5%) versus 341/3200 (10.7%) p
- Published
- 2021
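The association reported above (297/4545 vs. 341/3200 missed eligible patients) can be checked with a Pearson chi-square test on the 2x2 table. The sketch below implements the statistic directly from the closed-form 2x2 formula rather than relying on a statistics package:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]],
    without continuity correction: n*(ad - bc)^2 / (row and column totals)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Missed eligible patients reported in the abstract:
# sampling sites: 297 missed of 4545 screened; non-sampling: 341 of 3200
missed_s, screened_s = 297, 4545
missed_n, screened_n = 341, 3200
stat = chi_square_2x2(missed_s, screened_s - missed_s,
                      missed_n, screened_n - missed_n)
print(round(stat, 1))
```

The resulting statistic of roughly 42 on one degree of freedom is far beyond the p < 0.001 threshold (10.83), consistent with the abstract's conclusion.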
43. Reducing clustering of readouts in non-Cartesian cine magnetic resonance imaging
- Author
-
Datta Singh Goolaub and Christopher K. Macgowan
- Subjects
Non-Cartesian MRI ,Clustering ,Imaging artifact ,Sampling strategy ,Diseases of the circulatory (Cardiovascular) system ,RC666-701 - Abstract
Background: Non-Cartesian magnetic resonance imaging trajectories at golden angle increments have the advantage of allowing motion correction and gating using intermediate real-time reconstructions. However, when the acquired data are cardiac binned for cine imaging, trajectories can cluster together at certain heart rates (HR) causing image artifacts. Here, we demonstrate an approach to reduce clustering by inserting additional angular increments within the trajectory, and optimizing them while still allowing for intermediate reconstructions. Methods: Three acquisition models were simulated under constant and variable HR: golden angle (Mtrd), random additional angles (Mrnd), and optimized additional angles (Mopt). The standard deviations of trajectory angular differences (STAD) were compared through their interquartile ranges (IQR) and the Kolmogorov-Smirnov test (significance level: p = 0.05). Agreement between an image reconstructed with uniform sampling and images from Mtrd, Mrnd, and Mopt was analyzed using the structural similarity index measure (SSIM). Mtrd and Mopt were compared in three adults at high, low, and no HR variability. Results: STADs from Mtrd were significantly different (p
- Published
- 2024
- Full Text
- View/download PDF
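The clustering problem described above arises when golden-angle readouts (increments of about 111.25 degrees) are cardiac binned at an unfavourable heart rate. A simplified sketch of the binning and of a gap-spread metric (a stand-in for the paper's STAD; the TR and RR values in the test usage are invented):

```python
import math

GOLDEN = 180 * (math.sqrt(5) - 1) / 2  # ~111.246 degrees per readout

def binned_angles(n_readouts, tr_ms, rr_ms, n_bins):
    """Assign golden-angle readouts to cardiac bins, assuming a constant
    RR interval; returns the list of readout angles landing in each bin."""
    bins = [[] for _ in range(n_bins)]
    for i in range(n_readouts):
        angle = (i * GOLDEN) % 360
        phase = (i * tr_ms) % rr_ms / rr_ms  # position within cardiac cycle
        bins[int(phase * n_bins)].append(angle)
    return bins

def gap_spread(angles):
    """Spread of angular gaps within a bin; large values mean the radial
    lines cluster together, which produces streaking artifacts."""
    s = sorted(a % 180 for a in angles)  # radial lines repeat every 180 deg
    if len(s) < 2:
        return 0.0
    gaps = [b - a for a, b in zip(s, s[1:])]
    mean = sum(gaps) / len(gaps)
    return math.sqrt(sum((g - mean) ** 2 for g in gaps) / len(gaps))
```

Sweeping rr_ms while holding tr_ms fixed shows heart rates at which the per-bin angles bunch together, the artifact source that the paper's optimized additional angles are designed to suppress.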
44. Objective evaluation-based efficient learning framework for hyperspectral image classification
- Author
-
Xuming Zhang, Jian Yan, Jia Tian, Wei Li, Xingfa Gu, and Qingjiu Tian
- Subjects
deep learning ,fully convolutional network ,feature extraction ,sampling strategy ,Mathematical geography. Cartography ,GA1-1776 ,Environmental sciences ,GE1-350
Deep learning techniques with remarkable performance have been successfully applied to hyperspectral image (HSI) classification. Due to the limited availability of training data, earlier studies primarily adopted the patch-based classification framework, which divides images into overlapping patches for training and testing. However, this framework results in redundant computations and possible information leakage. This study proposes an objective evaluation-based efficient learning framework for HSI classification. It consists of two main parts: (i) a leakage-free balanced sampling strategy and (ii) an efficient fully convolutional network (EfficientFCN) optimized for the accuracy-efficiency trade-off. The leakage-free balanced sampling strategy first generates balanced and non-overlapping training and test data by partitioning the HSI and its ground truth image into non-overlapping windows. Then, the generated training and test data are used to train and test the proposed EfficientFCN. EfficientFCN exhibits a pixel-to-pixel architecture with modifications for faster inference speed and improved parameter efficiency. Experimental results demonstrate that the proposed sampling strategy can provide objective performance evaluation. EfficientFCN outperforms many state-of-the-art approaches concerning the speed-accuracy trade-off. For instance, compared to the recent efficient models EfficientNetV2 and ConvNeXt, EfficientFCN achieves 0.92% and 3.42% superior accuracy and 0.19s and 0.16s faster inference time, respectively, on the Houston dataset. Code is available at https://github.com/xmzhang2018.
- Published
- 2023
- Full Text
- View/download PDF
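The leakage-free idea above, partitioning the scene into non-overlapping windows and assigning whole windows to either training or testing, can be sketched as follows (the window size and split fraction are illustrative, not values from the paper):

```python
import random

def leakage_free_split(height, width, win, test_frac=0.5, seed=0):
    """Partition an image into non-overlapping win x win windows and
    assign whole windows to train or test, so no pixel can appear in
    both sets even when patches are later extracted within a window."""
    windows = [(r, c) for r in range(0, height - win + 1, win)
                      for c in range(0, width - win + 1, win)]
    rng = random.Random(seed)
    rng.shuffle(windows)
    n_test = int(len(windows) * test_frac)
    return windows[n_test:], windows[:n_test]  # (train, test) window origins

train, test = leakage_free_split(64, 64, win=16)
print(len(train), len(test))  # 16 windows split 8 / 8
```

Because train and test windows never share pixels, overlapping-patch feature extraction inside a window cannot leak test pixels into training, which is the failure mode of the classic patch-based framework criticised in the abstract.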
45. RepCo: Replenish sample views with better consistency for contrastive learning.
- Author
-
Lei, Xinyu, Liu, Longjun, Zhang, Yi, Jia, Puhang, Zhang, Haonan, and Zheng, Nanning
- Subjects
- *
OBJECT recognition (Computer vision) - Abstract
Contrastive learning methods aim to learn shared representations by minimizing distances between positive pairs and maximizing distances between negative pairs in the embedding space. A key problem in achieving good contrastive performance is designing appropriate sample pairs. In most previous works, random cropping of the input image is used to obtain two views as positive pairs. However, such strategies lead to suboptimal performance because the sampled crops may carry inconsistent semantic information, which degrades the quality of the contrastive views. To address this limitation, we explore replenishing sample views with better consistency of the image and propose RepCo, a novel self-supervised learning (SSL) framework. Instead of searching for semantically consistent patches between two different views, we select patches on the same image to replenish the positive/negative pairs: patches that are similar but come from different positions are encouraged as positive pairs, while patches that are dissimilar but come from adjacent positions are forced to have different representations, i.e., they form negative pairs that enrich the learned representations. Our method effectively generates high-quality contrastive views, exploits the untapped semantic consistency of images, and provides more informative representations for downstream tasks. Experiments on multiple downstream tasks show that our approach achieves gains of +2.1 AP50 (COCO pre-trained) and +1.6 AP50 (ImageNet pre-trained) on Pascal VOC object detection, and +2.3 mIoU on Cityscapes semantic segmentation. • Propose a novel SSL method that generates high-quality image views and achieves better contrastive learning performance. • Use similarities of sampled patches as thresholds to decide whether two views should be treated as positive or negative pairs.
• The sampled positive pairs maintain high consistency; negative samples are semantically similar but not identical to the anchor. • Propose an indicator matrix to help capture sampled positive and negative pairs. • Provide strong benefits for downstream tasks and demonstrate competitive performance. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
46. Soil Micro-eukaryotic Diversity Patterns Along Elevation Gradient Are Best Estimated by Increasing the Number of Elevation Steps Rather than Within Elevation Band Replication.
- Author
-
Huang, Shuyin, Lentendu, Guillaume, Fujinuma, Junichi, Shiono, Takayuki, Kubota, Yasuhiro, and Mitchell, Edward A. D.
- Subjects
- *
ALTITUDES , *MICROBIAL diversity , *NUCLEOTIDE sequencing , *SOILS - Abstract
The development of high-throughput sequencing (HTS) of environmental DNA (eDNA) has stimulated the study of soil microbial diversity patterns and drivers at all scales. However, given the heterogeneity of soils, a challenge is to define effective and efficient sampling protocols that allow sound comparison with other records, especially vegetation. In studies of elevational diversity patterns, a trade-off is choosing between replication within elevation bands and sampling more elevation bands. We addressed this question for soil protists along an elevation gradient on Mt. Asahi, Hokkaido, Japan. We compared two sampling approaches: (1) the replicate strategy (five replicates at six elevational bands, total = 30) and (2) the transect strategy (one sample in each of 16 different elevational bands). Despite a nearly twofold lower sampling effort, the transect strategy yielded results congruent with the replicate strategy for estimating the elevational alpha diversity pattern: the regression coefficients between diversity indices and elevation did not differ between the two options. Furthermore, for a given total number of samples, gamma diversity estimated across the entire transect was higher when sampling more elevational bands than when replicating within fewer elevational bands. Beta diversity (community composition turnover) was lower within a given elevational band than between adjacent bands and increased with elevation distance. In redundancy analyses, a soil organic matter-related variable (the first principal component of soil organic matter, water content, total organic carbon, and nitrogen, which were highly correlated) and elevation best explained the elevational beta diversity pattern for both sampling approaches. Taken together, our results suggest that sampling a single plot per elevation band is sufficient to obtain a good estimate of soil micro-eukaryotic diversity patterns along elevation gradients. 
This study demonstrated the effectiveness of the transect strategy in estimating diversity patterns along elevation gradients which is instructive for future environmental or even experimental studies. While not advocating for completely replacing replication-based sampling practices, it is important to note that both replicate and transect strategies have their merits and can be employed based on specific research goals and resource limitations. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
47. Scalable Optimal Formation Path Planning for Multiple Interconnected Robots via Convex Polygon Trees.
- Author
-
Lu, Wenjie, Xiong, Hao, Zhang, Zhengjie, Hu, Zhezhe, and Wang, Tianming
- Abstract
A Reconfigurable Modular Robotic System (RMRS) consists of multiple interconnected robots and can achieve various functionalities, such as transporting loads of various shapes, by rearranging its modular robots. Path planning for an RMRS involves both the system's motion and its formation arrangements. Sampling-based path planning for the RMRS can be inefficient due to the variety of formations. Recently, convex subsets of the obstacle-free workspace, referred to as polygon nodes, have instead been sampled to formulate constrained optimization problems; however, the success rate of sampling is unsatisfactory due to connectivity requirements. This paper proposes an obstacle-aware mixture density network to guide the generation of polygon nodes, where the connectivity of polygon nodes is guaranteed by non-zero Minkowski differences between the formation geometry and the intersection of nodes. Subsequently, Convex-Polygon Trees* (CPTs*) are proposed to connect these polygon nodes in an RRT* manner, outputting candidate convex optimization problems. The optimality degeneration due to distance approximation is proven to be bounded, and the computational complexity is shown to be linear in the Lebesgue measure of the entire workspace. Numerical simulations show that in most tested large and cluttered environments the CPT* is more than 8 times faster than an existing constrained optimization method. The results also show CPT*'s improved scalability to large environments and enhanced efficiency in dealing with narrow passages. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
48. Simulating capture efficiency of pitfall traps based on sampling strategy and the movement of ground‐dwelling arthropods.
- Author
-
Ahmed, Danish A., Beidas, Ayah, Petrovskii, Sergei V., Bailey, Joseph D., Bonsall, Michael B., Hood, Amelia S. C., Byers, John A., Hudgins, Emma J., Russell, James C., Růžičková, Jana, Bodey, Thomas W., Renault, David, Bonnaud, Elsa, Haubrock, Phillip J., Soto, Ismael, and Haase, Peter
- Subjects
PITFALL traps ,ARTHROPODA ,SPATIAL arrangement ,GROUND beetles ,RANDOM walks ,BEETLES - Abstract
Pitfall traps are frequently used to capture ground‐dwelling arthropods, particularly beetles, ants and spiders. The capture efficiency of a pitfall trapping system strongly depends on the number and opening size of traps, how traps are distributed over the sampling area (spatial arrangement) and the movement characteristics of arthropods. We use numerical simulations for a single species to analyse the trap count patterns that emerge from these variables. The movement of individual arthropods is modelled as a correlated random walk, with multiple traps placed over an area, and catches are simulated as interactions of individuals with traps. We consider four different spatial arrangements of traps across a homogeneous landscape: grid (i.e. rectangular array), transect, nested‐cross and randomised. We contextualise our results by considering the locomotion of Pterostichus melanarius, a highly active carabid beetle often serving as a biocontrol agent for the suppression of pest insects and weeds. By simulating the trapping of randomly moving ground‐dwelling arthropods, we show that there is an optimal inter‐trap separation distance (trap spacing) that maximises captures, which can be expressed using exact formulae in terms of trap opening sizes, sampling area and trap number. Moreover, for the grid and nested‐cross arrangements, larger trap spacing to maximise spatial coverage over the whole sampling area is suboptimal. Also, we find that over a large sampling area, there is a hierarchical order of spatial arrangements in relation to capture efficiency: grid, randomised, transect, followed by the nested‐cross. However, over smaller sampling areas, this order changes, as the rate at which trap counts accumulate with trap number varies across arrangements, eventually saturating at different levels. 
In terms of movement effects, capture efficiency is maximised over a narrow diffusive range and does not depend strongly on the type of spatial arrangement—indicating an approximate optimal mode of arthropod activity, i.e. rate of spread.Our approach simultaneously considers several important experimental design aspects of pitfall trapping providing a basis to optimise and adapt sampling protocols to other types of traps to better reflect their various purposes, such as monitoring, conservation or pest management. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
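The simulation approach described above, correlated random walks interacting with an arrangement of circular traps, can be sketched in a few lines. All parameter values here are invented for illustration, and the periodic (wrap-around) arena is a simplification; the paper derives exact formulae for optimal trap spacing that this toy model does not reproduce:

```python
import math
import random

def simulate_captures(n_walkers, n_steps, step, turn_sd, traps, radius, L, seed=0):
    """Correlated random walk on an L x L periodic arena: each step the
    heading turns by a Gaussian angle (std = turn_sd, radians), and a
    walker is captured when it comes within `radius` of a trap centre."""
    rng = random.Random(seed)
    captures = 0
    for _ in range(n_walkers):
        x, y = rng.uniform(0, L), rng.uniform(0, L)
        heading = rng.uniform(0, 2 * math.pi)
        for _ in range(n_steps):
            heading += rng.gauss(0, turn_sd)
            x = (x + step * math.cos(heading)) % L
            y = (y + step * math.sin(heading)) % L
            if any((x - tx) ** 2 + (y - ty) ** 2 <= radius ** 2
                   for tx, ty in traps):
                captures += 1
                break
    return captures

# A 2 x 2 grid arrangement of traps on a 10 x 10 arena (all values invented)
grid = [(2.5, 2.5), (2.5, 7.5), (7.5, 2.5), (7.5, 7.5)]
print(simulate_captures(200, 500, 0.1, 0.5, grid, 0.25, 10))
```

Repeating the run for grid, transect, nested-cross and randomised trap coordinates gives a rough version of the arrangement comparison carried out in the study.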
49. Monitoring cadmium concentrations in cacao: inter-laboratory variation and the effect of sample size on variability among ready-for-sale beans.
- Author
-
Dekeyrel, Jesse, Wantiez, Léna, Chavez, Eduardo, De Ketelaere, Bart, and Smolders, Erik
- Subjects
- *
CACAO , *CACAO beans , *ATOMIC absorption spectroscopy , *CADMIUM , *SAMPLE size (Statistics) , *BEANS - Abstract
Since the implementation of new EU limits on cadmium (Cd) in cacao-derived products, reliable measurements of the Cd concentration in cacao samples have become even more important. This study was set up to analyse the robustness of the measured Cd concentrations in cacao as affected by sampling strategy and by the laboratory receiving these samples. Six different homogenised cacao liquor samples were sent to 25 laboratories, mainly located in Latin America. On average, only 76% of the laboratories reported acceptable results per sample using internationally accepted criteria. More unreliable data was obtained when Atomic Absorption Spectroscopy (AAS) rather than Inductively Coupled Plasma (ICP) instruments were used or where concentrations were outside the calibration range. Subsequently, four commercial lots in Ecuadorian warehouses were sampled to identify the variation among beans, bags and replicate chemical analyses of ground samples. Simulations indicate that a composite sample should be made from at least 10 bags on a pallet and at least 60 beans should be ground prior to analysis to obtain an acceptable CV below 15%. This study shows that current Cd analyses in cacao on the market are neither sufficiently accurate nor precise and that more control on laboratory certifications is needed for reliable screening of Cd in cacao. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
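The composite-sampling recommendation above (at least 10 bags and at least 60 beans for a CV below 15%) can be explored with a two-level Monte Carlo model. The between-bag and between-bean relative standard deviations below are assumed values, not those estimated from the Ecuadorian lots:

```python
import random
import statistics

def composite_cv(n_bags, n_beans, bag_sd=0.10, bean_sd=0.30,
                 mean_cd=0.8, n_sim=2000, seed=1):
    """Monte Carlo CV of a composite cacao sample: Cd varies between
    bags (relative sd = bag_sd) and between beans within a bag
    (relative sd = bean_sd); equal numbers of beans are drawn from
    each bag and ground together, so the composite is their mean."""
    rng = random.Random(seed)
    beans_per_bag = n_beans // n_bags
    composites = []
    for _ in range(n_sim):
        total = 0.0
        for _ in range(n_bags):
            bag_mean = mean_cd * (1 + rng.gauss(0, bag_sd))
            for _ in range(beans_per_bag):
                total += bag_mean * (1 + rng.gauss(0, bean_sd))
        composites.append(total / (n_bags * beans_per_bag))
    return statistics.stdev(composites) / statistics.fmean(composites)

print(round(100 * composite_cv(10, 60), 1))  # composite CV in percent
```

Under these assumed variance components, 10 bags and 60 beans keep the simulated CV well under the 15% target, while a single bag of a few beans typically does not, mirroring the study's conclusion that both levels of variation must be averaged out.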
50. Less is More? Review and Recommendations for Qualitative Sampling Strategy using the S.C.A.D.E Approach.
- Author
-
Ting, Hiram, Turner, Daniel, Kim-Lim Tan, Sook Rei Tan, Munwai Wong, and Jiankun Gong
- Subjects
QUANTITATIVE research ,SAMPLE size (Statistics) ,RESEARCH personnel ,RESEARCH methodology ,QUALITATIVE research - Abstract
The determination of sample size in qualitative research introduces a unique and multifaceted challenge, setting it apart from the more structured methodology of quantitative research. Contrary to sampling methods in quantitative research, which primarily aim to secure random and statistically representative samples that facilitate the generalisation of findings to broader populations, sampling in qualitative research requires a distinct set of considerations in its pursuit of a deeper understanding of specific phenomena. The objective of this editorial is to provide qualitative researchers with clear and foundational guidance for effectively communicating the methodological aspects of their research papers, particularly pertaining to sample size justification. Building on this, we present S.C.A.D.E, an acronym comprising five key actionable elements (Selecting, Clarifying, Aligning, Deploying and Evaluating) to guide researchers in determining the appropriate sample size and ensuring that data saturation is achieved as they plan their qualitative exploration. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF