204 results on '"Kim H. Esbensen"'
Search Results
2. Variographic analysis: A new methodology for quality assurance of pharmaceutical blending processes.
- Author
-
Adriluz Sánchez-Paternina, Nobel O. Sierra-Vega, Vanessa Cárdenas, Rafael Méndez, Kim H. Esbensen, and Rodolfo J. Romañach
- Published
- 2019
- Full Text
- View/download PDF
3. Before reliable near infrared spectroscopic analysis - the critical sampling proviso. Part 2: Particular requirements for near infrared spectroscopy
- Author
-
Kim H Esbensen and Nawaf Abu-Khalaf
- Subjects
representative sampling, replication experiment, theory of sampling, total measurement uncertainty, sampling QA, special NIR issues?, Spectroscopy, analysis QA, sampling uncertainty - Abstract
Non-representative sampling of materials, lots and processes intended for NIR analysis often carries hidden contributions to the full measurement uncertainty, MU_total = TSE + TAE_NIR. The Total Sampling Error (TSE) can dominate the Total Analytical Error (TAE_NIR) by a factor of 5, 10 or even 25, depending on the degree of material heterogeneity and on the specific sampling procedures employed to produce the minuscule aliquot, which is the only material actually analysed. Part 1 presented a summary of all sampling uncertainty elements in the “lot-to-aliquot” pathway, which must be identified and correctly managed (eliminated or maximally reduced), especially the sampling bias, as a prerequisite for fully representative sampling. The key to this is the Theory of Sampling (TOS), which is presented in two parts in a novel compact fashion. Part 2 introduces (i) the application of TOS to process sampling, specifically addressing and illustrating how this manifests itself in the realm of Process Analytical Technology (PAT), and (ii) an empirical safeguard, termed the Replication Experiment (RE), with which to estimate the effective sampling-plus-analysis uncertainty level (MU_total) associated with NIR analysis. The RE is a defence against compromised analytical responsibilities. Ignorance of the demand for TSE minimisation, whether caused by lack of awareness or training or by wilful neglect, is a breach of due diligence concerning analysis QC/QA. Part 2 ends with a special focus on: “What does all this TOS mean specifically for NIR analysis?”. The answer may surprise many: there is nothing special that need worry NIR analysts relative to professionals from all other analytical modalities; all that is needed is embedded in the general TOS framework. Still, this review concludes by answering a set of typical concerns from NIR practitioners.
- Published
- 2022
- Full Text
- View/download PDF
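The Replication Experiment (RE) described in the abstract above reduces to a simple calculation: repeat the full sampling-plus-analysis protocol from scratch a number of times (typically 10) and report the relative standard deviation of the results. A minimal sketch; the assay values are hypothetical, not taken from the paper:

```python
import statistics

def relative_sampling_variability(replicates):
    """RSV (%) = 100 * s / xbar for replicate 'from-scratch'
    sampling-plus-analysis results; this captures MU_total,
    i.e. TSE and TAE combined."""
    xbar = statistics.mean(replicates)
    s = statistics.stdev(replicates)  # sample sd, n-1 denominator
    return 100.0 * s / xbar

# Hypothetical: 10 replicate assays of the same lot, each starting
# from a fresh primary sample.
assays = [4.9, 5.2, 5.0, 4.7, 5.3, 5.1, 4.8, 5.0, 5.2, 4.9]
rsv = relative_sampling_variability(assays)  # ~3.8 %
```

An RSV judged against a fit-for-purpose threshold is what makes the RE an empirical safeguard: no assumption about where the variability originates is needed.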
4. Before reliable near infrared spectroscopic analysis - the critical sampling proviso. Part 1: Generalised theory of sampling
- Author
-
Kim H Esbensen and Nawaf Abu-Khalaf
- Subjects
representative sampling ,analysis ,MU ,Theory of Sampling ,total Measurement Uncertainty ,Spectroscopy ,before analysis ,sampling bias - Abstract
Non-representative sampling of materials, lots and processes intended for near infrared (NIR) analysis often contributes hidden additions to the full measurement uncertainty (MU_total = TSE + TAE_NIR). The Total Sampling Error (TSE) can dominate the Total Analytical Error (TAE_NIR) by a factor of 5, 10 or even 25, depending on material heterogeneity and on the specific sampling procedures employed to produce the minuscule aliquot, which is the only material analysed. This review (Parts 1 and 2), extensively referenced with easily available complementary literature, presents a summary of all sampling uncertainty elements in the “lot-to-aliquot” pathway, which must be identified and correctly managed (eliminated or maximally reduced) in order to achieve, and to be able to document, a fully minimised MU_total. The more irregular and pervasive the heterogeneity, the higher the number of increments needed to reach ‘fit-for-purpose representativity’. A particular focus is necessary regarding the sampling bias, which is fundamentally different from the well-known analytical bias. Whereas the latter can easily be subjected to bias correction, the sampling bias cannot be corrected by any a posteriori means, notably not by chemometrics or statistics. Instead, all sampling operations must be designed to exclude the so-called Incorrect Sampling Errors (ISE), which are the hidden bias-generating agents. The key element in this endeavour is representative sampling and sub-sampling before analysis, as laid out by the Theory of Sampling (TOS), which is presented here in a novel compact fashion along with a complement of selected examples and demonstrations. TOS includes a safeguard facility, termed the Replication Experiment (RE), which enables estimation of the total sampling-plus-analysis uncertainty level (MU_total) associated with NIR analysis (the RE is, for practical and logistical reasons, found in Part 2).
Neglecting the TSE effects from the before-analysis domain is a lack of due diligence. TOS to the fore!
- Published
- 2022
- Full Text
- View/download PDF
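The uncertainty decomposition and the increment argument in the abstract above can be illustrated numerically. A sketch under the usual assumptions that independent error sources add in variance and that compositing n independent increments divides the sampling variance by n; all numbers are illustrative:

```python
def mu_total(tse_sd, tae_sd):
    """Total measurement uncertainty: variances of independent
    error sources add, so MU_total = sqrt(s2(TSE) + s2(TAE))."""
    return (tse_sd ** 2 + tae_sd ** 2) ** 0.5

def composited_tse_sd(tse_sd_single, n_increments):
    """Averaging n independent increments reduces the sampling
    variance by n, i.e. the sd by sqrt(n)."""
    return tse_sd_single / n_increments ** 0.5

# TSE dominating TAE by a factor of 5 (cf. the 5-25x range quoted above):
tae, tse = 0.1, 0.5                      # hypothetical sd values
mu_single = mu_total(tse, tae)           # ~0.51: analytics barely matters
mu_25inc = mu_total(composited_tse_sd(tse, 25), tae)  # ~0.14
```

The point of the sketch: improving the analytical step alone leaves MU_total almost unchanged while compositing increments attacks the dominant term.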
5. Towards a more human (re)design of digital spatial technologies with emphasis on an uncertainty-based cartographic representation.
- Author
-
Kim Lowell, Geoffrey Edwards, and Kim H. Esbensen
- Published
- 1995
- Full Text
- View/download PDF
6. Validation, verification and evaluation (Panel).
- Author
-
Samuel P. Uselton, Geoff Dorn, Charbel Farhat, Michael W. Vannier, Kim H. Esbensen, and Al Globus
- Published
- 1994
- Full Text
- View/download PDF
7. Experimental Evaluation of Surface Water Sampling Variability for Environmental Monitoring in Iron Ore Operations
- Author
-
Karin Engström, Kim H. Esbensen, and Liselotte Olausson
- Subjects
Analyte, Hydrogeology, System of measurement, Flux, Sampling (statistics), Repeatability, Geotechnical Engineering and Engineering Geology, Environmental monitoring, Statistics, Surface water, Water Science and Technology - Abstract
Environmental self-monitoring is a government requirement for Swedish process industries. This includes sampling and analysis of recipient water that might be adversely affected by emissions. The requirements for accredited analytical methods are strict, with well-defined measurement uncertainties, but estimations of the attendant sampling variability are seldom required, presented, or evaluated in environmental surface water sampling. The goal of this study was to perform an initial evaluation of the measurement variability for surface water sampling within the self-monitoring program for a large mining company in northern Sweden. The results indicate that the method for evaluation of sampling and measurement variability itself affects the results obtained. Therefore, the evaluation scope must be clearly defined in advance, so that the most appropriate approach, resulting in a realistic quantification of total variability, can be selected. This study shows that duplicate sampling experiments result in significantly larger sampling variability estimates when accounting for ambiguities in the sampling protocol than similar experiments under repeatability conditions. This is due to large temporal variations in stream flux and analyte concentrations in the evaluated sampling targets. The ambiguities in different sampling protocols must be fully described and considered when designing empirical evaluation experiments to allow valid evaluation of the total sampling and measurement system variability. An automated sampler using volume-proportional sampling to collect increments for composite samples is recommended to reduce unnecessary sampling variability and address significant temporal changes in stream flux and analyte concentrations appropriately.
- Published
- 2019
- Full Text
- View/download PDF
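The duplicate sampling experiments discussed above are commonly summarised with the standard duplicate-pair estimator, s² = Σdᵢ²/(2n), pooling the squared differences of n duplicate pairs. A minimal sketch; the concentration pairs are hypothetical, not the study's data:

```python
def duplicate_sd(pairs):
    """Pooled sd from n duplicate pairs: s^2 = sum(d_i^2) / (2n).
    Under repeatability conditions this captures measurement variability;
    with duplicates taken independently per the (possibly ambiguous)
    protocol it also captures sampling variability."""
    n = len(pairs)
    return (sum((a - b) ** 2 for a, b in pairs) / (2.0 * n)) ** 0.5

# Hypothetical analyte concentrations (mg/L) from duplicate grab samples:
pairs = [(10.0, 12.0), (9.0, 9.0), (11.0, 10.0), (10.0, 11.0)]
s = duplicate_sd(pairs)  # ~0.87 mg/L
```

Comparing this estimate for repeatability duplicates versus protocol duplicates is exactly the contrast the study reports: the latter is inflated by temporal variation and protocol ambiguity.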
8. A Framework for Representative Sampling for NIR Analysis – Theory of Sampling (TOS)
- Author
-
Kim H. Esbensen and Rodolfo J. Romañach
- Subjects
Statistics, Environmental science, Sampling (statistics), Representative sampling
- Published
- 2021
- Full Text
- View/download PDF
9. Partial least squares PLS1 vs. PLS2 - optimal input/output modeling in a compound industrial drying oven
- Author
-
Kim H. Esbensen, Maths Halstensen, and Ulrich Hundhausen
- Subjects
Input/output, Control theory, Partial least squares regression, Mathematics
- Published
- 2021
- Full Text
- View/download PDF
10. Real-time monitoring of wood cladding spray painting properties and nozzle condition using acoustic chemometrics
- Author
-
Onyinye Victoria Agu, Ulrich Hundhausen, Kim H. Esbensen, and Maths Halstensen
- Subjects
Chemometrics, Materials science, Spray painting, Nozzle, Composite material, Cladding (fiber optics)
- Published
- 2021
- Full Text
- View/download PDF
11. Suitability of using a handheld XRF for quality control of quartz in an industrial setting
- Author
-
S. Lemieux, D. Desroches, Kim H. Esbensen, and L.P. Bédard
- Subjects
Detection limit, Accuracy and precision, Materials science, Environmental analysis, Mechanical Engineering, Sample (material), General Chemistry, Geotechnical Engineering and Engineering Geology, Matrix (chemical analysis), Control and Systems Engineering, Sample preparation, Process engineering, Quartz - Abstract
Handheld XRF (HHXRF) is an analytical tool often used for chemical characterization, environmental analysis or mineral exploration. However, few studies deal with its potential use as a quality control instrument within an industrial context, such as mineral processing or transformation. Quartz is an ideal test material; it is a simple matrix allowing for better isolation of different parameters including detection limits, precision and accuracy, instrumental drift, surface representativeness and sample preparation. In this study, we determined that the specific limit of detection was lower than 70 μg/g for TiO2, Fe2O3 and CaO. The HHXRF analyzed pressed pellets of matrix-matched reference materials. Estimates for TiO2, Fe2O3 and CaO were similar to certified values, while estimates for low-concentration light elements (Al2O3 and MgO) were less accurate. For faster and higher throughput, as often required in an industrial context, HHXRF can be used directly on a mineral sample without sample preparation. Five in-situ determinations on a 10-cm-sided block of quartz produced an accuracy acceptable for industrial needs. However, in-situ determinations are limited by the flatness of the analytical surface, and minute accessory phases can induce some erratic results. Our tests suggest, however, that HHXRF is generally suitable for the quality control of quartz in an industrial context.
- Published
- 2018
- Full Text
- View/download PDF
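The detection limits quoted above are conventionally estimated as three standard deviations of repeated blank (or low-level) measurements divided by the calibration sensitivity. A sketch of that convention with hypothetical numbers, not the study's data:

```python
import statistics

def lod_3sigma(blank_signals, sensitivity):
    """Limit of detection: LOD = 3 * sd(blank) / sensitivity,
    where sensitivity is the calibration slope (signal per ug/g)."""
    return 3.0 * statistics.stdev(blank_signals) / sensitivity

blanks = [0.8, 1.1, 0.9, 1.2, 1.0]          # hypothetical blank signals
lod = lod_3sigma(blanks, sensitivity=0.01)  # ~47 ug/g
```

With these illustrative inputs the estimate falls below the 70 μg/g bound reported in the abstract, but the blank spread and slope are assumptions for the sketch only.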
12. Revisiting Pierre Gy's formula (TOS) – A return to size-density classes for applications to contaminated soils, coated particulate aggregates and mixed material systems
- Author
-
Jean-Sébastien Dubé and Kim H. Esbensen
- Subjects
Contaminated soils, Chemistry, Sampling (statistics), Material system, Biochemistry, Sampling variance, Specimen Handling, Analytical Chemistry, Soil, Statistics, Environmental Chemistry, Environmental Pollution, Spectroscopy - Abstract
For some real-world material systems, estimations of the incompressible sampling variance based on Gy’s classical s²(FSE) formula from the Theory of Sampling (TOS) show a significant discrepancy with empirical estimates of sampling variance. In instances concerning contaminated soils, coated particulate aggregates and mixed material systems, theoretical estimates of sampling variance are larger than empirical estimates, a situation which has no physical meaning in TOS. This has led us to revisit the development of estimates of s²(FSE) from this famous constitutional heterogeneity equation and to explore the use of size-density classes for mixed material systems (mixtures of both analyte-enriched and coated particles), an approach which has been mostly unused since Gy’s original derivation. This approach makes it possible to avoid the granulometric and liberation factors from Gy’s classical treatment, and presents grounds for criticising the use of ‘standard’ input values of critical parameters such as f = 0.5 and g = 0.25. But, as always, the liberation factor (l) still plays an important role, which is paid due attention. The constitutional heterogeneity formula based on size-density classes is presented in a form that allows for easy implementation in practice, within specified limitations. We present extensive experimental results from real-world systems. Using the “SDCD model” with published data reproduced the relative sampling variances calculated for the standard “mineral-like matrices”, but, more importantly, corrected the relative sampling variances calculated for real contaminants by several orders of magnitude. In all cases, the recalculated relative sampling variances decreased to below their corresponding experimental measurements, now fully as expected from TOS, substantiating our new development.
- Published
- 2022
- Full Text
- View/download PDF
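Gy's classical constitutional heterogeneity formula that the paper revisits is s²(FSE) = c·f·g·l·d³·(1/Ms − 1/ML). A sketch using the 'standard' input values f = 0.5 and g = 0.25 criticised above; all numbers are illustrative, not the paper's data:

```python
def fse_variance(c, f, g, l, d_cm, m_sample_g, m_lot_g):
    """Relative variance of the Fundamental Sampling Error (Gy).
    c: mineralogical composition factor (g/cm3)
    f: shape factor (~0.5), g: granulometric factor (~0.25)
    l: liberation factor (0..1), d_cm: top particle size (cm)
    m_sample_g / m_lot_g: sample and lot masses (g)."""
    return c * f * g * l * d_cm ** 3 * (1.0 / m_sample_g - 1.0 / m_lot_g)

# Illustrative: c = 1000 g/cm3, d = 1 mm, 50 g sample from a 1 t lot
var = fse_variance(c=1000, f=0.5, g=0.25, l=1.0, d_cm=0.1,
                   m_sample_g=50, m_lot_g=1e6)
rsd_pct = 100 * var ** 0.5  # ~5 % relative standard deviation
```

The paper's size-density-class reformulation replaces the f, g and l terms with class-wise sums; the sketch above is the classical form it is measured against.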
13. Economic arguments for representative sampling
- Author
-
Oscar Dominguez, Zhu Mingwei, Dominique Francois-Bongarcon, Melissa Gouws, Elke Thisted, Martin Lischka, Eduardo Jara, Li Huachang, Abel Arkenbout, Pentti Minkkinen, Claudia Paoletti, Geoff Lyman, D. Aldwin Vogel, Simon C. Dominy, Quentin Dehaine, R.C.A. Minnitt, Trevor Bruce, Kim H. Esbensen, Ralph J. Holmes, Christopher Robben, Stephane Brochot, Philippe Davin, Pablo Carrasco, Pedro Carrasco, and Rodolfo J. Romañach
- Subjects
Geography, Statistics, Representative sampling
- Published
- 2021
- Full Text
- View/download PDF
14. Optimal grade control sampling practice in open-pit mining – a full-scale blast hole versus reverse circulation variographic experiment
- Author
-
Kim H. Esbensen and Karin Engström
- Subjects
Full scale, Open-pit mining, Sampling (statistics), Geotechnical Engineering and Engineering Geology, Mining engineering, Iron ore, Geochemistry and Petrology, Earth and Planetary Sciences (miscellaneous), Environmental science, Representative sampling - Abstract
Misclassification of ore grades results in lost revenue, and representative sampling procedures in open pit mining are increasingly important in all mining industries. This study evaluated possible improvements in sampling representativity with the use of Reverse Circulation (RC) drill sampling compared to manual Blast Hole (BH) sampling in the Leveäniemi open pit mine, northern Sweden. The variographic experiment results showed that sampling variability was lower for RC than for BH sampling. However, the total costs for RC drill sampling significantly exceed current costs for manual BH sampling, which would need to be compensated for by other benefits to motivate the introduction of RC drilling. The main conclusion is that manual BH sampling can be fit-for-purpose in the studied open pit mine. However, with so many mineral commodities and mining methods in use globally, there is no universal best practice for open pit drill sampling and each case must be evaluated individually.
- Published
- 2017
- Full Text
- View/download PDF
15. Introduction to the Theory and Practice of Sampling
- Author
-
Kim H. Esbensen
- Subjects
Statistics, Sampling (statistics), Mathematics - Abstract
Author Summary: The first of three new textbooks in the field of Theory and Practice of Sampling.
- Published
- 2020
- Full Text
- View/download PDF
16. WHAT are sampling errors—and WHAT can we do about them? Part 2: Sampling and weighing—different, but the same…
- Author
-
Kim H. Esbensen and Aldwin Vogel
- Subjects
Statistics, Sampling (statistics), Sampling error, Mathematics
- Published
- 2021
- Full Text
- View/download PDF
17. WHAT are sampling errors—and WHAT can we do about them? Part 1
- Author
-
Kim H. Esbensen, Rodolfo J. Romañach, and Aidalu Joubert Castro
- Subjects
Computer science, Statistics, Sampling error
- Published
- 2021
- Full Text
- View/download PDF
18. Multivariate and Hyperspectral Image Analysis
- Author
-
Paul Geladi, Kim H. Esbensen, and Hans Grahn
- Subjects
Multivariate statistics, Geography, Pixel, Medical imaging, Hyperspectral imaging, Computer vision, Latent variable, Artificial intelligence, Visualization - Abstract
Multivariate image analysis (MIA) and hyperspectral image analysis (HIA) are methodologies for analyzing multivariate images, where the image coordinates are position (two or three dimensions) and variable number. Multivariate/hyperspectral images can have typical sizes 1024 × 1024, 512 × 512, 256 × 256, etc. and have between two and many hundreds of variables. The variables can be wavelength, electron energy, particle mass, and many others. Classical image analysis concentrates mainly on spatial relationships between pixels in a gray level image. MIA/HIA concentrates on the correlation of structure between the variables to provide extra information useful for exploring images and classifying regions in them. The many variables can be transformed into a few latent variable images containing condensed information. The sheer size of the data arrays necessitates visualization of raw data, intermediate data, model parameters, and residuals. When the images consist of continuous spectra having more than 100 variables, the name hyperspectral is used and the emphasis of the analysis is often on the spectral interpretation. All physical techniques for measuring materials can be made into imaging techniques, describing not only a property but also its position in a plane or volume. All imaging techniques can be expanded to become multivariate/hyperspectral. Multivariate imaging is used in many fields of research, but for practical reasons a subdivision in the classes remote sensing, medical imaging, and microscopy (including macroscopy) can be made. In microscopy, MIA/HIA can be used to study materials and biological processes by optical, electron, and charged particle techniques.
- Published
- 2016
- Full Text
- View/download PDF
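The transformation of many spectral variables into a few latent-variable images described above is, at its core, a PCA of the unfolded image cube. A minimal NumPy sketch on a synthetic cube; sizes and names are illustrative:

```python
import numpy as np

def pca_score_images(cube, n_components=3):
    """Unfold (rows, cols, bands) -> (pixels, bands), mean-center,
    run SVD, and refold the leading scores into score images."""
    r, c, b = cube.shape
    X = cube.reshape(r * c, b)
    X = X - X.mean(axis=0)                       # column (band) centering
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    return scores.reshape(r, c, n_components)    # latent-variable images

rng = np.random.default_rng(0)
cube = rng.random((16, 16, 40))   # toy 16x16 image with 40 "wavelengths"
imgs = pca_score_images(cube, 2)  # two condensed score images
```

Refolding the scores back to image geometry is what lets MIA/HIA combine spectral (loading) and spatial (score image) interpretation.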
19. Empirical Approach for Estimating Reference Material Heterogeneity and Sample Minimum Test Portion Mass for 'Nuggety' Precious Metals (Au, Pd, Ir, Pt, Ru)
- Author
-
Kim H. Esbensen, L. Paul Bédard, and Sarah-Jane Barnes
- Subjects
Basalt, Mineralization (geology), Pellets, Mineralogy, Precious metal, Metal, Geology, Petrogenesis - Abstract
Quantification of precious metal content is important for studies of ore deposits, basalt petrogenesis, and precious metal geology, mineralization, mining, and processing. However, accurate determination of metal concentrations can be compromised by microheterogeneity, commonly referred to as the "nugget effect", i.e., spatially significant variations in the distribution of precious metal minerals at the scale of instrumental analytical beam footprints. Few studies have focused on the spatial distribution of such minerals and its detrimental effects on quantification for the existing suite of relevant reference materials (RM). In order to assess the nugget effect in RM, pressed powder pellets of MASS-1, MASS-3, WMS-1a, WMS-1, and KPT-1 (dominantly sulfides) as well as CHR-Pt+ and CHR-Bkg (chromite-bearing) were mapped with micro-XRF. The number of verified nuggets observed was used to recalculate an effective concentration of precious metals for the analytical aliquot, allowing an empirical estimate of a minimum test portion mass. MASS-1, MASS-3, and WMS-1a did not contain any nuggets; therefore, a conveniently small test portion could be used (0.1 g), while CHR-Pt+ would require 0.125 g and WMS-1 would need 23 g to be representative. For CHR-Bkg and KPT-1, the minimum test portion mass would have to be ∼80 and ∼342 g, respectively. Minimum test portion masses may have to be greater still in order to provide detectable analytical signals. Procedures for counteracting the detrimental manifestations of microheterogeneity are presented. It is imperative that both RM and pristine samples are treated in exactly the same way in the laboratory, since powders with an unknown nugget status (in effect, all field samples submitted for analysis) cannot be documented to represent a safe minimum mass basis.
- Published
- 2016
- Full Text
- View/download PDF
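The minimum test portion masses above follow rare-grain ('nugget') logic: if nuggets occur at a known average rate per gram, the relative counting uncertainty of N expected nuggets is roughly 1/√N, so a target precision fixes a minimum mass. A sketch of that reasoning with a hypothetical nugget rate, not the paper's micro-XRF counts:

```python
import math

def poisson_rsd(nuggets_per_gram, mass_g):
    """Relative sd of the nugget count in a test portion,
    assuming nuggets follow Poisson statistics: 1/sqrt(N)."""
    return 1.0 / math.sqrt(nuggets_per_gram * mass_g)

def min_mass_for_rsd(nuggets_per_gram, target_rsd=0.10):
    """Smallest mass containing >= 1/target_rsd^2 expected nuggets."""
    return (1.0 / target_rsd ** 2) / nuggets_per_gram

# Hypothetical: 0.12 nuggets per gram of powder, 10 % target precision
mass = min_mass_for_rsd(nuggets_per_gram=0.12)  # ~833 g
```

The same arithmetic explains why a nugget-free RM gets by with 0.1 g while a sparsely nuggety one needs hundreds of grams.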
20. TOS reflections: is there a third way? (to promote the Theory of Sampling)
- Author
-
Kim H. Esbensen
- Subjects
Sampling system, Computer science, Sampling (statistics), Audit, Marketing - Abstract
A standing discussion topic within the sampling community is: “What is the best way to promote the TOS—not only as a theory, but also as a tool to help customers?” The latter objective casts the question into a rather more direct format: “How to sell TOS-compliant equipment, sampling system solutions, consulting and audit services to customers with little or no familiarity with the need for proper sampling?” These reflections address the two most dominant answers: i) the economic argument, “You’ll lose a lot of money if you don’t…”; or ii) the technical argument, “You need to understand these critical aspects of the TOS, or else…”. However, this is usually a futile debate; obviously one should be able to wield flexible tactics, matching a specific marketing or application need with one, or both, of these approaches. But a recent event has tickled the imagination—is there possibly also a third way?
- Published
- 2020
- Full Text
- View/download PDF
21. Origin of the World Conferences on Sampling and Blending (WCSB)
- Author
-
Kim H. Esbensen, Dominique Francois-Bongarcon, and John Vann
- Subjects
History, Sampling (statistics), Genealogy - Abstract
We answer the question “Exactly how did the World Conferences on Sampling and Blending originate?” With WCSB10 approaching, “three fellows” decided to do something about this. It turned out to be quite a detective story spanning 20+ years, three continents, many obsolete PC platforms and a search through several thousand old e-mails. The story, as told here by Messieurs Francois-Bongarcon, Vann and Esbensen, is a tour-de-force of the pre- and very early history of the WCSB institution.
- Published
- 2020
- Full Text
- View/download PDF
22. Theory of Sampling—an approach to representativity offering front line companies added value and potential substantial savings
- Author
-
Kim H. Esbensen, Jørgen Houe Pedersen, and Fritz Rendeman
- Subjects
Statistics, Added value, Sampling (statistics), Front line, Mathematics
- Published
- 2020
- Full Text
- View/download PDF
23. Sampling of particulate materials with significant spatial heterogeneity - Theoretical modification of grouping and segregation factors involved with correct sampling errors: Fundamental Sampling Error and Grouping and Segregation Error
- Author
-
Kim H. Esbensen and Pentti Minkkinen
- Subjects
Chemistry, Segregation, Sampling theory, Sampling error, Sampling uncertainty, Sampling errors, Biochemistry, Spatial heterogeneity, Fragment size, Particulate material, Statistics, Environmental Chemistry, Heterogeneity, Grouping Factor, Spectroscopy - Abstract
There has been extensive abuse of Gy's formula during the entire history of applied TOS (Theory of Sampling): it has been applied too liberally to almost any aggregate material conceivable, covering material classes of extremely different composition with significant (to large, or extreme) fragment-size distribution heterogeneity, for example many types of municipal and industrial waste. This abuse is for the most part characterised by a lack of fundamental TOS competence and of awareness of the historical context of Gy's formula. The present paper addresses theoretical details of TOS which become important as sampling rates increase towards the conclusion of the full ‘lot-to-analysis’ sampling pathway, concerning the finer details behind TOS’ central equations linking sampling conditions to material heterogeneity characteristics, which allow estimation of Total Sampling Error (TSE) manifestations. We derive a new, complementary understanding of the two conceptual factors, the grouping factor (y) and the segregation factor (z), intended to represent the local (increment-scale) and long-range (increment-to-lot-scale) heterogeneity aspects of lot materials, respectively. We contrast the standard TOS exposé with the new formulation. While the phenomenological meaning and content of the proposed factors remain the same, their numerical values and bracketing limits are different, with z now representing more realistic combined effects of liberation and segregation. The new formulation makes it easier to get a first comprehensive grasp of TOS’ treatment of sampling of significantly heterogeneous materials, and may present a slightly easier path into the core issues of TOS as sampling and sub-sampling approach the final aliquot scale.
- Published
- 2019
- Full Text
- View/download PDF
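In the standard TOS exposé that the paper contrasts its new formulation against, the Grouping and Segregation Error adds to the Fundamental Sampling Error, with s²(GSE) commonly approximated as the product of a grouping factor y and a segregation factor z times s²(FSE). A sketch of that classical relation only; the paper's modified y and z are not reproduced here:

```python
def correct_sampling_error_variance(fse_var, y, z):
    """Classical TOS relation: s2(CSE) = s2(FSE) + s2(GSE),
    with s2(GSE) ~ y * z * s2(FSE)  (grouping y, segregation z)."""
    return fse_var * (1.0 + y * z)

# Illustrative: strong grouping, moderate segregation
cse = correct_sampling_error_variance(fse_var=0.01, y=1.0, z=0.5)  # 0.015
```

The product y·z is why increment compositing and mixing pay off twice: they shrink the grouping term directly and reduce the impact of segregation.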
24. A palynofacies study of past fluvio-deltaic and shelf environments, the Oligocene-Miocene succession, North Sea Basin: A reference data set for similar Cenozoic systems
- Author
-
Kasia K. Śliwińska, Anders Mathiesen, Erik S. Rasmussen, Karen Dybkjær, and Kim H. Esbensen
- Subjects
Lithology, Stratigraphy, Botryococcus, Oceanography, Palynofacies, Sedimentary depositional environment, Depositional environments, Paleontology, Fluvio-deltaic, North Sea Basin, Vitrinite, Geology, Miocene, Sedimentary basin, Geophysics, Economic Geology, Sedimentary rock, Cenozoic - Abstract
Correct interpretation of depositional environments is fundamental for evaluating the geological history of a sedimentary basin. Palynofacies analyses are a valuable supplement to sedimentological and seismic studies. In order to develop a palynofacies reference dataset for fluvio-deltaic and shelfal successions, a study of the assemblages of sedimentary organic particles from seven well-defined depositional environments within the uppermost Oligocene – lower Miocene succession onshore Denmark (eastern North Sea Basin) has been performed. The study deals with the following environments: floodplain, lagoon, washover-fan flat, prodelta, shoreface, offshore transition and shelf. The sedimentary organic particles were grouped into four major categories: 1) structured wood particles, 2) amorphous organic matter (AOM, in the present study mainly consisting of partly degraded vitrinite), 3) cuticle and membranes and 4) palynomorphs. The palynomorphs were grouped into eight subcategories: 1) microspores, 2) non-saccate pollen, 3) bisaccate pollen, 4) Botryococcus, 5) other freshwater algae, 6) fungal hyphae and spores, 7) acritarchs and 8) organic-walled dinoflagellate cysts. A combination of univariate box plots and multivariate Principal Component Analysis (PCA) of the palynofacies data clearly revealed the quantitative characteristics and variations within each discrete environment as well as their principal similarities and differences. In spite of some natural overlap, for example between the lagoon and offshore transition environments, the data revealed distinct characteristics, e.g. a strong dominance of wood particles in the shoreface environment, a strong dominance of bisaccate pollen in the washover-fan flat environment and a near absence of dinocysts in the floodplain environment. An overall increase in the relative abundance of dinocysts and a decrease in the abundance of non-saccate pollen along the proximal-distal trend were also outlined.
This study outlines a palynofacies reference dataset that can be used as a tool for interpreting depositional environments in equivalent settings, preferentially combined with other information such as seismic data, well logs, and/or lithology.
- Published
- 2019
- Full Text
- View/download PDF
25. Variographic Assessment of Total Process Measurement System Performance for a Complete Ore-to-Shipping Value Chain
- Author
-
Karin Engström and Kim H. Esbensen
- Subjects
Prioritization, Process (engineering), Computer science, System of measurement, Theory of Sampling (TOS), Sampling (statistics), iron ore, Geology, Geotechnical Engineering and Engineering Geology, process sampling, Empirical research, Process measurement, Statistics, measurement systems, quality control, Variogram, variographic characterisation - Abstract
Variographic characterisation has been shown to be a powerful tool for assessing the performance of process measurement systems using existing process data. Variogram interpretation enables decomposition of the variabilities stemming from the process and from the measurement system, respectively, making it possible to determine whether measurements describe the true process variability with sufficient resolution. This study evaluated 14 critical sampling locations, covering a total of 34 separate measurement systems, along the full processing value chain at Luossavaara-Kiirunavaara AB (LKAB), Sweden. A majority of the variograms show low sill levels, indicating that many sub-processes are well controlled. Many also show a low nugget effect, indicating satisfactory measurement systems. However, some notable exceptions were observed, pointing to systems in need of improvement. Even if some of these were previously recognised internally at LKAB, variographic characterisation provides objective, numerical evidence of measurement system performance. The study also showed some unexpected results, for example that slurry shark-fin and spear sampling show acceptable variogram characteristics for the present materials, despite the associated incorrect sampling errors. On the other hand, the results support previous conclusions indicating that manual sampling and cross-belt hammer samplers lead to unacceptably large sampling errors and should be abandoned. Such specific findings underline the strength of comprehensive empirical studies. Based on the present compilation of results, it is possible to conduct a rational enquiry of all evaluated measurement systems, enabling objective prioritisation of where improvement efforts will have the largest cost–benefit effect.
- Published
- 2018
26. Adequacy and verifiability of pharmaceutical mixtures and dose units by variographic analysis (Theory of Sampling) – A call for a regulatory paradigm shift
- Author
-
Rodolfo J. Romañach, Adriluz Sanchez, Kim H. Esbensen, and Andrés D. Román-Ospino
- Subjects
Quality Control ,Process (engineering) ,Chemistry, Pharmaceutical ,Drug Compounding ,media_common.quotation_subject ,Control (management) ,Pharmaceutical Science ,Asset (computer security) ,Residual ,030226 pharmacology & pharmacy ,01 natural sciences ,03 medical and health sciences ,0302 clinical medicine ,Statistics ,Humans ,Technology, Pharmaceutical ,Medicine ,Quality (business) ,Regulatory science ,media_common ,business.industry ,010401 analytical chemistry ,Sampling (statistics) ,0104 chemical sciences ,Reliability engineering ,Pharmaceutical Preparations ,Paradigm shift ,business ,Tablets - Abstract
In spite of intense efforts in the last 20 years, the current state of affairs regarding evaluation of adequacy of pharmaceutical mixing is at an impressive standstill, characterized by two draft guidances, one withdrawn, and the other never approved. We here analyze the regulatory, scientific and technological situation and suggest a radical, but logical approach calling for a paradigm shift regarding sampling of pharmaceutical blends. In synergy with QbD/PAT efforts, blend uniformity testing should only be performed with properly designed sampling that can guarantee representativity, in contrast to the current deficient thief sampling. This is necessary for suitable in-process specifications and dosage units meeting desired specifications. The present exposé shows how process sampling based on the Theory of Sampling (TOS) constitutes a new asset for regulatory compliance, providing procedures that suppress hitherto adverse sampling errors. We identify that the optimal sampling location is after emptying the blender, guaranteeing complete characterisation of the residual heterogeneity. TOS includes variographic analysis that decomposes the effective total sampling and analysis error (TSE+TAE) from the variability of the manufacturing process itself. This approach provides reliable in-process characterization allowing independent approval or rejection by the Quality Control unit. The science-based sampling principles presented here will facilitate full control of blending processes, including whether post-blending segregation influences the material stream that reaches the tabletting feed-frame.
- Published
- 2016
- Full Text
- View/download PDF
27. Towards a Unified Sampling Terminology: Clarifying Misperceptions
- Author
-
Nancy Thiex, Kim H. Esbensen, and Claudia Paoletti
- Subjects
Pharmacology ,Protocol (science) ,Food Safety ,Standardization ,Computer science ,Sampling (statistics) ,Food Contamination ,Context (language use) ,Harmonization ,Constructive ,Analytical Chemistry ,Terminology ,Risk analysis (engineering) ,Research Design ,Terminology as Topic ,Food Quality ,Environmental Chemistry ,Agronomy and Crop Science ,Food Analysis ,Food Science ,Meaning (linguistics) - Abstract
International acceptance of data is a much-desired wish in many sectors to ensure equal standards for valid information and data exchange, facilitate trade, support food safety regulation, and promote reliable communication among all parties involved. However, this cannot be accomplished without a harmonized approach to sampling and a joint approach to assess the practical sampling protocols used. Harmonization based on a nonrepresentative protocol, or on a restricted terminology tradition forced upon other sectors, would negate any constructive outcome. An international discussion on a harmonized approach to sampling is severely hampered by a plethora of divergent sampling definitions and terms. Different meanings for the same term are frequently used by the different sectors, and even within one specific sector. In other cases, different terms are used for the same concept. Before efforts to harmonize can be attempted, it is essential that all stakeholders can at least communicate effectively in this context. Therefore, a clear understanding of the main vocabularies becomes an essential prerequisite. As a first step, commonalities and dichotomies in terminology are here brought to attention by providing a comparative summary of the terminology as defined by the Theory of Sampling (TOS) and those in current use by the International Organization for Standardization, the World Health Organization, the Food and Agriculture Organization Codex Alimentarius, and the U.S. Food and Drug Administration. Terms having contradictory meaning to the TOS are emphasized. To the degree possible, we present a successful resolution of some of the most important issues outlined, sufficient to support the objectives of the present Special Section.
- Published
- 2015
- Full Text
- View/download PDF
28. Theory of Sampling: Four Critical Success Factors Before Analysis
- Author
-
Kim H. Esbensen and Claas Wagner
- Subjects
Pharmacology ,Correctness ,Optimal sampling ,Guiding Principles ,Computer science ,Process (engineering) ,Reproducibility of Results ,Sampling (statistics) ,Food Contamination ,Animal Feed ,Analytical Chemistry ,Reliability engineering ,Task (project management) ,Research Design ,Critical success factor ,Environmental Chemistry ,Risk assessment ,Agronomy and Crop Science ,Food Analysis ,Selection Bias ,Food Science - Abstract
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
- Published
- 2015
- Full Text
- View/download PDF
29. Evaluation of sampling systems in iron ore concentrating and pelletizing processes - Quantification of Total Sampling Error (TSE) vs. process variation
- Author
-
Karin Engström and Kim H. Esbensen
- Subjects
Engineering ,Process (engineering) ,Iron ore ,02 engineering and technology ,Abstract process ,engineering.material ,Variographic analysis ,020501 mining & metallurgy ,Theory of sampling ,020401 chemical engineering ,Statistics ,Total sampling error ,0204 chemical engineering ,Time series ,Concentrating ,business.industry ,Mechanical Engineering ,Representativity ,Sampling (statistics) ,Pelletizing ,General Chemistry ,Work in process ,Geotechnical Engineering and Engineering Geology ,Sampling variability ,Process variation ,0205 materials engineering ,Control and Systems Engineering ,Process sampling ,business - Abstract
Process sampling is involved in grade control in all parts of the production value chain in mineral processing. Reliable sampling and assaying are essential to ensure final product quality, but the need for representative sampling is not always taken into account. Through continuous control of the variability of sampling systems, a comprehensive understanding of the relationship between sampling and process variability can lower the risk of overcorrecting process parameters in response to sampling variability rather than genuine process changes. Variographic characterization of process sampling makes it possible to assess all combined sampling and analytical errors from only 40–60 routine analytical values. The objective of this study is to evaluate total sampling variability in relation to process variability in the concentrating and pelletizing process sampling at LKAB. The results from the variographic analyses form a basis for suggestions of possible improvements. The results show that variographic analysis is a powerful tool for evaluating both process variations and the variability of the sampling systems employed. The extensive access to time-series data allows variographic characterization (quality control) of all critical measurement systems and locations. At the same time, periodicity and small changes in process variation can be detected and counteracted early, minimizing the risk of producing products out of specification.
- Published
- 2018
- Full Text
- View/download PDF
30. List of Contributors
- Author
-
Christian Airiau, Chris Antoniou, Massimiliano Barolo, Fabrizio Bezzo, Robert W. Bondi, Johan Bøtker, Richard D. Braatz, Dongsheng Bu, Chunsheng Cai, Graham Cook, Claudia C. Corredor, Kim H. Esbensen, Pierantonio Facco, Pedro M. Felizardo, Ana Patricia Ferreira, Geir Rune Flåten, John Gamble, Joerg Gampfer, Paul Geladi, Francisca F. Gouveia, Hans Grahn, Mark J. Henson, Benoît Igne, Dominique S. Kummli, Lorenz Liesum, João A. Lopes, David Lovett, John Mack, Nuno Matos, Neil McDowall, Gary McGeorge, Natascia Meneghetti, José C. Menezes, Ewan Mercer, Charles E. Miller, Chris Morris, Venkatesh Natarajan, Sarah Nicholson, Julia O’Neill, Antonio Peinado, Alan R. Potts, Jukka Rantanen, Clare Frances Rawlinson-Malone, Rodolfo J. Romañach, Andrés D. Román-Ospino, Mafalda C. Sarraguça, Kristen A. Severson, Zhenqi Shi, Christopher M. Sinko, Brad Swarbrick, Furqan Tahir, Jörg Thömmes, Mike Tobyn, and Jeremy G. VanAntwerp
- Published
- 2018
- Full Text
- View/download PDF
31. Theory of Sampling (TOS)
- Author
-
Kim H. Esbensen, Rodolfo J. Romañach, and Andrés D. Román-Ospino
- Subjects
Multivariate statistics ,Multivariate analysis ,Observational error ,Computer science ,Sampling error ,02 engineering and technology ,Abstract process ,021001 nanoscience & nanotechnology ,030226 pharmacology & pharmacy ,Signal acquisition ,Reliability engineering ,03 medical and health sciences ,0302 clinical medicine ,A priori and a posteriori ,Pharmaceutical manufacturing ,0210 nano-technology - Abstract
Process monitoring and control in technology and industry are incomplete without a full understanding of all sources of variation, and pharmaceutical manufacturing is no exception. The power of multivariate data modeling and error treatment is not optimal when sampling errors are not adequately identified, quantified, and reduced below a relevant a priori acceptance threshold. Process data are affected both by analytical measurement errors and by sampling and/or PAT sensor acquisition errors. The latter, partially unrecognized or unknown, categories typically dominate over analytical errors by factors of 10–20+ if proper sampling competence is not brought to bear on the design, implementation, maintenance, and operation of the total process measurement system. It is not sufficiently well known that PAT signal acquisition gives rise to the same error types as physical sample extraction; the latter is well understood and solutions abound in the Theory of Sampling (TOS). This chapter brings forth the critical analogy between PAT sensor application and conventional physical sample extraction in pharma and shows how variographic process characterization forms a necessary and sufficient on-line approach for total error management, which is critical before chemometric calibration and prediction. Without proper sampling error treatment (identification, reduction, or elimination), multivariate data modeling in pharma will incorporate unnecessary, significantly inflated data uncertainties that will compromise the ultimate monitoring and prediction objectives. This chapter presents a brief outline of the necessary elements of TOS to identify typical sampling issues, errors, and effects in need of proper management before multivariate data analysis.
- Published
- 2018
- Full Text
- View/download PDF
32. Development and harmonisation of reliable sampling approaches for generation of data supporting GM plants risk assessment
- Author
-
Claas Wagner and Kim H. Esbensen
- Subjects
Engineering ,Risk analysis (engineering) ,business.industry ,Sampling (statistics) ,business ,Risk assessment ,Reliability engineering - Published
- 2017
- Full Text
- View/download PDF
33. Representative sampling in biomass studies—not so fast!
- Author
-
Kim H. Esbensen and Peter Thy
- Subjects
Waste management ,Mass reduction ,Chemistry ,Statistics ,Sampling (statistics) ,Biomass ,Energy community ,Biomass fuels ,Representative sampling - Abstract
Author Summary: Proper execution of representative sampling and laboratory mass-reduction procedures is critical for the validity and reliability of chemical analyses of highly heterogeneous biomass fuels. In the study reported by Thy et al., it was demonstrated that faulty sampling had resulted in apparent ash compositions that differed from the true compositions by factors of two to three for many major oxides. Analytical results based on non-representative samples may thus not be representative of the specific fuel and processes being studied. Despite the general acceptance that accurate and representative compositions are a critical prerequisite for understanding reactions and elemental fractionation, the biomass energy community appears largely to have ignored the critical issues surrounding representative primary sampling. This may have resulted in misleading or faulty conclusions and may have restricted reliable predictive modelling. First published in TOS Forum Issue 1, 7 (2013)
- Published
- 2020
- Full Text
- View/download PDF
34. Theory of sampling (TOS) versus measurement uncertainty (MU) – A call for integration
- Author
-
Claas Wagner and Kim H. Esbensen
- Subjects
Computer science ,Process (engineering) ,Measurement uncertainty ,Correct sampling error ,Sampling (statistics) ,Sample (statistics) ,Sampling error ,Sampling bias ,Analytical Chemistry ,Terminology ,GUM ,Inconstant sampling bias ,Theory of sampling ,EURACHEM/CITAC ,Statistics ,Total sampling error ,Set (psychology) ,Incorrect sampling error ,Spectroscopy - Abstract
We assess current approaches to measurement uncertainty (MU) with respect to the complete ensemble of sources affecting the measurement process, in particular the extent to which sampling errors as set out in the Theory of Sampling (TOS) are appropriately considered in the GUM and EURACHEM/CITAC guides. All pre-analysis sampling steps play an important, often dominant role in the total uncertainty budget, thereby critically affecting the validity of MU estimates, but most of these contributions are not included in the current MU framework. The TOS constitutes the only complete theoretical platform for dealing appropriately with the entire pathway from field sample to test portion. We here propose a way to reconcile the often strongly felt differences between MU and TOS. There is no need to debate terminology, as both TOS and MU can be left with their current usages.
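The quantitative core of the integration argued for here is simple: under the GUM, independent uncertainty components combine in quadrature, so an unaccounted-for sampling contribution a few times larger than the analytical one dominates the budget almost entirely. A hedged sketch with purely illustrative numbers (not values from the paper):

```python
import math

def combined_uncertainty(u_sampling, u_analysis):
    """GUM-style combination of independent uncertainty components (in quadrature)."""
    return math.hypot(u_sampling, u_analysis)

# Illustrative numbers only: if sampling uncertainty is 5x the analytical
# uncertainty, the analytical contribution to the total variance budget
# is almost negligible.
u_tae = 1.0                                     # relative analytical uncertainty, %
u_tse = 5.0                                     # relative sampling uncertainty, %
u_total = combined_uncertainty(u_tse, u_tae)    # sqrt(26) ~ 5.1 %
share_analysis = u_tae**2 / u_total**2          # ~ 3.8 % of the variance budget
```

This is why the paper argues that MU estimates omitting the TOS sampling error hierarchy systematically understate total uncertainty.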
- Published
- 2014
- Full Text
- View/download PDF
35. Image analytical sandstone plug poro-perm prediction using angle measure technique (AMT) and chemometrics – A feasibility study
- Author
-
Knut Kvaal, Kim H. Esbensen, and Maths Halstensen
- Subjects
0303 health sciences ,Drill ,Process Chemistry and Technology ,010401 analytical chemistry ,Mineralogy ,01 natural sciences ,0104 chemical sciences ,Computer Science Applications ,Analytical Chemistry ,law.invention ,Chemometrics ,03 medical and health sciences ,Permeability (earth sciences) ,law ,Test set ,Outlier ,Spark plug ,Porosity ,Core plug ,Spectroscopy ,Software ,Geology ,030304 developmental biology - Abstract
This feasibility study evaluates an approach for predicting sandstone plug porosity and permeability based on low-angle illumination imaging, the Angle Measure Technique (AMT) and chemometric multivariate calibration/validation. The AMT approach transforms 2-D texture images of drill-core plug ends into 1-D ‘complexity spectra’, from which inherent porosity- and permeability-correlated features are subsequently extracted and subjected to multivariate calibration modelling. A training data set was selected for its wide-spanning porosity and permeability ranges, allowing evaluation of realistic prediction performance for typical North Sea/Scandinavian sandstone oil/gas reservoir rocks. This first study makes use of sandstone plugs from a single drill core from the Danish underground. Contingent upon proper test-set validation (deliberately not deleting a few small, potential outliers), prediction performance for porosity [%] was slope: 0.86; RMSEP: 2.2%; R2 = 0.90, and for permeability [mDarcy]: slope: 0.91; RMSEP: 458 mDarcy; R2 = 0.87, which translates into relative RMSEP of 12% and 19%, respectively. These results pertain to a typical, well-spanning training data set (18 sandstone plugs); it is therefore concluded that the AMT approach to poro-perm prediction from images is feasible, but further, extended calibrations must be based on more comprehensive training data sets covering the full geological regime of reservoir sandstones. We discuss possible application potentials and limitations of this approach.
- Published
- 2019
- Full Text
- View/download PDF
36. Representative sampling and use of HHXRF to characterize lot and sample quality of quartzite at a pyrometallurgical ferrosilicon plant
- Author
-
Kim H. Esbensen, L.P. Bédard, D. Desroches, and S. Lemieux
- Subjects
Sampling protocol ,Mechanical Engineering ,Sample (material) ,Sample mass ,Sampling (statistics) ,Sampling error ,02 engineering and technology ,General Chemistry ,010501 environmental sciences ,Geotechnical Engineering and Engineering Geology ,01 natural sciences ,020501 mining & metallurgy ,Sample quality ,Ferrosilicon ,0205 materials engineering ,Control and Systems Engineering ,Statistics ,Environmental science ,Representative sampling ,0105 earth and related environmental sciences - Abstract
Material sampling is a critical component in the mining and mineral processing industries. Nonetheless, sampling is often considered a simple matter and, as such, non-rigorous sampling protocols are often applied. The use of inappropriate methods produces inferior, non-representative estimates of sampling target composition. To address weaknesses in sampling protocols and evaluate the representativeness of collected samples, we performed a feasibility study of the ability of handheld X-ray fluorescence (HHXRF) to achieve a satisfactory characterization of a raw material lot at a pyrometallurgical ferrosilicon plant. Using composite and grab samples, we determined the various sampling error manifestations (stemming from the fundamental sampling error, the grouping and segregation error, as well as the increment delimitation, increment extraction, and increment preparation errors), performed a first determination of optimal sample mass, and estimated the heterogeneity within the sampling target. HHXRF results were compared with the results obtained using laboratory XRF. A first estimate of optimized sample mass for HHXRF was 10 kg, given the large size of the crushed quartz blocks used in ferrosilicon plants (roughly cubic, 10 cm per side); accuracy improved with increased sample mass (18% error with a 10 kg sample versus 35% error with a 1 kg sample). A 10 kg sample is also the mass a technician can realistically transport from the sampling site to the preparation facilities. The main contribution to the global estimation error is from primary sampling. Variographic analysis showed a sill equal to the nugget effect, indicating that two adjacent samples are no more similar than two samples separated by a larger distance; this suggests equal spatial heterogeneity at all scales larger than the increment mass in the sampling target. Analytically, the HHXRF and desktop XRF results compared very well.
Overall, the error associated with our first attempt at field composite sampling was half of that obtained via grab sampling for both the HHXRF and desktop XRF protocols. Relative to conventional analysis based on grab sampling and analysis via desktop XRF, the use of handheld XRF coupled with composite sampling would appear to be a feasible approach for an improved sampling protocol for obtaining fit-for-purpose characterizations of industrial quartzite.
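The reported improvement in accuracy with sample mass is the behaviour predicted by Gy's formula for the fundamental sampling error, in which the relative sampling variance scales as C·d³·(1/Ms − 1/ML). A sketch with illustrative constants (not the plant's actual sampling constant or the paper's parameters):

```python
def fse_relative_std(C, d_cm, m_sample_g, m_lot_g):
    """Gy's fundamental sampling error: s_FSE^2 = C * d^3 * (1/Ms - 1/ML).

    C    -- sampling constant (g/cm^3), material-dependent (illustrative here)
    d_cm -- nominal top particle size (cm)
    """
    var = C * d_cm**3 * (1.0 / m_sample_g - 1.0 / m_lot_g)
    return var ** 0.5

# Illustrative constants only: increasing sample mass tenfold cuts the
# relative FSE by roughly sqrt(10), qualitatively consistent with error
# shrinking as sample mass grows from 1 kg to 10 kg.
s_1kg  = fse_relative_std(C=0.12, d_cm=10.0, m_sample_g=1_000,  m_lot_g=1e9)
s_10kg = fse_relative_std(C=0.12, d_cm=10.0, m_sample_g=10_000, m_lot_g=1e9)
```

With 10 cm quartz blocks the d³ term is what forces the multi-kilogram sample masses discussed in the abstract.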
- Published
- 2019
- Full Text
- View/download PDF
37. Reducing global mercury pollution with simultaneous gold recovery from small-scale mining tailings
- Author
-
Kim H. Esbensen and Peter W.U. Appel
- Subjects
Gold mining ,education.field_of_study ,business.industry ,Population ,chemistry.chemical_element ,Artisanal mining ,Tailings ,Mercury (element) ,Food chain ,Spillage ,chemistry ,Environmental protection ,Business ,Treaty ,education - Abstract
Author Summary: The increasing population on planet Earth has many impacts: one is a strong influence on the amount of mercury released to the environment. The worst influence stems from the rapidly increasing number of small-scale gold miners in Asia, Africa, Central and South America, who presently provide food on the table for tens of millions of households. Small-scale gold miners use vast amounts of mercury to capture the gold, and much of this mercury is released directly to the environment. A large part evaporates to the atmosphere and the rest is transported downstream in rivers, ending up in the oceans. The amount of mercury released is phenomenal: an estimated 3000 tons of mercury is released annually by small-scale gold miners alone. A vast proportion enters the food chain in fish and sea mammals, as well as in rice polluted by spillage water that enters irrigation pathways. Human consumption of polluted fish and/or rice already has a crippling impact on human health in some countries, and this will have even more severe consequences if the current situation is not changed radically and rapidly. It is of particular concern if mercury-intoxicated women become pregnant, because the foetus extracts mercury from the mother. The human foetus is much more sensitive to mercury intoxication and thus has a high risk of being born with brain damage as well as physical disabilities. Over just one generation this will cause reduced intelligence for exposed children. Through such organisations as the United Nations Environment Programme (UNEP), the world community has become acutely aware of the rapidly increasing global mercury pollution. The treaty designed to protect human health and nature, the “Minamata Convention”, has today been signed by the majority of the world's countries. Signatory countries are thereby obliged to start initiatives to reduce and even stop mercury use.
This grim outlook has prompted a group of concerned international researchers and small-scale gold miners from the Philippines to start teaching small-scale gold miners to work without the use of mercury and, simultaneously, to find ways to clean mercury-polluted gold mining tailings, which are one of the main polluting agents. The latter will have an immediate positive economic effect for the communities involved, which should be a significant motivation to change to non-mercury recovery processes. We here describe the specific technological drive to be able to go mercury free.
- Published
- 2019
- Full Text
- View/download PDF
38. One Can Always Assign a Wavelength/Wavelength Region Specific for One's Application
- Author
-
Phil Williams, Paul Geladi, Kim H. Esbensen, and Anders Larsen
- Subjects
Wavelength ,Optics ,Region specific ,Computer science ,business.industry ,business ,Telecommunications ,Joint (geology) ,Column (database) - Abstract
A chance encounter at IDRC 2014 led to a desire to produce a joint Mythbuster column on the issues surrounding “peak assignment”. The regular column authors are here joined by Phil Williams.
- Published
- 2015
- Full Text
- View/download PDF
39. DS 3077 Horizontal—A New Standard for Representative Sampling. Design, History and Acknowledgements
- Author
-
Lars Petersen Julius and Kim H. Esbensen
- Subjects
Engineering management ,business.industry ,Salient ,Task force ,Computer science ,Telecommunications ,business ,Quality assurance ,Competence (human resources) ,Representative sampling ,Design history - Abstract
Author Summary: July 2013 saw the conclusion of a five-year project covering the design, development and quality assurance of a new generic sampling standard: DS 3077 Horizontal, published by the Danish Standardisation Authority (DS). Development of this standard was carried out by task force DS F-205. This contribution summarises the history of this endeavour, focuses on a few salient highlights and pays tribute to the task force and to a group of external collaborators responsible for the initial proof-of-concept and the final practical quality assurance. DS 3077 describes the minimum Theory of Sampling (TOS) competence basis upon which any sampler must rely so that sampling can be documented as representative, with respect to both accuracy and reproducibility. It represents a consensus among industry, academia, official regulatory bodies, professionals, students and other interested individuals.
- Published
- 2013
- Full Text
- View/download PDF
40. The Replication Myth 2: Quantifying Empirical Sampling Plus Analysis Variability
- Author
-
Anders Larsen, Paul Geladi, and Kim H. Esbensen
- Subjects
Computer science ,Replication (statistics) ,Statistics ,Sampling (statistics) - Published
- 2013
- Full Text
- View/download PDF
41. Latent Variable Regression for Laboratory Hyperspectral Images
- Author
-
Kim H. Esbensen, Paul Geladi, and Hans Grahn
- Subjects
Multivariate statistics ,Computer science ,business.industry ,010401 analytical chemistry ,Sampling (statistics) ,Hyperspectral imaging ,Pattern recognition ,04 agricultural and veterinary sciences ,Latent variable ,040401 food science ,01 natural sciences ,Regression ,Quantitative model ,0104 chemical sciences ,0404 agricultural biotechnology ,Principal component analysis ,Artificial intelligence ,business - Abstract
This chapter is about the application of latent variable-based regression methods on hyperspectral images. It is an applied chapter, and no new PLS algorithms are presented. The emphasis is on visual diagnostics and interpretation by showing how these work for the examples given. Section 16.1 of this chapter introduces the basic concepts of multivariate regression and of multivariate and hyperspectral images. In Sect. 16.2 the hyperspectral imaging technique used and the two examples (cheese and textile) are explained. Also some sampling issues are discussed here. Principal component analysis (PCA) is a powerful latent variable-based tool for cleaning images. Section 16.3 describes PLS quantitative model building and diagnostics, both numerical and visual for the cheese example, and finishes with PLSDA qualitative modeling for the textile example.
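In practice, latent-variable modelling of a hyperspectral image starts by unfolding the (rows × columns × wavelengths) hypercube into a two-dimensional (pixels × wavelengths) matrix, fitting the model, and refolding scores or predictions back into image form for visual diagnostics. A minimal NumPy sketch of this unfold/refold step using PCA scores (illustrative only, not the chapter's code; data are synthetic):

```python
import numpy as np

def unfold(cube):
    """Unfold an (rows, cols, bands) hyperspectral cube to a (pixels, bands) matrix."""
    r, c, b = cube.shape
    return cube.reshape(r * c, b)

def pca_scores(X, n_components=2):
    """PCA via SVD on the mean-centred pixel matrix; returns score vectors."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n_components] * S[:n_components]

rng = np.random.default_rng(1)
cube = rng.normal(size=(8, 10, 32))        # toy 8x10 image with 32 spectral bands
X = unfold(cube)                            # 80 pixels x 32 bands
scores = pca_scores(X, n_components=2)
score_image = scores[:, 0].reshape(8, 10)  # refold score 1 as an image for inspection
```

The same unfold/refold pattern carries over directly when the latent-variable model is PLS rather than PCA.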
- Published
- 2017
- Full Text
- View/download PDF
42. The Cretaceous Petroleum System in the Danish Central Graben (CRETSYS). Technical notes
- Author
-
Emma Sheldon, Jonathan Ralph Ineson, Karen Dybkjær, Morten Bjerager, Anders Mathiesen, Claus Andersen, Christian Brogaard Pedersen, Hans Peter Nytoft, Rasmus Rasmussen, Tjerk Heijboer, Uffe Larsen, Finn Christian Jakobsen, Finn Mortanson Mørk, Carsten Møller Nielsen, Lars Kristensen, Niels Hemmingsen Schovsbo, Nina Skaarup, Erik Thomsen, Henrik Knudsen, Louise Ponsaing Lauridsen, and Kim H. Esbensen
- Published
- 2017
- Full Text
- View/download PDF
43. The Replication Myth 1
- Author
-
Anders Larsen, Kim H. Esbensen, and Paul Geladi
- Subjects
Replication (statistics) ,Environmental science ,Virology - Published
- 2013
- Full Text
- View/download PDF
44. Transition to circular economy requires reliable statistical quantification and control of uncertainty and variability in waste
- Author
-
Kim H. Esbensen and Costas A. Velis
- Subjects
Engineering ,Environmental Engineering ,Municipal solid waste ,business.industry ,Circular economy ,010401 analytical chemistry ,Control (management) ,Environmental resource management ,Uncertainty ,010501 environmental sciences ,Environmental economics ,Solid Waste ,01 natural sciences ,Pollution ,Environmental Policy ,Refuse Disposal ,0104 chemical sciences ,Inventions ,Environmental policy ,business ,0105 earth and related environmental sciences - Published
- 2016
- Full Text
- View/download PDF
45. Beauty is Only Skin Deep—Especially When Imaging in the near Infrared
- Author
-
Anders Larsen, Paul Geladi, and Kim H. Esbensen
- Subjects
Optics ,business.industry ,media_common.quotation_subject ,Near-infrared spectroscopy ,Beauty ,business ,Geology ,media_common - Published
- 2012
- Full Text
- View/download PDF
46. Myth 2: The Myth of the Small Validation Error Goal—Beware of Over-Fitting
- Author
-
Anders Larsen, Paul Geladi, and Kim H. Esbensen
- Subjects
Computer science ,Validation error ,Overfitting ,Algorithm - Published
- 2012
- Full Text
- View/download PDF
47. Representative sampling of large kernel lots II. Application to soybean sampling for GMO control
- Author
-
Pentti Minkkinen, Kim H. Esbensen, and Claudia Paoletti
- Subjects
Accuracy and precision ,Computer science ,Mode (statistics) ,Sampling (statistics) ,Function (mathematics) ,Field (computer science) ,Analytical Chemistry ,Kernel (statistics) ,Statistics ,media_common.cataloged_instance ,Lot quality assurance sampling ,European union ,Spectroscopy ,media_common - Abstract
Official testing and sampling of large kernel lots for impurities [e.g., genetically modified organisms (GMOs)] is regulated by normative documents and international standards of economic, trade and societal importance. In Part I, we reviewed current official guides and standards for sampling large contaminated kernel lots and the basic concepts of the Theory of Sampling (TOS) for chemical analysis. Here, we re-interpret the data collected in a recent field study (KeLDA) from a stringent TOS perspective, focusing on representative process sampling and variographic analysis in order to characterize the heterogeneities of large kernel lots and to estimate both the Total Sampling Error (TSE) and the Total Analytical Error (TAE). This is used as a basis for developing a general approach to optimization of kernel sampling protocols that are “fit for purpose”, i.e. robust to heterogeneity and sufficiently accurate to detect critically low concentration levels. We demonstrate that both TSE and TAE are significantly large for GMO quantitation, but that TSE can still be up to two orders of magnitude larger than TAE, depending on heterogeneity, sampling mode and GMO concentration, signifying that efforts to reduce uncertainties should focus on sampling plans and not on further refinements of analytical precision. For GMO testing based on the current labeling threshold (0.9%) in European Union regulations, we show that 42 is the absolute minimum number of increments needed for reliable characterization of all lots with a heterogeneity comparable to the most severely heterogeneous KeLDA lots (Lot #1). We demonstrate that the TOS is a comprehensive tool for reliable estimation of the effects of alternative sampling procedures and schemes, especially when using 1-D process variography, with which both sampling accuracy and precision can be optimized.
We show how it is always possible to estimate TSE from one simple variographic experiment based solely on the process-sampling requirements of TOS. This approach is universal and can be carried over to many other (static or dynamic) sampling scenarios and materials (e.g., impurities, contaminants and trace concentrations). The present variographic approach is crucial for a meaningful definition of “appropriate sampling plans” (i.e. sampling plans minimizing TSE as a function of the specific heterogeneity of any given lot).
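The increment counts discussed here follow from the basic compositing relation: the random part of the sampling error of a composite of Q increments falls as 1/√Q, so Q can be back-calculated from a variographically estimated single-increment variability and a target uncertainty. A sketch with illustrative inputs (not the KeLDA estimates themselves):

```python
import math

def increments_needed(s_rel_single, target_rel_std):
    """Number of increments Q so that the composite's random sampling error,
    s_single / sqrt(Q), falls below a target relative standard deviation."""
    return math.ceil((s_rel_single / target_rel_std) ** 2)

# Illustrative single-increment relative std for a highly heterogeneous lot
# and an illustrative target precision at a low impurity concentration.
q = increments_needed(s_rel_single=1.30, target_rel_std=0.20)   # -> 43 increments
```

This is why the more heterogeneous a lot is (larger single-increment variability), the more increments a fit-for-purpose sampling plan must composite.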
- Published
- 2012
- Full Text
- View/download PDF
48. Representative sampling of large kernel lots III. General considerations on sampling heterogeneous foods
- Author
-
Pentti Minkkinen, Claudia Paoletti, and Kim H. Esbensen
- Subjects
Sampling (statistics), Analytical Chemistry, Kernel (statistics), Statistics, Lot quality assurance sampling, European union, Spectroscopy - Abstract
Part I reviewed the Theory of Sampling (TOS) as applied to quantitation of genetically-modified organisms (GMOs). Part II re-analyzed KeLDA data from a variographic analysis perspective and estimated Total Sampling Error (TSE) versus Total Analytical Error (TAE). Results from this analysis are here used as a basis for developing a general approach to optimization of kernel-sampling protocols that are fit for purpose (i.e. scaled with respect to the effective heterogeneity while simultaneously sufficiently accurate to detect critically low concentration levels). While TAE is significantly large for GMO quantitation, TSE can still be up to two orders of magnitude larger, signifying that efforts to reduce GMO-analysis uncertainties should focus on improving or optimizing sampling plans and not on further refinements of analytical precision. For GMO testing based on the current labeling threshold (0.9%) of European Union regulations, KeLDA re-analysis results show that the number of increments needed (Q) for reliable characterization of lots with significant heterogeneities ranges from 42 (highly heterogeneous lots) to 17 (close to uniform materials). We outline how it is always possible to estimate TSE from a simple variographic experiment based on TOS’ process-sampling requirements. This approach is universal and can be carried over from the GMO case to all other (static or dynamic) sampling scenarios and materials dealing with impurities, contaminants, or trace concentrations, without any loss of generality. A proper basis for TOS-based process sampling is essential for any meaningful definition of “appropriate sampling plans” (i.e. sampling plans minimizing TSE as a function of the specific heterogeneity of any given lot). If unit-operation costs for sampling and analysis are known, sampling plans can also be optimized with respect to overall costs.
We discuss the degree to which the present results can be generalized regarding official monitoring and inspection of food and feed materials. What is presented here in effect constitutes a contribution towards a comprehensive, horizontal process-sampling standard for heterogeneous materials in general.
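The scaling behind a number-of-increments requirement such as Q can be illustrated with a generic composite-sampling sketch. This is not the paper's exact derivation (which rests on the variographic characterization of each lot); it only shows the textbook relation that, for uncorrelated increments, the relative standard deviation of a Q-increment composite falls as 1/sqrt(Q), so the minimum Q for a target uncertainty follows directly. All numbers below are hypothetical.

```python
import math

def increments_needed(s_increment_rel, target_rel_sd):
    """Minimum number of increments Q so that the relative standard
    deviation of a composite-sample estimate, s/sqrt(Q), meets the
    target. Assumes random selection of uncorrelated increments;
    autocorrelated process streams need the variographic treatment."""
    return math.ceil((s_increment_rel / target_rel_sd) ** 2)
```

For instance, a single-increment relative standard deviation of 50% and a target of 10% would require `increments_needed(0.5, 0.1)`, i.e. 25 increments; halving the target quadruples Q.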
- Published
- 2012
- Full Text
- View/download PDF
49. On-Line near Infrared Monitoring of Ammonium and Dry Matter in Bioslurry for Robust Biogas Production: A Full-Scale Feasibility Study
- Author
-
Michael Madsen, Kim H. Esbensen, Maths Halstensen, Jens Bo Holm-Nielsen, and Felicia Nkem Ihunegbo
- Subjects
Accuracy and precision ,Biorefinery ,Pulp and paper industry ,chemistry.chemical_compound ,Agronomy ,Biogas ,chemistry ,Bioenergy ,Yield (chemistry) ,Anaerobic digestion (AD) ,on-line process monitoring ,dry matter ,ammonium ,near infrared spectroscopy ,NIR ,multivariate calibration ,test set validation ,Slurry ,Environmental science ,Dry matter ,Ammonium ,Spectroscopy - Abstract
Heterogeneous substrates fed into agricultural biogas plants originate from many sources, with resulting quality fluctuations potentially inhibiting the process. Biogas yield can be substantially increased by optimisation of the organic dry matter load. In this study, near infrared spectroscopy was applied on-line in a re-circulating loop configuration operating identically to a full-scale setup. Ammonium could be modelled in the industrially relevant range 2.42 – 8.52 g L-1 with excellent accuracy and precision (slope ~1.0, r² = 0.97), corresponding to a relative Root Mean Square Error of Prediction (RMSEP) of 6.7 %. Also, dry matter in the similarly plant-relevant range 5.8 – 10.8 weight-percent could be predicted with acceptable accuracy (slope ~1.0, r² = 0.83, and a relative RMSEP below 8.0 %). Based on these performance characteristics, it was concluded that near infrared spectroscopy can be applied for optimising the efficiency of current and future biogas plants, as well as in biorefinery contexts converting heterogeneous bioslurry, energy crops, and wastes into value-added products. Adding model transfer capabilities, it is indicated that handheld instrumentation can play a vital role in bringing NIR technology directly into the field and onto the plant floor; the implications for reliable biogas NIR process monitoring and control are significant.
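The figures of merit quoted above (slope, r², relative RMSEP) come from test-set validation of predicted versus reference values. A minimal sketch of how such statistics are computed is shown below; the function name and dictionary keys are illustrative, and the actual study would have used a full multivariate (e.g. PLS) calibration pipeline.

```python
def prediction_stats(y_ref, y_pred):
    """Test-set validation figures of merit: RMSEP, relative RMSEP (% of
    the mean reference value), and the slope and r^2 of the
    predicted-vs-reference regression."""
    n = len(y_ref)
    mean_ref = sum(y_ref) / n
    mean_pred = sum(y_pred) / n
    rmsep = (sum((p - r) ** 2 for r, p in zip(y_ref, y_pred)) / n) ** 0.5
    cov = sum((r - mean_ref) * (p - mean_pred) for r, p in zip(y_ref, y_pred))
    var_ref = sum((r - mean_ref) ** 2 for r in y_ref)
    var_pred = sum((p - mean_pred) ** 2 for p in y_pred)
    return {
        "RMSEP": rmsep,
        "rel_RMSEP_pct": 100 * rmsep / mean_ref,  # relative RMSEP in %
        "slope": cov / var_ref,                   # regression slope
        "r2": cov ** 2 / (var_ref * var_pred),    # squared correlation
    }
```

A constant positive bias, for example, leaves slope and r² at 1.0 while RMSEP reports the offset, which is why all three statistics are reported together.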
- Published
- 2012
- Full Text
- View/download PDF
50. Monitoring of anaerobic digestion processes: A review perspective
- Author
-
Michael Madsen, Kim H. Esbensen, and Jens Bo Holm-Nielsen
- Subjects
Engineering ,Process Analytical Technologies (PAT) ,Sanitation ,Renewable Energy, Sustainability and the Environment ,business.industry ,Emerging technologies ,Process (engineering) ,Theory of Sampling (TOS) ,Biodegradable waste ,Work in process ,Anaerobic digestion (AD) ,Anaerobic digestion ,Risk analysis (engineering) ,Process sampling ,Process monitoring ,Critical success factor ,Biorefining ,business ,Process engineering - Abstract
The versatility of anaerobic digestion (AD) as an effective technology for solving central challenges met in applied biotechnological industry and society has been documented in numerous publications over the past many decades. Reduction of sludge volume generated from wastewater treatment processes, sanitation of industrial organic waste, and benefits from degassing of manure are a few of the most important applications. Especially renewable energy production, integrated biorefining concepts, and advanced waste handling are delineated as the major market players for AD that will likely expand rapidly in the near future. The complex, biologically mediated AD events are, however, far from being understood in detail. Despite decade-long serious academic and industrial research efforts, only a few general rules have been formulated with respect to assessing the state of the process from chemical measurements. Conservative reactor designs have dampened the motivation for employing new technologies, which also constitutes one of the main barriers for successful upgrade of the AD sector with modern process monitoring instrumentation. Recent advances in Process Analytical Technologies (PAT) allow complex bioconversion processes to be monitored and deciphered using e.g. spectroscopic and electrochemical measurement principles. In combination with chemometric multivariate data analysis, these emerging process monitoring modalities carry the potential to bring AD process monitoring and control to a new level of reliability and effectiveness. It is shown how proper involvement of process-sampling understanding, the Theory of Sampling (TOS), constitutes a critical success factor. We survey the more recent trends within the field of AD monitoring and highlight the powerful PAT/TOS/chemometrics application potential. The Danish co-digestion concept, which integrates utilisation of agricultural manure, biomass and industrial organic waste, is used as a case study.
We present a first foray for the next research and development perspectives and directions for the AD bioconversion sector.
- Published
- 2011
- Full Text
- View/download PDF