120 results for "Maurice G Cox"
Search Results
2. GUM guidance on developing and using measurement models
- Author
-
Adriaan M. H. van der Veen, Maurice G. Cox, and Antonio Possolo
- Subjects
General Chemical Engineering, General Chemistry, Safety, Risk, Reliability and Quality, Instrumentation
- Published
- 2022
- Full Text
- View/download PDF
3. Getting started with uncertainty evaluation using the Monte Carlo method in R
- Author
-
Maurice G Cox and Adriaan M. H. van der Veen
- Subjects
Propagation of uncertainty, Computer science, Calibration (statistics), General Chemical Engineering, Monte Carlo method, General Chemistry, Industrial engineering, Computational statistics, Measurement uncertainty, Safety, Risk, Reliability and Quality, Instrumentation, Implementation
- Abstract
The evaluation of measurement uncertainty is often perceived by laboratory staff as complex and quite distant from daily practice. Nevertheless, standards such as ISO/IEC 17025, ISO 15189 and ISO 17034, which specify requirements for laboratories to enable them to demonstrate that they operate competently and are able to generate valid results, require that measurement uncertainty is evaluated and reported. In response to this need, a European project entitled "Advancing measurement uncertainty—comprehensive examples for key international standards", started in July 2018, aims at developing examples that contribute to a better understanding of what is required and that aid in implementing such evaluations in calibration, testing and research. The principle applied in the project is "learning by example". Past experience with guidance documents such as EA 4/02 and the Eurachem/CITAC guide on measurement uncertainty has shown that it is often easier for practitioners to rework and adapt an existing example than to try to develop something from scratch. This introductory paper describes how the Monte Carlo method of GUM (Guide to the expression of uncertainty in measurement) Supplement 1 can be implemented in R, an environment for mathematical and statistical computing. An implementation of the law of propagation of uncertainty is also presented in the same environment, taking advantage of the possibility of evaluating the partial derivatives numerically, so that these do not need to be derived by analytic differentiation. The implementations are shown for the computation of the molar mass of phenol from standard atomic masses and for the well-known mass calibration example from EA 4/02.
- Published
- 2021
- Full Text
- View/download PDF
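The Monte Carlo workflow described in the abstract above lends itself to a compact sketch. The paper's code is in R; the following Python analogue illustrates the same GUM Supplement 1 approach for the molar mass of phenol (C6H5OH), with atomic-mass values and uncertainties chosen for illustration rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 10**6  # number of Monte Carlo trials

# Draw the input quantities (standard atomic masses, g/mol) from their
# state-of-knowledge distributions; the uncertainties here are illustrative.
m_C = rng.normal(12.011, 0.001, M)    # carbon
m_H = rng.normal(1.008, 0.0005, M)    # hydrogen
m_O = rng.normal(15.999, 0.001, M)    # oxygen

# Measurement model: phenol is C6H5OH, i.e. 6 C, 6 H and 1 O per molecule
M_phenol = 6 * m_C + 6 * m_H + m_O

estimate = M_phenol.mean()                       # estimate of the measurand
u = M_phenol.std(ddof=1)                         # standard uncertainty
lo, hi = np.quantile(M_phenol, [0.025, 0.975])   # 95 % coverage interval
print(f"M = {estimate:.4f} g/mol, u = {u:.4f} g/mol, "
      f"95 % interval [{lo:.4f}, {hi:.4f}]")
```

The law-of-propagation variant mentioned in the abstract would instead evaluate numerical partial derivatives of the model at the input estimates and combine the input standard uncertainties in quadrature.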
4. A methodology for testing classes of approximation and optimisation.
- Author
-
Bernard Butler, Maurice G. Cox, Alistair Forbes, Simon Hannaby, and Peter M. Harris
- Published
- 1996
5. Explicit unconditionally numerically stable solution of a class of cubic equations
- Author
-
Maurice G. Cox
- Published
- 2022
- Full Text
- View/download PDF
6. Modelling of the dynamic gravimetric preparation of calibration gas mixtures using permeation for trace gas analysis
- Author
-
Adriaan M. H. van der Veen, Heleen Meuzelaar, Merima Čaušević, and Maurice G. Cox
- Published
- 2022
- Full Text
- View/download PDF
7. An interpolation scheme for precision intermediate frequency reflection coefficient measurement.
- Author
-
Maurice G. Cox, Mark P. Dainton, Nick M. Ridler, Martin J. Salter, and P. R. Young
- Published
- 2003
- Full Text
- View/download PDF
8. The Reconstruction of Workpiece Surfaces from Probe Coordinate Data.
- Author
-
Bernard Butler, Maurice G. Cox, and Alistair Forbes
- Published
- 1992
9. Advanced Mathematical And Computational Tools In Metrology Vi
- Author
-
Patrizia Ciarlini, Maurice G Cox, Franco Pavese
- Published
- 2004
10. Statistical analysis of temperature rise in passive medical implants in a magnetic resonance imaging environment
- Author
-
Maurice G Cox, S Rajan, and K Jagan
- Subjects
Nuclear magnetic resonance, Materials science, Medicine, Magnetic resonance imaging, Statistical analysis
- Published
- 2021
- Full Text
- View/download PDF
11. Meaningful expressions of uncertainty in measurement
- Author
-
Maurice G Cox and A O'Hagan
- Subjects
Computer science, Measurement uncertainty, Applied mathematics
- Published
- 2021
- Full Text
- View/download PDF
12. Advanced Mathematical And Computational Tools In Metrology V
- Author
-
Patrizia Ciarlini, Maurice G Cox, Eduarda Filipe
- Published
- 2001
13. An algorithm for the removal of noise and jitter in signals and its application to picosecond electrical measurement.
- Author
-
Maurice G. Cox, Peter M. Harris, and David A. Humphreys
- Published
- 1993
- Full Text
- View/download PDF
14. Calibration of reference antennas for vector measurements of SAR
- Author
-
Maurice G Cox, D J Bownds, Djamel Allal, G Revillod, and A P Gregory
- Subjects
Waveguide (electromagnetism), Computer science, Acoustics, Instrumentation, Specific absorption rate, Imaging phantom, Calibration, Reference antenna, Dipole antenna, Antenna (radio)
- Abstract
Vector-based instrumentation for the determination of Specific Absorption Rate (SAR) uses measurements of the complex E-field inside a phantom at multiple locations. Reconstruction algorithms are used to calculate the SAR and its peak spatial value. The required measurements can be made by using an E-field vector probe that is controlled by a robot, or by using a static array of vector probes (for example, dipole antennas and receivers). The latter approach potentially offers much reduced measurement times, which is particularly advantageous for type-approval tests on smartphones as these can have 30 or more transmitting modes. To enable traceable measurement of SAR, an array-based system requires calibration data for its individual antennas, which can be obtained by exposing the individual antennas to a characterised complex E-field from a stable antenna (which will be referred to as a reference antenna). This report describes the process for characterising a reference antenna: traceable mapping of complex E-fields inside a liquid phantom which is illuminated by the antenna from beneath. The measurements of complex E-field were made by using an electro-optic probe that has been calibrated by using a traceable waveguide system.
- Published
- 2021
- Full Text
- View/download PDF
15. Reply to "Comment on Liu et al. 'Discrepancies of Measured SAR between Traditional and Fast Measuring Systems', Int. J. Environ. Res. Public Health, 2020, 17, 2111"
- Author
-
Maurice G Cox, Djamel Allal, Joe Wiart, and Zicheng Liu
- Subjects
Health, Toxicology and Mutagenesis, Public Health, Environmental and Occupational Health, Reproducibility of Results, Data Collection, Environmental research, fast SAR measurement, traditional SAR measurement, uncertainty analysis, field reconstruction, measurement discrepancy, specific absorption rate, plane-wave expansion
- Abstract
An article published in the International Journal of Environmental Research and Public Health compares two types of specific absorption rate measurement systems—a fast system using a time-domain array and a traditional system using probe scanning. While the time-domain array system is analyzed in detail under idealized conditions, the probe-scanning system evaluation used a fixed set of scanning and evaluation parameters that are not fully compliant with the requirements of the published standards. This leads to a false comparison and the incorrect conclusion that time-domain array systems can be theoretically more accurate than probe-scanning systems. We have repeated the analysis applied in the paper using the same raw data but with state-of-the-art scanning and evaluation parameters. The results confirm the high accuracy of probe-scanning systems for any field distribution. Due to the high precision, robustness, and reliability of probe-scanning systems, the results of these systems are often referred to as reference results.
- Published
- 2020
16. Discrepancies of Measured SAR between Traditional and Fast Measuring Systems
- Author
-
Zicheng Liu, Joe Wiart, Maurice G Cox, and Djamel Allal
- Subjects
Signal Processing, Field (physics), Health, Toxicology and Mutagenesis, Acoustics, Imaging phantom, Region of interest, Electric field, uncertainty analysis, System of measurement, field reconstruction, Public Health, Environmental and Occupational Health, Computational Physics, Medical Physics, Amplitude, specific absorption rate, plane-wave expansion, measurement discrepancy, fast SAR measurement, traditional SAR measurement, Energy (signal processing)
- Abstract
Human exposure to mobile devices is traditionally measured by a system in which the human body (or head) is modelled by a phantom and the energy absorbed from the device is estimated based on the electric fields measured with a single probe. Such a system suffers from low efficiency due to the repeated volumetric scanning within the phantom needed to capture the absorbed energy throughout the volume. To speed up the measurement, fast SAR (specific absorption rate) measuring systems have been developed. However, discrepancies are observed between the results of traditional and fast measuring systems. In this paper, these discrepancies are investigated in terms of the post-processing procedures applied after the measurement of the electric field (or its amplitude). Here, the fast measuring system considered estimates SAR based on the reconstructed field in the region of interest, while the amplitude and phase of the electric field are measured on a single plane with a probe array. The numerical results presented indicate that the fast SAR measuring system has the potential to yield more accurate estimations than the traditional system, but no conclusion can be made on which kind of system is superior without knowledge of the field-reconstruction algorithms and the emitting source.
- Published
- 2020
- Full Text
- View/download PDF
17. The GUM perspective on straight-line errors-in-variables regression
- Author
-
Clemens Elster, Steffen Martens, Katy Klauenberg, Maurice G Cox, Alen Bošnjaković, and Adriaan M H van der Veen
- Subjects
Propagation of uncertainty, Mathematical optimization, Errors-in-variables, Straight-line regression, Weighted total least-squares, Law of propagation of uncertainty, Monte Carlo method, Implicit measurement model, Applied Mathematics, Condensed Matter Physics, Regression, Measurement uncertainty, Errors-in-variables models, Electrical and Electronic Engineering, Instrumentation, Mathematics
- Abstract
Following the Guide to the expression of uncertainty in measurement (GUM), the slope and intercept in straight-line regression tasks can be estimated and their uncertainty evaluated by defining a measurement model. Minimizing the weighted total least-squares functional appropriately defines such a model when both regression input quantities (X and Y) are uncertain. This paper compares the uncertainty of the straight line evaluated by propagating distributions and by the law of propagation of uncertainty (LPU). The latter is in turn often approximated because the non-linear measurement model does not have a closed form. We reason that the uncertainty recommended in the dedicated technical specification ISO/TS 28037:2010 does not fully implement the LPU (as intended) and can understate the uncertainty. A systematic simulation study quantifies this understatement and the circumstances in which it becomes relevant. In contrast, the LPU uncertainty may often be appropriate. As a result, it is planned to revise ISO/TS 28037:2010.
- Published
- 2022
- Full Text
- View/download PDF
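For orientation, the weighted total least-squares functional referred to in the abstract above takes, for uncorrelated measured coordinates (x_i, y_i) with standard uncertainties u(x_i) and u(y_i), the following standard form (notation chosen here, not necessarily the paper's):

\[
S(a,b,\xi_1,\dots,\xi_N)=\sum_{i=1}^{N}\left[\frac{(x_i-\xi_i)^2}{u^2(x_i)}+\frac{(y_i-a-b\,\xi_i)^2}{u^2(y_i)}\right],
\]

minimized over the intercept a, slope b and the unknown true abscissae ξ_i; the non-linearity arising from the products b·ξ_i is what prevents a closed-form law-of-propagation evaluation.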
18. Informative Bayesian Type A uncertainty evaluation, especially applicable to a small number of observations
- Author
-
Katsuhiro Shirono and Maurice G Cox
- Subjects
Bayesian probability, General Engineering, Bayes factor, Statistics and Probability, Bayesian statistics, Frequentist inference, Statistics, Econometrics, Bayesian hierarchical modeling, Bayesian linear regression, Bayesian average, Uncertainty analysis, Mathematics
- Abstract
A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM's Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
- Published
- 2017
- Full Text
- View/download PDF
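A worked equation helps explain the "n must be at least 4" difficulty mentioned in the abstract above. For n normal observations with mean q̄ and standard deviation s, the conventional objective Bayesian posterior for the measurand is a scaled-and-shifted t distribution with n − 1 degrees of freedom, whose standard deviation

\[
u = \frac{s}{\sqrt{n}}\sqrt{\frac{n-1}{n-3}}
\]

is finite only for n ≥ 4, whereas the frequentist GUM evaluation u(q̄) = s/√n applies whenever n ≥ 2. The informative prior proposed in the paper yields a different, closed-form multiplying factor that remains usable for small n.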
19. The equilibrium liquidus temperatures of rhenium–carbon, platinum–carbon and cobalt–carbon eutectic alloys
- Author
-
María E. Martín, Z. Yuan, Yuki Yamaguchi, Howard W. Yoon, F. Jahan, D. J. Woods, B. Rougié, D. Lowe, B. Khlevnoy, Maurice G Cox, X. Lu, T Wang, John T. Woodward, D Wei, D. del Campo, Graham Machin, M. R. Dury, D. R. Taubert, A. D. W. Todd, Joaquín Campos, H. C. McEvoy, V. Gavrilov, A. Whittam, S. Briaudeau, María Luisa Hernanz, R. Van den Bossche, Irina Grigoryeva, K. Anhalt, S. G. R. Salim, Yoshiro Yamada, N. Sasajima, J. M. Mantilla, F. Bourson, V B. Khromchenko, B. Wilthan, Mark Ballico, Mohamed Sadli, P. Bloembergen, Emma R. Woolliams, and E. W. M. van der Ham
- Subjects
Materials science, Thermodynamic state, General Engineering, Thermodynamics, Liquidus, Thermodynamic temperature, Rhenium, Chemistry, Platinum, Cobalt, Carbon, Eutectic system
- Abstract
The eutectic alloys rhenium–carbon, platinum–carbon and cobalt–carbon have been proposed as reference standards for thermometry, with temperature and uncertainty values specified within the mise en pratique of the definition of the kelvin. These alloys have been investigated in a collaboration of eleven national measurement institutes and laboratories. Published results reported the point-of-inflection in the melting curve with extremely low uncertainties. However, to be considered as standards it is necessary to stipulate what phenomenon a temperature value has been ascribed to; specifically, this should be a thermodynamic state. Therefore, the data have been further evaluated and the equilibrium liquidus temperatures determined based on a consideration of limits and assuming a rectangular probability distribution. The values are: for rhenium–carbon 2747.91 ± 0.44 K, for platinum–carbon 2011.50 ± 0.22 K and for cobalt–carbon 1597.48 ± 0.14 K, with uncertainties at approximately a 95% coverage probability. It is proposed that these values could be used as the basis of thermodynamic temperature measurement at high temperatures (above 1300 K).
- Published
- 2017
- Full Text
- View/download PDF
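The "consideration of limits" with a rectangular probability distribution mentioned in the abstract above corresponds to the standard GUM treatment: for a quantity known only to lie between limits a₋ and a₊, the estimate and standard uncertainty are

\[
\hat{T} = \frac{a_- + a_+}{2}, \qquad u(\hat{T}) = \frac{a_+ - a_-}{2\sqrt{3}}.
\]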
20. Role of measurement uncertainty in the comparison of average areal rainfall methods
- Author
-
Maurice G Cox, L. L. Martins, João Alencar de Sousa, D Loureiro, A C Soares, Alexandra Ribeiro, M C Almeida, M. V. R. Silva, and R Brito
- Subjects
General Engineering, Measurement uncertainty, Environmental science, Atmospheric sciences
- Abstract
The main motivation for this research is the growing awareness of the impact of climate change and the increasing relevance of the United Nations Sustainable Development Goals, aiming to contribute to the measurement of quantities like precipitation and rate of rainfall. This knowledge is widely used in hydrology, climatology and meteorology, providing data and information applied in modelling, pattern definition and recognition, and forecasting. This work is concerned with estimating the average areal rainfall in a stipulated region from rainfall intensity observations made at measurement stations within that region. It focuses on three straightforward estimation approaches: the arithmetic mean method, the Thiessen polygon method and the isohyetal method. The evaluation of the associated measurement uncertainty, for which the law of propagation of uncertainty and a Monte Carlo method as described in guidance documents from the Joint Committee for Guides in Metrology are applied, is the main consideration. The approaches described may be readily applied by practitioners. A comparison of results from applying these methods to a simple example is made. Such results are required for conformity assessment and support in urban management and water resources management worldwide.
- Published
- 2021
- Full Text
- View/download PDF
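All three estimation approaches named in the abstract above reduce to a weighted mean of the station observations, differing only in how the weights are assigned (equal weights, Thiessen polygon area fractions, or isohyet-based area fractions). A minimal Monte Carlo sketch of the uncertainty evaluation, with invented station data and weights purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 10**5  # Monte Carlo trials

# Illustrative station rainfall estimates (mm) and standard uncertainties;
# these numbers are invented for the sketch, not taken from the paper.
r = np.array([12.4, 15.1, 9.8, 13.6])
u_r = np.array([0.6, 0.8, 0.5, 0.7])

# Area-fraction weights (Thiessen-style); equal weights of 1/4 would
# recover the arithmetic mean method.
w = np.array([0.30, 0.25, 0.20, 0.25])

samples = rng.normal(r, u_r, size=(M, r.size))  # propagate distributions
areal = samples @ w                             # weighted areal mean, per trial

print(f"average areal rainfall = {areal.mean():.2f} mm, "
      f"u = {areal.std(ddof=1):.2f} mm")
```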
21. Analysis of a regional metrology organization key comparison: Preliminary consistency check of the linking-laboratory data with the CIPM key comparison reference value
- Author
-
Maurice G Cox and Katsuhiro Shirono
- Subjects
Computer science, Consistency (statistics), Data mining, Metrology
- Published
- 2018
- Full Text
- View/download PDF
22. Uncertainty propagation for SPECT/CT-based renal dosimetry in 177Lu peptide receptor radionuclide therapy
- Author
-
Lena Johansson, Gustav Brolin, Michael Ljungberg, Johan Gustafsson, Maurice G Cox, and Katarina Sjögreen Gleisner
- Subjects
Receptors, Peptide, Monte Carlo method, Partial volume, Kidney, Octreotide, SPECT imaging, Image Processing, Computer-Assisted, Organometallic Compounds, Humans, Dosimetry, Gamma Cameras, Tissue Distribution, Radiology, Nuclear Medicine and Imaging, Radiometry, Tomography, Emission-Computed, Single-Photon, Physics, Propagation of uncertainty, Radiological and Ultrasound Technology, Phantoms, Imaging, Uncertainty, Absorbed dose, Radionuclide therapy, Curve fitting, Radiopharmaceuticals, Tomography, X-Ray Computed, Nuclear medicine, Biomedical engineering
- Abstract
A computer model of a patient-specific clinical 177Lu-DOTATATE therapy dosimetry system is constructed and used for investigating the variability of renal absorbed dose and biologically effective dose (BED) estimates. As patient models, three anthropomorphic computer phantoms coupled to a pharmacokinetic model of 177Lu-DOTATATE are used. Aspects included in the dosimetry-process model are the gamma-camera calibration via measurement of the system sensitivity, selection of imaging time points, generation of mass-density maps from CT, SPECT imaging, volume-of-interest delineation, calculation of absorbed-dose rate via a combination of local energy deposition for electrons and Monte Carlo simulations of photons, curve fitting and integration to absorbed dose and BED. By introducing variabilities in these steps the combined uncertainty in the output quantity is determined. The importance of different sources of uncertainty is assessed by observing the decrease in standard deviation when removing a particular source. The obtained absorbed dose and BED standard deviations are approximately 6% and slightly higher if considering the root mean square error. The most important sources of variability are the compensation for partial volume effects via a recovery coefficient and the gamma-camera calibration via the system sensitivity.
- Published
- 2015
- Full Text
- View/download PDF
23. Uncertainty Analysis of Thermal Comfort Parameters
- Author
-
Alistair B. Forbes, Luís Martins, Maurice G Cox, L. Cordeiro Matias, A. Silva Ribeiro, and J. Alves e Sousa
- Subjects
Heat balance, Monte Carlo method, Experimental data, Applied mathematics, Thermal comfort, Condensed Matter Physics, Uncertainty analysis, Mathematics
- Abstract
International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
- Published
- 2015
- Full Text
- View/download PDF
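For reference, the dependence of PPD on PMV noted in the abstract above is the single closed-form relation given in ISO 7730:

\[
\mathrm{PPD} = 100 - 95\exp\left(-0.03353\,\mathrm{PMV}^{4} - 0.2179\,\mathrm{PMV}^{2}\right),
\]

so the uncertainty evaluated for PMV propagates into PPD through this one nonlinear function.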
24. EANM practical guidance on uncertainty analysis for molecular radiotherapy absorbed dose calculations
- Author
-
Johan Gustafsson, Maurice G Cox, Jonathan Gear, Glenn D. Flux, Katarina Sjögreen Gleisner, Gerhard Glatting, Mark Konijnenberg, and Iain Murray
- Subjects
Volume of interest, Partial volume, Guidelines, Recovery coefficient, Dosimetry, Neoplasms, Calibration, Humans, Applied mathematics, Yttrium Radioisotopes, Radiology, Nuclear Medicine and Imaging, Uncertainty analysis, Mathematics, Radiotherapy Planning, Computer-Assisted, Uncertainty, Radiotherapy Dosage, General Medicine, Radiation therapy, Absorbed dose, Practice Guidelines as Topic, Radiopharmaceuticals, Algorithms
- Abstract
A framework is proposed for modelling the uncertainty in the measurement processes constituting the dosimetry chain that are involved in internal absorbed dose calculations. The starting point is the basic model for absorbed dose in a site of interest as the product of the cumulated activity and a dose factor. In turn, the cumulated activity is given by the area under a time-activity curve derived from a time sequence of activity values. Each activity value is obtained in terms of a count rate, a calibration factor and a recovery coefficient (a correction for partial volume effects). The methods used to determine the recovery coefficient and the dose factor, both of which depend on the size of the volume of interest (VOI), are described. Consideration is given to propagating estimates of the quantities concerned and their associated uncertainties through the dosimetry chain to obtain an estimate of the mean absorbed dose in the VOI and its associated uncertainty. This approach is demonstrated in a clinical example.
- Published
- 2018
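In schematic form (symbols chosen here; the guidance document defines its own notation), the dosimetry chain described in the abstract above is

\[
D = \tilde{A}\,\Delta, \qquad \tilde{A} = \int A(t)\,\mathrm{d}t, \qquad A = \frac{R}{c\,r},
\]

with D the mean absorbed dose in the VOI, Δ the VOI-size-dependent dose factor, Ã the cumulated activity (the area under the time-activity curve), R the count rate, c the calibration factor and r the recovery coefficient correcting for partial volume effects; the uncertainties of each input are propagated through this chain.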
25. On a Monte Carlo method for measurement uncertainty evaluation and its implementation
- Author
-
Maurice G Cox and Peter M. Harris
- Subjects
Mathematical optimization, Change of variables, Computer science, Calibration (statistics), Monte Carlo method, General Engineering, Coverage probability, Econometrics, Measurement uncertainty, Interval (mathematics)
- Abstract
The 'Guide to the Expression of Uncertainty in Measurement' (GUM) provides a framework and procedure for evaluating and expressing measurement uncertainty. The procedure has two main limitations. Firstly, the way a coverage interval is constructed to contain values of the measurand with a stipulated coverage probability is approximate. Secondly, insufficient guidance is given for the multivariate case in which there is more than one measurand. In order to address these limitations, two specific guidance documents (or 'Supplements to the GUM') on, respectively, a Monte Carlo method for uncertainty evaluation (Supplement 1) and extensions to any number of measurands (Supplement 2) have been published. A further document on developing and using measurement models in the context of uncertainty evaluation (Supplement 3) is also planned, but not considered in this paper. An overview is given of these guidance documents. In particular, a Monte Carlo method, which is the focus of Supplements 1 and 2, is described as a numerical approach to implement the 'propagation of distributions' formulated using the 'change of variables formula'. Although applying a Monte Carlo method is conceptually straightforward, some of the practical aspects of using the method are considered, such as the choice of the number of trials and ensuring an implementation is memory-efficient. General comments about the implications of using the method in measurement and calibration services, such as the need to achieve transferability of measurement results, are made.
- Published
- 2014
- Full Text
- View/download PDF
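One memory-efficiency tactic of the kind alluded to in the abstract above is to accumulate running statistics as trials are generated rather than storing all model values. A sketch using Welford's online algorithm (illustrative, not taken from the paper; note that a coverage interval still requires quantile information, which this summary alone does not supply):

```python
import random

def mc_summary(model, sample_inputs, trials=10**6):
    """Stream Monte Carlo trials, keeping only a running mean and
    variance (Welford's algorithm) instead of all model values."""
    mean, m2 = 0.0, 0.0
    for n in range(1, trials + 1):
        y = model(*sample_inputs())
        delta = y - mean
        mean += delta / n
        m2 += delta * (y - mean)
    return mean, (m2 / (trials - 1)) ** 0.5  # estimate, standard uncertainty

# Example: Y = X1 + X2, with X1 ~ N(1.0, 0.1) and X2 ~ N(2.0, 0.2)
est, u = mc_summary(lambda x1, x2: x1 + x2,
                    lambda: (random.gauss(1.0, 0.1), random.gauss(2.0, 0.2)),
                    trials=10**5)
print(est, u)  # close to 3.0 and sqrt(0.1**2 + 0.2**2) ~= 0.224
```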
26. Validating the applicability of the GUM procedure
- Author
-
Maurice G Cox and Peter M. Harris
- Subjects
Chebyshev polynomials, Computer science, Monte Carlo method, General Engineering, Univariate, Measurement uncertainty, Probability distribution, Data mining, Convolution
- Abstract
International Standard ISO/IEC 17025:2005 states that the degree of rigour needed in an estimation of uncertainty of measurement depends on several factors, including the requirements of the customer. So, generally, delivering a measurement result acceptable to a customer requires a degree of assurance of its quality. We investigate the extent to which such assurance can be given in applying the Guide to the expression of uncertainty in measurement (GUM). For many practical cases, a measurement result incorporating an evaluated uncertainty that is correct to one significant decimal digit would be acceptable. Accordingly, attention is given in published GUM Supplements and in the GUM revision process to quantifying the numerical precision of an uncertainty statement provided by following those documents. Any such quantification is naturally relative to the adequacy of the measurement model and the knowledge used about the input quantities in that model. For general univariate and multivariate measurement models, we emphasize the use of a Monte Carlo method, as recommended in GUM Supplements 1 and 2. One use of this method is as a benchmark in terms of which measurement results provided by the GUM can be assessed in any particular instance. For measurement models that are linear in the input quantities and for which the probability distributions for those quantities are independent, we indicate the use of other approaches, such as convolution methods based on the fast Fourier transform and Chebyshev polynomials, as benchmarks.
- Published
- 2014
- Full Text
- View/download PDF
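The convolution benchmark mentioned in the abstract above rests on the fact that, for a linear model with independent inputs, the PDF of a sum is the convolution of the input PDFs. A minimal numerical sketch (the grid and distributions are illustrative):

```python
import numpy as np

# Sample the input PDFs on a common grid: X1 ~ N(0, 1), X2 rectangular on [-1, 1]
dx = 0.01
x = np.arange(-8.0, 8.0, dx)
pdf1 = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
pdf2 = np.where(np.abs(x) <= 1.0, 0.5, 0.0)

# PDF of Y = X1 + X2 by discrete convolution, scaled by the grid spacing;
# mode="same" keeps the result centred on the original grid.
pdf_y = np.convolve(pdf1, pdf2, mode="same") * dx

print("area under pdf_y:", pdf_y.sum() * dx)  # should be close to 1
```

For large grids the same computation can be performed via the FFT (for example with scipy.signal.fftconvolve), which is what makes the approach competitive as a benchmark.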
27. Summarizing the output of a Monte Carlo method for uncertainty evaluation
- Author
-
Maurice G Cox, Alistair B. Forbes, C E Matthews, and Peter M. Harris
- Subjects
Mathematical optimization, Gaussian, Monte Carlo method, General Engineering, Coverage probability, Probability distribution, Measurement uncertainty, Sample (statistics), Interval (mathematics), Quantile function, Mathematics
- Abstract
The 'Guide to the Expression of Uncertainty in Measurement' (GUM) requires that the way a measurement uncertainty is expressed should be transferable. It should be possible to use directly the uncertainty evaluated for one measurement as a component in evaluating the uncertainty for another measurement that depends on the first. Although the method for uncertainty evaluation described in the GUM meets this requirement of transferability, it is less clear how this requirement is to be achieved when GUM Supplement 1 is applied. That Supplement uses a Monte Carlo method to provide a sample composed of many values drawn randomly from the probability distribution for the measurand. Such a sample does not constitute a convenient way of communicating knowledge about the measurand. In this paper consideration is given to obtaining a more compact summary of such a sample that preserves information about the measurand contained in the sample and can be used in a subsequent uncertainty evaluation. In particular, a coverage interval for the measurand that corresponds to a given coverage probability is often required. If the measurand is characterized by a probability distribution that is not close to being Gaussian, sufficient information has to be conveyed to enable such a coverage interval to be computed reliably. A quantile function in the form of an extended lambda distribution can provide adequate approximations in a number of cases. This distribution is defined by a fixed number of adjustable parameters determined, for example, by matching the moments of the distribution to those calculated in terms of the sample of values. In this paper, alternative flexible models for the quantile function and methods for determining a quantile function from a sample of values are proposed for meeting the above needs.
- Published
- 2014
- Full Text
- View/download PDF
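The extended lambda distribution mentioned in the abstract above is defined directly through its quantile function. In the common four-parameter Ramberg-Schmeiser form (whether the paper uses exactly this parameterization is not stated in the abstract):

\[
Q(p) = \lambda_1 + \frac{p^{\lambda_3} - (1-p)^{\lambda_4}}{\lambda_2}, \qquad 0 < p < 1,
\]

so once λ₁, …, λ₄ have been matched, for example, to the sample moments, a coverage interval at probability 1 − α follows directly as [Q(α/2), Q(1 − α/2)].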
28. Statistical reassessment of calibration and measurement capabilities based on key comparison results
- Author
-
Maurice G Cox and Katsuhiro Shirono
- Subjects
Bayes estimator, Calibration (statistics), General Engineering, Interval (mathematics), Confidence interval, Consistency (statistics), Statistics, Measurement uncertainty, Equivalence (measure theory), Mathematics
- Abstract
According to the CIPM Mutual Recognition Arrangement, a calibration and measurement capability (CMC) uncertainty for a laboratory offering a particular calibration or measurement service is normally expressed at a 95 % level of confidence. When laboratories' CMC claims are unsupported by the relevant key comparison (KC), one option is for modified values to be assigned to their CMC uncertainties. In many cases, when CMCs apply to a continuous interval of values such as mass concentration or wavelength, there is no directly relevant KC that can be used in support, since KCs are carried out only for selected discrete values of the quantity concerned. Under realistic assumptions, we develop a method that is applicable in such an instance and for which the reported CMC uncertainties are amplified so that they are underpinned by the results of the KC. The amplification factors depend on the laboratories' degrees of equivalence (DoEs) for these discrete values, and are judiciously adjusted to achieve consistency with the key comparison reference values. The method is based on the patterns in the individual behaviour of the DoEs of the participating laboratories for the discrete values, implying the presence of correlation associated with the DoE values. It applies when the weighted mean of some or all of the measured values reported by the participating laboratories in the KC is used to obtain the KC reference value.
- Published
- 2019
- Full Text
- View/download PDF
29. Uncertainty in measurement of protein circular dichroism spectra
- Author
-
Maurice G Cox, Jascindra Ravi, Paulina D. Rakowska, and Alex E. Knight
- Subjects
Circular dichroism, Computer science, General Engineering, Experimental data, Measurement uncertainty, Uncertainty budget, Circular dichroism spectra, Spectroscopy, Protein secondary structure
- Abstract
Circular dichroism (CD) spectroscopy of proteins is widely used to measure protein secondary structure, and to detect changes in secondary and higher orders of structure, for applications in research and in the quality control of protein products such as biopharmaceuticals. However, objective comparison of spectra is challenging because of a limited quantitative understanding of the sources of error in the measurement. Statistical methods can be used for comparisons, but do not provide a mechanism for dealing with systematic, as well as random, errors. Here we present a measurement model for CD spectroscopy of proteins, incorporating the principal sources of uncertainty, and use the model in conjunction with experimental data to derive an uncertainty budget. We show how this approach could be used in practice for the objective comparison of spectra, and discuss the benefits and limitations of this strategy.
- Published
- 2014
- Full Text
- View/download PDF
30. Uncertainty Analysis in the Calibration of an Emission Tomography System for Quantitative Imaging
- Author
-
Maurice G Cox and Marco D'Arienzo
- Subjects
Computer science, Calibration (statistics), Image processing, Radiation Dosage, General Biochemistry, Genetics and Molecular Biology, Image Processing, Computer-Assisted, Humans, Uncertainty analysis, Tomography, Emission-Computed, Single-Photon, General Immunology and Microbiology, Phantoms, Imaging, Applied Mathematics, Uncertainty, General Medicine, Models, Theoretical, Acquisition Duration, Modeling and Simulation, Calibration, Measurement uncertainty, Tomography, Algorithm, Research Article
- Abstract
It is generally acknowledged that calibration of the imaging system (be it a SPECT or a PET scanner) is one of the critical components associated with in vivo activity quantification in nuclear medicine. The system calibration is generally performed through the acquisition of a source with a known amount of radioactivity. The decay-corrected calibration factor is the "output" quantity in a measurement model for the process. This quantity is a function of a number of "input" variables, including total counts in the volume of interest (VOI), radionuclide activity concentration, source volume, acquisition duration, radionuclide half-life, and calibration time of the radionuclide. Uncertainties in the input variables propagate through the calculation to the "combined" uncertainty in the output quantity. In the present study, using the general formula given in the GUM (Guide to the Expression of Uncertainty in Measurement) for aggregating uncertainty components, we derive a practical relation to assess the combined standard uncertainty for the calibration factor of an emission tomography system. At a time of increasing need for accuracy in quantification studies, the proposed approach has the potential to be easily implemented in clinical practice.
- Published
- 2017
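From the input quantities listed in the abstract above, the calibration factor can be written schematically as follows (symbols are chosen here, with the decay correction shown for activity certified at an earlier reference time; the paper should be consulted for the exact measurement model):

\[
Q = \frac{C}{A_0\,e^{-\lambda\,\Delta t}\,V\,t_{\mathrm{acq}}}, \qquad \lambda = \frac{\ln 2}{T_{1/2}},
\]

with C the total counts in the VOI, A₀ the activity concentration at the reference (calibration) time, Δt the time from that reference time to acquisition, V the source volume and t_acq the acquisition duration; for such a multiplicative model the relative standard uncertainties of uncorrelated inputs combine in quadrature.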
31. The application of self-validation to wireless sensor networks
- Author
-
Mihaela Duta, Peter M. Harris, M A Collett, T J Esward, Maurice G Cox, and Manus Henry
- Subjects
Electromagnetics, Computer science, Applied Mathematics, Real-time computing, Sensor fusion, Soft sensor, Wireless, Data mining, Instrumentation, Engineering (miscellaneous), Wireless sensor network
- Abstract
Self-validation is a valuable tool for extending the operating range of sensing systems and making them more robust. Wireless sensor networks suffer from many limitations, meaning that their efficacy could be greatly improved by self-validation techniques. We present two independently developed data analysis techniques and demonstrate that they can be applied to a wireless sensor network. Using an acoustic ranging application, we demonstrate an improvement of more than ten-fold in the uncertainty of a single measurement where multiple sensor readings are appropriately combined. We also demonstrate that, of the two methods for determining a largest consistent subset, one is more rigorous in dealing with correlation, and the other more suited to time-series data.
- Published
- 2016
32. Revision of the ‘Guide to the Expression of Uncertainty in Measurement’
- Author
-
W Tyler Estler, Clemens Elster, René Dybkaer, Brynn Hibbert, C Michotte, Steve Sidney, Maurice G Cox, Adriaan M H van der Veen, Lars Nielsen, Willem Kool, Walter Bich, Leslie Pendrill, Hidetaka Imai, and Wolfgang Wöger
- Subjects
Engineering management, Operations research, Computer science, General Engineering, Joint Committee for Guides in Metrology, Measurement uncertainty
- Abstract
The Joint Committee for Guides in Metrology, Working Group 1, JCGM-WG1, is currently revising the 'Guide to the Expression of Uncertainty in Measurement'. In this communication, the motivation for undertaking such a revision is given and the main changes with respect to the current, 2008 edition are outlined.
- Published
- 2012
- Full Text
- View/download PDF
33. GUM anniversary issue
- Author
-
Maurice G Cox and Peter M. Harris
- Subjects
Computer science, General Engineering
- Published
- 2014
- Full Text
- View/download PDF
34. Unilateral degree of equivalence maximizing the power of test in an analysis of a regional metrology organization key comparison
- Author
-
Maurice G Cox and Katsuhiro Shirono
- Subjects
Units of measurement, Computer science, Statistics, Test statistic, Dummy data, Statistical model, Computer Science Applications, Metrology, Statistical hypothesis testing
- Abstract
Data analysis of a regional metrology organization (RMO) key comparison (KC) linked to an International Committee for Weights and Measures (CIPM) key comparison is discussed. One of the purposes of an RMO KC is to check the appropriateness of participants' results from a statistical point of view. For this purpose, a statistical test is useful. In this study, we discuss the derivation of a unilateral Degree of Equivalence (DoE) as a test statistic, and apply the test to dummy data. The unilateral DoEs given by the method in this study can be significantly different from those given by previous studies. These differences can be ascribed to the use of different statistical models. A Consultative Committee (CC) should choose an appropriate statistical model for the analysis in accordance with its purpose in implementing an RMO KC.
- Published
- 2018
- Full Text
- View/download PDF
35. A probabilistic approach to the analysis of measurement processes
- Author
-
Maurice G Cox, Giovanni Battista Rossi, Alistair B. Forbes, and Peter M. Harris
- Subjects
General Engineering, Probabilistic logic, Statistical model, Probability distribution, Conformance testing, Algorithm, Mathematics
- Abstract
We consider a probabilistic model of the measurement process, based on identifying two main sub-processes, named observation and restitution. Observation constitutes the transformations involved in producing the observable output. Restitution constitutes the determination of the measurand (the quantity measured) from the observable output, and includes data processing. After providing a probabilistic representation of the observation sub-process, we derive appropriate formulae for addressing restitution and describing the overall measurement process. The model allows the treatment in probabilistic terms of both the random and systematic effects that influence the measurement process, and may prove particularly useful in the formulation phase of uncertainty evaluation. We also discuss the different ways in which the measurand can be characterized by a probability distribution, and demonstrate the application of the approach to the analysis of risk in conformance testing.
- Published
- 2008
- Full Text
- View/download PDF
36. Uncertainty Evaluation and Validation of a Comparison Methodology to Perform In-house Calibration of Platinum Resistance Thermometers using a Monte Carlo Method
- Author
-
A. Silva Ribeiro, M. Pimenta de Castro, C. Oliveira Costa, J. Alves e Sousa, and Maurice G Cox
- Subjects
Traceability, Calibration (statistics), Computer science, Monte Carlo method, Reference data, Sensitivity (control systems), Condensed Matter Physics, Temperature measurement, Algorithm, Uncertainty analysis, Interpolation
- Abstract
The uncertainty required by laboratories and industry for temperature measurements based on the practical use of platinum resistance thermometers (PRTs) can commonly be achieved by calibration using temperature reference conditions and comparison methodologies (TCM) instead of the more accurate primary fixed-point (ITS-90) method. TCM is suitable for establishing internal traceability chains, such as connecting reference standards to transfer and working standards. The data resulting from the calibration method can be treated in a similar way to that prescribed for the ITS-90 interpolation procedure, to determine the calibration coefficients. When applying this approach, two major tasks are performed: (i) the evaluation of the uncertainty associated with the estimate of temperature (a requirement shared by the ITS-90 method), based on knowledge of the uncertainties associated with the temperature fixed points and the measured electrical resistances, and (ii) the validation of this practical comparison considering that the reference data are obtained using the ITS-90 method. The conventional approach, using the GUM uncertainty framework, requires approximations with unavoidable loss of accuracy and might not provide adequate uncertainty evaluation for the methods mentioned, because the conditions for its valid use, such as the near-linearity of the mathematical model relating temperature to electrical resistance, and the near-normality of the measurand (temperature), might not apply. Moreover, there can be some difficulty in applying the GUM uncertainty framework relating to the formation of sensitivity coefficients through partial derivatives for a model that, as here, is somewhat complicated and not readily expressible in an explicit form. Alternatively, uncertainty evaluation can be carried out by a Monte Carlo method (MCM), a numerical implementation of the propagation of distributions that is free from such conditions and straightforward to apply. In this paper, (a) the use of MCM to evaluate uncertainties relating to the ITS-90 interpolation procedure, and (b) a validation procedure to perform in-house calibration of PRTs by comparison are discussed. An example illustrating (a) and (b) is presented.
- Published
- 2008
- Full Text
- View/download PDF
37. Aggregating measurement data influenced by common effects
- Author
-
T J Esward, Maurice G Cox, Peter M. Harris, João Alencar de Sousa, and M A Collett
- Subjects
Noise, Consistency (statistics), System of measurement, Statistics, General Engineering, Calibration, Data analysis, Wireless sensor network, Measured quantity, Metrology, Mathematics
- Abstract
The requirement to aggregate measurement data, in the case where each measured value corresponds to nominally the same quantity, is important throughout metrology. Correlation associated with measured values is shown to arise in a number of measurement problems, with examples taken from the areas of interlaboratory comparisons, including key comparisons, the calibration of measuring systems and instruments, wireless sensor networks and prediction on the basis of different models. Consideration is given to the effect that correlation has on the determination of an aggregated estimate of the measured quantity, and on testing the consistency of measurement data. Approaches to quantifying the correlation associated with measured values are presented, and a formulation is given of the problem of determining the largest consistent subset of data having associated correlation. Results for three measurement problems are given, concerned with an interlaboratory comparison of noise in a coaxial line, the calibration of a thermometer and the analysis of data arising from a wireless sensor network.
- Published
- 2007
- Full Text
- View/download PDF
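When the measured values x = (x₁, …, x_N)ᵀ of the common quantity carry a covariance matrix V, whose off-diagonal elements encode the common-effect correlation discussed in the abstract above, the natural aggregated estimate is the generalized least-squares mean

\[
\hat{y} = \frac{\mathbf{1}^{\top}V^{-1}\mathbf{x}}{\mathbf{1}^{\top}V^{-1}\mathbf{1}}, \qquad u^{2}(\hat{y}) = \frac{1}{\mathbf{1}^{\top}V^{-1}\mathbf{1}},
\]

which reduces to the inverse-variance weighted mean when V is diagonal; the observed chi-squared value (x − ŷ1)ᵀV⁻¹(x − ŷ1) then provides the consistency test.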
38. The evaluation of key comparison data: determining the largest consistent subset
- Author
-
Maurice G Cox
- Subjects
Statistics, General Engineering, Standard uncertainty, Algorithm, Statistical hypothesis testing, Mathematics, Metrology
- Abstract
Suppose a single stable travelling standard is circulated around the national metrology institutes (NMIs) participating in a key comparison. Consider the set of data consisting of a measurement result, comprising a measured value and the associated standard uncertainty, provided independently by each such NMI. Each measured value is the corresponding NMI's best estimate of a single stipulated property of the standard. The weighted mean (WM) of the measured values can be formed, the weights being proportional to the reciprocals of the squared standard uncertainties. If this WM is consistent with the measured values according to a statistical test, it can be accepted as a key comparison reference value for the comparison. Otherwise, the WM of a largest consistent subset (LCS) can be determined. The LCS contains as many as possible of those results of participating NMIs that are consistent with the WM of that subset. An efficient approach for determining the LCS having smallest chi-squared value is described, and applied to length, temperature and ionizing radiation comparisons.
- Published
- 2007
- Full Text
- View/download PDF
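The construction sketched in the abstract above follows the standard chi-squared consistency procedure. For measured values x_i with standard uncertainties u(x_i),

\[
y = \frac{\sum_i x_i/u^2(x_i)}{\sum_i 1/u^2(x_i)}, \qquad u^2(y) = \left(\sum_i \frac{1}{u^2(x_i)}\right)^{-1}, \qquad \chi^2_{\mathrm{obs}} = \sum_i \frac{(x_i - y)^2}{u^2(x_i)},
\]

with consistency accepted when χ²_obs does not exceed a chosen percentile (say the 95th) of the chi-squared distribution with N − 1 degrees of freedom; the LCS is then the largest subset passing this test, chosen in the paper to have the smallest chi-squared value.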
39. A model for characterizing the frequency-dependent variation in sensitivity with temperature of underwater acoustic transducers from historical calibration data
- Author
-
Gary Hayman, Maurice G Cox, Peter M. Harris, G A Beamiss, G H Nam, I M Smith, T J Esward, and Stephen P. Robinson
- Subjects
Transducer, Time history, Applied Mathematics, Acoustics, Calibration, Environmental science, Measurement uncertainty, Sensitivity (control systems), Underwater acoustics, Instrumentation, Engineering (miscellaneous)
- Abstract
The performance of underwater electroacoustic transducers often depends on water temperature and, for accurate calibrations, it is necessary to take account of this influence on measurement data. Doing so is particularly important for open-water calibration facilities where the environmental conditions cannot be controlled and seasonal variations in temperature can contribute significant measurement uncertainty. This paper describes the characterization of the sensitivity of underwater electroacoustic transducers in terms of their variation with water temperature. A model containing adjustable parameters is developed for providing frequency-dependent temperature coefficients whose use enables measurement data to be corrected for temperature. Estimates of these coefficients are determined by applying the model to the time history of measurement data obtained over a period of several years. The influence of seasonal temperature variation can then be separated from the slow temporal drift in the transducer sensitivity. The uncertainties associated with the corrected sensitivity values are evaluated.
- Published
- 2007
- Full Text
- View/download PDF
40. Gamma camera calibration and validation for quantitative SPECT imaging with (177)Lu
- Author
-
Aldo Fazio, M. Cazzato, Lidia Strigari, M.L. Cozzella, Lena Johansson, Giuseppe Iaccarino, M D'Andrea, Maurice G Cox, Andrew Fenwick, Marco D'Arienzo, P. De Felice, and Sara Ungania
- Subjects
Point source, Quantitative imaging, Lutetium, Radionuclide therapy, Imaging phantom, Molecular radiotherapy, Optics, SPECT imaging, Image Interpretation, Computer-Assisted, Calibration, Humans, Gamma Cameras, Gamma camera, Physics, Radioisotopes, Tomography, Emission-Computed, Single-Photon, Radiation, Tomographic reconstruction, Phantoms, Imaging, Correction for attenuation
- Abstract
Over recent years 177Lu has received considerable attention from the clinical nuclear medicine community thanks to its wide range of applications in molecular radiotherapy, especially in peptide-receptor radionuclide therapy (PRRT). In addition to short-range beta particles, 177Lu emits low-energy gamma radiation at 113 keV and 208 keV that allows gamma camera quantitative imaging. Although quantitative cancer imaging in molecular radiotherapy has been proven to be a key instrument for the assessment of therapeutic response, at present no generally accepted clinical quantitative imaging protocol exists and absolute quantification studies are usually based on individual initiatives. The aim of this work was to develop and evaluate an approach to gamma camera calibration for absolute quantification in tomographic imaging with 177Lu. We assessed the gamma camera calibration factors for a Philips IRIX and a Philips AXIS gamma camera system using various reference geometries, both in air and in water. Images were corrected for the major effects that contribute to image degradation, i.e. attenuation, scatter and dead time. We validated our method in non-reference geometry using an anthropomorphic torso phantom provided with the liver cavity uniformly filled with 177LuCl3. Our results showed that calibration factors depend on the particular reference condition. In general, acquisitions performed with the IRIX gamma camera provided good results at 208 keV, with agreement within 5% for all geometries. The use of a Jaszczak 16 mL hollow sphere in water provided calibration factors capable of recovering the activity in anthropomorphic geometry within 1% for the 208 keV peak, for both gamma cameras. The point source provided the poorest results, most likely because scatter and attenuation correction are not incorporated in the calibration factor. However, for both gamma cameras all geometries provided calibration factors capable of recovering the activity in anthropomorphic geometry within about 10% (range -11.6% to +7.3%) for acquisitions at the 208 keV photopeak. As a general rule, scatter and attenuation play a much larger role at 113 keV compared to 208 keV and are likely to hinder accurate absolute quantification. Acquisitions of only the 177Lu main photopeak (208 keV) are therefore recommended in clinical practice. Preliminary results suggest that the gamma camera calibration factor can be assessed with a standard uncertainty below (or of the order of) 3% if activity is determined with equipment traceable to primary standards, accurate volume measurements are made, and an appropriate chemical carrier is used to allow a homogeneous and stable solution to be used during the measurements.
- Published
- 2015
41. An approach based on the SIR measurement model for determining the ionization chamber efficiency curves, and a study of 65Zn and 201Tl photon emission intensities
- Author
-
C Michotte, A.K. Pearce, Maurice G Cox, and J.-J. Gostely
- Subjects
Radiation, Photon emission, Covariance matrix, Reference values, Ionization chamber, Monte Carlo method, Analytical chemistry, Nuclear data, Exponential function, Mathematics, Computational physics
- Abstract
The measurement model used to determine ionization chamber efficiency curves accounts from the outset for impurity corrections and beta spectrum shapes. The curves are represented by exponentials of polynomials whose coefficients are adjusted using non-linear least-squares minimization. The curves are validated by comparing with SIR key comparison reference values (KCRVs) and other published curves. The associated covariance matrix is also evaluated. Deviations from model predictions for 65Zn and 201Tl using recommended nuclear data are studied.
- Published
- 2006
- Full Text
- View/download PDF
42. Evolution of the ‘Guide to the Expression of Uncertainty in Measurement’
- Author
-
Maurice G Cox, Walter Bich, and Peter M. Harris
- Subjects
Engineering management ,Vocabulary ,Promotion (rank) ,Operations research ,Computer science ,media_common.quotation_subject ,General Engineering ,Joint Committee for Guides in Metrology ,Measurement uncertainty ,Joint (building) ,Expression (mathematics) ,media_common ,Metrology - Abstract
A number of Joint Committees of the Bureau International des Poids et Mesures and other international organizations carry out particular tasks of common interest. The Joint Committee for Guides in Metrology (JCGM) has amongst its tasks the promotion of the 'Guide to the Expression of Uncertainty in Measurement' (GUM), the preparation of further documents for its broad application, and revision and promotion of the use of the 'International Vocabulary of Basic and General Terms in Metrology'. This paper summarizes the documents relating to the GUM planned by JCGM.
- Published
- 2006
- Full Text
- View/download PDF
43. The use of a Monte Carlo method for evaluating uncertainty and expanded uncertainty
- Author
-
Maurice G Cox and Bernd R. L. Siebert
- Subjects
Mathematical optimization ,Markov chain ,Computer science ,Monte Carlo method ,Bayesian probability ,General Engineering ,Measurement uncertainty ,Sensitivity analysis ,Probability density function ,Expression (mathematics) ,Uncertainty analysis - Abstract
The Guide to the Expression of Uncertainty in Measurement (GUM) is the internationally accepted master document for the evaluation of uncertainty. It contains a procedure that is suitable for many, but not all, uncertainty evaluation problems met in practice. This procedure constitutes an approximation to the general solution of the Markov formula, which infers the probability density function (PDF) for the output quantities (measurands) from the model of the measurement and the PDFs for the input quantities. This paper shows that a Monte Carlo method is an effective and versatile tool for determining the PDF for the measurands. This method provides a consistent Bayesian approach to the evaluation of uncertainty. Although in principle straightforward, some care is required in representing and validating the results obtained using the method. The paper provides guidance on optimizing the approach, identifies some pitfalls and indicates means for validating the results.
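As a minimal illustration of the method (a generic two-input model with assumed PDFs, not an example from the paper), in R:

    # Propagation of distributions by Monte Carlo for the model Y = X1 / X2
    M  <- 1e6
    x1 <- rnorm(M, mean = 10.0, sd = 0.2)   # PDF assigned to input X1 (assumed)
    x2 <- runif(M, min = 1.9, max = 2.1)    # PDF assigned to input X2 (assumed)
    y  <- x1 / x2                           # draws from the PDF for the measurand
    mean(y)                                 # estimate of the measurand
    sd(y)                                   # associated standard uncertainty
    quantile(y, c(0.025, 0.975))            # probabilistically symmetric 95 % interval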
- Published
- 2006
- Full Text
- View/download PDF
44. The use of a Monte Carlo method for uncertainty calculation, with an application to the measurement of neutron ambient dose equivalent rate
- Author
-
Peter M. Harris, Maurice G Cox, Gyeonghee Nam, and David Thomas
- Subjects
Computer science ,Monte Carlo method ,Hybrid Monte Carlo ,Nuclear Reactors ,Statistics ,Radiology, Nuclear Medicine and imaging ,Quasi-Monte Carlo method ,Statistical physics ,Radiometry ,Neutrons ,Propagation of uncertainty ,Models, Statistical ,Radiation ,Radiological and Ultrasound Technology ,Radiotherapy Planning, Computer-Assisted ,Uncertainty ,Public Health, Environmental and Occupational Health ,Radiotherapy Dosage ,General Medicine ,Models, Theoretical ,Europe ,Research Design ,Calibration ,Dynamic Monte Carlo method ,Anisotropy ,Monte Carlo integration ,Monte Carlo method in statistical physics ,Monte Carlo Method ,Software ,Monte Carlo molecular modeling - Abstract
This paper is concerned with the use of a Monte Carlo method for uncertainty calculation as an implementation of the propagation of distributions. It reviews the basic principles of the propagation of distributions and the numerical aspects of a Monte Carlo implementation. It also discusses the possible advantages, in some circumstances, of the propagation of distributions over the GUM uncertainty framework, and how the results obtained in any particular instance can be compared with those provided by that framework. To illustrate these various aspects, an application to the measurement of neutron ambient dose equivalent rate is given. A key consideration in this application is the manner in which the dominant source of uncertainty, namely that associated with the field-specific correction factor, is treated. The information available concerning this factor consists of the correction factors for a set of fields of the same type as that in which the measurement is made. This information is encoded as a probability density function (PDF) for the correction factor, which constitutes an input to both methods of evaluation.
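One simple way to encode such information as a PDF is to resample the available correction factors; the following R sketch (invented numbers, and only one of several possible encodings) illustrates the idea:

    # Correction factors observed for fields of the same type (placeholders)
    f_fields <- c(0.92, 0.97, 1.01, 1.05, 1.10)
    M <- 1e5
    f <- sample(f_fields, M, replace = TRUE)   # empirical PDF for the correction factor
    r <- rnorm(M, mean = 25, sd = 1.5)         # instrument reading PDF (assumed), uSv/h
    H <- f * r                                 # neutron ambient dose equivalent rate
    mean(H); sd(H)                             # estimate and standard uncertainty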
- Published
- 2006
- Full Text
- View/download PDF
45. The CCPR K1-a key comparison of spectral irradiance from 250 nm to 2500 nm: measurements, analysis and results
- Author
-
Emma R. Woolliams, Maurice G Cox, Nigel Fox, Peter M. Harris, and N J Harrison
- Subjects
Halogen lamp ,law ,Physical laboratory ,General Engineering ,Key (cryptography) ,Irradiance ,Environmental science ,Mutual recognition ,Metrology ,law.invention ,Remote sensing - Abstract
The CCPR K1-a key comparison of spectral irradiance (from 250 nm to 2500 nm) was carried out by 13 participating national metrology institutes to meet the requirements of the Mutual Recognition Arrangement. Because of the fragile nature of the tungsten halogen lamps used as comparison artefacts, the comparison was arranged as a star comparison with many more lamps than participants. The National Physical Laboratory (UK) piloted the comparison and, by measuring all lamps, provided a link between participants' measurements. The comparison was analysed using a model-based method, which ensured that all participants, including the pilot, were treated equitably. This paper presents the comparison philosophy, methodology, analysis and results.
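The linking role of the pilot can be caricatured in a toy R fragment (invented values; the comparison itself used a full model-based least-squares analysis, not this simple differencing):

    # Pilot measured every lamp; each participant measured only its own lamps
    pilot <- c(l1 = 100.2, l2 = 99.8, l3 = 100.5, l4 = 100.0)
    partA <- c(l1 = 100.6, l2 = 100.1)
    partB <- c(l3 = 100.1, l4 = 99.7)
    # Average difference from the pilot over the lamps each participant measured
    d_A <- mean(partA - pilot[names(partA)])
    d_B <- mean(partB - pilot[names(partB)])
    d_A - d_B   # participants compared through the pilot link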
- Published
- 2006
- Full Text
- View/download PDF
46. Measurement uncertainty and traceability
- Author
-
Maurice G Cox and Peter M. Harris
- Subjects
Traceability ,Calibration curve ,Computer science ,Applied Mathematics ,Econometrics ,Measurement uncertainty ,Sensitivity analysis ,Instrumentation ,Engineering (miscellaneous) ,Equivalence (measure theory) ,Uncertainty analysis - Abstract
Obtaining confidence in a measured value requires a quantitative statement of its quality, which in turn necessitates the evaluation of the uncertainty associated with the value. The basis for the value and the associated uncertainty is traceability of measurement, involving the relationship of relevant quantities to national or international standards through an unbroken chain of measurement comparisons. Each comparison involves calibration of a standard at one level in the chain using a standard at a higher level. Global economy considerations mean that this basis also requires the national measurement institutes to carry out comparative assessment of the degree of equivalence of national standards through their participation in key comparisons. The evaluation of measurement uncertainty is founded on the use of measurement models for each stage of the chain and, at the highest level, for interrelating national standards. Basic aspects of uncertainty evaluation are covered in this paper, and forms for the above types of model are considered, with attention given to least squares as a basis for calibration curves (and certain other types of calibration) and for key comparison data evaluation.
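For the calibration-curve case, a minimal R sketch of least-squares calibration, including an uncertainty for a value predicted from the fitted curve (illustrative data only):

    stimulus <- c(0, 5, 10, 15, 20)              # values of the reference standards
    response <- c(0.02, 1.01, 2.05, 2.98, 4.03)  # observed instrument responses
    cal <- lm(response ~ stimulus)               # straight-line calibration curve
    vcov(cal)                                    # covariance of the fitted parameters
    # Predicted response at a new stimulus value, with its standard error
    predict(cal, newdata = data.frame(stimulus = 12), se.fit = TRUE)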
- Published
- 2006
- Full Text
- View/download PDF
47. An Outline of Supplement 1 to the Guide to the Expression of Uncertainty in Measurement on Numerical Methods for the Propagation of Distributions
- Author
-
Maurice G Cox and Peter M. Harris
- Subjects
Mathematical optimization ,Computer science ,Generalization ,Applied Mathematics ,Numerical analysis ,Monte Carlo method ,Measurement uncertainty ,Sensitivity analysis ,Instrumentation ,Uncertainty analysis ,Expression (mathematics) - Abstract
Supplement 1 to the Guide to the Expression of Uncertainty in Measurement (GUM), concerned with numerical methods for the propagation of distributions, embodies a generalization of the uncertainty framework of the GUM. This paper presents a number of aspects of that supplement.
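One feature of the propagation of distributions is that a coverage interval can be read directly from the Monte Carlo sample, including the shortest such interval when the output PDF is asymmetric. A minimal R sketch, using a placeholder output distribution:

    M  <- 1e6
    y  <- rlnorm(M, meanlog = 0, sdlog = 0.5)  # assumed (asymmetric) measurand sample
    p  <- 0.95
    ys <- sort(y)
    q  <- floor(p * M)
    widths <- ys[(q + 1):M] - ys[1:(M - q)]    # all intervals containing 95 % of the draws
    i  <- which.min(widths)                    # index of the shortest one
    c(ys[i], ys[i + q])                        # shortest 95 % coverage interval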
- Published
- 2005
- Full Text
- View/download PDF
48. Harmonisation of coupled calibration curves to reduce correlated effects in the analysis of natural gas by gas chromatography
- Author
-
Maurice G Cox, Sarantis Kamvissis, Martin J. T. Milton, and Gergely Vargha
- Subjects
Fossil Fuels ,Chromatography, Gas ,Chromatography ,Calibration curve ,business.industry ,Chemistry ,Calibration (statistics) ,Thermal conductivity detector ,Organic Chemistry ,General Medicine ,Residual ,Biochemistry ,Analytical Chemistry ,law.invention ,Standard curve ,Natural gas ,law ,Calibration ,Flame ionization detector ,Gas chromatography ,business - Abstract
Quantitative analysis of natural gas depends on the calibration of a gas chromatograph with certified gas mixtures and the determination of a response relationship for each species by regression analysis. The uncertainty in this calibration is dominated by variations in the amount of the sample used for each analysis, which are strongly correlated for all species measured in the same run. The "harmonisation" method described here minimises the influence of these correlations on the calculated calibration curves and leads to a reduction by a factor of between 2 and 5 in the root-mean-square residual deviations from the fitted curve. Consequently, it removes the requirement for each run in the calibration procedure to be carried out under the same external conditions, and opens the possibility that new data, measured under different environmental or instrumental conditions, can be appended to an existing calibration database.
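The underlying idea can be sketched in a few lines of R: an unknown sample-amount factor scales every species' response within a run, so estimating a per-run factor and dividing it out removes the correlated effect. This sketch uses invented responses and is not the authors' algorithm:

    # Rows are runs, columns are species; run 2 had slightly less sample injected
    resp <- matrix(c(1.02, 2.04, 3.00,
                     0.97, 1.94, 2.88,
                     1.05, 2.10, 3.12), nrow = 3, byrow = TRUE)
    # Relative sample amount per run: mean ratio of each run to the species means
    run_factor <- rowMeans(sweep(resp, 2, colMeans(resp), "/"))
    harmonised <- sweep(resp, 1, run_factor, "/")   # correlated run effect removed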
- Published
- 2005
- Full Text
- View/download PDF
49. The use of a mixture of probability distributions in temperature interlaboratory comparisons
- Author
-
Maurice G Cox, Patrizia Ciarlini, Franco Pavese, and Giuseppe Regoliosi
- Subjects
education.field_of_study ,Statistics ,Population ,General Engineering ,Probabilistic logic ,Mixture distribution ,Probability distribution ,Probability density function ,Function (mathematics) ,Expectation value ,education ,Value (mathematics) ,Mathematics - Abstract
Several studies highlight the need for appropriate statistical and probabilistic tools to analyse the data provided by the participants in an interlaboratory comparison. In some temperature comparisons, where the measurand is a physical state, independent realizations of the same physical state are acquired at each participating institute; these realizations should be considered as belonging to a single super-population. This paper introduces the use of a probabilistic tool, a mixture of probability distributions, to represent the overall population in such a temperature comparison. This super-population is defined by combining the local populations in given proportions. The mixture density function identifies the total data variability, and the key comparison reference value has a natural definition as the expectation value of this probability density.
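For Gaussian components, the expectation and variance of such a mixture have simple closed forms, sketched in R below (equal proportions and invented values assumed):

    mu <- c(0.010, 0.014, 0.008)   # institutes' realised values (placeholders)
    s  <- c(0.002, 0.003, 0.002)   # associated standard uncertainties
    w  <- rep(1/3, 3)              # mixture proportions
    kcrv    <- sum(w * mu)                      # expectation of the mixture density
    var_mix <- sum(w * (s^2 + mu^2)) - kcrv^2   # variance of the mixture
    sqrt(var_mix)                               # standard deviation of the mixture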
- Published
- 2004
- Full Text
- View/download PDF
50. Technical Aspects of Guidelines for the Evaluation of Key Comparison Data
- Author
-
Maurice G Cox and Peter M. Harris
- Subjects
Propagation of uncertainty ,Applied Mathematics ,Gaussian ,Robust statistics ,Estimator ,Probability density function ,computer.software_genre ,symbols.namesake ,Statistics ,symbols ,Measurement uncertainty ,Data mining ,Instrumentation ,Equivalence (measure theory) ,computer ,Weighted arithmetic mean ,Mathematics - Abstract
Some of the technical aspects of guidelines for key comparison data evaluation prepared by the BIPM Director's Advisory Group on Uncertainties are considered. These guidelines relate to key comparisons based on the measurement of a travelling standard having good short-term stability and stability during transport, in cases where the institutes' measurements are realised independently. They include two procedures for forming a key comparison reference value (KCRV) and the associated uncertainty, and the consequent degrees of equivalence (including the associated uncertainties), in accordance with the Mutual Recognition Arrangement. The basis of the procedures is (a) the representation of the information provided by the participating institutes as probability density functions (PDFs), and (b) the estimator (model) used as the KCRV. The calculation of the KCRV, the associated uncertainty and the degrees of equivalence is then undertaken in accordance with the law of propagation of uncertainty, as described in the Guide to the Expression of Uncertainty in Measurement (GUM), or the propagation of distributions, a generalisation of the law of propagation of uncertainty covered in the supplemental guide to the GUM. Attention is paid to the choice of model, relating it to the conditions that apply to the key comparison. The first procedure is intended for cases where, for each institute, a Gaussian distribution is assigned to the measurand of which the institute's measurement is an estimate. The weighted mean is used as the model in this case. A consistency test is included to determine whether the model is consistent with the data; if the test is satisfied, the weighted mean is accepted as the KCRV. The second procedure is used in circumstances where (a) not all the PDFs assigned are Gaussian, or (b) the first procedure had previously been applied, the consistency test was not satisfied, and there was no opportunity to correct all institutes' data regarded as discrepant. The model in this case is chosen to be a more robust estimator, such as the median or another estimator considered appropriate for the particular comparison.
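The first procedure can be sketched in a few lines of R (invented data; the consistency test here compares the observed chi-squared with the 95th percentile for N - 1 degrees of freedom):

    x <- c(9.99, 10.02, 10.05, 9.97)   # institutes' measured values (placeholders)
    u <- c(0.02, 0.03, 0.04, 0.02)     # associated standard uncertainties
    w <- 1 / u^2
    kcrv   <- sum(w * x) / sum(w)      # weighted mean as candidate KCRV
    u_kcrv <- 1 / sqrt(sum(w))         # standard uncertainty of the KCRV
    chi2   <- sum((x - kcrv)^2 / u^2)  # observed chi-squared
    chi2 <= qchisq(0.95, df = length(x) - 1)   # TRUE: accept the weighted mean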
- Published
- 2004
- Full Text
- View/download PDF