124 results for "monte carlo analysis"
Search Results
2. A cost-benefit analysis of essential oil extraction technology of aromatic plants in Pothwar Punjab, Pakistan
- Author: Anjum, Fouzia, Muhammad, Sher, Siddiqui, Badar Naseem, Anjum, Muhammad Shahbaz, and Yaseen, Muhammad
- Published: 2024
3. Analyzing the effects of data splitting and covariate shift on machine learning based streamflow prediction in ungauged basins
- Author: Li, Pin-Ching, Dey, Sayan, and Merwade, Venkatesh
- Published: 2025
4. Transformer-based anomaly detection in P-LEO constellations: A dynamic graph approach.
- Author: Indaco, Manuel and Guzzetti, Davide
- Subjects: ANOMALY detection (Computer security); TRANSFORMER models; MONTE Carlo method; CONSTELLATIONS
- Abstract:
Successful management of large space systems, such as the most recent Proliferated Low Earth Orbit (P-LEO) constellations, demands an increased level of autonomy and irregular behavior detection capability. Existing techniques for satellite network management rely on monitoring satellites individually, potentially failing to capture events shared across multiple units. In this work, we propose a pipeline for identifying anomalous connections in proliferated satellite networks. Our pipeline relies on representing a satellite constellation as a dynamic graph. Dynamic graphs can be conveniently exploited to represent P-LEO networks, as they capture meaningful structural and temporal correlations. This spatial and temporal information is first projected into an embedding space; next, this encoded representation is processed by a transformer-encoder network for the anomaly detection task. We conduct extensive analyses of the main problem parameters, including the temporal horizon, the structure of the graph, and the size of the constellation. To assess the general performance of the method, we perform a Monte Carlo analysis on 960 ground node pair connections, providing empirical evidence of the algorithm's generalization capability. The obtained results encourage the extension of the method to more complex, realistic modeling and kindred applications such as the on-edge detection problem. • Topology changes of P-LEO satellite networks are explored to detect anomalous links. • Spatial–temporal features of the network are leveraged to boost performance. • Evidence of algorithm generalization capability is collected via randomized simulations. [ABSTRACT FROM AUTHOR]
- Published: 2024
5. Neutronic calculations for preliminary core design of SCW-SMR.
- Author: Antók, Csenge, Czifrus, Szabolcs, and Giusti, Valerio
- Subjects: MONTE Carlo method; SUPERCRITICAL water; NEUTRON capture; CONSTRUCTION materials; TEMPERATURE effect
- Abstract:
• An SCW-SMR core model is developed based on Serpent 2 simulations. • Assembly structures and materials are studied to decrease neutron absorption. • Effects of moderator temperature and assembly gap size changes are examined. • Enrichment maps are tested to increase reserve reactivity and shape power profiles. • The resulting core model has an adequate burnup cycle length and power profiles. The Serpent 2 particle transport code is used to develop the pre-conceptual neutronic design of the Supercritical Water Cooled SMR (SCW-SMR). After initial criticality and burnup calculations, the starting core design of Schulenberg and Otic (2021) is improved using predetermined criteria, such as burnup cycle length and power distribution, while also considering operational safety. In order to achieve higher reserve reactivity, several modifications are considered, including the introduction of alternative structural materials and fuel assembly wall types, moderation improvement by adjustment of the moderator temperature and fuel assembly gap width, and selection of a suitable enrichment map. As a result of the introduced modifications, the burnup cycle length is increased to 26 months and an acceptable core power distribution is achieved. The improved core design can be used for further investigations, such as coupled calculations using neutronic and thermal–hydraulic codes and examinations targeting reactivity control during burnup. [ABSTRACT FROM AUTHOR]
- Published: 2024
6. Underpinnings of reservoir and techno-economic analysis for Himalayan and Son-Narmada-Tapti geothermal sites of India.
- Author: Kiran, Raj, Upadhyay, Rajeev, Rajak, Vinay Kumar, Kumar, Ashutosh, and Datta Gupta, Saurabh
- Subjects: CARBON dioxide mitigation; MONTE Carlo method; CARBON emissions; GEOTHERMAL resources; COAL-fired power plants
- Abstract:
The high capital cost and risks associated with the extraction process of geothermal energy are key factors in the project economics. This paper presents a comprehensive techno-economic study based on reservoir heat and flow simulations, along with the levelized cost of energy (LCOE), for two geothermal sites in India (Himalayan and Son-Narmada-Tapti, i.e. SONATA). Dual-porosity reservoir simulation models were constructed and used to obtain the cumulative and dynamic thermal energy potentials of these sites. The simulation studies suggest that both sites are equally potent for geothermal energy extraction. Comparatively, the proven cumulative energy potentials for the Himalayan and SONATA sites are found to be 4.92–5.10 and 3.62–3.7 TWh, respectively, for 30 years of production at a rate of 92 kg/s. This difference can be attributed mainly to the higher temperatures and deeper sites in the Himalayan Province. Sensitivity analyses were conducted to assess the impact of various parameters, including porosity, permeability, fracture spacing, and thermal conductivity. The results suggest that porosity and permeability are the key enablers of efficient energy production. The economic feasibility study suggests that the LCOE is high for these sites ($148/MWh for the Himalayan and $137/MWh for the SONATA). The two sites have a CO2 mitigation potential in the range of 3.12–11.36 megatons from a single doublet well configuration, considering the CO2 emissions from conventional coal-fired thermal power plants. Overall, SONATA may be the better prospect considering the logistics and operational challenges along with reservoir heat potential. [ABSTRACT FROM AUTHOR]
- Published: 2024
7. Dust impact and attitude analysis for JAXA's probe on the Comet Interceptor mission.
- Author: Machuca, P., Ozaki, N., Sánchez, J.P., and Felicetti, L.
- Subjects: MONTE Carlo method; DUST; COMETS; SOLAR cells; ANGULAR velocity; GRANULAR flow; MINERAL dusts
- Abstract:
Comet Interceptor (Comet-I), to be launched in 2029 as a piggyback to ESA's ARIEL mission, aims to perform the first fly-by of a pristine long-period comet. The mission will be composed of a main spacecraft, SC A (ESA), and two small probes to be released prior to the fly-by, SC B1 (JAXA) and SC B2 (ESA). This work analyzes the attitude performance of JAXA's 24U-sized spacecraft through the dust environment of a yet-to-be-discovered target comet. The main challenges to the mission are associated with the high levels of uncertainty and the extremity of the fly-by conditions: a highly active dust environment, uncertain fly-by altitude (750 ± 250 km (1σ), as of 2021), and large and unknown relative fly-by speeds (15–70 km/s). A Monte Carlo analysis is performed to characterize the effect of dust particle impacts on the attitude of SC B1, and to evaluate the likelihood of satisfying the pointing and angular velocity requirements of the science camera. Analysis initially shows that particles of mass 10⁻⁸–10⁻⁵ kg represent the most relevant source of perturbation, due to their transferred angular momentum and likelihood of being encountered, and saturation of the reaction wheels is shown to be unlikely given the large fly-by speeds and short fly-by durations (20 min–2 h). More detailed analysis ultimately suggests a probability larger than 90% of satisfying the science camera requirements despite the extreme, uncertain fly-by conditions, dust environment, and component inaccuracies (star tracker, gyroscopes, and reaction wheels). Results also show that upgrading the reaction wheel implemented along the camera line-of-sight improves attitude performance only marginally, while proper alignment of the solar arrays parallel to the incoming flow of particles is shown to be essential to maximize the probability of success. [ABSTRACT FROM AUTHOR]
- Published: 2022
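To make the flavor of such an analysis concrete, here is a minimal Monte Carlo sketch of dust-impact attitude risk. Everything in it — the impact-rate, grain-mass and lever-arm distributions, and the wheel momentum budget — is an illustrative assumption, not taken from the paper's far more detailed models.

```python
import numpy as np

rng = np.random.default_rng(42)
N_TRIALS = 10_000
WHEEL_CAPACITY = 0.1   # total reaction-wheel momentum budget [N m s] (hypothetical)

successes = 0
for _ in range(N_TRIALS):
    flyby_speed = rng.uniform(15e3, 70e3)            # relative fly-by speed [m/s]
    n_impacts = rng.poisson(20)                      # number of significant grain hits
    masses = 10.0 ** rng.uniform(-8, -5, n_impacts)  # log-uniform grain masses [kg]
    lever_arms = rng.uniform(0.0, 0.15, n_impacts)   # impact offset from CoM [m]
    dL = masses * flyby_speed * lever_arms           # per-impact angular momentum
    if dL.sum() < WHEEL_CAPACITY:                    # wheels can absorb the total
        successes += 1

print(f"P(wheels absorb all impacts) ≈ {successes / N_TRIALS:.1%}")
```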
8. Residential consumer enrollment in demand response: An agent based approach.
- Author: Sridhar, Araavind, Honkapuro, Samuli, Ruiz, Fredy, Stoklasa, Jan, Annala, Salla, and Wolff, Annika
- Subjects: BATTERY storage plants; MONTE Carlo method; CONSUMER behavior; CLEAN energy; SCHOOL enrollment
- Abstract:
Residential consumers play an important role in the sustainable transition of the energy system by leveraging their household loads for demand response (DR). This paper analyzes the enrollment rates of residential consumers in DR through an agent-based model (ABM). Both economic and noneconomic (social/behavioral) parameters that influence consumer enrollment in DR are considered. An energy management model, a home energy management system (HEMS), is used to identify the potential economic savings for consumers enrolling in DR. Consumers are randomly assigned to different neighborhoods and have different social relationships (e.g., friends, neighbors), which, in turn, influence their decision-making in the ABM. The results of this paper highlight the inverse relationship of expected annual savings, and the direct relationship of the share of consumers owning electric vehicles (EVs), photovoltaics (PV), and battery energy storage systems (BESSs), with the DR enrollment rates. Based on the enrollment rates, the maximum energy savings were obtained in April and the minimum during the last quarter of the year. Monte Carlo analysis is employed to handle the randomness associated with the different variable selections, which yields a ±10% variation in the consumer enrollment rate in DR. The results of this study have practical implications for energy flexibility in the residential sector. • Introduced an agent-based model to simulate residential consumer enrollment in DR. • Utilized economic and noneconomic factors to identify consumer decision-making to enroll. • Introduced different enrollment contract types in DR for consumers. • Analyzed different parameters influencing consumer enrollment rates. [ABSTRACT FROM AUTHOR]
- Published: 2024
9. The model and characteristics of polarized light transmission applicable to polydispersity particle underwater environment.
- Author: Dong, Chao, Fu, Qiang, Wang, Kaikai, Zong, Fangxing, Li, Mingxuan, He, Qingyi, Liu, Xuanwei, Liu, Jianhua, and Zhu, Yong
- Subjects: STATISTICAL sampling; LIGHT transmission; BODIES of water; DATA transmission systems; SIMULATION methods & models
- Abstract:
• An improved Monte Carlo simulation model is established that accounts for the particle size of the underwater transmission medium. • The improved Monte Carlo simulation method, which fits phase functions through random sampling, provides a more comprehensive and accurate model of transport in complex water. • The improved simulation model combines the scattering angle and the scattering phase function, making it more consistent with the actual probability model of complex water environments. • The influence of different variables on the polarization-maintaining performance of polarized light is obtained. The polarization information of underwater objects can be affected by the absorption and scattering of water particles, making it challenging to detect underwater targets accurately. Therefore, it is essential to analyze the characteristics of polarization transmission in water. Traditional Monte Carlo simulations have yet to treat particle size in water effectively, which makes them unsuitable for underwater environments containing polydisperse particles with significant particle size differences. In this paper, a new simulation model using a Monte Carlo method with random-sampling-fitted phase functions is proposed, which makes the transmission characteristics of water particles more consistent with the actual probability model of complex water bodies. The experimental results demonstrate that the proposed Monte Carlo method produces polarization closer to experimental results. The larger the particle size of water suspensions, the greater the change in the degree of polarization of polarized light. The polarization-maintaining characteristics of circularly polarized light are better than those of linearly polarized light, providing data support for the transmission and detection of underwater light. [ABSTRACT FROM AUTHOR]
- Published: 2024
10. Recognizing the role of uncertainties in the transition to renewable hydrogen.
- Author: Fazeli, Reza, Beck, Fiona J., and Stocks, Matt
- Subjects: HYDROGEN as fuel; HYDROGEN production; HYDROGEN; CARBON pricing; GREENHOUSE gas mitigation; HYDROGEN analysis; FOSSIL fuels
- Abstract:
Achieving the goal of net zero emissions targeted by many governments and businesses around the world will require an economical zero-emissions fuel, such as hydrogen. Currently, the high production cost of zero-emission 'renewable' hydrogen, produced from electrolysis powered by renewable electricity, is hindering its adoption. In this paper, we examine the role of uncertainties in projections of techno-economic factors on the transition from hydrogen produced from fossil fuels to renewable hydrogen. We propose an integrated framework, linking techno-economic and Monte Carlo based uncertainty analysis with quantitative hydrogen supply-demand modelling, to examine hydrogen production by different technologies and the associated greenhouse gas (GHG) emissions from both the feedstock supply and the production process. The results show that the uncertainties around the cost of electrolyser systems, the capacity factor, and the gas price are the most critical factors affecting the timing of the transition to renewable H2. We find that hydrogen production will likely be dominated by fossil fuels for the next few decades if the cost of carbon emissions is not accounted for, resulting in cumulative emissions from hydrogen production of 650 Mt CO2-e by 2050. However, implementing a price on carbon emissions can significantly expedite the transition to renewable hydrogen and substantially cut cumulative emissions. • The impacts of uncertainty on the transition to renewable hydrogen are quantified. • The system cost of the electrolyser, the capacity factor and the gas price are critical factors. • Cumulative emissions from hydrogen production by 2050 can be very significant. • A carbon price can significantly advance the transition and cut cumulative emissions. [ABSTRACT FROM AUTHOR]
- Published: 2022
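A minimal sketch of the kind of Monte Carlo cost comparison the abstract describes, centered on the three critical factors it identifies. The distributions and the two toy cost models are chosen purely for illustration; the paper's framework links full techno-economic and supply-demand models.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000

# Illustrative input distributions for the three critical factors
electrolyser_capex = rng.triangular(400, 800, 1400, N)   # $/kW
capacity_factor    = rng.uniform(0.25, 0.60, N)
gas_price          = rng.triangular(4, 7, 12, N)          # $/GJ

# Toy cost models (not the paper's): renewable H2 gets cheaper with cheaper
# electrolysers and higher utilisation; fossil H2 cost tracks the gas price.
lcoh_renewable = 1.5 + electrolyser_capex / (1000 * capacity_factor)  # $/kg H2
lcoh_fossil    = 0.8 + 0.16 * gas_price                               # $/kg H2

print(f"P(renewable H2 cheaper) ≈ {np.mean(lcoh_renewable < lcoh_fossil):.1%}")

# A carbon price shifts the comparison: add an emissions cost to the fossil route
carbon_price = 100.0     # $/t CO2 (hypothetical)
emissions    = 10.0e-3   # t CO2 per kg H2 for the fossil route (illustrative)
print(f"...with carbon price:   ≈ "
      f"{np.mean(lcoh_renewable < lcoh_fossil + carbon_price * emissions):.1%}")
```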
11. Uncertainty analysis of the optimal health-conscious operation of a hybrid PEMFC coastal ferry.
- Author: Dall'Armi, C., Pivetta, D., and Taccani, R.
- Subjects: FERRIES; MONTE Carlo method; HYDROGEN as fuel; POLYELECTROLYTES; POLYMERIC membranes; ENERGY management
- Abstract:
Hydrogen-fueled Polymer Electrolyte Membrane Fuel Cell (PEMFC)/Lithium-Ion Battery (LIB) powertrains could be a promising solution for zero-local-emission shipping. The power allocation between the PEMFC and LIB and their respective performance degradation play a crucial role in reducing the powertrain operating and maintenance costs. While several research works have proposed energy management strategies to address these issues, long-term operation optimization including the uncertainty in the input parameters of the model has not been extensively addressed. To this purpose, this study couples an operation optimization model of a PEMFC/LIB ferry propulsion system with a Monte Carlo analysis to investigate the influence of PEMFC, LIB and hydrogen costs on the optimal long-term operation of a hydrogen-powered ferry. Hydrogen cost proves to be the most influential parameter, in particular toward the end of the plant lifetime, when hydrogen consumption increases by up to 30%. Nevertheless, the variability of the optimal ferry operation gradually decreases with the progressive PEMFC/LIB degradation. • Health-conscious energy management strategy for a hybrid PEMFC/LIB coastal ferry. • Monte Carlo uncertainty analysis of the long-term operation of a hybrid ship. • Global sensitivity analysis indicates hydrogen fuel cost as the most influential parameter. • PEMFC/LIB degradation causes a 30% increase in hydrogen consumption at end of life. • Variability of the powertrain's optimal power allocation decreases toward the end of life. [ABSTRACT FROM AUTHOR]
- Published: 2022
12. Forecasting revenue from primary and secondary sources of rare earth elements.
- Author: Gupta, Ajay, Williams, Eric, and Gaustad, Gabrielle
- Subjects: RARE earth metals; ELECTRONIC waste; VALUE (Economics); INDUSTRIAL wastes; NICKEL-metal hydride batteries; HYDROTHERMAL deposits; RARE earth oxides; COAL ash
- Abstract:
• Rare earths are the main source of revenue among the coproducts of most secondary sources. • Different sources within each category show large variability in contained value. • Industrial waste revenues are driven by scandium, electronic waste by neodymium. • Contained rare earth value increases as a power law of total rare earth oxide content. Expanding use of rare earth elements (REEs) necessitates characterizing deposits. Challenges include variable REE concentrations (e.g., coal ash ranges from 267 to 843 ppm) and price volatility. For a range of sources, we estimate distributions of the REE value per tonne of material by collecting multiple data points for each type and using mean-reversion price forecasting. The study covers primary ores (e.g., bastnaesite), industrial wastes (e.g., red mud) and consumer wastes (e.g., NiMH batteries). Electronic wastes have the highest value, driven by neodymium; industrial waste value is driven by scandium. Variability exists within resource types, e.g., the value of Australian monazite > Bayan Obo bastnaesite > Malaysian monazite. Using a power-law relationship, the total REE value of a source correlates well with its ore grade. These results inform investment decisions to develop primary and secondary sources by clarifying potential variability and providing a useful rule of thumb to estimate revenues. [ABSTRACT FROM AUTHOR]
- Published: 2024
13. Non-linear tendon fatigue life under uncertainties.
- Author: Rodriguez Reinoso, Mariana, Antonaci, Paola, Pugno, Nicola M., and Surace, Cecilia
- Subjects: MONTE Carlo method; TENDONS; CYCLIC loads; TISSUES; RANGE of motion of joints; TENSION loads
- Abstract:
Tendons play a pivotal role in facilitating joint movement by transmitting muscular forces to bones. The intricate hierarchical structure and diverse material composition of tendons contribute to their non-linear mechanical response. However, comprehensively grasping their mechanical properties poses a challenge due to the inherent variability of biological tissues. This necessitates a thorough examination of the uncertainties associated with property measurements, particularly under diverse loading conditions. Given the cyclic loading experienced by tendons throughout an individual's lifespan, understanding their mechanical behaviour under such circumstances becomes crucial. This study addresses this need by introducing a generalised Paris–Erdogan law tailored for non-linear materials. To examine uncertainties within this proposed framework, Monte Carlo analysis is employed. This approach allows for a thorough exploration of the uncertainties associated with tendon mechanics, contributing to a more robust comprehension of their behaviour under cyclic loading conditions. Finally, self-healing has been integrated into the fatigue law of tendons through a healing function, formulated as a polynomial function of the maximum stress. This accounts for an increase in the number of cycles at each stress value due to self-repair after the damage generated by long-term cyclic loading over the individual's lifespan. • Development of a generalised Paris–Erdogan law for non-linear materials such as soft tissues. • Use of Monte Carlo analysis for robust comprehension of tendon mechanics uncertainties. • Estimation of the mechanical parameters C and m_p of the PE law for tendons. • Introduction of a healing function to capture the increase in cycles due to the self-healing effect. [ABSTRACT FROM AUTHOR]
- Published: 2024
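For orientation, the classical Paris–Erdogan law relates crack growth per cycle to the stress-intensity range, da/dN = C(ΔK)^m, and uncertainty in C and m propagates into the fatigue life. A minimal Monte Carlo sketch under hypothetical parameter distributions (not the tendon-specific values estimated in the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000

# Hypothetical lognormal/normal scatter around nominal Paris-law parameters
C = rng.lognormal(mean=np.log(1e-11), sigma=0.3, size=N)   # growth coefficient
m = rng.normal(3.0, 0.15, size=N)                           # growth exponent

d_sigma = 40.0          # stress range [MPa] (illustrative)
a0, ac = 1e-4, 5e-3     # initial / critical crack size [m]

# Closed-form cycle count for da/dN = C*(d_sigma*sqrt(pi*a))**m, valid for m != 2
term = 1.0 - m / 2.0
Nf = (ac**term - a0**term) / (C * (d_sigma * np.sqrt(np.pi))**m * term)

print(f"median cycles to failure ≈ {np.median(Nf):.3g}")
print(f"5th percentile           ≈ {np.percentile(Nf, 5):.3g}")
```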
14. A framework for the in silico assessment of the robustness of an MPC in a CDC line in function of process variability.
- Author: Waeytens, Ruben, Van Hauwermeiren, Daan, Grymonpré, Wouter, Nopens, Ingmar, and De Beer, Thomas
- Subjects: DISTRIBUTION (Probability theory); MONTE Carlo method; SELF-tuning controllers; STOCHASTIC models; STOCHASTIC processes; CONTINUOUS processing
- Abstract:
The shift from batch manufacturing towards continuous manufacturing for the production of oral solid dosages requires the development and implementation of process models and process control. Previous work focused mainly on developing deterministic models for the investigated system. Furthermore, the in silico tuning and analysis of a control strategy are mostly done based on deterministic models. This deterministic approach could lead to wrong actions in diversion strategies and poor transferability of the controller performance if the system behaves differently from the deterministic model. This work introduces a framework that explicitly includes the process variability characteristic of powder handling processes and tests it on a novel continuous feeding–blending unit (i.e., the FE continuous processing system (CPS)), followed by a tablet press (i.e., the FE 55). It employs a stochastic model by allowing the model parameters to have a probability distribution. The performance of a model predictive controller (MPC), which steers the feed rate of the main excipient feeder to compensate for feed rate deviations of the active pharmaceutical ingredient (API) feeder and keep the API concentration close to the desired value, is evaluated, and the impact of process variability is assessed in a Monte Carlo (MC) analysis. In addition to the process variability, a model for the prediction error of the chemometric model and realistic feed rate disturbances were included to increase the transferability of the results to the real system. The obtained results show that process variability is inherently present and that wrong conclusions can be drawn if it is not taken into account in the in silico analysis. • A stochastic process model is developed for a continuous feeding–blending unit. • The impact of process variability on the performance of an MPC is investigated. • A case study is carried out for a realistic setup of a CDC line in closed-loop. [ABSTRACT FROM AUTHOR]
- Published: 2024
15. Achieving efficiency in quantitative risk analysis process – Application on infrastructure projects.
- Author: Nabawy, Mohamed and Khodeir, Laila M.
- Subjects: MONTE Carlo method; RISK assessment; INFRASTRUCTURE (Economics); QUANTITATIVE research; PROJECT managers
- Abstract:
Infrastructure Projects (IP) are characterised by their extreme complexity in construction and management. A lack of effective risk analysis in the construction of IP can cause failure in project delivery. This paper aims to produce specific guidelines for the efficient application of the quantitative risk analysis (QRA) process, guiding contractors toward efficient QRA of IP and improving their QRA practices in the presence of uncertainty during the construction stage. The paper reviewed the literature on the QRA process in the construction of IP and then performed a full quantitative risk analysis on a case study of IP in Egypt: the sewage, water, irrigation, and district cooling networks of the Cairo Festival City project. An infrastructure schedule containing the construction activities for these four networks was used to apply the quantitative risk analysis. Based on the findings, the Monte Carlo analysis technique proved to be an efficient quantitative technique for supporting project managers in allocating deviations, while the sensitivity analysis technique helped rank the activities that most drive failure in the delivery of IP. The paper adds value by providing knowledge and practical guidance for contractors in improving their QRA practice and decision-making during the construction of IP. [ABSTRACT FROM AUTHOR]
- Published: 2021
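A minimal sketch of schedule-level quantitative risk analysis in the spirit of this paper: three-point activity durations are sampled and the resulting distribution of total duration gives P50/P80 completion estimates. The activity names mirror the case study, but every number is an illustrative assumption, and the activities are treated as sequential for simplicity.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Illustrative (min, most likely, max) durations in days per network
activities = {
    "sewage network":           (60, 75, 110),
    "water network":            (45, 55, 80),
    "irrigation network":       (30, 40, 65),
    "district cooling network": (50, 70, 100),
}

# Sample each activity from a triangular distribution and sum the durations
total = sum(rng.triangular(lo, mode, hi, N) for lo, mode, hi in activities.values())

p50, p80 = np.percentile(total, [50, 80])
deterministic = sum(mode for _, mode, _ in activities.values())
print(f"deterministic estimate: {deterministic} days")
print(f"P50: {p50:.0f} days, P80: {p80:.0f} days")
```

Comparing the deterministic sum of most-likely durations with the simulated P80 is exactly the kind of deviation allocation the abstract credits Monte Carlo analysis with supporting.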
16. A systematic review of quantitative risk analysis in construction of mega projects.
- Author: Nabawy, Mohamed and Khodeir, Laila M.
- Subjects: MONTE Carlo method; RISK assessment; CONSTRUCTION projects; QUANTITATIVE research; LITERARY sources
- Abstract:
Mega projects (MP) require efficient management of risks during their construction. Therefore, it is crucial to identify any possible deviations from their objectives; such deviations have forced MP to be delivered behind schedule and over budget. According to the references, MP require not only qualitative analysis but also accurate quantitative analysis based on knowledge and practice. Thus, this paper undertakes a systematic review of the quantitative analysis literature in the construction of worldwide MP, with the ultimate aim of improving contractors' quantitative risk analysis practices in the presence of uncertainty. A timeline was produced showing the process of quantitative risk analysis in this literature over the six years from 2013 to 2018. This was followed by a critical analysis to account for the quantitative risk analysis techniques highlighted throughout the literature sources. Furthermore, the paper reviews the literature on worldwide mega projects in which the quantitative risk analysis process was practiced. It was observed that the Monte Carlo analysis technique has succeeded in supporting project managers in allocating deviations in the objectives of MP. The paper adds value for practitioners using the process of quantitative risk analysis as well as contractors working on the construction of MP. [ABSTRACT FROM AUTHOR]
- Published: 2020
17. Realization of fractional-order capacitor based on passive symmetric network.
- Author: Semary, Mourad S., Fouda, Mohammed E., Hassan, Hany N., and Radwan, Ahmed G.
- Subjects: CAPACITORS; RELAXATION oscillators; RC circuits; MATHEMATICAL optimization; MONTE Carlo method
- Abstract:
• A new realization of the fractional capacitor using passive symmetric networks is proposed. • A general analysis of this network, regardless of the internal impedance composition, is introduced. • Three scenarios based on a series RC circuit, an integer Cole-impedance circuit, or both are utilized. • The network size is optimized using minimax and least-mth optimization techniques. • Monte Carlo simulations and experimental results are provided with applications. In this paper, a new realization of the fractional capacitor (FC) using passive symmetric networks is proposed. A general analysis of the symmetric network that is independent of the internal impedance composition is introduced. Three different internal impedances are utilized in the network to realize the required response of the FC. These three cases are based on either a series RC circuit, an integer Cole-impedance circuit, or both. The network size and the values of the passive elements are optimized using the minimax and least-mth optimization techniques. The proposed realizations are compared with well-known realizations, achieving reasonable performance with a phase error of approximately 2°. Since the target of this emulator circuit is the use of off-the-shelf components, Monte Carlo simulations with 5% tolerance in the utilized elements are presented. In addition, experimental measurements of the proposed capacitors are performed, showing results comparable with the simulations. The proposed realizations can be used to emulate the FC for experimental verification of new fractional-order circuits and systems. The functionality of the proposed realizations is verified using two oscillator examples: a fractional-order Wien oscillator and a relaxation oscillator. [ABSTRACT FROM AUTHOR]
- Published: 2019
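A minimal sketch of the component-tolerance Monte Carlo the abstract mentions: perturb each passive element of a parallel ladder of series RC branches by ±5% and observe the spread of the network's impedance phase. The ladder values are placeholders, not the paper's optimized network.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 5_000

# Hypothetical 3-branch parallel ladder of series RC pairs
R_nom = np.array([10e3, 33e3, 100e3])      # ohms
C_nom = np.array([100e-9, 33e-9, 10e-9])   # farads
f = 1e3                                    # test frequency [Hz]
w = 2 * np.pi * f

phases = np.empty(N)
for i in range(N):
    # 5% uniform tolerance on every passive element, as in the paper's MC runs
    R = R_nom * rng.uniform(0.95, 1.05, R_nom.size)
    C = C_nom * rng.uniform(0.95, 1.05, C_nom.size)
    Z_branches = R + 1.0 / (1j * w * C)    # impedance of each series RC branch
    Z = 1.0 / np.sum(1.0 / Z_branches)     # branches combined in parallel
    phases[i] = np.degrees(np.angle(Z))

print(f"phase at {f:.0f} Hz: {phases.mean():.1f}° ± {phases.std():.2f}° (1σ)")
```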
18. Techno-economic analysis and optimal control of battery storage for frequency control services, applied to the German market.
- Author: Engels, Jonas, Claessens, Bert, and Deconinck, Geert
- Subjects: STORAGE batteries; BATTERY storage plants; NET present value; ELECTRIC vehicle batteries; APPROXIMATION error
- Abstract:
• Techno-economic analysis of battery storage systems providing frequency control. • A stochastic, data-driven optimisation algorithm to optimise the battery controller. • Use of frequency data and detailed, yet computationally efficient battery models. • Case study of Germany shows the highest margins for a battery rated at 1.6 MW/1.6 MWh. • Calendar ageing drives battery degradation, while cycle ageing has less impact. Optimal investment in battery energy storage systems, taking into account degradation, sizing and control, is crucial for the deployment of battery storage, of which providing frequency control is one of the major applications. In this paper, we present a holistic, data-driven framework to determine the optimal investment, size and controller of a battery storage system providing frequency control. We optimised the controller towards minimum degradation and electricity costs over its lifetime, while ensuring the delivery of frequency control services compliant with regulatory requirements. We adopted a detailed battery model, considering the dynamics and degradation when exposed to actual frequency data. Further, we used a stochastic optimisation objective while constraining the probability of unavailability to deliver the frequency control service. Through a thorough analysis, we were able to decrease the amount of data needed and thereby decrease the execution time while keeping the approximation error within limits. Using the proposed framework, we performed a techno-economic analysis of a battery providing 1 MW of capacity in the German primary frequency control market. Results showed that a battery rated at 1.6 MW, 1.6 MWh has the highest net present value, yet this configuration is only profitable if costs are low enough or if future frequency control prices do not decline too much. It transpires that calendar ageing drives battery degradation, whereas cycle ageing has less impact. [ABSTRACT FROM AUTHOR]
- Published: 2019
19. A discussion of "a simplified prediction method for evaluating tunnel displacement induced by laterally adjacent excavations" by Zheng et al. (2018).
- Author: Shadab Far, Mahdi, Huang, Hongwei, Xue, Yadong, and Zhou, Mingliang
- Subjects: EXCAVATION (Civil engineering); RELIABILITY in engineering; MONTE Carlo method; PROBABILITY theory; RISK assessment
- Abstract:
In this short paper, the semi-empirical model introduced by Zheng et al. (2018) for evaluating tunnel displacements induced by adjacent excavation was extended to a probabilistic model. To this end, the input parameters of the problem were defined as random variables with given means and standard deviations. A reliability model was then established and solved by the Monte Carlo method. The results showed that the probability of exceedance fell sharply with increasing tunnel displacement, such that the probability of a tunnel displacement larger than 41 mm was less than 5%. Moreover, the reliability sensitivity analysis showed that the excavation depth and tunnel burial depth mostly affected the small range of tunnel displacements between 0.5 and 10 mm, whereas the maximum impact of the retaining structure displacement and the horizontal distance of the tunnel from the excavation site fell in the medium and large ranges of tunnel displacement (0.5 to larger than 21 mm). With the exceedance probability curve presented in this paper, the exceedance probability for any desired value of tunnel displacement is available, which can be used for safety evaluation and risk assessment of existing tunnels close to surface excavations. [ABSTRACT FROM AUTHOR]
- Published: 2019
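A minimal sketch of how such an exceedance probability curve is built by Monte Carlo: sample the input random variables, push them through the displacement model, and tabulate P(displacement > threshold). The toy response function below merely stands in for the semi-empirical model of Zheng et al. (2018), and every distribution is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 200_000

# Illustrative input random variables (means/std devs are placeholders)
excavation_depth = rng.normal(15.0, 2.0, N)   # [m]
burial_depth     = rng.normal(12.0, 1.5, N)   # [m]
distance         = rng.normal(20.0, 3.0, N)   # tunnel-excavation distance [m]

# Toy displacement model: grows with depths, decays with distance [mm]
displacement = 5.0 * excavation_depth * burial_depth / distance**1.5
displacement *= rng.lognormal(0.0, 0.25, N)   # multiplicative model uncertainty

# Exceedance probability curve: P(displacement > threshold)
for t in range(5, 45, 5):
    print(f"P(displacement > {t:2d} mm) = {np.mean(displacement > t):.4f}")
```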
20. Probabilistic assessment of realizing the 1.5 °C climate target.
- Author: Marcucci, Adriana, Panos, Evangelos, Kypreos, Socrates, and Fragkos, Panagiotis
- Subjects: GREENHOUSE gas mitigation; ABATEMENT (Atmospheric chemistry); GLOBAL warming; EARLY retirement; ENERGY consumption; MARKET penetration
- Abstract:
• Probabilistic assessment of the technical feasibility of achieving the 1.5 °C target. • Analysis of alternative pathways to achieve the Paris 1.5 and 2 degree targets. • Limiting warming to 1.5 °C is feasible but needs early and rapid global action. • Key pillars are biomass, development of CO2 removal and energy efficiency. In this paper we develop a probabilistic assessment of the energy transition and the economic consequences of limiting global warming by the end of the century to 1.5 °C. The assessment is made by applying a Monte Carlo analysis in MERGE-ETL, a technology-rich integrated assessment model with endogenous learning. We assume a deterministic 1.5 °C target and uncertainty in other factors such as economic growth, resources, and technology costs. The distributions of these variables are obtained from the PROMETHEUS stochastic world energy model. The study assumes early action and quantifies the market penetration of low-carbon technologies, the emission pathways and the economic costs for an efficient reduction of greenhouse gas emissions such that the temperature limit is not exceeded. We find that achieving the 1.5 °C Paris target is technically feasible, but it requires immediate and global action. Key pillars in the decarbonization of the energy system are large deployment of renewable energy, early retirement of fossil-based power plants, energy efficiency and negative emission technologies. Furthermore, the availability of biomass resources, the rapid decrease in the costs of renewables and improvements in energy efficiency are the factors with the largest effect on the cost of carbon. [ABSTRACT FROM AUTHOR]
- Published: 2019
21. Dynamics and control of spacecraft with a large misaligned rotational component.
- Author: Wu, Fan, Cao, Xibin, Butcher, Eric A., and Wang, Feng
- Subjects: SPACE vehicles; DYNAMICS; CAUSATION (Philosophy); ANALYTICAL solutions; ANGULAR momentum (Mechanics)
- Abstract:
A new type of earth observation mission using a spacecraft with a rotational payload is introduced in this paper. The kinematic and dynamic equations for the nadir-aligned attitude tracking problem are derived in the presence of a misaligned rotational payload. A quaternion-based feedback control law is proposed and its stability is analyzed using the Lyapunov direct method. Comparisons are made between the simulation results of the proposed control law and those of a conventional PD feedback control law. The analytical solution for the spacecraft's residual angular momentum is obtained. Monte Carlo analyses are employed to demonstrate the angular rate accuracy of the system and to study the effects caused by the knowledge error of the parallel misalignment. Simulation results show that the angular rate accuracy improves greatly under the proposed control law. [ABSTRACT FROM AUTHOR]
- Published: 2019
22. A review of approaches to uncertainty assessment in energy system optimization models.
- Author: Yue, Xiufeng, Pye, Steve, DeCarolis, Joseph, Li, Francis G.N., Rogan, Fionn, and Gallachóir, Brian Ó.
- Abstract:
Energy system optimization models (ESOMs) have been used extensively to provide insights to decision makers on issues related to climate and energy policy. However, there is a concern that the uncertainties inherent in the model structures and input parameters are at best underplayed and at worst ignored. Compared to other types of energy models, ESOMs tend to use scenarios to handle uncertainties or treat them as a marginal issue. Without adequately addressing uncertainties, the model insights may be limited, lack robustness, and mislead decision makers. This paper provides an in-depth review of systematic techniques that address uncertainties for ESOMs. We have identified four prevailing uncertainty approaches that have been applied to ESOM-type models: Monte Carlo analysis, stochastic programming, robust optimization, and modelling to generate alternatives. For each method, we review the principles and techniques, and how they are utilized to improve the robustness of the model results and provide extra policy insights. In the end, we provide a critical appraisal of the use of these methods. [ABSTRACT FROM AUTHOR]
- Published: 2018
23. Risk analysis of heavy metal concentration in surface waters across the rural-urban interface of the Wen-Rui Tang River, China.
- Author: Qu, Liyin, Huang, Hong, Xia, Fang, Liu, Yuanyuan, Dahlgren, Randy A., Zhang, Minghua, and Mei, Kun
- Subjects: HEAVY metals & the environment; HEAVY metal toxicology; ECOLOGICAL risk assessment; RURAL-urban differences; RIVER sediments
- Abstract:
Heavy metal pollution is a major concern in China because of its serious effects on human health. To assess the potential human health and ecological risks of heavy metal pollution, concentration data for seven heavy metals (As, Pb, Cd, Cr, Hg, Cu, Zn) from 14 sites spanning the rural-urban interface of the Wen-Rui Tang River watershed in southeast China were collected from 2000 to 2010. The heavy metal pollution index (HPI), hazard index (HI) and carcinogenic risk (CR) metrics were used to assess potential heavy metal risks. Further, we evaluated the uncertainty associated with the risk assessment indices using Monte Carlo analysis. Results indicated that all HPI values were lower than the critical level of 100, suggesting that heavy metal levels posed acceptable ecological risks; however, one site with an industrial point-source input reached levels of 80–97 on several occasions. Heavy metal concentrations fluctuated over time, and the decrease after 2007 was due to increased wastewater collection. The HI suggested low non-carcinogenic risk throughout the study period (HI < 1); however, nine sites showed CR values above the acceptable level of 10⁻⁴ for potential cancer risk from arsenic in the early 2000s. Uncertainty analysis revealed an exposure risk for As at all sites because some CR values exceeded the 10⁻⁴ level of concern; levels of Cd near an old industrial area also exceeded the Cd exposure standard (2.6% of CR values > 10⁻⁴). While most metrics for human health risk did not exceed critical values for heavy metals, there is still a potential human health risk from chronic exposure to low heavy metal concentrations due to long-term exposure and potential metal interactions. Results of this study inform water pollution remediation and management efforts designed to protect public health in the polluted urban waterways common in rapidly developing regions. [ABSTRACT FROM AUTHOR]
- Published: 2018
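A minimal sketch of a carcinogenic-risk Monte Carlo of this kind, using the standard EPA-style ingestion formula CR = CDI × SF with CDI = (C × IR × EF × ED)/(BW × AT). The distributions below are illustrative placeholders, not the study's site-specific data.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 100_000

# Illustrative exposure inputs (not the study's measured values)
C  = rng.lognormal(np.log(0.01), 0.5, N)   # As concentration in water [mg/L]
IR = rng.normal(2.0, 0.3, N)               # water ingestion rate [L/day]
EF, ED = 350, 30                           # exposure frequency [d/yr], duration [yr]
BW = rng.normal(60.0, 8.0, N)              # body weight [kg]
AT = 70 * 365                              # averaging time [days]
SF = 1.5                                   # oral slope factor for As [(mg/kg-day)^-1]

CDI = C * IR * EF * ED / (BW * AT)         # chronic daily intake [mg/kg-day]
CR = CDI * SF                              # carcinogenic risk

print(f"P(CR > 1e-4)       = {np.mean(CR > 1e-4):.3f}")
print(f"95th percentile CR = {np.percentile(CR, 95):.2e}")
```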
24. Techno-economic analysis of a 5th generation district heating system using thermo-hydraulic model: A multi-objective analysis for a case study in heating dominated climate.
- Author: Saini, Puneet, Huang, Pei, Fiedler, Frank, Volkova, Anna, and Zhang, Xingxing
- Subjects: HEATING from central stations; HEATING; MONTE Carlo method; HEAT pumps; EVIDENCE gaps; SOLAR technology
- Abstract:
A 5th generation district heating (5GDH) system consists of a low-temperature network used as a heat source for decentralized heat pumps serving the heating demand. Until now, there has been a lack of studies looking into the economic aspect of implementing the 5GDH concept, and the performance characteristics, system dynamics, and economic feasibility of 5GDH systems are insufficiently investigated in cold climates. This paper aims to bridge that research gap by performing a techno-economic analysis of a 5GDH system using a case study based in Tallinn, Estonia. A detailed thermo-hydraulic simulation model is constructed in TRNSYS and Fluidit Heat. In addition, the uncertainty in and sensitivities of the economic performance are analysed using a Monte Carlo method implemented in Python. The study further analyses the effectiveness of using solar power technologies in reducing the cost of heating. For the designed boundary conditions, the system can deliver heat at a levelised cost of heating (LCOH) of 80 €/MWh. Integration of photovoltaics up to a limited capacity results in a 1% reduction compared to the base-case LCOH, while the economic benefit of photovoltaic-thermal is lower than that of photovoltaics. This study can provide a benchmark for the application of 5GDH systems in heating-dominated regions. [ABSTRACT FROM AUTHOR]
- Published: 2023
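A minimal sketch of a Monte Carlo LCOH calculation of the kind the abstract describes as "implemented in Python": capital cost is annualized with a capital recovery factor and combined with operating and heat-pump electricity costs. Every input distribution is an illustrative assumption rather than the study's Tallinn data.

```python
import numpy as np

rng = np.random.default_rng(9)
N = 50_000

# Illustrative inputs (placeholders, not the study's TRNSYS/Fluidit results)
capex      = rng.triangular(2.0e6, 2.5e6, 3.5e6, N)   # network + plant capital [€]
opex       = rng.triangular(40e3, 60e3, 90e3, N)      # fixed O&M [€/yr]
elec_price = rng.triangular(60, 90, 160, N)           # heat-pump electricity [€/MWh]
scop       = rng.uniform(3.0, 4.5, N)                 # seasonal COP of heat pumps
heat       = 5_000.0                                  # delivered heat [MWh/yr]
r, n       = 0.05, 25                                 # discount rate, lifetime [yr]

crf = r * (1 + r)**n / ((1 + r)**n - 1)               # capital recovery factor
lcoh = (capex * crf + opex + heat / scop * elec_price) / heat

print(f"LCOH: median {np.median(lcoh):.0f} €/MWh, "
      f"90% interval [{np.percentile(lcoh, 5):.0f}, {np.percentile(lcoh, 95):.0f}]")
```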
25. Recoil distance method lifetime measurements at TRIUMF-ISAC using the TIGRESS Integrated Plunger.
- Author: Chester, A., Ball, G.C., Bernier, N., Cross, D.S., Domingo, T., Drake, T.E., Evitts, L.J., Garcia, F.H., Garnsworthy, A.B., Hackman, G., Hallam, S., Henderson, J., Henderson, R., Krücken, R., MacConnachie, E., Moukaddam, M., Padilla-Rodal, E., Paetkau, O., Pore, J.L., and Rizwan, U.
- Subjects: ANTENNA arrays; X-rays; COULOMB functions; EXOTIC nuclei; GAMMA rays
- Abstract:
The TIGRESS Integrated Plunger device (TIP) has been developed for recoil distance method (RDM) lifetime measurements using the TIGRESS array of HPGe γ-ray detectors at TRIUMF's ISAC-II facility. A commissioning experiment was conducted utilizing a 250 MeV ⁸⁴Kr beam at ≈2 × 10⁸ particles per second. The ⁸⁴Kr beam was Coulomb excited to the 2₁⁺ state on a movable ²⁷Al target. A thin Cu foil fixed downstream from the target was used as a degrader. Excited nuclei emerged from the target and decayed by γ-ray emission at a distance determined by their velocity and the lifetime of the 2₁⁺ state. The ratio of decays which occur between the target and degrader to those occurring after traversing the degrader changes as a function of the target–degrader separation distance. Gamma-ray spectra at 13 target–degrader separation distances were measured and compared to simulated lineshapes to extract the lifetime. The result of τ = 5.541 ± 0.013 (stat.) ± 0.063 (sys.) ps is shorter than the literature value of 5.84 ± 0.18 ps, with a reduction in uncertainty by a factor of approximately two. The TIP plunger device, experimental technique, analysis tools, and result are discussed. [ABSTRACT FROM AUTHOR]
- Published: 2018
26. Modeling forest above-ground biomass dynamics using multi-source data and incorporated models: A case study over the Qilian Mountains.
- Author: Tian, Xin, Yan, Min, van der Tol, Christiaan, Li, Zengyuan, Su, Zhongbo, Chen, Erxue, Li, Xin, Li, Longhui, Wang, Xufeng, Pan, Xiaoduo, Gao, Lushuang, and Han, Zongtao
- Subjects: FOREST meteorology; BIOMASS; CARBON; LANDSAT satellites; FOREST ecology
- Abstract:
In this work, we present a strategy for obtaining forest above-ground biomass (AGB) dynamics at a fine spatial and temporal resolution. Our strategy rests on the assumption that combining estimates of both AGB and carbon fluxes results in a more accurate accounting for biomass than considering the terms separately, since the cumulative carbon flux should be consistent with AGB increments. Such a strategy was successfully applied to the Qilian Mountains, a cold arid region of northwest China. Based on Landsat Thematic Mapper 5 (TM) data and ASTER GDEM V2 products (GDEM), we first improved the efficiency of existing non-parametric methods for mapping regional forest AGB for 2009 by incorporating the Random Forest (RF) model with the k-Nearest Neighbor (k-NN). Validation using forest measurements from 159 plots and the leave-one-out (LOO) method indicated that the estimates were reasonable (R² = 0.70 and RMSE = 24.52 tonnes ha⁻¹). We then obtained one seasonal cycle (2011) of GPP (R² = 0.88 and RMSE = 5.02 gC m⁻² 8d⁻¹) using the MODIS MOD_17 GPP (MOD_17) model that was calibrated to Eddy Covariance (EC) flux tower data (2010). After that, we calibrated the ecological process model (Biome-BioGeochemical Cycles (Biome-BGC)) against the above GPP estimates (for 2010) for 30 representative forest plots over an ecological gradient in order to simulate AGB changes over time. Biome-BGC outputs of GPP and net ecosystem exchange (NEE) were validated against EC data (R² = 0.75 and RMSE = 1.27 gC m⁻² d⁻¹ for GPP, and R² = 0.61 and RMSE = 1.17 gC m⁻² d⁻¹ for NEE). The calibrated Biome-BGC was then applied to produce a longer time series of net primary productivity (NPP), which, after conversion into AGB increments according to site-calibrated coefficients, was compared to dendrochronological measurements (R² = 0.73 and RMSE = 46.65 g m⁻² year⁻¹). By combining these increments with the AGB map of 2009, we were able to model forest AGB dynamics. In the final step, we conducted a Monte Carlo analysis of the uncertainties in the interannual forest AGB estimates based on errors in the above forest AGB map, the NPP estimates, and the conversion of NPP to AGB increments. [ABSTRACT FROM AUTHOR]
- Published: 2017
27. Is SCENA a good approach for side-stream integrated treatment from an environmental and economic point of view?
- Author: Longo, Stefano, Frison, Nicola, Renzi, Daniele, Fatone, Francesco, and Hospido, Almudena
- Subjects: SEWAGE disposal plants; NITROGEN removal (Sewage purification); PHOSPHORUS; ELECTRIC power consumption; LIFE cycle costing
- Abstract:
The environmental and economic benefits and burdens of including the first Short Cut Enhanced Nutrient Abatement (SCENA) into a real municipal wastewater treatment plant were evaluated using life cycle assessment (LCA) and life cycle cost (LCC). The implications of accomplishing nitrogen (N) removal and phosphorus (P) recovery via nitrite in the side stream were assessed taking into account the actual effluent quality improvement, the changes in the electricity and chemical consumption, N2O, CO2 and CH4 emissions and the effects of land application of biosolids, among others. In addition, a case-specific estimation of the P availability when sludge is applied to land, therefore replacing conventional fertilizer, was performed. Furthermore, to account for the variability in input parameters, and to address the related uncertainties, Monte Carlo simulation was applied. The analysis revealed that SCENA in the side stream is an economic and environmentally friendly solution compared to the traditional plant layout with no side-stream treatment, thanks to the reduction of energy and chemical use for the removal of N and P, respectively. The uncertainty analysis proved the validity of the LCA results for global warming potential and impact categories related to the consumption of fossil-based electricity and chemicals, while robust conclusions could not be drawn on freshwater eutrophication and toxicity-related impact categories. Furthermore, three optimization scenarios were also evaluated, proving that the performance of the WWTP can be further improved by, for instance, substituting gravitational for mechanical thickening of the sludge or changing the operational strategy to the chemically enhanced primary treatment, although this second alternative will increase the operational cost by 5%. Finally, the outcomes show that shifting P removal from chemical precipitation in the main line to biologically enhanced uptake in the side stream is key to reducing chemicals use, thus the operational cost, and increasing the environmental benefit of synthetic fertilizers replacement. [ABSTRACT FROM AUTHOR]
- Published: 2017
28. Quantifying economic risk in photovoltaic power projects.
- Author: Tomosk, Steve, Haysom, Joan E., and Wright, David
- Subjects: PHOTOVOLTAIC power systems; ENERGY economics; RATE of return; ELECTRIC utility costs; SOLAR energy
- Abstract:
Risk analysis is essential for attracting investment to solar projects. This paper measures risk as the variability in the internal rate of return (IRR) and estimates it from the uncertainty in (i) future systems prices, (ii) operations costs and (iii) revenues based on energy yield, irradiance and electricity prices. We quantify these risks for photovoltaic (PV) and concentrated photovoltaic (CPV) projects starting in 2016, 2018 and 2020, for customers selling solar-generated electricity under a fixed feed-in tariff (FIT) and for large business customers displacing electricity loads that they would otherwise pay for at variable market rates. An international comparison of results is provided. Uncertainty in future systems prices causes on average 45% (PV) and 93% (CPV) variation in IRR, which is important to a developer's planning process but is resolvable with negotiated system prices from suppliers. Uncertainty in future operations costs impacts the IRR by on average 17% (PV) and 20% (CPV). Uncertainty in revenues impacts the IRR by at most 3.6%. Furthermore, the analysis shows that the overall percentage variability in a project's IRR is much less than the percentage variability in operations costs and revenues, which are the two factors at play once the system is operating. [ABSTRACT FROM AUTHOR]
- Published: 2017
29. Financial analysis and risk assessment of hydroprocessed renewable jet fuel production from camelina, carinata and used cooking oil.
- Author: Chu, Pei Lin, Vanderghem, Caroline, MacLean, Heather L., and Saville, Bradley A.
- Subjects: JET fuel; RENEWABLE energy sources; CAMELINA; FATS & oils; CORPORATE finance
- Abstract:
This paper evaluates the financial viability of renewable jet fuel production from two oilseed crops, Camelina sativa (camelina) and Brassica carinata (carinata), and used cooking oil (UCO) via the hydrodeoxygenation pathway. A Monte Carlo analysis is performed to examine the robustness of the financial performance by taking into consideration key uncertain parameters, including capital cost, oil content of seeds, and prices of feedstocks, gas, electricity, water, meal co-product, and crude oil (an indicator of fuel product prices). The Monte Carlo analysis revealed that, under the conditions analyzed, the probabilities that the net present value (NPV) would be positive are 29% for camelina, 18% for carinata and 8% for UCO, indicating that the three projects are risky for investors. Sensitivity analysis determined that the projects' financial performance is highly sensitive to the prices of fuel products and feedstocks. The impacts of two different hypothetical biofuel economic incentives were assessed: carbon trading and tradable credits similar to the Renewable Identification Number (RIN). Income earned in the form of a RIN would have a large positive impact on the projects' viability. Assuming an incentive of $0.20/L of renewable fuel, the probabilities that the NPV would be positive rise to 85% for camelina, 75% for carinata and 58% for UCO. [ABSTRACT FROM AUTHOR]
- Published: 2017
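A minimal sketch of a Monte Carlo NPV risk analysis in the same spirit: sample uncertain prices and capital cost, compute NPV, and report the probability it is positive, with and without a per-litre incentive. All figures are illustrative assumptions, not the paper's plant model.

```python
import numpy as np

rng = np.random.default_rng(13)
N = 100_000

# Illustrative project economics (placeholders)
capex      = rng.triangular(300e6, 400e6, 550e6, N)   # plant capital cost [$]
fuel_price = rng.lognormal(np.log(0.75), 0.25, N)     # fuel product price [$/L]
feed_cost  = rng.lognormal(np.log(0.60), 0.20, N)     # feedstock cost [$/L fuel]
other_opex = 0.12                                     # other operating cost [$/L]
volume     = 400e6                                    # annual fuel output [L/yr]
r, years   = 0.10, 20                                 # discount rate, project life

margin  = (fuel_price - feed_cost - other_opex) * volume   # annual cash flow [$]
annuity = (1 - (1 + r) ** -years) / r                      # present-value factor
npv     = -capex + margin * annuity

print(f"P(NPV > 0)                   = {np.mean(npv > 0):.1%}")

# A per-litre incentive (cf. the RIN-like credit) shifts the whole distribution
npv_inc = -capex + (margin + 0.20 * volume) * annuity
print(f"P(NPV > 0 | $0.20/L credit)  = {np.mean(npv_inc > 0):.1%}")
```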
30. CCII and RC fractance based fractional order current integrator.
- Author: Goyal, Divya and Varshney, Pragya
- Subjects: CURRENT conveyors; RC circuits; INTEGRATORS; PID controllers; FRACTIONAL calculus; ELECTRIC capacity
- Abstract:
Integrators are an important functional module of several filters, PID controllers and automated systems. Designing these integrators in the fractional domain enhances their operation and makes their responses highly precise and accurate. This paper presents a fractional-order current integrator using a second-generation current conveyor as the active block and a fractional capacitor as a grounded fractance. The analog realization of this fractance comprises resistances and capacitances arranged in a parallel RC ladder topology. The motivation behind this work is that a more accurate and stable fractional integrator with fewer passive elements can be designed and implemented for lower bias voltage and high dynamic range. Integrators of fractional orders from 0.1 to 0.9 are simulated using TSMC 0.25 µm technology parameters. The transient and frequency responses obtained in Mentor Graphics are in close conformity with the theoretical values of magnitude and phase. The robustness of the proposed model is verified by performing Monte Carlo analysis in the time and frequency domains. Comparisons of fractional-order current integrators with existing analog fractance models are also included to further validate the work presented in the paper. [ABSTRACT FROM AUTHOR]
- Published: 2017
31. Cluster-based delta-QMC technique for fast yield analysis.
- Author: Qui, Nguyen Cao, He, Si-Rong, and Liu, Chien-Nan Jimmy
- Subjects: MONTE Carlo method; METAL oxide semiconductor field-effect transistor circuits; ANALOG circuits; SIMULATION Program with Integrated Circuit Emphasis; ELECTRIC circuit analysis
- Abstract:
Monte Carlo (MC) analysis is often considered a golden reference for yield analysis because of its high accuracy. However, repeating the simulation hundreds of times is often too expensive for large circuit designs. The most widely used approach to reduce MC complexity is using efficient sampling methods to reduce the number of simulations. Aside from those sampling techniques, this paper proposes a novel approach to further improve MC simulation speed with almost the same accuracy. By using an improved delta circuit model, simulation speed can be improved automatically due to the dynamic step control in transient analysis. In order to further improve the efficiency while combining the delta circuit model and the sampling technique, a cluster-based delta-QMC technique is proposed in this paper to reduce the delta change in each sample. Experimental results indicate that the proposed approach can increase speed by two orders of magnitude with almost the same accuracy, which significantly improves the efficiency of yield analysis. [ABSTRACT FROM AUTHOR]
- Published: 2017
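For readers unfamiliar with the QMC ingredient: quasi-Monte Carlo replaces pseudo-random samples with a low-discrepancy sequence (e.g., Sobol points), which typically converges faster for smooth yield integrands. A minimal sketch with a toy linear performance metric standing in for the paper's SPICE-level delta-circuit simulations; all sigmas and the spec limit are hypothetical.

```python
import numpy as np
from scipy.stats import norm, qmc

# Scrambled Sobol sequence: 2**12 = 4096 low-discrepancy points in [0,1)^3
sampler = qmc.Sobol(d=3, scramble=True, seed=0)
u = sampler.random_base2(m=12)

# Map uniform points to Gaussian process variations (illustrative sigmas)
z = norm.ppf(u) * np.array([0.02, 0.05, 5.0])   # dVth [V], dBeta [-], dT [°C]
dvth, dbeta, dtemp = z.T

# Toy performance model standing in for a transient circuit simulation
delay = 1.0 + 4.0 * dvth - 0.8 * dbeta + 0.004 * dtemp   # path delay [ns]
SPEC = 1.12                                              # pass threshold [ns]

print(f"estimated yield = {np.mean(delay < SPEC):.2%}")
```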
32. Techno-economic and reliability assessment of solar water heaters in Australia based on Monte Carlo analysis.
- Author: Rezvani, S., Bahri, P.A., Urmee, T., Baverstock, G.F., and Moore, A.D.
- Subjects: SOLAR water heaters; STAINLESS steel; PRODUCT life cycle; MONTE Carlo method; WEIBULL distribution
- Abstract:
Monte Carlo analysis is used in this study to estimate the techno-economic benefits and reliabilities of solar water heaters. The study focuses on a product range manufactured by a local company in Australia, whose historical data forms the basis of this investigation. The inverse Weibull distribution function is a good match for representing the historical data in the model in terms of the number of failures per operating time for each component. The overall system reliability is determined as the sum of individual component failures during the product lifetime. The analysis is carried out for different system configurations using copper, stainless steel and glass-lined storage tanks, all utilising flat plate collectors. The products with glass-lined storage tanks and electric boosters show good overall reliability if the systems are maintained. Based on the probability model, the variable maintenance costs of solar water heaters were estimated over the product lifetime. These, together with capital expenditures and fuel charges, are used to compute the specific price of hot water supply for different system configurations. Moreover, a sensitivity analysis is implemented to show the impact of auxiliary heating on the economic viability of the products. The results show that solar water heaters can offer significantly better long-term economic viability than conventional systems at moderate auxiliary energy consumption. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
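A minimal sketch of the reliability half of such a study follows. The shape and scale values are invented, and an ordinary Weibull renewal process is used for simplicity, where the paper fits an inverse Weibull to proprietary manufacturer data.

```python
# Monte Carlo estimate of expected component replacements of a solar water
# heater over a 20-year lifetime, one Weibull renewal process per component.
import numpy as np

rng = np.random.default_rng(1)
LIFETIME_Y = 20.0
N_SIM = 100_000

# (shape k, scale lambda in years) per component -- illustrative only
components = {
    "tank":      (2.5, 30.0),
    "collector": (2.0, 25.0),
    "booster":   (1.5, 15.0),
}

total_failures = np.zeros(N_SIM)
for name, (k, lam) in components.items():
    # Keep drawing lifetimes until the product lifetime is exceeded and
    # count how many replacements were needed along the way.
    t = lam * rng.weibull(k, N_SIM)
    fails = np.zeros(N_SIM)
    alive = t < LIFETIME_Y
    while alive.any():
        fails[alive] += 1
        t[alive] += lam * rng.weibull(k, alive.sum())
        alive = t < LIFETIME_Y
    total_failures += fails
    print(f"{name:9s}: mean failures over {LIFETIME_Y:.0f} y = {fails.mean():.3f}")
print(f"system   : mean total failures = {total_failures.mean():.3f}")
```

The expected replacement counts, multiplied by per-component repair costs, feed directly into the variable maintenance cost term of the abstract's economic model.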
33. A predictive tool for determining patient-specific mechanical properties of human corneal tissue.
- Author
-
Ariza-Gracia, Miguel Ángel, Redondo, Santiago, Piñero Llorens, David, Calvo, Begoña, and Rodriguez Matas, José Felix
- Subjects
- *
CORNEA , *TESTING , *PREDICTION models , *INTRAOCULAR pressure , *FINITE element method , *GEOMETRIC analysis , *MATHEMATICAL models - Abstract
A computational predictive tool for assessing patient-specific corneal tissue properties is developed. This predictive tool takes as input variables the corneal central thickness (CCT), the intraocular pressure (IOP), and the maximum deformation amplitude of the corneal apex (U) when subjected to a non-contact tonometry test. The proposed methodology consists of two main steps. First, an extensive dataset is generated using Monte Carlo (MC) simulations based on finite element models with patient-specific geometric features that simulate the non-contact tonometry test. The cornea is modelled as an anisotropic tissue to reproduce the experimentally observed mechanical behavior. A clinical database of 130 patients (53 healthy, 63 keratoconic and 14 post-LASIK surgery) is used to generate a dataset of more than 9000 cases by permuting the material properties. The second step consists of constructing predictive models for the material parameters of the constitutive model as a function of the input variables. Four different approximations are explored: quadratic response surface (QRS) approximation, multilayer perceptron (MLP), support vector regression (SVR), and k-NN search. The models are validated against data from five real patients. The material properties obtained with the predictive models lead to a simulated corneal displacement within 10% of the measured value, even in the worst-case scenario of a patient with very advanced keratoconus. These results demonstrate the potential and soundness of the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
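The second step is amenable to a schematic sketch with synthetic data. The real dataset comes from more than 9000 finite-element tonometry simulations; the forward model below is a made-up, sign-correct stand-in used only to produce training pairs for an MLP surrogate mapping (CCT, IOP, U) to a material parameter.

```python
# Fitting a surrogate for the inverse problem: (CCT, IOP, U) -> stiffness.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
n = 9000
cct = rng.uniform(400, 650, n)          # central corneal thickness, um
iop = rng.uniform(10, 25, n)            # intraocular pressure, mmHg
mu = rng.uniform(0.02, 0.12, n)         # "true" stiffness parameter, MPa

# Hypothetical forward model: apex displacement falls with thickness and
# stiffness, rises with pressure (sign pattern only; not the FE physics).
u = 1.2 - 0.0012 * cct - 6.0 * mu + 0.02 * iop + rng.normal(0, 0.02, n)

X = np.column_stack([cct, iop, u])
X_tr, X_te, y_tr, y_te = train_test_split(X, mu, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print(f"surrogate R^2 on held-out cases: {model.score(X_te, y_te):.3f}")
```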
34. Dynamics of dissolved organic carbon in hillslope discharge: Modeling and challenges.
- Author
-
Dusek, Jaromir, Vogel, Tomas, Dohnal, Michal, Sanda, Martin, Jankovec, Jakub, Barth, Johannes A.C., and Marx, Anne
- Subjects
- *
CARBON compounds , *CHEMICAL synthesis , *MOLECULAR structure of carbon compounds , *BIODEGRADATION of carbon compounds - Abstract
Reliable quantitative prediction of water movement and fluxes of dissolved substances – specifically organic carbon – at both the hillslope and catchment scales remains a challenge due to complex boundary conditions and soil spatial heterogeneity. In addition, microbially mediated transformations of dissolved organic carbon (DOC) are recognized to determine the balance of DOC in soils. So far, only a few studies have utilized stable water isotope information in modeling, and even fewer have linked dissolved carbon fluxes to mixing and/or transport models. In this study, stormflow dynamics of 18O/16O ratios in water molecules (expressed as δ18O) and DOC were analyzed using a physically based modeling approach. A one-dimensional dual-continuum vertical flow and transport model was used to simulate the subsurface transport processes in a forest hillslope soil over a period of 2.5 years. The model was applied to describe the transformation of input signals of δ18O and DOC into the output signals observed in the hillslope stormflow. To quantify the uncertainty associated with the model parameterization, Monte Carlo analysis in conjunction with Latin hypercube sampling was applied. δ18O variations in hillslope discharge and in soil pore water were predicted reasonably well. Despite the complex nature of the microbial transformations, which caused uncertainty in model parameters and in the subsequent prediction of DOC transport, the simulated temporal patterns of DOC concentration in stormflow showed behavior similar to that reflected in the observed DOC fluxes. Due to preferential flow, the hillslope DOC export was higher than the amounts usually reported in the available literature. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
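The uncertainty step is straightforward to sketch. Assuming illustrative parameter ranges (the study samples the dual-continuum model's hydraulic and DOC-transformation parameters), Latin hypercube sampling generates the Monte Carlo runs:

```python
# Latin hypercube sampling for a Monte Carlo parameter-uncertainty study;
# the three parameters and their ranges are placeholders.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=500)                       # 500 model runs

# saturated conductivity (log10 m/s), DOC first-order turnover rate (1/d),
# preferential-flow volume fraction (-)
lo = np.array([-6.0, 0.001, 0.01])
hi = np.array([-4.0, 0.05, 0.10])
params = qmc.scale(unit, lo, hi)

for p in params[:3]:
    ks, k_doc, w_pf = p
    # run_hillslope_model(ks, k_doc, w_pf) would stand in for the 1-D
    # dual-continuum solver, which is far too large to reproduce here.
    print(f"log10(Ks)={ks:.2f} [m/s], k_DOC={k_doc:.4f} [1/d], w_pf={w_pf:.3f} [-]")
```

Compared with plain random sampling, the hypercube stratifies each marginal, so 500 runs cover the parameter box far more evenly.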
35. Monte Carlo analysis of focused Gaussian beam in scattering media: Curvature correction and Mie scattering.
- Author
-
Arjonillo, Hannah Christina C. and Saloma, Caesar A.
- Subjects
- *
MONTE Carlo method , *GAUSSIAN beams , *CURVATURE , *MIE scattering , *NUMERICAL apertures , *RAY tracing - Abstract
We introduce curvature correction into the Monte Carlo (MC) technique to determine the suitable photon-step size and initial photon distribution in the focusing lens aperture that can account for the effects of the radius of curvature on the axial intensity profile of a focused Gaussian beam propagated through a scattering medium. The scattering anisotropy of the medium is determined via Mie scattering theory for given values of the scattering particle radius, density, and phase distribution, while optical ray tracing is used to determine the occurrence of scattering events. We evaluate the performance of the modified MC technique for random and periodic media by examining the axial intensity profiles of the propagating focused beam. Relative to the predictions of scalar diffraction theory, curvature correction improves the accuracy of the MC results with decreasing step size s and increasing number of steps N_s. In the absence of scattering, the axial beam distribution approaches that of scalar diffraction theory for a given numerical aperture (NA ≤ 0.5), s, and N_s, based on Linfoot's image quality criteria of fidelity, correlation quality and structural content. • This work introduces curvature correction into standard MC techniques, accounting for the radius of curvature. • Our aim is to improve MC axial profile predictions to be more consistent with those of the scalar diffraction theory (SDT). • In the absence of scattering, the axial beam distribution approaches axial SDT profiles based on Linfoot's criteria. • This result holds for different NA values (0.2 ≤ NA ≤ 0.5) of the focusing lens with the appropriate step size parameter. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
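Two building blocks of any such photon Monte Carlo are easy to verify in isolation: exponential free-path sampling and Henyey-Greenstein sampling of the deflection angle, whose mean cosine must reproduce the anisotropy factor g obtained from Mie theory. A minimal sketch, with g and the scattering coefficient chosen arbitrarily (the paper's curvature correction and Gaussian-beam launch are not reproduced):

```python
# Sampling the two random quantities of a scattering-medium photon MC and
# checking them against their known expectations.
import numpy as np

rng = np.random.default_rng(3)

def hg_cos_theta(g):
    """Sample cos(theta) from the Henyey-Greenstein phase function."""
    if g == 0.0:
        return 2.0 * rng.random() - 1.0
    f = (1.0 - g * g) / (1.0 - g + 2.0 * g * rng.random())
    return (1.0 + g * g - f * f) / (2.0 * g)

mu_s = 10.0                          # scattering coefficient, 1/mm
g = 0.9                              # anisotropy factor from Mie theory

steps = -np.log(rng.random(100_000)) / mu_s          # free path lengths
cos_t = np.array([hg_cos_theta(g) for _ in range(100_000)])
print(f"mean free path  = {steps.mean():.4f} mm (expected {1/mu_s:.4f})")
print(f"mean cos(theta) = {cos_t.mean():.4f} (expected g = {g})")
```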
36. Massive data analysis to assess PV/ESS integration in residential unbalanced LV networks to support voltage profiles.
- Author
-
Lamberti, Francesco, Calderaro, Vito, Galdi, Vincenzo, and Graditi, Giorgio
- Subjects
- *
LOW voltage systems , *ENERGY storage , *PHOTOVOLTAIC power generation , *MONTE Carlo method , *SUPPLY & demand - Abstract
The integration of energy storage systems (ESSs), co-located with distributed photovoltaic (PV) units in low voltage (LV) networks, offers new opportunities to support the distribution system operator (DSO) in distribution network operation and management. The deepening penetration of renewable resources exacerbates the challenge of maintaining demand–supply equilibrium; ESSs can tackle this challenge by making PV resources dispatchable. Here, we apply a Monte Carlo analysis considering different residential load profiles and PV/ESS characteristics (e.g., penetration levels, locations, and capabilities) to assess the impact that two different control strategies have in supporting the DSO to improve the power quality of the distribution network. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
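A toy sketch of the Monte Carlo screening loop follows. There is no real unbalanced power flow here: a crude linear voltage-rise approximation on a single radial feeder stands in for the network model, and all spreads are invented.

```python
# Sampling PV/ESS penetration, placement and injection on a radial LV feeder
# and tallying an overvoltage probability.
import numpy as np

rng = np.random.default_rng(4)
N_BUS, N_RUNS = 20, 5000
r_cum = np.cumsum(np.full(N_BUS, 0.02))   # cumulative feeder resistance, pu
idx = np.arange(N_BUS)

over_limit = 0
for _ in range(N_RUNS):
    n_pv = rng.integers(1, N_BUS + 1)                    # penetration level
    buses = rng.choice(N_BUS, size=n_pv, replace=False)  # PV/ESS locations
    p_inj = rng.uniform(0.0, 0.05, n_pv)                 # net injection, pu
    dv = np.zeros(N_BUS)
    for b, p in zip(buses, p_inj):
        # voltage rise at bus i from injection at bus b ~ R(common path) * P
        dv += p * r_cum[np.minimum(idx, b)]
    if (1.0 + dv).max() > 1.05:                          # 5% overvoltage limit
        over_limit += 1
print(f"P(overvoltage somewhere on feeder) = {over_limit / N_RUNS:.3f}")
```

In a study like the abstract's, the ESS control strategy would modify p_inj inside the loop, and the tallied probability quantifies how much each strategy helps the DSO.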
37. Uncertainty of modelled urban peak O3 concentrations and its sensitivity to input data perturbations based on the Monte Carlo analysis.
- Author
-
Pineda Rojas, Andrea L., Venegas, Laura E., and Mazzeo, Nicolás A.
- Subjects
- *
ATMOSPHERIC ozone , *AIR quality , *MONTE Carlo method , *PERTURBATION theory , *REGRESSION analysis - Abstract
A simple urban air quality model [MODelo de Dispersión Atmosférica Urbana – Generic Reaction Set (DAUMOD-GRS)] was recently developed. One-hour peak O3 concentrations in the Metropolitan Area of Buenos Aires (MABA) during the summer, estimated with the DAUMOD-GRS model, have shown values lower than 20 ppb (the regional background concentration) in the urban area and levels greater than 40 ppb in its surroundings. Due to the lack of measurements outside the MABA, these relatively high modelled ozone concentrations constitute the only estimate for the area. In this work, a methodology based on Monte Carlo analysis is implemented to evaluate the uncertainty in these modelled concentrations associated with possible errors in the model input data. Results show that the largest 1-h peak O3 levels in the MABA during the summer have the largest uncertainties (up to 47 ppb). In addition, multiple linear regression analysis is applied at selected receptors to identify the variables explaining most of the obtained variance. Although their relative contributions vary spatially, the uncertainty of the regional background O3 concentration dominates at all analysed receptors (34.4–97.6%), indicating that improving its estimate would enhance the ability of the model to simulate peak O3 concentrations in the MABA. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
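A schematic sketch of the attribution step, with a made-up response surface standing in for DAUMOD-GRS: inputs are perturbed Monte Carlo-style and a multiple linear regression apportions the output variance among them.

```python
# Monte Carlo input perturbation followed by regression-based variance
# attribution; coefficients and spreads are illustrative only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
n = 2000
bg_o3 = rng.normal(20, 5, n)       # regional background O3, ppb
nox = rng.normal(1.0, 0.2, n)      # NOx emission scale factor
voc = rng.normal(1.0, 0.2, n)      # VOC emission scale factor

# Placeholder response surface standing in for the dispersion model:
peak_o3 = bg_o3 + 15 * voc - 5 * nox + rng.normal(0, 1, n)

X = np.column_stack([bg_o3, nox, voc])
reg = LinearRegression().fit(X, peak_o3)
# Variance contribution of each input (approximate, assumes independence):
# (beta_i * std_i)^2 / var(output)
contrib = (reg.coef_ * X.std(axis=0)) ** 2 / peak_o3.var()
for name, c in zip(["background O3", "NOx", "VOC"], contrib):
    print(f"{name:14s}: {100 * c:5.1f}% of output variance")
```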
38. Carbon dioxide-emission in China's power industry: Evidence and policy implications.
- Author
-
Yang, Lisha and Lin, Boqiang
- Subjects
- *
ELECTRIC utilities , *CARBON dioxide mitigation , *ENERGY policy , *ENERGY consumption , *ELECTRIC industries , *ECONOMIC activity - Abstract
The Logarithmic Mean Divisia Index (LMDI) and scenario analysis are applied, respectively, to analyze the drivers of CO2 emissions and their potential reduction in China's power industry. According to the LMDI results, six factors affect the carbon emissions of the power industry. Electricity intensity (EI) and economic activity (EA) are the primary driving factors of the increment in emissions, accounting for 42.33% and 57.05% of the total increment from 1985 to 2011. The results also demonstrate that energy efficiency (EE) contributed a 13.54% abatement over 1985–2011 and will play a key role in future emission abatement. Furthermore, the paper estimates the trend of the power sector's carbon dioxide emissions under three scenarios (basic, moderate and optimum) in order to determine the mitigation potential. The potential mitigation rate will equal 22.03% and 37.57% in 2020 in Case A and Case B, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
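The additive LMDI arithmetic is compact enough to show directly. A worked sketch with invented two-year data and a three-factor identity (the paper uses six factors over 1985–2011): each driver's contribution uses the logarithmic mean of the emission totals, and the effects sum exactly to the total change.

```python
# Additive LMDI decomposition of an emission change C = (C/E)*(E/G)*G,
# with E = electricity use and G = GDP; all numbers are illustrative.
import numpy as np

def logmean(a, b):
    return (a - b) / (np.log(a) - np.log(b)) if a != b else a

C0, E0, G0 = 100.0, 500.0, 1000.0     # base year
C1, E1, G1 = 180.0, 750.0, 2000.0     # end year

L = logmean(C1, C0)
d_carbon   = L * np.log((C1 / E1) / (C0 / E0))   # carbon intensity effect
d_elec_int = L * np.log((E1 / G1) / (E0 / G0))   # electricity intensity effect
d_activity = L * np.log(G1 / G0)                 # economic activity effect

print(f"total change          {C1 - C0:7.2f}")
print(f"  carbon intensity    {d_carbon:7.2f}")
print(f"  electricity intens. {d_elec_int:7.2f}")
print(f"  economic activity   {d_activity:7.2f}")
print(f"  sum of effects      {d_carbon + d_elec_int + d_activity:7.2f}")
```

Because the factor logs telescope to ln(C1/C0), the decomposition is exactly additive with no residual, which is the main attraction of LMDI over Laspeyres-style decompositions.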
39. Feasible process development and techno-economic evaluation of paper sludge to bioethanol conversion: South African paper mills scenario.
- Author
-
Robus, Charles L.L., Gottumukkala, Lalitha Devi, van Rensburg, Eugéne, and Görgens, Johann F.
- Subjects
- *
PAPER mills , *ETHANOL as fuel , *SLUDGE management , *ENERGY conversion , *INDUSTRIAL waste management - Abstract
Paper sludge samples collected from recycling mills exhibited high ash contents in the range of 54.59%–65.50% and glucose concentrations between 21.97% and 31.11%. Washing the sludge reduced the total ash content to between 10.7% and 19.31% and increased the concentrations of glucose, xylose and lignin. Samples were screened for ethanol production, and fed-batch simultaneous saccharification and fermentation (SSF) was optimised for the washed samples that gave the highest and lowest ethanol concentrations. Maximum ethanol concentrations of 57.31 g/L and 47.72 g/L (94.07% and 85.34% of the maximum theoretical yield) were predicted for the high and low fermentative potential samples, respectively, and were experimentally achieved within 1% deviation. A generic set of process conditions was established for the conversion of high-ash paper sludge to ethanol. Techno-economic analysis based on three different revenue scenarios, together with Monte Carlo analysis, revealed a 95% probability of achieving IRR values in excess of 25% at a paper sludge feed rate of 15 t/d; feed rates of 30 t/d and 50 t/d exhibited a cumulative probability of 100%. This study presents the technical feasibility and economic viability of expanding paper mills towards bioethanol production from paper sludge. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
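A hedged sketch of the Monte Carlo economics with invented cash-flow assumptions: for a conventional one-sign-change cash flow, IRR > r exactly when NPV(r) > 0, so the probability of clearing a 25% IRR hurdle can be read off the NPV distribution evaluated at that rate.

```python
# Monte Carlo estimate of P(IRR > 25%) via the NPV-at-hurdle identity;
# capex, price, volume and opex distributions are placeholders.
import numpy as np

rng = np.random.default_rng(6)
N, YEARS, HURDLE = 20_000, 20, 0.25

capex = rng.normal(10e6, 1e6, N)                 # plant cost, USD
price = rng.normal(0.80, 0.10, (N, YEARS))       # ethanol price, USD/L
volume = rng.normal(4e6, 0.4e6, (N, YEARS))      # output, L/year
opex = rng.normal(1.5e6, 0.2e6, (N, YEARS))      # operating cost, USD/year

cash = price * volume - opex                     # yearly net cash flow
disc = (1 + HURDLE) ** -np.arange(1, YEARS + 1)  # discount factors
npv_at_hurdle = cash @ disc - capex
print(f"P(IRR > {HURDLE:.0%}) = {(npv_at_hurdle > 0).mean():.3f}")
```

The identity avoids an explicit IRR root-solve inside the Monte Carlo loop, which keeps even large sample counts cheap.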
40. Analyzing the sustainability performance of public transit.
- Author
-
Miller, Patrick, de Barros, Alexandre G., Kattan, Lina, and Wirasinghe, S.C.
- Subjects
- *
SUSTAINABLE transportation , *PUBLIC transit , *URBAN transportation , *SUSTAINABILITY , *HUMAN ecology , *SUSTAINABLE communities - Abstract
In recent years there have been many advances in understanding how transportation can promote or detract from sustainability. The role of public transit has been established as a critical element in promoting sustainable and vibrant cities. While decision-making tools, such as composite indices, have been developed to understand overall urban and transportation sustainability, few tools exist to directly analyze public transit systems against sustainability criteria. This paper introduces the Public Transit Sustainable Mobility Analysis Tool (PTSMAT) framework, which uses composite sustainability index techniques, along with research into transport sustainability, to propose a new transit analysis tool for both planning/decision-making and research contexts. First, the paper reviews definitions of sustainability and sustainable transport as well as sustainability analysis tools, and relates them to a legible analytic framework for public transit systems. Next, the PTSMAT framework is introduced along with a description of its data collection, data analysis, and index calculation processes, and its application to decision-making and research scenarios is outlined. A summary of relevant indicators for public transit sustainability is then presented and integrated into the PTSMAT framework using a quadruple bottom line approach. Finally, a case study of the framework's application is provided for the UBC Corridor study in Vancouver, Canada, demonstrating how the tool may be used to inform decision-making and planning efforts. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
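A generic sketch of the index-calculation step, with invented indicators and equal quadruple-bottom-line weights (PTSMAT's actual indicator set and weighting scheme are not reproduced here): min-max normalize, flip cost-type indicators, weight and aggregate.

```python
# Composite sustainability index for transit alternatives.
import numpy as np

# rows: transit alternatives; columns: indicators
# [ridership/capita, CO2 g/pkm (lower is better), farebox ratio, access %]
raw = np.array([
    [120.0, 55.0, 0.45, 70.0],
    [180.0, 40.0, 0.60, 85.0],
    [90.0,  75.0, 0.35, 60.0],
])
lower_is_better = np.array([False, True, False, False])
weights = np.array([0.25, 0.25, 0.25, 0.25])      # quadruple bottom line

norm = (raw - raw.min(0)) / (raw.max(0) - raw.min(0))
norm[:, lower_is_better] = 1.0 - norm[:, lower_is_better]
scores = norm @ weights
for i, s in enumerate(scores):
    print(f"alternative {i + 1}: composite score {s:.3f}")
```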
41. Multi-objective optimization of long-term groundwater monitoring network design using a probabilistic Pareto genetic algorithm under uncertainty.
- Author
-
Luo, Qiankun, Wu, Jianfeng, Yang, Yun, Qian, Jiazhong, and Wu, Jichun
- Subjects
- *
GROUNDWATER monitoring , *GENETIC algorithms , *ERROR analysis in mathematics , *HYDRAULIC conductivity , *MATHEMATICAL optimization - Abstract
Optimal design of a long-term groundwater monitoring (LTGM) network often involves conflicting objectives and substantial uncertainty arising from insufficient hydraulic conductivity (K) data. This study develops a new multi-objective simulation–optimization model involving four objectives: minimization of (i) the total sampling cost for monitoring the contaminant plume, (ii) the mass estimation error, (iii) the first-moment estimation error, and (iv) the second-moment estimation error of the contaminant plume. A new probabilistic Pareto genetic algorithm (PPGA), coupled with the commonly used flow and transport codes MODFLOW and MT3DMS, is then developed to search for the Pareto-optimal solutions to the multi-objective LTGM problem under uncertainty of the K-fields. The PPGA integrates the niched Pareto genetic algorithm with a probabilistic Pareto sorting scheme to deal with the uncertainty of objectives caused by the uncertain K-field. The elitist selection strategy, the operation library and the Pareto solution set filter are employed to improve the diversity and reliability of the Pareto-optimal solutions. Furthermore, the sampling strategy of the noisy genetic algorithm is adopted to cope with the uncertainty of the K-fields and to improve the computational efficiency of the PPGA. In particular, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology in finding Pareto-optimal sampling network designs for LTGM systems through a two-dimensional hypothetical example and a three-dimensional field application in Indiana (USA). Comprehensive analysis demonstrates that the proposed PPGA can find Pareto-optimal solutions with low variability and high reliability and is a promising tool for optimizing multi-objective LTGM network designs under uncertainty. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
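The probabilistic Pareto sorting of the PPGA is paper-specific, but the deterministic core it extends, extracting the non-dominated set from a population of candidate designs, is compact enough to sketch. A minimal version, assuming four minimized objectives with random placeholder values:

```python
# Extracting the Pareto (non-dominated) set from an objective matrix.
import numpy as np

def non_dominated(F):
    """F: (n, m) objective matrix, all objectives minimized.
    Returns a boolean mask of Pareto-optimal rows."""
    n = F.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # j dominates i if j is <= everywhere and strictly < somewhere
        dominated = (F <= F[i]).all(axis=1) & (F < F[i]).any(axis=1)
        if dominated.any():
            mask[i] = False
    return mask

rng = np.random.default_rng(7)
F = rng.random((200, 4))   # e.g. cost, mass error, 1st/2nd moment errors
front = non_dominated(F)
print(f"{front.sum()} of {len(F)} designs are Pareto-optimal")
```

In the PPGA, the objective values themselves are uncertain (one per K-field realization), so this crisp dominance test is replaced by a probabilistic one.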
42. Critical issues in existing RC deck stiffened arch bridges under seismic actions.
- Author
-
Crisci, Giovanni, Ceroni, Francesca, Lignola, Gian Piero, and Prota, Andrea
- Subjects
- *
ARCH bridges , *EARTHQUAKE resistant design , *BRIDGES , *BEHAVIORAL assessment , *LIVE loads , *CONCRETE bridges , *FAILURE mode & effects analysis , *REINFORCED concrete - Abstract
• Technical background on RC deck arch bridges in Italy – design principles largely adopted in the past are summarized; • Numerical studies to generate simulated samples of RC deck arch bridges; • Procedure for the assessment of seismic vulnerability of RC deck arch bridges and derivation of preliminary fragility functions at the Ultimate Limit State. Existing reinforced concrete (RC) arch bridges still represent an important part of the transportation network. The peculiarities of such bridges make the assessment of their behaviour under both static and dynamic actions, and therefore of their safety, particularly complex. This work concentrates on a specific arch bridge typology known as "Maillart-type" or "deck-stiffened" arch bridges, characterized by a very stiff deck beam and a slender, wide vault. Like other RC structures designed during the 1950s, when some important detailing rules for concrete elements were not yet considered, RC deck-stiffened arch bridges may suffer from similar structural deficiencies. Moreover, the current loading conditions include actions that were not considered in the original design or that have changed over time, such as seismic actions or the moving loads due to vehicular traffic. To evaluate the main critical issues related to the current performance of Maillart-type arch bridges, this study, starting from a "simulated design" according to the design rules and mandatory codes in force at the construction time, defines a large inventory of simulated bridges (3000 samples) characterized by different values of the most significant geometrical parameters. First, each bridge of the inventory is modelled and studied by means of a linear Response Spectrum Analysis (RSA) implemented in the software SAP2000. The large number of structural analyses makes it possible to investigate the seismic behaviour of this bridge typology and to identify and localize the most frequent failure modes. The RSA results are used to develop preliminary fragility functions at the Ultimate Limit State, based on the shear or flexural failure of single elements of the bridges. Lastly, Non-Linear Time History Analyses (NLTHA) are carried out on two representative case studies extracted from the simulated inventory, and the results are examined in terms of fragility curves obtained according to the Cloud methodology for four Damage States. Despite the different approaches (RSA at large scale and NLTHA on two prototypes), a comparison of the fragility curves for equivalent Damage States shows good agreement. These fragility curves are a first step towards the assessment of the seismic vulnerability of Maillart-type arch bridges and can be useful for planning seismic risk mitigation interventions for this bridge typology. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
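For the NLTHA step, the abstract cites the Cloud methodology. The following generic sketch (synthetic intensity-measure/demand pairs and a hypothetical damage threshold, not the paper's bridge data) shows how a cloud regression yields a lognormal fragility curve:

```python
# Cloud method: regress ln(EDP) on ln(IM), then build a lognormal fragility
# P(damage state exceeded | IM) from the regression and its dispersion.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
im = rng.lognormal(mean=-1.0, sigma=0.6, size=60)      # e.g. PGA, g
edp = 0.02 * im**0.9 * rng.lognormal(0.0, 0.3, 60)     # e.g. drift ratio

# Cloud regression: ln(EDP) = ln(a) + b ln(IM) + error
b, ln_a = np.polyfit(np.log(im), np.log(edp), 1)
resid = np.log(edp) - (ln_a + b * np.log(im))
beta = resid.std(ddof=2)                # record-to-record dispersion

edp_limit = 0.01                        # hypothetical damage-state threshold
im_grid = np.linspace(0.05, 1.5, 8)
p_exceed = norm.cdf((ln_a + b * np.log(im_grid) - np.log(edp_limit)) / beta)
for x, p in zip(im_grid, p_exceed):
    print(f"IM = {x:4.2f} g -> P(DS exceeded) = {p:.3f}")
```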
43. Pareto-based efficient stochastic simulation–optimization for robust and reliable groundwater management.
- Author
-
Sreekanth, J., Moore, Catherine, and Wolf, Leif
- Subjects
- *
STOCHASTIC processes , *GROUNDWATER management , *GROUNDWATER recharge , *HYDRAULIC conductivity , *MATHEMATICAL optimization - Abstract
Simulation–optimization methods are used to develop optimal solutions for a variety of groundwater management problems. The true optimality of these solutions often depends on the reliability of the simulation model. Therefore, where model predictions are uncertain due to parameter uncertainty, this should be accounted for within the optimization formulation to ensure that solutions are robust and reliable. In this study, we present a stochastic multi-objective formulation of the otherwise single-objective groundwater optimization problem by considering minimization of prediction uncertainty as an additional objective. The proposed method is illustrated by applying it to an injection bore field design problem. The primary objective of the optimization is maximization of the total volume of water injected into a confined aquifer, subject to the constraints that the resulting increases in hydraulic head at a set of control bores remain below specified target levels. Both bore locations and injection rates are considered as optimization variables. Prediction uncertainty is estimated using stacks of uncertain parameters and is explicitly minimized to produce robust and reliable solutions. Reliability analysis using post-optimization Monte Carlo analysis showed that, while stochastic single-objective optimization failed to provide reliable solutions with a stack size of 50, the proposed method produced many robust solutions with reliability close to 1.0. The comparison indicates potential efficiency gains of the stochastic multi-objective formulation in identifying robust and reliable groundwater management strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
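A schematic sketch of the "stack" idea with a toy response function standing in for the groundwater model: a design's mean performance, its prediction uncertainty over the parameter stack, and its post-optimization reliability are computed side by side. All numbers are illustrative.

```python
# Evaluating one design over a stack of parameter realizations: the std of
# the prediction becomes the extra objective, and reliability is the
# fraction of realizations meeting the head constraint.
import numpy as np

rng = np.random.default_rng(9)
stack = rng.lognormal(mean=0.0, sigma=0.5, size=50)   # 50 K-field multipliers

def head_rise(q, k_mult):
    # Placeholder physics: head rise grows with injection rate q and
    # falls with hydraulic conductivity.
    return 2.0 * q / k_mult

def evaluate(q, target=5.0):
    h = head_rise(q, stack)
    return q, h.std(), (h <= target).mean()   # volume, uncertainty, reliability

for q in (1.0, 2.0, 3.0):
    vol, unc, rel = evaluate(q)
    print(f"q={q:.1f}: injected={vol:.1f}, pred. std={unc:.2f}, reliability={rel:.2f}")
```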
44. Carbon dioxide (CO2) emissions from electricity: The influence of the North Atlantic Oscillation.
- Author
-
Curtis, John, Lynch, Muireann Á., and Zubiate, Laura
- Subjects
- *
CARBON dioxide mitigation , *ELECTRIC power , *WIND power , *RENEWABLE energy sources - Abstract
The North Atlantic Oscillation (NAO) is a large-scale circulation pattern driving climate variability in north-western Europe. In recent years there has been an increasing deployment of wind-powered generation technology, i.e. wind farms, on electricity networks across Europe. As this deployment increases it is important to understand how climate variability will affect both wind-powered and non-renewable power generation. This study extends the literature by assessing the impact of NAO, via wind-power generation, on carbon dioxide emissions from the wider electricity system. A Monte Carlo approach is used to model NAO phases, generate hourly wind speed time-series data, electricity demand and fuel input data. A unit commitment, least-cost economic dispatch model is used to simulate an entire electricity system, modelled on the all-island Irish electricity system. Our results confirm that the NAO has a significant impact on monthly mean wind speeds, wind power output, and carbon dioxide emissions from the entire electricity system. The impact of NAO on emissions obviously depends on the level of wind penetration within an electricity system but our results indicate that emissions intensity within the Irish electricity system could vary by as much as 10% depending on the NAO phase within the next few years. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
45. Automated finite element model updating of a scale bridge model using measured static and modal test data.
- Author
-
Sanayei, Masoud, Khaloo, Ali, Gul, Mustafa, and Necati Catbas, F.
- Subjects
- *
FINITE element method , *DATA analysis , *STRUCTURAL health monitoring , *PARAMETER estimation , *STIFFNESS (Engineering) - Abstract
Structural Health Monitoring (SHM) using nondestructive test data has become promising for finite element (FE) model updating, model verification, structural evaluation and damage assessment. This research presents a multiresponse structural parameter estimation method for automated FE model updating using data obtained from a set of nondestructive tests conducted on a laboratory bridge model. Both stiffness and mass parameters are updated simultaneously at the element level. Measurement and modeling errors are an inevitable part of data acquisition systems and finite element models, and their presence can affect the accuracy of the estimated parameters. Therefore, an error sensitivity analysis using Monte Carlo simulation was used to study the input–output error behavior of each parameter based on the load cases and measurement locations of the nondestructive tests. Given the measured experimental responses, the goal was to select the unknown parameters of the FE model with high observability, leading to a well-conditioned system with the least sensitivity to measurement errors. A data quality study was performed to assess the accuracy and reliability of the measured data, and a subset of the most reliable measurements was selected for the FE model updating. The selected subset of higher-quality measurements and the observable unknown parameters were used for FE model updating. Three static and dynamic error functions were used for structural parameter estimation using the selected measured static strains, displacements, and slopes as well as dynamic natural frequencies and associated mode shapes. The measured data sets were used both separately and together for multiresponse FE model updating to match the predicted analytical response with the measured data, and the FE model was successfully calibrated using the multiresponse data. Two separate commercially available software packages were used with real-time data communication utilizing Application Program Interface (API) scripts; this approach was efficient for automated and systematic FE model updating. The usefulness of the proposed method for automated FE model updating at the element level is shown by its ability to estimate the stiffness and mass parameters simultaneously from experimental data. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
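A compact sketch of the Monte Carlo error-sensitivity idea on a stand-in linear model (a real FE update would call the FE solver rather than a fixed sensitivity matrix): perturbing the "measurements" and re-estimating the parameters exposes which ones are well observed.

```python
# Monte Carlo study of parameter observability in least-squares updating;
# the sensitivity matrix and noise level are invented.
import numpy as np

rng = np.random.default_rng(10)
# Rows = measured responses, columns = unknown parameters. One nearly
# unobservable parameter (tiny column) is included on purpose.
S = np.array([[1.0, 0.5, 0.001],
              [0.8, 1.2, 0.002],
              [0.3, 0.9, 0.001],
              [1.1, 0.2, 0.003]])
theta_true = np.array([2.0, 1.0, 5.0])
y_clean = S @ theta_true

est = []
for _ in range(2000):
    y = y_clean * (1 + rng.normal(0, 0.02, y_clean.size))  # 2% noise
    theta_hat, *_ = np.linalg.lstsq(S, y, rcond=None)
    est.append(theta_hat)
cov = np.std(est, axis=0) / theta_true
for i, c in enumerate(cov):
    print(f"parameter {i + 1}: c.o.v. of estimate = {100 * c:6.1f}%")
```

The poorly observed third parameter shows a huge scatter, which is exactly the symptom the abstract's observability-based parameter selection is designed to avoid.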
46. Application of endurance time method in performance-based optimum design of structures.
- Author
-
Basim, Mohammad Ch. and Estekanchi, Homayoon E.
- Subjects
- *
EARTHQUAKE damage , *OPTIMAL designs (Statistics) , *PARAMETER estimation , *PERFORMANCE evaluation , *STRUCTURAL analysis (Engineering) - Abstract
In this research, the application of the Endurance Time (ET) method in performance-based design of structures, with and without consideration of uncertainties, is investigated and a practical optimum design procedure is proposed. The ET method is used as the analytical assessment tool because of its capability for response estimation at an affordable computational demand. In the first step of the proposed method, ET analysis is implemented in a multi-objective optimum design procedure to obtain a set of Pareto-optimal designs. Optimization is conducted with respect to initial cost and expected life cycle cost using a deterministic approach. For each design alternative, the median damage due to probable earthquakes over its lifetime is estimated by the ET method, and the expected cost of earthquake consequences is calculated using Life Cycle Cost Analysis (LCCA). In the next step, a comprehensive performance assessment is carried out on a candidate optimal design, considering inherent uncertainties, in the framework of FEMA P-58; it is also proposed to use the ET method as the response assessment tool in this framework. Expected damage costs, fatalities and the probability of collapse are estimated using a Monte Carlo approach to account for uncertainties. The candidate design can then be changed to another optimal design from the Pareto set in the case of undesirable performance. The advantages and shortcomings of the method are investigated by comparing the results from the ET method and the recommended procedure using a suite of ground motions. The results provide a pathway towards practical use of the ET analysis method in state-of-the-art performance-based design and probabilistic estimation of losses. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
47. Integrating remediation and resource recovery: On the economic conditions of landfill mining.
- Author
-
Frändegård, Per, Krook, Joakim, and Svensson, Niclas
- Subjects
- *
ENVIRONMENTAL remediation , *WASTE recycling , *LANDFILLS , *SEPARATION (Technology) , *IMPACT factor (Citation analysis) - Abstract
This article analyzes the economic potential of integrating material separation and resource recovery into a landfill remediation project, and discusses the results and the factors with the largest impact. The analysis uses a direct costs/revenues approach, with the stochastic uncertainties handled using Monte Carlo simulation. Two remediation scenarios are applied to a hypothetical landfill. One scenario includes only remediation, while the second adds resource recovery to the remediation project; the second scenario is further divided into two cases, A and B. In case A, the landfill tax must be paid on re-deposited material and the landfill holder does not own a combined heat and power plant (CHP), which leads to disposal costs in the form of gate fees. In case B, the landfill tax is waived on the re-deposited material and the landfill holder owns its own CHP. Results show that the remediation project in the first scenario costs about €23/ton. Adding resource recovery as in case A worsens the result to −€36/ton, while in case B the result improves to −€14/ton, showing the importance of the landfill tax and of access to a CHP. Other important factors are the material composition in the landfill, the efficiency of the separation technology used, and the price of the saleable material. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
48. On the uncertainty of temperature estimation in a rapid compression machine.
- Author
-
Weber, Bryan W., Sung, Chih-Jen, and Renfro, Michael W.
- Subjects
- *
TEMPERATURE measurements , *COMBUSTION chambers , *MONTE Carlo method , *PARAMETER estimation , *DISTRIBUTION (Probability theory) , *ISENTROPIC processes - Abstract
Rapid compression machines (RCMs) have been widely used in the combustion literature to study the low-to-intermediate temperature ignition of many fuels. In a typical RCM, the pressure during and after the compression stroke is measured. However, measurement of the temperature history in the RCM reaction chamber is challenging. Thus, the temperature is generally calculated by the isentropic relations between pressure and temperature, assuming that the adiabatic core hypothesis holds. To estimate the uncertainty in the calculated temperature, an uncertainty propagation analysis must be carried out. Our previous analyses assumed that the uncertainties of the parameters in the equation to calculate the temperature were normally distributed and independent, but these assumptions do not hold for typical RCM operating procedures. In this work, a Monte Carlo method is developed to estimate the uncertainty in the calculated temperature, while taking into account the correlation between parameters and the possibility of non-normal probability distributions. In addition, the Monte Carlo method is compared to an analysis that assumes normally distributed, independent parameters. Both analysis methods show that the magnitude of the initial pressure and the uncertainty of the initial temperature have strong influences on the magnitude of the uncertainty. Finally, the uncertainty estimation methods studied here provide a reference value for the uncertainty of the reference temperature in an RCM and can be generalized to other similar facilities. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
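A hedged sketch of the propagation itself, assuming a constant specific-heat ratio for brevity (the paper integrates temperature-dependent properties): correlated initial conditions and a non-normal (uniform) compressed-pressure uncertainty are sampled jointly and pushed through the isentropic relation T_C = T0 * (P_C / P0) ** ((gamma - 1) / gamma).

```python
# Monte Carlo propagation of correlated, possibly non-normal input
# uncertainties into the compressed-gas temperature; all spreads invented.
import numpy as np

rng = np.random.default_rng(11)
N = 200_000

# Correlated initial pressure and temperature (e.g., both read after the
# same fill procedure): joint normal with correlation 0.5.
mean = [1.0e5, 300.0]                      # P0 in Pa, T0 in K
cov = [[500.0**2, 0.5 * 500.0 * 1.0],
       [0.5 * 500.0 * 1.0, 1.0**2]]
P0, T0 = rng.multivariate_normal(mean, cov, N).T

# Non-normal input: compressed pressure with a uniform transducer tolerance.
PC = rng.uniform(29.5e5, 30.5e5, N)
gamma = 1.4

TC = T0 * (PC / P0) ** ((gamma - 1.0) / gamma)
lo, hi = np.percentile(TC, [2.5, 97.5])
print(f"T_C = {TC.mean():.1f} K, 95% interval [{lo:.1f}, {hi:.1f}] K")
```

Correlation and non-normality enter only through the sampling step, so the same skeleton accommodates whatever joint input distribution a facility's operating procedure implies.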
49. Recovering highly exploited stocks with heterogeneous resilience capacities in mixed fisheries under uncertain illegal fishing.
- Author
-
Villanueva, Raul, Seijo, Juan Carlos, and Aranceta-Garza, Fernando
- Subjects
FISHERIES ,FISHERY management ,MONTE Carlo method ,FISHERY resources ,FISHING ,PRICES - Abstract
The complexity of stock recovery strategies in mixed-species fisheries involving heterogeneous resilience capacities of target and incidentally harvested species is further challenged by possible levels of illegal exploitation. This context adds to the uncertainty for decision-makers when establishing proper management measures to recover highly overexploited stocks towards the selected target reference point (e.g., B_MSY,i). With a dynamic bioeconomic model, we analyzed the stock recovery performance of a mixed fishery with two overexploited species using combinations of moratorium durations (μ_i) and subsequent exploitation rates (F_MSY,i), under possible states of nature of illegal fishing (θ_j). Alternative stock recovery management strategies (D_j) were considered to include the effect of possible θ_j. The two stocks have heterogeneous abundances, renewability capacities, and relative prices. The impact of possible levels of illegal fishing on the legal resource rent, and the corresponding forgone legal resource rent, was also assessed. A Monte Carlo analysis was undertaken to calculate the risk of falling below B_MSY,i under alternative management strategies for dealing with possible illegal fishing (ρ_i, D_j). Overexploited species with lower resilience capacity require longer moratoria, extended further when dealing with alternative states of nature of illegal fishing. Management strategies based on the target species achieved the highest legal resource rent while carrying the highest risk of not recovering the stock of the incidental species with lower resilience capacity. In this study, a mixed-strategy management decision, in which the moratorium is based on the target species and the subsequent exploitation rates (F_i) are based on the lower-resilience species, resulted in minimum risks of falling below B_MSY at high levels of illegal fishing; this same strategy also yielded the maximum NPV of resource rent per vessel for alternative levels of illegal fishing. Lack of consideration of possible levels of illegal fishing may lead to failure in achieving stock recovery to target reference points in mixed fisheries. • Recovery of highly exploited stocks in mixed fisheries under illegal fishing. • Moratoria and subsequent exploitation rates in mixed fisheries. • Assessed effects of heterogeneous species abundance, renewability, and prices. • Maximum rent per vessel: a combination of species-based moratoria and exploitation rates. • Monte Carlo risk analysis of alternative mixed-fishery management strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
50. Dynamic determination of kinetic parameters, computer simulation, and probabilistic analysis of growth of Clostridium perfringens in cooked beef during cooling.
- Author
-
Huang, Lihan
- Subjects
- *
CLOSTRIDIUM perfringens , *BEEF , *COOKING , *BACTERIAL growth , *NUMERICAL analysis , *COMPUTER simulation , *STANDARD deviations - Abstract
The objective of this research was to develop a new one-step methodology that uses a dynamic approach to directly construct a tertiary model for predicting the growth of Clostridium perfringens in cooked beef. The methodology is based on simultaneous numerical analysis and optimization of both primary and secondary models using multiple dynamic growth curves obtained under different conditions. Once the models were constructed, the bootstrap method was used to calculate 95% confidence intervals for the kinetic parameters, and a Monte Carlo simulation method was developed to validate the models using growth curves not previously used in model development. The results showed that the kinetic parameters obtained in this study accurately matched the known characteristics of C. perfringens, with an optimum temperature of 45.3 °C. The predicted growth curves also matched accurately the experimental observations used in validation. The mean of the residuals of the predictions is −0.02 log CFU/g, with a standard deviation of only 0.23 log CFU/g. For relative growths < 1 log CFU/g, the residuals of the predictions are < 0.4 log CFU/g. Overall, 74% of the residuals are < 0.2 log CFU/g, 7.7% are > 0.4 log CFU/g, and only 1.5% are > 0.8 log CFU/g. In addition, the dynamic model accurately predicted four isothermal growth curves arbitrarily chosen from the literature. Finally, the Monte Carlo simulation was used to provide the probabilities of > 1 and > 2 log CFU/g relative growth at the end of cooling. The results of this study provide a new and accurate tool for the food industry and regulatory agencies to assess the safety of cooked beef in the event of a cooling deviation. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
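A schematic sketch with invented parameter values (the paper's one-step fit to multiple dynamic curves is not reproduced): a Ratkowsky-type secondary model is integrated along a cooling profile inside a Monte Carlo loop over an uncertain coefficient, yielding the probabilities of exceeding 1 and 2 log CFU/g.

```python
# Monte Carlo growth prediction during dynamic cooling; all parameter
# values and the cooling profile are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(12)

def temp(t_h):
    """Exponential cooling profile, 54.4 C decaying toward 7 C."""
    return 7.0 + (54.4 - 7.0) * np.exp(-0.25 * t_h)

def mu(T, b, Tmin=12.0, Tmax=52.0, c=0.30):
    """Ratkowsky-type secondary model, log10 CFU/g per hour."""
    if T <= Tmin or T >= Tmax:
        return 0.0
    return (b * (T - Tmin)) ** 2 * (1.0 - np.exp(c * (T - Tmax)))

def rel_growth(b, hours=15.0, dt=0.05, ymax=5.0):
    """Euler integration of dy/dt = mu(T(t)) * (1 - y/ymax)."""
    y = 0.0
    for t in np.arange(0.0, hours, dt):
        y += dt * mu(temp(t), b) * (1.0 - y / ymax)
    return y

# Monte Carlo over the uncertain secondary-model coefficient b:
growth = np.array([rel_growth(b) for b in rng.normal(0.035, 0.004, 2000)])
print(f"mean relative growth = {growth.mean():.2f} log CFU/g")
print(f"P(growth > 1 log) = {(growth > 1.0).mean():.3f}")
print(f"P(growth > 2 log) = {(growth > 2.0).mean():.3f}")
```

Replacing the exponential profile with a measured cooling-deviation record turns the same loop into the safety check described at the end of the abstract.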