90 results for '"open-source code"'
Search Results
2. Open-Source Data Logger System for Real-Time Monitoring and Fault Detection in Bench Testing.
- Author
-
Amorim, Marcio Luís Munhoz, Lima, Jorge Gomes, Torres, Norah Nadia Sánchez, Afonso, Jose A., Lopes, Sérgio F., Carmo, João P. P. do, Hartmann, Lucas Vinicius, Souto, Cicero Rocha, Salvadori, Fabiano, and Ando Junior, Oswaldo Hideo
- Subjects
COMBUSTION efficiency, SYSTEM failures, SPARK ignition engines, EXHAUST systems, DATA acquisition systems, MICROPHONES
- Abstract
This paper presents the design and development of a proof of concept (PoC) open-source data logger system for wireless data acquisition via Wi-Fi aimed at bench testing and fault detection in combustion and electric engines. The system integrates multiple sensors, including accelerometers, microphones, thermocouples, and gas sensors, to monitor critical parameters, such as vibration, sound, temperature, and CO2 levels. These measurements are crucial for detecting anomalies in engine performance, such as ignition and combustion faults. For combustion engines, temperature sensors detect operational anomalies, including diesel engines operating beyond the normal range of 80 °C to 95 °C and gasoline engines between 90 °C and 110 °C. These readings help identify failures in cooling systems, thermostat valves, or potential coolant leaks. Acoustic sensors identify abnormal noises indicative of issues such as belt misalignment, valve knocking, timing irregularities, or loose parts. Vibration sensors detect displacement issues caused by engine mount failures, cracks in the engine block, or defects in pistons and valves. These sensors can work synergistically with acoustic sensors to enhance fault detection. Additionally, CO2 and organic compound sensors monitor fuel combustion efficiency and detect failures in the exhaust system. For electric motors, temperature sensors help identify anomalies, such as overloads, bearing problems, or excessive shaft load. Acoustic sensors diagnose coil issues, phase imbalances, bearing defects, and faults in chain or belt systems. Vibration sensors detect shaft and bearing problems, inadequate motor mounting, or overload conditions. The collected data are processed and analyzed to improve engine performance, contributing to reduced greenhouse gas (GHG) emissions and enhanced energy efficiency. This PoC system leverages open-source technology to provide a cost-effective and versatile solution for both research and practical applications. Initial laboratory tests validate its feasibility for real-time data acquisition and highlight its potential for creating datasets to support advanced diagnostic algorithms. Future work will focus on enhancing telemetry capabilities, improving Wi-Fi and cloud integration, and developing machine learning-based diagnostic methodologies for combustion and electric engines. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
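The abstract above quotes explicit normal operating ranges (diesel 80–95 °C, gasoline 90–110 °C). As an editorial illustration only, not code from the paper, a minimal Python sketch of such a threshold-based temperature fault check could look like this; the function name and range table are assumptions:

```python
# Illustrative threshold check based on the ranges quoted in the abstract.
# The ranges and the function are editorial assumptions, not the authors' code.
NORMAL_RANGES_C = {
    "diesel": (80.0, 95.0),
    "gasoline": (90.0, 110.0),
}

def temperature_fault(engine_type: str, temp_c: float) -> bool:
    """Return True if the reading falls outside the quoted normal range."""
    low, high = NORMAL_RANGES_C[engine_type]
    return not (low <= temp_c <= high)

if __name__ == "__main__":
    print(temperature_fault("diesel", 102.0))   # True: above 95 C
    print(temperature_fault("gasoline", 95.0))  # False: within 90-110 C
```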
3. Open-Source Data Logger System for Real-Time Monitoring and Fault Detection in Bench Testing
- Author
-
Marcio Luís Munhoz Amorim, Jorge Gomes Lima, Norah Nadia Sánchez Torres, Jose A. Afonso, Sérgio F. Lopes, João P. P. do Carmo, Lucas Vinicius Hartmann, Cicero Rocha Souto, Fabiano Salvadori, and Oswaldo Hideo Ando Junior
- Subjects
open-source code, data logger, Internet of Things, wireless communication, combustion engines, electrical engines, Engineering machinery, tools, and implements, TA213-215, Technological innovations. Automation, HD45-45.2
- Abstract
This paper presents the design and development of a proof of concept (PoC) open-source data logger system for wireless data acquisition via Wi-Fi aimed at bench testing and fault detection in combustion and electric engines. The system integrates multiple sensors, including accelerometers, microphones, thermocouples, and gas sensors, to monitor critical parameters, such as vibration, sound, temperature, and CO2 levels. These measurements are crucial for detecting anomalies in engine performance, such as ignition and combustion faults. For combustion engines, temperature sensors detect operational anomalies, including diesel engines operating beyond the normal range of 80 °C to 95 °C and gasoline engines between 90 °C and 110 °C. These readings help identify failures in cooling systems, thermostat valves, or potential coolant leaks. Acoustic sensors identify abnormal noises indicative of issues such as belt misalignment, valve knocking, timing irregularities, or loose parts. Vibration sensors detect displacement issues caused by engine mount failures, cracks in the engine block, or defects in pistons and valves. These sensors can work synergistically with acoustic sensors to enhance fault detection. Additionally, CO2 and organic compound sensors monitor fuel combustion efficiency and detect failures in the exhaust system. For electric motors, temperature sensors help identify anomalies, such as overloads, bearing problems, or excessive shaft load. Acoustic sensors diagnose coil issues, phase imbalances, bearing defects, and faults in chain or belt systems. Vibration sensors detect shaft and bearing problems, inadequate motor mounting, or overload conditions. The collected data are processed and analyzed to improve engine performance, contributing to reduced greenhouse gas (GHG) emissions and enhanced energy efficiency. This PoC system leverages open-source technology to provide a cost-effective and versatile solution for both research and practical applications. Initial laboratory tests validate its feasibility for real-time data acquisition and highlight its potential for creating datasets to support advanced diagnostic algorithms. Future work will focus on enhancing telemetry capabilities, improving Wi-Fi and cloud integration, and developing machine learning-based diagnostic methodologies for combustion and electric engines.
- Published
- 2024
- Full Text
- View/download PDF
4. A Data-Driven Analysis of Formula 1 Car Races Outcome
- Author
-
Patil, Ankur, Jain, Nishtha, Agrahari, Rahul, Hossari, Murhaf, Orlandi, Fabrizio, Dev, Soumyabrata, Filipe, Joaquim, Editorial Board Member, Ghosh, Ashish, Editorial Board Member, Prates, Raquel Oliveira, Editorial Board Member, Zhou, Lizhu, Editorial Board Member, Longo, Luca, editor, and O’Reilly, Ruairi, editor
- Published
- 2023
- Full Text
- View/download PDF
5. Open-Source Codes of Topology Optimization: A Summary for Beginners to Start Their Research.
- Author
-
Yingjun Wang, Xinqing Li, Kai Long, and Peng Wei
- Subjects
LEVEL set methods, TOPOLOGY, STRUCTURAL optimization
- Abstract
Topology optimization (TO), a numerical technique to find the optimal material layout within a given design domain, has attracted interest from researchers in the field of structural optimization in recent years. For beginners, open-source codes are undoubtedly the best alternative to learning TO, which can elaborate the implementation of a method in detail and easily engage more people to employ and extend the method. In this paper, we present a summary of various open-source codes and related literature on TO methods, including solid isotropic material with penalization (SIMP), evolutionary method, level set method (LSM), moving morphable components/voids (MMC/MMV) methods, multiscale topology optimization method, etc. Simultaneously, we classify the codes into five levels, from easy to difficult, depending on their difficulty, so that beginners can get started and understand the form of code implementation more quickly. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
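For orientation, the density-based SIMP formulation that most of the surveyed educational codes implement can be stated compactly as below. This is the standard textbook minimum-compliance problem with penalized stiffness, added here as context rather than taken from the paper:

```latex
% Standard SIMP minimum-compliance formulation (textbook form, not from the paper)
\begin{aligned}
\min_{\boldsymbol{\rho}} \quad & c(\boldsymbol{\rho}) = \mathbf{U}^{T}\mathbf{K}(\boldsymbol{\rho})\,\mathbf{U}
  = \sum_{e=1}^{N} E_e(\rho_e)\,\mathbf{u}_e^{T}\mathbf{k}_0\,\mathbf{u}_e \\
\text{s.t.} \quad & \mathbf{K}(\boldsymbol{\rho})\,\mathbf{U} = \mathbf{F}, \qquad
  \sum_{e} v_e\,\rho_e \le f\,V_0, \qquad 0 < \rho_{\min} \le \rho_e \le 1, \\
\text{with} \quad & E_e(\rho_e) = E_{\min} + \rho_e^{\,p}\,(E_0 - E_{\min}), \quad p \approx 3 .
\end{aligned}
```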
6. Statistical study of the deviations in ignition delay time estimates for H2/CH4 mixtures using a commercial and an open-source code.
- Author
-
Yepes, Hernando A., Salazar, Adalberto, and Cardona, Arley
- Published
- 2023
- Full Text
- View/download PDF
7. An independent analysis of bias sources and variability in wind plant pre‐construction energy yield estimation methods
- Author
-
Austin C. Todd, Mike Optis, Nicola Bodini, Michael Jason Fields, Jordan Perr‐Sauer, Joseph C. Y. Lee, Eric Simley, and Robert Hammond
- Subjects
annual energy production, benchmark, energy yield assessment, open-source code, operational analysis, P50 bias, Renewable energy sources, TJ807-830
- Abstract
Abstract The wind resource assessment community has long had the goal of reducing the bias between wind plant pre‐construction energy yield assessment (EYA) and the observed annual energy production (AEP). This comparison is typically made between the 50% probability of exceedance (P50) value of the EYA and the long‐term corrected operational AEP (hereafter OA AEP) and is known as the P50 bias. The industry has critically lacked an independent analysis of bias investigated across multiple consultants to identify the greatest sources of uncertainty and variance in the EYA process and the best opportunities for uncertainty reduction. The present study addresses this gap by benchmarking consultant methodologies against each other and against operational data at a scale not seen before in industry collaborations. We consider data from 10 wind plants in North America and evaluate discrepancies between eight consultancies in the steps taken from estimates of gross to net energy. Consultants tend to overestimate the gross energy produced at the turbines and then compensate by further overestimating downstream losses, leading to a mean P50 bias near zero, still with significant variability among the individual wind plants. Within our data sample, we find that consultant estimates of all loss categories, except environmental losses, tend to reduce the project‐to‐project variability of the P50 bias. The disagreement between consultants, however, remains flat throughout the addition of losses. Finally, we find that differences in consultants' estimates of project performance can lead to differences up to $10/MWh in the levelized cost of energy for a wind plant.
- Published
- 2022
- Full Text
- View/download PDF
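For readers unfamiliar with the terminology, the P50 bias discussed in this abstract is commonly defined as the relative difference between the pre-construction P50 estimate and the long-term corrected operational AEP. The sign convention below is one common choice and is not taken from the paper itself:

```latex
% One common definition of the P50 bias (sign convention assumed, not from the paper)
\mathrm{P50\ bias} \;=\; \frac{\mathrm{AEP}_{P50,\,\mathrm{EYA}} - \mathrm{AEP}_{\mathrm{OA}}}{\mathrm{AEP}_{\mathrm{OA}}}
```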
8. The VM2D Open Source Code for Two-Dimensional Incompressible Flow Simulation by Using Fully Lagrangian Vortex Particle Methods.
- Author
-
Marchevsky, Ilia, Sokol, Kseniia, Ryatina, Evgeniya, and Izmailova, Yulia
- Subjects
FLOW simulations, VORTEX methods, TWO-dimensional bar codes, FLUID-structure interaction, SOURCE code, INCOMPRESSIBLE flow, CENTRAL processing units
- Abstract
This article describes the open-source C++ code VM2D for the simulation of two-dimensional viscous incompressible flows and solving fluid-structure interaction problems. The code is based on the Viscous Vortex Domains (VVD) method developed by Prof. G. Ya. Dynnikova, where the viscosity influence is taken into account by introducing the diffusive velocity. The original VVD method was supplemented by the author's algorithms for boundary condition satisfaction, which made it possible to increase the accuracy of flow simulation near the airfoil's surface line and reduce oscillations when calculating hydrodynamic loads. This paper is aimed primarily at assessing the efficiency of the parallelization of the algorithm. OpenMP, MPI, and Nvidia CUDA parallel programming technologies are used in VM2D, which allow performing simulations on computer systems of various architectures, including those equipped with graphics accelerators. Since the VVD method belongs to the particle methods, the efficiency of parallelization with the usage of graphics accelerators turns out to be quite high. It is shown that in a real simulation, one graphics card can replace about 80 nodes, each of which is equipped with 28 CPU cores. The source code of VM2D is available on GitHub under GNU GPL license. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
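As background to the Viscous Vortex Domains method mentioned above: its key idea is to account for viscosity through a diffusive velocity added to the convective transport of vorticity. The schematic 2D form below is the commonly cited expression and is given for orientation only; consult the paper and Dynnikova's original work for the exact formulation used in VM2D:

```latex
% Schematic form of the diffusive-velocity idea behind the VVD method (orientation only)
\frac{\partial \omega}{\partial t} + \nabla \cdot \big[(\mathbf{V} + \mathbf{V}_d)\,\omega\big] = 0,
\qquad
\mathbf{V}_d = -\,\nu\,\frac{\nabla \omega}{\omega}
```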
9. Open-source code to convert Journal Article Tag Suite Extensible Markup Language (JATS XML) to various viewers and other XML types for scholarly journal publishing
- Author
-
Younsang Cho
- Subjects
crossref, digital publishing, github, jats xml, open-source code, Science (General), Q1-390
- Abstract
There are many ways to use open source code to implement digital standards for scholarly journal publishing. However, providing digital services using open-source code can be a challenge, especially for small and local academic society journals. This paper provides some critical examples of using some of the many open-source code resources available to the public. Journal Article Tag Suite (JATS) Extensible Markup Language (XML) has been established as an essential tool, and is now used by most journals for digital publication. JATS XML can be converted to other viewer formats, including Extensible Hypertext Markup Language, PubReader, and EPUB 3.0. It can also be used to create dynamic interactive PDFs. It can be converted to other XMLs, including Crossref XML, PubMed XML, and DOAJ XML. Open-source code published on GitHub, National Information Standards Organization, and the US National Library of Medicine can be used for Crossref XML deposition for digital object identifier and Crossmark stamp registration. These examples of open-source code need to be implemented on journal websites to provide local academic journal publishers with various critical functions. This paper provides instructions on the best ways to realize these digital standards so that journal content can be provided to readers in a more friendly and effective way.
- Published
- 2022
- Full Text
- View/download PDF
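As a small, hypothetical illustration of the kind of JATS XML processing the paper surveys (not code from the article), the following sketch pulls the article title and DOI out of a JATS file with Python's standard library; the file name and element paths are assumptions based on common JATS tagging:

```python
# Hypothetical example of reading basic metadata from a JATS XML file.
# File name and element paths are assumptions; they are not taken from the paper.
import xml.etree.ElementTree as ET

def jats_metadata(path: str) -> dict:
    """Extract the article title and DOI from a JATS-tagged article, if present."""
    root = ET.parse(path).getroot()
    title = root.find(".//article-title")
    doi = root.find(".//article-id[@pub-id-type='doi']")
    return {
        "title": title.text if title is not None else None,
        "doi": doi.text if doi is not None else None,
    }

if __name__ == "__main__":
    print(jats_metadata("article.xml"))  # e.g. {'title': ..., 'doi': ...}
```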
10. Innovative modeling and simulation of membrane-based dehumidification and energy recovery equipment
- Author
-
Zhiming Gao, Joe Rendall, Kashif Nawaz, Ahmad Abuheiba, and Omar Abdelaziz
- Subjects
Membrane, Dehumidification, Energy recovery, Modeling, Open-source code, Engineering (General). Civil engineering (General), TA1-2040
- Abstract
Membrane-based dehumidification is a promising solution for building applications because of its low cost and limited energy consumption. Developing an efficient and cost-effective open-source code simulation tool is important for optimizing and evaluating such devices in HVAC applications. This paper describes a physics-based model, which accounts for the fundamental heat and mass transfer between a humid-air vapor stream on the feed side and a flowing stream on the permeate side of a membrane. The developed model comprises two mass transfer submodels—a microstructure model and a performance map model—and adopts a segment-by-segment method for discretizing heat and mass transfer governing equations for flow streams on the feed and permeate sides of a membrane. The model can simulate dehumidifiers and energy recovery ventilators with parallel-flow, cross-flow, and counter-flow configurations, and the predictions compare reasonably well with the measurements. The model was used to evaluate the effect of membrane microstructure parameters and membrane surface deflection factors, as well as to investigate the performance of combined dehumidification and energy recovery exchangers. The model and C++ open-source codes are expected to become a fundamental tool in analyzing future membrane-based dehumidification systems.
- Published
- 2023
- Full Text
- View/download PDF
11. WEC-SIM Phase 1 Validation Testing -- Numerical Modeling of Experiments: Preprint
- Author
-
Yu, Yi-Hsiang
- Published
- 2016
- Full Text
- View/download PDF
12. RESEARCH OF POSSIBILITIES OF DEFAULT REFACTORING ACTIONS IN SWIFT LANGUAGE.
- Author
-
Tkachuk, Andrii and Bulakh, Bogdan
- Subjects
SOFTWARE refactoring, PROGRAMMING languages, SOURCE code, SOFTWARE development tools, PROBLEM solving, DEFAULT (Finance)
- Abstract
The object of research in the paper is a built-in refactoring mechanism in the Swift programming language. Swift has gained a lot of popularity recently, which is why there are many new challenges associated with the need to support and modify the source code written in this programming language. The problem is that the more powerful refactoring mechanism that can be applied to Swift is proprietary and cannot be used by other software. Moreover, even closed-source refactoring software tools are not capable of performing more complex queries. To explore the possibilities of expanding the built-in refactoring, it is suggested to investigate the software implementation of the sourcekit component of the Swift programming language, which is responsible for working with «raw» source code, and to implement new refactoring action in practice. To implement the research plan, one refactoring activity that was not present in the refactoring utilities (adding an implementation of the Equatable protocol) was chosen. Its implementation was developed using the components and resources provided within the sourcekit component. To check the correctness and compliance with the development conditions, several tests were created and conducted. It has been discovered that both refactoring mechanisms supported by the Swift programming language have a limited context and a limited scope and application. That is why the possibility of expanding the functionality should not be based on the local level of code processing, but on the upper level, where it is possible to combine several source files, which often happens in projects. The work was directed to the development of the own refactoring action to analyze and obtain a perfect representation of the advantages and disadvantages of the existing component. As a result, a new approach to refactoring was proposed, which will allow solving the problems described above. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
13. An independent analysis of bias sources and variability in wind plant pre‐construction energy yield estimation methods.
- Author
-
Todd, Austin C., Optis, Mike, Bodini, Nicola, Fields, Michael Jason, Perr‐Sauer, Jordan, Lee, Joseph C. Y., Simley, Eric, and Hammond, Robert
- Subjects
WIND power plants, WIND power, OFFSHORE wind power plants, OPERATIONS research, COMMUNITY life, CONSULTANTS
- Abstract
The wind resource assessment community has long had the goal of reducing the bias between wind plant pre‐construction energy yield assessment (EYA) and the observed annual energy production (AEP). This comparison is typically made between the 50% probability of exceedance (P50) value of the EYA and the long‐term corrected operational AEP (hereafter OA AEP) and is known as the P50 bias. The industry has critically lacked an independent analysis of bias investigated across multiple consultants to identify the greatest sources of uncertainty and variance in the EYA process and the best opportunities for uncertainty reduction. The present study addresses this gap by benchmarking consultant methodologies against each other and against operational data at a scale not seen before in industry collaborations. We consider data from 10 wind plants in North America and evaluate discrepancies between eight consultancies in the steps taken from estimates of gross to net energy. Consultants tend to overestimate the gross energy produced at the turbines and then compensate by further overestimating downstream losses, leading to a mean P50 bias near zero, still with significant variability among the individual wind plants. Within our data sample, we find that consultant estimates of all loss categories, except environmental losses, tend to reduce the project‐to‐project variability of the P50 bias. The disagreement between consultants, however, remains flat throughout the addition of losses. Finally, we find that differences in consultants' estimates of project performance can lead to differences up to $10/MWh in the levelized cost of energy for a wind plant. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
14. An open-source code to calculate pressure-composition-temperature diagrams of multicomponent alloys for hydrogen storage.
- Author
-
Pedroso, Otávio Abreu, Botta, Walter José, and Zepon, Guilherme
- Subjects
HYDROGEN storage, ALLOYS, PYTHON programming language, TRACE elements, HYDRIDES
- Abstract
Recently, multicomponent alloys have been studied for hydrogen storage because of their vast compositional field, which opened an exciting path for designing alloys with optimized properties for any specific application, in a properties-on-demand approach. Since the experimental measurements of hydrogen storage properties are very time-consuming, computational tools to assist the exploration of the endless compositional field of multicomponent alloys are needed. In a previous work reported by Zepon et al. (2021), a thermodynamic model to calculate pressure-composition-temperature (PCT) diagrams for body-centered-cubic (BCC) multicomponent alloys was proposed. In the present work, we implemented this model in an open-source code with a user-friendly interface to calculate PCT diagrams for BCC multicomponent alloys having any of the following elements: Mg, Al, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Zr, Nb, Mo, Pd, Hf, and Ta. The open-source code aims to allow the use of the thermodynamic model for alloy design as well as to encourage other researchers to improve the inputs and the initial thermodynamic model. As an example of application of the model for alloy design, the code was employed to investigate the effect of different metals (M) on the PCT diagrams of Ti0.3V0.3Nb0.3M0.1 alloys. • Thermodynamic model for BCC multicomponent alloys was implemented in a code. • The code is open-source and was developed in Python language. • PCT diagrams of BCC multicomponent alloys can be calculated. • The user-friendly code may assist the alloy design for hydrogen storage. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
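As general background (not the model implemented in the paper, which follows Zepon et al.'s more detailed thermodynamic treatment), the classical van't Hoff relation below is often used as a first-order estimate of a metal hydride's plateau pressure; the enthalpy and entropy values in the example are placeholders:

```python
# Classical van't Hoff estimate of an equilibrium plateau pressure.
# Generic background only, NOT the thermodynamic model implemented in the paper;
# the numerical values below are placeholders for illustration.
import math

R = 8.314  # J/(mol K)

def plateau_pressure_atm(dH_f: float, dS_f: float, T_K: float) -> float:
    """ln(P_eq / 1 atm) = dH_f/(R*T) - dS_f/R, with formation enthalpy and entropy
    per mole of H2 (both usually negative for a stable hydride)."""
    return math.exp(dH_f / (R * T_K) - dS_f / R)

if __name__ == "__main__":
    # Placeholder values of an order typical for BCC alloy hydrides.
    print(plateau_pressure_atm(dH_f=-35_000.0, dS_f=-110.0, T_K=298.0))  # ~0.4 atm
```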
15. Reliability Study of Inertial Sensors LIS2DH12 Compared to ActiGraph GT9X: Based on Free Code.
- Author
-
Martín-Martín, Jaime, Jiménez-Partinen, Ariadna, De-Torres, Irene, Escriche-Escuder, Adrian, González-Sánchez, Manuel, Muro-Culebras, Antonio, Roldán-Jiménez, Cristina, Ruiz-Muñoz, María, Mayoral-Cleries, Fermín, Biró, Attila, Tang, Wen, Nikolova, Borjanka, Salvatore, Alfredo, and Cuesta-Vargas, Antonio I.
- Subjects
INTRACLASS correlation, DETECTORS
- Abstract
The study's purpose was to assess the reliability of the LIS2DH12 in two different positions, using the commercial sensor Actigraph GT9X as a reference instrument. Five participants completed two gait tests on a treadmill. Firstly, both sensors were worn on the wrist and around the thigh. Each test consisted of a 1 min walk for participants to become accustomed to the treadmill, followed by a 2 min trial at ten pre-set speeds. Data from both sensors were collected in real-time. Intraclass correlation coefficient (ICC) was used to evaluate the equality of characteristics obtained by both sensors: maximum peaks, minimum peaks, and the mean of the complete signal (sequence of acceleration values along the time) by each axis and speed were extracted to evaluate the equality of characteristics obtained with LIS2DH12 compared to Actigraph. Intraclass correlation coefficient (ICC) was extracted, and a standard deviation of the mean was obtained from the data. Our results show that LIS2DH12 measurements present more reliability than Actigraph GT9X, ICC > 0.8 at three axes. This study concludes that LIS2DH12 is as reliable and accurate as Actigraph GT9X Link and, therefore, would be a suitable tool for future kinematic studies. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
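As an illustration of the agreement statistic named in the abstract (the intraclass correlation coefficient), a minimal Python sketch is shown below. It uses the third-party pingouin package purely as an example; the paper does not state which software was used, and the column names and toy data are assumptions:

```python
# Minimal ICC example with the pingouin package (an editorial choice, not the
# software used in the study). Column names, layout, and values are assumptions.
import pandas as pd
import pingouin as pg

# Long-format table: one row per (trial, sensor) measurement of the same feature.
df = pd.DataFrame({
    "trial":  [1, 1, 2, 2, 3, 3, 4, 4],
    "sensor": ["LIS2DH12", "ActiGraph"] * 4,
    "value":  [1.02, 1.05, 0.98, 1.01, 1.10, 1.08, 0.95, 0.97],
})

icc = pg.intraclass_corr(data=df, targets="trial", raters="sensor", ratings="value")
print(icc[["Type", "ICC", "CI95%"]])  # ICC > 0.8 is commonly read as good reliability
```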
16. Power Spectral Density Background Estimate and Signal Detection via the Multitaper Method.
- Author
-
Di Matteo, S., Viall, N. M., and Kepko, L.
- Abstract
We present a new spectral analysis method for the identification of periodic signals in geophysical time series. We evaluate the power spectral density with the adaptive multitaper method, a nonparametric spectral analysis technique suitable for time series characterized by colored power spectral density. Our method provides a maximum likelihood estimation of the power spectral density background according to four different models. It includes the option for the models to be fitted on four smoothed versions of the power spectral density when there is a need to reduce the influence of power enhancements due to periodic signals. We use a statistical criterion to select the best background representation among the different smoothing + model pairs. Then, we define the confidence thresholds to identify the power spectral density enhancements related to the occurrence of periodic fluctuations (γ test). We combine the results with those obtained with the multitaper harmonic F test, an additional complex-valued regression analysis from which it is possible to estimate the amplitude and phase of the signals. We demonstrate the algorithm on Monte Carlo simulations of synthetic time series and a case study of magnetospheric field fluctuations directly driven by periodic density structures in the solar wind. The method is robust and flexible. Our procedure is freely available as a stand-alone IDL code at https://zenodo.org/record/3703168. The modular structure of our methodology allows the introduction of new smoothing methods and models to cover additional types of time series. The flexibility and extensibility of the technique makes it broadly suitable to any discipline. Key Points: Our technique provides a robust estimate of the continuous background of colored Power Spectral Density. This method uses a combination of spectral and harmonic statistical tests to identify periodic fluctuations. There are multiple options for the method of Power Spectral Density smoothing and the background model. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
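To make the multitaper idea concrete (the authors' adaptive implementation and background-fitting procedure live in their IDL code at the Zenodo link above), the snippet below is only a simplified, non-adaptive Python sketch using SciPy's Slepian tapers:

```python
# Simplified (non-adaptive) multitaper PSD estimate using Slepian (DPSS) tapers.
# Illustration of the basic idea only; the authors' released code is in IDL and
# additionally fits background models and applies the gamma and harmonic F tests.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, NW=4.0, K=7):
    """Average the eigenspectra obtained from K DPSS tapers (equal weights)."""
    x = np.asarray(x, dtype=float)
    N = x.size
    tapers = dpss(N, NW, Kmax=K)                     # shape (K, N)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0) / fs
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)
    return freqs, psd

if __name__ == "__main__":
    fs = 10.0
    t = np.arange(0, 600, 1.0 / fs)
    x = np.sin(2 * np.pi * 0.5 * t) + np.random.default_rng(0).normal(size=t.size)
    f, p = multitaper_psd(x, fs)
    print(f[np.argmax(p[1:]) + 1])  # dominant peak should be near 0.5 Hz
```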
17. USING THE OPEN-SOURCE SOFTWARE OPENFOAM® WITH A TENSOR-BASED, OBJECT-ORIENTED APPROACH IN FLUID MECHANICS SIMULATIONS.
- Author
-
Berberović, Edin
- Published
- 2019
18. Statistical study of the deviations in ignition delay time estimates for H2/CH4 mixtures using a commercial and an open-source code
- Author
-
Yepes, Hernando A., Salazar, Adalberto, and Cardona, Arley
- Abstract
An adequate ignition delay time prediction is one of the most important study fields in combustion engineering. In this way, the aim of this study was to evaluate the possible deviations that an open-source program such as Cantera may present with respect to the results delivered by a commercial application, in this case, CHEMKIN 19.0. The methodology used in the work was based on the comparison of means with respect to a fixed value and the analysis of variance (ANOVA), considering a complete factorial experimental design of the 2^k type. A variable transformation on the relative differences was applied in order to achieve the normal distribution condition. The obtained results establish that disagreement exists from a statistical point of view, although negligible for a practical and engineering focus. In conclusion, a confidence interval and upper threshold could be established for the differences with 95 % confidence. The sixth root of the relative differences is lower than 0.8 and lies within the 0.67 to 0.71 interval, confirming that the deviations are irrelevant since the relative differences are even smaller.
- Published
- 2023
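Since entries 6 and 18 both concern ignition delay calculations with Cantera, a minimal constant-volume ignition delay sketch in Python is shown below for context. The mechanism file, mixture, conditions, and the 400 K temperature-rise criterion are assumptions for illustration, not the study's settings:

```python
# Minimal constant-volume ignition delay estimate with Cantera (illustrative only).
# Mechanism, mixture, initial state, and the temperature-rise criterion are assumptions,
# not the conditions or procedure used in the study.
import numpy as np
import cantera as ct

gas = ct.Solution("gri30.yaml")                      # GRI-Mech 3.0 (covers H2/CH4)
gas.TPX = 1200.0, 20.0 * ct.one_atm, "CH4:0.5, H2:0.5, O2:2.0, N2:7.52"

reactor = ct.IdealGasReactor(gas)
sim = ct.ReactorNet([reactor])

T0 = reactor.T
ignition_delay = None
for t in np.linspace(1e-6, 0.05, 5000):             # march out to 50 ms
    sim.advance(t)
    if reactor.T > T0 + 400.0:                       # simple temperature-rise criterion
        ignition_delay = t
        break

print(f"Estimated ignition delay: {ignition_delay}")
```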
19. Scientific tool integration in CBRAIN
- Author
-
Beck, Natacha, Adalat, Reza, Boroday, Sergiy, Das, Samir, Glatard, Tristan, Khalili-Mahani, Najmeh, Lecours-Boucher, Xavier, Pham, Xuan Mai, Quesnel, Darcy, Rioux, Pierre, Caron, Bryan, and Evans, Alan
- Subjects
Informatics, Neuroinformatics and data sharing, Databasing and data sharing, workflows, Open-source code
- Abstract
Accepted abstract and poster presented at the Organization for Human Brain Mapping Conference 2023, July 2023.
- Published
- 2023
- Full Text
- View/download PDF
20. Efficient and scalable access to the UK Biobank data using the NeuroHub Platform
- Author
-
Caron, Bryan, Abou-Haider, Rida, Beck, Natacha, Boroday, Serge, Das, Samir, Hutton, Alexandre, Le, Diana, Lecours-Boucher, Xavier, O'Brien, Emmet, Quesnel, Darcy, Rioux, Pierre, Poline, Jean-Baptiste, and Evans, Alan
- Subjects
Computational Neuroscience, Data Analysis, Informatics, Neuroinformatics and data sharing, Computing, Open-Source Code, Data Organization, Workflows
- Abstract
Accepted abstract and poster for presentation at the Organization for Human Brain Mapping Conference, July 2023.
- Published
- 2023
- Full Text
- View/download PDF
21. A MATLAB Package for Calculating Partial Derivatives of Surface-Wave Dispersion Curves by a Reduced Delta Matrix Method.
- Author
-
Wu, Dunshi, Wang, Xiaowei, Su, Qin, and Zhang, Tao
- Subjects
GROUP velocity, PHASE velocity, IMPLICIT functions, DISPERSION (Chemistry), CURVES
- Abstract
Various surface-wave exploration methods have become increasingly important tools in investigating the properties of subsurface structures. Inversion of the experimental dispersion curves is generally an indispensable component of these methods. Accurate and reliable calculation of partial derivatives of surface-wave dispersion curves with respect to parameters of subsurface layers is critical to the success of these approaches if the linearized inversion strategies are adopted. Here we present an open-source MATLAB package, named SWPD (Surface Wave Partial Derivative), for modeling surface-wave (both Rayleigh- and Love-wave) dispersion curves (both phase and group velocity) and particularly for computing their partial derivatives with high precision. The package is able to compute partial derivatives of phase velocity and of Love-wave group velocity analytically based on the combined use of the reduced delta matrix theory and the implicit function theorem. For partial derivatives of Rayleigh-wave group velocity, a hemi-analytical method is presented, which analytically calculates all the first-order partial differentiations and approximates the mixed second-order partial differentiation term with a central difference scheme. We provide examples to demonstrate the effectiveness of this package, and demo scripts are also provided for users to reproduce all results of this paper and thus to become familiar with the package as quickly as possible. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
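The analytical derivative strategy described above rests on the implicit function theorem applied to the dispersion relation. In generic form (notation assumed here, not taken from the paper), if the secular function F vanishes along a dispersion curve, the phase-velocity sensitivity to a layer parameter m follows directly:

```latex
% Generic statement of the implicit-function-theorem step (notation assumed)
F\!\left(\omega, c, m\right) = 0
\quad\Longrightarrow\quad
\frac{\partial c}{\partial m}
  = -\,\frac{\partial F / \partial m}{\partial F / \partial c},
\qquad \text{holding } \omega \text{ fixed.}
```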
22. Open-source chemogenomic data-driven algorithms for predicting drug–target interactions.
- Author
-
Hao, Ming, Bryant, Stephen H, and Wang, Yanli
- Subjects
CHEMICAL biology, HUMAN genome, ALGORITHMS, PROGRAMMING languages, SOURCE code, RANKING (Statistics), OPEN source software
- Abstract
While novel technologies such as high-throughput screening have advanced together with significant investment by pharmaceutical companies during the past decades, the success rate for drug development has not yet been improved prompting researchers looking for new strategies of drug discovery. Drug repositioning is a potential approach to solve this dilemma. However, experimental identification and validation of potential drug targets encoded by the human genome is both costly and time-consuming. Therefore, effective computational approaches have been proposed to facilitate drug repositioning, which have proved to be successful in drug discovery. Doubtlessly, the availability of open-accessible data from basic chemical biology research and the success of human genome sequencing are crucial to develop effective in silico drug repositioning methods allowing the identification of potential targets for existing drugs. In this work, we review several chemogenomic data-driven computational algorithms with source codes publicly accessible for predicting drug–target interactions (DTIs). We organize these algorithms by model properties and model evolutionary relationships. We re-implemented five representative algorithms in R programming language, and compared these algorithms by means of mean percentile ranking, a new recall-based evaluation metric in the DTI prediction research field. We anticipate that this review will be objective and helpful to researchers who would like to further improve existing algorithms or need to choose appropriate algorithms to infer potential DTIs in the projects. The source codes for DTI predictions are available at: https://github.com/minghao2016/chemogenomicAlg4DTIpred. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
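The abstract introduces mean percentile ranking (MPR) as its recall-based comparison metric. A small, hypothetical Python sketch of one common way to compute it is given below; the exact definition used in the paper may differ (for instance in tie handling or normalization), so treat this as an assumption-laden illustration:

```python
# Hypothetical mean percentile ranking (MPR) computation for drug-target prediction.
# One common convention is shown; the paper's exact definition may differ.
import numpy as np

def mean_percentile_ranking(score_matrix: np.ndarray,
                            true_pairs: list[tuple[int, int]]) -> float:
    """Average, over held-out (drug, target) pairs, of the percentile position of the
    true target within the drug's ranked candidate list (0 = top of the list)."""
    n_targets = score_matrix.shape[1]
    percentiles = []
    for drug, target in true_pairs:
        order = np.argsort(-score_matrix[drug])       # descending predicted scores
        rank = int(np.where(order == target)[0][0])   # 0-based rank of the true target
        percentiles.append(rank / (n_targets - 1))
    return float(np.mean(percentiles))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scores = rng.random((5, 100))                     # 5 drugs x 100 candidate targets
    print(mean_percentile_ranking(scores, [(0, 3), (1, 42), (2, 7)]))
```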
23. CHAP: Open-source software for processing and analyzing pupillometry data.
- Author
-
Hershman, Ronen, Henik, Avishai, and Cohen, Noga
- Subjects
PUPILLARY reflex, GRAPHICAL user interfaces, COMPUTER software
- Abstract
Pupil dilation is an effective indicator of cognitive and affective processes. Although several eyetracker systems on the market can provide effective solutions for pupil dilation measurement, there is a lack of tools for processing and analyzing the data provided by these systems. For this reason, we developed CHAP: open-source software written in MATLAB. This software provides a user-friendly graphical user interface for processing and analyzing pupillometry data. Our software creates uniform conventions for the preprocessing and analysis of pupillometry data and provides a quick and easy-to-use tool for researchers interested in pupillometry. To download CHAP or join our mailing list, please visit CHAP's website: http://in.bgu.ac.il/en/Labs/CNL/chap. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
24. WEC-Sim Phase 1 Validation Testing: Numerical Modeling of Experiments
- Author
-
Yu, Yi-Hsiang
- Published
- 2016
- Full Text
- View/download PDF
25. A Computational Framework for Aerodynamic and Aeroelastic Modeling of Wind Loads on Tall Buildings
- Author
-
Melaku, Abiy Fantaye
- Subjects
validation, fluid-structure interaction (FSI), computational efficiency, inflow turbulence generation, wind-induced response, Structural Engineering, Atmospheric boundary layer (ABL), open-source code, Computational Engineering, aeroelastic modeling, structural dynamics, computational fluid dynamics (CFD), Civil Engineering, tall building, large-eddy simulation (LES), spectral representation method, wind loads, software implementation
- Abstract
Driven by the burgeoning growth of computing power over the last few decades, the capability of computational fluid dynamics (CFD) to simulate turbulent flows of practical interest has progressed rapidly. In the past, a notable research effort has been dedicated to applying CFD for modeling wind loads on structures, particularly for tall buildings. However, the current state of CFD for wind load evaluation of tall buildings using Large-Eddy Simulation (LES) has several critical challenges, including the treatment of atmospheric boundary layer (ABL) flow conditions, turbulence modeling of separated flows around buildings, and simulation of wind-structure interaction for dynamically sensitive buildings. For CFD to be a practically useful wind engineering tool, these challenges must be addressed adequately meeting the rigors of the current wind engineering practice. This thesis presents the development of a CFD-based framework for accurate aerodynamic and aeroelastic modeling of tall buildings with the objective of overcoming these key limitations. The capabilities of the framework are demonstrated using a series of case studies. The CFD-based framework is developed in three major phases. In the first phase, computationally efficient methods were developed for modeling the characteristics of the approaching ABL turbulence. A novel synthetic inflow turbulence generation method is proposed that satisfies two-point flow statistics coupled with an implicit ground roughness modeling technique to represent the local terrain effect. In the next phase of the framework, aerodynamic wind loads on tall buildings having different surrounding configurations are simulated and validated against wind tunnel results. Initially, the cladding and overall loads, as well as responses of an isolated standard tall building, are investigated. Then, the framework is applied to a more realistic case involving a complex-shaped tall building located in a city center. In the final phase of research, the capability of the framework is extended by implementing a high-fidelity fluid-structure interaction (FSI) procedure to model the aeroelastic response of tall buildings. The implemented FSI algorithm uses a partitioned approach that couples a transient fluid solver with a multi-degree-of-freedom model of the building. Then the FSI procedure is applied to simulate the aeroelastic response of a tall flexible building. Overall, comparing the results from each phase of the study with wind tunnel measurements showed an encouraging level of agreement. It is expected that the framework presented in this thesis is of practical importance to the wind-resistant design of tall buildings.
- Published
- 2023
26. A Cartesian-octree adaptive front-tracking solver for immersed biological capsules in large complex domains.
- Author
-
Huet, Damien P. and Wachs, Anthony
- Subjects
BOUNDARY element methods, VISCOUS flow, FINITE element method, FLUID flow, DIRAC function, POLYMERIC membranes
- Abstract
We present an open-source adaptive front-tracking solver for biological capsules in viscous flows. The membrane elastic and bending forces are solved on a Lagrangian triangulation using a linear Finite Element Method and a paraboloid fitting method. The fluid flow is solved on an octree adaptive grid using the open-source platform Basilisk. The Lagrangian and Eulerian grids communicate using an Immersed Boundary Method by means of Peskin-like regularized Dirac delta functions. We demonstrate the accuracy of our solver with extensive validations: in Stokes conditions against the Boundary Integral Method, and in the presence of inertia against similar (but not adaptive) front-tracking solvers. Excellent qualitative and quantitative agreements are shown. We then demonstrate the robustness of the present solver in challenging cases featuring extreme membrane deformations, very large computational domains and high volume fractions. Moreover, we illustrate the capability of the solver to simulate inertial capsule-laden flows in complex STL-defined geometries, opening the door for bioengineering applications featuring large three-dimensional channel structures. The source code and all the test cases presented in this paper are freely available. • Front Tracking/Immersed Boundary method to simulate a 3D flow laden with deformable capsules in a complex geometry. • Extension of the method to adaptive mesh refinement on Cartesian octree grids. • Implementation in the open-source software Basilisk. • Complex geometries are considered via STL files. • Comprehensive validations to verify the robustness and accuracy of the method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
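For context on the "Peskin-like regularized Dirac delta functions" mentioned in the abstract, one widely used choice is Peskin's four-point kernel, reproduced below; whether this particular variant is the one implemented in the solver is not stated in the abstract:

```latex
% Peskin's four-point regularized delta (one common choice; the solver's exact kernel may differ)
\delta_h(\mathbf{x}) = \frac{1}{h^{3}}\,\phi\!\left(\frac{x}{h}\right)\phi\!\left(\frac{y}{h}\right)\phi\!\left(\frac{z}{h}\right),
\qquad
\phi(r) =
\begin{cases}
\dfrac{1}{8}\left(3 - 2|r| + \sqrt{1 + 4|r| - 4r^{2}}\right), & |r| \le 1,\\[6pt]
\dfrac{1}{8}\left(5 - 2|r| - \sqrt{-7 + 12|r| - 4r^{2}}\right), & 1 < |r| \le 2,\\[6pt]
0, & |r| > 2.
\end{cases}
```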
27. Research of possibilities of default refactoring actions in Swift language
- Author
-
Andrii Tkachuk and Bogdan Bulakh
- Subjects
sourcekit component, Swift programming language, General Engineering, open-source code, refactoring
- Abstract
The object of research in the paper is a built-in refactoring mechanism in the Swift programming language. Swift has gained a lot of popularity recently, which is why there are many new challenges associated with the need to support and modify the source code written in this programming language. The problem is that the more powerful refactoring mechanism that can be applied to Swift is proprietary and cannot be used by other software. Moreover, even closed-source refactoring software tools are not capable of performing more complex queries. To explore the possibilities of expanding the built-in refactoring, it is suggested to investigate the software implementation of the sourcekit component of the Swift programming language, which is responsible for working with «raw» source code, and to implement new refactoring action in practice. To implement the research plan, one refactoring activity that was not present in the refactoring utilities (adding an implementation of the Equatable protocol) was chosen. Its implementation was developed using the components and resources provided within the sourcekit component. To check the correctness and compliance with the development conditions, several tests were created and conducted. It has been discovered that both refactoring mechanisms supported by the Swift programming language have a limited context and a limited scope and application. That is why the possibility of expanding the functionality should not be based on the local level of code processing, but on the upper level, where it is possible to combine several source files, which often happens in projects. The work was directed to the development of the own refactoring action to analyze and obtain a perfect representation of the advantages and disadvantages of the existing component. As a result, a new approach to refactoring was proposed, which will allow solving the problems described above.
- Published
- 2022
28. Research of the capabilities of built-in refactoring in the Swift language
- Subjects
sourcekit component, Swift programming language, open-source code, refactoring
- Abstract
The object of research in the paper is a built-in refactoring mechanism in the Swift programming language. Swift has gained a lot of popularity recently, which is why there are many new challenges associated with the need to support and modify the source code written in this programming language. The problem is that the more powerful refactoring mechanism that can be applied to Swift is proprietary and cannot be used by other software. Moreover, even closed-source refactoring software tools are not capable of performing more complex queries. To explore the possibilities of expanding the built-in refactoring, it is suggested to investigate the software implementation of the sourcekit component of the Swift programming language, which is responsible for working with «raw» source code, and to implement new refactoring action in practice. To implement the research plan, one refactoring activity that was not present in the refactoring utilities (adding an implementation of the Equatable protocol) was chosen. Its implementation was developed using the components and resources provided within the sourcekit component. To check the correctness and compliance with the development conditions, several tests were created and conducted. It has been discovered that both refactoring mechanisms supported by the Swift programming language have a limited context and a limited scope and application. That is why the possibility of expanding the functionality should not be based on the local level of code processing, but on the upper level, where it is possible to combine several source files, which often happens in projects. The work was directed to the development of the own refactoring action to analyze and obtain a perfect representation of the advantages and disadvantages of the existing component. As a result, a new approach to refactoring was proposed, which will allow solving the problems described above.
- Published
- 2022
29. Utilizing open-source code in fatigue analysis
- Author
-
Ilkka Valkonen
- Subjects
business.industry, Computer science, Mechanical Engineering, unit load, open-source code, Structural engineering, Finite element method, Mechanics of Materials, multi-axial, fatigue, time series, business
- Abstract
This article presents methods for combining fatigue data and unit loads to be applied for fatigue analysis within open-source FEM codes. Example results from the literature are close to the results obtained with the present methods. Accordingly, open-source tools can be considered as a viable option for multi-axial fatigue analysis.
- Published
- 2021
30. LES and RANS calculations of particle dispersion behind a wall-mounted cubic obstacle
- Author
-
Atzori, Marco, Chibbaro, Sergio, Duwig, Christophe, and Brandt, Luca
- Abstract
In the present paper, we evaluate the performances of three stochastic models for particle dispersion in the case of a three-dimensional turbulent flow. We consider the flow in a channel with a cubic wall-mounted obstacle, and perform large-eddy simulations (LESs) including passive particles injected behind the obstacle, for cases of low and strong inertial effects. We also perform Reynolds-averaged simulations of the same case, using standard turbulence models, and employ the two discrete stochastic models for particle dispersion implemented in the open-source code OpenFOAM and the continuous Lagrangian stochastic model proposed by Minier et al. (2004). The Lagrangian model is consistent with a Probability Density Function (PDF) model of the exact particle equations, and is based on the modelling of the fluid velocity seen by particles. This approach allows a consistent formulation which eliminates the spurious drifts flawing discrete models and to have the drag force in a closed form. The LES results are used as reference data both for the fluid RANS simulations and particle simulations with dispersion models. The present test case allows us to evaluate the performance of dispersion models in highly non-homogeneous flow, and it is used in this context for the first time. The continuous stochastic model generally shows a better agreement with the LES than the discrete stochastic models, in particular in the case of particles with higher inertia.
- Published
- 2022
- Full Text
- View/download PDF
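For orientation, continuous Lagrangian dispersion models of the kind referred to above (the "fluid velocity seen by particles" approach) are often written, in simplified scalar form, as the Langevin-type equation below; this schematic form is an editorial aid and omits the mean-pressure-gradient and anisotropy terms of the full Minier et al. model:

```latex
% Schematic Langevin-type model for the fluid velocity seen by a particle
% (simplified editorial form; the full model in Minier et al. contains additional terms)
\mathrm{d}U_{s,i} \;=\; -\,\frac{U_{s,i} - \langle U_i \rangle}{T_{L}^{*}}\,\mathrm{d}t
\;+\; \sqrt{C_0\,\varepsilon}\;\mathrm{d}W_i
```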
31. Bayesian Inversion with Open-Source Codes for Various One-Dimensional Model Problems in Computational Mechanics
- Author
-
Noii, Nima, Khodadadian, Amirreza, Ulloa, Jacinto, Aldakheel, Fadi, Wick, Thomas, François, Stijn, and Wriggers, Peter
- Abstract
The complexity of many problems in computational mechanics calls for reliable programming codes and accurate simulation systems. Typically, simulation responses strongly depend on material and model parameters, where one distinguishes between backward and forward models. Providing reliable information for the material/model parameters, enables us to calibrate the forward model (e.g., a system of PDEs). Markov chain Monte Carlo methods are efficient computational techniques to estimate the posterior density of the parameters. In the present study, we employ Bayesian inversion for several mechanical problems and study its applicability to enhance the model accuracy. Seven different boundary value problems in coupled multi-field (and multi-physics) systems are presented. To provide a comprehensive study, both rate-dependent and rate-independent equations are considered. Moreover, open source codes (https://doi.org/10.5281/zenodo.6451942) are provided, constituting a convenient platform for future developments for, e.g., multi-field coupled problems. The developed package is written in MATLAB and provides useful information about mechanical model problems and the backward Bayesian inversion setting. © 2022, The Author(s).
- Published
- 2022
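As a generic illustration of the Markov chain Monte Carlo machinery the abstract refers to (not the MATLAB package released with the paper), a minimal random-walk Metropolis sampler for a single model parameter under a Gaussian likelihood might look like this; all names and values are assumptions:

```python
# Minimal random-walk Metropolis sampler for one parameter (illustrative only;
# the paper's released package is in MATLAB and handles multi-field PDE models).
import numpy as np

def log_posterior(theta, data, sigma=0.1):
    """Gaussian likelihood around a toy forward model y = theta * x, flat prior."""
    x, y = data
    residual = y - theta * x
    return -0.5 * np.sum((residual / sigma) ** 2)

def metropolis(data, n_steps=5000, step=0.05, theta0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    theta, logp = theta0, log_posterior(theta0, data)
    for i in range(n_steps):
        proposal = theta + step * rng.normal()
        logp_new = log_posterior(proposal, data)
        if np.log(rng.random()) < logp_new - logp:    # accept/reject step
            theta, logp = proposal, logp_new
        chain[i] = theta
    return chain

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.linspace(0, 1, 50)
    y = 2.0 * x + 0.1 * rng.normal(size=x.size)       # synthetic data, true theta = 2
    samples = metropolis((x, y))
    print(samples[1000:].mean())                      # posterior mean near 2.0
```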
32. Open-source quality control routine and multi-year power generation data of 175 PV systems
- Author
-
Visser, Lennard, Elsinga, Boudewijn, AlSkaif, Tarek, and van Sark, Wilfried
- Abstract
Description The repository contains an extensive dataset of PV power measurements and a python package (qcpv) for quality controlling PV power measurements. The dataset features four years (2014-2017) of power measurements of 175 rooftop mounted residential PV systems located in Utrecht, the Netherlands. The power measurements have a 1-min resolution. PV power measurements Three different versions of the power measurements are included in three data-subsets in the repository. Unfiltered power measurements are enclosed in unfiltered_pv_power_measurements.csv. Filtered power measurements are included as filtered_pv_power_measurements_sc.csv and filtered_pv_power_measurements_ac.csv. The former dataset contains the quality controlled power measurements after running single system filters only, the latter dataset considers the output after running both single and across system filters. The metadata of the PV systems is added in metadata.csv. This file holds for each PV system a unique ID, start and end time of registered power measurements, estimated DC and AC capacity, tilt and azimuth angle, annual yield and mapped grids of the system location (north, south, west and east boundary). Quality control routine An open-source quality control routine that can be applied to filter erroneous PV power measurements is added to the repository in the form of the Python package qcpv (qcpv.py). Sample code to call and run the functions in the qcpv package is available as example.py. Objective By publishing the dataset we provide access to high quality PV power measurements that can be used for research experiments on several topics related to PV power and the integration of PV in the electricity grid. By publishing the qcpv package we strive to set a next step into developing a standardized routine for quality control of PV power measurements. We hope to stimulate others to adopt and improve the routine of quality control and work towards a widely adopted standardized routine. Data
- Published
- 2022
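To give a flavor of the kind of single-system quality-control filters such a routine might apply (the actual filters live in the repository's qcpv package; the checks, thresholds, and column names below are editorial assumptions):

```python
# Illustrative single-system QC filters for 1-min PV power data.
# These checks, thresholds, and names are assumptions for illustration;
# the dataset's own qcpv package defines the actual routine.
import numpy as np
import pandas as pd

def basic_qc(power: pd.Series, ac_capacity_w: float, stuck_len: int = 60) -> pd.Series:
    """Mask implausible 1-min power readings as NaN."""
    cleaned = power.copy().astype(float)
    cleaned[cleaned < 0] = np.nan                       # negative power is non-physical
    cleaned[cleaned > 1.2 * ac_capacity_w] = np.nan     # far above rated AC capacity
    # Flag long runs of identical non-zero values (a "stuck" logger).
    run_id = (cleaned != cleaned.shift()).cumsum()
    run_len = cleaned.groupby(run_id).transform("size")
    cleaned[(run_len >= stuck_len) & (cleaned > 0)] = np.nan
    return cleaned

if __name__ == "__main__":
    idx = pd.date_range("2016-06-01", periods=240, freq="min")
    raw = pd.Series(1500.0, index=idx)                  # toy constant (stuck) signal
    print(basic_qc(raw, ac_capacity_w=2500.0).isna().sum())
```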
33. Design and testing of camera developed on Raspberry Pi platform
- Author
-
Testen, Matej and Grgić, Sonja
- Subjects
TECHNICAL SCIENCES. Computing, Raspberry Pi, digital photography, open-source code, Python
- Abstract
The paper describes the optical system, the image acquisition system, and the operation of the camera as a whole. The software solution was written in Python as open-source code. The hardware solution is based on the Raspberry Pi platform and enables modular design and future upgrades. The production quality and the obtained results were analyzed.
- Published
- 2022
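As a tiny, hypothetical illustration of the sort of Python capture code such a Raspberry Pi camera project involves (not the thesis code itself; the picamera library and the settings shown are assumptions):

```python
# Hypothetical Raspberry Pi capture snippet using the picamera library.
# This is an editorial illustration, not code from the described project.
from time import sleep
from picamera import PiCamera

camera = PiCamera()
camera.resolution = (1920, 1080)   # 1080p still capture
camera.iso = 100                   # fix sensitivity for repeatable exposures
sleep(2)                           # let auto-exposure settle
camera.capture("/home/pi/test_image.jpg")
camera.close()
```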
34. HASKER: An efficient algorithm for string kernels. Application to polarity classification in various languages.
- Author
-
Popescu, Marius, Tudor Ionescu, Radu, and Grozea, Cristian
- Subjects
ALGORITHMS, KERNEL (Mathematics), SIMILARITY (Language learning), POLARITY (Linguistics), SENTIMENT analysis, SOURCE code
- Abstract
String kernels have successfully been used for various NLP tasks, ranging from text categorization by topic to native language identification. In this paper, we present a simple and efficient algorithm for computing various spectrum string kernels. When comparing two strings, we store the p-grams in the first string into a hash table, and then we apply a hash table lookup for the p-grams that occur in the second string. In terms of time, we show that our algorithm can outperform a state-of-the-art tool for computing string similarity. In terms of accuracy, we show that our approach can reach state-of-the-art performance for polarity classification in various languages. Our efficient implementation is provided online for free at http://string-kernels.herokuapp.com. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
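The hash-table scheme described in this abstract translates almost directly into code. Below is a minimal Python sketch of a p-spectrum kernel computed that way; the exact kernel variants and normalization in HASKER may differ, so treat this as an assumption-laden illustration:

```python
# Minimal p-spectrum string kernel via a hash table, following the scheme the abstract
# describes (store p-grams of the first string, then look them up for the second).
# HASKER's exact kernel variants and normalization may differ; this is illustrative.
from collections import Counter

def spectrum_kernel(s: str, t: str, p: int = 3) -> int:
    """k(s, t) = sum over shared p-grams of count_s(g) * count_t(g)."""
    table = Counter(s[i:i + p] for i in range(len(s) - p + 1))   # hash table of p-grams in s
    value = 0
    for i in range(len(t) - p + 1):                              # lookup for p-grams of t
        value += table.get(t[i:i + p], 0)
    return value

if __name__ == "__main__":
    print(spectrum_kernel("the movie was great", "the movie was not great"))
```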
35. Reliability Study of Inertial Sensors LIS2DH12 Compared to ActiGraph GT9X: Based on Free Code
- Author
-
Jaime Martín-Martín, Ariadna Jiménez-Partinen, Irene De-Torres, Adrian Escriche-Escuder, Manuel González-Sánchez, Antonio Muro-Culebras, Cristina Roldán-Jiménez, María Ruiz-Muñoz, Fermín Mayoral-Cleries, Attila Biró, Wen Tang, Borjanka Nikolova, Alfredo Salvatore, and Antonio I. Cuesta-Vargas
- Subjects
R code, reliability, Medicine (miscellaneous), open-source code, functional assessment, Fidelidad, inertial sensors
- Abstract
The study's purpose was to assess the reliability of the LIS2DH12 in two different positions, using the commercial sensor ActiGraph GT9X as the reference instrument. Five participants completed two gait tests on a treadmill. First, both sensors were worn on the wrist and around the thigh. Each test consisted of a 1 min walk for participants to become accustomed to the treadmill, followed by a 2 min trial at ten pre-set speeds. Data from both sensors were collected in real time. Maximum peaks, minimum peaks, and the mean of the complete signal (the sequence of acceleration values over time) were extracted for each axis and speed, and the intraclass correlation coefficient (ICC), together with the standard deviation of the mean, was used to evaluate the equality of the characteristics obtained with the LIS2DH12 compared to the ActiGraph. Our results show that the LIS2DH12 measurements present higher reliability than the ActiGraph GT9X, with ICC > 0.8 on all three axes. This study concludes that the LIS2DH12 is as reliable and accurate as the ActiGraph GT9X Link and would therefore be a suitable tool for future kinematic studies. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 823871 (iGame). Partial funding for open access charge: Universidad de Málaga.
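As an illustration of the agreement statistic used above, here is a minimal Python sketch of a two-way random, single-measure ICC(2,1) computed from scratch; the array layout, the ICC variant, and the synthetic data are assumptions and not the study's actual workflow (the subject keywords suggest the original analysis was done in R).

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random, single-measure ICC(2,1).

    `ratings` is an (n_targets, k_raters) array, e.g. one column per sensor
    and one row per extracted feature (peak or mean at a given axis/speed).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    rows = ratings.mean(axis=1)
    cols = ratings.mean(axis=0)

    # Mean squares from the two-way ANOVA decomposition.
    msr = k * np.sum((rows - grand) ** 2) / (n - 1)            # targets (rows)
    msc = n * np.sum((cols - grand) ** 2) / (k - 1)            # raters (columns)
    sse = np.sum((ratings - rows[:, None] - cols[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                            # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: two sensors rating the same ten features.
rng = np.random.default_rng(0)
a = rng.normal(size=(10, 1))
data = np.hstack([a, a + rng.normal(scale=0.05, size=(10, 1))])
print(icc_2_1(data))
```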
- Published
- 2022
36. MODELING OF NON-SPHERICAL, ELONGATED PARTICLES FOR INDUSTRIAL SUSPENSION FLOW SIMULATION.
- Author
-
Redlinger-Pohn, Jakob D., König, Lisa M., Kloss, Christoph, Goniva, Christoph, and Radl, Stefan
- Subjects
EULER-Lagrange equations ,MULTIPHASE flow ,INDUSTRIAL applications ,PARTICLES ,CRYSTALS - Abstract
Euler-Lagrange (EL) simulations of particulate suspension flow are an important tool to understand and predict multiphase flow in nature and in industrial applications. Unfortunately, solid-liquid suspensions are often of a (mathematically) stiff nature, i.e., the relaxation time of suspended particles may be small compared to relevant flow time scales. The particles involved are typically in the size range from μm to mm and of non-spherical shape, e.g., elongated particles such as needle-shaped crystals and/or natural and man-made fibres. Depending on their aspect ratio and bending stiffness, these particles can be treated as rigid or flexible. In this paper we present a recent implementation in the open-source LIGGGHTS® and CFDEM® software package for the simulation of systems involving stiff non-spherical, elongated particles. A newly implemented splitting technique for the coupling forces and torques, following the ideas of Fan and Ahmadi (J. Aerosol Sci. 26, 1995), allows significantly larger coupling intervals, leading to a substantial reduction in the computational cost. Hence, large-scale industrial systems can be simulated in an acceptable amount of time. We first present our modeling approach, followed by the verification of our code based on benchmark problems. Second, we present results of one-way coupled CFD-DEM simulations. Our simulations reveal segregation of fibres depending on their length, due to fibre-fluid interaction in torus flow. [ABSTRACT FROM AUTHOR]
- Published
- 2016
37. An Open-Source Toolbox for PEM Fuel Cell Simulation
- Author
-
Jean-Paul Kone, Xinyu Zhang, Yuying Yan, and Stephen Adegbite
- Subjects
computational fluid dynamics ,modelling ,numerical ,open-source code ,proton exchange membrane fuel cell ,simulation ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
In this paper, an open-source toolbox that can be used to accurately predict the distribution of the major physical quantities that are transported within a proton exchange membrane (PEM) fuel cell is presented. The toolbox has been developed using the Open Source Field Operation and Manipulation (OpenFOAM) platform, which is an open-source computational fluid dynamics (CFD) code. The base case results for the distribution of velocity, pressure, chemical species, Nernst potential, current density, and temperature are as expected. The plotted polarization curve was compared to the results from a numerical model and experimental data taken from the literature. The conducted simulations have generated a significant amount of data and information about the transport processes that are involved in the operation of a PEM fuel cell. The key role played by the concentration constant in shaping the cell polarization curve has been explored. The development of the present toolbox is in line with the objectives outlined in the International Energy Agency (IEA, Paris, France) Advanced Fuel Cell Annex 37 that is devoted to developing open-source computational tools to facilitate fuel cell technologies. The work therefore serves as a basis for devising additional features that are not always feasible with a commercial code.
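To illustrate how a concentration constant shapes a polarization curve, here is a minimal Python sketch of a textbook semi-empirical voltage-current relation; the functional form and parameter values are illustrative assumptions, not the governing equations solved by the OpenFOAM toolbox.

```python
import numpy as np

def cell_voltage(i, e_oc=1.0, i0=1e-4, tafel_slope=0.06, r_ohm=0.15,
                 b_conc=0.05, i_lim=1.4):
    """Semi-empirical polarization curve V(i), with i in A/cm^2.

    V = E_oc - activation loss - ohmic loss - concentration loss.
    The concentration constant b_conc controls how sharply the voltage
    drops as i approaches the limiting current density i_lim.
    """
    act = tafel_slope * np.log(i / i0)           # Tafel-type activation loss
    ohm = r_ohm * i                              # ohmic loss
    conc = -b_conc * np.log(1.0 - i / i_lim)     # mass-transport (concentration) loss
    return e_oc - act - ohm - conc

# The larger the concentration constant, the sharper the knee of the curve.
for b in (0.02, 0.05, 0.10):
    print(f"b_conc={b}: V(1.3 A/cm^2) = {cell_voltage(1.3, b_conc=b):.3f} V")
```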
- Published
- 2018
- Full Text
- View/download PDF
38. Development and implementation of a software for wire arc additive manufacturing preprocessing planning : trajectory planning and machine code generation
- Author
-
Rafael Pereira Ferreira, Louriel Oliveira Vilarinho, and Américo Scotti
- Subjects
Datavetenskap (datalogi) ,Mechanics of Materials ,Computer Sciences ,Mechanical Engineering ,Additives ,Motion planning ,Open source software ,Open systems ,Trajectories ,Wire ,Metals and Alloys ,Code programming ,G codes ,Machine codes ,Open-source code ,Open-source code programming ,Preprocessing planning ,Trajectory generation ,Trajectory Planning ,Wire arc ,Wire arc additive manufacturing ,Manufacturing, Surface and Joining Technology ,Bearbetnings-, yt- och fogningsteknik ,3D printers - Abstract
To overcome a shortage of flexible and low-cost solutions for wire arc additive manufacturing (WAAM) preprocessing, this work's objective was to develop and validate an in-house computational programme in an open-source environment for WAAM preprocessing planning. Algorithms for reading STL (stereolithography) files and implementing rotation, slicing, trajectory planning, and machine code generation were elaborated and implemented in the Scilab environment (free and open-source). A graphical interface was developed to facilitate user interaction, with five options for path planning. The functionality of each work step is detailed. For validation of the software, single- and multiple-layer prints, with different geometrical complexity and printing challenges, were built on a CNC table driven by the generated machine code. The validation criteria were deposition imperfections and morphological and dimensional tolerances. The outputs showed that the parts were successfully printed. Therefore, this work demonstrates that Scilab provides the necessary resources for companies and universities to implement and/or develop algorithms for planning and generating trajectories for WAAM. Moreover, emerging ideas can be implemented reasonably easily in such software, which is not always possible in commercial packages. Open access funding provided by University West. This work was partially funded by the National Council for Scientific and Technological Development (CNPq, grant number 302863/2016–8) and PETROBRAS (project number 23117.018175/2019–80).
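A minimal Python sketch of the slicing step such a preprocessor performs (intersecting STL triangles with a horizontal plane to obtain layer contours); the published software is written in Scilab, and this fragment ignores contour ordering and degenerate cases.

```python
import numpy as np

def slice_triangle(tri, z):
    """Intersect one triangle (3x3 array of xyz vertices) with the plane
    z = const. Returns a segment (two 3D points) or None. Degenerate cases
    (a vertex exactly on the plane) are ignored in this sketch."""
    pts = []
    for a, b in ((0, 1), (1, 2), (2, 0)):
        za, zb = tri[a, 2], tri[b, 2]
        if (za - z) * (zb - z) < 0:                 # edge crosses the plane
            t = (z - za) / (zb - za)
            pts.append(tri[a] + t * (tri[b] - tri[a]))
    return (pts[0], pts[1]) if len(pts) == 2 else None

def slice_mesh(triangles, z):
    """Collect all contour segments of the mesh at height z."""
    segments = []
    for tri in triangles:
        seg = slice_triangle(np.asarray(tri, dtype=float), z)
        if seg is not None:
            segments.append(seg)
    return segments

# Hypothetical single triangle spanning z = 0..10, sliced at z = 5.
tri = [[0.0, 0.0, 0.0], [10.0, 0.0, 10.0], [0.0, 10.0, 10.0]]
print(slice_mesh([tri], 5.0))
```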
- Published
- 2022
39. Open-source quality control routine and multi-year power generation data of 175 PV systems
- Subjects
PV power measurements ,quality control routine ,open-source code ,Toegepaste Informatiekunde ,Information Technology ,open-source dataset - Abstract
Description: The repository contains an extensive dataset of PV power measurements and a Python package (qcpv) for quality control of PV power measurements. The dataset features four years (2014-2017) of power measurements of 175 rooftop-mounted residential PV systems located in Utrecht, the Netherlands. The power measurements have a 1-min resolution.

PV power measurements: Three different versions of the power measurements are included in three data subsets in the repository. Unfiltered power measurements are enclosed in unfiltered_pv_power_measurements.csv. Filtered power measurements are included as filtered_pv_power_measurements_sc.csv and filtered_pv_power_measurements_ac.csv. The former dataset contains the quality-controlled power measurements after running single-system filters only; the latter dataset considers the output after running both single- and across-system filters. The metadata of the PV systems is added in metadata.csv. This file holds, for each PV system, a unique ID, start and end time of registered power measurements, estimated DC and AC capacity, tilt and azimuth angle, annual yield and mapped grids of the system location (north, south, west and east boundary).

Quality control routine: An open-source quality control routine that can be applied to filter erroneous PV power measurements is added to the repository in the form of the Python package qcpv (qcpv.py). Sample code to call and run the functions in the qcpv package is available as example.py.

Objective: By publishing the dataset we provide access to high-quality PV power measurements that can be used for research experiments on several topics related to PV power and the integration of PV in the electricity grid. By publishing the qcpv package we strive to take a next step towards a standardized routine for quality control of PV power measurements. We hope to stimulate others to adopt and improve the quality control routine and to work towards a widely adopted standardized routine.

Data usage: If you use the data and/or Python package in a published work, please cite: Visser, L., Elsinga, B., AlSkaif, T., van Sark, W., 2022. Open-source quality control routine and multi-year power generation data of 175 PV systems. Journal of Renewable and Sustainable Energy.

Units: Timestamps are in UTC (YYYY-MM-DD HH:MM:SS+00:00). Power measurements are in Watt. Installed capacities (DC and AC) are in Watt-peak.

Additional information: A detailed discussion of the data and the qcpv package is presented in: Visser, L., Elsinga, B., AlSkaif, T., van Sark, W., 2022. Open-source quality control routine and multi-year power generation data of 175 PV systems. Journal of Renewable and Sustainable Energy.

Acknowledgements: This work is part of the Energy Intranets (NEAT: ESI-BiDa 647.003.002) project, which is funded by the Dutch Research Council NWO in the framework of the Energy Systems Integration & Big Data programme. The authors would especially like to thank the PV owners who volunteered to take part in the measurement campaign.
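A minimal sketch, in Python, of the kind of single-system plausibility filter the qcpv package automates. The file names come from the repository, but the column layout, the metadata column name, and the thresholds below are assumptions for illustration only; the real filters are those in qcpv.py and example.py.

```python
import pandas as pd

# Assumed layout: one power column per system ID, indexed by UTC timestamp;
# metadata indexed by the same system IDs, with a hypothetical DC capacity column.
power = pd.read_csv("unfiltered_pv_power_measurements.csv",
                    index_col=0, parse_dates=True)
meta = pd.read_csv("metadata.csv", index_col=0)
capacity_wp = meta["estimated_dc_capacity"]        # hypothetical column name

# Flag physically implausible samples: negative power, or power far above
# the installed DC capacity (the 1.2x factor is an assumed threshold, not qcpv's).
implausible = (power < 0) | power.gt(1.2 * capacity_wp, axis=1)
filtered = power.mask(implausible)

print(f"flagged {int(implausible.values.sum())} of {implausible.size} samples")
```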
- Published
- 2022
- Full Text
- View/download PDF
40. Bayesian Inversion with Open-Source Codes for Various One-Dimensional Model Problems in Computational Mechanics
- Author
-
Nima Noii, Amirreza Khodadadian, Jacinto Ulloa, Fadi Aldakheel, Thomas Wick, Stijn François, and Peter Wriggers
- Subjects
Mathematics, Interdisciplinary Applications ,Technology ,MATLAB ,elastoplasticity ,Simulation systems ,Open-source codes ,VARIATIONAL APPROACH ,Engineering, Multidisciplinary ,Bayesian inversion ,Programming codes ,Modeling parameters ,phase-field fracture ,FATIGUE ,Boundary value problems ,Engineering ,ddc:690 ,Computational mechanics ,PLASTICITY ,Open systems ,FORMULATION ,fatigue failure ,Science & Technology ,Applied Mathematics ,Markov processes ,Monte Carlo methods ,BRITTLE-FRACTURE ,Open source software ,Open-source code ,GRADIENT DAMAGE MODELS ,Computer Science Applications ,One-dimensional model ,Materials parameters ,Simulation response ,Physical Sciences ,Computer Science ,PHASE-FIELD MODEL ,Dewey Decimal Classification::600 | Technik::690 | Hausbau, Bauhandwerk ,Computer Science, Interdisciplinary Applications ,Forward modeling ,Mathematics ,Model problems - Abstract
The complexity of many problems in computational mechanics calls for reliable programming codes and accurate simulation systems. Typically, simulation responses strongly depend on material and model parameters, where one distinguishes between backward and forward models. Providing reliable information for the material/model parameters enables us to calibrate the forward model (e.g., a system of PDEs). Markov chain Monte Carlo methods are efficient computational techniques to estimate the posterior density of the parameters. In the present study, we employ Bayesian inversion for several mechanical problems and study its applicability to enhance the model's accuracy. Seven different boundary value problems in coupled multi-field (and multi-physics) systems are presented. To provide a comprehensive study, both rate-dependent and rate-independent equations are considered. Moreover, open-source codes are provided, constituting a convenient platform for future developments for, e.g., multi-field coupled problems. The developed package is written in MATLAB and provides useful information about mechanical model problems and the backward Bayesian inversion setting. The code's description is given in the "ReadMe" file and the accompanying archive (ONE_D_Codes.zip); the paper is available online at https://doi.org/10.1007/s11831-022-09751-6.
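A minimal Python sketch of the Markov chain Monte Carlo idea described above, calibrating a single material parameter of a toy 1D forward model with random-walk Metropolis; the forward model, prior, and noise level are illustrative assumptions, unrelated to the MATLAB package's seven boundary value problems.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy forward model: axial displacement of a 1D bar, u = F * L / (E * A),
# with the Young's modulus E as the unknown parameter to be inferred.
F, L, A = 1.0e3, 2.0, 1.0e-4
def forward(E):
    return F * L / (E * A)

# Synthetic noisy observation generated with a "true" E.
E_true, sigma = 70.0e9, 1.0e-5
u_obs = forward(E_true) + rng.normal(0.0, sigma)

def log_posterior(E):
    if E <= 0:
        return -np.inf
    # Gaussian likelihood plus a weakly informative log-uniform prior on E.
    return -0.5 * ((u_obs - forward(E)) / sigma) ** 2 - np.log(E)

# Random-walk Metropolis sampling of the posterior density of E.
E, logp = 50.0e9, log_posterior(50.0e9)
chain = []
for _ in range(20000):
    E_prop = E + rng.normal(0.0, 2.0e9)
    logp_prop = log_posterior(E_prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        E, logp = E_prop, logp_prop
    chain.append(E)

print(f"posterior mean E = {np.mean(chain[5000:]):.3e} Pa (true {E_true:.3e})")
```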
- Published
- 2022
41. Modeling Injection Molding of High-Density Polyethylene with Crystallization in Open-Source Software
- Author
-
Anton Krebelj, Miroslav Halilovič, Kristjan Krebelj, and Nikolaj Mole
- Subjects
0209 industrial biotechnology ,Materials science ,Polymers and Plastics ,injection molding ,Crystallization of polymers ,open-source code ,02 engineering and technology ,Heat transfer coefficient ,Article ,law.invention ,Stress (mechanics) ,lcsh:QD241-441 ,020901 industrial engineering & automation ,Thermal conductivity ,lcsh:Organic chemistry ,law ,Latent heat ,numerične simulacije ,polymer crystallization ,udc:519.876.5:621.767(045) ,Composite material ,Crystallization ,injekcijsko brizganje ,kristalizacija polimerov ,Thermal contact ,General Chemistry ,021001 nanoscience & nanotechnology ,numerical simulation ,High-density polyethylene ,0210 nano-technology ,odprtokodni programi - Abstract
This work investigates crystallization modeling by modifying the open-source computational fluid dynamics code OpenFOAM. The crystallization behavior of high-density polyethylene (HDPE) is implemented according to theoretical and experimental literature. A number of physical interdependencies are included. The cavity is modeled as deformable. The heat transfer coefficient in the thermal contact towards the mold depends on contact pressure. The thermal conductivity is pressure- and crystallinity-dependent. Specific heat depends on temperature and crystallinity. Latent heat is released according to the crystallization progress and temperature. Deviatoric elastic stress is evolved in the solidified material. The prediction of the cavity pressure evolution is used for the assessment of the solution quality because it is experimentally available and governs the residual stress development. Insight into the thermomechanical conditions is provided with through-thickness plots of pressure, temperature and cooling rate at different levels of crystallinity. The code and simulation setup are made openly available to further the research on the topic.
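As a compact illustration of crystallization-progress modeling of this kind, here is a Python sketch of a differential Nakamura (Avrami-type) rate law integrated during a cooling ramp; the rate-constant shape and all parameter values are assumptions and are not the HDPE model implemented in the OpenFOAM code.

```python
import numpy as np

def nakamura_rate(alpha, T, n=3.0, K_max=0.5, T_max=390.0, width=25.0):
    """Differential Nakamura model for the relative crystallinity alpha:

        d(alpha)/dt = n * K(T) * (1 - alpha) * (-ln(1 - alpha))**((n - 1) / n)

    K(T) is taken here as a Gaussian of temperature (an assumption);
    the parameters are illustrative, not fitted HDPE values.
    """
    K = K_max * np.exp(-((T - T_max) / width) ** 2)
    a = min(max(alpha, 1e-12), 1.0 - 1e-12)        # keep the log term finite
    return n * K * (1.0 - a) * (-np.log(1.0 - a)) ** ((n - 1.0) / n)

# Explicit time integration during a constant cooling ramp.
dt, T, cool_rate, alpha = 0.01, 430.0, 2.0, 1e-6
while alpha < 0.99 and T > 300.0:
    alpha += dt * nakamura_rate(alpha, T)
    T -= dt * cool_rate
print(f"reached alpha = {alpha:.2f} at T = {T:.1f} K")
```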
- Published
- 2021
42. Model-free environmental contours in higher dimensions.
- Author
-
Mackay, Ed and de Hauteclocque, Guillaume
- Subjects
- *
WIND waves , *ACCOUNTING methods , *WIND turbines , *SOURCE code , *SURFACE waves (Seismic waves) , *UNIVARIATE analysis , *WIND speed - Abstract
This paper presents a method for estimating environmental contours in an arbitrary number of dimensions. The method, referred to as Direct-IFORM, does not require a model for the joint distribution of the variables. It can therefore be applied in higher dimensions without degradation in performance. The method involves multiple univariate analyses under rotations of the axes to estimate return values in various directions, which are used to form a contour. The method accounts for serial correlation in the observations, which removes the positive bias that occurs when this is neglected. An efficient open-source code is provided for estimating Direct-IFORM contours. A four-dimensional example is presented for contours of wind speed, wave height, wave period and wind-wave misalignment direction. The application of the method to the design of offshore wind turbines is discussed. • Method presented for estimating contours without need to fit joint distribution. • Method applicable in higher dimensions without degradation in performance. • Method accounts for serial correlation in observations, reducing positive bias. • Open source code in MATLAB and Python provided for estimating Direct-IFORM contours. • Example presented for 4-dimensional contours of wind and wave parameters. [ABSTRACT FROM AUTHOR]
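A minimal Python sketch of the underlying idea (projecting the data onto rotated directions and taking a univariate quantile per direction); it uses a plain empirical quantile and ignores the return-period conversion and serial-correlation handling of the published Direct-IFORM code.

```python
import numpy as np

def directional_contour(x, y, prob=0.99, n_dirs=72):
    """Rough 2D environmental contour from rotated univariate quantiles.

    For each direction, the data are projected onto that axis and an empirical
    exceedance quantile is taken, defining a bounding half-plane. The function
    returns the supporting point of each half-plane boundary; the contour is
    the boundary of the intersection of all these half-planes.
    """
    angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])
    data = np.column_stack([x, y])
    r = np.quantile(data @ dirs.T, prob, axis=0)   # one quantile per direction
    return dirs * r[:, None]

# Synthetic, correlated wave height / wave period sample.
rng = np.random.default_rng(0)
hs = rng.gumbel(2.0, 0.6, 10000)
tp = 5.0 + 0.8 * hs + rng.normal(0.0, 0.5, 10000)
print(directional_contour(hs, tp)[:3])
```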
- Published
- 2023
- Full Text
- View/download PDF
43. RLT-POS: Reformulation-Linearization Technique-based optimization software for solving polynomial programming problems.
- Author
-
Dalkiran, Evrim and Sherali, Hanif
- Abstract
In this paper, we introduce a Reformulation-Linearization Technique-based open-source optimization software for solving polynomial programming problems (RLT-POS). We present algorithms and mechanisms that form the backbone of RLT-POS, including constraint filtering techniques, reduced RLT representations, and semidefinite cuts. When implemented individually, each model enhancement has been shown in previous papers to significantly improve the performance of the standard RLT procedure. However, the coordination between different model enhancement techniques becomes critical for an improved overall performance since special structures in the original formulation that work in favor of a particular technique might be lost after implementing some other model enhancement. More specifically, we discuss the coordination between (1) constraint elimination via filtering techniques and reduced RLT representations, and (2) semidefinite cuts for sparse problems. We present computational results using instances from the literature as well as randomly generated problems to demonstrate the improvement over a standard RLT implementation and to compare the performances of the software packages BARON, COUENNE, and SparsePOP with RLT-POS. [ABSTRACT FROM AUTHOR]
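As a small illustration of the level-1 RLT building block, the sketch below writes out the four linear cuts obtained by multiplying the bound factors of a single bilinear term (the McCormick envelope); RLT-POS itself goes far beyond this with constraint filtering, reduced representations, and semidefinite cuts.

```python
def rlt_cuts(lx, ux, ly, uy):
    """Level-1 RLT (McCormick) cuts for w = x*y with lx <= x <= ux, ly <= y <= uy.

    Each nonnegative bound-factor product is expanded and x*y is replaced by
    the new variable w, giving a linear cut written as
        coef_x*x + coef_y*y + coef_w*w <= rhs.
    """
    return [
        ( ly,  lx, -1.0,  lx * ly),   # from (x - lx)(y - ly) >= 0
        ( uy,  ux, -1.0,  ux * uy),   # from (ux - x)(uy - y) >= 0
        (-ly, -ux,  1.0, -ux * ly),   # from (ux - x)(y - ly) >= 0
        (-uy, -lx,  1.0, -lx * uy),   # from (x - lx)(uy - y) >= 0
    ]

# Example: cuts for 0 <= x <= 2 and 1 <= y <= 3.
for cut in rlt_cuts(0.0, 2.0, 1.0, 3.0):
    print(cut)
```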
- Published
- 2016
- Full Text
- View/download PDF
44. An accelerated computation of combustion with finite-rate chemistry using LES and an open source library for In-Situ-Adaptive Tabulation.
- Author
-
Fooladgar, Ehsan, Chan, C.K., and Nogenmyr, Karl-Johan
- Subjects
- *
COMBUSTION , *SIMULATION methods & models , *DARMSTADTIUM , *TURBULENT flow , *STRATIFIED flow - Abstract
In this paper, a modified version of the latest implementation of the In-Situ Adaptive Tabulation (ISAT) algorithm, ISAT-CK7, is introduced and linked to OpenFOAM for accelerating the chemical computation in simulating combustion with the finite-rate chemistry (FRC) approach. In this new version, ISAT-CK7-Cantera, the open-source and free code Cantera replaces Chemkin II as the thermo-chemical library. This new library facilitates using LES with FRC and detailed chemistry, which has not been commonly done. In addition to a successful validation of ISAT-CK7-Cantera in simple 0D and 2D geometries, the resulting package, consisting of OpenFOAM, ISAT and Cantera, is used to evaluate the performance of LES-FRC with a partially stirred reactor combustion model and the DRM19 skeletal mechanism against experimental data of the Darmstadt turbulent stratified flame. LES results show good to excellent agreement with the measured data. [ABSTRACT FROM AUTHOR]
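For orientation, here is a minimal standalone Cantera (Python) example of the kind of finite-rate chemistry integration that an ISAT table retrieves instead of recomputing; it uses the GRI-Mech 3.0 file bundled with recent Cantera releases, since the DRM19 mechanism would require its own input file, and it is not the ISAT-CK7-Cantera code itself.

```python
import cantera as ct

# Stoichiometric methane/air at 1 atm and 1500 K in a constant-pressure reactor.
gas = ct.Solution("gri30.yaml")   # bundled mechanism; DRM19 needs a separate file
gas.TP = 1500.0, ct.one_atm
gas.set_equivalence_ratio(1.0, "CH4", "O2:1, N2:3.76")

reactor = ct.IdealGasConstPressureReactor(gas)
sim = ct.ReactorNet([reactor])

# Integrate the finite-rate chemistry; each such query is what an ISAT table
# would try to answer by retrieval instead of direct integration.
t = 0.0
while t < 2e-3:
    t = sim.step()
print(f"final T = {reactor.T:.1f} K at t = {t * 1e3:.3f} ms")
```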
- Published
- 2017
- Full Text
- View/download PDF
45. Interfaces
- Author
-
Wick, Thomas and Wick, Thomas
- Abstract
In this course, coupled problems with interfaces are considered. Some applications and examples are discussed first. Then, interfaces are defined and classified into three categories. Numerical modeling of interfaces is a central aspect of this presentation. These theoretically oriented parts are followed by numerical simulations using an open-source fluid-structure interaction benchmark code based on the finite element library deal.II. For joint coding, a Docker image was installed on qarnot and repl.it for cloud computing. Course held at the CSMA Junior section workshop ahead of the 14th WCCM & ECCOMAS Congress 2020.
- Published
- 2021
46. FALL3D-8.0: a computational model for atmospheric transport and deposition of particles, aerosols and radionuclides – Part 2: Model validation
- Author
-
Barcelona Supercomputing Center, Prata, Andrew T., Mingari, Leonardo, Folch, Arnau, Macedonio, Giovanni, and Costa, Antonio
- Abstract
This paper presents model validation results for the latest version release of the FALL3D atmospheric transport model. The code has been redesigned from scratch to incorporate different categories of species and to overcome legacy issues that precluded its preparation towards extreme-scale computing. The model validation is based on the new FALL3D-8.0 test suite, which comprises a set of four real case studies that encapsulate the major features of the model; namely, the simulation of long-range fine volcanic ash dispersal, volcanic SO2 dispersal, tephra fallout deposits and the dispersal and deposition of radionuclides. The first two test suite cases (i.e. the June 2011 Puyehue-Cordón Caulle ash cloud and the June 2019 Raikoke SO2 cloud) are validated against geostationary satellite retrievals and demonstrate the new FALL3D data insertion scheme. The metrics used to validate the volcanic ash and SO2 simulations are the structure, amplitude and location (SAL) metric and the figure of merit in space (FMS). The other two test suite cases (i.e. the February 2013 Mt. Etna ash cloud and associated tephra fallout deposit, and the dispersal of radionuclides resulting from the 1986 Chernobyl nuclear accident) are validated with scattered ground-based observations of deposit load and local particle grain size distributions and with measurements from the Radioactivity Environmental Monitoring database. For validation of tephra deposit loads and radionuclides, we use two variants of the normalised root-mean-square error metric. We find that FALL3D-8.0 simulations initialised with data insertion consistently improve agreement with satellite retrievals at all lead times up to 48 h for both volcanic ash and SO2 simulations. In general, SAL scores lower than 1.5 and FMS scores greater than 0.40 indicate acceptable agreement with satellite retrievals of volcanic ash and SO2. In addition, we show very good agreement, across several orders of magnitude, between the model and observations. This research has been supported by the European Commission, H2020 Excellence Science (ChEESE, grant no. 823844), the European Commission, H2020 Marie Skłodowska-Curie Actions (STARS, grant no. 754433), the European Commission, H2020 Research Infrastructures (EUROVOLC, grant no. 731070) and the Ministero dell'Istruzione, dell'Università e della Ricerca (grant no. 805 FOE 2015).
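A minimal Python sketch of two of the validation metrics mentioned above, the figure of merit in space (overlap of threshold-exceedance areas) and a simple normalised RMSE; the exceedance threshold, the normalisation, and the synthetic fields are assumptions, not the paper's exact definitions.

```python
import numpy as np

def figure_of_merit_in_space(model, obs, threshold):
    """FMS: intersection over union of the areas where the modelled and
    observed fields exceed a threshold (e.g. an ash column-load limit)."""
    m = model >= threshold
    o = obs >= threshold
    union = np.logical_or(m, o).sum()
    return np.logical_and(m, o).sum() / union if union else np.nan

def nrmse(model, obs):
    """Root-mean-square error normalised by the mean of the observations
    (one of several possible normalisations)."""
    return np.sqrt(np.mean((model - obs) ** 2)) / np.mean(obs)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 1.0, size=(50, 50))             # synthetic gridded field
model = obs * rng.normal(1.0, 0.2, size=obs.shape)   # perturbed "model" field
print(f"FMS   = {figure_of_merit_in_space(model, obs, threshold=2.0):.2f}")
print(f"NRMSE = {nrmse(model, obs):.2f}")
```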
- Published
- 2021
47. Geonetdigitizer: Open source code to digitalize information given in Wulff and Lambert nets.
- Author
-
Suarez-Burgoa, Ludger O.
- Abstract
The article presents the theoretical basis, usefulness, and validation of the computational program geonetDigitizer (released as open-source code under the BSD-2 license), designed as a toolbox for the MATLAB© programming language. The program geonetDigitizer was designed for the digitization of plots expressed in the Wulff and Lambert spherical projections, which are used extensively in structural geology and geomechanics. The program is released as open code in order to recover historical data, as well as old and modern information, presented in these projections, thereby reducing the risk of the phenomenon known as the Digital Dark Age. To validate the code, two examples are presented in which known reference values are compared with the corresponding values obtained through digitizing. Finally, two application examples are presented in the appendix. [ABSTRACT FROM AUTHOR]
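A Python sketch of the forward projections such a digitizer must invert: a line with a given trend and plunge maps to different net coordinates on the Wulff (equal-angle) and Lambert (equal-area) nets. geonetDigitizer itself is a MATLAB toolbox, so this is only an illustration of the geometry involved.

```python
import numpy as np

def line_to_net(trend_deg, plunge_deg, net="wulff", radius=1.0):
    """Lower-hemisphere net coordinates (x east, y north) of a line.

    Wulff (equal-angle):  r = R * tan(45 deg - plunge / 2)
    Lambert (equal-area): r = R * sqrt(2) * sin(45 deg - plunge / 2)
    A digitizer applies the inverse mapping to recover trend and plunge
    from picked (x, y) points on the scanned plot.
    """
    trend = np.radians(trend_deg)
    half = np.radians(45.0 - plunge_deg / 2.0)
    if net == "wulff":
        r = radius * np.tan(half)
    else:  # equal-area; scaled so the primitive circle has r = radius
        r = radius * np.sqrt(2.0) * np.sin(half)
    return r * np.sin(trend), r * np.cos(trend)

print(line_to_net(135.0, 30.0, net="wulff"))
print(line_to_net(135.0, 30.0, net="lambert"))
```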
- Published
- 2015
- Full Text
- View/download PDF
48. Flaw and Configuration Analysis of Cloud Component Using First Order Logic.
- Author
-
Ando, Ruo
- Abstract
Nowadays, large-scale systems of open-source code are adopted for mission-critical systems in cloud computing environments. However, despite the availability of cloud components as open-source software, no methodology has been proposed for analyzing configuration flaws in these open-source systems. In this paper we propose a first-order logic (FoL) based configuration analysis for detecting configuration flaws in source code. In the proposed system, program code is translated into a clausal representation of FoL. Extracting the call chain from a detected flaw to the configuration part enables us to find where and how to eliminate the flaw in a large-scale cloud component. In our experiments, we discovered several configurations with potential vulnerabilities in the large-scale open-source code of cloud components. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
49. Deneb: An open-source high-performance multi-physical flow solver based on high-order DRM-DG method.
- Author
-
You, Hojun, Kim, Juhyun, and Kim, Chongam
- Subjects
- *
FLOW simulations , *NAVIER-Stokes equations , *SUPERSONIC flow , *GROUNDWATER flow , *NUMERICAL integration , *KRYLOV subspace , *DIGITAL rights management - Abstract
High-order methods are being recognized as powerful tools for handling scale-resolving simulations over complex geometry. However, several obstacles still block their complete applications to practical engineering problems: a compromise between accuracy and efficiency on mixed-curved meshes, inherent vulnerability to numerical oscillations, and lack of open-source high-performance solvers for researchers. To address these issues, we present Deneb, an open-source high-order accurate numerical solver that enables high-performance scale-resolving simulations on PDE-based flow systems. Deneb uses the physical domain-based modal discontinuous Galerkin (DG) method; thus, it can provide an arbitrary high-order accurate solution on mixed-curved meshes and has the potential for handling polyhedral meshes as well. The direct reconstruction method (DRM) efficiently executes the numerical integration of DG volume and surface integrals without accuracy loss on non-affine elements where mapping functions are high-degree. The resulting DRM-DG method eliminates the severe cost of a quadrature-based approach on mixed-curved meshes. Deneb offers explicit and implicit Runge–Kutta methods as well to achieve high-order accuracy in time. In addition, Krylov subspace methods and preconditioners are available for high-performance linear system solving in parallel. Deneb possesses a strong capability to resolve multi-physical shocks without numerical instabilities with the aid of multi-dimensional limiting and artificial viscosity methods. In particular, the hierarchical multi-dimensional limiting process enables efficient computations of supersonic flows without time-step restriction. The current release of Deneb covers the simulations of hypersonic equilibrium and magneto-hydrodynamic flows as well as compressible Navier–Stokes equations, but it has the potential to solve any PDE-based multi-physical flow systems. Several benchmark problems are presented to highlight Deneb's capability to perform scale-resolving and multi-physical flow simulations. A scalability test is also presented to verify the scaling characteristics of Deneb for high-performance computing. Program title: Deneb CPC library link to program files: https://doi.org/10.17632/723n5r797n.1 Developer's repository link: https://github.com/HojunYouKr/Deneb Licensing provisions: BSD-3-Clause Programming language: C++17 Nature of problem: The physical domain-based modal DG method can achieve the expected order of accuracy with the optimal number of polynomial bases even on non-affine elements. However, the DG method becomes significantly expensive on high-order curved elements when using quadrature rules, blocking its applicability to practical engineering problems. The numerical integration should be much more efficient without compromising accuracy. In addition, the less diffusive nature of high-order methods makes them susceptible to producing spurious numerical oscillations near flow discontinuities, potentially leading to numerical instabilities. Thus, an accurate and robust shock-capturing method is essential to simulate multi-physical flows under compressible regimes. Finally, the solver needs high scalability to perform large-scale computations in parallel. Solution method: DRM is applied to the DG volume and surface integrals to perform efficient numerical integration on non-affine elements without accuracy loss. 
The resulting method, DRM-DG, provides arbitrary high-order accurate solutions to various PDE-based flow problems on mixed-curved meshes. The solution is also high-order accurate in time due to high-order explicit and implicit Runge–Kutta methods implemented. The external library enables high-performance linear system solving with various preconditioners in parallel. Both multi-dimensional limiting and artificial viscosity methods suppress unwanted subcell oscillations across physical discontinuities. In particular, the limiting methods simulate complex supersonic flows efficiently without time-step restriction. The solver is highly scalable on parallel computing with the aid of non-blocking communications and latency hiding. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
50. Building level flood exposure analysis using a hydrodynamic model.
- Author
-
Bertsch, Robert, Glenis, Vassilis, and Kilsby, Chris
- Subjects
- *
FLOOD risk , *FLOODS , *ARCHITECTURAL details , *CITIES & towns , *ACTUARIAL risk , *DECISION making - Abstract
The advent of detailed hydrodynamic model simulations of urban flooding has not been matched by improved capabilities in flood exposure analysis, which relies on validation against observed data. This work introduces a generic, building-level flood exposure analysis tool applying high resolution flood data and building geometries derived from hydrodynamic simulations performed with the 2D hydrodynamic flood modelling software CityCAT. Validation data were obtained from a survey of affected residents following a large pluvial flood event in Newcastle upon Tyne, UK. Sensitivity testing was carried out for different hydrodynamic model and exposure tool settings, and between 68% and 75% of the surveyed buildings were correctly modelled as either flooded or not flooded. The tool tends to underrepresent flooding, with a better performance in identifying true negatives (i.e. no flooding observed with no flooding modelled) compared to true positives. As higher true positive rates were accompanied by higher false positive rates, no single scenario could be identified as the optimal solution. However, the results suggest a greater sensitivity of the results to the classification scheme than to the buffer distance applied. Overall, if applied to high resolution flood depth maps, the method is efficient and suitable for application to large urban areas for flood risk management and insurance analysis purposes. • A flood exposure analysis tool which uses high resolution inundation grids and detailed building geometries for cities, avoiding aggregation of data. • The tool is validated against observed data collected after a large pluvial flood event. • Developed as a Python-based Jupyter notebook, the code can be easily adapted and extended to accommodate specific user needs. • Decision making support tool for stakeholders in flood risk management. [ABSTRACT FROM AUTHOR]
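A minimal Python sketch, using shapely, of the building-level classification step described above: a building is flagged as flooded if any depth-grid cell within a buffer around its footprint exceeds a depth threshold. The buffer distance, depth threshold, and data layout are assumptions, not the published tool's settings.

```python
import numpy as np
from shapely.geometry import Point, Polygon

def building_flooded(footprint, cell_centres, depths,
                     buffer_m=2.0, depth_threshold=0.10):
    """Classify one building as flooded / not flooded.

    footprint: shapely Polygon of the building outline
    cell_centres: (n, 2) array of flood-grid cell centre coordinates
    depths: (n,) array of simulated water depths in metres
    """
    zone = footprint.buffer(buffer_m)          # search area around the building
    for (x, y), d in zip(cell_centres, depths):
        if d >= depth_threshold and zone.contains(Point(x, y)):
            return True
    return False

# Hypothetical 10 m x 10 m building and two nearby cells, one wet and close by.
bldg = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
cells = np.array([[11.0, 5.0], [30.0, 30.0]])
depths = np.array([0.25, 0.40])
print(building_flooded(bldg, cells, depths))   # True: a deep cell lies in the buffer
```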
- Published
- 2022
- Full Text
- View/download PDF