Search Results (318 results)
2. The JAL Guide to the Professional Literature.
- Author
-
Borchardt, D.H., Horrocks, Norman, Dahms, Moshie, Rice, James, Croghan, Antony, Morrison, Ray L., Munford, W.A., Harrison, K.C., Gaines, Ervin J., Owens, Marie Foster, Fielding, Derek, Broadbent, Marrianne, Martin, Bill, Collins, Anne M.K., Hamlin, Arthur T., Miller, Mary Elsie, Maloney, Karen, Love, Jane Hazelton, and Boehmer, M. Clare
- Subjects
- *
NONFICTION - Abstract
Reviews several non-fiction books about library science: 'Collected Papers of Frederick G. Kilgour,' edited by Lois L. Yoakam; 'Encyclopedia of Library and Information Science,' edited by Allen Kent; 'The Library in Society,' by A. Robert Rogers and K. McChesney.
- Published
- 1985
3. Sound Rising from the Paper: Nineteenth-Century Martial Arts Fiction and the Chinese Acoustic Imagination.
- Author
-
Mason, Paul H.
- Subjects
- *
MARTIAL arts , *NONFICTION - Published
- 2018
- Full Text
- View/download PDF
4. Editing Research: The Author Editing Approach to Providing Effective Support to Writers of Research Papers, Valerie Matarese. Information Today. Medford, New Jersey (2016). 244 pp., Softbound: $49.50, Web Order Price: $44.55.
- Author
-
Burgess, Sally
- Subjects
- *
MANUSCRIPT preparation (Authorship) , *REPORT writing , *NONFICTION - Published
- 2017
- Full Text
- View/download PDF
5. Writing and publishing science research papers in English: A global perspective.
- Author
-
Mu, Congjun
- Subjects
- *
SCIENCE publishing , *SCIENTIFIC terminology , *NONFICTION , *HIGHER education - Published
- 2015
- Full Text
- View/download PDF
6. lifex-cfd: An open-source computational fluid dynamics solver for cardiovascular applications.
- Author
-
Africa, Pasquale Claudio, Fumagalli, Ivan, Bucelli, Michele, Zingaro, Alberto, Fedele, Marco, Dede', Luca, and Quarteroni, Alfio
- Subjects
- *
COMPUTATIONAL fluid dynamics , *NAVIER-Stokes equations , *HEART valves , *PROGRAMMING languages , *SOURCE code - Abstract
Computational fluid dynamics (CFD) is an important tool for the simulation of cardiovascular function and dysfunction. Due to the complexity of the anatomy, the transitional regime of blood flow in the heart, and the strong mutual influence between the flow and the physical processes involved in heart function, the development of accurate and efficient CFD solvers for cardiovascular flows is still a challenging task. In this paper, we present lifex-cfd, an open-source CFD solver for cardiovascular simulations based on the lifex finite element library, written in modern C++ and exploiting distributed memory parallelism. We model blood flow in both physiological and pathological conditions via the incompressible Navier-Stokes equations, accounting for moving cardiac valves, moving domains, and transition-to-turbulence regimes. In this paper, we provide an overview of the underlying mathematical formulation, numerical discretization, implementation details, and examples of how to use lifex-cfd. We verify the code through rigorous convergence analyses, and we show its almost ideal parallel speedup. We demonstrate the accuracy and reliability of the numerical methods implemented through a series of idealized and patient-specific vascular and cardiac simulations, in different physiological flow regimes. The lifex-cfd source code is available under the LGPLv3 license, to ensure its accessibility and transparency to the scientific community, and to facilitate collaboration and further developments. Program Title: lifex-cfd CPC Library link to program files: https://doi.org/10.17632/hzsnc3jgds.1 Developer's repository link: https://gitlab.com/lifex/lifex-cfd Licensing provisions: LGPLv3 Programming language: C++ (standard ≥17) Supplementary material: https://doi.org/10.5281/zenodo.7852088 contains the application executable in binary form, compatible with any recent enough x86-64 Linux system, assuming that glibc version ≥ 2.28 is installed. 
Data and parameter files necessary to replicate the test cases described in this manuscript are also available. Nature of problem: The program runs computational fluid dynamics simulations of cardiovascular blood flows in physiological and pathological conditions, modeled through the incompressible Navier-Stokes equations, including moving cardiac valves, moving domains (such as contracting cardiac chambers) in the arbitrary Lagrangian-Eulerian framework, and transition-to-turbulence flow. Given the scale of the typical applications, the program is designed for parallel execution. Solution method: The equations are discretized using the finite element method, on either tetrahedral or hexahedral meshes. The software builds on top of deal.II, implementing the mathematical models and numerical methods specific to CFD cardiovascular simulations. Parallel execution exploits the MPI paradigm. The software supports both Trilinos and PETSc as linear algebra backends. Additional comments including restrictions and unusual features: The program provides a general-purpose executable that can be used to run CFD simulations without having to access or modify the source code. Simulations are set up through a user-friendly yet flexible interface, by means of readable and self-documenting parameter files. On top of that, more advanced users can modify the source code to implement more sophisticated test cases. lifex-cfd supports checkpointing, i.e., simulations can be stopped and restarted at a later time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Neko: A modern, portable, and scalable framework for high-fidelity computational fluid dynamics.
- Author
-
Jansson, Niclas, Karp, Martin, Podobas, Artur, Markidis, Stefano, and Schlatter, Philipp
- Subjects
- *
COMPUTATIONAL fluid dynamics , *REYNOLDS number , *COMPUTER software developers , *TURBULENCE , *TURBULENT flow - Abstract
Computational fluid dynamics (CFD), in particular applied to turbulent flows, is a research area of great engineering and fundamental physical interest. However, already at moderately high Reynolds numbers the computational cost becomes prohibitive, as the range of active spatial and temporal scales widens quickly. Scale-resolving simulations in particular, including large-eddy simulation (LES) and direct numerical simulation (DNS), thus need to rely on modern, efficient numerical methods and corresponding software implementations. Recent trends and advancements, including more diverse and heterogeneous hardware in High-Performance Computing (HPC), are challenging software developers in their pursuit of good performance and numerical stability. The well-known maxim "software outlives hardware" may no longer necessarily hold true, and developers are today forced to re-factor their codebases to leverage these powerful new systems. In this paper, we present Neko, a new portable framework for high-order spectral element discretization, targeting turbulent flows in moderately complex geometries. Neko is fully available as open-source software. Unlike prior works, Neko adopts a modern object-oriented approach in Fortran 2008, allowing multi-tier abstractions of the solver stack and facilitating hardware backends ranging from general-purpose processors (CPUs) down to exotic vector processors and FPGAs. We show that Neko's performance and accuracy are comparable to NekRS, and thus on par with Nek5000's successor on modern CPU machines. Furthermore, we develop a performance model, which we use to discuss challenges and opportunities for high-order solvers on emerging hardware. • We introduce Neko, a modernized framework for high-fidelity CFD simulations. • We reveal Neko's inner implementation details and design decisions. • We develop a performance model, which we validate and use to project performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. H-wave – A Python package for the Hartree-Fock approximation and the random phase approximation.
- Author
-
Aoyama, Tatsumi, Yoshimi, Kazuyoshi, Ido, Kota, Motoyama, Yuichi, Kawamura, Taiki, Misawa, Takahiro, Kato, Takeo, and Kobayashi, Akito
- Subjects
- *
HARTREE-Fock approximation , *CONDENSED matter physics , *HUBBARD model , *PROGRAMMING languages , *INTEGRATED software - Abstract
H-wave is an open-source software package for performing the Hartree–Fock approximation (HFA) and random phase approximation (RPA) for a wide range of Hamiltonians of interacting fermionic systems. In HFA calculations, H-wave examines the stability of several symmetry-broken phases, such as anti-ferromagnetic and charge-ordered phases, in the given Hamiltonians at zero and finite temperatures. Furthermore, H-wave calculates the dynamical susceptibilities using RPA to examine the instability toward the symmetry-broken phases. By preparing a simple input file for specifying the Hamiltonians, users can perform HFA and RPA for standard Hamiltonians in condensed matter physics, such as the Hubbard model and its extensions. Additionally, users can use a Wannier90-like format to specify fermionic Hamiltonians. A Wannier90 format is implemented in RESPACK to derive ab initio Hamiltonians for solids. HFA and RPA for the ab initio Hamiltonians can be easily performed using H-wave. In this paper, we first explain the basis of HFA and RPA, and the basic usage of H-wave, including download and installation. Thereafter, the input file formats implemented in H-wave, including the Wannier90-like format for specifying the interacting fermionic Hamiltonians, are discussed. Finally, we present several examples of H-wave, such as zero-temperature HFA calculations for the extended Hubbard model on a square lattice, finite-temperature HFA calculations for the Hubbard model on a cubic lattice, and RPA in the extended Hubbard model on a square lattice. Program Title: H-wave CPC Library link to program files: https://doi.org/10.17632/9gr6pxhfjm.1 Developer's repository link: https://github.com/issp-center-dev/H-wave Code Ocean capsule: https://codeocean.com/capsule/6875177 Licensing provisions: GNU General Public License version 3 Programming language: Python3 External routines/libraries: NumPy, SciPy, Tomli, Requests. 
Nature of problem: Physical properties of strongly correlated electrons, such as the ground-state phase structure and response functions at zero and finite temperatures, are examined. Solution method: Calculations based on the unrestricted Hartree-Fock approximation and the random phase approximation are performed for quantum lattice models such as the Hubbard model and its extensions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
9. Expanding PyProcar for new features, maintainability, and reliability.
- Author
-
Lang, Logan, Tavadze, Pedram, Tellez, Andres, Bousquet, Eric, Xu, He, Muñoz, Francisco, Vasquez, Nicolas, Herath, Uthpala, and Romero, Aldo H.
- Subjects
- *
GRAPHICAL user interfaces , *PYTHON programming language , *FERMI surfaces , *ATOMIC orbitals , *MATERIALS science , *MAINTAINABILITY (Engineering) , *ELECTRONIC structure - Abstract
This paper presents a comprehensive update to PyProcar, a versatile Python package for analyzing and visualizing density functional theory (DFT) calculations in materials science. The latest version introduces a modularized codebase, a centralized example data repository, and a robust testing framework, offering a more reliable, maintainable, and scalable platform. Expanded support for various DFT codes broadens its applicability across research environments. Enhanced documentation and an example gallery make the package more accessible to new and experienced users. Incorporating advanced features such as band unfolding, noncollinear calculations, and derivative calculations of band energies enriches its analytic capabilities, providing deeper insights into electronic and structural properties. The package also incorporates PyPoscar, a specialized toolkit for manipulating POSCAR files, broadening its utility in computational materials science. These advancements solidify PyProcar's position as a comprehensive and highly adaptable tool, effectively serving the evolving needs of the materials science community. Program title: PyProcar CPC Library link to program files: https://doi.org/10.17632/d4rrfy3dy4.2 Developer's repository link: https://github.com/romerogroup/pyprocar Licensing provisions: GPLv3 Programming language: Python Supplementary material: PyProcar Supplementary Information Journal reference of previous version: Comput. Phys. Commun. 251 (2020) 107080, https://doi.org/10.1016/j.cpc.2019.107080 Does the new version supersede the previous version?: Yes Reasons for the new version: Changes in the directory structure, the addition of new features, enhancement of the manual and user documentation, and generation of interfaces with other electronic structure packages. Summary of revisions: These updates enhance the package's capabilities and ensure maintainability, reliability, and ease of use for both developers and users. 
Nature of problem: To automate, simplify, and serialize the analysis of band structure and Fermi surface, especially for high throughput calculations. Solution method: Implement a Python library able to handle, combine, parse, extract, plot, and even repair data from density functional calculations from diverse electronic structure packages. PyProcar uses color maps on the band structures or Fermi surfaces to give a simple representation of the relevant characteristics of the electronic structure. Additional comments including restrictions and unusual features: PyProcar can produce high-quality figures of band structures and Fermi surfaces (2D and 3D), projection of atomic orbitals, atoms, and/or spin components. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Life Is With Others—Selected Papers on Child Psychiatry.
- Author
-
Henderson, Schuyler W.
- Subjects
- *
CHILD psychology , *NONFICTION - Abstract
The article reviews the book "Life Is With Others: Selected Papers on Child Psychiatry by Donald J. Cohen," edited by Andrés Martin and Robert A. King.
- Published
- 2006
- Full Text
- View/download PDF
11. Valerie Matarese, Editing Research: The Author Editing Approach to Providing Effective Support to Writers of Research Papers. Information Today, Medford, NJ (2016). 244 pp., ISBN: 978-1-57387-531-8, US $49.50.
- Author
-
Conrad, Nina
- Subjects
- *
ACADEMIC discourse , *COPY editing , *NONFICTION - Published
- 2017
- Full Text
- View/download PDF
12. White Paper on Electronic Journal Usage Statistics (Book Review).
- Author
-
Dugan, Robert E.
- Subjects
- *
ELECTRONIC journals , *NONFICTION - Abstract
Reviews the book `White Paper on Electronic Journal Usage Statistics,' by Judy Luther.
- Published
- 2001
- Full Text
- View/download PDF
13. OnTrack: Reflecting on domain specific formal methods for railway designs.
- Author
-
James, Phillip, Moller, Faron, and Pantekis, Filippos
- Subjects
- *
RAILROAD design & construction , *ENGINEERING models , *ROLE models - Abstract
OnTrack is a tool, implemented using model-driven engineering frameworks, that supports workflows for railway verification. Starting with graphical scheme plans and finishing with automatically generated formal models set up for verification, OnTrack allows railway engineers to interact with verification procedures through encapsulated formal methods. OnTrack is grounded on a domain-specific language (DSL) capturing scheme plans and supports generation of various formal models using model transformations. In this paper, we detail the role model-driven engineering takes within OnTrack and reflect on the use of model-driven engineering concepts for developing domain-specific formal methods toolsets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. SolAR: Automated test-suite generation for solidity smart contracts.
- Author
-
Driessen, S.W., Di Nucci, D., Tamburri, D.A., and van den Heuvel, W.J.
- Subjects
- *
CONTRACTS , *GENETIC algorithms , *SOLAR power plants , *BLOCKCHAINS - Abstract
Smart contracts have rapidly gained popularity as self-contained pieces of code, especially those run on the Ethereum blockchain. On the one hand, smart contracts are immutable, have transparent workings, and execute autonomously. On the other hand, these qualities make it essential to properly test the behavior of a smart contract before deploying it. In this paper, we introduce SolAR, a tool and approach for Solidity Automated test-suite GeneRation. SolAR allows smart contract developers to automatically generate test suites for Solidity smart contracts, optimized for branch coverage using either a state-of-the-art genetic algorithm or a fuzzing approach. It enables a novel way to handle blockchain operations (or ChainOps) from a pipeline perspective, entailing larger-scale as well as more manageable and maintainable service continuity. [ABSTRACT FROM AUTHOR]
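The coverage-guided fuzzing idea behind SolAR's fuzzing mode can be sketched generically: generate random inputs, record which branches each input exercises, and keep an input only when it covers a branch not seen before. The toy below is not SolAR's implementation (SolAR targets Solidity/EVM bytecode); `toy_contract` and its branch labels are invented purely for illustration.

```python
import random

def toy_contract(x, y):
    """Stand-in for an instrumented contract function; returns the set
    of branch IDs exercised by this input."""
    covered = set()
    if x > 100:
        covered.add("x_gt_100")
        if y == x - 1:
            covered.add("y_eq_x_minus_1")  # rare branch, hard to hit randomly
        else:
            covered.add("y_ne")
    else:
        covered.add("x_le_100")
    return covered

def fuzz(target, n_iter=2000, seed=0):
    """Random fuzzing loop: keep only inputs that add new branch coverage."""
    rng = random.Random(seed)
    suite, total = [], set()
    for _ in range(n_iter):
        args = (rng.randint(-200, 200), rng.randint(-200, 200))
        cov = target(*args)
        if not cov <= total:  # input covers at least one unseen branch
            suite.append(args)
            total |= cov
    return suite, total
```

A genetic-algorithm mode would replace the purely random generator with mutation and crossover of the retained inputs, which is what makes deep, guarded branches reachable in practice.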
- Published
- 2024
- Full Text
- View/download PDF
15. ACSmt: A plugin for eclipse papyrus to model systems of systems.
- Author
-
Remond Harbo, Sean Kristian, Palmelund Voldby, Emil, Madsen, Jonas, and Albano, Michele
- Subjects
- *
SYSTEM of systems , *SOURCE code , *UNIFIED modeling language - Abstract
While System of Systems (SoS) architectures for large and complex software projects are gaining momentum, the commonly used modeling and tooling approaches are still general-purpose or oriented towards single systems. Developers could benefit from methods and tools that avoid system-centric details in favor of native SoS modeling support. This paper presents a diagram-centric modeling tool with native SoS modeling support. The tool is implemented as a plugin for the Eclipse Papyrus modeling tool and was showcased as a demo at MODELS'22. The code of the plugin is freely available via GitHub. • The Abstract Communicating Systems (ACS) methodology can support designing complex platforms based on Systems of Systems. • ACS was mapped onto UML 2.5, and the ACS modeling tool (ACSmt) is the first tool implementing the ACS methodology. • ACSmt is implemented as an Eclipse Papyrus plugin, which supports UML 2.5 and is well-accepted in the industry. • ACSmt allows for verifying structural properties of the designed SoS. • The open-source code of ACSmt can be used as a reference when implementing plugins for Papyrus. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. ESA: An efficient sequence alignment algorithm for biological database search on Sunway TaihuLight.
- Author
-
Zhang, Hao, Huang, Zhiyi, Chen, Yawen, Liang, Jianguo, and Gao, Xiran
- Subjects
- *
SEQUENCE alignment , *BIOLOGICAL databases , *SUPERCOMPUTERS , *DATABASE searching , *DATA structures , *AMINO acid sequence - Abstract
In computational biology, biological database search plays a very important role. Since the COVID-19 outbreak, it has provided significant help in identifying common characteristics of viruses and developing vaccines and drugs. Sequence alignment, a method for finding similarity, homology, and other information between gene/protein sequences, is the usual tool in database search. With the explosive growth of biological databases, the search process has become extremely time-consuming. However, existing parallel sequence alignment algorithms cannot deliver efficient database search due to low utilization of resources such as cache memory and performance issues such as load imbalance and high communication overhead. In this paper, we propose an efficient sequence alignment algorithm on Sunway TaihuLight, called ESA, for biological database search. ESA adopts a novel hybrid alignment algorithm combining local and global alignments, which has higher accuracy than other sequence alignment algorithms. Further, ESA has several optimizations, including cache-aware sequence alignment, capacity-aware load balancing, and bandwidth-aware data transfer. They are implemented on the heterogeneous processor SW26010 adopted in the world's 6th fastest supercomputer, Sunway TaihuLight. The implementation of ESA is evaluated with the Swiss-Prot database on Sunway TaihuLight and other platforms. Our experimental results show that ESA achieves a speedup of 34.5 on a single core group (with 65 cores) of Sunway TaihuLight. The strong and weak scalability of ESA is tested with 1 to 1024 core groups of Sunway TaihuLight. The results show that ESA has linear weak scalability and very impressive strong scalability. For strong scalability, ESA achieves a speedup of 338.04 with 1024 core groups compared with a single core group. We also show that our proposed optimizations are applicable to GPUs, Intel multicore processors, and heterogeneous computing platforms. 
• In this paper, we propose and implement an efficient sequence alignment algorithm, ESA, for biological database search on SW26010 heterogeneous processors. This algorithm adopts both local and global alignments for biological database search with several optimizations. ESA achieves high computational performance without sacrificing accuracy. To the best of our knowledge, this is the first attempt to parallelize hybrid sequence alignment on Sunway TaihuLight using multi-level optimizations. • We propose three optimization strategies in ESA: cache-aware sequence alignment, capacity-aware load balancing and bandwidth-aware data transfer. Cache-aware sequence alignment effectively reduces the size of the data structure for sequence alignment and fully utilizes the vectorization of the slave cores of SW26010. With capacity-aware load balancing, we distribute the workload evenly among the cores of SW26010. With bandwidth-aware data transfer, ESA reduces the communication overhead by using asynchronous DMA transmission and RLC. • We evaluate the performance of ESA using the Swiss-Prot database on Sunway TaihuLight. Our experimental results show that ESA achieves a speedup of 34.5 times on a single CG over the manager core. Compared with a serial implementation on Intel (R) Xeon (R) CPU E5-2620 v4 processor, ESA achieves a speedup of 21.6 on a single CG. We also demonstrate that ESA has linear weak scalability and very competitive strong scalability. Finally, we compare ESA with mainstream algorithms on the CPU+GPU platform and achieve the highest GCUPS of 228.91. [ABSTRACT FROM AUTHOR]
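The dynamic-programming kernel that alignment codes of this kind vectorize and distribute can be illustrated with a minimal serial Smith-Waterman score computation. This is a sketch only: ESA's actual hybrid local/global algorithm and its SW26010-specific optimizations are not reproduced here, and the scoring parameters are arbitrary.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Local alignment score via the classic DP recurrence, keeping only
    one previous row (the cache-friendly layout such solvers build on)."""
    cols = len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * cols
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            # Local alignment clamps every cell at zero.
            curr[j] = max(0, prev[j - 1] + s, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

score = smith_waterman("ACGT", "ACGT")  # perfect match: 4 bases * match(2) = 8
```

A global (Needleman-Wunsch) variant differs only in dropping the clamp at zero and initializing the borders with gap penalties; a hybrid scheme like ESA's combines both recurrences.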
- Published
- 2023
- Full Text
- View/download PDF
17. Development of the fully Geant4 compatible package for the simulation of Dark Matter in fixed target experiments.
- Author
-
Banto Oberhauser, B., Bisio, P., Celentano, A., Depero, E., Dusaev, R.R., Kirpichnikov, D.V., Kirsanov, M.M., Krasnikov, N.V., Marini, A., Marsicano, L., Molina-Bueno, L., Mongillo, M., Shchukin, D., Sieber, H., and Voronchikhin, I.V.
- Subjects
- *
DARK matter , *C++ , *MUONS , *PHOTON beams , *SIMULATION software , *USER interfaces - Abstract
The search for new comparably light (well below the electroweak scale) feebly interacting particles is an exciting possibility to explain some mysterious phenomena in physics, among them the origin of Dark Matter. The sensitivity study through detailed simulation of projected experiments is a key point in estimating their potential for discovery. Several years ago we created the DMG4 package for the simulation of DM (Dark Matter) particles in fixed target experiments. The natural approach is to integrate this simulation into the same program that performs the full simulation of particles in the experiment setup. The Geant4 toolkit framework was chosen as the most popular and versatile solution nowadays. The simulation of DM particle production by this package accommodates several possible scenarios, employing electron, muon or photon beams and involving various mediators, such as vector, axial vector, scalar, pseudoscalar, or spin 2 particles. The bremsstrahlung, annihilation or Primakoff processes can be simulated. The package DMG4 contains a subpackage DarkMatter with cross section methods weakly connected to Geant4. It can be used in different frameworks. In this paper, we present the latest developments of the package, such as extending the list of possible mediator particle types, refining formulas for the simulation and extending the mediator mass range. The user interface is also made more flexible and convenient. In this work, we also demonstrate the usage of the package, the improvements in the simulation accuracy and some cross-check validations. Program title: DMG4 CPC Library link to program files: https://doi.org/10.17632/cmr4bcrj6j.1 Licensing provisions: GNU General Public License 3 Programming language: C++ Journal reference of previous version: Comput. Phys. Commun. 269 (2021) 108129 Does the new version supersede the previous version?: Yes Reasons for the new version: Numerous developments, addition of new features Summary of revisions: WW approximation cross sections for the muon beam are implemented and cross-checked; models with semivisible A′ (inelastic Dark Matter) and spin 2 mediators are added. The range of possible mediator masses is extended. Several important improvements to the annihilation processes are made, and the number of possible annihilation processes is extended. The user interface is improved. Several bugs are fixed. Nature of problem: For the simulation of Dark Matter production processes in fixed target experiments, a code that can be easily integrated into programs for the full simulation of the experimental setup is needed. Solution method: A fully Geant4 compatible DM simulation package, DMG4, was presented in 2020. We present numerous further developments of this package. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. FiniteFieldSolve: Exactly solving large linear systems in high-energy theory.
- Author
-
Mangan, James
- Subjects
- *
SYSTEMS theory , *LINEAR systems , *FINITE fields , *MODULAR arithmetic , *SET theory , *PRIME numbers - Abstract
Large linear systems play an important role in high-energy theory, appearing in amplitude bootstraps and during integral reduction. This paper introduces FiniteFieldSolve, a general-purpose toolkit for exactly solving large linear systems over the rationals. The solver interfaces directly with Mathematica, is straightforward to install, and seamlessly replaces Mathematica's native solvers. In testing, FiniteFieldSolve is approximately two orders of magnitude faster than Mathematica and uses an order of magnitude less memory. The package also compares favorably against other public solvers in FiniteFieldSolve's intended use cases. As the name of the package suggests, solutions are obtained via well-known finite field methods. These methods suffer from introducing an inordinate number of modulo (or integer division) operations with respect to different primes. By automatically recompiling itself for each prime, FiniteFieldSolve converts the division operations into much faster combinations of instructions, dramatically improving performance. The technique of compiling the prime can be applied to any finite field solver, where the time savings will be solver dependent. The operation of the package is illustrated through a detailed example of an amplitude bootstrap. Program Title: FiniteFieldSolve CPC Library link to program files: https://doi.org/10.17632/ntxvp58mjg.1 Developer's repository link: https://github.com/jfmangan/FiniteFieldSolve Licensing provisions: GPLv3 Programming language: Mathematica, C++ Nature of problem: Exactly solving large linear systems over the rationals occurs in various settings in high-energy theory, for example when performing integral reduction or bootstrapping an amplitude. Solution method: The linear system is solved by repeatedly row reducing over different finite fields (see Ref. [1] and references therein). 
Finite fields avoid the intermediate expression swell inherent to arbitrary precision rationals and bypass roundoff errors from floating point numbers. A downside to using modular arithmetic is that it introduces a tremendous number of integer divisions, but this can be mitigated by compiling the divisions down to simpler instructions. The solver is designed to handle arbitrarily dense systems such as those that appear in certain amplitude bootstraps. [1] M. Kauers, Fast solvers for dense linear systems, Nucl. Phys. B Proc. Suppl. 183, 245–250 (2008), 10.1016/j.nuclphysbps.2008.09.111 [ABSTRACT FROM AUTHOR]
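The finite-field method the abstract refers to can be sketched in a few lines: row-reduce the system modulo a word-sized prime, then lift the modular solution back to exact rationals by rational reconstruction. The sketch below (hypothetical helper names, a single prime with no multi-prime Chinese-remainder step, and none of FiniteFieldSolve's compiled-prime optimizations) illustrates the idea:

```python
from fractions import Fraction

def solve_mod_p(A, b, p):
    """Gauss-Jordan elimination over GF(p); A square and nonsingular mod p."""
    n = len(A)
    M = [[a % p for a in row] + [bi % p] for row, bi in zip(A, b)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col])
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], -1, p)          # modular inverse of the pivot
        M[col] = [x * inv % p for x in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(x - f * y) % p for x, y in zip(M[r], M[col])]
    return [row[n] for row in M]

def rational_reconstruct(u, p):
    """Wang's algorithm: find n/d with |n|, |d| <= sqrt(p/2) and
    n = u*d (mod p), via the half-extended Euclidean algorithm."""
    bound = int((p // 2) ** 0.5)
    r0, r1, t0, t1 = p, u % p, 0, 1
    while r1 > bound:
        q = r0 // r1
        r0, r1 = r1, r0 - q * r1
        t0, t1 = t1, t0 - q * t1
    if t1 == 0 or abs(t1) > bound:
        raise ValueError("reconstruction failed; try a larger prime")
    return Fraction(r1, t1)
```

For example, solving 2x + y = 1, x + 3y = 1 modulo a 31-bit prime and reconstructing recovers the exact rationals x = 2/5, y = 1/5. Production solvers repeat this over many primes and combine the images, which is where the per-prime division cost the abstract discusses comes from.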
- Published
- 2024
- Full Text
- View/download PDF
19. The Making of Contemporary China.
- Author
-
Borchert, Heiko and Hampton, Mary N.
- Subjects
- MAO'S China & the Cold War (Book), TIANANMEN Papers, The (Book), HEGEMON (Book), COMING Collapse of China, The (Book)
- Abstract
Reviews several books on the political affairs of China: 'Mao's China and the Cold War,' by Chen Jian; 'The Tiananmen Papers,' edited by Zhang Liang, Andrew J. Nathan, and Perry Link; 'Hegemon: China's Plan to Dominate Asia and the World,' by Steven W. Mosher; 'The Coming Collapse of China,' by Gordon G. Chang.
- Published
- 2002
20. SMIwiz: An integrated toolbox for multidimensional seismic modelling and imaging.
- Author
-
Yang, Pengliang
- Subjects
- *
IMAGING systems in seismology , *FINITE difference time domain method , *NONLINEAR programming , *OPEN source software , *THREE-dimensional imaging , *PROGRAMMING languages - Abstract
This paper contributes an open-source software package, SMIwiz, which integrates seismic modelling, reverse time migration (RTM), and full waveform inversion (FWI) into a unified computer implementation. SMIwiz has the machinery to do both 2D and 3D simulation in a consistent manner. The package features a number of computational recipes for efficient calculation of the imaging condition and inversion gradient: a dynamically evolving computing box to limit the simulation cube, and a well-designed wavefield reconstruction strategy to reduce memory consumption when dealing with 3D problems. The modelling in SMIwiz runs independently: each shot corresponds to one processor in a bijective manner to maximize scalability. A batchwise job scheduling strategy is designed to handle large 3D imaging tasks on computers with a limited number of cores. The viability of SMIwiz is demonstrated by a number of applications on benchmark models. Program Title: SMIwiz CPC Library link to program files: https://doi.org/10.17632/tygszns27k.1 Developer's repository link: https://github.com/yangpl/SMIwiz Licensing provisions: GNU General Public License v3.0 Programming language: C, Shell, Fortran External dependencies: MPI [1], FFTW [2] Nature of problem: Seismic modelling and imaging (FWI and RTM) Solution method: High-order finite-difference time-domain (FDTD) modelling on a staggered grid; quasi-Newton L-BFGS algorithm for nonlinear optimization; line search to estimate the step length based on the Wolfe condition [1] https://www.mpich.org/ [2] http://fftw.org/ [ABSTRACT FROM AUTHOR]
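The time-domain finite-difference modelling at the core of such packages can be illustrated with a minimal serial leapfrog scheme for the 1D scalar wave equation. This is a sketch only: SMIwiz uses high-order staggered-grid stencils in 2D/3D, and the grid size, source wavelet, and CFL number below are arbitrary choices for the illustration.

```python
import math

def fdtd_1d(nx=201, nt=300, c=1.0, dx=1.0, cfl=0.5):
    """Leapfrog FDTD for u_tt = c^2 u_xx, second order in space and time,
    with fixed (Dirichlet) boundaries. A Gaussian wavelet is injected at
    the domain center ('shot') and a receiver 20 cells away records the
    passing wave, mimicking a single-trace seismogram."""
    dt = cfl * dx / c                  # time step from the CFL condition
    r2 = (c * dt / dx) ** 2
    u_prev, u = [0.0] * nx, [0.0] * nx
    src, rec_pos = nx // 2, nx // 2 + 20
    seismogram = []
    for it in range(nt):
        u_next = [0.0] * nx            # boundary values stay at zero
        for i in range(1, nx - 1):
            u_next[i] = (2 * u[i] - u_prev[i]
                         + r2 * (u[i + 1] - 2 * u[i] + u[i - 1]))
        t = it * dt
        u_next[src] += math.exp(-(((t - 20.0) / 5.0) ** 2))  # source wavelet
        u_prev, u = u, u_next
        seismogram.append(u[rec_pos])
    return seismogram
```

In a full modelling code, this kernel is what gets parallelized one-shot-per-processor, and the recorded traces are the data that RTM and FWI fit.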
- Published
- 2024
- Full Text
- View/download PDF
21. MLQD: A package for machine learning-based quantum dissipative dynamics.
- Author
-
Ullah, Arif and Dral, Pavlo O.
- Subjects
- *
QUANTUM theory , *ARTIFICIAL intelligence , *PYTHON programming language , *MACHINE learning , *CONVOLUTIONAL neural networks , *QUANTUM wells - Abstract
Machine learning has emerged as a promising paradigm to study the quantum dissipative dynamics of open quantum systems. To facilitate the use of our recently published ML-based approaches for quantum dissipative dynamics, here we present an open-source Python package MLQD (https://github.com/Arif-PhyChem/MLQD), which currently supports the three ML-based quantum dynamics approaches: (1) the recursive dynamics with kernel ridge regression (KRR) method, (2) the non-recursive artificial-intelligence-based quantum dynamics (AIQD) approach and (3) the blazingly fast one-shot trajectory learning (OSTL) approach, where both AIQD and OSTL use the convolutional neural networks (CNN). This paper describes the features of the MLQD package, the technical details, optimization of hyperparameters, visualization of results, and the demonstration of the MLQD 's applicability for two widely studied systems, namely the spin-boson model and the Fenna–Matthews–Olson (FMO) complex. To make MLQD more user-friendly and accessible, we have made it available on the Python Package Index (PyPi) platform and it can be installed via ▪. In addition, it is also available on the XACS cloud computing platform (https://XACScloud.com) via the interface to the MLatom package (http://MLatom.com). Program Title: MLQD CPC Library link to program files: https://doi.org/10.17632/yxp37csy5x.1 Developer's repository link: https://github.com/Arif-PhyChem/MLQD Code Ocean capsule: https://codeocean.com/capsule/5563143/tree Licensing provisions: Apache Software License 2.0 Programming language: Python 3.0 Supplementary material: Jupyter Notebook-based tutorials External routines/libraries: Tensorflow, Scikit-learn, Hyperopt, Matplotlib, MLatom Nature of problem: Fast propagation of quantum dissipative dynamics with machine learning approaches. 
Solution method: We have developed MLQD as a comprehensive framework that streamlines and supports the implementation of our recently published machine learning-based approaches for efficient propagation of quantum dissipative dynamics. This framework encompasses: (1) the recursive dynamics with kernel ridge regression (KRR) method, as well as the non-recursive approaches utilizing convolutional neural networks (CNN), namely (2) artificial intelligence-based quantum dynamics (AIQD), and (3) one-shot trajectory learning (OSTL). Additional comments including restrictions and unusual features: 1. Users can train a machine learning (ML) model following one of the ML-based approaches: KRR, AIQD, and OSTL. 2. Users have the option to propagate dynamics with existing trained ML models. 3. MLQD also provides the transformation of trajectories into training data. 4. MLQD also supports hyperparameter optimization, using MLatom's grid search functionality for KRR, and Bayesian methods with the Tree-structured Parzen Estimator (TPE) for CNN models via the Hyperopt package. 5. MLQD also facilitates the visualization of results via auto-plotting. 6. MLQD is designed to be user-friendly and easily accessible, with availability on the XACS cloud computing platform (https://XACScloud.com) via the interface to the MLatom package (http://MLatom.com). In addition, it is also available as a pip package, which makes it easy to install. Future outlook: MLQD will be extended to more realistic systems along with the incorporation of other machine learning-based approaches as well as the traditional quantum dynamics methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
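To make the MLQD entry above concrete, the sketch below illustrates the recursive-KRR idea behind its first approach: train kernel ridge regression to map a short window of past values of an observable to the next value, then roll the model forward on its own predictions. The toy trajectory, window length, kernel width, and regularization are all illustrative assumptions; this is not MLQD code.

```python
import math

# Toy stand-in for a dissipative observable, e.g. a decaying site population.
def traj(t):
    return math.exp(-0.05 * t) * math.cos(t)

dt, W = 0.2, 4                    # time step and window length (assumed)
data = [traj(i * dt) for i in range(160)]

# Training pairs: a window of W past values -> the next value.
idx = list(range(0, 120, 2))
X = [data[i:i + W] for i in idx]
y = [data[i + W] for i in idx]

def kern(a, b, sigma=1.0):        # Gaussian kernel between two windows
    return math.exp(-sum((u - v) ** 2 for u, v in zip(a, b)) / (2 * sigma ** 2))

def solve(A, rhs):                # Gaussian elimination with partial pivoting
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

lam = 1e-3                        # ridge regularization (assumed)
K = [[kern(a, b) + (lam if i == j else 0.0) for j, b in enumerate(X)]
     for i, a in enumerate(X)]
alpha = solve(K, y)

def predict(window):
    return sum(c * kern(window, x) for c, x in zip(alpha, X))

# One-step error on held-out windows interleaved with the training set.
one_step = max(abs(predict(data[i:i + W]) - data[i + W])
               for i in range(1, 115, 2))

# Recursive rollout: feed each prediction back in as the newest input.
window, rollout = data[126:130], []
for _ in range(20):
    nxt = predict(window)
    rollout.append(nxt)
    window = window[1:] + [nxt]
```

The recursion is what distinguishes this scheme from the non-recursive AIQD/OSTL approaches, which predict the whole trajectory in one shot instead of feeding predictions back in.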
22. Detecting depression tendency with multimodal features.
- Author
-
Zhang, Hui, Wang, Hong, Han, Shu, Li, Wei, and Zhuang, Luhe
- Subjects
- *
USER-generated content , *SOCIAL media , *KNOWLEDGE representation (Information theory) , *MENTAL depression , *WORD frequency , *MENTAL health , *NAIVE Bayes classification - Abstract
• Our MTDD model integrates knowledge-driven and data-driven modeling. This avoids the problems that arise from purely data-driven techniques, such as ignoring expert experience and insight, missing the overall picture, and lacking interpretability. The model exploits not only text and semantic features but also domain knowledge to learn the representation of depression tendency, making it more robust. The Word2Vec word embedding integrates the emotional information of the words in the emotion dictionary; the model expands the existing emotion dictionary, extracts TF-IDF word-frequency features, and applies seven grammatical analysis rules to obtain a text emotion-value feature, making it better suited to the depression-tendency detection task. • The MTDD model is a hybrid deep neural network, which circumvents the weak generalization ability of a single model for identifying depression tendencies. Specifically, it combines the advantages of CNN and BiLSTM networks: the CNN extracts local features of the text, while the BiLSTM effectively captures bidirectional semantic information. This combination represents text features better and improves the model's classification accuracy. • Our MTDD model is built on real data, making it better suited to practical problems. As far as we know, many existing depression detection methods are trained only on experimental data sets, so their generalization ability is limited and they often cannot be applied in realistic scenes. In comparison, the MTDD model is trained on social platform data, which is more objective and accurate. In addition, social platform data can be obtained at a low cost. 
It is easy to operate and does not require a lot of laborious labeling. Moreover, our approach avoids the influence of subjective factors in the method of consultation by mental health experts and the influence of non-public and imperfect data used for depression. • We conducted extensive experiments on a Reddit data set and a Twitter data set. The results show that, compared to multiple latest depression detection models, our MTDD model detects users who may be depressed with a 95% F1 value and obtains SOTA results. [Display omitted] Background and Objective: Depression can severely impact physical and mental health and may even harm society. Therefore, detecting the early symptoms of depression and treating them on time is critical. The widespread use of social media has led individuals with depressive tendencies to express their emotions on social platforms, share their painful experiences, and seek support and help. Therefore, the massive available amounts of social platform data provide the possibility of identifying depressive tendencies. Methods: This paper proposes a neural network hybrid model MTDD to achieve this goal. Analysis of the content of users' posts on social platforms has facilitated constructing a post-level method to detect depressive tendencies in individuals. Compared with existing methods, the MTDD model uses the following innovative methods: First, this model is based on social platform data, which is objective and accurate, can be obtained at a low cost, and is easy to operate. The model can avoid the influence of subjective factors in the depressive tendency detection method based on consultation with mental health experts. In other words, it can avoid the problem of undisclosed and imperfect data in depressive tendency detection. 
Second, the MTDD model is a hybrid deep neural network, combining the advantages of CNN and BiLSTM networks and avoiding the poor generalization ability of a single model for depression-tendency recognition. Third, the MTDD model learns the vector representation of depression-prone text from multimodal features, including text features, semantic features, and domain knowledge, making the model more robust. Results: Extensive experimental results demonstrate that our MTDD model detects users who may have a depressive tendency with a 95% F1 score and obtains SOTA results. Conclusions: Our MTDD model can detect depressive users on social media platforms more effectively, opening the possibility of early diagnosis and timely treatment of depression. The experiments show that our MTDD model outperforms many of the latest depressive-tendency detection models. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
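The MTDD entry above combines TF-IDF word-frequency features with an emotion dictionary. A toy version of that feature-extraction step is sketched below; the corpus and the lexicon are invented for illustration and are not the paper's data or code.

```python
import math
from collections import Counter

# Toy posts and a tiny sentiment lexicon (word -> polarity in [-1, 1]).
# Both are hypothetical stand-ins for the paper's social-platform data.
posts = [
    "i feel hopeless and tired all the time",
    "great day with friends feeling happy",
    "nothing matters anymore i am so tired",
    "excited about the new project today",
]
lexicon = {"hopeless": -1.0, "tired": -0.6, "happy": 0.9,
           "excited": 0.8, "great": 0.7}

docs = [p.split() for p in posts]
n_docs = len(docs)
df = Counter(w for d in docs for w in set(d))   # document frequency

def tfidf(doc):
    # Term frequency scaled by inverse document frequency.
    tf = Counter(doc)
    return {w: (tf[w] / len(doc)) * math.log(n_docs / df[w]) for w in tf}

def emotion_score(doc):
    # Mean lexicon polarity of matched words (0 if none match).
    hits = [lexicon[w] for w in doc if w in lexicon]
    return sum(hits) / len(hits) if hits else 0.0

# One (sparse TF-IDF vector, emotion value) pair per post.
features = [(tfidf(d), emotion_score(d)) for d in docs]
```

In the paper these features are concatenated with Word2Vec embeddings and fed to the CNN-BiLSTM classifier; here the point is only that rare emotionally loaded words ("hopeless") receive higher TF-IDF weight than common ones, and the lexicon score separates negative from positive posts.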
23. Multi-focus image fusion using structure-guided flow.
- Author
-
Duan, Zhao, Luo, Xiaoliu, and Zhang, Taiping
- Subjects
- *
IMAGE fusion , *CAPSULE neural networks , *CONVOLUTIONAL neural networks , *DEEP learning , *SUPERVISED learning - Abstract
[Display omitted] • Introduce the capsule network for multi-focus image fusion. • Utilize structure information to help locate focus regions. • Design a structure-guided flow module to integrate structure features. Existing deep learning based methods have shown their advantages in the multi-focus image fusion task. However, most methods still suffer from inaccurate focus-region detection. In this paper, we employ the part-whole relationships embedded by the Capsule Network (CapsNet) to address the problem. Specifically, we introduce CapsNet into the multi-focus image fusion task and design a structure-guided flow module, which fully utilizes structure information to help locate focus regions. CapsNet is introduced to extract structure features by supervising gradient information of the image. Compared with traditional convolutional neural networks (CNNs), CapsNet takes into account the correlation of features from different positions, so it encodes more compact features. Once structure features are obtained, a flow alignment module learns the flow field between structure and image features and effectively propagates structure features to image features for confident focus-region detection. Experimental results show the proposed method achieves robust fusion performance on three publicly available multi-focus datasets, and outperforms or is comparable to the state-of-the-art methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
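For intuition about what "locating focus regions" means in the entry above, here is the classical per-pixel baseline the deep methods improve on: keep, for each pixel, the source image whose local neighborhood is sharper (higher variance). This is a hand-rolled heuristic for illustration, not the paper's CapsNet/structure-guided-flow method.

```python
# Per-pixel focus map via local variance (classical baseline heuristic).
def local_var(img, r, c, k=1):
    # Variance of the (2k+1)x(2k+1) neighborhood, clipped at the borders.
    vals = [img[i][j]
            for i in range(max(0, r - k), min(len(img), r + k + 1))
            for j in range(max(0, c - k), min(len(img[0]), c + k + 1))]
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def fuse(a, b):
    # Ties go to the first source; a real method would smooth the focus map.
    return [[a[r][c] if local_var(a, r, c) >= local_var(b, r, c) else b[r][c]
             for c in range(len(a[0]))] for r in range(len(a))]

# Synthetic pair: 'a' is sharp (textured) on the left and defocused (flat)
# on the right; 'b' is the opposite.
a = [[(255 if (r + c) % 2 else 0) if c < 3 else 128 for c in range(6)]
     for r in range(6)]
b = [[128 if c < 3 else (255 if (r + c) % 2 else 0) for c in range(6)]
     for r in range(6)]
fused = fuse(a, b)
```

The fused result picks the textured half from each source; the paper's contribution is precisely to replace this brittle local decision with learned part-whole structure features.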
24. Whither editing?: The correspondence of John Flamsteed, first Astronomer Royal, Eric G. Forbes, Lesley Murdin, & Frances Willmoth (Eds.), volume 2, 1682–1703; volume 3, 1703–1719; Institute of Physics Publishing, Bristol & Philadelphia, 1997, 2002, pp. xlvii+1095, lxvi+1038, Price £199 each hardback, ISBN 0-7503-0391-3, 0-7503-0763-3. The correspondence of John Wallis, volume 1 (1641–1659), Philip Beeley & Christoph J. Scriba (Eds.), with the assistance of Uwe Mayer and Siegmund Probst; Oxford University Press, Oxford, 2003, pp. xlvii+651, Price £120 hardback, ISBN 0-19-851066-7. The Hartlib Papers. Second edition. A complete text and image database of the papers of Samuel Hartlib (c.1600–1662), HROnline, Sheffield, 2 CDs, Price £1,500, ISBN 0-9542608-0-5. The Letters of Jan Jonston to Samuel Hartlib, W. J. Hitchens, Adam Matruszewski, & John Young (Eds.); Retro-Art, Warsaw, 2000, pp. 269, Price £25 paperback, ISBN 83-87992-12-7
- Author
-
Hunter, Michael
- Subjects
- *
EDITING , *ELECTRONIC publications , *LITERATURE ,EDITORIALS - Abstract
This is a review essay of printed editions of the correspondence of John Flamsteed, Jan Jonston and John Wallis, and of the CD-ROM edition of the Hartlib Papers. It raises various issues concerning the relationship between editions of correspondence and their archival base, and about the criteria used to decide what is appropriate to include as ‘correspondence’. It also addresses the rationale of the electronic edition of the Hartlib Papers, particularly the second edition, which extends its remit from the main archive at Sheffield to include a selection of ancillary material from other collections, questioning whether the effort involved would have been better employed in improving the basic resource. It then considers the use of electronic media in editing more generally, using the edition of Galileo’s notes on motion to illustrate the potential for intensive editorial intervention in a text, in contrast to the more extensive method used in the Hartlib edition, and thereby drawing attention to some of the attitudes that electronic editing is prone to induce. [Copyright © Elsevier]
- Published
- 2003
- Full Text
- View/download PDF
25. The JAL guide to the professional literature: Publishing.
- Author
-
Altman, Ellen and Pratt, Allan
- Subjects
- PAPER Persists (Book)
- Abstract
Reviews the book, `Paper Persists: Why Physical Library Collections Still Matter,' by Walt Crawford.
- Published
- 1998
- Full Text
- View/download PDF
26. The JAL Guide to the Professional Literature; Librarianship.
- Author
-
Gorman, Michael, Malinconico, S. Michael, Garriock, Jean, Holley, Edward G, and Johnson, Richard D.
- Subjects
- *
LIBRARY science - Abstract
Reviews several non-fiction books on librarianship. 'Collected Papers of Frederick G. Kilgour,' edited by Lois L. Yoakam; 'Pathways for Communication: Books and Libraries in the Information Age,' by D. J. Foskett; 'Leaders in American Academic Librarianship,' edited by Wayne A. Wiegand.
- Published
- 1985
27. A Researcher-oriented Automated Data Ingestion Tool for rapid data Processing, Visualization and Preservation.
- Author
-
Hacker, Thomas, Dyke, Shirley, Ozdagli, Ali Irmak, Marshall, Gemez, Thompson, Christopher, Rohler, Brian, and Yeum, Chul Min
- Subjects
- *
DASHBOARDS (Management information systems) , *DATA modeling , *CYBERINFRASTRUCTURE - Abstract
A select number of scientific communities have been quite successful in evolving the culture within their community to encourage publishing and to provide resources for re-using well-documented data. These data have great potential for analysis and knowledge generation beyond the purposes for which they were collected and intended. However, there are still barriers in this process. To explore this problem, we have developed a prototype tool: the Experiment Dashboard (ED), with the objective of demonstrating the ability and potential of enabling automated data ingestion from typical research laboratories. This innovative prototype was developed to create a novel system and artifact to explore the possibilities of allowing researchers in laboratories across the nation to link their data acquisition systems directly to structured data repositories for data and metadata ingestion. The prototype functions with commonly used data acquisition software at the data source and the HUBzero scientific gateway at the data sink. ED can be set up with minimal effort and expertise. In this paper, we describe the motivation and purposes for the prototype, the architecture we devised and functionality of this tool, and provide a demonstration of the tool for optical measurements in a structural engineering laboratory. The goal of this paper is to articulate and show through our prototype a vision for future cyberinfrastructure for empirical disciplines that rely on the rapid collection, analysis, and dissemination of valuable experimental data. We also discuss lessons learned that may be useful for others seeking to solve similar problems. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
28. ElecTra code: Full-band electronic transport properties of materials.
- Author
-
Graziosi, Patrizio, Li, Zhen, and Neophytou, Neophytos
- Subjects
- *
PHONON scattering , *BOLTZMANN'S equation , *ACOUSTIC phonons , *THERMAL conductivity , *CHARGE carrier mobility , *INELASTIC scattering , *FERMI level - Abstract
This paper introduces ElecTra, an open-source code which solves the linearized Boltzmann transport equation in the relaxation time approximation for charge carriers in a full-band electronic structure of arbitrary complexity, including their energy, momentum, and band-index dependence. ElecTra stands for 'ELECtronic TRAnsport' and computes the electronic and thermoelectric transport coefficients: electrical conductivity, Seebeck coefficient, electronic thermal conductivity, and mobility, for both unipolar and bipolar (small-bandgap) semiconductor materials. The code uses computed full bands and relevant scattering parameters as inputs and considers single-crystal materials in 3D and 2D. The present version of the code (v1) considers: i) elastic scattering with acoustic phonons and inelastic scattering with non-polar optical phonons in the deformation potential approximation, ii) inelastic scattering with polar phonons, iii) scattering with ionized dopants, and iv) alloy scattering. The user is given the option of intra- and inter-band scattering considerations. The simulation output also includes relevant relaxation times and mean free paths. The transport quantities are computed as a function of Fermi level position, doping density, and temperature. ElecTra can interface with any DFT code which saves the electronic structure in the '.bxsf' format. In this paper, ElecTra is validated against ideal electronic transport situations with known analytical solutions, existing codes employing the constant relaxation time approximation, as well as experimentally well-assessed materials such as Si, Ge, SiGe, and GaAs. 
Program title: ElecTra – Electronic Transport simulation lab CPC Library link to program files: https://doi.org/10.17632/ycgx2fjzb6.1 Licensing provisions: GPLv3 Programming Language: MATLAB® Nature of the problem: computing the electronic and thermoelectric charge transport coefficients of materials with arbitrarily complex full-band electronic structures, considering the carrier energy-, momentum-, and band-dependence of the scattering rates. Solution method: semiclassical linearized Boltzmann transport equation, with electronic structures (DFT or analytical) as input, formed into constant-energy surfaces, with scattering rates evaluated using Fermi's Golden Rule. Additional comments including restrictions and unusual features: • Programming interface: any DFT code which saves data in the '.bxsf' format. • RAM: for a case study of a half-Heusler band structure on a 51 × 51 × 51 k-mesh, 2 GB per processor is needed. • Running time: for the example above, depending on the number and complexity of the scattering mechanisms and the number of simulated Fermi levels and temperatures considered, the time needed varies from ∼1 hour on a desktop PC or laptop (light simulations) to 5–10 hours on an HPC with 30–45 cores (heavy simulations). Using the constant relaxation time and constant mean-free-path approximations on a desktop PC or laptop, the running time is of the order of minutes. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
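The "ideal electronic transport situations of known analytical solutions" that the ElecTra entry above validates against can be reproduced in a few lines: for a single parabolic band with a constant relaxation time, the transport integral must reduce exactly to sigma = n·q²·τ/m*. The sketch below checks that identity numerically; the effective mass, relaxation time, and Fermi level are assumed values, and this is pure Python, not ElecTra's MATLAB code.

```python
import math

# Physical constants and assumed band parameters (SI units).
q, hbar, kB, m0 = 1.602e-19, 1.0546e-34, 1.381e-23, 9.109e-31
mstar, tau, T, EF = 0.26 * m0, 1e-14, 300.0, 0.05 * q
kT = kB * T

def dos(E):        # 3D parabolic-band density of states [1/(J m^3)]
    return (2 * mstar / hbar ** 2) ** 1.5 * math.sqrt(E) / (2 * math.pi ** 2)

def f(E):          # Fermi-Dirac occupation
    return 1.0 / (math.exp((E - EF) / kT) + 1.0)

def mdfdE(E):      # -df/dE, written stably as f(1-f)/kT
    occ = f(E)
    return occ * (1.0 - occ) / kT

N, Emax = 4000, 1.0 * q            # integrate over 0..1 eV above the band edge
dE = Emax / N
Es = [i * dE for i in range(N + 1)]

def trapz(vals):
    return dE * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

n = trapz([dos(E) * f(E) for E in Es])                 # carrier density
# <v_x^2> = v^2/3 = 2E/(3 m*) for a parabolic band.
sigma = q * q * tau * trapz([(2 * E / (3 * mstar)) * dos(E) * mdfdE(E)
                             for E in Es])
ratio = sigma * mstar / (n * q * q * tau)              # should be ~1 exactly
mobility = sigma / (n * q)                             # ~ q tau / m*
```

Integrating the sigma integrand by parts shows the identity holds at any Fermi level, which is what makes this configuration a clean analytic benchmark for a full-band solver.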
29. OPSimTool: A custom tool for optical photon simulation in Geant4.
- Author
-
Kandemir, Mustafa
- Subjects
- *
PHOTONS , *PROGRAMMING languages , *C++ , *MAINTAINABILITY (Engineering) , *OPTICAL materials - Abstract
This paper introduces OPSimTool, a set of additions to the Geant4 toolkit that simplify the development of optical photon applications and enhance the flexibility and maintainability of developed applications. The tool also provides interfaces for creating reusable and portable material build code. This tool has been developed from users' needs and perspectives, considering the frequently asked questions, the most commonly encountered challenges, and evolving needs over time in the optical category of the Geant4 official forum page. Program title: OPSimTool CPC Library link to program files: https://doi.org/10.17632/6jtbxdnpm4.1 Developer's repository link: https://github.com/mkandemirr/OpSim Licensing provisions: GNU General Public License 3 Programming language: C++ External routines/libraries: Geant4, CMake Nature of problem: Although the toolkit provided by Geant4 is sufficient to perform optical photon simulations, additional tools are highly beneficial for developing more flexible and sustainable applications. Solution method: We have developed a set of C++ classes with Geant4 users' needs in mind and made them publicly available. • OPSimTool is a set of additions to the Geant4 toolkit that simplifies the development of optical photon applications. • OPSimTool increases the flexibility and maintainability of the developed code. • OPSimTool provides interfaces for creating reusable and portable material build code. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
30. ACFlow: An open source toolkit for analytic continuation of quantum Monte Carlo data.
- Author
-
Huang, Li
- Subjects
- *
MONTE Carlo method , *GREEN'S functions , *CONTINUATION methods , *STATISTICAL correlation , *FREDHOLM equations - Abstract
The purpose of analytic continuation is to establish a real-frequency spectral representation of single-particle or two-particle correlation functions (such as the Green's function, self-energy function, and spin and charge susceptibilities) from noisy data generated in finite-temperature quantum Monte Carlo simulations. It requires numerical solutions of a family of Fredholm integral equations of the first kind, which is a challenging task. In this paper, an open-source toolkit (dubbed ACFlow) for analytic continuation of quantum Monte Carlo data is presented. We first give a short introduction to the analytic continuation problem. Next, three popular analytic continuation algorithms implemented in this toolkit, including the maximum entropy method, the stochastic analytic continuation, and the stochastic optimization method, are reviewed. We then elaborate on the major features, implementation details, basic usage, inputs, and outputs of this toolkit. Finally, four representative examples, including analytic continuations of the Matsubara self-energy function, Matsubara and imaginary-time Green's functions, and the current-current correlation function, are shown to demonstrate the usefulness and flexibility of the ACFlow toolkit. Program Title: ACFlow CPC Library link to program files: https://doi.org/10.17632/th6w74gwjm.1 Developer's repository link: https://github.com/huangli712/ACFlow Licensing provisions: GNU General Public License Version 3 Programming language: Julia Nature of problem: Most quantum Monte Carlo methods work on the imaginary axis. In order to extract physical observables and compare them with experimental results, analytic continuation must be done in the post-processing stage to convert the quantum Monte Carlo data from the imaginary axis to the real axis. Solution method: Three well-established analytic continuation methods, including the maximum entropy method, the stochastic analytic continuation (both A. W. Sandvik's and K. S. D. 
Beach's algorithms), and the stochastic optimization method, have been implemented in the ACFlow toolkit. Additional comments including restrictions and unusual features: The ACFlow toolkit is written in pure Julia language. It is highly optimized and parallelized. It can be executed interactively in a Jupyter notebook environment. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
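The Fredholm problem the ACFlow entry above describes has a simple forward direction that is worth seeing explicitly: an imaginary-time Green's function is the integral of a spectral function A(ω) against the fermionic kernel K(τ, ω) = e^(−τω)/(1 + e^(−βω)). The sketch below synthesizes G(τ) from an assumed Gaussian A(ω); inverting this ill-conditioned map is what ACFlow's MaxEnt and stochastic methods do. The Julia toolkit is sketched here in Python, with an assumed β and spectrum.

```python
import math

beta = 10.0                                   # inverse temperature (assumed)

def kernel(tau, w):
    # Fermionic kernel, rewritten for w < 0 so neither exp() overflows.
    if w >= 0:
        return math.exp(-tau * w) / (1.0 + math.exp(-beta * w))
    return math.exp((beta - tau) * w) / (1.0 + math.exp(beta * w))

# Assumed spectral function: a Gaussian peak at w = 0.5, normalized so
# that dw * sum(A) == 1 (unit spectral weight).
dw = 0.01
wgrid = [-5.0 + dw * i for i in range(1001)]
A = [math.exp(-((w - 0.5) ** 2) / (2 * 0.3 ** 2)) for w in wgrid]
Z = dw * sum(A)
A = [a / Z for a in A]

def G(tau):
    # Discretized Fredholm integral G(tau) = int K(tau, w) A(w) dw.
    return dw * sum(kernel(tau, w) * a for w, a in zip(wgrid, A))

taus = [beta * i / 50 for i in range(51)]
Gvals = [G(t) for t in taus]
# Since K(0, w) + K(beta, w) == 1 for every w, G(0) + G(beta) equals the
# total spectral weight -- a handy consistency check on the kernel.
```

In practice the quantum Monte Carlo data supply noisy G(τ) values and the toolkit reconstructs A(ω); the smoothness of the kernel is exactly why that inverse problem needs regularized methods rather than direct inversion.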
31. Interface tool from Wannier90 to RESPACK: wan2respack.
- Author
-
Kurita, Kensuke, Misawa, Takahiro, Yoshimi, Kazuyoshi, Ido, Kota, and Koretsune, Takashi
- Subjects
- *
PROGRAMMING languages , *FORTRAN , *SUPERCONDUCTORS , *COPPER oxide - Abstract
We develop the interface tool wan2respack, which connects RESPACK (software that derives the low-energy effective Hamiltonians of solids) with Wannier90 (software that constructs Wannier functions). wan2respack converts the Wannier functions obtained by Wannier90 into those used in RESPACK, which is then used to derive the low-energy effective Hamiltonians of solids. In this paper, we explain the basic usage of wan2respack and show its application to standard compounds of correlated materials, namely, the correlated metal SrVO3 and the high-Tc superconductor La2CuO4. Furthermore, we compare the low-energy effective Hamiltonians of these compounds using Wannier functions obtained by Wannier90 and those obtained by RESPACK. We confirm that both types of Wannier functions give the same Hamiltonians. This benchmark comparison demonstrates that wan2respack correctly converts Wannier functions in the Wannier90 format into those in the RESPACK format. Program title: wan2respack CPC Library link to program files: https://doi.org/10.17632/6zfj2dkv5b.1 Licensing provisions: GNU General Public License version 3 Programming language: Fortran and python3 External routines/libraries: Quantum ESPRESSO (version 6.6), Wannier90 (version 3.0.0), RESPACK (version 20200113), tomli. Nature of problem: Using RESPACK, one can derive low-energy effective Hamiltonians of solids from maximally localized Wannier functions. However, due to the differences in the representation of Wannier functions, the Wannier functions obtained by Wannier90 cannot be directly used in RESPACK. Solution method: wan2respack converts the Wannier functions in the Wannier90 format into those in the RESPACK format. Using the converted Wannier functions, one can derive the low-energy effective Hamiltonians using RESPACK. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
32. JAX-FEM: A differentiable GPU-accelerated 3D finite element solver for automatic inverse design and mechanistic data science.
- Author
-
Xue, Tianju, Liao, Shuheng, Gan, Zhengtao, Park, Chanwook, Xie, Xiaoyu, Liu, Wing Kam, and Cao, Jian
- Subjects
- *
DATA science , *AUTOMATIC differentiation , *PYTHON programming language , *GRAPHICS processing units , *FINITE element method , *COMPUTATIONAL mechanics , *SOURCE code - Abstract
This paper introduces JAX-FEM, an open-source differentiable finite element method (FEM) library. Constructed on top of Google JAX, a rising machine learning library focusing on high-performance numerical computing, JAX-FEM is implemented with pure Python while scalable to efficiently solve problems with moderate to large sizes. For example, in a 3D tensile loading problem with 7.7 million degrees of freedom, JAX-FEM with GPU achieves around 10× acceleration compared to a commercial FEM code depending on platform. Beyond efficiently solving forward problems, JAX-FEM employs the automatic differentiation technique so that inverse problems are solved in a fully automatic manner without the need to manually derive sensitivities. Examples of 3D topology optimization of nonlinear materials are shown to achieve optimal compliance. Finally, JAX-FEM is an integrated platform for machine learning-aided computational mechanics. We show an example of data-driven multi-scale computations of a composite material where JAX-FEM provides an all-in-one solution from microscopic data generation and model training to macroscopic FE computations. The source code of the library and these examples are shared with the community to facilitate computational mechanics research. Program Title: JAX-FEM CPC Library link to program files: https://doi.org/10.17632/hgwshjbcw6.1 Developer's repository link: https://github.com/tianjuxue/jax-am/tree/main/jax%5fam/fem Licensing provisions: GPLv3 Programming language: Python Nature of problem: Implementation of the finite element method (FEM) with several appealing features that classic FEM software typically does not have: realized with pure Python; running on CPU/GPU; differentiable for solving PDE-constrained optimization problems; seamless integration with machine learning. 
Solution method: Our framework JAX-FEM is based on Google JAX, a high-performance numerical computing library with automatic differentiation features and supporting both CPU/GPU. Unlike classic FEM software written in Fortran or C/C++, JAX-FEM is implemented with pure Python and can easily be installed as a Python package. We demonstrate our software by solving problems including forward PDE prediction, inverse design/optimization, and data-driven analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
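The forward FEM solve that the JAX-FEM entry above scales to millions of 3D degrees of freedom has a one-dimensional skeleton that fits in a few lines: assemble the stiffness matrix from linear elements and solve the resulting tridiagonal system. The sketch below is a pure-Python analogue for −u'' = 1 on (0, 1) with homogeneous Dirichlet conditions, not library code; with this load the FEM solution is exact at the nodes, which makes a clean check.

```python
# 1-D Poisson FEM: -u'' = 1 on (0, 1), u(0) = u(1) = 0, linear elements
# on a uniform mesh of n elements (n is an assumed example size).
n = 64
h = 1.0 / n
m = n - 1                       # interior unknowns u_1 .. u_{n-1}
diag = [2.0 / h] * m            # stiffness tridiagonal: (1/h) * [-1, 2, -1]
off = [-1.0 / h] * (m - 1)
load = [h] * m                  # consistent load vector for f(x) = 1

# Thomas algorithm for the symmetric tridiagonal system.
cp = [0.0] * (m - 1)
dp = [0.0] * m
cp[0] = off[0] / diag[0]
dp[0] = load[0] / diag[0]
for i in range(1, m):
    denom = diag[i] - off[i - 1] * cp[i - 1]
    if i < m - 1:
        cp[i] = off[i] / denom
    dp[i] = (load[i] - off[i - 1] * dp[i - 1]) / denom
u = [0.0] * m
u[-1] = dp[-1]
for i in range(m - 2, -1, -1):
    u[i] = dp[i] - cp[i] * u[i + 1]

# Compare against the analytic solution u(x) = x(1 - x)/2.
nodes = [h * (i + 1) for i in range(m)]
err = max(abs(ui - x * (1 - x) / 2) for ui, x in zip(u, nodes))
```

What JAX-FEM adds on top of this skeleton is the 3D assembly, GPU execution, and, crucially, automatic differentiation through the whole solve, so that sensitivities for inverse design come for free instead of being derived by hand.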
33. QuOCS: The quantum optimal control suite.
- Author
-
Rossignolo, Marco, Reisser, Thomas, Marshall, Alastair, Rembold, Phila, Pagano, Alice, Vetter, Philipp J., Said, Ressa S., Müller, Matthias M., Motzoi, Felix, Calarco, Tommaso, Jelezko, Fedor, and Montangero, Simone
- Subjects
- *
QUANTUM computers , *OPTIMAL control theory , *PYTHON programming language , *QUANTUM information science , *AUTOMATIC differentiation , *QUANTUM computing , *HUMAN information processing , *FEATURE selection - Abstract
Quantum optimal control includes a family of pulse-shaping algorithms that aim to unlock the full potential of a variety of quantum technologies. The Quantum Optimal Control Suite (QuOCS) unites experiment-focused and model-based approaches in a unified framework. Its easy usage and installation, presented here, together with the availability of various combinable optimization strategies, are designed to improve the performance of many quantum technology platforms, such as color defects in diamond, superconducting qubits, and atom- or ion-based quantum computers. It can also be applied to the study of more general phenomena in physics. In this paper, we describe the software and its toolbox of gradient-free and gradient-based algorithms. We then show how the user can connect it to their experiment. In addition, we provide illustrative examples where our optimization suite solves typical quantum optimal control problems, in both open- and closed-loop settings. Integration into existing experimental control software is already provided for the experiment control software Qudi (Binder et al., 2017 [41]), and further extensions are investigated and highly encouraged. QuOCS is available from GitHub, under Apache License 2.0, and can be found on the PyPI repository. Program Title: QuOCS - Quantum Optimal Control Suite CPC Library link to program files: https://doi.org/10.17632/wjjch757fk.1 Developer's repository link: https://github.com/Quantum-OCS/QuOCS Licensing provisions: Apache-2.0 Programming language: Python External routines: NumPy [1], SciPy [1], JAX [2] Nature of problem: Quantum systems are typically controlled by time-dependent electromagnetic fields to perform a certain set of quantum operations. Those operations may in turn provide building blocks for various quantum information processing tasks such as quantum computation, communication, simulation, sensing, and metrology. Numerous control strategies exist to design and improve such operations [3]. 
While some strategies are constructed to target a rather specific problem with high efficiency, others are more general to solve a wide range of applications [4,5]. To access the different algorithms, one has to download different optimization suites with different input and output parameters, making them hard to compare and combine. To benefit from the variety of algorithms, we have devised a customizable and intuitive optimization suite that simultaneously provides access to some of the most popular quantum optimal control algorithms. Solution method: We combine, in a unified framework, some of the frequently used optimal control algorithms which are the dressed Chopped Random Basis method (dCRAB) [6], and Gradient Ascent Pulse Engineering (GRAPE) [7], with an extension to make use of Automatic Differentiation (AD) [8]. The software is able to connect to both models of quantum dynamics in simulations and real-time quantum experiments to perform open- and closed-loop optimization, respectively. With minimal knowledge of optimal control theory, the user can manage to run optimizations of quantum processes using a variety of additional features such as stopping criteria and drift compensation. Logging and data management of the optimization progress and results are also handled by the suite. Its modular structure allows for extensions that accommodate customized algorithms and can be implemented by the user straightforwardly. Additional comments including unusual features: The connection to the experiments is performed by an extension that enables a direct integration to a laboratory control software Qudi [9]. [1] T.E. Oliphant, Comput. Sci. Eng. 9 (2007) 10, http://www.scipy.org/. [2] J. Bradbury, et al., JAX: composable transformations of Python+NumPy programs, http://github.com/google/jax , 2018. [3] S. Glaser, U. Boscain, T. Calarco, et al., Eur. Phys. J. D 69 (2015) 279. [4] C.P. Koch, U. Boscain, T. Calarco, et al., EPJ Quantum Technol. 9 (2022) 19. 
[5] Schaefer, Ido, Ronnie Kosloff, Phys. Rev. A 101 (2) (2020). [6] N. Rach, M.M. Müller, T. Calarco, S. Montangero, Phys. Rev. A 92 (2015) 6. [7] N. Khaneja, et al., J. Magn. Reson. 92 (6) (2015) 296–305. [8] N. Leung, et al., Phys. Rev. A 95 (2017) 4. [9] J.M. Binder, et al., SoftwareX 6 (2017) 85–90. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
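The gradient-free, truncated-basis pulse shaping that the QuOCS entry above describes (the dCRAB family) can be illustrated on the simplest possible control task: shape Omega(t) for a two-level system with H = Omega(t)·sigma_x/2 so that |0⟩ is driven to |1⟩. The basis choice, grid, and coordinate search below are illustrative assumptions, not QuOCS's actual algorithm or API.

```python
import math

NT = 200                # time-grid resolution on t in [0, 1] (assumed)
dtau = 1.0 / NT

def pulse(coeffs, t):
    # Expansion of the control in a few sine basis functions.
    return sum(c * math.sin((k + 1) * math.pi * t) for k, c in enumerate(coeffs))

def infidelity(coeffs):
    # H(t) commutes with itself at all times here, so the |0> -> |1>
    # transfer probability depends only on the accumulated pulse area;
    # a perfect pi-pulse gives infidelity 0.
    area = sum(pulse(coeffs, (i + 0.5) * dtau) * dtau for i in range(NT))
    return 1.0 - math.sin(area / 2.0) ** 2

# Crude deterministic coordinate search over each basis coefficient,
# standing in for the suite's more sophisticated gradient-free updates.
grid = [round(-6.0 + 0.1 * j, 1) for j in range(121)]
coeffs = [0.0, 0.0, 0.0]
for _ in range(3):                # a few sweeps over the coefficients
    for k in range(len(coeffs)):
        coeffs[k] = min(grid,
                        key=lambda v: infidelity(coeffs[:k] + [v] + coeffs[k + 1:]))
final = infidelity(coeffs)
```

The closed-loop setting the paper emphasizes replaces the `infidelity` function with a measurement on the actual experiment, which is exactly why gradient-free searches over a small coefficient vector are so useful there.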
34. How to Write and Publish a Scientific Paper.
- Author
-
Horner, Michelle S.
- Subjects
- *
SCIENTIFIC communication , *NONFICTION - Abstract
The article reviews the book "How to Write and Publish a Scientific Paper" by Robert Day and Barbara Gastel.
- Published
- 2013
- Full Text
- View/download PDF
35. Reasonable design of Sm-modified Cu-based catalyst for NH3-SCO: Role of the amide intermediates.
- Author
-
Lv, Dengke, Liu, Jun, Zhang, Guojie, Wang, Ying, Ge, Shiwei, Zhao, Yuqiong, and Li, Guoqiang
- Subjects
- *
SELECTIVE catalytic oxidation , *CATALYSTS , *CATALYTIC activity , *TITANIUM dioxide , *WATER vapor , *ATMOSPHERIC ammonia - Abstract
The selective catalytic oxidation of ammonia (NH3-SCO) is currently the most effective method for eliminating NH3. However, one of the major challenges for NH3-SCO is the development of a catalyst capable of completely converting NH3 into harmless N2 and water vapor. In this paper, a highly N2-selective catalyst prepared by the sol-gel method, using TiO2 as the support, Cu as the active species, and Sm as the auxiliary agent, is presented. Compared to the traditional Cu/TiO2 catalyst, the 4SmCu/TiO2 catalyst has higher catalytic activity and N2 selectivity at low temperatures: NH3 conversion reaches 100%, and N2 selectivity is maintained at 100%, at 275 °C. The excellent catalytic activity is attributed to the highly dispersed active species, abundant Lewis acid sites (LASs), and the generation of large amounts of surface-adsorbed oxygen. In addition, doping with Sm species causes TiO2 lattice distortion, and the distorted TiO2 in turn hosts more active material. Moreover, in-situ DRIFTS analysis suggests that both the 4SmCu/TiO2 and Cu/TiO2 catalysts follow the "internal" selective catalytic reduction (iSCR) mechanism during NH3-SCO reactions. The 4SmCu/TiO2 catalyst generates more amides (-NH2), which reduces the non-selective oxidation of the catalysts and promotes the formation of N2. This provides a new idea and method for the selective catalytic oxidation of NH3 to N2 and water vapor using Cu-based catalysts. [Display omitted] • The 4SmCu/TiO2 catalyst converts all NH3 into N2 and water vapor at 275 °C. • The lattice distortion of TiO2 can support more active species. • Doping with Sm species increases the content of Lewis acid sites and surface-adsorbed oxygen. • The rich amide intermediates reduce the non-selective oxidation of NH3 and O2. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. ChampKit: A framework for rapid evaluation of deep neural networks for patch-based histopathology classification.
- Author
-
Kaczmarzyk, Jakub R., Gupta, Rajarsi, Kurc, Tahsin M., Abousamra, Shahira, Saltz, Joel H., and Koo, Peter K.
- Subjects
- *
DEEP learning , *ARTIFICIAL neural networks , *TRANSFORMER models , *COMPUTER vision , *HISTOPATHOLOGY , *CHOICE (Psychology) - Abstract
• We present ChampKit, a Python-based software package that enables rapid exploration and evaluation of deep learning models for patch-level classification of histopathology data. It is open source and available at https://github.com/SBU-BMI/champkit. ChampKit is designed to be highly reproducible and enables systematic, unbiased evaluation of patch-level histopathology classification. It incorporates public datasets for six clinically important tasks and access to hundreds of (pre-trained) deep learning models. It can easily be extended to custom patch classification datasets and custom deep learning architectures. • The intended users are (1) biomedical research groups interested in finding and fine-tuning the best models to analyze a broad collection of whole slide images, and (2) deep learning methods research groups interested in systematically and quickly evaluating their methods against a set of state-of-the-art methods with different pretraining and transfer learning configurations. • We demonstrate the utility of ChampKit by evaluating two ResNet models and one vision transformer on the six diverse classification tasks for patch-level histopathology datasets. We did not find consistent benefits from pretrained models versus random initialization across the different datasets, which suggests that a thorough exploration of model architectures is important to identify optimal models for a given dataset. Histopathology is the gold standard for diagnosis of many cancers. Recent advances in computer vision, specifically deep learning, have facilitated the analysis of histopathology images for many tasks, including the detection of immune cells and microsatellite instability. However, it remains difficult to identify optimal models and training configurations for different histopathology classification tasks due to the abundance of available architectures and the lack of systematic evaluations. 
Our objective in this work is to present a software tool that addresses this need and enables robust, systematic evaluation of neural network models for patch classification in histology in a lightweight, easy-to-use package for both algorithm developers and biomedical researchers. Here we present ChampKit (Comprehensive Histopathology Assessment of Model Predictions toolKit): an extensible, fully reproducible evaluation toolkit that is a one-stop-shop to train and evaluate deep neural networks for patch classification. ChampKit curates a broad range of public datasets. It enables training and evaluation of models supported by timm directly from the command line, without the need for users to write any code. External models are enabled through a straightforward API and minimal coding. As a result, ChampKit facilitates the evaluation of existing and new models and deep learning architectures on pathology datasets, making it more accessible to the broader scientific community. To demonstrate the utility of ChampKit, we establish baseline performance for a subset of possible models that could be employed with ChampKit, focusing on several popular deep learning models, namely ResNet18, ResNet50, and R26-ViT, a hybrid vision transformer. In addition, we compare each model trained either from random weight initialization or with transfer learning from ImageNet pretrained models. For ResNet18, we also consider transfer learning from a self-supervised pretrained model. The main result of this paper is the ChampKit software. Using ChampKit, we were able to systematically evaluate multiple neural networks across six datasets. We observed mixed results when evaluating the benefits of pretraining versus random initialization, with no clear benefit except in the low-data regime, where transfer learning was found to be beneficial. 
Surprisingly, we found that transfer learning from self-supervised weights rarely improved performance, counter to findings in other areas of computer vision. Choosing the right model for a given digital pathology dataset is nontrivial. ChampKit provides a valuable tool to fill this gap by enabling the evaluation of hundreds of existing (or user-defined) deep learning models across a variety of pathology tasks. Source code and data for the tool are freely accessible at https://github.com/SBU-BMI/champkit. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
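The systematic sweep this abstract describes (architectures × weight initializations × patch-classification tasks) can be outlined as below. This is a minimal illustrative sketch, not ChampKit's actual CLI or API: the grid values, task names, and `run_experiment` helper are all hypothetical placeholders.

```python
import itertools

# Hypothetical sketch of a systematic evaluation grid of the kind ChampKit
# automates: every (architecture, initialization) pair is evaluated on each
# patch-classification task. All names here are illustrative, not ChampKit's.
architectures = ["resnet18", "resnet50", "r26_vit"]
initializations = ["random", "imagenet", "self_supervised"]
tasks = [f"task_{i}" for i in range(1, 7)]  # six patch-level tasks

def run_experiment(arch, init, task):
    # Placeholder for "train, then evaluate on the task's test split";
    # a real run would fine-tune a timm model on the task's patch dataset.
    return {"arch": arch, "init": init, "task": task, "auroc": None}

results = [run_experiment(a, i, t)
           for a, i, t in itertools.product(architectures, initializations, tasks)]
print(len(results))  # → 54 (3 architectures x 3 initializations x 6 tasks)
```

The point of such a harness is that every configuration is run under identical conditions, so differences in task performance can be attributed to the model and initialization rather than to the training setup.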
37. AmbieGen: A search-based framework for autonomous systems testing.
- Author
-
Humeniuk, Dmytro, Khomh, Foutse, and Antoniol, Giuliano
- Subjects
- *
DRIVERLESS cars , *TEST systems , *AUTONOMOUS robots , *MOBILE robots , *EVOLUTIONARY algorithms , *AUTONOMOUS vehicles , *SEARCH algorithms - Abstract
Thorough testing of safety-critical autonomous systems, such as self-driving cars, autonomous robots, and drones, is essential for detecting potential failures before deployment. One crucial testing stage is model-in-the-loop testing, where the system model is evaluated by executing various scenarios in a simulator. However, the search space of possible parameters defining these test scenarios is vast, and simulating all combinations is computationally infeasible. To address this challenge, we introduce AmbieGen, a search-based test case generation framework for autonomous systems. AmbieGen uses evolutionary search to identify the most critical scenarios for a given system, and has a modular architecture that allows for the addition of new systems under test, algorithms, and search operators. Currently, AmbieGen supports test case generation for autonomous robots and autonomous car lane keeping assist systems. In this paper, we provide a high-level overview of the framework's architecture and demonstrate its practical use cases. • AmbieGen is an evolutionary algorithm based test scenario generation tool. • The search algorithm maximizes the difficulty of test scenarios as well as their diversity. • The tool is customizable and can be used to test different robotic systems. • Current tool version includes test scenario generation for autonomous vehicles and mobile robots. • The tool can be accessed at: https://github.com/swat-lab-optimization/AmbieGen-tool. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
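The evolutionary search the AmbieGen abstract describes can be sketched with a toy genetic loop. This is not AmbieGen's implementation: the scenario encoding (a list of road-curvature values) and the difficulty proxy (total curvature change) are assumptions chosen only to make the search concrete.

```python
import random

random.seed(0)

# Toy sketch (not AmbieGen itself) of evolutionary test-scenario search:
# a scenario is a list of road-curvature values, and "fitness" rewards harder
# scenarios, here proxied by total curvature change along the road.
def fitness(scenario):
    return sum(abs(b - a) for a, b in zip(scenario, scenario[1:]))

def mutate(scenario, rate=0.3):
    # Perturb each gene with probability `rate`.
    return [g + random.uniform(-0.5, 0.5) if random.random() < rate else g
            for g in scenario]

def evolve(pop_size=20, genes=8, generations=30):
    pop = [[random.uniform(-1, 1) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # elitist selection
        pop = survivors + [mutate(s) for s in survivors]
    return max(pop, key=fitness)

best = evolve()
print(f"most difficult scenario has curvature-change score {fitness(best):.2f}")
```

A real framework like AmbieGen additionally rewards diversity between generated scenarios and plugs in simulator feedback; the skeleton above only shows the select-mutate loop.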
38. FLUST: A fast, open source framework for ultrasound blood flow simulations.
- Author
-
Ekroll, Ingvild Kinn, Saris, Anne E.C.M., and Avdal, Jørgen
- Subjects
- *
FLOW simulations , *SIGNAL integrity (Electronics) , *ULTRASONIC imaging , *DOPPLER ultrasonography , *BLOOD flow , *DIGITAL image correlation , *TRANSDUCERS , *IMAGING phantoms - Abstract
• We introduce the open-source simulator FLUST, as part of the UltraSound ToolBox (USTB). • FLUST produces multiple realizations of ultrasound signals from flow fields. • High-integrity signals are achieved at low computational cost. • Framework includes tools for visualization and assessment of estimator performance. • Database includes customizable acquisition setups, flow phantoms and estimators. Background and objective: Ultrasound-based blood velocity estimation is a continuously developing frontier, where the vast number of possible acquisition setups and velocity estimators makes it challenging to assess which combination is best suited for a given imaging application. FLUST, the Flow-Line based Ultrasound Simulation Tool, may be used to address this challenge, providing a common platform for evaluation of velocity estimation schemes on in silico data. However, the FLUST approach had some limitations in its original form, including reduced robustness for phase-sensitive setups and the need for manual selection of integrity parameters. In addition, implementation of the technique, and therefore also documentation of signal integrity, was left to potential users of the approach. Methods: In this work, several improvements to the FLUST technique are proposed and investigated, and a robust, open-source simulation framework is developed. The software supports several transducer types and acquisition setups, in addition to a range of different flow phantoms. The main goal of this work is to offer a robust, computationally cheap and user-friendly framework to simulate ultrasound data from stationary blood velocity fields and thereby facilitate design and evaluation of estimation schemes, including acquisition design, velocity estimation and other post-processing steps. 
Results: The technical improvements proposed in this work resulted in reduced interpolation errors, reduced variability in signal power, and also automatic selection of spatial and temporal discretization parameters. Results are presented illustrating the challenges and the effectiveness of the solutions. The integrity of the improved simulation framework is validated in an extensive study, with results indicating that speckle statistics, spatial and temporal correlation and frequency content all correspond well with theoretical predictions. Finally, an illustrative example shows how FLUST may be used throughout the design and optimization process of a velocity estimator. Conclusions: The FLUST framework is available as a part of the UltraSound ToolBox (USTB), and the results in this paper demonstrate that it can be used as an efficient and reliable tool for the development and validation of ultrasound-based velocity estimation schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
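The kind of slow-time Doppler signal FLUST synthesizes, and the velocity estimation it is meant to support, can be illustrated with a deliberately simplified model. This is a toy sketch, not FLUST's algorithm: it models a single scatterer moving axially, so the inter-pulse phase shift directly encodes velocity, and recovers it with the classic autocorrelation (Kasai) estimator. All parameter values are arbitrary examples.

```python
import math

# Toy sketch (not FLUST itself): slow-time samples from one axially moving
# scatterer in pulsed-wave Doppler, followed by autocorrelation-based
# velocity estimation. Parameter values are illustrative assumptions.
f0 = 5e6          # transmit frequency [Hz]
c = 1540.0        # speed of sound in tissue [m/s]
prf = 5e3         # pulse repetition frequency [Hz]
v_axial = 0.2     # true axial scatterer velocity [m/s]

def slow_time_signal(n_pulses=64):
    # Phase advances by 2*pi * (2*v/c) * f0 / prf between successive pulses.
    dphi = 2 * math.pi * (2 * v_axial / c) * f0 / prf
    return [complex(math.cos(k * dphi), math.sin(k * dphi)) for k in range(n_pulses)]

sig = slow_time_signal()

# Lag-one autocorrelation; its phase yields the mean Doppler shift.
acf = sum(a.conjugate() * b for a, b in zip(sig, sig[1:]))
v_est = math.atan2(acf.imag, acf.real) * c * prf / (4 * math.pi * f0)
print(f"estimated axial velocity: {v_est:.3f} m/s")  # recovers ~0.2 m/s
```

FLUST's contribution is producing many statistically correct realizations of such signals from full flow fields (with speckle statistics and spatial/temporal correlation) cheaply; this sketch only shows the estimation step such data feeds into.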
39. Nano-paper that can combat viruses.
- Author
-
Donaldson, Laurie
- Published
- 2014
- Full Text
- View/download PDF
40. Double Fold: Libraries and the Assault on Paper (Book).
- Author
-
McKinzie, Steve
- Subjects
- *
LIBRARIES , *NONFICTION - Abstract
Reviews the book 'Double Fold: Libraries and the Assault on Paper,' by Nicholson Baker.
- Published
- 2002
- Full Text
- View/download PDF
41. The enigma of the aerofoil: Rival theories in aerodynamics, 1909–1930. David Bloor; University of Chicago Press, Chicago, 2011, pp. 608, Price $35.00 paper, ISBN: 978-0-226-06095-8.
- Author
-
Wisnioski, Matthew
- Published
- 2013
- Full Text
- View/download PDF
42. FabSim3: An automation toolkit for verified simulations using high performance computing.
- Author
-
Groen, Derek, Arabnejad, Hamid, Suleimenova, Diana, Edeling, Wouter, Raffin, Erwan, Xue, Yani, Bronik, Kevin, Monnier, Nicolas, and Coveney, Peter V.
- Subjects
- *
HIGH performance computing , *PYTHON programming language , *AUTOMATION , *ERROR probability , *HUMAN error , *BUDGET - Abstract
A common feature of computational modelling and simulation research is the need to perform many tasks in complex sequences to achieve a usable result. This will typically involve tasks such as preparing input data, pre-processing, running simulations on a local or remote machine, post-processing, and performing coupling communications, validations and/or optimisations. Tasks like these can involve manual steps that are time- and effort-intensive, especially when it involves the management of large ensemble runs. Additionally, human errors become more likely and numerous as the research work becomes more complex, increasing the risk of damaging the credibility of simulation results. Automation tools can help ensure the credibility of simulation results by reducing the manual time and effort required to perform these research tasks, by making more rigorous procedures tractable, and by reducing the probability of human error due to a reduced number of manual actions. In addition, efficiency gained through automation can help researchers to perform more research within the budget and effort constraints imposed by their projects. This paper presents the main software release of FabSim3, and explains how our automation toolkit can improve and simplify a range of tasks for researchers and application developers. FabSim3 helps to prepare, submit, execute, retrieve, and analyze simulation workflows. By providing a suitable level of abstraction, FabSim3 reduces the complexity of setting up and managing a large-scale simulation scenario, while still providing transparent access to the underlying layers for effective debugging. The tool also facilitates job submission and management (including staging and curation of files and environments) for a range of different supercomputing environments. Although FabSim3 itself is application-agnostic, it supports a provably extensible plugin system where users automate simulation and analysis workflows for their own application domains. 
To highlight this, we briefly describe a selection of these plugins and we demonstrate the efficiency of the toolkit in handling large ensemble workflows. Program Title: FabSim3 CPC Library link to program files: https://doi.org/10.17632/6nfrwy7ptj.1 Licensing provisions: BSD 3-clause Programming language: Python 3 Nature of problem: Many aspects are crucial for obtaining reproducible and robust simulation results. For instance, we need to curate all the inputs and outputs for later scrutiny, scrutinize the model behaviour under slightly perturbed circumstances, quantify the propagation of key uncertainties from input data and known parameters and analyze the sensitivity for any parameters for which the exact specification eludes us. Solution method: FabSim3 uses a range of methods to provide automation. These primarily include: (i) SSH + Fabric2 to enable remote execution of SSH commands, (ii) an internal parameter state space using primarily Python dict objects that can be customized with machine-, plugin-, and user-specific modifications, (iii) Python templating to quickly enable the insertion of state space variables into supercomputing scripts, (iv) multiprocessing and/or QCG-PilotJob to enable efficient submission and execution of job arrays and (v) a system of flexibly installable and modifiable Python3 plugins which allows users to create and customize application-specific functionalities without modifying the core code base. In addition to the written code, FabSim3 also relies on a set of user conventions to maintain a separation of concerns (particularly between machine-, user- and application-specific settings). Additional comments including restrictions and unusual features: This paper serves as the definitive reference for FabSim3. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
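The templating step the FabSim3 abstract mentions, inserting state-space variables into supercomputing scripts, can be sketched with the standard library alone. This is a hypothetical illustration of the idea, not FabSim3's actual template format or variable names; the merged machine/plugin/user settings dict and the SLURM fields below are assumptions.

```python
from string import Template

# Hypothetical sketch (not FabSim3's real templates) of inserting parameter
# state-space variables into a job submission script via Python templating.
job_template = Template("""#!/bin/bash
#SBATCH --nodes=$nodes
#SBATCH --time=$walltime
srun $executable $config
""")

# Merged machine-, plugin-, and user-specific settings (illustrative values).
state = {
    "nodes": 4,
    "walltime": "02:00:00",
    "executable": "flee_simulation",
    "config": "run_01.yml",
}

script = job_template.substitute(state)
print(script)
```

Keeping machine settings (node counts, wall times) separate from application settings (executable, config) and merging them only at render time is what lets the same workflow definition target different supercomputers.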
43. The JAL Guide to the professional literature.
- Author
-
Frost, Gary and Cooper, Jeff
- Subjects
- EARLY Bindings in Paper (Book)
- Abstract
Reviews the book `Early Bindings in Paper: A Brief History of European Hand-Made Paper-Covered Books with a Multilingual Glossary,' by Michele V. Cloonan.
- Published
- 1993
44. The JAL guide to the professional literature: Books on books.
- Author
-
Jenkins, Fred W. and Kaser, David
- Subjects
- EARLY Bindings in Paper (Book), SCRIBES, Script & Books (Book)
- Abstract
Reviews the books `Early Bindings in Paper: A Brief History of European Hand-Made Paper-Covered Books with a Multilingual Glossary,' by Michele V. Cloonan and `Scribes, Script, and Books: The Book Arts from Antiquity to the Renaissance,' by Leila Avrin.
- Published
- 1992
45. The Structured Oral Examination in Clinical Anaesthesia: Practice Examination Papers.
- Author
-
Barker, I.
- Subjects
- *
ANESTHESIA , *NONFICTION - Abstract
The article reviews the book "The Structured Oral Examination in Clinical Anaesthesia: Practice Examination Papers," edited by C. Mendonca, C. Hillermann, J. James and G. S. A. Kumar.
- Published
- 2009
- Full Text
- View/download PDF
46. Physics meets philosophy at the planck scale: contemporary theories in quantum gravity: Craig Callender and Nick Huggett (Eds.); Cambridge University Press, Cambridge, 2001, x + 365 pp., prices US $100.00 (cloth), US $36.00 (paper), ISBN 052166280X (cloth), 0521664454 (paper)
- Author
-
Maudlin, T.
- Published
- 2004
- Full Text
- View/download PDF
47. Life Is With Others—Selected Papers on Child Psychiatry.
- Author
-
Blair, Jennifer
- Subjects
- *
CHILD psychology , *NONFICTION - Abstract
The article reviews the book "Life Is With Others: Selected Papers on Child Psychiatry by Donald J. Cohen," edited by Andrés Martin and Robert A. King.
- Published
- 2006
- Full Text
- View/download PDF
48. Redesign of Catalogs and Indexes for Improved Online Subject Access (Book).
- Author
-
Mandel, Carol A.
- Subjects
- *
ONLINE library catalogs , *NONFICTION - Abstract
Reviews the non-fiction book 'Redesign of Catalogs and Indexes for Improved Online Subject Access: Selected Papers of Pauline A. Cochrane,' edited by Pauline Cochrane.
- Published
- 1986
49. libEMM: A fictious wave domain 3D CSEM modelling library bridging sequential and parallel GPU implementation.
- Author
-
Yang, Pengliang
- Subjects
- *
HIGH performance computing , *FINITE difference time domain method , *PROGRAMMING languages , *GRAPHICS processing units , *FORTRAN , *ELECTROMAGNETISM - Abstract
This paper delivers a software package - libEMM - for 3D controlled-source electromagnetics (CSEM) modelling in the fictitious wave domain, based on a newly developed high-order finite-difference time-domain (FDTD) method on a non-uniform grid. The numerical simulation can be carried out over a number of parallel processors using an MPI-based high-performance computing architecture. The FDTD kernel, coded in C, has been parallelized with OpenMP for speedup using local shared memory. In addition, the software features a GPU implementation of the same algorithm in the CUDA programming language, which can be cross-validated against the CPU version and compared in terms of efficiency. A perspective for libEMM on the horizon is its application to 3D CSEM inversion in land and marine environments. Program Title: libEMM CPC Library link to program files: https://doi.org/10.17632/p769t7c5bk.1 Developer's repository link: https://github.com/yangpl/libEMM Licensing provisions: GNU General Public License v3.0 Programming language: C, CUDA, Fortran, Shell External dependencies: MPI [1], FFTW [2], CUDA [3] Nature of problem: Controlled-source electromagnetics (CSEM) Solution method: High-order finite-difference time-domain (FDTD) on a non-uniform grid via fictitious wave-domain transformation [1] https://www.mpich.org/ [2] http://fftw.org/ [3] https://developer.nvidia.com/cuda-toolkit [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
50. QMaxUSE: A new tool for verifying UML class diagrams and OCL invariants.
- Author
-
Wu, Hao
- Subjects
- *
ENGINEERING - Abstract
Formal verification of a UML class diagram annotated with OCL constraints has been a long-standing challenge in Model-Driven Engineering. In the past decades, many tools and techniques have been proposed to tackle this challenge. However, they do not scale well and are often unable to locate the conflicts when the number of OCL constraints increases significantly. In this paper, we present a new tool called QMaxUSE, designed for verifying UML class diagrams annotated with a large number of OCL invariants. QMaxUSE is easy to install and deploy. It offers two distinct features: (1) a simple query language that allows users to choose the parts of a UML class diagram to be verified, and (2) a new procedure capable of performing concurrent verification. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF