94 results
Search Results
2. Advancing the field of language assessment: Papers from TIRF doctoral dissertation grantees.
- Author
- Zhong, Yu and Gu, Xiangdong
- Subjects
- *LANGUAGE & languages, *NONFICTION
- Published
- 2018
- Full Text
- View/download PDF
3. Sound Rising from the Paper: Nineteenth-Century Martial Arts Fiction and the Chinese Acoustic Imagination.
- Author
- Mason, Paul H.
- Subjects
- *MARTIAL arts, *NONFICTION
- Published
- 2018
- Full Text
- View/download PDF
4. Changing Behaviours: On the Rise of the Psychological State. Rhys Jones, Jessica Pykett, Mark Whitehead. Edward Elgar, Cheltenham, U.K. (2013). xiii + 216 pp., £25.00 (U.K.), ISBN: 9780857936882 (paper)
- Author
- Lewis, Alan
- Published
- 2014
- Full Text
- View/download PDF
5. How much is saving humanity and/or our planet worth? Michael Jones, Accounting for Biodiversity. Earthscan/Routledge, London (2014). 304 pages, xvii-page introduction. US$160 (hard cover), $59.95 (soft cover). ISBN 978-0-415-63063-4; also in paper
- Author
- Lempert, David
- Published
- 2015
- Full Text
- View/download PDF
6. Editing Research: The Author Editing Approach to Providing Effective Support to Writers of Research Papers, Valerie Matarese. Information Today. Medford, New Jersey (2016). 244 pp., Softbound: $49.50, Web Order Price: $44.55.
- Author
- Burgess, Sally
- Subjects
- *MANUSCRIPT preparation (Authorship), *REPORT writing, *NONFICTION
- Published
- 2017
- Full Text
- View/download PDF
7. Writing and publishing science research papers in English: A global perspective.
- Author
- Mu, Congjun
- Subjects
- *SCIENCE publishing, *SCIENTIFIC terminology, *NONFICTION, *HIGHER education
- Published
- 2015
- Full Text
- View/download PDF
8. Valerie Matarese, Editing Research: The Author Editing Approach to Providing Effective Support to Writers of Research Papers. Information Today, Medford, NJ (2016). 244 pp., ISBN: 978-157387-531-8, US $49.50.
- Author
- Conrad, Nina
- Subjects
- *ACADEMIC discourse, *COPY editing, *NONFICTION
- Published
- 2017
- Full Text
- View/download PDF
9. R.L. Hampel, Paul Diederich and the Progressive American High School (A Volume in Readings in Educational Thought). Information Age Publishing, Charlotte, NC (2014). ISBN: hardcover 978-1-62396-578-5, paper 978-1-62396-577-8, eBook 978-1-62396-579-2.
- Author
- White, Edward M.
- Subjects
- *HIGH schools, *NONFICTION
- Published
- 2015
- Full Text
- View/download PDF
10. lifex-cfd: An open-source computational fluid dynamics solver for cardiovascular applications.
- Author
- Africa, Pasquale Claudio, Fumagalli, Ivan, Bucelli, Michele, Zingaro, Alberto, Fedele, Marco, Dede', Luca, and Quarteroni, Alfio
- Subjects
- *COMPUTATIONAL fluid dynamics, *NAVIER-Stokes equations, *HEART valves, *PROGRAMMING languages, *SOURCE code
- Abstract
Computational fluid dynamics (CFD) is an important tool for the simulation of cardiovascular function and dysfunction. Due to the complexity of the anatomy, the transitional regime of blood flow in the heart, and the strong mutual influence between the flow and the physical processes involved in heart function, the development of accurate and efficient CFD solvers for cardiovascular flows remains a challenging task. In this paper we present lifex-cfd, an open-source CFD solver for cardiovascular simulations based on the lifex finite element library, written in modern C++ and exploiting distributed memory parallelism. We model blood flow in both physiological and pathological conditions via the incompressible Navier-Stokes equations, accounting for moving cardiac valves, moving domains, and transition-to-turbulence regimes. We provide an overview of the underlying mathematical formulation, numerical discretization, implementation details, and examples of how to use lifex-cfd. We verify the code through rigorous convergence analyses, and we show its almost ideal parallel speedup. We demonstrate the accuracy and reliability of the numerical methods through a series of idealized and patient-specific vascular and cardiac simulations, in different physiological flow regimes. The lifex-cfd source code is available under the LGPLv3 license, to ensure its accessibility and transparency to the scientific community, and to facilitate collaboration and further developments. Program Title: lifex-cfd CPC Library link to program files: https://doi.org/10.17632/hzsnc3jgds.1 Developer's repository link: https://gitlab.com/lifex/lifex-cfd Licensing provisions: LGPLv3 Programming language: C++ (standard ≥17) Supplementary material: https://doi.org/10.5281/zenodo.7852088 contains the application executable in binary form, compatible with any recent enough x86-64 Linux system, assuming that glibc version ≥ 2.28 is installed. Data and parameter files necessary to replicate the test cases described in this manuscript are also available. Nature of problem: The program runs computational fluid dynamics simulations of cardiovascular blood flows in physiological and pathological conditions, modeled through the incompressible Navier-Stokes equations, including moving cardiac valves, moving domains (such as contracting cardiac chambers) in the arbitrary Lagrangian-Eulerian framework, and transition-to-turbulence flow. Given the scale of the typical applications, the program is designed for parallel execution. Solution method: The equations are discretized using the Finite Element method, on either tetrahedral or hexahedral meshes. The software builds on top of deal.II, implementing the mathematical models and numerical methods specific to cardiovascular CFD simulations. Parallel execution exploits the MPI paradigm. The software supports both Trilinos and PETSc as linear algebra backends. Additional comments including restrictions and unusual features: The program provides a general-purpose executable that can be used to run CFD simulations without having to access or modify the source code. Simulations are set up through a user-friendly yet flexible interface, by means of readable and self-documenting parameter files. More advanced users can modify the source code to implement more sophisticated test cases. lifex-cfd supports checkpointing, i.e. simulations can be stopped and restarted at a later time. [ABSTRACT FROM AUTHOR]
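For orientation, the incompressible Navier-Stokes equations referenced in this abstract take the standard form below; the notation here is generic textbook notation, not taken from the lifex-cfd paper itself:

```latex
\rho \left( \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\,\mathbf{u} \right) - \mu \, \Delta \mathbf{u} + \nabla p = \mathbf{f}, \qquad \nabla \cdot \mathbf{u} = 0,
```

where u is the velocity, p the pressure, ρ the density and μ the dynamic viscosity. In the arbitrary Lagrangian-Eulerian framework used for moving domains, the convective velocity u in the transport term is replaced by the flow velocity relative to the mesh motion, u − u_ALE.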
- Published
- 2024
- Full Text
- View/download PDF
11. Ç.H. Şekercioğlu, D.G. Wenny, C.J. Whelan, Why Birds Matter: Ecological Function and Ecosystem Services. The University of Chicago Press (2016). x + 387 pp. Paper $45.00, ISBN: 9780226382630; cloth $135.00, ISBN: 9780226382463; e-book $45.00, ISBN: 9780226382777
- Author
-
Holden, Madronna
- Subjects
- *
ECOSYSTEM services , *NONFICTION - Published
- 2017
- Full Text
- View/download PDF
12. ESA: An efficient sequence alignment algorithm for biological database search on Sunway TaihuLight.
- Author
- Zhang, Hao, Huang, Zhiyi, Chen, Yawen, Liang, Jianguo, and Gao, Xiran
- Subjects
- *SEQUENCE alignment, *BIOLOGICAL databases, *SUPERCOMPUTERS, *DATABASE searching, *DATA structures, *AMINO acid sequence
- Abstract
In computational biology, biological database search plays a very important role. Since the COVID-19 outbreak, it has provided significant help in identifying common characteristics of viruses and developing vaccines and drugs. Sequence alignment, a method for finding similarity, homology and other information between gene/protein sequences, is the usual tool in database search. With the explosive growth of biological databases, the search process has become extremely time-consuming. However, existing parallel sequence alignment algorithms cannot deliver efficient database search due to low utilization of resources such as cache memory and to performance issues such as load imbalance and high communication overhead. In this paper, we propose an efficient sequence alignment algorithm on Sunway TaihuLight, called ESA, for biological database search. ESA adopts a novel hybrid alignment algorithm combining local and global alignments, which has higher accuracy than other sequence alignment algorithms. Further, ESA has several optimizations, including cache-aware sequence alignment, capacity-aware load balancing and bandwidth-aware data transfer. They are implemented on the heterogeneous processor SW26010, adopted in the world's 6th fastest supercomputer, Sunway TaihuLight. The implementation of ESA is evaluated with the Swiss-Prot database on Sunway TaihuLight and other platforms. Our experimental results show that ESA achieves a speedup of 34.5 on a single core group (with 65 cores) of Sunway TaihuLight. The strong and weak scalability of ESA is tested with 1 to 1024 core groups of Sunway TaihuLight. The results show that ESA has linear weak scalability and very impressive strong scalability. For strong scalability, ESA achieves a speedup of 338.04 with 1024 core groups compared with a single core group. We also show that our proposed optimizations are applicable to GPUs, Intel multicore processors, and heterogeneous computing platforms.
• In this paper, we propose and implement an efficient sequence alignment algorithm, ESA, for biological database search on SW26010 heterogeneous processors. This algorithm adopts both local and global alignments for biological database search with several optimizations. ESA achieves high computational performance without sacrificing accuracy. To the best of our knowledge, this is the first attempt to parallelize hybrid sequence alignment on Sunway TaihuLight using multi-level optimizations. • We propose three optimization strategies in ESA: cache-aware sequence alignment, capacity-aware load balancing and bandwidth-aware data transfer. Cache-aware sequence alignment effectively reduces the size of the data structure for sequence alignment and fully utilizes the vectorization of the slave cores of SW26010. With capacity-aware load balancing, we distribute the workload evenly among the cores of SW26010. With bandwidth-aware data transfer, ESA reduces the communication overhead by using asynchronous DMA transmission and RLC. • We evaluate the performance of ESA using the Swiss-Prot database on Sunway TaihuLight. Our experimental results show that ESA achieves a speedup of 34.5 times on a single CG over the manager core. Compared with a serial implementation on Intel (R) Xeon (R) CPU E5-2620 v4 processor, ESA achieves a speedup of 21.6 on a single CG. We also demonstrate that ESA has linear weak scalability and very competitive strong scalability. Finally, we compare ESA with mainstream algorithms on the CPU+GPU platform and achieve the highest GCUPS of 228.91. [ABSTRACT FROM AUTHOR]
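The hybrid alignment described in this abstract builds on two classic dynamic programs: global (Needleman-Wunsch) and local (Smith-Waterman) alignment. As a rough illustration only (ESA's actual algorithm, data layout, and SW26010-specific optimizations are far more involved; the function names and scoring parameters here are illustrative), the two scoring recurrences can be sketched as:

```python
def global_score(a, b, match=2, mismatch=-1, gap=-2):
    # Needleman-Wunsch: optimal end-to-end alignment score,
    # computed row by row to keep O(m) memory.
    n, m = len(a), len(b)
    prev = [j * gap for j in range(m + 1)]  # aligning prefix of b with ""
    for i in range(1, n + 1):
        cur = [i * gap] + [0] * m
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(prev[j - 1] + s,   # substitute/match
                         prev[j] + gap,     # gap in b
                         cur[j - 1] + gap)  # gap in a
        prev = cur
    return prev[m]

def local_score(a, b, match=2, mismatch=-1, gap=-2):
    # Smith-Waterman: best-scoring local alignment; scores are
    # floored at 0 so an alignment can start anywhere.
    n, m = len(a), len(b)
    best = 0
    prev = [0] * (m + 1)
    for i in range(1, n + 1):
        cur = [0] * (m + 1)
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(0, prev[j - 1] + s, prev[j] + gap, cur[j - 1] + gap)
            best = max(best, cur[j])
        prev = cur
    return best
```

A cache-aware, vectorized implementation like ESA's replaces these O(nm) Python loops with compact data structures and SIMD-friendly kernels on the SW26010 slave cores.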
- Published
- 2023
- Full Text
- View/download PDF
13. ENDFtk: A robust tool for reading and writing ENDF-formatted nuclear data.
- Author
- Haeck, W., Gibson, N., and Talou, P.
- Subjects
- *PYTHON programming language, *C++
- Abstract
ENDFtk is a recently developed C++ and Python interface for interacting with ENDF-6 formatted nuclear data files. It provides a robust and complete interface, allowing the reading and writing of all formats currently part of the ENDF-6 formats manual, as well as some non-ENDF formats used by the NJOY processing code. It provides an interface that mimics the names in the ENDF-6 formats manual as well as an equivalent interface using human-readable attribute names. It is robust and powerful enough for nuclear data experts to develop complex applications, while also simple enough to be used by non-experts to retrieve and manipulate evaluated nuclear data. ENDFtk offers the ability to easily interrogate and manipulate data either in large-scale code projects or in simple Python scripts. In this paper, a brief overview of the interface is given, as well as more substantial examples demonstrating plotting simple data, interacting with more complex data, and writing new data to files. ENDFtk is open source and available for download via GitHub (https://github.com/njoy/ENDFtk). Program title: ENDFtk 1.0 CPC Library link to program files: https://doi.org/10.17632/9p4kxc2cvd.1 Developer's repository link: https://github.com/njoy/ENDFtk Licensing provisions: BSD-3 clause Programming language: C++ and Python External routines/libraries: pybind11, ranges-v3, spdlog Nature of problem: Provide an interface to read, write and manipulate nuclear data files using the ENDF-6 format. This interface can be integrated into other libraries requiring access to nuclear data, or be used directly through the Python interface. Solution method: A library of C++ routines, with Python bindings, to be integrated into higher-level codes and scripts. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Effective curriculum for teaching L2 writing: Principles and techniques, E. Hinkel. Routledge/Taylor & Francis, New York (2015). x + 302 pp. $90.00 hardcover, $32.99 paper.
- Author
- Yigitoglu, Nur
- Subjects
- *CURRICULUM, *NONFICTION, WRITING
- Published
- 2016
- Full Text
- View/download PDF
15. Remarks on the paper by Sh. A. Mukhamediev, E. I. Ryzhak, and S. V. Sinyukhina “Stability of a two-layer system of inhomogeneous heavy barotropic fluids”, J. Appl. Math. Mech.: 2016, Vol. 80, No. 3, pp. 264–270.
- Author
- Kulikovskii, A.G.
- Subjects
- *NONFICTION
- Published
- 2016
- Full Text
- View/download PDF
16. Reflections on the settlement of fisheries disputes between the EU-UK in the post-Brexit era: Lessons for China's fishery enforcement disputes settlement.
- Author
- Zhu, Jiaxin and Xu, Qi
- Subjects
- EUROPE-Great Britain relations, DISPUTE resolution, INTERNATIONAL cooperation, FISHERIES, INTERNATIONAL law
- Abstract
The Trade and Cooperation Agreement sets out in detail the mechanism for resolving fisheries disputes between the EU and the UK. It consists of the following core components: the mechanism for settling fisheries disputes arising from access to waters, the mechanism for settling fisheries disputes arising from failure to fulfill fisheries obligations, the Specialised Committee on Fisheries as a platform for consultation, and the Arbitration Tribunal for Disputes. Although it has helped both parties resolve fisheries disputes peacefully within the framework of international law, the implementation of the mechanism still faces a number of challenges. At present, China faces long-term and complex fishery enforcement disputes with neighboring countries. Although China has signed bilateral fishery cooperation agreements with several countries, an effective fishery enforcement dispute settlement mechanism has yet to be established. This paper examines the EU-UK fishery dispute mechanism with the aim of providing experience and lessons for China to resolve fishery enforcement disputes with neighboring countries more effectively. The paper is divided into three parts: the first provides an overview of the EU-UK fishery dispute mechanism, the second discusses the dilemma of China's fishery enforcement dispute settlement, and the third draws lessons from the EU-UK fishery dispute mechanism. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. The ELL Writer: Moving Beyond Basics in the Secondary Classroom, C. Ortmeier-Hooper. Teacher’s College Press, NY (2013). xv + 197 pp. $68.00 cloth, $31.95 paper.
- Author
- Caplan, Nigel A.
- Subjects
- *TEACHING, *NONFICTION
- Published
- 2015
- Full Text
- View/download PDF
18. PW-NeRF: Progressive wavelet-mask guided neural radiance fields view synthesis.
- Author
- Han, Xuefei, Liu, Zheng, Nan, Hai, Zhao, Kai, Zhao, Dongjie, and Jin, Xiaodan
- Subjects
- *RADIANCE, *GAUSSIAN processes, *IMAGE compression
- Abstract
Neural Radiance Fields (NeRF) can achieve state-of-the-art novel view synthesis results when given a sufficient number of training views. However, NeRF's rendering is based on minimizing a photometric consistency loss, and during optimization it can suffer from overfitting due to factors such as lighting and texture, resulting in poor geometric and color reconstruction. The less data there is in the overlapping areas of the images (for example, where occlusions limit coverage), the greater the impact on the results. In this paper, we observed through experiments that introducing low-frequency images into NeRF during training can quickly yield approximate geometric structures, which can guide NeRF toward more stable view synthesis results. Therefore, we propose a progressive wavelet mask to assist the training of neural radiance fields. By first constructing a Gaussian pyramid for the training images and then applying wavelet decomposition to them, we obtain a series of low-frequency region masks that guide the neural radiance field to focus on learning low-frequency pixel regions and gradually introduce high-frequency lighting and texture changes. Our experiments on the LLFF and Blender datasets show that using a progressive wavelet mask in NeRF training achieves more realistic generation with almost no additional computational overhead. We also tested our method in sparse scenes, where it still performs well in avoiding overfitting. • NeRF obtains more stable geometric structures and rendering quality with blurred images. • We design a new training strategy to enhance the rendering performance of NeRF. • NeRF models built on neural networks are influenced by the ray training strategy. • The method is more effective in NeRF variants based on neural networks. [ABSTRACT FROM AUTHOR]
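The core of the masking idea is separating low- from high-frequency image regions via wavelet decomposition. A minimal single-level Haar sketch in pure Python (the paper's progressive, multi-scale mask built on a Gaussian pyramid is more elaborate; the function names and threshold here are illustrative):

```python
def haar2d_ll(img):
    # One level of 2D Haar wavelet decomposition: return the low-low (LL)
    # band, i.e. the low-frequency approximation at half resolution.
    h, w = len(img), len(img[0])
    ll = [[0.0] * (w // 2) for _ in range(h // 2)]
    for i in range(h // 2):
        for j in range(w // 2):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            ll[i][j] = (a + b + c + d) / 4.0  # average of the 2x2 block
    return ll

def lowfreq_mask(img, thresh=1.0):
    # Mark 2x2 blocks whose high-frequency (detail) content is below
    # thresh; such blocks would be learned first under a low-frequency mask.
    h, w = len(img), len(img[0])
    mask = [[False] * (w // 2) for _ in range(h // 2)]
    for i in range(h // 2):
        for j in range(w // 2):
            vals = (img[2 * i][2 * j], img[2 * i][2 * j + 1],
                    img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1])
            avg = sum(vals) / 4.0
            detail = max(abs(v - avg) for v in vals)  # deviation from LL band
            mask[i][j] = detail < thresh
    return mask
```

A progressive schedule would relax the threshold (or move up the pyramid) over training iterations, gradually admitting high-frequency regions into the loss.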
- Published
- 2024
- Full Text
- View/download PDF
19. A Researcher-oriented Automated Data Ingestion Tool for rapid data Processing, Visualization and Preservation.
- Author
- Hacker, Thomas, Dyke, Shirley, Ozdagli, Ali Irmak, Marshall, Gemez, Thompson, Christopher, Rohler, Brian, and Yeum, Chul Min
- Subjects
- *DASHBOARDS (Management information systems), *DATA modeling, *CYBERINFRASTRUCTURE
- Abstract
A select number of scientific communities have been quite successful in evolving the culture within their communities to encourage publishing and to provide resources for re-using well-documented data. These data have great potential for analysis and knowledge generation beyond the purposes for which they were originally collected. However, there are still barriers in this process. To explore this problem, we have developed a prototype tool, the Experiment Dashboard (ED), with the objective of demonstrating the ability and potential of automated data ingestion from typical research laboratories. This prototype was developed to explore the possibilities of allowing researchers in laboratories across the nation to link their data acquisition systems directly to structured data repositories for data and metadata ingestion. The prototype functions with commonly used data acquisition software at the data source and the HUBzero scientific gateway at the data sink. ED can be set up with minimal effort and expertise. In this paper, we describe the motivation and purposes for the prototype, the architecture and functionality of the tool, and provide a demonstration of the tool for optical measurements in a structural engineering laboratory. The goal of this paper is to articulate, and show through our prototype, a vision for future cyberinfrastructure for empirical disciplines that rely on the rapid collection, analysis, and dissemination of valuable experimental data. We also discuss lessons learned that may be useful for others seeking to solve similar problems. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
20. Making Hawai’i’s Place: Review of The World and All the Things Upon It: Native Hawaiian Geographies of Exploration, David A. Chang, University of Minnesota Press, 2016, $27.00 (paper), 344 pages, ISBN: 978-0-8166-9942-1.
- Author
- Pickman, Sarah M.
- Subjects
- *AMERICAN geographical discoveries, *NONFICTION
- Published
- 2017
- Full Text
- View/download PDF
21. V. Sperling, Sex, Politics, & Putin: Political Legitimacy in Russia. Oxford University Press, Oxford (2015). ix + 360 pp. Appendix. Bibliography. Index. Paper.
- Author
- Pfahlert, Jeanine
- Subjects
- *NONFICTION, *HISTORY, RUSSIAN politics & government
- Published
- 2017
- Full Text
- View/download PDF
22. Development of the fully Geant4 compatible package for the simulation of Dark Matter in fixed target experiments.
- Author
- Banto Oberhauser, B., Bisio, P., Celentano, A., Depero, E., Dusaev, R.R., Kirpichnikov, D.V., Kirsanov, M.M., Krasnikov, N.V., Marini, A., Marsicano, L., Molina-Bueno, L., Mongillo, M., Shchukin, D., Sieber, H., and Voronchikhin, I.V.
- Subjects
- *DARK matter, *C++, *MUONS, *PHOTON beams, *SIMULATION software, *USER interfaces
- Abstract
The search for new, comparatively light (well below the electroweak scale) feebly interacting particles is an exciting possibility to explain some mysterious phenomena in physics, among them the origin of Dark Matter. Sensitivity studies through detailed simulation of projected experiments are a key point in estimating their potential for discovery. Several years ago we created the DMG4 package for the simulation of DM (Dark Matter) particles in fixed target experiments. The natural approach is to integrate this simulation into the same program that performs the full simulation of particles in the experiment setup. The Geant4 toolkit framework was chosen as the most popular and versatile solution. The simulation of DM particle production by this package accommodates several possible scenarios, employing electron, muon or photon beams and involving various mediators, such as vector, axial vector, scalar, pseudoscalar, or spin 2 particles. The bremsstrahlung, annihilation or Primakoff processes can be simulated. The DMG4 package contains a subpackage DarkMatter with cross section methods only weakly coupled to Geant4, so it can be used in different frameworks. In this paper, we present the latest developments of the package, such as extending the list of possible mediator particle types, refining formulas for the simulation and extending the mediator mass range. The user interface is also made more flexible and convenient. In this work, we also demonstrate the usage of the package, the improvements in simulation accuracy and some cross-check validations. Program title: DMG4 CPC Library link to program files: https://doi.org/10.17632/cmr4bcrj6j.1 Licensing provisions: GNU General Public License 3 Programming language: C++ Journal reference of previous version: Comput. Phys. Commun.
269 (2021) 108129 Does the new version supersede the previous version?: Yes Reasons for the new version: Numerous developments, addition of new features Summary of revisions: WW approximation cross sections for the muon beam are implemented and cross-checked, models with semivisible A ′ (inelastic Dark Matter) and spin 2 mediators are added. The range of possible mediator masses is extended. Several important improvements for the annihilation processes are made, the number of possible annihilation processes is extended. User interface is improved. Several bugs are fixed. Nature of problem: For the simulation of Dark Matter production processes in fixed target experiments a code that can be easily integrated in programs for the full simulation of experimental setup is needed. Solution method: A fully Geant4 compatible DM simulation package DMG4 was presented in 2020. We present numerous further developments of this package. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. FiniteFieldSolve: Exactly solving large linear systems in high-energy theory.
- Author
- Mangan, James
- Subjects
- *SYSTEMS theory, *LINEAR systems, *FINITE fields, *MODULAR arithmetic, *SET theory, *PRIME numbers
- Abstract
Large linear systems play an important role in high-energy theory, appearing in amplitude bootstraps and during integral reduction. This paper introduces FiniteFieldSolve, a general-purpose toolkit for exactly solving large linear systems over the rationals. The solver interfaces directly with Mathematica, is straightforward to install, and seamlessly replaces Mathematica's native solvers. In testing, FiniteFieldSolve is approximately two orders of magnitude faster than Mathematica and uses an order of magnitude less memory. The package also compares favorably against other public solvers in FiniteFieldSolve's intended use cases. As the name of the package suggests, solutions are obtained via well-known finite field methods. These methods suffer from introducing an inordinate number of modulo (or integer division) operations with respect to different primes. By automatically recompiling itself for each prime, FiniteFieldSolve converts the division operations into much faster combinations of instructions, dramatically improving performance. The technique of compiling the prime can be applied to any finite field solver, where the time savings will be solver dependent. The operation of the package is illustrated through a detailed example of an amplitude bootstrap. Program Title: FiniteFieldSolve CPC Library link to program files: https://doi.org/10.17632/ntxvp58mjg.1 Developer's repository link: https://github.com/jfmangan/FiniteFieldSolve Licensing provisions: GPLv3 Programming language: Mathematica, C++ Nature of problem: Exactly solving large linear systems over the rationals occurs in various settings in high-energy theory, for example when performing integral reduction or bootstrapping an amplitude. Solution method: The linear system is solved by repeatedly row reducing over different finite fields (see Ref. [1] and references therein). 
Finite fields avoid the intermediate expression swell inherent to arbitrary-precision rationals and bypass roundoff errors from floating point numbers. A downside to using modular arithmetic is that it introduces a tremendous number of integer divisions, but this can be mitigated by compiling the divisions down to simpler instructions. The solver is designed to handle arbitrarily dense systems such as those that appear in certain amplitude bootstraps. [1] M. Kauers, Fast solvers for dense linear systems, Nucl. Phys. B Proc. Suppl. 183, 245–250 (2008), 10.1016/j.nuclphysbps.2008.09.111 [ABSTRACT FROM AUTHOR]
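The finite field method named as the solution method can be sketched in two steps: row reduce the system modulo a prime, then lift the modular solution back to exact rationals via rational reconstruction. The single-prime Python sketch below is illustrative only (FiniteFieldSolve itself row reduces over several primes, combines them, and recompiles per prime; the function names here are assumptions, not its API):

```python
from fractions import Fraction

def solve_mod_p(A, b, p):
    # Gauss-Jordan elimination over GF(p); A square, assumed nonsingular mod p.
    n = len(A)
    M = [[a % p for a in row] + [bi % p] for row, bi in zip(A, b)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col])  # find a pivot row
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)  # Fermat inverse, valid since p is prime
        M[col] = [(v * inv) % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [(vr - f * vc) % p for vr, vc in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

def rational_reconstruct(u, p):
    # Recover num/den with |num|, den <= sqrt(p/2) from u mod p via the
    # extended Euclidean algorithm (Wang's rational reconstruction).
    bound = int((p / 2) ** 0.5)
    r0, r1 = p, u % p
    t0, t1 = 0, 1
    while r1 > bound:          # invariant: r_i = t_i * u (mod p)
        q = r0 // r1
        r0, r1 = r1, r0 - q * r1
        t0, t1 = t1, t0 - q * t1
    if t1 == 0 or abs(t1) > bound:
        raise ValueError("reconstruction failed; use a larger prime")
    return Fraction(r1, t1)
```

For exactness in general, one repeats this over many primes and combines the modular images via the Chinese Remainder Theorem until the reconstructed solution stabilizes; compiling the prime into the reduction kernel, as the abstract describes, speeds up the inner modular arithmetic.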
- Published
- 2024
- Full Text
- View/download PDF
24. ElecTra code: Full-band electronic transport properties of materials.
- Author
- Graziosi, Patrizio, Li, Zhen, and Neophytou, Neophytos
- Subjects
- *PHONON scattering, *BOLTZMANN'S equation, *ACOUSTIC phonons, *THERMAL conductivity, *CHARGE carrier mobility, *INELASTIC scattering, *FERMI level
- Abstract
This paper introduces ElecTra, an open-source code which solves the linearized Boltzmann transport equation in the relaxation time approximation for charge carriers in a full-band electronic structure of arbitrary complexity, including their energy, momentum, and band-index dependence. ElecTra stands for 'ELECtronic TRAnsport' and computes the electronic and thermoelectric transport coefficients (electrical conductivity, Seebeck coefficient, electronic thermal conductivity, and mobility) for both unipolar and bipolar (small-bandgap) semiconductor materials. The code uses computed full bands and relevant scattering parameters as inputs and considers single-crystal materials in 3D and 2D. The present version of the code (v1) considers: i) elastic scattering with acoustic phonons and inelastic scattering with non-polar optical phonons in the deformation potential approximation, ii) inelastic scattering with polar phonons, iii) scattering with ionized dopants, and iv) alloy scattering. The user is given the option of intra- and inter-band scattering considerations. The simulation output also includes relevant relaxation times and mean free paths. The transport quantities are computed as a function of Fermi level position, doping density, and temperature. ElecTra can interface with any DFT code which saves the electronic structure in the '.bxsf' format. In this paper ElecTra is validated against ideal electronic transport situations with known analytical solutions, existing codes employing the constant relaxation time approximation, as well as experimentally well-assessed materials such as Si, Ge, SiGe, and GaAs.
Program title: ElecTra – Electronic Transport simulation lab CPC Library link to program files: https://doi.org/10.17632/ycgx2fjzb6.1 Licensing provisions: GPLv3 Programming language: MATLAB® Nature of the problem: computing the electronic and thermoelectric charge transport coefficients of materials with arbitrarily complex full-band electronic structures, considering the carrier energy, momentum, and band dependence of the scattering rates. Solution method: semiclassical linearized Boltzmann transport equation, with electronic structures (DFT or analytical) as input, formed into constant-energy surfaces, with scattering rates evaluated using Fermi's Golden Rule. Additional comments including restrictions and unusual features: • Programming interface: any DFT code which saves data in the '.bxsf' format. • RAM: for a case study of a half-Heusler band structure on a 51 × 51 × 51 k-mesh, 2 GB per processor are needed. • Running time: for the example above, depending on the number and complexity of the scattering mechanisms and the number of simulated Fermi levels and temperatures considered, the time needed varies from ∼1 hour on a desktop PC or laptop (light simulations) to 5-10 hours on an HPC with 30-45 cores (heavy simulations). Using the constant relaxation time and constant mean-free-path approximations on a desktop PC or laptop, the running time is of the order of minutes. [ABSTRACT FROM AUTHOR]
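The transport coefficients listed in this abstract follow from the linearized BTE in the standard way; the expressions below use generic textbook notation (not the paper's own, and up to sign conventions for the carrier charge q):

```latex
\sigma = q^2 \int \Xi(E) \left(-\frac{\partial f_0}{\partial E}\right) \mathrm{d}E,
\qquad
S = \frac{q}{\sigma T} \int \Xi(E)\,(E - E_F) \left(-\frac{\partial f_0}{\partial E}\right) \mathrm{d}E,
```

where the transport distribution function Ξ(E) = Σ_k v_k² τ_k δ(E − E_k) collects the band velocities and the energy-, momentum-, and band-dependent relaxation times τ_k that the code evaluates via Fermi's Golden Rule.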
- Published
- 2023
- Full Text
- View/download PDF
25. OnTrack: Reflecting on domain specific formal methods for railway designs.
- Author
- James, Phillip, Moller, Faron, and Pantekis, Filippos
- Subjects
- *RAILROAD design & construction, *ENGINEERING models, *ROLE models
- Abstract
OnTrack is a tool, implemented using model-driven engineering frameworks, that supports workflows for railway verification. Starting with graphical scheme plans and finishing with automatically generated formal models set up for verification, OnTrack allows railway engineers to interact with verification procedures through encapsulated formal methods. OnTrack is grounded on a domain-specific language (DSL) capturing scheme plans and supports generation of various formal models using model transformations. In this paper, we detail the role model-driven engineering plays within OnTrack and reflect on the use of model-driven engineering concepts for developing domain-specific formal methods toolsets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Anxiety disorders in children and adolescents with intellectual disability: Prevalence and assessment.
- Author
- Reardon, Tessa C., Gray, Kylie M., and Melvin, Glenn A.
- Subjects
- *DEVELOPMENTAL disabilities research, *CHILDREN with intellectual disabilities, *DISEASE prevalence, *INTELLECTUAL disabilities, *ANXIETY disorders, *DISEASES in teenagers
- Abstract
Children and adolescents with intellectual disability are known to experience mental health disorders, but anxiety disorders in this population have received relatively little attention. Firstly, this paper reviews published studies reporting prevalence rates of anxiety disorders in children and adolescents with intellectual disability. Secondly, it reviews measures of anxiety that have been evaluated in children/adolescents with intellectual disability, and details the associated psychometric properties. Seven studies reporting prevalence rates of anxiety disorders in this population were identified, with reported rates varying from 3% to 22%. Twenty-one studies evaluating a measure of anxiety in a sample of children/adolescents with intellectual disability were identified. While these studies indicate that several measures show promise, further evaluation studies are needed; particularly those that evaluate the capacity of measures to screen for anxiety disorders, not only to measure symptoms. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
27. The Sounding of the Whale, D. Graham Burnett. The University of Chicago Press (2012). 793 pp. Cloth $45.00, ISBN: 9780226081304, Paper $30.00, ISBN: 9780226100579.
- Author
-
Cripps, Simon
- Subjects
- *
WHALE sounds , *NONFICTION - Published
- 2015
- Full Text
- View/download PDF
28. Neko: A modern, portable, and scalable framework for high-fidelity computational fluid dynamics.
- Author
-
Jansson, Niclas, Karp, Martin, Podobas, Artur, Markidis, Stefano, and Schlatter, Philipp
- Subjects
- *
COMPUTATIONAL fluid dynamics , *REYNOLDS number , *COMPUTER software developers , *TURBULENCE , *TURBULENT flow - Abstract
Computational fluid dynamics (CFD), in particular applied to turbulent flows, is a research area of great engineering and fundamental physical interest. However, already at moderately high Reynolds numbers the computational cost becomes prohibitive, as the range of active spatial and temporal scales widens quickly. Scale-resolving simulations in particular, including large-eddy simulation (LES) and direct numerical simulation (DNS), thus need to rely on modern, efficient numerical methods and corresponding software implementations. Recent trends and advancements, including more diverse and heterogeneous hardware in High-Performance Computing (HPC), are challenging software developers in their pursuit of good performance and numerical stability. The well-known maxim "software outlives hardware" may no longer hold true, and developers are today forced to re-factor their codebases to leverage these powerful new systems. In this paper, we present Neko, a new portable framework for high-order spectral element discretization, targeting turbulent flows in moderately complex geometries. Neko is fully available as open-source software. Unlike prior works, Neko adopts a modern object-oriented approach in Fortran 2008, allowing multi-tier abstractions of the solver stack and facilitating hardware backends ranging from general-purpose processors (CPUs) down to exotic vector processors and FPGAs. We show that Neko's performance and accuracy are comparable to NekRS, and thus on par with Nek5000's successor on modern CPU machines. Furthermore, we develop a performance model, which we use to discuss challenges and opportunities for high-order solvers on emerging hardware. • We introduce Neko, a modernized framework for high-fidelity CFD simulations. • We reveal Neko's inner implementation details and design decisions. • We develop a performance model, which we validate and use to project performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
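The performance model developed in the Neko paper is not reproduced here, but a generic roofline-style estimate conveys the idea; all peak numbers below are placeholders, not measurements from the paper.

```python
# Roofline-style sketch: a solver step is limited either by compute
# throughput or by memory bandwidth, whichever bound is larger.
def step_time(flops, bytes_moved, peak_flops, peak_bandwidth):
    """Estimated time (s) for one step on a machine with the given peaks."""
    return max(flops / peak_flops, bytes_moved / peak_bandwidth)

# Example: 1e9 flops moving 4e8 bytes on a 1 TFLOP/s, 100 GB/s node
# is memory-bound (4 ms vs 1 ms).
t = step_time(1.0e9, 4.0e8, 1.0e12, 1.0e11)
```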
29. H-wave – A Python package for the Hartree-Fock approximation and the random phase approximation.
- Author
-
Aoyama, Tatsumi, Yoshimi, Kazuyoshi, Ido, Kota, Motoyama, Yuichi, Kawamura, Taiki, Misawa, Takahiro, Kato, Takeo, and Kobayashi, Akito
- Subjects
- *
HARTREE-Fock approximation , *CONDENSED matter physics , *HUBBARD model , *PROGRAMMING languages , *INTEGRATED software - Abstract
H-wave is an open-source software package for performing the Hartree–Fock approximation (HFA) and random phase approximation (RPA) for a wide range of Hamiltonians of interacting fermionic systems. In HFA calculations, H-wave examines the stability of several symmetry-broken phases, such as anti-ferromagnetic and charge-ordered phases, in the given Hamiltonians at zero and finite temperatures. Furthermore, H-wave calculates the dynamical susceptibilities using RPA to examine the instability toward the symmetry-broken phases. By preparing a simple input file specifying the Hamiltonians, users can perform HFA and RPA for standard Hamiltonians in condensed matter physics, such as the Hubbard model and its extensions. Additionally, users can use a Wannier90-like format to specify fermionic Hamiltonians. The Wannier90 format is implemented in RESPACK to derive ab initio Hamiltonians for solids, so HFA and RPA for these ab initio Hamiltonians can be easily performed using H-wave. In this paper, we first explain the basis of HFA and RPA and the basic usage of H-wave, including download and installation. Thereafter, the input file formats implemented in H-wave, including the Wannier90-like format for specifying the interacting fermionic Hamiltonians, are discussed. Finally, we present several examples of H-wave, such as zero-temperature HFA calculations for the extended Hubbard model on a square lattice, finite-temperature HFA calculations for the Hubbard model on a cubic lattice, and RPA in the extended Hubbard model on a square lattice. Program Title: H-wave CPC Library link to program files: https://doi.org/10.17632/9gr6pxhfjm.1 Developer's repository link: https://github.com/issp-center-dev/H-wave Code Ocean capsule: https://codeocean.com/capsule/6875177 Licensing provisions: GNU General Public License version 3 Programming language: Python3 External routines/libraries: NumPy, SciPy, Tomli, Requests.
Nature of problem: Physical properties of strongly correlated electrons, such as the ground-state phase structure and response functions at zero and finite temperatures, are examined. Solution method: Calculations based on the unrestricted Hartree-Fock approximation and the random phase approximation are performed for quantum lattice models such as the Hubbard model and its extensions. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
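The RPA instability analysis that H-wave performs can be illustrated in its simplest scalar form; this one-liner is a textbook sketch, not H-wave's actual interface.

```python
# Scalar RPA sketch: an interaction U enhances a bare susceptibility chi0
# as chi0 / (1 - U*chi0); the divergence as U*chi0 -> 1 signals an
# instability toward a symmetry-broken phase.
def rpa_susceptibility(chi0, U):
    return chi0 / (1.0 - U * chi0)

chi_weak = rpa_susceptibility(0.5, 1.0)    # enhanced from 0.5 to 1.0
chi_near = rpa_susceptibility(0.5, 1.95)   # close to the instability
```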
30. Expanding PyProcar for new features, maintainability, and reliability.
- Author
-
Lang, Logan, Tavadze, Pedram, Tellez, Andres, Bousquet, Eric, Xu, He, Muñoz, Francisco, Vasquez, Nicolas, Herath, Uthpala, and Romero, Aldo H.
- Subjects
- *
GRAPHICAL user interfaces , *PYTHON programming language , *FERMI surfaces , *ATOMIC orbitals , *MATERIALS science , *MAINTAINABILITY (Engineering) , *ELECTRONIC structure - Abstract
This paper presents a comprehensive update to PyProcar, a versatile Python package for analyzing and visualizing density functional theory (DFT) calculations in materials science. The latest version introduces a modularized codebase, a centralized example data repository, and a robust testing framework, offering a more reliable, maintainable, and scalable platform. Expanded support for various DFT codes broadens its applicability across research environments. Enhanced documentation and an example gallery make the package more accessible to new and experienced users. Incorporating advanced features such as band unfolding, noncollinear calculations, and derivative calculations of band energies enriches its analytic capabilities, providing deeper insights into electronic and structural properties. The package also incorporates PyPoscar, a specialized toolkit for manipulating POSCAR files, broadening its utility in computational materials science. These advancements solidify PyProcar's position as a comprehensive and highly adaptable tool, effectively serving the evolving needs of the materials science community. Program title: PyProcar CPC Library link to program files: https://doi.org/10.17632/d4rrfy3dy4.2 Developer's repository link: https://github.com/romerogroup/pyprocar Licensing provisions: GPLv3 Programming language: Python Supplementary material: PyProcar Supplementary Information Journal reference of previous version: Comput. Phys. Commun. 251 (2020) 107080, https://doi.org/10.1016/j.cpc.2019.107080 Does the new version supersede the previous version?: Yes Reasons for the new version: Changes in the directory structure, the addition of new features, enhancement of the manual and user documentation, and generation of interfaces with other electronic structure packages. Summary of revisions: These updates enhance the package's capabilities and improve maintainability, reliability, and ease of use for both developers and users.
Nature of problem: To automate, simplify, and serialize the analysis of band structure and Fermi surface, especially for high throughput calculations. Solution method: Implement a Python library able to handle, combine, parse, extract, plot, and even repair data from density functional calculations from diverse electronic structure packages. PyProcar uses color maps on the band structures or Fermi surfaces to give a simple representation of the relevant characteristics of the electronic structure. Additional comments including restrictions and unusual features: PyProcar can produce high-quality figures of band structures and Fermi surfaces (2D and 3D), projection of atomic orbitals, atoms, and/or spin components. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
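Band-structure coloring by orbital projection, one of PyProcar's core visualizations, boils down to normalizing projection weights per band and k-point; the sketch below uses an invented nested-list layout, not PyProcar's API or the PROCAR file format.

```python
# Normalize per-orbital projection weights so that each (band, k-point)
# sums to 1 and can be mapped to a color along the band structure.
# Layout (invented for this sketch): weights[band][kpoint][orbital].
def normalize_projections(weights):
    normalized = []
    for band in weights:
        normalized.append([])
        for kpoint in band:
            total = sum(kpoint) or 1.0  # guard against all-zero projections
            normalized[-1].append([w / total for w in kpoint])
    return normalized

# One band, one k-point, three orbitals (e.g. s, p, d).
fractions = normalize_projections([[[1.0, 1.0, 2.0]]])
```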
31. FabSim3: An automation toolkit for verified simulations using high performance computing.
- Author
-
Groen, Derek, Arabnejad, Hamid, Suleimenova, Diana, Edeling, Wouter, Raffin, Erwan, Xue, Yani, Bronik, Kevin, Monnier, Nicolas, and Coveney, Peter V.
- Subjects
- *
HIGH performance computing , *PYTHON programming language , *AUTOMATION , *ERROR probability , *HUMAN error , *BUDGET - Abstract
A common feature of computational modelling and simulation research is the need to perform many tasks in complex sequences to achieve a usable result. This will typically involve tasks such as preparing input data, pre-processing, running simulations on a local or remote machine, post-processing, and performing coupling communications, validations and/or optimisations. Tasks like these can involve manual steps which are time and effort intensive, especially when it involves the management of large ensemble runs. Additionally, human errors become more likely and numerous as the research work becomes more complex, increasing the risk of damaging the credibility of simulation results. Automation tools can help ensure the credibility of simulation results by reducing the manual time and effort required to perform these research tasks, by making more rigorous procedures tractable, and by reducing the probability of human error due to a reduced number of manual actions. In addition, efficiency gained through automation can help researchers to perform more research within the budget and effort constraints imposed by their projects. This paper presents the main software release of FabSim3, and explains how our automation toolkit can improve and simplify a range of tasks for researchers and application developers. FabSim3 helps to prepare, submit, execute, retrieve, and analyze simulation workflows. By providing a suitable level of abstraction, FabSim3 reduces the complexity of setting up and managing a large-scale simulation scenario, while still providing transparent access to the underlying layers for effective debugging. The tool also facilitates job submission and management (including staging and curation of files and environments) for a range of different supercomputing environments. Although FabSim3 itself is application-agnostic, it supports a provably extensible plugin system where users automate simulation and analysis workflows for their own application domains. 
To highlight this, we briefly describe a selection of these plugins and demonstrate the efficiency of the toolkit in handling large ensemble workflows. Program Title: FabSim3 CPC Library link to program files: https://doi.org/10.17632/6nfrwy7ptj.1 Licensing provisions: BSD 3-clause Programming language: Python 3 Nature of problem: Many aspects are crucial for obtaining reproducible and robust simulation results. For instance, we need to curate all the inputs and outputs for later scrutiny, examine the model behaviour under slightly perturbed circumstances, quantify the propagation of key uncertainties from input data and known parameters, and analyze the sensitivity of any parameters for which the exact specification eludes us. Solution method: FabSim3 uses a range of methods to provide automation. These primarily include: (i) SSH + Fabric2 to enable remote execution of SSH commands, (ii) an internal parameter state space using primarily Python dict objects that can be customized with machine-, plugin-, and user-specific modifications, (iii) Python templating to quickly enable the insertion of state space variables into supercomputing scripts, (iv) multiprocessing and/or QCG-PilotJob to enable efficient submission and execution of job arrays, and (v) a system of flexibly installable and modifiable Python3 plugins which allows users to create and customize application-specific functionalities without modifying the core code base. In addition to the written code, FabSim3 also relies on a set of user conventions to maintain a separation of concerns (particularly between machine-, user- and application-specific settings). Additional comments including restrictions and unusual features: This paper serves as the definitive reference for FabSim3. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
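Item (iii) in the solution method, Python templating of supercomputing scripts, can be sketched with the standard library alone; the template text and variable names below are invented, not taken from an actual FabSim3 machine file.

```python
from string import Template

# Fill a job-script template from a parameter state space (a plain dict
# here), in the spirit of FabSim3's machine/plugin/user-specific settings.
job_template = Template(
    "#!/bin/bash\n"
    "#SBATCH --nodes=$nodes\n"
    "srun $executable $config\n"
)
state = {"nodes": 4, "executable": "flee", "config": "run.yml"}
script = job_template.substitute(state)
```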
32. SolAR: Automated test-suite generation for solidity smart contracts.
- Author
-
Driessen, S.W., Di Nucci, D., Tamburri, D.A., and van den Heuvel, W.J.
- Subjects
- *
CONTRACTS , *GENETIC algorithms , *SOLAR power plants , *BLOCKCHAINS - Abstract
Smart contracts have rapidly gained popularity as self-contained pieces of code, especially those run on the Ethereum blockchain. On the one hand, smart contracts are immutable, have transparent workings, and execute autonomously. On the other hand, these qualities make it essential to properly test the behavior of a smart contract before deploying it. In this paper, we introduce SolAR, a tool and approach for Solidity Automated Test Suite GeneRation. SolAR allows smart contract developers to generate test suites for Solidity smart contracts, automatically optimized for branch coverage using either a state-of-the-art genetic algorithm or a fuzzing approach. It enables a novel way to handle blockchain operations—or ChainOps—from a pipeline perspective, entailing larger-scale as well as more manageable and maintainable service continuity. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
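The test-suite generation idea, whether genetic or fuzzing-based, optimizes inputs for branch coverage; below is a minimal fuzzing sketch in plain Python against a toy contract-like function (SolAR itself targets Solidity and is not shown).

```python
import random

# Toy coverage-guided fuzzing loop in the spirit of SolAR: random inputs
# are kept only when they reach a branch of a contract-like function that
# the suite has not covered yet.
def withdraw(balance, amount):
    if amount > balance:
        return "revert"      # branch A: overdraw rejected
    return balance - amount  # branch B: normal withdrawal

random.seed(0)
covered, suite = set(), []
while covered != {"A", "B"}:
    amount = random.randint(0, 200)
    branch = "A" if amount > 100 else "B"  # branch taken for balance = 100
    if branch not in covered:
        covered.add(branch)
        suite.append(amount)
```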
33. ACSmt: A plugin for eclipse papyrus to model systems of systems.
- Author
-
Remond Harbo, Sean Kristian, Palmelund Voldby, Emil, Madsen, Jonas, and Albano, Michele
- Subjects
- *
SYSTEM of systems , *SOURCE code , *UNIFIED modeling language - Abstract
While System of Systems (SoS) architectures for large and complex software projects are gaining momentum, the commonly used modeling and tooling approaches are still general-purpose or oriented towards single systems. Developers could benefit from methods and tools that avoid system-centric details in favor of native SoS modeling support. This paper presents a diagram-centric modeling tool with native SoS modeling support. The tool is implemented as a plugin for the Eclipse Papyrus modeling tool and was showcased as a demo at MODELS'22. The code of the plugin is freely available via GitHub. • The Abstract Communicating Systems (ACS) methodology can support designing complex platforms based on Systems of Systems. • ACS was mapped onto UML 2.5, and the ACS modeling tool (ACSmt) is the first tool implementing the ACS methodology. • ACSmt is implemented as an Eclipse Papyrus plugin, which supports UML 2.5 and is well-accepted in industry. • ACSmt allows for verifying structural properties of the designed SoS. • The open source code of ACSmt can be used as a reference when implementing plugins for Papyrus. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
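One bullet notes that ACSmt can verify structural properties of a designed SoS; a minimal stand-in for such a check, on an invented dictionary representation rather than ACSmt's UML 2.5 models, looks like this:

```python
# Toy structural property: every connection in the SoS model must reference
# declared constituent systems. The data layout is invented for illustration.
def connections_well_formed(sos_model):
    systems = set(sos_model["systems"])
    return all(src in systems and dst in systems
               for src, dst in sos_model["connections"])

good = connections_well_formed(
    {"systems": ["Grid", "Plant"], "connections": [("Grid", "Plant")]})
bad = connections_well_formed(
    {"systems": ["Grid"], "connections": [("Grid", "Plant")]})
```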
34. SMIwiz: An integrated toolbox for multidimensional seismic modelling and imaging.
- Author
-
Yang, Pengliang
- Subjects
- *
IMAGING systems in seismology , *FINITE difference time domain method , *NONLINEAR programming , *OPEN source software , *THREE-dimensional imaging , *PROGRAMMING languages - Abstract
This paper contributes an open-source software package, SMIwiz, which integrates seismic modelling, reverse time migration (RTM), and full waveform inversion (FWI) into a unified computer implementation. SMIwiz has the machinery to do both 2D and 3D simulation in a consistent manner. The package features a number of computational recipes for efficient calculation of the imaging condition and inversion gradient: a dynamically evolving computing box to limit the simulation cube and a well-designed wavefield reconstruction strategy to reduce memory consumption when dealing with 3D problems. The modelling in SMIwiz runs independently: each shot corresponds to one processor in a bijective manner to maximize scalability. A batchwise job scheduling strategy is designed to handle large 3D imaging tasks on computers with a limited number of cores. The viability of SMIwiz is demonstrated by a number of applications on benchmark models. Program Title: SMIwiz CPC Library link to program files: https://doi.org/10.17632/tygszns27k.1 Developer's repository link: https://github.com/yangpl/SMIwiz Licensing provisions: GNU General Public License v3.0 Programming language: C, Shell, Fortran External dependencies: MPI [1], FFTW [2] Nature of problem: Seismic modelling and imaging (FWI and RTM) Solution method: High-order finite-difference time-domain (FDTD) modelling on a staggered grid; quasi-Newton L-BFGS algorithm for nonlinear optimization; line search to estimate step length based on the Wolfe condition [1] https://www.mpich.org/ [2] http://fftw.org/ [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
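The FDTD kernel named in the solution method can be sketched in one spatial dimension; this pure-Python toy (SMIwiz itself is written in C/Fortran) uses a second-order stencil and a CFL number below 1 for stability.

```python
# One leapfrog step of the 1D acoustic wave equation on a regular grid
# with fixed (zero) boundaries. cfl2 = (c*dt/dx)**2 must stay <= 1.
def fdtd_step(p_prev, p_curr, cfl2):
    n = len(p_curr)
    p_next = [0.0] * n
    for i in range(1, n - 1):
        lap = p_curr[i - 1] - 2.0 * p_curr[i] + p_curr[i + 1]
        p_next[i] = 2.0 * p_curr[i] - p_prev[i] + cfl2 * lap
    return p_next

# Point disturbance in the middle of a 101-point grid, 50 steps at CFL 0.5.
p_prev, p_curr = [0.0] * 101, [0.0] * 101
p_curr[50] = 1.0
for _ in range(50):
    p_prev, p_curr = p_curr, fdtd_step(p_prev, p_curr, 0.25)
```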
35. MLQD: A package for machine learning-based quantum dissipative dynamics.
- Author
-
Ullah, Arif and Dral, Pavlo O.
- Subjects
- *
QUANTUM theory , *ARTIFICIAL intelligence , *PYTHON programming language , *MACHINE learning , *CONVOLUTIONAL neural networks , *QUANTUM wells - Abstract
Machine learning has emerged as a promising paradigm to study the quantum dissipative dynamics of open quantum systems. To facilitate the use of our recently published ML-based approaches for quantum dissipative dynamics, here we present an open-source Python package, MLQD (https://github.com/Arif-PhyChem/MLQD), which currently supports three ML-based quantum dynamics approaches: (1) the recursive dynamics with the kernel ridge regression (KRR) method, (2) the non-recursive artificial-intelligence-based quantum dynamics (AIQD) approach, and (3) the blazingly fast one-shot trajectory learning (OSTL) approach, where both AIQD and OSTL use convolutional neural networks (CNNs). This paper describes the features of the MLQD package, the technical details, optimization of hyperparameters, visualization of results, and a demonstration of MLQD's applicability for two widely studied systems, namely the spin-boson model and the Fenna–Matthews–Olson (FMO) complex. To make MLQD more user-friendly and accessible, we have made it available on the Python Package Index (PyPI) platform, where it can be installed via pip. In addition, it is also available on the XACS cloud computing platform (https://XACScloud.com) via the interface to the MLatom package (http://MLatom.com). Program Title: MLQD CPC Library link to program files: https://doi.org/10.17632/yxp37csy5x.1 Developer's repository link: https://github.com/Arif-PhyChem/MLQD Code Ocean capsule: https://codeocean.com/capsule/5563143/tree Licensing provisions: Apache Software License 2.0 Programming language: Python 3.0 Supplementary material: Jupyter Notebook-based tutorials External routines/libraries: Tensorflow, Scikit-learn, Hyperopt, Matplotlib, MLatom Nature of problem: Fast propagation of quantum dissipative dynamics with machine learning approaches.
Solution method: We have developed MLQD as a comprehensive framework that streamlines and supports the implementation of our recently published machine learning-based approaches for efficient propagation of quantum dissipative dynamics. This framework encompasses: (1) the recursive dynamics with kernel ridge regression (KRR) method, as well as the non-recursive approaches utilizing convolutional neural networks (CNN), namely (2) artificial intelligence-based quantum dynamics (AIQD), and (3) one-shot trajectory learning (OSTL). Additional comments including restrictions and unusual features: 1. Users can train a machine learning (ML) model following one of the ML-based approaches: KRR, AIQD and OSTL. 2. Users have the option to propagate dynamics with the existing trained ML models. 3. MLQD also provides the transformation of trajectories into the training data. 4. MLQD also supports hyperparameter optimization using MLatom's grid search functionality for KRR, and Bayesian methods with the Tree-structured Parzen Estimator (TPE) for CNN models via the Hyperopt package. 5. MLQD also facilitates the visualization of results via auto-plotting. 6. MLQD is designed to be user-friendly and easily accessible, with availability on the XACS cloud computing platform (https://XACScloud.com) via the interface to the MLatom package (http://MLatom.com). In addition, it is also available as a pip package, which makes it easy to install. Future outlook: MLQD will be extended to more realistic systems along with the incorporation of other machine learning-based approaches as well as the traditional quantum dynamics methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
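MLQD's KRR mode learns the map from the current state to the next one and applies it recursively; the closed-form two-point sketch below captures that recursion in one scalar dimension and is not MLQD's actual interface (the training data is a made-up decay map).

```python
import math

# Kernel ridge regression with exactly two training points, solved in
# closed form, then applied recursively: a toy version of recursive
# dynamics propagation with a Gaussian kernel.
def kernel(a, b, sigma=1.0):
    return math.exp(-((a - b) ** 2) / (2.0 * sigma ** 2))

def krr_predict(xs, ys, x, lam=1e-8):
    # Solve (K + lam*I) alpha = y for a 2x2 system analytically.
    k11 = kernel(xs[0], xs[0]) + lam
    k22 = kernel(xs[1], xs[1]) + lam
    k12 = kernel(xs[0], xs[1])
    det = k11 * k22 - k12 * k12
    a0 = (k22 * ys[0] - k12 * ys[1]) / det
    a1 = (k11 * ys[1] - k12 * ys[0]) / det
    return a0 * kernel(xs[0], x) + a1 * kernel(xs[1], x)

# Train on two samples of the map x -> 0.9*x, then propagate recursively.
xs, ys = [1.0, 0.5], [0.9, 0.45]
state, trajectory = 1.0, [1.0]
for _ in range(3):
    state = krr_predict(xs, ys, state)
    trajectory.append(state)
```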
36. Detecting depression tendency with multimodal features.
- Author
-
Zhang, Hui, Wang, Hong, Han, Shu, Li, Wei, and Zhuang, Luhe
- Subjects
- *
USER-generated content , *SOCIAL media , *KNOWLEDGE representation (Information theory) , *MENTAL depression , *WORD frequency , *MENTAL health , *NAIVE Bayes classification - Abstract
• Our MTDD model is an integrated knowledge-driven and data-driven model. This approach avoids the problems that arise from purely data-driven techniques, such as ignoring expert experience and insight, neglecting the overall picture, and lacking interpretability. It not only utilizes text features and semantic features but also applies domain knowledge to learn the representation of depression tendency, making the model more robust. In other words, the model combines text features, semantic features, and domain knowledge. The Word2Vec word embedding integrates the emotional information of the words in the emotional dictionary; the model expands the existing emotional dictionary, extracts the TF-IDF word frequency feature, and applies seven grammatical analysis rules to obtain a text emotional value feature, making it better suited to the depression tendency classification task. • The MTDD model is a deep neural network hybrid model, which circumvents the weak generalization ability of a single model for identifying depression tendencies. Specifically, the MTDD model combines the advantages of CNN and BiLSTM networks: the CNN can extract the local features of the text, while the BiLSTM can effectively capture bidirectional semantic information. This combination better represents text features and improves the model's classification accuracy. • Our MTDD model is trained on real data, making it more suitable for practical problems. As far as we know, many existing depression detection methods are trained only on experimental data sets, so their generalization ability is limited and they often cannot be applied in realistic settings. In comparison, the MTDD model is trained on social platform data, making the data more objective and accurate. In addition, social platform data can be obtained at low cost.
It is easy to work with and does not require laborious labeling. Moreover, our approach avoids the influence of subjective factors inherent in consultation with mental health experts and the problems of non-public and imperfect depression data. • We conducted extensive experiments on a Reddit data set and a Twitter data set. The results show that, compared with several of the latest depression detection models, our MTDD model detects users who may be depressed with a 95% F1 score, obtaining SOTA results. Background and Objective: Depression can severely impact physical and mental health and may even harm society. Therefore, detecting the early symptoms of depression and treating them in time is critical. The widespread use of social media has led individuals with depressive tendencies to express their emotions on social platforms, share their painful experiences, and seek support and help. The massive amounts of available social platform data therefore make it possible to identify depressive tendencies. Methods: This paper proposes a hybrid neural network model, MTDD, to achieve this goal. Analyzing the content of users' posts on social platforms facilitates a post-level method for detecting depressive tendencies in individuals. Compared with existing methods, the MTDD model introduces the following innovations: First, the model is based on social platform data, which is objective and accurate, can be obtained at low cost, and is easy to work with. This avoids the influence of subjective factors in detection methods based on consultation with mental health experts, as well as the problem of undisclosed and imperfect data in depressive tendency detection.
Second, the MTDD model is a deep neural network hybrid model, combining the advantages of CNN and BiLSTM networks and avoiding the poor generalization ability of a single model for depression tendency recognition. Third, the MTDD model is based on multimodal features for learning the vector representation of depression-prone text, including text features, semantic features, and domain knowledge, making the model more robust. Results: Extensive experimental results demonstrate that our MTDD model detects users who may have a depressive tendency with a 95% F1 score, obtaining SOTA results. Conclusions: Our MTDD model can detect depressive users on social media platforms more effectively, opening the way to early diagnosis and timely treatment of depression. The experiments show that our MTDD model outperforms many of the latest depressive tendency detection models. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
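Among the text features the abstract lists, TF-IDF is the most mechanical; a bare-bones extractor over tokenized posts looks like the sketch below (the tokens are invented, and the paper's full feature set also includes emotion-dictionary and grammatical-rule features).

```python
import math

# Minimal TF-IDF over tokenized documents: term frequency within a post
# times the log inverse document frequency across the corpus.
def tfidf(docs):
    n = len(docs)
    df = {}
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    features = []
    for doc in docs:
        tf = {t: doc.count(t) / len(doc) for t in set(doc)}
        features.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return features

posts = [["sad", "alone", "sad"], ["happy", "alone"]]
feats = tfidf(posts)
```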
37. Multi-focus image fusion using structure-guided flow.
- Author
-
Duan, Zhao, Luo, Xiaoliu, and Zhang, Taiping
- Subjects
- *
IMAGE fusion , *CAPSULE neural networks , *CONVOLUTIONAL neural networks , *DEEP learning , *SUPERVISED learning - Abstract
• Introduce a capsule network for multi-focus image fusion. • Utilize structure information to help locate focus regions. • Design a structure-guided flow module to integrate structure features. Existing deep learning based methods have shown their advantages in the multi-focus image fusion task. However, most methods still suffer from inaccurate focus region detection. In this paper, we employ the part-whole relationships embedded by the Capsule Network (CapsNet) to address the problem. Specifically, we introduce CapsNet into the multi-focus image fusion task and design a structure-guided flow module, which fully utilizes structure information to help locate focus regions. CapsNet is introduced to extract structure features by supervising gradient information of the image. Compared with traditional convolutional neural networks (CNNs), CapsNet takes into account the correlation of features from different positions, such that it encodes more compact features. Once structure features are obtained, a flow alignment module is introduced to learn the flow field between structure and image features and to effectively propagate structure features to image features for confident focus region detection. Experimental results show the proposed method achieves robust fusion performance on three publicly available multi-focus datasets, and outperforms or is comparable to the state-of-the-art methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
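As a point of contrast with the learned structure features in the paper, the classical focus cue is local Laplacian energy; this pure-Python toy selects, per pixel, the source image with stronger local structure (a baseline idea, not the proposed CapsNet method).

```python
# Per-pixel focus selection by local Laplacian energy on tiny grayscale
# "images" (lists of lists); border pixels are copied from the first input.
def laplacian_energy(img, i, j):
    lap = (img[i - 1][j] + img[i + 1][j] + img[i][j - 1] + img[i][j + 1]
           - 4 * img[i][j])
    return lap * lap

def fuse(img_a, img_b):
    h, w = len(img_a), len(img_a[0])
    fused = [row[:] for row in img_a]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if laplacian_energy(img_b, i, j) > laplacian_energy(img_a, i, j):
                fused[i][j] = img_b[i][j]
    return fused

flat = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]   # defocused region: no structure
sharp = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]  # in-focus region: strong edge
result = fuse(flat, sharp)
```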
38. Reasonable design of Sm-modified Cu-based catalyst for NH3-SCO: Role of the amide intermediates.
- Author
-
Lv, Dengke, Liu, Jun, Zhang, Guojie, Wang, Ying, Ge, Shiwei, Zhao, Yuqiong, and Li, Guoqiang
- Subjects
- *
SELECTIVE catalytic oxidation , *CATALYSTS , *CATALYTIC activity , *TITANIUM dioxide , *WATER vapor , *ATMOSPHERIC ammonia - Abstract
The selective catalytic oxidation of ammonia (NH3-SCO) is currently the most effective method for eliminating NH3. However, one of the major challenges for NH3-SCO is the development of a catalyst capable of completely converting NH3 into harmless N2 and water vapor. In this paper, a highly N2-selective catalyst prepared by the sol-gel method, using TiO2 as the support, Cu as the active species, and Sm as the auxiliary agent, is presented. Compared to the traditional Cu/TiO2 catalyst, the 4SmCu/TiO2 catalyst has higher catalytic activity and N2 selectivity at low temperatures: NH3 conversion can reach 100%, and N2 selectivity can be maintained at 100% at 275 °C. The excellent catalytic activity is attributed to the highly dispersed active species, abundant Lewis acid sites (LASs), and the generation of large amounts of surface-adsorbed oxygen. In addition, doping with Sm species causes TiO2 lattice distortion, and the distorted TiO2 in turn carries more active material. Moreover, in-situ DRIFTS analysis suggests that both the 4SmCu/TiO2 and Cu/TiO2 catalysts follow the "internal" selective catalytic reduction (iSCR) mechanism during NH3-SCO reactions. The 4SmCu/TiO2 catalyst generates more amides (-NH2), which reduces the non-selective oxidation of the catalysts and promotes the formation of N2. This provides a new idea and method for the selective catalytic oxidation of NH3 to N2 and water vapor using Cu-based catalysts. • The 4SmCu/TiO2 catalyst converts all NH3 into N2 and water vapor at 275 °C. • The lattice distortion of TiO2 can support more active species. • Doping with Sm species increases the content of Lewis acid sites and surface-adsorbed oxygen. • The rich amide intermediates reduce the non-selective oxidation of NH3 and O2. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
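The reported figures (100% NH3 conversion and 100% N2 selectivity at 275 °C) follow the standard definitions below; the functions are a generic sketch and the example concentrations are made up, not data from the paper.

```python
# NH3 conversion: fraction of inlet NH3 consumed (inlet/outlet in ppm).
def nh3_conversion(c_in, c_out):
    return 100.0 * (c_in - c_out) / c_in

# N2 selectivity: fraction of converted nitrogen atoms ending up in N2
# (N2 and N2O carry two N atoms each; NO and NO2 carry one).
def n2_selectivity(n2, no, no2, n2o):
    total_n = 2.0 * n2 + no + no2 + 2.0 * n2o
    return 100.0 * 2.0 * n2 / total_n

conversion = nh3_conversion(500.0, 0.0)             # all NH3 consumed
selectivity = n2_selectivity(250.0, 0.0, 0.0, 0.0)  # all N ends up in N2
```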
39. ChampKit: A framework for rapid evaluation of deep neural networks for patch-based histopathology classification.
- Author
-
Kaczmarzyk, Jakub R., Gupta, Rajarsi, Kurc, Tahsin M., Abousamra, Shahira, Saltz, Joel H., and Koo, Peter K.
- Subjects
- *
DEEP learning , *ARTIFICIAL neural networks , *TRANSFORMER models , *COMPUTER vision , *HISTOPATHOLOGY , *CHOICE (Psychology) - Abstract
• We present ChampKit, a Python-based software package that enables rapid exploration and evaluation of deep learning models for patch-level classification of histopathology data. It is open source and available at https://github.com/SBU-BMI/champkit. ChampKit is designed to be highly reproducible and enables systematic, unbiased evaluation of patch-level histopathology classification. It incorporates public datasets for six clinically important tasks and access to hundreds of (pre-trained) deep learning models. It can easily be extended to custom patch classification datasets and custom deep learning architectures. • The intended users are (1) biomedical research groups interested in finding and fine-tuning the best models to analyze a broad collection of whole slide images, and (2) deep learning methods research groups interested in systematically and quickly evaluating their methods against a set of state-of-the-art methods with different pretraining and transfer learning configurations. • We demonstrate the utility of ChampKit by evaluating two ResNet models and one vision transformer on six diverse classification tasks for patch-level histopathology datasets. We did not find consistent benefits from pretrained models versus random initialization across the different datasets, which suggests that a thorough exploration of model architectures is important to identify optimal models for a given dataset. Histopathology is the gold standard for diagnosis of many cancers. Recent advances in computer vision, specifically deep learning, have facilitated the analysis of histopathology images for many tasks, including the detection of immune cells and microsatellite instability. However, it remains difficult to identify optimal models and training configurations for different histopathology classification tasks due to the abundance of available architectures and the lack of systematic evaluations.
Our objective in this work is to present a software tool that addresses this need and enables robust, systematic evaluation of neural network models for patch classification in histology in a lightweight, easy-to-use package for both algorithm developers and biomedical researchers. Here we present ChampKit (Comprehensive Histopathology Assessment of Model Predictions toolKit): an extensible, fully reproducible evaluation toolkit that serves as a one-stop shop to train and evaluate deep neural networks for patch classification. ChampKit curates a broad range of public datasets. It enables training and evaluation of models supported by timm directly from the command line, without the need for users to write any code. External models are enabled through a straightforward API and minimal coding. As a result, ChampKit facilitates the evaluation of existing and new models and deep learning architectures on pathology datasets, making it more accessible to the broader scientific community. To demonstrate the utility of ChampKit, we establish baseline performance for a subset of possible models that could be employed with ChampKit, focusing on several popular deep learning models, namely ResNet18, ResNet50, and R26-ViT, a hybrid vision transformer. In addition, we compare each model trained either from random weight initialization or with transfer learning from ImageNet pretrained models. For ResNet18, we also consider transfer learning from a self-supervised pretrained model. The main result of this paper is the ChampKit software. Using ChampKit, we were able to systematically evaluate multiple neural networks across six datasets. We observed mixed results when evaluating the benefits of pretraining versus random initialization, with no clear benefit except in the low data regime, where transfer learning was found to be beneficial. 
Surprisingly, we found that transfer learning from self-supervised weights rarely improved performance, which runs counter to findings in other areas of computer vision. Choosing the right model for a given digital pathology dataset is nontrivial. ChampKit provides a valuable tool to fill this gap by enabling the evaluation of hundreds of existing (or user-defined) deep learning models across a variety of pathology tasks. Source code and data for the tool are freely accessible at https://github.com/SBU-BMI/champkit. [ABSTRACT FROM AUTHOR]
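The pretraining-versus-random-initialization comparison the authors describe can be viewed as a small evaluation grid over (architecture, initialization) pairs. A minimal sketch in Python, with a hypothetical user-supplied `evaluate_fn` standing in for ChampKit's actual command-line-driven training and evaluation:

```python
from itertools import product

def evaluate_grid(architectures, init_schemes, evaluate_fn):
    """Evaluate every (architecture, initialization) pair on one task.

    `evaluate_fn(arch, init)` is assumed to train a model and return a
    scalar metric (e.g. accuracy); ChampKit drives the equivalent loop
    from the command line via timm, so this callable is illustrative only.
    """
    return {(arch, init): evaluate_fn(arch, init)
            for arch, init in product(architectures, init_schemes)}

# Toy stand-in scores instead of a real training run.
scores = {("resnet18", "random"): 0.81, ("resnet18", "imagenet"): 0.84}
grid = evaluate_grid(["resnet18"], ["random", "imagenet"],
                     lambda arch, init: scores[(arch, init)])
best = max(grid, key=grid.get)  # the (architecture, init) pair to prefer
```

The paper's observation that no initialization wins consistently corresponds to `best` changing from dataset to dataset in such a grid.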
- Published
- 2023
- Full Text
- View/download PDF
40. VazaDengue: An information system for preventing and combating mosquito-borne diseases with social networks.
- Author
-
Sousa, Leonardo, de Mello, Rafael, Cedrim, Diego, Garcia, Alessandro, Missier, Paolo, Uchôa, Anderson, Oliveira, Anderson, and Romanovsky, Alexander
- Subjects
- *
MEDICAL software , *PREVENTIVE medicine , *DENGUE - Abstract
Dengue is a disease transmitted by the Aedes aegypti mosquito, which also transmits the Zika virus and Chikungunya. Unfortunately, populations in many countries suffer from the diseases transmitted by this mosquito. Communities should play an important role in combating and preventing mosquito-borne diseases. However, due to the limited engagement of the population, new solutions need to be used to strengthen mosquito surveillance. VazaDengue is one of these solutions, offering users a web and mobile platform for preventing and combating mosquito-borne diseases. The system relies on social actions of citizens reporting mosquito breeding sites and dengue cases, with the reports made available to the community and health agencies. To address the limited population engagement, the system proactively monitors social media networks such as Twitter to enrich the information provided by the system. It processes the natural language text from the network to classify tweets according to a set of predefined categories. After classification, the relevant tweets are provided to the users as reports. In this paper, we describe the VazaDengue features, including its ability to harvest and classify tweets. Since the VazaDengue system aims to strengthen the entomological surveillance of the mosquito that transmits Dengue, Zika, and Chikungunya by providing geolocated reports, we present here two studies to evaluate its potential contributions. The first evaluation uses a survey conducted in the Brazilian community of health agents. The goal is to evaluate the relevance of the classified tweets according to the health agents' perspective. The second study compares the official reports of the 2015–2016 epidemic waves in Brazil with the concentration of mosquito-related tweets found by VazaDengue. The goal is to verify if the concentration of tweets can be used for monitoring the mosquito manifestation in big cities. 
The results of these two evaluations are encouraging. For instance, we have found that the health agents tend to agree with the relevance of the classified tweets. Moreover, the concentration of tweets is likely to be effective for monitoring big cities. The results of these evaluations are helping us to improve the VazaDengue system further. These improvements will make the VazaDengue system even more useful for combating and preventing mosquito-borne diseases. [ABSTRACT FROM AUTHOR]
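The tweet-categorization step described in the abstract can be illustrated with a toy keyword matcher. This is only a sketch: the category names and keywords below are hypothetical, and the actual system applies natural-language processing rather than keyword lookup.

```python
# Hypothetical categories and keywords, for illustration only.
CATEGORIES = {
    "breeding_site": ("standing water", "stagnant", "tire"),
    "dengue_case": ("dengue", "fever", "symptoms"),
}

def classify_tweet(text):
    """Assign a tweet to the first category whose keywords match,
    falling back to 'irrelevant' so non-matching tweets can be filtered
    out before reports are shown to health agents."""
    lowered = text.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in lowered for keyword in keywords):
            return category
    return "irrelevant"
```

Geolocating and aggregating the non-irrelevant tweets per city would then give the tweet-concentration signal the second study evaluates.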
- Published
- 2018
- Full Text
- View/download PDF
41. mtCMF: A novel memory table based content management framework for automatic website generation.
- Author
-
Bandirmali, Necla
- Subjects
- *
INTERNET content management systems , *APPLICATION program interfaces - Abstract
The pressing need for tools with features similar to Content Management Systems (CMS), offering Application Programming Interfaces (APIs) and providing flexibility for specific business and user demands, is the key motivation behind the work presented in this paper. A novel Content Management Framework (CMF), called mtCMF, is introduced. It offers user-friendly wizards and a state-of-the-art adaptive structure. It imposes no constraints in terms of a predefined website structure, data types, or database tables. Business objects can be easily defined by using the proposed mtCMF, and database tables are created from these objects automatically. mtCMF has an adaptive scaffolding architecture that generates Create, Read, Update, and Delete (CRUD) screens on the fly for all types of database tables. It has a flexible localization option to support multiple languages, and delivers Representational State Transfer (REST) services for mobile clients and remote application development. Its simple structure and novel wizard-based user-friendly interface make it superior to traditional CMS tools and web development frameworks. Performance analysis of mtCMF was carried out using the Blackfire Profiler. The results show that mtCMF achieves better performance than its counterparts Laravel, Symfony, WordPress, and Joomla. [ABSTRACT FROM AUTHOR]
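The on-the-fly CRUD scaffolding idea can be illustrated with a toy generator that maps a business-object definition to table and query templates. Everything here (the function name, the field layout) is hypothetical and only sketches the concept; mtCMF itself generates real database tables and web screens from such definitions.

```python
def scaffold_crud(entity, fields):
    """Derive a table definition and parameterized CRUD statements from a
    business-object description given as {field_name: sql_type}."""
    cols = ", ".join(f"{name} {ftype}" for name, ftype in fields.items())
    placeholders = ", ".join("?" for _ in fields)
    return {
        "create_table": f"CREATE TABLE {entity} (id INTEGER PRIMARY KEY, {cols})",
        "create": f"INSERT INTO {entity} ({', '.join(fields)}) VALUES ({placeholders})",
        "read": f"SELECT * FROM {entity} WHERE id = ?",
        "update": f"UPDATE {entity} SET {', '.join(f'{f} = ?' for f in fields)} WHERE id = ?",
        "delete": f"DELETE FROM {entity} WHERE id = ?",
    }
```

Defining a new business object then costs one dictionary rather than hand-written screens and schema, which is the productivity argument the abstract makes.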
- Published
- 2018
- Full Text
- View/download PDF
42. MLweb: A toolkit for machine learning on the web.
- Author
-
Lauer, Fabien
- Subjects
- *
OPEN source software , *JAVASCRIPT programming language - Abstract
This paper describes MLweb, an open source software toolkit for machine learning on the web. The specificity of MLweb is that all computations are performed on the client side without the need to send data to a third-party server. MLweb includes three main components: a JavaScript API for scientific computing (LALOLib), an extension of this library with machine learning tools (ML.js) and an online development environment (LALOLab) with many examples. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
43. Improved designs of digit-by-digit decimal multiplier.
- Author
-
Ahmed, Syed Ershad, Varma, Santosh, and Srinivas, M.B.
- Subjects
- *
CONVERTERS (Electronics) , *ANALOG multipliers , *BINARY-coded decimal system , *COMPUTER architecture , *IEEE 802 standard - Abstract
Decimal multiplication is a ubiquitous operation that is inherently complex in terms of partial product generation and accumulation. In this paper, the authors propose a generalized design approach and architectural framework for ‘digit-by-digit’ multiplication. Decimal partial products are generated in parallel using fast and area-efficient BCD digit multipliers, and their reduction is achieved using hybrid multi-operand binary-to-decimal converters. In contrast to most previous implementations, which propose changes either in partial product generation or in reduction, this work proposes modifications at both the partial product generation and reduction stages, resulting in improved performance. A comprehensive analysis of synthesis results (for an IEEE-compliant 16-digit decimal multiplier architecture) indicates an improvement in delay of 8–29% and a reduced area-delay product of 4–38% compared to similar work published previously. [ABSTRACT FROM AUTHOR]
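The two-stage flow the authors modify, parallel partial product generation followed by reduction, can be mimicked in software. A plain-Python sketch, with integer digit products standing in for the hardware BCD digit multipliers and carry propagation standing in for the binary-to-decimal converters:

```python
def digit_by_digit_multiply(a, b):
    """Software analogue of a two-stage decimal multiplier: per-digit
    partial products, then column accumulation with carry propagation."""
    da = [int(d) for d in str(a)][::-1]  # least significant digit first
    db = [int(d) for d in str(b)][::-1]
    cols = [0] * (len(da) + len(db))
    for i, x in enumerate(da):           # stage 1: partial product generation
        for j, y in enumerate(db):
            cols[i + j] += x * y
    carry, digits = 0, []
    for c in cols:                       # stage 2: accumulation with carries
        carry, d = divmod(c + carry, 10)
        digits.append(d)
    return int("".join(str(d) for d in digits[::-1]))
```

In hardware the inner products run in parallel and the column sums are compressed by the converters, which is where the reported delay savings come from; this serial version only shows the dataflow.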
- Published
- 2018
- Full Text
- View/download PDF
44. Geo3DML: A standard-based exchange format for 3D geological models.
- Author
-
Wang, Zhangang, Qu, Honggang, Wu, Zixing, and Wang, Xianghong
- Subjects
- *
GEOLOGY databases , *GEOLOGICAL modeling , *GEOLOGICAL surveys - Abstract
A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object, but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels based on the existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, which is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS). [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
45. SelEQ: An advanced ground motion record selection and scaling framework.
- Author
-
Macedo, L. and Castro, J.M.
- Subjects
- *
SEISMOLOGY software , *OPEN source software , *NONLINEAR dynamical systems - Abstract
The consensus that ground motion record selection plays an important role in non-linear dynamic structural response has motivated numerous research studies seeking the definition of accurate ground motion record selection techniques. However, most of the available tools only allow record selection based on spectral compatibility between the mean response spectrum of a record suite and a target response spectrum. This paper presents SelEQ, a fully integrated framework that implements state-of-the-art procedures for ground motion record selection and scaling. In addition to typical record selection procedures, SelEQ can derive the Conditional Mean Spectrum (CMS) for the European territory, making use of the open source platform OpenQuake and the recently proposed SHARE seismic hazard model. This important feature allows state-of-the-art record selection for probabilistic-based assessment and risk analysis. SelEQ incorporates a number of procedures available in the literature that facilitate preliminary record selection (e.g. disaggregation for a specific site) and that allow advanced selection criteria (e.g. control of the mismatch of individual ground motion records). The framework makes use of the Adaptive Harmony Search meta-heuristic optimization algorithm to significantly reduce computational cost and analysis time, whilst still meeting the imposed selection constraints. Application examples of the framework indicate that it can accurately select suites of ground motion records for code-based and probabilistic-based seismic assessment. [ABSTRACT FROM AUTHOR]
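The spectral-compatibility criterion mentioned above, comparing the mean response spectrum of a record suite against a target spectrum, can be sketched generically. This only illustrates the criterion; SelEQ's actual implementation and function names differ.

```python
def mean_spectrum(spectra):
    """Average the spectral ordinates of a suite of records, period by period.
    Each record's spectrum is a list of ordinates on a common period grid."""
    n = len(spectra)
    return [sum(ordinates) / n for ordinates in zip(*spectra)]

def max_mismatch(spectrum, target):
    """Largest relative deviation from the target spectrum across periods,
    the kind of mismatch control applied to candidate suites or records."""
    return max(abs(s - t) / t for s, t in zip(spectrum, target))
```

A suite would be accepted when `max_mismatch(mean_spectrum(suite), target)` falls below a chosen tolerance; the optimization search then looks for suites satisfying this constraint at minimal cost.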
- Published
- 2017
- Full Text
- View/download PDF
46. AmbieGen: A search-based framework for autonomous systems testing.
- Author
-
Humeniuk, Dmytro, Khomh, Foutse, and Antoniol, Giuliano
- Subjects
- *
DRIVERLESS cars , *TEST systems , *AUTONOMOUS robots , *MOBILE robots , *EVOLUTIONARY algorithms , *AUTONOMOUS vehicles , *SEARCH algorithms - Abstract
Thorough testing of safety-critical autonomous systems, such as self-driving cars, autonomous robots, and drones, is essential for detecting potential failures before deployment. One crucial testing stage is model-in-the-loop testing, where the system model is evaluated by executing various scenarios in a simulator. However, the search space of possible parameters defining these test scenarios is vast, and simulating all combinations is computationally infeasible. To address this challenge, we introduce AmbieGen, a search-based test case generation framework for autonomous systems. AmbieGen uses evolutionary search to identify the most critical scenarios for a given system, and has a modular architecture that allows for the addition of new systems under test, algorithms, and search operators. Currently, AmbieGen supports test case generation for autonomous robots and autonomous car lane keeping assist systems. In this paper, we provide a high-level overview of the framework's architecture and demonstrate its practical use cases. • AmbieGen is an evolutionary algorithm based test scenario generation tool. • The search algorithm maximizes the difficulty of test scenarios as well as their diversity. • The tool is customizable and can be used to test different robotic systems. • Current tool version includes test scenario generation for autonomous vehicles and mobile robots. • The tool can be accessed at: https://github.com/swat-lab-optimization/AmbieGen-tool. [ABSTRACT FROM AUTHOR]
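The evolutionary search at AmbieGen's core can be sketched as a minimal single-objective loop. The real tool is modular and balances scenario difficulty with diversity across systems under test; the road-curvature "scenario" below is a made-up stand-in for illustration.

```python
import random

def evolve(fitness, mutate, init_pop, generations=50, seed=0):
    """Minimal elitist evolutionary search: mutate, pool with parents,
    keep the fittest. Sketches the kind of loop a search-based test
    generator is built on (single objective here, for brevity)."""
    rng = random.Random(seed)
    pop = list(init_pop)
    for _ in range(generations):
        offspring = [mutate(rng.choice(pop), rng) for _ in pop]
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:len(pop)]
    return pop[0]

# Toy scenario: a road curvature value; sharper curves are harder for a
# lane-keeping system, but curvature above 1.0 is treated as infeasible.
fitness = lambda c: c if c <= 1.0 else -1.0
mutate = lambda c, rng: c + rng.uniform(-0.1, 0.1)
best = evolve(fitness, mutate, [0.1, 0.2, 0.3])
```

In a real setup `fitness` would run the scenario in a simulator and score how close the system comes to failure, which is far more expensive than this toy objective.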
- Published
- 2023
- Full Text
- View/download PDF
47. FLUST: A fast, open source framework for ultrasound blood flow simulations.
- Author
-
Ekroll, Ingvild Kinn, Saris, Anne E.C.M., and Avdal, Jørgen
- Subjects
- *
FLOW simulations , *SIGNAL integrity (Electronics) , *ULTRASONIC imaging , *DOPPLER ultrasonography , *BLOOD flow , *DIGITAL image correlation , *TRANSDUCERS , *IMAGING phantoms - Abstract
• We introduce the open source simulator FLUST, as part of the UltraSound ToolBox (USTB). • FLUST produces multiple realizations of ultrasound signals from flow fields. • High integrity signals are achieved at low computational cost. • Framework includes tools for visualization and assessment of estimator performance. • Database includes customizable acquisition setups, flow phantoms and estimators. Background and objective: Ultrasound based blood velocity estimation is a continuously developing frontier, where the vast number of possible acquisition setups and velocity estimators makes it challenging to assess which combination is better suited for a given imaging application. FLUST, the Flow-Line based Ultrasound Simulation Tool, may be used to address this challenge, providing a common platform for evaluation of velocity estimation schemes on in silico data. However, the FLUST approach had some limitations in its original form, including reduced robustness for phase sensitive setups and the need for manual selection of integrity parameters. In addition, implementation of the technique and therefore also documentation of signal integrity was left to potential users of the approach. Methods: In this work, several improvements to the FLUST technique are proposed and investigated, and a robust, open source simulation framework developed. The software supports several transducer types and acquisition setups, in addition to a range of different flow phantoms. The main goal of this work is to offer a robust, computationally cheap and user-friendly framework to simulate ultrasound data from stationary blood velocity fields and thereby facilitate design and evaluation of estimation schemes, including acquisition design, velocity estimation and other post-processing steps. 
Results: The technical improvements proposed in this work resulted in reduced interpolation errors, reduced variability in signal power, and also automatic selection of spatial and temporal discretization parameters. Results are presented illustrating the challenges and the effectiveness of the solutions. The integrity of the improved simulation framework is validated in an extensive study, with results indicating that speckle statistics, spatial and temporal correlation and frequency content all correspond well with theoretical predictions. Finally, an illustrative example shows how FLUST may be used throughout the design and optimization process of a velocity estimator. Conclusions: The FLUST framework is available as a part of the UltraSound ToolBox (USTB), and the results in this paper demonstrate that it can be used as an efficient and reliable tool for the development and validation of ultrasound-based velocity estimation schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
48. QMaxUSE: A new tool for verifying UML class diagrams and OCL invariants.
- Author
-
Wu, Hao
- Subjects
- *
ENGINEERING - Abstract
Formal verification of a UML class diagram annotated with OCL constraints has been a long-standing challenge in Model-driven Engineering. In the past decades, many tools and techniques have been proposed to tackle this challenge. However, they do not scale well and are often unable to locate the conflicts when the number of OCL constraints increases significantly. In this paper, we present a new tool called QMaxUSE. This tool is designed for verifying UML class diagrams annotated with a large number of OCL invariants. QMaxUSE is easy to install and deploy. It offers two distinct features: (1) a simple query language that allows users to choose parts of a UML class diagram to be verified, and (2) a new procedure that is capable of performing concurrent verification. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
49. OPSimTool: A custom tool for optical photon simulation in Geant4.
- Author
-
Kandemir, Mustafa
- Subjects
- *
PHOTONS , *PROGRAMMING languages , *C++ , *MAINTAINABILITY (Engineering) , *OPTICAL materials - Abstract
This paper introduces OPSimTool, a set of additions to the Geant4 toolkit that simplify the development of optical photon applications and enhance the flexibility and maintainability of developed applications. The tool also provides interfaces for creating reusable and portable material build code. This tool has been developed according to users' needs and perspectives, considering frequently asked questions, the most commonly encountered challenges, and evolving needs over time in the optical category of the Geant4 official forum page. Program title: OPSimTool CPC Library link to program files: https://doi.org/10.17632/6jtbxdnpm4.1 Developer's repository link: https://github.com/mkandemirr/OpSim Licensing provisions: GNU General Public License 3 Programming language: C++ External routines/libraries: Geant4, CMake Nature of problem: Although the toolkit provided by Geant4 is sufficient to perform optical photon simulations, additional tools are highly beneficial for developing more flexible and sustainable applications. Solution method: We have developed a set of C++ classes in the spirit of Geant4 and made them available to the public. • OPSimTool is a set of additions to the Geant4 toolkit that simplifies the development of optical photon applications. • OPSimTool increases the flexibility and maintainability of the developed code. • OPSimTool provides interfaces for creating reusable and portable material build code. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
50. ACFlow: An open source toolkit for analytic continuation of quantum Monte Carlo data.
- Author
-
Huang, Li
- Subjects
- *
MONTE Carlo method , *GREEN'S functions , *CONTINUATION methods , *STATISTICAL correlation , *FREDHOLM equations - Abstract
The purpose of analytic continuation is to establish a real frequency spectral representation of a single-particle or two-particle correlation function (such as the Green's function, self-energy function, and spin and charge susceptibilities) from noisy data generated in finite temperature quantum Monte Carlo simulations. It requires numerical solutions of a family of Fredholm integral equations of the first kind, which is a challenging task. In this paper, an open source toolkit (dubbed ACFlow) for analytic continuation of quantum Monte Carlo data is presented. We first give a short introduction to the analytic continuation problem. Next, three popular analytic continuation algorithms, including the maximum entropy method, the stochastic analytic continuation, and the stochastic optimization method, as implemented in this toolkit are reviewed. We then elaborate on the major features, implementation details, basic usage, inputs, and outputs of this toolkit. Finally, four representative examples, including analytic continuations of the Matsubara self-energy function, Matsubara and imaginary time Green's functions, and the current-current correlation function, are shown to demonstrate the usefulness and flexibility of the ACFlow toolkit. Program Title: ACFlow CPC Library link to program files: https://doi.org/10.17632/th6w74gwjm.1 Developer's repository link: https://github.com/huangli712/ACFlow Licensing provisions: GNU General Public License Version 3 Programming language: Julia Nature of problem: Most quantum Monte Carlo methods work on the imaginary axis. In order to extract physical observables and compare them with experimental results, analytic continuation must be done in the post-processing stage to convert the quantum Monte Carlo simulated data from the imaginary axis to the real axis. Solution method: Three well-established analytic continuation methods, including the maximum entropy method, the stochastic analytic continuation (both A. W. Sandvik's and K. S. D. 
Beach's algorithms), and the stochastic optimization method, have been implemented in the ACFlow toolkit. Additional comments including restrictions and unusual features: The ACFlow toolkit is written in pure Julia language. It is highly optimized and parallelized. It can be executed interactively in a Jupyter notebook environment. [ABSTRACT FROM AUTHOR]
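The Fredholm integral equation of the first kind mentioned in the abstract takes the form G(τ) = ∫ dω K(τ, ω) A(ω); for fermions the standard imaginary-time kernel is K(τ, ω) = e^(−τω) / (1 + e^(−βω)). A short forward-evaluation sketch (Python here for illustration; ACFlow itself is written in Julia, and inverting this relation is the hard part the toolkit addresses):

```python
import math

def kernel(tau, omega, beta):
    """Fermionic kernel K(tau, omega) = exp(-tau*omega) / (1 + exp(-beta*omega)),
    written in two branches to avoid floating-point overflow at large |omega|."""
    if omega >= 0:
        return math.exp(-tau * omega) / (1.0 + math.exp(-beta * omega))
    return math.exp((beta - tau) * omega) / (1.0 + math.exp(beta * omega))

def forward(spectrum, omegas, taus, beta):
    """Discretized forward map G(tau) = sum_i K(tau, omega_i) A(omega_i) dw
    on a uniform frequency grid; the inverse of this map is what analytic
    continuation methods like MaxEnt reconstruct from noisy G(tau)."""
    dw = omegas[1] - omegas[0]
    return [sum(kernel(t, w, beta) * a * dw
                for w, a in zip(omegas, spectrum)) for t in taus]
```

A useful sanity check is the identity K(0, ω) + K(β, ω) = 1, which implies G(0) + G(β) equals the integrated spectral weight.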
- Published
- 2023
- Full Text
- View/download PDF