5,084 results for "computational methods"
Search Results
2. Density Fluctuations in the Intracluster Medium: An Attempt to Constrain Viscosity with Cosmological Simulations.
- Author
Marin-Gilabert, Tirso, Steinwandel, Ulrich P., Valentini, Milena, Vallés-Pérez, David, and Dolag, Klaus
- Subjects
GALAXY clusters, ASTROPHYSICS, KINETIC energy, VISCOSITY, COLD regions
- Abstract
The impact of viscosity in the intracluster medium (ICM) is still an open question in astrophysics. To address this problem, we have run a set of cosmological simulations of three galaxy clusters with virial masses M_Vir > 10^15 M_⊙ at z = 0 using the smoothed particle magnetohydrodynamics code OpenGadget3. We aim to quantify the influence of viscosity and constrain its value in the ICM. Our results show significant morphological differences at small scales, temperature variations, and density fluctuations induced by viscosity. We observe a suppression of instabilities at small scales, resulting in a more filamentary structure and a larger number of small structures due to the lack of mixing with the medium. The conversion of kinetic to internal energy leads to an increase of the virial temperature of the cluster of ∼5%–10%, while the denser regions remain cold. The amplitudes of density and velocity fluctuations are found to increase with viscosity. However, comparison with observational data indicates that the simulations, regardless of the viscosity, match the observed slope of the amplitude of density fluctuations, challenging the direct constraint of viscosity solely through density fluctuations. Furthermore, the ratio of density to velocity fluctuations remains close to 1 regardless of the amount of viscosity, in agreement with theoretical expectations. Our results show, for the first time in a cosmological simulation of a galaxy cluster, the effect of viscosity in the ICM, a study that is currently missing in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Blending Borders and Sparking Change: Sidney Yip, Hybridity, and the Rise of Molecular Simulations in Cold War Materials Science.
- Author
Macuglia, Daniele
- Abstract
Between the mid-1970s and mid-1980s, molecular simulations emerged as a transformative force within materials science. Sidney Yip's early contributions at the Massachusetts Institute of Technology, alongside his involvement in the 1985 International School of Physics "Enrico Fermi" in Varenna, Italy, catalyzed the convergence of traditional methods with computational techniques and helped drive a redefinition of the discipline's epistemic and methodological boundaries. This article argues that Yip's biography and professional trajectory as a Chinese-born engineer and scientist in the United States during the Cold War facilitated the acceptance and advancement of molecular simulations within materials research. His work also attracted the interest of leaders from established fields, such as condensed matter physics and chemical physics, to explore the potential applications of these techniques in materials science. In examining his journey, this study illuminates the dual processes of cultural assimilation and hybridity, and highlights Yip's boundary work that promoted the integration of diverse epistemic traditions and heterogeneous communities. The analysis traces the epistemological transformations, methodological shifts, and the institutional and disciplinary dynamics that fostered the incorporation of molecular simulations into materials science. This examination foregrounds the co-construction of scientific knowledge and technological practice through Yip's boundary work, and offers an assessment of his contributions within the broader sociotechnical networks that shaped the field. Recognizing the paucity of existing historiography on the subject, this article aims to establish a framework based on primary sources that can serve as a foundation for future scholarly inquiry. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. The success rate of processed predicted models in molecular replacement: implications for experimental phasing in the AlphaFold era.
- Author
Keegan, Ronan M., Simpkin, Adam J., and Rigden, Daniel J.
- Subjects
PROTEIN structure prediction, CRYSTAL structure, OPEN-ended questions, FORECASTING
- Abstract
The availability of highly accurate protein structure predictions from AlphaFold2 (AF2) and similar tools has hugely expanded the applicability of molecular replacement (MR) for crystal structure solution. Many structures can be solved routinely using raw models, structures processed to remove unreliable parts, or models split into distinct structural units. There is therefore an open question around how many and which cases still require experimental phasing methods such as single-wavelength anomalous diffraction (SAD). Here, this question is addressed using a large set of PDB depositions that were solved by SAD. A large majority (87%) could be solved using unedited or minimally edited AF2 predictions. A further 18 (4%) yielded straightforwardly to MR after splitting of the AF2 prediction using Slice'N'Dice, although different splitting methods succeeded on slightly different sets of cases. It is also found that further unique targets can be solved by alternative modelling approaches such as ESMFold (four cases), alternative MR approaches such as ARCIMBOLDO and AMPLE (two cases each), and multimeric model building with AlphaFold-Multimer or UniFold (three cases). Ultimately, only 12 cases, or 3% of the SAD-phased set, did not yield to any form of MR tested here, offering valuable hints as to the number and the characteristics of cases where experimental phasing remains essential for macromolecular structure solution. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. The power of digital activism for transnational advocacy: Leadership, engagement, and affordance.
- Author
Cheng, Edmund W, Lui, Elizabeth, and Fu, King-wa
- Subjects
PUBLIC demonstrations, POLITICAL communication, DIGITAL technology, DIGITAL media, COLLECTIVE action, SOCIAL movements
- Abstract
Recent literature has underscored the power of digital activism, but few studies have symmetrically examined its impact beyond domestic audiences and among illiberal regimes. The co-occurrence of mass protests in East and Southeast Asia in 2019–2021, when protesters called for help from international communities, offers a valuable opportunity to test the power of digital media. This study uses a data set of 154 million Twitter posts and a time-series model to contrast sets of collective action metrics and connective action metrics with a novel dependent variable—foreign politicians' responses. We then analyze the directional, intensity, and time-lagged effects of the relevant cue-taking processes. We find that the new metrics are more potent in predicting responses from foreign politicians. Agency- and network-centered metrics also outperform number- and intensity-oriented metrics across the three cases. These findings have implications for the roles of opinion leadership and engagement networks in digital activism. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Analyzing One- and Two-bit Data to Reduce Memory Requirements for F-statistic-based Gravitational Wave Searches.
- Author
Clearwater, P., Melatos, A., Nepal, S., and Bailes, M.
- Subjects
GRAVITATIONAL wave detectors, MONTE Carlo method, GRAVITATIONAL wave astronomy, LASER interferometers, RANDOM noise theory
- Abstract
Searches for continuous-wave gravitational radiation in data collected by modern long-baseline interferometers, such as the Laser Interferometer Gravitational-wave Observatory (LIGO), the Virgo interferometer, and the Kamioka Gravitational Wave Detector, can be memory intensive. A digitization scheme is described that reduces the 64-bit interferometer output to a one- or two-bit data stream while minimizing distortion and achieving considerable reduction in storage and input/output cost. For the representative example of the coherent, maximum-likelihood matched filter known as the F-statistic, it is found using Monte Carlo simulations that the injected signal only needs to be ≈24% stronger (for one-bit data) and ≈6.4% stronger (for two-bit data with optimal thresholds) than a 64-bit signal in order to be detected with 90% probability in Gaussian noise. The foregoing percentages do not change significantly when the signal frequency decreases secularly, or when the noise statistics are not Gaussian, as verified with LIGO Science Run 6 data. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
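The digitization scheme summarized in the abstract above reduces a 64-bit interferometer time series to one or two bits per sample before further processing. Below is a minimal sketch of that kind of quantization in Python with NumPy; the synthetic signal and the two-bit threshold are illustrative assumptions, not the authors' pipeline or their optimal thresholds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a 64-bit detector time series: a weak sinusoid in Gaussian noise.
t = np.arange(0.0, 10.0, 1e-3)
x = 0.1 * np.sin(2 * np.pi * 50.0 * t) + rng.normal(0.0, 1.0, t.size)

# One-bit quantization: keep only the sign of each sample.
x_1bit = np.where(x >= 0.0, 1, -1).astype(np.int8)

# Two-bit quantization: sign plus a magnitude flag relative to a threshold.
# The threshold here is an illustrative choice; the paper tunes optimal thresholds.
threshold = 1.0
levels = np.array([-3, -1, 1, 3], dtype=np.int8)       # four representable levels
bins = np.digitize(x, [-threshold, 0.0, threshold])    # bin index 0..3 per sample
x_2bit = levels[bins]

# Storage drops from 8 bytes/sample to 1 or 2 bits/sample before any matched filtering.
print(x.nbytes, x_1bit.size // 8, x_2bit.size // 4)    # raw bytes vs. packed-bit estimates
```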
7. Low rank approximation in the computation of first kind integral equations with TauToolbox.
- Author
Vasconcelos, Paulo B., Grammont, Laurence, and Lima, Nilson J.
- Subjects
NUMERICAL solutions to integral equations, FREDHOLM equations, INTEGRAL equations, POLYNOMIAL approximation, TIKHONOV regularization
- Abstract
Tau Toolbox is a mathematical library for the solution of integro-differential problems, based on the spectral Lanczos' Tau method. Over the past few years, a class within the library, called polynomial, has been developed for approximating functions by classical orthogonal polynomials, and it is intended to be an easy-to-use yet efficient object-oriented framework. In this work we discuss how this class has been designed to solve linear ill-posed problems, and we provide a description of the available methods, Tikhonov regularization and truncated singular value expansion. For the solution of the Fredholm integral equation of the first kind, which is built from a low-rank approximation of the kernel followed by a numerical truncated singular value expansion, an error estimate is given. Numerical experiments illustrate that this approach is capable of efficiently computing good approximations of linear discrete ill-posed problems, even when the available data function is perturbed, with no programming effort. Several test problems are used to evaluate the performance and reliability of the solvers. The final product of this paper is the numerical solution of a first-kind integral equation, which is constructed using only two inputs from the user: the kernel and the right-hand side. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
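For readers unfamiliar with the two regularization methods named in this abstract, the sketch below applies a truncated singular value expansion and Tikhonov regularization to a crudely discretized first-kind Fredholm equation in Python with NumPy. It illustrates the general technique only, not TauToolbox's polynomial class; the kernel, grid, noise level, and truncation rank are illustrative assumptions.

```python
import numpy as np

# Discretize a first-kind Fredholm equation  ∫ k(s,t) f(t) dt = g(s)  on [0, 1]
# with a midpoint rule; the smooth Gaussian kernel below is an illustrative choice.
n = 200
t = (np.arange(n) + 0.5) / n
w = 1.0 / n
K = w * np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.02)   # ill-conditioned kernel matrix

f_true = np.sin(np.pi * t)                               # "unknown" solution for testing
g = K @ f_true + 1e-4 * np.random.default_rng(1).normal(size=n)   # perturbed data

# Truncated singular value expansion: keep only the k largest singular values.
U, s, Vt = np.linalg.svd(K)
k = 12                                                   # truncation rank (illustrative)
f_tsvd = Vt[:k].T @ ((U[:, :k].T @ g) / s[:k])

# Tikhonov regularization for comparison: minimize ||K f - g||^2 + lam ||f||^2.
lam = 1e-6
f_tik = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)

print(np.linalg.norm(f_tsvd - f_true), np.linalg.norm(f_tik - f_true))
```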
8. Discovery of skin lightening agents in Curcuma longa using computational methods.
- Author
Chidiebere, Chiagoziem Wisdom and Chioma, Abah Veronica
- Subjects
TURMERIC, HUMAN skin color, SKIN color lighteners, MOLECULAR docking, OLEIC acid
- Abstract
Curcuma longa has traditionally been reported to impart fairness to human skin. However, empirical evidence for this report has barely been actualized. Efforts were made, using a molecular docking approach and Absorption, Distribution, Metabolism, Excretion, and Toxicity (ADMET) analysis, to identify the potential compounds responsible for imparting fairness. The results showed that the hit compounds (Turmerone, curlone, 6-Octadecenoic acid (Z)-, Ar-turmerone, 9-Octadecenoic acid, and Hexanoic acid, 5-oxo-, ethyl ester) were the most abundant compounds, with percentages of 16.7 %, 9.31 %, 9.19 %, 7.98 %, 6.65 %, and 5.24 % respectively, and showed more stable binding affinities on the human tyrosinase-related protein 1; ADMET analysis proved these hit compounds to be skin friendly. [ABSTRACT FROM AUTHOR]
- Published
- 2024
9. Computational applications for the discovery of novel antiperovskites and chalcogenide perovskites: a review.
- Author
Sheng, Ming, Wang, Suqin, Zhu, Hui, Liu, Zhuang, and Zhou, Guangtao
- Subjects
PEROVSKITE, STRUCTURAL stability, ELECTRONIC structure, ELECTRONIC materials, RESEARCH personnel
- Abstract
Novel perovskites pertain to newly discovered or less studied variants of the conventional perovskite structure, characterized by distinctive properties and potential for diverse applications such as ferroelectric, optoelectronic, and thermoelectric uses. In recent years, advancements in computational methods have markedly expedited the discovery and design of innovative perovskite materials, leading to numerous pertinent reports. However, there are few reviews that thoroughly elaborate the role of computational methods in studying novel perovskites, particularly for state-of-the-art perovskite categories. This review delves into the computational discovery of novel perovskite materials, with a particular focus on antiperovskites and chalcogenide perovskites. We begin with a discussion on the computational methods applied to evaluate the stability and electronic structure of materials. Next, we highlight how these methods expedite the discovery process, demonstrating how rational simulations contribute to researching novel perovskites with improved performance. Finally, we thoroughly discuss the remaining challenges and future outlooks in this research domain to encourage further investigation. We believe that this review will be highly beneficial both for newcomers to the field and for experienced researchers in computational science who are shifting their focus to novel perovskites. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. A new density filter for pipes for fluid topology optimization.
- Author
Choi, Young Hun and Yoon, Gil Ho
- Subjects
FLUID control, FLUID flow, ENERGY dissipation, BODY fluids, TOPOLOGY
- Abstract
This study presents a new density filter for a pipe-shaped structure and its application to fluid topology optimization. A simple and straight pipe-shaped structure for fluid is preferred for many engineering purposes over the complex manifold structure provided by the topology optimization method. To determine an optimal pipe structure for fluid, we develop a new density filter and apply it to fluid topology optimization. Hence, the original spatially varying design variables of the fluid topology optimization are modified based on the pipe density filter. Subsequently, the filtered design variables, including a uniform pipe wall thickness and adjusted cross-section, are used for artificial pseudo-rigid bodies in fluid topology optimization. An additional constraint is imposed to maintain a nearly uniform pipe thickness. Several numerical examples are solved to demonstrate the validity of the present pipe density filter for fluid topology optimization problems minimizing the energy dissipation of the fluid and controlling the particles suspended in the fluid. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Quantitative evaluation of lesion response heterogeneity for superior prognostication of clinical outcome.
- Author
Lokre, Ojaswita, Perk, Timothy G., Weisman, Amy J., Govindan, Rajkumar Munian, Chen, Song, Chen, Meijie, Eickhoff, Jens, Liu, Glenn, and Jeraj, Robert
- Subjects
DIFFUSE large B-cell lymphomas, NON-small-cell lung carcinoma, TREATMENT effectiveness, CANCER patients, LUNG cancer
- Abstract
Purpose: Standardized reporting of treatment response in oncology patients has traditionally relied on methods like RECIST, PERCIST and Deauville score. These endpoints assess only a few lesions, potentially overlooking the response heterogeneity of all disease. This study hypothesizes that comprehensive spatial-temporal evaluation of all individual lesions is necessary for superior prognostication of clinical outcome. Methods: [18F]FDG PET/CT scans from 241 patients (127 diffuse large B-cell lymphoma (DLBCL) and 114 non-small cell lung cancer (NSCLC)) were retrospectively obtained at baseline and either during chemotherapy or post-chemoradiotherapy. An automated TRAQinform IQ software (AIQ Solutions) analyzed the images, performing quantification of change in regions of interest suspicious of cancer (lesion-ROI). Multivariable Cox proportional hazards (CoxPH) models were trained to predict overall survival (OS) with varied sets of quantitative features and lesion-ROI, compared by bootstrapping with C-index and t-tests. The best-fit model was compared to automated versions of previously established methods like RECIST, PERCIST and Deauville score. Results: Multivariable CoxPH models demonstrated superior prognostic power when trained with features quantifying response heterogeneity in all individual lesion-ROI in DLBCL (C-index = 0.84, p < 0.001) and NSCLC (C-index = 0.71, p < 0.001). Prognostic power significantly deteriorated (p < 0.001) when using subsets of lesion-ROI (C-index = 0.78 and 0.67 for DLBCL and NSCLC, respectively) or excluding response heterogeneity (C-index = 0.67 and 0.70). RECIST, PERCIST, and Deauville score could not significantly associate with OS (C-index < 0.65 and p > 0.1), performing significantly worse than the multivariable models (p < 0.001). Conclusions: Quantitative evaluation of response heterogeneity of all individual lesions is necessary for the superior prognostication of clinical outcome. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
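A hedged sketch of the modelling step described in this abstract: fit a multivariable Cox proportional hazards model on features summarizing per-lesion response and score it with the concordance index (C-index), here using the lifelines library. The synthetic data frame and column names are illustrative assumptions, not the TRAQinform IQ feature set.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 200

# Illustrative per-patient features summarizing lesion-level response heterogeneity.
df = pd.DataFrame({
    "n_lesions": rng.integers(1, 30, n),
    "frac_progressing_lesions": rng.uniform(0.0, 1.0, n),   # heterogeneity feature
    "max_suv_change": rng.normal(-0.3, 0.5, n),
    "os_months": rng.exponential(24.0, n),                   # overall survival time
    "event": rng.integers(0, 2, n),                          # 1 = death observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="event")
cph.print_summary()

# C-index of the fitted risk scores, the statistic used to compare models in the study.
risk = cph.predict_partial_hazard(df)
print(concordance_index(df["os_months"], -risk, df["event"]))
```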
12. Does Scientific Evidence Sell? Combining Manual and Automated Content Analysis to Investigate Scientists' and Laypeople's Evidence Practices on Social Media.
- Author
Biermann, Kaija, Nowak, Bianca, Braun, Lea-Marie, Taddicken, Monika, Krämer, Nicole C., and Stieglitz, Stefan
- Subjects
SCIENTIFIC communication, CONTENT analysis, COVID-19
- Abstract
Examining the dissemination of evidence on social media, we analyzed the discourse around eight visible scientists in the context of COVID-19. Using manual (N = 1,406) and automated coding (N = 42,640) on an account-based tracked Twitter/X dataset capturing scientists' activities and eliciting reactions over six 2-week periods, we found that visible scientists' tweets included more scientific evidence. However, public reactions contained more anecdotal evidence. Findings indicate that evidence can be a message characteristic leading to greater tweet dissemination. Implications for scientists, including explicitly incorporating scientific evidence in their communication and examining evidence in science communication research, are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Numerical Analysis of Bacterial Meningitis Stochastic Delayed Epidemic Model through Computational Methods.
- Author
Shafique, Umar, Al-Shamiri, Mohamed Mahyoub, Raza, Ali, Fadhal, Emad, Rafiq, Muhammad, and Ahmed, Nauman
- Subjects
BACTERIAL meningitis, STOCHASTIC analysis, FINITE differences, NUMERICAL analysis, SPINAL cord
- Abstract
According to the World Health Organization (WHO), meningitis is a severe infection of the meninges, the membranes covering the brain and spinal cord. It is a devastating disease and remains a significant public health challenge. This study investigates a bacterial meningitis model in deterministic and stochastic versions. Four-compartment population dynamics explain the concept, namely the susceptible, carrier, infected, and recovered populations. The model admits nonnegative equilibrium points, the Meningitis-Free Equilibrium (MFE) and the Meningitis-Existing Equilibrium (MEE), along with a reproduction number. For the stochastic version of the existing deterministic model, the two methodologies studied are transition probabilities and non-parametric perturbations. Also, positivity, boundedness, extinction, and disease persistence are studied rigorously with the help of well-known theorems. Standard and nonstandard techniques such as Euler-Maruyama, stochastic Euler, stochastic Runge-Kutta, and a stochastic nonstandard finite difference scheme in the sense of delay are presented for computational analysis of the stochastic model. Unfortunately, the standard methods fail to preserve the biological properties of the model, so the stochastic nonstandard finite difference approximation is offered as an efficient, low-cost method that is independent of time step size. In addition, the convergence and the local and global stability around the equilibria of the nonstandard computational method are studied by assuming the perturbation effect is zero. Simulations and a comparison of the methods are presented to support the theoretical results and for the best visualization of results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
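As a point of reference for the standard techniques named in this abstract, below is a minimal Euler-Maruyama integration of a stochastically perturbed SIR-type compartment model in Python with NumPy. The compartments, rates, and noise intensity are illustrative assumptions; the sketch does not reproduce the authors' delayed meningitis model or their nonstandard finite difference scheme.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters for a susceptible-infected-recovered (SIR) toy model.
beta, gamma, sigma = 0.4, 0.1, 0.02     # transmission rate, recovery rate, noise intensity
dt, T = 0.1, 200.0
steps = int(T / dt)

S, I, R = 0.99, 0.01, 0.0
history = np.empty((steps, 3))

for k in range(steps):
    # Deterministic drift of the compartments.
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    # Euler-Maruyama step: drift * dt + diffusion * sqrt(dt) * N(0, 1),
    # with multiplicative noise on the infection term as a simple perturbation.
    dW = rng.normal(0.0, np.sqrt(dt))
    S += dS * dt - sigma * S * I * dW
    I += dI * dt + sigma * S * I * dW
    R += dR * dt
    # Clip to keep the toy trajectory nonnegative (an NSFD scheme avoids this ad hoc step).
    S, I, R = max(S, 0.0), max(I, 0.0), max(R, 0.0)
    history[k] = S, I, R

print(history[-1])   # final compartment fractions of one sample path
```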
14. A Novel Underwater Wireless Optical Communication Optical Receiver Decision Unit Strategy Based on a Convolutional Neural Network.
- Author
El Ramley, Intesar F., Bedaiwi, Nada M., Al-Hadeethi, Yas, Barasheed, Abeer Z., Al-Zhrani, Saleha, and Chen, Mingguang
- Subjects
CONVOLUTIONAL neural networks, BIT error rate, OPTICAL receivers, OPTICAL distortion, WIRELESS communications, OPTICAL communications
- Abstract
Underwater wireless optical communication (UWOC) systems face challenges due to the significant temporal dispersion caused by the combined effects of scattering, absorption, refractive index variations, optical turbulence, and bio-optical properties. This collective impairment leads to signal distortion and degrades the optical receiver's bit error rate (BER). Optimising the receiver filter and equaliser design is crucial to enhance receiver performance. However, having an optimal design may not be sufficient to ensure that the receiver decision unit can estimate BER quickly and accurately. This study introduces a novel BER estimation strategy based on a Convolutional Neural Network (CNN) to improve the accuracy and speed of BER estimation performed by the decision unit's computational processor compared to traditional methods. Our new CNN algorithm utilises the eye diagram (ED) image processing technique. Despite the incomplete definition of the UWOC channel impulse response (CIR), the CNN model is trained to address the nonlinearity of seawater channels under varying noise conditions and increase the reliability of a given UWOC system. The results demonstrate that our CNN-based BER estimation strategy accurately predicts the corresponding signal-to-noise ratio (SNR) and enables reliable BER estimation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
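A hedged sketch of the kind of model this abstract describes: a small convolutional network that maps eye-diagram images to an SNR estimate, from which a BER can then be derived, written in PyTorch. The architecture, input size, and regression target are illustrative assumptions, not the authors' network or training data.

```python
import torch
import torch.nn as nn

class EyeDiagramSNRNet(nn.Module):
    """Toy CNN regressor: grayscale eye-diagram image -> scalar SNR estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),                     # predicted SNR in dB
        )

    def forward(self, x):
        return self.head(self.features(x))

# One illustrative training step on random stand-in data (64x64 eye-diagram images).
model = EyeDiagramSNRNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

images = torch.rand(8, 1, 64, 64)                 # batch of fake eye diagrams
snr_db = torch.rand(8, 1) * 20.0                  # fake SNR labels in dB

optimizer.zero_grad()
loss = loss_fn(model(images), snr_db)
loss.backward()
optimizer.step()
print(float(loss))
```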
15. Modon solutions in an N-layer quasi-geostrophic model.
- Author
Crowe, Matthew N. and Johnson, Edward R.
- Subjects
LINEAR algebra, EIGENVALUES, OCEAN, FLUIDS
- Abstract
Modons, or dipolar vortices, are common and long-lived features of the upper ocean, consisting of a pair of counter-rotating monopolar vortices moving through self-advection. Such structures remain stable over long times and may be important for fluid transport over large distances. Here, we present a semi-analytical method for finding fully nonlinear modon solutions in a multi-layer quasi-geostrophic model with arbitrarily many layers. Our approach is to reduce the problem to a multi-parameter linear eigenvalue problem which can be solved using numerical techniques from linear algebra. The method is shown to replicate previous results for one- and two-layer models and is applied to a three-layer model to find a solution describing a mid-depth propagating, topographic vortex. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Stochastic delayed analysis of coronavirus model through efficient computational method.
- Author
Shahid, Naveed, Raza, Ali, Iqbal, Sana, Ahmed, Nauman, Fadhal, Emad, and Ceesay, Baboucarr
- Subjects
SARS-CoV-2 Omicron variant, STOCHASTIC differential equations, PATTERNS (Mathematics), FINITE differences, COVID-19
- Abstract
Stochastic delayed modeling is a significant non-pharmaceutical tool for controlling the transmission dynamics of infectious diseases, and its results are close to the reality of nature. COVID-19 has been controlled globally, but a threat remains and appears in different variants of SARS-CoV-2, such as Omicron. This article considers a mathematical model based on susceptible, infected, and recovered populations with a highly nonlinear incidence rate. We study the dynamics of the coronavirus model; the newly proposed version is a stochastic delayed model based on nonlinear stochastic delayed differential equations (SDDEs). Transition probabilities and parametric perturbation methods were used for the construction of the stochastic delayed model. The fundamental properties, such as positivity, boundedness, existence and uniqueness, and stability results for the equilibria of the model under certain conditions on the reproduction number, are studied rigorously. Also, the extinction and persistence of the disease are studied with the help of well-known theorems. Numerical methods are used to visualize the results, owing to the complexity of stochastic delayed differential equations. Furthermore, for the computational analysis, we implemented existing methods from the literature and compared their results with those of the proposed method, a nonstandard finite difference scheme for the stochastic delayed model. The proposed method restores all dynamical properties of the model with a free choice of time step. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. Direct Method to Compute Doppler Beaming Factors in Binary Stars.
- Author
Zheng, Chuanjie, Huang, Yang, Liu, Jifeng, Lu, Youjun, Han, Henggeng, Tan, Yuan, and Beers, Timothy C.
- Subjects
GIANT stars, WHITE dwarf stars, BINARY number system, BINARY stars, STELLAR spectra
- Abstract
The Doppler beaming effect, induced by the reflex motion of stars, introduces flux modulations and serves as an efficient method to photometrically determine mass functions for a large number of close binary systems, particularly those involving compact objects. In order to convert observed beaming-flux variations into a radial-velocity curve, precise determination of the beaming factor is essential. Previously, this factor was calculated as a constant, assuming a power-law profile for stellar spectra. In this study, we present a novel approach to directly compute this factor. Our new method not only simplifies the computation, especially for blue bands and cool stars, but also enables us to evaluate whether the relationship between beaming flux and radial velocity can be accurately described as linear. We develop a Python code and compute a comprehensive beaming-factor table for commonly used filter systems covering main-sequence, subgiant, and giant stars, as well as hot subdwarf and white dwarf stars. Both the code and our table are archived and publicly available on Zenodo: doi:10.5281/zenodo.13049419. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
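For context on the quantity being computed, the sketch below evaluates the conventional power-law beaming factor that the abstract says was used previously, B = 3 − d ln F_ν / d ln ν, by numerically differentiating a spectrum at a filter's effective frequency. This is a generic illustration of that approximation, not the authors' new direct method; the blackbody spectrum and the effective wavelength are illustrative assumptions, and sign conventions for the resulting flux modulation vary between papers.

```python
import numpy as np

# Physical constants (SI).
h, c, k_B = 6.62607015e-34, 2.99792458e8, 1.380649e-23

def planck_fnu(nu, teff):
    """Blackbody B_nu as a stand-in stellar spectrum F_nu (illustrative only)."""
    return (2 * h * nu**3 / c**2) / np.expm1(h * nu / (k_B * teff))

def beaming_factor(teff, lambda_eff):
    """Power-law approximation B = 3 - dlnF_nu/dlnnu at the filter's effective wavelength."""
    nu0 = c / lambda_eff
    dlnnu = 1e-4
    nu = nu0 * np.array([1.0 - dlnnu, 1.0 + dlnnu])
    # Numerical logarithmic slope of the spectrum around nu0.
    alpha = np.diff(np.log(planck_fnu(nu, teff)))[0] / (2 * dlnnu)
    return 3.0 - alpha

# Example: a Sun-like star observed at an (assumed) effective wavelength of 600 nm.
B = beaming_factor(teff=5800.0, lambda_eff=600e-9)
print(B)   # flux modulation is then roughly |Delta F / F| ~ B * v_r / c (conventions vary)
```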
18. Geographical and genetic clines in Dracocephalum kotschyi X Dracocephalum oligadenium hybrids: landscape genetics and genocline analyses.
- Author
Sheidai, Masoud, Koohdar, Fahimeh, and Mazinani, Javad
- Subjects
SPECIES distribution, HYBRID zones, LANDSCAPE changes, CLIMATE change, GENETICS
- Abstract
Conservation and management of medicinally important plants are among the necessary tasks all over the world. The genus Dracocephalum (Lamiaceae) contains about 186 perennial or annual herb species that have been used for their medicinal value in different parts of the world as an antihyperlipidemic, analgesic, antimicrobial, antioxidant, as well as anticancer medicine. Producing detailed data on the genetic structure of these species and their response to climate change and human landscape manipulation can be very important for conservation purposes. Therefore, the present study was performed on six geographical populations of two species in the Dracocephalum genus, namely Dracocephalum kotschyi and Dracocephalum oligadenium, as well as their inter-specific hybrid population. We carried out population genetic, landscape genetic, species modeling, and genetic cline analyses on these plants. We present here new findings on the genetic structure of these populations, and provide data on both geographical and genetic clines, as well as morphological clines. We also identified genetic loci that are potentially adaptive to the geographical spatial features and genocline conditions. Different species distribution modeling (SDM) methods used in this work revealed that bioclimatic variables related to temperature and moisture play an important role in the geographical distribution of Dracocephalum populations within Iran, and that due to the presence of some potentially adaptive genetic loci in the studied plants, they can survive well enough by the year 2050 under climate change. The findings can be used for the protection of these medicinally important plants. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Utilizing computational methods for analysing media framing of organizational crises: The 'Datalek' scandal during the COVID‐19 pandemic in the Netherlands.
- Author
Nguyen, Dennis, Nguyen, Sergül, Le, Phuong Hoan, Oomen, Tessa, and Wang, Yijing
- Subjects
LIFE cycles (Biology), DATA security failures, DATA transmission systems, CONTENT analysis, PUBLIC communication, CRISIS communication
- Abstract
Media framing of organizational crises is an important factor to consider in crisis communication, since it can shape stakeholders' perceptions of organizations and discussions in the public sphere. This takes place in complex media ecologies where public communication happens at a large scale, both in the news and on social media. Here, computational methods offer new avenues for analysing media framing in flux throughout the crisis life cycle. Methods for automated content analysis, in particular, can quickly and efficiently reveal what media frames emerge in a crisis context and how they change over time across different channels and platforms. The present study showcases the benefits of such methodological approaches by critically exploring the example of the data breach at the national municipal health service in the Netherlands. Using computational methods for media frame analysis on news texts (N1 = 519) and social media postings (N2 = 2986), this article reconstructs how the incident was perceived throughout four crisis stages (build-up, outbreak, chronic stage, termination). The article critically discusses the relevance of researching media framing empirically, with emphasis on the benefits but also the limitations of computational approaches. It concludes with some general pointers for crisis researchers interested in such methods as well as their implications for practitioners in the field. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. Measuring Cultural Diversity in Text with Word Counts.
- Author
Wood, Michael Lee
- Subjects
LANGUAGE & languages, SOCIAL media, OCCUPATIONS, ENCYCLOPEDIAS & dictionaries, SOCIAL theory, CULTURAL pluralism
- Abstract
A long-standing concern in the study of culture is understanding how culture is distributed, often discussed in terms of cultural "coherence." Cultural diversity, defined as the degree to which people share beliefs or meanings, is one dimension of cultural coherence that has been associated with many social outcomes. This article contributes to this area of research by considering how to measure cultural diversity in text and introducing a simple approach that uses word counts and sets of diversity indices (called "diversity profiles"). Text is useful for social-psychological analysis because as an artifact of individual thought, it provides a way to measure how beliefs and meanings are distributed and made salient across groups. The measurement approach outlined here contrasts to many contemporary computational approaches to measuring culture in text, which employ a relational logic of meaning based on word co-occurrences. While these more sophisticated approaches are well suited to measuring diversity in many instances, I show that there are some cases for which simpler measures based on word counts are ideal. After discussing the measurement of cultural diversity using word counts, I present a computational analysis of interview transcripts of American religious parents discussing the ages at which it is appropriate for children to participate in different practices often considered inappropriate for young children. The analysis points to the homogenizing influence of institutions on discussions of age appropriateness. I conclude by discussing implications for cultural analysis more generally. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
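A hedged illustration of the word-count approach described in this abstract: compute a small "diversity profile" (richness, Shannon entropy, and inverse Simpson diversity) from term frequencies in two groups of texts. The toy responses and the particular indices chosen are illustrative assumptions, not the author's exact diversity profiles.

```python
from collections import Counter
import math

def diversity_profile(texts):
    """Richness, Shannon entropy, and inverse Simpson diversity from word counts."""
    counts = Counter(word for text in texts for word in text.lower().split())
    total = sum(counts.values())
    p = [c / total for c in counts.values()]
    richness = len(counts)                                # number of distinct words
    shannon = -sum(pi * math.log(pi) for pi in p)         # Shannon entropy
    inv_simpson = 1.0 / sum(pi * pi for pi in p)          # inverse Simpson index
    return richness, shannon, inv_simpson

# Two toy groups of responses: one homogeneous, one more varied.
group_a = ["children should wait until twelve", "children should wait until twelve"]
group_b = ["maybe around ten it depends", "whenever parents decide it is fine"]

print(diversity_profile(group_a))
print(diversity_profile(group_b))
```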
21. Computational methods for climate change frame analysis: Techniques, critiques, and cautious ways forward.
- Author
Hirsbrunner, Simon David
- Abstract
Frame analysis is a popular methodological paradigm to investigate how climate change is reported in the media, how it is negotiated by political actors, and perceived by publics. Its scope of application extends across various academic disciplines and transcends traditional boundaries of research such as those between quantitative and qualitative methods. Recent transformations of the media landscape have a strong influence on how frame analysis is conducted and how it is used to investigate climate change communication. Online data mining and computational methods have now become increasingly mainstream to investigate discursive elements in online media. Scholars have highlighted the potential, but also the risks associated with sophisticated computational methods, such as machine learning, increasingly used in the context of frame analysis. This advanced review gathers the scientific literature on computational frame analysis for analyzing climate change communication and discusses ways of dealing with associated risks and caveats by incorporating ideas from Science & Technology Studies (STS) and other stances of critical scholarship. Recommended ways forward include combining methods, practicing theoretical interdisciplinarity, infrastructuring reflexivity in research constellations, and embracing transparency, documentation, and accessibility of methods. This article is categorized under: The Social Status of Climate Change Knowledge > Sociology/Anthropology of Climate Knowledge; Perceptions, Behavior, and Communication of Climate Change > Communication; Climate, History, Society, Culture > Technological Aspects and Ideas. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. Finding love in algorithms: deciphering the emotional contexts of close encounters with AI chatbots.
- Author
Li, Han and Zhang, Renwen
- Subjects
GENERATIVE artificial intelligence, EMOTIONS, SOCIAL interaction, ARTIFICIAL intelligence, USER experience, CHATBOTS
- Abstract
AI chatbots are permeating the socio-emotional realms of human life, presenting both benefits and challenges to interpersonal dynamics and well-being. Despite burgeoning interest in human–AI relationships, the conversational and emotional nuances of real-world, in situ human–AI social interactions remain underexplored. Through computational analysis of a multimodal dataset with over 35,000 screenshots and posts from r/replika, we identified seven prevalent types of human–AI social interactions: intimate behavior, mundane interaction, self-disclosure, play and fantasy, customization, transgression, and communication breakdown, and examined their associations with six basic human emotions. Our findings suggest the paradox of emotional connection with AI, indicated by the bittersweet emotion in intimate encounters with AI chatbots, and the elevated fear in uncanny valley moments when AI exhibits semblances of mind in deep self-disclosure. Customization characterizes the distinctiveness of AI companionship, positively elevating user experiences, whereas transgression and communication breakdown elicit fear or sadness. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Computing Longitudinal Moments for Heterogeneous Agent Models.
- Author
Ocampo, Sergio and Robinson, Baxter
- Subjects
MONTE Carlo method, DYNAMIC programming, MEMORY
- Abstract
Computing population moments for heterogeneous agent models is a necessary step for their estimation and evaluation. Computation based on Monte Carlo methods is time- and resource-consuming because it involves simulating a large sample of agents and tracking them over time. We formalize how an alternative non-stochastic method, widely used for computing cross-sectional moments, can be extended to also compute longitudinal moments. The method relies on following the distribution of populations of interest by iterating forward the Markov transition function that defines the evolution of the distribution of agents in the model. Approximations of this function are readily available from standard solution methods of dynamic programming problems. We document the performance of this method vis-a-vis standard Monte Carlo simulations when calculating longitudinal moments. The method provides precise estimates of moments like top-wealth shares, auto-correlations, transition rates, age-profiles, or coefficients of population regressions at lower time- and resource-costs compared to Monte Carlo based methods. The method is particularly useful for moments of small groups of agents or involving rare events, but implies increasing memory costs in models with a large state space. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
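A minimal sketch of the non-stochastic idea summarized in this abstract: instead of simulating agents, push a distribution forward with the Markov transition matrix obtained from the model's solution and read a longitudinal moment off the result. The two-state toy chain and the "high-state persistence" moment are illustrative assumptions, not the authors' heterogeneous agent model.

```python
import numpy as np

# Illustrative Markov transition matrix over two wealth states (low, high),
# standing in for the transition function produced by a dynamic-programming solution.
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])

# Stationary cross-sectional distribution (left eigenvector of P with eigenvalue 1).
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
stationary /= stationary.sum()

# Longitudinal moment: probability of being in the high state today AND t periods ahead.
# Start from the mass currently in the high state and iterate it forward with P.
t = 5
mass = np.zeros(2)
mass[1] = stationary[1]            # distribution of today's high-state agents
for _ in range(t):
    mass = mass @ P                # iterate the transition function forward
joint_high_high = mass[1]

persistence = joint_high_high / stationary[1]   # P(high in t periods | high today)
print(stationary, persistence)
```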
24. Leveraging computational methods for nonprofit social media research: a systematic review and methodological framework.
- Author
Wu, Viviana Chiu Sik
- Subjects
SOCIAL media, SUPERVISED learning, GENERATIVE artificial intelligence, COMMUNITY foundations, NONPROFIT organizations, MICROBLOGS
- Abstract
While social media platforms are valuable for examining the online engagement of nonprofit and philanthropic organizations, the research considerations underlying social media data remain opaque to most. Through a systematic review of nonprofit studies that analyze social media data, I propose a methodological framework incorporating three common data types: text, engagement and network data. The review reveals that most existing studies rely heavily on manual coding to analyze relatively small datasets of social media messages, thereby missing out on the automation and scalability offered by advanced computational methods. To address this gap, I demonstrate the application of supervised machine learning to train, predict, and analyze a substantial dataset consisting of 66,749 social media messages posted by community foundations on Twitter/X. This study underscores the benefits of combining manual content analysis with automated approaches and calls for future research to explore the potential of generative AI in advancing nonprofit social media research. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
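A hedged sketch of the supervised machine learning step described in this abstract: train a classifier on manually coded social media messages and use it to label the remainder of a large corpus, here with a scikit-learn TF-IDF plus logistic regression pipeline. The categories and example messages are illustrative assumptions, not the study's coding scheme.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# A tiny hand-coded training set (illustrative categories: 'fundraising' vs. 'community').
texts = [
    "Join our year-end giving campaign and double your donation today",
    "Grant applications for local nonprofits close next Friday",
    "Thank you to the volunteers who cleaned up the riverfront park",
    "Celebrating the opening of the new neighborhood youth center",
]
labels = ["fundraising", "fundraising", "community", "community"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                    LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

# Predict labels for the (much larger) unlabeled corpus.
unlabeled = ["Our matching gift challenge ends at midnight",
             "Photos from yesterday's community garden workshop"]
print(clf.predict(unlabeled))
```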
25. Computational psychiatry and the evolving concept of a mental disorder.
- Author
Genin, Konstantin, Grote, Thomas, and Wolfers, Thomas
- Abstract
As a discipline, psychiatry is in the process of finding the right set of concepts to organize research and guide treatment. Dissatisfaction with the status quo as expressed in standard manuals has animated a number of computational paradigms, each proposing to rectify the received concept of mental disorder. We explore how different computational paradigms: normative modeling, network theory and learning-theoretic approaches like reinforcement learning and active inference, reconceptualize mental disorders. Although each paradigm borrows heavily from machine learning, they differ significantly in their methodology, their preferred level of description, the role they assign to the environment and, especially, the degree to which they aim to assimilate psychiatric disorders to a standard medical disease model. By imagining how these paradigms might evolve, we bring into focus three rather different visions for the future of psychiatric research. Although machine learning plays a crucial role in the articulation of these paradigms, it is clear that we are far from automating the process of conceptual revision. The leading role continues to be played by the theoretical, metaphysical and methodological commitments of the competing paradigms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Geographical and genetic clines in Dracocephalum kotschyi X Dracocephalum oligadenium hybrids: landscape genetics and genocline analyses
- Author
Masoud Sheidai, Fahimeh Koohdar, and Javad Mazinani
- Subjects
Computational methods, Dracocephalum taxa, Genetic structure, Hybrid zone, Selection, Botany, QK1-989
- Abstract
Conservation and management of medicinally important plants are among the necessary tasks all over the world. The genus Dracocephalum (Lamiaceae) contains about 186 perennial or annual herb species that have been used for their medicinal value in different parts of the world as an antihyperlipidemic, analgesic, antimicrobial, antioxidant, as well as anticancer medicine. Producing detailed data on the genetic structure of these species and their response to climate change and human landscape manipulation can be very important for conservation purposes. Therefore, the present study was performed on six geographical populations of two species in the Dracocephalum genus, namely Dracocephalum kotschyi and Dracocephalum oligadenium, as well as their inter-specific hybrid population. We carried out population genetic, landscape genetic, species modeling, and genetic cline analyses on these plants. We present here new findings on the genetic structure of these populations, and provide data on both geographical and genetic clines, as well as morphological clines. We also identified genetic loci that are potentially adaptive to the geographical spatial features and genocline conditions. Different species distribution modeling (SDM) methods used in this work revealed that bioclimatic variables related to temperature and moisture play an important role in the geographical distribution of Dracocephalum populations within Iran, and that due to the presence of some potentially adaptive genetic loci in the studied plants, they can survive well enough by the year 2050 under climate change. The findings can be used for the protection of these medicinally important plants.
- Published
- 2024
- Full Text
- View/download PDF
27. A unified pipeline for FISH spatial transcriptomics
- Author
Cisar, Cecilia, Keener, Nicholas, Ruffalo, Mathew, and Paten, Benedict
- Subjects
Biological Sciences, Genetics, Biotechnology, Bioengineering, FISH, computational methods, genomics, spatial transcriptomics, transcriptomics
- Abstract
High-throughput spatial transcriptomics has emerged as a powerful tool for investigating the spatial distribution of mRNA expression and its effects on cellular function. There is a lack of standardized tools for analyzing spatial transcriptomics data, leading many groups to write their own in-house tools that are often poorly documented and not generalizable. To address this, we have expanded and improved the starfish library and used those tools to create PIPEFISH, a semi-automated and generalizable pipeline that performs transcript annotation for fluorescence in situ hybridization (FISH)-based spatial transcriptomics. We used this pipeline to annotate transcript locations from three real datasets from three different common types of FISH image-based experiments, MERFISH, seqFISH, and targeted in situ sequencing (ISS), and verified that the results were high quality using the internal quality metrics of the pipeline and also a comparison with an orthogonal method of measuring RNA expression. PIPEFISH is a publicly available and open-source tool.
- Published
- 2023
28. Stability and computational analysis of Influenza-A epidemic model through double time delay
- Author
Ateq Alsaadi, Ali Raza, Muhammed Bilal Riaz, and Umar Shafique
- Subjects
Novel Influenza SEIR model, Delay differential equations (DDEs), Feasible properties, Stability results, Computational methods, Convergence analysis, Engineering (General). Civil engineering (General), TA1-2040
- Abstract
Delay factors play a significant role in controlling a strain of infectious disease as an alternative to a pharmaceutical strategy. According to the World Health Organization (WHO), 3–5 million cases and approximately 290,000 to 650,000 respiratory deaths are reported annually. So, in the present study, we develop a delayed mathematical model based on delay differential equations (DDEs) for the influenza epidemic using a deterministic approach by introducing double delay parameters. Four distinct sub-populations are considered: susceptible, exposed, infected, and recovered. For the rigorous analysis, the fundamental properties of the model, like positivity, boundedness, existence, and uniqueness, were studied. The influenza-free equilibrium (IFE) and the influenza-existing equilibrium (IEE) are the two nonnegative equilibrium points that the model admits. The local and global asymptotic stability of the equilibrium points of the model is established and shown under specific conditions on the reproduction number. Additionally, we investigated the model's parameter sensitivity and determined the relative sensitivity of each parameter. Both standard and nonstandard methods, such as Euler, Runge-Kutta, and nonstandard finite difference in a delayed sense, are presented so that the computational analysis supports the dynamical analysis and the best visualization of results. The stability of the nonstandard finite difference scheme is thoroughly analyzed around the steady states of the model. Additionally, the results show that the nonstandard finite difference approximation is an efficient, cost-effective method, independent of time step size, for solving such highly nonlinear and complex real-world problems.
- Published
- 2025
- Full Text
- View/download PDF
29. Databases and computational methods for the identification of piRNA-related molecules: A survey
- Author
Chang Guo, Xiaoli Wang, and Han Ren
- Subjects
piRNA, Machine learning, Deep learning, Computational methods, piRNA–disease association prediction, Biotechnology, TP248.13-248.65
- Abstract
Piwi-interacting RNAs (piRNAs) are a class of small non-coding RNAs (ncRNAs) that play important roles in many biological processes and in major cancer diagnosis and treatment, and they have thus become a hot research topic. This study aims to provide an in-depth review of computational piRNA-related research, including databases and computational models. Herein, we perform literature analysis and use comparative evaluation methods to summarize and analyze three aspects of computational piRNA-related research: (i) computational models for piRNA-related molecular identification tasks, (ii) computational models for piRNA–disease association prediction tasks, and (iii) computational resources and evaluation metrics for these tasks. This study shows that computational piRNA-related research has progressed significantly, exhibiting promising performance in recent years, although it also suffers from the emerging challenges of inconsistent naming systems and a lack of data. Unlike other reviews on piRNA-related identification tasks that focus on the organization of datasets and computational methods, we pay more attention to the analysis of computational models, algorithms, and performance, aiming to provide valuable references for computational piRNA-related identification tasks. This study will benefit the theoretical development and practical application of piRNAs by providing a better understanding of computational models and resources for investigating the biological functions and clinical implications of piRNA.
- Published
- 2024
- Full Text
- View/download PDF
30. Relations in Aesthetic Space: How Color Enables Market Positioning.
- Author
Sgourev, Stoyan V., Aadland, Erik, and Formilan, Giovanni
- Subjects
COLOR, AESTHETICS, MARKET positioning, MUSIC & color, HEAVY metal music, IDENTITY (Psychology), SOCIAL stigma
- Abstract
Color is omnipresent, but organizational research features no systematic theory or established method for analyzing it. We develop a relational approach to color, conceptualizing it as a means of positioning relative to a reference group or style and validating it through a computational method for processing digital images. The research context is Norwegian black metal—a genre of extreme metal music that achieved notoriety in the early 1990s through band members' criminal activity. Our analysis of 5,125 album covers between 1989 and 2019 confirms the alignment of aesthetic and music features and articulates the role of color in the construction of a relational identity based on forces of association and disassociation. Black metal bands associated with past color choices of non-black metal bands up to a point, after which they started to disassociate from them. The positioning is dynamic, pursuing adaptation to external events. Black metal bands reacted to their stigmatization in Norwegian society by increasing colorfulness and later returning to a darker aesthetic in defiance of the genre's commercialization. Our analysis attests to color's ability to organize producers' exchange of information and attention, illustrating the interweaving of aesthetic features and relational processes in markets. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
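As one concrete way to quantify the kind of color signal analyzed in this abstract, the sketch below computes the widely used Hasler-Süsstrunk colorfulness metric for an album-cover image with NumPy and Pillow. This is a generic illustration, not the authors' validated computational method; the metric choice and the file name are assumptions.

```python
import numpy as np
from PIL import Image

def colorfulness(path):
    """Hasler-Süsstrunk colorfulness metric for an RGB image (higher = more colorful)."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    rg = r - g                      # red-green opponent channel
    yb = 0.5 * (r + g) - b          # yellow-blue opponent channel
    std_root = np.sqrt(rg.std() ** 2 + yb.std() ** 2)
    mean_root = np.sqrt(rg.mean() ** 2 + yb.mean() ** 2)
    return std_root + 0.3 * mean_root

# Example usage with a hypothetical album-cover file.
print(colorfulness("album_cover.jpg"))
```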
31. Advances in computational methods for process and data mining in healthcare
- Author
Marco Pegoraro, Elisabetta Benevento, Davide Aloini, and Wil M.P. van der Aalst
- Subjects
process mining, process science, data mining, healthcare, clinical data, medical data, computational methods, Biotechnology, TP248.13-248.65, Mathematics, QA1-939
- Published
- 2024
- Full Text
- View/download PDF
32. The Study of Pigments in Cultural Heritage: A Review Using Machine Learning
- Author
Astrid Harth
- Subjects
pigments, dyes, cultural heritage, topic modeling, literature review, computational methods, Archaeology, CC1-960
- Abstract
In this review, topic modeling—an unsupervised machine learning tool—is employed to analyze research on pigments in cultural heritage published from 1999–2023. The review answers the following question: What are topics and time trends in the past three decades in the analytical study of pigments within cultural heritage (CH) assets? In total, 932 articles are reviewed, ten topics are identified and time trends in the share of these topics are revealed. Each topic is discussed in-depth to elucidate the community, purpose and tools involved in the topic. The time trend analysis shows that dominant topics over time include T1 (the spectroscopic and microscopic study of the stratigraphy of painted CH assets) and T5 (X-ray based techniques for CH, conservation science and archaeometry). However, both topics have experienced a decrease in attention in favor of other topics that more than doubled their topic share, enabled by new technologies and methods for imaging spectroscopy and imaging processing. These topics include T6 (spectral imaging techniques for chemical mapping of painting surfaces) and T10 (the technical study of the pigments and painting methods of historical and contemporary artists). Implications for the field are discussed in conclusion.
- Published
- 2024
- Full Text
- View/download PDF
33. Leveraging geo-computational innovations for sustainable disaster management to enhance flood resilience
- Author
Harshita Jain
- Subjects
Flood disasters, Computational methods, Remote sensing, Geographic information systems, Artificial intelligence, Geospatial data analysis, Geology, QE1-996.5, Geophysics. Cosmic physics, QC801-809
- Abstract
The increasing frequency of flood disasters around the globe highlights the need for creative approaches to improve disaster preparedness. This thorough analysis and assessment explore the topic of enhancing flood disaster resilience by utilising cutting-edge geo-computational techniques. By combining a variety of techniques, such as remote sensing, geographic information systems (GIS), LiDAR, unmanned aerial vehicles (UAVs), and cutting-edge technologies like machine learning and geospatial big data analytics, the study provides a complex framework for flood monitoring, risk assessment, and mitigation. By using remote sensing technology, flood occurrences can be tracked in real time and inundations may be precisely mapped, which makes proactive response plans possible. GIS facilitates effective evacuation planning by streamlining spatial analysis and decision-making procedures and providing critical insights into risky locations. High-resolution elevation data is provided by LiDAR technology, which is essential for precise flood modelling and simulation. Unmanned Aerial Vehicles (UAVs) may be quickly deployed to assist with situational awareness and damage assessment during a disaster. Furthermore, predictive skills are enhanced by the combination of machine learning and geographic big data analytics, opening the door to the creation of adaptive reaction plans and early warning systems. This investigation highlights how geo-computational tools may significantly improve community resilience and lessen the negative effects of flood disasters. After a thorough review of the literature and case studies, this study clarifies how these approaches might improve disaster response and preparation to a great extent.
- Published
- 2024
- Full Text
- View/download PDF
34. Feather keratin in Pavo cristatus: A tentative structure [version 1; peer review: awaiting peer review]
- Author
Peter Russ, Helmut O.K. Kirchner, Herwig Peterlik, and Ingrid M. Weiss
- Subjects
Research Article, Articles, biological materials, mechanical properties, structural biology, small angle x-ray scattering, computational methods, AlphaFold, molecular docking, Aves
- Abstract
Background: F-keratin forms an evolutionary conserved nanocomposite which allows birds to fly. Structural models for F-keratin are all based on the pioneering X-ray diffraction studies on seagull F-keratin, first published in 1932, confirmed for other species and refined over the years. There is, however, no experimental proof because native F-keratin does not form a perfect molecular crystal as required for structure determination. Methods: Peacock's tail feathers were systematically re-investigated by taking diffraction patterns at different rotation angles. Using the recently developed AlphaFold algorithm, a collection of 3D models of arbitrarily truncated and multiplied Pavo cristatus F-keratin sequences was created. The shape, dimensions, density and interfacial exposure of functionally relevant amino acid side chains of the calculated 3D building blocks were used as the initial selection criteria for filamentous F-keratin precursors. Full reproducibility of in silico folding and agreement with previous results from mechanical testing, biochemical analyses and SAXS experiments was mandatory for suggesting the tentative structure for the novel F-keratin repeating unit. Results: The filament of the F-keratin polymer is an alternating arrangement of two units called 'N-block' and 'C-block': Four strands AA 1–52 form a disulfide-stabilized twisted parallelepiped, with 89° internal rotation within eight levels of β-sandwiches. Four strands AA 81–100 form a two-level “β-sandwich” in which aromatic residues provide resilience, like vertebral discs in a spinal column. The pitch of an N+C-block octamer is 10 nm. Solidification may involve 'C-blocks' to temporarily mold into 'C-wedges' of 18° tilt, which align F-keratin into laterally amorphous fiber-reinforced composites of 9.5 nm axial periodicity. This experimentally significant distance corresponds to the fully stretched AA 53–80 matrix segment. The deformed “spinal column” unwinds under compression when F-keratin filaments perfectly align horizontally into stacked sheets in the solid state. Conclusions: At present, the tentative structure presented here is without alternatives.
- Published
- 2024
- Full Text
- View/download PDF
35. Incorporating Machine Learning into Sociological Model-Building.
- Author
Verhagen, Mark D.
- Abstract
Quantitative sociologists frequently use simple linear functional forms to estimate associations among variables. However, there is little guidance on whether such simple functional forms correctly reflect the underlying data-generating process. Incorrect model specification can lead to misspecification bias, and a lack of scrutiny of functional forms fosters interference of researcher degrees of freedom in sociological work. In this article, I propose a framework that uses flexible machine learning (ML) methods to provide an indication of the fit potential in a dataset containing the exact same covariates as a researcher's hypothesized model. When this ML-based fit potential strongly outperforms the researcher's self-hypothesized functional form, it implies a lack of complexity in the latter. Advances in the field of explainable AI, like the increasingly popular Shapley values, can be used to generate understanding into the ML model such that the researcher's original functional form can be improved accordingly. The proposed framework aims to use ML beyond solely predictive questions, helping sociologists exploit the potential of ML to identify intricate patterns in data to specify better-fitting, interpretable models. I illustrate the proposed framework using a simulation and real-world examples. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
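The fit-potential comparison described in this record can be prototyped in a few lines: fit the hypothesized linear form and a flexible learner on the same covariates, compare out-of-sample fit, and inspect the flexible model with Shapley values. A minimal sketch, assuming scikit-learn and the shap package, with invented variables; it is an illustration of the idea, not the author's implementation.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Toy data standing in for a sociological dataset (hypothetical variables).
rng = np.random.default_rng(0)
X = pd.DataFrame({"age": rng.uniform(18, 80, 2000),
                  "education_years": rng.uniform(8, 20, 2000)})
y = 0.02 * (X["age"] - 45) ** 2 + 0.5 * X["education_years"] + rng.normal(0, 1, 2000)

# 1) Fit potential: a flexible learner vs. the hypothesized linear form,
#    both using exactly the same covariates.
linear_r2 = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2").mean()
flexible = GradientBoostingRegressor(random_state=0)
flexible_r2 = cross_val_score(flexible, X, y, cv=5, scoring="r2").mean()
print(f"linear R2 = {linear_r2:.3f}, ML fit potential R2 = {flexible_r2:.3f}")

# 2) If the gap is large, inspect which covariates the flexible model treats
#    non-linearly via Shapley values, and revise the functional form accordingly
#    (here, a quadratic age term would close most of the gap).
flexible.fit(X, y)
shap_values = shap.TreeExplainer(flexible).shap_values(X)
print("mean |SHAP| per covariate:", np.abs(shap_values).mean(axis=0))
```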
36. Exploring the Value of Computational Methods for Metajournalistic Discourse: The Example of COVID-19 Reporting in Dutch Newspapers.
- Author
-
Nguyen, Dennis and van Es, Karin
- Subjects
- *
CRITICAL thinking , *COVID-19 pandemic , *REPORTERS & reporting , *EMPIRICAL research , *CONTENT analysis - Abstract
The COVID-19 pandemic raised questions about trust in journalism and the quality of news reporting during societal crises. While journalists and media professionals frequently offered critical reflections based on personal experiences and observations, computational methods are not widely used to support these evaluative processes. We aim to extend the conversation on metajournalistic discourse by considering the inclusion of empirical methods for monitoring journalistic practices. By disclosing our findings about Dutch news media's corona reporting between 2019 and 2022, we demonstrate how computational methods for content analysis of news texts can yield empirically informed insights into different facets of journalistic performance. The corpus includes 106,616 corona-related articles from national and regional newspapers in the Netherlands. We deployed text-analytical methods such as topic modelling and named entity recognition to explore Dutch corona reporting with respect to different normative criteria (informing, monitoring, offering platforms for discussion and opinion, interpretation, analysis, and setting public agendas). The study was requested by a large Dutch newspaper seeking a systematic, empirical analysis of journalistic practice for self-evaluation. We argue that computational methods combined with qualitative analyses can stimulate dialogue and critical reflection among news media professionals. [ABSTRACT FROM AUTHOR] (See the sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
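The two text-analytical steps named in this record, topic modelling and named entity recognition, can be combined into a short pipeline. A minimal sketch assuming scikit-learn for LDA and spaCy with its small Dutch model (nl_core_news_sm, installed separately); the example articles and the number of topics are placeholders, not the study's configuration.

```python
import spacy
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

articles = [
    "Het kabinet kondigt nieuwe coronamaatregelen aan na overleg met het RIVM.",
    "Ziekenhuizen in Amsterdam zien het aantal covid-opnames opnieuw stijgen.",
    # ... in the study: ~106,000 corona-related newspaper articles
]

# Topic modelling: bag-of-words + LDA (a real run would prune stop words and rare terms).
vectorizer = CountVectorizer()
doc_term = vectorizer.fit_transform(articles)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(doc_term)
terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}: {top}")

# Named entity recognition: which organisations and persons dominate the coverage?
# Requires: python -m spacy download nl_core_news_sm
nlp = spacy.load("nl_core_news_sm")
for doc in nlp.pipe(articles):
    print([(ent.text, ent.label_) for ent in doc.ents])
```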
37. AI-Driven Deep Learning Techniques in Protein Structure Prediction.
- Author
-
Chen, Lingtao, Li, Qiaomu, Nasif, Kazi Fahim Ahmad, Xie, Ying, Deng, Bobin, Niu, Shuteng, Pouriyeh, Seyedamin, Dai, Zhiyu, Chen, Jiawei, and Xie, Chloe Yixin
- Subjects
- *
MACHINE learning , *PROTEIN structure prediction , *COMPUTATIONAL intelligence , *PROTEIN structure , *PROTEIN models , *DEEP learning - Abstract
Predicting protein structure is important for understanding protein function and behavior. This study presents a comprehensive review of the computational models used to predict protein structure, covering the progression from established protein modeling to state-of-the-art artificial intelligence (AI) frameworks. The paper starts with a brief introduction to protein structures, protein modeling, and AI. The section on established protein modeling discusses homology modeling, ab initio modeling, and threading. The next section covers deep learning-based models, introducing state-of-the-art AI models such as AlphaFold (AlphaFold, AlphaFold2, AlphaFold3), RoseTTAFold, and ProteinBERT, and discussing how AI techniques have been integrated into established frameworks like Swiss-Model, Rosetta, and I-TASSER. Model performance is compared using the rankings of CASP14 (Critical Assessment of Structure Prediction) and CASP15; CASP16 is ongoing, and its results are not included in this review. Continuous Automated Model EvaluatiOn (CAMEO) complements the biennial CASP experiment. The template modeling score (TM-score), global distance test total score (GDT_TS), and Local Distance Difference Test (lDDT) score are also discussed. The paper then acknowledges the ongoing difficulties in predicting protein structure and emphasizes the need for further research into dynamic protein behavior, conformational changes, and protein–protein interactions. The application section introduces uses in fields such as drug design, industry, education, and novel protein development. In summary, this paper provides a comprehensive overview of the latest advancements in established protein modeling and in deep learning-based models for protein structure prediction, emphasizing the significant advances achieved by AI and identifying potential areas for further investigation. [ABSTRACT FROM AUTHOR] (See the TM-score sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
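Of the evaluation metrics listed in this record, the TM-score has a compact closed form: for a target of length L_N and per-residue distances d_i between aligned positions, TM-score = (1/L_N) Σ 1/(1 + (d_i/d_0)²), with d_0 = 1.24·(L_N − 15)^(1/3) − 1.8 Å. A minimal sketch of that formula; the coordinates below are random placeholders, and a real evaluation would first superimpose the model onto the native structure.

```python
import numpy as np

def tm_score(model_xyz: np.ndarray, native_xyz: np.ndarray, target_length: int) -> float:
    """TM-score over aligned C-alpha coordinates (assumes structures are already superimposed)."""
    d = np.linalg.norm(model_xyz - native_xyz, axis=1)      # per-residue distances (Angstrom)
    d0 = 1.24 * (target_length - 15) ** (1.0 / 3.0) - 1.8   # length-dependent distance scale
    return float(np.sum(1.0 / (1.0 + (d / d0) ** 2)) / target_length)

# Placeholder coordinates: 120 aligned residues of a 120-residue target.
rng = np.random.default_rng(1)
native = rng.uniform(0, 50, size=(120, 3))
model = native + rng.normal(0, 1.5, size=(120, 3))          # ~1.5 A perturbation of each atom
print(f"TM-score = {tm_score(model, native, target_length=120):.3f}")
```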
38. Methods for Parameter Estimation in Wine Fermentation Models.
- Author
-
Coleman, Robert, Nelson, James, and Boulton, Roger
- Subjects
PARAMETER estimation ,FERMENTATION ,PARTICLE swarm optimization ,DIFFERENTIAL evolution ,NUMERICAL integration - Abstract
The estimation of parameters in a wine fermentation model provides the opportunity to predict rate and concentration outcomes, to intervene strategically to change the conditions, and to forecast the rates of heat and carbon dioxide release. The chosen parameters of the fermentation model are the initial assimilable nitrogen concentration and the yeast properties (lag time, viability constant, and specific maintenance rate). This work evaluates six methods for parameter estimation: Bard, Bayesian Optimization, Particle Swarm Optimization, Differential Evolution, Genetic Evolution, and a modified Direct Grid Search technique. The benefits and drawbacks of the parameter estimation methods are discussed, as well as a comparison of numerical integration methods (Euler, Runge–Kutta, backward differentiation formula (BDF), and Adams/BDF). A test set of density–time data for five white and five red commercial wine fermentations spanning vintage, grape cultivar, fermentation temperature, inoculated yeast strain, and fermentor size was used to evaluate the parameter estimation methods. A Canonical Variate Analysis shows that the estimation methods are not significantly different from each other, while, in the parameter space, each of the fermentations was significantly different from the others. [ABSTRACT FROM AUTHOR] (See the sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
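A generic version of this estimation problem: integrate a candidate fermentation model with an ODE solver, score it against density–time data, and let a global optimizer such as differential evolution search the parameter space. A minimal sketch assuming SciPy and a deliberately simplified two-state sugar/biomass model; the model form, parameter bounds, and data are placeholders, not the model or dataset evaluated in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution

# Placeholder observations: time (h) and must density (g/L) during fermentation.
t_obs = np.array([0, 24, 48, 72, 96, 120, 168])
rho_obs = np.array([1095, 1085, 1060, 1030, 1010, 1000, 992])

def fermentation(t, y, mu_max, k_s, lag):
    """Toy model: biomass X grows on sugar S after a smooth lag; density tracks sugar."""
    X, S = y
    switch = 1.0 / (1.0 + np.exp(-(t - lag)))            # smooth lag-phase switch
    mu = mu_max * S / (k_s + S) * switch
    return [mu * X, -10.0 * mu * X]

def predicted_density(params):
    mu_max, k_s, lag = params
    sol = solve_ivp(fermentation, (0, 168), [0.2, 200.0], t_eval=t_obs,
                    args=(mu_max, k_s, lag), method="BDF")
    return 990.0 + 0.5 * sol.y[1]                         # crude sugar-to-density mapping

def sse(params):
    return float(np.sum((predicted_density(params) - rho_obs) ** 2))

bounds = [(0.01, 0.5), (1.0, 100.0), (0.0, 48.0)]         # mu_max, k_s, lag time
result = differential_evolution(sse, bounds, seed=0, maxiter=50)
print("estimated parameters:", result.x, "SSE:", result.fun)
```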
39. On the Replication of the Pre-kernel and Related Solutions.
- Author
-
Meinhardt, Holger I.
- Subjects
CONVEX functions ,GAME theory ,NEGOTIATION ,DEFAULT (Finance) ,GAMES - Abstract
Based on the results discussed by Meinhardt (The Pre-Kernel as a Tractable Solution for Cooperative Games: An Exercise in Algorithmic Game Theory, volume 45 of Theory and Decision Library: Series C, Springer, Heidelberg, 2013), which presents a dual characterization of the pre-kernel by a finite union of solution sets of a family of quadratic and convex objective functions, we derive some results related to the single-valuedness of the pre-kernel. Rather than extending the knowledge of game classes for which the pre-kernel consists of a single point, we apply a different approach. We select a game from an arbitrary game class with a single pre-kernel element satisfying the non-empty interior condition of a payoff equivalence class, and then establish that the set of related and linearly independent games derived from this pre-kernel point of the default game also replicates this point as its sole pre-kernel element. Hence, a bargaining outcome related to this pre-kernel element is stable. Furthermore, we establish that on the restricted subset of the game space constituted by the convex hull of the default game and the set of related games, the pre-kernel correspondence is single-valued and, consequently, continuous. In addition, we provide sufficient conditions that preserve the pre-nucleolus property for related games even when the default game does not possess a single pre-kernel point. Finally, we apply the same techniques to solutions related to the pre-kernel, namely the modiclus and the anti-pre-kernel, to work out replication results for them. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
40. The Stagger Code for Accurate and Efficient, Radiation-coupled Magnetohydrodynamic Simulations.
- Author
-
Stein, Robert F., Nordlund, Åke, Collet, Remo, and Trampedach, Regner
- Subjects
- *
STELLAR chromospheres , *STELLAR atmospheres , *MOLECULAR clouds , *ATOMIC physics , *SPECTRAL lines , *SOLAR atmosphere , *SOLAR photosphere , *SOLAR chromosphere - Abstract
We describe the Stagger code for simulations of magnetohydrodynamic (MHD) systems. This is a modular code with a variety of physics modules that let the user run simulations of deep stellar atmospheres, sunspot formation, stellar chromospheres and coronae, proto-stellar disks, star formation from giant molecular clouds, and even galaxy formation. The Stagger code is efficient and highly parallelizable, enabling such simulations with large ranges of both spatial and temporal scales. We describe the methodology of the code and present the most important of the physics modules, as well as its input and output variables. We show results of a number of standard MHD tests to enable comparison with other, similar codes. In addition, we provide an overview of tests that have been carried out against solar observations, ranging from spectral line shapes, spectral flux distribution, limb darkening, and intensity and velocity distributions of granulation, to seismic power spectra and the excitation of p-modes. The Stagger code has proven to be a high-fidelity code with a large range of uses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
41. Politicization of Science in COVID-19 Vaccine Communication: Comparing US Politicians, Medical Experts, and Government Agencies.
- Author
-
Zhou, Alvin, Liu, Wenlin, and Yang, Aimei
- Subjects
- *
GOVERNMENT agencies , *COVID-19 vaccines , *POLITICAL communication , *COVID-19 , *ANTI-vaccination movement , *COVID-19 pandemic , *SUICIDE statistics - Abstract
We compare the social media discourses on COVID-19 vaccines constructed by U.S. politicians, medical experts, and government agencies, and investigate how various contextual factors influence the likelihood of government agencies politicizing the issue. Taking the political corpus and the medical corpus as two extremes, we propose a language-based definition of politicization of science and measure it on a continuous scale. By building a machine learning classifier that captures subtle linguistic indicators of politicization and applying it to two years of government agencies' Facebook posting history, we demonstrate that: 1) U.S. politicians heavily politicized COVID-19 vaccines, medical experts conveyed minimal politicization, and government agencies' discourse was a mix of the two, yet more closely resembled that of medical experts; 2) increasing COVID-19 infection rates reduced government agencies' politicization tendencies; 3) government agencies in Democratic-leaning states were more likely to politicize COVID-19 vaccines than those in Republican-leaning states; and 4) the degree of politicization did not significantly differ across agencies' jurisdiction levels. We discuss the conceptualization of politicization of science, the incumbency effect, and government communication as an emerging area for political communication research. [ABSTRACT FROM AUTHOR] (See the classifier sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
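The language-based politicization measure described here amounts to training a text classifier on the two extreme corpora and reading its predicted probability as a continuous score. A minimal sketch assuming scikit-learn; the example posts are invented and the TF-IDF/logistic-regression pipeline is an illustration of the idea, not the authors' classifier.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Two "extreme" training corpora (placeholder posts).
political_posts = ["The other party is using vaccine mandates to control you.",
                   "Vote for us and end these unscientific lockdowns."]
medical_posts = ["Clinical trials show the vaccine is safe and effective.",
                 "Two doses are recommended for adults over 18."]
texts = political_posts + medical_posts
labels = [1] * len(political_posts) + [0] * len(medical_posts)   # 1 = political corpus

# TF-IDF features + logistic regression capture lexical indicators of politicization.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Score government-agency posts on a continuous 0-1 politicization scale.
agency_posts = ["Boosters are now available at your county health department.",
                "Don't let politicians decide your health; get the facts."]
scores = clf.predict_proba(agency_posts)[:, 1]
for post, score in zip(agency_posts, scores):
    print(f"{score:.2f}  {post}")
```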
42. The Study of Pigments in Cultural Heritage: A Review Using Machine Learning.
- Author
-
Harth, Astrid
- Subjects
- *
LITERATURE reviews , *CULTURAL property , *PAINTING techniques , *MACHINE learning , *TEXT mining , *SPECTRAL imaging - Abstract
In this review, topic modeling, an unsupervised machine learning tool, is employed to analyze research on pigments in cultural heritage published from 1999 to 2023. The review answers the following question: What are the topics and time trends of the past three decades in the analytical study of pigments within cultural heritage (CH) assets? In total, 932 articles are reviewed, ten topics are identified, and time trends in the share of these topics are revealed. Each topic is discussed in depth to elucidate the community, purpose, and tools involved. The time-trend analysis shows that dominant topics over time include T1 (the spectroscopic and microscopic study of the stratigraphy of painted CH assets) and T5 (X-ray based techniques for CH, conservation science, and archaeometry). However, both topics have experienced a decrease in attention in favor of other topics that more than doubled their topic share, enabled by new technologies and methods for imaging spectroscopy and image processing. These topics include T6 (spectral imaging techniques for chemical mapping of painting surfaces) and T10 (the technical study of the pigments and painting methods of historical and contemporary artists). Implications for the field are discussed in conclusion. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
43. Modelling of phase transformations occurring in the process of austempered ductile iron manufacturing.
- Author
-
Olejarczyk-Wożeńska, Izabela, Mrzygłód, Barbara, and Adrian, Henryk
- Abstract
The paper presents mathematical models, and their implementation in the C# language, describing the phase transformations that occur in the manufacture of austempered ductile iron (ADI). The research includes two main stages: austenitization and isothermal holding in the bainitic temperature range. The influence of the free energy of austenite and ferrite on the transformations was taken into account. Model parameters were identified based on inverse analysis and experimental research. Verification and validation of the developed models were carried out against the experimental results. The tool developed and implemented enables the analysis of phase transformations occurring during heat treatment with isothermal holding of ductile iron with Ni addition. [ABSTRACT FROM AUTHOR] (See the kinetics sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
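The abstract does not give the model equations, but isothermal transformation models of this kind are commonly built on JMAK (Avrami) kinetics, X(t) = 1 − exp(−k tⁿ), with a temperature-dependent rate constant. A minimal sketch of that generic form for the isothermal holding step, written in Python rather than the paper's C#; the parameter values are illustrative assumptions, not those identified in the paper.

```python
import numpy as np

def jmak_fraction(t: np.ndarray, k: float, n: float) -> np.ndarray:
    """JMAK (Avrami) transformed fraction X(t) = 1 - exp(-k * t**n) for isothermal holding."""
    return 1.0 - np.exp(-k * np.power(t, n))

def arrhenius_rate(temperature_K: float, k0: float, Q: float, R: float = 8.314) -> float:
    """Temperature dependence of the rate constant, k = k0 * exp(-Q / (R*T))."""
    return k0 * np.exp(-Q / (R * temperature_K))

# Illustrative isothermal hold at 350 C (623 K); k0, Q, and n are placeholders.
t = np.linspace(0.0, 3600.0, 7)                 # seconds of isothermal holding
k = arrhenius_rate(623.0, k0=5.0e5, Q=1.3e5)
for ti, xi in zip(t, jmak_fraction(t, k, n=1.5)):
    print(f"t = {ti:6.0f} s  transformed fraction = {xi:.3f}")
```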
44. Accelerated Discovery of Halide Perovskite Materials via Computational Methods: A Review.
- Author
-
Sheng, Ming, Zhu, Hui, Wang, Suqin, Liu, Zhuang, and Zhou, Guangtao
- Subjects
- *
BAND gaps , *PEROVSKITE , *HALIDES , *MATERIALS science , *HIGH throughput screening (Drug development) , *OPTOELECTRONIC devices - Abstract
Halide perovskites have gained considerable attention in materials science due to their exceptional optoelectronic properties, including high absorption coefficients, excellent charge-carrier mobilities, and tunable band gaps, which make them highly promising for applications in photovoltaics, light-emitting diodes, synapses, and other optoelectronic devices. However, challenges such as long-term stability and lead toxicity hinder large-scale commercialization. Computational methods have become essential in this field, providing insights into material properties, enabling the efficient screening of large chemical spaces, and accelerating discovery through high-throughput screening and machine learning techniques. This review discusses the role of computational tools in the accelerated discovery of high-performance halide perovskite materials, such as the double perovskites A2BX6 and A2BB′X6, the zero-dimensional perovskite A3B2X9, and the novel halide perovskite ABX6, and provides insight into how computational methods have accelerated their discovery. Challenges and future perspectives are also presented to stimulate further research progress. [ABSTRACT FROM AUTHOR] (See the screening sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
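One of the simplest high-throughput screening steps used in this area is a geometric pre-filter based on the Goldschmidt tolerance factor, t = (r_A + r_X) / (√2 (r_B + r_X)), which flags compositions likely to form a stable perovskite lattice (roughly 0.8 ≤ t ≤ 1.0 for halides). A minimal sketch; the ionic radii below are approximate Shannon-type values and the acceptance window is a common rule of thumb, not a criterion taken from this review.

```python
from itertools import product
from math import sqrt

# Approximate effective ionic radii in Angstrom (Shannon-type / commonly used values).
A_SITES = {"Cs": 1.88, "MA": 2.17, "FA": 2.53}
B_SITES = {"Pb": 1.19, "Sn": 1.10}
X_SITES = {"Cl": 1.81, "Br": 1.96, "I": 2.20}

def tolerance_factor(r_a: float, r_b: float, r_x: float) -> float:
    """Goldschmidt tolerance factor t = (r_A + r_X) / (sqrt(2) * (r_B + r_X))."""
    return (r_a + r_x) / (sqrt(2) * (r_b + r_x))

# Screen all ABX3 combinations; keep those in a commonly quoted stability window.
for (a, ra), (b, rb), (x, rx) in product(A_SITES.items(), B_SITES.items(), X_SITES.items()):
    t = tolerance_factor(ra, rb, rx)
    verdict = "candidate" if 0.8 <= t <= 1.0 else "likely distorted/unstable"
    print(f"{a}{b}{x}3: t = {t:.3f}  ({verdict})")
```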
45. Stability analysis, lump and exact solutions to Sharma–Tasso–Olver–Burgers equation.
- Author
-
Rehman, Shafqat Ur, Ahmad, Jamshad, Nisar, Kottakkaran Sooppy, and Abdel-Aty, Abdel-Haleem
- Abstract
In this work, the Hirota bilinear symbolic computational method with test functions and the generalized exponential rational function method are employed to obtain soliton and lump solutions of the Sharma–Tasso–Olver–Burgers equation. Several novel soliton solutions are observed in distinct patterns, including periodic, exponential, hyperbolic, dark, singular, and combined forms. Additionally, we extract interaction, lump, and breather solutions of the governing model. The novelty of this work lies in the obtained results, which have not been computed before. Modulation instability of the governing equation is also examined via linear stability theory. To demonstrate the physical aspects and configuration of the obtained solitons, several graphs are plotted in different shapes. The validity of the solutions is verified using Mathematica. The results are encouraging and indicate that these methods can be used to obtain new and useful solutions for a wide range of nonlinear evolution equations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. AstroDLLC: Efficiently Reducing Storage and Transmission Costs for Massive Solar Observation Data via Deep Learning-based Lossless Compression.
- Author
-
Liu, Xiaoying, Liu, Yingbo, Yang, Lei, Wu, Shichao, Jiang, Rong, and Xiang, Yongyuan
- Subjects
- *
DEEP learning , *DATA compression , *SOLAR telescopes , *DATA warehousing , *DATA transmission systems , *QUANTITATIVE research - Abstract
Effective data compression technology is essential for addressing data storage and transmission needs, especially given the escalating volume and complexity of data generated by contemporary astronomy. In this study, we propose utilizing deep learning-based lossless compression techniques to improve compression efficiency. We begin with a qualitative and quantitative analysis of the temporal and spatial redundancy in solar observation data. Based on this analysis, we introduce a novel deep learning-based framework called AstroDLLC for the lossless compression of astronomical solar images. AstroDLLC first segments high-resolution images into blocks so that training the deep learning models does not require devices with high computational power. It then addresses the non-normality of the partitioned data through simple reversible computational methods. Finally, it utilizes Bit-swap to train deep learning models that capture redundant features across multiple image frames, thereby enhancing compression efficiency. Comprehensive evaluations using data from the New Vacuum Solar Telescope reveal that AstroDLLC achieves a maximum compression ratio of 3.00 per image, surpassing Gzip, RICE, and other lossless technologies. The performance of AstroDLLC underscores its potential to address data compression challenges in astronomy. [ABSTRACT FROM AUTHOR] (See the redundancy sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
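The redundancy analysis this record starts from can be illustrated with plain NumPy: if consecutive solar frames are highly correlated, their pixel-wise residuals have far lower entropy than the raw frames, which is exactly what a learned lossless compressor exploits. A minimal sketch on synthetic frames; it illustrates the temporal-redundancy argument only and is not the AstroDLLC pipeline.

```python
import numpy as np

def empirical_entropy_bits(values: np.ndarray) -> float:
    """Empirical Shannon entropy (bits per sample) of an integer array."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Synthetic 16-bit "solar observation" frames: a static background plus small
# per-frame changes, mimicking high temporal redundancy between exposures.
rng = np.random.default_rng(0)
base = rng.integers(1000, 3000, size=(256, 256))
frames = [base + rng.integers(-20, 21, size=base.shape) for _ in range(4)]

raw_entropy = np.mean([empirical_entropy_bits(f.ravel()) for f in frames])
residuals = [frames[i] - frames[i - 1] for i in range(1, len(frames))]
residual_entropy = np.mean([empirical_entropy_bits(r.ravel()) for r in residuals])

print(f"raw frames:      {raw_entropy:.2f} bits/pixel")
print(f"frame residuals: {residual_entropy:.2f} bits/pixel  (lower = more compressible)")
```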
47. Efficient Time-Dependent Method for Strong-Field Ionization of Atoms with Smoothly Varying Radial Steps.
- Author
-
Douguet, Nicolas, Guchkov, Mikhail, Bartschat, Klaus, and Santos, Samantha Fonseca dos
- Subjects
TIME-dependent Schrodinger equations ,RYDBERG states ,ELECTRONS ,ATOMS ,LASERS ,ATTOSECOND pulses - Abstract
We present an efficient numerical method to solve the time-dependent Schrödinger equation in the single-active electron picture for atoms interacting with intense optical laser fields. Our approach is based on a non-uniform radial grid with smoothly increasing steps for the electron distance from the residual ion. We study the accuracy and efficiency of the method, as well as its applicability to investigate strong-field ionization phenomena, the process of high-order harmonic generation, and the dynamics of highly excited Rydberg states. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
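The smoothly varying radial steps described in the preceding record can be realized with a grid whose spacing grows smoothly from a fine step near the ion to a coarse step far from it. A minimal sketch using a tanh ramp between a minimum and maximum step; the functional form and parameter values are assumptions for illustration, not the prescription used by the authors.

```python
import numpy as np

def smooth_radial_grid(dr_min: float, dr_max: float, r_switch: float,
                       width: float, r_max: float) -> np.ndarray:
    """Radial grid whose step grows smoothly from dr_min (near the ion) to dr_max (far away)."""
    points = [0.0]
    while points[-1] < r_max:
        r = points[-1]
        # Step size interpolates smoothly between dr_min and dr_max around r_switch.
        dr = dr_min + 0.5 * (dr_max - dr_min) * (1.0 + np.tanh((r - r_switch) / width))
        points.append(r + dr)
    return np.array(points)

# Illustrative values (atomic units): ~0.02 a.u. steps near the core,
# ~0.25 a.u. steps beyond ~30 a.u., with a smooth transition over ~10 a.u.
grid = smooth_radial_grid(dr_min=0.02, dr_max=0.25, r_switch=30.0, width=10.0, r_max=200.0)
steps = np.diff(grid)
print(f"{grid.size} grid points, step ranges from {steps.min():.3f} to {steps.max():.3f} a.u.")
```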
48. Computational Fluid–Structure Interaction in Microfluidics.
- Author
-
Musharaf, Hafiz Muhammad, Roshan, Uditha, Mudugamuwa, Amith, Trinh, Quang Thang, Zhang, Jun, and Nguyen, Nam-Trung
- Subjects
STRUCTURAL mechanics ,FLUID mechanics ,TREATMENT effectiveness ,STRUCTURAL dynamics ,MICROPUMPS ,MICROFLUIDICS - Abstract
Micro elastofluidics is a transformative branch of microfluidics that leverages fluid–structure interaction (FSI) at the microscale to enhance the functionality and efficiency of various microdevices. This review elucidates the critical role of advanced computational FSI methods in the field of micro elastofluidics. By focusing on the interplay between fluid mechanics and structural responses, these computational methods facilitate the intricate design and optimisation of microdevices such as microvalves, micropumps, and micromixers, which rely on the precise control of fluidic and structural dynamics. In addition, these computational tools extend to the development of biomedical devices, enabling precise particle manipulation and enhancing therapeutic outcomes in cardiovascular applications. Furthermore, this paper addresses the current challenges in computational FSI and highlights the need for further development of tools able to tackle complex, time-dependent models in microfluidic environments and under varying conditions. Our review highlights the expanding potential of FSI in micro elastofluidics and offers a roadmap for future research and development in this promising area. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. Understanding regulation using the Institutional Grammar 2.0.
- Author
-
Siddiki, Saba and Frantz, Christopher K.
- Subjects
GRAMMAR ,CONTENT analysis - Abstract
Over the last decade, there has been increased interest in understanding the design (i.e., content) of regulation as a basis for studying regulation formation, implementation, and outcomes. Within this line of research, scholars have been particularly interested in investigating regulatory dynamics relating to features and patterns of regulatory text and have engaged a variety of methodological approaches to support their assessments. One approach featured in this research is the Institutional Grammar (IG). The IG supports syntactic and semantic analyses of institutional statements (e.g., regulatory provisions) that embed within regulatory text. A recently revised version—called the IG 2.0—further supports robust analyses of regulatory text by offering an expanded feature set particularly well‐suited to extracting and classifying content relevant for the study of regulation. This paper (i) provides a brief introduction to the IG 2.0 and (ii) discusses theoretical and analytical advantages of using the IG 2.0 to study regulation. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
50. Near-tip correction functions for the actuator line method to improve the predicted lift and drag distributions.
- Author
-
Trigaux, Francois, Villeneuve, Thierry, Dumas, Guy, and Winckelmans, Grégoire
- Subjects
VORTEX lattice method ,ASPECT ratio (Aerofoils) ,VORTEX methods ,VORTEX shedding ,WIND turbines - Abstract
The actuator line method (ALM) is a commonly used technique to simulate slender lifting and dragging bodies such as wings or blades. However, the accuracy of the method is significantly reduced near the tip. To quantify the loss of accuracy, translating wings with various aspect and taper ratios are simulated using several methods: wall-resolved Reynolds-averaged Navier–Stokes (RANS) simulations, an advanced ALM with two-dimensional (2-D) mollification of the force, a lifting line method, a mollified lifting line method, and a vortex lattice method. Significant differences in the lift and drag distributions are found on the part of the wing where the distance to the tip is smaller than approximately 3 chords, and are identified as arising from both the mollification of the forces and the uneven induced velocity along the chord. Correction functions acting on the lift coefficient and the effective angle of attack near the wing tip are then derived for rectangular wings of various aspect ratios. They are also applied to wings of various taper ratios using the 'effective dimensionless distance to the tip' as the main parameter. The application of the correction not only leads to a much improved lift distribution, but also to a more consistent drag distribution. The correction functions are also obtained for various mollification sizes, as well as for ALM with three-dimensional (3-D) mollification; these changes mostly impact the correction for the effective angle of attack. Finally, the correction is applied to simulations of the NREL Phase VI wind turbine, leading to an enhanced agreement with the experimental data. [ABSTRACT FROM AUTHOR] (See the mollification sketch following this record.)
- Published
- 2024
- Full Text
- View/download PDF
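The 2-D/3-D mollification referred to in this record is typically a Gaussian regularization kernel that spreads each actuator-line point force over nearby grid cells. A minimal sketch of the standard isotropic 3-D Gaussian kernel of width ε; the blade discretization, force values, and grid are placeholders, and the near-tip correction functions derived in the paper are not reproduced here.

```python
import numpy as np

def gaussian_kernel_3d(distance: np.ndarray, eps: float) -> np.ndarray:
    """Standard isotropic ALM regularization kernel: eta(d) = exp(-(d/eps)**2) / (eps**3 * pi**1.5)."""
    return np.exp(-(distance / eps) ** 2) / (eps ** 3 * np.pi ** 1.5)

# Placeholder blade: 20 actuator points along the x axis, each carrying a vertical force (N).
span_points = np.linspace(0.0, 40.0, 20)
point_forces = np.tile(np.array([0.0, 0.0, 1.0e3]), (20, 1))

# Project the point forces onto a coarse Cartesian grid as a smooth body-force field (N/m^3).
eps = 2.0                                                   # mollification width (m)
x, y, z = np.meshgrid(np.linspace(-5, 45, 51),
                      np.linspace(-5, 5, 11),
                      np.linspace(-5, 5, 11), indexing="ij")
body_force = np.zeros(x.shape + (3,))
for r, f in zip(span_points, point_forces):
    d = np.sqrt((x - r) ** 2 + y ** 2 + z ** 2)             # distance to this actuator point
    body_force += gaussian_kernel_3d(d, eps)[..., None] * f

# Integrating the projected field recovers the total applied force (up to grid truncation).
cell_volume = 1.0 * 1.0 * 1.0
print("total projected force (N):", body_force.sum(axis=(0, 1, 2)) * cell_volume)
```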