195 results
Search Results
2. Semantics, Analytics, Visualization: 3rd International Workshop, SAVE-SD 2017, Perth, Australia, April 3, 2017, and 4th International Workshop, SAVE-SD 2018, Lyon, France, April 24, 2018, Revised Selected Papers
- Abstract
This book constitutes the refereed proceedings of the 3rd International Workshop, SAVE-SD 2017, held in Perth, Australia, in April 2017, and the 4th International Workshop, SAVE-SD 2018, held in Lyon, France, in April 2018. The 6 full, 2 position and 4 short papers were selected from 16 submissions. The papers describe multiple ways in which scholarly dissemination can be improved: creating structured data, providing methods for semantic and computational analysis, and designing systems for navigation. This allows a variety of stakeholders to understand research dynamics, predict trends and evaluate the quality of research.
- Published
- 2018
3. Position paper on requirements for toxicological studies in the specific case of radiopharmaceuticals
- Abstract
This is a position paper of the Radiopharmacy Committee of the EANM (European Association of Nuclear Medicine) addressing toxicology studies for application of new diagnostic and therapeutic radiopharmaceuticals (RP) that are not approved (i.e., not having a marketing authorization or a monograph in the European Pharmacopoeia), excluding endogenous and ubiquitous substances in humans. This paper discusses the requirements for clinical trials with radiopharmaceuticals for clinical research applications, not necessarily intended to aim at a marketing authorization. If marketing authorization is intended, scientific advice of the competent authorities is mandatory and cannot be replaced by this position paper. The position paper reflects the view of the Radiopharmacy Committee of the EANM and can be used as a basis for discussions with the responsible authorities.
- Published
- 2017
5. Preface [to WMC 5 – 5th Int. Workshop on Membrane Computing, Selected papers]
- Published
- 2005
6. Fast but approximate homomorphic k-means based on masking technique
- Abstract
Nowadays, computing on encrypted data is more practical than it was a few years ago, thanks to the emergence of new homomorphic encryption schemes. In this paper, an algorithm based on the Homomorphic Encryption for Arithmetic of Approximate Numbers scheme (Cheon et al., in: Takagi, Peyrin (eds) Advances in Cryptology—ASIACRYPT 2017, Springer, Cham, pp 409–437, 2017) (HEAAN, also known as CKKS), able to perform a secure k-means algorithm on encrypted data, has been studied and presented. The performance of the classifier running on encrypted data has been evaluated against a standard k-means algorithm working on plain data as a supervised reference, since the results are obtained by approximate computations. The main point of this paper is to take existing theoretical techniques (for example, approximations of sgn(x)), apply them, and observe whether they are valid in practical applications. The output of the algorithm is a set of k encrypted masks that can be applied to the original dataset in order to obtain the different clusters. The setting is a standard client–server one. The workload is heavily server-centric, as the client only has to execute a light masking algorithm at the end of each iteration, which, excluding the decryption, is faster than a plain k-means iteration; the main disadvantage concerns the accuracy of the results. Experiments show that the algorithm can be executed fairly quickly: the execution time of the training phase is on the order of seconds, while classification is on the order of tenths of a second.
- Published
- 2023
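The mask-based output described in this abstract can be illustrated with a plain (unencrypted) sketch: a k-means whose result is a set of k boolean masks that, applied to the dataset, yield the clusters. This is a toy stand-in only; the paper's algorithm operates on CKKS ciphertexts, which this sketch does not attempt to reproduce, and the function name is invented for illustration.

```python
import numpy as np

def kmeans_masks(data, k, iters=10, seed=0):
    """Plain (unencrypted) k-means that returns k boolean masks,
    mirroring the masked output format described in the abstract."""
    rng = np.random.default_rng(seed)
    # Initialize centroids from k distinct data points.
    centroids = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster is empty.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    # One mask per cluster: applying mask j to the dataset yields cluster j.
    return [labels == j for j in range(k)]
```

Applying `data[mask]` for each returned mask recovers one cluster, mirroring how the encrypted masks would be applied to the original dataset on the client side.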
7. Empirically Investigating Virtual Learning Companions to Enhance Social Media Literacy
- Abstract
Social media platforms provide opportunities for users across the world to connect, communicate with one another, and engage in acts of social support and entertainment. Yet they can also bring negative consequences, as social media use has been associated with poor mental health and life dissatisfaction. This underlines the importance of delivering social media literacy (SML) interventions that raise awareness of the dangers and threats hidden within these platforms. To date, SML initiatives have shown their benefits for the acquisition of SML skills through school interventions and mini-games. However, studies on promoting SML through social media platforms themselves also need to be encouraged, and innovative approaches providing interactive scenarios with hands-on experience need to be formulated. Hence, the COURAGE project introduces a new approach to SML by proposing the integration of educational opportunities within a controlled social media platform. To give students the opportunity to learn while they naturally explore social media, we propose the integration of virtual learning companions. In this paper we report seven empirical approaches to SML skills acquisition powered by virtual learning companions. The paper concludes with a discussion of the benefits and limitations of this type of SML intervention.
- Published
- 2023
8. CSO Classifier 3.0: a scalable unsupervised method for classifying documents in terms of research topics
- Abstract
Classifying scientific articles, patents, and other documents according to the relevant research topics is an important task, which enables a variety of functionalities, such as categorising documents in digital libraries, monitoring and predicting research trends, and recommending papers relevant to one or more topics. In this paper, we present the latest version of the CSO Classifier (v3.0), an unsupervised approach for automatically classifying research papers according to the Computer Science Ontology (CSO), a comprehensive taxonomy of research areas in the field of Computer Science. The CSO Classifier takes as input the textual components of a research paper (usually title, abstract, and keywords) and returns a set of research topics drawn from the ontology. This new version includes a new component for discarding outlier topics and offers improved scalability. We evaluated the CSO Classifier on a gold standard of manually annotated articles, demonstrating a significant improvement over alternative methods. We also present an overview of applications adopting the CSO Classifier and describe how it can be adapted to other fields.
- Published
- 2022
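The classifier's input/output contract described above (textual components in, set of ontology topics out) can be sketched with a toy syntactic matcher. The ontology below is an invented placeholder, not the actual CSO, and the real classifier additionally performs semantic (embedding-based) matching and outlier-topic filtering, which this sketch omits.

```python
import re

# Toy ontology: topic -> surface forms (the real CSO is far larger).
ONTOLOGY = {
    "machine learning": {"machine learning", "statistical learning"},
    "neural networks": {"neural networks", "neural nets"},
    "ontologies": {"ontologies", "ontology"},
}

def ngrams(tokens, n_max=3):
    """Yield all word n-grams of the token list up to length n_max."""
    for n in range(1, n_max + 1):
        for i in range(len(tokens) - n + 1):
            yield " ".join(tokens[i:i + n])

def classify(text):
    """Return the ontology topics whose surface forms occur in the text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    grams = set(ngrams(tokens))
    return {topic for topic, forms in ONTOLOGY.items() if grams & forms}
```

For example, a title mentioning "neural nets" would map to the canonical topic "neural networks", illustrating how surface forms are normalized to ontology concepts.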
9. The influences of the COVID-19 pandemic on sustainable consumption: an international study
- Abstract
Background: Sustainable production and consumption are two important issues, which mutually interact. Whereas individuals have little direct influence on the former, they can play a key role in the latter. This paper describes the subject matter of sustainable consumption and outlines its key features. It also describes some international initiatives in this field. Results: By means of an international survey, the study explores the emphasis given to sustainable consumption during the second wave of the COVID-19 pandemic, and the degree of preparedness in individuals to engage in the purchase of green and sustainably manufactured products. The main results indicate that the pandemic offered an opportunity to promote sustainable consumption; nevertheless, the pandemic alone cannot be regarded as a ‘game changer’ on this topic. Conclusions: Apart from an online survey with responses from 31 countries, which makes it one of the most representative studies on the topic, a logit model was used to analyse the main variables that affect the probability of pro-environmental consumption behaviour as a result of the COVID-19 pandemic. The paper lists some of the technological and social innovations that may be needed to guide more sustainable consumption patterns in a post-pandemic world.
- Published
- 2022
11. Social and sustainability inclusion: The case study of maam in rome
- Abstract
The aim of the paper is to analyze the state of the art of bottom-up actions that lead to cultural and social regeneration, using an in-depth case study in the city of Rome: the MAAM (Museum of the Other and the Elsewhere), a converted factory that has been occupied and now has a twofold use as social housing and a contemporary art museum. The objectives of this paper are: (a) a literature review of bottom-up organizations that manage to evolve from illegal to institutional (while still remaining illegal from a juridical point of view); (b) a qualitative descriptive analysis of the MAAM case study, enriched by ad hoc interviews with the “pioneers” about cultural processes and experiences in the studied areas; and (c) a documentary analysis of the case study. In particular, through the information analyzed in the documents, the authors have organized the parameters into eight specific categories that help map the practices and their similarities and distinctions. The scheme is useful for replicating the study in other urban areas and comparing the results.
- Published
- 2021
12. Connectedness versus diversification: two sides of the same coin
- Abstract
In the financial framework, the concepts of connectedness and diversification have been introduced and developed respectively in the context of systemic risk and portfolio theory. In this paper we propose a theoretical approach to bring to light the relation between connectedness and diversification. Starting from the respective axiomatic definitions, we prove that a class of proper measures of connectedness verifies, after a suitable functional transformation, the axiomatic requirements for a measure of diversification. The core idea of the paper is that connectedness and diversification are so deeply related that it is possible to pass from one concept to the other. In order to exploit such correspondence, we introduce a function, depending on the classical notion of rank of a matrix, that transforms a suitable proper measure of connectedness into a measure of diversification. We point out general properties of the proposed transformation function and apply it to a selection of measures of connectedness, such as the well-known Variance Inflation Factor.
- Published
- 2021
13. Grossman–Hart–Moore Goes to Italy: Rethinking the Boundaries of the Firm
- Abstract
This paper provides new empirical evidence on the boundaries of the firm, as shaped by the ownership (make-or-buy) and location (domestic-or-foreign) decisions of sourcing. In particular, we draw on the Grossman–Hart–Moore framework to investigate the role of input characteristics, investment spillovers and firm productivity in ownership and location decisions. For the purpose of the empirical analysis, we rely on original survey data of a stratified sample of Italian manufacturing firms, headquartered in Lombardy. Our probit, multinomial probit and conditional mixed process estimations suggest a number of robust regularities. Some of them confirm so far unexplored theoretical predictions from the Grossman–Hart–Moore framework; others provide new insights on specific relationships on which the theory is silent. As for ownership, we find that reliance on specific inputs and intangible inputs fosters integration over non-integration; moreover, firms acknowledging cross spillover effects are more likely to opt for joint-venture than non-integration. As for location, domestic sourcing prevails over foreign sourcing in presence of investment spillovers, whereas input characteristics play no role. Lastly, productivity is a major driver of the boundaries of the firm in that productive firms are more likely to source abroad than domestically. Holding across different econometric models and a number of robustness checks, our results contribute to the property rights theory of the firm and its recent developments.
- Published
- 2023
14. Cutting-edge technology and automation in the pathology laboratory
- Abstract
One of the goals of pathology is to standardize laboratory practices to increase the precision and effectiveness of diagnostic testing, which will ultimately enhance patient care and results. Standardization is crucial in the domains of tissue processing, analysis, and reporting. To enhance diagnostic testing, innovative technologies are also being created and put into use. Furthermore, although problems like algorithm training and data privacy issues still need to be resolved, digital pathology and artificial intelligence are emerging in a structured manner. Overall, for the field of pathology to advance and for patient care to be improved, standard laboratory practices and innovative technologies must be adopted. In this paper, we describe the state of the art of automation in pathology laboratories in order to lead technological progress and evolution. By anticipating laboratory needs and demands, the aim is to inspire innovation tools and processes as positively transformative support for operators, organizations, and patients.
- Published
- 2023
15. Improving the power of hypothesis tests in sparse contingency tables
- Abstract
When analyzing data in contingency tables, it is common to deal with sparse data, particularly when the sample size is small relative to the number of cells. Most analyses of this kind are interpreted in an exploratory manner, and even when tests are performed, little attention is paid to statistical power. This paper proposes a method we call the redundant procedure, which is based on the union–intersection principle and increases test power by focusing on specific components of the hypothesis. This method is particularly helpful when the hypothesis to be tested can be expressed as the intersection of simpler models, such that at least some of them pertain to smaller table marginals. This situation leads to working on tables that are naturally denser. One advantage of this method is its direct application to (chain) graphical models. We illustrate the proposal through simulations and suggest strategies to increase the power of tests in sparse tables. Finally, we demonstrate an application to the EU-SILC dataset.
- Published
- 2023
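The union–intersection idea described above can be illustrated schematically: test each marginal-table component separately and reject the intersection hypothesis as soon as any component rejects. This is a toy illustration, not the authors' redundant procedure; the function names are invented, and a real application needs table-appropriate degrees of freedom and a multiplicity adjustment.

```python
import numpy as np

def chi2_independence(table):
    """Pearson chi-square statistic for independence in a two-way table."""
    table = np.asarray(table, dtype=float)
    # Expected counts under independence: outer product of margins / total.
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()
    return ((table - expected) ** 2 / expected).sum()

def union_intersection_reject(marginals, crit=3.841):
    """Reject the intersection hypothesis if any marginal component rejects
    (union-intersection principle). The default critical value is the 5%
    chi-square quantile with 1 df, appropriate only for 2x2 marginals;
    larger tables need their own df, and the level needs adjusting for
    the number of components."""
    return any(chi2_independence(t) > crit for t in marginals)
```

The point of working on the marginals is that they aggregate cells and are therefore denser than the full table, which is where the power gain comes from.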
16. Firm policies and employees’ participation in conversation about their employer on social media
- Abstract
This paper studies the relationship between firms’ strategy and policies in regard to social media and their employees’ propensity to endorse them by using their personal social media accounts. In particular, the study investigates the effect of employees’ perception of firms’ social media strategy and initiatives aimed at influencing employees’ behavior on their personal social media profiles (communication of policies on the use of social media, training programs, and encouragement to join social media conversations regarding the firm). Based on the responses of 224 employees who use their personal accounts to talk about their firms, findings show that employees’ positive evaluation of firms’ social media strategy and firms’ explicit encouragement are positively associated with employees’ propensity to endorse their firms on social media. Moreover, results reveal the moderating effect of employees’ frequency of social media use on the relationship between communication of social media policies and the employees’ propensity to endorse their firm, as well as on the relationship between training programs and the propensity to endorse. This study provides evidence of the influence of firms’ social media activity and policies on the willingness of employees to promote and advocate their employers using their personal accounts, with theoretical and practical implications. The research also suggests that the effectiveness of firms’ policies may differ according to the frequency of social media usage by employees.
- Published
- 2023
17. The SL(2, ℤ) dualization algorithm at work
- Abstract
Recently an algorithm to dualize a theory into its mirror dual has been proposed, both for 3d N = 4 linear quivers and for their 4d N = 1 uplift. This mimics the manipulations done at the level of the Type IIB brane setup that engineers the 3d theories, where mirror symmetry is realized as S-duality, but it is entirely field-theoretic and based on the application of genuine infra-red dualities that implement the local action of S-duality on the quiver. In this paper, we generalize the algorithm to the full duality group, which is SL(2, ℤ) in 3d and PSL(2, ℤ) in 4d. This also produces dualities for 3d N = 3 theories with Chern-Simons couplings, some of which have enhanced N = 4 supersymmetry, and their new 4d N = 1 counterpart. In addition, we propose three ways to study the RG flows triggered by possible VEVs appearing at the last step of the algorithm, one of which uses a new duality that implements the Hanany-Witten move in field theory.
- Published
- 2023
18. University engagement in open innovation and intellectual property: evidence from university–industry collaborations
- Abstract
University–industry collaborations are an important pathway through which academic scientists engage with industry and society, co-create new knowledge, and raise funds to carry out costly research endeavors. Nonetheless, the management of such collaborations is challenging and requires universities to protect their investments in intellectual property and to capture value from them. This paper examines how scientists’ motivations to undertake inventive activities shape the relationship between research partnerships, the ownership of academic patents resulting from such partnerships, and the licensing of university-owned patents. We examine the interactions between these factors using data on 501 research projects conducted by scientists affiliated with universities located in various countries. Our analysis indicates that the inventors’ motivations bear a direct effect on the ownership and commercialization of academic patents. Moreover, these motivations positively moderate the relationship between research partnerships and the management of academic patents. These findings have interesting implications for university administrators striving to enhance the effectiveness of the technology transfer process.
- Published
- 2023
19. Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora
- Abstract
The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6 % and 84.1 ± 0.6 %, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
- Published
- 2023
20. The relationship between shape parameters and kurtosis in some relevant models
- Abstract
When a distributional model is chosen, the analytic relation between its shape parameters and the values taken by some kurtosis indexes, especially if they are unconventional, is rarely known. In addition, different indexes may provide contrasting evidence about the level of global kurtosis when the parameters of the model are varied. That happens because just a few parameters act “plainly” on kurtosis, namely so as to produce consistent modifications of the shape of the graph on both its sides. Many parameters, instead, affect kurtosis along with a change of the skewness of the distribution, that is by “inflating” a single side of the graph (usually a tail) at the expense of the other. Thanks to some relevant examples, this paper tries to provide general indications to recognize the two kinds of parameters above and to interpret their effect on the classical Pearson’s standardized fourth moment and on some lesser known kurtosis indexes. Specifically, it is shown that only a decomposed analysis of indexes can help to understand their apparent contradictions, especially when some of them are too sensitive to changes in the tails. Finally, some applications are provided.
- Published
- 2023
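The classical index mentioned above, Pearson's standardized fourth moment, together with a crude side-by-side decomposition, can be sketched as follows. The decomposition is only an illustrative stand-in for the "decomposed analysis" the abstract refers to, not the specific indexes studied in the paper.

```python
import numpy as np

def pearson_kurtosis(x):
    """Pearson's standardized fourth moment: E[(X - mu)^4] / sigma^4."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s2 = ((x - m) ** 2).mean()          # population variance
    return ((x - m) ** 4).mean() / s2 ** 2

def side_kurtosis(x):
    """Evaluate the fourth-moment index separately on each side of the mean:
    a simple way to see which tail drives a change in global kurtosis."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s2 = ((x - m) ** 2).mean()
    left = ((x[x < m] - m) ** 4).mean() / s2 ** 2
    right = ((x[x > m] - m) ** 4).mean() / s2 ** 2
    return left, right
```

Comparing the two sides makes the abstract's point concrete: a parameter that "inflates" only one tail moves one side's contribution while barely touching the other, even though the global index changes.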
21. The Role of Educational Interventions in Facing Social Media Threats: Overarching Principles of the COURAGE Project
- Abstract
Social media are offering new opportunities for communication and interaction way beyond what was possible only a few years ago. However, social media are also virtual spaces where young people are exposed to a variety of threats. Digital addiction, discrimination, hate speech, misinformation, polarization as well as manipulative influences of algorithms, body stereotyping, and cyberbullying are examples of challenges that find fertile ground on social media. Educators and students are not adequately prepared to face these challenges. To this aim, the COURAGE project, presented in this paper, introduces new tools and learning methodologies that can be adopted within higher education learning paths to train educators to deal with social media threats. The overarching principles of the COURAGE project leverage the most recent advances in the fields of artificial intelligence and in the educational domain paired with social and media psychological insights to support the development of the COURAGE ecosystem. The results of the experiments currently implemented with teachers and students of secondary schools as well as the impact of the COURAGE project on societal changes and ethical questions are presented and discussed.
- Published
- 2023
22. A spatial semiparametric M-quantile regression for hedonic price modelling
- Abstract
This paper proposes an M-quantile regression approach to address the heterogeneity of the housing market in a modern European city. We show how M-quantile modelling is a rich and flexible tool for empirical market price data analysis, allowing us to obtain a robust estimation of the hedonic price function whilst accounting for different sources of heterogeneity in market prices. The suggested methodology can generally be used to analyse nonlinear interactions between prices and predictors. In particular, we develop a spatial semiparametric M-quantile model to capture both the potential nonlinear effects of the cultural environment on pricing and spatial trends. In both cases, nonlinearity is introduced into the model using appropriate basis functions. We show how the implicit price associated with the variable that measures cultural amenities can be determined in this semiparametric framework. Our findings show that the effect of several housing attributes and urban amenities differs significantly across the response distribution, suggesting that buyers of lower-priced properties behave differently than buyers of higher-priced properties.
- Published
- 2023
23. A model-based ultrametric composite indicator for studying waste management in Italian municipalities
- Abstract
A Composite Indicator (CI) is a useful tool to synthesize information on a multidimensional phenomenon and make policy decisions. Multidimensional phenomena are often modeled by hierarchical latent structures that reconstruct relationships between variables. In this paper, we propose an exploratory, simultaneous model for building a hierarchical CI system to synthesize a multidimensional phenomenon and analyze its several facets. The proposal, called the Ultrametric Composite Indicator (UCI) model, reconstructs the hierarchical relationships among manifest variables detected by the correlation matrix via an extended ultrametric correlation matrix. The latter has the feature of being one-to-one associated with a hierarchy of latent concepts. Furthermore, the proposal introduces a test to unravel relevant dimensions in the hierarchy and retain statistically significant higher-level CIs. A simulation study is presented to compare the proposal with other existing methodologies. Finally, the UCI model is applied to study Italian municipalities’ behavior toward waste management and to provide a tool to guide their councils in policy decisions.
- Published
- 2023
24. A Pan-European Review of Good Practices in Early Intervention Safeguarding Practice with Children, Young People and Families: Evidence Gathering to Inform a Multi-disciplinary Training Programme (the ERICA Project) in Preventing Child Abuse and Neglect in Seven European Countries
- Abstract
Child maltreatment has detrimental social and health effects for individuals, families and communities. The ERICA project is a pan-European training programme that equips non-specialist threshold practitioners with knowledge and skills to prevent and detect child maltreatment. This paper describes and presents the findings of a rapid review of good practice examples across seven participating countries including local services, programmes and risk assessment tools used in the detection and prevention of child maltreatment in the family. Learning was applied to the development of the generic training project. A template for mapping the good practice examples was collaboratively developed by the seven participating partner countries. A descriptive data analysis was undertaken organised by an a priori analysis framework. Examples were organised into three areas: programmes tackling child abuse and neglect, local practices in assessment and referral, risk assessment tools. Key findings were identified using a thematic approach. Seventy-two good practice examples were identified and categorised according to area, subcategory and number. A typology was developed as follows: legislative frameworks, child health promotion programmes, national guidance on child maltreatment, local practice guidance, risk assessment tools, local support services, early intervention programmes, telephone or internet-based support services, COVID-19 related good practices. Improved integration of guidance into practice and professional training in child development were highlighted as overarching needs. The impact of COVID-19 on safeguarding issues was apparent. The ERICA training programme formally responded to the learning identified in this international good practice review.
- Published
- 2023
25. Semisimple Flat F-Manifolds in Higher Genus
- Abstract
In this paper, we generalize the Givental theory for Frobenius manifolds and cohomological field theories to flat F-manifolds and F-cohomological field theories. In particular, we define a notion of Givental cone for flat F-manifolds, and we provide a generalization of the Givental group as a matrix loop group acting on them. We show that this action is transitive on semisimple flat F-manifolds. We then extend this action to F-cohomological field theories in all genera. We show that, given a semisimple flat F-manifold and a Givental group element connecting it to the constant flat F-manifold at its origin, one can construct a family of F-CohFTs in all genera, parameterized by a vector in the associative algebra at the origin, whose genus 0 part is the given flat F-manifold. If the flat F-manifold is homogeneous, then the associated family of F-CohFTs contains a subfamily of homogeneous F-CohFTs. However, unlike in the case of Frobenius manifolds and CohFTs, these homogeneous F-CohFTs can have different conformal dimensions, which are determined by the properties of a certain metric associated to the flat F-manifold.
- Published
- 2023
26. Development and testing of a deep learning-based strategy for scar segmentation on CMR-LGE images
- Abstract
Objective: The aim of this paper is to investigate the use of fully convolutional neural networks (FCNNs) to segment scar tissue in the left ventricle from cardiac magnetic resonance with late gadolinium enhancement (CMR-LGE) images. Methods: A successful FCNN in the literature (the ENet) was modified and trained to provide scar-tissue segmentation. Two segmentation protocols (Protocol 1 and Protocol 2) were investigated, the latter limiting the scar-segmentation search area to the left ventricular myocardial tissue region. CMR-LGE images from 30 patients with ischemic heart disease were retrospectively analyzed, for a total of 250 images, presenting high variability in terms of scar dimension and location. Segmentation results were assessed against manual scar-tissue tracing using one-patient-out cross validation. Results: Protocol 2 outperformed Protocol 1 significantly (p value < 0.05), with median sensitivity and Dice similarity coefficient equal to 88.07% [inter-quartile range (IQR) 18.84%] and 71.25% (IQR 31.82%), respectively. Discussion: Both segmentation protocols were able to detect scar tissues in the CMR-LGE images, but higher performance was achieved when limiting the search area to the myocardial region. The findings of this paper represent an encouraging starting point for the use of FCNNs for the segmentation of nonviable scar tissue from CMR-LGE images.
- Published
- 2019
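The Dice similarity coefficient used as an evaluation metric above, and the Protocol-2 idea of restricting the search area to the myocardium, can be sketched as follows. The restriction function is illustrative only: in the paper the restriction is built into the segmentation pipeline, not applied as a post-hoc mask, and both function names are invented.

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary masks."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    # Convention: two empty masks agree perfectly.
    return 2.0 * inter / denom if denom else 1.0

def restrict_to_myocardium(pred, myo_mask):
    """Protocol-2-style restriction: discard predictions that fall
    outside the myocardial region before scoring."""
    return np.logical_and(pred, myo_mask)
```

The toy example in the test shows the mechanism: false positives outside the myocardium are removed by the restriction, so the Dice score can only improve or stay the same.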
27. Directed Graph Encoding in Quantum Computing Supporting Edge-Failures
- Abstract
Graphs are one of the most common data structures in classical computer science, and graph theory has been widely used in complexity and computability. Recently, the use of graphs in application domains such as routing, network analysis and resource allocation has become crucial. In these areas, graphs often evolve in time: for example, connection links may fail due to temporary technical issues, meaning that edges of the graph cannot be traversed for some time interval and alternative paths have to be followed. In classical computation, where graphs are represented as adjacency matrices or lists, these problems are well studied and ad hoc visit procedures have been developed. For specific problems, quantum computation, through superposition and entanglement, has provided faster algorithms than their classical counterparts. However, in this model only reversible operations are allowed, and this poses the problem of augmenting a graph so that edge traversals can be reversed. In this paper we present a novel graph representation in quantum computation supporting the dynamic connectivity typical of real-world network applications. Our proposal has the advantage of being closer than others in the literature to the adjacency matrix of the graph, which makes dynamic edge-failure modeling easy. We introduce optimal algorithms for computing our graph encoding and show the effectiveness of our proposal with some examples.
- Published
- 2022
28. Search for long-lived particles decaying into muon pairs in proton-proton collisions at √s = 13 TeV collected with a dedicated high-rate data stream
- Abstract
A search for long-lived particles decaying into muon pairs is performed using proton-proton collisions at a center-of-mass energy of 13 TeV, collected by the CMS experiment at the LHC in 2017 and 2018, corresponding to an integrated luminosity of 101 fb⁻¹. The data sets used in this search were collected with a dedicated dimuon trigger stream with low transverse momentum thresholds, recorded at high rate by retaining a reduced amount of information, in order to explore otherwise inaccessible phase space at low dimuon mass and nonzero displacement from the primary interaction vertex. No significant excess of events beyond the standard model expectation is found. Upper limits on branching fractions at 95% confidence level are set on a wide range of mass and lifetime hypotheses in beyond the standard model frameworks with the Higgs boson decaying into a pair of long-lived dark photons, or with a long-lived scalar resonance arising from a decay of a b hadron. The limits are the most stringent to date for substantial regions of the parameter space. These results can also be used to constrain models of displaced dimuons that are not explicitly considered in this paper.
- Published
- 2022
29. Measuring the relative development and integration of EU countries' capital markets using composite indicators and cluster analysis
- Abstract
The paper proposes a set of metrics and a methodology to measure the progress that European Union Member States are making towards the development and integration of capital markets. It identifies a set of indicators and analyzes the performance of these countries over the 2007–2018 period using a composite indicator approach (in both a static and a dynamic setting), based on the six priorities related to achieving a well-functioning and integrated European capital market included in the European Commission Capital Markets Union Action Plan. The author uses robust clustering to identify groups of countries and tracks their development over time. He finds that the process of capital market development and integration has started but is not complete, and that it is mainly associated with countries' adherence to general European trends driven by the benchmarks rather than with policy actions aimed at catching up with the best performers.
- Published
- 2022
30. Regular black holes without mass inflation instability
- Abstract
Generic models of regular black holes have separate outer and inner horizons, both with nonzero surface gravity. It has been shown that a nonzero inner horizon surface gravity results in an exponential instability at the inner horizon controlled by this parameter. This phenomenon goes by the name of "mass inflation instability", and its presence has called into question the physical viability of regular black holes as alternatives to their (singular) general relativity counterparts. In this paper, we show that it is possible to make the inner horizon surface gravity vanish while maintaining the separation between horizons and a nonzero outer horizon surface gravity. We construct specific geometries satisfying these requirements, and analyze their behavior under different kinds of perturbations, showing that the exponential growth characteristic of mass inflation instability is not present for these geometries. These "inner-extremal" regular black holes are thereby better behaved than singular black holes and generic regular black holes, thus providing a well-motivated alternative of interest for fundamental and phenomenological studies.
- Published
- 2022
31. 4d S-duality wall and SL(2, Z) relations
- Abstract
In this paper we present various 4d N = 1 dualities involving theories obtained by gluing two E[USp(2N)] blocks via the gauging of a common USp(2N) symmetry with the addition of 2L fundamental matter chiral fields. For L = 0 in particular the theory has a quantum deformed moduli space with chiral symmetry breaking, and its index takes the form of a delta function. We interpret it as the Identity wall which identifies the two surviving USp(2N) of each E[USp(2N)] block. All the dualities are derived from iterative applications of the Intriligator-Pouliot duality. This plays for us the role of the fundamental duality, from which we derive all others. We then focus on the 3d version of our 4d dualities, which now involves the N = 4 T[SU(N)] quiver theory that is known to correspond to the 3d S-wall. We show how these 3d dualities correspond to the relations S² = −1, S⁻¹S = 1 and STS = T⁻¹S⁻¹T⁻¹ for the S and T generators of SL(2, ℤ). These observations lead us to conjecture that E[USp(2N)] can also be interpreted as a 4d S-wall.
- Published
- 2022
32. Dualities from dualities: the sequential deconfinement technique
- Abstract
It is an interesting question whether a given infra-red duality between quantum field theories can be explained in terms of other, more elementary dualities. For example, it has recently been shown that mirror dualities can be obtained by iterative applications of Seiberg-like dualities. In this paper we continue this line of investigation, focusing on theories with tensor matter. In such cases one can apply the idea of deconfinement, which consists of trading the tensor matter for extra gauge nodes by means of a suitable elementary duality. This gives an auxiliary dual frame which can then be manipulated with further dualizations, in an iterative procedure eventually yielding an interesting dual description of the original theory. The sequential deconfinement technique has avatars in different areas of mathematical physics, such as the study of hypergeometric and elliptic hypergeometric integral identities or of 2d free field correlators. We discuss various examples in the context of 4d N = 1 supersymmetric theories, which are related to elliptic hypergeometric integrals. These include a new self-duality involving a quiver theory which exhibits a non-trivial global symmetry enhancement to E6.
- Published
- 2022
33. Property-Preserving Transformations of Elementary Net Systems Based on Morphisms
- Abstract
Structural transformations that preserve properties of formal models of concurrent systems make their verification easier. We define structural transformations that allow us to abstract and refine elementary net systems. Relations between abstract models and their refinements are formalized using morphisms. The transformations proposed in this paper induce morphisms between elementary net systems and preserve their behavioral properties, especially deadlocks. We also show the application of the proposed transformations to the construction of a correct composition of interacting workflow net components.
- Published
- 2022
34. A SAT Encoding to Compute Aperiodic Tiling Rhythmic Canons
- Abstract
In mathematical music theory, the Aperiodic Tiling Complements Problem consists in finding all the possible aperiodic complements of a given rhythm A. The complexity of this problem depends on the size of the period n of the canon and on the cardinality of the given rhythm A. The current state-of-the-art algorithms can solve instances with n smaller than 180. In this paper we propose an ILP formulation and a SAT encoding to solve this mathemusical problem, and we use the MapleSAT solver to enumerate all the aperiodic complements. We validate our SAT encoding using several different periods and rhythms, and we compute for the first time the complete list of aperiodic tiling complements of standard Vuza rhythms for canons of period n ∈ {180, 420, 900}.
- Published
- 2022
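The tiling condition described in entry 34 can be made concrete with a brute-force sketch (purely illustrative: it enumerates tiling complements in Z_n by exhaustive search, without the aperiodicity filtering or the SAT/ILP machinery of the paper, and is feasible only for tiny n):

```python
from itertools import combinations

def tiles(a, b, n):
    """True iff rhythms a and b tile Z_n: every residue 0..n-1 arises
    exactly once as (x + y) mod n with x in a and y in b."""
    sums = [(x + y) % n for x in a for y in b]
    return sorted(sums) == list(range(n))

def complements(a, n):
    """All tiling complements of rhythm a in Z_n, by exhaustive search
    (no aperiodicity filtering, unlike the paper)."""
    k = n // len(a)  # a complement must contain n/|a| onsets
    return [b for b in combinations(range(n), k) if tiles(a, b, n)]

# Example: in Z_4 the rhythm {0, 2} is tiled by {0, 1} and its relatives.
print(complements((0, 2), 4))  # → [(0, 1), (0, 3), (1, 2), (2, 3)]
```

The SAT encoding of the paper replaces this exhaustive loop with Boolean variables per onset and clauses enforcing the exactly-once covering, which is what makes periods like 900 tractable.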
35. Static, Dynamic and Acceleration Features for CNN-Based Speech Emotion Recognition
- Abstract
Speech emotion recognition is a significant source of information, especially when other channels, like the face or body, are hidden. The shape of the vocal tract, tone of voice, pitch and other characteristics are influenced by human emotions. In this paper, we propose the use of static, dynamic and acceleration features, which are very effective in encoding those characteristics of speech that are influenced by human emotions. These features are based on the concatenation of three global measures of Mel-frequency Cepstral Coefficients (MFCCs) (the static part) and of their first (the dynamic part) and second derivatives (the acceleration part). The features are processed with a custom 1-D CNN suitably designed by the authors for emotion recognition. Experiments are performed on two publicly available speech datasets containing audio files from different speakers and languages and several emotions. On average, our approach outperforms the state of the art on both datasets.
- Published
- 2022
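The static/dynamic/acceleration construction of entry 35 can be sketched in plain numpy (a minimal illustration: in practice a library such as librosa would compute the MFCCs and deltas, and the paper concatenates three global measures per part, whereas only the mean is pooled here):

```python
import numpy as np

def delta(feat, width=2):
    """Regression-based first derivative of a (coeffs, frames) feature
    matrix -- the standard formula for MFCC delta features."""
    F = feat.shape[1]
    denom = 2 * sum(t * t for t in range(1, width + 1))
    padded = np.pad(feat, ((0, 0), (width, width)), mode="edge")
    out = np.zeros_like(feat, dtype=float)
    for t in range(1, width + 1):
        out += t * (padded[:, width + t:width + t + F]
                    - padded[:, width - t:width - t + F])
    return out / denom

def static_dynamic_accel(mfcc):
    """Concatenate pooled static, delta and delta-delta MFCCs into one
    fixed-length vector (only the mean is pooled here, for brevity)."""
    d1 = delta(mfcc)
    d2 = delta(d1)
    return np.concatenate([m.mean(axis=1) for m in (mfcc, d1, d2)])

rng = np.random.default_rng(0)
mfcc = rng.standard_normal((13, 100))    # 13 coefficients, 100 frames
print(static_dynamic_accel(mfcc).shape)  # (39,)
```

The resulting fixed-length vector (or, in the paper, the stacked static/dynamic/acceleration maps) is what the 1-D CNN consumes.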
36. Evaluating space measures in P systems
- Abstract
P systems with active membranes are a variant of P systems where membranes can be created by division of existing membranes, thus creating an exponential amount of resources in a polynomial number of steps. Time and space complexity classes for active membrane systems have been introduced to characterize classes of problems that can be solved by different membrane systems making use of different resources. In particular, the space complexity classes introduced initially considered a hypothetical real implementation by means of biochemical materials, assuming that every single object or membrane requires some constant physical space (corresponding to unary notation). A different approach considered the implementation of P systems in silico, allowing the multiplicity of each object in each membrane to be stored using binary numbers. In both cases, the elements contributing to the definition of the space required by a system (namely, the total number of membranes, the total number of objects, the types of different membranes, and the types of different objects) were considered as a whole. In this paper, we consider a different definition of space complexity classes in the framework of P systems, where each of the previous elements is considered independently. We review the principal results related to the solution of different computationally hard problems presented in the literature, highlighting the requirements on every single resource in each solution. A discussion concerning possible alternative solutions requiring different resources is presented.
- Published
- 2022
37. A rescaling technique to improve numerical stability of portfolio optimization problems
- Abstract
This paper analyzes the numerical stability of the Markowitz portfolio optimization model by identifying and studying a source of instability that strictly depends on the mathematical structure of the optimization problem and its constraints. As a consequence, it is shown how standard portfolio optimization models can turn out to be unstable even when the covariance matrix is well conditioned and the objective function is numerically stable. This is due to the fact that the linear equality constraints of the model very often suffer from near-collinearity and/or bad scaling. A theoretical approach is proposed that, by exploiting an equivalent formulation of the original optimization problem, considerably reduces this structural component of instability. The effectiveness of the proposal is empirically certified through applications to real financial data in which numerical optimization approaches are needed to compute the optimal portfolio. Gurobi and MATLAB's solvers quadprog and fmincon are compared in terms of convergence performance.
- Published
- 2022
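The bad-scaling effect discussed in entry 37 can be illustrated with a minimal numpy sketch (a toy under assumed numbers, not the paper's reformulation: it simply equilibrates the rows of a badly scaled equality-constraint system, which leaves the feasible set unchanged while improving conditioning):

```python
import numpy as np

# Badly scaled equality constraints A x = b, as in a portfolio model where
# a budget constraint (coefficients ~1) sits next to a target-return
# constraint with expected returns of the order of 1e-4.
A = np.array([[1.0, 1.0, 1.0],
              [2e-4, 5e-4, 1e-4]])
b = np.array([1.0, 3e-4])

# Rescale each row to unit Euclidean norm; any x satisfying A x = b
# still satisfies the scaled system, and vice versa.
norms = np.linalg.norm(A, axis=1, keepdims=True)
A_s, b_s = A / norms, b / norms.ravel()

print(np.linalg.cond(A), np.linalg.cond(A_s))  # conditioning improves
```

Row equilibration is only the simplest instance of the idea; the paper's equivalent reformulation addresses the structural near-collinearity of the constraints as well.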
38. On the stability of string theory vacua
- Abstract
Vacuum compactifications may suffer from instabilities under small perturbations or tunnel effects; both are difficult to analyze. In this paper we consider the issue from a higher-dimensional perspective. We first look at how stability works for supersymmetric vacua, where it is widely expected to hold. We show that the nucleation of brane bubbles in type II AdS compactifications is forbidden in the probe approximation by a simple argument involving pure spinors and calibrations. We then adapt familiar positive-energy theorems directly to M-theory and type II supergravity, rather than to their effective lower-dimensional reductions, also showing how to consistently include localized sources. We finally initiate an analysis of how these arguments might be extended to non-supersymmetric vacua. In M-theory, at the lower-derivative level, we find that the most natural modifications fail to stabilize the skew-whiffed and Englert vacua.
- Published
- 2022
39. Dual frame design in agricultural surveys: reviewing roots and methodological perspectives
- Abstract
This paper intends to contribute to an up-to-date discussion of dual frame designs in agricultural surveys. It starts by reviewing historical scenarios of application to envision new perspectives, and ends by presenting a modern approach to the problem. A dual frame sampling design is proposed that has the appeal of relying on low-cost technological resources. The design has enough generality to allow for applications not only to agricultural but also to rural and environmental surveys, or any other survey related to the use of soil. Unbiased estimators based on domain and multiplicity approaches are presented and their major differences are discussed. Design parameters, design feasibility under different sample size allocations, as well as the statistical performance of several dual frame estimators are investigated using a Monte Carlo simulation study that is built on information from the Brazilian agricultural census of 2006 and FAO's Global Strategy's field experiences in the city of Goiana, Pernambuco. The results show that dual frames yield a gain in precision when compared to a single area frame survey. In addition, the choice of the best design and estimator depends upon scenarios with different types of allocation and different sizes of area frame segments.
- Published
- 2022
40. Hierarchical disjoint principal component analysis
- Abstract
Dimension reduction, by means of Principal Component Analysis (PCA), is often employed to obtain a reduced set of components preserving the largest possible part of the total variance of the observed variables. Several methodologies have been proposed either to improve the interpretation of PCA results (e.g., by means of orthogonal or oblique rotations, or shrinkage methods), or to model oblique components or factors with a hierarchical structure, as in Bi-factor and High-Order Factor analyses. In this paper, we propose a new methodology, called Hierarchical Disjoint Principal Component Analysis (HierDPCA), that aims at building a hierarchy of disjoint principal components of maximum variance associated with disjoint groups of observed variables, from Q up to a unique, general one. HierDPCA also allows choosing the type of relationship among disjoint principal components of two sequential levels, from the lowest upwards, by testing the component correlation per level and switching from a reflective to a formative approach when this correlation turns out not to be statistically significant. The methodology is formulated in a semi-parametric least-squares framework and a coordinate descent algorithm is proposed to estimate the model parameters. A simulation study and two real applications are illustrated to highlight the empirical properties of the proposed methodology.
- Published
- 2022
41. The world trade network: country centrality and the COVID-19 pandemic
- Abstract
International trade is based on a set of complex relationships between different countries that can be modelled as an extremely dense network of interconnected agents. On the one hand, this network might favour the economic growth of countries, but on the other, it can also favour the diffusion of diseases, such as COVID-19. In this paper, we study whether, and to what extent, the topology of the trade network can explain the rate of COVID-19 diffusion and mortality across countries. We compute the countries’ centrality measures and we apply the community detection methodology based on communicability distance. We then use these measures as focal regressors in a negative binomial regression framework. In doing so, we also compare the effects of different measures of centrality. Our results show that the numbers of infections and fatalities are larger in countries with a higher centrality in the global trade network.
- Published
- 2022
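The "centrality as regressor" input of entry 41 can be roughly illustrated (the paper uses communicability-based measures and community detection; here a plain eigenvector centrality on a made-up toy adjacency matrix):

```python
import numpy as np

def eigenvector_centrality(adj, iters=200):
    """Eigenvector centrality via power iteration on an adjacency
    matrix, shifted by the identity so that bipartite graphs also
    converge; scores are normalized to sum to one."""
    x = np.ones(adj.shape[0])
    for _ in range(iters):
        x = adj @ x + x        # (A + I) x keeps the leading eigenvector
        x /= np.linalg.norm(x)
    return x / x.sum()

# Toy "trade network": country 0 trades with everyone, countries 1-3
# trade only with country 0 (a star graph).
adj = np.array([[0, 1, 1, 1],
                [1, 0, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0]], dtype=float)
c = eigenvector_centrality(adj)
print(c.round(3))  # → [0.366 0.211 0.211 0.211]; the hub dominates
```

A vector of such scores per country is what would enter the negative binomial regression as the focal regressor.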
42. A new approach in model selection for ordinal target variables
- Abstract
Multi-class predictive models are generally evaluated by averaging binary classification indicators, without a distinction between nominal and ordinal dependent variables. This paper introduces a novel approach to assessing the performance of predictive models characterized by an ordinal target variable, and a new index for model evaluation is proposed. The new index satisfies desirable mathematical properties and can be applied to the evaluation of both parametric and nonparametric models. In order to show how our performance indicator works, empirical evidence obtained on toy examples and simulated data is provided. On the basis of the results achieved, we underline that our approach can be a more suitable criterion for model selection than the performance indexes currently suggested in the literature.
- Published
- 2022
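A toy example of why ordinal targets need dedicated evaluation, as argued in entry 42 (the score below, one minus a normalized mean absolute error, is purely illustrative and not the index proposed in the paper):

```python
import numpy as np

# Two models with the same accuracy on an ordinal target (classes 1..5):
# model A errs by one class, model B by four. Plain accuracy cannot
# separate them; an ordinal-aware score can.
y_true = np.array([1, 2, 3, 4, 5, 1, 2, 3, 4, 5])
y_a = np.array([2, 2, 3, 4, 5, 1, 2, 3, 4, 4])  # off-by-one errors
y_b = np.array([5, 2, 3, 4, 5, 1, 2, 3, 4, 1])  # off-by-four errors

def accuracy(t, p):
    return np.mean(t == p)

def ordinal_score(t, p, k=5):
    """1 - MAE normalized by the largest possible class distance."""
    return 1 - np.abs(t - p).mean() / (k - 1)

print(accuracy(y_true, y_a), accuracy(y_true, y_b))            # 0.8 0.8
print(ordinal_score(y_true, y_a), ordinal_score(y_true, y_b))  # 0.95 0.8
```

Any index that rewards small ordinal distances, including the one proposed in the paper, ranks model A above model B where accuracy ties them.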
43. Non-Gaussianities in the statistical distribution of heavy OPE coefficients and wormholes
- Abstract
The Eigenstate Thermalization Hypothesis (ETH) makes a prediction for the statistical distribution of matrix elements of simple operators in energy eigenstates of chaotic quantum systems. As a leading approximation, off-diagonal matrix elements are described by Gaussian random variables, but higher-point correlation functions enforce non-Gaussian corrections which are further exponentially suppressed in the entropy. In this paper, we investigate non-Gaussian corrections to the statistical distribution of heavy-heavy-heavy OPE coefficients in chaotic two-dimensional conformal field theories. Using the Virasoro crossing kernels, we provide asymptotic formulas involving arbitrary numbers of OPE coefficients from modular invariance on genus-g surfaces. We find that the non-Gaussianities are further exponentially suppressed in the entropy, much as in the ETH. We discuss the implications of these results for products of CFT partition functions in gravity and Euclidean wormholes. Our results suggest that there are new connected wormhole geometries that dominate over the genus-two wormhole.
- Published
- 2022
44. The Role of Information and Communication Technologies in Researching Older People During the Covid-19 Pandemic
- Abstract
The Longitudinal Study on Older People's Quality of Life during the Covid-19 pandemic (ILQA-19) is a qualitative study carried out during the 2020 lockdown on 40 older men and women living in the ten villages in northern Italy subject to the first lockdown in Europe. This study focuses on older people's lives and the role of digital technologies during the pandemic, and it was carried out fully remotely. Despite the need to research the social consequences of pandemics for older people, there is a shortage of studies that provide guidelines on how to successfully involve this population in online qualitative studies. This paper contributes to filling this gap by discussing the use of Information and Communication Technology (ICT) in implementing the different stages of the ILQA-19 research. The best practices of qualitative studies conducted through ICTs are discussed, along with the strategies we enacted to enhance participation in the study. Specifically, panel engagement, tailoring procedures and building positive and trustworthy interactions with study members are crucial when researching older people through online methods.
- Published
- 2022
45. Intraoperative MRI versus intraoperative ultrasound in pediatric brain tumor surgery: is expensive better than cheap? A review of the literature
- Abstract
Purpose: The extent of brain tumor resection (EOR) is a fundamental prognostic factor in pediatric neuro-oncology, in association with the histology. In general, resection aims at gross total resection (GTR). Intraoperative imaging such as intraoperative ultrasound (iOUS) and intraoperative MRI (iMRI) has been developed in order to find any tumoral remnant, but at different costs. The aim of our work is to review the current literature in order to better understand the differences between the costs and efficacy of iMRI and iOUS in evaluating tumor remnants intraoperatively. Methods: We reviewed the existing literature on PubMed until 31st December 2021, including the sequential keywords "intraoperative ultrasound and pediatric brain tumors", "iUS and pediatric brain tumors", "intraoperative magnetic resonance AND pediatric brain tumors", and "intraoperative MRI AND pediatric brain tumors". Results: A total of 300 papers were screened through analysis of title and abstract; 254 were excluded. After selection, a total of 23 articles were used for this systematic review. Among the 929 patients described, a total of 349 (38%) of the cases required an additional resection after an iMRI scan. GTR was measured on 794 patients (data for 69 patients were lost), and it was achieved in 552 (70%) patients. In the case of iOUS, GTR was estimated in 291 out of 379 (77%) cases. This finding was confirmed at the post-operative MRI in 256 (68%) cases. Conclusions: The analysis of the available literature demonstrates that expensive equipment does not always mean better. In fact, for the majority of pediatric brain tumors, iOUS is comparable to iMRI in estimating the EOR.
- Published
- 2022
46. New horizons for fundamental physics with LISA
- Abstract
The Laser Interferometer Space Antenna (LISA) has the potential to reveal wonders about the fundamental theory of nature at play in the extreme gravity regime, where the gravitational interaction is both strong and dynamical. In this white paper, the Fundamental Physics Working Group of the LISA Consortium summarizes the current topics in fundamental physics where LISA observations of gravitational waves can be expected to provide key input. We provide the briefest of reviews to then delineate avenues for future research directions and to discuss connections between this working group, other working groups and the consortium work package teams. These connections must be developed for LISA to live up to its science potential in these areas.
- Published
- 2022
47. Parametric Bandits for Search Engine Marketing Optimisation
- Abstract
Expense optimisation for online marketing is a relevant and challenging task. In particular, the problem of splitting the daily budget among campaigns, together with the problem of setting bids for the auctions that regulate ad appearance, has recently been cast as a multi-armed bandit problem. However, at the current state of the art, several shortcomings limit practical applications. Indeed, campaigns are routinely divided by practitioners into sub-entities called ad groups, while current approaches take into account only the case of single ad groups: in this paper, we extend the state of the art to multiple ad groups. Moreover, we propose a contextual bandit model which achieves high data efficiency, especially important for campaigns with few clicks and/or a small conversion rate. Our model exploits domain knowledge to greatly reduce the exploration space by using parametric Bayesian regression. Elicitation of prior distributions from domain experts is simplified by interpretability, while action selection is carried out by Thompson sampling and local optimisation methods. A simulation environment was built to compare the proposed approach to current state-of-the-art methods. The effectiveness of the proposed approach is confirmed by a rich set of numerical experiments, especially in the early days of marketing expense optimisation.
- Published
- 2022
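The action-selection step of entry 47 can be sketched with a minimal Thompson sampling example (a Beta-Bernoulli toy with made-up click-through rates, standing in for the parametric Bayesian regression model used in the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# Three hypothetical ad groups with unknown click-through rates;
# independent Beta posteriors summarize the clicks observed so far.
true_ctr = np.array([0.02, 0.05, 0.03])
alpha = np.ones(3)  # 1 + successes
beta = np.ones(3)   # 1 + failures

for _ in range(20000):
    arm = rng.beta(alpha, beta).argmax()   # sample a CTR per arm, pick best
    click = rng.random() < true_ctr[arm]
    alpha[arm] += click
    beta[arm] += 1 - click

pulls = alpha + beta - 2
print(pulls.argmax())  # the highest-CTR ad group dominates the budget
```

Replacing the independent Beta posteriors with a shared parametric regression, as the paper does, is what lets information flow across ad groups and keeps exploration cheap for low-click campaigns.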
49. Fractional Kirchhoff Hardy problems with singular and critical Sobolev nonlinearities
- Abstract
The paper deals with the following singular fractional problem
$$\begin{cases}
M\left(\displaystyle\iint_{\mathbb{R}^{2N}}\frac{|u(x)-u(y)|^{2}}{|x-y|^{N+2s}}\,dx\,dy\right)(-\Delta)^{s}u-\mu\,\frac{u}{|x|^{2s}}=\lambda f(x)u^{-\gamma}+g(x)u^{2_{s}^{*}-1} & \text{in }\Omega,\\
u>0 & \text{in }\Omega,\\
u=0 & \text{in }\mathbb{R}^{N}\setminus\Omega,
\end{cases}$$
where $\Omega\subset\mathbb{R}^{N}$ is an open bounded domain with $0\in\Omega$, the dimension $N>2s$ with $s\in(0,1)$, $2_{s}^{*}=2N/(N-2s)$ is the fractional critical Sobolev exponent, $\lambda$ and $\mu$ are positive parameters, the exponent $\gamma\in(0,1)$, $M$ models a Kirchhoff coefficient, $f$ is a positive weight, while $g$ is a sign-changing function. The main feature and novelty of our problem is the combination of the critical Hardy and Sobolev nonlinearities with the bi-nonlocal framework and a singular nondifferentiable term. By exploiting the Nehari manifold approach, we provide the existence of at least two positive solutions.
- Published
- 2022
50. Forming classes in an e-Learning social network scenario
- Abstract
Online Social Networks are suitable environments for e-Learning for several reasons. First of all, there are many similarities between social network groups and classrooms. Furthermore, trust relationships taking place within groups can be exploited to give users the motivation they need to engage in classroom activities. In this paper we exploit information about users' skills, interactions and trust relationships, which are supposed to be available on Online Social Networks, to design a model for managing the formation and evolution of e-Learning classes and for providing suggestions to a user about the best class to join and to the class itself about the best students to accept. The proposed approach is validated by a simulation which proves the convergence of the distributed algorithm discussed in this paper.
- Published
- 2017