683 results
Search Results
2. Artificial life and intelligent agents: second international symposium, ALIA 2016, Birmingham, UK, June 14-15, 2016, revised selected papers.
- Author
-
Lewis, Peter R., Headleand, Christopher J., Battle, Steve, and Ritsos, Panagiotis D.
- Subjects
Robotics ,Artificial intelligence ,Computer simulation ,Computational intelligence ,Genetic algorithms - Abstract
Summary: This book constitutes the refereed proceedings of the Second International Symposium on Artificial Life and Intelligent Agents, ALIA 2016, held in Birmingham, UK, in June 2016. The eight revised full papers and three revised short papers presented together with two demo papers were carefully reviewed and selected from 25 submissions. The papers are organized in topical sections on modelling; robotics; bio-inspired problem solving; human-like systems; applications and games.
- Published
- 2018
3. Artificial intelligence for knowledge management : second IFIP WG 12.6 International Workshop, AI4KM 2014, Warsaw, Poland, September 7-10, 2014, revised selected papers.
- Author
-
Mercier-Laurent, Eunika, Owoc, Mieczysław Lech, and Boulanger, Danielle
- Subjects
Artificial intelligence ,Knowledge management ,Data mining ,Database management - Abstract
Summary: This book features a selection of papers presented at the Second IFIP WG 12.6 International Workshop on Artificial Intelligence for Knowledge Management, AI4KM 2014, held in Warsaw, Poland, in September 2014, in the framework of the Federated Conferences on Computer Science and Information Systems, FedCSIS 2014. The nine revised and extended papers and one invited paper were carefully reviewed and selected for inclusion in this volume. They present new research and innovative aspects in the field of knowledge management and are organized in the following topical sections: tools and methods for knowledge acquisition; models and functioning of knowledge management; techniques of artificial intelligence supporting knowledge management; and components of knowledge flow.
- Published
- 2016
4. Off-the-shelf CRM with Drupal: a case study of documenting decorated papers
- Abstract
We present a method of setting up a website using the Drupal CMS to publish CRM data. Our setup requires only basic technical expertise of researchers, who are then able to publish their records both in a human-accessible way through HTML and in a machine-friendly format through RDFa. We begin by examining previous work on Drupal and the CRM and identifying useful patterns. We present the Drupal modules that are required by our setup and explain why these are sustainable. We continue by giving guidelines for setting up Drupal to serve CRM data easily, and we describe a specific installation for our case study, which is related to decorated papers, alongside our CRM mapping. We finish by highlighting the benefits of our method (i.e. speed and user-friendliness) and refer to a number of issues which require further work (i.e. automatic validation, UI improvements and the provision of SPARQL endpoints).
- Published
- 2016
13. Composting paper and grass clippings with anaerobically treated palm oil mill effluent
- Abstract
Purpose The purpose of this study is to investigate the composting performance of anaerobically treated palm oil mill effluent (AnPOME) mixed with paper and grass clippings. Methods Composting was conducted using a laboratory-scale system for 40 days. Several parameters were determined: temperature, mass reduction, pH, electrical conductivity, colour, zeta potential, phytotoxicity and final compost nutrients. Results The moisture content and compost mass were reduced by 24 and 18 %, respectively. Both the final compost pH value and the electrical conductivity were found to increase. Colour (measured as PtCo) was not suitable as a maturity indicator. The negative zeta potential values decreased from −12.25 to −21.80 mV. The phytotoxicity of the compost mixture decreased during the process, and the final nutrient value of the compost indicates its suitability as a soil conditioner. Conclusions From this study, we conclude that paper and grass clippings are a potential substrate to be composted with anaerobically treated palm oil mill effluent (AnPOME). The final compost produced is suitable as a soil conditioner.
- Published
- 2016
15. Combinatorial Algorithms: 21st International Workshop, IWOCA 2010, London, UK, July 26-28, 2010, Revised Selected Papers
- Author
-
Iliopoulos, C.S.
- Abstract
Edited Book
- Published
- 2011
16. Supply Rigidities in Input-Output Modeling of the Wood and Paper Industry Development
- Abstract
A historical analysis of the development of the USSR wood and paper industry was undertaken in order to reflect some specific features of the distribution of resources in the structure of the IIASA global model for the forest sector.
- Published
- 1985
18. Standardizing the Software Tag in Japan for Transparency of Development
- Abstract
In this paper, we describe the Software Tag, which makes software development visible to software purchasers (users). A software tag is a partial set of empirical data about a software development project shared between the purchaser and developer. The purchaser uses the software tag to evaluate the software project, allowing them to recognize the quality level of the processes and products involved. With Japanese government support, we have successfully standardized the software tag as Software Tag Standard 1.0, and have developed various associated tools for tag data collection and visualization. For its initial evaluation, the software tag has been applied to several projects. This paper also presents various activities aimed at promoting the use of the software tag in Japan and worldwide.
PROFES 2010 : Product-Focused Software Process Improvement, June 21-23, 2010, Limerick, Ireland
- Published
- 2023
19. Neural embedding: learning the embedding of the manifold of physics data
- Abstract
In this paper, we present a method of embedding physics data manifolds with metric structure into lower dimensional spaces with simpler metrics, such as Euclidean and Hyperbolic spaces. We then demonstrate that it can be a powerful step in the data analysis pipeline for many applications. Using progressively more realistic simulated collisions at the Large Hadron Collider, we show that this embedding approach learns the underlying latent structure. With the notion of volume in Euclidean spaces, we provide for the first time a viable solution to quantifying the true search capability of model agnostic search algorithms in collider physics (i.e. anomaly detection). Finally, we discuss how the ideas presented in this paper can be employed to solve many practical challenges that require the extraction of physically meaningful representations from information in complex high dimensional datasets.
- Published
- 2023
20. From Bismarck to Beveridge: the other pension reform in Spain
- Abstract
Ageing is the major challenge for PAYG pension systems in developed countries. Most of them are undergoing reforms in order to adapt to the new demographic reality. The package of reforms implemented includes increasing the retirement age, reducing the replacement rate, or introducing a sustainability factor linking pensions to life expectancy. The aim of this paper is to analyse the potential consequences of a different type of reform that is at a very incipient stage in Spain but that could have a significant impact if it were fully implemented. This reform, called the ‘silent reform’ because it is imperceptible to citizens in its early stages, basically consists in increasing maximum pensions in line with inflation instead of wage or productivity growth. This policy is reducing the replacement rate only for high-earning workers and increasing the redistributive component of the system. This paper is the first to quantify and evaluate the potential consequences of this type of reform in Spain. We have used an accounting model with heterogeneous agents and overlapping generations in order to project pension expenditure for the next six decades. The results show that this type of reform could potentially contain future expenditure, but at the cost of changing the nature of the pension system from a contributory or Bismarckian-type system into a pure redistributive or Beveridgean-type one.
Ministerio de Ciencia e Innovación (MICINN), Depto. de Análisis Económico y Economía Cuantitativa, Fac. de Ciencias Económicas y Empresariales
- Published
- 2023
21. Effort Estimation Based on Collaborative Filtering
- Abstract
Effort estimation methods are among the important tools for project managers in controlling the human resources of ongoing or future software projects. The estimations require historical project data, including process and product metrics that characterize past projects. In practice, it is a problem that the historical project data frequently contain substantial missing values. In this paper, we propose an effort estimation method based on Collaborative Filtering to solve this problem. Collaborative Filtering has been developed by information retrieval researchers as one of the estimation techniques for defective data, i.e. data having substantial missing values. The proposed method first evaluates the similarity between a target (ongoing) project and each past project, using a vector-based similarity computation equation. Then it predicts the effort of the target project with the weighted sum of the efforts of similar past projects. We conducted an experimental case study to evaluate the estimation performance of the proposed method. The proposed method showed better performance than the conventional regression method when the data had substantial missing values.
PROFES 2004 : Product Focused Software Process Improvement, April 5-8, 2004, Kansai Science City, Japan
- Published
- 2023
22. Webjig: An Automated User Data Collection System for Website Usability Evaluation
- Abstract
In order to improve website usability, it is important for developers to understand how users access websites. In this paper, we present Webjig, a support system for website usability evaluation that resolves the problems associated with existing systems. Webjig can collect users’ interaction data from static and dynamic websites. Moreover, by using Webjig, developers can precisely identify users’ activities on websites. Through an experiment to evaluate the usefulness of Webjig, we have confirmed that developers could effectively improve website usability.
HCI 2009 : Human-Computer Interaction. New Trends, July 19-24, 2009, San Diego, CA, USA
- Published
- 2023
23. Community Search: A Collaborative Searching Web Application with a User Ranking System
- Abstract
People use search engines in daily life, but most of the tools we have today treat information-seeking tasks as a transient activity. In this research paper we introduce a web application system that provides collaborative functions and an expert-finding system. We develop a system that helps users organize search results and collaborate with others. With the new iterative algorithm, users will also obtain a higher proportion of the results they need, and the system will be able to suggest experts related to the search keyword.
OCSC 2011 : Online Communities and Social Computing, July 9-14, 2011, Orlando, FL, USA
- Published
- 2023
24. Exploiting Eye Gaze Information for Operating Services in Home Network System
- Abstract
This paper presents a system which extensively exploits the user’s eye gaze information for operating services and appliances in the emerging home network system (HNS). We design and implement the system, called AXELLA, which captures the user’s gaze, then invokes a service operation, and finally announces the response via voice. AXELLA interprets the gaze information together with supplementary information as a gaze context, and triggers a service module associated by a service rule. Thus, a simple gazing activity can be used for various service operations. Service developers (or even home users) can easily develop context-aware HNS services with the eye-gaze-based UI. We demonstrate a practical service called “See and Know” implemented using AXELLA, where a user can acquire the current status information of every appliance just by looking at the appliance. It was shown that the proposed system can significantly reduce the artificial dependency with respect to ease of learning and system scalability.
UCS 2006 : Ubiquitous Computing Systems, October 11-13, 2006, Seoul, Korea
- Published
- 2023
25. A Software Process Tailoring System Focusing to Quantitative Management Plans
- Abstract
This paper presents a survey of the use of quantitative management indicators in a Japanese software development organization. The survey was conducted in order to investigate possible criteria for selecting and customizing organizational standard indicators according to the context of each project. Based on the results of the survey, we propose a process tailoring support system that focuses mainly on quantitative management planning. The system, EPDG+ (Electronic Process Data Guidebook Plus), helps project planners select and customize indicators to be employed in process control. Derived software project plans, including measurement and analysis activities, can be browsed in detail with this system.
PROFES 2006 : Product-Focused Software Process Improvement, June 12-14, 2006, Amsterdam, The Netherlands
- Published
- 2023
26. Impact Analysis of Granularity Levels on Feature Location Technique
- Abstract
Due to the increase in software requirements and software features, modern software systems continue to grow in size and complexity. Locating the source code entities required to implement a feature in millions of lines of code is labor- and cost-intensive for developers. To this end, several studies have proposed the use of Information Retrieval (IR) to rank source code entities based on their textual similarity to an issue report. The ranked source code entities may be at a class or function granularity level. Source code entities at the class level are usually large and might contain many functions that are not implemented for the feature. Hence, we conjecture that class-level feature location requires more effort than function-level feature location. In this paper, we investigate the impact of granularity levels on a feature location technique. We also present a new evaluation method using effort-based evaluation. The results indicate that the function-level feature location technique outperforms the class-level technique. Moreover, the function-level technique required 7 times less effort than the class-level technique to localize the first relevant source code entity. Therefore, we conclude that feature location at the function level of program elements is effective in practice.
APRES 2014 : Asia Pacific Requirements Engineering Symposium, April 28-29, 2014, Auckland, New Zealand
- Published
- 2023
27. An Analysis of Eye Movements during Browsing Multiple Search Results Pages
- Abstract
In general, most search engines display a certain number of search results on a search results page at one time, separating the entire set of search results into multiple pages. Therefore, lower-ranked results (e.g., the 11th-ranked result) may be displayed in the top area of the next (second) page and might be more likely to be browsed by users than results displayed at the bottom of the previous (first) results page. To better understand users’ activities in web search, it is necessary to analyze the effect of the display positions of search results while browsing multiple search results pages. In this paper, we present the results of our analysis of users’ eye movements. We conducted an experiment to measure eye movements during web search and analyzed how long users spend viewing each search result. From the analysis results, we found that search results displayed at the top of the latter page were viewed for a longer time than those displayed at the bottom of the former page.
HCI 2009 : Human-Computer Interaction. New Trends, July 19-24, 2009, San Diego, CA, USA
- Published
- 2023
28. Characterizing Safety of Integrated Services in Home Network System
- Abstract
This paper formalizes three kinds of safety to be satisfied by networked appliances and services in the emerging home network system (HNS). Local safety is defined by the safety instructions of individual networked appliances. Global safety is specified as required properties of HNS services, which use multiple appliances simultaneously. Environment safety is derived from residential rules in the home and surrounding environments. Based on the safety thus defined, we propose a modeling/validation framework for safety. Specifically, we first introduce an object-oriented modeling technique to clarify the relationships among the appliances, the services and the home (environment) objects. We then employ the technique of Design by Contract with JML (Java Modeling Language), which achieves systematic safety validation through testing.
ICOST 2007 : Pervasive Computing for Quality of Life Enhancement, June 21-23, 2007, Nara, Japan
- Published
- 2023
29. Artificial intelligence applications and innovations: AIAI 2018 IFIP WG 12.5 International Workshops, SEDSEAL, 5G-PINE, MHDW, and HEALTHIOT, Rhodes, Greece, May 25-27, 2018, Proceedings.
- Author
-
Iliadis, Lazaros S., Maglogiannis, Ilias G., and Plagianakos, Vassilis
- Subjects
Artificial Intelligence Applications ,Artificial Intelligence Innovations ,Computational linguistics - Abstract
Summary: This book constitutes the refereed proceedings of 4 workshops held at the 14th IFIP WG 12.5 International Conference on Artificial Intelligence Applications and Innovations, AIAI 2018, held in Rhodes, Greece, in May 2018. The workshops were the Workshop on Semantics in the Deep: Semantic Analytics for Big Data, SEDSEAL 2018; the Third Workshop on 5G - Putting Intelligence to the Network Edge, 5G-PINE 2018; the 7th Mining Humanistic Data Workshop, MHDW 2018; and the Workshop on Intelligent Cloud and IOT Paradigms in EHealth, HEALTHIOT 2018. The 19 full papers and 5 short papers presented were carefully reviewed and selected from a total of 53 submissions: SEDSEAL accepted 2 full papers out of 5 submissions, 5G-PINE 6 full and one short paper out of 24, MHDW 7 full and 4 short papers out of 15, and HEALTHIOT 4 full papers out of 9. The papers cover topics such as AI in 5G and telecommunications, AI and e-health services, AI in 5G networks, incremental learning, clustering, AI in text mining, visual data analytics, AI in molecular biology, DNA, RNA, proteins, big data analytics, Internet of Things and recommender systems, and AI in biomedical applications.
- Published
- 2018
30. Intuitionistic fuzziness and other intelligent theories and their applications. / M Hadjiski, K T Atanassov.
- Author
-
Hadjiski, M. and Atanassov, Krasimir
- Subjects
Artificial intelligence ,Computational intelligence ,Fuzzy systems - Abstract
Summary: This book gathers extended versions of the best papers presented at the 8th IEEE conference on Intelligent Systems, held in Sofia, Bulgaria on September 4-6, 2016, which are mainly related to theoretical research in the area of intelligent systems. The main focus is on novel developments in fuzzy and intuitionistic fuzzy sets, the mathematical modelling tool of generalized nets and the newly defined method of intercriteria analysis. The papers reflect a broad and diverse team of authors, including many young researchers from Australia, Bulgaria, China, the Czech Republic, Iran, Mexico, Poland, Portugal, Slovakia, South Korea and the UK.
- Published
- 2018
31. Computational intelligence and intelligent systems.
- Author
-
Li, Kangshun, Li, Jin, Liu, Yong, and Castiglione, Aniello
- Subjects
Computer simulation ,Data mining ,Artificial intelligence - Abstract
Summary: This book constitutes the refereed proceedings of the 7th International Symposium on Intelligence Computation and Applications, ISICA 2015, held in Guangzhou, China, in November 2015. The 77 revised full papers presented were carefully reviewed and selected from 189 submissions. The papers feature the most up-to-date research in analysis and theory of evolutionary computation, neural network architectures and learning; neuro-dynamics and neuro-engineering; fuzzy logic and control; collective intelligence and hybrid systems; deep learning; knowledge discovery; learning and reasoning.
- Published
- 2016
32. Abstracting mobility flows from bike-sharing systems
- Abstract
Bicycling has grown significantly in the past ten years. In some regions, the implementation of large-scale bike-sharing systems and improved cycling infrastructure are two of the factors enabling this growth. An increase in non-motorized modes of transportation makes our cities more human, decreases pollution and traffic, and improves quality of life. In many cities around the world, urban planners and policymakers are looking at cycling as a sustainable way of improving urban mobility. Although bike-sharing systems generate abundant data about their users’ travel habits, most cities still rely on traditional tools and methods for planning and policy-making. Recent technological advances enable the collection and analysis of large amounts of data about urban mobility, which can serve as a solid basis for evidence-based policy-making. In this paper, we introduce a novel analytical method that can be used to process millions of bike-sharing trips and analyze bike-sharing mobility, abstracting relevant mobility flows across specific urban areas. Backed by a visualization platform, this method provides a comprehensive set of analytical tools to support public authorities in making data-driven policy and planning decisions. This paper illustrates the use of the method with a case study of the Greater Boston bike-sharing system and, as a result, presents new findings about that particular system. Finally, an assessment with expert users showed that this method and tool were considered very useful and relatively easy to use, and that the users intend to adopt the tool in the near future.
- Published
- 2022
33. DE-Sinc methods have almost the same convergence property as SE-Sinc methods even for a family of functions fitting the SE-Sinc methods Part I: Definite integration and function approximation
- Abstract
In this paper, the theoretical convergence rate of the trapezoidal rule combined with the double-exponential (DE) transformation is given for a class of functions for which the single-exponential (SE) transformation is suitable. It is well known that the DE transformation enables the rule to achieve a much higher rate of convergence than the SE transformation, and the convergence rate has been analyzed and justified theoretically under a proper assumption. Here, it should be emphasized that the assumption is more severe than the one for the SE transformation, and there actually exist some examples such that the trapezoidal rule with the SE transformation achieves its usual rate, whereas the rule with the DE transformation does not. Such cases have been observed numerically, but no theoretical analysis has been given thus far. This paper reveals the theoretical rate of convergence in such cases, and it turns out that the DE rate is almost the same as, but slightly lower than, that of the SE. By using the analysis technique developed here, the theoretical convergence rate of the Sinc approximation with the DE transformation is also given for a class of functions for which the SE transformation is suitable. The result is quite similar to the above: the convergence rate in the DE case is slightly lower than in the SE case. Numerical examples that support these two theoretical results are also given.
- Published
- 2022
34. Quivers with subadditive labelings: classification and integrability
- Abstract
Strictly subadditive, subadditive and weakly subadditive labelings of quivers were introduced by the second author, generalizing Vinberg’s definition for undirected graphs. In our previous work we have shown that quivers with strictly subadditive labelings are exactly the quivers exhibiting Zamolodchikov periodicity. In this paper, we classify all quivers with subadditive labelings. We conjecture them to exhibit a certain form of integrability: namely, as the T-system dynamics proceeds, the values at each vertex satisfy a linear recurrence. Conversely, we show that every quiver integrable in this sense is necessarily one of the 19 items in our classification. For the quivers of type $\hat{A} \otimes A$ we express the coefficients of the recurrences in terms of the partition functions for domino tilings of a cylinder, called Goncharov–Kenyon Hamiltonians. We also consider tropical T-systems of type $\hat{A} \otimes A$ and explain how affine slices exhibit solitonic behavior, i.e. soliton resolution and speed conservation. Throughout, we conjecture how the results in the paper are expected to generalize from $\hat{A} \otimes A$ to all other quivers in our classification.
- Published
- 2021
35. Circumnavigating collinear superspace
- Abstract
In this paper, we extend the collinear superspace formalism to include the full range of $\mathcal{N} = 1$ supersymmetric interactions. Building on the effective field theory rules developed in a companion paper, Navigating Collinear Superspace [1], we construct collinear superspace Lagrangians for theories with non-trivial F- and D-term auxiliary fields. For (massless) Wess-Zumino models, the key ingredient is a novel type of Grassmann-valued supermultiplet whose lowest component is a (non-propagating) fermionic degree of freedom. For gauge theories coupled to charged chiral matter, the key ingredient is a novel type of vector superfield whose lowest component is a non-propagating gauge potential. This unique vector superfield is used to construct a gauge-covariant derivative; while such an object does not appear in the standard full superspace formalism, it is crucial for modeling gauge interactions when the theory is expressed on a collinear slice. This brings us full circle, by showing that all types of $\mathcal{N} = 1$ theories in four dimensions can be constructed in collinear superspace from purely infrared considerations. We speculate that supersymmetric theories with $\mathcal{N} > 1$ could also be implemented using similar collinear superspace constructions.
- Published
- 2021
36. Measurement of CP asymmetries and branching fraction ratios of B− decays to two charm mesons
- Abstract
The CP asymmetries of seven $B^-$ decays to two charm mesons are measured using data corresponding to an integrated luminosity of 9 fb$^{-1}$ of proton-proton collisions collected by the LHCb experiment. Decays involving a $D^{*0}$ or $D_s^{*-}$ meson are analysed by reconstructing only the $D^0$ or $D_s^-$ decay products. This paper presents the first measurement of $\mathcal{A}_{CP}(B^- \to D_s^{*-} D^0)$ and $\mathcal{A}_{CP}(B^- \to D_s^- D^{*0})$, and the most precise measurement of the other five CP asymmetries. There is no evidence of CP violation in any of the analysed decays. Additionally, two ratios between branching fractions of selected decays are measured.
- Published
- 2023
37. OME-Zarr: a cloud-optimized bioimaging file format with international community support
- Abstract
A growing community is constructing a next-generation file format (NGFF) for bioimaging to overcome problems of scalability and heterogeneity. Organized by the Open Microscopy Environment (OME), individuals and institutes across diverse modalities facing these problems have designed a format specification process (OME-NGFF) to address these needs. This paper brings together a wide range of those community members to describe the cloud-optimized format itself—OME-Zarr—along with tools and data resources available today to increase FAIR access and remove barriers in the scientific process. The current momentum offers an opportunity to unify a key component of the bioimaging domain—the file format that underlies so many personal, institutional, and global data management and analysis tasks.
- Published
- 2023
39. On the causality paradox and the Karch-Randall braneworld as an EFT
- Abstract
Holography on cutoff surfaces can appear to be in tension with causality. For example, as argued by Omiya and Wei [1], double holography seemingly allows for superluminal signalling. In this paper we argue that the brane description of double holography should be treated as an effective theory and demonstrate that causality violations due to faster-than-light communication are not visible above the associated cutoff length scale. This suggests that end-of-the-world brane models are consistent with causality and that the apparent superluminal signalling is a UV effect. Moreover, we argue that short distance non-localities generically give rise to apparent faster-than-light propagation of signals in Anti-de Sitter space. Nonetheless, superluminal signalling indicates that the causal structure on holographic cutoff surfaces needs to be modified. We propose and study three different candidate regions that might replace the domain of dependence in the brane EFT of the Karch-Randall model. These regions are defined by unitarity on the brane, through bulk entanglement wedges and through the nice slice criterion, respectively. In all dimensions, these candidate regions exclude those parts of the domain of dependence which are affected by superluminal signalling. While all three definitions agree in two dimensions, they are different in higher dimensions.
- Published
- 2023
40. Pileup and Infrared Radiation Annihilation (PIRANHA): a paradigm for continuous jet grooming
- Abstract
Jet grooming is an important strategy for analyzing relativistic particle collisions in the presence of contaminating radiation. Most jet grooming techniques introduce hard cutoffs to remove soft radiation, leading to discontinuous behavior and associated experimental and theoretical challenges. In this paper, we introduce Pileup and Infrared Radiation Annihilation (Piranha), a paradigm for continuous jet grooming that overcomes the discontinuity and infrared sensitivity of hard-cutoff grooming procedures. We motivate Piranha from the perspective of optimal transport and the Energy Mover’s Distance and review Apollonius Subtraction and Iterated Voronoi Subtraction as examples of Piranha-style grooming. We then introduce a new tree-based implementation of Piranha, Recursive Subtraction, with reduced computational costs. Finally, we demonstrate the performance of Recursive Subtraction in mitigating sensitivity to soft distortions from hadronization and detector effects, and additive contamination from pileup and the underlying event.
- Published
- 2023
41. Cooperative distributed state estimation: resilient topologies against smart spoofers
- Abstract
A network of observers is considered, where through asynchronous (with bounded delay) communications, they cooperatively estimate the states of a linear time-invariant (LTI) system. In such a setting, a new type of adversary might affect the observation process by impersonating the identity of a regular node, which is a violation of communication authenticity. These adversaries also inherit the capabilities of Byzantine nodes, making them more powerful threats called smart spoofers. We show how asynchronous networks are vulnerable to smart spoofing attacks. In the estimation scheme considered in this paper, information flows from the sets of source nodes, each of which can detect a portion of the state variables, to the other follower nodes. The regular nodes, to avoid being misguided by the threats, distributively filter the extreme values received from the nodes in their neighborhood. Topological conditions based on strong robustness are proposed to guarantee convergence. Two simulation scenarios are provided to verify the results.
- Published
- 2023
42. Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora
- Abstract
The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
- Published
- 2023
43. A burden shared: the financial, psychological, and health-related consequences borne by family members and caregivers of people with cancer in India
- Abstract
In India, approximately 1.4 million new cases of cancer are recorded annually, with 26.7 million people living with cancer in 2021. Providing care for family members with cancer impacts caregivers’ health and financial resources. Effects on caregivers’ health and financial resources, understood as family and caregiver “financial toxicity” of cancer, are important to explore in the Indian context, where family members often serve as caregivers, in light of cultural attitudes towards family. This is reinforced by other structural issues such as grave disparities in socioeconomic status, barriers in access to care, and limited access to supportive care services for many patients. Effects on family caregivers’ financial resources are particularly prevalent in India given the increased dependency on out-of-pocket financing for healthcare, disparate access to insurance coverage, and limitations in public expenditure on healthcare. In this paper, we explore family and caregiver financial toxicity of cancer in the Indian context, highlighting the multiple psychosocial aspects through which these factors may play out. We suggest steps forward, including future directions in (1) health services research, (2) community-level interventions, and (3) policy changes. We underscore that multidisciplinary and multi-sectoral efforts are needed to study and address family and caregiver financial toxicity in India.
- Published
- 2023
44. A spectral metric for collider geometry
- Abstract
By quantifying the distance between two collider events, one can triangulate a metric space and reframe collider data analysis as computational geometry. One popular geometric approach is to first represent events as an energy flow on an idealized celestial sphere and then define the metric in terms of optimal transport in two dimensions. In this paper, we advocate for representing events in terms of a spectral function that encodes pairwise particle angles and products of particle energies, which enables a metric distance defined in terms of one-dimensional optimal transport. This approach has the advantage of automatically incorporating obvious isometries of the data, like rotations about the colliding beam axis. It also facilitates first-principles calculations, since there are simple closed-form expressions for optimal transport in one dimension. Up to isometries and event sets of measure zero, the spectral representation is unique, so the metric on the space of spectral functions is a metric on the space of events. At lowest order in perturbation theory in electron-positron collisions, our metric is simply the summed squared invariant masses of the two event hemispheres. Going to higher orders, we present predictions for the distribution of metric distances between jets in fixed-order and resummed perturbation theory as well as in parton-shower generators. Finally, we speculate on whether the spectral approach could furnish a useful metric on the space of quantum field theories.
- Published
- 2023
45. Identifying latent activity behaviors and lifestyles using mobility data to describe urban dynamics
- Abstract
Urbanization and its problems require an in-depth and comprehensive understanding of urban dynamics, especially the complex and diversified lifestyles in modern cities. Digitally acquired data can accurately capture complex human activity, but it lacks the interpretability of demographic data. In this paper, we study a privacy-enhanced dataset of the mobility visitation patterns of 1.2 million people to 1.1 million places in 11 metro areas in the U.S. to detect the latent mobility behaviors and lifestyles in the largest American cities. Despite the considerable complexity of mobility visitations, we found that lifestyles can be automatically decomposed into only 12 latent, interpretable activity behaviors describing how people combine shopping, eating, working, or using their free time. Rather than describing individuals with a single lifestyle, we find that city dwellers’ behavior is a mixture of those behaviors. The detected latent activity behaviors are equally present across cities and cannot be fully explained by main demographic features. Finally, we find those latent behaviors are associated with dynamics like experienced income segregation, transportation, or healthy behaviors in cities, even after controlling for demographic features. Our results signal the importance of complementing traditional census data with activity behaviors to understand urban dynamics.
- Published
- 2023
46. Thermal non-line-of-sight imaging from specular and diffuse reflections
- Abstract
This paper presents a non-line-of-sight technique to estimate the position and temperature of an occluded object from a camera via reflection on a wall. Because objects with heat emit far-infrared light according to their temperature, positions and temperatures are estimated from reflections on a wall. A key idea is that light paths from a hidden object to the camera depend on the position of the hidden object. The position of the object is recovered from the angular distribution of the specular and diffuse reflection components, and the temperature of the heat source is recovered from the estimated position and the intensity of reflection. The effectiveness of our method is evaluated by conducting real-world experiments, showing that the position and the temperature of the hidden object can be recovered from the reflection on the wall by using a conventional thermal camera.
- Published
- 2023
47. On approximations of the PSD cone by a polynomial number of smaller-sized PSD cones
- Abstract
We study the problem of approximating the cone of positive semidefinite (PSD) matrices with a cone that can be described by smaller-sized PSD constraints. Specifically, we ask the question: “how closely can we approximate the set of unit-trace $n \times n$ PSD matrices, denoted by $D$, using at most $N$ number of $k \times k$ PSD constraints?” In this paper, we prove lower bounds on $N$ to achieve a good approximation of $D$ by considering two constructions of an approximating set. First, we consider the unit-trace $n \times n$ symmetric matrices that are PSD when restricted to a fixed set of $k$-dimensional subspaces in $\mathbb{R}^n$. We prove that if this set is a good approximation of $D$, then the number of subspaces must be at least exponentially large in $n$ for any $k = o(n)$. Second, we show that any set $S$ that approximates $D$ within a constant approximation ratio must have superpolynomial $\mathbf{S}_+^k$-extension complexity. To be more precise, if $S$ is a constant factor approximation of $D$, then $S$ must have $\mathbf{S}_+^k$-extension complexity at least $\exp(C \cdot \min\{\sqrt{n}, n/k\})$, where $C$ is some absolute constant. In addition, we show that any set $S$ such that $D \subseteq S$ and whose Gaussian width is at most a constant times larger than the Gaussian width of $D$ must have $\mathbf{S}_+^k$-extension complexity at least $\exp(C \cdot \min\{n^{1/3}, \sqrt{n/k}\})$. These results imply that the cone of $n \times n$ PSD matrices cannot be approximated by a polynomial number of $k \times k$ PSD constraints for any $k = o(n / \log^2 n)$. These results generalize the recent work of Fawzi (Math Oper Res 46(4):1479–1489, 2021) on the hardness of polyhedral approximations of $\mathbf{S}_+^n$, which corresponds to the special case with $k = 1$.
- Published
- 2023
48. Closed string theory without level-matching at the free level
- Abstract
In its traditional form, the string field in closed string field theory is constrained by the level-matching condition, which is imposed alongside the action. By analogy with the similar problem for the Ramond sector, it was understood by Okawa and Sakaguchi how to lift this condition and work with an unconstrained field by introducing spurious free fields. These authors also pointed out that new backgrounds may exist thanks to a new gauge field which is trivial on flat space, but can generate fluxes on a toroidal background. In this paper, we perform a complete study of the free theory at the tachyonic and massless levels with the aim of setting the stage for studying backgrounds without level-matching.
- Published
- 2023
49. Human-Informed Topology Optimization: interactive application of feature size controls
- Abstract
This paper presents a new topology optimization framework in which the design decisions are made by humans and machines in collaboration. The new Human-Informed Topology Optimization approach improves the accessibility of topology optimization tools and enables improved design identification for the so-called ‘everyday’ and ‘in-the-field’ design situations. The new framework is based on standard density-based compliance minimization. However, the design engineer is enabled to actively use their experience and expertise to locally alter the minimum feature size requirements. This is done by conducting a short initial solution and prompting the design engineer to evaluate its quality. The user can identify potential areas of concern based on the initial material distribution. In these areas, the minimum feature size requirement can be altered as deemed necessary by the user. The algorithm rigorously resolves the compliance problem using the updated filtering map, resulting in solutions that eliminate, merge, or thicken topological members of concern. The new framework is demonstrated on 2D benchmark examples, and the extension to 3D is shown. Its ability to achieve performance improvements with few computational resources is demonstrated on buckling and stress concentration examples.
- Published
- 2023
50. Disjunctive cuts in Mixed-Integer Conic Optimization
- Abstract
This paper studies disjunctive cutting planes in Mixed-Integer Conic Programming. Building on conic duality, we formulate a cut-generating conic program for separating disjunctive cuts, and investigate the impact of the normalization condition on its resolution. In particular, we show that a careful selection of normalization guarantees its solvability and conic strong duality. Then, we highlight the shortcomings of separating conic-infeasible points in an outer-approximation context, and propose conic extensions to the classical lifting and monoidal strengthening procedures. Finally, we assess the computational behavior of various normalization conditions in terms of gap closed, computing time and cut sparsity. In the process, we show that our approach is competitive with the internal lift-and-project cuts of a state-of-the-art solver.
- Published
- 2023