158 results for "Axiom"
Search Results
2. On how to allocate the fixed cost of transport systems
- Author
-
Estañ, Teresa, Llorca, Natividad, Martínez, Ricardo, and Sánchez-Soriano, Joaquín
- Published
- 2021
3. axiom
- Author
-
Weik, Martin H.
- Published
- 2001
4. Malcev products of weakly cancellative monoids and varieties of bands
- Author
-
Petrich, Mario
- Published
- 2015
5. Understandings and misunderstandings of multidimensional poverty measurement
- Author
-
Alkire, Sabina and Foster, James
- Published
- 2011
6. Discrete and Integer Valued Inputs and Outputs in Data Envelopment Analysis
- Author
-
Abolfazl Keshvari, Reza Kazemi Matin, and Timo Kuosmanen
- Subjects
Mathematical optimization, Computer science, Data envelopment analysis, Inefficiency, Measure (mathematics), Random variable, Axiom, Convexity, Complement (set theory), Integer (computer science) - Abstract
Standard axioms of free disposability, convexity and constant returns to scale employed in Data Envelopment Analysis (DEA) implicitly assume continuous, real-valued inputs and outputs. However, the implicit assumption of continuous data will never hold with exact precision in real-world data. To address the discrete nature of data explicitly, various formulations of Integer DEA (IDEA) have been suggested. Unfortunately, the axiomatic foundations and the correct mathematical formulation of the IDEA technology have caused considerable confusion in the literature. This chapter has three objectives. First, we re-examine the axiomatic foundations of IDEA, demonstrating that some IDEA formulations proposed in the literature fail to satisfy the axioms of free disposability of continuous inputs and outputs, and natural disposability of discrete inputs and outputs. Second, we critically examine alternative efficiency metrics available for IDEA. We complement the IDEA formulations for the radial input measure with the radial output measure and the directional distance function. We then critically discuss the additive efficiency metrics, demonstrating that the optimal slacks are not necessarily unique. Third, we consider estimation of the IDEA technology under stochastic noise, modeling inefficiency and noise as Poisson distributed random variables.
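Since the abstract turns on the difference between standard DEA and its integer-valued variants, a small optimization sketch may help. The following is a minimal, illustrative radial input-oriented DEA program with an integer-valued reference input, in the spirit of IDEA; the toy data, variable names, and the use of the PuLP solver are assumptions, not the chapter's own formulation.

```python
# Minimal sketch: radial input-oriented DEA (constant returns to scale)
# with an integer-valued reference input, in the spirit of IDEA.
import pulp

X = [[2], [4], [3], [6]]   # one integer input per DMU (hypothetical)
Y = [[1], [2], [2], [3]]   # one output per DMU (hypothetical)
o = 1                      # index of the DMU under evaluation

prob = pulp.LpProblem("idea_input_oriented", pulp.LpMinimize)
theta = pulp.LpVariable("theta", lowBound=0)
lam = [pulp.LpVariable(f"lam_{j}", lowBound=0) for j in range(len(X))]
# integer reference input, reflecting natural disposability of discrete inputs
x_ref = pulp.LpVariable("x_ref", lowBound=0, cat="Integer")

prob += theta  # minimize the radial contraction factor
prob += pulp.lpSum(lam[j] * X[j][0] for j in range(len(X))) <= x_ref
prob += x_ref <= theta * X[o][0]
prob += pulp.lpSum(lam[j] * Y[j][0] for j in range(len(X))) >= Y[o][0]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("efficiency score:", pulp.value(theta))   # 0.75 for this toy data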
- Published
- 2015
7. DEA Models with Undesirable Inputs, Intermediates, and Outputs
- Author
-
Zhongbao Zhou and Wenbin Liu
- Subjects
Mathematical optimization, Computer science, Data envelopment analysis, Data transformation (statistics), Production (economics), Construct (python library), Axiom - Abstract
In real applications involving the use of Data Envelopment Analysis (DEA) models, undesirable inputs and outputs are frequently encountered and addressed, e.g., via data transformation. These studies are scattered across the literature and often confined to particular applications. In this paper, we present a systematic investigation of the building of such DEA models. First, we describe the desirability of inputs and outputs, as well as the disposability assumptions, in the presence of undesirable inputs and outputs. Next, we construct a number of DEA models with different disposability assumptions and performance measures for the case of single-stage DEA. We then systematically investigate two-stage DEA models with undesirable inputs, intermediates and outputs. In particular, we utilize the free-disposal axioms to construct the production possibility sets and the corresponding DEA models with undesirable inputs, intermediates, and outputs.
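As a concrete illustration of the data-transformation approach mentioned above, here is a minimal sketch of one common monotone-decreasing transformation for an undesirable output; the translation constant and the data are hypothetical, and the paper itself works with disposability assumptions rather than this specific recipe.

```python
# Minimal sketch: reflect-and-translate an undesirable output so that
# "larger is better", keeping all values strictly positive.
pollution = [12.0, 7.5, 9.0]     # undesirable output, smaller is better
M = max(pollution) + 1.0         # translation constant (an assumption)
good_form = [M - p for p in pollution]
print(good_form)                 # [1.0, 5.5, 4.0]
```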
- Published
- 2015
8. Stochastic Nonparametric Approach to Efficiency Analysis: A Unified Framework
- Author
-
Antti Saastamoinen, Andrew L. Johnson, and Timo Kuosmanen
- Subjects
Heteroscedasticity, Stochastic frontier analysis, Computer science, Econometrics, Nonparametric statistics, Data envelopment analysis, Inefficiency, Field (computer science), Axiom, Quantile - Abstract
Bridging the gap between axiomatic Data Envelopment Analysis (DEA) and econometric Stochastic Frontier Analysis (SFA) has been one of the most vexing problems in the field of efficiency analysis. Recent developments in multivariate convex regression, particularly the Convex Nonparametric Least Squares (CNLS) method, have led to the full integration of DEA and SFA into a unified framework of productivity analysis, referred to as Stochastic Nonparametric Envelopment of Data (StoNED). The unified StoNED framework offers a general and flexible platform for efficiency analysis and related themes such as frontier estimation and production analysis, allowing one to combine existing tools of efficiency analysis in novel ways across the DEA-SFA spectrum and facilitating new opportunities for further methodological development. This chapter provides an updated and elaborated presentation of the CNLS and StoNED methods and extends the scope of the StoNED method in several directions. Most notably, it examines quantile estimation using StoNED and extends the method to the general case of multiple inputs and multiple outputs. It also provides a detailed discussion of how to model heteroscedasticity in the inefficiency and noise terms.
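To make the CNLS building block concrete, here is a minimal single-input sketch: a quadratic program that fits a monotone increasing, concave curve through Afriat-style inequalities. The toy data and the use of cvxpy are assumptions; the chapter's own treatment is far more general.

```python
# Minimal sketch of single-input CNLS: fit y_i = f(x_i) + e_i where f is
# concave and monotone, via one supporting line (alpha_i, beta_i) per point.
import cvxpy as cp
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical input data
y = np.array([1.2, 1.9, 2.8, 3.1, 3.9])   # hypothetical output data
n = len(x)

alpha = cp.Variable(n)                 # intercepts of supporting lines
beta = cp.Variable(n, nonneg=True)     # slopes; nonnegativity = monotonicity
resid = y - (alpha + cp.multiply(beta, x))

# Afriat inequalities: each point lies below every other supporting line,
# which enforces concavity of the fitted function.
constraints = [alpha[i] + beta[i] * x[i] <= alpha[j] + beta[j] * x[i]
               for i in range(n) for j in range(n) if i != j]

prob = cp.Problem(cp.Minimize(cp.sum_squares(resid)), constraints)
prob.solve()
print("fitted values:", alpha.value + beta.value * x)
```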
- Published
- 2015
9. Post Script: Responding to Existential Risks
- Author
-
Janet McIntyre-Mills
- Subjects
Environmental justice, Idealism, Essentialism, Environmental ethics, Sociology, Space (commercial competition), Axiom, Existentialism, Focus (linguistics), Diversity (politics) - Abstract
The problem with integrated approaches is that they need to preserve space for doubt, diversity and disagreement. But the axiom that needs to guide this freedom is that we should not allow the freedom and rights of some to undermine the rights and freedoms of others and of future generations of life. This is where transformation is needed. Much of the activity that takes place in the social and natural sciences is in areas of knowledge that focus on research on others. We need to do some research on our own lives and to reflect on the consequences of our choices. Instead of idealism and essentialist categories, we need to extend our vision to take into account the social, economic and environmental consequences for future generations of life.
- Published
- 2014
10. Ranking opportunity profiles through dependent evaluation of policies
- Author
-
Jorge Alcalde-Unzu, Miguel A. Ballester, Universidad Pública de Navarra. Departamento de Economía, and Nafarroako Unibertsitate Publikoa. Ekonomia Saila
- Subjects
Organizational Behavior and Human Resource Management, Opportunity profiles, Sociology and Political Science, Welfare economics, Distribution (economics), Evaluation of policies, Microeconomics, Advantage, Equality, Ranking, Economics, General Economics, Econometrics and Finance, Axiom, Public finance - Abstract
The final publication is available at Springer via http://dx.doi.org/10.1007/s10888-011-9165-4
Most rankings in the literature for evaluating opportunity distributions judge a policy (a change from one distribution of opportunities to another) on the basis of the changes it creates and, thus, independently of the original situation. This paper proposes a group of axioms capturing the idea that rankings of equality of opportunity might consider not only the changes promoted, but also the initial situation in society. The combination of this group of axioms with other well-established properties enables us to characterize two families of new opportunity distribution rankings. The first family weighs each individual's percentage share in the total number of opportunities, while the second weighs opportunities depending on how many agents have them available.
Financial support from the Spanish Ministry of Education through grants ECO2008-04756, ECO2009-11213, ECO2009-12836, the Juan de la Cierva and Ramón y Cajal programs, FEDER, and the Barcelona Economics Program of CREA is gratefully acknowledged.
- Published
- 2012
11. Free Energies and the Dissipation Principle
- Author
-
Mauro Fabrizio, John M. Golden, and Giovambattista Amendola
- Subjects
Physics, Theoretical physics, Cyclic process, Order (ring theory), Equivalence relation, Context (language use), Free energies, Dissipation, Axiom - Abstract
We present in this chapter an axiomatic formulation of thermodynamics in order to introduce free energies in a very general manner and to prove certain fundamental properties of these quantities. For most of the discussion, no underlying model is assumed, in contrast to the previous chapter. However, in the context of an equivalence relation between states, we ascribe a form to the work function consistent with the general nonisothermal theory introduced in Chapter 5.
- Published
- 2011
12. The Long-Term Cognitive Development of Reasoning and Proof
- Author
-
Juan Pablo Mejía-Ramos and David Tall
- Subjects
Cognitive science, Mental world, Formalism (philosophy), Euclidean geometry, Cognitive development, Mathematics education, Mathematical proof, Psychology, Construct (philosophy), Formal proof, Axiom - Abstract
This paper uses the framework of “three worlds of mathematics” (Tall 2004a, b) to chart the development of mathematical thinking from the thought processes of early childhood to the formal structures of set-theoretic definition and formal proof. It sees the development of mathematical thinking building on experiences that the individual has met before, as the child coordinates perceptions and actions to construct thinkable concepts in two different ways. One focuses on objects, exploring their properties, describing them, using carefully worded descriptions as definitions, inferring that certain properties imply others and on to coherent frameworks such as Euclidean geometry through a developing mental world of conceptual embodiment. The other focuses on actions (such as counting), first as procedures and then compressed into thinkable concepts (such as number) using symbols such as 3 + 2, ¾, 3a + 2b, f (x), dy/dx; these operate dually as computable processes and thinkable concepts, termed procepts, in a developing mental world of proceptual symbolism. These may lead later to a third mental world of axiomatic formalism based on set-theoretic definition and formal proof. In addition to charting the development of proof concepts through these three worlds, we use the theory of Toulmin to analyse the processes of reasoning by which proofs are constructed.
- Published
- 2009
13. Preaxiomatic Mathematical Reasoning: An Algebraic Approach
- Author
-
Mary Leng
- Subjects
Mathematical practice, Mathematics::Logic, Computer science, Calculus, Contradiction, Mathematical object, Algebraic number, Mathematical reasoning, Axiom, Subject matter, Terminology - Abstract
In their correspondence on the nature of axioms, Frege and Hilbert clashed over the question of how best to understand axiomatic mathematical theories and, in particular, the nonlogical terminology occurring in axioms. According to Frege, axioms are best viewed as attempts to assert fundamental truths about a previously given subject matter. Hilbert disagreed emphatically, holding that axioms contextually define their subject matter; thus, so long as an axiom system implies no contradiction, its axioms are to be thought of as true. This paper considers whether it is possible to extend Hilbert’s “algebraic” view of axioms to preaxiomatic mathematical reasoning, where our mathematical concepts are not yet pinned down by axiomatic definitions. I argue that, even at the preaxiomatic stage, our informal characterizations of mathematical concepts are determinate enough that viewing our mathematical theories as setting well-defined “problems” with mathematical concepts as “solutions” remains illuminating.
- Published
- 2009
14. The Automatic Integration of Folksonomies with Taxonomies Using Non-axiomatic Logic
- Author
-
Joe Geldart and Stephen Cummins
- Subjects
Structure (mathematical logic), Information retrieval, Computer science, Information storage and retrieval, Inference, Ontology (information science), Metadata, Resource (project management), Artificial intelligence, Precision and recall, Folksonomy, Axiom, Natural language processing - Abstract
Cooperative tagging systems such as folksonomies are powerful tools when used to annotate information resources. The inherent power of folksonomies lies in their ability to allow casual users to easily contribute ad hoc, yet meaningful, resource metadata without any specialist training. Older folksonomies have begun to degrade due to their lack of internal structure and the use of many low-quality tags. This chapter describes a remedy for some of the problems associated with folksonomies. We introduce a method for automatic integration and inference of the relationships between tags and resources in a folksonomy using non-axiomatic logic. We test this method on the CiteULike corpus of tags by comparing precision and recall between it and standard keyword search. Our results show that non-axiomatic reasoning is a promising technique for integrating tagging systems with more structured knowledge representations.
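The evaluation described above compares precision and recall between non-axiomatic reasoning and keyword search; a minimal sketch of those two metrics follows, with hypothetical document sets rather than the CiteULike data.

```python
# Minimal sketch: precision and recall for a retrieval run.
def precision_recall(retrieved: set, relevant: set) -> tuple[float, float]:
    tp = len(retrieved & relevant)                     # true positives
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

keyword_hits = {"doc1", "doc2", "doc5"}        # hypothetical results
nal_hits = {"doc1", "doc2", "doc3", "doc4"}    # hypothetical results
gold = {"doc1", "doc2", "doc3"}                # hypothetical relevance set

print("keyword:", precision_recall(keyword_hits, gold))  # ~ (0.67, 0.67)
print("NAL:    ", precision_recall(nal_hits, gold))      # (0.75, 1.0)
```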
- Published
- 2009
15. THE ACQUISITION OF CLASS DEFINITIONS IN THE COMMODITY ONTOLOGY OF AGRICULTURAL MEANS OF PRODUCTION
- Author
-
Li Kang, Xinrong Cheng, Lu Zhang, Zhijia Niu, and Guowu Jiang
- Subjects
Class (computer programming), Knowledge management, Information retrieval, Means of production, Computer science, Online encyclopedia, Taxonomy (general), Encyclopedia, Domain knowledge, Ontology (information science), Axiom - Abstract
The agricultural means of production are also called the means of agricultural production. Previous work focused on constructing a commodity ontology taxonomy for the agricultural means of production. Once the taxonomy is constructed, the next task is to add detailed information to the ontology: class definitions, properties, relations, instances and axioms. Class definitions are the concrete manifestation of domain knowledge, so they can be used to learn a class's properties and relations. This paper focuses on obtaining these definitions. To this end, the most important step is to determine where to obtain them. We discuss the selection process: comparing three kinds of knowledge sources on authority, completeness, accuracy, practicability and computerization, we select the online "Encyclopedia of China" as the knowledge source. We then analyze the encyclopedia website and its entries and propose an automatic method for acquiring class definitions. Experiments show that nearly 70% of classes can obtain their definitions in this way. Finally, the Jena API is used to add the definitions to an ontology model represented in OWL format.
- Published
- 2009
16. Axiom-based Potential Functional Failure Analysis for Risk-free Design
- Author
-
Henning Lian, Zhonghang Bai, Guozhong Cao, and Runhua Tan
- Subjects
Fault tree analysis, Axiom independence, Conceptual design, Computer science, Product (mathematics), Design process, Quality (business), Reliability (statistics), Axiom, Reliability engineering - Abstract
The quality and reliability of a product are not established completely during the detailed design process; they are brought out essentially in the course of conceptual design. This paper introduces a potential functional failure analysis method, based on the design axioms, for improving product reliability at the conceptual design stage. The method provides designers with an analytical, non-probabilistic tool for evaluating the result of conceptual design from the standpoint of the design axioms. Potential functional failure modes can be identified by function analysis based on the ideality axiom, the independence axiom and the information axiom; these potential functional failures indicate directions for subsequent design improvement. A speedy cut-off valve in the TRT (Top Gas Pressure Recovery Turbine) system is studied as an example to illustrate the method's potential.
- Published
- 2008
17. Arrow’s Impossibility Theorem
- Author
-
Bernard Grofman
- Subjects
Power (social and political), Arrow, Axiomatic system, Impossibility, Positive economics, Public choice, Social choice theory, Arrow's impossibility theorem, Axiom - Abstract
Kenneth Arrow’s Social Choice and Individual Values (1951, 1963), one of the five “founding books” of the Public Choice movement, is a seminal work in social science. It reformulates the theory of social welfare in ordinal rather than cardinal terms; it demonstrates the power of an axiomatic approach to economic modeling; and it offers a new approach to traditional issues in democratic theory having to do with the nature of collective choice that has had enormous impact in political science, presaging later aspects of “economic imperialism” vis-a-vis the other social sciences. The key result in the book, Arrow’s Impossibility Theorem, is arguably the best known purely mathematical result in the social sciences. Directly inspiring a huge literature, including numerous axiomatic formulations that were more in the nature of characterization or existence results than impossibility theorems, Arrow’s work laid the reinvigorated foundations for the subfield of social choice and welfare that came to be exemplified in the journal of that same name. The impossibility theorem is also perhaps the most important of the many contributions which earned Arrow his Nobel Prize in Economics in 1972.
- Published
- 2008
18. Positivism or Non-Positivism — Tertium Non Datur
- Author
-
Bernd Carsten Stahl
- Subjects
Law of excluded middle, Field (Bourdieu), Ontology, Information system, Spell, Sociology, Positivism, Axiom, Epistemology - Abstract
This paper revisits the debate between positivism and its alternatives in the field of information systems from a philosophical point of view. It argues that the heart of the debate is the ontological difference between views of reality as observer-independent versus observer-dependent. The logical axiom of the excluded third (tertium non datur) informs us that two contradictory options cannot simultaneously be true. The paper discusses what the incompatibility of the ontological positions of positivism and its alternatives means for IS research, why scholars attempt to mix the two, and what the consequences of accepting their incompatibility would be. It ends by arguing that this debate needs to be contextualized with the problem of positivism versus non-positivism in society, and asks whether a tolerant coexistence of the two approaches is feasible. Without this contextualized understanding of ontology in general, regional ontologies in IS are not likely to be successful, as they will rest on unclear foundations.
- Published
- 2007
19. Automation of the Ontology Axioms Transformation into Information Processing Rules Model
- Author
-
Diana Bugaite and Olegas Vasilecas
- Subjects
Event ontology, Transformation (function), Information retrieval, Business rule, Information processing, Upper ontology, Ontology (information science), Automation, Axiom, Mathematics - Published
- 2007
20. The Meaning and Understanding of Mathematics
- Author
-
Carmen Díaz and Carmen Batanero
- Subjects
Frequentist inference, Institution (computer science), Probabilistic logic, Applied mathematics, Semiotics, Meaning (existential), Philosophy of mathematics education, Axiom, Epistemology - Abstract
We summarize a model with which to analyze the meaning of mathematical concepts, distinguishing five interrelated components. We also distinguish between the personal and the institutional meaning to differentiate between the meaning that has been proposed for a given concept in a specific institution, and the meaning given to the concept by a particular person in the institution. We use these ideas to analyze the historical emergence of probability and its different current meanings (intuitive, classical, frequentist, propensity, logical, subjective and axiomatic). We furthermore describe mathematical activity as a chain of semiotic functions and introduce the idea of semiotic conflict that can be used to give an alternative explanation to some widespread probabilistic misconceptions.
- Published
- 2007
21. Finding Common Ground: Efficiency Indices
- Author
-
Shawna Grosskopf, Valentin Zelenyuk, and Rolf Färe
- Subjects
Productive efficiency, Index (economics), Capital (economics), Economics, Production (economics), Common ground, Mathematical economics, Axiom - Abstract
The last two decades have witnessed a revival of interest in the measurement of productive efficiency pioneered by Farrell (1957) and Debreu (1951). 1978 was a watershed year in this revival, with the christening of DEA by Charnes, Cooper and Rhodes (1978) and the critique of Farrell technical efficiency in terms of axiomatic production and index number theory in Färe and Lovell (1978). These papers have inspired many others to apply these methods and to add to the debate on how best to define technical efficiency.
- Published
- 2007
22. Kinematics of Averaged Fields
- Author
-
Mamoru Ishii and Takashi Hibiki
- Subjects
Physics, Convection, Corollary, Classical mechanics, Continuum mechanics, Volumetric flux, Material derivative, Kinematics, Spatial description, Axiom - Abstract
The time-mean values are consistently expressed by the spatial description, as shown by the definitions (4-15) and (4-16), and the idea of particle coordinates for the averaged two-phase flow fields is neither clear nor trivial due to phase changes and diffusion. A phase change corresponds to the production or disappearance of fluid particles of each phase throughout the field. The difficulty arises because each phase itself does not apparently obey the corollary of the axiom of continuity, namely, the permanence of matter. Furthermore, the diffusion of each phase permits the penetration of mixture particles by other fluid particles. It is clear that material coordinates, which are the basis of standard continuum mechanics, are not inherent to a general two-phase flow field obtained by time averaging. However, it is possible to introduce mathematically special convective coordinates which are useful in studying the kinematics of each phase and of the mixture.
- Published
- 2006
23. Uncertainty and Information: Emergence of Vast New Territories
- Author
-
George J. Klir
- Subjects
Class (set theory), Geography, Fuzzy set, Probability distribution, Uncertainty theory, Set theory, Information theory, Mathematical economics, Cartography, Uncertainty reduction theory, Axiom - Abstract
A research program whose objective is to study uncertainty and uncertainty-based information in all their manifestations was introduced in the early 1990s under the name "generalized information theory" (GIT). This research program, motivated primarily by some fundamental methodological issues emerging from the study of complex systems, is based on a two-dimensional expansion of classical, probability-based information theory. In one dimension, additive probability measures, which are inherent in classical information theory, are expanded to various types of nonadditive measures. In the other dimension, the formalized language of classical set theory, within which probability measures are formalized, is expanded to more expressive formalized languages that are based on fuzzy sets of various types. As in classical information theory, uncertainty is the primary concept in GIT and information is defined in terms of uncertainty reduction. This restricted interpretation of the concept of information is described in GIT by the qualified term "uncertainty-based information". Each uncertainty theory that is recognizable within the expanded framework is characterized by: (i) a particular formalized language (a theory of fuzzy sets of some particular type); and (ii) a generalized measure of some particular type (additive or nonadditive). The number of possible uncertainty theories is thus equal to the product of the number of recognized types of fuzzy sets and the number of recognized types of generalized measures. This number has been growing quite rapidly with the recent developments in both fuzzy set theory and the theory of generalized measures. Fully developing any of these theories of uncertainty requires that issues at each of the following four levels be adequately addressed: (i) the theory must be formalized in terms of appropriate axioms; (ii) a calculus of the theory must be developed by which the formalized uncertainty is manipulated within the theory; (iii) a justifiable way of measuring the amount of relevant uncertainty (predictive, diagnostic, etc.) in any situation formalizable in the theory must be found; and (iv) various methodological aspects of the theory must be developed. Among the many uncertainty theories that are possible within the expanded conceptual framework, only a few have been sufficiently developed so far. By and large, these are theories based on various types of generalized measures which are formalized in the language of classical set theory. Fuzzification of these theories, which can be done in different ways, has been explored only to some degree and only for standard fuzzy sets. One important result of research in the area of GIT is that the tremendous diversity of uncertainty theories made possible by the expanded framework is made tractable due to some key properties of these theories that are invariant across the whole spectrum or, at least, within broad classes of uncertainty theories. One important class of uncertainty theories consists of theories that are viewed as theories of imprecise probabilities. Some of these theories are based on Choquet capacities of various orders, especially capacities of order infinity (the well-known theory of evidence), interval-valued probability distributions, and Sugeno λ-measures.
While these theories are distinct in many respects, they share several common representations, such as representation by lower and upper probabilities, convex sets of probability distributions, and the so-called Möbius representation. These representations are uniquely convertible to one another, and each may be used as needed. Another unifying feature of the various theories of imprecise probabilities is that two types of uncertainty coexist in each of them. These are usually referred to as nonspecificity and conflict. It is significant that well-justified measures of these two types of uncertainty are expressed by functionals of the same form in all the investigated theories of imprecise probabilities, even though these functionals are subject to different calculi in different theories. Moreover, equations that express the relationships between marginal, joint, and conditional measures of uncertainty are invariant across the whole spectrum of theories of imprecise probabilities. The tremendous diversity of possible uncertainty theories is thus compensated by their many commonalities.
- Published
- 2006
24. The Foundation Principles of Classical Mechanics
- Author
-
Millard F. Beatty
- Subjects
Gravitation, Structure (mathematical logic), Natural philosophy, Inertial frame of reference, Classical mechanics, Computer science, Dynamics (mechanics), Axiom, Motion (physics), Contact force - Abstract
Dynamics is the theory of motion and the forces and torques that produce it. This theory integrates our earlier studies of kinematics, the geometry of motion, with certain fundamental laws of nature that relate force, torque, and motion. In this chapter the primitive concepts of mass and force introduced in Chapter 1 are related to motion through some basic principles commonly known as Newton's laws. Sir Isaac Newton (1642–1727), in his Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), often referred to simply as the Principia, published in 1687, formalized and extended earlier achievements of others by creating an axiomatic structure for the foundation principles of mechanics. By organizing problems around his fundamental laws, Newton successfully demonstrated the application of his theory to the study of problems of mechanics of the solar system. He thus originated the idea that the motions of bodies may be deduced from a few simple principles.
- Published
- 2006
25. Utility in Social Choice
- Author
-
Walter Bossert and John A. Weymark
- Subjects
Economics, Arrow, Social Welfare, Mathematical economics, Social welfare function, Arrow's impossibility theorem, Social preferences, Preference (economics), Social choice theory, Axiom - Abstract
In Arrovian [Arrow (1951, 1963)] social choice theory, the objective is to construct a social welfare function—a mapping which assigns a social preference ordering to each admissible profile of individual preferences—satisfying several a priori appealing conditions. Arrow showed that the only social welfare functions satisfying his axioms are dictatorial in the sense that there exists an individual whose strict preference over any two social alternatives is always replicated in the social ordering, no matter what the preferences of the remaining members of society happen to be. This negative result has initiated a series of contributions which attempt to avoid Arrow’s impossibility theorem by weakening one or more of his original axioms. The results in this literature are, on the whole, rather negative as well.
- Published
- 2004
26. On the Owen Set of Transportation Situations
- Author
-
Joaquín Sánchez-Soriano, Manuel Pulido, Natividad Llorca, and Elisenda Molina
- Subjects
Set (abstract data type), Theoretical computer science, Operations research, Computer science, Consistency (statistics), Characterization (mathematics), Axiom - Abstract
This paper presents an axiomatic characterization of the Owen set of transportation games. In the characterization we use six properties, including consistency (CONS2) and splitting and merging (SM), which are first proposed and defined for this setting in the present paper.
- Published
- 2004
27. Computational Modeling and Explanation
- Author
-
Steven O. Kimbrough
- Subjects
Computational model, Application areas, Computer science, Natural computing, Natural (music), Darwinism, Artificial intelligence, Axiom, Term (time) - Abstract
Computational explanations appeal to computational models, in contrast to equations or axioms, to explain their target systems. These models are typically inspired by natural phenomena, and the term natural computation has been used in the literature. Darwinian, or evolutionary, models and explanations are a prominent form. This paper presents and reviews the concept of computational explanation and its uses in the biological and social sciences. Emphasis is placed on recent innovations in algorithms for computational modeling. Further, the paper briefly describes application areas that are exploiting these recently developed modeling resources.
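As a toy instance of the Darwinian computational models discussed here, the following sketch evolves bit strings toward an all-ones optimum using selection, crossover, and mutation; every setting in it is an illustrative assumption, not an algorithm from the paper.

```python
# Minimal sketch of an evolutionary computational model: a toy genetic
# algorithm maximizing the number of 1-bits in a string.
import random

random.seed(0)
POP, BITS, GENS = 20, 10, 30

def fitness(ind):
    return sum(ind)

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)   # selection: keep the fitter half
    parents = pop[: POP // 2]
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, BITS)   # one-point crossover
        child = a[:cut] + b[cut:]
        child[random.randrange(BITS)] ^= 1  # mutation: flip one bit
        children.append(child)
    pop = parents + children

print(max(fitness(ind) for ind in pop))   # near the optimum of 10
```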
- Published
- 2003
28. A Propositional Logic for Access Control Policy in Distributed Systems
- Author
-
Mirosław Kurkowski and Jerzy Pejaś
- Subjects
Computer access control, Semantics (computer science), Programming language, Computer science, Access control, Computer security model, Computer security, Propositional calculus, Description logic, Computer Science::Logic in Computer Science, Formal language, Axiom - Abstract
The goal of this paper is to propose a logic-based model for interpreting the basic events and properties of distributed access control systems. We provide a convenient formal language, an axiomatic inference system, a model of computation, and a semantics. We prove some important properties of this logic and show how our logical language can express some access control policies proposed so far.
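A propositional access-control policy of the broad kind described above can be evaluated by forward chaining over Horn-style rules. The sketch below is a minimal illustration with hypothetical principals and rules; it does not reproduce the paper's actual language or axioms.

```python
# Minimal sketch: propositional access-control rules evaluated by
# forward chaining to a fixed point. Atoms are opaque strings.
facts = {"admin(alice)", "owner(bob, file1)"}
rules = [
    ({"admin(alice)"}, "may_read(alice, file1)"),
    ({"owner(bob, file1)"}, "may_read(bob, file1)"),
    ({"may_read(alice, file1)", "may_read(bob, file1)"}, "shared(file1)"),
]

changed = True
while changed:                       # iterate until no rule adds a new fact
    changed = False
    for body, head in rules:
        if body <= facts and head not in facts:
            facts.add(head)
            changed = True

print("shared(file1)" in facts)      # True: both readers were derived
```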
- Published
- 2003
29. Mechanical Equilibrium and Equilibrium Systems
- Author
-
Tamás Rapcsák
- Subjects
Mechanical equilibrium, General equilibrium theory, Equilibrium thermodynamics, Thermodynamic equilibrium, Variational inequality, Applied mathematics, Configuration space, Virtual work, Axiom, Mathematics - Abstract
In this paper, it is shown that the principle of virtual work, considered an axiom of mechanics by Lagrange (1788) and Farkas (1906), can be embedded in a general equilibrium system, the quasi-variational inequalities introduced by Bensoussan and Lions in 1973, assuming force fields and holonomic-scleronomic constraints. The dual form of the principle of virtual work is then formulated for this case, and the procedure for solving mechanical equilibrium problems, the existence of solutions, and some examples are discussed.
- Published
- 2003
30. On Randomness and Infinity
- Author
-
Grégory Lafitte
- Subjects
Discrete mathematics, Infinity, Randomness tests, Set theory, Randomness, Axiom, Mathematics - Abstract
In this paper, we investigate refined definitions of random sequences. Classical definitions have always had the shortcoming of relying on the notion of an algorithm. We discuss the nature of randomness and different ways of obtaining satisfactory definitions of it, after reviewing previous attempts at producing a non-algorithmic definition. We present alternative definitions based on infinite time machines and set theory, and explain how and why randomness is strongly linked to strong axioms of infinity.
- Published
- 2002
31. A Template-Based Approach Toward Acquisition of Logical Sentences
- Author
-
Chih-Sheng Johnson Hou, Mark A. Musen, and Natalya F. Noy
- Subjects
Phrase, Knowledge representation and reasoning, Computer science, Representation (arts), Ontology (information science), Knowledge acquisition, Domain (software engineering), Mathematical logic and formal languages, Frame (artificial intelligence), Artificial intelligence, Natural language processing, Axiom - Abstract
Ontology-development languages may allow users to supplement frame-based representations with arbitrary logical sentences. In the case of the Ontolingua ontology library, only 10% of the ontologies have any user-defined axioms. We believe the common complaint that "writing axioms is difficult" accounts for this phenomenon; domain experts often cannot translate their thoughts into symbolic representation. We attempt to reduce this chasm in communication by identifying groups of axioms that manifest common patterns, creating 'templates' that allow users to compose axioms by 'filling in the blanks.' We studied axioms in two public ontology libraries and derived 20 templates that cover 85% of all the user-defined axioms. We describe our methodology for collecting the templates and present sample templates. We also define several properties of templates that will allow users to find an appropriate template quickly. Thus, our research entails a significant simplification of the process for acquiring axioms from domain experts. We believe that this simplification will foster the introduction of axioms and constraints that are currently missing in the ontologies.
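The 'filling in the blanks' idea lends itself to a small data-structure sketch. The template below, with its KIF-like surface syntax and slot names, is hypothetical; it merely illustrates how a fixed axiom pattern plus named slots can generate a concrete axiom.

```python
# Minimal sketch: an axiom template with named slots that users fill in.
from dataclasses import dataclass

@dataclass
class AxiomTemplate:
    name: str
    pattern: str              # surface pattern with named slots
    slots: tuple[str, ...]

    def instantiate(self, **bindings: str) -> str:
        missing = set(self.slots) - bindings.keys()
        if missing:
            raise ValueError(f"unfilled slots: {missing}")
        return self.pattern.format(**bindings)

# hypothetical "disjoint classes" template in a KIF-like syntax
disjoint = AxiomTemplate(
    name="disjoint-classes",
    pattern="(forall (?x) (not (and ({a} ?x) ({b} ?x))))",
    slots=("a", "b"),
)
print(disjoint.instantiate(a="Dog", b="Cat"))
# (forall (?x) (not (and (Dog ?x) (Cat ?x))))
```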
- Published
- 2002
32. An Axiomatic Approach to Some Biological Themes
- Author
-
Lodovico Galleni, Marco Forti, and Paolo Freguglia
- Subjects
Field (Bourdieu), Natural science, Criticism, Axiomatic system, Axiom, Scientific activity, Simple (philosophy), Epistemology - Abstract
In this paper we continue the enterprise of elaborating an axiomatic framework suitable for modern Biology, started in [1, 2, 3]. We isolate and propose to the attention of the scientific community a few axioms expressing some basic biological facts and their theoretical interpretations. We hope to foster criticism and contributions from researchers in any field of the Natural Sciences. In fact, we conceive this paper as part of a general research programme on the Foundations of Science, originated in the Eighties by E. De Giorgi at the Scuola Normale Superiore, Pisa, and carried on by several researchers after his death in 1996 (see [4, 5, 6, 7, 8]). The aim of this programme is not to provide safe and unquestionable grounds for scientific activity, but rather to develop conceptual environments where this activity can be carried out naturally and without artificial constraints, and where the notions considered in various disciplines can be presented in a simple and clear-cut way, so as to allow for both a deeper analysis by researchers in the field and a fruitful interdisciplinary debate.
- Published
- 2002
33. Dual Approaches to State-Contingent Supply Response Systems under Price and Production Uncertainty
- Author
-
John Quiggin and Robert Chambers
- Subjects
Set (abstract data type), Risk aversion, Production model, Economics, Arrow, Production (economics), State (functional analysis), Mathematical economics, Axiom, Dual (category theory) - Abstract
Following a research direction originally set by Debreu (1959), Arrow (1953), Hirshleifer (1965), and Yaari (1969), Chambers and Quiggin (1992, 1996, 1997, 1998, 2000, 2001a, 2001b) and Quiggin and Chambers (1998a, 1998b, 2000) have studied the axiomatic foundations and theoretical applications of state-contingent production models. Among other results, they have shown that dual cost structures exist for state-contingent technologies and that these dual cost structures can be used to simplify the analysis of stochastic decision making. The guiding principle of their work was elucidated almost 50 years ago by Debreu. The state-contingent approach “allows one to obtain a theory of uncertainty free from any probability concept and formally identical with the theory of certainty.”
- Published
- 2002
34. Generalized Associativity on Rectangular Quasigroups
- Author
-
Aleksandar Krapež
- Subjects
Mathematics::Group Theory, Pure mathematics, Mathematics::General Mathematics, Mathematics::Operator Algebras, Semigroup, Word problem (mathematics), Axiom, Quasigroup, Associative property, Direct product, Computer Science::Cryptography and Security, Mathematics - Abstract
A type of groupoid called a rectangular quasigroup is defined as a direct product of a left zero semigroup, a quasigroup and a right zero semigroup. We give three different axiom systems for these groupoids. Some important properties of rectangular quasigroups are derived, among them the solvability of the word problem.
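The direct-product construction is easy to make concrete. The sketch below builds a small rectangular quasigroup from a two-element left zero semigroup, the quasigroup Z_3 under subtraction, and a two-element right zero semigroup, then checks the quasigroup property in the middle component; all specific choices are illustrative assumptions.

```python
# Minimal sketch: a rectangular quasigroup as the direct product
# (left zero semigroup) x (quasigroup) x (right zero semigroup).
from itertools import product

L = ["l0", "l1"]                    # left zero semigroup: a * b = a
Q = [0, 1, 2]                       # quasigroup: a * b = (a - b) mod 3
R = ["r0", "r1"]                    # right zero semigroup: a * b = b

def op(u, v):
    (a, p, _), (_, q, b) = u, v
    return (a, (p - q) % 3, b)      # componentwise product

elems = list(product(L, Q, R))
# quasigroup check in the middle component: for each fixed left factor,
# the middle results range over all of Q (row of the Cayley table is onto)
assert all(len({op(u, v)[1] for v in elems}) == 3 for u in elems)
print(op(("l0", 2, "r1"), ("l1", 1, "r0")))   # ('l0', 1, 'r0')
```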
- Published
- 2002
35. Value Function Methods: Indirect And Interactive
- Author
-
Theodor J. Stewart and Valerie Belton
- Subjects
Combinatorics, Bellman equation, Importance Weight, Independence (mathematical logic), Decision maker, Value (mathematics), Axiom, Mathematics - Abstract
In the previous chapter, we discussed different methods for assessing value functions expressed in the additive form $$V(a) = \sum_{i=1}^{m} w_i v_i(a) \qquad (6.1)$$ on the assumption that the relevant preferential independence axioms hold. For the purposes of this chapter, it will be convenient to reformulate the above into the form $$V(a) = \sum_{i=1}^{m} u_i(a) \qquad (6.2)$$ where $u_i(a) = w_i v_i(a)$. In other words, the partial value functions are now scaled in proportion to their importance weights.
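A minimal numerical illustration of (6.1) and (6.2), with hypothetical weights and partial values:

```python
# Minimal sketch of the additive value function V(a) = sum_i w_i * v_i(a).
weights = [0.5, 0.3, 0.2]          # importance weights w_i (hypothetical)
partial_values = [0.8, 0.4, 1.0]   # v_i(a) on a 0-1 scale (hypothetical)

# rescaled partial values u_i(a) = w_i * v_i(a), as in (6.2)
u = [w * v for w, v in zip(weights, partial_values)]
V = sum(u)                         # overall value of alternative a
print(u, V)                        # approx [0.4, 0.12, 0.2] and 0.72
```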
- Published
- 2002
36. Logical Foundation of Multicriteria Preference Aggregation
- Author
-
Raymond Bisdorff
- Subjects
Section (archaeology), Computer science, Concordance, Computational logic, Foundation (evidence), Coherence (philosophical gambling strategy), Aggregation problem, Mathematical economics, Preference (economics), Axiom - Abstract
In this chapter, we would like to show Bernard Roy's contribution to modern computational logic. We first present his logical approach to multicriteria preference modelling, in which decision aid is based upon a refined methodological construction that provides the family of criteria with important logical properties, giving access to the concordance principle used for aggregating preferential assertions from multiple semiotical points of view. In a second section, we introduce the semiotical foundation of the concordance principle and present a new formulation of it, together with the associated necessary coherence axioms imposed on the family of criteria. This new methodological framework allows us, in a third part, to extend the classical concordance principle and its associated coherence axioms, first to potentially redundant criteria, and then to missing individual evaluations and even partial performance tableaux.
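As a rough illustration of the concordance principle (not Roy's full formulation, which involves additional thresholds and discordance tests), the sketch below computes the weighted share of criteria supporting the assertion "a outranks b"; all numbers are hypothetical.

```python
# Minimal sketch: a simple concordance index for "a outranks b".
def concordance(a, b, weights):
    """Fraction of total criterion weight agreeing that a >= b."""
    supporting = sum(w for ga, gb, w in zip(a, b, weights) if ga >= gb)
    return supporting / sum(weights)

weights = [3, 2, 2, 1]      # importance of the four criteria
g_a = [7, 5, 9, 4]          # performances of alternative a
g_b = [6, 6, 8, 4]          # performances of alternative b

print(concordance(g_a, g_b, weights))   # (3 + 2 + 1) / 8 = 0.75
```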
- Published
- 2002
37. General Properties of Information
- Author
-
Jan Kåhre
- Subjects
Set (abstract data type), Computer science, Deterministic system (philosophy), Common sense, Information measure, Information theory, Transfer function, Mathematical economics, Axiom - Abstract
In Chapter 1 we argued that The Law of Diminishing Information is a necessary condition for an information measure to be acceptable. But is the Law (2.7.1) also a sufficient condition, in the sense that it covers all mathematical aspects of information? Many intuitive properties have been attributed to information measures. We have systematically collected the proposed information properties, both academic desiderata from classical information theory and common-sense ones from proverbs and poetry. Of course it is not possible to guarantee that any such collection is exhaustive, but we will show that all the acceptable intuitive properties and desiderata can be derived as theorems using the Law (2.7.1) as an axiom.
- Published
- 2002
38. A Note on the Holler-Packel Axiomatization of the Public Good Index (PGI)
- Author
-
Stefan Napel
- Subjects
Index (economics), Bargaining power, Power index, Value (economics), Economics, Independence (mathematical logic), Public good, Mathematical economics, Measure (mathematics), Axiom - Abstract
The public good index (PGI) measures a priori power in coalition games while interpreting the coalition value as a public good. It was axiomatized by Holler and Packel (1983), which greatly facilitated comparison with other power measures. This note fills a gap that was left in the original axiomatization: the independence and non-redundancy of the four Holler-Packel axioms are demonstrated. Holler and Packel's axiomatization of the PGI — and thus this completion — is of relevance not only for the PGI itself but also for another power index, the member bargaining power measure (MBP).
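For readers unfamiliar with the PGI, the sketch below computes it for a hypothetical weighted voting game: only minimal winning coalitions count, and each of their members is credited equally, reflecting the public-good interpretation of the coalition value.

```python
# Minimal sketch: Public Good Index for a hypothetical weighted voting game.
from itertools import combinations

weights = {"A": 4, "B": 3, "C": 2, "D": 1}
quota = 6

def wins(coalition):
    return sum(weights[p] for p in coalition) >= quota

players = list(weights)
# minimal winning coalitions: winning, and losing if any member is removed
mwcs = [set(c) for r in range(1, len(players) + 1)
        for c in combinations(players, r)
        if wins(c) and all(not wins(set(c) - {p}) for p in c)]

counts = {p: sum(p in m for m in mwcs) for p in players}
total = sum(counts.values())
pgi = {p: counts[p] / total for p in players}
print(mwcs, pgi)   # MWCs {A,B}, {A,C}, {B,C,D}; PGI = (2/7, 2/7, 2/7, 1/7)
```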
- Published
- 2001
39. Learning About Speech from Data: Beyond NETtalk
- Author
-
Robert I. Damper
- Subjects
Learnability, Emerging technologies, Computer science, First language, Speech synthesis, Blank, Focus (linguistics), NETtalk, Artificial intelligence, Natural language processing, Axiom - Abstract
Speech synthesis is an emerging technology with a wide range of potential applications. In most such applications, the message to be spoken will be in the form of text input, so the main focus of development is text-to-speech (TTS) synthesis. Strongly influenced by the academic traditions of generative linguistics, early work on TTS systems took it as axiomatic that a knowledge-based approach was essential to successful implementation. Presumed theoretical constraints on the learnability of their native language by humans were applied by extension to machine learners to conclude the futility of trying to make useful ‘blank slate’ inferences about speech and language simply from exposure. This situation has changed dramatically in recent years with the easy availability of computers to act as machine learners and large databases to act as training resources. Many positive achievements in machine learning have comprehensively proven its usefulness in a range of natural language processing tasks, despite the negative assumptions of earlier times. Thus, contemporary speech synthesis relies heavily on data-driven techniques.
- Published
- 2001
40. Basic Elements of Dempster-Shafer Theory
- Author
-
Ivan Kramosil
- Subjects
Set (abstract data type), Discrete mathematics, Dempster–Shafer theory, Probability distribution, Function (mathematics), Computer Science::Artificial Intelligence, Finite set, Axiom, Real number, Mathematics, Unit interval - Abstract
The greatest part of the works dealing with the fundamentals of Dempster-Shafer theory is conceived either on the combinatoric or on the axiomatic level, but in both cases on a very abstract one. The first approach begins with the assumption that $S$ is a nonempty finite set, that $m$ is a mapping ascribing to each $A \subseteq S$ a real number $m(A)$ from the unit interval $[0,1]$ in such a way that $\sum_{A \subseteq S} m(A) = 1$ ($m$ is called a basic probability assignment on $S$), and that the (normalized) belief function induced by $m$ is the mapping $\mathrm{bel}_m : \mathcal{P}(S) \to [0,1]$ defined, for each $A \subseteq S$, by $\mathrm{bel}_m(A) = (1 - m(\emptyset))^{-1} \sum_{\emptyset \neq B \subseteq A} m(B)$ if $m(\emptyset) < 1$, with $\mathrm{bel}_m$ undefined otherwise (Shafer (1976) and elsewhere). The other (axiomatic) approach begins with the idea that a belief function on a finite nonempty set $S$ is a mapping $\mathrm{bel} : \mathcal{P}(S) \to [0,1]$ satisfying certain conditions (obeying certain axioms, in other terms). If these conditions (axioms) are strong and reasonable enough, it can be proved that a basic probability assignment $m$ on $S$ can be defined uniquely such that the belief function induced by $m$ is identical with the original belief function defined by the axioms, so that the two approaches meet and yield the same notion of belief function (Smets (1994)). The problems of how to understand and obtain the probability distribution $m$ over $\mathcal{P}(S)$ in the first case, and of how to justify the particular choice of the demands imposed on belief functions in the second, are put aside or "factored out", and are not taken as part of Dempster-Shafer theory in its formalized setting.
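A minimal computational sketch of the first (combinatoric) approach, with a hypothetical basic probability assignment:

```python
# Minimal sketch: a basic probability assignment m over subsets of S and
# the normalized belief function it induces.
S = frozenset({"x", "y", "z"})

m = {                                # hypothetical assignment, sums to 1
    frozenset(): 0.1,
    frozenset({"x"}): 0.3,
    frozenset({"x", "y"}): 0.4,
    S: 0.2,
}

def bel(A: frozenset) -> float:
    """bel_m(A) = (1 - m(empty))^-1 * sum of m(B) over nonempty B subset of A."""
    mass = sum(v for B, v in m.items() if B and B <= A)
    return mass / (1.0 - m.get(frozenset(), 0.0))

print(bel(frozenset({"x", "y"})))    # (0.3 + 0.4) / 0.9 ~= 0.778
```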
- Published
- 2001
41. Many Valued Convergence Theory
- Author
-
Ulrich Höhle
- Subjects
Algebra, Computer science, Normal convergence, Convergence (routing), Limit point, Mathematics::Metric Geometry, Symbolic convergence theory, Topological space, Monad (functional programming), Modes of convergence, Axiom - Abstract
Convergence is one of the most fundamental topological concepts. In order to avoid misunderstandings: we are interested neither in many valued convergence structures nor in the characterization of many valued topologies by appropriate convergence axioms; rather, we are interested in a comprehensive study of all those topological notions and axioms which can be based on the B-valued filter monad or on some of its submonads.
- Published
- 2001
42. Categorical Basis of Topology
- Author
-
Ulrich Höhle
- Subjects
Compact space, Mathematics::Category Theory, Hausdorff space, Closure operator, Universal property, Topological space, Topology, Categorical variable, Axiom, Mathematics, Separation axiom - Abstract
In this chapter we lay down the categorical formulation of the most important topological notions and axioms — e.g. Hausdorff's separation axiom, regularity, compactness. Among other things we will present a categorical version of J. Dieudonné's principle of continuous extension and give a categorical discussion of the Tychonov theorem. We start with the formulation of topological space objects based on a given category C.
- Published
- 2001
43. Values for Multialternative Games and Multilinear Extensions
- Author
-
Rie Ono-Yoshida
- Subjects
Multilinear map, Banzhaf value, Extension (predicate logic), Mathematical economics, Value (mathematics), Axiom, Mathematics - Abstract
We generalize the Banzhaf value to the multialternative games defined by Bolger. We develop a new value, called a Banzhaf-like value, based on axioms similar to those of the Bolger value. Following the derivation of the Banzhaf value, we modify one of Bolger's axioms. We also apply Owen's multilinear extension to multialternative games to show that this application yields a Banzhaf-like value as well.
- Published
- 2001
44. An Average Value Function for Cooperative Games
- Author
-
William Kerby and Vladimir Akimov
- Subjects
Mathematics of computing, Theory of computation, Shapley value, Blocking (computing), Dual (category theory), Core (game theory), Bellman equation, Voting, Value (economics), Economics, Mathematical economics, Axiom - Abstract
The Shapley value measures the contribution of a player to the grand coalition. In many practical situations, however, the grand coalition does not form. A new value is introduced which, by taking all possible coalitions into account, measures the coalition supporting power of each player. It is shown that the Shapley value can be expressed as the sum of the coalition supporting value and the coalition suppressing value. The coalition supporting value of a player is his average Shapley value taken over all subgames, and his coalition suppressing value is his average Shapley value taken over all dual subgames. An axiomatic characterisation of the value function is also presented. As an example, the coalition supporting power (passing power) and the coalition suppressing power (blocking power) in the voting system of the UN Security Council are computed.
- Published
- 2001
45. Restricted Cooperation in Games
- Author
-
Anne van den Nouweland and Marco Slikker
- Subjects
Computer science, Section (typography), Link (knot theory), Grand coalition, Value (mathematics), Shapley value, Mathematical economics, Axiom - Abstract
In this chapter, we put the two main concepts introduced in chapter 1 together and study games with restricted communication. In section 2.1 we show how restrictions on communication are integrated into a coalitional game; this section contains the definitions of the network-restricted game and the link game. These games are used in the following two sections to define two allocation rules for games with communication restrictions: the Myerson value in section 2.2 and the position value in section 2.3. Both of these values are based on the Shapley value, which was discussed extensively in section 1.1. A completely different value is also discussed in section 2.3. Throughout this book, the Myerson value is the predominant allocation rule for games with communication restrictions; we discuss this rule and some of its axiomatic characterizations extensively in section 2.2.
- Published
- 2001
46. Microfacies Analysis Assisting Archaeological Stratigraphy
- Author
-
Marie-Agnès Courty
- Subjects
Archaeological record, Matrix (music), Sedimentary rock, Excavation, Stratigraphy (archaeology), Archaeology, Geology, Natural (archaeology), Axiom - Abstract
Accurate construction of archaeological stratigraphy has long been recognized as crucial in providing a solid chronocultural framework for discussing past behavioral activities and their linkages with geological processes (Gasche and Tunca, 1984; Harris, 1979). As a consequence, a major effort during excavation has been directed toward the definition of individual strata and their spatial variations. This goal has been accomplished through careful observation of the properties of the sedimentary matrix and its organization in three-dimensional space. The interfering effects of natural agents and human activities on the accumulation of the sedimentary matrix have been considered by some to conform to the principle of stratigraphic succession—as elaborated by earth scientists—and thus to geological laws (Renfrew, 1976; Stein, 1987). Others have strongly argued that the rules and axioms of geological sedimentation cannot be applied to archaeological layers because these are produced by people and thus constitute an entirely distinct set of phenomena (Harris, 1979; Brown and Harris, 1993). Understanding the processes involved in the formation of archaeological stratification has also long been a question of passionate debate, with the views of human or natural deposition being opposed to the theory of biological mixing (Johnson and Watson-Stegner, 1990). These contradictory perceptions have been tentatively reconciled by the recognition of the inherent general complexity of archaeological stratigraphy, which can be isolated into its lithostratigraphic, chronostratigraphic, and ethnostratigraphic components (Barham, 1995; Gasche and Tunca, 1984).
- Published
- 2001
47. Global Monotonicity of Values of Cooperative Games: An Argument Supporting the Explanatory Power of Shapley’s Approach
- Author
-
Peter Silárszky and René Levínský
- Subjects
Set (abstract data type), Mathematics::Logic, Computer Science::Computer Science and Game Theory, Axiomatic system, Extension (predicate logic), Function (mathematics), Transferable utility, Solution concept, Mathematical economics, Shapley value, Axiom, Mathematics - Abstract
In 1953, Shapley proposed a solution concept for cooperative games with transferable utility. The Shapley value is a unique function which obeys three axioms — symmetry, efficiency and additivity. The aim of our article is to provide a new axiomatic approach which classifies the existing values (indices). Shapley’s efficiency and symmetry conditions are kept whereas the additivity axiom is replaced by the axiom of global monotonicity. The Shapley value satisfies the new set of axioms. Some other values (indices) also satisfy the new set of axioms. However, our extension of the set of acceptable values (indices) excludes the Banzhaf-Coleman and Holler-Packel indices.
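For concreteness, the sketch below computes the Shapley value of a hypothetical three-player TU game by averaging marginal contributions over all player orderings; the efficiency and symmetry axioms mentioned above can be checked directly on the output.

```python
# Minimal sketch: Shapley value of a 3-player TU game via orderings.
from itertools import permutations

players = (1, 2, 3)
v = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
     frozenset({1, 2}): 90, frozenset({1, 3}): 80, frozenset({2, 3}): 70,
     frozenset({1, 2, 3}): 120}     # hypothetical characteristic function

shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    coalition = frozenset()
    for p in order:
        # marginal contribution of p when joining the growing coalition
        shapley[p] += v[coalition | {p}] - v[coalition]
        coalition = coalition | {p}
shapley = {p: s / len(orders) for p, s in shapley.items()}
print(shapley)   # {1: 45.0, 2: 40.0, 3: 35.0}; sums to v(N) = 120 (efficiency)
```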
- Published
- 2001
48. Preliminaries on Axiomatic Probability Theory
- Author
-
Ivan Kramosil
- Subjects
Discrete mathematics, Probability theory, Order (exchange), Aside, Computer science, Subject (philosophy), Axiomatic system, Formal tool, Mathematical economics, Matter of fact, Axiom - Abstract
This work has been conceived as a purely theoretical and mathematical study dealing with the subject of its interest at a highly abstract and formalized level. Probability theory will serve as one formal tool, and in fact the most important and most powerful one, used below to achieve this goal. Therefore, beginning with a brief survey of the most elementary notions of probability theory, just the most elementary abstract ideas and constructions of axiomatic probability theory, as settled by Kolmogorov (1974), are presented in this chapter, intentionally leaving aside all the informal discussions, motivations, and practical examples that precede the formalized exposition of probability theory in the greater part of textbooks and monographs dealing with this theory. The reader interested in these informal parts of probability theory is kindly invited to consult an appropriate textbook or monograph; let us mention explicitly the already classical textbooks Feller (1957) and Gnedenko (1965), where just these informal parts are explained very carefully, in detail, and with a number of examples. On the other hand, Loève (1960) treats probability theory at an exclusively abstract and formalized level.
- Published
- 2001
49. Questioning Stern-Gerlach
- Author
-
Tore Wessel-Berg
- Subjects
Theoretical physics, Stern–Gerlach experiment, Philosophy, Mathematical formulation of quantum mechanics, Subject (philosophy), Quantum, Axiom - Abstract
The Stern-Gerlach experiment [18] is hailed as a dramatic demonstration of the necessity of a radical departure from the concepts of classical mechanics. The basic postulates of quantum mechanics are often formulated in an axiomatic manner with the example of the Stern-Gerlach experiment in the back of the reader's mind. In fact, several textbooks on quantum mechanics introduce the subject by devoting the very first chapter to the experiment [41], [43], thereby intending to attune the reader from the very beginning to the 'quantum mechanical way' of thinking.
- Published
- 2001
50. Probabilistic Model of Decision Making under Uncertainty
- Author
-
Ivan Kramosil
- Subjects
Expected value of including uncertainty, Computer science, Probabilistic-based design optimization, Probabilistic logic, Influence diagram, Language of mathematics, Mathematical economics, Probabilistic relevance model, Axiom, Optimal decision - Abstract
Like the last chapter, this one too could be conceived at a purely formalized level, speaking about sets, mappings, functions, relations and ordered n-tuples of such objects satisfying mathematically formalized demands. The difference between the two chapters is that the intuition, interpretation and motivation behind axiomatic probability theory can be found in most of the textbooks and monographs dealing with that theory, whereas for general probabilistic and statistical models of decision making under uncertainty the situation is not so simple; let us mention Lehman (1947) or Blackwell and Girshick (1954) as good introductory texts. We therefore begin our explanation using informal terms charged with some extra-mathematical semantics, but our intention is to return to a formalized mathematical language as soon as possible.
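A minimal sketch of the kind of decision problem such models formalize, namely choosing an act that maximizes expected utility under a known distribution over states; all numbers are hypothetical and far simpler than the chapter's general models.

```python
# Minimal sketch: expected-utility choice under uncertainty.
states = ["s1", "s2"]
prob = {"s1": 0.7, "s2": 0.3}                 # hypothetical state probabilities
utility = {("act_a", "s1"): 10, ("act_a", "s2"): -5,
           ("act_b", "s1"): 4,  ("act_b", "s2"): 6}

def expected_utility(act):
    return sum(prob[s] * utility[(act, s)] for s in states)

best = max(["act_a", "act_b"], key=expected_utility)
print(best, expected_utility(best))           # act_a 5.5
```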
- Published
- 2001