672 results for '"a priori probability"'
Search Results
602. Performance of Likelihood Ratio Processors
- Author
-
W.S. Hodgkiss
- Subjects
A priori probability ,business.industry ,Aerospace Engineering ,Probability density function ,Pattern recognition ,Function (mathematics) ,Moment-generating function ,Bayes' theorem ,Prior probability ,Artificial intelligence ,Electrical and Electronic Engineering ,business ,Likelihood function ,Random variable ,Mathematics - Abstract
Summarized by its ROC curve, the performance of a Bayes optimal detector is a function of two joint a priori probability density functions on the uncertain parameters involved. An investigation is made to determine at what point in the generation of an ROC curve for a given pair of priors sufficient information is retained for the calculation of a new ROC based on a different pair of priors.
- Published
- 1979
- Full Text
- View/download PDF
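The abstract above concerns the ROC curve of a Bayes optimal (likelihood ratio) detector. As a minimal sketch of how such a curve is generated, the snippet below traces the ROC for the textbook case of unit-variance Gaussian observations with an assumed signal shift mu; the threshold sweep, mu, and the Gaussian model are illustrative assumptions, not the paper's setup.

```python
import math

def roc_point(threshold, mu=1.0):
    """(Pfa, Pd) for a likelihood-ratio test between N(0,1) and N(mu,1).

    For unit-variance Gaussians the LR test reduces to thresholding the
    observation itself, so the ROC has a closed form via the normal
    upper-tail function. mu is an assumed signal strength.
    """
    q = lambda x: 0.5 * math.erfc(x / math.sqrt(2.0))  # upper tail of N(0,1)
    pfa = q(threshold)        # false alarm: noise alone exceeds threshold
    pd = q(threshold - mu)    # detection: the signal shifts the mean by mu
    return pfa, pd

# Sweep thresholds from -3 to 3 to trace the ROC curve.
curve = [roc_point(t / 10.0) for t in range(-30, 31)]
```

Since mu > 0, every point satisfies Pd >= Pfa, i.e. the curve lies above the chance diagonal.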
603. A method of matching data
- Author
-
Michael Z. Hanani and Naphtali Rishe
- Subjects
A priori probability ,Matching (statistics) ,Theoretical computer science ,unreliable data ,General Mathematics ,Probabilistic logic ,Statistical model ,Object (computer science) ,Data type ,Set (abstract data type) ,Matching ,Engineering(all) ,Parametric statistics ,Mathematics - Abstract
A probabilistic model and a software implementation have been developed to aid in finding missing persons and in related applications. The method can be applied generally to find the most probable correspondences between two sets of imprecisely described objects. These can be descriptions of illnesses vs. patients (diagnosis), job offerings vs. job applicants, special tasks vs. a personnel file (the task-assignment problem), etc. Every object of the two sets is described by a collection of data, a significant part of which can be erroneous, unreliable, imprecise or given in several contradicting versions. Among the parameters of the method is the following information about each of the data-item types and about some of their possible combinations (the parametric information does not depend on the actual data): its logical characteristics; its importance relative to other types of data items; the meaning and the relative degrees of kinship between values of this data item for two objects to be compared (e.g. kinship of equal values; phonetic kinship; numeric kinship, whose degree is proportional to the inverse of the arithmetic difference between the values; a matrix of kinship degrees defined for possible pairs of values); the interpretation of multiple values of this data item for one object; the a priori probability of a data item's correctness (in addition, the probability of any value for any object can be provided in a set of object descriptions by an investigator who gathers the actual data); etc. A straightforward implementation of the method in software would result in unfeasible time complexity for large sets of objects. Therefore special algorithms have been designed to preprocess the sets of descriptions so that the match-finding time is reduced by an order of magnitude while the probabilistic output remains unaltered.
- Published
- 1987
- Full Text
- View/download PDF
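The abstract above combines per-field "kinship" degrees with a priori reliabilities to score candidate matches. A hypothetical sketch in that spirit follows; the field names, reliability weights, and the averaging rule are all invented for illustration and are not the paper's actual method.

```python
# Hypothetical field-matching score in the spirit of the abstract:
# each shared field contributes a kinship degree in [0, 1], weighted by
# an assumed a priori reliability of that field's data.

def numeric_kinship(a, b):
    """Kinship inversely proportional to the arithmetic difference."""
    return 1.0 / (1.0 + abs(a - b))

def match_score(rec1, rec2, reliability):
    """Reliability-weighted average kinship over the shared fields."""
    score, weight = 0.0, 0.0
    for field, r in reliability.items():
        if field in ("age", "height_cm"):
            k = numeric_kinship(rec1[field], rec2[field])
        else:
            k = 1.0 if rec1[field] == rec2[field] else 0.0  # exact-match kinship
        score += r * k
        weight += r
    return score / weight

# Illustrative records and reliabilities (all assumed).
reliability = {"surname": 0.9, "age": 0.6, "height_cm": 0.5}
a = {"surname": "Smith", "age": 34, "height_cm": 180}
b = {"surname": "Smith", "age": 36, "height_cm": 180}
c = {"surname": "Jones", "age": 50, "height_cm": 150}
```

Here `match_score(a, b, reliability)` exceeds `match_score(a, c, reliability)`, ranking the near-duplicate record as the more probable correspondence.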
604. Isomer Distribution in Hydrocarbons from the Fischer‐Tropsch Process
- Author
-
R. A. Friedel and Sol Weller
- Subjects
A priori probability ,Inorganic chemistry ,General Physics and Astronomy ,chemistry.chemical_element ,Fischer–Tropsch process ,Chemical reaction ,Catalysis ,Pentane ,chemistry.chemical_compound ,chemistry ,Molecule ,Physical chemistry ,Physical and Theoretical Chemistry ,Cobalt ,Octane - Abstract
The isomer distribution experimentally found for the saturated-hydrocarbon products (pentane to octane range) from a cobalt Fischer-Tropsch catalyst has been deduced from probability considerations. With certain restrictions, it is assumed that the carbon skeleton is built up by addition to any terminal carbon atom (for which the a priori probability has a constant value a) or to any penultimate carbon atom (with a priori probability b). Values of a and b, determined from experimental data, led to deduced isomer concentrations in each molecular-weight cut agreeing with experimental concentrations with an average deviation of 0.7 percent. Peculiarities in the observed isomer distribution were also reproduced.
- Published
- 1949
- Full Text
- View/download PDF
605. What triggers causal attributions? The impact of valence and subjective probability
- Author
-
Gerd Bohner, Fritz Strack, Herbert Bless, and Norbert Schwarz
- Subjects
A priori probability ,education.field_of_study ,Social Psychology ,Social perception ,Population ,Cognition ,Causal reasoning ,Valence (psychology) ,Psychology ,Attribution ,Everyday life ,education ,Social psychology - Abstract
Various field studies and experimental simulations have demonstrated that causal reasoning increases after unexpected as well as after unpleasant events. However, unpleasant events are seen as less likely than pleasant ones in everyday life. Accordingly, the subjective probability of the event and its hedonic quality were naturally confounded in these studies. To isolate the contribution of both determinants, the subjective probability and the valence of an event were independently manipulated in a laboratory experiment. Subjects completed an ostensible ‘professional skills test’ and received either success or failure feedback in relation to a criterion set by the experimenter. The subjective probability of success was varied by informing subjects about the distribution of success and failure in a comparable population (either 23 per cent or 77 per cent were said to meet the criterion). The results indicate a pronounced valence effect: the intensity of causal reasoning and the number of possible reasons reported for the outcome were greater after negative than after positive feedback, independent of the a priori probability of the outcome. No evidence for an increase in causal explanations after unexpected, as compared to expected, events was obtained. Several mediating processes are discussed.
- Published
- 1988
606. Random Partition of an Integer n (RANPAR)
- Author
-
Herbert S. Wilf and Albert Nijenhuis
- Subjects
Discrete mathematics ,Combinatorics ,A priori probability ,Subroutine ,A priori and a posteriori ,Partition (number theory) ,Mathematics ,Linear array - Abstract
This chapter presents an algorithm for generating a random partition of an integer n. For a given n ≥ 1, one wishes to select a partition of n uniformly at random, so that each partition has an a priori probability of 1/p(n) of being selected. The chapter describes an algorithm which avoids any tabulation of a function of two indices, requiring only a linear array, and gives subroutine specifications for the subprogram RANPAR. In the analysis described in the chapter, the subprogram RANPAR was called 880 times with N = 6. The frequencies with which each of the 11 partitions of 6 were obtained are shown; thus, 6 = 3 + 2 + 1 occurred 83 times, etc. The value of χ² was calculated to be 13.475. In 95% of such experiments, the observed value of χ² would lie between 3.247 and 20.483 if the partitions did indeed have equal a priori probabilities.
- Published
- 1978
- Full Text
- View/download PDF
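The chapter's goal, selecting each partition of n with probability exactly 1/p(n), can be illustrated with a simple table-based sampler. Note this is not the book's linear-array algorithm: it tabulates a two-index function P[m][k] (exactly what RANPAR avoids) purely for clarity.

```python
import random

def partition_table(n):
    """P[m][k] = number of partitions of m into parts of size <= k."""
    P = [[0] * (n + 1) for _ in range(n + 1)]
    for k in range(n + 1):
        P[0][k] = 1                      # the empty partition
    for m in range(1, n + 1):
        for k in range(1, n + 1):
            P[m][k] = P[m][k - 1]        # largest part < k
            if k <= m:
                P[m][k] += P[m - k][k]   # at least one part equal to k
    return P

def random_partition(n, rng):
    """Uniformly random partition of n: each has probability 1/p(n)."""
    P = partition_table(n)
    parts, m, k = [], n, n
    while m > 0:
        j = min(k, m)
        r = rng.randrange(P[m][j])
        # Largest part is exactly j with probability P[m-j][j] / P[m][k].
        while r >= P[m - j][j]:
            r -= P[m - j][j]
            j -= 1
        parts.append(j)
        m, k = m - j, j
    return parts

rng = random.Random(0)
samples = [tuple(random_partition(6, rng)) for _ in range(2000)]
```

With n = 6 the sampler reproduces the chapter's setting: p(6) = 11, and repeated calls visit all 11 partitions with roughly equal frequency.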
607. A Two-Dimensional Continuum of a Priori Probability Distributions on Constituents
- Author
-
Theo A.F. Kuipers and Faculty of Philosophy
- Subjects
Generalized inverse Gaussian distribution ,A priori probability ,Regular conditional probability ,Continuum (topology) ,Joint probability distribution ,ComputingMethodologies_DOCUMENTANDTEXTPROCESSING ,Probability distribution ,Statistical physics ,Convolution of probability distributions ,GeneralLiterature_REFERENCE(e.g.,dictionaries,encyclopedias,glossaries) ,Mathematics ,K-distribution - Abstract
Hintikka has defined a one-dimensional continuum of a priori probability distributions on constituents and has built on it a two-dimensional continuum of inductive methods with the aid of Carnap’s λ-continuum [1], which also plays a fundamental role in his continuum of a priori distributions, and of Bayes’ formula [2]. Here a two-dimensional continuum of a priori probability distributions is introduced. On its basis a three-dimensional continuum of inductive methods can be constructed in the same way as Hintikka has done. The importance of the new continuum of a priori distributions, which is also based on Carnap’s λ-continuum but in a completely different way, is that it leaves room for almost all kinds of a priori considerations, whereas Hintikka’s continuum admits only considerations that lead to increasing probability for the constituents with increasing size.
- Published
- 1976
- Full Text
- View/download PDF
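Carnap's λ-continuum, on which both constructions above build, has a standard representative function that is easy to state in code. The snippet below is that standard one-dimensional λ-rule only, an illustrative simplification, not Kuipers' two-dimensional continuum on constituents.

```python
def carnap_lambda(n_a, n, k, lam):
    """Carnap's lambda-continuum: probability that the next observation
    is of kind A, after n_a occurrences of A among n observations, with
    k possible kinds and mixing parameter lam >= 0.

    lam = 0 recovers the straight rule n_a / n; large lam keeps the
    estimate near the uniform a priori value 1/k.
    """
    return (n_a + lam / k) / (n + lam)
```

With lam = k this reduces to Laplace's rule of succession generalized to k kinds.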
608. Coupled Probabilistic And Possibilistic Uncertainty Estimation In Rule-Based Analysis Systems
- Author
-
Lefteri H. Tsoukalas and Magdi Ragheb
- Subjects
A priori probability ,business.industry ,Fuzzy set ,Probabilistic logic ,Sensitivity analysis ,Artificial intelligence ,business ,Fuzzy logic ,Uncertainty analysis ,Randomness ,Mathematics ,Event (probability theory) - Abstract
A methodology is developed for estimating the performance of monitored engineering devices. Inferencing and decision-making under uncertainty are considered in production-rule analysis systems where the knowledge about the system is both probabilistic and possibilistic. In this case uncertainty is considered as consisting of two components: randomness, describing the uncertainty of occurrence of an object, and fuzziness, describing the imprecision of the meaning of the object. The concepts of information granularity and of the probability of a fuzzy event are used. Propagation of the coupled probabilistic and possibilistic uncertainty is carried out over model-based systems using the rule-based paradigm. The approach provides a measure of both the performance level and the reliability of a device. Two types of uncertainty need to be considered when monitoring the performance of engineering devices: uncertainty due to randomness, which expresses the uncertainty of occurrence of an event, and uncertainty due to fuzziness, which expresses the imprecision of the meaning of an event. Consider for example the statement: "Power will increase about 10% with a probability of 60%". This statement describes an event whose random component is given by "a probability of 60%" and a fuzzy component by "about 10%" [1,2]. Probabilistic or random uncertainty is associated with the outcome of future phenomena, which can be quantified either from symmetry considerations and other mathematical principles (a priori probability) or from past statistical data (a posteriori probability). Fuzzy uncertainty, on the other hand, does not depend on the outcome of an event but on the degree to which an object belongs to a certain set [3]. From fuzzy set theory [4,5], the meaning of the words "about 10%" can be precisely defined by a possibility distribution function, which may be induced by physical constraints or be epistemic in nature, in which case it is induced by a collection of propositions [6].
- Published
- 1987
- Full Text
- View/download PDF
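The "probability of a fuzzy event" used in the abstract above has a compact standard form (Zadeh): P(A) = Σ_x μ_A(x) p(x), the expectation of the membership function under the probability distribution. The membership function for "about 10%" and the outcome probabilities below are invented for illustration, echoing the abstract's power-increase example.

```python
# Zadeh's probability of a fuzzy event: P(A) = sum_x mu_A(x) * p(x).

def about_10(x):
    """Assumed triangular membership for 'about 10%', support [5, 15]."""
    return max(0.0, 1.0 - abs(x - 10.0) / 5.0)

# Assumed probabilistic forecast of the power increase (%) -> probability.
forecast = {0: 0.1, 5: 0.1, 8: 0.2, 10: 0.3, 12: 0.2, 20: 0.1}

# Probability of the fuzzy event "power increases by about 10%".
p_fuzzy = sum(about_10(x) * p for x, p in forecast.items())
```

The coupled statement in the abstract would then weight this fuzzy-event probability by the stated random component (e.g. "a probability of 60%").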
609. The predictive factor--a method to simplify Bayes' formula and its application to diagnostic procedures
- Author
-
H. Tillil, J. Köbberling, and K. Richter
- Subjects
A priori probability ,Statistics as Topic ,Models, Biological ,03 medical and health sciences ,Bayes' theorem ,0302 clinical medicine ,HLA Antigens ,Positive predictive value ,Drug Discovery ,Statistics ,Humans ,Spondylitis, Ankylosing ,Sensitivity (control systems) ,Genetics (clinical) ,HLA-B27 Antigen ,Mathematics ,Variable (mathematics) ,Probability ,Clinical Laboratory Techniques ,Diagnostic test ,Bayes Theorem ,General Medicine ,Prognosis ,Predictive value ,Predictive factor ,030220 oncology & carcinogenesis ,Molecular Medicine ,030215 immunology - Abstract
According to Bayes' rule, the predictive value (PV) of a diagnostic test (= the probability of disease if the test is positive) depends on the prevalence of the disease (= the a priori probability), the sensitivity (c1) and the specificity (c2) of the test. A new variable has been introduced, the predictive factor (c), which is calculated as follows: c = c1/(c1 + 1 − c2). Since the PV depends only on this factor and on the prevalence, the calculation is much easier and a general graphical solution is possible. This simplification offers several additional advantages and facilitates the understanding of the dependence of PV on prevalence.
- Published
- 1984
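The predictive-factor definition above can be checked numerically: writing PV in terms of c and prevalence alone gives the same answer as the full Bayes computation. The sensitivity, specificity, and prevalence values below are illustrative numbers, not the paper's data.

```python
def predictive_value(prevalence, sens, spec):
    """Bayes' rule: P(disease | positive test)."""
    tp = sens * prevalence                 # true-positive mass
    fp = (1 - spec) * (1 - prevalence)     # false-positive mass
    return tp / (tp + fp)

def predictive_factor(sens, spec):
    """c = c1 / (c1 + 1 - c2), as defined in the abstract."""
    return sens / (sens + 1 - spec)

def pv_from_factor(prevalence, c):
    """PV written in terms of c and the prevalence alone."""
    return c * prevalence / (c * prevalence + (1 - c) * (1 - prevalence))
```

For example, with sensitivity 0.90, specificity 0.95, and prevalence 0.10, both routes give PV = 2/3, confirming that c and prevalence suffice.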
610. The Axiomatic Non-Probabilistic Justification of Bayesian Optimality Conditions
- Author
-
Jonas Mockus
- Subjects
A priori probability ,Linear function (calculus) ,Bayesian probability ,Probabilistic logic ,Applied mathematics ,Sample (statistics) ,Function (mathematics) ,Measure (mathematics) ,Axiom ,Mathematics ,Epistemology - Abstract
A definition of Bayesian optimality (2.1.11) was derived assuming that the following conditions are satisfied: 3.1.1. An optimal method should minimize the average losses of deviation. 3.1.2. The losses connected with the deviation from the global minimum f0 are a linear function of the difference f(xN+1) − f0, where xN+1 is the point of final decision. 3.1.3. The function to be minimized is a sample of some stochastic function defined by the a priori probability measure P.
- Published
- 1989
- Full Text
- View/download PDF
611. On quantifying surprise: the variation of event-related potentials with subjective probability
- Author
-
Emanuel Donchin and Connie C. Duncan-Johnson
- Subjects
Male ,A priori probability ,medicine.medical_specialty ,Cognitive Neuroscience ,Magnitude (mathematics) ,Experimental and Cognitive Psychology ,Audiology ,Electroencephalography ,Tone (musical instrument) ,Developmental Neuroscience ,Event-related potential ,medicine ,Waveform ,Humans ,Attention ,Biological Psychiatry ,Probability ,Communication ,medicine.diagnostic_test ,Endocrine and Autonomic Systems ,business.industry ,General Neuroscience ,Brain ,Linear discriminant analysis ,Neuropsychology and Physiological Psychology ,Amplitude ,Neurology ,business ,Psychology - Abstract
Two factors are known to determine the waveform of event-related potentials (ERP) elicited by task-relevant stimuli: the a priori probability of the stimuli and the sequence of immediately preceding stimuli. The relative contribution of these factors to the ERP waveform was assessed at nine levels of a priori probability (from .10 to .90). Random sequences of high (1500 Hz) and low (1000 Hz) tones were presented to 10 male subjects at each level of probability, both when the events were task-relevant and when the subjects were performing an alternate task to which the tones were irrelevant. The EEG was recorded from five midline electrode sites referred to linked mastoids. The amplitude of the P300 and Slow Wave components was inversely proportional to the a priori probability of task-relevant events. At every level of a priori probability, the magnitude of the P300 complex (N200-P300-Slow Wave) was diminished when the eliciting tone repeated the preceding tone, and was enhanced when it was preceded by the other tone. Thus, a priori probability and sequential structure appear to be independent determinants of the P300 complex.
- Published
- 1977
612. Entropy and Uncertainty
- Author
-
Teddy Seidenfeld
- Subjects
A priori probability ,Bayesian probability ,Statistics ,Conditional probability ,Inference ,Entropy (information theory) ,Disjoint sets ,Bayesian inference ,Mathematical economics ,Principle of indifference ,Mathematics - Abstract
This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two where the Bayesian model for MAXENT inference uses an a priori probability that is uniform, and where all MAXENT constraints are limited to 0–1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 establishes a sensitivity of MAXENT inference to the choice of the algebra of possibilities even though all empirical constraints imposed on the MAXENT solution are satisfied in each measure space considered. The resulting MAXENT distribution is not invariant over the choice of measure space. Thus, old and familiar problems with the Laplacean principle of Insufficient Reason also plague MAXENT theory. Result 3 builds upon the findings of Friedman and Shimony (1971,1973) and demonstrates the absence of an exchangeable, Bayesian model for predictive MAXENT distributions when the MAXENT constraints are interpreted according to Jaynes’ (1978) prescription for his (1963) Brandeis Dice problem. Last, Result 4 generalizes the Friedman and Shimony objection to cross-entropy (Kullback-information) shifts subject to a constraint of a new odds-ratio for two disjoint events.
- Published
- 1987
- Full Text
- View/download PDF
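Result 3 above refers to Jaynes' Brandeis Dice problem: the MAXENT distribution on die faces 1..6 subject to a fixed mean is an exponential family, solvable by a one-dimensional search on the Lagrange multiplier. The sketch below solves the classic mean-4.5 instance by bisection; it is a standard illustration of MAXENT, not the essay's construction.

```python
import math

def maxent_die(target_mean, lo=-10.0, hi=10.0, iters=200):
    """Max-entropy distribution on faces 1..6 with a given mean:
    p_i proportional to exp(beta * i); beta found by bisection,
    using the fact that the mean is increasing in beta."""
    def mean(beta):
        w = [math.exp(beta * i) for i in range(1, 7)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, 7), w)) / z
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [math.exp(beta * i) for i in range(1, 7)]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)   # Jaynes' solution: roughly [.054, .079, .114, .165, .240, .347]
```

Since the target mean 4.5 exceeds the uniform mean 3.5, the resulting probabilities increase monotonically in the face value.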
613. Other Solutions
- Author
-
Silviu Guiasu and Mircea Malitza
- Subjects
Weight value ,A priori probability ,Value (economics) ,Stochastic game ,ComputingMilieux_PERSONALCOMPUTING ,MathematicsofComputing_GENERAL ,Economics ,TheoryofComputation_GENERAL ,Extension (predicate logic) ,Measure (mathematics) ,Mathematical economics - Abstract
Publisher Summary This chapter presents an extension to the von Neumann–Morgenstern theory. The von Neumann–Morgenstern theory does not deal with the way in which the members of a coalition share their joint payoff. Shapley introduces values to represent the negotiating strength of each player, that is, what each player may ask of the total payoff achieved by the total coalition in the triad. Shapley argued that if the worth of a coalition can be expressed by a single number, then the value added to a coalition by a player is the number expressing the worth of that coalition with that player in it minus the number expressing the worth of that coalition without that player. To compute the value that a player can expect from participation in a coalition, Shapley suggested that the value added by that player to each of his possible coalitions be multiplied by the a priori probability that each coalition would form, and these products then added together. The sum obtained in this way is the measure of the player's value in the situation. Caplow, too, is concerned with how coalitions are formed and how the payoffs are distributed among players of different strength. A certain weight is attached to every player in a triad. It is not the weight value that counts but the players' weight quotients that do.
- Published
- 1980
- Full Text
- View/download PDF
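Shapley's prescription in the summary above, weighting each player's marginal contribution by the a priori probability that each coalition forms, has an equivalent formulation as an average over all join orders. A small sketch for a triad follows; the three-player majority game used to exercise it is an illustrative choice, not from the chapter.

```python
from itertools import permutations

def shapley_values(players, v):
    """Average each player's marginal contribution over all join orders.

    Weighting all orders equally encodes the 'a priori probability that
    each coalition would form' in Shapley's sense.
    """
    values = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            joined = coalition | {p}
            values[p] += v(joined) - v(coalition)  # marginal contribution
            coalition = joined
    return {p: val / len(orders) for p, val in values.items()}

# Illustrative majority game on a triad: a coalition is worth 1
# iff it has at least two members.
def majority(coalition):
    return 1.0 if len(coalition) >= 2 else 0.0

phi = shapley_values(("A", "B", "C"), majority)
```

By symmetry each player receives 1/3, and the values sum to the worth of the grand coalition, as Shapley's scheme requires.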
614. P3 waves to the discrimination of targets in homogeneous and heterogeneous stimulus sequences
- Author
-
Eric Courchesne, Steven A. Hillyard, and Rachel Y. Courchesne
- Subjects
Adult ,A priori probability ,Adolescent ,Cognitive Neuroscience ,Experimental and Cognitive Psychology ,Stimulus (physiology) ,Correlation ,Combinatorics ,Orienting response ,P3 latency ,Discrimination, Psychological ,Developmental Neuroscience ,Orientation ,medicine ,Reaction Time ,Humans ,Biological Psychiatry ,Communication ,Endocrine and Autonomic Systems ,business.industry ,General Neuroscience ,Brain ,Electrophysiology ,Neuropsychology and Physiological Psychology ,Amplitude ,medicine.anatomical_structure ,Neurology ,Homogeneous ,Scalp ,business ,Psychology - Abstract
Event-related potentials (ERPs) were recorded from subjects who discriminated infrequent target slides (particular letters or numbers) within a sequence of non-target letters. In the first experiment, subjects counted occurrences of a fixed target letter that were randomly intermixed within either a homogeneous sequence of As (deviating condition) or a heterogeneous sequence of all the letters of the alphabet (non-deviating condition). The P3 components (300-600 msec) of the ERPs to the deviating and non-deviating targets were virtually identical in amplitude, waveshape, latency and scalp distribution. Thus, the deviation of a target from an on-going sequence is not a prerequisite for the elicitation of high-amplitude P3 waves. A significant correlation of RT with P3 latency was found across subjects, but not within subjects. In a second experiment, subjects were presented with randomized sequences consisting of 80% As, 10% Bs, and 10% of either letters (between C and Z) or numbers (between 0 and 23). Although the a priori probability of Bs was about 20 times greater than that of any one of the letter or number slides, the averaged P3 waves to each of these types of targets did not differ from one another in amplitude, waveshape or scalp distribution. Apparently, the P3 wave is determined more by the probability of a stimulus class and the associated psychological operation than by the a priori probability of an individual stimulus.
- Published
- 1977
615. PROBABILISTIC DYNAMIC PROGRAMMING FOR FAULT ISOLATION
- Author
-
Theodore J. Sheskin
- Subjects
Dynamic programming ,A priori probability ,Sequence ,Mathematical optimization ,Sample problem ,Computer science ,Probabilistic logic ,Diagnostic test ,Fault detection and isolation ,Reliability engineering ,Test (assessment) - Abstract
The problem of generating a sequence of diagnostic tests which can be executed to isolate a single faulty module in electronic equipment at minimum expected cost is formulated as a probabilistic dynamic program. A test is specified by identifying those modules which must be good for the test to pass. Associated with each test is a known cost. Each module has an a priori probability of failure. Probabilistic dynamic programming is applied to derive a least expected cost testing sequence for a sample problem.
- Published
- 1979
- Full Text
- View/download PDF
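The probabilistic dynamic program described above can be sketched directly: the state is the set of still-suspect modules, each test splits that set into a "fail" part (the modules it uses) and a "pass" part, and the recursion minimizes expected cost. The module priors, tests, and costs below are invented for illustration; they are not the paper's sample problem.

```python
from functools import lru_cache

def isolation_cost(modules, priors, tests):
    """Minimum expected test cost to isolate the single faulty module.

    modules: tuple of names; priors: dict name -> P(module is faulty);
    tests: list of (cost, frozenset_of_modules_the_test_uses).
    A test fails iff the faulty module is among those it uses. The
    tests are assumed sufficient to isolate any single fault.
    """
    @lru_cache(maxsize=None)
    def best(suspects):
        if len(suspects) <= 1:
            return 0.0                         # fault isolated
        total = sum(priors[m] for m in suspects)
        choices = []
        for cost, used in tests:
            fail = suspects & used             # test fails -> fault in here
            ok = suspects - used               # test passes -> fault in here
            if not fail or not ok:
                continue                       # uninformative split
            p_fail = sum(priors[m] for m in fail) / total
            choices.append(cost + p_fail * best(fail)
                                + (1 - p_fail) * best(ok))
        return min(choices)
    return best(frozenset(modules))

# Illustrative instance: three modules, three unit-cost single-module tests.
priors = {"m1": 0.5, "m2": 0.25, "m3": 0.25}
tests = [(1.0, frozenset({"m1"})),
         (1.0, frozenset({"m2"})),
         (1.0, frozenset({"m3"}))]
cost = isolation_cost(("m1", "m2", "m3"), priors, tests)
```

Testing the most likely module first is optimal here: with probability 0.5 one test suffices, otherwise one more is needed, for an expected cost of 1.5.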
616. The dilemma of the solitary thyroid nodule: resolution through decision analysis
- Author
-
Stephen P. Bartol, Shelley L. Bartold, and James C. Sisson
- Subjects
Thyroid nodules ,Adult ,Male ,medicine.medical_specialty ,A priori probability ,Decision Making ,Thyroid Gland ,Solitary thyroid nodule ,medicine ,False positive paradox ,Humans ,Radiology, Nuclear Medicine and imaging ,Ultrasonics ,Thyroid Neoplasms ,Radionuclide Imaging ,Probability ,business.industry ,Nodule (medicine) ,Resolution (logic) ,medicine.disease ,Carcinoma, Papillary ,Surgery ,Dilemma ,Thyroidectomy ,Radiology ,medicine.symptom ,business ,Mathematics ,Decision analysis - Abstract
Patients with solitary thyroid nodules evoke a therapeutic dilemma: to excise or to manage without operation. The dilemma can be resolved rationally for individual patients through decision analysis. Available probabilities of specific events affecting the outcomes subsequent to each treatment are, in some instances, uncertain, but reasonable assumptions should still lead to a logical conclusion. Life utilities are computed in a new way. Abbreviated life and morbidity are valued in terms of losses on the same scale; losses from morbidity equal the years of life a patient would trade for an existence free of morbidity. In the decision analysis tree, a threshold level for decision, the point of therapeutic indifference, is computed. For the thyroid nodule, the threshold level for decision is a probability of benign disease within the nodule. If, from experience, the a priori probability of benign disease exceeds the threshold level for benignancy, the analysis directs no operation to be the better choice. Surgical excision is preferred if the opposite is determined, i.e., the a priori probability of a nonmalignant nodule, based on knowledge from previous patients, is less than the calculated threshold level for the same histological process. In the latter case, which fits most patients with thyroid nodules, a useful diagnostic test must have the capacity to increase the probability of benign disease above the threshold level for decision. In this way, the test appropriately alters the choice of treatment. A radioiodine image portraying good function in the nodule increases the probability of benign disease sufficiently to warrant changing the prescription for operation to no operation. The demonstration of a wholly cystic nodule by ultrasound also correctly redirects the management of some patients. Diagnostic procedures are clinically useful only if they have the potential to change or direct therapeutic decisions, and, thereby, provide better patient care.
Through decision analysis of a specific clinical situation, and with knowledge of true positives and false positives for a diagnostic test, the limitations of usefulness can be determined for the test as it is applied to the clinical situation.
- Published
- 1978
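The "point of therapeutic indifference" described above is where the two choices have equal expected loss. A generic sketch follows, with losses in years of life traded, as the abstract suggests; the loss structure (operating costs something only if the nodule is benign, observing costs something only if malignant) and all numbers are assumptions, not the paper's data.

```python
def threshold_benign(loss_operate_benign, loss_observe_malignant):
    """Probability of benign disease at which expected losses are equal:
    p * loss_operate_benign = (1 - p) * loss_observe_malignant."""
    a, b = loss_operate_benign, loss_observe_malignant
    return b / (a + b)

def better_choice(p_benign, loss_operate_benign, loss_observe_malignant):
    """Pick the action with the smaller expected loss."""
    loss_op = p_benign * loss_operate_benign          # unnecessary surgery
    loss_obs = (1 - p_benign) * loss_observe_malignant  # missed malignancy
    return "no operation" if loss_obs < loss_op else "operate"
```

With assumed losses of 0.5 years for operating on a benign nodule and 4.5 years for observing a malignant one, the threshold is 0.9: a patient whose a priori probability of benign disease exceeds 0.9 is best managed without operation, matching the abstract's decision rule.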
617. Final State Energy Distributions for Exoergic Reactions
- Author
-
S. Fischer
- Subjects
Physics ,A priori probability ,Work (thermodynamics) ,education.field_of_study ,Triatomic molecule ,Population ,Degrees of freedom (physics and chemistry) ,Statistical physics ,education ,Rotation (mathematics) ,Quantum ,Saddle - Abstract
The rapid development of laser techniques makes it possible to get detailed information about product state distributions for reactive collisions. It is the objective of this paper to present a model which makes a priori predictions about vibrational, translational and rotational final state distributions. The vibrational and the translational degrees of freedom are treated quantum mechanically in close analogy to a collinear system. The rotation is treated statistically. The application of the saddle-point method brings into the theory a distribution parameter which might be looked at as the generalization of the scheme of an internal temperature to systems for which the a priori probability of final state population is still dictated by the dynamics. Applications to very exothermic triatomic exchange reactions are discussed and the theoretical predictions are compared with experiments. The work has been performed in collaboration with G. Venzl [1,2,3].
- Published
- 1978
- Full Text
- View/download PDF
618. Maximum Entropy and Bayesian Approach in Tomographic Image Reconstruction and Restoration
- Author
-
Ali Mohammad-Djafari and Guy Demoment
- Subjects
A priori probability ,Covariance matrix ,business.industry ,Principle of maximum entropy ,Bayesian probability ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Pattern recognition ,Prior probability ,Maximum a posteriori estimation ,A priori and a posteriori ,Artificial intelligence ,business ,Algorithm ,Image restoration ,Mathematics - Abstract
In this paper we propose a Bayesian approach with Maximum Entropy (ME) priors to solve an integral equation which arises in various image restoration and reconstruction problems. Our contributions in this paper are the following: i) We discuss the a priori probability distributions which are deduced from different a priori constraints when the principle of ME is used. ii) When the a priori knowledge is only the noise covariance matrix and the image total intensity, and when the maximum a posteriori (MAP) is chosen as the decision rule to determine the values of image pixels, we show that the solution may be obtained by minimizing a criterion in which the structural entropy of the image is used as a particular choice of a regularization functional. The discussion is illustrated with some simulated results.
- Published
- 1989
- Full Text
- View/download PDF
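Point (ii) of the abstract, a MAP estimate obtained by minimizing a data-fit criterion plus an entropy regularizer, can be sketched numerically. The tiny problem below minimizes ||y − Hx||² + λ Σ x_i log x_i over positive x by projected gradient descent; H, y, λ, the step size, and the descent scheme are all illustrative assumptions, not the authors' algorithm.

```python
import math

H = [[1.0, 0.5], [0.0, 1.0], [0.5, 0.5]]   # assumed 3 measurements, 2 pixels
y = [2.0, 1.0, 1.5]                         # assumed data
lam = 0.1                                   # assumed regularization weight

def objective(x):
    resid = sum((sum(H[i][j] * x[j] for j in range(2)) - y[i]) ** 2
                for i in range(3))
    entropy_term = sum(xj * math.log(xj) for xj in x)
    return resid + lam * entropy_term

def step(x, lr=0.05):
    """One projected gradient step, clipping x to stay positive."""
    grad = [0.0, 0.0]
    for i in range(3):
        r = sum(H[i][j] * x[j] for j in range(2)) - y[i]
        for j in range(2):
            grad[j] += 2 * r * H[i][j]          # data-fit gradient
    for j in range(2):
        grad[j] += lam * (math.log(x[j]) + 1)   # entropy gradient
    return [max(1e-6, x[j] - lr * grad[j]) for j in range(2)]

x = [1.0, 1.0]
for _ in range(500):
    x = step(x)
```

The descent drives the regularized criterion well below its starting value while keeping the pixel values positive, which is the role of the entropy prior.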
619. Autocatalytic sets of proteins
- Author
-
Stuart A. Kauffman
- Subjects
Statistics and Probability ,Peptide Biosynthesis ,A priori probability ,Polymers ,Protein Conformation ,Models, Biological ,General Biochemistry, Genetics and Molecular Biology ,Catalysis ,Enzyme catalysis ,Autocatalysis ,Protein structure ,Amino Acids ,Probability ,General Immunology and Microbiology ,Chemistry ,Applied Mathematics ,General Medicine ,Directed graph ,Kinetics ,Biochemistry ,Modeling and Simulation ,Protein Biosynthesis ,Artificial chemistry ,General Agricultural and Biological Sciences ,Biological system ,Autocatalytic set - Abstract
This article investigates the possibility that the emergence of reflexively autocatalytic sets of peptides and polypeptides may be an essentially inevitable collective property of any sufficiently complex set of polypeptides. The central idea is based on the connectivity properties of random directed graphs. In the set of amino acid monomer and polymer species up to some maximum length, M, the number of possible polypeptides is large, but, for specifiable "legitimate" end condensation, cleavage and transpeptidation exchange reactions, the number of potential reactions by which the possible polypeptides can interconvert is very much larger. A directed graph in which arrows from smaller fragments to larger condensation products depict potential synthesis reactions, while arrows from the larger peptide to the smaller fragments depict the reverse cleavage reactions, comprises the reaction graph for such a system. Polypeptide protoenzymes are able to catalyze such reactions. The distribution of catalytic capacities in peptide space is a fundamental problem in its own right, and in its bearing on the existence of autocatalytic sets of proteins. Using an initial idealized hypothesis that an arbitrary polypeptide has a fixed a priori probability of catalyzing any arbitrary legitimate reaction to assign to each polypeptide those reactions, if any, which it catalyzes, the probability that the set of polypeptides up to length M contains a reflexively autocatalytic subset can be calculated and is a percolation problem on such reaction graphs. Because, as M increases, the ratio of reactions among the possible polypeptides to polypeptides rises rapidly, the existence of such autocatalytic subsets is assured for any fixed probability of catalysis. 
The main conclusions of this analysis appear independent of the idealizations of the initial model, introduce a novel kind of parallel selection for peptides catalyzing connected sequences of reactions, depend upon a new kind of minimal critical complexity whose properties are definable, and suggest that the emergence of self replicating systems may be a self organizing collective property of critically complex protein systems in prebiotic evolution. Similar principles may apply to the emergence of a primitive connected metabolism. Recombinant DNA procedures, cloning random DNA coding sequences into expression vectors, afford a direct avenue to test the distribution of catalytic capacities in peptide space, may provide a new means to select or screen for peptides with useful properties, and may ultimately lead toward the actual construction of autocatalytic peptide sets.
- Published
- 1986
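The argument above hinges on a counting fact: as the maximum polymer length M grows, the number of legitimate condensation/cleavage reactions outgrows the number of polymer species, so any fixed catalysis probability eventually guarantees an autocatalytic subset. A quick count for a two-letter alphabet illustrates the growing ratio; the alphabet size and the restriction to end-condensation bonds are simplifying assumptions.

```python
def counts(M, alphabet=2):
    """Polymer species of length 1..M and the end-condensation /
    cleavage reactions that form them (one per internal bond)."""
    polymers = sum(alphabet ** L for L in range(1, M + 1))
    # Each polymer of length L >= 2 has L - 1 internal bonds, i.e.
    # L - 1 distinct condensation reactions producing it.
    reactions = sum((L - 1) * alphabet ** L for L in range(2, M + 1))
    return polymers, reactions

ratios = []
for M in (4, 6, 8, 10):
    p, r = counts(M)
    ratios.append(r / p)
```

The ratio grows roughly like M − 1, which is the percolation-style reason the existence of reflexively autocatalytic subsets becomes assured for large M.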
620. Vertex evoked potentials in a rating-scale detection task: relation to signal probability
- Author
-
Kenneth C. Squires, Steven A. Hillyard, and Nancy K. Squires
- Subjects
Vertex (graph theory) ,Adult ,A priori probability ,Decision Making ,Differential Threshold ,behavioral disciplines and activities ,Signal ,Discrimination, Psychological ,Psychophysics ,Reaction Time ,Humans ,Detection theory ,Psychoacoustics ,Evoked potential ,Evoked Potentials ,General Environmental Science ,Probability ,business.industry ,Pattern recognition ,Electroencephalography ,Confidence interval ,Electrooculography ,Acoustic Stimulation ,Auditory Perception ,Visual Perception ,General Earth and Planetary Sciences ,Artificial intelligence ,business ,Psychology ,Social psychology ,Perceptual Masking ,Photic Stimulation - Abstract
Vertex evoked potentials were recorded from human subjects performing in an auditory detection task with rating scale responses. Three values of a priori probability of signal presentation were tested. The amplitudes of the N1 and P3 components of the vertex potential associated with correct detections of the signal were found to be systematically related to the strictness of the response criterion and independent of variations in a priori signal probability. No similar evoked potential components were found associated with signal absent judgements (misses and correct rejections) regardless of the confidence level of the judgement or signal probability. These results strongly support the contention that the form of the vertex evoked response is closely correlated with the subject's psychophysical decision regarding the presence or absence of a threshold level signal.
- Published
- 1975
621. What chance did Mendel's experiments give him of noticing linkage?
- Author
-
L Douglas and E Novitski
- Subjects
Linkage (software) ,A priori probability ,Genes ,Genetic Linkage ,Statistics ,Genetics ,History, 19th Century ,Biology ,Genetics (clinical) ,Chromosomes ,Probability - Abstract
The a priori probability of noticeable linkage among all conceivable experiments of the size reported by Mendel cannot reasonably be taken as greater than 24-36 per cent; and therefore, the frequently heard opinion that his chances of encountering linkage were high, approaching 99.4 per cent, appears to be mistaken.
- Published
- 1977
622. Compositional nonrandomness: a quantitatively conserved evolutionary invariant
- Author
-
Richard Holmquist and Herbert Moise
- Subjects
Genetics ,A priori probability ,Protein family ,A protein ,Biology ,Genetic code ,Biological Evolution ,Models, Biological ,Amino acid composition ,Genetic Code ,Amino Acid Sequence ,Invariant (mathematics) ,Protein length ,Biological system ,Molecular Biology ,Ecology, Evolution, Behavior and Systematics ,Probability - Abstract
The a priori probability that the amino acid composition of a protein will exhibit a given overall deviation from the genetic code table frequencies is the same for all protein families independent of protein length, biological function, or origin.
- Published
- 1975
623. A mathematical solution for the probability of the paradox of voting
- Author
-
Herbert F. Weisberg and Richard G. Niemi
- Subjects
Majority rule ,A priori probability ,Information Systems and Management ,Strategy and Management ,media_common.quotation_subject ,Rank (computer programming) ,Decision Making ,Politics ,General Social Sciences ,Condorcet method ,Models, Theoretical ,Voting paradox ,Interpretation (model theory) ,Order (exchange) ,Voting ,Humans ,General Agricultural and Biological Sciences ,Mathematical economics ,Mathematics ,media_common ,Probability - Abstract
The paradox of voting occurs when individual rank orders of three or more alternatives lead to an intransitive social ordering. This means, for example, that with a majority decision rule for voting between pairs of alternatives, it is possible that no alternative will receive a majority vote over all of the other alternatives. The a priori probability of the paradox, based on certain probability assumptions, has been sought in order to judge how serious the paradox is for societal decision-making. In this paper, the authors specify the model underlying these attempted calculations. In the process, new problems of interpretation are raised. Through the use of this model, a general solution for the probability of the paradox is derived, together with an approximation for computational convenience. Some numerical results are given to demonstrate the nontrivial probability of the paradox with a moderate number of alternatives and the assumption that all possible rank orders are equally likely.
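The nontrivial probability of the paradox is easy to verify in the smallest case. A minimal sketch, assuming the equally-likely ("impartial culture") model described in the abstract with three alternatives and three voters, enumerates every possible profile exhaustively:

```python
from itertools import permutations, product

# All 6 strict rank orders of three alternatives, assumed equally likely.
orders = list(permutations("ABC"))

def majority_prefers(profile, x, y):
    # True if a majority of voters rank x above y.
    return sum(o.index(x) < o.index(y) for o in profile) > len(profile) / 2

def has_condorcet_winner(profile):
    # A Condorcet winner beats every other alternative pairwise; for three
    # alternatives, its absence is exactly the cyclic-majority paradox.
    return any(all(majority_prefers(profile, x, y) for y in "ABC" if y != x)
               for x in "ABC")

# Enumerate all 6^3 = 216 equally likely three-voter preference profiles.
profiles = list(product(orders, repeat=3))
paradox = sum(not has_condorcet_winner(p) for p in profiles)
print(paradox, paradox / len(profiles))  # 12 of 216 profiles, i.e. ~0.0556
```

The 12 cyclic profiles are exactly those where the three voters hold the three rotations of one cycle (e.g. ABC, BCA, CAB), in either direction and in any voter order.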
- Published
- 1968
624. The Relationship between the Composition of the Exchange Complex and the Composition of the Soil Solution
- Author
-
G. H. Bolt and C. G. E. M. van Beek
- Subjects
Salinity ,Activity coefficient ,A priori probability ,fluids and secretions ,Statistics ,Soil solution ,biochemical phenomena, metabolism, and nutrition ,Composition (combinatorics) ,General validity ,Value (mathematics) ,Mathematics - Abstract
For some time, the SAR-ESP relationship established by the United States Salinity Laboratory Staff (1954) has been used to estimate the ESP value from the cationic composition of the soil solution. Such a relationship obviously derives its value from its general applicability, although it is only an approximation. In discussing the a priori probability of the existence of a SAR-ESP relationship of general validity, several aspects present themselves.
- Published
- 1973
- Full Text
- View/download PDF
625. Selforganization of Nucleic Acids and the Evolution of the Genetic Apparatus
- Author
-
Hans Kuhn
- Subjects
A priori probability ,Earth history ,Ingenuity ,Property (philosophy) ,Basis (linear algebra) ,Computer science ,Process (engineering) ,media_common.quotation_subject ,Biochemical engineering ,media_common - Abstract
It is unclear how biological systems could evolve during the time of 10^9 years given by earth history [1–7]. Even the simplest systems that can be imagined to evolve into more complicated ones must have the property of self-reproduction, and this is only possible for systems which already have an appreciable complexity. They must have a device similar to the genetic apparatus of the known organisms, a machinery of the highest skill and ingenuity. How was it possible that such systems evolved? Can this evolution be explained on the basis of physical chemistry, and in that case, is it a common process under appropriate environmental conditions or a process of extremely low a priori probability?
- Published
- 1973
- Full Text
- View/download PDF
626. Semantic Information and Inductive Logic
- Author
-
Jaakko Hintikka and Juhani Pietarinen
- Subjects
A priori probability ,Inductive logic ,business.industry ,Computer science ,Decision theory ,media_common.quotation_subject ,Measure (mathematics) ,Artificial intelligence ,Semantic information ,business ,Function (engineering) ,Mathematical economics ,Axiom ,media_common - Abstract
Publisher Summary The basic ideas of modern decision theory might be used in understanding the adoption and rejection of scientific hypotheses and theories. This chapter discusses the special kinds of scientific or theoretical utilities named “epistemic utilities.” To qualify as a utility, a measure of information must satisfy the usual Von Neumann–Morgenstern utility axioms. Levi's negative results reinforce the larger question whether any approach to induction in terms of epistemic utilities has much hope of success. In addition, the chapter discusses some special measures of semantic information. The first of them is based on a regular and symmetrical measure function that gives each constituent an equal a priori probability. The chapter also defines a priori probability of each state-description. This probability is obtained by dividing the weight of each constituent evenly among the state-descriptions that make this constituent true.
- Published
- 1966
- Full Text
- View/download PDF
627. Events and Probabilities
- Author
-
Howard G. Tucker
- Subjects
Discrete mathematics ,A priori probability ,Conditional dependence ,Elementary event ,Statistics ,Law of total probability ,Conditional probability ,Complementary event ,Frequency ,Mathematics ,Event (probability theory) - Abstract
This chapter focuses on events and probabilities. The notion of the probability of an event is approached by three different methods. One method is to repeat an experiment or game many times under identical conditions and compute the relative frequency with which an event occurs. In the second way of approaching the notion of probability, a minimal list of axioms is set down, which assumes certain properties of probabilities. From this minimal set of assumptions, the further properties of probability are deduced and applied. The third method for arriving at the notion of probability is limited in application; however, it is extremely useful. The probability of an event is defined to be the number of "equally likely" ways in which the event can occur divided by the total number of possible "equally likely" outcomes. The number of equally likely ways in which the event can occur must be from among the total number of equally likely outcomes. An event is simply a collection of certain elementary events. Different events are different collections of elementary events.
- Published
- 1962
- Full Text
- View/download PDF
628. INTRODUCTION TO PROBABILITY THEORY
- Author
-
K.S. Snell and J.B. Morgan
- Subjects
A priori probability ,Frequentist probability ,Elementary event ,Statistics ,Sample space ,Conditional probability ,Arithmetic ,Probability interpretations ,Tree diagram ,Mathematics ,Event (probability theory) - Abstract
This chapter presents probability theory. In many examples of common experience, there is a feeling of the chance or likelihood of a statement being either true or false. The actual performance of experiments with articles such as coins, cards, and dice is of great help in understanding the ideas of probability and the relation between theory and practice. The chapter presents a sample space for a real or imagined experiment as the set of symbols that indicate the total possible different outcomes of the experiment. When each simple event in a sample space is equally likely, and there are N simple events, the fraction 1/N is assigned as the probability of each simple event. The total probability of all the simple events is, therefore, 1 (unity), so that a probability of 1 is equivalent to a certainty that one or other of the constituent events will occur. The probability of an event that is represented by the empty set is zero; this is equivalent to saying that the event cannot occur within the given sample space.
- Published
- 1966
- Full Text
- View/download PDF
629. REPRODUCING DISTRIBUTIONS FOR MACHINE LEARNING
- Author
-
J. D. Spragins, Jr.
- Subjects
A priori probability ,Mathematical model ,business.industry ,Bayesian probability ,Machine learning ,computer.software_genre ,Bayes' theorem ,A priori and a posteriori ,Probability distribution ,Artificial intelligence ,Limit (mathematics) ,business ,computer ,Sufficient statistic ,Mathematics - Abstract
A model is proposed for learning the nature and value of an unknown parameter, or unknown parameters, in a probability distribution which forms part of a body of statistics related to some system or process. The model is Bayesian, involving the assumption of an a priori probability distribution over the possible values of the unknown parameters; the performance of experiments to gain information about the parameters; and the alteration of the a priori probabilities by Bayes' rule. In the limit, as the number of experiments approaches infinity, the a posteriori distribution in most cases encountered in practice approaches a delta function at the true values of the unknown parameters, so the system learns the values of the parameters exactly. The learning process developed in the paper is shown to be technically feasible if the a priori and a posteriori distributions are of the same form, with the learning accomplished by calculating new parameters for these distributions. It is shown that a necessary and sufficient condition for fulfillment of this feasibility criterion is for a sufficient statistic of fixed dimension to exist. If such a sufficient statistic exists, the a posteriori distributions may vary in form initially, but they eventually become of fixed form. The techniques developed indicate logical methods for choosing a priori probabilities and are applied in pattern recognition, estimation, and other problems.
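The "reproducing" property described above corresponds to what is now called a conjugate family. A minimal sketch, assuming a Bernoulli observation model with a Beta prior, illustrates the paper's feasibility criterion: the (successes, trials) counts form a sufficient statistic of fixed dimension, so learning amounts to updating two numbers:

```python
# Beta(alpha, beta) prior on an unknown Bernoulli success probability p.
# Because the Beta family is conjugate to the Bernoulli likelihood, applying
# Bayes' rule after each observation only changes the two parameters; the
# a posteriori distribution keeps the same functional form throughout.
alpha, beta = 1.0, 1.0                    # uniform prior over p

observations = [1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical experiment outcomes
for x in observations:
    alpha += x                            # count successes
    beta += 1 - x                         # count failures

# As observations accumulate, the posterior concentrates at the true p,
# matching the paper's delta-function limit.
posterior_mean = alpha / (alpha + beta)
print(alpha, beta, posterior_mean)        # 7.0 3.0 0.7
```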
- Published
- 1963
- Full Text
- View/download PDF
630. The Concept of Probability in Psychological Experiments
- Author
-
Carl-Axel S. Staël von Holstein
- Subjects
A priori probability ,Frequentist probability ,Probability theory ,Computer science ,Probability and statistics ,Coherence (statistics) ,Imprecise probability ,Mathematical economics ,Probability interpretations ,Algorithm ,Probability measure - Abstract
Publisher Summary This chapter discusses the concept of probability in psychological experiments. Probability theory is seen here as a branch of mathematics, representing no more than an application of measure theory. The chapter discusses the classification of probability interpretations. The attributes objective and subjective are used. Subjective probabilities are here seen as degrees of belief and may also be called "personal probabilities." One can distinguish between the concept of subjective probability intended to describe actual behavior, which is typically what interests psychologists, and subjective probability theory aimed at characterizing coherent behavior. Coherence is equivalent to the condition that the laws of probability be obeyed. Objective interpretations can be based on arguments of symmetry, that is, equiprobable cases, or arguments that "objective probability is revealed by frequency." An objective probability is then seen as a physical property of an object that can be estimated with sufficient precision by repeated measurement under identical conditions.
- Published
- 1973
- Full Text
- View/download PDF
631. Quantitative evoked potential correlates of the probability of events
- Author
-
Samuel Sutton, Joseph Zubin, and Patricia Tueting
- Subjects
A priori probability ,Injury control ,Accident prevention ,Cognitive Neuroscience ,Poison control ,Experimental and Cognitive Psychology ,Stimulus (physiology) ,Electroencephalography ,Developmental Neuroscience ,Statistics ,medicine ,Humans ,Evoked potential ,Late positive component ,Evoked Potentials ,Biological Psychiatry ,Probability ,medicine.diagnostic_test ,Endocrine and Autonomic Systems ,Computers ,General Neuroscience ,body regions ,Neuropsychology and Physiological Psychology ,Neurology ,Gambling ,Cues ,Psychology ,Social psychology - Abstract
A late positive-going component (P3) of the average evoked potential recorded from human scalp was shown to be quantitatively related to a priori stimulus probability both when the S was told the identity of the stimulus before it was presented and when the S was not told, and was instructed to guess. In the guessing situation, the amplitude of P3 was much larger and was influenced not only by the a priori probability of events determined by the experimenter but also by the interaction of these probabilities with the S's guessing behavior. The amplitude of the late positive component was inversely related to the proportion of trials in which a particular event was associated with a particular guess, i.e., the proportion of hits and misses. It was larger the more unexpected the outcome of the guess. This relationship held for different methods of manipulating the probability of two events.
- Published
- 1970
632. Classical Probability and Its Renaissance
- Author
-
Terrence L. Fine
- Subjects
Equiprobability ,A priori probability ,Principle of maximum entropy ,Decision theory ,media_common.quotation_subject ,Calculus ,Ignorance ,Principle of sufficient reason ,Mathematical economics ,Principle of indifference ,Axiom ,Mathematics ,media_common - Abstract
The classical approach to probability attempts to assess unique probabilities for random events even in the absence of extensive prior knowledge or information concerning a random experiment. In its early formulation by Laplace, through the principle of nonsufficient reason, equiprobable events were identified by the absence of reasons to expect the contrary—a balance of ignorance. Later rephrasing by Keynes as the principle of indifference or of sufficient reason avoided certain paradoxes by restricting the determination of equiprobability to cases where there was a balance of knowledge or information concerning the tendency for events to occur or propositions to be true. Difficulties in assessing equiprobable events in complex experiments, problems with paradoxical conclusions, and questions of justification of the principle of indifference have led to axiomatic reformulations. The use of the principles of invariance and the information–theoretic principles of maximum entropy and mutual information seems to have enlarged the domain of classical probability to include unequal probability assignments. Decision theory has also provided some instances where a pragmatic justification for the classical approach can be developed. This chapter illustrates the classical argument of probability and presents the assignments of equiprobability. The hallmark of the so-called classical or Laplacian approach to probability is the conversion of either complete ignorance or partial, symmetric knowledge concerning which of a set of alternatives is true into a uniform probability distribution over the alternatives. The core of this approach is either the principle of nonsufficient reason or the principle of indifference. The chapter also presents the axiomatic formulations of the classical approach.
- Published
- 1973
- Full Text
- View/download PDF
633. Auditory signal detectability as a function of pre-experimental shock
- Author
-
Peter C. Dodwell and Kuechler Ha
- Subjects
A priori probability ,medicine.medical_specialty ,Neurotic Disorders ,Auditory signal ,Audiology ,Signal ,050105 experimental psychology ,03 medical and health sciences ,Tone (musical instrument) ,0302 clinical medicine ,Statistics ,medicine ,Humans ,0501 psychology and cognitive sciences ,Analysis of Variance ,Electroshock ,05 social sciences ,General Medicine ,White noise ,Function (mathematics) ,Interval (music) ,Shock (circulatory) ,Auditory Perception ,Schizophrenic Psychology ,medicine.symptom ,Psychology ,030217 neurology & neurosurgery - Abstract
A neurotic, a schizophrenic, and a normal control group detected a 1,000 cps, 2 sec tone with an a priori probability of occurrence of 0.5 against a background of continuous white noise during a specified 4 sec interval. An index of signal detectability, d′, and a decision criterion, β, were calculated from the YES-NO responses for 150 control trials and 150 trials which were preceded by free electric shock at maximum tolerance level. The results show a decrement in signals detected for the neurotic and schizophrenic groups as compared to the normal controls, whose detection rate improved significantly. The decision criterion remained unaffected.
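For readers unfamiliar with the two indices, d′ and β can be computed directly from a hit rate and a false-alarm rate. A minimal sketch; the rates below are hypothetical illustrations, not values from the study:

```python
from statistics import NormalDist

def dprime_beta(hit_rate, fa_rate):
    """Equal-variance signal-detection indices from YES-NO response rates.

    d' is the separation of the signal and noise distributions in z units;
    beta is the likelihood ratio of signal to noise at the criterion.
    """
    z = NormalDist().inv_cdf
    z_hit, z_fa = z(hit_rate), z(fa_rate)
    d_prime = z_hit - z_fa
    # beta: ratio of the standard normal densities at the criterion point.
    beta = NormalDist().pdf(z_hit) / NormalDist().pdf(z_fa)
    return d_prime, beta

d, b = dprime_beta(0.8, 0.2)   # hypothetical hit and false-alarm rates
print(round(d, 3), round(b, 3))  # symmetric rates give an unbiased beta of 1.0
```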
- Published
- 1968
634. Test of the TSD model in human eyelid conditioning: a priori probability and payoff manipulations
- Author
-
Janet F. Rees and Harold D. Fishbein
- Subjects
Adult ,Male ,A priori probability ,Eyelid Conditioning ,Analysis of Variance ,Stochastic game ,Conditioning, Classical ,General Medicine ,Conditioning, Eyelid ,Test (assessment) ,Discrimination, Psychological ,Reward ,Statistics ,Auditory Perception ,Humans ,Female ,Psychology ,Noise - Published
- 1970
635. The Role of Subjective Probability and Utility in Decision-making
- Author
-
Patrick Suppes
- Subjects
A priori probability ,Primitive notion ,Basis (linear algebra) ,media_common.quotation_subject ,Subjective expected utility ,62.0X ,Econometrics ,Economics ,Probability distribution ,Foundations of statistics ,Set (psychology) ,Function (engineering) ,Mathematical economics ,media_common - Abstract
Although many philosophers and statisticians believe that only an objectivistic theory of probability can have serious application in the sciences, there is a growing number of physicists and statisticians, if not philosophers, who advocate a subjective theory of probability. The increasing advocacy of subjective probability is surely due to the increasing awareness that the foundations of statistics are most properly constructed on the basis of a general theory of decision-making. In a given decision situation subjective elements seem to enter in three ways: (i) in the determination of a utility function (or its negative, a loss function) on the set of possible consequences, the actual consequence being determined by the true state of nature and the decision taken; (ii) in the determination of an a priori probability distribution on the states of nature; (iii) in the determination of other probability distributions in the decision situation.
- Published
- 1956
636. [Untitled]
- Subjects
Physics ,A priori probability ,General Physics and Astronomy ,020206 networking & telecommunications ,Measurement problem ,02 engineering and technology ,Quantum key distribution ,01 natural sciences ,Small set ,Qubit ,0103 physical sciences ,0202 electrical engineering, electronic engineering, information engineering ,Statistical physics ,010306 general physics ,Random variable ,Quantum ,Coherence (physics) - Abstract
Incompatibility of certain measurements, i.e., the impossibility of obtaining deterministic outcomes simultaneously, is a well-known property of quantum mechanics. This feature can be utilized in many contexts, ranging from Bell inequalities to device-dependent QKD protocols. Typically, in these applications the measurements are chosen from a predetermined set based on a classical random variable. One can naturally ask whether the non-determinism of the outcomes is due to an intrinsic hiding property of quantum mechanics, or rather to the fact that classical, incoherent information entered the system via the choice of the measurement. Rozpedek et al (2017 New J. Phys. 19 023038) examined this question for the specific case of two mutually unbiased measurements on systems of different dimensions. They somewhat surprisingly showed that in the case of qubits, if the measurements are chosen coherently with the use of a controlled unitary, the outcomes of both measurements can be guessed deterministically. Here we extend their analysis and show that, specifically for qubits, the measurement result for any set of measurements with any a priori probability distribution can be faithfully guessed by a suitable state preparation and measurement. We also show that, up to a small set of specific cases, this is not possible for higher dimensions. This result manifests a deep difference in the properties of qubits and higher-dimensional systems and suggests that these systems might offer higher security in specific cryptographic protocols. More fundamentally, the results show that the impossibility of predicting the result of a measurement is not caused solely by a loss of coherence between the choice of the measurement and the guessing procedure.
637. Sample covariance matrix based parameter estimation for digital synchronization
- Author
-
Javier Villares and Gregori Vazquez
- Subjects
A priori probability ,Mathematical optimization ,Continuous phase modulation ,Minimum mean square error ,Optimal estimation ,Computer science ,Estimation theory ,Gaussian ,Maximum likelihood ,Estimator ,Sample mean and sample covariance ,symbols.namesake ,Estimation of covariance matrices ,symbols ,Random variable - Abstract
In this paper we develop a new, versatile framework for the design of optimal non-data-aided (NDA) parameter estimators based on the exploitation of the received signal sample covariance matrix. The estimator coefficients are optimized in order to yield minimum mean squared error (MSE) estimates of the parameter. Some linear constraints are introduced into the optimization process, allowing the designer to have control over the estimator characteristic response. For those scenarios where bias is forbidden, as happens in ranging applications, we provide the optimal solution minimizing the estimation bias within the range of the received parameter. The adopted approach is Bayesian, as we treat the wanted parameter as a random variable with a known a priori probability distribution (prior). This modeling allows us to unify the design of both open- and closed-loop estimators. The proposed formulation encompasses all the linear modulations as well as binary continuous phase modulation (CPM). The new approach supplies optimal estimation schemes without the need to assume given statistics for the unknown symbols, that is, avoiding the common adoption of the Gaussian assumption, which does not apply in digital communications. Special attention is paid to those low-complexity implementations for which maximum likelihood efficiency is not guaranteed.
638. Estimating priors in maximum entropy image processing
- Author
-
Ali Mohammad-Djafari and G. Demoment
- Subjects
A priori probability ,Mathematical optimization ,Discretization ,Principle of maximum entropy ,Prior probability ,Bayesian probability ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Maximum a posteriori estimation ,Probability density function ,Image processing ,Algorithm ,Mathematics - Abstract
A class of discrete image-reconstruction and restoration problems is addressed. A brief description is given of the maximum a posteriori (MAP) Bayesian approach with maximum entropy (ME) priors for solving the linear system of equations obtained after discretization of the integral equations arising in various tomographic image restoration and reconstruction problems. The main problems, choosing an a priori probability law for the image and determining its parameters from the data, are discussed. A method for simultaneously estimating the parameters of the ME a priori probability density function and the pixel values of the image is proposed, and some simulations comparing this method with some classical ones are given.
639. Mobile mixing
- Author
-
Tomasz Łuczak, Marcin Gogolewski, and Mirosław Kutyłowski
- Subjects
A priori probability ,Traffic analysis ,Markov chain ,business.industry ,Computer science ,Probability distribution ,Adversary ,business ,Communications protocol ,Mathematical proof ,Computer network ,Anonymity - Abstract
We consider a process during which encoded messages are processed through a network; at each step a message can be delivered only to a neighbor of the current node, and at each node a message is re-coded cryptographically so that an external observer cannot link the messages before and after re-coding. The goal of re-coding is to hide the origins of the messages from an adversary who monitors the traffic. Re-coding becomes useful if at least two messages simultaneously enter a node; then the node works like a mix server. We investigate how long the route of messages must be so that traffic analysis does not provide any substantial information to the adversary. The anonymity model we consider is very strong and concerns the distance between the a priori probability distribution describing the origin of each message, and the same probability distribution conditioned upon the traffic information. We provide a rigorous mathematical proof that for a certain route length, expressed in terms of the mixing time of the network graph, the variation distance between the probability distributions mentioned above is small with high probability (over possible traffic patterns). While the process concerned is expressed in quite general terms, it provides tools for proving privacy and anonymity features of many protocols. For instance, our analysis extends results concerning the security of an anonymous communication protocol based on onion encoding: we do not assume, as is done in previous papers, that a message can be sent directly between arbitrary nodes. However, the most significant application now might be proving immunity against traffic analysis of RFID tags with universal re-encryption performed for privacy protection.
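The decay of the variation distance with route length can be seen on a toy network. A hedged sketch, using a lazy random walk on a small cycle as a stand-in for the network graph (the graph, start distribution, and step counts are illustrative assumptions, not the paper's construction):

```python
N = 8  # nodes in the cycle graph

def step(dist):
    # One step of a lazy random walk: stay with probability 1/2,
    # move to either neighbor with probability 1/4 each.
    return [0.5 * dist[i] + 0.25 * dist[(i - 1) % N] + 0.25 * dist[(i + 1) % N]
            for i in range(N)]

def tv_distance(p, q):
    # Total variation distance between two distributions on the nodes.
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

dist = [1.0] + [0.0] * (N - 1)   # adversary initially knows the origin node
uniform = [1.0 / N] * N          # the "no information" prior
distances = []
for t in range(1, 61):
    dist = step(dist)
    if t in (1, 10, 60):
        distances.append(round(tv_distance(dist, uniform), 4))
print(distances)  # shrinks toward 0 as the route length grows
```

Once the walk has run for a multiple of the graph's mixing time, the conditional origin distribution is nearly indistinguishable from the prior, which is the anonymity criterion used in the abstract.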
640. Estimation of the hysteresis value for handover decision algorithms using Bayes criterion
- Author
-
Boualem Boashash and Bouchra Senadji
- Subjects
Computer Science::Performance ,Bayes' theorem ,A priori probability ,Handover ,Optimal estimation ,Computer science ,Estimation theory ,Computer Science::Multimedia ,Bayesian probability ,Computer Science::Networking and Internet Architecture ,False alarm ,Algorithm ,Statistical power - Abstract
In mobile radio communications, inter-cell handover is the process whereby a call in progress is maintained while the mobile unit passes through different cells. Current handover decision algorithms compare the difference between the received signal strengths from different base stations to a hysteresis value. This paper is a contribution towards an optimal estimation of the hysteresis value using the Bayes criterion. An expression for the threshold value h is derived in terms of the standard deviation σ of the log-normal shadowing affecting the received signals and the ratio η between the a priori probability of handover and the probability of no handover. The performance of the handover decision algorithm is evaluated in terms of the probability of false alarm, or probability of unnecessary handover, and the probability of detection, or probability of successful handover.
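The shape of such a Bayes-criterion threshold can be sketched for the textbook case of two equal-variance Gaussian hypotheses. This is a simplified stand-in for the paper's log-normal shadowing model, and every number below is an illustrative assumption:

```python
from math import log, sqrt, erfc

# Assume the signal-strength difference x is N(0, sigma^2) under H0 (no
# handover needed) and N(mu, sigma^2) under H1 (handover needed), with
# eta = P(H1)/P(H0). The likelihood-ratio test then reduces to comparing
# x against a single hysteresis value h.
def bayes_threshold(mu, sigma, eta):
    return mu / 2 - (sigma**2 / mu) * log(eta)

def q_function(x):
    # Gaussian tail probability Q(x) = P(N(0,1) > x).
    return 0.5 * erfc(x / sqrt(2.0))

mu, sigma, eta = 6.0, 4.0, 1.0          # illustrative values only
h = bayes_threshold(mu, sigma, eta)
p_false_alarm = q_function(h / sigma)         # unnecessary handover
p_detection = q_function((h - mu) / sigma)    # successful handover
print(round(h, 2), round(p_false_alarm, 3), round(p_detection, 3))
```

Note how a larger prior odds of handover (η > 1) lowers h, trading more false alarms for a higher detection probability, which is the qualitative behavior the abstract's h(σ, η) expression captures.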
641. [Untitled]
- Subjects
0301 basic medicine ,Input/output ,Algorithmic information theory ,A priori probability ,Multidisciplinary ,Kolmogorov complexity ,General Physics and Astronomy ,General Chemistry ,Upper and lower bounds ,General Biochemistry, Genetics and Molecular Biology ,03 medical and health sciences ,030104 developmental biology ,0302 clinical medicine ,Simple (abstract algebra) ,Ordinary differential equation ,A priori and a posteriori ,Applied mathematics ,030217 neurology & neurosurgery ,Mathematics - Abstract
Many systems in nature can be described using discrete input–output maps. Without knowing details about a map, there may seem to be no a priori reason to expect that a randomly chosen input would be more likely to generate one output over another. Here, by extending fundamental results from algorithmic information theory, we show instead that for many real-world maps, the a priori probability P(x) that randomly sampled inputs generate a particular output x decays exponentially with the approximate Kolmogorov complexity K̃(x) of that output. These input–output maps are biased towards simplicity. We derive an upper bound P(x) ≲ 2^(−aK̃(x)−b), which is tight for most inputs. The constants a and b, as well as many properties of P(x), can be predicted with minimal knowledge of the map. We explore this strong bias towards simple outputs in systems ranging from the folding of RNA secondary structures to systems of coupled ordinary differential equations to a stochastic financial trading model.
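Since Kolmogorov complexity itself is uncomputable, a computable approximation K̃ is needed to apply such a bound in practice. A minimal sketch of the standard hedge, using zlib's compressed length as the complexity proxy (the constants a and b are illustrative placeholders, not fitted map-specific values):

```python
import os
import zlib

def k_tilde(x: bytes) -> int:
    # Compressed length in bytes as a crude, computable proxy for the
    # (uncomputable) Kolmogorov complexity of x; level 9 is zlib's best.
    return len(zlib.compress(x, 9))

def prob_upper_bound(x: bytes, a: float = 1.0, b: float = 0.0) -> float:
    # Shape of the bound P(x) <~ 2^(-a*K(x) - b) with placeholder constants.
    return 2.0 ** (-a * k_tilde(x) - b)

simple = b"ab" * 512            # highly regular output: small K-tilde
random_like = os.urandom(1024)  # near-incompressible output: large K-tilde
print(k_tilde(simple), k_tilde(random_like))
```

The regular string compresses to a few bytes while the random one barely compresses at all, so the bound assigns the simple output an exponentially larger a priori probability, which is the simplicity bias the abstract describes.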
642. Hidden markov model classification based on empirical frequencies of observed symbols
- Author
-
Keroglou, C., Hadjicostis, Christoforos N., Lennartson, B., Lesage, J.-J., Faure, J.-M., and Cury, J.E.R.
- Subjects
A priori probability ,Probability of misclassification ,Errors ,Discrete event simulation ,Bayes classifier ,Empirical probability ,Upper and lower bounds ,Empirical frequencies ,Maximum a posteriori estimation ,Hidden markov models ,Hidden Markov model ,Mathematics ,Probability ,Classification (of information) ,business.industry ,Probability of errors ,Low computational complexity ,Markov processes ,Pattern recognition ,Decision rule ,Classification ,Maximum a posteriori probabilities ,Storage requirements ,Probability distributions ,A-priori probabilities ,Hidden markov model ,Hidden markov models (hmms) ,Trellis codes ,Artificial intelligence ,Hidden semi-Markov model ,business - Abstract
Given a sequence of observations, classification among two known hidden Markov models (HMMs) can be accomplished with a classifier that minimizes the probability of error (i.e., the probability of misclassification) by enforcing the maximum a posteriori probability (MAP) rule. For this MAP classifier, the a priori probability of error (before any observations are made) can be obtained, as a function of the length of the sequence of observations, by summing up the probability of error over all possible observation sequences of the given length, which is a computationally expensive task. In this paper, we obtain an upper bound on the probability of error of the MAP classifier. Our results are based on a suboptimal decision rule that ignores the order with which observations occur and relies solely on the empirical frequencies with which different symbols appear. We describe necessary and sufficient conditions under which this bound on the probability of error decreases exponentially with the length of the observation sequence. Apart from the usefulness of the suboptimal rule in bounding the probability of misclassification, its numerous advantages (such as low computational complexity, reduced storage requirements, and potential applicability to distributed or decentralized decision schemes) could prove a useful alternative to the MAP rule for HMM classification in many applications.
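The order-ignoring rule can be made concrete. A hedged sketch, assuming the stationary symbol distributions of the two HMMs have already been precomputed from known model parameters: classification then reduces to comparing the empirical symbol frequencies against each distribution (KL divergence is used here as the closeness measure; the models and sequences are illustrative, not from the paper):

```python
from collections import Counter
from math import log

# Assumed precomputed stationary symbol distributions of two known HMMs.
stationary = {
    "HMM1": {"a": 0.7, "b": 0.3},
    "HMM2": {"a": 0.3, "b": 0.7},
}

def classify(sequence, models, priors=None):
    # Suboptimal rule: ignore symbol order, keep only empirical frequencies.
    counts = Counter(sequence)
    n = len(sequence)
    empirical = {s: counts[s] / n for s in counts}

    def kl(p, q):
        # KL divergence D(p || q) over the observed symbols.
        return sum(pv * log(pv / q[s]) for s, pv in p.items() if pv > 0)

    scores = {m: kl(empirical, dist) for m, dist in models.items()}
    if priors:
        # A priori model probabilities shift the decision as in the MAP rule.
        scores = {m: scores[m] - log(priors[m]) / n for m in scores}
    return min(scores, key=scores.get)

print(classify("aababaaabaab", stationary))  # frequencies closest to HMM1
```

Because only symbol counts are stored, the rule needs O(|alphabet|) memory regardless of sequence length, which is the low-complexity advantage the abstract highlights.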
643. LQG-MP: Optimized path planning for robots with motion uncertainty and imperfect state information
- Author
- Pieter Abbeel, Jur van den Berg, and Ken Goldberg
- Subjects
A priori probability ,Mathematical optimization ,Computer science ,Applied Mathematics ,Mechanical Engineering ,Gaussian ,Linear-quadratic-Gaussian control ,Any-angle path planning ,Serial manipulator ,Computer Science::Robotics ,symbols.namesake ,Artificial Intelligence ,Control theory ,Modeling and Simulation ,Path (graph theory) ,symbols ,Robot ,Motion planning ,Electrical and Electronic Engineering ,Software ,Mathematics - Abstract
In this paper we present LQG-MP (linear-quadratic Gaussian motion planning), a new approach to robot motion planning that takes into account the sensors and the controller that will be used during the execution of the robot’s path. LQG-MP is based on the linear-quadratic controller with Gaussian models of uncertainty, and explicitly characterizes in advance (i.e. before execution) the a priori probability distributions of the state of the robot along its path. These distributions can be used to assess the quality of the path, for instance by computing the probability of avoiding collisions. Many methods can be used to generate the required ensemble of candidate paths from which the best path is selected; in this paper we report results using rapidly exploring random trees (RRT). We study the performance of LQG-MP with simulation experiments in three scenarios: (A) a kinodynamic car-like robot, (B) multi-robot planning with differential-drive robots, and (C) a 6-DOF serial manipulator. We also present a method that applies Kalman smoothing to make paths C^k-continuous and apply LQG-MP to precomputed roadmaps using a variant of Dijkstra’s algorithm to efficiently find high-quality paths.
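A much-simplified sketch of the underlying idea (LQG-MP itself propagates the joint closed-loop distribution of the true state and the Kalman-filter estimate under LQG feedback; the recursion below is only open-loop covariance propagation for linear-Gaussian dynamics, with invented matrices):

```python
# Simplified sketch: the a priori covariance of a linear-Gaussian system
# along a nominal path, computed in advance via Sigma_{t+1} = A Sigma_t A^T + V.
# (This open-loop special case only illustrates the "characterize uncertainty
# before execution" step; it is not the full LQG-MP propagation.)

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

def propagate(A, V, Sigma, steps):
    """Return the a priori covariance at each future step, before execution."""
    out = []
    for _ in range(steps):
        ASA = mat_mul(mat_mul(A, Sigma), transpose(A))
        Sigma = [[ASA[i][j] + V[i][j] for j in range(len(V))]
                 for i in range(len(V))]
        out.append(Sigma)
    return out

A = [[1.0, 0.0], [0.0, 1.0]]   # identity dynamics (assumption)
V = [[0.1, 0.0], [0.0, 0.1]]   # per-step process noise (assumption)
covs = propagate(A, V, [[0.0, 0.0], [0.0, 0.0]], 5)
print(covs[-1][0][0])          # close to 0.5: uncertainty grows linearly
```

In LQG-MP these predicted distributions are then scored, e.g. by the probability that every state along the path stays collision-free, and the best candidate path is kept.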
644. Lower Limits of Frequencies in Computable Sequences and Relativized a Priori Probability
- Author
- An. A. Muchnik
- Subjects
Statistics and Probability ,Discrete mathematics ,A priori probability ,Statistics, Probability and Uncertainty ,Mathematics - Published
- 1988
- Full Text
- View/download PDF
645. On the probability of radiation being the cause of leukaemia
- Author
- Knut Magnus, Torleif Hvinden, and Per Oftedal
- Subjects
Adult ,Male ,A priori probability ,Leukemia ,Computer science ,business.industry ,General Medicine ,Radiation Effects ,Humans ,A priori and a posteriori ,Radiology, Nuclear Medicine and imaging ,Nuclear medicine ,business ,Algorithm ,Probability ,Event (probability theory) - Abstract
In the a posteriori analysis of the causes of an observed event, the statistical theorem known as “Bayes' law” is applicable. Expressed in simplified non-technical terms, this law states that the probability of any cause X being the real cause of the observed event is equal to the a priori probability p_X of this cause X leading to the event, relative to the sum of all a priori probabilities for the event to occur: p_X / (p_X + p_Y), where p_Y is the a priori probability of the observed event being due to all causes other than X.
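In code, the simplified form of Bayes' law described in the abstract is just a ratio of a priori probabilities (the numbers below are invented purely for illustration):

```python
def cause_probability(p_x, p_y):
    """Simplified Bayes' law from the abstract: probability that cause X was
    the real cause of an observed event, given the a priori probability p_x
    of X producing the event and p_y for all other causes combined."""
    return p_x / (p_x + p_y)

# Illustrative numbers only: if cause X would produce the event with a priori
# probability 0.02 and all other causes with 0.08 combined, the posterior
# probability that X was the real cause is 0.02 / 0.10 = 0.2.
print(cause_probability(0.02, 0.08))
```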
- Published
- 1968
- Full Text
- View/download PDF
646. Statistical Mechanics and the Critically Branched State
- Author
- M. Gordon and M. Judd
- Subjects
Gel point ,A priori probability ,Multidisciplinary ,Basis (linear algebra) ,Distribution (number theory) ,Probabilistic logic ,Statistical mechanics ,State (functional analysis) ,Statistical physics ,Divergence (statistics) ,Mathematics ,Mathematical physics - Abstract
THIS communication deals with controversies concerning a standard reference case for equilibria in polymer science, namely the equilibrium distribution for a random f-functional polycondensation system. On the assumption that all functionalities are equally reactive and that intramolecular reaction does not occur, this distribution was derived by different probability arguments by Flory [1, 2], Stockmayer [3], Good [4], Whittle [5] and others; here w_x is the weight fraction of x-mer and α the conversion. Recently, Masson et al. [6] have claimed that Flory's derivation contains logical errors in the use of a priori probability and have denied that certain probabilistic operations used by Flory were meaningful. From an amended probability argument, they derived a new distribution (2) in the same notation. Distributions (1) and (2) are both normalized and share the same number-average degree of polymerization, DP_n = 〈x^{-1}〉^{-1}. Distribution (1), however, correctly describes the unique dimerization equilibrium for f = 1, whereas distribution (2) obviously does not. Moreover, distributions (1) and (2) differ vitally as regards the weight average DP_w = 〈x〉, which for distribution (1) diverges at the critical conversion α_c = 1/(f − 1). This divergence was interpreted by Flory as the statistical basis of gelation at the gel point α_c. The behaviour of matter in the critically branched state, that is, of systems near their gel point, has been rationalized on this basis. For instance, sol–gel analysis [7] accounts for measurements on many different systems.
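The divergence at the gel point can be illustrated numerically with the standard Flory–Stockmayer relations (textbook results for the random f-functional case, not code or data from this communication):

```python
# Standard Flory-Stockmayer relations for random f-functional
# polycondensation, used only to illustrate the gel-point divergence.

def gel_point(f):
    """Critical conversion alpha_c = 1/(f - 1)."""
    return 1.0 / (f - 1)

def dp_weight(alpha, f):
    """Weight-average degree of polymerization DP_w = (1+a)/(1-(f-1)a),
    which diverges as alpha approaches gel_point(f)."""
    return (1 + alpha) / (1 - (f - 1) * alpha)

print(gel_point(3))        # -> 0.5
print(dp_weight(0.25, 3))  # 1.25 / 0.5 = 2.5
```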
- Published
- 1971
- Full Text
- View/download PDF
647. Some Upper Bounds on Error Probability for Multiclass Pattern Recognition
- Author
- Godfried T. Toussaint
- Subjects
Probability box ,A priori probability ,Chain rule (probability) ,business.industry ,Posterior probability ,Law of total probability ,Pattern recognition ,Symmetric probability distribution ,Upper and lower bounds ,Theoretical Computer Science ,Computational Theory and Mathematics ,Hardware and Architecture ,Probability distribution ,Artificial intelligence ,business ,Software ,Mathematics - Abstract
An upper bound on the probability of error for the general pattern recognition problem is obtained as a functional of the pairwise Kolmogorov variational distances. Evaluation of the bound requires knowledge of a priori probabilities and of the class-conditional probability density functions. A tighter bound is obtained for the case of equal a priori probabilities, and a further bound is obtained that is independent of the a priori probabilities.
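For the two-class case with discrete densities the connection between error probability and the Kolmogorov variational distance is exact rather than a bound (a hedged sketch under that restriction; the paper treats the general multiclass functional):

```python
# Two-class special case with discrete class-conditional densities: the Bayes
# error equals 1/2 (pi1 + pi2) - 1/2 * sum_x |pi1*p1(x) - pi2*p2(x)|, i.e. it
# is determined by the prior-weighted Kolmogorov variational distance.
# Illustrative sketch only, not the paper's multiclass bound.

def bayes_error_two_class(p1, p2, pi1, pi2):
    return 0.5 * (pi1 + pi2) - 0.5 * sum(
        abs(pi1 * a - pi2 * b) for a, b in zip(p1, p2))

# Fully separated classes -> zero error; identical classes -> error = 0.5.
print(bayes_error_two_class([1.0, 0.0], [0.0, 1.0], 0.5, 0.5))  # -> 0.0
print(bayes_error_two_class([0.5, 0.5], [0.5, 0.5], 0.5, 0.5))  # -> 0.5
```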
- Published
- 1971
- Full Text
- View/download PDF
648. A class of upper bounds on probability of error for multihypotheses pattern recognition (Corresp.)
- Author
- D. Lainiotis
- Subjects
Probability box ,A priori probability ,business.industry ,Posterior probability ,Law of total probability ,Pattern recognition ,Library and Information Sciences ,Symmetric probability distribution ,Upper and lower bounds ,Computer Science Applications ,Combinatorics ,Regular conditional probability ,Probability distribution ,Artificial intelligence ,business ,Information Systems ,Mathematics - Abstract
A class of upper bounds on the probability of error for the general multihypotheses pattern recognition problem is obtained. In particular, an upper bound in the class is shown to be a linear functional of the pairwise Bhattacharyya coefficients. Evaluation of the bounds requires knowledge of a priori probabilities and of the hypothesis-conditional probability density functions. A further bound is obtained that is independent of a priori probabilities. For the case of unknown a priori probabilities and conditional probability densities, an estimate of the latter upper bound is derived using a sequence of classified samples and kernel functions to estimate the unknown densities.
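The pairwise Bhattacharyya bound described here can be sketched for discrete densities as follows (the three toy distributions are invented for illustration):

```python
from itertools import combinations
from math import sqrt

def bhattacharyya(p, q):
    """Pairwise Bhattacharyya coefficient for discrete densities."""
    return sum(sqrt(a * b) for a, b in zip(p, q))

def error_bound(dists, priors):
    """Upper bound on the error probability: the sum over hypothesis pairs of
    sqrt(pi_i * pi_j) * rho_ij, a linear functional of the coefficients."""
    return sum(sqrt(priors[i] * priors[j]) * bhattacharyya(dists[i], dists[j])
               for i, j in combinations(range(len(dists)), 2))

# Three fairly well-separated hypotheses give a moderate bound; making the
# densities overlap more would push the bound toward 1.
d = [[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]]
print(round(error_bound(d, [1 / 3, 1 / 3, 1 / 3]), 3))
```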
- Published
- 1969
- Full Text
- View/download PDF
649. The Universe and Irreversibility
- Author
- J. B. S. Haldane
- Subjects
Physics ,A priori probability ,Multidisciplinary ,Principle of maximum entropy ,Thermodynamics ,Statistical physics ,Entropy (arrow of time) - Abstract
ON the assumptions that space-time is finite spatially but not temporally (apart from supernatural events such as creation), and that atoms and radiation are mutually convertible, Sir James Jeans (NATURE, 122, p. 689; 1928) arrives at the conclusion that the universe is progressing towards a final state of maximum entropy from which no return is possible. While such a state has a maximum a priori probability, it does not follow that it is final.
- Published
- 1928
- Full Text
- View/download PDF
650. New Approach to DL Concept from Point of View of Information Theory
- Author
- Vladimír Majerník
- Subjects
A priori probability ,Theoretical computer science ,Quantitative Biology::Neurons and Cognition ,Acoustics and Ultrasonics ,Gaussian ,Information Theory ,Acoustics ,Space (mathematics) ,Information theory ,Measure (mathematics) ,Limen ,symbols.namesake ,Arts and Humanities (miscellaneous) ,Psychophysics ,symbols ,Information theory and measure theory ,Perception ,Algorithm ,Mathematics - Abstract
A new measure for the difference limen (DL), using only the communication properties of the auditory system, is proposed. This measure also takes into account the a priori probability distribution of acoustical signals. For a uniform a priori probability distribution of acoustical stimuli, and in the case of Gaussian linkage between the points of signal space and sensory space, this measure reduces to the usual measure for DL known in psychophysics.
- Published
- 1968
- Full Text
- View/download PDF