Law invariant risk measures and information divergences
- Author
Daniel Lacker
- Subjects
Mathematics - Probability (math.PR), Quantitative Finance - Risk Management (q-fin.RM), risk measures, law invariance, time consistency, information divergence, Kullback–Leibler divergence
- Abstract
A one-to-one correspondence is drawn between law invariant risk measures and divergences, which we define as functionals of pairs of probability measures on arbitrary standard Borel spaces satisfying a few natural properties. Divergences include many classical information divergence measures, such as relative entropy and $f$-divergences. Several properties of divergences and their duality with law invariant risk measures are developed, most notably relating their chain rules or additivity properties to certain notions of time consistency for dynamic law invariant risk measures, known as acceptance and rejection consistency. These properties are also linked to a peculiar property of the acceptance sets at the level of distributions, analogous to results of Weber on weak acceptance and rejection consistency. Finally, the examples of shortfall risk measures and optimized certainty equivalents are discussed in some detail, and it is shown that relative entropy is essentially the only divergence satisfying the chain rule.
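For orientation, the following is a brief LaTeX sketch of standard definitions behind the objects named in the abstract (relative entropy, $f$-divergences, optimized certainty equivalents, and the chain rule). These follow common textbook conventions; signs and normalizations vary across the literature and may differ from the paper's own.

```latex
% Standard definitions, stated under common conventions; the paper's
% normalizations may differ.

% Relative entropy (Kullback--Leibler divergence) of P with respect to Q:
\[
  H(P \,|\, Q) \;=\; \int \log\!\Big(\frac{dP}{dQ}\Big)\, dP
  \quad \text{if } P \ll Q, \qquad H(P \,|\, Q) = +\infty \ \text{otherwise}.
\]

% f-divergence, for convex f with f(1) = 0; relative entropy is the
% case f(x) = x log x:
\[
  D_f(P \,|\, Q) \;=\; \int f\!\Big(\frac{dP}{dQ}\Big)\, dQ.
\]

% Optimized certainty equivalent built from a convex loss \ell, together
% with its dual representation via the divergence generated by the convex
% conjugate \ell^*:
\[
  \rho(X) \;=\; \inf_{m \in \mathbb{R}}
    \big\{ m + \mathbb{E}_Q[\ell(-X - m)] \big\}
  \;=\; \sup_{P \ll Q}
    \big\{ \mathbb{E}_P[-X] - D_{\ell^*}(P \,|\, Q) \big\}.
\]

% Chain rule for relative entropy: disintegrate P on a product space as
% P(dx, dy) = P_1(dx) K(x, dy); against a product reference measure,
\[
  H(P \,|\, Q_1 \otimes Q_2)
  \;=\; H(P_1 \,|\, Q_1)
  + \int H\big(K(x, \cdot) \,|\, Q_2\big)\, P_1(dx).
\]
```

For instance, taking $\ell(x) = (e^{\theta x} - 1)/\theta$ in the optimized certainty equivalent recovers the entropic risk measure $\rho(X) = \frac{1}{\theta}\log \mathbb{E}_Q[e^{-\theta X}]$, whose penalty in the dual representation is $\frac{1}{\theta} H(P \,|\, Q)$, matching the abstract's emphasis on the special role of relative entropy.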
- Published
2018