### Abstract

Familiar approaches to risk and preferences involve minimizing the expectation E_{IP}(X) of a payoff function X over a family Γ of plausible risk factor distributions IP. We consider Γ determined by a bound on a convex integral functional of the density of IP; thus Γ may be an I-divergence (relative entropy) ball, some other f-divergence ball, or a Bregman distance ball around a default distribution IP_{0}. Using a Pythagorean identity we show that, whether or not a worst case distribution exists (minimizing E_{IP}(X) subject to IP ∈ Γ), the almost worst case distributions cluster around an explicitly specified, perhaps incomplete distribution. When Γ is an f-divergence ball, a worst case distribution either exists for every radius, or it exists for radii smaller than a critical value and fails to exist for larger ones. It remains open how far the latter result extends beyond f-divergence balls.
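As a hypothetical illustration of the problem the abstract describes (not the authors' construction), the sketch below computes the worst case expectation over an I-divergence ball on a finite sample space. In that setting the minimizer, when it exists, is an exponential tilting of the default density, p(θ) ∝ p0·exp(−θX), with θ ≥ 0 chosen so that the divergence meets the radius; the function names and the bisection search are assumptions for the sketch, not part of the paper.

```python
import numpy as np

def kl(p, q):
    """I-divergence (relative entropy) D(p || q) for discrete distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def worst_case_expectation(x, p0, radius, theta_max=100.0, tol=1e-10):
    """Minimize E_P[X] over the I-divergence ball D(P || P0) <= radius.

    On a finite sample space the candidate minimizer is the exponentially
    tilted density p(theta) ~ p0 * exp(-theta * x); theta >= 0 is located by
    bisection so that D(p(theta) || p0) equals the radius.
    """
    def tilt(theta):
        # Shift x by its minimum so the exponent is bounded (numerical stability).
        w = p0 * np.exp(-theta * (x - x.min()))
        return w / w.sum()

    if kl(tilt(theta_max), p0) < radius:
        # The ball is large enough that the tilt saturates; use the cap.
        theta = theta_max
    else:
        lo, hi = 0.0, theta_max
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            # D(p(theta) || p0) increases along the tilting path.
            if kl(tilt(mid), p0) < radius:
                lo = mid
            else:
                hi = mid
        theta = 0.5 * (lo + hi)
    p = tilt(theta)
    return float(np.dot(p, x)), p
```

For example, with payoffs x = [1, 2, 3, 4], a uniform default p0, and radius 0.1, the returned worst case expectation is strictly below the default expectation 2.5, and the tilted density sits on the boundary of the ball.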

Original language | English |
---|---|

Title of host publication | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |

Publisher | Springer Verlag |

Pages | 435-443 |

Number of pages | 9 |

Volume | 9389 |

ISBN (Print) | 9783319250397 |

DOIs | https://doi.org/10.1007/978-3-319-25040-3_47 |

Publication status | Published - 2015 |

Event | 2nd International Conference on Geometric Science of Information, GSI 2015 - Palaiseau, France Duration: Oct. 28, 2015 → Oct. 30, 2015 |

### Publication series

Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|

Volume | 9389 |

ISSN (Print) | 0302-9743 |

ISSN (Electronic) | 1611-3349 |

### Other

Other | 2nd International Conference on Geometric Science of Information, GSI 2015 |
---|---|

Country | France |

City | Palaiseau |

Period | 10/28/15 → 10/30/15 |


### ASJC Scopus subject areas

- Computer Science (all)
- Theoretical Computer Science

### Cite this

Csiszár, I., & Breuer, T. (2015). An information geometry problem in mathematical finance. In *Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)* (Vol. 9389, pp. 435-443). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 9389). Springer Verlag. https://doi.org/10.1007/978-3-319-25040-3_47

**An information geometry problem in mathematical finance.** / Csiszár, I.; Breuer, Thomas.

Research output: Conference contribution

*Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)*, vol. 9389, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 9389, Springer Verlag, pp. 435-443, 2nd International Conference on Geometric Science of Information, GSI 2015, Palaiseau, France, 10/28/15. https://doi.org/10.1007/978-3-319-25040-3_47


TY - GEN

T1 - An information geometry problem in mathematical finance

AU - Csiszár, I.

AU - Breuer, Thomas

PY - 2015

Y1 - 2015

N2 - Familiar approaches to risk and preferences involve minimizing the expectation EIP(X) of a payoff function X over a family Γ of plausible risk factor distributions IP. We consider Γ determined by a bound on a convex integral functional of the density of IP, thus Γ may be an I-divergence (relative entropy) ball or some other f-divergence ball or Bregman distance ball around a default distribution IP0. Using a Pythagorean identity we show that whether or not a worst case distribution exists (minimizing EIP(X) subject to IP ∈ Γ), the almost worst case distributions cluster around an explicitly specified, perhaps incomplete distribution. When Γ is an f-divergence ball, a worst case distribution either exists for any radius, or it does/does not exist for radius less/larger than a critical value. It remains open how far the latter result extends beyond f-divergence balls.

AB - Familiar approaches to risk and preferences involve minimizing the expectation EIP(X) of a payoff function X over a family Γ of plausible risk factor distributions IP. We consider Γ determined by a bound on a convex integral functional of the density of IP, thus Γ may be an I-divergence (relative entropy) ball or some other f-divergence ball or Bregman distance ball around a default distribution IP0. Using a Pythagorean identity we show that whether or not a worst case distribution exists (minimizing EIP(X) subject to IP ∈ Γ), the almost worst case distributions cluster around an explicitly specified, perhaps incomplete distribution. When Γ is an f-divergence ball, a worst case distribution either exists for any radius, or it does/does not exist for radius less/larger than a critical value. It remains open how far the latter result extends beyond f-divergence balls.

KW - Almost worst case densities

KW - Bregman distance

KW - Convex integral functional

KW - F-divergence

KW - I-divergence

KW - Payoff function

KW - Pythagorean identity

KW - Risk measure

KW - Worst case density

UR - http://www.scopus.com/inward/record.url?scp=84950327509&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84950327509&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-25040-3_47

DO - 10.1007/978-3-319-25040-3_47

M3 - Conference contribution

AN - SCOPUS:84950327509

SN - 9783319250397

VL - 9389

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 435

EP - 443

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

PB - Springer Verlag

ER -