19 results
Search Results
2. Analyzing risk and performance using the multi-factor concept
- Author
-
Nico van der Wijst, Jaap Spronk, and Erik M. Vermeulen
- Subjects
Risk analysis, Information Systems and Management, Actuarial science, General Computer Science, Computer science, Financial risk management, Risk management tools, Factor analysis of information risk, Management Science and Operations Research, Industrial and Manufacturing Engineering, Risk analysis (business), Modeling and Simulation, Econometrics, Risk management - Abstract
In this paper, we present a new model to analyze the risk and the expected level of firm performance. The model is based on the multi-factor approach to risk, in which unexpected performance is explained through sensitivities to unexpected changes in risk factors. Instead of applying the multi-factor approach to the analysis of security portfolios, we use it to analyze performance measures of firms. The approach is used not only to analyze risk but also to analyze the expected level of performance. Furthermore, we analyze how instruments, such as projects, can be used to change the risk and the expected level of performance. An illustrative application in the field of finance is presented, although the model can also be applied in other areas.
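The core mechanics of the multi-factor approach this abstract describes can be sketched as a regression of unexpected performance on unexpected factor changes. The data, factor names, and coefficients below are entirely hypothetical; this is only a minimal illustration of estimating factor sensitivities, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: unexpected changes in two risk factors and a firm
# performance measure driven by them (all numbers made up for illustration).
T = 200
dF = rng.normal(size=(T, 2))                   # unexpected factor changes
beta_true = np.array([0.8, -0.5])              # true factor sensitivities
perf = dF @ beta_true + rng.normal(scale=0.3, size=T)  # unexpected performance

# Estimate the sensitivities by least squares: perf ~ dF @ beta
beta_hat, *_ = np.linalg.lstsq(dF, perf, rcond=None)

# Risk decomposition: variance contribution of each factor exposure
explained = beta_hat**2 * dF.var(axis=0)
print("estimated sensitivities:", beta_hat)
print("variance contributions:", explained)
```

An instrument (e.g. a project) would enter this sketch as a deliberate change to the firm's sensitivity vector, shifting both the variance contributions and the expected performance level.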
- Published
- 1996
3. 3 Semiparametric methods for asset pricing models
- Author
-
Bruce N. Lehmann
- Subjects
Investment theory, Actuarial science, Consumption-based capital asset pricing model, Arbitrage pricing theory, Economics, Econometrics, Capital asset pricing model, Rational pricing, Hedge (finance), Basis risk, Generalized method of moments - Abstract
This paper discusses semiparametric estimation procedures for asset pricing models within the generalized method of moments (GMM) framework. GMM is widely applied in the asset pricing context in its unconditional form but the conditional mean restrictions implied by asset pricing theory are seldom fully exploited. The purpose of this paper is to take some modest steps toward removing these impediments. The nature of efficient GMM estimation is cast in a language familiar to financial economists: the language of maximum correlation or optimal hedge portfolios. Similarly, a family of beta pricing models provides a natural setting for identifying the sources of efficiency gains in asset pricing applications. My hope is that this modest contribution will facilitate more routine exploitation of attainable efficiency gains.
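As a hedged illustration of the unconditional GMM idea the abstract refers to, the sketch below estimates the single parameter of a linear stochastic discount factor from one moment condition. The factor, return series, and parameter values are simulated and purely hypothetical; with one asset and one parameter the sample moment can be solved exactly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: a single pricing factor f and one excess return Re.
T = 1000
f = rng.normal(0.05, 0.10, T)
Re = 0.6 * f + rng.normal(0.0, 0.05, T)

# Linear SDF m_t = 1 - b * f_t with unconditional moment E[m_t * Re_t] = 0.
# Exact identification: mean(Re) - b * mean(f * Re) = 0 pins down b.
b_hat = Re.mean() / (f * Re).mean()

# The fitted sample moment is zero by construction.
g = np.mean((1.0 - b_hat * f) * Re)
print("b_hat =", b_hat, " sample moment =", g)
```

The efficiency gains the paper discusses come from exploiting conditional versions of such moment restrictions rather than only this unconditional one.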
- Published
- 1996
4. Frontiers in VaR forecasting and backtesting
- Author
-
María Rosa Nieto and Esther Ruiz
- Subjects
Alternative methods, Risk, Actuarial science, Series (mathematics), Autoregressive conditional heteroskedasticity, Risk measure, Extreme value theory, Statistics, Backtesting, GARCH, Econometrics, Economics, Quantile, Value at risk, Business and International Management - Abstract
The interest in forecasting the Value at Risk (VaR) has been growing over the last two decades, owing to the practical relevance of this risk measure for financial and insurance institutions. Furthermore, VaR forecasts are often used as a testing ground when fitting alternative models for the dynamic evolution of time series of financial returns. There is a vast number of alternative methods for constructing and evaluating VaR forecasts; in this paper, we survey the new benchmarks proposed in the recent literature.
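One of the simplest VaR forecasting and backtesting pipelines the survey literature benchmarks against can be sketched as follows. The returns are simulated Student-t draws standing in for real data, and the window length and coverage level are arbitrary choices; the backtest simply counts violations against the nominal rate.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fat-tailed daily returns; in practice these would be observed.
returns = rng.standard_t(df=5, size=1500) * 0.01

window, alpha = 500, 0.01          # rolling window length, 1% VaR level
violations = 0
n_forecasts = 0
for t in range(window, len(returns)):
    # Historical-simulation VaR: empirical alpha-quantile of the past window.
    var_t = -np.quantile(returns[t - window:t], alpha)
    n_forecasts += 1
    if returns[t] < -var_t:        # realized loss exceeded the VaR forecast
        violations += 1

# Backtest: the violation rate should be close to alpha for a good model.
rate = violations / n_forecasts
print(f"violation rate {rate:.3%} vs nominal {alpha:.1%}")
```

Formal backtests such as Kupiec's unconditional coverage test build a likelihood-ratio statistic from exactly this violation count.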
- Published
- 2016
5. How to Use These Scenarios for Asset Management?
- Author
-
Hervé Fraysse and Arnaud Clément-Grandcourt
- Subjects
Actuarial science, Kalman filter, Econometrics, Arbitrage pricing theory, Economics, Mean variance, Asset management, Algorithmic trading, Particle filter, Smoothing - Abstract
The University of Zurich did research on ways to overcome the weaknesses of traditional mean-variance optimization. The mean-variance utility function of the Sharpe method is the easiest to implement, but using the second-order moment causes several optimization issues. Variance, as a sum of squared deviations, does not distinguish positive from negative moves: this is an important problem when there are asymmetric fat tails. Moreover, in a bear market, volume is decreasing and operators are more nervous; consequently, noise becomes important and adds up in variance. Studies of noise in the markets are numerous: for instance, the papers by Filippi, Lepage, Meyrignac and Pochon on high-frequency statements of stock prices [FRA 12], Kai Yao [YIN 90] on low frequency, and Professor Haugen's studies on weekly and monthly statements showed important random noise that increases in bear markets. Trading algorithms use filters, such as the Kalman filter for low-frequency information or the particle filter for high-frequency information; smoothing formulas are used by Professor Haugen and many others. Trading systems now issue more than one order in two in New York and one in three in Europe.
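The Kalman filtering mentioned above can be illustrated with a minimal scalar filter that extracts a slow signal from noisy prices. The price series and the noise variances Q and R are hypothetical; this is a generic random-walk-plus-noise sketch, not the specific filters used in the book.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical noisy prices: slow random-walk signal plus observation noise.
n = 300
signal = np.cumsum(rng.normal(0, 0.1, n)) + 100.0
prices = signal + rng.normal(0, 1.0, n)

# Scalar Kalman filter, state model x_t = x_{t-1} + w_t, y_t = x_t + v_t.
Q, R = 0.01, 1.0            # assumed process / observation noise variances
x, P = prices[0], 1.0
filtered = np.empty(n)
for t, y in enumerate(prices):
    P = P + Q               # predict step
    K = P / (P + R)         # Kalman gain
    x = x + K * (y - x)     # update with the new observation
    P = (1 - K) * P
    filtered[t] = x

# The filtered series should track the signal with less error than raw prices.
raw_rmse = np.sqrt(np.mean((prices - signal) ** 2))
filt_rmse = np.sqrt(np.mean((filtered - signal) ** 2))
print("raw RMSE:", raw_rmse, " filtered RMSE:", filt_rmse)
```

A particle filter would replace the Gaussian update with a weighted resampling of state samples, which is why it suits the non-Gaussian high-frequency noise described above.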
- Published
- 2015
6. Pricing and hedging guaranteed annuity options via static option replication
- Author
-
Antoon Pelsser
- Subjects
Statistics and Probability, Economics and Econometrics, Computer science, Interest rates, Valuation, Life-insurance liabilities, Static option replication, JEL G22, Swap (finance), Term structure, Econometrics, Hedge (finance), Market value, Contracts, Actuarial science, Derivatives, Policies, Framework, JEL G13, Interest rate, Guaranteed annuity options, Interest rate risk, Annuity (American), Replicating portfolio, Hedging methodology, Business, Statistics, Probability and Uncertainty, Martingale (probability theory) - Abstract
In this paper we derive a market value for with-profits guaranteed annuity options (GAOs) using martingale modelling techniques. Furthermore, we show how to construct a static replicating portfolio of vanilla interest rate swaptions that replicates the with-profits GAO. Finally, we illustrate with historical UK interest rate data from 1980 to 2000 that the static replicating portfolio would have been extremely effective as a hedge against the interest rate risk involved in the GAO, that it would have been considerably cheaper than up-front reserving, and that it would have provided a much better level of protection than an up-front reserve.
- Published
- 2003
7. Optimal portfolio selection in a Value-at-Risk framework
- Author
-
Rachel Campbell, Ronald Huisman, and Kees Koedijk
- Subjects
Rate of return on a portfolio, Economics and Econometrics, Actuarial science, Sharpe ratio, Econometrics, Economics, Portfolio, Expected return, Time horizon, Portfolio optimization, Finance, Value at risk, Modern portfolio theory - Abstract
In this paper, we develop a portfolio selection model which allocates financial assets by maximising expected return subject to the constraint that the expected maximum loss should meet the Value-at-Risk limits set by the risk manager. Similar to the mean-variance approach, a performance index like the Sharpe index is constructed. Furthermore, when expected returns are assumed to be normally distributed, we show that the model provides almost identical results to the mean-variance approach. We provide an empirical analysis using two risky assets: US stocks and bonds. The results highlight the influence of both the non-normal characteristics of the expected return distribution and the length of the investment horizon on optimal portfolio selection.
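Under the normality assumption the abstract mentions, the VaR-constrained selection problem can be sketched with a simple grid search over two-asset weights. All parameter values (means, volatilities, correlation, VaR limit) are hypothetical, chosen only so the constraint binds somewhere inside the grid.

```python
import numpy as np

# Hypothetical annual parameters for two risky assets (e.g. stocks, bonds).
mu = np.array([0.10, 0.05])
sigma = np.array([0.18, 0.07])
rho = 0.2
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])

z = 1.645            # 95% standard normal quantile
var_limit = 0.10     # maximum acceptable VaR, as a fraction of wealth

best_w, best_ret = None, -np.inf
for w in np.linspace(0.0, 1.0, 101):   # weight in asset 1, rest in asset 2
    wts = np.array([w, 1.0 - w])
    er = wts @ mu
    sd = np.sqrt(wts @ cov @ wts)
    var95 = z * sd - er                # VaR of a normal portfolio return
    if var95 <= var_limit and er > best_ret:
        best_w, best_ret = w, er

print("optimal weight in asset 1:", best_w, " expected return:", best_ret)
```

Because expected return is increasing in the riskier asset's weight here, the optimum sits at the largest weight for which the VaR constraint still holds, mirroring the mean-variance frontier logic described above.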
- Published
- 2001
8. Comonotonicity, correlation order and premium principles
- Author
-
Shaun Wang and Jan Dhaene
- Subjects
Statistics and Probability, Economics and Econometrics, Actuarial science, Comonotonicity, Bivariate analysis, Correlation, Distribution (mathematics), Order (exchange), Economics, Econometrics, Statistics, Probability and Uncertainty - Abstract
In this paper, we investigate the notion of dependency between risks and its effect on the related stop-loss premiums. The concept of comonotonicity, being an extreme case of dependency, is discussed in detail. For the bivariate case, it is shown that, given the distributions of the individual risks, comonotonicity leads to maximal stop-loss premiums. Some properties of stop-loss preserving premium principles are considered. A simple proof is given for the sub-additivity property of Wang's premium principle.
- Published
- 1998
9. Forecasting Exchange Rates: an Investor Perspective
- Author
-
John Prins, Michael Melvin, and Duncan Shand
- Subjects
Transaction cost, Actuarial science, Exchange rate, Currency, Econometrics, Economics, Equity (finance), Portfolio, Passive management, Random walk - Abstract
The popular scholarly exercise of evaluating exchange rate forecasting models relative to a random walk was stimulated by the well-cited Meese and Rogoff (1983) paper. Practitioners who construct quantitative models for trading exchange rates approach forecasting from a different perspective. Rather than focus on forecast errors for bilateral exchange rates, as in the Meese–Rogoff case, we present what is required for constructing a successful trading model. To provide more perspective, a particular approach to quantitative modeling is presented that incorporates return forecasts, a risk model, and a transaction cost constraint in an optimization framework. Since beating a random walk is not a useful evaluation metric for currency investing, we discuss the use of benchmarks and conclude that performance evaluation in currencies is much more problematic than in equity markets due to the lack of a passive investment strategy and the multitude of alternative formulations of well-known currency style factors. We then provide analytical tools that can be useful in evaluating currency manager skill in terms of portfolio tilts and timing. Finally, we examine how conditioning information can be employed to enhance timing skill in trading generic styles like the carry trade. Such information can be valuable in reducing the duration and magnitude of portfolio drawdowns.
- Published
- 2013
10. Choosing the optimum mix of duration and effort in education
- Author
-
Hessel Oosterbeek
- Subjects
Economics and Econometrics, Actuarial science, Higher education, Econometrics, Economics, Economic model, Duration (project management), Relative price, Socioeconomic status, Education, Study duration - Abstract
In this paper, a simple economic model is employed to analyse the determinants of expected study duration and weekly effort. Although some of the outcomes do not fit our theoretical framework, a substantial number of the results support the hypothesis that the duration/effort ratio is determined by the relative prices of these inputs to the learning process. We find that a higher socio-economic status increases the duration/effort ratio: children from higher-income families and/or with more highly educated parents expect longer durations and/or invest less weekly effort. For experienced students, the prediction that higher ability levels will decrease both effort and duration is confirmed by the findings. We consider this a result firmly in favour of our model. [JEL I21]
- Published
- 1995
11. A Bayesian analysis of the OPES model with a nonparametric component: An application to dental insurance and dental care
- Author
-
Murat K. Munkin and Pravin K. Trivedi
- Subjects
Actuarial science, Average treatment effect, Population, Adverse selection, Nonparametric statistics, Ordered probit, Dental insurance, Incentive, Econometrics, Economics, Endogeneity - Abstract
This paper analyzes the effect of dental insurance on utilization of general dentist services by the adult US population aged 25 to 64, using an ordered probit model with endogenous selection. Our econometric framework accommodates the endogeneity of insurance and the ordered nature of the measure of dental utilization. The study finds strong evidence that dental insurance is endogenous to utilization and identifies interesting patterns of nonlinear dependence between dental insurance status and an individual's age and income. The calculated average treatment effect supports the claim of adverse selection into the treated (insured) state and indicates a strong positive incentive effect of dental insurance.
- Published
- 2008
12. Examining Explanations of a Market Anomaly: Preferences or Perceptions?
- Author
-
Erik Snowberg and Justin Wolfers
- Subjects
Actuarial science, Political science, Perception, Econometrics, Market anomaly, Small probability, Parametric statistics - Abstract
This paper compiles and summarizes the theoretical literature on the favorite-longshot bias, an anomaly that has been found in sports betting markets for over half a century. Explanations of this anomaly can be broken down into two broad categories, those involving preferences and those involving perceptions. We propose a novel test of these two classes of models that allows us to discriminate between them without parametric assumptions. We execute these tests on a new dataset, which is an order of magnitude larger than any used in previous studies, and conclude that the perceptions model, in which bettors overestimate the chances of small probability events, provides a better fit to the data.
- Published
- 2008
13. Unobserved Heterogeneity and the Term-Structure of Default
- Author
-
Koresh Galil
- Subjects
Actuarial science, Moral hazard, Hazard ratio, Adverse selection, Econometrics, Economics, Hazard, Term (time) - Abstract
This paper estimates the conditional baseline (term structure) of the hazard rate to default at the time of bond issuance using two hazard models: one ignoring and one allowing for unobserved heterogeneity (UH) in the hazard rate. Following Diamond (1989), one can predict a declining hazard rate to default due to adverse selection and moral hazard. After controlling for UH caused by adverse selection and time-series shocks, the hazard rate is shown to be increasing over time, and hence the moral hazard effect cannot be confirmed.
- Published
- 2007
14. 10 Interest rate spreads as predictors of business cycles
- Author
-
Kajal Lahiri and Jiazhuo G. Wang
- Subjects
Actuarial science, Bond, Floating interest rate, Federal funds, Economics, Econometrics, Yield curve, Real interest rate, Trough (economics), Interest rate, Treasury - Abstract
The chapter discusses the comparative performance of a number of interest rate spreads as predictors of U.S. business cycle turning points. In order to map changes in the predictor variables into turning point predictions, a non-linear filter is used. In this framework, the dynamic behavior of the economy is allowed to vary between expansions and recessions in terms of duration and volatility. The chapter focuses on three spreads that have shown maximum potential in past research. They are the difference between the Federal funds rate and the ten-year Treasury bond rate, the difference between the ten-year Treasury bond rate and the one-year Treasury bill rate, and the spread between the six-month commercial paper and six-month Treasury bill rates. The peak signals came with an average lead time of nearly 20 months and the trough signals with an average lead time of nearly 3 months. The behavior of the spread based on the Federal funds rate is similar to that of the yield curve with very similar lead times.
- Published
- 1996
15. A MARKOV PROCESS MODEL FOR CASH MANAGEMENT
- Author
-
K.J. Leonard and Z.S. Khalil
- Subjects
Actuarial science, Exponential distribution, Markov process, Poisson distribution, Net present value, Terminal value, Cash, Forecast period, Econometrics, Business, Cash management - Abstract
In this paper, we consider a Markov process model for optimizing the operating cash of a firm. We use a stochastic inventory model in which we assume that the input process (receipt of cash) is Poisson and the cash disbursement time is exponentially distributed. We derive steady state probabilities of reaching various cash levels and obtain some important characteristics of such systems.
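With Poisson receipts and exponential disbursement times, the steady-state cash-level probabilities take the familiar birth-death (M/M/1-type) geometric form. The rates below are hypothetical and the geometric solution is a generic queueing sketch, not necessarily the paper's exact model.

```python
import numpy as np

# Hypothetical rates: cash receipts arrive as a Poisson process (rate lam),
# and each disbursement takes an exponential time (rate mu), giving an
# M/M/1-type birth-death chain for the number of cash units on hand.
lam, mu = 3.0, 5.0
rho = lam / mu                    # utilization; stability requires rho < 1

# Steady-state probability of holding n cash units: p_n = (1 - rho) * rho**n
n = np.arange(0, 50)
p = (1 - rho) * rho**n

mean_level = np.sum(n * p)        # ~ rho / (1 - rho) for this truncation
print("P(level = 0):", p[0])
print("mean cash level:", mean_level)
```

Characteristics such as the probability of hitting a reorder threshold follow directly by summing the geometric tail of p.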
- Published
- 1984
16. EXAMINING THE MARKET VALUE OF ENERGY EFFICIENCY
- Author
-
Richard L. Haney
- Subjects
Actuarial science, Empirical research, Present value, Value (economics), Econometrics, Economics, Cost approach, Regression analysis, Market value, Efficient energy use - Abstract
The life cycle cost approach to estimating the value of energy saving improvements presumes that consumers can accurately estimate, and will willingly pay for, the future energy savings associated with the improvement. This paper reports the initial results of an empirical test of that proposition. Using actual re-sale prices for homes with solar domestic hot water systems and the present value of the utility savings estimated via the F-CHART procedure, the study uses multiple regression analysis to estimate that consumers are willing to pay for 89 percent of an installed DHW system's capitalized value.
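The capitalized-value calculation underlying the life cycle cost approach is a standard annuity present value. The saving amount, horizon, and discount rate below are hypothetical stand-ins (the abstract does not report them); only the 89 percent figure comes from the study.

```python
# Hypothetical figures: a solar hot-water system saving $400/year in utility
# costs over 20 years, discounted at 7%. The study's 89% estimate would then
# be the fraction of this capitalized value reflected in re-sale prices.
annual_saving = 400.0
rate, years = 0.07, 20

# Present value of a level annuity: PV = C * (1 - (1 + r)**-n) / r
pv = annual_saving * (1 - (1 + rate) ** -years) / rate
implied_premium = 0.89 * pv

print(f"capitalized value: ${pv:,.0f}")
print(f"implied price premium at 89%: ${implied_premium:,.0f}")
```

The regression in the study effectively estimates that 0.89 coefficient by relating re-sale prices to F-CHART present values.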
- Published
- 1986
17. Portfolio Analysis Using Possibility Distributions
- Author
-
J. J. Buckley
- Subjects
Actuarial science, Ranking, Fuzzy set, Econometrics, Fuzzy number, Cash flow, Net present value, Modern portfolio theory, Interest rate, Mathematics - Abstract
This paper considers the important problem in finance of ranking investment proposals, characterized by uncertain future cash flows, project durations and interest rates, from best to worst. Uncertainty is modeled by possibility distributions defined by fuzzy numbers or discrete fuzzy sets. Using the concepts of conditional, joint and marginal possibility distributions, we compute one possibility distribution, representing average net present value, for each project. The proposals are then ranked, producing a set of undominated projects, all considered equally best.
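A minimal version of fuzzy NPV can be sketched with triangular fuzzy cash flows and interval arithmetic on alpha-cuts. The cash flows and crisp discount rate below are hypothetical, and using a crisp rate is a simplification; the paper also treats fuzzy rates and durations.

```python
# Hypothetical project: yearly cash flows given as triangular fuzzy numbers
# (low, most likely, high), discounted at a crisp 8% rate for simplicity.
cash_flows = [(-110, -100, -95), (30, 40, 45), (35, 45, 55), (40, 50, 65)]
rate = 0.08

def npv_alpha_cut(alpha):
    """Interval [lo, hi] of the fuzzy NPV at membership level alpha."""
    lo = hi = 0.0
    for t, (a, b, c) in enumerate(cash_flows):
        d = (1 + rate) ** t
        lo += (a + alpha * (b - a)) / d   # left end of the alpha-cut
        hi += (c - alpha * (c - b)) / d   # right end of the alpha-cut
    return lo, hi

for alpha in (0.0, 0.5, 1.0):
    lo, hi = npv_alpha_cut(alpha)
    print(f"alpha={alpha}: NPV in [{lo:.2f}, {hi:.2f}]")
```

At alpha = 1 the interval collapses to the crisp most-likely NPV; ranking projects then amounts to comparing such nested families of intervals, which is where undominated sets arise.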
- Published
- 1987
18. ANALYSIS OF RISK IN MINING PROJECTS
- Author
-
T. Alan O'Hara
- Subjects
Actuarial science, Operating cash flow, Risk analysis (business), Yield (finance), Return on investment, Econometrics, Economics, Capital cost, Cash flow, Return on capital, Discounted cash flow - Abstract
The decision to invest capital in developing a mine and constructing a mineral processing plant and services is usually made after a feasibility study shows that the estimated future cash flow from mine operations will yield an adequate return on the estimated capital expenditures. The actual return on investment may differ substantially from that estimated in the feasibility study because of probable errors in estimating capital costs, ore reserves, operating costs, mineral revenue and operating productivity. This paper outlines the R.S.S. (root sum of squares) risk analysis procedure for computing the overall probability that the actual project D.C.F. (discounted cash flow rate of return) differs from that estimated in the feasibility study.
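The root-sum-of-squares combination described above is a one-line calculation once the component error estimates are in hand. The percentage errors below are hypothetical, and treating them as independent one-sigma errors is the standard assumption behind the R.S.S. rule.

```python
import math

# Hypothetical one-sigma percentage errors in the feasibility-study estimates.
errors = {
    "capital cost": 0.15,
    "ore reserves": 0.10,
    "operating cost": 0.12,
    "mineral revenue": 0.20,
    "productivity": 0.08,
}

# Root sum of squares: for independent component errors, the combined
# uncertainty is the square root of the sum of the squared errors.
rss = math.sqrt(sum(e**2 for e in errors.values()))
print(f"combined uncertainty in project D.C.F.: +/-{rss:.1%}")
```

Note that the combined figure is dominated by the largest component (here the hypothetical revenue error), which is why sensitivity work usually targets that estimate first.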
- Published
- 1987
19. THE EFFECT OF INFLATION ON THE EVALUATION OF MINES
- Author
-
Herbert D. Drechsler and James B. Stephenson
- Subjects
Actuarial science, Price index, Hurdle rate, Price change, Economics, Econometrics, Internal rate of return, Valuation (finance) - Abstract
The paper identifies the major components of price change which affect the determination of the internal rate of return of a mining project. It illustrates several price indexes which may be used to measure general price increases, identifying techniques which might incorporate them into mine valuation procedures. The “hurdle rate” for new investment is discussed.
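The mechanics of stripping general price increases out of a nominal return, which underlies the valuation adjustments discussed above, follow the Fisher relation. The rate values below are hypothetical and serve only to show the calculation.

```python
# Hypothetical rates: converting a nominal IRR into real terms using the
# Fisher relation (1 + nominal) = (1 + real) * (1 + inflation), where the
# inflation figure would come from an appropriate price index.
nominal_irr = 0.18
inflation = 0.07

real_irr = (1 + nominal_irr) / (1 + inflation) - 1
print(f"real IRR: {real_irr:.2%}")

# A project clears a real hurdle rate only after this deflation step.
real_hurdle = 0.08
print("accept:", real_irr > real_hurdle)
```

Using the common approximation real = nominal - inflation (11% here) instead of the exact relation (about 10.3%) overstates the real return, and the gap widens at the high inflation rates mine evaluations of that era faced.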
- Published
- 1987
Discovery Service for Jio Institute Digital Library