68 results for '"evaluating forecasts"'
Search Results
2. The role of temporal dependence in factor selection and forecasting oil prices.
- Author
-
Binder, Kyle E., Pourahmadi, Mohsen, and Mjelde, James W.
- Subjects
PETROLEUM sales & prices, FORECASTING, FORECASTING methodology, PRINCIPAL components analysis, COVARIANCE matrices - Abstract
Extracting information from high-dimensional time series in the form of underlying factors is an increasingly popular methodology in forecasting applications. In this paper, principal component analysis (PCA) and three other methods for factor extraction are compared based on their deterministic and probabilistic forecasting performances using factor-augmented vector autoregressive (FAVAR) models. The existing PCA-based methods use only the contemporaneous covariance matrix of the data, while the other methods rely on weighted lagged cross-covariance matrices. Our empirical study considers four crude oil futures price instruments and a 241-variable dataset of global energy prices and quantities, macroeconomic indicators, and financial series thought to influence oil price movements. Overall empirical findings are: (1) the PCA-based method performs better at shorter forecast horizons, whereas the new methods involving lagged cross-covariance matrices tend to perform better at longer horizons (2 months or greater); (2) the performance ranking of the four methods under both deterministic and probabilistic forecasting is greatly affected by the number of factors included in the FAVAR models; (3) the forecast performances of the four methods are close to each other, and no method performs uniformly better than the others. More research on the role of temporal dependence in determining the number of factors is warranted. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
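As an illustration of the pipeline this abstract describes (extract factors from a large panel, then feed them into a factor-augmented autoregression), here is a minimal Python sketch. The panel, lag order, and factor count are placeholder assumptions; only the contemporaneous-covariance (PCA) variant is shown, not the paper's lagged cross-covariance alternatives.

```python
import numpy as np

def pca_factors(X, k):
    """Estimate k principal-component factors from a (T x N) panel via SVD."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each series
    U, S, _ = np.linalg.svd(Xs, full_matrices=False)
    return U[:, :k] * S[:k]                          # (T x k) factor estimates

def favar_one_step(y, F):
    """One-step forecast regressing y_t on its own lag and the lagged factors."""
    Z = np.column_stack([np.ones(len(y) - 1), y[:-1], F[:-1]])
    beta, *_ = np.linalg.lstsq(Z, y[1:], rcond=None)
    return np.concatenate([[1.0], [y[-1]], F[-1]]) @ beta

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))       # stand-in for the 241-variable panel
y = np.cumsum(rng.standard_normal(200))  # stand-in for an oil futures price series
print(favar_one_step(y, pca_factors(X, k=3)))
```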
3. Quantifying Priorities in Business Cycle Reports: Analysis of Recurring Textual Patterns around Peaks and Troughs
- Author
-
Foltas, Alexander
- Abstract
I propose a novel approach to uncover business cycle reports’ priorities and relate them to economic fluctuations. To this end, I leverage quantitative business-cycle forecasts published by leading German economic research institutes since 1970 to estimate the proportions of latent topics in associated business cycle reports. I then employ a supervised approach to aggregate topics with similar themes, thus revealing the proportions of broader macroeconomic subjects. I obtain measures of forecasters’ subject priorities by extracting the subject proportions’ cyclic components. Correlating these priorities with key macroeconomic variables reveals consistent priority patterns throughout economic peaks and troughs. The forecasters prioritize inflation-related matters over recession-related considerations around peaks. This finding suggests that forecasters underestimate growth and overestimate inflation risks during contractionary monetary policies, which might explain their failure to predict recessions. Around troughs, forecasters prioritize investment matters, potentially suggesting a better understanding of macroeconomic developments during those periods compared to peaks.
- Published
- 2023
4. Fitting the damped trend method of exponential smoothing.
- Author
-
Gardner, Everette S. and Acar, Yavuz
- Subjects
STATISTICAL smoothing, MULTILEVEL models, TELECOMMUNICATION, TIME series analysis, DAMPING (Mechanics) - Abstract
The well-established forecasting methods of exponential smoothing rely on the "optimal" estimation of parameters if they are to perform well. A grid search procedure to minimise the MSE is often used in practice to fit exponential smoothing methods, especially in large inventory control applications. Grid searches are also found in some modern statistical software. We ask whether the ex ante forecast accuracy of the damped trend method of exponential smoothing can be improved by optimising parameters. Furthermore, we ask whether the method should be fitted according to a mean absolute error criterion rather than the mean squared error commonly used in practice. We found that model-fitting matters. Parameter optimization makes significant improvements in forecast accuracy regardless of the fit criterion. We also show that the mean absolute error criterion usually produces better results. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
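The fitting question the abstract raises (grid search over smoothing parameters under an MSE or MAE criterion) is easy to make concrete. Below is a minimal, self-contained sketch of damped-trend exponential smoothing with a parameter grid search under either criterion; the grid resolution and the synthetic series are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from itertools import product

def damped_trend_errors(y, alpha, beta, phi):
    """One-step-ahead errors of damped-trend exponential smoothing (Gardner-McKenzie form)."""
    level, trend = y[0], y[1] - y[0]
    errs = []
    for obs in y[1:]:
        fcst = level + phi * trend             # one-step forecast
        errs.append(obs - fcst)
        new_level = alpha * obs + (1 - alpha) * fcst
        trend = beta * (new_level - level) + (1 - beta) * phi * trend
        level = new_level
    return np.array(errs)

def grid_fit(y, criterion="MSE", step=0.1):
    """Grid-search (alpha, beta, phi) minimizing the chosen in-sample fit criterion."""
    grid = np.arange(step, 1.0, step)
    loss = (lambda e: np.mean(e**2)) if criterion == "MSE" else (lambda e: np.mean(np.abs(e)))
    return min(product(grid, grid, grid),
               key=lambda p: loss(damped_trend_errors(y, *p)))

y = np.cumsum(np.random.default_rng(1).standard_normal(120)) + 50
print("MSE fit:", grid_fit(y, "MSE"), "MAE fit:", grid_fit(y, "MAE"))
```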
5. Forecasting Using Nonlinear Long Memory Models with Artificial Neural Network Expansion
- Author
-
Kongcharoen, Chaleampong, Huynh, Van-Nam, editor, Kreinovich, Vladik, editor, Sriboonchitta, Songsak, editor, and Suriya, Komsan, editor
- Published
- 2013
- Full Text
- View/download PDF
6. Cross-temporal forecast reconciliation: Optimal combination method and heuristic alternatives
- Author
-
Tommaso Di Fonzo and Daniele Girolimetto
- Subjects
FOS: Computer and information sciences, Mathematical optimization, Series (mathematics), Process (engineering), Heuristic, Computer science, Notation, Least squares, Expression (mathematics), Point (geometry), Dimension (data warehouse), Methodology (stat.ME), Statistics - Methodology, Business and International Management, Linearly constrained multiple time series, Combining forecasts, Heuristic techniques, Evaluating forecasts, GDP from Income and Expenditure side - Abstract
Forecast reconciliation is a post-forecasting process aimed at improving the quality of the base forecasts for a system of hierarchical/grouped time series (Hyndman et al., 2011). Contemporaneous (cross-sectional) and temporal hierarchies have been considered in the literature, but - except for Kourentzes and Athanasopoulos (2019) - these two features have generally not been fully considered together. Adopting a notation able to deal simultaneously with both forecast reconciliation dimensions, the paper shows two new results: (i) an iterative cross-temporal forecast reconciliation procedure which extends, and overcomes some weaknesses of, the two-step procedure by Kourentzes and Athanasopoulos (2019), and (ii) the closed-form expression of the optimal (in the least squares sense) point forecasts which fulfill both contemporaneous and temporal constraints. The feasibility of the proposed procedures, along with first evaluations of their performance compared to the best performing 'single dimension' (either cross-sectional or temporal) forecast reconciliation procedures, is studied through a forecasting experiment on the 95 quarterly time series of the Australian GDP from the Income and Expenditure sides considered by Athanasopoulos et al. (2019).
- Published
- 2023
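To make the reconciliation idea concrete, here is a minimal least-squares sketch: incoherent base forecasts are projected onto the subspace satisfying the aggregation constraints. This is a plain OLS projection with an illustrative one-constraint hierarchy; the paper's iterative cross-temporal procedure and weighted closed-form solution are not reproduced.

```python
import numpy as np

def ols_reconcile(y_hat, C):
    """Least-squares reconciliation: project base forecasts y_hat onto {y : C y = 0}
    using the orthogonal projector I - C'(CC')^{-1}C."""
    P = C.T @ np.linalg.solve(C @ C.T, C)
    return (np.eye(len(y_hat)) - P) @ y_hat

# Toy cross-sectional constraint: Total - A - B = 0. A temporal constraint
# (e.g. a yearly forecast minus the sum of its quarters) has exactly the same form.
C = np.array([[1.0, -1.0, -1.0]])
y_hat = np.array([10.0, 4.0, 5.0])   # incoherent base forecasts: 10 != 4 + 5
print(ols_reconcile(y_hat, C))       # reconciled forecasts satisfy the constraint
```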
7. TIME SERIES FORECASTING WITH A PRIOR WAVELET-BASED DENOISING STEP.
- Author
-
Bašta, Milan
- Subjects
SIGNAL denoising, WAVELETS (Mathematics), ALGORITHMS - Abstract
We provide an extensive study assessing whether a prior wavelet-based denoising step enhances the forecast accuracy of standard forecasting models. Many combinations of attribute values of the thresholding (denoising) algorithm are explored together with several traditional forecasting models used in economic time series forecasting. The results are evaluated using the M3 competition yearly time series. We conclude that combining a forecasting model with the prior denoising step cannot generally be recommended, which implies that a straightforward generalisation of some of the results available in the literature (which found the denoising step to be beneficial) is not possible. Even if cross-validation is used to select the value of the threshold, a superior performance of the forecasting model with the prior denoising step does not generally follow. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
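A minimal sketch of the "denoise first, then forecast" setup the abstract evaluates, using PyWavelets with soft universal thresholding. The wavelet, decomposition level, and threshold rule here are a single illustrative combination out of the many attribute combinations the paper explores.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(y, wavelet="db4", level=2):
    """Soft-threshold the detail coefficients (universal threshold), then reconstruct."""
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise scale from finest details
    thr = sigma * np.sqrt(2 * np.log(len(y)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(y)]

rng = np.random.default_rng(2)
y = np.sin(np.linspace(0, 6, 128)) + 0.3 * rng.standard_normal(128)
y_dn = wavelet_denoise(y)
# Denoise first, then apply any standard forecaster to y_dn, e.g. a naive forecast:
print("next-step naive forecast on denoised series:", y_dn[-1])
```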
8. Does the interaction between the accounting method choice and disclosure affect financial analysts’ information environment? The case of joint ventures under IAS 31.
- Author
-
Giner Inchausti, Begoña, Iñiguez Sanchez, Raul, and Poveda Fuentes, Francisco
- Subjects
ACCOUNTING ,JOINT ventures ,FINANCIAL statements ,EARNINGS forecasting ,CORPORATE profits - Abstract
- Published
- 2017
- Full Text
- View/download PDF
9. Has the forecasting performance of the Federal Reserve's Greenbooks changed over time?
- Author
-
Ekşi, Ozan, Orman, Cüneyt, and Taş, Bedri Kamil Onur
- Subjects
BAYESIAN analysis, INFERENTIAL statistics, INFLATION forecasting - Abstract
We investigate how the forecasting performance of the Federal Reserve Greenbooks has changed relative to commercial forecasters between 1974 and 2009. To this end, we analyze time-variation in the Greenbook coefficients in forecast encompassing regressions. Assuming that model coefficients change continuously, we estimate unobserved components models using Bayesian inference techniques. To verify that our results do not depend on the specific way change is modeled, we also allow the coefficients to change discretely rather than continuously and test for structural breaks using classical inference techniques. We find that the Greenbook forecasts have been consistently superior to the commercial forecasts at all horizons throughout our sample period. Although the forecasting performance gap has narrowed at more distant horizons after the early-to-mid 1980s, it remains significant. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
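The core tool named in the abstract, a forecast encompassing regression, can be sketched in a few lines: regress outcomes on the two competing forecasts and ask whether the commercial forecast adds anything once the Greenbook is included. Rolling-window OLS below is a crude stand-in for the paper's unobserved-components and structural-break estimation, and the data are synthetic.

```python
import numpy as np

def encompassing_weights(actual, f_gb, f_com):
    """OLS of outcomes on two competing forecasts: y = b0 + b1*greenbook + b2*commercial.
    Forecast 1 'encompasses' forecast 2 when b2 is indistinguishable from zero."""
    X = np.column_stack([np.ones_like(actual), f_gb, f_com])
    beta, *_ = np.linalg.lstsq(X, actual, rcond=None)
    return beta

rng = np.random.default_rng(3)
truth = rng.standard_normal(150)
gb = truth + 0.3 * rng.standard_normal(150)    # more accurate forecaster
com = truth + 0.8 * rng.standard_normal(150)   # noisier forecaster
# Rolling 60-observation windows give a crude picture of time-variation in the weights:
for start in (0, 45, 90):
    w = encompassing_weights(truth[start:start + 60], gb[start:start + 60], com[start:start + 60])
    print(start, w.round(2))
```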
10. Quantile Regression Estimates of Confidence Intervals for WASDE Price Forecasts
- Author
-
Olga Isengildina-Massa, Scott H. Irwin, and Darrel L. Good
- Subjects
commodity, evaluating forecasts, government forecasting, judgmental forecasting, prediction intervals, price forecasting, Agriculture - Abstract
This study uses quantile regressions to estimate historical forecast error distributions for WASDE forecasts of corn, soybean, and wheat prices, and then compute confidence limits for the forecasts based on the empirical distributions. Quantile regressions with fit errors expressed as a function of forecast lead time are consistent with theoretical forecast variance expressions while avoiding assumptions of normality and optimality. Based on out-of-sample accuracy tests over 1995/96–2006/07, quantile regression methods produced intervals consistent with the target confidence level. Overall, this study demonstrates that empirical approaches may be used to construct accurate confidence intervals for WASDE corn, soybean, and wheat price forecasts.
- Published
- 2010
- Full Text
- View/download PDF
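A minimal sketch of the interval construction the abstract describes, using statsmodels' quantile regression to model the tails of the forecast-error distribution as a function of forecast lead time. The error-generating process, horizon, and point forecast below are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
lead = rng.integers(1, 13, 400)                       # months until contract resolution
err = rng.standard_normal(400) * (0.2 + 0.1 * lead)   # error spread grows with lead time
df = pd.DataFrame({"err": err, "lead": lead})

# Tail quantiles of the forecast-error distribution as a function of lead time
q05 = smf.quantreg("err ~ lead", df).fit(q=0.05)
q95 = smf.quantreg("err ~ lead", df).fit(q=0.95)

point_forecast, horizon = 4.50, 8        # hypothetical price forecast, 8 months out
lo = point_forecast + q05.predict(pd.DataFrame({"lead": [horizon]}))[0]
hi = point_forecast + q95.predict(pd.DataFrame({"lead": [horizon]}))[0]
print(f"90% interval: [{lo:.2f}, {hi:.2f}]")
```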
11. A Nonlinear Approach for Modeling and Forecasting US Business Cycles.
- Author
-
BouAli, Meriam, Ben Nasr, Adnen, and Trabelsi, Abdelwahed
- Subjects
BUSINESS cycles, BUSINESS forecasting, NONLINEAR systems, BUSINESS models, GROSS domestic product, GROWTH rate - Abstract
The purpose of this paper is to provide a complete evaluation of four regime-switching models by checking their performance in detecting US business cycle turning points, in replicating US business cycle features, and in forecasting the US GDP growth rate. Both individual and combined forecasts are considered. Results indicate that while the Markov-switching model succeeded in replicating all the NBER peak and trough dates without detecting an extra cycle, it seems to be outperformed by the Bounce-back model in terms of the delay time to a correct alarm. Concerning business cycle feature characterization, none of the competing models dominates across all the features. The performance of the Markov-switching and Bounce-back models in detecting turning points did not translate into improved business cycle feature characterization, since they are outperformed by the Floor and Ceiling model. The forecast performance of the considered models varies across regimes and across forecast horizons: the model performing best in an expansion period is not necessarily the same in a recession period, and similarly for the forecast horizons. Finally, combining such individual forecasts generally leads to increased forecast accuracy, especially for h = 1. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
12. What drives the accuracy of PV output forecasts?
- Author
-
Nguyen, Thi Ngoc and Müsgens, Felix
- Subjects
NUMERICAL weather forecasting, FORECASTING, WAVELET transforms - Abstract
• Combining methodologies (hybrid models) achieves the lowest errors. • ML models do not show robust performance but improve the fastest. • Using data processing techniques reduces forecast errors. • Forecast horizon and test set length positively correlate with forecast errors. • The possibility of "cherry picking" in reporting errors is observed. In this paper, 180 papers on photovoltaic (PV) output forecasting were reviewed and a database of forecast errors was extracted for statistical analysis. The paper shows that among the forecast models, hybrid models are most likely to become the primary form of PV output forecasting in the future. The use of data processing techniques is positively correlated with the forecast quality, while the lengths of the forecast horizons and out-of-sample test sets have negative effects on the forecast accuracy. The paper also found that the use of data normalization, the wavelet transform, and the inclusion of clear sky index and numerical weather prediction variables are the most effective data processing techniques. Furthermore, the paper found some evidence of "cherry picking" in the reporting of errors, and we recommend that test sets be at least one year long to avoid any distortion in the performance of the models. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
13. Evaluating Probabilistic Forecasts with Bayesian Signal Detection Models.
- Author
-
Steyvers, Mark, Wallsten, Thomas S., Merkle, Edgar C., and Turner, Brandon M.
- Subjects
SIGNAL detection, RECEIVER operating characteristic curves, BAYESIAN analysis, DECISION making, STOCHASTIC information theory - Abstract
We propose the use of signal detection theory (SDT) to evaluate the performance of both probabilistic forecasting systems and individual forecasters. The main advantage of SDT is that it provides a principled way to distinguish the response from system diagnosticity, which is defined as the ability to distinguish events that occur from those that do not. There are two challenges in applying SDT to probabilistic forecasts. First, the SDT model must handle judged probabilities rather than the conventional binary decisions. Second, the model must be able to operate in the presence of sparse data generated within the context of human forecasting systems. Our approach is to specify a model of how individual forecasts are generated from underlying representations and use Bayesian inference to estimate the underlying latent parameters. Given our estimate of the underlying representations, features of the classic SDT model, such as the receiver operating characteristic (ROC) curve and the area under the ROC curve (AUC), follow immediately. We show how our approach allows ROC curves and AUCs to be applied to individuals within a group of forecasters, estimated as a function of time, and extended to measure differences in forecastability across different domains. Among the advantages of this method is that it depends only on the ordinal properties of the probabilistic forecasts. We conclude with a brief discussion of how this approach might facilitate decision making. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
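The abstract notes that the method depends only on the ordinal properties of the probabilistic forecasts. The rank-based (Mann-Whitney) AUC below illustrates that property with a hand-computable example; the paper's Bayesian estimation of latent SDT representations is not reproduced here.

```python
import numpy as np

def auc_ordinal(probs, outcomes):
    """Mann-Whitney AUC: probability that a randomly chosen occurring event received a
    higher forecast than a non-occurring one. Uses only the ordinal ranking of forecasts."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, bool)
    pos, neg = probs[outcomes], probs[~outcomes]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

probs = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1, 1, 0, 1, 0, 0]
print(auc_ordinal(probs, outcomes))   # 8 of 9 pairs correctly ordered, AUC ~ 0.889
```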
14. Explaining variance in the accuracy of prediction markets
- Author
-
Sveinung Arnesen and Oliver Strijbis, University of Zurich
- Subjects
1403 Business and International Management, prediction markets, Logarithm, Computer science, Scoring rule, 05 social sciences, Sample (statistics), Variance (accounting), Prediction market, referenda, Prediction methods, 320 Political science, 0502 economics and business, Econometrics, 10113 Institute of Political Science, 050207 economics, Business and International Management, vote shares, Focus (optics), market scoring rules, evaluating forecasts, 050205 econometrics, Event (probability theory) - Abstract
Thus far, the focus in prediction market research has been on establishing its forecast accuracy relative to that of other prediction methods, or on the investigation of a few single sources of forecast error. This article is the first attempt to overcome the narrow focus of the literature by combining observational and experimental analyses of prediction market errors. It investigates the prediction error of a real-money prediction market using a logarithmic market scoring rule for 65 direct democratic votes in Switzerland. The article distinguishes between prediction market error due to the setup of the market, features of the event to be predicted, and the participants involved, and finds that prediction market accuracy varies primarily according to the setup of the market, with the features of the event and especially the composition of the participant sample hardly mattering.
- Published
- 2019
- Full Text
- View/download PDF
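The market studied uses a logarithmic market scoring rule (LMSR). A minimal sketch of Hanson's standard cost function and its implied prices follows; the liquidity parameter b and the share quantities are illustrative.

```python
import numpy as np

def lmsr_prices(q, b=100.0):
    """Instantaneous LMSR prices: p_i = exp(q_i/b) / sum_j exp(q_j/b)."""
    e = np.exp(np.asarray(q) / b)
    return e / e.sum()

def lmsr_trade_cost(q, dq, b=100.0):
    """Cost of buying dq shares under the cost function C(q) = b*log(sum_i exp(q_i/b))."""
    C = lambda s: b * np.log(np.exp(np.asarray(s) / b).sum())
    return C(np.asarray(q) + np.asarray(dq)) - C(q)

q = [20.0, 0.0]                     # outstanding yes/no shares on a referendum outcome
print(lmsr_prices(q))               # market-implied probabilities
print(lmsr_trade_cost(q, [10, 0]))  # price of 10 more "yes" shares
```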
15. Expert forecasting with and without uncertainty quantification and weighting: What do the data say?
- Author
-
Cooke, R.M., Marti, Deniz, and Mazzuchi, Thomas
- Abstract
Post-2006 expert judgment data has been extended to 530 experts assessing 580 calibration variables from their fields. New analysis shows that point predictions as medians of combined expert distributions outperform combined medians, and medians of performance weighted combinations outperform medians of equal weighted combinations. Relative to the equal weight combination of medians, using the medians of performance weighted combinations yields a 65% improvement. Using the medians of equally weighted combinations yields a 46% improvement. The Random Expert Hypothesis underlying all performance-blind combination schemes, namely that differences in expert performance reflect random stressors and not persistent properties of the experts, is tested by randomly scrambling expert panels. Generating distributions for a full set of performance metrics, the hypotheses that the original panels’ performance measures are drawn from distributions produced by random scrambling are rejected at significance levels ranging from E−6 to E−12. Random stressors cannot produce the variations in performance seen in the original panels. In- and out-of-sample validation results are updated.
- Published
- 2020
- Full Text
- View/download PDF
16. Nowcasting Business Cycles Using Toll Data.
- Author
-
Askitas, Nikolaos and Zimmermann, Klaus F.
- Subjects
BUSINESS cycles, DATA analysis, FINANCIAL crises, ECONOMIC forecasting, TRANSPORTATION, DATA mining, MACROECONOMICS, PRODUCTION (Economic theory) - Abstract
Nowcasting has been a challenge in the recent economic crisis. We introduce the Toll Index, a new monthly indicator for business cycle forecasting, and demonstrate its relevance using German data. The index measures the monthly transportation activity performed by heavy transport vehicles across the country and has highly desirable availability properties (insignificant revisions, short publication lags) as a result of the innovative technology underlying its data collection. It is coincident with production activity due to the prevalence of just-in-time delivery. The Toll Index is a good early indicator of production as measured, for instance, by the German Production Index, provided by the German Statistical Office, which is a well-known leading indicator of the gross national product. The proposed new index is an excellent example of technological, innovation-driven economic telemetry, which we suggest should be established more around the world. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
17. Growth Rates of Global Energy Systems and Future Outlooks.
- Author
-
Höök, Mikael, Li, Junchen, Johansson, Kersti, and Snowden, Simon
- Subjects
TOTAL energy systems (On-site electric power production), ENERGY industry forecasting, POWER resources forecasting, RENEWABLE energy sources, FOSSIL fuels, DEVELOPED countries - Abstract
The world is interconnected and powered by a number of global energy systems using fossil, nuclear, or renewable energy. This study reviews historical time series of energy production and growth for various energy sources. It compiles a theoretical and empirical foundation for understanding the behaviour underlying the growth of global energy systems. The most extreme growth rates are found in fossil fuels. The presence of scaling behaviour, i.e. proportionality between growth rate and size, is established. The findings are used to investigate the consistency of several long-range scenarios expecting rapid growth for future energy systems. The validity of such projections is questioned, based on past experience. Finally, it is found that even if new energy systems undergo a rapid 'oil boom' development, i.e. even if they mimic the most extreme historical events, their contribution to global energy supply by 2050 will be marginal. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
18. Time series forecasting with a prior wavelet-based denoising step
- Author
-
Milan Bašta
- Subjects
noise, Computer science, Noise reduction, computer.software_genre, wavelets, 01 natural sciences, 010104 statistics & probability, Wavelet, lcsh:Finance, lcsh:HG1-9999, 0502 economics and business, 0101 mathematics, Time series, Physics::Atmospheric and Oceanic Physics, Series (mathematics), lcsh:Economic theory. Demography, 05 social sciences, General Medicine, automatic forecasting, Thresholding, lcsh:HB1-3840, Noise (video), Data mining, computer, 050203 business & management, evaluating forecasts - Abstract
We provide an extensive study assessing whether a prior wavelet-based denoising step enhances the forecast accuracy of standard forecasting models. Many combinations of attribute values of the thresholding (denoising) algorithm are explored together with several traditional forecasting models used in economic time series forecasting. The results are evaluated using the M3 competition yearly time series. We conclude that combining a forecasting model with the prior denoising step cannot generally be recommended, which implies that a straightforward generalisation of some of the results available in the literature (which found the denoising step to be beneficial) is not possible. Even if cross-validation is used to select the value of the threshold, a superior performance of the forecasting model with the prior denoising step does not generally follow.
- Published
- 2018
- Full Text
- View/download PDF
19. An Evaluation of Inflation Forecasts from Surveys Using Real-Time Data.
- Author
-
Croushore, Dean
- Subjects
PRICE inflation, ECONOMIC forecasting, SURVEYS, GROSS domestic product, TIME series analysis, ERRORS - Abstract
This paper carries out the task of evaluating inflation forecasts from the Livingston Survey and the Survey of Professional Forecasters, using the Real-Time Data Set for Macroeconomists as a source of real-time data. We examine the magnitude and patterns of revisions to the inflation rate based on the output price index. We then run tests on the forecasts from the surveys to see how good they are. We find that there are several episodes in which forecasters made persistent forecast errors, but the episodes are so short that by the time they can be identified, they have nearly disappeared. Thus, improving on the survey forecasts seems to be very difficult in real time, and the attempt to do so leads to increased forecast errors. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
20. A Mixed Historical Formula to forecast volatility.
- Author
-
Ferulano, Roberto
- Subjects
MARKET volatility, ECONOMIC forecasting, RANDOM walks, ARCH model (Econometrics), FIXED incomes, FOREIGN exchange, STATISTICAL smoothing, ECONOMIC statistics - Abstract
This study presents a new methodology for forecasting volatility. It relies on a weighted mean of short and long estimates of variance, based on a Moving Average framework. The quality of the predictions obtained with the proposed formula was checked with both simulated and real data. When applied to the analysis of simulated data, the new formula provides the least reliable forecast when a Random Walk is used as Data Generating Process (DGP) and the forecast variance is a simple Moving Average. This is also the case when the DGP belongs to the ARCH model family and the associated forecast formula is used. However, compared to existing approaches, the new methodology allows for the most reliable forecast on 5-day and 20-day horizons, when it is applied to Index, Fixed Income and Foreign Exchange data series. Journal of Asset Management (2009) 10, 124–136. doi:10.1057/jam.2009.2 [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
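In the spirit of the abstract, a minimal sketch of a weighted mean of short- and long-window moving-average variance estimates. The windows and the 50/50 weight are placeholder assumptions, not the paper's calibrated formula.

```python
import numpy as np

def mixed_ma_variance(returns, short=5, long=60, w=0.5):
    """Blend a short- and a long-window moving-average variance estimate.
    The weight w = 0.5 is an illustrative choice, not the paper's value."""
    r = np.asarray(returns)
    var_short = r[-short:].var()   # reactive, noisy estimate
    var_long = r[-long:].var()     # smooth, slow-moving estimate
    return w * var_short + (1 - w) * var_long

r = np.random.default_rng(5).standard_normal(250) * 0.01  # hypothetical daily returns
print("blended variance forecast:", mixed_ma_variance(r))
```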
21. Does the interaction between the accounting method choice and disclosure affect financial analysts’ information environment? The case of joint ventures under IAS 31
- Author
-
Raúl Íñiguez Sánchez, Begoña Giner Inchausti, Francisco Poveda Fuentes, Universidad de Alicante. Departamento de Economía Financiera y Contabilidad, and Contabilidad y Finanzas (CyF)
- Subjects
Information disclosure, Economics and Econometrics, Accounting, Equity method, Proportionate consolidation, Consolidation (business), Stock recommendations, 0502 economics and business, Economics, Evaluating forecasts, Stock (geology), Finance, Earnings response coefficient, 050208 finance, Earnings, Accounting method, business.industry, 05 social sciences, 050201 accounting, Information environment, Financial Economics and Accounting, Accounting information system, business, Earnings forecasting - Abstract
IAS 31 allowed firms to choose between proportionate consolidation and the equity method to record joint ventures in the consolidated accounts of the venturer. Moreover, this election implied a decision about including information in the primary financial statements or in the notes. This paper investigates whether financial analysts perceive accounting information differently depending on the method chosen, conditional on the disclosure of the required information in the notes. We analysed a sample of Spanish firms during 2005–2010. We not only considered earnings forecasts, but also examined target prices and stock recommendations. Furthermore, we look at how this accounting choice affects analysts’ information environment. Our results suggest that the choice of accounting regime does not affect the bias and accuracy of earnings forecasts, nor target prices, nor stock recommendations, whether or not firms provide information in the notes. While the proportionate method implies lower dispersion in analysts’ forecasts than the equity method, our tests do not allow us to confirm that the information environment depends on the accounting method. These results support the decision adopted in IFRS 11 to impose a single method for the accounting of joint ventures. Begoña Giner gratefully acknowledges the financial support of the Spanish Ministry of Economy and Competitiveness, under grant [ECO2013-48208-P].
- Published
- 2017
- Full Text
- View/download PDF
22. Evaluating the Performance of Forecasting Models for Portfolio Allocation Purposes with Generalized GARCH Method
- Author
-
Adel Azar, Mohsen Hamidian, Maryam Saberi, and Mohammad Norozi
- Subjects
lcsh:HF5691-5716, lcsh:Finance, lcsh:HG1-9999, Loss functions, lcsh:Business mathematics. Commercial arithmetic. Including tables, etc., Evaluating forecasts, Portfolio allocation - Abstract
Portfolio theory assumes that investors are risk averse: given two assets with equal rates of return, they choose the asset with the lower risk level. Modern portfolio theory is accepted by investors who believe that they cannot beat the market, so they hold many different types of securities in order to approach the market rate of return. One way to control investment risk is through the allocation of the portfolio shares, and there are many ways to choose the optimal portfolio shares; among these methods, in this study we use loss functions. We consider all firms that were members of the Tehran Stock Exchange from 2011 to the end of 2015. The results of this research show that the likelihood functions have the best performance in forecasting for the optimal portfolio allocation problem.
- Published
- 2017
23. FORECAST COMPARISONS.
- Author
-
Barrell, Ray, Kirby, Simon, and Metz, Robert
- Subjects
ECONOMIC indicators, ECONOMIC forecasting, PRICE inflation, BUDGET surpluses - Abstract
The article compares the National Institute of Economic and Social Research's (NIESR's) forecasts for Great Britain's economic indicators with the corresponding forecasts from the Bank of England and the Institute for Fiscal Studies (IFS). It was found that the NIESR performs better than the Bank of England and the IFS in its forecasts for output and in particular inflation where simple scores are used. It also performs well on the forecasting of the government current budget surplus. Statistical estimates of accuracy provide a less clear picture but their reliability is blighted by the small sample size.
- Published
- 2005
- Full Text
- View/download PDF
24. THE NATIONAL INSTITUTE DENSITY FORECASTS OF INFLATION.
- Author
-
Mitchell, James
- Subjects
PRICE inflation, ECONOMIC forecasting, ECONOMIC models, ECONOMIC indicators, MATHEMATICAL models - Abstract
The article focuses on the National Institute of Economic and Social Research's density forecasts of the annual inflation in Great Britain. Increased attention is now given to providing measures of uncertainty associated with forecasts. Density forecasts capture this uncertainty fully. The density forecasts are derived from a large-scale macro-econometric model. In deriving the density forecasts, normality is assumed. The variance of the density forecast is set equal to the variance of the historical forecast error. Past forecast errors are commonly used as a practical way of forecasting future errors.
- Published
- 2005
- Full Text
- View/download PDF
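The construction described here (a normal density centred on the point forecast, with variance set to the variance of historical forecast errors) is short enough to sketch directly; the numbers are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def density_forecast(point_fcst, past_errors):
    """Gaussian density forecast centred on the point forecast, with variance equal
    to the variance of historical forecast errors, as the procedure assumes."""
    sigma = np.std(past_errors, ddof=1)
    return norm(loc=point_fcst, scale=sigma)

errors = [0.4, -0.6, 0.2, 0.9, -0.3, -0.1, 0.5]  # hypothetical past forecast errors
d = density_forecast(point_fcst=2.0, past_errors=errors)
print("P(inflation > 3%):", 1 - d.cdf(3.0))
```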
25. Forecasting inbound Canadian tourism: an evaluation of Error Corrections Model forecasts.
- Author
-
Veloce, William
- Subjects
TOURISM, FORECASTING, TOURISTS, TRAVEL, REGRESSION analysis - Abstract
This paper computes and evaluates a variety of quantitative forecasts for inbound Canadian tourists, including the Error Corrections Model (ECM) and the traditional regression model forecasts. A number of forecasting methods are employed: naive to sophisticated, univariate to multivariate, time series and econometric. Forecasts for the number of inbound Canadian tourists are derived using data from four major markets: the USA, the UK, Germany and Japan. The evaluation of the forecasts is based on the Generalized Forecast Error Second Moment (GFESM) criterion developed by Clements and Hendry (1993) and the Adjusted Mean Absolute Percentage Error (AMAPE) criterion. The ECM forecasts performed best, while the traditional regression model forecasts performed poorly. In this study, using Canadian data, the development of an ECM (which entails careful analysis of the integration and co-integration properties of the variables) provides an improvement in forecast accuracy. Previous tourism studies have found less promising results concerning the performance of ECM forecasts. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
26. Expert forecasting with and without uncertainty quantification and weighting
- Author
-
Thomas A. Mazzuchi, Roger M. Cooke, and Deniz Marti
- Subjects
Median, Calibration (statistics), 05 social sciences, Ranging, Judgmental forecasting, Weighting, Scrambling, Set (abstract data type), Combining forecasts, 0502 economics and business, Statistics, Calibration, Point (geometry), Evaluating forecasts, 050207 economics, Business and International Management, Uncertainty quantification, Simulation, 050205 econometrics, Mathematics, Panel data - Abstract
Post-2006 expert judgment data has been extended to 530 experts assessing 580 calibration variables from their fields. New analysis shows that point predictions as medians of combined expert distributions outperform combined medians, and medians of performance weighted combinations outperform medians of equal weighted combinations. Relative to the equal weight combination of medians, using the medians of performance weighted combinations yields a 65% improvement. Using the medians of equally weighted combinations yields a 46% improvement. The Random Expert Hypothesis underlying all performance-blind combination schemes, namely that differences in expert performance reflect random stressors and not persistent properties of the experts, is tested by randomly scrambling expert panels. Generating distributions for a full set of performance metrics, the hypotheses that the original panels’ performance measures are drawn from distributions produced by random scrambling are rejected at significance levels ranging from E−6 to E−12. Random stressors cannot produce the variations in performance seen in the original panels. In- and out-of-sample validation results are updated.
- Published
- 2020
27. Has the forecasting performance of the Federal Reserve's Greenbooks changed over time?
- Author
-
Orman, Cüneyt, Ekşi, Ozan, and Taş, Bedri Kamil Onur
- Abstract
We investigate how the forecasting performance of the Federal Reserve Greenbooks has changed relative to commercial forecasters between 1974 and 2009. To this end, we analyze time-variation in the Greenbook coefficients in forecast encompassing regressions. Assuming that model coefficients change continuously, we estimate unobserved components models using Bayesian inference techniques. To verify that our results do not depend on the specific way change is modeled, we also allow the coefficients to change discretely rather than continuously and test for structural breaks using classical inference techniques. We find that the Greenbook forecasts have been consistently superior to the commercial forecasts at all horizons throughout our sample period. Although the forecasting performance gap has narrowed at more distant horizons after the early-to-mid 1980s, it remains significant.
- Published
- 2019
28. Has the forecasting performance of the Federal Reserve's Greenbooks changed over time?
- Author
-
Ekşi, Ozan, Taş, Bedri Kamil Onur, and Orman, Cüneyt
- Abstract
We investigate how the forecasting performance of the Federal Reserve Greenbooks has changed relative to commercial forecasters between 1974 and 2009. To this end, we analyze time-variation in the Greenbook coefficients in forecast encompassing regressions. Assuming that model coefficients change continuously, we estimate unobserved components models using Bayesian inference techniques. To verify that our results do not depend on the specific way change is modeled, we also allow the coefficients to change discretely rather than continuously and test for structural breaks using classical inference techniques. We find that the Greenbook forecasts have been consistently superior to the commercial forecasts at all horizons throughout our sample period. Although the forecasting performance gap has narrowed at more distant horizons after the early-to-mid 1980s, it remains significant.
- Published
- 2019
29. Forecasting dynamically asymmetric fluctuations of the U.S. business cycle
- Author
-
Emilio Zanetti Chini
- Subjects
unemployment, nonlinear time series, media_common.quotation_subject, statistical tests, Settore SECS-P/05 - Econometria, 01 natural sciences, Asymmetry, density forecasts, econometric modelling, evaluating forecasts, generalized logistic, industrial production, point forecasts, 010104 statistics & probability, 0502 economics and business, Econometrics, Business cycle, 0101 mathematics, Business and International Management, Logistic function, 050205 econometrics, media_common, Mathematics, Statistical hypothesis testing, Series (mathematics), 05 social sciences, Symmetry (physics), Autoregressive model, Null hypothesis - Abstract
The generalized smooth transition autoregression (GSTAR) parametrizes the joint asymmetry in the duration and length of cycles in macroeconomic time series by using particular generalizations of the logistic function. The symmetric smooth transition and linear autoregressions are nested in the GSTAR. A test for the null hypothesis of dynamic symmetry is presented. Two case studies indicate that dynamic asymmetry is a key feature of the U.S. economy. The GSTAR model beats its competitors for point forecasting, but this superiority becomes less evident for density forecasting and in uncertain forecasting environments.
- Published
- 2018
30. Has the forecasting performance of the Federal Reserve's Greenbooks changed over time?
- Author
-
Ekşi, Ozan, Taş, Bedri Kamil Onur, and Orman, Cüneyt (TOBB ETÜ, Faculty of Economics and Administrative Sciences, Department of Economics)
- Subjects
Economics and Econometrics, spf inflation forecasts, 05 social sciences, greenbook inflation forecasts, Inference, Sample (statistics), Performance gap, Bayesian inference, time-variation in coefficients, 0502 economics and business, Econometrics, Economics, 050207 economics, Physics::Atmospheric and Oceanic Physics, 050205 econometrics, evaluating forecasts - Abstract
We investigate how the forecasting performance of the Federal Reserve Greenbooks has changed relative to commercial forecasters between 1974 and 2009. To this end, we analyze time-variation in the Greenbook coefficients in forecast encompassing regressions. Assuming that model coefficients change continuously, we estimate unobserved components models using Bayesian inference techniques. To verify that our results do not depend on the specific way change is modeled, we also allow the coefficients to change discretely rather than continuously and test for structural breaks using classical inference techniques. We find that the Greenbook forecasts have been consistently superior to the commercial forecasts at all horizons throughout our sample period. Although the forecasting performance gap has narrowed at more distant horizons after the early-to-mid 1980s, it remains significant.
- Published
- 2017
31. Future changes in age and household patterns: Some implications for public finances
- Author
-
Svend E. Hougaard Jensen and Rasmus Højbjerg Jacobsen
- Subjects
Population ageing, Economic growth, media_common.quotation_subject, Public finances, Probability forecasting, Demographic forecasting, Payment, Fiscal impact, Register data, Economics, Demographic economics, Evaluating forecasts, Household composition, Business and International Management, Point forecast, Welfare, health care economics and organizations, media_common - Abstract
Using stochastic forecasting techniques, this paper assesses the consequences for public finances of changes in age and household structures in Denmark over the period 2008–2037. Focusing on components of welfare provisions and tax payments with noticeable differences across age and household status, we show that, based on a point forecast, the fiscal impact of changes in household structures amounts to an annual negative effect of 0.5% of GDP, and the effect of changes in age structures is forecast to worsen the public budget by 3.7% of GDP per year. While being subject to a considerable amount of uncertainty, the prospect of such a dramatic weakening of public finances is likely to trigger demands for welfare reforms characterized by a more individualized system of public transfer and tax payments, in addition to the measures that have already been taken to address the fiscal effects of population ageing.
- Published
- 2014
- Full Text
- View/download PDF
32. Does the interaction between the accounting method choice and disclosure affect financial analysts’ information environment? The case of joint ventures under IAS 31
- Author
-
Universidad de Alicante. Departamento de Economía Financiera y Contabilidad, Giner Inchausti, Begoña, Íñiguez Sánchez, Raúl, Poveda Fuentes, Francisco, Universidad de Alicante. Departamento de Economía Financiera y Contabilidad, Giner Inchausti, Begoña, Íñiguez Sánchez, Raúl, and Poveda Fuentes, Francisco
- Abstract
IAS 31 allowed firms to choose between proportionate consolidation and the equity method to record joint ventures in the consolidated accounts of the venturer. Moreover, this election implied a decision about including information in the primary financial statements or in the notes. This paper investigates whether financial analysts perceive accounting information differently depending on the method chosen, conditional on the disclosure of the required information in the notes. We analysed a sample of Spanish firms during 2005–2010. We not only considered earnings forecasts, but also examined target prices and stock recommendations. Furthermore, we look at how this accounting choice affects analysts’ information environment. Our results suggest that the choice of accounting regime does not affect the bias and accuracy of earnings forecasts, nor target prices, nor stock recommendations, whether or not firms provide information in the notes. While the proportionate method implies lower dispersion in analysts’ forecasts than the equity method, our tests do not allow us to confirm that the information environment depends on the accounting method. These results support the decision adopted in IFRS 11 to impose a single method for the accounting of joint ventures.
- Published
- 2017
33. Quantile Regression Estimates of Confidence Intervals for WASDE Price Forecasts
- Author
-
Isengildina-Massa, Olga, Irwin, Scott H., and Good, Darrel L.
- Subjects
lcsh:Agriculture, lcsh:S, commodity, evaluating forecasts, government forecasting, judgmental forecasting, prediction intervals, price forecasting, Crop Production/Industries, Demand and Price Analysis - Abstract
This study uses quantile regressions to estimate historical forecast error distributions for WASDE forecasts of corn, soybean, and wheat prices, and then compute confidence limits for the forecasts based on the empirical distributions. Quantile regressions with fit errors expressed as a function of forecast lead time are consistent with theoretical forecast variance expressions while avoiding assumptions of normality and optimality. Based on out-of-sample accuracy tests over 1995/96–2006/07, quantile regression methods produced intervals consistent with the target confidence level. Overall, this study demonstrates that empirical approaches may be used to construct accurate confidence intervals for WASDE corn, soybean, and wheat price forecasts.
- Published
- 2010
34. Effects of the Swiss Franc/Euro Exchange Rate Floor on the Calibration of Probability Forecasts
- Author
-
Brian D. Deaton
- Subjects
050208 finance, Calibration (statistics), 05 social sciences, Causal learning, Univariate, probability forecasting, calibration, evaluating forecasts, causality, exchange rates, vector autoregression models, Independent component analysis, Vector autoregression, Exchange rate, Autoregressive model, 0502 economics and business, Metric (mathematics), Econometrics, 050207 economics, Mathematics - Abstract
Probability forecasts of the Swiss franc/euro (CHF/EUR) exchange rate are generated before, around, and after the placement of a floor on the CHF/EUR by the Swiss National Bank (SNB). The goal is to determine whether the exchange rate floor has a positive, negative, or insignificant effect on the calibration of the probability forecasts from three time-series models: a vector autoregression (VAR) model, a VAR model augmented with the LiNGAM causal learning algorithm, and a univariate autoregressive model built on the independent components (ICs) of an independent component analysis (ICA). Score metric rankings of forecasts and plots of calibration functions are used in an attempt to identify the preferred time-series model based on forecast performance. The study not only finds evidence that the floor on the CHF/EUR has a negative impact on the forecasting performance of all three time-series models but also that the policy change by the SNB altered the causal structure underlying the six major currencies.
- Published
- 2018
- Full Text
- View/download PDF
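Calibration, the property being tested in this abstract, can be checked empirically by binning forecast probabilities and comparing each bin's mean forecast with the observed event frequency. A minimal sketch on synthetic data:

```python
import numpy as np

def calibration_table(probs, outcomes, bins=5):
    """Empirical calibration: within each forecast-probability bin, compare the mean
    forecast with the observed event frequency. Well-calibrated forecasts match."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    edges = np.linspace(0, 1, bins + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (probs >= lo) & (probs <= hi if hi == 1 else probs < hi)
        if m.any():
            print(f"[{lo:.1f},{hi:.1f}) mean forecast {probs[m].mean():.2f} "
                  f"vs observed freq {outcomes[m].mean():.2f} (n={m.sum()})")

rng = np.random.default_rng(6)
p = rng.uniform(size=500)
y = rng.uniform(size=500) < p    # outcomes drawn consistently with the forecasts
calibration_table(p, y)
```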
35. Comparing the Information Contents of IMF and OECD Macroeconomic Forecasts
- Author
-
Takagi, Shinji and Kucur, Halim
- Subjects
Macroeconomic forecasting, bootstrapping, IMF forecasts, information content of macroeconomic forecasts, herding behavior, Consensus forecasts, evaluating forecasts, OECD forecasts - Abstract
The paper compares the information contents of two of the most widely used sets of macroeconomic forecasts, namely those produced by the International Monetary Fund (IMF) and the Organization for Economic Cooperation and Development (OECD). By testing whether the release of public sector forecasts had a systematic impact on subsequent revisions in Consensus (private sector) forecasts, the paper finds that, for the period 1994–2003, private forecasters reacted to the information provided by the IMF on regions (such as non-G7 Europe and Latin America) for which the IMF's forecasts are known to be more accurate or about which information is less available. What determines the information value of public sector forecasts, however, appears complex. The OECD had only a limited influence on private sector forecasts for the G7 countries, even though its forecast performance was superior to the IMF's. Some herding behavior was observed among private sector forecasters.
- Published
- 2008
36. Selecting volatility forecasting models for portfolio allocation purposes
- Author
-
Becker, Ralf, Clements, Adam, Doolan, Mark, and Hurn, Aubrey
- Abstract
Techniques for evaluating and selecting multivariate volatility forecasts are not yet understood as well as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a set of competing forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that considering the particular application of forecasts is not necessarily the most effective basis on which to select models.
- Published
- 2015
37. Analyzing Fixed-event Forecast Revisions
- Author
-
Chang, Chia-Lin, de Bruijn, Bert, Franses, Philip Hans, and McAleer, Michael
- Subjects
Macroeconomic forecasting, E37, E27, Macroeconomics, ddc:330, Evaluating forecasts, Weak-form efficiency, Econometrics, C53, Rationality, Fixed-event forecasts, C22, Intuition - Abstract
It is common practice to evaluate fixed-event forecast revisions in macroeconomics by regressing current forecast revisions on one-period lagged forecast revisions. Under weak-form (forecast) efficiency, the correlation between the current and one-period lagged revisions should be zero. The empirical findings in the literature suggest that this null hypothesis of zero correlation is rejected frequently, where the correlation can be either positive (which is widely interpreted in the literature as “smoothing”) or negative (which is widely interpreted as “over-reacting”). We propose a methodology to interpret such non-zero correlations in a straightforward and clear manner. Our approach is based on the assumption that numerical forecasts can be decomposed into both an econometric model and random expert intuition. We show that the interpretation of the sign of the correlation between the current and one-period lagged revisions depends on the process governing intuition, and the current and lagged correlations between intuition and news (or shocks to the numerical forecasts). It follows that the estimated non-zero correlation cannot be given a direct interpretation in terms of smoothing or overreaction.
- Published
- 2013
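The "common practice" this abstract starts from (regressing current fixed-event forecast revisions on one-period lagged revisions) is a one-line regression. A minimal sketch with a hypothetical forecast sequence; the paper's decomposition into model and intuition is not implemented.

```python
import numpy as np
import statsmodels.api as sm

def revision_efficiency_test(forecasts):
    """Regress current fixed-event forecast revisions on one-period lagged revisions.
    Weak-form efficiency implies a slope of zero; a positive slope is commonly read
    as 'smoothing', a negative one as 'over-reaction' (the reading the paper questions)."""
    rev = np.diff(np.asarray(forecasts))
    res = sm.OLS(rev[1:], sm.add_constant(rev[:-1])).fit()
    return res.params[1], res.pvalues[1]

# Hypothetical sequence of forecasts for one fixed target (e.g. one year's GDP growth)
f = [2.0, 1.8, 1.7, 1.5, 1.45, 1.3, 1.25, 1.1, 1.0, 0.95, 0.9, 0.85]
slope, pval = revision_efficiency_test(f)
print(f"slope={slope:.2f}, p-value={pval:.3f}")
```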
38. Factors affecting time series forecasting accuracy: simulation & analysis
- Author
-
Petropoulos, Fotios M.
- Subjects
Regression analysis, Automatic method selection, Forecasting accuracy, Aggregation, Simulation, Factors affecting accuracy, Evaluating forecasts - Abstract
The main objective of this doctoral thesis is the analysis of the factors that affect the statistical accuracy of extrapolation techniques. The study focuses on fast-demand time series, which can be described as a function of four basic components: seasonality, trend, cycle and randomness. The literature review conducted for this thesis covers time series analysis in full: the main qualitative characteristics are studied, and widely used methods for decomposition, statistical analysis and transformation of the original data are recorded. Furthermore, the literature on well-established and modern forecasting methodologies is studied in depth, for both fast and intermittent demand. Finally, a comprehensive review of the factors affecting forecasts and the forecasting procedure is developed, distinguishing three main categories: statistical, judgmental and psychological factors. The research on transforming the original data led to the definition of a distinctive forecasting methodology, the ADIDA framework, an aggregation-disaggregation technique that gathers data at lower frequencies so as to reduce dramatically the presence of zero values. The aggregation of the data is followed by extrapolation at the aggregated level and, finally, separation of the aggregated point forecast into forecasts of higher frequency. This approach was applied to real data sets of both fast and intermittent demand, with very promising results, serving as a "self-improvement" mechanism for forecasting methods. The analysis and examination of the factors affecting statistical predictions was achieved through an extensive experimental simulation process. The thesis describes the selection of the levels for each examined factor and the generation procedure of an extremely large simulated data set (about 80 million series), which formed the basis for the main part of the research. Statistical analysis by multiple regression verifies the main hypothesis and quantifies the level of influence of the examined factors. Finally, a method selection framework is proposed, based on the analysis of series characteristics and the results of this study.
- Published
- 2012
- Full Text
- View/download PDF
39. Fitting and Forecasting Sovereign Defaults Using Multiple Risk Signals
- Author
-
Marika Vezzoli and Roberto Savona
- Subjects
Statistics and Probability ,Economics and Econometrics ,media_common.quotation_subject ,Logit ,jel:C23 ,jel:H63 ,jel:G01 ,01 natural sciences ,010104 statistics & probability ,Debt ,0502 economics and business ,Econometrics ,Economics ,050207 economics ,Predictability ,0101 mathematics ,Emerging markets ,media_common ,050208 finance ,05 social sciences ,jel:C14 ,Interest rate ,Real gross domestic product ,Data mining ,Evaluating forecasts ,Model selection ,Panel data ,Probability forecasting ,Default ,Statistics, Probability and Uncertainty ,Social Sciences (miscellaneous) ,European debt crisis - Abstract
In this paper we confront the fitting versus forecasting paradox, with the objective of building an optimal Early Warning System to better describe past sovereign defaults and predict future ones. We do this by proposing a new Regression Tree-based model that signals a potential crisis whenever preselected indicators exceed specific thresholds. Using data on 66 emerging markets over the period 1975-2002, our model provides an accurate description of past data, although not the best description relative to existing competing models (Logit, Stepwise logit, Noise-to-Signal Ratio and Regression Trees), and produces the best forecasts, accommodating different risk-aversion targets. By modulating in- and out-of-sample model accuracy, our methodology leads to unambiguous empirical results: we find that illiquidity (the short-term debt to reserves ratio), insolvency (reserve growth) and contagion risks act as the main determinants/predictors of past/future debt crises.
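To make the signalling logic concrete, here is a minimal sketch of threshold-based crisis signalling in Python. It is not the authors' Regression Tree model: the single-indicator grid search, the loss weights (which play the role of the risk-aversion targets), and the toy data are illustrative assumptions.

```python
import numpy as np

def best_threshold(indicator, crisis, weight_miss=1.0, weight_false=1.0):
    """Pick the cutoff that minimizes a weighted sum of missed crises
    and false alarms -- the signalling logic behind threshold-based
    Early Warning Systems (a stand-in for the paper's tree splits)."""
    best_c, best_loss = None, np.inf
    for c in np.unique(indicator):
        signal = indicator >= c
        misses = np.sum(crisis & ~signal)        # crisis, no warning
        false_alarms = np.sum(~crisis & signal)  # warning, no crisis
        loss = weight_miss * misses + weight_false * false_alarms
        if loss < best_loss:
            best_c, best_loss = c, loss
    return best_c

# toy data: short-term debt / reserves as an illiquidity indicator
ind = np.array([0.4, 1.2, 0.8, 2.1, 0.3, 1.9])
crisis = np.array([False, True, False, True, False, True])
print(best_threshold(ind, crisis))
```

Raising `weight_miss` relative to `weight_false` mimics a more risk-averse user who prefers false alarms over missed crises.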
- Published
- 2012
40. Beating the random walk in Central and Eastern Europe by survey forecasts
- Author
-
Naszódi, Anna
- Subjects
Slowakei ,Zinsstruktur ,F36 ,G13 ,Polen ,Tschechische Republik ,exchange rate ,Rumänien ,time-varying parameter ,Wechselkurs ,survey forecast ,Random Walk ,ddc:330 ,Ungarn ,term-structure of forecasts ,Prognoseverfahren ,F31 ,evaluating forecasts ,Schätzung - Abstract
This paper investigates the forecasting ability of survey data on exchange rate expectations with multiple forecast horizons. The survey forecasts cover the exchange rates of five Central and Eastern European currencies: the Czech Koruna, Hungarian Forint, Polish Zloty, Romanian Leu and Slovak Koruna. First, different term-structure models are fitted to the survey forecasts. Then the forecasting performances of the fitted forecasts are compared. The fitted forecasts for horizons of five months and beyond prove to be significantly better than the random walk on the pooled data of the five currencies. The best-performing term-structure model is the one that assumes an exponential relationship between the forecast and the forecast horizon and has time-varying parameters.
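As an illustration of fitting a term-structure model to survey forecasts, the sketch below assumes one plausible exponential form, f(h) = long_run + (spot − long_run)·exp(−decay·h), with constant rather than time-varying parameters. The functional form, parameter names, and toy data are assumptions, not the paper's exact specification.

```python
import numpy as np
from scipy.optimize import curve_fit

def term_structure(h, spot, long_run, decay):
    """Assumed exponential term structure: forecasts start at the
    spot rate and decay toward a long-run level as horizon h grows."""
    return long_run + (spot - long_run) * np.exp(-decay * h)

horizons = np.array([1.0, 3.0, 6.0, 12.0, 24.0])        # months ahead
survey = np.array([250.0, 248.5, 246.0, 243.0, 241.5])  # e.g. forint per euro

params, _ = curve_fit(term_structure, horizons, survey,
                      p0=(survey[0], survey[-1], 0.1))
print(dict(zip(("spot", "long_run", "decay"), np.round(params, 3))))
```

Once fitted, the same curve yields a forecast for any horizon, which is what allows forecasts at sparse survey horizons to be compared against the random walk at every horizon of interest.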
- Published
- 2011
41. Nowcasting business cycles using toll data
- Author
-
Askitas, Nikos and Zimmermann, Klaus F.
- Subjects
transportation ,Straßengüterverkehr ,E37 ,telemetry ,macroeconomic forecasting ,data mining ,nowcasting ,Straßenbenutzungsgebühr ,Konjunkturindikator ,C82 ,new products ,business cycles ,ddc:330 ,Statistik ,L92 ,E01 ,production forecasting ,Konjunkturprognose ,Prognoseverfahren ,Deutschland ,Gesamtwirtschaftliche Produktion ,E32 ,evaluating forecasts - Abstract
Nowcasting has been a challenge in the recent economic crisis. We introduce the Toll Index, a new monthly indicator for business cycle forecasting, and demonstrate its relevance using German data. The index measures the monthly transportation activity performed by heavy transport vehicles across the country and has highly desirable availability properties (insignificant revisions, short publication lags) as a result of the innovative technology underlying its data collection. It is coincident with production activity due to the prevalence of just-in-time delivery. The Toll Index is a good early indicator of production as measured, for instance, by the German Production Index provided by the German Statistical Office, itself a well-known leading indicator of Gross National Product. The new index is an excellent example of technological, innovation-driven economic telemetry, which we suggest should be adopted more widely around the world.
- Published
- 2011
42. Analyzing Fixed-event Forecast Revisions
- Author
-
Philip Hans Franses, Chia-Lin Chang, and Michael McAleer
- Subjects
jel:E27 ,jel:C53 ,Evaluating forecasts ,Macroeconomic forecasting ,Rationality ,Intuition ,Weak-form efficiency ,Fixed-event forecasts ,Evaluating forecasts, Macroeconomic forecasting, Rationality, Intuition, Weak-form efficiency, Fixed-event forecasts ,jel:C22 ,jel:E37 - Abstract
It is common practice to evaluate fixed-event forecast revisions in macroeconomics by regressing current revisions on one-period lagged revisions. Under weak-form efficiency, the correlation between the current and one-period lagged revisions should be zero. The empirical findings in the literature suggest that this null hypothesis of zero correlation is rejected quite frequently, and the correlation can be either positive or negative. In this paper we propose a methodology for interpreting such non-zero correlations in a straightforward manner. Our approach is based on the assumption that forecasts can be decomposed into an econometric model and expert intuition. The interpretation of the sign of the correlation between the current and one-period lagged revisions depends on the process governing intuition and on the correlation between intuition and news.
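The weak-form efficiency test described here is a one-equation regression, sketched below with plain least squares on illustrative revision data; the data values are assumptions made for the example.

```python
import numpy as np

def revision_efficiency_test(revisions):
    """Regress current fixed-event revisions on one-period lagged
    revisions; under weak-form efficiency the slope should be zero."""
    r_now, r_lag = revisions[1:], revisions[:-1]
    X = np.column_stack([np.ones_like(r_lag), r_lag])
    beta, *_ = np.linalg.lstsq(X, r_now, rcond=None)
    resid = r_now - X @ beta
    sigma2 = (resid @ resid) / (len(r_now) - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1], beta[1] / se          # slope and its t-statistic

# illustrative sequence of revisions to one fixed-event forecast
rev = np.array([0.30, 0.20, 0.25, 0.10, 0.15, 0.05, 0.10, -0.05])
slope, t_stat = revision_efficiency_test(rev)
print(f"slope={slope:.3f}, t={t_stat:.2f}")
```

A significantly positive slope is the pattern usually read as forecast smoothing, a significantly negative one as over-reaction; the paper's contribution is a framework for interpreting either sign.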
- Published
- 2011
43. Role thinking: standing in other people's shoes to forecast decisions in conflicts
- Author
-
J. Scott Armstrong, Kesten C. Green, Green, Kesten C, and Armstrong, J Scott
- Subjects
role playing ,Management science ,perspective-taking ,media_common.quotation_subject ,Applied psychology ,combining forecasts ,Group decision-making ,Social group ,Negotiation ,Order (business) ,Organizational behavior ,Perspective-taking ,organizational behavior ,group decision making ,unaided judgment ,simulated interaction ,Sanctions ,Business and International Management ,Psychology ,Simulation methods ,media_common ,evaluating forecasts ,expert judgment - Abstract
When forecasting decisions in conflict situations, experts are often advised to figuratively stand in the other person's shoes. We refer to this as "role thinking" because, in practice, the advice is to think about how other protagonists will view the situation in order to predict their decisions. We tested the effect of role thinking on forecast accuracy. We obtained 101 role-thinking forecasts of the decisions that would be made in nine diverse conflicts from 27 Naval postgraduate students (experts) and 107 role-thinking forecasts from 103 second-year organizational behavior students (novices). The accuracy of the novices' forecasts was 33% and that of the experts' was 31%; both were little different from chance (guessing), which was 28%. The small improvement in accuracy from role thinking strengthens the finding from earlier research that it is not sufficient to think hard about a situation in order to predict the decisions that groups of people will make when they are in conflict. Instead, it is useful to ask groups of role players to simulate the situation. When groups of novice participants adopted the roles of protagonists in the aforementioned nine conflicts and interacted with each other, their group decisions predicted the actual decisions with an accuracy of 60%.
- Published
- 2011
44. New methods for forecasting inflation, applied to the US
- Author
-
Aron, Janine and Muellbauer, John
- Subjects
jel:C52 ,jel:C53 ,jel:C51 ,Error Correction Models ,Evaluating Forecasts ,Model Selection ,Multivariate Time Series ,jel:E52 ,jel:C22 ,jel:E31 ,jel:E37 - Abstract
Models for the twelve-month-ahead US rate of inflation, measured by the chain-weighted consumer expenditure deflator, are estimated for 1974-99, and their subsequent pseudo out-of-sample forecasting performance is examined. Alternative forecasting approaches for different information sets are compared with benchmark univariate autoregressive models, and substantial outperformance is demonstrated. Three key ingredients of the outperformance are: including equilibrium correction terms in relative prices; introducing non-linearities to proxy state dependence in the inflation process; and replacing the information criterion, commonly used in VARs to select lag length, with a 'parsimonious longer lags' (PLL) parameterisation. Forecast pooling or averaging also improves forecast performance.
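The 'parsimonious longer lags' idea can be sketched as replacing one free coefficient per lag with one coefficient per block of averaged lags. The block boundaries and simulated data below are illustrative assumptions on my part, not the authors' exact parameterisation.

```python
import numpy as np

def pll_regressors(y, blocks=((1,), (2, 3, 4), (5, 6, 7, 8))):
    """'Parsimonious longer lags' sketch: instead of estimating one
    coefficient per lag, average the lags within each block so the
    regression estimates one coefficient per block."""
    max_lag = max(max(b) for b in blocks)
    X = np.array([[np.mean([y[t - l] for l in b]) for b in blocks]
                  for t in range(max_lag, len(y))])
    return X, y[max_lag:]

infl = np.cumsum(np.random.default_rng(0).normal(0.0, 0.2, 60))
X, target = pll_regressors(infl)
print(X.shape, target.shape)   # 3 block regressors instead of 8 free lags
```

The appeal over information-criterion lag selection is that long lags stay in the information set while the parameter count, and hence estimation noise, is kept small.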
- Published
- 2010
45. Does aggregating forecasts by CPI component improve inflation forecast accuracy in South Africa?
- Author
-
Aron, Janine and Muellbauer, John
- Subjects
jel:C52 ,jel:C53 ,jel:C51 ,CPI Sub-Components ,Disaggregation ,Error Correction Models ,Evaluating Forecasts ,Model Selection ,Multivariate Time Series ,Sectoral Inflation ,jel:C32 ,jel:E52 ,jel:C22 ,jel:E31 - Abstract
Inflation is a far from homogeneous phenomenon, a fact often neglected in modelling consumer price inflation. This study, the first of its kind for an emerging market country, investigates the gains in inflation forecast accuracy from aggregating weighted forecasts of the sub-component price indices, versus forecasting the aggregate consumer price index itself. Rich multivariate equilibrium correction models employ general and sectoral information for ten sub-components, taking account of structural breaks and institutional changes. Model selection is over 1979-2003, with pseudo out-of-sample forecasts, four quarters ahead, generated to 2007. Aggregating the weighted forecasts of the sub-components does outperform the aggregate CPI forecasts, and also offers substantial gains over forecasting with benchmark naïve models. The analysis also contributes an improved understanding of sectoral inflationary pressures. This forecasting method should be more robust to the regular reweighting of the CPI index.
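The aggregation step itself is just a weight-sum of component forecasts. A minimal sketch, with hypothetical CPI weights and component forecasts (the five-component breakdown is an assumption for the example, not the study's ten actual sub-components):

```python
# hypothetical CPI weights and four-quarter-ahead component forecasts
weights = {"food": 0.25, "housing": 0.22, "transport": 0.18,
           "services": 0.20, "other": 0.15}
component_forecast = {"food": 0.062, "housing": 0.048, "transport": 0.071,
                      "services": 0.055, "other": 0.050}

# aggregate inflation forecast = weighted sum of the component forecasts
cpi_forecast = sum(weights[k] * component_forecast[k] for k in weights)
print(f"aggregated CPI inflation forecast: {cpi_forecast:.3%}")
```

Because the weights enter only at this final step, refitting after a CPI reweighting only requires new weights, not new component models, which is the robustness argument in the abstract.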
- Published
- 2010
46. Forecast revisions of Mexican inflation and GDP growth
- Author
-
Capistrán, Carlos and López-Moctezuma, Gabriel
- Subjects
E37 ,Prognose ,Surveys ,Inflation ,C83 ,Macroeconomic forecasting ,Inflation forecasting ,Mexiko ,ddc:330 ,Evaluating forecasts ,C53 ,Wirtschaftsprognose ,C23 ,Panel data ,Sozialprodukt - Abstract
We analyze forecasts of inflation and GDP growth contained in Banco de México's Survey of Professional Forecasters for the period 1995-2009. The forecasts are for the current and the following year, comprising an unbalanced three-dimensional panel with multiple individual forecasters, target years, and forecast horizons. The fixed-event nature of the forecasts enables us to examine efficiency by looking at the revision process. The panel structure allows us to control for aggregate shocks and to construct a measure of the news that impacted expectations in the period under study. The results suggest that respondents rely on their previous forecasts for longer than appears optimal, and that they do not use past information efficiently. In turn, this means there is room to improve the accuracy of the forecasts, for instance by taking into account the positive autocorrelation found in forecast revisions.
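Constructing fixed-event revisions from such a panel amounts to differencing each forecaster's forecasts for the same target year across shrinking horizons. A minimal pandas sketch, with hypothetical column names and data:

```python
import pandas as pd

# hypothetical long-format survey panel: one row per forecaster,
# target year and forecast horizon (months before year-end)
panel = pd.DataFrame({
    "forecaster": ["A"] * 4 + ["B"] * 4,
    "target":     [2008] * 8,
    "horizon":    [12, 9, 6, 3] * 2,
    "forecast":   [3.8, 3.6, 3.5, 3.5, 4.1, 3.9, 3.6, 3.4],
})

# fixed-event revision: change in the forecast for the *same* target
# as the horizon shrinks, computed within each forecaster
panel = panel.sort_values(["forecaster", "target", "horizon"],
                          ascending=[True, True, False])
panel["revision"] = panel.groupby(["forecaster", "target"])["forecast"].diff()
print(panel)
```

The resulting revision column is what efficiency tests of the kind described above are run on, pooled across forecasters and target years.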
- Published
- 2010
47. A hierarchical procedure for the combination of forecasts
- Author
-
Mauro Costantini and Carmine Pappalardo
- Subjects
Industrial production ,Model selection ,Econometric models ,Variable (computer science) ,Econometric model ,Combining forecasts ,Evaluating forecasts ,Robustness (computer science) ,Econometrics ,Economics ,Range (statistics) ,Business and International Management ,Consensus forecast ,Technology forecasting - Abstract
This paper proposes a strategy to increase the efficiency of forecast combination. Given the availability of a wide range of forecasts for the same variable of interest, our goal is to apply combining methods to a restricted set of models. With this aim, a hierarchical procedure based on an encompassing test is considered. First, forecasting models are ranked according to a measure of predictive accuracy (RMSFE). The models are then selected for combination such that each forecast is not encompassed by any of the competing forecasts. Thus the hierarchical procedure represents a compromise between model selection and model averaging. The robustness of the procedure is investigated in terms of the relative RMSFE using ISAE (Institute for Studies and Economic Analyses) short-term forecasting models for monthly industrial production in Italy.
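A minimal sketch of the selection logic follows: rank forecasts by RMSFE, keep a candidate only if no already-kept forecast encompasses it, then combine the survivors. The regression-based encompassing check (no intercept, fixed 1.96 cutoff) and the equal-weight combination are simplifying assumptions, not the authors' exact test.

```python
import numpy as np

def encompasses(f_keep, f_cand, actual, t_crit=1.96):
    """Regression-based encompassing check (no intercept, for brevity):
    regress the kept model's errors on (candidate - kept). An
    insignificant slope means the candidate adds no information."""
    e = actual - f_keep
    d = f_cand - f_keep
    if not np.any(d):                     # identical forecasts add nothing
        return True
    slope = (d @ e) / (d @ d)
    resid = e - slope * d
    se = np.sqrt((resid @ resid) / (len(e) - 1) / (d @ d))
    return abs(slope / se) < t_crit

def hierarchical_select(forecasts, actual):
    """Rank models by RMSFE, keep each forecast only if none of the
    already-kept forecasts encompasses it, then combine the kept
    forecasts with equal weights."""
    rmsfe = [np.sqrt(np.mean((f - actual) ** 2)) for f in forecasts]
    order = np.argsort(rmsfe)             # best model first
    kept = [forecasts[order[0]]]
    for i in order[1:]:
        if not any(encompasses(k, forecasts[i], actual) for k in kept):
            kept.append(forecasts[i])
    return np.mean(kept, axis=0)

rng = np.random.default_rng(1)
actual = rng.normal(size=60)
models = [actual + rng.normal(0, s, 60) for s in (0.5, 0.7, 0.9)]
print(hierarchical_select(models, actual)[:5])
```

Starting from the most accurate model and discarding encompassed candidates is what makes the procedure a compromise: pure model selection would keep only the first forecast, pure averaging would keep them all.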
- Published
- 2010
48. Accuracy, unbiasedness and efficiency of professional macroeconomic forecasts: An empirical comparison for the G7
- Author
-
Dovern, Jonas and Weisser, Johannes
- Subjects
Fixed-Event Forecasts ,Survey Data ,E37 ,G-7-Staaten ,Macroeconomic Forecasting ,Rationality ,Sachverständige ,Rationalismus ,Bias ,ddc:330 ,Bewertung ,C25 ,Evaluating forecasts ,Konjunkturprognose ,E32 - Abstract
In this paper, we use survey data to analyze the accuracy, unbiasedness, and efficiency of professional macroeconomic forecasts. We analyze a large panel of individual forecasts that has not previously been analyzed in the literature. We provide evidence on the properties of forecasts for all G7 countries and for four different macroeconomic variables. Our results show a high degree of dispersion of forecast accuracy across forecasters. We also find that there are large differences in the performance of forecasters not only across countries but also across different macroeconomic variables. In general, forecasts tend to be biased in situations where forecasters have to respond to large structural shocks or gradual changes in the trend of a variable. Furthermore, while a sizable fraction of forecasters seem to smooth their GDP forecasts significantly, this does not apply to forecasts made for other macroeconomic variables.
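The simplest of the properties examined here, unbiasedness, can be illustrated with a t-test that the mean forecast error is zero. This is a stand-in for the paper's formal tests, and the data are illustrative.

```python
import numpy as np

def bias_test(actual, forecast):
    """t-test that the mean forecast error is zero: a simple
    unbiasedness check on a single forecaster's track record."""
    e = actual - forecast
    t = e.mean() / (e.std(ddof=1) / np.sqrt(len(e)))
    return e.mean(), t

# illustrative GDP growth outcomes and one forecaster's predictions
actual = np.array([2.1, 1.8, 2.5, 0.3, -0.8, 1.2, 2.0, 1.7])
forecast = np.array([2.4, 2.0, 2.3, 1.1, 0.5, 1.5, 2.2, 1.9])
mean_error, t_stat = bias_test(actual, forecast)
print(f"mean error={mean_error:.2f}, t={t_stat:.2f}")  # negative: over-prediction
```

A persistently negative mean error of this kind is the signature of the bias the authors find around large shocks, when outcomes repeatedly fall short of forecasts.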
- Published
- 2009
49. A hierarchical procedure for the combination of forecasts
- Author
-
Costantini, Mauro and Pappalardo, Carmine
- Subjects
models selection ,Welt ,econometric models ,Modellierung ,Ökonometrisches Modell ,combining forecasts ,ddc:330 ,Zeitreihenanalyse ,time series ,Prognoseverfahren ,C53 ,C32 ,evaluating forecasts ,Schätzung - Abstract
This paper proposes a strategy to increase the efficiency of forecast combination. Given the availability of a wide range of forecasts for the same variable of interest, our goal is to apply combining methods to a restricted set of models. To this end, a hierarchical procedure based on an encompassing test is developed. First, forecasting models are ranked according to a measure of predictive accuracy (RMSFE). The models are then selected for combination such that each forecast is not encompassed by any of the competing forecasts. Thus, the procedure aims to unite model selection and model averaging methods. The robustness of the procedure is investigated in terms of the relative RMSFE using ISAE (Institute for Studies and Economic Analyses) short-term forecasting models for monthly industrial production in Italy.
- Published
- 2009
50. Analyzing Fixed-Event Forecast Revisions
- Author
-
Chang, C-L. (Chia-Lin), Bruijn, L.P. (Bert) de, Franses, Ph.H.B.F. (Philip Hans), McAleer, M.J. (Michael), Chang, C-L. (Chia-Lin), Bruijn, L.P. (Bert) de, Franses, Ph.H.B.F. (Philip Hans), and McAleer, M.J. (Michael)
- Abstract
It is common practice to evaluate fixed-event forecast revisions in macroeconomics by regressing current forecast revisions on one-period lagged forecast revisions. Under weak-form (forecast) efficiency, the correlation between the current and one-period lagged revisions should be zero. The empirical findings in the literature suggest that this null hypothesis of zero correlation is rejected frequently, where the correlation can be either positive (widely interpreted in the literature as "smoothing") or negative (widely interpreted as "over-reacting"). We propose a methodology to interpret such non-zero correlations in a straightforward manner.
- Published
- 2013