9 results on "Hecq, Alain"
Search Results
2. Forecasting Mixed Frequency Time Series with ECM-MIDAS Models
- Author
-
Götz, Thomas, Hecq, Alain, and Urbain, Jean-Pierre
- Subjects
econometrics
- Abstract
This paper proposes a mixed-frequency error-correction model in order to develop a regression approach for non-stationary variables sampled at different frequencies that are possibly cointegrated. We show that, at the model representation level, the choice of the timing between the low-frequency dependent and the high-frequency explanatory variables to be included in the long run has an impact on the remaining dynamics and on the forecasting properties. Then, we compare in a set of Monte Carlo experiments the forecasting performance of the low-frequency aggregated model and several mixed-frequency regressions. In particular, we look at both the unrestricted mixed-frequency model and at a more parsimonious MIDAS regression. Whilst the existing literature has only investigated the potential improvements of the MIDAS framework for stationary time series, our study emphasizes the need to include the relevant cointegrating vectors in the non-stationary case. Furthermore, it is illustrated that the exact timing of the long-run relationship does not matter as long as the short-run dynamics are adapted according to the composition of the disequilibrium error. Finally, the unrestricted model is shown to suffer from parameter proliferation for small sample sizes, whereas MIDAS forecasts are robust to over-parameterization. Hence, the data-driven, low-dimensional and flexible weighting structure makes MIDAS a robust and parsimonious method to follow when the true underlying DGP is unknown while still exploiting information present in the high-frequency data. An empirical application illustrates the theoretical and the Monte Carlo results. A minimal code sketch of the two-step idea follows this entry.
- Published
- 2012
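To make the two-step idea concrete, here is a minimal sketch: estimate a long-run relation between a quarterly variable and the end-of-quarter value of a monthly one, then fit an ECM-MIDAS-style equation with exponential Almon weights by nonlinear least squares. The simulated data, variable names, and the particular weighting scheme are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T, m = 200, 3                                  # quarters; months per quarter

# Simulate a cointegrated pair: monthly x is a random walk, quarterly y
# tracks the end-of-quarter value of x plus stationary noise (assumed DGP).
x_m = np.cumsum(rng.normal(size=T * m))
x_q = x_m[m - 1::m]                            # end-of-quarter sampling
y = x_q + rng.normal(scale=0.5, size=T)

# Step 1: long-run relation and the disequilibrium (error-correction) term.
beta = np.polyfit(x_q, y, 1)[0]
ecm = y - beta * x_q

# Step 2: MIDAS regression of dy on the lagged ECM term and the m monthly
# changes of x within each quarter, tied together by exponential Almon weights.
dy = np.diff(y)
dX = np.diff(x_m)
X_hf = np.array([dX[t * m - 1 : t * m + m - 1] for t in range(1, T)])

def almon(th1, th2):
    j = np.arange(1, m + 1)
    w = np.exp(th1 * j + th2 * j ** 2)
    return w / w.sum()                         # weights sum to one

def ssr(p):
    c, rho, g, th1, th2 = p
    resid = dy - (c + rho * ecm[:-1] + g * X_hf @ almon(th1, th2))
    return resid @ resid

est = minimize(ssr, x0=[0.0, -0.5, 1.0, 0.1, -0.05], method="Nelder-Mead")
c, rho, g, th1, th2 = est.x
print("loading on the ECM term:", round(rho, 3))
print("estimated high-frequency weights:", almon(th1, th2).round(3))
```

The parsimony the abstract stresses is visible here: the m high-frequency coefficients collapse into two Almon parameters regardless of how fine the sampling is.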
3. Do Seasonal Adjustments Induce Noncausal Dynamics in Inflation Rates?
- Author
-
Hecq, Alain, Telg, Sean, and Lieb, Lenard
- Subjects
INFLATION forecasting, PRICE inflation, AUTOREGRESSIVE models, FALSE precision (Statistics), ECONOMETRICS
- Abstract
This paper investigates the effect of seasonal adjustment filters on the identification of mixed causal-noncausal autoregressive models. By means of Monte Carlo simulations, we find that standard seasonal filters induce spurious autoregressive dynamics on white noise series, a phenomenon already documented in the literature. Using a symmetric argument, we show that those filters also generate a spurious noncausal component in the seasonally adjusted series, but preserve (although amplify) causal and noncausal relationships that are already present. This result has important implications for modelling economic time series driven by expectation relationships. We consider inflation data on the G7 countries to illustrate these results. [ABSTRACT FROM AUTHOR] A minimal simulation sketch of the filtering effect follows this entry.
- Published
- 2017
- Full Text
- View/download PDF
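The spurious-dynamics starting point of the paper can be reproduced in a few lines. The sketch below applies a simple linear filter (a centred 13-term moving average, used here as a crude stand-in for an actual X-11 seasonal filter, which is an assumption for brevity) to pure white noise and shows that the "adjusted" series acquires nonzero autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_rep, n_obs, s = 1000, 400, 12        # replications, sample size, period

# Centred 13-term moving average (half weights at the ends), a classic
# building block of seasonal adjustment for monthly data.
w = np.ones(s + 1)
w[0] = w[-1] = 0.5
w /= s

acf1 = []
for _ in range(n_rep):
    e = rng.normal(size=n_obs)         # white noise: no dynamics at all
    ma = np.convolve(e, w, mode="same")
    adj = e - ma                       # the "adjusted" series
    a = adj - adj.mean()
    acf1.append((a[1:] @ a[:-1]) / (a @ a))

print("mean lag-1 autocorrelation after filtering:", round(np.mean(acf1), 3))
# For raw white noise this is ~0; after filtering it is clearly nonzero,
# i.e. the linear filter alone has created autoregressive-looking dynamics.
```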
4. SEPARATION, WEAK EXOGENEITY, AND P-T DECOMPOSITION IN COINTEGRATED VAR SYSTEMS WITH COMMON FEATURES.
- Author
-
Hecq, Alain, Palm, Franz C., and Urbain, Jean-Pierre
- Subjects
TIME series analysis, EXOGENEITY (Econometrics), MATHEMATICAL decomposition, ECONOMETRICS, HIGH technology industries
- Abstract
The aim of this paper is to study the concept of separability in multiple nonstationary time series displaying both common stochastic trends and common stochastic cycles. When modeling the dynamics of multiple time series for a panel of several entities such as countries, sectors, or firms, imposing some form of separability and commonality is often required to restrict the dimension of the parameter space. For this purpose we introduce the concept of common feature separation and investigate the relationships between separation in cointegration and separation in serial correlation common features. Loosely speaking, we investigate whether a set of time series can be partitioned into subsets such that there are serial correlation common features within the subgroups only. The paper investigates three issues. First, it provides conditions for separating joint cointegrating vectors into marginal cointegrating vectors as well as separating joint short-term dynamics into marginal short-term dynamics. Second, conditions for making permanent-transitory decompositions based on marginal systems are given. Third, issues of weak exogeneity are considered. Likelihood ratio type tests for the different hypotheses under study are proposed. An empirical analysis of the link between economic fluctuations in the United States and Canada shows the practical relevance of the approach proposed in this paper. [ABSTRACT FROM AUTHOR]
- Published
- 2002
- Full Text
- View/download PDF
5. PERMANENT-TRANSITORY DECOMPOSITION IN VAR MODELS WITH COINTEGRATION AND COMMON CYCLES.
- Author
-
Hecq, Alain, Palm, Franz C., and Urbain, Jean-Pierre
- Subjects
TIME series analysis, MATHEMATICAL decomposition, COINTEGRATION, ECONOMETRICS, DECOMPOSITION method, PROBABILITY theory, MATHEMATICAL statistics
- Abstract
The purpose of this paper is to derive permanent-transitory decompositions of nonstationary multiple time series generated by a finite order Gaussian VAR(p) model with both cointegration and serial correlation common features. For cointegrated processes, several permanent-transitory decompositions have been extensively used in empirical and theoretical analyses. These include the multivariate extension of the Beveridge-Nelson decomposition proposed by J.H. Stock and M.W. Watson, and the observable permanent-transitory decomposition of J. Gonzalo and C.W.J. Granger, where the components are identified as being combinations of the observable series. The paper proposes a permanent-transitory decomposition which satisfies three criteria. Firstly, the decomposition studied in the paper is expressed in terms of observable variables and only involves quantities already available from the vector error correction model and the estimation of common features and cointegrating vectors. Secondly, the decomposition takes into account cointegrating and common cyclical feature restrictions. Thirdly, this decomposition should not only be a permanent-transitory decomposition but also a common trend-common cycle decomposition. A minimal sketch of such an observable decomposition follows this entry.
- Published
- 2000
- Full Text
- View/download PDF
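To make the "observable" aspect concrete, here is a minimal bivariate sketch in the Gonzalo-Granger spirit the abstract refers to: the transitory component is the cointegrating combination beta'y and the permanent component is alpha_perp'y, with the adjustment vector alpha backed out from a simple regression. The data-generating process and the plug-in estimators below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
tau = np.cumsum(rng.normal(size=T))            # common stochastic trend
c = 0.8 * np.sin(np.arange(T) / 5) + rng.normal(scale=0.2, size=T)  # common cycle
y = np.column_stack([tau + c, tau - 0.5 * c])  # two cointegrated series

beta = np.array([1.0, -1.0])                   # cointegrating vector (known here)
z = y @ beta                                   # stationary disequilibrium

# Back out the adjustment vector alpha by regressing each first difference
# on the lagged disequilibrium (in practice it comes from the VECM).
dy = np.diff(y, axis=0)
alpha = np.array([np.polyfit(z[:-1], dy[:, i], 1)[0] for i in range(2)])

alpha_perp = np.array([-alpha[1], alpha[0]])   # orthogonal complement in R^2
permanent = y @ alpha_perp                     # observable common-trend part
transitory = z                                 # observable transitory part

# Sign and scale of the components are not identified, hence the |corr|.
print("|corr(permanent, true trend)|:",
      round(abs(np.corrcoef(permanent, tau)[0, 1]), 3))
```

Both components are linear combinations of the observed series, which is exactly the property the decomposition in the paper is designed to keep.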
6. Modelling and forecasting economic time series with mixed causal-noncausal models
- Author
-
Voisin, Elisa Marie, Hecq, Alain, Wilms, Ines, RS: GSBE other - not theme-related research, and QE Econometrics
- Subjects
bubbles, forecasting, time series, econometrics
- Abstract
This thesis investigates the forecasting ability of mixed causal-noncausal (MAR) models. Models of this type can be employed to model and forecast financial and economic time series that are, for instance, characterized by bubbles. Bubbles (a persistent increase followed by a sudden crash) can have dramatic impacts; an example is the U.S. housing market bubble that led to a global financial crisis. The models employed in this thesis are simple to use and offer great flexibility. Since MAR models are still rather new, the literature on some aspects remains scarce. This thesis therefore first focuses on predictions, with applications ranging from extreme episodes to more stable ones. MAR models can be employed in various areas of application, and the predictive densities obtained from them can, for instance, be used to construct risk measures or central bank credibility indices. The thesis also investigates extensions of the model, such as the use of external variables or employing the model in a multivariate setting. A minimal simulation sketch of a noncausal process follows this entry.
- Published
- 2022
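A minimal sketch of the building block behind the bubble-shaped trajectories the summary mentions: a purely noncausal AR(1) with heavy-tailed errors, simulated by backward recursion. Parameter values and the Student-t error choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
T, phi = 400, 0.8
eps = rng.standard_t(df=3, size=T)     # fat-tailed errors

# Purely noncausal AR(1): y_t = phi * y_{t+1} + eps_t, built by backward
# recursion, so today's value depends on the future.
y = np.zeros(T)
for t in range(T - 2, -1, -1):
    y[t] = phi * y[t + 1] + eps[t]

# Bubble signature: episodes build up gradually and collapse suddenly.
peak = int(np.argmax(np.abs(y)))
print("largest |y| at t =", peak, "with value", round(float(y[peak]), 2))
# With Gaussian errors this process would be observationally equivalent to
# a causal AR(1); non-Gaussianity is what makes the two distinguishable,
# which is why likelihood-based inference for MAR models relies on it.
```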
7. Time series analysis under model uncertainty
- Author
-
Lohmeyer, Jan Hendrik, Palm, Franz, Urbain, Jean-Pierre, Hecq, Alain, RS: GSBE Theme Data-Driven Decision-Making, and QE Econometrics
- Subjects
Series (mathematics), Computer science, Monetary policy, model averaging, Vector autoregression, Interest rate, Interdependence, Economic data, model selection criteria, Order (exchange), Econometrics, model uncertainty, structural VAR, Time series
- Abstract
Central banks analyze economic data in order to uncover the dynamics of the economy and the interdependencies between different economic factors, for example between interest rates and economic growth. This thesis motivates and develops new analysis methods for such economic time series. One of the proposed methods allows the user to explicitly define and take into account the goal of her analysis; another presented method is an extension of a popular tool for economic policy analysis. The thesis illustrates the methods' properties and shows how they compare with other commonly used methods. The results can help central banks and economists make more informed decisions about which analysis tools to use. A small sketch of the model-averaging idea, one of the subjects listed above, follows this entry.
- Published
- 2019
- Full Text
- View/download PDF
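As a pointer to the model-averaging theme in the subject list, here is a minimal sketch: combine one-step AR(p) forecasts with weights based on an information criterion rather than committing to a single model order. The AR(2) data-generating process and the Akaike-weight scheme are illustrative assumptions, not the thesis's specific methods.

```python
import numpy as np

rng = np.random.default_rng(5)
y = np.zeros(300)
for t in range(2, 300):                          # AR(2) data-generating process
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

def ar_forecast_and_aic(y, p):
    """OLS fit of an AR(p); returns the one-step forecast and the AIC."""
    Y = y[p:]
    X = np.column_stack([np.ones(len(Y))] +
                        [y[p - k:len(y) - k] for k in range(1, p + 1)])
    coef = np.linalg.lstsq(X, Y, rcond=None)[0]
    ssr = np.sum((Y - X @ coef) ** 2)
    aic = len(Y) * np.log(ssr / len(Y)) + 2 * (p + 1)
    fc = coef[0] + coef[1:] @ y[-1:-p - 1:-1]    # forecast from the last p obs
    return fc, aic

fcs, aics = zip(*(ar_forecast_and_aic(y, p) for p in (1, 2, 3, 4)))
w = np.exp(-0.5 * (np.array(aics) - min(aics)))  # Akaike weights
w /= w.sum()
print("model weights:", w.round(3))
print("averaged one-step forecast:", round(float(np.array(fcs) @ w), 3))
```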
8. Mixed causal-noncausal models
- Author
-
Telg, Sean, Hecq, Alain, Urbain, Jean-Pierre, Lieb, Lenard, QE Econometrics, and RS: GSBE ETBC
- Subjects
Inflation, Estimation, Identification (information), Series (mathematics), Computer science, Econometrics, Inference, Time series, Field (geography), Economic bubble
- Abstract
This thesis studies time series data (e.g. consumption, inflation) in the field of econometrics. Since most such data are correlated over time, a series is often modelled as depending on its own past values. This common methodology is extended so that dependence of a series on its own future values and on other related series is also allowed. It is found that series containing cycles and economic bubbles can be represented by these models. Their applicability is investigated in various economic frameworks, and a software package is developed to perform analyses with these models.
- Published
- 2018
- Full Text
- View/download PDF
9. Essays in real-time forecasting
- Author
-
Liebermann, Joëlle, Weil, Philippe, Reichlin, Lucrezia, Fuss, Catherine, Verardi, Vincenzo, Giannone, Domenico, and Hecq, Alain
- Subjects
factor model, Economic forecasting, Bond market, Macroeconomics -- Mathematical models, real-time, forecasting, nowcasting, Econometrics, Economics, BVAR
- Abstract
This thesis contains three essays in the field of real-time econometrics, and more particularly forecasting. The issue of using data as available in real time to forecasters, policymakers or financial markets is an important one which has only recently been taken on board in the empirical literature. Data available and used in real time are preliminary and differ from ex-post revised data, and given that data revisions may be quite substantial, the use of latest-available instead of real-time data can substantially affect empirical findings (see, among others, Croushore's (2011) survey). Furthermore, as variables are released on different dates and with varying degrees of publication lag, datasets are characterized by the so-called "ragged-edge" structure problem if timely information is not to be disregarded. Hence, special econometric frameworks, such as that developed by Giannone, Reichlin and Small (2008), must be used.
The first Chapter, "The impact of macroeconomic news on bond yields: (in)stabilities over time and relative importance", studies the reaction of U.S. Treasury bond yields to real-time market-based news in the daily flow of macroeconomic releases, which provide most of the relevant information on their fundamentals, i.e. the state of the economy and inflation. We find that yields react systematically to a set of news consisting of the soft data, which have very short publication lags, and the most timely hard data, with the employment report being the most important release. However, sub-sample evidence reveals parameter instability in terms of the absolute and relative size of the yields' response to news, as well as its significance. In particular, the often-cited dominance of the employment report for markets has been evolving over time, as the size of the yields' reaction to it was steadily increasing. Moreover, over the recent crisis period there has been an overall switch in the relative importance of soft and hard data compared to the pre-crisis period, with the latter becoming more important even if less timely, and the scope of hard data to which markets react has increased and is more balanced, being less concentrated on the employment report. Markets have become more reactive to news over the recent crisis period, particularly to hard data. This is a consequence of the fact that in periods of high uncertainty (a bad state), markets starve for information and attach a higher value to the marginal information content of these news releases.
The second and third Chapters focus on the real-time ability of models to nowcast and forecast in a data-rich environment. They use an econometric framework that can deal with large panels that have a "ragged-edge" structure, and, to evaluate the models in real time, we constructed a database of vintages for US variables reproducing the exact information that was available to a real-time forecaster.
The second Chapter, "Real-time nowcasting of GDP: a factor model versus professional forecasters", performs a fully real-time nowcasting (forecasting) exercise of US real GDP growth using Giannone, Reichlin and Small's (2008), henceforth GRS, dynamic factor model (DFM) framework, which is able to handle large unbalanced datasets as available in real time. We track the daily evolution of the model's nowcasting performance throughout the current and next quarter. Similarly to GRS's pseudo-real-time results, we find that the precision of the nowcasts increases with information releases. Moreover, the Survey of Professional Forecasters does not carry additional information with respect to the model, suggesting that the often-cited superiority of the former, attributable to judgment, is weak over our sample. As one moves forward along the real-time data flow, the continuous updating of the model provides a more precise estimate of current-quarter GDP growth and the Survey of Professional Forecasters becomes stale. These results are robust to the recent recession period.
The last Chapter, "Real-time forecasting in a data-rich environment", evaluates the ability of different models to forecast key real and nominal U.S. monthly macroeconomic variables in a data-rich environment and from the perspective of a real-time forecaster. Among the approaches used to forecast in a data-rich environment, we use pooling of bivariate forecasts, which is an indirect way to exploit a large cross-section, and the direct pooling of information using a high-dimensional model (DFM and Bayesian VAR). Furthermore, forecast combination schemes are used to overcome the choice of model specification faced by the practitioner (e.g. which criteria to use to select the parametrization of the model), as we seek evidence regarding the performance of a model that is robust across specifications/combination schemes. Our findings show that predictability of the real variables is confined to the recent recession/crisis period. This is in line with the findings of D'Agostino and Giannone (2012) over an earlier period, that gains in relative performance of models using large datasets over univariate models are driven by downturn periods, which are characterized by higher comovements. These results are robust to the combination schemes or models used. A point worth mentioning is that for nowcasting GDP, exploiting cross-sectional information along the real-time data flow also helps over the end of the great moderation period. Since GDP is a quarterly aggregate proxying the state of the economy, monthly variables carry information content for it. But similarly to the findings for the monthly variables, predictability, as measured by the gains relative to the naive random walk model, is higher during the crisis/recession period than during tranquil times. Regarding inflation, results are stable across time, but predictability is mainly found at nowcasting and forecasting one month ahead, with the BVAR standing out at nowcasting. The results show that the forecasting gains at these short horizons stem mainly from exploiting timely information. The results also show that the direct pooling of information using a high-dimensional model (DFM or BVAR), which takes into account the cross-correlation between the variables and efficiently deals with the "ragged-edge" structure of the dataset, yields more accurate forecasts than the indirect pooling of bivariate forecasts/models. A stylized sketch of the factor-nowcasting step follows this entry.
- Published
- 2012
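A stylized sketch of the factor-nowcasting idea the abstract builds on: extract a common factor from a monthly panel by principal components and project quarterly GDP growth on its quarterly average. The simulated panel and the two-step bridge estimator are illustrative assumptions; the GRS framework instead handles ragged-edge data with a Kalman filter, which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(4)
n_q, m, N = 80, 3, 30                  # quarters, months per quarter, series

# Common monthly AR(1) factor driving a panel of N monthly indicators.
f = np.zeros(n_q * m)
for t in range(1, n_q * m):
    f[t] = 0.7 * f[t - 1] + rng.normal()
X = np.outer(f, rng.uniform(0.5, 1.5, N)) + rng.normal(size=(n_q * m, N))
gdp = 0.5 * f.reshape(n_q, m).mean(axis=1) + 0.3 * rng.normal(size=n_q)

# Principal-components estimate of the factor from the standardized panel.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
f_hat = np.linalg.svd(Z, full_matrices=False)[0][:, 0]

# Bridge step: the quarterly average of the monthly factor explains GDP
# growth; the last quarter is held out and nowcast.
f_q = f_hat.reshape(n_q, m).mean(axis=1)
slope, intercept = np.polyfit(f_q[:-1], gdp[:-1], 1)
print("nowcast of the last quarter:", round(slope * f_q[-1] + intercept, 3),
      "| actual:", round(float(gdp[-1]), 3))
```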