New Economics Papers on Econometrics
By: | Sauvenier, Mathieu (Université catholique de Louvain, LIDAM/CORE, Belgium); Van Bellegem, Sébastien (Université catholique de Louvain, LIDAM/CORE, Belgium) |
Abstract: | In high-dimensional sparse linear regression, the selection and the estimation of the parameters are studied based on an L0-constraint on the direction of the vector of parameters. We first establish a general result for the direction of the vector of parameters, which is identified through the leading generalized eigenspace of measurable matrices. Based on this result, we suggest addressing the best subset selection problem from a new perspective, by solving an empirical generalized eigenvalue problem to estimate the direction of the high-dimensional vector of parameters. We then study a new estimator based on the RIFLE algorithm and demonstrate a nonasymptotic bound on the L2 risk, the minimax convergence of the estimator, and a central limit theorem. Simulations show the superiority of the proposed inference over some known L0-constrained estimators. |
Keywords: | High-dimensional model ; sparsity ; generalized eigenvalue problem ; identification ; best subset selection ; minimax L0 estimation ; central limit theorem |
JEL: | C30 C55 C59 |
Date: | 2023–01–17 |
URL: | http://d.repec.org/n?u=RePEc:cor:louvco:2023005&r=ecm |
By: | Pesaran, M. H.; Yang, L. |
Abstract: | This paper considers a first-order autoregressive panel data model with individual-specific effects and a heterogeneous autoregressive coefficient. It proposes estimators for the moments of the cross-sectional distribution of the autoregressive coefficients, with a focus on the first two moments, assuming a random coefficient model for the autoregressive coefficients without imposing any restrictions on the fixed effects. It is shown that the standard generalized method of moments estimators obtained under homogeneous slopes are biased. The paper also investigates conditions under which the probability distribution of the autoregressive coefficients is identified, assuming a categorical distribution with a finite number of categories. Small sample properties of the proposed estimators are investigated by Monte Carlo experiments and compared with alternatives under both homogeneous and heterogeneous slopes. The utility of the heterogeneous approach is illustrated in the case of earnings dynamics, where a clear upward pattern is obtained in the mean persistence of earnings by level of educational attainment. |
Keywords: | Dynamic panels, categorical distribution, random and group heterogeneity, short T panels, earnings dynamics |
JEL: | C22 C23 C46 |
Date: | 2023–06–06 |
URL: | http://d.repec.org/n?u=RePEc:cam:camdae:2342&r=ecm |
By: | Millimet, Daniel L. (Southern Methodist University); Bellemare, Marc (University of Minnesota) |
Abstract: | Across many disciplines, the fixed effects estimator of linear panel data models is the default method to estimate causal effects with nonexperimental data that are not confounded by time-invariant, unit-specific heterogeneity. One feature of the fixed effects estimator, however, is often overlooked in practice: With data over time t ∈ {1, ..., T} for each unit of observation i ∈ {1, ..., N}, the amount of unobserved heterogeneity the researcher can remove with unit fixed effects is weakly decreasing in T. Put differently, the set of attributes that are time-invariant is not invariant to the length of the panel. We consider several alternatives to the fixed effects estimator with T > 2 when relevant unit-specific heterogeneity is not time-invariant, including existing estimators such as the first-difference, twice first-differenced, and interactive fixed effects estimators. We also introduce several novel algorithms based on rolling estimators. In the situations considered here, there is little to be gained and much to lose by using the fixed effects estimator. We recommend reporting the results from multiple linear panel data estimators in applied research. |
Keywords: | panel data, fixed effects, first-differences, interactive fixed effects, unobserved heterogeneity, time-varying individual effects |
JEL: | C23 C51 C52 |
Date: | 2023–06 |
URL: | http://d.repec.org/n?u=RePEc:iza:izadps:dp16202&r=ecm |
By: | Hafner, Christian M. (Université catholique de Louvain, LIDAM/ISBA, Belgium); Herwartz, Helmut; Wang, Shu |
Abstract: | Independent component analysis has recently become a promising data-based approach to detecting structural relations in multivariate dynamic systems in cases where a priori knowledge about causal patterns is scant. This paper suggests a kernel-based ML estimator that is largely agnostic about the distributional features of the structural origins of data variation and enables causal analysis when only a subset of the shocks is assumed independent. In an empirical application to the global oil market model of Kilian (2009), we illustrate the benefits of allowing for unmodelled higher-order dependence among the oil supply and speculative oil demand shocks. |
Keywords: | Structural VAR ; structural MGARCH ; Independent component analysis |
JEL: | C14 C32 Q43 |
Date: | 2023–01–25 |
URL: | http://d.repec.org/n?u=RePEc:aiz:louvad:2023004&r=ecm |
By: | Sauvenier, Mathieu (Université catholique de Louvain, LIDAM/CORE, Belgium); Van Bellegem, Sébastien (Université catholique de Louvain, LIDAM/CORE, Belgium) |
Abstract: | A goodness-of-fit test for the outcome of variable selection in a high-dimensional linear model is studied. The test minimizes a moment condition that reflects the sparsity constraint. Testing this constraint is possible thanks to a high-dimensional central limit theorem that is proved under heteroskedasticity. To implement the test, a multiple-splitting projection test procedure recently proposed in the literature is employed. Monte Carlo experiments demonstrate the power of the test. A real-data application considers the problem of selecting predictors to nowcast quarterly GDP. The empirical results show that it is possible to select a minimal number of variables such that every larger set of variables would pass the goodness-of-fit test. |
Keywords: | High dimensional model ; Sparsity ; Goodness-of-Fit ; Projection test ; Nowcasting |
Date: | 2023–03–17 |
URL: | http://d.repec.org/n?u=RePEc:cor:louvco:2023008&r=ecm |
By: | Kandelhardt, Johannes |
Abstract: | The Berry, Levinsohn, and Pakes (1995, BLP) model is widely used to obtain parameter estimates of market forces in differentiated product markets. The results are often used as an input to evaluate economic activity in a structural model of demand and supply. Precise parameter estimates are therefore crucial to obtain realistic economic predictions. The present paper combines the BLP model and the logit mixed logit model of Train (2016) to estimate the distribution of consumer heterogeneity in a flexible and parsimonious way. A Monte Carlo study shows that the approach yields consistent and asymptotically normal estimates of the structural parameters. With access to micro data, the approach allows for the estimation of highly flexible parametric distributions. The estimator further allows correlations between tastes to be introduced, yielding more realistic demand patterns without substantially altering the estimation procedure, making it relevant for practitioners. The BLP estimator is shown to yield biased and inconsistent results when the underlying distribution of consumer heterogeneity is non-normal. An application shows that the estimator performs well on a real-world dataset and provides estimates similar to the BLP estimator, with the option of specifying consumer heterogeneity via a polynomial, step function, or spline, resulting in a flexible estimation procedure. |
Date: | 2023 |
URL: | http://d.repec.org/n?u=RePEc:zbw:dicedp:399&r=ecm |
By: | Barrows, Geoffrey; Calel, Raphael; Jégard, Martin; Ollivier, Hélène |
Abstract: | This paper presents a method for estimating treatment effects of regulations when treated and control firms compete on the output market. We develop a GMM estimator that recovers reduced-form parameters consistent with a model of differentiated product markets with multi-plant firms, and use these estimates to evaluate counterfactual revenues and emissions. Our procedure recovers unbiased estimates of treatment effects in Monte Carlo experiments, while difference-in-differences estimators and other popular methods do not. In an application, we find that the European carbon market reduced emissions at regulated plants without undermining revenues of regulated firms, relative to an unregulated counterfactual. |
Keywords: | regulation; spillovers; environment; energy; firms |
JEL: | Q48 L10 L50 |
Date: | 2023–05–18 |
URL: | http://d.repec.org/n?u=RePEc:ehl:lserod:119259&r=ecm |
By: | Shuyang Sheng; Xiaoting Sun |
Abstract: | This paper explores the identification and estimation of social interaction models with endogenous group formation. We characterize group formation using a two-sided many-to-one matching model, where individuals select groups based on their preferences, while groups rank individuals according to their qualifications, accepting the most qualified until reaching capacities. The selection into groups leads to a bias in standard estimates of peer effects, which is difficult to correct for due to equilibrium effects. We employ the limiting approximation of a market as the market size grows large to simplify the selection bias. Assuming exchangeable unobservables, we can express the selection bias of an individual as a group-invariant nonparametric function of her preference and qualification indices. In addition to the selection correction, we show that the excluded variables in group formation can serve as instruments to tackle the reflection problem. We propose semiparametric distribution-free estimators that are root-n consistent and asymptotically normal. |
Date: | 2023–06 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2306.01544&r=ecm |
By: | Ariel Boyarsky; Hongseok Namkoong; Jean Pouget-Abadie |
Abstract: | Experiments on online marketplaces and social networks suffer from interference, where the outcome of a unit is impacted by the treatment status of other units. We propose a framework for modeling interference using a ubiquitous deployment mechanism for experiments, staggered roll-out designs, which slowly increase the fraction of units exposed to the treatment to mitigate any unanticipated adverse side effects. Our main idea is to leverage the temporal variations in treatment assignments introduced by roll-outs to model the interference structure. We first present a set of model identification conditions under which the estimation of common estimands is possible and show how these conditions are aided by roll-out designs. Since there are often multiple competing models of interference in practice, we then develop a model selection method that evaluates models based on their ability to explain outcome variation observed along the roll-out. Through simulations, we show that our heuristic model selection method, Leave-One-Period-Out, outperforms other baselines. We conclude with a set of considerations, robustness checks, and potential limitations for practitioners wishing to use our framework. |
Date: | 2023–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2305.10728&r=ecm |
By: | François-Éric Racicot; David Tessier |
Abstract: | The main objective of this research note is to establish a link between the local projection (LP) approaches of Jorda (2005), Kilian and Kim (2011), and more recently Plagborg-Møller and Wolf (2021) and Li et al. (2022), and the multihorizon causality analysis of Dufour and Renault (1998) and Dufour et al. (2006). Our detailed review of these papers, with particular attention to Jorda's local projection methodology, makes an enlightened comparison with Dufour et al.'s generalized causality methodology, which is based on the (p, h)-autoregression concept. In particular, we highlight the fact that Jorda's approach relies on the standard Cholesky decomposition to compute the IRF, while we use the GIRF, i.e., the Koop et al. (1996) and Pesaran and Shin (1998) generalized impulse response function (see also Warne, 2008), to make this comparison more reliable and in line with Dufour et al.'s robust methodology, which also refers to GIRF coefficients. Dufour et al.'s methodology does not require orthogonalization of the disturbances. We therefore call their method "the Dufour et al. GIRF", a new type of GIRF that, unlike the usual IRF, which is for horizon h = 1, is defined for any horizon h ≥ 1. We also highlight the fact that our multihorizon causality test based on the (p, h)-autoregression relies on Monte Carlo simulation, which can greatly improve the test level in small samples, substantially alleviating the variance problem observed in the literature. As shown in Li et al. (2022), while LP is less biased than VAR OLS methods, it suffers from a serious variance problem that makes the LP method quite erratic. Our Monte Carlo simulation method therefore seems well adapted to tackling this issue, improving LP, or our (p, h)-autoregression method, sufficiently to make it reliable in small samples. The empirical evidence we present shows that the multihorizon causality test we develop and apply in this paper is reliable when applied to classical monetary causations. |
Keywords: | Multihorizon causality; (p, h)-autoregression; Local projection IRF and GIRF; Conservative Monte Carlo test; VAR estimation |
JEL: | C01 C12 C32 |
Date: | 2023–06–01 |
URL: | http://d.repec.org/n?u=RePEc:ipg:wpaper:2023-001&r=ecm |
By: | Jie Wei; Yonghui Zhang |
Abstract: | This paper studies the principal component (PC) method-based estimation of weak factor models with sparse loadings. We uncover an intrinsic near-sparsity preservation property for the PC estimators of loadings, which comes from the approximately upper triangular (block) structure of the rotation matrix. It implies an asymmetric relationship among factors: the rotated loadings for a stronger factor can be contaminated by those from a weaker one, but the loadings for a weaker factor are almost free of the impact of those from a stronger one. More importantly, the finding implies that there is no need to use complicated penalties to sparsify the loading estimators. Instead, we adopt a simple screening method to recover the sparsity and construct estimators for various factor strengths. In addition, for sparse weak factor models, we provide a singular value thresholding-based approach to determine the number of factors and establish uniform convergence rates for PC estimators, which complement Bai and Ng (2023). The accuracy and efficiency of the proposed estimators are investigated via Monte Carlo simulations. The application to the FRED-QD dataset reveals the underlying factor strengths and loading sparsity as well as their dynamic features. |
Date: | 2023–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2305.05934&r=ecm |
By: | Samuel N. Cohen; Silvia Lui; Will Malpass; Giulia Mantoan; Lars Nesheim; Áureo de Paula; Andrew Reeves; Craig Scott; Emma Small; Lingyi Yang |
Abstract: | Key economic variables are often published with a significant delay of over a month. The nowcasting literature has arisen to provide fast, reliable estimates of delayed economic indicators and is closely related to filtering methods in signal processing. The path signature is a mathematical object which captures geometric properties of sequential data; it naturally handles missing data from mixed frequency and/or irregular sampling -- issues often encountered when merging multiple data sources -- by embedding the observed data in continuous time. Calculating path signatures and using them as features in models has achieved state-of-the-art results in fields such as finance, medicine, and cyber security. We look at the nowcasting problem by applying regression on signatures, a simple linear model on these nonlinear objects that we show subsumes the popular Kalman filter. We quantify the performance via a simulation exercise, and through application to nowcasting US GDP growth, where we see a lower error than a dynamic factor model based on the New York Fed staff nowcasting model. Finally we demonstrate the flexibility of this method by applying regression on signatures to nowcast weekly fuel prices using daily data. Regression on signatures is an easy-to-apply approach that allows great flexibility for data with complex sampling patterns. |
Date: | 2023–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2305.10256&r=ecm |
By: | Francesco Cesarone; Rosella Giacometti; Jacopo Maria Ricci |
Abstract: | In this paper, we propose an outlier detection algorithm for multivariate data based on their projections on the directions that maximize the Cumulant Generating Function (CGF). We prove that the CGF is a convex function, and we characterize the CGF maximization problem on the unit n-circle as a concave minimization problem. Then, we show that the CGF maximization approach can be interpreted as an extension of the standard principal component technique. Therefore, for validation and testing, we provide a thorough comparison of our methodology with two other projection-based approaches on both artificial and real-world financial data. Finally, we apply our method as an early detector for financial crises. |
Date: | 2023–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2305.10911&r=ecm |
By: | Benjamin Fan; Edward Qiao; Anran Jiao; Zhouzhou Gu; Wenhao Li; Lu Lu |
Abstract: | We develop a methodology that utilizes deep learning to simultaneously solve and estimate canonical continuous-time general equilibrium models in financial economics. We illustrate our method in two examples: (1) industrial dynamics of firms and (2) macroeconomic models with financial frictions. Through these applications, we illustrate the advantages of our method: generality, simultaneous solution and estimation, leveraging state-of-the-art machine-learning techniques, and handling large state spaces. The method is versatile and can be applied to a wide variety of problems. |
Date: | 2023–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2305.09783&r=ecm |
By: | Aurélien Alfonsi (MATHRISK - Mathematical Risk Handling - UPEM - Université Paris-Est Marne-la-Vallée - ENPC - École des Ponts ParisTech - Inria de Paris - Inria - Institut National de Recherche en Informatique et en Automatique, CERMICS - Centre d'Enseignement et de Recherche en Mathématiques et Calcul Scientifique - ENPC - École des Ponts ParisTech); Bernard Lapeyre (MATHRISK - Mathematical Risk Handling - UPEM - Université Paris-Est Marne-la-Vallée - ENPC - École des Ponts ParisTech - Inria de Paris - Inria - Institut National de Recherche en Informatique et en Automatique, CERMICS - Centre d'Enseignement et de Recherche en Mathématiques et Calcul Scientifique - ENPC - École des Ponts ParisTech); Jérôme Lelong (DAO - Données, Apprentissage et Optimisation - LJK - Laboratoire Jean Kuntzmann - Inria - Institut National de Recherche en Informatique et en Automatique - CNRS - Centre National de la Recherche Scientifique - UGA - Université Grenoble Alpes - Grenoble INP - Institut polytechnique de Grenoble - Grenoble Institute of Technology - UGA - Université Grenoble Alpes) |
Abstract: | The problem of computing the conditional expectation E[f(Y)|X] with least-squares Monte Carlo is of general importance and has been widely studied. To solve this problem, it is usually assumed that one has as many samples of Y as of X. However, when samples are generated by computer simulation and the conditional law of Y given X can be simulated, it may be relevant to sample K ∈ N values of Y for each sample of X. The present work determines the optimal value of K for a given computational budget, as well as a way to estimate it. The main takeaway is that the computational gain is all the larger when the computational cost of sampling Y given X is small relative to the computational cost of sampling X. Numerical illustrations of the optimal choice of K and of the computational gain are given for different examples, including one inspired by risk management. |
Keywords: | Least-squares Monte Carlo, Conditional expectation estimators, Variance reduction, AMS 2020: 65C05, 91G60 |
Date: | 2023 |
URL: | http://d.repec.org/n?u=RePEc:hal:journl:hal-03770051&r=ecm |
By: | Alessandro Giovannelli; Marco Lippi; Tommaso Proietti |
Abstract: | The paper deals with the construction of a synthetic indicator of economic growth, obtained by projecting a quarterly measure of aggregate economic activity, namely gross domestic product (GDP), onto the space spanned by a finite number of smooth principal components of a high-dimensional panel of monthly time series, representative of the medium-to-long-run component of economic growth. The smooth principal components result from applying a cross-sectional filter that distills the low-pass component of growth in real time. The outcome of the projection is a monthly nowcast of the medium-to-long-run component of GDP growth. After discussing the theoretical properties of the indicator, we assess its reliability and predictive validity with reference to a panel of macroeconomic U.S. time series. |
Date: | 2023–05 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2305.06618&r=ecm |
By: | Carollo, Angela (Max Planck Institute for Demographic Research); Putter, Hein; Eilers, Paul H. C.; Gampe, Jutta |
Abstract: | Event history models are based on transition rates between states and, to define such hazards of experiencing an event, the time scale over which the process evolves needs to be identified. In many applications, however, more than one time scale might be of importance. Here we demonstrate how to model a hazard jointly over two time dimensions. The model assumes a smooth bivariate hazard function, and the function is estimated by two-dimensional P-splines. We provide the R package TwoTimeScales for the analysis of event history data with two time scales. As an example, we model transitions from cohabitation to marriage or separation simultaneously over the age of the individual and the duration of the cohabitation. We use data from the German Family Panel (pairfam) and demonstrate that considering the two time scales as equally important provides additional insights about the transition from cohabitation to marriage or separation. |
Date: | 2023–05–18 |
URL: | http://d.repec.org/n?u=RePEc:osf:socarx:4ewv3&r=ecm |