"Quantifying deregulation and its economic effects: a large language model approach" with Matteo Iacoviello
Data series available from Jan 1960 through Dec 2025
We construct a news-based index of deregulation for the United States from 1960 through 2025, using AI to semantically classify newspaper articles. We distinguish articles discussing deregulation from those discussing increased regulation, assigning intensity scores that reflect both the centrality of deregulatory content and whether articles discuss advocacy, proposals, or enacted measures. Human validation confirms strong agreement between AI and human classifications. The deregulation index captures major reform episodes, including transportation and telecommunications liberalization in the 1970s–1980s, financial deregulation in the 1980s–1990s, and recent deregulatory activity. We decompose the index by sector, type of deregulation, and policy stage. We validate the news-based index against a parallel index constructed from Federal Register documents: the news-based index leads the Federal Register index by eleven months, consistent with media coverage reflecting policy intentions before formal implementation. Unlike measures based on detailed statutory coding or on Federal Register counts that weight all rules equally, our approach covers the entire economy and weights naturally by newsworthiness, capturing regulatory shifts before they materialize in law. Positive shocks to deregulation boost investment, productivity, stock prices, profits, and GDP. Industry-specific deregulation shocks boost industry-level stock returns, consistent with a shift in the composition of deregulation toward measures that may impact incumbent profitability and operational efficiency more than competitive entry.