What we’re reading (1/22)

  • “The Deal To Secure TikTok’s Future In The US Has Finally Closed” (CNN Business). “The transaction’s close concludes a yearslong effort to secure TikTok’s long-term future in the United States and address concerns that it posed a national security risk.”

  • “Natural-Gas Prices Soar As U.S. Braces For Arctic Blast” (Wall Street Journal). “Natural-gas prices have jumped 63% this week in response to forecasts calling for some of the coldest, snowiest weather in years to freeze the country from the West Texas desert to the Great Lakes. The forecasts have stoked fears of a repeat of the deadly winter storm that froze Texas in 2021 and left millions of people without electricity for days. Energy producers and utilities are preparing for the worst. The Energy Department late Thursday ordered grid operators to be prepared to take extraordinary steps to tap into backup power generation.”

  • “Intel Stock Plunges 13% On Soft Guidance, Concerns About Chip Production” (CNBC). “Intel said it expected first-quarter revenue between $11.7 billion and $12.7 billion, and breakeven adjusted earnings per share. That came in below LSEG expectations of 5 cents earnings per share on $12.51 billion in sales.”

  • “This Stock-Market Indicator Just Flashed One Of Its Most Bullish Signals Since 2000” (MarketWatch). “The average short-term timer that my firm tracks reduced recommended equity exposure on Tuesday by almost 20 percentage points, as judged by the Hulbert Stock Newsletter Sentiment Index. That’s one of the biggest one-day HSNSI drops since 2000, which is how far back data extend.”

  • “Teaching Economics To The Machines” (Hui Chen, Yuhan Cheng, Yanchu Liu & Ke Tang). “Structural economic models, while parsimonious and interpretable, often exhibit poor data fit and limited forecasting performance. Machine learning models, by contrast, offer substantial flexibility but are prone to overfitting and weak out-of-distribution generalization. We propose a theory-guided transfer learning framework that integrates structural restrictions from economic theory into machine learning models. The approach pre-trains a neural network on synthetic data generated by a structural model and then fine-tunes it using empirical data, allowing potentially misspecified economic restrictions to inform and regularize learning on empirical data. Applied to option pricing, our model substantially outperforms both structural and purely data-driven benchmarks, with especially large gains in small samples, under unstable market conditions, and when model misspecification is limited. Beyond performance, the framework provides diagnostics for improving structural models and introduces a new model-comparison metric based on data-model complementarity.”
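The pre-train-then-fine-tune recipe the abstract describes can be illustrated in a few lines. This is a minimal sketch, not the authors' code: the "structural model," the toy misspecification term, the network size, and the learning rates are all assumptions chosen for illustration. A small network is first fit to abundant synthetic data generated by a stand-in structural model, then fine-tuned on a small, noisy "empirical" sample, and compared against an identical network trained from scratch on that small sample alone.

```python
# Hedged sketch of theory-guided transfer learning (NOT the paper's code).
# structural_model, true_process, network width, and learning rates are
# illustrative assumptions, not details from the paper.
import numpy as np

rng = np.random.default_rng(0)

def structural_model(x):
    # Stand-in for a parsimonious economic model (e.g., a parametric
    # option-pricing formula); here just a smooth nonlinear function.
    return np.sin(2 * x) + 0.5 * x

def true_process(x):
    # "Empirical" data-generating process: the structural model plus a
    # misspecification term that the theory misses.
    return structural_model(x) + 0.3 * x**2

def init_params(h=32):
    # One-hidden-layer tanh network.
    return [rng.normal(0, 0.5, (1, h)), np.zeros(h),
            rng.normal(0, 0.5, (h, 1)), np.zeros(1)]

def forward(params, x):
    W1, b1, W2, b2 = params
    hidden = np.tanh(x @ W1 + b1)
    return hidden @ W2 + b2, hidden

def train(params, x, y, lr, steps):
    # Full-batch gradient descent on mean squared error.
    W1, b1, W2, b2 = params
    n = len(x)
    for _ in range(steps):
        pred, hidden = forward([W1, b1, W2, b2], x)
        err = pred - y
        gW2 = hidden.T @ err / n
        gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - hidden**2)
        gW1 = x.T @ dh / n
        gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return [W1, b1, W2, b2]

# 1) Pre-train on abundant synthetic data from the structural model.
x_syn = rng.uniform(-2, 2, (2000, 1))
params = train(init_params(), x_syn, structural_model(x_syn), lr=0.05, steps=2000)

# 2) Fine-tune on a small noisy "empirical" sample; the pre-training
#    acts as a theory-informed prior that regularizes the fit.
x_emp = rng.uniform(-2, 2, (50, 1))
y_emp = true_process(x_emp) + rng.normal(0, 0.05, (50, 1))
params = train(params, x_emp, y_emp, lr=0.02, steps=500)

# Baseline: identical network trained from scratch on the small sample.
scratch = train(init_params(), x_emp, y_emp, lr=0.02, steps=500)

x_test = np.linspace(-2, 2, 200).reshape(-1, 1)
pretrained_mse = np.mean((forward(params, x_test)[0] - true_process(x_test))**2)
scratch_mse = np.mean((forward(scratch, x_test)[0] - true_process(x_test))**2)
```

In this toy setup the pre-trained network typically generalizes better from the 50-point sample than the from-scratch baseline, which is the "large gains in small samples" effect the abstract claims, though of course a toy regression says nothing about the paper's actual option-pricing results.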
