Fluctuations in aggregate crime rates that run contrary to recent shifts in the age distribution of the U.S. population have cast doubt on the predictive power of the age–crime hypothesis. By examining a longer time horizon, reaching back to the early 1930s, we show that the share of the young population is a robust predictor of the large swings observed in the U.S. murder rate over time. However, changes in the misery index (the sum of the inflation and unemployment rates) also contribute significantly to explaining changes in the murder rate. This applies, in particular, to those changes that are at odds with the long-run trend of the U.S. age distribution, such as the decline in the murder rate in the late 1970s or its increase beginning around the mid-1980s.
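As a concrete illustration of the definition used above, the misery index is simply the sum of the inflation and unemployment rates. A minimal sketch (the input values below are hypothetical, not data from the paper):

```python
def misery_index(inflation_rate: float, unemployment_rate: float) -> float:
    """Misery index: inflation rate plus unemployment rate (both in percent)."""
    return inflation_rate + unemployment_rate

# Hypothetical illustrative values: 6% inflation, 7.5% unemployment.
print(misery_index(6.0, 7.5))  # → 13.5
```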