Robust forecast superiority testing with an application to assessing pools of expert forecasters

B-Tier
Journal: Journal of Applied Econometrics
Year: 2023
Volume: 38
Issue: 4
Pages: 596-622

Score contribution per author:

0.670 = (α = 2.01) / (3 authors) × 1.0 (B-tier multiplier)

α: calibrated so that the average coauthorship-adjusted count equals the average raw count (see the sketch below)
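
A minimal sketch of how this calibration could work, assuming the adjusted count per paper is α divided by the author count and the raw count is 1 per paper; the function and variable names are illustrative, not taken from the database's actual tooling:

```python
import numpy as np

def calibrate_alpha(author_counts):
    """Solve mean(alpha / n_i) = 1 for alpha, i.e. alpha is the
    harmonic mean of the per-paper author counts."""
    inv = np.mean([1.0 / n for n in author_counts])
    return 1.0 / inv

def score_contribution(alpha, n_authors, tier_multiplier):
    """Per-author score: coauthorship-adjusted count times the tier weight."""
    return (alpha / n_authors) * tier_multiplier

# Reproducing the entry above (alpha = 2.01 as calibrated over the full
# database, 3 authors, 1.0x B-tier multiplier):
print(round(score_contribution(2.01, 3, 1.0), 3))  # 0.67
```

Under this reading, α = 2.01 implies the harmonic mean of author counts across the database is about 2.01, so a solo-authored B-tier paper would score 2.01 and this three-author paper scores 0.670.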

Abstract

We develop forecast superiority tests that are robust to the choice of loss function, following Jin, Corradi, and Swanson (JCS, 2017) and relying on a mapping between generic-loss forecast evaluation and stochastic dominance principles. However, unlike the JCS tests, which are not uniformly valid and are correctly sized only under the least favorable case, our tests are uniformly asymptotically valid and non-conservative. To show this, we establish uniform convergence of HAC variance estimators. Monte Carlo experiments indicate good finite-sample performance of our tests, and an empirical illustration suggests that prior forecast accuracy matters in the Survey of Professional Forecasters.
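
For context, below is a minimal sketch of the standard HAC (Newey-West, Bartlett-kernel) long-run variance estimator referenced in the abstract, applied to a forecast loss differential as in a Diebold-Mariano-style comparison under a single loss function. This is a generic textbook illustration only; the paper's contribution concerns uniformly valid tests over a class of losses and uniform convergence of such estimators, which this sketch does not implement:

```python
import numpy as np

def hac_variance(d, bandwidth=None):
    """Bartlett-kernel HAC estimate of the long-run variance of series d,
    e.g. a loss differential d_t = L(e1_t) - L(e2_t)."""
    d = np.asarray(d, dtype=float)
    T = d.shape[0]
    if bandwidth is None:
        # Common rule-of-thumb bandwidth choice
        bandwidth = int(np.floor(4 * (T / 100.0) ** (2.0 / 9.0)))
    u = d - d.mean()
    lrv = u @ u / T                          # lag-0 autocovariance
    for j in range(1, bandwidth + 1):
        gamma_j = u[j:] @ u[:-j] / T         # lag-j autocovariance
        weight = 1.0 - j / (bandwidth + 1.0) # Bartlett weight
        lrv += 2.0 * weight * gamma_j
    return lrv

# Diebold-Mariano-style t statistic for equal predictive accuracy:
# t = sqrt(T) * mean(d) / sqrt(hac_variance(d)). The paper's tests instead
# compare forecasts across a whole class of loss functions via stochastic
# dominance, with uniform (not just least-favorable-case) size control.
```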

Technical Details

RePEc Handle: repec:wly:japmet:v:38:y:2023:i:4:p:596-622
Journal Field: Econometrics
Author Count: 3
Added to Database: 2026-01-25