Score contribution per author:
α: calibrated so that the average coauthorship-adjusted count equals the average raw count
We reconsider one of the most widely studied behavioral biases: the anchoring effect. We estimate that study designs in this literature, including replication studies, routinely fail to achieve statistical power above 30%. We replicate an anchoring study that reported an effect size of a 31% increase in participants' bids. In our replication, we increased the design's statistical power from 46% to 96%, which reduces the expected exaggeration of a statistically significant result by a factor of seven. Our results reject effects of the size originally estimated: we find an estimated effect of 3.4% (95% CI [−3.4%, 10%]).
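The link between low power and exaggerated significant estimates can be illustrated with a retrodesign-style simulation in the spirit of Gelman and Carlin's Type M ("magnitude") error analysis. The effect sizes and standard errors below are illustrative assumptions chosen to produce roughly 46% and 96% power for a normally distributed estimator; they are not the paper's data or method.

```python
import numpy as np

def power_and_exaggeration(true_effect, se, n_sims=100_000, seed=0):
    """Monte Carlo estimate of two-sided 5% power and of the expected
    exaggeration ratio (Type M error): the average magnitude of
    statistically significant estimates relative to the true effect."""
    rng = np.random.default_rng(seed)
    z_crit = 1.96  # two-sided 5% critical value
    estimates = rng.normal(true_effect, se, n_sims)
    significant = np.abs(estimates / se) > z_crit
    power = significant.mean()
    exaggeration = np.abs(estimates[significant]).mean() / true_effect
    return power, exaggeration

# Illustrative low-powered design: standard error chosen so that
# power is roughly 46% (true effect normalized to 1).
p_lo, ex_lo = power_and_exaggeration(true_effect=1.0, se=0.54)
# Same true effect with a much smaller standard error: roughly 96% power.
p_hi, ex_hi = power_and_exaggeration(true_effect=1.0, se=0.27)
print(f"low power:  {p_lo:.2f}, exaggeration x{ex_lo:.2f}")
print(f"high power: {p_hi:.2f}, exaggeration x{ex_hi:.2f}")
```

In the low-powered design, only unusually large draws clear the significance threshold, so conditioning on significance inflates the estimate; as power rises toward one, nearly all draws are significant and the exaggeration ratio shrinks toward one.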