Score contribution per author: α, calibrated so that the average coauthorship-adjusted count equals the average raw count.
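As a minimal sketch of this calibration, assuming the adjustment takes the simple form of scaling each raw count $c_i$ by $\alpha/n_i$, where $n_i$ is the number of coauthors (the exact functional form is not stated here), the condition that the average adjusted count equal the average raw count pins down $\alpha$ in closed form:
$$\frac{1}{N}\sum_{i=1}^{N}\alpha\,\frac{c_i}{n_i}=\frac{1}{N}\sum_{i=1}^{N}c_i \quad\Longrightarrow\quad \alpha=\frac{\sum_{i} c_i}{\sum_{i} c_i/n_i}.$$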
We review the effectiveness of various adjustment methods in correcting truncation bias in patent data and the implications for existing studies. The NBER patent database was recently updated, extending the sample period from 2006 to 2010. The updated sample is largely free of truncation bias over the period covered by the NBER-2006 sample, which allows us to evaluate the bias-adjustment methods. We find that existing adjustments perform poorly towards the end of the NBER-2006 sample. We re-examine several studies from the recent literature on innovation and show that findings based on the last few years of the NBER-2006 data are not supported in the updated patent database.