We show how data from an evaluation in which subjects are randomly assigned to some treatment versus a control group can be combined with nonexperimental methods to estimate the differential effects of alternative treatments. We propose tests for the validity of these methods. We use these methods and tests to analyze the differential effects of labor force attachment (LFA) versus human capital development (HCD) training components with data from California's Greater Avenues to Independence (GAIN) program. While LFA is more effective than HCD training in the short term, we find that HCD is relatively more effective in the longer term.
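To make the idea concrete, the following is a minimal sketch (not the paper's actual estimator) in Python, using hypothetical simulated data in which each site emphasizes either LFA or HCD and subjects within each site are randomized to treatment or control. It illustrates, in the spirit of the abstract, (1) the experimental treatment-control contrast within each site, (2) a nonexperimental, regression-adjusted cross-site comparison of treated outcomes under a selection-on-observables assumption, and (3) a placebo-style validity check that applies the same adjustment to control-group members, who received neither component. All column names and parameter values are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per person.
#   earnings  - outcome of interest
#   treated   - 1 if randomly assigned to treatment, 0 if control
#   site      - "LFA" (labor force attachment emphasis) or
#               "HCD" (human capital development emphasis)
#   age, educ - observed pre-assignment covariates
rng = np.random.default_rng(0)
n = 4000
df = pd.DataFrame({
    "site": rng.choice(["LFA", "HCD"], size=n),
    "treated": rng.integers(0, 2, size=n),
    "age": rng.normal(33, 8, size=n),
    "educ": rng.normal(11, 2, size=n),
})
df["earnings"] = (
    2000
    + 40 * df["age"]
    + 150 * df["educ"]
    + 500 * df["treated"] * (df["site"] == "LFA")
    + 300 * df["treated"] * (df["site"] == "HCD")
    + rng.normal(0, 800, size=n)
)

# 1. Experimental impact within each site: treatment-control difference.
for s in ["LFA", "HCD"]:
    sub = df[df["site"] == s]
    impact = (sub.loc[sub["treated"] == 1, "earnings"].mean()
              - sub.loc[sub["treated"] == 0, "earnings"].mean())
    print(f"{s} experimental impact: {impact:.0f}")

# 2. Nonexperimental cross-site comparison of the two components:
#    regression-adjust treated outcomes on observed covariates and
#    compare sites (selection-on-observables assumption).
treated = df[df["treated"] == 1]
diff_fit = smf.ols("earnings ~ C(site) + age + educ", data=treated).fit()
print("Adjusted LFA-vs-HCD contrast among treated:",
      round(diff_fit.params["C(site)[T.LFA]"], 1))

# 3. Validity check: the same adjustment applied to CONTROL group
#    members should show no cross-site difference, since controls
#    received neither training component.
controls = df[df["treated"] == 0]
placebo_fit = smf.ols("earnings ~ C(site) + age + educ", data=controls).fit()
print("Placebo cross-site contrast among controls:",
      round(placebo_fit.params["C(site)[T.LFA]"], 1),
      "p =", round(placebo_fit.pvalues["C(site)[T.LFA]"], 3))
```

A near-zero, statistically insignificant placebo contrast among controls is consistent with the adjustment removing cross-site differences in observables; a large contrast would cast doubt on the nonexperimental comparison.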