It is widely known that the earnings of workers without college degrees have stagnated in recent years, while their employment rates, especially among men, have declined sharply (Groshen and Holzer, 2021; Reeves, 2021).1 The Covid-19 pandemic and recession exacerbated earnings and employment gaps by race, gender, and education, and also generated millions more displaced workers (Hershbein and Holzer, 2021); automation will no doubt create many more displaced workers over time. The “Great Resignation” of workers from many low-wage jobs also demonstrates the need for successful postsecondary education or training, so that earnings and upward mobility prospects among those with lower levels of education can rise.
For decades, policy analysts have sought to identify cost-effective job training programs with lasting impacts for disadvantaged or displaced workers, with limited success. But a very promising model for such training has recently emerged: sector-based training, in which people are trained for existing jobs in high-demand sectors that pay well for workers without four-year college degrees. Several rigorous evaluation studies (reported below) have found large and lasting impacts of such programs on worker earnings and, in some cases, on college credential attainment (Holzer, 2021).
Indeed, this record of success motivated the Biden administration and the House of Representatives to include at least $10B of training for jobs in “in-demand sectors” in the Build Back Better bill, which the House recently passed.2 The Workforce Innovation and Opportunity Act (WIOA), which will likely be reauthorized in 2022, also encourages state and local workforce boards to fund sector-based programs.
But the What Works Clearinghouse (WWC) of the US Department of Education, which summarizes the results of rigorous evaluation studies for policymakers and practitioners, released reports in November casting doubt on the effectiveness of two of the best-known sector-based programs for disadvantaged workers: Project Quest and Year Up.3
Both programs have been evaluated many times and are widely considered to be very successful. But, in each case, WWC provides evidence of only one or two positive impacts on specific measures of education or earnings, while claiming zero or negative impacts on most others. In many cases, the weak results are generated by WWC protocols for judging evaluation results that, while often sensible, can also be arbitrary in some contexts. And WWC failed to include several more recent reports in its summary that provide more positive evidence of longer-term impacts.
Below I argue that the best available evidence still suggests that Project Quest and Year Up, along with other sector-based programs, remain among our most successful education and training efforts for disadvantaged US workers. While major challenges remain in scaling such programs and limiting their cost, the evidence to date of their effectiveness remains strong, and they should continue to be a major pillar of workforce policy going forward.
Sector-Based Training: Evidence of Success and Why It Works
Sector-based training programs explicitly target key sectors of the economy, where labor demand is strong and where workers without college degrees can earn living wages, and train workers for jobs in those sectors – which include health care, information technology, advanced manufacturing, and transportation and logistics.
In doing so, they employ what is often known as a “dual customer” focus – treating employers as major clients whose interest in filling available jobs must be taken just as seriously as the interests of the workers receiving the training. Indeed, employers help develop the curricula in which students or workers are trained, and workers are then referred to them for subsequent employment. Key features of the approach include:
- A sector “partnership” with representatives from employers or industry associations, training providers, and, often, an intermediary to link training with skill needs.
- Job training for industries where middle-skilled workers are paid well (including those with no college degree), employers have unmet needs for such workers, and continued growth is expected.
- Wraparound services such as case management, supportive social services, basic and life skills training, career coaching, and/or job placement and retention supports.4
Furthermore, the relatively greater success of this approach (compared to more traditional training that omits an explicit demand-side focus) in improving worker earnings has been demonstrated in rigorous evaluations of several such programs, using randomized controlled trials (RCTs) to generate “gold standard” evaluation results. These studies have identified large and lasting impacts of sector-based training on the education and/or earnings of the disadvantaged, and they find that the programs are cost-effective over time.
These studies are summarized in an important recent paper by Lawrence Katz of Harvard University and his colleagues (Katz et al., 2020). They review several rigorous evaluations of sector-based programs that appear successful, though they focus primarily on the WorkAdvance program.5 They also identify what makes sector-based training so successful: its ability to train workers for high-paying jobs and sectors, the transferable and certifiable skills that such training generates, and reductions in barriers to employment in high-wage sectors, particularly for women and people of color.
Among the programs that Katz et al. identify as successful are Project Quest and Year Up. These two programs differ from each other in their approaches to employment and training and target quite different populations. Project Quest, first implemented in San Antonio TX in 1992, delivers training to community college students and helps them attain college credentials (such as certificates or associate degrees) in certain high-demand fields, like health care. In contrast, Year Up targets out-of-school youth; it provides six months of training to disadvantaged high school graduates (or GED attainers) in information technology or other high-demand fields, followed by six-month paid internships with private employers.
As Katz et al. indicate, Project Quest and Year Up have each been evaluated a number of times in studies using RCTs, and the results have indicated strong positive impacts of each. Specifically, Roder and Elliott (2018, 2019, 2021) have demonstrated large impacts of Project Quest on worker earnings lasting as long as nine to eleven years. And Fein and Hamadyk (2018) and Fein et al. (2021) have shown that Year Up raises earnings for disadvantaged youth by large amounts as long as five years after random assignment – a more positive finding than has been observed in any other youth training program to date.6
WWC Reports on Project Quest and Year Up
To help policymakers and practitioners identify which programs and approaches do or do not work, the US Department of Education created the What Works Clearinghouse (WWC). For any given education program or intervention, WWC identifies all studies that have been conducted as of a point in time, determines which of them use rigorous methods, and summarizes what those rigorous studies have found, on average.
To do this, WWC uses very specific protocols to determine which studies are rigorous and which results in those studies are included in its reports as positive or negative evidence. In addition, due to lags in the review and publication process, the reports often include only studies that were available at least a year before a report is published, and sometimes two to three years earlier.
WWC Reports on Project Quest and Year Up: What Did They Find?
WWC’s review of Project Quest is based on four studies that it deems rigorous enough to include: Roder and Elliott (2018, 2019); Rolston et al. (2017); and Juniper et al. (2020). Notably, only the studies by Roder and Elliott focus explicitly on Project Quest; the other two focus on other programs in Texas – the Valley Initiative for Development and Advancement (VIDA) in the Rio Grande Valley and the version of Capital IDEA implemented in Travis County, TX – which were explicitly modeled after the original Project Quest but now likely differ from it in implementation details.7 WWC’s review of Year Up is based on two studies: Roder and Elliott (2014) and Fein and Hamadyk (2018).
WWC’s report on Project Quest summarizes its measured impacts as follows:
- Positive impacts on credit accumulation (designated by a single +) and stronger positive evidence on attainment of “industry-recognized credential, certificate, or license completion” (designated by ++);
- A negative effect (designated by -) on “postsecondary degree attainment”; and
- Zero impacts on short-term employment or earnings, medium-term employment or earnings, and on long-term earnings.8
WWC’s report on Year Up summarizes its measured impacts as follows:
- Positive impacts on short-term earnings (designated by ++); and
- Zero impacts on short-term employment, medium-term employment or earnings, and industry-recognized credential, certificate or license completion.
WWC’s reports provide considerable detail and nuance on the data and estimation methods used in the impact studies, as well as on how and why WWC summarized them as it did. Unfortunately, many readers seeking information on the performance of Project Quest and Year Up will likely see only the shorter snapshots and briefs, rather than the longer reports that provide more nuance.
Were the WWC Reports Accurate?
I strongly believe that WWC’s summaries of Project Quest and Year Up – especially in the snapshots but also in the briefs and reports – paint portraits of both programs that are much less positive than what the extant research really shows.
There are two primary reasons for this: 1) WWC’s summaries of the studies it reviewed, based on some very specific protocols, are somewhat incomplete and not fully accurate; and 2) more importantly, WWC did not review several more recently published studies – which appeared in 2021, before its reports were published in November – that show much stronger positive impacts than the studies it covered.
Below I briefly summarize my reactions to the WWC reports, while more details are available in an online Appendix.
WWC’s Summaries
A closer look at the studies of Project Quest and Year Up reviewed by WWC shows that some of its concerns are very sensible, while others are based on a set of protocols that lend consistency to its summaries over time but are not always well-suited to certain analyses.
For instance, in these reports WWC corrects some statistical problems in the original studies (such as standard errors on estimates that were not adjusted for clustering and multiple observations) that, once adjusted, reduce their statistical significance. It notes some cases where outcomes were not fully equivalent between the treatment and control groups at baseline, and others where questionable statistical techniques were used to impute missing control variables.9 Various findings in the original reports were excluded for these reasons, and these are reasonable positions for WWC to take.
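To give a sense of the kind of adjustment at issue, the sketch below shows how cluster-robust standard errors can be computed for a simple treatment-effect regression when each person contributes several earnings observations. The data, variable names, and effect sizes are entirely hypothetical – this is not the estimation code or data from any of the studies discussed here – but it illustrates why accounting for within-person clustering typically widens standard errors and can push a borderline estimate out of statistical significance.

```python
# Hypothetical illustration: how clustering adjustments widen standard errors.
# All data and parameters here are made up for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

n_people, n_quarters = 500, 8
person = np.repeat(np.arange(n_people), n_quarters)
treated = np.repeat(rng.integers(0, 2, n_people), n_quarters)

# Earnings include a person-specific component, so observations are correlated within person
person_effect = np.repeat(rng.normal(0, 2000, n_people), n_quarters)
earnings = 6000 + 400 * treated + person_effect + rng.normal(0, 1500, person.size)

df = pd.DataFrame({"earnings": earnings, "treated": treated, "person": person})

# Naive OLS: treats every person-quarter as an independent observation
naive = smf.ols("earnings ~ treated", data=df).fit()

# Cluster-robust version: standard errors clustered on person
clustered = smf.ols("earnings ~ treated", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["person"]}
)

print("Naive SE:    ", round(naive.bse["treated"], 1), " p =", round(naive.pvalues["treated"], 3))
print("Clustered SE:", round(clustered.bse["treated"], 1), " p =", round(clustered.pvalues["treated"], 3))
```

Running this shows the clustered standard error is several times larger than the naive one for the same point estimate, which is the mechanical reason some originally significant findings lost significance after WWC’s corrections.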
At the same time, the WWC reports:
- Rigidly adhere to a statistical significance threshold of .05, deeming estimates with p-values just above that level (but below .10) to be insignificant (see the illustrative sketch after this list);
- Consider outcomes only at certain pre-determined points in time – in the case of Project Quest, years 3, 5, and 7 after random assignment were used to estimate short-, medium-, and long-term outcomes of education interventions – while estimates at years 4, 6, or 8-9 were ignored (and deemed “not aligned with WWC’s preferred measures”);10
- Ignore outcomes that fall outside WWC’s pre-determined domains, deeming them “ineligible” for consideration even though they are likely correlated with later earnings and success.11
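The stakes of the threshold choice can be seen with a simple hypothetical calculation. The estimate and standard error below are made up for illustration, not drawn from any of the studies reviewed; the point is that the same earnings impact is “insignificant” under a rigid .05 rule but significant at the .10 level, and that a 90 percent confidence interval conveys the underlying uncertainty far better than a binary significant/insignificant label.

```python
# Hypothetical earnings impact: how the .05 vs .10 threshold changes the verdict.
# The estimate and standard error are invented for illustration only.
from scipy import stats

impact = 1500.0      # estimated annual earnings impact, in dollars (hypothetical)
std_error = 820.0    # standard error of that estimate (hypothetical)

z = impact / std_error
p_value = 2 * stats.norm.sf(abs(z))          # two-sided p-value

for alpha in (0.05, 0.10):
    crit = stats.norm.ppf(1 - alpha / 2)     # critical value for a two-sided test
    lo, hi = impact - crit * std_error, impact + crit * std_error
    verdict = "significant" if p_value < alpha else "not significant"
    print(f"alpha={alpha:.2f}: {verdict} (p={p_value:.3f}), "
          f"{round((1 - alpha) * 100)}% CI = [{lo:,.0f}, {hi:,.0f}]")
```

With these illustrative numbers, the p-value is roughly .07: the estimate fails a .05 test, passes a .10 test, and its 90 percent confidence interval excludes zero while its 95 percent interval does not.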
Overall, the results suggest that Year Up generates positive impacts on both short-term earnings and employment; WWC notes only the former. For Project Quest, the evidence implies not only positive impacts on academic credits earned and on credential attainment in the short term (acknowledged by WWC), but also evidence on associate degree attainment that is more mixed than WWC indicates.12 A closer look at the studies themselves also shows positive medium-term impacts of Project Quest on earnings and employment, measured in years 5 and 6.
More Recent Reports
Three very important additional reports have been released on these programs, beyond those summarized by WWC (and by Katz et al.). These include Roder and Elliott (2021) on Project Quest, Rolston et al. (2021) on VIDA, and Fein et al. (2021) on Year Up.
On Project Quest, the more recent study by Roder and Elliott reports 10- and 11-year impacts on earnings and employment that are large, lasting, and statistically significant (at the .10 level or better), on top of the significant 9-year impact in their 2019 report. They also show positive impacts not only on certificate attainment but, to some extent, on associate degree attainment as well, significantly so in the target field of health care.13
On VIDA, Rolston et al. (2021) largely confirm the positive findings on postsecondary attainment from their earlier report, with clearer evidence now of positive impacts on associate degrees as well as on certificates and credits. While they find no significant impacts on earnings after three years, the positive impacts on credit and credential attainment strongly suggest that earnings impacts will materialize, given the clear labor market returns to postsecondary credits and credentials documented in the research (Backes et al., 2015; Holzer and Baum, 2017).
Regarding Year Up, Fein et al. find large and persistent earnings impacts at five years after random assignment that show no signs of fading out.
Had WWC included these reports in its summaries of Project Quest and Year Up – all of which were available well before the publication of its reports – the positive impacts of these programs would have been more obvious. Indeed, summaries by the Pathways to Work Evidence Clearinghouse and the Arnold Foundation, published before the WWC reports, suggest stronger positive impacts of Project Quest and Year Up on education and employment outcomes than the WWC reports do.14
Conclusion
Sector-based training programs remain one of the most promising approaches to raising the earnings and employment of those without 4-year college degrees in America. Several rigorous evaluations in recent years suggest that they can have large and lasting impacts on worker earnings and are often cost-effective over time. For this reason, the Build Back Better legislation and other workforce development efforts focus heavily on scaling these successful interventions.
Unfortunately, the US Department of Education’s What Works Clearinghouse recently released two reports summarizing evaluation evidence on two of the best-known sector-based programs – Project Quest and Year Up – suggesting that their impacts are much weaker than has widely been believed. But the more negative summaries produced by WWC were, in many cases, generated by its protocols for determining which studies to review and which findings to accept; these protocols may often be sensible, but they were applied too rigidly in this case.
A careful review of the studies WWC summarizes, and especially of three additional ones that it failed to report (due to publication lags created by its review process), paints a much more positive portrait of both programs. In particular, the full set of studies demonstrates clear positive impacts of Project Quest on virtually all measures of postsecondary education, and of both Year Up and Project Quest on employment outcomes for disadvantaged students and workers. These more positive portraits are also consistent with the recent evidence reviews by Katz et al., the Pathways to Work Evidence Clearinghouse, and the Arnold Foundation noted above. Indeed, the positive impacts appear sufficiently large and long-lasting to render these programs cost-effective.15
We need to learn much more about these programs – for instance, how to maintain their cost-effectiveness when we scale them, and perhaps how to extend them to less work-ready individuals. In the meantime, the underlying success of these programs should not be challenged by research summaries with major shortcomings.
I believe that WWC can continue to play an important role in identifying what works; but, in the future, it should try to avoid the pitfalls that weaken the credibility of its summaries. Regarding the studies it reviews, I suggest the following:
- WWC should regard p-values of .10 or less as statistically significant, and it should report confidence intervals based on this level of significance for all estimated impacts.
- WWC should not rigidly adhere to evidence based only on very specific years after random assignment or program completion. For instance, it should not ignore evidence of impacts in years 4, 6, and 8 and beyond (claiming they do not “align with WWC’s preferred measures”) while focusing exclusively on impacts in years 3, 5, and 7 (the shortest periods over which it measures short-, medium-, and long-term impacts of Project Quest) – especially since labor market outcomes tend to fluctuate across individual years.
- WWC could acknowledge a wider range of evidence beyond the limited set of outcomes it now considers, including outcomes it currently deems “ineligible” for inclusion in its reports. For instance, it could acknowledge the positive impacts on full-time enrollment found in the original Rolston et al. study, or the various impacts on career progress in Fein and Hamadyk – even if it does not use these impacts in any of its specific impact assessments.
- WWC could be more open to some statistical techniques (such as Fein and Hamadyk’s use of “hot deck” imputation of covariates) where it is not clear that such techniques are responsible for any positive impact estimates.
And, when more recent evaluations have been published before WWC releases its reports on earlier evidence:
- WWC should very clearly acknowledge the existence of such reports and their findings. It could do so without explicitly embracing them or including them in its summary of impact measures.
It would also be useful for WWC to provide more of this broader evidence in its “snapshots” and “briefs,” as well as its longer reports.
The Brookings Institution is financed through the support of a diverse array of foundations, corporations, governments, individuals, as well as an endowment. A list of donors can be found in our annual reports published online here. The findings, interpretations, and conclusions in this report are solely those of its author(s) and are not influenced by any donation.
References
Backes, Ben et al. 2015. “Is It Worth It? Postsecondary Education and Labor Market Outcomes for the Disadvantaged.” IZA Journal of Labor Policy.
Fein, David et al. 2021. Still Bridging the Opportunity Divide for Low-Income Youth: Year Up’s Longer-Term Impacts. OPRE Report.
Fein, David and Jill Hamadyk. 2018. Bridging the Opportunity Divide for Low-Income Youth: Year Up Implementation and Early Impact Report. OPRE Report No. 2018-65.
Groshen, Erica and Harry Holzer. 2021. “Labor Market Trends and Outcomes: What Has Changed Since the Great Recession?” The Annals of the American Academy of Political and Social Science.
Heckman, James et al. 1999. “The Economics and Econometrics of Active Labor Market Programs.” In O. Ashenfelter and D. Card eds. Handbook of Labor Economics, Vol. 3. Amsterdam: North Holland.
Hershbein, Brad and Harry Holzer. 2021. “The Covid-19 Pandemic’s Evolving Impact on the Labor Market: Who Has Been Hurt and What Should We Do.” IZA Discussion Paper.
Holzer, Harry and Sandy Baum. 2017. Making College Work: Pathways to Success for Disadvantaged Students. Washington DC: Brookings.
Juniper, Cynthia et al. 2020. Evaluation of Travis County Investments in Workforce Development: 2020 Update. Ray Marshall Center, University of Texas at Austin.
Katz, Lawrence et al. 2020. “Why Do Sectoral Employment Programs Work? Lessons from WorkAdvance.” National Bureau of Economic Research Working Paper.
Maguire, Sheila et al. 2010. Tuning Into Local Labor Markets. Philadelphia: Public/Private Ventures.
Maynard, Rebecca et al. 2007. “Issues in Calculating Average Effect Sizes in Meta-Analyses.” Office of Policy, Research and Evaluation, US Department of Health and Human Services.
Perez-Johnson, Irma and Heinrich Hoch. 2020. Workforce Development and Economic Prosperity. Unpublished, American Institutes for Research.
Reeves, Richard. 2021. Poverty Hurts the Boys the Most: Inequality at the Intersection of Class and Gender. Policy Brief, Center on Children and Families, Brookings Institution, Washington DC.
Roder, Anne and Mark Elliott. 2021. Eleven Year Gains: Project QUEST’s Investment Continues to Pay Dividends. New York: Economic Mobility Corporation.
Roder, Anne and Mark Elliott. 2018. Escalating Gains: Project QUEST’s Sectoral Strategy Pays Off. New York: Economic Mobility Corporation.
Roder, Anne and Mark Elliott. 2019. Nine Year Gains: Project QUEST’s Continuing Impact. New York: Economic Mobility Corporation.
Roder, Anne and Mark Elliott. 2014. Year Up’s Continuing Impacts on Young Adult Earnings. New York: Economic Mobility Corporation.
Rolston, Howard et al. 2021. Valley Initiative for Development and Advancement: Three-Year Impact Report. OPRE Report No. 2021-96, US Department of Health and Human Services.
Rolston, Howard et al. 2017. Valley Initiative for Development and Advancement: Implementation and Early Impact Report. OPRE Report No. 2017-83, US Department of Health and Human Services.
Schaberg, Kelsey. 2020. Meeting the Needs of Job Seekers and Employers: A Synthesis of Findings on Sector Strategies.
Footnotes
1. While college enrollments over time have grown dramatically, college attainment has been limited among lower-income or first-generation students, especially since their completion rates in two-year and four-year colleges are low (Holzer and Baum, 2017).
2. The bill allocates $5B to workforce boards, unions, employers and others for sector-based training, and another $5B that will flow directly to community colleges. In addition, other provisions in the bill – like the $4B allocated to training on climate-related jobs – will likely use sector-based training as well.
3. The reports can be found here: https://virtualschooling.wordpress.com/2021/11/21/new-wwc-reports-year-up-and-project-quest-show-positive-findings-for-postsecondary-students/.
4. See Perez-Johnson and Hoch (2020). Due to the fairly rigorous training curricula used in many such programs, and also in order to make sure that employers are satisfied with the employees who are referred to them, the sector-based programs tend to screen out individuals who do not meet certain standards of academic preparation (such as the ability to read or do math at the 9th or 10th-grade levels) or those who lack work readiness and reliability for a number of reasons, such as substance abuse.
5. WorkAdvance was a model that was evaluated in four sites, including Per Scholas in New York. Other programs reported by Katz et al. include the Jewish Vocational Services in Boston, the Wisconsin Regional Training Partnership in Milwaukee, and Project Quest and Year Up, which are described below. For more evidence see Maguire et al. (2010) and Schaberg (2020).
6. Katz et al. included the findings of Roder and Elliott (2018 and 2019), which found impacts of Project Quest lasting nine years, and Fein and Hamadyk, which found impacts of Year Up lasting three years. The other studies appeared later, and are described more fully below.
7. The first two of these studies use RCTs in their evaluations of program impacts, while the third uses propensity score matching on observable characteristics – a method that is broadly considered to be somewhat less rigorous. Under very specific conditions, propensity score matching methods might approximate RCT results (Heckman et al., 1999), though whether this is true in particular studies remains a bit unclear.
8. WWC describes the ++ impacts as positive (and “likely to change” an outcome), the + or – impacts as potentially positive or negative (which “may change an outcome”), and “no impact” as no discernible effects (“may result in little or no change”). Unfortunately, WWC will designate this last rating even when no evaluation results are presented at all – in other words, even when an outcome has not yet been evaluated in the few studies that they review.
9. Three of four outcomes in Juniper et al. were not considered by WWC to be of sufficient quality because baseline equivalence of outcomes was not achieved. WWC also disqualified some findings in Fein and Hamadyk because that study used “hot deck” methods to impute the values of some control variables, though there was no evidence that this had any major effect on impact estimates.
10. See the Appendix for detail on estimates with p-values between .05 and .10 that WWC deemed insignificant. Roder and Elliott (2018) show significant impacts on outcomes in year 5 at the .10 level, using WWC’s corrected standard errors, and in year 6 after random assignment. Their report in 2019 shows positive but insignificant impacts on earnings in years 7-8 and a large and significant one in year 9, though WWC only used and reported the year 7 estimate. Rolston et al. show positive impacts on full-time enrollment, which is a strong predictor of degree attainment in the future (Holzer and Baum, op cit.). And Fein and Hamadyk show impacts of Year Up on weekly hours worked, as well as on working in jobs requiring mid-level skills, working in a Year Up target occupation, and on several self-reported measures of career knowledge and contacts.
11. See the Appendix for detail on estimates with p-values between .05 and .10 that WWC deemed insignificant. Roder and Elliott (2018) show significant impacts on outcomes in year 5 at the .10 level, using WWC’s corrected standard errors, and in year 6 after random assignment. Their report in 2019 shows positive but insignificant impacts on earnings in years 7-8 and a large and significant one in year 9, though WWC only used and reported the year 7 estimate. Rolston et al. show positive impacts on full-time enrollment, which is a strong predictor of degree attainment in the future (Holzer and Baum, op cit.). And Fein and Hamadyk show impacts of Year Up on weekly hours worked, as well as on working in jobs requiring mid-level skills, working in a Year Up target occupation, and on several self-reported measures of career knowledge and contacts.
12. Roder and Elliott (2018) report a negative impact of Project Quest on associate degree attainment of 8 points while Rolston et al. find a positive impact of 5 points, both with p-values of .10. But WWC reports larger differences in effect sizes between them, from which they infer a (possible) negative impact on degree attainment, despite questions about how we should interpret such effect sizes (Maynard et al., 2007).
13. Overall, impacts on certificate and associate degree attainment are 12.3 and 1.4 percentage points respectively; in health care, they are 13.4 and 6.9 points respectively.
14. See https://www.arnoldventures.org/stories/11-year-follow-up-of-the-randomized-controlled-trial-rct-of-project-quest-a-workforce-development-program-for-low-income-adults/ and https://pathwaystowork.acf.hhs.gov/intervention-detail/679.
15. Project Quest costs about $10,000 per year, according to Roder and Elliott (while Rolston et al. report costs somewhat higher). Year Up costs about $28,000 for a year – though its primary costs are for wage payments during internships, for which employers should pay if they are receiving labor services in return.