In the movies, academic books tumble from their shelves as a military helicopter sets down in the front yard: “Professor, we have to go—your government needs you.”
For Janet Currie, the summons earlier this year was less dramatic—but still not exactly convenient. It was the end of the week, and the White House was calling: Could she be in D.C. on Tuesday?
“The first thing I thought was, ‘No, I can’t do that,’” said Currie, an Institute advisor and professor of economics and public affairs at Princeton. She had just started a visiting professorship across the country. The White House wanted to talk about removing lead water pipes, a topic she hadn’t worked on for a few years and that had already received funding in the 2021 infrastructure bill. It wasn’t clear how much time she would have or who would be in the audience.
“Then I just thought about it a little more,” Currie said. “And I thought, ‘Well, I could actually do this. And I probably should do this.’”
Good call. Come Tuesday, she found herself in the Roosevelt Room for a solid hour with the president, vice president, and much of the Cabinet and economic team. A follow-up courtesy tour was interrupted when President Biden called her back to the Oval Office for more conversation.
While noting the importance of the recent infrastructure investment, Currie used her moment to shift policymakers’ attention to something new: rigorous childhood testing for lead. “In Flint, if those kids were being properly tested for lead, they would’ve discovered what was going on much earlier,” said Currie. Public health “is also a part of our infrastructure. And I think I was able to make that point successfully by putting it in that context.”
Enough to simply “add to the knowledge base”?
While most economists won’t find themselves with a rapt audience at the White House, “even those who devote their energies to resolving purely theoretical issues imagine that somehow in the end their efforts will prove socially useful.” So said former Fed Vice Chair Alice Rivlin in her 1986 presidential address to the American Economic Association.
At the time, Rivlin bemoaned American political and policy processes that were “fragmented” and rancorous, in which “many of the most sophisticated and realistic members of the [economics] profession, conscious of all these difficulties, have abandoned the attempt to advise governments on policies in favor of the more manageable tasks of adding to the knowledge base.”
Translation: Economists want to make a difference with their work. But even in 1986, it was often easier just to publish and steer clear of the fray.
The ambivalent relationship of academic economics to the policy world is embodied in the National Bureau of Economic Research (NBER). Every year, economists post more than 1,200 working papers through the private, nonpartisan organization to gain exposure and feedback before seeking formal publication. However, accessing that wide audience comes with a stipulation.
“To this day, NBER research is bound by a restriction that the founders imposed,” according to the bureau’s self-written history. “Studies may present data and research findings, but may not make policy recommendations or make normative statements about policy.”
Wariness of policy advocacy has value in a field that strives toward scientific credibility and precision. It nonetheless leaves a gap between the research findings and the lives they might change. Filling that gap is the challenge Rivlin laid out to a prior generation: “The objective of economists ought to be to raise the level of debate on economic policy, to make clear what they know and do not know, and to increase the chances of policy decisions that make the economy work better.”
Heeding the call
In a 35-year career focused on policy-rich topics like children’s health and early childhood education, Currie has become zen about her lack of control in the policy realm.
“You never know when it’s going to happen,” she said. “For academics, you work on something and you would like to tell everyone about it, but no one cares at the moment. When they do call you, it’s about something that maybe you did a long time ago—or maybe it’s not your research, but you’re qualified to talk about it. I think you just have to be open to be helpful when you can.”
Nor can you control how your work is interpreted. Currie has swung from “radioactive” (in her words) when her early research on the Head Start program showed racial differences in program outcomes—finding herself mysteriously disinvited from a key advisory panel—to “the darling” a decade later when different political winds brought her work a fresh appraisal.
Institute Senior Research Economist Amanda Michaud placed her call on hold a few times before uprooting herself in 2021 to serve as a senior economist for the White House Council of Economic Advisers. “Initially I was skeptical about how much influence the council actually would have,” Michaud said. She felt uncertain “about whether I would be effective in the role, whether the people who were directing the organization would use me effectively.”
The pandemic shifted her calculation. “It was a crisis situation,” said Michaud. “The value of doing something well was very high, and the cost of doing something poorly was very high. There was less room for only doing things for the sake of politics. It was a very serious time.”
Michaud says she was “pleasantly surprised” by the earnest interest of officials in what economists had to offer, and by the ability of academic economists—including herself—to adapt to a new way of working.
She recalls assignments to analyze the impact of the Paycheck Protection Program and changes to the child tax credit. “If I was doing an academic study of this, it might take me a year. But at the council, you would have two days,” Michaud said. “My initial reaction was, of course, one of panic: How can I possibly be choosing something so important with so little time? But then I realized that I was actually very prepared to do this.”
Meeting the needs of policymakers required Michaud to apply her economic models more broadly than in the academic context. “Our quantitative models are built such that when [data] go into them, they’ll replicate what the world has looked like in the past.” But policymakers typically want an assessment of various present or future scenarios. “We have to have more of a loose-but-useful understanding of what those outcomes would look like in different situations,” said Michaud. “We’re not always so good at putting those in our papers, because they seem tangential to the main point. But they are very important for practical policymaking.”
Taking the long road
Harvard labor economist and Institute advisor Larry Katz can reminisce about daily pressures of working with the press and policymakers from his time as chief economist for the U.S. Department of Labor in the early ’90s. Since then, however, he has become a foremost practitioner of a different channel, where progress is measured over decades: working directly with nonprofits and government agencies to implement and test research-based ideas.
Katz—with Institute advisor Raj Chetty and former visiting scholar Nathaniel Hendren—is closely associated with the Moving to Opportunity (MTO) research, a randomized controlled trial that found young children in high-poverty neighborhoods went on to earn substantially more as adults when their parents were given a housing voucher and specialized counseling to move to a low-poverty neighborhood. Katz helped design the program while at the Labor Department, and in 1994 the U.S. Department of Housing and Urban Development (HUD) began the MTO study with 4,600 families in public housing.
This kind of longitudinal research requires a very long attention span. Earnings results for children in the MTO study did not emerge for 20 years. It takes still longer for institutions with capacity to act—public housing authorities, in this case—to assemble the will and funding to put the ideas into action, as Seattle and King County did when they launched the ongoing Creating Moves to Opportunity program in 2018 for subsidized housing residents.
And it was not until 2019—25 years after the research began—that Congress approved a “demonstration project” for HUD to advance the concept. The total expansion: 666 additional vouchers in nine cities, subject to another multiyear study. “Most public housing authorities in the U.S. don’t have serious housing mobility plans,” Katz said. “The hope is that these pilots and the continuing evidence will then diffuse, but that’s much easier said than done and takes years.”
In the meantime, Katz has become deeply involved in another area where he believes the mounting evidence should motivate wider action and where progress is a similarly slow burn: job training.
“We’ve had decades of research and evaluations of job-training programs, many that were quite disappointing,” said Katz. “But we did see kernels of programs that seemed to work better.” Rather than training workers simply to land a decent first job, he believes the evidence favors training people in need of new skills for sectors where the data show long-term prospects for a job ladder within the field.
“The fact that we’ve had evidence on some of the effectiveness of sectoral employment training programs for over a decade, and the movement is sort of just taking off, tells you how long and difficult it is,” he said. “We’re building up the evidentiary base, but in all these cases scale is still a huge issue. We have good examples of [programs] in the thousands, but not in the millions of participants.”
The risks and rewards of working with economists
Many large policymaking entities, including federal government agencies and the Federal Reserve System, have the counsel of in-house economists. That is not the case for many working on policy at the local and nonprofit level, where linking your fortunes to an academic economist comes with trade-offs.
The mission of the Boston-based nonprofit EMPath is to help people move themselves out of poverty. Using Katz’s team to implement and evaluate its new job-mentoring program—a partnership with the Boston Housing Authority called “AMP Up Boston”—means not everyone who wants the program’s help will get it. That’s because economists want to compare the outcomes of treatment groups (who get assistance) and control groups (who don’t).
“Having a control group is not something we love to do,” said Ruthie Liberman, EMPath vice president for public policy. “It’s a very hard thing for staff to do these orientation sessions—where everybody there wants to be in the program—knowing that only half are going to be selected.” Liberman says EMPath and the housing authority balked at the initial AMP Up study design, in which only 1 in 3 low-income participants would receive the full suite of services. As a result, the study was delayed while they raised more money to support a larger treatment group.
Partnering with economists for a credible academic assessment takes patience—a harder virtue to sustain when your instinct is to help as many people as possible. And when results begin to flow in four years, EMPath must brace for the economists’ honest assessment of its core job-mentoring platform. “It’s something I think a lot about, and I’m nervous for that,” said Liberman. “We’ve seen tremendous outcomes in the [previous] evaluations that we’ve done. … We expect good results. But there’s always the possibility they will just be kind of ‘meh.’”
Nonetheless, a credible randomized controlled trial is what EMPath needs to attract the funding required to scale up. “Having someone like Larry as our principal investigator, hopefully we’ll come out with strong results and join with him” to publicize them, Liberman said. “It’s that gravitas and it’s the gold seal of approval that’s necessary to indicate that you are evidence-based.”
Greg Russ, chair of the board of directors of the New York City Housing Authority, was instrumental in calling housing authorities’ attention to the MTO research. When someone mentioned the findings to him at a HUD meeting in 2015, Russ was leading the public housing authority in Cambridge, Massachusetts. Chetty, Hendren, and Katz were just down the road at Harvard; after tracking down their new paper, Russ reached out and set up an informal meeting. From there, he worked the phones to assemble more than a dozen leaders from other housing authorities for a conference with the economists.
To Russ, this kind of serendipitous connection is too ad hoc, and it happens too rarely. “We don’t have a good way of baking in the partnership of research and implementation,” Russ said. The list of obstacles is formidable. In addition to concerns about the fairness of control groups, studying program participants can seem invasive. Staff time and budgets are already stretched, and social service programs rarely come with funding and flexibility for testing new ideas. “I’d like to set aside housing vouchers for policy development and research,” Russ said, but that is not generally allowed under federal housing program rules.
Russ says these factors prevent policy practitioners from reaching out to academic collaborators. Meanwhile, economists can become enthralled with interventions that look good on paper, without an appreciation for the day-to-day challenges of program administrators and participants. “I think we underestimate, as program people, the power of some of these ideas,” said Russ. “I think on their side, they underestimate how difficult it is to run the program.”
Russ and Liberman credit Katz for establishing a template for academic-policy collaboration—a back-and-forth process with a long-term commitment. Katz considers interviews and focus groups essential to assuring his work has impact. “Sitting back in your office and looking at the data is incredibly important,” said Katz. “But we can actually talk to participants—figure out what’s motivating them, and how they understand what’s going on.”
Call it anecdotal evidence, but these qualitative findings shape the research design into something less likely to fall apart in practice. Personal stories, Katz said, also “become very valuable in being able to speak with policymakers.”
The power of (understandable) data
Nearly 40 years ago, Alice Rivlin worried that politics was turning off economists who might otherwise seek to make a difference. Today’s bitter partisanship and information overload arguably create an even rougher climate for research to cut through.
And even well-meaning government officials can struggle to make use of the information at their disposal. A recent experiment presented 192 high-ranking civil-service employees across 22 federal agencies with a test, of sorts. The officials received sample evidence for five hypothetical programs in their area of expertise and were asked to assess the value of each.
The study by Mattie Toma and Elizabeth Bell found the policymakers’ assessments were “markedly inelastic with respect to impact.” That’s a polite way of saying the actual evidence they were provided about the programs was largely irrelevant to the decisions they made. Follow-up surveys identified the primary barrier: “[Even] experts in these types of decisions,” the authors find, “place less weight on impact-relevant, evidence-based features of programs due to the cognitive complexity of the decision environment.”
However, the researchers conclude with some better news: The officials’ assessments improved when provided with simplified “decision aids” that boiled down the numbers and used presentation techniques borrowed from psychology and marketing. In terms of practical impact of research on policy, it seems the evidence is only as powerful as the tools we use to communicate it.
Institute economist Amanda Michaud sees signs of hope in a profession-wide shift to research driven by real-world data. “When I was in grad school, I think many of us imagined ourselves being theorists in the future,” Michaud said. Yet, “almost universally across my classmates, I’ve seen them move into more intensive data work.” This shift could improve the potential for economics to influence policy and public understanding. “The availability of data and the emphasis on data is something that makes our profession and our methodology easier to communicate to anybody,” Michaud said. “If you can show a result in the data, everybody can usually understand that.”
Princeton’s Janet Currie sees increasing barriers to data as a concurrent danger. Her prime example is a plan to suppress detailed data from the U.S. Census Bureau in light of fears that bad actors might exploit it to identify individuals. Dependable survey data is also under threat as Americans increasingly screen calls or decline to participate. Even as economists make strides in working with and communicating about data, Currie believes it is a pressing policy priority for economists “to educate the public about the importance of having good data.”
For economists who want to increase their odds of a wider impact, today’s world offers more outlets than ever to expound on their research, from #econtwitter to podcasts to policy-focused websites like VoxEU. Currie’s econ students want to make a difference with their work, and her advice to them is the same she follows herself: Work on something you care about. Learn to translate your ideas for a general audience. And be ready when you get the call.
“Don’t chase after the topic of the moment,” said Currie. “You don’t want to work on something that nobody’s ever going to care about, but there’s a broad set of topics that are going to come up one of these days. Pick one of those and work on it—so that when it comes up, you can be in a good position to say something.”
This article is featured in the Fall 2022 issue of For All, the magazine of the Opportunity & Inclusive Growth Institute.
Jeff Horwich is the senior economics writer for the Minneapolis Fed. He has been an economic journalist with public radio, a commissioned examiner for the Consumer Financial Protection Bureau, and director of policy and communications for the Minneapolis Public Housing Authority. He received his master’s degree in applied economics from the University of Minnesota.