↳ Evidence

July 1st, 2019

Quandary

HOW RESEARCH AFFECTS POLICY

Results from Brazil

How can evidence inform the decisions of policymakers? What value do policymakers ascribe to academic research? In January, we highlighted Yale's Evidence in Practice project, which emphasizes the divergence between policymakers' needs and researchers' goals. Other work describes the complexity of getting evidence into policy. A new study by JONAS HJORT, DIANA MOREIRA, GAUTAM RAO, and JUAN FRANCISCO SANTINI surprises because of the simplicity of its results: policymakers in Brazilian cities and towns are willing to pay for evidence, and willing to implement an evidence-based policy (a low-cost program of mailing reminder letters to taxpayers). The lack of uptake may stem more from a lack of information than from a lack of interest: "Our findings make clear that it is not the case, for example, that counterfactual policies' effectiveness is widely known 'on the ground,' nor that political leaders are uninterested in, unconvinced by, or unable to act on new research information."

From the abstract:

"In one experiment, we find that mayors and other municipal officials are willing to pay to learn the results of impact evaluations, and update their beliefs when informed of the findings. They value larger-sample studies more, while not distinguishing on average between studies conducted in rich and poor countries. In a second experiment, we find that informing mayors about research on a simple and effective policy (reminder letters for taxpayers) increases the probability that their municipality implements the policy by 10 percentage points. In sum, we provide direct evidence that providing research information to political leaders can lead to policy change. Information frictions may thus help explain failures to adopt effective policies."

Link to the paper.

  • New work from Larry Orr et al. addresses the question of how to take evidence from one place (or several places) and make it useful to another. "[We provide] the first empirical evidence of the ability to use multisite evaluations to predict impacts in individual localities—i.e., the ability of 'evidence‐based policy' to improve local policy." Link.
  • Cited within the Hjort et al. paper is research from Eva Vivalt and Aidan Coville on how policymakers update their prior beliefs when presented with new evidence. "We find evidence of 'variance neglect,' a bias similar to extension neglect in which confidence intervals are ignored. We also find evidence of asymmetric updating on good news relative to one's prior beliefs. Together, these results mean that policymakers might be biased towards those interventions with a greater dispersion of results." Link.
  • From David Evans at CGDev: "'The fact that giving people information does not, by itself, change how they act is one of the most firmly established in social science.' So stated a recent op-ed in the Washington Post. That’s not true. Here are ten examples where simply providing information changed behavior." Link. ht The Weekly faiV.
  • For another iteration of the question of translating evidence into policy, see our February letter on randomized controlled trials. Link.