Phenomenal World

September 1st, 2018


The Braid

CONSTRAINED POSSIBILITIES

On the relationship between academic economics and public policy

In a recent working paper, ELIZABETH POPP BERMAN discusses the interconnected fields of academic economics and public policy. The paper conceptualizes how academic ideas get translated into public policy, clarifying the relationship by describing different economic theories as offering different “affordances”:

"I borrow the concept of affordances, which has been used widely to describe how particular technologies proved the potential for some kinds of action but not others. I suggest that knowledge, like technologies, may afford some possibilities but not others. In particular, some theories produce knowledge that, simply because of the kind of knowledge it is, is useful and usable for particular actors in the policy field, while others, regardless of their truth or the accuracy with which they describe the world, do not.”

The paper also examines the gap between academic theory and policy application and includes takeaways for those interested in the role of academic experts in the process of policy creation:

"It is important to recognize the relative autonomy of the academic field from the policy field. While outside groups may support one school of thought or another, the development of academic disciplines is not determined solely by who has the most money, but also by stakes—including intellectual stakes—specific to the academic field. Similarly, while the academic and policy fields may be linked in ways that facilitate the transmission of people and ideas, the academic dominance of a particular approach does not translate to policy dominance, even given influential champions.”

Link to the full paper. h/t Michael

  • This work builds off a 2014 paper Berman co-authored with Daniel Hirschman, which also explores the degree to which economists and their tools and ideas influence and create policy. In a vein similar to the “affordances” concept, Berman and Hirschman argue that “economic style can shape how policymakers approach problems, even if they ignore the specific recommendations of trained economists.” Link.
  • A 2010 paper offers a new framework for assessing research impact, one that incorporates conventional citation data alongside more qualitative outputs. Link.

PRESCIENT HEGEMON

Branko Milanovic with a speculative paper on globalization from the turn of the millennium

Back in 1999, economist Branko Milanovic wrote a ("several times rejected") paper proposing three periods of globalization—the third being the present one—and the countervailing ideologies that sprang up to contest the first two. From the paper:

“We are currently standing at the threshold of the Third Globalization. [There have been three globalizations:] the Roman-led one of the 2nd-4th century, the British-led one of the 19th century, and the current one led by the United States. Each of them not only had a hegemon country but was associated with a specific ideology. However, in reaction to the dominant ideology and the effects of globalization (cultural domination, increasing awareness of economic inequities) an alternative ideology (in the first case, Christianity, in the second, Communism) sprang up. The alternative ideology uses the technological means supplied by the globalizers to subvert or attack the dominant ideological paradigm.”

Read the full paper here.

  • For more Milanovic on the politics of globalization, slides from a recent presentation of his on global inequality and its political consequences feature much of relevance to this vintage paper. Some of its broader questions: "Does global equality of opportunity matter? Is 'citizenship rent' morally acceptable? What is the 'optimal' global income distribution? Can something 'good' (global middle class) be the result of something 'bad' (shrinking of national middle classes and rising income inequality)? Are we back to Mandeville?" Link.

August 25th, 2018

A Ship So Big, A Bridge Cringes

SPATIAL PARAMETERS

On place-based and adaptable public policy

A recent report published by the BROOKINGS INSTITUTION discusses the potential effectiveness of place-based policies for strengthening the economies of depressed areas. Co-authored by Harvard’s BENJAMIN AUSTIN, EDWARD GLAESER, and LAWRENCE H. SUMMERS, the report argues that flexible, region-specific policies may do more than uniform national ones to address concentrated joblessness:

"Traditionally, economists have been skeptical towards [place-based] policies because of a conviction that relief is best targeted towards poor people not poor places, because incomes in poor areas were converging towards incomes in rich areas anyway, and because of fears that favoring one location would impoverish another. This paper argues for reconsidering place-based policies ...

Indeed, even the most diehard opponent of place-based redistribution should see the logic of tailoring Federal policies to local labor market conditions. Standard social policy rules, like the Bailey (1976)—Chetty (2006) formula for unemployment insurance, depend on parameters that differ across space. If non-employment is particularly harmful in one location and particularly sensitive to public policies, then that diehard could still support a place-based revenue-neutral twist that reallocates funds from benefits that subsidize not working to benefits that encourage employment, without encouraging migration or raising housing prices.”

Link to the full paper. The two main policy recommendations are an expanded EITC and subsidies for employment.
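
For readers unfamiliar with the formula the authors invoke, here is a minimal sketch of that kind of optimal-unemployment-insurance condition in one common static form (the notation is illustrative and not taken from the report): a benefit level b is optimal when the consumption-smoothing gain from a slightly higher benefit just offsets its moral-hazard cost.

```latex
% Static Baily-Chetty style condition (illustrative notation).
% e(b): employment rate; c_e, c_u: consumption when employed/unemployed;
% u: utility; b: benefit level financed by a budget-balancing tax on the employed.
\[
\frac{u'(c_u) - u'(c_e)}{u'(c_e)}
\;=\;
\frac{\varepsilon_{1-e,\,b}}{e},
\qquad
\varepsilon_{1-e,\,b} \equiv \frac{\partial(1-e)}{\partial b}\cdot\frac{b}{1-e}.
\]
```

Both sides are local objects: the consumption drop at job loss and the responsiveness of non-employment to benefits can differ sharply across labor markets, which is exactly the opening the authors exploit for a revenue-neutral, place-based "twist."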


August 18th, 2018

House Fronts

COMPENSATION TREATMENT

In Iran, cash transfers don't reduce labor supply

A new study examines the effects of Iran's changeover from energy subsidies to cash transfers. From the abstract, by DJAVAD SALEHI-ISFAHANI and MOHAMMAD H. MOSTAFAVI-DEHZOOEI of the ECONOMIC RESEARCH FORUM:

“This paper examines the impact of a national cash transfer program on labor supply in Iran. [...] We find no evidence that cash transfers reduced labor supply, in terms of hours worked or labor force participation. To the contrary, we find positive effects on the labor supply of women and self-employed men.”

Most recent version here. The ungated working paper is available here.

  • Another paper co-authored by Salehi-Isfahani further details the energy subsidies program and the role that cash transfers played in the reforms, with a specific focus on differences in take-up. Link.
  • We’ve previously shared work from Damon Jones and Ioana Marinescu on the Alaska Permanent Fund dividend, which found that “a universal and permanent cash transfer does not significantly decrease aggregate employment.” Link.
  • In other Basic Income news, petitions and protests are being organized in response to the cancellation of the Ontario pilot.

August 11th, 2018

Constellation

ALCHEMIST STOCK

Automation, employment, and capital investment

At his blog STUMBLING AND MUMBLING, CHRIS DILLOW discusses recent reporting on fears of rapid automation in the United Kingdom:

"'More than six million workers are worried their jobs could be replaced by machines over the next decade' says the Guardian. This raises a longstanding paradox – that, especially in the UK, the robot economy is much more discussed than observed.

What I mean is that the last few years have seen pretty much the exact opposite of this. Employment has grown nicely whilst capital spending has been sluggish. The ONS says that 'annual growth in gross fixed capital formation has been slowing consistently since 2014.' And the OECD reports that the UK has one of the lowest usages of industrial robots in the western world.

My chart, taken from the Bank of England and ONS, puts this into historic context. It shows that the gap between the growth of the non-dwellings capital stock and employment growth has been lower in recent years than at any time since 1945. The time to worry about machines taking people’s jobs was the 60s and 70s, not today.… If we looked only at the macro data, we’d fear that people are taking robots' jobs – not vice versa."

Link to the post.


August 4th, 2018

The Great Abundance

ENERGY BOOM

A new carbon tax proposal and a big new carbon tax research report

Representative Carlos Curbelo (R-FL) introduced a carbon tax bill in the House last week (though it is “sure to fail” in the current Congress, a carbon tax proposal from a Republican is unusual). According to Reuters, “Curbelo said the tax would generate $700 billion in revenue over a decade for infrastructure investments.” A deep analysis is available from the Center on Global Energy Policy at Columbia SIPA, which started up a Carbon Tax Initiative this year.

For a broader look at carbon taxes, earlier this month the Columbia initiative published a significant four-part series on the “economic, energy, and environmental implications of federal carbon taxes” (press release here).

The overview covers impacts on energy sources:

“The effects of a carbon tax on prices are largest for energy produced by coal, followed by oil, then natural gas, due to the difference in carbon intensity of each fuel. Every additional dollar per ton of the carbon tax increases prices at the pump by slightly more than one cent per gallon for gasoline and slightly less than one cent per gallon for diesel.”

And examines a few possible revenue uses:

“How the carbon tax revenue is used is the major differentiating factor in distributional outcomes. A carbon tax policy can be progressive, regressive, or neither.”

Overview here. Link to the report on energy and environmental implications; link to the report on distributional implications; link to the report on implications for the economy and household welfare.


July 28th, 2018

Jetty

BANKING AS ART

On the history of economists in central banks 

A recent paper by FRANÇOIS CLAVEAU and JÉRÉMIE DION applies quantitative methods to the historical study of central banks, demonstrating the transition of central banking from an "esoteric art" to a science, the growth of economics research within central banking institutions, and the corresponding rise in the dominance of central banks in the field of monetary economics. From the paper: 

"We study one type of organization, central banks, and its changing relationship with economic science. Our results point unambiguously toward a growing dominance of central banks in the specialized field of monetary economics. Central banks have swelling research armies, they publish a growing share of the articles in specialized scholarly journals, and these articles tend to have more impact today than the articles produced outside central banks."

Link to the paper, which contains a vivid 1929 dialogue between Keynes and Sir Ernest Musgrave Harvey of the Bank of England, who asserts, "It is a dangerous thing to start giving reasons."

h/t to the always-excellent Beatrice Cherrier, who highlighted this work in a brief thread and included some visualizations, among them one showing the publishing rate of central banking researchers.

  • Via both Cherrier and the paper, a brief Economist article on the significance of the central banking conference in Jackson Hole, hosted by the Federal Reserve Bank of Kansas City: "Davos for central bankers." Link. (And link to an official history of the conference.)
  • Another paper co-authored by Claveau looks at the history of specialties in economics, using quantitative methods to map the importance of sets of ideas through time. "Among our results, especially noteworthy are (1) the clear-cut existence of ten families of specialties, (2) the disappearance in the late 1970s of a specialty focused on general economic theory, (3) the dispersal of the econometrics-centered specialty in the early 1990s and the ensuing importance of specific econometric methods for the identity of many specialties since the 1990s, and (4) the low level of specialization of individual economists throughout the period in contrast to physicists as early as the late 1960s." Link.

July 21st, 2018

High Noon

ALTERNATIVE ACTUARY

History of risk assessment, and some proposed alternative methods

A 2002 paper by ERIC SILVER and LISA L. MILLER on actuarial risk assessment tools provides a history of statistical prediction in the criminal justice context, and issues cautions now central to contemporary conversations about algorithmic fairness:

"Much as automobile insurance policies determine risk levels based on the shared characteristics of drivers of similar age, sex, and driving history, actuarial risk assessment tools for predicting violence or recidivism use aggregate data to estimate the likelihood that certain strata of the population will commit a violent or criminal act. 

To the extent that actuarial risk assessment helps reduce violence and recidivism, it does so not by altering offenders and the environments that produced them but by separating them from the perceived law-abiding populations. Actuarial risk assessment facilitates the development of policies that intervene in the lives of citizens with little or no narrative of purpose beyond incapacitation. The adoption of risk assessment tools may signal the abandonment of a centuries-long project of using rationality, science, and the state to improve upon the social and economic progress of individuals and society."

Link to the paper.
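
As a concrete picture of the kind of actuarial tool Silver and Miller describe (aggregate historical data used to score strata of a population), here is a minimal sketch; the features, data, and model choice are hypothetical and do not represent any specific instrument:

```python
# Minimal sketch of an actuarial risk score: fit a regression on historical
# aggregate data, then assign new individuals the base rate of their stratum.
# All features, data, and the model choice here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [age, prior_arrests], plus whether rearrest occurred.
X = np.array([[19, 2], [23, 0], [31, 4], [45, 1], [52, 0], [28, 3]])
y = np.array([1, 0, 1, 0, 0, 1])

model = LogisticRegression().fit(X, y)

# The "score" for a new person is the predicted group-level likelihood,
# derived entirely from how similar people behaved in the past.
new_person = np.array([[25, 1]])
risk = model.predict_proba(new_person)[0, 1]
print(f"estimated risk score: {risk:.2f}")
```

Nothing in such a score models interventions or causes; it simply sorts people by how others with similar characteristics behaved in the past, which is the source of the authors' worry about incapacitation becoming the default policy response.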

A more recent paper, presented at FAT* in 2018 and co-authored by CHELSEA BARABAS, KARTHIK DINAKAR, JOICHI ITO, MADARS VIRZA, and JONATHAN ZITTRAIN, makes several arguments reminiscent of Silver and Miller's work. The authors argue in favor of a causal inference framework for risk assessments, one aimed at the question of which interventions work:

"We argue that a core ethical debate surrounding the use of regression in risk assessments is not simply one of bias or accuracy. Rather, it's one of purpose.… Data-driven tools provide an immense opportunity for us to pursue goals of fair punishment and future crime prevention. But this requires us to move away from merely tacking on intervenable variables to risk covariates for predictive models, and towards the use of empirically-grounded tools to help understand and respond to the underlying drivers of crime, both individually and systemically."

Link to the paper.
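
And as a minimal sketch of the intervention-centered question Barabas et al. favor, here is the same setting reframed: instead of predicting rearrest from covariates, estimate the effect of a randomly assigned program on rearrest. The program, data, and effect sizes below are simulated purely for illustration.

```python
# Minimal sketch of the intervention-centered question: with random assignment,
# the difference in mean outcomes estimates the average treatment effect.
# The program, data, and effect sizes are all simulated/hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
treated = rng.integers(0, 2, size=n)  # random assignment to a hypothetical program
# Simulated outcomes: 30% baseline rearrest rate; the program lowers it to 22%.
rearrest = rng.random(n) < np.where(treated == 1, 0.22, 0.30)

# Difference in means = estimated average treatment effect under randomization.
ate = rearrest[treated == 1].mean() - rearrest[treated == 0].mean()
print(f"estimated effect of the program on rearrest rate: {ate:+.3f}")
```

The contrast is in what the number is for: a risk score ranks people for incapacitation, while an effect estimate tells policymakers whether a given intervention reduces the underlying behavior.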

  • In his 2007 book Against Prediction, lawyer and theorist Bernard Harcourt provided detailed accounts and critiques of the use of actuarial methods throughout the criminal legal system. In place of prediction, Harcourt proposes a conceptual and practical alternative: randomization. From a 2005 paper on the same topic: "Instead of embracing the actuarial turn in criminal law, we should rather celebrate the virtues of the random: randomization, it turns out, is the only way to achieve a carceral population that reflects the offending population. As a form of random sampling, randomization in policing has significant positive value: it reinforces the central moral intuition in the criminal law that similarly situated individuals should have the same likelihood of being apprehended if they offend—regardless of race, ethnicity, gender or class." Link to the paper. (And link to another paper of Harcourt's in the Federal Sentencing Reporter, "Risk as a Proxy for Race.")
  • A recent paper by Megan Stevenson assesses risk assessment tools: "Despite extensive and heated rhetoric, there is virtually no evidence on how use of this 'evidence-based' tool affects key outcomes such as incarceration rates, crime, or racial disparities. The research discussing what 'should' happen as a result of risk assessment is hypothetical and largely ignores the complexities of implementation. This Article is one of the first studies to document the impacts of risk assessment in practice." Link.
  • A compelling piece of esoterica cited in Harcourt's book: a doctoral thesis by Deborah Rachel Coen on the "probabilistic turn" in 19th-century imperial Austria. Link.

July 14th, 2018

Traveling Light

DATA IS NONRIVAL

Considerations on data sharing and data markets 

CHARLES I. JONES and CHRISTOPHER TONETTI contribute to the “new but rapidly-growing field” known as the economics of data:

“We are particularly interested in how different property rights for data determine its use in the economy, and thus affect output, privacy, and consumer welfare. The starting point for our analysis is the observation that data is nonrival. That is, at a technological level, data is not depleted through use. Most goods in economics are rival: if a person consumes a kilogram of rice or an hour of an accountant’s time, some resource with a positive opportunity cost is used up. In contrast, existing data can be used by any number of firms or people simultaneously, without being diminished. Consider a collection of a million labeled images, the human genome, the U.S. Census, or the data generated by 10,000 cars driving 10,000 miles. Any number of firms, people, or machine learning algorithms can use this data simultaneously without reducing the amount of data available to anyone else. The key finding in our paper is that policies related to data have important economic consequences.”

After modeling a few different data-ownership possibilities, the authors conclude, “Our analysis suggests that giving the data property rights to consumers can lead to allocations that are close to optimal.” Link to the paper.
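
The nonrivalry point has a simple formal expression. As a stylized illustration (not the authors' exact specification), compare a rival input like labor, which must be divided across firms, with a dataset that every firm can use in full:

```latex
% Stylized illustration of nonrivalry (hypothetical functional form).
% N firms each combine a share of a rival input L with the *same* dataset D:
%   y_i = (L/N)^{1-\eta} D^{\eta},  with 0 < \eta < 1.
\[
Y \;=\; \sum_{i=1}^{N} \Big(\tfrac{L}{N}\Big)^{1-\eta} D^{\eta}
  \;=\; N^{\eta}\, L^{1-\eta} D^{\eta}.
\]
% The rival input is used up as it is split across firms, but aggregate output
% still rises with N because the same data enters every firm's production
% undiminished.
```

That gain from wider use is roughly the allocation question the paper analyzes: whether data ends up hoarded within a single firm or used broadly has real output consequences.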

  • Jones and Tonetti cite an influential 2015 paper by Alessandro Acquisti, Curtis R. Taylor, and Liad Wagman on “The Economics of Privacy”: “In digital economies, consumers' ability to make informed decisions about their privacy is severely hindered, because consumers are often in a position of imperfect or asymmetric information regarding when their data is collected, for what purposes, and with what consequences.” Link.
  • For more on data populi, Ben Tarnoff has a general-interest overview in Logic Magazine, including mention of the data dividend and a comparison to the Alaska Permanent Fund. Tarnoff uses the oil industry as an analogy throughout: “In the oil industry, companies often sign ‘production sharing agreements’ (PSAs) with governments. The government hires the company as a contractor to explore, develop, and produce the oil, but retains ownership of the oil itself. The company bears the cost and risk of the venture, and in exchange receives a portion of the revenue. The rest goes to the government. Production sharing agreements are particularly useful for governments that don’t have the machinery or expertise to exploit a resource themselves.” Link.

July 7th, 2018

Quodlibet

EVIDENCE PUZZLES

The history and politics of RCTs 

In a 2016 working paper, JUDITH GUERON recounts and evaluates the history of randomized controlled trials (RCTs) in the US, drawing on her own experience developing welfare experiments at MDRC and HHS:

“To varying degrees, the proponents of welfare experiments at MDRC and HHS shared three mutually reinforcing goals. The first was to obtain reliable and—given the long and heated controversy about welfare reform—defensible evidence of what worked and, just as importantly, what did not. Over a pivotal ten years from 1975 to 1985, these individuals became convinced that high-quality RCTs were uniquely able to produce such evidence and that there was simply no adequate alternative. Thus, their first challenge was to demonstrate feasibility: that it was ethical, legal, and possible to implement this untried—and at first blush to some people immoral—approach in diverse conditions. The other two goals sprang from their reasons for seeking rigorous evidence. They were not motivated by an abstract interest in methodology or theory; they wanted to inform policy and make government more effective and efficient. As a result, they sought to make the body of studies useful, by assuring that it addressed the most significant questions about policy and practice, and to structure the research and communicate the findings in ways that would increase the potential that they might actually be used." 


June 30th, 2018

The Duel

CLIMATE PREDICTION MARKET

How to link a carbon tax to climate forecasting

A 2011 paper by SHI-LING HSU suggests a way of using a carbon tax to generate more accurate predictions of future climate conditions:

“The market for tradable permits to emit in the future is essentially a prediction market for climate outcomes. And yet, unlike prediction markets that have been operated or proposed thus far, this prediction market for climate outcomes operates against the backdrop of an actual and substantial tax liability. Whereas prediction markets have heretofore largely involved only recreational trading, this prediction market will operate against a regulatory backdrop and thus will provide much stronger incentives for traders to acquire and trade on information.”

Link to the full paper.
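
To see the logic of the mechanism, consider a stylized two-outcome version (an illustration, not Hsu's specific design). Suppose the legislated future tax per ton will be high if observed warming exceeds a threshold and low otherwise; a tradable permit exempting one future ton then trades near its expected liability value, so its price reveals the market's implied climate forecast.

```latex
% Stylized two-outcome illustration (hypothetical parameters).
% T_H: future tax per ton if warming exceeds a threshold; T_L: tax otherwise.
% p: market-implied probability of the high-warming outcome.
% Ignoring discounting and risk premia, a permit for one future ton trades near
\[
P \;\approx\; p\,T_H + (1-p)\,T_L
\quad\Longrightarrow\quad
p \;\approx\; \frac{P - T_L}{T_H - T_L},
\]
% so observed permit prices back out an implied climate forecast, and the tax
% liability gives traders a real financial stake in getting p right.
```

Cowen's skepticism, noted below, concerns the settlement ingredient: whether a "basket" measure of climate outcomes exists that is informative without being swamped by short-term volatility.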

A 2018 paper by GARY LUCAS and FELIX MORMANN suggests using similar predictions for climate policies beyond carbon taxes:

“We explain how both the federal and state governments could use prediction markets to help resolve high-profile controversies, such as how best to allocate subsidies to promote clean technology innovation and which policy strategy promises the greatest reduction in carbon emissions.” 

Link to their paper.

  • In 2016, a group of researchers modeled the way that information would converge in a climate prediction market, and found “market participation causes most traders to converge quickly toward believing the ‘true’ climate model, suggesting that a climate market could be useful for building public consensus.” Link.
  • Tyler Cowen wrote about Hsu’s paper in 2011: “I think of such fine-tuning as a misguided approach. Is there such a good ‘basket’ measure of climate outcomes with sufficiently low short-term volatility?” Link.
  • A 2017 paper by Michael Thicke makes a similar point about prediction markets for science generally: “Prediction markets for science could be uninformative or deceptive because scientific predictions are often long-term, while prediction markets perform best for short-term questions.” Link.