April 29th, 2019

Green Power

MODELED WISDOM

Modeling policy levers for housing affordability in urban centers

In nearly every major urban center, housing affordability is in crisis. Since the 1960s, median home value has risen by 112% across the country, while median owner incomes have risen just 50%. For renters, especially since 2008, the problem is increasingly acute: nearly half of renters (over 20 million people) spend over 30% of their income on rent.
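
The 30% threshold is the standard cutoff for calling a household "rent-burdened." A minimal sketch of how that share is computed, using made-up households rather than the survey data behind the statistic:

```python
# Rent burden: a household is "rent-burdened" if housing costs exceed
# 30% of gross income. The households below are illustrative only.
households = [
    {"income": 40_000, "annual_rent": 13_200},  # 33% of income -> burdened
    {"income": 65_000, "annual_rent": 15_600},  # 24% of income -> not burdened
    {"income": 28_000, "annual_rent": 12_000},  # ~43% of income -> burdened
]

burdened = [h for h in households if h["annual_rent"] / h["income"] > 0.30]
print(f"Rent-burdened share: {len(burdened) / len(households):.0%}")  # 67%
```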

In New York City, nearly two-thirds of all residents are renters (half of whom are rent-burdened), and the politics of housing policy remain correspondingly fraught. In a recent paper, researchers JACK FAVILUKIS, PIERRE MABILLE, and STIJN VAN NIEUWERBURGH at Columbia Business School develop a dynamic stochastic spatial equilibrium model to quantify the welfare implications of various policy tools. Calibrating the model to New York City, the authors examine the interactions between funding and affordability policies to chart a possible path forward. From the paper:

"Policy makers are under increasing pressure to improve affordability. They employ four broad categories of policy tools: rent control (RC), zoning policies, housing vouchers, and tax credits for developers. Each policy affects the quantity and price of owned and rented housing and its spatial distribution. It affects incentives to work, wages, commuting patterns, and ultimately output. Each policy affects wealth inequality in the city and in each of its neighborhoods.

While there is much work, both empirical and theoretical, on housing affordability, what is missing is a general equilibrium model that quantifies the impact of such policies on prices and quantities, on the spatial distribution of households, on income inequality within and across neighborhoods, and ultimately on individual and city-wide welfare. Consistent with conventional wisdom, increasing the housing stock in the urban core by relaxing zoning regulations is welfare improving. Contrary to conventional wisdom, increasing the generosity of the rent control or housing voucher systems is welfare increasing."

Link to the paper, and link to a press release from Columbia Business School.

  • Data for Progress analyzed housing proposals from the leading 2020 candidates. Link to the reports, and link to an updated version of Senator Warren's proposal.
  • A report from last spring by authors Peter Gowan and Ryan Cooper advocates for an across-the-board expansion of social housing in the United States. Link.
  • Tangentially related, a JFI letter from last year highlighted thinking and proposals around the implementation of a land value tax. Link.
⤷ Full Article

April 13th, 2019

Assemblies

FISSURED CHURN

Reexamining claims about automation and labor displacement

Current UBI discussions emerged out of concerns over the role of human beings in a machine-dominated labor market. In 2013, a paper by Oxford University professors Carl Benedikt Frey and Michael Osborne claimed that 47% of US jobs were at risk of automation over the long term. The statistic circulated widely, prompting fears of widespread unemployment. The debate over these predictions is complex: those who deny any threat from automation often point to near-full employment but risk overlooking the proliferation of low-paying and precarious jobs, while those who forecast mass unemployment risk assuming that technological development necessarily leads to labor displacement.

In a 2018 paper, legal scholar BRISHEN ROGERS argues that fears of a robot takeover misapprehend the real dynamics in the labor market:

"In a period of technological upswing, with companies rapidly installing robotics and other automation devices, we would also see significant increases in labor productivity. In fact, productivity growth has recently been the slowest as at any time since World War II. What’s more, productivity change in the manufacturing sector—where automation is easiest—has been especially tepid lately, at 0.7 percent over the last decade. On a related note, levels of 'occupational churn,' or the net creation of jobs in growing occupations and loss of jobs in declining occupations, are also at historic lows.

Even more striking, if firms expected artificial intelligence to be a major source of productivity in the near future, they would surely be investing in information technology and intellectual property. But they aren’t. Computers and software constituted 13.5 percent of the value of companies’ investments from 2000-2007, as the internet was coming into wide use. Over the last decade, that rate declined to 4.8 percent. These differences strongly suggest that there is nothing inevitable about precarious work or economic inequality. Hotel work, food services, janitorial work, and retail work have become precarious over the past twenty years because companies in those sectors forcibly de-unionized and/or 'fissured' away their workers to subcontractors or franchisors, thereby denying them effective access to many legal rights."

Link to the paper.
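
For readers unfamiliar with the "occupational churn" measure Rogers cites, a minimal sketch of one common way it is computed: add up jobs gained in growing occupations and jobs lost in shrinking ones, then scale by total employment. The occupation counts below are toy numbers, not the historical series behind the claim.

```python
# Toy occupational employment counts at two dates (thousands of jobs).
# Churn = (jobs gained in growing occupations + jobs lost in shrinking ones)
# as a share of total employment. Illustrative numbers only.
employment_t0 = {"retail": 500, "software": 200, "manufacturing": 400, "health": 300}
employment_t1 = {"retail": 480, "software": 260, "manufacturing": 350, "health": 340}

gains = sum(max(employment_t1[o] - employment_t0[o], 0) for o in employment_t0)
losses = sum(max(employment_t0[o] - employment_t1[o], 0) for o in employment_t0)
churn_rate = (gains + losses) / sum(employment_t1.values())

print(f"Occupational churn: {churn_rate:.1%}")  # ~11.9% in this toy example
```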

  • An MIT Technology Review article from 2018 surveyed the predictions of every paper published on job losses due to automation. The results: "There is really only one meaningful conclusion: we have no idea how many jobs will actually be lost to the march of technological progress." Link.
  • "...even those occupations which are contracting due to technological change will continue to provide plenty of job openings over the next two decades. The challenge lies in improving the quality of these jobs going forward." Paul Osterman anticipates Rogers' arguments in a column from 2017. Link.
  • Another recent paper by Brishen Rogers (to which we previously linked) continues the thread: "Based on a detailed review of the capacities of existing technologies, automation is not a major threat to workers today, and it will not likely be a major threat anytime soon." Link.
  • Daron Acemoglu and Pascual Restrepo published two papers on automation and employment: the first uses industry-level data to observe changes in the task content of production; the second argues that automation has been primarily concerned with reducing the need for labor, with insufficient attention paid to socially productive investment. Link to the first, link to the second.
  • Frank Levy on the relationship between automation-induced job losses and the rebirth of populist politics. Link.
  • From EconFIP, a research brief on automation, AI, and the labor share. Link.
⤷ Full Article

January 19th, 2019

Self-Portrait

PLENARY CONFIDENCE

A look at China's social credit system

In a recent newsletter, we noted a spate of reporting drawing attention to the authoritarianism of China's growing Social Credit System. This week, we are sharing a paper by YU-JIE CHEN, CHING-FU LIN, and HAN-WEI LIU that casts light on the details of the program's workings, corrects common misconceptions, proposes some likely and disturbing future scenarios, and offers a useful frame for understanding the significant shift it is bringing about in Chinese governance.

"A new mode of governance is emerging with the rise of China’s 'Social Credit System' (shehui xinyong zhidu) or SCS. The SCS is an unusual, comprehensive governance regime designed to tackle challenges that are commonly seen as a result of China’s 'trustless' society that has featured official corruption, business scandals and other fraudulent activities. The operation of the SCS relies on a number of important and distinctive features—information gathering, information sharing, labeling, and credit sanctions—which together constitute four essential elements of the system.

In our view, the regime of the SCS reflects what we call the 'rule of trust,' which has significant implications for the legal system and social control in China. We define the 'rule of trust' as a governance mode that imposes arbitrary restrictions—loosely defined and broadly interpreted trust-related rules—to condition, shape, and compel the behavior of the governed subjects… The 'rule of trust' is in fact undermining 'rule of law.'

In the context of governance, the unbounded notion of 'trust' and the unrestrained development of technology are a dangerous combination."

Link to the paper.

⤷ Full Article

December 1st, 2018

Energy Field

GREEN INFLUENCE

A discussion of different approaches to climate policy

Last week, the U.S. government released the Fourth National Climate Assessment, which outlined the dire economic and environmental consequences of climate change. Instead of highlighting key findings of the report—two good summaries are available here and here—we'll contextualize the current climate debate within legal history, which shows the limitations of current economically focused arguments for climate policy.

A 2010 Yale Law Journal article by Jedediah Purdy situates the current climate debate within the long tradition of political argument about the natural world, and challenges assumptions that environmental values which appeal to moral and civic duty are too weak and vague to spur political action. In fact, Purdy argues that major environmental legislation emerged from "democratic argument over the value of the natural world and its role in competing ideas of citizenship, national purpose, and the role and scale of government." Purdy does more than argue that environmental public language is more coherent than conventionally understood; he argues that understanding climate policy through economic self-interest diminishes the role political struggle plays in shaping national values and interests:

"Consider one example that makes little sense through the lens of narrow self-interest, much more as part of an ongoing debate over environmental values: the organizing project that has led 1015 city governments to adopt the goals of the Kyoto Protocol (a seven percent reduction in greenhouse-gas emissions from 1990 levels by 2012) through an instrument called the Mayors Climate Protection Agreement. Since the costs are not zero, and the benefits, in theory, are almost exactly that, the question of motivation is still fairly sharply presented.

...

In private interviews and public statements, city officials explain their efforts in several ways. They are quick to cite the advantage certain regions hope to enjoy from early adoption and manufacture of technologies that may later become standard. They embrace a simple public-choice motive: city governments hope to benefit from green-development block grants and, in the longer term, density-friendly economic development, and early efforts may position them to do both.

They also regard themselves as engaged in political persuasion that they hope will induce others to take similar action. Whether this is plausible is partly endogenous to the politics itself. This politics seeks to affect the reasons—specifically those grounded in environmental values—that people understand themselves to have for joining collective undertakings. Rather than a specimen of an independently established logic of collective action, it is an engagement with that logic itself."

Link to full paper.

  • In a 2018 article, Purdy looks more deeply at the history of environmental justice, and why its concerns were left out of mainstream environmental law: "Mainstream environmental law was the last major legal product of 'the great exception,' the decades of the mid-twentieth century when, unlike any other time in modern history, economic inequality was declining and robust growth was widely shared." Link.

  • A 2017 dissertation examines the environment as an object of politics, as opposed to natural capital, and argues that the environment is a "political problem that entails ongoing negotiations over the legitimacy of market rule, the role of the state in relation to the market, and the value of ecological stewardship." Link.

  • The new climate reports have brought attention back to solar geoengineering, which the Guardian, covering a Gernot Wagner paper, notes is extremely inexpensive and possibly an option for desperate circumstances: “The IPCC [Intergovernmental Panel on Climate Change] report said geoengineering might be adopted as a temporary ‘remedial measure’ in extreme circumstances.” Link.

⤷ Full Article

October 6th, 2018

Earth Men

HARD CAPS

Economic growth vs. natural resources

A recent Foreign Policy op-ed by JASON HICKEL examines “green growth,” a policy agenda that calls for the absolute decoupling of GDP from the total use of natural resources. Hickel synthesizes three studies and explains that even in high-efficiency scenarios, continued economic growth makes unsustainable depletion of natural resources (fossil fuels, minerals, livestock, forests, etc.) unavoidable.

“Study after study shows the same thing. Scientists are beginning to realize that there are physical limits to how efficiently we can use resources. Sure, we might be able to produce cars and iPhones and skyscrapers more efficiently, but we can’t produce them out of thin air. We might shift the economy to services such as education and yoga, but even universities and workout studios require material inputs. Once we reach the limits of efficiency, pursuing any degree of economic growth drives resource use back up.”
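
The arithmetic behind this point is simple: if resource use is output divided by resource efficiency, then resource use keeps rising whenever GDP grows faster than efficiency improves. A minimal sketch under assumed growth rates (illustrative figures, not values from the studies Hickel synthesizes):

```python
# Resource use = GDP / efficiency. Absolute decoupling requires efficiency
# to improve faster than GDP grows, every year, indefinitely.
# The growth rates below are assumptions chosen for illustration.
gdp_growth = 0.03          # 3% annual GDP growth
efficiency_growth = 0.02   # 2% annual gain in output per unit of resources

gdp = efficiency = 1.0
for year in range(50):
    gdp *= 1 + gdp_growth
    efficiency *= 1 + efficiency_growth

resource_use = gdp / efficiency
# Despite continuous efficiency gains, resource use still rises ~63% in 50 years.
print(f"Resource use after 50 years: {resource_use:.2f}x the starting level")
```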

The op-ed sparked debate about the state of capitalism in the current climate crisis, most notably in a Bloomberg op-ed by NOAH SMITH, who claims that Hickel is a member of “a small but vocal group of environmentalists telling us that growth is no longer possible—that unless growth ends, climate change and other environmental impacts will destroy civilization.” Though Smith’s op-ed doesn’t directly engage with many of Hickel’s points, his general position prompted a clarifying (and heated) response from Hickel:

“Noah is concerned that if we were to stop global growth, poor countries would be ‘stuck’ at their present level of poverty. But I have never said that poor countries shouldn’t grow—nor has anyone in this field of study (which Noah would know had he read any of the relevant literature). I have simply said that we can’t continue with aggregate global growth.

...
While poor countries may need some GDP growth, that should never—for any nation, rich or poor—be the objective as such. The objective should be to improve human well-being: better health, better education, better housing, happiness, etc. The strategy should be to target these things directly. To the extent that achieving these goals entails some growth, so be it. But that’s quite different from saying that GDP needs to grow forever.”

  • From a study on the limits of green growth: “GDP cannot be decoupled from growth in material and energy use. It is therefore misleading to develop growth-oriented policy around the expectation that decoupling is possible. GDP is increasingly seen as a poor proxy for societal wellbeing. Society can sustainably improve wellbeing, including the wellbeing of its natural assets, but only by discarding GDP growth as the goal in favor of more comprehensive measures of societal wellbeing.” Link.
  • In a recent article, Juan Moreno-Cruz, Katharine L. Ricke, and Gernot Wagner discuss ways to approach the climate crisis and argue that “mitigation (the reduction of carbon dioxide and other greenhouse gas emissions at the source) is the only prudent response.” Link.
⤷ Full Article

August 4th, 2018

The Great Abundance

ENERGY BOOM

A new carbon tax proposal and a big new carbon tax research report

Representative Carlos Curbelo (R-FL) introduced a carbon tax bill to the House last week (though it is “sure to fail” under the current government, it's unusual to see a carbon tax proposed by a Republican). According to Reuters, “Curbelo said the tax would generate $700 billion in revenue over a decade for infrastructure investments.” A deep analysis is available from The Center on Global Energy Policy at Columbia SIPA, which launched a Carbon Tax Initiative this year.

For a broader look at carbon taxes, earlier this month the Columbia initiative published a significant four-part series on the “economic, energy, and environmental implications of federal carbon taxes” (press release here).

The overview covers impacts on energy sources:

“The effects of a carbon tax on prices are largest for energy produced by coal, followed by oil, then natural gas, due to the difference in carbon intensity of each fuel. Every additional dollar per ton of the carbon tax increases prices at the pump by slightly more than one cent per gallon for gasoline and slightly less than one cent per gallon for diesel.”
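
As a rough back-of-the-envelope check on what a per-ton tax implies at the pump: the price increase per gallon is the tax rate times the fuel's combustion CO2 per gallon, before any behavioral or supply-side adjustments. A minimal sketch (the emission factor below is a round illustrative value, not a figure from the report, which models pass-through in more detail):

```python
def pump_price_increase_cents(tax_usd_per_ton_co2, kg_co2_per_gallon):
    """Approximate per-gallon price increase (cents) from a carbon tax,
    assuming the tax on combustion emissions is fully passed through."""
    return tax_usd_per_ton_co2 * kg_co2_per_gallon / 1000 * 100

# Example: a $50/ton tax on a fuel emitting ~10 kg of CO2 per gallon
# adds roughly 50 cents per gallon at the pump.
print(f"{pump_price_increase_cents(50, 10.0):.0f} cents per gallon")
```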

And examines a few possible revenue uses:

“How the carbon tax revenue is used is the major differentiating factor in distributional outcomes. A carbon tax policy can be progressive, regressive, or neither.”

Overview here. Link to report on energy and environmental implications; link to report on distributional implications; link to report on implications for the economy and household welfare.

⤷ Full Article

July 14th, 2018

Traveling Light

DATA IS NONRIVAL

Considerations on data sharing and data markets 

CHARLES I. JONES and CHRISTOPHER TONETTI contribute to the “new but rapidly-growing field” known as the economics of data:

“We are particularly interested in how different property rights for data determine its use in the economy, and thus affect output, privacy, and consumer welfare. The starting point for our analysis is the observation that data is nonrival. That is, at a technological level, data is not depleted through use. Most goods in economics are rival: if a person consumes a kilogram of rice or an hour of an accountant’s time, some resource with a positive opportunity cost is used up. In contrast, existing data can be used by any number of firms or people simultaneously, without being diminished. Consider a collection of a million labeled images, the human genome, the U.S. Census, or the data generated by 10,000 cars driving 10,000 miles. Any number of firms, people, or machine learning algorithms can use this data simultaneously without reducing the amount of data available to anyone else. The key finding in our paper is that policies related to data have important economic consequences.”

After modeling a few different data-ownership possibilities, the authors conclude, “Our analysis suggests that giving the data property rights to consumers can lead to allocations that are close to optimal.” Link to the paper.

  • Jones and Tonetti cite an influential 2015 paper by Alessandro Acquisti, Curtis R. Taylor, and Liad Wagman on “The Economics of Privacy”: “In digital economies, consumers' ability to make informed decisions about their privacy is severely hindered, because consumers are often in a position of imperfect or asymmetric information regarding when their data is collected, for what purposes, and with what consequences.” Link.
  • For more on data populi, Ben Tarnoff has a general-interest overview in Logic Magazine, including mention of the data dividend and a comparison to the Alaska Permanent Fund. Tarnoff uses the oil industry as an analogy throughout: “In the oil industry, companies often sign ‘production sharing agreements’ (PSAs) with governments. The government hires the company as a contractor to explore, develop, and produce the oil, but retains ownership of the oil itself. The company bears the cost and risk of the venture, and in exchange receives a portion of the revenue. The rest goes to the government. Production sharing agreements are particularly useful for governments that don’t have the machinery or expertise to exploit a resource themselves.” Link.
⤷ Full Article

July 7th, 2018

Quodlibet

EVIDENCE PUZZLES

The history and politics of RCTs 

In a 2016 working paper, JUDITH GUERON recounts and evaluates the history of randomized controlled trials (RCTs) in the US, drawing on her own experience developing welfare experiments at MDRC and HHS:

“To varying degrees, the proponents of welfare experiments at MDRC and HHS shared three mutually reinforcing goals. The first was to obtain reliable and—given the long and heated controversy about welfare reform—defensible evidence of what worked and, just as importantly, what did not. Over a pivotal ten years from 1975 to 1985, these individuals became convinced that high-quality RCTs were uniquely able to produce such evidence and that there was simply no adequate alternative. Thus, their first challenge was to demonstrate feasibility: that it was ethical, legal, and possible to implement this untried—and at first blush to some people immoral—approach in diverse conditions. The other two goals sprang from their reasons for seeking rigorous evidence. They were not motivated by an abstract interest in methodology or theory; they wanted to inform policy and make government more effective and efficient. As a result, they sought to make the body of studies useful, by assuring that it addressed the most significant questions about policy and practice, and to structure the research and communicate the findings in ways that would increase the potential that they might actually be used." 

⤷ Full Article

June 30th, 2018

The Duel

CLIMATE PREDICTION MARKET

How to link a carbon tax to climate forecasting

A 2011 paper by SHI-LING HSU suggests a way of using a carbon tax to generate more accurate predictions of future climate conditions:

“The market for tradable permits to emit in the future is essentially a prediction market for climate outcomes. And yet, unlike prediction markets that have been operated or proposed thus far, this prediction market for climate outcomes operates against the backdrop of an actual and substantial tax liability. Whereas prediction markets have heretofore largely involved only recreational trading, this prediction market will operate against a regulatory backdrop and thus will provide much stronger incentives for traders to acquire and trade on information.”

Link to the full paper.
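
A minimal sketch of the intuition (a toy illustration, not Hsu's actual mechanism design): if a future permit exempts its holder from a tax whose rate depends on which climate outcome is eventually realized, a risk-neutral trader values the permit at the probability-weighted expected tax, so observed permit prices reveal the market's implied beliefs about warming.

```python
# Toy permit valuation under different beliefs about future warming.
# Scenario tax rates and probabilities are assumptions for illustration only.
tax_if_outcome = {"low_warming": 20.0, "mid_warming": 50.0, "high_warming": 120.0}  # $/ton

def permit_value(beliefs):
    """Expected future tax liability avoided by holding one permit (risk-neutral)."""
    return sum(p * tax_if_outcome[s] for s, p in beliefs.items())

skeptic = {"low_warming": 0.7, "mid_warming": 0.25, "high_warming": 0.05}
consensus = {"low_warming": 0.2, "mid_warming": 0.6, "high_warming": 0.2}

print(f"Skeptic's valuation:  ${permit_value(skeptic):.2f}")    # $32.50
print(f"Consensus valuation: ${permit_value(consensus):.2f}")   # $58.00
# Because beliefs translate into willingness to pay, the equilibrium permit
# price aggregates traders' information about likely climate outcomes.
```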

A 2018 paper by GARY LUCAS and FELIX MORMANN suggests using similar predictions for climate policies beyond carbon taxes:

“We explain how both the federal and state governments could use prediction markets to help resolve high-profile controversies, such as how best to allocate subsidies to promote clean technology innovation and which policy strategy promises the greatest reduction in carbon emissions.” 

Link to their paper.

  • In 2016, a group of researchers modeled the way that information would converge in a climate prediction market, and found “market participation causes most traders to converge quickly toward believing the ‘true’ climate model, suggesting that a climate market could be useful for building public consensus.” Link.
  • Tyler Cowen wrote about Hsu’s paper in 2011: “I think of such fine-tuning as a misguided approach. Is there such a good ‘basket’ measure of climate outcomes with sufficiently low short-term volatility?” Link.
  • A 2017 paper by Michael Thicke makes a similar point about prediction markets for science generally: “Prediction markets for science could be uninformative or deceptive because scientific predictions are often long-term, while prediction markets perform best for short-term questions.” Link.
⤷ Full Article

June 23rd, 2018

Yielding Stone

VISIBLE CONSTRAINT

Including protected variables can make algorithmic decision-making more fair 

A recent paper co-authored by JON KLEINBERG, JENS LUDWIG, SENDHIL MULLAINATHAN, and ASHESH RAMBACHAN addresses algorithmic bias, countering the "large literature that tries to 'blind' the algorithm to race to avoid exacerbating existing unfairness in society":  

"This perspective about how to promote algorithmic fairness, while intuitive, is misleading and in fact may do more harm than good. We develop a simple conceptual framework that models how a social planner who cares about equity should form predictions from data that may have potential racial biases. Our primary result is exceedingly simple, yet often overlooked: a preference for fairness should not change the choice of estimator. Equity preferences can change how the estimated prediction function is used (such as setting a different threshold for different groups) but the estimated prediction function itself should not change. Absent legal constraints, one should include variables such as gender and race for fairness reasons.

Our argument collects together and builds on existing insights to contribute to how we should think about algorithmic fairness.… We empirically illustrate this point for the case of using predictions of college success to make admissions decisions. Using nationally representative data on college students, we underline how the inclusion of a protected variable—race in our application—not only improves predicted GPAs of admitted students (efficiency), but also can improve outcomes such as the fraction of admitted students who are black (equity).

Across a wide range of estimation approaches, objective functions, and definitions of fairness, the strategy of blinding the algorithm to race inadvertently detracts from fairness."

Read the full paper here.
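
A minimal sketch of the paper's central prescription, using simulated applicants and scikit-learn rather than the authors' data or estimators: train a single prediction function on all inputs, including the protected attribute, and let equity preferences enter only at the decision stage, for instance through group-specific admission thresholds.

```python
# Illustrative simulation only; variable names and thresholds are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

group = rng.integers(0, 2, n)              # 1 = protected group
score = rng.normal(0, 1, n) - 0.3 * group  # test score, correlated with group
hs_gpa = rng.normal(0, 1, n)
# "Success" (e.g., college GPA above a cutoff) depends on both observed inputs.
success = (0.8 * score + 0.6 * hs_gpa + 0.3 * group + rng.normal(0, 1, n) > 0).astype(int)

X = np.column_stack([score, hs_gpa, group])

# One estimator, trained on all available inputs including the protected attribute.
model = LogisticRegression().fit(X, success)
p_success = model.predict_proba(X)[:, 1]

# Equity preferences change how predictions are used, not the estimator itself:
# here, a lower admission threshold for the protected group.
thresholds = np.where(group == 1, 0.45, 0.55)
admitted = p_success >= thresholds

for g in (0, 1):
    in_g = group == g
    print(f"group {g}: admitted share {admitted[in_g].mean():.2f}, "
          f"mean predicted success among admitted {p_success[in_g & admitted].mean():.2f}")
```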

⤷ Full Article