Phenomenal World

September 9th, 2019

Original & Forgery

MULTIPLY EFFECT

The difficulties of causal reasoning and race

While the thorny ethical questions dogging the development and implementation of algorithmic decision systems touch on all manner of social phenomena, arguably the most widely discussed is that of racial discrimination. The watershed moment for the algorithmic ethics conversation was ProPublica's 2016 article on the COMPAS risk-scoring algorithm, and a huge number of ensuing papers in computer science, law, and related disciplines attempt to grapple with the question of algorithmic fairness by thinking through the role of race and discrimination in decision systems.

In a paper from earlier this year, ISSA KOHLER-HAUSMANN of Yale Law School examines the way that race and racial discrimination are conceived of in law and the social sciences. Challenging the premises of an array of research across disciplines, Kohler-Hausmann argues both for a reassessment of the basis of reasoning about discrimination and for a new approach grounded in a social constructivist view of race.

From the paper:

"This Article argues that animating the most common approaches to detecting discrimination in both law and social science is a model of discrimination that is, well, wrong. I term this model the 'counterfactual causal model' of race discrimination. Discrimination, on this account, is detected by measuring the 'treatment effect of race,' where treatment is conceptualized as manipulating the raced status of otherwise identical units (e.g., a person, a neighborhood, a school). Discrimination is present when an adverse outcome occurs in the world in which a unit is 'treated' by being raced—for example, black—and not in the world in which the otherwise identical unit is 'treated' by being, for example, raced white. The counterfactual model has the allure of precision and the security of seemingly obvious divisions or natural facts.

Currently, many courts, experts, and commentators approach detecting discrimination as an exercise measuring the counterfactual causal effect of race-qua-treatment, looking for complex methods to strip away confounding variables to get at a solid state of race and race alone. But what we are arguing about when we argue about whether or not statistical evidence provides proof of discrimination is precisely what we mean by the concept DISCRIMINATION."
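
For readers less familiar with the formalism Kohler-Hausmann is criticizing, the "treatment effect of race" she describes is usually written in the potential-outcomes notation of the causal inference literature. The display below is a generic sketch of that framework, not notation taken from the article itself:

```latex
% Generic potential-outcomes statement of the "counterfactual causal model":
% Y_i(r) is the outcome unit i would experience if "treated" as race r,
% with all other characteristics X_i held fixed.
\tau_i = Y_i(\text{black}) - Y_i(\text{white}),
\qquad
\text{discrimination inferred when } \mathbb{E}\!\left[\tau_i \mid X_i\right] \neq 0.
```

Kohler-Hausmann's objection targets the "otherwise identical units" clause built into this notation: if race is a constitutively social category rather than a detachable attribute, the manipulation the notation presupposes is not well defined.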

Link to the article. And stay tuned for a forthcoming post on the Phenomenal World by JFI fellow Lily Hu that grapples with these themes.

  • For an example of the logic Kohler-Hausmann is writing against, see Edmund S. Phelps' 1972 paper "The Statistical Theory of Racism and Sexism." Link.
  • A recent paper deals with the issue of causal reasoning in an epidemiological study: "If causation must be defined by intervention, and interventions on race and the whole of SeS are vague or impractical, how is one to frame discussions of causation as they relate to this and other vital issues?" Link.
  • From Kohler-Hausmann's footnotes, two excellent works informing her approach: first, the canonical book Racecraft by Karen Fields and Barbara Fields; second, a 2000 article by Tukufu Zuberi, "Deracializing Social Statistics: Problems in the Quantification of Race." Link to the first, link to the second.
⤷ Full Article

July 15th, 2019

The River

APPROPRIATION FRICTION

Swedish wage earner funds and their limitations

Beyond growing calls for welfare expansion and a more progressive tax system, recent policy debates have begun to consider alternative models of firm ownership. Last year, the UK Labour party published a report outlining a path towards a more diverse set of ownership arrangements, including worker cooperatives and municipally owned service providers. The party also articulated an intention to transition ownership of up to 10% of shares to workers in large firms. More recently, the Bernie Sanders campaign announced a proposal for the formation of inclusive ownership funds, whereby large corporations contribute a portion of their stocks to an employee-controlled fund. This comes in addition to Elizabeth Warren’s Accountable Capitalism Act, which calls for increased worker representation on company boards.

These recent proposals are distinctly reminiscent of programs put forward throughout the twentieth century, the most famous of which is Sweden’s 1976 Meidner Plan. In his 1992 book, Princeton politics professor JONAS PONTUSSON analyzes the origins of the plan, and the barriers to its successful implementation. He compares the movement for wage earner funds with that of co-determination, suggesting that the failure of the former has to do with the politics of legislation:

"In unfavorable political circumstances, it made sense for organized business to avoid major political confrontation and instead use post-legislative bargaining to define co-determination. By contrast, the Meidner Plan and subsequent wage-earner funds proposals left very little room for post-legislative bargaining. In other words, the co-determination offensive was a legislative success for the same reason that the implementation of new legislation became a disappointment for labor.

Organized business spent about as much on its advertising and media campaign against wage earner funds in 1982 as the five parliamentary parties spent on the election campaign that same year. But couldn’t the labor movement have devoted more resources to acquiring the expertise needed to influence industrial policy? The final lesson seems to be that when industrial policy, and investment policy more generally, fails to address wage earner’s immediate concerns, they are bound to become less supportive of further efforts to democratize investment decisions."

Link to the book chapter, and link to an article in which Pontusson assesses the diluted version of the plan which was implemented in 1983.

  • "The concept of a society which is built on moral values remains, in my view, too promising to be extinguished by inhuman market forces." The disintegration of the Swedish model, in Rudolf Meidner’s words: Link. See also this interview with Meidner from 1998. Link.
  • In the largest study of its kind, Joseph Blasi, Douglas Kruse and Dan Weltmann explore the impact of employee stock ownership plans (ESOP) on firm performance, concluding that "Privately held ESOP companies were only half as likely as non-ESOP firms to go bankrupt or close, had significantly higher post-adoption annual employment and sales growth, and demonstrated higher sales per employee." Link.
  • In 1987, political theorist Jon Elster argued that the plan’s failure was the result of a fundamental problem in its conception of justice, which would leave much of the real decision-making capability in the hands of the union bureaucracy. Link.
  • A report on Public-Common Partnerships, an "alternative institutional design that moves us beyond the overly simplistic binary of market/state." Link.
⤷ Full Article

September 3rd, 2019

Eye Machine

IMPLICIT FAVOR

The failures of research on fin-tech and poverty alleviation

Last week, we considered how social and political standards can pressure climate scientists to under-report their findings, introducing an underestimation bias into published climate research. In a recent thread, Nicholas Loubere examines the development buzz around mobile money, showing how similar factors can serve to exaggerate the findings of academic studies.

In a new article quoted in the thread, MILFORD BATEMAN, MAREN DUVENDACK, and NICHOLAS LOUBERE contest a much-cited study on the poverty alleviating effects of mobile money platforms like M-Pesa. The criticism rests largely on grounds of omission: the study, they argue, ignores the closure of nearly half of microenterprises opened with M-Pesa, the jobs and incomes lost with the introduction of new businesses into fragile markets, the burgeoning debt accrued through digital loans, the overwhelmingly foreign ownership of M-Pesa and its profits, and the wealthy networks composing its primary users. Methodologically, it had no control group, used a small sample size, and overlooked the potential for reverse causality.

Why was a potentially flawed study so well regarded? According to Bateman, Duvendack, and Loubere, it's in part because its results told researchers and policymakers what they wanted to hear. From the article:

"The rapid popularization of fin-tech as a developmental solution is premised on the continued prominence of microcredit and the broader concept of financial inclusion. The microcredit movement was established and validated in the 1980s on overblown and ultimately false claims that providing small loans to groups of poor women was a panacea for global poverty reduction—claims that were especially associated with Dr Muhammad Yunus. Empirical justification came from an impact evaluation undertaken in Bangladesh by then World Bank economists Mark Pitt and Shahidur Khandker, which claimed that microcredit programs had significant beneficial results for impoverished female clients. For many years, Muhammad Yunus used Pitt and Khandker’s findings to successfully ‘sell’ the microcredit model to the international development community, generating a consensus that the microcredit model was the most effective way to efficiently provide enormous benefits to the global poor."

Link to the article, and link to a blogpost in which the authors outline their key findings.

  • "Kenya’s new experience of debt reveals a novel, digitized form of slow violence that operates not so much through negotiated social relations, nor the threat of state enforcement, as through the accumulation of data, the commodification of reputation, and the instrumentalization of social ties." Kevin P. Donovan and Emma Park report on the consequences of mobile debt for poor borrowers. Link.
  • In an article from 2017, Loubere "examines examples of exploitation, fraud, instability, and extraction related to expanded digital financial coverage in contemporary China." Link. At Bloomberg, David Malingha compares credit markets in Asia with those of sub-Saharan Africa. Link.
  • "This article claims that to bring finance back to serve the real economy, it is fundamental to (a) de-financialize companies in the real economy, and (b) think clearly about how to structure finance so that it can provide the long-term committed patient capital required by innovation." Mariana Mazzucato on governments' role in ensuring that finance serves public ends. Link.
⤷ Full Article

July 8th, 2019

Model of a Cabin

SELECTED MOBILITY

Examining the college premium

Higher education is widely understood to be a major driver of intergenerational mobility in the United States. Despite the clear (and growing) inequalities between and within colleges, it remains the case that higher education reduces the impact that parental class position has on a graduate's life outcomes.

In an intriguing paper, Harvard sociologist XIANG ZHOU scrutinizes the implied causal relationship between college completion and intergenerational mobility. Specifically, Zhou uses a novel weighting method "to directly examine whether and to what extent a college degree moderates the influence of parental income" outside of selection effects, seeking to distinguish between the "equalization" and "selection" hypotheses of higher ed's impact on intergenerational mobility.

From the paper:

"Three decades have passed since Hout’s (1988) discovery that intergenerational mobility is higher among college graduates than among people with lower levels of education. In light of this finding, many researchers have portrayed a college degree as 'the great equalizer' that levels the playing field, and hypothesized that an expansion in postsecondary education could promote mobility because more people would benefit from the high mobility experienced by college graduates. Yet this line of reasoning rests on the implicit assumption that the 'college premium' in intergenerational mobility reflects a genuine 'meritocratic' effect of postsecondary education, an assumption that has rarely, if ever, been rigorously tested.

In fact, to the extent that college graduates from low and moderate-income families are more selected on such individual attributes as ability and motivation than those from high-income families, the high mobility observed among bachelor’s degree holders may simply reflect varying degrees of selectivity of college graduates from different family backgrounds."

In sum, Zhou finds that the "selection" hypothesis carries more weight than the "equalization" hypothesis. One implication of this finding is that "simply expanding the pool of college graduates is unlikely to boost intergenerational income mobility in the US." Link to the paper.
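
To see what is at stake between the two hypotheses, here is a minimal synthetic-data sketch in Python of the general logic of adjusting for selection with inverse-probability weights. It is a stand-in for the idea only, not Zhou's estimator, and every variable name in it is hypothetical: in the simulation, the parent-child income slope among graduates looks flatter than the true persistence parameter simply because low-income graduates are positively selected on a pre-college attribute, and reweighting graduates back toward the full population removes most of that artifact.

```python
# Illustrative sketch only: generic inverse-probability weighting on synthetic data.
# This is NOT Zhou's estimator; all variable names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

parent_inc = rng.normal(size=n)   # standardized parental income
ability = rng.normal(size=n)      # pre-college attribute, often unobserved in naive comparisons

# Low-income students need a higher draw on the pre-college attribute to finish
# college, which builds selection into who graduates.
p_college = 1 / (1 + np.exp(-(0.8 * parent_inc + 1.2 * ability)))
college = (rng.random(n) < p_college).astype(int)

# Child income depends on parental income, the attribute, and a college premium.
child_inc = 0.4 * parent_inc + 0.5 * ability + 0.6 * college + rng.normal(size=n)

def slope(x, y, w=None):
    """(Weighted) OLS slope of y on x, a crude stand-in for income persistence."""
    w = np.ones_like(x) if w is None else w
    xm, ym = np.average(x, weights=w), np.average(y, weights=w)
    return np.average((x - xm) * (y - ym), weights=w) / np.average((x - xm) ** 2, weights=w)

grads = college == 1
naive = slope(parent_inc[grads], child_inc[grads])  # looks "equalized", partly via selection

# Reweight graduates by the inverse probability of completing college given
# parental income and the pre-college attribute (selection on observables).
X = np.column_stack([parent_inc, ability])
ps = LogisticRegression(max_iter=1000).fit(X, college).predict_proba(X)[:, 1]
adjusted = slope(parent_inc[grads], child_inc[grads], w=1.0 / ps[grads])

print(f"parent-child slope among graduates, unweighted: {naive:.3f}")
print(f"same slope after inverse-probability weighting: {adjusted:.3f}")
```

Zhou's actual analysis relies on survey data and a more careful weighting scheme; the sketch only isolates the mechanism that makes the "selection" reading plausible.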

  • A 2011 paper by Michael Bastedo and Ozan Jaquette looks at the stratification dynamics affecting low-income students within higher ed. Link. A paper from the same year by Martha Bailey and Susan Dynarski surveys the state of inequality in postsecondary education. Link.
  • An op-ed by E. Tammy Kim in the Times argues for higher-education as a public good. Link.
  • Marshall Steinbaum and Julie Margetta Morgan's 2018 paper examines the student debt crisis in the broader context of labor market trends: "[The problem with] reliance on the college earnings premium [as a measure of success] is that it focuses primarily on the individual benefit of educational attainment, implying that college is worthwhile as long as individuals are making more than they would have otherwise. But in the context of public investment in higher education, we need to know not only how individuals are faring but also how investments in higher education are affecting our workforce and the economy as a whole." Link.
⤷ Full Article

July 1st, 2019

Quandary

HOW RESEARCH AFFECTS POLICY

Results from Brazil

How can evidence inform the decisions of policymakers? What value do policymakers ascribe to academic research? In January, we highlighted Yale's Evidence in Practice project, which emphasizes the divergence between policymakers' needs and researchers' goals. Other work describes the complexity of getting evidence into policy. A new study by JONAS HJORT, DIANA MOREIRA, GAUTAM RAO, and JUAN FRANCISCO SANTINI surprises because of the simplicity of its results: policymakers in Brazilian cities and towns are willing to pay for evidence, and willing to implement a low-cost, evidence-based policy (mailing reminder letters to taxpayers). The lack of uptake may stem more from a lack of information than a lack of interest: "Our findings make clear that it is not the case, for example, that counterfactual policies' effectiveness is widely known 'on the ground,' nor that political leaders are uninterested in, unconvinced by, or unable to act on new research information."

From the abstract:

"In one experiment, we find that mayors and other municipal officials are willing to pay to learn the results of impact evaluations, and update their beliefs when informed of the findings. They value larger-sample studies more, while not distinguishing on average between studies conducted in rich and poor countries. In a second experiment, we find that informing mayors about research on a simple and effective policy (reminder letters for taxpayers) increases the probability that their municipality implements the policy by 10 percentage points. In sum, we provide direct evidence that providing research information to political leaders can lead to policy change. Information frictions may thus help explain failures to adopt effective policies."

Link to the paper.

  • New work from Larry Orr et al addresses the question of how to take evidence from one place (or several places) and make it useful to another. "[We provide] the first empirical evidence of the ability to use multisite evaluations to predict impacts in individual localities—i.e., the ability of 'evidence‐based policy' to improve local policy." Link.
  • Cited within the Hjort et al paper is research from Eva Vivalt and Aidan Coville on how policymakers update their prior beliefs when presented with new evidence. "We find evidence of 'variance neglect,' a bias similar to extension neglect in which confidence intervals are ignored. We also find evidence of asymmetric updating on good news relative to one’s prior beliefs. Together, these results mean that policymakers might be biased towards those interventions with a greater dispersion of results." Link.
  • From David Evans at CGDev: "'The fact that giving people information does not, by itself, change how they act is one of the most firmly established in social science.' So stated a recent op-ed in the Washington Post. That’s not true. Here are ten examples where simply providing information changed behavior." Link. ht The Weekly faiV.
  • For another iteration of the question of translating evidence into policy, see our February letter on randomized controlled trials. Link.
⤷ Full Article

June 24th, 2019

Push Pull

PROGRESS UNCOUPLE

Debating growth and the Green New Deal

In past newsletters, we have highlighted research and policy proposals relating to the Green New Deal and the literature surrounding "degrowth"—the idea that the growth imperative is at odds with human flourishing. In a recent exchange, economist Robert Pollin debates sociologists Juliet Schor and Andrew Jorgenson on the relative merits of "decoupling" and "degrowth." The former asserts that "economies can continue to grow while advancing a viable climate-stabilization project, as long as the growth process is decoupled from fossil-fuel consumption." The latter holds that public discussions over combating climate change must turn "from growthcentricity to needs- and people-centered policies."

The authors share a commitment to increased public investment, and both sides emphasize the distributional consequences of decarbonization. Their debate turns on, and illuminates, larger conversations about the discursive frameworks and metrics we use to understand economic life. Schor and Jorgenson see reducing GDP in the global north as one element of a program to radically restructure the principles of society; Pollin sees these efforts as muddying the mandate for immediate climate action.

From Pollin:

"Let’s assume that global GDP contracts by 10 percent over the next two decades, following a degrowth scenario. That would entail a reduction of global GDP four times larger than what we experienced over the 2007–2009 financial crisis and Great Recession. In terms of CO2 emissions, the net effect of this 10 percent GDP contraction, considered on its own, would be to push emissions down by precisely 10 percent—that is, from 32 billion tons to 29 billion. So, the global economy would still not come close to bringing emissions down to 20 billion tons by 2040.

The overwhelming factor pushing emissions down will not be a contraction of overall GDP but massive growth in energy efficiency and clean renewable energy investments (which, for accounting purposes, will contribute toward increasing GDP) along with similarly dramatic cuts in fossil-fuel production and consumption (which will register as reducing GDP). In my view, addressing these matters in terms of their specifics is much more constructive than presenting broad generalities about the nature of economic growth, positive or negative."
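
To spell out the arithmetic in the quoted passage (using Pollin's figures and his simplifying assumption that emissions fall one-for-one with GDP):

```latex
% Pollin's back-of-the-envelope: a 10% contraction of global GDP,
% with emissions assumed proportional to output
32\ \text{Gt CO}_2 \times (1 - 0.10) = 28.8 \approx 29\ \text{Gt CO}_2
> 20\ \text{Gt CO}_2 \quad \text{(the 2040 target he cites)}.
```

The remaining gap of roughly 9 billion tons is, on Pollin's account, what must come from clean-energy investment and cuts to fossil-fuel production rather than from shrinking output.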

Link to Pollin's initial paper, link to Schor and Jorgenson.

  • Pollin elaborates on this point in his follow-up statement with a case study of Japan: "Despite the fact that Japan has been close to a no-growth economy for twenty years, its CO2 emissions remain among the highest in the world, at 9.5 tons per capita." Link. Another recent article reviews and recaps the decoupling vs. degrowth exchanges. Link.
  • Schor and Jorgenson’s follow-up challenges Pollin's conviction that decoupling is either possible or efficient: "After decades of promises from advocates of green growth that absolute decoupling will happen, the record is dismal. The simple point about growth is therefore that it makes the nearly impossibly high mountain that we need to climb even steeper. Why rule out an important source of emissions reductions before we’ve even started?" Link.
  • Another iteration of the debate in a compilation of INET papers: Schröder et al argue that "if past performance is relevant for future outcomes, our results should put to bed the possibility of 'green growth.'" Michael Grubb takes a different tack: "Before declaring that history has set limits on what is possible, we need to be extremely careful. The future has already started, though its beginnings may be modest." Link.
  • From Autonomy, a proposal for a shortened work week—a key element of several green degrowth arguments. Link.
  • Mark Paul, Anders Fremstad, and JW Mason offer a brand new paper on US decarbonization. "In an economy facing persistent demand constraints and weak labor markets, public spending on decarbonization will raise wages and living standards." Link.
⤷ Full Article

June 17th, 2019

Insulators (Magritte machine)

MOBILE COGNITION

The political history of economic statistics

Debates over the relevance of indicators like GDP for assessing the health of domestic economies are persistent and growing. Critics point to the failure of such measures to holistically capture societal wellbeing, and argue in favor of alternative metrics and the disaggregation of GDP data. These debates reflect the politics behind the economic knowledge that shapes popular understanding and policy debates alike.

In his 2001 book Statistics and the German State, historian Adam Tooze examined the history of statistical knowledge production in Germany, covering the period from the turn of the century to the end of the Nazi regime, "driven by the desire to understand how this peculiar structure of economic knowledge came into existence… and the relationship between efforts to govern the economy and efforts to make the economy intelligible through systematic quantification."

From the book's conclusion:

"We need to broaden our analysis of the forces bearing on the development of modern economic knowledge. This book has sought to portray the construction of a modern system of economic statistics as a complex and contested process of social engineering. This certainly involved the mobilization of economists and policy-makers, but it also required the creation of a substantial technical infrastructure. The processing of data depended on the concerted mobilization of thousands of staff. In this sense the history of modern economic knowledge should be seen as an integral part of the history of the modern state apparatus and more generally of modern bureaucratic organizations… The development of new forms of economic knowledge can therefore be understood as part of the emergence of modern economic government and as a sensitive indicator of the relationship between state and civil society."

Link to the book preview, link to the book page on Tooze's website.

  • For a more generalized account of the political history of statistical knowledge (inclusive of economic statistics), see The Politics of Large Numbers by Alain Desrosières. Link. Another excellent item in the history of statistical knowledge: A History of the Modern Fact, on the advent and impact of double-entry bookkeeping. Link.
  • In the Winter 2019 issue of the Journal of Economic Perspectives, Hugh Rockoff examines the political history of American economic statistics, and tracks the emergence and institutionalization of measures of "prices, national income and product, and unemployment." Link.
  • Previously shared here, research by Aaron Benanav examines the institutional history linking the concept of "informality" and unemployment metrics developed by the International Labor Organization. Link to his paper.
  • A recent paper by Andrea Mennicken and Wendy Nelson Espeland surveys the quantification literature. Link. And a (previously shared) panel discussion on the historiography of quantification. Link.
⤷ Full Article

August 26th, 2019

Summer in Brabant

INTEMPERATE OBJECTIVITY

On the pressures of policy-relevant climate science

Without any “evidence of fraud, malfeasance or deliberate deception or manipulation,” or any promotion of inaccurate views, how can bias enter a scientific assessment? In their new book, Discerning Experts, Michael Oppenheimer, Naomi Oreskes, Dale Jamieson, et al explore the pattern of underestimation of the true consequences of climate change.

Climate change's impacts are uncertain; predictions about climate change are difficult to make. Taking an ethnographic approach, Discerning Experts shows how those difficulties, coupled with the nature of public discourse and the pressures that arise when research is going to be discussed and used in policy, have tilted climate assessments toward optimism and caution.

In a summary of their book, Oreskes et al explain three reasons for the tilt:

“The combination of … three factors—the push for univocality, the belief that conservatism is socially and politically protective, and the reluctance to make estimates at all when the available data are contradictory—can lead to ‘least common denominator' results—minimalist conclusions that are weak or incomplete.”

These tendencies, according to the authors, pertain to the applied research context. The academic context is different: “The reward structure of academic life leans toward criticism and dissent; the demands of assessment push toward agreement.” Link to a summary essay in Scientific American. Link to the book.

  • In an interview, Michael Oppenheimer elaborates on other elements that skew the assessments: the selection of authors, the presentation of the resulting information, and others. Link.
  • In a review of the book, Gary Yohe reflects on his own experience working on major climate assessments, such as the IPCC’s. Link.
  • A David Roberts post from 2018 finds another case of overly cautious climate science: models of the economic effects of climate change may be much more moderate than models of the physical effects. To remedy this, “We need models that negatively weigh uncertainty, properly account for tipping points, incorporate more robust and current technology cost data, better differentiate sectors outside electricity, rigorously price energy efficiency, and include the social and health benefits of decarbonization.” Link.
  • Tangentially related: carbon tax or green investment? It’s worth considering not just all possible policy options but also their optimal interactions. A paper by Julie Rozenberg, Adrien Vogt-Schilb, and Stephane Hallegatte concludes, “Optimal carbon price minimizes the discounted social cost of the transition to clean capital, but imposes immediate private costs that disproportionately affect the current owners of polluting capital, in particular in the form of stranded assets.” Link to a summary which contains a link to the unpaywalled paper.
⤷ Full Article

June 10th, 2019

Sketch for a Counter-Sky

MECHANICAL SHADOWS

On central bank independence and the rise of shadow money

Debates over the political impacts of Central Bank Independence (CBI) reached their peak in the late 90s and early 2000s, due to rising inequality and the volatility of financial markets. Initiated with the Federal Reserve Reform Act of 1977 and Paul Volcker’s subsequent term as chairman of the Fed, CBI was, and remains, a means of isolating the more "mechanical" field of monetary policy from the fleeting interests of politicians. In order to preserve stability and credibility, independent central banks have made inflation targeting the centerpiece of their agenda. Critics of CBI have argued that the distinction between economic science and political incentives is not as clear as it might seem; low levels of inflation may benefit creditors and investors, but they harm those whose income depends entirely on rising wages. While monetary policy has distributional and political consequences, its decision-makers are insulated from public accountability.

Expanding the literature on the politics of CBI, BENJAMIN BRAUN and DANIELA GABOR examine its financial consequences. In a recently published paper, they argue that the anti-inflationary policies of central banks have catalyzed dependence on shadow money and shadow banking, key components of a broader trend towards financialization:

"In the late 1990s, the US Federal Reserve was confronted with a peculiar predicament. While the world was celebrating central bank independence as a mark of 'scientific' economic governance after the populist era of monetizing government bonds, the US Federal Reserve worried about projections that the US government would pay down all its debt by 2012. A world without US government debt, they worried, was a world filled with monetary dangers. Market participants would not have a safe, liquid asset to turn to in times of distress.

Rather than seeking to limit shadow money supply, the Fed actively encouraged its expansion, seeking market solutions to political problems. It lobbied Congress to ensure that holders of shadow money backed by private (securitized) collateral had the same legal rights to collateral as those holding shadow money issued against US government debt. The Fed also changed its lending practices, allowing banks to issue shadow money backed by private collateral to borrow from the Fed. These concrete steps contrast starkly with the picture of central banks watching passively from the margins, as financial institutions find new ways to monetize credit and circumvent rules."

Link to the article.

  • More contemporary iterations of the debate over CBI can be found in the comparison between a 2018 HKS working paper, which distinguishes between "political oversight" and "operational independence," and a 2014 Levy Institute working paper which argues there is no practical meaning of operational independence at all. Link and link.
  • A primer on shadow banking, from Stijn Claessens and Lev Ratnovski at Vox EU. Link.
  • A new article by Andreas Kern, Bernhard Reinsberg, and Matthias Rau-Göhring finds that the IMF’s targeted lending practices actively encouraged the proliferation of independent central banks in low income economies. Link.
  • On CBI, inflationary targets, and the 2010 Eurocrisis, by Mark Copelovitch, Jeffry Frieden, and Stefanie Walter. Link.
⤷ Full Article

June 3rd, 2019

Convex Mirror

GREEN PLAN(K)

Growing the Green New Deal in the US and Europe

Jay Inslee, the governor of Washington State and Democratic presidential candidate, has made climate policy the center of his longer-than-long-shot campaign. On May 3rd, he released 8 pages of goals, and on May 16th, he released the 35-page, 28-policy “Evergreen Economy Plan,” with several more similarly lengthy reports on the way. David Roberts, an energy commentator at Vox, had a representative reaction to Inslee’s policies: “Inslee’s campaign is systematically translating the Green New Deal’s lofty goals — to decarbonize the economy sector by sector, in a way that creates high-quality jobs and protects frontline communities — into policy proposals.”

The report includes policies on infrastructure, manufacturing, R&D, and support for energy workers (including a “GI Bill”), and promises 8 million new jobs over the next 10 years. But beyond its extremely detailed recommendations, a key point of interest is seeing how the GND’s idealistic goals are cashed out. The introduction emphasizes a restorative approach:

“Inherent throughout ... is the urgent need to support frontline, low-income, and Indigenous communities, and communities of color. These communities are being impacted first and worst by the accelerating damages of climate change, and have endured a legacy of air, water, toxics and climate pollution, along with a deficit of public investment and support. Through an assertive agenda of reinvestment that is guided by strong local input, Governor Inslee’s plan seizes the opportunity to build a clean energy economy that provides inclusive prosperity — upon a foundation of economic, environmental, racial and social justice.”

Link to the report.

  • Leah Stokes, a climate and energy policy expert, contextualizes the expense on Twitter: “Spending $300 bn annually on climate is about 10% of the federal budget. Warren’s plan aggressively targets climate action by the military, which is about 20% of the federal budget. Hence, Inslee’s plan is about half the size of military spending. That’s BIG.” Link to the thread.
  • Noah Smith also takes up the cost question: "It’s probably less than 20 percent of what Ocasio-Cortez’s plan would cost, and only a quarter of the total would be paid by the government, so new budget deficits or taxes would be relatively modest. And because Inslee’s plan is more narrowly focused on value-generating investments like infrastructure rather than new entitlement spending, a higher percentage of the cost would be recouped down the road." Link.
  • As the GND gains momentum in the US, some in Europe are taking similar steps. DiEM25 (a pan-European organization co-founded in 2016 by Yanis Varoufakis) and the European Spring have released a report on a Green New Deal for Europe. Link. (Their candidates fell short in last week’s elections.)
  • An article from the World Economic Forum on the Green New Deal for Europe explains some of the theoretical differences between the US and European plans. "Whereas the Americans are building on a century-old tradition of the original New Deal, we’re trying to marry that language with existing programs." Link.
⤷ Full Article