February 16th, 2019

Cup and Ring

GAP PROGRESSION

New life in the debates over poverty measurement

In recent weeks, a familiar debate over how we understand the global poverty rate across time has reappeared on mainstream op-ed pages. Sparked initially by Bill Gates tweeting out an infographic produced by Our World in Data—which visualizes a massive decrease in global poverty (from 94% to 10% of people) over the past two hundred years—the notable discussants have been LSE anthropologist JASON HICKEL and Our World in Data researchers JOE HASELL and MAX ROSER.

Hickel published a polemical Guardian op-ed criticizing the publication of this chart, which, he argued, misrepresents the history it claims to communicate and relies on contestable and imprecise data sources to bolster its universal progress narrative, taking "the violence of colonisation and repackaging it as a happy story of progress." The responses were numerous.

Among them, a post by Hasell and Roser provided detailed descriptions of the methods and data behind their work to answer the following: "How do we know that the vast majority of the world population lived in extreme poverty just two centuries ago as this chart indicates? And how do we know that this account of falling global extreme poverty is in fact true?"

In addition to methodological arguments regarding data sources and the poverty line, Hickel's argument emphasizes the gap between poverty and the capacity to eliminate it:

"What matters, rather, is the extent of global poverty vis-à-vis our capacity to end it. As I have pointed out before, our capacity to end poverty (e.g., the cost of ending poverty as a proportion of the income of the non-poor) has increased many times faster than the proportional poverty rate has decreased. By this metric we are doing worse than ever before. Indeed, our civilization is regressing. On our existing trajectory, according to research published in the World Economic Review, it will take more than 100 years to end poverty at $1.90/day, and over 200 years to end it at $7.4/day. Let that sink in. And to get there with the existing system—in other words, without a fairer distribution of income—we will have to grow the global economy to 175 times its present size. Even if such an outlandish feat were possible, it would drive climate change and ecological breakdown to the point of undermining any gains against poverty.

It doesn’t have to be this way, of course."
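Hickel's gap metric lends itself to a toy formalization. The sketch below is one possible reading of "the cost of ending poverty as a proportion of the income of the non-poor" (the aggregate shortfall below the poverty line, divided by total non-poor income); the ten-person economy and the printed results are entirely hypothetical, not Hickel's figures.

```python
# One possible reading of Hickel's metric: the cost of ending poverty
# (the aggregate income shortfall below the poverty line) as a share of
# the income of the non-poor. All figures here are hypothetical.

def poverty_gap_share(incomes, line):
    """Aggregate shortfall below `line`, as a share of non-poor income."""
    shortfall = sum(line - y for y in incomes if y < line)
    non_poor_income = sum(y for y in incomes if y >= line)
    return shortfall / non_poor_income

# A hypothetical ten-person economy (daily incomes in dollars):
incomes = [1.20, 1.50, 1.80, 2.50, 4.00, 8.00, 15.00, 40.00, 90.00, 300.00]

print(poverty_gap_share(incomes, line=1.90))  # ~0.003 of non-poor income
print(poverty_gap_share(incomes, line=7.40))  # ~0.057 of non-poor income
```

On this reading, the ratio measures how cheap ending poverty is relative to the resources of the non-poor; Hickel's argument is that this ratio has fallen much faster than the poverty rate itself.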

Link to that post, and to a subsequent one that responds directly to the methods and data-use questions addressed by Hasell and Roser.

⤷ Full Article

February 9th, 2019

Overboard

HETEROGENEOUS GAP

In search of a more just model for higher education financing

This week, we delve into the persistent inequalities of our higher education system. Since Winston, Hill, and Boyd found in 2005 that only 10% of students at elite universities came from families in the bottom 40% of the income distribution, universities across the board have revived efforts to diversify their student bodies.

The idea that there's a need for greater socioeconomic diversity in higher education is largely uncontroversial, particularly amid growing evidence of the higher earnings potential of college graduates. There is far less consensus, however, on the policies best suited to addressing this gap. ROSINGER, BELASCO, and HEARN, in the Journal of Higher Education, examine the impact of both means-tested and universal policies that replace student loans with grants in financial aid packages. The impact of these policies on socioeconomic diversity is somewhat counterintuitive:

"We found that colleges offering universal discounts experienced increased enrollment among middle-class students. Our study indicates universal no-loan policies represent one strategy to increase access and affordability for the middle-class in the elite reaches of higher education. The study also, however, raises questions about the policies’ effectiveness in addressing access for low-income students and efficiency in targeting aid."

Link to the full paper.

  • For more on the potential for universities to facilitate both the entrenchment and supersession of generational inequalities, see the groundbreaking 2017 paper by Chetty et al. The authors used fourteen years of federal income tax data to construct mobility report cards for nearly 2,000 colleges, provoking a range of new literature in the field (see the sketch following this list). Their findings: "The colleges that have the highest bottom-to-top-quintile mobility rates – i.e., those that offer both high success rates and low-income access – are typically mid-tier public institutions. For instance, many campuses of the City University of New York (CUNY), certain California State colleges, and several campuses in the University of Texas system have mobility rates above 6%… Elite private (Ivy-Plus) colleges have an average mobility rate of 2.2%." Link to the paper, as well as the digitization of its results, courtesy of the New York Times.
  • Drawing on "Mobility Report Cards," a recent paper by Bloome, Dyer, and Zhou finds that parental income has become less predictive of adult income, offsetting inter-generational income persistence resulting from education. Link.
  • Anna Manzoni and Jessi Streib find that wage gaps between first- and continuing-generation college students are not caused by the institutions they attend, the grades they earn, or the subjects they study: "Our decomposition analysis shows that the uneven distribution of students into labor market sectors, occupations, hours worked, and urban locations is more responsible for the wage gap than the distribution of students into and within educational institutions." Link.
  • A book on the trials and tribulations of building and maintaining the "Harvard of the proletariat": Anthony Picciano and Chet Jordan on the history of the CUNY system. Link.
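As a rough reference for the mobility figures quoted above, the bottom-to-top-quintile mobility rate in "Mobility Report Cards" is the product of access and success rate. A minimal sketch, with inputs that are hypothetical but chosen to echo the magnitudes Chetty et al. report:

```python
# Sketch of the bottom-to-top-quintile "mobility rate" in Chetty et al.'s
# "Mobility Report Cards": the share of a college's students who both come
# from the bottom quintile of the parent income distribution (access) and
# reach the top quintile of the child income distribution (success rate).
# Both example colleges below are hypothetical.

def mobility_rate(access, success_rate):
    """Mobility rate = access x success rate."""
    return access * success_rate

# A hypothetical mid-tier public college: broad access, solid outcomes.
print(mobility_rate(access=0.16, success_rate=0.40))  # 0.064, i.e. 6.4%

# A hypothetical Ivy-Plus college: strong outcomes, very low access.
print(mobility_rate(access=0.04, success_rate=0.58))  # 0.0232, i.e. ~2.3%
```

The decomposition makes the policy tension concrete: a college can post excellent outcomes for its low-income students and still have a low mobility rate if it admits very few of them.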
⤷ Full Article

February 2nd, 2019

The Summit

LEGITIMATE ASSESSMENT

Moving beyond computational questions in digital ethics research

In the ever-expanding digital ethics literature, a number of researchers have been advocating a turn away from enticing technical questions—how to mathematically define fairness, for example—and towards a more expansive, foundational approach to the ethics of designing digital decision systems.

A 2018 paper by RODRIGO OCHIGAME, CHELSEA BARABAS, KARTHIK DINAKAR, MADARS VIRZA, and JOICHI ITO is exemplary of this approach. The authors dissect the three most-discussed categories in the digital ethics space—fairness, interpretability, and accuracy—and argue that current approaches to these topics may unwittingly amount to a legitimation system for unjust practices. From the introduction:

"To contend with issues of fairness and interpretability, it is necessary to change the core methods and practices of machine learning. But the necessary changes go beyond those proposed by the existing literature on fair and interpretable machine learning. To date, ML researchers have generally relied on reductive understandings of fairness and interpretability, as well as a limited understanding of accuracy. This is a consequence of viewing these complex ethical, political, and epistemological issues as strictly computational problems. Fairness becomes a mathematical property of classification algorithms. Interpretability becomes the mere exposition of an algorithm as a sequence of steps or a combination of factors. Accuracy becomes a simple matter of ROC curves.

In order to deepen our understandings of fairness, interpretability, and accuracy, we should avoid reductionism and consider aspects of ML practice that are largely overlooked. While researchers devote significant attention to computational processes, they often lack rigor in other crucial aspects of ML practice. Accuracy requires close scrutiny not only of the computational processes that generate models but also of the historical processes that generate data. Interpretability requires rigorous explanations of the background assumptions of models. And any claim of fairness requires a critical evaluation of the ethical and political implications of deploying a model in a specific social context.

Ultimately, the main outcome of research on fair and interpretable machine learning might be to provide easy answers to concerns of regulatory compliance and public controversy."
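To make the reductions described above concrete, here is a minimal, hypothetical sketch (not the authors' code): "fairness" collapses into a single demographic-parity statistic of a classifier's outputs, and "accuracy" into one point on an ROC curve, with the data-generating history and deployment context stripped away.

```python
# The reductive framing the authors criticize, in miniature: "fairness" as
# a demographic-parity gap, "accuracy" as a TPR/FPR point on an ROC curve.
# Scores, labels, and group memberships are all made up.

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.35, 0.3, 0.1]  # model scores
labels = [1, 1, 0, 1, 0, 1, 0, 0]                   # true outcomes
groups = ["a", "a", "a", "b", "b", "a", "b", "b"]   # group attribute

threshold = 0.5
preds = [1 if s >= threshold else 0 for s in scores]

def positive_rate(group):
    """Share of a group's members receiving the positive prediction."""
    member_preds = [p for p, g in zip(preds, groups) if g == group]
    return sum(member_preds) / len(member_preds)

# "Fairness" reduced to demographic parity: a gap in positive-prediction rates.
print("parity gap:", abs(positive_rate("a") - positive_rate("b")))  # 0.5

# "Accuracy" reduced to one ROC-curve point at this threshold.
true_pos = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
false_pos = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
print("TPR:", true_pos / sum(labels))                   # 0.75
print("FPR:", false_pos / (len(labels) - sum(labels)))  # 0.25
```

Nothing in these numbers says where the labels came from, what assumptions the model encodes, or what deploying it would do in a specific social context, which is precisely the authors' point.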

⤷ Full Article

January 26th, 2019

Bone Mobile

LONG RUNNING

Learning about long-term effects of interventions, and designing interventions to facilitate the long view

A new paper from the Center for Effective Global Action at Berkeley surveys a topic important to our researchers here at JFI: the long-run effects of interventions. In our literature review of cash transfer studies, we identified the need for more work beyond the bounds of a short-term randomized controlled trial. This is especially crucial for basic income, which is among those policies intended to be permanent.

The authors of the new Berkeley report, Adrien Bouguen, Yue Huang, Michael Kremer, and Edward Miguel, note that it’s a particularly apt moment for this kind of work: “Given the large numbers of RCTs launched in the 2000s, every year that goes by means that more and more RCT studies are ‘aging into’ a phase where the assessment of long-run impacts becomes possible.”

The report includes a summary of what we know about long-run impacts so far:

"Section 2 summarizes and evaluates the growing body of evidence from RCTs on the long-term impacts of international development interventions, and find most (though not all) provide evidence for positive and meaningful effects on individual economic productivity and living standards. Most of these studies examine existing cash transfer, child health, or education interventions, and shed light on important theoretical questions such as the existence of poverty traps (Bandiera et al., 2018) and returns to human capital investments in the long term."

Also notable is the last section, which contains considerations for study design, "lessons from our experience in conducting long-term tracking studies, as well as innovative data approaches." Link to the full paper.

  • In his paper "When are Cash Transfers Transformative?," Bruce Wydick also notes the need for long-run analysis: "Whether or not these positive impacts have long-term transformative effects—and under what conditions—is a question that is less settled and remains an active subject of research." The rest of the paper is of interest as well, including Wydick's five factors that tend to signal that a cash transfer will be transformative. Link.
  • For more on the rising popularity of RCTs, a 2016 paper by major RCT influencers Banerjee, Duflo, and Kremer quantifies that growth and discusses the impact of RCTs. Link. Here’s the PowerPoint version of that paper. David McKenzie at the World Bank responds to the paper, disputing some of its claims. Link.
⤷ Full Article

September 16th, 2019

Live Airborne System

DIVIDED INTEGRATION

Immanuel Wallerstein's contributions to research in the social sciences

Two weeks ago today marked the passing of the great Immanuel Wallerstein. His work has had resounding influence across fields: from literature to legal theory, education, development studies, and international relations. Among his foremost contributions is the four-volume Modern World-System series, which recounts the transformation of feudalism into global capitalism through the progressive incorporation of new regions into the European capitalist core. Complementing this history was world-systems theory, an analytical approach that challenged the tendency of social science research to identify simplified and direct causal relationships.

Wallerstein argued that purely economic, historical, or political analyses of society exclude more factors than they incorporate, casting doubt on both their internal and external validity. From the introduction to World-Systems Analysis:

"The phenomena dealt with in these separate boxes are so closely intermeshed that each presumes the other, each affects the other, each is incomprehensible without taking into account the other boxes. The separate boxes of analysis are an obstacle, not an aid, to understanding the world. Structurally, the social reality within which we live has not been the multiple national states of which we are citizens but something larger, which we call a world-system. This world-system has had many institutions—states and the interstate system, productive firms, households, classes, identity groups of all sorts—which form a matrix which permits the system to operate but at the same time stimulates both the conflicts and the contradictions which permeate it.

The world-system is a social creation, with a history, whose origins need to be explained, whose ongoing mechanisms need to be delineated, and whose inevitable terminal crisis needs to be discerned. For this reason, it is important to look anew not only at how the world in which we live works but also at how we have come to think about this world."

Link to the book's first pages.

  • "My intellectual development led me to historicize social movements, not only to better understand how they came to do the things they did, but also in order to better formulate the political options that were truly available in the present." On his website, Wallerstein reflects on the questions and contradictions that informed his life's work. Link.
  • "The Modern World-System is a theoretically ambitious work that deserves to be critically analyzed as such." Theda Skocpol's sympathetic scrutiny of the weaknesses in Wallerstein's major work, from the 1977 Review of American Sociology. Link.
  • Wallerstein's account of feudal breakdown, which stressed external factors like increased trade, countered that of historians like Robert Brenner, who focused instead on internal factors like peasant revolts. Robert A. Denemark and Kenneth P. Thomas give an overview of the debate. Link.
⤷ Full Article

January 19th, 2019

Self-Portrait

PLENARY CONFIDENCE

A look at China's social credit system

In a recent newsletter, we noted a spate of reporting drawing attention to the authoritarianism of China's growing Social Credit System. This week, we are sharing a paper by YU-JIE CHEN, CHING-FU LIN, and HAN-WEI LIU that casts light on the details of the program's workings, corrects common misconceptions, proposes some likely and disturbing future scenarios, and offers a useful frame for understanding the significant shift it is bringing about in Chinese governance.

"A new mode of governance is emerging with the rise of China’s 'Social Credit System' (shehui xinyong zhidu) or SCS. The SCS is an unusual, comprehensive governance regime designed to tackle challenges that are commonly seen as a result of China’s 'trustless' society that has featured official corruption, business scandals and other fraudulent activities. The operation of the SCS relies on a number of important and distinctive features—information gathering, information sharing, labeling, and credit sanctions—which together constitute four essential elements of the system.

In our view, the regime of the SCS reflects what we call the 'rule of trust,' which has significant implications for the legal system and social control in China. We define the 'rule of trust' as a governance mode that imposes arbitrary restrictions—loosely defined and broadly interpreted trust-related rules—to condition, shape, and compel the behavior of the governed subjects… The 'rule of trust' is in fact undermining 'rule of law.'

In the context of governance, the unbounded notion of 'trust' and the unrestrained development of technology are a dangerous combination."

Link to the paper.

⤷ Full Article

January 12th, 2019

Worldviews

SOFT CYBER

Another kind of cybersecurity risk: the destruction of common knowledge

In a report for the Berkman Klein Center, Henry Farrell and Bruce Schneier identify a gap in current approaches to cybersecurity: national cybersecurity officials still base their thinking on Cold War-type threats, while technologists focus on hackers. Moving beyond both framings, Farrell and Schneier make a wider argument about collective knowledge in democratic systems—and the dangers of its diminishment.

From the abstract:

"We demonstrate systematic differences between how autocracies and democracies work as information systems, because they rely on different mixes of common and contested political knowledge. Stable autocracies will have common knowledge over who is in charge and their associated ideological or policy goals, but will generate contested knowledge over who the various political actors in society are, and how they might form coalitions and gain public support, so as to make it more difficult for coalitions to displace the regime. Stable democracies will have contested knowledge over who is in charge, but common knowledge over who the political actors are, and how they may form coalitions and gain public support... democracies are vulnerable to measures that 'flood' public debate and disrupt shared decentralized understandings of actors and coalitions, in ways that autocracies are not."

One compelling metaresearch point from the paper is that autocratic governments receive analysis of information trade-offs, while democratic governments do not:

"There is existing research literature on the informational trade-offs or 'dictators' dilemmas' that autocrats face, in seeking to balance between their own need for useful information and economic growth, and the risk that others can use available information to undermine their rule. There is no corresponding literature on the informational trade-offs that democracies face between desiderata like availability and stability."

Full paper available on SSRN here.

  • Farrell summarizes the work on Crooked Timber: "In other words, the same fake news techniques that benefit autocracies by making everyone unsure about political alternatives undermine democracies by making people question the common political systems that bind their society." Many substantive comments follow. Link.
  • Jeremy Wallace, an expert on authoritarianism, weighs in on Twitter: "Insiders, inevitably, have even more information about the contours of these debates. On the other hand, there's a lot that dictators don't know--about their own regimes, the threats that they are facing, etc." Link to Wallace's work on the topic.
  • Related reading recommended by Wallace, from Daniel Little, a 2016 paper on propaganda: "Surprisingly, the government tends to pick a high level of propaganda precisely when it is ineffective." Link.
⤷ Full Article

January 5th, 2019

Aunt Eliza

PROCESS INTEGRATION

Bringing evidence to bear on policy

Happy 2019. We’re beginning with a report from Evidence in Practice, a project from the Yale School of Management. The report focuses on how to integrate rigorously researched evidence with policy and practice, with an emphasis on international development. The report enumerates the numerous stakeholders involved in research and policymaking, along with their respective needs and priorities: funders, researchers, intermediaries, policymakers, and implementers each receive consideration. One of the strengths of the report is its quotations from dozens of interviews across these groups, which give a sense of the messy, at times frustrating, always collaborative business of effecting change in the world. As to the question of what works:

⤷ Full Article

December 22nd, 2018

Frohes Fest | Year in Review

The JFI Letter has grown and morphed over the past twelve months; thank you to our readers for opening, skimming, clicking, and writing us every week. We'll be offline until January 5. In the meantime, here's a list of our favorite spotlights from last year and a list of favorite researchers to watch in the next.

⤷ Full Article

December 15th, 2018

Space Dance

SCIENTIFIC RETURNS

A new book examines the economic and social impacts of R&D

Last May, we highlighted a report on workforce training and technological competitiveness which outlined trends in research and development investment. The report found that despite "total U.S. R&D funding reaching an all-time high in 2015," funding has shifted dramatically toward the private sector: "federal funding for R&D, which goes overwhelmingly to basic scientific research, has declined steadily and is now at the lowest level since the early 1950s." This week, we take a look at the returns to these investments and discuss how best to measure and trace the ways research spending affects economic activity and policy.

In the most recent Issues in Science and Technology, IRWIN FELLER reviews Measuring the Economic Value of Research, a technical monograph that discusses how best to measure the impact and value of research on policy objectives. Notably, the book highlights UMETRICS, a unified dataset from a consortium of universities "that can be used to better inform decisions relating to the level, apportionment, human capital needs, and physical facility requirements of public investments in R&D and the returns of these investments." While it represents a big data approach to program evaluation, Feller notes that UMETRICS' strength is in the "small data, theory-driven, and exacting construction of its constituent datasets," all of which offer insight into the importance of human capital in successful R&D:

"The book’s characterization of the ways in which scientific ideas are transmitted to and constitute value to the broader economy encompasses publications and patents, but most importantly includes the employment of people trained in food safety research. This emphasis on human capital reflects a core proposition of UMETRICS, namely the 'importance of people—students, principal investigators, postdoctoral researchers, and research staff—who conduct research, create new knowledge, and transmit that knowledge into the broader economy.'

In particular, the chapters on workforce dynamics relating to employment, earnings, occupations, and early careers highlight the nuanced, disaggregated, and policy-relevant information made possible by UMETRICS. These data provide much-needed reinforcement to the historic proposition advanced by research-oriented universities that their major contribution to societal well-being—economic and beyond—is through the joint production of research and graduate education, more than patents or other metrics of technology transfer or firm formation."

The UMETRICS dataset traces the social and economic returns of research universities and allows for a larger examination of universities as sociopolitical anchors and scientific infrastructure.

⤷ Full Article