Phenomenal World

March 9th, 2019

Incomplete Squares

CONTEXT ALLOCATION

Expanding the frame for formalizing fairness

In the digital ethics literature, there's a consistent back-and-forth between attempts at designing algorithmic tools that promote fair outcomes in decision-making processes, and critiques that enumerate the limits of such attempts. A December paper by ANDREW SELBST, danah boyd, SORELLE FRIEDLER, SURESH VENKATASUBRAMANIAN, and JANET VERTESI—delivered at FAT* 2019—contributes to the latter genre. The authors build on insights from Science and Technology Studies and offer a list of five "traps"—Framing, Portability, Formalism, Ripple Effect, and Solutionism—that fair-ML work is susceptible to as it aims for context-aware systems design. From the paper:

"We contend that by abstracting away the social context in which these systems will be deployed, fair-ML researchers miss the broader context, including information necessary to create fairer outcomes, or even to understand fairness as a concept. Ultimately, this is because while performance metrics are properties of systems in total, technical systems are subsystems. Fairness and justice are properties of social and legal systems like employment and criminal justice, not properties of the technical tools within. To treat fairness and justice as terms that have meaningful application to technology separate from a social context is therefore to make a category error, or as we posit here, an abstraction error."

In their critique of what is left out in the formalization process, the authors argue that, by "moving decisions made by humans and human institutions within the abstraction boundary, fairness of the system can again be analyzed as an end-to-end property of the sociotechnical frame." Link to the paper.
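
The point about the abstraction boundary lends itself to a concrete illustration. Below is a minimal sketch—entirely our own construction, with hypothetical names and a made-up human-override rule, not an example from the paper—of how a classifier can look fair when audited as a subsystem yet fail the same check once the surrounding human decision process is drawn inside the evaluation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a protected group label and model risk scores that
# are, by construction, independent of group membership.
group = rng.integers(0, 2, size=10_000)
scores = rng.uniform(size=10_000)

# The "technical subsystem": a simple score threshold.
model_decision = scores > 0.5

# A hypothetical institutional practice outside the model: reviewers
# approve borderline cases, but only for group 0.
final_decision = model_decision | ((scores > 0.4) & (group == 0))

def selection_rate_gap(decision, group):
    """Demographic-parity gap: |P(D=1 | g=0) - P(D=1 | g=1)|."""
    return abs(decision[group == 0].mean() - decision[group == 1].mean())

print("model-only gap:", selection_rate_gap(model_decision, group))  # ~0.00
print("end-to-end gap:", selection_rate_gap(final_decision, group))  # ~0.10
```

The model passes a demographic-parity audit in isolation; the sociotechnical system it sits inside does not—which is the sense in which fairness is an end-to-end property.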

  • A brand-new paper by HODA HEIDARI, VEDANT NANDA, and KRISHNA GUMMADI attempts to produce fairness metrics that look beyond "allocative equality," and directly grapples with the above-mentioned "ripple effect trap." The authors "propose an effort-based measure of fairness and present a data-driven framework for characterizing the long-term impact of algorithmic policies on reshaping the underlying population." Link.
  • In the footnotes to the paper by Selbst et al., a 1997 chapter by early AI researcher turned sociologist Phil Agre. In the chapter: institutional and intellectual history of early AI; sociological study of the AI field at the time; Agre’s departure from the field; discussions of developing "critical technical practice." Link.
⤷ Full Article

March 2nd, 2019

Weak Local Lineament

REACH ARREARS

Charting the significance of credit and debt throughout society

Household debt has proliferated in the past decade. In the final quarter of 2018, it reached $13.54 trillion—an $869 billion increase over the previous peak in 2008 and a 21.4% increase since the post-crisis trough. While it is now widely recognized that the financialization of household consumption laid the groundwork for the Great Recession (see, for example, this chapter by Manuel Aalbers), credit markets seem immune to structural reform.

On the one hand, access to credit enables purchases and investments crucial to long-term financial mobility; on the other, it incorporates those who lack resources into a cycle of obligations to lenders. In her most recent publication in the Annual Review of Sociology, RACHEL E. DWYER asks how debt has shaped the American social landscape. She develops a two-dimensional model of formal debt relationships, categorizing contracts according to the source of credit (state vs. market) and the nature of the obligation (prospective vs. retrospective). The model integrates the logic of debt and credit relationships with an analysis of distributional politics:

"The top row of prospective credit offers are more likely made to affluent or middle-class and disproportionately white populations, and the bottom row of retrospective financial obligations are more likely to fall on lower-income or poor and disproportionately racial/ethnic minority populations. The experience of debt and financial fragility is thus different across these social groups defined by class, race/ethnicity, and other social status, though also tied together by similar logics of financialization and individualized accountability for life conditions."

Dwyer's research shows how credit and debt relations vary geographically and temporally, encouraging a comparative analysis of debt relationships in countries with different political economies. Link to the article.
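
To fix ideas, Dwyer's two dimensions can be laid out as a 2×2 grid. The sketch below is our own gloss; the cell examples are illustrative placements, not necessarily how the article assigns them:

```python
# Dwyer's two axes: source of credit (state vs. market) and nature of
# the obligation (prospective vs. retrospective). Cell examples are
# illustrative placements, not Dwyer's own assignments.
debt_relationships = {
    ("state", "prospective"): "federal student loans",
    ("market", "prospective"): "mortgages, credit cards",
    ("state", "retrospective"): "criminal legal fines and fees, tax debt",
    ("market", "retrospective"): "medical debt in collections",
}

for (source, timing), example in debt_relationships.items():
    print(f"{source:>6} / {timing:<13} -> {example}")
```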

  • On the unique role that credit markets play in the American economy, see Monica Prasad on the credit-welfare state tradeoff, and Colin Crouch on privatized Keynesianism. Link to the first; link to the second.
  • For a pre-crisis examination of credit and inequality, see Patrick Bolton and Howard Rosenthal's Credit Markets for the Poor. Link.
  • Vicki Been, Ingrid Ellen, and Josiah Madar explore the relationship between urban segregation and subprime mortgage lending. Link.

New Researchers: VISIBILITY PREMIUM

Political effects of celebrity exposure

In a novel paper, HEYU XIONG—a PhD candidate at NORTHWESTERN and newly appointed professor at CASE WESTERN RESERVE UNIVERSITY—studies the political consequences of television celebrity. Using the career of Ronald Reagan as a case study, he exploits quasi-experimental variation in television reception to estimate the effects of celebrity media exposure on political outcomes, finding that support for Reagan based on non-political factors persisted nearly two decades after his television career—an effect more pronounced in areas where Reagan was not a known political entity. The findings suggest that elections hinge considerably more on non-political media exposure and personal characteristics than previously assumed.
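
The research design is a standard quasi-experimental setup; a schematic version (the file, variables, and controls below are our own hypothetical stand-ins, not Xiong's specification) might look like:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical county-level data linking 1950s television reception
# to later Reagan vote shares. File and variable names are ours.
df = pd.read_csv("county_tv_data.csv")

# Reduced form: vote share regressed on early TV reception, controlling
# for terrain so the remaining variation in signal strength is
# plausibly quasi-random, with state fixed effects.
model = smf.ols(
    "reagan_vote_share ~ tv_reception + terrain_ruggedness + C(state)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["state"]})
print(model.summary())
```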

From the abstract:

"My results contribute to our knowledge of the vote decision process. Understanding what candidate information is pertinent and how that information is processed is key to understanding the selection of elected officials and, subsequently, the policies those elected officials enact. The economic theory of electoral competition is traditionally situated in the framework of the policy oriented voter. Even without the assertion of rationality, voters are, at the very least, presumed to be voting in order to advance a policy position or to express a political preference. While this preoccupation is not misplaced, the results suggest that candidates' personal characteristics constitute a significant, if substandard, criterion for vote choice."

Link to the paper.

⤷ Full Article

February 23rd, 2019

Grievous Plans

NO SHORTAGE

New evidence on the relationship between skills and labor supply

More than a decade after the financial crisis of 2008, median household incomes have stagnated at their pre-2008 levels, and global economic growth is expected to decline further from what is already a historic low. While the unemployment rate has rebounded, part-time, service, and temporary work remain the principal drivers of labor market growth. Weak recovery from the crisis has been widely attributed to the “skills gap”: commentators and policymakers alike hold that quality jobs are there, but that Americans are simply not qualified to perform them.

At the American Economic Association’s most recent conference, ALICIA SASSER MODESTINO, DANIEL SHOAG, and JOSHUA BALLANCE presented evidence against this view. Using a proprietary database of more than 36 million online job postings, they show that employers increased skill requirements in states and occupations that experienced larger increases in the unemployment rate. Their findings suggest that it wasn’t a shortage of skills that weakened labor markets, but rather an abundance of qualified applicants that drove employers to raise hiring standards. By testing employer responses to an influx of veterans from Iraq and Afghanistan, the authors confirm this mechanism:

"As a source of exogenous variation in the availability of skilled workers, we make use of a natural experiment resulting from the large increase in the post-9/11 veteran labor force following troop withdrawals from Iraq and Afghanistan... Panel A of Table 5 demonstrates that there is a strong, significant, and positive relationship between the sharp increase in the supply of returning veterans and the rise in employer skill requirements for both education and experience."

This is among the first pieces of empirical evidence suggesting that employer skill requirements are driven, in part, by labor supply. Link to the conference webpage, where a full version of the paper is available for download.
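
Schematically, the paper's core test is a regression of posted skill requirements on local labor-market slack. The sketch below uses our own file and variable names, not the authors' specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical state-by-occupation panel aggregated from job postings.
postings = pd.read_csv("postings_panel.csv")

# Share of postings requiring a bachelor's degree regressed on the local
# unemployment rate, with cell and year fixed effects. A positive
# coefficient indicates employers raise requirements in slack markets.
model = smf.ols(
    "share_requiring_ba ~ unemployment_rate + C(state_occ) + C(year)",
    data=postings,
).fit(cov_type="cluster", cov_kwds={"groups": postings["state_occ"]})
print(model.params["unemployment_rate"])
```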

  • As early as 2011, Lawrence Mishel argued against analysts who asserted that the unemployment crisis was structural, proposing instead that the economy was experiencing a crisis of demand. Link.
  • In his most recent book, LSE anthropologist David Graeber examines the relationship between skill and value, questioning why jobs which produce the most social value tend to be categorized as unskilled, consequently earning lower wages. Link to Graeber's widely acclaimed essay from 2013 that first outlined his argument, and link to the Google preview of his new book.
  • In a report for the Roosevelt Institute, Marshall Steinbaum and Julie Margetta Morgan argue that the 'skills gap' narrative is inconsistent with the student debt crisis: "Although the country’s populace is becoming more educated, each educational group is becoming less well paid." Link.
  • Paul Osterman wrote an accessible overview of the debate for The Atlantic in 2014: “The claim that a shortage of skilled workers has exacerbated inequality has gained traction but it is not supported by the data… For instance, while 38 percent of manufacturing firms require math beyond simple addition, subtraction, and multiplication, the type of math employees need to be able to handle are standard features of a good high school education and part of the curriculum for most community-college students…Nearly 65 percent of businesses report they have no vacancies whatsoever, and another 76.3 percent report they have no long-term vacancies…” Link.
⤷ Full Article

February 16th, 2019

Cup and Ring

GAP PROGRESSION

New life in the debates over poverty measurement

In recent weeks, a familiar debate over how we understand the global poverty rate across time reappeared in mainstream op-ed pages. Sparked initially by Bill Gates tweeting out an infographic produced by Our World in Data—which visualizes a massive decrease in global poverty (from 94% to 10% of people) over the past two hundred years—the notable discussants have been LSE anthropologist JASON HICKEL and Our World in Data researchers JOE HASELL and MAX ROSER.

Hickel published a polemical Guardian op-ed criticizing the publication of this chart, which, he argued, misrepresents the history it claims to communicate and relies on contestable and imprecise data sources to bolster its universal progress narrative, taking "the violence of colonisation and repackaging it as a happy story of progress." The responses were numerous.

Among them, a post by Hasell and Roser provided detailed descriptions of the methods and data behind their work to answer the following: "How do we know that the vast majority of the world population lived in extreme poverty just two centuries ago as this chart indicates? And how do we know that this account of falling global extreme poverty is in fact true?"

In addition to methodological arguments regarding data sources and the poverty line, Hickel's argument emphasizes the gap between poverty and the capacity to eliminate it:

"What matters, rather, is the extent of global poverty vis-à-vis our capacity to end it. As I have pointed out before, our capacity to end poverty (e.g., the cost of ending poverty as a proportion of the income of the non-poor) has increased many times faster than the proportional poverty rate has decreased. By this metric we are doing worse than ever before. Indeed, our civilization is regressing. On our existing trajectory, according to research published in the World Economic Review, it will take more than 100 years to end poverty at $1.90/day, and over 200 years to end it at $7.4/day. Let that sink in. And to get there with the existing system—in other words, without a fairer distribution of income—we will have to grow the global economy to 175 times its present size. Even if such an outlandish feat were possible, it would drive climate change and ecological breakdown to the point of undermining any gains against poverty.

It doesn’t have to be this way, of course."

Link to that post, and link to a subsequent one, which responds directly to the methods and data-use questions addressed by Hasell and Roser.
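
Hickel's "capacity" metric can be written out explicitly. The formalization below is ours, not his notation: with poverty line z and incomes y_it, the cost of ending poverty is the aggregate poverty gap, and the relevant ratio is that cost as a share of the income of the non-poor:

```latex
% Our formalization of Hickel's verbal definition, not his notation.
% z: poverty line; y_{it}: income of person i at time t.
\[
\mathrm{cost}_t = \sum_{i \,:\, y_{it} < z} (z - y_{it}),
\qquad
C_t = \frac{\mathrm{cost}_t}{\sum_{i \,:\, y_{it} \ge z} y_{it}}
\]
```

Capacity to end poverty rises as C_t falls; Hickel's claim is that C_t has fallen much faster than the headcount ratio, so that poverty measured against our capacity to end it looks worse than the headline trend suggests.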

⤷ Full Article

February 9th, 2019

Overboard

HETEROGENEOUS GAP

In search of a more just model for higher education financing

This week, we delve into the persisting inequalities of our higher education system. Since Winston, Hill, and Boyd found in 2005 that only 10% of students at elite universities came from families in the bottom 40% of the income distribution, universities across the board have revived efforts to diversify their student bodies.

The idea that there's a need for greater socioeconomic diversity in higher education is largely uncontroversial, particularly amid growing evidence of the higher earnings potential of college graduates. There is far less consensus, however, on which policies are best suited to closing this gap. In the Journal of Higher Education, ROSINGER, BELASCO, and HEARN examine the impact of both means-tested and universal policies that replace student loans with grants in financial aid packages. The impact of these policies on socioeconomic diversity is somewhat counterintuitive:

"We found that colleges offering universal discounts experienced increased enrollment among middle-class students. Our study indicates universal no-loan policies represent one strategy to increase access and affordability for the middle-class in the elite reaches of higher education. The study also, however, raises questions about the policies’ effectiveness in addressing access for low-income students and efficiency in targeting aid."

Link to the full paper.

  • For more on the potential for universities to facilitate both the entrenchment and supersession of generational inequalities, see the groundbreaking 2017 paper by Chetty et al. The authors used fourteen years of federal income tax data to construct mobility report cards for nearly 2,000 colleges, provoking a range of new literature in the field. Their findings: "The colleges that have the highest bottom-to-top-quintile mobility rates – i.e., those that offer both high success rates and low-income access – are typically mid-tier public institutions. For instance, many campuses of the City University of New York (CUNY), certain California State colleges, and several campuses in the University of Texas system have mobility rates above 6%… Elite private (Ivy-Plus) colleges have an average mobility rate of 2.2%." Link to the paper, as well as the digitization of its results, courtesy of the New York Times. (A worked decomposition of the mobility rate follows this list.)
  • Drawing on "Mobility Report Cards," a recent paper by Bloome, Dyer, and Zhou finds that parental income has become less predictive of adult income, offsetting intergenerational income persistence resulting from education. Link.
  • Anna Manzoni and Jessi Streib find that wage gaps between first- and continuing-generation college students are not caused by the institutions they attend, the grades they earn, or the subjects they study: "Our decomposition analysis shows that the uneven distribution of students into labor market sectors, occupations, hours worked, and urban locations is more responsible for the wage gap than the distribution of students into and within educational institutions." Link.
  • A book on the trials and tribulations of building and maintaining the "Harvard of the proletariat": Anthony Picciano and Chet Jordan on the history of the CUNY system. Link.
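
The mobility rate in "Mobility Report Cards" is the product of access and success, which makes the CUNY/Ivy-Plus comparison quoted above easy to read. The identity is from the paper; the numerical split below it is illustrative arithmetic, not figures quoted from the text:

```latex
% Mobility rate = access x success.
\[
\underbrace{\mathrm{MR}}_{\text{mobility rate}}
= \underbrace{P(\text{parent in } Q_1)}_{\text{access}}
\times
\underbrace{P(\text{child in } Q_5 \mid \text{parent in } Q_1)}_{\text{success}}
\]
```

An Ivy-Plus mobility rate of 2.2% could arise from, say, 3.8% access times roughly 58% success (0.038 × 0.58 ≈ 0.022): elite colleges do very well on success but so poorly on access that mid-tier public institutions overtake them on the product.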
⤷ Full Article

February 2nd, 2019

The Summit

LEGITIMATE ASSESSMENT

Moving beyond computational questions in digital ethics research

In the ever-expanding digital ethics literature, a number of researchers have been advocating a turn away from enticing technical questions—how to mathematically define fairness, for example—and towards a more expansive, foundational approach to the ethics of designing digital decision systems.

A 2018 paper by RODRIGO OCHIGAME, CHELSEA BARABAS, KARTHIK DINAKAR, MADARS VIRZA, and JOICHI ITO is exemplary of this approach. The authors dissect the three most-discussed categories in the digital ethics space—fairness, interpretability, and accuracy—and argue that current approaches to these topics may unwittingly amount to a legitimation system for unjust practices. From the introduction:

“To contend with issues of fairness and interpretability, it is necessary to change the core methods and practices of machine learning. But the necessary changes go beyond those proposed by the existing literature on fair and interpretable machine learning. To date, ML researchers have generally relied on reductive understandings of fairness and interpretability, as well as a limited understanding of accuracy. This is a consequence of viewing these complex ethical, political, and epistemological issues as strictly computational problems. Fairness becomes a mathematical property of classification algorithms. Interpretability becomes the mere exposition of an algorithm as a sequence of steps or a combination of factors. Accuracy becomes a simple matter of ROC curves.

In order to deepen our understandings of fairness, interpretability, and accuracy, we should avoid reductionism and consider aspects of ML practice that are largely overlooked. While researchers devote significant attention to computational processes, they often lack rigor in other crucial aspects of ML practice. Accuracy requires close scrutiny not only of the computational processes that generate models but also of the historical processes that generate data. Interpretability requires rigorous explanations of the background assumptions of models. And any claim of fairness requires a critical evaluation of the ethical and political implications of deploying a model in a specific social context.

Ultimately, the main outcome of research on fair and interpretable machine learning might be to provide easy answers to concerns of regulatory compliance and public controversy.”
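
To see concretely what these reductions look like in practice, here is a minimal sketch—synthetic data and our own construction, not code from the paper—in which fairness, interpretability, and accuracy each collapse to a single computation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic data: two features, a binary label, a binary group attribute.
X = rng.normal(size=(5_000, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=5_000) > 0).astype(int)
group = rng.integers(0, 2, size=5_000)

clf = LogisticRegression().fit(X, y)
pred = clf.predict(X)
scores = clf.predict_proba(X)[:, 1]

# "Fairness" as a mathematical property of the classifier:
dp_gap = abs(pred[group == 0].mean() - pred[group == 1].mean())

# "Interpretability" as exposing the model as a combination of factors:
weights = dict(zip(["x1", "x2"], clf.coef_[0]))

# "Accuracy" as a simple matter of ROC curves:
auc = roc_auc_score(y, scores)

print(dp_gap, weights, auc)
```

Each quantity is well-defined and easy to report; the authors' point is that none of them touches the historical process that generated the data or the social context in which the model would be deployed.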

⤷ Full Article

January 26th, 2019

Bone Mobile

LONG RUNNING

Learning about long-term effects of interventions, and designing interventions to facilitate the long view

A new paper from the Center for Effective Global Action at Berkeley surveys a topic important to our researchers here at JFI: the question of long-run effects of interventions. In our literature review of cash transfer studies, we identified the need for more work beyond the bounds of a short-term randomized controlled trial. This is especially crucial for basic income, which is among those policies intended to be permanent.

The authors of the new Berkeley report, Adrien Bouguen, Yue Huang, Michael Kremer, and Edward Miguel, note that it’s a particularly apt moment for this kind of work: “Given the large numbers of RCTs launched in the 2000’s, every year that goes by means that more and more RCT studies are ‘aging into’ a phase where the assessment of long-run impacts becomes possible.”

The report includes a summary of what we know about long-run impacts so far:

"Section 2 summarizes and evaluates the growing body of evidence from RCTs on the long-term impacts of international development interventions, and find most (though not all) provide evidence for positive and meaningful effects on individual economic productivity and living standards. Most of these studies examine existing cash transfer, child health, or education interventions, and shed light on important theoretical questions such as the existence of poverty traps (Bandiera et al., 2018) and returns to human capital investments in the long term."

Also notable is the last section, which contains considerations for study design, "lessons from our experience in conducting long-term tracking studies, as well as innovative data approaches." Link to the full paper.

  • In his paper "When are Cash Transfers Transformative?," Bruce Wydick also notes the need for long-run analysis: "Whether or not these positive impacts have long-term transformative effects—and under what conditions—is a question that is less settled and remains an active subject of research." The rest of the paper is of interest as well, including Wydick's five factors that tend to signal that a cash transfer will be transformative. Link.
  • For more on the rising popularity of RCTs, a 2016 paper by major RCT influencers Banerjee, Duflo, and Kremer quantifies that growth and discusses the impact of RCTs. Link. Here’s the PowerPoint version of that paper. David McKenzie at the World Bank responds to the paper, disputing some of its claims. Link.
⤷ Full Article

January 19th, 2019

Self-Portrait

PLENARY CONFIDENCE

A look at China's social credit system

In a recent newsletter, we noted a spate of reporting drawing attention to the authoritarianism of China's growing Social Credit System. This week, we are sharing a paper by YU-JIE CHEN, CHING-FU LIN, and HAN-WEI LIU that casts light on the details of the program's workings, corrects common misconceptions, proposes some likely and disturbing future scenarios, and offers a useful frame for understanding the significant shift it is bringing about in Chinese governance.

"A new mode of governance is emerging with the rise of China’s 'Social Credit System' (shehui xinyong zhidu) or SCS. The SCS is an unusual, comprehensive governance regime designed to tackle challenges that are commonly seen as a result of China’s 'trustless' society that has featured official corruption, business scandals and other fraudulent activities. The operation of the SCS relies on a number of important and distinctive features—information gathering, information sharing, labeling, and credit sanctions—which together constitute four essential elements of the system.

In our view, the regime of the SCS reflects what we call the 'rule of trust,' which has significant implications for the legal system and social control in China. We define the 'rule of trust' as a governance mode that imposes arbitrary restrictions—loosely defined and broadly interpreted trust-related rules—to condition, shape, and compel the behavior of the governed subjects… The 'rule of trust' is in fact undermining 'rule of law.'

In the context of governance, the unbounded notion of 'trust' and the unrestrained development of technology are a dangerous combination."

Link to the paper.

⤷ Full Article

January 12th, 2019

Worldviews

SOFT CYBER

Another kind of cybersecurity risk: the destruction of common knowledge

In a report for the Berkman Klein Center, Henry Farrell and Bruce Schneier identify a gap in current approaches to cybersecurity: national security officials still base their thinking on Cold War-type threats, while technologists focus on hackers. Bridging the two approaches, Farrell and Schneier make a wider argument about collective knowledge in democratic systems—and the dangers of its diminishment.

From the abstract:

"We demonstrate systematic differences between how autocracies and democracies work as information systems, because they rely on different mixes of common and contested political knowledge. Stable autocracies will have common knowledge over who is in charge and their associated ideological or policy goals, but will generate contested knowledge over who the various political actors in society are, and how they might form coalitions and gain public support, so as to make it more difficult for coalitions to displace the regime. Stable democracies will have contested knowledge over who is in charge, but common knowledge over who the political actors are, and how they may form coalitions and gain public support... democracies are vulnerable to measures that 'flood' public debate and disrupt shared decentralized understandings of actors and coalitions, in ways that autocracies are not."

One compelling meta-research point from the paper is that the informational trade-offs facing autocracies have been analyzed at length, while those facing democracies have not:

"There is existing research literature on the informational trade-offs or 'dictators' dilemmas' that autocrats face, in seeking to balance between their own need for useful information and economic growth, and the risk that others can use available information to undermine their rule. There is no corresponding literature on the informational trade-offs that democracies face between desiderata like availability and stability."

Full paper available on SSRN here.

  • Farrell summarizes the work on Crooked Timber: "In other words, the same fake news techniques that benefit autocracies by making everyone unsure about political alternatives undermine democracies by making people question the common political systems that bind their society." Many substantive comments follow. Link.
  • Jeremy Wallace, an expert on authoritarianism, weighs in on Twitter: "Insiders, inevitably, have even more information about the contours of these debates. On the other hand, there's a lot that dictators don't know--about their own regimes, the threats that they are facing, etc." Link to Wallace's work on the topic.
  • Related reading recommended by Wallace, from Daniel Little, a 2016 paper on propaganda: "Surprisingly, the government tends to pick a high level of propaganda precisely when it is ineffective." Link.
⤷ Full Article

January 5th, 2019

Aunt Eliza

PROCESS INTEGRATION

Bringing evidence to bear on policy

Happy 2019. We’re beginning with a report from Evidence in Practice, a project at the Yale School of Management. The report focuses on how to integrate rigorously researched evidence with policy and practice, with an emphasis on international development. The numerous stakeholders involved in research and policymaking—funders, researchers, intermediaries, policymakers, and implementers—are each given consideration, along with their particular needs and priorities. One of the strengths of the report is its quotations from dozens of interviews across these groups, which give a sense of the messy, at times frustrating, always collaborative business of effecting change in the world. As to the question of what works:

⤷ Full Article