➔ Phenomenal World

December 9th, 2017

Un Coup de dés


A new report argues that quality, not access, is the pivotal challenge for colleges and universities

From the American Academy of Arts and Sciences, a 112-page report with "practical and actionable recommendations to improve the undergraduate experience":

"Progress toward universal education has expanded most recently to colleges and universities. Today, almost 90 percent of high school graduates can expect to enroll in an undergraduate institution at some point during young adulthood and they are joined by millions of adults seeking to improve their lives. What was once a challenge of quantity in American undergraduate education, of enrolling as many students as possible, is now a challenge of quality—of making sure that all students receive the rigorous education they need to succeed, that they are able to complete the studies they begin, and that they can do this affordably, without mortgaging the very future they seek to improve."

Link to the full report. Co-authors include Gail Mellow, Sherry Lansing, Mitch Daniels, and Shirley Tilghman. ht Will, who highlights a few of the report's recommendations that stand out:

  • From page 40: "Both public and private colleges and universities as well as state policy-makers [should] work collaboratively to align learning programs and expectations across institutions and sectors, including implementing a transferable general education core, defined transfer pathway maps within popular disciplines, and transfer-focused advising systems that help students anticipate what it will take for them to transfer without losing momentum in their chosen field."
  • From page 65: "Many students, whether coming straight out of high school or adults returning later to college, face multiple social and personal challenges that can range from homelessness and food insecurity to childcare, psychological challenges, and even imprisonment. The best solutions can often emerge from building cooperation between a college and relevant social support agencies."
  • From page 72: "Experiment with and carefully assess alternatives for students to manage the financing of their college education. For example, income-share agreements allow college students to borrow from colleges or investors, which then receive a percentage of the student’s after-graduation income."
  • On a related note, see this 2016 paper from the Miller Center at the University of Virginia: "Although interest in the ISA as a concept has ebbed and flowed since Milton Friedman first proposed it in the 1950s, today it is experiencing a renaissance of sorts as new private sector partners and institutions look to make the ISA a feasible option for students. ISAs offer a novel way to inject private capital into higher education systems while striking a balance between consumer preferences and state needs for economic skill sets. The different ways ISAs can be structured make them highly suitable as potential solutions for many states’ education system financing problems." Link.
  • Meanwhile, Congress is working on the reauthorization of the Higher Education Act: "Much of the proposal that House Republicans released last week is controversial and likely won’t make it into the final law, but the plan provides an indication of Congressional Republicans’ priorities for the nation’s higher education system. Those priorities include limiting the federal government’s role in regulating colleges, capping graduate student borrowing, making it easier for schools to limit undergraduate borrowing — and overhauling the student loan repayment system. Many of those moves have the potential to create a larger role for private industry." Link.
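The repayment structure that distinguishes an ISA from a conventional loan can be sketched numerically. All figures below are hypothetical, chosen only to illustrate the mechanics; `isa_total_paid` and `loan_total_paid` are illustrative helpers, not drawn from the report or the Miller Center paper.

```python
# Hypothetical figures only, illustrating the structure of an ISA
# versus a conventional fixed-rate loan; not taken from the report.

def isa_total_paid(incomes, share):
    """Total repaid under an ISA: a fixed share of each year's income."""
    return sum(share * y for y in incomes)

def loan_total_paid(principal, rate, years):
    """Total repaid on a fixed-rate loan with equal annual payments."""
    payment = principal * rate / (1 - (1 + rate) ** -years)
    return payment * years

# A graduate whose income grows from $40k to $60k over five years:
incomes = [40_000, 45_000, 50_000, 55_000, 60_000]
print(round(isa_total_paid(incomes, share=0.05), 2))  # 5% of income each year
print(round(loan_total_paid(20_000, 0.06, 5), 2))     # $20k loan, 6%, 5 years
```

The contrast the sketch makes visible: the ISA total scales with realized income, shifting downside risk away from the student, while the loan total is fixed regardless of how the graduate's earnings turn out.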
⤷ Full Article

March 31st, 2020



Historical comparisons of European monetary unions

The need to formulate a unified COVID response has placed pressure on European integration in recent days, with Germany and the Netherlands resisting Southern European calls for the issuing of "coronabonds." A 2018 paper by John Ryan and John Loughlin assesses the history of the Latin Monetary Union (LMU), the Scandinavian Monetary Union (SMU), and the Austro-Hungarian Monetary Union (AHMU) in order to glean lessons for EU policymakers in the present.

From the paper:

"The LMU was originally envisaged as a bimetallic agreement, though it transitioned into an effective gold standard in 1878. French economist and politician Félix Esquirou de Parieu saw such a union as the first step in a process of European (even global) integration, which he hoped would culminate in the creation of a full common currency, and, as he predicted somewhat precociously, a 'European Union' directed by a 'European Commission'. The disintegration of the union with the Great War illustrates the danger of insufficient coordination among member states. Partially inspired by the LMU, the SMU was deeply tied to the rise of a political Scandinavism. Like the LMU, it foundered as a result of the impact of the First World War. The conditions were propitious in the Scandinavian countries as they imitated each other’s policy approaches. There were, however, great economic disparities across the different countries, and this points to the dangers of a monetary union without sufficient economic convergence among its member states. Finally, the AHMU was created through an agreement known as the 1867 Compromise which ensured that Austria and Hungary shared a common currency while remaining fiscally sovereign. The main lesson of the AHMU is about the nature of institutional structures. Because of the relative size and power of Austria and Hungary, the union's disintegration illuminates the game theoretic interaction of nations within a monetary union, including their asymmetric ability to exert power and influence over the terms of the supranational agreement."

Link to the piece.

  • "The decision to create the monetary union, the decision of whom to admit, and the decision of whom to appoint to run the ECB are political decisions, taken by political leaders, subject to political constraints, not the social-welfare maximizing decisions of some mythical social planner." Barry Eichengreen and Jeffry Frieden analyze "The Political Economy of European Monetary Unification." Link.
  • A 2019 Max Weber lecture by Philippe Van Parijs discusses notions of justice and their (in)operability within the monetary union framework, featuring discussion from Rawls on the EU and a reading of Hayek on monetary unions. Link.
  • "Many regional currency institutions were established in sub-Saharan Africa under colonial rule. Surprisingly, a number of these colonial institutions survived the transition to national independence, and several have survived to the present day." Scott Cooper and Clark Asay on the colonial legacy of the West African franc zones and the Southern African rand zone. Link.
⤷ Full Article

December 2nd, 2017



The gray box of XAI

A recent longform piece in the New York Times identifies the problem of explaining artificial intelligence. The stakes are high because of the European Union’s controversial and unclear “right-to-explanation” law, which will become active in May 2018.

“Instead of certainty and cause, A.I. works off probability and correlation. And yet A.I. must nonetheless conform to the society we’ve built — one in which decisions require explanations, whether in a court of law, in the way a business is run or in the advice our doctors give us. The disconnect between how we make decisions and how machines make them, and the fact that machines are making more and more decisions for us, has birthed a new push for transparency and a field of research called explainable A.I., or X.A.I. Its goal is to make machines able to account for the things they learn, in ways that we can understand. But that goal, of course, raises the fundamental question of whether the world a machine sees can be made to match our own.”

Full article by CLIFF KUANG here. This page provides a short overview of DARPA's XAI (Explainable Artificial Intelligence) program.

An interdisciplinary group addresses the problem:

"Contrary to popular wisdom of AI systems as indecipherable black boxes, we find that this level of explanation should often be technically feasible but may sometimes be practically onerous—there are certain aspects of explanation that may be simple for humans to provide but challenging for AI systems, and vice versa. As an interdisciplinary team of legal scholars, computer scientists, and cognitive scientists, we recommend that for the present, AI systems can and should be held to a similar standard of explanation as humans currently are; in the future we may wish to hold an AI to a different standard."

Full article by FINALE DOSHI-VELEZ et al. here. ht Margarita. For the layperson, the most interesting part of the article may be its general overview of societal norms around explanation and explanation in the law.

Michael comments: Human cognitive systems have generated similar questions in vastly different contexts. The problem of chick-sexing (see Part 3) gave rise to a mini-literature within epistemology.

From Michael S. Moore’s book Law and Society: Rethinking the Relationship: “A full explanation in terms of reasons for action requires two premises: the major premise, specifying the agent’s desires (goals, objectives, moral beliefs, purposes, aims, wants, etc.), and the minor premise, specifying the agent’s factual beliefs about the situation he is in and his ability to achieve, through some particular action, the object of his desires.” Link. ht Margarita

  • A Medium post with an illustrated summary of some XAI techniques. Link.
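One of the simplest techniques in that family can be sketched in a few lines. This is a generic feature-ablation sketch, not DARPA's method or any particular library's API; `model` is a hypothetical linear scorer introduced only for the demonstration.

```python
# A minimal sketch of one common XAI technique: feature ablation.
# Score how much the model's output changes when each input feature
# is replaced by a baseline value; larger changes suggest the feature
# mattered more to this particular prediction.

def ablation_importance(model, x, baseline=0.0):
    base_out = model(x)
    importances = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] = baseline     # knock out feature i
        importances.append(base_out - model(perturbed))
    return importances

# Hypothetical model: a fixed linear scorer (weights invented).
def model(x):
    weights = [2.0, -1.0, 0.5]
    return sum(w * xi for w, xi in zip(weights, x))

print(ablation_importance(model, [1.0, 1.0, 1.0]))  # [2.0, -1.0, 0.5]
```

For a linear model the ablation scores simply recover the weights; the point of the technique is that the same procedure applies to an opaque model whose internals cannot be read off directly.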
⤷ Full Article

November 18th, 2017

Duchamp Wanted


How to build justice into algorithmic actuarial tools

Key notions of fairness contradict each other—something of an Arrow’s Theorem for criminal justice applications of machine learning.

"Recent discussion in the public sphere about algorithmic classification has involved tension between competing notions of what it means for a probabilistic classification to be fair to different groups. We formalize three fairness conditions that lie at the heart of these debates, and we prove that except in highly constrained special cases, there is no method that can satisfy these three conditions simultaneously. Moreover, even satisfying all three conditions approximately requires that the data lie in an approximate version of one of the constrained special cases identified by our theorem. These results suggest some of the ways in which key notions of fairness are incompatible with each other, and hence provide a framework for thinking about the trade-offs between them."

Full paper from JON KLEINBERG, SENDHIL MULLAINATHAN and MANISH RAGHAVAN here. h/t research fellow Sara, who recently presented on bias in humans, courts, and machine learning algorithms, and who was the source for all the papers in this section.

In a Twitter thread, ARVIND NARAYANAN describes the issue in more casual terms.

"Today in Fairness in Machine Learning class: a comparison of 21 (!) definitions of bias and fairness [...] In CS we're used to the idea that to make progress on a research problem as a community, we should first all agree on a definition. So 21 definitions feels like a sign of failure. Perhaps most of them are trivial variants? Surely there's one that's 'better' than the rest? The answer is no! Each defn (stat. parity, FPR balance, contextual fairness in RL...) captures something about our fairness intuitions."

Link to Narayanan’s thread.
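The tension Kleinberg et al. formalize shows up even in a toy computation. The groups, base rates, and scores below are invented for illustration; the sketch demonstrates only that a perfectly calibrated score already violates balance for the positive class once base rates differ across groups.

```python
# Invented toy data (not from the paper): two groups with different
# base rates, each assigned the trivially calibrated score equal to
# its group's base rate.

def class_averages(scores, labels):
    """Average score among the true positives and the true negatives."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    return sum(pos) / len(pos), sum(neg) / len(neg)

labels_a = [1] * 5 + [0] * 5   # group A: base rate 0.5
scores_a = [0.5] * 10
labels_b = [1] * 2 + [0] * 8   # group B: base rate 0.2
scores_b = [0.2] * 10

pos_a, _ = class_averages(scores_a, labels_a)
pos_b, _ = class_averages(scores_b, labels_b)

# Within each group the score equals the fraction of positives, so
# calibration holds; but the true positives in group B carry a lower
# average score than those in group A, so "balance for the positive
# class" fails.
print(pos_a, pos_b)  # 0.5 0.2
```

Equalizing the positive-class averages instead would force the scores away from the base rates, breaking calibration, which is the impossibility result in miniature.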

Jay comments: Kleinberg et al. describe their result as choosing between conceptions of fairness. It’s not obvious, though, that this is the correct description. The criteria (calibration and balance) discussed aren’t really conceptions of fairness; rather, they’re (putative) tests of fairness. Particular questions about these tests aside, we might have a broader worry: if fairness is not an extensional property that depends upon, and only upon, the eventual judgments rendered by a predictive process, exclusive of the procedures that led to those judgments, then no extensional test will capture fairness, even if this notion is entirely unambiguous and determinate. It’s worth considering Nozick’s objection to “pattern theories” of justice for comparison, and (procedural) due process requirements in US law.

⤷ Full Article

November 11th, 2017

The Hülsenbeck Children


Recommender systems power YouTube's controversial kids' videos

Familiar cartoon characters are placed in bizarre scenarios, sometimes by human content creators, sometimes by automated systems, for the purpose of attracting views and ad money. First, from the New York Times:

“But the app [YouTube Kids] contains dark corners, too, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms.

“In recent months, parents like Ms. Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes. Many have taken to Facebook to warn others, and share video screenshots showing moments ranging from a Claymation Spider-Man urinating on Elsa of ‘Frozen’ to Nick Jr. characters in a strip club.”

Full piece by SAPNA MAHESHWARI in the Times here.

On Medium, JAMES BRIDLE expands on the topic, and criticizes the structure of YouTube itself for incentivizing these kinds of videos, many of which have millions of views.

“These videos, wherever they are made, however they come to be made, and whatever their conscious intention (i.e. to accumulate ad revenue) are feeding upon a system which was consciously intended to show videos to children for profit. The unconsciously-generated, emergent outcomes of that are all over the place.

“While it is tempting to dismiss the wilder examples as trolling, of which a significant number certainly are, that fails to account for the sheer volume of content weighted in a particularly grotesque direction. It presents many and complexly entangled dangers, including that, just as with the increasing focus on alleged Russian interference in social media, such events will be used as justification for increased control over the internet, increasing censorship, and so on.”

Link to Bridle’s piece here.

⤷ Full Article

November 4th, 2017



Sociologist Zeynep Tufekci engages with Adam Mosseri, who runs the Facebook News Feed

Tufekci: “…Facebook does not ask people what they want, in the moment or any other way. It sets up structures, incentives, metrics & runs with it.”

Mosseri: “We actually ask 10s of thousands of people a day how much they want to see specific stories in the News Feed, in addition to other things.”

Tufekci: “That’s not asking your users, that’s research on your product. Imagine a Facebook whose customers are users—you’d do so much differently. I mean asking all people, in deliberate fashion, with sensible defaults—there are always defaults—even giving them choices they can change…Think of the targeting offered to advertisers—with support to make them more effective—and flip the possibilities, with users as customers. The users are offered very little in comparison. The metrics are mostly momentary and implicit. That’s a recipe to play to impulse.”

The tweets are originally from Zeynep Tufekci in response to Benedict Evans (link), but the conversation is much easier to read in Hamza Shaban’s screenshots here.

See the end of this newsletter for an extended comment from Jay.

  • On looping effects (paywall): “This chapter argues that today's understanding of causal processes in human affairs relies crucially on concepts of ‘human kinds’ which are a product of the modern social sciences, with their concern for classification, quantification, and intervention. Child abuse, homosexuality, teenage pregnancy, and multiple personality are examples of such recently established human kinds. What distinguishes human kinds from ‘natural kinds’, is that they have specific ‘looping effects’. By coming into existence through social scientists' classifications, human kinds change the people thus classified.” Link. ht Jay


Mechanisms and causes between micro and macro

Daniel Little, the philosopher of social science behind Understanding Society, has written numerous posts on the topic. Begin with this one from 2014:

“It is fairly well accepted that there are social mechanisms underlying various patterns of the social world — free-rider problems, communications networks, etc. But the examples that come readily to mind are generally specified at the level of individuals. The new institutionalists, for example, describe numerous social mechanisms that explain social outcomes; but these mechanisms typically have to do with the actions that purposive individuals take within a given set of rules and incentives.

“The question here is whether we can also make sense of the notion of a mechanism that takes place at the social level. Are there meso-level social mechanisms? (As always, it is acknowledged that social stuff depends on the actions of the actors.)”

In the post, Little defines a causal mechanism and a meso-level mechanism, then offers example research.

“…It is possible to identify a raft of social explanations in sociology that represent causal assertions of social mechanisms linking one meso-level condition to another. Here are a few examples:

  • Al Young: decreasing social isolation causes rising inter-group hostility (link)
  • Michael Mann: the presence of paramilitary organizations makes fascist mobilization more likely (link)
  • Robert Sampson: features of neighborhoods influence crime rates (link)
  • Chuck Tilly: the availability of trust networks makes political mobilization more likely (link)
  • Robert Brenner: the divided sovereignty system of French feudalism impeded agricultural modernization (link)
  • Charles Perrow: legislative control of regulatory agencies causes poor enforcement performance (link)"

More of Little’s posts on the topic are here. ht Steve Randy Waldman

⤷ Full Article

March 25th, 2020

Tilted Ark


Wartime economic planning

This week, reports swirled regarding President Trump's invocation of the Defense Production Act—a 1950 law passed to manage production in the context of the Korean War—to meet the coming demand for crucial medical supplies to treat people with COVID-19. Much of the ensuing commentary has elided necessary distinctions between the Cold War–era DPA and the more memorable interventions into the productive capacity of the US economy that defined the Second World War. (For a helpful disaggregation, see this essay by Tim Barker; for a rundown of the DPA's history, see this summary from the Congressional Research Service.)

In his book, Arsenal of World War II (the fourth in a five-volume series on the political economy of American warfare), PAUL KOISTINEN provides a uniquely comprehensive and detailed account of the often misunderstood economics and administration of America's World War II mobilization effort.

From the book's introduction:

"An ironic legacy of the New Deal was that it helped create the partnership between corporate and military America that was destructive to reform. In the defense and war years, New Dealers took the lead in preparing the nation for World War II. Once hostilities ensued, the same reformers were at the center of devising the structure and controls essential for successfully harnessing the economy for war under stable economic conditions. Many of those same New Dealers became victims of the industry-military alliance that their mobilization policies and methods had assisted in bringing into being.

Despite advancement in weaponry, massive output was the critical World War II development, and that depended on successful economic mobilization policies. The political economy of warfare involves the interrelations of political, economic, and military institutions in devising the means to mobilize resources for defense and to conduct war. In each war, the magnitude and the duration of the fighting have dictated what the nation had to do to harness its economic power, but prewar trends have largely determined how this mobilization took place."

Link to the book page.

  • Mark Wilson's 2016 book, Destructive Creation, also on the business-government relationships that defined the World War II mobilization effort. Link.
  • A few recent articles on medical supplies: on the ventilator shortage; on mask production in China; on Taiwan's response to the virus; on the EU's plans to airlift masks; on China's increasing medical supply delivery to Europe.
  • From Otto Neurath's 1919 "War Economy": "The main result of our investigation may be expressed as follows: war forces a nation to pay more attention to the amount of goods which are at its disposal, less to the available amounts of money than it usually does." Link to Neurath's collected writings on economics.
⤷ Full Article

March 16th, 2020

Study for a Club Scene


Supply chains and geographical dispersion

At present it's difficult to think of much else beyond the fragility of our global economic infrastructure. A 2012 discussion paper by RICHARD BALDWIN looks at global supply chains: their history, future, and policy implications.

From the paper:

"Globalization’s second unbundling and the global supply chains it spawned have produced and continue to produce changes that alter all aspects of international relations: economic, political and even military. Supply chain fractionalization—the functional unbundling of production processes—is governed by a fundamental trade-off between specialization and coordination costs. Supply chain dispersion—the geographical unbundling of stages of production—is governed by a balance between dispersion forces and agglomeration forces.

The future of global supply chains will be influenced by four key determinants: 1) improvements in coordination technology that lowers the cost of functional and geographical unbundling, 2) improvements in computer integrated manufacturing that lowers the benefits of specialization and shifts stages toward greater skill-, capital, and technology-intensity, 3) narrowing of wage gaps that reduces the benefit of North-South offshoring to nations like China, and 4) the price of oil that raises the cost of unbundling."

Link to the paper.

  • "If the virus continues to spread at the same rate, supply chains will inevitably break apart and factories will start to close." From February, the FT editorial board on the "decoupling of global trade." Link.
  • A paper from the Institute for Global Law and Policy "asserts the centrality of legal regimes and private ordering mechanisms to the creation, structure, geography, distributive effects and governance of global value chains." Link. See also: a LPE Blog symposium based on the paper. Link.
  • "Capital is thoroughly globalized. Could it now be labor’s turn?" Peter Evans on a global strategy for organized labor. Link. And a new paper by Adrien Thomas "looks at strategies adopted by trade unions to unionize migrant workers, and discusses tensions related to the diversification of trade union policies and organizational structures in response to labor migration." Link.

h/t the one and only Francis Tseng for many of these links.

⤷ Full Article

March 9th, 2020

Flanked by Two Dolphins


An ecosocial theory of disease

The correlation between health, income, and wealth is widely recognized in contemporary research and policy circles. This broadly social understanding of public health outcomes has its origins in a theoretical tradition dating back to the 1970s and 80s, in which scholars began to embed medical research within a political and economic framework.

In a 2001 paper, epidemiologist NANCY KRIEGER seeks to strengthen the theoretical foundations of epidemiological research by linking them back to biological study.

From the paper:

"If social epidemiologists are to gain clarity on causes of and barriers to reducing social inequalities in health, adequate theory is a necessity. Grappling with notions of causation raises issues of accountability and agency: simply invoking abstract notions like 'society' and disembodied 'genes' will not suffice. Instead, the central question becomes who and what is responsible for population patterns of health, disease, and well-being, as manifested in present, past and changing social inequalities in health?

Arising in part as a critique of proliferating theories that emphasize individuals' responsibility to choose healthy lifestyles, the political economy of health school explicitly addresses economic and political determinants of health and disease, including structural barriers to people living healthy lives. Yet, despite its invaluable contributions to identifying social determinants of population health, a political economy of health perspective affords few principles for investigating what these determinants are determining. I propose a theory that conceptualizes changing population patterns of health, disease and well-being in relation to each level of biological, ecological and social organization (e.g. cell, organ, organism/ individual, family, community, population, society, ecosystem). Unlike prior causal frameworks—whether of a triangle connecting 'host', 'agent' and 'environment', or a 'chain of causes' arrayed along a scale of biological organization, from 'society' to 'molecular and submolecular particles'—this framework is multidimensional and dynamic and allows us to elucidate population patterns of health, disease and well-being as biological expressions of social relations—potentially generating new knowledge and new grounds for action."

Link to the piece.

  • Krieger's 1994 article takes a closer look at epidemiological causal frameworks, questioning the adequacy of multiple causation. And her 2012 paper asks: "Who or what is a population?" and articulates the analytical significance of this definition for epidemiological research. Link and link.
  • "Disease epidemics are as much markers of modern civilization as they are threats to it." In NLR, Rob and Rodrick Wallace consider how the development of the global economy has altered the spread of epidemics, taking the 2014 Ebola outbreak as a case study. Link.
  • Samuel S. Myers and Jonathan A. Patz argue that climate change constitutes the "greatest public health challenge humanity has faced." Link.
  • A history of epidemics in the Roman Empire, from 27 BC – 476 AD, by Francois Retief and Louise Cilliers. Link. And a 1987 book by Ann Bowman Jannetta analyzes the impact of disease on institutional development in early modern Japan. Link.
⤷ Full Article

March 2nd, 2020



Evaluating evidence-based policy

Over the past two decades, "evidence-based policy" has come to define the common sense of research and policymakers around the world. But while attempts have been made to create formalization schemes for the ranking of evidence for policy, a gulf remains between rhetoric about evidence-based policy and applied theories for its development.

In a 2011 paper, philosophers of science NANCY CARTWRIGHT and JACOB STEGENGA lay out a "theory of evidence for use," discussing the role of causal counterfactuals, INUS conditions, and mechanisms in producing evidence—and how all this matters for its evaluators.

From the paper:

"Truth is a good thing. But it doesn’t take one very far. Suppose we have at our disposal the entire encyclopaedia of unified science containing all the true claims there are. Which facts from the encyclopaedia do we bring to the table for policy deliberation? Among all the true facts, we want on the table as evidence only those that are relevant to the policy. And given a collection of relevant true facts we want to know how to assess whether the policy will be effective in light of them. How are we supposed to make these decisions? That is the problem from the user’s point of view and that is the problem of focus here.

We propose three principles. First, policy effectiveness claims are really causal counterfactuals and the proper evaluation of a causal counterfactual requires a causal model that (i) lays out the causes that will operate and (ii) tells what they produce in combination. Second, causes are INUS conditions, so it is important to review both the different causal complexes that will affect the result (the different pies) and the different components (slices) that are necessary to act together within each complex (or pie) if the targeted result is to be achieved. Third, a good answer to the question ‘How will the policy variable produce the effect’ can help elicit the set of auxiliary factors that must be in place along with the policy variable if the policy variable is to operate successfully."

Link to the paper.
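The second principle, causes as INUS conditions, lends itself to a small formal sketch. The factor names and "pies" below are hypothetical, assumed only to illustrate the structure Cartwright and Stegenga describe: each pie is a set of factors jointly sufficient for the effect, and the policy variable is one slice of one pie.

```python
# A hedged sketch (factor names invented, not from the paper) of
# Mackie-style INUS conditions as "causal pies": sets of factors
# that are jointly sufficient for an effect.

pies = [
    {"policy", "aux_factor"},   # pie 1: the policy works only alongside aux_factor
    {"alternative_cause"},      # pie 2: the effect can occur without the policy
]

def effect_occurs(present):
    """The effect occurs iff some complete pie is among the present factors."""
    return any(pie <= present for pie in pies)

def is_inus(factor, pie, all_pies):
    """An Insufficient but Non-redundant part of an Unnecessary but
    Sufficient condition."""
    sufficient    = effect_occurs(pie)                 # the whole pie suffices
    insufficient  = not effect_occurs({factor})        # the slice alone does not
    non_redundant = not effect_occurs(pie - {factor})  # the pie fails without it
    unnecessary   = any(p != pie for p in all_pies)    # other routes exist
    return sufficient and insufficient and non_redundant and unnecessary

print(is_inus("policy", {"policy", "aux_factor"}, pies))  # True
```

The sketch mirrors the paper's user-side advice: evaluating the policy means checking both the other pies that could produce the result and the auxiliary slices ("aux_factor" here) that must be in place for the policy's own pie to complete.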

  • Cartwright has written extensively on evidence and its uses. See: her 2012 book Evidence Based Policy: A Practical Guide to Doing it Better; her 2011 paper in The Lancet on RCTs and effectiveness; and her 2016 co-authored monograph on child safety, featuring applications of the above reasoning.
  • For further introduction to the philosophical underpinnings of Cartwright's applied work, and the relationship between theories of causality and evidence, see her 2015 paper "Single Case Causes: What is Evidence and Why." Link. And also: "Causal claims: warranting them and using them." Link.
  • Obliquely related, see this illuminating discussion of causality in the context of reasoning about discrimination in machine learning and the law, by JFI fellow and Harvard PhD Candidate Lily Hu and Yale Law School Professor Issa Kohler-Hausmann: "What's Sex Got To Do With Machine Learning?" Link.
  • A 2017 paper by Abhijit Banerjee et al: "A Theory of Experimenters," which models "experimenters as ambiguity-averse decision-makers, who make trade-offs between subjective expected performance and robustness. This framework accounts for experimenters' preference for randomization, and clarifies the circumstances in which randomization is optimal: when the available sample size is large enough or robustness is an important concern." Link.
⤷ Full Article