Justin Joque, University of Michigan (presenting)
John Cheney-Lippold, University of Michigan
The DCI is pleased to host a book workshop on May 2. The event is free and everyone is welcome, but please RSVP (see below)!
Workshop description
Our finances, politics, media, opportunities, information, shopping and knowledge production are all mediated through statistics and related machine-learning techniques. As such, these technologies and methodologies increasingly form the organizational backbone of contemporary capitalism. Everywhere, massive stores of data are continually transformed into actionable information for either human or computer consumption. Stocks are traded, prisoners’ sentences are adjusted, credit is granted or denied, and scientific facts are inferred. The work that statistics does is simultaneously material, calculating with magnetic bits and moving capital and goods around the world, and deeply metaphysical, turning data about the world into concepts, predictions and inferences. Like the commodity for Marx, statistics thinks and works for us.
Just as Taylorism revolutionized industrial production, the ‘inference revolution’ has revolutionized the abstraction of information from data. But at the same time that statistics was busy revolutionizing capitalism, it was undergoing its own revolution. Over the last few decades, frequentist approaches, which present an objective measure of likelihood as the proportion of occurrences in a long-run system (e.g. rolling dice), have begun giving way to Bayesian approaches, founded on a subjective likelihood that is updated as new evidence is gathered. We argue that this revolution in statistical epistemology is in fact a revolution in production that directly affects the functioning of capitalism.
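For readers less familiar with the distinction, a minimal sketch may help; it is purely illustrative and not drawn from the manuscript. It contrasts a frequentist estimate of a coin’s bias, the long-run proportion of heads, with a Bayesian estimate that starts from a prior belief (here an assumed uniform Beta(1, 1) prior) and is revised as each observation arrives; the sample data are invented for the example.

```python
# Illustrative sketch (not from the manuscript): frequentist vs Bayesian
# estimation of a coin's probability of landing heads.

observations = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # 1 = heads, 0 = tails (made-up data)

# Frequentist view: probability as the proportion of occurrences in a long run.
heads = sum(observations)
frequentist_estimate = heads / len(observations)

# Bayesian view: begin with a prior belief (uniform Beta(1, 1), an assumption here)
# and update it observation by observation; the posterior mean is the current
# subjective estimate, revised as new evidence is gathered.
alpha, beta = 1.0, 1.0      # Beta prior parameters
for flip in observations:
    alpha += flip           # one more "success" for each head
    beta += 1 - flip        # one more "failure" for each tail
bayesian_estimate = alpha / (alpha + beta)  # posterior mean of Beta(alpha, beta)

print(f"Frequentist (long-run proportion): {frequentist_estimate:.3f}")
print(f"Bayesian (posterior mean after updating): {bayesian_estimate:.3f}")
```

With more data the two estimates converge; the philosophical difference the book takes up lies in what each says probability *is*, not in the arithmetic.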
The scale, complexity and abstract nature of the systems we have created are outpacing our ability to comprehend or direct them. The clearest example of this is the logic of austerity politics over the last decade: the rationale of a market largely run by algorithms trading stocks among themselves has placed a set of unrefusable demands on the state to slash budget outlays. It is essential to critically understand how statistics and machine learning function, not exclusively on a mathematical level but as a form of epistemology and ideology that makes the world decipherable. Indeed, important statistical treatises throughout the twentieth century are awash in deep metaphysical claims. This book explores the philosophical development of statistics in the twentieth century and argues for its importance in understanding our current political and economic reality.
About the workshop
The workshop will provide an opportunity for participants to read an early draft of the manuscript, which is slated to be published by Verso in 2019, and to provide feedback. A draft will be made available a month prior to the workshop. The focus will be on parts two and three, which most directly engage the history and philosophy of statistics and the production of scientific knowledge, but we welcome feedback on the entire manuscript. Justin will give a brief overview and introduction to the project; the majority of the time will be devoted to a critical discussion of the book.
Because of this format, please RSVP to christoph.becker@utoronto.ca to let us know you’re coming. We’ll be in touch with materials beforehand!
Location: BL 417, in the Inforum on the 4th floor on the right.
Date and time: May 2, 2pm-5pm
Format and Schedule:
- 30 minutes – Introduction/overview
- 1 hour – Initial responses
- Break
- 1 hour – Another round of discussion/feedback
For more information on the book, see Infidel Mathematics – Overview for Toronto (PDF).