Latest News

  • PSA2018: Call for Posters

    Twenty-Sixth Biennial Meeting of the Philosophy of Science Association, November 1 – November 4, 2018, Seattle, WA. Submission is now open for abstracts for posters to be presented at the PSA2018 meeting in Seattle, WA, on November 1-4, 2018. The poster forum will be on the evening of November 2. This will be the 50th anniversary of the first biennial...
  • Richard M. Burian Receives 2017 David L. Hull Prize

    Richard M. Burian, PSA member and Professor Emeritus of Philosophy & Science Studies at Virginia Tech, has been awarded the 2017 David L. Hull Prize. The David L. Hull Prize is a biennial prize established in 2011 by the ISHPSSB to honor the life and legacy of David L. Hull (1935–2010). The David L. Hull Prize honors an extraordinary contribution t...
  • Emily C. Parke Awarded the 2017 Marjorie Grene Prize

    PSA member Emily C. Parke has been awarded the 2017 Marjorie Grene Prize for her essay, 'Experiments, Simulations, and Epistemic Privilege' (Philosophy of Science, 2014, 81(4), 516–536). The Marjorie Grene Prize is awarded by the International Society for the History, Philosophy and Social Studies of Biology and named in honor of Marjorie Grene (19...
  • Lynn Chiu Awarded the 2017 Werner Callebaut Prize

    PSA member Lynn Chiu has been awarded the 2017 Werner Callebaut Prize for her essay (co-authored with biologist Scott Gilbert), 'The Birth of the Holobiont: Multi-species Birthing Through Mutual Scaffolding and Niche Construction' (Chiu & Gilbert (2015) Biosemiotics, 8: 191-210). The Werner Callebaut Prize is awarded by the International Society fo...
  • PSA2018: Call for Papers

    Twenty-Sixth Biennial Meeting of the Philosophy of Science Association, November 1 – November 4, 2018, Seattle, WA. Submission is now open for papers to be presented at the PSA2018 meeting in Seattle, WA, on November 1-4, 2018. This will be the 50th anniversary of the first biennial meeting of the PSA. The deadline for submitting a paper is March 1, 2...
  • PSA2018: Call for Symposium Proposals

    Twenty-Sixth Biennial Meeting of the Philosophy of Science Association, November 1 – November 4, 2018, Seattle, WA. Submission is now open for proposals for symposia to be presented at the PSA2018 meeting in Seattle, WA, on November 1-4, 2018. This will be the 50th anniversary of the first biennial meeting of the PSA. The deadline for submitting sympo...
  • Michela Massimi Awarded the Wilkins-Bernal-Medawar Medal

    Wilkins-Bernal-Medawar Medal and Lecture The 2017 Wilkins-Bernal-Medawar Medal is awarded to Professor Michela Massimi, PSA member and former Governing Board Member, for her interdisciplinary interests in and communication of modern philosophy and science: particularly in relation to physics, and the thinking of Newton, Kant and Pauli. Professor Ma...
  • 2017 PSA Election Results

    On behalf of the Governing Board of the Philosophy of Science Association, it is my pleasure to announce the results of the 2017 PSA Election. Thanks to all those who voted. We had an excellent turnout of 54.9%. Alisa Bokulich of Boston University and Hasok Chang of the University of Cambridge were elected to the Governing Board of the PSA. Each wi...
  • View Recording of PSA/UCS Joint Webinar: Scientific Facts vs. Alternative Facts (sic)

    Wednesday, June 7, 2017, 2:00 PM EDT - 3:00 PM EDT. Hosted by the Union of Concerned Scientists Center for Science and Democracy. View a recording of the webinar here: https://ucsusa.adobeconnect.com/pae09j6wqa0k/ How can we understand and respond to “alternative facts” when they are presented as being of equal value to scientific facts? The UCS Center for Sc...
  • PSA Stands Up for Science

    The PSA is an official partner of the March for Science, and many of our members participated in marches all over the world, among them our President, Sandy Mitchell, and Past-President, Ken Waters.
  • Letter from Sandra D. Mitchell, PSA President

    Dear PSA Members, I am honored to be serving as the President of PSA (as of January 1, 2017) and want to take a moment to both look forward to new initiatives and opportunities and reflect back on what the organization has been doing. Most importantly, I want to invite you to participate in the activities of the PSA and in collectively considering...
  • Letter from C. Kenneth Waters, PSA Past-President

    Dear PSA Members, I am writing as Past President to report on what PSA has accomplished over the last two years. Let me begin by congratulating our new President, Sandy Mitchell, new Vice President/President-elect, Alison Wylie, and incoming Governing Board members Megan Delehanty and Edouard Machery. There have been two fronts of change for the PS...


History, Aim, and Schedule of Philosophy of Science

Philosophy of Science, the official journal of the Philosophy of Science Association (PSA), has been published continuously since 1934. Since 1996 the journal has been published by the University of Chicago Press, with copyright retained by the PSA.

Philosophy of Science publishes the best work in philosophy of science, broadly construed, five times a year. Every January, April, July, and October (the regular issues) the journal publishes articles, book reviews, discussion notes, and essay reviews; every December it publishes proceedings from the most recent Biennial Meeting of the Philosophy of Science Association. Philosophy of Science receives approximately 340 submissions for the regular issues each year.

Circulation and Impact

Over the course of 2014 and 2015, Philosophy of Science continued to achieve high circulation, usage, and impact. As of 2014, the journal was available at over 7,000 institutions in nearly 120 countries around the world. This circulation includes traditional paid subscriptions, complimentary and philanthropic subscriptions, and inclusion of the journal in high-profile databases. In 2014, the journal had 1,022 traditional paid institutional subscriptions, an increase of nearly 3% from 2013. Usage has remained strong, with 265,784 accesses in 2014. Furthermore, Philosophy of Science’s Impact Factor increased in 2014 to 0.833. The journal now ranks 10th in the “History and Philosophy of Science” category of the Thomson Reuters ISI social science edition.

Editor-in-Chief’s Responsibility

The Editor-in-Chief selects manuscripts for publication in the regular issues of Philosophy of Science on the basis of selected peer reviews and supervises the Philosophy of Science editorial office. While the Editor-in-Chief is not involved in the selection or review of papers for the December issues, the editorial office is involved in various aspects of the submission and publication process for the December issues. The Editor-in-Chief is responsible for coordinating the production of the journal with UC Press for all issues, including the December issues. The Editor-in-Chief must hold a tenured academic appointment, have a distinguished record of research in philosophy of science, and be able to deal judiciously and promptly with contributors. The Editor-in-Chief is responsible for appointing a managing editor and nominating associate editors for Governing Board approval, as well as selecting a book review editor and an editorial board. The Editor-in-Chief will convene an editorial board meeting at the Biennial PSA meeting in order to inform the board about the current status of the journal and solicit recommendations from the board. The Editor-in-Chief will present an informational report to the PSA Governing Board at the Biennial PSA meeting. Finally, the Editor-in-Chief will submit an annual report to the PSA Executive Director for transmittal to the Governing Board of the PSA.

Host Institution Responsibility

The Editor-in-Chief’s home institution will provide support for the Philosophy of Science editorial office in accord with an agreement negotiated with the Editor-in-Chief and the PSA. Such support may include release from teaching for the Editor-in-Chief, support for the Managing Editor, office space for the Philosophy of Science editorial office, postage, and computing, internet, and telephone access and support.

PSA Responsibility

The PSA will provide support for the Philosophy of Science editorial office in accord with an agreement negotiated with the Editor-in-Chief and Editor-in-Chief’s home institution. Such support may include support for release from teaching for the Editor-in-Chief or funds to support the employment of a Managing Editor.

University of Chicago Press Responsibility

The University of Chicago Press is responsible for all matters related to the publication of Philosophy of Science, including print production, electronic production, subscription fulfillment, marketing, and editing. The Press is also responsible for setting subscription, advertising, and other rates related to the journal, in consultation with the PSA Governing Board.

Term of Editorship

The Editor-in-Chief of Philosophy of Science is appointed by the Governing Board of the PSA for a term of five years and serves at the pleasure of the Governing Board. The term of the next Editor-in-Chief of Philosophy of Science will begin July 1, 2017, and end June 30, 2022.

Transition Schedule

The PSA Governing Board expects to select the next Editor-in-Chief of Philosophy of Science by July 2016, and to complete the transition to the new Editor-in-Chief and editorial office by July 1, 2017. The PSA and the University of Chicago Press will work to ensure a successful transition.

The first draft of the PSA 2014 preprint volume has now been generated and posted on PhilSci-Archive for convenient access during the conference: http://philsci-archive.pitt.edu/11109/ The volume will be updated periodically, so participants who have not yet posted may still post to the PSA conference section on PhilSci-Archive in order to be included in the updated PDFs.

  • To post your PSA2014 paper at PhilSci-Archive, click here.
  • For instructions on how to upload a conference paper to PhilSci-Archive, click here.
  • To view PSA2014 papers posted on PhilSci-Archive, click here.

Note that the symposium abstracts are abstracts of the entire symposium, not of the individual papers presented in the symposium. The abstracts are ordered alphabetically by the title of the session.

50 Years of Inclusive Fitness

In 1964, W. D. Hamilton introduced what is now the best known and most widely used framework for understanding the evolution of social behaviour: inclusive fitness theory. This symposium aims firstly to mark the 50th anniversary of Hamilton's pioneering work, and secondly to make progress on some important but previously neglected conceptual issues that the move from classical fitness to inclusive fitness raises. The symposium will be highly interdisciplinary, incorporating philosophical, sociological, historical, and theoretical perspectives. The five papers will address a number of key foundational questions, including: (i) the extent to which inclusive fitness provides a 'design objective' for animal behaviour; (ii) the extent to which inclusive fitness is an inherently causal notion; (iii) the relationship between inclusive fitness and rational choice theory; (iv) the relevance of inclusive fitness to the evolution of morality; (v) the role of robustness analysis in evaluating inclusive fitness models; and (vi) the 'kin selection' vs 'group selection' controversy.

Agnotology -- Its Untapped Potential      

Science has traditionally been billed as our foremost producer of knowledge.  For at least 10 years now, however, science has also been billed as an important producer of ignorance.  Indeed, historian of science Robert Proctor has coined a new term, agnotology, to refer to the study of ignorance, and it turns out that almost all of the ignorance studied in this new area is produced by science.  The type of ignorance that has garnered most attention is what Proctor calls ignorance as "active construct," the kind of ignorance (to use Proctor's phrase) "made, maintained, and manipulated" by science -- by an increasingly politicized and commercialized science.  But Proctor has distinguished two other types of ignorance also produced by science:  ignorance as "passive construct," that is, ignorance as an unintended by-product of research decisions; and ignorance as "virtuous" -- when "not knowing" is accepted in research as a consequence of adopting certain values.  In our symposium we will focus on these other two types of ignorance, thus far mostly ignored, furnishing examples that show the variety of forms they take and their importance.  In so doing, we will be helping to provide an understanding of the full potential of agnotology and what it might contribute to history and philosophy of science.   

Beyond the Lab Experiment: Field Experiments, Natural Experiments, Simulations, and Causal Process Tracing  

The papers in this session focus on identifying causes and causal processes in settings beyond the traditional controlled laboratory experiment. They explore a range of experimental approaches (including field experiments, natural experiments, and social experiments), the relationship between each of these and causal process tracing, and how experiments and simulations differ and converge as tools for generating causal claims. While most philosophers discussing experiments have focused on physics and the biomedical sciences, the papers in this session frame their discussion of these topics around less-examined research areas, including the social sciences, evolutionary biology, ecology, and physical geography. Thus these papers all contribute to the growing literature on experiments, but focus on topics that have not been central in the discussion to date.

Chemical Structure         

Chemical theory is in large part concerned with structure: the structure of atoms, of small molecules, of macromolecules, and of materials. This symposium investigates the foundations of structural theory across all these scales, revealing the breadth and application of these ideas throughout chemistry. The first part of the symposium explores concepts of structure at two very different scales: that of nanostructure and that of macromolecular superstructure. The second part of the symposium examines how abstraction and idealization influence structure and its cognitive role in chemical practice.  By exploring the nature and effect of concepts of structure throughout chemical cognition and custom, this symposium offers a novel, comprehensive, and unifying picture of a foundational feature of chemistry.

Complex Life Cycles, Reproduction, and Evolution           

Evolutionary explanations as usually conceived are based on the reproductive powers of biological individuals. Much work in evolutionary theory and the philosophy of biology has been done by assuming simple life cycles in which reproduction is clear and obvious. However, simple life cycles are not the norm across the range of biological diversity. Many organisms alternate between different modes of reproduction, sexual and asexual. A "like begets like" assumption about reproduction is routinely violated in much of the tree of life. How much of standard evolutionary thinking is dependent on simple and vertebrate-centric views of life cycles? This symposium will explicitly consider the consequences of complex life cycles for our understanding of reproduction, individuality, and evolution.

Conceptions of Space, Time and Spacetime

Questions regarding the structure of space and time have been of central concern to philosophers since at least the time of Aristotle. The questions raised, and the answers provided, have been strongly influenced by the physical theories and mathematical apparatus of the day. Nowadays, classical mechanics, general relativity, quantum field theory, string theory and the theory of the real numbers have been pre-eminently influential. The four talks of the proposed symposium are all concerned with one or more of these theories. They are also unified by a concern with limitations to our epistemic warrant for ascriptions of structure to space and time—as illustrated by the consideration of alternative structures.  Belot will open the symposium by discussing the homogeneity of time in both classical mechanics and general relativity. Butterfield will then discuss two disparate kinds of under-determination of spacetime structure that arise in quantum field theory and string theory. Next, Manchak will discuss the peculiar causal properties of Gödel’s cosmological model that seem to imply the non-existence of an objective lapse of time. And, finally, Ehrlich will challenge the near universally-held view that points of space, having length zero, are necessarily unextended.

Curie's Principle: The Good, the Bad, and the Symmetry Violating            

This symposium aims to clarify an important symmetry principle known as Curie's principle, and to identify its historical and foundational significance. In particular, we will 1) Articulate a variety of distinct propositions that might be characterized as Curie's principle, and identify the plausibility of each; 2) Clarify the historical significance of Curie's principle in the famous CP-violation and T-violation experiments in the 20th century; 3) Respond to the recent claim of Roberts (2013) that Curie's principle is false; 4) Propose generalizations of Curie's principle in deterministic, probabilistic, and indeterministic theories, as well as to quantum theories beyond the standard model; and 5) Consider the role of Curie's principle in guiding theory construction.

Explanation: Communication, Representation, Objectivity          

Existing work on scientific explanation typically focuses on the nature of explanatory relations. It asks: what types of facts about the world are explanatory? Answers include laws of nature, causal relations, statistical regularities, etc. In this symposium we take a step back to ask: is the explanatory relation all there is to explanatory norms? If there is something else--in particular, representational strategies or communicative needs--is it auxiliary, or do these human-centered elements of explanatory practice modulate the nature of the explanatory relation itself? The latter would imply that much contemporary research into explanation, with its "ontic" focus, is on the wrong track.    The participants differ. Two of us, Potochnik and Levy, believe that explanation is a far more subject-relative category than current discussions allow and advocate greater attention to its representational, communicative and psychological aspects. The other two participants, Strevens and Franklin-Hall, acknowledge the importance of communication and related aspects, but hold that explanation is fundamentally objective. They argue that an account of its objective basis should be the first stop in any quest to understand explanation.    By exploring questions of subjectivity and objectivity in scientific explanation, this symposium spurs critical evaluation of the current agenda for work on explanation and explanatory practice in science.

Formal Methods in Philosophy of Science             

The aim of this symposium is to present some new formal tools and apply them to traditional questions in philosophy of science. In particular, we will consider how tools from category theory and the theory of multigraphs can provide insight about the relationships between scientific theories. Such tools allow for new approaches to old philosophical issues like theoretical equivalence, empirical equivalence, and the structure of scientific change. Furthermore, they suggest relations between theories that have yet to be examined by philosophers. Category theory allows for the explication and formalization of the intuitive notion of "the amount of structure" that a scientific theory posits. It also allows one to characterize and reason more carefully about gauge theories. Similar tools from the theory of multigraphs have applications in evolutionary biology, ecology, and other special sciences. They allow philosophers of science to explore relations between theories and models, discuss theoretical virtues, and they can also indicate opportunities for future conceptual work.

How Adequate Are Causal Graphs and Bayesian Networks for Modeling Biological Mechanisms?                

The theory of causation has been profoundly transformed by the availability of powerful formal frameworks using tools such as directed acyclic graphs (DAGs) and Bayesian Networks in order to represent causal relations. These formal calculi are well suited for being given a causal interpretation, for example, with the help of interventionism. Furthermore, these formalisms have been fruitfully applied not only in the sciences, but also to address philosophical issues having to do with causation and scientific explanation in a number of scientific disciplines, including economics, psychology, physics, and biology. Recently, several authors have used formal theories of causation in order to model biological mechanisms, which are widely and justly thought to be crucially important in the contemporary life sciences. The aim of this symposium is to examine the strengths of formal accounts of mechanisms, while also revealing the limitations of such accounts. Two presenters (Alexander Gebharter, Lorenzo Casini) will expand and defend the formal approaches to representing mechanisms that they have developed, while two other presenters (Marie Kaiser, Marcel Weber) will criticize the view that mechanistic biological explanations can be adequately formalized by such an approach.

How Many Sigmas to Discovery? Philosophy and Statistics in the Higgs Experiments         

"A 5 sigma effect!" is how the recent Higgs boson discovery was reported. Yet before the dust had settled, the very nature and rationale of the 5 sigma (or 5 standard deviation) discovery criterion began to be challenged and debated both among scientists and in the popular press. Why 5 sigma? How is it to be interpreted? Do p-values in high-energy physics (HEP) avoid controversial uses and misuses of p-values in social and other sciences? The goal of our symposium is to combine the insights of philosophers and scientists whose work interrelates philosophy of statistics, data analysis and modeling in experimental physics, with critical perspectives on how discoveries proceed in practice. Our contributions will link questions about the nature of statistical evidence, inference, and discovery with questions about the very creation of standards for interpreting and communicating statistical experiments. We will bring out some unique aspects of discovery in modern HEP. We will also show the illumination the episode offers to some of the thorniest issues revolving around statistical inference, frequentist and Bayesian methods, and the philosophical, technical, social, and historical dimensions of scientific discovery.
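
As an illustrative gloss (not part of the symposium abstract): read as a one-sided tail probability of the background-only, no-signal hypothesis under a normal approximation, the 5 sigma threshold corresponds roughly to

\[ p = P(Z \geq 5) = 1 - \Phi(5) \approx 2.9 \times 10^{-7}, \]

that is, about one chance in 3.5 million of so large a fluctuation arising if there were no signal. Whether this is the right way to read the criterion is, of course, part of what the symposium debates.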

Learning from others: pooling vs updating            

Our symposium presents recent work within formal social epistemology. More precisely, it presents conceptual and formal results on epistemological aspects of public deliberation and consensus formation. It consists of four papers that each concern the relation between two important models for the reasoning processes that follow divergent opinions in a group, to wit, opinion pooling and Bayesian conditionalization on the opinions of others. These models are shown to be conceptually distinct, yet intricately linked in virtue of a number of new formal results. The symposium has a high degree of internal coherence as it builds on a consensus among the symposium participants that was formed in a series of informal workshops held over the past few years. Although the symposium is focused on a specific theme, it advertises an analytic approach to the social dimension of scientific inquiry that bears relevance to philosophy of science and epistemology in general.

Measuring What? Measurement Validation Across the Sciences                

What makes a measure valid, whether it is a measure of time or happiness? Despite the recent growth of philosophy of measurement, validation practices across social, natural and medical sciences remain insufficiently examined by philosophers of science. This symposium focuses on three core questions: Is there a single epistemology of measurement validation? Are the current validation procedures across sciences defensible? Finally, what are the moral dimensions of measurement in social and medical sciences? Four papers by six philosophers of science examine these questions by attending to examples of current approaches to validation of physical, psychological and medical measures.

Moral Emotions: Approaches, Origins, and Ethics               

Hume famously argued that certain emotions lead to moral behavior.  Recent work in cognitive science, psychology, and biology has indicated that moral emotions are indeed a crucial part of prosocial behavior, leading philosophers to take notice.  This symposium includes new work in philosophy of science on moral emotions.  This work is largely concerned with two questions:  How and why do moral emotions evolve? And what are the upshots for norms and ethics?  The four talks in this symposium represent different approaches to these problems drawing from evolutionary biology, evolutionary game theory, psychology, and naturalized ethics.

Narrating Order                 

Narrative is surprisingly common in the practices of many sciences, but there is as yet no common understanding of the role or roles it plays, or whether indeed there is any commonality to the functions of narrative in different sciences. While it is tempting to view the ubiquity of narrative in the sciences mainly as a means to gloss or even popularize research findings, the papers of this symposium argue the contrary. Based on examples from a diverse set of fields - chemistry, medicine, biology and sociology - the symposium panelists will explore the various epistemic and ontological functions that narrative plays in ordering the phenomena of a domain.

Naturalism and Values: Or, what can Data do for Philosophy of Science?                  

Participants in this symposium will present and discuss data relating to the role of values in science and will use these data as a backdrop against which to explore the popular concept of naturalism. According to naturalism, philosophers should not attempt to legislate a priori principles but should instead make normative claims that are informed by an understanding of actual science. Yet a naturalistic perspective of this sort could be interpreted in a number of different ways, some of which are not very plausible. For instance, a naive "just ask the scientists" conception of naturalism will not do, since scientists may not have systematically considered the philosophical issues, may disagree when they do, and there might be grounds for a critique even if they were in universal accord. So one major aim of the symposium will be to clarify in what senses naturalism is, and is not, a reasonable constraint on philosophy of science. The symposium will then examine the implications of the data for normative philosophical projects related to the topic of values in science given the more plausible versions of naturalism.

Newton, Mathematics, and Mechanism                 

The symposium focuses on the meanings of 'mechanism,' 'mechanics,' and 'the mechanical philosophy' in Isaac Newton's highly mathematical natural philosophy and their implications for his philosophy of science. It addresses several interrelated questions: the worth of using 'mechanism' as an interpretive lens through which to understand Newton's natural philosophical output and its innovation, the applicability of mathematics to natural phenomena in Newton's physics given his understanding of mechanics, the status of fundamental ontological commitments in Newton's philosophy of science, and the compatibility of Newton's new ontology of forces with the ontological and methodological commitments of 'mechanism.' The purpose of this symposium is to bring issues that have traditionally been studied in historical circles into contact with contemporary philosophy. In particular, despite the fact that Newton's contributions to the philosophy of space and time and to scientific methodology have long been a staple of contemporary philosophy of science, the study of his philosophy of mathematics and of his relation to the so-called 'mechanical philosophy' has traditionally been undertaken by historians. We hope to address these latter issues, which were equally central to Newton's natural philosophy.

Newtonian Relativity     

This symposium will consist of presentations by George Smith (Tufts University), Robert DiSalle (Western University), Michael Friedman (Stanford University), and Craig Fox (graduate student, Western University). Potentially, Howard Stein will offer brief commentary on the papers in which he reflects on how this new work relates to, and, indeed, extends his seminal 1967 paper, “Newtonian Spacetime.”

Non-Causal Explanation in the Sciences

According to the presently received view, the sciences explain by providing information about causes and causal mechanisms. However, in the recent literature, an increasing number of philosophers argue that the explanatory practices in the sciences are considerably richer than the causal model of explanation suggests. These philosophers argue that there are non-causal explanations that cannot be accommodated by the causal model. Case studies of non-causal explanations come in a surprisingly diverse variety: for instance, the non-causal character of scientific explanations is based on the explanatory use of non-causal laws, purely mathematical facts, symmetry principles, inter-theoretic relations, renormalization group methods, and so forth. However, if there are non-causal ways of explaining, then the causal model cannot be the whole story about scientific explanation. The goal of this symposium is to shed light on largely unexplored philosophical terrain: that is, to develop a philosophical account of conceptual, epistemological and metaphysical aspects of non-causal explanations in the sciences.

Perspectivism: Models, Realism, and Pluralism 

Perspectivism--the recognition that there can be no 'view from nowhere' (to echo Thomas Nagel)--is, among other things, a promising new view in the debate on realism and antirealism about science. To a first approximation, perspectivism vindicates the intuition that our knowledge is always from a well-defined historical and theoretical perspective, while also delivering on the promise of realism. But can perspectivism in fact deliver the best of both worlds? Can it account for scientific knowledge being historically and theoretically situated, while also vindicating some form of realism about science? The goal of this symposium is to explore this thorny issue, and to clarify some key metaphysical and epistemic assumptions, by exploring how perspectivism may work in practice through the use and integration of a plurality of models in the life sciences and the physical sciences.

Philosophy of Interdisciplinarity                

Much contemporary science is becoming increasingly interdisciplinary. As a consequence, interdisciplinarity has become a topic of reflection for many practicing scientists who, most often, are trained in 'disciplinary' backgrounds and paradigms.  Similarly, the importance of interdisciplinary research for addressing the grand challenges that face modern society is a key priority for funding agencies, policy organizations and university administrators around the globe. However, effective integration of different scientific cultures requires a detailed understanding of the processes that make interdisciplinarity successful.   This symposium presents new work in philosophy of science that is based on a detailed understanding of the collaborative and interdisciplinary practices of today's science and which re-addresses classic topics in philosophy of science such as problem solving, explanation, mathematical modeling, and evidence in order to re-orient the discussion to the collaborative and interdisciplinary character of contemporary science.   Philosophy of science has an important role to play in developing detailed accounts of the interdisciplinary practices of contemporary science that can be used to train practicing scientists and to advise educators, administrators and policy makers on the practices of science today. In this way, new work on the philosophy of interdisciplinarity contributes both to the development of philosophy of science and to making philosophical analyses applicable outside philosophy itself. 

Population Concepts and Race  

The symposium will discuss what happens when particular population concepts are applied to biological discussions of race. This symposium will pull together a number of scholars whose diverse areas of expertise are needed in order to make significant headway on this topic. The symposium will include discussion of the following topics (amongst others) that address the overarching theme: the ontological status of populations for biology and biomedical sciences; whether or not a biological population concept can support the claim that there are human races (and if so, how); how both realists and anti-realists ought to approach questions about biological racial realism; whether or not biological populations are nominal; the particular boundary conditions for biological populations; whether or not "conventional" races - à la Lewontin (1972) - have any justified connection to human populations; and what follows from applying the Causal Interactionist Population Concept (CIPC) found in Millstein (2010) to human beings.

Putting Pressure On Human Nature         

The goal of this symposium is to put pressure on the new defense of human nature, particularly as applied in the human behavioral and social sciences. This will include presentation of both conceptual challenges and empirical test cases, by a mix of philosophers and working scientists.  Philosophers of science have recently revived a defense of human nature. However, this is not a defense of the essentialist notions of old. Rather, these new approaches acknowledge and accept the complaints against these essentialist notions, preferring, instead, to frame their arguments in terms of heuristic or pragmatic value. Here we evaluate these heuristic and pragmatic arguments by testing them against case studies and a careful evaluation of practices in the human behavioral and social sciences. This includes asking whether these new notions of human nature demarcate a useful and interesting object of study, and, if so, whether there is any value (or harm!) in calling that object 'human nature'. We will also consider competing conceptual frameworks for cultural evolution and what role, if any, these heuristic notions may play in them.

Quantifying Life: Foundations of Measurement for the Life Sciences       

Measurement is the assignment of numbers to empirical structures with the purpose of obtaining quantitative structures in order to use the powerful techniques of modern mathematics. Quantification and measurement are omnipresent in all mature sciences. At least since the work of von Helmholtz, the foundations of measurement have been explored for the physical sciences, and later also for the social sciences and psychology. There is one major scientific field, however, where a systematic approach to quantification and measurement is virtually unknown - the life sciences. Only recently have some researchers begun to identify and investigate important measurement problems in the biological sciences. The purpose of the symposium is to introduce this emerging field to a broader audience in the philosophy of science. While many of the measurement problems in the life sciences are pressing concerns arising from applications, the resulting questions call for both precise conceptual and mathematical foundations. It is the hope of the participants of the symposium that philosophers of biology, of statistics and, in general, of science will be inspired to join in the program of exploring the foundations of measurement in biology.
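
As an illustrative gloss of the representational picture sketched above (an editorial sketch, not part of the symposium abstract): a measurement scale is a structure-preserving map from an empirical relational structure into a numerical one, so that in the simplest ordinal case there is a real-valued function \(\phi\) on the empirical objects satisfying, roughly,

\[ a \succsim b \iff \phi(a) \geq \phi(b). \]

Which stronger conditions (additivity, meaningful units, invariance under admissible transformations) can be secured for biological attributes is the kind of foundational question at stake here.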

The Foundations of Gravitation and Thermodynamics     

Since the discovery in the 1970s that black holes seem to be truly thermodynamical objects, it has been widely believed in the physics community that there is a deep, hitherto unexpected relation between gravity and thermodynamics. This development raises a wide and deep range of philosophical problems that the philosophy of physics community has only begun to scratch the surface of. We propose in this symposium to explore several facets of this relationship, including the nature of the analogy between the laws of black hole mechanics and the laws of thermodynamics, the nature of black hole entropy, the possible role of gravity in the fixing of special thermodynamical conditions in the early universe, the problem of "information loss" in black hole evaporation, and the nature of the analogy between the thermodynamical behavior of black holes and various phenomena in fluid mechanics ("dumb holes"). We also intend to provide along the way a synoptic discussion of the major outstanding issues and problems in the field, in the hope of inspiring other philosophers to work on these problems.

The Principal Principle 

The Principal Principle is the key theoretical tool in the analysis of the relation of objective and subjective probability: it links objective probabilities to subjective probabilities (credences) in a Bayesian manner by requiring subjective degrees of belief to be equal to the objective probabilities once the subjective degrees of beliefs have been conditionalized on the values of the objective probabilities. The Principal Principle also has been used to analyze the concept of chance. The Principal Principle has been controversial ever since it was formulated; in particular, its consistency has been questioned and various attempts have been made to amend, modify and qualify it. The symposium is devoted to the discussion of the newest trends and results on the Principal Principle. The papers analyze in particular: different strategies that may be used to establish various versions of the Principal Principle; the compatibility of two new principles in connection with the Principal Principle (Epistemic Relevancy of High-order Explanations and Admissibility of High-order Explanations); and the abstract consistency of the Principal Principle in a measure theoretic framework of probability.
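
In schematic form (suppressing background evidence and admissibility conditions), the requirement just described is roughly

\[ \mathrm{Cr}(A \mid \mathrm{Ch}(A) = x) = x, \]

where Cr is an agent's credence function and Ch(A) = x is the proposition that the objective chance of A is x; the consistency and generalization questions mentioned above concern whether, and how, such a constraint can be imposed coherently.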

The scientific method revisited                  

This symposium will revisit one of the classical questions in the philosophy of science: is there something like 'the' scientific method, and if so, are there any grounds for thinking that we have an adequate conception of it or that we will be able to obtain one in the near future? Although such questions were at the forefront of the thoughts of some of the founding fathers of our discipline (e.g., Carnap, Popper and Kuhn), they have been largely side-lined in the past few decades by other, perhaps more specific and localized, concerns. Yet, questions about the scientific method ought to receive more attention within the philosophy of science: not only are they part-and-parcel of our teaching, but they are also something contemporary philosophers *of science* simply ought to have something to say about. This is why the symposium will seek to put questions about the scientific method back on the philosophical agenda.

Unifying the Mind-Brain Sciences

A renewed debate has been raging over whether and how to unify the mind-brain sciences (for starters, see the 2011 issue of Synthese on Neuroscience and Its Philosophy, and papers in preparation or forthcoming by the participants in the present symposium as well as Mazviita Chirimuuta, Carl Craver, Robert Cummins, Frances Egan, David Kaplan, and many others). To push the debate forward, several questions must be investigated. What explanatory roles do different areas of the mind-brain sciences play in advancing our understanding of cognitive and behavioral phenomena? Are practitioners in cognitive psychology and neuroscience engaged in the same explanatory project? Or, are the explanatory projects and investigative approaches of cognitive psychology and neuroscience sufficiently dissimilar yet equally important that the unification of cognitive psychology and neuroscience is neither feasible nor desirable? The participants in the proposed symposium aim to answer these questions via an evaluation of the explanatory, methodological and conceptual practices in the mind-brain sciences. The diverse array of philosophical approaches represented and areas of the mind-brain sciences considered are intended to stimulate lively discussion about the future and unity of the mind-brain sciences.              

What kind of climate evidence does climate policy really need?                

There is a trend in climate policy, at least in the UK, towards decisions that require, for justification, more and more detailed climate evidence. This leads to a demand for climate models to deliver such detailed climate evidence, typically projections of climate conditions in very precise geographical locations. An important question is whether such precise projections are warranted or whether they instead derail rational decision making.    This symposium explores issues of climate-science methodology that underpin this question. In particular: What sorts of climate hypotheses (projections) can we plausibly investigate, and what methods and kinds of evidence are best for confirming such hypotheses? The five contributions to the symposium approach these questions from different angles. Two papers probe the issue of plausible climate hypotheses by considering sensitivity of projections to structural model error and by exploring whether it is better to focus on climate scenarios instead of precise local predictions. The remaining papers explore the power and limitations of confirmation practices for discriminating between the plausible climate hypotheses: one paper considers the value of 'observational' evidence that is itself the output of model(s), and the remaining papers investigate the usefulness and commitments of various 'model selection' methods for confirming climate hypotheses. 

What's New in Network Analysis?             

The symposium will discuss philosophical foundational issues of network analysis in an interdisciplinary setting. Network analysis is a mathematical tool which is increasingly being used in special sciences such as biology, ecology, neuroscience, computer science, and economics to describe real-world systems, their elements, and their interactions as graphs. Network analysis allows researchers to find mathematical features in the network representing a given system - properties such as small-world structure and scale-free structure - thus making the enormous complexity more tractable. The set of foundational questions we want to focus on concerns the novelty, conditions, and limits of network analysis as a tool for description, representation, and explanation. This first symposium on network analysis at the PSA will systematically introduce a core set of questions and issues, which will be discussed by three philosophers of, respectively, mind, neuroscience, and biology, in conversation with two working scientists, in ecology and in neuroscience. The symposium will bring together a group of individuals who first independently and then collaboratively worked on these issues over the last three years, and who hold different takes on network analysis.


Notice that these are not abstracts of individual papers, but of the entire sessions.  These abstracts are organized alphabetically by the title of the session.


Approaches in the Philosophy of Science in Practice

The Society for Philosophy of Science in Practice (SPSP) grew out of a recognition of the need to promote the philosophical study of “science in practice”, by which the organizers of the Society meant both scientific practice and the functioning of science in practical realms of life. Despite occasional exceptions such as some recent literature on models, experimentation, and measurement, which have engaged in detailed consideration of scientific practices in pursuit of their philosophical points, concern with practice has tended to fall outside the mainstream of Anglophone analytic philosophy of science. SPSP was founded with the aim of changing this situation, through the promotion of conscious, detailed, and systematic study of scientific practice that nevertheless does not dispense with concerns about truth and rationality. The purpose of this session is twofold: firstly to present a concise view of the vision of ‘philosophy of science in practice’ pursued by SPSP, and secondly to present some of its approaches.

Causation, Kinds, and Structure in Chemical Theory

Chemistry is focused on understanding the structure of matter, and the ways that we can transform one kind of substance into another. The papers in this symposium ask how this is possible. From Robert Boyle’s Chymical Philosophy, to modern Ligand Field Theory, what are the conceptual tools at the heart of chemistry’s success? In the first part of the symposium, we focus on the ways causal notions are deployed (or not) in accounts of chemical transformation. The papers in the second part of the symposium concern the role of structural properties in chemical explanations past and present.

Enriching Philosophy of Science through Collaborative and Feminist Approaches

This symposium introduces, describes, and provides examples of work in Socially Relevant Philosophy of/in Science and Engineering. The symposium begins with an introduction to SRPoiSE and its fruitful overlap with an older organization committed to socially relevant work: The Association of Feminist Epistemology, Methodology, Metaphysics, and Science Studies. As a sample of SRPoiSE work, the second paper presents a philosophy of science analysis of the so-called “Hispanic Paradox,” the observation that US Hispanic health is far better than expected given the epidemiological variables known to be present in the population (high rates of poverty, etc.). The third paper provides a narrative for how the largest multi-member SRPoiSE project, the Toolbox Project, has evolved over time. In the final paper, a SRPoiSE member provides a feminist critique of the Toolbox Project, demonstrating how the SRPoiSE community fosters mutual dialogue, constructive criticism, etc., all in line with the FEMMSS mission.

Heterogeneity in Medicine and Psychiatry: Empirical Strategies, Conceptual Problems

‘People are not alike.’ This apparent truism poses considerable problems for biomedical research, diagnosis, treatment and policy decisions. Diseases have different morphologies in different patients, patients react differently to the same therapeutic regime and often it is hard if not impossible to determine the population of which a given patient is a member. And yet, methodologists of medicine and philosophers of science have tended to offer ‘one size fits all’ approaches to disease causation and classification, best epistemic practice and decision making. Focusing on epistemic and conceptual issues in biomedical research and treatment decision making, the uniting theme of all papers in this panel is that the ‘one size fits all’ approach is not good enough. Different therapeutic problems appear to require different, tailor-made solutions, and the best solutions to a specific case often conflict with widely accepted philosophical and methodological positions. The panellists do not, however, content themselves with making this negative point about lack of homogeneity. Instead, in each case a positive response to the problem at hand is developed and defended in the context of the specific case. Samantha Kleinberg’s and David Teira’s papers examine clinical research on treatment efficacy. Kleinberg notes that many (if not most) research findings in this area are not reproducible. Specifically, she looks at observational data, notes three challenges that research based on observational data faces, and proposes to use simulated data for replication purposes. Teira’s topic is experimental research and, in particular, the requirement that (cancer) treatments be tested in large randomised trials. He argues that, instead, smaller but well-defined populations of patients provide good normative grounds for impartial regulatory decisions about targeted therapies. Maël Lemoine and Julian Reiss examine disease classification and causation in psychiatry and carcinogenesis, respectively. Lemoine asks whether statistical methods can establish or confirm the homogeneity of a biologically undefined disease entity such as ‘major depressive episode’. He argues among other things that it is unhelpful and misguided to consider the question as an instance of the ‘natural kinds’ problem. Reiss investigates the fitness of mainstream theories of causation such as Codell Carter’s, Woodward’s and Bayes’ nets to describe certain features of heterogeneous cancer causation. He finds them all wanting in some respects and proposes an alternative, inferentialist account of causation. Brendan Clark, finally, takes up the topic of therapeutic decision making. There is evidence that there is important variation in the causes, and therefore also the most appropriate treatments, of hypertension in different ethnic groups. To decide which therapy is best for an individual patient means to solve the reference-class problem. Clark argues that Salmon’s solution to the problem is not successful in this case and develops a better alternative.

A HOPOS Sampler: Exemplary Work in the History of the Philosophy of Science

The International Society for the History of Philosophy of Science is devoted to promoting serious, scholarly research in the history of the philosophy of science (HOPOS). This panel presents a sampling of such research, including considerations of HOPOS methodology and examples of HOPOS scholarship. The session will encourage historically informed work in the philosophy of science and better understanding of the general sensibility represented by HOPOS. In that spirit, we offer a session of four talks, each developed from a talk given at the Society’s most recent biennial meeting, held July 3–5, 2014, in Ghent, Belgium. In “Methodologies in Context,” Jutta Schickore (Indiana University) considers the difficulties of synthesizing over-arching perspectives offered by philosophies of science with the local contextualizations offered by its history. She thus engages with the central methodological difficulty of HOPOS scholarship: how can we use facts about how science was to draw morals about how it should be? Barnaby Hutchins (Ghent University) turns to more concrete issues in his “Reduction, Integration and Mechanism in Descartes’s Biology,” comparing Descartes’s writings concerning mechanism to Descartes’s use of mechanism in his discussion of the heartbeat. Descartes’s use of mechanism, Hutchins argues, suggests his was an integrationist rather than reductionist approach to mechanism, a fact that illuminates the “new mechanism” associated with Machamer, Darden and Craver (2000). Here we have work on seventeenth-century ideas being brought to bear on a debate in the contemporary philosophy of biology. Similarly, Jennifer Jhun’s (University of Pittsburgh) study of historical uses of ceteris paribus clauses in partial and general equilibrium economic models, “A Lesson from Economic History: Idealization and Ceteris Paribus Clauses,” locates in the history of economic thought a sense of ceteris paribus more subtle, Jhun argues, than that found in contemporary debates. Finally, Janet Folina (Macalester College and, incidentally, President of the Society), in her “Poincaré and Structuralism in the Philosophy of Mathematics,” takes up a chapter in the pre-history of contemporary structuralism in the philosophy of mathematics, arguing that Poincaré’s philosophy of mathematics included an epistemology and metaphysics of structures.

Interdisciplinary Explanations in Economics

This symposium explores how the explanations provided by current economic theories sometimes require inputs from other disciplines, despite their apparently self-standing status. Melissa Vergara (Erasmus University Rotterdam) challenges the traditional view according to which relatively small epistemic units (arguments, causal scenarios, models) do all the explanatory work. Instead, she argues that economists explain on the basis of large clusters of models and other pieces of information, which often include insights from disciplines outside of economics. Mariam Thalos (University of Utah) argues that the successful use of decision theory depends on a rich sociological understanding of the context of application. Tyler DesRoches (University of British Columbia) takes a look at the economics of sustainable development and argues that interdisciplinary insights, in particular from the life sciences, are incompatible with a standard assumption made by mainstream economics: that natural capital is substitutable. Lastly, Jesús Zamora-Bonilla (UNED) explores to what extent economics can be naturalized and how it can help other disciplines to provide hermeneutic explanations of action.

Pedagogy Panel: A New Paradigm for Graduate Education: A Joint Research Course for Science, Engineering, and Philosophy of Science PhD Students

The panel will center on a full-credit graduate research course for science, engineering, and philosophy of science graduate students that aimed to prepare students for collaborating in multidisciplinary investigation and for recognizing and addressing ethics issues in the natural course of research. The three successive offerings of the course in the autumns of 2010, 2011, and 2012 at Illinois Institute of Technology (IIT) in Chicago were funded by the National Science Foundation. Science and engineering graduate students were recruited from IIT, and philosophy of science graduate students were recruited from other Chicago area universities: University of Illinois at Chicago, University of Chicago, Northwestern, Loyola, and Notre Dame. The project featured five faculty: Vivian Weil (PI), IIT, ethics specialist; Jordi Cat (Co-PI), Indiana University, philosopher of science; Eric Brey, IIT, biomedical engineer; Sandra Bishnoi, Rice University (formerly IIT), chemist; and Nick Huggett, University of Illinois at Chicago (UIC), philosopher of science. The panel is to include two of the faculty, Weil and Brey; four of the students, two philosophy of science students, Monica Solomon, Notre Dame, and Josh Norton, University of Illinois at Chicago, and two mechanical engineering students, John Hasier, IIT, and Ming Yin, IIT; and one of the three independent reviewers for 2011 and 2012, Kevin Elliott, Michigan State University, philosopher of science. Weil will provide background on the development of the course, noting perceived gaps in graduate education in all three areas that NSF recognized. To address the gaps, the course was to prompt engineering and science students to engage with probing questions about research practices (e.g. experimentation) and to familiarize philosophy of science students with actual research at the bench. The other instructors will explain -- from the point of view of their different disciplines -- the problems that they encountered, and how, with the assistance of annual external reviews, they made improvements (and what difficulties remain). Of course, they will also explain the positive lessons learned for the challenging tasks of developing cross-disciplinary dialogue and collaborative research between graduate students. The students will speak about their experience in the course -- class discussion, collaborations, final joint reports, and indications for further research and teaching -- from the perspectives of their different disciplines. Finally, Elliott will present his perception of lessons learned and the value of carrying forward such educational efforts. The experience from three consecutive offerings of the course, building on what worked and avoiding what did not work, was intended to yield products such as this panel, among others, and educational materials for use in a variety of formats and settings, as well as the classroom and seminar room.

Perspectives in the Philosophy of Mathematics

As detailed in our Mission Statement, PMA exists for the purpose of promoting and supporting the study of, and research in, the philosophy of mathematics. It construes this subject broadly and takes it to include the study and investigation of pertinent topics in allied subjects and disciplines. These include parts of the larger study of mathematics, the history of mathematics, logic, the foundations of mathematics, computer science, cognitive science, and the history and philosophy of science. It also recognizes that the philosophy of mathematics can profitably be studied and investigated by diverse methods, and from different perspectives. The purpose of this special session is to emphasize the importance of the connection between PMA and PSA by highlighting the contributions made by philosophy of mathematics. Our aim here is to show the variety of interests of philosophers of mathematics. Specifically, this session will include presentations that consider mathematical issues from historical, formal, and philosophical perspectives. Since much of philosophy of science depends on, or at least is informed by, philosophy of mathematics, it is crucial that such connections be both highlighted and valued. Reflecting the varying perspectives and differing investigations of philosophers of mathematics, the topics of this session will include the history and philosophy of scientific structuralism, mereologically interpreted geometry, and the formal nature of reasoning.

Recent Trends in the Philosophy of Social Science

The papers in this symposium explore a variety of interrelated trends that have emerged in different topic areas and related core issues in the contemporary philosophy of social science. These include: topics in political theory (republicanism and democratic theory) that also involve traditional concerns with rationality and cultural difference, empirical work bearing on the explanation of norms, rationality and cultural interpretation, the “return” of philosophy of history, and the alleged special status of normative explanations within social science. Discussions of these issues will also make clear how philosophy of social science may be viewed as something other than the step-child of philosophy of science proper. In particular, debates within philosophy of social science impact core issues in philosophy of science, and attention to these debates indicates how in certain respects philosophy of science ought to take more notice of topics within philosophy of social science.

Technology and the Production of Scientific Knowledge: Reflections on Converging Territories

The symposium will bring together four speakers with extensive research expertise at the intersection of philosophy of science and technology. The presentations will engage with the technological dimensions of practice in a broad range of scientific contexts, including theoretical and experimental particle physics, optics, interferometry, chemistry, genomics and biomedicine. They will draw upon a range of methodologies and traditions, including phenomenology, history of science, virtue epistemology and virtue ethics. Though the methodological accounts and scientific applications will vary, a common thread linking the presentations is the illustration of technology’s increasingly essential role in the production of scientific knowledge, and the importance of more intensive philosophical study of the technological dimensions of scientific practice.

The following are the titles and abstracts of accepted contributed papers as they were originally submitted, listed alphabetically by the first author’s last name.

Abrams, Marshall: Coherence, Muller's Ratchet, and the Maintenance of Culture           

I investigate the structure of an argument that culture cannot be maintained in a population if each individual acquires each cultural variant from a single person.  I note two puzzling consequences of the argument. I resolve the first by showing that one of the models central to the argument is conceptually analogous and mathematically equivalent to a model used to investigate the biological evolution of sexual reproduction. I resolve the second by arguing that probabilistic models of epistemological coherence can be reinterpreted as models of mutual support between cultural variants.  I develop a model of cultural transmission illustrating this idea.

Akagi, Mikio: Going against the Grain: Functionalism and Generalization in Cognitive Science

Functionalism is widely regarded as the central doctrine in the philosophy of cognitive science, and is invoked by philosophers of cognitive science to settle disputes over methodology and other puzzles. I describe a recent dispute over extended cognition in which many commentators appeal to functionalism. I then raise an objection to functionalism as it figures in this dispute, targeting the assumption that generality and abstraction are tightly correlated. Finally, I argue that the new mechanist framework offers more realistic resources for understanding cognitive science, and hence is a better source of appeal for resolving disagreement in philosophy of science.

Alexander, J.: Cheap Talk, Reinforcement Learning and the Emergence of Cooperation

Cheap talk has often been thought incapable of supporting the emergence of cooperation because costless signals, easily faked, are unlikely to be reliable (Zahavi and Zahavi, 1997). I show how, in a social network model of cheap talk with reinforcement learning, cheap talk does enable the emergence of cooperation, provided that individuals also temporally discount the past. This establishes one mechanism that suffices for moving a population of initially uncooperative individuals to a state of mutually beneficial cooperation even in the absence of formal institutions.

Autzen, Bengt: The Star Tree Paradox in Bayesian Phylogenetics             

The 'star tree paradox' in Bayesian phylogenetics refers to the phenomenon that a particular binary phylogenetic tree sometimes has a very high posterior probability even though a star tree generates the data. In this paper I discuss two proposals of how to solve the star tree paradox. In particular, I defend Lewis, Holder, and Holsinger's polytomy prior against some objections found in the biological literature and argue that it is preferable to Yang's data-size dependent prior from a methodological perspective.

Baetu, Tudor: The Completeness of Mechanistic Explanations 

The purpose of the paper is to provide methodological guidelines for evaluating mechanistic explanations, meant to complement previously elaborated interventionist norms. According to current accounts, a satisfactory mechanistic explanation should include all of the relevant features of the mechanism, its component entities and activities, their properties and their organization, as well as exhibit productive continuity. It is not specified, however, how this kind of mechanistic completeness can be demonstrated. I argue that parameter sufficiency inferences based on mathematical model simulations of known mechanisms can be used to determine whether a mechanism capable of producing the phenomenon of interest can be constructed from mechanistic components organized, acting, and having the properties described in the mechanistic explanation.

Bain, Jonathan: Pragmatists and Purists on CPT Invariance in Relativistic Quantum Field Theories

Greenberg (2002) claims that the violation of CPT invariance in an interacting RQFT entails the violation of Lorentz invariance. This claim is surprising since standard proofs of the CPT theorem require more assumptions than Lorentz invariance, and are restricted to non-interacting or, at best, unrealistic interacting theories. This essay analyzes Greenberg's claim in the context of the debate between pragmatist approaches to RQFTs, which trade mathematical rigor for the ability to derive predictions from realistic interacting theories, and purist approaches, which trade the ability to formulate realistic interacting theories for mathematical rigor.

Banville, Frédéric-I:  Accounting for the Dynamics of Inquiry in Neuroscience    

In this paper I demonstrate that Bechtel and Richardson's (2010) account of heuristics in science is too narrow to capture the range of problems involved in inquiry. This is a problem insofar as Bechtel and Richardson are concerned with descriptive accuracy. Recognizing distinct kinds of problems and heuristics enables an appreciation of the dynamics of scientific inquiry that is more descriptively accurate. Through a case study of the discovery of place cells (O'Keefe and Dostrovsky 1971) and the general framework that made this discovery possible (Tolman 1948), I show that conceptual problems determine which empirical problems constitute research agendas.

Barwich, Ann-Sophie: A Fine Nose for Timeliness: The Discovery of the Olfactory Receptors and the Question of Novelty             

What characterizes novelty in a scientific discovery? I present a case study that has not been dealt with in philosophical debate and revisit the notion of scientific discovery. Analyzing the historical trajectory that led to the discovery of the olfactory receptors, and focusing on the unprecedented application of degenerate Polymerase Chain Reaction, I develop an alternative notion of discovery that pertains to the development of laboratory practices. It concerns the ways in which advancing techniques become integrated into an evolving experimental context by bridging the gap between the requirements of standardization and the idiosyncrasy of research materials.

Bechtel, William:  Biological Mechanisms Don't Exist Except as Theoretical Posits

I argue that biological mechanisms exist as constitutionally and dynamically distinct entities only as posited in mechanistic explanations. Evidence is growing that what are taken to be constituents of biological mechanisms often interact as much with entities outside the putative mechanism as with each other. Moreover, the activities of a mechanism are often modulated by its own activity in the distant past. Some would take these findings as an impetus to holism and abandonment of the pursuit of mechanistic explanation. I argue instead for a more nuanced understanding of mechanistic explanation, in which the delineation of mechanisms is recognized as based on the epistemic aims of scientists.

Benétreau-Dupin, Yann: Blurring Out Cosmic Puzzles     

The Doomsday argument and anthropic arguments are illustrations of a paradox. In both cases, a lack of knowledge apparently yields surprising conclusions. Since they are formulated within a Bayesian framework, the paradox constitutes a challenge to Bayesianism. Several attempts, some successful, have been made to avoid these conclusions, but some versions of the paradox cannot be dissolved within the framework of orthodox Bayesianism. I show that adopting an imprecise framework of probabilistic reasoning allows for a more adequate representation of ignorance in Bayesian reasoning, and explains away these puzzles.

Bewersdorf, Benjamin: Total Evidence, Uncertainty and A Priori Beliefs                

Defining the rational belief state of an agent in terms of an a priori, hypothetical or initial belief state as well as the agent's total evidence can help to address a number of interesting philosophical problems. In this paper, I discuss how this strategy can be applied to cases in which evidence is uncertain. I also argue that taking evidence to be uncertain allows us to uniquely determine the subjective a priori belief state of an agent from her present belief state and her total evidence, given that evidence is understood in terms of update factors.

Bigaj, Tomasz: Quantum particles, individual properties, and discernibility       

The paper discusses how to formally represent properties characterizing individual components of quantum systems containing many particles of the same type. It is argued that this can be done using only fully symmetric projection operators. An appropriate interpretation is proposed and scrutinized, and its consequences related to the notion of quantum entanglement and the issue of discernibility and individuality of quantum particles are ascertained.

Bokulich, Alisa: Frankenmodels, Or a Cautionary Tale of Coupled Models in the Earth Sciences                 

In recent decades a new breed of simulation models has emerged in the Earth sciences known as coupled models, which involve a suite of independently developed component modules integrated in a software framework.  Such models purport to offer 'plug and play' capabilities, allowing researchers to easily combine and swap out different component models in order to build larger, more complex models and facilitate inter-model comparisons.  I examine such coupled models in the context of geomorphology and argue that although advances in software programming mean these models can be coupled from a 'technological' standpoint, they are not typically adequately coupled from a scientific standpoint, leading to what I call 'Frankenmodels.'  I highlight a number of conceptual challenges that the coupled-model approach in geomorphology will need to overcome in order to succeed.

Brigandt, Ingo:  Social Values Influence the Adequacy Conditions of Scientific Theories: Beyond Inductive Risk                 

Many philosophers who maintain that social and other non-epistemic values may influence theory acceptance do so based on the idea that when the social consequences of erroneously accepting a theory would be severe, a higher evidential threshold has to obtain. While an implication of this position is that increasing evidence makes the impact of social values converge to zero, I argue for a stronger role for social values, according to which social values (together with epistemic values) determine a theory's conditions of adequacy, e.g., what makes a scientific account complete and unbiased.

Brown, Matthew, and Havstad, Joyce: The Disconnect Problem in Science and Policy    

We diagnose a new problem for philosophers of science to engage with: the disconnect problem.  Instances of the disconnect problem arise wherever there is ongoing and severe discordance between the scientific assessment of a politically relevant issue, and the politics and legislation of said issue.  Issues which currently display a persistent disconnect problem include both biological education about evolution and anthropogenic global climate change.  Here we use the case of climate change first to diagnose the disconnect problem -- uniting scattered philosophical critiques of various science-policy interactions in the process -- and then to solve it, by proposing an alternative framework for thinking about science-policy interaction.  We sketch this framework, which draws on feminist and pragmatist sources, in general and in application to the increasingly serious problem of climate change.

Bueno, Otavio: What Does a Mathematical Proof Really Prove?                

Jody Azzouni ([1994], [2004], and [2006]) has argued that underlying the practice of creating mathematical proofs there is a very specific norm: to each proof there should be a corresponding algorithmic derivation, a derivation in an algorithmic system. In this paper, I take issue with this proposal, and provide a framework to classify and assess mathematical proofs. I argue that there is a plurality of kinds of proofs in mathematics and a plurality of roles these proofs play. In the end, mathematical practice is far less unified than Azzouni's view recommends.

Bueter, Anke:  Is it Time for an Etiological Revolution in Psychiatric Classification?        

Current psychiatric classification (as exemplified by the DSMs) is based on phenomenology - a fact that many critics hold accountable for its problems, such as heterogeneity, comorbidity, or lack of predictive success. Therefore, these critics urge that it is time for psychiatric classification to move on to an etiology-based system. Yet, most of the arguments brought forward for such a change rely on unwarranted epistemological, ontological, or empirical assumptions. Others raise valid points about problems with the current DSM classification; however, those more successful arguments do not establish the legitimacy of an etiological revolution either, but instead call for greater pluralism.

Callender, Craig, and Wüthrich, Christian:  What Becomes of a Causal Set            

Contemporary physics is notoriously hostile to an A-theoretic metaphysics of time. A recent approach to quantum gravity promises to reverse that verdict: advocates of causal set theory have argued that their framework is at least consistent with a fundamental notion of `becoming'. In this paper, after presenting some new twists and challenges, we show that a novel and exotic notion of becoming is compatible with causal sets.

Cao, Rosa: Where Information Fades Away: Some Limitations of Informational Explanations in Neuroscience 

Should the ubiquitous talk of information and coding in neuroscience be taken at face value? I will examine cases in sensory neurophysiology where spike trains in single neurons, or spiking activity in small groups of neurons, are said to encode information about distal stimuli. Given the explanatory goals in those cases, I will argue that ascribing informational content to patterns of spike activity in these cases is not doing explanatory work: spikes carry information for the scientist, rather than for the system itself. The activity of large populations of cells, by contrast, is a better candidate for informational ascriptions.

Chambliss, Bryan: Optimality and Bayesian Perceptual Systems               

Bayesian models of perception face criticisms ranging from the charge that they are untestable (because unrelated to neurophysiology) to the charge that they are false (because committed to optimal behavior). Consequently, Bayesian models seem futile if understood as being more than merely predictive. Careful inspection of the explanatory purport of Bayesian models splits this dilemma. Structural similarities between Bayesian models of perception and optimality modeling in biology suggest that we export lessons concerning biological optimality models to the context of Bayesian modeling. Upon properly appreciating (a) the operative level of analysis and (b) the explanatory force of Bayesian models, we see their explanatory purport and understand their limitations.

Clatterbuck, Hayley: Contingency and the Origin of Life                  

Recently, several philosophers of science (White 2007, Nagel 2012) have argued that naturalistic scientific theories cannot adequately explain the origin of life because they show that life's emergence was highly contingent on precise and improbable initial conditions. Bayesian confirmation theory offers an analysis of when and how contingency presents impediments to scientific explanation. These impediments can be overcome, so non-contingency is not a necessary condition for good scientific explanation. Further, alternative, non-naturalistic theories do no better than naturalistic theories in securing the virtues of non-contingent explanations when applied to the case of life's origin.

Coffey, Kevin: Quine-Duhem through a Bayesian Lens  

One virtue attributed to Bayesian confirmation theory is its ability to solve the Quine-Duhem problem---the problem of distributing blame between conjuncts in light of evidence refuting their conjunction.  By requiring rational agents to update their (partial) beliefs by conditionalizing on new evidence, Bayesianism seems to provide a well-defined method for assessing the impact of empirical evidence on our beliefs.  Recently, however, Michael Strevens has criticized the standard Bayesian solution.  By appealing to the concept of a Newstein effect, he argues that the standard `blame measure' fails to be sufficiently objective and to accurately reflect the impact of a conjunction's falsification on its individual conjuncts.  Strevens then proposes an alternative Bayesian solution that purports to avoid these difficulties.  This paper assesses Strevens' claims in light of a more detailed analysis of Newstein effects.  I conclude that (1) his alternative blame measure is equally susceptible to Newstein effects, but (2) Newstein effects are actually unproblematic features of Bayesian epistemology, already implicit in the Bayesian framework, and (3) Strevens' alternative blame measure fails as a general solution to the Quine-Duhem problem.  The upshot is that the standard Bayesian solution to the Quine-Duhem problem is superior to Strevens' alternative.

Cordero-Lecca, Alberto:  Where's the Beef? Selective Realism and Truth-Content Identification

Selective realism seeks to identify theory-parts with high truth-content. One recent strategy, led by Juha Saatsi, Peter Vickers, and Ioannis Votsis, focuses on the analysis of derivations of impressive predictions from a theory. This line greatly clarifies and refines the notions of "theory-part" and "truth-content," but also leads to a "bare-bones" version of realism that stops surprisingly short of the target. Sections 1 and 2 discuss the derivational approach in progress. Sections 3 and 4 trace the noted shortcoming to an excessive concentration on single-case derivations and minimalist interpretation. Sections 5 and 6 suggest adjustments that keep the focus on truth-content but expand the assessment of theory-parts to include both their overall track-record and external support for them. The resulting criterion, I argue, yields a "beefier" version of selective realism that is reasonably in tune with the array of theory-parts deemed successful and beyond reasonable doubt (based on standard confirmational practices in the natural sciences).

Cuffaro, Michael:  How-Possibly Explanations in Quantum Computer Science   

A primary goal of quantum computer science is to find an explanation for the fact that quantum computers are more powerful than classical computers. In this paper I argue that to answer this question is to compare algorithmic processes of various kinds, and in so doing to describe the possibility spaces associated with these processes. By doing this we explain how it is possible for one process to outperform its rival. Further, in this and similar examples little is gained in subsequently asking a how-actually question. Once one has explained how-possibly there is little left to do.

Danks, David: The Mathematics of Causal Capacities    

Models based on causal capacities, or independent causal influences/mechanisms, are widespread in the sciences. This paper develops a natural mathematical framework for representing such capacities by extending and generalizing previous results in cognitive psychology and machine learning, based on observations and arguments from prior philosophical debates. In addition to its substantial generality, the resulting framework provides a theoretical unification of the widely-used noisy-OR/AND and linear models, thereby showing how they are complementary rather than competing. This unification helps to explain many of the shared cognitive and mathematical properties of those models.

Del Pinal, Guillermo, and Nathan, Marco: Bridge Laws and the Psycho-Neural Interface                

Recent advancements in the brain sciences have enabled researchers to determine, with increasing accuracy, patterns and locations of neural activation. These findings have revived a longstanding debate regarding the relation between scientific fields: while many authors now claim that neuroscientific data can be used to advance our theories of higher cognition, others defend the so-called `autonomy' of psychology. Settling this question requires understanding the nature of the bridge laws used at the psycho-neural interface. While bridge laws have been the topic of longstanding philosophical discussions, philosophers have mostly focused on a particular type of bridge laws, namely, reductive bridge laws. The aim of this article is to present and provide a systematic analysis of links of a different kind---associative bridge laws---that play a central role in current scientific practice, but whose role has been often overlooked in methodological and metaphysical discussions in philosophy of mind and science.

Dougherty, John:  A few points on gunky space 

Arntzenius (2012) proposed a mathematical model for gunky space, motivated in part by philosophical considerations, in part by considerations from quantum mechanics. After recasting this model in perspicuous terms, I answer an unresolved technical question. Turning to quantum mechanics, I argue that the motivations for this model disregard important parts of the quantum formalism: the observables. When these are taken into account, gunk appears incompatible with quantum mechanics. This conclusion poses a dilemma for the reconciliation of gunk with modern physics. Either one must reject the physical possibility of gunk, or one must find a systematic way of eliminating points from physics.

Duwell, Armond, and Le Bihan, Soazig:  Enlightening falsehoods: A modal view of scientific understanding         

A new concept of understanding is articulated, modal understanding, which is characterized as follows: one has some modal understanding of some phenomena if and only if one knows how to navigate some of the possibility space associated with these phenomena, where by "possibility space" is meant the set of possible dependency structures that give rise to all subsets of the phenomena and the relations between those structures. When fully articulated, the notion of modal understanding serves as a suitable concept of understanding that is appropriately neutral with respect to the debate on scientific realism and helps explain the modeling practices of scientists.

Feest, Uljana:  Physicalism, Introspection, and Psychophysics: The Carnap/Duncker Exchange                  

In 1932, Rudolf Carnap published his article "Psychology in a Physical Language." The article prompted a critical response by the Gestalt psychologist Karl Duncker. The exchange is marked by mutual lack of comprehension. In this paper I will provide a contextualized explication of the exchange. I will show that Carnap's physicalism was deeply rooted in the psychophysical tradition that also informed Gestalt psychological research. By failing to acknowledge this, Carnap missed the opportunity to enter into a serious debate and to forge an alliance with a like-minded psychologist at the time.

Fenton-Glynn, Luke: Ceteris Paribus Laws and Minutis Rectis Laws       

Special science generalizations admit of exceptions. Among the class of non-exceptionless special science generalizations, I distinguish (what I will call) *minutis rectis* (*mr*) generalizations from the more familiar category of *ceteris paribus* (*cp*) generalizations. I argue that the challenges involved in showing that *mr* generalizations can play the law role are underappreciated, and quite different from those involved in showing that *cp* generalizations can do so. I outline some potential strategies for meeting the challenges posed by *mr* generalizations.

Fillion, Nicolas, and Bangu, Sorin:  Solutions in the Mathematical Sciences & Epistemic Hierarchies      

Modern mathematical sciences are hard to imagine without the involvement of computers, or more generally, without appeal to numerical methods. Interesting conceptual problems arise from this interaction, and yet philosophers of science have yet to catch up with these developments. This paper sketches and examines one such problem, a tension between two types of epistemic contexts, one in which exact solutions can be found, and one in which they can't. Against this background, an investigation of some intriguing computational (a)symmetries is undertaken.

Forster, Malcolm: How the Quantum Sorites Phenomenon Strengthens Bell's Argument              

A recent theorem by Colbeck and Renner (2011) leads to a significantly stronger theorem than Bell's famous theorem; the new theorem does not assume Outcome Independence.  It is suggested that the reason that the stronger theorem was not discovered earlier is that it exploits a very strong prediction of quantum mechanics by applying some mathematical "tricks" to bypass the need to use Outcome Independence.  The very strong prediction of quantum mechanics is aptly described as the "quantum Sorites" phenomenon.

Frost-Arnold, Greg: Should a Historically Motivated Anti-Realist be a Stanfordite?         

Suppose one holds that the historical record of discarded scientific theories provides good evidence against scientific realism.  Should one adopt Kyle Stanford's specific critique of realism?  I present reasons for answering this question in the negative.  In particular, Stanford's challenge, based on the problem of unconceived alternatives, cannot use many of the prima facie strongest pieces of historical evidence against realism: (i) superseded theories whose successors were explicitly conceived, and (ii) superseded theories that were not the result of elimination-of-alternatives inferences.

Fuller, Jonathan:  The Confounding Question of Confounding Causes in Randomized Trials          

Comparative group studies, including randomized trials, control for confounding variables. Some epidemiologists praise randomized trials for controlling for all confounding causes, while some philosophers praise the assumption that all confounding causes are controlled because it supports sound causal inference. Both views are problematic. Exposing the problems clears the way for an alternative assumption that can better guide causal inference in group studies.

Fumagalli, Roberto:   No Learning from Minimal Models                

This paper examines whether consideration of so-called minimal models can prompt learning about real-world targets. Using a widely cited example as a test case, I argue against the increasingly popular view that consideration of minimal models can prompt learning about such targets. In particular, I criticize the proponents of this view for failing to explicate in virtue of what properties or features minimal models supposedly prompt learning, and for substantially overstating the epistemic import of minimal models. I then consider and rebut three arguments one might develop to defend the claim that consideration of minimal models can prompt learning about real-world targets. In doing so, I illustrate the implications of my critique for the wider debate on the epistemology of scientific modelling.

Gandenberger, Greg: Why I Am Not a Methodological Likelihoodist       

Methodological likelihoodism is the view that it is possible to provide an adequate self-contained methodology for science on the basis of likelihood functions alone. I argue that methodological likelihoodism is false because an adequate self-contained methodology for science provides good norms of commitment vis-a-vis hypotheses, and no purely likelihood-based norm meets this standard.

Garson, Justin: Why (a Form of) Function Indeterminacy is Still a Problem for Biomedicine, and How Seeing Functional Items as Components of Mechanisms Can Solve it

During the 1990s, many philosophers wrestled with the problem of function indeterminacy. Although interest in the problem has waned, I argue that solving the problem is of value for biomedical research and practice. This is because a solution to the problem is required in order to specify rigorously the conditions under which a given item is "dysfunctional." In the following I revisit a solution developed originally by Neander (1995), which uses functional analysis to solve the problem. I situate her solution in the framework of mechanistic explanation and suggest two improvements.

Genin, Konstantin, Kelly, Kevin, and Lin, Hanti: A Topological Theory of Empirical Simplicity       

We propose an account of empirical simplicity relative to an empirical problem, which specifies a question to be answered and the potential information states the scientist might encounter.  The motivating idea is that empirical simplicity reflects iterated empirical underdetermination in the given problem, which agrees closely with Popper's ideas about falsifiability.  The idea is to collapse distinctions in the problem by means of structure-preserving transformations until the simplicity order emerges.  Such a simplicity concept serves as a genuinely epistemic road map for a learner seeking the true answer to an empirical problem.

Gomori, Marton, and Szabo, Laszlo: How to Move an Electromagnetic Field?      

The special relativity principle presupposes that the states of the physical system concerned can be meaningfully characterized, at least locally, as ones in which the system is at rest or in motion with some velocity relative to an arbitrary frame of reference. In the first part of the paper we show that electrodynamic systems, in general, do not satisfy this condition. In the second part of the paper we argue that exactly the same condition serves as a necessary condition for the persistence of an extended physical object. As a consequence, we argue, electromagnetic field strengths cannot be the individuating properties of the electromagnetic field, contrary to the standard realistic interpretation of CED. In other words, CED is ontologically incomplete.

Grüne-Yanoff, Till: Why Behavioural Policy Needs Mechanistic Evidence              

Proponents seek to justify behavioural policies as "evidence-based". Yet they typically fail to show through which mechanisms these policies operate. This paper shows, drawing on examples from economics, psychology and biology, that without sufficient mechanistic evidence one often cannot determine whether a given policy in its target environment will be efficacious, robust, persistent or welfare-improving. Because these properties are important for justification, policies that lack support from mechanistic evidence should not be called "evidence-based".

Harinen, Totte:  Normal Causes for Normal Effects          

Halpern and Hitchcock have used normality considerations in order to provide an analysis of actual causation. Their methodology is that of taking a set of causal scenarios and showing how their account of actual causation accords with typical judgments about those scenarios. Consequently, Halpern and Hitchcock have recently demonstrated that their theory deals with an impressive number of problem cases discussed in the literature. However, in this paper I first show that the way in which they rule out certain cases of bogus prevention leaves their account susceptible to counterexamples. I then sketch an alternative approach to prevention scenarios which draws on the observation that, in addition to abnormal causes, people usually tend to focus on abnormal effects.

Hartmann, Stephan:  A New Solution to the Problem of Old Evidence      

The Problem of Old Evidence has troubled Bayesians ever since Clark Glymour first presented it in 1980. Several solutions have been proposed, but all of them have issues and none of them is considered to be the definitive solution. In this short note, I propose a new solution which combines several old ideas with a new one. It circumvents the crucial omniscience problem in an elegant way and leads to a considerable confirmation of the hypothesis in question.

Heilmann, Conrad:  A New Interpretation of the Representational Theory of Measurement       

On the received interpretation, the Representational Theory of Measurement (RTM) depicts measurement as numerical representation of empirical relations. This account of measurement has been widely criticised. In this paper, I provide a new interpretation of RTM that sidesteps these criticisms. Instead of assessing it as a candidate for a full-fledged theory of measurement, I propose to view RTM as a library of theorems that investigate the numerical representability of qualitative relations. Such theorems are useful tools for concept formation for a broad range of important applications in linguistics, rational choice, metaphysics, economics, and the social sciences.

Henderson, Leah:  Should the debate over scientific realism go local? 

There have been suggestions that 'going local' may be the most productive approach for one or both sides in the debate over scientific realism. I distinguish between two different things that might be meant by going local. I argue that in neither case is it clear that localizing is either necessary or helpful. In fact, it may even be a distraction from a proper investigation of the empirical question about the history of science which emerges from the traditional debate.

Hey, Spencer: Theory Testing and Implication in Clinical Trials

John Worrall (2010) and Nancy Cartwright (2011) argue that randomized controlled trials (RCTs) are "testing the wrong theory." RCTs are designed to test inferences about the causal relationships in the study population, but this does not guarantee a justified inference about the causal relationships in the more diverse population in clinical practice. In this essay, I argue that the epistemology of theory testing in trials is more complicated than either Worrall's or Cartwright's accounts suggest. I illustrate this more complex theoretical structure with case-studies in medical theory testing from (1) Alzheimer's research and (2) anti-cancer drugs in personalized medicine.

Hicks, Michael:  Solving the Coordination Problem          

Discussions of the relationship between physics and the special sciences focus on the question of which sciences reduce to physics and how reduction is to be understood. This focus distracts from crucial features of the relationship between sciences. I identify four features of the relationship between physics and the special sciences and argue that no current view of their relationship adequately explains these four features. I then present a new view and argue that it explains these aspects of the relationship between distinct sciences.

Hicks, Daniel: Genetically Modified Crops and the Underdetermination of Evidence by Epistemology

The underdetermination of theory by evidence has been discussed widely.  In this paper, I discuss a distinct kind of underdetermination:  the underdetermination of evidence by epistemology.  I examine the controversy over the yields of genetically modified [GM] crops, and show that proponents and opponents of GM crops cite, as evidence, two rival sets of claims.  By the lights of what I call classical experimental epistemology, one set of claims counts as evidence, but the other does not.  However, Nancy Cartwright's evidence for use arrives at exactly the opposite conclusion:  the latter counts as evidence, and the former does not.

Hofer-Szabó, Gábor, and Vecsernyés, Péter: Bell's local causality for philosophers          

This paper is the philosopher-friendly version of our more technical work ( ). It aims to give a clear-cut definition of Bell's notion of local causality. Having provided a framework, called local physical theory, which integrates probabilistic and spatiotemporal concepts, we formulate the notion of local causality and relate it to other locality and causality concepts. Then we compare Bell's local causality with Reichenbach's Common Cause Principle and relate both to the Bell inequalities. We find a nice parallelism: both local causality and the Common Cause Principle are more general notions than captured by the Bell inequalities. Namely, Bell inequalities can be derived neither from local causality nor from a common cause unless the local physical theory is classical or the common cause is commuting, respectively.

Holman, Bennett: Why Most Sugar Pills are not Placebos

The standard philosophic definition of placebos offered by Grünbaum is incompatible with the experimental role they must play in randomized clinical trials as articulated by Cartwright. I offer a modified account of placebos that respects this role and clarifies why many current medical trials fail to warrant the conclusions they are typically seen as yielding. I then consider recent changes to guidelines for reporting medical trials and show that pessimism over parsing out the effects of "unblinding" is premature. Specifically, using a trial of antidepressants, I show how more sophisticated statistical analyses can parse out the source of such effects and serve as an alternative to placebo control.

Holman, Bennett, and Bruner, Justin: The Problem of Intransigently Biased Agents        

In recent years the social nature of scientific inquiry has generated considerable interest. We examine the effect of an epistemically impure agent on a community of honest truth-seekers. Extending a formal model of network epistemology pioneered by Zollman, we conclude that an intransigently biased agent prevents the community from ever converging to the truth. We explore two solutions to this problem, including a novel procedure for endogenous network formation in which agents choose whom to trust. We contend our model nicely captures aspects of current problems in medical research, and gesture at some morals for medical epistemology more generally.

Holter, Brandon:  Rudner's Challenge and the Epistemic Significance of Inductive Risk 

Richard Rudner argues that the traditional view of scientific justification, which is supposed to be free of moral and practical influence, cannot explain why different scientific contexts require different standards of sufficient evidence. Traditionalists distinguish the epistemic justification of theories from the practical justification of actions, arguing that Rudner's examples are merely practical. This response is inadequate, however, because it does not secure a value-free epistemology. It neither explains epistemic variation nor asserts that there is one universal epistemic standard of evidential sufficiency. Only the epistemic influence of values, I argue, can account for the evidential standards of science.

Howick, Jeremy, and Worrall, John: What counts as a placebo is relative to a target disorder and therapeutic theory: defending a modified version of Grünbaum's scheme

There is currently no widely accepted definition of 'placebos'. Yet debates about the ethics of placebo use (in routine practice or clinical trials) and the magnitude (if any!) of 'placebo' effects continue to rage. Even if not formally required, a definition of the 'placebo' concept could inform these debates. Grünbaum's 1981/1986 characterization of the 'placebo' has been cited as the best attempt thus far, but has not been widely accepted. Here we argue that criticisms of Grünbaum's scheme are either exaggerated, unfounded or based on misunderstandings. We propose that, with three modifications, Grünbaum's scheme can be defended. Grünbaum argues that all interventions can be classified by a therapeutic theory into 'incidental' and 'characteristic' features. 'Placebos', then, are treatments whose characteristic features do not have effects on the target disorder. To Grünbaum, whether a treatment counts as a placebo or not is relative to a target disorder and a therapeutic theory. We modify Grünbaum's scheme in the following way. First, we add 'harmful intervention' and 'nocebo' categories; second, we insist that what counts as a 'placebo' (or nonplacebo) be relativized to patients; and third, we issue a clarification about the overall classification of an intervention. We argue that our modified version of Grünbaum's scheme resists published criticisms. Our work warrants a re-examination of current policies on the ethics of placebos in both clinical practice and clinical trials, and a revised empirical estimation of 'placebo' effects, both in the context of clinical trials and clinical practice.

Huggett, Nick, and Vistarini, Tiziana:  Deriving General Relativity from String Theory       

Weyl symmetry of the classical bosonic string is a mathematical consequence of its Lagrangian. However, quantization breaks it, with profound consequences that we describe here, along with a relatively straightforward review of string theory suitable for philosophers of physics. First, reimposing the symmetry requires that spacetime have 26 dimensions. Moreover, it requires that the background spacetime satisfy the equations of general relativity; it follows in turn that general relativity, hence classical spacetime as we know it, arises from string theory. In conclusion we argue that Weyl symmetry is not an independent postulate, but required in quantum string theory.

Imbert, Cyrille: Getting the advantages of theft without honest toil: Realism about the complexity of (some) physical systems without realist commitments to their scientific representations

This paper shows that, under certain reasonable conditions, if the investigation of the behavior of a physical system is difficult, no scientific change can make it significantly easier. This impossibility result implies that complexity is then a necessary feature of models which truly represent the target system and of all models which are rich enough to capture its behavior, and therefore that it is an inevitable element of any possible science in which this behavior is accounted for. I finally argue that complexity can then be seen as representing an intrinsic feature of the system itself.

Jansson, Lina:  Making Room For Explanatory Fictions Within Realism   

There have been several recent challenges to the idea that accurate representation is a necessary condition for successful model explanation. In this paper I look in detail at the case of wavefunction scarring that Bokulich presents. I argue that cases like this one push us towards recognising ways of mediating between what is being explained and what is doing the explaining that depart radically from the traditional accounts. However, the commitment to accurate representation of what is doing the explaining and what is being explained remains unshaken by these examples.

Jantzen, Benjamin:  Why Talk about 'Non-Individuals' Is Meaningless    

It has been suggested that puzzles in the interpretation of quantum mechanics motivate consideration of non-individuals, entities that are numerically distinct but do not stand in a relation of identity with themselves or non-identity with others. I argue that talk about non-individuals is either meaningless or not about non-individuals. It is meaningless insofar as we attempt to take the foregoing characterization literally. It is meaningful, however, if talk about non-individuals is taken as elliptical for either nominal or predicative use of a special class of mass-terms.

Kao, Molly:  Unification in the Old Quantum Theory        

In this paper, I consider some of the central developments of old quantum theory between the years 1900 and 1913 and provide an analysis of the nature of the unificatory power of the hypothesis of quantization of energy in a Bayesian framework. I argue that the best way to understand the unification here is in terms of informational relevance: on the assumption of the quantum hypothesis, phenomena that were previously thought to be unrelated turned out to yield information about one another based on agreeing measurements of the numerical value of Planck's constant. 

Kennedy, Ashley, and Jebeile, Julie:  Idealization in the Process of Model Explanation

In this paper we argue that model explanation is a process in which idealization plays an important and active role. Via examination of a case study from contemporary astrophysics, we show that a) idealizations can, in some cases, make for better model explanations and that b) they do this by creating comparison cases which serve to highlight important explanatory components in the modeled target system. Thus our view is that the role of idealization in scientific model explanation goes beyond both simplification and isolation, which are the roles that have been previously described in the literature.

Keren, Arnon:  Science and Informed, Counterfactual, Democratic Consent      

On many science-related policy questions, the public is unable to make informed decisions, because of its inability to make use of knowledge and information obtained by scientists. Philip Kitcher and James Fishkin have both suggested therefore that on certain science-related issues, public policy should not be decided upon by actual democratic vote, but should instead conform to the public's Counterfactual Informed Democratic Decision (CIDD). Indeed, this suggestion underlies Kitcher's specification of an ideal of a well-ordered science. The paper argues that this suggestion misconstrues the normative significance of CIDDs. At most, CIDDs might have epistemic significance, but no authority or legitimizing force.

Keskin, Emre: Collective Success of Cosmological Simulations 

I argue that cosmological simulations that aim to model the time-evolution of large-scale structure formation in the universe yield, in addition to their numerical results, more than what Wendy Parker calls "adequacy-for-purpose." I claim that it is possible to obtain evidence from simulations confirming or disconfirming the underlying fundamental theories. Especially in the context of cosmology, the current examples of simulations yield such evidence. This shows that simulations in cosmology have a particular fundamental strength in addition to producing numerical output, which is not apparent in climate modeling.

Kincaid, Harold: Open Empirical and Methodological Issues in the Individualism-Holism Debate            

I briefly argue that some issues in the individualism-holism debate have been fairly clearly settled and that others are still plagued by unclarity.  The main argument of the paper is that there are a set of clear empirical issues around the holism-individualism debate that are central problems in current social science research. Those include questions about when we can be holist and how individualist we can be in social explanation.

King, Martin: Idealization and Explanation in Physics   

Many recent accounts of explanation acknowledge the importance of idealization. Alisa Bokulich offers her structural model account of explanation to allow for non-causal idealized models. However, the problem with opening explanation to idealizations is that many models may be counted as explanatory when they are only heuristic. The trajectories of semiclassical mechanics are literally false of quantum systems, but can be surprisingly effective in prediction. This paper takes a look at the difference between heuristic and explanatory power and argues that Bokulich seems to have conflated the two in her argument that semiclassical mechanics can be explanatory of quantum phenomena.

Klein, Colin:  Brain Regions as Difference-Makers.           

It is common to speak of brain regions for particular cognitive functions: regions for reading, for seeing faces, or for doing math. This suggests that brain regions have a single function which they always and uniquely perform. Advances in neuroscience show that this simple picture cannot be correct. I suggest that we ought instead to read 'region for' as designating brain areas that make a difference to the performance of personal-level activities. I discuss how brain regions can have specific and systematic relationships to personal-level activities, and argue that this combination allows neuroimaging to constrain cognitive theories indirectly.

Knuuttila, Tarja Tellervo:  Abstract and Concrete: Towards an Artifactual Theory of Fiction in Science   

This paper presents a novel artifactual approach to fiction in science that addresses the shared features of models and fictions. It approaches both models and fictions as purposefully created entities, artifacts, which are constructed by making use of culturally established representational tools in their various modes and media. As intersubjectively available artifacts models and fictions have both abstract and concrete dimensions. Three further features that models and fictions share are discussed: constructedness, partial autonomy, and incompleteness. The account proposed gives a unified account of different model types and circumvents some problems of those approaches that consider models as imagined systems.

Kovaka, Karen: Biological Individuality and Scientific Practice  

An issue that is largely neglected by the literature on biological individuality concerns the methodological question of what sort of concept of individuality is needed for scientific practice. In this paper, I explore what characterizations of biological individuality are valuable for scientists engaged in empirical research. I consider two claims about the relationship between individuality and scientific practice. Against these two claims, I argue that the scientific value of any particular characterization of individuality lies in its ability to inform and direct future research, rather than its capacity to resolve practical problems.

Kronfeldner, Maria: When specificity trumps proximity                

This paper analyzes the epistemic role of specificity and proximity in how scientists deal with causal complexity. I will, first, present evidence for the claim that scientists try to get rid of causal complexity by focusing on rather specific, ideally mono-causal relationships. This is likely to be uncontroversial, but in philosophy of science rarely noticed explicitly. I will, second, illustrate that specificity and proximity - understood as explanatory virtues, representation-dependent properties that a good explanation exhibits - can point in different directions. Even when specificity and proximity are explicitly addressed in the philosophical literature, this trade-off is ignored. The main claims that I will defend are that proximity and specificity are instrumental for stability, scope and parsimony and that - if in conflict - specificity regularly trumps proximity.

Lam, Vincent: In search of a primitive ontology for quantum field theory             

Primitive ontology is a recently much discussed approach to the ontology of quantum theory according to which the theory is ultimately about entities in 3-dimensional space and their temporal evolution. This paper critically discusses the proposed primitive ontologies for quantum field theory in the light of the existence of unitarily inequivalent representations. These primitive ontologies rely either on a Fock space representation or a wave functional representation, which are strictly speaking unambiguously available only for free systems in flat spacetime. As a consequence, it is argued that they constitute only `effective ontologies' and are hardly satisfying as a fundamental ontology for quantum field theory.

Lamb, Maurice, and Chemero, Anthony: Understanding Dynamical Explanation in Cognitive Science

Neo-mechanists argue that in order for a claim to be an explanation in cognitive science it must reveal something about the mechanisms of a cognitive system. Recently, they claimed that JAS Kelso and colleagues have begun to favor mechanistic explanations of neuroscientific phenomena. We argue that this view results from a failure to understand dynamic systems explanations and the general structure of dynamic systems research. Further, we argue that the explanations by Kelso and colleagues cited are not mechanistic explanations and that neo-mechanists have misunderstood Kelso and colleagues' work, which blunts the force of one of the neo-mechanists' arguments.

Lee, Carole: Commensuration Bias in Peer Review         

To arrive at their final evaluation of a manuscript or grant proposal, reviewers must convert a submission's strengths and weaknesses for heterogeneous peer review criteria into a single metric of quality or merit. I identify this process of commensuration as the locus for a new kind of peer review bias. Commensuration bias illuminates how the systematic prioritization of some peer review criteria over others permits and facilitates problematic patterns of publication and funding in science. Commensuration bias also foregrounds a range of structural strategies for realigning peer review practices and institutions with the aims of science.

Lehtinen, Aki: Derivational robustness and indirect confirmation          

This paper examines the role of derivational robustness in confirmation, arguing that it allocates confirmation to individual assumptions, and thereby increases the degree to which various pieces of evidence indirectly confirm the robust result. If one can show that a result is robust, and that the various individual models used to derive it also have other confirmed results, these other results may indirectly confirm the robust result. Confirmation derives from the fact that data not known to bear on a result are shown to be relevant when it is shown to be robust.

Leonelli, Sabina:  What Counts as Scientific Data? A Relational Framework        

This paper proposes an account of scientific data that makes sense of recent debates on data-driven research, while also building on the history of data production and use particularly within biology. In this view, 'data' is a relational category applied to research outputs that are taken, at specific moments of inquiry, to provide evidence for knowledge claims of interest to the researchers involved. They do not have truth-value in and of themselves, nor can they be seen as straightforward representations of given phenomena. Rather, they are fungible objects defined by their portability and their prospective usefulness as evidence.

Li, Bihui: Coarse-Graining as a Route to Microscopic Physics: The Renormalization Group in Quantum Field Theory      

The renormalization group (RG) has been characterized as merely a coarse-graining procedure that does not illuminate the microscopic content of quantum field theory (QFT), but simply gets us from that content, as given by axiomatic QFT, to macroscopic predictions. I argue that in the constructive field theory tradition, RG techniques do illuminate the microscopic dynamics of a QFT, which are not automatically given by axiomatic QFT. RG techniques in constructive field theory are also rigorous, so one cannot object to their foundational import on grounds of lack of rigor.

Linquist, Stefan: Against Lawton's contingency thesis, or, why the reported demise of community ecology is greatly exaggerated.       

Lawton's contingency thesis (CT) states that there are no useful generalizations ("laws") at the level of ecological communities because these systems are especially prone to contingent historical events. I argue that this influential thesis has been grounded on the wrong kind of evidence. CT is best understood in Woodward's (2010) terms as a claim about the instability of certain causal dependencies across different background conditions. A recent distinction between evolution and ecology reveals what an adequate test of Lawton's thesis would look like. To date, CT remains untested. But developments in genome and molecular ecology point in a promising direction.

Lisciandra, Chiara:  Robustness Analysis as a Non-empirical Confirmatory Practice        

Robustness analysis is a method of testing whether the predictions of a model are the unintended effect of that model's unrealistic assumptions. As such, the method resembles the analyses conducted in the experimental sciences to test the effect of possible confounders on empirical results. The arguments in support of robustness analysis in non-experimental contexts, however, are often left implicit or are unreflectively imported from the experimental sciences. In this paper, I defend the claim that comparing the results of models based on different tractability assumptions is in principle helpful, but that the method of conducting this sort of analysis is far from straightforward. I indicate some of the difficulties encountered in practice and suggest possible alternatives, focusing on a case study in economic geography.

Lombardi, Olimpia, Fortin, Sebastian, and Vanni, Leonardo: A pluralist view about information

Focusing on Shannon information, this article shows that, even on the basis of the same formalism, there may be different interpretations of the concept of information, and that disagreements may be deep enough to lead to very different conclusions about the informational characterization of certain physical situations. On this basis, a pluralist view is argued for, according to which the concept of information is primarily a formal concept that can adopt different interpretations that are not mutually exclusive, but each useful in a different specific context.

Magnus, P.D.: What the 19th century knew about taxonomy and the 20th forgot             

The accepted narrative treats John Stuart Mill's Kinds as the historical prototype for our natural kinds, but Mill actually employs two separate notions: Kinds and natural groups. Considering these, along with the accounts of Mill's 19th-century interlocutors, forces us to recognize two distinct questions. First, what marks a natural kind as worthy of inclusion in taxonomy? Second, what exists in the world that makes a category meet that criterion? Mill's two notions offer separate answers to the two questions: natural groups for taxonomy, and Kinds for ontology. This distinction is ignored in many contemporary debates about natural kinds and is obscured by the standard narrative which treats our natural kinds just as a development of Mill's Kinds.

Malinsky, Daniel: Hypothesis testing, "Dutch Book" arguments, and risk              

Dutch Book arguments and references to gambling theorems are typical in the debate between Bayesians and scientists committed to "classical" statistical methods. These arguments have rarely convinced non-Bayesian scientists to abandon certain conventional practices (like fixed-level null hypothesis significance testing), partially because many scientists feel that gambling theorems have little relevance to their research activities. In other words, scientists "don't bet". This paper examines one attempt, by Schervish, Seidenfeld, and Kadane, to progress beyond such apparent stalemates by connecting "Dutch Book"-type mathematical results with principles actually endorsed by practicing experimentalists.

Marcellesi, Alexandre: External Validity: Is There Still a Problem?           

I first propose to distinguish between two kinds of external validity inferences, predictive and explanatory. I then argue that we have a satisfactory answer---formulated in slightly different ways by Cartwright and Hardie on the one hand and Pearl and Bareinboim on the other---to the question of the conditions under which predictive external validity inferences are valid. If this claim is correct, then it has two immediate consequences: First, some external validity inferences are deductive, contrary to what is commonly assumed. Second, Steel's requirement that an account of external validity inference break what he calls the 'Extrapolator's Circle' is misplaced, at least when it comes to predictive external validity inferences.

Marcoci, Alexandru:  Solving the Absentminded Driver Problem Through Deliberation                  

Piccione and Rubinstein [1997] have suggested a sequential decision problem with absentmindedness in which there seem to be two equally compelling, but divergent, routes to calculating the expected utility of an agent's actions. The first route corresponds to an ex ante calculation, the second to an ex interim calculation. Piccione and Rubinstein conclude that since the verdicts of the two calculations disagree, they lead to an inconsistency in rational decision theory. In this paper I first argue that the ex ante route to calculating expected utility is not available in decision problems such as that introduced by Piccione and Rubinstein. The second part of the paper explores the ex interim expected utility formula. This has been largely neglected in the literature and is always presented as only offering the agent a parametric optimal strategy in terms of his initial belief in being at the first decision node. I will argue that if we construe agents as maximising the ex interim expected utility in steps through a deliberative dynamics, then this formula can make a precise recommendation with regard to the driver's optimal strategy irrespective of his initial beliefs.

Martini, Carlo: The limits of trust in interdisciplinary science   

In this paper I argue that the lack of trust networks among researchers employed in interdisciplinary collaborations potentially hampers successful interdisciplinary research. I use Hardwig's concept of epistemic dependence in order to explore the problem theoretically, and MacLeod and Nersessian's ethnographic studies in order to illustrate the problem from the viewpoint of concrete interdisciplinary science practice. I suggest that some possible solutions to the problem are in need of further exploration.

Matthews, Lucas: Embedded Mechanisms and Phylogenetics  

The role and value of mechanisms in the process-oriented life sciences are quite clear (Machamer, Darden, and Craver 2000; Bechtel and Richardson 1993/2010; Darden 2006; Craver 2007; Bechtel 2008). Demonstrating the role and value of mechanisms in other domains of scientific investigation, however, remains an important challenge of scope for the new mechanistic account of explanation. This paper helps answer that challenge by demonstrating one valuable role mechanisms play in the pattern-oriented science of phylogenetics. Using the transition/transversion (ti/tv) rate parameter as an example, this paper argues that models embedded with mechanisms produce stronger phylogenetic tree hypotheses, as measured by Maximum Likelihood (ML) logL values. Two important implications for the new mechanistic account of explanation are considered.

Mayo-Wilson, Conor: Structural Chaos 

Philosophers often distinguish between parameter error and model error. Frigg et al. (2014) argue that the distinction is important because although there are methods for making predictions given parameter error and chaos, there are no methods for dealing with model error and ``structural chaos.'' However, Frigg et al. (2014) neither define ``structural chaos'' nor explain the relationship between it and chaos (simpliciter). I propose a definition of ``structural chaos'', and I explain two new theorems that show that if a set of models contains a chaotic function, then the set is structurally chaotic. Finally, I discuss the relationship between my results and structural stability.

McCaffrey, Joseph: Neural Multi-Functionality and Mechanistic Role Functions               

Multi-functionality presents new challenges for mapping the brain's functional topography.  Cathy Price and Karl Friston argue that brain areas have many functions at one level of description and a single function at another.  Thus, researchers need to develop new cognitive ontologies to obtain robust functional mappings.  Colin Klein counters that this strategy will produce uninformative mappings.  Therefore, researchers should relativize functional mappings to particular contexts.  Using Carl Craver's concept of "mechanistic role functions" to illustrate that mechanistic components can be multi-functional in different ways, I argue that both accounts mistakenly construe brain areas as multi-functional in a preferred way.

Meketa, Irina: Experiment and Animal Minds: Why Statistical Choices Matter

Comparative cognition is the interdisciplinary study of nonhuman animal cognition. It has been criticized for systematically underattributing sophisticated cognition to nonhuman animals, a problem that I refer to as the underattribution bias. In this paper, I show that philosophical treatments of this bias at the experimental level have emphasized one feature of the experimental-statistical methodology (the preferential guarding against false positives over false negatives) at the expense of neglecting another feature (the default, or null, hypothesis). In order to eliminate this bias, I propose a reformulation of the standard statistical framework in comparative cognition. My proposal identifies and removes a problematic reliance on the value of parsimony in the calibration of the null hypothesis, replacing it with relevant empirical and theoretical information. In so doing, I illustrate how epistemic and non-epistemic values can covertly enter scientific methodology through features of statistical models, potentially biasing the products of scientific research. Broadly construed, this paper calls for increased philosophical attention to the experimental methodology and statistical choices.  

Miller, Michael: Haag's Theorem and Successful Applications of Scattering Theory        

Earman and Fraser (2006) clarify how it is possible to give mathematically consistent calculations in scattering theory despite Haag's theorem. However, their analysis does not fully address the worry raised by the result. In particular, I argue that their approach fails to be a complete explanation of why Haag's theorem does not undermine claims about the empirical adequacy of particular quantum field theories. I then show that such empirical adequacy claims are protected from Haag's result by the techniques that are required to obtain theoretical predictions for realistic experimental observables. I conclude by advocating that Haag's theorem should be understood as providing information about the nature of the relation between the perturbative expansion and non-perturbative characterizations of quantum field theories.

Miller, Boaz: What is Hacking's Argument for Entity Realism Anyway?   

According to Hacking's Entity Realism, unobservable entities that scientists carefully manipulate to study other phenomena are real. Although Hacking presents his case in an intuitive, attractive, and persuasive way, his argument remains elusive. I present five possible readings of Hacking's argument: a no-miracle argument, an indispensability argument, a transcendental argument, a Vichian argument, and a non-argument. I reconstruct Hacking's argument according to each reading, and review their prima facie strengths and weaknesses.

Miyake, Teru: Reference Models: Using Models to Turn Data into Evidence        

Reference models of the earth's interior play an important role in the acquisition of knowledge about the earth's interior and the earth as a whole.  Such models are used as a sort of standard reference against which data are compared.  I argue that the use of reference models merits more attention than it has gotten so far in the literature on models, for it is an example of a method of doing science that has a long and significant history, and a study of reference models could increase our understanding of this methodology.

Muntean, Ioan: Genetic algorithms in scientific discovery: a new epistemology?

Based on a concrete case of scientific discovery presented by Schmidt and Lipson (2009), I argue that the evolutionary computation used has important consequences for computational epistemology and for the philosophy of simulations in science. The genetic algorithms illustrate an "upward epistemology" from data to theories and are relevant in the context of scientific discovery. I explore the epistemological richness and novelty of this specific case study and extend my analysis to the more general framework of computer-aided scientific discovery. Evolutionary computation has a reassuring epistemic status compared to previous attempts to use computers in scientific discovery and can draw a bridge between AI and the practice of the biological sciences.

Nathan, Marco, and Love, Alan: The Idealization of Causation in Mechanistic Explanation          

Causal relations among components and activities in mechanisms are intentionally misrepresented in the mechanistic explanations found routinely in the life sciences. Since these causal relations are the source of the explanatory power ascribed to descriptions of mechanisms, and advocates of mechanistic explanation explicitly recognize the importance of an accurate representation of actual causal relations, the reliance on these idealizations in explanatory practice conflicts with the stated rationale for mechanistic explanations. We argue that these idealizations signal an overlooked feature of reasoning in molecular and cell biology--mechanistic explanations do not occur in isolation--and suggest that explanatory practices within the mechanistic tradition share commonalities with the model-based science prevalent in population biology.

Nguyen, James: Why data models do not supply the target structure required by the structuralist account of scientific representation             

Van Fraassen (2008) supplies an intriguing argument for the claim that data models provide the target-end structure required by structuralist accounts of scientific representation. This paper is a response to his argument. I first outline a variety of structuralist accounts, before turning to the question of target-end structure. I claim that if data models are invoked as supplying such structures, then it is unclear how scientific models represent physical targets. I argue that van Fraassen's answer to this question - that, pragmatically, for an individual scientist, representing a data model just is representing a target - is unsuccessful.

North, Jill: The Structure of Spacetime: a New Approach to the Spacetime Ontology Debate     

I propose that we understand the debate about spacetime ontology as a debate about whether spatiotemporal structure is fundamental. This yields a novel argument for substantivalism. Even so, that conclusion can be overridden by future developments in physics. I conclude that the debate about spacetime ontology, properly understood, is a substantive dispute, which the substantivalist is currently winning.

Northcott, Robert: Opinion polling and election predictions     

Election prediction by means of opinion polling is a rare empirical success story for social science, but one not previously considered by philosophers. I examine the details of a prominent case and draw two lessons of more general interest:  1) Methodology over metaphysics. Traditional metaphysical criteria were not a useful guide to whether successful prediction would be possible; instead, the crucial thing was selecting an effective methodology.  2) Which methodology? Success required sophisticated use of case-specific evidence from opinion polling. The pursuit of explanations via general theory or causal mechanisms, by contrast, turned out to be precisely the wrong path - contrary to much recent philosophy of social science. 

Norton, Joshua: Weak Discernibility and Relations Between Quanta    

Some authors (Muller and Saunders 2008; Huggett and Norton 2013) have attempted to defend Leibniz's identity of indiscernibles through weak discernibility. The idea is that if there is a symmetric, non-reflexive physical relation which holds between two particles, then those particles cannot be identical. In this paper I focus only on Muller and Saunders's account and argue that the means by which they achieve weak discernibility is not a physical observable but an alternative mathematical construction which is both unorthodox and incomplete. Muller and Saunders build a map from slot labels to a set of observables and out of this map construct a weakly discerning formal relation. What Muller and Saunders do not provide is a worked-out account of how such maps pick out physical relations between particles.

Nyrup, Rune: How Explanatory Reasoning Justifies Pursuit: A Peircean View of IBE         

This paper defends an account of explanatory reasoning generally, and inference to the best explanation in particular, according to which it first and foremost justifies pursuing hypotheses rather than accepting them as true. This side-steps the problem of why better explanations should be more likely to be true. Furthermore, I argue that my account faces no analogous problems. I propose an account of justification for pursuit and show how this provides a simple and straightforward connection between explanatoriness and justification for pursuit.

O'Neill, Elizabeth: Which causes of moral beliefs matter?

I argue that the distal causes of moral beliefs, such as evolution, are only relevant for assessing the epistemic status of moral beliefs in cases where we cannot determine whether a given proximal cause is reliable just by looking at the properties of that proximal cause. This means that any investigation into the epistemic status of moral beliefs given their causes should start with proximal causes—not with evolution. I discuss two proximal psychological causes of moral beliefs—disgust and sympathy—to demonstrate the feasibility of drawing epistemic conclusions from an examination of proximal causes alone.

Overton, James: Explanation in Science                 

Practices of scientific explanation are diverse, and so are philosophical accounts of scientific explanation. In this paper I propose a general philosophical account designed to handle this diversity. Using a large set of small case studies drawn from the journal Science, I argue for a single explain-relation that builds on different counterfactual-supporting "core" relations in different cases. Five familiar categories help us to classify scientific explanations, discover their form, and better understand the core relations. I close by considering how existing philosophical accounts fit with my evidence base and my general account.

Park, Ilho: Conditionalization and Credal Conservatism                

This paper is intended to show an epistemic trait of the Bayesian updating rule. In particular, I will show in this paper that Simple/Jeffrey/Adams Conditionalization is equivalent to what I call Credal Conservatism, which says that when we undergo a course of experience, our credences irrelevant to the experience should remain the same.
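
In standard notation (supplied here only for orientation, not quoted from the paper), the updating rules mentioned can be written as follows, where P_old and P_new are the credence functions before and after the experience and {E_i} is the partition over which the experience redistributes credence:

\[ P_{\mathrm{new}}(H) = P_{\mathrm{old}}(H \mid E) \qquad \text{(simple conditionalization, when the experience makes } P_{\mathrm{new}}(E) = 1\text{)} \]
\[ P_{\mathrm{new}}(H) = \sum_i P_{\mathrm{old}}(H \mid E_i)\, P_{\mathrm{new}}(E_i) \qquad \text{(Jeffrey conditionalization)} \]

Jeffrey conditionalization is well known to be equivalent to the rigidity condition P_new(H | E_i) = P_old(H | E_i) for all i, which is one natural way of making precise the thought that credences untouched by the experience stay fixed.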

Pashby, Thomas: Quantum Mechanics for Event Ontologists       

In an event ontology, matter is `made up of' events.  This provides a distinctive foil to the standard view of a quantum state in terms of properties possessed by a system.  Here I provide an argument against the standard view and suggest instead a way to conceive of quantum mechanics in terms of probabilities for the occurrence of events localized in space and time.  To that end I construct an appropriate probability space for these events and give a way to calculate them as conditional probabilities.  I suggest that these probabilities can be usefully thought of as Lewisian objective chances.

Pence, Charles, and Ramsey, Grant: Is Organismic Fitness at the Basis of Evolutionary Theory?

Fitness is a central theoretical concept in evolutionary theory. Despite its importance, much debate has occurred over how to conceptualize and formalize fitness. One point of debate concerns the roles of organismic and trait fitness. In a recent addition to this debate, Elliott Sober argues that trait fitness is the central fitness concept, and that organismic fitness is of little value. In this paper, by contrast, we argue that it is organismic fitness that lies at the basis of both the conceptual role of fitness and its role as a measure of evolutionary dynamics.

Perry, Zee: Intensive and Extensive Quantities                  

Quantities are properties and relations which exhibit "quantitative structure". For physical quantities, this structure can impact the non-quantitative world in different ways. In this paper I introduce and motivate a novel distinction between quantities based on the way their quantitative structure constrains the possible mereological structure of their instances. I borrow the terms 'extensive' and 'intensive' for these categories, though my use is substantially revisionary. I present and motivate this distinction using two case studies of successful physical measurements (of mass and length, respectively). I argue that the best explanation for the success of the length measurement requires us to adopt my notion of extensiveness, which is distinct from what's sometimes called "additivity". I further discuss this distinction and its consequences, sketching an application of extensiveness for the project of producing a non-mathematical and non-metrical reductive metaphysics of quantity.

Pietsch, Wolfgang: Aspects of theory-ladenness in data-intensive science         

Recent claims, mainly from computer scientists, concerning a largely automated and model-free data-intensive science have been countered by critical reactions from a number of philosophers of science. The debate suffers from a lack of detail in two respects, regarding (i) the actual methods used in data-intensive science and (ii) the specific ways in which these methods presuppose theoretical assumptions. I examine two widely-used algorithms, classificatory trees and non-parametric regression, and argue that these are theory-laden in an external sense, regarding the framing of research questions, but not in an internal sense concerning the causal structure of the examined phenomenon. With respect to the novelty of data-intensive science, I draw an analogy to exploratory as opposed to theory-directed experimentation.

Pincock, Chris: Newton, Laplace and Salmon on Explaining the Tides    

Salmon cites Newton's explanation of the tides in support of a causal account of scientific explanation. In this paper I reconsider the details of how Newton and his successors actually succeeded in explaining several key features of the tides. It turns out that these explanations depend on elements that are not easily interpreted in causal terms. Often an explanation is obtained even though there is a considerable gap between what the explanation says and the underlying causes of the phenomenon being explained. More work is needed to determine the admissible ways in which this gap can be filled. I use the explanations offered after Newton to indicate two different ways that non-causal factors can be significant for scientific explanation. In Newton's equilibrium explanation, only a few special features of the tides can be explained. A later explanation deploys a kind of harmonic analysis to provide an informative classification of the tides at different locations. I consider the options for making sense of these explanations.

Pitts, James Brian: Historical and Philosophical Insights about General Relativity and Space-time from Particle Physics             

Historians recently rehabilitated Einstein's "physical strategy" for GR. Independently, particle physicists similarly re-derived Einstein's equations for a massless spin 2 field. But why not a light massive spin 2, like Neumann-Seeliger? Massive gravities are bimetric, supporting conventionalism over geometric empiricism. Nonuniqueness lets one explain geometry via field equations, but not conversely. Massive gravity would have blocked Schlick's critique of Kant's synthetic a priori. Finally, c. 1970 a dilemma appeared: massive spin 2 gravity was unstable or empirically falsified. GR was vindicated, but later and on better grounds. Recently dark energy and theoretical progress have made massive spin 2 gravity viable.

Povich, Mark: Mechanisms and Model-Based fMRI         

Mechanistic explanations satisfy widely held norms of explanation: the ability to answer counterfactual questions and allowance for manipulation. A currently debated issue is whether any non-mechanistic explanations can satisfy these explanatory norms. Weiskopf (2011) argues that the models of object recognition and categorization, JIM, SUSTAIN, and ALCOVE, are not mechanistic, yet satisfy these norms of explanation. In this paper I will argue that these models are sketches of mechanisms. My argument will make use of model-based fMRI, a novel neuroimaging approach whose significance for current debates on psychological models and mechanistic explanation has yet to be explored.

Powers, John: Atrazine Research and Criteria of Characterizational Adequacy

The effects of atrazine on amphibians have been the subject of much research, requiring the input of many disciplines. Theory-reductive accounts of the relationships among scientific disciplines do not seem to characterize well the ways that diverse disciplines interact in the context of addressing such complex scientific problems. "Problem agenda" accounts of localized scientific integrations seem to fare better. However, problem agenda accounts have tended to focus rather narrowly on scientific explanation. Attention to the details of atrazine research reveals that characterization deserves the sort of attention that problem agenda theorists have thus far reserved for explanation.

Richardson, Sarah: The Concept of Gender Bias in Science            

This paper presents a definition and explication of "gender bias in science." Charges of gender bias in science are frequently misunderstood.  When persuaded that a particular case of gender bias is also a case of bias by the standards internal to the field, scientific communities have often been responsive to such charges.  However, many scientific researchers and research communities see gender bias in science as an extrascientific political, moral, or ethical problem rather than an epistemological failure.  Among its merits, the proposed "contextualist-attributional" account of the concept of gender bias in science helps to explain why some scientific communities find such charges incoherent or irrelevant and points to more strategic approaches to engagement with scientific communities by feminist science analysts.  This is shown by analysis of a case study of scientific response to a recent charge of gender bias in neuroendocrinology.

Rinard, Susanna: Imprecise Probability and Higher Order Vagueness     

The tripartite model of belief (belief / disbelief / suspension of judgment) is accurate, but unspecific.  The orthodox Bayesian (single-function) model is specific, but inaccurate (because we don't always have precise credences).  The set of functions model is an improvement, but faces a problem analogous to higher order vagueness.  Solving it, I argue, requires endorsing Insurmountable Unclassifiability, with a surprising consequence: no model can be both fully accurate and maximally specific.  What we can do, though, is improve on existing models.  I present a new model that is more specific than the tripartite model, and, unlike existing Bayesian models, perfectly accurate.

Robus, Olin: Does Science License Metaphysics?             

Naturalized metaphysicians defend the thesis that science licenses metaphysics, such that only metaphysical results that are based on the best science are to be considered legitimate. This view is problematic because the reasons they identify for such license are apparently self-defeating. Chakravartty (2013) defends a revised approach to understanding the licensing relation. I argue that the proposed response is a step forward on behalf of naturalizing metaphysics, but still does not take seriously the contention that science involves, inextricably, a contribution from the a priori. I conclude by considering what options the aspiring naturalized metaphysician is left with.

Rohwer, Yasha: Iterated Theory of Mind and the Evolution of Human Intelligence

Did human intelligence evolve via an arms-race style competition between conspecifics (Flinn et al. 2005) or through collective action (Sterelny 2007, 2012)? I argue that to critically compare these two models it is necessary to focus on the nature of the particular cognitive capacities predicted by the unique selective pressure proposed by each model. Focusing on theory of mind, I conclude that the competitive model makes predictions unsupported by empirical evidence and that the cooperative model better accounts for our current theory of mind. This result has interesting implications for the evolution of prosocial behavior.

Romero, Felipe:  Infectious Falsehoods                 

Published results influence subsequent research. False positives have a detrimental influence in the sense that they mislead scientists who rely on them. In some cases, false positives inspire large research programs and have a systematic influence on the community. I call this phenomenon epistemic infection. In this paper (Section 1) I characterize the phenomenon, and (Section 2) study conditions that increase the risk of contagion in scientific communities. Then (Section 3) I argue that infections are an effect of contingent and defective incentive structures of contemporary science. As a case study, I discuss the recent controversies in the social priming research program in social psychology.

Rosaler, Joshua: Is de Broglie-Bohm Theory Specially Equipped to Recover Classical Behavior?                  

Supporters of de Broglie-Bohm (dBB) theory argue that because the theory, like classical mechanics, concerns the motions of point particles in 3D space, it is specially suited to recover classical behavior. I offer a novel account of classicality in dBB theory, if only to show that such an account falls out almost trivially from results developed in the context of decoherence theory. I then argue that this undermines any special claim that dBB theory is purported to have on the unification of the quantum and classical realms.  

Roush, Sherrilyn: The Epistemic Superiority of Experiment to Simulation              

This paper defends the naive thesis that the method of experiment is epistemically superior to simulation, other things equal, a view that has been resisted by some philosophers writing about simulation. There are three challenges in defending this thesis. One is to say how "other things equal" can be defined, another to identify and explain the source of the epistemic advantage of experiment in a hypothetical comparison so defined. Finally, I must explain why this comparison matters, since it is not the type of situation scientists can expect often to face when they choose an experiment or a computer simulation.

Rubin, Hannah: The Phenotypic Gambit                 

The 'phenotypic gambit,' the assumption that we can ignore genetics and look at the fitness of phenotypes to determine the expected evolutionary dynamics of a population, is often used in evolutionary game theory. However, as this paper will show, an overlooked genotype to phenotype map can qualitatively affect dynamical outcomes in ways the phenotypic approach cannot predict or explain.

Runhardt, Rosa: Evidence for causal mechanisms in social science: recommendations from Woodward's manipulability theory of causation

In a backlash against the prevalence of statistical methods, social scientists have recently focused more on studying causal mechanisms. They increasingly rely on a technique called process-tracing, which involves contrasting the observable implications of several alternative mechanisms. Problematically, process-tracers do not commit to a fundamental notion of causation, and therefore arguably they cannot distinguish between mere correlation between the links of their purported mechanisms and genuine causation. In this paper, I argue that committing to Woodward's interventionist notion of causation would solve this problem: process-tracers should take into account evidence for possible interventions on the mechanisms they study.

Ruphy, Stephanie:  Which forms of limitation of the autonomy of science are epistemologically acceptable (and politically desirable)?             

This paper will investigate whether constraints on possible forms of limitation of the autonomy of science can be derived from epistemological considerations. Proponents of the autonomy of science often link autonomy with virtues such as epistemic fecundity, the capacity to generate technological innovations, and the capacity to produce neutral expertise. I will critically discuss several important epistemological assumptions underlying these links, in particular the "unpredictability argument". This will allow me to spell out conditions to be met by any form of limitation of the autonomy of science if it is to be epistemologically acceptable. These conditions can then be used as a framework to evaluate possible or existing forms of limitation of the autonomy of science. It will turn out that the option of direct public participation (a lively option in philosophy of science today) might not be the best way to democratize the setting of research agendas.

Rusanen, Anna-Mari: On Relevance        

Relevance is one of the key concepts in literature on modeling. However, the notion of relevance is often left unspecified. In this paper a sketch for a general notion of relevance is outlined. In addition, it will be argued that there are at least two interpretations of relevance available in the literature on modeling. The first one is ontic, and the second one is pragmatic.

Sample, Matthew: Stanford's Unconceived Alternatives from the Perspective of Epistemic Obligations               

Kyle Stanford's reformulation of the problem of underdetermination has the potential to highlight the epistemic obligations of scientists. Stanford, however, has presented the phenomenon of unconceived alternatives as a problem for realists, despite his critics' insistence that we have contextual explanations for scientists' inability to conceive of their successors' theories. I propose that the concept of "role oughts," as discussed by Richard Feldman, can help pacify Stanford's critics and reveal the broader relevance of the "new induction." The possibility of unconceived alternatives pushes us to question our contemporary expectation for scientists to reason outside of their historical moment.

Savitt, Steven:   I ♥ ♦s

Richard Arthur (2006) and Steven Savitt (2009) proposed that the present in (time-oriented) Minkowski spacetime should be thought of as a small causal diamond. That is, given two timelike separated events p and q, with p earlier than q, they suggest that the present (relative to those two events) is the set   I+(p) ∩ I-(q). Mauro Dorato (2011) presents three criticisms of this proposal. I rebut all three and then offer two more plausible criticisms of the Arthur/Savitt proposal. I argue that these criticisms also fail.  

Sebens, Charles: Killer Collapse: Empirically Probing the Philosophically Unsatisfactory Region of GRW               

GRW theory offers precise laws for the collapse of the wave function.  These collapses are characterized by two new constants, λ and σ. Recent work has put experimental upper bounds on the collapse rate, λ. Lower bounds on λ have been more controversial since GRW begins to take on a many-worlds character for small values of λ.  Here I examine GRW in this odd region of parameter space where collapse events act as natural disasters that destroy branches of the wave function along with their occupants.  Our continued survival provides evidence that we don't live in a universe like that.

Shavit, Ayelet: You Can't Go Home Again - or Can You? 'Replication' Indeterminacy and 'Location' Incommensurability in Three Biological Re-Surveys

Reproducing empirical results and repeating experimental processes are fundamental to science, but of grave concern to scientists. Revisiting the same location is necessary for tracking biological processes, yet I argue that 'location' and 'replication' contain a basic ambiguity. The analysis of the practical meanings of 'replication' and 'location' will disentangle incommensurability from its common conflation with empirical equivalence, underdetermination and indeterminacy of reference. In particular, I argue that three biodiversity re-surveys, conducted by the research institutions of Harvard, Berkeley, and Hamaarag, all reveal incommensurability without indeterminacy at the smallest spatial scale, and indeterminacy without incommensurability at higher scales.

Shen, Jian: Gradual Revelation: A Signaling Model           

Most models discussed in the sender-receiver literature inspired by Lewis and Skyrms assume that the sender has unchanging knowledge of the world. This paper explores the consequences of dropping that assumption and proposes a gradual revelation model in which the sender finds out about the world little by little. The model is then used to capture the distinction between indicative and imperative contents, which I argue it does better than traditional models.

Sheredos, Benjamin: Ontic accounts of explanation cannot support norms of generality and systematicity      

Recent attempts to unify the ontic and epistemic approaches to (mechanistic) explanation propose that we simply pursue epistemic and ontic norms in tandem. I aim to upset this armistice. There are epistemic norms of attaining general/systematic explanations which we cannot fulfill if we are constrained always to fulfill ontic norms of explanation. Put another way, (some) epistemic norms are autonomous of and in tension with ontic norms. Put a third way, radically distinct forms of explanation are required to fulfill ontic and (some) epistemic norms. A result is that some central arguments put forth by ontic theorists against epistemic theorists are revealed as not only question-begging, but ultimately self-defeating.

Skillings, Derek: Mechanistic Explanation of Biological Processes           

Biological processes are often explained by identifying the underlying mechanisms that generate a phenomenon of interest. I characterize a basic account of mechanistic explanation and then present three challenges to this account, illustrated with examples from molecular biology. The basic mechanistic account 1) is insufficient for explaining non-sequential and non-linear dynamic processes, 2) is insufficient for explaining the inherently stochastic nature of many biological mechanisms, and 3) fails to give a proper framework for analyzing organization. I suggest that biological processes are best approached as a multi-dimensional gradient--with some processes being paradigmatic cases of mechanisms and some processes being marginal cases.

Slater, Matthew: In Favor of the (Possible) Reality of Race           

Disputes over the reality of race — in particular, whether races are natural kinds — have often foundered on an inability to make clear the sense in which race can be biologically real and yet (also) socially constructed. I sketch an account of natural kinds that gives specific content to this possibility and argue for the tenability of treating races as genuine kinds on this account, even if they fail to be "biologically real".

Stanford, P. Kyle: Catastrophism, Uniformitarianism, and a Realism Dispute that Makes a Difference

In support of Stanford's problem of unconceived alternatives and against his critics, I argue that contemporary scientific communities may well be no better and are perhaps even substantially worse than their historical predecessors as vehicles for discovering, developing, and exploring fundamentally distinct alternatives to existing scientific theories.  I then argue that even recognizing the need to confront this question invites us to reconceive what is most fundamentally at issue in the debate concerning scientific realism in a way that ensures that the position we take in that debate actually makes some difference to how we conduct the scientific enterprise itself.

Stevens, Syman: The Dynamical Approach as Practical Geometry            

This essay introduces Harvey Brown and Oliver Pooley's 'dynamical approach' to special relativity and argues that it is best construed as a relationalist form of Einstein's 'practical geometry', according to which Minkowski geometrical structure supervenes upon the symmetries of the best-systems dynamical laws for a material world with primitive topological or differentiable structure. This construal of the dynamical approach is shown to be compatible with the related chapters of Brown's text, as well as recent descriptions of the dynamical approach by Pooley and others.

Stoeltzner, Michael: On Virtues and Vices of Axiomatic Quantum Field Theory 

I analyze the recent debate between Fraser and Wallace over whether axiomatic approaches to quantum field theory (AQFT) represent the preferred starting point for philosophers or whether conventional, Lagrangian-based quantum field theory (CQFT) scores better, not least because this research program provides testable models for particle physics while AQFT only delivers general insights. I argue that Fraser's underdetermination argument does not provide a good defense of AQFT and compare both research programs within a suitably modified Lakatosian context. This, however, requires a more opportunistic attitude towards the axiomatic method, one that is in line with the actual developments within mathematical physics.

Tabb, Kathryn: After Psychiatric Kinds: Diagnosis Specificity and Progress in Psychiatric Research         

The failure of psychiatry to validate its constructs is often attributed to the use of operational rather than etiological diagnostic criteria. Recently, however, the National Institute of Mental Health has proposed a new diagnosis of this failure: the problem is psychiatry's focus on validating psychiatric kinds rather than domains of functioning implicated in psychopathology. I support this view by arguing that behind psychiatry's failure to validate its nosology lies not a metaphysical problem but an epistemological one, which I call diagnosis specificity: the assumption that the diagnostic act correctly identifies the patient's condition as belonging to a homogeneous type that allows for ampliative inferences.

Theurer, Kari: More Information, Better Explanations: Reductionism in Biological Psychiatry  

I introduce a model of reduction in neuroscience inspired by Kemeny and Oppenheim's (1956) account, which was quickly dismissed as being too weak to support actual reductions. This view has been unjustifiably overlooked. I sketch a mechanistic model of Kemeny and Oppenheim reduction, on which reduction requires demonstrating that the reducing mechanism has an explanatory scope at least as great as the phenomenon to be reduced. In order to demonstrate that this model is already hard at work in current neuroscience, I draw upon recent reductionistic research in biological psychiatry concerning the etiology of schizophrenia.

Tulodziecki, Dana: Realist continuity, approximate truth, and the pessimistic meta-induction                 

The pessimistic meta-induction (PMI) seeks to undercut the realist's alleged connection between success and (approximate) truth by arguing that highly successful, yet wildly false, theories are typical of the history of science. Realist responses to the PMI try to rehabilitate this connection by stressing various kinds of continuity between earlier and later theories. Here, I argue that these extant realist responses are inadequate, by showing - through the example of the 19th century miasma theory of disease - that there are cases of genuinely successful, yet false, theories that do not exhibit any of the required realist continuities.

Vassend, Olav: Confirmation Measures and Sensitivity

Stevens (1946) draws a useful distinction between ordinal scales, interval scales, and ratio scales. Most recent discussions of confirmation measures have proceeded on the ordinal level of analysis. In this paper, I give a more quantitative analysis. In particular, I show that the requirement that our desired confirmation measure be at least an interval measure naturally yields necessary conditions that jointly entail the log-likelihood measure. Thus I conclude that the log-likelihood measure is the only good candidate interval measure.
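
For orientation (the formula is the standard one, not quoted from the paper), the log-likelihood measure referred to is

\[ l(H, E) = \log \frac{P(E \mid H)}{P(E \mid \neg H)}, \]

which is positive when E confirms H, negative when E disconfirms H, and zero when E is evidentially neutral.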

Vorms, Marion: Spatial representations in science: towards a typology               

We have an intuitive idea of the distinction between "images" and linguistic representations. Together with this intuitive distinction comes the (intuitive) claim that images are less "abstract", by enabling us to "visualize" objects, relations, or processes. Spelling out this distinction and the associated claims regarding abstractness and visualization, however, is far from a trivial task. Acknowledging that non-linguistic representations play an essential role in scientific theorizing, I aim at contributing to this enterprise by distinguishing two broad types of spatial representations, and highlighting the different sorts of theorizing and abstraction processes associated with these different types.

Wagenknecht, Susann: A double notion of knowing and knowledge         

This paper addresses the conceptual gap between knowledge notions in general epistemology and philosophy of science. It suggests a double notion of individual knowing (as a form of believing) and collaboratively created knowledge (as discursive content), and highlights the dis-/continuity of knowing and knowledge thus understood. Thereby, it contributes to social epistemology's discussion of collective knowledge and generates novel questions in the analysis of collaborative scientific practice. 

Walsh, Kirsten: Phenomena in Newton's Principia           

Newton described his Principia as a work of 'experimental philosophy', where theories were deduced from phenomena.  He introduced six 'phenomena': propositions describing patterns of motion, generalised from astronomical observations.  However, these don't fit Newton's contemporaries' definitions of 'phenomenon'.  Drawing on Bogen and Woodward's (1988) distinction between data, phenomena and theories, I argue that Newton's 'phenomena' were explanatory targets drawn from raw data.  Viewed in this way, the phenomena of the Principia and the experiments from the Opticks were different routes to the same end: isolating explananda.

Walsh, Elena: Top-Down 'Causation' and Developmental Explanation   

In recent years developmental psychologists have begun to describe intentional attitudes (such as emotions) as 'emergent properties' of complex systems.  Emergent properties are often thought to exert a 'top-down' constraint or causal influence on their physical realisers.  This paper argues that the notion of top-down constraint plays an essential role in developmental explanation.  It also employs this notion to suggest that folk psychology and neuroscience should be viewed as dependent and complementary explanatory modes.

Werndl, Charlotte, and Frigg, Roman: Rethinking Boltzmannian Equilibrium

Boltzmannian statistical mechanics partitions the phase space into macroregions, and the largest of these is identified with equilibrium. What justifies this identification? Common answers focus on Boltzmann's combinatorial argument, the Maxwell-Boltzmann distribution, and maximum entropy considerations. We argue that they fail and present a new answer. We characterise equilibrium as the macrostate in which the system spends most of its time and prove a new theorem establishing that equilibrium thus defined corresponds to the largest macroregion. Our derivation is completely general in that it does not rely on assumptions about the system's dynamics or internal interactions.

Wiegman, Isaac: Evidential Criteria of Homology: Adjudicating Competing Homology Claims     

While the homology concept has taken on importance in thinking about the nature of psychological kinds (e.g. Griffiths 1997), no one has shown how comparative psychological and behavioral evidence can distinguish between competing homology claims. I adapt the operational criteria of homology to accomplish this. I consider two competing homology claims that compare human anger with putative aggression systems of nonhuman animals, and demonstrate the effectiveness of the criteria in adjudicating between these claims.

Woody, Andrea: Re-orienting Discussions of Scientific Explanation: A Functional Perspective 

Most literature on scientific explanation presumes proper analysis rests at the level of individual explanations, but there are other options. Shifting focus from explanations, as achievements, toward explaining, as a coordinated activity of communities, the functional perspective aims to reveal how the practice of explanatory discourse functions within scientific communities.  Here I outline the functional perspective and argue that it reveals an important methodological role for explanation in science, which consequently provides resources for developing more adequate responses to traditional concerns, including how best to conceive of explanatory power as a theoretical virtue. Consideration of the ideal gas law grounds the discussion.

Wright, Jake: The Moral of the Story: What Does the Evolutionary Contingency Thesis Teach Us About Biological Laws?               

Beatty's [1995] Evolutionary Contingency Thesis has generated a number of responses. I examine three (Brandon [1997], Sober [1997] and Mitchell [2003]) and present a synthesized response to Beatty based primarily on Mitchell. Like Mitchell, I argue Beatty's thesis shows we should view laws pragmatically. Contra Mitchell, I argue this pragmatic view must maintain distinctions between types of lawful generalization. My argument responds to Mitchell's concern that disciplines seeking naturally necessary generalizations will be privileged over disciplines that do not. Because different types of generalization aim at different goals, we will favor naturally necessary generalizations only if they can achieve goals no other generalization could achieve.

Wuethrich, Adrian: The Higgs Discovery as a Diagnostic Causal Inference

I describe how the discovery of elementary particles, such as the Higgs boson, is a case of causal inference. The case illustrates how Lipton's challenge of inferred differences can be met even in a paradigm case of ``unobservable'' causes whose mere existence has to be established. The view of the discovery of elementary particles as a causal inference has several likely consequences, most of them attractive, concerning the role of theory and the problems of selection bias and unconceived alternatives.

Zautra, Nicholas: Embodiment, Interaction, and Experience: Toward a Comprehensive Model in Addiction Science       

Current models attempt to specify how addiction is developed, how it is maintained, and how people can recover from it. In this paper, I explain why none of these theories can be accepted as a comprehensive model of addiction. I argue that current models fail to account for differences in embodiment, interaction processes, and the experience of addiction. To redress these limiting factors, I design a proposal for an enactive account of addiction that complements the enactive model of autism proposed by Hanne De Jaegher.

Zednik, Carlos: Are Systems Neuroscience Explanations Mechanistic? 

Whereas most branches of neuroscience are thought to provide mechanistic explanations, systems neuroscience is not. Two reasons are typically cited in support of this conclusion. First, systems neuroscientists rarely, if ever, rely on the dual strategies of decomposition and localization. Second, they typically emphasize organizational properties over the properties of individual components. In this paper, I argue that neither reason is conclusive: researchers might rely on alternative strategies for mechanism discovery, and focusing on organization is often appropriate and consistent with the norms of mechanistic explanation. Thus, many explanations in systems neuroscience can also be viewed as mechanistic explanations.

Zhang, Jiji: Likelihood and Consilience: On Forster's Counterexamples to the Likelihood Theory of Evidence     

Forster presented some interesting examples having to do with distinguishing the direction of causal influence between two variables, which he argued are counterexamples to the likelihood theory of evidence (LTE). In this paper, we refute Forster's arguments by carefully examining one of the alleged counterexamples. We argue that the example is not convincing as it relies on dubious intuitions that likelihoodists have forcefully criticized. More importantly, we show that contrary to Forster's contention, the consilience-based methodology he favored is accountable within the framework of the LTE.
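
For orientation, the core commitment of the likelihood theory of evidence is standardly expressed by the Law of Likelihood (the textbook formulation, not quoted from the paper):

\[ E \text{ favors } H_1 \text{ over } H_2 \iff P(E \mid H_1) > P(E \mid H_2), \]

with the likelihood ratio P(E | H_1) / P(E | H_2) measuring the degree of favoring.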

Zheng, Robin: Responsibility, Causality, and Social Inequality 

I explore the intertwined philosophical and social scientific research on the fundamental attribution error and causal attributions for poverty in the United States. I expose the way in which what appear to be empirical disputes about causes turn out to be fundamentally political and moral disagreements based on normative expectations about the distribution of powers and social roles that could have prevented an event or state. Thus, moral philosophers who work to reshape normative expectations also play a role in restructuring causal explanations, and hence interventions, for problems like poverty and social inequality.

Zollman, Kevin: The handicap principle is an artifact     

The handicap principle is one of the most influential ideas in evolutionary biology. It asserts that when there is a conflict of interest in a signaling interaction, signals must be costly in order to be reliable. We show how the handicap principle is a limiting case of honest signaling, which can also be sustained by other mechanisms. This fact has gone unnoticed because in evolutionary biology it is a common practice to distinguish between cues, indexes and fakable signals, where cues provide information but are not signals and indexes are signals that cannot be faked. We find that the dichotomy between indexes and fakable signals is an artifact of the existing signaling models. Our results suggest that one cannot adequately understand signaling behavior by focusing solely on cost. Under our reframing, cost becomes one---and probably not the most important---of a collection of factors preventing deception.