Abstracts of Contributed Papers
 


"Do We See Through a Social Microscope?: Credibility as Vicarious Selector"
Douglas Allchin, University of Texas at El Paso

 Abstract:
Credibility in a scientific community (sensu Shapin) is a vicarious selector (sensu Campbell) for the reliability of reports by individual scientists or institutions. Similarly, images from a microscope (sensu Hacking) are vicarious selectors for studying specimens. Working at different levels, the process of indirect reasoning and checking indicates a unity to experimentalist and sociological perspectives, along with a resonance of strategies for assessing reliability.
 
 

"The Dogma of Isomorphism: A Case Study FromSpeech Perception"
Irene Appelbaum, University of Montana

 Abstract:
It is a fundamental tenet of philosophy of the "special sciences" that an entity may be analyzed at multiple levels of organization. As a corollary, it is often assumed that the levels into which a system may be theoretically analyzed map straightforwardly onto real stages of processing. I criticize this assumption in a case study from the domain of speech science. I argue (i) that the dominant research framework in speech perception embodies the assumption that units of processing mirror units of conceptual structure, and (ii) that this assumption functions not as a falsifiable hypothesis, but as an entrenched dogma.

 

"The Curve Fitting Problem"
Prasanta S. Bandyopadhyay and Robert J. Boik, Montana State University

 Abstract:
In the curve fitting problem two conflicting desiderata, simplicity and goodness-of-fit, pull in opposite directions. To solve this problem, two proposals are discussed: the first based on a Bayes' theorem criterion (BTC), and the second, advocated by Forster and Sober, based on Akaike's Information Criterion (AIC). We show that AIC, which is frequentist in spirit, is logically equivalent to BTC, provided that a suitable choice of priors is made. We evaluate the charges against Bayesianism and contend that the AIC approach has shortcomings. We also discuss the relationship between Schwarz's Bayesian Information Criterion and BTC.
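For reference, the two information criteria named here have standard closed forms (these are the textbook statements, not the authors' BTC construction):

$$\mathrm{AIC} = -2\ln L(\hat{\theta}) + 2k, \qquad \mathrm{BIC} = -2\ln L(\hat{\theta}) + k\ln n$$

where $L(\hat{\theta})$ is the maximized likelihood, $k$ the number of adjustable parameters, and $n$ the sample size. Both trade goodness-of-fit against simplicity; they differ in how the complexity penalty scales with sample size, which is one source of the disagreements the paper examines.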

 

"Bell's Theorem, Non-Separability and Space-TimeIndividuation in Quantum Mechanics"
Darrin W. Belousek, University of Notre Dame

 Abstract:
We first examine Howard's analysis of the Bell factorizability condition in terms of "separability" and "locality," and then consider his claims that the violations of Bell's inequality by the statistical predictions of quantum mechanics should be interpreted in terms of "non-separability" rather than "non-locality," and that "non-separability" implies the failure of space-time as a principle of individuation for quantum-mechanical systems. I find his arguments for both claims to be lacking.

 

"Why Physical Symmetries?"
Elena Castellani, University of Florence

 Abstract:
This paper is concerned with the meaning of "physical symmetries," i.e. the symmetries of the so-called laws of nature. The importance of such symmetries in present-day science raises the question of understanding their very nature. After a brief review of the relevant physical symmetries, I first single out and discuss their most significant functions in today's physics. Then I explore the possible answers to the interpretation problem raised by the role of symmetries, and I argue that investigating the real nature of physical symmetries implies, in some sense, discussing the meaning and methods of physics itself.

 

"Problems with the Deductivist Image of Scientific Reasoning"
Philip E. Catton, University of Canterbury

 Abstract:
There seem to be some very good reasons for a philosopher of science to be a deductivist about scientific reasoning. Deductivism is apparently connected with a demand for clarity and definiteness in the reconstruction of scientists' reasonings. And some philosophers even think that deductivism is the way round the problem of induction. But the deductivist image is challenged by cases of actual scientific reasoning, in which hard-to-state and thus discursively ill-defined elements of thought nonetheless significantly condition what practitioners accept as cogent argument. And arguably, these problem cases abound. For example, even geometry, for most of its history, was such a problem case, despite its exactness and rigor. It took a tremendous effort on the part of Hilbert and others to make geometry fit the deductivist image. Looking to the empirical sciences, the problems seem worse. Even the most exact and rigorous of empirical sciences --mechanics-- is still the kind of problem case which geometry once was. In order for the deductivist image to fit mechanics, Hilbert's sixth problem (for mechanics) would need to be solved. This is a difficult, and perhaps ultimately impossible, task, in which the success so far achieved is very limited. I shall explore some consequences of this for realism as well as for deductivism. Through discussing links between non-monotonicity, skills, meaning, globality in cognition, models, scientific understanding, and the ideal of rational unification, I argue that deductivists can defend their image of scientific reasoning only by trivializing it, and that for the adequate illumination of science, insights from anti-deductivism are needed as much as those which come from deductivism.

 

"Are GRW Tails as Bad as They Say?"
Alberto Cordero, Graduate Center and Queens College, CUNY

 Abstract:
GRW models of the physical world are severely criticized in the literature for involving wave function 'tails', the allegation being that the latter create fatal interpretive problems and even compromise standard arithmetic. I find such objections both unfair and misguided. But not all is well with the GRW approach. The complaint I articulate in this paper does not have to do with tails as such but with the specific way in which past physical structures linger forever in the total GRW wave function. I argue that this feature, which is an artifact of a particular and ultimately optional genre of collapse mechanisms, yields a total picture that is too close to the "Many Worlds" type to deserve clear methodological acceptability, particularly in light of the effective empirical equivalence between the two objectivist approaches.

 

"Why Bayesian Psychology is Incomplete"
Frank Döring, University of Cincinnati

 Abstract:
Bayesian psychology, in what is perhaps its most familiar version, is incomplete: Jeffrey conditionalization is sensitive to the order in which the evidence arrives. This order effect can be so pronounced as to call for a belief adjustment that cannot be understood as an assimilation of incoming evidence by Jeffrey's rule. Hartry Field's reparameterization of Jeffrey's rule avoids the order effect but fails as an account of how new evidence should be assimilated.
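For reference, Jeffrey's rule in its standard form (the order effect arises because successive applications of the rule need not commute):

$$P_{\text{new}}(A) = \sum_i P_{\text{old}}(A \mid E_i)\, P_{\text{new}}(E_i)$$

where $\{E_i\}$ is a partition of the evidence propositions and the $P_{\text{new}}(E_i)$ are the exogenously shifted probabilities produced by the learning experience.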

 

"The Conserved Quantity Theory of Causation and ChanceRaising"
Phil Dowe, University of Tasmania

 Abstract:
In this paper I consider a problem for the Conserved Quantity Theory of causation, in its most recent version (Salmon 1997). The problem concerns cases where an event, which tends to prevent another, fails to on a particular occasion, and where the two are linked by causal processes. I call this the case of "connected non-causes." Chance-raising theories of causation, on the other hand, do account easily for this problem, but face the problem of chance-lowering causes. I show how the two approaches can be combined to solve both problems.

 

"Laudan's Naturalistic Axiology"
Karyn Freedman, University of Toronto

 Abstract:
Doppelt (1986, 1990), Siegel (1990) and Rosenberg (1996) argue that the pivotal feature of Laudan's normative naturalism, namely his axiology, lacks a naturalistic foundation. In this paper I show that this objection is ill founded, and turns on an ambiguity in the notion of 'naturalism.' Specifically, I argue that there are two important senses of naturalism running through Laudan's work. Once these two strands are made explicit, the objection raised by Doppelt et al. simply evaporates.

 

"Is Pure R-Selection Really Selection?"
Bruce Glymour, Kansas State University

 Abstract:
Lennox and Wilson (1994) critique modern accounts of selection on the grounds that such accounts will class evolutionary events as cases of selection whether or not the environment checks population growth. Lennox and Wilson claim that pure r-selection involves no environmental checks, and that accounts of natural selection ought to distinguish between the two sorts of cases. I argue that Lennox and Wilson are mistaken in claiming that pure r-selection involves no environmental checks, but suggest that two related cases support their substantive complaint, namely that modern accounts of selection have resources insufficient for making important distinctions in causal structure.

 

"Explanatory Pluralism in Paleobiology"
Todd A. Grantham, College of Charleston

 Abstract:
This paper is a defense of "explanatory pluralism" (i.e., the view that some events can be correctly explained in two distinct ways). To defend pluralism, I argue that a certain class of macroevolutionary trends (what I call "asymmetrical passive trends") can be explained in two distinct but compatible ways. The first approach ("actual sequence explanation") is to trace out the particular forces that affect each species. The second approach treats the trend as "passive" or "random" diffusion from a boundary in morphological space. I argue that while these strategies are distinct, both kinds of explanation can be true of a single trend. Further, since neither strategy can be reduced or eliminated from paleobiology, we should accept that both strategies can provide correct explanations for a single trend.

 

"Theories as Complexes of Representational Media"
Robin F. Hendry, University of Durham, and
Stathis Psillos, The London School of Economics

 Abstract:
In this paper, we review two standard analyses of scientific theories in terms of linguistic and nonlinguistic structures, respectively. We show that by focusing exclusively on either linguistic or extra-linguistic representational media, both of the standard views fail to capture correctly the complex nature of scientific theories. We argue primarily against strong versions of the two approaches and suggest that the virtues of their weaker versions can be brought together under our own interactions approach. As historical individuals, theories are complex consortia of different representational media: words, equations, diagrams, analogies and models of different kinds. To replace this complexity with a monistic account is to ignore the representational diversity that characterizes even the most abstract physical theory.
 

"Helmholtz's Naturalized Conception of Geometryand his Spatial Theory of Signs"
David J. Hyder, Max-Planck-Institut für Wissenschaftsgeschichte

 Abstract:
I analyze the two main doctrines of Helmholtz's "The Facts in Perception," in which he argued that the axioms of Euclidean geometry are not, as his neo-Kantian opponents had argued, binding on any experience of the external world. This required two argumentative steps: (1) a new account of the structure of our representations which was consistent both with the experience of our (for him) Euclidean world and with experience of a non-Euclidean one, and (2) a demonstration of how geometric and mathematical propositions derived not from Kantian intuition, but from the contingent natures of the representations themselves. I show how (1) and (2) together provided a naturalized "physiological epistemology," a view that was adopted in many of its essential features by Einstein, Wittgenstein and the Vienna Circle.

 

"Use-Novelty, Gellerization, and Severe Tests"
Tetsuji Iseda, University of Maryland, College Park

 Abstract:
This paper analyzes Deborah Mayo's recent criticism of the use-novelty requirement. She claims that her severity criterion captures actual scientific practice better than use-novelty, and that use-novelty is not a necessary condition for severity. Even though she is right that there are certain cases in which evidence used for the construction of the hypothesis can test the hypothesis severely, I do not think that her severity criterion fits better with our intuition about good tests than use-novelty. I argue for this by showing a parallelism in terms of severity between the confidence interval case and what she calls "gellerization." To account for the difference between these cases, we need to take into account certain additional considerations, like a systematic neglect of relevant alternatives.

 

"A Note on Nonlocality, Causation and Lorentz-Invariance"
Federico Laudisa, University of Florence

 Abstract:
The status of a causal approach to EPR-Bell nonlocal correlations in terms of a counterfactual theory of causation is reviewed. The need to take into due account the spacetime structure of the events involved is emphasized. Furthermore, it is argued that adopting this approach entails the assumption of a privileged frame of reference, an assumption that seems even more in need of justification than the causal theory itself.

 

"Explaining the Emergence of Cooperative Phenomena"
Chuang Liu, University of Florida

 Abstract:
Phase transitions, such as spontaneous magnetization in ferromagnetism, are the most fundamental cooperative phenomena, in which long-range orders emerge in a system under special conditions. Unlike collective phenomena, which are mere collections of processes (or actions) mostly accountable by statistical averages of the micro-constituents' properties, they are results of genuine cooperation in which unique properties emerge for the whole system that are not such simple averages. In this paper I investigate what the problem of phase transitions is, how it is solved by a mathematical maneuver, i.e. taking the thermodynamic limit, and whether the solution is sound and rigorous as claimed.

 

"Van Fraassen and Ruetsche on Preparation and Measurement"
Bradley Monton, Princeton University

 Abstract:
Ruetsche (1996) has argued that van Fraassen's (1991) Copenhagen Variant of the Modal Interpretation (CVMI) gives unsatisfactory accounts of measurement and of state preparation. I defend the CVMI against Ruetsche's first argument by using decoherence to show that the CVMI does not need to account for the measurement scenario which Ruetsche poses. I then show, however, that there is a problem concerning preparation, and the problem is more serious than the one Ruetsche focuses on. The CVMI makes no substantive predictions for the everyday processes we take to be measurements.

 

"Applying Pure Mathematics"
Anthony Peressini, Marquette University

 Abstract:
Much of the current thought concerning mathematical ontology and epistemology follows Quine and Putnam in looking to the indispensable application of mathematics in science. In particular, the Quine/Putnam indispensability approach is the inevitable staging point for virtually all contemporary discussions of mathematical ontology. Just recently serious challenges to the indispensability approach have begun appearing. At the heart of this debate is the notion of an indispensable application of (pure) mathematics in scientific theory. To date the discussion has focused on indispensability, while little has been said about the process of application itself. In this paper I focus on the process of applying (pure) mathematical theory in physical theory.

 

"Functional and Intentional Action Explanations"
Mark Risjord, Emory University

 Abstract:
Functional explanation in the social sciences is the focal point for conflict between individualistic and social modes of explanation. While the agent may have thought she had reasons for action, the functional explanation reveals the hidden strings of the puppet master. This essay argues that the conflict is merely apparent. The erotetic model of explanation is used to analyze the forms of intentional action and functional explanations. One result is that there are two kinds of functional explanation, and only one is appropriate in the social sciences. While a functional explanation may have the same topic as an intentional action explanation, they are compatible.

 

"What Should a Normative Theory of Values in ScienceAccomplish?"
Kristina Rolin, University of Helsinki

 Abstract:
This paper compares two different views about the role of values in scientific judgment, one proposed by Ernan McMullin and the other by Helen Longino. I argue first that McMullin's distinction between epistemic and non-epistemic values is based on a problematic understanding of the goals of science. Second, I argue that Longino offers a more adequate understanding of the goals of science but her concept of constitutive value is not sufficiently normative. Third, I conclude that a normative theory of values in science should address the normative concerns central to McMullin's work while integrating the more complex understanding of the goals of science emerging from Longino's work.

 

"Changing the Subject: Redei on Causal Dependenceand Screening Off in Relativistic Quantum Field Theory"
Laura Ruetsche and Rob Clifton, University of Pittsburgh

 Abstract:
In a recent pair of articles (Redei 1996, 1997), Miklos Redei has taken enormous strides toward characterizing the conditions under which relativistic quantum field theory is a safe setting for the deployment of causal talk. Here, we challenge the adequacy of the accounts of causal dependence and screening off on which rests the relevance of Redei's theorems to the question of causal good behavior in the theory.
 

"Visual Prototypes"
Pauline Sargent, University of California, San Diego

 Abstract:
In this paper I introduce the concept of "visual prototype" to capture one particular kind of work done by visual representation in the practice of science. I use an example from neuroscience; more particularly, the example of the visual representation of the pattern of convolutions (the gyri and sulci) of the human cortex as this pattern is constructed and used in the practice of mapping the functional human brain. I argue that this work, which is visual representation as distinct from linguistic representation, is an essential piece of the human brain mapping project.

 

"Selection and the Extent of Explanatory Unification"
Rob Skipper, University of Maryland at College Park

 Abstract:
According to Philip Kitcher, scientific unification is achieved via the derivation of numerous scientific statements from economies of argument schemata. I demonstrate that the unification of selection phenomena across the domains in which it is claimed to occur, evolutionary biology, immunology and, speculatively, neurobiology, is unattainable on Kitcher's view. I then introduce an alternative method for rendering the desired unification based on the concept of a mechanism schema which can be integrated with Wesley Salmon's causal-mechanical model of explanation. I conclude that the gain in unification provided by the alternative account suggests that Kitcher's view is defective.

 

"Rhetoric, Narrative and Argument in Bruno Latour'sScience in Action"
David G. Stern, University of Iowa

 Abstract:
Why does Latour's Science in Action, which approaches science rhetorically, highlighting the persuasive and political work that must be done to establish a scientific or technological fact, not examine its own rhetoric? Latour acknowledges he is making use of the persuasive techniques he attributes to science, but to submit his own techniques to the demythologizing scrutiny he directs at philosophy of science would undercut his primary goal of recruiting readers to his colors. The paper scrutinizes three central figures in Science in Action: science as war, as network, and as Janus-faced, and examines its disciplinary and biographical context.

 

"The Failure of Equivariance for Real Ensemblesof Bohmian Systems"
Jitendra Subramanyam, University of Maryland

 Abstract:
The continuity equation has long been thought to secure the equivariance of ensembles of Bohmian systems. The statistical postulate then secures the empirical adequacy of Bohm's theory. This is true for ideal ensembles, those represented as continuous fluids in configuration space. But ensembles that confirm the experimental predictions of quantum mechanics are not ideal. For these non-ideal ensembles neither equivariance nor some approximation to equivariance follows from the axioms of Bohm's theory. Approximate equivariance (and hence the empirical adequacy of Bohm's theory) requires adding to Bohm's axioms or giving a detailed account of the behavior of Bohmian ensembles.
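The equivariance property at issue can be stated compactly (a standard textbook formulation, not the author's own): if the ensemble distribution $\rho$ equals $|\psi|^2$ at some initial time, the quantum continuity equation

$$\frac{\partial |\psi|^2}{\partial t} + \nabla \cdot \big( |\psi|^2\, v^{\psi} \big) = 0$$

guarantees $\rho(t) = |\psi(t)|^2$ at all later times, where $v^{\psi}$ is the Bohmian velocity field. The paper's claim is that this guarantee holds for idealized continuous-fluid ensembles, not automatically for the finite ensembles that actually confirm quantum predictions.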

 

"Reconsidering the Concept of Equilibrium in ClassicalStatistical Mechanics"
Janneke van Lith-van Dis, Utrecht University

 Abstract:
In the usual procedure of deriving equilibrium thermodynamics from classical statistical mechanics, Gibbsian fine-grained entropy is taken as the analog of thermodynamical entropy. However, it is well known that the fine-grained entropy remains constant under the Hamiltonian flow. In this paper it is argued that we needn't search for alternatives to fine-grained entropy, nor do we have to give up Hamiltonian dynamics, in order to solve the problem of the constancy of fine-grained entropy and, more generally, account for the non-equilibrium part of the laws of thermodynamics. Rather, we have to weaken the requirement that equilibrium be identified with a stationary probability distribution.
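For reference, the constancy claim can be put in standard notation (not the author's own): the fine-grained Gibbs entropy is

$$S_G[\rho] = -k \int \rho \ln \rho \, d\Gamma,$$

and since the Liouville equation $\partial\rho/\partial t = \{H, \rho\}$ preserves phase-space volume, $S_G$ is constant in time; a stationary equilibrium distribution is one with $\{H, \rho\} = 0$.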

 

"Who's Afraid of Undermining? Why the PrincipalPrinciple Need Not Contradict Humean Supervenience"
Peter B. Vranas, University of Michigan

 Abstract:
The Principal Principle (PP) says that, for any proposition A, given any admissible evidence and the proposition that the chance of A is x%, one's conditional credence in A should be x%. Humean Supervenience (HS) claims that, among possible worlds like ours, no two differ without differing in the spacetime-point-by-spacetime-point arrangement of local properties. David Lewis (1986b, 1994) has argued that PP contradicts HS, and his argument has been accepted by Bigelow, Collins, and Pargetter (1993), Thau (1994), Hall (1994), Strevens (1995), Ismael (1996), and Hoefer (1997). Against this consensus, I argue that PP need not contradict HS.
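In the usual Lewisian notation, PP reads

$$Cr(A \mid X \wedge E) = x,$$

where $X$ is the proposition that the chance of $A$ is $x$, $E$ is any admissible evidence, and $Cr$ is a reasonable initial credence function.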

 

"Reasoning With the 'Rosetta Stones' of Biology:Experimental Systems and Doable Research"
Kevin Lattery, University of Minnesota

 Abstract:
I compare the Rosetta Stone's function of making hieroglyphics research "doable" with the use of experimental systems. Although biologists routinely recognize the importance of these systems, philosophers have not. One reason for this is that philosophers typically conceive experimental tools as mere instruments for gathering evidence. I show why we need an account of experimental tools that conceives them beyond the context of experiments and evidential reasoning. More specifically, I describe an investigative reasoning with experimental systems to establish and extend "doable research" and argue that this investigative reasoning is central to justifying and structuring our knowledge in experimental biology.

 

"The Limited World of Science: A Tractarian Accountof Objective Knowledge"
Alfred Nordmann, University of South Carolina

 Abstract:
According to Wittgenstein's Tractatus, scientists determine the structure of the world by identifying the true propositions which describe it. This determination is possible because i) elementary propositions are contingent, i.e., their truth depends on nothing but agreement with what is the case, ii) completely general propositions delimit the degree of freedom which the totality of elementary propositions leaves to the structure of the world. Once such completely general propositions are adopted (e.g., the principle of conservation of weight), agreement among scientists can reduce to agreement among representations. On this account, Lavoisier's chemistry better promotes objective knowledge than Priestley's.

 

"The Likelihood Principle and the Reliability ofEvidence"
Andrew Backe, University of Pittsburgh

 Abstract:
The leading philosophical account of statistical inference relies on a principle which implies that only the observed outcome from an experiment matters to an inference. This principle is the likelihood principle, and it entails that information about the experimental arrangement itself is of no inferential value. In the present paper, I argue that adherence to the likelihood principle results in an account of inference that is insensitive to the capacity of observed evidence to distinguish a genuine phenomenon from a chance effect. This capacity, which is referred to as the reliability of the evidence, is dependent on the error properties of the experimental arrangement itself. Error properties incorporate outcomes not actually observed and, thus, cannot always be captured by likelihood functions. If my analysis is correct, then the central principle underlying the leading philosophical theory of inference should be rejected.
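A standard formulation of the principle under attack (textbook form, not the author's wording): if two experimental outcomes yield proportional likelihood functions,

$$P_1(x_1 \mid \theta) = c \, P_2(x_2 \mid \theta) \quad \text{for all } \theta,$$

then $x_1$ and $x_2$ have the same evidential import regarding $\theta$; stopping rules and other features of the experimental arrangement drop out as inferentially irrelevant.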
 
 
 

"Objects or Events? Towards an Ontology for QuantumField Theory"
Andreas Bartels, Universität Gesamthochschule Paderborn

 Abstract:
Recently P. Teller and S. Auyang have suggested competing ersatz-ontologies which could account for the "loss of classical particles" in Quantum Field Theory (QFT): field quanta vs. field events. However, both ontologies suffer from serious defects. While quanta lack numerical identity, spatiotemporal localizability, and independence from basic representations, events, if understood as concrete measurement events, are related to the theory only statistically. I propose an alternative solution: the entities of QFT are events of the type "Quantum system S is in quantum state Y." The latter are not point events, but Davidsonian events, i.e. they can be identified by their location inside the causal net of the world.
 
 
 

"No One Knows the Date of the Hour: An UnorthodoxApplication of Rev. Bayes' Theorem"
Paul Bartha, University of British Columbia, and
Christopher Hitchcock, Rice University

 Abstract:
Carter and Leslie (1996) have argued, using Bayes' theorem, that our being alive now supports the hypothesis of an early "Doomsday." Unlike some critics (Eckhardt 1997), we accept their argument in part: given that we exist, our existence now indeed favors "Doom sooner" over "Doom later." The very fact of our existence, however, favors "Doom later." In simple cases, a hypothetical approach to the problem of "old evidence" shows that these two effects cancel out: our existence now yields no information about the coming of Doom. More complex cases suggest a move from countably additive to non-standard probability measures.
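The Bayesian shift Carter and Leslie invoke can be illustrated with a two-hypothesis sketch (the numbers here are hypothetical, chosen only for illustration). With equal priors on total human populations $N_s = 10^{11}$ ("Doom sooner") and $N_l = 10^{13}$ ("Doom later"), and one's birth rank $r$ treated as uniform over the total population, Bayes' theorem gives

$$P(\text{soon} \mid r) = \frac{1/N_s}{1/N_s + 1/N_l} \approx 0.99,$$

which is the "Doom sooner" effect that the authors argue is offset by the evidential force of existing at all.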

 
 

"Can Experiments Help Us Choose Between the Bohmand Copenhagen Interpretations of Quantum Mechanics?"
Lon Becker, University of Illinois at Chicago

 Abstract:
In this paper I will look at whether experiments can help us decide between the Bohm and Copenhagen interpretations of quantum mechanics. I will look at experiments which assume that the two interpretations are empirically indistinguishable but highlight features that might lead us to favor one interpretation over the other. I will argue that such experiments are interesting but ultimately not convincing. I will then sketch an experiment to suggest how it might be possible to distinguish between the two interpretations by focusing on the presence or absence of collapse.

 
 

"Empiricism, Conservativeness and Quasi-Truth"
Otavio Bueno, University of Leeds

 Abstract:
A first step is taken towards articulating a constructive empiricist philosophy of mathematics, thus extending van Fraassen's account to this domain. In order to do so, I adapt Field's nominalisation programme, making it compatible with an empiricist stance. Two changes are introduced: (a) Instead of taking conservativeness as the norm of mathematics, the empiricist countenances the weaker notion of quasi-truth (as formulated by da Costa and French), from which the formal properties of conservativeness are derived. (b) Instead of quantifying over space-time regions, the empiricist only admits quantification over occupied regions, since this is enough for his or her needs.

 

"Organization, Evolution and Cognition: BeyondCampbell's Evolutionary Epistemology"
Wayne Christensen and Clifford Hooker, University of Newcastle

 Abstract:
Donald Campbell has long advocated a naturalist epistemology based on a general selection theory, with the scope of knowledge restricted to vicarious adaptive processes. But being a vicariant is problematic because it involves an unexplained epistemic relation. We argue that this relation is to be explicated organizationally in terms of the regulation of behavior and internal state by the vicariant, but that Campbell's selectionist account can give no satisfactory account of it because it is opaque to organization. We show how organizational constraints and capacities are crucial to understanding both evolution and cognition and conclude with a proposal for an enriched, generalized model of evolutionary epistemology that places high-order regulatory organization at the center.

 
 

"The Analysis of Singular Spacetimes"
Erik Curiel, University of Chicago

 Abstract:
Much controversy surrounds the question of what ought to be the proper definition of 'singularity' in general relativity, and the question of whether the prediction of such entities leads to a crisis for the theory. I argue that a definition in terms of curve incompleteness is adequate, and in particular that the idea that singularities correspond to 'missing points' has insurmountable problems. I conclude that singularities per se pose no serious problems for the theory, but their analysis does bring into focus several problems of interpretation at the foundation of the theory often ignored in the philosophical literature.

 
 

"The Light at the End of the Tunneling: Observationsand Underdetermination"
Michael Dickson, Indiana University

 Abstract:
Some version(s) of the following two theses are commonly accepted. (1) Observations are somehow "theory-laden." (2) There are observationally equivalent (but otherwise distinct) theories. How can both be true? What is meant by "observational equivalence" if there are no "observations" that could be shared by theories? This paper is an attempt to untangle these issues by saying what "theory-ladenness" and "observational equivalence" amount to, and by considering an example. Having done so, the paper outlines a program for reconciling (1) and (2) within some sciences, based on a conception of theories as mathematical formalism plus interpretation.

 
 

"Inference to the Best Explanation is Coherent"
Igor Douven, Utrecht University

 Abstract:
In his (1989) van Fraassen argues that Inference to the Best Explanation is incoherent in the sense that adopting it as a rule for belief change will make one susceptible to a dynamic Dutch book. The present paper argues against this. An epistemic strategy is described that allows us to infer to the best explanation free of charge.

 
 
"NIH Consensus Conferences: Resolving Medical Controversy with Data in a Science Court"
John H. Ferguson, Office of Medical Applications of Research, NIH

Abstract:
Although the word science connotes exactness and precision, especially when applied to the so-called hard sciences like mathematics, physics and chemistry, there is the need in all the sciences for interpretation of data. Interpretation of experiments and data may be even more essential in the softer sciences like biology, medicine and certainly the social sciences. As the courts must interpret the law, so too must the latest scientific findings be interpreted by someone, some group or institution to apply the findings in a useful or meaningful societal setting.
The NIH has had a program for the interpretation of new medical science for the past twenty years called the Consensus Development Program. This program was initiated to address some congressional concerns that new research funded by the NIH that was applicable to medical practice was not getting applied in the health care community. The NIH Consensus Conferences were designed to bridge this gap by assessing medical treatments and procedures that were newly derived from research and were now ready for application in the health care system, but where controversy regarding use or application remained --- in other words, where unbiased interpretation was required. These conferences are a kind of court proceeding attempting to resolve these controversies with interpretation of the new medical science and health data by an impartial jury.
 

"The Plurality of Bayesian Measures of Confirmationand the Problem of Measure Sensitivity"
Branden Fitelson, University of Wisconsin-Madison

 Abstract:
Contemporary Bayesian confirmation theorists measure degree of confirmation using a variety of non-equivalent relevance measures. As a result, a great many of the arguments surrounding quantitative Bayesian confirmation theory are implicitly sensitive to choice of measure of confirmation. Strictly speaking, such arguments are enthymematic, since they presuppose that some relevance measure (or class of relevance measures) is superior to other relevance measures that have been proposed and defended in the philosophical literature. I present a survey of this pervasive class of Bayesian confirmation-theoretic enthymemes, and a brief analysis of some recent attempts to resolve this problem of measure sensitivity.
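Among the non-equivalent relevance measures in play are, in standard notation:

$$d(H,E) = P(H \mid E) - P(H), \quad r(H,E) = \log \frac{P(H \mid E)}{P(H)}, \quad l(H,E) = \log \frac{P(E \mid H)}{P(E \mid \neg H)}.$$

All agree on whether $E$ confirms $H$ (positive relevance), but they can order hypothesis-evidence pairs differently, which is what makes arguments sensitive to the choice of measure.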

 
 

"Moral Responsibility and the 'Ignorant Scientist'"
John Forge, Griffith University

 Abstract:
The question addressed in this paper is whether a scientist who has engaged in pure research can be held morally responsible for outcomes of that research which affect others, most notably through technology, when these effects were not foreseen. The scientist was "ignorant" of these outcomes. It would appear that the answer must be that he or she could not be responsible, for surely one is only responsible for what one is, in the appropriate sense, in control of, and being genuinely ignorant would seem to preclude control. As against this, it will be claimed that a case can be made for saying that if the scientist can be held responsible for being ignorant, then he or she can be held (indirectly) responsible for the outcomes in question, and indeed there are circumstances in which scientists can be responsible for being ignorant. As it happens, it is only in the sense of blame, and not of praise, that responsibility has purchase here.

 
 

"Interpolation as Explanation"
Jaakko Hintikka, Boston University, and
Ilpo Halonen, University of Helsinki

 Abstract:
A (normalized) interpolant I in Craig's theorem is a kind of explanation why the consequence relation (from F to G) holds. This is because I is a summary of the interaction of the configurations specified by F and G, respectively, that shows how G follows from F. If explaining E means deriving it from a background theory T plus situational information A, and if among the concepts of E we can separate those occurring only in T or only in A, then the interpolation theorem applies in two different ways, yielding two different explanations and two different covering laws.
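For reference, Craig's interpolation theorem in its usual form: if $F \models G$, with $F$ satisfiable and $G$ not valid, then there is a formula $I$ such that

$$F \models I, \qquad I \models G,$$

and every nonlogical symbol occurring in $I$ occurs in both $F$ and $G$. The abstract's proposal reads such an $I$ as a summary of how the $F$-configurations and $G$-configurations interact.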

 
 

"Category Theory: The Language of Mathematics"
Elaine Landry, McGill University

 Abstract:
In this paper, I set out to situate my claim that category theory provides the language for mathematical discourse. Against foundational approaches, I argue that there is no need to reduce either the content or structure of mathematical concepts and theories to either the universe of sets or the category of categories. I assign category theory the role of organizing what we say about the structure of both mathematical concepts and theories. Finally, I argue that category theory, seen as the language of mathematics, provides a framework for mathematical structuralism.

 
 

"Defending Abduction"
Ilkka Niiniluoto, University of Helsinki

 Abstract:
Charles S. Peirce argued that, besides deduction and induction, there is a third mode of inference which he called "hypothesis" or "abduction." He characterized abduction as reasoning "from effect to cause," and as "the operation of adopting an explanatory hypothesis." Peirce's ideas about abduction, which are related also to historically earlier accounts of heuristic reasoning (the method of analysis), have been seen as providing a logic of scientific discovery. Inference to the best explanation (IBE) has been regarded as an important mode of justification, both in everyday life, detective stories, and science. In particular, scientific realism has been defended by an abductive no-miracle argument (Smart, Putnam, Boyd), while the critics of realism have attempted to show that this appeal to abduction is question-begging, circular, or incoherent (Fine, Laudan, van Fraassen). This paper approaches these issues by distinguishing weaker and stronger forms of abduction, and by showing how these types of inferences can be given Peircean and Bayesian probabilistic reconstructions.

 
 

"'Laws of Nature' as an Indexical Term: A Reinterpretationof Lewis's Best-System Analysis"
John Roberts, University of Pittsburgh

 Abstract:
David Lewis's best-system analysis of laws of nature is perhaps the best known sophisticated regularity theory of laws. Its strengths are widely recognized, even by some of its ablest critics. Yet it suffers from what appears to be a glaring weakness: it seems to grant an arbitrary privilege to the standards of our own scientific culture. I argue that by reformulating, or reinterpreting, Lewis's exposition of the best-system analysis, we arrive at a view that is free of this weakness. The resulting theory of laws has the surprising consequence that the term "law of nature" is indexical.

 
 
 

"Scientific Objectivity and Psychiatric Nosology"
Patricia Ross, University of Minnesota

 Abstract:
This paper challenges the traditional conception of objectivity in science by arguing that its singular focus on evidential relations and its search for aperspectivalism render it inadequate. Through an examination of psychiatric nosology, I argue that interesting and unique problems arise that challenge this conception of objectivity and that these challenges cannot be met by this account. However, a social practice account of objectivity provides a much more successful way of thinking about values and objectivity in psychiatric nosology. Moreover, it provides us with a far more successful account in general.

 
 

"The Semantic (Mis)Conception of Theories"
C. Wade Savage, University of Minnesota

 Abstract:
On the traditional, "syntactic" conception, a scientific theory is a (preferably axiomatized) collection of statements. On the version of the opposing "semantic" conception advanced by Beatty and Giere, a theory is a definition plus an hypothesis that the definition has application. On the version advanced by Giere and van Fraassen, a theory is a non-linguistic model specified by a definition plus a hypothesis that the model applies to some empirical system. On the version advanced by van Fraassen, Suppes, and Suppe, a theory is a collection of mathematical models. The advantages claimed for these versions of the "semantic" conception are either spurious or can be obtained under a suitable version of the "syntactic" conception. This conclusion will be argued here for the first two conceptions.

 
 

"Functionalism and the Meaning of Social Facts"
Warren Schmaus, Illinois Institute of Technology

 Abstract:
This paper defends a social functionalist interpretation, modeled on psychological functionalism, of the meanings of social facts. Social functionalism provides a better explanation of the possibility of interpreting other cultures than approaches that identify the meanings of social facts with either mental states or behavior. I support this claim through a functionalist reinterpretation of sociological accounts of the categories that identify them with their collective representations. Taking the category of causality as my example, I show that if we define it instead in terms of its functional relations to moral rules, it becomes easier to recognize in other cultures.

 
 

"Proper Function and Recent Selection"
Peter Schwartz, University of Pennsylvania

 Abstract:
"Modern History" version of the etiological theory claim that in orderfor a trait X to have the proper function F, individuals with X must havebeen recently favored by natural selection for doing F (Godfrey-Smith 1994,Griffiths 1992, 1993). For many traits with prototypical proper functions,however, such recent selection may not have occurred: traits may have beenmaintained due to lack of variation or due to selection for other effects.I examine this flaw in Modern History accounts and offer an alternativeetiological theory, the Continuing Usefulness account, which appears toavoid such problems.

 
 

"Does Science Undermine Religion?"
Neven Sesardic, Miyazaki International College

 Abstract:
In this paper I first explain the difference between three kinds of questions (purely scientific, purely religious, and mixed). Second, I try to show that historical conflicts between religion and science were not accidental; there is an inherent logic that forces religion to make empirical claims, putting it on a collision course with science. Third, I argue that the defeat of religion in its conflict with science over a number of empirical issues eventually undermines the credibility of "purely religious" beliefs as well. Fourth, I discuss some consequences of these views for the relation between science and religion.

 
 

"Degrees of Freedom in the Social World"
Mariam Thalos, SUNY at Buffalo

 Abstract:
Ever since Hobbes we have sought to explain such extraordinarily commonplace facts as that prudent people trust each other, keep promises and succeed by and large at coordinating so as not to collide in corridors and roadways. In fact, it is the success which meets our efforts to coordinate without communication so as not to collide in corridors, which is perhaps the most perplexing of all social phenomena, because it is enjoyed by creatures with the capacity to conceive of (inconveniently) many coordination schemes that achieve the same ends. And ever since Thomas Schelling we have been suspecting that traditional game theory, which as I shall show embraces a Kantian view of agency, cannot explain this success. I shall here propose an anti-Kantian account of agency which takes as its point of departure the kinds of coordinations that Schelling demonstrated are so difficult for traditional game theory to explain. My proposal shall reject the Bayesian foundations of game theory, which rest on this Kantian view of agency.

 
 

"Measured Realism and Statistical Inference: AnExplanation for the Fast Progress of 'Hard' Psychology"
J.D. Trout, Loyola University of Chicago

 Abstract:
The use of null hypothesis significance testing (NHST) in psychology has been under sustained attack, despite its reliable use in the notably successful, so-called "hard" areas of psychology, such as perception and cognition. I argue that, in contrast to merely methodological analyses of hypothesis testing (in terms of "test severity," or other confirmation-theoretic notions), only a patently metaphysical position can adequately capture the uneven but undeniable successes of theories in "hard psychology." I contend that Measured Realism satisfies this description and characterizes the role of NHST in hard psychology.

 
 
 

"The Principle of the Common Cause Faces the BernsteinParadox"
Jos Uffink, Utrecht University

 Abstract:
I consider the problem of extending Reichenbach's principle of the common cause to more than two events, vis-à-vis an example posed by Bernstein. It is argued that the only reasonable extension of Reichenbach's principle stands in conflict with a recent proposal due to Horwich. I also discuss prospects of the principle of the common cause in the light of these and other difficulties known in the literature and argue that a more viable version of the principle is the one provided by Penrose and Percival (1962).
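Reichenbach's principle for two correlated events, for reference (the standard statement that the paper extends): a common cause $C$ of correlated events $A$ and $B$ satisfies

$$P(A \wedge B \mid C) = P(A \mid C)\,P(B \mid C), \qquad P(A \wedge B \mid \neg C) = P(A \mid \neg C)\,P(B \mid \neg C),$$

with $P(A \mid C) > P(A \mid \neg C)$ and $P(B \mid C) > P(B \mid \neg C)$; these conditions jointly entail the correlation $P(A \wedge B) > P(A)\,P(B)$. Bernstein-style examples, in which pairwise relations among three events fail to settle their joint behavior, are what make the extension to more than two events non-trivial.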

 
 
"Inference to the Best Explanation and TheoreticalEntities"
Susan Vineberg, Wayne State University

 Abstract:
Scientific entity realism has been challenged on the grounds that it depends on the controversial principle of inference to the best explanation. I defend the view against this challenge, by showing that the particular inferences needed by the entity realist do not have the objectionable features antirealists cite in questioning inference to the best explanation. Indeed, these inferences are either ones needed by empiricists in arguing for empirical adequacy, or are no stronger than the realist needs for justifying his claims about ordinary objects.

 
 
 

"Gravity and Gauge Theory"
Steve Weinstein, Northwestern University

 Abstract:
Gauge theories are theories which are invariant under a characteristic group of "gauge" transformations. General relativity is invariant under transformations of the diffeomorphism group. This has prompted many philosophers and physicists to treat general relativity as a gauge theory, and diffeomorphisms as gauge transformations. I argue that this approach is misguided.

 
 

"Defending Longino's Social Epistemology"
K. Brad Wray, University of Calgary

 Abstract:
Though many agree social factors influence inquiry, developing a viable social epistemology has proved to be difficult. According to Longino, it is the processes that make inquiry possible that are social; they require a number of people to sustain them. These processes also ensure that the results of inquiry are not merely subjective opinions, and thus deserve the label "knowledge."
I defend Longino's epistemology against charges of Kitcher, Schmitt, and Solomon. Longino rightly recognizes that different social factors have different effects on inquiry, and recommends reconceptualizing "knowledge," distinguishing knowledge from opinion by reference to a social standard.
 
 
 
 



 
 
 Symposium: Philosophical Perspectives on Quantum Chaos

Organizer: Frederick M. Kronz, University of Texas at Austin

Chair: TBA

“Bohmian Insights into Quantum Chaos”
James T. Cushing, University of Notre Dame

Abstract:
The ubiquity of chaos in classical mechanics (CM), as opposed to the situation in standard quantum mechanics (QM), might be taken as speaking against QM being the fundamental theory of physical phenomena.  Bohmian mechanics (BM), as a formulation of quantum theory, may clarify both the existence of chaos in the quantum domain and the nature of the classical limit.  Two interesting possibilities are (i) that CM and classical chaos are included in and underwritten by quantum mechanics (BM) or (ii) that BM and CM simply possess a common region of (noninclusive) overlap.  In the latter case, neither CM nor QM alone would be sufficient, even in principle, to account for all the physical phenomena we encounter.  In this talk I shall summarize and discuss the implications of recent work on chaos and the classical limit within the framework of BM.

“Nonseparability and Quantum Chaos”
Frederick M. Kronz, The University of Texas at Austin

Abstract:
The clockwork image of classical physics has been shattered by the relatively recent discovery of chaotic classical models.  The converse to this ironic situation seems to be developing in quantum physics.  Conventional wisdom has it that chaotic behavior is either strongly suppressed or absent in quantum models.  Some have concluded that these considerations serve to undermine the correspondence principle, thereby raising serious doubts about the adequacy of quantum mechanics.  So, the quantum chaos question is a prime subject for philosophical analysis.  The most significant reasons given for the absence or suppression of chaotic behavior in quantum models are the linearity of Schrödinger’s equation and the unitarity of the time-evolution described by that equation.  Both are shown in this essay to be irrelevant by demonstrating that the crucial feature for chaos is the nonseparability of the Hamiltonian.  That demonstration indicates that quantum chaos is likely to be exhibited in models of open quantum systems.  A measure for probing such models for chaotic behavior is developed and then used to show that quantum mechanics has chaotic models for systems having a continuous energy spectrum.  The prospects of this result for vindicating the correspondence principle are then briefly examined.
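The nonseparability at issue can be given a compact standard statement (a generic definition, not the essay's own measure): a Hamiltonian for a composite system is separable when it is a sum of terms each acting on one subsystem alone,

$$H = H_1 \otimes I_2 + I_1 \otimes H_2,$$

and nonseparable when an interaction term $H_{\text{int}}$ coupling the subsystems must be added, $H = H_1 \otimes I_2 + I_1 \otimes H_2 + H_{\text{int}}$.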

“Chaos and Fundamentalism”
Gordon Belot, Princeton University

Abstract:
I discuss the philosophical significance of the problem of quantum chaos.  I concentrate on the most pressing form of this problem: the fact that there are many chaotic classical systems whose quantizations are non-chaotic, even periodic.  I argue that this puts a strain on the correspondence principle, conceived of as requiring that quantum mechanics should be capable of accounting for the empirical successes of classical mechanics.  I argue that even very weak versions of the correspondence principle are in some danger of being falsified in light of the problem of quantum chaos: the viability of weak versions of the correspondence principle depends upon rather delicate empirical considerations.  This leaves us worrying about a question which was once thought to have been settled: the empirical and conceptual relationship between classical mechanics and quantum mechanics.  I close with some comments concerning the philosophical import of the details of such limiting relations between theories.
 
 

Symposium: The Organism in Philosophical Focus

Organizer: Manfred D. Laubichler, Princeton University

Chair: TBA

“Fashioning a Descriptive Model of the Worm”
Rachel Ankeny, University of Pittsburgh

Abstract:
The nematode Caenorhabditis elegans frequently is touted as an exemplar of a model organism for genetics, development, and neurobiology.  This paper analyzes the concept of C. elegans as a model organism against the background of a larger historical study.  I argue that the early history of the worm project included a pre-explanatory stage that involved descriptive rather than explanatory models and accordingly lacked much emphasis on theory or explanations as they are traditionally viewed in the philosophy of science; therefore the applicability of previous philosophical views on modeling (e.g., the semantic conception of theories) is limited.  As an alternative, I expand the concept of a descriptive model, and use it to understand how C. elegans may be viewed as a prototype of the metazoa against the backdrop of examples of the three types of modeling that occur with C. elegans: modeling of structures, of processes, and of information.  In conclusion, I suggest that more investigation of descriptive models (such as those generated in the worm project) and their relation to explanatory models developed in later stages of a scientific research program must be done in order to capture important aspects of organism-based research.

“Behavior at the Organismal and Molecular Levels: The Case of C. elegans”
Kenneth Schaffner, George Washington University

Abstract:
Caenorhabditis elegans (C. elegans) is a tiny worm that has become the focus of a large number of worldwide research projects examining its genetics, development, neuroscience, and behavior.  Recently two groups of investigators (Lockery’s and Rankin’s labs) have begun to tie together the behavior of the organism and the underlying neural circuits and molecular processes implemented in those circuits.  Behavior is quintessentially organismal—it is the organism as a whole that moves and mates—but the explanations are devised at the molecular and neurocircuit levels, and tested in populations using protocols that span many levels of aggregation.  This paper presents a summary of key results in neural network analysis of the worm, and considers the prospects for generalizing those results, both methodologically and substantively, to other organisms.  In particular, some comparisons and contrasts with behavioral and neuroscience work on the fruit fly, Drosophila melanogaster, will be briefly considered.

“Organism and Decomposition: Steps Towards an Integrative Theory of Biology”
Manfred D. Laubichler, Princeton University, and
Günter P. Wagner, Yale University

Abstract:
The organism is the central reference frame for much of biological theory and practice.  It is therefore all the more surprising that no well defined organism concept exists, or even that the organism concept does not attract the same attention as other intensely debated biological concepts such as the gene or the species concept.  So far, the problem of the organism has been mostly discussed in the context of the problem of individuality and the unit of selection issue (see e.g., Buss 1987, Hull 1989, Ghiselin 1974).
Here we investigate the question whether there is an organism hidden in many successful biological theories and models.  We propose a notion of the organism that would serve as the focus of integration for the many separate biological theories that make up the newly established fields of organismal and integrative biology.  We argue that the structural weakness of many of these theories lies in the absence of a general theory of the organism that, together with an appropriate decomposition theorem, would account for the specific variables in these theories and models as properly individualized characters of an organism.  We will investigate the formal properties of such a general organism concept and demonstrate the application of a decomposition theorem in the case of the unit of selection (see also Laubichler 1997, Wagner, Laubichler, and Bagheri 1997).

“Ontological Butchery: Distinguishing Organisms from Other Functionally Integrated Biological Entities”
Jack A. Wilson, Washington and Lee University

Abstract:
Biological entities of many kinds, such as cells, organisms, and colonies, are made up of physically continuous and causally integrated parts.  Some are composed of parts which are themselves built from causally integrated parts, e.g., a colonial siphonophore and the zooids that compose it.  Both have many of the properties commonly associated with an organism.  The parts of these entities display diverse degrees of integration which makes it difficult to distinguish organisms from other kinds of living things.  In this paper, I explore the history of the organism concept in biology and defend a demarcation criterion based on a position shared by J.S. Huxley and Richard Dawkins, that an organism is a living entity that is sufficiently heterogeneous in form to be rendered non-functional if cut in half.

“The Organism in Development”
Robert C. Richardson, University of Cincinnati

Abstract:
Developmental biology has resurfaced in recent years, but often apparently without a clear central role for the organism.  The organism is pulled in divergent directions: on the one hand, there is an important body of work that emphasizes the role of the gene in development, as executing and controlling embryological changes (cf. Gilbert 1991); on the other hand, there are more theoretical approaches under which the organism disappears as little more than an instance for testing biological theories (cf. Brandon 1997; Schank and Wimsatt 1976, and Kauffman 1993).  I press for the ineliminability of the organism in developmental biology.  The disappearance of the organism is an illusion.  Heterochrony, or change of timing in development, assumes a central role in evolutionary developmental biology (Gould 1977; McKinney and McNamara 1991), whether or not it deserves the role often accorded to it as the central developmental mechanism (cf. Raff 1996).  Genetic studies of the basis of heterochrony in C. elegans, for example, display early appearance of adult structures, and deletion of early stages, both heterochronic changes.  They nonetheless display the global basis of heterochronies (Ambros and Moss 1994), and leave a central role to the organism.  More theoretical approaches treat heterotopy, or spatial patterning in development, as the consequence of more general and abstract processes of development (Kauffman 1993), or understand changes in timing in terms of general dependency relations (Schank and Wimsatt 1976).  These more global, “emergentist,” approaches fit poorly with a broad range of cases in developmental biology.  Classical transplantation experiments show a degree of epigenetic (systemic and local) control for normal growth (Bryant and Simpson 1984) which is not consistent either with more formal models or with developmental entrenchment.  In studies of adult organisms (Maunz and German 1997), there are heterochronic patterns of growth which are displayed by particular organisms (in this case, rabbits), but which may not be general.  There are many sources of heterochrony, and it is important to determine which are prevalent in a given case.  Again, the organism turns out to be central in understanding the evolution of development.

Commentator: Jane Maienschein, Arizona State University
 
 

Symposium: Studies in the Interaction of Psychology and Neuroscience

Organizer: Gary Hatfield, University of Pennsylvania

Chair: TBA

“Mental Functions as Constraints on Neurophysiology: Biology and Psychology of Color Vision”
Gary Hatfield, University of Pennsylvania

Abstract:
The concept of function has been prominent in both philosophy of biology and philosophy of psychology. Philosophy of psychology, or philosophical analysis of psychological theory, reveals that rigorous functional analyses can be carried out in advance of physiological knowledge. Indeed, in the area of sensory perception, and color vision in particular, knowledge of psychological function leads the way in the individuation and investigation of visual neurophysiology. Psychological functions constrain biological investigation. This example is of general interest as an instance of the relation between biological and psychological functions and their “wet” realizations.

“Cognitive Science and the Neuroethology of Electroreception”
Brian L. Keeley, Washington University in St. Louis

Abstract:
The theoretical relationship between psychology and neurobiology has been a long-standing point of debate within the philosophy of cognitive science. Some, Fodor for example, have argued that psychology has a significant degree of theoretical autonomy from the facts of neurobiology. Others, such as the Churchlands, have responded with arguments purporting to show that neurobiology has a crucial and ineliminable role to play in cognitive theorizing at all levels. This paper takes as its starting point the science of neuroethology—a multidisciplinary investigation of animal behavior that shares many structural similarities with cognitive science. The neuroethological investigation which led to the discovery of electroreception (the ability of certain non-human animals to perceive the world via electricity) is used as a model for how a variety of disciplines should interact and cross-constrain one another. Such a model suggests that the fields that make up cognitive science ought to co-evolve and constrain one another in a mutually supportive fashion.

“Neuropsychology and Neurology”
William Hirstein, William Paterson University

Abstract:
Blindsight and the various agnosias provide important windows on brain functioning. Through the study of deficits caused by brain damage, inferences can be made about the normal functioning of the brain. But these inferences require a conception of normal function cast in psychological terms. The methodology of neuropsychology sheds light on the cross-constraints between psychology and neuroscience.
 
 

Symposium: Toward a New Understanding of Scientific Success

Organizer: Janet A. Kourany, University of Notre Dame

Chair: TBA

“How Inevitable Are the Results of Successful Science?”
Ian Hacking, University of Toronto

Abstract:
There are innumerable ways in which successful sciences and their applications could have gone differently—different questions, different aims, different patrons, different talents, different resources, different interests. Yet there is a strong conviction, among many scientists, that any successful investigation of a definite subject matter (such as genetics, heat, or astrophysics) would produce equivalent results. But even some quite cautious authors, like Keller, doubt this. Genetics without the idea of a genetic program could, she seems to imply, have led to a non-equivalent but successful genetics. The strongest claim for inevitability is that any intelligent being who took up science, even an alien in outer space, would develop something equivalent to our fundamental physics, if that being was curious in the ways in which we are curious. In contrast, my claims about the forms of scientific knowledge seem to imply that had the physics of the past fifty years not been weapons-funded, non-equivalent physics would have resulted. Of course some workers with a social approach to knowledge make much stronger suggestions than these.
Who is right? The “inevitabilist” or the “contingentist”? To answer, one needs first to understand what it is to make different investigations of the “same” field of inquiry. One needs to understand what it is for two bodies of knowledge to be “equivalent.” Only when these conceptual issues are clarified can one even begin to assess claims to inevitability or contingency of successful fundamental scientific conclusions. My contribution to this symposium will be an attempted clarification of “equivalence,” and a discussion of the implications for “inevitability.”

“What Does ‘Explanation’ Mean in Developmental Biology?”
Evelyn Fox Keller, MIT

Abstract:
Three questions form the background to my remarks:
(1) What are the prevailing explanatory narratives in contemporary biological discussions of development?
(2) What are the scientific functions of these narratives?
(3) How are we to assess the productivity of an explanatory narrative?
By referring to ‘explanations’ in terms of ‘narrative’, I mean to underscore two starting assumptions: (a) Explanations are rarely if ever either logically complete or robust; (b) Yet, whatever their purely logical shortcomings, they are nonetheless indispensable to the conduct of science. Thus, attempts to assess the productivity of explanatory narratives cannot rest solely on logical measures, but will also need to take other measures, including creativity and rhetorical power, into account. It is part of my aim to describe more appropriate (that is, more realistic) measures of explanatory narratives in developmental biology. It is also my hope that such an exercise might even be useful to working scientists in helping them to identify narratives that have outlived their usefulness.
As a case in point, I want to consider the notion of “genetic program.” Ever since its introduction in 1961, this notion has served as something of a rhetorical tour de force for molecular biology that has manifested tremendous productivity. Today, however, I will argue that this notion has outlived its usefulness, and may even have become counterproductive. I will illustrate by examining the specific case of the much publicized recent success of mammalian cloning.

“A Successor to the Realism-Anti-Realism Question”
Janet A. Kourany, University of Notre Dame

Abstract:
Ian Hacking and Evelyn Fox Keller have suggested, both in their papers for this symposium and in previous work, that our views of scientific assessment have to be enriched in certain ways. What follows with regard to our views of the results of such assessment? Traditional philosophy of science, for the most part, gives us only two options. Either scientific theories that are positively assessed are true (or approximately true, or the like), or they are only useful fictions (“empirically adequate,” or something of the sort). Of course, traditional philosophy of science then goes on to disclose the intractable problems associated with each of these options, leading at least some philosophers of science (e.g., Arthur Fine) to conclude that not only are both options indefensible, but so is the question that calls them forth. From my point of view, also, the traditional options are unacceptable, but for different reasons. They inspire the wrong kinds of attitudes toward the results of science. To say that a theory is true (or approximately true, etc.) is a discussion-stopper. However socially dangerous or unhelpful or sexist or racist or whatever the theory is, it is to say that that’s the way things are—we have to accept them. Of course, we need not pursue such a theory, but that is a difficult course of action to adopt. After all, truth, it is said, has intrinsic value: it should not be suppressed. And at any rate truth, it is said, will doubtless prove useful in the long run, even if it does not seem so in the short run. To say, on the other hand, that a theory is only useful (or empirically adequate, etc.) is another discussion-stopper, though not as profound a one. However socially dangerous or unhelpful or sexist or racist or whatever the theory is, it is to say that it still has value, that it still can be used for purposes of prediction. But it leaves open the possibility of pursuing an alternative theory that is “just as useful,” but without the drawbacks. The problem is that it buries the question of what the theory is useful for, why the predictions are needed—whether, in fact, the theory is really useful to us at all.
What changes occur when we enrich our views of theory assessment along the lines suggested by Hacking and Keller? I will argue that the realism-anti-realism question under these conditions changes significantly, and I will link up pursuit of this changed question with the descriptive-prescriptive program for philosophy of science I argue for elsewhere.
 
 
 

Symposium: Evidence, Data Generation, and Scientific Practice: Toward a Reliabilist Philosophy of Experiment

Organizer: Deborah G. Mayo, Virginia Polytechnic Institute and State University

Chair: Ian Hacking, University of Toronto

“Data, Phenomena and Reliability”
James F. Woodward, California Institute of Technology

Abstract:
This paper will explore, in the context of the distinction that I have introduced elsewhere between data and phenomena, how data serve as evidence for phenomena. Phenomena (e.g., the existence of neutral currents) are stable, repeatable effects or processes that are potential objects of explanation by general theories and which can serve as evidence for such theories. Data (e.g., bubble chamber photographs) serve as evidence for the existence of phenomena but are typically not explained or entailed by general theory, even in connection with background information.
In contrast to standard philosophical models, which invite us to think of evidential relationships as logical relationships of some kind, I will argue that the reliability of the processes by which data are produced is crucial to their evidential status. Reliability is roughly the notion invoked by the reliabilist tradition in epistemology—it is a matter of good error characteristics under repetition. On the reliabilist picture, it is this information about error characteristics, rather than the deductive or inductive logical relationships between data and phenomena claims, that determines whether the former support the latter.
This paper will describe several examples of data-to-phenomena reasoning in which the reliability of a detection process is at issue. I will then use these to motivate a number of more general claims about reliability and evidential support. For example, although the process by which evidence is produced is usually regarded as irrelevant by the standard models of confirmation, it is crucial on the reliabilist alternative. Given two qualitatively identical pieces of evidence (e.g., two indistinguishable bubble chamber photographs), one may be good evidence for the existence of neutral currents and the other may not be, depending on the way in which they have been produced. Similarly, while most standard models of evidential support require that evidence be (or be representable as) sentential or propositional in form, the reliabilist model doesn’t impose this requirement. Data which are non-propositional and which no one knows how to represent propositionally (e.g., X-ray photographs of possible tumors) can figure as the outcome of a reliable detection process. Moreover, various patterns of data-to-phenomena reasoning that are actually employed in science and that look circular or otherwise defective on standard accounts are readily reconstructable as legitimate on a reliabilist analysis. In all of these respects, the reliabilist account provides a better reconstruction of reasoning from data to phenomena than standard models of confirmation.
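
The point about error characteristics can be made concrete with a small simulation (an editorial sketch in Python with made-up detector rates, not material from the paper): two detection processes yield positive readings that are, taken by themselves, indistinguishable, yet the probability that the phenomenon is present given a positive reading differs sharply with the reliability of the generating process.

  import random

  def detector_reading(present, hit_rate, false_positive_rate):
      # One run of a detection process: True if it reports the phenomenon.
      return random.random() < (hit_rate if present else false_positive_rate)

  def evidential_value(hit_rate, false_positive_rate, base_rate=0.5, trials=100000):
      # Estimate P(phenomenon | positive reading) by repetition.
      positives = hits = 0
      for _ in range(trials):
          present = random.random() < base_rate
          if detector_reading(present, hit_rate, false_positive_rate):
              positives += 1
              hits += present
      return hits / positives

  # Two processes whose positive readings are qualitatively identical:
  print(evidential_value(0.95, 0.01))   # reliable process: about 0.99
  print(evidential_value(0.95, 0.90))   # unreliable process: about 0.51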

“Can Philosophical Theories of Evidence be Useful to Scientists?”
Peter Achinstein, Johns Hopkins University

Abstract:
Scientists frequently argue about whether, or to what extent, some putative evidence supports a given hypothesis. (Witness the recent debate concerning whether the Martian meteorite is evidence of past life on Mars.) The debates in question are not about whether the statements describing the evidence are true but about the relationship between these statements and the hypothesis they are supposed to support. Now there is an abundance of theories of evidence proposed by philosophers of science—probabilistic as well as non-probabilistic accounts—that attempt to say how to determine whether, or to what extent, evidence supports an hypothesis. Yet these theories are generally ignored by scientists in their disputes. Why is this so?
Is it simply the result of a lack of communication between disciplines? Is it that philosophical theories of evidence are too abstract to be of any practical use? Is it that although methodological theories may describe actual practice, they have little if any effect in shaping or altering that practice? Perhaps some of each. But there is another factor that I want to consider, viz. that philosophers, particularly those espousing objective (by contrast to subjective) theories of evidence, generally assume that evidence is an a priori relationship. They assume that whether e, if true, supports h, and to what extent it does, can be determined by a priori calculation. I believe this assumption is mistaken. In general, whether e, if true, supports h, and to what extent, are empirical questions. If so, then scientists in a given discipline would seem to be in a better position to determine evidential relationships than philosophers.
In this paper I propose to do the following:
(1) To argue that the relationship in question is empirical, not a priori. I will do so by showing that whether, and the extent to which, putative evidence supports an hypothesis depends crucially on how the evidence was gathered. Flaws in selection procedures are possible that can affect confirmation, and these can only be known empirically. Such flaws can be totally unexpected, given what is known, yet they can be shown to affect confirmation because they impugn the reliability of the data-gathering method. I will argue that this explains why certain controversies occur among scientists over whether the putative evidence supports an hypothesis, even when there is no disagreement over what the evidence is. It also explains why certain experiments are repeated, usually with significant modifications.
(2) To consider whether this makes philosophical accounts irrelevant in settling evidential disputes (if so, why; if not, why not?); and to consider, more generally, how philosophers can contribute to scientific debates about evidence. Carnap held the idea that because the relationship between evidence and hypothesis is a priori, in principle philosophers can help settle scientific disputes by performing a priori calculations. This goal, I think, must be abandoned. Nevertheless, I believe that philosophical contributions are possible. I will argue that scientists in actual practice operate with several different concepts of evidence that need to be distinguished and clarified. I will distinguish four such, one subjective and the other three objective. All of them, I will argue, represent evidence as an empirical relationship. In the objective class one concept is relativized to a set of beliefs, while the other two are not. Of the latter, one requires the hypothesis itself to be true, the other does not. So there are fundamental differences.
By reference to actual examples, I will show how certain scientific disputes about the efficacy of the evidence arise because of empirical disagreements involving the testing procedure (something one would expect using concepts that represent evidence as an empirical relationship), while some disputes emerge from a use of the different concepts themselves. In the latter cases I will argue that attention to these differences and clarification of the concepts can aid in the resolution of the disputes.

“Experimental Practice and the Reliable Detection of Errors”
Deborah G. Mayo, Virginia Polytechnic Institute and State University

Abstract:
Given the growing trend to renounce “armchair” philosophy of science, it is surprising that current philosophies of evidence remain largely divorced from the problems of evidence in actual scientific practice. The source of the problem, I argue, is the assumption—retained from logical empiricist accounts of confirmation—that how well evidence supports a hypothesis is a matter of a logical calculation based on given statements of data and hypotheses. Such accounts lack the resources to address central scientific questions about evidence because these questions turn on empirical information about the overall reliability of the procedures for generating the data and selecting the test—on what may be called procedural reliability. Two pieces of evidence that would equally well warrant a given hypothesis H on logical measures of evidential relationship may in practice be regarded as differing greatly in their evidential value because of differences in how reliably each is able to rule out errors in affirming H. I believe it is time to remedy this situation. Drawing on the work of my fellow contributors, Jim Woodward and Peter Achinstein, as well as my own work (e.g., Mayo 1996), I will attempt to sketch the directions in which a new “reliabilist” account of evidence might move.
 
 
 

Symposium: Conceptual Foundations of Field Theories in Physics

Organizer: Andrew Wayne, Concordia University

Chair: TBA

“The Gauge Argument”
Paul Teller, University of California at Davis

Abstract:
Contemporary quantum field theory contains a remarkable argument for the existence of so-called “gauge particles,” such as the photon and its generalizations. Starting from the assumption that the quantum field has certain very natural spatial symmetry properties, keeping the quantum field equations consistent appears to force the introduction of a field for the gauge particles. Physicists are careful not to claim that this argument provides a purely a priori deduction of the existence of gauge particles. But the argument does appear to require extraordinarily little to get it started. My effort will be to understand the nature of this argument as clearly as possible.

“Counterintuitive Features of Quantum Field Theory”
Gordon Fleming, The Pennsylvania State University

Abstract:
I will discuss the conceptual and structural aspects of several counterintuitive features of quantum field theory selected from the following topics: the Reeh-Schlieder property of the vacuum state; Haag’s theorem and the existence of unitarily inequivalent representations of a quantum field theory; Rindler quanta and horizon radiation. All of these features were puzzling surprises to the physics community when they were discovered. All of them have long since been incorporated into the technical repertoire of quantum field theorists via widely accepted formal structures. None of them enjoys more than very indirect empirical support, if that. And none of them has yet been provided with better than tentatively plausible conceptual analysis and physical interpretation. This situation will not be rectified here, but some suggestions will be made.

“Description, Individuation, and Relation in Gauge Field Theory”
Sunny Auyang, Independent Scholar

Abstract:
A quantum field is best interpreted as the quantization of a classical field, a continuous system. Together with the gravitational field, the quantum fields make up the fundamental ontology of modern physics. Despite its importance, the philosophical implications of the field ontology remain largely unexplored. This paper argues, first, that the field ontology does not abandon the common sense notion of individual entities with definite properties but makes the notion precise; second, that, as spatio-temporally structured matter, it is incompatible with both the substantival and relational theories of spacetime.
Three types of entities are distinguished in field theories. Events, represented by local field operators, are the basic extensionless entities individuated spatio-temporally. The equivalents of enduring bodies, represented by paths, relate the events causally. Particles, which are the modes of field excitation, lack numerical identity and cannot be referred to individually. The primacy of local fields over particles implies that the significance of spacetime lies in individuating and identifying entities. As such it is built into the definition of the entities and exhausted by the definition. Spacetime does not exist independently, containing or supporting matter, as substantivalism maintains. Nor is it the relation among predefined entities, for there are no individual entities in the field without presupposing its spatio-temporal structure.

Commentator: Andrew Wayne, Concordia University
 
 
 

Symposium: Philosophy and the Social Aspects of Scientific Inquiry: Moving On From the “Science Wars”

Organizer: Noretta Koertge, Indiana University at Bloomington

Chair: Cassandra Pinnick, Western Kentucky University

“Reviving the Sociology of Science”
Philip Kitcher, University of California, San Diego

Abstract:
Despite the prevalence of the idea that “sociology of science” has come to play a major role in science studies, the discipline of sociology of science, begun by Merton and his successors, has been largely abandoned. Using comparisons with other areas of sociology, I shall try to show what a genuine sociology of science might be, arguing that it can make important contributions to our understanding of scientific practices. Those contributions would not be antithetical to philosophy of science but, rather, would enrich our investigations.
 
 

“The Recent Past and Possible Future of the Sociology of Science”
Stephen Cole, SUNY at Stony Brook

Abstract:
In the 1980s the social constructivist approach became the overwhelmingly dominant influence on the sociology of science. Contrary to the wide definition of constructivism used by some historians and philosophers of science, I see constructivism as having at its essence a relativistic epistemology. Without relativism, constructivism does not differ in many significant ways from the traditional Mertonian sociology of science. Mertonians, in fact, studied more than twenty years ago many of the problems which constructivists address.
Social constructivism is seen not only as an intellectual movement, but as a political movement. As an intellectual movement constructivism is in tatters, not being able to successfully defend itself from the problems of reflexivity, from the countless cases brought to light where evidence from the empirical world played determining roles in theory choice, and from attacks by natural scientists. As a political movement, constructivism retains its hegemony. All major disciplinary organizations and journals are dominated by or controlled by constructivists. It is not surprising that the number of people doing non-constructivist sociology of science has dwindled to a mere handful. A once promising discipline, with important policy implications, has essentially been killed.
There are many problems in the organization and doing of science which cry out for sociological analysis. The new sociologists of science will be informed by some of the work of constructivists, but will ultimately return to their disciplinary roots and use the theories and methods of sociology which made the earlier work of substantial utility. The paper will briefly outline some of the problems that are most in need of attention.

“Science, Values and the Value of Science”
Noretta Koertge, Indiana University at Bloomington

Abstract:
Many people who reject the strong claims of postmodernists about the inevitability of epistemic relativism, the tyranny of language games, and the determinative influence of social values and professional interests on the content of scientific knowledge nevertheless share some sympathy with the political critiques of science, technology, and science policy coming out of cultural studies. In this paper I demonstrate the influence of postmodernist perspectives on definitions of scientific literacy and new initiatives in math and science education. Philosophers of science need to articulate more clearly a normative model of the role of values in science that can supplant the simplistic alternatives of positivism and postmodernism. I contrast Longino’s approach with one deriving from Popper’s philosophy of social science.
 
 

“The Contexts of Scientific Practice”
Rose-Mary Sargent, Merrimack College

Abstract:
When those in science studies began to look at the particularities surrounding laboratory practices, it was a good counter to what had been a rather simplistic account of experiment in the philosophical literature. But the trend to see practices as purely social and divorced from the realm of theoretical knowledge has gone too far. I argue that studies of the social, technological, and intellectual contexts of science must be integrated in order to capture the full complexity and variety of experimental activities. In addition, historical cases show that although experimenters have pursued different goals, ranging from the purely epistemic to the primarily social, significant methodological standards have tended to remain constant across contexts.
 
 

Symposium: Philosophy of Chemistry

Organizer: Eric R. Scerri, Bradley University

Chair: TBA

“Agency of Multilevel Coherences: in Chemistry and in Other Sciences”
Joseph E. Earley, Sr., Georgetown University

Abstract:
Investigators in many fields routinely deal with compound individuals—entities composed of parts that themselves have definite characteristics. Observable properties of composites generally depend in complex ways on features of the constituents. In the kinds of systems that chemists deal with, agency of compound entities can adequately be understood in terms of the composition and structure of the coherence, and the properties of the components—but that understanding is not always straightforward. This paper considers approaches to the functioning of compound individuals that are generally employed in contemporary chemistry, and examines whether such approaches might be useful in dealing with some currently open questions of philosophical interest, such as the validity of “multi-level selection” in evolutionary biology (cf. the special issue of The American Naturalist, July 1997).

“Can We Exclude Nature from the Stability of Laboratory Research?”
Daniel Rothbart, George Mason University

Abstract:
How can we explain the stability of chemical research? In his Representing and Intervening, Ian Hacking argues that the technician’s use of modern instruments is stabilized by the causal powers of certain entities, a position known as entity realism. But the conception of laboratory research that emerges from his more recent writings contradicts, by implication, entity realism, and suggests a kind of constructivist antirealism. We find in these writings an impoverished conception of research, one which fails to recognize the philosophical importance of the engineer’s design of modern instruments. An exploration of modern spectrometers in chemistry illustrates this point. A functional realism is developed, according to which the experimenter is committed to the existence of causal processes as defined purposively from the engineer’s design of the modern instrument.

“Putting Quantum Mechanics to Work in Chemistry”
Andrea I. Woody, The University of Chicago

Abstract:
This essay examines contemporary diagrammatic and graphical representations of molecular systems, specifically molecular orbital models and electron density graphs, derived from quantum mechanical models. Its aim is to suggest that recent dissatisfaction with reductive accounts of chemistry (see, e.g., Synthese, June 1997) may stem from the inability of standard analyses of reduction to incorporate the diverse forms of representation widely employed in the application of the stationary-state Schrödinger equation to chemistry. After a brief sketch of the historical development of these techniques, in part to display the computational complexities inherent in the corresponding quantum calculations, I compare algebraic and diagrammatic representations of a few specific molecules, demonstrating the strengths of each as tools for a variety of reasoning tasks prevalent in chemical practice. I subsequently provide a more general, and preliminary, analysis of the inferential resources of molecular orbital diagrams, arguing that certain characteristics of this two-dimensional representation scheme are particularly well designed to promote the robustness of common forms of chemical inference.

“The Periodic System: Its Status and its Alleged Reduction”
Eric R. Scerri, Bradley University

Abstract:
One of the reasons why philosophers of science have not paid much attention to chemistry, until recently, has been the view that this field does not possess any profound theories or ideas like quantum mechanics, relativity, or Darwin’s theory of evolution. The purpose of this article is to suggest that, within chemistry and in terms of explanatory power, the periodic system possesses a status comparable to that of the above-mentioned theories.
The tendency to dismiss the periodic system’s role seems to stem from the mistaken view that it has been reduced to quantum mechanics. It will be argued that all that can be strictly deduced from the principles of quantum mechanics is an explanation for the closing of electron shells, whereas the independent problem of the closing of the periods cannot be so deduced. In order to obtain the electron configurations of atoms it is necessary to use a number of ad hoc schemes, such as assuming the order of filling of electron shells by reference to experimental data. The periodic system exists as an autonomous and essentially chemical explanatory principle.
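
The kind of ad hoc filling scheme at issue can be made concrete. The sketch below (an editorial illustration in Python, not part of the abstract) generates ground-state electron configurations from the empirical Madelung n + l ordering, a rule stipulated by reference to experimental data rather than deduced from quantum mechanics; known exceptions such as chromium and copper still require case-by-case correction.

  # Electron configurations via the empirical Madelung (n + l) rule -- the
  # sort of ad hoc, experimentally calibrated scheme the abstract refers to.
  LETTERS = "spdf"

  def madelung_order(max_n=7):
      # Subshells (n, l) sorted by n + l, ties broken by lower n.
      shells = [(n, l) for n in range(1, max_n + 1) for l in range(min(n, 4))]
      return sorted(shells, key=lambda nl: (nl[0] + nl[1], nl[0]))

  def configuration(z):
      # Ground-state configuration for atomic number z (ignoring the known
      # exceptions, e.g. Cr and Cu, which must be fixed by experiment).
      parts, remaining = [], z
      for n, l in madelung_order():
          if remaining == 0:
              break
          occ = min(remaining, 2 * (2 * l + 1))  # subshell capacity
          parts.append(f"{n}{LETTERS[l]}{occ}")
          remaining -= occ
      return " ".join(parts)

  print(configuration(26))  # Fe: 1s2 2s2 2p6 3s2 3p6 4s2 3d6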
 
 

Symposium: The Coevolution of Language and Brain: Is there Anything New Under the Sun?

Organizer: Edward Manier, University of Notre Dame

Chair: Edward Manier, University of Notre Dame

“The Roots of Human Linguistic Competence—in Apes”
Duane M. Rumbaugh, Georgia State University

Abstract:
Human competencies for language, symbolism, facile learning, prediction, mathematics, and so on are natural expressions of processes that evolved notably within the order Primates. The evolution of large-bodied primates was basic to the evolution of a large brain. But particularly within the great ape and human species, the process of encephalization contributed to an even larger brain, one within which the operations of intelligence played an increasingly important role in adaptation. From that point, it is proposed that evolution became directed more to the evolution of large brains, not large bodies. Even at that point in evolutionary history, the neurological bases for language and a broad range of cognitive skills were in place. From these natural endowments, our species has excelled as a natural projection of competencies that became refined in the hominids.
Recent research with apes has revealed the critical importance of the logic-structure of early environment and patterns in interaction with brain size and complexity to the acquisition of language and relational learning processes in young apes. It will be argued that conventional views of learning that emphasize stimuli, responses, and reinforcement are insufficient to the challenge of accounting for the acquisition of such skills in infant apes. To account for them, the merits of a new class of learning phenomena termed Emergents will be advanced.

“Syntax Facit Saltum: Minimal Syntactic Computation and the Evolution of Language”
Robert C. Berwick, MIT

Abstract:
Ever since Darwin, and long before, the species-specific human language faculty has captured the imagination of evolutionary thinkers. The reason is simple: For evolutionists, novelties or new traits in a single lineage—“autapomorphies”—have always posed a challenge. Whatever the evolutionary scenario, one must strike a balance between language’s singularity and evolutionary continuity: How can one account for the striking specificity of human syntactic constraints and at the same time retain a Darwinian-style explanation for language’s emergence?
In this talk we show how to resolve this discontinuity paradox in a new way—appealing to the recent linguistic syntactic theory dubbed the “Minimalist Program” (Chomsky, 1995). We show that once the fundamental property of generativity emerges, minimalism forces much of the rest of human syntax to follow. All we need in addition is what Deacon argues for: a pre-existing substrate of symbols/words (a lexicon). Coupled with the appearance of a single combinatorial operation of “hierarchical concatenation,” this leads directly to many of the distinguishing human syntactic properties: recursive generative capacity; basic grammatical relations like subject and object (and only these); observed locality constraints on natural language syntax; and the non-counting, structure dependence of natural language rules.
Put another way, while it is surely true that natural language, like the vertebrate eye, is in some sense an “organ of extreme complexity and perfection” in Darwin’s terms, we shall argue that one does not need to advance incremental, adaptationist arguments with intermediate steps to explain much of natural language’s specific syntactic design.
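
The “hierarchical concatenation” operation invoked above is standardly called Merge in the Minimalist Program the abstract cites. As a toy illustration (an editorial sketch in Python, not the author’s code), a single binary, hierarchy-building operation already yields unbounded recursive structure:

  # A toy version of binary hierarchical concatenation ("Merge"): each
  # application combines two syntactic objects into one nested structure,
  # and repeated application gives recursive generative capacity.
  def merge(a, b):
      return (a, b)

  # "the boy saw the girl": nested applications, never flat concatenation
  np1 = merge("the", "boy")
  np2 = merge("the", "girl")
  vp = merge("saw", np2)
  sentence = merge(np1, vp)
  print(sentence)  # (('the', 'boy'), ('saw', ('the', 'girl')))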

“Evolutionary Engineering: How our Mental Capacities were Built froma Set of Primitives.”
Marc Hauser, Harvard University

Abstract:
Studies within cognitive neuroscience illustrate how patients with particular forms of neurological damage can exhibit dissociations between implicit and explicit knowledge. Similarly, recent work in cognitive development shows that up to a certain age, infants have an implicit understanding of the physical and psychological world, but that this knowledge only becomes explicit with the emergence of other domains of expertise such as language. In this presentation, I would like to propose that the distinction between implicit and explicit knowledge can shed light on cognitive evolution, and in particular, on the ways in which nonhuman animals are relatively constrained in their cognitive abilities. More specifically, I propose that in the domain of belief-desire psychology, nonhuman animals are limited to implicit understanding and that the evolutionary acquisition of language facilitated the transition from implicit to explicit understanding of the psychological world. To evaluate this hypothesis, I present results from experiments on nonhuman primates using tasks that tap into both implicit and explicit knowledge. Tests of implicit knowledge are modeled after those used in studies of prelinguistic infants. Specifically, we use the preferential looking time procedure to explore what nonhuman primates know about (i) invisible displacements, (ii) the causes of object contact and movement, (iii) the distinction between animate and inanimate objects, and (iv) objects with goals, desires, and beliefs. We then use the results from tests of implicit knowledge to explore comparable problems at an explicit level. Results support the position that for some domain-specific systems of knowledge, nonhuman primates may be incapable of accessing, explicitly, what they know implicitly. Consequently, nonhuman primates may be frozen in an implicit state of understanding the world, in much the same way that some autistic children and prelinguistic human infants appear to be.

“Could Grammar Have Evolved?”
Terry Deacon, McLean Hospital, Harvard Medical School, and Boston University

Abstract:
One intention of many models of language evolution is to explain how grammatical analysis algorithms could have evolved to become innately represented in some form within the human mind (read: brain). Though I think that the presumed necessity of postulating such a linguistic instinct to explain language learning is incorrect for other reasons (see Deacon, 1997), I intend in this paper to examine the claim that a biological evolutionary mechanism could produce such results.
A biologically plausible model cannot merely postulate some big-bang mutation that creates this unique and sophisticated set of interlinked algorithms by accident. It must instead either show how it is already present in other systems, in a sort of preadaptationist argument (e.g., theories that grammar and syntax are entirely computationally parasitic on prior motor action programming systems), or show how it can be progressively “instantiated” in the brain’s wetware by a form of Baldwinian evolution, which, by entirely Darwinian means, gradually allows patterns of language use to select for specific biological substrates that support more efficient performance.
Though I am sympathetic with the preadaptationist approach, and the analogies between syntax and action planning are insightful, I do not think the strong form of this argument can succeed. The principal evidence against it is the fact that the corresponding preadaptation is present in quite elaborate form in most birds and mammals, without any correlated language ability. There are also well-known linguistic challenges to many related functionalist accounts of grammar which cannot be entirely discounted.
Baldwinian models also provide very informative approaches, and some have led to remarkable predictions about the rates and constraints of this sort of coevolution. However, all Baldwinian models must make certain assumptions that I will show cannot be met, specifically in the case of language. These have to do with the requirement that, for selection to take place, the neural substrate of the linguistic computation in question must be essentially the same for all users in essentially all circumstances across long stretches of evolutionary time.
The principal problem is the identification of which sign stimuli will constitute which grammatical categories of language elements. If the assignment of these grammatical categories (or any surrogate categorical distinction) can be “given” by some universal invariants in the input or output stimuli, or in the computation itself, then there can be a universal neural substrate correlated with the grammatical computation, enabling selection to take place.
Such invariants are evident in the case of many aspects of language function (sound analysis, speech articulation, symbol learning, short-term verbal memory demands) and in much nonlanguage communication (alarm calls, honey bee foraging dances, human laughter and sobbing), but not with respect to the categorical distinctions that define the basic units of a grammar and syntax.
Because these are symbolic rule systems applied to arbitrary and highly variable surface structures, from phonemes to phrases, they are intrinsically not correlated with invariants present in the surface structure or use patterns, and they change rapidly in very brief time spans. Grammatical and syntactic knowledge thus cannot evolve to be increasingly innate for essentially the same reasons that word reference cannot: the indirect nature of symbolic reference (on which both are based) necessarily undermines any potential invariance relationships. The interesting question is “What invariant relationships are left for explaining the unquestioned evolution of a uniquely human propensity for grammar and syntax?”

“Richard Rudner, Postmodern Separatism and The Symbolic Species”
David Rudner, Tufts University

Abstract:
The philosophy of science and the ethnography of science represent, respectively, prescriptive and descriptive approaches to a common topic. But neither is pure. Both disciplines poach in each other’s waters. Both pass judgments about each other’s competencies. Neither one can ignore the other. This paper will explore certain key elements in the postmodern ethnographic critique of science, a surprising convergence between these elements and parallel tenets in a pragmatic philosophy of science, and the prospects for a new science of humanity based on empirical findings reported in Terrence Deacon’s book, The Symbolic Species.
I begin this discussion by quoting a paper that my father, Richard Rudner, wrote twenty-five years ago. At the time, he had taken up the gauntlet against philosophers and social scientists who would deny the very idea of social science. At best, according to proponents of what my father termed the “separatist position,” students of human behavior were engaged in what Peter Winch called “a misbegotten epistemology”—an exercise in analysis, explication, and definition, but an exercise that necessarily lacked any component of empirical verification. In this regard, the social sciences were held to be essentially distinct from the non-social or non-human or natural sciences.
The separatist arguments were diverse. Some were powerful and subtle; others not. My father addressed all of them and, as far as I’m concerned, demolished them. But his efforts predated the rise of postmodernism in the academy and, twenty-five years later, the separatists are back with a vengeance. Indeed, in an ironic twist, some of the more extreme separatist arguments now militate against the possibility of any science at all—social or natural. In effect, these extremists have become proponents of a new unified view of human inquiry—but a non-scientific view.
I believe my father would have welcomed the unification of systematic inquiry and even relished the irony in the separatists’ self-inflicted alliance with the non-social sciences. I think he would also have welcomed many aspects of the postmodern critique of science. But I am also certain that he would have deconstructed that critique and made use of the deconstruction to forge a new vision of science that covered both human and non-human domains. My task in this paper will be to explore certain key elements in the postmodern critique of science, a surprising convergence between these elements and parallel tenets in a pragmatic philosophy of science, and the prospects for a new science of humanity based on empirical findings reported in Terrence Deacon’s book, The Symbolic Species.

Contents:
The Gauntlet
Science as a Cultural System
Power, Hegemony and the Privileging of Science
Discursive Ascent
Indeterminacy, Plurality and Incoherence
“The Scientist qua Scientist Makes Value Judgments”
The Symbolic Species and the Evolution of Science
 
 
 

Symposium: The Developmental Systems Perspective in Philosophy of Biology

Organizer: Peter Godfrey-Smith, Stanford University

Chair: TBA

“Causal Symmetries and Developmental Systems Theory”
Peter Godfrey-Smith, Stanford University

Abstract:
Some of the central philosophical claims made by Oyama and other developmental systems theorists concern causation. In particular, it is argued that there is no way to apportion causal responsibility between genetic and environmental factors when explaining particular biological traits. So developmental systems views oppose the idea that some traits are “mostly genetic” while others are “mostly environmental.”
The plausibility of this idea depends on how causal relations are understood. I will evaluate this part of the developmental systems view, making use of some new ideas about the nature of causation.

“Development, Culture, and the Units of Inheritance”
James R. Griesemer, University of California at Davis

Abstract:
Developmental systems theory (DST) expands the unit of replication from genes to whole systems of developmental resources, which DST interprets in terms of cycling developmental processes. Expansion seems required by DST’s argument against privileging genes in evolutionary and developmental explanations of organic traits. DST and the expanded replicator brook no distinction between biological and cultural evolution. However, by endorsing a single expanded unit of inheritance and leaving the classical molecular notion of the gene intact, DST achieves only a nominal reunification of heredity and development. I argue that an alternative conceptualization of inheritance denies the classical opposition of genetics and development while avoiding the singularity inherent in the replicator concept. It also yields a new unit—the reproducer—which genuinely integrates genetic and developmental perspectives. The reproducer concept articulates the non-separability of “genetic” and “developmental” roles in units of heredity, development, and evolution. DST reformulated in terms of reproducers rather than replicators leaves intact a conceptual basis for an empirically interesting distinction between cultural and biological evolution.

“Causal Democracy and Causal Contribution in DST”
Susan Oyama, John Jay College, CUNY

Abstract:
In reworking a variety of biological concepts, DST has made frequent use of parity of reasoning. We have done this to show that factors that have similar sorts of impact on a developing organism, for instance, tend nevertheless to be invested with quite different causal importance. We have made similar arguments about evolutionary processes. Together, these analyses have allowed DST not only to resolve some age-old muddles about the nature of development, but also to effect a long-delayed reintegration of developmental into evolutionary theory.
Our penchant for causal symmetry, however (or “causal democracy,” as it has recently been termed), has sometimes been misunderstood. This paper shows that causal symmetry is neither a platitude about multiple influences nor a denial of useful distinctions, but a powerful way of exposing hidden assumptions and opening up traditional formulations to fruitful change.

“Development, Evolution, Adaptation”
Kim Sterelny, Victoria University, New Zealand

Abstract:
Developmental systems theorists argue that standard versions of neo-Darwinism need fundamental reorganization, not just repair. They argue, for example, that the notion of an innate trait depends on a dichotomous view of development that is irredeemably confused. They have argued that the “Modern Synthesis” downplayed the role of development and failed to integrate developmental biology within evolutionary biology. But they further suggest that developmental biology cannot be integrated within evolutionary biology without fundamentally rethinking neo-Darwinian ideas. In this paper I shall argue against these claims. The developmental systems theory critique of the received view of evolutionary theory has often been insightful, but these insights can, I argue, be incorporated within a modified version of that view.
 
 

Symposium: The Prospects for Presentism in Spacetime Theories

Organizer: Steven Savitt, The University of British Columbia

Chair: Lawrence Sklar, University of Michigan

“There’s No Time Like the Present (in Minkowski Spacetime)”
Steven Savitt, The University of British Columbia

Abstract:
Mark Hinchliff concludes a recent paper, “The Puzzle of Change,” with a section entitled “Is the Presentist Refuted by the Special Theory of Relativity?” His answer is “no.” I respond by arguing that presentists face great difficulties in merely stating their position in Minkowski spacetime. I round up some likely candidates for the job and exhibit their deficiencies.
Specifically, I consider proposals that the surface of a point’s past light cone (Godfrey Smith), its absolute elsewhere (Weingard), and only the point itself (Sklar?) be considered its present. I also consider the suggestion that the present be relativized to “observers” and reject it in (I think) a novel way.
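
The candidate “presents” canvassed here are regions fixed by the Lorentzian interval. A minimal sketch (an editorial illustration in Python with units where c = 1, not part of the abstract) classifies events relative to a given point, here the origin:

  # Classify an event (t, x, y, z) relative to the origin event, with c = 1.
  # The proposals above correspond to: the past light cone surface
  # (Godfrey Smith), the absolute elsewhere (Weingard), the point itself (Sklar?).
  def region(t, x, y, z):
      s2 = t**2 - (x**2 + y**2 + z**2)  # Minkowski interval squared
      if (t, x, y, z) == (0, 0, 0, 0):
          return "the point itself"
      if s2 < 0:
          return "absolute elsewhere (spacelike separated)"
      if s2 == 0:
          return "past light cone surface" if t < 0 else "future light cone surface"
      return "absolute past" if t < 0 else "absolute future"

  print(region(-1, 1, 0, 0))  # past light cone surface
  print(region(0, 2, 0, 0))   # absolute elsewhere (spacelike separated)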

“Special Relativity and the Present”
Mark Hinchliff, Reed College

Abstract:
This paper opens by responding to a set of stock objections to presentism (the view, roughly speaking, that only presently existing things exist) that arise from the special theory of relativity. It then considers two further STR-based objections that have received less attention. And it ends with a proposal for how to fit a view of time, presentism, having considerable intuitive support, together with a scientific theory, STR, having considerable empirical support. Guiding the paper in many places is an analogy with the complexity and subtlety of fitting our intuitive view of the mind with our best scientific theories of the body.

“Is Presentism Worth Its Price?”
Craig Callender, The London School of Economics

Abstract:
This paper highlights some of the undesirable consequences of presentism and then asks whether the original reasons for believing in presentism are worth suffering these consequences. First, I review the state of presentism in a special relativistic (SR) world. I argue that recent contributions which define “objective becoming” upon Minkowski spacetime (e.g., Dorato 1995, Hogarth and Clifton 1995, and Rakic 1997) are not really doing presentists a favor. By “deflating” tenses so that they are compatible with correctly representing the world as a 4-dimensional manifold, these recent contributions miss the point of presentism. With no escape through the “deflated tenses” program, I claim that we must agree with S. Savitt (see his contribution) that no plausible presentist position compatible with SR is in the offing.
I then briefly look at general relativity (GR), where matters are even worse. Here presentists must try to make sense of objective becoming in all sorts of “pathological” spacetimes, e.g., spacetimes without the ability to be temporally oriented, without global time functions, with closed timelike curves, and more. If SR makes one pessimistic about the chances of reconciling presentism with modern spacetime physics, GR causes one to despair.
One way out of all of these problems is to interpret relativity (SR and GR) “instrumentally,” à la J.S. Bell. Since this reading of relativity may be forced upon us by the nonlocalities of quantum mechanics, this move (for this reason) may be a respectable one. Understood either way, it is clear that presentism requires a radical re-thinking of contemporary spacetime physics. Given all the work one must do to sustain presentism, therefore, I would like to re-examine the original reasons for believing it in the second half of the paper. Are these reasons so compelling as to warrant radically re-conceiving contemporary physics? I think not. For even in its own territory, metaphysics, the case for presentism is far from convincing. Presentism just isn’t worth its price.

“Presentism According to Relativity and Quantum Mechanics”
Simon Saunders, University of Oxford

Abstract:
Presentism is not, I claim, a viable metaphysical position, given the special or general theory of relativity as currently understood. For presentism is only plausible insofar as “the present” is taken to be an intersubjective spatial reality, rather than attaching to an individual person or worldline. From a 4-dimensional point of view, this is to specify a particular space-time foliation; but failing special matter distributions (and associated solutions to the Einstein equations), there is no such privileged foliation. An example, indeed, would have been the Lorentz ether; this is just the kind of object required by presentism (and, in general, by “tensed” theories of time). In renouncing the ether as a physical object, Einstein keyed the equations to space-time structure instead, and not to any special matter distribution.
On the other hand, we can surely define an “effective” dynamics, appropriate for the description of a single world-line. There is no claim that the resulting equations have universal applicability; they are as keyed to the specific as you like. But do they have to obey at least some of the fundamental symmetries? Would it matter if these equations turned out to be non-local, for instance?
The question has parallels in the case of quantum mechanics. In the de Broglie-Bohm theory there is likewise an “effective” dynamics (an “effective” state reduction); the same is true in the Everett relative-state theory. The latter, of course, purports to respect the space-time symmetries. Would it matter if these equations were non-local or non-covariant? I suggest that it would, and locality and covariance should function as a constraint even at this level.
Locality and covariance cannot be renounced at the “effective” level because, ultimately, that is the basis of the experience of each one of us. If these theories cannot deliver at that level, then any purported consistency of the fundamental (as opposed to effective) equations with relativity would seem little more than specious.

Commentator: Lawrence Sklar, University of Michigan
 
 

Symposium: Special Relativity and Ontology

Organizer: Yuri V. Balashov, University of Notre Dame

Chair: TBA

“Becoming and the Arrow of Causation”
Mauro Dorato, University of Rome

Abstract:
In a reply to Nicholas Maxwell, Stein (1991) has proved that Minkowski spacetime can leave room for the kind of indeterminateness required both by certain interpretations of quantum mechanics and by objective becoming. More recently, his result has been strengthened and extended to worldline-dependent becoming by Clifton and Hogarth (1995). In a recent paper, however, I have argued that by examining the consequences of outcome-dependence for the co-determinateness of spacelike-related events in Bell-type experiments, it turns out that the only becoming relation that is compatible with both causal and noncausal readings of nonlocality is the universal relation (Dorato 1996). Such a result will be discussed vis-à-vis recent claims to the effect that the relation of causation should be treated as time-symmetric (Price 1996), something which would rule out, on a different basis, Stein’s attempt at defining objective becoming in terms of an asymmetric relation of causal connectibility. Other, more recent attempts (Rakic 1997) at extending Minkowski spacetime by introducing a non-invariant becoming relation will be examined and rejected on methodological grounds. Finally, aspects of the crucial problem of exporting current attempts at defining becoming in STR into GTR will be presented.

“QM and STR: Combining Quantum Mechanics With Relativity Theory”
Storrs McCall, McGill University

Abstract:
Stein (1991) and Clifton & Hogarth (1995) argue that Minkowski spacetime permits the notion of objective becoming to be defined, provided that an event may be said to have “become” only in relation to another event in its absolute future (Stein) or to a particular inertial worldline or observer (Clifton & Hogarth). Non-locality in quantum mechanics, on the other hand, would appear to call for a broader definition. Thus in the Bell-Aspect experiment, if the outcome at A is h, the probability of h at B is p, whereas if the outcome at A is v the probability is p'. For the photon at B it is plausible to argue that the outcome at A has “already become,” or that the entangled quantum state has “already partially collapsed,” even though A lies outside the past light cone of B and hence has not “become” in the Stein-Clifton-Hogarth sense.
A broader definition would make becoming and collapse relative not to particular events or worldlines but to coordinate frames. Thus in frame f the collapse at A affects the probability at B, while in frame f' B precedes A and the collapse at B affects the probability at A. Becoming and collapse are world-wide processes, though always relative to a frame, and causal influences in the EPR experiment are reciprocal rather than unidirectional. We end by showing that the apparently conflicting currents of world-wide becoming in different frames can be reconciled in a Lorentz-invariant branched spacetime structure defined in terms of formal constraints on an ordering relation ‘<’, the Einstein causal relation.
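
The frame-relativity invoked here is the standard relativity of simultaneity: for spacelike-separated events, temporal order is frame-dependent. A minimal sketch (an editorial illustration in Python with invented coordinates and c = 1, not part of the abstract) shows a boost reversing the order of the two measurement events A and B:

  import math

  # Time coordinate of event (t, x) in a frame moving at velocity v (c = 1).
  def boosted_time(t, x, v):
      gamma = 1.0 / math.sqrt(1.0 - v**2)
      return gamma * (t - v * x)

  # Invented coordinates: A at the origin, B spacelike separated from A.
  A, B = (0.0, 0.0), (0.5, 2.0)

  print(boosted_time(*A, v=0.0), boosted_time(*B, v=0.0))  # frame f: A before B
  print(boosted_time(*A, v=0.6), boosted_time(*B, v=0.6))  # frame f': B precedes A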

“Relativity and Persistence”
Yuri Balashov, University of Notre Dame

Abstract:
The nature of persistence over time has been intensely debated in contemporary metaphysics. The two opposite views are widely known as “endurantism” (or “three-dimensionalism”) and “perdurantism” (“four-dimensionalism”). According to the former, objects are extended in three spatial dimensions and persist through time by being wholly present at any moment at which they exist. On the rival account, objects are extended both in space and time and persist by having “temporal parts,” no part being present at more than one time.
Relativistic considerations seem highly relevant to this debate. But they have played little role in it so far. This paper seeks to remedy that situation. I argue that the four-dimensional ontology of perduring objects is required, and the rival endurantist ontology is ruled out, by the special theory of relativity (SR). My strategy is the following. I take the essential idea of endurantism, that objects are entirely present at single moments of time, and show that it commits one to unacceptable conclusions regarding coexistence, in the context of SR. I then propose and discuss a plausible and relativistically invariant account of coexistence for perduring objects, which is free of these defects. The options available to the endurantist are as follows: (1) reject endurantism, (2) reject the idea that objects can coexist, (3) reject SR, or (4) take an instrumentalist stance with regard to it.
Of these options, (4) is surely the least implausible one for the endurantist. I briefly discuss what is at stake in taking it up.
 
 

Symposium: Kuhn, Cognitive Science, and Conceptual Change

Organizer: Nancy J. Nersessian, Georgia Institute of Technology

Chair: Nancy J. Nersessian, Georgia Institute of Technology

“Continuity Through Revolutions: A Frame-Based Account of Conceptual Change During Scientific Revolutions”
Xiang Chen, California Lutheran University, and
Peter Barker, University of Oklahoma

Abstract:
According to Kuhn’s original account, a revolutionary change in science is an episode in which one major scientific system replaces another in a discontinuous manner.  The scientific change initiated by Copernicus in the 16th century has been used as a prototype of this kind of discontinuous scientific change.  However, many recent historical studies indicate that this “prototype” exhibited strong continuity.  We suggest that the difficulty in capturing continuity results, in part, from an unreflective use of the classical account of human concepts, which represents concepts by means of necessary and sufficient conditions.
A more satisfactory account has been developed in cognitive psychology over the last decade, using “frames” as the basic vehicle for representing concepts.  Using frames to capture structural relations within concepts and the direct links between concept and taxonomy, we develop a model of conceptual change in science that more adequately reflects the current insight that episodes like the Copernican revolution are not always abrupt changes.  We show that, when concepts are represented by frames, the transformation from one taxonomy to another can be achieved in a piecemeal fashion not preconditioned by a crisis stage, and that a new taxonomy can arise naturally out of the old frame instead of emerging separately from the existing conceptual system.  This cognitive mechanism of continuous change demonstrates that anomaly and incommensurability are no longer liabilities, but play a constructive role in promoting the progress of science.
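As a purely illustrative gloss on the frame idea (a minimal sketch in Python, not the authors’ own formalism; all names and attribute choices here are hypothetical), a frame can be rendered as an attribute-value structure, and the Copernican shift as the revision of a single slot:

    # Hypothetical sketch: a concept as a frame of attribute slots,
    # each slot carrying a set of admissible values.
    from dataclasses import dataclass, field

    @dataclass
    class Frame:
        name: str
        slots: dict = field(default_factory=dict)  # attribute -> admissible values

        def matches(self, instance: dict) -> bool:
            # An instance falls under the concept if every attribute it
            # fills takes an admissible value for the corresponding slot.
            return all(value in self.slots.get(attr, {value})
                       for attr, value in instance.items())

    # Toy example: the Copernican revision changes one slot of "planet"
    # while leaving the rest of the frame intact -- a piecemeal change.
    ptolemaic = Frame("planet", {"orbits": {"Earth"}, "light": {"reflected", "intrinsic"}})
    copernican = Frame("planet", {"orbits": {"Sun"}, "light": {"reflected"}})

    print(ptolemaic.matches({"orbits": "Earth"}))   # True
    print(copernican.matches({"orbits": "Earth"}))  # False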

“Nomic Concepts, Frames, and Conceptual Change”
Hanne Andersen, University of Roskilde, Denmark, and
Nancy J. Nersessian, Georgia Institute of Technology

Abstract:
In his last writings Thomas Kuhn introduced a distinction between normic concepts—concepts such as ‘liquid’, ‘gas’ and ‘solid’, which together form contrast sets—and nomic terms—terms such as ‘force’ which form part of laws of nature, such as Newton’s three laws of motion.  We argue that family resemblance is the basis on which both normic and nomic concepts build, but that nomic concepts show an additional fine structure that cannot be explained by the family resemblance account alone.
We develop a cognitive-historical account of concepts, drawing both on investigations from the cognitive sciences on how humans reason and represent concepts, generally, and on fine-grained historical investigations of cognitive practices employed in science.  Our account shows that representing nomic concepts requires extending current cognitive science research on family resemblance concepts.  We propose that nomic concepts can be represented in frame-like structures that capture such aspects of the concept in question as the ontological status, function, and causal power attributed to it in explanations of the problem situations in which it participates.  Finally, we explore the implications of this account for conceptual change, specifically for the problem of the cognitive disequilibrium preceding conceptual change.  On our account of concepts, conceptual change can be triggered by two different kinds of disequilibrium in the conceptual system: either related to changes of the similarity classes of problem situations or related to changes in the fine structure of the situation type.

“Kuhnian Puzzle Solving and Schema Theory”
Thomas J. Nickles, University of Nevada at Reno

Abstract:
In a recent article I showed how case-based reasoning (CBR) illuminates and corrects Kuhn’s account of puzzle solving by direct modeling on exemplars.  But CBR helps only up to a point, where it must yield to something like schema theory.  After developing this theme and attending to some difficulties within schema theory, I consider the implications for methodology and philosophy of science.  (1) A non-rules conception of inquiry obviously challenges traditional conceptions of method and of inquiry itself as a rationally intelligible, step-by-step procedure.  (2) Kuhnian inquiry is rhetorical rather than logical in the senses that similarity, analogy, metaphor, etc., are the important cognitive relations; Kuhn’s similarity metric is graded rather than all-or-nothing; and he introduces place (topos) as well as time (history) into his model of science.  Rather than searching the space of all logical possibilities, scientists in a given historical period construct and explore local search spaces around the points they actually occupy.  The resulting scientific development resembles the nearly continuous biological evolutionary pathways through design space more than the great leaps that have long tempted rationalistic philosophers.  But what could methodology then look like?
 
 
 

Symposium: The Medical Consensus Conference

Organizer: Miriam Solomon, Temple University

Chair: TBA

“The Epistemic Contribution of Medical Consensus Conferences”
Paul Thagard, University of Waterloo

Abstract:
I recently completed a study of the development and acceptance of the bacterial theory of peptic ulcers (Thagard, forthcoming).  A key event in the general acceptance of this theory, which was greeted with great skepticism when it was first proposed in 1983, was a Consensus Conference sponsored by NIH in 1994 that endorsed antibacterial treatment of ulcers.  In the following three years, at least nine other countries held similar conferences, and the American Digestive Health Foundation sponsored a second American consensus conference in February 1997.  I attended the Canadian Helicobacter Pylori Consensus Conference that took place in Ottawa in April 1997.  My talk will describe the aims and events of the conference, then focus on the general question: how do medical consensus conferences contribute to medical knowledge and practice?  After a discussion of the nature of evidence-based medicine and the logic involved in decisions about treatment and testing, I extend and apply Alvin Goldman’s (1992) epistemic standards to evaluate the scientific and practical benefits of consensus conferences.

Title: TBA
John H. Ferguson, Office of Medical Applications of Research, NIH

Abstract:
To be provided

“Are Consensus Conferences Political Instruments?”
Reidar Lie, University of Bergen, Norway

Abstract:
Consensus conferences have been adopted by health authorities in a number of countries as a method of technology assessment in medicine.  The idea is that by having evidence from a variety of experts presented to a panel of neutral people, it is possible to reach a balanced decision concerning appropriate use of a new technology.  Critics have pointed out that the conclusion reached by the panel depends crucially on which experts and panelists are chosen, and that it is possible to predict the decision if you know who these people are.  There is also a problem when consensus conferences are organized in rapidly evolving fields of knowledge: the results of ongoing clinical trials may, for example, make the decision by a consensus conference quickly obsolete.  For this reason one may suspect that consensus conferences are used as political instruments rather than as instruments of neutral assessment of new technologies.  Two examples from Norway, the consensus conferences to assess ultrasound screening in pregnancy and estrogen replacement therapy, will be used to examine this claim.

Commentator: Miriam Solomon, Temple University
 
 

Symposium: Relations between Philosophy of Science and Sociology of Science in Central Europe, 1914–1945

Organizer: Alan Richardson, University of British Columbia

Chair: TBA
 

“On the Relations between Psychologism and Sociologism in Early Twentieth-Century German Philosophy”
Martin Kusch, Cambridge University, UK

Abstract:
In this paper I shall investigate one of the intellectual roots of the early sociology of knowledge in the German-speaking world (especially Fleck, Jerusalem, Mannheim, Scheler).  This root is the debate over psychological naturalism.  I shall show that the sociology of knowledge was informed and influenced both by psychologism and by the phenomenological critique of psychologism.  I shall also focus on missed opportunities, and on some important social influences upon the sociology of knowledge (e.g. one might say that a strong programme in the sociology of knowledge first surfaced in Scheler’s war-writings).
 

“Edgar Zilsel’s ‘Sociological Turn’ in Philosophy”
Elisabeth Nemeth, University of Vienna, Austria

Abstract:
Edgar Zilsel’s studies on the origins of modern science are well known to historians and sociologists of science.  Philosophers, by contrast, have hardly taken note of his work.  One reason might be that it is difficult to define his philosophical position “on the periphery” of the Vienna Circle; that is, it cannot be exclusively related to analytic philosophy or, for that matter, to any other twentieth-century philosophical tradition.  It could also be that philosophers have never really appreciated the “sociological and historical turn” that Zilsel wanted to introduce in philosophy.  That Zilsel is ignored as a philosopher is certainly unfortunate.  First, his philosophical position is interesting in and of itself: his Kantian philosophy of science, in conjunction with a materialist theory of history and his very broad research interests in the history of philosophy, aesthetics and science, brought forth his unique sort of empiricism.  Second, Zilsel provides an interesting illustration of how a philosophical and epistemological concern with science can be fruitfully incorporated in a project of empirical research.  My essay will sketch some of the characteristic features of this “transposition” of philosophy into sociology.
 

“Logical Empiricism and the Sociology of Knowledge: Rethinking the Relations of Philosophy and Sociology of Science”
Alan Richardson, University of British Columbia

Abstract:
Recent re-appraisals of logical empiricism have highlighted some hitherto seldom noticed features of the “received view” of science and related projects such as Popper’s critical rationalism.  A number of themes in the recent interpretative work invite a curiosity about the relation of logical empiricism to sociology of science as practiced in the 1930s and 1940s and as practiced now.  This essay will examine some themes in Carnap’s philosophy of science with an eye toward their relations to historical and contemporary concerns with the social aspects of knowledge.  The themes are: a concern for objectivity in the sense of intersubjectivity; pluralism, conventionalism, and voluntarism in the foundations of knowledge; practical principles for the adoption of linguistic frameworks, practical rationality, and the limits of epistemological explanation.  It will explore features of Carnap’s project in relation to Scheler’s sociology of science (especially, the rejection of transcendental philosophy of value as a constraint on practical rationality), Merton’s normative sociology, and current projects in SSK, especially David Bloor’s work on constructivism and conventionalism and Harry Collins’s on the experimenter’s regress.

“Otto Neurath and the Sociology of Knowledge”
Thomas E. Uebel, The London School of Economics, UK

Abstract:
Neurath was the only one of the core members of the Vienna Circle who actively pursued an interest in the sociology of knowledge.  This paper will consider several aspects of Neurath’s involvement, partly to chronicle it historically and partly to assess its systematic place in his thought.  It will show that, predisposed by his training in the social sciences, Neurath early on showed an awareness of the ideological dimension of economic and social theories and, by the time of the Circle, of that of scientific metatheory and philosophy.  Relevant texts include the Circle’s manifesto as well as his critique of Mannheim.  Moreover, his own version of naturalism, already controversial in the Circle itself, attempted to integrate sociological considerations within his epistemology.  A brief comparison with some contemporary approaches will conclude the paper.
 
 

Symposium: Realism and Classification in the Social Sciences

Organizer: Michael Root, University of Minnesota

Chair: John Dupre, University of London

“How We Divide the World”
Michael Root, University of Minnesota

Abstract:
Most systems of classification or kinds in the social sciences are invented rather than discovered, though, in most cases, invented by the subjects of the science and not the social scientist.  Some of these kinds are real and others only nominal.  According to conventional wisdom, this cannot be, for a kind cannot be both constructed and real.  I explain how kinds can be both, and use the account to resolve a recent debate over the reality of race.

“The Politics of Taxonomy”
Richard W. Miller, Cornell University

Abstract:
In the natural sciences, taxonomies are justified by their role in facilitating the explanatory use of a unifying theory (for example, evolutionary theory in the case of biology).  But even on the broadest understanding of ‘theory’, no warranted unifying theory regulates the scientific explanation of any significant range of social phenomena.  This distinctive disunity of the social sciences, along with the distinctive nature of social questions, sustains a different way of assessing classifications as marking genuine distinguishing properties, a form of taxonomic assessment in which moral evaluations and political contexts are directly relevant.  I will show how such assessment should proceed in the judgment of the social scientific standing of classifications by ethnicity and nationality.  Appealing to connections in natural science between classification, communication and interests in change, I will argue that the inherently political nature of social scientific taxonomy is not a barrier to objectivity.  Rather, traditional questions about the boundary between facts and values become questions of the rational social division of labor in social inquiry.

“Race: Biological Reality or Social Construct?”
Robin Andreasen, University of Wisconsin, Madison

Abstract:
Race was once thought to be a real biological kind.  Today, however, the dominant view is that biological races don’t exist.  Theorists are now debating the question of the social reality, or lack thereof, of race.  I challenge the trend to reject the biological objectivity of race by arguing that cladism—a branch of systematic biology that individuates taxa solely by appeal to common ancestry—in conjunction with current work in human evolution, provides a new way to define race biologically.  I then show how to reconcile the proposed biological conception with current social scientific theories about the nature of human racial categories.  Specifically, I argue that social scientific conceptions of race and the cladistic concept are complementary; they should not be interpreted as in competition.
 
 

“Local Realism and Global Arguments”
Harold Kincaid, University of Alabama at Birmingham

Abstract:
The paper looks at realism issues in the social sciences, taken as debates over whether basic predicates of kind terms in the social sciences should be taken realistically, i.e. as picking out mind-independent entities.  The first half looks at various general conceptual arguments given to show that social kinds should not be taken realistically.  These arguments include (1) claims that the best explanation of poor empirical progress in the social sciences is that they lack natural kinds, (2) arguments based on the idea that social kinds don’t have the right relation to the physical, (3) doubts based on the lack of any clear fact/value distinction in the social sciences, and (4) arguments pointing to the apparent multiplicity of competing ways of dividing up the social world.  I argue that none of these considerations establishes that social predicates do not refer.  At most, they provide important clues to potential obstacles.  The second half of the paper then goes on to argue that the interesting issues are not ones that can be decided independently of substantive issues in the social sciences themselves, and that therefore trying to decide realism issues in a global fashion for all social science is misguided.  I thus look in detail at some basic categories in economics—in particular, the productive/nonproductive distinction, which goes to the heart of the discipline—as a way of identifying some of the various kinds of general empirical considerations that are at issue in deciding when social kinds should be taken realistically and when they should not.  Crucial issues involve whether we can produce well-confirmed causal explanations, and I try to show how that is possible with social kinds despite the doubts that have been raised about them.
 
 

Symposium: The Structure of Scientific Theories Thirty Years On

Organizers:  Steven French, Leeds University, and
  Nick Huggett, University of Illinois at Chicago

Chair: TBA

“Understanding Scientific Theories: An Assessment of Developments, 1969–1998”
Frederick Suppe, University of Maryland

Abstract:
The positivistic Received View construed scientific theories syntactically as axiomatic calculi in which theoretical terms were given a partial interpretation via correspondence rules connecting them to observation statements.  This view came under increasing attack in the 1960s, and its death can be dated March 26, 1969, when Carl G. Hempel opened the Illinois Symposium on Theories with an address repudiating the Received View and proposing an alternative analysis.  This paper will assess what, with hindsight, seem the most important defects in the Received View; survey the main proposed successor analyses to the Received View—semantic reworkings of the Received View, various versions of the Semantic Conception of Theories, and the Structuralist Analysis of Theories; evaluate how well they avoid those defects; examine what new problems they face and where the most promising require further development or leave unanswered questions; and explore implications of recent work on models for understanding theories.

“Theories, Models and Structures: Thirty Years On”
Steven French, University of Leeds, and
Newton da Costa, University of Sao Paulo

Abstract:
In The Structure of Scientific Theories, Suppe put forward the Semantic Approach as a “... promising alternative to the linguistic formulation approaches of the Received View and Weltanschauungen analyses.”  This approach has since entered the “mainstream” of philosophy of science through the work of van Fraassen, Giere and Suppe himself.  Our aim in this paper is to survey the developments of the past thirty years from the perspective of Suppes’ set-theoretic analysis and to suggest how recent work might overcome a series of objections to the Semantic Approach.

“The Emergence of Feminist Philosophy of Science”
Helen E. Longino, University of Minnesota

Abstract:
This paper will describe the emergence of feminist concerns in philosophy of science.  Feminist philosophers and scientists have been concerned with explicitly sexist or androcentric aspects of scientific research programs and with the metaphoric gendering of various aspects of scientific practice and content.  In the effort to identify and understand these features of the sciences, feminist scholars have advanced theses about the role of social values in scientific research, about the character of experiment, about evidential relations, and about science and power.  In this they have made common cause with scholars in social and cultural studies of science, but with important differences.  This paper will articulate the relationship of the work of feminists to other changes in philosophy of science in the last thirty years.

“Local Philosophies of Science”
Nick Huggett, University of Illinois at Chicago

Abstract:
One of the central characteristics of the received view was its tendency to treat questions in the philosophy of science as “global”: what is the nature of scientific theories?  What are the units of significance in science?  What is an explanation?  And so on.  Even problems in specific sciences were typically approached (for instance, by Reichenbach) with answers to the global questions in mind.  Much contemporary work still seeks global answers, but there is also a widespread opinion that science has no essence, and that philosophical work should be concentrated on “local” questions that arise within sciences, independently of general theories about the nature of science.  This idea can be particularly identified, for instance, in the work of Nancy Cartwright, Arthur Fine, and Ian Hacking, as well as in contemporary philosophy of individual sciences.  This paper will discuss the “no essence” view, and evaluate the prospects for “local” philosophical research programs.

“The Legacy of ‘Weltanschauungen’ Approaches in Philosophy of Science”
Alison Wylie, University of Western Ontario

Abstract:
When Suppe mapped the emerging terrain of post-positivist philosophy of science in 1968, what he called “Weltanschauungen” approaches figured prominently.  As a loosely articulated family of views about science, these require that, in various senses and to varying degrees, the language and logic of science, the knowledge it produces, its preferred forms of practice and even its defining epistemic ideals must be contextualized.  The central question that divides the advocates of such approaches is what constitutes “context”: is it restricted to the theoretical, conceptual assumptions that inform inquiry (aspects of context internal to science), or does it extend to various of the (external) human, social conditions that make possible the scientific enterprise?
In the intervening thirty years, “Weltanschauungen” approaches have proven to be the dominant legacy of the demise of positivism in science studies outside philosophy, and they have generated intense debate about the goals and strategies appropriate to philosophical studies of science.  This is an important juncture at which to assess the implications of these developments for philosophy of science.  Increasingly, even the most uncompromising constructivists acknowledge that sociologically reductive accounts of science are inadequate, while a growing number of philosophers recognize the need to broaden the scope of their contextualizing arguments.  Although this rapprochement has been tentative and uneven, it has given rise to a fundamental reassessment of the traditional opposition between epistemic and social categories of analysis that is evident in recent philosophical work in a number of areas.  I will focus here on arguments for rethinking objectivist ideals that recognize their pragmatic, socio-historic contingency.  One central problem here, which feminist philosophers of science have taken particular initiative in addressing, is to explain how it is that the features of context typically seen as contaminants of scientific inquiry can serve as a source of critical insight that enhances the integrity, credibility and, indeed, the objectivity of scientific inquiry.
 
 
 

Symposium: Recent Advances in the Logic of Decision: A Symposium in Honor of Richard Jeffrey

Organizer: Allan Franklin, University of Colorado

Chair: Brian Skyrms, University of California, Irvine

“Why We Still Need the Logic of Decision”
James Joyce, University of Michigan

Abstract:
Richard Jeffrey’s classic The Logic of Decision (LOD) defends a version of expected utility theory that requires agents to strive to secure evidence of desirable outcomes.  As many philosophers have argued, Jeffrey’s system fails as an account of rational choice because it is unable to properly distinguish the causal properties of acts from their merely evidential features.  Only an appropriately “causal” utility theory can do the work of a true “logic of decision.”  This notwithstanding, Jeffrey’s system has two desirable traits that its causal competitors lack.  Ethan Bolker’s elegant representation theorem provides LOD with a theoretical foundation superior to that of any other utility theory.  In addition, LOD assigns utilities to actions in a manner that does not depend on the way their consequences happen to be individuated.  These advantages are so significant, in my view, that we should reject any decision theory that does not use Jeffrey’s system as its underlying account of value.  All value, I claim, is news value seen from an appropriate epistemic perspective, with different perspectives being appropriate for the evaluation of an agent’s actions and for the evaluation of events that lie outside her control.  The upshot of this is that causal decision theorists must employ Jeffrey’s theory in order to properly formulate their own.  I will show how this can be done consistently.  This dependence of causal decision theory on the system of LOD may seem to present a problem, however, since LOD is often compared unfavorably with other versions of decision theory because it cannot, except in recherché cases, extract unique probability/utility representations from facts about preferences (given a zero and unit for measuring utility).  Fortunately, this shortcoming is not a serious one.  Three points need to be made:  First, all theories that claim to deliver unique representations resort to technical tricks of dubious merit.  Second, reflection on the notion of expected utility theory suggests that unique representations should not be extractable from facts about preferences alone, but only from facts about preferences and beliefs.  Third, unique representability can be obtained in LOD if we supplement the Jeffrey/Bolker axioms governing rational preference with independent constraints on rational (comparative) belief.  This last point turns out to have significant theoretical implications because it lets us generalize Bolker’s representation theorem so that it can be used as a foundation for a wide range of decision theories, causal decision theory among them.
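To fix ideas, the contrast Joyce invokes can be stated in standard textbook notation (my formulation, not the abstract’s): Jeffrey’s theory values an act A by the news it brings, conditioning on the act, whereas a causal theory weights dependency hypotheses K_j by their unconditional probabilities,

    \[ V(A) = \sum_i P(S_i \mid A)\,\mathrm{des}(A \wedge S_i), \qquad U(A) = \sum_j P(K_j)\,\mathrm{des}(A \wedge K_j). \]

The two come apart in Newcomb-style cases, where an act is good evidence for an outcome it does nothing to cause.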
 
 

 
 

“Conditionals and the Logic of Decision”
Richard Bradley, London School of Economics

Abstract:
Rational decision making is typically guided by practical reasoning of a hypothetical kind: the sort of reasoning an agent engages in when she supposes that she will perform some or other action under some or other set of conditions and then considers the consequences of her doing so under just these conditions.  In reasoning hypothetically, agents show that they can take rational attitudes to the possibilities falling within the scope of their hypotheses: they can believe that if they jump out the window they will hurt themselves, can desire that if it snows tomorrow school will be cancelled, and can prefer that if beef is for dinner it be of French rather than British origin.
In this paper, I extend Richard Jeffrey’s logic of decision by incorporating conditionals as objects of agents’ attitudes.  (A genuine augmentation is involved because I take conditionals to be non-propositional.)  In the extended logic, probabilities of conditionals equal their conditional probabilities (in accordance with Adams’ famous thesis), and the desirabilities of conditionals are weighted averages of the desirabilities of the mutually exclusive consequences consistent with their truth, where the weights are provided by the probabilities of the antecedently presupposed conditions.
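In symbols (the notation is mine, not the paper’s), the two clauses of the extended logic read

    \[ P(a \to b) = P(b \mid a), \qquad \mathrm{Des}(a \to b) = \sum_i P(a_i)\,\mathrm{Des}(o_i), \]

where the first is Adams’ thesis (assuming P(a) > 0), the a_i are the antecedently presupposed conditions, and the o_i are the mutually exclusive consequences consistent with the conditional’s truth.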
In similar fashion, Bolker’s axioms are augmented to define the conditions satisfied by rational preferences for conditionals (roughly, a conditional a->b is preferable to another conditional c->d iff it is preferable to learn that b is the case if a is than to learn that d is the case if c is).  Unlike in the Jeffrey-Bolker system, the assumption that these conditions are satisfied by an agent’s preferences suffices for them to determine a unique representation of her degrees of belief and, up to a choice of scale, of her degrees of desire (consistent with the extended logic of decision).  This shows, I suggest, that to obtain a complete and precise description of a decision-maker’s state of mind it is necessary to identify the attitudes she takes to the kinds of possibilities expressed by conditionals.
 
 

"Subjective Thoughts on Subjective Probability: What did Dick Jeffrey know and when did he know it? and a Mathematician'sRequest for Help from the Philosophers"
Ethan Bolker, Department of Mathematics and Computer Science, University of Massachusetts, Boston

Abstract:
In this talk I provide some views of contemporary decision theory from my perspective as a professional mathematician, an amateur philosopher, and Dick Jeffrey's friend.  I'll discuss what Dick and I learned from each other in Princeton in 1963, and what I've learned since as a decision theory observer.  In my speculations on the usefulness of that theory I finesse the well-studied question "what is probability?" but will address the question "what is random?", with further thoughts on whether TCMITS will believe the answer.  I conclude with my own interpretation of Jeffrey's radical probabilism.
 
 
 

Commentator: Richard Jeffrey, Princeton University