Duhem: Images of Science, Historical Continuity, and the First Crisis in Physics

Duhem used historical arguments to draw philosophical conclusions about the aim and structure of physical theory. He argued against explanatory theories and in favor of theories that provide natural classifications of the phenomena. This paper presents those arguments and, with the benefit of hindsight, uses them as a test case for the prevalent contemporary use of historical arguments to draw philosophical conclusions about science. It argues that Duhem provides us with an illuminating example of philosophy of science developing as a contingent, though natural, response to problems arising in a particular scientific context and under a particular understanding of the history of science in that context. It concludes that the history of science provides little support for interesting theses about the present or future state of science.


Introduction
Debates between scientific realists and antirealists have persisted in the philosophy of science literature of the past half century following the demise of logical positivism. Realists view scientific theories as largely faithful representations of reality; antirealists do not. On the realist image of science, scientific theories reveal the reality that underlies and causally explains the phenomena. On the antirealist image, in contrast, scientific theories are representational fictions constructed by human beings to solve problems that seem pressing at a particular time, to save the phenomena, etc. A good deal of the dialectic between the two relies on meta-level inductions on the history of science. Realists apply an optimistic induction on the history of science. They see it as the continuous, progressive evolution, guided by reality, of increasingly correct theories and increasingly reliable methods that put us in closer contact with reality: the history of modern science, they argue, is a history of increasing instrumental success; such a history would be a miracle if scientific theories were not successively tracking reality more closely with the passage of time, if they weren't progressively converging on the truth (Putnam 1975, 1978). Antirealists press a pessimistic induction. They see a graveyard of past theories that enjoyed significant empirical success but turned out to be false and were discarded: phlogiston theory, caloric theory, and Fresnel's aether theory of optics, for example (Kuhn 1970; Laudan 1981). Though few are nowadays persuaded either by the optimistic path to realism or by the pessimistic path to antirealism, qualified versions of these debates continue today with little resolution.
These contemporary philosophical debates bear striking similarities to debates about physics that took place in the late 19th century. The contemporary philosophical literature makes passing references to the 19th century debates. For example, Poincaré is correctly considered to be a precursor of contemporary structuralism (Worrall 1989); far more controversially, Duhem is variously considered to be a "paradigm antirealist" (Van Fraassen 1980), an antirealist "who rejects theoretical laws" (Cartwright 1983), a convergent realist (Lugg 1990), and a structural realist (Psillos 1999). However, this engagement is unsystematic and consists mainly in selectively drawing examples that support one's position or in citing some 19th century figure as authority for that position. This is unfortunate, because the 19th century debates provide an ideal study of the use of historical arguments to support realist and antirealist views about science. The principals of these debates (Duhem, Helmholtz, Hertz, Kelvin, Mach, Maxwell, and Poincaré) were primarily reflective physicists (philosopher-physicists, as I think of them).
Their primary motivation traces to their concerns as historically informed, working physicists attempting to make sense of their enterprise based on their reflections on the history of science and the state of extant theories. They wondered, as our contemporaries do, about the relationship between physics and metaphysics, the aims and methods of science, the content of physical theories, and the extent to which the progress of science, understood as a series of attempts to fathom the depth and/or extent of the universe, is a "bankrupt" history. Most importantly in the context of using them as a test case, some of them made pronouncements about the future state and proper aim of science based on their historical extrapolations, pronouncements that we can now assess with the benefit of 100 years of hindsight. In this paper I will focus on some of Duhem's historical arguments as such a test case. Duhem was the historical expert among this historically informed group: not only did he write a number of histories devoted to special branches of physics; he is widely acknowledged to be the founder of the history of medieval science. As such, one would expect him to get the projectable historical patterns right if anyone could. In this, I will argue, he was only partly successful. His story provides us with an illuminating example of philosophy of science developing as a contingent, though natural, response to problems arising in a particular scientific context and under a particular understanding of the history of science in that context. I will conclude that the history of science provides little support for interesting theses about the proper aim or future state of science.

3. …on historical arguments. For a comprehensive account of the twists and turns the debates (including their historical dimensions) have taken from the late 19th century to the present, see (Liston 2016).
4. Thus, for example, it is now commonplace to argue that we should be (selectively) realist about those parts (and only those parts) of scientific theories that explain their instrumental success and are preserved in successor theories, where those parts are variously understood to be structures (Worrall 1989), working posits (Kitcher 1993), core causal descriptions (Psillos 1999), or detection property clusters (Chakravartty 2007). (Stanford 2006) challenges all such selective realist strategies.

5. Their separation into realists and antirealists is complicated, but Helmholtz, Hertz, Kelvin, and Maxwell had realist sympathies, and Duhem, Mach, and Poincaré had antirealist doubts. Though Duhem, Kelvin, Mach, and Poincaré were active into the 20th century, they are 19th century thinkers.

6. Mach, of course, also authored important critical histories of mechanics and of theories of heat. However, while Mach's histories were selective, pursued primarily as an illustrative guide to the present, and largely relied on secondary sources, Duhem's histories, especially after 1903, increasingly became excavations of primary sources, particularly texts of the Middle Ages. This is convincingly argued in (Martin 1991).

19th Century Images of Science, Duhem's Historical Arguments
Duhem begins his major philosophical work (Duhem 1991 [1906], 7) with two images of physical theory, characterized in terms of aims:

R:
A physical theory … has for its object the explanation of a group of laws experimentally established [the stripping of reality of the appearances covering it]

AR:
A physical theory … is an abstract system whose aim is to summarize and classify logically a group of experimental laws without claiming to explain these laws.
He argues that R makes physics subordinate to metaphysical systems since, at any given time, it requires that our physical theories be guided by a metaphysical picture that shapes the explanatory fit. But this is bad, he thinks, for two reasons. First, the history of modern metaphysics is one of irresoluble disputes (action-at-a-distance versus contact action, atoms versus continuous matter-stuff, etc.) with the disputants accusing each other of positing absurd or occult causes. Second, a metaphysical system at best provides directions for constructing models showing that physical theories are consistent with it, but no metaphysical system suffices to derive a physical system from it. The Cartesians and Leibnizeans, for example, argued about what quantity of motion was conserved in the universe as a result of God's immutability, momentum (mv) or vis viva (mv²); each answer was consistent with its proponents' fundamental metaphysical system, but only experiment determined (or could determine) that Leibniz was right. Given the sorry track record of metaphysics guiding physics, Duhem concludes that R hinders the progress of physics: physics must be autonomous; i.e., it must not be hostage to substantive metaphysical or cosmological hypotheses. However, Duhem does not quite conclude, as Mach does, that AR is correct, that a physical theory is merely an economical instrument for organizing the phenomena (Mach 1960 [1893], 577-595). While physics must not be subordinate to substantive metaphysics, it is ultimately grounded in a general metaphysical conviction: that nature is orderly: "the belief in an order transcending physics is the sole justification of physical theory" (Duhem 1991 [1908], 335). While we can't prove the existence of an ontological order, the metaphysical truth that nature is orderly is presupposed and displayed in our scientific activities and expectations (our urge to generalize and unify, expecting success); we have an instinctive belief in this metaphysical truth that cannot be shaken by philosophical doubt. Accordingly, Duhem introduces a third image (Duhem 1991 [1906], 30).

NC:
A physical theory is a natural classification.
For Duhem a natural classification is a mathematical-physical hierarchical organization of the phenomena which, as it becomes more complete, is the reflection of an ontological order. Newton's great achievement in the Principia, he argues, is a natural classification that unites heavenly and terrestrial motions, so complete that Neptune's existence was predicted and subsequently discovered. As a theory makes novel predictions, we can't but feel it is providing a natural classification; no merely contrived artifice should be expected to "be a prophet for us". By making physics depend on metaphysical speculations, R puts the cart before the horse; instead we need to use the natural classifications of the phenomena provided by physics to tentatively guide metaphysical investigation, since they allow us to see dimly the cosmological/metaphysical realities behind them.
Duhem's sympathy for NC/AR and antipathy to R are motivated by an induction from the past and current state of physics to the conclusion that proponents of causal explanatory theories guided by R allow metaphysical preconceptions to influence their physical theorizing and end up with theories that fail to satisfy the requirements expected of an explanation, and thereby fail to achieve their defining explanatory goal. By contrast, the proponents of abstract representative theories guided by NC have made slow but steady progress in fulfilling their aims.
Most of (Duhem 1991 [1906], Part I) is an historical argument designed to show that the abstract representative approach guided by NC is the progressive path to a perfected physics that reflects an ontological order, in contrast with the problem-strewn random path of the causal-explanatory approach. Versions of this historical argument can be found throughout Duhem's writings. His Theories of Heat (1895) is a brief for thermodynamics contrasted with kinetic theories; his Mixture and Chemical Combination (1902) advocates physical chemistry based on thermodynamic and thermochemical foundations contrasted with chemistry based on atomism; his The Evolution of Mechanics (1903) is similarly a brief for energetics as opposed to mechanics. Though the settings are different, the central theme and argumentative structure are the same. Key themes are: first, causal-explanatory hypotheses (atoms, ether, etc.) wax and wane, while the history of abstract theories displays steady cumulative growth, with later theories preserving earlier theories as special cases; second, unification is achieved by the mathematical organization of experimental data rather than by the search for deep explanatory mechanisms; third, divergences of opinion about mechanisms are irresoluble, while disagreements about abstract theories are eventually settled.

The First Crisis in Physics: Synthetic Physics of Mechanism or Analytic Physics of Principles
But the primary motivation for this comparative exercise lies less in Duhem's historical analysis than in his reflections on the state of science and the extant theories of his day, including the theories he was working on as a practicing physicist, and these reflections provide the best insight into his view. By the 1880s it had become apparent to working physicists that classical mechanics lacked both the conceptual and mathematical tools to properly describe a host of phenomena. This sense of dissatisfaction with classical mechanics is elegantly expressed in the writings of both Poincaré and Duhem. Poincaré describes the grand, majestic conception of a physics inspired by Newton's and Laplace's treatment of the heavens, a physics of central forces acting between material points attracting or repelling each other with inverse forces, a physics that attempted "to penetrate into the detail of the structure of the universe, to isolate the pieces of this vast mechanism, [and] to analyze one by one the forces which put them in motion". He continues, "Nevertheless, a day arrived when the conception of central forces no longer appeared sufficient," and calls this the "first crisis" of physics (Poincaré 1913, 299). The old physics of mechanisms guided by R was failing, and it was time for a new physics of principles.
Duhem similarly distinguishes two types of methods, synthetic and analytic (Duhem 1903). Synthetic methods, guided by R, build up the mechanism from the sizes, shapes, and masses of its elementary bodies and the fundamental forces acting on them, construct the law of motion in differential equation form, and compare with experiment the results obtained when initial conditions are set. Only synthetic methods were used for much of modern physics, Duhem says, and he cites some celebrated results: the Cartesian explanation of weight by vortex motion, Lesage's explanation of gravity by impulses of particles on bodies, kinetic theories of gases, Kelvin's gyroscopic ether, Maxwell's mechanical models of electromagnetism, and contemporary mechanical models of light, electricity, and new radiations proposed by Lorentz, Larmor, J.J. Thomson, Langevin, and Perrin. Most contemporary physicists, Duhem points out, have concluded that synthetic methods cannot deliver mechanical explanations of natural phenomena that are complete, unified, general, coherent, or empirically adequate. Instead, Duhem claims, like Poincaré, most contemporary physicists have turned to analytic treatments.

Synthetic Treatments: Problems
Synthetic treatments are guided by R, by a picture of the universe in which all empirical regularities are the effects of fundamental processes involving fundamental entities. They attempt to derive from the bottom up the equations of motion of an isolated mechanical system S from the laws governing S's elementary parts: the state of S is determined by the positions and motions of its component bodies (understood ultimately as fundamental particles), and the motions of S are determined by the motions of its parts and the forces to which they are subject, generally assumed to be inverse functions of their relative distances from each other. It is a physics of differential equations whose natural class of applications is initial value problems. While this worked very successfully to describe the motions of heavenly bodies, Duhem argues, there were many difficulties, especially when it came to dealing with terrestrial phenomena. Some of the difficulties concerned empirical adequacy. For example, Poisson's physical mechanics, a synthetic theory, predicted various bulk and elasticity ratios that disagreed with experiment and, in some cases, were absurd. Some of the difficulties were conceptual. According to mechanism, energy added to a system, like that produced by heat or friction, is converted into energy of the system's elementary bodies. But now the same problem arises at the lower scale: how is energy distributed to an elementary body b?
Changes in b's kinetic energy T must be compensated by corresponding changes somewhere. Moreover, both energy distribution at the microscale and spectroscopic data implied that molecules, if they exist, must have elastic properties like modes of vibration. It was therefore important to have a better account of elasticity. But the accounts of elasticity attempted by physical mechanists like Poisson were empirical failures. Some of the difficulties were mathematical. Any physical system will have a huge number of degrees of freedom, and the system of differential equations needed to solve the problem of its motion becomes unsolvable in practice. To get around these tractability problems, various techniques appealing to macroscopic constraints and boundary conditions have to be deployed to narrow down the number of degrees of freedom to a manageable set. But these techniques are not validated by the synthetic basis (they are added from above, not derived from below); sometimes they cannot be understood within the conceptual framework of the synthetic approach, and their consistency with the basic picture is questionable. Finally, some of the problems concerned generality and extendibility. Conservation of energy can be derived from Newton's laws only for conservative systems, in which the forces derive from a potential and no energy is dissipated. Few natural systems are conservative; indeed, many concrete systems (steam engines, e.g.) lose energy through heat or friction. Extending synthetic treatments to such systems thus became pressing.
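The restriction to conservative systems can be made precise with a standard single-particle derivation (the notation is mine, not Duhem's):

```latex
\begin{aligned}
m\dot{\mathbf{v}} &= \mathbf{F} = -\nabla V(\mathbf{x})
  && \text{(conservative force, derived from a potential } V\text{)}\\
\frac{d}{dt}\Big(\tfrac{1}{2}m\,\mathbf{v}\cdot\mathbf{v}\Big)
  &= m\,\mathbf{v}\cdot\dot{\mathbf{v}}
   = -\,\mathbf{v}\cdot\nabla V
   = -\frac{dV}{dt}
  && \Longrightarrow\quad \frac{d}{dt}\,(T+V)=0.
\end{aligned}
```

Adding a frictional force such as $\mathbf{F}_{\mathrm{fric}} = -\gamma\mathbf{v}$ instead yields $\tfrac{d}{dt}(T+V) = -\gamma\,|\mathbf{v}|^{2} \le 0$: mechanical energy steadily leaks away, and no conserved energy function falls out of Newton's laws alone, which is exactly the obstacle for steam engines and other dissipative systems.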
Physical mechanists had responses to these problems, but they seemed stretched, ad hoc, and objectionably complicated to Duhem. Poisson, for example, adds elements to his models that are inconsistent with his background assumptions and is forced to replace summation by integration, a replacement that requires crude approximating conditions. Duhem's criticisms of them are trenchant: they employ "ruses and chicanery", retain the theory only by "subtleties and subterfuges" (Duhem 1903, 45), and lack mathematical rigor. His criticisms of Kelvin, Lodge, Maxwell, and the Victorian penchant for synthetically constructed models are caustic. He comments with Gallic flair on Lodge's Modern Views of Electricity: "We thought we were entering the tranquil and neatly ordered abode of reason, but we find ourselves in a factory" (Duhem 1991 [1906], 71). He points out that there are as many kinds of material molecules as there are kinds of physical phenomena or experimental laws (Duhem 1991 [1906], 82-83). Similar critiques are directed at Maxwell's and Lodge's different mechanical analogies of electromagnetic phenomena and at Kelvin's vortex atom model. Faced with the failure of a model, they switch to new models that are inconsistent with other models they use. Such theories, Duhem concludes, cover only "a miniscule fragment of Physics", and the "fragmented representations may not be welded together to form a coherent and logical explication of the inanimate Universe" (Duhem 1903, 100).
In a nutshell, then, physics developed according to the synthetic approach had failed, according to its critics. "Visualizable" material points or atoms subject to position-dependent central forces, so successful for representing celestial phenomena, were ill-suited to represent electromagnetic phenomena, "dissipative" phenomena in heat engines and chemical reactions, fluid phenomena, etc. Deformable bodies and viscous fluids are conceptually difficult to construct from atom-like material points; shearing forces are incompatible with central-force assumptions; frictional and electrical forces are velocity-dependent. Physics had become a disorganized patchwork of poorly understood theories, each dealing with special cases in its own domain and inconsistent with the others, without adequate unified foundations or empirically determined values of microscopic parameters. Lacking coherence, unity, and empirical determinacy, these theories could not claim to be explanatory or realistic.

Analytic Treatments: Promises
This is where the new physics of principles (Poincaré) or physics developed on the analytic approach (Duhem) comes to the rescue. Physicists responded to this crisis, Poincaré explains, not by giving up the dream that the universe is a machine, but by, in a sense, side-stepping the problems. Suppose we have a machine whose initial and final wheels are open to view but whose intermediate machinery for the transmission of energy between the two is hidden. We can determine by experiment that the final wheel turns 10 times less quickly than the initial wheel and, using the principle of conservation of energy (PCE), determine that a couple applied to the one will be balanced by a couple 10 times greater applied to the other. In order to know that equilibrium is maintained by this compensation, we do not need to know how the forces inside the black box compensate each other. Similarly, using the principles of dynamics, we can draw conclusions about macroscopic motions based on observations of them without knowing anything about the microscopic machinery, conclusions that will hold true whatever the microscopic details may be. In addition to PCE Poincaré lists several other principles whose application to physical phenomena suffices "for our learning of them all that we could reasonably hope to know" (Poincaré 1913, 300).
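Poincaré's black-box machine reduces to one line of arithmetic. A minimal sketch (the function name and torque values are my illustrative choices, not from the text):

```python
# Poincaré's hidden machine: only the initial and final wheels are visible.
# For an ideal (lossless) machine, the principle of conservation of energy
# (PCE) equates power in and power out: torque_in * omega_in = torque_out * omega_out.
# So the balancing couple follows from the observed speed ratio alone,
# whatever the hidden gearing between the wheels may be.

def balancing_couple(applied_couple: float, speed_ratio: float) -> float:
    """Couple on the final wheel that balances `applied_couple` on the
    initial wheel, when the final wheel turns `speed_ratio` times less
    quickly than the initial one."""
    return applied_couple * speed_ratio

# Final wheel turns 10 times less quickly: a couple 10 times greater balances it.
print(balancing_couple(1.0, 10.0))  # 10.0
```

The point of the sketch is that the computation never mentions the intermediate machinery: equilibrium is certified by energy bookkeeping at the visible ends alone.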
Similarly, on Duhem's view, analytic treatments develop from general principles like PCE and Carnot's Principle (Duhem 1911, Vol. I, 2). Historically they began with Lagrange, who condensed statics into a general principle, the Principle of Virtual Velocities (PVV). PVV tells us that a mechanical system X is in static equilibrium just in case, in all infinitesimal virtual displacements of X, the forces applied to X perform zero work. In a perfectly balanced see-saw, for example, the sum of the work done by both the external forces (like gravity) acting on it and the internal forces holding it together is zero. Lagrange's analytical mechanics has several attractive features that were later heavily exploited in analytical treatments. First, it was extendible. Using d'Alembert's principle, Lagrange showed how to extend PVV from statics to dynamics: we simply add fictitious "inertial" forces to balance the external forces that are really acting on X to produce its acceleration, so that X is in equilibrium at each instant. Then the work done by the external, internal, and inertial forces will sum to zero. Second, it provides a powerful algebraic method (Lagrange multipliers) that uses constraints in a principled manner to reduce the number of degrees of freedom and thereby to overcome the tractability problems mentioned earlier. Third, Lagrange heavily relies on conjugate pairs of generalized coordinates/quantities and actions that are mathematically related to each other as are position and force: the quantities are "position-like" in the algebraic sense that their empirical behavior is related to the empirical behavior of their first and second derivatives as position is related to velocity and acceleration. The actions can be interpreted as force, moment of a couple, surface tension, or pressure for corresponding generalized coordinates understood as distance, angle, surface, or volume, respectively; the products of the actions and the corresponding coordinate shifts yield generalized work (as force times distance moved is work) as well as generalized versions of "kinetic energy-like" quantities (functions of the squares of the first derivatives of the coordinates), "potential energy-like" quantities (functions of the coordinates but independent of their first derivatives), etc. This history is sketched in (Duhem 1903, Part I).
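The see-saw case can be checked numerically. A minimal sketch of PVV for that example (the masses and arm lengths are my own illustrative numbers):

```python
# Principle of Virtual Velocities (PVV) for a rigid, pivoted see-saw:
# the system is in equilibrium iff an infinitesimal virtual rotation
# d_theta does zero net work. (The internal forces of a rigid body do
# no work, so only the external gravitational forces contribute here.)

G = 9.81  # gravitational acceleration, m/s^2

def virtual_work(m_left, r_left, m_right, r_right, d_theta=1e-6):
    """Net work of gravity under a virtual rotation d_theta (radians):
    the left weight descends r_left*d_theta while the right weight
    rises r_right*d_theta."""
    return (m_left * r_left - m_right * r_right) * G * d_theta

# Equal moments (2 kg at 3 m vs 3 kg at 2 m): zero virtual work -> equilibrium.
print(virtual_work(2.0, 3.0, 3.0, 2.0))   # 0.0
# Unequal moments: nonzero virtual work -> no equilibrium.
print(virtual_work(2.0, 3.0, 3.0, 2.5))   # negative: the right side descends
```

The design point, matching the text, is that equilibrium is diagnosed by a single work-balance condition rather than by resolving the forces at the pivot.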
Duhem was struck by the analogy between these methods of Lagrangian mechanics and the pioneering methods of thermodynamics developed by Clausius, Helmholtz, and Gibbs, which also relied on equilibrium as a central concept, on work-energy relations, and on highly abstract mathematical processes mimicking real processes as the Carnot engine mimics heat engines. Throughout his life as a theoretical physicist, he formulated, defended, and actively pursued a program of energetics (generalized thermomechanics), extending analytical principles to a wide range of mechanical, thermodynamic, chemical, and electromagnetic systems. In the style and method of Lagrange's analytical mechanics, Duhem further generalizes: the conjugate coordinates α may be any collection of variables, functions of which determine the physical state of a system (including its mechanical, thermodynamic, chemical, electrical, and magnetic state); the corresponding actions A are just abstract analogs of mechanical force that are empirically determinable; virtual displacements become virtual modifications of any variable determining the state; locomotion becomes any change of physical state; and equilibrium is a similarly generalized notion covering mechanical, thermodynamic, chemical, and magnetic equilibria. On Duhem's generalization, the α-coordinates may include standard position and velocity as well as entropy, volume, number of component substances, electric and magnetic charge, etc. Similarly, the A-actions may include standard distance-dependent forces as well as velocity-dependent forces (like friction), temperature, pressure, chemical potential, and various actions associated with electric and magnetic fields.
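The pairing of generalized coordinates with their actions can be laid out as a small lookup table. A toy sketch (the table rows follow the pairings named in the text; the names and function are my own illustrative choices, not Duhem's notation):

```python
# Duhem-style conjugate pairs: generalized coordinate -> corresponding action.
# Each action is to its coordinate what force is to distance, so the product
# of an action with a shift in its coordinate is a "generalized work" term.
CONJUGATE_PAIRS = {
    "distance": "force",
    "angle": "moment of a couple",
    "surface": "surface tension",
    "volume": "pressure",
    "entropy": "temperature",
    "amount of component substance": "chemical potential",
}

def generalized_work(action_value: float, coordinate_shift: float) -> float:
    """Product of an action and the corresponding coordinate shift,
    the analog of work = force x distance moved."""
    return action_value * coordinate_shift

# E.g. a pressure of 5.0 units acting through a volume change of 2.0 units:
print(generalized_work(5.0, 2.0))  # 10.0
```

Summing such terms over all the conjugate pairs of a system is what Duhem's root work-balance equation does in place of resolving hidden mechanical forces.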
In this way he hoped for a truly general unified theory of rational mechanics that would have better empirical support than its rivals and be mathematically and conceptually coherent. The theory forms a tree at the root of which is the generalized principle of virtual modification covering all systems in mechanical and chemical statics. These systems, like our earlier perfectly balanced see-saw, are in equilibrium if, and only if, for every virtual modification achieved without temperature change, the work performed by actions "which are to symbols for various quantities what forces are to coordinates of mechanical systems" (Duhem 2002 [1901], 293) is balanced by the internal thermodynamic potential of the system. The tree grows by the addition of branches for various kinds of systems, in effect by supplementing the root equation with new terms to balance work for that type of system, analogously to Lagrange's earlier extension from statics to dynamics by the addition of balancing inertial forces. Mechanical and chemical statics are extended to mechanical and chemical dynamics, to viscous fluids, and to a host of systems classified by their phenomenological properties mathematically expressed: systems with friction; static systems exhibiting hysteresis (like annealed steel, permanently dilated glass, colloidal absorption of water vapor); dynamical systems involving hysteresis; thermal systems without friction or hysteresis (reversible heat cycles); thermal systems with friction and/or hysteresis (irreversible heat cycles); etc. The various principles, thus supplemented by their own appropriate terms and sitting on the various branches of this tree, yield equations of "motion" for the type of system characterized by the principle. Duhem sketches these extensions in (Duhem 1903, Part II) and (Duhem 2002 [1901]) and lays out the details in his major (1000+ page) text in physics, (Duhem 1911).
The perceived superiority of these analytic over synthetic treatments rested on the following reasons. First, and most importantly, they allowed physics to avoid hidden mechanisms, and this was considered by many to be a positive given the lack of empirically determined specific information about particles. Just as we can explore the energy connections between Poincaré's two wheels without knowing anything about their physical connections, we can use our balance equations to calculate, for example, the value of the internal thermodynamic potential or the entropy change of a system from empirically determinable quantities like temperature, pressure, and change of volume, without knowing what constitutes any of these quantities. Second, they promised more empirical success than synthetic methods. Writing in 1892, the elastician A.E.H. Love claimed that the best modern experiments supported the multi-constant results of the analytical theories over the rari-constant results of Poisson's synthetic theory (Love 1892, 14). Third, by avoiding hidden mechanisms and developing coherent continuum theories of elasticity, they allowed physicists to side-step problematic conceptual questions about how energy is distributed among particles and the whole problem of the replication of macro-problems at the micro-level. Fourth, by enabling the free choice of suitable generalized coordinates that fit constraints, and by inventing algebraic procedures (like Lagrange's method of multipliers) for reducing calculational complexity in a principled way, the analytic mechanists avoided the mathematical problems mentioned earlier. Finally, the theories promised to be extendible to dissipative systems in principled ways.
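The first point, computing thermodynamic quantities from empirically determinable ones alone, can be illustrated with a textbook case (this is not Duhem's own example; the ideal-gas relation is invoked purely as a phenomenological law connecting measurable quantities):

```python
import math

R = 8.314  # gas constant, J/(mol*K), an empirically determined coefficient

def entropy_change_isothermal(n_mol: float, v1: float, v2: float) -> float:
    """Entropy change for a reversible isothermal expansion of an ideal gas,
    dS = n * R * ln(v2 / v1), computed entirely from measurable quantities
    (amount of gas and volumes). No hypothesis about molecules or hidden
    mechanisms is invoked anywhere in the calculation."""
    return n_mol * R * math.log(v2 / v1)

# Doubling the volume of one mole: dS = R * ln 2, about 5.76 J/K.
print(round(entropy_change_isothermal(1.0, 1.0, 2.0), 2))  # 5.76
```

That the same number would fall out of any underlying microphysics consistent with the gas laws is exactly the mechanism-independence the analytic approach prized.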
In a nutshell, then, physics developed according to the analytical approach promised to succeed where the old physics had failed. However, what most inspired the proponents of the new physics of principles was the generality they promised. Lagrange's analytic mechanics unified under a single principle, the principle of virtual velocities, long-known laws like the ancient law of the lever and Pascal's law of hydrostatic pressure; it was extendible to cover a wide class of dynamical problems and provided a systematic way of solving them. Gibbs and Helmholtz further extended that system to thermodynamics and physical chemistry. Duhem hoped to extend it to wider and wider classes of dissipative phenomena.

Reactions to the Crisis
As a result of the crisis, physicists became increasingly preoccupied with foundational efforts to put their house in order. There was widespread belief that the most promising physics required general analytical principles that could not be derived from Newtonian laws, and that the abstract concepts (action, energy, internal potential, entropy, absolute temperature) needed to construct and apply these principles could not be built from the ordinary intuitive concepts (position, mass, force) of classical mechanics. The more reflective physicists, however, reacted in different ways to this state of affairs. Here we look at four of them, in ascending order of their opposition to realism.
Kelvin seems to have held on to the physics of mechanisms to the end, claiming that synthetic models were necessary and sufficient for understanding ("I am never content until I have constructed a mechanical model of the subject I am studying. If I succeed in making one, I understand; otherwise I do not" (Kelvin 1904, Lecture 20)) and that "there must be something in this molecular hypothesis and that as a mechanical symbol, it is certainly not a mere hypothesis, but a reality" (Kelvin 1904, Lecture 1). It should be noted, though, that despite his hard-headed realism the young Kelvin was a principal early developer of energy physics; so, though wedded to the mechanical viewpoint, he was flexible enough to experiment with other styles of theorizing.
Although Maxwell had achieved great success in the Treatise using the analytical (or, as he called it, the "dynamical") approach, he nevertheless felt that its methods were too algebraic and did not provide a proper understanding of the phenomena unless they were underwritten by physical ideas involving forces and mechanisms. In this he followed the tradition of the "northern wizards", Thomson (Kelvin) and Tait, who developed analytical approaches designed to facilitate ignoration of coordinates (avoidance of hidden mechanisms) but based on work-energy theorems that were themselves grounded in Newtonian impetus. He holds that the electromagnetic field must be a medium for energy transport but admits that we lack any clear representation of the details of its action, and he proposes the mathematical relations between the phenomena that he develops in the Treatise as a first step toward clarity. Maxwell's approach seems both provisional and commendably tentative. He employs ignoration of coordinates tentatively and methodologically: given our present ignorance of the hidden mechanisms and the absence of empirically determined values for many of their parameters, we should press on with theorizing that allows us to avoid them and hopefully learn more about at least the form of their parameters from those theories. Maxwell, we might say, espoused a local variety of antirealism about action-at-a-distance forces. Though he employed analytical techniques, he was no antirealist of the in-principle sort. (He believed in atoms but acknowledged that not much was known about them, for example.)
But others, like Poincaré and Duhem, reacted to the crisis by espousing more global forms of antirealism. Poincaré seems to have taken a less militant approach to the crisis. Though he saw there were problems with the causal explanatory approach, he also saw problems for the principles approach looming on the horizon. He refers to these problems as a possible second crisis in physics: the experimentally acknowledged random motion of atoms, the null results of the Michelson-Morley experiments, and newly discovered radioactivity, respectively, were challenging the universal validity of the principles of degradation of energy, Galilean relativity, and conservation of mass and energy. In 1905 none of these experimental results was well understood, so it was not clear how theory would respond to the challenges. But Poincaré is hopeful that they will be resolved and that the theories that meet them will retain the best current principles as approximations. Poincaré adopts a structuralist position: current entities may be discarded as past ones were, but the structural features of the world or of the phenomena that are expressed by the mathematical equations of current theories will be retained in future theories.
Duhem, as we have seen, rejected all physical theorizing guided by R that appealed to underlying causal mechanisms and instead placed all his confidence in analytic physics guided by its associated NC aim. It is hard to overstate how different the analytic and synthetic styles of theorizing are. The synthetic style deals with relatively concrete "visualizable" bodies, subject to actual displacements that result from experimental manipulation and the application of physical forces, and moving in paths in spacetime. The generalized analytic style deals with highly abstract properties of abstract systems classified under an abstract principle, PVV, subject to virtual displacements that result from conceptual manipulation, and "moving" in paths that are a continuous sequence of static states (each of which is allowed enough "time" to relax to equilibrium from a virtual manipulation). Natural classifications exploit formal, mathematical rather than sensible, intuitive analogies and end up classifying phenomena in ways that are unexpected from the classificatory perspective of everyday common sense. Duhem thinks of the tree structure that results from these analytic techniques as a natural classification of the phenomena into types of systems. Such a classification would be unified (since organized under one general principle, the root equation) and completely general (since new branches could be added in a well-motivated manner as new systems were discovered).

Aftermath
With hindsight we can see that each side - the proponents of mechanisms and the proponents of principles - turned out to be partly right and partly wrong about the physics. On the one hand, a physics of principles was partly vindicated by later developments. Einstein held that proper understanding requires synthetic theories but that progress is often hindered by premature synthetic theories (Howard 2004). In such cases principles theories can come to the rescue by providing extra constraints that make the synthetic options more determinate, and Einstein's own theories of special and general relativity do exactly that (as Maxwell's had earlier). During the second half of the 20th century Clifford Truesdell and his students, using new mathematical techniques, provided a rigorous grounding for much of the macroscopic physics of continua that was conceptually and mathematically problematic in the 19th century and that pushed physicists to develop analytic techniques. These contemporary treatments accept a background of atoms and fundamental forces, but they do not try to explain continuum phenomena in terms of atoms and fundamental forces. Instead, they work entirely at the macroscopic scale and, following Duhem's lead, try to impose conceptually and mathematically coherent order on the domains they study.
On the other hand, a partial vindication of atomism was just around the corner (due to Einstein's theoretical and Perrin's experimental studies of Brownian motion), and nearly everyone, with the exception of Duhem, was converted. No doubt, as Poincaré's references to indifferent hypotheses and Duhem's references to "the absolute indeterminacy of the masses and hidden motions" (Duhem 1903, 97) make evident, the inability of scientists before Perrin's experiments on Brownian motion to empirically determine with any accuracy the properties of atoms and molecules (like their absolute sizes, gram-molecular weight, and number per mole) played an important role in supporting the skepticism/agnosticism of the anti-atomists. But they greatly underestimated the ingenuity of theorists and experimenters and their ability to devise hypotheses that would tie empirically measurable parameters to parameters of elementary bodies tightly enough to determine the latter. The story of how this work led to Perrin's multiple determinations of the values of various parameters, including Avogadro's number, and their interlocking, mutually supporting consilience is told in various places, e.g., van Fraassen (2009). But fast-forward another twenty-five years or so: quantum mechanics had become generally accepted as the most empirically adequate account of atomic behavior, and though everyone believed in atoms, hardly anyone believed their behavior could be modeled in terms of physically familiar parameters and operations, and the abstract conception had returned in full force. Scientific progress was made, but it appears to have assimilated strands from both the synthetic and analytic traditions.

Some Philosophical and Historiographical Lessons
I conclude by drawing some lessons, largely negative, about the current practice of appealing to the history of science to draw philosophical conclusions about science.
First, we should be wary of using history to set limiting scientific images like R, AR, and NC, characterized in terms of aims and goals. Such restrictive images sound suspiciously a priori. In setting a causal explanatory agenda for science, R presupposes that in principle everything is explainable bottom-up from the workings of fundamental entities or stuff. But, other than by an act of faith, we do not have good reasons to believe that. We do not have bottom-up explanations of entropy or of many of the processes Duhem and his Truesdellian successors study. Nevertheless, entropy is an important physical quantity that provides important information about physical systems. If proper science required us to fathom the deep explanatory structure of the world that causally explains the unfolding of all else, then continuum mechanics and macroscopic thermodynamics wouldn't count as science. But surely they are science; it's just that they are not illuminatingly organized under R. By the same token, contrary to Duhem, neither do we have good reason to believe that the path to progress lies only in the pursuit of NC. If proper science required that, then much of 20th- and 21st-century physics wouldn't count as science. The 19th-century example and its aftermath should make us question notions like the proper aim and form of physical theory, since such notions may be responding, as they were in the 19th-century writings, only to contingent features of our current and past theories. Not only was Duhem partly wrong about the path physics would take, he was completely wrong about the proper aim of physics. It is one thing to propose freeing physics from unsuccessful mechanical conceptions and to advocate the pursuit of relatively promising analytical theories. It is quite another to restrict the aim and scope of physics to the discovery of real relations between hidden entities underlying the phenomena (Poincaré) or to the non-literal abstract representation of the phenomena that leads to a "natural classification" (Duhem). Why didn't the antirealists follow Maxwell's example and say, "Well, right now we don't know enough about the minute workings of nature, and we should use analytical techniques, or indeed any other techniques we can come up with, to see whether we can impose more order on the phenomena, which in turn might provide us with more empirical information that might be used to better home in on the minutiae"? We should be, like Maxwell, as humble about our philosophy of science as about our science itself, because nature can surprise us and force us to abandon our most entrenched course. Duhem himself provides a good example of the need for caution. At his best, he urges such caution about inferences from past and present states of physical theory to its future states, and about inferences from present states of physical theory to conclusions about the world underlying the phenomena (Duhem 1991 [1905]). He also acknowledges that newly discovered radiations "have revealed (…) some effects so strange, so difficult to subject to the laws of our Thermodynamics, that no one would be surprised to see a new branch of Mechanics swell up from [their] study" (Duhem 1903, 185). And he is modest about the fate of his general thermodynamics: "It would be quite presumptuous to imagine that [this] system (…) will escape the fate common to the systems that have preceded it (…); but (…) [the theoretician] has the right to believe his efforts will not be sterile; through the centuries the ideas that he has sown and germinated will continue to increase and to bear their fruit" (Duhem 1903, 188-189).
But though Duhem attempted to be impartial, he did not succeed. The historical evidence does not support as clear a distinction as he draws between the good guys and the bad guys, with a given physicist singularly committed to one goal rather than another. On the one hand, Duhem's Newton made the first great contribution to natural classification (in the Principia), yet the unofficial Newton toyed with causal explanations of gravity, and Newton's emissionist theory was not only causal but arguably influential enough to retard the mathematical development of optics. On the other hand, despite his atomism, for which Duhem criticizes him, Huygens' optics was purely mathematical and provided the basis for Fresnel's subsequent mathematical development of the wave theory. Similarly, Duhem's examination of extant theories of mechanics downplays the fact that many of the mechanists whom he vehemently criticized contributed significantly to the abstract approach Duhem favored. Kelvin virtually invented the analytic approach to heat and energy in the 1850s and strongly influenced Rankine's energetics program, which Duhem acknowledges as his inspiration. And Maxwell's Treatise on Electricity and Magnetism was the 19th-century zenith of the principles program applied to the novel phenomena of the day. Duhem was nothing if not a sensitive historian - his massive work on medieval physics and the clear superiority of his histories of mechanics and heat over those of Mach amply demonstrate this. He was surely aware of these subtleties in the historical record. Unfortunately, he largely ignored them. It would be all too easy to convict Duhem of partiality - to claim that his views were motivated primarily by the desire to combat certain kinds of Godless cosmologies associated with metaphysical atomism and physical mechanism, and to defend a cosmology and physics more closely aligned with the teachings of Mother Church. Perhaps he was consciously or unconsciously influenced by
such desiderata. But when assessing his motivations we must also attend to what he literally committed to print, and here he is thoroughly frank about his metaphysical and religious predilections and emphatic that his conclusions about the connections between physics and metaphysics are based entirely on his examination of physics and its history. I take him at his word and doubt that this omission had dishonest motives. Nevertheless, one can only assume he thought these historical connections unimportant because they did not fit the pattern of progress and cumulativeness he saw and the image he endorsed.
The general problem with historical extrapolations of the kind Duhem and our own contemporaries want to make is that it is all too easy to find a suitable pattern to project. Duhem's attempts to extrapolate structures and predict the future of physics should make us wary of all such arguments. Historically sensitive and cautious though he was, Duhem blundered. Why should we think we can do better? The history of physics is a Delphic oracle, and its future, shaped as it will be by our contingent and accidental approach to the world, is unlikely to be predictable with any confidence.