Contemporary Epistemology V: How We Know

UC Davis Philosophy 102

Theory of Knowledge

Fall, 2004

Instructor: G. J. Mattey, Senior Lecturer

Version 1, December 3, 2004


In this module, we will discuss some of the more interesting contemporary proposals about how we come to know or believe what we do. These proposals may or may not be compatible with claims about epistemic norms. Some grow out of epistemic norms themselves, and some go so far as to reject them altogether.

A Priori Knowledge

Plato held that we gain knowledge a priori through recollection. Some remnant of what was known before life remains in the soul at birth, and through prompting the soul can recover it. Descartes held that humans are created by God in possession of innate ideas. Contemporary epistemologists who hold that we have knowledge a priori tend to look for other mechanisms to explain how this is possible.

The empiricist account of a priori knowledge is that it is gained through conceptual analysis. This view was held in the twentieth century by the "logical empiricists" or "logical positivists" of the 1920s and 30s. It has fallen out of favor for a couple of reasons. One is the Wittgensteinian doctrine that we should not look for the meaning of concepts, but rather for their use. Another is the attack by Quine on the very notions of analytic and a priori truth.

In the 1960s, the linguist Noam Chomsky proposed that humans are born equipped with innate linguistic structures, which account for their ability to learn easily any of myriad languages. There is an ongoing debate about whether such structures exist, what they might be like, and how they affect the possibility of human knowledge. We will not discuss these issues further here.

Interest in a priori knowledge has grown recently due to new accounts of it by George Bealer, John Bigelow, Michael Friedman, Tyler Burge, and Laurence BonJour, among others. Some of the accounts are developed along externalist lines. We will here take a brief look at BonJour's version, which most resembles the classical internalist accounts.

BonJour's Account of A Priori Knowledge

In his 1998 book In Defense of Pure Reason, BonJour argues for the indispensability of a priori knowledge. The gist of his argument is that a priori knowledge is a necessary condition for empirical (a posteriori) knowledge. The reason is that to be warranted in our non-foundational empirical beliefs, we must rely on inferences that go beyond the content of experience, and so cannot themselves be warranted on the basis of experience. If there is to be any warrant for these inferences at all, if a deep skepticism is to be avoided, it must be a priori.

Thus far, the account has been normative. Toward the end of the book, BonJour makes a tentative attempt to give a description of how a priori knowledge comes about. He thinks that its origin must be found in a direct connection between the concepts that make up the content of thought and the universals which these contents represent. Our thought-contents, he argues, are not mere symbols of universals, but rather are intrinsically connected to them. He is convinced that to give an account of this connection requires "metaphysics of a pretty hard-core kind, a kind that is still relatively rare and unfashionable even in this post-positivistic age" (p. 181).

The account he gives is very tentative, and we will here give only its essentials. Consider the property triangularity, of which one might be said to have a priori knowledge. When triangularity is part of the content of my thought, my mind literally instantiates a property that is "intimately related" to the property triangularity instantiated by triangular things (p. 183).

If something like this view is correct, then I can have a priori knowledge of the property triangularity which triangles have, through the "presence" in my mind of the related property. Through that related property, I could then know a priori the features of triangles.

Naturalized Epistemology

W.v.O. Quine proposed in 1969 that epistemology should be treated as a branch of empirical psychology, a proposal mentioned in the introductory module ("Epistemology Naturalized"). Since Quine made that proposal, a large body of literature on naturalized epistemology has appeared. We shall here make some brief remarks about the project of naturalizing epistemology.

Quine himself construed psychology rather narrowly, in the tradition of the behaviorists. Epistemology would be the study of the correlation between inputs (sensory stimuli) and outputs (linguistic expressions about the world that provides the stimuli). The main thing to be explained, on Quine's view, is how a relatively "meager" set of inputs gives rise to a "torrent" of outputs, including very sophisticated scientific theories.

Epistemologists today do not take such a narrow view of the scope of the project. First, some of them move inward and try to describe the internal workings of the mind. Second, many of them move outward by considering the epistemic subject as part of a broader ecological system, both in terms of the immediate environment and in terms of the evolutionary development of the organism.

The Internal Mechanisms of the Mind

A common theme in recent work on the mind is that the common-sense description of its workings, what is called "folk psychology," must be superseded by a more scientific approach. Some radical theorists (e.g., Paul Churchland and Stephen Stich) go so far as to reject the very existence of beliefs, the cornerstone of standard analyses of knowledge.

Two of the leading scientific approaches are connectionism and artificial intelligence (AI). The former approach is tied very closely to a model of how the brain functions. The latter is quite abstract, in the sense that it describes the mind in a way that is independent of the kind of mechanism which runs the mental "programs."

The thesis of connectionism is that the human mind is a network of neurons in the brain. Connectionists build mathematical models of the ways in which neurons are connected and extrapolate from them accounts of how mental representation and learning take place. These tend to be quite different from the folk-psychological accounts. If connectionism is right, then epistemology will emerge with a new look that we can only faintly glimpse at this time.
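
To make vivid how different connectionist "representation" is from the folk-psychological picture, here is a minimal sketch, written in Python and not drawn from any particular connectionist model in the literature, of a single artificial neuron that learns the logical AND relation by adjusting the strengths of its input connections.

    # A minimal illustration of the connectionist idea: a single artificial
    # "neuron" learns the logical AND relation by gradually adjusting the
    # weights on its input connections. What it ends up "knowing" is a
    # pattern of numerical weights, not anything resembling a stored belief.

    # Training data: pairs of inputs and the desired output (logical AND).
    training_data = [
        ((0, 0), 0),
        ((0, 1), 0),
        ((1, 0), 0),
        ((1, 1), 1),
    ]

    weights = [0.0, 0.0]   # strengths of the two input connections
    bias = 0.0             # stands in for the neuron's firing threshold
    learning_rate = 0.1

    def fire(inputs):
        """The neuron fires (outputs 1) when its weighted input exceeds the threshold."""
        total = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1 if total > 0 else 0

    # Learning: nudge each connection weight in proportion to the error.
    for _ in range(20):
        for inputs, target in training_data:
            error = target - fire(inputs)
            weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]
            bias += learning_rate * error

    for inputs, target in training_data:
        print(inputs, "->", fire(inputs), "(target:", target, ")")

The point of the sketch is only that learning here consists of adjusting connection weights in response to error; nothing resembling a folk-psychological belief or rule is stored anywhere in the system.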

An example of the AI approach is the recent OSCAR project of John Pollock. The project is an attempt to build an abstract implementation of reasoning which embodies the epistemic norms Pollock believes are built into human beings.

The objective of the OSCAR Project is twofold. On the one hand, it is to construct a general theory of rational cognition. On the other hand, it is to construct an artificial rational agent (an "artilect") implementing that theory. This is a joint project in philosophy and AI. (OSCAR project Web site)
The construction of the "artilect" is a scientific endeavor that requires more than merely the intuitions of the epistemologist working from the "armchair."
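
To give a rough sense of the kind of reasoning OSCAR is meant to implement, the following is a toy sketch of defeasible reasoning: a conclusion supported by a prima facie reason stands only so long as no defeater for it is also believed. This is an illustrative simplification in Python, not OSCAR's actual code or architecture, and all of the names in it are hypothetical.

    # A toy illustration of defeasible reasoning (not OSCAR's implementation):
    # a conclusion supported by a prima facie reason is accepted only if none
    # of the inference's defeaters is among the agent's beliefs.

    from dataclasses import dataclass

    @dataclass
    class Inference:
        premise: str
        conclusion: str
        defeaters: tuple  # beliefs that would undercut or rebut this inference

    def accepted_conclusions(beliefs, inferences):
        """Return the beliefs plus every conclusion whose premise is believed
        and none of whose defeaters is believed."""
        results = set(beliefs)
        for inf in inferences:
            if inf.premise in results and not any(d in results for d in inf.defeaters):
                results.add(inf.conclusion)
        return results

    beliefs = {"the object looks red"}
    inferences = [
        Inference(
            premise="the object looks red",
            conclusion="the object is red",
            defeaters=("the object is illuminated by red light",),
        ),
    ]

    # The conclusion is accepted until the defeater is added to the belief set.
    print(accepted_conclusions(beliefs, inferences))
    print(accepted_conclusions(beliefs | {"the object is illuminated by red light"}, inferences))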

Mind as a Part of Nature

Some epistemologists emphasize the relation of the mind to the natural environment in which it exists. The most prominent development along these lines is "evolutionary epistemology," which attempts to account for human knowledge as the outcome of natural selection.

One type of evolutionary epistemology applies evolutionary theory to scientific change. A more plausible type investigates how the evolutionary development of human cognitive systems (such as perceptual systems) is related to knowledge. One consequence of this kind of investigation is that it could turn out that those systems are not sensitive to the truth, but rather are adapted to respond to some other feature of the environment in such a way as to promote survival.

Science

For much of the twentieth century, the epistemology of science was concerned with the representation in logical terms of scientific theories and reasoning. As such, it was engaged in the normative project. In the middle of the century, attention began to turn to the social conditions under which science develops. At present, in the field now known as "science studies," the emphasis is primarily descriptive, with history and sociology taking center stage.

The key figure behind the shift was Paul Feyerabend. His original idea was that all statements of "observation" are in fact relative to the theoretical context in which they are made. He later embraced a much stronger form of relativism, claiming that science does not, and should not, develop according to fixed rules of procedure. Nor should scientists try to make their theories compatible with old theories. Great advances in science take place precisely when the old theories and rules of procedure are thrown out the window.

The popularization of the descriptive approach to science is due to the Harvard history professor Thomas Kuhn. His original work was on the "Copernican revolution," in which the old geocentric theory of the motions of heavenly bodies was replaced by a heliocentric theory. Kuhn generalized the results of his specialized study in the famous book The Structure of Scientific Revolutions.

According to Kuhn's description, most of the time scientists are engaged in "normal" science, which follows an agreed-upon set of procedures which Kuhn called a "paradigm." For the Ptolemaic astronomers, the motion of heavenly bodies had to be circular. Their task was to find ways of compounding circular motions to produce paths which match the observed paths of the heavenly bodies.

A "scientific revolution" occurs when the paradigm breaks down because it is unable to explain certain "anomalies." Ptolemaic astronomers never could get the paths of the heavenly bodies right. One important failing was that they could not produce an adequate calendar, a point made by Copernicus.

When a scientific revolution occurs, there is a shift to a new paradigm, and a new form of normal science emerges. Kuhn's most controversial thesis was that paradigms are "incommensurable." By this he meant that one cannot understand what the new paradigm means from the standpoint of the old paradigm, or what the old paradigm means in terms of the new.

Many people have drawn a radical epistemological conclusion from Kuhn's description of scientific change. They think that it shows that there is a fundamental irrationality at the heart of scientific practice. One way to put it is that, because paradigms are incommensurable, they cannot be compared with respect to epistemic superiority. One theory is not better than another: they are just different.

An even more radical view, one held by Feyerabend, is that scientific theorizing itself is not superior to more traditional "folk" ways of understanding the world. This kind of view, as developed by David Bloor and others in Edinburgh, has come to be known as the "strong programme" for describing science. Bloor enunciated four tenets for the description of science, three of which are relevant to epistemology.

  1. The study is concerned with the way in which scientists acquire their beliefs. (Causality)
  2. It should not show any partiality toward true beliefs as opposed to false beliefs. (Impartiality)
  3. Both true and false beliefs are to be explained by the same kinds of causes. (Symmetry)

A key feature of the alleged irrationality of scientific change is the claim that what really drives it is not the disinterested investigation of nature, but rather the personal and political interests of the scientists themselves. Science, like other social institutions, is the manifestation of an underlying dynamics of power.

Understandably, there has been a strong reaction against the relativism of the strong programme and the claim that science proceeds irrationally. The vast outpouring of historical and sociological study of science admittedly seems to undermine the more idealized picture of science that emerged from the scientific revolution of the seventeenth century. But it is not at all clear that the epistemic value of scientific investigation can be undermined completely by the fact that it is powered to some extent by non-epistemic forces. Science to a large extent proceeds according to epistemic norms, and it falls to the project of validation to try to determine whether those norms lead to knowledge.



This page and all subordinate pages copyright © G. J. Mattey, 2004. All rights reserved.