CHAPTER 4 MIND AND BODY

Descartes: Discourse on the Method of Rightly Conducting the Reason (Conceivability Argument)

In this excerpt, Descartes lays out his conceivability argument for substance dualism. He reasons that something that is conceivable is logically possible, and something that is inconceivable is logically impossible. It is conceivable—and therefore logically possible—that someone could exist without a body. If so, then having a body is not an essential feature of that person. On the other hand, it is inconceivable—and therefore logically impossible—that someone could exist without a mind. If so, then having a mind is an essential feature of that person, and that means that the person is an immaterial, thinking thing.

Schick: Doing Philosophy

Schick maintains that Descartes’s conceivability argument fails because, contrary to Descartes’s view, disembodied existence seems not to be conceivable and is therefore not logically possible. 

Descartes: Meditations on First Philosophy (Divisibility Argument)

In this excerpt, Descartes articulates his divisibility argument for substance dualism: It seems that minds and bodies must be different substances because they have different properties—namely, that bodies can be divided into parts, but minds cannot.

Searle: Mind (Substance Dualism)

Searle argues that substance dualism must be false because it conflicts with a fundamental law of science (the law of conservation of mass-energy). Moreover, the substance dualist’s tack of embracing epiphenomenalism will not save his theory, because epiphenomenalism seems to defy common sense.

Smart: Sensations and Brain Processes

Smart states and defends mind-body identity, clarifying the theory’s central claims and answering several objections to it.

Chalmers: The Conscious Mind

Chalmers argues against materialism by appealing to the conceivability of zombies. For Chalmers, a zombie “is molecule for molecule identical to me, and identical in all the low-level properties postulated by a complete physics, but he lacks conscious experience entirely.” He contends that it is conceivable that such a zombie could exist. That is, it is conceivable that there could be a creature physically identical to him in every way but lacking the mental states that constitute conscious experience. If this zombie is conceivable, he says, then it is logically possible that the zombie could exist. If it is logically possible that the zombie could exist, then physical states must not be essential to conscious experience. Materialism, therefore, must be false.

Nagel: What Is It Like to Be a Bat?

Nagel argues that for an organism to have conscious experience, there must be “something that it is like to be that organism—something it is like for that organism.” This something he calls the “subjective character of experience.” For example, there is something that it is like to be a bat, a specific subjective experience unique to bats. The conclusion to be drawn from such facts is that consciousness does not seem to be the sort of thing that can be explained purely in physical terms. Exhaustively cataloging the physical characteristics of a bat (or a human) will not explain the peculiar nature of its conscious experience. Reductive theories of mind therefore appear to be fundamentally inadequate.

Fodor: The Mind–Body Problem

In this article Fodor criticizes traditional mind–body theories and argues for functionalism, a distinctive departure from both dualism and identity theory. “In the functionalist view,” he says, “the psychology of a system depends not on the stuff it is made of (living cells, mental or spiritual energy) but on how the stuff is put together.” Mental states are functional states—systems of causal relationships—typically realized in, or supported by, the brain. But these relationships need not occur only in neurons; any suitable material will do. The mind is like computer software (a system of functional or logical relationships), which can be realized in, or run on, any suitable hardware.

Block: Troubles with Functionalism (Chinese Brain Thought Experiment)

Block presents what is known as an absent qualia argument against functionalism. The basic idea of such arguments is that a system can be given a functional organization that, if functionalism is correct, would suffice to bring a mind into existence; yet it seems intuitively obvious that the system has no mental states at all, so functionalism is false. Suppose, says Block, “we convert the government of China to functionalism, and we convince its officials that it would enormously enhance their international prestige to realize a human mind for an hour. We provide each of the billion people in China . . . with a specially designed two-way radio that connects them in the appropriate way to other persons and to [an] artificial body . . . we arrange to have letters displayed on a series of satellites placed so that they can be seen from anywhere in China.” According to functionalism, the proper arrangement of inputs and outputs among the billion people should produce mental states. That is, from the one billion Chinese people there should arise one more person—the one brought forth by the whole system’s functional organization. But, says Block, what makes this “Chinese brain” a counterexample to functionalism is that there is considerable doubt that the Chinese brain has any mental states.

Searle: Mind (Chinese Room Thought Experiment)

Searle sets out to refute strong artificial intelligence (AI) with his classic thought experiment known as the “Chinese Room.” The idea is that if strong AI is true, then a person should be able to acquire a cognitive capacity (thinking, perception, understanding, etc.) simply by implementing an appropriate computer program. But Searle thinks his thought experiment shows that no such capacities of mind are achieved just by running a program. He imagines himself locked in a room with boxes of Chinese symbols, and a rule book that allows him to answer questions put to him in Chinese. He is, in effect, implementing a computer program. But, he says, no matter how well he handles inputs and outputs, “I do not understand a word of Chinese. And if I do not understand Chinese on the basis of implementing the right computer program, then neither does any other computer just on the basis of implementing the program, because no computer has anything that I do not have.” The argument is that if strong AI is correct, then Searle should understand Chinese because he is manipulating symbols just as a computer does; he is running a program. But he remains clueless about Chinese. Therefore, computers cannot understand Chinese—or acquire any other cognitive capacity—just by running the right software. Strong AI is false.

Chalmers: The Conscious Mind (Property Dualism)

Chalmers argues for a theory of mind known as “property dualism” (also called “nonreductive materialism” and “naturalistic dualism”). On this view, mental states, or properties, are distinct from physical properties, arising from the physical properties without being reducible to, or identical to, them (and without being some kind of Cartesian substance). Philosophers like to say that this relationship between the mental and physical is one of supervenience—that is, mental properties supervene on the physical ones. This means that something possesses a mental property in virtue of having a physical property. The mental property depends on the physical one, arises from it, but is not identical to it. If this view is correct, reductive materialism must be false. “This failure of materialism,” says Chalmers, “leads to a kind of dualism: there are both physical and nonphysical features of the world.” Mental properties are features of the world that are “over and above the physical features of the world.”