Cartesian Materialism

Dennett also discusses a materialist version of the Cartesian theater. Some materialists believe there is a centralized structure that does all of the mind’s conscious processing. Dennett calls this version of the Cartesian theater Cartesian materialism. But we have no evidence of such a structure—all we have are outdated Cartesian intuitions about how all conscious processing must come together in the brain. We find not a Cartesian theater in the brain but a parallel neural machine that runs a profusion of widely dispersed computational threads simultaneously. We simply end up with neurocomputational information processing:

[Figure: a person's interaction with the environment analyzed as parallel, widely distributed neurocomputational information processing between input stimuli and behavioral effects]

We need to say something about how the mind comes into the picture. How is it that we come to experience something like a Cartesian theater if, in fact, there is none? According to Dennett, the diagram above is a correct analysis of what is going on when a person interacts with an environment: there is nothing but parallel information processing doing the work of conscious intentional action. How can we understand this processing? Can we give a mechanical specification of the conscious mind through reverse engineering, just as we can give a mechanical specification of a car engine or a watch? What makes the engine run and the watch tick? What makes us do what we do with our conscious minds?

We reverse-engineer the mind by looking at input stimuli and behavioral effects and figuring out what sort of software architecture could drive the information processing in between. What software could make us talk and otherwise behave as we do when we are judged to be conscious? Why do we have to reverse-engineer the mind like an Internet hacker trying to figure out how a piece of software works on a remote computer without knowing its implementation details? Why couldn't we just look inside the neurocomputer to see what is going on? We would find a parallel-processing neural architecture, but it does not lend itself well to interpretation. Why not? In a word: complexity. No one has learned to interpret neural networks the way it is possible to interpret computer programs, and even if we succeeded in explaining the neural processes at the micro level, we would still need a simpler, higher-level description to make sense of it all (Dennett 1991, p. 193). Even if we examined high-level neural processing, such as large neural networks, we would not be able to see the forest for the trees. The interpretative line of attack must therefore be the software level, the functional computational architecture, and it must be inferred from the outside. We must work our way into the mind from the outside; we must hack our way in. Think of the mind as an unknown software system and infer its nature and composition from its behavioral manifestations.
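
Dennett's hacker analogy can be made concrete. The Python sketch below is a minimal illustration, not anything drawn from Dennett: it treats a system as a black box, probes it with input stimuli, records the behavioral effects, and scores rival hypotheses about the software in between by how well they predict that behavior. Every name in it (black_box, the candidate models) is invented for the example.

```python
# Black-box reverse engineering in miniature: infer which candidate
# "architecture" best explains a system's input-output behavior,
# without ever inspecting its internals.

def black_box(stimulus: float) -> float:
    """The system under study: we may probe it, but not open it up."""
    return 3.0 * stimulus + 1.0  # hidden implementation

# Rival hypotheses about the software driving the behavior.
candidates = {
    "linear":    lambda x: 3.0 * x + 1.0,
    "quadratic": lambda x: x * x,
    "constant":  lambda x: 4.0,
}

# Probe with input stimuli and record the behavioral effects.
stimuli = [0.0, 1.0, 2.0, 3.0, 4.0]
observed = [black_box(s) for s in stimuli]

# Score each hypothesis by how well it predicts the observed behavior.
def squared_error(model) -> float:
    return sum((model(s) - o) ** 2 for s, o in zip(stimuli, observed))

best = min(candidates, key=lambda name: squared_error(candidates[name]))
print("best-fitting architecture:", best)  # -> linear
```

The point of the analogy is that the winning hypothesis is vindicated entirely by its fit to stimuli and behavior; nothing in the procedure requires opening up the implementation.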

Dennett thinks of minds and mental states as theoretical entities we infer from behavior. There is no mind with isolatable mental states sitting in the brain, but we can build a theory of the mind, and if the theory squares with the behavioral effects, it works and is about something real. The reality of the mental lies in its explanatory and predictive power. Dennett calls this the intentional systems or intentional strategy approach. Let us examine some of the major theoretical constructions he comes up with.
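
Before turning to those constructions, the strategy itself can be put in miniature. The toy Python model below is an invented illustration, not Dennett's own machinery: it ascribes a belief and a desire to an agent, derives the action a rational agent with that belief and desire would take, and lets the ascription stand or fall by whether it predicts the observed behavior.

```python
# The intentional strategy in miniature: ascribe beliefs and desires,
# derive the action a rational agent would take, and let the ascription
# stand or fall by its predictive success. The scenario and names are
# invented for illustration.

from dataclasses import dataclass

@dataclass
class IntentionalAscription:
    belief: str  # where the agent believes the food is kept
    desire: str  # what the agent wants

    def predict_action(self) -> str:
        # Rationality assumption: the agent acts so as to satisfy its
        # desire, given what it believes.
        if self.desire == "food":
            return f"go to the {self.belief}"
        return "do nothing"

ascription = IntentionalAscription(belief="cupboard", desire="food")
predicted = ascription.predict_action()
observed_behavior = "go to the cupboard"

# The ascription earns its keep through prediction, not by pointing to
# an isolatable belief-state in the brain.
print(predicted == observed_behavior)  # -> True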
