Content and Consciousness: Chapter 3

Herein, Dennett argues that the process of evolution by natural selection can, in principle, be a sufficient mechanism for generating (or approximating) ‘content’ (intentionality) within animal nervous systems. He asks, ‘What, if anything, permits us to endow neural events with content?…approximating to “the contents of our thoughts, perceptions, and intentions”?’

He begins this task by defining intentional behavior extremely broadly, in a common-language sense: ‘the sort we normally characterize in Intentional terms.’ He then establishes a first, basic, necessary condition for intentional behavior: ‘…the capacity to store information.’ And since D- is a materialist, ‘the method [of information storage]… will have to be some form of storage in its material organization.’ But even this is not yet enough, as even tree rings store information in their structure. We need ‘Intelligent storage’, which means that ‘the information stored can be used by the system that stores it’. This very definition of intelligent storage brings in another notion: ‘For information to be for a system, the system must have some use for the information, and hence the system must have needs.’

‘The criterion for intelligent storage is then the appropriateness of the resultant behavior to the system’s needs given the stimulus conditions of the initial input and the environment in which the behavior occurs…The useful brain is the one that produces environmentally appropriate behavior.’ So now we have systems which store information, have goals, and have some degree of behavioral plasticity through which they can use that information, intelligently, to pursue those goals. How could we have those in a material universe? ‘No physical motions or events have intrinsic significance…what a stimulus…heralds cannot be a function of its internal characteristics alone…the brain is “blind” to the external conditions producing its input.’

I think this notion is very central to Dennett’s philosophy, though he does not expound upon it much here. I think it is tied to his attitude toward the homunculus problem: if in some sense the brain did ‘know what produced input x’, it seems this would have to be done through the ‘perception by the homunculus of an image in the Cartesian theater’. All the brain ‘knows’ is ‘retinal neuron clusters #5 and #7 firing at rate x’, or even merely ‘firefirefirefirefireinhibitfireinhibitfire, etc.’; it does not (and, D- reminds us, cannot) ‘know’ ‘Giraffe-there-now’. These sorts of semantic, intentional objects cannot exist in the structure of the brain (which is all natural selection can act upon), but only as either ‘free-floating rationales’ or ‘intentional stance attributions’. There is no semantic content in a brain; there is semantic content only in language (if anywhere). Brains are ‘stupid’ input-output machines, or in D-’s terminology, ‘complex functional structures’.

‘Functional structures [are] any bits of matter…which can be counted on to operate in a certain way when acted upon in a certain way.’ If we have a brain which is a (somewhat) plastic complex of functional structures (made of neurons and glial cells), and a capacity to ‘sort’ or select among those structures, finding and keeping those which are conducive to the survival of the system of which they are a part, then we have an evolutionary story of mental construction. D- thinks this will work, and will explain all the intentionality ‘worth wanting’.
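To make the shape of that evolutionary story concrete, here is a minimal toy sketch of my own (not Dennett’s, and not meant as a serious model): a ‘functional structure’ is reduced to a bare stimulus-to-response mapping, and selection keeps the mappings whose behavior happens to be environmentally appropriate. The stimulus/response vocabulary and the fitness rule are invented purely for illustration; note that nothing inside a structure ‘knows’ what is appropriate — appropriateness lives entirely in the selection pressure.

```python
import random

# A 'functional structure' is just a mapping from stimuli to responses --
# it "operates in a certain way when acted upon in a certain way."
STIMULI = ["heat", "food", "predator"]
RESPONSES = ["approach", "withdraw", "ignore"]

# The environment rewards appropriate behavior; the structure itself is
# blind to this fact.
APPROPRIATE = {"heat": "withdraw", "food": "approach", "predator": "withdraw"}

def random_structure():
    return {s: random.choice(RESPONSES) for s in STIMULI}

def fitness(structure):
    # Count how many stimuli get an environmentally appropriate response.
    return sum(structure[s] == APPROPRIATE[s] for s in STIMULI)

def evolve(pop_size=50, generations=20, mutation_rate=0.1):
    population = [random_structure() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the structures conducive to 'survival'; discard the rest.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        # Refill the population with mutated copies of the survivors.
        children = []
        for parent in survivors:
            child = dict(parent)
            for s in STIMULI:
                if random.random() < mutation_rate:
                    child[s] = random.choice(RESPONSES)
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

print(evolve())
# e.g. {'heat': 'withdraw', 'food': 'approach', 'predator': 'withdraw'}
```

The point of the toy is only this: selection over dumb input-output structures can yield behavior we would happily describe, from the outside, in Intentional terms.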

There follows a long stretch of cognitive-science and evolutionary-biology detail that is not relevant at the current level of philosophical consideration (though the specifics do become philosophically interesting at other levels). These details are often anachronistic or otherwise trivial, and I won’t cover them here.

D- comes around near the end of the chapter to give his account of the as-yet left out concept of ‘Goals’. He compares the ‘goal-terminating’ behavior of certain simple computer programs (‘Do X until you reach Y then terminate’) with the putative ‘goal-directed’ behavior of humans and other behaviorally complex animals. His answer to what a goal is comes from the intentional stance: a ‘goal’ is a theoretical posit the scientist uses to explain and predict systemic behavior. Behavior is ‘goal-directed’ if it is useful for us to describe it as such: ‘Deciding whether a particular animal is exhibiting goal-directed behavior will hinge on how we interpret its motions: are they sufficiently directed towards achieving the goal?’
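For readers who want the ‘goal-terminating’ pattern spelled out, here is a hypothetical illustration of mine (not from the book): the program does X until condition Y holds, then stops. Nothing in it represents a goal; ‘Y’ is merely a halting test, and any ‘goal’ we see in its behavior is our attribution from the intentional stance.

```python
# 'Goal-terminating' behavior: do X until condition Y holds, then terminate.
def goal_terminating(x, step, reached_y):
    """Apply `step` to x repeatedly until the predicate `reached_y` holds."""
    while not reached_y(x):
        x = step(x)
    return x

# Example: 'count upward until you reach 10, then terminate'.
result = goal_terminating(0, step=lambda n: n + 1, reached_y=lambda n: n >= 10)
print(result)  # 10
```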
