Preferred Citation: Horst, Steven W. Symbols, Computation, and Intentionality: A Critique of the Computational Theory of Mind. Berkeley: University of California Press, c1996. http://ark.cdlib.org/ark:/13030/ft509nb368/


 
Chapter Ten— An Alternative Approach to Computational Psychology

10.5.2—
The Formal Description of Intentionality

There is, however, a second level of description at which computational description might really add something new to an account of intentionality. For while traditional logical analyses yield numerous essential insights into intentionality, they tend to do very little to give an overarching model of how these insights fit together, and in particular they do not give the kind of model that would seem to be of much use in the project of building an empirical theory. Here the resources of computer science may be of some use precisely in their ability to supply descriptions of the formal properties of certain kinds of systems. And the insights gained through logical and phenomenological analysis might be interpretable as formal constraints placed on a mathematical description of the "form" of intentional states and processes. This line of thought has been pursued by writers like Dreyfus and Hall (1984) and Haugeland (1978, 1981, 1985), who have seen a certain continuity between the Husserlian approach to intentionality and computer modeling. I shall not go into detail about where I agree and disagree with the analysis presented by these writers but shall supply a few examples of how I think this sort of intuition might be fleshed out.

(1) One insight to be gained from the logical analysis of intentionality is that intentional states can be about other intentional states. I can, for example, wish I could believe that my neighbor was trustworthy (WISH [BELIEF [my neighbor is trustworthy]]), or remember once having believed in the lost continent of Atlantis (RECOLLECTION [BELIEF [Atlantis exists]]). And such an insight is all very well and good, not to mention true. This same insight, however, can also be cashed out as a more interesting claim about the possible structures of intentional states: namely, that the structure permits of recursion. Or, to put it differently, if we were to give a formal description of the form of intentional states, it would have to involve a rule that allowed for recursion by embedding reference to one intentional state within the content of another. And since we have formal ways of talking about recursion, we have now taken a small step towards being able to say something about the abstract formal properties of intentionality. Such an insight might also provide the basis for other hypotheses—such as that the distinction between competence and performance can be applied to this embedding of intentional states, and that there might be general rules governing what intentional states can take particular other intentional states as arguments.
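The recursive rule described above can be given a minimal formal sketch. The following is purely illustrative (the datatype, modality labels, and `depth` function are my assumptions, not a formalism found in the text); it shows only how a rule permitting one intentional state to occur within the content of another yields a recursive structure:

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Proposition:
    text: str            # a non-intentional content, e.g. "my neighbor is trustworthy"

@dataclass
class IntentionalState:
    modality: str        # e.g. "WISH", "BELIEF", "RECOLLECTION"
    # The recursive clause: a content may itself be an intentional state.
    content: Union["IntentionalState", Proposition]

def depth(state: IntentionalState) -> int:
    """Count the embedded intentional states, making the recursion explicit."""
    if isinstance(state.content, IntentionalState):
        return 1 + depth(state.content)
    return 1

# WISH [BELIEF [my neighbor is trustworthy]]
wish = IntentionalState(
    "WISH",
    IntentionalState("BELIEF", Proposition("my neighbor is trustworthy")))
```

On this sketch, a hypothesis about which modalities can take which others as arguments would amount to a constraint on which `modality` pairs may be nested.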

(2) Some insights gained from logical analysis take the form of either normative or productive rules concerning intentional states. For example, an analysis of the intentional modality of recollection reveals that it presents its object as having been previously experienced in some other intentional mode (e.g., perception). This sets normative constraints on the satisfaction of such a state: you cannot felicitously remember seeing Y unless you have at some previous time had a perceptual gestalt of Y. You can, however, experience a state whose intentional modality is RECOLLECTION and whose content is that of oneself having seen Y without actually having had a perceptual gestalt of Y in the past. (There are false memories, after all.) So we would wish to describe our intentional processes in such a fashion that

(1) it is possible to experience RECOLLECTION [self having seen Y] without having previously experienced PERCEPTUAL PRESENTATION [Y], but

(2) the satisfaction conditions for RECOLLECTION [self having seen Y] cannot be fulfilled unless PERCEPTUAL PRESENTATION [Y] has previously been experienced.[11]

Such rules can, of course, be characterized in terms of purely formal relationships expressed in the form of normative licensing rules (which set constraints on satisfaction conditions) and productive rules which describe what combinations of intentional states actually result in the generation of particular new intentional states.
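The distinction between the two kinds of rule can be sketched in a toy form. Everything here (the event representation as tagged pairs, the function names) is an illustrative assumption of mine; the point is only that the licensing rule constrains satisfaction without constraining occurrence:

```python
def recollection_satisfied(history, y):
    """Normative licensing rule: RECOLLECTION [self having seen y] is
    satisfied only if a PERCEPTUAL PRESENTATION of y occurs earlier
    in the subject's history."""
    return ("PERCEPTUAL PRESENTATION", y) in history

def recall(history, y):
    """Productive rule: the system can generate the recollection state
    whether or not its satisfaction conditions are met."""
    state = ("RECOLLECTION", y)
    return state, recollection_satisfied(history, y)

# A false memory: the state occurs, but its satisfaction conditions fail.
state, veridical = recall([("BELIEF", "Atlantis exists")], "Atlantis")
```

Here `recall` always produces the state, corresponding to clause (1) above, while `recollection_satisfied` encodes clause (2): the state is veridical only given the right perceptual history.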

(3) To take a somewhat different example, the analysis of intentionality may show us how to separate the issue of "being about something" in the sense conveyed by the opaque construal of intentional verbs from the issue of the fulfillment of such states in veridical intentional states. There is, I think, a good case to be made to the effect that, once this is done, we already have a mathematical format for talking about the fidelity of at least some intentional states (e.g., the perceptual ones): namely, the Mathematical Theory of Communication (MTC). Even if one is wary of the claims made by Sayre (1986) that one can build semantic content out of the technical notion of information employed in MTC, it nonetheless seems that MTC might be telling a perspicuous story about the difference between veridical perception and perceptual gestalts that result from illusions, hallucinations, and the like.
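The kind of "perspicuous story" MTC might tell can be gestured at with Shannon's standard measure of transmitted information; this is a stock illustration from information theory, not Sayre's or Horst's own proposal, and the channel examples are my assumptions. A perceptual channel that reliably tracks the world carries more information than one producing percepts independent of it:

```python
import math

def mutual_information(joint):
    """I(W;P) in bits, from a joint distribution {(world, percept): prob}."""
    pw, pp = {}, {}
    for (w, p), pr in joint.items():
        pw[w] = pw.get(w, 0.0) + pr   # marginal over world states
        pp[p] = pp.get(p, 0.0) + pr   # marginal over percepts
    return sum(pr * math.log2(pr / (pw[w] * pp[p]))
               for (w, p), pr in joint.items() if pr > 0)

# A faithful channel: the percept always matches the world state.
faithful = {("cat", "cat"): 0.5, ("dog", "dog"): 0.5}

# A hallucinatory channel: the percept is independent of the world state.
noisy = {("cat", "cat"): 0.25, ("cat", "dog"): 0.25,
         ("dog", "cat"): 0.25, ("dog", "dog"): 0.25}
```

On these figures the faithful channel transmits one bit per percept and the hallucinatory channel transmits none, which is one way of cashing out, in purely formal terms, the difference between veridical perception and perceptual gestalts that fail to track their objects.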

Now it is important to see how this story differs from some other stories about computers and the mind. The point here is not that intentional states are just functional relationships to symbols and hence precisely analogous to computing machines. The point, rather, is that there is a system of abstract properties to be found in the system of intentional states and processes, and these might very well be the same abstract properties that are being explored in computer science, in much the same sense that the calculus provided an appropriate set of mathematical forms for problems in classical mechanics. The question is that of finding the right description for the formal features of intentionality, and not that of whether anything sharing those formal features would thereby have intentionality as well. The answer to that latter question is surely no: there will always be purely abstract objects having any given formal structure, and these do not have intentionality. And in general we should not expect any two isomorphic systems to be identical in all properties: for example, thermodynamics and the Mathematical Theory of Communication share a formalism, but have different subject matters. The "intentionality" of symbols in computers may seem to track the intentionality of mental states, but only because symbols in computers are representations that have semiotic-meanings and hence are designed to express the mental-meanings of mental states.

To put matters somewhat differently, if we start with an analysis of intentionality and add the resources of computer science, we might end up with a useful set of formal constraints upon the shape intentional systems can take. On the other hand, if we merely start with formal properties, we will never develop notions such as mental-meaning out of those, and hence will never get intentionality as opposed to getting the formal shape shared by intentionality and perhaps any number of other things. Moreover, we need to start with our intuitions about intentionality to know which formal properties are relevant. There are many possible formal descriptions which might be interesting but are not viable as descriptions of cognition. The only way to get a formal description of intentionality is to start top-down from our intuitions about the intentional states we already know about—namely, our own—and study their formal "shape" by a process of abstraction.

