The Hard Problem Isn't
by
Michael R. Lissack
New England Complex Systems Institute (Cambridge, MA, USA)
Henley Management College (Henley-on-Thames, UK),
150 West 56th St. #4904, New York, New York 10019
Phone: 212-245-7055
Fax: 212-956-3464
e-mail: lissack@lissack.com; web: http://lissack.com
Abstract
Chalmers (1995) has described the hardness of the hard problem of explaining consciousness as its lack of a functional explanation (a functional explanation would, in his terms, make it easy). This article posits such a functional explanation and suggests that the failure of others to describe consciousness in functional terms lies in the use of an incorrect unit of analysis. From an evolutionary point of view, consciousness may have the functional purpose of facilitating what Stewart and Cohen (1997) have termed extelligence -- the external storage and manipulation of information. The resulting savings in storage and retrieval effort, as well as in direct storage capacity, can in turn facilitate the development of more complex organisms. By positing a functional explanation in response to the question "why consciousness?", the article proposes to transform the hard problem into an easy one.
_____________________________________________________________
Submitted to the Journal of Consciousness Studies, October 1997
What Makes The Hard Problem Hard?
Chalmers (1995), in positing the "hard problem," distinguishes it from mere "easy" problems by the role of mechanism and function. Easy problems are easy because they admit explanation in the form of mechanism specification – what mechanism can perform the function. Chalmers posits that there is no such problem to which consciousness is the mechanistic answer, and thus that the problem of "why consciousness?" is a hard problem – one without functional explanation (cf. Güzeldere, 1995, and Shear, 1997).
Robinson (1996) echoes and amplifies Chalmers' distinction in an article entitled "The Hardness of the Hard Problem." In doing so, Robinson restates the hard problem as follows: "Why should a subject S have a conscious experience of type F whenever S has a neural event of type G?" He goes on to state "two facts [which] will lead us directly to the conclusion that we cannot answer the question why neural events yield corresponding conscious experiences within our present conceptual framework [emphasis added]." His two "facts" are 1) regularities are explained through structure and 2) among the properties of consciousness at least one has no structural expression. In Aristotelian terms, Robinson's "fact" one is a statement of the primacy of material cause (allowing for both formal and efficient cause to play a role in "restructuring"). By contrast, his "fact" two suggests that there may be a need for "final cause." Indeed, this article builds on that suggestion.
Chalmers asks "Why should physical processing give rise to a rich inner life at all? It seems objectively unreasonable that it should, and yet it does. " In so asking he creates the very hardness of the problem. Asking "why should physical processing give rise to?" is a request to find a material cause. Yet, the hard problem can be stated in a different form. What is it about the evolution of organisms that have consciousness which has lent an advantage to their so being capable of the subjective qualia of consciousness? What evolutionary advantage has consciousness wrought? To what other functions of the organisms has consciousness been a contributing factor? In what way can consciousness be explained through its being a medium or vehicle to allow something else to have occurred in the evolution of the organisms in question? Does consciousness stem from a final cause?
The hardness of the hard problem lies in the problem poser's reliance upon material, efficient, and formal causation as all that matters. Within such a conceptual framework, the hard problem is truly hard and perhaps permanently unsolvable. But there is another conceptual framework available – that of Aristotle's final cause. The conscious experience F is entailed by the neural event G as a means of allowing S to do something else, X or Y. Thus, although F cannot be explained by G, the relation of F to G can be explained by the relation between S and X or S and Y. If such a theory can be articulated, then the hard problem can be recast as an easy problem.
Such is this article's task.
The Unit Of Analysis
In two leading books summarizing the current debate on consciousness (Shear, 1997, and Block, Flanagan, and Güzeldere, 1997), what is common among the posers of the hard problem is the statement of the problem as one requiring that F be explained by G. The unit of analysis is thus taken to be 1) an individual conscious experience and 2) a related neural event or set of events. As Chalmers (1995) notes in his definition of what makes the hard problem hard, "Throughout the higher-level sciences, reductive explanation works in just this way." Yet, as Bohm most notably pointed out, when we look at the world through any pre-given theoretic lens, the factual knowledge we obtain will be shaped and formed by those same theories.
Rosen (1985, 1991) and Kampis (1991) argue that complex systems such as life tend to involve final cause and that within such systems the remaining Aristotelian causes are mixed in ways that render them not reducible. Again from Chalmers (1995): "What makes the hard problem hard and almost unique is that it goes beyond problems about the performance of functions. To see this, note that even when we have explained the performance of all the cognitive and behavioral functions in the vicinity of experience - perceptual discrimination, categorization, internal access, verbal report - there may still remain a further unanswered question: Why is the performance of these functions accompanied by experience? A simple explanation of the functions leaves this question open." Chalmers uses this presentation of non-reducibility to argue for supervenience, yet it may be that what needs to be supervened is the unit of analysis. Seeking a materialist cause to answer what is self-described as a non-reducible problem seems fruitless. Non-reducibility, however, can yield to final causes.
Rosen points out that the property of anticipation is often seen in living and other complex systems. And what is anticipated, if not some form of final cause? The reductionist insistence on the primacy of material cause fails in explaining evolution and in explaining almost anything to which final cause may be applied. In the traditional studies of living systems, final cause is often acknowledged only in evolutionary theory. And so, to evolutionary theory we turn.
In evolutionary terms, the question of "why consciousness?" must be answered in terms of the sustainability or fitness it confers upon S and other species members similar to S. The unit of analysis is not one conscious experience and its related neural event(s) but instead the subject S and its other species members. The answer to the question why is not "because neural state G works in the following manner to create F" but rather "because having conscious experiences F1, F2, F3, … will allow S to better do X or Y."
Consciousness viewed this way is not an end in itself but a means to some other end. And, as a means, its explanation lies in final causes. A philosophical notion from Heidegger seems to capture this relation: that of "Fundierung" -- the relation between facticities and functions, or perhaps between mediums and objects (cf. Rota, 1997, chapter XV). In a Fundierung relation it is the presence of the function that ascribes meaning and value to the facticities that made it possible. For example, in the absence of the functions of writing and drawing, a pen is a mere pointer and the ink and tip have little to no value. The presence of the function "status giver," however, can give great value to an empty Mont Blanc pen. Can consciousness be viewed as having a Fundierung relation to some evolutionarily advantageous role or function? If so, the unit of analysis cannot be the experience itself nor its related neural events.
Chalmers (1995) writes, "We know that conscious experience does arise when these functions are performed, but the very fact that it arises is the central mystery. There is an explanatory gap between the functions and experience, and we need an explanatory bridge to cross it. A mere account of the functions stays on one side of the gap, so the materials for the bridge must be found elsewhere." Our focus in answering "why consciousness?" must turn to discovering what advantage having consciousness conveys to the consciousness-possessing species. If a tentative answer to that question can be posed, then consciousness may indeed have a functional, mechanistic explanation, and, by Chalmers' own terms, the hard problem will no longer be hard.
Extelligence and External Data Storage
Chalmers (1995) himself outlines the first step in creating a final cause explanation: "information (or at least some information) has two basic aspects, a physical aspect and a phenomenal aspect. This has the status of a basic principle that might underlie and explain the emergence of experience from the physical. Experience arises by virtue of its status as one aspect of information, when the other aspect is found embodied in physical processing." What is missing (and what this article adds) is the notion that the physical embodiment of information demands resources for storage, transmission, replication, evaluation, and the like -- demands which have counterparts in the non-physical realm. There is a tendency to conserve the expenditure of such resources, and an evolutionary advantage will accrue to those organisms which can more efficiently manage such resource demands or which can externalize some of the load to others or to the environment.
Stewart and Cohen (1997) refer to this evolutionary advantage as "extelligence." Boisot (1995) refers to it as his "principle of least action." Both Boisot and Stewart and Cohen write of the potent qualitative improvement which results when data storage and processing problems are offloaded and thence converted, via extelligence, into pattern recognition problems. The latter require fewer resources and allow the creature involved to devote the freed resources to more complex (and presumably higher-order) development.
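That conversion -- from a storage-and-processing problem into a pattern recognition problem -- can be made concrete with a toy computational analogy. The sketch below is purely illustrative; the cost model and all of the names (ExternalStore, react, and the sample situations) are this presentation's inventions, not constructs from Boisot or from Stewart and Cohen. An expensive first-principles evaluation is performed once, its result is offloaded to a shared external store keyed by a compact pattern signature, and every later encounter with a matching situation becomes a cheap act of recognition.

```python
# Toy model: offloading processing into externally stored patterns.
# All names and the cost model are hypothetical illustrations.

class ExternalStore:
    """Extelligence as a shared lookup table: pattern signature -> response."""
    def __init__(self):
        self.patterns = {}

    def recall(self, sig):
        return self.patterns.get(sig)

    def store(self, sig, response):
        self.patterns[sig] = response

def signature(situation):
    # Compress a rich situation into a compact, order-insensitive label.
    return frozenset(situation)

def expensive_processing(situation):
    # Stand-in for costly first-principles evaluation of raw data.
    return "flee" if "predator" in situation else "forage"

def react(store, situation):
    sig = signature(situation)
    cached = store.recall(sig)
    if cached is not None:
        return cached                       # cheap pattern recognition
    response = expensive_processing(situation)
    store.store(sig, response)              # offload for all future encounters
    return response

shared = ExternalStore()                    # shared across species members
print(react(shared, ["grass", "predator", "dusk"]))  # processed, then stored
print(react(shared, ["predator", "dusk", "grass"]))  # recognized by pattern
```

Because the store is shared, one member's costly processing becomes every member's cheap lookup; the freed resources are exactly those the paragraph above suggests can be redirected to more complex development.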
As a first attempt to posit such an explanation, let us note that consciousness gives us an awareness of emotional response and a limited awareness of the context dependence of such responses. The question to pose is: does this give us any advantage? The answer is a seemingly obvious yes. By associating emotional responses with specific aspects of situations we seem to be able to decompose a situation for later recall by a variety of triggers, including context and emotion. In essence we have offloaded some of the data requirements for processing similar situations in the future by labeling each aspect with a pattern and allowing the pattern to be recalled.
Further, it is possible that awareness (the experience of consciousness) allows the direct manipulation of the valence and salience values (what Griffiths, 1997, labels as the purpose of emotion) associated with the information subject to such external storage. If so, then the ability to directly manipulate these values will itself further facilitate the association of such values with the "components" of any decomposable unit, thereby allowing rapid reaction to composites (both new and old). In the absence of consciousness, direct manipulation of such associations and their values is vastly more difficult, if not impossible. With consciousness, such direct manipulation allows for the socialization of some forms of learning, which is itself another form of externalized information storage, freeing resources for redirection to still more complex development. Awareness gives us access to the valence and salience valuations and thus allows us to change the codings. As John Barlow put it, when we remember, we remember when we last remembered and not before.
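A small sketch may help fix ideas. Nothing in it is drawn from Griffiths or from the paper's other sources; the Component structure, the weighting rule, and the numbers are invented for illustration. Components carry valence and salience tags, composites are appraised instantly from the tags of their parts, and "conscious access" appears as the ability to rewrite a tag directly rather than relearn it from raw experience.

```python
# Toy illustration (all values and names hypothetical): components carry
# valence/salience codings; composites are appraised from component tags.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    valence: float    # good (+) versus bad (-)
    salience: float   # how much the component matters at present

def appraise(composite):
    """Rapid reaction to a composite: salience-weighted sum of valences."""
    return sum(c.valence * c.salience for c in composite)

fire = Component("fire", valence=-0.9, salience=0.8)
food = Component("food", valence=+0.7, salience=0.6)

# A novel composite is appraised instantly from already-tagged components,
# without reprocessing any raw situational data.
print(appraise([fire, food]))      # negative: avoid

# Direct manipulation of the coding -- the move the text attributes to
# conscious access -- changes the reaction to every composite containing it.
fire.valence = +0.3                # e.g., after learning that fire cooks food
print(appraise([fire, food]))      # positive: approach
```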
Now, is this the only such explanation? No. But it is a plausible one. And we can go on. Awareness allows us to observe the decomposition of interpretation in action. Why would this be an advantage? Recognition of the componentness of components allows their reconstitution into composites (cf. Bar-Yam, 1997) and is again a great conserver of data processing and storage requirements.
Fontana and Buss (1996: 104) comment appropriately here: "organization…derives from placing a theory of objects in a suitably constrained many-body dynamical setting. [T]he conventional settings of either dynamics alone or syntactical manipulation alone are insufficient; organization derives from their combination in a constructive dynamical system." In the context of this article, for organization read learning, for dynamics read physical processes, and for syntactical manipulation read consciousness or experience.
Chalmers (1995) did not accept that such explanations were sufficient: "This is not to say that experience has no function. Perhaps it will turn out to play an important cognitive role. But for any role it might play, there will be more to the explanation of experience than a simple explanation of the function." This author is not so sure. It seems that, given the above, many more tentative explanations of the form "the relation of F to G helps S with X or Y" are possible. One or more of these may fully explain why we have the subjective feeling of experience. But no such explanation is a simple material cause explanation; instead, it will be a more complex final cause explanation. Perhaps, for the hardness of the hard problem, therein lies the rub.
Hypotheses for "Why?"
The author suggests three hypotheses and related corollaries for further exploration regarding the why of consciousness:
(H1) Conscious experiences give a creature separate access to valuations regarding the valence and salience of perceived data (cf. Griffiths's, 1997, model of the role of emotions, and DeLancey, 1996).
(C1a) Valence and salience valuations can be assumed to be reflected in an emotive or affective response to a given data item but only consciousness allows the separate observation of such valuations as one item in a composite.
(C1b) Separate access is important if the creature is to be able to decompose composites as observed (cf. Bar-Yam, 1997).
(C1c) Separate access is critical if situatedness (cf. Nardi, 1996, and Hendriks-Jansen, 1996) is to be recognized and dealt with as other than a composite phenomenon.
(H2) The transformation of data ("perceived change") into information (data in which patterns can be recognized) and thence into knowledge (a structure of information which ranges over time, or meta-patterns) can facilitate the better use of storage resources and conserve the actions required to store and retrieve data, information, and/or knowledge items. (A toy sketch of this transformation appears after this list.)
(C2a) Association of items with valence and salience valuations facilitates pattern recognition and the minimization of energy expenditure.
(C2b) Association of decomposable items is essential if composites are to be readily recognized and dealt with (cf. Bar-Yam, 1997).
(C2c) Separate association is critical if situatedness (cf. Nardi, 1996, and Hendriks-Jansen, 1996) is to be recognized and dealt with as other than a composite phenomenon.
(H3) Offloading storage capacity and/or processing requirements can free resources a creature may use for more complex development.
(C3a) Creatures are resource constrained.
(C3b) Creatures are situated.
(C3c) More complex development can create evolutionary advantage.
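The data-to-information-to-knowledge ladder of (H2) can be sketched in a few lines. The stream of events, the pattern criterion, and the meta-pattern below are all invented for illustration; the point is only the progressive compression: raw perceived changes, then recognized patterns over them, then a meta-pattern over the patterns, each level cheaper to store and retrieve than the one beneath it.

```python
# Hypothetical sketch of H2: data -> information -> knowledge as
# progressive compression. Events and criteria are invented examples.

from collections import Counter

# Data: raw "perceived changes," stored verbatim (expensive).
data = ["rustle", "rustle", "shadow", "rustle", "shadow", "pounce"]

# Information: patterns recognized in the data -- here, successive pairs
# of events that recur more than once.
pairs = Counter(zip(data, data[1:]))
information = {pair for pair, count in pairs.items() if count > 1}
print(information)                  # {('rustle', 'shadow')}

# Knowledge: a meta-pattern ranging over the recognized patterns, which
# is all that now needs storing; the raw stream can be discarded.
knowledge = {"precursors_of_danger": {first for first, second in information}}
print(knowledge)                    # {'precursors_of_danger': {'rustle'}}
```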
Final cause can be fairly potent as an evolutionary force: "If the potential is sufficiently accessible and the advantages that accrue from realizing it are strong enough, then evolution will come up with some form of the necessary trick" (Stewart and Cohen, 1997: 112).
Facilitating What?
Stewart and Cohen suggest that the "self is not a thing, but a process, which preserves an apparent sense of identity even as it changes complicitly with everything around it" (p. 224). In so stating, they are echoing the autopoietic notions of Maturana and Varela (1980), Mingers (1995), Luhmann (1995), and Kampis (1991).
To reach this conclusion, Stewart and Cohen take up Chalmers' challenge of the dual nature of information as both symbolic representation and physical instantiation. They argue that features of the world (coherent entities in their own right) are translated into these dual-status entities within our brain, and that self-awareness is the recursive by-product of a generalized feature-detector that has learned to recognize itself as a feature. The resultant "electrical impulses have two distinct interpretations. They are physical processes in the real world; but to the owner of that particular brain they carry an interpretation as models of the real world. The models are imperfect but the physics that run them obeys all the usual rules" (p. 186).
If we attempt to portray the self as a process which takes advantage of this dual nature of electrical impulses, then through Rosen's (1991) concept of the modeling relation the above hypotheses can be stated more generally. The modeling relation is a mathematical object that models the process by which we assign meaning to the world we perceive. It concerns the relationship between two systems, a so-called "natural system" and a "formal system." The assumption is that when we are "correctly" perceiving our world, we are carrying out a special set of processes in which we "model" that world by means of the formal system. The natural system is something in our surroundings that we wish to understand (as well as control and make predictions about, if these are distinguishable from "mere" understanding). The formal system is some creation of our mind, or something our mind "borrows," in order to try to deal with observations or experiences of our surroundings. The modeling relation is a description of the encoding and decoding that goes on between the formal system and the natural system, or the "world." What is important in this relation is the way we have chosen to do the encoding and decoding rather than the codes themselves.
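A minimal sketch of the modeling relation may be useful here. The two toy systems below (a cooling world and an arithmetic model of it) and all function names are this presentation's assumptions, not Rosen's formalism; the sketch shows only the structure Rosen describes: the relation "commutes" when encoding the natural state, running the formal inference, and decoding the result agrees with letting the natural system evolve on its own.

```python
# Toy instance of Rosen's modeling relation (systems invented for
# illustration): natural system, formal system, encoding, decoding.

def natural_step(temp_celsius):
    # Causal entailment in the "natural system": the world cools slightly.
    return temp_celsius - 1.0

def encode(temp_celsius):
    # Measurement: natural state -> formal representation (here, rounding).
    return round(temp_celsius)

def formal_step(reading):
    # Inferential entailment in the "formal system": our model of cooling.
    return reading - 1

def decode(prediction):
    # Decoding: formal prediction -> expected natural state.
    return float(prediction)

state = 20.3
via_model = decode(formal_step(encode(state)))   # around the formal system
via_world = natural_step(state)                  # within the natural system
print(via_model, via_world, abs(via_model - via_world) < 0.5)  # paths agree?
```

Note that what carries the burden here is the choice of encode and decode, exactly as the paragraph above emphasizes; the formal rule itself is trivial.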
Consciousness allows a creature to modify its encoding and decoding regimes. The more developed the sense of consciousness, the more awareness a creature has of its encoding and decoding processes, and the more manipulable those processes become. Over time, that ability confers an evolutionary advantage. If we assume that conscious creatures make some use of the modeling relation in constructing a mental model of their world, then a further hypothesis is suggested (a speculative sketch follows the corollaries below):
(H4) Consciousness can affect the creature’s ability to both decode and encode with regard to its constructed model of the world.
(C4a) Only by awareness can a creature be removed enough from the encoding and decoding process so as to be able to purposefully make changes to its decoding and encoding processes as well as to the model itself.
(C4b) Associations of valence and salience valuations with decomposable pieces of the model in the modeling relation will facilitate both the interpretation of composites and context dependence in the model.
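(H4) and (C4a) can likewise be given a speculative sketch. The learning rule, the toy dynamics, and every name below are invented for illustration; the only claim carried over from the text is structural: a creature with access to its own encoding can detect when the modeling relation fails to commute and revise the encoding itself, something a creature locked inside its codes cannot do.

```python
# Speculative sketch of H4/C4a (all specifics invented): a creature that
# revises its own encoding when model-based and worldly predictions diverge.

def natural_step(x):
    return 0.5 * x + 3.0              # the world's (unknown) dynamics

def formal_step(r):
    return 0.5 * r + 3.0              # the creature's internal model

class Creature:
    def __init__(self, gain):
        self.gain = gain              # its encoding regime (adjustable)

    def encode(self, x):
        return self.gain * x

    def decode(self, r):
        return r / self.gain

    def revise_encoding(self, x):
        """Awareness, in this sketch, is access to the encoding itself:
        compare the two paths around the modeling relation, nudge the gain."""
        error = self.decode(formal_step(self.encode(x))) - natural_step(x)
        self.gain += 0.1 * error      # toy corrective step toward agreement

creature = Creature(gain=0.6)         # a poorly calibrated encoder
for x in [2.0, 4.0, 6.0]:
    creature.revise_encoding(x)
print(round(creature.gain, 3))        # approaches 1.0, where the paths agree
```

The creature converges toward the encoding at which its model and its world agree; without access to its own gain -- without awareness of its own codes -- no such purposeful revision is possible.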
Implications
By suggesting a change in the unit of analysis and a redirection of attention away from a materialist, reductionist perspective, this article has opened up the potential redefinition of the hard problem as an "easy," i.e., functionalist, problem. Moving from an examination of why a given neural event should on its own lead to a conscious experience to an examination of what evolutionary advantages are created by having conscious experience associated with particular neural events opens new vistas for solution of the problem.
As Stewart and Cohen (1997: 47) note, "The bird does contract a vast number of muscles, it did take four thousand million years to evolve. But equally it is singing because that is what songbirds do."
By examining the embedding of a system in its surroundings, we can study both what that system does and what it might have done. The resulting possibility spaces look very different in worlds with and without final cause. The hypotheses suggested herein are but a beginning. In a world of abductive research (cf. Josephson and Josephson, 1996) one has to begin somewhere. A world view admitting of Aristotelian final cause, and of the Fundierung of potential media, does not lend itself to deductive reasoning (cf. Kelly, 1997). The hard problem, restated as it is herein, is potentially solvable, testable, and revisable in an abductive manner.
Rosen’s modeling relation is an important philosophical tool to sharpen reasoning about similar questions and has been ignored by more traditional researchers for far too long. The hardness of the hard problem proves to be merely a brittleness and, when attacked with the right tool, easily cracked.
References
Bar-Yam, Y. (1997). Dynamics of Complex Systems. (Reading, Mass.: Addison-Wesley).
Block, N., Flanagan, O. and Güzeldere, G., eds. (1997). The Nature of Consciousness. (Cambridge: MIT Press).
Boisot, M. (1995). Information Space. (London: Routledge).
Casti, J. and Karlqvist, A. (1996). Boundaries and Barriers. (Reading, Mass.: Addison-Wesley).
Chalmers, D. (1995). "Facing up to the problem of consciousness," Journal of Consciousness Studies, 2 (3), pp. 200-219.
Chalmers, D. (1996). The Conscious Mind: In Search of a Fundamental Theory. (New York: Oxford Univ. Press).
DeLancey, C. (1996). "Emotion and the function of consciousness," Journal of Consciousness Studies, 3 (5-6), pp. 492-499.
Fontana, W. and Buss, L. (1996). "The Barrier of Objects: From Dynamical Systems to Bounded Organizations," in Casti, J. and Karlqvist, A. (1996).
Griffiths, P. (1997). What Emotions Really Are. (Chicago: Univ. of Chicago Press).
Güzeldere, G. (1995). "Problems of consciousness: a perspective on contemporary issues, current debates," Journal of Consciousness Studies, 2 (2), pp. 112-143.
Hendriks-Jansen, H. (1996). Catching Ourselves in the Act : Situated Activity, Interactive Emergence, Evolution, and Human Thought. (Cambridge: MIT Press).
Josephson, J. and Josephson, S. (1996). Abductive Inference: Computation, Philosophy, Technology. (Cambridge: Cambridge Univ. Press).
Kampis, G. (1991). Self-Modifying Systems in Biology & Cognitive Science: A New Framework for Dynamics, Information & Complexity. (New York : Pergamon Press).
Kelly, K. T. (1997). The Logic of Reliable Inquiry. (New York, Oxford Univ. Press).
Luhmann, N. (1995). Social Systems. Bednarz, J. and Baecker, D. (trans.). (Palo Alto: Stanford Univ. Press).
Maturana, H. and Varela, F. (1980). Autopoiesis and Cognition: The Realization of the Living. (Boston: Reidel).
Mingers, J. (1995). Self-Producing Systems – Implications and Applications of Autopoiesis. (New York: Plenum Press).
Nardi, B. ed. (1996). Context and Consciousness. (Cambridge: MIT Press).
Robinson, W. (1996). "The hardness of the hard problem," Journal of Consciousness Studies, 3 (1), pp. 14-25.
Rosen, R. (1985). Anticipatory Systems: Philosophical, Mathematical and Methodological Foundations. (Oxford: Pergamon Press).
Rosen, R. (1991). Life Itself. (New York: Columbia Univ. Press).
Rota, G. (1997). Indiscrete Thoughts. (Boston: Birkhauser).
Shear, J., ed. (1997). Explaining Consciousness: The Hard Problem. (Cambridge: MIT Press).
Stewart, I. & Cohen, J. (1997). Figments of Reality. (Cambridge:Cambridge Univ. Press).