A survival value for consciousness

From discussion on the everything list:


13.01.2012 21:54. In my favorite book on consciousness (by Jeffery A. Gray, Consciousness: Creeping up on the Hard Problem) there is chapter 7 “A survival value for consciousness” that is summarized on p. 90:

“Whatever consciousness is, it is too important to be a mere accidental by-product of other biological forces. A strong reason to suppose that conscious experience has survival value is this: it is only by appealing to evolutionary selection pressures that we can explain the good fit that exists between our perception of the world and our actions in dealing with it, or between my perceptions and yours. Biological characteristics that are not under strong selection pressure show random drift, which would be expected to destroy the fit. I assume, therefore, that consciousness has survival value in its own right. That rules out epiphenomenalism, but leaves us with the problem of identifying the causal effect of consciousness in its own right.”

By the way, in Gray’s book the term intelligence is not even in the index. This was the biggest surprise for me, because I had always thought that consciousness and intelligence are related. Yet, after reading the book, I now agree with the author that conscious experience is a separate phenomenon.

13.01.2012 22:36 Brent: “I think he may go wrong there. If you like Julian Jaynes’ theory of the origin of consciousness: a kind of internalized perception of speech that evolved by co-opting brain structures used for hearing and language processing. Then, because inner narrative and social exchange share the same processing, the two can’t drift apart.”

13.01.2012 23:50: I would say that before speech there was music. And without conscious experience music is not possible. How could sound waves form music without consciousness? Hence Julian Jaynes’ theory does not impress me.

13.01.2012 22:36 Brent: “Intelligence, the modeling of oneself and one’s relations to others, has survival value, and this is tied through language to internal narratives. I think there could be an intelligence which did this modeling in some way not shared with external perception, and while it would be conscious in the sense of having an internal model of itself and its relations, its consciousness might be different from ours. We can imagine this in part by considering changes to our own consciousness. If you’re like me, more of your thinking is in words and images than in talking pictures. But suppose there were implanted in your brain an internet connection. Of course we developed the internet so it has a lot of written language and pictures; but suppose for some reason the internet connection in your brain only transmitted YouTube videos. So when you thought of Obama, instead of the word “Obama” or a picture of him springing to mind, a video of him would spring to mind. This would be a qualitative change in your consciousness.”

13.01.2012 23:50: The main question here is how unconscious processes in the brain produce conscious experience. Say, there is some problem and it is necessary to make a choice. A person who has no idea what to do goes to sleep, and in the morning he has a conscious experience of a very good solution that was prepared unconsciously during sleep. Then the question is how to draw a border between the conscious and the unconscious. Or do you believe that both phenomena are the same?

14.01.2012 03:06 Brent: “So does Gray think that beings can be conscious without being intelligent or intelligent without being conscious?”

14.01.2012 09:08: To the first part, a definite yes, for example: “But we can, I believe, safely assume that mammals possess conscious experience.”

There is no clear answer to the second part in the book. Well, for example:

“Language, for example, cannot be necessary for conscious experience. The reverse, however, may be true: it may be that language (and other functions) could not have evolved in the absence of conscious experience.”

It depends, however, on the definition. I would say that a self-driving car is intelligent and a rock is not, but even in this case it is not completely clear to me how to define intelligence unambiguously.

Gray’s personal position is that the survival value of consciousness is “late error detection”, which happens through some multipurpose, multi-functional display. This actually fits quite well with cybernetics, but it leaves open the question of the nature of such a display.