August 21, 2007
Sleight of mind
There's a nice article in the NYTimes right now that ostensibly discusses the science of magic, or rather, the science of consciousness and how magicians can engineer false assumptions through their understanding of it. Some of the usual suspects make appearances, including Teller (of Penn and Teller), Daniel Dennett (that rascally Tufts philosopher who has been in the news much of late over his support of atheism and criticism of religion) and Irene Pepperberg, whose African grey parrot Alex has graced this blog before (here and here). Interestingly, the article points out a potentially distant forerunner of Alex named Clever Hans, a horse who learned not arithmetic, but his trainer's unconscious suggestions about what the right answers were (which sounds like pretty intelligent behavior to me, honestly). Another of the usual suspects is the wonderful video that, with the proper instruction to viewers, conceals a person in a gorilla suit walking across the screen.
The article is a pleasant and short read, but what surprised me the most is that philosophers are, apparently, still arguing over whether consciousness is a purely physical phenomenon or whether it has some additional immaterial component, called "qualia". Dennett, naturally, has the best line about this.
One evening out on the Strip, I spotted Daniel Dennett, the Tufts University philosopher, hurrying along the sidewalk across from the Mirage, which has its own tropical rain forest and volcano. The marquees were flashing and the air-conditioners roaring — Las Vegas stomping its carbon footprint with jackboots in the Nevada sand. I asked him if he was enjoying the qualia. “You really know how to hurt a guy,” he replied.
For years Dr. Dennett has argued that qualia, in the airy way they have been defined in philosophy, are illusory. In his book “Consciousness Explained,” he posed a thought experiment involving a wine-tasting machine. Pour a sample into the funnel and an array of electronic sensors would analyze the chemical content, refer to a database and finally type out its conclusion: “a flamboyant and velvety Pinot, though lacking in stamina.”
If the hardware and software could be made sophisticated enough, there would be no functional difference, Dr. Dennett suggested, between a human oenophile and the machine. So where inside the circuitry are the ineffable qualia?
This argument is just a slightly different version of the well-worn Chinese room thought experiment proposed by John Searle. Searle's goal was to undermine the idea that the wine-tasting machine was actually equivalent to an oenophile (so-called "strong" artificial intelligence), but I think his argument actually shows that the whole notion of "intelligence" is highly problematic. In other words, one could argue that the wine-tasting machine as a whole (just like a human being as a whole) is "intelligent", but the distinction between intelligence and non-intelligence becomes less and less clear as one considers poorer and poorer versions of the machine, e.g., if we start mucking around with its internal program so that it makes mistakes with some regularity. The root of this debate, which I think has been well understood by critics of artificial intelligence for many years, is that humans are inherently egotistical beings, and we like feeling that we are special in some way that other beings (e.g., a horse or a parrot) are not. So, when pressed to define intelligence scientifically, we continue to move the goal posts to make sure that humans are always a little more special than everything else, animal or machine.
In the end, I have to side with Alan Turing, who basically said that intelligence is as intelligence does. I'm perfectly happy to dole out the term "intelligence" to all manner of things or creatures to various degrees. In fact, I'm pretty sure that we'll eventually (assuming that we don't kill ourselves off as a species in the meantime) construct an artificial intelligence that is, for all intents and purposes, more intelligent than a human, if only because it won't have the innumerable quirks and idiosyncrasies (e.g., optical illusions and our difficulty in predicting what will make us happiest) that are there in human intelligence because we are evolved beings rather than designed beings.
posted August 21, 2007 09:52 AM in Thinking Aloud | permalink
Comments
Hi Aaron,
Matt Berryman referred me to this blog entry on qualia and intelligence. I've made a few comments in an entry on my blog. I'd be interested in your reply. The essence of my comments is that there is a difference between subjective experience and intelligence. It's not clear to me from your piece where you stand on that.
-- Russ
Posted by: Russ Abbott at August 30, 2007 10:25 AM
Hi Russ, thanks for stopping by. I'm not particularly interested in "qualia" per se, mainly because I don't think it's particularly important to the thing that I am interested in, which is intelligence. And, I think this is kind of what you're saying.
I liked the quotation by Dennett because basically he's saying the same thing -- qualia, in the sense that these philosophers of mind apparently use it, is a supernatural something. But, he's saying that qualia isn't necessary for either intelligence, or the non-intelligence that people exhibit when they're fooled by simple conjuring.
I guess my personal take on it is that discussions of "qualia" don't (yet?) belong in scientific literature. That doesn't mean they aren't interesting, or that we might not one day figure out how the brain (or a suitably programmed computer) produces certain kinds of subjective experiences. I'm just more interested in intelligence and intelligent behavior, which is more about computation and less about how red light in the 625–750 nm range seems.
Reading your webpage though, it sounds like you have a deeper interest in qualia than I do. So, perhaps I could challenge you to precisely define what you mean by qualia, and do so in a way that doesn't appeal to some "universal" experience that we as humans all have. What do you think?
Posted by: Aaron at August 30, 2007 04:22 PM
There are two reasons why I can't now (and don't even know whether I can ever expect to be able to)
define what [I] mean by qualia, and do so in a way that doesn't appeal to some "universal" experience that we as humans all have.
The first reason is that as far as I'm concerned the term "qualia" means a subjective experience. I doubt that such a definition will satisfy you. Yet we currently don't know enough about how subjective experience works to be able to say much more about it than it's the sort of thing we all have and which often goes by the name of a first (as distinguished from a third) person perspective. So that's the best I can do as far as a definition goes. In that sense it's pretty much a primitive on the subjective experience level of abstraction.
Before you scoff at that, tell me how you would define any primitive on any level of abstraction. For example, in the days in which stacks were defined (for example) in terms of axioms that related pop, push, top, etc., all we had were those axioms. They weren't definitions in terms of something else. One could of course define a stack in other terms — or in terms of an implementation, but eventually one will be stuck with primitives.
I'll put my second reason as a challenge for you. Tell me how to express the sort of definition you want in a way that doesn't rely on subjective experience. By that I mean, if I were to write down words, those words would have meaning only to someone who understood them. (The same goes for equations.) And the notion of "understanding" is itself a matter of subjective experience. So any definition in the usual sense depends on subjective experience even to be a definition. In that sense it would essentially be circular.
I just submitted an abstract that is relevant to this issue to a workshop on Philosophy and Engineering. I'd be interested in your comments.
Posted by: Russ Abbott at September 1, 2007 07:40 PM
P.S. I would appreciate it if you would put a comment on my blog entry (or send me email) if you reply.
Thanks
-- Russ
Posted by: Russ Abbott at September 1, 2007 07:45 PM
I'm reasonably convinced that my own subjective experience of the world is largely an illusion, constructed by things outside of that experience in order to deal with the everyday uncertainties that my ancestors evolved to handle. Which is to say, I'm pretty sure that my subjective experience of the world is an artifact of the computational processes that the human brain creates. If you will, it's the dynamic contents of the registers of the CPU in my head, which is to say that it's extremely limited in capacity and scope, and is largely at the whim of things far beyond it (input/output, long-term memory, interrupts from critical subsystems, etc.), although clearly from the point of view of the registers, they're where all the action really happens.
I'm willing to conjecture that any computational machine that exhibits a couple of basic features will also have a "subjective" experience, and I'm also willing to give its experience full equivalence to mine, for all intents and purposes. Here's what I think is necessary. (1) The machine must receive input in real-time from a richly unpredictable (which I distinguish from more simple-minded unpredictability like coin flips) external world. (2) It must also interact with this world, also in real-time, in such a way that its continued existence is contingent on making accurate predictions about certain events in said world. Obviously, such interactions will change the state of the external world, and the ability to make such predictions will probably require things like long-term memory and some learning algorithms. And finally, (3) the machine's predictions (behavior?) are not simply a function of the outside world (or rather, the perceived region of the outside world accessible to it), but also a function of some (but not all, I think) of the machine's current internal state (and obviously also its memory of its own actions and its previous experiences). The key thing about this model is that it's a real-time (online) system with limited self-awareness, which requires it to learn not only about the world, but also about itself. I'm willing to grant that any such system will have a subjective experience of the world. Are you?
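For concreteness, here is a minimal sketch, in Python, of the kind of online agent these three conditions describe. Everything in it (the World and Agent classes, the energy bookkeeping, the specific constants) is hypothetical scaffolding invented for illustration, not a claim about what subjective experience actually requires:

```python
import random

class World:
    """Toy stand-in for condition (1): an external world the agent cannot
    fully predict. (Gaussian noise is 'simple-minded' unpredictability in
    the sense above; a real world would be far more richly structured.)"""
    def __init__(self):
        self.state = 0.0

    def observe(self):
        return self.state + random.gauss(0.0, 1.0)   # noisy, partial view

    def step(self, action):
        # condition (2): the agent's actions change the world's state
        self.state += 0.1 * action + random.gauss(0.0, 0.5)

class Agent:
    def __init__(self):
        self.energy = 10.0    # condition (2): existence contingent on prediction
        self.internal = 0.0   # condition (3): internal state shapes predictions
        self.memory = []      # memory of its own actions and experiences

    def step(self, world):
        obs = world.observe()
        # prediction is a function of the perceived world AND internal state
        guess = 0.5 * obs + 0.5 * self.internal
        world.step(action=guess)
        error = abs(guess - world.observe())
        self.energy += 0.5 - error                       # bad predictions cost
        self.internal += 0.1 * (error - self.internal)   # crude learning update
        self.memory.append((obs, guess, error))
        return self.energy > 0.0                         # "continued existence"

world, agent = World(), Agent()
steps = 0
while agent.step(world) and steps < 10000:
    steps += 1
print("survived", steps, "steps")
```

The only point of the sketch is its structure: the agent's predictions depend jointly on a noisy, partial view of the world and on its own internal state, and the agent persists only for as long as those predictions stay good enough.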
Regarding your abstract, it seems interesting. Are you familiar with the work in physics on the renormalization group theory? Phil Anderson wrote an excellent essay in Science several years ago about it and how it relates to things like emergence. Seems like it's somewhat related to the things you're thinking about.
Posted by: Aaron at September 1, 2007 11:39 PM
First of all, I am familiar with "More is Different." (I refer to it as the start of complex systems thinking in my "Emergence Explained" paper.) That was 35 years ago. Are you referring to something more recent? I didn't see anything else on your "Things to read" page by Anderson. I don't know anything about renormalization group theory. Is there something available for a non-physicist?
To get to your main point, I think there are a number of issues. The first is whether we can build a computer that will act like a human being, i.e., strong AI. In one sense that's trivial because nature already did it. At a slightly more sophisticated level, we would like to understand more about how we work. But I think we'll eventually get there.
Neither of these deal with subjective experience as such. I can write a computer program now that communicates in the first person. "Now I believe this; now I see that; now I want such and such." I can't imagine being convinced that such a program (built using current day hardware and software) has subjective experience as I understand it. (Are you saying you would?) I may be convinced in the future. This isn't to say that we won't eventually (and probably sooner rather than later) understand what it takes to have subjective experience. But I don't think we have much of a clue these days. (Do you really disagree with that?)
What I find particularly strange about your reply, though, is that you wrote it at all. It's an attempt (successful; I'm not complaining about that) to communicate ideas in your mind to me. Ideas are subjective experience. So you really must believe that you have subjective experience — and that I do also. Subjective experience in that sense is not just the contents of registers. It's experience. The distinction that's often made is between a being with subjective experience and a zombie. We have an intuitive sense that one could build a zombie or robot that is behaviorally identical to a human being (strong AI) but that doesn't have subjective experience. Perhaps that will turn out not to be possible. But were it possible, the intuitive distinction inherent in the contrast between such a zombie or robot and a human being is what characterizes subjective experience.
Do you understand the distinction I'm pointing to? Do you agree that there is a distinction? If so, then no discussion of registers will convince me that a machine has subjective experience. What is it, for example, about your three conditions that generates subjective experience? Why won't a machine that has the properties you describe still be just a machine?
We can certainly build such computers now. Will they all have subjective experience? I don't see why. Regarding your point 2, depending on the extent to which the machine's "continued existence" depends on its real-time reactions, many of them may not "continue to exist" (whatever you mean by that) for very long. Your point didn't require that the machine continue to exist for a "long time", only that its continued existence depend on its reactions. Are you saying that any such machine will have subjective experience as long as it exists? Why do you say that, and how are such machines different from the real-time embedded systems we have now?
Posted by: Russ Abbott at September 2, 2007 11:24 AM
Hi yet again Aaron,
I've posted, as a short blog piece, what now seems to me the essence of my objection to discussions of the non-existence of qualia.
Posted by: Russ Abbott at September 2, 2007 05:27 PM
Yup, I was referring to Anderson's "More is Different" paper, in which he sketches out how the renormalization group (RG) theory gives a nice way of understanding how complexity can emerge from simplicity. I'm not an expert on RG theory, although there are people at SFI that I think might be. My general understanding is that at this point, RG has several very nice theoretical (and mathematical) results at the lower end of the complexity scale, and only a nice story for conceptually extending those results to the higher end of the scale. I think it's a major open area to understand, in detail and for a given level of system complexity, how its "physics" arises from the physics of the next level down (so to speak).
As for subjective experience, I think our differences may come down to a difference in the assumptions we're willing to make about the world. Let me explain.
In the article that I originally referenced, "qualia" is defined like so:
In a recent paper, Michael P. Lynch, a philosopher at the University of Connecticut, entertained the idea of a “phenomenal pickpocket,” an imaginary creature, like Apollo the thief, who distracts your attention while he removes your qualia, turning you into what’s known in the trade as a philosophical zombie. You could catch a ball, hum a tune, stop at a red light — act exactly like a person but without any sense of what it is like to be alive. If zombies are logically possible, some philosophers insist, then conscious beings must be endowed with an ineffable essence that cannot be reduced to biological circuitry.
Which actually sounds a lot like your argument. To put it bluntly, what I think you (and Lynch) are suggesting is that we humans have souls, and that they are what have (or give us) subjective experience. Dennett's comment about qualia, I believe, is a criticism of this perspective, because he's a materialist and doesn't believe that non-physical souls are necessary for any phenomena in our universe, including what we call "consciousness." I think that you may agree with Lynch and thus accept the notion that it's possible (in theory) to construct such a zombie. This would mean that you probably also accept the notion that there's something non-physical about your/my subjective experience. If so, then that's where we differ.
Personally, I'm sympathetic to Dennett and the other materialists here. I don't think it's possible (in theory) to construct such a zombie because I don't think there's anything non-physical about humans: constructing a machine that is behaviorally identical to a human is equivalent to constructing a human. (This may have been Turing's point of view, as well.)
Here's a thought experiment that I think might illustrate my point. Consider a day far in the future in which humans have advanced technologically (etc.) to the point that we can create arbitrarily complex clumps of matter (of roughly human-size). Obviously, such technology could be used to create a working teleportation device. When you step into the transmitter end of this device, it scans your body, writes down the complete physical structure of your body, vaporizes you, and transmits the recorded information to the receiver end of the device. At the receiver, raw elements are then assembled into an exact physical copy of the old you, all the way down to the quantum information. Is this copy a zombie (in the sense you defined) or not? I would say that it is equivalent, in every way that matters, to the vaporized you, and that it has as much subjective experience as you or I. If you don't buy this argument, then consider that this exact process already happens, albeit on a very long time scale, in which individual atoms in your body are replaced such that not a single atom has been there for more than 7 years or so.
As far as I'm concerned, if you are willing to assume that there is something non-physical about human subjective experience, then your zombies may indeed already walk among us. If you are not willing to make that assumption but still insist that subjective experience is a real phenomenon, then (a) zombies cannot exist and (b) machines with subjective experience equivalent to our own must be possible. Of course, there might be no such thing as subjective experience and we're all actually zombies. Given the monumental stupidity of humanity in the past and present, this possibility is at least consistent with the evidence!
Posted by: Aaron at September 2, 2007 10:34 PM
Now I'm confused about where we are. Here's what I think is wrong with Dennett's position: with respect to souls, it's too easy to dismiss someone as a dualist. I never made that argument. Please don't misrepresent me.

As I said before, I don't know whether it is possible to build something that is behaviorally indistinguishable from a human being but that doesn't have subjective experience. I'm still not sure if you do. If something like that could be built, it would be a philosophical zombie. I think it's an open (and empirical) question whether such zombies are technologically feasible. What's your position? In some sense that's the Turing test challenge. Do you have a prediction about when it will be met? Do you have an entry for the Loebner prize?

With respect to a teleportation device, I see no reason (other than technological failure) that the teleported copy wouldn't have the same sort of subjective experience as the original. Why shouldn't it? More to the point, do you have subjective experience? If so, then what are we debating? If not, I'd be very interested in finding out how you work!
Posted by: Russ Abbott at September 3, 2007 12:40 AM
What Lynch and Russ are saying is (I believe) that we do have subjective experiences, and that one need not posit a soul to explain this. I think Searle explains (and expounds) this best in this book; the note on the page linked is a discussion between Russ and me, and I just updated the note to clarify my stance. In summary, I think consciousness has to do with emergence, which just happens in some systems with certain properties, and also (as Aaron mentioned) with self-awareness, and it probably evolved because it confers advantages, though I'm wary of making evolutionary "just so" stories. The topic of levels of abstraction and emergence gets back to my comments in the other thread on this blog.
Meta-discussion: One is so used to Cartesian dualism that attempts to introduce subjective experience are assumed to be attempts to introduce a soul and not attempts to explain them mechanistically.
I really fail to see how zombies and machines with subjective experience cannot co-exist. One could (using my definition) make a machine in which complexity sufficient for emergence (in the sense defined in Alex Ryan's paper) exists along with self-awareness (in the sense that Aaron uses), and also machines which simulate at least the self-awareness part to such an extent that they appear "conscious"* to you and me (and are thus zombies). But I accept point (b) (difficulties and questions of emergence and its necessity aside).
*A label I attach to real-world phenomena, not to anything beyond explanation or anything non-material... which brings me to another point: consciousness is very much a dynamic process of brain parts. This ties in nicely with definitions of self-organisation (and with the fact that we don't have zombies in the literary sense :)
Posted by: Matthew Berryman at September 3, 2007 12:45 AM
Here's a post you can use to argue that I've gone off the deep end.
Posted by: Russ Abbott at September 3, 2007 05:16 PM
Aaron, what about your reply to my previous comment?
Do you have subjective experience? If so, then what are we debating? If not, I'd be very interested in finding out how you work!
Posted by: Russ Abbott at September 3, 2007 10:28 PM
I'm still interested in my previous question about whether you have subjective experience.
But in the meantime, I posted a blog entry about what it might mean to respond to your original request to define qualia.
Posted by: Russ Abbott at September 3, 2007 11:58 PM
- I don't think of qualia, as defined on your blog, as being the same as subjective experience (see below), and maybe this is part of why we're having such difficulty understanding each other.
- I don't think it's possible to build a machine that is behaviorally identical to a human being but which also lacks what I call subjective experience, or what you call qualia. (What it is like to experience any sensory input depends on the hardware used to experience it, so by that measure, every computational device attached to a sensor has qualia, albeit of a kind that is unlike our own.)
- I don't know when humanity will construct a machine that is behaviorally identical to a human; certainly we can't do it now and I don't foresee a time when we can. My crystal ball isn't that good at predicting the future, but I don't expect to see something like that during my lifetime.
- If you can define "subjective experience" in a way that does not boil down to saying "you know what I mean", then I can tell you whether I think I experience that particular thing or not. Until you can give me such a definition, the best I can say is that I certainly seem to experience an almost-real-time internal narrative and stream of sensory input at a location I think is roughly between my ears. Unfortunately, whether this is anything like what you mean by "subjective experience", I have no idea, and am sad to say that the more you write, the more I doubt that I understand what you mean. What's your "subjective experience" like?
- I'm willing to consider criticisms of and alternatives to my three preconditions for a machine (biological or not) to experience something like what I'm calling subjective experience. Perhaps they're incomplete or even missing the point, but they're a start for me. I'm not content with the "you know what I mean" definition, as I clearly do not know what you mean.
- The thing about my three preconditions that satisfies me (so far) is mainly the third point: limited self-awareness. The first and second points mainly make for something like an autonomous agent in which there is a real cost to poor predictions, and where perfect predictions are not possible. Were a machine's experience of the world (or itself) perfect, I think that would be more like objective experience than subjective experience, and thus probably a category into which I would place our current computer programs. Basically, I think having perfect knowledge of state precludes any kind of subjective experience. So, I think only a distributed system could possibly have subjective experience because only a component in a distributed system can have limited knowledge of the full system's state. Combine that with a learning algorithm that learns about both the external world and the other components, and I think I'm willing to call this activity a subjective experience. I'm not claiming that its experience of its world will be equivalent to, or even remotely like, my own, but I do think that it would qualify as a subjective one.
- Having written the above, it seems possible now that any old learning algorithm may not be sufficient. That is, it might be necessary for the algorithm to be what I might call a categorical one, in which there is some preconceived notion of categories like self (internal) and other (external) in the way that it stores its information and makes its predictions.
Posted by: Aaron at September 4, 2007 12:10 AM
Hi Aaron,
I think we're stuck, but I've made some comments to your comments. (To save space, I've left out the points for which I have no comment.)
- I don't think of qualia, as defined on your blog, as being the same as subjective experience (see below), and maybe this is part of why we're having such difficulty understanding each other.
Can you tell me what the distinction between "qualia" and "subjective experience" is for you?
You'll probably withdraw your previous statement, but what do you mean that "every computational device has qualia"? What is it like for a photon sensor to sense photons? That is, how does a photon sensor experience the sensing of photons?
I suspect that my subjective experience is probably a lot like yours. One of the difficulties we face in this discussion is the impossibility of anyone knowing anyone else's subjective experience. (That's why it's called subjective.) But that's a very old problem, and I don't see any way around it. The only way to proceed is to suppose that since we are all humans we have comparable subjective experiences. I don't think there is anything more that anyone can say given our current state of knowledge about subjective experience.
Alternatively, can you tell me what a definition of "subjective experience" would look like? You've seen my recent blog entry on definitions. Which form of definition would you like? A definition that defines "subjective experience" in terms of other (undefined) words will always be open to the criticism that it relies on "you know what I mean." One that defines "subjective experience" axiomatically in terms of other (undefined) primitives is too hard—and probably wouldn't satisfy you anyway. So I doubt that I'll be able to satisfy you either way. But I would like to know what you are looking for in a definition of "subjective experience."
Here's where you lose me. What do you mean by preconditions? Are you saying that you hypothesize that subjective experience won't exist in entities that don't also have your preconditions? I would take that as an empirical question. Or are you saying something else?
My guess is that you are saying that you won't agree that an entity has subjective experience if it doesn't also have the properties you list. That may be your start on defining what you want the term "subjective experience" to mean. That seems to me like the wrong way to go about it. For one thing, it seems arrogant. What right do you have to declare whether something else has subjective experience just because it does or doesn't have conditions that you lay out? If your answer is that it's just a definition, then it's an arbitrary definition and nothing more than an abbreviation for your longer statement. Secondly, the definition uses lots of terms that you would probably have a very difficult time defining. It's not clear to me that the definition makes the situation any clearer. I commented on some of those difficulties earlier.
I don't know what you mean by "limited self-awareness." Do you mean a limited form of subjective experience? If so, isn't this circular?
Posted by: Russ Abbott at September 4, 2007 11:32 AM
This is related to the notion that a photon sensor has an experience of sensing photons. On my blog you wrote about "the experience that a machine has as electrons run through its wires." Do you really mean that as it sounds? It sounds like panpsychism. Can you tell me what you have in mind as to what that experience is like? As I said on my blog, we may be forced to something like a scientifically acceptable version of that. But I'm surprised to see you accepting such a position so easily. Is that really where you are, that it makes sense to say that computers experience electrons running through their wires? Or is this (as I suspect) just a figure of speech for you and that you don't really think that computers have an experience as electrons run through their wires?
Posted by: Russ Abbott at September 4, 2007 11:48 AM
- My understanding of the term "qualia" is that it refers to the particularity of a sensory stimulation that is unique to the sensing organism. Thus, when I ponder the question of whether you perceive the same thing as I do when your eyes receive photons in the range of 625–750 nanometers, I am pondering a question of qualia. That is, I am pondering how different the particularities of your sensory stimulation, as perceived by your attention, are relative to my own. In contrast, my understanding of the term "subjective experience" is that it refers to the sensation of having a seat of attention (or seat of consciousness, if you prefer that phrase) that perceives the world as a kind of real-time narrative or movie-like phenomenon. So, these things are different in the sense that the former refers to how my sensory perceptions differ from yours (an inherently relative phenomenon), while the latter refers to my having an egocentric view of the world. I'm not sure which of these you mean, or whether you mean either of these at all, when you say qualia = subjective experience. Can you clarify? (In any way you like.)
- If you could give me a "you know what I mean" definition of subjective experience that evokes the response from me of "yes, I know what you mean", then I would consider that progress at this point.
- I believe that my sympathies lie on quite the opposite end of the spectrum to what panpsychism seems to suggest. For me, "mind" is nothing more than a linguistic construction (a communication convenience, if you will) for a (possibly ill-defined) particular set of physical phenomena that humans associate with our self-awareness of our internal brain processing. My opinion is that these phenomena are perplexing mainly because of human egotism and arrogance, and that our confusion will be greatly alleviated if we ever figure out how to connect brains directly.
- As a materialist, I reject the notion that there's anything about human existence that isn't already explained by the laws of physics. At the very least, I reject the notion that minds are first-order objects in the universe (by first-order I mean of the same level of fundamental-ness as objects like atoms), although I suspect that I would reject giving the idea of a "mind" any status other than that of epiphenomenon.
- What if I claim that my subjective experience is completely different from yours? (Certainly our inability to communicate clearly with each other would support this claim.) You certainly can't disprove me, since you can only assume (axiomatically) that our subjective experiences are similar. What do we gain from a discussion of a phenomenon for which either of us can assert arbitrary properties that the other cannot dispute? If we take a behaviorist perspective, then it's clear that many people have extremely different subjective experiences from each other, e.g., individuals with certain kinds of brain damage, autistics, etc.
- Russ said: "What right do you have to declare whether something else has subjective experience just because it does or doesn't have conditions that you lay out?" Are we talking about ethics and human rights here, or are we talking about a physical phenomenon? If the latter, then let's leave suspicions of arrogance aside and focus on trying to understand the phenomenon.
- Sadly, I'm close to giving up on this conversation thread. The longer it goes, the more I am convinced that what I say makes little sense to you, and I know that I find much of what you say confusing. At best, it's been pleasant for me to consider more carefully my own ideas about "consciousness", but I had originally also wanted to understand your ideas. Originally, it seemed that you'd wanted to understand my opinions, as well, but I'm not sure all this discussion has accomplished that. Has this discussion been useful / valuable to you at all?
Posted by: Aaron at September 4, 2007 03:22 PM
I think we are getting somewhere. You said,
For me, "mind" is nothing more than a linguistic construction (a communication convenience, if you will) for a (possibly ill-defined) particular set of physical phenomena that humans associate with our self-awareness of our internal brain processing.If I may paraphrase in an attempt to simplify, you are saying that mind refers to
physical phenomena that we humans associate with self-awareness (of a certain sort).
In other words, you are saying that mind is a word for certain physical phenomena. Which physical phenomena? The ones that "we associate with" a certain sort of "self-awareness."
I hope I haven't jumbled your meaning in making these transformations. I'm trying to get the heart of the issue.
If I've been fair, and if you would tell me what you mean by "we associate with" and "self-awareness," I think we'll make some progress.
Posted by: Russ Abbott at September 4, 2007 11:07 PM
I hereby declare the debate closed... I really don't think it's going anywhere, though I don't think the positions are really quite as far apart as they seem; we all agree that we have experiences of ourselves, and that it's all material... The point of disagreement to me seems related to emergence, and on that I do disagree slightly with Aaron's position that the laws of physics /explain/ everything in human existence; rather, I would say they constrain what is possible. I think there are certainly emergent properties which aren't reducible to the laws of physics, which isn't to say that there is anything non-material going on... it's like watching a game of Go as opposed to following the rules. Or maybe we're all just confused now. :)
Posted by: Matthew Berryman at September 5, 2007 03:44 AM
Matt, I'm happy for the laws of physics to explain or constrain human existence, so long as there's nothing left afterward that is attributed to the non-physical or non-material. I'm certainly willing to agree that a lot of the complex behavior of the human brain is an emergent phenomenon. I guess I would just argue that, if worked out carefully, the emergent properties must, in principle, be derivable from the underlying physics.
Russ, it seems to me that self-awareness, for a machine like the one I've described previously, need be nothing more than a feedback loop in the learning algorithm, where the current internal state is part of the input to the algorithm. (Obviously, learning algorithms manipulate their internal state, but I'm not sure there are many that take their internal state as input in a way that is equivalent to external stimulus.) The learning algorithm may need to be constrained to those that have some way of labeling categorically, and thus differentiating between, self (internal) and non-self (external) state variables. That is, the algorithm is learning about two systems simultaneously, itself and the external world. (This is another point that departs from the learning algorithms I think we're most familiar with.) So, "mind" could then be the label that the learning algorithm gives to ("associates with") the "self" set of state variables and, I suppose, their history.
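As a rough illustration of the kind of loop being described (a sketch only, with invented names, and no claim that this toy has any experience at all), here is a learner whose input explicitly includes its own labeled internal state alongside the external stimulus, so that it learns about two systems at once:

```python
import random

class SelfModelingLearner:
    """Toy sketch: a learner whose input includes its own labeled internal
    state ('self') alongside external stimulus ('other'), so it learns
    about two systems simultaneously. All names are illustrative."""

    def __init__(self, lr=0.1):
        self.lr = lr
        self.model = {'self': 0.0, 'other': 0.0}  # one predictor per labeled system
        self.last_error = 0.0                     # internal state, fed back as input

    def step(self, stimulus):
        # categorical labeling: which signals are "me" and which are "the world"
        inputs = {'other': stimulus, 'self': self.last_error}
        total_error = 0.0
        for label, x in inputs.items():
            err = x - self.model[label]          # prediction error for this system
            self.model[label] += self.lr * err   # simple running-average learning
            total_error += abs(err)
        # the feedback loop: the machine's own error signal becomes part
        # of what it must predict on the next step, just like a stimulus
        self.last_error = total_error
        return total_error

learner = SelfModelingLearner()
for _ in range(1000):
    learner.step(stimulus=random.gauss(0.0, 1.0))
# model['other'] tracks the stimulus; model['self'] tracks the learner's
# own typical error signal -- a crude "self" model
```

The only departure from an ordinary online learner is that the "self" channel is treated as just another stimulus to be predicted; "mind", in the terms above, would be whatever label the algorithm attaches to that channel and its history.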
Posted by: Aaron at September 5, 2007 08:01 AM
As I explain in "Emergence Explained", there are no causes beyond those of fundamental physics. Nor is there a mystery about the objective reality of epiphenomenal entities. Aaron, I think you disagree and claim that epiphenomena are in some sense imaginary. I claim that they're not; they are objectively real—what Searle called ontologically irreducible. So although this may be a disagreement, I think it's clear what the disagreement is.
What I think we have been unable to communicate is what in my opinion is still the unsolved problem of subjective experience. This is what David Chalmers more than a decade ago called the "hard problem of consciousness."
- I have experience;
- you have experience;
- the having of experience is associated with a level of abstraction;
- all levels of abstraction are physically implemented;
- but we have no way of explaining how the having of experience is connected with a physical implementation.
That's the zombie issue. We have not been able to think of a way to construct an entity that we are convinced will have subjective experience other than by replicating one that already has it. We simply don't understand the mechanism that produces subjective experience. Every construction that we have been able to come up with might just as well be a zombie—in the technical sense. Why is that, and why can't we think of a way of constructing something (whose design we understand) that we feel confident will have the same sort of subjective experience as we do?
What's been frustrating for me in this discussion is that, as far as I can tell, Aaron does not acknowledge that there is a problem to explain, i.e., he holds that there is no "hard problem of consciousness."
You said
I don't think it's possible (in theory) to construct a zombie because I don't think there's anything non-physical about humans: constructing a machine that is behaviorally identical to a human is equivalent to constructing a human.

That's fine with me. The issue you have not been willing to face is this.
- Computers as we know them do not have subjective experience. (Do you agree?)
- Human beings like you and I have subjective experience. (Do you agree?)
- What's the step that takes us from one to the other?
Posted by: Russ Abbott at September 5, 2007 12:19 PM