Urban75

Can Evolutionary Theory Explain Human Consciousness?

Knotted said:
I might not be a philosophical zombie but everybody else is. Nobody has feelings but me. Everybody else merely behaves as if they have feelings.

I grew out of this when I was 4 :confused:
 
I thought that's what the rest of my post was doing but no matter...

8ball said:
I think you'll need to back that up a bit rather than expect everyone to immediately take it seriously.
Unless there's some sort of ontological gap between mind and body, how do you make sense of the notion of private subjective experience? Or to put it another way how can you preserve the inner/outer distinction which the notion of qualia presumes?
 
I really don't follow

Knotted said:
If we use the word 'qualia' with any consistency then the above is not just a contention it REALLY IS TRUE. Only I have qualia. Only I am subjective. I have nobody else's feelings.
Er, why? :confused:
 
nosos said:
Unless there's some sort of ontological gap between mind and body, how do you make sense of the notion of private subjective experience? Or to put it another way how can you preserve the inner/outer distinction which the notion of qualia presumes?

Ah, right - I think I'm seeing roughly where you're going now.

Though you seem to be pointed in the general direction of sixth-form solipsism.

The ontological gap doesn't necessarily have to be between mind and body, but there seem to be enough ontological distinctions to be getting on with without positing ghosts in machines.

Ok, you may, if you choose, end up positing such things, but I don't think you're forced into such a position for lack of ontological gaps.
 
nosos said:

The reason is that I was considering subjective feelings in an odd way. I was considering something subjective as something subjective. The consideration of the experience is attempting to be a perfect mirror of the experience. In this sort of consideration what is it to say that somebody else has an experience? I have nothing to mirror.

The point is that when we consider qualia, we are not considering subjective experience. We are considering the ordinary consideration of subjective experience - that is, the consideration of subjective experience as a universal. But then any arguments about qualia lose their force - we now only have to explain how we consider things rather than how we feel something. The problem here is meaning, not qualia.
 
This 'qualia' word seems to be a lot of hifalutin talk when it's a feature of most people's cognitive environment that they recognise readily enough when you talk about it a bit.

If we were to talk about 'thoughts' (another feature of internal cognitive environment) I don't think there would be much argument as to what was meant, even though 'thoughts' are much more varied and complex in their character than mere qualia.

It could well be something I'm not getting but it looks to me like reading too much philosophy might actually be impeding some of the thinking round here.
 
8ball said:
This 'qualia' word seems to be a lot of hifalutin talk when it's a feature of most people's cognitive environment that they recognise readily enough when you talk about it a bit.

If we were to talk about 'thoughts' (another feature of internal cognitive environment) I don't think there would be much argument as to what was meant, even though 'thoughts' are much more varied and complex in their character than mere qualia.

I would make the same arguments. In fact I think thoughts are a type of qualia (in so far as it makes sense to talk about this sort of thing at all).

8ball said:
It could well be something I'm not getting but it looks to me like reading too much philosophy might actually be impeding some of the thinking round here.

Now that's a much more sound thought. Give us a tangible problem!
 
Knotted said:
I would make the same arguments. In fact I think thoughts are a type of qualia (in so far as it makes sense to talk about this sort of thing at all).

Ah, right - think you're on the same page as me with this, then.

I still go with my original answer to the original question (that the answer is no) but I don't know what the extra paradigm is that will unlock the explanation. :confused:

Sometimes I think we're going to nail the issue at some point, probably entirely by accident, and it will make a great number of people very unhappy.

I'm not being helpful, I know. ;)
 
To put it another (easier) way. Arguments about qualia assume an unspecified idea of the self. It is not surprising that the conclusion is that the self cannot be specified.
 
gurrier said:
Actually, the evidence suggests that single neurons do encode particular semantics - you may have a single 'cup' neuron, even a 'Bill Clinton' neuron, which maps precisely to the concept (i.e. when you are thinking of a cup, the neuron is excited; otherwise it is not).

No it doesn't.
 
Fruitloop said:
a: If this is all consciousness is then the word is superfluous. This kind of reasoning earned Dennett's 'Consciousness Explained' book the moniker 'Consciousness Ignored'. Personally I have no idea why qualia are necessary for long-term planning - there are desires and aversions, a semantic map of some kind, a symbolic order, and an abstract self-object; all these are quite adequate for long-term planning (unless by planning you mean something radically different to its common usage).
What do you need consciousness to have? Why can't qualia simply be messages?

Fruitloop said:
b '...which the brain passes to the consciousness...'??? What fresh madness is this? :eek: This is completely incompatible with the identity theory you were expounding earlier, besides which I have no idea what it could mean. Either the brain is the consciousness, in which case what is being passed to what, or it isn't and you are back with the hard problem. Or some kind of substance or property dualism, neither of which is particularly attractive.
Everything I have said is internally consistent - you will have to point out a specific contradiction if you want to disagree.

The idea that the mind can be sub-divided into conscious and sub-conscious is very old and not at all controversial. It is also completely established that a great deal of the information that is processed by the brain is not accessible to the conscious mind. A very great deal of experimental evidence has been produced which strongly suggests that the conscious mind receives a high-level "executive summary" of the situation and does not have direct access to the raw stimulus data. The sub-conscious parts of the mind map stimulus inputs into low-level responses, while also combining to map their data into much higher levels of abstraction suitable for the consciousness's needs.

None of this is conjectural or speculative at all. It's stuff that we know - through rigorous experimentation - about how the mind works. We also have a very good handle on the low-level mechanisms in the brain which underlie much of the mind's sub-conscious processing. We can actually trace the neuronal circuits which map retinal excitations to lines, curves and shapes to be handed to higher levels for processing.

So, in short, if you think that the idea of the brain's lower levels passing messages (of some sort) to consciousness is ridiculous, you are dismissing virtually all of the evidence that science provides without offering even an argument.

Fruitloop said:
'c', I'm afraid, is bunk. It's trivially easy to imagine a system with a basic set of desires/aversions and a semantic system for overcoming obstacles to achieve them.

The entire history of artificial intelligence research flatly contradicts that assertion. It's the sort of thing that people used to imagine back before we tried to implement such 'trivial' solutions. It turns out that emulating outward human behaviour, even in limited domains, is not trivial at all - it's mind-blowingly, staggeringly complex. It is not at all clear that it is even possible unless the emulation is given something remarkably similar to our conscious mind.
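For what it's worth, Fruitloop's "trivially easy to imagine" system really can be sketched in a few lines - desires as goal states, aversions as states to steer around, and a breadth-first search standing in for the "semantic system for overcoming obstacles". Every name below is illustrative, not anyone's actual proposal from the thread; the point of the sketch cuts both ways, since a toy like this is a very long way from outward human behaviour.

```python
# Toy desire/aversion planner: breadth-first search over a semantic map.
from collections import deque

def plan(start, desires, aversions, neighbours):
    """Return the shortest path from start to any desired state,
    never entering an averted state, or None if no such path exists."""
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] in desires:
            return path
        for nxt in neighbours(path[-1]):
            if nxt not in seen and nxt not in aversions:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # no safe route to any desire

# A 4x4 grid world: the agent wants (3, 3) and dreads (1, 1).
def grid_neighbours(cell):
    x, y = cell
    steps = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return [(a, b) for a, b in steps if 0 <= a <= 3 and 0 <= b <= 3]

route = plan((0, 0), desires={(3, 3)}, aversions={(1, 1)},
             neighbours=grid_neighbours)
print(route)  # a shortest route (7 cells) from (0, 0) to (3, 3) avoiding (1, 1)
```

The obstacle-avoidance here is exactly the "overcoming obstacles" Fruitloop describes, but nothing in the sketch needs to feel like anything - which is rather the zombie question in miniature.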

Fruitloop said:
No need for it to know what it feels like to run your hand along the surface of the desk for such a system to function. The former entity is precisely the 'philosophical zombie'.
Again, your assertion is unsupported by the evidence. If our zombie is to act like a human, it needs, at a very minimum, to be able to differentiate between different types of surface. Each different type of surface has to feel different to it.

Fruitloop said:
With regard to d: it doesn't feel like anything for a computer to implement a 'data model'! That's why we're conscious and they aren't. This is precisely the point that you seem unable to grasp.

I am quite able to grasp your point, I just feel it is confused. You are failing to appreciate the overwhelming complexity of the information that consciousness, or any emulation of it, needs to process.

The basic problem I have in conveying my point is that I do not understand what you mean by qualia. To my mind, my definition is sufficient to explain (even to require) the emotional and sensational richness that our consciousness experiences. I also think that, given the required complexity, any synthetic emulation of consciousness as a data structure would have a similarly rich range of emotional and sensational experiences.


Fruitloop said:
Finally, please note that the claim that function is all that needs to be explained is in no sense a scientific claim - it's a philosophical claim through and through. Therefore it needs a philosophical justification, and what I've seen so far from you has been very, very weak.

I'm not much interested in philosophy to be honest. Although I acknowledge that my argument about the nature of consciousness is speculative and controversial, it is firmly based in, and entirely consistent with, the scientific knowledge of how our brains work and the problems of emulating them. If I must, though, I suppose the philosophical position which underpins my case is the following:

Our conscious minds have rich experiences of 'qualia' because our sub-conscious mind does an awful lot of processing and delivers them to our consciousness together with a complex set of associated emotions and concepts. The consciousness needs to be able to differentiate between a vast, vast range of possible combinations of memories, associations, emotions and so on. Since the consciousness is totally unaware of its own neuronal wiring and internal means of representing data, the subconsciousness has to translate the richness of sensory data into a format that is processable by the consciousness, and this is what gives us the infinitely subtle richness of how things feel subjectively.
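To make the shape of gurrier's picture concrete (and only the shape - the names and fields here are entirely made up for illustration), the "executive summary" idea amounts to the conscious level only ever receiving a compact, pre-digested structure, never the raw stimulus data itself:

```python
# Toy "executive summary" message: the sub-conscious reduces raw input to
# a high-level percept; consciousness never sees the raw data at all.
from dataclasses import dataclass, field

@dataclass
class Percept:
    """What the sub-conscious hands upward: abstractions, not raw data."""
    label: str                                      # high-level category
    emotions: dict = field(default_factory=dict)    # associated feelings
    associations: list = field(default_factory=list)

def subconscious(raw_pixels):
    # Stands in for the enormous amount of processing gurrier describes;
    # note that raw_pixels is consumed here and never passed onward.
    return Percept(label="cup",
                   emotions={"comfort": 0.7},
                   associations=["tea", "morning"])

summary = subconscious(raw_pixels=[0] * 10000)
print(summary.label)  # "cup"
```

Whether a structure like this deserves the name "qualia" is, of course, exactly what the two posters are disagreeing about.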
 
kyser_soze said:
'Special' seems to be the general term, if not the absolute refusal to say we're animals, on the basis that it causes issues about how we treat other humans... of course, carry on down that avenue and you end up with either ideological vegetarianism OR the classical split where humans are superior to all other biological entities, neither of which I find especially satisfying as positions to take.

M8, you really don't understand, so I conclude you really need a good talking to! One of these days, when both of us are healthy and sane... :D Choose your weapon! :p :D
 
Knotted said:
Hegel's philosophy is the philosophy of the art critic who thinks he is more important than the artist.

He is a philosopher that comes after the artist and makes sense of it all... A bit of a difference...:cool:

As for Newton - really, you should read it carefully... And see how it's questioned afterwards, too... by physicists, let alone philosophers...:cool:
 
gurrier said:
You see, my argument is that pain, and other emotions, are inseparable from the experience of what they are like - emotions are the experience of having them.

phildwyer said:
And that is your fundamental error, from which your multifarious other errors all flow. Human beings experience through concepts. Human beings are not animals. The ethical implications of treating human beings as animals are clear to everyone except those blinded by dogma, such as yourself.

Tbh I agree more with gurrier than I do with phil on this one.

Although there is some truth in the idea that humans experience through concepts, there is also a meaningful notion of non-conceptual experience, and concepts don't make a lot of sense without non-conceptual content to bite on.

Intuitively, pain is inseparable from the experience of what it is like to feel pain, but arguably, conceptualising it as pain adds an extra dimension to the experience. The problem with what gurrier says isn't that the quote above from gurrier is false - I think it's unarguable. The problem is that it makes no sense for him to claim that on his account. You can't have it both ways.
 
kyser_soze said:
I prefer 'more complex' or 'different' - still value judgements, but one less loaded, I think, than 'special'...

We are very special! The most special living being on Earth!!!:p :cool:
 
goldenecitrone said:
I don't really understand this 'We're not animals' idea. What are we then? Sunbeams?

To be a living being does not immediately equate us with animals. Nope, sorry!

We had this debate, from which some have learned NOTHING[!!!] b4.

See a debate on Social Darwinism, page 2 or 3 now, I think... ;)
 
Is it thicko creationism?

Is it the 'Giraffe Defence'?

Or something else entirely?

Let's find out . . .
 
User 301X/5.1 said:
I have read some theories about the dimension of time not being "real". As Einstein explained, time is not absolute and varies from observer to observer depending upon how they are accelerating in space.

I find time just as difficult to get my head around as consciousness itself.

What does consciousness mean without time? Would time exist without consciousness?

I reckon they must be the same thing (but that hinges on my own hunch that time does not exist without consciousness)

Can evolutionary theory explain consciousness?
would lead to.........
Can evolutionary theory explain time?
would then lead to......
What is evolution without time?
which I find a rather meaningless question because what is change without time?

to get away from the time problem perhaps we need to remember that the apparent time in our universe is not the only one. (other universes exist, some without time, some with time, some with time and conscious observers)

I think the only way evolutionary theory could explain it is to consider the "many worlds" approach and apply evolutionary theory to a multiverse of possibilities. It just so happens we are in a particular style of universe which is conducive to the production of conscious beings.

I reckon this is insightful.

Physics can't explain consciousness because structure and dynamics can only explain further structure and dynamics. Physics also has no account of why it happens to be "now", now. On the scientific account of time, all times are equally real.

The lack of explanation of these two things is almost certainly just a failure to explain one thing - the two things that aren't explained are the same thing, as you say.

And yes, I'm inclined to agree that it starts to make more sense once you start taking the many-worlds everett-wheeler theory seriously.

Which is good news for all of us.

Jonti's already seen the point I reckon.
 
<waits to see if this is just the usual quantum bollocks that comes out when a discussion gets bogged down . . . >
 
Demosthenes said:
there is also a a meaningful notion of non-conceptual experience

But "notion" = "concept," right? So surely the fact that there is a notion (a concept) of the non-conceptual proves my point that there is no non-conceptual human experience. In order to experience something we have to conceptualize it.
 
goldenecitrone said:
I don't really understand this 'We're not animals' idea. What are we then? Sunbeams?

We are animals with consciousness. No other animal has consciousness. Our consciousness is not part of our animal nature. We are more than animals.
 
As the brain is made of neurons and the firing of neurons is an electrical phenomenon, and consciousness seems in general to be a property of electrical nervous systems, what these firings actually are is highly relevant, and can only be understood by understanding quantum mechanics.

In fact, nothing can be scientifically understood without understanding the subatomic world, and it certainly seems that the subatomic world can't be understood without understanding consciousness. (See chapter 3 of The Mind's I, Hofstadter and Dennett - a chapter they include to show an alternative to their computationalist view.)
 
gurrier said:
I'm not much interested in philosophy to be honest.

Coming from one who purports to be studying consciousness this is akin to a doctor declaring "I'm not much interested in medicine to be honest."
 
phildwyer said:
We are animals with consciousness. No other animal has consciousness. Our consciousness is not part of our animal nature. We are more than animals.

Right, so we are animals, but animals with consciousness. I think it's a bit of a leap of faith to say that no other animal has consciousness, but there you go.
 
Demosthenes said:
:p

As the brain is made of neurons and the firing of neurons is an electrical phenomenon . . .

Dude, two errors in the first sentence. I'm going to put this on the 'quantum bollocks' burner for now - no offence, like.
 