
Designing consciousness

obanite

If you were to construct an 'artificial consciousness', what features would you give it in order to demonstrate consciousness/self-awareness/sentience? In computer programming terms, what would the software's pseudocode look like? Would it be possible with a neural net alone, or would we need to simulate chemical 'flooding' & stimulus a la the hypothalamus and other parts of the limbic system? Is emotion superfluous or necessary?
 
I'm thinking along the lines of, you could know and comprehend that you are an independent entity. This would be part of self-awareness. But without emotion, would you care; why would it matter any more than any other fact about the world you're in? Or is sentience just a heightened level of survival-instinct - "this is me, I must protect me"?
 
Very good point; what is consciousness when stripped of any anthropomorphic hangovers?

Is a CPU/BIOS that knows about its operating temperature and automatically adjusts fan speeds/shuts down if it gets too high exhibiting some form of self-awareness? I'd say no because it's hard-coded.
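
Roughly what I mean by 'hard-coded', as a toy Python sketch (the thresholds and fan speeds are invented, not any real BIOS):

# a fixed stimulus-response table: every reaction was decided in advance by the designer
def thermal_control(temp_c):
    """Return (fan_percent, shutdown) from a hard-coded rule - no model of 'self' anywhere."""
    if temp_c >= 95:
        return 100, True      # critical: full fan, then shut down
    if temp_c >= 70:
        return 100, False
    if temp_c >= 50:
        return 60, False
    return 30, False

The system never represents that rule to itself, let alone revises it.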

Are babies self-aware? I remember vaguely when doing developmental psychology that there's a point at which children become capable of thinking of themselves in the third person. Is this a requirement for consciousness, or can something thinking purely in first person terms be truly sentient?
 
whilst it might be anthropocentric to base one's designs on the human brain.. it might not be a bad starting point.. after all nature is a lazy and efficient designer, so it's got to be worth stealing her ideas.. that is certainly part of the thinking behind most of the theoretically most interesting projects i know of at the moment.

For example, RobotCub - An Open Framework for Research in Embodied Cognition


although all the real money in AI research is provided by the military so the first thinking machine will probably be a terminator :(
 
Consciousness is increasingly looking like a very, very sophisticated con-trick that we evolved to deal with the more extreme reactions in our subconscious.
 
kyser_soze said:
Consciousness is increasingly looking like a very, very sophisticated con-trick that we evolved to deal with the more extreme reactions in our subconscious.

I would say it is more likely that our subconscious just makes us believe that consciousness deals with them.

salaam.
 
kyser_soze said:
Consciousness is increasingly looking like a very, very sophisticated con-trick that we evolved to deal with the more extreme reactions in our subconscious.

Could you elaborate please; what kind of extreme reactions? :)
 
Violent reactions primarily - there's a growing theory that consciousness exists as a control mechanism that helps keep us safe when humans are interacting with each other and the world (evolutionarily it's a peach - hom sap developed a mechanism that lets us second-guess our behaviour rather than just reacting to stimuli, giving us impulse control AND helping create an abstract model of reality that we can do stuff like forward planning in)
 
Well I dunno about that. It seems obvious to me that whatever the lowest level of semantic object that exists in the brain is, it can have either a compelling or an inhibitory effect on action. Still, I don't really see the advantage in making this into some big categorical differentiation like the cathexis/anti-cathexis thing in Freud.
 
obanite said:
If you were to construct an 'artificial consciousness', what features would you give it in order to demonstrate consciousness/self-awareness/sentience? In computer programming terms, what would the software's pseudocode look like? Would it be possible with a neural net alone, or would we need to simulate chemical 'flooding' & stimulus a la the hypothalamus and other parts of the limbic system? Is emotion superfluous or necessary?

You couldn't design a consciousness; it would be self-emergent, multi-referencing, with a degree of open-ended adaptation.
 
foreigner said:
You couldn't design a consciousness; it would be self-emergent, multi-referencing, with a degree of open-ended adaptation.

What do you mean by multi-referencing?

I disagree that something can't be designed because first time around it was self-emergent. Anything can be simulated by a Turing complete computer.
 
obanite said:
What do you mean by multi-referencing?

I disagree that something can't be designed because first time around it was self-emergent. Anything can be simulated by a Turing complete computer.

Artificial intelligence yes, but Synthetic intelligence?
 
To design it, we must understand it, and we're not even close.
However, by mimicking what we do know, it's possible we'll get an emergent/trained AI - but we won't know how it works.
 
And could you create consciousness in a box with no body, no need to protect itself, to find food and shelter, to survive?
 
foreigner said:
Artificial intelligence yes, but Synthetic intelligence?

What's the difference? :confused:

Crispy, I don't think we're a million miles away from understanding what it is and does. How it does it might be a bit trickier - but does the method actually matter so long as it produces the desired output? [black box consciousness?]

goldenecitrone - very good question, in fact vital. Is someone conscious under conditions of total sensory deprivation? I think they are, but consciousness needs a fuckton of stimuli in order to develop in the first place.

I wish I'd paid more attention in developmental psychology.

I wish I'd not failed psych and dropped it :D
 
goldenecitrone said:
And could you create consciousness in a box with no body, no need to protect itself, to find food and shelter, to survive?
No, I reckon our AIs will need to be 'brought up' just like newborn babies. This means senses, and a means of interacting with the world.
 
Artificial intelligence isn't intelligent, it just appears to be so. Synthetic intelligence is genuine intelligence, perhaps even sentience, perhaps even sapience; it would just happen (perhaps in the far-out future) to have been conceived in a computer program.
 
obanite said:
If you were to construct an 'artificial consciousness', what features would you give it in order to demonstrate consciousness/self-awareness/sentience?

decent chat-up lines?

In computer programming terms, what would the software's pseudocode look like?

cognitive networks are not programmed at all

Would it be possible with a neural net alone, or would we need to simulate chemical 'flooding' & stimulus a la the hypothalamus and other parts of the limbic system? Is emotion superfluous or necessary?

emotion and thought are the same thing.
 
Some early Artificial Intelligence research was paid for by the US Air Force. (OK, intelligence != consciousness, but bear with me.)

They presumably had long-term hopes of infinitely obedient self-flying planes and so forth.

I'd say that you only know your system is conscious when it goofs off a month's work because it's fallen in love, goes on strike for longer tea-breaks, and declares itself a conscientious objector.
 
they have their self-flying planes already, though. and AIs have been beating aircraft pilots in simulations for years, iirc.
 
obanite said:
Very good point; what is consciousness when stripped of any anthropomorphic hangovers?

consciousness is a cognitive network with both external structural coupling (ie. both "senses" and actual immersion and participation in a changing system that has energy being pumped into it) and internal structural coupling (ie. the processes of cognition - the ongoing feedback-looped changes in structure - are themselves affected by themselves).
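
as a toy picture only, in Python (the Hebbian-ish learning rule and all the numbers are mine, not a claim about real brains): a little network driven by outside input every step (external coupling), whose own connection structure is rewritten by the activity it just produced (internal coupling).

import numpy as np

rng = np.random.default_rng(0)
n = 8
W = rng.normal(scale=0.1, size=(n, n))           # internal structure: connection weights
x = np.zeros(n)                                  # current activity

for t in range(1000):
    sense = rng.normal(scale=0.5, size=n)        # external coupling: input from a changing world
    x = np.tanh(W @ x + sense)                   # activity shaped by structure plus senses
    W += 0.01 * np.outer(x, x) - 0.001 * W       # internal coupling: the structure is itself
                                                 # changed by the activity it just produced

the point being that whatever "program" there is ends up living in W, and W is never fixed.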

Is a CPU/BIOS that knows about its operating temperature and automatically adjusts fan speeds/shuts down if it gets too high exhibiting some form of self-awareness? I'd say no because it's hard-coded.

that's not really self-awareness but interestingly it is an example of a feedback loop. computer gets hot - fan speeds up - computer cools down - fan slows down. all the variables in this system orbit an attractor in a phase-space (no really) until conditions change. cognitive networks are built out of systems like this - a form of incredibly complex hierarchy of feedback loops inside loops inside loops inside loops.
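
here's that loop as a toy Python simulation (every constant is made up):

temp, fan = 80.0, 0.0                                  # degrees C, percent
for step in range(200):
    fan += 0.1 * (temp - 60.0)                         # hotter than the 60C target -> fan speeds up
    fan = min(max(fan, 0.0), 100.0)
    temp += 3.0 - 0.05 * fan - 0.05 * (temp - 25.0)    # heat in, minus fan cooling, minus losses to the room
    if step % 20 == 0:
        print(step, round(temp, 1), round(fan, 1))

from wherever you start, the pair (temp, fan) spirals in towards one point - that's the attractor in the phase space - until the load or the room changes.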

Are babies self-aware? I remember vaguely when doing developmental psychology that there's a point at which children become capable of thinking of themselves in the third person. Is this a requirement for consciousness, or can something thinking purely in first person terms be truly sentient?

you want to be careful here. i've forgotten his name but an American philosopher/psychologist made a real name for himself with this line of thinking a few years back. horribly, it was hijacked by media and far-right anti-abortionist lobbies as an example of the evil of secular science and philosophy and he was splashed all over BBC2 like taramasalata on an incredibly dry old piece of rye bread.

the argument runs, if abortion is ok, then it's reasonable to kill children as well. abortion is often justified on the grounds that the foetus is not yet conscious, that it is alive and reacts to stimuli (some would say "programmed" but then, they would) but only in the way a lower animal is alive and reacts. they do not have self-awareness.

but nor does the foetus after it is born. in fact human babies are more helpless, less able to act on instinct (instinct = inherited structural biases in the cognitive network which prompt action), and less able to process their senses than many other newborn animals. we kill and eat animals. therefore we could kill and eat babies. it does not matter what you do to children or animals because basically they are considerably less self-aware than yow, and therefore don't really suffer the way you do. altogether now ahhhhhh diddums.

my reaction to this is, fuck off you psychobiscuit, i'm a queer vegetarian and your silly babies-and-meat-eating arguments just slide off me like hot butter on the arse of Bernard Manning.

(i actually remember the day when i first thought about myself thinking. i nearly fell over.)
 
fudgefactorfive said:
(i actually remember the day when i first thought about myself thinking. i nearly fell over.)
One day, a computer will say this :cool:
 
computer says no

it's gonna be fab. roomfuls and roomfuls of drab grey boxes that don't appear to be doing very much but really, although nobody knows it, they are all screaming AAAAAAAAAH KILL ME PLEEEEASE KILL ME AAAAIEEEE IT HURRRRTS in a manner that nobody can perceive. :cool:
 
fudgefactorfive said:
Are babies self-aware? I remember vaguely when doing developmental psychology that there's a point at which children become capable of thinking of themselves in the third person. Is this a requirement for consciousness, or can something thinking purely in first person terms be truly sentient?

you want to be careful here. i've forgotten his name but an American philosopher/psychologist made a real name for himself with this line of thinking a few years back. horribly, it was hijacked by media and far-right anti-abortionist lobbies as an example of the evil of secular science and philosophy and he was splashed all over BBC2 like taramasalata on an incredibly dry old piece of rye bread.

the argument runs, if abortion is ok, then it's reasonable to kill children as well....

That'd be Peter Singer: Australian, currently at Princeton in the US.
 
i read once that humans are life-long foetuses. that's why we're all naked and big-headed and funny-looking. we are literally born "too early", because we'd pop our mothers otherwise, and she needs to have 2.4 kids, not just you. (you are allowed to kill her if you're the .4). we need some extra early-growing power to build the kit that runs the complicated job of forming full-on self-awareness.

imagine then if animals actually did pop out perfectly self-aware. it's horrible; there'd be small-talk.

MEDICAL STAFF: Push! Push! Push!
WOMAN: AAAAAAAARGH!
(comical "pop" noise)
BABY: Hello! How are you!
WOMAN: Oh great, thanks, well, you know, not too bad, what with giving birth to you 'n' all.
BABY: Oh good! Good! That's great.
WOMAN: Yeah. Um ... and how are you?
BABY: Oh fine. Y'know. Not been up to much.
(long pause)
BABY: So! Seen any good films lately?
 