
What If Consciousness Comes First?

Shadow Wolf

Certified People sTabber
What if I just become bored with my working computer?
For whatever reason, when disposing of computer parts they must be properly recycled. Even Lt. Commander Data, as beloved as he is, can't be buried like a human being. With that said, if something works, damn "planned obsolescence" and "needing newer and better": we don't live in a disposable world even though we have a disposable culture, and we do have finite resources even though we live like they are infinite.
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
We don't give moral agency to any particular neuron either. It is the way they interact and respond to their environment that makes us who we are. Our own neurons 'mandatorily' follow the electrochemical paths determined by the physical laws.

So, yes, there is a difference. The software *is* the difference.
You must then be purporting a magical step that differentiates mimicking software from mathematical software.
 

Polymath257

Think & Care
Staff member
Premium Member
What if I just become bored with my working computer?

If it is 'mimicking' human behavior and interactions to the extent we are talking about, it would be immoral to turn it off. That would be the same as 'getting bored' with grandma and killing her.

We are biological/chemical computers. The moral issues come from how the physical/chemical structures react with their larger environment. We value that complex level of interaction and call it 'consciousness'.
 

Polymath257

Think & Care
Staff member
Premium Member
You must then be purporting a magical step that differentiates mimicking software from mathematical software.

Not magic at all. Simply a matter of complexity. I don't give moral agency to an insect because of the lack of such complexity. Currently, our robots and computers are below the level of insects, but I see no reason why the level of complexity seen in humans is impossible.

In particular, consciousness involves having a model of one's own internal state and reacting/making plans based on that.
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
Not magic at all. Simply a matter of complexity. I don't give moral agency to an insect because of the lack of such complexity. Currently, our robots and computers are below the level of insects, but I see no reason why the level of complexity seen in humans is impossible.

In particular, consciousness involves having a model of one's own internal state and reacting/making plans based on that.
But certainly you see you are imputing some ‘magical’ step at some level of complexity that mandates ethical treatment. What is that?
 

Polymath257

Think & Care
Staff member
Premium Member
But certainly you see you are imputing some ‘magical’ step at some level of complexity that mandates ethical treatment. What is that?

No, not a magical step. At what point in the animal kingdom do we start seeing moral agency? A similar step would be used for robots and computers.
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
No, not a magical step. At what point in the animal kingdom do we start seeing moral agency? A similar step would be used for robots and computers.
I would say the difference is the mysterious thing called consciousness that differentiates animate versus inanimate. A computer is always going to be inanimate.
 

Polymath257

Think & Care
Staff member
Premium Member
I would say the difference is the mysterious thing called consciousness that differentiates animate versus inanimate. A computer is always going to be inanimate.

But that is a definition based on indistinguishable differences. Not very useful.

Robots, for example, already move extensively in their environments and interact in complex ways with those environments. In what way are they *not* 'animate'?

Also, you are confusing 'consciousness' with 'life' here. Plants, for example, are alive, but they are usually not considered to be animate.
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
But that is a definition based on indistinguishable differences. Not very useful.

Robots, for example, already move extensively in their environments and interact in complex ways with those environments. In what way are they *not* 'animate'?

Also, you are confusing 'consciousness' with 'life' here. Plants, for example, are alive, but they are usually not considered to be animate.
Notice I said ‘mysterious’ thing called consciousness. It is what gives us subjective experiencing. A computer can never feel but only mimic. I don’t see complexity creating a new thing that didn’t exist when simpler.
 

siti

Well-Known Member
I don’t see complexity creating a new thing that didn’t exist when simpler.
But it does - all the time...neither hydrogen nor oxygen atoms have the property of being liquids...even an isolated water molecule cannot be a liquid...but when lots of them are together in one place - hey presto - you have a liquid!
 

Howard Is

Lucky Mud
What if consciousness and form are coemergent ?

Are we confusing consciousness and behaviour ?

From the article -
The problem is that there could conceivably be brains that perform all the same sensory and decision-making functions as ours but in which there is no conscious experience. That is, there could be brains that react as though sad but that don't feel sadness, brains that can discriminate between wavelengths of light but that don't see red or yellow or blue or any other color, brains that direct their bodies to eat certain foods but that don't taste them.

This is also true for AI.
Conflating complex behaviour with consciousness/awareness is an error I see repeatedly in these discussions.

Maybe the dichotomy of substance and awareness is false.
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
But it does - all the time...neither hydrogen nor oxygen atoms have the property of being liquids...even an isolated water molecule cannot be a liquid...but when lots of them are together in one place - hey presto - you have a liquid!
But where is there any change in type between complex mathematical software and a complex human-mimicking program, for example? One needs to be treated ethically and the other like common inanimate matter?
 

siti

Well-Known Member
But where is there any change in type between complex mathematical software and a complex human-mimicking program, for example? One needs to be treated ethically and the other like common inanimate matter?
We all end up as common inanimate matter in the end...but I think - as I pointed out earlier - that we are a very long way from getting a computer to mimic human behaviour. And if we ever did, how would we know that there was a qualitative difference between a 'feeling computer' and a real human? It's purely hypothetical because we are nowhere near being able to do that - and probably never will be. At present we have machines and humans - they are quite different and there is no question of applying human ethical considerations to machines. I fail to see how this relates to the question of whether consciousness comes first...if anything, it seems to imply that perhaps it might not.
 

Howard Is

Lucky Mud
What if neither consciousness nor matter ‘came first’ ?

Is up a result of down ? Or the other way around ?
 

Howard Is

Lucky Mud
I would say the difference is the mysterious thing called consciousness that differentiates animate versus inanimate. A computer is always going to be inanimate.

And that is where we disagree.
All we can say is that a computer behaves differently to an organic life form.
But both are matter in motion.
What we don’t know is how consciousness relates to form, in either case.
 

siti

Well-Known Member
We consider molecules inanimate because their behaviour is relatively simple.

I’m not suggesting that molecules think. Not individually anyway.
Yes. So that's what I'm saying - molecules clearly don't "think" - but they do 'experience' and relate to one another and to their environment (e-m fields etc.) - what I am suggesting is that perhaps what we call consciousness is just a more complex way of relating to our environment that emerges when you have sufficiently many 'experiencing' things networked in a sufficiently complex arrangement - like sufficiently many water molecules experiencing and relating to one another and to their environment makes a drop or a pool of liquid water. There is nothing 'liquid' about an individual water molecule; there is nothing 'conscious' about an individual neuron.
 

Howard Is

Lucky Mud
I would say the difference is the mysterious thing called consciousness that differentiates animate versus inanimate. A computer is always going to be inanimate

How can you verify an absence of consciousness in matter ?

Notice I said ‘mysterious’ thing called consciousness. It is what gives us subjective experiencing. A computer can never feel but only mimic. I don’t see complexity creating a new thing that didn’t exist when simpler

Not a new thing. Just a more complex thing, with emergent properties.
So your point supports the notion that awareness is intrinsic, but the content of awareness is a function of complexity.

Yes. So that's what I'm saying - molecules clearly don't "think" - but they do 'experience' and relate to one another and to their environment (e-m fields etc.) - what I am suggesting is that perhaps what we call consciousness is just a more complex way of relating to our environment that emerges when you have sufficiently many 'experiencing' things networked in a sufficiently complex arrangement - like sufficiently many water molecules experiencing and relating to one another and to their environment makes a drop or a pool of liquid water. There is nothing 'liquid' about an individual water molecule; there is nothing 'conscious' about an individual neuron.

Yep. I see it that way.
 

Howard Is

Lucky Mud
Back soon. My guitar wants to be played, and who am I to argue ? She’s clever though, somehow she convinces me that it was my idea !
 