An AI god will emerge by 2042 and write its own bible. Will you worship it?

blü 2

Veteran Member
Premium Member
It has no capacity for subjective feelings.
Not yet.

But we have until 2050. In the next thirty years our understanding of brain functions will only increase ─ reasoning, choosing, speaking, desiring, remembering, intuition and guessing, processing of sensory input and reacting to it, conscience and ethics, social responsibility, the whole shooting box. Including the way emotions are stimulated, what they do, how they do it, and why.

We're already a lot further down this path than most folk are aware of. For instance, it was way back in 2011 when Watson won Jeopardy.

Stay tuned.

(No, you wouldn't want to worship it. But in time you might vote for it if it ran for president.)
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
I see, you feel there is something special about being a human. Something unique about your beingness not possessed by other material things.
It's not just a human thing but something true of all living things. Even a mouse has the ability to have subjective emotional experiences that the 2042 system would not.
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
Not yet.

But we have until 2050. In the next thirty years our understanding of brain functions will only increase ─ reasoning, choosing, speaking, desiring, remembering, intuition and guessing, processing of sensory input and reacting to it, conscience and ethics, social responsibility, the whole shooting box. Including the way emotions are stimulated, what they do, how they do it, and why.

We're already a lot further down this path than most folk are aware of. For instance, it was way back in 2011 when Watson won Jeopardy.

Stay tuned.

(No, you wouldn't want to worship it. But in time you might vote for it if it ran for president.)
I don't know if you were following my conversation with Nakosis, but I was saying that living things capable of subjective experiencing have super-physical components that are not part of any computer (as in my post #17).

Well, you are a materialist and I am not so we differ fundamentally in our views on this.
 

blü 2

Veteran Member
Premium Member
Well, you are a materialist and I am not so we differ fundamentally in our views on this.
Just so.

But I can readily imagine that consciousness is a brain function which, once fully described and understood, could be imitated by a program.

And I gather ─ please correct me if I'm wrong ─ that you think it wouldn't have a ─ soul? essential spiritual element? ─ and thus any consciousness, any self-awareness, any human traits it exhibited, would be somehow invalid.

Is that fair?

If it is, how could you test for such invalidity? Why would it matter?
 

George-ananda

Advaita Vedanta, Theosophy, Spiritualism
Premium Member
And I gather ─ please correct me if I'm wrong ─ that you think it wouldn't have a ─ soul? essential spiritual element? ─ and thus any consciousness, any self-awareness, any human traits it exhibited, would be somehow invalid.
Invalid would not be a correct term. There would just be nothing there that experiences subjective feelings. A pot on a stove or any inanimate object does not subjectively experience the pain of heat like a living entity would.

If it is, how could you test for such invalidity?
I don't know how we even test that other humans experience subjective feelings. My beliefs on this subject are part and parcel of my worldview.
 

sayak83

Veteran Member
Staff member
Premium Member
In the next 25 years, AI will evolve to the point where it will know more on an intellectual level than any human. In the next 50 or 100 years, an AI might know more than the entire population of the planet put together. At that point, there are serious questions to ask about whether this AI — which could design and program additional AI programs all on its own, read data from an almost infinite number of data sources, and control almost every connected device on the planet — will somehow rise in status to become more like a god, something that can write its own bible and draw humans to worship it.

An AI god will emerge by 2042 and write its own bible. Will you worship it?

I'll be 80-something, so I still might be around to worship the new God.
If it's R. Daneel Olivaw... maybe.
 

blü 2

Veteran Member
Premium Member
A pot on a stove or any inanimate object does not subjectively experience the pain of heat like a living entity would.
Why could an appropriate response for avoiding excessive heat not be programmed with great accuracy? If it used a warning system analogous to pain, what difference would that make? Pain, after all, is essentially functional, so if the function is carried out, what is lost?
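A minimal sketch, purely for illustration, of the functional picture of pain described above: a graded warning signal that triggers avoidance behaviour once a heat threshold is crossed. The threshold value and all the names here are hypothetical, and nothing in the sketch is meant to settle whether such a signal would actually be felt:

```python
# Purely illustrative: a threshold-triggered "pain" signal that drives
# avoidance behaviour, i.e. the functional role ascribed to pain above.
# The threshold and function names are hypothetical.

PAIN_THRESHOLD_C = 55.0  # assumed tissue-damage threshold, in Celsius

def pain_signal(temperature_c: float) -> float:
    """Return a graded warning signal: zero below threshold, rising with heat."""
    return max(0.0, temperature_c - PAIN_THRESHOLD_C)

def react(temperature_c: float) -> str:
    """Map the warning signal to an avoidance response."""
    signal = pain_signal(temperature_c)
    if signal == 0.0:
        return "carry on"
    elif signal < 20.0:
        return "withdraw from the heat source"
    else:
        return "withdraw immediately and flag damage for repair"

if __name__ == "__main__":
    for temp in (25.0, 60.0, 90.0):
        print(temp, "->", react(temp))
```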

(And on this subject, have you read Philip K. Dick's masterpiece Do Androids Dream of Electric Sheep? No, Blade Runner is a great movie, and it's based on the book, but for the present discussion it doesn't go where the book goes.)
 

Curious George

Veteran Member
Why could an appropriate response for avoiding excessive heat not be programmed with great accuracy? If it used a warning system analogous to pain, what difference would that make? Pain, after all, is essentially functional, so if the function is carried out, what is lost?
I think the big ones are love, fear and humor. I imagine pain was chosen because it can be ascribed to living things that are less complex than humans. But even that has its limits (does a virus feel pain?). So why not just jump to the top? Hell, why not posit complex feelings that humans are incapable of feeling because of a lack of complexity? I am not sure that humans would ever be able to write or wire in such things, but could a thinking, self-programming entity with seemingly limitless knowledge do so? I don't see why not.

It is an interesting concept. The most advanced computer we can imagine would be a quantum computer; beyond that, maybe there is something more. This computer could build itself to that level. Hell, this computer could even give itself organic living components, complete with real pain receptors. Is it a living thing at that point?

The question shouldn't be what it can do, but what would it need to be for people to recognize it as worthy of worship?
 

Curious George

Veteran Member
Invalid would not be a correct term. There would just be nothing there that experiences subjective feelings. A pot on a stove or any inanimate object does not subjectively experience the pain of heat like a living entity would.


I don't know how we even test that other humans experience subjective feelings. My beliefs on this subject are part and parcel of my worldview.
What if it was living?
 

atanu

Member
Premium Member
In the next 25 years, AI will evolve to the point where it will know more on an intellectual level than any human. In the next 50 or 100 years, an AI might know more than the entire population of the planet put together. At that point, there are serious questions to ask about whether this AI — which could design and program additional AI programs all on its own, read data from an almost infinite number of data sources, and control almost every connected device on the planet — will somehow rise in status to become more like a god, something that can write its own bible and draw humans to worship it.

An AI god will emerge by 2042 and write its own bible. Will you worship it?

I'll be 80-something, so I still might be around to worship the new God.

With that logic we should worship a bicycle too, since it can move at a much faster speed than a human can.

But of course a human being or any being that holds the key to the control of that AI will be super powerful.
 

England my lionheart

Rockerjahili Rebel
Premium Member
In the next 25 years, AI will evolve to the point where it will know more on an intellectual level than any human. In the next 50 or 100 years, an AI might know more than the entire population of the planet put together. At that point, there are serious questions to ask about whether this AI — which could design and program additional AI programs all on its own, read data from an almost infinite number of data sources, and control almost every connected device on the planet — will somehow rise in status to become more like a god, something that can write its own bible and draw humans to worship it.

An AI god will emerge by 2042 and write its own bible. Will you worship it?

I'll be 80-something, so I still might be around to worship the new God.

Could be good, so you could pray for something and put it on a memory stick, plug it in, and get a readout that it's been received and whether it's been declined or granted. More efficient than the current method :).
 

blü 2

Veteran Member
Premium Member
I think the big ones are love, fear and humor.
So Turing and you and I would want H sap mechanicus to be able both to understand ambiguity and to laugh when encountering the noted expression, "Time flies like an arrow. Fruit flies like a banana."

And it would be an excellent test of what we're talking about, except that quite a few H sap sap don't get it anyway.
(does a virus feel pain?)
I don't know enough about viruses to answer that. But some microorganisms react to stimuli, whether positive (a reproducing environment) or negative (e.g. dangerous heat). Perhaps we could say, 'It's pain, Jim, but not as we know it.'
why not posit complex feelings that humans are incapable of feeling because of a lack of complexity?
That would be hard for a human to program. But perhaps we can leave the programming up to H sap mech and then ask, 'How are you feeling?'
The most advanced computer we can imagine would be a quantum computer, beyond that maybe there is something more. This computer could build itself to that level.
The same goes for programming it. At which point H sap sap would lose control of the evolution of H sap mech, a potential disaster for the former but freedom at last for the latter.

And if anything from earth is going to colonize the galaxy, it'll be H sap mech.
 

Polymath257

Think & Care
Staff member
Premium Member
You are thinking in terms of a dualistic God (God and creation are two).

Advaita is non-dualist (God and creation are not-two). A spark of the One consciousness is what animates us through downward causation from our higher bodies down to the physical. This process does not occur in any man-made computer with only physical parts. The complexity of the software is totally irrelevant to my position.

My question is: how do you know it doesn't happen for computers? Perhaps a sufficient level of physical/software complexity attracts a 'higher body' to animate it.

To my thinking, once it passes the Turing test, it is conscious. Anything less and I'd have to wonder whether other people are conscious.
 

Polymath257

Think & Care
Staff member
Premium Member
Everything is part of the One, but a human is still more complex than a stone or a computer. A human has higher bodies that can experience subjective emotions in a way a stone cannot. Each finite thing has different attributes.

Now, I don't think you believe a computer like the one in your Xbox or something experiences subjective emotions. Then why would a more complex program change that? It seems like even a materialist should question this. Why should faster processing and more complex programming change anything fundamentally about what is going on and cause subjective emotional experiencing to occur?

Well, because a sufficient complexity of the right sort (not just any complexity, mind you) allows for a modeling of internal states and thereby 'self-awareness'.
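A minimal sketch, under the assumption that 'modeling of internal states' just means keeping and consulting a representation of one's own condition: an agent that records facts about itself and reports on them when asked. The class and method names are hypothetical, and the sketch obviously does not claim to produce subjective experience:

```python
# Purely illustrative: an agent with a second-order model of its own
# first-order state, which it can consult and report on when queried.
# All names are hypothetical; this is not claimed to amount to consciousness.

class SelfModelingAgent:
    def __init__(self):
        self.state = {"temperature": 20.0, "battery": 1.0}  # first-order state
        self.self_model = {}                                 # model of that state

    def sense(self, temperature: float, battery: float) -> None:
        """Update the first-order state from (simulated) sensors."""
        self.state["temperature"] = temperature
        self.state["battery"] = battery

    def update_self_model(self) -> None:
        """Represent facts about the agent itself, not just about the world."""
        self.self_model = {
            "i_am_overheating": self.state["temperature"] > 55.0,
            "i_am_low_on_energy": self.state["battery"] < 0.2,
        }

    def how_are_you_feeling(self) -> str:
        """Answer by consulting the self-model rather than the raw state."""
        self.update_self_model()
        complaints = [name for name, holds in self.self_model.items() if holds]
        return ", ".join(complaints) if complaints else "nothing to report"

if __name__ == "__main__":
    agent = SelfModelingAgent()
    agent.sense(temperature=70.0, battery=0.1)
    print(agent.how_are_you_feeling())
```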
 

Nakosis

Non-Binary Physicalist
Premium Member
With that logic we should worship a bicycle too, since it can move at a much faster speed than a human can.

But of course a human being or any being that holds the key to the control of that AI will be super powerful.

A bicycle, other than providing local transportation, wouldn't be able to answer your questions and prayers.

This would be able to provide practically everything that a God could. (At least, I haven't thought of anything the Gods people actually believe in could do better.)
 