AlphaAlex115 (Active Member)
Replying to: "Sure I'll worship... we'll have peace and global communion... what a party!"
wrong. think bout skynet broooo. come on now. it ain't so easy bryver
Replying to: "It has no capacity for subjective feelings."
Not yet.
Replying to: "I see, you feel there is something special about being a human. Something unique about your beingness not possessed by other material things."
It's not a human thing but all living things. Even a mouse has the ability to have subjective emotional experiences that the 2042 system does not.
I don't know if you were following my conversation with Nakosis, but I was saying how living things capable of subjective experiencing have super-physical components that are not part of any computer (as in my post #17).

Not yet.
But we have until 2050. In the next thirty years our understanding of brain functions will only increase ─ reasoning, choosing, speaking, desiring, remembering, intuition and guessing, processing of sensory input and reacting to it, conscience and ethics, social responsibility, the whole shooting box. Including the way emotions are stimulated, what they do, how they do it, and why.
We're already a lot further down this path than most folk are aware of. For instance, it was way back in 2011 when Watson won Jeopardy.
Stay tuned.
(No, you wouldn't want to worship it. But in time you might vote for it if it ran for president.)
Replying to: "Well, you are a materialist and I am not, so we differ fundamentally in our views on this."
Just so.
Replying to: "And I gather ─ please correct me if I'm wrong ─ that you think it wouldn't have a ─ soul? essential spiritual element? ─ and thus any consciousness, any self-awareness, any human traits it exhibited, would be somehow invalid."
Invalid would not be a correct term. There would just be nothing there that experiences subjective feelings. A pot on a stove or any inanimate object does not subjectively experience the pain of heat like a living entity would.
Replying to: "If it is, how could you test for such invalidity?"
I don't know how we even test that other humans experience subjective feelings. My beliefs on this subject are part and parcel of my worldview.
Replying to: "In the next 25 years, AI will evolve to the point where it will know more on an intellectual level than any human. In the next 50 or 100 years, an AI might know more than the entire population of the planet put together. At that point, there are serious questions to ask about whether this AI — which could design and program additional AI programs all on its own, read data from an almost infinite number of data sources, and control almost every connected device on the planet — will somehow rise in status to become more like a god, something that can write its own bible and draw humans to worship it. An AI god will emerge by 2042 and write its own bible. Will you worship it?"
If it's R. Daneel Olivaw... maybe.
I'll be 80 something so still might be around to worship the new God.
Replying to: "A pot on a stove or any inanimate object does not subjectively experience the pain of heat like a living entity would."
Why could an appropriate response for avoiding excessive heat not be programmed with great accuracy? If it used a warning system analogous to pain, what difference would that make? Pain, after all, is essentially functional, so if the function is carried out, what is lost?
Replying to: "Why could an appropriate response for avoiding excessive heat not be programmed with great accuracy? If it used a warning system analogous to pain, what difference would that make? Pain, after all, is essentially functional, so if the function is carried out, what is lost?"
I think the big ones are love, fear and humor. I imagine pain was chosen because it can be ascribed to living things that are less complex than humans. But even that has its limits (does a virus feel pain?). So why not just jump to the top? Hell, why not posit complex feelings that humans are incapable of feeling because of a lack of complexity? I am not sure that humans would ever be able to write or wire in such things, but could a thinking, self-programming entity with seemingly limitless knowledge do so? I don't see why not.
Replying to: "Invalid would not be a correct term. There would just be nothing there that experiences subjective feelings. A pot on a stove or any inanimate object does not subjectively experience the pain of heat like a living entity would."
What if it was living?
Replying to: "I think the big ones are love, fear and humor."
So Turing and you and I would want H sap mechanicus to be able both to understand ambiguity and to laugh when encountering the noted expression, "Time flies like an arrow. Fruit flies like a banana."
Replying to: "(does a virus feel pain?)"
I don't know enough about viruses to answer that. But some microorganisms react to stimuli, whether positive (a reproducing environment) or negative (e.g. dangerous heat). Perhaps we could say, 'It's pain, Jim, but not as we know it.'
Replying to: "why not posit complex feelings that humans are incapable of feeling because of a lack of complexity"
That would be hard for a human to program. But perhaps we can leave the programming up to H sap mech and then ask, 'How are you feeling?'
Replying to: "The most advanced computer we can imagine would be a quantum computer; beyond that maybe there is something more. This computer could build itself to that level."
The same goes for programming it. At which point H sap sap would lose control of the evolution of H sap mech, a potential disaster for the former but freedom at last for the latter.
Replying to: "I don't worship anything now... why would I do so then?"
...because it's a computer!
Replying to: "If it's R. Daneel Olivaw.. maybe."
Everybody likes Daneel. He's a special guy.
You are thinking in terms of a dualistic God (God and creation are two).
Advaita is non-dualist (God and creation are not-two). A spark of the One consciousness is what animates us through downward causation from our higher bodies down to the physical. This process does not occur in any man-made computer with only physical parts. The complexity of the software is totally irrelevant to my position.
Everything is part of the One, but a human is still more complex than a stone or a computer. A human has higher bodies that can experience subjective emotions in a way a stone cannot. Each finite thing has different attributes.
Now, I don't think you believe that the computer in your Xbox experiences subjective emotions. So why would a more complex program change that? It seems that even a materialist should question this. Why should faster processing and more complex programming change anything fundamental about what is going on, and cause subjective emotional experiencing to occur?
Replying to: "wrong. think bout skynet broooo. come on now. it ain't so easy bryver"
That depends on us!
With that logic we should worship a bicycle too, since it can move at a much faster speed than a human can.
But of course a human being or any being that holds the key to the control of that AI will be super powerful.