I had to look up p-zombies to know what you were talking about. I think you are attempting to say that if we build a machine that mimics human behavior we have created a human. Is this your argument?
We have created a human mind, yes. Or something greater than it. Whether or not that human mind has a human body is another question entirely.
How do you think the designers do it? With tweezers? They hand their design to a computer. There's no theoretical reason that the computer can't invent designs on its own. (It already compiles the design from extremely high-level constructs into actual instructions for the FPGA.)
That's my point. Computers can't either. They are just like a screwdriver with more functions. More like a desk lamp, really. That's why I always say that.
Computers can't be autonomous?
Von Neumann replicators aren't a thing? As far as computability theory is concerned, the difference between Von Neumann's device and grey-goo nanotechnology is a matter of engineering - it certainly can be done; the question is how to do it.
Desk lamps and screwdrivers have a defined function - contrariwise, sufficiently advanced robotics can do anything.
You'll have to run that by me again.
"If A, B, else C."
That's a decision - but it's not really a choice, because "A" is something that's as obvious (to the computer) as "The sun is yellow" is to us.
The fact that it performs as designed is irrelevant. We can't turn screws as well as screwdrivers can, either. But they don't really turn screws, do they? We do. With screwdrivers.
What sort of design applies to a machine that can rewrite its own instructions and rebuild its own processes?
Haha, okay. All the stuff I said barring inexplicable stellar events, divine intervention, alien co-habitation, or whatever it was that monolith was supposed to be...
The monolith is an autonomous Von Neumann replicator. Amongst other things.
Sure. It's inferred from basic quantum mechanics (although not necessarily true). But it's also irrelevant. Because the whole point is to be able to identify a relevant "state" in a finite state machine. The question is whether or not we can simulate the brain on a finite state machine.
The universe is technically a finite state machine, since there are only a finite number of ways to arrange the universe's contents. (Remembering that energy appears to be quantized.) The asynchronicity can be dealt with by either making the carrier fields part of the machine, or simply accepting a somewhat bizarre view of what a state of the universe is.
It's impossible to know, yes. But the more we learn about the brain, the more we do know what is at least involved in information storage. We know, for example, that unlike a computer there is no distinction between "storage" and "processing".
With any sort of non-trivial model, that's true in computers as well. That is, writing or reading data requires significant processing.
Both are constant and dynamic (the brain is never at rest). We also have no idea how much can be represented by integers (even allowing for the infinite set, because from the physical to the life sciences everything is approximation).
Integers cannot represent all real numbers to infinite precision, but they can represent real numbers to arbitrary precision. All physical quantities have "smallest meaningful values", e.g. the Planck length, so integers suffice.
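That point can be made concrete in a few lines of Python. The decimal scaling below is just one illustrative encoding of "arbitrary precision with plain integers", not the only one:

```python
from fractions import Fraction

def to_scaled_int(x, digits):
    """Represent the rational x to `digits` decimal places as one plain integer."""
    scale = 10 ** digits
    return (x.numerator * scale) // x.denominator

# 1/3 and 22/7 to ever-finer, but always finite, precision - each value is just an int:
print(to_scaled_int(Fraction(1, 3), 5))   # -> 33333
print(to_scaled_int(Fraction(22, 7), 3))  # -> 3142
```

Pick a bigger `digits` and you get a bigger integer, never an infinite one - which is all the physics ever asks for.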
But they can't deal with concepts at all. So it's irrelevant.
You earlier admitted that you can't define what a concept is. So how do you know that computers can't deal with them? What would dealing with them look like?
Computers deal well with "generalities" only insofar as there is a human there to understand what they mean. It is incredibly easy to create an object like "tree" and have things like "oak", "spruce", etc. inherit from "tree". It's also meaningless to the computer.
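For concreteness, that kind of hierarchy takes a few lines of Python (the `height_m` property is a placeholder of my own, not from the discussion):

```python
class Tree:
    """The generality: everything that is a Tree fulfills these properties."""
    def __init__(self, height_m):
        self.height_m = height_m  # illustrative shared property

class Oak(Tree):
    pass

class Spruce(Tree):
    pass

# The program can mechanically check membership in the generality...
print(isinstance(Oak(20.0), Tree))  # -> True
# ...though the objection above is that the name "Tree" means nothing to it.
```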
But it's not quite meaningless - it is intrinsic to the program that oaks, spruces, etc. all fulfill certain properties, and the intersection of those is at least the generality "tree". If you write the program in a language such as Spec# or (to a lesser extent) Haskell, this becomes clear, because the language requires you to define that meaning. Compare:
Code:
def add(a, b):
    return a + b
(Python)
to
Code:
Number add(Number a, Number b)
requires 0 <= a <= (1 << 32)
requires 0 <= b <= (1 << 32)
ensures 0 <= Return <= (1 << 33)
ensures Return >= a
ensures Return >= b
{
return a+b;
}
(pseudo-Spec#)
Though they perform the same function, the latter is far more meaningful, both to a human and, importantly, to the computer. It is meaningful to the computer because of reflection, which exposes the program's own inner workings as data - in this case, its own expectations of how things behave! Our software is capable of navel-gazing! Now that the computer knows what it's expecting, it can invent data to violate those expectations and work through the model to determine a new answer. For instance, it can build the more general function:
Code:
Number add(Number a, Number b)
requires -(1 << 32) <= a <= (1 << 32)
requires -(1 << 32) <= b <= (1 << 32)
{
return a+b;
}
And then it can invent the data (-1, -1), feed it into the function above, and get -2. However, because the process in this new function is the same as in the old one (obviously any IRL system will have something more complex and non-circular), the computer can logically deduce more meaning than it was initially given. For instance, it might come up with:
Code:
Number add(Number a, Number b)
requires -(1 << 32) <= a <= (1 << 32)
requires -(1 << 32) <= b <= (1 << 32)
ensures (Return >= a && Return >= b) || a < 0 || b < 0
{
return a+b;
}
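The whole loop - expose the contract as data, invent inputs, catch a violation, weaken the postcondition - can be sketched in Python rather than Spec#. The encoding below (predicate functions attached to `add` as attributes) is my own stand-in for real language-level contract support:

```python
def add(a, b):
    return a + b

# The function's own "expectations", attached as plain data the program can read back:
add.requires = lambda a, b: -(1 << 32) <= a <= (1 << 32) and -(1 << 32) <= b <= (1 << 32)
add.ensures = lambda a, b, ret: ret >= a and ret >= b  # the old, too-strong postcondition

def probe(fn):
    """Invent small inputs and return the first one that violates fn's own postcondition."""
    for a in range(-3, 4):
        for b in range(-3, 4):
            if fn.requires(a, b):
                ret = fn(a, b)
                if not fn.ensures(a, b, ret):
                    return (a, b, ret)
    return None

print(probe(add))  # -> (-3, -3, -6): "Return >= a" fails once negatives are allowed
```

Having found the counterexample, the program knows its old expectation only holds when both inputs are non-negative - which is exactly the weakened `ensures` clause above.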
As far as I'm concerned, this is 1) learning, and 2) the concept of addition. (The bit missing from this being fully intelligent is that we haven't covered comparing the improved function to real-world evidence yet.)
CAPTCHAs and facial recognition are incredibly difficult for computers because computers are designed from the ground up for "particulars." Lots of things can be stored, but they are stored in specific, well-defined ways.
That's one of the most general statements you can get - it quantifies over an infinite set! - yet it's a string less than 60 bytes long.