Koldo
According to determinism, everything has a cause. Free will does not contradict that. It just tells us that we are free to choose to carry out our will. "Free will" does not mean that our "will" is undetermined. It means that nothing impedes our freedom of action--what we will to do. If you tried to move your arm and it failed to move, then your free choice would be impeded. Free will is about having control over one's actions. In most cases, people think of free will in terms of a conflict of desires, where they ultimately act on their greatest desire unobstructed.
It depends on how we define 'free will'.
Free will is usually understood as the ability to choose between two or more options, in such a manner that it would be impossible to determine the result of our actions in advance. When people think of 'free will', they think of past situations where they could have chosen otherwise.
This means that if we could travel back in time to a given moment, we would see the individual picking a different answer each time.
Robots actually have freedom of choice, although we don't conventionally acknowledge that. Nondeterministic programming techniques provide robots with an array of goals, and circumstances provide them with decisions to make. What a robot ultimately decides is often not predictable in advance, because it calculates its actions on the basis of a large number of conflicting goals. Programmers do not necessarily know in advance what set of circumstances will trigger what behavior, although they can hazard guesses, just as we humans can hazard guesses as to how other humans will behave.
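To make that concrete, here is a minimal sketch in Python (the goal names, weights, and action format are mine, purely for illustration) of the kind of goal-arbitration loop I have in mind: several conflicting goals score each candidate action, and near-ties are broken at random, so which action fires in a given circumstance is not fixed in advance.

```python
import random

# A few conflicting goals, each scoring a candidate action.
# Goal names, weights, and the action format are all hypothetical.
GOALS = [
    lambda a, s: -a["energy_cost"] * s["battery_low"],  # conserve battery
    lambda a, s: a["progress"],                         # reach the target
    lambda a, s: -a["collision_risk"] * 10,             # avoid obstacles
]

def choose_action(actions, state):
    # Sum the conflicting goal scores for each candidate action.
    scored = [(sum(g(a, state) for g in GOALS), a) for a in actions]
    best = max(score for score, _ in scored)
    # Break near-ties at random: the robot's pick is not fixed in advance.
    candidates = [a for score, a in scored if score >= best - 0.1]
    return random.choice(candidates)

actions = [
    {"name": "dash",  "energy_cost": 5, "progress": 3, "collision_risk": 0.4},
    {"name": "creep", "energy_cost": 1, "progress": 1, "collision_risk": 0.0},
    {"name": "wait",  "energy_cost": 0, "progress": 0, "collision_risk": 0.0},
]
state = {"battery_low": 1}  # battery is low, so energy costs weigh heavily

# "creep" and "wait" tie at score 0; which one runs varies across runs.
print(choose_action(actions, state)["name"])
```

Even in this toy agent, you cannot read the behavior straight off the source code run by run; scale the goal set up and the hazarded guesses I mentioned are the best anyone can do.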
Freedom of choice? How?
A robot is determined to behave according to how it was programmed. Just because its programmer fails to grasp how the whole code will interact in every possible circumstance doesn't mean the robot is free.
Before you make such a bold claim, you should think long and hard about what it means to be a "moral agent". Morality is a causal factor in what determines behavior.
A moral agent is, among other things, someone who bears moral responsibility.
If determinism is true, then moral responsibility is not.
If determinism is true, then we just act like programmed robots.
Is a moral agent still a moral agent without moral responsibility?
Perhaps we should come up with another term?