Brian Peacock wrote: pErvin wrote: Scott's definition (which I assume is an analogue of Dennett's).
OK.
pErvin wrote: Scott1328 wrote: Free will is:
The ability of an agent to predict, on occasion, and evaluate the possible outcomes of various alternatives and behave according to that evaluation. This then indicates that there are degrees of freedom; the better an agent's ability to predict and evaluate, the more freedom it has. Sometimes, though, all the prediction and evaluation in the world will not help if there are no alternatives; in such cases there is no free will. This is the gist of Dennett's argument in Freedom Evolves.
Going back to this definition, that means that my phone has free will because it uses auto-correct.
Free will, as it is both commonly understood and taken to mean in this context, concerns human motives, not merely the existence of alternatives. The clue is in the word 'will', which implies action beyond mere unthinking habit or instinct, actions whose motives have to be distinguished and considered in context.
You are basically shoehorning metaphysical concepts of "will" and "motive" into a non-metaphysical concept. I judge this to be the case because you are claiming these are the special sauce that can't exist in software, for example, to cause it to experience free will. Why can't they exist in software? What makes them solely applicable to humans and perhaps animals? Wiki's definition of "motivation" is:
"Motivation is a theoretical construct used to explain behavior. It gives the reasons for people's actions, desires, and needs. Motivation can also be defined as one's direction to behavior, or what causes a person to want to repeat a behavior and vice versa." I see absolutely no reason why that definition couldn't apply to software. Can you provide a different definition that supports your position?
The algorithm in your phone does indeed offer you alternatives selected from a base list, which is also extended by user interaction, within certain bounds. Predictive text, then, is wholly deterministic: if you put the same combination of information in, you'll get the same information out, depending on a few predetermined variables.
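The kind of lookup you're describing is easy to make concrete. A toy sketch in Python (the word lists and names are invented for illustration, not how any real phone's engine works): feed it the same prefix and the same user dictionary and it returns the same suggestions every time.

[code]
# Toy deterministic predictive-text lookup (all names and word lists invented).
# The point: identical input always produces identical output.

BASE_WORDS = ["the", "then", "there", "thanks", "coffee", "tea"]

def suggest(prefix, user_words, limit=3):
    """Return up to `limit` completions for a prefix, deterministically."""
    candidates = [w for w in BASE_WORDS + sorted(user_words) if w.startswith(prefix)]
    return candidates[:limit]

# Same information in, same information out, every time:
print(suggest("th", {"theremin"}))  # ['the', 'then', 'there']
print(suggest("th", {"theremin"}))  # ['the', 'then', 'there']
[/code]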
As I said earlier, determinism isn't a rebuttal of compatibilism. In fact, wiki states:
"Compatibilist's free will should not be understood as some kind of ability to have actually chosen differently in an identical situation." https://en.wikipedia.org/wiki/Compatibi ... _imaginary
But you, the agent, have to decide which alternative, if any, should make it into your txt, and it goes without saying that your phone does not predict what you intend to say, or when you want to say it, or to whom, or compose and send txt messages without your input.
A hypothetical phone can be modified to do just that.
Your phone has no motives, it has no will. It is not an agent, it is the tool of an agent.
As I said above, this is just injecting some special sauce into humans/animals. What is the metaphysical special sauce?
A phone could be programmed to behave according to one or more motives.
Nonetheless, following your predictive text example, do you consider that what we call free will operates on a similar level, or in a similar manner, to the predictive txt algorithm? That is, considering a choice of beverages, and given a particular informational state, you'll always and only ever choose coffee over tea, beer over gin, and this implies that 'free will' is a non-starter?
You are mixing two concepts here - determinism and compatibilism. The two are not directly related. Under the compatibilist definition of free will, I see no reason why software can't be thought to have free will. What the utility of such a definition is, I can't see.
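To make that concrete, here's a toy Python sketch (every function, weight, and outcome is invented for illustration) of software doing exactly what Scott's definition asks for: it predicts the outcomes of its alternatives, evaluates them against a 'motive', and behaves according to that evaluation, and when there are no alternatives it has no choice to make at all.

[code]
# Toy agent per Scott's definition: predict the outcome of each alternative,
# evaluate it against the current "motive" (a set of preference weights),
# and act on that evaluation. All numbers are invented for illustration.

def predict_outcome(action):
    # Hypothetical predicted effects of each alternative.
    return {"coffee": {"alertness": 0.8, "calm": 0.2},
            "tea":    {"alertness": 0.4, "calm": 0.6}}[action]

def evaluate(outcome, motive):
    # Score a predicted outcome against the agent's preference weights.
    return sum(motive[k] * v for k, v in outcome.items())

def choose(alternatives, motive):
    # With no alternatives there is nothing to evaluate and nothing to choose.
    if not alternatives:
        return None
    return max(alternatives, key=lambda a: evaluate(predict_outcome(a), motive))

print(choose(["coffee", "tea"], {"alertness": 1.0, "calm": 0.1}))  # coffee
print(choose(["coffee", "tea"], {"alertness": 0.1, "calm": 1.0}))  # tea
print(choose([], {"alertness": 1.0, "calm": 0.1}))                 # None
[/code]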