The philosophy thread: general discussion.

 2251 De Marquis, Wed, 5th Mar '14 1:48:42 PM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
Interesting. I'll have to do some research on that.
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Yasu!
Probability is in the mind. Just because it is impossible to be certain about something doesn't mean that it is random.
When life gives you lemons, burn life's house down with the lemons.
 2253 Greenmantle, Wed, 5th Mar '14 1:52:21 PM from Albion Relationship Status: [TOP SECRET]
In other words, you're saying there's no such thing as probability?

edited 5th Mar '14 1:52:26 PM by Greenmantle

 2254 De Marquis, Wed, 5th Mar '14 2:14:22 PM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
If so, aren't you agreeing with me?
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Yasu!
@Greenmantle: Probability is in the mind. A coin is not uncertain of its own outcome. But there is still a right way to do probability.

@De Marquis: I am still not sure what you believe, so I don't know whether I am agreeing with you or not.
When life gives you lemons, burn life's house down with the lemons.
 2256 Greenmantle, Wed, 5th Mar '14 2:28:11 PM from Albion Relationship Status: [TOP SECRET]
But there is still a right way to do probability.

...which is? And, um, is English your first language?
 2257 Meklar, Wed, 5th Mar '14 2:36:56 PM from Milky Way Relationship Status: RelationshipOutOfBoundsException: 1
My understanding is that quantum indeterminism only operates at the level of quantum particles. I was under the impression that at the scale of a coin all those effects cancel each other out, and have no impact on which side will turn up.
They almost cancel out, almost always, at least for coins flipped from a normal height. But this is just because coins are relatively large and the normal flipping height is pretty small. For events that are already very close to a tipping point, or processes that occur across extremely large scales, the outcome absolutely can be changed on a macroscopic level.

An example that is sometimes given is balancing a pencil on its tip. In a perfect Newtonian world, a perfect pencil balanced on a perfect surface precisely vertically in a vacuum will stay balanced forever (you can sometimes see a similar phenomenon in physics puzzle games, where you can drop a triangular block onto a flat plane and it just sits there on its point). However, it has been estimated that in the real world, if you balanced a large number of pencils like this independently of each other, then chances are that within ten seconds most of them will have fallen over, and in completely random directions.
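A minimal numerical sketch of the instability being described (the pencil length, timestep, and tip-over threshold below are illustrative assumptions, not the estimate mentioned above): treating the pencil as an inverted pendulum, any nonzero initial tilt grows roughly exponentially, so the fall time depends only logarithmically on how tiny the initial disturbance is.

```python
import math

# Illustrative sketch: a pencil balanced on its tip behaves like an inverted
# pendulum.  For a uniform rod of length L pivoting at its tip, the tilt angle
# obeys roughly theta'' = (3g / 2L) * sin(theta), so any nonzero tilt grows
# exponentially until the pencil falls.

G = 9.81           # m/s^2
L = 0.18           # pencil length in metres (assumed)
K = 3 * G / (2 * L)

def fall_time(theta0, dt=1e-5, threshold=0.5):
    """Integrate the tilt angle until it reaches `threshold` radians."""
    theta, omega, t = theta0, 0.0, 0.0
    while abs(theta) < threshold:
        omega += K * math.sin(theta) * dt
        theta += omega * dt
        t += dt
    return t

# Shrinking the initial tilt by three orders of magnitude only buys about a
# second of extra balance time, so even absurdly small initial uncertainty
# shows up as a macroscopic fall within a few seconds.
for theta0 in (1e-6, 1e-9, 1e-12):
    print(f"initial tilt {theta0:.0e} rad -> falls after ~{fall_time(theta0):.2f} s")
```

Because the fall time only grows logarithmically as the initial tilt shrinks, pushing the uncertainty all the way down to quantum scales still leaves the pencil toppling within seconds, which is the sense in which the microscopic randomness can surface macroscopically.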

Yasu!
@Greenmantle: English is my first language.

Probability is in the mind, but the laws of probability are theorems. The evidence should shift your belief only a certain amount, and any more or less will make you wrong.
When life gives you lemons, burn life's house down with the lemons.
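A minimal worked example of that "theorems" point, using a toy setup (the 90%-heads trick coin and the starting prior are hypothetical numbers, not anything from the thread): given a prior and the evidence, Bayes' rule fixes exactly how far the belief should move, so reporting any other number contradicts your own premises.

```python
# Given a prior and the evidence, Bayes' rule pins down the posterior exactly.

def posterior(prior_biased, p_heads_if_biased, p_heads_if_fair, flips):
    """Update P(coin is the biased one) after a sequence of flips ('H'/'T')."""
    p = prior_biased
    for flip in flips:
        like_biased = p_heads_if_biased if flip == 'H' else 1 - p_heads_if_biased
        like_fair = p_heads_if_fair if flip == 'H' else 1 - p_heads_if_fair
        # Bayes' rule applied one flip at a time.
        p = p * like_biased / (p * like_biased + (1 - p) * like_fair)
    return p

# Start 50/50 on whether the coin is a 90%-heads trick coin or a fair one.
print(posterior(0.5, 0.9, 0.5, "HHHH"))   # ~0.91: four heads should move you exactly this far
print(posterior(0.5, 0.9, 0.5, "HHTH"))   # ~0.54: one tails drags the belief back down
```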
Unless you're cheating, if you assign a probability other than 0.5 to a coin flip, you will tend to lose.
The road to exploding boats is paved with good intentions.
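A small sketch of that "tend to lose" claim, under the assumption that losing is measured with a logarithmic score (one standard way of scoring probability estimates; the specific scoring rule is an assumption, not something stated in the thread): for a genuinely fair coin, reporting 0.5 maximizes the expected score, and any other report does worse on average.

```python
import math

def expected_log_score(reported_p, true_p=0.5):
    """Average log score per flip when the coin really lands heads with probability true_p."""
    return true_p * math.log(reported_p) + (1 - true_p) * math.log(1 - reported_p)

for p in (0.3, 0.5, 0.7, 0.9):
    print(f"report {p:.1f}: expected score {expected_log_score(p):.4f}")
# Reporting 0.5 gives the best (least negative) expected score for a fair coin;
# anything else loses on average, and the further from 0.5, the worse it gets.
```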
 2260 De Marquis, Wed, 5th Mar '14 4:30:37 PM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
Well, in the classical view, probability reflects only our ignorance of the facts, so what the probability is depends on how many facts we have (the more we have, the less uncertain the outcome). According to this new theory that may no longer be true, at least not all the time. It seems that there is still some debate regarding the significance of this new claim, but if flipping a coin can to some extent reflect the underlying quantum uncertainty, then probability is no longer simply in the mind.

edited 5th Mar '14 4:31:00 PM by DeMarquis

“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Yasu!
Getting back on track. This notion of free will is too vague. Can you even imagine answering the question of free will? If not, we might be asking the wrong question.
When life gives you lemons, burn life's house down with the lemons.
Changing the subject since it seems to have died anyway, here's my idea of how morality works, if it exists.

1. It needs to be something most humans could be persuaded of in principle. It might take centuries. And it might take more computing power than is practical for a human to run in full detail in most cases.

It doesn't need to persuade paperclip maximizers.

It is not equivalent to what you want, any more than logic is equivalent to what you think. But both logic and morality have to be reached using your actual brain, starting from something much cruder and less reliable. Trusting specific arguments isn't the same as completely trusting yourself.

Morality isn't simple and it is easy to be confused about it. We are starting with dozens or hundreds of intuitions, and don't even have a full list of them.

2. It might be the case that there are multiple moralities humans can reach. Maybe different people have different moralities. Maybe women have a different morality than men. Or maybe each person has multiple moralities they can reach. But these moralities are still quite limited in what they can be. Maximizing paperclips isn't any kind of morality.

If morality doesn't exist, then moral reasoning will never settle on one answer, and will wander a space that, while still limited, is too vague to practically call morality.

edited 30th Mar '14 2:48:06 PM by supermerlin100

The road to exploding boats is paved with good intentions.
 2263 demarquis, Sun, 30th Mar '14 2:53:30 PM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
Do you mean a universal morality?
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Yasu!
@Demarquis: Universal to humans. It is unlikely that most aliens possess a utility function compatible with morality.
When life gives you lemons, burn life's house down with the lemons.
[up][up] No, I covered relativistic morality too. Though I think there's only one morality.

[up][nja]

edited 30th Mar '14 3:34:39 PM by supermerlin100

The road to exploding boats is paved with good intentions.
 2266 demarquis, Sun, 30th Mar '14 6:26:04 PM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
Oh, so you mean a set of criteria defining what beliefs can qualify as a "morality"? Something more constrained than "the pursuit of self-interest"?
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
"Morality" here is being used to refer to a small set of stable utility functions that humans can reasonable reach (without discounting or misrepresenting their own intuitions).

Modified humans might reach different stable utility functions, but those aren't moralities. They are xyztles, though some xyztles might be moralities.

The word "morality" has a fixed reference (or references) that is not dependent on humans, but which we assign a label to because we find it helpful to.
The road to exploding boats is paved with good intentions.
 2268 demarquis, Mon, 31st Mar '14 8:12:43 AM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
Is English a second language for you? [sincerity mode]

Since utility functions depend upon what preferences an individual adopts, and different people adopt vastly different moral preferences, I don't think your definition helps very much. What preferences humanity could adopt in order to achieve some hypothetical level of consensus is not a very interesting question (because it is extremely unlikely to happen just for that reason).
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Yasu!
Humans' utility functions are slightly different, but they can't be just anything. A human will never be a paperclip maximizer, for example.

Morality can be defined without any dependence on human preferences, narrowing the space of moral utility functions down to a very small one.
When life gives you lemons, burn life's house down with the lemons.
[up][up] The problem is that those multiple sets of moral preferences come from mistaken generalizations. If they actually logically followed, then why shouldn't they be acted on? If there's a reason to do otherwise, then a mistake must have been made somewhere.

It could accurately be said that a paperclip maximizer should maximize paperclips. There's no logical reason it should do what is moral. It also shouldn't just sit there doing nothing talking about how meaningless life is.

When we talk about meaning, we are pointing at something outside of ourselves, but something that most possible minds have no reason to care about.
The road to exploding boats is paved with good intentions.
 2271 demarquis, Mon, 31st Mar '14 2:36:23 PM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
In what sense can a moral generalization be mistaken? What basis do you have for saying that people who hold a given moral preference do not act on it? Two people can hold contrasting preferences, and each believe in them equally strongly. There is no a priori reason to believe that there is one set of consistent preferences that should compel people more strongly than any other.
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Take an easy one: maximizing happiness. Sounds great, right? But this would imply that, if presented with the choice, you should press a button that turns all of the universe into mindlessly happy brains and their life support, even if this means destroying all free will, all individuality, all beauty, all future acts of love, all challenge, all striving, all sense of purpose, fun, creativity, etc.

The mistake made here is pretty obvious. They thought happiness had everything covered. Of course if you were trying to make people happy you would give them things like freedom and fun.

Imagine you make an AI, but before you write in its utility function, you shatter the function into a hundred vague impressions. The AI doesn't know how many, or which ones, it is running at a given time. Now make it tend to cache what its authority figures say its function is, and to some extent forget where it got the idea. Also make it hate admitting it's wrong, even to itself. That's what we're like. The only difference is that there isn't a guarantee there is a single function, or even a reasonably small set of functions.
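A toy sketch of that analogy (all numbers and names below are hypothetical, not anything from the thread): an agent whose "real" utility function is a fixed sum of a hundred small terms, but which only consults a random handful of them on any one judgement, can rank the same two options differently from one evaluation to the next, even though nothing about the underlying function has changed.

```python
import random

random.seed(0)
N_TERMS = 100
# Each option scores differently on each of the hundred vague impressions.
option_scores = {
    "A": [random.gauss(0, 1) for _ in range(N_TERMS)],
    "B": [random.gauss(0, 1) for _ in range(N_TERMS)],
}

def judged_value(option, n_consulted=10):
    """Evaluate an option using only a random subset of the value terms."""
    consulted = random.sample(range(N_TERMS), n_consulted)
    return sum(option_scores[option][i] for i in consulted)

verdicts = ["A" if judged_value("A") > judged_value("B") else "B" for _ in range(20)]
print(verdicts)  # the verdict can flip between judgements despite a fixed utility function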

edited 31st Mar '14 7:20:12 PM by supermerlin100

The road to exploding boats is paved with good intentions.
 2273 demarquis, Tue, 1st Apr '14 11:34:54 AM from Hell, USA Relationship Status: Buried in snow, waiting for spring
Who Am I?
In what sense is a preference for the experience of happiness or satisfaction at the expense of individual freedom or any other unique value set a "mistake"? If that's what someone wants, then that's what they want.
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
 2274 Greenmantle, Tue, 1st Apr '14 1:51:04 PM from Albion Relationship Status: [TOP SECRET]
[up][up] Can I ask what you rate as most important: the happiness of the group or of the individual?
[up][up] The point is that nobody actually does. And the deal is lots of happiness in exchange for everything else. You aren't you after the deal; no one is. Everyone is dead, replaced by happiness engines. Anything and everything anyone ever cared about, except happiness, is gone.

Also I could have sworn you believed in objective morality?

edited 1st Apr '14 3:49:25 PM by supermerlin100

The road to exploding boats is paved with good intentions.