
Harry Potter And The Methods Of Rationality


DeMarquis Since: Feb, 2010
#7302: Mar 27th 2015 at 8:45:24 PM

"That's not correct. It is equivalent. That's the entire point of probability."

It's not clear to me that probability has a 'point', or even what that means, so I can't argue that; but you can't calculate an average based on a single event.

"...90% probability of saving 500 people and a 10% probability of losing 500 against 100% probability of saving 400 people and 100% probability of losing 100 people."

OK, fine, but that doesn't affect my argument. It's a matter of choosing between a 10% chance of killing 500 people vs. a 100% chance of killing 100. I contend that there's no objectively correct answer, so we are free to follow our intuition.

In my variation of the dilemma, what I was trying to describe is an alternative in which you have a 90% chance of saving the entire human race along with a 10% chance of losing everyone vs. a 100% chance of saving 400 people while also killing everyone else. Playing around with the size of the different groups involved is an interesting exercise in priority setting.

crazysamaritan NaNo 4328 / 50,000 from Lupin III Since: Apr, 2010
#7303: Mar 28th 2015 at 6:07:09 AM

you can't calculate an average based on a single event.
Errr, no, the average was calculated beforehand. In this exercise, the 90% probability is a fact as much as gravity is. So the average number of lives saved is an objective fact, not subject to perspective.

I contend that there's no objectively correct answer, so we are free to follow our intuition.
There is a list of objective criteria, and the typical debate around these problems is over which criterion is best. Obviously, you do not consider average # of lives saved to be a sufficient criterion. You want the "safe" option.

In my variation of the dilemma, what I was trying to describe is an alternative in which you have a 90% chance of saving the entire human race along with a 10% chance of losing everyone vs. a 100% chance of saving 400 people while also killing everyone else. Playing around with the size of the different groups involved is an interesting exercise in priority setting.
"Entire human race" creates a new quality that isn't present in the old exercise. If you change 500 to 5,000, or to 5 billion, but keep the 100% probability at 400, then the answer is still to take the 90% chance.

Here's a new thought for you,

  • 500 lives at risk: You have a 90% chance of saving 500 vs. a 95% chance of saving 400 people.

Link to TRS threads in project mode here.
DeMarquis Since: Feb, 2010
#7304: Mar 28th 2015 at 4:50:22 PM

I wasn't questioning the validity of the 90%/10% figure. Perhaps there was an entire series of trials that established it. But according to the wording of the problem, this is only going to happen to us once. Therefore there is no alternative we can choose that will save 450 people, on average or otherwise. So comparing a 100% chance of saving 400 people (a 100% chance of killing 100) to a "highest expected utility" (Higurashimerlin's words), in which a 90% chance of saving 500 people is taken as equivalent to saving 450 people on average, is meaningless. It wasn't meaningless across all the previous trials, but it is meaningless on the one trial we get to decide, because there will only be one event, and we can't strive to save an average number of people. That makes no sense.

ashnazg Since: Dec, 2009
#7305: Mar 28th 2015 at 11:34:10 PM

[up]People consider "average values" in decisions all the time, even if they're not doing it consciously and deliberately. Take insurance, for example: choosing to buy insurance means that you believe your average expected utility from doing so is positive, despite the fact that the probability of you just paying for it without ever making a claim is pretty high. (Insurance is also probably a good example of a nonlinear utility function for money, since the monetary expectation value of insurance is almost certainly negative for you; the company wouldn't profit otherwise.)
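A quick numerical sketch of that nonlinear-utility point. All the numbers below (wealth, premium, loss size, loss probability) are made up purely for illustration, and log utility is just one conventional concave choice:

```python
import math

# Hypothetical numbers, chosen only for illustration.
wealth = 50_000   # current wealth
premium = 500     # cost of the insurance policy
loss = 40_000     # size of the insurable loss
p_loss = 0.01     # probability the loss occurs

# Monetary expectation: buying insurance loses money on average.
ev_insured = -premium             # always pay the premium
ev_uninsured = -p_loss * loss     # small chance of a big loss

# But with a concave (log) utility for wealth, the insured option
# can still come out ahead in expected utility.
u = math.log
eu_insured = u(wealth - premium)
eu_uninsured = (1 - p_loss) * u(wealth) + p_loss * u(wealth - loss)
```

With these numbers the insured option has the lower monetary expectation but the higher expected utility, which is the sense in which buying insurance can be rational.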

DeMarquis Since: Feb, 2010
#7306: Mar 29th 2015 at 12:20:29 PM

That's not parallel at all. For one thing, you aren't comparing two different alternatives, each of which determines the percent chance of different outcomes happening to you. Insurance is just a matter of determining the probability of something bad happening to you and then paying some money ahead of time to make sure you're covered. Here we know the probabilities associated with two different choices, and we are trying to select which probability we wish to expose ourselves to. The point I'm making is that our outcomes in the future are not going to be averaged over anything.

supermerlin100 Since: Sep, 2011
#7307: Mar 29th 2015 at 4:04:51 PM

They're so-to-speak averaged over possible futures.

Plan A can't be better, because otherwise it would be better than a 99.99999999999999999% chance of saving 500. And there would be a big difference between saving 400 with 100% certainty vs. saving them with 99.9999999999999999999999999999999999999999999999999999999999999999999% certainty.

So if we replace 100 with 99.(a lot of 9s), plan A being right implies the benefit of saving a fixed number of lives goes down with the total number of lives saved, like money. Or that there's at least a discontinuity at LS = 0.

But outside of instrumental concerns, this doesn't seem to make any sense.

edited 29th Mar '15 4:06:07 PM by supermerlin100

crazysamaritan NaNo 4328 / 50,000 from Lupin III Since: Apr, 2010
#7308: Mar 29th 2015 at 4:42:55 PM

So comparing a 100% chance of saving 400 people (100% chance of killing 100) to a "highest expected utility" (Higurashimerlin's words) in which a 90% chance of saving 500 people is taken as equivalent to saving 450 people on average is meaningless.
You've misunderstood ~Higurashimerlin's words.

Please analyze the new exercise for me, and tell me which option is better:

  • 500 lives at risk: You have a 90% chance of saving 500 vs. a 95% chance of saving 400 people.

DeMarquis Since: Feb, 2010
#7309: Mar 29th 2015 at 6:38:01 PM

Actually, I never stated which option is better, merely that the better option is not objectively obvious. Which of the options is better depends upon the negative weight one places on risk, which is a subjective opinion. Assuming that if one picks the 400 and that approach fails, everyone dies, then there is less of a difference between the two, and one might very well attempt to save all 500. But this is still a matter of subjective preferences.

SparklingDude Since: Jun, 2012
#7310: Mar 29th 2015 at 11:00:09 PM

[up][up] depends on how you rate survival of the individuals to survival of the group. if they are just some people, then 500*0.9 > 400*0.95. if they are the last of (humanity / my nation / ...) or in other way special as a group, i might pick A.
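For reference, the arithmetic behind that comparison, treating every life as equally weighted (the risk-neutral count, which is the very assumption under debate):

```python
# Expected lives saved for each option in the new 500-lives exercise.
option_a = 0.90 * 500   # 90% chance of saving all 500
option_b = 0.95 * 400   # 95% chance of saving 400

# option_a is 450.0 and option_b is 380.0, so A wins on this criterion.
```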

crazysamaritan NaNo 4328 / 50,000 from Lupin III Since: Apr, 2010
#7311: Mar 30th 2015 at 7:24:41 AM

Actually, I never stated which option is better, merely that the better option is not objectively obvious. Which of the options is better depends upon the negative weight one places on risk, which is a subjective opinion.
I already replied to that: there are many objective standards that are debated, and which objective standard is better is the subjective debate. You expressed a preference for one of the options.

You have now clearly stated what I was guessing about you earlier: You wish to minimize risk. That is an objective standard. To maintain consistency with your standard, if you had a 99.9% chance of saving all 500 in the population, you should dismiss it in favour of a 100% chance to save 1 random person in the population of 500. Risk, by itself, has no negative weight.

depends on how you rate survival of the individuals to survival of the group
Agreed: please see post 7292.

supermerlin100 Since: Sep, 2011
#7312: Mar 30th 2015 at 10:50:59 AM

Risk aversion can be consistent without outside concerns, if you have a fixed discount rate.

To make saving 400 lives better than saving 500 with a probability of 0.9, you need a discount rate x where 0.9*(sum_{k=400}^{499} c*x^k) - 0.1*(sum_{k=0}^{399} c*x^k) < 0.

But I see no reason to do that.
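That condition can be checked directly. The utility function below follows the post's sum, U(n) = sum_{k=0}^{n-1} c*x^k; the particular values of x tried afterwards are my own choices:

```python
# Utility of saving n lives with a per-life discount rate x:
# each additional life saved is worth x times the previous one.
def utility(n, x, c=1.0):
    return sum(c * x**k for k in range(n))

def certain_400_wins(x):
    # The certain 400 beats the 90% gamble on 500 exactly when
    # U(400) > 0.9 * U(500).
    return utility(400, x) > 0.9 * utility(500, x)

# With no discounting (x = 1) the gamble wins: 450 > 400.
# With a steep enough discount (e.g. x = 0.99), the certain option wins.
```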

higurashimerlin Since: Aug, 2012
#7313: Mar 30th 2015 at 11:33:16 AM

....So this and the Madoka thread are where all the people from the philosophy thread went. Except more is getting done here and in the Madoka thread than in the philosophy thread. Eliezer would approve.

When life gives you lemons, burn life's house down with the lemons.
TheHandle United Earth from Stockholm Since: Jan, 2012 Relationship Status: YOU'RE TEARING ME APART LISA
#7314: Mar 30th 2015 at 2:37:49 PM

When one says "philosophy", one is primed to think of useless, esoteric, 'pure' dick-waving contests of intellectual impressiveness, and obsessing over the concerns of dead white men. Not of attempting to solve complex problems that we genuinely care about, fictional though they may be.

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
higurashimerlin Since: Aug, 2012
#7315: Mar 30th 2015 at 2:56:46 PM

Primed yes. But "some" people do manage to do actual good philosophy. And don't argue about solved problems.

storyyeller More like giant cherries from Appleloosa Since: Jan, 2001 Relationship Status: RelationshipOutOfBoundsException: 1
higurashimerlin Since: Aug, 2012
#7317: Mar 30th 2015 at 7:48:41 PM

You might need to go back a "few" pages to find that stuff.

DeMarquis Since: Feb, 2010
#7318: Mar 31st 2015 at 6:52:55 AM

@Crazysamaritan: I didn't express a preference, only that a number of potential preferences exist, each of which is equally valid. In any case, I suspect that you are reading something into the scenario that I do not: one way to read it is that we know which 400 are going to live and which 100 are going to die. In that case, the risk associated with that option is zero. If that isn't the case, then, as I stated, if you are trying to save one particular person, saving the 500 is the way to go. If you are not, then pick based on your intuition, because there is no metric that clearly supports one over the other.

If by "save 1 random person in the population of 500" you are referring to the idea that saving 400 people kills 100 people, and the families and friends of those people experience a loss: yes, they do, but so do the families and friends of the 500 people you have a 10% chance of killing if you select that option. This doesn't help us decide.

crazysamaritan NaNo 4328 / 50,000 from Lupin III Since: Apr, 2010
NaNo 4328 / 50,000
#7319: Mar 31st 2015 at 5:08:18 PM

one way to read it is that we know which 400 are going to live, and which 100 are going to die.
That conclusion actually works with the way I phrased it, although I did not intend it:
  • The natural reaction is to then want to treat 400 patients with the 100% medicine and the other 100 with the 90% medicine. So the next step is to clarify that the medicine that can save 400 people is diluted (by mixing with more experimental medicine) in order to save 500 people.
As I said before, the wording is definitely an issue. The usual thought experiment intends that the 400 people are saved randomly from the 500 population, because the "loved one" aspect is always randomized. You can't choose which people to save in the "400 live 100% of the time" scenario.

Mathematically, if your priority is to save the most lives, then the "400 live 100% of the time" scenario is less effective.

DeMarquis Since: Feb, 2010
#7320: Mar 31st 2015 at 5:21:49 PM

How would you demonstrate that?

crazysamaritan NaNo 4328 / 50,000 from Lupin III Since: Apr, 2010
NaNo 4328 / 50,000
#7321: Mar 31st 2015 at 5:23:24 PM

  1. 400x100%=400
  2. 500x90%=450

DeMarquis Since: Feb, 2010
#7322: Mar 31st 2015 at 5:27:53 PM

Sigh. We've been over this. There is no option that allows you to save 450 people. Multiplying 500 times .9 is misleading because that represents an average across multiple events; i.e., if you selected the option that saves 500 people 90% of the time, and kills everyone 10% of the time, then on average you would save 450 people. But in this case we are never going to get the opportunity to achieve that average. It's 500 or nothing.
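The single-event point being argued here is easy to visualize with a quick simulation: an individual draw of the 90% option never yields 450, even though the average over many draws approaches it. The seed and trial count below are arbitrary choices:

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility

def one_trial():
    # The 90% option on a single event: either all 500 are saved, or nobody is.
    return 500 if random.random() < 0.9 else 0

outcomes = [one_trial() for _ in range(100_000)]

# Every individual outcome is 0 or 500; only the long-run mean tends toward 450.
mean = sum(outcomes) / len(outcomes)
```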

crazysamaritan NaNo 4328 / 50,000 from Lupin III Since: Apr, 2010
#7323: Mar 31st 2015 at 6:38:56 PM

No, you never save 450 people. That's a mathematical average of Risk vs Reward. You may save zero people, or you may save everyone. The risk multiplied by the reward results in a greater benefit for the 90% scenario than the 100% scenario.

If you choose to minimize risk, then having a 100% chance of randomly saving 1 person out of 500 is better than a 90% chance of saving all 500 people. Because the goal is not "potential lives saved" the goal is "minimize risk".

DeMarquis Since: Feb, 2010
#7324: Apr 1st 2015 at 9:58:40 AM

"That's a mathematical average of Risk vs Reward."

Yes, it is, but just because, in the past, we were presumably able to achieve a 95% save rate across some series of events that are now over, that does not mean we will be able to achieve the same rate of success in future trials. Because there will not be a series of events to average over in the future, only a single event, pointing out that the chance of any one random person being saved is 95% next time is misleading. It's an artifact of the way statistics are ordinarily calculated.

I'll try to explain. The 95% figure is arrived at by taking the probability of being able to save any one specific person, which actually is .95, and multiplying that by 500, the total number of people. But that's misleading in this case, because we aren't allowed to save 95% of the people. If the scenario hadn't prevented us from doing that, then the correct solution would be obvious. And that's normally what would happen in real life. We are used to seeing situations, say in health care, in which one option has a certain chance of saving X% of an at-risk population, and another option has a different, higher chance. We are so used to this that we tend to use the same decision criteria even when the circumstances have changed.

But in this case the fact that we have a 95% chance of saving any one specific person doesn't translate directly into a policy that we can apply to the entire group, because we only get one opportunity. We cannot reduce the 90% chance of saving everyone vs. a 10% chance of losing everyone into saving an average of 95% of the people. In fact, there is no metric we can use to directly compare to the other scenario, in which we can be 100% certain to save 80%. Is certainty worth sacrificing 10% of the people? That's a matter of opinion.

edited 1st Apr '15 9:59:56 AM by DeMarquis

ashnazg Since: Dec, 2009
#7325: Apr 1st 2015 at 10:47:04 AM

For the insurance example, there certainly is a choice being made - the choice is between buying insurance and not buying insurance. The former gives a large risk of a small loss (with negative monetary expectation value) while the latter gives a small risk of a huge loss (with positive monetary expectation value). It's slightly different from the "save 400 people" question in that neither option for the insurance question gives 100% probabilities, but 100%-probability cases can be considered as an edge case of a general decision framework for handling arbitrary probabilities.

While I do somewhat agree that comparing a certain outcome with a gamble of some expectation value does smack of comparing incomparable objects, I'd say that claiming it's impossible to assign a metric would be a stretch, since it would paralyse us in the vast majority of decisions. The very fact that we can make decisions on questions like this already indicates that we're doing some kind of utility calculation and going with the option that gives us the higher expected utility. I'm inclined to say that you can compare them by attaching a utility penalty to risk, or by using nonlinear utility functions with respect to the number of people saved. The part which can vary from person to person is how much utility penalty to attach to risk, and/or the extent of nonlinearity in the utility function.

edited 1st Apr '15 10:49:36 AM by ashnazg
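One way to sketch ashnazg's suggestion: score each option by expected utility, where the utility over lives saved is either linear (risk-neutral) or concave (risk-averse). The log(1 + n) form below is an arbitrary illustrative choice, not a claim about the right function:

```python
import math

def expected_utility(lottery, u):
    # lottery: list of (probability, lives_saved) pairs
    return sum(p * u(n) for p, n in lottery)

gamble = [(0.9, 500), (0.1, 0)]   # 90% save all 500, 10% save none
certain = [(1.0, 400)]            # save 400 for sure

linear = lambda n: n                  # risk-neutral: gamble scores 450 vs 400
concave = lambda n: math.log(1 + n)   # risk-averse: the certain option wins
```

How concave the function is, or how large a penalty to attach to risk, is exactly the subjective part both sides of this thread have been circling.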

