In other words, having someone to desperately protect won't make you a surgeon, but, when you're operating on that someone, you won't just use standard procedure and then wash your hands if it fails. You're going to think about what you're doing.
Skill is a learned attribute, and can be offset slightly by natural talent. I think Yudkowsky is talking about an aspect of sapience, what makes an observer say "that creature is smart", irrespective of their IQ. I think Yudkowsky is talking about creativity.
Rationality is a tool that can help us tell good ideas apart from bad ideas, truth from lies. But in order to use that tool, we need to generate those ideas. Neither the first open-heart surgery nor the first successful open-heart surgery was likely performed by the best surgeon in the world, and neither surgeon could have learned the procedure in medical school. The difference is that the successful surgeon had a better idea for the surgery than the ones who failed. Creativity is a power that comes from desperation. That has been acknowledged since the time of Aesop: "Necessity is the mother of invention."
And apparently a Protectorate is a more imperious and urgent necessity than mere personal survival.
For some people, even a threat to their own life isn't enough to get them to think. Some people do dangerous things, like driving drunk, to avoid the minor awkwardness of just asking someone else for help. You need something to protect to force yourself to think above the level of the art taught to you. It is similar to the concept of heroic responsibility he mentions earlier in the story.
The problem with the 400 vs. 500 people dilemma is that 1) the dilemma seems worded in such a way as to lead the casual reader into making the incorrect decision, and 2) I don't agree with EY's assessment of the correct answer. I think it's at least arguable that one could prefer the 100% chance of saving 400 over the 90% chance of saving 500 (or else no one), provided you are not trying to save any one particular person. That changes the whole issue. I think it also makes a difference whether you are trying to save 400 vs. 500 random strangers you will never meet, or 400 vs. 500 people standing right in front of you.
I agree that once you already are a surgeon, having a reason to operate that makes a huge difference to you personally can motivate you more strongly. But it can also create so much pressure that performance falls dramatically. It's a fine line between motivation and stress. That's why surgeons aren't allowed to operate on their own relatives.
I believe that research indicates that creativity is highest in group situations where stress is low.
Really, if we're talking about the effect of high levels of emotional arousal, then while "rising to the challenge" is a thing, I believe sudden, unexpected threats to something you greatly value will do more harm than good, most of the time.
I keep reading Yudkowsky's version as "save 100% of people" or "save 90% of people". The probability scenario must make it explicit at the start that 500 people are at risk of dying. One medicine will save 400 of the patients 100% of the time, and another medicine will save all patients 90% of the time. The natural reaction is to then want to treat 400 patients with the 100% medicine and the other 100 with the 90% medicine. So the next step is to clarify that the medicine that can save 400 people is diluted (by mixing it with more experimental medicine) in order to cover 500 people.
So you have the choice of medicine that works 100% of the time to save 400 lives, or you can modify the medicine to work 90% of the time, but it saves 500 people. Which do you do?
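The expected-value arithmetic behind the choice is simple; a minimal sketch using the numbers from the thread's dilemma:

```python
# Expected number of lives saved under each option in the dilemma.
certain = 1.0 * 400            # option A: 400 lives, guaranteed
gamble = 0.9 * 500 + 0.1 * 0   # option B: 500 lives at 90%, nothing at 10%

print(certain, gamble)  # 400.0 450.0 — the gamble has the higher expectation
```

Whether that expectation settles the question is, of course, exactly what the rest of the thread argues about.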
That's right, and I happen to think that wanting to avoid a 10% chance of killing everyone is a coherent choice.
I am aware of that kind of training, and it works quite well, however it tends to be extremely context specific (because it relies on teaching you the procedures to follow so well you don't have to think about them). The training that airline pilots receive to deal with emergencies is different than the training surgeons receive for the same purpose.
It is kind of dumb. I take the 90% chance to save everyone. Following that method saves more people on average than playing everything safe, and it has the highest expected utility! Taking the 100% chance of saving only four hundred is clearly the wrong choice.
De Marquis #7281: Right, it depends. I love this illustration with hypothetical games and money: you can choose between two games, A and B. A earns you 4 dollars with 100% certainty. B earns you 5 dollars with 90% certainty, and 0 otherwise. Obviously, B has the higher expected outcome.

Now let the stakes rise: when gambling for 4 or 5 million, game theory still gives the same answer, but...
Depends on how much money you have to lose. When you factor in interest rates and opportunity cost, debt carries a lot more negative utility than the nominal loss.
Game theory accounts for that too, you know, just the same as electromagnetic theory can predict behaviour that just using Ohm's Law you wouldn't expect. We need to go deeper.
There's no loss, only two possible gains (nice, isn't it?).

A simple utility function determined by the expected monetary outcome is not of much use (4 vs. 4.5 in both cases); you need the context. Same for the people.

(But yes, once you construct a utility function for yourself, game theory serves you well again.)
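One way to make that concrete: with a concave (risk-averse) utility function, the preference can flip as the stakes rise, even though the expected monetary value always favors the gamble. A hypothetical sketch, using log utility purely for illustration:

```python
import math

def expected_utility(p, payout, utility):
    """Expected utility of winning `payout` with probability p, else nothing."""
    return p * utility(payout) + (1 - p) * utility(0)

u = lambda x: math.log1p(x)  # illustrative concave utility, with u(0) = 0

# Small stakes: $4 for sure vs. $5 at 90%.
small_sure = expected_utility(1.0, 4, u)
small_gamble = expected_utility(0.9, 5, u)

# Large stakes: $4 million for sure vs. $5 million at 90%.
big_sure = expected_utility(1.0, 4_000_000, u)
big_gamble = expected_utility(0.9, 5_000_000, u)

# Under this curve the gamble wins at small stakes,
# but the sure thing wins at millions.
```

The crossover point depends entirely on the shape of the utility curve, which is the "context" being argued for here.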
There is no average, you only get one shot. And because you only get one shot a 90% chance of saving 500 people is not equivalent to an average utility of 450 people. The 90% is a probability of a single event happening. 450 people is an average proportion of a group over a series of events. Those are not equivalent metrics. There is a 90% probability of saving 500 people, and a 10% probability of losing everyone. There is nothing else to that option. Against that is the certainty of saving 400 people.
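The one-shot point can be illustrated by simulation: over many repetitions the gamble averages about 450 saved, but any single trial yields exactly 500 or 0, never 450. A rough sketch:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def single_trial():
    """One realization of the 90%-for-500 gamble: all or nothing."""
    return 500 if random.random() < 0.9 else 0

outcomes = [single_trial() for _ in range(100_000)]

# No individual outcome is ever the "average" of 450...
assert set(outcomes) == {0, 500}

# ...even though the long-run mean converges toward 450.
mean = sum(outcomes) / len(outcomes)
```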
If you ask me, however, if there is a number of people in the 90% chance of saving category that would change my assessment, yes there probably is. If it were a 90% chance of saving the entire human race, vs. a 100% chance of saving only 400 people, well that would be different. I'm not sure myself where the line is drawn.
a simple utility function determined by the expected monetary outcome is not of much use (4 vs 4.5 for both cases). you need the context
The typical response is to maximize expected utility, which is obviously not linear in money, especially at large scales. Of course you can't actually measure utils in most cases, but then again, in real life people aren't perfectly rational either.
I rate the survival of the human race above the life of any given human.
Doing so somehow always looks bad. People call you heartless, inhuman. You're supposed to Always Save the Girl...
Oh. Funny. Hermy was already saved...
Ugh, I really hate stuff like the climax of the first Spider-Man movie, where he endangers an entire schoolbus of kids to save his girlfriend.
Of course, I've heard that the first Superman movie was much worse in that regard. Superman saves a ton of people from various disasters, and then rewinds time and saves his girlfriend instead.
Objection. He stopped the event from happening at all. Nobody died. The nukes never went boom, taking the San Andreas Fault and all of California with it. So no-one was put in danger by the consequences of the nuclear initiations, because they never initiated.
(Initiation's what insiders call what happens when a nuclear device goes pop.)
Objection. He stopped the event from happening at all. Nobody died
That makes sense. I haven't seen the movie, I was just going by what Cracked said. Which isn't always reliable.
@ Handle - Not quite an outsider. I had several of them pointed at me - stuck on big pointy SS-18 Satan missiles which were aimed in my general direction. Comes with living in the middle of one of the bullseyes of the strategic target map of Europe. My old job's left me fatalistic about that kind of thing.

I think I see Eliezer's point.
Save 400 lives, with certainty.
Save 500 lives, 90% probability; save no lives, 10% probability.
You may be tempted to grandstand, saying, "How dare you gamble with people's lives?" Even if you, yourself, are one of the 500—but you don't know which one—you may still be tempted to rely on the comforting feeling of certainty, because our own lives are often worth less to us than a good intuition.
But if your precious daughter is one of the 500, and you don't know which one, then, perhaps, you may feel more impelled to shut up and multiply—to notice that you have an 80% chance of saving her in the first case, and a 90% chance of saving her in the second.
And yes, everyone in that crowd is someone's son or daughter. Which, in turn, suggests that we should pick the second option as altruists, as well as concerned parents.
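The 80%-vs-90% arithmetic above, spelled out: under the certain option, 400 guaranteed survivors out of 500 means any particular person in the crowd has a 400/500 chance of being among the saved, versus a flat 90% under the gamble.

```python
# Probability that one specific person among the 500 survives, per option.
p_certain = 400 / 500          # 400 guaranteed survivors out of 500 -> 0.8
p_gamble = 0.9 * (500 / 500)   # everyone survives with probability 0.9

print(p_certain, p_gamble)  # 0.8 0.9
```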
My point is not to suggest that one person's life is more valuable than 499 people. What I am trying to say is that more than your own life has to be at stake, before a person becomes desperate enough to resort to math.