History Headscratchers / StarTrekTheNextGenerationTechnology

Is there an issue? Send a MessageReason:
None

Added DiffLines:

** In "The Offspring", while Data does say that he "cannot" use contractions, in the same episode he says that using contractions is something his "program has never mastered". The latter implies that he has tried but cannot do it well, which makes his "cannot" statement oddly imprecise for Data.

Added: 431

Changed: 31

Is there an issue? Send a MessageReason:
None



to:

[[folder: Whither spacesuits?]]
Why do they never wear spacesuits or environment suits of any kind on away missions? I can't recall a single time in the TNG TV series where they did so, yet offhand I remember at least two instances in TOS: once with the Tholians, and again in "The Naked Time". In "The Naked Time", the Enterprise became infected because a crewman breached his suit. In "The Naked Now", the crew became infected because they didn't even bother with a suit.
[[/folder]]
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** And yet, the Tox Uthat in "Captain's Holiday" is treated as a game-changing superweapon because it can stop nuclear fusion in a star. But isn't that reasonably easy to do anyway?
Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** You're thinking of inheritance, I think, and that's different in that each "link" in the "chain" inherits the properties of those above it. You might have consumable -> beverage -> tea, in which you don't have to define how to drink tea, because you've defined how to drink a beverage, and the tea inherits that definition.
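The "chain" described above is ordinary class inheritance. A minimal Python sketch (class and method names are made up for this illustration, not drawn from the episode or any real API):

```python
class Consumable:
    """Top of the chain: anything that can be ingested."""
    def consume(self):
        return "consumed"

class Beverage(Consumable):
    """A beverage is a consumable; it inherits consume() and
    adds only what is specific to drinkable things."""
    def drink(self):
        return "sipped"

class Tea(Beverage):
    """Tea defines nothing about drinking: drink() is inherited
    from Beverage, and consume() from Consumable."""
    pass

print(Tea().drink())    # sipped
print(Tea().consume())  # consumed
```

Each link defines only what is new at its level; everything above it is inherited for free, which is exactly why you never have to tell the program how to drink tea specifically.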

Added: 689

Removed: 689

Is there an issue? Send a MessageReason:
This paragraph seems to have been placed wrong.

*** My understanding is that there was the practical matter of set storage to think about - since there was no regular cast member for engineering in season one, engineering hadn't been expected to be a major part of the series. But during the filming of Encounter at Farpoint, they got told effectively "it's built now, or it never gets built" (leading to the couple of scenes where characters stroll through engineering for no particular reason). With the Main Engineering set now taking up space as a standing set, alongside the bridge, sickbay, and the various smaller rooms, something else had to go, and it ended up being the battle bridge set that got its regular appearances axed.
Is there an issue? Send a MessageReason:
Did someone go through and change all the spellings of "Moriarty"? Anyway, this is how it's supposed to be spelled.


* This occurred to me originally while watching the episode "Elementary, Dear Data". In that episode, Geordi instructs the computer on the holodeck to create an adversary "capable of defeating Data". This results in a self-aware hologram. If the computer is capable of maintaining a self-aware, sentient artificial intelligence as just one program which takes up a mere fraction of its resources, would that not mean the computer itself as a whole is capable of being a self-aware, thinking artificial intelligence? Also in the same episode, the computer can create extensive scenarios just with a couple of sentences of instructions. I find that a frightening capacity for creative thought for a computer. Now, I have seen it argued in tangentially related discussions above that Moriarity was merely a complex simulation of a person, not a person in itself. However, I would argue that he displays attributes which qualify him as a sentient being. The chief among those would be that he soon re-evaluated his goals and the function he was originally created for (besting Data) fell to the wayside, unless you interpret it ''very'' creatively.

to:

* This occurred to me originally while watching the episode "Elementary, Dear Data". In that episode, Geordi instructs the computer on the holodeck to create an adversary "capable of defeating Data". This results in a self-aware hologram. If the computer is capable of maintaining a self-aware, sentient artificial intelligence as just one program which takes up a mere fraction of its resources, would that not mean the computer itself as a whole is capable of being a self-aware, thinking artificial intelligence? Also in the same episode, the computer can create extensive scenarios just with a couple of sentences of instructions. I find that a frightening capacity for creative thought for a computer. Now, I have seen it argued in tangentially related discussions above that Moriarty was merely a complex simulation of a person, not a person in itself. However, I would argue that he displays attributes which qualify him as a sentient being. The chief among those would be that he soon re-evaluated his goals and the function he was originally created for (besting Data) fell by the wayside, unless you interpret it ''very'' creatively.



* The dissonance here derives from the definition of the word "capable". It has been shown repeatedly that ''programs'' (e.g. Moriarity, the Doctor) running ''on'' the ship's computer ''are'' capable of full sentience, and of meeting all the criteria used by Starfleet to define such. In the modern world of computing, we draw a much clearer distinction between hardware and software than most people did in the 1960's. A computer without any software is a piece of decor and nothing more. By simple virtue of having so much memory and processing power, starship computers ''can'' run sentient programs. However, perhaps as a safety feature, such programs are not ''supposed'' to be able to directly control the entire computer. Especially after the Moriarity incidents, the need to limit just how much control a sentient program can exert over the computer (and the ship) in the absence of at least a quasi-physical interface (i.e. a holographic projection) was taken into account. Another interesting contradiction that arises from that though is that while it could be argued that holograms require holographic emitters to sustain their existence, biological crews likewise require ''life support'' or else they would cease to exist (i.e. die) as readily as a deactivated hologram whose program was subsequently erased from the computer.

to:

* The dissonance here derives from the definition of the word "capable". It has been shown repeatedly that ''programs'' (e.g. Moriarty, the Doctor) running ''on'' the ship's computer ''are'' capable of full sentience, and of meeting all the criteria used by Starfleet to define such. In the modern world of computing, we draw a much clearer distinction between hardware and software than most people did in the 1960s. A computer without any software is a piece of decor and nothing more. By simple virtue of having so much memory and processing power, starship computers ''can'' run sentient programs. However, perhaps as a safety feature, such programs are not ''supposed'' to be able to directly control the entire computer. Especially after the Moriarty incidents, the need to limit just how much control a sentient program can exert over the computer (and the ship) in the absence of at least a quasi-physical interface (i.e. a holographic projection) was taken into account. Another interesting contradiction that arises from that, though, is that while it could be argued that holograms require holographic emitters to sustain their existence, biological crews likewise require ''life support'' or else they would cease to exist (i.e. die) as readily as a deactivated hologram whose program was subsequently erased from the computer.



*** Everyone seems to assume that the computer's Moriarity-simulacrum was actually sentient, but nothing (at least in the first of the two episodes dealing with Holo!Moriarity) seems to make that a necessary conclusion; a computer with such demonstrated natural-language processing facility as that one could certainly be excused for inferring that it had been asked not for a simple and ordinary holodeck challenge game, but rather something with a bit more meta-level play -- after all, the request was for an adversary capable of defeating not Sherlock Holmes, Data's character in the holodeck, but ''Data himself''. And for a computer which can directly do as many things in the real world as the Enterprise-D's can, we've seen ''plenty'' of times that there aren't any particular security protocols or sanity checks against, say, making a holodeck detective game more interesting by giving the villain character full knowledge of the true nature of his situation, and the ability to understand and directly affect ship systems.

to:

*** Everyone seems to assume that the computer's Moriarty-simulacrum was actually sentient, but nothing (at least in the first of the two episodes dealing with Holo!Moriarty) seems to make that a necessary conclusion; a computer with such demonstrated natural-language processing facility as that one could certainly be excused for inferring that it had been asked not for a simple and ordinary holodeck challenge game, but rather something with a bit more meta-level play -- after all, the request was for an adversary capable of defeating not Sherlock Holmes, Data's character in the holodeck, but ''Data himself''. And for a computer which can directly do as many things in the real world as the Enterprise-D's can, we've seen ''plenty'' of times that there aren't any particular security protocols or sanity checks against, say, making a holodeck detective game more interesting by giving the villain character full knowledge of the true nature of his situation, and the ability to understand and directly affect ship systems.



*** Data is unique because he's capable of the same degree of sentience in a much smaller package. Moriarity may be a sentient holographic life form, but he needs an entire holodeck matrix (possibly an entire ship's computer) to exist. Data is all that hardware compressed into a human-sized package. It's like the difference between a refrigerator-sized computer from the 70s and a modern laptop.
*** Not really; in a later episode (yes, [=TNG=] actually returned to a previous episode's hanging plot line and resolved it, try not to faint) it develops that a briefcase-sized device can contain enough computing power and memory not only to run the programs for Moriarity and his newly created love interest, but also to simulate an entire galaxy for them to explore, and enough battery power to last at least as long as the remainder of the characters' natural lifetimes.

to:

*** Data is unique because he's capable of the same degree of sentience in a much smaller package. Moriarty may be a sentient holographic life form, but he needs an entire holodeck matrix (possibly an entire ship's computer) to exist. Data is all that hardware compressed into a human-sized package. It's like the difference between a refrigerator-sized computer from the 70s and a modern laptop.
*** Not really; in a later episode (yes, [=TNG=] actually returned to a previous episode's hanging plot line and resolved it, try not to faint) it develops that a briefcase-sized device can contain enough computing power and memory not only to run the programs for Moriarty and his newly created love interest, but also to simulate an entire galaxy for them to explore, and enough battery power to last at least as long as the remainder of the characters' natural lifetimes.



*** That doesn't make any sense. It wasn't sentient because it was only smart? What I found quite interesting was that this Moriarity seemed much more human than Data. Data would probably fail a Turing-Test.
*** But non-sentient computers can pass a Turing Test, so what does that prove? Sentient or not, Moriarity was designed to perfectly mimic a human being, which Data was purposely not designed to do.

to:

*** That doesn't make any sense. It wasn't sentient because it was only smart? What I found quite interesting was that this Moriarty seemed much more human than Data. Data would probably fail a Turing test.
*** But non-sentient computers can pass a Turing test, so what does that prove? Sentient or not, Moriarty was designed to perfectly mimic a human being, which Data was purposely not designed to do.



*** MovingTheGoalposts must be hugely tempting to any society capable of creating life-like artificial people. Much of the value of sentience is in its mysterious quality, in our inability to recreated it or control it. Once you build a machine that can pass a Turing test, you would realize just how many tricks you put into it, how much cheating you had to do. If you did manage to pass a Turing test, it would probably seem like you'd do it by tricking people rather than making a really sentient machine, since all the mystery of sentience would be absent. To the person who understands the algorithm which produces the doctor's bedside manner, becoming emotionally attached to the doctor would seem foolish, like trying to give human rights to a fictional character. The doctor isn't a real person; he's a pretend person. That Moriarity was created as a real person was presented as an incomprehensible fluke, something which could not be recreated no matter how many times you ask the computer for another one.

to:

*** MovingTheGoalposts must be hugely tempting to any society capable of creating life-like artificial people. Much of the value of sentience is in its mysterious quality, in our inability to recreate it or control it. Once you build a machine that can pass a Turing test, you would realize just how many tricks you put into it, how much cheating you had to do. If you did manage to pass a Turing test, it would probably seem like you'd done it by tricking people rather than by making a really sentient machine, since all the mystery of sentience would be absent. To the person who understands the algorithm which produces the doctor's bedside manner, becoming emotionally attached to the doctor would seem foolish, like trying to give human rights to a fictional character. The doctor isn't a real person; he's a pretend person. That Moriarty was created as a real person was presented as an incomprehensible fluke, something which could not be recreated no matter how many times you ask the computer for another one.



** This is a minor point, but why did Moriarity insist on calling the ''Enterprise'' computer ''Mr''. Computer? It was done insistently enough that there was probably a good reason, but I've never been able to figure that reason out. I doubt that Majel Barrett-Roddenberry's voice could have been mistaken for masculine. Moriarity seemed to be aware he was on a ship of some kind, and probably would have been aware of the tradition of applying feminine pronouns to such vessels, so why ''Mr.'' Computer?

to:

** This is a minor point, but why did Moriarty insist on calling the ''Enterprise'' computer ''Mr''. Computer? It was done insistently enough that there was probably a good reason, but I've never been able to figure that reason out. I doubt that Majel Barrett-Roddenberry's voice could have been mistaken for masculine. Moriarty seemed to be aware he was on a ship of some kind, and probably would have been aware of the tradition of applying feminine pronouns to such vessels, so why ''Mr.'' Computer?



*** An in-universe Woolseyism? In the 24th century "Mr" has become gender-neutral ("Mr. Saavik", etc.), so Moriarity is using correct English as spoken by the holodeck users.

to:

*** An in-universe Woolseyism? In the 24th century "Mr" has become gender-neutral ("Mr. Saavik", etc.), so Moriarty is using correct English as spoken by the holodeck users.



*** Perhaps he attaches more importance to the ''Enterprise's'' non-humanity than to its speaking voice, and therefore refers to it by what (in Victorian English terms) would be the 'highest status' gender honorific. Attaching an honorific at all shows that Moriarity, at least, is not falling into the trap his creators have about dehumanizing an artificial being.

to:

*** Perhaps he attaches more importance to the ''Enterprise's'' non-humanity than to its speaking voice, and therefore refers to it by what (in Victorian English terms) would be the 'highest status' gender honorific. Attaching an honorific at all shows that Moriarty, at least, is not falling into his creators' trap of dehumanizing an artificial being.



*** Neither of those two follow from how I remember the episode (though I might have to watch it again to be sure). The ''first'' program Geordi and Data attempt is one specific mystery, which Data solves easily because he knows the ending already. Then after discussing it with Data and Pulaski, Geordi starts a ''second'' program which is supposed to be original (but isn't). He ''doesn't'' try to tailor this one specifically to beating Data, presumably because he thinks that just having an original mystery would be enough to force Data to play by the rules. The ''third'' attempt is the one where he wants the program specifically to beat Data, which results in Moriarity, but that's not the one I'm talking about. I'm talking about the ''second'' attempt, which Geordi requested to be original, but it didn't do that even though, again, the computer is powerful enough to create a sapient intelligence (and thus, theoretically, could recreate Arthur Conan Doyle to the best of its ability behind the scenes of the program and have him write an original story!)
** Yes, the non sentient computer that can only use what's been fed into it by sentient creative people isn't that great at being creative on its own. This really isn't all that shocking. Really all it did when Geordi requested a villain who could challenge Data was put Moriarity in "free range" mode like we see some other holograms, like the Lea Brahms one, do. He stopped just speaking lines the computer fed him and went "Oh, hey, look at that." It's quite possible he's not ''really'' sentient and just that the computer's no longer having him fail to react to things like Starfleet uniforms and the Arch like it has most of its holograms ignore them.
** This probably isn't the case (another troper could probably a dozen holes in this theory with a little thought), but I've always wondered if maybe Moriarity was never sentient, the computer just wrote a story in which Moriarity becomes self-aware and takes over the ship. And later, when Barkley accidentally accessed the program, the computer quickly wrote a sequel. Starfleet computers, being slightly less secure than the average doggy door, don't seem to mind putting ships in danger for no good reason, so messing with critical ship systems for the sake of its story doesn't seem all that out of character.
** This was originally my question, but after thinking about it some more I figure the above posters have a point. Even if it's capable of creating sapience, the computer isn't sapient itself, it just does what it's told to do, and likely does so in the easiest/most efficient way possible. Geordi (well, actually, looking at the episode again, it was ''Data'' who did the programming the second time around, which might explain the problem in itself) only ''asked'' for a "Sherlock Holmes-type problem" not written by Doyle. He didn't go into any more detail than that. Putting the maximum amount of creativity in would go beyond the scope of the program- something like creating a sapient holo-Doyle to write the story as he would have if he'd written another one would be right out (especially since, as we see, creating Moriarity created a very noticeable power surge to the holodeck.) ...also, use of holo-Doyle might just go against the stipulation that Doyle not have written the story.

to:

*** Neither of those two follows from how I remember the episode (though I might have to watch it again to be sure). The ''first'' program Geordi and Data attempt is one specific mystery, which Data solves easily because he knows the ending already. Then after discussing it with Data and Pulaski, Geordi starts a ''second'' program which is supposed to be original (but isn't). He ''doesn't'' try to tailor this one specifically to beating Data, presumably because he thinks that just having an original mystery would be enough to force Data to play by the rules. The ''third'' attempt is the one where he wants the program specifically to beat Data, which results in Moriarty, but that's not the one I'm talking about. I'm talking about the ''second'' attempt, which Geordi requested to be original, but it didn't do that even though, again, the computer is powerful enough to create a sapient intelligence (and thus, theoretically, could recreate Arthur Conan Doyle to the best of its ability behind the scenes of the program and have him write an original story!)
** Yes, the non-sentient computer that can only use what's been fed into it by sentient creative people isn't that great at being creative on its own. This really isn't all that shocking. Really all it did when Geordi requested a villain who could challenge Data was put Moriarty in "free range" mode like we see some other holograms, like the Leah Brahms one, do. He stopped just speaking lines the computer fed him and went "Oh, hey, look at that." It's quite possible he's not ''really'' sentient; the computer simply stopped having him ignore things like Starfleet uniforms and the Arch, the way it has most of its holograms do.
** This probably isn't the case (another troper could probably poke a dozen holes in this theory with a little thought), but I've always wondered if maybe Moriarty was never sentient, and the computer just wrote a story in which Moriarty becomes self-aware and takes over the ship. And later, when Barclay accidentally accessed the program, the computer quickly wrote a sequel. Starfleet computers, being slightly less secure than the average doggy door, don't seem to mind putting ships in danger for no good reason, so messing with critical ship systems for the sake of its story doesn't seem all that out of character.
** This was originally my question, but after thinking about it some more I figure the above posters have a point. Even if it's capable of creating sapience, the computer isn't sapient itself; it just does what it's told to do, and likely does so in the easiest/most efficient way possible. Geordi (well, actually, looking at the episode again, it was ''Data'' who did the programming the second time around, which might explain the problem in itself) only ''asked'' for a "Sherlock Holmes-type problem" not written by Doyle. He didn't go into any more detail than that. Putting the maximum amount of creativity in would go beyond the scope of the program - something like creating a sapient holo-Doyle to write the story as he would have if he'd written another one would be right out (especially since, as we see, creating Moriarty created a very noticeable power surge to the holodeck). ...Also, use of holo-Doyle might just go against the stipulation that Doyle not have written the story.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** It's more analogous to a person showing up with a cane that's also a high-tech multitool. A lot of people would ask how it works, no?

Added: 11

Changed: 223

Is there an issue? Send a MessageReason:
None


*** I'd have thought that if a bunch of people beam down to your planet consisting of a tall man with a furry face, a big angry guy with a weird forehead, a woman in a leotard and an albino man made of plastic, Geordi wearing a thing over his eyes would be the last thing I'd question.[[/folder]]

to:

*** I'd have thought that if a bunch of people beam down to your planet consisting of a tall man with a furry face, a big angry guy with a weird forehead, a woman in a leotard and an albino man made of plastic, Geordi wearing a thing over his eyes would be the last thing I'd question.
*** Maybe people are just too polite to mention it. After all, if you saw somebody walking with a cane, you wouldn't go up to them and say, "Hey, can't walk properly without that thing, huh?" At least I hope not.
[[/folder]]
Is there an issue? Send a MessageReason:
None


* I've had it explained to me that Data is simply unable to interpret the things he experiences as emotions. He frequently describes his thought process and stimuli to other people and is told that they're emotions--he can become self-reflective at a funeral or desire to engage in a ''SherlockHolmes''-themed holodeck adventure, but can't grasp that those are products of sadness and happiness, respectively. Hell, his "grandpa" outright tells him that the real problem is that he's too hung up on the questions to accept the answers, and even this doesn't connect for him.

to:

* I've had it explained to me that Data is simply unable to interpret the things he experiences as emotions. He frequently describes his thought process and stimuli to other people and is told that they're emotions--he can become self-reflective at a funeral or desire to engage in a ''Literature/SherlockHolmes''-themed holodeck adventure, but can't grasp that those are products of sadness and happiness, respectively. Hell, his "grandpa" outright tells him that the real problem is that he's too hung up on the questions to accept the answers, and even this doesn't connect for him.
Is there an issue? Send a MessageReason:
redlink


* Blowing up a star appears to be terrifyingly easy in the 'trek universe. In Generations we see that Soran is able to do it using a small missile and a base he set up and built himself. In DS9 a changeling is able to build a sun-buster using a few hour's uninhibited access to an industrial replicator and a standard-issue Starfleet runabout. In an episode of Star Trek, the Enterprise-D blows one up accidently with a couple of modified torpedoes(they were trying to fix it). Blowing up stares appears to be so easy that's even possible to do by mistake. So why aren't there more suns blowing up? If its that easy it seems like quite a few hostile governments, let alone terrorist organizations, should be busting suns left, right, and sideways.

to:

* Blowing up a star appears to be terrifyingly easy in the ''Trek'' universe. In ''Generations'' we see that Soran is able to do it using a small missile and a base he set up and built himself. In [=DS9=] a changeling is able to build a sun-buster with a few hours' uninhibited access to an industrial replicator and a standard-issue Starfleet runabout. In one episode, the Enterprise-D blows one up accidentally with a couple of modified torpedoes (they were trying to fix it). Blowing up stars appears to be so easy that it's even possible to do by mistake. So why aren't there more suns blowing up? If it's that easy, it seems like quite a few hostile governments, let alone terrorist organizations, should be busting suns left, right, and sideways.

Added: 293

Changed: 2

Is there an issue? Send a MessageReason:
None


*** Shielding her from discrimination was not Data's motivation exactly, so that's a decidedly post hoc perspective. Conversely, maybe the presence of a known android living benignly as a Federation scientist in the wake of the Mars attack might have helped blunt the anti-synthetic sentiment.



*** Still doesn't fly. If the computer created Moriarity, then it knows everything about Moriarity. Therefore it should logically know exactly what's necessary to defeat him.

to:

*** Still doesn't fly. If the computer created Moriarty, then it knows everything about Moriarty. Therefore it should logically know exactly what's necessary to defeat him.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** At this point it is also worth noting that, if we take the android discrimination of ''Star Trek: Picard'' into account, Data absolutely made the right decision.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** Whether to reveal her android nature to her is what they're in fact debating at this point in the narrative, and Data decides that ignorance is bliss.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** To offer her immortality would have been to reveal her android nature to herself (remember, she's consciously unaware of being an android), and holo-Soong asked that Data never reveal this to her, and allow her to live out her pre-programmed limited life believing she was a normal human. Learning after perceived decades of life and experience that you're actually an immortal artificial being? Hell on the ol' psyche, I tell ya what.
Is there an issue? Send a MessageReason:
Extra discussion in Measure of a Man

Added DiffLines:

** The slavery aspect of the argument was more about thinking of the future implications. Guinan and Picard talked out the idea of what would happen if there was a double-whammy of Data's agency and self-determination being denied by Starfleet combined with Maddox being successfully able to recreate Soong-type androids as a result of taking Data apart for study. Picard realizes the potential for a new form of chattel slavery of arguably sentient beings (with Whoopi Goldberg's presence in this episode adding an extra layer of RealitySubtext). His argument then essentially becomes "You can't be sure if Data is truly sentient or not, but if you rule against him, you've tacitly given the Federation permission to create a new slave race decades down the line, reliving an exceptionally ugly part of human history. Do you really want to risk that result?" (Tangentially, it calls to mind a legal case early in the USA's colonial history where a civil dispute involving (ironically) a black slaveholder provided the legal precedent for race-based chattel slavery in the future country as a whole, despite the less firm legal footing in Britain, which practiced indentured servitude as a more class-based temporary slave status than permanent race-based slavery.)
Is there an issue? Send a MessageReason:
None

Added DiffLines:

*** Maybe that would be an emotion, if the artificial intelligence was aware enough to "feel" compulsions like that. Emotions in humans are base-level urges and sensations that kick in whenever an appropriate trigger happens, after all; if a self-aware AI had similar urges programmed into it, it might very well think "human brains are wired to experience things similar to this and they call it an emotion, therefore this must be an emotion that I'm wired to experience."

Added: 758

Changed: 34

Is there an issue? Send a MessageReason:
None




to:

[[folder:TNG's 'nuclear option'?]]

* Blowing up a star appears to be terrifyingly easy in the ''Trek'' universe. In ''Generations'' we see that Soran is able to do it using a small missile and a base he set up and built himself. In [=DS9=] a changeling is able to build a sun-buster with a few hours' uninhibited access to an industrial replicator and a standard-issue Starfleet runabout. In one episode, the Enterprise-D blows one up accidentally with a couple of modified torpedoes (they were trying to fix it). Blowing up stars appears to be so easy that it's even possible to do by mistake. So why aren't there more suns blowing up? If it's that easy, it seems like quite a few hostile governments, let alone terrorist organizations, should be busting suns left, right, and sideways.
[[/folder]]

** Keep in mind, though, that what this game is trying to accomplish is a lot more complex. First, it seems to be closer to a physical dependency than psychological - an overwhelming and irresistible compulsion in even the most strong-willed individuals (which isn't to cheapen the addictive potential of modern games, but it's not 100% effective in every individual after just a few seconds - this is next-level stuff). Second, it's not just encouraging further play, but programming in complex actions and knowledge. The affected actively seek to get others addicted, know where and how to contact the game's programmers, and even know they're expected to address them in a specific way. Third, it's making very specific changes to its victims' personalities, making them more aggressive and changing their allegiances. It's going to take a lot more than the current stimulus-reward model to make all that happen.
*** Exactly this. My given name is Daniel. It means "God is my judge." That doesn't mean my name is God Is My Judge, and I wouldn't think to respond if that phrase came up in conversation.
*** As a martial artist, I can attest that unskilled opponents are sometimes the toughest to beat. At a certain point in most activities, you begin to formulate plans and strategies based on a certain assumption of reaction (I do X, they do Y, I counter with Z), but someone less skilled doesn't react that way (you do X, they react with Bagel), and that can absolutely play havoc with a game plan (especially in a game like chess, where everything depends on each preceding move). Data may have based his chess strategies on the assumption of a more skilled player (Dr. Singh, Picard, etc.), and Troi's playstyle absolutely threw him (say, taking a suboptimal trade instead of the projected ideal play).
** I think that it's less fear of her, so much as unease that she's an unknown variable. She's got all of Data's capability (and they know that Data is entirely capable of handling the Enterprise-D basically on his own), but she's young and they don't know enough about her not to be wary. After all, they've already had one teen almost get them killed (Wesley), and Lal is even more capable. Say she goes through a troubled-teenager phase and decides to take the Enterprise for a joyride, or gets upset and decides to lash out. Now there's a super-advanced angry teen running around, who also happens to be the daughter of a close friend. Bit of a problem.
*** It could be that Janeway is less picky about the finer points of her coffee, and just wants the coffee. Picard is more meticulous, and wants his tea a certain temperature, because that matters with tea preparation.
*** It would have been a huge gamble, but she might have inferred something from the way Data confirmed to Picard that the contract ''could'' be interpreted in a way that validated Ardra's claim on the ''Enterprise''. That was much more helpful to Ardra than it was to Picard, so she reasoned that he must be unbiased if he was willing to blurt out a fact that hurt his Captain's argument.
** In "Devil's Due," Ardra refers to Data as being "incapable of deceit or bias." That could be construed as saying that he can't lie, but may simply mean that he wouldn't be a deceitful type of person. In any event, how would she know?
** The Doylist answer is that the showrunners were able to bring the actual Stephen Hawking to the set of TNG and chose to do so rather than hire an actor to portray a younger Hawking. The In-Universe explanation is that Data might have found this representation more interesting, possibly in part due to his condition. So Data programmed in minor enhancements (a faster speech processor and a remote-controllable 'hand' for playing poker) in order to allow Hawking to play with the other holodeck characters without fundamentally altering the character.

*** The question isn't whether he's morally obligated to make someone immortal, but whether he's morally obligated to ''offer'' someone immortality. Or perhaps more accurately, to make someone aware of the fact that they're no more or less mortal than he is already.
*** And incidentally, now I'm imagining a version of Data as an omnicidal killer who's devoted to giving the "gift of death" to all living things.
*** This is basically a question of what Data's personal ethical standards are: if he can make someone immortal, is he morally obliged to do so? Given how much Data values the qualities of humanity, including mortality, it's likely that he would actually consider mortality "better" than immortality. (He basically ends up saying as much in ''Series/StarTrekPicard''.)
*** I don't think that is ever actually established. If that were the case, why doesn't the ''Voyager'' crew have all their meals on the Holodeck instead of the mess hall?
