Reverse engineering a modern computer.


Completion oldtimeytropey from Space Since: Apr, 2012
#1: Feb 7th 2013 at 11:28:17 AM

Let's say a time-traveler goes back to the 1950s or 1960s and accidentally leaves behind a modern laptop or desktop computer. The software is completely wiped, so the scientists of that era cannot access it. Also, the microchip is about 10 to 20 years away from being invented.

Would they be able to reverse engineer this technology?

BlueNinja0 The Mod with the Migraine from Taking a left at Albuquerque Since: Dec, 2010 Relationship Status: Showing feelings of an almost human nature
#2: Feb 7th 2013 at 12:09:44 PM

I expect, to some degree, they would. The integrated circuit really isn't the biggest immediate breakthrough for them; the battery would be. Transistors were pretty new in the 60s IIRC, so figuring out how to make them loads smaller would give the space race a total kick in the pants, and smaller, more efficient batteries and capacitors would be an even greater help. The flat screen (or even touch screen) would also be a big breakthrough, as would it being in color.

That’s the epitome of privilege right there, not considering armed nazis a threat to your life. - Silasw
Pykrete NOT THE BEES from Viridian Forest Since: Sep, 2009
#3: Feb 7th 2013 at 12:27:32 PM

There might be a couple of conceptual things, like touchscreens and LCD displays, but mostly it wouldn't change much other than locking down patented architecture a couple of years quicker. They were long since familiar with the mechanics of what was going on.

The major difference is that they couldn't make parts that small yet. I mean you could hand them a quad-core processor and a Radeon, and they could check it out with a microscope and spectrometer and probably figure out what's going on. But the ultimate question would be "how the fuck did you make it this small?" and the ultimate answer would be "using equipment run by the previous generation of computers. Have fun making those first."

edited 7th Feb '13 12:29:13 PM by Pykrete

pvtnum11 OMG NO NOSECONES from Kerbin low orbit Since: Nov, 2009 Relationship Status: We finish each other's sandwiches
#4: Feb 7th 2013 at 1:53:13 PM

Yeah, I'd imagine that they would have significant problems replicating the 22nm transistor size that we're at now. Plus, we pack a staggering amount of transistors onto a single chip. Early efforts only crammed in a couple at best.

Memory would be a similar headache to duplicate.

Plus, if you made something like a mere first-gen Pentium with 1950's IC technology (the cutting edge stuff), the computer would take up a room and put out a hell of a lot of heat.
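For scale, here's a quick back-of-the-envelope sketch of that size-and-heat problem. The first-gen Pentium's transistor count is roughly right (~3.1 million); the per-transistor volume and power figures for 1950s-era discrete parts are loose illustrative assumptions, not measured values:

```python
# Back-of-the-envelope: a first-generation Pentium (~3.1 million transistors)
# rebuilt out of 1950s discrete transistors. Per-part figures are rough
# assumptions for illustration only.

transistors = 3_100_000          # approximate first-gen Pentium transistor count
volume_per_transistor_cm3 = 1.0  # assumed: one packaged 1950s transistor plus wiring
power_per_transistor_w = 0.05    # assumed: ~50 mW dissipated per discrete part

total_volume_m3 = transistors * volume_per_transistor_cm3 / 1e6  # cm^3 -> m^3
total_power_kw = transistors * power_per_transistor_w / 1e3      # W -> kW

print(f"Volume: {total_volume_m3:.1f} cubic meters of components")
print(f"Heat:   {total_power_kw:.0f} kW to dissipate")
```

Even with these charitable numbers you get cubic meters of components and triple-digit kilowatts of heat, which is why the room-sized estimate above is, if anything, optimistic.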

Stuff has scaled down a heck of a lot over the decades.

But hey, at least they can boot it to BIOS, right?

Happiness is zero-gee with a sinus cold.
Bluesqueak Since: Jan, 2010
#5: Feb 7th 2013 at 2:46:36 PM

ICs were first patented in 1959, so I'm not sure why you think they're ten years away. To do that, you'd need to place the story in the 1940s (the transistor was invented in 1947).

Even then, the first proposal that it might be possible to create a semiconductor crystal containing several components would arrive only a few years later, in 1952.

The reaction would basically be 'this is what we've been looking for'. With computers the size of a room, engineers were desperately trying to figure out how to make them smaller, so they'd be examining the materials used. Once they figured out that the circuits were mostly on that tiny little chip, they'd realise it answered their big problem: component reliability - something was always breaking down.

They would be very, very keen on finding out how the thing worked. It would be the top engineers of the day.

It ain't over 'till the ring hits the lava.
Madrugada Since: Jan, 2001
#6: Feb 7th 2013 at 3:50:26 PM

One of the reasons the answers are so inconsistent is that "50's or 60's" is too long a time span to answer for. What they could have done in 1951 is vastly different from what they could have accomplished in 1969.

DrunkGirlfriend from Castle Geekhaven Since: Jan, 2011
#7: Feb 7th 2013 at 6:55:57 PM

Yeah, it'd just take them about thirty years. :P

"I don't know how I do it. I'm like the Mr. Bean of sex." -Drunkscriblerian
Meklar from Milky Way Since: Dec, 2012 Relationship Status: RelationshipOutOfBoundsException: 1
#8: Feb 8th 2013 at 12:24:42 AM

I think they'd pretty quickly pull out a microscope and figure out what was going on in the chips. Creating new chips would be somewhat harder; we use advanced chemistry and materials science for that, and back then they might not have had the right techniques yet. They might build much larger versions first before eventually figuring out how to scale them down to the size of the one they're studying. I would guess they'd take two or three years to build their own working microchip, and maybe 15 years to reach the same level of miniaturization. In the meantime, work would also be done on devices to wreck the chips; remember that the Cold War was on then, and anyone who found any new technology would assume that the Other Side™ already had it and might be deploying it.

This is assuming they even realize the laptop is special to begin with. If all the software and the OS were cleared, it might not be obvious that the machine was worth close study in the first place.

Join my forum game!
Bluesqueak Since: Jan, 2010
#9: Feb 8th 2013 at 7:57:47 AM

[up]As a few people have mentioned, if the BIOS was operating, they'd at least know that this machine turns on, fires up, and displays stuff.

Stuff with mysterious terms like 'CPU Configuration', 'IDE Configuration'. That would make it sound like some kind of engineering tool.

If it was an inoperative 'black box', it would still be interesting once an engineer got a look inside. They might not immediately realise the significance of the various square black boxes, but they'd get that the printed circuit board (invented 1936) was connecting to them. So they're doing something. The CPU fan would tell them it was a something generating a lot of heat - and they knew about electrical components generating heat.

So by this point they'd be thinking: it's an electrical machine, no valves anywhere, it uses some kind of black box technology, it has a display screen like nothing we've seen, it's got a tiny loudspeaker like a wireless set - let's get some test gear onto this.

edited 8th Feb '13 8:01:29 AM by Bluesqueak

It ain't over 'till the ring hits the lava.
TamH70 Since: Nov, 2011 Relationship Status: Faithful to 2D
#10: Feb 8th 2013 at 3:42:09 PM

The display screen itself would be worth a price beyond rubies. The company that got that thing on the market first would kill every CRT manufacturer on the planet stone dead within the first year of production.

pvtnum11 OMG NO NOSECONES from Kerbin low orbit Since: Nov, 2009 Relationship Status: We finish each other's sandwiches
#11: Feb 8th 2013 at 4:09:10 PM

Assuming they could reproduce the LEDs that go into it. Or do they use colored LCDs now...? Is confused.

Happiness is zero-gee with a sinus cold.
Bluesqueak Since: Jan, 2010
#12: Feb 8th 2013 at 4:46:22 PM

Liquid crystal display, usually. Again, they knew something about liquid crystals - they were used for light valves. The jump start would be realising they have other electro-optic properties. And that you could use them for colour.

It ain't over 'till the ring hits the lava.
Pykrete NOT THE BEES from Viridian Forest Since: Sep, 2009
#13: Feb 10th 2013 at 5:46:20 PM

They already understood most of the physical concepts save for some of the really recent stuff like memristance and quantum computing. The main hurdle is that the key components of computers are so darn small that they have to be manufactured by equipment run by the previous generation of computers.

CassidyTheDevil Since: Jan, 2013
#14: Feb 10th 2013 at 5:49:16 PM

Okay, would they be able to do it any better if you gave them all the information on how to manufacture computers too?

TamH70 Since: Nov, 2011 Relationship Status: Faithful to 2D
#15: Feb 10th 2013 at 6:55:26 PM

Maybe, but you would be running into the classic "you don't have the tools to make the tools to make the tools to make the parts" problem. Which is why, even though the Greeks invented the steam engine, it took James Watt and Richard Trevithick to make it a practical item.

Xtifr World's Toughest Milkman Since: Jan, 2001 Relationship Status: Having tea with Cthulhu
#16: Feb 14th 2013 at 1:19:44 PM

From what I've heard, modern computer chips would appear to be basically pure silicon—more pure than they could even make as recently as the forties. All the circuitry embedded in there would just look like trace impurities. I'm not sure they'd even be able to tell it was deliberately structured, except from the side-effects, which would certainly imply that something was going on inside there. :)

Speaking words of fandom: let it squee, let it squee.
Pery-ton Since: Nov, 2014
#17: Jan 20th 2016 at 11:25:33 AM

Necro, but I just found this topic, and it occurred to me that possibly the craziest and scariest part for a 1950s engineer from the USA or western Europe would be the letters "MADE IN CHINA".

Total posts: 18