I expect, to some degree, they would. The integrated circuit really isn't the biggest immediate breakthrough for them; the battery would be. Transistors were pretty new in the 60s IIRC, so figuring out how to make them loads smaller would give the space race a total kick in the pants, and smaller, more efficient batteries and capacitors would be an even greater help. The flat-screen (or even touch-screen) display would also be a big breakthrough, as would the fact that it's in color.
That’s the epitome of privilege right there, not considering armed nazis a threat to your life. - Silasw

There might be a couple of conceptual things like touchscreens and LCD displays, but mostly it wouldn't change much other than locking down patented architecture a couple of years quicker. They were long since familiar with the mechanics of what was going on.
The major difference is that they couldn't make parts that small yet. I mean you could hand them a quad-core processor and a Radeon, and they could check it out with a microscope and spectrometer and probably figure out what's going on. But the ultimate question would be "how the fuck did you make it this small?" and the ultimate answer would be "using equipment run by the previous generation of computers. Have fun making those first."
edited 7th Feb '13 12:29:13 PM by Pykrete
Yeah, I'd imagine that they would have significant problems replicating the 22nm transistor size that we're at now. Plus, we pack a staggering number of transistors onto a single chip. Early efforts only crammed in a couple at best.
Memory would be a similar headache to duplicate.
Plus, if you made something like a mere first-gen Pentium with 1950s IC technology (the cutting-edge stuff), the computer would take up a room and put out a hell of a lot of heat.
Stuff has scaled down a heck of a lot over the decades.
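To put that scaling gap in rough numbers, here's a quick back-of-envelope sketch. The Pentium transistor count is the real figure (about 3.1 million); the one square centimetre of board space per transistor for 1950s-era discrete construction is purely an assumed ballpark for illustration, not a sourced figure.

```python
# Back-of-envelope: how big would a first-gen Pentium be if built from
# 1950s-style discrete transistors?

PENTIUM_TRANSISTORS = 3_100_000   # P5 Pentium (1993), ~3.1 million transistors
AREA_PER_TRANSISTOR_CM2 = 1.0     # ASSUMED: discrete part plus wiring per transistor

total_cm2 = PENTIUM_TRANSISTORS * AREA_PER_TRANSISTOR_CM2
total_m2 = total_cm2 / 10_000     # 10,000 cm^2 per m^2

print(f"Board area: roughly {total_m2:,.0f} square metres")
# Roughly 310 m^2 of circuit board - even folded into racks, that's a
# room-filling machine before you account for power supplies and cooling.
```

Change the assumed area per transistor and the answer scales linearly, but any plausible 1950s figure still lands you in building-sized territory.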
But hey, at least they can boot it to BIOS, right?
Happiness is zero-gee with a sinus cold.

ICs were first patented in 1959, so I'm not sure why you think they're ten years away. To do that, you'd need to place the story in the 1940s (the transistor was invented in 1947).
Even then, the first proposal that it might be possible to build a semiconductor crystal containing several components came as early as 1952.
The reaction would basically be 'this is what we've been looking for'. With computers the size of a room, engineers were desperately trying to figure out how to make them smaller. They'd be examining the materials used, and if they could figure out that the circuits were mostly on that tiny little chip, they'd be ecstatic: the big problem of the era was the reliability of components, and something was always breaking down.
They would be very, very keen on finding out how the thing worked, and it would be the top engineers of the day doing the work.
It ain't over 'till the ring hits the lava.

I think they'd pretty quickly pull out a microscope and figure out what was going on in the chips. Creating new chips would be somewhat harder; we use advanced chemistry and materials science for that, and back then they might not have had the right techniques yet. They might build much larger versions first before eventually figuring out how to scale them down to the size of the one they're studying. I would guess they'd take two or three years to build their own working microchip, and maybe 15 years to reach the same level of miniaturization. In the meantime, work would also be done on devices to wreck the chips; remember that the Cold War was on then, and anyone who found any new technology would assume that the Other Side™ already had it and might be deploying it.
This is assuming they even realize the laptop is special to begin with. If all the software and the OS were cleared, it might not be obvious that the machine was worth close study in the first place.
Join my forum game!
As a few people have mentioned, if the BIOS was operating, they'd at least know that this machine turns on, fires up, and displays stuff.
Stuff with mysterious terms like 'CPU Configuration', 'IDE Configuration'. That would make it sound like some kind of engineering tool.
If it was an inoperative 'black box', it would still be interesting once an engineer got a look inside. They might not immediately realise the significance of the various square black boxes, but they'd get that the printed circuit board (invented 1936) was connecting to them. So they're doing something. The CPU fan would tell them it was something generating a lot of heat - and they knew about electrical components generating heat.
So by this point they'd be thinking: it's an electrical machine, no valves anywhere, it uses some kind of black-box technology, it has a display screen like nothing we've seen, it's got a tiny loudspeaker like a wireless set - let's get some test gear onto this.
edited 8th Feb '13 8:01:29 AM by Bluesqueak
It ain't over 'till the ring hits the lava.

They already understood most of the physical concepts save for some of the really recent stuff like memristance and quantum computing. The main hurdle is that the key components of computers are so darn small that they have to be manufactured by equipment run by the previous generation of computers.
From what I've heard, modern computer chips would appear to be basically pure silicon - purer than they could even make back in the forties. All the circuitry embedded in there would just look like trace impurities. I'm not sure they'd even be able to tell it was deliberately structured, except from the side effects, which would certainly imply that something was going on inside there. :)
Speaking words of fandom: let it squee, let it squee.

The hell does that have to do with the topic?
"It's Occam's Shuriken! If the answer is elusive, never rule out ninjas!"

Let's say a time-traveler goes to the 1960s or the 1950s and accidentally leaves behind a modern laptop or desktop computer. The software is completely wiped, so the scientists of that era cannot access it. Also, the microchip is about 10 to 20 years away from being invented.
Would they be able to reverse engineer this technology?