The trope became Truth in Television in the realm of electronics around 2014, thanks to a previously ignored factor: not computing power or design, both of which existed by 2000, but the reduced energy consumption of electronic devices. Once energy-sipping ARM chips arrived around 2010 and broadband data connections became dirt cheap, there was little incentive left for power-hungry systems, optical disc storage, external HDDs, noisy desktop fans, or bundles of data and power cables. Cloud storage for unimportant but space-hungry data like films and photos, small handheld devices like tablets and smartphones instead of laptops, and Bluetooth accessories like speakers, keyboards, and even desktop printers became the hallmarks of a less "dirty" age of computing. Chips with no cooling requirements could fit anywhere, from tablets to smart TVs to embedded devices, while economies of scale dropped their price to something everyone could afford. The shift from "better, faster computers" to "similar performance, but less power-hungry" thus had the effect of a soft singularity.
While a high-grade desktop workstation may outperform a tablet by a few orders of magnitude, most of the time people use their devices for communication or media sharing, jobs a modern smartphone performs best. So instead of pushing the speed and computing power of a single-core CPU to the point of meltdown, demanding more cooling, more space, and a bigger case to hold the coolers, manufacturers chose smaller, more efficient designs and styled the outer case and screen interface according to an Apple-like "clean" design language. Ultrabooks and the latest generation of Samsung tablets are just as iPad-ish as the iPad itself.