Our Graphics Will Suck in the Future
"We've got screens figured out now. What happens in the future that makes them worse?"This trope is basically Zeerust applied to the digital era. The page image represents what a computer display in Star Wars looks like. Now look anywhere at your screen, and compare to what your computer can do. In a Science Fiction program, the graphics quality of whatever computer is used is that of what computers were available at the time. Therefore, there are no screens in 1960s shows and there are no GUIs in the 1970s and 1980s. In earlier eras, the writers probably didn't think computer graphics could improve. As the nature of computer advancements became more apparent, however, such limitations have become more about budget and imagination. Arguably can be justified in a scenario when functionality is preferable to looks. After all, the last thing you want is to see a graphics driver error on the screen of your spaceship's on-board computer in the middle of a crucial operation. This is Truth in Television in a surprising number cases. There are a number of situations where complex graphics are not only unnecessary, but are actually a hindrance, or even dangerous. See also Extreme Graphical Representation, Holographic Terminal, Magic Floppy Disk. Related to Science Marches On and Tech Marches On.
— Graham Stark, Unskippable
Anime and Manga
- Bubblegum Crisis was made in the late 80s and mostly used command line terminals.
- Legend of Galactic Heroes, apparently set in the late 3590s, also has bulky computers showing simplistic vector graphics.
- Not to mention floppy disks.
- RahXephon, set in 2027, has computers running interfaces from Silicon Graphics' IRIX, whose UI had remained largely unchanged since 1991.
- At least they did better than Mobile Suit Gundam, which doesn't even have GUIs who knows how many centuries in the future.
- Word of God has hinted that the "Universal Century" (the main Gundam timeline) begins in the mid-2100s. Given that the first TV series was produced in 1979, five years before the Apple Macintosh debuted with a built-in GUI, it's no surprise that the show's computers display nothing more advanced than hand-drawn line images.
Film
- Star Wars: In Episode IV, the fighters' targeting computers had very plain graphics, as did the Rebels' displays at the Yavin base. In later (and "earlier") installments, Lucas and company apparently understood how computers were changing. For The Empire Strikes Back and Return of the Jedi, they didn't put any graphics that would actually appear on a computer screen onscreen (though they continued to show holograms). Even for the prequels, they kept such visuals to a minimum, though with effects work they could likely have created any interface they liked. The rule still applies, even if it all took place "long ago".
- Even so, the holograms are black and white and flickery, not half as good an image as any video technology that would've existed when the first Star Wars movie was filmed. However, it does add Used Future appeal.
- On the other hand, in The Phantom Menace, Nute Gunray has a huge TV-like transmitter whose graphics are as good as a traditional television's.
- The X-Wing video game actually used the Episode IV visuals for its targeting computers. Apparently deciding that they could do better, in TIE Fighter LucasArts gave the TIEs a targeting computer that showed the target from the perspective of the pilot's ship, including orientation, though the viewpoint of the "camera" was always at the same distance. It might have been a decision to give the TIEs more advanced equipment, except that all later iterations gave player-controlled craft an identical targeting computer.
- Compare the drab all-text computer graphics from Alien with the rudimentary graphics from Aliens. Seven years is a long time in computer science.
- Also, check out the digital photo that briefly appears in the director's cut of Aliens. It looks to be about .001 megapixel resolution.
- In fact, Alien did have wireframe 3D animation on some of the CRT monitors in the shuttle craft's bridge. The code for these was written in FORTRAN by British programmers on a Prime 400 minicomputer with 192 kB of RAM.
- Now contrast the graphics of Alien and Aliens with the state-of-the-art-looking holograms, projections, and imagery present in Prometheus, theoretically set long before Alien. Possibly justified, since the Nostromo from Alien was a low-end old space tug and the Sulaco from Aliens was a rugged military transport, while the Prometheus was a shiny, state-of-the-art Cool Starship.
- Alien: Isolation deliberately uses the outdated graphics from the first film to evoke nostalgia and the feeling of trying to survive against a Nigh Invulnerable enemy with technology that is outdated even in-universe.
- Averted (a bit) in 2001: A Space Odyssey, which used modified cel animation to depict computer readouts that would otherwise be difficult or impossible in 1968, but played painfully straight in the sequel 2010: The Year We Make Contact, with graphics typical of 1984.
- 2001: A Space Odyssey also depicted the astronauts aboard Discovery watching TV on a paper-thin screen lying casually on a table.
- In Star Trek: The Motion Picture, the scientific advisor took a look at what the effects people had come up with for the viewscreen tactical displays and told them, "I can do better than that on my TRS-80," so what we see in the movie is what he did on his TRS-80.
- Some of the displays in The Wrath of Khan and The Search For Spock are definitely low-grade computer graphics. Then Michael Okuda came along on The Voyage Home and vastly improved the look. It's particularly jarring, though, when one of the bridge displays in The Wrath Of Khan, set in 2285, is primitive compared to the display of a circa-1986 computer in The Voyage Home!
- Averted with the simulation of the Genesis Device, first seen in The Wrath of Khan. Done as a showpiece by what would later become Pixar, it was considered a Crowning Moment of Awesome for the field of computer graphics of the time, and remains believable as a simulation thirty years later. The Star Trek production team was so enamored with it that they incorporated the footage into the next two sequels.
- The text we see when RoboCop is first activated in RoboCop (1987) shows that he is running under MS-DOS 3.3.
- The Terminator's POV shots show 6502 assembly language code in the first two movies, and Macintosh code (including "QuickTime Player"!) in the third. Also, said Robo Cam is not full-color but tinted either red or blue (though it's implied the views work much like Night-Vision Goggles).
- In Gattaca, they can run DNA tests in seconds, but they have neither touchscreens nor high-resolution displays.
- Escape from New York is set in 1997, but is forced to use 1981 graphics. The effect helps create an Unintentional Period Piece.
- The glider computer's green wireframe graphic was too expensive to render at the time, so the model of Manhattan built for other scenes in the movie was painted black, outlined with green reflective tape, and filmed. Truly, the past is another country.
- Inexplicably done in Real Steel with the Generation 2 controller that Bailey digs up for Max to use with Atom. Given that Charlie is said to have still been boxing in 2007, the G2 controller's monochrome, low-res screen ought to be more advanced than that.
- Sex Mission, made in 1984: It is set in 2044, but computers still use wireframe 3-D green-lined graphics... and, at one point, what is clearly ZX Spectrum graphics.
- Back to the Future Part II featured Marty getting scared by a hologram from a poster for Jaws 19. The hologram is conspicuously crude, which makes Marty's "The shark still looks fake." line that much funnier.
- Fridge Brilliance here in that, for a person in 2015, it would look exceptionally fake, and the theater might want to catch the attention of people on the street, but not terrorize them in public.
Literature
- Foundation, set thousands of years in the future in a galaxy-spanning empire with colossal starships and pocket-sized nuclear power plants, makes a big deal about a shipboard navigation computer with graphics.
- In Heinlein's The Moon Is a Harsh Mistress, Luna City's master computer, "Mike," has no monitors, but he does have mic pickups and can access Video Phones. Eventually, after some adjustment, he is able to generate a CGI avatar for video calls that is indistinguishable from real life, but it takes up the majority of his processing power, and he's a sentient AI.
Live Action TV
- Many a Trekkie has suffered brain damage trying to explain the dichotomy between the Viewer-Friendly Interface on computers in Star Trek: Enterprise and the flashy lights and hand-made slides in Star Trek: The Original Series — we get a little help from the fact that we almost never see the screens of video displays on TOS showing anything other than fullscreen video. We get a better look at a TOS-era display in the Star Trek: Enterprise episode "In a Mirror, Darkly", where it appears to be a sort of art deco version of the TNG-era LCARS interface.
- Star Trek: The Next Generation and Star Trek: Deep Space Nine both had to contend with CRT refresh flicker being visible on camera. For that reason, only specialised TV monitors whose refresh rate could be synchronised with that of the cameras were used, which meant you rarely saw an animated display in the background, only the ones necessary for the plot.
- While DS9 has considerably more animated displays than TNG, it makes it look like the Cardassians, trashing the station on their way out, replaced certain displays with 377-year-old Macintoshes, if the Chicago font is any indication. At least some of us wouldn't put it past those Affably Evil Cardassians....
- Star Trek: Voyager retconned this by having a time traveler introduce computer technology to the 20th century. The result was an alternate timeline similar to our own.
- In Knight Rider, all of KITT's "complex" displays are source listings of BASIC programs.
- Even worse, in Timeslip, a futuristic (evil) computer can output directly as brainwaves or on a video screen. The video screen shows the image of a teletype printing out the computer's output.
- The makers of the original Battlestar Galactica made an effort to avoid (well, delay) this trope by using the top-of-the-line graphics systems then available for the bridge display of incoming enemy fighters. They looked rather impressive for about five years.
Tom Servo: Graphics made by Kenner.
- Oddly enough, the re-imagined series made a point of this with the computers on Galactica, which have been described as being far below the specs of today's systems.
- It is presumably due to trying to avoid this trope that you don't really see the computer displays on the Pegasus (which is a more up to date battlestar) or any of the civilian ships, all of which would be running the "current day" (or at least more modern) colonial computers as opposed to the obsolete systems on the Galactica.
- The spin-off Caprica used much more flashy looking displays and technology in general - for instance, the tablet device Zoe uses and then rolls up to put back in her pocket.
- When the film Space Mutiny (which used classic Galactica scenes) was featured on Mystery Science Theater 3000, Mike and the 'bots took notice of this easily.
- In The Sarah Connor Chronicles, we learn that at least part of Skynet is written in Visual Basic and that Terminator CPUs plug into a small subsection of the PCI bus. No wonder they want to kill humanity.
- Look Around You, keeping with its Retraux theme, makes use of BBC Micros, using one in the first series opening titles to run a laughably simple BASIC program. The second series features a BBC Micro with glitchy voice software welcoming viewers to the future of "Look Around Yog", while a toaster with a BBC Micro attached is a "futuristic toasting system".
- The Hitchhiker's Guide to the Galaxy (BBC miniseries)'s producers looked at what the BBC's own effects department offered for the guide. It wasn't pretty. So they averted this by using very painstakingly detailed cel animation and clever rear projection tricks to show "advanced" computer displays (such as the tiny non-flat flatscreen of the guide, the gigantic widescreen display on the Heart of Gold, etc).
- Played with in Bones, where Angela has a holographic display with amber graphics resembling some types of '80s CRT monitors. The resolution is way better, though.
- Max Headroom. Everything is in wire frames. Then again, it was the Trope Namer for Twenty Minutes into the Future....
- Doctor Who:
- In the 1982 serial "Castrovalva", it turns out that the fantastically advanced TARDIS computer has a display that is outperformed by a ZX Spectrum. Justified in that the whole interface later turns out to be a phoney produced by the Master so that Tegan and Nyssa would think they were piloting the TARDIS.
- The other anachronisms in the TARDIS interface were later retroactively justified when the Doctor changes the TARDIS's "desktop theme" into a more organic, steampunk, retrotech look. Apparently, the Doctor is enough of a Bunny-Ears Lawyer to actually prefer that look over proper graphics.
- Unlike their Hitchhiker's Guide counterparts, the Doctor Who creative team were quite happy to use BBC Micros to generate their on-set graphics for most of the Fifth and Sixth Doctors' runs. Sometimes they could get away with it if the stories were set in the present or near-future, but stories set further in the future ended up fitting this trope to a tee.
- In the first series of Red Dwarf, Holly's appearance was very pixelated.
Tabletop Games
- Monitors of any sort are rarely seen in Warhammer 40,000 (it being a miniatures wargame, after all), but the graphical quality of what little we do see tends to vary. Often justified, since most races, especially humans, live in a Used Future. The most recent example (at time of writing) is the Cold Open of the tie-in video game Warhammer 40,000: Space Marine: the Imperial command's monitor has a fully functional GUI and supports a click-and-zoom map of the galaxy, but can only display yellow, red, and black.
- Crazy juxtapositions of high and low technology are a big part of 40k's design aesthetic, especially for the Imperium of Man. Sometimes advanced computer monitors are even lit up with tallow candles, lacking any kind of internal illumination of their own.
- Completely averted with the Tau, whose tech is far more advanced than the Imperium's, to the point where it creeps humans out to see holograms that don't require Percussive Maintenance every five minutes.
Video Games
- Ansem's computer in Kingdom Hearts II is supposed to be highly advanced, storing all of his and his students' research data. Yet it uses 8-bit graphics and a user interface that looks like the most primitive form of Windows the world has ever seen. Not even a mouse is used. It's somewhat justified by the fact that this computer is the gate to "Space Paranoids", a world based on the '80s movie TRON; that it is at least twenty years old by the time KH2 takes place; and that there hasn't exactly been anyone around to upgrade the software.
- The computers in Grim Fandango appear to be teletypes hooked up to enormous amber-monochrome screens. It fits with the Art Deco theming everywhere.
- It's also never explicitly stated just when the game is set; if anything, it seems to be around the Forties or Fifties, which would make them advanced for their time.
- In Mega Man X, the intro has Dr. Cain working on a circa-2114 machine with 8 petabytes of "real mem" (probably RAM) and 32 PB of "avail mem" (probably swap space on the hard drive) whose power-on self-test sequence still looks like this. (By contrast, a Mac Pro can be configured with 64 gigabytes of RAM, 1/131,072th of the fictional computer's, and 8 terabytes of drive space, 1/4,096th of it, and, well...)
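The memory ratios quoted above can be double-checked with a quick calculation (a sketch, assuming the petabyte/terabyte/gigabyte figures are binary, power-of-two units):

```python
# Verify the quoted memory ratios, using binary (power-of-two) units.
PB = 2**50  # bytes in a (binary) petabyte
TB = 2**40  # bytes in a (binary) terabyte
GB = 2**30  # bytes in a (binary) gigabyte

ram_ratio = (64 * GB) / (8 * PB)    # Mac Pro RAM vs. fictional "real mem"
disk_ratio = (8 * TB) / (32 * PB)   # Mac Pro storage vs. fictional "avail mem"

print(1 / ram_ratio)   # 131072.0 -> the Mac Pro has 1/131,072th the RAM
print(1 / disk_ratio)  # 4096.0   -> and 1/4,096th the storage
```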
- Used in Startopia. Most likely intentional given how the game is a love letter to 'classic' sci-fi.
- In the Shadowrun SNES game (which takes place in the 2050s), office computers don't have any graphics at all! Whenever you use your cyberdeck to jack into the Matrix, you get a screen full of command lines in classic green-on-black monochrome scheme while the connection is established.
- In the mid-'90s Amiga adventure game Dream Web (taking place in the near future), home computers similarly have no graphics at all, and no real user interface either! The user is stuck with a clumsy DOS-like interface to access everything from his emails to the latest news broadcast (which consists of text, too, of course).
- The prevalence of text-only monochrome CRT screens in the Fallout setting, which simultaneously employs laser weapons and intelligent computers, establishes that it takes place in a parallel universe.
- In the Mass Effect series, we have whizzy holographic monitors, with monochrome visuals (usually amber, sometimes blue).
- Even non-holograms tend to be grainy, full of static, or blurry.
- ME2 and ME3 justify this in-game by explaining that all of the important holographic conversations occur instantly across pan-galactic distances via Quantum String technology, which is still very much in its infancy. It looks crummy because only a tiny handful of these devices exist in the galaxy, and most of the ones used by the Alliance had to be reverse-engineered from what they could steal from Cerberus and the Normandy SR-2. Getting it to run in 1080p before the Reapers arrived probably wasn't their highest priority....
- Given every other computer display in the series is just as bad, they did a pretty good job.
- The regular work space holographic displays look like they were specifically made to cause seizures or otherwise injure their operators. They're pointlessly layered (making text illegible), out of focus, and flicker constantly.
- In Vampire: The Masquerade – Bloodlines all the computers run on DOS in a game taking place in 2004.
- Played with in Grand Theft Auto: Vice City Stories with the advertisement for the Fruit LC personal computer, with features like 18 kilobytes of memory and a two-tone, 8-inch display. In 1984, when the game is set, this would have been nearly revolutionary, but when the game was released in 2007, the computer seemed hilariously primitive.
Real Life
- Many processor- and memory-intensive tools (3D art programs, for example) use extremely primitive interfaces. The fraction of a second of lag as a computer renders the high-res fonts and drop shadows of a typical program's interface can stretch to several seconds when 90% of the machine's resources are dedicated to rendering a high-poly mesh or an HD-resolution image. Multiply that by an entire day's worth of opening and closing menus and panels, and you begin to see why the typical GUI in an art program looks like it dates from the early '90s.
- Similarly, many business applications are extremely primitive, but in this case it's often for the comfort of employees who have been using the same program for decades, and of companies that don't want to lose work hours while staff get used to a new interface. Changes between versions tend to be "under the hood", simply adding new features without changing the familiar, outdated look; if the interface is updated, there will often be an option to use the old look as a shell over the new one.
- If properly designed, a simple graphic can convey all necessary information at a glance. Compare the Heads-Up Display in a video game: simple icons and colored bars represent large amounts of complex information quickly.
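As a trivial illustration of that principle, here is a minimal sketch (the function name and format are invented for this example, not taken from any real game) of a text-mode HUD-style bar that turns a raw number into something readable at a glance:

```python
# Minimal sketch: a text-mode "HUD" health bar. A plain number like
# 35/100 takes a moment to parse; a bar of fixed width can be read
# instantly, which is the whole point of simple HUD graphics.

def health_bar(current, maximum, width=20):
    """Render a fixed-width bar like [#######-------------] 35%."""
    filled = round(width * current / maximum)
    bar = "#" * filled + "-" * (width - filled)
    return f"[{bar}] {round(100 * current / maximum)}%"

print(health_bar(35, 100))   # [#######-------------] 35%
print(health_bar(100, 100))  # [####################] 100%
```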
- Interfaces for tasks like air traffic control tend to be extremely primitive-looking simply because that reduces the number of distractions, increases the speed at which the viewer can take in the information, and allows the screen to be updated in near real-time. This is critical when lag for either the operator or the computer can result in a fiery mid-air collision!
- CAD programs such as AutoCAD use the same blueprint shorthand, in a standard format, that has been used for nearly two centuries. Because it is familiar to anyone in the engineering and construction industries regardless of language, this prevents mistakes that can lead to injuries and deaths. An engineer from the 1800s could pick up a blueprint printed from a CAD program and would only be moderately unfamiliar with the notations for advanced electrical wiring.
- Many programs written for scientific research purposes tend to be simplistic in terms of graphics because they are written purely for utilitarian purposes, sometimes as a home-brew solution which may only be used a few times by the researcher for a single experiment. Even on high budget projects, more money tends to go toward hardware and staff than toward designing an aesthetically pleasing interface.
- In one of the most epic dual-subversions/justifications in human history, as Ray Heacock, spacecraft systems manager for the Voyager program once explained,
Any good...PC, today, will have several hundred thousand words of memory, and no one would think of buying a computer with the limited capabilities that the Voyager systems have. And of course, today, no one would think of building for spaceflight computers with such limited capabilities. But the thing that these computers had was reliability. And being programmable from Ground Operations, we can still have them perform very complex and sophisticated operations.
— interview, The Infinite Voyage series, Sail On, Voyager!, 1990
- NASA engineers chose computer systems for the spacecraft that were not the absolute most advanced even in their own day (1977), in favor of systems that were intended to never have the slightest chance of failing while in-mission. Over 35 years later, the still-functioning first spacecraft to ever leave the solar system bear testament to their constructors' foresight of valuing proven endurance over cutting-edge yet uncertain technology.
- There are other concerns that keep computers in space slower as well. The first is the problem of cooling: space isn't actually cold, it's a vacuum, so operating temperatures have to be managed carefully. The second is the sheer amount of radiation shielding and/or design redundancy required to keep delicate electronics from being fried outside the natural protection we enjoy on Earth; this also adds to the cooling problem. For example, you can box your computer in lead to prevent charged-particle radiation from scrambling the memory, but then the lead acts as an insulator...
- Modern (2014) UI design languages are converging on simpler, "flatter" appearances. Your mileage may vary on whether these look better (witness the complaints that Microsoft's Metro is a step down from Aero), but one theory posits that fancy skeuomorphisms were put in place as eye candy to mask low screen resolutions, and scaling those elements up either takes a lot of work or produces ugly results.
- Many companies still use old software because everyone is already trained on it and it is reliable. A potential time traveler going forward in time from the past may indeed think this trope when visiting a company.