Our Graphics Will Suck in the Future

[Page image: https://static.tvtropes.org/pmwiki/pub/images/rsz_dhcgdcvw0aemssy.png]
Mind a pair of reading glasses?

This trope is basically Zeerust applied to the digital era.

The page image shows what a computer display looks like in Star Trek. Now look at your own screen and compare it with what your computer can do.

In a Science Fiction program, the graphics quality of any computer shown tends to match what real computers could do at the time of production. As a result, 1960s shows have no screens at all, and shows from the 1970s and 1980s have no GUIs.

In earlier eras, the writers probably didn't imagine that computer graphics could improve. As the pace of computer advancement became more apparent, such limitations became more a matter of budget and imagination.

Can arguably be justified in scenarios where functionality is preferable to looks. After all, the last thing you want to see on the screen of your spaceship's onboard computer in the middle of a crucial operation is a graphics driver error. This is Truth in Television in a surprising number of cases, where complex graphics are not only unnecessary but an actual hindrance, or even dangerous. Although in the future, our graphics cards will probably be way better and more reliable too.

Often Invoked to avoid being a Cosmetically-Advanced Prequel. Cassette Futurism is when this is done deliberately to create a Retro Universe feel.

See also Extreme Graphical Representation, Holographic Terminal, Magic Floppy Disk, The Aesthetics of Technology, and Cyber Green. Related to Science Marches On and Technology Marches On.


Examples:


    Anime & Manga 
  • Bubblegum Crisis was made in the late 1980s and mostly used command line terminals.
  • Ghost in the Shell (1995): Although there are very advanced-looking 3D monitors, the GPS system that Section 9 uses to track criminals is like a bare-bones Google Maps.
  • In an obvious stylistic choice, Kill la Kill is set at most 20 Minutes into the Future or possibly in a higher-tech alternate present, yet all the screens shown are low-res, grayscale LCD displays, even when they're attached to supercomputers or smartphones.
  • Legend of the Galactic Heroes, apparently set in the late 3590s, also has bulky computers showing simplistic vector graphics. Not to mention floppy disks.
  • At least they did better than Mobile Suit Gundam, which doesn't even have GUIs who knows how many centuries in the future. Word of God has hinted that the "Universal Century" (the main timeline of Gundam) begins in the mid-2100s, putting the original series in the early 23rd century. Given that the first TV series was produced in 1979, five years before the Apple Macintosh debuted with a built-in GUI, it's no surprise that they didn't show anything more advanced than hand-drawn line images.
  • RahXephon, set in 2027, has computers with interfaces from Silicon Graphics' IRIX, whose UI has remained largely unchanged since 1991.

    Comic Books 
  • Legion of Super-Heroes: As seen in The Great Darkness Saga, the Legion's computers, built by the greatest genius in the 30th century galaxy, have displays with very simple 2-D graphics, and plain green screensavers.
  • Aboard the starship Entreprise-2061 of Pouvoirpoint, all screens display geometric or wireframe graphics, and crappy screensavers.

    Films — Live-Action 
  • Averted (a bit) in 2001: A Space Odyssey, which used modified cel animation to depict computer readouts that would otherwise have been difficult or impossible in 1968, such as David Bowman watching television on a paper-thin tablet aboard the Discovery. Played painfully straight in the sequel 2010: The Year We Make Contact, whose graphics and controls are typical of 1984. On the other hand, the Soviet Alexei Leonov isn't nearly as advanced as the American Discovery, despite the Leonov being several years younger.
  • Compare the drab all-text computer graphics from Alien with the rudimentary graphics from Aliens. Seven years is a long time in computer science.
  • Back to the Future Part II featured Marty getting scared by a hologram sprouting from a theater marquee for Jaws: 19. The hologram is shown with low-detail CGI and bug eyes, which makes Marty's "The shark still looks fake." line that much funnier.
  • Escape from New York is set in 1997, but is forced to use 1981 graphics. The effect helps create an Unintentional Period Piece.
    • The glider computer's green wireframe graphics would have been too expensive to generate digitally back then, so the model of Manhattan built for other scenes in the movie was painted black, outlined with green reflective tape, and filmed. Truly, the past is another country.
  • In Gattaca, they can run DNA tests in seconds, but they have neither touchscreens nor high-resolution displays.
  • Inexplicably done in Real Steel, with the Generation 2 controller that Bailey dug up for Max to use with Atom. Since 2007 is mentioned as a year when Charlie was still boxing, the G2 controller's monochrome low-res screen really ought to be more advanced than it is.
  • The text we see when RoboCop is first activated in RoboCop (1987) shows that he is running under MS-DOS 3.3.
  • Sex Mission, made in 1984, is set in 2044, but the computers still use green wireframe 3-D graphics... and, at one point, what are clearly ZX Spectrum graphics.
  • Space Mutiny simulates the space battles at the beginning with very primitive vector graphics that bear only a vague resemblance to the ships they represent.
  • In Star Trek: The Motion Picture, the scientific advisor took a look at what the effects people had come up with for the viewscreen tactical displays and told them, "I can do better than that on my TRS-80," so what we see in the movie is what he did on his TRS-80.
    • Some of the displays in The Wrath of Khan and The Search for Spock are definitely low-grade computer graphics. Then Michael Okuda came along on The Voyage Home and vastly improved the look. It's particularly jarring, though, when one of the bridge displays in The Wrath of Khan, set in 2285, is primitive compared to the display of a circa-1986 computer in The Voyage Home!
    • Then they did Star Trek V on a short timeframe and reduced budget, and the bridge displays became shockingly awful again.
    • Averted with the simulation of the Genesis Device, first seen in The Wrath of Khan. Done as a showpiece by what would later become Pixar, it was considered a high point for the field of computer graphics of the time, and remains believable as a simulation thirty years later. The Star Trek production team was so enamored with it that they incorporated the footage into the next two sequels.
  • Star Wars: In Episode IV, the fighters' targeting computers had very plain graphics, as did the Rebels' displays at the Yavin base. In later (and "earlier") installments, Lucas and company apparently understood how computers were changing. For The Empire Strikes Back and Return of the Jedi, they didn't put any graphics that would actually appear on a computer screen onscreen (though they continued to show holograms). Even for the prequels, they kept such visuals to a minimum, though by then they could likely have created any interface they liked with effects. The rule still applies, even though it all takes place "long ago".
    • Could be justified by the limited power the onboard computers of ships mass-produced for the Empire's bloated war machine could afford.
    • Even so, the holograms are black and white and flickery, not half as good an image as any video technology that would've existed when the first Star Wars movie was filmed. However, it does add Used Future appeal.
    • On the other hand, in The Phantom Menace, Nute Gunray had a huge TV-like transmitter with very good, traditional-television-quality graphics.
    • The X-Wing video game actually used the Episode IV visuals for its targeting computers. Apparently deciding that they could do better, in TIE Fighter LucasArts gave the TIEs a targeting computer that showed the target from the perspective of the pilot's ship, including orientation, though the viewpoint of the "camera" was always from the same distance. It might have been a decision to give the TIEs more advanced equipment, except that all future iterations gave player-controlled craft an identical targeting computer.
    • In the Rogue Squadron games, the targeting computer has the same color scheme as the films but puts a color overlay on top of the game's own graphics instead of using grids and dots.
    • Many of the displays in The Force Awakens look more updated... except for the targeting computer of the Falcon, which has the exact same Atari-looking graphics it had in 1977.
    • Downright enforced by Rogue One and Solo, which are set shortly before the original movie, and deliberately keep the displays with very simple graphics.
    • Played with in The Mandalorian, when a tank targets a Scout Trooper trying to toss a grenade into the tank's hatch. The targeting graphics are still simple-looking, but apparently update in real time, down to the trooper's limb movements.
  • The Terminator's POV shots show 6502 assembly language code in the first two movies, and Macintosh software (including "QuickTime Player"!) in the third. Also, said Robo Cam isn't full-color, but tinted either red or blue (though it's implied these function like Night-Vision Goggles).

    Literature 
  • In Heinlein's The Moon Is a Harsh Mistress, Luna City's Master Computer, "Mike", has no monitors, but he does have mic pickups and can access Video Phones. Eventually he is able to generate a CGI avatar for video calls that, after some adjustment, is indistinguishable from real life, but it takes up the majority of his processing power, and he's a sentient AI.

    Live-Action TV 
  • The makers of the original Battlestar Galactica made an effort to avoid (well, delay) this trope by using the top-of-the-line graphics systems then available for the bridge display of incoming enemy fighters. They looked rather impressive for about five years.
    • Oddly enough, the re-imagined series made a point of this with the computers on Galactica, which have been described as being far below the specs of today's systems. The reason is that during the initial Cylon uprising, the robots were extremely good at hacking, so the Battlestars used no wireless communications and only standalone computers (no networking). Since the Cylons vanished, all the Battlestars have been upgraded; Galactica is the only old-style Battlestar left, as its commander stubbornly insists that the Cylons aren't gone forever. (Obvious spoiler alert: he's right).
      • It is presumably to avoid this trope that you don't really see the computer displays on the Pegasus (a more up-to-date battlestar) or any of the civilian ships, all of which would be running "current day" (or at least more modern) Colonial computers, as opposed to the obsolete systems on the Galactica.
      • The spin-off Caprica used much flashier-looking displays and technology in general - for instance, the tablet device Zoe uses and then rolls up to put back in her pocket.
    • When the film Space Mutiny (which used classic Galactica scenes) was featured on Mystery Science Theater 3000, Mike and the 'bots took notice of this easily.
    Tom Servo: Graphics made by Kenner.
  • Played with in Bones, where Angela has a holographic display with amber graphics resembling some types of '80s CRT monitors. The resolution was way better, though.
  • Doctor Who:
    • In the 1982 serial "Castrovalva", the fantastically advanced TARDIS computer has a display that is outperformed by a ZX Spectrum. Justified, as the whole interface later turns out to be a phony produced by the Master so that Tegan and Nyssa would think they were piloting the TARDIS.
    • The other anachronisms in the TARDIS interface were later retroactively justified when the Doctor changes the TARDIS's "desktop theme" into a more organic, steampunk, retrotech look. Apparently, the Doctor is enough of a Bunny-Ears Lawyer to actually prefer that look over proper graphics.
    • Unlike their Hitchhiker's Guide counterparts, the Doctor Who creative team were quite happy to use BBC Micros to generate their on-set graphics for most of the Fifth and Sixth Doctors' runs. Sometimes they could get away with it if the stories were set in the present or near-future, but stories set further in the future ended up fitting this trope to a tee.
    • In "Warriors' Gate", the privateer's computer displays the TARDIS as a wireframe graphic. According to the DVD commentary, this wasn't even computer-generated — it was done by filming an actual wireframe model.
  • The producers of The Hitchhiker's Guide to the Galaxy (1981) (the BBC miniseries) looked at what the BBC's own effects department offered for the Guide. It wasn't pretty. So they averted this by using very painstakingly detailed cel animation and clever rear-projection tricks to show "advanced" computer displays (such as the tiny non-flat flatscreen of the Guide, the gigantic widescreen display on the Heart of Gold, etc.).
  • In Knight Rider, all of KITT's "complex" displays are source listings of BASIC programs. Given that the software is non-commercial, intended for a single trained user, and designed by a very small team to interface with custom hardware at a time when third-party cross-platform GUI libraries were scarce, a text display was actually quite realistic for the period - though that doesn't mean the staff chose it by doing their homework.
  • Look Around You, keeping with its Retraux theme, makes use of BBC Micros, using one in the first series opening titles to run a laughably simple BASIC program. The second series features a BBC Micro with glitchy voice software welcoming viewers to the future of "Look Around Yog", while a toaster with a BBC Micro attached is a "futuristic toasting system".
  • Max Headroom. Everything is in wireframes. Then again, it was the Trope Namer for 20 Minutes into the Future...
  • Moonbase 3: The computers certainly look more advanced than those used in 1973 - they don't take up a whole room, for one - but there are no graphical interfaces.
  • In the first series of Red Dwarf, Holly's appearance was very pixelated.
  • Many a Trekkie has suffered brain damage trying to explain the dichotomy between the Viewer-Friendly Interface on computers in Star Trek: Enterprise and the flashy lights and hand-made slides in Star Trek: The Original Series. It helps a little that we almost never see the screens of video displays on TOS showing anything other than fullscreen video. We get a better look at a TOS-era display in the Star Trek: Enterprise episode "In a Mirror, Darkly", where it appears to be a sort of art deco version of the TNG-era LCARS interface.
    • Star Trek: The Next Generation and Star Trek: Deep Space Nine had to contend with CRT refresh flicker being visible on camera. For that reason, only specialised TV monitors whose refresh rate could be adjusted to match that of the cameras were used, which meant you rarely saw an animated display in the background, only the ones necessary for the plot.
    • While DS9 has considerably more animated displays than TNG, it makes it look like the Cardassians, trashing the station on their way out, replaced certain displays with 377-year-old Macintoshes, if the Chicago font is any indication. At least some of us wouldn't put it past those Affably Evil Cardassians...
    • Star Trek: Voyager retconned this by having a time traveler introduce computer technology to the 20th century. The result was an alternate timeline similar to our own.
  • In Terminator: The Sarah Connor Chronicles, we learn that at least part of SkyNet is written in Visual Basic and that Terminator CPUs plug into a small subsection of the PCI bus. No wonder they want to kill humanity.
  • Even worse, in Timeslip, a futuristic (evil) computer can output directly as brainwaves or on a video screen. The video screen shows the image of a teletype printing out the computer's output.

    Tabletop Games 
  • Varies heavily in BattleTech, set in the far-off future of 3025 and beyond. BattleMech heads-up displays and cockpit displays are often depicted as fairly simple affairs, albeit more for readability in combat than for lack of processing power. Three-dimensional holographic displays with great graphical capacity exist, but are uncommon due to their cost compared to standard flatscreens, and interstellar transmissions are always sent in the smaller flatscreen format to save bandwidth. The depiction of display graphics has varied heavily over the franchise's run since 1984, with older works trending towards simplistic displays while newer ones use flatscreens, tablets, and holograms.
  • Monitors of any sort are rarely seen in Warhammer 40,000 (it being a miniatures wargame, after all), but the graphical quality of what little we do see tends to vary. Often justified, since most races, especially humans, are living in a Used Future. The most recent example (at time of writing) is the Cold Open of the tie-in video game Warhammer 40,000: Space Marine: the Imperial command's monitor has a fully functional GUI and supports a click-and-zoom map of the galaxy, but can only display yellow, red, and black.
    • Crazy juxtapositions of high and low technology are a big part of 40k's design aesthetic, especially for the Imperium of Man. Sometimes advanced computer monitors are even lit up with tallow candles, lacking any kind of internal illumination of their own.
    • Completely averted with the Tau, whose tech is far more advanced than the humans', to the point where it creeps humans out to see holograms that don't require Percussive Maintenance every five minutes.

    Video Games 
  • Embraced by Alien: Isolation as part of its beautifully realized Retro Universe, for the sake of Zeerust Canon. The game itself has spectacular environmental graphics, particularly for 2014, but all the HUD elements, menus, and in-universe computer terminals are designed to look chunky and lo-fi. Creative Assembly's art team really had their work cut out for them, considering this was the company's first first-person game: the team had access to original sound effects and high-quality assets from the original film, and even utilized early-'80s video technology to provide another layer of authenticity, transferring some visual elements onto VHS tapes, introducing distortion with high-powered magnets, and scanning the result back into the game. Sevastopol station is littered with two-bit CRT computer monitors, punch-card slots, and massive mainframe computers making buzzing mechanical noises, much like those from the original Alien movie.
  • In the mid-'90s Amiga adventure game Dream Web (taking place in the near future), home computers likewise have no graphics at all, and no real user interface either. The user is stuck with a clumsy DOS-like interface for everything from reading e-mail to fetching the latest news broadcast (which consists purely of text, too).
  • Justified in the Fallout series. The transistor wasn't invented until 2067 (roughly a hundred years after the real-world silicon transistor), so the common computers just before the Great War of 2077 are very simplistic, equivalent to late-1970s personal computers. Displays are massive monochromatic green/amber cathode ray tubes; even the Institute in Fallout 4 still uses CRTs despite having had two centuries to improve. Holograms had been developed, but the technology was still in its infancy when the world ended. The Sierra Madre Casino in Fallout: New Vegas has fairly realistic, albeit monochromatic, Hard Light holographic security drones and entertainers.
  • Played with in Grand Theft Auto: Vice City Stories with the advertisement for the Fruit LC personal computer, with features like 18 kilobytes of memory and a two-tone, 8-inch display. In 1984, when the game is set, this would still have been a rather respectable system.
  • The computers in Grim Fandango appear to be teletypes hooked up to enormous amber-monochrome screens. It fits with the Art Deco theming everywhere. It's also never explicitly stated just when the game is set; if anything, it seems to be around the Forties or Fifties, which would make them advanced for their time.
  • Ansem's Computer in Kingdom Hearts II is supposed to be highly advanced, storing all of his and his students' research data. Yet it uses 8-bit graphics and a user interface that looks like the most primitive form of Windows the world has ever seen. Not even a mouse is used. It's somewhat justified by the fact that this computer is the gate to "Space Paranoids", a world based on the '80s movie TRON, and that it is at least twenty years old by the time KH2 takes place, with nobody around in the meantime to upgrade the hardware or software.
  • In the Mass Effect series, we have whizzy holographic monitors with monochrome visuals (usually amber, sometimes blue). Even non-holograms tend to be grainy, full of static, or blurry.
    • The sequels justify this in-game by explaining that all of the important holographic conversations occur instantly across pan-galactic distances via quantum entanglement technology, which is still very much in its infancy. It looks crummy because only a tiny handful of such communicators exist in the galaxy, and most of the ones used by the Alliance had to be reverse-engineered from what they could steal from Cerberus and the Normandy SR-2. Getting it to run in 1080p before the Reapers arrived probably wasn't their highest priority. Given that every other computer display in the series is just as bad, they did a pretty good job.
      • The regular work space holographic displays look like they were specifically made to cause seizures or otherwise injure their operators. They're pointlessly layered (making text illegible), out of focus, and flicker constantly. The Codex justifies it by pointing out that people working on displays use haptic implants in their fingers and special glasses or implanted lenses to properly see what they're working on. Without those, the display looks messy.
  • In Mega Man X, the intro has Dr. Cain working on a circa-2114 machine with 8 petabytes of "real mem" (probably RAM) and 32 PB of "avail mem" (probably space in the swap partition of the hard drive), whose power-on self-test sequence still looks like a plain-text BIOS readout. By contrast, a Mac Pro can be configured with 64 gigabytes of RAM (1/131,072th of the fictional machine's) and 8 terabytes of drive space (1/4096th), and, well...
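    For the curious, those ratios do check out, assuming binary units throughout - a quick sanity check in Python:

        # Sanity-checking the memory ratios quoted above (binary units).
        PB = 1024 ** 5  # bytes in a petabyte
        TB = 1024 ** 4
        GB = 1024 ** 3
        print((8 * PB) // (64 * GB))  # 131072 -> RAM is 1/131,072th
        print((32 * PB) // (8 * TB))  # 4096   -> storage is 1/4096th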
  • The Outer Worlds, being a spiritual successor to Fallout, very much does the same, despite humanity being in a space age. Unlike Fallout, no real attempt is made to explain why technology is backwards compared to the real world. It's just how the setting is.
  • PlanetSide uses this for its virtual reality training areas for soldiers to experiment with new equipment before unlocking it. In the first game, objects in the VR had thick outlines and the terrain was super low-resolution and overlaid with wireframe. In the second game, objects look just like real life up close but beyond a few hundred meters the world fades away to black-and-white wireframe.
  • In the Shadowrun SNES game (which takes place in the 2050s), office computers don't have any graphics at all! Whenever you use your cyberdeck to jack into the Matrix, you get a screen full of command lines in the classic green-on-black monochrome scheme while the connection is established.
  • Speedball 2: Brutal Deluxe has a text introduction about the history of the Speedball sport in the period around the turn of the 22nd century. This text is presented in monospaced all-caps with a blinking block cursor.
  • Used in Startopia. Most likely intentional, given that the game is a love letter to "classic" sci-fi.
  • The intro for the Sega Master System version of Super Space Invaders, even though the game takes place in 2073, features CRT-like graphics and a Beeping Computer.
  • In Superhot, the OS for the player's computer and most of the apps on it are made of ASCII art, but there are hints of Cyberpunk levels of technology available to the System, most prominently their Brain Uploading.
  • In Vampire: The Masquerade - Bloodlines, all the computers run on DOS, despite the game taking place in 2004.

    Western Animation 
  • This happens frequently on Futurama, what with it being a parody of classic science fiction. Notably, in "War is the H-Word", a bad-graphics hologram of a planet shows up and Fry is actually impressed.
  • Mighty Orbots: In "The Wish World", Rob's playing a video game that fills up the whole room with its display but still looks like a bog-standard mid-'80s spaceship shooting game, even though this cartoon is set in the 23rd century.
  • Parodied in Star Trek: Lower Decks: when the cast are taking part in Boimler's holo-novel, graphics of the same level seen in Star Trek II: The Wrath of Khan are shown. Rutherford is blown away by the amazing graphics.
  • Displays for gem tech in Steven Universe tend to be simple shapes in few colors or monochrome (in whatever color the screen is). All of them seem capable of displaying photos and video, which have clear resolution but heavy tinting in the screen's color.

    Real Life 
  • Modern tactical displays honestly do look a lot like the computer screens in A New Hope - the lines are much thinner and the overall picture is generally much sharper, but it's still the same spartan, simplistic vector graphics with a purely functional look. If a video feed is featured, it's usually monochrome footage from a thermal camera or image intensifier; if a map is displayed, it's a bare vector version, overlaid with targeting reticles, unit icons, attack vectors, fields of fire, projected trajectories, etc., all stark and functional, with simple alphanumeric readouts for the required data. The last thing a commanding officer needs is unnecessary bells and whistles that could introduce ambiguity or tax the performance of their not-very-powerful heavy-duty hardware.
  • There's a good reason for the command-line interface's continued existence despite the advent of GUIs and, later, touch interfaces. It consumes far fewer system resources; it's trivially easy to automate (just whip up a text file containing a bunch of commands and some flow-control statements, as sketched below); and it's really light on network bandwidth when accessed remotely, since it sends a bunch of (hopefully encrypted) text instead of what amounts to a video feed of a desktop. And if the crisp black backgrounds, 4K antialiased fonts, and eye-strain-free flat panel displays of today's CLI just aren't retro enough, there are "eye candy" terminal emulator programs like Cool Retro Term which attempt to recreate the hazy, flickering, amber-tinted cathode ray tubes of the terminals of yore, complete with incessant beeping on every key press.
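    As a minimal sketch of the "trivially easy to automate" point (the host names and the ssh/uptime commands here are purely illustrative, not any particular system's setup), the entire automation layer can be a few lines of script, written here in Python for self-containedness:

        # Minimal sketch of CLI automation: run one plain-text command per
        # host and retry on failure. Only text ever crosses the network.
        import subprocess

        HOSTS = ["backup01", "backup02", "backup03"]  # hypothetical hosts

        for host in HOSTS:
            for attempt in range(1, 4):
                # "uptime" stands in for any remote maintenance command.
                result = subprocess.run(["ssh", host, "uptime"],
                                        capture_output=True, text=True)
                if result.returncode == 0:
                    print(f"{host}: {result.stdout.strip()}")
                    break
                print(f"{host}: attempt {attempt} of 3 failed, retrying...")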
  • As of the mid-to-late 2010s, UI design languages have moved towards simpler, "flatter" appearances, away from the sleek, glossy look of the late 2000s and early 2010s, largely thanks to Microsoft's introduction of the "Metro" design language (later renamed "Modern" or "Microsoft Style" due to trademark issues) in 2012 - though the shift sparked a Broken Base among those used to the older glossy style. Other software and IT companies such as Google followed suit (most visibly by flattening their own long-standing logos, much as Microsoft flattened the nearly 20-year-old Windows logo), since design styles can't be patented, and the flat aesthetic has spread across UI design on both PC and mobile operating systems and on web pages.
    • One theory posits that earlier, more visually complex UIs were designed to compensate for lower screen resolutions, and trying to scale these items up takes a lot of work or produces ugly results.
  • Similarly, many business applications are extremely primitive, but in this case it's often for the comfort of employees who have been using the same program for decades, and of companies that don't want to lose work hours while staff get used to a new interface: changes between versions tend to be "under the hood", simply adding new features without changing the familiar, outdated look. If the interface does get updated, there will often be an option to use the old look as a shell over the new interface.
  • The Voyager probes (and others) are one of the most epic dual subversions/justifications in human history. As Ray Heacock, spacecraft systems manager for the Voyager program, once explained:
    Any good... PC, today, will have several hundred thousand words of memory, and no one would think of buying a computer with the limited capabilities that the Voyager systems have. And of course, today, no one would think of building for spaceflight computers with such limited capabilities. But the thing that these computers had was reliability. And being programmable from Ground Operations, we can still have them perform very complex and sophisticated operations.
    — interview, The Infinite Voyage series, Sail On, Voyager!, 1990
    NASA engineers chose computer systems for the spacecraft that were not the absolute most advanced even in their own day (1977), in favor of systems intended to never have the slightest chance of failing in-mission. Forty years later, the still-functioning first spacecraft ever to leave the solar system bear testament to their builders' foresight in valuing proven endurance over cutting-edge but uncertain technology.
  • There are also other concerns that keep computers in space slower. The first is cooling: while space is extremely cold (2.7 K), the only cooling available is very slow thermal radiation (convective cooling, i.e. fans blowing cool air on the components, doesn't work in a vacuum for obvious reasons), so operating temperatures have to be kept low. The second is the sheer amount of radiation shielding and/or design redundancy required to keep delicate electronics from being fried outside the natural protections we enjoy on Earth (the atmosphere, the magnetic field, etc.). This also adds to the cooling problem - you can put your computer inside a lead box to prevent charged-particle radiation from scrambling the memory, but then the lead acts as an insulator. Finally, spacecraft components are expensive, as they're built in very small numbers at best (partly to have ground spares for diagnosing whatever fails up there), and updating a component, beyond designing the component itself, may mean a more or less thorough redesign of the spacecraft to account for changes in power consumption, mass, and so on.
  • Many processor- and memory-intensive tools, 3D art programs for example, use extremely primitive interfaces. The fraction of a second of lag as a computer renders the high-res fonts and drop shadows of a typical program's interface can become several seconds when 90% of the machine's resources are dedicated to rendering a high-poly mesh or an HD-resolution image. Multiply that by an entire day's worth of opening and closing menus and panels and you begin to see why the GUI in a typical art program looks like something from the early '90s.
  • If properly designed, a simple graphic can convey all the necessary information at a glance. Compare the Heads-Up Display in a video game: simple icons and colored bars are used to represent large amounts of complex information quickly.
    • A similar example is the graphics used in sports broadcasts (the "score bug"), with baseball being a prime example. Someone walking by a television showing a Major League Baseball game with the sound off can, from a few numbers and symbols in the corner of the screen, immediately tell who's playing, the score, what inning and what half of the inning the game is in, how many are out, how many runners are on base (and which bases are occupied), the count on the batter and, in a playoff, the series standing - as the sketch below illustrates.
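      Just to show how little data that is (the team abbreviations and format here are made up, not any broadcaster's actual graphic), the entire game state fits comfortably in one short line of text:

          # Sketch of a baseball "score bug": the whole game state in ~30 chars.
          from dataclasses import dataclass

          @dataclass
          class GameState:
              away: str
              home: str
              away_runs: int
              home_runs: int
              inning: int
              top_half: bool
              outs: int
              bases: tuple  # (first, second, third) occupied?
              balls: int
              strikes: int

          def score_bug(g: GameState) -> str:
              half = "TOP" if g.top_half else "BOT"
              bases = "".join("o" if b else "." for b in g.bases)
              return (f"{g.away} {g.away_runs}-{g.home} {g.home_runs} "
                      f"{half}{g.inning} {g.outs} OUT [{bases}] {g.balls}-{g.strikes}")

          # Prints: NYY 3-BOS 2 TOP7 2 OUT [o.o] 3-2
          print(score_bug(GameState("NYY", "BOS", 3, 2, 7, True, 2,
                                    (True, False, True), 3, 2)))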
  • Interfaces for tasks like air traffic control tend to be extremely primitive-looking simply because that reduces the number of distractions, increases the speed at which the viewer can take in the information, and allows the screen to be updated in near real-time. This is critical when lag for either the operator or the computer can result in a fiery mid-air collision!
  • CAD programs such as AutoCAD use the same blueprint shorthand that has been in use for nearly two centuries, in a standard format. This prevents mistakes that can lead to injuries and deaths, because it is familiar to anyone in the engineering and construction industries, regardless of language. An engineer from the 1800s could pick up a blueprint printed from a CAD program and would find only the notations for advanced electrical wiring moderately unfamiliar.
  • Many programs written for scientific research purposes tend to be simplistic in terms of graphics because they are written purely for utilitarian purposes, sometimes as a home-brew solution which may only be used a few times by the researcher for a single experiment. Even on high budget projects, more money tends to go toward hardware and staff than toward designing an aesthetically pleasing interface.
    • Even in programs explicitly designed to produce graphical output (such as realistically isometric renderings of complex molecules), the interface, such as it is, may be something that quite literally wouldn't have been at all out of place in the 1960s, with the only concession to modern technology being that the atom positions and rendering options are contained in a text file rather than a physical deck of punched cards.
  • Many companies still use old software because it does what it needs to do, everyone is already trained on it, it's reliable and updating it would be a monumental undertaking. Often the code is cryptic, poorly documented, poorly understood and sometimes even written in some archaic language nobody programs for anymore. And the only ones who'd know how to migrate it to a newer architecture would be the original authors - who are now retired, too old to remember, or may even have left this plane of existence altogether. At that point you can pay a whole devteam big money to rewrite the whole damn thing from scratch, or you can get an undergrad to spin up a virtual machine and just run your old software on that. Hmm, tough choice.
    • Antiquated, reliable software, often written in antiquated, reliable languages (Ada in particular), is particularly common in militaries, making the Star Wars page header something close to Truth in Television.
  • Game tools not part of the game package, or primarily intended for internal use, normally use the default Windows interface; think of the difference between the Elder Scrolls games and the Elder Scrolls Creation Kit, or the average server-based game and the average server-based game-hosting interface. Tools like this are meant primarily for people who are used to the game and possibly bored by having worked on it for so long; they need the best performance they can get, and don't need to spend so much time and effort on graphics.
  • A lot of serious Linux users favor minimalistic window managers and the command line over fancy graphical interfaces.
  • On the dark web, web page design seems to have plateaued at some point between the late '90s and the mid-'00s. The anonymizing networks needed to access the dark web slow the speed at which pages load to a crawl, as requests are bounced around several servers to keep them from being tracked, and as such, dark web pages often have a minimum of ornamentation in order to get them to load as quickly as possible.

Alternative Title(s): We Will Use Micros In The Future
