Dork Age / Real Life


  • One can't help but get the impression that architectural schools were infiltrated by the KGB during the Cold War, placing in vogue the Stalinist trend known as Brutalism, which produced ominous concrete blocks of pure authoritarian coldness. The future seemed bleak for decades, until Postmodernism rode in from the West, kicked out the commies, and saved the day. The legacy of this jarring midcentury trend can still be seen today on public urban buildings and state university campuses. The city of Boston, unfortunately, fell victim to Brutalism when a new city hall was commissioned; the chaotic, faded mess that ensued elicits near-universal disgust from visitors and remains a testament to the failures of The Eastern Bloc and its Western sympathizers. The only city to actually get Brutalism right was none other than America's capital, Washington, DC, in the form of the Washington Metro. Its uniquely carved concrete walls and ceilings, interlocking in arches at every hallway and mezzanine, prove that Brutalism isn't always a regression of aesthetics. In fact, the WMATA underground is something the Moscow Metro could only dream of looking like; curiously, the Moscow system was instead built in an uncharacteristically beautiful baroque style, in a country not exactly known for "bourgeois" aesthetics. One of the big downsides of concrete is that it does not age gracefully in a temperate climate, much less in one with any degree of air pollution. This problem is, of course, less notable in "indoor" spaces such as the Washington Metro.
  • In the Eastern Bloc itself, this Dork Age of architecture lasted until the fall of communism. Even after that, new aesthetic influences reached the urban landscape rather slowly, so the depressing views of endless concrete blocks and boxy, edgy monuments are there to stay. Worse, in some former communist states, structures that were decidedly ungainly and undesired even when new (some monuments earned Fan Nicknames like 'seven-winged five-dick' owing to their lack of aesthetics) have been left to decay amid the intense corruption that followed the Hole in Flag revolutions, with no intention to improve or replace them. This has led to a sentiment along the lines of "at least they used to build stuff back then and put some flowers around it"; the architectural Dork Age was followed by an even bigger Dork Age.
  • The worst part of Brutalism was that it caused just as many problems as it solved. Sure, it was cheap and quick to build (because it was all concrete), but it became very expensive to maintain (because it's all concrete). Combine this with the above-mentioned air pollution issue (rampant through the 20th century until at least the '90s, and still ongoing in some countries), and many of the structures became little more than crumbling, moldy tenement halls. At least it gave us a great art style for Doom levels.
    • Many brutalist buildings were designed and constructed in an era of cheap oil and coal. Now these buildings cost a small fortune to heat in the winter and cool in the summer.
  • In general, the era of the "automotive city", with its barren concrete plazas, urban highways on stilts, and the tearing down of entire neighborhoods to make way for roads and parking, is now considered the worst of all architectural dork ages. Almost all other styles and epochs have their defenders and people who "revive" them today, but this one is so thoroughly derided that it explains the bad rep of Brutalism (which mostly happened in the same era) described above.
  • The Nazi Party thought the simplistic Bauhaus style, from the Bauhaus architecture and design school, was this. Much like Brutalism, one of its purposes was to be a socialist attack on the bourgeoisie: a simple, money-saving alternative approach to design. Unlike Brutalism, it was good-looking and ergonomic. Of course, this was taken as a protest against the overtly ornate and luxurious classical Germanic style; it didn't help matters that the school's director by the early '30s was a far-left communist sympathizer who would expel students not committed enough to the ideals, resulting in the school being raided by the Nazi Party in 1933.
    • On the flip side, while many consider Bauhaus ugly, since it sometimes oversimplifies to the point of blandness, it has been extremely influential in engineering and digital media. Many tools, webpages, and user interfaces today follow Bauhaus principles of ergonomic design.

    Automobiles and motorcycles 
  • Remember all those great cars Detroit came out with in The '70s? No? A toxic combination of stagnant innovation, poor design and quality control, Congress relaxing import quotas (allowing foreign automakers to sell more cars in the USA), new emissions and fuel economy regulations, and the oil crises of 1973-74 (prompted by the Yom Kippur War) and 1979 (prompted by the Islamic Revolution in Iran) nearly destroyed the industry. Chrysler required a government bailout to survive, American Motors collapsed altogether and saw its pieces snatched up by Renault and later Chrysler in The '80s, and Ford and GM were better off only by comparison, rapidly losing market share to Japanese and German automakers who built smaller, more efficient, and more reliable cars. It did destroy the city of Detroit itself (and most of Michigan, for that matter), and to this day, there are many Americans of a certain age who still refuse to buy domestic. Auto writer Murilee Martin coined the term "Malaise Era" to define the years from 1973 to 1983, when the quality and performance of American cars seemed to be in active decline. While Detroit did start making some good cars again from the mid-'80s onward (cars like the Ford Taurus, Chrysler's "K-cars" and minivans, and GM's Saturn brand and A-body platform showed that they still knew how to innovate), they still put out more than the occasional stinker (see: the Chevrolet Cobalt) until the mid-'00s, when Detroit realized that they were going to completely lose the market to foreign competitors and upped their game. As of now, there are lots of domestics that are every bit as good as foreign cars (and, in many cases, better), but anyone with any sense will be very careful about most used domestics from model years prior to 2009 or so.
  • British cars fell off even harder in The '70s, which led to British Leyland consolidating most contemporary car brands. In theory, having most of the major British car companies under one organization was a good idea... but many of them were competing against each other in the market, and none of them knew how to work together. As a result, many cars of the time were cheap, quickly-produced Suspiciously Similar Substitutes, while new models took a long time to develop.

    Despite its decent sales numbers, the Morris Marina is widely considered one of British Leyland's worst cars. Add to that BL's disputes with trade unions, the oil crisis, and the "Three-Day Week", and you can see how British Leyland became the poster child for Britain's '70s industrial problems; the company collapsed in 1975, taking most of the industry down with it, and had to be nationalized just to keep the lights on at the factories. Nowadays, all the major British carmakers are owned by foreign groups, with Rover having closed in 2005.
  • The Ford Mustang II, sold from 1974 to 1978. The concept was good on paper: reverse the size creep the Mustang had been suffering from since the late '60s and go back to the basic concept of a compact coupe with a powerful engine. What consumers actually got was basically a Pinto with a fancier body, no V8 option, and enough mid-'70s chrome, vinyl, and fake wood for a much larger car. Sales for the Mustang II were actually much better than for the late '60s/early '70s Mustangs, but it alienated enthusiasts. Even after it got a V8, any performance advantages from the lighter, more nimble body were negated by federally mandated emissions-control devices that sapped power; this led to disgruntled fans calling it the "Disgustang". Meanwhile, to add insult to injury, the Mustang's rivals, the Chevrolet Camaro/Pontiac Firebird twins, underwent something of a Golden Age in the '70s. While they too felt the effects of the new standards (they were nearly killed in 1972 due to a UAW strike concerning the new regulations), their performance didn't suffer nearly as badly as the Mustang's, and their bodywork wasn't nearly as garish as other cars of the era. The Camaro and, to a lesser extent, the Firebird outsold the Mustang by 1977 because they were some of the only cars at the time worth getting for sports car/post-muscle car enthusiasts (made all the more apparent with the release of Smokey and the Bandit, which really boosted sales of the Camaro and Firebird that year). To this day, the Camaro and Firebird are probably the only American performance cars not to have their legacy stained by WTH engineering/design departments even during The '70s. The only low point in the Camaro's career was the Iron Duke-powered base model of The '80s, and even that did rather little to hurt the Camaro's popularity.
    • Recently, though, some auto enthusiasts have begun to defend the Mustang II, arguing that it was simply an alright car that had the misfortune of coming between the groundbreaking first generation and the long-lived, successful third generation.
  • The '80s were not kind to Cadillac. First, they introduced the V8-6-4 engine, which was meant to maximize fuel economy by activating and deactivating cylinders according to driving conditions, but which proved unreliable due to the limitations of early-'80s computer technology. Then they attempted to break into the compact luxury market with the Cimarron, which was little more than a Chevrolet Cavalier with nicer trim and Cadillac badges. The Cimarron was marketed towards young, upwardly mobile customers who normally bought BMWs, but they weren't impressed, while traditional Cadillac customers were incensed by the cheapening of the brand. The replacement for the V8-6-4, the High Technology engine, proved to be a maintenance nightmare, leading many owners to swap it out for Oldsmobile and Chevy engines. The redesigns of the De Ville and Fleetwood (1985) and the Seville and Eldorado (1986) were supposed to give the lineup a modernized look but were widely panned as dull and boxy, which harmed sales greatly. The final straw was the introduction of the Allante, which tried to compete with the Mercedes-Benz SL and Jaguar XJS, but failed to make a dent in either car's sales thanks to derivative styling and a bland driving experience. By the end of the decade, Cadillac had gone from dominating the luxury car market in the US to struggling to compete with both European and emerging Japanese brands. Things would begin to look up for them in The '90s, though there were still some missteps in that period, such as the Catera, which was basically an Opel Omega thinly disguised as a luxury sedan. It took the introduction of the "Art and Science" theme in the 2000s to truly get them back in gear.
  • The third-generation (1996-99) Ford Taurus destroyed that car's reputation virtually overnight. With the rest of the automotive world having caught up to what the original Taurus had accomplished in The '80s, Ford wanted to move the car upmarket and give it a more distinctive look inspired by contemporary Jaguars (Ford owned Jaguar by this point) and Infinitis in order to stand out from the competition (chief designer Jack Telnack compared the first two generations of Taurus to "a pair of slippers"), and the result was a rounded, oval-shaped appearance that certainly did that. The moment the car debuted at the 1995 North American International Auto Show in Detroit, it was derided by consumers as the "Bubble" or "Submarine" Taurus, and while it was reliable and a pleasant car to actually drive, its blob-like styling turned many people off (especially with how it obstructed the driver's sight lines), as did its reduced trunk space, increased weight (without a horsepower boost to match), and a higher price point that came from luxuries that most people didn't really want. The high-performance SHO model was also underwhelming compared to previous versions, lacking a manual transmission option and being prone to camshaft problems on top of it. In its debut model year, it only kept its position as the best-selling car in America because of sales to rental fleets (which made up 51% of sales), and by the following year, it had fallen behind the Toyota Camry. Ford hastily redesigned it in 2000 to give it a more conservative appearance, but it was too little, too late, and the Taurus, once the innovative, cutting-edge sedan that saved Ford from bankruptcy, never regained its position. Jack Baruth, writing for Road & Track, called it "the saddest car ever made".
  • The ninth-generation (2012-2015) Honda Civic is remembered as a black spot in the history of what is otherwise one of the most celebrated compact cars in the world. Not only did the Great Recession give Honda the bright idea to position the new Civic as more of an entry-level car with cheaper materials (and just as the economy was starting to recover, at that), but the 2011 Tohoku earthquake and tsunami, which disrupted supply lines for all Japanese automakers, didn't help matters either. The result was a Civic that very much felt like a bargain-bin model, lacking the fit and finish that had once helped the Civic stand head and shoulders above competing compacts. Notably, Consumer Reports, for decades one of the Civic's biggest boosters in the automotive press, took the car off its Recommended list for the first time in 2012. Honda scrambled to fix the car, eventually giving it a full redesign for the tenth generation just four years later, and while that car too initially suffered teething issues (the debut 2017 model again failed to make CR's Recommended list), once those were ironed out it won back many previously disappointed Honda fans.
  • The storied American motorcycle manufacturer Harley-Davidson has had two of them.
    • To start with, the "Malaise Era" of American motor vehicles was hardly restricted to cars, as evidenced when American Machine & Foundry, a company known primarily for sporting goods (bowling in particular; yes, it's the same AMF that owned a lot of bowling alleys and made lots of bowling balls), bought out Harley-Davidson in 1969. While their cash infusion saved the company (then close to bankruptcy) in the short term, their mismanagement ran it into the ground in the long term, as their attempts to streamline production did little more than drive build quality into the gutter and ignite labor disputes. During The '70s, Harleys were more expensive, less reliable, and worse-performing than comparable import bikes, the general opinion being that the only things they had going for them were patriotic appeal and style. The term "hog" to describe Harleys, while since reclaimed by the company and Harley riders (the Harley Owners Group takes its name from it), originated during this time as a pejorative; other nicknames included "Hardly Ableson", "Hardly Driveable", and "Hogly Ferguson". The Confederate Edition series produced in 1977 is also one that the company would like to forget. The "AMF years" ended in 1981, when AMF sold Harley-Davidson to a group of investors that included the grandson of co-founder William A. Davidson. They turned the company around with a series of much better-built bikes, most notably the Softail series, while deliberately playing to nostalgia and patriotism with Retraux styling rather than trying to copy their Japanese and European competitors, in the process turning Harley-Davidson into a lifestyle brand as well as a motorcycle company.
    • Unfortunately, this lay at the root of Harley's second Dork Age in the 2010s. Among those who feel that the company has lost its touch, the belief is that the seeds of Harley's comeback in The '80s were a Franchise Original Sin that caused it to ignore trends in motorcycle technology and engineering in favor of relying on the Nostalgia Filter of aging Baby Boomers who grew up with '60s biker culture — a strategy that produced a stereotype of Harley riders as Former Teen Rebels and Amazingly Embarrassing Parents who bought their bikes as mid-life crisis mobiles, left the bikes themselves as luxury items built on aging platforms that had little to offer for anybody interested in performance or practicality, and eventually came back to bite Harley in the rear once their target demographic started getting too old to safely ride motorcycles. The rise of the relaunched Indian Motorcycle brand in the 2010s, which built the same kind of classic-style, all-American cruisers that Harley did with more modern technology and superior performance, simply underscored the problems that Harley faced. As of 2021, Harley is making a concerted effort to end the Dork Age, introducing the LiveWire electric motorcycle and the Pan America touring bike targeted at new customers beyond their normal demographic; time will tell if this pulls them out.
  • Hyundai had one in the United Kingdom and Continental Europe around the Turn of the Millennium, when it was seen as producing Boring, but Practical Korean family cars. The Accent subcompact was regarded as So Okay, It's Average, with only the plusher GSi and CDX trims being recommended and the Elantra considered the best model of the range. The midsize Sonata struck most buyers as too Americanized and chrome-laden, so they went for the then-contemporary Ford Mondeo and Volkswagen Passat instead, while Hyundai's SUVs were seen as dull and too rough-and-ready for some, with the Santa Fe considered good, but not great. It took until 2010 for things to fully improve, although the new i30 launched in 2007 went some way toward repairing that reputation.
  • General opinion of Nissan is that the quality of their cars went to pot during the 21st century, with this article by Kristen Lee for Jalopnik going into more detail. Their once-acclaimed sports cars grew stagnant, with the flop of the 370Z (released in 2009 at the height of the Great Recession) being a turning point in convincing Nissan to let their sports car lineup wither on the vine, while their mass-market passenger cars and SUVs earned a reputation as more shoddily-built versions of comparable Toyotas. Many blame the partnership with Renault and the resulting new CEO Carlos Ghosn, who, ironically, was initially hailed as a corporate hero in the manner of Lee Iacocca for returning Nissan to profitability in the early 2000s. (Ford tried to recruit him to be their CEO at one point, and there was even a manga about him.) However, his later years were marked by increasingly poor reliability and build quality of Nissan's passenger cars, owing to both the cost-cutting measures he took and the use of Renault's French parts in Nissan's cars, and it would later be discovered that he was involved in financial chicanery, leading to his arrest and removal from the company.
  • The German sports car maker Porsche was in a bad place in the late '80s and early '90s. Sales crashed after peaking in the mid-'80s as its bloated and inefficient production process reduced output, the early-'90s recession hammered luxury carmakers across the board, and its models grew extremely long in the tooth; the venerable 911, while undoubtedly a Cool Car, had been designed back in 1964 and couldn't compete with more modern sports cars, while the 968 offered little change from the 944 that had been in production since 1982. At the time, it seemed as though Porsche was coasting on its '80s yuppie cachet rather than on innovation and performance. The introduction of the entry-level Boxster in 1996 marked the beginning of a turnaround, allowing Porsche to re-establish its place as one of the world's premier sports car manufacturers.
  • Tesla went through one starting in 2019, due to controversies surrounding CEO Elon Musk (such as him smoking weed on a radio show), the brand's now-notorious "no dealers" schtick, reliability issues, autonomous-driving features that could be lethal and did not live up to promises, and questionable styling. Tesla as a brand was seen as something of a Spiritual Successor to DeLorean, although its politics were far different. Tesla ranked lowest among all brands in a June 2020 survey on build quality by J.D. Power, and between that and the controversies, some buyers grew suspicious and prepared to wait for competing luxury electric cars from Mercedes-Benz and Porsche.
  • The Volkswagen Golf GTI has had two occasions where it didn't live up to its reputation as one of the best sports hatchbacks, falling short of its target demographic's expectations:
    • The Golf GTI MkIII of 1992 had a 2.0-liter, 115hp 8-valve engine (2.0 8v) that did 0-60mph in 8.7 seconds, mediocre for a supposed hot hatch: all show and no go. Aesthetically, though, it was pleasing, with its "Longbeach" 14-inch or "Ancora" 15-inch wheels.
    • Then there was the Golf GTI Mk4 of 1998-2004, whose 2.0-liter, 115bhp 4-cylinder model, launched in 1999, looked sporty but did 0-60mph in 10.2 seconds, making it the least sporty of the lot and a GTI In Name Only. British customers were largely unimpressed, preferring the sportier 1.8T 20V 150 (its 1.8-liter, 150bhp 20-valve 4-cylinder was noticeably quicker, doing 0-60mph in 8.2 seconds) and, later, the 1.8T 20V 180 (with a more powerful 180bhp version of the same engine that was quicker still).
    • Interestingly, buyers in North America didn't have this problem as much; there, the Volkswagen GTI (dispensing with the Golf name) was marketed as a luxury car rather than as a workaday family hatchback with performance attributes.

  • Lockheed in The '70s decided to bribe various government officials and cover up problems with the F-104. As a fighter plane, it was good; as a light bomber, not so much. The scandals almost killed the company, and the commercial failure of the L-1011 TriStar didn't help matters any, either.
  • Boeing fell into this in The New '10s with the Boeing 737 MAX, the latest update of the proven 737 series of airliners. The MAX started off with a lot of promise, but after two fatal crashes, aviation authorities grounded it. As it turned out, Boeing had equipped the 737 MAX with an untested software system designed to compensate for the engines being mounted in a way that affected the plane's controllability; since a primary selling point was that pilots did not need to be retrained, Boeing didn't inform their customers of the new system, and failed to account for the possibility of the software (or, indeed, the single sensor feeding it data) malfunctioning. So sure were they that this would pose no problem whatsoever, much less one with the potential to kill people, that they made an AoA disagree warning an optional extra. As a result, when the system malfunctioned due to inaccurate sensor data, the pilots had no idea how to counteract it, leading to the crashes. The costs for the company are estimated at more than $18 billion. Problems with the SLS launch vehicle and the Starliner crew capsule in development for NASA, and flaws in the 787's software, are not helping matters any, either.
    • As of 2021, the 737 MAX is being recertified across the globe (except, of course, for China, the first country to ground it), as the necessary changes have been made, and airlines are, surprisingly, for the most part eager to resume flying them; on June 29, 2021, United Airlines announced that they were going to purchase 200 more of them!
    • Around the Turn of the Millennium, the KC-X program sent a Boeing executive to prison. The KC-46 tanker/transport has also had lots of teething troubles: at one point it was banned from carrying cargo, and tools were left inside its fuel tanks. On the civilian side, the 787 Dreamliner suffered battery issues that could have caused fatal accidents had they not been caught.

    Computers & Electronics 
  • Apple:
    • Their product range during the tail end of the 1980s and early 1990s had degenerated from the insane greatness of the classic Apple Macintosh to the extraordinarily bland Performa range. Although the Powerbooks sold well, and the Power Macs and Quadras got good reviews, none of the company's products were particularly exciting.
      Strapped for cash, Apple even took to licensing clones of the Mac hardware, which raised money in the short term but ate into long-term Macintosh sales. The company was in pretty bad shape before Steve Jobs came back in 1997 and the original iMac was released in 1998, and it took them a few more years after that to finally get rid of the mess that the classic Mac OS had become.
    • On the phone/tablet front, Jony Ive's signature flat, Helvetica-soaked design language (replacing a previous, less-harmonized appearance that a lot of people found excessively skeuomorphic on both platforms) has been divisive since debuting with iOS 7, while iOS 8 was not only buggy but, according to a class-action lawsuit filed at the beginning of 2015, so bloated that it didn't leave enough room for user content. The iPhone 6's larger size (4.7 and 5.5 inches for the standard and Plus models) was also contentious. While some were excited that Apple was finally making a "phablet" to compete with similarly large Android offerings, those who liked the smaller, older iPhones were dismayed, especially Jobs loyalists, given that Jobs had made a point of never making an iPhone with a screen larger than 3.5 inches, which he felt was the perfect size for a smartphone (he derisively compared larger phones to Hummers). The release of the "budget" iPhone SE in 2016, combining the power of the 6S with the form and four-inch screen of the 5S, is generally seen as an attempt to Win Back the Crowd on that front.
    • The announcement later that year that the iPhone 7 would remove the headphone jack, instead using Bluetooth and Apple's proprietary Lightning port for headphones, wiped out any goodwill that had been earned and then some. Many saw it as an attempt to force users to shell out extra to replace their headphones with new ones that used the licensed Lightning connector (especially given Apple's purchase of headphone maker Beats), without thinking about those who depended on the headphone jack for other uses (car connectors, credit card readers, et cetera). While there were those who defended the decision, saying that it allowed Apple to make the phone thinner while adding more internal hardware, others argued that those two goals worked against each other, and questioned why phones needed to be so thin in the first place given how many people bought protective cases for them anyway, especially when "Bendgate" and the Samsung Galaxy Note 7 explosions were partly caused by the respective companies overestimating how thin a phone could practically be.
    • The MacBook lineup went through this starting around 2015. For starters, Apple changed the previously well-loved keyboard design to one relying on butterfly switches that were less comfortable to type on and incredibly fragile, and the model after that replaced the keyboard's function keys with a gimmicky, awkward Touch Bar that was even less reliable than the rest of the keyboard. Thanks to Intel going through a Dork Age of its own, performance and battery life also became increasingly wonky, with some models suffering from severe thermal issues. Apple also replaced all ports except the headphone jack with USB-C/Thunderbolt 3 ports, with the first MacBook to do this not even getting Thunderbolt's benefits, because its only non-headphone port was plain USB; users were forced to rely on clunky, awkward dongles to regain basic functionality. While Apple did fix or at least reduce these issues over time, the lineup only really left its Dork Age in late 2020, when Apple switched out the Intel processors that powered the previous models in favour of its own custom ARM chip. While this move was initially met with skepticism and trepidation, Apple ultimately proved the skeptics wrong: the M1 MacBook models, despite being entry-level machines, turned out to have incredible performance and battery life while maintaining speedy, functional backwards compatibility with Intel-based programs. Of note, Geekbench's single-core test running under Rosetta 2 on an M1 MacBook Air outperformed the same test on every Intel-based Mac, including higher-end ones.
    • Apple's iOS mobile operating system has had its ups and downs over the years, but one of the more notable downs was iOS 11. It did have its positives, like a much-needed overhaul of the iPad interface, an actual (if limited) file manager, and an enhanced Control Center. However, it mostly became notorious for its many, many problems. For starters, it ended 32-bit support, killing off a ton of apps. The UI design of the default apps also became sloppier and generally worse. It was massively glitchy and prone to becoming unresponsive (the weirdest, if not most critical, bug being that the Calculator app would give wrong answers thanks to an animation glitch), and worst of all, it absolutely tanked performance, rendering some older devices almost unusable. It also didn't help that the "Batterygate" controversy came about around this time, when it was discovered that Apple deliberately throttled the performance of devices with heavily degraded batteries in order to prevent sudden shutdowns. While this was a smart, reasonable trade-off, it was poorly communicated to end users, and together with the lackluster performance of iOS 11 in general it led to a firestorm of controversy, numerous lawsuits, and (unfair, but understandable) accusations of planned obsolescence. Ultimately, iOS 12 introduced little in the way of actual new features and focused on simply fixing iOS 11's problems; it succeeded quite handily, and Apple wisely made sure that any device that could run iOS 11 could also update to iOS 12.
  • Microsoft:
    • The common joke about Microsoft's Windows operating systems is that they go in a cycle between a Dork Age and a quality product. Windows 95 was successful for the innovations it brought, but also extremely buggy to the point where it was the butt of many jokes in the '90s, while Windows 98 corrected the technical flaws and provided an all-around quality product. Windows Millennium Edition (or ME) was such a notorious Porting Disaster of features from the NT-based Windows 2000 that it killed off the Windows line derived from MS-DOS, with many users choosing to either stick with 98 or, if they had to upgrade, going with Windows NT (their business OS) instead. Windows XP, derived from NT-based 2000, was a return to form and arguably the most successful operating system in history; released in 2001, it didn't drop its title as the OS with the greatest market share until 2011 when it ceded it to Windows 7, and it was still supported with regular updates until 2014 (and it even received a critical security update five years after support ended). The reason for this is Windows 7's predecessor, Windows Vista, a buggy mess that quickly became an Old Shame for Microsoft, with Windows 7 generally seen as the 98 to Vista's 95 in terms of correcting its problems. Windows 8, released in 2012, didn't suffer from the bugs that plagued ME and Vista, but its "Metro" user interface implemented on Start Screen and Setting Screen, while only affecting few user interface leaving the desktop unscathed, was built around touchscreens, and it had such scathing reception from users of conventional desktops and laptops, again leaving people (especially business/office users, historically the core of Microsoft's base) unwilling to upgrade from XP and 7. 
Amidst this fiasco, Microsoft went to work on Windows 10, skipping over 9 entirely in hopes of distancing themselves from 8's poor reception. It combined 8's Metro interface with a Start menu and Settings app far friendlier to desktop users, to the rejoicing of Windows users...
    • That is, until the free upgrade promotion (from July 2015 to July 2016) abused Windows Update to the point that people who didn't want Windows 10 ended up getting it anyway, conflicts and all, unless they had set their updates to manual...
    • As for Windows 10 itself: out of the box, it gives users no way to defer or pick updates (it downloads and installs every available update as soon as it's released, even though Windows Update had a bad track record of conflicts and system instability going back to the 7 and 8 days), although workarounds do exist; it adopted mobile-app-style content delivery (meaning suggestions and ads for Windows Store apps, and a flood of freemium software); it allegedly collects user data; and support for the more personalizable Windows 7 and 8 has been increasingly deprecated, to the point that CPUs launched from 2016 onwards only support Windows 10.
    • The Internet Explorer browser, from version 6 to roughly version 9. IE6 was the browser with the biggest market share in history for years, mostly because it was the standard for many business users who had software developed that worked great with it but couldn't easily be ported to something newer. As a result, people kept using IE6 despite it being outdated and insecure. The compatibility requirements left versions 7 and 8 far behind Chrome and Firefox. IE9 was a return to form, but the bad reputation Internet Explorer earned with 6 lingered. Microsoft would eventually phase out Internet Explorer themselves, replacing it with Edge in Windows 10.
    • For Windows 10, Microsoft decided to make the bulk of their QA team redundant and outsource real-world bug testing to an Insider Program - in other words, volunteers. This was a horrible mistake, as volunteers are not professional QA testers and are less likely to be as diligent or have the same priorities as a paid professional, and even if they do catch a serious bug, there's a chance the reports will be buried by less vital suggestions, causing Microsoft to miss them. In October 2018, Insiders' attempts to warn Microsoft that an update was deleting user files failed to be seen by Microsoft themselves before it snapped non-Insiders' files out of existence. That was only the beginning of a series of faulty updates partially chronicled on Idiot Programming, but in summary, Windows 10 updates have developed a habit of creating three bugs for every bug they fix, ranging from the annoying to the bizarre to the catastrophic. A not-uncommon sentiment is that Microsoft needs to get their shit together before the damage to their reputation becomes irreversible, assuming it is still reversible.
  • Intel have had at least two:
    • Early on in the new millennium, the otherwise top-of-the-game Intel fell behind an increasingly competitive AMD with the Netburst-based Pentium 4. Promising to eventually break the 10GHz barrier, it instead ran inefficiently and incredibly hot. On top of that, their attempt to produce a 64-bit successor to the ageing x86 architecture resulted in the Itanium line, which proved a massive failure and forced them to adopt AMD's competing AMD64 architecture. The company was finally out of the woods mid-decade with the release of the Core 2, a more modernized take on the P6 architecture, along with a steady yearly update schedule allowing for step-by-step refinements that saw them easily outpace AMD for well over a decade.
    • However, Intel fell back into a Dork Age in the mid-late 2010s, as they've been stuck on the same basic architecture (Skylake) since 2015, with their first attempt at producing a revised version (Cannonlake) being such a disaster that they were forced to essentially abandon it, before having slightly more luck with Skylake's planned successor (Icelake), but only being able to release low-power mobile versions due to manufacturing problems. AMD, meanwhile, got their act back together and released the new "Zen" microarchitecture, which restored them to competing with Intel's high-end processors, before following it up two years later with the Zen 2, which utterly destroyed Intel's entire desktop and workstation/server line-up in much the same way that the Athlon 64 had humiliated the Pentium 4. It took until 2021 for Intel to produce something that wasn't just a Skylake variant with more cores... and when their new architecture, Rocket Lake, was actually released, it turned out to perform worse than Skylake in many apps and games. Ultimately, this Dork Age cost Intel their partnership with Apple because it caused thermal and performance problems that exacerbated the MacBook line's own Dork Age.
  • AMD's own Dork Age is widely regarded to have run from 2011 to 2016. The five years prior weren't exactly a golden age for the company, and saw the disastrous launch of the original Phenom (which was competitive with Intel's Core 2 line, but suffered from low clock speeds and the infamous TLB glitch), but also weren't a complete disaster. AMD's FX line of CPUs, unfortunately, was one: its supposedly revolutionary "module" concept didn't work out well in practice, and the rest of the chip was sunk by, ironically, the very same issues that had dogged the Pentium 4 a decade earlier.
  • NVIDIA fell into this with the GeForce FX GPU, which used a substandard implementation of DirectX that allowed then competitor ATi (now AMD) to wipe the floor with its GPU. NVIDIA came back the next generation and stumbled a few more times — most infamously with the late and very power-hungry GeForce GTX 400 series — but not nearly as badly as this.
  • ATI themselves had a Dork Age from about 2006-2009, as their Radeon 2000 series proved late and underwhelming, leaving them playing catch-up for a couple of years. They finally managed to get ahead of NVIDIA with the Radeon 5000 series, but it proved too little too late for the ATI brand, which AMD killed in late 2010.
  • The BlackBerry line of smartphones had one around the beginning of The New '10s, due to its obsolescence in the face of the iPhone and Android-based phones, and slowness in developing new models. The latest models based on BlackBerry 10 have been well-received, and the company has returned to profitability, but it's still a far cry from its previous stature in the smartphone industry it pioneered.

  • From 2014 to 2018, Select Fashion was seen as little more than a poor imitation of brands like New Look and Marks & Spencer, and its increased focus on sports tops, crop tops, and shorts/athleisure wear got a mixed reception. The brand was already in trouble due to the political situation in the United Kingdom, and the Dork Age was compounded by a controversial website that people complained was riddled with bugs and had a user interface slower than its rivals'. Adding to the Dork Age was some people considering the store a Genre Throwback to 1990s-style clothing, or a British Captain Ersatz of Hot Topic (fashionable-style clothing). Also, some saw it as having a Periphery Demographic for womenswear: despite its supposedly younger target demographic, older women were buying the sports bras and crop tops more than the 20-somethings and millennials the brand was aimed at.
  • Abercrombie & Fitch endured two of these.
    • First, in the '70s and '80s, financial problems forced what had once been a top-shelf outdoor outfitter to cut costs any way it could, cutting back on both their expensive loss leaders and their more moderately-priced items. This culminated in its bankruptcy in 1976, and the closing of its flagship store on New York's Madison Avenue the following year. It survived The '80s mainly as a mail-order catalog owned by Oshman's Sporting Goods, with only three physical stores in Beverly Hills, Dallas, and New York. The Limited bought out A&F in 1988 and pulled it out of the Dork Age, as new CEO Mike Jeffries reinvented it as a preppy, sexy lifestyle fashion retailer aimed at teenagers and young adults. Under Jeffries' leadership, A&F became an iconic symbol of youth fashion in the '90s and '00s.
    • That said, The New '10s were not kind to A&F. The Great Recession turned one of the brand's main selling points, its high price and exclusive image, into a liability virtually overnight, as the displays of wealth that A&F came to be synonymous with were now seen as elitist, the domain of Jerk Jocks and Alpha Bitches. Long-simmering controversies surrounding the hiring and treatment of the stores' clerks (in 2006, Jeffries stated in an interview that they only hired "good-looking people" to work in their stores because their target demographic was The Beautiful Elite) only furthered the image of a brand that was stuck in the past, as did its highly sexualized marketing, which came to be seen as tacky in an age of greater awareness of sexual harassment and assault. All this allowed low-cost "fast fashion" brands to eat A&F alive in the early-mid 2010s, with drastically shrinking revenues and many stores closed. In 2017, Fran Horowitz, fresh off of turning around A&F's sister brand Hollister, was promoted to run the company's mothership, and she proceeded to turn it around by both toning down the overt sexuality and focusing on quality products, successfully banking on fast fashion's "you get what you pay for" reputation to drive sales from people looking for longer-lasting clothes.

    Food & Drink 
  • In 1985, Coca-Cola decided to change its secret formula that had been the same for the better part of a century. Ironically, the "New Coke", as the media dubbed it, tasted more like Coke's chief rival, Pepsi (part of the whole point, actually). Die-hard Coca-Cola drinkers said "They Changed It, Now It Sucks!" and Pepsi drinkers kept on drinking Pepsi. This new formula actually made Pepsi the number-one selling soft drink for a while, partly because most of its advertising during the period was "Hate the New Coke? Drink Pepsi!" Pepsi actually saw the New Coke blunder as such a major win, they gave all their employees a day off in celebration. The original formula returned to the market 80 days after New Coke's debut; the original formula was branded "Coca-Cola Classic" while the new one was branded as simply "Coke." The rest of the decade found Coca-Cola shilling (New) Coke with a Younger and Hipper advertising campaign starring '80s phenom Max Headroom, but with very little impact. New Coke was eventually rebranded (quietly) as "Coke II" but faded to its death in the late 1990s and finally perished in the early 2000s. This debacle became a running joke for years. Even in Futurama, the "Slurm" episode poked fun at it, and Stranger Things made it the butt of a joke in its third season.note 
    • Dave Barry lampooned this in one of his books with a "test your business IQ" question that went something like "You are the world's largest manufacturer of soft drinks. You are using a tested and proven formula that has remained the same for nearly a century. Your product's name is virtually synonymous with 'soft drink' in many areas. You should:" Of the choices, one of them was "Immediately change your formula." (Another, aimed at a more or less contemporaneous Pepsi PR disaster, was something like "Set a celebrity on fire.")
    • This one was such a debacle that there actually exists a conspiracy theory claiming that New Coke was a Springtime for Hitler moment, that Coca-Cola deliberately changed the formula so that they could create enough outrage that people would demand the original Coke back, leading to a long-lasting boost in sales once they quietly shelved New Coke and brought back the original article. A slightly different theory claims that New Coke was done in order to cover up the switch from cane sugar to high-fructose corn syrup in the original Coke, by distracting people with a radically altered formula (in truth, this shift from sugar to HFCS occurred five years before the New Coke debacle). Both of these theories assume that there's no possible way that such a major corporation could shoot its business model in the foot so badly... right? To quote Donald Keough, then the company's president and chief operating officer:
    "Some critics will say Coca-Cola made a marketing mistake. Some cynics will say that we planned the whole thing. The truth is we are not that dumb, and we are not that smart."
  • Jack in the Box (the restaurant) had one between 1980 and 1994. Read more about it on Wikipedia. In short, what happened was originally Jack in the Box had a typical West Coast hamburger stand feel to it: you talked into the clown's mouth to order, and advertising featured an early version of Jack as well as several other characters. But in 1980, the chain ran a series of commercials where Jack was destroyed. New marketing was toward the "affluent yuppies". The menu expanded at an alarming rate of two new items a year. They even tried to rename the restaurant to "Monterey Jack's". Around this point, the chain also withdrew from several markets east of the Mississippi, including New York state, Chicago, and Detroit. A massive E. coli outbreak at several Jack in the Box locations on the West Coast (which made 623 people ill and killed four) also did damage to the chain's reputation — but at the same time, it led to Jack in the Box completely overhauling their food safety procedures, and many other fast-food chains soon followed their lead to ensure that an outbreak that large would never happen again. Jack in the Box has also begun expanding again, filling in markets such as Cincinnati and Indianapolis.
  • Hardee's went through a similar dip in The '80s and The '90s. The chain, already taxed by buying out other chains (most prominently Burger Chef and Sandy's), attempted to cut costs by using frozen instead of charbroiled meat patties. A 1990 buyout/conversion of Roy Rogers restaurants (based in the Northeast, where the Hardee's name was totally unfamiliar) was met with such backlash that most of them were quickly reverted. Issues with quality control and constant menu changes brought the chain to its nadir in 1997, when tons of locations were closed (most of the franchises in Detroit were sold to Wendy's or Canadian chain Tim Hortons, giving the latter its second successful American market), and the remainder was sold to California-based Carl's Jr. For the next six years, Carl's Jr. struggled in attempts to merge the two chains by keeping Hardee's still-successful breakfast menu and Carl's Jr.'s lunch/dinner menu and logo. The change was rough at first, resulting in a schizophrenic mess of stores, with some stores still having the pre-1997 menu and orange-and-brown logo well into the 2000s. But one last Retool of the menu to focus on "Thickburgers" seemed to finally turn things around and re-establish the chain with a more "upscale" image than McDonald's, Burger King, or Wendy's. As of The New '10s, Hardee's/Carl's Jr. has once again been in expansion mode, gradually filling in markets that had been abandoned in the '90s or earlier, such as Chicago.
    • Speaking of Roy Rogers, Hardee's singlehandedly caused a deep Dork Age for that chain thanks to the aforementioned buyout/conversion plan. However, some history is in order. Roy's was founded in 1968 by Marriott (yes, the hotel chain). They had been in the restaurant business for years prior to hotel ownership; by the late '60s, their chain of "Jr. Hot Shoppes" (fast-food versions of their original "Hot Shoppes" eateries) wasn't doing so well, and Marriott wondered what to do with them. Meanwhile, Marriott had recently acquired the franchise rights to a place called RoBee's, which sold roast beef, but they couldn't use that name outside of Indiana. Marriott board member Bob Wian (the namesake of Bob's Big Boy) came up with an ingenious idea: merge the two chains into one that would offer hamburgers, roast beef, and fried chicken, plus a "Fixin's Bar" where customers could garnish their meals with dressings, condiments, and toppings as they pleased. He then reached out to famous Western movie star Roy Rogers to put his name on the chain and endorse its products. The concept was a wild success, and by the late 1970s, Roy's had expanded from its Washington, DC-area base to all over the Northeast and Mid-Atlantic. They were further boosted when they acquired rival burger/chicken chain Gino's in 1982. The chain looked to be set for the future, until Marriott decided to get out of the food business in 1989. Hardee's bought Roy's and immediately set out to use the chain for Northeast expansion: many company-owned Roy Rogers locations were converted to Hardee's, and with the exception of the fried chicken (which was offered at Hardee's, even locations that had never been Roy's, through the late '90s), Roy's familiar items were eliminated. Loyal Roy's customers revolted, and by 1994, Hardee's was selling off locations left and right, often to McDonald's, Wendy's, and Boston Market. 
By 1997, only a few franchised stores in the DC/Baltimore area, plus some stragglers in highway rest stops, were left of the once-mighty Roy Rogers. In 2002, Peter and Jim Plamondon, twin sons of one of the original Roy's executives, bought the rights to the chain and began an effort to keep the name and menu from falling by the wayside. They've built new locations and expanded franchising, and slowly but surely the chain is inching its way back, notably in New Jersey, where four new locations have opened in the past few years and the return of Roy's is always greeted with long lines on opening day. But the chain still has a long road ahead of it.
  • Wendy's went through a similar plunge in The '80s, due mainly to poor upkeep of its stores that created cleanliness issues, as well as a failed attempt to adopt a breakfast menu (unlike McDonald's or Burger King, Wendy's never fully got on board with breakfast, and it was only available at a handful of locations, mostly in 24-hour truck stops, before quietly being dropped in 2014). However, it was not a long-lived or detrimental decline like Hardee's suffered — by The '90s, the chain recovered from its eighties slump, thanks to storewide renovations and a highly popular series of ads featuring founder Dave Thomas. By the mid-'90s, Wendy's was considered the best in quality and service among the "big three" burger chains, and, despite closing most of its international locations later in the decade, it has been a solid #3 ever since, becoming popular through a snarky social media presence. The chain even relaunched its breakfast menu nationwide in 2020.
  • A&W Restaurants went through this in The '70s and The '80s. This was mainly due to their drive-in restaurants aging and becoming less feasible as McDonald's took over the fast-food world — except in warmer climates, drive-ins often had to close in the winter, as very few had indoor seating, while the increased presence of drive-through at McDonald's and its ilk made drive-ins seem dated. The chain went on a huge closing spree and franchise freeze, slimming the numbers down greatly; they also sold all of their Canadian locations to Unilever (to this day, Canadian A&W has no connection with its southern counterpart). A subsidiary was spun off to sell A&W root beer in grocery stores. New sit-down locations with drive-throughs were piloted, and A&W began to push into shopping malls as well (due not only to a buyout of Carousel Snack Bars but also due to the company being owned by shopping mall developer A. Alfred Taubman at the time). The chain ended up in the hands of Yum!, the owners of KFC, Taco Bell, and Pizza Hut (and also Long John Silver's at the time), who aggressively expanded the brand with "2 in 1" stores combined with KFC or Long John Silver's. Although A&W and Long John Silver's have since spun back off from Yum!, the co-branding largely remains, and A&W has been fairly secure ever since.
  • Tropicana orange juice went through a bizarre and brief Dork Age when they hired the Arnell Group to redesign their packaging. The new design was so ugly that it actually caused a 20% drop in sales. Thankfully, it was reverted after just a few months.
  • English cuisine went through its Dork Age in the 20th century. Until the Industrial Revolution, English cooking was well-regarded throughout Europe, but industrialization and the two world wars brought a drastic decline in quality and reputation as mass-produced food became the norm. The Dork Age has been receding since the '70s, but the reputation still persists.
  • American cuisine, meanwhile, took a nose-dive in quality during the early Cold War, becoming a giant buffet of artificial chemical garbage loaded with dangerous amounts of sugar and fat with trace amounts of real nutrients. Food preservation technologies developed during World War II, combined with the perceived need to stockpile food heavy with preservatives for Cold War-era fallout shelters, and general public ignorance about the potential health risks of chemical additives spawned a wave of food production emphasizing price and speed over quality. The rise of fast food and the introduction of the microwave only made things worse; this was when America first developed its Fast-Food Nation reputation, which it, like Britain, still has trouble shaking off. Unsurprisingly, heart attack and cancer rates in America skyrocketed during The '50s and The '60s thanks in no small part to the garbage people were putting in their bodies. This nearly destroyed drip coffee's reputation and spurred the organic and slow food movements as an explicit rejection of the trend. Entire websites like Lileks Gallery of Regrettable Food and The Mid-Century Menu show some of the awful recipes to come out of this era.
  • French cuisine during The '90s and the Turn of the Millennium came to be seen as a bland and formulaic institution, trading entirely on tradition and reputation while spurning innovations coming out of countries like the US, the UK, Italy, Spain, and Japan, often on grounds of Creator Provincialism and national pride. The increasingly difficult economics of running a top-flight restaurant in the face of France's Obstructive Bureaucrats were also leading many restaurateurs to cut corners; a 2010 Canal Plus documentary revealed that up to 70% of French restaurants were relying on pre-packaged meals, ingredients, and sauces. A successful push that same year to get UNESCO to designate French cuisine as part of the world's cultural patrimony was also widely criticized as merely entrenching its ossification, turning it into a museum piece and making it even less relevant to the French people. The New '10s saw a new breed of chefs, influenced by cuisine from the US and elsewhere in Europe, bring those countries' culinary revolutions to France, setting off a wave of creativity and experimentation and putting France back on the map.
  • The Snapple brand of tea and juice drinks fell into one in 1994 after its founders sold the company to Quaker Oats, who mismanaged it into the ground. They cut back the low-volume niche flavors in order to save money and fired their spokesperson "Wendy the Snapple Lady" due to her being seen as too "Long Island" for a national advertising campaignnote , in the process killing much of the indie upstart spirit that the company was known for, while also pulling its advertising from The Howard Stern Show while he was on vacation due to his penchant for controversy. Stern, who had been a fan of Snapple before then (leading to their partnership in the first place), was not amused one bit and started calling the drink "Crapple" on his show. While Quaker Oats sold Snapple off to Triarc Companies in 1997 and the new owners managed to stop the bleeding, the damage was done, and Snapple's market share never fully recovered; as of 2014, it was less than half of what it had been twenty years prior. Whereas once Snapple had national reach, nowadays its popularity is largely concentrated on the East and West Coasts (particularly New York City) and in places with large numbers of people originally from those areas (such as Florida). Today, Harvard Business School uses Quaker Oats' ownership of Snapple as a case study in how not to run a brand.

  • In the early 1990s, Las Vegas was facing stiff competition from not only Atlantic City drawing away gamblers on the East Coast (at its height, AC had over twice as many tourists as Vegas), but the looming threat of Native American casinos, legalized in 1988note , drawing away gamblers from Middle America as well. As a survival mechanism, Las Vegas began its now-infamous attempt to expand its appeal to tourists by rebranding the city as a destination for family vacations. Every Strip hotel built over 1990-93 had at least one theme park-esque attraction and theme, with the new MGM Grand boasting a full-blown theme park.

    This backfired badly. Adult tourists who preferred to gamble and party without dodging kids were upset, hotel-casino staffs trained to operate adult-oriented resorts couldn't handle the unique needs of families, cases of parents rushing off to the gaming tables and leaving their kids to fend for themselves made the news — one abandoned child ended up kidnapped and murdered — and the MGM Grand's theme park turned out to be a bomb that closed in 2002. This age ended with the opening of the Bellagio in 1998, which was explicitly geared towards a very classy and very adult clientele with its fine art gallery, conservatory, resident Cirque du Soleil show, and high-stakes poker tables. While the hotels that opened to serve families are still around, and Vegas still markets itself as being about more than just gambling, said hotels have been progressively de-themed and the city's entertainment mix now mostly excludes families.
    • This was referenced and summed up pretty well at the end of the film Casino.
    Today it looks like Disneyland. And while the kids play cardboard pirates, Mommy and Daddy drop the house payments and Junior's college money on the poker slots.
    "The town of Vegas has got a different face / Because it's a family place / With lots to do. / While in The '50s a man could mingle with scores / Of all the seediest whores, / Well now his children can too!"
    • The film Vegas Vacation in 1997 parodied the family era of Las Vegas, with Clark Griswold taking his wife and kids on a trip to Vegas with his bonus check. Without parental supervision, Rusty gets a fake ID and becomes a high roller, while Audrey gets a job as a stripper thanks to the influence of Cousin Eddie's trashy daughter Vicki.
    • The Simpsons poked fun at this in the 1999 episode "Viva Ned Flanders", which begins with the Monty Burns Casino being demolished so it can be replaced with a "casino-themed family hotel." Later in the episode, there's a brief cameo of Raoul Duke complaining that there were "too many kids" in Vegas.
  • Paul Pressler’s run as president of Disneyland from 1996 to 2000 is a textbook example of someone excelling in one field but completely failing in another. After a very successful stint as the head of The Disney Store (which itself underwent a Dork Age after he left, thanks in part to focusing on films like The Hunchback of Notre Dame and Hercules that aren't exactly merchandising-friendly), Pressler was promoted to the top position at Disneyland, which at the time was undergoing a radical change. The park was slowly losing new customers, and an attempt to add a second Disney park to the area had failed miserably. He was charged with saving money and enticing new people to the park – which he did by seriously cutting attraction maintenance and operating hours, and by homogenizing merchandise within the park down to only a few major items like T-shirts and plushies, basically turning the park into a glorified Disney Store. He then turned to saving even more money by completely shutting down smaller low-capacity attractions like the Motor Boat Cruise, and by helping “trim” the budget of the massive redo of the Tomorrowland area.

    For his “success” at Disneyland he was promoted to the head of the entire theme parks division in 2000, where he oversaw development of the long-awaited second theme park in the Disneyland area, Disney’s California Adventure. The park opened to great fanfare in 2001… and very quickly became a spectacular flop. The park itself was accused of being cheap and uninteresting, with more of an emphasis on shops and dining over shows and attractions. After two years of trying and failing to fix California Adventure, Pressler resigned in 2003 to become the head of The Gap, and thankfully taking Disneyland’s Dork Age with him.
  • The German Baltic Sea Coast had one of these, particularly in the former East, through no fault of its own. During the 19th century, resorts on the Baltic coast were among the most prized and widely sought-after places for the well-to-do to spend a Kur (a uniquely German type of spa stay that supposedly rejuvenates and improves health); many grand old hotels date to that era and counted crowned heads among their guests. The introduction of the railroad made more and more spas accessible to the growing middle class, and soon tourism vastly overtook fishery in economic importance. Then two world wars hit, and with them came German partition. While the resorts had always bounced back before, some were now so close to the border that they could not continue to operate, while others lost almost all of their former cross-border guests. Still, those in the GDR enjoyed a steady stream of tourists of all classes, aided by the fact that travel to the West was out of the question and even holidays in "socialist brother countries" like Hungary or Poland were a bureaucratic hassle for GDR citizens, to say nothing of the long trips there in relatively slow trains or the Trabbi. After 1990 the border opened, and suddenly GDR citizens had either moved West, become unemployed, or set out to explore all those countries beyond the Iron Curtain that had always been out of their reach. Baltic Sea towns suffered, and even some in the West had problems with (ironically) Westerners abandoning them for resorts in the East. At the same time, the dockworks in Rostock (one of the other major employers in the region) collapsed, and the GDR fishery, which had been unsustainable to begin with, lost its reason for existence. For a time it seemed that, between the economic problems and the rampant emigration of every young person who could leave, the region had entered a deadly tailspin; while some of these problems still persist, tourism has bounced back. This is in part because (former) GDR citizens did, in the end, return to the beaches of their youth; in part because some Westerners discovered a love for East German beaches; and in part because the G8 summit in Heiligendamm (the one with the quirky picture of world leaders in a ''Strandkorb'') put the region onto the international tourism map for the first time in over half a century.
  • Both Universal Studios resorts went through this in the Turn of the Millennium.
    • Universal Orlando enthusiasts deem the period during and after the 1999 resort redesign as this. During this period, many of Universal Studios Florida's classic attractions, such as Ghostbusters Spooktacular, Kongfrontation, Alfred Hitchcock: The Art of Making Movies, Stage 54, The Funtastic World of Hanna-Barbera, The Wild Wild West Stunt Show, and Back to the Future: The Ride were closed in an attempt to "modernize" the theme park in order to keep it competitive with Disney. The enthusiasts' reactions to most of the replacements for these rides ranged from extremely polarizing in the case of Jimmy Neutron's Nicktoon Blast to heavy backlash in the case of Shrek 4-D. The only rides that didn't have any backlash against them were Men in Black: Alien Attack, Revenge of the Mummy, and The Simpsons Ride, and even the latter two had something of a Broken Base due to them having replaced Kongfrontation and Back to the Future: The Ride, respectively. It also didn't help that during this time, Universal scrapped plans for building a second resort just a few miles away for economic reasons, with nearly disastrous consequences.
    • Universal Studios Hollywood had it even worse than Orlando. Most of the park's attractions from the late '80s all the way up to the early 2010s were replaced left and right just to keep the park competitive with Disneyland, and even then, most people visited the park just to ride the iconic Studio Tour.note  The most negatively received of the newer attractions at the time was Spider-Man Rocks!, a live stage show that replaced Beetlejuice's Graveyard Revue. It was frequently mocked by almost everyone who watched it, and it made Spider-Man: Turn Off the Dark look like a Tony Award-winning musical by Bob Fosse. It ended up being the only Marvel-related attraction Universal Studios Hollywood ever had; it closed in 2004 to be replaced by Fear Factor Live (which didn't fare much better reputation-wise), and Marvel regained the California theme park rights to its characters in 2008, rights that fell into Disney's hands the following year when Disney bought Marvel. Only the opening of Hollywood's Wizarding World of Harry Potter and Springfield areas, along with Super Silly Fun Land, signaled that the age was starting to die down.
    • As of now, some enthusiasts have come to see Universal Orlando as having fallen into another Dork Age. In particular, they cite a perceived over-reliance on motion simulator rides at the expense of more traditional rides and stage shows, especially now that, with both the original Studios park and Islands of Adventure mostly filled in, Universal had to close old attractions to make way for these new rides. While replacing Twister...Ride it Out with Race Through New York Starring Jimmy Fallon wasn't too bad (Twister was a polarizing attraction to begin with, so replacing it with an equally polarizing one didn't change much), replacing the well-received Disaster! (one of the few remaining attractions to date back to the park's 1990 opening, albeit in the form of Earthquake: The Big One) with Fast & Furious: Supercharged, which met a scathing reception even from casual park guests, certainly was. Volcano Bay, Universal Orlando's third theme park (and first water park), also received very mixed reviews on opening day in 2017 for being an Obvious Beta, with complaints about the rides breaking down frequently, the wireless TapuTapu wristbands used for the virtual lines being glitchy, and said virtual lines being several hours long, such that Universal was forced to cap ticket sales in order to prevent overcrowding (leading to guests being turned away from the gate as early as noon). It's possible this age came to an end with the opening of the universally acclaimed Hagrid's Magical Creatures Motorbike Adventure and the announcement of the long-awaited and highly promising Epic Universe park at the Orlando resort.
    • For Halloween Horror Nights Orlando, meanwhile, the years from 2012 through 2014 are often cited due to the increased focus on licensed properties (especially The Walking Dead) and lack of "icon" figures in favor of "having the event represent itself" (which usually translated to a heavy emphasis on The Walking Dead in the marketing). This period reached its nadir in 2013 when every scarezone was themed around The Walking Dead. The Dork Age ended with the 25th anniversary event in 2015, featuring the return of many popular characters from past events, a more generous ratio of licensed to original content, and The Walking Dead being kept to one house.

  • Math education in the U.S. (and, to a lesser extent, Europe and Japan) went through a Dork Age in the 1960s with the "New Math" format. It involved teaching students advanced topics like boolean algebra, bases other than 10, abstract algebra, and set theory from an early age instead of emphasizing memorization, word problems, and, well, actual numbers. This is the equivalent of a Japanese teacher jumping straight into kanji before teaching their students kana or words as basic as "watashi". The idea was to create a generation of engineers, scientists, and mathematical thinkers capable of competing with the USSR, but the results were disastrous because the brain geniuses who thought of this forgot that you still have to teach the basics, especially to children. Most children were unable to grasp the concepts because they hadn't learned basic arithmetic like the multiplication table firstnote . Most teachers didn't fully understand what it was they were teaching, and parents were unable to help their children because they weren't taught that waynote . The subjects taught weren't even that useful for their intended engineering/physical science purpose. And an entire generation thought of math as even less useful and relevant than children usually do. Today, New Math is remembered as an utter catastrophe of misguided education reform Gone Horribly Wrong. note 
    • Similarly, Common Core education (math in particular) from The New '10s earned a poor reputation amongst educators, parents, and politicians. Originally created in 2009 under the Obama administration in response to the nation's poor educational performance compared to other developed countries, Common Core saw its reputation grow more and more strained as the decade rolled on. In fact, despite its intent, Common Core actually ended up causing K-12 students to perform even worse than they had before.
  • In the UK, a quiet and largely overlooked Dork Age is taking place, thanks to secondary schools (the equivalent of junior high and high school, for our US readers) railroading students into university and refusing to accept that some people are better suited to apprenticeships or manual work, all to chase higher league-table scores. As a result, A-Level exam boards and universities have been steadily inflating grades; at the same time, this is producing a crisis in which hundreds of thousands of new students receive high grades for things they're either not suited for or didn't actually do very well at in the first place.
  • Rail travel underwent a serious Dork Age in most of the West between (roughly) the 1950s and the advent of High Speed Rail. The decline was precipitated by the rise of private automobile ownership and air travel, leading to the abandonment of many lines as well as the bankruptcy of several private railroads. Some railroads tried their best to counteract the trend, but this was not always successful, and some of the attempts to update the design were about as successful as New Coke. However, with the rise in gas prices as well as newer, faster services such as the Shinkansen (Japan, 1960s), the TGV (France, 1980s), or Deutsche Bahn's ICE (1990s), rail travel recovered and even managed to put a dent in air travel numbers along short routes. Even the much-mocked Amtrak of the United States, formed in the 1970s to keep the private railroads from collapsing under the weight of money-losing passenger services, has increased ridership by over 50% since 2000 and carries more passengers along the Acela corridor (Boston-New York-Washington) than all airlines combined. Unfortunately, a combination of privatization and persistent Westminster neglect of areas outside Greater London has prevented British rail travel from escaping its Dork Age, and to this day it's infamously inferior to and more expensive than mainland European rail travel.
  • The Fender Stratocaster is one of the greatest electric guitars of all time. However, that wasn't the case during the '70s. When Leo Fender sold the Fender Electric Instrument Company to CBS in 1965, their guitars saw a steady decrease in quality, hitting a low point in the '70s. The problems included the switch from a 4-bolt neck joint to a less stable 3-bolt design, an ugly and intrusive bullet truss rod, increased weight due to a change in wood (from southern swamp ash to heavier northern ash), inferior-sounding pickups, and the overall poor build quality of the guitars from this era. By the early '80s, CBS had run Fender into the ground. Thankfully, a group of employees bought the company back from CBS, undid the poorly received changes (except the 5-way pickup selector switch, the only positively received change from the era), and refocused the company on building quality guitars. They were able to get the company back on track, and Fender remains one of the biggest electric guitar manufacturers in the world.
  • Gibson, like Fender, also saw a dork age in the '70s, though it's debatable if they've ever truly recovered. In 1969, Gibson's parent company, Chicago Musical Instruments, was bought out by ECL. This led to a decrease in quality in their products, most notably the Les Paul. The already weighty Les Paul weighed even more in this era when the neck material was changed to maple, and Gibson began cutting corners to reduce manufacturing costs, at the expense of quality. While Les Pauls of the 50s and 60s were made from a single mahogany slab with a maple cap, the ECL-era Les Pauls were made of two mahogany slabs sandwiched together. They were also criticized for their poorer electronics and shoddy quality control. Gibson was on the verge of bankruptcy when it was bought out by Henry E. Juszkiewicz in 1986, who was able to turn the company around.
    • Unfortunately, it didn't last. Throughout the 2000s and 2010s, Gibson's QC declined once again, and they began introducing unpopular features such as a zero fret, robot tuners, and DIP switches. All of this was compounded by the $2,500+ price tag of their instruments despite their inferior quality to competitors like Fender and PRS, and even Epiphone, Gibson's own beginner brand. Worse, Gibson made several poorly thought-out acquisitions at this time, buying up companies such as Onkyo and TEAC. Combine that with Gibson's perceived inability to market to millennial musicians (leading to the Les Paul's unwanted reputation as the guitar that non-musician suburban dads going through a mid-life crisis would purchase to hang on their wall), in contrast to Fender, whose marketing to the young and hip crowd, including women and minority groups, made its instruments synonymous with indie rock, and Gibson was reduced to a walking punchline among the guitar-playing community.
      • In 2018, they filed for Chapter 11 bankruptcy. They sold off their acquisitions, and Juszkiewicz stepped down as CEO. When they emerged from bankruptcy, they streamlined their guitar line and redesigned the Les Paul Standard (now the Les Paul Modern), doing away with all of the unpopular and much-ridiculed features while maintaining the features that players did like (such as the weight-relieved body, contoured neck heel, slimmer neck profile, and push-pull knobs). In addition, they introduced several lower-priced guitar models. However, many of their models are still $1,000+, and whether Gibson's QC has improved is still up for debate. There was also the PR nightmare that was "Play Authentic", an ad campaign that served as a thinly-veiled attack on guitar makers building Gibson clones. The campaign drew negative comparisons to, once again, Fender, who not only doesn't go after anyone building Fender clones (mostly because the only part of the guitar they hold a trademark on is the headstock) but has even licensed their brand out to guitar part manufacturers such as Warmoth. Then, with Gibson's buyout of famous guitar amp manufacturer Mesa/Boogie, people fear that Gibson may already be slipping back into their old ways.

    Pinball manufacturers 
  • Stern is widely considered to have gone through a Dork Age between 2003 and 2012. The reason is simple: during this interval, Stern was the only non-boutique manufacturer of pinball machines, so innovation and creativity mostly stagnated. (That said, Stern is also partially Misblamed for this stagnancy, as they got caught in the tail end of a patent war among several other manufacturers that left Stern with next to nothing to use.) Stern was off to a good start with well-acclaimed early hits like Lord of the Rings and The Simpsons Pinball Party, but they got complacent soon afterwards: The Problem with Licensed Games took full effect here, as Stern was able to get themes for hot properties at the time of release, like 24 or Pirates of the Caribbean, meaning their machines typically earned their price back simply because passers-by familiar with the license would drop in some coins out of curiosity. This convinced operators, who mostly do not play pinball, to just buy whatever machines Stern would release. In addition, the rules were released in such an incomplete state that some games, like Batman (Stern), were near unplayable on release. The build quality also suffered, most noticeably with Iron Man's rivets and screws shaking loose in as little as six months. This changed in 2012 when Jersey Jack Pinball emerged as a competitor to Stern, and Stern's machines took a noticeable leap in quality, both in build and in gameplay. Whereas Stern was once practically a derogatory name among pinball fans, they are now widely respected by all but the most diehard Bally-Williams fans. This leap is so dramatic that several of Stern's Dork Age machines have since been Vindicated by History, with Stern now manufacturing re-releases of machines that never sold much when first introduced.
  • After dominating the electromechanical era (from post-World War II to 1979) and scoring some hits right after that, most notably Black Hole (which charged double per game compared to surrounding releases and was still popular in every arcade), Gottlieb got caught in a Dork Age from roughly the mid-'80s onwards that they just couldn't shake off. This downturn caused the company to financially trail behind its competitors and go out of business in 1995. The reasons, however, are not entirely clear: ask 50 pinball fans and they will give you 50 different answers, but they all agree that the Gottlieb machines weren't quite as good. A likely underlying reason is that as Gottlieb's competitors created more intricate rules,note  deeper integration of their themes,note  and incorporated new technology,note  Gottlieb's design team, stuck in electromechanical ways of design, had trouble adapting, lagging behind Bally-Williams and Data East by two to four years. When Gottlieb did adapt to mode-based progression in the '90s, however, the disjointed gameplay, lopsided scoring, and clunky playfield geometrynote  earned it few fans. There are a few highlights within this era, though, such as Stargate, Rescue 911, and Cue Ball Wizard.

  • The Escapist fell into one from 2014 to 2018. Short version: the owner of the site took a stance on a heated controversy within gaming that had broader political implications, a stance that most of the site's contributors sharply disagreed with, causing many of them to either quit or receive pink slips. By late 2017, only Zero Punctuation remained at the site, largely because Yahtzee tried to keep his show as far away from the controversy as possible and by that point even he was leaning more on YouTube than The Escapist. New management in 2018 (namely, the original creators of The Escapist buying it back) rebuilt the site, hiring back many former contributors who had previously left.
  • Mention Yahoo! or Verizon to a Tumblr user at your peril. Yahoo! bought the blogging site in 2013, Verizon took it over in 2017 when they bought Yahoo!, and between them, they proceeded to run it into the ground. They both attempted to turn the site, which had its own very distinct and often countercultural community, into an advertising-friendly competitor to major social media platforms like Facebook, which backfired as corporate and media attempts to ingrain themselves in the site's culture often carried a whiff of We're Still Relevant, Dammit! and met considerable backlash from the site's users. All the while, the site gained a reputation for being an extremely toxic environment due in part to a lack of official content policy and a business model that saw a repeated failure to incentivize or financially compensate the userbase, leading to intense cyberbullying, blogs containing graphic pornographynote , and a proliferation of extreme communities ranging from white supremacists to fan clubs for mass murderers, while the protests of the site's users concerning such groups went ignored by the management. In late 2018, Verizon finally enacted an official content policy after the Apple App Store removed the site's dedicated app for policy violations, but also because a sweep by the FBI had found extensive stashes of child pornography and other disgusting material circulating on the site. While Verizon’s decision did admittedly solve the problems with extreme porn, it not only did nothing to solve any of the site's other issues but wound up causing new problems of its own due to the site's algorithms flagging innocuous images as pornographic (at one point, it wound up flagging the staff's post announcing the ban), further alienating the userbase and causing an exodus of artists to other sites ranging from Twitter to Newgrounds. In 2019, six years after Yahoo! 
paid an eye-popping $1.1 billion to acquire Tumblr, Verizon sold it to Automattic (owners of the WordPress blogging platform) for less than $3 million, a Generation Xerox degree of self-destruction that would put even Myspace to shame and showed just how far Tumblr had fallen.
  • For the various sites under the formerly-known-as Gawker Media umbrellanote , the Dork Age started in 2016 with the shutdown of the flagship site Gawker, due to a lawsuit brought by Hulk Hogan and supported by tech billionaire Peter Thiel (who had his own personal beef with the site) over its publication of Hogan's sex tape. The newly-renamed Gizmodo Media Group was soon bought out by Univision, whose mismanagement came under fire from the sites' own writers in 2018. In 2019, as Univision's own problems mounted, they sold the Gizmodo Media Group, along with The Onion and The AV Club (which they had acquired separately in 2016), to the private equity firm Great Hill Partners, who created G/O Media out of it. It was here that the problems really began. To make a long story short, new CEO Jim Spanfeller fundamentally disagreed with the Deadspin writers and editors as to the best direction for the website going forward. Spanfeller wanted Deadspin to focus purely on sports-related news and leave non-sports stories to the other websites under their banner, while the writers wanted to maintain the freedom to mix sports news with other subjects they enjoyed under previous leadership. The dispute eventually set off a Writer Revolt that caused the site's entire staff to quit, effectively putting Deadspin on hiatus and leaving a stinging effect on morale at the other G/O sites. The site was eventually relaunched with a new team in March 2020, while most of the original crew launched Defector in September 2020.

    Web Video 
  • By the admission of its own creator James Tyler, the YouTube gaming show Cleanprincegaming fell into a bad one in mid-2018. The show's rapid rise caused Tyler to speed up his schedule to keep churning out new content, most notably a vlog as a supplement to his normal video essays, which eventually led to increasingly formulaic videos, sloppy research, and accusations that he wasn't actually playing the games he was talking about so much as summarizing other people's opinions on them. Fortunately, Tyler recognized that things had gone wrong with his show, and took a month-long break before returning with a massive step up in quality, complete with a commitment to using only game footage that he had taken himself. This was enough to Win Back the Crowd and lure back many previously disillusioned fans.
  • Chris Bores's videos have always been divisive, but there are two periods even his fans weren't very fond of:
    • The first signs of trouble came in 2012-2013, when he posted uninteresting ghost hunting videos and flooded his channel with Skylanders content. Episodes of his (relatively) popular series like I Rate the 80's and his flagship The Irate Gamer became increasingly scarce and eventually stopped coming altogether. By 2015, the Dork Age was in full force: his most popular series had been replaced by more ghost hunting videos and several failed experiments, like botched attempts at Let's Plays and a poorly-received "crazy" review style that was abandoned after a couple of videos. He also made many Product Placement videos, like incompetent reviews praising questionable tech products and videos that mostly consisted of him showing off the contents of subscription boxes and declaring them cool. This Dork Age ended in 2018 when he launched Chris NEO, a reboot of the review shows his fans enjoyed, and dedicated his channel to it. It didn't last: NEO came to an end in October 2018, and his channel remained dormant until August 2019.
    • His August 2019 return marked the start of his second Dork Age. Instead of videos, he decided to make a podcast titled Geek Time TNG, which most of his fans passed up. In particular, the "The JOKER Movie SUCKS!!" episode was poorly received because he based his opinion entirely on the trailers, even though it was uploaded around the time the full movie hit theaters. This Dork Age ended with the release of a new Irate Gamer video in May 2020, which was so positively received that it actually drew in new fans.

