Audience-Alienating Era / Real Life

Back to Audience-Alienating Era.

No Recent Examples, Please! applies to this trope. Examples shouldn't be added until five years after the era begins. Please also try to avoid Complaining About Shows You Don't Like.



    Architecture 
  • One can't help but get the impression that architectural schools were infiltrated by the KGB during the Cold War, placing in vogue the Stalinist-flavored trend known as Brutalism, which produced ominous concrete blocks of pure authoritarian coldness. The future seemed bleak for decades, until Postmodernism rode in from the West, kicked out the commies, and saved the day. The legacy of this jarring midcentury trend can still be seen on public urban buildings and state university campuses. The city of Boston, unfortunately, fell victim to Brutalism when a new city hall was commissioned; the chaotic, faded mess that ensued elicits near-universal disgust from visitors and remains a testament to the failures of The Eastern Bloc and its Western sympathizers. The only American city to actually get Brutalism right was, of all places, the nation's capital, in the form of the Washington Metro. Its uniquely sculpted concrete walls and ceilings, interlocking in arches at every hallway and mezzanine, prove that Brutalism isn't always a regression of aesthetics; in fact, the WMATA underground is something the Moscow Metro might dream of looking like, had the Soviets not, in a country hardly known for "bourgeois" aesthetics, curiously built their own stations in uncharacteristically beautiful baroque style instead. One of the big downsides of concrete is that it does not age gracefully in a temperate climate, much less in one with any degree of air pollution, though this problem is of course less notable in "indoor" spaces such as the Washington Metro.
    • In the Eastern Bloc itself, this Audience-Alienating Era of architecture lasted until the fall of communism. Even after that, new aesthetic influences reached the urban landscape rather slowly, so the depressing views of endless concrete blocks and boxy, edgy monuments are there to stay. In some former communist states, the structures that did get built, decidedly ungainly and unwanted as they are (some monuments go by Fan Nicknames such as 'seven-winged five-dick' owing to their lack of aesthetics), are now left to decay thanks to intense corruption following the Hole in Flag revolutions, with no intention of improving or replacing them. This has led to a sentiment along the lines of "at least they used to build stuff back then and put some flowers around it": the architectural Audience-Alienating Era was followed by an even bigger one.
    • The worst part of Brutalism was that it caused just as many problems as it solved. Sure, it was cheap and quick to build (because it was all concrete), but it became very expensive to maintain (because it's all concrete). Combined with the above-mentioned air pollution issue (rampant through much of the 20th century until at least The '90s, and still ongoing in some countries), many of the structures became little more than crumbling, moldy tenement halls. At least it gave us a great art style for Doom levels.
  • In general, the era of the "automotive city" (which is really more a style of urban planning than of architecture), with its barren concrete plazas, urban highways on stilts, and entire neighborhoods torn down to make way for roads and parking, is now considered the worst of all architectural audience-alienating eras. Almost all other styles and epochs have their defenders and revivalists today, but this one is so thoroughly derided that it explains much of the bad rep of Brutalism (which mostly happened in the same era) described above.
  • The Nazi Party thought the simple Bauhaus style, from the architecture and design school of the same name, was this. Much like Brutalism, part of its purpose was to be a socialist attack on the bourgeoisie, a simple, money-saving alternative to ornate design; unlike Brutalism, it was good-looking and ergonomic. Of course, this was taken as a protest against the overtly ornate and luxurious classical Germanic style, and it didn't help matters that one of the school's directors was an avowed far-left communist who would expel students not committed enough to his ideals, which culminated in the school being raided and shut down by the Nazis in 1933.

    Automobiles & Motorcycles 

Asia

  • The ninth-generation (2012-2015) Honda Civic is remembered as a black spot in the history of what is otherwise one of the most celebrated compact cars in the world. Not only did the Great Recession give Honda the bright idea to position the new Civic as more of an entry-level car with cheaper materials (and just as the economy was starting to recover, at that), but the 2011 Tohoku earthquake and tsunami, which disrupted supply lines for all Japanese automakers, didn't help matters either. The result was a Civic that very much felt like a bargain-bin model, lacking the fit and finish that had once helped the Civic stand head and shoulders above competing compacts. Notably, Consumer Reports, for decades one of the Civic's biggest boosters in the automotive press, took the car off its Recommended list for the first time in 2012. Honda scrambled to fix the car, eventually giving it a full redesign for the tenth generation just four years later, and while that car too initially suffered teething issues (the debut 2017 model again failed to make CR's Recommended list), once those were ironed out it won back many previously disappointed Honda fans.
  • Hyundai:
    • The company had one in the United Kingdom and Continental Europe around the Turn of the Millennium, when it was seen as producing Boring, but Practical Korean family cars. This was a period when the Accent subcompact was seen as So Okay, It's Average, with only the plusher GSi and CDX trims being recommended and the Elantra considered the best model in the range. The midsize Sonata was seen as too Americanized and chrome-laden for most people, who went for the then-contemporary Ford Mondeo and Volkswagen Passat instead, and the Hyundai SUVs were seen as dull and too rough-and-ready for some, with the Santa Fe considered good, but not great. It took until 2010 for things to improve, although the new i30, launched in 2007, went some way toward repairing that reputation.
    • While Hyundai and its corporate sibling Kia saw their reputation in the US improve in the 2010s, one particularly embarrassing problem emerged late in the decade. While every other automaker includes engine immobilizers as standard equipment on every car they sell in the US (as does Hyundai in Canada and Europe, due to local laws), they aren't technically required, and so Hyundai decided to save money by not including them in cars made from 2011 through 2021. This meant that anyone with a screwdriver and a USB cord could easily steal a Hyundai or Kia, and once criminals figured this out and shared the information on TikTok (where a subculture called the "Kia Boys" emerged), car theft rates skyrocketed, driven entirely by thefts of Hyundais and Kias. The problem got so bad that seventeen cities sued Hyundai over it, blaming the company for a wave of car thefts, joyriding, and assorted crimes in which stolen cars were used as getaway vehicles, and it's generally acknowledged that driving a Hyundai or Kia from 2021 or earlier may as well be an invitation for somebody to steal it.
  • General opinion of Nissan is that the quality of their cars went to pot during the 21st century, with this article by Kristen Lee for Jalopnik going into more detail. Their once-acclaimed sports cars grew stagnant, with the flop of the 370Z (released in 2009 at the height of the Great Recession) being a turning point in convincing Nissan to let their sports car lineup wither on the vine, while their mass-market passenger cars and SUVs earned a reputation as more shoddily-built versions of comparable Toyotas. Many blame the partnership with Renault and the resulting new CEO Carlos Ghosn, who, ironically, was initially hailed as a corporate hero in the manner of Lee Iacocca for returning Nissan to profitability in the early 2000s. (Ford tried to recruit him to be their CEO at one point, and there was even a manga about him.) However, his later years were marked by increasingly poor reliability and build quality of Nissan's passenger cars, owing to both the cost-cutting measures he took and the use of Renault's French parts in Nissan's cars, and it would later be discovered that he was involved in financial chicanery, leading to his arrest and removal from the company.

Europe

  • In The '70s, British cars fell off even harder than their American counterparts, a decline presided over by British Leyland, which had consolidated most of the country's car brands. In theory, having most of the major British car companies under one organization was a good idea... but many of them were competing against each other in the market, and none of them knew how to work together. As a result, many cars of the time were cheap, quickly-produced Suspiciously Similar Substitutes, while new models took a long time to develop.
    • Despite its decent sales numbers, the Morris Marina is widely considered one of British Leyland's worst cars. Add to that BL's disputes with trade unions, the oil crisis, and the "Three-Day Week", and you can see how British Leyland became the poster child for Britain's '70s industrial problems; the company collapsed in 1975, taking most of the industry down with it, and had to be nationalized just to keep the lights on at the factories. Nowadays, all the major British car makers are owned by foreign groups, with Rover having closed in 2005.
  • The German sports car maker Porsche was in a bad place in the late '80s and early '90s. Sales crashed after peaking in the mid-'80s as its bloated and inefficient production process reduced output, the early-'90s recession hammered luxury car makers across the board, and its models grew extremely long in the tooth; the venerable 911, while undoubtedly a Cool Car, had been designed back in 1964 and couldn't compete with more modern sports cars, while the 968 offered little change from the 944 that had been in production since 1982. At the time, it seemed as though Porsche was coasting on its '80s yuppie cachet rather than innovation and performance. It didn't help that while Porsche was still capable of building fine cars during this time, two of the best it had a hand in, the Mercedes-Benz 500 E and the Audi RS2 Avant, were developed under contract for other automakers; while that work likely saved Porsche and gave it the funds it needed to invest in its own cars, it was Mercedes and Audi that reaped the rewards. The introduction of the entry-level Boxster in 1996 marked the beginning of a turnaround, allowing Porsche to re-establish its place as one of the world's premier sports car manufacturers.
  • The Volkswagen Golf GTI twice failed to live up to its reputation as one of the best sporty hatchbacks, disappointing its target demographic both times:
    • The Golf GTI MkIII of 1992 had a 2.0-liter/115hp 8-valve engine (2.0 8v) that did 0-60mph in 8.7 seconds, which was mediocre for a supposed hot hatch: all show and no go. Aesthetically, though, it was pleasing, with the "Longbeach" 14-inch or "Ancora" 15-inch wheels.
    • Then there was the Golf GTI Mk4 of 1998-2004, whose 2.0-liter/115bhp 4-cylinder model, launched in 1999, looked sporty but did 0-60mph in 10.2 seconds, making it the least sporty of the line and a GTI In Name Only. British customers were largely unimpressed, preferring the sportier 1.8T 20V 150 (its 1.8-liter/150bhp 20-valve 4-cylinder was slightly quicker, doing 0-60mph in 8.2 seconds) and later the 1.8T 20V 180 (whose more powerful 1.8-liter/180bhp 20v 4-cylinder was quicker still).
    • Interestingly, buyers in North America didn't mind as much, as the Volkswagen GTI (dispensing with the Golf name) was marketed there as a luxury car rather than a workaday family hatchback with performance attributes.

North America

  • Remember all those great cars Detroit came out with in The '70s? No?
    • In the 1970s, a toxic combination of stagnant innovation, poor design and quality control, Congress relaxing import quotas (allowing foreign automakers to sell more cars in the USA), new emissions and fuel economy regulations, and the oil crises of 1973-74 (prompted by the Yom Kippur War) and 1979 (prompted by the Islamic Revolution in Iran) nearly destroyed the industry. Chrysler required a government bailout to survive, American Motors collapsed altogether and saw its pieces snatched up by Renault and later Chrysler in The '80s, and Ford and GM were better off only by comparison, rapidly losing market share to Japanese and German automakers who built smaller, more efficient, and more reliable cars. It did destroy the city of Detroit itself (and much of Michigan, for that matter), and to this day, there are many Americans of a certain age who still refuse to buy domestic. Auto writer Murilee Martin coined the term "Malaise Era" for the years from 1973 to 1983, when the quality and performance of American cars seemed to be in active decline.
    • Detroit did start making some good cars again from the mid-'80s onward, with cars like the Ford Taurus, Chrysler's "K-cars" and minivans, and GM's Saturn brand showing that they still knew how to innovate, while the SUV boom became a license to print money in Detroit. However, they still put out more than the occasional stinker (see: the Chevrolet Cobalt) until the late '00s, when, amidst the global financial crisis and another oil crisis, GM and Chrysler had to be bailed out by the federal government, the latter for a second time. The terms of the bailout were strict, but they are generally credited with turning the American auto industry around in a big way, forcing it to take both quality and the emerging electric car revolution (itself led by another American company, the startup Tesla Motors) seriously. Nowadays, there are plenty of domestics every bit as good as foreign cars (and in many cases better, especially where electrics are concerned), but anyone with any sense will be very careful about most used domestics from model years prior to 2009 or so.
  • Ford has had a few Audience-Alienating Eras:
    • The Ford Mustang II, sold from 1974-78. The concept was good on paper: reverse the size creep the Mustang had been suffering from since the late '60s and go back to the basic concept of a compact coupe with a powerful engine. What consumers actually got was basically a Pinto with a fancier body, no V8 option, and enough mid-'70s chrome, vinyl, and fake wood for a much larger car. Sales of the Mustang II were actually much better than those of the late-'60s/early-'70s Mustangs, but it alienated enthusiasts. Even after it got a V8, any performance advantages that would've come from the lighter, more nimble body were negated by federally mandated emissions-control devices that sapped performance; this led to disgruntled fans calling it the "Disgustang". Meanwhile, to add insult to injury, the Mustang's rivals, the Chevrolet Camaro/Pontiac Firebird twins, underwent something of a Golden Age in The '70s. While they too felt the effects of the new standards (they were nearly killed in 1972 due to a UAW strike concerning the new regulations), their performance didn't suffer nearly as badly as the Mustang's, and their bodywork wasn't nearly as garish as that of other cars of the era. The Camaro and, to a lesser extent, the Firebird outsold the Mustang by 1977 because they were some of the only cars at the time worth getting for sports car/post-muscle car enthusiasts (made all the more apparent with the release of Smokey and the Bandit, which really boosted sales of the Camaro and the Firebird that year). To this day, the Camaro and Firebird are probably the only American performance cars not to have their legacies stained by WTH-worthy engineering or design decisions even during The '70s; the only low point in the Camaro's career was the Iron Duke-powered models of The '80s, but that was just an optional engine and did rather little to hurt the Camaro's popularity. Recently, though, some auto enthusiasts have begun to defend the Mustang II, arguing that it was an alright car that simply had the misfortune of coming between the groundbreaking first generation and the long-lived and successful third generation.
    • The third-generation (1996-99) Ford Taurus destroyed that car's reputation virtually overnight. With the rest of the automotive world having caught up to what the original Taurus had accomplished in The '80s, Ford wanted to move the car upmarket and give it a more distinctive look inspired by contemporary Jaguars (Ford owned Jaguar by this point) and Infinitis in order to stand out from the competition (chief designer Jack Telnack compared the first two generations of Taurus to "a pair of slippers"), and the result was a rounded, oval-shaped appearance that certainly did that. The moment the car debuted at the 1995 North American International Auto Show in Detroit, it was derided by journalists as the "Bubble" or "Submarine" Taurus, and while it was reliable and a pleasant car to actually drive, its blob-like styling turned many people off (especially with how it obstructed the driver's sight lines), as did its reduced trunk space, increased weight (without a horsepower boost to match), and a higher price point that came from luxuries that most people didn't really want. The high-performance SHO model was also underwhelming compared to previous versions, lacking a manual transmission option and being prone to camshaft problems on top of it. In its debut model year, it only kept its position as the best-selling car in America because of sales to rental fleets (which made up 51% of sales), and by the following year, it had fallen behind the Toyota Camry. Ford hastily redesigned it in 2000 to give it a more conservative appearance, but it was too little, too late, and the Taurus, once the innovative, cutting-edge sedan that saved Ford from bankruptcy, never regained its position. Jack Baruth, writing for Road & Track, called it "the saddest car ever made".
    • In 2007, Ford launched the Transit Tourneo minivan, based on the front-wheel-drive Transit SWB, and while on the whole it wasn't bad for a minibus derived from a van, it had a smaller color palette than the 1994-2000 version, and there was no gasoline option: a 2.0-liter petrol engine had been offered on the Tourneo from 1994 to 2000, but the newer 2.3-liter/145hp 4-cylinder was available only in the Transit SWB van it was based on, except in China, where the vehicle was sold as the Transit Minibus SWB rather than under the Tourneo name. Unlike other examples of this trope, it wasn't a true nadir, but it was seen as a small disappointment by loyal buyers.
    • The 2011 Mondeo facelift was seen by some as a quick cash-grab that ruined the previously pleasing styling, even though the car remained good to drive; twelve years on, though, people seem more accepting of it. The Graphite trim level is now seen as a collector's car due to its rarity.
  • The '80s were not kind to Cadillac. First, they introduced the V8-6-4 engine, which was meant to maximize fuel economy by activating and deactivating cylinders according to driving conditions, but proved unreliable due to the limitations of early-'80s computer technology. Then, they attempted to break into the compact luxury market with the Cimarron, which was little more than a Chevrolet Cavalier with nicer trim and Cadillac badges. The Cimarron was marketed towards the young, upwardly mobile customers who normally bought BMWs, but those buyers weren't impressed, while traditional Cadillac customers were incensed by the cheapening of the Cadillac brand. The replacement for the V8-6-4, the High Technology engine, proved to be a maintenance nightmare, leading many owners to swap it out for Oldsmobile and Chevy engines. The redesigns of the De Ville and Fleetwood (1985) and the Seville and Eldorado (1986) were supposed to give the lineup a modernized look, but were widely panned as dull and boxy, which harmed sales greatly. The final straw was the introduction of the Allante, which tried to compete with the Mercedes-Benz SL and Jaguar XJS, but failed to make a dent in either car's sales thanks to derivative styling and a bland driving experience. By the end of the decade, Cadillac had gone from dominating the luxury car market in the US to struggling to compete with both European and emerging Japanese brands. Things would begin to look up for them in The '90s, though there were still some missteps in that period, such as the Catera, which was basically an Opel Omega thinly disguised as a luxury sedan. It took the introduction of the "Art and Science" design theme in the 2000s to truly get them back in gear.
  • The Chevrolet Lumina of 2005-2006 sold in the Philippines was seen as a pretty weak car for GM during the Turn of the Millennium, being a reverse Cadillac Cimarron: a badge-engineered version of the Chinese-built Buick Regal that took a luxury car and made it more basic and duller to look at. The 2.5-liter/145hp V6 (GM's LG8 engine) was fine; the lack of luxury spec was not, and the car couldn't compete with the local Honda Accord and Toyota Camry sedans for image or reliability, temporarily killing Chevrolet's sedans in the Philippines until the Malibu replaced it in 2013. This has more to do with Values Dissonance than anything: in the US, brands like Chevrolet and Honda are supposed to sell dull but reliable entry-level cars, while in the Philippines they're viewed as luxury cars that are also affordable.
  • The storied American motorcycle manufacturer Harley-Davidson has had two of them.
    • To start with, the "Malaise Era" of American motor vehicles was hardly restricted to cars, as evidenced when American Machine & Foundry, a company known primarily for sporting goods equipment (bowling in particular; yes, it's the same AMF that owned a lot of bowling alleys and made lots of bowling balls), bought out Harley-Davidson in 1969. While their cash infusion saved the company (then close to bankruptcy) in the short term, their mismanagement ran it into the ground in the long term, as their attempts to streamline production did little more than drive build quality into the gutter and ignite labor disputes. During The '70s, Harleys were more expensive, less reliable, and performed worse than comparable import bikes, the general opinion being that the only things they had going for them were patriotic appeal and style. The term "hog" to describe Harleys, while since reclaimed by the company and Harley riders (the Harley Owners Group takes its name from it), originated during this time as a pejorative; other nicknames included "Hardly Ableson", "Hardly Driveable", and "Hogly Ferguson". The Confederate Edition series produced in 1977 is also one that the company would like to forget. The "AMF years" ended in 1981, when AMF sold Harley-Davidson to a group of investors that included the grandson of co-founder William A. Davidson. They turned the company around with a series of much better-built bikes, most notably the Softail series, while deliberately playing to nostalgia and patriotism with Retraux styling rather than trying to copy their Japanese and European competitors, in the process turning Harley-Davidson into a lifestyle brand as well as a motorcycle company.
    • Unfortunately, this lay at the root of Harley's second Audience-Alienating Era in the 2010s. Among those who feel that the company has lost its touch, the belief is that the seeds of Harley's comeback in The '80s were a Franchise Original Sin that caused it to ignore trends in motorcycle technology and engineering in favor of relying on the Nostalgia Filter of aging Baby Boomers who grew up with '60s biker culture — a strategy that produced a stereotype of Harley riders as Former Teen Rebels and Amazingly Embarrassing Parents who bought their bikes as mid-life crisis mobiles, left the bikes themselves as luxury items built on aging platforms that had little to offer for anybody interested in performance or practicality, and eventually came back to bite Harley in the rear once their target demographic started getting too old to safely ride motorcycles. The rise of the relaunched Indian Motorcycle brand in the 2010s, which built the same kind of classic-style, all-American cruisers that Harley did with more modern technology and superior performance, simply underscored the problems that Harley faced. As of 2021, Harley is making a concerted effort to end the Audience-Alienating Era, introducing the LiveWire electric motorcycle and the Pan America touring bike targeted at new customers beyond their normal demographic; time will tell if this pulls them out. Nowadays, the Softail and its associated lifestyle are often cited as a case study in how over-reliance on nostalgia as a selling point can eventually backfire and hit diminishing returns.
  • General Motors' Saturn brand went downhill in 2000, when the introduction of the L series destroyed Saturn's entire raison d'être, and it never recovered. The intention of Saturn was to create an all-new brand that was GM In Name Only, with its own corporate structure, dealer network, factories, cars, platforms, and parts separate from the rest of GM, in order to inject fresh blood into a stagnant company that was still climbing out of its '70s Audience-Alienating Era. By all accounts, in The '90s it succeeded at this, with the compact S series selling millions, building a solid cult fandom, and seemingly offering proof that Detroit could build compact cars just as good as the Japanese. The midsize L series, however, shared its platform with Opel and Vauxhall cars from GM's European division and its engine with several other GM cars, and was built at a Delaware plant alongside numerous other GM models; between that and its reliability issues, it turned Saturn into just another low-cost budget brand for GM. Sales predictably tumbled, and when GM restructured after its 2009 bankruptcy, Saturn was one of the first brands to get the axe.

Other/multiple

  • Car interiors went through this during the 2010s, mostly due to the rise of touchscreens as center console interfaces. In theory, replacing all the buttons, knobs, dials, and switches in a center console with a single touchscreen should have greatly simplified car controls, added an interface that most drivers were familiar with from their phones and tablets, and made it much easier to customize the center console and add new features like one would on a phone. In practice, however, there's a reason why there are laws prohibiting cell phone use while driving, and that reason became readily apparent when drivers had to use those screens. Without the tactile feel of physical controls, drivers were forced to take their eyes off the road for seconds at a time to adjust settings that, in the past, they could have adjusted by just moving a hand to the requisite button or knob without even looking. By the 2020s, center console touchscreens came to be recognized as annoying and even dangerous distractions and safety hazards, seen as driven less by practicality than by aesthetics, tech fetishism, and attempts to copy Tesla (which pioneered and exemplified the trend), and automakers like Mazda, Hyundai, and Volkswagen started returning to physical controls.

    Aviation 
  • In The '70s, it came out that Lockheed had spent years bribing various foreign government officials to buy its aircraft and covering up problems with the F-104. As a fighter plane, it was good; as a light bomber, not so much. The scandals almost killed the company. The commercial failure of the L-1011 TriStar didn't help matters any, either, and forced Lockheed to quit building airliners entirely.
  • Boeing fell into this in The New '10s with the Boeing 737 MAX, the latest update of the proven and popular 737 series of airliners. The MAX started off with a lot of promise, but after two fatal crashes, aviation authorities decided to ground it. As it turned out, Boeing had equipped the 737 MAX with an untested software system (MCAS) designed to compensate for the engines being mounted in a way that affected the plane's controllability, while touting as a primary selling point that pilots did not need to be retrained. Boeing failed to account for the possibility of the software, or indeed the single sensor feeding it data, malfunctioning, and so never informed their customers that the new system existed. So sure were they that it would pose no problem whatsoever, much less one with the potential to kill people, that they made an "AOA disagree" warning an optional extra. As a result, when the system malfunctioned due to inaccurate data, the pilots had no idea how to counteract it, leading to the crashes (a hypothetical code sketch of the failure mode follows these sub-entries). The costs for the company are estimated at more than $18 billion. Problems with the SLS launch vehicle and the Starliner crew capsule in development for NASA, and flaws in the 787's software, are not helping matters any, either.
    • As of 2021, the 737 MAX is being recertified across the globe (except, of course, for China, the first country to ground it), as the necessary changes have been made, and airlines are for the most part surprisingly eager to resume flying it; on June 29, 2021, United Airlines announced that they were going to purchase 200 more of them!
    • Around the Turn of the Millennium, the KC-X tanker program sent a Boeing executive to prison. The resulting KC-46 is an aircraft with lots of teething troubles: at one point the tanker/transport was banned from carrying cargo, and tools were found left inside the fuel tanks. On the civilian side, the 787 Dreamliner suffered battery issues that could have caused fatal accidents (potentially including in-flight fires, possibly the worst kind of emergency that can happen) had they not been caught.
    • As this video argues, the seeds of this trouble were planted in 1997, when Boeing merged with McDonnell Douglas. After the merger, factory closures resulted in massive layoffs and increased outsourcing, and Boeing inherited the more worrying aspects of McDonnell Douglas' corporate culture... namely, the latter's excessive focus on profits, which had previously led to that company putting a plane on the market with a design flaw that could cause the cargo door to blow open mid-flight and, in an eerie echo of the 737 MAX, making redundant safety features optional extras.
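    For the curious, here is a minimal, entirely hypothetical Python sketch of the design flaw described above. It is not Boeing's actual code, and the trigger threshold, disagreement limit, and sensor readings are invented for illustration; the point is simply that a trim law fed by a single angle-of-attack (AOA) vane will keep commanding nose-down if that vane sticks at a high value, whereas a cross-checked version can notice the fault and bow out:

        # Hypothetical sketch, not Boeing's real code. Illustrates single-sensor
        # trust versus a cross-checked design that disengages on disagreement.

        AOA_TRIGGER_DEG = 15.0  # assumed activation threshold, for illustration

        def trim_single_sensor(aoa_left_deg: float) -> str:
            # Trusts one vane: a stuck-high sensor means endless nose-down trim.
            return "TRIM NOSE DOWN" if aoa_left_deg > AOA_TRIGGER_DEG else "no action"

        def trim_cross_checked(aoa_left_deg: float, aoa_right_deg: float,
                               max_disagree_deg: float = 5.5) -> str:
            # Compares both vanes and alerts the crew instead of acting on bad data.
            if abs(aoa_left_deg - aoa_right_deg) > max_disagree_deg:
                return "DISENGAGE + 'AOA DISAGREE' ALERT"
            return "TRIM NOSE DOWN" if aoa_left_deg > AOA_TRIGGER_DEG else "no action"

        # A stuck-high left vane (70 degrees) against a sane right vane (4 degrees):
        print(trim_single_sensor(70.0))       # -> TRIM NOSE DOWN, over and over
        print(trim_cross_checked(70.0, 4.0))  # -> DISENGAGE + 'AOA DISAGREE' ALERT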
  • Going into World War II, Curtiss-Wright was one of the biggest aviation companies in the United States, with a sterling track record of making military planes. Unfortunately, while the company was the second-largest military manufacturer during World War II, behind only General Motors, as the war progressed it became clear the company had hit a major slump.
  • The entire British aviation industry went into an A.A.E. after World War II. While manifestly still capable of designing great aircraft, the Brits were plagued with problems that led to low sales and many embarrassing failures:
    • Structurally, there were too many builders and not enough customers in economically depressed post-war Britain. Firms simply couldn't survive on the small order numbers available, especially in the military sphere. This was compounded by the Americans withdrawing the military aid money that had bought many of those military jets.
    • The first generation de Havilland Comet, initially seen as a triumph, was grounded in January 1954 after one of them suffered an in-flight breakup due to metal fatigue, and then permanently withdrawn from service in April of the same year after an ill-advised return to service was followed by an identical accident. By the time safe Comets were available, the Boeing 707 was on the market and outselling everything in sight. This not only killed de Havilland stone dead, but also cost the Brits any chance at first-adopter advantages in jet-powered passenger aircraft.
    • The British government did themselves no favors. The short-sighted decision in 1957 to kill off all but a handful of military aircraft programs in favor of guided missiles proved highly damaging. The decision to merge the aerospace industry into two companies was necessary, but was done in a hamfisted manner that caused much friction within the new companies as the old management teams figured out how to work together.
    • The British exited this A.A.E. in the 1970s. While they were never able to break back into the commercial airliner business, British military jets have proven popular and effective, thanks to several collaborations with other countries to boost orders.

    Computers & Electronics 
  • Apple:
    • Their product range during the tail end of the 1980s and early 1990s had degenerated from the insane greatness of the classic Apple Macintosh to the extraordinarily bland Performa range. Although the PowerBooks sold well, and the Power Macs and Quadras got good reviews, none of the company's products were particularly exciting.
      Strapped for cash, Apple even took to licensing clones of the Mac hardware, which raised money in the short term but ate into long-term Macintosh sales. The company was in pretty bad shape before Steve Jobs came back in 1997 and the original iMac was released in 1998, and it took them a few more years after that to finally get rid of the mess that the classic Mac OS had become.
    • On the phone/tablet front, Jony Ive's signature flat, Helvetica-soaked design language (replacing a previous, less-harmonized appearance that a lot of people found excessively skeuomorphic on both platforms) has been divisive since debuting with iOS 7, while iOS 8 was not only buggy, but allegedly so bloated (per a class-action lawsuit filed at the beginning of 2015) that it didn't leave enough room for user content. The iPhone 6's larger size (4.7 and 5.5 inches for the standard and Plus models) was also contentious. While some were excited that Apple was finally making a "phablet" to compete with similarly large Android offerings, those who liked the smaller, older iPhones were dismayed by it, especially Jobs loyalists, given that Jobs had made a point of never making an iPhone with a screen larger than 3.5 inches, which he felt was the perfect size for a smartphone screen (he derisively compared larger phones to Hummers). The release of the "budget" iPhone SE in 2016, combining the power of the 6S with the form factor and four-inch screen of the 5S, is generally seen as an attempt to Win Back the Crowd on that front.
    • The announcement later that year that the iPhone 7 would be removing the headphone jack, instead using Bluetooth and Apple's proprietary Lightning port for headphones, wiped out any goodwill that had been earned and then some. Many saw it as an attempt to force users to shell out extra to replace their headphones with new ones that used the licensed Lightning connector (especially given Apple's purchase of headphone maker Beats), without thinking about those who depended on the headphone jack for other uses (car connectors, credit card readers, et cetera). While there were those who defended the decision, saying that it allowed Apple to make the phone thinner while adding more internal hardware, others argued that those two goals worked against each other, and questioned why phones needed to be so thin in the first place given how many people bought protective cases for them anyway, especially when Bendgate and the Samsung Galaxy Note 7 explosions were partly caused by the respective companies overestimating how thin a phone could practically be.
    • The MacBook lineup went through this starting around 2015. For starters, Apple changed the previously well-loved keyboard design to one relying on butterfly switches that were less comfortable to type on and incredibly fragile, and the model after that replaced the function keys on the keyboard with a gimmicky, awkward touch bar that was even less reliable than the rest of the keyboard. Thanks to Intel going through an Audience-Alienating Era of its own, performance and battery life also became increasingly wonky, with some models suffering from severe thermal issues. Apple also replaced all ports except the headphone jack with USB-C/Thunderbolt 3 ones, with the first MacBook to do this not even getting Thunderbolt's benefits because its only non-headphone port carried plain USB, forcing users to rely on clunky, awkward dongles for basic functionality. While Apple did fix or at least reduce these issues over time, the lineup only really left its Audience-Alienating Era in late 2020, when Apple switched out the Intel-based processors that powered the previous models in favor of its own custom ARM chips. While this move was initially met with skepticism and trepidation, Apple ultimately proved the skeptics wrong, and the first ARM-powered Macs, despite being entry-level models, turned out to have incredible performance and battery life while maintaining speedy, functional backwards compatibility with Intel-based programs via software emulation. Of note, a Geekbench single-core test running under emulation via Rosetta 2 on an M1 MacBook Air outperformed the same test on every Intel-based Mac, including higher-end ones, with native benchmarks being even better. The 2021 MacBook Pros were seen as a further rebuke of this era, as they not only brought the entire lineup of Apple laptops onto Apple Silicon but also reintroduced the HDMI port, function keys, SD card reader, and MagSafe charger, reducing the need for the dongles and adapters that the all-USB-C MacBooks had become infamous for.
    • Apple's iOS mobile operating system has had its ups and downs over the years, but one of the more notable downs was iOS 11. It did have its positives, like a much-needed overhaul to the iPad interface, an actual (if limited) file manager, and an enhanced control center. However, it mostly became notorious for its many, many problems. For starters, it ended 32-bit support, killing off a ton of apps. The UI design on the default apps became sloppier and overall worse. It was massively glitchy and prone to becoming unresponsive (the weirdest, if not necessarily most critical, bug being that the calculator app would give wrong answers thanks to an animation glitch) and, worst of all, it absolutely tanked performance, rendering some older devices almost unusable. It also didn't help that the "Batterygate" controversy came about around this time, when it was discovered that Apple deliberately throttled the performance of devices with heavily degraded batteries in order to prevent sudden shutdowns. While that was a smart, reasonable trade-off, it was poorly communicated to end users, and together with the lackluster performance of iOS 11 in general, it led to a firestorm of controversy, numerous lawsuits, and (unfair, but understandable) accusations of planned obsolescence. Ultimately, iOS 12 introduced little in the way of actual new features and focused more on simply fixing iOS 11's problems; it succeeded quite handily, and Apple wisely made sure that any device that could run iOS 11 could also update to iOS 12.
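      As an aside, the "Batterygate" trade-off itself is easy to illustrate. Below is a minimal, purely hypothetical Python sketch (not Apple's actual code; the health cutoff and scaling numbers are invented for illustration) of the idea: as a battery degrades, cap peak processor speed so that a sudden spike in power draw can't drag the battery voltage below the shutdown threshold:

          # Hypothetical sketch, not Apple's real code. Shows the idea behind
          # battery-health-based throttling: worn batteries get lower peak clocks
          # so load spikes can't cause brownout-triggered shutdowns.

          def peak_frequency_mhz(nominal_mhz: int, battery_health: float) -> int:
              # battery_health: 1.0 = brand new, lower = more degraded.
              if battery_health >= 0.8:  # assumed "healthy" cutoff, for illustration
                  return nominal_mhz     # healthy battery: full speed allowed
              # Degraded battery: scale the peak clock down, with a floor at 40%.
              return int(nominal_mhz * max(battery_health, 0.4))

          print(peak_frequency_mhz(1850, 1.0))  # new battery  -> 1850
          print(peak_frequency_mhz(1850, 0.6))  # worn battery -> 1110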
  • Microsoft:
    • The common joke about Microsoft's Windows operating systems is that they cycle between an Audience-Alienating Era and a quality product. Windows 95 was successful for the innovations it brought, but also extremely buggy, to the point where it was the butt of many jokes in the '90s, while Windows 98 corrected the technical flaws and provided an all-around quality product. Windows Millennium Edition (or ME) was such a notorious Porting Disaster of features from the NT-based Windows 2000 that it killed off the Windows line derived from MS-DOS, with many users choosing to either stick with 98 or, if they had to upgrade, go with Windows 2000 (the business OS) instead. Windows XP, itself derived from the NT-based 2000, was a return to form and arguably the most successful operating system in history: released in 2001, it didn't lose its title as the OS with the greatest market share until 2011, when it ceded it to Windows 7, and it was still supported with regular updates until 2014 (it even received a critical security update five years after support ended). The reason Windows 7 took over is its predecessor, Windows Vista, a buggy mess that quickly became an Old Shame for Microsoft, with Windows 7 generally seen as the 98 to Vista's 95 in terms of correcting its problems. Windows 8, released in 2012, didn't suffer from the bugs that plagued ME and Vista, but its "Metro" user interface, implemented in the Start screen and Settings screen (and thus only affecting parts of the UI while leaving the desktop unscathed), was built around touchscreens, and it received such a scathing reception from users of conventional desktops and laptops that people (especially business/office users, historically the core of Microsoft's base) were again unwilling to upgrade from XP and 7. Amidst this fiasco, Microsoft went to work on Windows 10, skipping over 9 completely in hopes of distancing themselves from the poor reception of 8; it combined 8's Metro interface with a Start menu and Settings app that were far more user-friendly for desktop users, to the rejoicing of Windows users...
    • That is, until the free upgrade promo (which ran from July 2015 to July 2016) abused Windows Update to the point that people who didn't want Windows 10 got it anyway, conflicts and all, unless they had set up their updates manually...
    • As for Windows 10 itself, under the hood there's no built-in way to schedule updates manually (Windows 10 normally downloads and installs any and all released updates as soon as they're available, despite Windows Update having had a rather bad track record of conflicts and system instability since the Windows 7 and 8 days), although workarounds do exist; there's also mobile-app-style content delivery (meaning you can expect suggestions and ads for Windows Store apps, plus a flood of freemium software), alleged collection of user data, and the fact that support for the more personalizable Windows 7 and 8 has been increasingly deprecated, to the point that CPUs launched from 2016 onwards only support Windows 10.
    • The Internet Explorer browser, from version 6 to roughly version 9. IE6 held the biggest market share in browser history for years, mostly because it was the standard for many business users whose custom software worked great with it but couldn't easily be ported to anything newer. As a result, people kept using IE6 despite it being outdated and insecure, and the need to maintain compatibility left versions 7 and 8 far behind Chrome and Firefox. IE9 was a return to form, but the bad reputation Internet Explorer had earned with 6 was still there. Microsoft would eventually phase out Internet Explorer themselves, replacing it with Edge in Windows 10.
    • For Windows 10, Microsoft decided to make the bulk of their QA team redundant and outsource real-world bug testing to an Insider Program: in other words, volunteers. This was a horrible mistake, as volunteers are not professional QA testers and are less likely to be as diligent or have the same priorities as a paid professional, and even if they do catch a serious but not particularly widespread bug, there's a chance the reports will be buried by less vital suggestions, causing Microsoft to miss them. In October 2018, Insiders' attempts to warn Microsoft that an update was deleting user files failed to be seen by Microsoft themselves before it snapped non-Insiders' files out of existence. That was only the beginning of a series of faulty updates partially chronicled on Idiot Programming, but in summary, Windows 10 updates have developed a habit of creating three bugs for every bug they fix, ranging from the annoying to the bizarre to the catastrophic. A not-uncommon sentiment is that Microsoft needs to get their shit together before the damage to their reputation becomes irreversible, assuming it is still reversible.
  • Intel has had at least two:
    • Early in the new millennium, the otherwise top-of-the-game Intel fell behind an increasingly competitive AMD with the NetBurst-based Pentium 4. Promising to eventually break the 10GHz barrier, it instead ran inefficiently and incredibly hot. On top of that, their attempt to produce a 64-bit successor to the aging x86 architecture resulted in the Itanium line, which proved a massive failure and forced them to adopt AMD's competing AMD64 architecture. The company was finally out of the woods mid-decade with the release of the Core 2, a more modernized take on the P6 architecture, along with a steady yearly update schedule allowing for step-by-step refinements that saw them easily outpace AMD for well over a decade.
    • However, Intel fell back into an Audience-Alienating Era in the mid-to-late 2010s, ending up stuck on the same basic architecture (Skylake) for six years. Their first attempt at a revised version (Cannon Lake) was such a disaster that they were forced to essentially abandon it, and they were then only able to release low-power mobile versions of their next architecture (Ice Lake) due to manufacturing problems. AMD, meanwhile, got their act back together and released the new "Zen" microarchitecture, which restored them to competing with Intel's high-end processors, before following it up two years later with Zen 2, which utterly destroyed Intel's entire desktop and workstation/server line-up in much the same way that the Athlon 64 had humiliated the Pentium 4. It took until 2021 for Intel to produce a desktop CPU that wasn't just a Skylake variant with more cores... and when that new architecture, Rocket Lake, was actually released, it turned out to perform worse than Skylake in many apps and games. Later that year saw the release of another new architecture, Alder Lake, which finally restored them to competitiveness on the desktop, though they remain a distant fourth place (behind AMD, IBM, and ARM) in server performance. Ultimately, this Audience-Alienating Era cost Intel their partnership with Apple, because it caused the thermal and performance problems that exacerbated the MacBook line's own Audience-Alienating Era.
  • AMD's own Audience-Alienating Era is widely regarded to have been between 2011 and 2016. The five years prior to that weren't exactly a golden age for the company, and saw the disastrous launch of the original Phenom (which could have been competitive with Intel's Core 2 line, but suffered from low clock speeds and the infamous TLB glitch), but also weren't a complete disaster. Unfortunately, AMD's FX line of CPUs was a complete disaster, with its revolutionary-on-paper "module" concept not working out that well in practice, and the rest of the chip being sunk by, ironically, the very same issues that had dogged the Pentium 4 a decade previously. Things were looking so dire for the company that in 2015 they were projected to be entering bankruptcy by 2020. However, the new Zen architecture that launched with their Ryzen line of CPUs proved to be a massive improvement, and by the time the Ryzen 3000 series rolled around with Zen 2, AMD was on the winning end of a Curb-Stomp Battle.
  • NVIDIA fell into this with the GeForce FX GPU, which used a substandard implementation of DirectX that allowed then-competitor ATI (now AMD) to wipe the floor with it. NVIDIA came back the next generation and stumbled a few more times (most infamously with the late and very power-hungry GeForce GTX 400 series), but never nearly as badly as this.
  • ATI themselves had an Audience-Alienating Era from about 2006 to 2009, as their Radeon HD 2000 series proved late and underwhelming, leaving them playing catch-up for a couple of years. They finally managed to get ahead of NVIDIA with the Radeon HD 5000 series, but it proved too little, too late for the ATI brand, which AMD killed off in late 2010.
  • The BlackBerry line of smartphones had one around the beginning of The New '10s, due to its obsolescence in the face of the iPhone and Android-based phones, and the company's slowness in developing new models. The models based on BlackBerry 10 were well received, and the company returned to profitability, but it was still a far cry from its previous stature in the smartphone industry it had pioneered. Ultimately, the company gave up on BB10 and built (and later licensed the construction of) Android-powered handsets before withdrawing from the smartphone market entirely.
  • In the early-mid-2000s, a wave of faulty aluminum electrolytic capacitors produced what came to be called the "capacitor plague" of computers whose internals shorted out after just a couple of years. The cause turned out to be a bad case of industrial espionage where a worker at the Japanese electronics company Rubycon left to work for the Chinese company Luminous Town Electric, taking with him a secret formula for the company's water-based electrolytes that turned out to have been mis-copied, lacking certain ingredients that were essential to the capacitors' stability. Said formula wound up stolen again with the errors intact and passed around throughout the Chinese and Taiwanese electronics industries, which cranked out millions of computers with shoddy circuit boards before the error was caught. Anybody who knows about the issues that resulted typically knows better than to purchase any used computer made between 1999 and 2007.

    Food & Drink 
  • In 1985, Coca-Cola decided to change the secret formula of its flagship soft drink, which had stayed essentially the same for most of a century. The "New Coke", as the media dubbed it, tasted more like Coke's chief rival, Pepsi (which was part of the whole point, actually). The general public considered it a downgrade, die-hard Coke drinkers said "They Changed It, Now It Sucks!", and Pepsi drinkers stuck with Pepsi. The new formula actually made Pepsi the number-one-selling soft drink for a while, partly because most of its advertising at the time boiled down to "Hate the New Coke? Drink Pepsi!" Pepsi saw the New Coke blunder as such a major win that they gave all their employees a day off in celebration. The original formula returned to the market 80 days after New Coke's debut; it was branded "Coca-Cola Classic" while the new one remained simply "Coke". The rest of the decade found Coca-Cola shilling (New) Coke with a Younger and Hipper advertising campaign starring '80s phenom Max Headroom, but with very little impact. Eventually, the original recipe reclaimed its "Coke" title while New Coke was quietly rebranded "Coke II", which finally perished in the early 2000s. Mocking the New Coke decision became a running joke for years, such as in Futurama's "Slurm" episode and the third season of Stranger Things.
    • Dave Barry lampooned this in one of his books with a "test your business IQ" question that went something like "You are the world's largest manufacturer of soft drinks. You are using a tested and proven formula that has remained the same for nearly a century. Your product's name is virtually synonymous with 'soft drink' in many areas. You should:" Of the choices, one of them was "Immediately change your formula." (Another, aimed at a more or less contemporaneous Pepsi PR disaster, was something like "Set a celebrity on fire.")
    • There's even a conspiracy theory that New Coke was a Failure Gambit, with Coca-Cola deliberately trying to provoke enough hatred for New Coke that people would be extremely happy at the return of Classic Coke, and thereafter buy more of it. A slightly different theory claims that New Coke was done in order to cover up the switch from cane sugar to high-fructose corn syrup in the original recipe, by making people forget what it used to taste like (in truth, this shift from sugar to HFCS occurred five years before New Coke). Both of these theories assume that there's no possible way that such a major corporation could shoot its business model in the foot so badly... right? To quote Donald Keough, then the company's president and chief operating officer:
    "Some critics will say Coca-Cola made a marketing mistake. Some cynics will say that we planned the whole thing. The truth is we are not that dumb, and we are not that smart."
  • Jack in the Box (the restaurant) had one between 1980 and 1994; read more about it on Wikipedia. In short: Jack in the Box originally had a typical West Coast hamburger-stand feel. You talked into the clown's mouth to order, and advertising featured an early version of Jack as well as several other characters. But in 1980, the chain ran a series of commercials where Jack was destroyed, and new marketing was aimed at "affluent yuppies" (mostly because the higher-ups realized the family and kids market they had previously targeted was simply too crowded). The menu expanded at an alarming rate of two new items a year. They even tried to rename the restaurant "Monterey Jack's" for more upscale-ness, a change that failed within a year. Around this point, the chain also withdrew from several markets east of the Mississippi, including New York state, Chicago, and Detroit. A massive E. coli outbreak at several Jack in the Box locations on the West Coast in 1993 (which sickened 623 people and killed four) also did damage to the chain's reputation, but at the same time, it led to Jack in the Box completely overhauling their food safety procedures, and many other fast-food chains soon followed their lead to ensure that an outbreak that large would never happen again. Jack in the Box, buoyed by their rebound and their highly effective "Jack's Back" ad campaign (featuring Jack returning to head the company, thanks to Magic Plastic Surgery; to symbolize the break from the JITB of the late 1980s, Jack proceeded to blow up the company boardroom as the "fancy"-sounding music from the previous ads played), has also begun expanding again, filling in markets such as Cincinnati and Indianapolis.
  • Hardee's went through a similar dip in The '80s and The '90s. The chain, already taxed by buying out other chains (most prominently Burger Chef and Sandy's), attempted to cut costs by replacing its charbroiled patties with frozen ones. A 1990 buyout/conversion of Roy Rogers restaurants (based in the Northeast, where the Hardee's name was totally unfamiliar) was met with such backlash that most of them were quickly reverted. Issues with quality control and constant menu changes brought the chain to its nadir in 1997, when tons of locations were closed (most of the franchises in Detroit were sold to Wendy's or the Canadian chain Tim Hortons, giving the latter its second successful American market) and the remainder was sold to the California-based parent of Carl's Jr. For the next six years, Carl's Jr. struggled to merge the two chains, keeping Hardee's still-successful breakfast menu alongside Carl's Jr.'s lunch/dinner menu and logo. The change was rough at first, resulting in a schizophrenic mess of stores, with some still sporting the pre-1997 menu and orange-and-brown logo well into the 2000s. But one last Retool of the menu to focus on "Thickburgers" seemed to finally turn things around and re-establish the chain with a more "upscale" image than McDonald's, Burger King, or Wendy's. As of The New '10s, Hardee's/Carl's Jr. has once again been in expansion mode, gradually filling in markets that had been abandoned in the '90s or earlier, such as Chicago.
    • Speaking of Roy Rogers, Hardee's singlehandedly caused a deep Audience-Alienating Era for that chain thanks to the aforementioned buyout/conversion plan. However, some history is in order. Roy's was founded in 1968 by Marriott; yes, the hotel chain. They had been in the restaurant business for years prior to hotel ownership, and by the late '60s, their chain of "Jr. Hot Shoppes" (fast-food style versions of their original "Hot Shoppes" eateries) wasn't doing so well, leaving Marriott wondering what to do with them. Meanwhile, Marriott had recently acquired the franchise rights to a place called RoBee's, which sold roast beef, but due to a dispute with Arby's over the similar-sounding name, they couldn't use that name outside of Indiana. Marriott board member Bob Wian (the founder and namesake of Bob's Big Boy) came up with an ingenious idea: merge the two chains into one that would offer hamburgers, roast beef, and fried chicken, plus a "Fixin's Bar" where customers could garnish their meals with dressings, condiments, and toppings as they pleased. He then reached out to famous Western movie star Roy Rogers to have his name on the chain and endorse its products. The concept was a wild success, and by the late 1970s, Roy's had expanded from its Washington, DC-area base to all over the Northeast and Mid-Atlantic. They were further boosted when they acquired rival burger/chicken chain Gino's in 1982. The chain looked to be set for the future... until Marriott decided to get out of the food business in 1989. Hardee's bought Roy's and immediately set out to use the chain for Northeast expansion. Many company-owned Roy Rogers locations were converted to Hardee's, and with the exception of the fried chicken (which was offered at Hardee's locations, even those that had never been Roy's, through the late '90s), Roy's familiar items were eliminated. Loyal Roy's customers revolted, and by 1994, Hardee's was selling off locations left and right, often to McDonald's, Wendy's, and Boston Market. By 1997, only a few franchised stores in the DC/Baltimore area, plus some stragglers in highway rest stops, were left of the once-mighty Roy Rogers. In 2002, Peter and Jim Plamondon (twin sons of one of the original Roy's executives) bought the rights to the chain and began an effort to keep the name and menu from falling by the wayside. They've built new locations and expanded franchising, and slowly but surely the chain is inching its way back.
  • Wendy's went through a similar plunge in The '80s, due mainly to poor upkeep of its stores, which created cleanliness issues, as well as a failed attempt to adopt a breakfast menu (unlike McDonald's or Burger King, Wendy's never fully got on board with breakfast, and it was only available at a handful of locations before quietly being dropped in 2014). However, it was not a long-lived or detrimental decline like the one Hardee's suffered — by The '90s, the chain had recovered from its eighties slump, thanks to storewide renovations and a highly popular series of ads featuring founder Dave Thomas. By the mid-'90s, Wendy's was considered the best in quality and service among the "big three" burger chains, and, despite closing most of its international locations later in the decade, it has been a solid #3 ever since, becoming popular through a snarky social media presence. The chain even relaunched its breakfast menu nationwide in 2020.
  • A&W Restaurants went through this in The '70s and The '80s. This was mainly due to their drive-in restaurants aging and becoming less feasible compared to drive-through places such as McDonald's — except in warmer climates, drive-ins often had to close in the winter, as very few had indoor seating. The chain went on a huge closing spree and franchise freeze, slimming the numbers down greatly; they also sold all of their Canadian locations to Unilever (to this day, Canadian A&W has no connection with its southern counterpart). A subsidiary was spun off to sell A&W root beer in grocery stores. New sit-down locations with drive-throughs were piloted, and A&W began to push into shopping malls as well (thanks not only to a buyout of Carousel Snack Bars but also to the company being owned by shopping mall developer A. Alfred Taubman at the time). The chain ended up in the hands of Yum!, the owners of KFC, Taco Bell, and Pizza Hut (and also Long John Silver's at the time), who aggressively expanded the brand with "2 in 1" stores combined with KFC or Long John Silver's. Although A&W and Long John Silver's have since spun back off from Yum!, the co-branding largely remains, and A&W has been fairly secure ever since.
  • Tropicana orange juice went through a bizarre and brief Audience-Alienating Era when they hired the Arnell Group to redesign their packaging. The new design was so ugly that it actually caused a 20% drop in sales. It was reverted after just a few months.
  • English cuisine went through its Audience-Alienating Era in the 20th century. Until the Industrial Revolution, English cooking was well-regarded throughout Europe, but industrialization and the two world wars sent it into a drastic decline in quality and reputation as mass-produced food took over. The era has been receding since The '70s, but the reputation still persists.
  • American cuisine, meanwhile, took a nose-dive in quality during the early Cold War, becoming a giant buffet of artificial chemical garbage loaded with dangerous amounts of sugar and fat and only trace amounts of real nutrients. Food preservation technologies developed during World War II, combined with the perceived need to stockpile preservative-heavy food for Cold War-era fallout shelters and general public ignorance about the potential health risks of chemical additives, spawned a wave of food production emphasizing price and speed over quality. The rise of fast food and the introduction of the microwave only made things worse; this was when America first developed its Fast-Food Nation reputation, which it, like Britain, still has trouble shaking off. Unsurprisingly, heart attack and cancer rates in America skyrocketed during The '50s and The '60s, thanks in no small part to the garbage people were putting in their bodies. This nearly destroyed drip coffee's reputation and spurred the organic and slow food movements as an explicit rejection of the trend. Entire websites like Lileks' Gallery of Regrettable Food and The Mid-Century Menu showcase some of the awful recipes to come out of this era.
  • French cuisine during The '90s and the Turn of the Millennium came to be seen as a bland and formulaic institution, trading entirely on tradition and reputation while spurning innovations coming out of countries like the US, the UK, Italy, Spain, and Japan, often on grounds of Creator Provincialism and national pride. The increasingly difficult economics of running a top-flight restaurant in the face of France's Obstructive Bureaucrats were also leading many restaurateurs to cut corners; a 2010 Canal Plus documentary revealed that up to 70% of French restaurants were relying on pre-packaged meals, ingredients, and sauces. A successful push that same year to get UNESCO to designate French cuisine as part of the world's cultural patrimony was also widely criticized as merely entrenching its ossification, turning it into a museum piece and making it even less relevant to the French people. The New '10s saw a new breed of chefs, influenced by cuisine from the US and elsewhere in Europe, bring those countries' culinary revolutions to France, setting off a wave of creativity and experimentation and putting France back on the map.
  • The Snapple brand of tea and juice drinks fell into one in 1994 after its founders sold the company to Quaker Oats, who mismanaged it into the ground. They cut back the low-volume niche flavors in order to save money and fired their spokesperson "Wendy the Snapple Lady" due to her being seen as too "Long Island" for a national advertising campaignnote , in the process killing much of the indie upstart spirit that the company was known for. They also pulled its advertising from The Howard Stern Show, while he was on vacation no less, due to his penchant for controversy. Stern, who had been a fan of Snapple before then (leading to their partnership in the first place), was not amused one bit and started calling the drink "Crapple" on his show. While Quaker Oats sold Snapple off to Triarc Companies in 1997 and the new owners managed to stop the bleeding, the damage was done, and Snapple's market share never fully recovered; as of 2014, it was less than half of what it had been twenty years prior. Whereas once Snapple had national reach, nowadays its popularity is largely concentrated on the East and West Coasts (particularly New York City) and in places with large numbers of people originally from those areas (such as Florida). Today, Harvard Business School uses Quaker Oats' ownership of Snapple as a case study in how not to run a brand.

    Pinball manufacturers 
  • Stern is widely considered to have gone through an audience-alienating era between 2003 and 2012. The reason is simple: during this interval, Stern was the only non-boutique manufacturer of pinball machines. As a result, innovation and creativity mostly stagnated. (That being said, Stern is partially Mis-blamed for this stagnancy, as they got caught in the tail end of a patent war among several other manufacturers that left Stern with next to nothing to use.) Stern was off to a good start with well-acclaimed early hits like Lord of the Rings and The Simpsons Pinball Party, but they got complacent soon afterwards: The Problem with Licensed Games took full effect here, as Stern was able to get themes for hot properties at the time of release, like 24 or Pirates of the Caribbean, meaning Stern's machines typically made their price back simply because passers-by familiar with the license would drop in some coins out of curiosity. This convinced operators, who mostly do not play pinball, to just buy whatever machines Stern would release. In addition, game rules were released in such an incomplete state that some games, like Batman (Stern), were near unplayable on release. The build quality also suffered, most noticeably with Iron Man's rivets and screws shaking loose in as little as six months. This changed in 2012 when Jersey Jack Pinball emerged as a competitor to Stern, and Stern's machines took a noticeable leap in quality, both in build and in gameplay. Whereas Stern was once practically a derogatory name among pinball fans, they are now respected by all but the most diehard Bally-Williams fans. The leap is so dramatic that several of Stern's audience-alienating era machines have since been Vindicated by History, with Stern now manufacturing re-releases of machines that never sold much when first introduced.
  • After dominating the electromechanical era (from post-World War II to 1979) and scoring some hits right after it, most notably Black Hole (which charged double per game compared to surrounding releases and was still popular in every arcade), Gottlieb got caught in an audience-alienating era from roughly the mid-'80s onward that they just couldn't shake off. This downturn caused the company to financially trail behind its competitors and go out of business in 1995. The reasons, however, are not entirely clear: ask 50 pinball fans and they will give you 50 different answers, but they all agree that the Gottlieb machines weren't quite as good. A likely underlying reason is that as Gottlieb's competitors created more intricate rules,note  integrated their themes more deeply,note  and incorporated new technology,note  Gottlieb's design team, stuck in electromechanical ways of design, had trouble adapting, lagging behind Bally-Williams and Data East by two to four years. When Gottlieb did adapt to mode-based progression in the '90s, however, the disjointed gameplay, lopsided scoring, and clunky playfield geometrynote  earned it few fans. There are a few highlights within this era though, such as Stargate, Rescue 911, and Cue Ball Wizard.

    Retail 
  • From 2014 to 2018, Select Fashion was seen as little more than a poor imitation of brands like New Look and Marks & Spencer, and its increased focus on sports tops, crop tops, and shorts/athleisure wear got a mixed reception. The brand was already in trouble due to the political situation in the United Kingdom, and the audience-alienating era was compounded by a controversial website (selectfashion.co.uk) that people complained was riddled with bugs and had a user interface slower than those of its rivals. Adding to it, some people considered the store a Genre Throwback to 1990s-style clothing, or a British Captain Ersatz of Hot Topic (fashionable-style clothing). Some also saw it as having a Periphery Demographic for womenswear: despite its supposedly younger target market, older women were buying the sports bras and crop tops more than the 20-somethings and millennials that the brand was aimed towards.
  • Abercrombie & Fitch endured two of these.
    • First, in the '70s and '80s, financial problems forced what had once been a top-shelf outdoor specialty retailer to slash costs any way it could, cutting back on both its expensive loss leaders and its more moderately-priced budget items. This culminated in bankruptcy in 1976 and the closing of its flagship store on New York's Madison Avenue the following year. It survived The '80s mainly as a mail-order catalog owned by Oshman's Sporting Goods, with only three physical stores, in Beverly Hills, Dallas, and New York. The Limited bought out A&F in 1988 and pulled it out of the Audience-Alienating Era, as new CEO Mike Jeffries, who took over in 1992, reinvented it as a preppy, sexy lifestyle fashion retailer aimed at teenagers and young adults. Under Jeffries' leadership, A&F became an iconic symbol of youth fashion in the '90s and '00s.
    • That said, The New '10s were not kind to A&F. The Great Recession turned one of the brand's main selling points, its high price and exclusive image, into a liability virtually overnight, as the displays of wealth that A&F had come to be synonymous with were now seen as elitist, the domain of Jerk Jocks and Alpha Bitches, especially with both advertising and store staffing tending to be overwhelmingly White. Long-simmering controversies surrounding the hiring and treatment of the stores' clerks (in 2006, Jeffries stated in an interview that they only hired "good-looking people" to work in their stores because their target demographic was The Beautiful Elite) only furthered the image of a brand stuck in the past, as did its highly sexualized marketing, which came to be seen as tacky in an age of greater awareness of sexual harassment and assault. All this allowed low-cost "fast fashion" brands to eat A&F alive in the early-mid 2010s, with drastically shrinking revenues and many stores closed. In 2017, Fran Horowitz, fresh off of turning around A&F's sister brand Hollister, was promoted to run the company's mothership, and she proceeded to revive its fortunes by toning down the overt sexuality, abandoning the elitism in favor of a more down-to-earth, aspirational vision of preppiness (including expanded size ranges for smaller and larger customers), and focusing on quality products, successfully banking on fast fashion's reputation for poor quality and environmental wastefulness to drive sales from people looking for longer-lasting clothes.
  • In the '80s and '90s, the American book retail chain Barnes & Noble became a villain of independent booksellers everywhere, to the point of it becoming a central part of the plot of the hit 1998 Romantic Comedy You've Got Mail. Perhaps it was only karmic when, in the 2010s, the chain was hit by the same pressure in the form of Amazon, the online retail giant that started out selling books and was able to beat B&N at its own game. B&N's business tactics during this time did nothing to revive its flagging fortunes; an attempt to diversify their offerings into fields that were only tangentially related to books left customers puzzled as to why they'd want to buy, say, batteries, while their tactic of taking fees from publishers to give certain high-profile new releases prime placement at the front of the store, while offering easy money in the short term, often left stores giving the best shelf space to books that nobody wanted to buy. This era ended in 2019, when new CEO James Daunt, founder of London's Daunt Books chain and the man who had previously turned around Waterstones (the British B&N), did the same at the American company by redoubling its focus on books, giving individual stores more freedom to choose which titles they wanted to promote, ending the promotional deals with publishers, and investing in both B&N's online arm (including its languishing Nook e-reader) and redecoration at the chain's physical stores. By the early '20s, even amidst the COVID-19 Pandemic that devastated brick-and-mortar retail, the company was once again profitable and expanding, much to the surprise of those who thought that e-readers and Amazon had made it obsolete.
  • Kmart was the original chain of discount department stores, with its roots in the S.S. Kresge five-and-dime stores; launched in 1962, it expanded rapidly, and by 1981 it had over 2,000 stores across the US and Canada. The phrase "Attention Kmart shoppers", used to direct customers to their famous Blue Light Specials, entered the popular lexicon. Unfortunately, the 1980s saw trouble brewing — Kmart neglected store upkeep and other capital improvements (like computerizing their inventory system) in favor of paying high dividends to their stockholders. They also lost sight of their core business and diversified wildly into all sorts of other store chains and concepts, including Borders and Waldenbooks, Sports Authority, Builders Square, Pace Membership Warehouse, and others (to the point that several shopping plazas were constructed with all the tenants being owned by or connected to Kmart; almost all of said chains have since gone under themselves). As the '90s dawned, they hurriedly attempted to play catchup with newly-emerging rivals like Walmart and Target by opening Super Kmarts, pouring money into advertising and exclusive deals with celebrities, and rebranding almost all their normal stores as "Big Kmart". This only accelerated the decay, and Kmart went bankrupt in 2002. Corporate vulture Eddie Lampert bought the chain out of bankruptcy, then merged it with the also-declining Sears not long after. Lampert clearly didn't give a damn about anything but making himself money (mostly by selling the company's physical properties to his real estate company, then leasing them back to the stores), and both chains spiraled into crisis through the late 2000s and all the way through the 2010s, with locations closing left and right and new formats failing; even more sympathetic views of Lampert's actions while running Kmart and Sears involve him trying to implement ideas that had worked for him as a lone investor but were hopelessly out of touch and counterproductive as the head of a big, complicated company. As of right now, Kmart is down to 9 stores, only 3 of which are in the continental US (the others are located on islands like Guam, where a lack of competition likely explains why they've been kept open for so long).
  • In the 2000s, while the British fashion house Burberry was still successful overseas, its once-luxurious reputation had collapsed at home. The brand had boomed in The '90s, but it had done so by reducing prices and moving downmarket, and by the Turn of the Millennium the consequences had come home to roost: namely, their clothing and trademark check pattern (especially their baseball caps) were embraced by "chav" culture and Football Hooligans to the point that pubs, nightclubs, and stadiums started banning anybody wearing Burberry. Danniella Westbrook, an EastEnders star notorious for her Hard-Drinking Party Girl lifestyle, was especially known for wearing Burberry, making her just about the worst brand ambassador the company could have. New leadership in the late '00s fought hard to turn around Burberry's déclassé reputation at home, most notably by toning down use of their check pattern and removing the infamous caps from sale entirely, and even then, it took years for the company to recover.

    Tourism 
  • In the early 1990s, Las Vegas was facing stiff competition not only from Atlantic City drawing away gamblers on the East Coast (at its height, AC had over twice as many tourists as Vegas), but from the looming threat of Native American casinos, legalized in 1988note , drawing away gamblers from Middle America as well. What's more, the traditional Mafia backers of many resorts were under growing pressure as federal law enforcement slowly pushed them out of the city. As a survival mechanism, Las Vegas began its now-infamous attempt to expand its appeal to middle-class tourists by rebranding the city as a destination for family vacations. Every Strip hotel built between 1990 and 1993 had a theme and at least one theme park-esque attraction, with the new MGM Grand boasting a full-blown theme park, MGM Grand Adventures, and Circus Circus adding one of its own, Grand Slam Canyon (now Adventuredome).

    This backfired badly. Adult tourists who preferred to gamble and party among their own kind were upset by the very presence of large numbers of children in Vegas, parents were more likely to spend time with their kids than at the gambling tables and slot machines, and hotel-casino staffs trained to operate adult-oriented resorts couldn't handle the unique needs of families. Cases of parents rushing off to the casino floor and leaving their kids to fend for themselves made the news — one abandoned 7-year-old girl ended up kidnapped and murdered in 1997 — and the MGM Grand's theme park turned out to be a bomb that closed in 2002.note  The beginning of the end for this era came with the openings of the Bellagio in 1998 and the Venetian in 1999, which were explicitly geared towards a very classy and very adult clientele, with the Bellagio boasting a fine art gallery, a conservatory, a resident Cirque du Soleil show, and high-stakes poker tables, and the Venetian a nightclub and a museum. In 2003, the city of Las Vegas began its "What Happens Here, Stays Here" ad campaign that marketed the city as a Hotter and Sexier tourist destination for young adults, firmly ending the days of "family Vegas" once and for all. While the hotels that opened to serve families are still around, and Vegas still markets itself as being about more than just gambling, said hotels have been progressively de-themed and the city's entertainment mix now mostly excludes families.
    • This was referenced and summed up pretty well at the end of the film Casino.
    Today it looks like Disneyland. And while the kids play cardboard pirates, Mommy and Daddy drop the house payments and Junior's college money on the poker slots.
    "The town of Vegas has got a different face / Because it's a family place / With lots to do. / While in The '50s a man could mingle with scores / Of all the seediest whores, / Well now his children can too!"
    • The film Vegas Vacation in 1997 parodied the family era of Las Vegas, with Clark Griswold taking his wife and kids on a trip to Vegas with his bonus check. Without parental supervision, Rusty gets a fake ID and becomes a high roller, while Audrey gets a job as a stripper thanks to the influence of Cousin Eddie's trashy daughter Vicki.
    • The Simpsons poked fun at this in the 1999 episode "Viva Ned Flanders", which began with the Monty Burns Casino being demolished so it could be replaced with a "casino-themed family hotel." Later in the episode, there's a brief cameo of Raoul Duke complaining that there were "too many kids" in Vegas.
  • Paul Pressler’s run as president of Disneyland from 1996 to 2000 is a textbook example of someone excelling in one field but completely failing in another. After a very successful stint as the head of The Disney Store (which itself underwent an Audience-Alienating Era after he left, thanks in part to focusing on films like The Hunchback of Notre Dame and Hercules that aren't exactly merchandising-friendly), Pressler was promoted to the top position at Disneyland, which at the time was undergoing a radical change. The park was slowly losing new customers, and an attempt to add a second Disney park to the area had failed miserably. He was charged with saving money and enticing new people to the park – which he did by seriously cutting attraction maintenance and operating hours, and by homogenizing merchandise within the park down to only a few major items like T-shirts and plushies, basically turning the park into a glorified Disney Store. He then saved even more money by completely shutting down smaller low-capacity attractions like the Motor Boat Cruise, and by “trimming” the budget of a massive redo of the Tomorrowland area.

    For his “success” at Disneyland, he was promoted to the head of the entire theme parks division in 2000, where he oversaw development of the long-awaited second theme park in the Disneyland area, Disney’s California Adventure. The park opened to great fanfare in 2001… and very quickly became a spectacular flop. The park was accused of being cheap and uninteresting, with more of an emphasis on shops and dining than on shows and attractions. After two years of trying and failing to fix California Adventure, Pressler resigned in 2003 to become the head of The Gap, thankfully taking Disneyland's Audience-Alienating Era with him.
  • The German Baltic Sea Coast had this, particularly in the former East, through no fault of its own. During the 19th century, resorts on the Baltic coast were among the most prized and widely sought-after places for the well-to-do to spend a Kur (a unique German type of stay at a spa that supposedly rejuvenates and increases health); many grand old hotels date to that era and had crowned heads among their guests. The introduction of the railroad made more and more spas accessible to the growing middle class, and soon tourism vastly overtook fishery in economic importance. Then two world wars hit, and with them came German partition. While the resorts had always bounced back before, some were now so close to the border that they could not continue to operate, while others lost almost all of their former cross-border guests. Still, those in the GDR enjoyed a steady stream of tourists of all classes, aided by the fact that travel to the West was out of the question and even holidays in "socialist brother countries" like Hungary or Poland were a bureaucratic hassle for GDR citizens, to say nothing of the long trips there in relatively slow trains or the Trabbi. After 1990 the border opened, and suddenly GDR citizens either moved West, became unemployed, or wanted to explore all those countries beyond the Iron Curtain that had always been out of their reach. Baltic Sea towns suffered, and even some in the West had problems with (ironically) Westerners abandoning them for resorts in the East. At the same time, the dockyards in Rostock (one of the other major employers in the region) collapsed, and the GDR fishery, which had been unsustainable to begin with, lost its reason for existence. For a time it seemed that, between rampant emigration of every young person who could leave and persistent economic problems, the region had entered a deadly tailspin. While some of these problems still persist, however, tourism has bounced back: in part because (former) GDR citizens did in the end return to the beaches of their youth, in part because some Westerners discovered a love for East German beaches, and in part because the G8 summit in Heiligendamm (the one with the quirky picture of world leaders in a Strandkorb) put the region onto the international tourism map for the first time in over half a century.
  • Both Universal Studios resorts have had low points in their history, particularly during the Turn of the Millennium.
    • Universal Orlando:
      • Enthusiasts deem the period during and after the 1999 resort redesign to be the park's first low point. During this period, many of Universal Studios Florida's classic attractions, such as Ghostbusters Spooktacular, Kongfrontation, Alfred Hitchcock: The Art of Making Movies, Stage 54, The Funtastic World of Hanna-Barbera, The Wild Wild West Stunt Show, and Back to the Future: The Ride, were closed in an attempt to "modernize" the theme park and keep it competitive with Disney. Reactions to most of the replacements ranged from extremely polarized to outright backlash. The only replacements to be well-received were Men in Black: Alien Attack, Revenge of the Mummy, and The Simpsons Ride, and even the latter two faced some backlash for having replaced Kongfrontation and Back to the Future: The Ride, respectively. It also didn't help that during this time, Universal scrapped plans for building a second resort just a few miles away for economic reasons, with nearly disastrous consequences.
      • Its second low point, meanwhile, was the mid-late 2010s, especially in Orlando but also affecting Hollywood. In particular, enthusiasts cite a perceived over-reliance on motion simulator rides (or "screen" attractions) at the expense of more traditional rides and stage shows, especially since, with both the original Studios park and Islands of Adventure mostly filled in, Universal had to close old attractions to make way for these new rides. While the replacement of Twister...Ride it Out with Race Through New York Starring Jimmy Fallon wasn't too bad,note  the same couldn't be said of replacing the well-received Disaster! with Fast & Furious: Supercharged, which met a scathing reception even from casual park guests, especially since Disaster! was one of only four remaining attractions at the park (the others being E.T. Adventure, Animal Actors on Location, and Universal Orlando's Horror Make-Up Show) to date back to its 1990 opening, albeit originally in the form of Earthquake: The Big One. Volcano Bay, Universal Orlando's third theme park (and first water park), also received very mixed reviews on opening day in 2017 for being an Obvious Beta, with complaints about the rides breaking down frequently, the wireless TapuTapu wristbands used for the virtual lines being glitchy, and said virtual lines being several hours long, such that Universal was forced to cap ticket sales in order to prevent overcrowding (leading to guests being turned away from the gate as early as noon).

        It's widely agreed that this era ended in 2019, which saw the opening of Hagrid's Magical Creatures Motorbike Adventure, a rollercoaster that won acclaim for its merging of traditional coaster design and physical storytelling, tellingly featuring only a single screen effect, used for the pre-show video. The early '20s reopening from COVID saw Universal continue this streak with rides that were far more careful in their use of screens, such as VelociCoaster in Orlando, The Secret Life of Pets: Off the Leash and Jurassic World: The Ride (a refurbishing of Jurassic Park River Adventure) in Hollywood, and Jurassic World Adventure at their new park in Beijing. This period also saw major expansion projects being undertaken, including the aforementioned opening of Universal Studios Beijing, Super Nintendo World at various parks, the construction of the Endless Summer Resort in Orlando, and the development of the long-awaited "third gate" in Orlando in the form of the ambitious Epic Universe park. Fans tend to favorably compare this pace of ride and park development, especially in Orlando, with that of Disney, who are seen to have gone through an AAE of their own under the leadership of Bob Chapek due to their cost-cutting, price hikes, and lack of major sustained investment into their parks.
    • Universal Studios Hollywood had it even worse than Orlando in the '00s. Most of the park's attractions from the late '80s all the way up to the early 2010s got replaced left and right just to keep the park competitive with Disneyland, and even then, most people visited the park just to ride the iconic Studio Tour.note  The most negatively received of the newer attractions at the time was Spider-Man Rocks!, a live stage show that replaced Beetlejuice's Graveyard Revue in 2002. It was frequently mocked by almost everyone who watched it, and it made Spider-Man: Turn Off the Dark look like a Tony Award-winning musical by Bob Fosse. It ended up being the only Marvel-related attraction Universal Studios Hollywood ever had; it closed in 2004 to be replaced by Fear Factor Live (which didn't fare much better reputation-wise), and Marvel regained the California theme park rights to their characters in 2008, rights which fell into Disney's hands the following year when they bought Marvel. Only the opening of Hollywood's Wizarding World of Harry Potter, Springfield, and Super Silly Fun Land areas in The New '10s signaled that this era was starting to die down.
    • For Halloween Horror Nights Orlando, meanwhile, the years from 2012 through 2014 are often cited as a low point due to the increased focus on licensed properties (especially The Walking Dead) and the lack of "icon" figures in favor of "having the event represent itself" (which usually translated to a heavy emphasis on The Walking Dead in the marketing). This period reached its nadir in 2013, when every scarezone was themed around The Walking Dead. The Audience-Alienating Era ended with the 25th anniversary event in 2015, featuring the return of many popular characters from past events, a more generous ratio of original to licensed content, and The Walking Dead being kept to one house.
  • If you want to get a North American skier or snowboarder worked up into a fury, bring up Vail Resorts within earshot. Starting in the 2010s, Vail, then the owner of four ski resorts in Colorado (including their namesake) and Heavenly Mountain Resort at Lake Tahoe in California, went on a buying spree of ski resorts across the US and beyond, especially in the latter half of the decade, and many longtime skiers and riders would argue that they put every resort they bought into an AAE. Their signature innovation was the Epic Pass, a season pass that allowed entry at any Vail-owned ski resort in the world along with limited access to multiple partner resorts — and because the passes were so cheap,note  many Vail-owned resorts were soon crawling with more guests than they knew what to do with, leading not only to long lines at chairlifts but also, more importantly, to safety hazards from both overcrowding and large numbers of beginners. What's more, Vail's management often eliminated local control at the resorts it bought and centralized most decision-making, and brought a wave of gentrification to many ski towns where they bought resorts, leading many to feel that their local mountains being bought out by Vail stripped them of their identity and character. Skiers and riders on the East Coast especially grew to hate Vail, seeing the Colorado-based company as failing to understand the unique nature of skiing in the East,note  and consequently accusing them of running many of the resorts they bought into the ground. On the other hand, Vail also has its defenders, who credit their Epic Passes with inspiring a Newbie Boom in skiing and snowboarding by making fairly expensive sports more accessible. The success of the Epic Pass, in fact, led many competing resorts to pool their resources into offering similar multi-mountain season passes, like the Ikon Pass and the Indy Pass.

    Websites 
  • The Escapist fell into one from 2014 to 2018. Short version: the owner of the site took a stance on a heated controversy within gaming that had broader political implications, a stance that most of the site's contributors sharply disagreed with, causing many of them to either quit or receive pink slips. By late 2017, only Zero Punctuation remained at the site, largely because its creator Ben "Yahtzee" Croshaw tried to keep his show as far away from the controversy as possible, and by that point even he was leaning more on YouTube than The Escapist. New management in 2018 (namely, the original creators of The Escapist buying it back) rebuilt the site, hiring back many former contributors who had previously left. In late 2023, however, Yahtzee finally had enough, and left the site along with much of its staff to create Fully Ramblomatic as a Creator-Driven Successor, since The Escapist still owns the rights to the Zero Punctuation IP.
  • Mention Yahoo! or Verizon to a Tumblr user (or even viewer) at your peril. Yahoo! bought the blogging site in 2013, Verizon took it over in 2017 when they bought Yahoo!, and between them, they proceeded to run it into the ground. Both attempted to turn the site, which had its own very distinct and often countercultural community, into an advertising-friendly competitor to major social media platforms like Facebook; this backfired, as corporate and media attempts to ingrain themselves in the site's culture often carried a whiff of trying too hard to be "modern" and met considerable backlash from the site's users. All the while, the site gained a reputation for being an extremely toxic environment, due in part to a lack of an official content policy and a business model that repeatedly failed to incentivize or financially compensate the userbase, leading to intense cyberbullying, blogs containing graphic pornographynote , and a proliferation of extreme communities ranging from white supremacists to fan clubs for mass murderers, while the protests of the site's users concerning such groups went ignored by management.

    In late 2018, Verizon finally enacted an official content policy after the Apple App Store removed the site's dedicated app for policy violations, spurred by an FBI sweep that had found extensive stashes of child pornography and other disgusting material circulating on the site. While Verizon’s decision did admittedly solve the problems with extreme porn, it not only did nothing to solve any of the site's other issues but wound up causing new problems of its own, due to the site's algorithms flagging innocuous images as pornographicnote , further alienating the userbase and causing an exodus of artists to other sites ranging from Twitter to Newgrounds. In 2019, six years after Yahoo! paid an eye-popping $1.1 billion to acquire Tumblr, Verizon sold it to Automattic (owners of the WordPress blogging platform) for less than $3 million, an insane Generation Xerox degree of self-destruction that would put even MySpace to shame, and one that showed just how far Tumblr had fallen.

    Ironically, in hindsight this affair was seen as the beginning of the end of Tumblr's AAE, even if the site spent the following years in limbo. While Tumblr's mainstream popularity declined, its remaining users seemed to grow all the more attached to it, resulting in a small but loyal community of people who actually appreciated the site's lack of features and saw it as an alternative to mainstream platforms that prioritized engagement and advertising at the expense of privacy, mental health, users from marginalized communities, and fighting misinformation. By 2022, this was enough to kick off a revival of Tumblr, first as the site attracted a new generation of teenage users and then as it absorbed a wave of refugees from Twitternote  after Elon Musk took over that site and swiftly implemented a series of controversial changes, just in time for Tumblr to soften its porn ban to permit nudity again (though not explicit sex).
  • For the various sites under the formerly-known-as Gawker Media umbrellanote , the Audience-Alienating Era started in 2016 with the shutdown of the flagship site Gawker, due to a lawsuit brought by Hulk Hogan and supported by tech billionaire Peter Thiel (who had his own personal beef with the site) over its publication of Hogan's sex tape. The newly-renamed Gizmodo Media Group was soon bought out by Univision, whose mismanagement came under fire from the sites' own writers in 2018. In 2019, as Univision's own problems mounted, they sold the Gizmodo Media Group, along with The Onion and The AV Club (which they had acquired separately in 2016), to the private equity firm Great Hill Partners, who created G/O Media out of it. It was here that the problems really began. To make a long story short, new CEO Jim Spanfeller fundamentally disagreed with the Deadspin writers and editors as to the best direction for the website going forward. Spanfeller wanted Deadspin to focus purely on sports-related news and leave non-sports stories to the other websites under the banner, while the writers wanted to maintain the freedom to mix sports news with other subjects they had enjoyed covering under previous leadership. The dispute eventually set off a Writer Revolt that caused the site's entire staff to quit, effectively putting Deadspin on hiatus and leaving a stinging effect on morale at the other G/O sites. The site was eventually relaunched with a new team in March 2020, while most of the original crew launched Defector in September 2020.
  • During the late 2000s and early 2010s, Cracked was a very popular humor website, spun off from a failing knock-off of MAD Magazine in 2005 under editor-in-chief Jack O'Brien. It thrived on a mixture of community-submitted lists and articles and columns written by a small and often popular core of on-staff columnists with distinct styles, often combining genuinely interesting trivia and facts with an over-the-top, somewhat profane, and self-deprecating sense of humor; most articles ended in recommendations for other such articles, so it was easy to get sucked in after reading just one. By 2010, Cracked had drawn over a billion page views, and by 2012 it was the most-visited humor site in the world, even beating out The Onion. They also began releasing video series like "Agents of Cracked" and "After Hours," which unfortunately sowed the seeds of the site's doom. It became a victim of the pivot to video, in which Facebook's misrepresented statistics about online usage led many companies to prioritize video content, even though it's slower and more expensive to produce and more rapidly consumed. In 2016, Cracked was acquired by the E. W. Scripps Company, who fired most of the staff to cut costs in 2017, with others leaving to pursue other projects. This coincided with a dramatic reduction in the quality of the community-submitted list articles, which had previously been heavily vetted before posting. By 2020, the last of the old guard had been fired, and the site had been passed around like a hot potato by various corporate owners, with a tiny skeleton crew of employees pumping out content as fast and as often as they could with little regard for quality control. The new list articles and captions were designed more to catch clicks, rank in search engines, and feature well on social media than anything else, and the only consistent visitors were those trying to look up old articles they remembered from the past, many of which are increasingly unreadable as format changes to the site leave their text jumbled, their images missing, and the new search features struggling to locate them anyway. Time will tell if this era ends Cracked forever.

    Web Video 
  • By the admission of its own creator James Tyler, the YouTube gaming show Cleanprincegaming (now known as Nerdstalgic Gaming) fell into a bad one in mid-2018. The show's rapid rise caused Tyler to speed up his schedule to keep churning out new content, most notably a vlog as a supplement to his normal video essays, which eventually led to increasingly formulaic videos, sloppy research, and accusations that he wasn't actually playing the games he was talking about so much as summarizing other people's opinions of them. Fortunately, Tyler recognized that things had gone wrong with his show, and took a month-long break before returning with a massive step up in quality, complete with a commitment to using only game footage that he had captured himself. This was enough to Win Back the Crowd, luring back many previously disillusioned fans.
  • Chris Bores's videos have always been divisive, but there are two periods even his fans weren't very fond of:
    • The first signs of trouble came in 2012-2013, when he posted uninteresting ghost hunting videos and flooded his channel with Skylanders content. Episodes of his (relatively) popular series like I Rate the 80s and his flagship The Irate Gamer became increasingly scarce and eventually stopped coming entirely. By 2015, the Audience-Alienating Era was in full force — his most popular series had been replaced by more ghost hunting videos and several failed experiments, like a few botched attempts at Let's Plays and a poorly-received "crazy" review style that was abandoned after a couple of videos. He also made many Product Placement videos, such as incompetent reviews praising questionable tech products and videos that consisted mostly of him showing off the contents of subscription boxes and declaring them cool. This Audience-Alienating Era ended in 2018 when he launched Chris NEO, a reboot of the review shows his fans had enjoyed, and dedicated his channel to it. It didn't last — NEO came to an end in October 2018, and his channel remained dormant until August 2019.
    • His August 2019 return marked the start of his second Audience-Alienating Era. Instead of videos, he decided to make a podcast titled Geek Time TNG, which most of his fans passed up. In particular, the "The JOKER Movie SUCKS!!" episode was poorly received because he based his opinion entirely on the trailers, even though it was uploaded around the time the full movie hit theaters. This Audience-Alienating Era ended with the release of a new Irate Gamer video in May 2020, which was so positively received that it actually drew in new fans.
  • Extra Credits is believed to have entered one starting in 2018, around the time narrator Daniel Floyd left. Daniel was widely seen as the frontman of the show, and his replacement Matt Krol drew so much ire for taking his place that Krol himself joked about Daniel being more popular than him. The era worsened in 2019 when writer James Portnow left as well, being replaced by a group of guest writers, which resulted in a downtick in overall episode quality; nadirs such as "Stop Normalizing Nazis" and "Evil Races are Bad Game Design", both of which espoused immensely controversial viewpoints regarding culture in video games, were released during this period. Not helping matters was the Growing the Beard of its More Popular Spin-Off Extra History, which quickly supplanted Extra Credits in popularity to the point that in 2022, it took over the main channel and relegated Extra Credits to a smaller sub-channel.
  • Matthew Patrick's Game Theory has been accused of being in one, with the most commonly accepted start date being around 2015. Around that time, the theories (especially those related to lore instead of math or science) became increasingly flimsy and poorly-researched, particularly the infamous "Sans is Ness", relying more on speculation than on hard evidence or Word of God. There was also an overabundance of videos dedicated to Super Mario Bros. and particularly Five Nights at Freddy's (which, admittedly, boosted his visibility enough to land him a cameo appearance in Five Nights at Freddy's (2023)), especially considering that several videos were promoted as being his final theory on FNAF, only for MatPat to repeatedly go back on his word and make another video. While a shift towards acknowledging sources has managed to win back some disaffected viewers, with the videos speculating on the plot of Minecraft being regarded as a major highlight, there is still criticism of the over-exposure of FNAF and other Mascot Horror games on the channel.
  • Having enjoyed its heyday at the beginning of The New '10s, the whole genre of Object Shows is agreed to have entered one starting from the mid-2010s, with a wide range of factors being given: the decline of Total Drama, which the genre was born to spoof; the increasingly long Schedule Slip of high-profile object shows (most notoriously, a whole decade between the release of successive episodes of Battle for Dream Island Again); a flood of low-quality object shows that soured potential viewers on the genre; several promising shows such as Object Overload and Object Havoc ending up in Development Hell and/or being Quietly Cancelled; allegations of misconduct against creators (such as Battle for Object Destination creator Maxwell Hall being accused of grooming a minor, and After Schooligans creator LorenTzel being accused of racism at the 2022 BFDI + II Meetup); and so on.

    Other 
  • Math education in the U.S. (and, to a lesser extent, Europe and Japan) went through an Audience-Alienating Era in the 1960s with the "New Math" format. It involved teaching students advanced topics like boolean algebra, bases other than 10, abstract algebra, and set theory from an early age instead of emphasizing memorization, word problems, and, well, actual numbers (for what the "bases other than 10" drills looked like, see the short sketch below, after the Common Core entry). This is the equivalent of a Japanese teacher jumping straight into kanji before teaching their students kana or words as basic as "watashi". The idea was to create a generation of engineers, scientists, and mathematical thinkers capable of competing with the USSR, but the results were disastrous, because the self-styled geniuses who thought of this forgot that you still have to teach the basics, especially to children living in a world that still required manual calculation for even the most basic math, in an era before the widespread deployment of electronic calculators and computers. Most children were unable to grasp the concepts because they hadn't learned basic arithmetic like the multiplication table firstnote . Most teachers didn't fully understand what it was they were teaching, and parents were unable to help their children because they weren't taught that waynote . The subjects taught weren't even that useful for their intended engineering/physical science purpose. While in hindsight the New Math initiative might seem ahead of its time, as many of its ideas form the bedrock of modern computer science, that wasn't much help to most students in an industrial and mechanical era that still demanded industrial and clerical workers versed in arithmetic and engineers versed in algebra and calculus, when the only way to perform these was through arduous pen-and-paper computation. An entire generation came away thinking of math as even less useful and relevant than children usually do. Today, New Math is remembered as an utter catastrophe of misguided education reform Gone Horribly Wrong. note 
    • Similarly, Common Core education (math in particular) from The New '10s has gotten a poor reputation amongst educators, parents, and politicians. Created in 2009 under the Obama administration in response to the nation's overall poor educational performance compared to other developed countries, Common Core saw its reputation grow more and more strained as the decade rolled on. In fact, in spite of its intentions, Common Core actually ended up causing students in grades K-12 to perform even worse than they did before. As with New Math, advocates of traditional mathematics criticized Common Core's de-emphasis of manual calculation, though Common Core proponents argued that problem-solving and intuition were more important than rote memorization of arithmetic rules in an era of ubiquitous computers and calculators.
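    For readers who never met New Math in the wild: the "bases other than 10" lessons mentioned above were essentially positional-notation drills. Below is a minimal sketch of one such drill, written in Python purely for illustration (the function name and the base-8 worked example are this page's own invention; actual New Math students, of course, did this with pencil and paper):

        # Rewrite an ordinary base-10 number in another base, the way a
        # New Math workbook would ask a grade-schooler to do by hand.
        def to_base(n: int, base: int) -> str:
            """Return the digits of non-negative n written in `base` (2-10)."""
            if not 2 <= base <= 10:
                raise ValueError("base must be between 2 and 10")
            if n == 0:
                return "0"
            digits = []
            while n > 0:
                digits.append(str(n % base))  # each remainder is the next digit, right to left
                n //= base
            return "".join(reversed(digits))

        print(to_base(13, 8))  # -> "15", i.e. one eight plus five ones

    The point of such exercises was to teach why positional notation works rather than drill the base-10 multiplication table, which is exactly what tripped up students who hadn't learned that table in the first place.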
  • In the UK, a quiet and largely overlooked audience-alienating era is taking place, thanks to secondary schools (the equivalent of Junior High & High School education, for our US readers) railroading students into university and refusing to accept the idea that some people are better suited for apprenticeships or manual work, all to chase higher league table scores. As a result, institutions of higher education, such as A-Level exam boards and universities, have been steadily inflating grades; at the same time, this is producing a crisis in which hundreds of thousands of new students receive high grades for things they're either not suited for or actually didn't do very well at in the first place.
  • Rail travel underwent a serious audience-alienating era in most of the West between (roughly) the 1950s and the advent of High Speed Rail. The decline was precipitated by the rise of private automobile ownership and air travel, leading to the abandonment of many lines as well as the bankruptcy of several private railroads. Some railroads tried their best to counteract the trend, but of course this was not always successful, and some of the attempts to update the designs were about as successful as New Coke. However, with the rise in gas prices as well as newer, faster services such as the Shinkansen (Japan, 1960s), the TGV (France, 1980s), and Deutsche Bahn's ICE (1990s), rail travel recovered and even managed to put a dent in the numbers of air travel along short routes. Even the much-mocked Amtrak of the United States, which was formed in the 1970s in order to keep the private railroads from collapsing under the weight of money-losing passenger services, has increased ridership by over 50% since 2000 and carries more passengers along the Acela corridor (Boston-New York-Washington) than all airlines combined. Unfortunately, a combination of privatization and persistent Westminster neglect of areas outside Greater London has prevented British rail travel from escaping its audience-alienating era, and to this day it's infamously inferior to and more expensive than mainland European rail travel.note 
  • The Fender Stratocaster is one of the greatest electric guitars of all time. That wasn't the case during The '70s, however. When Leo Fender sold the Fender Electric Instrument Company to CBS in 1965, their guitars saw a steady decrease in quality, hitting a low point in the '70s: the switch from a four-bolt to a three-bolt neck joint made the necks less stable, the new bullet truss rod was ugly and intrusive, the guitars got heavier due to a change in wood (from southern swamp ash to heavier northern ash), the pickups sounded inferior, and the overall build quality of the era was poor. By the early '80s, CBS had run Fender into the ground. Thankfully, a group of employees bought the company back from CBS, undid the poorly received changes (except the 5-way pickup selector switch, the only positively received change from the era), and refocused the company on building quality guitars. They were able to get the company back on track, and Fender remains one of the biggest electric guitar manufacturers in the world.
  • Gibson, like Fender, also saw an audience-alienating era in the '70s, though it's debatable whether they've ever truly recovered:
    • In 1969, Gibson's parent company, Chicago Musical Instruments, was bought out by ECL. This led to a decrease in the quality of their products, most notably the Les Paul. The already weighty Les Paul got even heavier in this era when the neck material was changed to maple, and Gibson began cutting corners to reduce manufacturing costs at the expense of quality. While Les Pauls of the '50s and '60s were made from a single mahogany slab with a maple cap, the ECL-era Les Pauls were made of two mahogany slabs sandwiched together. They were also criticized for their poorer electronics and shoddy quality control. Gibson was on the verge of bankruptcy when it was bought in 1986 by Henry E. Juszkiewicz, who was able to turn the company around.
    • Unfortunately, it didn't last. Throughout the 2000s and 2010s, Gibson's QC declined once again, and they began introducing unpopular features such as a zero fret, robot tuners, and DIP switches. All of this was compounded by the $2,500+ price tags of their instruments despite their inferior quality compared to competitors like Fender and PRS, and even Epiphone, Gibson's own beginner brand. Worse, Gibson made several poorly thought-out acquisitions at this time, buying up companies such as Onkyo and TEAC. Combine that with Gibson's perceived inability to market to millennial musicians (leading to the Les Paul's unwanted reputation as the quintessential "dad guitar" for dads who don't even play guitar), in contrast to Fender, whose guitars became associated with indie rock thanks to the company marketing to the young and hip crowd, including women and minority groups, and Gibson was reduced to a walking punchline among the guitar-playing community.
    • In 2018, Gibson filed for Chapter 11 bankruptcy. They sold off their acquisitions, and Juszkiewicz stepped down from his CEO position. When they emerged from bankruptcy, they streamlined their guitar line and redesigned the Les Paul Standard (now the Les Paul Modern), doing away with all of the unpopular and much-ridiculed features while maintaining the features that players did like (such as the weight-relieved body, contoured neck heel, slimmer neck profile, and push-pull knobs). In addition, they introduced several lower-priced guitar models. However, it remains up for debate whether Gibson's QC has improved enough to justify their $1,500+ price tags. There was also "Play Authentic", a video posted to their YouTube channel that served as a thinly-veiled attack against guitar makers building Gibson clones. The video drew negative comparisons to, once again, Fender, who not only doesn't go after anyone building Fender clones (mostly because the only part of the guitar they hold a trademark on is the headstock) but has even licensed their brand out to guitar part manufacturers such as Warmoth. Then, with Gibson's buyout of famous guitar amp manufacturer Mesa/Boogie, people fear that Gibson may already be slipping back into their old ways.
  • The field of biology had two in the 19th century, both of which went above and beyond mere Science Marches On:
    • The first was so-called “natural theology”, whose heyday ran from the start of the century to around 1859, when Charles Darwin published On the Origin of Species and laid out his theory of evolution. In essence, biologists of the time ignored the accepted philosophy of science and tried to prove God’s existence through nature, despite God by definition being a supernatural force that exists beyond the boundaries of natural law. The standard of argument fell significantly, with claims that had been refuted decades earlier going all but unquestioned. Where natural theology was popular, hardly any new discoveries were made, and modern scientists, even those who believe in God, look back on it as an embarrassment.
    • On the other hand, the "eclipse of Darwinism", from about Darwin's death in 1882 through the emergence of the "modern synthesis" in the 1930s and '40s, was a good deal worse. By this time, scientists generally accepted that evolution was real, but natural selection, the mechanism Darwin proposed for how it worked, had not yet won out, and so pseudoscientific mechanisms of evolution that failed to qualify as science even at the time became the standard. This is the origin of almost all the popular misconceptions about evolution, such as evolution working towards a set goal, modern animals evolving from other modern animals, humans being the end point of evolution, and evolution working on individuals instead of populations. And the less said about the "scientific racism" popular during this period (which, despite the name, never followed scientific rigor), the better; there's a reason why implying that races are objective biological categories has become a Berserk Button among biologists and anthropologists. Worse, these misconceptions leaked out of the academy and into the broader public, leading to the rise of eugenics and social Darwinism as mainstream movements and ideologies. Franz Boas, considered the father of modern anthropology, made his name critiquing and dismantling the racist and ableist pseudoscience that flowed unchecked as "accepted wisdom" among biologists.
  • Creepypastas are widely agreed to have entered one shortly after their golden age at the start of The New '10s, with the most common starting point being May 2014 — the month of an attempted murder that was the first in a series of incidents that ultimately brought down one of the creepypasta scene's biggest mascots, Slender Man. This was quickly followed by the downfall of the "teenage murderer", "evil video game", and "disturbing lost episode" genres popularized by Jeff the Killer, Sonic.exe, and Squidward's Suicide respectively, due to a flood of imitators that were increasingly criticized for being thinly-veiled revenge fantasies, for overusing the same tropes to the point that the original stories were indistinguishable from their derivatives, and/or for gratuitous violent content that eventually desensitized audiences, all of which drove potential readers towards more highbrow online horror stories. A few creepypastas are still being written, but the scene is far more niche than it used to be.
