Useful Notes / PC vs. Console

A major source of Internet Backdraft, the PC vs. Console wars pit fans of both platforms against each other in battles of nerd rage on forums all over the internets. As with the Console Wars, fans of each side argue over which platform is better for gaming.

  • PC gamers usually cite the computer's modding abilities, versatility and utility, keyboard/mouse control along with the ability to use just about every control scheme you can think of, cheaper games, better graphical capabilities, openness to indie games, free online play, and sheer practicality: ever since the late '90s, the PC has turned from an optional luxury into a necessity for modern life. It is also usually cheaper to build a very powerful gaming PC yourself (especially if the basic PC you already have for homework or job-hunting is a desktop that can be upgraded), although pre-built gaming PCs are another story.

  • Console gamers cite ease of use, the "plug in and play" nature of consoles, simple (and sometimes unusual) control schemes with the controller, game stability, uniform hardware that eliminates concern over technical specs, and easier local multiplayer, especially split-screen. They may also cite the ability to resell and buy used games, though that is itself a very controversial issue; let's not get into the Internet Backdraft on that subject on this page.

Naturally, this results in many a Flame War on the web. According to PC gamers, consoles are holding back the development of gaming due to outdated hardware, while console gamers are either immature adolescents or obnoxious frat boys screaming obscenities and racial slurs into the microphone, too stupid and/or lazy to learn how to use a computer, unwilling to play any game not Rated M for Money, and likely unable to properly type their own name (at least, not without framing it in Xs and adding "420" at the end). According to console gamers, PC gamers are elitist nerds with no life who need to get laid, pour hundreds if not thousands of dollars into the latest hardware that will be outdated in two years, and consider themselves the glorious master race despite living in their mom's basement. Unfortunately, magazines only reinforce these stereotypes, leading gamers who play both platforms (or even just one) to yell "Stop Being Stereotypical" every time they read the next issue of their gaming magazines.


One thing that's almost never mentioned is the developer's point of view. Consoles are easier to develop for because every single unit of a given console has (or should have; hard drive size will vary) the exact same hardware and firmware; it's much easier to tailor the game to the platform, and to push the platform to its limits. Meanwhile, the PC world doesn't have standardized hardware; you might be running any of three major operating systems, a graphics card from either of two major manufacturers, a CPU from either of two major manufacturers, and God only knows how much hard drive space and RAM. And to be popular, your game needs to run on as many of these configurations as possible. Part of the reason that games like Jurassic Park: Trespasser and Ultima IX flopped was that most computers could not run them; likewise, part of the lasting charm of games like League of Legends, World of Warcraft, Sins of a Solar Empire, and pretty much all indie games in general is that you don't have to upgrade your computer to run them. To further complicate matters, the PC platform is constantly evolving; nobody is able to stay "on top". PC gamers long ago ceased boasting about their rig's strength, since nobody can afford the most advanced hardware except game developers themselves. Even buying a dedicated gaming PC can be a lot more expensive than buying one gaming console. PC gamers are actually more likely to applaud a game for making the best use of an older set-up than to boast about the strength of their personal rig, because they could be buying a new one next year.


Long story short, it's easier to make a game that "a PS3" can run than one that "computers" can; and in the 2010s, when AAA games have Hollywood-movie-level budgets and expectations invested in them, lower risk and fewer variables make everyone happy.

On the other hand, while console platforms are easier to tailor a game for, PCs are easier for small indie studios to publish games on, with a wide variety of distribution options and technologies available, cheap or free open-source game engines and SDKs to work with, and no platform licensing fees. The digital distribution model is also more amenable to the smaller, cheaper games that indie teams are capable of creating, and these smaller games tend to run well on the lower-powered laptops and tablets that are increasingly favored as new hardware purchases. Consoles have been fighting to close this gap over the last decade, however, with markets such as Xbox Live Arcade for Microsoft's systems, the PlayStation Network for Sony's systems, and the Virtual Console for Nintendo's systems.

Another thing that commonly pops up is the issue of cross-generational compatibility. Consoles are rather notorious for requiring one to buy completely new hardware to play the next generation's games, and you can rarely use your new system to play games from the previous generation. (The backwards compatibility of the PlayStation 2 and Game Boy Advance was touted as a selling point for those systems, while the removal of backwards compatibility from later models of the PlayStation 3 caused a lot of cries that the system had been ruined.) For PCs, the issue is less about generations and more about whether a given game will run faster than a slug on barbiturates, since gamers' systems vary widely in power. Compatibility problems with older games have become more visible in recent years, but they go back as far as the '90s. Some games shipped on obsolete formats such as floppy disks, and even if you could buy a CD version, it might not run without glitches. That is, if you can find them at all: some games you just flat-out can't buy anymore for various reasons, and even if you could get them, you'd have to use an emulator or fan-made patches so they would actually run and not look really, really weird at modern resolutions, since they weren't made with the computers of 20 years later in mind. To remedy the problems of backwards compatibility, as well as availability, companies have put up "virtual console" or digital re-release versions, and the PC has DOS emulators of its own.

Until it became more common than not to have a console almost always connected to the Internet, PC games had the advantage and disadvantage of patching. Patches for PC games can often add new content and fix game-breaking bugs, as well as other issues that slipped past beta testing. The disadvantage of patching is that, for some reason, developers seem to use patches as an excuse to release games half-completed, using the consumers as testers to find issues for them to patch. By no means is this exclusive to the PC platform; it's become pretty much standard for games to require a couple of patches because they're rarely without a couple of game-breaking bugs fresh out of the box (unless there's an Updated Re-release, like a "Game of the Year" edition, Blizzard's Battle Chests, bundle games, etc.). Nowadays there are plenty of clients that automatically patch the game for you, which makes this less of a hassle. Bottom line: if you buy a AAA PC game on release, most of the time you can expect to have purchased an invite to the late beta. As consoles gain more continuous access to the Internet, though, they are increasingly likely to see patches of their own, with all the pros and cons that come with them. Because of the proprietary networks that publishers have to go through to distribute software to consoles, however, patches may take much longer to reach console versions than PC versions, and console versions may lag behind or simply never get patched at all.

It should also be noted that some genres just naturally fit certain platforms better. Real-Time Strategy and other real-time simulations are accepted by most people to be PC-only territory, due to the difficulty of attempting to "click and drag" with a joystick and the wider degree of selection and multitasking offered by a mouse and keyboard (StarCraft is the most-played RTS in history, but its Nintendo 64 port was a flop), and trying to do an MMORPG on a console is probably suicidal (Final Fantasy XI and Final Fantasy XIV have been the only console MMOs with financial success, and the latter started PC-only). Meanwhile, fighting games belong in Console Country, since those games are designed for local multiplayer, which video arcades have been offering since the '80s but which PCs only managed around 2006, once HDTV sets became affordable. Today, the major battleground is the shooter genres (be it first- or third-person); wars have been fought, only some of them digital, over whether a game's console version or PC version was better. Initially, PCs had the edge, due to the awkwardness of gamepad controls in a shooting environment and the lack of Internet multiplayer, but then dual analog sticks, GoldenEye, and Halo 2 came along and collectively made those things work on a console, and from that day forward all bets were off.

Nowadays, consoles and PCs are both powerful gaming machines, capable of online gaming and impressive visual effects. It is becoming unusual to see games exclusive to a single platform; releasing a game not only on the PC but on multiple consoles is typically where the money is. This brings us to yet another set of pitfalls: "porting" a game from one system to another. Simply put, it's so easy to do this badly that we have an entire trope for it: the Porting Disaster. The PC had it so bad that there's even an entire wiki dedicated to fixing these problems, one that's still going strong today.

When it comes to porting a game over from one camp to the other, things can get hairy if the port job is half-assed, or if the game in question was never meant to be on the other side. This is especially common with Japanese-developed games, since console gaming displaced PC gaming early on in Japan (where in the 1980s the MSX contended with the Famicom) and ports of those games are sometimes outsourced to Western development teams. Usually, though, the PC ends up taking the brunt of sloppy porting jobs, as many games are designed for controller inputs rather than mouse-and-keyboard. To PC gamers, this is known as "consolitis": the claim that developers are dumbing down their favorite game series for the console crowd. If a long-running PC franchise goes multi-platform, the console release often gets blamed for any unpopular changes to the game, particularly those which result in simplification or loss of options.

Hardware-wise, the relentless drive of PC component manufacturers to outdo each other results in performance advancements that rapidly outstrip those of consoles, whose specifications remain static for their entire lifetimes. This forces developers to compromise the console port in ways that degrade the quality of the gameplay experience (such as less-detailed graphics, smaller levels, or Loads and Loads of Loading). Expect an enormous backlash if this is ever suspected of compromising the PC version itself.

On the other side of the fence, when PC games are ported to consoles, things don't always go well either. Control-wise, there are more buttons on a keyboard than on a controller, and it's almost impossible to translate the speed and precision of a mouse to a pair of analog sticks. As a result, games with a wide range of actions or those requiring quick and accurate pointing don't go over so well on consoles.

It's worth noting that when it comes to controlling a port, consoles can get the short end of the stick, too. Not only are there perfectly adequate joypads and controllers available for the PC, but some PC gamers simply come to prefer the accuracy of a mouse and keyboard. Gamers who abandon the joypad even for beat-'em-ups and platformers are sometimes known as "The WASD Gamer".

As for market share, it is not so easy to measure, unlike in the Console Wars. While it is fairly simple to measure sales for consoles and their games, since sales of console games are tied to sales of the consoles themselves, it is much more difficult to do so for PC games, since there are millions of PCs in the world that have never had a game installed on them. And that's only counting mainstream PC games: it could be argued that the millions of FarmVille players are PC gamers as well (though if you did, you might Go Mad from the Revelation). There is also the issue of piracy on the PC side. Developers usually prefer to focus on the console side because it's significantly harder to pirate console games.

One more (and perhaps unrelated) thing to consider is the advent of emulation. If accepted for the sake of argument, this can easily put the PC over any console it has the power to imitate. However, there are proprietary, technical, and legal challenges when it comes to emulation, such that it is difficult to find working PC emulators for consoles less than a decade old. The PC can even emulate mobile devices to run their apps, and can be used to play freemium and microtransaction-heavy games for free, which is technically a criminal act, though some would say the prices charged by their makers are criminal too.

Chronology and Notable Events

1977 to 1984

The gaming industry as a whole developed largely in the 1970s with the creation of the video arcade. Throughout the decade, technology evolved that would allow these arcade machines to be adapted to the home environment, thus creating the gaming market we know and love. Home computers and video game consoles had their biggest jump in 1977, when the release of the Atari 2600 brought an affordable and easy-to-use home game console to the masses, while the releases of the Apple II and the Commodore PET that same year also caused a massive leap for the home computer industry. At first, the two were not really competitors, as the dedicated hardware of the Atari 2600 made it first and foremost a games console, while the aptly named microcomputers did not see significant game production. Computer gaming was still largely a novelty, with small titles churned out by one-man development teams hoping to use the games as an opportunity to learn some programming skills. Microcomputers still largely remained the realm of word processing and business until 1982 and the arrival of the Commodore 64. The C64 was first and foremost a gaming computer, although Commodore still marketed its more practical uses to families and businesses who were skeptical of gaming. The C64 sold like hotcakes because it had immense technical specs for the time, proved quite versatile, and undercut its rivals on price. It is still ranked as the best-selling single computer model of all time, with estimates ranging from roughly 12 to 17 million units sold.

The C64 would dominate the North American market, while the home-grown ZX Spectrum would provide stiff competition for it in the UK. Despite the Spectrum never gaining much popularity outside of Western Europe, it would still leave an important legacy on the gaming industry, as its ubiquity in British households is one major reason why PC gaming continues to dominate in that country compared to consoles.

1983 would be a pivotal year in gaming, however, due to The Great Video Game Crash of 1983. Over-saturation of the market and declining game quality resulted in poor sales for gaming companies as a whole, with Atari taking the biggest hit. The C64 was hit hard by this, as it primarily billed itself as a gaming computer, and the crash had induced a lot of skepticism toward gaming in the general public. Many people considered gaming to be nothing more than a passing trend, and the 1983 crash seemed to prove their suspicions, causing a lack of investment in the industry in general. Gaming was still largely geared towards younger audiences who lacked financial independence and were reliant on their parents for purchases, and the parents refused to buy into the video game "fad".

However, 1984 would result in a significant shift in the industry, and in the PC vs. Console battle as well.

1984 to roughly 1992

The release of the Nintendo Entertainment System (launched in Japan as the Famicom in 1983, and in North America in 1985) sent shockwaves through the industry. While American companies floundered as a result of the crash, Japanese firms saw an opportunity to break into the market. North America very quickly became a battleground between Japanese firms, with the rest of the world following suit, but it wouldn't be until the next console generation that Nintendo would get stiff competition from Sega. Sega had initially tried to challenge Nintendo's veritable monopoly with the Master System in 1986, but it underperformed in sales, especially in North America. From roughly 1984 to 1988, Nintendo had total domination over the console market, with American firms failing to produce any good consoles. This was partly due to Nintendo's highly anticompetitive practice of forcing developers into long exclusivity contracts for its systems, and partly because of the horrific mismanagement and greed that plagued the American console market.

The C64 survived this period largely because it had already become a common fixture in millions of American households, schools, and offices. Commodore kept sales up primarily by repeatedly cutting the C64's price, but throughout the 1980s consoles largely dominated the gaming market, and Commodore itself would struggle in the following decade before eventually going under in 1994. The C64 and other PCs became more of a niche for those who were really into the homebrew scene and wanted to program their own games, as opposed to those who just wanted to plug in and play. They also faced competition from each other: Commodore's founder Jack Tramiel left the company and went on to purchase and reorganize Atari's consumer division, using it to launch the Atari ST, while Commodore answered with the Amiga; both were capable gaming machines that saw only limited success. On the more technical side, IBM's computers provided greater capabilities at a significant cost, making them a luxury only for the geekiest (and wealthiest) of PC gaming nerds.

Console games mostly stayed on consoles, but with few exceptions, every popular arcade or computer game was ported to almost every platform available, even though the common home platforms had widely varying processing power and graphical capabilities, and porting a game to a system with a different Central Processing Unit would likely mean hiring another programmer to recode it from scratch. Still, most computer games were simple in design with relatively unsophisticated 2D graphics, and even arcade driving games built for state-of-the-art dedicated hardware were often ported to far less capable home computers. The gaming audience was diverse enough that some owned an Amiga, some an IBM PC, and some a PC-88; you wanted to make your games playable on all of them if you wanted a profit. The IBM-compatible PC architecture which would come to dominate computer gaming in the next decade was relatively weak in the '80s, compared to both consoles and other home computers.

1992 to 2002

The 1990s started off with consoles still clearly dominating. What's difficult to believe today is that, early in the decade, PCs simply were not good enough for gaming. Graphics card manufacturers were locked in a battle comparable to the Console Wars of ten years previous, and even the most basic sound card was an extra expense. As a result, PCs were still widely considered business machines.

Things would gradually change midway through the 1990s for a variety of reasons. The PC market homogenized as the list of competitors grew smaller and smaller, and the IBM-compatible architecture came to claim the lion's share of the market, a position it holds to this day. Operating systems were becoming more user-friendly, with mouse control having become standard over the course of the 1980s and most operating systems adopting a GUI (Graphical User Interface) instead of the old text-based model. This simplified PCs and made them far more accessible to the average person, while still allowing them to retain their complexity underneath the surface, making them the go-to platform for independent developers and hobbyists. Developing for the PC required little more than patience, with no expensive contracts or licensing fees. This allowed hundreds of small teams (often no more than three or four friends) to make entire games in their spare time on commercially available PCs. This helped propel the PC market forward when smaller developers broke into the "shareware" market, where demo versions would be shipped to someone's house or passed around in hobby shops, and upon playing, the consumer would decide whether they wanted to pay for the full product. This helped push developers like id Software to the top of the heap, while giving PC gamers a larger variety of options compared to their console brethren. Consoles still dominated in polish and were the prime medium for big-budget AAA releases, but the PC steadily regained its footing.

During the second half of the decade, the jump to 3D happened, bringing to households something that was previously found only in arcades and massive rendering facilities. PC hardware manufacturers initially dismissed the idea of the PC as a serious competitor to 3D gaming consoles, but games like Quake III showed them the potential. This era is also the birth of the stereotypes: PC gamers had to tinker with their rigs to get performance that could catch up to the PS1, N64, and the like, and they were largely derisive of the kids who simply "did not earn their fun". The console gamers retaliated by painting PC gamers as insufferable nerds with no social life.

By the end of the 1990s, things were gradually moving back in the PC's favor. It was the obvious choice for many genres, such as FPS and strategy titles that just did not handle well on consoles, and by the end of the decade PC hardware had caught up with (and even outclassed) console hardware. Still, in terms of overall sales and ubiquity, consoles would continue to surpass PCs for quite some time.

Turn of the Millennium

PCs and consoles would continue to battle it out throughout the early 2000s, with consoles generally remaining top dog. The release of the incredibly affordable (if underpowered) PlayStation 2 gave households what was at the time the cheapest and easiest way to play games, causing it to dominate that console generation in overall sales. Consoles in general were far more affordable than PCs. While PCs could manage some truly impressive things (with 2007's Crysis being one of the most visually stunning titles ever released, absolutely flooring people when it came out), such machines became prohibitively expensive, reinforcing the stereotype of PC users as little more than elitist nerds who prized stunning visuals over actual gameplay.

PCs had also begun supporting online play long before consoles, and although the PlayStation 2 and Xbox would get online capabilities eventually, they came fairly late in those systems' lifespans and didn't really win over any PC gamers. This cemented PC gaming as by and large the go-to platform for multiplayer, especially competitive multiplayer, but console gaming still held dominance in "couch multiplayer", as many PC titles didn't support any form of local multiplayer beyond LAN.

For the first half of the 2000s, consoles would see the bulk of big-budget releases, but multi-platform releases became more or less ubiquitous in this time as developers now had the budgets to support porting things between console and PC. The problem was that most PC ports of this era were sub-par: prone to bugs and glitches, and significantly limited in the amount of options and tweaking given to the player. This was also still the time when one had to manually download patches for games, rather than having them automatically applied through a launcher.

That would gradually change, however, when the release of Steam revolutionized the PC market. Steam launched in 2003 and became mandatory for Half-Life 2 the following year. While initially met with a mixed response, with many finding it a hindrance to playing games, it gradually picked up greater and greater acceptance over the course of the decade. The digital distribution platform allowed distributors to sell games directly to consumers: no packaging, shipping, retail stores, or discs required. Steam had no bias and would sell almost any game, with any rating, to anybody with an internet connection and a credit card.

Indeed, Steam, with its endless supply of low-spec, high-quality indie games, was just the thing the PC needed to support itself through this turbulent time of PC evolution, and its greatest strength was its sales. Because games came straight from the developer to the consumer, the storefront could afford to knock up to 90% off the price. At first these sales occurred mostly during the holiday season, but they soon expanded into summer, autumn, winter, and finally Midweek Madness, selling random games at knock-down prices for a matter of days. With prices so low, gamers could for the first time afford to take risks on games of uncertain quality, and indie developers saw a foothold they would never get in console gaming, expanding the PC's library with games that would never be ported to consoles.

Ultimately, Steam (and its competitors) presented the PC with its greatest advantage over the Console yet: instant access to thousands of games at cheap prices, with no worries about games selling out on the shelves or being asked your age. It dusted off old games and adapted them for faster computers. It made expansions and regular patches easily accessible. As Steam began integrating social features like a friends system, text and voice chat, and in-game invites, it seemed that there was nothing the Console could do that the PC could not also pull off.

For the first time, the war between the Console and the PC seemed to have a clear winner in sight.

But even though Consoles had lost this battle, they still had a few tricks up their collective sleeves...

The New '10s

With the success of Steam, publishers tried getting in on the digital bandwagon. Sony and Microsoft started to sell digital versions of games on their own storefronts, with Sony even creating a version of the PlayStation Portable (the PSP Go) that forwent the UMD drive entirely and relied on downloaded games. EA notably released a competitor to Steam called Origin, which launched to mixed results and controversy. GameStop, traditionally known for its chain of brick-and-mortar gaming stores that mostly cater to the console crowd, also tried to get in on the action, buying the Impulse content delivery platform from Stardock and rebranding it as GameStop PC Downloads. All four also followed in Steam's wake with sales and other perks.

These digital retailers have had a massive impact on the industry, with consoles now offering their own versions, and Microsoft going so far as to run a cross-platform store spanning its Xbox One console and Windows 10 PCs. It is not all positive, however: the ease of patching titles through a digital storefront, combined with skyrocketing development costs and times for the newest, most graphically intensive generation, has resulted in most games being released as obvious betas. Combined with practices pioneered by the tiered release of the hit game Minecraft and the advent of Steam's Early Access program, this leaves little to no incentive for developers to release a finished product; instead they push games out quickly and support them after launch. It often takes a game a year or more of patching to reach a finished, complete state, with many games going through dramatic changes and overhauls in that time.

Additionally, the era saw the advent of the "good enough" computer. Back in the 1990s and early 2000s, a PC just a few years old would often have a great deal of difficulty running new games. Today, five-year-old computers can regularly play top-of-the-line games on fairly high settings without issue. This dramatically drove down the price of getting into PC gaming: nearly everyone has to own a computer anyway, and the price of a console plus a low-end PC is higher than the price of a reasonable gaming PC that will be good for many years' worth of gaming... and gives access to much cheaper games, despite (or arguably because of) the absence of a used game market.

On the other hand, during the 2000s and into the 2010s, the public's PC preferences shifted from desktops to laptops. While today's laptops can squeeze more than enough power for basic PC functions like word processing, Internet browsing, and email into small and light packages, many of PC gaming's advantages apply only to mid-to-high-end desktops. Indeed, a lot of laptops, especially smaller ones, have trouble running 3D games at all thanks to low-end integrated graphics. Many people, because of their job and/or lifestyle, prefer or even need to use a laptop for their PC needs and would balk at the idea of keeping a bulky and expensive second PC in the form of a desktop, preferring to game on cheaper, smaller, and simpler consoles.

The decade gradually saw the erosion of the entire PC vs. Console dichotomy. PC ports were typically treated with more care than before, while the advent of services like Xbox Live and PlayStation Plus made formerly PC-exclusive features like automatic patching, downloadable content, online play, and eventually even modding accessible to console gamers, helping to bridge the gap. The biggest difference of the New '10s mainly came down to visuals, with PCs often being able to run games at a much higher resolution than consoles and with fancier graphics options, but typically at a price. However, the latest console generation would bring consoles much closer to graphical parity with their PC cousins. It is now typical for many households to have both a gaming PC and consoles, with the latter giving them access to console-exclusive titles.

Soon into the decade, there were rumors about Sony's upcoming PlayStation 4 (then codenamed Orbis) and Microsoft's Xbox One (codenamed Durango). Rumors heavily pointed to the x86 architecture and other components found in PCs, which led to the following conclusions:

  • The new generation of consoles is, for all intents and purposes, a set of specialized gaming PCs, abandoning the unique hardware architectures (Cell, Emotion Engine) that could cause trouble for developers (as the Cell did for the PS3 at the start of its life) and raise already high costs.
  • On raw numbers alone, the new consoles are noticeably weaker than contemporary mid-range PCs, whereas at the start of the previous generation you needed a top-of-the-line PC just to have a slight advantage.
  • When rumors surfaced that the new consoles would be built around AMD APUs, it didn't help matters, since the PC gaming community considers those chips low-tier.
  • The announcement of the PlayStation 4's specs caused many PC gamers to scoff, stating that the console would quickly be outclassed by a more powerful PC. It also did not help that developer Linus Blomberg openly stated that the PlayStation 4 would outperform most PCs for years.

The middle of the decade saw the release of the first modern consumer Virtual Reality headsets, with the Oculus Rift and HTC Vive both shipping in 2016. While they were expected to make a huge splash in the industry and upset the PC vs. Console balance through their intense hardware requirements, their release came and went with relatively little fanfare. This is due to two factors: their enormous cost, and the lack of compelling titles. Few companies have put money into major AAA VR titles due to the small market for them, which creates a chicken-and-egg problem: big developers won't make games for VR because there isn't enough demand, but there isn't enough demand because most people won't purchase a VR headset due to the lack of AAA titles. Right now, some of the biggest draws to VR are adaptations of already existing games to the platform, rather than dedicated VR releases.

2015 and onward also saw an increasing number of games announced as exclusive to one console plus PC, most commonly a timed PS4 exclusive followed by a later PC release, even when a console maker funds the game, as in the case of Street Fighter V. Possible explanations range from the PC regaining enough ground to be relevant again, to the rising cost of game development making it necessary to put a game on as many platforms as possible without technically breaking "exclusivity", to Sony and Microsoft simply not caring as long as the game does not appear on the competitor's console.

Most notable about the New '10s is the rise of a third platform: mobile gaming. Typically maligned by PC and console gamers alike for being "casual" and prone to scummy, anti-consumer practices, the mobile gaming market has catapulted into being the largest of them all in raw numbers. Mobile gaming has significant advantages in accessibility: nearly everyone in the developed world (and many people in the developing world) has a smartphone nowadays, with many completely phasing laptops and desktops out of their lives in favor of much smaller, more user-friendly, and cheaper phones and tablets. Mobile games also reached markets that were traditionally untouchable for the gaming industry, namely older people who viewed gaming as something "for children" or who attached a stigma to mainstream gaming due to its often violent and immature nature. These people found a haven in mobile gaming, with games that are meant to be played in short bursts for entertainment rather than for hours on end.

Mobile gaming significantly upset the balance of the market, because its massive audience and cheap development costs mean that it makes far more money than mainstream gaming. However, mobile developers are prone to scummy practices, such as the much-maligned microtransaction and premium-currency models. These allow games to be "free" to download and play, but actually investing time and excelling at them requires paying in a little money. These transactions are often made small enough that they are done without a second thought, and many mobile games are designed entirely around them, intending to "funnel" the player into a purchase by hooking them with easy early levels and then rapidly ramping up the difficulty or time commitment. These practices began bleeding into the mainstream gaming market on PCs and consoles alike, with microtransactions, premium currencies, and "loot boxes" becoming a prominent sight. At first this was largely restricted to games using the same free-to-play model as their mobile cousins, but eventually such things found their way into full-priced games. Loot boxes, thankfully, became less of an issue around 2018, when the Belgian government officially deemed them a form of gambling; titles containing them were essentially banned in that country, forcing the publishers responsible to remove them, with every expectation that other countries would eventually follow suit.

The future of the rivalry isn't certain, but the distinction between PC and console generally seems to be fading. The differences between the hardware are increasingly small, leading many to accuse the latest console generation of being little more than gaming PCs with pretensions. The rise of streaming services and competitive e-sports has also played a significant role in revitalizing the PC as a go-to market for gamers, with more and more casual audiences being pulled into PC gaming through interest in these activities. Things seem to be reaching a sort of singularity in which PC and consoles effectively become one market rather than two competing markets; given the volume of multi-platform releases, we are arguably already halfway there.

