Wait, this page doesn't link to all the Video Games with Headscratchers pages. Why is that?
They're in Headscratchers/Games along with the tabletop ones.
Why is it called "Video Games"? Video games have nothing to do with video itself, so why are they called that?
Well, they would be pretty useless without a "video screen," wouldn't they?
Mmm... All right... So why aren't they called "TV Games"?
If I'm not mistaken at least some early ones were.
Japan also calls video games "TV games."
Because people use video tapes all the time. And not all games are played on the television. And the home console came into existence after the arcade box.
Bonus points for the above statement: some of the earlier consoles (including those in Japan) actually used video tapes to hold the data for games. Then there's the wonderfully outdated slang term "game tapes."
It's things moving around on a screen. It's probably just easier to think of it as a video of some sort.
They used to be called "computer games" for the longest time, but that seems to have died out now.
That's technically inaccurate too, since some early videogames (like Pong TV units) weren't true computers.
Pong (even the original) was certainly a computer; it just wasn't a general-purpose one. And it may not have been a digital one. But a non-programmable analog computer is still a computer.
Why? Look above. It's to differentiate them from card games and board games. Considering that these games require the use of some form of video display, the name was pretty apt.
On top of that, some board games don't even have a board, the most notable ones being Twister, Yahtzee, Apples to Apples, and Puerto Rico. However, their gameplay and their rules follow normal board game conventions (minus the pieces, moving, and board) and are sold with other board games.
Because English's status as a living language without any official or effective authority (outside of individual organizations) allows for the compounding of words based on individual preference, apparently prevalent enough to combine 'videogame' and (despite what you think) 'boardgame'. 'Playinggame', however, isn't aesthetically pleasing due in part to the clashing letters. Why do you say 'tomorrow' instead of 'to the morrow'? Why do you say "whatever" instead of "what ever"? English is not a language that lends itself to Grammarnazification.
Why not "electronic games"?
Because that term generally refers to hand-held games. (Not like the Gameboy, but like these)
Pretty much no one uses "electronic" and "computer" as synonyms any more, at least not in the US.
It has everything to do with the video itself. That's like saying books have nothing to do with paper.
The first link I got when I Googled for "video definition" produced "of or pertaining to the production of text or graphics on a video display." It's a valid use of the term.
Why has no one made Nintendo Vs. Capcom yet?
Knowing Nintendo, they probably turned it down... for now.
Capcom has tried, and offered Masahiro Sakurai a leading role, with the intention of making a Smash Bros.-like version and a traditional 2-D fighting game version. Sakurai and Nintendo both turned them down. Likely reasons are that Sakurai was at that point working on Super Smash Bros. 4 and that Nintendo does not like doing things traditional and won't trust Capcom to make a 2-D fighting game that can sufficiently break the mold.
Have you noticed how in many JRPGs when you travel on the overworld and cross the edge of the world map, you appear on the other side? In other words, when you go out of the world map from the left side, you appear on the right side. Likewise when you go out from the upper side you appear on the lower side. If you don't think too much about it, it seems to make sense and to be logical: The world's surface is contiguous so you can travel endlessly in one direction. Now think about it a bit more. Take a world map of the Earth and think how it would work there: If you go out the left edge you would appear on the right edge OK, no problem. The left and right edges are connected. But what happens if you go out the upper edge (IOW the north pole)? Do you appear on the lower edge (i.e. the south pole)? No! The upper and lower edges are not connected! (If you go out of the upper edge, you will just appear again on the upper edge, on a different place.) So how exactly does this work on JRPG world maps, again? That's fridge logic for you.
(Answer: The only way for it to work like that is if the planet is actually a torus. The only way for a rectangular area to represent the surface of an object such that both pairs of opposing edges are connected is if the object is toroidal. Hence all JRPG planets are toroidal in shape. This raises the question of how such oddly-shaped planets could form...)
On the other hand, maybe you just walked across the North Pole and started walking south on the other side?
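The torus point above can be sketched in a few lines of code. The map size here is a made-up example, but the wrap rule is exactly the modulo arithmetic the answer describes:

```python
# A toroidal world map: walking off any edge re-enters on the opposite edge.
# WIDTH and HEIGHT are hypothetical map dimensions, not from any real game.
WIDTH, HEIGHT = 256, 256

def wrap(x, y):
    """Wrap a position onto the torus via modular arithmetic."""
    return x % WIDTH, y % HEIGHT

# Off the right edge, back on the left:
assert wrap(WIDTH, 100) == (0, 100)
# Off the top edge, back on the bottom -- torus behavior. On a sphere,
# crossing the north pole would keep you on the same (top) edge of the
# map, just at a longitude shifted by 180 degrees.
assert wrap(100, -1) == (100, HEIGHT - 1)
```

(Python's `%` operator always returns a result with the sign of the divisor, which is why `-1 % 256` lands on `255` instead of `-1`.)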
Developers (especially newer ones) are always thinking in grandiose concepts that more experienced developers have already done, and done better than "New Development Studio X's First Game Where You Fight Big Armies". Even if you're working on an HD 3D game, that shouldn't mean you over-extend your budget and end up crashing the studio! Why is it always big battles, when a small adventure game can get you started with the tech for a fraction of the cost and therefore make easier returns? I understand some studios know this lesson (LittleBigPlanet's Media Molecule, for instance), but it should be common sense. Then again, I'm not a developer. Thoughts?
They are developers, but they are neither businesspeople nor economists. These guys assemble a studio because they have an artistic vision in mind. The problem here is that they have so much ambition, they can't wait to make their dream project. Said dream project is usually grand and will thus cost a lot of money, but they will underestimate development costs and time needed. This oversight results in an unfinished but promising game and a lot of broken dreams.
Why (forgive me if this has already been mentioned; this page is too huge to read every word) would game developers want their games to be played longer, pad out their games, and the like? Wouldn't it be more lucrative for a customer to pay $60 for a 4-hour game with no replay value whatsoever, so they immediately go out and buy another $60 game, repeat ad nauseam? You'd think RPGs would be less profitable than FPSs, but these days you have all sorts of added stuff in them (Nazi Zombies from one of the Call of Duty games). Wouldn't developers rather remove all that and make players buy the next game?
That's assuming they're willing to shell out $60 for a four-hour game. Plus, like other forms of niche culture, they have to keep the players playing games habitually. If they play a four-hour RPG, beat it, and have to wait a week or two for their next paycheck to get another $60 game, their console's already become a random knickknack below the TV.
Real reason: GameStop. Buying a (console at least) video game anymore is more or less an extended rental. DLC and such can't be resold, since it's locked to a Steam or EA account, and "free" DLC or pre-order bonuses ensure that gamers buy spiffy new shrink-wrapped games instead of crummy old used ones (which, coincidentally, don't provide new royalty checks for the distributor).
You can't even sell a twenty-hour JRPG for $60. A lot of JRPGs have replay value, too, thanks to alternate endings and New Game+ — I feel cheated if I'm "done with" a $60 game after less than seventy hours total.
You can't sell a twenty-hour RPG period, unless it's a handheld title. And even then... the shortest handheld RPG I've ever played was Tales of the Tempest.
Simple economics and consumer behavior. Read about The Great Video Game Crash of 1983, in which many video game companies tried to do exactly what you suggested. Eventually, gamers simply figured out that they were just getting ripped off by the developers with poorly made games, and stopped buying them altogether, almost destroying the entire industry forever.
Achievements. I do have a question - do you think that someone could make a really, really Guide Dang It achievement just to be funny?
They have. 'Sweet Goodbye' in Mirror's Edge is just obtuse.
Why is it when people show off a game that's available only on PS3 and 360, they always show and favor the 360 version?
Follow the Leader may apply if the 360 sells more copies or is held by customers more likely to read the review. This is obvious for console-specific media like Sony Network Magazine (or whatever it might be called).
Most of the time, multi-console games are developed first for the Xbox, since a PS3 port from the Xbox is much easier than porting the other way around. This isn't always the case, though: Sonic and Sega All-Stars Racing was originally developed for the PS3.
Strangely, only the Wii and 360 versions of the game got their own console-specific characters, but the PS3 gets squat. (To say nothing of the DS version.)
So nowadays (well, for years now) they've stuck a warning about the risk of epileptic episodes on games. That's good; light-pattern-triggered epilepsy is a terrible thing. But why do only games get that treatment? I saw a warning like that in Shadow of the Colossus HD for the PS3, a game that is not known for blinking lights and sudden light pattern changes. In terms of light intensity, it's tamer than most action movies and animations! How come only games get warning signs?
Not sure, but you never know if it'll happen.
Yes, better safe than sorry, but why only video games? We know about light-pattern-triggered epilepsy (AFAIK) because of Pokémon — the series, not the game. So... again, why do only video games get the warning?
Actually, it was justified in Space Channel 5 - I mean, look at stage 3. That looks more like the Pokémon seizure event.
Back to the original question: TV shows and the like know exactly what's going to be on screen, so they can more confidently avoid it. Video games have to take into account movement and glitches, which is why they get warnings. That said, I have seen DVDs with warning labels, so it's more "a lot more video games get warnings" than "only video games get warnings."
Because video games are the medium that got in trouble over bright flashing lights causing seizures (even if it was a TV adaptation of a game). If there was ever a TV show or movie that caused something similar and gained some infamy for it, they would start putting those warnings up too.
It's aggravating when you have to unlock each level in a puzzle game, where there's no concept of story, gradual character growth, or competitive play to justify it. If I'm stuck on puzzle #9 I'd like to be able to move on to #10 and come back to it later. Or maybe puzzles #26-50 are based on a game mechanic I don't enjoy, and I want to skip to #51. This is especially aggravating when a puzzle game's first entire set of puzzles is a tutorial and it still won't let me move on to real puzzles until the tutorial's done.
How come I see people praising indie games in SD for having "retro charm", yet the same people turn around and bash certain developers (namely Nintendo) for not making games in HD, or for making games that look "like shit" or "dated"? (Xenoblade and Kingdoms of Amalur come to mind.) Is HD really that cheap to make games in that these devs are just being luddites?
Well, that's just people hating on Nintendo, as usual.
There's a difference between genuine "retro charm" and simply crappy graphics. Think of Mega Man 9. That game was praised for its old-school graphics and charm; that was the intent. However, take Xenoblade. This game is not trying to capture the nostalgia factor; it's just running on Nintendo's inferior power. One of my main complaints about the game is that it doesn't take advantage of the Wii's unique hardware (mainly the Wiimote, unless you use that control scheme, but it adds nothing). If this game were run on the 360 or PS3 (PCs don't count; they're always graphically superior), you'd have the exact same thing, 'cept with superior graphics. I absolutely loved the game, but, to me, that is one of its faults.
And yet Xenoblade's devs didn't have to sell several million copies, because they made their game on a system that was cheaper to develop for... strange...
How does Xenoblade Chronicles "look like shit"? It's on par with anything else on the Wii (and not being in HD doesn't equal bad graphics, otherwise we might as well say everything on sixth-generation consoles is fugly, right?).
Seinfeld Is Unfunny. If it were a Gamecube game, people would say it looks beautiful. Even then, it's "subjective", and I thought the hand-crafted world was a lot more interesting. (I mean, to quote Destructoid... look around Skyrim or most cities in Rockstar games. As beautiful as they are, you can see what I call the "Lego seams", where they clearly just pieced it together out of existing models and blocks.) Plus, I thought Xenoblade in some ways looked better, just because the characters didn't look like walking corpses, a big problem with "realistic" games...
That article mentions games with budgets so big that the studios start laying off staff and closing their doors... yet how come I never hear about this happening to Nintendo's second-party companies? I wonder... is it because they don't need to make games with budgets bigger than those of most of Africa? Or do they just pace themselves so they can better afford a loss?
With the exception of Pokémon and Super Smash Bros., 2nd-party games tend to get pretty low budgets at Nintendo and are made under intense pressure. Games like the Brain Age series, Dillon's Rolling Western, and Pushmo are made under these conditions. This usually makes for a miserable time while the game is being made, but when finished, they tend to turn a profit very soon after release and are usually well-liked.
Related to the above, how many people still don't have HD? Just out of curiosity. (I don't - I can't afford it. I can't even afford to buy new gaming consoles and have to buy them second-hand.)
I'm in the same boat; I have to squint to see subtitles on my 360. Apparently enough people have HD that it's a viable thing to make (definitely enough that I see eight-year-olds with 3DSes), but a good number of people still don't have it, so Nintendo is being nice to the casuals like a good company.
I still mainly run on SD, even though I have an HD television. (The HD television is plugged only into SD hardware, though for reasons unrelated to this discussion.) SD is still very strong, and the used market is hot, at least in the United States. The reason you're seeing the game companies pushing HD is because, well, video games are made by programmers—in other words, fans of cutting-edge technology. Someone who still uses SD is at ideological odds with the people who make video games.
If the latest numbers are to be believed, something like 69% of households have at least one HD television, and that's going to go up as TVs get cheaper.
Part of it is cost; however, there are two groups of people who prefer CRTs over LCDs/plasmas. The first reason is response time: I don't know the exact number, but LCD TVs actually have a small input delay. In most games today it's not too much of a problem, as they're made expecting that delay as well as any online delay (5ms as a base), but these people prefer CRTs because the input response is near instantaneous. The other reason is color: older games have odd coloring on an LCD but come out perfect on a CRT via RGB wiring, as well as the CRT's native filtering. LCDs also force the consoles to stretch the image, which may bother said community. It's a really interesting issue that isn't necessarily known offhand, but it has a detailed history behind it.
Why on earth is Microsoft preferred by most companies over Sony? For example, Call of Duty gets DLC first, some games get no DLC at all for the PS3 while the Xbox gets a lot, and PS3 ports get butchered. What baffles me is how Microsoft is such a Jerkass when it comes to publishing games and with their guidelines and approval. Don't indie companies get sick of Microsoft's BS?
Money. Whenever there's an "Xbox 360 exclusive" DLC, Microsoft has usually put up cash for it, and they've paid in the past for exclusive titles in order to "establish" their console in certain weak areas (Tales of Vesperia comes to mind). For a smaller company in a very fickle gaming market, a guaranteed money deal can be a lifeline.
Pretty much, especially for games that are close to release. Contrary to popular belief, not every game is actually profitable upon release. (You know that game Kingdoms of Amalur that sold like 1.2 million copies within a few days? Impressive... yet they needed to sell around three million before it would start making money.)
Contrary to what a lot of people think, this virtually never happens. If Microsoft spends money on a game, it's almost always as an investor, in the exact same way Sony and Nintendo do the same thing.
The actual reason is because Sony focuses almost entirely on the Japanese market, where DLC only recently started becoming accepted. The reason they were slow to embrace it is because it's a very common practice in Japan for major games to be re-released with new content. It's far more profitable to have someone buy the same game twice at full retail price than to sell the game once and then sell DLC. Those sorts of re-releases tend to do poorly on the American and European markets, which is why expansions and DLC were more important. Since DLC is more important in Microsoft's main market, they have always been much more DLC friendly. Sony only recently started to realize that discouraging developers from making DLC might not be a brilliant strategy.
Something that has been addressed in discussing video game movies and has bugged me for a while now: why do they so drastically change the main female character in video game movies? In Resident Evil, for example, Jill is pretty nice. Maybe not the nicest character in the series, but in the films she's an utter bitch. Same with Wing Commander: Angel is a pretty decent character in the game, but in the film she's pretty horrid. Dead or Alive: Kasumi is perhaps the nicest character in the series, and the movie makes her angry and violent. Why this obsession with making them bad girls? Do the filmmakers actually look at the games at all?
I think they're trying to play up the raging "bad-girl" fetish we Americans apparently all have. In Japan, where the games are made, girls who are sweeter and not aggressive are viewed more positively, but in America, where the films are made, it's all about women's empowerment and such. Having a nice female action movie protagonist is completely taboo. Personally, I can't stand horribly aggressive girls, but I guess some people do...?
I've come to the conclusion that Hollywood just can't write women well, especially video game women. Part of this is because they probably don't play the games, and just get a synopsis or a plot rundown from someone else, and part of it is because they're screwed either way. If a woman is featured in an action movie, she's always either a Distressed Damsel or such an Alpha Bitch that nobody would want to associate with her. If a woman is helpless, the writers get lambasted for portraying women as useless or weak. And the empowered woman bit hasn't quite figured out where it wants to go yet, successful examples of Action Girl aside. Some of that is because a person can get away with a lot more violence spaced out over eight to twenty hours of gameplay than they would in a two-hour movie, but I'm starting to think that Hollywood just sucks at translating characters. The only anti-heroine I've ever seen that made the transition from video games to movies and became more relatable in the process was Lara Croft, and part of that was because the video-game Lara Croft is a borderline psychopath. Thinking about it, I can't come up with one female character that wasn't either The Load, or someone I wouldn't trust with anything even remotely pointy.
Disney is making a concerted effort with this, between Merida and Anna. Fan debates have heated up over who is the better female role model.
Why do the role models have to be female role models and male role models, anyway? Maybe I'm just weird, but I figure a role model doesn't have to be a woman or a man to be one. When I heard Samus was a girl, I thought "...Well, okay" and was confused at how she was a role model specifically for girls... as if boys can't find her to be a role model too.
I have an Xbox 360 and enjoy JRPGs. Luckily, there is no shortage of such games. However, it seems every JRPG that was an exclusive for the 360 soon gets a super-charged Enhanced Remake only for the PS3, which I don't have (games like Eternal Sonata, Star Ocean: The Last Hope, or Tales of Vesperia). Which is stupid for two reasons: 1) if the game sold so well in the first place, don't remake it for a completely different console, and 2) why couldn't they just also release the remake for the 360? Is it just all about money? Is the PS3 crowd a goldmine for RPG players?
Blu-ray holds a lot more data than a DVD. In the case of games like The Last Hope, which add a whole new audio track, that audio takes up a crap-ton of space, and the Xbox version already spans multiple DVDs as it is, so the additional audio would make a DVD release very cost-inefficient.
Microsoft pays for the exclusives (to try to establish their console in Japan), but PS3 versions make up the bulk of Japanese sales.
What's worse is that games like Eternal Sonata and Tales of Vesperia were made for the 360 first, and the latter even made the 360 sell out in Japan. Yet despite all that... it was a freaking beta. The PS3 is indeed a goldmine for RPG players, mostly because that's what the PS1 and the PS2 were known for, so Sony continues the image, and likewise the publishers are drawn to them.
How was Vesperia a "beta"? Granted, I never finished the game, but I saw no glitches in it from what I played.
The PS3 version of Tales of Vesperia contained so much extra content, and there were signs of Dummied Out content in the 360 version that's present in the PS3 version... yeah. The 360 version is about 80% of what they intended: they released a product that was polished, but only about 80% of the whole. Essentially, a lot of people are hurt because it feels like they got a beta version of the game. (Bonus points for the Japanese players who got a 360 just for that game. Given that the game caused the 360 to sell out in Japan... there's gotta be quite a few of those.)
The added content is more "beta" than the 360 release; a lot of it was either pointlessly tacked on extras, or a bad stylistic fit (i.e., anything with Patty). Anyhoo, Sony wants a certain amount of new content in situations like that, so they can flog it as an Updated Re-release.
Why is there no collaboration between western indie game developers and Japanese doujin circles? It would be awesome if such a thing existed.
Not enough time or money to afford English/Japanese/German/French/Spanish/Dutch/Norwegian/Whatever lessons or translators, maybe?
Because "What's this? Scary foreign porn of my characters? Lawsuit time!" comes to mind.
The nature of doujin is a really risky thing outside of Japan. Doujin exists there because of the relaxed attitude most creators have for fan-material that seems to be part of their culture. (Music is an exception, but that's another story.) Compare that to the United States, where, for instance, FOX sent cease-and-desist letters to every fansite of The Simpsons in the mid-90's. Unless it's an original work, and the western indie developer owns the rights, such a collaboration will attract copyright lawyers hungry to sue.
Excuse the Fridge Logic here, but has anyone else noticed this fundamental difference between single-player and multiplayer? I mean, I look around gaming boards, and the single-players pretty much play a lot of new games, whereas multiplayers generally tend to be married to a handful of games, only stopping for a little while before ultimately returning. I know multiplayer games may have more longevity (single-players tend to drop games more easily), but it seems to be easier to tempt a single-player gamer with a new game; you can persuade a multiplayer gamer to try a new one, but they're likely not going to stick around unless there's a mass exodus. Multiplayers always go where the gamers are. And not every advertised "____ killer" actually does it, since almost everyone just goes back to ____ because "there's nobody on there."
You only need yourself to play a single-player game; a multiplayer game requires more general interest. If someone wants to play a multiplayer game and matchmaking takes too long because there's nobody to play with, they'll lose interest. It's like waiting hours or days every time you want to load your single-player save.
Which makes one wonder why devs think single-player games don't sell - if anything, single-players are easier to sell games to: they don't just obsessively play Skyrim for years, they obsessively play it for several months, then move on to the next game that catches their interest.
Why are most hidden achievements/trophies written in the present tense (such as "find this item", "beat the game on this difficulty", etc.) when you've already completed the condition(s) needed to unlock them?
I think that's actually the imperative mood.
Speaking of which, why are most hidden achievements, like beating Chemical Plant Act 2 without falling into the water, well... hidden? Shouldn't that be reserved for plot-specific trophies that would otherwise spoil, like defeating General Azimuth and saving the universe in Ratchet & Clank Future: A Crack in Time?
To make it harder to obtain, theoretically, but it just causes people to look them up online because people hate Trial-and-Error Gameplay that spans the entire game and could potentially be anything (including losing or dying).
So I notice this with every single hyped game that comes out: someone streams it for like 50 hours straight and has the game beaten within a weekend. Almost every single time, there are a bunch of others who do it as well... then say "It's Short, so It Sucks". Huh?
Perhaps those people haven't figured out yet that most players do not stream games for 50 hours straight.
Smugness. They're Jason Foxes who grumble and moan when a game doesn't go on for 600 hours.
Do people not understand what "8-bit" and "16-bit" mean? I see people go around praising games like Starbound for having "8-bit graphics" when those are obviously NOT 8-bit graphics. Similarly, 10 years ago, I used to hear people describe games with sprites (namely Disgaea) as having "SNES sprites". Uh... Disgaea does not have "SNES sprites" - those would not work.
When working in 8-bit or 16-bit graphics, certain graphical styles tend to arise due to the limitations. When people call something that is not 16-bit "16-bit", they are generally referring to that style, not the actual number of bits, or they are just misinformed.
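For what it's worth, the arithmetic behind the labels is simple. This sketch just shows how fast the number of representable values grows with bit count; note that the actual color counts of real consoles depended on their palette hardware, not just raw bit depth:

```python
# Number of distinct values an n-bit quantity can represent.
# Caveat: "8-bit" and "16-bit" console marketing referred to CPU/bus
# width, and real hardware imposed extra palette limits, so these are
# upper bounds on color depth, not what any specific console displayed.
def values_for_bits(bits):
    return 2 ** bits

assert values_for_bits(8) == 256       # 8 bits per pixel: at most 256 colors
assert values_for_bits(16) == 65536    # 16 bits per pixel: "high color"
```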
Why aren't there any local multiplayer RPGs? I understand that you usually want players to follow the same path in an RPG, but many developers have solved multiplayer pathfinding issues before. It would work really well, especially in a turn-based battle system.
Well, there's Final Fantasy Crystal Chronicles. Besides that, I'd guess it's because RPGs are large time investments compared to other games, and unless you are close to at least one person who comes over a lot (or lives with you) and is also into RPGs, you won't get much multiplayer done. Other games with strong multiplayer elements—fighting games, racing games, platformers, sports games, virtual board games, etc.—only last a few minutes per round (except virtual board games, which can tell a full story in one sitting).
Because they'd have to make it something you can't do with just Dungeons & Dragons.
I saw some people saying that Bravely Default was "Taking the 'game' out of 'gameplay'" because you could hit autobattle. Err... have these people never played Shin Megami Tensei, Breath of Fire, Dragon Quest, or well... a lot of old RPGs in general? Auto-battle has been in games for literally decades.