High Definition Television is a video format containing more resolution than the 625 lines of PAL or the 525 of NTSC used by Standard Definition video. Analog broadcasts in HD go back to the 1950s, but HD did not take off in most of the world until the 2000s. Furthermore, computer users with VGA monitors have effectively had HD since the late 1990s. The transition to high definition also brought an Aspect Ratio change to television, making 16:9 the new default (since 4:3 and Cinemascope content each occupy about 75% of the screen when letterboxed for 16:9, 16:9 makes an effective compromise between the two extremes).
These three resolutions are generally regarded as HD:
- 720p (1280x720, progressive-scan)
- 1080i (1920x1080, interlaced)
- 1080p (1920x1080, progressive-scan)
Broadcasters tend to choose between 720p and 1080i depending on their type of programming; for instance, ABC, Fox and ESPN go with 720p to reduce image blur during fast motion in sporting events and films, and to address bandwidth concerns. 1080i broadcasters such as NBC, CBS, the Discovery, A&E/Lifetime and HGTV/Food Network families go for image clarity. However, in many cases the average consumer has no need to understand either format, as both provide a great picture. 720p sets are cheaper than 1080i televisions, though as prices go down, 720p sets are becoming fewer and fewer. Cable provider Xfinity chose to forgo 1080i entirely and converts all channels to 720p or less for broadcast. 1080p is mainly a media resolution utilized by camcorders, video games, streaming services like Netflix, and Blu-ray. It was not possible to broadcast a 1080p signal over the air until the ATSC 3.0 standard came around (3.0 also includes support for 2160p/4K/Ultra HD signals).
Almost all content recorded after 2009 on major networks is in high definition, and when Fox de facto ended their Saturday morning infomercial block in favor of new edutainment programming in most markets, it meant that outside of one (usually ignored) exception, all the major American and Canadian broadcast networks were now run solely in HD. The few shows that were behind the times, such as America's Next Top Model and Big Brother, had varying reasons, such as probable unease by Tyra Banks over the format, and in the case of Big Brother, the high cost of refitting the set's collection of voyeuristic cameras with HD replacements. ANTM finally pulled the trigger in March 2012, but most of their setup remained "enhanced definition widescreen" for a long time, which is marketing code for "not really HD". Big Brother pulled the HD trigger in 2014, but not because of laziness; the complicated retrofitting of the entire show to run in HD took three years to complete, and the producers wanted it all-HD or none at all. Let's Make a Deal was the final network program to make the switch, with the 2014 season start.
The turning point for switching most programs in syndication to HD was the 2011-12 season (one show, Swift Justice with Jackie Glass, downgraded to SD that season because of the loss of Nancy Grace and a new studio, and was swiftly canceled). The trifecta of NBCUniversal's trash talk shows (Jerry Springer, Steve Wilkos and Maury) and others of their ilk, along with most of the low-tier court shows, stood out for a long time as being stuck in SD, mainly because the sketchy lawyer ads airing on those shows don't really pay the HD upgrade bill. In 2012-13, however, all of those programs upgraded to 480p widescreen, good enough to fill the screen at the very least, and all of them eventually switched to HD cameras when the budget allowed. As of the fall of 2017, The Robert Irvine Show on The CW finally scrounged up the change between the couch cushions and bought HD cameras, ending the standard definition age on American broadcast television (though the network and affiliates rarely promote the show at all, as it mainly exists as a contractual obligation to the network's largest station group, Tribune). The last true SD "full frame" show in all of syndication, the morning business review First Business, ended in December 2014, but more because of local morning newscasts stealing its timeslots than any technological issues.
The ultimate resolution, drooled over by home theater buffs as it's the exact size of a "frame" of a digital theatrical film, has a width of 4096 pixels and a varying height between 2200-3100 pixels depending on the film's aspect ratio standard; this is known as "4K". "4K" is named after the approximate width in pixels, while 720 and 1080 are named after the height; hence, 4K is roughly twice as big in each dimension as 1080, not four times. Amazon Studios has required several of their movies and big-name shows to record in 4K, including The Man in the High Castle, Transparent, and The Grand Tour. It is worth noting that on most domestic 4K screens, at a reasonable viewing distance, even 20/20 vision is not capable of perceiving the resolution of 4K (the pixels are close enough together that your retinas can't distinguish between them), and most digital effects work is only mastered in 2K/HD (which is perfectly fine for cinema screens), so anything with CGI, digital compositing, digital editing, or a digital color grade likely had that part of the film mastered in 2K and then upscaled for the 4K release. Since practically everything made these days has at least one of these elements, if not all of them, you have to go back to something scanned from a purely analogue film print to potentially get a measurable benefit from 4K. Technophiles and cinephiles respond to these criticisms with a variety of arguments about not ruining their fun; and, as post-production computers get faster, the possibility of rendering full 4K during the process does go up, so eventually they might be right.
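The naming arithmetic above can be sketched in a few lines. This is a minimal illustration using the consumer UHD frame (3840x2160), since theatrical DCI heights vary with aspect ratio as noted:

```python
# Quick check of the naming arithmetic: "4K" refers to a ~4096-pixel
# width, while "720" and "1080" refer to the height, so 4K is roughly
# double Full HD in each dimension and roughly quadruple in total pixels.
FHD_W, FHD_H = 1920, 1080   # Full HD frame
UHD_W, UHD_H = 3840, 2160   # consumer "4K" (UHD) frame
DCI_4K_WIDTH = 4096         # theatrical 4K width; height varies by aspect ratio

print(f"width ratio:  {UHD_W / FHD_W:.2f}x")                      # 2.00x
print(f"height ratio: {UHD_H / FHD_H:.2f}x")                      # 2.00x
print(f"pixel ratio:  {(UHD_W * UHD_H) / (FHD_W * FHD_H):.2f}x")  # 4.00x
print(f"DCI 4K width is {DCI_4K_WIDTH / FHD_W:.2f}x Full HD's width")
```

So "twice as big in each dimension" and "four times the pixels" are the same claim stated two ways.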
Shows recorded on film that were broadcast before the HD-era can be remastered as long as the original film negatives are intact. This has been done with works such as Star Trek, which saw an acclaimed re-release in the format in both syndication and on Blu-ray. This sometimes causes gaffes such as If You Can Read This, which happens when writing that was never meant to be read in the old days becomes legible in HD.
Japan pioneered the adoption of high-definition TV. The system used in Japan, "Hi-Vision" (technical name MUSE), began development in 1979 and was adopted as early as 1988 by NHK. The system uses an analog compression algorithm with some clever tricks, like motion compensation, to reduce blurring and keep frame rates consistent, but it was anything but bandwidth-friendly: initially, the signal could only be delivered via satellite, as the system consumed a whopping 20MHz per channel (comparatively, a typical UHF channel consumes 7-8MHz). Nonetheless, the system was a success in Japan, and by the early '90s many Japanese households had BS (Broadcast Satellite)-friendly TVs that received MUSE high-definition signals direct from satellite. Additionally, W-VHS VCRs that could record (via a BS source) and play back MUSE signal, along with MUSE-capable LaserDisc players, were also developed for the market.
The US investigated the standard but ultimately decided not to adopt it, while the Europeans attempted to create their own system, called HD-MAC. This attempt failed, partially for the same reason as in the US and partially because they refused to modify the 60Hz system to work at 50Hz, preferring to build a new 50Hz system from the ground up. Ironically, a stream of improvements to the system later made it possible to transmit MUSE signal over a 6MHz bandwidth by the late '90s, which American engineers had claimed to be impossible without severe frame rate and picture quality issues (again, the Japanese made heavy use of motion compensation and analog encoding tricks which the US engineers had failed to consider, such as encapsulating the entire image stream as an FM signal). The standard was later expanded to carry Dolby Digital Surround audio, years before the rest of the world had even heard the term "high definition". Japan eventually abandoned MUSE for the digital ISDB broadcast system, which had been developed with HD support in mind from the start.
Some argue that it was the French who pioneered HDTV with their "System E" transmission standard, which broadcast an 819-line interlaced picture over a 14MHz bandwidth on VHF from the '50s through the '80s in France and Monaco. This spawned a modified "System F" using half the bandwidth, which was deployed in Luxembourg. However, the systems were black-and-white only (notice how they're not prefixed with the word "SECAM" in much technical documentation), and in the case of System F, the picture had noticeably worse resolution due to the available bandwidth being cut in half. The systems were dropped when these countries switched completely to color broadcasts.
Many 8-bit and 16-bit standard-resolution-era gaming consoles opted to abuse the vertical retrace pulse to create a pseudo-progressive scan mode, drawing only one field but refreshing that field twice as fast. This resulted in the "scanlines" effect, as consoles drew exclusively to the odd field and left the even field empty. By the time the 32-bit consoles came about, consoles started taking advantage of interlacing to generate higher-resolution images at the cost of half the frame rate, though games could choose to stick with the pseudo-progressive scan mode to maximize frame rate at the cost of resolution. Later, during The Sixth Generation of Console Video Games, new titles also started to support true progressive scan, though in limited capacity on most platforms, as most consumers didn't own a progressive-compatible TV until the commodification of HDTVs in the late 2000s and early 2010s standardized it. The vast majority of the Nintendo GameCube library, however, did support progressive scan through component and D-Terminal cables. The cables made specifically for the GameCube were discontinued due to extremely low adoption and now go for more money than the system itself on secondhand markets (they plugged into a Digital Audio/Video port and thus required a proprietary DAC built into the cable, which was never licensed out to any potential third-party manufacturers), but you can still get near-identical results for far less money with the Wii's component cables, as those plug into the same analog port as the composite cables and consequently have a large number of third-party variants readily available even today. Progressive scan was also widely supported by DVD players, as the higher and more even image quality benefitted the videophile market, and this trait carried over to Blu-ray and UHD Blu-ray players as well.
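The trade-off between interlacing and the pseudo-progressive trick can be sketched with back-of-the-envelope NTSC-style numbers. This is a simplification: real signals include blanking lines, and the true NTSC field rate is approximately 59.94Hz, not 60.

```python
# Back-of-the-envelope comparison of interlaced vs. pseudo-progressive
# ("240p") output on an NTSC-style display. Blanking intervals ignored.
FIELDS_PER_SECOND = 60   # nominal NTSC field rate
LINES_PER_FIELD = 240    # active lines per field (of ~480 visible total)

# Standard interlaced video: odd and even fields alternate, so a full
# 480-line frame takes two fields -> full resolution at half the rate.
interlaced = {"lines": 2 * LINES_PER_FIELD, "fps": FIELDS_PER_SECOND / 2}

# Pseudo-progressive "240p": the console redraws the same (odd) field on
# every vertical retrace -> half the resolution at double the frame rate.
pseudo_progressive = {"lines": LINES_PER_FIELD, "fps": FIELDS_PER_SECOND}

# Either way, line throughput (lines drawn per second) is identical;
# the console only chooses how to spend it.
throughput = interlaced["lines"] * interlaced["fps"]
print(f"lines per second either way: {throughput:,.0f}")  # 14,400
```

The 14,400 lines-per-second figure is the same budget referenced later when distinguishing SD from ED.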
The PlayStation 2 and Xbox both supported HD, but very few games used those modes. The PS2 only had two games that supported 1080i (Gran Turismo 4 and Tourist Trophy, which both ran on the same Game Engine). The HDTV Arcade Game Database highlights games that support the 720p and 1080i modes. Homebrew software for either platform is more likely to leverage the higher resolutions. Most (but not all) games across all four sixth-gen consoles support at least 480p.
Of The Seventh Generation of Console Video Games (the PlayStation 3, Xbox 360, and Wii), only the Wii doesn't have HD capabilities. This has the side effect of making text on some PS3 and 360 games illegible on non-HD TVs (since the game programmers expected them to be played on widescreen HD sets), a problem not present in the Wii due to the aforementioned lack of HD.
The eighth-generation Wii U, Xbox One and PlayStation 4 (and the ninth-generation Nintendo Switch) all run in HD, with the Xbox One and PlayStation 4 extending to 4K. Later, the release of the Xbox One S brought 4K Blu-ray and streaming video playback to Microsoft's consoles.
For the sake of clarity, here is how the various video formats are defined:
- Standard Definition (SD): Depending on the format, either 480 (Systems M and N) or 576 (Systems B through K' and System NC, excepting Systems E and F) horizontal lines, interlaced. The format can be widescreen or not. There is no width measurement, as width was considered to be limited by the allocated video bandwidth in the analog era, which could be anywhere between 6-8MHz. However, when computer graphics cards output to analog TV, the accepted convention is to output 640x480 or 768x576 converted to interlaced (this may result in dot crawl and, on NTSC systems, artifacting issues).
- Progressive scan: a variant of Standard Definition in which every horizontal line is displayed in sequence rather than alternating between the odd and even ones. To differentiate it from interlaced Standard Definition, resolutions of this type use a "p" suffix for "progressive," with interlaced resolutions denoted by an "i" suffix. This lettering became standard for later resolution standards as well. Progressive scan was mainly used by DVD players and some sixth-generation console games, as the more even image display allowed for better picture quality than interlaced video, but it was never widely adopted by broadcasters, in part because the race for HD video overshadowed it.
- Due to the way analog TV signals work, it was possible to force the TV to forgo one field, creating a pseudo-progressive scan resolution by halving the resolution and doubling the frame rate. This resulted in the pseudo-240p and -288p resolutions used by many consoles of the 8-bit, 16-bit and some of the 32-bit era. The once-infamous, now-iconic "scanlines" effect that consoles of the period were renowned for was the result of using this pseudo-resolution, as only the odd field was ever used and the tube TV was forced to do a full retrace at the end of a frame instead of moving on to the even field. Unfortunately, it also had the caveat of being nigh-impossible for later digital image processors to understand, with LCD televisions frequently either interpreting the signal as 480i and trying to deinterlace the progressive signal (which results in a far muddier look than how it'd appear on a CRT) or just rejecting the signal outright. Consequently, a large market emerged for external upscalers that could better handle these pseudo-resolutions and convert them to a progressive signal that LCD displays can more readily understand (typically 480p or 720p).
- Enhanced Definition (ED): The same as standard definition, but progressive scan. In the digital era, its equivalent would be roughly 640x480 (VGA) and 768x576 for 4:3 transmissions, and 854x480 (FWVGA) and 1024x576 (WSVGA) for widescreen transmissions. ED typically means 60 frames per second; 480p at 30 frames per second is generally considered SD, since it has the same number of lines per second as 480i (14,400 lines per second).
- High Definition (HD): 1280x720 progressive scan. Sometimes also retroactively applied to analog Systems E and F (819i) and Japan's MUSE "Hi-Vision" format (1035i).
- Full High Definition (FHD): 1920x1080 interlaced or progressive scan.
- Quad High Definition (QHD), often loosely marketed as "2K": 2560x1440, usually used on computers and many newer smartphones. Four times the resolution of 720p.
- Ultra High Definition (UHD) or 4K: Defined as 3840x2160 on The Other Wiki, but varies among manufacturers. Four times the resolution of 1080p and nine times the resolution of 720p.
- Full Ultra High Definition (FUHD) or 8K: Defined as 7680x4320. Four times the resolution of UHD. NHK is planning to broadcast some of the 2020 Olympics in Tokyo in 8K.
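The "four times" and "nine times" multiples quoted in the list above are straight pixel-count ratios, and can be verified in a few lines (a minimal check, using the widths and heights as defined in this list):

```python
# Verify the pixel-count multiples quoted in the definitions above.
def pixels(width, height):
    """Total pixel count of a frame."""
    return width * height

assert pixels(2560, 1440) == 4 * pixels(1280, 720)    # QHD  = 4x 720p
assert pixels(3840, 2160) == 4 * pixels(1920, 1080)   # UHD  = 4x 1080p
assert pixels(3840, 2160) == 9 * pixels(1280, 720)    # UHD  = 9x 720p
assert pixels(7680, 4320) == 4 * pixels(3840, 2160)   # 8K   = 4x UHD
print("all multiples check out")
```

Each doubling of both width and height quadruples the pixel count, which is why 8K ends up at sixteen times the pixels of 1080p despite only quadrupling each dimension... twice over.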