
720p

By Robert Sheldon

What is 720p?

720p is a high-definition (HD) display standard used to describe the resolution of a television or computer display that measures 1280 x 720 pixels. As such, each complete image raster is 1280 pixels wide and 720 pixels high -- that is, 720 horizontal lines of 1280 pixels each. This resolution also translates to a 16:9 aspect ratio, which means that for every 16 pixels of width there are 9 pixels of height. Most of today's televisions have this aspect ratio.
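As a quick illustration, the 16:9 figure can be derived by reducing the pixel dimensions to their lowest terms. A minimal Python sketch (not tied to any particular display API) makes the arithmetic explicit:

from math import gcd

width, height = 1280, 720

# Dividing both dimensions by their greatest common divisor (80)
# reduces 1280 x 720 to its simplest ratio.
divisor = gcd(width, height)
print(f"{width // divisor}:{height // divisor}")  # prints 16:9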

How do 720p displays work?

The p in 720p indicates that progressive scanning is used to render the image. With progressive scanning -- also called non-interlaced scanning -- every line in the image raster is drawn sequentially from top to bottom, so the entire image is refreshed in a single pass.

The other main approach to rendering images is interlaced scanning, represented as i. With this approach, each scan displays only alternate lines in the image raster, so two complete scans are required to display the entire image. Because the two passes are drawn at slightly different times, interlacing can introduce flicker and motion artifacts, which is why progressive scanning delivers higher-quality images. There has never been a 720i standard, although there are 480i and 1080i standards.
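To make the distinction concrete, the following Python sketch contrasts the two approaches. It is only an illustration of the scan order; draw_row is a hypothetical placeholder for whatever routine actually lights up one line of pixels.

def draw_row(row):
    # Hypothetical stand-in for the hardware that renders one line of pixels.
    pass

def progressive_scan(frame):
    # Progressive (e.g., 720p): every line is drawn top to bottom in one pass.
    for row in frame:
        draw_row(row)

def interlaced_scan(frame):
    # Interlaced (e.g., 480i, 1080i): odd and even lines are drawn in two
    # separate passes, called fields.
    for row in frame[0::2]:  # first field: lines 0, 2, 4, ...
        draw_row(row)
    for row in frame[1::2]:  # second field: lines 1, 3, 5, ...
        draw_row(row)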

Other types of displays

Prior to 720p, the main TV display technologies were 480p and 480i, both referred to as standard definition television (SDTV). Each measures 640 x 480 pixels and has an aspect ratio of 4:3. For the most part, 720p displaced 480i and 480p in television displays. Since then, 720p itself has been largely displaced by 1080p and 1080i, as well as by even higher resolutions such as 4K. Both 1080p and 1080i measure 1920 x 1080 pixels and have a 16:9 aspect ratio, just like 720p.
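One way to gauge the jump between these standards is to compare total pixel counts per frame, as in this short Python sketch (the labels are simply shorthand for the formats discussed above):

formats = {
    "480p/480i (SD)": (640, 480),
    "720p (HD)": (1280, 720),
    "1080p/1080i (Full HD)": (1920, 1080),
}

# 480p carries 307,200 pixels per frame, 720p carries 921,600 (3x more)
# and 1080p carries 2,073,600 (2.25x more than 720p).
for name, (w, h) in formats.items():
    print(f"{name}: {w * h:,} pixels")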

Comparing 480p, 720p and 1080p in terms of display resolution and aspect ratio, the differences in quality become apparent as display sizes increase. For example, an image displayed at 480p on a 60-inch television will look much grainier, more jagged-edged or even out of focus compared with the same image displayed at 1080p.
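A rough way to see why this happens is pixel density: the same number of pixels spread across a larger screen means fewer, larger pixels per inch. Here is a hedged Python sketch of that calculation, assuming square pixels and measuring density along the screen's diagonal:

from math import sqrt

def pixels_per_inch(width_px, height_px, diagonal_in):
    # Pixel density for a display with the given resolution and diagonal size.
    return sqrt(width_px ** 2 + height_px ** 2) / diagonal_in

# On a 60-inch screen: 480p works out to roughly 13 PPI, 720p to about 24 PPI
# and 1080p to about 37 PPI -- which is why 480p looks grainy at that size.
for name, (w, h) in [("480p", (640, 480)), ("720p", (1280, 720)), ("1080p", (1920, 1080))]:
    print(f"{name}: ~{pixels_per_inch(w, h, 60):.0f} PPI")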

Resolution is not the only factor that determines image quality. Features such as contrast, color (red, green and blue), color quality (hue, saturation and brightness), refresh rate and processing capabilities also play an important role. Quality can also be improved by reducing the incidence of visual artifacts. Even so, resolution remains one of the main considerations when purchasing a TV or computer monitor.

Today's high-definition television (HDTV) providers broadcast in either 720p or 1080i, and televisions must scale these signals to match their own native resolution. Only smaller budget TVs still offer 720p as a native resolution, and 1080p is quickly heading the same way. Most of today's TVs -- referred to as 4K or Ultra HD TVs -- deliver a resolution of 3840 x 2160 pixels while still providing a 16:9 aspect ratio. The newest and priciest televisions on the market, 8K TVs, offer a resolution of 7680 x 4320 pixels.
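For instance, scaling a 720p or 1080i broadcast to a 4K panel means stretching each source pixel across several physical pixels. A minimal sketch of that scale-factor arithmetic follows; the function name is illustrative, not a real TV API:

def scale_factor(source, native):
    # How much a signal must be stretched in each direction to fill
    # the panel's native resolution.
    src_w, src_h = source
    nat_w, nat_h = native
    return nat_w / src_w, nat_h / src_h

# A 4K (3840 x 2160) panel upscales 720p by 3x in each direction and a
# deinterlaced 1080i or native 1080p signal by 2x in each direction.
print(scale_factor((1280, 720), (3840, 2160)))   # (3.0, 3.0)
print(scale_factor((1920, 1080), (3840, 2160)))  # (2.0, 2.0)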


22 Jan 2024
