Today, most HDTVs support two picture modes with similar names: 1080p and 1080i. Both have the same screen resolution, so what’s the difference between the two modes? Below, we will explore the key differences between 1080p and 1080i.
Let’s start with the abbreviations: the “p” in 1080p stands for “progressive scan”, while the “i” in 1080i stands for “interlaced scan”. In an interlaced scan, the odd and even rows of pixels on your screen illuminate in alternating fashion. Each set of rows is called a “field”. Because the fields flash so quickly (each one refreshes 30 times per second, for 60 fields per second in total), your eyes do not notice the switch, and your brain perceives a fully assembled picture. The number 1080 refers to the number of vertical pixels. Each field in an interlaced scan consists of 540 rows, and adding the two fields together gives you a total of 1080 vertical pixels.
Progressive scan, by contrast, takes a different approach to the same 1080 rows: instead of alternating fields, your screen refreshes every row, top to bottom, a whopping 60 times per second. As you might imagine, this is harder to pull off technologically, but the results are worth it.
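To make the two scan modes concrete, here is a minimal sketch (in Python, with hypothetical helper names) of how a 1080-row frame splits into two 540-row interlaced fields and how “weaving” the fields back together recovers the full frame — the same reassembly your eye and brain perform:

```python
ROWS = 1080  # vertical resolution shared by 1080i and 1080p

def split_into_fields(frame):
    """Split a frame (a list of rows) into its two 540-row fields."""
    odd_field = frame[0::2]   # rows 1, 3, 5, ... (540 rows)
    even_field = frame[1::2]  # rows 2, 4, 6, ... (540 rows)
    return odd_field, even_field

def weave(odd_field, even_field):
    """Reassemble the full 1080-row frame from the two fields."""
    frame = [None] * (len(odd_field) + len(even_field))
    frame[0::2] = odd_field
    frame[1::2] = even_field
    return frame

# Stand-in frame: row i simply holds the value i.
frame = list(range(ROWS))
odd, even = split_into_fields(frame)
assert len(odd) == 540 and len(even) == 540
assert weave(odd, even) == frame  # the two fields add up to one full frame
```

A progressive scan skips the split entirely: every one of the 1080 rows is redrawn on each refresh, which is why 1080p needs roughly twice the row throughput of 1080i at the same field rate.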
Despite using the same number of vertical pixels, images produced in 1080p are generally much higher quality than those produced with an interlaced scan. 1080p is often referred to as “full HD” to differentiate it from the lower-quality pictures produced by 1080i or 720p.
Lastly, cable companies will sometimes deliver a 1080i picture but compress the data substantially so that it takes up less bandwidth. This process can smear details or produce blocky color gradations, especially in scenes with a lot of movement. The result is still technically HD, but it’s not as good as “true” or “full” HD.