CNET recently published an article ("The case against 1080p") saying that in a visual comparison they couldn't see a substantial difference between 1080i sets and 1080p sets. I find such articles alarming. I think 1080i sucks, and here's why.
Simply put, 1080i sucks because interlaced broadcasting sucks.
The real issue with interlaced content is not the TV set. I know CNET is all about telling people what equipment to buy, but the real issue is the content itself. When your content is interlaced you don't have frames, you have half-frames that are interleaved. That's not so terrible when all the TVs are also interlaced and all have the same resolution.
But this is not the case anymore. Now you can watch HDTV on displays with various resolutions such as 1920x1080 (full HD), 1280x720 (the lower-resolution HD), or even 1366x768 (a wide version of the popular 1024x768 computer resolution, useful when you need your display to double as a computer monitor).
Now here's the tricky part: if you don't have complete frames, you can't resize the image. What you have to do is break the 1920x1080 frame into two fields, each 1920x540, scale each field, and then try to combine the results into a complete frame again (or maybe keep it interlaced). The results of this process are not great. Another option is to try to recreate full frames from the half-frames you have, but doing that properly is the kind of magic that costs a lot of money; I mentioned that earlier in my post "HDTV is cheaper than your old TV".
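Just to make the field-by-field approach concrete, here's a rough sketch in Python/numpy. The function names and the naive nearest-neighbor scaler are mine, purely for illustration; a real TV's scaler is far more sophisticated, but the combing you get from re-weaving independently scaled fields is exactly why.

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced frame (H x W) into its two half-height fields."""
    top_field = frame[0::2, :]     # even lines, e.g. 1920x540
    bottom_field = frame[1::2, :]  # odd lines
    return top_field, bottom_field

def scale_nearest(field, new_h, new_w):
    """Naive nearest-neighbor scaling of a single field (illustration only)."""
    h, w = field.shape
    rows = np.arange(new_h) * h // new_h
    cols = np.arange(new_w) * w // new_w
    return field[rows][:, cols]

def scale_interlaced(frame, new_h, new_w):
    """Scale each field separately, then weave them back together.

    The re-woven result tends to show combing wherever there is motion,
    which is part of why good deinterlacing hardware costs real money.
    """
    top, bottom = split_fields(frame)
    top_s = scale_nearest(top, new_h // 2, new_w)
    bottom_s = scale_nearest(bottom, new_h // 2, new_w)
    out = np.empty((new_h, new_w), dtype=frame.dtype)
    out[0::2, :] = top_s
    out[1::2, :] = bottom_s
    return out

# Example: take a fake 1080i luma frame down to a 1366x768 panel.
frame_1080i = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
panel_frame = scale_interlaced(frame_1080i, 768, 1366)
```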
You might think you're safe if you bought a 1920x1080 set, but I've already heard of plans to manufacture 2560x1536 sets. If your content is 1080i, it will be hard to scale it properly for the TV of the future (or next year's projectors?!?).
Here's another good reason why interlaced sucks: display technologies today are progressive. LCD, plasma, and DLP are all progressive technologies. When you feed them interlaced content, they have to process it somehow (deinterlace it) to fit their progressive nature. Sure, if your original material is progressive (film) and you "convert" it to interlaced, it will look good when you play it back on a progressive device, but if your material was shot on a 1080i camcorder, the conversion will either leave interlacing artifacts or lose some sharpness.
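To illustrate what "process it somehow" means, here's a minimal sketch of the two classic deinterlacing approaches. The names ("bob" and "weave") are standard, but the numpy implementation is my own simplification, not what any actual display chip runs.

```python
import numpy as np

def bob_deinterlace(field, full_height):
    """'Bob': turn one half-height field into a full frame by interpolating
    the missing lines. Cheap, but it costs vertical sharpness -- the
    trade-off mentioned above."""
    out = np.empty((full_height, field.shape[1]), dtype=np.float32)
    out[0::2, :] = field
    # Missing lines are the average of their neighbors; the last line repeats.
    out[1:-1:2, :] = (field[:-1].astype(np.float32) + field[1:]) / 2
    out[-1, :] = field[-1]
    return out.astype(field.dtype)

def weave_deinterlace(top_field, bottom_field):
    """'Weave': interleave two fields captured 1/60 s apart. Perfect for
    film-originated material, combing artifacts on anything that moves."""
    h, w = top_field.shape
    out = np.empty((h * 2, w), dtype=top_field.dtype)
    out[0::2, :] = top_field
    out[1::2, :] = bottom_field
    return out
```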
When modern HDTV was conceived in the US in the mid 1990s, CRT was the dominant technology, and the people who conceived HDTV were visionaries, but their vision was still limited. They thought the amount of information in 1080p (at 60 full frames per second) would be too much. Well, it's not too much now, even before 1080i is popular. And just to make clear what I mean when I say 1080p: I want a set that supports 1080p at 24 full frames per second, so that movie transfers are as accurate as possible. 1080 @ 24p actually contains less information than 1080 @ 60i (if someone starts shooting movies at 60 frames per second, I'll be happy to see them on a 1080 @ 60p set).
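To put numbers on that claim, here's the back-of-the-envelope arithmetic in raw luma samples per second, ignoring compression:

```python
def pixel_rate(width, height, rate, interlaced=False):
    """Raw samples per second; an interlaced format only carries half the lines per field."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

rate_1080p24 = pixel_rate(1920, 1080, 24)                   # 49,766,400 per second
rate_1080i60 = pixel_rate(1920, 1080, 60, interlaced=True)  # 62,208,000 per second
rate_1080p60 = pixel_rate(1920, 1080, 60)                   # 124,416,000 per second
```

So 1080p24 is roughly 80% of the data rate of 1080i60, while 1080p60 is double it.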
I don't really care if CNET recommends 1080i sets over 1080p sets because they're cheaper, but if that means it will take longer until we have enough content in 1080p, that's a shame. And CNET is right about one thing: 1080p content looks good even on a 1080i set (assuming the set can accept it).
Tuesday, November 14, 2006
1 comment:
I'd like to know how a monitor with a built-in, complicated deinterlacer could have been made relatively cheap. The picture output to the screen has to be progressive at 60 fps regardless. Or did they have a limited decoder that couldn't handle the higher bitrate of 60 full frames? ~ j7n