Actually David, even you are incorrect with some of your info. :razz:
Okay, so I've done a bit of thinking and tried to boil this down into some easily digestible chunks to clear up a few common misconceptions. I'm writing this as clearly as I can, but I'm not particularly good at being clear or concise, so I hope this makes sense. LOL!
There are several issues here that people are getting confused about:
Scan Method
The HD standards were developed at a time when almost all TV sets were CRT-based (including CRT projection sets), which display using an interlaced scan method (every other line at a time). Fortunately, the folks in charge were smart enough to understand that interlaced technology would be replaced by plasma/DLP/LCD etc., all of which use a progressive scan method (all lines at once). Progressive scan is almost always considered "better" because it has less "screen flicker". To account for the fact that both types of sets are out there, two HD broadcast standards were created: 720p (progressive) and 1080i (interlaced).
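If it helps to see that spelled out, here's a toy Python sketch (nothing a TV actually runs, just an illustration) of how an interlaced set splits each frame into two fields of alternating lines, while a progressive set draws the whole frame in one pass:

```python
# Toy illustration only: an 8-line "frame" drawn progressively vs. interlaced.
frame = ["line %d" % n for n in range(1, 9)]

progressive_pass = frame        # progressive: all 8 lines in one pass
odd_field = frame[0::2]         # interlaced pass 1: lines 1, 3, 5, 7
even_field = frame[1::2]        # interlaced pass 2: lines 2, 4, 6, 8

print(progressive_pass)         # the whole picture at once
print(odd_field)                # half the picture...
print(even_field)               # ...then the other half, a moment later
```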
Resolution
1080i = 1920x1080 pixels, displayed every other line at a time (half the frame per pass)
720p = 1280x720 pixels, displayed all at once, a complete frame per pass
So why two different resolutions? Simple answer: bandwidth. Cable, satellite, fiber, IP, and most other delivery methods (over-the-air being the exception) travel over some sort of telecommunications pipe with limited bandwidth, so a per-channel "cap" was set. That cap allows for either a high-resolution interlaced signal (1080i) or a somewhat lower-resolution progressive signal (720p), both of which consume about the same amount of bandwidth. Think about it: a complete 720 image every frame, or half of a 1080 picture each pass, both work out to somewhere around a million pixels. The bandwidth is the same even though the resolution is different, because of the two different scan methods.
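If you want to sanity-check that "about a million pixels" claim, here's the back-of-the-envelope math in Python (I'm assuming the common 60 Hz rates here; real broadcasts also use 24/30 fps, so treat the numbers as rough):

```python
# Back-of-the-envelope pixel math for the two broadcast HD formats.
# 60 passes per second is assumed (60 frames/s for 720p, 60 fields/s
# for 1080i); actual broadcasts also use 24/30 fps.
formats = {
    "720p":  {"width": 1280, "lines_per_pass": 720, "passes_per_sec": 60},
    "1080i": {"width": 1920, "lines_per_pass": 540, "passes_per_sec": 60},
}

for name, f in formats.items():
    per_pass = f["width"] * f["lines_per_pass"]
    per_sec = per_pass * f["passes_per_sec"]
    print(f"{name}: {per_pass:,} pixels per pass, {per_sec:,} per second")

# 720p: 921,600 pixels per pass, 55,296,000 per second
# 1080i: 1,036,800 pixels per pass, 62,208,000 per second
```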
For what it's worth, many people (myself included) would rather have the lower-resolution 720p than the higher-resolution 1080i, because interlacing flicker is a more noticeable distraction than a drop in resolution. That is to say, "p" is better than "i", even if the "i" has higher resolution.
What about 1080p?
1080p = 1920x1080 pixels, displayed all-at-once every frame
Good news: that gives you the higher resolution of 1080i, but with the better progressive-scan method of 720p. Best of both worlds. That's why it's often called (in marketing speak, at least) "Full HD".
Bad news: 1080p is NOT a broadcast HD standard, and it likely won't be for a long, long time. I mean, our last TV standard (NTSC) lasted over 50 years! Again, the reason for this is bandwidth: a 1080p frame contains about 2 million pixels, roughly twice as much data per pass as 720p/1080i. That would mean a given TV provider (cable company, satellite provider, etc.) would only be able to offer HALF as many 1080p channels as they could 720p/1080i. Then of course the decoder boxes have to be considerably more powerful, which makes them more expensive. And hey, the cameras used to shoot TV shows would cost a lot more too, and the broadcast equipment, and so on and so forth.
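Extending the same rough math to 1080p shows where that "twice the bandwidth" figure comes from (again assuming 60 Hz and ignoring compression, which changes the exact numbers but not the ballpark):

```python
# Same rough math, extended to 1080p.  The exact ratio shifts with the
# frame rate, but the roughly-2x gap over 1080i holds.
def pixels_per_second(width, lines_per_pass, passes_per_sec):
    """Raw (uncompressed) pixel throughput for a scan format."""
    return width * lines_per_pass * passes_per_sec

p720 = pixels_per_second(1280, 720, 60)    # 55,296,000
i1080 = pixels_per_second(1920, 540, 60)   # 62,208,000
p1080 = pixels_per_second(1920, 1080, 60)  # 124,416,000

print(f"1080p vs 720p:  {p1080 / p720:.2f}x the raw pixel rate")
print(f"1080p vs 1080i: {p1080 / i1080:.2f}x the raw pixel rate")
# 1080p vs 720p:  2.25x the raw pixel rate
# 1080p vs 1080i: 2.00x the raw pixel rate
```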
The bottom line is that there is no such thing as 1080p "Full HD" broadcast HDTV, and because of physical production and delivery costs, there probably won't be for a very, very long time.
At this point in time, the only places you can get a 1080p signal are Blu-ray, game systems like the Xbox 360 and PS3, and some pay-per-view/on-demand movie streams from cable/satellite providers.
If you have a 1080p TV set, all it's doing with the 720p/1080i HDTV broadcasts is upsampling them via pixel interpolation (similar to enlarging a picture in Photoshop). It might be marginally better than watching a 720p/1080i signal on a 720p set, but it's not really that big of a deal because it's artificially created 1080p.
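Just to make "upsampling via pixel interpolation" concrete, here's a toy sketch of the simplest possible upscale (nearest-neighbor replication). Real sets use much smarter interpolation filters, but the principle is the same: the extra pixels are guessed from the ones that were actually broadcast, so no new detail appears.

```python
# Toy version of what a 1080p set does to a 720p/1080i picture: invent
# the missing pixels from the ones it was given.  This is plain
# nearest-neighbor replication; real sets use smarter filters, but
# either way no new detail is created.
def upscale_nearest(image, new_w, new_h):
    """image is a list of rows, each row a list of pixel values."""
    old_h, old_w = len(image), len(image[0])
    return [
        [image[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

tiny = [[1, 2],
        [3, 4]]
for row in upscale_nearest(tiny, 4, 4):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```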
IMHO, the benefit of a 1080p Full HD set isn't from watching HDTV channels (which a 720p set does nearly as well). The real benefit comes from Blu-ray, game consoles, and ppv/on-demand streams. If that's what interests you, either now or maybe in the next few years, then definitely buy a 1080p set.
-----------------------------------------------------------------------
Wowzers, huh? Yucky and confusing, right? Yeah, and I oversimplified all that and left out a ton of other info! :rofl:
The bottom line is that 1080p is nice for some things, but a 720p set is really just fine for watching HDTV channels. Buy what your budget will afford; there are definitely benefits to 1080p, but they're probably not as great as you might think (unless you're talking about watching Blu-ray, etc.).