Post by dkennedy on Sept 2, 2005 4:22:31 GMT -5
The Clicker: Futureproofing with 1080p?
September 1, 2005
By Peter Rojas, engadget.com
Every Thursday Stephen Speicher contributes The Clicker, a weekly opinion column on entertainment and technology:
The conversation always goes the same way:
“So 720p is progressive, right?”
“Correct – that’s what the p is for.”
“But 720p has fewer pixels than 1080i, right?”
“Correct – bigger number and all.”
“But 1080i is only 30 frames per second and is interlaced compared with 720p’s 60 frames of progressive goodness.”
“Correct.”
“So why not just get a 1080p display?”
And thus begins the seemingly perpetual pining for 1080p. Not that the quest for 1080p is completely without merit. As the conversation above illustrates, it doesn’t take Newtonian levels of brainpower to see how 1080p might be a good thing. If one could have all the benefits of 720p but with more pixels, it certainly couldn’t hurt. But should you really pay the premium for a 1080p set? Perhaps not.
The first problem is, of course, that there is no 1080p/60 broadcast standard. Of the 18 ATSC broadcast formats, the closest you’ll get to 1080p/60 is 1080p/30, and while that’s often better than 1080i/30, it still doesn’t give broadcasters a way to handle fast-motion content such as sports.
Furthermore, the likelihood of 1080p/60 becoming a broadcasting standard is about as great as the lovechild of Britney Spears and Kevin Federline penning the great American novel – it ain’t gonna happen. Cable and satellite providers are already looking to compress the signal. They’re not about to pass twice the amount of data over those lines. And, without content, really what’s the point?
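If you’re wondering where that “twice the data” figure comes from, here’s a quick back-of-the-envelope sketch. It counts raw, uncompressed pixel rates only; real broadcast bitrates depend on compression, so treat it as illustration rather than a bandwidth calculator:

```python
# Back-of-the-envelope pixel throughput for common HD formats.
# Raw pixel rates only; actual broadcast bandwidth depends on compression.

def pixels_per_second(width, height, rate, interlaced=False):
    """Pixels delivered per second. An interlaced signal sends half
    the lines (one field) per refresh, so halve the line count."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

formats = {
    "720p/60":  pixels_per_second(1280, 720, 60),
    "1080i/30": pixels_per_second(1920, 1080, 60, interlaced=True),  # 60 fields/s
    "1080p/30": pixels_per_second(1920, 1080, 30),
    "1080p/60": pixels_per_second(1920, 1080, 60),
}

for name, pps in formats.items():
    print(f"{name:9s} ~{pps / 1e6:6.1f} million pixels/s")

# 1080p/60 moves exactly twice the raw pixel data of 1080i/30 or
# 1080p/30, which is the doubling the column refers to.
```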
“Clearly he doesn’t understand that the PS3 will soon be pumping out 1080p,” you start to mutter under your breath, as though I’m completely unaware of the situation. “Two HDMI ports pumping out 1080p,” you continue. Great! To what are they pumping this information? Are they handing this data to Samsung’s 1080p DLP sets? Nope; they can’t. Until recently, HDMI chips were unable to process 1080p/60, and while newer silicon has broken through that limitation, few (if any) sets actually include the new chips. It’s been rumored that Sony’s Qualia 1080p front projector will start to include 1080p/60 HDMI technology. However, for those of us looking to spend less than thirty thousand dollars, that isn’t much help.
“But that’s a short-term view. These sets will eventually include those chips.” OK. Let’s move past the fact that nearly all of today’s 1080p sets won’t accept an, uh, 1080p signal via HDMI. Will 1080p games really look much better than 720p games? Games, at their heart, are rendered and vector-based. Once you reach the point where lines are drawn without stair-stepping, added resolution is of marginal benefit. Yes, it’s possible that game developers could incorporate high-quality 1080p textures. Will they? It’s unlikely. They, too, are weighing the options. Texture memory is a precious resource, and with 1080p customers so few and far between, it’s smarter for a developer to ship a larger number of medium-resolution textures than a smaller number of 1080p-grade ones. The result? The difference in your viewing experience is likely to be slim.
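To put rough numbers on that trade-off, here’s an illustrative sketch. The 32-bit RGBA figure is an assumption for the sake of arithmetic, not any console’s actual texture format or budget:

```python
# Rough memory arithmetic behind the texture trade-off
# (illustrative numbers, not any console's actual budget).

BYTES_PER_PIXEL = 4  # assume uncompressed 32-bit RGBA

def megabytes(width, height):
    return width * height * BYTES_PER_PIXEL / 2**20

fb_720p  = megabytes(1280, 720)    # ~3.5 MB of pixels
fb_1080p = megabytes(1920, 1080)   # ~7.9 MB, about 2.25x the pixels

print(f"720p-sized image:  {fb_720p:.1f} MB")
print(f"1080p-sized image: {fb_1080p:.1f} MB")

# That same 2.25x applies to any texture authored for the higher
# resolution: out of a fixed memory budget, a developer can keep
# roughly two medium-resolution textures for every one sized for
# 1080p, which is why most choose variety over peak resolution.
```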
“But the added resolution of a 1080p set makes everything look more film-like and that’s what I’m looking for.”
It’s possible that you’re special. It’s possible that you’ve got the eagle eyes needed to detect individual pixels while watching HD content at proper viewing distances. Most people can’t. More often than not, visible pixel structure is caused by the black border around individual pixels (i.e., the screen-door effect), not by the number of pixels.
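If you want to check your own eagle-eye credentials, the standard back-of-the-envelope goes like this, assuming the common (and approximate) one-arcminute figure for 20/20 acuity:

```python
import math

# At what distance does a 1080p pixel shrink below ~1 arcminute,
# the usual rule of thumb for 20/20 vision? (One arcminute is a
# common approximation, not a hard limit of the eye.)

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def max_distance_inches(diagonal_in, h_pixels=1920, v_pixels=1080):
    """Distance beyond which individual pixels blur together."""
    aspect = h_pixels / v_pixels                       # 16:9
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / h_pixels                     # pixel pitch
    return pixel_in / math.tan(ARCMIN)

for size in (42, 50, 60):
    d = max_distance_inches(size)
    print(f'{size}" 1080p set: pixels vanish beyond ~{d / 12:.1f} ft')

# Roughly 5.5 to 8 feet for these sizes; at a typical couch
# distance the pixel grid itself is already below acuity.
```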
Don’t get me wrong; all things being equal, 1080p will be better than 720p. The problem? Things are rarely equal.
There’s no doubt that some of those reading this can appreciate the differences. Unfortunately, most people can’t. Don’t believe me? Listen to people exclaim that CSI is the best-looking show on television. Dig further and learn that these same people are watching this 1080i show on a 720p set. Dig even further and you discover that most modern sets convert 1080i to 720p by first chopping the 1080i down to 540p and then scaling it back up. That’s right – less resolution than 720p shows. The point? Many different variables contribute to the end result. The number of pixels is just one small part of picture quality. Black levels, contrast, color accuracy, and the like all play a major role in the “film-like” look of a display.
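To make that 540p shortcut concrete, here’s a minimal sketch of the cheap conversion path the column describes: keep one 540-line field, then scale it back up. Real scaler chips interpolate between lines; nearest-neighbor keeps the sketch short:

```python
# A minimal sketch of the cheap 1080i -> 720p path: keep one
# 540-line field, then stretch it vertically to 720 lines.

def take_field(frame, even=True):
    """Drop every other line of a 1080-line frame -> 540 lines."""
    start = 0 if even else 1
    return frame[start::2]

def scale_vertically(field, out_lines=720):
    """Nearest-neighbor vertical scale from len(field) lines."""
    in_lines = len(field)
    return [field[i * in_lines // out_lines] for i in range(out_lines)]

frame_1080i = [[line] * 1920 for line in range(1080)]  # dummy frame
field = take_field(frame_1080i)        # only 540 unique lines survive
upscaled = scale_vertically(field)     # 720 lines, but no new detail

unique = len({row[0] for row in upscaled})
print(f"lines on screen: {len(upscaled)}, distinct source lines: {unique}")
# -> 720 lines on screen, only 540 distinct lines of actual picture.
```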
In the end it really does come down to what you think looks best. Buying a 1080p display to “future-proof” isn’t the no-brainer you might think it is, and if you’re buying for that reason, be sure the set includes the “future” parts (e.g., 1080p/60 over HDMI, motion-adaptive deinterlacing of 1080i content, proper pull-down of 1080i content, and so on). If it doesn’t, you might end up with more pixels and the same amount (or even less) of actual picture data.
If you have comments or suggestions for future columns, drop me a line at theclicker@theevilempire.com.