1366 X 768 Versus 1920 X 1080


29 replies to this topic

#1 Shutterbug

    AV Forum Member

  • Member
  • 186 posts

Posted 05 May 2007 - 01:35 PM

Is there much difference between the two HD standards when viewing an HD broadcast from the Nine or Ten network? Or is there only an obvious difference when looking at an HD DVD?

#2 bac

    AV Forum Member

  • Member
  • 143 posts

Posted 05 May 2007 - 01:44 PM

Is there much difference between the two HD standards when viewing an HD broadcast from the Nine or Ten network? Or is there only an obvious difference when looking at an HD DVD?

Bitrates are constrained on FTA HD transmissions, so the difference between 720p and 1080i is not necessarily that noticeable.

Cheers,
BAC

#3 fd3s

    AV Forum Member

  • Member
  • 265 posts

Posted 05 May 2007 - 01:48 PM

I don't think any stations broadcast in 1080p, only in 1080i and 720p.

#4 fd3s

    AV Forum Member

  • Member
  • 265 posts

Posted 05 May 2007 - 01:48 PM

I don't think any stations broadcast in 1080p, rather in 1080i and 720p.

#5 dvduser

    AV Forum Member

  • Senior Member
  • 2,701 posts

Posted 05 May 2007 - 02:41 PM

I don't think any stations broadcast in 1080p, rather in 1080i and 720p.

1080p wasn't even mentioned

#6 Zacspeed

    AV Forum Member

  • Member
  • 448 posts

Posted 05 May 2007 - 03:53 PM

As far as I'm aware, nearly all HD signals are at best 1080i, except for Blu-ray players and maybe HD DVD, which do 1080p.

There are several posts and threads covering this topic, especially 50" semi HD versus 50" full HD.
For most people, 1366 x 768 will be as good as full HD in that the difference wouldn't be noticed. Resolution, although important, isn't actually top of the list of what makes a great picture.
Reading reviews and specs, for example on the Panasonic full HD PZ700, it seems what makes them stand out, aside from the resolution, is that they have further tweaks to fine-tune the picture. It has calibration settings that are well ahead of ordinary PDPs.

#7 Shutterbug

    AV Forum Member

  • Member
  • 186 posts

Posted 05 May 2007 - 04:27 PM

Channels 9 & Ten broadcast 1920 x 1080; however, I shall have a look at a few different models and see how they perform.

#8 Gazzz

    AV Forum Member

  • Member
  • 132 posts

Posted 05 May 2007 - 04:38 PM

Sure, they send 1920x1080, but are the pictures truly RECORDED at that? Or simply upscaled before they're sent?

I know for a fact that 1080i from another source looks a hell of a lot better than Channel Ten at 1080i on my panel...

Till Aus really gets its ass into gear with TRUE high def, 1366x768 is more than enough for TV.

#9 PanaSung

    AV Forum Member

  • Member
  • 763 posts

Posted 05 May 2007 - 04:46 PM

Don't forget, no-one is saying that 720's resolution is better... and as Zacspeed mentioned, the 1080 HDTVs almost always have other desirable improvements all over the place.

Also, the bigger the screen, the more desirable 1080 becomes for your PC desktop/games.
As far as I can tell, the only negative of 1080 is price... so if you can afford it, BOOM!!!

We do have a problem though, and that's affordable 1080 HDTVs that are reliable/versatile... however, things will be very different in 8-12 months.

There are two standout HDTVs at the moment IMO.

Panasonic 50in 720p Plasma = $3300
Samsung M8 46in 1080 LCD = $4200

#10 whmacs

    AV Forum Member

  • Member
  • 380 posts

Posted 05 May 2007 - 04:53 PM

Channels 9 & Ten broadcast 1920 x 1080; however, I shall have a look at a few different models and see how they perform.


Hi Shutterbug,
I believe that 9 and 10 broadcast in 1440x1080, so it's not even true 1080i (with the exception of Perth, which does get 1080i). See Broadcaster Formats here.

Regards,
Stephen

#11 AndrewWilliams

    AV Forum Member

  • Member
  • 874 posts

Posted 05 May 2007 - 05:03 PM

I think they've actually switched the stream from 1440x1080 back to 1920x1080, but the amount of detail is the same, so thanks to digital compression they don't look any different.

A lot of 'HD' drama shows (NCIS, Medium, etc.) only look slightly better than SD, so you won't notice any difference between 768 and 1080. Some studio shows like Letterman, David & Kim, etc. do look noticeably better on 1080 displays. You normally need a very big display (or to sit very close) to notice the difference between 768 and 1080.

#12 Zacspeed

    AV Forum Member

  • Member
  • 448 posts

Posted 05 May 2007 - 06:09 PM

A lot of the HD from TV stations is upscaled from an inferior resolution, as 1920 x 1080 would require a lot of bandwidth. How is it possible for that much data to be streamed through the air to a tuner/STB?
It seems hard enough to stream a 1080p signal from a stored medium (i.e. a HDD) in a STB to a TV; apparently the thick cables and electronic circuitry can't handle the bandwidth in some units. So it makes me wonder how that much data can be transmitted terrestrially.

#13 AndrewWilliams

    AV Forum Member

  • Member
  • 874 posts

Posted 05 May 2007 - 07:21 PM

Uncompressed HD (1920x1080) = 1240 Mbit/s
Uncompressed SD (720x576) = 250 Mbit/s

Given that reasonable-quality SD MPEG2 usually requires about 6 Mbit/s, that's a compression ratio of 250/6, or about 40:1.

To get similar quality HD you'd need 1240/40 ≈ 31 Mbit/s.

A DTV channel has about 20 Mbit/s. 6 Mbit/s for SD leaves about 14 Mbit/s for HD - less than half of what's really needed.

Given that 1280x720 requires about half the bandwidth of 1920x1080, maybe it would be a better choice...though 1080 looks great even at low bitrates on talk shows where there's not much motion. The AFL at 1080 looks like a bit of a disaster when there's a lot of movement on screen.
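
For anyone who wants to poke at the numbers, here's the same back-of-envelope calculation as a rough Python sketch (assumptions: 24 bits per pixel uncompressed, 25 frames/s for both SD and 1080i HD, and my ~40:1 MPEG2 ratio - none of this is gospel):

# Back-of-envelope DTV bandwidth budget.
def uncompressed_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video data rate in Mbit/s."""
    return width * height * fps * bits_per_pixel / 1e6

sd_rate = uncompressed_mbps(720, 576, 25)    # ~249 Mbit/s
hd_rate = uncompressed_mbps(1920, 1080, 25)  # ~1244 Mbit/s

ratio = sd_rate / 6                # ~40:1, from 6 Mbit/s of SD MPEG2
hd_needed = hd_rate / ratio        # ~30 Mbit/s for similar-quality HD
hd_available = 20 - 6              # ~14 Mbit/s left in a ~20 Mbit/s channel

print(f"HD needs ~{hd_needed:.0f} Mbit/s, but only ~{hd_available} is left")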

#14 ummester

    AV Forum Member

  • Member
  • 320 posts

Posted 05 May 2007 - 07:29 PM

The AFL at 1080 looks like a bit of a disaster when there's a lot of movement on screen.


This is true - HD is actually a bit of a letdown for me, because there's not much content in it, and when there is, the bitrate is too low.

The best HD broadcasts that I have found are those demo ones with the mountains and killer whales and stuff, but no-one watches them, other than to show their tellies off :blink:

#15 big_marcelo

    AV Forum Member

  • Senior Member
  • 2,676 posts

Posted 05 May 2007 - 08:12 PM

Spiderman HD last week in 5.1 EX was awesome... the benefit of 1080i from a film source is that if your TV has a good scaler/deinterlacer it can re-create the original 1080p frames, and thus PQ is fantastic...

Tonight is Spiderman 2... which I'm recording in HD...
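
For the curious, "re-creating the original 1080p frames" from film-sourced 1080i is basically just weaving the two fields of each film frame back together. A bare-bones numpy illustration (field order assumed top-field-first here, which a real deinterlacer has to detect for itself):

# Weave deinterlace for film-sourced 1080i: both fields of a frame come
# from the same instant of film, so interleaving their lines restores
# the original progressive frame exactly.
import numpy as np

def weave(top_field, bottom_field):
    """Rebuild a full-height frame from two half-height fields."""
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]),
                     dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.zeros((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920)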

#16 ...

    AV Forum Member

  • Senior Member
  • 8,288 posts

Posted 05 May 2007 - 08:29 PM

Spiderman HD last week in 5.1 EX was awesome... the benefit of 1080i from a film source is that if your TV has a good scaler/deinterlacer it can re-create the original 1080p frames, and thus PQ is fantastic...

Tonight is Spiderman 2... which I'm recording in HD...

Thanks for the reminder, just started recording before going to work! :blink:

#17 Shutterbug

    AV Forum Member

  • Member
  • 186 posts

Posted 07 May 2007 - 04:18 PM

Then we need a commitment from the networks to an HD standard and to broadcast true HD material. Also, I believe there isn't much in the way of 5.1 surround broadcast with HD either, despite all the hype some years back. Will it ever happen?

#18 Dave_L

    AV Forum Member

  • Member
  • 293 posts

Posted 07 May 2007 - 04:51 PM

Uncompressed HD (1920x1080) = 1240 Mbit/s


Just a query....

Assuming this calculation above is for 1080p, perhaps this goes some way to explaining why 1080i seems to be more "broadcast-friendly", being 1920 x 540 (or thereabouts) ??

#19 JPP

    AV Forum Member

  • Member
  • 1,106 posts

Posted 07 May 2007 - 05:29 PM

Just a query....

Assuming this calculation above is for 1080p, perhaps this goes some way to explaining why 1080i seems to be more "broadcast-friendly", being 1920 x 540 (or thereabouts) ??

As far as I understand it, from a broadcaster's point of view, transmitting interlaced video is always a better option, as it halves the required bit rate for a given "resolution or sharpness" of a moving image compared to non-interlaced or progressive. It just means that the end result you see on your screen is very much a function of how good your display's or external video processor's de-interlacing circuitry is. So, SBS for example has a typical bit rate of 5-6 Mb/s on SD at 576i, or double that at around 10 Mb/s for its "HD" channel at 576p. The image is no sharper at 576p than it is at 576i, but 576p may look better depending on the treatment the signal undergoes by the display's de-interlacer.
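
To put rough pixel arithmetic on the "halves the bit rate" point (uncompressed, and ignoring colour subsampling - just an illustration):

# 576i50 sends 50 half-height fields per second; 576p50 sends 50 full
# frames per second, so progressive at the same rate costs twice the pixels.
def pixels_per_second(width, height, rate):
    return width * height * rate

i50 = pixels_per_second(720, 288, 50)  # 10,368,000 px/s for 576i50
p50 = pixels_per_second(720, 576, 50)  # 20,736,000 px/s for 576p50
print(p50 / i50)  # 2.0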

The ABC HD is 720p at around 10-12 Mb/s. Again, the resultant image is little if any sharper than the SD version, but this time we have more horizontal and vertical picture elements, thus requiring our displays to do less scaling to bring it to their native resolution, typically 768 x 1366 or thereabouts, and of course there's no requirement for de-interlacing.

When we come to 1080i, we have increased our pixel resolution and, at 15 Mb/s, get a reasonable job of a sharp image. Most times though, we see 1080 x 1440, not 1080 x 1920. If we wanted 1080p, we would require twice the bandwidth or bit rate, as of course we would now transmit twice the amount of information in a given time, at 50 frames per second rather than 25. Again, how good the picture we see is depends on the quality of the display's de-interlacing capabilities - motion/edge-detecting technology giving a better image than simple bob de-interlacing. What also helps in giving a better picture generally than either 576i/p or 720p is that the display only has to downscale (for displays of lesser resolution than 1080 x 1920). Good quality (external) scalers make the difference between 576i and 1080i very small indeed for the typical 768 x 1366 display. 720p is probably the sweet spot for this size/resolution display.
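
And to illustrate why 720p sits so well on these panels, a quick Python check of how far each broadcast format is from a typical 1366 x 768 display (vertical scale factor only - just a rough figure):

# Vertical scale factor needed to map each broadcast format onto a
# 1366x768 panel (above 1.0 = upscale, below 1.0 = downscale).
PANEL_HEIGHT = 768

for name, src_height in [("576i/p", 576), ("720p", 720), ("1080i", 1080)]:
    factor = PANEL_HEIGHT / src_height
    print(f"{name}: {factor:.2f}x")  # 1.33x, 1.07x (near native), 0.71x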

#20 Owen

    AV Forum Member

  • Senior Member
  • 11,949 posts

Posted 07 May 2007 - 06:04 PM

As far as I understand it, from a broadcaster's point of view, transmitting interlaced video is always a better option, as it halves the required bit rate for a given "resolution or sharpness" of a moving image compared to non-interlaced or progressive. It just means that the end result you see on your screen is very much a function of how good your display's or external video processor's de-interlacing circuitry is. So, SBS for example has a typical bit rate of 5-6 Mb/s on SD at 576i, or double that at around 10 Mb/s for its "HD" channel at 576p. The image is no sharper at 576p than it is at 576i, but 576p may look better depending on the treatment the signal undergoes by the display's de-interlacer.

The ABC HD is 720p at around 10-12 Mb/s. Again, the resultant image is little if any sharper than the SD version, but this time we have more horizontal and vertical picture elements, thus requiring our displays to do less scaling to bring it to their native resolution, typically 768 x 1366 or thereabouts, and of course there's no requirement for de-interlacing.

When we come to 1080i, we have increased our pixel resolution and, at 15 Mb/s, get a reasonable job of a sharp image. Most times though, we see 1080 x 1440, not 1080 x 1920. If we wanted 1080p, we would require twice the bandwidth or bit rate, as of course we would now transmit twice the amount of information in a given time, at 50 frames per second rather than 25. Again, how good the picture we see is depends on the quality of the display's de-interlacing capabilities - motion/edge-detecting technology giving a better image than simple bob de-interlacing. What also helps in giving a better picture generally than either 576i/p or 720p is that the display only has to downscale (for displays of lesser resolution than 1080 x 1920). Good quality (external) scalers make the difference between 576i and 1080i very small indeed for the typical 768 x 1366 display. 720p is probably the sweet spot for this size/resolution display.


There is a common misconception regarding 1080p, because there is an assumption that 1080p is 50 or 60 frames per second. The only 1080p format used for distribution is 24fps (or 25fps in Oz); 1080p 24 requires less than the data rate of 1080i 50, and quite a lot less than the data rate required for 1080i 60.

1080p 24 transfers 49,766,400 pixels per second, or 24 1920x1080 frames per second.
1080i 50 transfers 51,840,000 pixels per second, or up to 25 1920x1080 frames per second.
1080i 60 transfers 62,208,000 pixels per second, or up to 30 1920x1080 frames per second.

Uncompressed data rates are:

1080p 24 = 1194 Mbps
1080i 50 = 1244 Mbps
1080i 60 = 1492 Mbps

The fact that interlaced transmission uses two fields to convey a progressive frame is irrelevant.
Interlaced transmission also allows a true interlaced video source from an interlaced video camera to be conveyed. This is deinterlaced to 1080p 50 or 1080p 60, and has much better motion portrayal than 1080p 24/25.

576p 50 is a useless format; it conveys no more resolution than 576i 50, as the only source is either 25fps progressive (which can be carried just fine in 576i 50) or 576i/1080i 50.
The deinterlacer used to generate the 576p 50 at the TV networks is of the very ordinary bob type, which simply upscales each 720x288 field to a 720x576 frame and delivers them at field rate, or 50fps. This is a total waste of bandwidth, as the deinterlacing job can be done better at the user end, preferably with a much better system than bob.
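
If anyone is wondering what "bob" actually does, here's a bare-bones numpy illustration (simple line doubling shown - real bob deinterlacers usually interpolate between lines rather than repeat them):

# Bob deinterlace: each 720x288 field is stretched on its own to a full
# 720x576 frame and delivered at field rate (50fps). No information from
# the other field is used, which is why doing it before transmission
# doubles the bandwidth for nothing.
import numpy as np

def bob(field):
    """Upscale one half-height field to a full-height frame."""
    return np.repeat(field, 2, axis=0)  # double every line vertically

field = np.zeros((288, 720), dtype=np.uint8)
print(bob(field).shape)  # (576, 720) - one such frame per field, 50 per second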

#21 AndrewWilliams

    AV Forum Member

  • Member
  • 874 posts

Posted 07 May 2007 - 07:03 PM

Owen is spot on. As a transmission format, 1080i50 is superior to 1080p25 since 1080i50 can carry 50 fields/second or 25 full frames/second. It's for that reason that people should only talk about 1080p as being 50p or 60p - not 25p since 1080i50 can handle that.

1080p50 = 2488 Mbit/s.....ouch (and that's why 1080p isn't in the DTV spec)

Incidentally, I was wrong about the DTV 720p spec - it's actually 50p, so it would be 1106 Mbps uncompressed, which isn't much less than 1080i50 at 1244 Mbps.
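
Same arithmetic as my earlier post, if anyone wants to check those figures (24 bits per pixel assumed):

# Uncompressed data rates for the progressive formats (24 bits/pixel).
def uncompressed_mbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e6

print(f"1080p50: {uncompressed_mbps(1920, 1080, 50):.0f} Mbit/s")  # ~2488
print(f"720p50: {uncompressed_mbps(1280, 720, 50):.0f} Mbit/s")    # ~1106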

#22 JPP

    AV Forum Member

  • Member
  • 1,106 posts

Posted 07 May 2007 - 07:24 PM

The deinterlacer used to generate the 576p 50 at the TV networks is of the very ordinary bob type, which simply upscales each 720x288 field to a 720x576 frame and delivers them at field rate, or 50fps. This is a total waste of bandwidth, as the deinterlacing job can be done better at the user end, preferably with a much better system than bob.

Good points re 1080p/24-25 and 1080i/50, Owen. I wonder why SBS is persisting with its 576p format. For the same bandwidth, they may as well go to either 720p or 1080i. Studio equipment not up to it?

Phil.

#23 Owen

    AV Forum Member

  • Senior Member
  • 11,949 posts

Posted 07 May 2007 - 07:24 PM

Unfortunately 1080p 50/60 does not exist, so whenever 1080p is discussed we should assume 1080p 24.

#24 AndrewWilliams

    AV Forum Member

  • Member
  • 874 posts

Posted 07 May 2007 - 09:00 PM

Unfortunately 1080p 50/60 does not exist

Not quite true, since the PS3/Xbox 360/PC are capable of playing games at 1080p 50/60, but yeah, there's no mainstream video content in that format and there won't be for a long time to come.

#25 Dave_L

    AV Forum Member

  • Member
  • 293 posts

Posted 08 May 2007 - 07:49 AM

There is a common misconception regarding 1080p, because there is an assumption that 1080p is 50 or 60 frames per second. The only 1080p format used for distribution is 24fps (or 25fps in Oz); 1080p 24 requires less than the data rate of 1080i 50, and quite a lot less than the data rate required for 1080i 60.

1080p 24 transfers 49,766,400 pixels per second, or 24 1920x1080 frames per second.
1080i 50 transfers 51,840,000 pixels per second, or up to 25 1920x1080 frames per second.
1080i 60 transfers 62,208,000 pixels per second, or up to 30 1920x1080 frames per second.

Uncompressed data rates are:

1080p 24 = 1194 Mbps
1080i 50 = 1244 Mbps
1080i 60 = 1492 Mbps

The fact that interlaced transmission uses two fields to convey a progressive frame is irrelevant.
Interlaced transmission also allows a true interlaced video source from an interlaced video camera to be conveyed. This is deinterlaced to 1080p 50 or 1080p 60, and has much better motion portrayal than 1080p 24/25.

576p 50 is a useless format; it conveys no more resolution than 576i 50, as the only source is either 25fps progressive (which can be carried just fine in 576i 50) or 576i/1080i 50.
The deinterlacer used to generate the 576p 50 at the TV networks is of the very ordinary bob type, which simply upscales each 720x288 field to a 720x576 frame and delivers them at field rate, or 50fps. This is a total waste of bandwidth, as the deinterlacing job can be done better at the user end, preferably with a much better system than bob.


Thanks for the explanation Owen, that's great.

I'm just trying to get my head around all this. Am I right, reading into what you are saying, that 1080i/50 gives you the advantage of both:
a) a 1080 image*
b) a much better frame rate of 50 frames/sec (giving a better portrayal of motion)?*
*both assuming they are de-interlaced suitably

Whereas 1080p/24 (albeit already a progressive source image) is only 24 (or 25, in Oz) fps - half the refresh rate of 1080i, and therefore not as good for capturing motion video?

Am I on the right track ??