
Super Definition:

THE ULTIMATE TV STANDARD for the home?

4K: 3840 by 2160 @ 72 Hz

Over the last two decades of pioneering research and development, European, American and Japanese broadcast engineers have been hard at work evaluating systems and formats for HDTV.

Out of that research, it was suggested that the eye cannot differentiate the line structure of a 2200-line picture at a distance of 3.3 - 3.5 times picture height (about the optimum distance people sit from a cinema screen), nor notice temporal flicker at a frame rate of about 70 Hz - the rate used by PC displays. Compare this with the 5 - 10 times picture height needed for comfortable viewing of SDTV displays.
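A rough sketch of the geometry behind those figures, assuming (as a rule of thumb, not a figure from the original research) that the eye resolves about one arcminute of visual angle per line pair:

    import math

    # How many scan lines can the eye just resolve for a picture viewed at a
    # given multiple of its own height?  Assumes ~1 arcminute per resolvable
    # line pair (two scan lines) - a common rule of thumb.
    def resolvable_lines(distance_in_heights, arcmin_per_line_pair=1.0):
        angle_rad = 2 * math.atan(0.5 / distance_in_heights)   # vertical angle subtended
        angle_arcmin = math.degrees(angle_rad) * 60
        return 2 * angle_arcmin / arcmin_per_line_pair         # two lines per line pair

    for d in (3.3, 3.5, 5.0, 10.0):
        print(f"{d:>4}x picture height -> about {resolvable_lines(d):.0f} lines")

At 3.3 times picture height this comes out at roughly 2100 lines - the same ballpark as the 2200-line figure above.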

The film and movie industry uses the standard 'academy' picture ratio of 4:3 for 35mm film before processing. Any picture enhancement is done electronically using what is known as '4K' equipment for digital image processing - that is, 4096 x 3072 pixels - since at that resolution film grain becomes a significant artifact and fine detail begins to roll off gently as well. The picture is then cropped to a particular ratio for display, e.g. 2.33:1 or 1.77:1.
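Some quick arithmetic on that 4096 x 3072 scan shows how much picture survives the crop (figures are illustrative, using the ratios quoted above):

    # Crop the full 4:3 'academy' 4K scan down to a chosen display ratio and
    # see how many lines of picture remain.
    FULL_WIDTH, FULL_HEIGHT = 4096, 3072

    def cropped_height(width, aspect_ratio):
        """Picture height left after cropping to width:height = aspect_ratio:1."""
        return round(width / aspect_ratio)

    for ratio in (4 / 3, 1.77, 2.33):
        print(f"{ratio:.2f}:1 crop -> {FULL_WIDTH} x {cropped_height(FULL_WIDTH, ratio)} pixels")

A 1.77:1 crop leaves roughly 2300 active lines - not far above the 2160 lines discussed next.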

If the current highest HDTV resolution were doubled in lines (giving four times the pixel rate) to 3840 x 2160, it would sit comfortably within the 4K specification and very close to the perceptual figure of 2200 lines. Indeed, using a camera sensor with a native resolution of 2160 lines and downscaling would give the maximum possible definition from current HDTV, overcoming the resolution limitations of cameras that sample natively at 1080 lines (per sampling theory, Shannon et al.). In Japan, 4K or 'cinema' resolution is available NOW!
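The pixel arithmetic behind 'doubled in lines, four times the pixel rate' is straightforward:

    # Pixel counts for current 1080-line HDTV, the proposed 3840 x 2160 raster,
    # and the full 4096 x 3072 academy 4K scan (same frame rate assumed).
    formats = {
        "HDTV 1080 lines":    (1920, 1080),
        "Proposed 3840x2160": (3840, 2160),
        "Academy 4K scan":    (4096, 3072),
    }

    base_pixels = 1920 * 1080
    for name, (w, h) in formats.items():
        pixels = w * h
        print(f"{name:<20} {w} x {h} = {pixels / 1e6:5.2f} Mpixel "
              f"({pixels / base_pixels:.1f}x 1080-line HDTV)")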

Film is shot at 24 frames per second (fps). This speed was chosen in the early 20th century largely on cost grounds: film stock was then expensive, and 48 fps would have meant double the cost. For fast camera pans and movement, the mismatch between the frame rate and the motion shows up as flicker and judder on the display. In the cinema the film is projected at its native 24 fps, but the flicker between frames would be very noticeable, so a trick (another compromise!) is employed whereby the projector shows each frame two or three times in the same period, giving effective refresh rates of 48 or 72 Hz.

Telecine equipment is used to broadcast film electronically via television, DVD and so on, each frame being scanned electronically for transmission. In Europe, film can be run slightly fast at 25 fps to match the transmission standard, whilst in the US, Japan, Canada and the like, a method called 'pull-down' is employed to match the 30 fps rate those countries use. (Pull-down can also be used in 50 Hz countries on a 24+1 basis, where one additional frame is repeated each second.) Flicker on television is reduced in a similar manner to film, because a TV picture is transmitted in two passes (fields) per frame: at a rate of 50 Hz in Europe and 60 Hz in the USA and elsewhere. Compromises based on a compromise based on a historical compromise!
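A minimal sketch of the standard 2:3 pull-down cadence - the usual way 24 fps film is fitted to 60 fields per second (interlaced field ordering and the NTSC 0.1% slow-down are left out):

    # 2:3 ('pull-down') cadence: alternate film frames are held for two and
    # three video fields, so 24 film frames fill the 60 fields of one second
    # of 60 Hz television.
    def two_three_pulldown(film_frames):
        fields = []
        for i, frame in enumerate(film_frames):
            repeats = 2 if i % 2 == 0 else 3     # 2 fields, then 3, then 2 ...
            fields.extend([frame] * repeats)
        return fields

    one_second_of_film = [f"F{n}" for n in range(1, 25)]   # frames F1..F24
    fields = two_three_pulldown(one_second_of_film)
    print(f"{len(one_second_of_film)} film frames -> {len(fields)} fields")
    print("First ten fields:", fields[:10])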

During the 1990s, the ATSC and European HDTV standards committees finally agreed on a common set of picture definitions for HDTV, based on 120-line intervals. The frame rates all differed bar one - 24 Hz, the original film rate! Standards built on compounded historical anachronisms. Bearing in mind that electronic television is a young technology that has only been in existence for 70 years, could there ever be an optimum frame rate for a world TV standard? The broadcast equipment specialists Snell & Wilcox have demonstrated good experimental HD picture results at about 75 Hz - close to the preferred PC display rate and to the eye's optimum temporal rate. So is that the solution for an optimum frame rate for a world TV standard - free of the old political squabbles over 50/60 Hz, by choosing a DIFFERENT standard altogether: 70, 72 or 75 Hz? Sure, I doubt it too.
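One point in favour of 72 Hz among those candidates: it is the only one that is an exact multiple of the 24 fps film rate, so film could be shown by simple frame repetition, just as the cinema projector does. A few lines of Python make the comparison:

    # Which display rates can show 24 fps film by straightforward frame
    # repetition, with no pull-down or speed-up?
    for rate in (50, 60, 70, 72, 75, 100):
        if rate % 24 == 0:
            print(f"{rate} Hz: each film frame shown exactly {rate // 24} times")
        else:
            print(f"{rate} Hz: not a multiple of 24 - needs pull-down or speed-up")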

However, there is no reason why displays - like the cinema projector - cannot show pictures at a faster rate than they are transmitted; hence the use of '100 Hz' displays, which repeat fields or frames to reduce any primary 'output' flicker introduced by the display.

Television as a medium is only 70 years old; it is in its infancy as a technology. What a fitting tribute it would be for a future common international standard to be developed with backwards compatibility with the current 24 fps, 50i/p (via 48 fps) and 60i/p pull-down standards, which will inevitably become 'legacy' standards.

Moreover, there is no reason why future displays could not be engineered to 'reality definition' standards, with current formats scaled up to match - given the current rate of change in technology. And given that HDTV transmission has halved in bit rate with the introduction of MPEG-4 and the second-generation DVB-S2 and DVB-T2 formats, 4K looks viable as a third-generation system.
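A back-of-envelope version of that argument, assuming (optimistically) that each new generation of compression and transmission roughly halves the bits needed per pixel:

    # Relative bit rates, taking a first-generation (MPEG-2 era) HD service as 1.0.
    # Pure ratio arithmetic - no real-world bit rates are quoted.
    pixel_ratio = (3840 * 2160) / (1920 * 1080)      # 4x the pixels of 1080-line HD

    def relative_bit_rate(generation, pixels=1.0):
        """Bit rate vs gen-1 HD, halving bits-per-pixel each generation."""
        return pixels * 0.5 ** (generation - 1)

    print(f"Gen 2 HD (MPEG-4, DVB-S2/T2):           {relative_bit_rate(2):.2f}x gen-1 HD")
    print(f"Gen 3 4K (one more halving, 4x pixels): {relative_bit_rate(3, pixel_ratio):.2f}x gen-1 HD")

On that scaling, a third-generation 4K service would need roughly the same channel capacity as the original HD services did.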

So: why not 4K Super Definition TV (3840 x 2160 @ 72 Hz)?

Anyone?

 