Which is perceptually bigger: a tabletop-sized home video screen or the screen of a movie theater? Which offers a greater range of shades of red: the 1953 U.S. color TV standard or the modern standard used for HDTV? The answers could be a matter of life or death.
At issue is whether there is a particular thing called HDTV and, if so, what effects it has. It may seem odd to ask the first question at a time when multiple broadcast, cable, and satellite channels say they offer HDTV daily, the Consumer Electronics Association reports annual factory sales of millions of HDTV sets, the exhibit floors of the annual NAB convention are filled with products labeled HDTV, and the term is commonly used even in the U.S. Congress.
So perhaps it would be easier to tackle the second question first. If HDTV exists, what do we know about it?
The name immediately suggests one characteristic. HDTV is TV with high definition. But what is the definition of high definition?
The Advanced Television Systems Committee (ATSC) offers one. ATSC standard A/53C says HDTV “has a resolution of approximately twice that of conventional television in both the horizontal (H) and vertical (V) dimensions and a picture aspect ratio (H x V) of 16:9.”
Conventional U.S. broadcast television has a maximum horizontal resolution of about 440 lines and about 480 scanning lines from top to bottom, so twice that would be 880 x 960. Stretching that 4:3 frame to 16:9 (multiplying the horizontal count by 4/3) would make it about 1173 x 960, not matching any current HDTV format.
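The doubling-and-stretching arithmetic above can be checked in a few lines (a sketch; the 440- and 480-line figures are the approximations given in the text):

```python
# Sketch: double conventional-TV resolution, then widen from 4:3 to 16:9.
conv_h, conv_v = 440, 480                        # approximate conventional resolution
doubled_h, doubled_v = conv_h * 2, conv_v * 2    # 880 x 960
# Widening the frame from 4:3 to 16:9 multiplies the horizontal
# count by (16/9) / (4/3) = 4/3.
stretched_h = round(doubled_h * (16 / 9) / (4 / 3))
print(doubled_h, doubled_v, stretched_h)  # 880 960 1173
```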
The standards of the Society of Motion Picture and Television Engineers (SMPTE) that define HDTV call for 1920 x 1035 (also not matching current formats), but other SMPTE standards refer to video at 1920 x 1080 and 1280 x 720. The European Telecommunications Standards Institute recently published a standard declaring even some forms of 480-line signals to be HDTV.
ABC, ESPN, and Fox transmit a 720-line version of HDTV; other American HDTV networks use 1080-line versions. But the 720-line group usually transmits roughly 60 progressively scanned images per second, while the others usually transmit roughly 30 interlaced frames per second. It’s “usually” because sometimes both transmit 24 progressively scanned images per second, in which case the 1080-line type of HDTV offers 2.25 times the picture elements per image of the 720-line type — more than double.
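The picture-element comparison can be made concrete (a sketch using the frame sizes and rates mentioned above):

```python
# Sketch: per-image and per-second picture-element counts for the two
# U.S. HDTV transmission formats discussed above.
pixels_720 = 1280 * 720      # 921,600 per image
pixels_1080 = 1920 * 1080    # 2,073,600 per image
# At 24 progressive images per second, per-image counts compare directly:
print(pixels_1080 / pixels_720)   # 2.25 -- more than double
# At the usual rates (~60 progressive vs. ~30 interlaced), the
# per-second throughputs are much closer:
print(pixels_720 * 60, pixels_1080 * 30)
```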
Shift from standards to equipment, and there are even more differences. Consumer TVs called HDTV sometimes have a 16:9 aspect ratio, sometimes 4:3, and sometimes neither. Their resolutions also vary wildly.
The same is true of professional equipment. The Panavision Genesis captures images on a single Sony camera chip said to have more than 12.4 million photo-sensors (5760 x 2160) and records 1920 x 1080 on a Sony HDTV recorder. Sony’s HVR-Z1U camcorder, on the other hand, captures images on one-million-sensor (960 x 1080) chips (three of them) and records 1440 x 1080 on its recorder (the extra horizontal resolution is derived from a half-sensor offset between red and blue in one position and green in the other). Nevertheless, both fall into the category of HDTV camcorders.
Sony’s HDCAM format records the same 1440 x 1080 of luma (black-and-white detail information), with 480 x 1080 of color resolution. Panasonic’s DVCPRO HD (in 1080-line mode) records 1280 x 1080 of luma and 640 x 1080 of color. Both companies also sell equipment that records full 1920 x 1080 HDTV, and Panasonic equipment can also record yet other HDTV formats.
ABC is a 720-line HDTV network, but ABC-affiliate WFAA-DT in Dallas transmits only 1080-line HDTV, converting the network signals. Pioneer once sold a digital-TV reception box with only a 1080-line HDTV output, but the same company’s contemporary plasma TVs converted it in the opposite direction to 720-line.
None of that would make any difference if there were not a sense that there is something called HDTV that is unique. Since the earliest days of television there have been different image and recording formats and quality levels. In fact, there are standard-definition cameras with more sensors per scanning line than some current HDTV cameras have. Users simply chose among varying levels of quality, economy, and other characteristics.
Then came HDTV — or at least the current sense of HDTV (see sidebar “H is for History”). It was said to be not merely quantitatively different from what came before but also qualitatively different. HDTV has often been characterized as offering imagery that looks like the view out a window. But, if HDTV just has more detail than ordinary TV, why isn’t ordinary TV similar to looking out a smaller and narrower-shaped window?
Perhaps it is. Or perhaps the window is not even smaller. Even expert viewers sometimes can’t tell the difference between HDTV and downconversions when real-world images are displayed. But the belief that HDTV is a single, unique form of television, despite all evidence that it has a broad range of quality levels, remains. And that belief has become harmful.
On June 12, The New York Times Magazine ran a piece by Clive Thompson called “Not Ready for Their Close-Up.” It began with a horror story.
A plastic surgeon reported on a potential patient. “She was only in her 30’s,” Thompson wrote, “and still looked terrific.” When asked why she wanted surgery, according to the doctor, “‘high-def’ was the first thing that came out of her mouth.” She was a newscaster and worried that HDTV would make her minor facial flaws stand out.
Thompson’s article offered some possible reasons for that worry. “Today’s new top-of-the-line HD televisions can display two million pixels, nearly 10 times the resolution of a regular, old-style TV set,” he wrote. “Also, the screens are the size of a tabletop.” And it’s not just the resolution.
“With high-def, more colors can be used,” Thompson wrote, “including some formerly forbidden shades of red — which means that blotches, zits and tiny nose-veins can be presented with the brutal clarity of a surgery textbook.” It’s no wonder the young, attractive newscaster was concerned. And there’s more.
Thompson also quoted HDTV pundit Phillip Swann on the way some celebrities look in the medium. Cameron Diaz’s looks “wither under the unblinking gaze of hi-def,” in Thompson’s words, because her face is “littered with unfortunate pockmarks,” in Swann’s.
Before you decide never to shoot in a high-resolution format, sit down, breathe in, and consider. Yes, home TV screens are now large enough to be considered tabletop size, but movie-theater screens are tremendously larger.
Cameron Diaz looks great even on the largest cinema screen, and movie film captures even more detail than does HDTV. And Consumer Reports said in March, as part of a review of plasma TVs, that “the best [non-HD] set looked just as good with HD content as the HD sets,” suggesting that viewers can’t really perceive HDTV detail at normal viewing distances. Can it be that there is something about HDTV other than screen size or resolution that makes people look bad? Perhaps it’s those red colors.
Unfortunately, there’s a problem with that theory, too. The common chromaticity-diagram coordinates of the red primary of the original 1953 U.S. color system are x = .67, y = .33. The coordinates of the red primary used for HDTV (and for modern standard-definition color) are x = .64, y = .33. That means that, if anything, the old color system could show deeper, more-saturated reds than the new one.
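That comparison can be made explicit by measuring each red primary’s distance from a reference white on the chromaticity diagram (a sketch; D65 is assumed here as the reference white for both systems, though the 1953 system actually specified Illuminant C — the comparison comes out the same either way):

```python
# Sketch: distance of each red primary from a reference white on the
# CIE x,y chromaticity diagram -- farther means a more saturated red.
red_1953 = (0.67, 0.33)    # original 1953 U.S. color-system red primary
red_hdtv = (0.64, 0.33)    # HDTV / modern-SD red primary
white = (0.3127, 0.3290)   # D65 white point (assumed reference)

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

# The 1953 primary lies farther from white, i.e. it is the deeper red.
print(distance(red_1953, white) > distance(red_hdtv, white))  # True
```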
So why should HDTV be a problem? Perhaps it isn’t.
In 1994, the National Academy of Television Arts & Sciences presented engineering-achievement Emmy awards to BTS and Ikegami for the development of “controlled edge enhancement utilizing skin hue keying.” The technology is better known as “skin detail,” and it helps hide facial flaws without affecting the sharpness of anything else in the image.
Even non-HDTV can capture imagery with the “brutal clarity of a surgery textbook.” But it can also, in the hands of those skilled in the art and craft of videography, help people look their loveliest. And so can HDTV.
Skin detail is just one of the many tools used in the craft of video control. Videographers may also use filters and lighting to give faces a desired look, and then there’s makeup.
The Metropolitan Opera has had a great deal of experience with both ancient and modern HDTV. It has been shooting in the modern, thousand-line system since 1990.
Das Rheingold was particularly challenging for their makeup department. One of the characters is a frog-like creature, and the singer wore a bathing-cap-like hair covering and was covered in green makeup. The opera lasts two-and-a-half hours in one continuous act, precluding frequent touch-ups of the makeup.
The singing, cavorting, hot lights, costume, makeup, and tight cap all led the performer to sweat, and the sweat eventually made the line between cap and skin visible. The television director had to take great care to ensure that camera angles and zoom ranges prevented the line from interfering with viewers’ suspensions of disbelief.
So, does that mean that HDTV is a problem? No. Das Rheingold happens to have been one of the operas not shot in HDTV. But it posed the Met’s most challenging video-makeup problem.
There are certainly issues to be dealt with in the new technology — shooting for multiple aspect ratios, choosing colorimetry, combining multiple formats, etc. But it’s not cause to seek plastic surgery. Experimenting with camera settings, lens filters, lighting, and makeup is a much better idea.
With apologies to Hughes Mearns:
As HDTV began to air
Problems were found that weren’t there.
They’re still not there despite what some say.
So shoot some HDTV today.
H is for History
Broadcasting magazine proclaimed, “The exposition’s opening on April 30 also marked the advent of this country’s first regular schedule of high-definition broadcasts.” The year was 1939, and the “HDTV” being described had just 84% of the vertical resolution of even today’s non-HDTV broadcasts. That was still considerably more than the resolution of the world’s first HDTV standard, issued in Britain in 1937, which, itself, was considerably more than what a parliamentary committee had defined as HDTV in 1935.
Current American HDTV can be traced to a Federal Communications Commission (FCC) inquiry initiated in 1987, which was based on some U.S. demonstrations that used Japanese equipment. That equipment was developed based on a project on thousand-line television that began at the Japan Broadcasting Corporation (Nippon Hoso Kyokai, NHK) in the late 1960s.
NHK didn’t stop at a thousand lines. With 1920 x 1080 being called HDTV today, NHK’s 3840 x 2160 system became known as SHDTV, the S standing for super. Then SHDTV was overtaken by what has been called UHDTV, with 7680 x 4320 resolution, the U standing for ultra.
UHDTV is now more commonly called UDTV and is in its fourth generation of development. When was it first shown?
The idea of thousand-line television didn’t start with NHK. Shortly after the BBC began broadcasting what we might today call 377i HDTV in the 1930s, there was a proposal to shift to thousand-line broadcasts; it was rejected on a cost basis. Something similar happened in the U.S. France actually broadcast TV with more lines in the 1940s than it did in 2004. And in 1985, two years before the FCC inquiry into HDTV, NHK was already demonstrating a form of UHDTV.
Ah, well. Sunrise is also earlier in Japan than here.