
October Is for Presentations

October 1st, 2014 | Posted in Schubin Cafe, Schubin Snacks

Welcome to October, once the eighth month of the year (thus the “octo”). This year, when I think of October, I think of presentations.


First, let me remind you that proposals for presentations at the 2015 HPA Tech Retreat in February are due by Friday, October 24. You’ll find the call for presentations here: http://www.schubincafe.com/2014/09/02/2015-hpa-tech-retreat-call-for-submissions/ And, if you’re not at all familiar with the event, you’ll find an overview from 2010 here: http://www.schubincafe.com/2010/01/15/someone-will-be-there-who-knows-the-answer/

After you’ve decided whether and what you want to submit for the main program at the HPA Tech Retreat, you might want to consider some of the presentations I’ll be doing in October. The first is also at an HPA (Hollywood Post Alliance) event, an all-day symposium called “Making Do With More” that will kick off the Society of Motion Picture and Television Engineers (SMPTE) 2014 Annual Technical Conference and Exhibition. The symposium will be held at Hollywood’s historic El Capitan Theatre.

I’ll begin the symposium with a presentation called “So Tell Me More About More.” It will cover such topics as more spatial-detail resolution, more frames per second, and more dynamic range from the standpoints of physics, psychology, and psychophysics (the last being human sensations caused by physical stimuli). The program will then continue with some of the greatest creative minds working today. Here’s the program for the symposium: https://www.smpte.org/smpte2014/symposium-program And don’t forget the rest of the SMPTE program over the next three days.

On October 30, I’ll be giving a free webinar with my eclectic view of another recent event, the International Broadcasting Convention (IBC) in Amsterdam last month. On exhibit floors that had products ranging from 8K cameras to automatic captioning, why were many visitors excited about Skype? At a conference where the title of one presentation began “Minimising nonlinear Raman crosstalk,” why did one press report comment on cinema-auditorium lighting and the gross receipts of one episode of one TV show? I’ll discuss those and more, including a citizen-journalism ecosystem, a 4K camera that can directly use long-range zoom lenses, a 3D display that doesn’t require either special glasses or a sweet viewing spot, the Holo-Deck, an immersive egg, the ability to zoom and dolly in post, and a fully accredited Wile E. Coyote.

The webinar will run for one hour, starting at 9 am Pacific, noon Eastern, and 1600 GMT. Registration is here: https://attendee.gotowebinar.com/register/5011577659299919617

One unusual item at IBC 2014 was a special award to the Wiener Staatsoper (the Vienna State Opera) for its pioneering work in 4K live streaming directly to TVs with multiview capability, subtitles in selectable languages, and even a synchronized music score. In 2009, the Metropolitan Opera won IBC’s highest award, the International Honour for Excellence. How long has opera been at the forefront of media technology? Would you believe since the 17th century?


For National Opera Week 2014, I’ll be giving two free illustrated talks. One, on October 30, at the College of Arts and Letters at Stevens Institute of Technology in Hoboken, NJ, is called “The Fandom of the Opera: How the Audience for a Centuries-Old Art Helped Create Modern Media Technology,” and it will cover developments from 16th-century printing through beyond-21st-century neutrino communications, quantum entanglement, and live holography with haptic feedback. The talk will be in the Kidde building, room 350, starting at 6 pm. You’ll find an illustrated poster here: http://www.schubincafe.com/wp-content/uploads/2013/10/Schubin-Stevens-Fandom-poster.pdf

The other National Opera Week talk has only a little to do with media technology: for roughly half a century, beginning in 1885, baseball fans went to opera houses to watch live remote games. And, due to a lack of media technology, opera fans trashed the infield of a ballpark in 1916. Those are just two tidbits from an illustrated talk I’ll be giving at the GoingGoingGoneSports gallery in the Atrium at Citigroup Center (153 E 53rd Street) in New York on Wednesday, October 29. The talk will be at 7 pm, but the gallery will be open for the special baseball & opera event from 6 pm to 9 pm, providing the opportunity to view artworks created from artifacts from some of America’s greatest games. You’ll find an illustrated poster here: http://www.schubincafe.com/wp-content/uploads/2013/10/Opera-Baseball-poster.pdf

The event is free, but reservations are requested. Please contact Neil Scherer at 646 285-1497 or njsart@msn.com.

I hope to see you later this month.


“UHD: Beyond the Hype,” CCW+SATCON, Javits Convention Center, New York, November 12

September 11th, 2014 | Posted in Appearances

November 12, 2014, 11:30 am to 12:30 pm

“UHD: Beyond the Hype”

CCW+SATCON

Javits Convention Center, New York

http://ccw14.mapyourshow.com/6_0/sessions/session-details.cfm?ScheduleID=4


Artistic and Educational “Street” View

May 23rd, 2014 | Posted in Schubin Snacks, Today’s Special

James Nares's Street

Are you in or around Washington, D.C.? Will you be there anytime soon? If it’s before June 2, get yourself to the West Building of the National Gallery of Art for a peek at James Nares’s Street. It’s on the ground floor, on the west side, right next to the Kaufman furniture galleries.

Nares used a Vision Research Phantom Flex high-speed HD camera and an Angenieux Optimo lens shooting from a car moving at high speed through Manhattan streets. Then he slowed the sequences, added music, and created the art work. The results are exceptional from both artistic and techno-educational points of view.

From an artistic point of view, the piece explores conceptions of reality. Sometimes the people on the street appear to be actors in some special-effects commercial rather than real people doing real things. Lights, signs, and LCD screens flash on and off — as they actually do in real life, but too fast for us to notice. There’s a New York Times review from an earlier exhibit here: http://www.nytimes.com/2012/08/12/nyregion/in-james-naress-street-taming-the-galloping-city.html

From a techno-educational point of view, notice how crystal clear the material in focus is, even though the car was speeding by. Even in “4k” and “8k” ultra-high-definition demonstrations, you’ve probably never seen moving images this clear. Watch as signs move across the screen, even their fine print easily readable. If you’ve been wondering about the effects of higher spatial resolution vs. higher temporal resolution, this is must-see material.

If you can’t get to the National Gallery of Art, here’s a sample from the artist’s web site: http://www.jamesnares.com/index.cfm/film-video/street/.

Watch for a showing at a museum near you.


A Philosophical Look at NAB 2014 with Mark Schubin (video)

May 21st, 2014 | Posted in Download, Today’s Special

The 2014 NAB Show seemed to be a social-media, IP-connected, 4K event. What does that mean? Is broadcast dead? The serial-digital interface (SDI)? HDTV? The 2/3-inch camera format? Join Mark Schubin as he looks at what was shown on the exhibit floor and puts it into a larger context. No one at the NAB show introduced a new lens mount this year. Find out why Mark considers that significant as he takes you on a tour from the sublime to the — well, “interesting.”

Download the high resolution (1024×600) version of the recorded webinar at:
http://www.schubincafe.com/wp-content/uploads/2014/05/2014-05-20-12.02-A-Philosophical-Look-at-NAB-2014.wmv

Watch the recorded version of the webinar:


Live 4k Streaming (for opera, of course)

 

The first commercial digital sound recording was of an opera. The first live television subtitles were for opera. And, now, live 4k opera streamed over the Internet.


At 7 pm Central European Time on Wednesday, May 7, the Wiener Staatsoper (Vienna State Opera) will transmit Verdi’s opera Nabucco, with Placido Domingo in the title role. Elemental Technologies’ high-efficiency video coding (HEVC) will be used to stream the event over the Internet in 4k resolution, using MPEG-DASH, for viewing around the world. It will also be sent to a 65” Samsung UHD TV at the opera house. A Wiener Staatsoper app with built-in time shifting will allow users to view it live or at 7 pm in their local time zone.

Wiener Staatsoper produces more than 40 live broadcasts annually and is making almost all of its 2014/2015 season productions accessible to viewers via the Internet on smart TVs and mobile devices. “Our multiscreen offer, VOD [video-on-demand] services, and user-selectable two-channel live program provide new and exciting ways for fans to experience the arts with the highest levels of accessibility and artistry,” said Christopher Widauer, the opera company’s director of digital development. Elemental provides the technology for Wiener Staatsoper’s live and VOD streaming services and supports another of the opera company’s apps, which provides synchronized subtitles and even a synchronized music score. The 4k Nabucco workflow was designed by Elemental partner ETAS High Tech Hardware Systems GmbH, and the streams will be managed by Ooyala via Samsung applications.

Opera companies have a long history of technological development. Before Avatar, Opéra de Rennes transmitted Mozart’s Don Giovanni live to movie theaters in high-definition stereoscopic 3D. Believe it or not, opera was responsible for the invention of electronic home entertainment (1880), stereo sound transmission (1881), pay cable (1885), consumer headphones (no later than 1888), newscasts (1893), sound movies (1894), stereo broadcasting (1925), stereo networking (1973), and alternative content for stadium displays (2007). Almost no matter whom you pick as the inventor of movies (Edison, Jenkins, Le Prince), their purpose was opera (1886-88). And an opera house was responsible for the development of the techniques of sportscasting (1886). Really!

Opera was also present at the inception of electrical robotics (1894), broadcasting (1900), music synthesis (1906), entertainment radio (1906-7), television (1928-1934, proposed in 1882), live alternative content for cinema (1952, proposed in 1877 — before there was such a thing as cinema), widescreen movies (1952), and international satellite broadcasting (1967). In the 17th century, opera stage technology allowed complete scene changes to take place in full view of the audience in a matter of seconds; in the 21st century, opera companies are using live, interactive digital projection with edge stitching, image warping, and even real-time depth-plane selection.

Wiener Staatsoper is part of that tradition of technological innovation. Some of the first buildings lit by electricity were opera houses, and, because there were no power companies at the time, they had their own generators and shared their output. The first X-ray machine at Boston Children’s Hospital was powered from a local opera house. Before that, flame-based lighting could be dangerous, so Wiener Staatsoper had its own 21-person fire department and helped pioneer fire extinguishers, so they “could assure patrons of artistic performances that were both stunning and safe,” according to John Nemeth, VP of sales EMEA for Elemental. “Elemental is honored to support Vienna State Opera in its ongoing technology innovation to increase access to the arts.”


HPA 2014 – Resolution, Frame Rate, and Dynamic Range [video]

March 12th, 2014 | Posted in Download, Schubin Cafe, Today’s Special

 

Mark Schubin’s Resolution, Frame Rate, and Dynamic Range presentation from the HPA Tech Retreat, presented on February 20, 2014 (audio recorded later).

(Extended Version: Bang for the Buck: Data Rate vs. Perception in UHD Production by Mark Schubin at http://youtu.be/UG6q2xVkKU4)

Video (TRT 12:57)


Bang for the Buck: Data Rate vs. Perception in UHD Production

November 18th, 2013 | Posted in Download, Schubin Cafe, Today’s Special

 

Going beyond today’s television could involve higher resolution, higher frame rate, higher dynamic range, wider color gamut, stereoscopic 3D, and immersive sound. Do all provide the same sensation of improvement? Could some preclude the use of others? Which delivers the biggest “bang for the buck,” and how do we know?

Presented during Content & Communications World, November 13, 2013, Javits Center, New York.

Mark Schubin adds: “I neglected to describe all of the images on slide 19. The upper right image shows that, in a cinema auditorium, detail resolutions beyond HD might be visible to everyone in the audience, even in the last row. The ARRI Alexa camera, from the same company that provided that image, however, has only 2880 pixels across — less than “3k.” That hasn’t stopped it from being used in major motion pictures, such as Skyfall (shown on set in the left bottom image) or the top-grossing movie to date in 2013, Iron Man 3.”

Video (TRT 28:03)

 


Bang for the Buck

September 25th, 2013 | Posted in Schubin Cafe

Something extraordinary happened shortly after noon local time on Saturday, September 14, at the International Broadcasting Convention (IBC) in Amsterdam (aside from the fact that the fresh herring vendors were no longer wearing traditional klederdracht, Dutch folk dress). It could affect at least the short-term future of television. And a possible long-term future was also revealed at the event. IBC, however, featured not only the extraordinary but also the cute.

For many years, there seemed to be an unofficial contest to show the largest and smallest television vehicles. The “largest” title seems to have been won (and retired) at IBC 2007 by Euroscena’s MPC34, a three-truck, expanded, interconnected system including everything from a sizable production studio through edit suites, at least as wide as it was long; nothing exhibited since has come anywhere close. The “smallest” went from trucks to vans to tiny Smart cars to motorcycles to Segways. In an era in which a hand-held camera can stream directly to the world, it’s hard to claim a “smallest” title with a vehicle, so perhaps Mobile Viewpoint’s tiny scooters should be considered among the cutest, rather than the smallest.

Not far from those Mobile Viewpoint scooters, however, was another claimant for the “cutest” title, though it was neither new nor particularly small. In the BTS outdoor exhibit was Portuguese television broadcaster RTP’s first mobile unit, built by Fernseh GmbH (roughly translated: Television, Inc.) in an eight-meter-long Mercedes-Benz truck delivered to Lisbon in 1957. It did its first live broadcast the following February 9, continued in service through 1980, and was restored in 2006 (though it still has a top speed of only 76 kilometers per hour, about 47 MPH).

About a quarter of the length of the 1957 RTP mobile unit — more than is devoted to its control room — is occupied by multi-core camera-cable reels for the vehicle’s four cameras. Cabling is one area in which video technology has advanced tremendously in the last 56 years — except for one characteristic. Consider some products of another small IBC 2013 exhibitor, NuMedia.

In television’s early days, there were separate cables for video and sync, eventually becoming one in composite video. Color required multiple cables again; composite color brought it back to one. Digital video initially used a parallel interface of multiple wires; the serial digital interface (SDI) made digital video even easier to connect than analog, because a single coaxial connection could carry video, multi-channel audio, and other data. Then modern high definition arrived.

SDI carried 270 megabits per second (Mbps); HD-SDI, about 1.5 gigabits per second (Gbps). HD-SDI still used just one coaxial-cable connection, but usable cable lengths, due to the higher data rates, plunged. NuMedia offered a list of usable lengths for different cables, ranging from 330 meters for fat and heavy RG11 coax down to just 90 meters for a lighter, skinnier version. NuMedia’s HDX series uses encoders and decoders to more than double usable cable lengths (and offer a reverse audio path) — at a cost of more than $1800 per encoder/decoder pair. And that provides some background for the extraordinary event.
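
As an aside, the relationship between data rate and reach is roughly predictable: coax attenuation at these frequencies is dominated by skin effect, growing roughly with the square root of frequency, so each doubling of data rate cuts usable length by about a factor of 1.4. Here’s a minimal Python sketch of that rule of thumb, anchored to the 330-meter RG11 figure above; real reach also depends on the specific cable and the receiver’s equalization budget, so treat the numbers as illustrative:

```python
import math

def usable_length_m(rate_gbps, ref_length_m=330.0, ref_rate_gbps=1.485):
    """Rough coax-reach estimate, assuming skin-effect attenuation.

    dB/m grows roughly with the square root of frequency, so reach
    scales as sqrt(ref_rate / rate). Anchored to the 330 m RG11
    figure quoted above for HD-SDI (about 1.485 Gbps).
    """
    return ref_length_m * math.sqrt(ref_rate_gbps / rate_gbps)

for label, rate in [("SDI", 0.270), ("HD-SDI", 1.485),
                    ("3G-SDI", 2.970), ("6G-SDI", 5.940)]:
    print(f"{label:7s} {rate:.3f} Gbps -> ~{usable_length_m(rate):.0f} m")
```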

IBC hosted a conference session called “The Great Quality Debate: Do We Really Need to Go Beyond HD?” Although going “beyond HD” has included such concepts as a wider color gamut (WCG), higher dynamic range (HDR, the range between the brightest and darkest parts of the image), higher frame rates (HFR), stereoscopic 3D (S3D), and more-immersive sound, the debate focused primarily on a literal reading of HD, meaning that going beyond it would be going to the next level of spatial detail, with twice the fineness of HD’s resolution in both the horizontal and vertical directions.

The audience in the Forum, IBC’s largest conference venue, was polled before the debate started and was roughly evenly split between feeling the need for more definition and not. Then moderator Dr. William Cooper of informitv began the debate. On the “need” side were Andy Quested, head of technology for BBC HD & UHDTV; vision scientist Dr. Sean McCarthy, fellow of the technical staff at Arris; and Dr. Giles Wilson, head of the TV compression business at Ericsson. On the “no-need” side were Rory Sutherland, vice chair of the Ogilvy Group (speaking remotely from London); journalist Raymond Snoddy; and I.

The “need” side covered the immersive nature of giant screens with higher definition and their increased sensations of reality and presence (“being there”). Perhaps surprisingly, the “no-need” side also acknowledged the eventual inevitability of ever higher definition — both sides, for example, referred to so-called “16k,” images with eight times the spatial detail of today’s 1080-line HDTV in both the horizontal and vertical directions (64 times more picture elements or pixels). But the “no-need” side added the issue of “bang for the buck.”

At the European Broadcasting Union (EBU) exhibit on the show floor, some of that bang was presented in lectures about the plan for implementing UHDTV (ultra-HDTV, encompassing WCG, HDR, HFR, immersive sound, etc.). UHDTV-1 has a spatial resolution commonly called “4k,” with four times the number of spatial pixels of 1080-line HDTV. As revealed at the HPA Tech Retreat in February, EBU testing with a 56-inch screen viewed at a typical home screen-to-eye distance of 2.7 meters showed roughly a half-grade improvement in perceived image quality for the source material used. At the EBU’s IBC lectures, the results of viewer HFR testing were also revealed. Going from 60 frames per second (fps) to 120, doubling the pixels per second, yielded a full grade quality improvement for the sequences tested. In terms of data rate, that’s four times the bang for the buck of “4k” or “4K” (the EBU emphasized that the latter is actually a designation for a temperature near absolute zero).
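
To make the comparison concrete, here’s the arithmetic behind that “four times the bang for the buck” claim as a short Python check, using only the EBU grade figures just quoted (which apply only to the material the EBU tested):

```python
# Subjective grade improvement per data-rate multiplier, per the EBU tests above.
upgrades = [
    ("'4k' spatial detail (4x pixels)",    4.0, 0.5),
    ("120 fps HFR (2x pixels per second)", 2.0, 1.0),
]

for name, rate_mult, grade_gain in upgrades:
    print(f"{name}: {grade_gain / rate_mult:.3f} grades per data-rate multiple")
# 0.125 vs. 0.500 -- HFR delivers four times the quality gain per bit.
```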

IBC attendees could see for themselves the perceptual effects of HFR at the EBU exhibit or, even more easily, at a BBC exhibit in IBC’s Future Zone. Even from outside that exhibit hall, the difference between the images on two small monitors, one HFR and one not, was obvious to all observers.

The EBU hasn’t yet released perceptual-quality measurements associated with HDR, but HDR involves an even lower data-rate increase: just 50% to go from eight bits to twelve. If my personal experience with HDR displays at Dolby private demonstrations at both NAB and IBC is any indication, that small data-rate increase might provide the biggest bang-for-the-buck of all (although Pinguin Ingenieurbüro’s relatively low-data-rate immersive sound system in the Future Zone was also very impressive).

At IBC’s Future Zone, the University of Warwick showed HDR capture using two cameras, with parallax correction. Behind a black curtain at its exhibit, ARRI publicly showed HDR images from just one of its Alexa cameras on side-by-side “4k” and higher-dynamic-range HD monitors. Even someone who had previously announced that “4k” monitors offer the best-looking HD pictures had to admit that the HDR HD monitor looked much sharper than the “4k.”

HDR is contrast-ratio-related, and, before cameras, processing, and displays, lenses help determine contrast ratio. Sports and concerts typically use long-zoom-range lenses, which don’t yet exist for “4k.” A Fujinon “4k” 3:1 wide-angle zoom lens costs almost twice as much as the same manufacturer’s 50:1 HD sports lens. Stick an HD lens on a “4k” camera, however, and the contrast ratio of the finest detail gets reduced — LDR instead of HDR.

Then there are those cables. As in the change from SDI to HD-SDI, as data rate increases, useful cable length decreases. Going from 1080i to “4k” at the same number of images per second is an increase of 8:1 (so-called 6G-SDI can handle “4k” up to only 30 progressive frames per second). Going from 60 fps to 120 is another 2:1 increase. Going from non-HDR to HDR is another 1.5:1 increase, a total of 24:1, not counting WCG, immersive sound, or stereoscopic 3D (a few exhibits at IBC even showed new technology for the last). Nevertheless, Denmark’s NIMB showed a tiny, three-wheel multicamera “4k” production vehicle, perhaps initiating a new contest for largest and smallest.
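
For the record, here’s the multiplication behind that 24:1 figure as a quick Python check (nothing here beyond the ratios quoted above):

```python
# Cumulative data-rate multipliers for stacked "beyond HD" upgrades.
steps = [
    ("1080i60 to '4k' at the same image rate", 8.0),
    ("60 fps to 120 fps",                      2.0),
    ("8-bit to 12-bit HDR",                    1.5),
]

total = 1.0
for name, multiplier in steps:
    total *= multiplier
    print(f"{name}: x{multiplier:g} (cumulative x{total:g})")
# Cumulative x24, before counting WCG, immersive sound, or stereoscopic 3D.
```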

The lens and cable issues were raised by the “no-need” side at “The Great Quality Debate” at IBC. Perhaps some in the audience considered this conundrum: “spending” so much data rate on “4k” might actually preclude such lower-data-rate improvements as HFR and HDR. Whatever the cause, when the audience was polled after the debate, it was no longer evenly split; at an event where seemingly almost every exhibit said something about “4k,” the majority in the audience now opposed the proposition that there is a need to go beyond high definition:

http://informitv.com/news/2013/09/14/ibcdebatevotes/

http://www.hollywoodreporter.com/behind-screen/ibc-do-we-need-go-629422

Perhaps a secondary theme of IBC 2013 (after “4k”) will be more significant in the long term: signal distribution. IBC has long covered all forms of distribution; in 2013 offerings ranged from broadcast transmitters in just two rack units (Onetastic) to a sleeve that turns an iPhone into a satellite phone (Thuraya). In the Future Zone, Technische Universität Braunschweig offered one of the most-sensible distribution plans for putting live mass-appeal programming on mobile devices, an overlay of a tower-based broadcast over regular LTE cells.

The most radical signal-distribution plan at IBC 2013, however, was also the one most likely to be the future of the television-production business. It’s related to HD-SDI: eliminating it. HD-SDI technology is mature and works fine (up to the distance limit for the data rate and cable), but it’s unique to our industry. Meanwhile, the rest of the world is using common information technology (IT) and internet protocol (IP).

The EBU “village” was a good place to get up to speed on replacing SDI with IT, with both lectures and demonstrations, the latter, from the BBC, showing both HD and “4k.” Here are links to EBU and BBC sites on the subject:

http://tech.ebu.ch/JT-NM/FNS/NVCIP

http://www.bbc.co.uk/rd/projects/ip-studio

Then there was SVS Broadcast, which took information technology a step further, showing what they called an IT-based switcher. The control surface is a little unusual, but what’s behind it is more unusual. When a facility uses multiple switchers, they can share processing power. Oh, and the control surfaces demonstrated at IBC in Amsterdam were actually controlling switcher electronics in Frankfurt.

There were more wonders at IBC, from Panasonic 64×9 images to MidworldPro’s Panocam, which uses 16 lenses to see everything and stitch it all into a single image. And then there was Clear-Com, offering respite from the relentless march of advanced technology with their new RS-700 series, an updated version of traditional, analog, wired, belt-pack intercom.

Ahhhhhhh.

 


NAB 2013 Wrap Up at the SMPTE DC chapter, May 23, 2013

June 2nd, 2013 | Posted in Download, Today’s Special

Mark Schubin’s NAB 2013 wrap up.

Presented to the SMPTE DC chapter, May 23, 2013.

Video (TRT 40:02)


Enabling the Fix

April 29th, 2013 | Posted in Schubin Cafe


Sometimes clichés are true. Sometimes the check is in the mail. And sometimes you can fix it in post. Amazingly, the category of what you can fix might be getting a lot bigger.

At this month’s NAB show, there was the usual parade of new technology, from Sonic Notify’s near-ultrasonic smartphone signaling for extremely local advertising — on the order of two meters or so (palmable transducer shown at left) — to Japan’s National Institute of Information and Communications Technology’s TV “white space” transmissions per IEEE 802.22. In shooting, for those who like the large-sensor image characteristics of the ARRI Alexa but need the “systemization” of a typical studio/field camera, there was the Ikegami HDK-97ARRI (right), with the front end of the former and the back end of the latter.

Even where items weren’t entirely new, there was great progress to be seen. Dolby’s autostereoscopic (no glasses) 3D demo (left) has come a long way in one year. So has the European Project FINE, which can create a virtual-camera viewpoint almost anywhere, based on just a few normally positioned cameras. Last year, there was a lot of processing time per frame; this year, the viewpoint repositioning was demonstrated in real-time.

If you’re more interested in displays, consider what’s been going on in direct-view LED video. It started out in outdoor stadium displays, where long viewing distances would hide the visibility of the individual LEDs. At NAB 2013, two companies, Leyard (right) and SiliconCore, showed systems with 1.9-mm pixel pitch, leaving the LED structure virtually invisible even at home viewing distances. Is “virtually” not good enough? SiliconCore also showed their new Magnolia panel, with a pitch of just 1.5 mm!
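
For reference, here’s a hedged Python sketch of the strict one-arcminute “retina” criterion for when pixel structure becomes unresolvable at a given pitch. In practice, the structure fades well before this distance, because fill factor, brightness, and content all matter, so treat these as conservative upper bounds:

```python
import math

def invisible_beyond_m(pitch_mm, acuity_arcmin=1.0):
    """Distance beyond which a pixel pitch subtends less than one arcminute.

    Assumes the one-arcminute rule of thumb for 20/20 vision; LED fill
    factor and brightness also matter, so this is a first-order estimate.
    """
    return (pitch_mm / 1000.0) / math.tan(math.radians(acuity_arcmin / 60.0))

for pitch_mm in (1.9, 1.5):
    print(f"{pitch_mm} mm pitch: unresolvable beyond ~{invisible_beyond_m(pitch_mm):.1f} m")
```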

The Leyard display shown here (and at NAB) was so-called “4K,” with more than twice the number of pixels of so-called “Full HD” across the width of the picture. 4K also typically has 2160 active (picture carrying) lines per frame, twice 1080, so it typically has four times the number of pixels of the highest-resolution form of HD.

4K was unquestionably the major unofficial theme on the NAB show floor, replacing the near-ubiquitous 3D of two years ago. There were 4K lenses, 4K cameras, 4K storage, 4K processing, 4K distribution, and 4K displays. Using a form of the new high-efficiency video codec (HEVC), the Fraunhofer Institute was showing visually perfect 4K pictures with their bit rates reduced to just 5 Mbps; with the approval of the FCC, that means it could be possible to transmit multiple 4K programs simultaneously in a single U.S. broadcast TV channel. But some other things in the same booth seemed to be attracting more attention, including ordinary HD images, shot by INCA, a tiny, 2.5-ounce “intelligent” camera, worn by an eagle in flight. The eagle is shown above left, the camera, with lens, at right. The seemingly giant attached blue rod is a thin USB cable.
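
The arithmetic behind “multiple 4K programs per channel” is straightforward: an ATSC 1.0 channel carries a transport payload of about 19.39 Mbps. A minimal Python sketch, where the 1 Mbps per-service allowance for audio, captions, and tables is an assumed figure for illustration:

```python
# How many ~5 Mbps HEVC 4K services fit in one 6 MHz U.S. broadcast channel?
atsc_payload_mbps = 19.39   # ATSC 1.0 MPEG transport payload
video_mbps = 5.0            # Fraunhofer's demonstrated HEVC 4K rate
overhead_mbps = 1.0         # assumed per-service audio/captions/tables

services = int(atsc_payload_mbps // (video_mbps + overhead_mbps))
print(f"~{services} simultaneous 4K programs per channel")  # -> ~3
```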

Throughout the show floor, wherever manufacturers were highlighting 4K, visitors seemed more interested in other items. The official theme of NAB 2013 was METAMORPHOSIS, with the “ME” intended to stand for media and entertainment, not pure self interest. But most metamorphoses seemed to have happened before the show opened. Digital cinematography cameras aren’t new; neither are second-screen applications. Mobile DTV was introduced years ago. So was LED lighting.

There were some amazing new technologies discussed at NAB 2013 — perhaps worthy of the metamorphosis label.  But they weren’t necessarily on the show floor (at least not publicly exhibited). Attendees at the SMPTE Technology Summit on Cinema (TSC), for example, could watch large-screen bright images that came from a laser projector.

The NAB show was vast, and the associated conferences went on for more than a week. So I’m going to concentrate on just one hour, a panel session called “Advancing Cameras for Cinema,” in one room, the SMPTE TSC, and how it showed the metamorphosis of what might be fixed in post.

Consider the origin of post, the first edit, and it was a doozy! It occurred in 1895 (and technically wasn’t exactly an edit). At a time when movies depicted real scenes, The Execution of Mary, Queen of Scots, in its 27-foot length (perhaps 17 seconds), depicts a living person being led to the chopping block. Then the camera was stopped, a dummy replaced the person, the camera started again, and the head was chopped off. It’s hard to imagine what it must have been like to see it for the first time back then. And, since 1895, much more has been added to the editing tool kit.

It’s now possible to combine different images, generate new ones, “paint” out wires and other undesirable objects, change colors and contrast, and so on. It’s even possible to stabilize jerky images and to change framing at the sacrifice of some resolution. But what if there were no sacrifice involved?

The first panelist of the SMPTE TSC Advancing Cameras session was Takayuki Yamashita of the NHK Science & Technology Research Labs. He described their 8K 120-frame-per-second camera. 8K is to 4K approximately as 4K is to HD, and 120 fps is also four times the 1080i frame rate. This wasn’t a theoretical discussion; cameras were on the show floor. Hitachi showed an 8K camera in a familiar ENG/EFP form (left); Astrodesign showed one dramatically smaller (right).

If pictures are acquired at higher resolutions, they may be reframed in post with no loss of HD resolution. With 8K, four adjacent full-HD-resolution images can be extracted across the width of the 8K frame and four from top to bottom. A shakily captured image that bounces as much as 400% of the desired framing can be stabilized in post with no loss of HD resolution. And the higher spatial sampling rate also increases the contrast ratio of fine detail.
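
A minimal Python sketch of that reframing idea, using a synthetic frame (a real pipeline would read camera frames and feed per-frame offsets from a stabilizer):

```python
import numpy as np

frame_8k = np.zeros((4320, 7680, 3), dtype=np.uint8)  # 8K UHD raster

def extract_hd(frame, top, left):
    """Crop a full-resolution 1920x1080 window from a larger frame.

    The offsets are the reframe (or per-frame stabilization correction),
    clamped so the window stays inside the source raster.
    """
    top = max(0, min(top, frame.shape[0] - 1080))
    left = max(0, min(left, frame.shape[1] - 1920))
    return frame[top:top + 1080, left:left + 1920]

hd_window = extract_hd(frame_8k, top=1620, left=2880)  # centered crop
print(hd_window.shape)                                  # (1080, 1920, 3)
```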

[Image: HDR capture showing a reflectance chart and a lamp’s tungsten filament]

Contrast ratio was just one of the topics in “Computational Imaging,” the presentation of the second panelist, Peter Centen of Grass Valley. Above is an image he presented at the SMPTE summit. The only light source in the room is the lamp facing the camera lens, but every chip on the reflectance chart is clearly visible and so are the individual coils of the hot tungsten filament. It’s an extraordinarily high dynamic range (HDR); a contrast ratio of about ten million to one — more than 23 stops — was captured.

Yes, that was an image he presented at the SMPTE summit — five years ago in 2008. This year he showed a different version of an HDR image. There’s nothing wrong with the technology, but bringing it to the market is a different matter.
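
(A quick check on that “more than 23 stops” figure: stops are just the base-2 logarithm of the contrast ratio.)

```python
import math

contrast_ratio = 10_000_000                       # about ten million to one
print(f"{math.log2(contrast_ratio):.2f} stops")   # -> 23.25
```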

At the 2013 TSC, Centen showed an even older development, one first presented by an MIT-based group at SIGGRAPH in 2007 <http://groups.csail.mit.edu/graphics/CodedAperture>, a so-called “coded aperture.” Consider a point just in front of a camera’s lens. The lens might zoom in or out and might focus on something in the foreground or background. Its aperture might be wide open for shallow depth of field or partially closed for greater depth of field. If it’s a special form of lens (or lenses), it might even deliver stereoscopic 3D. All of those things might happen after the light enters the lens, but all of those possibilities exist in the “lightfield” in front of the lens.

There have been many attempts to capture the whole lightfield. Holography is one. Another, used in the Lytro still camera, uses a fly’s-eye type of lens, which can cut into resolution (an NAB demonstration a few years ago had to use an 8K camera for a low-resolution image). A third was described by the third panelist (and shown in his booth on the show floor). The one Centen showed requires only the introduction of a disk with a pattern of holes into the aperture of any lens on any camera.

Here is just one possible effect on fixing things in post, with images from the MIT paper. It is conceivable to change focus distance and depth of field and derive stereoscopic 3D from any single camera and lens combo after it has been shot.
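
The reason a coded aperture enables this kind of fix is that the mask keeps the out-of-focus blur spectrally “broadband,” so deconvolving with the blur kernel for an assumed depth recovers detail at that depth. Here’s a toy one-dimensional Python illustration of that principle — entirely synthetic, and the real method in the MIT paper also estimates depth per pixel:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random(256)                            # 1-D stand-in for a scene

mask = np.array([1, 0, 1, 1, 0, 1, 0, 1], float)   # coded-aperture pattern
mask /= mask.sum()

def blur(signal, kernel):
    """Circular convolution via FFT -- the simulated out-of-focus blur."""
    return np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel, len(signal))))

def wiener_deblur(blurred, kernel, snr=1e3):
    """Invert the blur with a Wiener filter for the assumed depth's kernel."""
    K = np.fft.fft(kernel, len(blurred))
    H = np.conj(K) / (np.abs(K) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft(np.fft.fft(blurred) * H))

restored = wiener_deblur(blur(scene, mask), mask)
print("rms error:", np.sqrt(np.mean((restored - scene) ** 2)))  # small
```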

The moderator’s introduction to the panel showed a problem with higher resolutions: getting lenses that are good enough. He showed an example of a 4K lens (with just a 3:1 zoom ratio) costing five times as much as the professional 4K camera it can be mounted on. Centen offered possibilities of correcting both lens and sensor problems in post and of deriving 4K (or even 6K) from today’s HD sensors.

The third panelist, Siegfried Foessel of the Fraunhofer Institute, seemed to cover some of the same ground as did Centen — using computational imaging to derive higher resolution from lower-resolution image sensors, increasing dynamic range, and capturing a lightfield, but his versions used completely different technology. The higher resolution and HDR can come from masking the pixels of existing sensors. And the Fraunhofer lightfield capture uses an array of tiny cameras not much bigger than one ordinary one, as shown in their booth (right). Two advantages of the multicamera approach are that each camera’s image looks perfect (with no fly’s eye resolution losses or coded-aperture light losses) and that the wider range of lens positions also allows some “camera repositioning” in post (without relying on Project FINE processing).

Foessel also discussed higher frame rates (as did many others at the 2013 TSC, including a professor of neuroscience and an anesthesiologist). He noted that capturing at a high frame rate allows “easy generation of different presentation frame rates.” He also speculated that future motion-image programming might use a frame rate varying as appropriate.

The last panelist was certainly not the least. He was Eric Fossum from Dartmouth’s Thayer School of Engineering, but he was introduced more simply, as the inventor of the modern CMOS sensor. His presentation was about a “quanta image sensor” (QIS) containing, instead of pixels, “jots.” The simplest description of a jot is as something like a photosensitive grain from film. A QIS sensor counts individual photons of light and knows their location and arrival time.

An 8K image sensor has more than 33 million pixels; a QIS might have 100 billion jots and might keep track of them a thousand times a second. The exposure curve seems very film-like. Fossum mentioned some other advantages, like motion compensation and “excellent low light performance,” although this is a “longer-term effort” and we “won’t see a camera for some time.”

The “convolution window size” (something like film grain size) can be changed after image acquisition.  In other words, even the “film speed” will be able to be changed in post.
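
A sketch of that idea in Python, assuming a synthetic binary jot plane (real QIS data would come from the sensor, and “developing” it would be more sophisticated than plain binning):

```python
import numpy as np

rng = np.random.default_rng(1)
mean_photons = 0.05                                  # photons per jot per read
jots = rng.random((4096, 4096)) < mean_photons       # binary jot plane

def develop(jot_plane, window):
    """Sum photon hits over window x window blocks to form pixels.

    A bigger window gathers more photons per pixel (a "faster film speed")
    at the cost of spatial resolution -- chosen after acquisition.
    """
    h, w = jot_plane.shape
    h, w = h - h % window, w - w % window
    blocks = jot_plane[:h, :w].reshape(h // window, window, w // window, window)
    return blocks.sum(axis=(1, 3))

for window in (8, 16, 32):
    image = develop(jots, window)
    print(f"{window}x{window} jots/pixel -> {image.shape}, "
          f"mean {image.mean():.1f} photons/pixel")
```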
