
What Will Be at NAB? Consider IBC

April 17th, 2016 | No Comments | Posted in Schubin Cafe


Like every other NAB show, the upcoming 2016 one will likely have lots of innovations. One to which I’m looking forward is the new “autocolor” button on SoftPanels LED lights. Each can measure the ambient lighting and adjust its output to match, whether it’s an incandescent table lamp or daylight streaming in a window.

I’m eager to see other innovations in other booths. But, for an idea of the major themes at the convention, there’s no preview like the previous fall’s International Broadcasting Convention (IBC) in Amsterdam. Cinegy brought the DeLorean time machine of Back to the Future to the show because 2015 was the year to which it brought the travelers in the second movie of the series. But, in some ways IBC 2015 seemed more “Forward to the Past.”


Consider, for example, the format wars. Remember competing videotape formats and HDTV structures? Format wars are back. Cinegy has been pushing IP (internet protocol) connections instead of SDI (serial digital interface) for some time, but there wasn’t just one version of IP for television at IBC. NewTek, for example, introduced its open NDI (network device interface) at the show.

That’s for connecting devices. Squeezing their data through an IP pipe is a different issue and one with its own format wars. Higher spatial resolution (4K) seems to demand some form of bit-rate reduction (“compression”), preferably of a mild type (“mezzanine level”) so that it won’t affect image quality in production and post-production. TICO could be found at many booths (“stands” in IBC lingo), but so could many other options.


Higher spatial resolutions were the rage at IBC 2014. At the 2015 version, more attention seemed to be paid to higher dynamic range (HDR), which had its own format wars. The Philips version, requiring just 35 bytes per scene (not per frame or per second) was shown in one dark room (with the lights turned on for the photo below).


There were also many sessions about HDR, some covering HDR in movie theaters. Peter Ludé of RealD noted that their scientists separately measured the reflectivity of auditorium finishes (walls and carpet), seating and people in typical theaters. The results were that the people were the biggest contributor to light backscatter to the screen. He quipped, therefore, that, for best results, HDR movies should be shown in empty auditoriums.

There were cutting-edge technologies even for ordinary HDTV, from the BBC’s “Responsive Subtitles” (which increase comprehension by appearing with the speeds and rhythms of speech) to GATR’s inflatable portable satellite antennas. The Fraunhofer Institute, which at NAB showed the ability to re-light scenes in post thanks to their camera array, at IBC showed how tracking shots could be done without moving a camera.

Below is a composite shot, the scene outside the window added in post.


It looks like there was a camera move in the background, but, as the freeze in the video below indicates, the motion was all done in post production.


What might Fraunhofer show at NAB 2016? What other goodies will be on the NAB show floor?

We’ll soon know, but right now, it’s like the World War II German encryption machine at the Rambus IBC stand: an enigma.


IBC 2015: Virtual Reality for Storytelling

September 14th, 2015 | No Comments | Posted in Schubin Cafe


Al Jazeera is famous for many things, but virtual reality is probably not one of them – at least not yet. At last week’s International Broadcasting Convention (IBC) in Amsterdam, in the Future Zone, however, Al Jazeera was very prominent in the field. At one “stand” (what Americans might call a “booth”), Al Jazeera Media Network’s Innovation and Research Group (I&R) was showcasing a 360-degree camera mount that employs 14 standard GoPro cameras. Al Jazeera I&R had both designed and 3D-printed it and is making the “StoryMount” rig available online for modification and re-use by anyone.

Perhaps more significant was the virtual-reality content in the Future Zone. A few steps from the Al Jazeera I&R group stand, I watched one grown man in a suit and tie put on an Oculus Rift headset and earphones and soon start jerking his head around and shrieking out loud. He was watching people climbing and jumping off high mountains in Jaunt’s The North Face: Climb. It didn’t do as much for me.

A few steps in the other direction, however, was the stand of the Emblematic Group, co-founded by journalist Nonny de la Peña. There, I donned the visual and aural virtual-reality gear to watch Kiya, funded by Al Jazeera America. It’s a tale of domestic violence. The audio is from actual 911 recordings. The images are computer-graphic reconstructions of what was going on at the time.

I found the piece very moving, but, until the end, I wondered whether I might have found it just as moving – or even more so – had I been simply listening to the audio with no visuals or perhaps hearing it while watching photographs of the people involved. The headset allowed me to look around by moving my head, but, most of the time, I had little or no desire to do so; the mountain scenery of Climb was more inviting to me. The Kiya computer-graphics of the people and their motion were crude, like those of an old video game. And, as in Climb, the video resolution through the headset was even worse, and I was conscious of the chromatic aberration of the optics.

Then the story came to its end. The final scene was of a room, empty of people. I won’t give away the plot, but, although part of me wanted to shut my eyes, another part wanted to look at everything. The headset allowed me to do so, and I found it all exceptionally moving. It wasn’t the dialogue; there was none. This time, there was no question that it was the virtual reality, putting me in that room, that was pulling on my heartstrings.

The resolution will only get better from here. Welcome, new medium, to the storytelling business.



The Bain of Our Existence

March 2nd, 2015 | No Comments | Posted in Schubin Cafe


As is the case for most technologies, television had no single “inventor.” But then there’s the amazing Alexander Bain.

Consider: The first major-league baseball game to be televised was played between the Cincinnati Reds and the Brooklyn Dodgers on August 26, 1939. If one believes that television was introduced at the New York World’s Fair that year a few months earlier, it didn’t take long to get from that introduction to sports coverage. In fact, there was even experimental coverage of a game between Princeton and Columbia Universities on May 17.

Of course, that’s a very U.S.-centered view of history. Regularly scheduled television broadcasting began in London in 1936 (or even earlier, depending on definitions). As for the first baseball game to be televised, that was in Tokyo in 1931.

Even in the U.S., 1931 saw the first TV shows with original scripts. Regularly-scheduled news telecasts began in Schenectady, New York in 1928. In London, the first public demonstration of a television system capable of depicting a recognizable human face was in 1926, and the first public demonstration of a cruder television system was in 1914.

An all-electronic television system was described in the publication Nature in 1908, following the patenting of an electronic picture display in 1907. The word television, itself, was coined in 1900 to describe the many moving-image transmission systems created by that point.

What has been called “the master television patent” — certainly the first patent for a complete television system — was issued in Germany in 1885. The first crude television pictures were seen by 1879. Multiple television systems were described between a demonstration of an “artificial eye” in 1876 and those first crude video pictures of 1879. And before that?

Nothing. Not even science fiction or fantasy. The closest description might be in a tale, offered by Sir Walter Scott in 1828, of a mysterious mirror that saw not only into the distance but also into the past (although it could produce images for no more than seven minutes).

Why did the concept of television suddenly appear in the 1870s? It began, perhaps, with the seemingly appropriately named Wildman Whitehouse.

In one version of a common joke, a surgeon with a defective lamp calls an electrician, who arrives, works for a moment, fixes the lamp, and presents a bill. “This is outrageous!” the surgeon declares. “I’m a surgeon, and I don’t get paid as much as that.” The electrician replies, “When I was a surgeon, I didn’t get as much, either.”

Whitehouse was a surgeon who became an electrician. As the latter, he came up with a plan to use high voltage to force telegraph messages through the long transatlantic cable of 1858. Whether it was that high voltage or, as later research suggests, flaws of manufacture, the cable failed.

So, for its replacement, telegraph engineer Willoughby Smith designed an apparatus to monitor its health. But Joseph May discovered unusual variations in its readings, seeming to have something to do with light intensity. Smith conducted experiments to prove that the selenium resistors used were photoconductive and wrote to Latimer Clark about it in 1873. Clark informed the Society of Telegraph Engineers (STE), to which he, Smith, and Whitehouse all belonged, along with such other notables as William Thomson (later Lord Kelvin, for whom the K in “3200K” is named) and William Siemens. After much debate and publicity, the Siemens artificial eye appeared, followed by many television proposals. As for the STE, they became the Institution of Electrical Engineers, today the Institution of Engineering and Technology, one of the six partners who produce the International Broadcasting Convention (IBC) each year.

Many television histories begin with the photoconductivity discovery or Smith’s experiments, and there’s no question that, as publicized by the STE, they kicked off the efforts to create television. What’s odd, however, is that they weren’t the first. Long before even the first transatlantic cable, in 1839 Edmond Becquerel published in the journal of the French Academy of Science his research that sunlight could create an electrical current. At the time, it seemed just another interesting scientific phenomenon. No one made the leap from that to television.

The reason television research began after Smith/Clark and not Becquerel is that, by the time of the discovery of the photoconductivity of selenium, the world was already accustomed to image transmission, and the reason for that was Alexander Bain. There were actually two famous Alexander Bains born in Scotland in the early 19th century. The one who might be considered the father of television (and almost all other forms of electronic imaging) was born in Watten in 1810 and apprenticed to a clock maker in Wick. After hearing a lecture about electricity, he abandoned his apprenticeship and went off to work in the new field.

He worked in both telegraphy and timekeeping, sometimes combining the two. In 1843, while living in London, he received a patent for “Certain improvements in producing and regulating electric currents, and improvements in electric time pieces, and in electric printing and signal telegraphs.” He later said he came up with the idea in 1842. A drawing from his 1848 U.S. patent (5957) is shown below.


Bain appears to have been the first to conceive of image scanning. In one fell swoop, he came up with linear (horizontal) scanning lines, pixels, line synchronization, and frame synchronization, all for image transmission. As John Douglas Ryder and Donald G. Fink (the latter the secretary of the U.S. National Television System Committees, NTSC) put it in their 1984 IEEE Press book Engineers & Electrons: a century of electrical progress, Bain’s “concept embodied all the geometrical and timing methods of the modern television system.”
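Bain’s geometry is, in effect, the raster scan still used in every digital image today. Here is a purely illustrative sketch in Python (the function name is mine, not anything from Bain’s patent or the Ryder/Fink book):

```python
def raster_scan(width, height):
    """Yield picture-element (pixel) coordinates in Bain's order:
    left to right along each horizontal scanning line (line
    synchronization restarts x), lines top to bottom (frame
    synchronization restarts y for the next image)."""
    for y in range(height):          # one horizontal scanning line
        for x in range(width):       # one picture element on that line
            yield x, y

# One frame of a toy 4x3 image scans out in this order:
print(list(raster_scan(4, 3)))
```

Every modern video interface, from analog NTSC to SDI, transmits pixels in essentially this order.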

Just as the 1873 announcement of the photoconductivity of selenium opened the floodgates for television proposals, Bain’s patent 30 years earlier brought on a flood of proposals for what we might today call fax machines. At right is an image transmitted a long distance in 1850 using a system that created negative images at the receiver.

Commercial fax service began in France in 1865 using Bain’s scanning technique. The biggest problem was that the faxes had to be drawn or written with insulating ink. That didn’t stop opera composer Gioacchino Rossini from transmitting a page of music from Paris to Amiens in 1860. By 1863, faxes were even transmitted in color! But some sort of system for converting variations in light intensity to electrical signals was seemingly necessary to transmit photographic images, and that’s what the Smith/Clark 1873 announcement of the photoconductivity of selenium offered.

The fundamental concepts of television were then in place: image scanning and the conversion of light variations to electrical signals. It was already known that wires would glow at different brightnesses depending on the amount of current flowing through them. The rest was just engineering.


An Eclectic View of IBC 2014

November 2nd, 2014 | No Comments | Posted in Download, Schubin Cafe

On exhibit floors that had products ranging from 8K cameras to automatic captioning, why were many visitors excited about Skype? At a conference where the title of one presentation began “Minimising nonlinear Raman crosstalk,” why did one press report comment on cinema-auditorium lighting and the gross receipts of one episode of one TV show?

Between bites of fresh raw herring, Mark Schubin wandered through IBC (moderating one conference session) and discovered those and more: for example, a 4K camera that can directly use long-range zoom lenses, a 3D display that doesn’t require either special glasses or a sweet viewing spot, the Holo-Deck, an immersive egg, the ability to zoom and dolly in post, and a fully accredited Wile E. Coyote.

Catching liars and thieves? Yes, there was that, too.

Direct Link (50 MB / 38:49 TRT): An Eclectic View of IBC 2014



Bang for the Buck

September 25th, 2013 | No Comments | Posted in Schubin Cafe

Something extraordinary happened shortly after noon local time on Saturday, September 14, at the International Broadcasting Convention (IBC) in Amsterdam (aside from the fact that the fresh herring vendors were no longer wearing traditional klederdracht). It could affect at least the short-term future of television. And a possible long-term future was also revealed at the event. IBC, however, featured not only the extraordinary but also the cute.

For many years, there seemed to be an unofficial contest to show the largest and smallest television vehicles. The “largest” title seems to have been won (and retired) at IBC 2007 by Euroscena’s MPC34, a three-truck, expanded, interconnected system including everything from a sizable production studio through edit suites, at least as wide as it was long; nothing exhibited since has come anywhere close. The “smallest” went from trucks to vans to tiny Smart cars to motorcycles to Segways. In an era in which a hand-held camera can stream directly to the world, it’s hard to claim a “smallest” title with a vehicle, so perhaps Mobile Viewpoint’s tiny scooters (click image to enlarge) should be considered among the cutest, rather than the smallest.

Not far from those Mobile Viewpoint scooters, however, was another claimant for the “cutest” title, though it was neither new nor particularly small. In the BTS outdoor exhibit was Portuguese television broadcaster RTP’s first mobile unit, built by Fernseh GmbH (roughly translated: Television, Inc.) in an eight-meter-long Mercedes-Benz truck delivered to Lisbon in 1957. It did its first live broadcast the following February 9, continued in service through 1980, and was restored in 2006 (though it still has a top speed of only 76 kilometers per hour, about 47 MPH).

About a quarter of the length of the 1957 RTP mobile unit — more than is devoted to its control room — is occupied by multi-core camera-cable reels for the vehicle’s four cameras. Cabling is one area in which video technology has advanced tremendously in the last 56 years — except for one characteristic. Consider some products of another small IBC 2013 exhibitor, NuMedia.

In television’s early days, there were separate cables for video and sync, eventually becoming one in composite video. Color required multiple cables again; composite color brought it back to one. Digital video initially used a parallel interface of multiple wires; the serial digital interface (SDI) then made digital video even easier to connect than analog, because a single coaxial connection could carry video, multi-channel audio, and other data. Then modern high definition arrived.

SDI carried 270 million bits per second (Mbps); HD-SDI was about 1.5 billion (1.5 Gbps). HD-SDI still used just one coaxial-cable connection, but usable cable lengths, due to the higher data rates, plunged. NuMedia offered a list of usable lengths for different cables, ranging from 330 meters for fat and heavy RG11 coax down to just 90 meters for a lighter, skinnier version. NuMedia’s HDX series uses encoders and decoders to more than double usable cable lengths (and offer a reverse audio path) — at a cost of more than $1800 per encoder/decoder pair. And that provides some background for the extraordinary event.
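Why do higher data rates shrink usable lengths? As a rough rule of thumb, coax attenuation per meter grows with the square root of frequency (skin effect), and a receiver’s loss budget in dB is roughly fixed, so usable length scales with the square root of the inverse of the data-rate ratio. Here is a hedged Python sketch of that rule (the scaling law is a simplification, and the function name is mine; real cable specifications vary):

```python
import math

def est_length(known_len_m, known_rate_gbps, target_rate_gbps):
    """Estimate usable coax length at a new bit rate, assuming
    skin-effect-dominated loss (dB per meter grows with the square
    root of frequency) and a fixed receiver loss budget, so that
    usable length scales as sqrt(old_rate / new_rate)."""
    return known_len_m * math.sqrt(known_rate_gbps / target_rate_gbps)

# Illustrative only: if a fat coax runs ~330 m at HD-SDI's ~1.485 Gbps,
# the same cable at 3G-SDI's ~2.97 Gbps would manage roughly:
print(round(est_length(330, 1.485, 2.97)))  # ~233 m
```

The rule is only a first approximation (connectors, dielectric loss, and equalizer quality all matter), but it explains the trend: every doubling of data rate costs roughly 30% of the cable run.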

IBC hosted a conference session called “The Great Quality Debate: Do We Really Need to Go Beyond HD?” Although going “beyond HD” has included such concepts as a wider color gamut (WCG), higher dynamic range (HDR, the range between the brightest and darkest parts of the image), higher frame rates (HFR), stereoscopic 3D (S3D), and more-immersive sound, the debate focused primarily on a literal reading of HD, meaning that going beyond it would be going to the next level of spatial detail, with twice the fineness of HD’s resolution in both the horizontal and vertical directions.

The audience in the Forum, IBC’s largest conference venue, was polled before the debate started and was roughly evenly split between feeling the need for more definition and not. Then moderator Dr. William Cooper of informitv began the debate. On the “need” side were Andy Quested, head of technology for BBC HD & UHDTV; vision scientist Dr. Sean McCarthy, fellow of the technical staff at Arris; and Dr. Giles Wilson, head of the TV compression business at Ericsson. On the “no-need” side were Rory Sutherland, vice chair of the Ogilvy Group (speaking remotely from London); journalist Raymond Snoddy; and I.

The “need” side covered the immersive nature of giant screens with higher definition and their increased sensations of reality and presence (“being there”). Perhaps surprisingly, the “no-need” side also acknowledged the eventual inevitability of ever higher definition — both sides, for example, referred to so-called “16k,” images with eight times the spatial detail of today’s 1080-line HDTV in both the horizontal and vertical directions (64 times more picture elements or pixels). But the “no-need” side added the issue of “bang for the buck.”

At the European Broadcasting Union (EBU) exhibit on the show floor, some of that bang was presented in lectures about the plan for implementing UHDTV (ultra-HDTV, encompassing WCG, HDR, HFR, immersive sound, etc.). UHDTV-1 has a spatial resolution commonly called “4k,” with four times the number of spatial pixels of 1080-line HDTV. As revealed at the HPA Tech Retreat in February, EBU testing with a 56-inch screen viewed at a typical home screen-to-eye distance of 2.7 meters showed roughly a half-grade improvement in perceived image quality for the source material used. At the EBU’s IBC lectures, the results of viewer HFR testing were also revealed. Going from 60 frames per second (fps) to 120, doubling the pixels per second, yielded a full-grade quality improvement for the sequences tested. In terms of data rate, that’s four times the bang for the buck of “4k” or “4K” (the EBU emphasized that the latter is actually a designation for a temperature near absolute zero).
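The “four times the bang for the buck” arithmetic can be made explicit. A small sketch using the EBU figures quoted above (the function and variable names are mine, not the EBU’s):

```python
# "Bang for the buck" as grades of perceived quality gained per
# multiple of data rate spent, using the EBU test results above.
def grade_per_data_multiple(grade_gain, data_rate_multiplier):
    return grade_gain / data_rate_multiplier

uhd_4k = grade_per_data_multiple(0.5, 4.0)   # half a grade for 4x the pixels
hfr_120 = grade_per_data_multiple(1.0, 2.0)  # a full grade for 2x the pixels/second

print(hfr_120 / uhd_4k)  # -> 4.0, the "four times the bang for the buck"
```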

IBC attendees could see for themselves the perceptual effects of HFR at the EBU exhibit or, even more easily, at a BBC exhibit in IBC’s Future Zone. Even from outside that exhibit hall, the difference between the images on two small monitors, one HFR and one not, was obvious to all observers.

The EBU hasn’t yet released perceptual-quality measurements associated with HDR, but HDR involves an even lower data-rate increase: just 50% to go from eight bits to twelve. If my personal experience with HDR displays at Dolby private demonstrations at both NAB and IBC is any indication, that small data-rate increase might provide the biggest bang-for-the-buck of all (although Pinguin Ingenieurbüro’s relatively low-data-rate immersive sound system in the Future Zone was also very impressive).

At IBC’s Future Zone, the University of Warwick showed HDR capture using two cameras, with parallax correction. Behind a black curtain at its exhibit, ARRI publicly showed HDR images from just one of its Alexa cameras on side-by-side “4k” and higher-dynamic-range HD monitors. Even someone who had previously announced that “4k” monitors offer the best-looking HD pictures had to admit that the HDR HD monitor looked much sharper than the “4k.”

HDR is contrast-ratio-related, and, before cameras, processing, and displays, lenses help determine contrast ratio. Sports and concerts typically use long-zoom-range lenses, which don’t yet exist for “4k.” A Fujinon “4k” 3:1 wide-angle zoom lens costs almost twice as much as the same manufacturer’s 50:1 HD sports lens. Stick an HD lens on a “4k” camera, however, and the contrast ratio of the finest detail gets reduced — LDR instead of HDR.

Then there are those cables. As in the change from SDI to HD-SDI, as data rate increases, useful cable length decreases. Going from 1080i to “4k” at the same number of images per second is an increase of 8:1 (so-called 6G-SDI can handle “4k” up to only 30 progressive frames per second). Going from 60 fps to 120 is another 2:1 increase. Going from non-HDR to HDR is another 1.5:1 increase, a total of 24:1, not counting WCG, immersive sound, or stereoscopic 3D (a few exhibits at IBC even showed new technology for the last). Nevertheless, Denmark’s NIMB showed a tiny, three-wheel multicamera “4k” production vehicle, perhaps initiating a new contest for largest and smallest.
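The 24:1 figure is just the product of the individual data-rate multipliers quoted above. For the record:

```python
# Data-rate multipliers for going "beyond HD," per the figures above.
factors = {
    '1080i to "4k" at the same picture rate': 8.0,
    "60 fps to 120 fps": 2.0,
    "non-HDR (8-bit) to HDR (12-bit)": 1.5,
}

total = 1.0
for name, multiplier in factors.items():
    total *= multiplier

print(total)  # 24.0 -- and that's before WCG, immersive sound, or S3D
```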

The lens and cable issues were raised by the “no-need” side at “The Great Quality Debate” at IBC. Perhaps some in the audience considered this conundrum: “spending” so much data rate on “4k” might actually preclude such lower-data-rate improvements as HFR and HDR. Whatever the cause, when the audience was polled after the debate, it was no longer evenly split; at an event where seemingly almost every exhibit said something about “4k,” the majority in the audience now opposed the proposition that there is a need to go beyond high definition.

Perhaps a secondary theme of IBC 2013 (after “4k”) will be more significant in the long term: signal distribution. IBC has long covered all forms of distribution; in 2013 offerings ranged from broadcast transmitters in just two rack units (Onetastic) to a sleeve that turns an iPhone into a satellite phone (Thuraya). In the Future Zone, Technische Universität Braunschweig offered one of the most-sensible distribution plans for putting live mass-appeal programming on mobile devices, an overlay of a tower-based broadcast over regular LTE cells.

The most radical signal-distribution plan at IBC 2013, however, was also the one most likely to be the future of the television-production business. It’s related to HD-SDI: eliminating it. HD-SDI technology is mature and works fine (up to the distance limit for the data rate and cable), but it’s unique to our industry. Meanwhile, the rest of the world is using common information technology (IT) and internet protocol (IP).

The EBU “village” was a good place to get up to speed on replacing SDI with IT, with both lectures and demonstrations, the latter, from the BBC, showing both HD and “4k.” Both the EBU and the BBC maintain web resources on the subject.

Then there was SVS Broadcast, which took information technology a step further, showing what they called an IT-based switcher. The control surface is a little unusual, but what’s behind it is more unusual. When a facility uses multiple switchers, they can share processing power. Oh, and the control surfaces demonstrated at IBC in Amsterdam were actually controlling switcher electronics in Frankfurt.

There were more wonders at IBC, from Panasonic 64×9 images to MidworldPro’s Panocam, which uses 16 lenses to see everything and stitch it all into a single image. And then there was Clear-Com, offering respite from the relentless march of advanced technology with their new RS-700 series, an updated version of traditional, analog, wired, belt-pack intercom.




The Light Fantastic

August 26th, 2012 | 1 Comment | Posted in Schubin Cafe

Here are some questions: Why is the man in the picture above holding radioactive sheets of music? What is the strange apparatus behind him? What does it have to do with Emmy awards given to Mitsubishi and Shuji Nakamura this January? And what is the relationship of all of that to the phrase “trip the light fantastic”?

Next month the International Broadcasting Convention (IBC) in Amsterdam will reveal such innovations in media technology as hadn’t yet appeared in the seemingly endless cycle of trade shows. At last year’s IBC, for example, Sony introduced the HDC-2500 three-CCD camera. It is perhaps four times as sensitive as its basic predecessor, the HDC-1500, while maintaining its standardized 2/3-inch image format.

Other cameras have other image formats. Some popular models use CMOS (instead of CCD) image sensors with sizes approximately the same as those of a 35-mm movie-film frame. Some are even larger than that. The larger sizes and different technologies can mean increased sensitivity. Increased sensitivity, in turn, can mean less light needed for proper exposure. But that doesn’t necessarily mean fewer lighting instruments. As shown in the diagram at right, from Bill Fletcher’s “Bill’s Light” NASA web page, a typical simple video lighting setup uses three light sources, and increasing camera sensitivity won’t drop it to two or one.

Perhaps it’s best to start at the beginning, and lighting is the beginning of the electronic image-acquisition process. Whether you prefer a biblical or big-bang origin, in the beginning there was light. The light came from stars (of which our sun is one), and it was bright, as depicted in the image at left, by Lykaestria. But, in the beginning of video, it wasn’t bright enough. The first person to achieve a recognizable image of a human face (John Logie Baird) and the first person to achieve an all-electronic video image (Philo Taylor Farnsworth) both had to use artificial lighting because their cameras weren’t sensitive enough to pick up images even in direct sunlight, which ranges between roughly 30,000 and 130,000 lux.

There are roughly 10.764 lux in the old American and English unit of illuminance, the foot-candle. Candles have been used for artificial lighting for so long that they became part of the language of light: foot-candle, candlepower — even lux is short for candela steradian per square meter. General Electric once promotionally offered the foot candle shown at right (in an image, used here with permission, by Greg Van Antwerp from his Video Martyr blog, where you can also see what’s written on the sole).
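For those who still think in the old units, the conversion is simple division. A quick sketch (the function name and the rounding are mine):

```python
# 1 foot-candle = 1 lumen per square foot; 1 square meter is about
# 10.764 square feet, hence the conversion factor.
LUX_PER_FOOT_CANDLE = 10.764

def lux_to_foot_candles(lux):
    return lux / LUX_PER_FOOT_CANDLE

# Direct sunlight, at the range quoted above:
print(round(lux_to_foot_candles(30_000)))   # about 2787 fc
print(round(lux_to_foot_candles(130_000)))  # about 12077 fc
```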

As long ago as the middle of the 19th century, when Michael Faraday began giving lectures on “The Chemical History of a Candle,” candles were very similar to today’s: hard cylinders that burned down, consuming both the fuel and the wick, and emitting a relatively constant amount of light (which is how candle became a term of light measurement). Before those lectures, however, candles were typically soft, smoky, stinky, sometimes toxic, and highly variable in light output, at least in part because their wicks weren’t consumed as the candles burned. Instead, the wicks had to be trimmed frequently, often with the use of candle “snuffers,” specialized scissors with attached boxes to catch the falling wicks, as shown at left in an ornate version (around 1835) from the Victoria and Albert Museum.

Then came electricity. It certainly changed lighting, but not initially in the way you might think. Famous people are said to be “in the limelight.” The term comes from an old theatrical lighting system in which oxygen and hydrogen were burned and their flame directed at a block or cylinder of calcium oxide (quicklime), which then glowed due to both incandescence and candoluminescence (the candle, again). It could be called the first practical incandescent light. It was so bright that it was often used for spotlights. As for the electric part, the hydrogen and oxygen were gathered in bags by electrolysis of water. At right is an electrolysis system developed by Johann Wilhelm Ritter in 1800.

Next electricity made possible the practical arc light, first used for entertainment at the Princess’s Theatre in London in 1848. In keeping with the candle theme, one version (shown at left) was called Jablochkoff’s candle. But its light was so harsh and bright (said to be brighter than the sun itself) that theaters continued to use gas lighting instead.

Like candles and oil lamps before it, gas lighting generated lots of heat, consumed oxygen, generated carbon dioxide, and caused fires. Unlike the candles and oil lamps, gas lights could be dimmed simultaneously throughout a theater (though there was a candle-dimming apparatus, below, depicted in a book published in 1638). But gas lights couldn’t be blacked out completely and then re-lit without people igniting each jet — until 1866, that is. That’s when electricity made its third advance in lighting: It provided instant ignition for gas lights at the Prince of Wales’s Theatre in Liverpool.

Actually, there was an earlier contribution of electricity to lighting. In 1857, Heinrich Geissler evacuated the air from a glass tube, inserted another gas, and then ran a current through electrodes at either end, causing the gas to glow. As shown at right in a drawing from the 1869 physics book Traité Élémentaire de Physique, however, Geissler tubes were initially used more to provide something to look at rather than providing light for looking at other things (click the image for an enlargement). They were, effectively, the opposite of arc lights.

The first practical incandescent electric lamps, whether you prefer Joseph Swan or Thomas Edison (or his staff) as the source, appeared around 1880 and were used for entertainment lighting almost immediately at such theaters as the Savoy in London, the Mahen in Brno, the Palais Garnier in Paris, and the Bijou in Boston. At about the same time, inventors began offering their proposals for another invention: television. As the diagram at left shows, however (it depicts Henry Sutton’s 1885 version, called the “telephane,” from the November 7, 1890 issue of The Telegraphic Journal and Electrical Review; the word television wouldn’t be coined until 1900), the newfangled incandescent lamp wasn’t yet to be trusted; the telephane receiver used an oil lamp as its light source.

When Baird (1925) and Farnsworth (1927) first demonstrated their television systems, the light sources in their receiver displays were a neon lamp (a version of a Geissler tube) and a cathode-ray tube (CRT) respectively, but the light sources used to illuminate the scenes for the cameras were incandescent light bulbs. Baird’s first human subject, William Edward Taynton, actually fled the camera because he was afraid the hot lights would set his hair on fire. Farnsworth initially used an unfeeling photograph as his subject. When he graduated to a live human (his wife, Elma, known as Pem, shown at right) in 1929, she kept her eyes closed so as not to be blinded by the intense illumination.

The CRT, developed in 1897 and first used (in a version shown at left) to display video images in 1907, is also a tube through which a current flows, but its light (like that of a fluorescent lamp) comes not directly from a glowing gas but from the excitation of phosphors (chemicals that emit light when stimulated by an electron beam or by such forms of electromagnetic radiation as ultraviolet light). But, if it's a light source, why not use it as one?

Actually, the first use of a scanned light source in video acquisition involved an arc light instead of a CRT. As shown below, in a 1936 brochure about Ulises Sanabria's theater television system (reproduced on the web site of the excellent Early Television Museum), the device behind the person with the radioactive music at the top of the post is a television camera. But it works backwards: the light source is a carbon arc, focused into a narrow beam and scanned television-style. The large disks surrounding the camera opening, which appear to be lights, are actually photocells that pick up light reflected by the subject from the scanned light beam.

Unfortunately, the photocells could also pick up any other light, so the studio had to be completely dark except for the “flying spot” scanning the image area. That made it impossible to read music, thus the invention of music printed in radium ink on black paper. The radium glow was sufficient to make out the notes but too weak to interfere with the reflected camera light.

Of course, the studio was still dark, which made moving around difficult. Were it not for the fact that the phrase “trip… the light fantastick” appears in a 1645 poem by John Milton, one might suspect it was a description of such a studio. The scanned beam of “the light fantastic” emerged from the camera, and, because it was the only light in the room, everyone had to be careful not to trip. Inventor Allen B. DuMont came up with a solution called Vitascan, shown below in an image from a 1956 brochure at the Early Television Museum:

Again, the camera works backwards: Light emerges from it from a scanned CRT, and photomultiplier tubes pick it up. This being a color-television system, there are photomultiplier tubes for each color. Even though the light emerges from the camera, the pickup assemblies can be positioned like lights for shadows and modeling and can even be “dimmed.” It’s the “sync-lite” (item 7) at the upper right, however, that eliminated the trip hazard. Its lamps would flash on for only 100 millionths of a second at a time, synchronized to the vertical blanking interval, a period when no image scanning takes place, providing bright task illumination without affecting even the most contrasty mood lighting. In that sense, Vitascan worked even better than today’s lighting.
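The arithmetic behind that claim is easy to check. Here's a minimal sketch; the 100-microsecond flash duration comes from the description above, while the 60-fields-per-second scanning rate is an assumption based on the US television standard of the day:

```python
# Sketch: duty cycle of Vitascan's "sync-lite" flashes.
# One flash per vertical blanking interval is taken from the description
# above; 60 fields per second (US scanning rate) is an assumption.
FLASH_SECONDS = 100e-6     # 100 millionths of a second per flash
FIELDS_PER_SECOND = 60     # one flash per vertical blanking interval

lit_time_per_second = FLASH_SECONDS * FIELDS_PER_SECOND
duty_cycle = lit_time_per_second   # fraction of each second the lamps are on

print(f"lamps lit {lit_time_per_second * 1000:.1f} ms per second "
      f"({duty_cycle:.2%} duty cycle)")
# -> lamps lit 6.0 ms per second (0.60% duty cycle)
```

With the lamps on for well under one percent of each second, the flashes could be bright enough for people to see by without contributing meaningfully to what the photomultipliers integrated during active scanning.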

Vitascan wasn’t the only time high-brightness CRTs were used to replace incandescent lamps. Consider Mitsubishi’s giant Diamond Vision stadium video displays. The first (above left) was installed at Dodger Stadium in 1980. It used flood-beam CRTs (above right), one per color, as its illumination source, winning Mitsubishi an engineering Emmy award this January.

In the same category (“pioneering development of emissive technology for large, outdoor video screens”) another award was given to a Japan-born inventor. Although he once worked for a company that made phosphors for CRTs, he’s more famous for the invention that won him the Emmy award. It’s described in the exciting book Brilliant! by Bob Johnstone (Prometheus 2007). The book covers not only the inventor but also his invention. It’s subtitled Shuji Nakamura and the Revolution in Lighting Technology.

Nakamura came up with the first high-brightness pure blue LED, followed by the high-brightness pure green LED. Those led not only to LED-based giant video screens but also to the white LED (sometimes created from a blue LED with a yellow phosphor). And bright, white LEDs led to the rapid replacement of seemingly all other forms of lighting in many moving-image productions.

Those who attended the annual NAB equipment exposition in Las Vegas in April couldn’t avoid seeing LED lighting equipment. But they might also have noticed some dissension. There was, for example, PRG’s TruColor line with “Remote Phosphor Technology.” Remember the “pure blue” and “pure green” characterizations of Nakamura’s inventions? If those colors happen to match the blue and green photosensitivities of camera image sensors, all is well. If not, colors can be misrepresented. So PRG TruColor moves the phosphors away from the LEDs (the “remote” part), creating a more diffuse light with what the company touts as better color performance: a higher color-rendering index (CRI).

Hive, also at NAB, claims a comparably high CRI for its lights, but they don’t use LEDs at all. Instead, they’re plasma. They’re not plasma in the sense of using a flat-panel TV as a light source; they’re plasma in the physics sense. They use ionized gas.

Unlike Geissler tubes, however, Hive’s plasma lamps (from Luxim) don’t have electrodes. The tiny lamps (right) are said by their maker to emit as much as 45,000 lumens each and to achieve 70% of their initial output even after 50,000 hours (about six years of being illuminated non-stop).

If they don’t have electrodes, what makes them light up? Luxim’s FAQs describe the radio-frequency field used, but Hive’s site gets into specific frequencies: “Hive’s lights are completely flicker-free up to millions of frames per second at any frame rate or shutter angle. Operating at 450 million hertz, we’re still waiting for high-speed cameras to catch up.”
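The reason a 450 MHz drive can't produce camera-visible flicker is that even an extremely short exposure spans thousands of drive cycles, so the light averages out. A minimal sketch, in which the 100,000 fps, 360-degree-shutter exposure is a hypothetical high-speed-camera setting rather than anything from Hive's claim:

```python
# Sketch: RF cycles integrated within one very short exposure.
# The 450 MHz figure comes from the Hive quote above; the camera
# settings below are hypothetical, chosen only for illustration.
RF_HZ = 450e6                # plasma drive frequency
EXPOSURE_S = 1 / 100_000     # 100,000 fps, 360-degree shutter (assumed)

cycles_per_exposure = RF_HZ * EXPOSURE_S
print(f"{cycles_per_exposure:,.0f} RF cycles per exposure")
# -> 4,500 RF cycles per exposure
```

Any cycle-to-cycle brightness variation is averaged over thousands of cycles per frame, which is why no frame rate or shutter angle in practical use would reveal flicker.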



IBC 2010 – 2D, 3D, 4D, 5D

October 25th, 2010 | No Comments | Posted in 3D Courses, Schubin Cafe

There was plenty of 3D at the International Broadcasting Convention (IBC) in Amsterdam this year.  At the awards ceremony alone, the audience was frequently asked to don 3D glasses to see clips from the winners (before viewing a portion of the not-yet-released 3D movie Tron: Legacy).  But the first sentence of the first comment on the question “What did you see around at IBC2010?” posted on the LinkedIn Digital TV Professionals Group was “Lots of 3DTV Demos that nobody was looking at” (from Alticast senior vp Anthony Smith-Chaigneau), and two other group members quickly agreed.


In fact, some of the 3D demos were well attended, including the ones in Sony’s exhibit, built largely around its MPE200 processor.  Introduced at NAB in April, the MPE200 was then capable primarily of correcting stereoscopic camera-alignment errors, as shown above.  It has become so popular that one announcement of Sony 3D equipment sales at the show (to Presteigne Charter) included 13 MPE200 processors but only 10 HDC1500R cameras (with two required per 3D rig).

At IBC 2010, the MPE200 was joined in that correction function by Advanced 3D Systems’ The Stereographer’s Friend.  Whereas the MPE200 currently has a four-frame latency, The Stereographer’s Friend is said to do its corrections within just one frame and at lower cost.

Some stereoscopic camera rigs are said to be so precise that correction is not necessary.  Although some had seen it previously, 3ality’s small, relatively lightweight TS-5 rig (shown at left) was officially introduced at IBC 2010.  Zepar introduced an even smaller stereoscopic lens system (shown at right) for a single camera, reducing the need for correction.  Such 3D-lens systems normally raise concerns about resolution and sensitivity loss, but Zepar’s is intended to be mounted on the Vision Research Phantom 65, which has plenty of each.

At IBC 2010, however, Sony’s MPE200 was no longer just a correction box; three more functions were introduced.  One is 2D-to-3D upconversion.  Sony was not alone in that area, either.  One new competitor is SterGen, an Israel-based company with a system intended specifically for sports.  According to their web site, they offer “better quality than real 3D shooting.”


Then there’s graphics insertion.  In that new function, the MPE200 was joined by Screen Subtitling’s 3DITOR, which analyzes not only the depth characteristics of the current frame but also the depth history.  Above, one of the depth-measurement tools is shown (based on an image from Wild Ocean, ©2010 Yes/No Productions Ltd and Giant Screen Films).  The company offers a white paper on the myriad issues of 3D text.

Another new MPE200 function is stitching: the ability to combine pictures from multiple cameras into one big panorama and then derive a stereoscopic camera image from a portion of the result.  The European research lab imec had shown a stereoscopic virtual camera at NAB in April (and brought it to IBC, too), and BBC R&D described a system even earlier.


Much of the interest in stitching at IBC 2010, however, was unrelated to 3D.  It was associated, instead, with the Hego OB1 system, which uses a package of six cameras in one location to create the panorama.  It won awards from Broadcast Engineering and TVBEurope magazines.  Certainly, the system uses interesting technology, but so does Sony’s MPE200.  Perhaps Hego’s winning the awards had something to do with how the OB1 was demonstrated, with bikini-clad beach-volleyball players on the IBC Beach, as shown above in a portion of a photo by Wes Plate.  That’s the camera array at the upper right.

Sisvel tile

In fact, there was plenty new at the show that was not 3D.  There was more 3D, of course.  In the area of distribution, Dolby pushed its version of 3D encoding and Sisvel brought a new “tile” format, shown above with an image from Maga Animation.  The left-eye view occupies a 1280 x 720 portion of the 1920 x 1080 frame, allowing it to be extracted for 2D HD viewing without necessarily changing existing decoders.
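That backward-compatible extraction is just a crop. Here's a minimal sketch; the description above fixes only the 1280 × 720-within-1920 × 1080 geometry, so placing the left-eye tile at the frame's top-left corner is an assumption for illustration:

```python
import numpy as np

def extract_left_eye(frame: np.ndarray) -> np.ndarray:
    """Crop the 1280x720 left-eye view out of a 1920x1080 tile-format frame.

    The left view's 1280x720 size comes from the format description;
    its top-left placement here is an assumption for illustration.
    """
    assert frame.shape[:2] == (1080, 1920), "expected a 1080p frame"
    return frame[0:720, 0:1280]

tile_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(extract_left_eye(tile_frame).shape)  # -> (720, 1280, 3)
```

A legacy 2D decoder need only perform that crop (or an equivalent pan-scan operation) to show HD; a 3D-aware decoder would additionally reassemble the right-eye view from the remaining tiles.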

There were new 3D analyzers from Binocle (DisparityTagger) and Cel-Soft (Cel-Scope).  There was an iPhone/iPod app from Dashwood Cinema Solutions for stereoscopic calculations associated with Panasonic’s 3DA1 camcorder.  There was the Vision 3 camera with toe-in-free convergence that I wrote about just before the show.  There were glasses-free displays (one noting that its correct viewing distance was 4.4 meters).  There was a seven-camera 3D rig for capturing information for such displays.  There was a book about stereoscopic cinematography from 1905.  There was an eye-tracking 3D laser-display system.

That was just in the exhibits.  There were also plenty of 3D conference sessions.  IBC’s best-paper award went to a group from NDS for their paper “Does size matter? The challenges when scaling stereoscopic 3D content,” which showed that not only does apparent depth change with different screen sizes, but it also doesn’t scale.  And stereographer Kommer Kleijn punched holes in “religious” views of toe-in vs. parallel shooting in a presentation about stereoscopic shooting for people experienced in 2D.

Actually, in addition to 2D and 3D, IBC 2010 also had 4D.  It was in a small exhibit in a low-traffic hall.  The full title was Real-Sense 4D, from ETRI, the Korean Electronics and Telecommunications Research Institute.

ETRI 4D cropped

As shown above, Real-Sense 4D involves more than just an image display.  I tried it out.  When the story involved a fire, I not only saw the flames and heard them crackling but also felt the heat and smelled the smoke.  During a segment on ice skating, I felt the air rushing past and then, in a moment out of Nancy Kerrigan’s career, felt a sudden WHAP! on my legs.

Panasonic AF100 on shoulder-mount rig

As at many recent professional equipment exhibitions, there was also 5D, specifically the Canon EOS 5D Mark II DSLR camera, capable of shooting HD.  But there was also something characterized by David Fox in the IBC Daily as “the HD DSLR Killer.”  It was Panasonic’s AG-AF100/101 (above), shown only in a display case at the NAB show earlier this year.  It combines the advantages of a large image sensor (Micro Four Thirds format) with the features of a video camcorder.  At IBC, there were many operating units, and the IBC press corps reacted far more enthusiastically to them than to Panasonic’s 3DA1 camcorder.

Whereas some 25 new camera models were introduced at IBC 2009, at IBC 2010, besides the V3i stereoscopic camera and the AF100/101, the main introductions were Canon’s XF100 and XF105 camcorders and IDT’s palm-sized, 2K, high-speed NR5.  There were also compact versions of NHK’s 8K Super Hi-Vision cameras from Hitachi and Ikegami.  And there were significant wide-angle lens introductions from Polecam (HRO69, at left) and Theia (MY125) for 1/3-inch-format cameras, offering horizontal acceptance angles of 69 and 125 degrees, respectively.

There were also introductions in the camera-mount area, such as Bradley Engineering’s multi-axis Gyro 350 (similar looking to the older Axsys V14 but said to be lower in cost), Vinten’s encoding Vector 750i pan head, and SiSLive’s Halibut underwater track.  Vaddio’s Reveal wall-mounted camera systems are invisible until used.  Broadcast Solutions showed a tiny two-seat Smart car equipped as a five-camera studio.  That’s not merely the control equipment; the five cameras were mounted in the car.

Other acquisition-related introductions at IBC included a video whiteboard system from Vaddio that does not require a computer, a version of Sennheiser’s MKE-1 lavalier microphone in which every part, from cable to connector to windscreen, is paintable to precisely match costume color, and a wireless tally system from Brick House Video.  Capable of dealing with up to eight cameras, the Tally Ho! handles both on-air and preview/iso tally, and the charger for the tally modules doubles as the system transmitter.

If IBC 2010 wasn’t big on new cameras, it did offer many introductions in storage and distribution.  There was, for example, AJA’s new, small, lightweight, camera-mountable Ki Pro Mini (left).  Then there was the even smaller and lighter Atomos Ninja (right), intended specifically for use with certain types of cameras.  Cinedeck Extreme v. 2.0 allows direct use of Avid’s DNxHD codec.  And Sonnet’s Qio MR brings the ability to play essentially all popular types of camcorder flash cards (including Panasonic’s P2 and Sony’s SxS) to Windows-based tower computers.

Then there were transportable systems, bigger than those above but still usable in the field.  One was the Globalstor Extremestor Transport.  Comparably sized but serving a very different function was Marvin (left), from Marvin Technologies.  It accepts almost any form of field recording and then, according to preselected options, automatically makes copies, including archival tape cartridges and DVD screening copies.

The tiny storage devices introduced at IBC 2010 were joined by tiny encoders for distribution.  The ViewCast Niagara 4100 was small, the TV1.EU miniCaster smaller, and the Teradek Cube smaller still.  Clear-Com’s HelixNet intercom won an award from TV Technology Europe.  It’s a digital intercom system using microphone-type cables like older analog systems (but also very much like Riedel’s already existing digital Performer series).

Quantel QTube

There was much more at IBC.  Cloud-based editing (an Internet Explorer screen from Quantel’s QTube is shown above), a new acoustic summing algorithm, a multi-touch video wall — and those were just some of the items in the public exhibits.  In private rooms, one could find such items as TiVo’s integration of YouTube and Sony’s 24-inch OLED and terabyte memory card.

uWand

Then there was uWand, an unusual remote control from Philips.  Like so many others, it uses infra-red signals.  Unlike those others, it receives those infra-red signals rather than emitting them, so a user can, for example, move an image from a TV screen to a digital picture frame, just by aiming the remote control.

IBC clearly isn’t just about broadcasting anymore.  For more of my take on IBC 2010, see the PowerPoint from the Schubin Cafe IBC review of October 12.


IBC 2010 Review PowerPoint Presentation

October 25th, 2010 | No Comments | Posted in Download, Today's Special
IBC 2010 Review



3D Camera: Something Different

September 9th, 2010 | No Comments | Posted in 3D Courses, Today's Special

I’ve been writing about 3D image capture for more than 35 years.  I’ve covered side-by-side and beam-splitter rigs, parallel and toed-in lenses, integrated cameras, single-lens stereo, integral imaging, 3D illusions, and holography.  But I’ve not — until now — covered anything like Frontniche’s VC-3100 HD, made by V3i.

Frontniche 1

It’s an integrated 3D camera being introduced at the International Broadcasting Convention in Amsterdam tomorrow.  It uses dual 3-chip 2/3-inch-format 2.2-megapixel CCD sensors and has dual 18 x 7.6 mm lenses with synchronized zoom, focus, and iris functions.  It has a 7-inch viewfinder.  It even has a tally light.  In other words, ignoring its 3D aspect, it’s like a typical broadcast HD camera (with a twin attached).

It is, however, a 3D camera, but one unlike any other.  Its image sensors (the prism optical blocks with chips attached) move horizontally.

Frontniche makes many claims for the camera, which it calls “the world’s first all-in-one ortho-stereoscopic broadcast camera.”  Among them are a “maximum 3D effect distance” of 360 meters, more than enough to shoot one football goalpost from behind the other.  It’s also said to comply with Japan’s “Stereoscopic Image Safe Standard” law.

You can read more about it in the product brochure, from which these images were taken.

Frontniche 3

The brochure includes links to sites covering the theory of moving-sensor 3D and issues of sports shooting.  I plan to give the unit a good look at IBC.


Walkin’ in a Camera Wonderland

September 20th, 2009 | 3 Comments | Posted in 3D Courses, Schubin Cafe
If you want to see products that don’t appear in U.S. trade-press magazines, you need to go beyond NAB, SMPTE, and InfoComm. You need to go to the International Broadcasting Convention.


IBC is my favorite trade show. I can leave work, catch an evening flight to Amsterdam, and take a train directly from the airport to the convention center. If I’m hungry, some exhibitor will be providing food. Thirsty? Water, various forms of coffee, juices, beer, and wine flow freely. IBC even throws a party to which everyone is invited. But none of that is why I like it so much.

Americans tend to forget that we are not alone. Back in the days of RCA cameras, you needed to come to IBC to see those of the UK-based manufacturer Pye.

Today, we tend to think of NAB as an international show. Cameras are shown there by such Japanese manufacturers as Hitachi, JVC, Panasonic, Sony, and Toshiba. And Grass Valley’s cameras at NAB come from Europe. So why bother with IBC?
