Logie Baird Came Second

Should interlacing finally be consigned to the history books? Hiro Quibbler investigates.

If you ever find yourself in Stirling, go and see the National Wallace Monument – a bizarre Victorian Gothic monster with an even more bizarre late 20th century statue of Mel Gibson in the car park. The view from the top of the Monument, itself atop the mighty Abbey Craig, is stunning. You have to wind your way up a narrow spiral staircase, praying that a fat bloke won’t be coming down it at the same time, but it’s worth the climb. Half way up there’s a room celebrating great Scots throughout the ages. John Logie Baird is there, as the inventor of television, which is odd because he no more invented television than George W. Bush invented World Peace. It’s doubly odd, because the man who actually invented television was also a Scot – Alan Archibald Campbell-Swinton – but he gets no mention. One hundred years ago, on June 18 1908, in a letter to Nature, he described, in perfect detail, the electronic camera tube and the CRT display. He explained how they would work, how they would be synchronised – everything. He proposed a frame rate of 10 fps, which is a little slow, but other than that he was spot on.

The electro-mechanical system, developed by Baird and first demonstrated in 1925, was wrong for so many reasons, not least the fact that a better system had already been suggested. Baird’s ‘invention’ required a rotating disc with a double spiral of lenses to both scan and reconstruct the image. In order to get the scan lines more-or-less straight, the disc had to be several times the size of the screen – imagine the size of the cabinet you would need for a 50” image and what would happen if the disc jumped its bearings at 25fps. 

In Russia, Léon (Lev) Theremin had started work on a similar system that used a drum instead of a disc – it worked much better than the Baird disc, but the world still only remembers Theremin for that weird electronic instrument so beloved of 1950s sci-fi film score composers. Theremin’s TV was also using interlacing in 1926, though, again unfairly, the invention is usually attributed to one Randall C. Ballard, whose patent RCA only filed in July 1932.

As the patent describes, interlacing was a cunning way of getting a higher line count without increasing the bandwidth of the channel being used to transmit the TV signal. Let’s say that you want to transmit a 625-line picture, but the channel that the government have allotted you will only allow that resolution to be transmitted at 25fps – which, shown on a CRT, isn’t fast enough: the picture flickers. Human persistence of vision generally needs a light source to flash at least 40 times per second to be perceived as continuously ‘on’. Interlacing sends every other line of the picture first, then returns to the top of the image and sends the lines that it missed the first time. Sending the whole picture still takes 1/25th of a second – the minimum our channel allows – but each field (the picture made of every other line) is sent in 1/50th of a second, fast enough for most humans to perceive as flicker-free.
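The mechanics are simple enough to sketch in a few lines. Here is a toy Python illustration (the function name and line representation are my own, not from any broadcast standard) of splitting a 625-line progressive frame into its two fields:

```python
# Sketch: splitting a 625-line progressive frame into two interlaced
# fields. Line count and names are illustrative only.

def split_into_fields(frame):
    """frame is a list of scan lines, top to bottom.
    Field 1 carries lines 0, 2, 4, ...; field 2 carries 1, 3, 5, ..."""
    field1 = frame[0::2]
    field2 = frame[1::2]
    return field1, field2

frame = [f"line {n}" for n in range(625)]
f1, f2 = split_into_fields(frame)

# Every line of the frame is still transmitted, just in two passes:
assert len(f1) + len(f2) == 625

# The whole frame takes 1/25 s, but the screen is refreshed per field,
# i.e. at twice the frame rate:
frame_rate = 25
field_rate = 2 * frame_rate   # 50 fields per second
```

The trick, as the article says, is that the eye judges flicker by the 50 Hz field rate, while the channel only ever carries 25 full frames per second.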

As well as this improvement in spatial resolution (on a still image), interlacing also has a higher temporal resolution for horizontal motion. If you pan a film camera (or any other ‘progressive’ capture camera – one which captures all the image at the same instant) you get far more judder in the image than if you pan an interlaced camera. The opposite, however, is true for vertical motion. You most often see this interlacing problem with rolling credits. There are several roll rates at which lines of the characters will ‘miss’ the field scan – imagine the top of a character should be scanned by field 1, but we are currently capturing field 2. The character then rolls up a line so should be captured by field 2, but we’re now scanning field 1. The top line of the character totally disappears. At close to this rate, it will appear and disappear – beating with the scan rate – yuk!
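The disappearing-credit effect can be simulated directly. In this sketch (a toy model of my own, not anything from a real scaler), a one-line-high feature rolls up by one line per field; because its row parity always disagrees with the parity of the field currently being scanned, it is never captured at all:

```python
# Sketch: a 1-line-high feature rolling up exactly one line per field.
# Field 1 scans even rows, field 2 scans odd rows (toy convention).

def captured(feature_row, field):
    """True if the field currently being scanned includes this row."""
    scans_even = (field == 1)
    return (feature_row % 2 == 0) == scans_even

row = 99              # feature starts on an odd row...
visible = []
for i in range(8):
    field = 1 if i % 2 == 0 else 2   # fields alternate 1, 2, 1, 2, ...
    visible.append(captured(row, field))
    row -= 1          # ...and the credits roll up one line per field

print(visible)        # the feature misses every single field
```

At roll rates close to (but not exactly at) one line per field, the parity match drifts in and out, so the line appears and disappears – the beating the article describes.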

Nevertheless, interlacing has worked valiantly for standard definition TV, based on the CRT, for all these years. The problems really started with the drawing up of the High Definition TV specifications.

Making an HD CRT is really hard. The cathode ray beam has to be incredibly well focussed – hard to do if you bend it through large angles. As a result, HD CRTs tend to be as long as Titanic and weigh more than Saturn. CRTs were never going to be a practical HD display technology, which, frankly, everyone knew when HD was first proposed.

Most practical HD displays are progressive – they display all lines of the image simultaneously – whether they be LCDs or Plasma or whatever. Faced with an interlaced signal they have to de-interlace it. There are a variety of techniques available to do this, varying in complexity, quality and cost, but most degrade the picture such that, in real TV with moving pictures, it’s hard to tell the difference between 1080i and 720p. It’s only when the picture is dead still that the increased spatial resolution of 1080i becomes apparent. Furthermore, these display technologies don’t ‘paint’ the picture with a decaying trace, the way a CRT does, so the persistence of vision flickering problem is reduced.
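Two of the simplest de-interlacing strategies can be sketched in a few lines – a toy model with made-up line data, not any real scaler’s algorithm. ‘Weave’ stitches the two fields back into one frame, which is exact on a still image but combs horribly on motion; ‘bob’ uses one field and interpolates the missing lines, which avoids combing but halves the vertical resolution:

```python
# Sketch: two naive de-interlacing strategies, with scan lines
# represented as numbers. Real de-interlacers are far more
# sophisticated (motion-adaptive, motion-compensated, etc.).

def weave(field1, field2):
    """Interleave the two fields back into a full frame.
    Exact on a still image; combs if anything moved between fields."""
    frame = []
    for a, b in zip(field1, field2):
        frame += [a, b]
    return frame

def bob(field1):
    """Use one field only, synthesising the missing lines by
    averaging vertical neighbours. No combing, half the resolution."""
    frame = []
    for i, line in enumerate(field1):
        frame.append(line)
        nxt = field1[i + 1] if i + 1 < len(field1) else line
        frame.append((line + nxt) / 2)  # interpolated line
    return frame

still = [0, 1, 2, 3, 4, 5]              # a 6-line progressive source
f1, f2 = still[0::2], still[1::2]
assert weave(f1, f2) == still           # weave is exact when nothing moved
print(bob(f1))                          # prints [0, 1.0, 2, 3.0, 4, 4.0]
```

The article’s point follows directly: on moving pictures the de-interlacer must fall back towards bob-like behaviour, throwing away exactly the extra vertical resolution that was 1080i’s selling point over 720p.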

Interlacing – fundamentally a solution to a CRT-based problem – should never have been included in the HDTV specification; it just encouraged equipment manufacturers and broadcasters to use it. Like Baird’s spinning disc, it was obsolete before its inception. Hopefully, it will be consigned to the history books (or The Internet, as they’re now known) in the same way.

Posted on May 11, 2010 and filed under comment.