Cover Story (sidebar) / March 1994

The Tools for New TV

Tom R. Halfhill

Think of it as the world's largest WAN (wide-area network) with the world's largest database servers at one end and the world's largest number of clients at the other: That's the vision for broadband ITV (interactive TV).

The clients, of course, are ordinary TV sets, augmented by a new generation of digital set-top boxes that will rival the processing power of today's PCs and workstations. The servers are likewise a new breed of computers that not only have enough storage for vast libraries of movies, TV shows, and multimedia applications, but also are capable of feeding that data downstream to millions of users — in real time, on demand. In between, tying everything together, is the nationwide broadband network that flawlessly switches all this traffic while ensuring that every transaction is billed to the appropriate user.

The whole system is far larger and more complex than anything that exists today. Its nearest relative is the public telephone system — a low-bandwidth network that terminates into relatively simple analog devices and is designed to deliver communications instead of content. ITV is so new that critical pieces of the hardware and software technologies are still being invented. It won't come cheap, it won't come easy, and it probably won't come as quickly as some people are predicting.

Will it come at all? No question. ITV definitely isn't a solution in search of a problem. In fact, it's the technology that's the problem.

Consider, for example, the servers that will form the hub of this great network. Most of today's databases store relatively simple data (e.g., names and addresses), and their I/O model is transactional, so minor delays when accessing records are tolerable. But headend servers on the ITV network must store full-motion video, stereo sound, and other rich data types. These "video servers" must also achieve real-time or near real-time throughput, because even brief delays will cause visible glitches on home TV screens.
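
The difference is easy to see in code. Below is a minimal sketch in C of the deadline-driven pump loop a video server needs, assuming a 30-frame-per-second stream and with the frame-fetching and network routines stubbed out as hypothetical placeholders. Unlike a transactional database, which can simply retry after a delay, each frame here must go out on a fixed schedule; arriving late is a visible glitch, not a retryable delay.

```c
/* A minimal sketch of deadline-driven video I/O: one compressed
   frame must go out every 1/30 second, on an absolute schedule. */
#include <stdio.h>
#include <time.h>

#define FRAMES_PER_SEC 30
#define NSEC_PER_SEC   1000000000L

/* Stand-ins for the real work: fetching a compressed frame from the
   disk array and pushing it downstream (both hypothetical here). */
static int  read_frame(void) { return 1; }
static void send_frame(void) { }

int main(void)
{
    struct timespec deadline;
    clock_gettime(CLOCK_MONOTONIC, &deadline);

    for (int i = 0; i < 300 && read_frame(); i++) {  /* ~10 seconds */
        send_frame();
        /* Advance the deadline by one frame period and sleep until
           then; missing it means a glitch on the viewer's screen. */
        deadline.tv_nsec += NSEC_PER_SEC / FRAMES_PER_SEC;
        if (deadline.tv_nsec >= NSEC_PER_SEC) {
            deadline.tv_nsec -= NSEC_PER_SEC;
            deadline.tv_sec++;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &deadline, NULL);
    }
    printf("streamed 300 frames on schedule\n");
    return 0;
}
```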

Oracle (Redwood Shores, CA), which hopes to become a key player in this field, says that most of its current database customers manage 100 to 150 GB of data. Oracle's biggest customer, a credit-history company, has a database approaching 1 TB (1024 GB). But the ITV network of the future will store the world's entire movie library, estimated at 65,000 films. Each film requires 1.5 GB or more of storage when compressed in MPEG-2 format. That adds up to about 95 TB. Now add all the historical news footage and popular TV shows that will eventually be stored, too. And don't forget the other content, such as electronic catalogs and interactive encyclopedias, and things yet to be imagined.
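
The arithmetic behind that estimate is straightforward; the short C program below simply multiplies out the figures quoted above.

```c
/* Back-of-the-envelope check on the movie-library figure:
   65,000 films at 1.5 GB each, the numbers given in the text. */
#include <stdio.h>

int main(void)
{
    const double films       = 65000.0;
    const double gb_per_film = 1.5;                  /* MPEG-2 compressed */
    const double total_gb    = films * gb_per_film;  /* 97,500 GB */
    const double total_tb    = total_gb / 1024.0;    /* ~95 TB */

    printf("Movie library: %.0f GB = about %.0f TB\n", total_gb, total_tb);
    return 0;
}
```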

The bulk of this material will be archived in near-line storage: banks of automated jukeboxes that can mount tape cartridges or optical disks on the video server within seconds of a user's request. The server will copy the video onto its local mass storage, probably striping the data across arrays of hard disks for redundancy and faster access. Then it will buffer the data in RAM while pumping it downstream to the user's set-top box. Frequently accessed material, such as the most popular movies and games, may be permanently maintained on local storage. Special software will track viewing habits, automatically loading It's a Wonderful Life in December.
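
A rough sketch of the striping idea, assuming a hypothetical 16-drive array and a 64-KB stripe unit (both invented for illustration): logical blocks are dealt round-robin across the drives, so a long sequential read keeps every spindle busy. Redundancy would come from parity blocks in a RAID-style scheme, omitted here for brevity.

```c
/* Striping sketch: map each logical block of a video file to a
   (drive, block-on-drive) pair, round-robin across the array. */
#include <stdio.h>

#define NUM_DISKS  16      /* assumed array size */
#define BLOCK_SIZE 65536L  /* assumed 64-KB stripe unit */

static void locate(long logical_block, int *disk, long *physical_block)
{
    *disk           = (int)(logical_block % NUM_DISKS);
    *physical_block = logical_block / NUM_DISKS;
}

int main(void)
{
    for (long lb = 0; lb < 8; lb++) {
        int  disk;
        long pb;
        locate(lb, &disk, &pb);
        printf("logical block %ld -> disk %d, block %ld\n", lb, disk, pb);
    }
    return 0;
}
```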

Consumers will expect the same reliability from the ITV network that they do from the public phone system, so video servers will need careful maintenance. An array of 1000 hard disks will lose an average of one drive per day, according to MTBF (mean time between failures) statistics. Technicians will patrol rooms of servers and jukeboxes, hot-swapping failed drives on the spot, just as they used to keep ENIAC running by constantly replacing blown vacuum tubes.
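
That one-failure-per-day figure implies a per-drive MTBF of roughly 24,000 hours, because an array of N independent drives fails N times as often as a single drive. The sketch below works through the arithmetic; the 24,000-hour figure is inferred from the article's numbers, not quoted from a drive manufacturer.

```c
/* The arithmetic behind the failure rate: with N independent drives,
   the array's expected time between failures is MTBF / N. */
#include <stdio.h>

int main(void)
{
    const double mtbf_hours = 24000.0;  /* implied per-drive MTBF */
    const double drives     = 1000.0;

    const double array_mtbf_hours = mtbf_hours / drives;
    printf("Expected failures: one every %.1f hours (%.2f per day)\n",
           array_mtbf_hours, 24.0 / array_mtbf_hours);
    return 0;
}
```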

If the storage requirements of video servers seem daunting, the I/O is nightmarish. During peak hours in major cities, thousands of people may be requesting videos. Today's broadcast model is synchronous: One "copy" of a movie is sent over cable or the airwaves to a mass audience, and everyone watches it at once. Pure video-on-demand is asynchronous: If 5000 people on Saturday night want to watch the latest hit film, only a few will punch in their orders at the same moment. Thus, the server must stream the same video to thousands of destinations according to different time bases, some only seconds apart.

To complicate matters still further, the video server will provide virtual VCR functions, such as pause, rewind, fast-forward, slow motion, and frame advance. So it has to update thousands of file pointers to keep pace with frequently shifting viewing patterns.
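
Here is a sketch of the bookkeeping these two paragraphs imply, with invented names and an assumed 1.5-Mbps stream rate: every viewer owns an independent pointer into the same shared file, and VCR-style commands simply move or freeze that pointer.

```c
/* Per-viewer stream state: thousands of independent pointers into
   one shared video file, each on its own time base. */
#include <stdio.h>

#define BYTES_PER_SEC 187500L  /* ~1.5-Mbps MPEG stream, an assumed rate */

typedef enum { PLAYING, PAUSED } state_t;

typedef struct {
    long    offset;  /* this viewer's position in the shared file, bytes */
    state_t state;
} stream_t;

/* Called once per second per stream by the server's pump loop. */
static void tick(stream_t *s)
{
    if (s->state == PLAYING)
        s->offset += BYTES_PER_SEC;  /* send the next second of video */
}

static void pause_stream(stream_t *s)  { s->state = PAUSED; }
static void resume_stream(stream_t *s) { s->state = PLAYING; }
static void rewind_stream(stream_t *s, long secs)
{
    s->offset -= secs * BYTES_PER_SEC;
    if (s->offset < 0) s->offset = 0;
}

int main(void)
{
    stream_t a = {0, PLAYING}, b = {0, PLAYING};  /* same movie, two homes */
    tick(&a); tick(&b);    /* second 1: both advance */
    pause_stream(&b);      /* viewer B pauses... */
    tick(&a); tick(&b);    /* second 2: only A advances */
    tick(&a); tick(&b);    /* second 3: only A advances */
    resume_stream(&b);
    rewind_stream(&a, 1);  /* viewer A backs up one second */
    printf("A at byte %ld, B at byte %ld\n", a.offset, b.offset);
    return 0;
}
```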

What kind of computer can do all this? "I think 'computer' may be the wrong word," says Greg Hoberg, marketing manager of the video communications division at Hewlett-Packard (Santa Clara, CA). "It's really an I/O machine." Hoberg says the problem is not computational and therefore requires an entirely new approach to hardware design. "We're trying to come up with the architecture that is appropriate to this problem. It's a problem of I/O and mass storage, not a problem of MIPS."

HP's video server, dubbed the Video Engine, is expected to be ready in about a year. Hoberg says it will be a highly scalable machine that fits into HP's vision of numerous servers distributed across a hierarchical network. Local servers will supply the most popular videos, while remote machines that serve many localities will store less-popular content. This topology could minimize headend costs without compromising access.
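
One way such a hierarchy might decide where a title lives is a simple popularity test, sketched below with invented titles and an arbitrary threshold; a real system would presumably migrate titles between tiers as demand shifts.

```c
/* Placement sketch for a hierarchical network: hot titles on the
   local (neighborhood) server, the long tail on a regional machine. */
#include <stdio.h>

typedef struct {
    const char *title;
    long        requests_per_week;
} video_t;

static const char *place(const video_t *v, long local_threshold)
{
    return (v->requests_per_week >= local_threshold)
           ? "local server" : "regional server";
}

int main(void)
{
    video_t library[] = {
        { "Hit film of the week", 5200 },
        { "Classic western",        40 },
        { "Obscure documentary",     3 },
    };
    for (int i = 0; i < 3; i++)
        printf("%-22s -> %s\n", library[i].title, place(&library[i], 1000));
    return 0;
}
```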

HP isn't alone in the scramble to gain a foothold in the high-stakes video-server market. IBM and DEC see video servers as a potential use — even a savior — for large minicomputers and mainframes. Microsoft, Intel, AT&T, Silicon Graphics, Motorola, nCube, and Oracle are a few of the other companies working on hardware and software. As with anything new, different approaches are emerging.

Unlike HP, Oracle and nCube (Foster City, CA) think video servers do need great computational power. They're designing servers using nCube's massively parallel computers and Hypercube architecture. In their view, symmetric multiprocessor systems have too much hardware overhead and will quickly fall victim to bus saturation if applied to large-scale video-on-demand.

To boost the server's I/O bandwidth, nCube interconnects large numbers of proprietary microprocessors comparable to a 386 but optimized for throughput. Oracle, which is writing the software, says an nCube-based video server with 1024 processors could supply video to 7000 homes. A larger nCube-2 computer supports up to 4096 processors and could serve 30,000 homes.
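
Dividing those figures out shows how modest the per-node load would be, which underscores that the challenge is aggregate I/O rather than raw computation:

```c
/* Per-processor load implied by the figures quoted above. */
#include <stdio.h>

int main(void)
{
    printf("1024 processors, 7000 homes:   %.1f streams/processor\n",
           7000.0 / 1024.0);
    printf("4096 processors, 30000 homes:  %.1f streams/processor\n",
           30000.0 / 4096.0);
    return 0;
}
```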

"No one knows for sure how these machines will be used," says Benjamin Linder, director of technical marketing for Oracle's Media Server project. "So Oracle is designing a system that's as general as possible. We're trying to create servers to act as living libraries on the data superhighway."

Linder says that the massively parallel approach is overkill for video I/O, even on this scale, but nevertheless makes sense because the server could handle tasks that otherwise would be shunted downstream to the user's set-top box.

This is a key point. New computers are needed for both ends of the ITV network — clients as well as servers. Digital set-top boxes are much more than simple tuners or descramblers, yet their cost must be driven down to about $300 before broadband ITV is economical.

Consider what a typical box might contain. Start with a powerful CPU, such as a 486, PowerPC, or MIPS R4000. Add 1 to 3 MB of RAM; a high-speed graphics chip for screen overlays and video games; a display chip; a 1-GHz RF tuner; a demodulator; an error-correction chip; an MPEG-2 decoder; logic to strip the audio soundtrack from the incoming video; a Dolby decoder; two 16-bit audio D/A converters; a video RGB converter; an RF modulator; an infrared interface for remote control; flash ROM for the operating system; a security chip to prevent theft of service; and a switching power supply.
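
Here is one plausible reading of how those parts line up along the signal path, expressed as a simple C listing; the ordering is inferred from the parts list above, not taken from a published block diagram.

```c
/* A hypothetical set-top box signal path, stage by stage. */
#include <stdio.h>

int main(void)
{
    static const char *signal_path[] = {
        "RF tuner (select a channel from the 1-GHz cable band)",
        "demodulator (recover the digital bitstream)",
        "error-correction chip (repair transmission errors)",
        "demultiplexing logic (strip soundtrack from the video)",
        "MPEG-2 decoder (decompress the video)",
        "Dolby decoder + dual 16-bit D/A converters (audio out)",
        "graphics chip (overlay menus and games on the picture)",
        "RGB converter + RF modulator (analog signal to the TV)",
    };
    for (int i = 0; i < 8; i++)
        printf("%d. %s\n", i + 1, signal_path[i]);
    return 0;
}
```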

"People have this interactive TV vision, but if the set-top box costs $1000, it's not going to be worth it," notes Roger Kozlowski, vice president and technical director for the consumer segment of Motorola (Phoenix, AZ). That's why Oracle and nCube are designing video servers that can shoulder part of the computational burden. They seem to be in the minority, however. Other companies are betting the set-top technology will be affordable by 1995 or 1996 — and it'll be at least that long before the network infrastructure is ready to support it.

Illustration: Broadband Interactive TV. To provide video-on-demand, multimedia encyclopedias, and other new services, ITV networks will need high-speed servers with vast amounts of mass storage. Material will be stored on digital tapes or optical disks in automated jukeboxes. When the network receives a user's request via the upstream backchannel, the server will retrieve the appropriate file from the jukebox and copy it to a hard disk array, from which the compressed video will be spooled downstream to the user's digital set-top box. The box will decode and decompress the video and then convert it to an analog signal for the TV.

Tom R. Halfhill is a BYTE senior news editor. You can reach him on the Internet or BIX at thalfhill@bix.com.
