
Data Broadcasting and Interactive Television

REGIS J. CRINON, DINKAR BHAT, DAVID CATAPANO, GOMER THOMAS, JAMES T. VAN LOO, AND GUN BANG

Invited Paper

This paper provides an overview of the digital television (DTV) data broadcast service and interactive service technologies that have been deployed over the last ten years. We show how these trials have led to the development of data protocol and software middleware specifications worldwide. Particular attention is given to the series of standards established by the Advanced Television Systems Committee. Experimental deployments to both personal computer (PC) and set-top box (STB) receivers are considered, with an emphasis on the services that have introduced new business models for DTV operators.

Keywords—Datacasting, digital television (DTV) infrastructure, digital television (DTV) middleware, digital television (DTV) services, interactive television.

I. INTRODUCTION

The introduction of digital television (DTV) services has opened up many new vistas. One of these is the ability to include data in a DTV broadcast stream along with the audio and video. This capability can be used to provide an enhanced experience for television viewers (interactive television data broadcasting), and it can be used to deliver data for applications that have no direct connection to television programming (general purpose data broadcasting). This paper deals with both classes of “datacasting” (data broadcasting) applications.

An important enabler for interactive television is new functionality in TV receivers, including frame buffers and new capture logic for demultiplexing and parsing digital streams. Because the new, sophisticated receiver environments offer more screen real estate and higher resolution

Manuscript received July 20, 2005; revised October 15, 2005.

R. J. Crinon and J. T. Van Loo are with Microsoft Corporation, Redmond, WA 98052 USA (e-mail: [email protected]; [email protected]).

D. Bhat, D. Catapano, and G. Thomas are with Triveni Digital, Inc., Princeton Junction, NJ 08550 USA (e-mail: [email protected]; [email protected]; [email protected]).

G. Bang is with the Electronics and Telecommunications Research Institute, Daejeon 305-350, Korea (e-mail: [email protected]).

Digital Object Identifier 10.1109/JPROC.2005.861020

graphics, over-the-air broadcast, satellite, and cable operators have realized that there is an opportunity to supplement their audio/video services with new types of interactive and data enhancement services.

An early form of interactive television came with the advent of the electronic program guide (EPG) application that allows consumers to navigate through a large set of digital channels. The development of the World Wide Web in the early 1990s and the rapid growth in Internet services gave an additional incentive to DTV operators to look beyond simple EPG applications. To understand the various potential benefits and pitfalls of interactive TV, cable and over-the-air broadcasters launched a series of interactive TV trials starting in the mid-1990s, including nationwide trials by the Public Broadcasting Service (PBS) and a number of its member stations in 2001 and 2002, which were based on the Advanced Television Enhancement Forum (ATVEF) specification.

At the same time, general purpose datacasting was being considered by many public TV stations as a means of furthering their public service mission and by some commercial TV stations as a possible source of new revenue.

The interest in new data services was further motivated by the need to download software upgrades for the new DTV receiver systems. For example, in the mid-1990s, the Tele-TV service was designed to support a data download capability for delivering operating system patches/enhancements to its set-top box terminals.

All of these factors led to a wide recognition of the need for standards, and the Advanced Television Systems Committee (ATSC) in particular started working on the A/9x series of standards to address the data delivery aspects of data broadcast and interactive television services, or in other words, the fundamental transmission protocols and signaling mechanisms needed for the deployment of any such services. In parallel with this effort came the realization by the industry that standardizing the software run-time environment and application interfaces in DTV receivers is necessary if broad deployment of such services is to happen. The idea here was to provide a minimum set of software interfaces and hardware

0018-9219/$20.00 © 2006 IEEE

102 PROCEEDINGS OF THE IEEE, VOL. 94, NO. 1, JANUARY 2006


resources that service providers can assume are available in any DTV receiver. In ATSC, this standardization effort took the name of DTV Application Software Environment (DASE) and was completed in 2003.

With the prospect of standardized software infrastructure in every receiver, consumer electronics companies, information technology companies, and cable operators in the United States got together to solve the problem of how such new DTV receivers could interface with the communication and conditional access protocols used by cable operators. From this collaboration came the unidirectional cable interface based on the concept of a “point of deployment” (POD) module, which is a Personal Computer Memory Card International Association (PCMCIA) card for translating cable conditional access to a standardized interface into the DTV receiver based on the Digital Transmission Content Protection (DTCP) specification. CableCARD is the marketing term that has been adopted by the consumer electronics and cable companies for the POD module.

Data broadcasting and interactive television are slowly becoming a reality as, one by one, the standards and technologies fall into place. The next milestone will certainly be a bidirectional cable agreement between consumer electronics and information technology companies and cable operators to allow full interactivity between a DTV receiver and a cable head-end. Once the agreement and the supporting standards are ready, it will only be a matter of time before the broad consumer market gets access to datacasting and interactive services. So, what are these protocol and software technologies that are going to make this all possible? What are the remaining challenges? This is what we propose to discuss in this paper.

The paper is organized around two main topics: general purpose data broadcasting and interactive television. Although these application domains have much in common, they have a number of key differences. General purpose datacasting services are often targeted at enterprises (such as businesses or schools) rather than consumers. Their target receiving devices are typically PCs or portable devices rather than TV receivers. The data items are typically not delivered in real time and must be cached on the hard disk drive of the receiver before the application consuming the data can start using them. The broadcast data items are often shared among multiple users on a LAN.

II. GENERAL PURPOSE DATA BROADCASTING

A. Business Motivation

In the United States, most of the stations involved in data delivery over digital terrestrial television broadcasts so far have been public broadcasting stations, largely because of differences in mission and funding between public broadcasting stations and commercial broadcasting stations.

1) Public Broadcast Stations: Most public television stations view their public service mission in broad terms, encompassing not only public service programming, but also provision of other educational and information services to their communities, and even a role in pioneering innovative broadcast technologies.

The funding sources for public television stations both reflect and reinforce this view. Most of them get a mix of money from government grants, corporate sponsorships and grants, and individual memberships and gifts. This funding is seldom in the nature of direct payment for service, but is based instead on a general perception that the stations are providing valuable public services.

Thus, public television stations can embrace data broadcasting in support of education, emergency management, and other public services without having to show a direct return on the investment. Their enhanced reputation from using innovative new technology to provide new types of services leads indirectly to increased funding. The result is that at the time of this writing, around 50 public television stations have installed a DTV data broadcast system of some kind, with which they are deploying a variety of applications.

2) Commercial Stations: Commercial television stations typically need to show a more direct connection between the services they offer and the payment for them.

Their traditional revenue sources are based on advertising targeted to consumers, so it is natural for them to look toward data broadcast services aimed at consumers and supported by advertising, for example, data-enhanced TV programming or data-only “TV programs.” However, broadcasters are reluctant to invest in data-enhanced TV programming until enough consumers have data-capable DTV sets, and consumers are reluctant to pay for data-capable DTV sets until there is a significant amount of data-enhanced programming available, a classical “chicken and egg” situation. This is aggravated by the fact that until very recently there was no standard for data-enhanced programming that was widely accepted across both terrestrial and cable environments.

Another option for commercial TV broadcasters is leasing bandwidth to third parties for use as a data delivery pipe for applications that may be unrelated to TV broadcasting. It has taken a while for this to develop, since it requires a whole new business orientation for broadcasters, with new sales channels and new marketing expertise.

There were several early attempts to start up services based on delivering various kinds of digital files to consumers over DTV signals, but the cost of the necessary custom receivers created a formidable economic barrier, and the attempts were not very successful. More recently, the cost of data-capable receivers (and disk storage) has come down, and several efforts are under way now to offer push video on demand (VOD) services, whereby video files are broadcast to data-capable set-top boxes in the hands of individual consumers and stored on disk there. The consumer can then view the videos on demand, either on a pay-per-view or subscription basis. Early indications for the success of such services are promising, although it is too early to tell for sure yet.

The bottom line is that so far only a handful of commercial stations are broadcasting data.


B. Applications

Digital TV enables a great many different types of data broadcast applications, which may be classified in a number of ways [1]. Three of the more important classification criteria are coupling, target audience, and type of data (from a technology standpoint).

1) Coupling: One classification basis for applications is the degree to which the application is coupled to the normal TV programming.

Tightly coupled data are intended to enhance TV programming in real time. An example of such an application is the display on demand of player, team, league, and game stats during a sports event. The timing of such data is often closely synchronized with the video and audio frames being shown.

Loosely coupled data are related to the program but not closely synchronized in time. For example, an educational program might broadcast supplementary self-test quizzes as data along with the program.

Noncoupled data are typically contained in separate “data-only” virtual channels. Some examples of applications enabled under this category are delivery of software updates to DTV receivers, delivery of emergency alerts to public safety agencies, and push VOD services.

2) Target Audience: Another important classification basis for applications is in terms of the target audience, i.e., whether the broadcast data is targeted to enterprises or to the consumer mass market.

For data broadcast applications targeted to consumers, a key requirement for success is that it must be possible to produce low-cost DTV receivers that can receive and consume the data. At the time of this writing, DTV technology is just reaching the point of maturity where such low-cost data-capable receivers are becoming possible.

For applications targeted to enterprises, the cost of the DTV receivers is less crucial. Data may be sent only to specialized receivers in an enterprise, which then may distribute it, for example over a LAN, to other users in the enterprise. Since the value of such applications may easily outweigh the cost of specialized receivers, they are more viable from a business perspective. Almost all enterprise-targeted applications involve loosely coupled data. Examples of such applications include distribution of educational material to public schools and distribution of emergency alerts to public safety agencies.

3) Technology: Applications may also be classified by the type of data, for example, streams (such as streaming video or audio), files (such as video clips or text documents), and network datagrams (such as Internet Protocol (IP) packets). The type of data can be further broken down into synchronized, synchronous, or asynchronous data.

Synchronized data has a strong timing relationship with another stream, such as a video stream. Each item in a synchronized data stream is intended to be presented at a specific time, relative to a clock established in another stream in the broadcast. Synchronous data has an internal timing association among its own data items, but not with any other stream in the broadcast. Asynchronous data has no internal or external timing relationships.
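The three timing classes above can be captured in a small data model. The sketch below is illustrative only: the `DataItem` fields and the `classify` helper are our own names for the concepts in the text, not part of any standard.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Timing(Enum):
    SYNCHRONIZED = "synchronized"   # timed against a clock in another stream
    SYNCHRONOUS = "synchronous"     # internal timing among its own items only
    ASYNCHRONOUS = "asynchronous"   # no timing relationships at all

@dataclass
class DataItem:
    has_internal_timing: bool           # timing association among its own items
    external_clock_pid: Optional[int]   # PID of the stream whose clock governs it

def classify(item: DataItem) -> Timing:
    # Synchronized data is presented relative to a clock established in
    # another stream; synchronous data keeps only its own internal timing.
    if item.external_clock_pid is not None:
        return Timing.SYNCHRONIZED
    if item.has_internal_timing:
        return Timing.SYNCHRONOUS
    return Timing.ASYNCHRONOUS
```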

C. Application Requirements

The different datacasting applications described above require a variety of different types of functionality in order to meet their objectives [2], such as the following.

• Bandwidth management: Often broadcast bandwidth must be allocated among multiple applications, varying by date and time, with priorities among the applications.

• Scheduling: Different types of schedules may be required, such as one-time delivery, periodic delivery at specified intervals, continuous carouseling, etc. Start times, durations, priorities, and bitrates may need to be specified. For some applications it may be necessary to specify separately the times at which the data is fetched from its source and the times at which it is broadcast.

• Flow control: To efficiently utilize the available bandwidth in a transport stream for data broadcasting, the multiplexer must exchange bandwidth information, usually called flow control messages, with the data server. The data server then adjusts its output bandwidth accordingly.

• Queue management: Different content items may be assigned different priorities during scheduling. For instance, critical items that need to go out at specific times may get the highest priority at the scheduled time, whereas noncritical items may get lower priorities. Further, certain items may be given fixed bandwidth profiles, as compared to others that are sent on a best-effort basis.

• Error recovery: Parts of the content may get corrupted or lost during transmission. While the transmission system typically has built-in error correction (for example, Reed–Solomon and trellis forward error correction (FEC) in the 8-VSB (vestigial sideband) modulation system used for ATSC DTV broadcasts), additional correction at a higher layer can be useful in more lossy conditions.

• Receiver targeting: Often content has to be targeted to reach a specific set of receivers; for instance, in a distance education application, only students registered for a course must receive the course material. Receivers may be targeted by identity or attribute.

• Encryption: Content may need to be encrypted, for instance, in a homeland security application. Securely delivering the encryption key to receivers is a challenge. Encryption must work in tandem with receiver targeting, since the key must only be available to targeted receivers.

• Receiver service discovery: The basic structure of a DTV broadcast stream is specified in the so-called MPEG-2 Systems standard [5], which is a member of the family of MPEG-2 standards developed by the Moving Picture Experts Group, a working group of the International Organization for Standardization (ISO). The broadcast stream consists of a sequence of transport stream packets, and each packet contains in its header an identifier called a PID that identifies which specific audio stream, video stream, data stream, etc., the packet


belongs to. Service discovery means finding the broadcast band and the specific PID or PIDs for the transport stream packets carrying the data for the desired service, identifying the encapsulation format used, identifying the correct IP addresses (for data carried in IP packets), etc.

• Receiver acknowledgments: For certain applications, like emergency alert notification, it is imperative that content be received. In such applications, receivers must be made aware that they should acknowledge certain data items and must be told where they should send their acknowledgments.

• Status reporting: Both the content server and receivers must report status so that they can be monitored and controlled. The server must report information like the transmitted items, total bits sent out, bandwidth usage, number of acknowledgments received from receivers in the field, etc. The receiver must report on the bandwidth currently being received, the current cache size, etc.

• Remote monitoring and control: Operators of data servers need tools to monitor and control the data server remotely. Features that may be needed in a remote controller include the ability to start and stop the server, monitor bandwidth usage, monitor warnings and errors, and change settings like the identifier value of the MPEG-2 program element used for broadcasting the data.

• Autolaunching content on arrival: Sometimes it is useful for content to be automatically launched on arrival by the appropriate application. For instance, when a Hypertext Markup Language (HTML) page describing an emergency is delivered, the page may be launched immediately on arrival.
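The service discovery requirement above rests on the MPEG-2 transport packet layout: each 188-byte packet starts with a 0x47 sync byte and carries a 13-bit PID in its header. As a rough illustration of how a receiver picks out the PID, here is a minimal sketch; the function name and the returned field names are our own, and real demultiplexers handle many more header fields.

```python
def parse_ts_header(packet: bytes) -> dict:
    """Extract basic fields from one 188-byte MPEG-2 transport stream packet."""
    if len(packet) != 188 or packet[0] != 0x47:   # 0x47 is the TS sync byte
        raise ValueError("not a valid transport stream packet")
    # The PID is 13 bits: the low 5 bits of byte 1 plus all of byte 2.
    pid = ((packet[1] & 0x1F) << 8) | packet[2]
    payload_unit_start = bool(packet[1] & 0x40)   # start of a PES packet/section
    continuity_counter = packet[3] & 0x0F         # 4-bit per-PID counter
    return {"pid": pid,
            "payload_unit_start": payload_unit_start,
            "continuity_counter": continuity_counter}
```

A demultiplexer filters the packet sequence on the PID values announced in the signaling tables and reassembles each elementary stream from the matching payloads.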

D. Protocols

In order to support efficient interoperability among data broadcast products, a major requirement is that the data must be encapsulated in standard formats and delivered using standard protocols. In addition, there must be a well-defined in-band service description framework for signaling the presence of current data services and announcing future data services. The ATSC Data Broadcast Standard [3] describes basic encapsulation formats, protocols, and signaling and announcement information that can be used in different scenarios.

The encapsulation formats and protocols are arranged in a layered fashion, as illustrated in Fig. 1. They are all based on the MPEG-2 transport stream structure. The different encapsulation formats have different structures layered above that, as described below.

Data piping encapsulation is intended for proprietary applications with special requirements that are not met by the other standard encapsulation protocols. The data items are simply packed into MPEG-2 transport stream packets in an application-dependent way. Since the standard encapsulation protocols meet the needs of most applications, this form of encapsulation is not being widely used in practice today.

Fig. 1. Protocol stack defined in the ATSC data broadcast standard.

Fig. 2. Addressable section encapsulation for protocol datagrams.

Data streaming encapsulation is intended for streaming data that make use of the program clock reference (PCR) and presentation time stamp (PTS) values defined in the MPEG-2 Systems standard [4] for timing. The data stream is packed into packetized elementary stream (PES) packets, the same type that is defined in the MPEG-2 Systems standard for carrying audio and video streams for TV programming. In current practice, however, most streaming data is IP streaming media, typically carried in the form of IP multicast streams encapsulated by protocol datagram encapsulation.

Protocol datagram encapsulation is intended for carrying network protocol traffic, such as IP traffic, in a DTV broadcast stream. The protocol encapsulation format specified in the ATSC Data Broadcast Standard, often called multiprotocol encapsulation (MPE), encapsulates network protocol data units in Digital Storage Media Command and Control (DSM-CC) addressable sections [8], as shown in Fig. 2. These addressable sections are in turn packed into MPEG-2 transport stream packets.

This method is capable of handling diverse protocols, including those in the IP family, the Internetwork Packet Exchange (IPX) family, and the Open Systems Interconnection (OSI) family. Logical Link Control/Subnetwork Access Protocol (LLC/SNAP) headers [5], [6] are used within the addressable sections to identify the network protocol which is being carried. However, in recognition of the widespread usage of the IP protocol, these headers are optional (and in fact deprecated) for the IP protocol, but are required for all other network protocols.
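For protocols other than IP, the LLC/SNAP header prefixes the datagram inside the addressable section. Assuming the standard IEEE 802.2 SNAP layout (DSAP/SSAP 0xAA, control 0x03, a three-byte OUI, and a two-byte protocol type), the header can be sketched as:

```python
def llc_snap_header(ethertype: int) -> bytes:
    """Build an LLC/SNAP header identifying a network protocol by EtherType.

    DSAP/SSAP 0xAA with control 0x03 marks a SNAP header; an all-zero OUI
    means the next two bytes are an EtherType (e.g., 0x8137 for Novell IPX).
    """
    return bytes([0xAA, 0xAA, 0x03, 0x00, 0x00, 0x00]) + ethertype.to_bytes(2, "big")
```

The eight-byte header is prepended to each datagram, letting the receiver dispatch the payload to the right protocol stack.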

IP multicasting is especially attractive for datacasting because it is a broadcast protocol, it is well understood, receivers can implement IP stacks easily, and data received over the broadcast medium can be treated much the same as regular IP traffic. In particular, it can be rerouted over LANs if required.


Fig. 3. Data download encapsulation.

Data download encapsulation is intended primarily for files, in either a one-time delivery or continuous carousel mode. It can also be used for delivery of unbounded data streams. The data download protocol specified in the ATSC Data Broadcast Standard is based on the DSM-CC data download protocol [9]. It provides an efficient way of packaging bounded data objects like files into modules, versioning them, sending them once or cyclically, sending control messages to enable receivers to deal appropriately with the data, and including time stamps for time synchronization if required. A data carousel, where bounded modules are sent cyclically, is a common usage of this protocol. Fig. 3 illustrates data download encapsulation.

Each data object is organized into one or more discrete data modules, and in order to carry data modules in the MPEG-2 transport stream, each module is segmented into the payloads of a number of download data block (DDB) sections, each with a maximum payload of 4066 bytes. The download protocol may be constructed with one or two layers of control information, contained in download info indication (DII) and download server initiate (DSI) sections. The DDB, DII, and DSI sections are packed into MPEG-2 transport stream packets. For simple data structures, the one-layer protocol will suffice. The one-layer protocol limits the number of modules that may be referenced to a small number and does not allow any logical grouping of modules. Hence, for more complex data structures, the two-layer protocol may be used, allowing a larger number of modules and providing a way to logically group modules.
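The segmentation step can be pictured as simple chunking: a module's bytes are split into payloads of at most 4066 bytes, one per DDB section. The sketch below shows only the chunking; the real protocol additionally wraps each chunk in a DSM-CC section header carrying module and block identifiers, which is omitted here.

```python
def segment_module(module: bytes, max_payload: int = 4066) -> list[bytes]:
    # Each chunk becomes the payload of one download data block (DDB) section;
    # 4066 bytes is the maximum DDB payload size.
    return [module[i:i + max_payload] for i in range(0, len(module), max_payload)]
```

A 10000-byte module, for example, yields two full 4066-byte payloads plus a final 1868-byte remainder, which the receiver reassembles in order.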

For delivery of unbounded data streams, a continuous sequence of new versions of a module may be transmitted.

In practice, many file delivery applications today do not use the data download protocol. Instead, they encapsulate files into IP packets with either a proprietary file encapsulation protocol or a standard file encapsulation protocol such as the Society of Motion Picture and Television Engineers (SMPTE) Unidirectional HTTP (UHTTP) [7], and then encapsulate the IP packets into MPEG-2 transport stream packets with protocol datagram encapsulation.

The next question is how receivers find data services in a transport stream. Under the ATSC Data Broadcast Standard, this can be done by means of the data service table (DST). The DST may have entries for one or more applications. Each application may have one or more “taps,” which point to resources of the application. A tap always specifies an MPEG-2 program element in an MPEG-2 transport stream, and may include additional selection information, like a specific IP address in the case of IP packets encapsulated in addressable sections. The DST is signaled in the virtual channel table (VCT) and the program map table (PMT), defined in the ATSC PSIP standard [10] and the MPEG-2 Systems standard, respectively. In order to facilitate the implementation of data broadcast services in the ATSC space, an implementation guide [11] is available, which provides a set of guidelines in accordance with the data broadcast standard. The information therein applies to broadcasters, service providers, and equipment manufacturers.

For those geographical areas of the world where the digital video broadcasting (DVB) DTV standards are used, the DVB specification for data broadcasting [12] contains data encapsulation protocols very similar to those in the ATSC standard. However, this specification takes a very different approach to signaling.

For IP multicast services, it is useful to supplement the signaling information in the DST with messages formatted according to the Internet Engineering Task Force (IETF) Session Description Protocol (SDP) [13] and IETF Session Announcement Protocol (SAP) [14]. These SDP/SAP messages describe various properties of a multicast service, including the IP addresses and UDP ports used for the service. In the ATSC domain, the use of SAP/SDP in conjunction with the DST is required for IP multicast services, as described in the ATSC standard on delivery of IP multicast sessions over ATSC data broadcast [15]. Typically, a tap in the DST for an application carrying an IP multicast service gives the IP address of the SDP/SAP announcements. This ATSC standard also describes other aspects of the management of multicast services within virtual channels under different conditions, and associated signaling requirements.
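An SDP message of the kind referenced here is plain text. A hypothetical announcement for an IP multicast data service might look like the following; every field value is illustrative only (documentation addresses, an invented service name and format token), not taken from any actual deployment:

```
v=0
o=- 1121830000 1121830000 IN IP4 192.0.2.10
s=Example datacast service
c=IN IP4 239.192.0.1/16
t=0 0
m=application 5000 udp example-datacast-format
```

The `c=` line carries the multicast group address and the `m=` line the UDP port, which is exactly the information a receiver needs to join the service.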

In the U.S. cable world, the SCTE (Society of Cable Telecommunications Engineers) standard on IP multicast for digital MPEG networks [16] describes the protocols for carriage of multicast services. While the DSM-CC addressable section encapsulation format in [16] is almost identical to that in [15], the signaling implementation is different. In [15], each application that carries a multicast service is considered an independent network (so a single MPEG-2 program may consist of multiple networks), and hence the scope of IP addresses is restricted to that application. In [16], each MPEG-2 program is considered to be a network, so the scope of IP addresses is the entire program. The approach in [16] may require more care in the management of multicast addresses, since it may require coordination among multiple independent content providers whose applications are being inserted into the same MPEG-2 program. The content providers must make sure that there are no collisions in multicast addresses. However, the ability under [15] to

106 PROCEEDINGS OF THE IEEE, VOL. 94, NO. 1, JANUARY 2006

Page 6: Data Broadcasting and Interactive Television

scope multicast addresses more efficiently comes at the priceof more complex implementation of the IP stacks in the datareceivers, since the number of virtual network interfaces inthe receiver can change dynamically as applications comeand go in a program.

For some applications it is desirable to synchronize the display of large data objects, such as complete HTML pages containing embedded images and the like, with a video stream. It is possible to do this using the PCR values that are used for audio and video synchronization. However, the PCR values may have discontinuities, for example where commercials are spliced in. This makes it difficult to place meaningful PTS values in the header of the encapsulated data objects in the usual way, since it may take a long time for a data object to download, and the timeline may suffer a discontinuity in the meantime. To deal with this problem ATSC developed a trigger standard [17], which uses separate triggers to decouple the data objects from their activation timing. The triggers are very small objects that carry the activation time for data objects and can be transmitted appropriately in the presence of timeline discontinuities. The trigger standard also enables delivery of events to receivers where the trigger itself carries a user-defined payload instead of referring to a data object.
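
The decoupling of data objects from their activation timing can be sketched as follows. The class and field names are illustrative only, not taken from the trigger standard itself:

```python
import bisect

class TriggerQueue:
    """Sketch of the decoupling described above: data objects are
    downloaded ahead of time and keyed by an identifier, while small
    trigger messages arriving later carry only the identifier and an
    activation time."""
    def __init__(self):
        self.objects = {}   # object_id -> downloaded data object
        self.pending = []   # (activation_time, object_id), kept sorted

    def object_arrived(self, object_id, payload):
        self.objects[object_id] = payload

    def trigger_arrived(self, activation_time, object_id):
        bisect.insort(self.pending, (activation_time, object_id))

    def fire_due(self, now):
        """Return payloads of all triggers whose activation time has
        passed and whose object has been fully downloaded."""
        fired = []
        while self.pending and self.pending[0][0] <= now:
            _, oid = self.pending.pop(0)
            if oid in self.objects:
                fired.append(self.objects[oid])
        return fired
```

Because the bulky object and the tiny trigger travel separately, the trigger can be sent (or repeated) after a timeline discontinuity without retransmitting the object.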

One useful application of datacasting with unique signaling requirements is delivery of in-band updates or upgrades of firmware, middleware, device drivers, and application software in terminal devices like consumer TV sets or set-top boxes. The ATSC Software Download Data Service standard [18] provides a protocol to support this application, based on the two-layer data download protocol. This protocol has built-in signaling to indicate the manufacturer and model of the device for which each downloadable software module is intended. Similar protocols are defined by the Specification for System Software Update in DVB Systems [19] for those geographical regions using the DVB digital TV standards, and by the ANSI/SCTE Host-POD Interface Standard [20] for cable environments in the United States (where ANSI is the American National Standards Institute).

So far this section has focused on protocols and encapsulation formats for transmitting data in the forward channels, but for many applications a back channel is also important. One way to enable interactivity between users and applications is to transfer application data to the terminal device’s local cache and allow the user to run the application from the local cache. This is sometimes called “local interactivity.” On the other hand, it may be necessary to interact with data and software on remote servers through an interaction channel, for instance using an Internet connection through a cable modem. This is called “remote interactivity,” and the ATSC Interaction Channel Protocols standard [21] defines protocols for the return channel. (See Section III of this paper for a more complete discussion of interactive television.)

While the datacasting standards are sufficient for transmitting and organizing data services in terms of applications, they do not provide sufficient signaling to support all the application requirements defined in Section II-C above, like targeting, support for receiver acknowledgment, etc.

Fig. 4. General data broadcast scenario.

However, the standard signaling can be extended to fill the gap through the use of in-band Extensible Markup Language (XML) messages, which will be called “catalogs” in this paper, that provide fine-grained information about what data are in the datacast stream and how receivers should handle them. The service framework described in [3] and [15] can be used to locate the catalogs themselves. In the case of data services based on encapsulated IP packets, the catalogs for an application can be periodically transmitted on a particular IP multicast address and port. The DST can tell the receiver where to find the SDP/SAP messages for the application, and the SDP/SAP messages can tell the receiver the IP addresses and UDP ports for the catalogs, as well as the IP addresses and UDP ports for the actual data. The catalogs can contain information to satisfy many of the application requirements described in Section II-C above. Consider the case when data is targeted and encrypted. Access control information in a catalog can tell whether items are intended for targeted receivers only or for all receivers. If they are for targeted receivers, then the IDs of the targeted receivers can be listed (in the case of targeting by ID). If the content is encrypted, then access control information about the encryption algorithm and the decryption keys can be contained in the catalog. Similarly, for FEC and compression, the catalog can provide information about the FEC and compression algorithms that were used. With this information in the catalog, receivers can correctly download data items. For enabling receiver acknowledgments, the catalog can specify the location where receivers must send the acknowledgment messages. More details about this approach can be found in [2].
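
Since no normative catalog schema is defined here, the fragment below is purely hypothetical (element and attribute names are invented), but it shows the kind of per-item access control information a catalog could carry and how a receiver might read it:

```python
import xml.etree.ElementTree as ET

# Hypothetical catalog fragment; the element and attribute names are
# illustrative only -- the paper does not define a normative schema.
CATALOG_XML = """
<catalog>
  <item name="scores.dat" compression="gzip" fec="reed-solomon">
    <access targeted="true" encryption="aes-128">
      <receiver id="RX-0001"/>
      <receiver id="RX-0042"/>
    </access>
  </item>
</catalog>
"""

def targeted_receivers(catalog_xml, item_name):
    """Return the receiver IDs an item is targeted to (empty list if
    the item is addressed to all receivers)."""
    root = ET.fromstring(catalog_xml)
    for item in root.findall("item"):
        if item.get("name") == item_name:
            access = item.find("access")
            if access is None or access.get("targeted") != "true":
                return []
            return [r.get("id") for r in access.findall("receiver")]
    raise KeyError(item_name)
```

A receiver comparing its own ID against this list can decide whether to download and decrypt the item, and the `compression`/`fec` attributes tell it how to reconstruct the payload.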

E. Infrastructure

In order to understand the requirements for the data broadcast infrastructure, it is first necessary to understand the general data broadcast environment.

1) Data Broadcast Environment: Fig. 4 illustrates a general data broadcast scenario, where data files, media streams, and/or protocol packets for a variety of applications are broadcast to a variety of different types of data receivers.

There are typically three roles involved in such a scenario:
• manager of the broadcast pipe (broadcaster);
• managers of the content flow (content providers);
• end users of the content (content recipients).
Depending on the specific application, these roles may be filled by members of the same organization, by members of three different organizations, or by any combination in between. In any case, any data broadcast infrastructure implementation should support the needs of all three of these roles.

2) Head-End: There are two logical architectural components at the head-end: scheduling workstations to meet the needs of the content providers, and a data server to meet the needs of the broadcaster. In actual implementations these may be combined on a common platform, or they may be on separate, distributed platforms.

A scheduling workstation allows a content provider to perform detailed scheduling for retrieval and broadcast of individual data items. The content provider can change the scheduling at any time. The scheduling information is transferred to the data server as needed. The content provider may specify to the data server the receivers to which the content must be targeted, and whether it should be encrypted. It may also specify if content must be compressed or encoded with error correction. Sometimes it is necessary for a content provider to distribute the same data to multiple data servers, to be broadcast at different rates and times. Convenient scheduling tools are needed to accommodate these different situations.

The data server is by necessity located at the broadcaster’s facility, since it is the component that actually inserts the data into the broadcast stream. The data server may also allocate bandwidth among multiple different content providers or applications, and may keep track of the bandwidth actually used by the different content providers or applications for billing purposes.

Once the data server receives the data items pushed by the providers, it applies encryption, error correction, etc., as required. The data server enforces and meters the bandwidth usage for each content provider. It can then turn over the data to an MPEG-2 gateway according to the schedules provided. The gateway would encode the data into MPEG-2 packets and feed it to a multiplexer for actual insertion into the broadcast stream. Alternatively, the data server could encode the data into MPEG-2 format itself, and feed it to the multiplexer. An advantage of the former approach is that the data server is freed from the task of interfacing with different multiplexers. On the other hand, the gateway is an extra component in the system, which can add to the cost.

Many TV stations use variable bitrate (VBR) encoders, in which the bandwidth used for encoding the video depends on the nature of the picture. When there is a great deal of detail in the frames and a great deal of change from one frame to another, more bandwidth is needed to encode the video than when there is little detail and little change from frame to frame. When VBR encoders are used, the broadcast stream has some so-called “opportunistic” bandwidth which can be used for datacasting, i.e., bandwidth which is available when the video is not very demanding but not available at other times.

Since the multiplexer is aware of how much opportunistic bandwidth is available at any time, it can provide this information in real time to a data server through a handshake protocol such as the SMPTE 325M protocol [22]. Of course, the data server must support the same protocol as the multiplexer in order to take advantage of the opportunistic bandwidth.
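
The handshake can be caricatured as follows, under the assumption that the multiplexer periodically advertises a byte budget to the data server; the actual SMPTE 325M exchange is richer than this invented interface:

```python
def fill_opportunistic_slot(available_bytes, queue):
    """Given the number of opportunistic bytes the multiplexer has
    just advertised (an invented message format -- the real handshake
    would use SMPTE 325M), pop as many queued data items as fit, in
    order, and return them for insertion along with the bytes used."""
    batch, used = [], 0
    while queue and used + len(queue[0]) <= available_bytes:
        item = queue.pop(0)
        batch.append(item)
        used += len(item)
    return batch, used
```

Items that do not fit simply stay queued until the video becomes less demanding and a larger budget is advertised.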

In various application scenarios a single content provider may need to provide data to several data servers, and conversely multiple content providers may need to provide data to a single data server.

Broadcasting networks may have distributed data servers, for instance at a national uplink site and at local stations. The national data servers insert content of national interest, while local data servers insert content of local interest.

3) Receiver End: The data receiver is the architectural component in the chain responsible for receiving data and making it available to end users. It is equipped with a device responsible for tuning to the appropriate data broadcast channel and extracting data. The device may be a set-top box or a PC receiver card.

If it is a PC-based device, it may be connected internally through a peripheral component interconnect (PCI) bus, or externally through a universal serial bus (USB) port. The former approach is a little tidier (and supported higher data rates until the advent of USB 2.0), but the latter is more convenient and portable. If the data consist of IP packets, and if the device is equipped with an appropriate software driver (like a network driver interface specification (NDIS) miniport driver on the Microsoft Windows software platform), the device may be visible to the PC user as a regular network interface card (NIC), albeit one that can only carry inbound traffic. If the underlying data are in the form of IP datagrams and the device acts as a NIC, then it can simply pass the decapsulated data to the application level via the usual IP stack. In fact, the device can often act as a router, and forward the IP data over a LAN to other users.

The application-level software can then extract targeted data, apply decryption, decompression, and FEC, and take other actions as required. It may also be responsible for sending acknowledgment messages regarding the status of received data items to the data server. If data items are incomplete because packets are corrupted or dropped during transmission, then the server can resend only those packets, and the receiver must be able to reconstruct the data items correctly.
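
The selective-repair behavior described above can be sketched as follows; the per-packet sequence numbering is an assumption made for illustration, not a protocol detail from the standards:

```python
class ItemReassembler:
    """Sketch of selective repair: packets of a data item are assumed
    to carry a sequence number (an illustrative assumption). The
    receiver records which packets arrived and can report the missing
    ones so the server resends only those."""
    def __init__(self, total_packets):
        self.total = total_packets
        self.received = {}   # seq -> payload

    def packet_arrived(self, seq, payload):
        self.received[seq] = payload

    def missing(self):
        """Sequence numbers to include in a resend request."""
        return [s for s in range(self.total) if s not in self.received]

    def assemble(self):
        if self.missing():
            raise ValueError("item incomplete; request resend first")
        return b"".join(self.received[s] for s in range(self.total))
```

The `missing()` report would travel over the acknowledgment path to the data server, which retransmits only the listed packets.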

The receiver can store items on a hard disk, perform cache management appropriately, and autolaunch applications as deemed necessary by the application. The receiver can act as an edge server, in which case the stored data is made available to other users over a LAN. In enterprise applications, receivers acting as edge servers or routers may be popular, since the receiver cost per end user is much lower.

F. Case Studies

1) WRAL/WRAZ: WRAL, a Capital Broadcasting Company station in Raleigh, NC, was the first TV station in the United States to go on the air with digital TV, in June 1996, and was among the first TV stations in the United States to start data broadcasting, in 1999. Fig. 5 shows a screen shot from the WRAL TotalCast service.

Fig. 5. Screenshot from WRAL TotalCast Service.

The TotalCast service promotes the WRAL brand by delivering files to consumer PCs in the Raleigh area with such content as a mini-Website derived from the WRAL.com Web site, video clips that can be viewed on demand (from WRAL newscasts, local programs, specials, and documentaries), audio clips (from the North Carolina News Network), computer games, and real-time stats during basketball and baseball contests.

2) Kentucky Educational Television (KET): KET, an agency of the state of Kentucky, operates a network of 16 TV transmitters covering the entire state. The KET staff began dreaming of data broadcasting in early 1999 and realized that dream when they started transmitting data in May 2003. They are currently datacasting state legislative sessions and committee hearings in the form of IP streaming media, and also datacasting weather alerts and warnings. These data services are accessible to anyone with a PC, a DTV adapter, and software available from the KET Web site. In addition, they are datacasting health services information available only to Kentucky Family and Health Services personnel, and they are working with the Department of Transportation on a project to deliver content to information kiosks in roadside rest areas throughout the state (locations which are very difficult and expensive to connect to the Internet).

3) KLCS: TV station KLCS, owned and operated by the Los Angeles Unified School District (LAUSD), installed a data broadcasting system in the fall of 2003, and soon after that implemented a fully automated system that allows teachers in the school district to order educational videos from the extensive KLCS video archive via a Web interface and have them delivered by data broadcasting directly to servers in their schools. When a video arrives at a school, the teacher who ordered it is automatically sent an e-mail notification containing the URL where the video can be accessed over the school’s LAN. The response to the initial pilot implementation was so positive that KLCS is now in the process of extending the system to all the high schools in the city, and hopes to extend it to the other schools in the city before too long.

4) Nebraska Educational Telecommunications (NET): NET operates a network of 9 transmitters and 14 translators covering the state of Nebraska. They began data broadcasting very early, and they now have several applications, all education specific. One application provides a Web site where teachers can access a searchable library index and request files containing video, audio, text, still photos, or multimedia materials and have them delivered via data broadcasting directly to their desktop. Another version of this has an edge server that serves multiple schools connected by a very high-speed data link. If the ordered content is already on the edge server, it goes to the user directly. Otherwise the content is delivered to the edge server via data broadcasting, with e-mail notification to the teacher when it arrives. They are currently working with the state Department of Roads and Department of Tourism to implement a content delivery system for kiosks at roadside rests.

G. Prospects for the Future

The number of data broadcast projects using ATSC broadcast facilities has started to rise sharply within the last couple of years, and there are several factors that are likely to make this trend continue for the foreseeable future.

• The cost of data-capable receivers will continue to fall.
• The examples of successful data broadcast projects will encourage others to replicate them in other locations.
• New ideas for data broadcast applications will continue to arise as broadcasters and entrepreneurs get a better understanding of the potential of the technology.

Not only will there be an increasing number of applications targeted to enterprise receivers, but consumer applications are likely to play an increasingly important role in the near future.

III. INTERACTIVE SOFTWARE ENVIRONMENTS

A. Business Motivations

The introduction of interactive television technologies has been driven by two consumer experiences: personalization of television services and bringing e-commerce into the living room, often referenced as “t-commerce.”

Personalization of services has been motivated by the fact that it increases consumer loyalty and also gives operators an option to localize their services (local preferences, local businesses). Services falling in this category include the following.

• EPGs: This is perhaps the oldest form of interactive TV. An EPG allows viewers to navigate through a large set of DTV channels. Over the years, EPG applications have been enhanced with personalized features allowing each viewer to preselect a preferred set of channels or to control access to channels.

• VOD applications: VOD is an interactive application that has received a lot of attention over the last few years. A VOD application provides viewers with a selection of movies and TV shows available immediately. Upon selecting a movie, the viewer initiates a session with a remote server. Over the years, VOD applications have been enhanced with search applications to facilitate browsing through large movie databases.

• Passive commercial enhancements: During a commercial, an interactive TV application provides a series of options that a TV viewer can select to get more information on various aspects of the product or service being advertised. The options and enhancements are typically shown on the screen as a graphics overlay. This concept is also applicable to news and sports enhancements.

• TV game enhancements that allow viewers to play along and measure up to the contestants on TV.

• Polling applications: Viewers are presented with an option to vote in support of a candidate. Interactive polling applications are particularly well suited for reality shows or music contests, for example.

T-commerce has been driven by the desire to move some of the e-commerce services available from the PC today to the TV screen. This includes the following.

• Active commercial enhancements that complement the passive commercial enhancements described above with the possibility of engaging in a financial transaction to purchase the product or service being advertised.

• Games involving gambling or money pools of some kind (card games, sports bets).

• Home shopping and auctions: Typically offered on dedicated channels, an application allows viewers to purchase the item on display.

• Purchases of music or educational materials that can be downloaded from a service after purchase.

Categorization of interactive applications can follow other criteria. For example, interactive services in some instances require a return channel that allows the interactive TV application to communicate with a remote transaction server. This is the case for most of the t-commerce applications, but some of the personalization applications (VOD, polls) also rely on such infrastructure. These applications are referenced as transactional applications. The remaining applications are local interactive applications because they rely solely on data stored in the DTV receiver. Another categorization, bounded versus unbounded applications, is sometimes used to differentiate applications that are related to programming from those that are not, respectively.

In some cases, the business challenges associated with the deployment and viability of some of these services should not be underestimated: for example, any application enhancing a commercial can take viewers away from watching subsequent commercials. If proper safeguards are not in place, the value of the following commercials can be reduced, and this, of course, could have a big impact on the overall advertising business.

B. History

With the advent of Moving Pictures Experts Group (MPEG)-based digital video and audio compression technologies in the early 1990s, cable and telecommunication companies started exploring the possibility of enhancing the traditional TV viewing experience with new interactive services like walled gardens, games, home shopping, and educational programs. The biggest trial was conducted by Time Warner in Orlando, FL, in 1994. It was called the Full Service Network (FSN) and had a few thousand homes connected to a fiber-to-the-curb network for easy access to services such as home shopping and VOD. Other trials followed shortly thereafter, and a number of interactive TV application providers such as Wink, OpenTV, Liberate, PowerTV, and Canal+ Technologies competed aggressively for participation in these trials. Most of these interactive software providers worked closely with the two main set-top-box providers, Motorola Broadband (formerly called General Instrument) and Scientific Atlanta, to gain certification on their platforms. Some of them participated in the deployments of services such as the Tele-TV service that Bell Atlantic, Nynex, and Pacific Telesis planned to deploy over microwave transmission channels. In parallel to these activities, the rapid evolution of the World Wide Web offered over-the-air broadcasters the opportunity to roll out field trials on their own with services like Intercast. Intercast services brought together Intel, NBC, and several content providers such as CNBC, CNN, and the Weather Channel. Intercast used data transmitted in the vertical blanking interval of the video signal to provide Web page enhancements to TV programming on the PC. This effort was quickly followed by a more TV-centric effort, the Advanced Television Enhancement Forum (ATVEF), also initiated by Intel in cooperation with broadcast operators like PBS. The ATVEF specification defined an HTML-based content format for broadcasting trigger events to interactive applications which, following their activation at specific instants, would invoke preloaded graphics and image- and text-based enhancements to the TV content.
In 2000, the Declarative Data Essence (DDE) Ad Hoc Group of the SMPTE D27 Technical Committee standardized ATVEF as a first-level interactive TV content authoring format [7].

Another product, WebTV, was deployed in the mid-1990s and was all about bringing PC content to the TV environment. The product of a cooperation between WebTV, Sony, and Philips, WebTV provided consumers with a receiver device to get access to the Internet from their TV. Fonts, color, and management of screen real estate were all specific to interlaced TV displays. Shortly after its acquisition by Microsoft Corporation, WebTV released WebTV Plus, which consisted of a device including a TV tuner and a TV listings service in addition to the base WebTV service.

These brand new interactive TV trials prompted standardization bodies like ATSC to invest in the development of asynchronous and synchronized data protocol and middleware standards for enabling the development, distribution, and management of ITV and datacasting applications around a unified infrastructure. ATSC started working on the development of the Data Broadcast and DTV Application Software Environment (DASE) specifications in 1996; the standards ATSC A/90 [3] and ATSC A/100 [23] were completed in 2000 and 2003, respectively.1 The Multiprotocol Encapsulation, Data Carousel, and Object Carousel Protocols described in A/90 have been used extensively. Lately, an agreement between the consumer electronics companies and the cable industry on a removable security module (CableCARD) for unidirectional and soon bidirectional services will certainly provide the framework for broad deployment of interactive television.

1[Online.] Available: http://www.atsc.org/standards.html

C. Technologies

An overview of the various interactive television software technologies is provided in the following sections. At the bottom of the stack come the MPEG-2 Systems protocols (see Section III-C1), then the Service Information protocols (see Section III-C2), then the data communication protocols (see Section III-C3), and finally the application programming interfaces (APIs; see Section III-C4). As each of these layers represents an extremely rich topic, each of the following subsections describes only a very specific aspect of the protocols or software interfaces in use today.

1) MPEG-2 Systems: The MPEG-2 Systems specification [4] defines the MPEG-2 transport stream packetization layer, multiplexing layer, and synchronization layer used in most of the broadcast systems in the world today. The specification includes the carriage of program-specific information in the PMT, which provides the means for receivers to identify the MPEG-2 packets that convey the data relative to a particular datacasting or interactive service. In this regard, the functionality provided by the PMT is a simple extension of what it is used for today (that is, identification of the video and audio elementary streams in a multiplex). By providing a mechanism to list the elementary streams composing a DTV virtual channel, the PMT provides a simple means to bind data elementary streams to a particular DTV virtual channel. The virtual channel may be exclusively a data service channel, in which case no audio or video elementary stream is listed. In an ATSC network, the PMT is transmitted at least once every 400 ms.
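
To make this binding concrete, the sketch below walks the elementary-stream loop of a PMT section and lists the (stream_type, PID) pairs a receiver would use to locate video, audio, and data streams. It is a minimal illustration of the section syntax (no CRC check, no descriptor parsing), not the code of any particular receiver:

```python
def pmt_streams(section):
    """Parse the elementary-stream loop of an MPEG-2 PMT section and
    return (stream_type, elementary_PID) pairs. Minimal sketch: the
    CRC_32 is not verified and descriptors are skipped, not parsed."""
    assert section[0] == 0x02, "not a PMT section"
    section_length = ((section[1] & 0x0F) << 8) | section[2]
    end = 3 + section_length - 4            # stop before the CRC_32
    program_info_length = ((section[10] & 0x0F) << 8) | section[11]
    i = 12 + program_info_length            # skip program descriptors
    streams = []
    while i < end:
        stream_type = section[i]
        pid = ((section[i + 1] & 0x1F) << 8) | section[i + 2]
        es_info_length = ((section[i + 3] & 0x0F) << 8) | section[i + 4]
        streams.append((stream_type, pid))
        i += 5 + es_info_length             # skip the ES descriptors
    return streams
```

A datacast receiver would then filter for the stream_type values assigned to data services and start demultiplexing those PIDs.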

2) Service Information: The ATSC service information protocol for over-the-air broadcast systems is called the Program and System Information Protocol (PSIP). PSIP provides a method for publishing the type of interactive TV applications bound to a virtual channel. More specifically, the service type information provided in the virtual channel table provides the EPG receiver application with information on whether the virtual channel is exclusively a data service channel (in which case, no audio or video elementary stream is associated with the data elementary streams), an audio/data channel, or a video/audio/data channel. PSIP also provides a mechanism for announcing the schedule of each event by means of auxiliary tables called the event information table (when the data service is provided along with video and/or audio) or the data event table (when the data service is a stand-alone service).

3) Data Communication Protocols: There are many forms of data communication protocols used in interactive television today. The DSM-CC specification [9] defines some of these protocols, like the Data Download Protocol and the Object Carousel Protocol. It also describes the method for tunneling other popular protocols like IP in special MPEG-2 Systems structures called DSM-CC addressable sections, as mentioned in Section II-D above. The Object Carousel Protocol has played a key role in the development of interactive television systems mainly because it serves as a bridge between the traditional, low-level MPEG-2 Systems-based delivery mechanisms used in any DTV service and the interactive software middleware running in a PC or a digital set-top box. More specifically, the Object Carousel Protocol is a “file-in-the-sky” system which defines a hierarchical organization among simple data modules transmitted in an MPEG-2 transport stream. The result is a set of data objects representing directories and data files linked among themselves by a sophisticated referencing mechanism. As a result, a datacasting or interactive application running in a digital receiver can use Object Carousels to navigate efficiently through large sets of data. Depending on the amount of dynamic memory or hard disk drive space available, a receiver may elect to cache none, a portion, or all of the Object Carousel. Although the organization of the directory and data objects cached in a receiver need not follow the directory/object organization prescribed by the object carousel, the two organizations are typically identical, since any discrepancy between them requires additional metadata in the object carousel (to represent the intended data organization in the cache). In effect, the Object Carousel Protocol is the bridge between the data transmission part and the cache management part of a datacasting or interactive data service. To understand the essential role that the Object Carousel Protocol has played in the deployment of interactive DTV services, it is worth going over the history and fundamentals of the protocol.
The protocol was designed in early 1995 as part of the MPEG DSM-CC standard [9]. The motivation for designing the protocol was to extend the use of the file access and directory listing APIs defined for conventional bidirectional VOD networks to unidirectional broadcast networks. The result of the work was an additional protocol layer called BIOP (Broadcast Inter-ORB Protocol, the formal name of the Object Carousel Protocol) on top of the simple, flat organization of the DSM-CC Data Carousel Protocol. In effect, the Object Carousel Protocol allows the payload of data modules to represent a collection of directory names, file names, or data files. As shown in Fig. 6, the Data Carousel Protocol is the periodic transmission of data modules carried in MPEG-2 sections. See [24] for a complete description of the protocol. It must be noted that, depending on a variety of considerations such as access time by the datacasting or interactive TV application and robustness against transmission errors, data modules may be repeated several times in a single carousel period. For example, in the figure, each module is repeated a different number of times within one carousel period.
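
The repetition idea can be made concrete with a small scheduling sketch. Given a per-module repetition count for one carousel period, a round-robin interleaving (one possible policy; DSM-CC does not mandate a scheduling strategy) yields the transmission order:

```python
def carousel_period(repetition):
    """Build one carousel period from a mapping of module name to
    repetition count, e.g. {"M1": 3, "M2": 2, "M3": 1}. Modules with
    a higher count appear more often, shortening their worst-case
    acquisition time. Round-robin interleaving is one possible
    scheduling choice, not mandated by DSM-CC."""
    pending = dict(repetition)
    period = []
    while pending:
        for m in list(pending):      # one round-robin pass
            period.append(m)
            pending[m] -= 1
            if pending[m] == 0:
                del pending[m]
    return period
```

With counts {"M1": 3, "M2": 2, "M3": 1}, one period comes out as M1, M2, M3, M1, M2, M1, so the most frequently repeated module has the shortest worst-case acquisition time.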

Fig. 6. Object carousel with multiple data modules.

The DSM-CC Object Carousel Protocol defines a family of objects that make up the payload of each data module. The object design follows a common framework defined by BIOP [9]. A BIOP object may not be split across multiple data modules, but a data module may contain zero, one, or multiple BIOP objects. The DSM-CC Object Carousel Protocol defines the following set of BIOP objects.

• The BIOP File Message: This is the object conveying the data files consumed by the datacasting and/or interactive data service application using the object carousel.

• The BIOP Directory Message: This is the object conveying the list of subdirectory, file, stream, or stream event names falling under the current directory.

• The BIOP ServiceGateway Message: This object is a special directory object which is used to designate the top directory of the hierarchical structure of an object carousel.

• The BIOP Stream Message: This object is essentially a reference to an audio, video, or data stream.

• The BIOP Stream Event Message: This object represents asynchronous or synchronized triggers for specific actions to be implemented by the datacasting and/or interactive data service application that uses the object carousel. Stream event objects may or may not be bound to streams in the object carousel.

The BIOP objects shown above share the same structure: a message header, followed by a message subheader, followed by the message payload. The message header is a generic IOP header defined by the Common Object Request Broker Architecture (CORBA), and the message subheader contains information specific to the BIOP protocol, like the objectKey and a string identifying the kind of object (“srg” for service gateways, “dir” for directories, “fil” for files, “str” for streams, and “ste” for stream events).
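
To make this layout concrete, a minimal parser for the common header and subheader might look as follows. The field widths (magic, version, byte order, message type, message size, then objectKey and objectKind) follow the authors' reading of the DSM-CC message layout and should be checked against the standard; big-endian encoding is assumed and the object-specific body is not interpreted:

```python
import struct

def parse_biop_header(msg):
    """Parse the generic BIOP message header and subheader and return
    (objectKey, objectKind). Sketch only: field widths are assumed
    from DSM-CC and the object-specific body is left untouched."""
    assert msg[0:4] == b"BIOP", "bad magic"
    # Header: magic(4) version(2) byte_order(1) message_type(1) size(4)
    message_size, = struct.unpack_from(">I", msg, 8)
    # Subheader: objectKey_length(1) objectKey(n) ...
    key_len = msg[12]
    object_key = msg[13:13 + key_len]
    pos = 13 + key_len
    # ... objectKind_length(4) objectKind (e.g. "fil\x00")
    kind_len, = struct.unpack_from(">I", msg, pos)
    object_kind = msg[pos + 4:pos + 4 + kind_len].rstrip(b"\x00")
    return object_key, object_kind
```

A receiver stack scanning a data module would apply such a parser repeatedly, comparing each objectKey against the one named in an IOR until the target object is found.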

The mechanism used to bind these objects together is the interoperable object reference (IOR). An IOR is a reference to a BIOP object and, as such, contains all the information needed to locate the BIOP object in an MPEG-2 transport stream. An IOR consists of one or more tagged profiles. A tagged profile is either a reference to a BIOP object carried in the same object carousel or a reference to a BIOP object carried in an external object carousel (defined in the DSM-CC standard as "BIOPProfileBody" and "LiteOptionsProfileBody," respectively). A BIOPProfileBody includes one or more references to the MPEG-2 program element(s) carrying acquisition information about the data module where the target BIOP object resides. This information is captured in the connection binder substructure

of the IOR. The IOR also includes an objectKey substructure, which is the unambiguous identifier of the BIOP object in the object carousel. The objectKey is typically used by the Object Carousel Protocol software stack in a digital receiver to parse the objects in a data module sequentially until the target BIOP object has been found. On the other hand, a LiteOptionsProfileBody contains the reference to another object carousel, either within the same MPEG-2 transport stream or in another MPEG-2 transport stream. This information is included in a ServiceDomain substructure of the IOR. Fig. 7 shows a simple object carousel made of three files organized in two directories. The right portion of the figure shows the IORs included in each of the BIOP directory objects. The IORs for the directory objects are included in the service gateway object. The IORs for the BIOP files are in the corresponding directory objects. The IOR of the service gateway object is typically included in an external location (the DownloadServerInitiate message of a data carousel) that the receiver can reach easily in order to mount the object carousel at some predefined location of its file system.

The ATSC A/95 [25] standard specifies the constraints that must be applied to designs of object carousels delivered in ATSC transport streams. The ATSC design includes extensions for providing URI-based references to objects in the object carousel, thereby providing a mechanism for abstracting the identity of BIOP objects from their location in an object carousel. The primary advantage of this abstraction is to decouple the application software development from the location of the data files it consumes. More specifically, an application developer can simply refer to objects in the object carousel via the URI reference mechanism without having to pay attention to the final location of such objects in the object carousel. The application does not need to be modified if the organization of the files in the object carousel changes. On the transmission side, the broadcast, satellite, or cable operator has complete freedom to map and schedule the delivery of the BIOP objects in the carousel. It is thus the IOR mechanism of the BIOP protocol that effectively brings the transmission operations world and the application software development world together.

The joint ATSC/CableLabs standardization activities, started more than two years ago, are providing the opportunity to align the object carousel designs between the over-the-air/satellite and the cable industries. In particular, enhancements are being made to the infrastructure needed to make synchronized triggers based on BIOP stream event objects work. The current design assumes easy and direct access to MPEG-2 timing information to bind media time with the MPEG-2 System time clock used by the receiver in an unambiguous fashion, even in the presence of PCR discontinuities. However, this is generally not the case, and ATSC, after recognizing this almost five years ago, introduced the A/93 (Synchronized/Asynchronous Trigger) standard in 2002 to provide an alternative solution to the industry.

4) Application Programming Interfaces: DASE and Advanced Common Application Platform (ACAP) are the two specifications that ATSC has developed toward the definition of a common DTV receiver software platform and APIs

112 PROCEEDINGS OF THE IEEE, VOL. 94, NO. 1, JANUARY 2006


Fig. 7. Organization of directories, files, and their references in an object carousel.

Fig. 8. Resources available to an interactive application.

that interactive TV application developers can use. In the following subsections, an overview of this software platform is provided.

a) Resources: The semantics of the application often require that it interact with platform resources. Fig. 8 illustrates the resources available to the application. The first stages of the pipeline relate to transport streams. The first stage tunes. The second stage isolates specific streams within the transport stream. The third stage, the section filter, isolates specific programs, or specific tables within specific programs. The functions above the pipeline can exploit this stage to isolate a protocol of interest, for example, the protocol that publishes service metadata.

The later stages of the pipeline relate to the content. The decryption element recovers the media streams. The decode elements are specific to the media stream. In addition to audio, the specification supports four devices that present to the screen. The devices render into a frame buffer, the planes of which divide into four sets. The first plane set, from back to front, stores a background image. The second plane set is for the video. The third plane set is for graphics. The fourth plane set is for captions.

The graphics device represents color samples as three color components plus a fourth component that blends the sample with the current contents of the plane set. The application can composite source images with the plane set and, since the plane set stores the weights, can also composite the plane set for graphics with the plane sets behind.
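The per-sample blend can be sketched as follows, assuming (for illustration) integer 0-255 color components and a blend weight in [0, 1]; actual pixel formats are platform-specific.

```python
# Sketch of the blend performed by the graphics device: three color
# components plus a weight (alpha) that mixes the sample with the plane
# sets behind it. The 0-255 integer components are an assumption made
# for illustration, not a requirement of the specification.

def blend(graphics_sample, behind_sample):
    """graphics_sample: (r, g, b, alpha); behind_sample: (r, g, b)."""
    r, g, b, alpha = graphics_sample
    return tuple(round(alpha * c + (1 - alpha) * v)
                 for c, v in zip((r, g, b), behind_sample))
```

For example, blending a half-transparent red over an opaque blue background yields `(128, 0, 128)`.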

The figure also illustrates the components that support services and applications. The application can create filters that parse the service metadata to isolate services of interest. The application can, with proper permissions, then select the service. The application can also create filters to isolate applications bound to the service and then request that the platform launch the application. The platform extracts the application from the object carousel, creates a context in which the application executes, and launches the application.

b) Scope: As the discussion above suggests, the scope of the interfaces available to applications is considerable. The application can, with proper permissions, perform these operations:

• create preferences; inquire preferences; configure platform defaults, such as localization of text, in response to preferences;

• language selection; configure the default language for the audio device and the caption device;

• configure the transport hardware; register interest in transport events, for example, the allocation or release of transport resources;

• configure the media hardware; control the media streams; register interest in media events, for example, changes to the video format;

• configure the return channel; create sockets and exchange datagrams;

• create section filters; extract specific tables in the transport stream;

• create service filters; browse the service metadata; register interest in events, for example, the introduction, replacement, or deletion of a service;

• service selection; resolve the service name to the service address; select the service; control the service execution; register interest in service events, for example, transitions of the execution state machine;


• create application filters; browse the available applications; control the application execution; register interest in application events, for example, transition of the execution state machine;

• browse the object carousel; load specific objects of the object carousel; register interest in object carousel events, for example, the addition, replacement, or subtraction of objects;

• access persistent storage;

• create widgets; configure the widget position, order, and focus; composite images into the graphics plane set; register interest in interaction device events, for example, focus status.
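The register-interest pattern that recurs throughout the list above can be sketched as a small event bus. The `EventBus` class and the event names are hypothetical illustrations, not actual DASE/ACAP API identifiers.

```python
# Sketch of the register-interest pattern: applications subscribe to
# platform events and the platform calls them back on each change.
# Class and event names are hypothetical, not normative identifiers.

class EventBus:
    def __init__(self):
        self._listeners = {}

    def register_interest(self, event_type, callback):
        """Subscribe `callback` to all future events of `event_type`."""
        self._listeners.setdefault(event_type, []).append(callback)

    def dispatch(self, event_type, detail):
        """Platform side: notify every registered listener."""
        for callback in self._listeners.get(event_type, []):
            callback(detail)

# An application tracking video format changes, e.g. so it can keep its
# graphics registered with the video plane behind it:
bus = EventBus()
formats = []
bus.register_interest("video-format", formats.append)
bus.dispatch("video-format", {"aspect": "16:9"})
```

After the dispatch, `formats` holds the single recorded event detail.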

While other media solutions often include comparable functions, the television design center is evident through subtle details. The target device receives broadcast streams, the details of which often change without application intervention. The transport stream that publishes service metadata might announce the addition or subtraction of services. The object carousel might increment the application version number, so as to indicate that the previous application version is stale. The details of the video stream, such as its aspect ratio, can change. These details are often of interest to applications. If the application is to register its presentation with the video below, for example, the application must understand the origin and size of the video stream. For these reasons, the design provides extensive events to which applications can subscribe so as to detect changes to the platform state.

It is probable that the execution of one application will affect other applications. The allocation of resources to one application, for example, can interfere with other applications, so the design defines permission classes to protect transport resources and media resources. The design also anticipates a master application that executes in the background. Since its purpose is to coordinate the execution of other applications, it survives service selection. It can filter the available applications, control their execution state machine, and arbitrate resource conflicts. The master application also can extract data streams otherwise not available to applications.

To account for the distance to the screen, the default widget set is distinctive. The design provides remote control events as well as keyboard events. The design lets applications interpose filters to isolate specific events. The application need not have focus to receive interaction device events. If the application is a program guide, for example, the application can listen for remote control events and, upon the receipt of the events, activate its widgets.

c) Secure execution: The first line of defense to secure execution is the Java language. The language provides constructs that allow the author to control access. If an operation is private, for example, just the code of the class that declares the operation can access it. If a class is final, there can be no subclass that might implement different operations. The compilation phase then enforces the rules.

The second line of defense is the exchange representation, the bytecode design, which again controls access rights. The bytecode design does not support the concept of direct storage access; rather, access is through object references that do not expose the storage location. The application can just inspect data within its protection domain. The third line of defense is the verification phase, which occurs just before code execution. The verification phase enforces rules whose evaluation requires the emulation of code execution. If the application attempts to cast an object to a superclass, for example, the verification phase confirms that the object is indeed a subclass. The application cannot forge objects that evade the access rules.

The core packages then provide a design pattern available to further control access. The various packages define permission classes that can contain three components: 1) the class to which the permission relates; 2) the specific target, for example, a specific file; and 3) the specific actions, for example, read, write, or delete.

The permission class just provides the language to articulate the available operations. To complete the design, the standard provides the mechanism to relate specific permissions to specific applications. Before the platform executes the application, it evaluates the permission requests. The platform first determines the code source, that is, the network location at which the application resides, and the certificate(s) of the organization(s) that signed the application. The platform traverses a certificate chain until it encounters a certificate of an organization that the platform trusts, at which point the platform grants the specific permissions.

The standard builds on these concepts to protect sensitive data and scarce resources. The standard defines permissions for each of the elements of Fig. 8. The object carousel provides, in addition to the application itself, a permission request file. The file describes the resources to which the application requests access. If the certificate chain confirms that the platform can trust the application, the platform grants access to the resources. If the application should attempt to access other scarce resources, the platform detects that the resource requires a permission that the application does not possess. The access attempt fails and the platform raises an exception.
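The grant-then-check flow can be sketched as follows. The `Platform` class and the (resource class, target, action) triples are illustrative assumptions, not the normative ACAP permission classes.

```python
# Sketch of the permission model: each granted permission names a resource
# class, a specific target, and an action. The platform grants the triples
# listed in the verified permission request file and raises an exception on
# any other access attempt. All names here are hypothetical.

class Platform:
    def __init__(self, granted_permissions):
        # Triples of (resource_class, target, action), as would be derived
        # from a permission request file whose signature chain was trusted.
        self._granted = set(granted_permissions)

    def check(self, resource_class, target, action):
        """Raise if the application lacks the required permission."""
        if (resource_class, target, action) not in self._granted:
            raise PermissionError(
                f"{action} on {resource_class}:{target} was not granted")
```

A platform built with `{("file", "prefs.dat", "read")}` lets the read through but raises `PermissionError` on a write attempt against the same file.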

While these mechanisms ensure that an application cannot access resources without permission, since the standard supports simultaneous execution of multiple applications, applications might contend for scarce resources. The standard provides a prioritization mechanism, but since the assignments are made at the time the object carousel is built, conflicts can still occur. The standard provides a solution. If the platform detects resource contention, it escalates to the master application, which prioritizes the applications. The platform then applies the result to resolve resource conflicts.
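The escalation step can be sketched in a few lines. The ranking policy shown (highest priority wins, unknown applications lowest) is a hypothetical illustration of what a master application might supply.

```python
# Sketch of contention escalation: on a conflict over a scarce resource,
# the platform consults the master application's ranking and grants the
# resource to the highest-priority requester. The policy is illustrative.

def arbitrate(contenders, master_ranking):
    """`contenders`: application ids requesting the same resource.
    `master_ranking`: app id -> priority supplied by the master
    application (higher wins; unknown applications default to 0)."""
    return max(contenders, key=lambda app: master_ranking.get(app, 0))
```

For example, with a ranking of `{"guide": 10, "game": 5}`, a tuner conflict between the two is resolved in favor of `"guide"`.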

d) State of the standardization process: The evolution of the ACAP specification involved three distinct projects. Fig. 9 illustrates the process, which began with the inception in September 1998 of the specialists' group for the DASE specification. The specification became a candidate standard in November 2002 and became a standard in March 2003.

While the DASE specification and the OpenCable Application Platform (OCAP) specification are both built on similar runtimes and common core interfaces, the television


Fig. 9. ATSC standardization timeline for interactive TV middleware.

Fig. 10. Conceptual model for data broadcasting service.

packages and signatures of the two specifications differ. Since the designs had comparable scopes, it was thought that harmonization of the designs was feasible. A joint task force of specialists from both organizations was formed in August 2002 and completed its investigations in September 2003. The recommendation was to build on the transport protocol specifications of the Advanced Television Systems Committee and to adopt the application interface specification of CableLabs.

5) Essential Elements for Head-End System: A model for an end-to-end data broadcasting service system is shown in Fig. 10. The model is composed of several components, including modules for content generation, transmission, and reception.

The content generation processing unit includes components for A/V acquisition, content authoring, audio/video encoding, generation of system-level program information (PSI) and program guide data (PSIP), management and storage for the audio/video/data assets, and the return channel server(s) used for supporting the interactive services. The outputs of each of these components are then multiplexed into one or more MPEG-2 transport streams, which, in turn, are modulated as 8-VSB signals.

A more detailed description of the modules is provided below.

• Data content generation: The first step in the development of a DTV data broadcasting service is the development of an application, based on Java as a procedural language or XML as a declarative language. The choice to use either a procedural or declarative framework is driven by several factors (complexity of graphics, for example). It is generally recognized that a procedural framework is to be preferred for scenarios calling for extensive user interactivity, while a declarative environment is to be preferred for scenarios calling for rapid development of applications that typically require less computational power. For this reason, procedural environments are more suitable for the development of interactive services such as games and quiz shows. On the other hand, declarative applications are to be preferred for applications designed to present large amounts of information, such as actor and actress information or drama synopses.

• Transport: The role of the transmission module is to encode the video and audio content and to encapsulate a procedural or declarative application and its data according to the transport protocols referenced in [24]. The application and its data are generally encapsulated into an object carousel before being multiplexed with the audio and video elementary streams into an MPEG-2 transport stream. The necessary MPEG systems program system information and the relevant application information table (AIT) are also multiplexed in, so receivers can discover the presence and location of the data elementary stream conveying the object carousel data.

• Program scheduler: The role of the program scheduler is to generate and manage the schedule for the audio and/or video program as well as the broadcast and interactive data services. Minimally, the schedule consists of a start time, duration, metadata such as storage path, and, optionally for data services, information binding the application to a specific audio and/or video program. The program scheduler typically controls much of the head-end system, including data servers and video and audio encoders.

• Data server: The data server has three distinct functions. Its first responsibility is to store and manage the application code; its second role is to encapsulate and packetize the application data before it is transmitted to receivers, typically by means of the Object Carousel Protocol. The final responsibility of the data server is to generate the AIT, which holds all the information (application name, location of resources, arguments) that a receiver needs to run the application properly.

• PSIP/SI server: The ATSC PSIP server and the system information server provide the information needed for associating audio, video, and data elementary streams with a DTV virtual channel. System information consists of the packet identifier (PID) values for each video, audio, and data elementary stream.

• Multiplexer: The multiplexer multiplexes the audio, video, and data MPEG-2 transport packets produced by the encoders and the data servers. Multiplexing of the packets is done according to the bitrate assigned to each of the elementary streams. The resulting MPEG-2 transport stream also includes the PSIP data and of


Fig. 11. Data broadcasting service screenshots. (a) Player information. (b) Card game. (c) Shopping. (d) Quiz.

course the program system information (MPEG-2 systems program association table, MPEG-2 systems program map table) that digital receivers rely on to discover the services.

• Modulation: Finally, for the ATSC terrestrial broadcasting service, the MPEG-2 transport stream is modulated into a radio-frequency signal. The modulation is done according to the eight-level vestigial sideband (8-VSB) scheme specified by ATSC.

• Reception: DTV terminals receive the 8-VSB signal and decode it into an MPEG-2 transport stream, which is then demultiplexed into individual elementary streams using the program specific information. The terminal also includes the procedural or declarative middleware responsible for executing the interactive data application on the terminal. Generally, the graphics associated with a data application are overlaid on the video content or, alternatively, are shown on the side panels of a wide aspect ratio (16 : 9) display. For most interactive applications, users can use the remote control unit or a keyboard to navigate through the various states of the application during the audiovisual program. It is not uncommon for several interactive applications to be concurrent with a single audiovisual program. Fig. 11 shows screenshots of experimental data broadcasting services in South Korea.
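The data server's encapsulation step described in the list above can be sketched as a simple blocking routine that cuts a module into numbered blocks, so receivers can reassemble it even when joining mid-transmission. The 4066-byte default reflects the maximum block size commonly used in DVB/ATSC data carousels; all other section fields are omitted, so this is a sketch rather than the normative syntax.

```python
# Sketch of the data server's packetization step: module bytes are split
# into numbered blocks sized to fit one MPEG-2 section each. Real sections
# carry many more header fields than the (module id, block number) shown.

def packetize(module_id, data, block_size=4066):
    """Return a list of (module_id, block_number, block_bytes) triples."""
    return [(module_id, block_number, data[offset:offset + block_size])
            for block_number, offset
            in enumerate(range(0, len(data), block_size))]
```

For instance, a 9000-byte module split with a 4000-byte block size yields three blocks of 4000, 4000, and 1000 bytes, numbered 0, 1, and 2.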

Fig. 11 illustrates various types of data services. A program-related data service is shown in Fig. 11(a). The service provides auxiliary player information for any viewer interested in specific player information or statistics. Such a system was prototyped during the soccer World Cup in 2002. A non-program-related interactive service is shown in Fig. 11(b): it is a card game that a user can select and play while watching a TV program.

The interactive data services shown in Fig. 11(a) and (b) are both unidirectional services because they can be provided without the need for a return channel. On the other hand, interactive services like shopping or quiz shows [Fig. 11(c) and (d)] are true bidirectional interactive services requiring a return channel to establish a session with a remote transaction server.

IV. CONCLUSION

This paper has provided an overview of the data broadcast and interactive television service technologies that have been used in field trials and commercial deployments over the last ten years. These technologies include communication data protocols for the forward and return channels and DTV receiver middleware APIs that content providers can rely on to deploy their services. The convergence of standards toward similar protocols and DTV receiver runtimes, the appearance of PC/DTV convergence products such as PCs running the Windows XP Media Center Edition platform, and the migration of audio/video decoding capability from the set-top box into the DTV receiver all point to a gradual change of the traditional TV experience toward a future where DTV will provide an enhanced viewing experience with interactive data services.

Future interactive TV services will allow operators to provide increased personalization to their customers. This drive toward personalization will bring new consumer behaviors, which in turn may open the door to new pay services such as T-commerce or product documentation services. Additional technical solutions such as digital rights management are needed, but they are now being evaluated and slowly deployed. There are, however, important problems that remain to be solved before any broad deployment of interactive TV can happen. One of them is certainly the issue of conformance for interactive applications. Indeed, service providers will increasingly be seeking guarantees that their applications run and present content as originally intended. Acquisition, caching, and the execution environment in DTV receivers can all have an impact on how the data is processed and rendered. Therefore, we can expect the development of minimum hardware and software guidelines for DTV receivers in the near future. Another challenge is the development of logical interfaces into a broadcaster head-end that let service providers define the interactive TV experience and the business rules associated with their service. This means that new specifications for metadata describing interactive data services are going to be needed. It is only when these logical interfaces are available that DTV operators will have a way to integrate the revenues and costs associated with operating interactive data services into their plants.

ACKNOWLEDGMENT

The authors would like to thank the anonymous reviewers,who have all provided great feedback.

REFERENCES

[1] G. Thomas, "ATSC datacasting: opportunities and challenges," in Proc. NAB 2000 Broadcast Engineering Conf., pp. 307–314.

[2] D. Catapano and G. Thomas, "Fine-grained announcement of datacast services," in Proc. NAB 2004 Broadcast Engineering Conf., pp. 249–256.

[3] ATSC data broadcast standard, ATSC Standard A/90, 2000.


[4] Information technology—Generic coding of moving pictures and associated audio—Part 1: Systems, ISO/IEC Standard 13818-1, 2000.

[5] Information technology—Telecommunications and information exchange between systems—Local and metropolitan area networks—Specific requirements—Part 1: Overview of local area network standards, ISO/IEC/TR3 Standard 8802-1, 1997.

[6] Information technology—Telecommunications and information exchange between systems—Local and metropolitan area networks—Specific requirements—Part 2: Logical link control, ISO/IEC Standard 8802-2, 1998.

[7] Television—Declarative data essence—Unidirectional Hypertext Transport Protocol, SMPTE Standard 364M-2001.

[8] Information technology—Generic coding of moving pictures and associated audio information—Part 6: Extensions for DSM-CC, additions to support data broadcasting, ISO/IEC Standard 13818-6 Amendment 1, 2000.

[9] Information technology—Generic coding of moving pictures and associated audio information—Part 6: Extensions for DSM-CC, ISO/IEC Standard 13818-6, 1998.

[10] Program and system information protocol for terrestrial broadcast and cable (Revision B), ATSC Standard A/65B, 2003.

[11] Implementation guidelines for the ATSC data broadcast standard, ATSC Recommended Practice A/91, 2001.

[12] DVB specification for data broadcasting, ETSI Standard EN 301 192 V1.4.1, 2004.

[13] "SDP: Session description protocol," Internet Engineering Task Force, RFC 2327, Apr. 1998.

[14] "Session announcement protocol," Internet Engineering Task Force, RFC 2974, Oct. 2000.

[15] Delivery of IP multicast sessions over ATSC Data Broadcast, ATSCStandard A/92, 2002.

[16] IP multicast for digital MPEG networks, ANSI/SCTE Standard 42,2002.

[17] Synchronized/asynchronous trigger, ATSC Standard A/93, 2002.

[18] Software download data service, ATSC Standard A/97, 2004.

[19] Specification for system software update in DVB systems, ETSI Standard TS 102 006 V1.3.1, 2004.

[20] HOST-POD interface standard, ANSI/SCTE Standard 28, 2004.

[21] ATSC interaction channel protocols, ATSC Standard A/96, 2004.

[22] Digital television—Opportunistic data broadcast flow control, SMPTE Standard 325M-1999.

[23] DTV application software environment level 1 (DASE-1), ATSC Standard A/100, 2003.

[24] R. Chernock, R. Crinon, M. Dolan, and J. Mick, Jr., Data Broadcasting—Understanding the ATSC Data Broadcast Standard, ser. McGraw-Hill Video/Audio Professional Series. New York: McGraw-Hill, 2001.

[25] Transport stream file system, ATSC Standard A/95, Feb. 2003.

Regis J. Crinon (Member, IEEE) received the M.S.E.E. degree from the University of Delaware, Newark, in 1984 and the Ph.D. degree in electrical and computer engineering from Oregon State University, Corvallis, in 1994.

He started his career at Tektronix Inc., Beaverton, OR, where he codeveloped the three-dimensional NTSC and PAL chrominance/luminance separation for the Emmy award winning Profile video editing system. In 1987, he was a Visiting Scientist at the Advanced Television Research Program, Massachusetts Institute of Technology, Cambridge. He then joined Thomson Consumer Electronics, Indianapolis, IN, where he was the data services system architect for the TELE-TV system. At Sharp Laboratories of America, Camas, WA, he worked on MPEG-4 Video and Systems. More recently, he was with Intel Corporation, Hillsboro, OR, where he was the engineering manager for the development of a prototype end-to-end PC-based datacasting system. He has been with Microsoft Corporation, Redmond, WA, since 2002, where he is currently a Lead Program Manager in the Digital Media Division. He is also an Adjunct Faculty Member at Oregon State University, where he has taught courses in the area of digital video processing and has served as a Ph.D. student advisor. He also coauthored Data Broadcasting—Understanding the ATSC Data Broadcast Standard (McGraw-Hill, 2001).

Dr. Crinon has been an active participant in the MPEG Systems standardization process. In 1999, he was recognized twice by MPEG for outstanding contributions to MPEG Systems standards. He also was the chairman of the ATSC T3/S13 Data Broadcast Specialist Group from 2000 until 2002 and received the ATSC Bernard J. Lechner Outstanding Technical Contributor Award in 2002.

Dinkar Bhat received the B.Tech. degree in electrical engineering from the Indian Institute of Technology at Madras (now Chennai), the M.S. degree in computer science from the University of Iowa, Iowa City, and the Ph.D. degree in computer science from Columbia University, New York.

He is Principal Engineer at Triveni Digital, Inc., Princeton, NJ, where he has made many substantial contributions to products for data broadcasting, bitstream monitoring and analysis, and PSIP generation and grooming. He has published in leading journals, such as the IEEE TRANSACTIONS ON PATTERN ANALYSIS, and in IEEE, Society of Motion Picture Television Engineers (SMPTE), and National Association of Broadcasters (NAB) conferences. He holds two patents in the area of digital television.

David Catapano received the B.S. degree in applied computer science from the University of Wisconsin-Parkside, Kenosha.

He worked for 14 years at the Xerox Corp. in Rochester, NY, developing advanced digital printing technology and products. He is currently Senior Director of Product Development at Triveni Digital, Inc., Princeton, NJ, with overall responsibility for development of all Triveni Digital products, and he has been development manager and primary architect of the Triveni Digital SkyScraper data broadcast product line since its inception in 2000. He was an early contributor to the ATSC specialist group T3/S17 (DASE), and he oversaw the development of an early prototype JavaTV/DASE environment. He holds three patents for printer-related technology.

Gomer Thomas (Member, IEEE) received B.A. degrees in mathematics from Pomona College, Claremont, CA, and the University of Cambridge, Cambridge, England, and the Ph.D. degree in mathematics from the University of Illinois, Champaign-Urbana.

He is Principal Scientist at Triveni Digital, Inc., Princeton, NJ, with current focus on Triveni Digital's SkyScraper data broadcast product line. At various points in his career, he has carried out research and development in abstract algebra, real-time systems, heuristic combinatorial algorithms, distributed data management, and digital television. He has contributed to standards work in the ATSC specialist groups T3/S8 (Transport) and T3/S13 (Data Broadcasting). He has presented numerous technical talks and papers at National Association of Broadcasters (NAB), IEEE, Society of Motion Picture Television Engineers (SMPTE), Society of Cable Telecommunications Engineers (SCTE), Public Broadcasting System (PBS), and Society of Broadcast Engineers (SBE) conferences and meetings.


James T. Van Loo received the B.S.E.E. degree from the University of Michigan and the M.S.E.E. degree from the University of Florida.

He began his career with the flight simulation component of General Electric, where his interests were algorithm design for flight simulation hardware and algorithms to create the scene content. He then joined Sun Microsystems, where he contributed to graphics workstation design. He later was active in multimedia object framework standards, where he was one of the authors of the Multimedia Home Platform specification of the Digital Video Broadcast consortium. In his current position at Microsoft Corporation, Redmond, WA, he contributes to television standards specification. His research interest is in the intersection of aesthetics and science.

Gun Bang received the M.S. degree in computer engineering from Hallym University, Chuncheon, Korea, in 1997, and he is currently enrolled in the Ph.D. program of the Computer Science department at Korea University, Seoul, Korea.

From 1998 to 1999, he developed a video telephone system based on the H.264 codec at NADA Research and Institute, Seoul, Korea. He has been a Senior Engineer at the Electronics and Telecommunications Research Institute (ETRI), Taejon, Korea, since 2000. He has been an active participant in the ATSC T3/S2 Advanced Common Application Platform Specialist Group since 2002. He also participated in the development of an ATSC DASE-based data broadcasting prototype for the FIFA Korea-Japan World Cup in 2002, and he is currently the secretary for the TTA (Telecommunications Technology Association) TC3/PG312 Working Group responsible for the development of a terrestrial data broadcasting standard in Korea. He is also active in ISO/IEC JTC1/SC29/WG11 (MPEG). His current research interests focus on data broadcasting, digital rights management for digital broadcasting, and personalized TV.
