
Production Technology Seminar 2008 EBU, Geneva, 29-31 January 2008

organized with EBU Production Management Committee (PMC)

Report Written by Jean-Noël Gouyet, EBU International Training

Revised and proof-read by the speakers


© EBU 2008 Production Technology seminar / January 29 - 31, 2008 Reproduction prohibited without written permission of the EBU Technical Department & EBU International Training


Table of contents

Seminar highlights
Foreword
Opening speech
1 Cross-media production
  1.1 Keynote speech & 'ITN ON'
    1.1.1 The context of cross-media production
    1.1.2 ITN ON
  1.2 Kongamedia - a platform for production and playout of hybrid content
  1.3 User-Generated Content: the BBC experience
  1.4 Producing & Delivering for mobiles
  1.5 Interactivity on Mobile: PorTiVity project
  1.6 'DR Update': Cost efficient News on TV, Web and mobile TV
  1.7 Cross-media production with the Digital Media Factory (DMF) at VRT
2 HD is now mainstream
  2.1 High Quality Audio for HD
  2.2 Microphones in the wild
  2.3 Sports in Surround
  2.4 HD Production codecs (P/HDTP)
  2.5 HD Emission codecs (D/HDC)
  2.6 Sports in HD
  2.7 Producing News in HD
  2.8 Premium productions in HD
  2.9 Moving to file-based workflow at ESPN
  2.10 HD production & File-based delivery in HDTV
  2.11 UHDTV, the 'Super Hi-Vision'
3 Systems and the future of TV
  3.1 Broadcast eXchange Format (BXF)
  3.2 Material eXchange Format (MXF)
  3.3 Time & Sync
  3.4 Automated information extraction in media production
  3.5 NewsML-G2 implementation at VRT
  3.6 Migrating to Service Oriented Architecture (SOA)

Annex 1 - List of abbreviations and acronyms


Seminar highlights

Per BOEHLER, NRK, Norway, Chairman of the Production Management Committee (PMC)

On the first day of the seminar we concentrated on “Cross-media Production”. For some time we have seen a change in the traditional broadcast environment in the sense that there are now many more ways to distribute our content than before. This brings a number of challenges to broadcasters:

o have to be present on all platforms (otherwise someone else will fill the slot)
o have to handle user-generated content
o have to re-purpose content to suit the various platforms

OR adapt production to the platforms

BUT What about the business models?

AND Are we delivering to those of our viewers/listeners that are in the “long tail segment”?

The second day was entirely devoted to HDTV. This has (perhaps) become the hottest topic for European broadcasters at the moment.

This time we have tried to focus more on the audio than we have done before. The audio is more important for successful HD programme content than many "video addicts" often think. We heard about the new HDTV video compression algorithms now available to us, and we had a report on the current status of HD compression encoders for transmission to the home viewer. Productions of sport and news in HD, and the challenges involved, were also covered. In his presentation on file-based (HD) workflow, Ted Szypulski (ESPN) mentioned the importance of staff training across all disciplines involved in content production. This is a very important issue for every one of you, since I assume you will implement HD services very soon.

Looking towards the future, NHK gave us a fascinating glimpse of what lies ahead in the form of (Ultra) HDTV systems with much higher image resolution than what we see today.

On the third day we could not avoid giving you the latest update on Metadata and file-transfer (MXF) issues, simply because without proper Metadata generation/handling and standardized file-transfer protocols our new IT-based production environments simply WILL NOT WORK.

A glimpse of the "new kid on the block", the BXF format, was also given.

You have just heard a thorough explanation of the activities of the EBU/SMPTE Task Force on Timing and Synchronization. This is a much more important activity than many of us would have thought. I urge you to take an interest and keep yourselves informed, because this activity and its outcome will strongly affect you in the future.

The development of "new tools" enabling us to automatically generate content-related metadata was explained to us. If successful, this will simplify a task that today has to be handled manually.

An example of the implementation of NewsML-G2 was given to you, and finally Jean-Pierre Evain gave us an extensive overview of the EBU metadata involvement. It is not an easy task to handle, but SO important for all of us in the broadcast community.


Foreword

This report is intended to serve as a reminder of the presentations for those who came to the 2008 seminar, or as an introduction for those unable to be there. So, please feel free to forward this report to your colleagues! It is not a transcription of the lectures, but a summary of the main elements of the sessions.

For more details, the reader should refer to the speakers' presentations, which are available on the CD-ROM distributed to the participants at the end of the seminar. Missing presentations are on the following FTP site, accessible via browser: ftp://uptraining:[email protected] . You may also contact: Nathalie Cordonnier, Project Manager - Tel: +41 22 717 21 48 - e-mail: [email protected]

The slide numbers [in brackets] refer to the slides of the corresponding presentation. To help "decode" the (too) numerous1 abbreviations and acronyms used in the presentations' slides or in this report, a list is provided at the end of this report. Short explanations of some terms may complement the definitions. Web links are provided in the report or in the list of abbreviations for further reading.

Many thanks to all the speakers and session chairmen who revised the report draft.

The reports of the Production Technology 2006 and 2007 seminars are still available on the EBU site:
http://www.ebu.ch/CMSimages/en/EBU-2006ProdTechnoSeminar-Report_FINAL_tcm6-43103.pdf
http://www.ebu.ch/CMSimages/en/EBU-2007ProdTechnoSeminar-Report_FINAL_tcm6-50142.pdf

1 About 310!
2 For abbreviations and acronyms refer to the list at the end of the report


Opening speech

Lieven Vermaele, EBU Technical Director

Development and innovation take place continuously in content production, in our industry, in our society - in devices, in networks, in systems and software - and have a huge impact on today's media industry. In the past, however, our broadcasting industry was a very unique, specific and quite closed domain. Today, because of digitisation, interconnected networks, integration and software development, a big part of innovation takes place around us. This dramatically changes the view we can have of the future and of our influence…

Companies active in production technology solutions can define their position by building ever bigger and ever more integrated solutions to fulfil any possible demand of any organisation. But these solutions are more and more closed - in my opinion for obscure reasons - while we ask and seek for more open, building-block-based solutions, so that we can connect different systems in a much easier way and meet our requirements, the optimisation of our processes and future evolutions…

The IT industry is developing highly redundant and scalable IT systems, for example for the banking industry. These systems are also applicable in our broadcasting environment. But it's not only about using IT in our media environment. It's also about scaling IT, about building the redundancy we need using IT equipment, about how to make the exchange of media possible as in the traditional broadcasting environment, about standards, open interfaces… and not just installing networks and software. It's about understanding how IT works, where the limits are and how to apply IT in a media environment. IT for media is the biggest and most important theme in the future of our business.


1 Cross-media production

Chairperson: Ave Wrigley, ITN, UK

Broadcasters are more and more becoming producers. The traditional chain is replaced by a range of new 'inputs' (User-Generated Content) and 'outputs' (webstreaming, Joost, YouTube, IPTV, ...). What does this mean for the 'production core', and what is the way forward?

1.1 Keynote speech & 'ITN ON'

Ave Wrigley, Head of Technology, ITN, UK

1.1.1 The context of cross-media production

Taxonomy of cross-media platforms
o On the Web, there is live streaming or Video on Demand (VOD2), with new players such as YouTube, BBC iPlayer, 4OD, ITV live.
o On mobile platforms, there is VOD on 2.5G or 3G handsets, 'live' streaming, or 'live' broadcast via DAB-IP / DVB-H, MediaFlo…
o On IPTV (content delivered over IP), content is distributed either on the computer becoming a TV-like device (Microsoft Media Center, Joost, Babelgum, Hulu), or on the TV set (Apple TV, BT Vision, Tiscali TV).

Why? Broadcasters have to accommodate the way people are viewing TV or video. Audience behaviour is changing, from the nuclear family in front of the TV set to more 'mobile' use. Traditional broadcast audiences are shrinking: it is not clear whether the trend is that individuals watch less TV [6] or that they watch different TV/video channels [7] with, for example, the rise of YouTube [5]. Broadcasters may also miss revenue opportunities on these new platforms, through advertising - the main trend [9] - (traditional banner ads, pre-roll/post-roll, interactive in-video overlays) or through traditional subscription or pay-per-view (the satellite-TV premium model applying to movies, sports and footage on the mobile, mobile billing being straightforward).

What? Exploiting new platforms means working both with new products specifically created for each of them and with the repurposing of existing products. User-Generated Content (UGC)3 is now made easier thanks to new handsets4 and new services (YouTube Mobile, SeeMeTV…). However, 'Citizen Journalism' was not as successful as expected (e.g. 'ITV Uploaded', the ITV News site fuelled by citizen journalism, was discontinued). UGC implies handling a lot of different formats and managing the rights.

How? The production platform and process must be flexible, supporting both content re-use and specific production. The Content Management System (CMS) must be able to handle a wide variety of input/output formats, to integrate with the traditional broadcast production platform, and to distribute over the various channels. Emphasis should be put on low cost, first because of the unpredictable, low-margin revenues: the nature of the platform and of the content changes, implying cautious investment in technology. At the same time, the solution must be scalable to cope with rapid platform growth. The 'risk of success' should also be managed, distribution costs then becoming out of scale… with the revenues.

2 For abbreviations and acronyms refer to the list at the end of the report
3 See also § 1.3 'User-Generated Content: the BBC experience'
4 For example, Nokia N95

The future

Multi-platform has an impact on the traditional broadcast business, but a limited one. It offers increased diversity: more channels, time-shifting so that people can watch TV when they want, delivering the same product on different platforms. Low-cost cross-media production techniques are migrating to broadcast, while a lot of pressure still weighs on production cost in the traditional broadcast world. Traditional broadcast will get a smaller slice of a bigger pie [9], but cross-media will be niche content, or at least different content. On YouTube, for example, there are of course both legal and illegal versions of traditional broadcast products, but the vast majority of products are to be found on YouTube only.

Concerning rights management, new models are emerging, such as 'YouTube claim-your-content'. It provides tools in two directions: the rights owner can manage his/her rights by searching for his/her content and claiming it back - and when one submits content to YouTube, it will try to find out whether anyone else has already submitted the same content and claimed it.

1.1.2 ITN ON

'ITN ON' provides live and on-demand video content for mobile phones, broadband, iTV and radio5. Clients (20+) are mobile operators, newspapers, Web portals and Web search providers (3, bebo, blinkx, Channel 4, Google, itv.com, Mirror.co.uk, Orange, O2, MSN, T Mobile, Telegraph Group, Virgin, Vodafone, Yahoo!, YouTube).

The products include:
o News, with bulletins, summary of the day, hot clips… A live 24-hour streaming news channel is available through 3G operators (3, Orange, T-Mobile, Virgin, Vodafone) or broadcast (BT Movio).
o Weather channel
o Showbiz: news, gossip and interviews, Bollywood Insider (Asian UK market), Movie Buff review…
o Sports
o Archive: 'On this day in History', Most Requested
o Technology: gadget reviews, games reviews
o Music
o Specific production (Telegraph TV)

For creating and re-purposing content, 'ITN ON' production has a dedicated team of multi-skilled producers (for on-line editing, writing copy, voicing over, presenting) and of video journalists (going out for special events, creating their own content), and a dedicated studio. The technical platform includes Quantel video servers, with desktop browse and edit tools, FlipFactory6 for the video encoding, and an in-house content management system, 'Nemesys' [24], allowing for:

o Web-based content authoring [26]
o Different video formats, bitrates, aspect ratios, sizes
o Scheduling and automated distribution with a wide variety of delivery options (FTP, HTTP, SFTP, e-mail… automated alerting, differential updating delivery, re-delivery on failure, …)
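As an illustration of the 're-delivery on failure' option, a minimal retry loop of the kind such a distribution system might use. The function names, signature and retry policy are hypothetical, not Nemesys's actual API:

```python
import time

def deliver(clip_path, destination, send, retries=3, backoff=2.0):
    """Attempt a delivery; re-deliver on failure with growing back-off.

    `send` is a callable(clip_path, destination) that raises on failure.
    Returns True on success, False once all attempts are exhausted.
    """
    for attempt in range(1, retries + 1):
        try:
            send(clip_path, destination)
            return True
        except OSError:
            if attempt == retries:
                return False            # give up after the last attempt
            time.sleep(backoff * attempt)  # wait longer each time
```

In a real system the failed delivery would also raise an automated alert, as the list above mentions.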

The 24-hour live News channel for Mobile is based on:
o a 15-minute wheel carousel

5 http://itn.co.uk/on/index.html
6 http://www.telestream.net/products/flipfactory.htm


o a "Studio in a box" [29], PixelPower Clarity7, operated by a single producer, for inputting (from carousel, studio / outside source), graphics creation, chromakey compositing, formatting with templates, and play-out. It is used for 15-minute live News on top of the hour or for breaking news.
o an Ibis automation system8 (play-out scheduling with looping)

User-Generated Content (UGC)

1.2 Kongamedia - a platform for production and playout of hybrid content

Gabriel Basso, "Veille technologique", TSR, Switzerland

UGC is never organised; it always arrives unexpectedly, in conditions of unpreparedness and even chaos. But it is the additional piece of information which may make the difference, and it anyway offers a different point of view. On the production side, one wants to:

o answer the demands of the editorial people to put UGC on air or on the Web, or even anticipate their needs;
o do it easily, with no impact on the established production processes and procedures put in place;
o do it securely, with no risk of corrupting or jeopardising the IT infrastructure.

Kongamedia9 has been jointly developed by TSR and BadgadStudio as a tool to "tame the UGC monster". It is a '5-in-1' product:

o An ingest mechanism - it 'gobbles up' any digital file regardless of format, media or device: audio, video, animation, still image, PowerPoint presentation… coming from camcorder, mobile phone, still camera, computer, PDA…
o A traffic controller and a 'big mouth' - content can be transferred to Kongamedia in many ways: MMS, e-mail, FTP, CD, DVD, USB key, flash memory card, Bluetooth, InfraRed…
o A file profiler and organiser - it recognises the file format, stores the file, organises it and adds the necessary metadata.
o A play-out system - it processes the files, loads them into the Content Management System (CMS) [6] and automatically converts them into the right format (video feed, file format) for any distribution platform (broadcast, Web, mobile phone…).
o A highly secure 'kid's game' - it requires no particular training thanks to its simple and user-friendly interface. There are three 'buttons' only: 'Record', to ingest; 'Metadata' (Title, Commentary, Category (audio / animation / video / text / image), Audio (Yes/No), Reception Date, Publishing Date, Producer, Rights (Web / Mobile / TV / none), To be archived); and 'Play', with a search engine (by Keywords, Media type, Producer, Rights, Date). The universal player detects the right codec to be used. As a centralised and isolated tool, it is virus- and malware-proof.
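The 'file profiler' step - recognise the format and attach the 'Metadata' fields - can be sketched as follows. The field names mirror the 'Metadata' button described above, but the extension map and all identifiers are illustrative; the real product inspects file contents rather than extensions:

```python
from dataclasses import dataclass, field
from datetime import date
from pathlib import Path

# Illustrative extension-to-category map (the real profiler is smarter).
CATEGORIES = {
    ".mp4": "video", ".mov": "video", ".3gp": "video",
    ".jpg": "image", ".png": "image",
    ".mp3": "audio", ".amr": "audio",
    ".txt": "text",
}

@dataclass
class KongaItem:
    """One ingested contribution with its minimal metadata record."""
    title: str
    path: str
    category: str = "unknown"
    rights: tuple = ()                 # e.g. ("Web", "Mobile", "TV")
    reception_date: date = field(default_factory=date.today)

def profile(path: str, title: str, rights=()) -> KongaItem:
    """Classify a file by extension and build its metadata record."""
    ext = Path(path).suffix.lower()
    return KongaItem(title=title, path=path,
                     category=CATEGORIES.get(ext, "unknown"),
                     rights=tuple(rights))
```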

1.3 User-Generated Content: the BBC experience

Guy Pelham, Live Editor of News Gathering, BBC, UK

7 http://www.pixelpower.com/products_product_info.html
8 http://www.ibistv.com/
9 http://www.kongamedia.com


UGC is valuable because it may be the only way that broadcasters can gather images of major or breaking stories, especially in places where access for broadcast journalists is difficult. For instance, the impact of the protests in Burma in 2007 would have been far less without the images sent by the Burmese people themselves via mobile phones and the internet, avoiding the censorship of the Burmese authorities. For the BBC, the key challenges are:

o The scale: 10,000 to 12,000 e-mails per day (up to 18,000-20,000 on a breaking-news day), with a small part carrying pictures. So, a system had to be put in place, first to exploit this flow of information and second to verify these contributions.
o Understanding the UGC providers and finding potential new ones. A lot of people sending UGC material think differently from the traditional audience. They have grown up in an environment where sending messages and sharing pictures is natural. They belong to social networks (Facebook…) [22]. Journalists need to understand this world, enter it, find people with material of editorial significance (not only pictures) and ask them to contribute.
o Telling them how to contact you. UGC is encouraged on the BBC News Web site10. Video or photos can be uploaded, or sent via e-mail, as MMS, or as Webcam messages. The Web site is promoted on all BBC News bulletins.
o The technical infrastructure. At the start, there was only a giant e-mail Inbox with a journalist looking for the 'gem'! Then a UGC Management System was developed, with a 'picture console' interface [12] [13], with a Rating (YouTube-like stars) and a Flag (you can use it / not).
o Verification - the systems which deal with UGC have to be editorially led. If one gets this wrong, the whole concept of UGC is devalued. Therefore at the 'UGC hub' there is a team of 13 journalists working around the clock, using traditional skills, checking ('were you there?'), calling or e-mailing back the UGC provider, being aware of the hoaxes, tricks and technical stunts which can be used to make a fool of you… The results of this verification are inserted on the picture console (green flag + journalist's comment) [18]. Contacting the material provider [19] may also be the opportunity to interview him/her for Radio and/or TV.
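The picture-console combination of star rating, clearance flag and journalist's comment can be modelled roughly like this. A conceptual sketch only; field and function names are invented, not the BBC's actual system:

```python
from dataclasses import dataclass

@dataclass
class UGCItem:
    """One contribution on the picture console (fields are illustrative)."""
    sender: str
    stars: int = 0          # YouTube-like 0-5 rating
    cleared: bool = False   # green flag: editorially verified, usable
    note: str = ""          # journalist's verification comment

    def verify(self, checks_passed: bool, note: str):
        # Editorial verification drives the flag, never the other way round.
        self.cleared = checks_passed
        self.note = note

def usable(items):
    """Items cleared for air, best-rated first."""
    return sorted((i for i in items if i.cleared),
                  key=lambda i: i.stars, reverse=True)
```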

The workflow [21] is composed of: the UGC hub; the picture console for editorial clearance; the transfer through the Central Technical Area; the 'Transfixer' for formatting the pictures to the screen aspect ratio; and the input into the BBC 'Jupiter' digital newsroom computer system for access by any News programme.

People contributing UGC expect the BBC to use this material [23 - summer floods in the UK], otherwise they will stop sending it. But on traditional broadcast outlets there is a limited amount of airtime for this huge amount of items. In order to exploit this material and feed it back to the audience, it is put back on local BBC Web sites, geo-mapped on Google Maps, where one can click an item to watch it.

During the 7/7/2005 London terror bombings, the BBC only got pictures from the public, not from the hundreds/thousands of journalists going to work. Therefore, a Personal Newsgathering application [25] is under development for the staff. It is an FTP client embedded in the operating system of Symbian mobile phones (63 % of mobile phones, e.g. Nokia, etc.), allowing staff to record images or video clips and send them back to the broadcast centre as files.

10 http://news.bbc.co.uk/ > 'CONTACT US' – 'Help us to make the news'
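The Personal Newsgathering idea - record a clip, push it back to the broadcast centre as a file over FTP - reduces to something like the sketch below. The host, credentials and directory are placeholders, and the BBC client is built into the phone OS rather than written in Python:

```python
from ftplib import FTP
from pathlib import Path

def stor_command(clip_path):
    """FTP command that stores the clip under its basename."""
    return f"STOR {Path(clip_path).name}"

def send_clip(host, user, password, clip_path, remote_dir="incoming"):
    """Upload a recorded clip to the broadcast centre over FTP."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)          # drop the clip in the agreed inbox
        with open(clip_path, "rb") as f:
            ftp.storbinary(stor_command(clip_path), f)
```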


Producing for Mobile TV

1.4 Producing & Delivering for mobiles

Enrico Moresi, Head of Transmission, Euronews, France

Euronews11 is a 24-hour News channel belonging to a pool of European broadcasters12. It gets pictures mainly from news agencies' broadcasts and transmits in 7 languages (+ Arabic after mid-July 2008). It has no production team of its own and no presenters. The News (World News, Business, Sport, European affairs, short Magazines, Weather report…) is formatted on a 30-minute wheel composed of very short items (1 min. on average) and is presented from a European perspective. The traditional distribution channels are: Satellite since 1993, now covering 5 continents; Cable, mainly in Europe and North America; and Terrestrial in Europe. Development is ongoing for IPTV, ADSL and Mobiles (since spring 2007) in Europe.

The production is common to all languages and platforms. A story is first edited in a specific language. Then, each journalist writes in each language a unique script for all platforms (with minor changes for the Web site) and voices-over the commentary. The video and the 7 languages are merged into a single file, stored on the server and then delivered to the platforms.

Concerning the production for mobiles, in the present Phase 1 the story is taken out of the common MPEG-2 stream for traditional platforms, using the same playlist, but is formatted with specific graphics adapted to the mobile handset display and with a more visible logo. In the near-future Phase 2, a dedicated playlist will be produced to increase the delivery speed of the News onto mobiles, with a focus on short stories (less than 2 minutes). But an economical model applicable to Euronews has yet to be developed, the 'Live TV' streams being presently offered to the operators, who are responsible for distribution.

The delivery consists of:

o 5 different MPEG-2 streams (each one with the 7 languages)
o 2 MPEG-4 streams for:
  o DVB-H (with 7 languages): Video CIF (352*288 pixels), H.264 coded, 300 kbit/s + Audio AMR-NB coded
  o 3GPP networks (with presently 1 language only): Video QCIF (176*144 pixels), H.263 coded, 16 f/s, 80 kbit/s + Audio AMR-NB coded, 4.5 kbit/s
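The two mobile profiles above can be captured in a small table, and a quick check confirms that the 3GPP picture (QCIF) is exactly a quarter of the DVB-H one (CIF). This is pure illustration of the figures quoted above; the structure is not any Euronews configuration format:

```python
# Stream profiles as quoted above (bit-rates in kbit/s).
PROFILES = {
    "DVB-H": {"codec": "H.264", "size": (352, 288), "video_kbps": 300,
              "audio": "AMR-NB", "languages": 7},
    "3GPP":  {"codec": "H.263", "size": (176, 144), "fps": 16,
              "video_kbps": 80, "audio": "AMR-NB", "audio_kbps": 4.5,
              "languages": 1},
}

def pixels(name):
    """Pixels per frame for a given delivery profile."""
    w, h = PROFILES[name]["size"]
    return w * h
```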

The production for mobiles is delivered to 20 telecom operators via the satellite Sesat (36° East) covering European and Middle-East countries13, in both MPEG-2 and MPEG-4 streams (some operators still preferring to do the encoding for mobiles themselves).

11 http://www.euronews.net/index.php
12 CT (Czech Republic), CyBC (Cyprus), ENTV (Algeria), ERT (Greece), ERTT (Tunisia), ERTV (Egypt), francetélévisions (France), NTU (Ukraine), PBS (Malta), RAI (Italy), RTBF (Belgium), RTE (Ireland), RTP (Portugal), RTR (Russia), RTVE (Spain), RTVSLO (Slovenia), SSR (Switzerland), TMC (Monaco), TVR (Romania), TV4 (Sweden), YLE (Finland).
13 http://www.eutelsat.org/satellites/36esesat.html

1.5 Interactivity on Mobile: PorTiVity project

Gerhard Stoll, Senior Engineer, Audio Systems, IRT, Germany

'porTiVity' is an Information Society European project (January 2006 - September 2008) developing and trialling a complete end-to-end platform providing rich-media interactive TV services for portable and mobile devices. It produces/sets up direct interactivity with moving objects within the TV programmes, on handheld receivers connected to DVB-H/eDAB (broadcast channel) and UMTS (unicast channel). This project learned from the previous 'General Media Framework for Interactive TV' project (GMF4iTV, June 2002 - February 2005), which allowed interaction on a TV display via a PDA (or iPhone… if it had already existed) and an extended MHP set-top box [6].

The main programme comes over the broadcast channel [5] via DVB-H or eDAB, with IP delivery [7]. Additional information, such as technical metadata describing the position in time of the objects one can interact with and basic multimedia scenes, also comes over the broadcast channel, strongly synchronised with the video [8]. One can retrieve information via the UMTS / Wi-Fi / WiMax return channel if it is personalised [8].

'porTiVity' has chosen the new MPEG standard for Lightweight Application Scene Representation (LASeR, MPEG-4 Part 20) to realise the interactive rich-media services on the handhelds. This format, which is based on SVG (Scalable Vector Graphics), provides a fluid user experience of enriched content, including audio, video, text and graphics, on constrained networks and devices. The bit-rate for video and audio is flexible; however, we are using H.264 with typically a bit-rate of 160 kbit/s for 12.5 f/s and QVGA video definition (320*240). The audio bit-rate is 32 kbit/s stereo MPEG AAC Low Complexity. However, we could easily change to other bit-rates, even lower ones, in particular if we used HE-AAC v2. A storage capacity of 1 GB RAM available on the handheld allows the user to store up to 8 hours of programme, e.g. for time-shift viewing, which includes the complete application to interact with objects on the screen and to get all the additional information. With this time-shift mechanism we can make sure that the user will not miss any important event during the programme (e.g. during a Sports programme).

In the porTiVity system architecture [9], the following components are used:

o The offline or real-time (in the sport scenario) authoring suite: it defines visual objects (for interaction) on a first frame and tracks them on the following frames [10] [22]. It is coupled with an MPEG-7 annotation tool [20].
o The MXF file format, which provides: support for the required types of essence, LASeR, metadata and additional content, including synchronisation for different content elements/streams; read-while-write processing support (for live production); and the ability to express timing relationships across multiple files.
o The MPEG LASeR standard [14]-[19], which allows for:
  o an optimised set of graphical objects (based on 2D SVG) for scene representation
  o dynamic updating of the scene
  o accurate frame-by-frame synchronisation and an efficient interface to A/V streams
  o compression & transport efficiency
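The object-tracking metadata produced by such an authoring suite boils down to per-frame regions that the player hit-tests against the user's tap. Schematically, under an illustrative data model (bounding boxes per frame, not the actual LASeR encoding):

```python
def object_at(tracks, frame, x, y):
    """Return the interactive object (if any) under a tap at (x, y).

    `tracks` maps object id -> {frame: (left, top, right, bottom)},
    i.e. the per-frame bounding boxes the authoring suite produces.
    """
    for obj_id, boxes in tracks.items():
        box = boxes.get(frame)
        if box:
            l, t, r, b = box
            if l <= x <= r and t <= y <= b:
                return obj_id
    return None                 # tap missed every tracked object
```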

A demo took place using a porTiVity handheld terminal with a player prototype [23]-[26], and the 'Spur und Partner' mobile service based on an ARD interactive TV crime series for children14.

14 http://www.daserste.de/checkeins/spurundpartner/


Providing other New Services

1.6 'DR Update': Cost efficient News on TV, Web and mobile TV

Anders Kappel, Executive Editor, DR News (a), and Erik Norgaard Gravesen, Editor TV on New Media (b), DR, Denmark

(a) - 'DR Update'15 is a TV channel on the Internet, launched mid-2007 to counter the drop of interest in traditional newscasts on television (200,000-300,000 viewers lost in just 2 years) [3]-[4]. DR spent a lot of money hiring popular professional anchors and developing the interface [18]-[22], which allows for watching the live stream, or selecting and viewing a story, sending it to a friend, rating it, and consulting a Top 5 list of the most-seen clips within a day/week/month. A content item is composed of the anchor presentation and of the story clip. 30-40 % are 'fascination' / 'people' stories. The items are packed with a Web Content Management System (CMS) controlled by an assistant editor. Besides the Web home page, a live television stream compiles items and runs in a 5-6 minute loop. It is distributed to [8]: private cable companies (Canal Digital, Viasat and You See/TDC), the DR Television Web site16, a DVB-T channel, 3G mobile companies (soon a VOD version), and a joint-venture system on buses and commuter trains and in shopping malls.

(b) - A new 'player system' was built which allows the user to lean back (traditional TV viewing) or lean forward (Net-TV), changing between passive and active viewing [4], changing from a linear flow to an 'active flow', surfing the programme assembled in clips instead of zapping between programmes [5]. The clips of many programmes are collected into 'players' (Children, Young people, Culture, Nature…) [6] thanks to the metadata associated with each clip [8]-[11]. Through the DR Content Management System, all the players can be updated at one time, and a new player can be composed in ½-1 hour.
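Collecting clips into themed players from their metadata, as in the DR player system above, can be sketched as follows. The field names and tag values are illustrative, not DR's CMS schema:

```python
from collections import defaultdict

def build_players(clips, wanted=("Children", "Culture", "Nature")):
    """Group programme clips into themed 'players' via their metadata tags.

    `clips` is a list of dicts with a 'tags' list; a clip may appear in
    several players at once.
    """
    players = defaultdict(list)
    for clip in clips:
        for tag in clip.get("tags", []):
            if tag in wanted:
                players[tag].append(clip["title"])
    return dict(players)
```

Because the grouping is driven purely by metadata, re-tagging a clip in the CMS updates every player that carries it, which matches the "all players updated at one time" behaviour described above.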

1.7 Cross-media production with the Digital Media Factory (DMF) at VRT

André Saegerman, Director of Production Technology, Johan Hoffman, Peter Soetens, VRT, Belgium

VRT has implemented an integrated file-based media production workflow, starting in June 2007 with the News, and now extending to general programme production, to record/play-out in production studios (May 2008), and to HD (end 2008). The architecture of this 'Digital Media Factory' (DMF)17 is based on the following principles and modules [12]-[13]:

Central media storage (with 4 clusters dedicated to 'high-res' material / browse / archive / system components) [16] and a central management system (Ardendo Ardome) [15], with an integration layer based on IBM's Enterprise Service Bus (ESB) [18]-[19].
Work centres connected to this central media storage and management system, each through an integration point with the ESB. A work centre (WoC) is an autonomous environment used for performing a specific craft step in the production process (e.g. audio editing, video editing, subtitling…).
  o Instead of a single-vendor approach, the best tool for each craft was selected. This responded better to the users' needs, but increased the complexity of the integration.
  o A work centre must at all times be able to operate autonomously, even if the connection to, or integration with, other work centres or the central media management is temporarily broken. This implies a loose coupling.
A file-based workflow used throughout. This implies that material will either start its existence as a file (e.g. from a file-based camera with a P2 memory card), or be converted to a file as the very first step of the production process (e.g. feed ingest, ingest of archive tapes, ingest of externally purchased material).
Common ingest [14]. Incoming raw video/audio material is ingested into the video/audio edit craft system and at the same time into the central storage system, so the media is immediately made available to the video/audio editors, the journalists' browse editors, the on-line editors and the radio audio editors.
Separate but fully file-based final control rooms and play-out for television, radio and on-line [20].
A full integration at Essence and metadata level. As Essence (audio and video data) is produced, transformed and transported between the work centres and the central storage & media management, the associated metadata are gradually enriched at each step of the workflow. The integration layer takes care of synchronising the metadata between systems and of orchestrating the various Essence transfers.
The media and file formats used. For News: DV25 with 4 audio channels in an MXF OP-1a wrapper; for TV general-purpose programmes: D-10 (IMX) 50 Mbit/s with 8 audio channels in an MXF OP-1a wrapper; for the digitized 1" legacy archive: MPEG-2 long GOP 15 Mbit/s; for browse material: MPEG-1, 1 Mbit/s; for Radio: uncompressed. MXF interoperability is made difficult by the different 'flavours' and vendors' implementations.

15 http://www.dr.dk/nettv/update
16 http://www.dr.dk/TV/forside.htm
17 A detailed technical description of the DMF is available in the report of a visit organised by the EBU in December 2007. Please contact Hélène Rauby-Matta: [email protected]

Page 13: EBU Tech. Seminar

© EBU 2008 Production Technology seminar / January 29 - 31, 2008 Reproduction prohibited without written permission of the EBU Technical Department & EBU International Training
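The loose-coupling requirement on the work centres (keep operating even when the link to the ESB is down) can be sketched as a local outbox queue at the integration point; the class and method names below are invented for illustration and are not actual VRT/Ardome code.

```python
from collections import deque

class IntegrationPoint:
    """Sketch of a DMF-style work-centre integration point: metadata
    updates are queued locally so the work centre keeps working even
    while its link to the ESB is broken (loose coupling).
    Hypothetical illustration, not actual VRT code."""

    def __init__(self, esb_send):
        self.esb_send = esb_send   # callable that may raise ConnectionError
        self.outbox = deque()

    def publish(self, metadata_update):
        self.outbox.append(metadata_update)
        self.flush()

    def flush(self):
        while self.outbox:
            try:
                self.esb_send(self.outbox[0])
            except ConnectionError:
                return             # ESB down: keep the update queued, carry on
            self.outbox.popleft()
```

When the connection is restored, the next `publish()` (or an explicit `flush()`) drains the queue in order, so no metadata enrichment is lost during the outage.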


2 HD is now mainstream
Chairperson: Andy Quested, BBC, UK
This year marks the 25th anniversary of the first NHK demonstration of HDTV in Europe18. It was given at the General Assembly of the European Broadcasting Union in Killarney, Ireland, in June 1982. The meeting triggered the search for HDTV technical standards, for which a notable milestone was reached in April 1997, when the draft Recommendation which became Rec. 709-3 was agreed by the ITU-R. Ten years later, the question remains: how to design an HD audio and HDTV production environment that serves today's needs and remains future-proof?

Audio
2.1 High Quality Audio for HD
Florian Camerer, 'Tonmeister' & Trainer, ORF, Austria
After a reminder of the general issues of surround sound and of its role in broadcasting [4]-[15], ORF's production at the 2008 New Year's Concert in Vienna was detailed [16]-[23], along with some 'policing' guidelines:

DO the 'Good' (+) and DON'T DO the 'Bad' and the 'Ugly' (-):
(+) Produce high quality audio for high quality video
(-) Fake 5.1 through up-mixing, especially when it comes again to down-mixing (phasing problems)
(-) Put Mono in all 5 channels
(+) Use genuine and worthwhile 5-channel atmospheres
(+) Use the surround channels with a clear intention
(-) Underestimate the importance of room acoustics
(+) Use the centre channel to anchor on-screen events (commentary may be on L & R - take care of dialogue intelligibility!)
(-) Use the Centre only as an "add-on"
(-) Put signal in the LFE for the sake of it (unless there is an avalanche or earthquake!)
(-) Confuse the LFE signal with the subwoofer!
(+) Think about the stereo downmix and listen to it! (like more than 90 % of the audience)
(+) Approach the transition to loudness metering next to peak metering (ITU-R BS.177019)
(+) Go for a dynamic and transparent mix
(+) Do think about post-production: your programme might end up on a DVD-Video or Blu-ray Disc (implying 6.1 / 7.1, multi-track recording, re-mixing, editing, mastering)
(+) Avoid complicated signal flows with too much cascading
(+) Use the highest bitrate possible for broadcasting, but no lower than 320 kbit/s (cf. EBU D/MAE evaluation of current multichannel codecs20)
(+) Archive in PCM
(+) Train your sound engineers extensively for 5.1
(-) Underestimate the complexity of good surround sound: it needs a lot of attention and experience
(-) Give your engineers too little time to experiment and gain knowledge

18 'The development of HDTV in Europe' - David Wood, EBU Technical Review, July 2007 http://www.ebu.ch/en/technical/trev/trev_311-wood_hdtv.pdf
19 BS.1770 (07/06) Algorithms to measure audio programme loudness and true-peak audio level
20 http://www.ebu.ch/en/technical/publications/index.php (EBU members only - Login and Password required)


(+) Don't be afraid to still do Stereo or Mono! (Bad programmes get worse in surround sound… probably best left in band-limited Mono)
(+) Start yesterday...! and don't forget the fun!!!
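The stereo downmix the guidelines insist on auditioning is commonly derived with the ITU-R BS.775 coefficients (centre and surrounds attenuated by 3 dB, LFE usually omitted); the sketch below assumes that standard formula.

```python
import math

def downmix_itu(l, r, c, lfe, ls, rs):
    """Fold one 5.1 sample frame down to stereo using the common
    ITU-R BS.775 coefficients: C and surrounds at -3 dB, LFE omitted.
    Plain float samples; illustrative sketch only."""
    a = 1 / math.sqrt(2)          # -3 dB
    lo = l + a * c + a * ls
    ro = r + a * c + a * rs
    return lo, ro

lo, ro = downmix_itu(0.5, 0.2, 0.4, 0.9, 0.1, 0.3)
```

This is exactly where faked 5.1 built by up-mixing falls apart: correlated material folded back down through these coefficients produces the phasing problems the guidelines warn about.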

2.2 Microphones in the wild21
Bill Whiston, Sound supervisor, BBC, UK
There is a variety of microphone systems used for surround sound pickup: MSM (2 cardioids + a figure-of-8) [5], Trinnov (8 omni capsules) [6], discrete (a combination of spaced omni and directional microphones) [7], Holophone (8 'elements' in a human-head shape) [8], Soundfield (4-channel B-format) [9]. But how do you choose which system to base your setup on?

Circumstances - How can I mount the microphones? What am I covering? What is your budget?
Required Result - 4.0 / 5.0 / 5.1, etc.?
Training - Talking about Surround with knowledgeable people is essential. This is especially so for producers and directors approaching Surround for the first time.
Experience - Nothing beats doing it a lot, in the studio and outside. Use your Stereo experience and build on it.
Experimentation & Trial and Error! - These require time and freedom from pressure to be of any use. Without either you won't progress and make Surround your own.

Five examples of Surround Sound pickup and production were detailed: Wimbledon Centre Court tennis coverage [14]-[18], the BBC Promenade Concerts at the Royal Albert Hall [20]-[22], All-Stars Football [24]-[26], 'Later… with Jools Holland' [28]-[30], and 'Strictly Come Dancing' [32]. The following guidelines can be drawn from this experience:

Practice - Treat every programme you do as Surround. Take the time to try different ways of acquiring Surround. Multi-track everything you can and demand the time to use the results to practice.
Training - Both technical and production people need to understand what we are all talking about. Without training there's no understanding, and therefore no desire for sound mixers, producers and directors to get involved. However, when invited to listen to a good surround mix, the effect on production staff is electric!
Persuasion - Talk to your customers, cajole, sweet-talk - even blackmail them if necessary! Sell it to them. It's no good setting all this up if you can't sell it. Charge a reasonable rate and you will sell the idea much more easily!

It's all down to investment in people and equipment. And where to start?

Start from Stereo and add to that. You don't have to reinvent the wheel!
There doesn't have to be something in the Surround or Centre channel ALL the time. 4.0 can work just as well as 5.0, and 2.0 or 3.0 if necessary, during the course of a programme.
Do whatever suits the subject. Make it as natural as possible unless you are after a specific effect.

USE YOUR EARS!

21 Read also the recent article of Bill Whiston in the last EBU Technical Review, 2008 Quarter 1 'Microphones systems used for Surround Sound pickup – and their use at Wimbledon tennis and The Proms' http://www.ebu.ch/en/technical/trev/trev_2008-Q1_whiston%20(surround%20sound).pdf


2.3 Sports in Surround
Martin Black, Senior sound supervisor & Technical consultant, Sky Sports HD, BSkyB, UK
Delivering 5.1 surround sound for Sky Sports HD involves:

Capturing the 5.1 audio, for example with an arrangement [8] of a Soundfield microphone, 12 to 14 mono & stereo gun-mics around the pitch, and slung omni mics above the crowd behind each goal.
Creating the 5.1 mix: the Soundfield mic provides the basic 5.1 surround image, the gun-mics provide the 'ball-kicks', and the commentary is placed in the centre channel.
Creating the stereo fold-down: the HD stereo viewer receives the Lt/Rt fold-down from the set-top box downmix of Dolby Digital 5.1; for the SD platform transmitting MPEG stereo, a matching 2-channel Lt/Rt downmix is provided through the Dolby DP563 Pro Logic encoder [13].
Embedding/de-embedding 5.1 Dolby E (from the Dolby DP571 encoder) & stereo audio into/from SDI [16].
Implementing the Dolby E path from the Outside Broadcast truck to transmission [17].

In order to line up the 5.1 audio channels, a special tone sequence was developed: BLITS22, 'Black and Lane23's Ident Tones For Surround'24. It uses different frequencies [19]-[21] to identify the different 'legs' (L, R, C, LFE, Ls, Rs). These are musically related in intervals of a fifth and an octave, in line with the musical standard pitch of 440 Hz; the front left/right, for instance, starts at 880 Hz25. It allows an operator receiving a downmix or the original 5.1 to identify them immediately. It is hoped that BLITS will be recognised as an EBU standard. CTP Systems26 has developed a generator box.
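Generating such per-channel ident tones is straightforward; in the sketch below, only the 880 Hz front pair is stated in the talk, while the other frequencies follow the published BLITS description of fifth/octave relationships around 440 Hz and should be treated as assumptions.

```python
import math

# Channel ident frequencies in Hz, musically related in fifths/octaves
# around 440 Hz.  880 Hz (front L/R) is from the talk; the other values
# follow the published BLITS description but are assumptions here.
IDENT_FREQS = {"L": 880.0, "R": 880.0, "C": 1320.0,
               "LFE": 82.5, "Ls": 660.0, "Rs": 660.0}

def ident_tone(channel, duration_s=0.5, rate=48000, amplitude=0.5):
    """Return one channel's line-up ident tone as a list of float samples."""
    f = IDENT_FREQS[channel]
    n = int(duration_s * rate)
    return [amplitude * math.sin(2 * math.pi * f * i / rate)
            for i in range(n)]

tones = {ch: ident_tone(ch) for ch in IDENT_FREQS}
```

Because each leg carries a musically distinct pitch, an operator hearing a fold-down (where the tones mix) can immediately tell it apart from the discrete 5.1 feed.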

HD Codecs
2.4 HD Production codecs (P/HDTP)
Massimo Visca, Centro Ricerche e Innovazione Tecnologica, RAI, Italy - presented by Hans Hofmann, EBU Technical Department

The EBU P/HDTP project group members conducted a series of tests (mainly based on expert viewing assessments) on the new generation of 4 HD production codecs: Sony XDCAM HD 422, Panasonic AVC-I, Avid DNxHD and Thomson-Grass Valley JPEG 2000. The picture-quality measures included not only the first generation but also the image-quality headroom after the 4th and 7th generations, with spatial (and, where relevant, temporal) shift, in a production chain for different formats (1080i/25, 1080p/25 and 720p/50) [11]. The results are mainly based on CRT display [13]. The full reports are available (for EBU members only)27. The results for each codec are summarised in the slides [15] [17] [19] [22]. 'Quality-transparent' HD production chains are feasible, but they need bit-rate (especially with Intra coding). This requires at the same time a compromise between the possible bit-rate and the storage & network capacity. Intra coding provides easier manipulation in post (lower decode latencies, jog/shuttle operation, jumping to different points in a file…). Inter coding allows better visual picture quality at a lower bit-rate, but is prone to GOP-related effects and to potentially unknown system issues in the production environment.

22 Name selected in part as a tribute to the earlier GLITS (Graham Lines's Identification Tone for Stereo) line-up signal for stereo, invented at the BBC
23 Keith Lane, Operations manager, Sky Sports
24 'Lining up in 5.1 audio' - George Jarrett, TVBEurope, November 2007, p. 24 http://tvbeurope.com/pdfs/TVBE_download/2007/11/TVBE_Nov07_P16-43_IBC.pdf
25 See also the recent article of Bill Whiston in the last EBU Technical Review, 2008 Quarter 1, page 5/9 http://www.ebu.ch/en/technical/trev/trev_2008-Q1_whiston%20(surround%20sound).pdf
26 http://www.ctpsystems.co.uk/ > Tone Generators
27 http://www.ebu.ch/en/technical/publications/index.php (Login and Password required)


Multi-generation production chains need to be limited (e.g. max. 4 generations).
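Why the generation count matters can be illustrated with a toy lossy 'codec' (uniform quantisation standing in for a real intra codec) and a crude spatial shift between generations, mirroring the repositioning used in the EBU tests; every pass then adds fresh coding loss.

```python
def encode_decode(samples, step=0.1):
    """Toy lossy 'codec': quantise samples to a fixed step (illustration only)."""
    return [round(s / step) * step for s in samples]

def shift_half_sample(samples):
    """Crude half-sample shift (linear interpolation), standing in for the
    spatial repositioning applied between generations in the tests."""
    return [(samples[i] + samples[min(i + 1, len(samples) - 1)]) / 2
            for i in range(len(samples))]

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

source = [0.137 * (i % 7) for i in range(1000)]
gen = encode_decode(source)           # 1st generation
losses = []
for g in range(2, 8):                 # generations 2..7
    shifted = shift_half_sample(gen)
    gen = encode_decode(shifted)
    losses.append(mse(shifted, gen))  # fresh loss added at this generation
print([f"{l:.5f}" for l in losses])
```

Without the shift, re-quantising is idempotent and later generations would add no loss; once each generation is repositioned (as real post-production does), the error keeps accumulating, which is the rationale for capping chains at about 4 generations.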

2.5 HD Emission codecs (D/HDC)
Rainer Schaefer, Head of TV Production Systems, IRT, Germany
The EBU D/HDC project group, devoted to the evaluation of HD codecs, published early quality expectations concerning the emission of HD programmes [4]-[6]28. The first investigations led to the following conclusions (still true) [9]: some early H.264 encoders were/are less efficient than MPEG-2 encoders; hardware encoders are often better optimised than software ones; 720p is generally easier to encode than 1080i and offers better sharpness on displays; products are optimised differently for 'p' or 'i'; some hardware H.264 encoders reach the 50 % bit-rate gain margin versus MPEG-2 ones. The present work of D/HDC is to confirm these results and to get a better overview of the various products [15] on the market. Two sets of test sequences have been defined [12]-[13]+[16] and prepared for various formats [14]. Some conclusions from the results [17]-[18]+[20] of these new tests:

the 50 % H.264 bit-rate gain was again confirmed (versus the MPEG-2 24 Mbit/s reference encoder);
there are still differences between the products, but products have improved;
some experts felt strongly that, even with the best encoder, 8 Mbit/s (1920x1080i/25) is insufficient for HD broadcast of critical material (maybe start at 10 Mbit/s);
all experts felt strongly that, even with the best encoder, 6 Mbit/s (1280x720p/50) is insufficient for HDTV broadcast (maybe start at 8 Mbit/s) - distribution of these progressive signals saves 20 % bit-rate;
encoders behaved differently in the trade-off between resolution and coding artefacts for critical sequences or in case of emergency;
encoders showed different behaviour in terms of buffer control (GOP pumping).

After testing sequences coded with 4 different production encoders and then cascaded with a sample broadcast encoder, it appears that, with respect to distribution, one can choose whatever production codec one wants to use - there is no significant dependency [24]. Concerning latency, a 'reasonable' value for H.264 is feasible, with 720p showing less latency for all products. On the decoder side there are still (autumn 2007) some problems with 720p, or with black level and colour representation, but updates and investigations are under way.

28 'EBU Guidelines for the RRC-06' (ITU Regional Radiocommunications Conference) I37-2006 http://www.ebu.ch/CMSimages/en/tec_text_i37-2006_tcm6-43439.pdf?display=EN


HD productions
2.6 Sports in HD
Francis Héricourt, CEO's Adviser on New Media and International Affairs, France 2, France
Some economic and technical choices had to be made for the live broadcast of the Roland Garros French Open tennis tournament:

For each of the main courts, one single HD OB van delivered both the HD and SD streams.
The 'shoot and protect' technique was used for the conversion from 16/9 to 4/3 aspect ratio [8-right], even if this suppressed the better 'perspective' [8-top left] that HD coverage of the court gives on wide-screen TV sets.
The display of the scores had to be placed not at the extreme left side of the 16/9 picture but of the 4/3 one.
Don't forget surround sound: the spectator must feel he is in!

The move to digital transmission on the 'Tour de France' (2005-2006) was justified not only for economic reasons or to prepare the introduction of HD (2007), but mainly for frequency-management reasons (more input streams required by the producer, and one frequency band less). Some challenges:

Save frequencies - but the producer is asking for more incoming streams (from the 5 motorbikes with HD cameras, from the team managers' cars, from the 2 helicopters with Wescam and Cineflex tracking systems) [12].
Implement a reliable system - make a single company responsible for the encoding and decoding, and avoid unnecessary codecs; use optimized encoders that avoid 'freezes' on pictures.
Improve the motorbike cameras (in HD it becomes difficult to get the right focus!).
Improve the encoding system (MPEG-4???) without introducing extra delays - but the existing MPEG-2 delay is already considered too long by producers.

Even if the extra cost of the HD technical equipment is 20 %, it has to be compared to the manpower cost and to the substantial rights cost.
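'Shoot and protect' means framing the 16/9 picture so that all essential action and graphics stay inside the 4/3 centre cut; the width of that protected area is easy to compute (the 1920x1080 example values are an assumption for illustration).

```python
from fractions import Fraction

def protected_width(height, protect_ar=Fraction(4, 3)):
    """Width of the centre-cut 'protect' area inside a frame, i.e. the
    region that survives a 16/9 -> 4/3 conversion."""
    return int(height * protect_ar)

# For an assumed 1920x1080 HD frame, the 4/3 protected area is 1440 wide,
# leaving a 240-pixel strip on each side that 4/3 viewers never see.
w = protected_width(1080)
print(w, (1920 - w) // 2)
```

This is why the score display had to move: anything in the leftmost 240 pixels of the 16/9 frame would be cropped away on 4/3 sets.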

2.7 Producing News in HD
Kei Yoshida, NHK General Bureau for Europe, Middle East and Africa, Paris, France
NHK launched its satellite HD channel in December 2000, and 2 terrestrial HD channels in 2003. Presently almost 100 % of News production is in the HD format (HDCAM 750 cameras for ENG). NHK decided to go HD to report the scene with enhanced sensitivity (facial expression, atmosphere…), to provide more detailed visual information to the viewers in case of emergency and disaster reporting (typhoon, earthquake…), and to preserve the material for the future in a high-quality digital data storage format. This was made possible thanks to:

An international HD contribution network covering the whole world [5], using until recently MPEG-2 encoders (Tandberg, Mitsubishi, Victor…) with: 60-80 Mbit/s for full contribution bit-rate, 27-33 Mbit/s (requiring a 27 MHz transponder) for ordinary News, and 20-25 Mbit/s for low bit-rate News.
An 'SD squeeze mode' (HD 16/9 horizontally squeezed into SD 4/3, then re-stretched to HD 16/9), allowing the use of less transmission bandwidth. In 2006 this mode represented 30 % of HD transmissions, with 60 % in full HD format (in 2003 the ratio was 50 % / 30 %).
In the very near future, DVB-S2 (offering a 30 % capacity gain) will be used together with H.264, allowing the transmission of 3 HD 9-MHz channels via a 27 MHz transponder, or 4 HD 9-MHz channels via a 36 MHz transponder [12].


The transition from the HDCAM edit system [14] to a PC-based one [15]. The transition to a global HD networked server production system is expected [16]-[17].

2.8 Premium productions in HD
Adrian Widera, Arnold & Richter Cine Technik GmbH & Co. Betriebs KG, Germany (a), and Stephan Wagner, DoP (b)
(a) The Arri D-20 camera intends to bridge the gap between theatrical motion-picture production and HD video, and is used in the production of features, TV dramas, commercials and music promos. Thanks to its concept and technical characteristics [3]-[17], it offers an HD mode and a Data 'raw' mode:

Frame rates - HD mode: 23.976, 24, 25, 29.97, 30, 48, 50, 59.94, 60 (1-60 fps); Data mode: 24, 25, 30 (1-30 fps).
Definition - HD mode: SMPTE 274 1920x1080 (with a 1.5x H & V oversampling at 2880x1620, to keep 35 mm optical characteristics and to compensate for the Bayer-pattern sampling) [19]-[21]; Data mode: 2880x2160 @ 25 fps, 2880x1620 @ 30 fps.
Aspect ratio - HD mode: 16:9; Data mode: 4:3.
Sampling structure - HD mode: 4:4:4, 4:2:2 [24]+[26]; Data mode: Bayer CFA data [14].
Color space - HD mode: ITU-R 709 RGB, YCbCr [22]-[23]; Data mode: RGB.
Bit depth - HD mode: 8 or 10 bit; Data mode: 12 bit.
Signal range - HD mode: ITU 709 legal range / extended range (64-940 to 4-1019 @ 10-bit) [27]; Data mode: linear / log, for use together with stock film material [28].
Standardisation - HD mode: standardised; Data mode: not standardised.

The recording [29] can take place on any HD-SDI compatible VTR (e.g. Sony HDCAM-SR SRW-1), on any data recorder (e.g. Fraunhofer IIS Megacine), or on on-board magazines (e.g. Thomson FlashMag, with 10 min in HD 4:4:4 and 18 min in HD 4:2:2). Production workflows will vary according to the modes and parameters [30]-[31]. In the Data-mode workflow [31], it is possible to output the HD mode in parallel (for off-line editing).

(b) According to a user's experience with the production of the ZDF TV feature 'Hinter dem Spiegel' (Behind the Mirror), the main advantages of this camera, coupled with an HDCAM-SR recorder and an HD monitor [36], are:

Very easy set-up (colour temperature, sensitivity, no white balancing, no special calibration…)
Optical viewfinder (the 'real look')
35 mm equipment (great choice of lenses and accessories)
Picture quality: real black, fine structure in the shadows and in the highlights
Depth of field, the same as on 35 mm (a challenge for the focus puller!)

A big difference from film shooting: a video monitor fed from the camera, with a technician looking at the frame and giving feedback (sharpness, lighting…), and the cable… but WYSIWYG! A minor problem: too much light on the sensor creates a ghost picture inside the camera - solution: slightly lower the contrast.


HD studio architectures
2.9 Moving to file-based workflow at ESPN
Ted Szypulski, Senior Director of Technology, Research & Standards, ESPN, USA
ESPN29 has built a new HD server-based Digital Broadcast Center (13,000 m2) in Bristol, Connecticut, which inaugurated daily HD broadcasts of the ESPN flagship programme 'SportsCenter' in June 200430. Networks in HD 720p/60 (the best choice for sports): ESPN HD, ESPN2 HD, ESPN News HD (24-hour sports news, March 2008) and ESPNU HD (college sports, August 2008). In order to implement a new workflow:

Spend months studying the old workflow: you need to know how you've done things before you make the change.
Examine the technologies available and invent the new workflow to capitalize on new capabilities.
Implement large-capacity storage and archive with Gigabit Ethernet interconnection: the Quantel SQ servers offer over 100 Terabytes (TB31) of on-line storage (representing either 4,000 hours of SD or 2,400 hours of HD - mixed), accessible anywhere; the data-tape archive capacity is presently around 500 TB and can be increased 'limitlessly'.

The main functions of the workflow are:

Ingest - More than 120 input ports for live events (200-300 hours of new media daily); SD events recorded in MPEG-2 I-frame only at 40 Mbit/s, HD events recorded in DV at 100 Mbit/s; an 'Event Calendar' interfaced with the ESPN Media Asset Management system allows producers to select events.
Screening - In a large room, to facilitate collaboration, 'screeners' create the metadata associated with an event; they have access to an application that helps speed up the process (team roster, real-time scoring, list of keywords related to the kind of sport); they have been cross-trained to make some 400 rough-cut editing pieces every day (12 of the craft editing suites are in the same room).
Editing - 25 NLE central server-based suites, intermixing SD and HD, 'publishing' 450 pieces every day.
On-air Play-out - The published (= finished and approved) pieces can be played on-air in 3 ways, out of:
  o an 'edge' server belonging to each control room, for speed and safety, with a specific control interface [13] and its own editor for clean-up and rebroadcast of the show;
  o the main media store, linked to any control room;
  o every edit room, through a direct connection via the central routing system.
Archiving - A StorageTek robotic tape library holding 12,000 LTO-3 tapes, each storing 400 GB (representing 20 hours of SD or 8 hours of HD programming); for safety there are several robots and multiple archive sites miles apart.
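The quoted tape capacities can be sanity-checked from the recording bit-rates; the raw figures come out slightly above the quoted 20/8 hours, the difference plausibly being wrapper, audio and metadata overhead.

```python
def hours_per_tape(tape_gb, bitrate_mbps):
    """Programme hours that fit on a data tape, ignoring container and
    metadata overhead (1 GB = 1e9 bytes, per the report's footnote 31)."""
    bytes_per_hour = bitrate_mbps * 1e6 / 8 * 3600
    return tape_gb * 1e9 / bytes_per_hour

print(round(hours_per_tape(400, 40), 1))   # 400 GB LTO-3, SD at 40 Mbit/s
print(round(hours_per_tape(400, 100), 1))  # 400 GB LTO-3, HD at 100 Mbit/s
```

The same function applied to the 100 TB on-line store reproduces the order of magnitude of the quoted 4,000 SD / 2,400 HD hours.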

This new workflow allowed for:

Near-simultaneous tasks (the next task can start immediately after the previous task starts) and much faster ingest, screening, editing and play-out.
Sharing media and fast access to all media, even in the archive, for richer retrospective pieces.
Safe media (no physical container to lose), managed as 'assets'.
Better team work and collaboration (staff now think 'outside the box'), and multitasking.

29 ESPN Fact Sheet http://www.espnmediazone.com/fact_sheets/espnfactsheet.html
30 In the U.S.A., the analog switch-off will take place on the 17th of February 2009.
31 1 TB = 1000 GB


2.10 HD production & File-based delivery in HDTV
Anders Janocha, Head of Production Development, SVT, Sweden
In spring 2007, SVT started to produce, near [8] and in the new HD facility [7] in Göteborg, the drama series 'Andra Avenyn'32 (Second Avenue) in HD with 5.1 surround sound; each episode is ½ hour long, with 3 episodes broadcast per week from Stockholm. The production was divided into teams and stages [10]:

1st team: shooting on location with DVCPRO HD P2 cameras (now looking at the AVC format);
2nd team: working in the studio, with 3 cameras, recorded into Avid (in DNxHD 120);
3rd team: online HD editing (5 weeks); support for some common codecs is missing (e.g. DVCPRO HD) and it takes time to flatten the files; a suitable browse format for HD still has to be found;
4th phase: colour correction, audio post-production, delivery; … and the cycle goes on every week.

2 masters are produced: a Production master 1080i25 / 5.1 (DNxHD 120 MXF format) and a Streaming master 1080i25 / stereo (DNxHD 120 QT format). HD copies are then produced:

HD Production/Archive master - 1080i25 (HDCAM SR)
HD Broadcasting master - 720p50 (HDCAM SR)
SD Broadcast/Archive master - 576i25 / 5.1 (Digibeta)
SD Timeshift master - 576i25 / stereo (DVCPRO 50)

And, through the SVTi encoding cluster: a high-resolution stream (between SD and HD) [14], Real / WMF streams for the Web, and 3GPP for mobiles. How to deliver the programme?

Send a 'courier' with a tape 3 times a week to Stockholm (450 km away from Göteborg)!
Use the SVT 'Meta' system [12]33, developed by Ardendo and used for standard programming, to transfer programmes from one site to another, for playout scheduling, for archiving… but this was not possible with HD!
Transfer it as a file via FTP [13]. The video and 8 discrete audio channels are encapsulated as an MXF file in an Avid DS Nitris station. But this is partially manual: an operator has to manipulate a tape at the receiving end, despite the fact that the preceding steps of the workflow are tapeless.

For delivery, an automated file-based process with version control is needed. The number of transcoding passes and pre-delivery occasions (e.g. for subtitling) has to be minimized.
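A minimal sketch of the automated, version-controlled file delivery called for here, using plain FTP as in the existing setup; the host name, remote path and version-naming scheme are invented for illustration and are not SVT's actual system.

```python
import ftplib
import pathlib

def versioned_name(filename, version):
    """E.g. 'ep12.mxf', 2 -> 'ep12_v002.mxf' (hypothetical naming scheme)."""
    src = pathlib.Path(filename)
    return f"{src.stem}_v{version:03d}{src.suffix}"

def deliver(path, host, user, password, version, remote_dir="/incoming"):
    """Upload under a temporary name, then 'publish' by renaming, so the
    receiving site never picks up a partially transferred file."""
    final_name = versioned_name(path, version)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        with open(path, "rb") as f:
            ftp.storbinary(f"STOR {final_name}.part", f)
        ftp.rename(f"{final_name}.part", final_name)   # atomic publish
    return final_name

# deliver("andra_avenyn_ep12.mxf", "ftp.example.org", "u", "pw", version=2)
```

Embedding the version in the file name lets the receiving play-out site detect redeliveries automatically, removing the manual tape handling at the receiving end.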

32 http://www.svt.se/andraavenyn
33 'The Meta System - the secret SVT weapon' http://www.svt.se/svt/jsp/Crosslink.jsp?d=10775&a=843567&lid=puff_843570&lpos=lasMer
34 End 2007, 92 % of the 48 million Japanese households were covered by the Digital Terrestrial network.

HD in the Future
2.11 UHDTV, the 'Super Hi-Vision'
Masayuki Sugawara, Senior Research Engineer, NHK Laboratory, Japan
More than 90 % of NHK General programmes and more than 50 % of NHK Educational ones, in terrestrial broadcast34, are presently in HD. HDTV has brought a large screen into the home: between 1953 and 2006, the screen size increased from 12 inches to 50-60 inches for HDTV (a 150-inch PDP has even been presented35), and the viewing area has doubled in the past 2.5 years [6]; nevertheless, the average viewing distance of 2.3 m has not changed. So, to follow the trend [7], it is time to start research on the next-generation TV now! The UHDTV (Ultra High-Definition TV) parameters are quite impressive [10]:

Parameter (UHDTV / HDTV):
Definition - 7680 x 4320 (33 Mpixels = 16 x HDTV definition) or 3840 x 2160 / 1920 x 1080
Aspect ratio - 16:9 / 16:9
Sampling structure - 4:4:4, 4:2:2, 4:2:0 / 4:2:2
Colorimetry - ITU-R BT.709 / ITU-R BT.709
Bit depth - 10 or 12 / 8 or 10
Viewing distance - down to 0.75 x the Picture Height (PH) (with a 150-inch display, PH = 74 inches, i.e. 1.88 m) / 3 x PH
Visual angle - up to 100 degrees… providing an immersive feeling! / 30 degrees
Frame frequency - 50, 60* / 24*, 25, 30*, 50, 60* (* divided by 1.001 is also specified)
Data rate (uncompressed) - 7680 x 4320 x 12 x 3 x 60 = 72 Gbit/s / 2.49 Gbit/s max.
Storage capacity required for a 1-hour programme (uncompressed) - 72 Gbit/s x 60 x 60 / 8 = 32 TB / 1.12 TB
Standards - SMPTE 2036-1 (2007) 'Ultra High Definition Television - Image Parameter Values for Program Production' and ITU-R Rec. BT.1769 (2006) 'Parameter values for an expanded hierarchy of LSDI image formats for production and international programme exchange' / SMPTE 274M and ITU-R BT.709
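The uncompressed data-rate and storage figures follow directly from the image parameters (12-bit samples, 3 full-resolution components, 60 Hz):

```python
def uncompressed_gbit_per_s(w, h, bit_depth, components, fps):
    """Raw video data rate in Gbit/s for full-resolution components."""
    return w * h * bit_depth * components * fps / 1e9

rate = uncompressed_gbit_per_s(7680, 4320, 12, 3, 60)  # Super Hi-Vision, 4:4:4
hour_tb = rate * 3600 / 8 / 1e3                        # Gbit/s -> TB per hour
print(f"{rate:.1f} Gbit/s, {hour_tb:.1f} TB per hour")
```

This reproduces the roughly 72 Gbit/s and 32 TB/hour quoted above, which is why the prototype recorder needs 48 hard disks for only 18 minutes of material.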

Prototypes of the components of the UHDTV chain exist and new ones are being developed:

Camera with 4 x 8-Mpixel CMOS sensors (2004) [17]; there is a new image sensor [21] of 33 Mpixels (7680 x 4320 pixels @ 60 frames/s).
Recording system of 48 3.5-inch hard disks, with a storage capacity of 3.5 TB (recording time of 18 min.) [18].
Display: UHD video projector with 8-Mpixel LCD panels (R/B and dual-Green) [19].
Transmission: real-time H.264 coder/decoder @ 128 Mbit/s with 16 parallel processors [23].
Emission: foreseen on a 21 GHz-band Broadcasting Satellite System with a 300-MHz wideband modulator for a 500 Mbit/s signal [24].
An optical transmission test was conducted in November 2005 for the contribution of uncompressed 16 x HD-SDI signals [25]. Delivery over a 500-km IP telecom link @ 640 Mbit/s was also tested in December 2006 [26], as well as IP multicast in June 2007 [27].

For audio, a 22.2 multichannel sound system with 3 speaker layers (9 channels in the upper layer, 10 in the middle, 3 in the lower + 2 LFE) provides both horizontal and vertical localisation control of the sound [28]. Sports production shooting examples show the advantages of UHDTV in terms of detail definition [29] and depth of field [30].

35 Panasonic at CES 2008, with 4k*2k definition


3 Systems and the future of TV
Chairperson: Peter Symes, SMPTE

Integration technologies
3.1 Broadcast eXchange Format (BXF)
Chris Lennon, Director of Integration & Standards, Software Systems, Harris, USA

BXF is a future standard (SMPTE-2021 document) for communications between traffic, automation and content delivery systems. Since 2005, over 150 people and 70 companies [13] have participated to its development (S22-10 SMPTE group). It aims to replace the hundreds of existing proprietary traffic-automation protocols and to extend the scope to content distribution. It is a protocol giving traffic - automation - content distribution - program management systems [6] a way to exchange material metadata, schedule metadata, as-run metadata, content movement instructions… It is neither an universal media file format like MXF, nor a control protocol for devices automation like the MOS communications protocol, and is only near-real-time, back and forth between systems before things get airing. BXF is: a XML schema that vendors have to implement in their systems, a graphical representation of the tree structure of the schema [24] [25], a 587-page document (but only 80 pages to read!), already in the draft publication stage (February 2008). It supports file-based communications (via FTP) as well as message-based communications (e.g. for Web services with SOA). Several vendors already started to implement it, it has been adopted and adapted by the Society of Cable Telecommunications Engineers, SCTE (U.S.A.) and is currently investigated by the Alliance for Telecommunications Industry Solutions, ATIS (U.S.A.), to be adopted for IPTV. The BXF Impacts on the workflow are the following:

- Schedules, from traffic to the play-out automation (entire day, partial day, show, single event, updates sent back to traffic…).
- 'As-run' (full day, event by event, same-day re-logging of events which may have been missed…).
- Content: material metadata exchange, movement instructions (push/pull).
- Ingest: a more automated and integrated process.
- Sales: the ability to react to changing demand for selling commercial spots, and to reduce the makegood/credit situations which can arise when spots are missed.
- Traffic & scheduling: more control over what is happening on-air (during the day, when this department's staff is around), and an end to the numerous phone calls…
- Master Control: more autonomous operation possible during the day, and the ability to make decisions with enhanced metadata in off-hours (e.g. for running a commercial inside a time slot).
- Billing: the reconciliation process of the as-run log becomes much simpler and quicker, and fewer discrepancies reduce discussions with Sales, Traffic…
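BXF traffic is XML conforming to the SMPTE schema. As a rough illustration of the message-based style described above, the sketch below builds and parses a toy schedule message in Python; the element names (BxfMessage, ScheduledEvent, EventId…) are simplified stand-ins invented for this example, not the actual SMPTE 2021 schema.

```python
import xml.etree.ElementTree as ET

# Toy schedule message: structure and names are illustrative only,
# NOT the real SMPTE 2021 (BXF) schema.
msg = """<BxfMessage>
  <Schedule channel="CH1" day="2008-01-29">
    <ScheduledEvent>
      <EventId>EVT-0001</EventId>
      <StartDateTime>2008-01-29T18:00:00</StartDateTime>
      <ContentId>MAT-4711</ContentId>
    </ScheduledEvent>
  </Schedule>
</BxfMessage>"""

root = ET.fromstring(msg)
# An automation system would walk the events and queue them for play-out.
for ev in root.iter("ScheduledEvent"):
    print(ev.findtext("EventId"), ev.findtext("StartDateTime"))
```

The same XML payload can travel either as a file dropped via FTP or as the body of a Web-service message, which is exactly the dual transport the standard allows for.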

3.2 Material eXchange Format (MXF)
Ingo Höntsch, Production Systems TV, IRT, Germany

MXF demystified
An MXF file can be seen as a binary coded multimedia document, with its structural elements identified by 'tags'. As for a book [4], a TV programme can be described with a document tree [6] referring to:

- Textual information (yellow 'leaves' on the tree: programme title, author, item, segment…) needing a small memory footprint, but many cross-relationships.
- Index (green 'leaf') with a medium-size memory footprint (its size depending on the nature of the Essence, e.g. for a long GOP every frame has to be addressed), and few relationships.
- Essence (blue 'leaves': audio, video) with a large memory footprint and few relationships (sequential 'reading').

An MXF file has a similar structure [7]:

- Header Metadata: small in footprint, with many relationships, describing the logical structure and the content, needed throughout the entire file processing.
- Index Tables data: for the optional addressing of individual essence container edit units.
- Essence Container data: the large material stuff (audio, video and data essence).

Each of these byte streams is encoded separately (with KLV), serialised separately, and all are multiplexed. MXF provides:

- Carriage and/or description of various internal/external essences: video DV-DIF, MPEG-1/-2 (ES, PS, TS), MPEG-4 AVC, VC-1, VC-3, J2K; audio A-law, AES-3, Dolby-E, MPEG, PCM…; data ANC packets, VBI lines, subtitles…
- Carriage of metadata: Essence Descriptive Metadata, user metadata, application metadata, timecode…
- Synchronisation between essence and/or metadata.
- Material address space: from the outside one can address specific items in the file.
- Material complexity: from representing a simple item to medium-hard cuts editing.
- Extensibility: backwards-compatible for decoders, which can ignore things they don't know (by virtue of the KLV encoding protocol).

But to use this 'toolbox' [36] one needs an application specification detailing the user application's needs: Which Essence format(s) and parameters? What is the complexity of the material to represent? How is the material identified? Which metadata, and how are they associated to the Essence (carriage/address mechanism)? Only a thorough understanding of the tools and of their suitability allows selection of the appropriate ones.
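The KLV coding mentioned above wraps every element in a 16-byte SMPTE Universal Label key followed by a BER-encoded length, which is what lets a decoder skip over items it does not know. A minimal reader following the SMPTE 336M pattern (the demo buffer uses a fabricated key, purely for illustration) could look like:

```python
def read_klv(buf, offset=0):
    """Read one KLV triplet: 16-byte UL key, BER-encoded length, value.
    Returns (key, value, offset_of_next_triplet)."""
    key = buf[offset:offset + 16]
    pos = offset + 16
    first = buf[pos]
    pos += 1
    if first < 0x80:                      # short form: length fits in one byte
        length = first
    else:                                 # long form: next (first & 0x7F) bytes
        nbytes = first & 0x7F
        length = int.from_bytes(buf[pos:pos + nbytes], "big")
        pos += nbytes
    value = buf[pos:pos + length]
    return key, value, pos + length

# Fabricated triplet: dummy 16-byte key, short-form length 3, value b"abc".
demo = bytes(range(16)) + bytes([0x03]) + b"abc"
key, value, nxt = read_klv(demo)
assert value == b"abc" and nxt == 20
```

A decoder that meets an unknown key simply jumps ahead by the decoded length, which is the backwards-compatibility mechanism the talk refers to.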

Interoperability status
IRT conducted interoperability tests in Summer 2005 [37] (8 vendors / 9 products), in 2006 (5 vendors / 5 products) and in 2007 (10 vendors / 14 products), focusing on mainstream SD production elements, on the conformance of encoded MXF files, and on the robustness and completeness of decoders, using a library of reference files. The toolbox entries supported by the vendors are, for the Essence: D-10 (IMX), DV-DIF, MPEG-2 Video ES I-frame and long-GOP, AES-3, PCM; for the Operational Patterns: OP 1a and OP Atom; for the Metadata: few products with DMS-1. A considerable improvement in 2007 relative to 2006 has been noted, though with still some minor encoder errors, but generally robust decoders. EBU published 2 documents concerning the 'MXF Basic User Metadata implementation', R121-2007 [38], and the 'MXF Timecode implementation', EBU R122-2007 [39].

36 Peter Symes' comment: If a 'universal' toolbox is really going to solve everyone's problems, it is inevitably incredibly complex, and the danger is that everyone wants it done now!... and doesn't worry about the details.
37 'MXF Implementation Tests - IRT Summer 2005' EBU BPN 071, May 2006 http://www.ebu.ch/CMSimages/en/ptec_bpn_071_tcm6-44567.pdf (EBU members only - Login and Password required)
38 'MXF Basic User Metadata Implementation' http://www.ebu.ch/CMSimages/en/tec_text_r121-2007_tcm6-50026.pdf?display=EN
39 'MXF Timecode Implementation'


Standardisation news
Concerning the standardisation status [40], some new documents are finished: the mapping of MPEG-4 AVC streams (RP 2008), of VC-1/WMV (SMPTE 2037), of VC-3/DNxHD (SMPTE 2028); in the final stage: SMPTE 410M, describing the 'Generic Streams Partition' for carrying large streams of metadata. In an early stage, documents deal with multichannel audio, the mapping of AAC, event-based Essence (e.g. for TV subtitling), and the KLV extension dictionary syntax (essentially for carrying AAF metadata dictionaries in MXF files). The fundamental documents SMPTE 377M (MXF file format specification) and SMPTE 379M (MXF Generic Container) are under revision in order to eliminate errors, inconsistencies and ambiguities, address known interoperability issues, simplify and clarify the documents to improve interoperability, add new features, legalise some vendors' implementations, and eliminate constraints so that AAF [41] files can more easily be written as MXF files. SMPTE 381M (mapping of MPEG-1/-2 ES/PS/TS streams) is also being revised.

P/TV-File activities
This EBU project group has been working [42] on an 'MXF cookbook' [43] outlining what to consider when introducing file-based (MXF) interchange and how to write an MXF application specification; it also contains sample application specifications of EBU members, and will include some 'MXF measurement guidelines'. The project group has also been sharing experiences, building a contacts database (which product do I use in my facility - how can I be contacted), and a questionnaire for EBU members, 'MXF experience wanted' [44]. Join the EBU MXF world and team NOW!

3.3 Time & Sync
Peter Symes, Director, Standards & Engineering, SMPTE

SMPTE and EBU have formed a new Joint Task Force [45] to examine future needs for timing and synchronization in the moving picture and related industries, and to recommend appropriate standardization. Current methods of timing and synchronization for television, audio and other moving picture signals rely on standards that have been in place for 30-50 years. These standards have limitations and are becoming increasingly inappropriate for the digital age. For example:

- Colour black sync [3]: no deterministic synchronization of all signals; no multi-standard capability; reliance on unsuitable frequencies; signal bandwidth.
- Timecode [4]: no simple relationship with NTSC video; no robust solution for frame rates > 30 Hz; no support for "super slo-mo"; poor support for audio/video synchronization (e.g. with skipped video

http://www.ebu.ch/CMSimages/en/tec_text_r122-2007_tcm6-50027.pdf?display=EN
40 'MXF Specification' http://ftp.irt.de/IRT/mxf/information/specification/index.html
41 The Advanced Authoring Format AAF is designed for the storage and exchange of projects in post-production: rich editing, processing, effects, versioning of material… Concerning the data model, the MXF Structural Metadata is a constrained subset of the AAF object model. AAF is further defined via the Open Source AAF SDK, requiring constraints unknown to MXF. MXF uses an external dictionary (SMPTE registries); AAF has a common subset but an internal dictionary, with the file allowing the application to dynamically configure its own decoding of the information. For the storage encoding, AAF complies with Microsoft Structured Storage, KLV, XML (in the future); MXF: KLV, (XML).
42 http://wiki.ebu.ch/technical/P/TVFile (EBU members only - Login and Password required)
43 http://wiki.ebu.ch/technical/P/TVFile_MXF_Cookbook
44 http://www.ebu.ch/en/union/news/2008/tcm_6-57585.php
45 http://www.smpte.org/standards/tf_home/ http://www.ebu.ch/en/technical/time-and-sync/


frames).

The Task Force started by focusing on user requirements from broadcasters and from the media and entertainment industries. It published a Request for Technology (RFT) at the end of February 2008. The responses of the media + IT + telecom industries to this RFT (May 2008) will be evaluated against the user requirements. The outcome of this process will be a set of specifications that will be passed (late 2008) to the appropriate SMPTE Technology Committees for due-process standardization. The RFT contains the requirements for:

- A sync system [10]-[13]: frequency reference; time reference; compatibility with legacy systems; transport.
- A Time-Related Label (TRL) [15]-[18]: basic functionality (count modulus, count mode, interlaced/progressive, frame rate…); compatibility with legacy systems; transport & binding; variable-speed reproduction…; a human interface with a readable version…; user data, edit data…

The most difficult problems come from binding labels to the Essence and binding this label to the potential transports. Possible directions: to have an ID in the media document that would allow it to be keyed to the Essence and to other metadata; this label, or a subset of it, could go to a number of transports that have very limited bandwidth.
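The "no simple relationship with NTSC video" complaint about timecode refers to drop-frame counting: at 29.97 fps, frame numbers ;00 and ;01 are skipped every minute except each tenth minute, so that the label tracks wall-clock time. A sketch of the standard conversion from a frame count to a drop-frame label:

```python
def frames_to_dftc(frame_number):
    """Frame count -> SMPTE drop-frame timecode string at 29.97 fps.
    Two frame numbers are dropped each minute, except every tenth minute."""
    drop = 2
    frames_per_10min = 17982          # 10 minutes of 29.97 fps video
    frames_per_min = 1798             # frames in a minute that drops 2
    d, m = divmod(frame_number, frames_per_10min)
    if m > drop:
        frame_number += drop * 9 * d + drop * ((m - drop) // frames_per_min)
    else:
        frame_number += drop * 9 * d
    ff = frame_number % 30
    ss = (frame_number // 30) % 60
    mm = (frame_number // 1800) % 60
    hh = (frame_number // 108000) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(frames_to_dftc(1800))   # 00:01:00;02 - labels ;00 and ;01 are skipped
```

The very existence of this bookkeeping, and its failure to generalise to rates above 30 Hz, is a good illustration of why the Task Force considers the current timecode standard inappropriate for the digital age.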

Metadata

3.4 Automated information extraction in media production
Alberto Messina, Centro Ricerche e Innovazione Tecnologica, RAI, Italy

The new EBU project group P/SCAIE [46] deals with the 'Study of Content Analysis-based Automatic Information Extraction for Production'. It aims at making production more efficient in News, in Archives, and in multi-channel & multi-format delivery, through the introduction of automatic metadata generation tools. These tools might be speech-to-text engines, shot detection and content indexing tools, classification by subject categories, or face detection and recognition tools. Possible applications: automatic live event highlights production (e.g. goal detection during a football match), automatic metadata enrichment of broadcast News items by linking to external knowledge sources (e.g. RSS feeds from on-line newspapers) for interactive services, effective archive annotation for faster and easier integration in new productions, automatic content multi-purposing (including, for example, picture formats and aspect ratios from HD/SD to mobiles), genre recognition enabling recommender systems, and automatic management of user-generated content. The working 'scenario' planned [8]: scientists and researchers (e.g. of the Web 2.0 world) provide technology to media producers and broadcasters; the project group collects the expression of needs from the users' communities, and all contribute to develop interactive content and services.
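Of the tools listed above, shot detection is the simplest to sketch: many detectors flag a cut where the luma histogram difference between consecutive frames exceeds a threshold. A deliberately tiny illustration, with frames represented as flat lists of 8-bit luma values and synthetic data (real systems work on decoded video and use adaptive thresholds):

```python
def histogram(frame, bins=8):
    """Coarse luma histogram of a frame given as a flat list of 0-255 values."""
    h = [0] * bins
    for px in frame:
        h[px * bins // 256] += 1
    return h

def shot_changes(frames, threshold=0.5):
    """Indices where the normalised histogram difference between
    consecutive frames exceeds the threshold (candidate cuts)."""
    cuts = []
    for i in range(1, len(frames)):
        a, b = histogram(frames[i - 1]), histogram(frames[i])
        diff = sum(abs(x - y) for x, y in zip(a, b)) / (2 * sum(a))
        if diff > threshold:
            cuts.append(i)
    return cuts

dark = [10] * 100      # synthetic uniformly dark frame
bright = [240] * 100   # synthetic uniformly bright frame
print(shot_changes([dark, dark, bright, bright]))  # [2]
```

Segmenting material into shots this way is the first step towards the content indexing and highlights-detection applications the group has in mind.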

So, the 1st task of P/SCAIE is to report to developers about real use cases and application requirements which may benefit from automatic information extraction techniques. Results should be tested against a representative community of consumers.

The purpose of the 2nd task is to identify reference annotated material relevant to key broadcasting-related production activities, for objectively assessing the benefits of certain automatic extraction tools (what can be expected), and a benchmark reference for the quality of extracted information.

The 3rd task consists of providing links to relevant information, providing assistance with the help of EBU experts, but also liaising with external parties such as academic and research bodies. For example,

46 http://www.ebu.ch/en/technical/projects/p_scaie.php?


during the 1st Workshop on 'Automated Information Extraction in Media Production' (AIEMPro), in Turin, 1-5 September 2008 [47].

The 4th task is the transition from research to implementation. The group will investigate the possibility of organising in-the-field trials using the reference material against real available resources, involving EBU Members' R&D resources in collaboration with other developers (e.g. academia, private laboratories, but also manufacturers).

Join this new world, become a P/SCAIE member!

3.5 NewsML-G2 implementation at VRT
Robbie De Sutter, Medialab, VRT, Belgium

Every day, hundreds of news video clips from Eurovision, Reuters, Associated Press… are ingested by broadcasters, associated with some additional textual information on a 'dopesheet' [48]. A dopesheet contains information about the audiovisual material:

- Descriptive content (who, what, where, when, …)
- Administrative information (reporter, cameraman, …)
- Rights information (usage terms, copyright statement, …)
- Technical information (video format, audio format, …)

Initially this information was written in plain text (cf. dopesheet via telex [4]+[6-left]); currently it is sent as proprietary XML-structured text for computer processing (e.g. EBU NMS [6-2nd left]); in the future it will be an internationally standardized XML-structured text with NewsML-G2 [6-right], useable by all content providers. NewsML (News Markup Language) is a family of computer languages used to formally describe news items and intended to support communication between systems (not humans). It is optimised for distribution of raw material (including, separately, text, photos, graphs, audio or video) by news agencies; it was developed by Reuters in 1998, is based on XML, and has been an industry standard managed by the IPTC (International Press Telecommunications Council) since 2000 [49]. But NewsML, being a compromise, has become complicated and has proven ineffective. The 2nd generation, NewsML-G2, provides:

- enhanced interoperability between various news item providers (AFP, Dow Jones, EBU, Reuters, documentation services, archives, correspondents by e-mail, telex services, etc.) and clients;
- extensibility (e.g. to include a subtitle standard);
- clarity and compactness of the syntax;
- ease of storage, thanks to a structuring in item packages / items / fragments [10] allowing better resource management (specific items versus entire feeds) and random access;
- ease of processing for relational or object mapping, with easy integration in an application and in CMS/MAM systems;
- focus on "semantic" capabilities (with thesaurus and taxonomy) - 'Web 2.0 ready' - to allow future 'intelligent' NRCS, search tools and Content Management Systems to process the overflow of

47 http://www.crit.rai.it/AIEMPro08/
48 Planning sheet for animators, similar to a storyboard. Also, sheet describing the content of TV programmes with the help of thumbnail pictures from various scenes. Worksheet used by animation directors to plan the timing and action of an animation scene, broken down into frame numbering for timing with detailed instructions regarding the field, camera movements, action, etc. In Eurovision News exchange, the dopesheet is a description of the item, providing editorial and technical details about the item.
49 http://www.ebu.ch/en/technical/trev/trev_287-allday.pdf?display=EN


redundant information, to be aware of potentially relevant items, to remove duplicates and detect updates, and so drastically improving search & retrieve.

Status today:
- IPTC has published the standard (26 February 2008) [50]
- EBU launched a Beta-programme, P/NEWSML [51]
- IBBT/VRT medialab is developing reference software [52]
- CWI/IBBT will submit a European project proposal - FP7 (Digital Libraries) [53]

At VRT [7], a first implementation test integrated NewsML-G2 into the MAM architecture [11]. But some issues are still not handled in this application: machine-readable rights language, information on the structured parts, version identification.
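NewsML-G2 items are namespaced XML, which is what makes them directly processable by ingest software. The sketch below pulls a headline out of a minimal news item with Python's ElementTree; the guid and the exact element layout are simplified for illustration (the IPTC specification defines the authoritative structure), though the namespace URI is the IPTC News Architecture one.

```python
import xml.etree.ElementTree as ET

NS = {"nar": "http://iptc.org/std/nar/2006-10-01/"}

# Simplified news item, loosely modelled on NewsML-G2; the guid and
# element subset are illustrative, not a conformant instance.
doc = """<newsItem xmlns="http://iptc.org/std/nar/2006-10-01/"
          guid="urn:newsml:example.org:20080129:demo:1" version="1">
  <itemMeta>
    <itemClass qcode="ninat:video"/>
    <versionCreated>2008-01-29T10:00:00Z</versionCreated>
  </itemMeta>
  <contentMeta>
    <headline>Demo item</headline>
  </contentMeta>
</newsItem>"""

root = ET.fromstring(doc)
headline = root.findtext("nar:contentMeta/nar:headline", namespaces=NS)
print(headline)  # Demo item
```

Because every field sits at a predictable, namespaced path, a MAM system can map items onto database records mechanically, which is the "ease of processing for relational or object mapping" claimed above.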

3.6 Migrating to Service Oriented Architecture (SOA)
Jean-Pierre Evain, New Systems and Services, EBU, Switzerland

The involvement of the EBU in the field of metadata is ongoing and concerns:

- The Business-to-Business (B2B) side [2]: the P/META metadata toolbox has been refurbished; application-specific schemas have been extracted as additional documents (Programme Exchange, Music Reporting; News Exchange being replaced by NewsML-G2 [54]). Collaboration with the IPTC [55] now on SportsML, with the ISOG [56] (transmission booking), with the European Digital Libraries Initiative [57], and on the ISAN [58].
- The Business-to-Consumer (B2C) side [3]: TV-Anytime [59] is now almost everywhere: in DVB [60] metadata aspects, in home networks with the DLNA [61], and in IPTV, with TISPAN [62] and ATIS [63] also looking at TVA.

Among the lessons learnt from this work:

- The content model is very similar across domains and applications; however, the richer the metadata set, the more difficult it is to share - but there are some ways of getting around the differences.
- There is a need for the EBU to focus on business objects, workflows, processes, use cases and best practices to help towards a common data model (between the minimalistic Dublin Core metadata set and the too-complex specific broadcasters' data models), service-oriented architectures (SOA) and the development of reusable web services.
- Shift to open source, not meaning 'licence-free' but 'solution that can be adapted'.
- Save time and resources: don't reinvent the wheel! Adopt and adapt! It might take more time but leads to economies of scale.

50 http://www.iptc.org Technical Forum http://tech.groups.yahoo.com/group/newsml-g2
51 http://wiki.ebu.ch/technical/P/MAG http://www.ebu.ch/metadata/NewsML/P-newsML001_NewsMeetingPresentation.pdf
52 http://multiblog.vrt.be/medialab/category/research/competence/newsml/
53 http://newsml.cwi.nl
54 Cf. § 3.5
55 http://www.iptc.org/pages/index.php
56 http://www.nabanet.com/wbuarea/committees/isog.asp
57 http://ec.europa.eu/information_society/activities/digital_libraries/index_en.htm
58 http://www.isan.org
59 http://www.tv-anytime.org/
60 http://www.dvb.org
61 http://www.dlna.org
62 http://portal.etsi.org/tispan/TISPAN_ToR.asp
63 http://www.atis.org/


What next? [64]
Media Asset Management - Different approaches: big broadcasters can afford internal developments, integrating and customising solutions; smaller broadcasters acquire turn-key solutions (more costly to adapt and to upgrade to the next generation), becoming more tightly bound to their provider. EBU wants to look at the common points between the different solutions and implementations, to ease the process and achieve better economies of scale for the benefit of all broadcasters, and more generic and flexible solutions for the benefit of MAM providers.

Objective 1: destination SOA [8]-[10]. EBU has started to analyze some of the process models (e.g. the OASIS one [9] [65]). There are some specifications (e.g. OAI-PMH, the Protocol for Metadata Harvesting [66]) that allow interacting relatively simply with counterparts - EBU can provide a client to be implemented as a Web service [10]. A library of open-source Web services to 'adopt and adapt' could also be proposed.
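OAI-PMH is a good example of why such specifications are simple to interoperate with: a harvester just issues HTTP GET requests carrying a verb parameter and parses the XML response. A minimal client sketch; the repository base URL is made up for illustration, while the verb and metadataPrefix parameters and the response namespace come from the OAI-PMH 2.0 specification.

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

def oai_request(base_url, **params):
    """Build an OAI-PMH GET request URL (protocol v2.0 style)."""
    return base_url + "?" + urlencode(params)

# Hypothetical repository endpoint; ListRecords is a real OAI-PMH verb.
url = oai_request("http://archive.example.org/oai",
                  verb="ListRecords", metadataPrefix="oai_dc")
print(url)

# Canned minimal response, as a harvester would receive it over HTTP.
resp = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><header><identifier>oai:example:1</identifier></header></record>
  </ListRecords>
</OAI-PMH>"""
OAI = "{http://www.openarchives.org/OAI/2.0/}"
ids = [e.text for e in ET.fromstring(resp).iter(OAI + "identifier")]
print(ids)
```

Wrapped in a Web service, a client of this kind is exactly the sort of small, reusable, 'adopt and adapt' component the speaker proposes for an EBU open-source library.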

Objective 2: Search engines - A broadcaster wanting its programmes to be found in the architecture being developed for IPTV networks [67] combined with home networks [68] [11] has to look at the specification work under way, now! EBU is launching 2 new activities: the kick-off of the next generation of TV-Anytime [69], more 'semantic Web'-oriented (with RDF, OWL…), and looking at new ways of searching and presenting the information, in collaboration with the Internet search engine providers.

64 EBU seminar 'New trends in digital production – Connecting and managing media workflows and digital assets with a Service Oriented Architecture' Geneva, 22-23 May 2008 http://www.ebu.ch/en/hr_training/training/cross_disciplinary/next_courses/05_media_asset_mgt.php
65 Organization for the Advancement of Structured Information Standards http://docs.oasis-open.org/soa-rm/v1.0/soa-rm.pdf
66 Open Archives Initiative http://www.openarchives.org/pmh/
67 'Network structures – the Internet, IPTV and Quality of Experience' Jeff Goldberg and Thomas Kernen, Cisco Systems, EBU Technical Review, N° 312, October 2007 http://www.ebu.ch/en/technical/trev/trev_312-kernen_QoE.pdf
68 'DVB-HN (Home Network) Reference Model Phase 1' DVB Document A109, February 2007 http://www.dvb.org/technology/standards/a109.tm3690r2.DVB-HN_ref_model.pdf
69 Geneva, 19 February 2008


Annex 1: Abbreviations and acronyms

Note: Some terms may be specific to a speaker or/and his/her organization

2D Two-dimensional
2.5G Enhanced 2nd generation of wireless communication systems (GPRS*, EDGE*)
3G 3rd generation of wireless communication systems (UMTS*, WCDMA*)
3GPP 3rd Generation Partnership Project - produces globally applicable Technical Specifications and Technical Reports for a 3rd Generation Mobile System http://www.3gpp.org/
4:4:4 / 4:2:2 (422) / 4:2:0 Ratio of sampling frequencies to digitize the Luminance / Chrominance (Cb, Cr) components
AAC Advanced Audio Coding
AAF Advanced Authoring Format
AC Additional Content
AC3 Audio Coding 3, known as Dolby Digital
ADC, A/D Analog-to-Digital Conversion/Converter
ADSL Asymmetrical Digital Subscriber Line
AEIM Automated Information Extraction in Media Production
AES Audio Engineering Society
AGL XML schema used to define the metadata fields, and the layout of these fields in a graphical environment (Ardendo)
ALC Asynchronous Layered Coding (RFC 3450)
AMR-NB Adaptive Multi Rate - Narrow Band
ANC Ancillary (SDI, HD-SDI)
AP Associated Press
APTN Associated Press Television News
ASF Advanced Streaming Format / Advanced Systems Format (Microsoft)
ASI Asynchronous Serial Interface (DVB)
ASX Advanced Stream Redirector (Microsoft)
ATIS Alliance for Telecommunications Industry Solutions (U.S.A.) http://www.atis.org/
ATM Asynchronous Transfer Mode
ATSC Advanced Television Systems Committee (USA)
AVC Advanced Video Coding (MPEG-4 Part 10 = ITU-T H.264)
AVC-I Advanced Video Coding - Intra (Panasonic)
B Bidirectional coded picture (MPEG)
B2B, BtoB Business-to-Business
B2C, BtoC Business-to-Consumer
BARB Broadcasters' Audience Research Board Ltd http://www.barb.co.uk/
BGD Broadband Gateway Device
BIFS Binary Format For Scenes (MPEG-4 Part 1 & Part 11)
BLITS Black and Lane's Ident Tones for Surround (British Sky Broadcasting)
BMP BitMaP file format
BR Bit-rate
BT British Telecom
BXF Broadcast eXchange Format (SMPTE)
C Centre
CBMS Convergence of Broadcast and Mobile Systems (DVB Technical Module, previously known as TM-UMTS)
CBR Constant Bit Rate
CCD Charge Coupled Device
CCU Camera Control Unit
CFA Color Filtered Array
CG Computer Graphics
CIF Common Intermediate Format (352*288 pixels)


CLDM Common Logical Data Model
CMOS Complementary Metal-Oxide Semiconductor
CMS Content Management System
CMT Common Metadata Toolkit (DVB Technical Module sub-group)
CRT Cathode Ray Tube – the technology of the traditional TV set or video monitor
CSS Cascaded Style Sheets
CTA Central Technical Area
CWI Centrum voor Wiskunde en Informatica http://www.cwi.nl/
D/HDC Evaluation of HD codecs (EBU Project Group – Delivery Technology)
D-10 Sony's IMX VTR SMPTE standard
DAB Digital Audio Broadcasting
DAM Digital Asset Management
DB Database
DC Dublin Core http://dublincore.org/
Demux Demultiplexer
DEXA Database and Expert Systems Applications
DIS Digital Item Streaming (MPEG-21 Part 18)
DLNA Digital Living Network Alliance http://www.dlna.org
dm Downmix
DMB Digital Multimedia Broadcasting (a standard adopted by WorldDAB for transmission of multimedia over DAB; in use in Korea) http://www.t-dmb.org
DMF Digital Media Factory (VRT)
DMS Descriptive Metadata Scheme (MXF)
DNxHD High Definition encoding (Avid) http://www.avid.com/resources/whitepapers/DNxHDWP3.pdf?featureID=882&marketID=
DOM Document Object Model
DoP Director of Photography
Double M/S, Double M&S Mid-Mid-Side (stereo audio)
DRM Digital Rights Management
DST Daylight Saving Time or Summer Time
DTTV Digital Terrestrial Television
DV Digital Video cassette recording and compression format
DVB Digital Video Broadcasting
DVB-C/-H/-S/-T Digital Video Broadcasting (Cable/ Handheld/ Satellite/ Terrestrial)
DVB-H DVB - Handheld
DV-DIF DV - Digital Interface Format
DWDM Dense Wave Division Multiplexing
DXB Digital eXtended Broadcasting
ECMA European Computer Manufacturers Association
eDAB extended DAB*
e.g., eg exempli gratia, for example
ENC Encoder / Encoding
ENG Electronic News Gathering
ES Elementary Stream (MPEG-2)
ESB Enterprise Service Bus (IBM) http://www-306.ibm.com/software/info1/websphere/index.jsp?tab=landings/esb
ESG Electronic Services Guide
ESPN Entertainment and Sports Programming Network (U.S.A.)
ETSI European Telecommunications Standards Institute http://www.etsi.org/services_products/freestandard/home.htm
f/s, fps Frame/second
FEC Forward Error Correction
FFMPEG Collection of free and open source software libraries that can record, convert and stream digital audio and video in numerous formats. It includes an audio/video codec library (libavcodec) and an audio/video container mux and demux library (libavformat). The name comes from the MPEG standards experts group, together with "FF" for "fast forward" http://ffmpeg.mplayerhq.hu/


FIFA Fédération Internationale de Football Association http://www.fifa.com/en/index.html
FLUTE File delivery over Unidirectional Transport (RFC 3926)
fps frame per second
FRext Fidelity Range Extensions (MPEG-4 AVC)
FTP File Transfer Protocol (Internet)
FX Special effects
Gbps Gigabit per second
GBS Generic Data Broadcasting & Service Information Protocols (DVB Technical Module)
GC Generic Container (MXF)
GLITS Graham Lines's Identification Tone for Stereo (BBC)
GMF4iTV General Media Framework for interactive Television (European IST project) http://www.gmf4itv.net/
GOP Group Of Pictures (MPEG-2)
GPFS General Parallel File System (IBM)
GPP General Programme Production/Programme
GUI Graphical User Interface
GUID Globally Unique IDentifier
H Horizontal
HD Hard Disk
HD-SDI High Definition SDI (1,5 Gbit/s)
HE-AAC High Efficiency - AAC http://www.ebu.ch/en/technical/trev/trev_305-moser.pdf
HL High Level (MPEG)
HNED Home Network End Device
HNN Home Network Node
HP High Profile (MPEG)
HTML HyperText Markup Language
HTTP HyperText Transfer Protocol
HW, H/W Hardware
I Intra coded picture (MPEG)
i Interlaced
IBBT Interdisciplinair instituut voor BreedBand Technologie http://www.ibbt.be/index.php?node=293&table=LEVEL0&id=1&ibbtlang=en
IBC International Broadcasting Convention (Amsterdam)
ID Identifier, identification
IP Internet Protocol
IPDC IP DataCasting
IPI IP Infrastructure (DVB Technical Module)
IPR Intellectual Property Rights
IPTC International Press Telecommunications Council http://www.iptc.org/pages/index.php
IPTV Internet Protocol Television
IR Infrared
ISAN International Standard Audiovisual Number
ISOG (WBU-) International Satellite Operations Group http://www.nabanet.com/wbuarea/committees/isog.asp
ISP Internet Service Provider
IT Information Technology (Informatics)
ITU International Telecommunication Union
ITU-R International Telecommunication Union – radiocommunication sector
iTV Interactive Television
ITV Commercial television network (UK) http://www.itv.com/aboutITV/
J2ME Java 2 Micro Edition
JP2K, J2K JPEG2000
JPEG Joint Photographic Experts Group
KLV Key-Length-Value coding (MXF)
L / Ls Left / Left surround
LASeR Lightweight Application Scene Representation (MPEG-4 Part 20)


LCD Liquid Crystal Display
LCOS Liquid Crystal On Silicon (display)
LCT Layered Coding Transport (RFC 3451)
LFE Low Frequency Effects (Surround Sound)
Ln Level n
LSDI Large Screen Digital Imagery
Lt Left Total (stereo Dolby fold-down)
LTO Linear Tape Open (IBM, HP, Seagate)
LUT Look-Up Table
M Mega
MADI Multichannel Audio Digital Interface
MAM Media Asset Management
Mbps Megabit per second
MCR Master Control Room
MDA Microphone Distribution Amplifier
MDP Media Dispatch Protocol
ML Main Level (MPEG-2)
MMS, MSM Mid-Mid-Side (stereo audio)
MMS Multimedia Messaging Service
MOS Media Object Server
MP Main Profile (MPEG-2)
MPE Multiprotocol Encapsulation (DVB-H)
MPEG Moving Picture Experts Group
MS, M&S, M+S, M/S Mid-Side (stereo audio)
MUX Multiplexer
MXF Material eXchange Format
NAS Network Attached Storage
NewsML-G2 News Markup Language - 2nd Generation (IPTC)
NHK Nippon Hoso Kyokai (Japan)
NITF News Industry Text Format
NLE Non-Linear Editing
NMS News Management System (EBU Eurovision)
NRCS NewsRoom Computer System
NYC New Year Concert (ORF, Austria)
OAI Open Archives Initiative http://www.openarchives.org
OASIS Organization for the Advancement of Structured Information Standards http://www.oasis-open.org/home/index.php
OB Outside Broadcasting
OCT Optimized Cardioid Triangle (surround sound)
ODRL Open Digital Rights Language
OMA Open Mobile Alliance
OP Operational Pattern (MXF)
OWL Web Ontology Language (W3C)
P Predicted picture (MPEG)
p Progressive
P/HDTP High Definition Television in Television Production (EBU Project Group)
P/META EBU Metadata Exchange Scheme
P/SCAIE Study of Content Analysis-based Automatic Information Extraction for Production (EBU project group)
P/TVFILE Use of FILE formats for TeleVision production (EBU Project Group)
P2 Brand name of the Panasonic camera memory card http://www.panasonic.com/business/provideo/p2/index.asp ftp://ftp.panasonic.com/pub/Panasonic/Drivers/PBTS/papers/P2-WP.pdf
P2P Peer-to-Peer
PCM Pulse Code Modulation
PDA Personal Digital Assistant
PDP Plasma Display Panel


PH Picture Height PL 'Positive Lock' lens mount (Arri) PMC Production technology Committee (EBU Technical Department) PMCP Programming Metadata Communication Protocol (ATSC) PMH Protocol for Metadata Harvesting PNG Personal Newsgathering PNG Portable Network Graphics file format PROMS Promenade Concerts (BBC / Royal Albert Hall, London, UK) PS Program Stream (MPEG-2) PsF Progressive segmented Frame PSI Programme Specific Information (MPEG-2 System) PSK Phase Shift Keying modulation PSNR Peak Signal-to-Noise Ratio PTS Production Technology seminar (EBU) PVR Personal Video Recorder QAM Quadrature Amplitude Modulation QC Quality Control QCIF Quarter Common Intermediate Format (176*144 pixels) QT QuickTime (Apple) R / Rs Right / Right surround R&D Research & Development RAI Radiotelevisione Italiana RD Remote Device RDF Resource Description Format (W3C) REL Rights Expression Language (MPEG-21) Res. Resolution RF Radio Frequency RFS Request For Standardization (SMPTE) RFT Request For Technology (SMPTE) RGB Red-Green-Blue (colour model) RGBA Red, Green, Blue, Alpha channel RSS Really Simple Syndication Rt Right Total (stereo Dolby fold-down) RTP Real-time Transport Protocol (Internet) RTSP Real-Time Streaming Protocol (Internet) S&R Search & Retrieve SAF Sample Aggregation Format SCTE Society of Cable Telecommunications Engineers (U.S.A.) 
SD(TV) - Standard Definition (Television)
SDH - Synchronous Digital Hierarchy
SDI - Serial Digital Interface (270 Mbit/s)
SDK - Software Development Kit
SEO - Search Engine Optimization
SFTP - SSH (Secure Shell) FTP
SI - Service Information (DVB)
slo-mo - Slow motion
SMIL - Synchronized Multimedia Integration Language
SMPTE - Society of Motion Picture and Television Engineers
SMS - Short Message Service
SNR - Signal-to-Noise Ratio
SOA - Service Oriented Architecture
SOAP - Simple Object Access Protocol http://www.w3.org/TR/soap/
SoP - Statement of Participation (SMPTE)
SSR - Società Svizzera di Radiotelevisione
STB - Set-top box (-> IRD)
SuperPOP - Super Point of Presence (EBU - Eurovision)
SVG - Scalable Vector Graphics


SVT - Sveriges Television (Sweden)
SW, S/W - Software
TB - Terabyte
TCP - Transmission Control Protocol (Internet)
T-DMB - Terrestrial DMB*
TISPAN - Telecoms & Internet converged Services & Protocols for Advanced Networks http://portal.etsi.org/tispan/TISPAN_ToR.asp
TRL - Time-Related Label
TS - Transport Stream (MPEG-2)
TSM - Tivoli Storage Manager (IBM) http://www-306.ibm.com/software/tivoli/products/storage-mgr/
TSR - Télévision Suisse Romande, the French-speaking network of the Swiss Broadcasting Corporation http://www.tsr.ch/tsr/index.html?siteSect=100000
TVA - TV-Anytime http://www.tv-anytime.org/
TWTA - Travelling-Wave Tube Amplifier
Tx - Transmitter / Transmission
uDOM - micro Document Object Model (SVG* tool for mobile applications)
UDP - User Datagram Protocol (Internet)
UGC - User-Generated Content
UGD - Uni-directional Gateway Device
UHDTV - Ultra High Definition TV (NHK)
UMTS - Universal Mobile Telecommunications System
USP - Unique Selling Point / Unique Selling Proposition
V - Vertical
VBI - Vertical Blanking Interval
VC-1 - Formerly the Windows Media Video codec, now SMPTE 421M-2006
VC-2 - SMPTE code for the BBC's Dirac video codec
VC-3 - SMPTE code for Avid's DNxHD video codec
VOD - Video On Demand
vs. - versus, against, compared to, opposed to
VTR - Video Tape Recorder
WAP - Wireless Application Protocol
WBU - World Broadcasting Unions http://www.nabanet.com/wbuArea/members/about.asp
WG - Working Group
WiFi - Wireless Fidelity
WMF, WMA, WMV - Windows Media format, Windows Media Audio, Windows Media Video
WML - Website Meta Language
WML - Wireless Markup Language
WoC - Work Center (VRT)
WYSIWYG - What you see is what you get
XHTML - eXtensible HyperText Markup Language
XML - eXtensible Markup Language
YCbCr - Digital luminance and colour difference information
YUV - Luminance signal & subcarrier modulation axes (PAL colour coding system)