OCAP RI Training
CableLabs Winter Conference
Philadelphia
March 15-16, 2012
Schedule
• Thu Mar 15 – 8:30 am to 4:30 pm
  » Project Overview and History – DaveH
  » Build system – Chris
  » Logging and debugging – ScottD
  » Software architecture overview – MarcinK
  » Profiling applications with the RI's JVM – SteveA/ScottD
  [lunch]
  » System Information – Prasanna/Craig
  » Service Selection and DVR subsystem – Craig
  » PC Platform – Marcin/SteveM/Neha
  » HN subsystem – Doug/Craig/Marcin/Lori
• Fri Mar 16 – 8:30 am to 2:30 pm
  » Graphics – SteveA
  » Testing – ScottA/Nicolas
  » Issue reporting and handling – ScottA
  [lunch]
  » Stabilization/Robustness project – Marcin/Chris
  » Miscellaneous topics – TBD
Project Overview and History
Project started June 2008
• Core OCAP I16+ compliant
  » Several ECNs short of 1.0.0 compliance
• DVR I03 extension
  » Missing ECNs 897, 994, 1017
• Front Panel I02 extension
• Approx 200 bugs carried over from legacy code base
• Vision Workbench/Clientsim SDK
• Closed code base
Current Status
• Feb 2012 (1.2.2 Bundle Release)
  » Core OCAP 1.2.2
  » DVR Extension I08
  » Front Panel Extension I05
  » Device Settings Extension I05
  » Home Networking Extension I08
  » Home Networking Protocol I07
• JVM
  » Based on Sun open source phoneME Advanced MR2
• PC Platform
  » Windows XP initially; now Windows & Linux (several versions)
• SDK
  » Rel-6 released Nov 2011 – no further versions planned
• This is an open-source project
  » http://java.net/projects/ocap-ri/
2012 Schedule
• 1.2.1 Bundle released on 01/12/12
• 1.2.2 Bundle (1.2.2 Rel-A RI) released on 02/24/12
  » Included VPOP as primary feature
  » Also MIBObject.getValue
  » Completed implementation of ServiceResolutionHandler
  » Completed implementation of DTCP/IP on PC platform
  » Maintenance against 1.2.1 RI
• 1.2.2 Rel-B planned for 04/05/12
  – No feature development
  – Maintenance against 1.2.2 RI
  – Fixes will be taken off the current trunk
  – 1.2.2 will not be maintained as a separate branch
• 1.2.3 Bundle planned for 05/17/12
  » HN features to align with CVP-2 compliance
  » Maintenance against 1.2.2 RI
  » Backlog items
• Add IPv6 and https support to the PC platform
• Stack housekeeping tasks
Other
• DTCP platform port for Windows and Linux
  » MPEOS API changes were included with the 1.2.1 RI release
  » RI implementation now available for internal testing at CableLabs
• Phasing out Windows XP support in favor of Windows 7
  » Some performance issues with the GStreamer plugin on Win7/newer Linux platforms are being investigated
• SDK Rel-6 released on 11/17/11
  » No current plans for a future release
• Upgrade of the DirectFB library we use (low priority)
• Port RI to the Mac (very low priority)
Project site walk through
• Main Project Page
• Bug Databases
• Forums
• Wikis
• Contribution Process
• Bug fix cutoffs
• Release notes
• Coding standards
Why an RI?
[Diagram: the OCAP specifications (with their stubs & DTDs), the OCAP tests (CTP), and the OCAP Reference Implementation each clarify and test the others; together they make up an OCAP release bundle.]
• Components of a bundle are:
  » Specs
  » Stubs
  » DTDs
  » RI implementation, including integration tests
  » CTP conformance tests
OCAP RI Requirements
• RI runs on a PC
  » Windows initially – now Linux
• RI and PC IDE must be available on open-source terms
• RI and PC IDE must only include components with licenses compatible with the ODL dual-license plans
  » Components available only under GPL are not OK
  » Licenses for all third-party RI components must be reviewed by both CableLabs and the ODL legal teams
• RI works with existing CableLabs ATE/CTP tests
• RI adheres to current and future OCAP core specs
• RI adheres to current and future OCAP extension specs
• To ensure backwards compatibility of the spec, MSO guides must run on the RI
• To ensure backwards compatibility of stack ports of the RI, any changes to the MPEOS porting layer must be approved by the RI steering committee
Licensing Models
• GPL license on java.net
  » CableLabs OpenCable Project
    – OCAP Stack, PC Platform, Head-end Emulator
  » Sun phoneME Project – JVM
• Commercial license
  » CableLabs Commercial License
    – Also free
    – Stack, platform and emulator
    – RAND IPR commitment
    – Bug fixes in stack contributed back
  » Sun or other JVM vendor
    – Commercial CDC/PBP 1.1 JVM or phoneME JVM
OCAP RI Branching Strategy
• Three principal branches
  » Mainline/Development Branch
    – Code implemented by internal RI Dev Team
    – Code from open source contributors that is vetted by RI Tech Leads
    – Other working branches get merged back to Mainline periodically
  » Branded Branch (e.g., "1.1.4")
    – Fixes and enhancements that are tied to the spec and which have been verified by the CTP
    – Branded branch is maintained separately from mainline
    – Changes from branded branch eventually migrate back to mainline development
    – One branded branch per spec release
  » Experimental Branch
    – Open source contributors have write access to this directory
    – No other restrictions
    – Merging to Mainline on a case-by-case basis
Bug Tracking
• Two bug tracking databases
  » Internal (private) JIRA db (OCORI) at CableLabs, tied to the CableLabs CTP bug db
  » External (public) JIRA db on java.net (IT); hides details of CTP-related issues
RI Build System
Building the RI – The Easy Way
See https://community.cablelabs.com/wiki/display/OCORI/Quick+Start for detailed instructions.
• Setup development environment
  » Cygwin + JDK + Ant for Windows
  » A little more required for Linux (see Wiki)
• Get checkout_dev_env.sh (from svn)
• Use checkout_dev_env.sh to get source and create the setEnv file
• Execute ant in the appropriate directory
  » Builds Platform and Stack
• See Wiki for detailed instructions.
Build System – Environment Variables
• Easy to work in several different RI code bases at the same time.
• OCAPROOT» The absolute path to the OCAP-1.0 directory.
» Required for compilation/execution
» Example: E:/Cablelabs/svn/OCAPRI/trunk/ri/RI_Stack
• OCAPHOST» Defines the host build environment
» Build system reads host environment configuration files from ($OCAPROOT/hostconfig/$OCAPHOST)
» Required for compilation only
» Example: Win32-Cygwin
Build System – Environment Variables
• OCAPTC
  » The Target Configuration for the build – basically the port you are working on
  » Defines a subdirectory hierarchy where:
    – build configuration files are found ($OCAPROOT/target/$OCAPTC)
    – binary intermediate products are built ($OCAPROOT/gen/$OCAPTC)
    – final binary products are installed and runtime configuration files are kept ($OCAPROOT/bin/$OCAPTC)
  » Suggested format is: <org>/<platform>/<os>/[debug|release]
    – Example: CableLabs/simulator/Win32/debug
  » Required for compilation/execution
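Put together, a setEnv file (as created by checkout_dev_env.sh) might contain something like the following. The values simply combine the documented examples above — your paths will differ:

```sh
# Illustrative setEnv contents (example values only)
export OCAPROOT=E:/Cablelabs/svn/OCAPRI/trunk/ri/RI_Stack
export OCAPHOST=Win32-Cygwin
export OCAPTC=CableLabs/simulator/Win32/debug
```

With these set, the build reads host config from $OCAPROOT/hostconfig/$OCAPHOST and target config from $OCAPROOT/target/$OCAPTC.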
Build tools
• Make
  » Compiles JNI, MPE, MPEOS, and third-party native libraries
• Ant
  » Coordinates the entire build system
  » Wiki contains a list of top-level build targets
• JDK (1.4 or higher)
  » Used to compile stack and test application sources
Win32 Port
• Host environment is Cygwin
  » See Wiki for a full list of Cygwin libraries required to build the RI Stack and Platform
• Cross-compile to MinGW (no Cygwin DLL)
• Lots of work (including JVM patches) to deal with POSIX vs. Win32-style paths
  » POSIX for gcc
  » Win32 for javac, javah, etc.
• VERY SLOW (compared to Linux)
  » JVM binaries pre-built and checked in to save compilation time, since most won't be modifying the JVM
• Windows XP, Vista
Linux Port
• Known working distros/versions:
  » Fedora 10/12/13/15
  » Ubuntu 10.04/10.10/11.04
• Much faster than Win32 on the same hardware.
• See Wiki for detailed instructions.
Logging and Debugging
Stack logging
• log4j APIs included in the spec for use by applications
• Additional Logger methods avoid String concatenation overhead in most cases
• Monitor applications configure logging through DOMConfigurator or PropertyConfigurator
• Groups
  » Multiple loggers can share a common group name, which can be used during configuration
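A monitor application using PropertyConfigurator would supply a standard log4j properties file along these lines. This is an illustrative fragment only — the logger name `ocap.media` is a made-up example, and the RI-specific group/appender extensions are configured similarly but not shown here:

```properties
# Root logger: INFO and above to the console
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d %-5p [%t] %c - %m%n

# Raise verbosity for one subsystem's logger (name is hypothetical)
log4j.logger.ocap.media=DEBUG
```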
Stack logging continued
• New appenders
  » MIB Appender
  » AsyncAppender uses an additional thread and a queue to offload writing to the target appender from the caller thread
• New configuration capabilities
  » Configure at the 'group' level or the logger level
  » Filter support, including ExpressionFilter (the ability to use regular expressions for fine-grained control over logging verbosity)
Stack logging continued
• Additional information available from the Wiki:
  » https://community.cablelabs.com/wiki/display/OCORI/Configuring+Java+stack+logging
Platform logging
• Platform code uses log4c to manage logging
• Configuration found in $PLATFORMROOT/log4crc
• Additional information available from the Wiki:
  » https://community.cablelabs.com/wiki/display/OCORI/RI+PC+Platform+Logging
Logging and IssueTracker
When attaching a log to IssueTracker:
• Ensure the log contains timestamps
• Helpful if Java debug logging is enabled
Chainsaw screenshot [not reproduced]
Java Stack debugging
• Possible to step through breakpoints in Java stack code and to generate stack traces and thread dumps
• Stack traces and thread dumps available via jdb (included with the Sun JDK)
• To enable Java debugging, un-comment VMOPT 19 & 20 in mpeenv.ini and start the debugger or jdb
• Re-comment VMOPT 19 & 20 when done.
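The VMOPT entries referred to above are standard JDWP debugger options. The exact contents of VMOPT 19 & 20 vary by release, so the lines below show only the typical form of such options rather than the literal mpeenv.ini contents:

```ini
# Illustrative only -- the real entries are in your mpeenv.ini
VMOPT.19=-Xdebug
VMOPT.20=-Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=8000
```

With options like these enabled, `jdb -attach 8000` (or an IDE debugger) can connect to the running stack.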
Platform debugging
• gdb can be used to generate a trace if the Platform terminates unexpectedly
• ./runRI.sh -gdb
RI Software Architecture
March 15-16, 2012
Platform Architecture
[Diagrams: four build-up slides of the Platform layer stack, bottom to top:]
• Host Operating System – Windows and Linux APIs
• RI Platform Support Libraries – GLib, PThreads, internationalization (Int'l), Log4c, XML2, ZLib, GStreamer (with Gst-Plugins-Base and Gst-Plugins-Good), wxWidgets, SNMP, FFMPEG
• RI Platform Implementation – GStreamer plugins, tuner control, user interface
• RI Platform API
Platform Summary
• Full software emulation of STB media decoding and presentation hardware.
• Majority of the code is 3rd-party support libraries.
• Leverages existing frameworks:
  » GLib – utility library
  » GStreamer / HDHomeRun / VLC – tuner control
  » GStreamer – media decoding and presentation
  » wxWidgets – user interface
  » Net-SNMP – master agent
• No OS abstraction APIs.
OCAP Stack Architecture
[Diagrams: five build-up slides ("OCAP Porting API", "OCAP Native Library", "OCAP JVM", "OCAP Java Implementation", "OCAP API") of the OCAP stack layered on the Platform, bottom to top:]
• RI Platform API
• MPEOS Implementation, exposing the MPEOS API:
  » Presentation APIs – Display, FP, Graphics, UI Events
  » OS APIs – DLL, Debug, Event, File, Memory, Socket, Storage, Sync, Thread, Time, Util
  » Decoding APIs – Closed Captioning, DVR, Filter, Media, POD, Sound, VBI
• MPE Library – DSM-CC, SI database & parsing, file system management, DirectFB, FT2
• phoneME JVM – VM plus PBP 1.1 class libraries
• OCAP Java implementation – Core Implementation, NanoXML, Log4J, JUnit, and the OCAP Java Managers (App, Service, Auth, EAS, Host, TSB, Record, SNMP, ...)
• OCAP API
OCAP Summary
• Ported to Platform APIs for STB/video-related functionality.
• OS-functionality MPEOS port is maintained together with the stack code.
• MPE contains platform-independent/portable C code.
• Integrates the phoneME Advanced JVM.
• OCAP functionality implemented via pluggable Java Manager implementations.
Profiling applications with the RI’s JVM
phoneME JVM
• RI uses the open source (GPLv2) phoneME Advanced JVM from Sun/Oracle
  » Closely based on J2SE 1.4
  » Adding JSSE (secure sockets) support in an upcoming release
• OCAP requires a JVM compliant with the latest maintenance releases of:
  » Personal Basis Profile 1.1
  » Foundation Profile 1.1
  » Connected Device Configuration 1.1
• Last update: October 24, 2009
• Patch common source to fix bugs and build problems
  » Win32/Cygwin/Make filesystem issues
  » JDWP (VM debugging) thread sync issues
  » PNG bug
  » GIF bug (suspected)
JVM Build
• All JVM-related files located in $OCAPROOT/jvm
• Build disabled by default for RI Win32 – pre-built binaries checked in to SVN
  » Enable/disable building the VM with the "build.jvm.exclude" entry in $OCAPROOT/target/$OCAPTC/buildrules.properties
ocap_vmlib
• Interfaces and classes to assist in integrating a VM with the RI Stack
• Includes a full AWT implementation ported to MPE graphics APIs (DirectFB)
• Documentation» $OCAPROOT/docs/JVMPortingGuide.doc
Profiling and analysis tools
• There are a number of tools available for investigating issues in the Java stack code
• NetBeans Profiler
• CVM Inspector
• HPROF
• jdb
Profiling with NetBeans 6.8
• JVMTI-based
• Supports profiling of CPU, Memory and high level JVM stats (GC, thread activity)
• Used to identify CPU and memory hot spots
• Does not support creation or comparison of heap dumps
• https://community.cablelabs.com/wiki/display/OCORI/Profiling+the+RI%27s+JVM+using+NetBeans+6.8
CVM Inspector
• Set of utility functions and a shell script which can assist in inspecting the state of the JVM
• Available by setting a flag in the JVM build
• Used to generate and compare heap dumps
• Runs in either client-server mode (standard JVM client connects via a socket to the RI) or standalone mode (GDB)
• https://community.cablelabs.com/wiki/display/OCORI/Generating+heap+dumps+on+the+RI%27s+JVM+using+CVM+Inspector
HPROF
• A command-line profiling tool
• Used to generate detailed monitor statistics
• https://community.cablelabs.com/wiki/display/OCORI/Generating+thread+monitor+stats+on+the+RI%27s+JVM+using+hprof
JDB
• The Java debugger command line interface
• Used to generate lock and thread stack information
• https://community.cablelabs.com/wiki/display/OCORI/Generating+Java+thread+dumps+and+monitor+information
Service Information (SI)
OCAP SI
• OCAP SI access API
  » Provides information about available services in an interactive broadcast environment
• OCAP uses the SCTE-65 SI model (the standard for SI delivered out-of-band on cable networks)
• SI tables are acquired from the out-of-band channel (e.g., legacy OOB/DAVIC or DSG broadcast tunnel)
• Incorporates the JavaTV SI API
• When a CableCARD is present, all applications (service bound/unbound) have access to SI delivered on the OOB channel.
SI profiles
• SCTE-65 defines 6 profiles for OOB SI
• RI stack supports Profiles 1-3
• Includes tables: NIT-CDS, NIT-MMS, SVCT-DCM, SVCT-VCM, NTT-SNS and STT (optional)
SI Profile Tables
Profile | Name               | Tables
1       | Baseline           | SVCT, NIT, NTT
2       | Revision Detection | Versioning enabled for NIT, NTT, SVCT
3       | Parental Advisory  | Profile 2 plus RRT
4       | Standard EPG Data  | Profile 3 plus AEIT, AETT
5       | Combination        | LVCT, MGT; backward compatible with Profiles 1-4
6       | PSIP only          | LVCT, AEIT, optional AETT
Stack Components
OCAP SI
• Incorporates the following JavaTV SI packages:
» javax.tv.locator
» javax.tv.service.transport.Transport
» javax.tv.service.transport.Network
» javax.tv.service.transport.TransportStream
» javax.tv.service.transport.ServiceDetails
OCAP PSI
• Incorporates the following packages:
  » javax.tv.service.navigation.ServiceComponent
  » org.ocap.si.ProgramAssociationTable
  » org.ocap.si.ProgramMapTable
• Supports Object Carousel
  » Carousel ID resolution
  » Deferred association tag resolution
  » NSAP resolution
Java SIDatabase/SICache
• Access to native SIDB via Java SIDatabase
• SI data/events processed and dispatched to registered listeners
• Most recent SI is cached in the SICache to avoid trips across the JNI layer to the MPE SI Database
• SICache is flushed periodically (flush interval configurable)
• Discrete SI request types to access various SI objects
• SI requests are satisfied asynchronously, or time out after the configurable timeout value elapses
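The caching pattern described above can be sketched in plain Java. This is a toy model, not the RI's actual SICache — the class and method names here are invented for illustration, and the real implementation dispatches asynchronous requests across JNI to the MPE SIDB:

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of an SI cache that fronts a slower "native" database.
// Entries are served from the cache until flush() is called,
// mimicking the periodic flush interval described above.
public class SimpleSICache {
    private final Map<Integer, String> nativeSidb;  // stands in for the MPE SIDB
    private final Map<Integer, String> cache = new HashMap<Integer, String>();
    private int nativeLookups = 0;

    public SimpleSICache(Map<Integer, String> nativeSidb) {
        this.nativeSidb = nativeSidb;
    }

    // Look up a service by source id, consulting the cache first.
    public String getServiceBySourceId(int sourceId) {
        String cached = cache.get(sourceId);
        if (cached != null) {
            return cached;                          // JNI trip avoided
        }
        nativeLookups++;                            // simulated JNI/MPE trip
        String result = nativeSidb.get(sourceId);
        if (result != null) {
            cache.put(sourceId, result);
        }
        return result;
    }

    // Periodic flush: discard cached SI so updates are picked up.
    public void flush() {
        cache.clear();
    }

    public int nativeLookupCount() {
        return nativeLookups;
    }
}
```

Repeated lookups of the same service cost one native trip until the next flush, which is the point of the real cache as well.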
MPE SI
• When does it start?
  » Gated by POD_READY indication
• Event-driven acquisition and parsing of OOB table sections via the MPEOS section filtering API
  » Dedicated thread for OOB SI
  » At start-up, sets filters for all five table types: NIT-CDS, NIT-MMS, SVCT-DCM, SVCT-VCM, NTT-SNS
  » Individual table timeout values configured via mpeenv.ini
  » SIDB populated in MPE layer
SITP SI State diagram
[State diagram: from start, the engine enters an "OOB SI Acquiring" state, in which SI is not yet available. As NIT/SVCT/NTT tables are received, acquisition progresses; once all OOB tables are received, the engine reaches "SI Fully Acquired" and continues to process SI updates. On timeout, the engine transitions to SI DISABLED / SI NOT AVAILABLE.]
MPE SI (contd.)
• Profile-1 SI and CRC processing
  » Section count is unknown, so the MPE SI engine employs a heuristic based on unique section CRC values and CRC match counts to decide when a table has been fully acquired
  » Section CRC match counts are configurable for individual tables via mpeenv.ini
  » Table acquisition timeout values also aid in faster acquisition
  » SVCT DCM scoreboard used to accelerate VCM acquisition
  » On slower networks with infrequent section repeat cycles, SI acquisition can be problematic
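The profile-1 heuristic can be illustrated with a small sketch. This is an invented simplification of what the slide describes — since the section count is unknown, track unique section CRCs and declare the table acquired once every CRC seen so far has repeated a configured number of times (the match count that mpeenv.ini makes configurable):

```java
import java.util.HashMap;
import java.util.Map;

// Toy version of the profile-1 acquisition heuristic: count how many
// times each unique section CRC has been observed; the table is
// considered fully acquired once every CRC seen so far has been
// observed `matchCount` times.
public class CrcAcquisitionHeuristic {
    private final int matchCount;
    private final Map<Long, Integer> crcCounts = new HashMap<Long, Integer>();

    public CrcAcquisitionHeuristic(int matchCount) {
        this.matchCount = matchCount;
    }

    // Record one received section; returns true once acquisition
    // is considered complete.
    public boolean sectionReceived(long crc) {
        Integer n = crcCounts.get(crc);
        crcCounts.put(crc, n == null ? 1 : n + 1);
        return isAcquired();
    }

    public boolean isAcquired() {
        if (crcCounts.isEmpty()) return false;
        for (int count : crcCounts.values()) {
            if (count < matchCount) return false;
        }
        return true;
    }
}
```

The trade-off the slides mention falls out directly: a higher match count is more robust against partially seen tables, but on networks with slow section repeat cycles it delays acquisition.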
MPE SI (contd.)
• Profile-2 and above rely on the Revision Detection Descriptor (RDD) for table section count
MPE SI Cache
• Enable/disable via mpeenv.ini configuration (disabled in RI by default)
• Speeds up stack start-up on slower networks by using cached SI
• SI data is written to persistent memory
• SI data read from file at start-up if cache is enabled
• Normal OOB SI acquisition also continues
• Updates to SI are reflected in the cache
• For testing only (not intended for deployment)
MPE PSI Subsystem
• PSI: PAT & PMT acquisition using the MPE Section Filter subsystem
• Manages PAT/PMT acquisition and parsing from OOB (DAVIC/DSG Broadcast), DSG application tunnels, tuners, and HN streams
MPE PSI Subsystem
• SITP uses 6 different filter classes on each transport stream:
  » Initial PAT
  » Initial selected PMT
  » Initial secondary PMT(s)
  » Revision PAT
  » Revision selected PMT
  » Revision secondary PMT(s)
• Fixed resources (local tuners & legacy out-of-band) are assigned filter groups at startup according to mode
• Dynamic resources (DSG app tunnels, HN streams, remote tuners) are assigned filter groups when the session is started
• SITP acquisition logic then works in terms of classes and filter priorities without concern for class-to-group associations
MPE PSI
• State machine:

[State diagram, reconstructed – states: IDLE, Wait Initial PAT, Wait Initial primary PMT, Wait Initial Secondary PMT, Wait Revision. Transitions (with action codes):]
  » IDLE – TUNE_SYNC (A1) → Wait Initial PAT
  » Wait Initial PAT – PAT matched (A3) → Wait Initial primary PMT; timeout (A8) → Wait Revision
  » Wait Initial primary PMT – primary PMT matched (A4+A5) → Wait Initial Secondary PMT; timeout (A9) → Wait Revision
  » Wait Initial Secondary PMT – all secondary PMTs matched (A4+A6) → Wait Revision; timeout (A9) → Wait Revision
  » Wait Revision – PAT matched (A2+A3); PMT matched (A4+A7)
  » Any state – TUNE_UNSYNC (A2) → IDLE

Action table:
  » A1: Set PAT positive filter (with timeout)
  » A2: Cancel all filters
  » A3: Parse PAT; set PAT negative filter; set primary PMT filter (with timeout); set secondary PMT positive filters
  » A4: Parse PMT
  » A5: Set primary PMT revision filter
  » A6: Set secondary PMT positive filter
  » A7: Set secondary PMT negative filter
  » A8: Set PAT positive filter (no timeout)
  » A9: Set PMT positive filter (no timeout)
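The transitions and actions above can be sketched as a small table-driven state machine. The state, event, and action names mirror the diagram, but the structure is illustrative only — the actual SITP code is organized differently:

```java
// Minimal sketch of the PSI acquisition state machine. handle()
// returns the action codes (A1..A9) associated with each transition,
// or "" if the event is ignored in the current state.
public class PsiStateMachine {
    public enum State { IDLE, WAIT_INITIAL_PAT, WAIT_INITIAL_PRIMARY_PMT,
                        WAIT_INITIAL_SECONDARY_PMT, WAIT_REVISION }
    public enum Event { TUNE_SYNC, TUNE_UNSYNC, PAT_MATCHED,
                        PRIMARY_PMT_MATCHED, ALL_SECONDARY_PMTS_MATCHED, TIMEOUT }

    private State state = State.IDLE;

    public State state() { return state; }

    public String handle(Event e) {
        if (e == Event.TUNE_UNSYNC) {            // any state: cancel all filters
            state = State.IDLE;
            return "A2";
        }
        switch (state) {
            case IDLE:
                if (e == Event.TUNE_SYNC) { state = State.WAIT_INITIAL_PAT; return "A1"; }
                break;
            case WAIT_INITIAL_PAT:
                if (e == Event.PAT_MATCHED) { state = State.WAIT_INITIAL_PRIMARY_PMT; return "A3"; }
                if (e == Event.TIMEOUT)     { state = State.WAIT_REVISION; return "A8"; }
                break;
            case WAIT_INITIAL_PRIMARY_PMT:
                if (e == Event.PRIMARY_PMT_MATCHED) { state = State.WAIT_INITIAL_SECONDARY_PMT; return "A4+A5"; }
                if (e == Event.TIMEOUT)             { state = State.WAIT_REVISION; return "A9"; }
                break;
            case WAIT_INITIAL_SECONDARY_PMT:
                if (e == Event.ALL_SECONDARY_PMTS_MATCHED) { state = State.WAIT_REVISION; return "A4+A6"; }
                if (e == Event.TIMEOUT)                    { state = State.WAIT_REVISION; return "A9"; }
                break;
            case WAIT_REVISION:                  // revision handling: re-parse on match
                if (e == Event.PAT_MATCHED)         { return "A2+A3"; }
                if (e == Event.PRIMARY_PMT_MATCHED) { return "A4+A7"; }
                break;
        }
        return "";
    }
}
```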
MPE PSI Subsystem
• Has 6 defined acquisition modes for tuner-based section acquisition, to tailor section filter resource usage to the platform's filtering capabilities:
  » Mode 1: Legacy single-filter sharing
  » Mode 2: Dedicated filter per tuner
  » Mode 3: 2 dedicated filters per tuner, without secondary acquisition
  » Mode 4: Dedicated filters per tuner for PAT and selected PMT, with a "wandering" PSI pre-fetch filter
  » Mode 5: Mode 3 plus 1 filter that picks up all secondary PMTs across all tuners
  » Mode 6: No filter sharing (every section request uses its own filter)
SIDB
• MPE SIDB contains the following (SI/PSI) objects:
  » Transports
  » Networks
  » Transport Streams
  » Services
  » Programs
  » Elementary Streams
• Provides API to access the various SI objects
SIDB API
• Lock/unlock SIDB for read/write access
• Access provided using opaque SI handles
• Service resolution methods
  » mpe_siGetServiceHandleBySourceId()
  » mpe_siGetServiceHandleByFPQ()
  » mpe_siGetServiceHandleByServiceName()
  » etc.
• SI enumeration methods
  » mpe_siGetAllNetworks()
  » mpe_siGetAllServicesForTransportStream()
  » mpe_siGetServiceComponentsForServiceHandle()
  » etc.
SIDB API (contd.)
• Object Carousel support methods
  » Look up PID given Carousel ID / component tag
  » Find program number given deferred association tag
• CA support methods
  » Enumerate CA descriptors
  » Look up ECM PID
System Time
• RI parses the out-of-band System Time Table (STT) to extract system time in UTC
• STT section filtering and parsing done on a dedicated thread in the MPE SITP layer
• Can be disabled for platforms which directly process STT or use an alternate mechanism
• Maintains a difference between system time and network time
• Employs a smoothing algorithm to avoid jumps
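The offset-maintenance idea can be illustrated with a generic sketch. The RI's actual smoothing algorithm is not documented here, so this uses simple exponential smoothing as a stand-in, and all names are invented:

```java
// Toy model of keeping a smoothed offset between local system time and
// network (STT) time, so corrections are applied gradually rather than
// as jumps. Exponential smoothing stands in for the RI's algorithm.
public class SmoothedTimeOffset {
    private final double alpha;   // smoothing factor, 0 < alpha <= 1
    private double offsetMs;      // current smoothed offset
    private boolean initialized = false;

    public SmoothedTimeOffset(double alpha) {
        this.alpha = alpha;
    }

    // Called whenever a new STT arrives: sttTimeMs is the network time,
    // localTimeMs is the receiver's clock at that instant.
    public void sttReceived(long sttTimeMs, long localTimeMs) {
        double sample = sttTimeMs - localTimeMs;
        if (!initialized) {
            offsetMs = sample;    // first sample: adopt directly
            initialized = true;
        } else {
            offsetMs = offsetMs + alpha * (sample - offsetMs);
        }
    }

    // Network-corrected time for a given local clock reading.
    public long networkTime(long localTimeMs) {
        return localTimeMs + Math.round(offsetMs);
    }
}
```

A noisy or stepped STT sample then moves the reported time only part of the way per update, avoiding the visible jumps the slide warns about.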
Service Selection & DVR Subsystem
ServiceContext
• JavaTV ServiceContext is the primary entity for Service presentation/selection
• AV Services can be:
  » "Broadcast" Service – a Service object retrieved from SIManager, or an OcapLocator representing a frequency/program/modulation
  » RecordedService – a recording accessed via a RecordingRequest
  » RemoteService – a Service representing an HN-streamable ContentItem retrieved from the ContentServerNetModule
ServiceContext
• The RI's ServiceContext implementation utilizes delegates for each type of Service that can be presented:
  » BroadcastServiceContextDelegate: used for presenting broadcast Services when the DVR extension is not present
  » RecordedServiceContextDelegate: used for presenting RecordedServices (DVR only)
  » RemoteServiceContextDelegate: used for presenting RemoteServices (HN only)
  » DVRBroadcastServiceContextDelegate: used for presenting broadcast Services when the DVR extension is present; supports the TimeShiftProperties interface, which allows the app to enable/disable timeshift and specify timeshift duration
  » AbstractServiceContextDelegate: used for selecting AbstractServices – Services which represent unbound xlets
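Delegate selection can be sketched as a simple dispatch on the service type and the extensions built into the stack. This is an invented simplification — the RI's actual selection logic lives inside its ServiceContext implementation and is more involved:

```java
// Toy dispatcher choosing a ServiceContext delegate name from the
// service type and which extensions (DVR, HN) are present in the stack.
public class DelegateChooser {
    public enum ServiceType { BROADCAST, RECORDED, REMOTE, ABSTRACT }

    public static String chooseDelegate(ServiceType type, boolean dvr, boolean hn) {
        switch (type) {
            case BROADCAST:
                // DVR builds get the time-shift-capable delegate
                return dvr ? "DVRBroadcastServiceContextDelegate"
                           : "BroadcastServiceContextDelegate";
            case RECORDED:
                if (!dvr) throw new IllegalStateException("DVR extension required");
                return "RecordedServiceContextDelegate";
            case REMOTE:
                if (!hn) throw new IllegalStateException("HN extension required");
                return "RemoteServiceContextDelegate";
            default:
                return "AbstractServiceContextDelegate";
        }
    }
}
```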
ServiceContext
• Class relationship: [class diagram not reproduced]
ServiceContext
• OCAP requires that the ServiceContext provide a JMF Player to the application for media control
• On the RI, the Player and the Player-associated Controls provide rendering control, but actual rendering is done by the platform
• The ServiceContextDelegate acquires tuner/TSB resources, and the RI's Player infrastructure manages everything else, including:
  » MediaAccessAuthorization (parental control, etc.)
  » Conditional Access
  » PSI/ServiceComponent retrieval
  » NetworkInterface/component validation
  » CCI enforcement
  » TSB/segmented time-shift attachment (for DVRBroadcastServiceContextDelegate)
  » SegmentedRecordedService navigation (for RecordedServiceContextDelegate)
ServiceContext
• Resources required for presentation:

Resource             | Abstract service | Broadcast service | Recorded service | Remote service
AppsDatabase         | Yes              | Yes               | No               | No
Video device         | No               | Yes               | Yes              | Yes
NetworkInterface     | No               | Yes*              | Yes**            | No
MediaAccessHandler   | No               | Yes               | Yes**            | No
RecordingManager     | No               | No                | Yes              | No
MediaStorageVolume   | No               | No                | Yes              | No
UPnP/DLNA framework  | No               | No                | No               | Yes

* The non-DVR ServiceContextDelegate manages a NetworkInterface directly; the DVR broadcast ServiceContextDelegate uses a NetworkInterface indirectly (via TimeShiftWindowClient)
** If presenting an ongoing recording and switching to the 'live point'
ServiceContext Basics
• Delegate/Player relationship: [diagram not reproduced]
ServiceContext
• Basic broadcast Service "live" playback example (sequence reconstructed from the timing diagram):
  » t1: DVBServiceContext.select() called for Service s1 → mpe_mediaTune(freq1, mod1); transport stream section filtering session(s) opened
  » t2: MPE_TUNE_SYNC – tune completes; section filtering for PAT/PMT initiated by the MPE SI Manager (mpe_filterSetFilter())
  » t3/t4: MPE_SF_EVENT_SECTION_FOUND – PAT/PMT found
  » t5: mpe_mediaDecode() – decode session begins; JMF broadcast Player started, with decode initiated using the Locator-specified components/PIDs from the PMT
  » t6: mpe_mediaStop() – JMF Player stopped
ServiceContext
• Basic broadcast Service time-shift playback example (sequence reconstructed from the timing diagram):
  » t1: DVBServiceContext.select() called for Service s1 → mpe_mediaTune(freq1, mod1); transport stream section filtering session(s) opened
  » t2: MPE_TUNE_SYNC – tune completes; PAT/PMT filtering (mpe_filterSetFilter(), MPE_SF_EVENT_SECTION_FOUND)
  » t3: mpe_dvrTsbBufferingStart() – time-shift buffering of s1 content begins (TSB filling still below capacity)
  » t4: mpe_mediaDecode() – decode session 1; JMF TSB Player started, initializing in live mode
  » t5: user-initiated rewind via Player.setRate(-2) → mpe_mediaStop(), then mpe_dvrTsbPlayStart() – timeshift playback session 1; JMF TSB Player switches to timeshift playback
  » t6/t7: TSB wrapping starts (on this platform) once the buffer reaches capacity
  » t8: user-initiated jump to live via Player.setMediaTime(POSITIVE_INFINITY) → mpe_dvrPlayBackStop(), then mpe_mediaDecode() – decode session 2; JMF TSB Player switches back to live playback
ServiceContext
• ServiceContext state machine: [diagram not reproduced]
DVR TimeShiftManager
TimeShiftManager Basics
• Internal Manager only (not exposed to OCAP applications)
• Provides both tuning and TSB management for multiple parts of the DVR-enabled RI stack
• ServiceContext implementation uses TSM to:
  » Tune
  » Enable Service buffering (via TimeShiftProperties)
  » Discover already-tuned/buffering Services (and NetworkInterfaces)
  » Acquire/enumerate TimeShiftBuffers (TSBs) for playback
• RecordingManager uses TSM to:
  » Tune
  » Enable Service buffering (via BufferingRequests)
  » Discover already-tuned/buffering/buffered Services (and NetworkInterfaces)
  » Convert previously-buffered and currently-buffering TSBs into RecordedServices
TimeShiftManager Basics
• HN TSBChannelStream uses TSM to:
  » Tune (ChannelContentItem tuning Locator)
  » Discover already-tuned/buffering Services
  » Initiate buffering
  » Acquire/enumerate TSBs for streaming
TimeShiftManager
[Diagram: the time-shift resources / time-shifted content for service S are shared by:]
• a ServiceContext presenting service S with the DVR extension enabled (discover existing, initiate buffering, present from)
• a recording of service S (discover existing, initiate buffering, convert to recording)
• a BufferingRequest for service S (initiate buffering)
• HN streaming initiated for the ChannelContentItem associated with service S (discover existing, initiate buffering, stream from)
TimeShiftManager Responsibilities
• Base Service acquisition mechanism for the DVR-enabled RI stack
• Manages the pool of MPE time-shift buffers (the number of TSBs and their size)
• Embodies the knowledge of how NetworkInterfaces are shared and creates SharedResourceUsages
• Represents the use of the NI in ResourceContention and manages the coordinated shutdown of NI-based operations when/if the NI or necessary components are lost, including:
  » Transport stream synchronization
  » Service component availability
  » Conditional Access (CA)
  » Switched digital video transitions
• Manages the TimeBases for all TSBs to allow for:
  » Proper navigation of TSBs by JMF
  » Discovery of TSB(s) for retroactive recording (conversion of already-buffered content into RecordedService(s))
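The retroactive-recording idea — converting content that is already sitting in a time-shift buffer into a recording — can be modeled with a toy circular buffer. All names here are invented, and the real TSBs hold transport stream data on disk, not numbers in memory:

```java
import java.util.ArrayList;
import java.util.List;

// Toy time-shift buffer: a fixed-capacity circular buffer of "chunks".
// Once full, new content overwrites the oldest ("TSB wrapping"), and
// convertToRecording() copies whatever is still buffered -- the
// retroactive-recording conversion that TimeShiftManager coordinates.
public class ToyTimeShiftBuffer {
    private final long[] chunks;
    private long next = 0;        // total chunks ever written

    public ToyTimeShiftBuffer(int capacityChunks) {
        chunks = new long[capacityChunks];
    }

    public void buffer(long chunk) {
        chunks[(int) (next % chunks.length)] = chunk;  // wrap when full
        next++;
    }

    // Oldest-to-newest copy of the currently buffered content.
    public List<Long> convertToRecording() {
        List<Long> recording = new ArrayList<Long>();
        long start = Math.max(0, next - chunks.length);
        for (long i = start; i < next; i++) {
            recording.add(chunks[(int) (i % chunks.length)]);
        }
        return recording;
    }
}
```

The sketch makes the limitation visible: once the buffer wraps, only the most recent capacity's worth of content can still be converted into a recording.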
TimeShiftManager Internals
[Class diagram: TimeShiftManagerImpl implements the TimeShiftManager interface (held 1:1 by ManagerManager). A TimeShiftManagerImpl holds n TimeShiftWindows; each TimeShiftWindow holds n TimeShiftWindowClientImpls (implementing the TimeShiftWindowClient interface), n TimeShiftBuffers, and a NetworkInterfaceController.]
TimeShiftManager Internals
Static/Sequence Diagrams
DVR RecordingManager
RecordingManager Basics
• Defined as part of OCAP DVR I08 and DVB GEM DVR
• org.ocap.shared.dvr.RecordingManager is defined by DVB GEM – meaning its interface is used by specifications other than OCAP
• org.ocap.dvr.OcapRecordingManager extends the GEM RecordingManager – adding OCAP-defined functionality
• Has to work with/enable HomeNetworking extension when present
RecordingManager Responsibilities
• RecordingManager-provided functionality:
  » Responsible for starting and stopping RecordingRequests at particular times and managing conditions which can interrupt/resume recordings (so RecordingManager is always running)
  » Manages and attempts to honor BufferingRequests
  » Manages RecordedServices ("recordings")
  » Persistently saves RecordingRequests and associated application-supplied metadata (so RRs survive reboots)
  » Manages the space allocation/prioritization for RecordingRequests
  » Provides warnings when NetworkInterfaces are about to be utilized for RecordingRequests (RecordingAlertListeners)
RecordingManager API
RecordingManager API Diagrams
RecordingManager Implementation
• RecordingManager class diagram:

[Class diagram: RecordingRequestImpl implements RecordingRequest. ParentNodeImpl implements ParentRecordingRequest and holds 0..n child RecordingRequests (persisted via RecordingInfoTree). RecordingImpl implements OcapRecordingRequest; it holds a RecordingInfo2 with 1..n RecordedSegmentInfos and exposes a SegmentedRecordedServiceImpl, which implements SegmentedRecordedService; its 1..n segments are RecordedServiceImpls implementing RecordedService.]
RecordingInfo Metadata
Each of these forms represents a discrete persistent file type:

[Diagram: RecordingInfoNode metadata forms. PersistentData is the common base; a RecordingInfoTree holds 1..n RecordingInfoNodes. Legacy forms: RecordingInfo (old) and RecordingInfo2 (old) with SegmentedLeaf (old). Current form: RecordingInfo2 holding 0..n SegmentedLeafs; each holds 0..n RecordedSegmentInfos, and each RecordedSegmentInfo holds a TimeTable, 1..n RecordedServiceComponentInfos, and TimeBasedDetailsInfo.]
RecordingRequest States
The "blue sky" path for a (leaf) RecordingRequest takes it through these states:
» PENDING_NO_CONFLICT: recording is ready to record, and there is no contention for tuners with any other RecordingRequest
» IN_PROGRESS: recording is ongoing (tuner is tuned and data is being written to disk)
» COMPLETE: recording has completed and content is present in its entirety
» DELETED: recording has been deleted (RecordedService.delete() has been called)
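The blue-sky path can be captured as a tiny state model. The state names follow the text above; the transition-guard logic is invented for illustration and ignores the error/suspended states covered next:

```java
// Minimal model of the "blue sky" leaf RecordingRequest lifecycle.
public class RecordingLifecycle {
    public enum State { PENDING_NO_CONFLICT, IN_PROGRESS, COMPLETE, DELETED }

    private State state = State.PENDING_NO_CONFLICT;

    public State state() { return state; }

    public void start() {        // scheduled start time reached, tuner acquired
        require(State.PENDING_NO_CONFLICT);
        state = State.IN_PROGRESS;
    }

    public void finish() {       // scheduled end reached, all content on disk
        require(State.IN_PROGRESS);
        state = State.COMPLETE;
    }

    public void delete() {       // RecordedService.delete() called
        require(State.COMPLETE);
        state = State.DELETED;
    }

    private void require(State expected) {
        if (state != expected) {
            throw new IllegalStateException("expected " + expected + ", was " + state);
        }
    }
}
```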
RecordingImpl states
• Many things may not go as planned during the recording process, which put the RR into IN_PROGRESS_WITH_ERROR:
  » The RecordingRequest may not acquire an NI (tuner) at the time it's supposed to start
  » The NI may be lost after the RR has started
  » Sync may be lost on the NI
  » PSI (PAT/PMT) may not be acquired, or may be lost
  » Conditional Access may be denied
  » RemovableStorageVolume is disconnected
  » MediaStorageVolume (disk) becomes full
  » Service may be re-mapped via SPI (SDV)
• Any of these conditions may remedy themselves
RecordingRequest States
[State diagram, with a legend distinguishing leaf and parent states:]
• Parent states: Unresolved – record() → PartiallyResolved → CompletelyResolved; plus Cancelled and Failed
• Leaf states: Pending Without Conflict, Pending With Conflict, Test, In Progress, In Progress With Error, In Progress Insufficient Space, Failed, Complete, Incomplete, Cancelled
RecordingImpl States
• Each RecordingImpl has an external state and an internal state (IState)
[Diagram: implementations of RecordingImpl.IState – IStateInit, IStatePending, IStateWaitTuneSuccess, IStateStarted, IStateEnded, and the IStateSuspended family: TunerUnavailable, TunerNotReady, TuneFailed, BufferingDisabled, MSVUnavailable, TSWBufShutdown, ServiceRemap, ServiceUnavailable, InsufficientSpace, CopyProtected, CaDenied.]
Recording Process 1
07:00 08:00
OCAP ODL Stack Scheduled Recording Scenario 1
time-shifted s1 content
+
t1
OcapRecordingManager.record(OcapRecordingRequest orr1)
for s1, startTime 7:00, dur 60m
RecordedService 1
t2
TSB content earmarked/copiedfrom TSB1 for RS1
t3
Soon after 8:00, TimeShiftManager stops buffering into TSB1 andreleases the associated tuner
RecordingManager initiates TimeShiftWindow attach – causing a tune and timeshift TSB1 start
mpe_dvrTsbBufferingStart()
free space
mpe_dvrTsbConvertStart()
RecordingManager initiates time-shift-to-recording conversion
t4
mpe_dvrTsbConvertStop()
mpe_dvrTsbBufferingStop()
More RM Detailed Diagrams
More detailed recording scenarios
RI PC Platform
March 15-16, 2012
Platform Components
• Support Libraries
» 3rd party libraries supplying the foundation for TV/Media functionality in the Platform.
» Meet the RI dual licensing requirements.
• Implementation
» Leverages features provided by the support libraries.
» Adds extensions and customizations.
» Glues everything together into a coherent implementation.
• API
» Hides the underlying technology-specific terminology.
» Uses STB concepts to simplify command and control of the Platform features.
Support Libraries Overview
(Diagram: external library code and how each library is sourced per platform)
» gstreamer-0.10.22, gst-plugins-base-0.10.22, gst-plugins-good-0.10.10
» glib-2.18.4
» libiconv-1.12 (Win32: compile from src; Linux: part of libc)
» gettext-0.17 (Win32: compile from src; Linux: pre-installed)
» pthreads-2-8-0 (Win32: compile from src; Linux: part of libc)
» liblog4c-1.2.1, liboil-0.3.15 (Win32: compile from src; Linux: pre-installed)
» libxml2-2.6.32 (Win32: compile from src; Linux: pre-installed)
» zlib-1.2.3 (Win32: compile from src; Linux: pre-installed)
» Net-SNMP-5.6.1
» wxWidgets-2.6.4
» ffmpeg-0.5
» gtk+-2.10.14 (Win32: unused; Linux: compile from src)
Support Libraries Overview
• iconv & gettext provide internationalization support and are required by glib
• xml2 library requires zlib and is used by both gstreamer and clinkc
• pthreads implementation wraps Win32 thread support in POSIX APIs and is required by clinkc
• gstreamer requires glib
• gst-plugins-base needs gstreamer and oil – an optimized low-level routine library (e.g. fast memcpy using MMX or SSE extensions)
• gst-plugins-good requires gst-plugins-base
• ffmpeg supplies A/V decoding support; at present, only MPEG-1/2 video decoding capability is used
• wxWidgets library provides a multi-platform GUI framework and requires the gtk+ library on Linux
• log4c is a native implementation of the Apache logger
Implementation Overview
(Diagram: the RI Platform implementation sits on top of the support libraries shown in the previous diagram. CableLabs-developed code comprises the LAUNCHER, RI PLATFORM INTERFACE, CONFIG, LOGGING, Tuner Control, UI, SNMP Master Agent, and the CABLELABS GSTREAMER PLUGINS; the remaining components are external library code.)
Recent updates
• Trying to stay up-to-date on 3rd party library revisions as much as possible.
• Glib
» Updated from 2.22.4 to 2.28.7
» Should be available in the 1.2.2 Rel-B and future releases.
• Gstreamer
» Currently in progress.
» gstreamer: 0.10.22 to 0.10.35
» gst-plugins-base: 0.10.23 to 0.10.35
» gst-plugins-good: 0.10.10 to 0.10.30
Supporting Infrastructure
• Logging – uses log4c, available to all platform modules.
• Config – human-readable name-value keystore:
» # This is a comment
» RI.Launch.App.0 = ocap_stack
» RI.Launch.App.0.cwd = $(OCAPROOT)/bin/$(OCAPTC)/env
• Launcher – .EXE that:
» Starts the platform followed by all platform client DLLs.
» Enters the main loop and waits for a reset/shutdown signal.
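A keystore in the style shown above can be parsed with a few lines of code. This is a minimal sketch, assuming `#` comments, `name = value` pairs, and `$(NAME)` expansion against earlier keys and the environment — the platform's actual config semantics may differ.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of a name-value keystore with $(NAME) expansion.
class PlatformConfig {
    private final Map<String, String> values = new LinkedHashMap<String, String>();
    private static final Pattern VAR = Pattern.compile("\\$\\(([^)]+)\\)");

    void load(String text) {
        for (String line : text.split("\n")) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) continue; // skip comments
            int eq = line.indexOf('=');
            if (eq < 0) continue;                                  // not a pair
            String key = line.substring(0, eq).trim();
            String val = expand(line.substring(eq + 1).trim());
            values.put(key, val);
        }
    }

    // Expand $(NAME) against previously seen keys, then the environment.
    private String expand(String val) {
        Matcher m = VAR.matcher(val);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            String name = m.group(1);
            String sub = values.containsKey(name) ? values.get(name)
                    : System.getenv(name) != null ? System.getenv(name) : "";
            m.appendReplacement(sb, Matcher.quoteReplacement(sub));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    String get(String key) {
        return values.get(key);
    }
}
```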
GStreamer Overview
FDC
OpenGL Rendering
UDPSource
Section Filtering
Tee
PID Filter
TSB Tee
Decode Tee
PID Filter
Section Assembler
ES Assembler
Colorspace Converter
Section Filter
Section Sink
File Sink
File Source
FFMPEG Library
MPEG Decoder
DisplayNET UI
HDD
Section Filter
Section Sink
App Source
In-Band Pipelines:
Out-Of-Band Pipeline:
App Source
QueueVPOP Pipeline:
GStreamer Summary
• Framework and building blocks for MPEG-2 transport stream processing.
• Supplies in-band and OOB section filtering.
• Supplies MPEG-1 & 2 video decoding capability.
• Provides Time-Shift Buffer recording/playback.
• Supplies video and graphics plane display capability (in conjunction with the UI module).
• Supplies the VPOP output stream packet source.
User Interface Overview
GStreamer Display
Common OpenGLInfrastructure
Native Win32
wxWidgetsNative
Linux/X11
wxWidgets Win32 build
wxWidgets Linux build
GTK+
Multi-plane Image Source
Scaling / Compositing
Rendering / Windowing / Eventing
User Interface Summary
• Emulates the final TV display output – provides a physical view of the rendered frame buffer.
• Relies on OpenGL technology for graphics-intensive operations (scaling, compositing, rendering).
• Abstracts out window-specific operations into an API to permit the usage of any UI/Widget Toolkit/Framework.
• Currently supports wxWidgets framework as well as native OS targets (Win32/Linux).
Platform API
• Object-based.
• Attempts to be functionally equivalent to an STB driver API.
• Supports multiple instantiation of each object/API.
• Possible to supply multiple implementations of each API (like Java interfaces).
• Removes dependencies on any technology-specific terminology (abstraction layer).
• Some objects support multiple clients (e.g. display module for iDCR).
Home Networking Subsystem
March 15-16, 2012
HN Topics
• Overview
• Public Class Hierarchy
• Streaming
• Mapping to Platform
• Gstreamer pipeline for serving, playing back
» Porting
• DTCP-IP
• Current Status
» Limitations
HN Overview
The Home Networking Extension provides support for an ever-increasing set of use cases surrounding the discovery and delivery of content over the customer’s home network.
• Use cases have evolved from an adjunct option for content delivery to a primary one.
» Multi-room DVR (1.1.3 – 1.1.5)
» 3rd Party DLNA Device compatibility (1.2 – 1.2.2)
» CVP-2 and Cloud-based scenarios (1.2.3 – later)
HN Overview
• OCAP 1.1.3: Multi-room DVR
» Initial release of HN Extension
» Focus on CTP test conformance
» Limited streaming capability
• OCAP 1.1.4 – 1.1.5: Multi-room DVR cont.
» Redesigned Content Directory Service
» Continued CTP test conformance
» Improved streaming capability
HN Overview
• OCAP 1.2 – 1.2.2: 3rd Party Device Compatibility
» Lower-level UPnP Diagnostics API
» Improved UPnP/DLNA compatibility
» Live channels, Hidden items, Video Primary Output Port, Alternate URI
• OCAP 1.2.3: CVP-2 Support
» RUI Discovery and Authentication
HN Overview
• Java abstraction of:
» Network device discovery
» Content publishing
– Recordings, Live streaming, personal content
» Content discovery
» Content playback/display
» Remotely scheduling recordings
• Based upon, but abstracted from, UPnP
• 2 specs
» HNEXT I08 Spec (Java APIs)
» HNP2.0 I07 Spec (Mapping to UPnP)
Baseline HN Technology
• UPnP» Device Architecture 1.0» Media Server 2.0» Content Directory Service 3.0» Connection Manager Service 2.0» Scheduled Recording Service 1.0
• DLNA» DLNA Guidelines, August 2009
HN Stack Components
HN Stack Architecture
Java Stackorg.ocap.hn
upnp media server
CyberLink For Java
mpe
org.ocap.dvr
mpeos
PlatformHN Gstreamer Pipeline
UPnP Diagnostics API
Design Choices
• UPnP Diagnostics API implemented using a 3rd party UPnP stack
• Almost all code at the Java level
• Content served/consumed at the platform layer
» Rather than pass data up/down from Java
» Sockets passed between layers
UPnP Diagnostics Classes/Interfaces
• Singleton UPnPControlPoint
» Discovers UPnP devices on the network
» Registers low-level message handlers
• UPnPClientDevice
» Maps to each UPnP device on a specific network
• UPnPClientService
» Maps to each UPnP service on a device on a specific network
» Invokes UPnP actions on a network service
• UPnPActionInvocation
Basic Public Classes/Interfaces
• Singleton NetManager
» Get list of discovered Devices on the home network
» Get aggregated list of NetModules
– NetModule is the Java abstraction of a UPnP Service
• Device
» Straightforward mapping of a UPnP device
• NetModules
» Loosely based on UPnP Services
» ContentServerNetModule
» RecordingNetModule
• Event-driven listeners for discovery and activity
» DeviceEventListener/NetModuleEventListener/ContentServerListener
Basic Client Operation
• NetManager.addNetModuleEventListener(…)
• NetManager.getNetModuleList(…)
• ContentServerNetModule.requestRootContainer
• ContentServerNetModule.requestBrowseEntries
• ContentServerNetModule.requestSearchEntries
» Get back ContentList of Items/Containers
» Render Item(s)
Content Public Classes/Interfaces
Content package
• Natural 1:1 mapping to UPnP ContentDirectory
» ContentContainer = <container>
» ContentItem, ChannelContentItem = <item>
» ContentResource = <res>
» MetadataNode ≈ <property attribute=blah>
» requestBrowseEntries = Browse action
» requestSearchEntries = Search action
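On the wire, Browse and Search return DIDL-Lite XML whose elements map onto the classes above. The sketch below pulls an `<item>` title and its `<res>` URI from a hand-written fragment; the helper is illustrative and is not the RI's CDS parsing code.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Sketch of the CDS mapping: <dc:title> -> ContentItem title property,
// <res> -> ContentResource URI.
class DidlLiteDemo {
    static String[] parseItem(String didl) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(didl.getBytes("UTF-8")));
        // Default (non-namespace-aware) parsing matches the qualified names.
        String title = doc.getElementsByTagName("dc:title").item(0).getTextContent();
        String resUri = doc.getElementsByTagName("res").item(0).getTextContent();
        return new String[] { title, resUri };
    }
}
```

The fragment used below is a hand-written example, not output from the RI's media server.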
Recording Content Interfaces
Client Recording Content
• Not so simple a UPnP mapping
• RecordingContentItem ≈ <item> in CDS
• NetRecordingEntry ≈ <item> in CDS
• But also:
» RecordingContentItem ≈ recordTask in SRS
» NetRecordingEntry ≈ recordSchedule in SRS
• Refer to HNP2.0 Appendix I
» Singleton NetManager gives access to RecordingNetModules
» RecordingNetModule represents a DVR device
» requestSchedule ≈ CreateRecordSchedule in SRS
» Also requestReschedule, etc.
Recording Scheduling Interfaces
Implemented by Application Class
Implemented by Stack Class
Server Recording Content
• The server stack per se does not initiate recording
• The application (EPG) gets a RecordingNetModule and registers a request handler
» RecordingNetModule is a NetRecordingRequestManager
» UPnP SRS actions result in handler invocation
» The handler is responsible for calling DVR APIs and publishing the resulting recordings
» OcapRecordingRequests are RecordingContentItems
• Note: the RI HN extension currently requires the DVR extension
» In future, no DVR APIs will be referenced in HN
– Allowing an HN-only client or server
HN Streaming
• Once content has been published on the network by the DMS, and clients (DMCs) have discovered ContentItems, streaming can be initiated.
• A remote client (DMC) initiates HTTP-based streaming from the DMS using either the “res” URI from the ContentDirectoryService (CDS) or the “alternateURI” registered on the ContentItem by the guide application.
• The RI uses CyberGarage to handle connection initiation on the HTTP port.
• Streaming initiation requires authorization by the NetAuthorizationHandler, if registered.
• For streaming requests, the NAH is passed the requested URI and source IP.
• The NAH2 is additionally passed the entire request line and request headers.
HN Streaming
• Incoming streaming requests are delegated to one of 5 interceptors:
» IconInterceptor: Provides UPnPManagedDeviceIcon support (URI path /ocaphn/icon)
» RecordedServiceInterceptor: Provides RecordedService streaming support (URI path /ocaphn/recording)
» ChannelRequestInterceptor: Provides ChannelContentItem streaming support (URI path /ocaphn/service)
» VPOPInterceptor: Provides VPOP streaming support (URI path /ocaphn/vpop)
» PersonalContentInterceptor: Intended to provide content from local files (currently unsupported)
HN Streaming: Recorded Service
• Multiple HTTP range requests for a RecordingContentItem
t2
HTTP GET/headers parsed, URL is matched to ContentItem, and
processing delegated to RecordedServiceInterceptor
mpe_hnStreamOpen()
HN SPTS Stream Session 1 (mpe_HnStreamSession)
t5
HN Playback Session 1 (mpe_HnPlaybackSession)
mpe_hnPlaybackStart()
RecordingStream initiates RecordedService R1 range playback using mpe_HnPlaybackParamsMediaServerHttp with mpe_HnStreamContentLocation of LOCAL_MSV_CONTENT
t3
RecordedServiceInterceptor calls NetSecurityManager to authorize
playback activity
t4 t6
mpe_hnStreamClose()
t1
Socket connection established to HTTP port (via
CyberGarage)
MPE_HN_EVT_END_OF_CONTENT
t7
Next GET parsed &
delegated
mpe_hnStreamOpen()
HN SPTS Stream Session 2 (mpe_HnStreamSession)
t10
HN Playback Session 2 (mpe_HnPlaybackSession)
mpe_hnPlaybackStart()
RecordingStream initiates RecordedService R1 range playback / LOCAL_MSV_CONTENT
t8
Next GET authorized
t9
Socket handed off to platform for streaming
t11
mpe_hnStreamClose()
MPE_HN_EVT_END_OF_CONTENT
HN Streaming: Recorded Service
• SegmentedRecordedService playback (2 segments, 1x playback)
t2
HTTP GET/headers parsed, URL is matched to ContentItem, and
processing delegated to RecordedServiceInterceptor
mpe_hnStreamOpen()
HN SPTS Stream Session 1 (mpe_HnStreamSession)
t5
HN Playback Session 1 (mpe_HnPlaybackSession)
mpe_hnPlaybackStart(RS1)
RecordingStream initiates 1x playback of first RecordedService (RS1) in SegmentedRecordedService using mpe_HnPlaybackParamsMediaServerHttp with mpe_HnStreamContentLocation of LOCAL_MSV_CONTENT
t3
RecordedServiceInterceptor calls NetSecurityManager to authorize
playback activity
t4 t6
mpe_hnStreamClose()
t1
Socket connection established to HTTP port (via
CyberGarage)
MPE_HN_EVT_END_OF_CONTENT
t7
HN Playback Session 2 (mpe_HnPlaybackSession)
mpe_hnPlaybackStart(RS2)
RecordingStream initiates 1x playback of second RecordedService (RS2) in SegmentedRecordedService
t8
MPE_HN_EVT_END_OF_CONTENT
Last RecordedService in the SegmentedRecordedService
HN Streaming: Live Streaming
• Streaming requests for ChannelContentItems are also subject to additional processing steps:
» The “Channel Locator” is subject to resolution via the ServiceResolutionHandler to establish the “Tuning Locator”
» A NetworkInterface has to be tuned to the program specified in the Channel Locator (if the program is not already tuned)
» In attempting to acquire a tuner for streaming, the ResourceContentionHandler may be invoked – with a NetResourceUsage representing the network-initiated tuner acquisition
» When the DVR extension is present, time-shift buffering also needs to be started (if the program is not already buffering)
HN Streaming: Live Streaming
• ChannelContentItem playback
t2
HTTP GET/headers parsed, URL is matched to ContentItem, and
processing delegated to ChannelRequestInterceptor
mpe_hnStreamOpen()
HN SPTS Stream Session 1 (mpe_HnStreamSession)
t10
HN Playback Session 1 (mpe_HnPlaybackSession)
mpe_hnPlaybackStart()
TSBChannelStream initiates TSB 1 playback using mpe_HnPlaybackParamsMediaServerHttp with mpe_HnStreamContentLocation of MPE_HN_CONTENT_LOCATION_LOCAL_TSB
t3
ChannelRequestInterceptor calls NetSecurityManager to authorize
playback activity
t4
TSBChannelStream acquires a TimeShiftWindow via TimeShiftManager
t11
MPE_HN_EVT_PLAYBACK_STOPPED
mpe_hnStreamClose()
t1
Socket connection established to HTTP port (via
CyberGarage)
mpe_mediaTune(freq1,mod1)Transport stream section filtering session(s)
t5
t6
t7
MPE_TUNE_COMPLETEtime-shifted s1 content (TSB 1)
MPE_SF_EVENT_SECTION_FOUND
mpe_dvrTsbBufferingStart()
t8
t9
Remote endpoint terminates playbackcloses socket connection
mpe_dvrTsbBufferingStop()
t12
Buffering stops after timeout
HN Streaming: VPOP
• Virtual Primary Output Port: Allows for streaming of whatever content is playing back on the DMS
t1
HTTP request received with VPOP URI & handled
by VPOPInterceptor
mpe_hnStreamOpen()
HN SPTS Stream Session (mpe_HnStreamSession)
t5
HN Playback Session (mpe_HnPlaybackSession)
mpe_hnPlaybackStart(..., LOCAL_DISPLAY, disp1)
HNServerSessionManager initiates VPOP streaming by passing a mpe_HnPlaybackParamsMediaServerHttp with mpe_HnStreamContentLocation of LOCAL_DISPLAY and the mpe_DispDevice handle of the primary display
t2
NetAuthorizationHandler invoked
t4
HNServerSessionManager initializes the server-side connection using
mpe_HnStreamParamsMediaServerHttp – passing the connection to the platform
t12
MPE_HN_EVT_SESSION_CLOSED
t3
StreamingActivityListener invoked Decode Session 1
mpe_mediaDecode(…,disp1)
JMF TSB Player started.(Buffering disabled)
mpe_mediaStop()
t6
Decode Session 2
t7
mpe_mediaTune(freq1)
mpe_filterSetFilter()
Channel change
mpe_mediaStop()
t8 t9
mpe_mediaTune(freq2)
mpe_filterSetFilter()
Recording Playback Session
mpe_dvrRecordingPlayStart(…,disp1)
Playback of recorded service
mpe_dvrPlayBackStop()
t10 t11
mpe_mediaDecode(…,disp1)
The LOCAL_DISPLAY streaming/playback sessions persist across broadcast session decodes, TSB playbacks, and recording playbacks, and terminate when the DMC disconnects
VPOP connection appears paused/stalled when no content is being displayed on the DMS
The MPEOS port/platform renders the AV content for disp1 to the socket provided in the
mpe_HnPlaybackSession
Any trick modes performed during recording playback are rendered as live to the socket
provided in the mpe_HnPlaybackSession. Preferably, PIDs are remapped to consistent values
HN Streaming: RemoteService playback
• An OCAP RI DMC can play back an MPEG single-program transport stream (modeled on the DMC as a RemoteService)
t1
DVBServiceContext.select() called for RemoteService s1
(could represent a remote RecordingContentItem or
ChannelContentItem).
mpe_hnStreamOpen()HN SPTS Stream Session (mpe_HnStreamSession)
mpe_dvrTsbConvertStart()
Section filtering forPAT/PMT initiated by MPE SI Manager
t2 t3
mpe_filterSetFilter(MPE_FILTER_SOURCE_HN_STREAM)
MPE_SF_EVENT_SECTION_FOUND
t4
HN Playback Session (mpe_HnPlaybackSession)
PAT/PMT foundRemoteServiceDetails.getComponents() request satisfied
mpe_hnPlaybackStart()
JMF RemoteService/ Player started.Playback initiated with default or app-selected components/PIDs
t5
mpe_hnPlaybackStop()
JMF Player stopped
HN Platform Functionality
• Two levels of RI HN Platform code
» MPEOS C code
» GStreamer pipeline C code
• MPEOS Level – Player
» Performs HTTP request & response handling
» Receives content via socket
» Transfers to the HN player pipeline for decoding
• MPEOS Level – Server
» Sends content retrieved from the platform via socket
» NOTE: HTTP handling is performed in the RI Stack
• GStreamer HN Pipelines
» Server pipeline supplies content
» Player pipeline decodes content
MPEOS HN Changes
• Defines the porting interface for HN – mpeos_hn.h
• Methods & data structures used in the platform for both Server & Player roles
• Summary of recent changes
» Non-platform-specific server-side processing of HTTP requests moved to the RI Stack level
» Clearer separation of methods between player & server
» Enhanced “javadoc”-type documentation updates
» Added DTCP support
• Need to be sensitive to changes
» Impacts ALL platforms, not just the RI PC Platform
» All changes are posted to the forum
DTCP/IP
• RI Stack support complete with 1.2.1.
» DTCP_ profile ID signalling.
» application/x-dtcp1 mime types.
» Support for Range.dtcp.com in 1.2.2-B.
• RI Platform integration complete with 1.2.1.
» Integrated Intel SIK v4.02.
» Library presence dynamically detected and loaded at RI start-up.
» Fallback to no-op version.
» MPE configuration options:
– MPEOS.HN.DTCPIP.DLL=/f/encrypted/dtcpip.dll
– MPEOS.HN.DTCPIP.STORAGE=/f/encrypted/keys/
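The detect-and-fall-back behavior described above can be sketched as follows. The interface and class names here are illustrative, not the RI's actual DTCP/IP glue; a real port would bind the native library rather than throw.

```java
import java.io.File;

// Sketch: select a DTCP/IP implementation based on library presence,
// falling back to a no-op version when the DLL is absent.
interface DtcpIp {
    byte[] process(byte[] packet);
}

// No-op fallback: passes packets through unencrypted.
class NoopDtcpIp implements DtcpIp {
    public byte[] process(byte[] packet) {
        return packet;
    }
}

class DtcpIpLoader {
    static DtcpIp load(String dllPath) {
        if (dllPath != null && new File(dllPath).exists()) {
            // A real port would System.load(dllPath) and bind JNI methods here.
            throw new UnsupportedOperationException("native binding not sketched");
        }
        return new NoopDtcpIp();
    }
}
```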
DTCP/IP (cont’d)
DTCP/IP Testing
• Successful sink/source interoperability.• Verified support/interoperability with both facsimile
and production keys.• Successful streaming to a production DMP device.• Currently working through issues with DLNA LPTT.
Limitations
• Network interface mapping
» MoCA support
» Supporting multiple interfaces
– Bridging/loop mitigation
• Playback formats
» No audio playback support
» Only standard MPEG transport stream
– DLNA “ISO” ts formats, not tts (zero or otherwise)
– No MPEG PS support (adding in July)
Graphics
March 15-16, 2012
Setting up Graphics Resolution
TV Screen size fromDISP.DEFAULT.CONFIG (mpeenv.ini)
Platform Window size fromdisplay.window.widthdisplay.window.height (platform.cfg)
Video scaled accordingto incoming video sizeand applicable DFC rule
Coherent Configurations
DISP.DEFAULT.CONFIG Configuration
0 640x480 1:1 graphics, 720x480 8:9 video, 640x480 1:1 background
1 640x480 1:1 graphics, 720x480 8:9 video, 720x480 8:9 background
2 960x540 3:4 graphics, 720x480 8:9 video, 640x480 1:1 background
3 960x540 3:4 graphics, 720x480 8:9 video, 720x480 8:9 background
4 640x480 4:3 graphics, 1920x1080 1:1 video, 1920x1080 1:1 background (with I-frame support)
5 960x540 1:1 graphics, 1920x1080 1:1 video, 1920x1080 1:1 background (with I-frame support)
Stack Graphics Overview
MPE Java Classes (partial list)
• MPEGraphics
• MPEGraphicsConfiguration
• MPEGraphicsDevice
• MPEToolkit
• MPESurface
• MPEImage
MPEOS Graphics Object: mpeos_GfxScreen
typedef struct mpeos_GfxScreen
{
    int32_t x;
    int32_t y;
    int32_t width;
    int32_t height;
    int32_t widthbytes;
    mpe_GfxColorFormat colorFormat;
    mpe_GfxBitDepth bitdepth;
    mpeos_GfxSurface *surf;
    os_GfxScreen osScr;
} mpeos_GfxScreen;
MPEOS Graphics Object: mpeos_GfxSurface
typedef struct mpeos_GfxSurface
{
    os_Mutex mutex;                  /**< surface is thread safe */
    int32_t width;                   /**< width of surface in pixels */
    int32_t height;                  /**< height of surface in pixels */
    int32_t bpl;                     /**< bytes per line */
    mpe_GfxBitDepth bpp;             /**< bit depth (bits per pixel) */
    mpe_GfxColorFormat colorFormat;  /**< color format */
    void* pixel_data;                /**< pixel data */
    mpe_Bool primary;                /**< true if on-screen surface */
    mpe_GfxPalette clut;             /**< color palette used (if colorFormat == MPE_GFX_CLUT8) */
    os_GfxSurface os_data;           /**< os-specific surface info */
} mpeos_GfxSurface;
MPEOS Graphics Object: mpeos_GfxContext
typedef struct mpeos_GfxContext
{
    mpe_GfxColor color;
    mpe_GfxFont font;
    mpe_GfxPoint orig;
    mpe_GfxRectangle cliprect;
    mpe_GfxPaintMode paintmode;
    uint32_t modedata;
    mpeos_GfxSurface *surf;
    os_GfxContext os_ctx;
} mpeos_GfxContext;
Third-party packages
• DirectFB -- Used for alpha blending onto graphics buffer
• FreeType – Font support
Relationship with Platform
• Graphics Buffer
» Allocated by Platform
» Pointer passed to Stack via get_graphics_buffer (display.c)
» Drawn upon request via draw_graphics_buffer (display.c)
• Graphics resolution
» Changed by a call from Stack to Platform: update_configuration (display.c)
Use Case: FillRect() in Paint()
1) MPEToolkit creates MPEGraphics and wraps it in a DVBGraphicsImpl2 object, which is passed into paint.
2) FillRect called on DVBGraphicsImpl2 passes to the inner MPEGraphics object.
3) The call passes to gfxRectangle in mpeos_draw.c, with the native graphics context passed in.
4) gfxRectangle calls FillRectangle on the IDirectFBSurface in the native graphics context.
5) DirectFB paints to the graphics buffer, then calls the Platform to redraw.
RI Graphics UML - 1
• UML Diagrams
» https://community.cablelabs.com/svn/OCAPRI/trunk/ri/RI_Stack/docs/design/Graphics/OCAP_graphics_PhoneME.vsd
» Free Microsoft Visio 2003 Viewer
» enableTv Contribution (Russell Greenlee)
• 9 Structure Diagrams
» RI Launcher & Display
» MPE/MPEOS
» DirectFB
» Java
» HAVi
» AWT
• 23 Sequence Diagrams
» RI Emulator Startup
» MPE/MPEOS Initialization
» DirectFB Initialization
» OCAP Main/AWT Toolkit Initialization
» MPE Graphics Initialization
» HAVi Initialization
» AWT Rendering
RI Graphics UML - 2
Graphics vs VideoOutputPorts
• VideoOutputPorts: physical “spigots” on back of OCAP box. Each HScreen has a main VideoOutputPort.
• VideoOutputPorts AND CoherentConfigs control video plane resolution
CoherentConfig vs VideoOutputPort
• Coherent Config and Video Output Config BOTH control video resolution.
• On RI startup, the persisted Video Output Port Config is read and supersedes the CoherentConfig.
• When an OCAP app changes the coherent config, the Video Output Config is changed.
• When an OCAP app changes the Video Output Config, the CoherentConfig is changed.
Testing
March 15-16, 2012
Different Tests Play Different Roles
• Smoke Tests
• Standards Compliance
• Integration Demonstrations
• See: https://community.cablelabs.com/wiki/display/OCORI/Testing+the+RI
Smoke Tests - Overview
• Philosophy of daily smoke tests
• Current smoke tests
• For more information
• Recommendations for smoke tests
Smoke Tests - Philosophy
• Discover unexpected consequences ASAP
» Documented procedures for how and when to run the tests
» Published reports from running guide tests
» Routine, periodic execution
» Executed by all engineers, verified by the build
• Maximum coverage for a short period of time
» Focus on troublesome areas of the code
» Manual execution < 30 minutes effort for MSO guides
» Manual execution < a few minutes effort for CTP tests
Smoke Tests – Current Procedures
• Manual tests, not fully automated
• For every code change
» CTP test suite
» TuneTest
» Other tests, as appropriate
• Daily, at least once
» Aspen Guide, dual tuner configuration
» ODN, will begin soon
Smoke Tests - Guides
Smoke Tests – For more information
• Wiki page:
» https://devzone.cablelabs.com/web/oc/9/-/wiki/OCAP%20RI/Smoke+Testing
Smoke Tests - Recommendations
• Suggestions for additional tests?
• Recommendations for alternative procedures?
Smoke Tests - Summary
• Smoke tests are manual, quickly-executed, formal tests to gauge the health of a build
• All engineers on the RI project are running smoke tests
• Please contribute your thoughts about smoke tests on the forum.
CTP Testing against RI
• The full suite of unattended CTP tests (approx 5500) is run against the tip of the RI trunk every week.
• CTP tests are run using a “real ATE” – Automated Test Environment:
» A dedicated machine that executes ATE software, generates streams in real time, and sends them via an RF channel to the RI running on Windows and Linux.
• Priority is given to testing the full configuration of the RI.
• The other three configurations are tested monthly.
CTP Testing against RI
• Failure Results:
» RI QA does initial analysis.
» Bugs are filed in the RI JIRA db at CableLabs, which is tied to the CableLabs CTP bug db.
» If the failure is determined to be due to a test bug, an issue is entered into the CL CTP bugs database and the 2 issues are linked.
• Weekly results and corresponding Jira issues are captured in a spreadsheet for easy comparison of RI status from week to week.
Attended CTP Test Automation
• Existing Scenario
• Problems faced
• Tool used for Analysis/Report Generation
Attended CTP - Existing Scenarios
• 2 groups of attended tests:
» Interactive Test
» Visual Inspection Test
• Automation is done only for the second group.
• This increases the efficiency of the tester and reduces the wait time spent running tests.
• Minimal changes were made to the ATE emulator:
» Script changes to avoid the pop-up of a question
» Take a screenshot of the RI screen
Tools Used for Test Run/ Test Analysis
• Screenshot tool (Test Run):
» Tool written in Java to enable screenshots of the RI screen.
• Spreadsheet creation tool:
» After the test run, collects all the images and questions and organizes them into a spreadsheet.
• Test Result Update tool (TestResultMux):
» Views the images and questions through an interface and creates a test result.
Advantages of using the tool
• Usability:
» User friendly, so a new person needs less time to learn the tool.
• Efficiency:
» Reduces the time taken for test result analysis and creating a test report.
• Robustness:
» Avoids human errors while executing/updating test analysis reports by showing all the mandatory parameters on the screen.
Integration Tests
• $OCAPROOT/apps/qa/org/cablelabs/xlets
• Each test application should have a readme.txt in its directory.
• Ongoing clean-up and rework
» Latest status is found in $OCAPROOT/apps/qa/docs/IntegrationTestInfo.doc
• New test applications will be added going forward for integration testing of new features.
Integration Test Automation AutoXlet
• Automated xlet examples ($OCAPROOT/apps/qa/org/cablelabs/xlet/):
» TuneTest
» PermissionTest
» FileAccessPermissionTest
» PropertiesTest
» DvrTest
Home Networking Test Automation
• Home Networking tests are challenging
» Video comparison
» Network issues
• Rx Project
• RiScriptlet
• RiExerciser
• Pragmatic automation of HN Integration tests
• Overnight smoke tests
MSO Guide Testing - Approach
• We want to run the same tests against the RI that are run against any STB
» RI/QA is always looking for more tests
• “Canary” boxes
• All non-blocked suites are run before a release, or as needed.
• TestLink manages the test procedures, execution, and results.
MSO Guide Testing - Suites
• Tuning, live streaming, VPOP
• DVR and Trick Mode viewing
• Locks and PINs
• Various ways to find content – Guides and HN content searches
• All buttons on the simulator remote control
• All “feature menus”
MSO Guide Testing – Tests NOT Run
• The environment at CableLabs determines the tests to be run. Some tests are not run due to:
» Specialized headend support at CableLabs (Upsell, CallerID, etc.)
» Headend-triggered features
» VOD and CA support
» Platform features – audio, analog channels
• What tests do you think should be run, and how frequently?
TestLink – Manage MSO Guide Tests
• Web-based test management
• http://www.teamst.org/
• GNU GPL, version 2
• Architecture – Apache web server, MySQL, PHP
• http://www.apachefriends.org/en/xampp.html
TestLink – Test Cases
TestLink - Reports
MSO Guide Testing - TestLink
• We want to run the same tests against the RI that are run against any STB
» RI/QA is always looking for more tests
» RI/QA is always looking for more canary boxes
• Not all tests can be run here – however, if YOU can run them at your site…
• TestLink is our web-based test management tool
Unit Tests – Design and Expectation
• Historical JUnit Tests
» $OCAPROOT/java/test
» Out of date when received by the RI Project
» Contain useful coding examples
• Design of new tests for fixes and contributions
» Fail against old code, pass on new code
• Expectation of new tests
» Coverage for all lines of code for contributions
• Your experience with JUnit tests?
Rx - Ri eXerciser
March 15-16, 2012
Goals and Benefits of RX
Rx Project Overview: The Rx Project is an effort to take the current DvrExerciser OCAP Xlet and refactor it into a framework which provides the following:
• A library of functionality, called OcapAppDriver, that aggregates OCAP API calls into functional building blocks provided by the RI Stack & PC Platform.
• A scripting interface, the RiScriptlet Xlet, that allows automated testing of these functional building blocks.
• A “Guide”-type OCAP Xlet, referred to as the RiExerciser, that supports functional/application-level testing and development through a GUI.
Another testing framework… Really?
• This framework allows for automation without much additional effort.
• The Rx Project will be part of the open source, so outside contributors will be able to write BeanShell scripts to illustrate issues and also demonstrate fixes.
• We currently have multiple Xlets in the QA directory, with much redundant code. The Rx framework allows Xlets to use common utilities already found in OcapAppDriver, or to add to the common functionality.
• For HN, we need a framework where tests can be run to verify robustness in a network environment with many devices. HN CTP tests assume an isolated network.
OcapAppDriver
• A library of commonly used functions (e.g. tune, record, etc.) that can be used by xlets or scripts. The idea is for this library to be easier to use than the raw OCAP calls, so that a script-writer in particular does not need detailed OCAP API knowledge to write a script.
• OcapAppDriver lib calls need to clearly assert whether they are synchronous or asynchronous. In general, synchronous calls will be favored over asynchronous.
• OcapAppDriver lib calls will pass/return only simple Java primitives (String, int, etc.), in keeping with the philosophy that the API should require minimal OCAP API knowledge to use.
Asynchronous vs Synchronous example
• As previously mentioned, OcapAppDriver calls will in general be synchronous. However, there are asynchronous OCAP API calls, such as tuning and recording.
• For any asynchronous API call, a synchronous counterpart will exist. For example, waitForTuningState() is a synchronous counterpart to the asynchronous serviceSelectByIndex() method.
• In general, synchronous counterparts should include “waitFor” in the method name.
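The "waitFor" pattern above can be sketched with a latch: the asynchronous call returns immediately and a completion event arrives later on another thread, while the synchronous counterpart blocks until that event (or a timeout). The method names below come from the slide; the implementation is illustrative, not OcapAppDriver's code.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

// Sketch of an async call paired with its synchronous "waitFor" counterpart.
class AsyncTuner {
    private final CountDownLatch tuned = new CountDownLatch(1);

    // Asynchronous: returns immediately; completion arrives on another thread.
    void serviceSelectByIndex(int index) {
        new Thread(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(50); // simulate tuning delay
                } catch (InterruptedException ignored) {
                }
                tuned.countDown();    // simulate the tune-complete event
            }
        }).start();
    }

    // Synchronous counterpart: true if tuning completed within the timeout.
    boolean waitForTuningState(long timeoutMs) throws InterruptedException {
        return tuned.await(timeoutMs, TimeUnit.MILLISECONDS);
    }
}
```

A script can then call `serviceSelectByIndex(...)` followed by `waitForTuningState(...)` and branch on the boolean, with no listener plumbing in the script itself.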
Rx Architecture
RiScriptlet
• RiScriptlet is an xlet that executes scripts
• Scripts
» Written in “Java” and executed by the BeanShell interpreter
» Specified in hostapp.properties or via telnet
» Can have subroutines for code organization
• Results file written when the script is complete
Example Script
// Delete All Recordings
rxLog.info ("Deleting all recordings...");
rxReturn = true;
if(!rxDrvr.deleteAllRecordings())
{
rxReturn = false;
rxReturnString = "deleteAllRecordings failed";
}
rxLog.info ("Done deleting all recordings...");
Synchronizing Scripts
• Can run multiple scripts in parallel
• Can sync scripts via sync points on one or more RI instances
• One RiScriptlet acts as sync server to coordinate sync points
• Syncing is TCP-based
Synchronization Architecture
[Diagram: Script #1 and Script #2 each connect over TCP (register, sync, unregister) to an RiScriptlet acting as the Sync Master]
Sync Example Scripts
// Script #0
rxSyncClient.register ((byte)0 /* clientId */, (byte)2 /* expected num clients */, 3000 /* timeout */);
Thread.sleep(5000);
rxSyncClient.sync ((byte)0 /* syncId */, 10000 /* timeout */);
rxSyncClient.unregister(3000 /* timeout */);
// Script #1
rxSyncClient.register ((byte)1 /* clientId */, (byte)2 /* expected num clients */, 3000 /* timeout */);
Thread.sleep(1000);
rxSyncClient.sync ((byte)0 /* syncId */, 10000 /* timeout */);
rxSyncClient.unregister(3000 /* timeout */);
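The rendezvous semantics behind the register/sync/unregister handshake above can be sketched in plain Java. This is an in-process analogy only – the real rxSyncClient coordinates separate RI instances over TCP, while this sketch uses a `CyclicBarrier` and two threads to stand in for Script #0 and Script #1 – but the rule is the same: nobody passes a sync point until the expected number of clients has arrived.

```java
import java.util.concurrent.CyclicBarrier;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative in-process analogy of the TCP-based sync-point protocol.
public class SyncPointSketch {
    // Counts scripts that made it past the sync point.
    static final AtomicInteger passed = new AtomicInteger();

    public static void main(String[] args) throws Exception {
        // Barrier of 2 plays the role of "expected num clients" in register().
        final CyclicBarrier syncPoint = new CyclicBarrier(2);

        Runnable script = () -> {
            try {
                // ... script-specific work happens here ...
                syncPoint.await(10, TimeUnit.SECONDS); // like rxSyncClient.sync(0, 10000)
                passed.incrementAndGet();
            } catch (Exception e) {
                System.out.println("sync failed: " + e);
            }
        };

        Thread script0 = new Thread(script); // stands in for Script #0
        Thread script1 = new Thread(script); // stands in for Script #1
        script0.start();
        script1.start();
        script0.join();
        script1.join();
        System.out.println(passed.get() + " scripts passed the sync point");
    }
}
```

As in the example scripts, each participant blocks at the sync point with a timeout, so a missing peer fails the sync rather than hanging the run forever.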
Home Networking Integration Testing
• Benefits
• Designing tests
• Building and running HN tests
• Handy tools
• Inventory of current tests
• Continuous integration scripts
• Where to find more information
Benefits - HN Integration Testing
• Testing of HN features without needing a headend or guide
• RI can be a server – third-party devices
• RI can be a player for your STB port
• Advantages of PC tools for network debugging
• API code examples for new ECs
Design – HN Integration Tests
• Use Cases from specifications
• Narrative from the beginning of specifications
• Common sense application of a feature
• Only happy-path tests
• Limited scope – not exhaustive combinations
• Mind Maps
Manual HN Integration Tests
Inventory – HN Integration Tests
• Streaming, recording and trick modes
• VPOP
• Hidden content
• Resource contention handling
• New test suites for new ECs
Tools – HN Integration tests
• RiExerciser
• Intel’s UPnP Device Spy
• Third-party devices – PS3, Samsung SmartTV
HN – Continuous Integration
• Regression – all tests expected to pass
• Results summarized and archived
• Optimized for our HN test lab
• …/OCAPRI/trunk/ri/hnTestScripts/buildRx.xml
More information – HN Tests
• How we test the RI
• https://community.cablelabs.com/wiki/display/OCORI/Testing+the+RI
• Location of integration spreadsheets
• OCAPRI/trunk/ri/RI_Stack/apps/qa/hn/integration
Contributions – HN Integration Tests
• Designs for integration tests
• Additions to OcapAppDriver
• Contribute scriptlets
Summary – HN Integration Tests
• Demonstrations of functionality – end-to-end
• Do not need:
» Guides
» Headends
• Learn about new APIs
• Designed from specs and common sense
• Manual test procedures in spreadsheets
• Automated with scriptlets
Issue Reporting and Handling
March 15-16, 2012
Issues – information to report
• Basic journalism
• In what environment (RI rev level, OS, etc.)?
• What did you do (sample code is best)?
• What happened (observations, logs, screen shots)?
• Priority – what should it be?
• Lastly, why do you believe this is an issue?
• Include sample code to demonstrate the behavior
http://java.net/jira/browse/OCAP_RI
Issues – Life Cycle
• Filed
• JIRA counterpart
• Comments, comments, comments
• Resolution
• Closed with the filer’s assent
Issues – Status
• Unresolved
» NEW – Initial state
» STARTED – Assigned and work begun
» REOPENED – Once resolved, but more work needed
• Resolved
» FIXED – A fix is checked in
» INVALID – The problem described is not an issue
» WONTFIX – The issue will never be fixed
» DUPLICATE – Duplicates an existing issue
» WORKSFORME – Attempts to reproduce were futile
Issues – Guidelines for Priority
• Blocker – Most important
» Catastrophic consequences
» Potentially affects many applications
» No workaround by changes to the app, etc.
• Critical
» More limited scope or consequences
• Major
» Workaround available, limited scope
• Minor and Trivial – Least important
Issues - Process
• Every release has a published cutoff date for considering issues to be resolved for that release – three weeks prior to release
• Higher priority, well documented issues are addressed sooner
• All correspondence should be through comments in the issue
Issues - Summary
• All correspondence and information about issues must be archived in the Issue Tracker system
• Report the required information for the most efficient response
• Recommendations and observations?
Issues - http://xkcd.com/627
Stabilization/Robustness Project
March 15-16, 2012
Stabilization (aka Brownian Motion)
• Rationale
» Some tests are not predictably repeatable – sometimes they pass, sometimes they fail
» This is true for a wide range of test approaches – CTP, TDK, etc.
• Approach
» Need to bound the population of Brownian tests
– Approximately 114 of 5600+ CTP tests (all – not just HN)
– Approximately 5 of 400+ TDK tests (all “good” tests)
» HN is a focus area since there are many potential factors that can affect test results (e.g., number of interfaces, fluctuating network conditions)
• Resolution
» Determine if the cause of the uncertainty is the network, the RI, or the test itself
» Eliminate the test, fix the test, or fix the RI to accommodate a fluctuating network environment
» Will rely on RiScriptlet for test automation
Robustness
• Rationale
» We have done very little stress testing on the RI
» We would like to begin addressing robustness issues now
• Approach
» Identify potential areas to research
» Focus on a few of those initially
Possible Approaches - 1
• Brownian Motion CTP investigation
• FindBugs (a static analysis tool)
• Drill down into code / code walk-throughs
• Identify functional areas
» Socket timeouts
» Run in context
» Monitor/locks
» Lack of synchronization in CyberGarage
• Design review of CyberGarage to gain better understanding
• Address individual OCORIs which are categorized
• Stop new feature development and focus on fixing issues
• Set up a “typical/real world” home network testing environment
• Perform STB-type testing
• Gather stability metrics
• Expand Rx Framework testing
Possible Approaches - 2
• Run CTP tests (either all or a subset) without consecutive reboots
• More guide-type testing by developers
• Identify scenarios with guides to be run
• Start fixing known issues rather than looking for more issues
• Perform deployment-type testing
• Run CTP tests on a busy network
• Investigate isolation of JVM/platform/stack to do targeted testing
• Run surfer tests
• Prioritize guide issues
• Investigate TDK Brownian motion
• Form a focus team to address the discovery issue in a simplified environment (local loopback)
• Re-architect problem areas
Miscellaneous Topics
March 15-16, 2012
MPEOS Review
• Low priority task to review and correct MPEOS header file comments, remove unused items, etc
• Comments in Doxygen format» Will expose this info as HTML pages» Will not update MPEOS Porting Guide
• Initial file to be refactored was mpeos_dvr.h» Was integrated into the 1.2.1 Release
• Next up are mpeos_media.h and mpeos_hn.h» Targeting 1.2.2 Rel-B
RI Conditional Access
RI provides support for Conditional Access.
– Portable implementation
– Compliant with CCIF 2.0 CA Support (Section 9.7)
– MPE and Java-level decrypt session constructs
– RI MPE CA management can be disabled via ini configuration (when disabled, CA management is done in the MPEOS porting layer)
MPE POD Manager Responsibilities
• Methods to initiate/terminate decrypt sessions
• CA session opened when the POD_READY indication is received
• All CA APDUs processed by the MPE POD Manager
• CA_PMT constructed for the selected Service when the MPE decrypt session is started
• Explicit LTSID assignment – the MPE POD Manager assigns a randomly generated LTSID when initiating a decrypt session
• LTSID is included in the CA session event and passed around to all layers to maintain the tuner-to-LTSID association
• All CA sessions on a tuner share the same LTSID
• CA sessions between clients are shared when possible (e.g. decode and buffering of the same program on a tuner)
MPE POD Manager Responsibilities
• CableCARD decrypt resource capabilities queried at start-up
• POD Manager internally keeps track of resources being used
• High-priority decrypt requests will pre-empt lower-priority requests (e.g. ServiceContext select is higher priority than DSMCC ServiceDomain attach)
• CP session is opened when a CA session is successfully started
• CP session is terminated when the CA session is stopped
• CCI support (can be signaled via the Java/MPE decrypt session)
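The pre-emption rule for decrypt resources can be illustrated with a small model. This is a hypothetical sketch only – the real MPE POD Manager tracks CableCARD resources natively in C, and the class, capacity, and priority values here are invented for illustration – but it shows the behavior described above: when no decrypt resource is free, a higher-priority request evicts the lowest-priority active session.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative model of priority-based decrypt resource pre-emption.
public class DecryptResourceSketch {
    static class Session {
        final String owner;
        final int priority; // higher value = higher priority
        Session(String owner, int priority) { this.owner = owner; this.priority = priority; }
    }

    private final int capacity;
    private final List<Session> active = new ArrayList<>();

    DecryptResourceSketch(int capacity) { this.capacity = capacity; }

    // Returns the started session, or null if the request lost the contention.
    Session request(String owner, int priority) {
        if (active.size() < capacity) {
            Session s = new Session(owner, priority);
            active.add(s);
            return s;
        }
        // Find the lowest-priority active session as the pre-emption victim.
        Session victim = active.get(0);
        for (Session s : active) {
            if (s.priority < victim.priority) victim = s;
        }
        if (victim.priority >= priority) {
            return null; // nothing lower-priority to evict
        }
        active.remove(victim);
        Session s = new Session(owner, priority);
        active.add(s);
        return s;
    }

    boolean isActive(String owner) {
        for (Session s : active) {
            if (s.owner.equals(owner)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        DecryptResourceSketch mgr = new DecryptResourceSketch(1);
        mgr.request("ServiceDomain.attach", 1);   // lower-priority use, e.g. DSMCC
        mgr.request("ServiceContext.select", 2);  // higher priority pre-empts it
        System.out.println(mgr.isActive("ServiceContext.select")); // true
        System.out.println(mgr.isActive("ServiceDomain.attach"));  // false
    }
}
```

The `main` method mirrors the slide's example: a ServiceContext select pre-empts a DSMCC ServiceDomain attach when only one decrypt resource is available.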
RI Conditional Access
CA session supported for
– JMF BroadcastSession (initiate decrypt session prior to calling mpe_mediaDecode())
– DVR (initiate decrypt session prior to mpe_dvrTsbBufferingStart())
– Object Carousel (initiate decrypt session before attaching a DSMCC service domain)
– Section Filter (initiate decrypt session before setting the native section filter)
RI Conditional Access
OCAP ODL Stack Basic Decrypt/Decode
[Sequence diagram (times t1–t7) summarizing the interactions:]
» DVBServiceContext.select() called for Service s1; mpe_mediaTune(freq1, mod1); transport stream section filtering session(s); mpe_dvrTsbConvertStart()
» Tune completes: MPE_TUNE_SYNC
» Section filtering for PAT/PMT initiated by the MPE SI Manager: mpe_filterSetFilter(), MPE_SF_EVENT_SECTION_FOUND
» Decrypt session: mpe_podStartDecrypt() – CA_PMT generated, randomly generated LTSID, program index table updated; mpe_stopDecrypt() on teardown
» Decode session: CA_PMT generated (not selected), program index table updated; mpe_mediaDecode() starts the JMF Broadcast Player – decode initiated with Locator-specified components/PIDs from the PMT, LTSID extracted from the decrypt session
» mpe_mediaStop(): JMF player stopped