Instructions to Playback a WebM DASH Presentation [obsolete]
Note: This page is obsolete and is here only for historic reference. For recommended instructions on how to adaptively stream WebM files using DASH, see this page: http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-adaptive-webm-using-dash
1. Introduction
Today there are typically two methods for delivering video over the Internet: RTSP and HTTP progressive download. Both have major shortcomings. RTSP is a stateful protocol that is hampered by connection issues and does not scale as easily as HTTP servers. HTTP progressive download cannot dynamically change the video as conditions on the client change. Client-side HTTP adaptive streaming addresses the major issues of both delivery methods while retaining their redeeming features. The client, as the end point in any streaming solution, has better insight into what is happening within the system and the network. Therefore, given different playback choices, the client can make the most informed decision on what to play so the user has the best playback experience.
Client-side adaptive streaming is what people typically think of when they hear adaptive streaming. There are many different formats for adaptive streaming, but they usually share the idea of one manifest file that describes how to retrieve the media data from an HTTP server.
2. Creating Files for the Presentation
2.1 Prerequisites
FFmpeg
A recent version of FFmpeg built with libvpx and libvorbis support.
libwebm
git clone https://gerrit.chromium.org/gerrit/p/webm/libwebm.git
You will need the sample_muxer application that is created with the libwebm project.
TODO(fgalligan): Add patches to FFmpeg to remove dependency on sample_muxer.
webm-tools
git clone https://gerrit.chromium.org/gerrit/p/webm/webm-tools.git
You will need the webm_dash_manifest command line application.
*Note* Only tested on Linux and Windows.
2.2 WebM and DASH Manifest Files
2.2.1 WebM DASH Manifest format
DASH is the manifest format the WebM DASH javascript player uses. An informative specification covering the use of WebM files in DASH is available here: https://sites.google.com/a/webmproject.org/wiki/adaptive-streaming/webm-dash-specification
2.2.2 WebM VOD adaptive streaming format
The WebM files for video-on-demand (VOD) adaptive streaming are non-chunked files with only one stream per file. For more information on why this format was chosen see this page http://wiki.webmproject.org/adaptive-streaming/webm-vod-baseline-format.
2.2.3 Differences from WebM Specification
The only difference is the addition of a Cues element to all files. Currently most WebM files have muxed audio and video streams with a seek index only for the video stream's key frames. A seek index is needed to get the best experience when switching streams: without a seek index for a stream it can be close to impossible to switch that stream without a discontinuity. Each WebM file will therefore need its own seek index.
2.2.4 Constraints imposed by the WebM DASH javascript player
The current player only supports the WebM On-Demand Profile.
To switch video streams with the current player, the video streams in an AdaptationSet must set @subsegmentStartsWithRAP = 1, @segmentAlignmentFlag = true, and @bitstreamSwitchingFlag = true.
This means that all WebM video streams within an AdaptationSet must satisfy the following:
All Cuepoint timecodes match across the video streams.
All Clusters that are at sync points start with a key frame.
All TrackIDs match across the video streams.
Currently only one audio stream is supported.
2.2.5 Creating demuxed video streams for the WebM DASH javascript player
*NOTE* If you want to change resolution you need to use Chrome version 18.0.980.0 or higher.
The first step is to encode a video-only WebM file. Below is a sample command line to create one with FFmpeg:
ffmpeg -i ${SOURCE} -vcodec libvpx -vb 250k -keyint_min 150 -g 150 -an ${FFMPEG_VIDEO}.webm
-i ${SOURCE} This is your source file.
-vcodec libvpx This tells FFmpeg to use libvpx (VP8) as the video encoder.
-vb 250k This tells FFmpeg to use 250 kilobits as the target datarate for the video encode.
-keyint_min 150 -g 150 This tells FFmpeg to use 150 frames as the minimum and maximum key frame interval. These values are used to align all of the sync points in the video streams to be switched. In the current WebM DASH javascript player these values must be identical.
-an This tells FFmpeg to not output an audio stream.
${FFMPEG_VIDEO}.webm This is your output file.
Unfortunately FFmpeg may not align the Clusters and Cues as needed by the WebM DASH javascript player. The next step is to run the encoded files through sample_muxer to align the sync points across the video streams. Below is a sample sample_muxer command line to align the sync points:
sample_muxer -i ${FFMPEG_VIDEO}.webm -o ${SAMPLE_MUXER_VIDEO}.webm
At this point your video files should be ready to create the manifest file.
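Since the encode-then-remux steps above must be repeated once per target bitrate, with identical key frame settings each time, they can be scripted. Below is a minimal sketch that builds the two command lines for each rendition; the source file name, bitrates, and output names are placeholders, and the commands are printed (dry run) rather than executed so you can review them first.

```shell
#!/bin/sh
# Build the FFmpeg and sample_muxer command lines for one video rendition.
# The source name, bitrate, and output name are placeholders.
make_rendition_cmds() {
  src="$1"; rate="$2"; out="$3"
  # The key frame interval (150) must be identical across all renditions so
  # the Cluster/Cue sync points line up for stream switching.
  echo "ffmpeg -i $src -vcodec libvpx -vb $rate -keyint_min 150 -g 150 -an tmp_$out"
  echo "sample_muxer -i tmp_$out -o $out"
}

# Print the commands for three renditions instead of running them:
for rate in 250k 500k 1000k; do
  make_rendition_cmds source.y4m "$rate" "video_$rate.webm"
done
```

Pipe the output to `sh` once the printed commands look right.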
2.2.6 Creating a demuxed audio stream for the WebM DASH javascript player
The first step is to encode an audio-only WebM file. Below is a sample command line to create one with FFmpeg:
ffmpeg -i ${SOURCE} -vn -acodec libvorbis -ab 128k ${FFMPEG_AUDIO}.webm
-i ${SOURCE} This is your source file.
-vn This tells FFmpeg to not output a video stream.
-acodec libvorbis This tells FFmpeg to use libvorbis as the audio encoder.
-ab 128k This tells FFmpeg to use 128 kilobits as the target datarate for the audio encode.
${FFMPEG_AUDIO}.webm This is your output file.
The next step is to run your audio-only WebM file through sample_muxer to add a Cues element, change the track number to a value different from the video streams' track number, and set the cluster duration. The WebM DASH javascript player needs a Cues element in the audio file in order to download the audio data. It also needs the audio WebM file to have a different track number than the video files because the Media Source API does not know how to handle separate files for audio and video streams. Below is a sample sample_muxer command line to create the audio file needed by the WebM DASH javascript player:
sample_muxer -i ${FFMPEG_AUDIO}.webm -o ${SAMPLE_MUXER_AUDIO}.webm -output_cues 1 -cues_on_audio_track 1 -max_cluster_duration 5 -audio_track_number 2
-i ${FFMPEG_AUDIO}.webm Input file.
-o ${SAMPLE_MUXER_AUDIO}.webm Output file.
-output_cues 1 Tells sample_muxer to output a Cues element.
-cues_on_audio_track 1 Tells sample_muxer to output the Cues element on the audio track.
-max_cluster_duration 5 Tells sample_muxer to set the maximum cluster duration to 5 seconds.
-audio_track_number 2 Tells sample_muxer to set the audio track id to 2.
At this point your audio file should be ready to create the manifest file.
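The two audio steps above can be scripted in the same dry-run style. The sketch below builds both command lines with the exact flag values described above; the file names are placeholders.

```shell
#!/bin/sh
# Build the command lines that produce the single demuxed audio stream.
# File names are placeholders; the flag values mirror the steps above.
make_audio_cmds() {
  src="$1"; out="$2"
  echo "ffmpeg -i $src -vn -acodec libvorbis -ab 128k tmp_$out"
  # Cues on the audio track, 5 second clusters, and track number 2 so the
  # audio track id differs from the video streams' track id.
  echo "sample_muxer -i tmp_$out -o $out -output_cues 1 -cues_on_audio_track 1 -max_cluster_duration 5 -audio_track_number 2"
}

# Print the commands instead of running them:
make_audio_cmds source.y4m audio.webm
```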
2.2.7 ***OPTIONAL*** Encrypting the video streams
You can optionally encrypt the video streams for playback in Chrome version 22+. *Note* Currently Chrome can only decode encrypted video streams. The command line tool webm_crypt is used to encrypt the video streams. The source code for webm_crypt is in the webm_crypt folder of the webm-tools project. To build webm_crypt follow the instructions in readme.txt.
The WebM DASH javascript player only supports video streams that are all encrypted with the same key. The code will look for a key file named key.bin in the same directory as dash_player.js.
The first step is to encrypt one of the WebM video files. This will also generate a key file.
webm_crypt -i ${FFMPEG_VIDEO}.webm -o ${ENCRYPTED_VIDEO}.webm
The key file generated will be named vid_base_secret.key. Rename vid_base_secret.key to key.bin.
Then encrypt the rest of the WebM video files with the key file (i.e. key.bin) generated in the first encryption step.
webm_crypt -i ${FFMPEG_VIDEO}.webm -o ${ENCRYPTED_VIDEO}.webm -video_options base_file=key.bin
Then when creating the manifest file use the encrypted WebM video files.
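The first-file-then-the-rest pattern above is easy to get wrong when encrypting several renditions, so here is a dry-run sketch that prints the webm_crypt commands in order; the rendition file names are placeholders.

```shell
#!/bin/sh
# Encrypt the first rendition (which generates vid_base_secret.key), rename
# the key to key.bin, then encrypt the remaining renditions with that key.
# Prints the commands instead of running them; file names are placeholders.
encrypt_cmds() {
  first=1
  for f in "$@"; do
    if [ "$first" -eq 1 ]; then
      echo "webm_crypt -i $f -o enc_$f"
      echo "mv vid_base_secret.key key.bin"
      first=0
    else
      echo "webm_crypt -i $f -o enc_$f -video_options base_file=key.bin"
    fi
  done
}

encrypt_cmds video_250k.webm video_500k.webm video_1000k.webm
```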
*NOTE* To enable playback of the encrypted content, enter “chrome://flags/” in the browser, scroll down until you see "Enable Encrypted Media Extensions on <video> elements. Mac, Windows, Linux, Chrome OS", and click the enable link.
2.2.8 Creating DASH manifest file for the WebM DASH javascript player
After your video and audio files have been encoded, you will need to create the manifest file with webm_dash_manifest, which references those files. Below is a sample command line to create a WebM DASH manifest file with three video streams and one audio stream:
webm_dash_manifest -o ${MANIFEST_OUTPUT}.xml \
-as id=0,lang=eng \
-r id=0,file=${SAMPLE_MUXER_VIDEO1}.webm \
-r id=1,file=${SAMPLE_MUXER_VIDEO2}.webm \
-r id=2,file=${SAMPLE_MUXER_VIDEO3}.webm \
-as id=1,lang=eng \
-r id=4,file=${SAMPLE_MUXER_AUDIO}.webm
-o ${MANIFEST_OUTPUT}.xml Output manifest file.
-as id=0,lang=eng Create an AdaptationSet.
-r id=0,file=${SAMPLE_MUXER_VIDEO1}.webm Add a Representation to the previous AdaptationSet that references video file 1.
-r id=1,file=${SAMPLE_MUXER_VIDEO2}.webm Add a Representation to the previous AdaptationSet that references video file 2.
-r id=2,file=${SAMPLE_MUXER_VIDEO3}.webm Add a Representation to the previous AdaptationSet that references video file 3.
-as id=1,lang=eng Create another AdaptationSet.
-r id=4,file=${SAMPLE_MUXER_AUDIO}.webm Add a Representation to the previous AdaptationSet that references the audio file.
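If the number of video renditions varies, the webm_dash_manifest command line can be assembled in a loop. The sketch below is an assumption-laden convenience, not part of the tool: it gives the Representations sequential ids on the premise that the ids only need to be unique (the example above uses 4 for the audio stream, but any unique value should work); file names are placeholders.

```shell
#!/bin/sh
# Assemble the webm_dash_manifest command for N video renditions plus one
# audio stream, and print it instead of running it.
make_manifest_cmd() {
  out="$1"; shift
  audio="$1"; shift
  cmd="webm_dash_manifest -o $out -as id=0,lang=eng"
  id=0
  # One Representation per video rendition in the first AdaptationSet.
  for v in "$@"; do
    cmd="$cmd -r id=$id,file=$v"
    id=$((id + 1))
  done
  # Second AdaptationSet holds the single audio Representation.
  cmd="$cmd -as id=1,lang=eng -r id=$id,file=$audio"
  echo "$cmd"
}

make_manifest_cmd manifest.xml audio.webm video_250k.webm video_500k.webm video_1000k.webm
```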
After webm_dash_manifest has run successfully you will have a DASH manifest XML file. From the example above you would have one Period with two AdaptationSets: one containing the three video files and one containing the audio file. At this point you should check the manifest to verify that the player will be able to switch streams seamlessly. To do this, look at the opening AdaptationSet element. Here is one example:
<AdaptationSet
id="0"
mimetype="video/webm"
codecs="vp8"
width="720"
height="306"
subsegmentAlignment="true"
subsegmentStartsWithSAP="1"
bitstreamSwitching="true">
The key information to check is that subsegmentAlignment="true", subsegmentStartsWithSAP="1", and bitstreamSwitching="true". If any of those values are different or missing, the WebM DASH javascript player will most likely not be able to play back the presentation and you will need to redo the steps from section 2.2.5.
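This check can be automated with a simple script. The sketch below assumes the three attributes appear verbatim in the manifest, as in the example above; a more thorough check would use a real XML parser.

```shell
#!/bin/sh
# Verify a generated manifest contains the three attribute values the
# WebM DASH javascript player needs for seamless stream switching.
check_manifest() {
  for attr in 'subsegmentAlignment="true"' \
              'subsegmentStartsWithSAP="1"' \
              'bitstreamSwitching="true"'; do
    if ! grep -q "$attr" "$1"; then
      echo "missing: $attr"
      return 1
    fi
  done
  echo "manifest looks switchable"
}

# Usage: check_manifest manifest.xml
```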
At this point you should be ready to try playing back the presentation in Chrome.
3. Playing the Presentation
3.1 Prerequisites
Chrome
You will need version 17.0.922.0+ for the Media Source API.
You will need version 18.0.890.0+ if your video is changing resolution.
WebM DASH javascript demo player
Web server that supports byte range requests.
3.2 WebM DASH JavaScript demo player layout
The files are laid out in a directory like this:
adaptive
bandwidth.js
Calculates the current download bandwidth.
bandwidth_manager.js
Used to emulate network download bandwidth.
dash-player-simple.html
Very simple example of playing a WebM DASH presentation.
dash-player.html
More complicated example of playing a WebM DASH presentation. Allows the user to set the emulated network download bandwidth and has a graph showing which clusters were downloaded.
dash_parser.js
Code to parse DASH manifest files.
dash_player.js
Main player code for WebM DASH presentation.
basic
webm-player.html
An example of playing a muxed WebM file.
webm_player.js
Main player code for playing muxed WebM files.
shared
http.js
Code used to download data from a Web server.
utils.js
Some general utility functions.
webm_parser.js
WebM parsing code.
webm_utils.js
WebM element IDs.
3.3 Hosting the files
The WebM DASH javascript player files must keep this directory layout, including all of the files from the adaptive and shared directories. The audio, video, and manifest files must be placed in the adaptive directory.
3.4 DASH-Player Web Page Information
(1) Text box to enter the manifest file.
(2) Video tag
(3) y-axis label for the graph, which is a list of video files represented by their average bandwidth.
(4) Graph showing which chunks were downloaded by the player. The x-axis is time of the presentation. Green indicates data was downloaded for a particular file over a time span. Red indicates no data was downloaded for a particular file over a time span.
(5) Text box to set the emulated video bandwidth in Kilobits per second.
3.5 Playing the Presentation
To enable the Media Source API, enter “chrome://flags/” in the browser, scroll down until you see “Enable Media Source API on <video> elements. Mac, Windows, Linux, Chrome OS”, and click the enable link.
Navigate Chrome to where you hosted “dash-player.html”. As a shortcut you can add “url=${MANIFEST_OUTPUT}.xml” to the query string of the player URL to put “${MANIFEST_OUTPUT}.xml” into the page’s text box, e.g. “http://www.example.com/adaptive/dash-player.html?url=manifest.xml”.
Press the “Start” button to load the manifest file and start playing the presentation.
3.6 Notes on WebM DASH JavaScript Demo Player
3.6.1 Emulating Download Bandwidth
An adaptive player can switch streams for a variety of reasons. The current player makes switching decisions based only on bandwidth. “bandwidth_manager.js” is a simple class used to emulate download bandwidth. The class is not very robust but is built for ease of use. When you want to test under real-world conditions we recommend using a bandwidth shaping application like Dummynet. http://info.iet.unipi.it/~luigi/dummynet/
3.6.2 Hello World Example
Look at “dash-player-simple.html” if you want to see a hello world type demo. Only two lines of JavaScript code are needed to start playback of a presentation:
wmp = new DashPlayer('manifest.xml', videoTag);
videoTag.src = videoTag.webkitMediaSourceURL;
The first line creates the DashPlayer object, giving it the manifest file and the video tag. The second line puts the video tag into Media Source API mode.