Building a simple MPEG-DASH streaming player

Build an MPEG-DASH player that uses the Media Source Extensions (MSE) API to stream files to an HTML5 video element. You can find the sample code on the MSDN samples site.

Getting started

Media Source Extensions (MSE) as described in the W3C spec adds buffer-based source options to HTML5 video for streaming support. Previously, you had to download a complete video file before playing or use an add-on like Silverlight or Adobe Flash to stream media. With MSE, no add-ons are required. The MediaSource object takes the place of the file URL as the src on a video object. Source buffers are appended to the MediaSource object and filled with media data from segmented files.

Segmented files can consist of a series of small single files, or a large file with indexed sections that are downloaded and played sequentially. If your app can grab short segments, it's easier to do other tasks, like inserting ads or other content into the video.

There are many ways to build an MPEG-DASH player. Our approach uses a SegmentList technique, which means we download and play video from a single internally segmented MP4 source file. Instead of downloading the whole file, this example requests, downloads, and plays short segments of video at 10-second intervals.

The code here uses the Media Source Extensions (MSE) API and HTML5 video element. Because the MSE spec is still in flux and some browsers use prefixes, this sample is built to run specifically in Internet Explorer 11. The sample also only uses MSE, so for deployment, you might want to provide a fallback such as Adobe Flash or Silverlight for browsers that don't support HTML5 video and MSE.

A quick tutorial on MSE

To use the MSE API, follow these steps:

  1. Define an HTML5 video element in the HTML section of a page.
  2. Create a MediaSource object in JavaScript.
  3. Create a virtual URL using createObjectURL with the MediaSource object as the source.
  4. Assign the virtual URL to the video element's src property.
  5. Create a SourceBuffer using addSourceBuffer, with the mime type of the video you're adding.
  6. Get the video initialization segment from the media file online and add it to the SourceBuffer with appendBuffer.
  7. Get the segments of video data from the media file, append them to the SourceBuffer with appendBuffer.
  8. Call the play method on the video element.
  9. Repeat step 7 until done.
  10. Clean up.

This is a bare-bones list of steps for using MSE. However, your player needs to know what kind of video file is coming and where the segments start and end. The MPEG-DASH specification defines the Media Presentation Description (MPD) file to provide that info about the video.

About the example

The example that we're building is a simple MPEG-DASH player. It uses a single file and retrieves byte ranges of video data from the file using XHR. It first retrieves the Media Presentation Description file and displays the parameters it includes as the Reported values. To keep the player's formatting consistent, we use CSS to set the video size to 640x480 pixels for any input. If the incoming video is larger than 640x480, it's scaled down; if smaller, it's scaled up.

The Current values are updated as the video plays. The index is the current segment number (as counted in the segmentList array). The Segment length is the actual time length of the segment that's just been returned. The actual segment length often differs from the segment length the MPD file describes. The Video time is the current place in the video file that's being played. None of the values need to be displayed; we're just providing them for reference. On a deployed webpage, you'd probably not have most of them showing.

To run the player, see the example online.

Media presentation description (MPD) files

The MPEG-DASH MPD is an XML file that contains a description of all the info you'll need to play a video file. We're only using a few of the features available in an MPD schema. The example retrieves the video mime type, the width and height, the segment duration, and the list of segment offsets (in bytes) in a single file.
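For reference, here's a simplified sketch of the kind of MPD this example reads. Only the elements the player queries are shown, and the attribute values (codec string, byte ranges, durations) are illustrative, not taken from the sample:

```xml
<?xml version="1.0"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <Period duration="PT0H1M59.89S">
    <AdaptationSet>
      <Representation id="1" mimeType="video/mp4" codecs="avc1.42c01e"
                      width="640" height="480" bandwidth="1000000">
        <BaseURL>sample_dash.mp4</BaseURL>
        <SegmentList timescale="1000" duration="10000">
          <!-- Byte range of the initialization segment -->
          <Initialization range="0-862"/>
          <!-- Byte ranges of the playable media segments -->
          <SegmentURL mediaRange="863-1426383"/>
          <SegmentURL mediaRange="1426384-2851671"/>
        </SegmentList>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```

The player reads BaseURL for the media file name, the Representation attributes for type and display info, the Initialization range for the setup segment, and each SegmentURL mediaRange for the segments to request.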

The MPD file we're using is created with the MP4Box command-line utility. MP4Box is an open source multimedia packaging tool by GPAC that can create a DASH-segmented MP4 and an associated MPD file. For more info about MP4Box and to download binaries, see GPAC MP4Box or view the GPAC general documentation.

To create a segmented MP4 and associated MPD file, start by installing MP4Box. Then, call MP4Box on the command line with this syntax:

mp4box -dash 10000 -frag 1000 -rap path\yourfile.mp4

MP4Box creates two files, an MP4 and an MPD file, with _dash appended to the names. In this example, it creates yourfile_dash.mp4 and yourfile_dash.mpd with 10-second segments and 1-second fragments. The -rap flag tells MP4Box to try to break segments on a keyframe or the start of a decoding sequence. While we ask for 10-second segments, the actual duration of each segment can vary; we'll touch on how to handle that later. For more info about MPD files, see the MPEG-DASH Tutorial.

Note  MPD files can have any extension. If you're publishing from a website that blocks files with unknown extensions, you can change the .mpd extension to .xml, for example. We did this for our online example to get around the blocking issue.
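If you control the server, another option is to register the MIME type instead of renaming the file. For example, on IIS you could add a mimeMap entry to web.config (this snippet is a sketch, not part of the sample; application/dash+xml is the registered MIME type for DASH manifests):

```xml
<configuration>
  <system.webServer>
    <staticContent>
      <!-- Serve .mpd manifests with the MPEG-DASH MIME type -->
      <mimeMap fileExtension=".mpd" mimeType="application/dash+xml" />
    </staticContent>
  </system.webServer>
</configuration>
```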

 

The HTML5 video element

The HTML portion of the example is very simple. It consists of a video element, a text input field, a button, and some <div> areas to display results. The video element has the autoplay attribute set, but no src or controls attributes. The source for the video element is set from the information we get from the MPD file, and play/pause is controlled with the Play button. The Play button's click event is handled in the JavaScript code with addEventListener() rather than an onclick attribute on the element itself.

<div id="grid">
  <div id="col1">
  <label>Enter .mpd file: 
      <input type="text" id="filename" value="sample_dash.xml" />
    </label> <button id="load">Play</button><br />
         
    <!-- Some areas to display info and content -->
    <div id="mydiv">
      <span id="myspan"><br />This demo requires Internet Explorer 11</span>
    </div>
    <div id="videoInfo"></div>
    <div>&nbsp;</div>
    <div id="curInfo"> 
      <h3>Current values:</h3>
      <ul>
        <li>Index: <span id="curIndex"></span> of <span id="numIndexes"></span></li>
        <li>Segment length: <span id="segLength"></span></li>
        <li>Video time: <span id="curTime"></span></li>
      </ul>
    </div> 
  </div>
  <div id="col2">
    <!-- Video element -->
    <video id="myVideo" autoplay="autoplay" >No video available</video>
          <div id="description">
            This example uses HTML5 video, Media Source Extensions, and MPEG-DASH files.<br /> 
            For more info see <a href="https://go.microsoft.com/fwlink/p/?LinkID=390962">Building a simple MPEG-DASH streaming player</a>. 
          </div>
  </div>
</div>

The architecture of the page puts the <script> tags below the HTML code in the <body> of the page. This adds efficiency by ensuring that the HTML elements have finished loading before the script starts to run.

Handling play and pause

To play a DASH file in the example, click the Play button. Because the intrinsic controls are turned off on the video element, the play button's event handler plays or pauses the current video. The play button's behavior is conditional:

  • If the video element is paused (its initial state), and the MPD file hasn't been loaded before, the handler calls the getData() function to load and parse the MPD file.
  • If the video is paused, but the file was loaded and hasn't changed, the handler calls only the play method.
  • If the file name in the input field has changed, and the video is paused, the handler loads and then plays the new file.
  • If the video is playing, the handler calls the pause method so the user can stop and start the video.
// Click event handler for load button    
playButton.addEventListener("click", function () {
  //  If video is paused then check for file change
  if (videoElement.paused == true) {
    // Retrieve mpd file, and set up video
    var curMpd = document.getElementById("filename").value;
    //  If current mpd file is different then last mpd file, load it.
    if (curMpd != lastMpd) {
      //  Cancel display of current video position
      window.cancelAnimationFrame(requestId);
      lastMpd = curMpd;
      getData(curMpd);
    } else {
      //  No change, just play
      videoElement.play();
    }
  } else {
    //  Video was playing, now pause it
    videoElement.pause();
  }
}, false);

To keep the button labels in sync with the state of the video element, the paused and playing events are used to handle switching the button's label between Play and Pause.

// Handler to switch button text to Play
videoElement.addEventListener("pause", function () {
  playButton.innerText = "Play";
}, false);

// Handler to switch button text to pause
videoElement.addEventListener("playing", function () {
  playButton.innerText = "Pause";
}, false);

Getting the .mpd file and DASH parameters

The MPD file is at the core of using DASH. The MPD is an XML file that describes how the media is segmented, the type and codec (MP4 here), the bit rate, length, and basic segment size of the video. Some MPD files include audio info, and you can split content into separate streams for the video and audio players. The example shown here only uses a single buffer for both video and audio.

// Gets the mpd file and parses it    
function getData(url) {
  if (url !== "") {
    var xhr = new XMLHttpRequest(); // Set up xhr request
    xhr.open("GET", url, true); // Open the request          
    xhr.responseType = "text"; // Set the type of response expected
    xhr.send();

    //  Asynchronously wait for the data to return
    xhr.onreadystatechange = function () {
      if (xhr.readyState == xhr.DONE) {
        var tempoutput = xhr.response;
        var parser = new DOMParser(); //  Create a parser object 

        // Create an xml document from the .mpd file for searching
        var xmlData = parser.parseFromString(tempoutput, "text/xml");
        log("parsing mpd file");

        // Get and display the parameters of the .mpd file
        getFileType(xmlData);

        // Set up video object, buffers, etc  
        setupVideo();

        // Initialize a few variables on reload
        clearVars();
      }
    }

    // Report errors if they happen during xhr
    xhr.addEventListener("error", function (e) {
      log("Error: " + e + " Could not load url.");
    }, false);
  }
}

This example uses the XMLHttpRequest object to retrieve the MPD file; the response is stored in tempoutput. We create a DOMParser object to parse the MPD data into an XML document (xmlData) that we can search with the querySelectorAll and getAttribute methods to extract the values of the XML nodes in the MPD file.

// Retrieve parameters from our stored .mpd file
function getFileType(data) {
  try {
    file = data.querySelectorAll("BaseURL")[0].textContent.toString();
    var rep = data.querySelectorAll("Representation");
    type = rep[0].getAttribute("mimeType");
    codecs = rep[0].getAttribute("codecs");
    width = rep[0].getAttribute("width");
    height = rep[0].getAttribute("height");
    bandwidth = rep[0].getAttribute("bandwidth");

    var ini = data.querySelectorAll("Initialization");
    initialization = ini[0].getAttribute("range");
    segments = data.querySelectorAll("SegmentURL");

    // Get the length of the video per the .mpd file
    //   since the video.duration will always say infinity
    var period = data.querySelectorAll("Period");
    var vidTempDuration = period[0].getAttribute("duration");
    vidDuration = parseDuration(vidTempDuration); // display length

    var segList = data.querySelectorAll("SegmentList");
    segDuration = segList[0].getAttribute("duration");

  } catch (er) {
    log(er);
    return;
  }
  showTypes();  // Display parameters 
}

// Display parameters from the .mpd file
function showTypes() {
  var display = document.getElementById("myspan");
  var spanData;
  spanData = "<h3>Reported values:</h3><ul><li>Media file: " + file + "</li>";
  spanData += "<li>Type: " + type + "</li>";
  spanData += "<li>Codecs: " + codecs + "</li>";
  spanData += "<li>Width: " + width + " -- Height: " + height + "</li>";
  spanData += "<li>Bandwidth: " + bandwidth + "</li>";
  spanData += "<li>Initialization Range: " + initialization + "</li>";
  spanData += "<li>Segment length: " + segDuration / 1000 + " seconds</li>";
  spanData += "<li>" + vidDuration + "</li>";
  spanData += "</ul>";
  display.innerHTML = spanData;
  document.getElementById("numIndexes").innerHTML = segments.length;
  document.getElementById("curInfo").style.display = "block";
}

The getData() function calls getFileType(), which fills global variables with the information from the MPD file. We then call showTypes() to display the parameters on the screen.
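One function getFileType() calls but the listing doesn't show is parseDuration(), which converts the Period element's ISO 8601 duration string into display text. Here's a minimal sketch that handles only the PT#H#M#S form MP4Box writes (such as PT0H1M59.89S); the output format is an assumption, not the sample's exact text:

```javascript
// Convert an ISO 8601 duration from the MPD's Period element into a
// display string. Handles only the PT#H#M#S form, not the full grammar.
function parseDuration(pt) {
  var match = /^PT(?:(\d+(?:\.\d+)?)H)?(?:(\d+(?:\.\d+)?)M)?(?:(\d+(?:\.\d+)?)S)?$/.exec(pt);
  if (!match) {
    return "Video length: unknown";
  }
  var hours = parseFloat(match[1] || 0);
  var minutes = parseFloat(match[2] || 0);
  var seconds = parseFloat(match[3] || 0);
  var total = (hours * 3600) + (minutes * 60) + seconds;
  return "Video length: " + total.toFixed(2) + " seconds";
}
```

For example, parseDuration("PT0H1M59.89S") returns "Video length: 119.89 seconds".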

Setting up video and buffers

After the MPD file has been parsed, the player retrieves and plays the media content pointed to in the MPD file. The player creates a segmentList array that contains the ranges for each segment. This array is accessed using the index variable.

When getting and playing video data, timing is very important. In an app that plays only at a single resolution, the number of segments you put into the buffer isn't a big concern; however, you don't want to use up too much memory. When playing a file with multiple levels of quality, you'll want to be a little more careful. If you're playing low resolution because of a slow network, you'll want to be ready to download a higher resolution video segment the next time the network speeds increases. In this case, you might not want to get too far ahead of the current playing segment.
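The sample plays a single Representation, but an MPD can list several at different bandwidths. To illustrate the idea, a multi-bitrate player might pick, before each request, the highest bandwidth the measured network speed can sustain. This helper is purely illustrative; the function name and numbers are hypothetical, not part of the sample:

```javascript
// Pick the highest Representation bandwidth at or below the measured
// network speed; fall back to the lowest quality if nothing fits.
function pickBandwidth(available, measuredBps) {
  var sorted = available.slice().sort(function (a, b) { return a - b; });
  var choice = sorted[0]; // fall back to the lowest quality
  for (var i = 0; i < sorted.length; i++) {
    if (sorted[i] <= measuredBps) {
      choice = sorted[i]; // highest rate that still fits
    }
  }
  return choice;
}
```

For example, pickBandwidth([500000, 1000000, 3000000], 1200000) returns 1000000.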

The play process goes like this:

  1. Download the video's initialization segment to the buffer and play it.
  2. Download a segment of video to the buffer, and play it.
  3. Repeat step 2 until all segments have been played.

DASH media segments are downloaded and appended to the buffer, which is then played by the HTML5 audio or video elements. The MediaSource buffer takes the place of a file URL for the src of these elements. The addSourceBuffer method creates and adds a buffer to the MediaSource object, the removeSourceBuffer method removes an existing SourceBuffer from it, and the appendBuffer method adds media data to a SourceBuffer.

// Create mediaSource and initialize video 
function setupVideo() {
  clearLog(); // Clear console log

  //  Create the media source 
  if (window.MediaSource) {
    mediaSource = new window.MediaSource();
  } else {
    log("mediasource or syntax not supported");
    return;
  }
  var url = URL.createObjectURL(mediaSource);
  videoElement.pause();
  videoElement.src = url;
  videoElement.width = width;
  videoElement.height = height;

  // Wait for event that tells us that our media source object is 
  //   ready for a buffer to be added.
  mediaSource.addEventListener('sourceopen', function (e) {
    try {
      videoSource = mediaSource.addSourceBuffer('video/mp4');
      initVideo(initialization, file);           
    } catch (e) {
      log('Exception calling addSourceBuffer for video', e);
      return;
    }
  }, false);
}

To get individual segments from a single video file, we use setRequestHeader to specify a byte range for each segment in the file. The XHR response is wrapped in a Uint8Array and appended to the source buffer.

The initVideo() function in the example downloads the initialization segment from the .mp4 file and puts it into the SourceBuffer. The XHR requests are asynchronous, so to ensure functions are called at the right time, the readystatechange event is used. When readystatechange fires, the readyState property is checked; if it equals xhr.DONE, the response attribute (the media data) is appended to the source buffer as a Uint8Array.

//  Load video's initialization segment 
function initVideo(range, url) {
  var xhr = new XMLHttpRequest();
  if (range || url) { // make sure we've got incoming params

    // Set the desired range of bytes we want from the mp4 video file
    xhr.open('GET', url);
    xhr.setRequestHeader("Range", "bytes=" + range);
    xhr.responseType = 'arraybuffer'; // Set before send() per the XHR spec
    segCheck = (timeToDownload(range) * .8).toFixed(3); // use .8 as fudge factor
    xhr.send();
    try {
      xhr.addEventListener("readystatechange", function () {
         if (xhr.readyState == xhr.DONE) { // wait for video to load
          // Add response to buffer
          try {
            videoSource.appendBuffer(new Uint8Array(xhr.response));
            // Wait for the update complete event before continuing
            videoSource.addEventListener("update",updateFunct, false);

          } catch (e) {
            log('Exception while appending initialization content', e);
          }
        }
      }, false);
    } catch (e) {
      log(e);
    }
  } else {
    return; // No value for range or url
  }
}

function updateFunct() {
  //  This is a one shot function, when init segment finishes loading, 
  //    update the buffer flag, call getStarted, and then remove this event.
  bufferUpdated = true;
  getStarted(file); // Get video playback started
  //  Now that video has started, remove the event listener
  videoSource.removeEventListener("update", updateFunct);
}

The SourceBuffer's update event is used to see when the data has finished loading. When the media has finished loading into the SourceBuffer, the update event fires and calls updateFunct(). The updateFunct() function sets the bufferUpdated flag, which is used later to check that the initialization content from the MP4 file has actually loaded first. The function then calls getStarted(), which starts loading the video segments. Finally, the update event handler is removed to prevent it from being called on every segment.

Feeding the buffer

After the initialization data is loaded, the media segments start to load and play. In this example, the first segment of data is loaded outside of the regular play loop. This is a small workaround to get the video started because the loop that drives the segment request and buffer maintenance is based on the video playing. At initialization, the video is paused. After the first segment is loaded and is playing, the video update method is called and the loop starts.

To keep the video element playing, media segments are requested based on the time length of the current segment. The example uses a 20% fudge factor to ensure the content gets downloaded in time: if the current segment has 10 seconds of video, the next segment is requested after 8 seconds, at 80% of the segment's length. This leaves a small amount of extra time to request the segment, but doesn't use up memory too quickly.

To calculate the time length of the current segment, we use the formula: time = (size * 8) / bitrate. The time is in seconds, size is in bytes, and bitrate is in bits-per-second. The bitrate of the media file is specified by the MPD file as bandwidth, and size is the current byte range's end minus the start.

In the example, the byte range is stored in the MPD file in the format of xxxx-yyyy, or start-end. The example here splits the string and subtracts the start from the end to get the size in bytes of the current segment. That value is multiplied by 8 to convert bytes to bits, and then divided by the bandwidth. The result is the time in seconds that the current segment takes to play. That value is multiplied by .8, or 80%, and stored in the segCheck global variable used to calculate when to get the next segment.

function timeToDownload(range) {
  var vidDur = range.split("-");
  // Time = size * 8 / bitrate
  return (((vidDur[1] - vidDur[0]) * 8) / bandwidth)
}
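Plugging hypothetical numbers into this formula shows how the segCheck value is derived. Here a 2,500,000-byte segment and a reported bandwidth of 2,000,000 bits per second are assumed for illustration:

```javascript
var bandwidth = 2000000; // bits per second, as reported by the MPD

// Same formula as the sample: time = size * 8 / bitrate
function timeToDownload(range) {
  var vidDur = range.split("-");
  return (((vidDur[1] - vidDur[0]) * 8) / bandwidth);
}

// (2502000 - 2000) bytes * 8 bits / 2000000 bps = 10 seconds of video
var playTime = timeToDownload("2000-2502000");
// Request the next segment at 80% of that: "8.000" seconds
var segCheck = (playTime * .8).toFixed(3);
```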

It might seem like overkill to calculate the time length of each segment when the MPD file gives us the duration of the segments as a parameter. Unfortunately, the duration parameter is only a suggestion. In practice, segments are often shorter or longer than the stated duration; the actual value depends on how the DASH MP4 file was segmented. The segmenting tool tries to break segments on keyframes, so the timing depends on how often the video compression inserts a keyframe. In compressed video such as MP4, a keyframe is a fully rendered frame, followed by a series of frames that contain only the changes for movement in the frame. The frequency of keyframes varies with the amount of change, either action within a frame or a scene change.

The playback loop uses the video element's timeupdate event to drive when to get the next segment. When the event fires, it calls the fileChecks() function.

The fileChecks() function first compares the current index with the total number of elements in the array of segments. If we're not at the end of the media (no more segments), fileChecks() calculates how long the current segment has been playing and compares that to the segCheck value calculated earlier. If the elapsed time is greater than or equal to segCheck, the next segment of media data is requested.

This loop continues until all the segments have been loaded and played. When the index matches the number of segments, the removeEventListener method is called to stop the timeupdate event.

//  Get video segments 
function fileChecks() {
  // If we're ok on the buffer, then continue
  if (bufferUpdated == true) {
    if (index < segments.length) {
      // Loads next segment when time is close to the end of the last loaded segment 
      if ((videoElement.currentTime - lastTime) >= segCheck) {
        playSegment(segments[index].getAttribute("mediaRange").toString(), file);
        lastTime = videoElement.currentTime;
        curIndex.textContent = index + 1; // Display current index    
        index++;
      }
    } else {
      videoElement.removeEventListener("timeupdate", fileChecks, false);
    }
  }
}

The playSegment() function downloads the media data and puts it into the source buffer. The function is called with the media byte range for the segment and the URL of the MP4 file.

//  Play segment plays a byte range (format nnnn-nnnnn) of a media file    
function playSegment(range, url) {
  var xhr = new XMLHttpRequest();
  if (range || url) { // Make sure we've got incoming params
    xhr.open('GET', url);
    xhr.setRequestHeader("Range", "bytes=" + range);
    xhr.responseType = 'arraybuffer'; // Set before send() per the XHR spec
    xhr.send();
    try {
      xhr.addEventListener("readystatechange", function () {
        if (xhr.readyState == xhr.DONE) { //wait for video to load
          //  Calculate when to get next segment based on time of current one
            segCheck = (timeToDownload(range) * .8).toFixed(3); // Use .8 as fudge factor
            segLength.textContent = segCheck;
          // Add received content to the buffer
          try {
            videoSource.appendBuffer(new Uint8Array(xhr.response));
          } catch (e) {
            log('Exception while appending', e);
          }
        }
      }, false);
    } catch (e) {
      log(e);
      return;
    }
  }
}

To sum up, after the initialization process is complete, the timeupdate event drives the download and playback of segments. When the current segment has played approximately 80% of the way through, another segment is downloaded and added to the buffer, and playback continues.

Where to go from here

The example presented here shows how to create and attach buffers to the HTML5 video element and read one type of MPD file to get segments of video from a single file. As we've said, you can also use an MPD file to describe a number of small video files rather than segments in a single larger file. To work with that type of MPD setup, you can modify the code that reads the segment section of the MPD file to get individual URLs. This eliminates the need to use setRequestHeader because you'd be getting the whole file with XHR. However, you'd want to calculate how many bytes were received in the file, rather than doing the math on the byte range of the segment.

As browsers catch up with modern standards, you could also rewrite some of the more critical timing code with promises. Promises let you chain asynchronous operations cleanly; the sample handles this sequencing with flags instead.

Rather than writing all this code yourself, take a look at the dash.js library and reference player. Dash.js is an open source library and player that is supported by many industry media companies, including Microsoft. Dash.js is a modular library with components that can be replaced or rewritten as needed. For large companies, this gives the flexibility of creating modules that handle special needs. For more info see dash.js on GitHub.

If you need to stream large quantities of video, add digital rights management (DRM), or charge for your content, take a look at Microsoft Azure. Azure provides a complete media services platform that supports HTML5 media and MSE, as well as fallback to Smooth Streaming content on a large number of devices. For more info, see Azure Media Services.

Complete example code

You can see this example in action on the MSDN samples site.

The architecture of the sample uses Cascading Style Sheets (CSS) in the <head> section, then HTML code in the <body> section, followed by the <script> code, also in the <body>. The script content is put after the HTML to ensure that the HTML elements are loaded before the scripts start. Alternatively, you could put your script in a function in the <head> element and call it with an onload event.

The .mpd file extension is renamed to .xml so the sample works on a server that doesn't recognize MPD as a legitimate MIME type.

Here's the complete code listing:

<!DOCTYPE html>
<html>
<!-- Media streaming example
  Reads an .mpd file created using mp4box and plays the file
-->     
<head>
  <meta charset="utf-8" />
  <title>Media streaming example</title>
  <style>
    /* CSS code to format the parameter list and elements */
    body {
      font-family:'Segoe UI';
      background-color:#f0f0f0;
    }
    /* Set up parameter and video display*/
    #myDiv {
      /*display: block;*/
      width: 600px;
      height: 400px;
      overflow: auto;      
    }
    /* Zoom or shrink video based on the native size of the video*/
    video {
      width:640px;
      height:480px;
      border:1px black solid;
    }
    #description {
      display: block;
      width:640px;      
    }

    #grid {
      display: -ms-grid;
      -ms-grid-columns: 35% 65%;
      -ms-grid-rows: 1fr;
    }    

    #col1 {
      -ms-grid-row: 1;
      -ms-grid-column: 1;
      padding:20px;
    }
    #col2 {
      -ms-grid-row: 1;
      -ms-grid-column: 2;
    }
    #curInfo{
      display:none;
    }
  </style>
</head>

<body>     
  <h1>Simple MPEG-DASH Streaming player</h1>

  <div id="grid">
    <div id="col1">
    <label>Enter .mpd file: 
        <input type="text" id="filename" value="sample_dash.xml" />
      </label> <button id="load">Play</button><br />
           
      <!-- Some areas to display info and content -->
      <div id="mydiv">
        <span id="myspan"><br />This demo requires Internet Explorer 11</span>
      </div>
      <div id="videoInfo"></div>
      <div>&nbsp;</div>
      <div id="curInfo"> 
        <h3>Current values:</h3>
        <ul>
          <li>Index: <span id="curIndex"></span> of <span id="numIndexes"></span></li>
          <li>Segment length: <span id="segLength"></span></li>
          <li>Video time: <span id="curTime"></span></li>
        </ul>
      </div> 
    </div>
    <div id="col2">
      <!-- Video element -->
      <video id="myVideo" autoplay="autoplay" >No video available</video>
            <div id="description">
              This example uses HTML5 video, Media Source Extensions, and MPEG-DASH files.<br /> 
              For more info see <a href="https://go.microsoft.com/fwlink/p/?LinkID=390962">Building a simple MPEG-DASH streaming player</a>. 
            </div>
    </div>
  </div>

  <!-- script section -->
  <script>
    'use strict'
    // Global Parameters from .mpd file
    var file;  // MP4 file
    var type;  // Type of file
    var codecs; //  Codecs allowed
    var width;  //  Native width and height
    var height;

    // Elements
    var videoElement = document.getElementById('myVideo');
    var playButton = document.getElementById("load");
    videoElement.poster = "poster.png";

    // Description of initialization segment, and approx segment lengths 
    var initialization;
    var segDuration;
    var vidDuration;

    // Video parameters
    var bandwidth; // bitrate of video

    // Parameters to drive segment loop
    var index = 0; // Segment to get
    var segments;
    var curIndex = document.getElementById("curIndex"); // Playing segment
    var segLength = document.getElementById("segLength");

    // Source and buffers
    var mediaSource;
    var videoSource;

    // Parameters to drive fetch loop
    var segCheck;
    var lastTime = 0;
    var bufferUpdated = false;

    // Flags to keep things going 
    var lastMpd = "";
    var vTime = document.getElementById("curTime");
    var requestId = 0;

    // Click event handler for load button    
    playButton.addEventListener("click", function () {
      //  If video is paused then check for file change
      if (videoElement.paused == true) {
        // Retrieve mpd file, and set up video
        var curMpd = document.getElementById("filename").value;
        //  If current mpd file is different then last mpd file, load it.
        if (curMpd != lastMpd) {
          //  Cancel display of current video position
          window.cancelAnimationFrame(requestId);
          lastMpd = curMpd;
          getData(curMpd);
        } else {
          //  No change, just play
          videoElement.play();
        }
      } else {
        //  Video was playing, now pause it
        videoElement.pause();
      }
    }, false);

    // Do a little trickery, start video when you click the video element
    videoElement.addEventListener("click", function () {
      playButton.click();
    }, false);

    // Event handler for the video element errors
    document.getElementById("myVideo").addEventListener("error", function (e) {
      log("video error: " + e.message);
    }, false);


    // Gets the mpd file and parses it    
    function getData(url) {
      if (url !== "") {
        var xhr = new XMLHttpRequest(); // Set up xhr request
        xhr.open("GET", url, true); // Open the request          
        xhr.responseType = "text"; // Set the type of response expected
        xhr.send();

        //  Asynchronously wait for the data to return
        xhr.onreadystatechange = function () {
          if (xhr.readyState == xhr.DONE) {
            var tempoutput = xhr.response;
            var parser = new DOMParser(); //  Create a parser object 

            // Create an xml document from the .mpd file for searching
            var xmlData = parser.parseFromString(tempoutput, "text/xml");
            log("parsing mpd file");

            // Get and display the parameters of the .mpd file
            getFileType(xmlData);

            // Set up video object, buffers, etc  
            setupVideo();

            // Initialize a few variables on reload
            clearVars();
          }
        }

        // Report errors if they happen during xhr
        xhr.addEventListener("error", function (e) {
          log("Error: " + e + " Could not load url.");
        }, false);
      }
    }

    // Retrieve parameters from our stored .mpd file
    function getFileType(data) {
      try {
        file = data.querySelectorAll("BaseURL")[0].textContent.toString();
        var rep = data.querySelectorAll("Representation");
        type = rep[0].getAttribute("mimeType");
        codecs = rep[0].getAttribute("codecs");
        width = rep[0].getAttribute("width");
        height = rep[0].getAttribute("height");
        bandwidth = rep[0].getAttribute("bandwidth");

        var ini = data.querySelectorAll("Initialization");
        initialization = ini[0].getAttribute("range");
        segments = data.querySelectorAll("SegmentURL");

        // Get the length of the video per the .mpd file
        //   since the video.duration will always say infinity
        var period = data.querySelectorAll("Period");
        var vidTempDuration = period[0].getAttribute("duration");
        vidDuration = parseDuration(vidTempDuration); // display length

        var segList = data.querySelectorAll("SegmentList");
        segDuration = segList[0].getAttribute("duration");

      } catch (er) {
        log(er);
        return;
      }
      showTypes();  // Display parameters 
    }

    // Display parameters from the .mpd file
    function showTypes() {
      var display = document.getElementById("myspan");
      var spanData;
      spanData = "<h3>Reported values:</h3><ul><li>Media file: " + file + "</li>";
      spanData += "<li>Type: " + type + "</li>";
      spanData += "<li>Codecs: " + codecs + "</li>";
      spanData += "<li>Width: " + width + " -- Height: " + height + "</li>";
      spanData += "<li>Bandwidth: " + bandwidth + "</li>";
      spanData += "<li>Initialization Range: " + initialization + "</li>";
      spanData += "<li>Segment length: " + segDuration / 1000 + " seconds</li>";
      spanData += "<li>" + vidDuration + "</li>";
      spanData += "</ul>";
      display.innerHTML = spanData;
      document.getElementById("numIndexes").innerHTML = segments.length;
      document.getElementById("curInfo").style.display = "block";
    }


    function render() {
      // Display current video position
      vTime.innerText = formatTime(videoElement.currentTime);
      // Recall this function when available 
      requestId = window.requestAnimationFrame(render);
    }

    // Create mediaSource and initialize video 
    function setupVideo() {
      clearLog(); // Clear console log

      //  Create the media source 
      if (window.MediaSource) {
        mediaSource = new window.MediaSource();
      } else {
        log("MediaSource is not supported");
        return;
      }
      var url = URL.createObjectURL(mediaSource);
      videoElement.pause();
      videoElement.src = url;
      videoElement.width = width;
      videoElement.height = height;

      // Wait for event that tells us that our media source object is 
      //   ready for a buffer to be added.
      mediaSource.addEventListener('sourceopen', function (e) {
        try {
          // Build the full MIME type from the values parsed out of the .mpd file
          videoSource = mediaSource.addSourceBuffer(type + '; codecs="' + codecs + '"');
          initVideo(initialization, file);
        } catch (e) {
          log('Exception calling addSourceBuffer for video: ' + e);
          return;
        }
      }, false);

      // Handler to switch button text to Play
      videoElement.addEventListener("pause", function () {
        playButton.innerText = "Play";
      }, false);

    // Handler to switch button text to Pause
      videoElement.addEventListener("playing", function () {
        playButton.innerText = "Pause";
      }, false);
      // Remove the handler for the timeupdate event
      videoElement.addEventListener("ended", function () {
        videoElement.removeEventListener("timeupdate", checkTime);
      }, false);
    }

    //  Load the video's initialization segment 
    function initVideo(range, url) {
      var xhr = new XMLHttpRequest();
      if (range || url) { // Make sure we've got incoming params

        // Request only the desired range of bytes from the mp4 video file
        xhr.open('GET', url);
        xhr.setRequestHeader("Range", "bytes=" + range);
        xhr.responseType = 'arraybuffer'; // Must be set before send()
        segCheck = (timeToDownload(range) * .8).toFixed(3); // Use .8 as a fudge factor
        xhr.send();
        try {
          xhr.addEventListener("readystatechange", function () {
            if (xhr.readyState == xhr.DONE) { // Wait for the segment to load
              // Add the response to the buffer
              try {
                videoSource.appendBuffer(new Uint8Array(xhr.response));
                // Wait for the update event before continuing
                videoSource.addEventListener("update", updateFunct, false);

              } catch (e) {
                log('Exception while appending initialization content: ' + e);
              }
            }
          }, false);
        } catch (e) {
          log(e);
        }
      } else {
        return; // No value for range or url
      }
    }
    
    function updateFunct() {
      //  This is a one-shot function: when the init segment finishes loading, 
      //    update the buffer flag, call getStarted, and then remove this event.
      bufferUpdated = true;
      getStarted(file); // Get video playback started
      //  Now that video has started, remove the event listener
      videoSource.removeEventListener("update", updateFunct);
    }

    //  Play our file segments
    function getStarted(url) {

      //  Start by loading the first segment of media
      playSegment(segments[index].getAttribute("mediaRange").toString(), url);

      // Start showing video time
      requestId = window.requestAnimationFrame(render);

      // Display current index
      curIndex.textContent = index + 1;
      index++;

      //  From here on, reload the buffer with the next segment shortly before playback reaches the end of the current one
      videoElement.addEventListener("timeupdate", fileChecks, false);

    }
    //  Get video segments 
    function fileChecks() {
      // If we're ok on the buffer, then continue
      if (bufferUpdated == true) {
        if (index < segments.length) {
          // Loads next segment when time is close to the end of the last loaded segment 
          if ((videoElement.currentTime - lastTime) >= segCheck) {
            playSegment(segments[index].getAttribute("mediaRange").toString(), file);
            lastTime = videoElement.currentTime;
            curIndex.textContent = index + 1; // Display current index    
            index++;
          }
        } else {
          videoElement.removeEventListener("timeupdate", fileChecks, false);
        }
      }
    }

    //  Play segment plays a byte range (format nnnn-nnnnn) of a media file    
    function playSegment(range, url) {
      var xhr = new XMLHttpRequest();
      if (range || url) { // Make sure we've got incoming params
        xhr.open('GET', url);
        xhr.setRequestHeader("Range", "bytes=" + range);
        xhr.responseType = 'arraybuffer'; // Must be set before send()
        xhr.send();
        try {
          xhr.addEventListener("readystatechange", function () {
            if (xhr.readyState == xhr.DONE) { //wait for video to load
              //  Calculate when to get the next segment based on the time of the current one
              segCheck = (timeToDownload(range) * .8).toFixed(3); // Use .8 as a fudge factor
              segLength.textContent = segCheck;
              // Add received content to the buffer
              try {
                videoSource.appendBuffer(new Uint8Array(xhr.response));
              } catch (e) {
                log('Exception while appending: ' + e);
              }
            }
          }, false);
        } catch (e) {
          log(e);
          return; // Request failed
        }
      }
    }

    //  Logs messages to the console
    function log(s) {
      //  Send to the console;
      //    you could also surface these messages in the UI
      console.log(s);
    }

    //  Clears the log
    function clearLog() {
      console.clear();
    }

    function clearVars() {
      index = 0;
      lastTime = 0;
    }

    function timeToDownload(range) {
      var vidDur = range.split("-");
      // Time = size * 8 / bitrate
      return ((vidDur[1] - vidDur[0]) * 8) / bandwidth;
    }

    // Converts mpd time to human-readable time
    function parseDuration(pt) {
      // Parse time from the format "PT#H#M##.##S"
      var ptTemp = pt.split("T")[1];
      ptTemp = ptTemp.split("H");
      var hours = ptTemp[0];
      var minutes = ptTemp[1].split("M")[0];
      var seconds = ptTemp[1].split("M")[1].split("S")[0];
      var hundredths = seconds.split(".");
      //  Display the length of the video (taken from the .mpd file, since video.duration reports Infinity)
      return "Video length: " + pZ(hours, 2) + ":" + pZ(minutes, 2) + ":" + pZ(hundredths[0], 2) + "." + hundredths[1];
    }


    //  Converts time in seconds into a string HH:MM:SS.ss
    function formatTime(timeSec) {
      var seconds = timeSec % 60;                                 // Get seconds portion                   
      var minutes = ((timeSec - seconds) / 60) % 60;              // Get minutes portion
      var hours = ((timeSec - seconds - (minutes * 60))) / 3600;  // Get hours portion
      seconds = seconds.toFixed(2);   // Restrict to 2 places (hundredths of seconds)
      var dispSeconds = seconds.toString().split(".");
      return (pZ(hours, 2) + ":" + pZ(minutes, 2) + ":" + pZ(dispSeconds[0], 2) + "." + pZ(dispSeconds[1], 2));
    }

    //  Pad digits with zeros if needed 
    function pZ(value, padCount) {
      var tNum = value + '';
      while (tNum.length < padCount) {
        tNum = "0" + tNum;
      }
      return tNum;
    }
  </script>
</body>
</html>
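The `getFileType` function above pulls its values out of the manifest with `querySelectorAll`, so it expects an .mpd file shaped roughly like the illustrative fragment below. The element and attribute names (`BaseURL`, `Representation`, `Initialization`, `SegmentURL`, `Period`, `SegmentList`) are the ones the code queries; all values are placeholders, and the `timescale="1000"` assumption matches the sample's `segDuration / 1000` display.

```xml
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static">
  <Period duration="PT0H1M52.43S">
    <AdaptationSet>
      <Representation mimeType="video/mp4" codecs="avc1.42c01e"
                      width="640" height="360" bandwidth="1000000">
        <BaseURL>video.mp4</BaseURL>
        <SegmentList timescale="1000" duration="10000">
          <Initialization range="0-862"/>
          <SegmentURL mediaRange="863-1437792"/>
          <SegmentURL mediaRange="1437793-2874504"/>
        </SegmentList>
      </Representation>
    </AdaptationSet>
  </Period>
</MPD>
```

The initialization segment is fetched by the `Initialization` element's byte range, and each `SegmentURL` element's `mediaRange` drives one `playSegment` call.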

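The segment-scheduling math in `timeToDownload` (Time = size * 8 / bitrate) and its 0.8 fudge factor can be tried in isolation. This sketch uses an illustrative helper name, `downloadSeconds`, and placeholder numbers:

```javascript
// Estimated seconds to download a byte range "start-end" at a given
// bandwidth in bits per second (the sample's Time = size * 8 / bitrate).
function downloadSeconds(range, bandwidthBps) {
  var bounds = range.split("-");
  // Subtraction coerces the string bounds to numbers
  return ((bounds[1] - bounds[0]) * 8) / bandwidthBps;
}

// A 1,000,000-byte segment at an 8 Mbps representation takes about
// 1 second to download; the sample schedules the next fetch at 80%
// of that estimate, i.e. 0.8 seconds into the current segment.
var estimate = downloadSeconds("0-1000000", 8000000);
var segCheck = (estimate * 0.8).toFixed(3);
console.log(estimate, segCheck);
```

Scheduling at 80% of the estimated download time leaves headroom so the next segment is buffered before the current one runs out.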
MPEG-DASH and streaming reference and resources