02 Feb 2017

Streaming Video Encoding Best Practices: Video Encoding by the Numbers

We’ve followed Jan Ozer for a while. To us, he is one of the best and brightest when it comes to streaming video and encoding, so we were pleased to learn that he’s released a new book, Video Encoding by the Numbers.

This book is chock full of detailed, quantitative guidance on how to encode your video, whatever the content type.

Here’s some background information on the book:

Video Encoding by the Numbers teaches readers to optimize the quality and efficiency of their streaming video by measuring the impact of critical configuration options with industry-standard quality metrics like PSNR and SSIMplus. This takes the guesswork out of most encoding decisions and allows readers to achieve the optimal quality/data rate tradeoff.
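Metrics like PSNR are also easy to demystify: for 8-bit video, PSNR is just 10 * log10(255² / MSE) between a reference frame and its encoded version. Here's a minimal sketch (the Psnr class and the sample values are ours, purely for illustration, and operate on flat arrays rather than real frames):

```java
public class Psnr {
    // PSNR for 8-bit samples: 10 * log10(MAX^2 / MSE), with MAX = 255
    static double psnr(int[] reference, int[] test) {
        double mse = 0;
        for (int i = 0; i < reference.length; i++) {
            double diff = reference[i] - test[i];
            mse += diff * diff;
        }
        mse /= reference.length;
        return 10 * Math.log10(255.0 * 255.0 / mse);
    }

    public static void main(String[] args) {
        int[] reference = {100, 100, 100, 100};
        int[] degraded  = {101,  99, 101,  99};
        // MSE = 1, so PSNR = 10 * log10(65025) ≈ 48.13 dB
        System.out.println(psnr(reference, degraded));
    }
}
```

Higher is better; the book's approach is to pair numbers like these with data-rate information to find the optimal quality/data rate tradeoff.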

Since all videos encode differently, the tests detailed in the book involve eight different videos, including movie footage, animations, talking head footage, a music video, and PowerPoint and Camtasia-based videos. The book walks the reader through quality testing, basic encoding configurations, encoding with H.264, HEVC, and VP9, and encoding for adaptive streaming, including technologies for transmuxing and dynamic packaging.

When appropriate, chapters conclude with a section detailing how to configure the options discussed in that chapter with FFmpeg, a preferred tool for high-volume video producers, including packaging into HLS and DASH formats (the latter with MP4Box). The book also details how to use key Apple HLS creation and checking tools like Media File Segmenter and Variant Playlist Creator.

Here’s a link to Jan’s very informative blog, where he describes the book in his own words: http://www.streaminglearningcenter.com/blogs/ozer-ships-new-book-video-encoding-by-the-numbers.html

Also, here are a couple of teaser images to whet your appetite. 🙂

Metrics to capture the optimal balance between time and quality


Streaming Video: Encoding Time and Quality by Preset

Streaming Video: Let’s use metrics to measure what it really means to have an efficient encode.

Anyway, check it out for yourself. We’re sure you won’t be disappointed.

07 Sep 2016

Triggering actions in Wowza over HTTP

In this post, you’ll look at a couple of ways to trigger applications and live streams using HTTP requests. You can use this to treat Video-On-Demand content as if it were live without having to set up a schedule or stream it from an encoder, or to control Wowza from an external location.

Prerequisites:

You should have

  • a local instance of Wowza Streaming Engine
  • an IDE such as Eclipse – here’s a link to get started with the Wowza IDE
  • experience programming in Java
  • experience with creating Wowza modules – here’s a link to developer documentation for the Wowza Streaming Engine Java API


Method 1:

You’ll start an application and a live stream by sending a request to:

http://localhost:8086/customListener/myVideo/live

You’ll need to create an HTTP Provider, which can be used to send and receive information from the Wowza server. You’ll make use of the HTTProvider2Base class to enable this capability. After you build your project into a .jar, you’ll register this module as an HTTP Provider through the configuration files.


Explanation:

public class Application extends HTTProvider2Base {
    @Override
    public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp) {
    
        String[] splitPath = req.getPath().split("/");

When an HTTP request comes in, you’ll catch it by implementing onHTTPRequest. Most of the important information arrives inside the IHTTPRequest object. Extract the path from the request and split it into an array.

    String resource = "";
    String application = "";
    String applicationInstance = "_definst_";

    if (splitPath.length > 2) {
        //Resource to play
        resource = splitPath[1];
        //Application to play it on
        application = splitPath[2];
    }
    if (splitPath.length > 3) {
        //Optional instance of application
        applicationInstance = splitPath[3];
    }

Next, assign this information to variables that you’ll use later. The first segment of your path, splitPath[0], is used only to route the request to your HTTP Provider, so you can safely ignore it. The second segment, splitPath[1], should be the resource you wish to stream. This should be a file name without an extension. For example: myVideo

The third part of your path should be the application you want to utilize. For example: live

Optionally, you may add a fourth segment, which will be the application instance where you want to create the stream. For example, include channel1 if you wish to stream to live/channel1. If you do not specify this value, it will default to _definst_.
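To see exactly what the split produces, here’s a tiny standalone sketch (the SplitDemo class is ours, just for illustration) using the example request from above:

```java
public class SplitDemo {
    public static void main(String[] args) {
        // The path portion of http://localhost:8086/customListener/myVideo/live
        String path = "customListener/myVideo/live";
        String[] splitPath = path.split("/");
        System.out.println(splitPath[0]); // the HTTP Provider's request filter: customListener
        System.out.println(splitPath[1]); // the resource: myVideo
        System.out.println(splitPath[2]); // the application: live
    }
}
```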

    //start application
    vhost.startApplicationInstance(application);
    //get application
    IApplication app = vhost.getApplication(application);
    IApplicationInstance instance = app.getAppInstance(applicationInstance);
    //Start stream
    Stream stream = Stream.createInstance(instance, "MyStream");
    stream.play(resource,0,-1,false);

Finally, you’ll start up the application, access the application instance (which may just be the default), and use the resource named in the URL to start your stream. In this example, your stream will be titled MyStream.

Your stream will be accessible at rtmp://localhost:1935/live/MyStream

Complete code:

package com.realeyes.wowza.modules.httpDemo;
 
import com.wowza.wms.application.IApplication;
import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.http.HTTProvider2Base;
import com.wowza.wms.http.IHTTPRequest;
import com.wowza.wms.http.IHTTPResponse;
import com.wowza.wms.stream.publish.Stream;
import com.wowza.wms.vhost.IVHost;
 
public class Application extends HTTProvider2Base {
    @Override
    public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp) {
 
    String[] splitPath = req.getPath().split("/");
    String resource = "";
    String application = "";
    String applicationInstance = "_definst_";

    if (splitPath.length > 2) {
        //Resource to play
        resource = splitPath[1];
        //Application to play it on
        application = splitPath[2];
    }
    if (splitPath.length > 3) {
        //Optional instance of application
        applicationInstance = splitPath[3];
    }
 
    //start application
    vhost.startApplicationInstance(application);
    //get application
    IApplication app = vhost.getApplication(application);
    IApplicationInstance instance = app.getAppInstance(applicationInstance);
    //Start stream
    Stream stream = Stream.createInstance(instance, "MyStream");
    stream.play(resource,0,-1,false);
 
    }
}

Post-code Setup:

After you’ve saved your code and built it into a .jar file (which should be done for you automatically when running the default settings), you need to specify this as an HTTP Provider.

  1. Navigate to your wowza install directory and open conf/VHost.xml
  2. Under <HTTPProviders>, add an <HTTPProvider> entry.
  3. Specify BaseClass as the fully qualified class path to your module.
  4. Specify RequestFilters as “customListener*”
  5. Specify AuthenticationMethod as “none”

It should look something like this:

<HTTPProvider>
	<BaseClass>com.realeyes.wowza.modules.httpDemo.Application</BaseClass>
	<RequestFilters>customListener*</RequestFilters>
	<AuthenticationMethod>none</AuthenticationMethod>
</HTTPProvider>


The value you specify for RequestFilters will end up being splitPath[0], and only requests that match this filter will run through your code.

Restart Wowza, and you should be able to hit your endpoint at http://localhost:8086/customListener/myVideo/live

Method 2:

What if you don’t want to use port 8086, or what if you’d rather start and connect to a stream with just one request? You can use a similar technique at the application level as well, by implementing a built-in listener called “onHTTPSessionCreate”. Using this, you’ll intercept the request and delay the response until you’ve started your stream.

Your URL will look a little different as well. It should look like this:

http://localhost:1935/live/myStream/playlist.m3u8?fileName=sample.mp4

For this method, you’ll access the application and the stream directly, and specify the resource to use through a query parameter.


Explanation:

public class Application extends ModuleBase {

    IApplicationInstance thisInstance;
    public Map <String,String> query = null;

    public void onAppStart(IApplicationInstance appInstance) {
        thisInstance = appInstance;
    }

First you’ll want to store your application instance and create a map for your query parameters. You’ll use this later.

public void onHTTPSessionCreate(IHTTPStreamerSession httpSession) {

    try {
        query = splitQueryString(httpSession.getQueryStr());
    } catch (UnsupportedEncodingException e) {
        e.printStackTrace();
    }
    
    //...
}
        
//Taken mostly from http://stackoverflow.com/a/13592567/773737
public Map <String, String> splitQueryString(String str) throws UnsupportedEncodingException {
    final Map <String, String> query_pairs = new LinkedHashMap <String, String>();
    final String[] pairs = str.split("&");
    for (String pair: pairs) {
        final int idx = pair.indexOf("=");
        final String key = idx > 0 ? URLDecoder.decode(pair.substring(0, idx), "UTF-8") : pair;
        if (!query_pairs.containsKey(key)) {
            query_pairs.put(key, null);
        }
        final String value = idx > 0 && pair.length() > idx + 1 ? URLDecoder.decode(pair.substring(idx + 1), "UTF-8") : null;
        query_pairs.put(key, value);
    }
    return query_pairs;
}

The onHTTPSessionCreate listener catches requests to HLS streams that end with playlist.m3u8. From here, extract the query string from your request into the map you created previously.
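If you want to sanity-check that helper outside of Wowza, here’s a standalone version of the same parsing logic (the QueryDemo class and the test values are ours; it uses the Java 10+ URLDecoder overload so the checked exception goes away):

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class QueryDemo {
    // Same key=value parsing as the module's splitQueryString, outside Wowza
    static Map<String, String> splitQueryString(String str) {
        final Map<String, String> pairs = new LinkedHashMap<>();
        for (String pair : str.split("&")) {
            final int idx = pair.indexOf('=');
            final String key = idx > 0
                    ? URLDecoder.decode(pair.substring(0, idx), StandardCharsets.UTF_8) : pair;
            final String value = idx > 0 && pair.length() > idx + 1
                    ? URLDecoder.decode(pair.substring(idx + 1), StandardCharsets.UTF_8) : null;
            pairs.put(key, value);
        }
        return pairs;
    }

    public static void main(String[] args) {
        Map<String, String> q = splitQueryString("fileName=sample.mp4&foo=bar%20baz");
        System.out.println(q.get("fileName")); // sample.mp4
        System.out.println(q.get("foo"));      // bar baz
    }
}
```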

    if (query.get("fileName") != null) {
        String fileName = query.get("fileName");
        Stream stream = Stream.createInstance(thisInstance, "myStream");
        stream.play(fileName, 0, -1, false);

        //Sleep so we don't disconnect the client while the stream is starting up
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
             e.printStackTrace();
        }
    }

Finally, use the fileName parameter to start your stream in the same way as in Method 1. You don’t need to specify the application or the application instance, because this code runs inside your application instance. You’re actually accessing a stream that doesn’t yet exist, and since stream creation takes a few seconds to complete, the request would normally return an error. To prevent this, sleep for a few seconds to delay the response.

Your client should experience a short delay in addition to the seconds you spend sleeping, but afterwards, it should be connected to the new stream.
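A fixed three-second sleep works, but it’s a guess: too short and the client errors out, too long and every client waits unnecessarily. A slightly more robust pattern is to poll a readiness condition with a bounded timeout. Here’s a minimal plain-Java sketch of that pattern (the WaitUtil class and the readiness check are ours; Wowza doesn’t expose a check by this name, so you’d plug in your own):

```java
import java.util.function.BooleanSupplier;

public class WaitUtil {
    // Poll a readiness check until it passes or the timeout elapses.
    static boolean waitUntil(BooleanSupplier ready, long timeoutMs, long intervalMs) {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (!ready.getAsBoolean()) {
            if (System.currentTimeMillis() >= deadline) return false;
            try {
                Thread.sleep(intervalMs);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        // Stand-in readiness check: becomes true after ~100 ms of wall time
        boolean ok = waitUntil(() -> System.currentTimeMillis() - start >= 100, 3000, 20);
        System.out.println(ok); // true
    }
}
```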

Complete code:


package com.realeyes.wowza.modules.httpDemo2;

import com.wowza.wms.application.*;

import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.LinkedHashMap;
import java.util.Map;

import com.wowza.wms.amf.*;
import com.wowza.wms.client.*;
import com.wowza.wms.module.*;
import com.wowza.wms.request.*;
import com.wowza.wms.stream.publish.Stream;
import com.wowza.wms.httpstreamer.model.*;
import com.wowza.wms.httpstreamer.cupertinostreaming.httpstreamer.*;

public class Application extends ModuleBase {

    IApplicationInstance thisInstance;
    public Map <String,String> query = null;

    public void onAppStart(IApplicationInstance appInstance) {
        thisInstance = appInstance;
    }
    
    public void onHTTPSessionCreate(IHTTPStreamerSession httpSession) {

        try {
            query = splitQueryString(httpSession.getQueryStr());
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
        
        if (query.get("fileName") != null) {
            String fileName = query.get("fileName");
            Stream stream = Stream.createInstance(thisInstance, "myStream");
            stream.play(fileName, 0, -1, false);

            //Sleep so we don't disconnect the client while the stream is starting up
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
                 e.printStackTrace();
            }
        }
    }

    //Taken mostly from http://stackoverflow.com/a/13592567/773737
    public Map <String, String> splitQueryString(String str) throws UnsupportedEncodingException {
        final Map <String, String> query_pairs = new LinkedHashMap <String, String>();
        final String[] pairs = str.split("&");
        for (String pair: pairs) {
            final int idx = pair.indexOf("=");
            final String key = idx > 0 ? URLDecoder.decode(pair.substring(0, idx), "UTF-8") : pair;
            if (!query_pairs.containsKey(key)) {
                query_pairs.put(key, null);
            }
            final String value = idx > 0 && pair.length() > idx + 1 ? URLDecoder.decode(pair.substring(idx + 1), "UTF-8") : null;
            query_pairs.put(key, value);
        }
        return query_pairs;
     }
}


Post-code Setup:

After you compile and build your module, add it to your application.

To install this module to the default “live” application:

  1. Navigate to your Wowza install directory.
  2. Open conf/live/Application.xml.
  3. Add a new <Module> entry to your <Modules> list, like so:
<Module>
 <Name>httpDemo</Name>
 <Description>demo</Description>
 <Class>com.realeyes.wowza.modules.httpDemo2.Application</Class>
</Module>

Restart your server and you should be good to go.

Conclusion:

Being able to trigger applications or streams over HTTP can be a useful tool. You can make further use of this functionality to stop streams, gather metrics, kick users, or inject advertisements. You could even create a REST service this way and integrate Wowza into your back-end server.

To see the official documentation behind the concepts used here, follow these links:

Stream Class Example

How to create an HTTP Provider

How to control access to RTSP/RTP streams

How to control access to HTTP streams


15 Jul 2016

Unified Content Across Platforms, Are We There Yet?

tl;dr: For content that doesn’t require DRM, we’re there. If you need DRM, there’s still some work ahead.

At WWDC 2016, Apple had some big news for the video streaming community. They announced two big changes that move us closer to a world where media files can truly be shared. The first was that the HLS (HTTP Live Streaming) specification would be expanded to allow the use of fMP4 (fragmented MP4) media segments. The second was that FairPlay would support fMP4 segments encrypted with CENC. Unfortunately, as is often the case, the devil is in the details.

Let’s start with the good news. fMP4 support in HLS means that content libraries that do not require DRM can use a single set of content to serve most (if not all) clients. You will still need both an MPEG-DASH manifest and an HLS manifest, but the segments referenced by those manifests can be shared. This is huge! It has a major impact on the cacheability of assets and on storage costs for media libraries. The open question is which versions of iOS will support this new version of HLS. Will older versions of iOS be updated, or will this be an iOS 10+ feature?

Now for the news that turned out to be a pretty big letdown. Apple announced that “Sample encryption uses part of ISO/IEC 23001-7:2016” (that specification is Common Encryption; a draft version can be obtained from MPEG). The next line in the slide says “MPEG standard—“Common Encryption””, and so far this is looking great. However, that was followed by “‘cbcs’ mode”, which made it clear something fishy was going on. The language alone should have been a warning: “uses part of” is not a good sign when implementing a standard.

To understand what is going on here, we need to understand the CENC spec and the new HLS spec a bit better. The new version of the CENC spec defines four separate protection schemes for content. The four schemes defined in section 4.2 of that specification are ‘cenc’, ‘cbc1’, ‘cens’, and ‘cbcs’ (yes, one of the schemes has the same name as the overall spec). Of these four, only the first, ‘cenc’, is required for implementers to support. The other three are optional.

Next, we need to understand what was added to HLS. The new spec says in section 4.3.2.4 that “fMP4 Media Segments are encrypted using the ‘cbcs’ scheme of Common Encryption”. Apple has added support for the optional ‘cbcs’ protection scheme without adding support for the ‘cenc’ scheme that the CENC spec requires. This means that if you need to protect your content with DRM, you cannot use the same fMP4 fragments for FairPlay as you use for PlayReady or Widevine.

Why would Apple choose to “use part of” the CENC spec and not implement the one required protection scheme? We don’t know all of the answers, but one key point that Roger Pantos drove home in his presentation “Content Protection for HTTP Live Streaming”[4] was that battery life is king. He even said that “every choice we made was predicated on giving you good battery life”. So what does the ‘cbcs’ protection scheme have to do with good battery life?

Of the four protection schemes defined in CENC, the first two (‘cenc’ and ‘cbc1’) use Full Sample Encryption. As the name implies, these schemes encrypt the entirety of the protected sample. The two newer schemes (‘cens’ and ‘cbcs’) use Subsample Encryption, in which only a portion of the protected sample is encrypted. The spec says “Each Subsample SHALL have an unprotected part followed by a protected part”, so a significant share of each segment is left in the clear. This means less work for the client, since less data needs to be decrypted. In addition, the ‘cbcs’ scheme treats each subsample as an independent encrypted block, so subsamples can be decrypted in parallel, allowing for faster decryption of streams.

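To make the subsample idea concrete, here’s a toy Java sketch that encrypts only the protected part of a single sample with AES-CBC, leaving the clear part untouched. This is purely illustrative and is not a ‘cbcs’ implementation: there is no pattern encryption, no real packaging, and the all-zero key is obviously not a production DRM key.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class SubsampleDemo {
    // Encrypt only the [offset, offset + len) "protected part" of a sample,
    // leaving the leading clear bytes untouched. AES-CBC, no pattern, toy key.
    static byte[] encryptProtectedPart(byte[] sample, int offset, int len,
                                       byte[] key, byte[] iv) {
        try {
            Cipher cipher = Cipher.getInstance("AES/CBC/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE,
                    new SecretKeySpec(key, "AES"), new IvParameterSpec(iv));
            byte[] out = sample.clone();
            byte[] enc = cipher.doFinal(sample, offset, len);
            System.arraycopy(enc, 0, out, offset, len);
            return out;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        byte[] key = new byte[16]; // all-zero demo key, NOT a real DRM key
        byte[] iv  = new byte[16];
        // One "subsample": 16 clear bytes followed by a 16-byte protected part
        byte[] sample = new byte[32];
        for (int i = 0; i < sample.length; i++) sample[i] = (byte) i;

        byte[] out = encryptProtectedPart(sample, 16, 16, key, iv);
        // The clear part is byte-for-byte identical to the input
        System.out.println(out[0] == sample[0] && out[15] == sample[15]); // true
    }
}
```

A client only has to run the cipher over the protected regions, which is where the battery-life argument comes from.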
The ‘cbcs’ protection scheme is a great approach, allowing for faster and more efficient protection of content. However, the lack of ‘cenc’ support within FairPlay means that DRM-protected content must still be encrypted and packaged separately per DRM system. We’ve come much closer to an ecosystem where a single set of content files can deliver high-quality protected video content, but we’re not quite there yet.
26 Aug 2015

Using JMeter to Load Test Live HLS Concurrency of Wowza Streaming Engine

When I was tasked with determining the maximum number of users that an m4.xlarge AWS instance running Wowza Streaming Engine could reliably handle while delivering HLS content, I quickly found out I had a fairly difficult task ahead of me. Luckily, I found a few blog posts that pointed me in the right direction and provided a base JMeter test plan to work from.


© 2017 RealEyes Media, LLC. All rights reserved.