06 Jun 2017

Virtual and Augmented Reality in 2017: A Quick Look

Project Tango Logo

Digital media and applications are escaping the confines of 2D platforms through augmented and virtual reality. You may have been hearing a lot about virtual reality and augmented reality lately, with technology like Microsoft’s HoloLens and Facebook’s Oculus Rift, and augmented reality phone games like Ingress and Pokemon Go, becoming more popular. Over the last year we have also seen an explosion in the number of AR- and VR-ready phones coming to market. Last November the first augmented-reality-enabled phone was released, using technology from Google’s augmented reality program, Project Tango. Even more phones already come ready to be used with Google’s VR technology, Daydream.

What does this mean for the future of phone applications and the way we view media? In this post we will discuss the difference between augmented and virtual reality, how they work, where they are being used now, and where we can expect to see more.

 

Augmented Reality versus Virtual Reality

The first thing we need to understand is the difference between augmented reality and virtual reality. The two are commonly confused with one another, so let’s start by setting the record straight. While they are very similar, there are a few key differences.

Augmented reality uses the real world as the environment for the user’s experience, placing simulated virtual objects in an already existing space. An example of how augmented reality would work is the following. You are in your living room and you would like to see what a new blue chair would look like next to your couch. You download a furniture app that uses AR technology, turn on the application’s camera, and it shows you your living room with the new chair in the middle of it. Augmented reality enhances the world we live in through the phone’s camera, but it is still easy to tell what is real and what is being simulated.

Let’s look at how virtual reality is different. Virtual reality involves more equipment, usually in the form of a headset attached to a personal computer or mobile phone. Virtual reality makes the user feel as though they are in a different environment by generating graphics and sound. An example of how this would work is the following. You get your virtual reality headset out, put it on, and now you are in a furniture store full of chairs to choose from. A virtual salesperson walks you through a selection and you pick the right one for your living room. While the virtual reality program is running, you cannot tell the difference between reality and the simulated furniture store.

 

A Look at How Augmented Reality Technologies are Being Used Today

Project Tango Development Kit

Recently, Tango augmented reality technology has become accessible to developers. This technology uses its three core capabilities of area learning, motion tracking, and depth perception to understand its surroundings. Tango offers a Unity plugin and a development device that make building augmented reality easier for developers, so we can expect to see more games and applications using this technology soon.

Last year the Lenovo Phab 2 Pro was released, the first phone to come with Tango technology. Unfortunately, this release was not as successful as anticipated. Of the few apps available, many were reported to be buggy. The Lenovo hardware fell short as well, with many reviews complaining about heavy hardware, low graphics quality, and AR apps draining the battery quickly. Overall, the release was a good proof of concept for augmented reality on mobile devices, but it left plenty of room for improvement.

A Look at How Virtual Reality Technologies are Being Used Today

HTC Vive Headset

Virtual reality currently seems to have found its home in the gaming industry. Some of the most popular applications for VR have come through the HTC Vive, Daydream View, Oculus Rift, and the very popular Sony PlayStation VR. These technologies are being used to develop and play games on platforms like the Oculus Rift store and Steam VR. Google is also investing in virtual reality development through Daydream and the Daydream View headset. A few Daydream-ready phones are already on the market, and we can expect to see more.

 

Media Streaming with VR

While gaming currently dominates the virtual reality market, the technology is being put to other uses as well. Applications like Within and Ryot VR use it to engage viewers in news videos and documentaries shot with 3D cameras, mostly short artistic or documentary-style pieces focusing on social topics. Larger media outlets like The New York Times are also getting involved with VR technology; the NYT VR app lets users browse an expanding catalog of news stories.

One of the factors helping to expand the market for VR media streaming is the availability of 3D cameras. Consumer-ready 3D cameras are hitting the market and making it easier for people to create their own VR content, with a number of entry-level models now available.

For those who want to stream 360 video, there are also a number of players in the 360 streaming media space.

Conclusion

We can expect to see more AR and VR in the near future as more hardware, like headsets and AR- and VR-enabled phones, becomes available. We’ll reach critical mass once the number of applications and sources of AR and VR media grows as well. We expect to see much more from these technologies soon.

02 Feb 2017

Streaming Video Encoding Best Practices: Video Encoding by the Numbers

We’ve followed Jan Ozer for a while. To us he is one of the best and brightest when it comes to streaming video and encoding, so we were pleased to find out that he recently released a new book, Video Encoding by the Numbers.

This book is chock-full of detailed, quantitative guidance on the decisions behind how to encode your video, no matter the content type.

Here’s some background information on the book:

Video Encoding by the Numbers teaches readers to optimize the quality and efficiency of their streaming video by measuring the impact of critical configuration options with industry-standard quality metrics like PSNR and SSIMplus. This takes the guesswork out of most encoding decisions and allows readers to achieve the optimal quality/data rate tradeoff.
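The book leans on dedicated tools for this kind of measurement (SSIMplus in particular is a commercial metric), but if you just want a feel for it, FFmpeg ships with psnr and ssim filters. A rough, illustrative comparison of an encode against its source might look like the following, where encoded.mp4 and source.mp4 are placeholder names for the distorted and reference files and are assumed to be frame-aligned; the scores are printed to the console log:

ffmpeg -i encoded.mp4 -i source.mp4 -lavfi "[0:v][1:v]psnr" -f null -
ffmpeg -i encoded.mp4 -i source.mp4 -lavfi "[0:v][1:v]ssim" -f null -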

Since all videos encode differently, the tests detailed in the book involve eight different videos, including movie footage, animations, talking head footage, a music video, and PowerPoint and Camtasia-based videos. The book walks the reader through quality testing, basic encoding configurations, encoding with H.264, HEVC, and VP9, and encoding for adaptive streaming, including technologies for transmuxing and dynamic packaging.

When appropriate, chapters conclude with a section detailing how to configure the options discussed in that chapter with FFmpeg, a preferred tool for high-volume video producers, including packaging into HLS and DASH formats (the latter with MP4Box). The book also details how to use key Apple HLS creation and checking tools like Media File Segmenter and Variant Playlist Creator.
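We won’t reproduce the book’s recipes here, but to give a flavor of that kind of FFmpeg work, a minimal (illustrative, not tuned) H.264 encode followed by HLS packaging might look like this; the file names, 5 Mbps rate, and 6-second segment duration are placeholders rather than recommendations from the book:

ffmpeg -i source.mp4 -c:v libx264 -preset medium -b:v 5000k -maxrate 5000k -bufsize 10000k -c:a aac -b:a 128k out_5000k.mp4
ffmpeg -i out_5000k.mp4 -c copy -f hls -hls_time 6 -hls_playlist_type vod -hls_segment_filename "seg_%03d.ts" playlist.m3u8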

Here’s a link to Jan’s very informative blog, with a description of the book in his own words: http://www.streaminglearningcenter.com/blogs/ozer-ships-new-book-video-encoding-by-the-numbers.html

Also, here are a couple of teaser images to get your appetite going. 🙂

Metrics to capture the optimal balance between time and quality


 

Streaming Video: Encoding Time and Quality by Preset

Streaming Video: Let’s use metrics to really measure what it means to have an efficient encode.

Anyway, check it out for yourself. We’re sure you won’t be disappointed.

14 Oct 2016

CD Summit & Jenkins Days: Promising DevOps Technologies

I had the pleasure of attending CD Summit & Jenkins Days this week in Denver. The conference was very focused on continuous delivery, DevOps, and the kind of team culture that promotes good development and deployment practices. I had never been to this conference series before, but it was a pleasant experience that I hope to repeat in the future.

If you are responsible for championing DevOps at your place of employment and wear many hats, it can be difficult to keep track of all the fancy new developments. Here’s a list of the new technologies that I learned about or dove deeper into while attending the conference:

Jenkins Pipeline

DevOps - Jenkins Pipeline View

  • Really cool way to manage your workflow from build to test to deploy.
  • Reminiscent of the fancy pipeline approach I dig in GoCD
  • Groovy DSL and Jenkinsfile yummies. Less UI-based configuration please!

IBM BlueMix Garage Method and ToolChains

DevOps - IBM Bluemix Toolchains

  • DevOps best practices through the Garage Method
  • Customizable Toolchains to get you from dev to deploy
  • Sweet Web-based IDE
  • Both free for anyone using BlueMix

Electric Cloud’s Electric Flow

DevOps - Electric Cloud Electric Flow

  • A fully featured and extensible release automation tool with a super slick UI
  • Free for up to 10 deploy nodes
  • So far about 180 different plugins

In Conclusion

I enjoyed my time at the conference and learned a lot. The tools above are the three most impressive that I saw and I look forward to playing around with them!

07 Sep 2016

Triggering actions in Wowza over HTTP

In this post, you’ll look at a couple of ways to trigger applications and live streams using HTTP requests. You can use this to treat Video-On-Demand content as if it were live without having to set up a schedule or stream it from an encoder, or to control Wowza from an external location.

Prerequisites:

You should have

  • a local instance of Wowza Streaming Engine
  • an IDE such as Eclipse – here’s a link to get started with the Wowza IDE
  • experience programming in Java
  • experience with creating Wowza modules – here’s a link to developer documentation for the Wowza Streaming Engine Java API

 

Method 1:

You’ll be starting an application and a live-stream by sending a request to:

http://localhost:8086/customListener/myVideo/live

You’ll need to create an HTTP Provider, which can be used to send and receive information from the Wowza server. You’ll make use of the HTTProvider2Base class to enable this capability. After you build your project into a .jar, you’ll register this class as an HTTP Provider through the configuration files.

 

Explanation:

public class Application extends HTTProvider2Base {
    @Override
    public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp) {
    
        String[] splitPath = req.getPath().split("/");

When an HTTP request comes in, you’ll need to catch it by implementing onHTTPRequest. When a request arrives, most of the important information will be inside of the IHTTPRequest object. Extract this important information from the path and store it in an array.

    String resource = "";
    String application = "";
    String applicationInstance = "";
 
    if(splitPath.length > 2){
        //Resource to play
        resource = splitPath[1];
        //Application to play it on
        application = splitPath[2];
    }else if (splitPath.length > 3){
        //Optional instance of application
        applicationInstance = splitPath[3];
    }

Next, assign this information to values that you’ll use later. The first part of your path, splitPath[0], is used only to navigate to your HTTP Provider, so you can safely ignore it. The second part of the path at splitPath[1] should be the resource you wish to stream. This should be a file name without an extension. For example:  myVideo

The third part of your path should be the application you want to utilize. For example: live

Optionally, you may add a fourth segment, which will be the application instance where you want to create the stream. For example, include channel1 if you wish to stream to live/channel1. If you do not specify this value, it will default to _definst_.

    //start application
    vhost.startApplicationInstance(application);
    //get application
    IApplication app = vhost.getApplication(application);
    IApplicationInstance instance = app.getAppInstance(applicationInstance);
    //Start stream
    Stream stream = Stream.createInstance(instance, "MyStream");
    stream.play(resource,0,-1,false);

Finally, you’ll start up the application, access the application instance (which may just be the default), and use the resource named in the URL to start your stream. In this example, your stream will be titled MyStream.

Your stream will be accessible at rtmp://localhost:1935/live/MyStream
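As a quick sanity check (not part of the module itself, and assuming you have FFmpeg’s ffplay installed and a myVideo asset in your Wowza content directory), you can point a desktop player at the new stream:

ffplay rtmp://localhost:1935/live/MyStream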

Complete code:

package com.realeyes.wowza.modules.httpDemo;
 
import com.wowza.wms.application.IApplication;
import com.wowza.wms.application.IApplicationInstance;
import com.wowza.wms.http.HTTProvider2Base;
import com.wowza.wms.http.IHTTPRequest;
import com.wowza.wms.http.IHTTPResponse;
import com.wowza.wms.stream.publish.Stream;
import com.wowza.wms.vhost.IVHost;
 
public class Application extends HTTProvider2Base {
    @Override
    public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp) {
 
    String[] splitPath = req.getPath().split("/");
    String resource = "";
    String application = "";
    String applicationInstance = "";
 
    if(splitPath.length > 2){
        //Resource to play
        resource = splitPath[1];
        //Application to play it on
        application = splitPath[2];
    }
    if(splitPath.length > 3){
        //Optional instance of the application (defaults to _definst_)
        applicationInstance = splitPath[3];
    }
 
    //start application
    vhost.startApplicationInstance(application);
    //get application
    IApplication app = vhost.getApplication(application);
    IApplicationInstance instance = app.getAppInstance(applicationInstance);
    //Start stream
    Stream stream = Stream.createInstance(instance, "MyStream");
    stream.play(resource,0,-1,false);
 
    }
}

Post-code Setup:

After you’ve saved your code and built it into a .jar file (which should be done for you automatically when running the default settings), you need to specify this as an HTTP Provider.

  1. Navigate to your Wowza install directory and open conf/VHost.xml
  2. Under <HTTPProviders> (in the <HostPort> entry for port 8086), add an <HTTPProvider> entry.
  3. Specify BaseClass as the fully qualified class name of your module.
  4. Specify RequestFilters as “customListener*”
  5. Specify AuthenticationMethod as “none”

It should look something like this:

<HTTPProvider>
	<BaseClass>com.realeyes.wowza.modules.httpDemo.Application</BaseClass>
	<RequestFilters>customListener*</RequestFilters>
	<AuthenticationMethod>none</AuthenticationMethod>
</HTTPProvider>


The value you specify for RequestFilters will end up being splitPath[0], and only requests that match this filter will run through your code.

Restart Wowza, and you should be able to hit your endpoint at http://localhost:8086/customListener/myVideo/myApplication

Method 2:

What if you don’t want to use port 8086, or what if you’d rather start and connect to a stream with just one request? You can use a similar technique at the application level as well, by implementing a built-in listener called “onHTTPSessionCreate”. Using this, you’ll intercept the request and delay the response until you’ve started your stream.

Your URL will look a little different as well. It should look like this:

http://localhost:1935/live/myStream/playlist.m3u8?fileName=sample.mp4

For this method, you’ll access the application and the stream directly, and specify the resource to use through a query parameter.

 

Explanation:

public class Application extends ModuleBase {

    IApplicationInstance thisInstance;
    public Map <String,String> query = null;

    public void onAppStart(IApplicationInstance appInstance) {
        thisInstance = appInstance;
    }

First you’ll want to store your application instance and create a map for your query parameters. You’ll use this later.

public void onHTTPSessionCreate(IHTTPStreamerSession httpSession) {

    try {
        query = splitQueryString(httpSession.getQueryStr());
    } catch (UnsupportedEncodingException e) {
        e.printStackTrace();
    }
    
    //...
}
        
//Taken mostly from http://stackoverflow.com/a/13592567/773737
public Map <String, String> splitQueryString(String str) throws UnsupportedEncodingException {
    final Map <String, String> query_pairs = new LinkedHashMap <String, String>();
    final String[] pairs = str.split("&");
    for (String pair: pairs) {
        final int idx = pair.indexOf("=");
        final String key = idx > 0 ? URLDecoder.decode(pair.substring(0, idx), "UTF-8") : pair;
        if (!query_pairs.containsKey(key)) {
            query_pairs.put(key, null);
        }
        final String value = idx > 0 && pair.length() > idx + 1 ? URLDecoder.decode(pair.substring(idx + 1), "UTF-8") : null;
        query_pairs.put(key, value);
    }
    return query_pairs;
}

The onHTTPSessionCreate listener catches requests to HLS streams that end with playlist.m3u8. From here, extract the query string from your request into the map you created previously.

    if (query.get("fileName") != null) {
        String fileName = query.get("fileName");
        Stream stream = Stream.createInstance(thisInstance, "myStream");
        stream.play(fileName, 0, -1, false);

        //Sleep so we don't disconnect the client while the stream is starting up
        try {
            Thread.sleep(3000);
        } catch (InterruptedException e) {
             e.printStackTrace();
        }
    }

Finally, use the fileName parameter to start up your stream in the same way as in method #1. You don’t need to specify your application or your application instance, because this code is running inside your application instance. You’re actually requesting a stream that doesn’t exist yet, and since the stream creation call takes a few seconds to complete, the request would normally return an error. To prevent this, sleep for a few seconds in order to delay the response.

Your client should experience a short delay in addition to the seconds you spend sleeping, but afterwards, it should be connected to the new stream.
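As an illustrative end-to-end check (sample.mp4 stands in for whatever file exists in your application’s content directory, and the default live application is assumed), you could warm the stream up with one request and then hand any HLS-capable client the plain playlist URL:

curl "http://localhost:1935/live/myStream/playlist.m3u8?fileName=sample.mp4"
ffplay "http://localhost:1935/live/myStream/playlist.m3u8"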

Complete code:

 

package com.realeyes.wowza.modules.httpDemo2;

import com.wowza.wms.application.*;

import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;
import java.util.LinkedHashMap;
import java.util.Map;

import com.wowza.wms.amf.*;
import com.wowza.wms.client.*;
import com.wowza.wms.module.*;
import com.wowza.wms.request.*;
import com.wowza.wms.stream.publish.Stream;
import com.wowza.wms.httpstreamer.model.*;
import com.wowza.wms.httpstreamer.cupertinostreaming.httpstreamer.*;

public class Application extends ModuleBase {

    IApplicationInstance thisInstance;
    public Map <String,String> query = null;

    public void onAppStart(IApplicationInstance appInstance) {
        thisInstance = appInstance;
    }
    
    public void onHTTPSessionCreate(IHTTPStreamerSession httpSession) {

        try {
            query = splitQueryString(httpSession.getQueryStr());
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
        
        if (query.get("fileName") != null) {
            String fileName = query.get("fileName");
            Stream stream = Stream.createInstance(thisInstance, "myStream");
            stream.play(fileName, 0, -1, false);

            //Sleep so we don't disconnect the client while the stream is starting up
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
                 e.printStackTrace();
            }
        }
    }

    //Taken mostly from http://stackoverflow.com/a/13592567/773737
    public Map <String, String> splitQueryString(String str) throws UnsupportedEncodingException {
        final Map <String, String> query_pairs = new LinkedHashMap <String, String>();
        final String[] pairs = str.split("&");
        for (String pair: pairs) {
            final int idx = pair.indexOf("=");
            final String key = idx > 0 ? URLDecoder.decode(pair.substring(0, idx), "UTF-8") : pair;
            if (!query_pairs.containsKey(key)) {
                query_pairs.put(key, null);
            }
            final String value = idx > 0 && pair.length() > idx + 1 ? URLDecoder.decode(pair.substring(idx + 1), "UTF-8") : null;
            query_pairs.put(key, value);
        }
        return query_pairs;
     }
}

 

 

Post-code Setup:

After you compile and build your module, add it to your application.

To install this module to the default “live” application:

  1. Navigate to your Wowza install directory.
  2. Open conf/live/Application.xml.
  3. Add a new <Module> entry to your <Modules> list, like so:
<Module>
 <Name>httpDemo</Name>
 <Description>demo</Description>
 <Class>com.realeyes.wowza.modules.httpDemo2.Application</Class>
</Module>

Restart your server and you should be good to go.

Conclusion:

Being able to trigger applications or streams over HTTP can be a useful tool. You can make further use of this functionality to stop streams, gather metrics, kick users or inject advertisements. You could even create a REST service using this, and integrate Wowza into your back-end server.

To see the official documentation behind the concepts used here, follow these links:

Stream Class Example

How to create an HTTP Provider

How to control access to RTSP/RTP streams

How to control access to HTTP streams

 

15 Jul 2016

Unified Content Across Platforms, Are We There Yet?

tl;dr: For content that doesn’t require DRM, we’re there. If you need DRM, there’s still some work ahead.

At WWDC 2016 Apple had some big news for the video streaming community. They announced two big changes that move us closer to a world where media files can truly be shared. The first announcement was that the HLS (HTTP Live Streaming) specification would be expanded to allow the use of fMP4 (fragmented MP4) media segments. The second was that FairPlay would support fMP4 segments encrypted with CENC. Unfortunately, as is often the case, the devil is in the details.

Let’s start with the good news. fMP4 support in HLS means that content libraries that do not require DRM can use a single set of content to serve most (if not all) clients. You will need an MPEG-DASH manifest and an HLS manifest, but the segments referenced by those manifests can be shared. This is huge! It has a major impact on the cache-ability of assets and on storage costs for media libraries. The open question is which versions of iOS will support this new version of HLS. Will older versions of iOS get an update, or will this be an iOS 10+ feature?

Now for the news that turned out to be a pretty big letdown. Apple announced “Sample encryption uses part of ISO/IEC 23001:7 2016” (that specification is Common Encryption; a draft version can be obtained from MPEG). The next line in the slide reads “MPEG standard—‘Common Encryption’”, so this is looking great. However, that was followed by “‘cbcs’ mode”, which made it clear something fishy was going on. The language should also have been a clue: “uses part of” is not a good sign when implementing a standard.

To understand what is going on here, we need to understand the CENC spec and the new HLS spec a bit better. The new versions of the CENC spec define four separate protection schemes for content. The four schemes defined in section 4.2 of that specification are ‘cenc’, ‘cbc1’, ‘cens’, and ‘cbcs’ (yes, one of the schemes has the same name as the overall spec). Of these four protection schemes, only the first, ‘cenc’, is required for implementers to support. The other three are optional.

Now we need to better understand what was added to HLS. The new spec says in section 4.3.2.4 that “fMP4 Media Segments are encrypted using the ‘cbcs’ scheme of Common Encryption”. They’ve added support for the optional ‘cbcs’ protection scheme without adding support for the ‘cenc’ protection scheme that the CENC spec requires. This means that if you need to protect your content with DRM, you cannot use the same fMP4 fragments for FairPlay as you use for PlayReady or Widevine.

Why would Apple choose to “use part of” the CENC spec and not implement the one required protection scheme? We don’t know all of the answers, but one key point that Roger Pantos drove home in one of his presentations, “Content Protection for HTTP Live Streaming”[4], was that battery life is king. He even said “every choice we made was predicated on giving you good battery life”. So what does the ‘cbcs’ protection scheme have to do with good battery life?

Of the four protection schemes defined in CENC, the first two (‘cenc’ and ‘cbc1’) are full sample encryption. As the name implies, these schemes encrypt the entirety of the protected segment. The two newer schemes (‘cens’ and ‘cbcs’) use subsample encryption, in which only a portion of the protected segment is encrypted. The spec says “Each Subsample SHALL have an unprotected part followed by a protected part”, so roughly half of the segment ends up being encrypted. This means there is less work to do on the client, since only about half as much data needs to be decrypted. In addition, the ‘cbcs’ protection scheme treats each subsample as an independent encrypted block, which means subsamples can be decrypted in parallel, allowing for faster decryption of streams.

The ‘cbcs’ protection scheme is a great approach, allowing for faster and more efficient protection of content. However, the lack of support for ‘cenc’ within FairPlay means that DRM-protected content must still be packaged separately for FairPlay and for PlayReady/Widevine. We’ve come much closer to an ecosystem where we can use a single set of content files to deliver high quality protected video content, but we’re not quite there yet.

26 Aug 2015

Using JMeter to Load Test Live HLS Concurrency of Wowza Streaming Engine

When I was tasked with determining the maximum number of users that an m4.xlarge AWS instance running Wowza Streaming Engine delivering HLS content could reliably handle, I quickly found out I had a fairly difficult task ahead of me. Luckily, I found a few blog posts that pointed me in the right direction and provided the base JMeter test plan to work with.

read more
19 Feb 2015

Using PHDS/PHLS/PRTMP? A case for updating to AMS 5.0.7.

The PHDS, PHLS, and PRTMP features of Adobe Media Server rely on certificate files provided with the installation. From time to time these files are set to expire, and new files are provided with new AMS install versions. The AMS 5.0.7 release notes indicate that these certificate files will need to be replaced again before April 6th, 2015:

read more

© 2017 RealEyes Media, LLC. All rights reserved.