Time – the one thing we cannot afford to lose

Time.  We spend it in a wide variety of ways – at work, performing tasks and creating the things we’ve been tasked with.  We also spend time in our vehicles or on our bikes, headed to and from work.  And – given the present pandemic we face here in the US – many of us are spending more time with our families and working, all in the same place.  I once asked a billionaire to identify the one thing he couldn’t buy but that was the most valuable thing in his life.  His answer?  Time.  No matter how we look at it, time is one of the most important yet finite elements in our lives.  We generally end up wishing we had more.

The pandemic we are all living with today has had a hand in changing many things about life – how and where we work, how often we venture out of our homes, and what we do when we are home and not working.  We all have stories (and likely opinions) about quarantines, and I won’t digress here – but I will point to one compelling figure: a collective 50% increase in streaming data usage by device since April 2020.  50% – that is significant!  So while we have had more free time during this pandemic, more and more of us have turned to streaming services to consume more content.  In fact, as many as one-quarter of us had so much time on hand that we subscribed to another service.  Bottom line – pandemic or not, we are streaming more content to our devices than ever, and the trend shows no signs of slowing.

But what happens when the train is late?

The subheading might be a head scratcher, but here’s the problem.  You have that extra free time, sports leagues are back in play, and games and other coverage can be streamed – but the game is several seconds behind real time, and you’re catching goals, home runs, and (soon) touchdowns after they happen.

As in this example, John Appleseed up top is watching a feed of the football match and sees the shot on goal.  His friend Tim is also watching, but the stream Tim is watching runs roughly 6-7 seconds behind John’s.  This offset has recently been measured at as much as 30 seconds, and we see 15 seconds on a regular basis.  So they are both enjoying their experience of the event – until the shot on goal.

So, when John shoots Tim a text to celebrate the goal, Tim is perplexed.  He hasn’t yet seen the shot John has.  In video streaming, this condition is called latency.  Latency is defined as the delay before a transfer of data begins following an instruction for its transfer.  For the purposes of this discussion, latency is the delay between the live shot on goal and the moment the viewer can see the goal on their device.  When video is processed for streaming, the source is often passed from the venue – in this case, let’s say it is the stadium – and the ladder of stream renditions is created in the cloud somewhere.  So in essence, there is very little processing of these streams at the edge, and nearly all of that work is done via a transcode step in the cloud.  There is absolutely nothing wrong with this approach, but it does create the latency visualized in the example above.  This condition is most noticeable in live applications like news and sports, where things change moment by moment.  So in short – latency steals time.

 

Time travel – is this a thing?

In a word, no.  Neither science fiction nor special effects are needed to accelerate the delivery of streaming video to the device of your choice.  However, there is a lot that can be done to reduce the amount of time that elapses between that proverbial goal happening on the pitch and the moment you actually see it on your device.  By processing the actual streams you’ll consume on a device at the edge rather than in the cloud, we are able to deliver them to a content delivery network (CDN) and reduce latency to near 1 second.  This means you are seeing that goal roughly one second after it happened in the stadium.  That’s significant.

If you’ve watched the Netflix series The Umbrella Academy, we’re not exactly talking about the kind of time manipulation and travel their character Number 5 is capable of – moving through time and changing global outcomes – but what I’ve just outlined represents a measurable improvement in how we consume live streaming video.  Again, this isn’t a cloud versus on-premise argument or endorsement, but a different approach to delivery when time matters most – usually in news and sports applications.

Time is important for many reasons, and this is just one.  Offering broadcasters and publishers the flexibility to deliver their content using a contribution or distribution architecture is yet another element at the core of what Videon is.  It’s about time.

 

Authored by Matt Smith

Matt Smith is a recognized digital media industry evangelist and thought leader, having spoken at the National Association of Broadcasters (NAB) Show, IBC, and various other shows.  He’s served in a variety of roles in the industry during his career, with stops at Comcast, Brightcove, Anvato, Envivio, and others.

4 Steps to Configure Videon’s Live Streaming Encoder with Low Latency Video

Low latency remains a key component in any live streaming application. Video chat, distance learning, live auctions, event broadcasts, and many more use cases all need fast stream delivery to ensure a positive user experience. The challenge lies in providing low latency that is consistent and scalable. Complicating matters further is the fact that live streaming video is itself nuanced, with many subtle yet impactful considerations that must be taken into account.

In order to simplify this process and cut through any confusion, Red5 Pro and Videon have joined forces to bring real-time live video streaming to the mass market. To accomplish this, they make use of Videon’s hardware-based live streaming encoder to perform the video encoding. This document outlines the process for setting up a low latency workflow and shows the impressive ultra-low latency results achievable when pairing Videon encoders with Red5 Pro’s cloud-based streaming platform.

This step-by-step guide walks you through implementing a WebRTC-based ultra-low latency workflow using Videon’s VersaStreamer 4K encoder and Red5 Pro’s Real-Time Streaming platform. We will then measure the glass-to-glass latency of the configured workflow.

Workflow Diagram

The latency consistently clocks in at 400-500ms using either a single instance or Red5 Pro Clustering technology with Stream Manager. The lowest latency clocked during this testing was 147.5ms.

Along with our partners at Videon, Red5 Pro recommends using our cloud-based streaming software and Videon’s VersaStreamer 4K for applications that require real-time interaction between viewers and broadcasters.

 

Workflow Details

All the tests and results shown in this post were run with encode and decode endpoints located in Videon’s lab in State College, PA.

 

Videon VersaStreamer 4K

  • Software Version:  6.3.0.61.3

 

Red5 Pro Server

  • Software Version:  5.7.11.b376-release
  • Configured in Amazon Web Services EC2
  • Region: us-east-1
  • Instance type: c5.xlarge
  • OS:  Ubuntu 16.04.6 LTS

 

Audio/Video Source

ASUS Chromebox

  • Google Chrome OS
  • Version 76.0.3809.136 (Official Build) (64-bit)
  • Video:  1920x1080P60
  • Audio:  HDMI Embedded (Spotify music)

Gefen 1:3 HDMI Splitter

  • Model:  repeater

 

Players

Red5 Pro HTML Publisher and Subscriber SDK

  • Version:  5.7.0
  • https://github.com/red5pro/streaming-html5

 

Setup Instructions

First, as illustrated by the block diagram above, connect the hardware. Once the hardware is connected, you can then install and set up the Red5 Pro instance or cluster per the provided Red5 Pro instructions. Please note that all defaults should be enabled.

Next, you will need to set up the Videon VersaStreamer 4K by opening “Encoder Settings” in the Videon VersaStreamer 4K web UI. Make sure that the following settings are selected in order to provide the best performance.

  • Encoding Mode > Constant Bitrate
  • Video Encoding > H.264 (AVC)
  • H.264 Profile > Baseline Profile (note that you must use baseline due to WebRTC based decoder limitations for higher profiles in browsers)

All other settings may be optimized for your specific workflow and deployment conditions. The following image shows all the settings used for the tests performed to produce the results stated in this post.
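If you keep a written record of your encoder configuration, the key settings above can also be captured in a small checklist script. This is a minimal sketch for documentation purposes only: the VersaStreamer 4K is configured through its web UI, and the dictionary keys and the check_settings helper below are assumed names, not part of any Videon API.

```python
# Illustrative checklist of the encoder settings recommended above.
# The VersaStreamer 4K is configured through its web UI; these dictionary
# keys are assumed names for documentation purposes, not a Videon API.
REQUIRED_SETTINGS = {
    "encoding_mode": "Constant Bitrate",
    "video_encoding": "H.264 (AVC)",
    "h264_profile": "Baseline Profile",  # WebRTC browser decoders may reject higher profiles
}

def check_settings(current: dict) -> list:
    """Return descriptions of any settings that differ from the recommendations."""
    return [
        f"{key}: expected {expected!r}, found {current.get(key)!r}"
        for key, expected in REQUIRED_SETTINGS.items()
        if current.get(key) != expected
    ]

if __name__ == "__main__":
    # Example: values transcribed by hand from the web UI.
    configured = {
        "encoding_mode": "Constant Bitrate",
        "video_encoding": "H.264 (AVC)",
        "h264_profile": "High Profile",
    }
    for problem in check_settings(configured):
        print("Mismatch:", problem)
```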

You will now need to turn on RTMP streaming to the Red5 Pro server in Output Settings by selecting the RTMP 1 tab:

  1. Turn RTMP Output 1 “On”
  2. From the dropdown menu, select “Generic RTMP”
  3. Enter the following in the URL entry box: rtmp://<Red5 Pro Server>:1935/live/stream
  4. Replace <Red5 Pro Server> with the hostname of the AWS EC2 instance that has your Red5 Pro server configured and running, along with the configured port (a sketch for assembling this URL follows below).
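If you prefer to assemble the ingest URL programmatically rather than typing it by hand, a tiny helper like the one below can help avoid typos. The build_rtmp_url function and the EC2 hostname are illustrative assumptions, not part of the Red5 Pro or Videon tooling.

```python
# Illustrative helper for assembling the RTMP ingest URL from step 3.
# The hostname below is a placeholder for your own EC2 instance; this
# function is not part of the Red5 Pro or Videon tooling.
def build_rtmp_url(host: str, port: int = 1935, app: str = "live", stream: str = "stream") -> str:
    """Build an rtmp://<Red5 Pro Server>:1935/live/stream style URL."""
    return f"rtmp://{host}:{port}/{app}/{stream}"

# Example with a hypothetical EC2 public hostname:
print(build_rtmp_url("ec2-203-0-113-10.compute-1.amazonaws.com"))
# -> rtmp://ec2-203-0-113-10.compute-1.amazonaws.com:1935/live/stream
```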

You can now play the stream in the Red5 Pro web player by following the next 2 steps:

  1. In the browser’s address bar, enter this URL: http://<Red5 Pro Server>:5080/live/viewer.jsp?host=<Red5 Pro Server IP>&stream=stream#. Note that if you want to use this in production you will want to enable SSL and use a fully qualified domain and https. More info on installing SSL certificates can be found in our documentation.
  2. Replace <Red5 Pro Server> with the hostname of the AWS EC2 instance used in Output Settings step 3 of the section above and <Red5 Pro Server IP> with the external IP address of the same instance (see the sketch below).
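The playback URL can be assembled the same way. Again, build_viewer_url and the host values below are illustrative placeholders rather than anything shipped with the Red5 Pro SDK.

```python
# Illustrative helper for building the Red5 Pro web player URL from step 1.
# The host and IP values are placeholders; substitute your own instance details.
from urllib.parse import urlencode

def build_viewer_url(host: str, host_ip: str, stream: str = "stream", port: int = 5080) -> str:
    """Build the http://<Red5 Pro Server>:5080/live/viewer.jsp playback URL."""
    query = urlencode({"host": host_ip, "stream": stream})
    return f"http://{host}:{port}/live/viewer.jsp?{query}"

# Example with hypothetical host and IP values:
print(build_viewer_url("ec2-203-0-113-10.compute-1.amazonaws.com", "203.0.113.10"))
# -> http://ec2-203-0-113-10.compute-1.amazonaws.com:5080/live/viewer.jsp?host=203.0.113.10&stream=stream
```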

It should be noted that in Red5 Pro’s upcoming 7.0 release, SRT can be used instead of RTMP for even better latency and performance.

 

Test Measurements

Once you have followed these instructions and conducted a test according to the block diagram, the speed and power of the combination of Red5 Pro and Videon will become readily apparent.

Please make sure to use a video source that has a running timer. Since we measure latency in milliseconds rather than seconds, the timer will need to display down to milliseconds in order to properly measure the low latency output. As our partners at Videon did, you can use the Red5 Pro Web Player and monitor output for playback.

Samples were measured by taking a picture of the preview monitor screen and the Red5 Pro web player at the same time.  By comparing the time codes, you can see the latency from when the local video source captures the video to when the video is encoded by Videon’s VersaStreamer 4K and streamed through Red5 Pro’s cloud infrastructure. The following pictures show the workflow at various times after beginning the stream. Videon consistently measured under 200ms of glass-to-glass latency. Although adverse network conditions can increase latency, even setups on poor networks measured under 500ms.
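The arithmetic behind these measurements is straightforward: subtract the time code shown in the Red5 Pro web player from the time code shown on the preview monitor in the same photograph. Here is a minimal sketch, assuming both time codes have been transcribed by hand in HH:MM:SS.mmm form; the glass_to_glass_ms helper and the sample readings are hypothetical.

```python
# Minimal sketch of the glass-to-glass latency calculation, assuming the two
# time codes have been read off the photograph by hand (HH:MM:SS.mmm).
from datetime import datetime

def glass_to_glass_ms(source_timecode: str, player_timecode: str) -> float:
    """Milliseconds between the preview monitor reading and the web player reading."""
    fmt = "%H:%M:%S.%f"
    source = datetime.strptime(source_timecode, fmt)
    player = datetime.strptime(player_timecode, fmt)
    return (source - player).total_seconds() * 1000

# Hypothetical readings: the preview monitor shows 12:00:05.350 while the
# web player still shows 12:00:05.200 in the same photograph.
print(glass_to_glass_ms("12:00:05.350", "12:00:05.200"))  # -> 150.0 (milliseconds)
```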

 

Recommendations

For real-time latency workflows, Red5 Pro software and the Videon VersaStreamer 4K are an optimal pairing. With sub-500ms latency, smooth and natural interactions can be achieved by using Red5 Pro’s cloud-based streaming platform and Videon’s VersaStreamer 4K. This powerful combination enables viewers and streamers to interact in real time for a completely interactive experience.  Using the Videon live streaming encoder with the Red5 Pro software is the best way to provide low latency video streaming to a broader audience.

Want to learn more? Get in touch. Send an email to info@red5pro.com or schedule a call directly.

Sergio Epelbaum Transforms Live Surgical Streaming With Sonora

Prensario Magazine recently published a great article about how Sergio Epelbaum, founder of Argentina’s Vision Producciones, has been using Videon’s VersaStreamer to transform live streaming from the operating room during demonstrations at ophthalmological conferences.  Thanks Prensario!  [Read the Prensario article here in Spanish!]

Sergio founded Vision Producciones 25 years ago, and in the beginning, the company specialized in making surgical videos. He has since expanded the company into an award-winning full-service medical production company that does everything from live streaming of surgical procedures to website design and communications programming for medical journals.

The magazine article showcases Sergio’s work for FacoExtrema, a huge live streaming event that draws thousands of online viewers — more viewers than there are ophthalmologists across Latin America, actually. Sergio puts a team in the OR to film the surgery and then uses a private network connected by antennas to send video to a local viewing area and streams it live for a worldwide audience. 

The Videon platform gives Sergio several critical benefits. For one, it’s a well-made product that’s available at an affordable price point. Because it supports H.264 and acts simultaneously as an encoder and decoder, the platform saves space while reducing bandwidth, cabling, and power requirements.

Sergio’s story is a great example of how advanced encoding solutions are changing live streaming across industries. Check it out!