Streaming Video – A Tutorial on How it Works

September 7, 2017

A Vimond Media Solutions Focus

Andreas Helland, CCO at Vimond Media Solutions

Broadcast was once the exclusive domain of experts. Then came the Internet. Having worked in the broadcast and OTT (Over-The-Top) industries for the past few decades, Vimond has witnessed how IP technologies have democratized the ability to produce and distribute video.

It sounds simple. You shoot the video. You edit it. And you upload it to the cloud. But then come questions. How can you make sure it’s available for everyone? Will they have to wait long for it to buffer? How do you stream a TV channel or live events? What about device screen size, device manufacturer and specifications, target bandwidth and location, content type, discovery, copyright issues, server locations, and content delivery networks? The list goes on.

So if you’re a non-expert who needs to know more about how streaming video works, either to keep up with the OTT competition or with what is happening in your own company – here is a quick tutorial. Let’s start with some basics.

 

Compression, Formats, Codecs, and Containers

Video is a sequence of images in which a colour is defined for every pixel of every frame. If each image were stored in full, the resulting file would be far too large.

A full-length, uncompressed progressive HD movie of about two hours, with 10-bit colour, 1920 x 1080 pixels, and 25 frames per second, translates into a roughly 1.4-terabyte file. To view it without buffering, viewers would need a sustained connection of about 1.6 Gbps. The answer to this dilemma is compression, which comes in two basic flavors:
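Those figures can be checked with a quick back-of-the-envelope calculation. The two-hour running time is an assumption for illustration:

```python
# Uncompressed HD video: bits per second and total file size,
# assuming 10 bits per colour channel across three channels
# and a two-hour running time.
WIDTH, HEIGHT = 1920, 1080
FPS = 25
BITS_PER_PIXEL = 10 * 3          # 10-bit colour, three channels
DURATION_S = 2 * 60 * 60         # assumed two-hour movie

bits_per_second = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
total_bytes = bits_per_second * DURATION_S / 8

print(f"{bits_per_second / 1e9:.2f} Gbps")   # 1.56 Gbps
print(f"{total_bytes / 1e12:.2f} TB")        # 1.40 TB
```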

 

  • Lossless compression allows the images to be fully restored after decompression, but is CPU-intensive and achieves only modest savings.
  • Lossy compression reduces the size of the original file dramatically, by simplifying the image or removing detail.

 

Video formats are specifications for compressing video into a stream. Examples include the MPEG family (1, 2 and 4), H.264 (MPEG-4 AVC), and H.265 (HEVC).

Codecs (coder-decoders) are the implementations of a specific video format. They use algorithms to shrink the size of the video file and to decompress it when asked. Examples include x264, x265, the FFmpeg codec libraries, DivX, Xvid, WME, VP3 through VP9, Sorenson, Blackbird, Dirac, libtheora, Huffyuv, and 3ivx.

Video formats and codecs are constantly being improved and updated as better hardware is developed and new devices come onto the market, and, most of all, because the public demands more.

Video container (or wrapper) formats define how the different elements of a stored file coexist, not how those elements are encoded. Containers usually hold multiple, interrelated video and audio tracks. Individual tracks can have metadata, such as aspect ratio or language. The container can also carry metadata, such as the video title, cover art, episode numbers, subtitles, and so on.

Because a stream must be played back with the same codec with which it was encoded, many container formats also identify the codec in use. Examples include MP4, Microsoft (AVI, ASF, WMV), Google WebM, Apple (M4V, MOV), Adobe Flash FLV, Matroska MKV, Ogg, 3GP, DivX, and RM.

 

Transcoding and Streaming

Streaming today uses adaptive bitrate (ABR) techniques

In a multi-screen world, you also need to scale the video to fit different devices. With transcoding, video is adapted to the size of the device, and the bitrate (bits per second of video) is adjusted in order to cap the amount of data to be transferred. The different streams are then packaged into the same container.
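The resulting set of renditions is often called a bitrate ladder. A minimal sketch, with illustrative rungs and a hypothetical selection helper (the specific resolutions and bitrates are assumptions, not values from any particular service):

```python
# An illustrative ABR ladder: each rendition pairs a resolution
# with a capped bitrate. The rungs below are typical example
# values, not a recommendation.
LADDER = [
    {"name": "1080p", "width": 1920, "height": 1080, "kbps": 5000},
    {"name": "720p",  "width": 1280, "height": 720,  "kbps": 3000},
    {"name": "480p",  "width": 854,  "height": 480,  "kbps": 1200},
    {"name": "240p",  "width": 426,  "height": 240,  "kbps": 400},
]

def rendition_for(bandwidth_kbps: float, headroom: float = 0.8) -> dict:
    """Pick the highest rung whose bitrate fits within the measured
    bandwidth, leaving some headroom; fall back to the lowest rung."""
    for rung in LADDER:  # ordered highest to lowest
        if rung["kbps"] <= bandwidth_kbps * headroom:
            return rung
    return LADDER[-1]

print(rendition_for(4500)["name"])  # 720p: 5000 > 4500 * 0.8 >= 3000
```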

Streaming today uses adaptive bitrate (ABR) techniques, whereby the stream is broken down into a sequence of small HTTP-based file downloads, each containing a short segment of the transport stream. A manifest accompanies the segments, carrying timing data, quality information, and a list of the other available streams.
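As an illustration, an HLS-style master manifest might look like the fragment below; the bitrates, resolutions, and playlist paths are invented for the example:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=854x480
480p/playlist.m3u8
```

Each entry points the player at a sub-playlist for one quality level, together with the bandwidth that level requires.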

At the start of the session, an ABR client downloads the manifest, an extended playlist containing the metadata for the various sub-streams that are available. As the stream plays, the client can switch between quality levels to adapt to the available data rate. Examples include MPEG's Dynamic Adaptive Streaming over HTTP (MPEG-DASH), Apple's HTTP Live Streaming (HLS), and Microsoft Smooth Streaming.
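The client-side switching logic can be sketched as a simple loop. The rung bitrates, the 80% safety margin, and the smoothing factor below are assumptions for illustration, not taken from any particular player:

```python
# Toy ABR client: estimate throughput from each segment download
# (smoothed with an exponential moving average) and pick the
# highest stream bitrate that fits with some headroom.
RUNGS = [400, 1200, 3000, 5000]  # available stream bitrates, kbps

def choose_rung(estimate_kbps: float) -> int:
    """Highest bitrate at or below ~80% of estimated throughput."""
    safe = estimate_kbps * 0.8
    candidates = [b for b in RUNGS if b <= safe]
    return max(candidates) if candidates else RUNGS[0]

def play(measured_throughputs, alpha=0.5):
    """Yield the rung chosen before each segment download."""
    estimate = measured_throughputs[0]
    for sample in measured_throughputs:
        estimate = alpha * sample + (1 - alpha) * estimate
        yield choose_rung(estimate)

# Throughput dips mid-session, then recovers; the player steps
# down one rung and back up again.
print(list(play([6000, 6000, 1500, 1500, 6000, 6000])))
# [3000, 3000, 3000, 1200, 3000, 3000]
```

The smoothing keeps the player from oscillating on every momentary dip, at the cost of reacting a segment or two late.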

In addition to the standards and formats, one also needs a server that can take the original video file, fragment it, and deliver it smoothly. This is done by a streaming (aka origin) server. Examples include products from Unified Streaming, Wowza, Adobe Media Server, Apache, NGINX Plus, and others.

 

Distribution, VOD vs. Live

In a standard Video on Demand (VOD) pipeline, the video is prepared beforehand and is not time critical. For example, content and metadata are ingested from various sources and stored in the cloud. The content is archived in the chosen format. Content is then distributed to the CDN, while the online data-storage or hosting service ensures availability.

In a VOD scenario, you have the video, the container, the format, and everything needed for delivery to the customer. For live video, you are attaching the customer to a potentially endless stream of data. In either case, customers' access lines vary enormously. The goal of the service provider is to balance buffering and content availability against acceptably high video quality.

But for live events, timing and synchronization are extremely important. Content must reach the end-user as soon as possible, and redundancy in the system must be designed such that any failover happens without the end-user noticing. The importance of synchronized time codes to redundancy cannot be over-emphasized: customers typically pay a premium to watch live video, and are unhappy with interruptions.

 

Keep on learning

In a satellite operation, the video expert has typically been someone with deep knowledge of industry standards, such as DVB or MPEG. In the streaming video world, the technology domain is broader and more dynamic. The new expert must understand multiple and evolving formats, codecs, containers, transcoders, streamers and CDNs – and, moreover, must know how these and other technologies are deployed to deliver both on-demand and time-sensitive live video.

You may not need to become a streaming video expert. But given the proliferation of OTT video, even within the satellite industry, and how it cuts across traditional video, networking, IT and business silos, you may need to become more familiar with the category. A little learning, in this case, is a good thing.

Published by SatMagazine, September 2017