Low Latency – the Future of Video Streaming over the Internet

Get Low: Here’s how OTT Service Providers can put an End to their Latency Woes

Picture this: you and a friend are watching the same football match, he on his mobile, you on your television. Your favourite teams are playing head to head and you're on the edge of your seat waiting for your team to score. Suddenly your friend calls to tell you your team has won. You're baffled, until two minutes later the winning goal finally appears on your screen. You're happy, but a little disappointed that you didn't get to share the moment with your friend, all because of a two-minute delay.

That delay in live streaming is known as latency. It occurs because encoded video takes time to travel over the Internet to a VDS. The latency is affected by the encoded media bitrate (the lower the bitrate, the lower the latency), by the latency and bandwidth of the Internet connection, and by the viewer's proximity (over the Internet) to the VDS.

While latency may not matter to viewers who watch video at their own convenience, it is critical for live sporting events, betting, reality-show voting, and real-time updates delivered to users from the venue. This makes it a challenge for OTT service providers competing with traditional transmission methods such as cable or satellite TV.

Some argue that shrinking the video chunk size, for example from 10-second segments to 1-second segments, is enough to solve the problem. While this does cut the latency from around 30 seconds to approximately 5 seconds, it puts a lot of pressure on the existing CDN and substantially increases the data processing involved.
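To see where those numbers come from, here is a minimal back-of-the-envelope sketch in Python. The buffer depth and overhead values are illustrative assumptions, not measurements from any particular deployment; the point is that segment-based players buffer a few full segments, so latency scales with segment duration.

```python
# Rough glass-to-glass latency model for segment-based streaming (HLS/DASH).
# All parameter values below are illustrative assumptions.

def estimated_latency(segment_seconds: float,
                      buffered_segments: int = 3,
                      encode_and_package_s: float = 1.0,
                      cdn_and_network_s: float = 1.0) -> float:
    """Players typically buffer a few full segments before starting playback,
    so latency scales roughly with segment duration times buffer depth."""
    return (segment_seconds * buffered_segments
            + encode_and_package_s
            + cdn_and_network_s)

print(estimated_latency(10))  # ~32 s with 10-second segments
print(estimated_latency(1))   # ~5 s with 1-second segments
```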

However, thanks to advancements in technology, there are simpler ways to bring latency in OTT services down to as low as 2 seconds, which is on par with traditional broadcast methods.

 

Low Latency CDN

Real-time streaming video applications require media packets to arrive in a timely manner. UDP (User Datagram Protocol) delivers the multimedia flow as a sequence of small packets. The stream is pushed across the network through several stages (gates) that manage packet loss and stabilize the stream, and an application-layer protocol implemented on top of UDP/IP provides the end-to-end network transport for the video.
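To make "a sequence of small packets" concrete, here is a minimal Python sketch that pushes a pre-encoded MPEG-TS file over UDP using the common pattern of seven 188-byte TS packets per datagram. The file name, destination address and bitrate are assumptions for the example, and the pacing is deliberately crude.

```python
import socket
import time

TS_PACKET = 188             # MPEG-TS packet size in bytes
PACKETS_PER_DATAGRAM = 7    # 7 * 188 = 1316 bytes, fits a typical MTU
DEST = ("239.0.0.1", 1234)  # assumed multicast group and port
BITRATE = 5_000_000         # assumed stream bitrate in bits per second

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

chunk = TS_PACKET * PACKETS_PER_DATAGRAM
send_interval = chunk * 8 / BITRATE  # pace datagrams to match the bitrate

with open("channel.ts", "rb") as f:  # assumed pre-encoded TS file
    while True:
        data = f.read(chunk)
        if not data:
            break
        sock.sendto(data, DEST)
        time.sleep(send_interval)    # crude pacing; real senders use precise clocks
```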

A low-latency CDN architecture ensures sufficient bandwidth between the CDN POPs and the publishing point to accommodate the sum of all profile levels of all channels. The TS profiles are determined as part of the project plan, according to the users' devices and the type of content.
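As a simple illustration of that sizing exercise, the sketch below adds up the bitrates of every profile of every channel to get the minimum bandwidth the publishing-point-to-POP links must carry. The channel names and bitrate ladders are made up for the example; real profiles come from the project plan.

```python
# Hypothetical TS profile ladders per channel, in kbps (illustrative values only).
channels = {
    "sports_1": [6000, 3000, 1500, 800],
    "news_1":   [4000, 2000, 1000],
    "movies_1": [8000, 4500, 2200, 1100, 600],
}

# The link between the publishing point and each POP must carry every profile
# of every channel simultaneously.
total_kbps = sum(sum(profiles) for profiles in channels.values())
print(f"Required publishing-point-to-POP bandwidth: {total_kbps / 1000:.1f} Mbps")
```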

Since most platforms still rely on their own players, which do not support playing TS over UDP and expect to receive their streams over HTTP chunked protocols, low latency is preserved on the client side by an SDK that converts the received TS stream into HLS or MPEG-DASH. This enables service providers to keep their existing players and continue playing the streams as usual.
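To illustrate the idea, here is a heavily simplified sketch of what such a client-side component might do: receive TS over UDP, cut it into short segments, and maintain a live HLS playlist that the device's existing player can fetch from a local HTTP server. It cuts segments by wall-clock time for brevity; a real SDK would align segment boundaries with keyframes, keep data in memory, and apply error correction. All names and parameters here are assumptions for illustration.

```python
import socket
import time

UDP_ADDR = ("0.0.0.0", 1234)  # assumed port where the low-latency CDN delivers TS
SEGMENT_SECONDS = 1           # short segments keep the player close to live
KEEP_SEGMENTS = 3             # sliding window advertised in the playlist

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(UDP_ADDR)
sock.settimeout(SEGMENT_SECONDS)

def write_playlist(first_seq: int, count: int) -> None:
    """Rewrite a minimal live HLS playlist covering the last few segments."""
    lines = ["#EXTM3U",
             "#EXT-X-VERSION:3",
             f"#EXT-X-TARGETDURATION:{SEGMENT_SECONDS}",
             f"#EXT-X-MEDIA-SEQUENCE:{first_seq}"]
    for seq in range(first_seq, first_seq + count):
        lines += [f"#EXTINF:{SEGMENT_SECONDS:.3f},", f"seg{seq}.ts"]
    with open("live.m3u8", "w") as f:
        f.write("\n".join(lines) + "\n")

seq = 0
while True:
    # Collect roughly one segment's worth of TS packets (time-based cut for brevity;
    # a real SDK would cut on keyframes).
    deadline = time.monotonic() + SEGMENT_SECONDS
    with open(f"seg{seq}.ts", "wb") as out:
        while time.monotonic() < deadline:
            try:
                data, _ = sock.recvfrom(2048)
            except socket.timeout:
                break            # no data this interval; close the segment early
            out.write(data)
    seq += 1
    write_playlist(max(0, seq - KEEP_SEGMENTS), min(seq, KEEP_SEGMENTS))
    # The existing HLS player simply points at live.m3u8 served by any local HTTP server.
```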

This combination of UDP transport, error correction and a low-latency client-side SDK can deliver latency as low as 1-2 seconds, along with the added benefit of frame accuracy across all users.
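The error-correction piece typically means forward error correction applied over the UDP packets, so an occasionally lost datagram can be rebuilt without a retransmission round trip. The toy sketch below shows the basic principle with a single XOR parity packet per group of equal-sized packets; it illustrates the concept, not the scheme used by any particular product.

```python
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(group):
    """One parity packet protects a group: parity = p0 XOR p1 XOR ... XOR pn."""
    return reduce(xor_bytes, group)

def recover(received, parity):
    """Rebuild at most one missing packet per group by XORing the survivors with the parity."""
    missing = [i for i, p in enumerate(received) if p is None]
    if len(missing) == 1:
        survivors = [p for p in received if p is not None]
        received[missing[0]] = reduce(xor_bytes, survivors + [parity])
    return received

packets = [bytes([i]) * 8 for i in range(4)]          # toy 8-byte "packets"
parity = make_parity(packets)
damaged = [packets[0], None, packets[2], packets[3]]  # packet 1 lost in transit
print(recover(damaged, parity)[1] == packets[1])      # True: the lost packet is rebuilt
```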

A low-latency CDN is an inevitable evolution for the OTT streaming industry, and it is only a matter of a few years before it becomes a mandatory requirement for any content service provider that offers live content.

 

Learn more about Vimmi’s Low Latency Solution:
