
Achieving time synchronization for millions of customers during live events at Prime Video

Prime Video invented a new way to set the producer reference time as the global time reference for live-event playback on customer devices.

At Prime Video, we often provide insightful X-Ray metadata during playback for live events, including dynamic in-game stats, play-by-play, full team rosters, player information, and team facts. But to prevent any spoilers from disrupting our customers’ viewing experience, we must apply time synchronization to all live game updates so that client devices don’t display updates before they occur in the live event. For example, showing a player injury statistic before the moment occurs would disrupt the viewing experience for our customers.

Prime Video also provides highlight annotation for replays during live events, such as when a player scores points and moves the ball into an opposing team’s end zone. We do this by identifying the time locations of interesting moments from one video stream and then applying those time offsets to all video streams delivered to customers. We designed and applied time synchronization between these different video streams with timestamp information before delivery, so that customers can watch the correct replay moments while the live stream is ongoing.
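Once every stream carries the same reference clock, mapping a moment identified in one stream to any other stream reduces to simple timestamp arithmetic. The following sketch (illustrative only, not Prime Video's actual implementation) shows the idea:

```python
from datetime import datetime, timedelta, timezone

def moment_position(moment_utc: datetime, stream_start_utc: datetime) -> timedelta:
    """Offset of an interesting moment within a stream whose frames are
    stamped against the same global reference clock."""
    return moment_utc - stream_start_utc

# A score identified at 19:05:30 UTC in the reference stream (hypothetical
# times) lands 5m30s into any stream whose playback started at 19:00:00 UTC.
goal = datetime(2019, 10, 6, 19, 5, 30, tzinfo=timezone.utc)
start = datetime(2019, 10, 6, 19, 0, 0, tzinfo=timezone.utc)
print(moment_position(goal, start))  # 0:05:30
```

Because the offset is computed against a shared clock rather than against each stream's own media timeline, the same highlight lands at the correct moment in every delivery path.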

As Prime Video continues to scale and deliver tens of thousands of live events every year, we investigated the challenge of time synchronization from camera-capture in live production to video playback on devices, partnered with AWS Elemental, and invented a generic automated solution for time synchronization that applies to both MPEG-DASH and HTTP Live Streaming (HLS). By using this new design and implementation, Prime Video has consistently achieved time synchronization across all customer-facing experiences and scenarios since 2019.

Media workflow latencies impact live-event playback on customer screens

Whenever Prime Video has a live-event broadcast, such as a football game during NFL Thursday Night Football (TNF), our media workflow has multiple steps beginning with camera-video capture in the stadium and ending with the video playback on customer devices. The following diagram provides a high-level overview of the media workflow using HTTP/TCP based streaming technology, such as MPEG-DASH and HLS.

The media workflow in Prime Video live events at a high level has many different steps, including camera capture at the stadium, fiber and satellite transmission, signal acquisition, AWS Elemental encoding, packaging and content stitching, video delivery through CDNs and ISPs, and video playback on devices.

A high-level overview of the media workflow during a live event.

The diagram shows the following workflow:

  1. Cameras capture in-game action during a live event at a stadium, and the video stream is sent to an outside broadcast truck.
  2. The video stream is sent from the stadium through different fiber and satellite links to multiple Prime Video signal acquisition partners. This process achieves redundancy, reliability, and high availability (HA).
  3. At signal acquisition sites, partners perform production workflows, including optional SCTE decoration for dynamic advertisement insertion. These onsite AWS Elemental encoders process live feeds based on different encoding profiles.
  4. AWS Elemental encoders output HLS video streams over dedicated and redundant AWS Direct Connect circuits from signal acquisition sites into multiple Availability Zones (AZs) on the Amazon Web Services (AWS) Cloud. In different AZs, AWS Elemental MediaPackage packages video data and, if required, AWS Elemental MediaTailor dynamically inserts video advertisements.
  5. Video data output from Elemental MediaPackage and Elemental MediaTailor (including both video fragments and manifests) are sent to multiple content delivery networks (CDNs) and internet service providers (ISPs), such as Amazon CloudFront, for reliable and efficient video delivery across the internet.
  6. Prime Video applications on devices receive the video data and play back the video stream for our customers.

Each step in this media workflow can introduce a different amount of latency, because there are always multiple alternative paths for redundancy and reliability: multiple encoding instances, packaging and content-stitching paths, delivery paths across CDNs and ISPs, and varying amounts of video buffering across players. This varying latency means that the same video content can arrive at customer devices at different times. Without time synchronization, this creates a poor customer experience in certain scenarios.
Our challenge was to find and define a reliable reference time for live-event video content, starting from camera capture at the venue, and to use it for time synchronization during video playback on customer devices.

Finding and standardizing the producer reference time for live events

We investigated our live-event media workflow at Prime Video and found that the latencies in network transmission from the live-event venue to signal acquisition partner sites were almost constant, with minor offsets in different paths.

We chose to synchronize our video streams to a global time at our partner sites, which could be synchronized to a Network Time Protocol (NTP) server or to the server system clock using Chrony in the AWS Elemental encoder. We made this the producer reference time (PRT), and it became the central reference for all downstream processing. In the future, during live production, we could change this approach and ensure that video content in live feeds captured at live-event venues is synchronized at frame accuracy to Global Positioning System (GPS) time, which would achieve even better synchronization accuracy.
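As a sketch, a host clock can be disciplined against NTP with a minimal Chrony configuration (illustrative directives only, not Prime Video's or AWS Elemental's actual configuration):

```
# /etc/chrony.conf (illustrative)
# Discipline the system clock against public NTP pool servers.
pool pool.ntp.org iburst
# Step the clock at startup if it is off by more than one second,
# for the first three clock updates.
makestep 1.0 3
```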

We made this PRT available and propagated it through the entire media workflow to video playback on customer device screens, in both HLS and MPEG-DASH.

PRT in HLS

When the PRT is available in the input stream to an AWS Elemental Live encoder and Elemental MediaPackage, and the AWS Elemental services are correctly configured, the outbound manifest from Elemental MediaPackage includes EXT-X-PROGRAM-DATE-TIME tags in the HLS manifest. The following snippet is an example of an HLS media playlist:
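An illustrative media playlist carrying the PRT in an EXT-X-PROGRAM-DATE-TIME tag (hypothetical segment names and timestamps; tag placement follows the HLS specification, RFC 8216):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:270
#EXT-X-PROGRAM-DATE-TIME:2019-10-06T19:00:00.000Z
#EXTINF:6.000,
segment_270.ts
#EXTINF:6.000,
segment_271.ts
```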

Metadata and highlight identification on top of a video stream can use the EXT-X-PROGRAM-DATE-TIME tags as the central reference time. HLS-capable video players on devices also receive these tags in the delivered HLS manifests. Using the definitions in the HLS specification, we can then achieve time synchronization between video segments and frames in the video stream, as well as X-Ray metadata and highlight annotations, because they all refer to the same PRT in the tags.
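On the player side, deriving the reference time of any frame from its segment's tag is a small calculation. The following is a minimal sketch (illustrative, not a player's actual implementation):

```python
from datetime import datetime, timedelta

def frame_reference_time(pdt: str, seconds_into_segment: float) -> datetime:
    """PRT of a video frame: the segment's EXT-X-PROGRAM-DATE-TIME value
    plus the frame's offset into that segment."""
    # Python's fromisoformat (pre-3.11) does not accept a trailing "Z".
    base = datetime.fromisoformat(pdt.replace("Z", "+00:00"))
    return base + timedelta(seconds=seconds_into_segment)

# A frame 2.5 s into a segment stamped 19:00:00 UTC maps to 19:00:02.5 UTC;
# an X-Ray update stamped later than this must not be displayed yet.
print(frame_reference_time("2019-10-06T19:00:00.000Z", 2.5))
```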

PRT in MPEG-DASH

At Prime Video, we work with AWS Elemental to create a SupplementalProperty element within each period, which contains the PRT for the beginning of the period. The following is the definition from section 11.2 of ANSI/SCTE 214-1 2016:

“Correspondence of PeriodStart to UTC time may be signaled using SupplementalProperty with @schemeIdUri set to "urn:scte:dash:utc-time". The value of @value attribute shall be the timestamp corresponding to PeriodStart, in format defined in RFC 3339.”

The following is an example of an MPEG-DASH manifest with the SupplementalProperty as used at Prime Video:
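An illustrative period (hypothetical timing values, bitrates, and segment names; the SupplementalProperty signaling follows SCTE 214-1):

```
<Period id="1" start="PT0S">
  <SupplementalProperty schemeIdUri="urn:scte:dash:utc-time"
                        value="2019-10-06T19:00:00Z"/>
  <AdaptationSet mimeType="video/mp4" segmentAlignment="true">
    <SegmentTemplate timescale="90000" presentationTimeOffset="0"
                     media="video_$RepresentationID$_$Time$.mp4">
      <SegmentTimeline>
        <S t="0" d="540000" r="2"/>
      </SegmentTimeline>
    </SegmentTemplate>
    <Representation id="720p" bandwidth="3000000" width="1280" height="720"/>
  </AdaptationSet>
</Period>
```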

The value of the @value attribute in the SupplementalProperty is the PRT of the beginning of the period. Getting the PRT of a video segment in the period requires a small calculation, shown in the following equation:

SegmentPRT = SP_PRT + (Segment_t − PTO) / timescale

SegmentPRT indicates the PRT of a segment in the period, SP_PRT is the PRT in the SupplementalProperty of this period, Segment_t is the value of @t for the segment in the SegmentTimeline, and PTO and timescale are the @presentationTimeOffset and @timescale applied to the segment, as defined in the MPEG-DASH specification. MPEG-DASH-capable video players on devices receive the SupplementalProperty in manifests and use the equation to achieve time synchronization between video segments and frames in a video stream, as well as X-Ray metadata and highlight annotations.
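The equation above can be sketched in a few lines of Python (illustrative values; not a player's actual implementation):

```python
from datetime import datetime, timedelta

def segment_prt(sp_prt: str, segment_t: int, pto: int, timescale: int) -> datetime:
    """SegmentPRT = SP_PRT + (Segment_t - PTO) / timescale."""
    # Python's fromisoformat (pre-3.11) does not accept a trailing "Z".
    base = datetime.fromisoformat(sp_prt.replace("Z", "+00:00"))
    return base + timedelta(seconds=(segment_t - pto) / timescale)

# A segment with @t=1080000 at @timescale=90000 and @presentationTimeOffset=0
# starts 12 s after the period's PRT.
print(segment_prt("2019-10-06T19:00:00Z", 1_080_000, 0, 90_000))
```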

Achieving time synchronization at scale

Since 2019, Prime Video has collaborated with AWS Elemental and deployed time synchronization for HTTP/TCP based streaming technologies. We have done this at scale on all eligible devices across our live events, helping achieve great customer experiences.

Our work went through multiple rounds of discussion and prototype experiments before we pushed it to production, and it was only possible with an incredible team of talented engineers and technologists. The same mechanism can also apply to many other interesting scenarios in live events at Prime Video in the future.

Stay tuned for more!

Senior Principal Engineer – Prime Video