
# Stream videos in five minutes
Upload and play back your video files in your application using Mux in five minutes or less.
## 1. Get an API Access Token

The Mux Video API uses a token key pair that consists of a **Token ID** and **Token Secret** for authentication. If you haven't already, generate a new Access Token in the [Access Token settings](https://dashboard.mux.com/settings/access-tokens) of your Mux account dashboard.

<Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} alt="Mux access token settings" />

You'll be presented with a form to create your new Access Token.

<Image src="/docs/images/new-access-token.png" width={545} height={233} alt="Mux Video access token permissions" />

* **Access Tokens** belong to an **Environment** — a container for the various Access Tokens, Signing Keys, and assets that you'll come to add to Mux. For this guide, you can keep the **Production** environment selected.
* **Access Tokens** can have varying permissions to control what kinds of changes they have the ability to make. For this guide, your **Access Token** should have Mux Video **Read** and **Write** permissions.
* You can give your **Access Token** an internal-only name like "Onboarding" so you know where you've used it within your application.

Now, click the **Generate token** button.

You'll be presented with your new **Access Token ID** and **Secret Key**.

<Image src="/docs/images/settings-generated-access-token.png" width={545} height={233} alt="Mux access token environment" />

Once you have your new **Access Token ID** and **Secret Key**, you're ready to upload your first video.

## 2. POST a video

Videos stored in Mux are called <ApiRefLink href="/docs/api-reference/video/assets">assets</ApiRefLink>. To create your first video asset, you need to send a <ApiRefLink href="/docs/api-reference/video/assets/create-asset">POST request to the /assets endpoint</ApiRefLink> and set the `input` value to the URL of a video file that's accessible online.

Here are a few demo videos you can use that are stored on common cloud storage services:

* Amazon S3: https://muxed.s3.amazonaws.com/leds.mp4
* Google Drive: https://drive.google.com/uc?id=13ODlJ-Dxrd7aJ7jy6lsz3bwyVW-ncb3v
* Dropbox: https://www.dropbox.com/scl/fi/l2sm1zyk6pydtosk3ovwo/get-started.mp4?rlkey=qjb34b0b7wgjbs5xj9vn4yevt&dl=0

To start making API requests to Mux, you might want to install one of our officially supported API SDKs. These are lightweight wrapper libraries that use your API credentials to make authenticated HTTP requests to the Mux API.

```elixir

# mix.exs
def deps do
  [
    {:mux, "~> 1.8.0"}
  ]
end

```

```go

go get github.com/muxinc/mux-go

```

```node

# npm
npm install @mux/mux-node --save

# yarn
yarn add @mux/mux-node

```

```php

# composer.json
{
    "require": {
        "muxinc/mux-php": ">=0.0.1"
    }
}

```

```python

# Via pip
pip install git+https://github.com/muxinc/mux-python.git

# Via source
git clone https://github.com/muxinc/mux-python.git
cd mux-python
python setup.py install --user

```

```ruby

gem 'mux_ruby'

```



<Callout type="info">
  For an example of how to make API Requests from your local environment, see the [Make API Requests](/docs/core/make-api-requests) guide.
</Callout>

<CodeExamples product="video" example="createAsset" />
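If you'd rather skip the SDKs, you can call the REST endpoint directly with any HTTP client. Here's a minimal sketch using only the Python standard library — the `build_create_asset_request` helper is illustrative, and the extra `playback_policy` and `video_quality` values are common starting choices, not requirements:

```python
import base64
import json
import os
import urllib.request

MUX_ASSETS_URL = "https://api.mux.com/video/v1/assets"

def build_create_asset_request(token_id, token_secret, input_url):
    """Build an authenticated POST to the assets endpoint (HTTP Basic auth)."""
    body = json.dumps({
        "input": input_url,
        "playback_policy": ["public"],
        "video_quality": "basic",
    }).encode()
    credentials = base64.b64encode(f"{token_id}:{token_secret}".encode()).decode()
    return urllib.request.Request(
        MUX_ASSETS_URL,
        data=body,
        headers={
            "Authorization": f"Basic {credentials}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Only send the request if credentials are actually available.
if __name__ == "__main__" and os.environ.get("MUX_TOKEN_ID") and os.environ.get("MUX_TOKEN_SECRET"):
    request = build_create_asset_request(
        os.environ["MUX_TOKEN_ID"],
        os.environ["MUX_TOKEN_SECRET"],
        "https://muxed.s3.amazonaws.com/leds.mp4",
    )
    with urllib.request.urlopen(request) as response:
        print(json.load(response))
```

Authentication is plain HTTP Basic: your Token ID is the username and your Token Secret is the password, so the same request works from `curl` or any other client.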

The response will include an **Asset ID** and a **Playback ID**.

* Asset IDs are used to manage assets using `api.mux.com` (e.g. to read or delete an asset).
* <ApiRefLink href="/docs/api-reference/video/playback-id">Playback IDs</ApiRefLink> are used to stream an asset to a video player through `stream.mux.com`. You can add multiple playback IDs to an asset to create playback URLs with different viewing permissions, and you can delete playback IDs to remove access without deleting the asset.

```json
{
  "data": {
    "status": "preparing",
    "playback_ids": [
      {
        "policy": "public",
        "id": "TXjw00EgPBPS6acv7gBUEJ14PEr5XNWOe"
      }
    ],
    "video_quality": "basic",
    "mp4_support": "none",
    "master_access": "none",
    "id": "01itgOBvgjAbES7Inwvu4kEBtsQ44HFL6",
    "created_at": "1607876845"
  }
}
```

<Callout type="info">
  Mux does not store the original file in its exact form, so if your original quality files are important to you, don't delete them after submitting them to Mux.
</Callout>

## 3. Wait for `ready`

As soon as you make the `POST` request, Mux begins downloading and processing the video. For shorter files, this often takes just a few seconds. Very large files over poor connections may take a few minutes (or longer).

When the video is ready for playback, the asset `status` changes to `ready`. You should wait until the asset status is `ready` before you attempt to play the video.

The best way to be notified of asset status updates is via **webhooks**. Mux can send a webhook notification as soon as the asset is ready. See the [webhooks guide](/docs/core/listen-for-webhooks) for details.

If you can't use webhooks, you can manually **poll** the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">asset API</ApiRefLink> to check the asset status. Note that polling only works at low volume.

## Try an example request

<CodeExamples product="video" example="retrieveAsset" />

Please don't poll this API more than once per second.
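If you do poll, keep the loop rate-limited and stop on a terminal status. A minimal sketch — `fetch_status` here is a hypothetical callable standing in for however you retrieve the asset (it isn't part of any Mux SDK):

```python
import time

def wait_for_ready(fetch_status, interval=1.0, timeout=300.0):
    """Poll until the asset reaches a terminal status ('ready' or 'errored').

    fetch_status is any zero-argument callable that returns the asset's
    current status string, e.g. a wrapper around the retrieve-asset endpoint.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in ("ready", "errored"):
            return status
        time.sleep(interval)  # respect the once-per-second limit
    raise TimeoutError("asset was still processing when the timeout expired")
```

Keeping `interval` at one second or more keeps you inside the polling guidance above; webhooks remain the better option for anything beyond low volume.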

## 4. Watch your video

To play back an asset, create a playback URL using the `PLAYBACK_ID` you received when you created the asset.

```curl
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

## Preview in a player

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

struct ContentView: View {

    private let player = AVPlayer(
        url: URL.makePlaybackURL(
            playbackID: playbackID
        )
    )

    var body: some View {
        //  VideoPlayer comes from SwiftUI
        //  Alternatively, you can use AVPlayerLayer or AVPlayerViewController
        VideoPlayer(player: player)
            .onAppear() {
                player.play()
            }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

extension URL {
    static func makePlaybackURL(
        playbackID: String
    ) -> URL {
        guard let baseURL = URL(
            string: "https://stream.mux.com"
        ) else {
            preconditionFailure("Invalid base URL string")
        }

        guard let playbackURL = URL(
            string: "\(playbackID).m3u8",
            relativeTo: baseURL
        ) else {
            preconditionFailure("Invalid playback URL component")
        }

        return playbackURL
    }
}

```



See the [playback guide](/docs/guides/play-your-videos) for more information about how to integrate with a video player.

## Preview with `stream.new`

[Stream.new](https://stream.new/) is an open source project by Mux that allows you to add a video and get a shareable link to stream it.

Go to `stream.new/v/{PLAYBACK_ID}` to preview your streaming video. This shareable URL is generated automatically from the playback ID. Copy the link below and open it in a browser to view your video.

```
https://stream.new/v/{PLAYBACK_ID}
```

After you have everything working, [integrate Mux Data](/docs/guides/track-your-video-performance) with your player to monitor playback performance.

## 5. Manage your Mux assets

After you have assets created in your Mux environment, you may find some of these other endpoints handy:

* <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Create an asset</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/list-assets">List assets</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/get-asset">Retrieve an asset</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/delete-asset">Delete an asset</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/get-asset-input-info">Retrieve asset input info</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/create-asset-playback-id">Create asset playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/get-asset-playback-id">Retrieve asset playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/delete-asset-playback-id">Delete asset playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/update-asset-mp4-support">Update MP4 support on asset</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/update-asset-master-access">Update master access on asset</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">Create an asset track</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/assets/delete-asset-track">Delete an asset track</ApiRefLink>

More Video methods and descriptions are available at the <ApiRefLink href="/docs/api-reference/video">API Docs</ApiRefLink>.

# Next Steps

<GuideCard
  title="Play your videos"
  description="Set up your iOS, Android, or web application to start playing your Mux assets"
  links={[
    {title: "Read the guide", href: "/docs/guides/play-your-videos"},
  ]}
/>

<GuideCard
  title="Preview your video"
  description="Now that you have Mux assets, build rich experiences into your application by previewing your videos with Thumbnails and Storyboards"
  links={[
    {title: "Read the guide", href: "/docs/guides/get-images-from-a-video"},
  ]}
/>

<GuideCard
  title="Integrate Mux Data"
  description="Add the Mux Data SDK to your player and start collecting playback performance metrics."
  links={[
    {title: "Read the guide", href: "/docs/guides/track-your-video-performance"},
  ]}
/>


# Mux fundamentals
A reference guide covering the essential concepts, terminology, and components you need to understand when building with Mux.
Whether you're just getting started with Mux or need a quick refresher on how the pieces fit together, this guide covers the fundamental concepts you'll encounter when building video, audio, and live streaming applications.

# Quick reference

| Term | Description |
| :--- | :---------- |
| [**Organization**](#organizations) | The top-level account container. You can belong to multiple organizations, each with its own billing, team members, and environments. |
| [**Environment**](#environments) | A container within an organization for organizing your Mux resources (assets, live streams, API tokens, etc.). Each organization can have multiple environments. |
| [**Access Token**](#access-tokens) | A credential pair (Token ID + Token Secret) used to authenticate API requests. Scoped to a single environment. |
| [**Asset**](#assets) | A video or audio file that has been uploaded to Mux and processed for streaming playback. |
| [**Playback ID**](#playback-ids) | A unique identifier used to stream an asset or live stream to viewers. |
| [**Live Stream**](#live-streams) | A resource representing a live broadcast that can receive RTMP/SRT input and deliver to viewers. |
| [**Stream Key**](#live-streams) | A secret credential that allows a broadcaster to push video to a specific live stream. |
| [**Signing Key**](#signing-keys) | A public/private key pair used to create signed tokens (JWTs) for secure playback. |
| [**Webhook**](#webhooks) | An HTTP callback that Mux sends to your server when events occur (e.g., asset ready, live stream started). |

# Organizations

An **organization** is your top-level Mux account. It's the highest container in the Mux hierarchy and contains everything else: environments, team members, and billing settings.

Key things to know about organizations:

* **You can belong to multiple organizations.** This is useful if you work with different companies or clients, each with their own Mux account.
* **Each organization has its own billing.** Usage charges are tracked and billed per organization.
* **Team members are managed at the organization level.** You can invite collaborators and assign roles (Admin, Member) within each organization.
* **Organizations contain environments.** All your media resources live inside environments, which live inside organizations.

You can switch between organizations and create new ones from the [Mux Dashboard](https://dashboard.mux.com/organizations).

# Environments

An **environment** is a container within an organization for organizing your Mux resources. Each environment has its own isolated set of assets, live streams, access tokens, signing keys, and webhooks.

Common use cases for multiple environments:

* Separate **development** and **production** resources
* Isolate resources for different websites or domains (e.g., `site1.com`, `site2.com`)
* Organize by project or use case (e.g., CMS media, marketing site, customer uploads)
* Keep test data separate from production content

<Callout type="warning" title="Environment isolation">
  Resources are scoped to their environment. An access token created in Development cannot be used to manage assets in Production, and webhooks configured for one environment won't fire for events in another.
</Callout>

You can view and manage environments in the [Mux Dashboard](https://dashboard.mux.com/organizations).

# Access Tokens

**Access tokens** are credentials that authenticate your API requests to Mux. Each token consists of two parts:

| Part | Description |
| :--- | :---------- |
| **Token ID** | The "username" portion of your credential. Safe to log (but not expose publicly). |
| **Token Secret** | The "password" portion. Keep this secure and never expose it in client-side code. |

<Callout type="info" title="Secret recovery">
  Mux only stores a hash of your token secret. If you lose it, you'll need to create a new access token.
</Callout>

<Callout type="warning" title="Server-side only">
  Mux API requests must be made from a server, not from client-side code. The API does not support CORS, and exposing your credentials in a browser or mobile app is a security risk.
</Callout>
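As a concrete sketch of how the pair is used: requests to `api.mux.com` authenticate with HTTP Basic auth, where the Token ID is the username and the Token Secret is the password. The helper name below is illustrative:

```python
import base64

def mux_auth_header(token_id: str, token_secret: str) -> dict:
    """HTTP Basic auth header for api.mux.com.

    The credential is the base64 encoding of 'TOKEN_ID:TOKEN_SECRET'.
    """
    credentials = base64.b64encode(f"{token_id}:{token_secret}".encode()).decode()
    return {"Authorization": f"Basic {credentials}"}
```

Most HTTP clients can do this for you (e.g. `curl -u "$MUX_TOKEN_ID:$MUX_TOKEN_SECRET"`), and the official SDKs handle it automatically.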

## Token permissions

When creating an access token, you configure which permissions it has:

| Permission | Use case |
| :--------- | :------- |
| **Mux Video Read** | Retrieve information about assets and live streams |
| **Mux Video Write** | Create, update, and delete assets and live streams |
| **Mux Data Read** | Access playback performance metrics |
| **Mux Data Write** | Create Data annotations |
| **System Read** | View signing keys and other system resources |
| **System Write** | Create and manage signing keys |

For most use cases when getting started, you'll want **Mux Video Read** and **Write** permissions.

You can create and manage access tokens in the [Mux Dashboard](https://dashboard.mux.com/settings/access-tokens).

**Learn more:** [Make API requests](/docs/core/make-api-requests) | [Use an SDK](/docs/core/sdks)

# Assets

An **asset** is a video or audio file that has been ingested into Mux and processed for adaptive bitrate streaming. When you create an asset, Mux:

1. Downloads the file from your provided URL (or receives it via [direct upload](/docs/guides/upload-files-directly))
2. Transcodes it into multiple quality levels
3. Packages it for HLS streaming
4. Generates a unique **asset ID**

```json
// Example asset response
{
  "data": {
    "id": "01itgOBvgjAbES7Inwvu4kEBtsQ44HFL6",
    "status": "ready",
    "playback_ids": [
      {
        "id": "TXjw00EgPBPS6acv7gBUEJ14PEr5XNWOe",
        "policy": "public"
      }
    ],
    "duration": 120.5,
    "aspect_ratio": "16:9"
  }
}
```

## Asset status lifecycle

Assets progress through several statuses:

| Status | Description |
| :----- | :---------- |
| `preparing` | Mux is downloading and processing the file |
| `ready` | The asset is ready for playback |
| `errored` | Something went wrong during processing |

Rather than polling the API to check status, use [webhooks](/docs/core/listen-for-webhooks) to be notified when an asset is ready.

**Learn more:** [Stream videos in five minutes](/docs/core/stream-video-files) | <ApiRefLink href="/docs/api-reference/video/assets">Assets API</ApiRefLink>

# Playback IDs

A **playback ID** is what you use to actually stream content to viewers. While asset IDs are used to *manage* your content (via `api.mux.com`), playback IDs are used to *stream* your content (via `stream.mux.com`).

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

## Playback policies

Each playback ID has a policy that controls how it can be accessed:

| Policy | Description |
| :----- | :---------- |
| `public` | Anyone with the URL can access the content |
| `signed` | Viewers need a valid JWT to watch |

An asset can have multiple playback IDs with different policies. This lets you, for example, have a public playback ID for trailers and a signed playback ID for the full content.

<Callout type="info" title="Multiple playback IDs">
  You can add and remove playback IDs without affecting the underlying asset. This is useful for revoking access without re-encoding your content.
</Callout>

**Learn more:** [Play your videos](/docs/guides/play-your-videos) | [Secure video playback](/docs/guides/secure-video-playback)

# Live streams

A **live stream** represents a live broadcast channel. Unlike assets (which are created from existing files), live streams receive real-time input and deliver it to viewers with low latency.

## Key live stream components

| Component | Description |
| :-------- | :---------- |
| **Stream Key** | A secret credential broadcasters use to connect their encoder to Mux |
| **RTMP URL** | The ingest endpoint (`rtmp://global-live.mux.com:5222/app`) |
| **SRT URL** | Alternative ingest endpoint for SRT protocol |
| **Playback ID** | Used to stream to viewers (same concept as asset playback IDs) |

<Callout type="warning" title="Keep stream keys secret">
  Anyone with your stream key can broadcast to your live stream. Treat it like a password.
</Callout>

## Live stream lifecycle

| Status | Description |
| :----- | :---------- |
| `idle` | No one is broadcasting; waiting for input |
| `active` | A broadcaster is connected and viewers can watch |
| `disabled` | The live stream has been disabled and won't accept connections |

When a live stream ends, Mux automatically creates a new asset from the recording (if recording is enabled).

**Learn more:** [Configure broadcast software](/docs/guides/configure-broadcast-software) | [Handle disconnections](/docs/guides/handle-live-stream-disconnects) | <ApiRefLink href="/docs/api-reference/video/live-streams">Live Streams API</ApiRefLink>

# Signing keys

**Signing keys** are cryptographic key pairs used to generate JWTs (JSON Web Tokens) for [secure video playback](/docs/guides/secure-video-playback). When you have assets or live streams with `signed` playback policies, you need signing keys to create valid playback tokens.

| Component | Description |
| :-------- | :---------- |
| **Key ID** | A unique identifier for the signing key |
| **Private Key** | Used by your server to sign JWTs. Keep this secret. |

Your server uses the private key to create short-lived tokens that grant access to specific content. The token can include claims for:

* **Expiration time** - When the token becomes invalid
* **Playback restrictions** - Additional rules like allowed domains
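As a sketch of what your server assembles before signing — the helper below is illustrative, and while the claim names follow Mux's JWT scheme (`sub` for the playback ID, `aud` for the resource type, `exp` for expiry), check the secure playback guide for the authoritative list:

```python
import time

def playback_token_claims(playback_id, audience="v", ttl_seconds=600):
    """Claims for a playback JWT; `aud` is "v" for video playback."""
    return {
        "sub": playback_id,                      # which playback ID the token unlocks
        "aud": audience,                         # resource type being requested
        "exp": int(time.time()) + ttl_seconds,   # short-lived by design
    }

# Signing happens with your private key, e.g. with the PyJWT library
# (an assumed dependency), passing the signing key ID as the `kid` header:
#
#   token = jwt.encode(playback_token_claims(PLAYBACK_ID), private_key_pem,
#                      algorithm="RS256", headers={"kid": SIGNING_KEY_ID})
#   url = f"https://stream.mux.com/{PLAYBACK_ID}.m3u8?token={token}"
```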

<Callout type="info" title="Not the same as access tokens">
  Signing keys and access tokens serve different purposes:

  * **Access tokens** authenticate your server-to-Mux API requests
  * **Signing keys** create tokens that authenticate viewer playback requests
</Callout>

You can create and manage signing keys in the [Mux Dashboard](https://dashboard.mux.com/settings/signing-keys).

**Learn more:** [Secure video playback](/docs/guides/secure-video-playback) | <ApiRefLink href="/docs/api-reference/system/signing-keys">Signing Keys API</ApiRefLink>

# Webhooks

**Webhooks** are HTTP callbacks that Mux sends to your application when events occur. Instead of repeatedly polling the API to check if an asset is ready, you configure a webhook URL and Mux notifies you automatically.

Common webhook events:

| Event | Description |
| :---- | :---------- |
| `video.asset.ready` | An asset has finished processing and is ready for playback |
| `video.asset.errored` | An asset failed to process |
| `video.live_stream.active` | A live stream has started broadcasting |
| `video.live_stream.idle` | A live stream has stopped broadcasting |
| `video.upload.asset_created` | A direct upload has completed and created an asset |

<Callout type="warning" title="Environment-scoped">
  Webhooks are configured per environment. Make sure your webhook is set up in the same environment where your resources are created.
</Callout>
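When handling these callbacks, verify the `Mux-Signature` header before trusting the payload. A minimal sketch, assuming the header format `t=<timestamp>,v1=<hex digest>` where the digest is an HMAC-SHA256 of `"<timestamp>.<raw body>"` keyed with your webhook signing secret (see the signature verification guide for the authoritative scheme):

```python
import hashlib
import hmac

def verify_mux_signature(raw_body: bytes, signature_header: str, secret: str) -> bool:
    """Check a header of the form 't=<ts>,v1=<hex digest>' against the raw body."""
    parts = dict(p.split("=", 1) for p in signature_header.split(","))
    payload = f"{parts['t']}.".encode() + raw_body
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, parts["v1"])
```

Use the raw request bytes, not a re-serialized JSON body — any whitespace difference changes the digest.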

**Learn more:** [Listen for webhooks](/docs/core/listen-for-webhooks) | [Verify webhook signatures](/docs/core/verify-webhook-signatures)

# IDs at a glance

Mux uses several different types of identifiers. Here's a quick reference:

| ID Type | Format Example | Purpose |
| :------ | :------------- | :------ |
| **Organization ID** | `abc123` | Identify your organization |
| **Environment ID** | `j0863n` | Identify specific environments within an organization |
| **Asset ID** | `01itgOBvgj...` | Identify and manage assets via the API |
| **Playback ID** | `TXjw00EgPB...` | Stream content to viewers |
| **Live Stream ID** | `aA02skpHX...` | Identify and manage live streams via the API |
| **Upload ID** | `OA02dANZ...` | Track direct upload status |
| **Token ID** | `44c819de-4add-...` | Identify access tokens (part of API auth) |
| **Signing Key ID** | `JjPXgkqO...` | Identify signing keys for JWT creation |

# SDKs

Mux provides official SDKs for several languages that handle authentication and make it easier to work with the API:

* [Node.js](/docs/integrations/mux-node-sdk)
* [Python](/docs/integrations/mux-python-sdk)
* [Ruby](/docs/integrations/mux-ruby-sdk)
* [PHP](/docs/integrations/mux-php-sdk)
* [Java](/docs/integrations/mux-java-sdk)
* [C# / .NET](/docs/integrations/mux-csharp-sdk)
* [Elixir](/docs/integrations/mux-elixir-sdk)

For client-side playback, see [Mux Player](/docs/guides/mux-player-web) and the various player SDK guides.

**Learn more:** [Use an SDK](/docs/core/sdks)

# API and webhook specifications

Mux publishes machine-readable specifications for both the API and webhook events:

| Specification | URL | Description |
| :------------ | :-- | :---------- |
| **Combined spec** | [`mux.com/full-combined-spec.json`](https://www.mux.com/full-combined-spec.json) | All API endpoints and webhook events in one spec |
| **API spec** | [`mux.com/api-spec.json`](https://www.mux.com/api-spec.json) | Core API endpoints only |
| **Webhook spec** | [`mux.com/webhook-spec.json`](https://www.mux.com/webhook-spec.json) | Webhook event schemas only |
| **Image API spec** | [`mux.com/image-spec.json`](https://www.mux.com/image-spec.json) | Thumbnail, animated GIF, and storyboard endpoints |
| **Streaming API spec** | [`mux.com/stream-spec.json`](https://www.mux.com/stream-spec.json) | HLS and MP4 streaming playback endpoints |
| **Engagement Counts spec** | [`mux.com/stats-spec.json`](https://www.mux.com/stats-spec.json) | Real-time view and viewer count endpoints |

These are useful for generating API clients, importing into tools like [Postman](/docs/core/postman), validating webhook payloads, or integrating with any tooling that supports OpenAPI. Use the combined spec if you want everything in one file.

# What's next?

Now that you understand the fundamentals, here are some recommended next steps:

<GuideCard
  title="Stream videos in five minutes"
  description="Upload your first video to Mux and play it back in your application."
  links={[
    {title: "Read the guide", href: "/docs/core/stream-video-files"},
  ]}
/>

<GuideCard
  title="Listen for webhooks"
  description="Set up webhooks to receive real-time notifications when events occur."
  links={[
    {title: "Read the guide", href: "/docs/core/listen-for-webhooks"},
  ]}
/>

<GuideCard
  title="Secure video playback"
  description="Learn how to protect your content with signed URLs and playback restrictions."
  links={[
    {title: "Read the guide", href: "/docs/guides/secure-video-playback"},
  ]}
/>


# Getting started for AI agents
A reference for LLMs and AI agents writing code against the Mux API.

This guide is written for LLMs and AI coding agents. It contains everything you need to write working code against the Mux API on the first try.

## CLI

The [Mux CLI](/docs/integrations/mux-cli) (`@mux/cli`) lets you manage Mux resources directly from the terminal. It is useful for quick operations, scripting, automation, and CI/CD pipelines.

Install:

```bash
npm install -g @mux/cli    # global install
# or run directly without installing:
npx @mux/cli
# or install via Homebrew:
brew install muxinc/tap/mux
```

After installing, authenticate with `mux login` or set `MUX_TOKEN_ID` and `MUX_TOKEN_SECRET` environment variables. Always pass `--agent` to optimize output for AI agents (includes JSON output).

Common commands:

| Command | What it does |
| :--- | :--- |
| `mux assets create --input-url URL` | Create an asset from a URL |
| `mux assets create --file video.mp4` | Upload a local file |
| `mux assets list` | List assets |
| `mux assets get ${ASSET_ID}` | Get asset details |
| `mux assets delete ${ASSET_ID}` | Delete an asset |
| `mux live-streams create` | Create a live stream |
| `mux uploads create` | Create a direct upload URL |
| `mux sign ${PLAYBACK_ID}` | Generate a signed playback URL |

Use `--wait` to block until an asset is ready:

```bash
mux assets create --input-url "https://example.com/video.mp4" --wait --agent
```

For the full list of commands, see the [CLI documentation](/docs/integrations/mux-cli).

## Common resources

* [Authentication and making API requests](/docs/core/make-api-requests) — how to authenticate with the Mux API
* [Rate limits](/docs/core/make-api-requests#api-rate-limits) — API rate limit details
* [Full API reference](/docs/api-reference/video) — complete endpoint documentation
* [Webhooks guide](/docs/core/listen-for-webhooks) — listen for events like `video.asset.ready`
* [Mux Player guide](/docs/guides/play-your-videos) — the easiest way to play Mux videos
* [Direct uploads guide](/docs/guides/upload-files-directly) — let users upload video files from the browser
* [Stream video files guide](/docs/core/stream-video-files) — ingest a video, wait for it to be ready, get a playback URL
* [Live streaming guide](/docs/guides/start-live-streaming) — set up live streams
* [Secure video playback](/docs/guides/secure-video-playback) — signed playback for private content
* [Static MP4 renditions guide](/docs/guides/enable-static-mp4-renditions) — downloadable MP4/M4A files
* [MCP Server](/docs/integrations/mcp-server) — use Mux from AI tools that support MCP
* [Docs for LLMs](/docs/core/llms-txt) — machine-readable documentation bundles
* [OpenAPI spec](https://www.mux.com/api-spec.json) — full API specification

## Instructions for your human

Before you can make API calls, your user needs to set up a Mux account and provide you with API credentials.

1. **Sign up for Mux** at the [Mux dashboard](https://dashboard.mux.com/signup). No credit card is required. The free tier includes enough usage to develop and test.
2. **Go to the [Mux dashboard](https://dashboard.mux.com)**, then navigate to **Settings → Access Tokens**.
3. **Create an API access token** with all permissions — **Mux Video**, **Mux Data**, and **System** — with read and write access.
4. **Save the Access Token ID and Access Token Secret.** The secret is only shown once.

Store the credentials as environment variables:

```bash
MUX_TOKEN_ID=your-token-id
MUX_TOKEN_SECRET=your-token-secret
```

For details on how authentication works, see the [make API requests guide](/docs/core/make-api-requests).

<Callout type="warning">
  Never expose API credentials in client-side code. All Mux API calls must be made from a server.
</Callout>

## Two main ways to use Mux

### Website embed

This is for when you have a small, constrained number of videos — a hero video or background video on your homepage, a demo reel, or a few dozen videos across your site. The key characteristic is that the set of videos doesn't change often and is manageable enough to hardcode.

In this case, you can hardcode playback IDs directly in your code or extract them into a JSON config file with metadata:

```json
{
  "videos": [
    { "title": "Hero Video", "playbackId": "TXjw00EgPBPS6acv7gBUEJ14PEr5XNWOe" },
    { "title": "Product Demo", "playbackId": "a4nOgR00sKz6cMWLeM5skT8ePBn7U6gC5" }
  ]
}
```

Then use [Mux Player](/docs/guides/play-your-videos) to embed each video:

```jsx
import MuxPlayer from '@mux/mux-player-react';

<MuxPlayer playbackId="TXjw00EgPBPS6acv7gBUEJ14PEr5XNWOe" />
```

To create your assets and get playback IDs, use the CLI (`mux assets create --input-url URL --wait --agent`) or the [stream video files guide](/docs/core/stream-video-files).

### User uploaded

This is for when videos are uploaded dynamically as part of your application. Common scenarios include:

* **Admin-managed content** — an admin area where authorized users upload and manage videos (e.g., a course platform, media library, or CMS)
* **User-generated content (UGC)** — end users upload their own videos (e.g., a social platform, portfolio site, or community forum)
* **Programmatic ingestion** — videos are created automatically from external sources or pipelines

For these use cases, you will need to:

1. **Accept uploads** — use [direct uploads](/docs/guides/upload-files-directly) to let users upload video files from the browser, or create assets server-side from URLs using the [stream video files guide](/docs/core/stream-video-files)
2. **Listen for events** — use [webhooks](/docs/core/listen-for-webhooks) to know when a video is ready for playback (`video.asset.ready`), when it errors, or when it's deleted
3. **Persist video data** — store asset IDs, playback IDs, status, and metadata in your database (see [Persisting Mux data](#persisting-mux-data) below)
4. **Play videos** — use [Mux Player](/docs/guides/play-your-videos) with the stored playback ID
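For step 1, the server-side half of a direct upload is a single authenticated POST that returns a one-time `url` the browser can PUT the file to. A minimal sketch using the Python standard library — the parameter values are illustrative defaults, so see the direct uploads guide for the full options:

```python
import base64
import json
import urllib.request

def build_direct_upload_request(token_id, token_secret, cors_origin="*"):
    """Build a POST to /video/v1/uploads requesting a one-time upload URL."""
    body = json.dumps({
        "cors_origin": cors_origin,  # lock this to your site's origin in production
        "new_asset_settings": {
            "playback_policy": ["public"],
            "video_quality": "basic",
        },
    }).encode()
    credentials = base64.b64encode(f"{token_id}:{token_secret}".encode()).decode()
    return urllib.request.Request(
        "https://api.mux.com/video/v1/uploads",
        data=body,
        headers={
            "Authorization": f"Basic {credentials}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The response also includes an upload ID you can store to correlate the eventual `video.upload.asset_created` webhook with the user who uploaded.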

## SDKs

Official server-side SDKs:

| Language | Package | Docs |
| :--- | :--- | :--- |
| Node.js | `@mux/mux-node` | [Guide](/docs/integrations/mux-node-sdk) |
| Python | `mux_python` | [Guide](/docs/integrations/mux-python-sdk) |
| Ruby | `mux_ruby` | [Guide](/docs/integrations/mux-ruby-sdk) |
| PHP | `mux-php` | [Guide](/docs/integrations/mux-php-sdk) |
| Go | `mux-go` | [GitHub](https://github.com/muxinc/mux-go) |
| Java | `com.mux:mux-sdk-java` | [Guide](/docs/integrations/mux-java-sdk) |
| C# | `Mux.Csharp.Sdk` | [Guide](/docs/integrations/mux-csharp-sdk) |
| Elixir | `mux` | [Guide](/docs/integrations/mux-elixir-sdk) |

## Sensible defaults

Unless the user specifies otherwise, use these values:

| Parameter | Default to use | Notes |
| :--- | :--- | :--- |
| `playback_policy` | `["public"]` | Use `"signed"` only if the user needs secure/private playback |
| `video_quality` | `"basic"` | No encoding costs and great for most use cases. Use `"plus"` if the user needs higher quality encoding |
| `static_renditions` | Do not set | Only set if the user explicitly needs downloadable MP4/M4A files. See the [static renditions guide](/docs/guides/enable-static-mp4-renditions) |
| `max_resolution_tier` | Do not set | Defaults to `1080p`. Set to `"2160p"` only if the user requests 4K |
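Put together, a create-asset request body using these defaults is small. The input URL below is a placeholder:

```json
{
  "inputs": [{ "url": "https://example.com/video.mp4" }],
  "playback_policies": ["public"],
  "video_quality": "basic"
}
```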

### Cross-references for defaults

* `playback_policy`: `"public"` allows open access. Use `"signed"` for [secure video playback](/docs/guides/secure-video-playback) with [signed JWTs](/docs/guides/signing-jwts).
* `video_quality`: See [pricing](/docs/pricing/video) for the difference between `"basic"` and `"plus"`.
* `static_renditions`: Replaces the deprecated `mp4_support` parameter. Use `[{ "resolution": "highest" }]` for an MP4 download or `[{ "resolution": "audio-only" }]` for an M4A file. See the [static renditions guide](/docs/guides/enable-static-mp4-renditions) for all options.

## Persisting Mux data

After creating an asset, save the relevant data into your database or persistence layer. At minimum, store the **asset ID** and **playback ID**. You will also want to save metadata as it becomes available: `status`, `duration`, `aspect_ratio`, `resolution_tier`, and any `static_renditions` information.

For **simple integrations** with a fixed set of videos — hero videos, background videos, demo reels on a marketing site — you can hardcode playback IDs in a JSON file or config object. These rarely change and don't need a database.

For **production applications** where users upload videos, videos are created programmatically, or the video catalog changes over time, always persist asset data in a database. Use [webhooks](/docs/core/listen-for-webhooks) to keep your database in sync — listen for `video.asset.ready` to update status, and `video.asset.deleted` to clean up records.

## IDs reference

| ID type | What it's for |
| :--- | :--- |
| Asset ID | Managing the asset (get, update, delete) via `api.mux.com` |
| Playback ID | Streaming the video via `stream.mux.com` |
| Upload ID | Tracking direct upload status |
| Stream Key | Broadcasting to a live stream (keep secret) |
| Live Stream ID | Managing the live stream via `api.mux.com` |

## Common mistakes

**Do NOT confuse Asset IDs with Playback IDs.** Asset IDs are for API operations (`api.mux.com`). Playback IDs are for streaming (`stream.mux.com`). They are different strings.

**Do NOT use the playback URL before the asset is ready.** Always check `status === "ready"` first. A playback URL for a `preparing` asset will not work.

**Do NOT construct playback URLs with the Asset ID.** The correct URL is `https://stream.mux.com/{PLAYBACK_ID}.m3u8`, not `https://stream.mux.com/{ASSET_ID}.m3u8`.
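A small helper makes this mistake hard to make. A sketch, using the HLS URL format shown above:

```javascript
// Build an HLS playback URL from a playback ID (never an asset ID).
function playbackUrl(playbackId) {
  return `https://stream.mux.com/${playbackId}.m3u8`;
}
```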

**Do NOT expose API keys in client-side code.** API credentials (Token ID and Token Secret) must never be included in frontend JavaScript, mobile apps, or any code that runs on the user's device. All Mux API requests must be made from a trusted server.

**Do NOT expose stream keys in client-side code.** Stream keys allow anyone to broadcast to your live stream. Keep them server-side only.

**Do NOT hardcode playback URLs.** Always construct them from the playback ID returned by the API.

**Do NOT poll more than once per second.** The API has rate limits. Poll every 2 seconds for asset status.

**Do NOT use `POST` endpoints at high volume without backoff.** POST requests are rate limited to ~1 request per second sustained. GET requests allow ~5 per second.


# Make API requests
Learn how to work with Mux's API through HTTP requests.
## HTTP basic auth

| Term         | Description                                            |
| :----------- | :----------------------------------------------------- |
| Token ID     | access token ID, the "username" in HTTP basic auth     |
| Token secret | access token secret, the "password" in HTTP basic auth |

Every request to the API is authenticated via an [Access Token](https://dashboard.mux.com/settings/access-tokens), which consists of an ID and a secret key. You can think of the Access Token's ID as its username and its secret as the password. Mux only stores a hash of the secret, not the secret itself. If you lose the secret key for your access token, Mux cannot recover it; you will have to create a new Access Token. If the secret key for an Access Token is leaked, you should revoke that Access Token on the settings page: https://dashboard.mux.com/settings/access-tokens.

Note that in order to access the settings page for access tokens, you must be an admin of the Mux organization.

API requests are authenticated via HTTP Basic Auth, where the username is the Access Token ID and the password is the Access Token secret key. Due to the use of Basic Authentication, and because doing so is just a Really Good Idea™, all API requests must be made via HTTPS (to `https://api.mux.com`).

<Callout type="warning" title="Watch out for mismatched tokens and environments">
  Access tokens are scoped to an environment, for example: a development token cannot be used in requests to production. Verify the intended environment when creating an access token.
</Callout>

This is an example of authenticating a request with cURL, which automatically handles HTTP Basic Auth. If you run this request yourself it will not work; replace the Access Token ID (`44c819de-4add-4c9f-b2e9-384a0a71bede`) and secret (`INKxCoZ+cX6l1yrR6vqzYHVaeFEcqvZShznWM1U/No8KsV7h6Jxu1XXuTUQ91sdiGONK3H7NE7H`) in this example with your own credentials.

```shell
curl https://api.mux.com/video/v1/assets \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{ "inputs": [{ "url": "https://muxed.s3.amazonaws.com/leds.mp4" }], "playback_policies": ["public"], "video_quality": "basic" }' \
  -u 44c819de-4add-4c9f-b2e9-384a0a71bede:INKxCoZ+cX6l1yrR6vqzYHVaeFEcqvZShznWM1U/No8KsV7h6Jxu1XXuTUQ91sdiGONK3H7NE7H
```

HTTP basic auth works by base64 encoding the username and password in an `Authorization` header on the request.

Specifically, the header looks something like this:

```bash
'Authorization': 'Basic base64(MUX_TOKEN_ID:MUX_TOKEN_SECRET)'
```

1. The access token ID and secret are concatenated with a `:` and the string is base64 encoded.
2. The value for the `Authorization` header is the string `Basic` plus a space ` ` followed by the base64 encoded result from Step 1.

In the cURL example above, the cURL library is taking care of the base64 encoding and setting the header value internally. The HTTP library you use in your server-side language will probably have something similar for handling basic auth. You should be able to pass in the `username` (Access Token ID) and `password` (Access Token secret) and the library will handle the details of formatting the header.
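In Node, for example, the two steps above look like this (a sketch using the standard `Buffer` API):

```javascript
// Step 1: concatenate the token ID and secret with ':' and base64 encode.
// Step 2: prefix the result with 'Basic ' to form the header value.
function basicAuthHeader(tokenId, tokenSecret) {
  const encoded = Buffer.from(`${tokenId}:${tokenSecret}`).toString('base64');
  return `Basic ${encoded}`;
}
```

You would pass the result as the `Authorization` header on requests to `https://api.mux.com`, though in practice your HTTP library's basic-auth option does this for you.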

## Access token permissions

<Callout type="success" title="Full Permissions">
  If you're just getting started with Mux Video, use Read and Write.
</Callout>

If you are creating or modifying resources with Mux Video then you need **Read** and **Write** permissions. This includes things like:

* Creating new assets
* Creating direct uploads
* Creating new live streams

If you need to create signed tokens for secure video playback, your access token needs **System** write permissions. Learn more about [secure video playback](/docs/guides/secure-video-playback) and <ApiRefLink href="/docs/api-reference/system/signing-keys">signing keys</ApiRefLink>.

Mux Data only requires **Write** permissions if you need to create Annotations via API. Annotations created in the Dashboard do not require **Write** permissions.

<Image src="/docs/images/new-access-token.png" width={760} height={376} alt="Mux access token permissions" sm />

If your code is not creating anything and only doing `GET` requests then you can restrict the access token to **Read** only.

## CORS and client side API requests

Mux API endpoints do not have CORS headers, which means if you try to call the Mux API from the browser you will get an error:

<Callout type="error" title="CORS Error in Browser">
  request has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
</Callout>

This is expected. Although making API requests directly from the browser or your mobile app would be convenient, doing so would leave a massive security hole in your application: your client-side code would contain your API keys. Anyone who accesses your application would have the ability to steal your API credentials and make requests to Mux on your behalf. An attacker would be able to gain full control of your Mux account.

Mux API Credentials should never be stored in a client application. All Mux API calls should be made from a trusted server.

Instead of trying to make API requests from the client, the flow that your application should follow is:

1. Client makes a request to your server
2. Your server makes an authenticated API request to Mux
3. Your server saves whatever it needs in your database
4. Your server responds to the client with only the information that the client needs. For example, with live streaming that's the stream key for a specific stream; for uploads, that's just the direct upload URL

## Using Mux with serverless functions

Serverless functions are a great way to add pieces of secure server-side code to your client heavy application. Examples of services that help you run serverless functions are:

* [AWS Lambda](https://aws.amazon.com/lambda/)
* [Firebase Cloud Functions](https://firebase.google.com/docs/functions)
* [Cloudflare Workers](https://workers.cloudflare.com/)
* [Vercel Functions](https://vercel.com/docs/functions)
* [Netlify Functions](https://docs.netlify.com/functions/overview/)

The basic idea behind serverless functions is that you can write a bit of server code and deploy it to run on these platforms. Your client application can make requests to these endpoints to perform specific actions. Below is an example from [with-mux-video](https://github.com/vercel/next.js/blob/canary/examples/with-mux-video/pages/api/upload.js) of a serverless function endpoint that makes an API call to create a Mux Direct Upload.

```js
// pages/api/upload.js
// see: https://github.com/vercel/next.js/tree/canary/examples/with-mux-video
import Mux from '@mux/mux-node';

const mux = new Mux();

export default async function uploadHandler(req, res) {
  const { method } = req;

  switch (method) {
    case 'POST':
      try {
        const upload = await mux.video.uploads.create({
          new_asset_settings: { playback_policy: ['public'], video_quality: 'basic' },
          cors_origin: '*',
        });
        res.json({
          id: upload.id,
          url: upload.url,
        });
      } catch (e) {
        console.error('Request error', e);
        res.status(500).json({ error: 'Error creating upload' });
      }
      break;
    default:
      res.setHeader('Allow', ['POST']);
      res.status(405).end(`Method ${method} Not Allowed`);
  }
}
```

## API pagination

Our list endpoints (such as <ApiRefLink href="/docs/api-reference/video/assets/list-assets">List Assets</ApiRefLink>) do not return every matching record in a single response.
To offer everyone the best performance, we limit the number of records you can receive and offer pagination parameters to help you navigate through your list.

### Page/limit pagination

Our most common pagination controls are `page` and `limit`.

| Parameter | Default | Maximum | Description                                      |
| :-------- | :------ | :---- | :--------------------------------------------------|
| `page`    | `1`     | None | The page number to return. The first page is `1`.   |
| `limit`   | `10`    | `100` | The number of records to return per page.          |

If you have 100 assets and you want to get the first 10, you would make a request like this:

```http
GET /video/v1/assets?page=1&limit=10
```

And if you want to get the next 10, you would increment the page parameter from `1` to `2` and make a request like this:

```http
GET /video/v1/assets?page=2&limit=10
```

### Cursor pagination

In addition to `page`/`limit`, the <ApiRefLink href="/docs/api-reference/video/assets/list-assets">List Assets</ApiRefLink> endpoint also supports cursor pagination.
Cursor pagination is a more efficient and reliable way of paginating through very large collections.

<Callout type="info" title="More to come">
  Cursor pagination is only available on the <ApiRefLink href="/docs/api-reference/video/assets/list-assets">List Assets</ApiRefLink> endpoint, but we plan to add it to more endpoints in the future. If you want it added to any specific endpoints please [let us know!](/support)
</Callout>

When you make a request to the list assets endpoint we return a `next_cursor` value.

```json
// GET /video/v1/assets
{
  "data": [
    {
      "id": "asset_id",
      "status": "ready",
      ...
    }
  ],
  "next_cursor": "eyJwYWdlX2xpbWl0IjoxMDAwLCJwYWdlX2NvdW50IjoxfQ"
}
```

Take that `next_cursor` value and make a new request to the list assets endpoint with the `cursor` parameter.

```json
// GET /video/v1/assets?cursor=eyJwYWdlX2xpbWl0IjoxMDAwLCJwYWdlX2NvdW50IjoxfQ
{
  "data": [
    {
      "id": "asset_id",
      "status": "ready",
      ...
    }
  ],
  "next_cursor": null
}
```

If `next_cursor` is `null`, you've reached the end of your list. If `next_cursor` is not `null` you can use that value to get the next page, repeating this pattern until `next_cursor` is `null`.
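That loop can be sketched as follows. `fetchAssetsPage` is a hypothetical name standing in for your HTTP call to the list assets endpoint, not an SDK function:

```javascript
// Walk every page of a cursor-paginated list endpoint.
// `fetchAssetsPage(cursor)` should GET /video/v1/assets?cursor=...
// (omitting the parameter when cursor is null) and resolve to the
// parsed JSON body: { data, next_cursor }.
async function listAllAssets(fetchAssetsPage) {
  const assets = [];
  let cursor = null;
  do {
    const page = await fetchAssetsPage(cursor);
    assets.push(...page.data);
    cursor = page.next_cursor;
  } while (cursor !== null);
  return assets;
}
```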

## API rate limits

Mux Video implements a simple set of rate limits. Rate limits are set per account (not per environment). These rate limits exist for two reasons:

1. First, to protect you, our customers, from runaway scripts or batch processes: we don't want you to accidentally delete all your content, or run up a large bill you're not expecting.
2. Second, to ensure that there's always Mux infrastructure available when our customers need it, for example to start that critical live stream or ingest that urgent video.

<Callout type="warning" title="Exceeding the rate limit">
When the rate limit threshold is exceeded, the API will return an HTTP status code of `429`.
</Callout>

### Video API

1. All Video API activities that include a `POST` request to `https://api.mux.com/video/` are rate limited to a sustained 1 request per second (RPS) with the ability to burst above this for short periods of time. This includes creating new <ApiRefLink href="/docs/api-reference/video/assets">Assets</ApiRefLink>, <ApiRefLink href="/docs/api-reference/video/live-streams">Live Streams</ApiRefLink>, and <ApiRefLink href="/docs/api-reference/video/direct-uploads">Uploads</ApiRefLink>.

2. All other request methods are limited to 5 sustained requests per second (RPS) with the ability to burst above this for short periods of time. This includes the `GET`, `PUT`, `PATCH`, and `DELETE` verbs. Examples include (but are not limited to) requests for <ApiRefLink href="/docs/api-reference/video/assets/get-asset">retrieving an asset</ApiRefLink>, <ApiRefLink href="/docs/api-reference/video/assets/update-asset-mp4-support">updating mp4 support</ApiRefLink>, and <ApiRefLink href="/docs/api-reference/video/delivery-usage/list-delivery-usage">listing delivery usage</ApiRefLink>.
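If you do receive a `429`, back off and retry rather than immediately re-sending the request. A minimal sketch of exponential backoff, with `makeRequest` standing in for your HTTP call (the names and delay values here are illustrative, not a Mux-prescribed policy):

```javascript
// Retry a request with exponential backoff when the API returns 429.
// `makeRequest()` resolves to a response-like object with a `status`.
// `sleep` is injectable so the strategy is easy to test.
async function withBackoff(makeRequest, { retries = 3, baseDelayMs = 1000, sleep } = {}) {
  const wait = sleep ?? ((ms) => new Promise((resolve) => setTimeout(resolve, ms)));
  for (let attempt = 0; ; attempt++) {
    const res = await makeRequest();
    if (res.status !== 429 || attempt >= retries) return res;
    // Double the delay on each retry: 1s, 2s, 4s, ...
    await wait(baseDelayMs * 2 ** attempt);
  }
}
```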

### Playback

There are no limits on the number of viewers your streams can have. All we ask is that you let us know if you're planning an event expected to receive more than 100,000 concurrent live viewers.

### Monitoring Data API

Requests against the <ApiRefLink href="/docs/api-reference/data/monitoring/list-monitoring-dimensions">Monitoring Data</ApiRefLink> APIs are rate limited to a sustained 1 request per second (RPS) with the ability to burst above this for short periods of time.

### General Data API

Requests against all other <ApiRefLink href="/docs/api-reference/data/video-views">General Data</ApiRefLink> APIs are rate limited to a sustained 5 requests per second (RPS) with the ability to burst above this for short periods of time.

# OpenAPI specification

The complete Mux API is described by an OpenAPI specification, available at [`https://www.mux.com/api-spec.json`](https://www.mux.com/api-spec.json). You can use this spec to generate API clients, import endpoints into tools like [Postman](/docs/core/postman), or integrate with any tooling that supports OpenAPI.


# Use a Mux SDK
Mux SDKs are available for a variety of languages and platforms.
Mux has API SDKs for several major languages. You are not required to use them, but these SDKs handle the details of authentication for you and make it a little nicer to send API requests to Mux; in languages with static typing or type hints, they will also help you form correct requests and reduce development time.

* [Node](/docs/integrations/mux-node-sdk)
* [Python](/docs/integrations/mux-python-sdk)
* [PHP](/docs/integrations/mux-php-sdk)
* [Ruby](/docs/integrations/mux-ruby-sdk)
* [Elixir](/docs/integrations/mux-elixir-sdk)
* [Java](/docs/integrations/mux-java-sdk)
* [C# and other .NET languages](/docs/integrations/mux-csharp-sdk)


# Make API requests with Postman
In this guide you will learn how to fork, set up, and work with Mux's API collection using Postman's API interface.
## Fork the collection

We recommend [Postman](https://postman.com) as a way to easily explore and interact with our API.

Similar to forking a repository on GitHub, forking a collection on Postman allows you to create a new instance of the collection.
Here, you can send requests, collaborate, and submit changes to the original collection.
Without forking the collection, it will be **read-only** and you will not be able to make requests unless you're a member of the workspace, even if the collection is public.

If you're already a Postman user, you can fork our [officially supported Postman collection](https://www.postman.com/muxinc/workspace/mux-apis/overview?utm_campaign=postman-collab\&utm_medium=guide\&utm_source=mux) and add it to your workspace by clicking the button below.

You can then stay up to date with future changes to our API specification by pulling changes. More on that in the sections below.

[![Run in Postman](https://run.pstmn.io/button.svg)](https://god.gw.postman.com/run-collection/18282356-97f1767e-f35a-4fca-b1c5-bf612e6f8e76?action=collection%2Ffork\&collection-url=entityId%3D18282356-97f1767e-f35a-4fca-b1c5-bf612e6f8e76%26entityType%3Dcollection%26workspaceId%3D2bcc854d-f831-4c9f-ac0a-3b4382f3a5cd)

## Basic authentication

| Term         | Description                                                |
| :----------- | :--------------------------------------------------------- |
| Token ID     | access token ID, the "username" in basic auth              |
| Token secret | access token secret key, the "password" in basic auth      |

## Set up credentials

Once you've created your access tokens via your [Mux account](https://dashboard.mux.com/signup?type=video?utm_campaign=postman-collab\&utm_medium=guide\&utm_source=mux), you can input them into their respective fields under authorization.

<Image src="/docs/images/postman-auth.png" width={1217} height={723} alt="Basic authentication in Postman" />

## Environment variables

You can use [environment variables](https://learning.postman.com/docs/sending-requests/variables/?utm_campaign=mux-collab\&utm_medium=site\&utm_source=mux) to store and reuse values, like your credentials,
across requests and collections. Variables can be scoped either to the environment or globally, making them available to all collections within a workspace.

To create environment variables, click the eye icon on the right-hand side of the collection and choose the scope you want your credentials to apply to.

<Image src="/docs/images/postman-env-variables.png" width={1217} height={723} alt="Environment variables menu in Postman" />

Next, add your credentials and set the type to **secret**. This will hide values on-screen. Once you've finished setting up your environment variables,
you can go back to basic authentication and use the variables instead of the values directly. To do this, use `{{variable_name}}` in the form field.

<Image src="/docs/images/postman-hidden-auth.png" width={1217} height={723} alt="Hidden authentication in Postman" />

## Sample request body and responses

Even with extensive documentation, it can be hard to navigate an API for the first time. To help you make requests and understand their responses, we use Postman's
[examples feature](https://learning.postman.com/docs/sending-requests/examples/?utm_campaign=mux-collab\&utm_medium=site\&utm_source=mux) for all Mux Video and Mux Data endpoints.

You can view an endpoint's sample request body by clicking the endpoint on the left-hand API menu and then clicking **body** in the main section of the interface.

<Image src="/docs/images/postman-sample-request-body.png" width={1217} height={723} alt="Sample API request body in Postman" />

You can view an endpoint's sample request response by clicking the right-facing caret on the endpoint. A new item will appear in the collection with the icon **e.g.**.

<Image src="/docs/images/postman-sample-request-response.png" width={1217} height={523} alt="Sample API request response in Postman" />

## Stay up to date with the main collection

Similar to a forked repository on GitHub, your Postman fork will only stay up to date with the origin collection if you periodically [pull changes](https://learning.postman.com/docs/collaborating-in-postman/version-control/#pulling-updates)
to keep your fork in sync.

You can pull changes by clicking the three dots next to the name of your fork. This will open a sub-menu. Click on **merge changes** near the bottom of the menu.

<Image src="/docs/images/postman-fork-sub-menu.png" width={517} height={123} alt="Forked Postman collection's sub-menu" />

If your fork is not in sync with the origin collection, there will be a yellow banner that states, "The destination has been modified since you last updated the fork. We’d recommend pulling changes." Click **pull changes** on the right.

You will then see a diff where source is the origin and destination is your fork.

<Image src="/docs/images/postman-pull-changes-diff.png" width={617} height={323} alt="API diff when pulling changes" />

Sometimes there will be merge conflicts. If you encounter them, you can choose whether you keep the source or destination version of a change.

Once everything looks good, click the orange button labeled **pull changes**.


# Listen for webhooks
Learn how to listen for webhooks from Mux.
Mux uses [webhooks](https://webhooks.fyi) to let your application know when things happen asynchronously, outside of an API request cycle. For example, you may want to update something on your end when an <ApiRefLink href="/docs/api-reference/video/assets/get-asset">asset</ApiRefLink> transitions its status from `processing` to `ready`, or when a live stream starts or ends. When these asynchronous events happen, we'll make a POST request to the address you give us and you can do whatever you need with it on your end.

After a webhook is configured for an environment, notifications will be sent for all events for that environment.

<Callout type="warning">
  Note that webhooks are scoped per *environment*. If you have configured webhooks and you are not seeing them show up, double check that the webhook is correctly configured for the environment you are working in.
</Callout>

If Mux doesn't receive a `2xx` response from your system, we will continue retrying delivery for the next 24 hours (with an increasing delay between attempts).

<Callout type="info">
  Mux makes an effort to deliver each message successfully once, but in certain
  situations duplicate webhook messages may be sent even if your service
  responds with a 2xx response code. Please ensure that your webhook handling
  mechanism treats duplicated event delivery appropriately.
</Callout>
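One common way to treat duplicate deliveries safely is to key processing on the event's unique `id`. A sketch using an in-memory set; in production, store seen IDs in your database (for example, behind a unique constraint) instead:

```javascript
// Process each webhook event at most once by remembering event IDs.
// The in-memory set is for illustration; use durable storage in production.
const seenEventIds = new Set();

function processOnce(event, handler) {
  if (seenEventIds.has(event.id)) return false; // duplicate delivery, skip
  seenEventIds.add(event.id);
  handler(event);
  return true;
}
```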

# Webhooks vs. polling

Please use webhooks to track asset status rather than polling the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">Asset API</ApiRefLink>. Webhooks are much more efficient for both you and Mux, and we rate limit GET requests to the `/assets` endpoint, which means polling the `/assets` API doesn't scale.

# Handling webhooks locally

A common gotcha for anyone new to working with webhooks is figuring out how to receive them when working in a local environment. Since your application runs on a local URL like `http://localhost:3000`, Mux can't reach it directly to deliver webhook events.

The recommended approach is to use the [Mux CLI](/docs/integrations/mux-cli) to listen for events and forward them to your local server.

## Using the Mux CLI

The Mux CLI can connect to Mux's event stream and forward webhook events to your local development server in real-time.

<Callout type="warning">
  CLI webhook forwarding is for **local development only** and provides **no delivery guarantees**. In production, you must configure a webhook endpoint in the [Mux Dashboard](https://dashboard.mux.com) that points to your server's webhook URL.
</Callout>

### Listen and forward events

```bash
mux webhooks listen --forward-to http://localhost:3000/api/webhooks/mux
```

When using `--forward-to`, the CLI displays a webhook signing secret and signs each forwarded request with a `mux-signature` header. Set `MUX_WEBHOOK_SECRET` in your app's environment to [verify these signatures](/docs/core/verify-webhook-signatures):

```typescript
const event = mux.webhooks.unwrap(body, headers, process.env.MUX_WEBHOOK_SECRET);
```

The signing secret is unique per environment and persisted between sessions, so you only need to configure it once.

### Replay past events

The CLI stores the last 100 events received during `listen` sessions. You can replay them to re-test your webhook handler without creating new resources:

```bash
# List stored events
mux webhooks events list

# Replay a specific event
mux webhooks events replay <event-id> --forward-to http://localhost:3000/api/webhooks/mux

# Replay all stored events
mux webhooks events replay --all --forward-to http://localhost:3000/api/webhooks/mux
```

### Trigger synthetic events

You can also send synthetic webhook events to your local server for testing, without making any API calls or creating real resources:

```bash
mux webhooks trigger video.asset.ready --forward-to http://localhost:3000/api/webhooks/mux
```

Run `mux webhooks trigger <invalid-type>` to see all supported event types.

For the full list of webhook CLI commands, see the [Mux CLI docs](/docs/integrations/mux-cli#webhook-forwarding).

## Alternative: using ngrok

If you prefer, you can also use a tunneling tool like [ngrok](https://ngrok.com/docs/integrations/webhooks/mux-webhooks) to expose your local server to the internet and receive webhooks directly from Mux.

```bash
ngrok http 3000
```

This gives you a public URL (e.g. `https://abc123.ngrok.io`) that you can configure as a webhook endpoint in the [Mux Dashboard](https://dashboard.mux.com). Your full webhook URL would be something like `https://abc123.ngrok.io/api/webhooks/mux`.

<Callout type="info">
  You'll need to create an ngrok account (a free account works for most testing purposes). See [ngrok's Mux integration docs](https://ngrok.com/docs/integrations/webhooks/mux-webhooks) for more details.
</Callout>

# Configuring endpoints

Webhook endpoints are configured in the Mux dashboard under "Settings."

<Image src="/docs/images/webhooks.png" width={500} height={500} />

Enter a URL from your application that Mux will call for event notifications.

<Image src="/docs/images/new-webhook.png" width={1192} height={898} />

# Receiving events

Mux will submit a POST request to the configured URL, which your application can treat the same as any other route. Your event handler can do things like update the state of the specified asset in your database, or trigger other work.

Note that a single request attempt will time out after 5 seconds, after which the attempt is considered failed and will be reattempted. If you expect this will be a problem in your workflow, consider doing the work in an asynchronous task so you can respond to the event immediately.
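A sketch of that pattern: acknowledge the event immediately, then do the real work asynchronously. The in-memory `queue` here is a stand-in for a real job queue, and `res` is assumed to be an Express-style response object:

```javascript
// Acknowledge webhooks immediately and defer real work to a queue,
// so the handler always responds well within the 5-second timeout.
const queue = [];

function webhookHandler(event, res) {
  queue.push(event);     // enqueue for an async worker
  res.status(200).end(); // respond right away
}

// A worker (cron job, background process, etc.) drains the queue later.
function drainQueue(processEvent) {
  while (queue.length > 0) processEvent(queue.shift());
}
```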

For more details on the Webhook event object definition, see [the example response](#example-response).

# Example response

```json
{
  "type": "video.asset.ready",
  "object": {
    "type": "asset",
    "id": "0201p02fGKPE7MrbC269XRD7LpcHhrmbu0002"
  },
  "id": "3a56ac3d-33da-4366-855b-f592d898409d",
  "environment": {
    "name": "Demo pages",
    "id": "j0863n"
  },
  "data": {
    "tracks": [
      {
        "type": "video",
        "max_width": 1280,
        "max_height": 544,
        "max_frame_rate": 23.976,
        "id": "0201p02fGKPE7MrbC269XRD7LpcHhrmbu0002",
        "duration": 153.361542
      },
      {
        "type": "audio",
        "max_channels": 2,
        "max_channel_layout": "stereo",
        "id": "FzB95vBizv02bYNqO5QVzNWRrVo5SnQju",
        "duration": 153.361497
      }
    ],
    "status": "ready",
    "max_stored_resolution": "SD",
    "max_stored_frame_rate": 23.976,
    "id": "0201p02fGKPE7MrbC269XRD7LpcHhrmbu0002",
    "duration": 153.361542,
    "created_at": "2018-02-15T01:04:45.000Z",
    "aspect_ratio": "40:17"
  },
  "created_at": "2018-02-15T01:04:45.000Z",
  "accessor_source": null,
  "accessor": null,
  "request_id": null
}
```

# Types of Events

## Asset Events

| Event | Description |
|-------|-------------|
| `video.asset.created` | Asset has been created |
| `video.asset.ready` | Asset is ready for playback. You can now use the asset's `playback_id` to successfully start streaming this asset. |
| `video.asset.errored` | Asset has encountered an error. Use this to notify your server about assets with errors. Asset errors can happen for a number of reasons, most commonly an input URL that Mux is unable to download or a file that is not a valid video file. |
| `video.asset.updated` | Asset has been updated. Use this to make sure your server is notified about changes to assets. |
| `video.asset.deleted` | Asset has been deleted. Use this so that your server knows when an asset has been deleted, at which point it will no longer be playable. |
| `video.asset.live_stream_completed` | The live stream for this asset has completed. Every time a live stream starts and ends a new asset gets created and this event fires. |
| `video.asset.static_rendition.created` | A new static rendition for this asset has been created. Static renditions are streamable mp4 files that are most commonly used for allowing users to download files for offline viewing. |
| `video.asset.static_rendition.ready` | A static rendition for this asset is ready. Static renditions are streamable mp4 files that are most commonly used for allowing users to download files for offline viewing. |
| `video.asset.static_rendition.skipped` | A static rendition for this asset was skipped, due to the source not being suitable for the requested static rendition. Static renditions are streamable mp4 files that are most commonly used for allowing users to download files for offline viewing. |
| `video.asset.static_rendition.deleted` | A static rendition for this asset was deleted. The static renditions (mp4 files) for this asset will no longer be available. |
| `video.asset.static_rendition.errored` | A static rendition for this asset errored. This indicates that there was some error when creating a static rendition (mp4s) of your asset. This should be rare and if you see it unexpectedly please open a support ticket. |
| `video.asset.master.ready` | Master access for this asset is ready. Master access is used when downloading an asset for purposes of editing or post-production work. The master access file is not intended to be streamed or downloaded by end-users. |
| `video.asset.master.preparing` | Master access for this asset is being prepared. After requesting master access you will get this webhook while it is being prepared. |
| `video.asset.master.deleted` | Master access for this asset has been deleted. You will no longer be able to download the master file; if you need it again, re-request it. |
| `video.asset.master.errored` | Master access for this asset has encountered an error. This indicates that there was some error when creating master access for this asset. This should be rare and if you see it unexpectedly please open a support ticket. |
| `video.asset.track.created` | A new track for this asset has been created, for example a subtitle text track. |
| `video.asset.track.ready` | A track for this asset is ready. In the example of a subtitle text track the text track will now be delivered with your HLS stream. |
| `video.asset.track.errored` | A track for this asset has encountered an error. There was some error preparing this track. Most commonly this could be a text track file that Mux was unable to download for processing. |
| `video.asset.track.deleted` | A track for this asset has been deleted. |
| `video.asset.warning` | This event fires when Mux has encountered a non-fatal issue with the recorded asset of the live stream. At this time, the event is only fired when Mux is unable to download a slate image from the URL set as the `reconnect_slate_url` parameter value. More details on this event are available [here](/docs/guides/handle-live-stream-disconnects#reconnect-window-and-slates). |
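
As a sketch, a webhook endpoint on your server might branch on the event's `type` field to react to these asset lifecycle events (the handler and its return values below are illustrative, not part of any Mux SDK):

```js
// Illustrative dispatcher for asset lifecycle events. The event's
// `type` field is a string like "video.asset.ready"; `data` is the
// affected resource.
function handleAssetEvent(event) {
  switch (event.type) {
    case 'video.asset.ready':
      // Safe to start streaming with the asset's playback ID now.
      return { action: 'enable-playback', assetId: event.data.id };
    case 'video.asset.errored':
      // e.g. an input URL Mux couldn't download, or not a valid video file.
      return { action: 'notify-failure', assetId: event.data.id };
    case 'video.asset.deleted':
      // The asset is no longer playable.
      return { action: 'remove-video', assetId: event.data.id };
    default:
      return { action: 'ignore', assetId: event.data && event.data.id };
  }
}
```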

## Upload Events

| Event | Description |
|-------|-------------|
| `video.upload.asset_created` | An asset has been created from this upload. This is useful for knowing when a user of your application has finished uploading a file using the URL created by a [Direct Upload](/docs/guides/upload-files-directly). |
| `video.upload.cancelled` | Upload has been canceled. This event fires after hitting the <ApiRefLink href="/docs/api-reference/video/direct-uploads/cancel-direct-upload">cancel direct upload</ApiRefLink> API. |
| `video.upload.created` | Upload has been created. This event fires after <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">creating a direct upload</ApiRefLink>. |
| `video.upload.errored` | Upload has encountered an error. This event fires when the asset created by the direct upload fails. Most commonly this happens when an end-user uploads a non-video file. |

## Live Stream Events

| Event | Description |
|-------|-------------|
| `video.live_stream.created` | A new live stream has been created. Broadcasters with a `stream_key` can start sending encoder feed to this live stream. |
| `video.live_stream.connected` | An encoder has successfully connected to this live stream. |
| `video.live_stream.recording` | Recording on this live stream has started. Mux has successfully processed the first frames from the encoder. If you show a *red dot* icon in your UI, this would be a good time to show it. |
| `video.live_stream.active` | This live stream is now "active". The live stream's `playback_id` OR the `playback_id` associated with this live stream's asset can be used right now to create HLS URLs (`https://stream.mux.com/{PLAYBACK_ID}.m3u8`) and start streaming in your player. Note that before the live stream is `"active"`, trying to stream the HLS URL will result in HTTP `412` errors. |
| `video.live_stream.disconnected` | An encoder has disconnected from this live stream. Note that while disconnected the live stream is still `status: "active"`. |
| `video.live_stream.idle` | The `reconnect_window` for this live stream has elapsed. The live stream `status` will now transition to `"idle"`. |
| `video.live_stream.updated` | This live stream has been updated. For example, after <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">resetting the live stream's stream key</ApiRefLink>. |
| `video.live_stream.enabled` | This live stream has been enabled. This event fires after hitting the <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">enable live stream</ApiRefLink> API. |
| `video.live_stream.disabled` | This live stream has been disabled. This event fires after hitting the <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">disable live stream</ApiRefLink> API. Disabled live streams will no longer accept new RTMP connections. |
| `video.live_stream.deleted` | This live stream has been deleted. This event fires after hitting the <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream">delete live stream</ApiRefLink> API. |
| `video.live_stream.warning` | This event fires when Mux has encountered a non-fatal issue with the live stream. There is no disruption to the live stream ingest and playback. At this time, the event is only fired when Mux is unable to download an image from the URL set as the `reconnect_slate_url` parameter value. More details on this event are available [here](/docs/guides/handle-live-stream-disconnects#reconnect-window-and-slates). |
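
For example, a UI layer might map these events onto a simple display status (this mapping is illustrative; choose states that fit your product):

```js
// Illustrative mapping from live stream webhook events to a display
// status. Note that "recording" is the suggested moment to show a
// red-dot "live" indicator, per the table above.
function liveStreamUiStatus(eventType) {
  switch (eventType) {
    case 'video.live_stream.connected':
      return 'connecting';
    case 'video.live_stream.recording':
    case 'video.live_stream.active':
      return 'live';
    case 'video.live_stream.disconnected':
      return 'reconnecting'; // still status: "active" until idle fires
    case 'video.live_stream.idle':
      return 'offline';
    default:
      return null; // event doesn't affect the display status
  }
}
```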

## Simulcast Target Events

These simulcast target events are useful when building a UI that shows your users the status of their configured third-party endpoints, and for keeping track of each simulcast target's state changes as they happen.

| Event | Description |
|-------|-------------|
| `video.live_stream.simulcast_target.created` | A new simulcast target has been created for this live stream. |
| `video.live_stream.simulcast_target.idle` | When the parent live stream is `"disconnected"`, all simulcast targets will be `"idle"`. |
| `video.live_stream.simulcast_target.starting` | When the parent live stream fires `"connected"` then the simulcast targets transition to `"starting"`. |
| `video.live_stream.simulcast_target.broadcasting` | This fires when Mux has successfully connected to the simulcast target and has begun pushing content to that third party. |
| `video.live_stream.simulcast_target.errored` | This fires when Mux has encountered an error either while attempting to connect to the third party streaming service or while broadcasting. Mux will try to re-establish the connection and if it does successfully the simulcast target will transition back to `"broadcasting"`. |
| `video.live_stream.simulcast_target.updated` | This simulcast target has been updated. |
| `video.live_stream.simulcast_target.deleted` | This simulcast target has been deleted. |

# Webhook specification

A machine-readable specification of all Mux webhook events is available at [`https://www.mux.com/webhook-spec.json`](https://www.mux.com/webhook-spec.json). You can use this to generate types, validate payloads, or integrate with any tooling that supports OpenAPI-style schemas.


# Verify webhook signatures
You have the option to verify webhook requests that Mux sends to your endpoints. Mux will include a signature in the request's header. You can use this signature in your code to make sure the request was sent by Mux and not a third party.
## Obtain your signing secret

Before you get started, you will need your signing secret for your webhook. You can find that where you configure webhooks on the [webhooks settings page](https://dashboard.mux.com/settings/webhooks). Please note that the signing secret is different for each webhook endpoint that we notify.

<Image src="/docs/images/webhook-security.png" width={1181} height={479} />

Webhooks contain a header called `mux-signature` with the timestamp and a signature. The timestamp is prefixed by `t=` and the signature is prefixed by a scheme. Schemes start with `v`, followed by an integer. Currently, the only valid signature scheme is `v1`. Mux generates signatures using [HMAC](https://en.wikipedia.org/wiki/HMAC) with [SHA-256](https://en.wikipedia.org/wiki/SHA-2).

```text
Mux-Signature: t=1565220904,v1=20c75c1180c701ee8a796e81507cfd5c932fc17cf63a4a55566fd38da3a2d3d2
```

## How to verify webhook signatures

### Step 1: Extract the timestamp and signature

Split the header at the `,` character and get the values for `t` (timestamp) and `v1` (the signature).

### Step 2: Prepare the `signed_payload` string

You will need:

* the timestamp from Step 1 as a string (for example: "1565220904")
* the dot character `.`
* the raw request body (this will be JSON in a string format)

### Step 3: Determine the expected signature

Use the three components from Step 2 to compute an HMAC with the SHA-256 hash function. In Node, for example, this looks like the following:

```js
const crypto = require('crypto');

const secret = 'my secret'; // your signing secret
const payload = timestamp + '.' + rawRequestBody; // values from Steps 1 and 2
const expectedSignature = crypto
  .createHmac('sha256', secret)
  .update(payload)
  .digest('hex');
```

### Step 4: Compare signature

Compare the signature in the header to the expected signature. If the signature matches, compute the difference between the current timestamp and the received timestamp, then check to make sure that the timestamp is within our tolerance. By default, our SDKs allow a tolerance of 5 minutes.

## Examples

Our official SDKs for [Node](https://github.com/muxinc/mux-node-sdk) and [Elixir](https://github.com/muxinc/mux-elixir) contain helper methods for verifying Mux webhooks. If you're using one of these languages it's best to use our available helper methods. Note that the helper methods use the raw request body instead of a payload including the timestamp.

```elixir

# check the mux-elixir docs for details and a full example using Phoenix
# https://github.com/muxinc/mux-elixir#verifying-webhook-signatures-in-phoenix
Mux.Webhooks.verify_header(raw_body, signature_header, secret)

```

```go

func generateHmacSignature(webhookSecret, payload string) string {
    h := hmac.New(sha256.New, []byte(webhookSecret))
    h.Write([]byte(payload))
    return hex.EncodeToString(h.Sum(nil))
}

func IsValidMuxSignature(req *http.Request, body []byte) error {
    muxSignature := req.Header.Get("Mux-Signature")

    if muxSignature == "" {
        return errors.New("no Mux-Signature in request header")
    }

    muxSignatureArr := strings.Split(muxSignature, ",")

    if len(muxSignatureArr) != 2 {
        return fmt.Errorf("Mux-Signature in request header should be 2 values long: %s", muxSignatureArr)
    }

    timestampArr := strings.Split(muxSignatureArr[0], "=")
    v1SignatureArr := strings.Split(muxSignatureArr[1], "=")

    if len(timestampArr) != 2 || len(v1SignatureArr) != 2 {
        return fmt.Errorf("missing timestamp: %s or missing v1Signature: %s", timestampArr, v1SignatureArr)
    }

    timestamp := timestampArr[1]
    v1Signature := v1SignatureArr[1]

    webhookSecret := "" //insert secret here or load from config file.
    payload := fmt.Sprintf("%s.%s", timestamp, string(body))
    sha := generateHmacSignature(webhookSecret, payload)

    // use a constant-time comparison to guard against timing attacks
    if !hmac.Equal([]byte(sha), []byte(v1Signature)) {
        return errors.New("not a valid mux webhook signature")
    }

    return nil
}

```

```php

/**
 * Verify the signature (laravel)
 *
 * @param Request $request
 * @return boolean
 */
protected function verifySignature(Request $request)
{
    // Get the signature from the request header
    $muxSig = $request->header('Mux-Signature');

    if(empty($muxSig)) {
        return false;
    }

    // Split the signature based on ','.
    // Format is 't=[timestamp],v1=[hash]'
    $muxSigArray = explode(',', $muxSig);

    if(empty($muxSigArray) || empty($muxSigArray[0]) || empty($muxSigArray[1])) {
        return false;
    }

    // Strip the first occurrence of 't=' and 'v1=' from both strings
    $muxTimestamp = Str::replaceFirst('t=', '', $muxSigArray[0]);
    $muxHash = Str::replaceFirst('v1=', '', $muxSigArray[1]);

    // Create a payload of the timestamp from the Mux signature and the request body with a '.' in-between
    $payload = $muxTimestamp . "." . $request->getContent();

    // Build a HMAC hash using SHA256 algo, using our webhook secret
    $ourSignature = hash_hmac('sha256', $payload, config('mux.webhook_secret'));

    // `hash_equals` performs a timing-safe crypto comparison
    return hash_equals($ourSignature, $muxHash);
}

```

```js

import Mux from '@mux/mux-node';

// check the mux-node-sdk docs for details
// https://github.com/muxinc/mux-node-sdk/blob/master/api.md#webhooks
const mux = new Mux();
mux.webhooks.verifySignature(body, headers, secret);

```



# Content Security Policy for Mux
Learn how to configure Content Security Policy (CSP) to work with Mux Video and Data services.
## Understanding CSP with Mux

Content Security Policy (CSP) is a security feature that helps protect your web application from cross-site scripting (XSS) attacks and other code injection attacks. CSP works by restricting the resources (such as scripts, stylesheets, images, and network connections) that a web page can load.

When integrating Mux Video and Mux Data into your application, you'll need to configure your CSP to allow connections to Mux services. This guide will help you set up the appropriate CSP directives to ensure your Mux integration works securely.

<Callout type="info" title="CSP Basics">
  If you're new to Content Security Policy, we recommend reading [Google's CSP guide](https://developers.google.com/web/fundamentals/security/csp) for a comprehensive introduction to CSP concepts and implementation.
</Callout>

## Basic CSP configuration

For most applications, the simplest approach is to use a basic CSP that allows all Mux services. This configuration ensures compatibility with all current and future Mux features:

```
Content-Security-Policy: default-src 'self' *.mux.com *.litix.io storage.googleapis.com
```

This CSP directive allows your application to:

* Load resources from your own domain (`'self'`)
* Connect to all Mux Video services (`*.mux.com`)
* Connect to all Mux Data services (`*.litix.io`)
* Connect to Google Cloud Storage (`storage.googleapis.com`) -- this is needed for [Direct Uploads](/docs/guides/upload-files-directly)

The wildcard approach for `mux.com` and `litix.io` is recommended because Mux utilizes multiple CDNs and subdomains to provide optimal performance globally. These hostnames may change without notice as we optimize our infrastructure.

## Granular CSP configuration

If your security requirements call for a more restrictive CSP, you can use specific directives instead of the broad `default-src` approach. Here's a granular configuration that covers all Mux functionality:

```
Content-Security-Policy: 
  connect-src 'self' https://*.mux.com https://*.litix.io https://storage.googleapis.com;
  media-src 'self' blob: https://*.mux.com;
  img-src 'self' https://image.mux.com https://*.litix.io;
  script-src 'self' https://src.litix.io;
  worker-src 'self' blob:
```

<Callout type="warning" title="Merge with existing policies">
  The above configuration must be merged with your existing CSP directives. Each directive should combine values from both your current policy and the Mux requirements.
</Callout>
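
Because policies merge per directive, a small helper can combine your existing policy with the Mux additions. This sketch is illustrative (`mergeCsp` is not a standard API); adapt it to however your framework sets response headers:

```js
// Merge two CSP policy strings directive by directive, deduplicating
// source values. Directives present in either input survive the merge.
function mergeCsp(existing, additions) {
  const parse = (policy) =>
    policy
      .split(';')
      .map((directive) => directive.trim())
      .filter(Boolean)
      .reduce((acc, directive) => {
        const [name, ...values] = directive.split(/\s+/);
        acc[name] = new Set([...(acc[name] || []), ...values]);
        return acc;
      }, {});

  const merged = parse(existing);
  for (const [name, values] of Object.entries(parse(additions))) {
    merged[name] = new Set([...(merged[name] || []), ...values]);
  }
  return Object.entries(merged)
    .map(([name, values]) => `${name} ${[...values].join(' ')}`)
    .join('; ');
}
```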

## Upload and media handling

If your application uploads media files to Mux via [Direct Uploads](/docs/guides/upload-files-directly), you'll need additional CSP directives to handle binary data and file uploads:

```
Content-Security-Policy: 
  connect-src 'self' https://*.mux.com https://*.litix.io https://storage.googleapis.com;
  media-src 'self' blob: https://*.mux.com;
  img-src 'self' https://image.mux.com https://*.litix.io;
  script-src 'self' https://src.litix.io;
  worker-src 'self' blob:;
  form-action 'self' https://*.mux.com https://storage.googleapis.com
```

The key additions for upload functionality are:

| Directive | Purpose |
| :-------- | :------ |
| `https://storage.googleapis.com` in `connect-src` | Allows uploads to Google Cloud Storage endpoints used by Mux |
| `form-action` directive | Permits form submissions and PUT/POST requests to upload endpoints |
| `blob:` in `media-src` and `worker-src` | Enables handling of binary file data during upload processing |

## Product-specific requirements

Different Mux features have specific CSP requirements. Here's what you need for each:

### Mux Video Playback

For video playback functionality, you **must** include:

```
connect-src https://*.mux.com;
media-src blob: https://*.mux.com;
worker-src blob:
```

This is required because:

* HLS manifests and video segments are delivered via `https://stream.mux.com` and other `*.mux.com` subdomains
* Video players use web workers and blob URLs for optimal performance
* Mux uses multiple CDNs with different hostnames for global performance

### Video Thumbnails and Storyboards

If you're displaying video thumbnails or timeline hover previews, include:

```
img-src https://image.mux.com;
connect-src https://image.mux.com
```

The `connect-src` directive is needed for dynamic thumbnail loading in timeline hover previews, while `img-src` covers standard image embedding.

### Mux Data Integration

For Mux Data analytics, you **must** allow:

```
connect-src https://*.litix.io;
img-src https://*.litix.io
```

This covers:

* Data collection endpoints across multiple subdomains
* Fallback beacon loading through image tags
* Various monitoring and analytics endpoints

<Callout type="success" title="Environment-specific restriction">
  For tighter security, you can replace `https://*.litix.io` with `https://img.litix.io` and `https://<env_key>.litix.io` where `<env_key>` is your Mux environment key. However, the wildcard approach is recommended for maximum compatibility.
</Callout>

### Hosted Mux Data Integrations

If you're loading pre-built Mux Data integrations from our hosted domain (rather than installing via NPM), add:

```
script-src https://src.litix.io
```

This is not required if you bundle the Mux Data SDK directly into your application code.

### Complete Example

Here's a complete CSP that supports all Mux features including uploads:

```
Content-Security-Policy: 
  default-src 'self';
  connect-src 'self' https://*.mux.com https://*.litix.io https://storage.googleapis.com;
  media-src 'self' blob: https://*.mux.com;
  img-src 'self' https://image.mux.com https://*.litix.io;
  script-src 'self' https://src.litix.io;
  worker-src 'self' blob:;
  form-action 'self' https://*.mux.com https://storage.googleapis.com
```

<Callout type="warning" title="Test thoroughly">
  After implementing your CSP, test all Mux functionality in your application including video playback, uploads, thumbnails, and analytics to ensure everything works as expected.
</Callout>


# Mux Uploader for web
Mux Uploader is a drop-in component for uploading videos to Mux from your web application.
**Mux Uploader** is a drop-in web component that makes it easy to upload video files to Mux.

This component allows you to build a fully-functional, customizable video upload UI in your application using a single line of code. Mux Uploader supports:

* Manual file selection
* Drag and drop for files
* Optional pausing and resuming of uploads
* Automatic offline/online detection with upload resumes
* And more!

<Player playbackId={"XYND6DHqq7A01ziIbLWuPH02d004GoqYhHgBucY3M6Tydo"} muted autoPlay loop style={{'--controls': 'none' }} thumbnailTime={0} />

Mux Uploader can be used as either a web component (`<mux-uploader>` from `@mux/mux-uploader`), or a React component (`<MuxUploader />` from `@mux/mux-uploader-react`).

## Quick start

Here are some examples of Mux Uploader in action.

### Mux Uploader HTML element

Install with npm or yarn, or load Mux Uploader from the hosted script.

#### NPM

```shell
npm install @mux/mux-uploader@latest
```

#### Yarn

```shell
yarn add @mux/mux-uploader@latest
```

#### Hosted

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-uploader"></script>
```

#### Example HTML element implementation

```html
<script
  src="https://cdn.jsdelivr.net/npm/@mux/mux-uploader"
></script>
<mux-uploader></mux-uploader>
```

### Mux Uploader React component

Select one of the package options below. Both install the `latest` tag, so the uploader stays up to date automatically; you can always pin the package to a specific version if needed.

#### NPM

```shell
npm install @mux/mux-uploader-react@latest
```

#### Yarn

```shell
yarn add @mux/mux-uploader-react@latest
```

#### Example React Usage

```jsx
import MuxUploader from "@mux/mux-uploader-react";

export default function App() {
  return (
    <MuxUploader/>
  );
}
```

## Upload a video

Mux Uploader allows you to use upload URLs provided by Mux's <ApiRefLink href="/docs/api-reference/video/direct-uploads">Direct Uploads</ApiRefLink> in your web application.
It takes care of rendering a file selector, uploading your video file, displaying progress updates to the user, handling retries, and more.

This does mean that you'll need to provide a new upload URL whenever a user uploads a new video file in your application. You provide that URL value via the `endpoint` attribute or property. It looks like this:

### HTML example

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<!-- Replace endpoint value with a valid Mux Video Direct Upload URL -->\n<mux-uploader\n  endpoint=\"https://httpbin.org/put\"\n></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js'",
      "hidden": true
    }
  }
}
```

The `endpoint` indicates the direct upload URL that will receive the video file you're uploading.

You can generate a signed direct upload URL by making a server-side API call to Mux's <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Create Direct Upload</ApiRefLink> endpoint,
or you can use `curl` based on the example from the link if you just want to test it out.

In a successful API response, you will receive a unique signed upload URL that can then be passed along to your client application and set as the `endpoint` property on a `mux-uploader` element. The URL for a Direct Upload looks like `"https://storage.googleapis.com/video..."`.
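
As a sketch, a server-side handler could create the upload with a direct call to the Mux API using Node 18+'s built-in `fetch` (the endpoint and response shape follow the Create Direct Upload reference; the function name and the asset settings shown are illustrative — check the API reference for the options you need):

```js
// Create a Direct Upload and return its ID and signed URL.
// Authenticates with HTTP basic auth using your Token ID and Secret.
async function createDirectUpload(tokenId, tokenSecret) {
  const res = await fetch('https://api.mux.com/video/v1/uploads', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization:
        'Basic ' + Buffer.from(`${tokenId}:${tokenSecret}`).toString('base64'),
    },
    body: JSON.stringify({
      cors_origin: '*', // restrict to your app's origin in production
      new_asset_settings: { playback_policy: ['public'] }, // example settings
    }),
  });
  const { data } = await res.json();
  // `data.url` is the signed upload URL to pass to <mux-uploader>'s endpoint;
  // save `data.id` (the upload ID) to your database.
  return { uploadId: data.id, uploadUrl: data.url };
}
```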

<Callout type="info">
  In the following examples, you will replace the value of the `endpoint` property with your unique direct upload URL.
</Callout>

### React example

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader from \"@mux/mux-uploader-react\";\n\nexport default function App() {\n  return (\n    <MuxUploader endpoint=\"https://httpbin.org/put\" />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

## Overview of the upload process

Video uploads and processing take time. Processing time can vary depending on the file size and type of video that you upload.

Mux uses [webhooks](/docs/core/listen-for-webhooks) to keep your application informed about what's happening with your uploaded video — from when the upload completes to when the video is ready to be played.

<Callout type="info">
  To minimize processing time, consider following [Mux's guide for handling standard video input](/docs/guides/minimize-processing-time).
</Callout>

The overall flow generally looks like this:

### 1. Set up webhooks

* Set up a public webhook endpoint in your application to receive events from Mux
* Configure the webhook in your Mux dashboard to send events to this endpoint

### 2. Upload the video

* Create a direct upload URL using the Mux API
* Save the upload ID to your database
* Pass the URL to the `endpoint` property on the Mux Uploader component

### 3. Wait for video to be ready

* When the upload completes, show a "processing" indicator to the user.
* Poll your database to check if the video is ready for playback.

### 4. Handle webhook events

Listen for specific webhook events, particularly:

* `video.upload.asset_created` which indicates that the upload has completed and an asset has been created
* `video.asset.ready` which indicates that the video has been processed and is ready for playback

The `video.upload.asset_created` event contains the `asset_id` in the event payload.
The `video.asset.ready` event contains the `playback_id` in the event payload.
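
As a sketch, your webhook handler could fold these two events into the video row you saved earlier (the reducer and field names are hypothetical; the payload shapes follow the events described above):

```js
// Hypothetical reducer: given the stored video row and an incoming Mux
// webhook event, return the updated row.
function applyMuxEvent(video, event) {
  switch (event.type) {
    case 'video.upload.asset_created':
      // Event data is the upload; it carries the new asset's ID.
      return { ...video, asset_id: event.data.asset_id, status: 'preparing' };
    case 'video.asset.ready': {
      // Event data is the asset; playback IDs arrive as an array.
      const playbackId = (event.data.playback_ids || [])[0];
      return {
        ...video,
        playback_id: playbackId && playbackId.id,
        status: 'ready',
      };
    }
    default:
      return video; // other events don't change this row
  }
}
```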

### 5. Store the information in your database

Save the `asset_id` and `playback_id` to your database, associating them with the user or relevant entity in your application.

Here's an example of how you might structure your database table schema:

| videos                    |                        |            |
|--------------------------|------------------------|------------|
| id                       | uuid (primary key)     |            |
| user\_id                  | uuid (foreign key)     | References users.id |
| upload\_id                | string                 | From initial upload |
| asset\_id                 | string                 | From Mux webhook |
| playback\_id              | string                 | From Mux webhook |
| title                    | string                 | Optional metadata |
| status                   | enum                   | e.g. `preparing`, `ready` |
| created\_at               | timestamp              |            |
| updated\_at               | timestamp              |            |

### 6. Use the IDs

While Mux generates several IDs during the upload and processing flow, there are two key IDs you'll primarily work with:

1. The `asset_id`: This is used when you need to manage your video through the Mux API (like deleting the video or checking its status)
2. The `playback_id`: This is what you'll use to actually play your video, either by:
   * Adding it to Mux Player
   * Creating a URL where your video can be played

<Callout type="info">
  Note that this process happens asynchronously, so your application should be designed to handle the delay between the initial upload and when the video becomes available for playback.
</Callout>

For more detailed implementations, you can refer to the examples provided in the Mux documentation for various frameworks:

* [Next.js](/docs/frameworks/next-js)
* [SvelteKit](/docs/frameworks/sveltekit)
* [Astro](/docs/frameworks/astro)
* [Remix](/docs/frameworks/remix-js)

## Fetching the upload URL async

At the time you render the `<mux-uploader>`, you may not have the direct upload URL yet. Instead, you might want to retrieve it async from your server after a user selects a file. You can do that by setting the `endpoint` property value to a custom function instead of a URL.

```html
<mux-uploader></mux-uploader>

<script>
  const muxUploader = document.querySelector("mux-uploader");
  /*
    Endpoint should be a function that returns a promise and resolves
    with a string for the upload URL.
  */
  muxUploader.endpoint = function () {
    /*
      In this example, your server endpoint would return the upload URL
      in the response body "https://storage.googleapis.com/video..."
    */
    return fetch("/your-server/api/create-upload").then(res => res.text());
  };
</script>
```

This is even easier using React props:

```jsx
import MuxUploader from "@mux/mux-uploader-react";

export default function App() {
  return (
    <MuxUploader
      endpoint={() => {
        return fetch("/your-server/api/create-upload")
          .then(res => res.text());
      }}
    />
  );
}
```

## Customizing the UI

As you can see in the examples above, Mux Uploader provides a feature-rich and reasonably styled (albeit basic) UI by default.

It will automatically update based on the different stages or states of uploading: showing a UI for file selection before a video has been picked,
showing progress as the file is uploaded, showing when the upload has completed, and showing an error state with the option to retry if something
goes wrong with the upload.

In addition, Mux Uploader provides many ways to customize this look and feel, including:

* attributes / properties like `no-drop` or `pausable` to enable/disable UI components
* intuitive styling with CSS, just like any other HTML element
* state transition attributes like `upload-in-progress` or `upload-error` for responsive styling
* attribute / property based data customization for things like `dynamic-chunk-size` or `max-file-size`
* overridable and composable components like `<mux-uploader-file-select>` or `<mux-uploader-drop>` for full flexibility of UI

<GuideCard
  title="Core functionality"
  description="Understand the features and core functionality of Mux Uploader"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/uploader-web-core-functionality",
    },
  ]}
/>

<GuideCard
  title="Integrate Mux Uploader"
  description="Integrate Mux Uploader in your web application. See examples in popular front-end frameworks."
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/uploader-web-integrate-in-your-webapp",
    },
  ]}
/>

<GuideCard
  title="Customize the look and feel"
  description="Customize Mux Uploader to match your brand and needs"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/uploader-web-customize-look-and-feel",
    },
  ]}
/>


# Core functionality of Mux Uploader
In this guide, see the features and functionality that Mux Uploader gives you out of the box.
## Mux Video integration

Mux Uploader is built for working with Mux's [Direct Uploads](/docs/guides/upload-files-directly) API and workflow. Add your upload
URL as Mux Uploader's [`endpoint`](/docs/guides/mux-uploader#upload-a-video) to use it.

Mux Uploader uses [UpChunk](https://github.com/muxinc/upchunk) under the hood to handle large files by splitting them into small chunks before uploading them.

## Controls and UI

Mux Uploader provides a feature-rich, dynamic UI that changes based on the current state of your media upload.
These can be broken down into:

| State | Attribute | Description |
| ----- | --------- | ----------- |
| Initial | (none) | State before a media file has been selected for upload |
| In Progress | `upload-in-progress` | State while media chunks are being uploaded |
| Completed | `upload-complete` | State after the media has successfully finished uploading all chunks |
| Error | `upload-error` | State whenever an error occurs that results in a failure to fully upload the media |

## Initial State

The initial state shows both a drag-and-drop region and a file select button for choosing the file to upload.
By default, it looks like this:

<Image src="/docs/images/mux-uploader-web-drop.png" width={502} height={210} />

## In Progress State

Under normal conditions, the in-progress state indicates upload progress as both a numeric percentage and
a progress bar. It will look something like this:

<Image src="/docs/images/mux-uploader-web-progress.png" width={435} height={129} />

### Pausing

In addition, you can [opt into pausing](/docs/guides/uploader-web-customize-look-and-feel#enable-pausing), in which case the UI
will look like one of these, depending on whether the upload is unpaused, pausing (waiting for the current chunk to finish uploading), or paused.

<MultiImage
  images={[
  { src: "/docs/images/mux-uploader-web-pause.png", width: 710, height: 173 },
  { src: "/docs/images/mux-uploader-web-pausing.png", width: 710, height: 173 },
  { src: "/docs/images/mux-uploader-web-resume.png", width: 710, height: 173 },
]}
/>

### Offline

Finally, if you lose your internet connection while an upload is in progress, you'll see this:

<Image src="/docs/images/mux-uploader-web-offline.png" width={436} height={182} />

## Completed State

Once uploading has completed, Mux Uploader will present the following status:

<Image src="/docs/images/mux-uploader-web-complete.png" width={436} height={108} />

## Error State

And if you encounter an error, by default you'll see the error message and a retry button:

<Image src="/docs/images/mux-uploader-web-retry.png" width={710} height={160} />

<Callout type="info">
  If you want to explore different ways to customize the UI for these different states,
  check out our documentation on [customizing Mux Uploader's look and feel](/docs/guides/uploader-web-customize-look-and-feel).
</Callout>

## Error handling

Mux Uploader will monitor for unrecoverable errors and surface them via the UI, giving the
user the opportunity to retry the upload. Mux Uploader monitors both HTTP-status based errors
(e.g. 4xx, 5xx statuses) and file processing errors like exceeding maximum file size limits. See our [optional configuration options](#configure-upload-details) below for more ways to work around some of these errors.

In addition, before surfacing an HTTP-based error, Mux Uploader will automatically retry the request 5 times.

You may also listen for these errors via the `uploaderror` event, discussed in the section below.

## Using events

All of Mux Uploader's core UI behaviors and functionality are driven by specific events. These fall into two
categories:

1. user-driven update events (e.g. notifying Mux Uploader which file to upload or to retry uploading after an error)
2. state-driven informational events (e.g. notifying subcomponents or anyone else listening about the upload progress or that an error occurred)

For example, you can listen for the `progress` event to receive details on how far along your file upload is.

```js
const muxUploader = document.querySelector('mux-uploader');

muxUploader.addEventListener('progress', function (e) {
  console.log(`My upload is ${e.detail}% complete!`);
});
```

When the upload is complete, you'll see 100% on the progress bar and the `success` event will fire.

If an error occurs during the upload, an `uploaderror` event will fire.

### Example HTML Usage

```html
<mux-uploader endpoint="https://my-authenticated-url/storage?your-url-params"></mux-uploader>

<script>
  const muxUploader = document.querySelector('mux-uploader');

  muxUploader.addEventListener('success', function () {
    // Handle upload success
  });

  muxUploader.addEventListener('uploaderror', function () {
    // Handle upload error
  });
</script>
```

### Example React Usage

```jsx
import MuxUploader from "@mux/mux-uploader-react";

export default function App() {
  return (
    <MuxUploader
      endpoint="https://my-authenticated-url/storage?your-url-params"
      onSuccess={() => {
        // Handle upload success
      }}
      onUploadError={() => {
        // Handle upload error
      }}
    />
  );
}
```

## Configure Upload Details

In addition to various UI customization and behaviors, Mux Uploader exposes the following attributes / properties for configuring details
about the file upload itself:

| Attribute / Property | Description |
| --- | --- |
| `max-file-size` / `maxFileSize` | The largest size, in kB, allowed for upload |
| `chunk-size` / `chunkSize` | The size of each upload chunk, in kB. Useful for advanced optimization based on known network conditions or file details. |
| `dynamic-chunk-size` / `dynamicChunkSize` | A boolean that tells Mux Uploader to automatically adapt its chunk size larger or smaller based on network conditions. |
| `use-large-file-workaround` / `useLargeFileWorkaround` | A boolean that enables a less memory-efficient way of loading and chunking files for environments that don't reliably handle [`ReadableStream` for large files](https://developer.mozilla.org/en-US/docs/Web/API/Streams_API/Using_readable_streams). This can occur on e.g. Safari browsers with files >= 4GB. **NOTE:** This fallback will only be used if and when attempts to use `ReadableStream` fail. |
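For example, you might cap uploads at roughly 1 GB and let the chunk size adapt to network conditions. The endpoint URL below is a placeholder, and the size values are example choices:

```html
<!-- Hypothetical values: max-file-size is in kB, so 1048576 kB ≈ 1 GB -->
<mux-uploader
  endpoint="https://my-authenticated-url/storage?your-url-params"
  max-file-size="1048576"
  dynamic-chunk-size
></mux-uploader>
```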

## Full API reference

Any features or settings not mentioned above can be found in our [full API reference](https://github.com/muxinc/elements/blob/main/packages/mux-uploader/REFERENCE.md)
covering all of the available events, attributes, properties, slots, CSS parts, and CSS variables available on Mux Uploader and all of its subcomponents.


# Integrate Mux Uploader into your web application
In this guide, you will learn about Mux Uploader and how to use it in your web application.
## Install Mux Uploader

Mux Uploader has 2 packages:

* `@mux/mux-uploader`: the web component, compatible with all frontend frameworks
* `@mux/mux-uploader-react`: the React component, for usage in React

Both are built with TypeScript and can be installed via `npm` or `yarn`, or loaded via the hosted option on `jsdelivr`.

### NPM

```shell
npm install @mux/mux-uploader@latest #or @mux/mux-uploader-react@latest
```

### Yarn

```shell
yarn add @mux/mux-uploader@latest #or @mux/mux-uploader-react@latest
```

### Hosted

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-uploader"></script>
<!--
or src="https://cdn.jsdelivr.net/npm/@mux/mux-uploader-react"
-->
```

## Providing attributes

The only required value to use Mux Uploader is [`endpoint`](/docs/guides/mux-uploader#upload-a-video).
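The `endpoint` value comes from Mux's [Direct Uploads](/docs/guides/upload-files-directly) API. As a sketch, you might create an upload URL server-side like this (the `cors_origin` and `playback_policy` values shown are example choices):

```shell
# Create a Direct Upload; authenticate with your Access Token credentials.
curl https://api.mux.com/video/v1/uploads \
  -X POST \
  -H "Content-Type: application/json" \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}" \
  -d '{ "cors_origin": "https://example.com", "new_asset_settings": { "playback_policy": ["public"] } }'
```

The `data.url` field in the response is the value to pass to Mux Uploader as `endpoint`.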

## Examples

### HTML element

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<!-- Replace endpoint value with a valid Mux Video Direct Upload URL -->\n<mux-uploader\n  endpoint=\"https://httpbin.org/put\"\n></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js'",
      "hidden": true
    }
  }
}
```

Using Mux Uploader in HTML just requires adding the hosted `<script>` tag to your page and then adding the `<mux-uploader>` element where you need it.

### React

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader from \"@mux/mux-uploader-react\";\n\nexport default function App() {\n  return (\n    <MuxUploader endpoint=\"https://httpbin.org/put\" />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

For our React implementation, you can use it just like you would any other React component.

### Svelte

Because Svelte supports web components, it doesn't need a separate wrapper component like React does. View the SvelteKit example in the
[Mux Elements repo](https://github.com/muxinc/elements/tree/main/examples/svelte-kit) for a fully functioning example.

```html
<script context="module" lang="ts">
  export const prerender = true;
</script>

<script lang="ts">
  // this prevents the custom elements from being redefined when the REPL is updated and reloads, which throws an error
  // this means that any changes to the custom element won't be picked up without saving and refreshing the REPL
  // const oldRegister = customElements.define;
  // customElements.define = function(name, constructor, options) {
  // 	if (!customElements.get(name)) {
  // 		oldRegister(name, constructor, options);
  // 	}
  // }
  // import { page } from '$app/stores';
  import { onMount } from "svelte";
  onMount(async () => {
    await import("@mux/mux-uploader");
  });
</script>

<mux-uploader endpoint="https://httpbin.org/put" />
```

### Vue

Because Vue supports web components, it doesn't need a separate wrapper component like React. View the Vue example in the [Mux Elements repo](https://github.com/muxinc/elements/tree/main/examples/vue-with-typescript) for a fully functioning example.

```html
<script setup lang="ts">
  import "@mux/mux-uploader";
</script>

<template>
  <main>
    <mux-uploader endpoint="https://httpbin.org/put" />
  </main>
</template>
```

<GuideCard
  title="Customize the look and feel"
  description="Customize Mux Uploader to match your brand"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/uploader-web-customize-look-and-feel",
    },
  ]}
/>

{/* <GuideCard
    title="Advanced usage"
    description="Learn about advanced usage of Mux Player"
    links={[
      {
        title: "Read the guide",
        href: "/docs/guides/player-advanced-usage",
      },
    ]}
  /> */}


# Customize the look and feel of Mux Uploader
Learn how to customize the look and feel of Mux Uploader to fit your brand and use case.
## Configure UI features

The basic use case of Mux Uploader includes many UI features which may be enabled or disabled by default.
You can toggle many of these via attributes/properties.

### Enable pausing

For larger video files, you may want to allow your users to pause and resume an upload. You can enable this in the UI using
the `pausable` attribute, property, or React prop.

Because Mux Uploader uploads the file in chunks, it will wait to complete uploading the current chunk before pausing. To indicate this,
the pause button will actually have 3 states:

1. Pause - indicates the upload is not currently paused, but can be by pressing the button.
2. Pausing - indicates that the upload will pause once the current chunk upload finishes. The button will be disabled in this case.
3. Resume - indicates the upload is currently paused, but can be resumed by pressing the button.

Below are examples of what this looks like in the UI.

<MultiImage
  images={[
  { src: "/docs/images/mux-uploader-web-pause.png", width: 710, height: 173 },
  { src: "/docs/images/mux-uploader-web-pausing.png", width: 710, height: 173 },
  { src: "/docs/images/mux-uploader-web-resume.png", width: 710, height: 173 },
]}
/>
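Enabling this is just a matter of adding the attribute. The endpoint URL below is a placeholder:

```html
<!-- pausable adds the pause/resume button to the in-progress UI -->
<mux-uploader
  endpoint="https://my-authenticated-url/storage?your-url-params"
  pausable
></mux-uploader>
```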

### Disable Retrying

If for some reason your video upload fails, Mux Uploader will allow a user to retry via the UI. You can disable this using the
`no-retry` attribute or `noRetry` property in the web component, or just `noRetry` prop in React.

Below are examples of what this looks like in the UI.

<MultiImage
  images={[
  { src: "/docs/images/mux-uploader-web-retry.png", width: 710, height: 160 },
  { src: "/docs/images/mux-uploader-web-no-retry.png", width: 710, height: 141 },
]}
/>

### Disable Drag & Drop

Mux Uploader makes drag and drop available for your video files by default. You can disable this using the
`no-drop` attribute or `noDrop` property in the web component, or just `noDrop` prop in React.

Below are examples of what this looks like in the UI.

<MultiImage
  images={[
  { src: "/docs/images/mux-uploader-web-drop.png", width: 502, height: 210 },
  { src: "/docs/images/mux-uploader-web-no-drop.png", width: 710, height: 50 },
]}
/>

<Callout type="info">
  **Note:** There are two likely cases where you may want to disable drag and drop on Mux Uploader:

  1. You still want to support drag and drop, but your page or application design needs the drop zone component somewhere different.
     Mux Uploader supports this by allowing you to [use its subcomponents directly](/docs/guides/uploader-web-use-subcomponents-directly).
  2. You want to use Mux Uploader with all of its features baked in, but drag and drop doesn't make sense for your design. Because
     things like the upload progress UI require more space for their display, you'll probably also want to
     [use CSS to customize Mux Uploader](#style-with-css).
</Callout>

### Disable other UI subcomponents or features

Mux Uploader also provides attributes and properties to disable:

* The upload progress UI (`no-progress` / `noProgress` for the web component attribute / property, `noProgress` for the React prop)
* The upload status UI (e.g. when the upload is complete or when an error occurs) (`no-status` / `noStatus` for the web component attribute / property, `noStatus` for the React prop)

Since removing these UI elements might result in a poor user experience, you may want to [use Mux Uploader's subcomponents directly](/docs/guides/uploader-web-use-subcomponents-directly) for a more bespoke design when doing so.
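For example, an uploader that keeps the file select button and progress UI but drops the drag-and-drop region and status messages could look like this (the endpoint URL is a placeholder):

```html
<!-- no-drop and no-status remove the drop zone and status UI respectively -->
<mux-uploader
  endpoint="https://my-authenticated-url/storage?your-url-params"
  no-drop
  no-status
></mux-uploader>
```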

## Override the file selector with slots

Because Mux Uploader is a [web component](https://developer.mozilla.org/en-US/docs/Web/API/Web_components), it lets you provide your
own file select element simply by adding it as a child and using the [named slot](https://developer.mozilla.org/en-US/docs/Web/API/Web_components/Using_templates_and_slots#using_the_element-details_custom_element_with_named_slots)
`slot="file-select"` attribute or property.

This is really handy if, for example, you already have a `.btn` class or similar that styles buttons in your application. For example:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  .btn {\n    /* your styles for .btn */\n    padding: 6px 8px;\n    border: 1px solid #0d9488;\n    border-radius: 5px;\n    font-size: 24px;\n    color: white;\n    background: deeppink;\n    cursor: pointer;\n  }\n</style>\n\n<!-- slot=\"file-select\" tells mux-uploader to replace the default file selector with a button.btn element -->\n<mux-uploader endpoint=\"https://httpbin.org/put\">\n  <button class=\"btn\" type=\"button\" slot=\"file-select\">Pick a file</button>\n</mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js'",
      "hidden": true
    }
  }
}
```

The same applies to the React version of the component, `<MuxUploader/>`, as it's just a wrapper around the web component:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader from \"@mux/mux-uploader-react\";\n\nexport default function App() {\n  return (\n    <MuxUploader endpoint=\"https://httpbin.org/put\">\n      <button\n        slot=\"file-select\"\n        type=\"button\"\n        style={{\n          /* your styles for .btn */\n          padding: '6px 8px',\n          border: '1px solid #0d9488',\n          borderRadius: 5,\n          fontSize: 24,\n          color: 'white',\n          background: 'deeppink',\n          cursor: 'pointer',\n        }}\n      >Pick a file</button>\n    </MuxUploader>\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

## Style with CSS

The Mux Uploader element, `<mux-uploader>`, can be styled and positioned with CSS just like you would any other HTML element. For example:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  mux-uploader {\n    display: inline-flex;\n    width: 350px;\n    height: 275px;\n    color: white;\n    background: hotpink;\n    font-family: \"Gill Sans\", sans-serif;\n  }\n</style>\n\n<!-- slot=\"file-select\" tells mux-uploader to replace the default file selector with a button.btn element -->\n<mux-uploader endpoint=\"https://httpbin.org/put\"></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js'",
      "hidden": true
    }
  }
}
```

Because Mux Uploader React is a wrapper around the HTML element, the same applies to it as well:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader from \"@mux/mux-uploader-react\";\n\nexport default function App() {\n  return (\n    <MuxUploader\n      endpoint=\"https://httpbin.org/put\"\n      style={{\n        display: 'inline-flex',\n        width: 350,\n        height: 275,\n        color: 'white',\n        background: 'hotpink',\n        fontFamily: '\"Gill Sans\", sans-serif',\n      }}\n    />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

<Callout type="info" title="A couple of notes here:">
  * Mux Uploader relies on certain styles for its layout, so take care when overriding them. For example, flexbox is used by default to lay out
    its subcomponents, so it might be best to prefer `display: inline-flex` rather than changing it to `inline` or `inline-block`.
  * Because Mux Uploader is a complex component made up of various sub-components, your mileage may vary on simply relying
    on CSS to style the component. In these more advanced cases of styling, you may want to explore [using CSS variables](#use-css-variables-for-additional-styling) or
    [using the Mux Uploader subcomponents directly](/docs/guides/uploader-web-use-subcomponents-directly).
</Callout>

### Use CSS variables for additional styling

In addition to styling with standard CSS, Mux Uploader exposes some additional styles via [CSS variables](https://developer.mozilla.org/en-US/docs/Web/CSS/Using_CSS_custom_properties).
This allows you to easily tweak some of the "under the hood" subcomponents' styles. These include:

| Name                            | CSS Property       | Default Value               | Description                                     |
| ------------------------------- | ------------------ | --------------------------- | ----------------------------------------------- |
| `--overlay-background-color`    | `background-color` | `rgba(226, 253, 255, 0.95)` | background color of the drop overlay            |
| `--progress-bar-fill-color`     | `background`       | `#000000`                   | color for progress bar                          |
| `--progress-percentage-display` | `display`          | `block`                     | display value for text percentage progress UI   |
| `--progress-radial-fill-color`  | `stroke`           | `black`                     | stroke color for radial progress (experimental) |

Building off of the prior examples, you can use these just like you would other CSS variables:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  mux-uploader {\n    --overlay-background-color: purple;\n    --progress-bar-fill-color: purple;\n    --progress-percentage-display: none;\n    display: inline-flex;\n    width: 350px;\n    height: 275px;\n    color: white;\n    background: hotpink;\n    font-family: \"Gill Sans\", sans-serif;\n  }\n</style>\n\n<!-- slot=\"file-select\" tells mux-uploader to replace the default file selector with a button.btn element -->\n<mux-uploader endpoint=\"https://httpbin.org/put\"></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js'",
      "hidden": true
    }
  }
}
```

And for React:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader from \"@mux/mux-uploader-react\";\n\nexport default function App() {\n  return (\n    <MuxUploader\n      endpoint=\"https://httpbin.org/put\"\n      style={{\n        '--overlay-background-color': 'purple',\n        '--progress-bar-fill-color': 'purple',\n        '--progress-percentage-display': 'none',\n        display: 'inline-flex',\n        width: 350,\n        height: 275,\n        color: 'white',\n        background: 'hotpink',\n        fontFamily: '\"Gill Sans\", sans-serif',\n      }}\n    />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

### Use uploader attributes for state-driven styling

Mux Uploader uses read-only properties and attributes to manage and advertise different state changes during the upload process.

These are:

| State | Description |
| --- | --- |
| (none) | Upload has not yet begun |
| `upload-in-progress` | Upload is currently in progress. **NOTE:** This includes while the upload is paused. |
| `upload-complete` | Upload has completed. |
| `upload-error` | An error occurred while attempting to upload. |

These allow you to use [attribute selectors](https://developer.mozilla.org/en-US/docs/Learn/CSS/Building_blocks/Selectors/Attribute_selectors)
if you want state-driven, dynamic styling via CSS.

Here's a basic example of these in action that builds off of the prior examples:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  mux-uploader {\n    --overlay-background-color: purple;\n    --progress-bar-fill-color: purple;\n    --progress-percentage-display: none;\n    display: inline-flex;\n    width: 350px;\n    height: 275px;\n    color: white;\n    background: hotpink;\n    font-family: \"Gill Sans\", sans-serif;\n  }\n\n  mux-uploader[upload-in-progress] {\n    background: orange;\n  }\n\n  mux-uploader[upload-complete] {\n    background: green;\n  }\n</style>\n\n<!-- slot=\"file-select\" tells mux-uploader to replace the default file selector with a button.btn element -->\n<mux-uploader endpoint=\"https://httpbin.org/put\"></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js'",
      "hidden": true
    }
  }
}
```

<Callout type="info">
  **NOTE:** Because Mux Uploader React is a thin wrapper around the Mux Uploader web component, you can use these exact same CSS selectors
  in your React application. Alternatively, some frameworks, like [Tailwind CSS](https://tailwindcss.com/), have built-in support for arbitrary
  attribute selectors. For an example of this in use, see the [section below](#using-tailwind-css).
</Callout>

### Styling in React

If you're using React to build your application, there are some common React styling patterns that are less relevant to the
web component version. Here are a couple of them.

### Using CSS modules

One common pattern for styling in React is to use CSS Modules for locally scoped styles:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import styles from \"./Styles.module.css\";\nimport MuxUploader from \"@mux/mux-uploader-react\";\n\nexport default function App() {\n  return (\n    <MuxUploader className={styles.uploader} endpoint=\"https://httpbin.org/put\" />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    },
    "/Styles.module.css": {
      "code": ".uploader {\n  --overlay-background-color: purple;\n  --progress-bar-fill-color: purple;\n  --progress-percentage-display: none;\n  display: inline-flex;\n  width: 350px;\n  height: 275px;\n  color: white;\n  background: hotpink;\n  font-family: \"Gill Sans\", sans-serif;\n}\n"
    }
  },
  "template": "react"
}
```

### Using Tailwind CSS

Another common approach to styling React applications is using [Tailwind CSS](https://tailwindcss.com). Here's an example for Mux Uploader
approximating the previous examples, including CSS variables via
[arbitrary properties](https://tailwindcss.com/docs/adding-custom-styles#arbitrary-properties) and attribute selectors via
[arbitrary variants](https://tailwindcss.com/docs/hover-focus-and-other-states#using-arbitrary-variants):

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader from \"@mux/mux-uploader-react\";\n\n// Declaring the tailwind classes as an array for legibility\nconst stylesList = [\n  'inline-flex',\n  'w-4/5',\n  'h-4/5',\n  'font-sans',\n  'text-white',\n  'bg-pink-400',\n  '[--progress-percentage-display:none]',\n  '[--overlay-background-color:purple]',\n  '[--progress-bar-fill-color:purple]',\n  '[&[upload-in-progress]]:bg-orange-500',\n  '[&[upload-complete]]:bg-green-500',\n];\n\nconst stylesStr = stylesList.join(' ');\n\nexport default function App() {\n  return (\n    <MuxUploader\n      className={stylesStr}\n      endpoint=\"https://httpbin.org/put\"\n    />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "options": {
    "externalResources": [
      "https://cdn.tailwindcss.com"
    ]
  },
  "template": "react"
}
```


# Compose custom UIs with subcomponents
Learn how to use Mux Uploader's various subcomponents to compose even more bespoke user experiences and designs.
Although Mux Uploader is a single component that's easy to drop into your web application, it's actually built using several subcomponents
"under the hood." If your application design or desired user experience requires more customization, you can use the individual web components that come packaged with Mux Uploader to build out a custom upload UI that meets your needs.

To use this approach, add an `id` attribute to your `<mux-uploader>` element with a unique value.

You can then associate the `<mux-uploader>` element with any of the packaged components by adding a `mux-uploader=""` attribute to each component and setting it to the `id` that you gave to the `<mux-uploader>` element.

Here's a simple example for the web component:

```html
<!-- add a mux-uploader tag with an id attribute and hide it with CSS -->
<mux-uploader id="my-uploader" style="display: none;"></mux-uploader>

<!-- ...then, somewhere else in your app, add a reference back to it -->
<mux-uploader-file-select mux-uploader="my-uploader">
  <button slot="file-select">Pick a video</button>
</mux-uploader-file-select>
```

Here's one for React:

```jsx
import MuxUploader, { MuxUploaderFileSelect } from "@mux/mux-uploader-react";

export default function App() {
  return (
    <>
      <MuxUploader id="my-uploader" style={{ display: "none" }} />

      {/* ...then, somewhere else in your app, add a reference back to it */}
      <MuxUploaderFileSelect mux-uploader="my-uploader">
        <button slot="file-select">Pick a video</button>
      </MuxUploaderFileSelect>
    </>
  );
}
```

Because all of these are web components, you can use CSS to style them or
any of their slotted children (discussed below).

## Subcomponents

## File Select

The file select subcomponent is what tells Mux Uploader to open the file selection browser. The web component is
`<mux-uploader-file-select>`, and the React component is `<MuxUploaderFileSelect>`.

You can further customize it by slotting in your own `<button>` or other component in the `file-select` slot.

Here's an example:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  /* Hide the uploader before uploading */\n  mux-uploader:not([upload-error]):not([upload-in-progress]):not([upload-complete]) {\n    display: none;\n  }\n\n  mux-uploader-file-select button {\n    background: hotpink;\n    color: white;\n    padding: 4px 2px;\n    border: none;\n  }\n\n  mux-uploader-file-select button:hover {\n    background: purple;\n  }\n</style>\n<!-- In this example, we're still using Mux Uploader as a visual component -->\n<mux-uploader\n    id=\"my-uploader\"\n    no-drop\n    endpoint=\"https://httpbin.org/put\"\n  ></mux-uploader>\n<mux-uploader-file-select mux-uploader=\"my-uploader\">\n  <button>Select from a folder</button>\n</mux-uploader-file-select>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

## Drop

The drop subcomponent is what implements the [drag and drop API](https://developer.mozilla.org/en-US/docs/Web/API/HTML_Drag_and_Drop_API)
and tells Mux Uploader the relevant details about the file.
The web component is `<mux-uploader-drop>`, and the React component is `<MuxUploaderDrop>`.

Mux Uploader Drop provides a few slots for customization.

* `heading` - By default this is a `<span>` with the text "Drop a video file here to upload".
* `separator` - By default this is a `<span>` containing the text "or" placed between the heading and any additional children.
* (default) - Any additional children that don't have a specified slot will show up below the two previous slots.

Here's an example that puts all of these together, including CSS:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  /* Customize drop area background color & active background color */\n  mux-uploader-drop {\n    padding: 40px;\n    color: white;\n    background: hotpink;\n  }\n\n  mux-uploader-drop[active] {\n    background: #ffe4e6;\n  }\n</style>\n\n<mux-uploader-drop mux-uploader=\"my-uploader\">\n  <!-- Change the heading text/UI shown -->\n  <div slot=\"heading\">Drop videoz here plz</div>\n  <!-- Remove/hide the separator text/UI (default \"Or\") -->\n  <div slot=\"separator\" style=\"display: none;\"></div>\n  <div>You can also add arbitrary children for designs like the drop zone being the full screen</div>\n</mux-uploader-drop>\n<!-- In this example, we're still using Mux Uploader as a visual component -->\n<mux-uploader\n    id=\"my-uploader\"\n    no-drop\n    endpoint=\"https://httpbin.org/put\"\n  ></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

In addition, Mux Uploader Drop has attributes/properties for optionally showing an overlay whenever a file is
dragged over it. These are on by default in Mux Uploader, and are:

* `overlay` - A boolean attribute / property / React prop for enabling the overlay UI.
* `overlay-text` (`overlayText` property and React prop) - Allows you to provide custom text to show on the overlay.

If you'd like to further customize the overlay with a different background color, you can use the
`--overlay-background-color` CSS variable (which is also available when [using Mux Uploader directly](/docs/guides/uploader-web-customize-look-and-feel#use-css-variables-for-additional-styling))

Here's an example of these in action:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  /* Customize drop area background color & overlay background color */\n  mux-uploader-drop {\n    padding: 40px;\n    color: white;\n    background: hotpink;\n    --overlay-background-color: purple;\n  }\n</style>\n\n<!-- Use an overlay with customized overlay text -->\n<mux-uploader-drop overlay-text=\"Just let go!\" overlay mux-uploader=\"my-uploader\">\n  <!-- Change the heading text/UI shown -->\n  <div slot=\"heading\">Drop videoz here plz</div>\n  <!-- Remove/hide the separator text/UI (default \"Or\") -->\n  <div slot=\"separator\" style=\"display: none;\"></div>\n  <div>You can also add arbitrary children for designs like the drop zone being the full screen</div>\n</mux-uploader-drop>\n<!-- In this example, we're still using Mux Uploader as a visual component -->\n<mux-uploader\n    id=\"my-uploader\"\n    no-drop\n    endpoint=\"https://httpbin.org/put\"\n  ></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

### Custom Drop

You can even implement your own drag and drop completely separate from `<mux-uploader>`. As long as you dispatch a custom `file-ready` event with the file in the `detail` property, `<mux-uploader>` will handle the upload upon receiving the event.

```html
<script>
  const muxUploader = document.querySelector("mux-uploader");
  // dropZone is your own drop target element (any element you choose)
  const dropZone = document.querySelector("#my-drop-zone");

  // Prevent the browser's default handling so drop events fire
  dropZone.addEventListener("dragover", (event) => event.preventDefault());

  dropZone.addEventListener("drop", (event) => {
    event.preventDefault();
    const file = event.dataTransfer.files[0];

    // Dispatch custom event to trigger upload
    muxUploader.dispatchEvent(
      new CustomEvent("file-ready", {
        composed: true,
        bubbles: true,
        detail: file,
      })
    );
  });
</script>
```

## Progress

The progress subcomponent visualizes the progress of your upload. In fact, it is used twice "under the hood" by the default `<mux-uploader>`:
once to show the percentage, and once to show the progress bar.
The web component is `<mux-uploader-progress>`, and the React component is `<MuxUploaderProgress>`.

In addition, Mux Uploader Progress exposes the `type` attribute / property / React prop for choosing the particular kind of visualization you'd prefer. The
available type values are:

* `percentage` (default) - Show as a numeric % in text
* `bar` - Show as a progress bar
* `radial` (***Experimental***) - Show as a radial/circular progress indicator

Each of these types also has CSS variables available for further customization:

`percentage`:

* `--progress-percentage-display` - Applies to the `display` of the underlying percentage element (default: `block`).

`bar`:

* `--progress-bar-height` - Applies to the `height` of the progress bar (default: `4px`).
* `--progress-bar-fill-color` - Applies to the color of the progress bar's progress indication (default: `black`).

`radial`:

* `--progress-radial-fill-color` - Applies to the color of the radial progress indication (default: `black`).

Here's an example of these in action:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  mux-uploader-progress {\n    --progress-bar-fill-color: purple;\n    --progress-radial-fill-color: purple;\n    color: purple;\n    --progress-bar-height: 10px;\n  }\n</style>\n<!-- In this example, we're still using Mux Uploader as a visual component -->\n<mux-uploader\n    id=\"my-uploader\"\n    no-progress\n    no-drop\n    endpoint=\"https://httpbin.org/put\"\n  ></mux-uploader>\n  <mux-uploader-progress\n  type=\"percentage\"\n  mux-uploader=\"my-uploader\"\n></mux-uploader-progress>\n<mux-uploader-progress\n  type=\"bar\"\n  mux-uploader=\"my-uploader\"\n></mux-uploader-progress>\n<mux-uploader-progress\n  type=\"radial\"\n  mux-uploader=\"my-uploader\"\n></mux-uploader-progress>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

## Status

The status subcomponent indicates when the upload has completed, when an error has occurred, or when you're offline.
The web component is `<mux-uploader-status>`, and the React component is `<MuxUploaderStatus>`.

Here's an example with a bit of CSS customization, using Mux Uploader's [state attributes](/docs/guides/uploader-web-customize-look-and-feel#use-uploader-attributes-for-state-driven-styling)
on the status component for additional state-driven styling:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  mux-uploader {\n    height: 2rem;\n    padding: 4px 2px;\n  }\n\n  mux-uploader-status {\n    background: hotpink;\n    color: white;\n    padding: 4px 2px;\n    height: 2rem;\n    display: block;\n  }\n\n  mux-uploader-status[upload-error] {\n    background: crimson;\n    /* By default, the error text color will be red. */\n    color: white;\n  }\n\n  mux-uploader-status[upload-complete] {\n    background: dodgerblue;\n  }\n\n  mux-uploader-file-select button:hover {\n    background: purple;\n  }\n</style>\n<mux-uploader-status mux-uploader=\"my-uploader\"></mux-uploader-status>\n<!--\n  In this example, we're still using Mux Uploader as a visual component.\n  Change the endpoint to an invalid one to see what an error looks like.\n-->\n<mux-uploader\n    id=\"my-uploader\"\n    no-drop\n    no-status\n    endpoint=\"https://httpbin.org/put\"\n  ></mux-uploader>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

## Retry

The retry subcomponent is displayed when an error has occurred, and notifies Mux Uploader to retry the upload when clicked.
The web component is `<mux-uploader-retry>`, and the React component is `<MuxUploaderRetry>`.

Here's a simple example:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<!-- In this example, we're still using Mux Uploader as a visual component. -->\n<mux-uploader\n    id=\"my-uploader\"\n    no-drop\n    no-retry\n    endpoint=\"http://fake.url.for/retry/purposes\"\n  ></mux-uploader>\n<mux-uploader-retry mux-uploader=\"my-uploader\"></mux-uploader-retry>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

## Pause

The pause subcomponent is displayed while an upload is in progress, and notifies Mux Uploader to either pause or resume uploading
when clicked, depending on the current uploading state.
The web component is `<mux-uploader-pause>`, and the React component is `<MuxUploaderPause>`.

Here's a simple example:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<!--\n  In this example, we're still using Mux Uploader as a visual component.\n  We've also made the chunk size smaller to help demo pause/resume behavior.\n-->\n<mux-uploader\n  id=\"my-uploader\"\n  no-drop\n  chunk-size=\"512\"\n  endpoint=\"https://httpbin.org/put\"\n></mux-uploader>\n<mux-uploader-pause mux-uploader=\"my-uploader\"></mux-uploader-pause>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```

# Advanced use cases

Here are some more examples of working with the subcomponents directly, combining multiple subcomponents to demonstrate their versatility
and composability in either React or vanilla HTML.

## React CSS modules

Just like you can do with the "batteries" usage of Mux Uploader, you can use [CSS Modules](/docs/guides/uploader-web-customize-look-and-feel#using-css-modules)
to handle styling of your subcomponents in React. Here's an example of how you can style Mux Uploader using CSS Modules:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import styles from \"./Styles.module.css\";\nimport MuxUploader, { MuxUploaderFileSelect, MuxUploaderProgress } from \"@mux/mux-uploader-react\"; \n\nexport default function App() {\n  return (\n    <div>\n        <h2 className={styles.heading}>Mux Uploader with CSS Modules</h2>\n        <MuxUploader id=\"css-modules-uploader\" className={styles.uploader} endpoint=\"https://httpbin.org/put\" />\n        <MuxUploaderFileSelect muxUploader=\"css-modules-uploader\">\n          <button className={styles.button}>Pick your video</button>\n        </MuxUploaderFileSelect>\n        <MuxUploaderProgress type=\"percentage\" muxUploader=\"css-modules-uploader\" className={styles.progress} />\n    </div>\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    },
    "/Styles.module.css": {
      "code": ".heading {color: #333;}\n.uploader { display: none; }\n.progress { color: orange; }\n.button {\n    background: #3a3a9d;\n    padding: 1em;\n    color: white;\n    border-radius: .35em;\n    border: 0;\n    cursor: pointer;\n}\n    "
    }
  },
  "template": "react"
}
```

## React Tailwind CSS

Also like Mux Uploader, you can use [Tailwind CSS](/docs/guides/uploader-web-customize-look-and-feel#using-tailwind-css) for your subcomponent styling. Here's an example in React:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxUploader, { MuxUploaderFileSelect, MuxUploaderProgress, MuxUploaderDrop } from \"@mux/mux-uploader-react\"; \n\nexport default function App() {\n  return (\n    <div className=\"p-4\">\n        <h2 className=\"text-lg text-slate-800 mb-2 font-bold\">Mux Uploader with Tailwind example</h2>\n        <MuxUploader id=\"my-uploader\" className=\"hidden\" endpoint=\"https://httpbin.org/put\" />\n\n        <MuxUploaderDrop\n            id=\"my-uploader\"\n            className=\"border border-4 border-slate-200 rounded-0.125 shadow mb-4\"\n            overlay\n            overlayText=\"Let it go\"\n        >\n            <span slot=\"heading\" className=\"text-slate-600 text-xl mb-2\">Drop your favorite video</span>\n            <span slot=\"separator\" className=\"text-slate-400 text-sm italic\">— or —</span>\n\n            <MuxUploaderFileSelect muxUploader=\"my-uploader\">\n                <button\n                    className=\"bg-pink-500 hover:bg-pink-600 my-2 px-col-0.5 py-2 rounded-0.125 text-white text-sm\"\n                >\n                    Select from a folder\n                </button>\n            </MuxUploaderFileSelect>\n        </MuxUploaderDrop>\n\n        <MuxUploaderProgress\n            type=\"percentage\"\n            muxUploader=\"my-uploader\"\n            className=\"text-3xl text-orange-600 underline\"\n        />\n    </div>\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "options": {
    "externalResources": [
      "https://cdn.tailwindcss.com"
    ]
  },
  "template": "react"
}
```

## Uploader Page

In this example, we use the Mux Uploader Drop component as the parent for a full page upload experience, with the various subcomponents as descendants
with their own customization for a more bespoke look and feel:

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-uploader": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<style>\n  /* Various styles to customize a full page upload experience. See below for the HTML usage. */\n  body {\n    margin: 0;\n    color: white;\n    font-family: \"Gill Sans\", sans-serif;\n  }\n\n  /* Hide the uploader since we're only using it for functionality */\n  mux-uploader {\n    display: none;\n  }\n\n  /* Style the drop component as the root container for the page's UI */\n  mux-uploader-drop {\n    padding: 2rem;\n    width: 100vw;\n    height: 100vh;\n    display: flex;\n    flex-direction: column;\n    align-items: flex-start;\n    justify-content: flex-start;\n    background: hotpink;\n    /* Style the overlay background based on the page's color palette */\n    --overlay-background-color: purple;\n  }\n\n  /* Use a '+' cursor when dragging over the drop subcomponent */\n  mux-uploader-drop[active] {\n    cursor: copy;\n  }\n\n  mux-uploader-drop > [slot=\"heading\"] {\n    margin: 0;\n  }\n\n  /* Hide the drop component's separator text using its part selector */\n  mux-uploader-drop::part(separator) {\n    display: none;\n  }\n\n  mux-uploader-drop > .main-content {\n    flex-grow: 1;\n    align-self: stretch;\n  }\n\n  /* Use CSS to further customize the file select component's custom button (see below) */\n  mux-uploader-file-select > button {\n    padding: 6px 8px;\n    border: 1px solid #0d9488;\n    border-radius: 5px;\n    font-size: 24px;\n    color: white;\n    background: hotpink;\n    cursor: pointer;\n  }\n\n  mux-uploader-file-select > button:hover {\n    background: purple;\n  }\n\n  /* Customize the progress details to fit with the page's theme, including color palette */\n  mux-uploader-progress {\n    --progress-bar-fill-color: purple;\n    --progress-radial-fill-color: purple;\n    color: purple;\n    --progress-bar-height: 10px;\n  }\n\n  mux-uploader-status {\n    font-family: \"Gill Sans\", sans-serif;\n    font-size: 24px;\n    display: block;\n    padding: 6px 0;\n  }\n\n  /* Update the status component's 
text color based on the upload state to better fit the page's palette */\n  mux-uploader-status[upload-error] {\n    /* By default, the error text color will be red. */\n    color: navy;\n  }\n\n  mux-uploader-status[upload-complete] {\n    background: dodgerblue;\n  }\n</style>\n\n<!--\n  Note that in this example, mux-uploader is a child of mux-uploader-drop. This is a perfectly valid use case.\n-->\n<mux-uploader-drop\n mux-uploader=\"my-uploader\"\n overlay\n overlay-text=\"Drop to upload\"\n>\n  <mux-uploader\n    no-drop\n    no-progress\n    no-retry\n    no-status\n    id=\"my-uploader\"\n    endpoint=\"https://httpbin.org/put\"\n  ></mux-uploader>\n  <!-- By using the slot, this will automatically get hidden based on upload state changes -->\n  <h1 slot=\"heading\">Drop your video file anywhere on the page</h1>\n  <div class=\"main-content\">\n    <mux-uploader-status mux-uploader=\"my-uploader\"></mux-uploader-status>\n    <mux-uploader-progress mux-uploader=\"my-uploader\" type=\"percentage\"></mux-uploader-progress>\n    <mux-uploader-progress mux-uploader=\"my-uploader\"></mux-uploader-progress>\n  </div>\n  <mux-uploader-file-select mux-uploader=\"my-uploader\">\n    <button>Browse Files</button>\n  </mux-uploader-file-select>\n</mux-uploader-drop>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-uploader/dist/mux-uploader.js';",
      "hidden": true
    }
  }
}
```


# Use different video quality levels
Learn how to pick an appropriate video quality and control the video quality of assets.
<Callout type="warning" title="Encoding tiers have been renamed to video quality levels">
  [We recently renamed encoding tiers to video quality levels. Read the blog for more details.](https://www.mux.com/blog/no-one-said-naming-was-easy-encoding-tiers-are-now-video-quality-levels)
</Callout>

## Introducing video quality levels

Mux Video supports encoding content with three different video quality levels. The video quality level determines the quality, cost, and available platform features for the asset.

### Basic

The *basic* video quality level uses a reduced encoding ladder, with a lower target video quality, suitable for simpler video use cases.

There is no charge for video encoding when using basic quality.

### Plus

The *plus* video quality level encodes your video at a consistent high-quality level. Assets encoded with the plus quality use an AI-powered per-title encoding technology that boosts bitrates for high-complexity content, ensuring high-quality video, while reducing bitrates for lower-complexity content to save bandwidth without sacrificing quality.

The plus quality level incurs a [cost per video minute of encoding](https://mux.com/pricing).

### Premium

The *premium* video quality level uses the same AI-powered per-title encoding technology as plus, but is tuned to optimize the presentation of premium media content where the highest video quality is required, including use cases like live sports or studio movies.

The premium quality level incurs a higher [cost per video minute of encoding, storage, and delivery](https://mux.com/pricing).

## Set a video quality level when creating an asset

The video quality of an asset is controlled by setting the `video_quality` attribute on your <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create-asset API call</ApiRefLink>, so to create an asset with the `basic` quality level, you should set `"video_quality": "basic"` as shown below.

```json
// POST /video/v1/assets
{
	"inputs": [
		{
			"url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
		}
	],
	"playback_policies": [
		"public"
	],
	"video_quality": "basic"
}
```

And of course you can select the `video_quality` for Direct Uploads, too; just set the same `"video_quality": "basic"` field in the `new_asset_settings` of your <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">create-direct-upload API call</ApiRefLink>.

```json
// POST /video/v1/uploads
{
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic"
  },
  "cors_origin": "*"
}
```
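If you're generating these requests from your own backend, the payload can be assembled programmatically. Here's a minimal Python sketch using only the standard library (the helper names and environment variable names are our own conventions; the endpoint, fields, and basic auth scheme are the Mux API's):

```python
import base64
import json
import os
import urllib.request


def build_upload_request(video_quality="basic", cors_origin="*"):
    """Assemble the create-direct-upload payload shown above."""
    return {
        "new_asset_settings": {
            "playback_policies": ["public"],
            "video_quality": video_quality,
        },
        "cors_origin": cors_origin,
    }


def create_direct_upload(payload):
    """POST the payload to Mux, authenticating with your token key pair."""
    token = base64.b64encode(
        f"{os.environ['MUX_TOKEN_ID']}:{os.environ['MUX_TOKEN_SECRET']}".encode()
    ).decode()
    req = urllib.request.Request(
        "https://api.mux.com/video/v1/uploads",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]
```

In practice you'd return the upload URL from the response to your client, as described in the direct upload guide.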

## Set the video quality when creating a live stream

To set the `video_quality` for a live stream, you just need to set the `"video_quality"` field within the `new_asset_settings` configuration of your <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create-live-stream API call</ApiRefLink> to `"plus"` or `"premium"`, as shown below.

```json
// POST /video/v1/live-streams
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "plus"
  }
}
```

All on-demand assets created from the live stream will also inherit the given quality level. Live streams can currently only use the `plus` or `premium` video quality levels.

## Supported features

Assets using different video quality levels have different features and limits available to them. Refer to the table below for more details:

| Feature | Basic | Plus | Premium |
| :-- | :-- | :-- | :-- |
| [JIT encoding](https://www.mux.com/features#just-in-time-encoding) | ✅ | ✅ | ✅ |
| [Multi CDN delivery](https://www.mux.com/blog/multi-cdn-support-in-mux-video-for-improved-performance-and-reliability) | ✅ | ✅ | ✅ |
| [Mux Data included](https://mux.com/data) | ✅ | ✅ | ✅ |
| [Mux Player included](https://mux.com/player) | ✅ | ✅ | ✅ |
| [Thumbnails, Gifs,](/docs/guides/get-images-from-a-video) [Storyboards](/docs/guides/create-timeline-hover-previews) | ✅ | ✅ | ✅ |
| [Watermarking](/docs/guides/add-watermarks-to-your-videos) | ✅ | ✅ | ✅ |
| [Signed playback IDs and playback restrictions](/docs/guides/secure-video-playback) | ✅ | ✅ | ✅ |
| [On-Demand](/docs/core/stream-video-files) | ✅ | ✅ | ✅ |
| [Master Access](/docs/guides/download-for-offline-editing) | ✅ | ✅  | ✅ |
| [Audio-only Assets](https://www.mux.com/blog/have-you-heard-we-support-audio-only-files-too) | ✅ | ✅ | ✅ |
| [Auto-generated captions](/docs/guides/add-autogenerated-captions-and-use-transcripts) | ✅ | ✅ | ✅ |
| [Clipping to a new asset](/docs/guides/create-clips-from-your-videos)| ✅ | ✅ | ✅ |
| [Multi-track audio](/docs/guides/add-alternate-audio-tracks-to-your-videos) | ✅ | ✅ | ✅ |
| [Live Streaming](/docs/guides/start-live-streaming) | ❌ | ✅ |  ✅ |
| [Adaptive bitrate ladder](https://www.mux.com/video-glossary/abr-adaptive-bitrate) | Reduced | Standard | Extended |
| [Maximum streaming resolution](/docs/guides/stream-videos-in-4k) | 2160p (4K) | 2160p (4K) | 2160p (4K) |
| [MP4 support](/docs/guides/enable-static-mp4-renditions) \* | ✅  | ✅ | ✅ |
| [DRM](/docs/guides/protect-videos-with-drm) <BetaTag /> | ❌ | ✅ | ✅ |

## Examples for comparison

| Video quality level | Content complexity | Playback page |
| :-- | :-- | :-- |
| basic | Simple | [moodeng - basic](https://player.mux.com/elUDQkfDSv43pR8ejgrURIGcNDzx00Jgq) |
| basic | Complex | [waterfall - basic](https://player.mux.com/z00qLVB3NsE1FOE7mCKboHGXQDQIun4sp) |
| plus | Simple | [moodeng - plus](https://player.mux.com/Vl7wZGmm3ztT01VL8UzTv9QEfPleP4kMR) |
| plus | Complex | [waterfall - plus](https://player.mux.com/aTFGdFMLBgdvegF29Orvoj026mpkb8v6Q) |
| premium | Simple | [moodeng - premium](https://player.mux.com/JQ1P00AC3EjGfL7BjI951M006dvL4asVce) |
| premium | Complex | [waterfall - premium](https://player.mux.com/Q8Vk00DqGWTGS40201TtXROm67LNCMlfxed) |

\* MP4 pricing depends on the video quality level of your asset. [Learn more.](/docs/pricing/video#do-you-charge-for-mp4-downloads)


# Stream videos in 4K
Learn how to ingest, store, and deliver your videos in 4K resolutions
## Introduction to 4K

Mux Video supports ingesting, storing, and delivering on-demand video assets in high resolutions up to and including 4K (2160p). At this time, 2K and 4K output is not supported for Mux [Live Streaming](https://www.mux.com/docs/guides/start-live-streaming). Live Streams will accept 2K and 4K input, but the output will be capped to 1080p (and will be billed as 1080p).

Much premium video content is created in 4K, and increasingly user-generated content is too, as mobile devices become more capable of producing 4K video.

Mux Video also supports [2K and 2.5K video on-demand video assets](/docs/guides/stream-videos-in-4k#stream-2k-and-25k-content).

## Create a 4K asset

To ingest, store, and deliver an asset in 4K, you'll need to set the `max_resolution_tier` attribute on your <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create-asset API call</ApiRefLink>.

```json
// POST /video/v1/assets
{
	"inputs": [
		{
			"url": "https://storage.googleapis.com/muxdemofiles/mux-4k.mp4"
		}
	],
	"playback_policies": [
		"public"
	],
	"video_quality": "basic",
	"max_resolution_tier": "2160p"
}
```

This field controls the maximum resolution that we'll encode, store, and deliver your media in. We don't automatically ingest content at 4K, so you can avoid unexpectedly high ingest bills. If you send us 4K content today and don't set `max_resolution_tier`, nothing changes in your bill.

This also allows you to build applications where some of your content creators are able to upload 4K content while others remain capped at 1080p.
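For example, your backend might map each creator's subscription plan to a `max_resolution_tier` before creating the asset. A hypothetical Python sketch (the plan names and mapping are illustrative; only the request fields are Mux's):

```python
# Map your application's subscription plans to a Mux max_resolution_tier.
# The plan names here are hypothetical; only the request fields are Mux's.
PLAN_TO_RESOLUTION = {
    "free": "1080p",
    "pro": "1440p",
    "studio": "2160p",
}


def asset_settings_for(plan, input_url):
    """Build create-asset settings, capping resolution by the uploader's plan."""
    return {
        "inputs": [{"url": input_url}],
        "playback_policies": ["public"],
        "video_quality": "basic",
        # Unknown plans fall back to the 1080p default
        "max_resolution_tier": PLAN_TO_RESOLUTION.get(plan, "1080p"),
    }
```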

And of course you can use 4K with Direct Uploads, too; you just need to set the same `"max_resolution_tier": "2160p"` field in the `new_asset_settings` of your <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">create-direct-upload API call</ApiRefLink>.

```json
// POST /video/v1/uploads
{
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic",
    "max_resolution_tier": "2160p"
  },
  "cors_origin": "*"
}
```

## Play your assets in 4K

For assets with 4K enabled at ingest, we'll automatically add 2K and 4K renditions to your HLS Playback URLs. Mux uses high-bitrate H.264 for delivering 4K content, which is supported on a wide range of devices, like Mux Player shown below.

<Player playbackId="fss00bwClhYMynhxeE2Hv757J02VI68KY5" thumbnailTime="0" title="4K Video Demo" />

While we've tested playback and built device detection rules that should protect you against unexpected playback failures, you should always test playback on your own device footprint before enabling 4K widely on your platform.

## Limiting playback resolution below 4K

Of course, you might not want all of your viewers to be able to view your content in 4K. Lots of streaming platforms choose to offer 4K playback only to their higher subscription tiers. You can implement this by controlling playback resolution with the `max_resolution` query parameter on your playback URLs, as shown below.

```
https://stream.mux.com/${PLAYBACK_ID}.m3u8?max_resolution=1080p
```
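If you're constructing playback URLs in code, a tiny helper keeps the parameter handling in one place. A sketch (the function name is ours; the URL format and `max_resolution` parameter are Mux's):

```python
from urllib.parse import urlencode


def playback_url(playback_id, max_resolution=None):
    """Build an HLS playback URL, optionally capping the playback resolution."""
    url = f"https://stream.mux.com/{playback_id}.m3u8"
    if max_resolution:
        url += "?" + urlencode({"max_resolution": max_resolution})
    return url
```

You could then hand out `playback_url(pid, "1080p")` to standard-tier viewers and the uncapped URL to your 4K tier.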

## Preparing 4K inputs

Mux uses just-in-time (JIT) encoding to make sure your assets are available as soon as possible after you create them, and this includes 4K assets.

Most of the usual restrictions for standard inputs still apply when you're using 4K, but there are a few different restrictions you should be aware of:

* The input must have a maximum dimension of 4096 pixels
* The input must have a maximum keyframe interval of 10 seconds
* The input bitrate must be 20 Mbps or less
* The input must have a frame rate between 5 fps and 60 fps

[You can find full details of the standard input specification for 1080p and 4K content in our documentation.](/docs/guides/minimize-processing-time#standard-input-specs)
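As a pre-flight check before uploading, you could validate a file's probed characteristics against these limits. A hedged Python sketch, assuming you've already extracted the numbers with a tool like ffprobe:

```python
def check_4k_input(width, height, keyframe_interval_s, bitrate_bps, fps):
    """Return a list of problems against the 4K input restrictions above.

    An empty list means the input looks acceptable on these checks alone.
    """
    problems = []
    if max(width, height) > 4096:
        problems.append("maximum dimension exceeds 4096 pixels")
    if keyframe_interval_s > 10:
        problems.append("keyframe interval exceeds 10 seconds")
    if bitrate_bps > 20_000_000:
        problems.append("bitrate exceeds 20 Mbps")
    if not 5 <= fps <= 60:
        problems.append("frame rate outside 5-60 fps")
    return problems
```

This only covers the 4K-specific limits listed above; the standard input specification linked above still applies.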

## Stream 2K and 2.5K content

Mux Video also supports 2K and 2.5K (1440p) content. If you want your asset processed at 2K/2.5K, just set `"max_resolution_tier": "1440p"` in your create asset (or create direct upload) calls instead.


# Upload files directly
Allow your users to upload content directly to Mux.
## Overview

Direct Uploads allow you to provide an authenticated upload URL to your client applications so content can be uploaded directly to Mux without needing any intermediary steps. You still get to control who gets an authenticated URL, how long it's viable, and, of course, the Asset settings used when the upload is complete.

The most common use case for Direct Uploads is in client applications, such as native mobile apps and the browser, but you could also use them to upload directly from your server or in a command line tool. Any time you don't need to store the original yourself, just generate a signed URL and push the content directly.

Let's start by walking through the simplest use case of getting a file directly into Mux.

## Upload a file directly into Mux

### 1. Create an authenticated URL

The first step is creating a new Direct Upload with the Mux Asset settings you want. The Mux API will return an authenticated URL that you can use directly in your client apps, as well as an ID specific to that Direct Upload so you can check the status later via the API.

```curl

curl https://api.mux.com/video/v1/uploads \
  -X POST \
  -H "Content-Type: application/json" \
  -u MUX_TOKEN_ID:MUX_TOKEN_SECRET \
  -d '{ "new_asset_settings": { "playback_policies": ["public"], "video_quality": "basic" }, "cors_origin": "*" }'

```

```elixir

# config/dev.exs
config :mux,
  access_token_id: "MUX_TOKEN_ID",
  access_token_secret: "MUX_TOKEN_SECRET"

client = Mux.client()
params = %{"new_asset_settings" => %{"playback_policies" => ["public"], "video_quality" => "basic"}, "cors_origin" => "https://your-browser-app.com"}
Mux.Video.Uploads.create(client, params)

```

```go

import (
  muxgo "github.com/muxinc/mux-go"
)

client := muxgo.NewAPIClient(
  muxgo.NewConfiguration(
    muxgo.WithBasicAuth(os.Getenv("MUX_TOKEN_ID"), os.Getenv("MUX_TOKEN_SECRET")),
  ))

car := muxgo.CreateAssetRequest{PlaybackPolicy: []muxgo.PlaybackPolicy{muxgo.PUBLIC}, VideoQuality: "basic"}
cur := muxgo.CreateUploadRequest{NewAssetSettings: car, Timeout: 3600, CorsOrigin: "*"}
u, err := client.DirectUploadsApi.CreateDirectUpload(cur)

```

```node

import Mux from '@mux/mux-node';
const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET
});

mux.video.uploads.create({
  cors_origin: 'https://your-browser-app.com', 
  new_asset_settings: {
    playback_policy: ['public'],
    video_quality: 'basic'
  }
}).then(upload => {
  // upload.url is what you'll want to return to your client.
});

```

```php

$config = MuxPhp\Configuration::getDefaultConfiguration()
  ->setUsername(getenv('MUX_TOKEN_ID'))
  ->setPassword(getenv('MUX_TOKEN_SECRET'));

$uploadsApi = new MuxPhp\Api\DirectUploadsApi(
    new GuzzleHttp\Client(),
    $config
);

$createAssetRequest = new MuxPhp\Models\CreateAssetRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC], "video_quality" => "basic"]);
$createUploadRequest = new MuxPhp\Models\CreateUploadRequest(["timeout" => 3600, "new_asset_settings" => $createAssetRequest, "cors_origin" => "https://your-browser-app.com"]);
$upload = $uploadsApi->createDirectUpload($createUploadRequest);

```

```python

import mux_python

configuration = mux_python.Configuration()
configuration.username = os.environ['MUX_TOKEN_ID']
configuration.password = os.environ['MUX_TOKEN_SECRET']

uploads_api = mux_python.DirectUploadsApi(mux_python.ApiClient(configuration))

create_asset_request = mux_python.CreateAssetRequest(playback_policy=[mux_python.PlaybackPolicy.PUBLIC], video_quality="basic")
create_upload_request = mux_python.CreateUploadRequest(timeout=3600, new_asset_settings=create_asset_request, cors_origin="*")
create_upload_response = uploads_api.create_direct_upload(create_upload_request)

```

```ruby

MuxRuby.configure do |config|
  config.username = ENV['MUX_TOKEN_ID']
  config.password = ENV['MUX_TOKEN_SECRET']
end

uploads_api = MuxRuby::DirectUploadsApi.new

create_asset_request = MuxRuby::CreateAssetRequest.new
create_asset_request.playback_policy = [MuxRuby::PlaybackPolicy::PUBLIC]
create_asset_request.video_quality = "basic"
create_upload_request = MuxRuby::CreateUploadRequest.new
create_upload_request.new_asset_settings = create_asset_request
create_upload_request.timeout = 3600
create_upload_request.cors_origin = "https://your-browser-app.com"
upload = uploads_api.create_direct_upload(create_upload_request)

```



### 2. Use the URL to upload in your client

Once you've got an upload object, you'll use the authenticated URL it includes to make a `PUT` request that includes the file in the body. The URL is resumable, which means if it's a *really* large file you can send your file in pieces and pause/resume at will.

```ReactNative

async function uploadVideo () {
  // videoUri here is the local URI to the video file on the device
  // this can be obtained with an ImagePicker library like expo-image-picker
  const imageResponse = await fetch(videoUri)
  const blob = await imageResponse.blob()

  // Create an authenticated Mux URL
  // this request should hit your backend and return a "url" in the
  // response body
  const uploadResponse = await fetch('/backend-api')
  const uploadUrl = (await uploadResponse.json()).url

  try {
    let res = await fetch(uploadUrl, {
      method: 'PUT',
      body: blob,
      headers: { "content-type": blob.type}
    });
    console.log("Upload is complete");
  } catch(error) {
    console.error(error);
  }
};

```

```curl

curl -v -X PUT -T myawesomevideo.mp4 "$URL_FROM_STEP_ONE"

```

```js

import * as UpChunk from '@mux/upchunk';

const upload = UpChunk.createUpload({
  // getUploadUrl is a function that resolves with the upload URL generated
  // on the server-side
  endpoint: getUploadUrl,
  // picker here is a file picker HTML element
  file: picker.files[0],
  chunkSize: 5120, // Uploads the file in ~5mb chunks
});

// subscribe to events
upload.on('error', err => {
  console.error('💥 🙀', err.detail);
});

upload.on('progress', progress => {
  console.log('Uploaded', progress.detail, 'percent of this file.');
});

// subscribe to events
upload.on('success', err => {
  console.log("Wrap it up, we're done here. 👋");
});

```

```node

// assuming you're using ESM
import fs from "fs";
import got from "got";

const uploadUrl = /* Authenticated URL from step 1 */

got.put(uploadUrl, {
  body: fs.createReadStream('/path/to/your/file'),
});

```



If you were following along with these examples, you should find new Assets in the Mux Dashboard with the settings you specified in the original upload create request, along with the video you uploaded in the second step!

If the upload doesn't work via cURL, be sure that you've put quotes around the upload URL.

## Using Direct Uploads in your application

The examples above are a great way to upload a one-off file into Mux, but let's talk about how this workflow looks in your actual application. Typically you're going to want to do a few things:

* Authenticate the request that gives the user a signed URL so random people don't start ingesting Assets into your Mux account.
* Save information in your application about the file when the user creates the upload, such as who uploaded it and when, details about the video like title, tags, etc.
* Make sure the Asset that's ultimately created from that upload is associated with that information.

Just like Assets, Direct Uploads have their own events, and then the Asset created off the upload has the usual events as well. When you receive the `video.upload.asset_created` event you'll find an `asset_id` key that you could use in your application to tie the Asset back to the upload, but that gets tricky if your application misses events or they come out of order. To keep things simple, we like to use the `passthrough` key when creating an Asset. Let's look at how the passthrough workflow would work in a real application.

<Callout type="info" title="Upload reliably with our Upload SDKs">
  We provide SDKs for Android, iOS, iPadOS, and web frontends that handle difficult parts of the upload process, such as handling large files and preprocessing video for size and cost. Once your backend has created an authenticated URL for the upload, you can give it to one of our Upload SDKs to reliably process and upload the video.

  For more information, check out our upload SDK guides:

  * [Upload directly from an Android app](/docs/guides/upload-video-directly-from-android)
  * [Upload directly from iOS or iPadOS](/docs/guides/upload-video-directly-from-ios-or-ipados)
  * [Upload directly from your Web App](/docs/guides/mux-uploader)
</Callout>

<Callout type="info" title="Next.js React example">
  [with-mux-video](https://github.com/vercel/next.js/tree/canary/examples/with-mux-video) is a full open-source example application that uses direct uploads

  `npx create-next-app --example with-mux-video with-mux-video-app`

  Another open-source example application is [stream.new](https://stream.new). GitHub repo link: [muxinc/stream.new](https://github.com/muxinc/stream.new)

  `git clone git@github.com:muxinc/stream.new.git`

  Both of these example applications use [Next.js](https://nextjs.org/), UpChunk, Mux Direct Uploads and Mux playback.
</Callout>

### Creating an `/upload` route in the application

In the route we build to create and return a new Direct Upload, we'll first create a new object in our application that includes a generated ID and all the additional information we want about that Asset. *Then* we'll create the Direct Upload and include that generated ID in the `passthrough` field.

```node
const { json, send } = require('micro');
const uuid = require('uuid/v1');
const Mux = require('@mux/mux-node');

// This assumes you have MUX_TOKEN_ID and MUX_TOKEN_SECRET
// environment variables.
const mux = new Mux();

// All the 'db' references here are going to be total pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  const id = uuid();
  // Go ahead and grab any info you want from the request body.
  const assetInfo = await json(req);

  // Create a new upload using the Mux SDK.
  const upload = await mux.video.uploads.create({
    // Set the CORS origin to your application.
    cors_origin: 'https://your-app.com',

    // Specify the settings used to create the new Asset after
    // the upload is complete
    new_asset_settings: {
      passthrough: id,
      playback_policy: ['public'],
      video_quality: 'basic'
    }
  });

  db.put(id, {
    // save the upload ID in case we need to update this based on
    // 'video.upload' webhook events.
    uploadId: upload.id,
    metadata: assetInfo,
    status: 'waiting_for_upload',
  });

  // Now send back that ID and the upload URL so the client can use it!
  send(res, 201, { id, url: upload.url });
}
```

Excellent! Now we've got a working endpoint to create new Mux uploads that we can use in our Node app or deploy as a serverless function. Next we need to make sure we have an endpoint that handles the Mux webhooks when they come back.

```node
const { json, send } = require('micro');

// More db pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  // We'll grab the request body again, this time grabbing the event
  // type and event data so we can easily use it.
  const { type: eventType, data: eventData } = await json(req);

  switch (eventType) {
    case 'video.asset.created': {
      // This means an Asset was successfully created! We'll get
      // the existing item from the DB first, then update it with the
      // new Asset details
      const item = await db.get(eventData.passthrough);
      // Just in case the events got here out of order, make sure the
      // asset isn't already set to ready before blindly updating it!
      if (item.asset?.status !== 'ready') {
        await db.put(item.id, {
          ...item,
          asset: eventData,
        });
      }
      break;
    }
    case 'video.asset.ready': {
      // This means an Asset is ready for playback! This is the final
      // state of an Asset in this stage of its lifecycle, so we don't need
      // to check anything first.
      const item = await db.get(eventData.passthrough);
      await db.put(item.id, {
        ...item,
        asset: eventData,
      });
      break;
    }
    case 'video.upload.cancelled': {
      // This fires when you decide you want to cancel an upload, so you
      // may want to update your internal state to reflect that it's no longer
      // active. Upload events don't carry the Asset's passthrough, so we look
      // the item up by the upload ID we saved when creating the upload.
      const item = await db.findByUploadId(eventData.id);
      await db.put(item.id, { ...item, status: 'cancelled_upload' });
      break;
    }
    default:
      // Mux sends webhooks for *lots* of things, but we'll ignore those for now
      console.log('some other event!', eventType, eventData);
  }
}
```

Great! Now we've got our application listening for events from Mux, then updating our DB to reflect the relevant changes. You could also do cool things in the webhook handler like send your customers events via [Server-Sent Events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events) or [WebSockets](https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API).

## Handle large files with UpChunk

In general, just making a `PUT` request with the file in the body is going to work fine for most client applications and content. When the files start getting a little bigger, you can stretch that by making sure to stream the file from the disk into the request. With a reliable connection, that can take you to gigabytes worth of video, but if that request fails, you or your customer are going to have to start the whole thing over again.

In those scenarios where you have really big files and potentially need to pause/restart a transfer, you can chunk up the file and use the resumable features of the upload endpoint! If you're doing it in a browser we wrote [UpChunk](https://github.com/muxinc/upchunk) to help, but the process isn't nearly as scary as it sounds.

### Installing UpChunk

**With NPM**

```shell
npm install --save @mux/upchunk
```

**With yarn**

```shell
yarn add @mux/upchunk
```

**With CDN**

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/upchunk@2"></script>
```

### Using UpChunk

```js

import * as UpChunk from '@mux/upchunk';

// Pretend you have an HTML page with an input like: <input id="picker" type="file" />
const picker = document.getElementById('picker');

picker.onchange = () => {
  const getUploadUrl = () =>
    fetch('/the-backend-endpoint').then(res => {
      if (!res.ok) throw new Error('Error getting an upload URL :(');
      return res.text();
    });

  const upload = UpChunk.createUpload({
    endpoint: getUploadUrl,
    file: picker.files[0], 
    chunkSize: 5120, // Uploads the file in ~5mb chunks
  });

  // subscribe to events
  upload.on('error', err => {
    console.error('💥 🙀', err.detail);
  });
}

```

```react

import React, { useState } from 'react';
import * as UpChunk from '@mux/upchunk';

function Page() {
  const [progress, setProgress] = useState(0);
  const [statusMessage, setStatusMessage] = useState(null);

  const handleUpload = async (inputRef) => {
    try {
      const response = await fetch('/your-server-endpoint', { method: 'POST' });
      const url = await response.text();

      const upload = UpChunk.createUpload({
        endpoint: url, // Authenticated url
        file: inputRef.files[0], // File object with your video file’s properties
        chunkSize: 5120, // Uploads the file in ~5mb chunks
      });

      // Subscribe to events
      upload.on('error', error => {
        setStatusMessage(error.detail);
      });

      upload.on('progress', progress => {
        setProgress(progress.detail);
      });

      upload.on('success', () => {
        setStatusMessage("Wrap it up, we're done here. 👋");
      });
    } catch (error) {
      setStatusMessage(String(error));
    }
  }

  return (
    <div className="page-container">
      <h1>File upload button</h1>
      <label htmlFor="file-picker">Select a video file:</label>
      <input type="file" onChange={(e) => handleUpload(e.target)}
        id="file-picker" name="file-picker" />

      <label htmlFor="upload-progress">Upload progress:</label>
      <progress value={progress} max="100" />

      <em>{statusMessage}</em>
    </div>
  );
}

export default Page;

```



### Alternatives to UpChunk

* Split the file into chunks that are a multiple of 256KB (`256 * 1024` bytes). For example, if you wanted 20MB chunks, each one would be 20,971,520 bytes (`20 * 1024 * 1024`). The exception is the final chunk, which can just be the remainder of the file. Bigger chunks mean fewer requests and a faster overall upload, but each chunk is effectively its own upload: if one fails, that's the chunk you have to restart, so smaller chunks are cheaper to retry.
* Set a couple of headers:
  * `Content-Length`: the size of the current chunk you're uploading.
  * `Content-Range`: what bytes you're currently uploading. For example, if you've got a 10,000,000 byte file and you're uploading in ~1MB chunks, this header would look like `Content-Range: bytes 0-1048575/10000000` for the first chunk.
* Now use a `PUT` request like we were for "normal" uploads, just with those additional headers and each individual chunk as the body.
* If the server responds with a `308`, you're good to continue uploading! It will respond with a `200 OK` or `201 Created` when the upload is complete.
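If you'd rather not use UpChunk, the steps above can be sketched in plain JavaScript. This is a minimal sketch under a couple of assumptions: `uploadUrl` is the authenticated Direct Upload URL from your backend, and `file` is a `File` or `Blob` from a file picker. A production version would also add per-chunk retries.

```javascript
const CHUNK_SIZE = 20 * 1024 * 1024; // 20MB, a multiple of 256KB

// Pure helper: compute the inclusive byte ranges for each chunk.
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push({ start, end: Math.min(start + chunkSize, totalSize) - 1 });
  }
  return ranges;
}

async function uploadInChunks(uploadUrl, file) {
  for (const { start, end } of chunkRanges(file.size, CHUNK_SIZE)) {
    const res = await fetch(uploadUrl, {
      method: 'PUT',
      // Browsers set Content-Length automatically from the body
      headers: { 'Content-Range': `bytes ${start}-${end}/${file.size}` },
      body: file.slice(start, end + 1),
    });
    // 308 means "keep going"; a 200 or 201 means the upload is complete
    if (!res.ok && res.status !== 308) {
      throw new Error(`Chunk ${start}-${end} failed with status ${res.status}`);
    }
  }
}
```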

## Upload streamed data as it becomes available

When dealing with streaming data where the total file size is unknown until the end, such as live recordings or AI-generated media streams, you can upload the data to Mux in chunks as it becomes available.

This approach has several benefits:

* No need to know the total file size upfront
* Reduced memory usage on the client, because you're uploading chunks and releasing them instead of buffering the entire file in memory
* Faster uploads, because the client uploads chunks while the data is still being produced instead of waiting for the entire file to be recorded

### Example: MediaRecorder

When recording media directly from a user's device using the [MediaRecorder API](https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder), the total file size is unknown until the recording is complete. To handle this, you can upload the media data to Mux in chunks as it becomes available, without needing to know the total size up front.

Let's look at an example of how to do this with a web app. First, we'll set up the MediaRecorder to capture media data in chunks. Each chunk will be passed to the `uploadChunk` function, which will upload it to Mux.

<Callout type="info" title="Open source example repository">
  You can find a complete example repository demonstrating this approach in our [examples repo](https://github.com/muxinc/examples/tree/main/mediarecorder-streaming-uploads).
</Callout>

Start by declaring some global variables to track recording state and upload progress.

```javascript
// Global variables to track recording state and upload progress
let mediaRecorder;
let mediaStream;
let buffer = new Blob([]); // Accumulates media data until we have a full chunk
let bufferSize = 0;
let nextByteStart = 0;
const CHUNK_SIZE = 8 * 1024 * 1024; // 8MB chunks - must be a multiple of 256KB
const maxRetries = 3; // Number of upload retry attempts
const lockName = 'uploadLock'; // Used by Web Locks API for sequential uploads
let activeUploads = 0; // Track number of chunks currently uploading
let isFinalizing = false; // Flag to prevent new uploads during finalization
```

Next, configure the MediaRecorder to capture media data in chunks.

```javascript
async function startRecording() {
  // Request access to user's media devices
  mediaStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

  // Use a widely supported MIME type for maximum compatibility
  const mimeType = 'video/webm';

  // Initialize MediaRecorder with optimal settings
  mediaRecorder = new MediaRecorder(mediaStream, {
    mimeType,
    videoBitsPerSecond: 5000000, // 5 Mbps video bitrate
    audioBitsPerSecond: 128000   // 128 kbps audio bitrate
  });

  // Reset the shared buffer that accumulates media data until we have
  // enough for a chunk. It's kept outside this function so stopRecording
  // can flush whatever remains as the final chunk.
  buffer = new Blob([], { type: mimeType });
  bufferSize = 0;

  // Handle incoming media data
  mediaRecorder.ondataavailable = async (event) => {
    // Only process if we have data and aren't in the finalization phase
    if (event.data.size > 0 && !isFinalizing) {
      // Combine the new data with our existing buffer
      // We use a Blob to efficiently handle large binary data
      // The type must match what we specified when creating the MediaRecorder
      buffer = new Blob([buffer, event.data], { type: mimeType });
      bufferSize += event.data.size;

      // Keep processing chunks as long as we have enough data.
      // This maintains a consistent chunk size of CHUNK_SIZE (8MB), which
      // satisfies the API's requirement that chunks be multiples of 256KB
      while (bufferSize >= CHUNK_SIZE) {
        // Extract exactly CHUNK_SIZE bytes from the start of our buffer
        const chunk = buffer.slice(0, CHUNK_SIZE);
        
        // Keep the remainder in the buffer for the next chunk
        buffer = buffer.slice(CHUNK_SIZE);
        bufferSize -= CHUNK_SIZE;

        // Upload this chunk, passing false for isFinalChunk since we're still recording
        // nextByteStart tracks where in the overall file this chunk belongs
        await uploadChunk(chunk, nextByteStart, false);
        
        // Increment our position tracker by the size of the chunk we just uploaded
        nextByteStart += chunk.size;
      }
      // Any remaining data stays in the buffer until we get more from ondataavailable
    }
  };

  // Start recording, getting data every 500ms
  mediaRecorder.start(500);
}
```

Uploaded chunks need to be delivered in multiples of 256KB (`256 * 1024` bytes). Since the chunks provided by the `MediaRecorder` API can be smaller than that, you'll need to collect them in a buffer until you have an aggregate chunk that is at least 256KB in size. 8MB is a good size for a chunk, so we'll use that as our chunk size in this example.

When the recording is complete, call the `stopRecording` function to upload the final chunk and clean up the MediaRecorder.

```javascript
async function stopRecording() {
  // Only proceed if we have an active mediaRecorder
  if (mediaRecorder && mediaRecorder.state !== 'inactive') {
    // Stop recording new data
    mediaRecorder.stop();
    // Set flag to prevent new uploads from starting during finalization
    isFinalizing = true;

    // Wait for any in-progress chunk uploads to complete
    // Check every 100ms until all uploads are done
    while (activeUploads > 0) {
      await new Promise(resolve => setTimeout(resolve, 100));
    }

    // If there's any remaining data in the buffer that hasn't been uploaded yet
    // Upload it as the final chunk (isFinalChunk = true)
    if (buffer.size > 0) {
      await uploadChunk(buffer, nextByteStart, true);
      nextByteStart += buffer.size;
    }

    // Clean up by stopping all media tracks (camera, mic etc)
    if (mediaStream) {
      mediaStream.getTracks().forEach(track => track.stop());
    }

    // Reset finalization flag now that we're done
    isFinalizing = false;
  }
}
```

Within the `uploadChunk` function, perform a `PUT` request to the authenticated Mux upload URL. Use the `Content-Range` header to indicate the byte range of the chunk being uploaded. Since the total file size is unknown, use `*` as the total size until the final chunk is uploaded.

```javascript
async function uploadChunk(chunk, byteStart, isFinalChunk) {
  // Calculate the end byte position for this chunk by adding chunk size to start position
  // Subtract 1 since byte ranges are inclusive (e.g. bytes 0-499 is 500 bytes)
  const byteEnd = byteStart + chunk.size - 1;

  // For the total size in the Content-Range header:
  // - If this is the final chunk, use the actual total size (byteEnd + 1)
  // - Otherwise use '*' since we don't know the final size yet
  const totalSize = isFinalChunk ? byteEnd + 1 : '*';

  // Set required headers for resumable upload:
  // - Content-Length: Size of this chunk in bytes
  // - Content-Range: Byte range being uploaded in format "bytes START-END/TOTAL"
  const headers = {
    'Content-Length': chunk.size.toString(),
    'Content-Range': `bytes ${byteStart}-${byteEnd}/${totalSize}`,
  };

  let attempt = 0;
  let success = false;

  // Use Web Locks API to enforce sequential uploads
  await navigator.locks.request(lockName, async () => {
    activeUploads++;
    while (attempt < maxRetries && !success) {
      try {
        const response = await fetch('MUX_DIRECT_UPLOAD_URL_HERE', {
          method: 'PUT',
          headers,
          body: chunk
        });

        if (response.ok || response.status === 308) {
          success = true;
        } else {
          throw new Error(`Upload failed with status: ${response.status}`);
        }
      } catch (error) {
        attempt++;
        if (attempt < maxRetries) {
          await new Promise(resolve => setTimeout(resolve, attempt * 1000)); // Linear backoff: wait longer after each attempt
        } else {
          throw error;
        }
      }
    }
    activeUploads--;
  });

  return success;
}
```

<Callout type="warning" title="Maintain sequential uploads with the Web Locks API">
  In the provided example, the [`navigator.locks.request`](https://developer.mozilla.org/en-US/docs/Web/API/Web_Locks_API) method is used to enforce sequential chunk uploads. This is necessary because if the `MediaRecorder` is stopped, the `ondataavailable` event can trigger multiple times simultaneously, which would cause multiple concurrent uploads if not properly synchronized. If you attempt to upload the final chunk before the previous uploads have completed, the upload will fail.
</Callout>

The final chunk is indicated by the `isFinalChunk` parameter, which is passed to the `uploadChunk` function. When `isFinalChunk` is `true`, the function will upload the remaining data in the buffer as the final chunk and modify the `totalSize` to reflect the total amount of data that was uploaded.


# Upload video directly from an Android app
Allow your users to upload content directly to Mux from a native Android app.
Direct Uploads allow you to provide an authenticated upload URL to your client applications so content can be uploaded directly to Mux without needing any intermediate steps. You still get to control who gets an authenticated URL, how long it's viable, and, of course, the Asset settings used when the upload is complete.

The Android Upload SDK allows a client application to upload video files from an Android device to Mux Video. The upload can be paused before completion and resume where it left off, even after process death.

Let's start by uploading a video directly into Mux from an Android application. The code from these examples can be found in our [upload example app](https://github.com/muxinc/android-upload-sdk/tree/main/app)

## Gradle setup

To integrate the Mux Upload SDK into your Android app, you first have to add it to your project, then create a `MuxUpload` object using an upload URL your app can fetch from a trusted server. Once you create the `MuxUpload`, you can start it with `start()`.

A working example can be found alongside our source code [here](https://github.com/muxinc/android-upload-sdk/tree/main/app).

## Add Mux's maven repository to your project

Add our maven repository to your project's `repositories` block. Depending on your setup, you may need to do this in either `settings.gradle` under `dependencyResolutionManagement`, or in your project's `build.gradle`.

```gradle_kts

// In your repositories block
maven {
  url = uri("https://muxinc.jfrog.io/artifactory/default-maven-release-local")
}

```

```gradle_groovy

// In your repositories block
maven {
  url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
}

```



## Add the Upload SDK to your app's dependencies

Add the upload SDK to your app's `dependencies` block in its `build.gradle` file.

```gradle_kts

// in your app's dependencies
implementation("com.mux.video:upload:0.4.1")
  
```

```gradle_groovy

// in your app's dependencies
implementation "com.mux.video:upload:0.4.1"
  
```



## Upload a video

In order to securely upload a video, you will need to create an authenticated PUT URL for your video. Once you have created the upload URL, return it to the Android client, then use the `MuxUpload` class to upload the file to Mux.

## Getting an Upload URL to Mux Video

In order to upload a new video to Mux, you must first create a new <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Direct Upload</ApiRefLink> to receive the file. The Direct Upload will contain a resumable PUT url for your Android client to use while uploading the video file.

You should not create your Direct Uploads directly from your app. Instead, refer to the [Direct Upload Guide](/docs/guides/upload-files-directly) to create them securely on your server backend.
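As a quick sketch of that server-side step, a backend endpoint might create the Direct Upload with a raw call to the Mux API and hand the resulting URL to the Android client. The helper names here are illustrative (you could equally use the `@mux/mux-node` SDK), and the token values are assumed to live in `MUX_TOKEN_ID` / `MUX_TOKEN_SECRET` environment variables.

```javascript
// Build the request that creates a Direct Upload via the Mux API.
function uploadRequest(tokenId, tokenSecret) {
  return {
    url: 'https://api.mux.com/video/v1/uploads',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization:
        'Basic ' + Buffer.from(`${tokenId}:${tokenSecret}`).toString('base64'),
    },
    body: JSON.stringify({
      new_asset_settings: { playback_policy: ['public'] },
    }),
  };
}

// Create the upload and return its resumable PUT URL to the client.
async function createUploadUrl() {
  const { url, ...init } = uploadRequest(
    process.env.MUX_TOKEN_ID,
    process.env.MUX_TOKEN_SECRET
  );
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`Mux API responded with ${res.status}`);
  const { data } = await res.json();
  return data.url; // return this URL to the Android client
}
```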

## Creating and starting your `MuxUpload`

To perform the upload from your Android app, you can use the `MuxUpload` class. At the simplest, you need to build your `MuxUpload` via its `Builder`, then add your listeners and `start()` the upload.

```kotlin

/**
 * @param myUploadUrl PUT URL fetched from a trusted environment
 * @param myVideoFile File where the local video is stored. The app must have permission to read this file
 */
fun beginUpload(myUploadUrl: Uri, myVideoFile: File) {
  val upl = MuxUpload.Builder(myUploadUrl, myVideoFile).build()
  upl.addProgressListener { progress -> handleProgress(progress) }
  upl.addResultListener {
    if (it.isSuccess) {
      notifyUploadSuccess()
    } else {
      notifyUploadFail()
    }
  }
  upl.start()
}
  
```

```java

/**
 * @param uploadUrl PUT URL fetched from a trusted environment
 * @param videoFile File where the local video is stored. The app must have permission to read this file
 */
public void beginUpload(Uri uploadUrl, File videoFile) {
  MuxUpload upload = new MuxUpload.Builder(uploadUrl, videoFile).build();
  upload.setProgressListener(progress -> {
    handleProgress(progress);
  });
  upload.setResultListener(result -> {
    if (UploadResult.isSuccessful(result)) {
      handleSuccess(UploadResult.getFinalProgress(result));
    } else {
      handleFailure(UploadResult.getError(result));
    }
  });
  upload.start();
}
  
```



### Resume uploads after network loss or process death

The upload SDK will keep track of uploads that are in progress. When your app starts, you can restart them using `MuxUploadManager`. For more information on managing, pausing, and resuming uploads, see the next section of this guide.

```kotlin

// You can do this anywhere, but it's really effective to do early in app startup
MuxUploadManager.resumeAllCachedJobs()
  
```

```java

// You can do this anywhere, but it's really effective to do early in app startup
MuxUploadManager.INSTANCE.resumeAllCachedJobs();
  
```



### Upload from a coroutine

If you're using Kotlin coroutines, you don't have to rely on the listener API to receive notifications when an upload succeeds or fails. If you prefer, you can use `awaitSuccess` in your coroutine.

```kotlin
suspend fun uploadFromCoroutine(videoFile: File): Result<UploadStatus> {
  val uploadUrl = withContext(Dispatchers.IO) {
    getUploadUrl()  // via call to your backend server, see the guide above
  }
  val upload = MuxUpload.Builder(uploadUrl, videoFile).build()
  // Set up your listener here too
  return upload.awaitSuccess()
}
```

## Resuming and managing Uploads

`MuxUpload`s are managed globally while they are in progress. Your upload can safely continue while your user does other things in your app. Optionally, you can listen for progress updates for these uploads in, e.g., a foreground `Service` with a system notification, or a progress view in another `Fragment`.

### Find Uploads already in progress

`MuxUpload`s are managed internally by the SDK, and you don't have to hold onto a reference to your `MuxUpload` in order for the upload to complete. You can get a `MuxUpload` object for any file currently uploading using `MuxUploadManager`

This example listens for progress updates. You can also `pause()` or `cancel()` your uploads this way if desired.

```kotlin

fun listenToUploadInProgress(videoFile: File) {
  val upload = MuxUploadManager.findUploadByFile(videoFile)
  upload?.setProgressListener { handleProgress(it) }
}
  
```

```java

public void listenToUploadInProgress(File videoFile) {
  MuxUpload uploadInProgress = MuxUploadManager.INSTANCE.findUploadByFile(videoFile);
  if (uploadInProgress != null) {
    uploadInProgress.setProgressListener(progress -> handleProgress(progress));
  }
}
  
```



## Advanced

### Setting a Maximum resolution

If desired, you may choose a maximum resolution for the content being uploaded. For instance, you may wish to scale down video files that are too large for your asset tier. This can save data costs for your users and ensures that your assets are available to play as soon as possible.

The Mux Upload SDK scales down any input video larger than 4k (3840x2160 or 2160x3840) by default. You can choose to scale them down further to save on user data, or if you're targeting a basic video quality asset.

### Disable Input Standardization

<Callout type="warning">
  The setting described here will only affect *local* changes to your input. Mux Video will still convert any non-standard inputs to a standard format during ingestion.
</Callout>

The Upload SDK is capable of processing input videos in order to optimize them for use with Mux Video. This behavior can be disabled if it isn't desired, although this may result in extra processing on Mux's servers. We don't recommend disabling standardization unless you are experiencing issues.

```kotlin

fun beginUpload(myUploadUrl: Uri, myVideoFile: File) {
  val upl = MuxUpload.Builder(myUploadUrl, myVideoFile)
    .standarizationRequested(false) // disable input processing
    .build()
  // add listeners etc
  upl.start()
}
  
```

```java

public void beginUpload(Uri uploadUrl, File videoFile) {
  MuxUpload upload = new MuxUpload.Builder(uploadUrl, videoFile)
      .standarizationRequested(false) // disable input processing
      .build();
  // add listeners etc
  upload.start();
}
  
```



## Release notes

### Current release

### 1.0.0

New:

* 4k and 720p input standardization
* Background-Uploading example in sample app

Fixes:

* fix: currentStatus always reports READY
* fix: upload success reported as failure in some cases

### Previous releases

### 0.5.0

New:

* Audio transcoding

Improvements:

* Improved performance reporting

### 0.4.2

New

* feat: Add `UploadResult` class for java users to interpret Result

Fixes

* Fix: Some methods on MuxUpload.Builder don't return Builder
* Fix: the application context shouldn't be visible
* Fix: Transcoding errors handled incorrectly

### 0.4.1

New

* feat: Add API for listening to and retrieving the status of an upload
* feat: Add Input Standardization, to process videos device-side

Fixes

* fix: Metrics events not properly redirected

Known Issues

* There's no notification/callback for the result of input standardization

### 0.4.0

Improvements

* doc: Finish out the public KDoc
* doc: Improve KDocs and add logo
* nfc: Hide some internal classes from java. This should not affect anyone using the SDK

Fixes

* fix: pausing uploads shouldn't create errors

### 0.3.1

Improvements

* fix: Restarts broken due to invalid progress data

### 0.3.0

Breaking

* breaking: Remove extraneous MIME type and Retry Time Builder fields. Configuring these did nothing

Updates

* update: Add the ability to resume all failed or paused uploads

Improvements

* Greatly improved the example app

### 0.2.0

Improvements

* fix: Hide some constructors to prevent unintended use by java callers
* feat: Add Performance Metrics Tracking

### 0.1.0

🥳 First beta release of the Mux Android SDK

Features

* Upload multiple files at once
* Pause and resume uploads, even after process death
* Observe upload progress from anywhere in your app without extra code


# Upload video directly from iOS or iPadOS
Allow your users to upload video to Mux from an iOS or iPadOS application with Direct Uploads and the Upload SDK.
[Direct Uploads](/docs/guides/upload-files-directly) allow you to upload content from your client applications directly to Mux without needing any intermediary steps using an authenticated URL.

This guide will help you install the Upload SDK from Mux. The Upload SDK is designed to handle common tasks required to upload large video files, like file chunking and networking. By using the Upload SDK, your application will also become able to pause and resume uploads across restarts, report upload progress, and make adjustments that minimize processing time when your upload is ingested by Mux.

The Upload SDK is supported on iOS 14 and iPadOS 14, or higher. macOS is not supported at this time.

Your application can also handle uploads [on its own](/docs/guides/upload-files-directly#if-you-dont-want-to-use-upchunk) using built-in [`URLSession`](https://developer.apple.com/documentation/foundation/urlsession) and [file system](https://developer.apple.com/documentation/foundation/file_system) APIs. We encourage you to check out the Upload SDK [implementation](https://github.com/muxinc/swift-upload-sdk) as an example to follow along.

## Install the SDK

Let's start by installing the SDK. We'll use the Swift Package Manager. [Step-by-step guide on using Swift Package Manager in Xcode](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app).

Open your application's project in Xcode. In the Xcode menu bar, select File > Add Packages. In the top-right corner of the modal window that opens, enter the SDK repository URL: `https://github.com/muxinc/swift-upload-sdk`.

By default Xcode will fetch the latest version of the SDK available on the `main` branch. If you need a specific package version or to restrict the range of package versions used in your application, select a different Dependency Rule. [Here's an overview of the different SPM Dependency Rules and their semantics](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app#Decide-on-package-requirements).

Click on Add Package to begin resolving and downloading the SDK package. When completed, select your application target as the destination for the `MuxUploadSDK` package product. To use the SDK in your application, import its module: `import MuxUploadSDK`.

## Upload content from your application

## Getting an authenticated URL from Mux Video

<Callout type="info">
  You must create a new [Direct Upload](/docs/guides/upload-files-directly#1-create-an-authenticated-mux-url) to upload a new video to Mux.

  The Direct Upload will contain an authenticated `PUT` URL that's unique to your upload. Your application will upload video to this URL.
</Callout>

Direct Uploads are resumable. If your application started an upload and needed to pause it, it can use the same URL to resume the upload.

We recommend that you avoid creating Direct Uploads outside of a trusted environment such as a backend server. Your application can request a new authenticated URL from your server when it needs one. You can also hardcode a pre-made URL in an internal build of your application for a one-time test.

### Create and start your direct upload

Once your application has an authenticated direct upload URL, you're ready to start uploading!

Your application will use the authenticated URL to construct a `PUT` request. The body of the request will contain your video data. The `DirectUpload` API in the Upload SDK handles these operations for you.

Initialize a `DirectUpload` with your authenticated URL and a local video file URL. We'll also set the progress handler callback to log the upload progress to the console. In a later example you'll learn how to customize how an upload behaves.

```swift

  import MuxUploadSDK

  // The url found in the response after creating a direct upload
  let authenticatedURL: URL = /* fetch from trusted environment */

  // In this example we're uploading a video input file saved locally inside the application sandbox
  let videoInputURL: URL = /* URL to a video available locally */

  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL
  )

  // Let's log the progress to the console
  directUpload.progressHandler = { state in
    print("Uploaded \(state.progress.completedUnitCount) / \(state.progress.totalUnitCount)")
  }

  // Then start the direct upload
  directUpload.start()
  
```



## Tactics for handling large files

### Chunking uploads

Smaller videos can be uploaded with a single request. We recommend breaking up larger videos into chunks and treating them as separate uploads.

The Upload SDK handles the required networking and file chunking operations for you regardless of file size. By default, the SDK splits your video into *8MB* chunks when necessary. To change the chunk size, your application can initialize its own `DirectUploadOptions` and pass the custom size as `chunkSizeInBytes`. Be sure to convert any quantities expressed in kilobytes or megabytes to bytes first.

Initialize a `DirectUpload` and pass the custom options you've created as the `options` parameter. A default set of options will be used if `options` isn't set when initializing a `DirectUpload`.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Construct custom upload options to upload a file in 6MB chunks
  let chunkSizeInBytes = 6 * 1024 * 1024
  let options = DirectUploadOptions(
    chunkSizeInBytes: chunkSizeInBytes
  )

  // Initialize a DirectUpload with custom options
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Let's log the upload progress to the console
  directUpload.progressHandler = { state in
    print("Uploaded \(state.progress.completedUnitCount) / \(state.progress.totalUnitCount)")
  }

  // Then start the direct upload
  directUpload.start()
  
```



Smaller chunk sizes result in more requests while larger chunk sizes lead to fewer requests that take longer to complete. We recommend using a smaller chunk size on unstable or lossy networks.
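
To make the tradeoff concrete, here's a quick sketch of how file size and chunk size determine the number of `PUT` requests. The helper function is ours for illustration and isn't part of the SDK:

```swift
import Foundation

// Number of PUT requests needed to upload a file, assuming every chunk
// except possibly the last is exactly `chunkSizeInBytes` long.
func chunkRequestCount(fileSizeInBytes: Int, chunkSizeInBytes: Int) -> Int {
    guard fileSizeInBytes > 0, chunkSizeInBytes > 0 else { return 0 }
    return (fileSizeInBytes + chunkSizeInBytes - 1) / chunkSizeInBytes
}

let megabyte = 1024 * 1024

// A 100MB file with the default 8MB chunk size: 13 requests
print(chunkRequestCount(fileSizeInBytes: 100 * megabyte, chunkSizeInBytes: 8 * megabyte))

// The same file with 16MB chunks: 7 longer-running requests
print(chunkRequestCount(fileSizeInBytes: 100 * megabyte, chunkSizeInBytes: 16 * megabyte))
```

Halving the chunk size roughly doubles the request count, which shortens each request and limits how much progress is lost when one fails on a lossy network.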

### What happens if an upload request fails?

When the SDK becomes aware of a failed upload `PUT` request, it will automatically retry it. By default, the SDK will retry uploading each chunk up to 3 times before the upload is deemed to have failed. Your application can change this limit by initializing its own `DirectUploadOptions` with a custom value for `retryLimitPerChunk`, then initializing a `DirectUpload` with the custom options as the `options` argument.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Construct custom upload options with a higher per-chunk retry limit
  let options = DirectUploadOptions(
    retryLimitPerChunk: 5
  )

  // Initialize a DirectUpload that will retry each chunk
  // request up to 5 times
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Then start the direct upload
  directUpload.start()
  
```



### Pause and resume uploads

Your application might become suspended or terminated in the middle of a long-running upload. You can avoid losing the progress completed so far by pausing the upload and resuming it when the app becomes active again.

```swift

  import MuxUploadSDK

  class UploadCoordinator {
    func handleApplicationWillTerminate() {
      UploadManager.shared.allManagedUploads().forEach { upload in
        upload.pause()
      }
    }

    func handleApplicationDidBecomeActive() {
      UploadManager.shared.resumeAllUploads()
    }
  }
  
```



A direct upload can be resumed as long as it remains in a `waiting` status and hasn't yet transitioned to a `timed_out` status. You can customize this length of time by setting the `timeout` value in the <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">create direct upload request</ApiRefLink> to a value between 1 minute and 7 days. If no value is set the upload times out 1 hour after being *created*.
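
The `timeout` value is expressed in seconds. As a sketch of the accepted range (the helper below is our own, not an SDK or API function):

```swift
import Foundation

// Direct upload `timeout` bounds, expressed in seconds.
let minimumTimeout = 60                // 1 minute
let maximumTimeout = 7 * 24 * 60 * 60  // 7 days = 604,800 seconds
let defaultTimeout = 60 * 60           // 1 hour, used when no value is set

// Clamp a requested timeout into the range the API accepts.
func clampedUploadTimeout(_ requestedSeconds: Int) -> Int {
    min(max(requestedSeconds, minimumTimeout), maximumTimeout)
}

print(clampedUploadTimeout(30))      // below the minimum, clamped up to 60
print(clampedUploadTimeout(86_400))  // 1 day, already within range
```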

## Need a playable asset as fast as possible?

<Callout type="info" title="Beta Functionality">
  The APIs around this feature are not final.
</Callout>

After your direct upload is completed, Mux Video will convert the uploaded input into a playable asset.

Some types of inputs require additional processing time during ingestion before becoming ready for playback. By default, the Upload SDK reduces this processing time by adjusting upload inputs locally to a faster-to-process format when needed. More details on how audio and video input formats relate to asset processing time are [available here](/docs/guides/minimize-processing-time).

### Setting a maximum resolution

The SDK can adjust the resolution of your video input locally before it is uploaded to Mux. By default the SDK will adjust the input resolution to 1920 x 1080 for any inputs that are larger.

You can also reduce the maximum resolution further to 1280 x 720. Initialize a new `DirectUploadOptions` and set `.preset1280x720` as `InputStandardization.maximumResolution`.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Reduce the maximum resolution to 1280 x 720
  let options = DirectUploadOptions(
    inputStandardization: .init(maximumResolution: .preset1280x720)
  )

  // Initialize a DirectUpload with custom options
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Then start the direct upload
  directUpload.start()
  
```



### Skipping input adjustments

<Callout type="warning">
  The setting described here will only affect *local* changes to your input. Mux Video will still convert any non-standard inputs to a standard format during ingestion.
</Callout>

In most cases your application won't need to bypass these adjustments. When necessary, they can be skipped by initializing `DirectUploadOptions` with `.skipped` for `inputStandardization`, then passing those options to the `options` argument when initializing a new `DirectUpload`, just as you've customized other options before.

```swift

  import MuxUploadSDK

  let authenticatedURL: URL = /* fetch from trusted environment */
  let videoInputURL: URL = /* URL to a video available locally */

  // Skip adjustments to your input locally
  let options = DirectUploadOptions(
    inputStandardization: .skipped
  )

  // Initialize a DirectUpload that skips input standardization
  // and uploads your video as-is
  let directUpload = DirectUpload(
    uploadURL: authenticatedURL,
    inputFileURL: videoInputURL,
    options: options
  )

  // Then start the direct upload
  directUpload.start()
  
```



## Release notes

### Current release

### 1.0.0

Improvements

* Direct uploads are cancelable while inputs are standardized on the client
* Video inputs can be standardized to 2160p (4K) resolution
* Upload source `AVAsset` has the correct URL

Known Issues

* When checking if a video input file is standard or not, the SDK compares an averaged bitrate to a resolution-dependent limit. If different parts of the video input have varying bitrates, the video may require further processing by Mux upon ingestion.

### 0.7.0

New

* Add macOS deployment target

Improvements

* Fix memory leak occurring when uploading large files

### 0.6.0

New

* Add Foundation Measurement API for chunk size

Breaking

* Rename Version to SemanticVersion for explicitness in API

Improvements

* Remove UIKit dependency from SDK
* Backfill missing inline API docs

### 0.5.0

New

* Add an overload initializer for `DirectUploadOptions`

Breaking

* Remove prefix use in public APIs and respell Upload as DirectUpload

### Previous releases

### 0.4.0

API Changes

* Deprecation: `MuxUpload.init(uploadURL:videoFileURL:chunkSize:retriesPerChunk:)` has been deprecated and will be removed in a future SDK version. Use `init(uploadURL:inputFileURL:options:)` instead
* Breaking Change: `MuxUpload.startTime` now returns an optional value
* Breaking Change: `MuxUpload.Status` has been renamed to `MuxUpload.TransportStatus`
* Add: `UploadOptions` struct to contain all available `MuxUpload` options
* Add: Options to request or to skip input standardization
* Add: `MuxUpload` initializer APIs that accept `AVAsset` or `PHAsset`
* Add: `MuxUpload.InputStatus` enum to represent the current state of the upload and change handler

New

* Support for on-device input standardization, so you can create a playable asset from your direct upload faster. When input standardization is requested, the SDK converts input video to a standard range of values on a best-effort basis

Fixes

* Prevent integer overflow when calculating chunk request content ranges
* Prevent crash from chunk worker force unwrap
* Remove public methods from internal SDK classes
* Prevent removal of result handler properties when passing MuxUpload via UploadManager

### 0.3.0

API Changes

* `MuxUpload`'s initializer no longer requires a MIME type or Retry Time. These are calculated internally
* Added methods for querying the `UploadManager` for the list of currently-active uploads, and listening for changes to the list
* Add opt-out for upload statistics

Improvements

* Add a much-improved example app

### 0.2.1

Improvements

* Track upload statistics

Fixes

* Resumed Uploads start at the beginning of the file

### 0.2.0

Improvements

* Remove Alamofire Dependency

### 0.1.0

Our first release of Mux's Swift Upload SDK!! 🎉 💯

This public beta release includes chunked, pausable, resumable video uploads for Mux Video. You can start an upload and query its state from anywhere in your app, regardless of your app architecture. Uploads can be resumed even after your app restarts.


# Minimize processing time
Learn how to optimize your video files for the fastest processing time.
Mux Video accepts most modern video formats and codecs. However, certain types of inputs need to be *standardized* in order for Mux to do further operations on them, and this can add time before the video is ready to be streamed. If you want to standardize your content before sending it to Mux, and potentially improve performance, this guide will show what you need to do.

## Standard input specs

To be considered standard input, the input video file must meet the following requirements:

* **H.264 or HEVC video codecs**. H.264 is the dominant video codec in use today and almost every device supports H.264. HEVC is a more modern codec that's increasingly popular, though not as universally supported. While Mux accepts other codecs as input, other codecs are considered non-standard and will be standardized automatically to H.264.
* **Closed GOP** (group-of-pictures). (Warning: video jargon ahead. You can likely ignore this) In video files encoded with a closed GOP, all B frames reference other frames in the same GOP. Closed GOPs always begin with an IDR (Instantaneous Decoder Refresh) frame. This means that every GOP can be played independently, without reference to another GOP. Standard inputs must be encoded with a closed GOP.
* **8-bit 4:2:0, or 10-bit 4:2:0 if HEVC**. This refers to the color depth and chroma subsampling. If you don't know what this is, you can probably ignore this, since most streaming video is 8-bit 4:2:0. HDR usually uses 10-bit 4:2:0, which is only supported as standard when using the HEVC video codec. However, SDR is preferred as Mux does not provide full HDR support.
* **Simple Edit Decision Lists**. Edit Decision Lists (EDLs) are typically added during post-production and define how certain segments are used to build a track timeline for playback. A good example of a Simple Edit Decision List is to fix out of order frames in the video. Input files with more complex uses of EDLs are considered non-standard.
* **AAC audio codec**. AAC is the dominant audio codec in use today and almost every device supports this audio codec. While Mux accepts files that use other audio codecs, Mux only delivers AAC audio and non-AAC audio inputs are considered non-standard.

### Additional requirements for assets with a `max_resolution_tier` of 1080p

Assets ingested up to 1080p are subject to the following standard input requirements.

* **1080p/2K or smaller**. Files with a resolution of up to 2048x2048 are considered standard. Files with a resolution larger than this are considered non-standard, unless ingested with a higher `max_resolution_tier`.
* **Max 20-second keyframe interval (10 seconds for HEVC)**. To stream well using HTTP-based streaming methods like HLS, Mux requires all keyframe intervals to be less than 20 seconds, or less than 10 seconds if encoded with HEVC.
* **8Mbps or below**. While Mux accepts higher bitrate inputs, average bitrates higher than 8Mbps are generally challenging for most viewers' connections and are considered non-standard. The bitrate should not exceed 16Mbps for any single GOP.
* **Frame rate between 5 and 120**. Inputs with an average frames per second (fps) of less than 5 or greater than 120 are considered non-standard. Frame rates within this range will be preserved (e.g. 60 fps will remain 60 fps). Inputs with less than 5 fps or greater than 120 fps will be automatically standardized to 30 fps.
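
If you want a rough client-side sanity check against the limits above, the sketch below encodes them in code. It's illustrative only; the function name is ours, and Mux makes the authoritative determination at ingest time.

```swift
import Foundation

// A rough pre-flight check against the 1080p-tier standard input limits
// listed above. Illustrative only: Mux decides at ingest time.
func isLikelyStandardInput1080p(
    width: Int,
    height: Int,
    averageBitrateBps: Int,
    keyframeIntervalSeconds: Double,
    framesPerSecond: Double,
    isHEVC: Bool
) -> Bool {
    let fitsResolution = width <= 2048 && height <= 2048
    let fitsBitrate = averageBitrateBps <= 8_000_000
    let fitsKeyframes = keyframeIntervalSeconds < (isHEVC ? 10 : 20)
    let fitsFrameRate = (5.0...120.0).contains(framesPerSecond)
    return fitsResolution && fitsBitrate && fitsKeyframes && fitsFrameRate
}

// A typical 1080p30 H.264 file at 6Mbps with 2-second keyframes passes:
print(isLikelyStandardInput1080p(
    width: 1920, height: 1080, averageBitrateBps: 6_000_000,
    keyframeIntervalSeconds: 2, framesPerSecond: 30, isHEVC: false
)) // true
```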

### Additional requirements for assets with a `max_resolution_tier` of 2160p (4K)

[Assets ingested at 2K and 4K resolutions](/docs/guides/stream-videos-in-4k) are subject to the following standard input requirements.

* **2160p or smaller**. The input file must not have any dimension (width, height, or both) that exceeds 4096 pixels.
* **Max 10-second keyframe interval (6 seconds for HEVC)**. To stream 4K video well, a 10-second keyframe interval is required. If using the HEVC video codec, this is further limited to 6 seconds.
* **20Mbps or below**. While Mux accepts higher bitrate inputs, bitrates higher than 20Mbps are generally challenging for most viewers' connections.
* **Frame rate between 5 and 60**. For 4K videos, a frame rate above 60 fps is considered non-standard.

## How to create standard input (ffmpeg)

As a starting point, here is a sample ffmpeg command for creating video that complies with Mux standard input. Feel free to modify this by using things like 2-pass encoding, different presets, or different bitrates (as long as the total bitrate ends up below 8Mbps).

```shell
ffmpeg -i input.mp4 -c:a copy -vf "scale=w=min(iw\,1920):h=-2" -c:v libx264 \
-profile:v high -b:v 7000k -g 239 -pix_fmt yuv420p -maxrate 16000k -bufsize 24000k out.mp4
```

### Standard input for 4K

If you are creating a 4K video, the resolution and bitrate limits are higher. Here is a sample ffmpeg command for creating video that complies with Mux standard input for 4K.

```shell
ffmpeg -i input.mp4 -c:a copy -vf "scale=w=min(iw\,4096):h=-2" -c:v libx264 \
-profile:v high -b:v 18000k -g 239 -pix_fmt yuv420p -maxrate 36000k -bufsize 54000k out.mp4
```
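
In both commands, `-g` sets the GOP length in frames, so the keyframe interval in seconds depends on the frame rate. If you need to pick a `-g` value for a different frame rate, the conversion is just frames = fps * seconds. The helper below is our own sketch of that arithmetic:

```swift
import Foundation

// GOP length in frames for a desired keyframe interval at a given
// frame rate: frames = fps * seconds (truncated to a whole frame count).
func gopLength(framesPerSecond: Double, keyframeIntervalSeconds: Double) -> Int {
    Int(framesPerSecond * keyframeIntervalSeconds)
}

// -g 239 in the commands above is just under 10 seconds at 23.976 fps:
print(gopLength(framesPerSecond: 23.976, keyframeIntervalSeconds: 10)) // 239

// At 30 fps, a 5-second keyframe interval needs a GOP of 150 frames:
print(gopLength(framesPerSecond: 30, keyframeIntervalSeconds: 5)) // 150
```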

## How to create standard input on mobile devices

<Callout type="info" title="Mux mobile upload SDKs">
  Mux's [iOS](https://www.mux.com/docs/guides/upload-video-directly-from-ios-or-ipados) and [Android](https://www.mux.com/docs/guides/upload-video-directly-from-android) upload SDKs are optimised to pre-process video files created on mobile devices to create standard input video files before uploading to Mux.
</Callout>

Many mobile devices capture H.264 8-bit 4:2:0 video by default. More recent mobile devices increasingly use HEVC 4:2:0 by default (either 8-bit with HDR disabled, or 10-bit with HDR enabled). Here are the main things to watch out for:

* Ensure that the total file bitrate is below 8Mbps.
* Prefer recording video in SDR (standard dynamic range). Some newer devices capture video in HDR (high dynamic range), which requires 10-bit 4:2:0 color. This is also acceptable, but may be more likely to exceed the bitrate limit for standard input. If you have the choice, you should record SDR.
* Ensure the output file is no larger than 1080p (1920x1080) or 2K (2048x1152). Some cameras shoot 4K video, which by default is scaled down to 1080p when using Mux Video. If you want to use 4K video, ensure you're enabling that in your API calls to Mux.
* If possible, choose a keyframe interval of 5s or so, but certainly between 1 and 10 seconds, and enable closed-GOP encoding. (If you don't see these options in your app or camera, it's probably the default already.)

## Non-standard input

Mux Video works fine with video outside of the standard input specs. But because other videos cannot be easily streamed to many modern devices, Mux Video must perform an initial encoding operation on non-standard input to create a mezzanine file. This means that non-standard input will be slower to ingest.

As soon as Mux Video detects that the input file is non-standard, it emits the `video.asset.non_standard_input_detected` event. This lets you know right away that your video will need additional processing time, along with the specific reasons why it's considered non-standard.

```json
{
  "type": "video.asset.non_standard_input_detected",
  "data": {
    "id": "{ASSET_ID}",
    "status": "preparing",
    "non_standard_input_reasons": {
      "video_gop_size": "high"
    }
    // ... other asset fields
  }
}
```

*Note that `non_standard_input_reasons` may not be final, as additional reasons may be found later and will be included on the asset after ingest completes.*

Mux Video also exposes this information in all subsequent asset webhooks, including the `video.asset.ready` event. This information is also returned in the asset object when retrieved using the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">Get Asset API</ApiRefLink>.

### Understanding the transcoding progress of a non-standard input

When an asset needs additional processing, you can use the `progress` field from the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">Get Asset API</ApiRefLink> response, which returns the current `state` and completion percentage of a transcode.

```json
// GET /video/v1/assets/{ASSET_ID}
{
  "id": "{ASSET_ID}",
  "status": "preparing",
  "non_standard_input_reasons": {
    "video_gop_size": "high"
  },
  "progress": {
    "state": "transcoding", 
    "progress": 23.02
  }
  // ... other asset fields
}
```

This field tells you both the current processing state and an estimated completion percentage from `0` to `100`, allowing you to keep your users informed with accurate progress indicators.

The progress field can have the following `state`s:

* `transcoding`: Asset is undergoing non-standard transcoding. `progress` will be between `0` and `100`.
* `ingesting`: Asset is being ingested (initial processing before or after transcoding). Progress will be `0`.
* `errored`: Asset has encountered an error (`status` is `errored`). Progress will be `-1`.
* `completed`: Asset processing is complete (`status` is `ready`).  Progress will be `100`.
* `live`: Asset is a live stream currently in progress.  Progress will be `-1`.
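
If you're polling the Get Asset API from a Swift client, decoding the `progress` field might look like the sketch below. The type names are our own; only the JSON keys and `state` values come from the API response:

```swift
import Foundation

// A minimal model of the `progress` field shown above.
struct AssetProgress: Decodable {
    enum State: String, Decodable {
        case transcoding, ingesting, errored, completed, live
    }
    let state: State
    let progress: Double
}

// Decode the `progress` object from the example response:
let json = #"{"state": "transcoding", "progress": 23.02}"#
let decoded = try! JSONDecoder().decode(AssetProgress.self, from: Data(json.utf8))
print(decoded.state)    // transcoding
print(decoded.progress) // 23.02
```

A `state` of `transcoding` is the only one where driving a progress bar from `progress` makes sense; the other states map to fixed values (`-1`, `0`, or `100`) as listed above.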

## General limits

The max duration for any single asset is 12 hours.


# Control recording resolution
If the video being captured in your app doesn't need to be played back in full resolution, specify a lower resolution when recording to take advantage of Mux's resolution dependent pricing.
## Android

The way you control the resolution of a recorded video depends on the API used to record or encode it. All of Google's major camera and recording APIs have a method for setting either the exact or maximum resolution of the videos they create.

### CameraX

With the CameraX library, provide a `QualitySelector` that doesn't allow for resolutions beyond 720p (1280x720).

```kotlin
// Selects only Standard HD (720p) and Standard Definition (480p)
val selector = QualitySelector.fromOrderedList(
  listOf(Quality.HD, Quality.SD),
  FallbackStrategy.lowerQualityOrHigherThan(Quality.SD)
 )

val recorder = Recorder.Builder()
  .setQualitySelector(selector)
  ...
  .build()
```

### MediaCodec

If you are encoding video yourself via the `MediaCodec` API, you can set the encoder's output resolution by setting it in the input `MediaFormat`. For more information on how to configure and use `MediaCodec`, [try the docs](https://developer.android.com/reference/android/media/MediaCodec).

```kotlin
val mediaCodec = MediaCodec.createByCodecName(codecName)
val encodeFormat = MediaFormat().apply {
  setInteger(MediaFormat.KEY_FRAME_RATE, myExampleFrameRate)
  //... Other required params
  // Output 720p
  setInteger(MediaFormat.KEY_HEIGHT, 720)
  setInteger(MediaFormat.KEY_WIDTH, 1280)
}
mediaCodec.configure(
  encodeFormat,
  myInputSurface,
  null,
  MediaCodec.CONFIGURE_FLAG_ENCODE
)
```

### Camera2

Camera2 doesn't have an API to set the video resolution directly; instead, the resolution is inferred from the input surface. You have to call `SurfaceHolder.setFixedSize()` on your capture requests' targets. This can only be done on Lollipop/API 21 or higher. Please refer to [the camera2 docs](https://developer.android.com/reference/android/hardware/camera2/CameraDevice#createCaptureSession\(android.hardware.camera2.params.SessionConfiguration\)) for more information.

```kotlin
val supportedCameraResolutions = streamConfigMap.getOutputSizes(ImageFormat.NV21)
val size =
  supportedCameraResolutions.toList().sortedBy { it.height }.findLast { it.height <= 720 && it.width <= 1280 }
size?.let { cameraSurfaceHolder.setFixedSize(it.width, it.height) }
cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
  .apply { addTarget(cameraSurfaceHolder.surface) }
  // ...
  .build()
cameraDevice.createCaptureSession(...)
```

### MediaRecorder

MediaRecorder's output size can be configured by calling `MediaRecorder.setVideoSize()` before calling `prepare()`.

```kotlin
mediaRecorder.setVideoSize(1280, 720)
mediaRecorder.prepare()
```

## iOS and iPadOS

This guide covers setting a maximum video resolution when recording video on iOS and iPadOS. The directions and code examples on this page assume you are using AVFoundation to set up and configure your camera. If you've never used AVFoundation before, we recommend you brush up on the basics before proceeding further; see the official Apple [documentation](https://developer.apple.com/documentation/avfoundation) for a quick introduction and sample code.

Video recording on iOS is managed using [AVCaptureSession](https://developer.apple.com/documentation/avfoundation/avcapturesession). The resolution of video output from AVCaptureSession can be configured using a session preset. This example shows how to configure AVCaptureSession to output video at a resolution of 720p (1280 x 720 pixels).

```swift
let session = fetchYourCaptureSession()
session.beginConfiguration()

let updatedSessionPreset = AVCaptureSession.Preset.hd1280x720
if session.canSetSessionPreset(updatedSessionPreset) {
    session.sessionPreset = updatedSessionPreset
}

session.commitConfiguration()
```

Don’t forget to call `beginConfiguration()` before applying any configuration changes. When all the configuration changes have been applied, make sure your implementation calls `commitConfiguration()`.

It is best for any work that is done in-between calls to `beginConfiguration()` and `commitConfiguration()` to be synchronous. If you need to perform any asynchronous tasks, such as fetching the preferred resolution from your backend, make sure those are complete before you begin to configure `AVCaptureSession`.

## OBS

Streams initiated via [OBS](https://obsproject.com/) can be configured in Settings > Video > Output (Scaled) Resolution.

<Image src="/docs/images/output-obs.png" width="988" height="761" />


# Play your videos
In this guide you will learn how to play Mux videos in your application.
## 1. Get your playback ID

Each `asset` and each `live_stream` in Mux can have one or more **Playback IDs**.

This is an example of the `"playback_ids"` from the body of your `asset` or `live_stream` in Mux. In this example, the `PLAYBACK_ID` is `"uNbxnGLKJ00yfbijDO8COxTOyVKT01xpxW"` and the `policy` is `"public"`.

<Callout type="warning">
  **Playback IDs** can have a policy of `"public"` or `"signed"`. For the purposes of this guide we will be working with `"public"` playback IDs.

  If this is your first time using Mux, start out with `"public"` playback IDs and then read more about [securing video playback with signed URLs](/docs/guides/secure-video-playback) later.
</Callout>

```json
"playback_ids": [
  {
    "policy": "public",
    "id": "uNbxnGLKJ00yfbijDO8COxTOyVKT01xpxW"
  }
],
```

## 2. Create an HLS URL

HLS is a standard protocol for streaming video over the internet. Most of the videos you watch on the internet, both live and on-demand, are delivered over HLS. Mux delivers your videos in this standard format.

Because HLS is an industry standard, you are free to use any HLS player of your choice when working with Mux Video.

HLS URLs end with the extension `.m3u8`. Use your `PLAYBACK_ID` to create an HLS URL like this:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

<Callout type="info">
  If you're curious to learn more about how HLS works you might find this informational site [howvideo.works](https://howvideo.works) makes for some good bedtime reading.
</Callout>

## Other formats

HLS (`.m3u8`) is used for streaming assets (video on demand) and live streams. For offline viewing and post-production editing, take a look at the [download your videos](/docs/guides/download-your-videos) guide, which covers `mp4` formats and master access.

## 3. Use the HLS URL in a player

Most browsers do not support HLS natively in the `video` element (Safari and legacy Microsoft Edge are exceptions). Some JavaScript will be needed in order to support HLS playback in your web application.

The default player on iOS and tvOS (AVPlayer) supports HLS natively, so no extra effort is needed. In the Swift example below we're using the [VideoPlayer](https://developer.apple.com/documentation/avkit/videoplayer) struct that comes with SwiftUI and AVKit.

Similarly, the default player ExoPlayer on Android also supports HLS natively.

<Callout type="info" title="Next.js React example">
  If you're using [Next.js](https://nextjs.org/) or React for your application, the [with-mux-video example](https://github.com/vercel/next.js/tree/canary/examples/with-mux-video) is a good place to start.

  `npx create-next-app --example with-mux-video with-mux-video-app`
</Callout>

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

struct ContentView: View {

    private let player = AVPlayer(
        url: URL.makePlaybackURL(
            playbackID: playbackID
        )
    )

    var body: some View {
        //  VideoPlayer comes from SwiftUI
        //  Alternatively, you can use AVPlayerLayer or AVPlayerViewController
        VideoPlayer(player: player)
            .onAppear() {
                player.play()
            }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

extension URL {
    static func makePlaybackURL(
        playbackID: String
    ) -> URL {
        guard let baseURL = URL(
            string: "https://stream.mux.com"
        ) else {
            preconditionFailure("Invalid base URL string")
        }

        guard let playbackURL = URL(
            string: "\(playbackID).m3u8",
            relativeTo: baseURL
        ) else {
            preconditionFailure("Invalid playback URL component")
        }

        return playbackURL
    }
}

```



## 4. Find a player

The examples below are meant to be a starting point. You are free to use **any player that supports HLS** with Mux videos. Here are some popular players that we have seen:

### Mux Player

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```



See [the Mux Player guide](/docs/guides/mux-player-web) for more details and configuration options.

### Mux Video Element

If Mux Player does more than you're looking for, and you're interested in using something more like the native HTML5 `<video>` element for your web application, take a look at the `<mux-video>` element. The Mux Video Element is a drop-in replacement for the HTML5 `<video>` element, but it works with Mux and has Mux Data automatically configured.

* HTML: [Mux Video element](https://github.com/muxinc/elements/tree/main/packages/mux-video)
* React: [MuxVideo component](https://github.com/muxinc/elements/tree/main/packages/mux-video-react)

### Popular web players

* [HLS.js](https://github.com/video-dev/hls.js) is free and open source. This library does not have any UI components like buttons and controls. If you want to use the HTML5 `<video>` element's default controls or build your own UI elements, HLS.js is a great choice.
* [Plyr.io](https://plyr.io/) is free and open source. Plyr has UI elements and controls that work with the underlying `<video>` element. Plyr does not support HLS by default, but it can be used *with* [HLS.js](https://github.com/video-dev/hls.js). If you like the feel and theming capabilities of Plyr and want to use it with Mux videos, follow the [example for using Plyr + HLS.js](https://codepen.io/pen?template=oyLKQb).
* [Video.js](https://videojs.com/) is a free and open source player. As of version 7 it supports HLS by default. The underlying HLS engine is [videojs/http-streaming](https://github.com/videojs/http-streaming).
* [JWPlayer](https://www.jwplayer.com/html5-video-player/) is a commercial player and supports HLS by default. The underlying HLS engine is HLS.js.
* [Brightcove Player](https://player.support.brightcove.com/getting-started/overview-brightcove-player.html) is a commercial player built on Video.js and HLS is supported by default.
* [Bitmovin Player](https://bitmovin.com/video-player/) is a commercial player and supports HLS by default.
* [THEOplayer](https://www.theoplayer.com/) is a commercial player and supports HLS by default. The player chrome is built on Video.js, but the HLS engine is custom.
* [Agnoplay](https://www.agnoplay.com/) is a fully agnostic, cloud-based player solution for web, iOS and Android with full support for HLS.

### Use Video.js with Mux

Video.js kit is a project built on [Video.js](https://videojs.com) with additional Mux specific functionality built in.
This includes support for:

* Enabling [timeline hover previews](/docs/guides/create-timeline-hover-previews)
* [Mux Data integration](/docs/guides/monitor-video-js)
* `playback_id` helper (we'll figure out the full playback URL for you)

For more details, head over to the [Use Video.js with Mux](/docs/guides/playback-videojs-with-mux) page.

## 5. Advanced playback features

## Playback with subtitles/closed captions

Subtitles/Closed Captions text tracks can be added to an asset either on asset creation or later when they are available. Mux supports [SubRip Text (SRT)](https://en.wikipedia.org/wiki/SubRip) and [Web Video Text Tracks](https://www.w3.org/TR/webvtt1/) format for ingesting Subtitles and Closed Captions text tracks. For more information on Subtitles/Closed Captions, see this [blog post](https://mux.com/blog/subtitles-captions-webvtt-hls-and-those-magic-flags) and [the guide for subtitles](/docs/guides/add-subtitles-to-your-videos).

Mux includes Subtitles/Closed Captions text tracks in the HLS manifest (.m3u8) for playback. Video players surface the available Subtitles/Closed Captions text tracks and their languages so viewers can enable or disable them and select a language. The player can also default to the viewer's device preferences.

<Image src="/docs/images/hls-player-options-menu.png" width={768} height={454} caption="HLS.js video player options menu for Subtitles/Closed Captions text track" />

If you are adding text tracks to your Mux videos, make sure you test them out with your player.

Mux also supports downloading Subtitles/Closed Captions text tracks as "sidecar" files when [downloading your videos](/docs/guides/download-your-videos).

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt
```

Replace `{PLAYBACK_ID}` with your asset's playback ID and `{TRACK_ID}` with the unique identifier value returned when this subtitle/closed caption text track was added to this asset.
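As a sketch, the sidecar URL can be assembled with a small helper. The function name here is hypothetical; the URL shape follows the pattern shown above.

```js
// Build the "sidecar" subtitle/caption download URL for an asset.
// buildTextTrackUrl is a hypothetical helper name; the URL pattern
// is the one documented above.
function buildTextTrackUrl(playbackId, trackId) {
  return `https://stream.mux.com/${playbackId}/text/${trackId}.vtt`;
}
```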

## Add delivery redundancy with Redundant Streams

Mux Video streams are delivered using multiple CDNs. The best performing CDN is selected for the viewer initiating the playback. Video is then streamed by that CDN for that particular viewer. When the selected CDN has a transient or regional failure, the viewer's playback experience could be interrupted for the duration of the failure. If this happens, your application should handle the playback failure and re-initiate the playback session. Mux Video's CDN selection logic will then select a different CDN for streaming.

The redundant streams modifier tells Mux to list each rendition for every CDN in the HLS manifest, ordered by CDN performance with the best-performing one listed first. If your video player supports redundant streams, it will detect a mid-playback failure and switch to the next CDN on the list without interrupting playback.

For more information on the Redundant Streams playback modifier and player support based on our tests, see [this blog post](https://mux.com/blog/survive-cdn-failures-with-redundant-streams/).

To use this feature in your application, add `redundant_streams=true` to the HLS URL:

```none
https://stream.mux.com/{PLAYBACK_ID}.m3u8?redundant_streams=true
```

<Callout type="warning">
  # Using `redundant_streams` with signed URLs

  If you are using [signed playback URLs](/docs/guides/secure-video-playback) make sure you include the extra parameter in your signed token.
</Callout>
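As a sketch, you can append the parameter with the standard `URL` API; the helper name below is hypothetical.

```js
// Append redundant_streams=true to an HLS playback URL.
// buildRedundantStreamUrl is a hypothetical helper name.
function buildRedundantStreamUrl(playbackId, extraParams = {}) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  url.searchParams.set('redundant_streams', 'true');
  // Any additional query parameters (e.g. a signed token) go here too.
  for (const [key, value] of Object.entries(extraParams)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}
```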

This table shows the support of various video players for redundant streams. This table will be updated as more players are tested or updated. If your player isn't listed here, please reach out.

| Player | Version | Manifest 4xx | Manifest 5xx | Media 4xx | Media 5xx |
| :-- | :-- | :-- | :-- | :-- | :-- |
| Video.js | >= 7.6.6 |✅ |✅ |✅ |✅ |
| HLS.js | >= 0.14.11 |✅ |✅ |✅ |✅ |
| JWPlayer | Production Release Channel |✅ |✅ |✅ |✅ |
| Safari iOS (AVPlayer) | >= iOS 13.6.1 |✅ |✅ |✅ |✅ |
| Safari MacOS | Safari >= 13.1.2 MacOS 10.15.X |✅ |✅ |✅ |✅ |
| ExoPlayer | >= r2.12.0 |✅ |✅ |✅ |✅ |

## Securing video playback

When using a policy of `"public"` for your playback IDs, your HLS playback URLs will work for as long as the playback ID exists. If you use a `"signed"` policy then you can have more control over playback access. This involves creating signing keys and using JSON web tokens to generate signatures on your server. See the guide for [secure video playback](/docs/guides/secure-video-playback).
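To sketch the shape of signed playback, the token is a JWT whose claims identify the playback ID, audience, expiry, and signing key, and it is signed (RS256) with your Mux signing key. The helper names below are hypothetical; see the secure video playback guide for the authoritative flow.

```js
// Sketch of the claims carried by a signed playback token. In production
// you would sign these claims as an RS256 JWT using your Mux signing key;
// the helper names here are hypothetical.
function buildPlaybackTokenClaims(playbackId, signingKeyId, ttlSeconds = 3600) {
  return {
    sub: playbackId,   // which playback ID the token is valid for
    aud: 'v',          // audience: "v" means video playback
    exp: Math.floor(Date.now() / 1000) + ttlSeconds,
    kid: signingKeyId  // which signing key pair was used
  };
}

// Attach a signed token to the HLS URL.
function buildSignedPlaybackUrl(playbackId, token) {
  return `https://stream.mux.com/${playbackId}.m3u8?token=${token}`;
}
```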

## 6. Next steps

<GuideCard
  title="Get images from a video"
  description="Now that you have playback working, build rich experiences into your application by previewing your videos with thumbnails and gifs."
  links={[
    {title: "Read the guide", href: "/docs/guides/get-images-from-a-video"},
  ]}
/>

<GuideCard
  title="Track your video performance"
  description="Add the Mux Data SDK to your player and start collecting playback performance metrics."
  links={[
    {title: "Read the guide", href: "/docs/guides/track-your-video-performance"},
  ]}
/>


# Mux Player for web
Mux Player is a drop-in component for adding Mux videos to your web application.
**Mux Player** is a drop-in component that you can put in your web application to play Mux assets. Mux Player supports:

* on-demand assets
* live streams
* low-latency live streams
* DVR mode for live or low-latency live streams

Mux Player can be used as a web component (`<mux-player>` from `@mux/mux-player`), as a React component (`<MuxPlayer />` from `@mux/mux-player-react`), or as a web embed (`<iframe src="https://player.mux.com/{playbackId}">`).

Mux Player is a fully-featured video player for content hosted by Mux Video. It is fully integrated with Mux Data without any extra configuration, and it provides a responsive UI based on video player dimensions and stream type, automatic thumbnail previews and poster images, and modern video player capabilities (fullscreen, picture-in-picture, Chromecast, AirPlay).

## Quick start

Here are some examples of Mux Player in action.

## HTML element

Install with npm or yarn, or load Mux Player from the hosted script.

### NPM

```shell
npm install @mux/mux-player@latest
```

### Yarn

```shell
yarn add @mux/mux-player@latest
```

### Hosted

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
```

### Example HTML element implementation

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
<mux-player
  playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  metadata-video-title="Test VOD"
  metadata-viewer-user-id="user-id-007"
></mux-player>
```

<Callout type="info">
  When using the HTML element version of Mux Player, you will see the `Player Software` in Mux Data come through as `mux-player`.
</Callout>

## HTML Embed

### Example HTML embed implementation

```html
<iframe
  src="https://player.mux.com/EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs?metadata-video-title=Test%20VOD&metadata-viewer-user-id=user-id-007"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>
```

<Callout type="info">
  When using the HTML embed version of Mux Player, you will see the `Player Software` in Mux Data come through as `mux-player-iframe`.
</Callout>

## React

You will need to select one of the package options below. Both options will automatically update the player. You can always pin the package to a specific version if needed.

### NPM

```shell
npm install @mux/mux-player-react@latest
```

### Yarn

```shell
yarn add @mux/mux-player-react@latest
```

### Example React implementation

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxPlayer from \"@mux/mux-player-react\"; \n\nexport default function App() {\n  return (\n    <MuxPlayer\n      playbackId=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n      metadata={{\n        video_id: \"video-id-54321\",\n        video_title: \"Test video title\",\n        viewer_user_id: \"user-id-007\",\n      }}\n    />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

<Callout type="info">
  When using the React version of Mux Player, you will see the `Player Software` in Mux Data come through as `mux-player-react`.
</Callout>

## Adaptive controls

As shown in the examples above, the available controls will adjust based on your video's stream type, live or on-demand.

Mux Player will also take into account the size that the player is being displayed at, regardless of the browser window size, and will selectively hide controls that won't fit in the UI.

In the latest version of Mux Player, stream type is detected automatically and you don't need to provide it manually. However, themes other than the default theme that need to know the stream type may need it defined to avoid a delay in showing the correct controls. In that case, set `stream-type` (`streamType` in React) to either `on-demand` or `live` so that the UI can adapt before any information about the video is loaded.
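For example, to tell a theme up front that it is rendering a live stream (`{PLAYBACK_ID}` is a placeholder for your live stream's playback ID):

```html
<mux-player
  playback-id="{PLAYBACK_ID}"
  stream-type="live"
></mux-player>
```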

The following will also appear in some use cases based on support detection:

* [AirPlay](https://www.apple.com/airplay/)
* [Chromecast](https://store.google.com/us/product/chromecast). Requires an extra step, see the [customize look and feel](/docs/guides/player-customize-look-and-feel) guide.
* Fullscreen
* Picture-in-picture button
* Volume controls

<GuideCard
  title="Core functionality"
  description="Understand the features and core functionality of Mux Player"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/player-core-functionality",
    },
  ]}
/>

<GuideCard
  title="Integrate Mux Player"
  description="Integrate Mux Player in your web application. See examples in popular front end frameworks."
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/player-integrate-in-your-webapp",
    },
  ]}
/>

<GuideCard
  title="Customize the look and feel"
  description="Customize Mux Player to match your brand"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/player-customize-look-and-feel",
    },
  ]}
/>

## Set accent color for your brand

The default accent color of the player is Mux pink `#fa50b5`. You should override this with your brand color. Use the `accent-color` HTML attribute or `accentColor` React prop.

```html
<mux-player
  playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  accent-color="#ea580c"
  metadata-video-title="Test VOD"
  metadata-viewer-user-id="user-id-007"
></mux-player>
```

For React:

```jsx
<MuxPlayer
  playbackId="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  accentColor="#ea580c"
  metadata={{
    video_title: "Test VOD",
    viewer_user_id: "user-id-007"
  }}
/>
```


# Core functionality of Mux Player
In this guide, see the features and functionality that Mux Player gives you out of the box.
## Mux Platform integration

Mux Player is built for playing assets hosted with Mux Video. Features like timeline hover previews and automatically pulling poster images work with minimal configuration because the video is hosted by Mux.

Mux Player will use the optimal HLS.js settings based on the type of stream being played, on-demand or live. New versions of Mux Player will contain upgraded versions of HLS.js that are known to be stable versions and tested with Mux Player.

## Mux Data integration

Mux Player is integrated with Mux Data automatically to measure the performance and quality of experience. See the [Understand metric definitions](/docs/guides/understand-metric-definitions) guide to learn more about the metrics that are tracked with Mux Data.

Your Mux Data environment will be inferred from the playback ID provided to Mux Player. No configuration is necessary. If you would like to override that default and send the video views to a specific Mux environment, you can pass the `env-key` (HTML element) attribute or `envKey` (React) prop.

## Responsiveness

Mux Player has different UI permutations based on stream type (`on-demand` or `live`), feature support (like AirPlay), and player size.

Note that the responsiveness of Mux Player is based on the size of the container that it is being rendered in, not the viewport size. If you have a collection of small players in a large viewport, the layout of the controls for each player will be sized appropriately.

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<div>\n<mux-player\n  playback-id=\"v69RSHhFelSm4701snP22dYz2jICy4E4FUyk02rW4gxRM\"\n  metadata-video-title=\"Test Live Stream\"\n  metadata-viewer-user-id=\"user-id-007\"\n></mux-player>\n</div> \n\n<div style=\"max-width: 250px;\">\n<mux-player\n  playback-id=\"v69RSHhFelSm4701snP22dYz2jICy4E4FUyk02rW4gxRM\"\n  metadata-video-title=\"Test Live Stream\"\n  metadata-viewer-user-id=\"user-id-007\"\n></mux-player>\n</div>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-player'",
      "hidden": true
    }
  }
}
```

[Here is a CodeSandbox environment you can view samples in](https://codesandbox.io/s/mux-player-responsiveness-sample-nikk79)

## Controls and UI

Mux Player will show or hide controls based on availability.

On iPhone browsers Mux Player uses Apple's fullscreen functionality.

On iPhone and iPad browsers, the volume slider is not present. Volume level must be controlled via the hardware buttons. This is a restriction of iOS and iPadOS.

The fullscreen button will not show if fullscreen functionality is not available on the page. For example, if Mux Player is embedded inside of an iframe that does not include the `allow="fullscreen"` attribute. This is currently the case on [CodeSandbox](https://codesandbox.io) examples and other similar code testing platforms.

If you are embedding Mux Player in an iframe, use the `<iframe allow="fullscreen">` in order to access fullscreen functionality.

You'll notice the controls are different for on-demand and live stream types.

## Quality selector

By default Mux Player shows a quality selector in the control bar. This is not strictly necessary: the player uses an adaptive bitrate algorithm to determine the highest quality that can be streamed smoothly. However, some users may want to pin a higher rendition for text legibility, or simply because they prefer a higher resolution than the one the adaptive bitrate algorithm selects. In these scenarios it's important to understand the tradeoff: if the user selects a higher rendition than the player would naturally use, they will likely experience rebuffering because the available bandwidth is lower than the quality they want to view. That is perfectly okay, but they have to be willing to make that tradeoff.

<Image src="/docs/images/mux-player-quality-selector.png" width={342} height={358} alt="Mux Player quality selector" />

### Caveats with quality selector

There are some details to understand about when the quality selector will be available, depending on the device, operating system, and browser. The quality selector is only available in environments that use [Media Source Extensions](https://www.w3.org/TR/media-source-2/) (a.k.a. MSE) to power the streaming.

For Mux Player, that means:

* The quality selector **is available** in all non-Safari desktop browsers because Mux Player uses MSE in these browsers
* The quality selector **is available** on Android, because Mux Player uses MSE in Android browsers.
* The quality selector **is not available by default** on MacOS Safari and any iPadOS browser because Mux Player uses Apple's internal HLS playback engine on these platforms. However, MSE is supported on these platforms so the quality selector can be enabled by forcing MSE with the attribute `playback-engine="mse"` (web component & iframe embed) or `playbackEngine="mse"` (React). See [more here about changing the default playback engine](/docs/guides/player-advanced-usage#change-playback-engine).
* The quality selector **is not available** and cannot be enabled on any iOS browsers because MSE is not supported on iOS (instead iOS requires that HLS playback is done via Apple's internal HLS playback engine, which we do not have programmatic access to)

If you prefer to hide the quality selector altogether, you can do that in the web component or React with the CSS variable which sets the `display` property on the control:

```css
mux-player {
  --rendition-menu-button: none;
}
```

See more about styling with CSS in the [Customize look and feel guide](/docs/guides/player-customize-look-and-feel#available-css-variables)

## Multi-track audio selector

By default, if your stream has multiple audio tracks (e.g. descriptive audio, dubs for another language, etc.), Mux Player will show an audio track selector in the control bar. If there is only one or no audio track, the control will be automatically hidden.

<Image src="/docs/images/mux-player-audio-track-selector.png" width={326} height={302} alt="Mux Player audio track selector" />

If you prefer to hide the audio track selector altogether, you can do that in the web component or React with the CSS variable which sets the `display` property on the control:

```css
mux-player {
  --audio-track-menu-button: none;
}
```

For more details on how to use multi-track audio, including adding it via Mux Video, check out [our blog post](https://www.mux.com/blog/parlez-vous-anglais-introducing-multi-track-audio).

## Chromecast

Chromecast support is built-in.

* For Mux Player >= v2.3.0 no additional configuration is needed.
* For Mux Player \< v2.3.0 you only need to add the [Google Cast script](https://developers.google.com/cast) to the `<head>` of your webpage to enable it.

```html
<script
  defer
  src="https://www.gstatic.com/cv/js/sender/v1/cast_sender.js?loadCastFramework=1"
></script>
```

When this script is loaded and a Chromecast is detected on the network then Mux Player will show the Chromecast button in the control bar.

Note that the default Chromecast receiver app does not currently support low-latency Live Streams. If you have your own receiver app that you want to use instead of the default Chromecast receiver app, you can override the variable `chrome.cast.media.DEFAULT_MEDIA_RECEIVER_APP_ID` to point to your receiver app ID.

## Live Stream playback

When live streaming with Mux you have 2 options for viewers:

* **Non-DVR mode**: This is most common. Use the `playback_id` associated with the **Live Stream** for playback. Non-DVR mode keeps viewers on the "live edge" of the live content and does not allow them to seek backwards while the stream is live.
* **DVR mode**: This is less common, but might be what you want depending on the use case. Use the `playback_id` associated with the **Asset** that corresponds to the **Live Stream** for playback. DVR mode allows users to seek backwards while the stream is still live.

For more information about non-DVR mode and DVR mode and some of the tradeoffs to consider, take a look at [this guide](/docs/guides/stream-recordings-of-live-streams).


When using DVR-mode in Mux Player, the UI will show a timeline for users to scroll back to the beginning of the Live Stream while the Live Stream is still active.

## Timeline hover previews

Timeline hover previews show a small thumbnail of the video content at a given timestamp. They help to provide a contextual visual for the viewer based on where their cursor is positioned over the timeline.

When you play back a video hosted on Mux using Mux Player, you’ll see built-in timeline hover previews for the video with no extra work on your end.

<Image src="/docs/images/mux-player-desktop-on-demand.png" width={799} height={464} alt="Timeline hover preview example" />

## Accessibility

Mux Player has taken steps toward being fully WCAG AA compliant. At this time Mux Player supports:

* Keyboard navigation
* Screen reader compatibility with the [Accessibility Object Model](https://wicg.github.io/aom/spec/)
* Closed captions / subtitles will show by default ([if the video has them](/docs/guides/add-subtitles-to-your-videos))

Make sure to take accessibility into consideration when customizing Mux Player. See the guide for [customizing the look and feel of Mux Player](/docs/guides/player-customize-look-and-feel) to change things like primary color, secondary color, or styling with CSS.

When setting color variables and changing styles make sure your implementation meets [the contrast ratio requirements for WCAG 2.1](https://www.w3.org/TR/WCAG/#contrast-minimum).

## Error handling

Mux Player will internally make every attempt to recover from errors and maintain smooth playback.

When Mux Player encounters unrecoverable fatal errors, it will try to:

1. Make it clear to the viewer where the error is coming from and what, if anything, they can do about it.
2. Provide context for a developer to debug and prevent the error from happening in the future. Developer logs prefixed with `[mux-player]` will contain debugging details and a link to more information.
3. Track the error with details in your Mux Data dashboard.

## Audio player

If you have an audio-only Mux asset, you can set the `audio` attribute on `mux-player` to display the audio player. You can also add the `audio` attribute to a video asset to make a video look like an audio player.

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<mux-player\n  playback-id=\"x00Y6AhtNCs01UIW02FhPY4H6hZHkQLuiLoD1tTMj00zuxE\"\n  metadata-video-title=\"Test Audio Stream\"\n  metadata-viewer-user-id=\"user-id-007\"\n  muted\n  audio\n  primary-color=\"#075389\"\n  secondary-color=\"#d6e6f1\"\n  style=\"width: 100%; border: 0;\"\n></mux-player>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-player'",
      "hidden": true
    }
  }
}
```

## Autoplay

Like the native `<video>` element, Mux Player supports the standard `autoplay` attribute.

```html
<mux-player
  playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  autoplay
></mux-player>
```

The main difference between this and the native `autoplay` attribute on a `<video>` element is that Mux Player explicitly calls `.play()` on the underlying video, which gives autoplay a better chance of working.

<Callout>
  Check out our general [autoplay guide](/docs/guides/web-autoplay-your-videos) for more details on why autoplay doesn't always work
</Callout>

The Mux Player autoplay attribute also supports some additional values:

* `autoplay="muted"` - will first attempt to mute the audio before calling `.play()` on the video, increasing the odds of successful playback
* `autoplay="any"` - will attempt playback with the currently set player options. If this fails it will fall back to trying again after muting the audio
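The fallback behavior behind `autoplay="any"` can be sketched as follows. This is illustrative logic, not Mux Player's actual implementation:

```js
// Sketch of the autoplay="any" fallback: try playing with the current
// options, and if the browser blocks it, mute and retry.
// (Illustrative only; not Mux Player's internal code.)
async function attemptAutoplay(video) {
  try {
    await video.play();
    return 'played';
  } catch (err) {
    // Autoplay with sound was blocked by the browser; retry muted.
    video.muted = true;
    try {
      await video.play();
      return 'played-muted';
    } catch (err2) {
      return 'blocked';
    }
  }
}
```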


# Integrate Mux Player into your web application
In this guide, you will learn about Mux Player and how to use it in your web application.
## Install Mux Player

Mux Player has 2 packages:

* `@mux/mux-player`: the web component, compatible with all frontend frameworks
* `@mux/mux-player-react`: the React component, for usage in React

Both are built with TypeScript and can be installed via `npm`, `yarn`, or the hosted option on `jsdelivr`. `@mux/mux-player` can also be used as an `<iframe>` embed.

### NPM

```shell
npm install @mux/mux-player@latest #or @mux/mux-player-react@latest
```

### Yarn

```shell
yarn add @mux/mux-player@latest #or @mux/mux-player-react@latest
```

### CDN

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
<!--
or
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player-react" defer></script>
-->
```

### Embed

```html
<iframe
  src="https://player.mux.com/{PLAYBACK_ID}"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>
```

## Providing attributes

While syntax differs between React and HTML, there are two recommended values to provide in either approach:

* **Playback ID**: Used by the player to create a URL that describes where the video can be streamed from. Under the hood this looks like `stream.mux.com/{PLAYBACK_ID}.m3u8`.
* `metadata`: Information about the video to be tracked by Mux Data as part of a view. At a minimum, you should provide `video_id`, `video_title`, and `viewer_user_id`. See: [Mux Data Metadata](/docs/guides/make-your-data-actionable-with-metadata).

### HTML Web Component attributes

In the HTML web component, `metadata` can be assigned as a property on the element using JavaScript:

```js
document.querySelector("mux-player").metadata = { video_id: "video-id-123" };
```

Or, you can add them as attributes to the player in the HTML using the `metadata-*` prefix:

```html
<mux-player
  playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  metadata-video-id="video-id-123456"
  metadata-video-title="Big Buck Bunny"
  metadata-viewer-user-id="user-id-bc-789"
>
```

### HTML embed attributes

In the HTML embed, you can add most supported attributes to the URL as query parameters.

<Callout type="warning">
  Remember that query parameters should be URL encoded. You might do this with [`encodeURIComponent()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/encodeURIComponent).
</Callout>

```html
<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-id=video-id-123456&metadata-video-title=Big%20Buck%20Bunny&metadata-viewer-user-id=user-id-bc-789"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>
```

### React attributes

Following JavaScript conventions, attributes in React are [camelCased](https://developer.mozilla.org/en-US/docs/Glossary/Camel_case) rather than [kebab-cased](https://developer.mozilla.org/en-US/docs/Glossary/Kebab_case). For example, `playback-id` becomes `playbackId`.

`metadata` is specified as an object in props.

```jsx
<MuxPlayer
  playbackId="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  metadata={{
    video_id: 'video-id-123456',
    video_title: 'Big Buck Bunny',
    viewer_user_id: 'user-id-bc-789',
  }}
></MuxPlayer>
```

## Examples

### HTML element

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<mux-player\n  playback-id=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n  metadata-video-title=\"Test VOD\"\n  metadata-viewer-user-id=\"user-id-007\"\n></mux-player>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-player'",
      "hidden": true
    }
  }
}
```

<Callout type="info">
  When using the HTML element version of Mux Player, you will see the `Player Software` in Mux Data come through as `mux-player`.
</Callout>

### HTML embed

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "files": {
    "/index.html": {
      "code": "<iframe\n  src=\"https://player.mux.com/a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M?metadata-video-title=Test%20VOD&metadata-viewer-user-id=user-id-007\"\n  style=\"aspect-ratio: 16/9; width: 100%; border: 0;\"\n  allow=\"accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;\"\n  allowfullscreen=\"true\"\n></iframe>",
      "active": true
    },
    "/index.js": {
      "code": "",
      "hidden": true
    }
  }
}
```

<Callout type="info">
  When using the HTML embed version of Mux Player, you will see the `Player Software` in Mux Data come through as `mux-player-iframe`.
</Callout>

### React

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxPlayer from \"@mux/mux-player-react\"; \n\nexport default function App() {\n  return (\n    <MuxPlayer\n      playbackId=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n      metadata={{\n        video_id: \"video-id-54321\",\n        video_title: \"Test video title\",\n        viewer_user_id: \"user-id-007\",\n      }}\n    />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

<Callout type="info">
  When using the React version of Mux Player, you will see the `Player Software` in Mux Data come through as `mux-player-react`.
</Callout>

### Svelte

Since Svelte supports web components, here is an example of using the `@mux/mux-player` component. View the SvelteKit example in the [Mux Elements repo](https://github.com/muxinc/elements/tree/main/examples/svelte-kit) for a fully functioning example.

```html
<script context="module" lang="ts">
  export const prerender = true;
</script>

<script lang="ts">
  // this prevents the custom elements from being redefined when the REPL is updated and reloads, which throws an error
  // this means that any changes to the custom element won't be picked up without saving and refreshing the REPL
  // const oldRegister = customElements.define;
  // customElements.define = function(name, constructor, options) {
  // 	if (!customElements.get(name)) {
  // 		oldRegister(name, constructor, options);
  // 	}
  // }
  // import { page } from '$app/stores';
  import { onMount } from "svelte";
  onMount(async () => {
    await import("@mux/mux-player");
  });
</script>

<mux-player
  playback-id="g65IqSFtWdpGR100c2W8VUHrfIVWTNRen"
  metadata-video-id="video-id-54321"
  metadata-video-title="Svelte Kit: Episode 2"
  metadata-viewer-user-id="user-id-sveltekit007"
/>
```

### Vue

Since Vue supports web components, here is an example of using the `@mux/mux-player` component. View the Vue example in the [Mux Elements repo](https://github.com/muxinc/elements/tree/main/examples/vue-with-typescript) for a fully functioning example.

```html
<script setup lang="ts">
  import "@mux/mux-player";
</script>

<template>
  <main>
    <mux-player
      playback-id="g65IqSFtWdpGR100c2W8VUHrfIVWTNRen"
      metadata-video-id="video-id-54321"
      metadata-video-title="Vue 3: Episode 2"
      metadata-viewer-user-id="user-id-vue3007"
    />
  </main>
</template>
```

<GuideCard
  title="Customize the look and feel"
  description="Customize Mux Player to match your brand"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/player-customize-look-and-feel",
    },
  ]}
/>

<GuideCard
  title="Advanced usage"
  description="Learn about advanced usage of Mux Player"
  links={[
    {
      title: "Read the guide",
      href: "/docs/guides/player-advanced-usage",
    },
  ]}
/>


# Customize the look and feel of Mux Player
Learn how to customize the look and feel of Mux Player to fit your brand and use case.
Mux Player is a fully-featured player out of the box, designed to look good and remain fully functional and responsive across different screen sizes. You can customize aspects of Mux Player like its colors and which controls appear.

If you want to go further with customization on things like icons, breakpoints, or where controls are shown, you will want to go down the path of using a [different theme or creating your own theme](/docs/guides/player-themes).

## Customize the poster image

By default, Mux Player pulls the poster image from the midpoint of the video, based on the Playback ID that you provide:

`https://image.mux.com/{PLAYBACK_ID}/thumbnail.jpg`

If you want to change the poster image, you have two options:

1. Pass in `thumbnail-time` (React: `thumbnailTime`) with the value in seconds of the thumbnail that you want to pull from the video.

   * The `thumbnail-time` attribute (React: `thumbnailTime`) is only available if you're NOT using [Signed URLs](/docs/guides/secure-video-playback).
   * If you *are* using Signed URLs you'll need to add the `time=` parameter to your signed token (see the [Usage with signed URLs](/docs/guides/player-advanced-usage) guide).

2. Use the `poster=` attribute.
   * You can set any arbitrary image URL the same way you would do with the HTML5 `<video>` element. For the best viewer experience, your poster image should match the aspect ratio of the video.
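
For example, the two options look like this (`{PLAYBACK_ID}` and the image URL are placeholders for your own values):

```html
<!-- Option 1: pull the poster from 15 seconds into the video -->
<mux-player
  playback-id="{PLAYBACK_ID}"
  thumbnail-time="15"
></mux-player>

<!-- Option 2: supply your own poster image -->
<mux-player
  playback-id="{PLAYBACK_ID}"
  poster="https://example.com/my-poster.jpg"
></mux-player>
```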

### Provide a placeholder while the poster image loads

While the poster image loads, Mux Player will display the contents of the `placeholder=` attribute. Consider using a [Data URL](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URLs) so that the placeholder is immediately available without a network request.

Mux Player [embedded in an iframe through player.mux.com](/docs/guides/player-integrate-in-your-webapp#embed) will automatically generate a Data URL placeholder for you.

If you are generating your pages with a Node.js server (like [Next.js](https://nextjs.org/docs/app/getting-started/fetching-data)), you can generate Data URLs for Mux Videos with the `@mux/blurup` package.

The Data URLs generated by `@mux/blurup` contain lightweight multicolor gradients that visually represent what the default poster image will look like once it has fully loaded.

For example:

<MultiImage
  images={[
  { src: "/docs/images/blurup-loading.png", width: 409, height: 230 },
  { src: "/docs/images/blurup-loaded.png", width: 409, height: 227 },
]}
/>

```js
// Server-Side
import { createBlurUp } from '@mux/blurup';

const options = {};
const muxPlaybackId = 'O6LdRc0112FEJXH00bGsN9Q31yu5EIVHTgjTKRkKtEq1k';

const getPlaceholder = async () => {
  const { blurDataURL, aspectRatio } = await createBlurUp(muxPlaybackId, options);
  console.log(blurDataURL, aspectRatio);
  // data:image/svg+xml;charset=utf-8,<svg xmlns="http://www.w3.org/2000/svg" width="100%" ...
};
```

```html
<!-- Client-Side -->
<mux-player
  playback-id="{playbackId}"
  placeholder="{blurDataUrl}"
  style="aspect-ratio: {aspectRatio}"
></mux-player>
```

If you change the thumbnail time with `thumbnailTime`, you should also pass a time configuration to `createBlurUp(playbackId, { time: customThumbTime })` to generate the correct placeholder.

You can learn more about `@mux/blurup` [on GitHub](https://www.github.com/muxinc/blurup).

If you have a client-side-only application and you *can't* generate a blur placeholder, you might want to pass a smaller resolution poster image URL as the placeholder value that will load more quickly than the final hi-res poster.

This placeholder is provided for you if you're using Mux Player in an iframe through player.mux.com.

## Add a video title

Use the `title` attribute to add a title in the top left corner of Mux Player. The title is visible when the player is wide enough to accommodate it. Note that this is different from `metadata-video-title`, which is a Mux Data metadata field.

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<mux-player\nplayback-id=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\ntitle=\"My awesome video\"\nmetadata-video-title=\"Test video title\"\nmetadata-viewer-user-id=\"user-id-007\"\nstyle=\"aspect-ratio: 16/9; width: 100%;\"\n></mux-player>\n\n<!-- or, embed the player with an iframe -->\n\n<iframe\n  src=\"https://player.mux.com/a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M?title=My%20awesome%20video&metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007\"\n  style=\"aspect-ratio: 16/9; width: 100%; border: 0;\"\n  allow=\"accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;\"\n  allowfullscreen=\"true\"\n></iframe>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-player'",
      "hidden": true
    }
  },
  "stacked": true
}
```

## Style with CSS

The Mux Player Web Component can be styled and positioned with CSS just like you would any other HTML element. For example:

```css
mux-player {
  width: 100%;
  max-width: 800px;
  margin: 40px auto;
}
```

In React, you can style the `<MuxPlayer>` component the same way you style other components: with [styled-components](https://styled-components.com/) or directly with the `style` prop.

<Callout type="warning">
  You cannot style Mux Player with CSS if you are using the HTML embed through player.mux.com.
</Callout>

### Aspect ratio

<ApiRefLink href="/docs/api-reference/video/assets/get-asset">The Mux API</ApiRefLink> will provide you the aspect ratio of your video in the form of `w:h`. You should save this `aspect_ratio` in your database or CMS alongside the `playback_id` and other asset details. Then you can use that with CSS in the form of `w / h`. This is using [the CSS aspect-ratio property](https://developer.mozilla.org/en-US/docs/Web/CSS/aspect-ratio) which is supported in all evergreen browsers.

Setting the aspect ratio of the player is important for preventing [Cumulative Layout Shift](https://web.dev/cls/) on the page.

```css
mux-player {
  aspect-ratio: 16 / 9;
}
/* or if you're using the iframe embed */
iframe {
  aspect-ratio: 16 / 9;
}
```
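
As a sketch, converting the API's `w:h` string into the CSS `w / h` form is a one-liner (the `toCssAspectRatio` helper name is ours, not part of any Mux SDK):

```javascript
// Convert the Mux API's `aspect_ratio` field ("16:9") into the
// value expected by the CSS aspect-ratio property ("16 / 9").
function toCssAspectRatio(aspectRatio) {
  const [w, h] = aspectRatio.split(":");
  return `${w} / ${h}`;
}

// e.g. player.style.aspectRatio = toCssAspectRatio(asset.aspect_ratio);
```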

### Rounded corners

You can add rounded corners to the player by wrapping it in a `div` whose `style` sets `border-radius` and `overflow: hidden`.

```html
<div style="border-radius: 10px; overflow: hidden; display: flex;">
  <mux-player></mux-player>
</div>
```

### Video size and position

You can change the way the video is sized within its `<video>` element. Mux Player provides two CSS variables that you can use to override the standard `object-fit` and `object-position` [CSS properties](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit).

```css
mux-player {
  --media-object-fit: cover;
  --media-object-position: center;
}
```

<Callout type="warning">
  When using the player.mux.com iframe embed, you cannot use CSS to style mux-player directly, so you won't have access to these CSS Custom Properties.
</Callout>

## Hiding controls with CSS

By default, Mux Player will show all the controls associated with the current player size and stream type.

To hide certain controls, use CSS variables. For example, `--seek-backward-button` controls the `display` of the seek backward button; set it to `none` to hide it completely.

```css
mux-player {
  --seek-backward-button: none;
  --seek-forward-button: none;
}
```

CSS vars can also be passed inline:

```html
<mux-player
  style="--seek-backward-button: none; --seek-forward-button: none;"
></mux-player>
```

<Callout type="warning">
  When using the iframe embed, you cannot use CSS to style mux-player directly, so you won't have access to these CSS Custom Properties.
</Callout>

### Controls sections

You can target specific sections of the player by prefixing the CSS vars with the section. The following sections are available:

* `top` the top control bar that shows on the small player size
* `center` the center controls that show the seek forward/backward button and play button
* `bottom` the bottom control bar

```html
<mux-player
  style="--center-controls: none; --top-captions-button: none;"
></mux-player>
```

### Available CSS variables

The CSS selector below shows all the CSS vars available for hiding controls; each one can be prefixed with a section.

```css
mux-player {
  /* Hide all controls at once */
  --controls: none;

  /* Hide the error dialog */
  --dialog: none;

  /* Hide the loading indicator */
  --loading-indicator: none;

  /* Target all sections by excluding the section prefix */
  --play-button: none;
  --live-button: none;
  --seek-backward-button: none;
  --seek-forward-button: none;
  --mute-button: none;
  --captions-button: none;
  --airplay-button: none;
  --pip-button: none;
  --fullscreen-button: none;
  --cast-button: none;
  --playback-rate-button: none;
  --volume-range: none;
  --time-range: none;
  --time-display: none;
  --duration-display: none;
  --rendition-menu-button: none;

  /* Target a specific section by prefixing the CSS var with (top|center|bottom) */
  --center-controls: none;
  --bottom-play-button: none;
}
```

### Controls Backdrop Color

Mux Player exposes a CSS variable (`--controls-backdrop-color`) to set the controls backdrop color.
This is the background color that will show up behind the controls in the player.

```css
mux-player {
  --controls-backdrop-color: rgb(0 0 0 / 60%);
}
```

The backdrop color is turned off by default. If you change this color, make sure the contrast against the controls stays high enough; otherwise the controls may not meet [the contrast ratio requirements for WCAG 2.1](https://www.w3.org/TR/WCAG/#contrast-minimum).
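
If you want to sanity-check a candidate backdrop color against your control color, the WCAG 2.1 contrast ratio can be computed directly. This is a standalone sketch of the formula from the spec, not part of Mux Player:

```javascript
// Relative luminance of an sRGB color (channels 0-255), per WCAG 2.1.
function luminance([r, g, b]) {
  const [rs, gs, bs] = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : ((s + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * rs + 0.7152 * gs + 0.0722 * bs;
}

// Contrast ratio between two colors; WCAG 2.1 asks for at least 3:1
// for non-text UI components such as player controls.
function contrastRatio(color1, color2) {
  const l1 = luminance(color1);
  const l2 = luminance(color2);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// White controls against a fully black backdrop give the maximum
// possible contrast ratio:
console.log(contrastRatio([255, 255, 255], [0, 0, 0]).toFixed(1)); // 21.0
```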

### CSS Parts

Mux Player uses a [shadow DOM](https://developer.mozilla.org/en-US/docs/Web/Web_Components/Using_shadow_DOM) to encapsulate its styles and behaviors. As a result, it's not possible to target its internals with the usual CSS selectors. Instead, some components expose parts that can be targeted with the [CSS part selector](https://developer.mozilla.org/en-US/docs/Web/CSS/::part), or `::part()`.

```html
<style>
  mux-player::part(center play button) {
    display: none;
  }
</style>
<mux-player playback-id="DS00Spx1CV902MCtPj5WknGlR102V5HFkDe"></mux-player>
```

Supported parts: `live`, `layer`, `media-layer`, `poster-layer`, `vertical-layer`, `centered-layer`, `gesture-layer`, `top`, `center`, `bottom`, `play`, `button`, `seek-backward`, `seek-forward`, `mute`, `captions`, `airplay`, `pip`, `cast`, `fullscreen`, `playback-rate`, `volume`, `range`, `time`, `display`.

CSS parts allow you to style each element individually with a selector like `::part(center play button)`, or to target multiple elements at once when a part is assigned to several elements internally, e.g. `::part(button)`. Any CSS property can be declared in the selector, which makes this a very powerful API.

Note that if you are using advanced styling with `::part()` selectors, be sure to test your custom styles when upgrading to new versions of Mux Player.

## Provide color variables

The colors of Mux Player can be customized with the following options:

| HTML Attribute | React Prop | Description |
| ------------- | ---------- | ----------- |
| `accent-color` | `accentColor` | Changes the color used to accent the controls |
| `primary-color` | `primaryColor` | Changes the color of the control icons |
| `secondary-color` | `secondaryColor` | Sets the background color of the control bar |

<Callout type="warning">
  When using the iframe embed, you cannot use CSS to style mux-player directly, so you won't have access to these CSS Custom Properties.
</Callout>

### HTML element example

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<mux-player\nplayback-id=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\nmetadata-video-title=\"Test video title\"\nmetadata-viewer-user-id=\"user-id-007\"\naccent-color=\"#f97316\"\n></mux-player>",
      "active": true
    },
    "/index.js": {
      "code": "import '@mux/mux-player'",
      "hidden": true
    }
  }
}
```

### React example

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxPlayer from \"@mux/mux-player-react\"; \n\nexport default function App() {\n  return (\n    <MuxPlayer\n      playbackId=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n      metadata={{\n        video_id: \"video-id-54321\",\n        video_title: \"Test video title\",\n        viewer_user_id: \"user-id-007\",\n      }}\n      accentColor=\"#f97316\"\n    />\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

## Change default behavior

Below are the attributes (Web Component) / props (React) available to enable, disable, hide, or change aspects of various controls to suit your use case.

### Mute

While Mux Player defaults to enabling sound, you can pass an attribute/prop to start playback muted.

`muted` is a boolean value that, when `true`, defaults sound to a muted state. Users can still unmute and manage volume as desired.

### Skip forward/backward

The amount of time for skip forward/backward defaults to 10 seconds. This can be changed by passing the following attributes (HTML element) / props (React), which updates both the seek buttons and keyboard ("hotkey") behaviors.

| Attribute (HTML) | React Prop | Description | Example |
| --------------- | ---------- | ----------- | ------- |
| `forward-seek-offset` | `forwardSeekOffset` | Sets the number of seconds to skip forward | `forward-seek-offset="5"` will apply a 5 second skip forward |
| `backward-seek-offset` | `backwardSeekOffset` | Sets the number of seconds to skip backward | `backward-seek-offset="5"` will apply a 5 second skip backward |
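
For example, to use 5-second skips in both directions (`{PLAYBACK_ID}` is a placeholder for your own value):

```html
<mux-player
  playback-id="{PLAYBACK_ID}"
  forward-seek-offset="5"
  backward-seek-offset="5"
></mux-player>
```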

### Closed captions

When captions are available on an asset, we show the control for them and enable their appearance by default.

You can opt to disable their appearance (while still showing the control) by using the `default-hidden-captions` (HTML element & embed) attribute or `defaultHiddenCaptions` (React) prop and a boolean value.

### Start time

If you'd like to set a specific time stamp as the start of playback for an asset, you can use the `start-time` (HTML element & embed) attribute or `startTime` (React) prop and a time value.

When `start-time` is provided, it will also be used for the `thumbnail-time` if no `thumbnail-time` is explicitly provided.

Example: `start-time="13"` will begin playback at 13 seconds into the asset.

### Looping content

You can automatically loop the asset once playback completes with the `loop` attribute and a boolean value.

For example, if you have a background looping video on your page, you might want to turn off all controls, autoplay, mute, and loop the video:

```html
<style>
  mux-player {
    --controls: none;
  }
</style>

<mux-player
  playback-id="23s11nz72DsoN657h4314PjKKjsF2JG33eBQQt6B95I"
  autoplay="muted"
  loop
></mux-player>
```

<Callout type="warning">
  When using the iframe embed, you cannot use CSS to style mux-player directly, so you won't have access to the `--controls` CSS Custom Property.
</Callout>

## Autoplay

Autoplay in browsers is a difficult beast. See [this doc](/docs/guides/web-autoplay-your-videos) if you're curious about the details. The good news is that Mux Player can help you handle autoplay when it is warranted.

Before you decide to autoplay your assets, first ask yourself: *Is this necessary?* It often negatively impacts accessibility, and many viewers find autoplay an impediment to their experience.

Here are your options for autoplay:

| Attribute (HTML & embed) | Prop (React) | Description | Behavior |
| --------------- | ------------ | ----------- | -------- |
| `autoplay` | `autoPlay` | Basic autoplay | Will try to autoplay with sound on (likely to fail) |
| `autoplay="muted"` | `autoPlay="muted"` | Muted autoplay | Will autoplay the video in muted state (likely to work) |
| `autoplay="any"` | `autoPlay="any"` | Fallback autoplay | Will try autoplay with sound first, then fall back to muted if that fails |
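
For example, to try sound-on autoplay and fall back to muted if the browser blocks it (`{PLAYBACK_ID}` is a placeholder for your own value):

```html
<mux-player
  playback-id="{PLAYBACK_ID}"
  autoplay="any"
></mux-player>
```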

## Keyboard shortcuts

By default, Mux Player has several keyboard shortcuts, or hotkeys, enabled. These hotkeys will only function if the player or one of the player controls is focused.

### Default hotkeys

| Key   | Name to turn off | Behavior                        |
| ----- | ---------------- | ------------------------------- |
| Space | `nospace`        | Toggle Playback                 |
| `c`   | `noc`            | Toggle captions/subtitles track |
| `k`   | `nok`            | Toggle Playback                 |
| `m`   | `nom`            | Toggle mute                     |
| `f`   | `nof`            | Toggle fullscreen               |
| ⬅️    | `noarrowleft`    | Seek back 10s                   |
| ➡️    | `noarrowright`   | Seek forward 10s                |

### Turning hotkeys off

You can turn off all hotkeys or individual ones.

#### Turning all hotkeys off

To turn all hotkeys off, add the `nohotkeys` attribute to the Mux Player element:

```html
<mux-player
  nohotkeys
  playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>
<!-- or for the embed... -->
<iframe
  src="https://player.mux.com/EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs?nohotkeys=true"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>
```

With the Mux Player Web Component or React component, you can also do this via JavaScript:

```js
const player = document.querySelector("mux-player");
// disable all hotkeys
player.nohotkeys = true;

// re-enable all hotkeys
player.nohotkeys = false;
```

#### Turning off specific hotkeys

If you only want to turn off specific hotkeys, you can do so via JavaScript or HTML.

Using the "Name to turn off" column above, you can add those names to the `hotkeys` attribute to turn off the specific hotkeys you don't want enabled.

For example, to turn off seeking with the arrow keys:

```html
<mux-player
  hotkeys="noarrowleft noarrowright"
  playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>
<!-- or for the embed... -->
<iframe
  src="https://player.mux.com/EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs?hotkeys=noarrowleft%20noarrowright"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>
```

If you're using the Mux Player Web Component or React component, you can also do this programmatically via the `hotkeys` property on the element. This provides a [DOM Token List](https://developer.mozilla.org/en-US/docs/Web/API/DOMTokenList), a la [classList](https://developer.mozilla.org/en-US/docs/Web/API/Element/classList), that allows you to add or remove each key.

```jsx
const player = document.querySelector("mux-player");

// turn off seeking with the arrow keys
player.hotkeys.add("noarrowright", "noarrowleft");

// re-enable the arrow keys
player.hotkeys.remove("noarrowright", "noarrowleft");
```

## Styling captions

Although the `::cue` CSS selector/pseudo-element exists and has good [browser support](https://developer.mozilla.org/en-US/docs/Web/CSS/::cue) on paper, actual support for individual CSS properties combined with it is very inconsistent. Firefox in particular doesn't support many of them.

There are, however, two unique CSS properties you can use for very basic styling of the captions text. Combined with the `::part()` selector, we can apply them like this:

```css
mux-player::part(media-layer) {
  -webkit-text-fill-color: red;
  -webkit-text-stroke: 1px blue;
}
```

Despite being `-webkit-` prefixed, these have good cross-browser support.

Broader and more advanced support for caption styling will be available in a future version of the player.

<Callout type="warning">
  You cannot style the Mux Player element with CSS if you are using the HTML embed through player.mux.com.
</Callout>


# Choose a theme for Mux Player
Learn how to configure a new Mux Player theme
Mux Player is built on top of [Media Chrome](https://www.media-chrome.org/),
which comes with simple but powerful [theming](https://www.media-chrome.org/en/themes)
capabilities. It gives you full control over the video player UI layout
and style while keeping the complexity of media state management out of the way.

<Callout type="warning">
  Themes are unavailable if you are using the Mux Player HTML embed through player.mux.com.
</Callout>

## Mux themes

The `minimal` and `microvideo` themes require one extra import;
then set the `theme` attribute and you're ready to go!

### Minimal theme

This theme pares down the Mux Player experience to the bare-bones controls
viewers need, ideal for those who want a simpler player experience.

Here's an example of a React app using the Minimal theme.

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player-react": "latest",
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxPlayer from \"@mux/mux-player-react\";\nimport \"@mux/mux-player/themes/minimal\";\n\nexport default function App() {\n  return (\n    <>\n      <MuxPlayer\n        theme=\"minimal\"\n        playbackId=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n      />\n    </>\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

### Microvideo theme

This theme optimizes for shorter content that doesn't need the robust playback
controls that longer content typically requires.

Here's an example of an HTML page using the Microvideo theme.

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/index.html": {
      "code": "<script\n  type=\"module\"\n  src=\"https://cdn.jsdelivr.net/npm/player.style/microvideo/+esm\"\n></script>\n<script\n  type=\"module\"\n  src=\"https://cdn.jsdelivr.net/npm/@mux/mux-player\"\n></script>\n<mux-player\n  theme=\"microvideo\"\n  playback-id=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n></mux-player>",
      "active": true
    },
    "/index.js": {
      "code": "\nimport '@mux/mux-player';\nimport '@mux/mux-player/themes/microvideo';\n    ",
      "hidden": true
    }
  }
}
```

### Classic theme

This theme is the classic 1.x version of Mux Player. Here's an example of an HTML page using the Classic theme.

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player-react": "latest",
      "@mux/mux-player": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxPlayer from \"@mux/mux-player-react\";\nimport \"@mux/mux-player/themes/classic\";\n\nexport default function App() {\n  return (\n    <>\n      <MuxPlayer\n        theme=\"classic\"\n        playbackId=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n      />\n    </>\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

### Styling

You can use the same styling methods as explained in
[customize look and feel](/docs/guides/player-customize-look-and-feel#style-with-css).

Note that the CSS variables, CSS parts and styling guidelines are relevant to themes that ship from `@mux/mux-player/themes`. Any other Media Chrome themes created by you or a third party will not necessarily share the same CSS variables and parts.

Unlike the Mux Player default theme, these themes ship with some buttons disabled by default.
However, these can still be enabled by setting a few CSS vars.

| Button | CSS Variable |
| --- | --- |
| Seek backward button | `--seek-backward-button: block;` |
| Seek forward button | `--seek-forward-button: block;` |
| PiP (Picture-in-Picture) button | `--pip-button: block;` |
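
For example, to re-enable all three buttons in one of these themes:

```css
mux-player {
  --seek-backward-button: block;
  --seek-forward-button: block;
  --pip-button: block;
}
```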

## Media Chrome themes

Mux Player uses Media Chrome themes to lay out and style the UI of
the video player. Please read the
[themes documentation](https://www.media-chrome.org/en/themes)
to learn how to create a theme.

There are two ways to consume a Media Chrome theme in Mux Player.

### Via an inline `<template id="mytheme">`

See the example on [Codesandbox](https://codesandbox.io/s/mux-player-tiny-theme-template-vc7d0y?file=/index.html)

### Via a custom element `<media-theme-mytheme>`

See the example on [Codesandbox](https://codesandbox.io/s/mux-player-tiny-theme-custom-element-gst24f?file=/index.html)


# Lazy-loading Mux Player
Improve your users' page load experience by lazy-loading the Mux Player.
## Installation

### React

After [installing `@mux/mux-player-react`](/docs/guides/player-integrate-in-your-webapp), import Mux Player React Lazy from `@mux/mux-player-react/lazy`:

Depending on your bundler, your import might look a little different. If you're having trouble with the import, try:

* `@mux/mux-player-react/lazy`
* `@mux/mux-player-react/dist/lazy.mjs`
* `@mux/mux-player-react/dist/lazy`

Sandpack interactive code example configuration JSON.stringified:
```json
{
  "customSetup": {
    "dependencies": {
      "@mux/mux-player-react": "latest"
    }
  },
  "files": {
    "/App.js": {
      "code": "import MuxPlayer from \"@mux/mux-player-react/dist/lazy.mjs\"; \n\nexport default function App() {\n  return (\n    <>\n      <p style={{ backgroundColor: \"#eee\", height: \"100vh\" }}>\n        Scroll down to see Mux Player load lazily.\n      </p>\n      <MuxPlayer\n        playbackId=\"a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M\"\n        metadata={{\n          video_id: \"video-id-54321\",\n          video_title: \"Test video title\",\n          viewer_user_id: \"user-id-007\",\n        }}\n        style={{ aspectRatio: 16/9 }}\n      />\n    </>\n  );\n}\n",
      "active": true
    },
    "/src/index.js": {
      "code": "",
      "hidden": true
    }
  },
  "template": "react"
}
```

<Callout type="info">
  Mux Player React Lazy will not be available if you are using the hosted option
  on jsdelivr.com.
</Callout>

## Preventing cumulative layout shift

Because the player is added to the DOM after the page loads, it will cause a [cumulative layout shift](https://web.dev/cls), pushing content down and causing a jarring jump for your users. To prevent this, make sure your player has an `aspectRatio` style property. `@mux/mux-player-react/lazy` will display a placeholder with this aspect ratio while the player loads.

```jsx
<MuxPlayer
  playbackId="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
  // without this line, the player will cause a layout shift when it loads
  style={{ aspectRatio: 16/9 }}
/>
```

## Customizing the placeholder

While Mux Player React Lazy loads, it will display a placeholder with the same background color as the player (by default, black).

<Player playbackId="Wd01CoLZp2Adx00qefHtyGVPSP2h4wO33OZqR00vf7wCnQ" style={{ aspectRatio: "495 / 274", '--center-controls': 'none' }} />

If the `placeholder=` attribute is defined, the attribute's contents will display in the placeholder before load. You can generate placeholders that match your video poster with `@mux/blurup`. [See the placeholder guide to learn more](/docs/guides/player-customize-look-and-feel#provide-a-placeholder-while-the-poster-image-loads).

<Player playbackId="bXA3Oh7v22fRBU013damYqUxFK6HrmJcrI00Q00b2OSvmc" style={{ aspectRatio: "656 / 277", '--center-controls': 'none' }} />

## Defining when to load

In addition to the standard attributes that Mux Player React accepts, Mux Player React Lazy will also accept a `loading` attribute:

* `loading="page"`: Loads the player and replaces a placeholder after the page loads and the initial JavaScript bundle is executed
* `loading="viewport"`: (Default) Extends `loading="page"` by also waiting until the placeholder has entered the viewport
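
For example, making the default explicit (this uses the same sample playback ID as above):

```jsx
import MuxPlayer from "@mux/mux-player-react/lazy";

export default function App() {
  return (
    <MuxPlayer
      loading="viewport"
      playbackId="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
      style={{ aspectRatio: 16 / 9 }}
    />
  );
}
```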

## Using other frameworks

If you are working in an environment that supports [dynamic imports](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/import), like [Webpack](https://webpack.js.org/guides/code-splitting/), [Rollup](https://rollupjs.org/guide/en/#code-splitting), [Parcel](https://parceljs.org/features/code-splitting/), or [many modern browsers](https://caniuse.com/es6-module-dynamic-import), you can reproduce the behavior of Mux Player React Lazy.

If you have access to a Node.js server, generate a placeholder that matches your video with `@mux/blurup`.

```js
// Server-Side
import { createBlurUp } from '@mux/blurup';

const options = {};
const muxPlaybackId = 'O6LdRc0112FEJXH00bGsN9Q31yu5EIVHTgjTKRkKtEq1k';

const getPlaceholder = async () => {
  const { blurDataURL, aspectRatio } = await createBlurUp(muxPlaybackId, options);
  console.log(blurDataURL, aspectRatio);
  // data:image/svg+xml;charset=utf-8,<svg xmlns="http://www.w3.org/2000/svg" width="100%" ...
};
```

Then, use a dynamic import to load Mux Player. When the load is complete, replace the placeholder with the player.

```html

<div class="wrapper">
  <div class="placeholder"></div>
</div>

<script>
const wrapper = document.querySelector(".wrapper");
const placeholder = document.querySelector(".placeholder");

import("@mux/mux-player").then(() => {
  const player = document.createElement("mux-player");

  player.setAttribute("playback-id", playbackId);
  player.setAttribute("placeholder", blurUpPlaceholder);
  player.setAttribute("metadata-video-title", "Test video title");
  player.setAttribute("metadata-viewer-user-id", "user-id-007");

  wrapper.replaceChild(player, placeholder);
});
</script>

<style>
.wrapper {
  aspect-ratio: {sourceWidth} / {sourceHeight};
  width: 100%;
  position: relative;
}
mux-player, .placeholder {
  position: absolute;
  inset: 0;
}
.placeholder {
  background-image: url({blurUpPlaceholder});
  background-color: black;
  background-size: contain;
  background-repeat: no-repeat;
}
</style>

```

```svelte

<script>
  const player = import('@mux/mux-player');
</script>

<main>
  <div class="wrapper" style:aspect-ratio="{sourceWidth / sourceHeight}">
    {#await player}
      <div class="placeholder" style:background-image="url('{data.blurUpPlaceholder}')" />
    {:then}
      <mux-player
        playback-id={playbackId}
        placeholder={blurUpPlaceholder}
        metadata-video-title="Test VOD"
        metadata-viewer-user-id="user-id-007"
      />
    {/await}
  </div>
</main>

<style>
  .wrapper {
    width: 100%;
    position: relative;
  }
  mux-player, .placeholder {
    position: absolute;
    inset: 0;
  }
  .placeholder {
    background-color: black;
    background-size: contain;
    background-repeat: no-repeat;
  }
</style>


```



# Running ads with Mux Player
Monetize your videos by running ads in Mux Player with the Google IMA SDK.
Mux Player doesn’t have a built-in way to integrate ads, but you can achieve this using client-side ad insertion with the Google IMA SDK. This guide demonstrates how to enable [preroll](https://www.mux.com/video-glossary/preroll) ads, though [midroll](https://www.mux.com/video-glossary/midroll) and [postroll](https://www.mux.com/video-glossary/postroll) ads can be achieved with the same approach.

If you're unfamiliar with the Google IMA SDK, we recommend reading the [documentation](https://developers.google.com/interactive-media-ads/docs/sdks/html5/client-side) as well as the examples in [this repository](https://github.com/googleads/googleads-ima-html5).

Within web video, ad insertion typically comes in two flavors:

* **SSAI - Server Side Ad Insertion**: Advertisements are stitched into the linear video stream itself, so playback works *without* any additional ad technology on the viewing side.
* **CSAI - Client Side Ad Insertion**: The video player, running inside an application or website, requests an ad from an ad server. When the ad server receives the request, it returns an ad, which the player displays over the video content.

<Callout type="info">
  The following guide uses vanilla JS, however it can be applied to any popular framework (React, Angular, etc.)
</Callout>

## 1. Set up Mux Player

First, make sure you have Mux Player set up on your webpage. Include the following CDN links:

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
```

Then add Mux Player within a containing div element. This div element will act as a container for both Mux Player and the ad layer:

```html
<div id="mainContainer">
    <mux-player
    playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
    metadata-video-title="Test VOD"
    metadata-viewer-user-id="user-id-007"
    ></mux-player>
</div>
```

<Callout type="info">
  [See the related player documentation](/docs/guides/mux-player-web)
</Callout>

## 2. Include the Google IMA SDK

```html
<script src="//imasdk.googleapis.com/js/sdkloader/ima3.js"></script>
```

<Callout type="warning">
  While developing locally, newer versions of Google Chrome might block this script from loading if your page isn't served over HTTPS/SSL
</Callout>

## 3. Create an ad container

We need to add two container elements: one to contain the ad itself, which will overlay Mux Player, and another to wrap around both of them:

```html
<div id="mainContainer">
    <mux-player
    playback-id="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
    metadata-video-title="Test VOD"
    metadata-viewer-user-id="user-id-007"
    ></mux-player>
    <div id="ad-container"></div>
</div>
```

Now we add some CSS to position the ad on top of the player. It's important that the `ad-container` is the exact same size as the player element, so that ads are displayed at the same size as the video.

```css
mux-player {
    width: 640px;
    height: 360px;
}

#mainContainer {
    position: relative;
    width: 640px;
    height: 360px;
}

#ad-container {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    z-index: 1000;
}
```

## 4. Initialize the IMA SDK and handle playback

Initialize the IMA SDK and configure it to use the ad container element we created earlier. The code below creates an AdDisplayContainer object, which is used to overlay ads on top of Mux Player, and an AdsLoader object, which loads ads from the IMA SDK. It then creates an AdsRequest object and requests ads; you'll need an ad tag URL from your ad server.

```javascript
let muxPlayer = document.querySelector('mux-player'); // initialize for later use

const adDisplayContainer = new google.ima.AdDisplayContainer(document.getElementById('ad-container'));
const adsLoader = new google.ima.AdsLoader(adDisplayContainer);
let adsManager; // the AdsManager instance used in the event handlers below

adsLoader.addEventListener(google.ima.AdsManagerLoadedEvent.Type.ADS_MANAGER_LOADED, onAdsManagerLoaded, false);
adsLoader.addEventListener(google.ima.AdErrorEvent.Type.AD_ERROR, onAdError, false);

let adsRequest = new google.ima.AdsRequest();
adsRequest.adTagUrl = 'https://pubads.g.doubleclick.net/gampad/ads?iu=/21775744923/external/single_ad_samples&sz=640x480&cust_params=sample_ct%3Dlinear&ciu_szs=300x250%2C728x90&gdfp_req=1&output=vast&unviewed_position_start=1&env=vp&impl=s&correlator=';

// The above is a testing preroll ad. Please fill in your tag URL from your ad server.

adsRequest.linearAdSlotWidth = 640;
adsRequest.linearAdSlotHeight = 360;
adsRequest.nonLinearAdSlotWidth = 640;
adsRequest.nonLinearAdSlotHeight = 150;

adsLoader.requestAds(adsRequest);
```

When ads are loaded, initialize the AdsManager to start playing them. This sets up the Google IMA SDK's AdsManager and attaches event listeners to handle ad events. It also initializes and starts the AdsManager, ensuring that ads are played correctly. If an error occurs during initialization or ad playback, the content will be played instead.

```javascript
function onAdsManagerLoaded(adsManagerLoadedEvent) {
    let adsRenderingSettings = new google.ima.AdsRenderingSettings();
    adsManager = adsManagerLoadedEvent.getAdsManager(muxPlayer, adsRenderingSettings);

    // Add event listeners to the ads manager here
    adsManager.addEventListener(google.ima.AdErrorEvent.Type.AD_ERROR, onAdError);
    adsManager.addEventListener(google.ima.AdEvent.Type.CONTENT_PAUSE_REQUESTED, onContentPauseRequested);
    adsManager.addEventListener(google.ima.AdEvent.Type.CONTENT_RESUME_REQUESTED, onContentResumeRequested);

    try {
        adsManager.init(640, 360, google.ima.ViewMode.NORMAL);
        adsManager.start();
    } catch (adError) {
        muxPlayer.play(); // If ad fails, continue with the content
    }
}

function onAdError(adErrorEvent) {
    console.log(adErrorEvent.getError());
    if (adsManager) {
        adsManager.destroy();
    }
    muxPlayer.play(); // Continue with the content
}

function onContentPauseRequested() {
    muxPlayer.pause();
}

function onContentResumeRequested() {
    muxPlayer.play();
}
```

<Callout type="info">
  In `adsManager.init(640, 360, google.ima.ViewMode.NORMAL)`, the values 640 and 360 should match the dimensions of the ad and main containers. Otherwise the ad will render at unexpected dimensions
</Callout>

## 5. Link Mux Player events with IMA SDK

These event listeners synchronize ad playback with video playback, ensuring that everything is tied together.

```javascript
muxPlayer.addEventListener('play', function () {
    adDisplayContainer.initialize();
});

muxPlayer.addEventListener('pause', function () {
    if (adsManager) {
        adsManager.pause();
    }
});

muxPlayer.addEventListener('playing', function () {
    if (adsManager) {
        adsManager.resume();
    }
});
```

## 6. Start the ad and content

With the event listeners from the previous step in place, the ad display container is initialized on the viewer's first `play`, ads start via the AdsManager, and content playback resumes when ads finish or fail.
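As an optional safety net, you can also resume content explicitly when every ad has finished. This is a sketch: the `ALL_ADS_COMPLETED` event comes from the IMA SDK, and this listener would be added inside `onAdsManagerLoaded` alongside the other AdsManager listeners shown earlier.

```javascript
// Inside onAdsManagerLoaded, after the other adsManager listeners:
adsManager.addEventListener(google.ima.AdEvent.Type.ALL_ADS_COMPLETED, () => {
  muxPlayer.play(); // all ads are done; make sure content is playing
});
```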

<Callout type="warning">
  Please be mindful when testing if you're using an adblocker as you will not receive any ads
</Callout>


# Advanced usage of Mux Player
In this guide, you will learn about more advanced usage of Mux Player.
## Listen for events

Mux Player emits all of the events available on the [HTML5 video element](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/video#events).

<Callout type="warning">
  Events are unavailable if you are using the Mux Player HTML embed through player.mux.com.
</Callout>

For example, if you want to keep track of how much of a particular video a user has watched, you probably want to use the `timeupdate` event like this:

### HTML element

```html
<mux-player
  playback-id="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>
```

```js
import '@mux/mux-player';

const muxPlayer = document.querySelector("mux-player");

muxPlayer.addEventListener("timeupdate", function (event) {
  console.log('time update!', event);
});
```

In React, the events are camel-cased and prefixed with `on`. For example, `timeupdate` becomes `onTimeUpdate`:

### React

```jsx
function saveWatchProgress(event) {
  /* event */
}

<MuxPlayer onTimeUpdate={saveWatchProgress} />;
```

## Secure your playback experience

Mux offers a couple of ways to secure your media content:

* using signed URLs, which ensures only people with a valid, unexpired token can load your video in allowed playback contexts
* using [Digital Rights Management](/docs/guides/protect-videos-with-drm) <BetaTag />

Both options are easy to use with Mux Player and are discussed below.

### Use signed URLs

If you followed the guide for [Secure video playback](/docs/guides/secure-video-playback) then you are using signed URLs and a few extra steps are required to use Mux Player (or any player for that matter).

First off, you should already be creating JSON Web Tokens (JWTs) on your **server**. If you're not doing that already, head over to that guide and do that part first.

Note that JWTs are granular, so a unique token is used for each resource:

* **Playback** is used to get the actual video.
* **Thumbnail** is used to get a still image from the video. Mux Player uses it for the poster image.
* **Storyboard** is used for [timeline hover previews](/docs/guides/create-timeline-hover-previews). This only works for on-demand video, live streams aren't supported.
* **DRM** is used for playing DRM-protected content. See the [section below](#use-digital-rights-management-drm).
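
As a sketch of what goes into each token (assuming you sign tokens on a Node.js server with your Mux signing key, as covered in the Secure video playback guide), the claims differ mainly in the `aud` value, while `sub` is always the playback ID. The `buildClaims` helper below is hypothetical, for illustration only:

```javascript
// Hypothetical helper: build the JWT claims for a given token type.
// Actual signing (RS256 with your Mux signing key) is covered in the
// Secure video playback guide.
function buildClaims(playbackId, audience, expiresInSeconds = 3600) {
  return {
    sub: playbackId, // the playback ID this token is scoped to
    aud: audience,   // 'v' = playback, 't' = thumbnail, 's' = storyboard, 'd' = DRM
    exp: Math.floor(Date.now() / 1000) + expiresInSeconds,
  };
}

// e.g. the claims for a playback token:
const claims = buildClaims('qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w', 'v');
```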

Each JWT will look something like the examples below, which were created with playback ID `qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w`.

**Playback token:**

```
eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6ImFkamYzb2JpYURUcEF0QVlpS3NCMkpvRlkwMXBpbEJMTHdYcUQzaHpJYURJIn0.eyJleHAiOjE5NjE2NDY0MDMsImF1ZCI6InYiLCJzdWIiOiJxSUpCcWFKUGtoTlhpSGJlZDhqMmp5eDAydFFRV0JJNWZMNldrSVFZTDYzdyJ9.mukZou10_iwaqPeHVFbXwTZShMK1D8kWpFAFOl6bwuIMB7hx0bAqscZxj5FwrIB8dzB6s_9YtJEEVXcR6ezxOhOc_y2ij1XM4YQYCuGH-elJc3rapHbahv2K7L_asz9Bdu1Ld6i6Ux7keNpEuGSYCDmsPmvdII7_XAPmzU01ZTvaXqCgzCY2PO7xz6z3hu1HOww2eL41TSif_Zu0okNZlhfHE9U-nyr4OVpuS9Q-rTtVvfE2ILSd9Ezt02AuOK-JkBCeR3Xf-UrbXB33ZFHLJrYVA-B516Iym0CGRfVssZsAn80_PNaxS_3M_OmVzyaDJ4zudb-YjGcaNl0yf96h6w
```

**Thumbnail token:**

```
eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6ImFkamYzb2JpYURUcEF0QVlpS3NCMkpvRlkwMXBpbEJMTHdYcUQzaHpJYURJIn0.eyJleHAiOjE5NjE2NTkzMzAsImF1ZCI6InQiLCJzdWIiOiJxSUpCcWFKUGtoTlhpSGJlZDhqMmp5eDAydFFRV0JJNWZMNldrSVFZTDYzdyJ9.zQ0tDimpgu7nsT9Tb7GBgitMpYSbLBodwS-fSc7U0K0WT-giCUgxXXSqXquwpHMjEEfSuCsCU3Y1gq2P7WaJUBGTOTLKT5GOwyhjeoJzTPXEQqW7T-tpKXhjEDVwy_H2UPNVdA9ZALos5R9rrWyiTQA53sxT56FWy-IhvaISpiB16nzankRKCAo98kh6lloexE8p3lXnUhLwIK8Hqco4hRmHSmWqUndnJrbq0_kag0o8R0drffSMj6CvKas8_f6v3MtHXDhW0JkJ1TZKwICt7W-jrSyMfhgAb9wltBCUXdNHYvQTXkFfFnsI1R-BuZodQL2zN3pVBqzuhQA0UPADMw
```

**Storyboard token** (only needed for `on-demand`):

```
eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6ImFkamYzb2JpYURUcEF0QVlpS3NCMkpvRlkwMXBpbEJMTHdYcUQzaHpJYURJIn0.eyJleHAiOjE5NjE2NTkzMTQsImF1ZCI6InMiLCJzdWIiOiJxSUpCcWFKUGtoTlhpSGJlZDhqMmp5eDAydFFRV0JJNWZMNldrSVFZTDYzdyJ9.QxvtM-FBakS8IPl_mZloBKLKyHRU8md7IbSifAYbAVHrLwUre3-CXlOcsd6sKi0hVen_DnSqQeuuFTYF6o2TeS31gnBsf5U4W7JDpOjxAepj4ODM6bpPJBu6XDpZmMTduuwVrIXP9pQWSwiHSQ93hk6RR17YrPgGz6sCXIL5gt0re_WqkSEazwYEscu9eByMN3F_sM7W830C7Wzeatb1TMeEf6wQhbpKABLB33VM0FOuM5ojjI9DWmDhJksfFVrOxaZtoju4hjiWQtNPVBCFP28J9LHNLA7brRXvDGaIUxHG5-vrcVuImlghdWgPyrAOb0lWYSiklYx2ObHhNWJK1g
```

Once you have generated the three tokens, pass them into Mux Player:

```embed

<iframe
  src="https://player.mux.com/qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w?playback-token=your-playback-token&thumbnail-token=your-thumbnail-token&storyboard-token=your-storyboard-token"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html
<mux-player
  playback-id="qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w"
  playback-token="your-playback-token"
  thumbnail-token="your-thumbnail-token"
  storyboard-token="your-storyboard-token"
  metadata-video-id="video-id-54321"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-007"
></mux-player>
```

```react

<MuxPlayer
  playbackId="qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w"
  metadata={{
    video_id: "video-id-54321",
    video_title: "Test video title",
    viewer_user_id: "user-id-007",
  }}
  tokens={{
    playback: "your-playback-token",
    thumbnail: "your-thumbnail-token",
    storyboard: "your-storyboard-token",
  }}
/>
```



If you are using JavaScript and the Mux Player Web Component or React component, you can use the `tokens` property too:

```javascript
const muxPlayer = document.querySelector("mux-player");
muxPlayer.tokens = {
  playback: "eyJhbGciOiJSUzI1NiI...",
  thumbnail: "eyJhbGciOiJSUzI1N...",
  storyboard: "eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsI...",
};
```

Mux Player sends errors to Mux Data when tokens are incorrect. The most common error cases with signed URLs that Mux Player detects are:

* [Playback ID mismatch](https://github.com/muxinc/elements/blob/main/errors/403-playback-id-mismatch.md)
* [Expired token](https://github.com/muxinc/elements/blob/main/errors/403-expired-token.md)
* [Malformatted token](https://github.com/muxinc/elements/blob/main/errors/403-malformatted-token.md)

These errors will be logged to the browser console and sent to your Mux Data dashboard.

### Use Digital Rights Management (DRM)

<Callout type="info">
  This feature is currently in beta. [Learn more about DRM.](/docs/guides/protect-videos-with-drm) <BetaTag />
</Callout>

<Callout type="info">
  To play DRM protected content on iOS and iPadOS devices the device should be running the current minor and patch version of iOS or iPadOS.

  We strongly recommend that viewers use the latest version of iOS/iPadOS 17 or 18 when viewing DRM protected content.

  Playing DRM protected content on an OS version that is not the latest minor and patch version of a major release is known to result in playback failures.
</Callout>

If you've [set up your playback ID to be DRM-protected](/docs/guides/protect-videos-with-drm), playback is as simple as adding the DRM token to your set of tokens used.

```embed

<iframe
  src="https://player.mux.com/your-playback-id?drm-token=your-drm-token&playback-token=your-playback-token&thumbnail-token=your-thumbnail-token&storyboard-token=your-storyboard-token"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html
<mux-player
  playback-id="your-playback-id"
  playback-token="your-playback-token"
  drm-token="your-drm-token"
  thumbnail-token="your-thumbnail-token"
  storyboard-token="your-storyboard-token"
></mux-player>
```

```react

<MuxPlayer
  playbackId="your-playback-id"
  tokens={{
    playback: "your-playback-token",
    drm: "your-drm-token",
    thumbnail: "your-thumbnail-token",
    storyboard: "your-storyboard-token",
  }}
/>
```



If you are using JavaScript and the Mux Player Web Component or React component, you can use the `tokens` property too:

```javascript
const muxPlayer = document.querySelector("mux-player");
muxPlayer.tokens = {
  playback: "eyJhbGciOiJSUzI1NiI...",
  drm: "eyJhbGciOiJSUzI1NiIs...",
  thumbnail: "eyJhbGciOiJSUzI1N...",
  storyboard: "eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsI...",
};
```

## Controlling an iframe-embedded Mux Player with Player.js

Mux Player embedded within an iframe with player.mux.com supports the Player.js spec. This means you can control the player from your own window's JavaScript. See the [Player.js docs](https://github.com/embedly/player.js#playerjs) for more information.
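As a sketch of driving the embed from the host page (assuming you have loaded the Player.js library from the Embedly docs, and that the iframe below is your player.mux.com embed):

```javascript
// Wrap the player.mux.com iframe in a Player.js controller.
const iframe = document.querySelector('iframe');
const player = new playerjs.Player(iframe);

player.on('ready', () => {
  // Once ready, the Player.js spec methods are available:
  player.setCurrentTime(30); // seek the embedded player
  player.play();
});
```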

## Preloading assets

By default, `preload` behaves similarly to the HTML5 `<video>` element's `preload` attribute.

Use the `preload=` attribute with values of `"none"`, `"metadata"` or `"auto"`.
Or omit it for the default behavior.

When there is no `preload` attribute, the player uses the browser's default behavior.
Most browsers default to `"auto"`, but some (like Chrome) use `"metadata"` instead.
On mobile devices, `preload` is always `"none"`.
For the most consistent user experience, we recommend providing the `preload` attribute explicitly.

The value `"auto"` will start loading the video as soon as possible and give the user the best experience with the shortest startup time.

If you want to preserve bandwidth (and delivery cost) set `preload="none"` (load nothing until the user tries to play) or `preload="metadata"` (load the minimum amount of data for the media to get basic information like its duration).

The tradeoff with using `preload="metadata"` or `preload="none"` is that when the user plays the video they will experience a slower startup time because the video has to load before playback can start. You'll see the slower startup time reflected in your Mux Data dashboard and this will negatively impact the [Overall Viewer Experience metric](/docs/guides/data-overall-viewer-experience-metric).
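For example, a minimal setup that only loads basic information up front (using a playback ID from elsewhere in these docs):

```html
<mux-player
  playback-id="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
  preload="metadata"
></mux-player>
```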

## Use custom video domains

By default, all Mux Video assets will be hosted on mux.com. This includes things like posters, storyboards, and media sources.

[Custom Domains](https://www.mux.com/blog/introducing-custom-domains) is a feature that allows you to stream these assets from a domain of your choice.

Once you have your custom domain set up, provide it via the `custom-domain` attribute or `customDomain` property. If your custom domain is `media.example.com` then internally Mux Player will take that value and expand it to `image.media.example.com` for images and `stream.media.example.com` for video.

```embed

<iframe
  src="https://player.mux.com/qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w?customDomain=media.example.com"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<mux-player
  playback-id="qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w"
  custom-domain="media.example.com"
></mux-player>
```

```react

<MuxPlayer
  playbackId="qIJBqaJPkhNXiHbed8j2jyx02tQQWBI5fL6WkIQYL63w"
  customDomain="media.example.com"
/>
```



If you are using JavaScript and the Mux Player Web Component or React component, you can use the `customDomain` property too:

```javascript
const muxPlayer = document.querySelector("mux-player");
muxPlayer.customDomain = "media.example.com";
```

## Access the underlying video element

The `media.nativeEl` property is a reference to the underlying video element. When using the Mux Player Web Component or React component, you can use this to access the video element's properties and methods.

```jsx
  <MuxPlayer
    playbackId="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
    ref={(muxPlayerEl) => console.log(muxPlayerEl.media.nativeEl)}
    metadata={{
      video_id: "video-id-54321",
      video_title: "Test video title",
      viewer_user_id: "user-id-007",
    }}
  />
```

## Change playback engine

Mux Player will automatically handle Adaptive Bitrate Streaming with your Mux Asset. For a beginner's guide on how this works, [howvideo.works](https://howvideo.works/) is an informational site that explains the basic concepts. Under the hood, Mux Player uses [HLS.js](https://github.com/video-dev/hls.js/) and Mux Player will pick the optimal HLS.js configuration based on the provided `stream-type`.

On iOS, iPadOS, and macOS, Mux Player will use Apple's native HLS streaming engine. On Android, Mux Player will use HLS.js.

It is not recommended, but if you have a good reason to control whether Mux Player uses HLS.js (MSE, Media Source Extensions) or native HLS playback, you can with the `prefer-playback` attribute (`preferPlayback` in React). Values can be `"mse"` or `"native"`. When a value is provided for `prefer-playback`, Mux Player will use that playback strategy if available.

Note that setting the `prefer-playback` attribute should be done with caution. If you are setting this, make sure you thoroughly test playback on the various operating systems and browsers that Mux Player will be running in. Also, keep an eye on Mux Data to verify that your playback metrics are on track.
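For example, forcing the MSE (HLS.js) engine (using a playback ID from elsewhere in these docs):

```html
<mux-player
  playback-id="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
  prefer-playback="mse"
></mux-player>
```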

## Re-using player instances

Mux Player instances can be re-used by re-setting the `playback-id`.

In React, this is done by changing the `playbackId` prop to a new value.

In the web component, this can be done by either calling `setAttribute` with a new value for the `playback-id` attribute or by assigning the `playbackId` property. Both are equally valid ways of interacting with the `<mux-player>` element instance.

```js
const muxPlayer = document.querySelector('mux-player');

// using setAttribute
muxPlayer.setAttribute('playback-id', 'new-playback-id-xxx');
// using the `playbackId` prop
muxPlayer.playbackId = 'new-playback-id-xxx';
```

## Debugging

Add the `debug` attribute or React prop in order to print verbose logging to the developer console. This will enable verbose logging from:

* Mux Player itself (prefixed with `[mux-player]`)
* [HLS.js](https://github.com/video-dev/hls.js/)
* Mux Data

Note that this must be set before setting a `playback-id` to take full advantage of debug logging.
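A minimal sketch of enabling debug logging from JavaScript, setting the attribute before the playback ID so all logging is captured:

```js
const muxPlayer = document.querySelector('mux-player');

// Enable verbose logging first, then set the playback ID.
muxPlayer.setAttribute('debug', '');
muxPlayer.setAttribute('playback-id', 'a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M');
```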

## Disabling cookies

Even though Mux Data cookies do not contain any personally identifiable information (PII) and are used for more reliable and informative QOE metrics, there are times when you may want or need cookies to be disabled.

In those cases, you can use the `disable-cookies` attribute or `disableCookies` React prop to turn off use of cookies by Mux Data. Note this must be set before setting a `playback-id` to take effect.

For more on the use of cookies in Mux Data, see [the docs](/docs/guides/monitor-html5-video-element#disable-cookies).
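For example, in HTML the attribute just needs to be present before the playback ID is set:

```html
<mux-player
  disable-cookies
  playback-id="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
></mux-player>
```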

## Custom storyboards

By default Mux Player will use the [storyboard](/docs/guides/create-timeline-hover-previews#webvtt) WebVTT text track that corresponds to your `playback-id`:

`https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?format=webp`

If you want to use a different WebVTT source file for your storyboard, you can use the `storyboard-src` attribute or `storyboardSrc` React prop to override it. Keep in mind that the WebVTT source file must conform to our expectations for storyboards.
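For example (the storyboard URL below is a hypothetical placeholder; point it at your own WebVTT file that conforms to the storyboard format):

```html
<mux-player
  playback-id="a4nOgmxGWg6gULfcBbAa00gXyfcwPnAFldF8RdsNyk8M"
  storyboard-src="https://example.com/custom-storyboard.vtt"
></mux-player>
```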

## Add chapters and time-based metadata

The Mux Player Web Component and React component support both chapters and time-based metadata (cue points). Chapters visually split the timeline into sections with titles that users can click to jump to. Cue points allow you to associate custom metadata with ranges of time in the timeline. Both support getting a callback when the chapter or cue point has become active. You can use either individually or both at the same time, depending on your use-case.

If you omit `endTime` from a cue point or chapter, it will automatically end when the next one begins by joining them together without gaps. If you include an `endTime`, you can have gaps between your chapters or cue points.

Both chapters and cue points will be removed if you unload the media or change the current playback ID.

### Chapters example

A chapter is defined as: `{startTime: number; endTime?: number; value: string}`, with the value containing the chapter's title and `endTime` being optional. Both `startTime` and `endTime` are in seconds.

<Image src="/docs/images/chapter-example.png" width={1596 } height={ 438} caption="Mux Player chapter example with a gap between chapters" />

```js
const muxPlayerEl = document.querySelector('mux-player');

function addChaptersToPlayer() {
  // Chapters can also specify an `endTime` if we don't want them to automatically join up
  muxPlayerEl.addChapters([
    { startTime: 1, value: 'Chapter 1' },
    { startTime: 3, value: 'Chapter 2' },
    { startTime: 10, value: 'Chapter 3 - will span to the end' },
  ]);
}

// NOTE: We need to wait until the player has loaded some data first
// otherwise, we have no media to associate them with
if (muxPlayerEl.readyState >= 1) {
  addChaptersToPlayer();
} else {
  muxPlayerEl.addEventListener('loadedmetadata', addChaptersToPlayer, { once: true });
}

muxPlayerEl.addEventListener('chapterchange', () => {
  console.log(muxPlayerEl.activeChapter);
  console.log(muxPlayerEl.chapters);
});
```

Chapters currently work with streaming assets (video on demand) and audio, but not live content.

### Time-based metadata (cue points)

A CuePoint is defined as: `{ startTime: number; endTime?: number; value: any; }`, with the `value` being a JSON-serializable value that you want to associate with that range of time. Like chapters, start and end times are in seconds and `endTime` is optional.

```js
const muxPlayerEl = document.querySelector('mux-player');
function addCuePointsToPlayer() {
  // CuePoints can also specify an `endTime` if we don't want them to automatically join up
  const cuePoints = [
    { startTime: 1, value: 'Simple Value' },
    { startTime: 3, value: { complex: 'Complex Object', duration: 2 } },
    { startTime: 10, value: true },
    { startTime: 15, value: { anything: 'That can be serialized to JSON and makes sense for your use case' } }
  ];

  muxPlayerEl.addCuePoints(cuePoints);
}

// We're using `duration` and `'durationchange'` to determine if the `<mux-player>` element has loaded src.
// This gives us the opportunity to compare our CuePoints against the duration of the media if needed.
// You could use other events, such as `'loadedmetadata'` if that makes more sense for your use case.
if (muxPlayerEl.duration) {
  addCuePointsToPlayer();
} else {
  muxPlayerEl.addEventListener('durationchange', addCuePointsToPlayer, { once: true });
}

muxPlayerEl.addEventListener('cuepointchange', () => {
  console.log(muxPlayerEl.activeCuePoint);
  console.log(muxPlayerEl.cuepoints);
});
```

If cue points are specified without an `endTime`, then like chapters they will automatically be joined up end-to-end. This means that if a user seeks anywhere between two cue points, the `cuepointchange` event will fire and the `activeCuePoint` will be the earlier cue point. If you only care about the `activeCuePoint` when the `currentTime` is roughly the same as the `startTime` of a cue point, you can add some custom logic to account for that, e.g.:

```js
function cuePointChangeListener() {
  // Only do something with the activeCuePoint if we're "near" its `startTime`.
  const cuePointBuffer = 1; // how close the playhead needs to be to the CuePoint, in seconds
  if (Math.abs(muxPlayerEl.currentTime - muxPlayerEl.activeCuePoint.startTime) <= cuePointBuffer) {
    console.log('Active CuePoint playing near its time!', muxPlayerEl.activeCuePoint);
  }
}

muxPlayerEl.addEventListener('cuepointchange', cuePointChangeListener);
```

## Synchronize video playback

To facilitate synchronizing video playback across players, Mux Player exposes `currentPdt` and `getStartDate()`.

If the stream includes Program Date Time (PDT) tags, `currentPdt` and `getStartDate()` will return a [Date][] object that corresponds to the PDT at the current time or at the beginning of the stream, respectively.
If there is no PDT, or if the video hasn't loaded yet, `currentPdt` and `getStartDate()` will return an Invalid Date object.

See [Synchronize video playback](/docs/examples/synchronize-video-playback) for more information.

<Callout type="info">
  `currentPdt` and `getStartDate()` currently require that [Slates](/docs/guides/handle-live-stream-disconnects#reconnect-window-and-slates) are enabled on your stream.
  If Slates are not enabled, it is possible that the times provided are not accurate.
</Callout>

Refer to this sample for the usage below:

```text
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:2
#EXT-X-MAP:URI="https://chunk-gce-us-east1-production.cfcdn.mux.com/v1/chunk/3aJUOua6jsMHYybcqXRBpcXH82aCYXTu02TPTKHzIokndAPmz300ZThlCZbeNAy1t73003iytFZNJdjcvjTsOrCVTaGZgQ9J00uU/18446744073709551615.m4s?skid=default&signature=NjBmMjFkODBfYWVhMjIyZTdmMDU0ZmI0YWU2ZWJkZTJiYTY4MzhmYWQzNWQ2YzMyMTVlYjdjNmM0NzZiZjBmZGU0ODU1MTUyNQ=="
#EXT-X-PLAYLIST-TYPE:VOD

#EXT-X-PROGRAM-DATE-TIME:2021-06-28T17:53:25.533+00:00
#EXTINF:2,
https://chunk-gce-us-east1-production.cfcdn.mux.com/v1/chunk/3aJUOua6jsMHYybcqXRBpcXH82aCYXTu02TPTKHzIokndAPmz300ZThlCZbeNAy1t73003iytFZNJdjcvjTsOrCVTaGZgQ9J00uU/0.m4s?skid=default&signature=NjBmMjFkODBfOWJkMzMyMTc5YzgwY2VmMTdlYzIwODgzZGI2NWFiMThiM2U1NDM0NzM0NDZhMmQwOThhZmI0NDQ5OWY5N2VmMA==

#EXT-X-PROGRAM-DATE-TIME:2021-06-28T17:53:27.533+00:00
#EXTINF:2,
https://chunk-gce-us-east1-production.cfcdn.mux.com/v1/chunk/3aJUOua6jsMHYybcqXRBpcXH82aCYXTu02TPTKHzIokndAPmz300ZThlCZbeNAy1t73003iytFZNJdjcvjTsOrCVTaGZgQ9J00uU/1.m4s?skid=default&signature=NjBmMjFkODBfMjA1ZWNmYzgzYWRhMzNjMTY5YmEyYmM2NzE4MDk5N2I1MWE3NzhjODlhNGIzNWI3NGIwNTA5ZTIxOWQyNjI5OQ==

#EXT-X-PROGRAM-DATE-TIME:2021-06-28T17:53:29.533+00:00
#EXTINF:2,
https://chunk-gce-us-east1-production.cfcdn.mux.com/v1/chunk/3aJUOua6jsMHYybcqXRBpcXH82aCYXTu02TPTKHzIokndAPmz300ZThlCZbeNAy1t73003iytFZNJdjcvjTsOrCVTaGZgQ9J00uU/2.m4s?skid=default&signature=NjBmMjFkODBfZTIyOTA5YWFjZjMzYTY4MzQ4YWEzZDBiNDkyODk1NTg2ODE2M2YwZjI3NmY2MTVhOTM5MTA2MzQ4ODIyNTNkOQ==

#EXT-X-PROGRAM-DATE-TIME:2021-06-28T17:53:31.533+00:00
#EXTINF:2,
https://chunk-gce-us-east1-production.cfcdn.mux.com/v1/chunk/3aJUOua6jsMHYybcqXRBpcXH82aCYXTu02TPTKHzIokndAPmz300ZThlCZbeNAy1t73003iytFZNJdjcvjTsOrCVTaGZgQ9J00uU/3.m4s?skid=default&signature=NjBmMjFkODBfNDRkZTNhYTE5M2RhYTA4MTA4MWFkODc0YzgyMDcyMGMwODFmZWIxOGRiNWM4YzJhMTM0YTNiNGRhYmYyMWE1Nw==

#EXT-X-ENDLIST
```

### `currentPdt`

This will return a JavaScript [Date][] object that is based on the currentTime.
If there is no PDT in the stream, an invalid date object is returned.

```js
const player = document.querySelector('mux-player');
// assuming the above stream, the initial currentPdt would be
player.currentPdt;
// Mon Jun 28 2021 13:53:25 GMT-0400 (Eastern Daylight Time)
player.currentPdt.getTime();
// 1624902805533

// now if we seek forward, by 10 seconds
player.currentTime = 10;

player.currentPdt;
// Mon Jun 28 2021 13:53:35 GMT-0400 (Eastern Daylight Time)
player.currentPdt.getTime();
// 1624902815533
```

### `getStartDate()`

This returns a JavaScript [Date][] object corresponding to the beginning of the stream.
This method mirrors the [HTML-specified method](https://html.spec.whatwg.org/multipage/media.html#dom-media-getstartdate).

```js
const player = document.querySelector('mux-player');
// assuming the above stream, getStartDate() would return
player.getStartDate();
// Mon Jun 28 2021 13:53:25 GMT-0400 (Eastern Daylight Time)
player.getStartDate().getTime();
// 1624902805533
// notice that when currentTime is 0, getStartDate() is equivalent to currentPdt

// now if we seek forward, by 10 seconds
player.currentTime = 10;

player.getStartDate();
// Mon Jun 28 2021 13:53:25 GMT-0400 (Eastern Daylight Time)
player.getStartDate().getTime();
// 1624902805533
// notice that even though we seeked forward, we still get the same value.
```
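For streams that carry PDT tags, these two values are directly related: `currentPdt` is effectively `getStartDate()` shifted forward by the playhead position. A minimal sketch of that relationship using plain JavaScript Dates (the helper below is purely illustrative and not part of the player API):

```javascript
// Illustrative helper (not a player API): derive the current PDT
// from the stream's start date and the playhead position.
function computeCurrentPdt(startDate, currentTimeSeconds) {
  return new Date(startDate.getTime() + currentTimeSeconds * 1000);
}

// Start date taken from the example stream's first
// #EXT-X-PROGRAM-DATE-TIME tag above.
const startDate = new Date('2021-06-28T17:53:25.533+00:00');

computeCurrentPdt(startDate, 0).getTime();  // 1624902805533 (same as getStartDate())
computeCurrentPdt(startDate, 10).getTime(); // 1624902815533 (10 seconds later)
```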

[Date]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date

## Full API reference

Any features or settings not mentioned above can be found in our [full API reference](/docs/guides/player-api-reference) covering all of the available events, attributes, properties, and methods exposed by the player.


# Mux Player examples
Browse our collection of code examples for building common use cases with Mux Player
<GuideCard
  imageSrc="/docs/images/example-player-loop@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Looping background video"
  description="Display a looping background video on your site with the Mux Player."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/s/looping-hero-background-video-op53sr",
    },
  ]}
/>

<GuideCard
  imageSrc="/docs/images/example-player-ambient-mode@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Ambient mode"
  description="Create a dynamic background gradient that matches colors from the video."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/s/ambient-mode-vv63e9",
    },
  ]}
/>

<GuideCard
  imageSrc="/docs/images/example-player-audio-viz@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Audio visualization with audio parameter"
  description="Display a visual representation of the audio in your video during playback."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/s/audio-visualization-o52wog",
    },
  ]}
/>

<GuideCard
  imageSrc="/docs/images/example-player-metadata@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Sending detailed metadata to Mux Data"
  description="Send detailed metadata about your video views to Mux Data during playback."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/s/send-detailed-metadata-wm6o44",
    },
  ]}
/>

<GuideCard
  imageSrc="/docs/images/example-player-disable-seek@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Disable seeking"
  description="Prevent your viewers from seeking to a specific point in the video."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/s/disable-seeking-w7pltk",
    },
  ]}
/>

<GuideCard
  imageSrc="/docs/images/example-player-playlist@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Play videos in a playlist"
  description="Play through a set of audio or video sources."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/s/mux-player-media-playlist-ntj11i",
    },
  ]}
/>

<GuideCard
  imageSrc="/docs/images/example-player-loop@2x.png"
  imageWidth={536}
  imageHeight={300}
  title="Mux Player Meditate (Audio Only + CuePoints)"
  description="Use Mux Player + CuePoints for advanced customization of playback and interactivity."
  links={[
    {
      title: "View on CodeSandbox →",
      href: "https://codesandbox.io/p/sandbox/mux-player-audio-cuepoints-vl2r9b",
    },
  ]}
/>


# Mux Player FAQs
Get answers to common questions about Mux Player
# Do you support non-Mux HLS streams?

Mux Player is designed with the Mux platform in mind, so arbitrary non-Mux HLS streams aren't supported. Being tightly coupled with Mux Video is what enables features like timeline hover previews and those sweet, descriptive errors in Mux Data.

# How can I access the underlying video element using Mux Player?

The `media.nativeEl` property is a reference to the underlying video element. You can use this to access the video element's properties and methods.

```jsx
  <MuxPlayer
    playbackId="EcHgOK9coz5K4rjSwOkoE7Y7O01201YMIC200RI6lNxnhs"
    ref={(muxPlayerEl) => console.log(muxPlayerEl.media.nativeEl)}
    metadata={{
      video_id: "video-id-54321",
      video_title: "Test video title",
      viewer_user_id: "user-id-007",
    }}
  />
```

This isn't possible when using the iframe-embedded version of Mux Player through player.mux.com. You can control the embedded version of Mux Player through the [Player.js spec](https://github.com/embedly/player.js#playerjs).
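For the iframe embed, the Player.js spec is the control surface. Here's a hedged sketch of wiring up an embed (assumes Embedly's player.js script is loaded so the `playerjs` global exists, and that your page contains the player.mux.com iframe):

```javascript
// Hedged sketch: control the player.mux.com iframe embed via the
// Player.js spec. `player` is a playerjs.Player instance wrapping
// the embed iframe.
function wireUpEmbed(player) {
  player.on('ready', () => {
    player.setCurrentTime(30); // seek to 30 seconds
    player.play();
  });
}

// In the browser:
// const iframe = document.querySelector('iframe[src*="player.mux.com"]');
// wireUpEmbed(new playerjs.Player(iframe));
```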

# Do you have a Mux Player for native mobile?

Yes, we have public beta SDKs for [iOS](/docs/guides/mux-player-ios) and [Android](/docs/guides/mux-player-android). If you're building directly in Swift/Objective-C or Kotlin/Java, you can use these SDKs directly. If you're building with Flutter or React Native, you'll need to bridge these native SDKs into your framework.

# I would love to speak to someone on the team about a feature idea or a problem I'm running into with the player, how can I do that?

Please [leave us some feedback](/support) and we'll be in touch!

# How is Mux Player built?

Mux Player is built with [Web Components](https://developer.mozilla.org/en-US/docs/Web/Web_Components). Web Components are a set of native browser APIs for defining custom HTML tags that can be used in the DOM.
Mux Player is built on top of [Media Chrome](https://github.com/muxinc/media-chrome) and the [Mux Video HTML element](https://github.com/muxinc/elements/tree/main/packages/mux-video). You can think of it like this:

* The [Mux Video HTML element](https://github.com/muxinc/elements/tree/main/packages/mux-video) handles the HLS playback tech behind the scenes and the integration with Mux Data.
* [Media Chrome](https://github.com/muxinc/media-chrome) is the UI layer.

Both the Mux Video HTML element and Media Chrome are maintained and under active development by Mux.
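To make that layering concrete, here's a rough sketch of composing the pieces yourself: Media Chrome controls wrapped around the Mux Video element. Element names come from the Media Chrome and mux-video projects, and Mux Player wires all of this up for you, so treat this as illustrative rather than a recommended setup:

```html
<media-controller>
  <!-- The playback layer: mux-video handles HLS + Mux Data -->
  <mux-video slot="media" playback-id="YOUR_PLAYBACK_ID"></mux-video>
  <!-- The UI layer: Media Chrome components -->
  <media-control-bar>
    <media-play-button></media-play-button>
    <media-time-range></media-time-range>
    <media-mute-button></media-mute-button>
  </media-control-bar>
</media-controller>
```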

# What are the developer system requirements?

The Mux Player package targets ES2019. If you're targeting an older JavaScript runtime, Mux Player might not be compatible with your build setup.
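If you're unsure whether your target runtime is ES2019-capable, a quick smoke test (these are standard ES2019 language and library features, not Mux APIs):

```javascript
// A few features introduced in ES2019. If any of these fail in your
// target runtime, you'll need to transpile the Mux Player package
// down to your target yourself.
const flattened = [[1], [2, [3]]].flat(2); // Array.prototype.flat → [1, 2, 3]
const entries = Object.fromEntries([['level', 'ES2019']]); // Object.fromEntries
try {
  JSON.parse('not json');
} catch {
  // optional catch binding: the catch parameter can be omitted in ES2019
}
```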

# Evergreen browser support

Mux Player supports the most recent versions of evergreen browsers on desktop and mobile. Evergreen browsers are the modern browsers that are automatically updated:

* Chrome (Mac, Windows, Linux, iOS, iPadOS, Android)
* Safari (Mac, iOS, iPadOS)
* Firefox (Mac, Windows, Linux, Android)
* Edge (Mac, Windows, Linux)

# TypeScript support

Mux Player is written in TypeScript (version 4.5). If you're on an older version of TypeScript (pre-4.0), you'll likely need to upgrade your TypeScript package to get the full type definitions.


# Mux Player for web releases
Every new release of Mux Player for web is posted here with release notes
# Current release

## 3.10.1

* Fix: Default playback to MSE everywhere except Safari, covering Chromium browsers that now support native HLS playback

# Previous releases

## 3.10.0

* Feature: Add `max-auto-resolution` attribute for automatic resolution capping
* Fix: Improvements to DRM for Airplay playback

## 3.9.2

* Fix: Upgrade ce-la-react to fix missing key warning

## 3.9.1

* Fix: Upgrade hls.js to 1.6.15

## 3.9.0

* Feature: Add typed CSSProperties for mux-player custom CSS vars
* Fix: Add MENU\_ITEM constant for styling of menu items with CSS
* Fix: Upgrade media-chrome to 4.16.0 and media-tracks to 0.3.4

## 3.8.0

* Feature: Add `nomutedpref` prop to mux player
* Feature: Add support to define custom fullscreen element
* Fix: Default playback to MSE check for Google Chrome 142+
* Fix: Upgrade media-chrome to 4.15.1

## 3.7.0

* Feature: Expose centered layout in themes
* Fix: Default playback to MSE for Google Chrome, which now supports native HLS playback
* Fix: Upgrade hls.js to 1.6.13

## 3.6.1

* Fix: Upgrade media-chrome to 4.14.0

## 3.6.0

* Feature: Add switch to control the behavior of the ended event
* Feature: Add classic theme video title
* Fix: Disable spacebar shortcut if `nohotkeys` enabled
* Fix: Remove error dialog for audio only

## 3.5.3

* Fix: Stop-gap solution to some architectural layer + src-related prop setting causing early and incorrect playback-core initialization

## 3.5.2

* Fix: Update order of props setting so playback id always comes first to resolve session-based expectations (e.g. mux data metadata)

## 3.5.1

* Fix: Upgrade hls.js to 1.6.6, remove workaround for MTA (multi-track audio)

## 3.5.0

* Feature: Add Google IMA support for mux-player and mux-video variants
* Feature: Add retry logic for 412 not playable errors
* Feature: Add free plan logo to the player

## 3.4.1

* Fix: Bring back cast button for drm protected videos
* Fix: Change default of `preferCmcd` to `'none'` for improved cacheability
* Fix: `rendition-menu` visual improvements

## 3.4.0

* Feature: Add fullscreen API on player element
* Feature: Add `video-title` attribute & `videoTitle` property
* Fix: `defaultHiddenCaptions` property bug for React
* Fix: Casting devices discovery after new video load

## 3.3.4

* Fix: Allow extension less Mux m3u8 url as src

## 3.3.3

* Feature: Add optional Mux logo to Mux video
* Fix: Remove redundant FPS DRM generateRequest() for native playback

## 3.3.2

* Fix: `default-hidden-captions` attribute bug for Vue

## 3.3.1

* Fix: Player controls unresponsive after casting prompt

## 3.3.0

* Feature: Implement Mux badge that can be enabled via a `proudly-display-mux-badge` attribute
* Fix: Update hls.js version to fix multi-DRM playready bug
* Fix: Update media-chrome to fix a bug with the error dialog not hiding on error recovery
* Fix: Media Chrome theme flicker on load

## 3.2.0

* Feature: Set Mux data default player init time for greater accuracy. Expose attribute and property for externally defined player init time
* Feature: Use Media Chrome's error dialog
* Feature: NPM package includes provenance statements from now on
* Fix: Slot behavior of child elements

## 3.1.0

* Feature: Error handling rearchitecture (including more granular and DRM error cases)
* Feature: Add asset start and end time props and attrs
* Fix: Chapters disappearing after preload none
* Fix: Menu CSS vars to hide menu button
* Fix: Update peer dependencies for React 19 RC
* Chore: Upgrade to [Media Chrome v4.2.1](https://github.com/muxinc/media-chrome/releases/tag/v4.2.1)

## 3.0.0

* Fix: `addChapters` and `addCuepoints` now have correct TypeScript method types
* Fix: Removed seek forwards and backwards buttons from mobile pre-playback UI
* Fix: Added missing buttons to mobile live audio view (play, live and mute)
* Chore: Upgrade to [Media Chrome v4.1.1](https://github.com/muxinc/media-chrome/releases/tag/v4.1.1)
* Feature: New tooltips for buttons in the UI, enabled by default

## 2.8.1

* Fix: Use CSS to disable subtitle shifting for iOS in fullscreen
* Chore: Upgrade to [Media Chrome v3.2.5](https://github.com/muxinc/media-chrome/releases/tag/v3.2.5)

## 2.8.0

* Feature: [Adds DRM support](/docs/guides/protect-videos-with-drm)
* Fix: Pseudo-ended eval case where media is not attached
* Fix: Hide cast button by default when using DRM
* Chore: Upgrade to [Media Chrome v3.2.3](https://github.com/muxinc/media-chrome/releases/tag/v3.2.3)
* Chore: Upgrade hls.js, custom-media-element, castable-video, and media-tracks

## 2.7.0

* Feature: PDT Clipping Support
* Feature: Add [`addChapters()`](/docs/guides/player-advanced-usage#add-chapters-and-time-based-metadata) API

## 2.6.0

* Feature: Add `'use client'` to components for better out-of-the-box functionality with Next.js
* Fix: Cleanup TypeScript types

## 2.5.0

* Chore: Upgrade to mux-embed v5.2.0 & [Media Chrome v3.2.0](https://github.com/muxinc/media-chrome/releases/tag/v3.2.0),
* Chore: Upgrade hls.js and React TypeScript types
* Feature: Add `disable-tracking` / `disableTracking` attribute / property to disable Mux Data tracking

## 2.4.1

* Fix: Make sure we do not apply holdback to seekable when live streams have ended

## 2.4.0

* Chore: Upgrade to [Media Chrome v3.1.1](https://github.com/muxinc/media-chrome/releases/tag/v3.1.1) (major version bump)
* Fix: Cleanup various issues with DVR UI (including seekable time updates for time range and time display cases)
* Fix: Polish new time preview w/ shifting arrow
* Fix: Polish using easing gradients for UI backdrop
* Feature: `forward-seek-offset` / `forwardSeekOffset` + `backward-seek-offset` / `backwardSeekOffset` attributes / properties now also update keyboard hotkeys offsets

## 2.3.3

* Chore: Upgrade Media Chrome
* Fix: Enable chapters & metadata tracks if cloned and appended to native video element
* Fix: Fire an ended event if playback is stalled near the end of playback

## 2.3.2

* Chore: Upgrade Media Chrome
* Fix: Subtitles selection edge cases

## 2.3.1

* Fix: Remove unneeded `target-live-window="NaN"` attribute sprouting
* Fix: Upgrade media-chrome to 2.0.1 fixing an undefined type error
* Fix: Upgrade custom-media-element and media-tracks packages improving types

## 2.3.0

* Feature: Upgrade to [Media Chrome v2](https://github.com/muxinc/media-chrome/releases/tag/v2.0.0) and [castable-video v1](https://github.com/muxinc/castable-video)
  The Google cast framework script is now automatically loaded, [see guide](/docs/guides/player-core-functionality#chromecast)
  Usage of the standard [Remote Playback API](https://developer.mozilla.org/en-US/docs/Web/API/Remote_Playback_API)
* Feature: Add `extra-source-params` / `extraSourceParams` attribute / property for advanced usage
* Feature: Add the ability to set `default-duration` / `defaultDuration` before media loads
* Feature: Allow forcibly showing buttons that we usually hide at small sizes via CSS vars
* Feature: Add unofficial `_hlsConfig` property to media elements and playback core
* Feature: Add additional CSS parts for export
* Fix: Audio controls styling, controlbar background color and timerange width
* Fix: Attributes mismatch to make sure controls don't overlap
* Fix: Android tap issues on show and hide of controls

## 2.2.0

* Feature: Use playback rate `selectmenu` for the new theme
* Fix: Use solid accent color in rate menu
* Fix: Upgrade Media Chrome
* Fix: Update menu styles

## 2.1.0

* Feature: Add support for manifest manipulation and other media stream query param properties
* Fix: Prevent clicks on background gradients
* Fix: Add volume slider to live player UI

## 2.0.1

* Fix: Make sure `accent-color` gets set properly

## 2.0

* Feature: New default theme named `gerwig` 🎉
  No functional breaking changes, only visual changes
* See [Upgrade guide from 1.x to 2.0](https://github.com/muxinc/elements/blob/bfea94bcbdfc9e3c68afb24d2b3414d83bf4639b/packages/mux-player/UPGRADING_V1_to_V2.md)
* See [blog post](https://www.mux.com/blog/mux-player-2-0-for-web-and-coming-soon-for-ios-and-android)
* See [Twitter / X.com thread](https://twitter.com/MuxHQ/status/1709628018216358194)

## 1.14.1

* Fix: Resolve regression so `title` will be used by Mux Data as `video_title` if not overridden by explicit metadata
* Fix: Resolve issue where MTA implementation could cause load issues/hangs in playback for LL-HLS streams

## 1.13.0

* Feature: Add custom poster slot to mux-player and mux-player-react to allow for server-side progressive enhancement 🎉 See [issue #590](https://github.com/muxinc/elements/issues/590)
* Feature: Add multi-track audio selector 🗣️ ([see guide](/docs/guides/player-core-functionality#multi-track-audio-selector))

## 1.12.1

* Fix: Improve dist exports for greater compatibility with different build tools, including not declaring non-existent exports in package.json

## 1.12.0

* Feature: Add quality selector [see guide](/docs/guides/player-core-functionality#quality-selector)
* Feature: Expose underlying poster image CSS part for advanced styling
* Fix: Fix bug around loading themes in React

## 1.11.4

* Fix issue with edge case assets when used in Next.js production builds in Chrome causing hundreds of requests for `0.ts` segment. See [issue #688](https://github.com/muxinc/elements/issues/688)

## 1.11.3

* Chore: media chrome version bump, fixes a resize observer crash that can happen in CodeSandbox

## 1.11.2

* Chore: bump media chrome and Hls.js to latest versions

## 1.11.1

* Chore: bump media chrome and Hls.js to latest versions

## 1.11.0

* Fix: Upgrade hls.js to [`v1.4.1`](https://github.com/video-dev/hls.js/releases/tag/v1.4.1).
* Feat: Add no-volume-pref attribute to turn off saving the user selected volume in [local storage](https://developer.mozilla.org/en-US/docs/Web/API/Window/localStorage).

## 1.10.1

* Fix: Force theme to be ltr direction.
* Fix: Use webkit pseudo element for captions movement, where available.

## 1.10.0

* Feature: Add support for synchronizing video playback (`currentPdt` and `getStartDate()`)
* Fix: Fix resetting currentTime to `0` in `mux-player-react`.

## 1.9.0

* Feature: Add support for Media Chrome themes.
* Feature: Add `minimal` and `microvideo` theme exports.
* Feature: Add cuepoint event handlers for `mux-player-react`.
* Feature: Use Mux Data `player_error_context` to get better error grouping.
* Feature: Add `--dialog` and `--loading-indicator` CSS vars.
* Fix: Upgrade hls.js to [`v1.4.0-beta.2`](https://github.com/video-dev/hls.js/releases/tag/v1.4.0-beta.2).
* Fix: Update hls.js configs to optimize streaming performance.
* Fix: Update types and improve support for Angular projects.

## 1.8.0

* Feature: Add `max-resolution` attribute on mux-player and mux-video.
* Feature: Add API for CuePoints metadata.
* Fix: Typescript error for Vite based apps like Sveltekit, Nuxt, Vue.
* Fix: Explicitly clean up text tracks, even for native (non-hls.js) playback.

## 1.7.1

* Fix: Only initialize with setupCuePoints when using hls.js for playback (resolves Safari playback error)

## 1.7.0

* Feature: Introduce a captions menu button.
* Fix: Bring back play button to the control bar for small player size.
* Fix: Migrate to use new Media Chrome media-live-button.
* Fix: Improve attribute empty behavior.
* Fix: Upgrade Media Chrome v0.18.1.
* Fix: Use new Media Chrome template syntax.

## 1.6.0

* Feature: Add `storyboard-src` attribute and corresponding prop
* Fix: Use webp format instead of jpg, less bandwidth
* Fix: Memory leaks related to the playback engine not being torn down properly.

## 1.5.1

* Fix: Allow setting of a Media Chrome theme template via a property.

## 1.5.0

* Feature: Mux player uses a new HTML based templating syntax as preparation for
  Media Chrome theme compatibility which will give developers an easy way to change
  the look and feel of the player.
* Feature: Allow the `<mux-player>` web component to receive any Mux Data `metadata-*` fields beyond `metadata-video-title`, `metadata-video-id`, and `metadata-viewer-user-id`. Now fields like `metadata-sub-property-id` and any other Mux Data fields can be passed with this syntax. Note that the `muxPlayer.metadata = { video_title: "My Title", sub_property_id: "Sub prop 123" }` syntax also still works.
* Fix: Prevent the player from rendering duplicate top-level internal elements in edge cases.

## 1.4.0

* Feature: Player design update: removed the backdrop shade by default.
* Fix: Attributes set after the `playback-id` are now correctly passed in playback core.

## 1.3.0

* Feature: Add `disable-cookies` attribute and `disableCookies` property.
* Feature: Add `experimental-cmcd` attribute and `experimentalCmcd` property for headers-based CMCD usage.
* Feature: Add ability to unset poster
* Feature: Conditionally use title for title metadata in Mux Data
* Feature: Add storyboard getter on player
* Fix: Check JWT before setting poster and storyboard urls
* Fix: Don't register prop for --controls-backdrop-color CSS var
* Fix: Upgrade to Media Chrome v0.15.1
* Fix: Various edge case fixes in Media Chrome UI
* Fix: Improve hiding controls behavior when interacting with play or fullscreen buttons.

## 1.2.0

* Feature: Implement React lazy for `mux-player-react`
* Feature: Add type-compliant `seekable` property to the API
* Fix: `playbackRate` for `mux-player-react`

## 1.1.3

* Fix: Add default values to object-fit and object-position

## 1.1.2

* Fix: Upgrade Media Chrome to v0.14.0
* Fix: Properly check iPhones for fullscreen unavailability
* Fix: Properly unset poster image sources when they're removed

## 1.1.1

* Fix: Add `--media-object-fit` and `--media-object-position` to `mux-video`

## 1.1.0

* Feature: Add ability to unset `poster` by setting it to an empty string
* Fix: Turn off backdrop color when controls are disabled

## 1.0.0 🎉

* Feature: Replace `prefer-mse` with `prefer-playback` for more control
* Feature: Add default width 100% to avoid unexpected CLS and resizing scenarios
* Feature: Disable unusable controls when `playback-id` is unset
* Feature: Add hotkey for toggling closed captions (`c`)
* Fix: Google Chrome v106 caption positioning bug
* Fix: Disable all controls when error dialog is open (a11y)
* Fix: Hide fullscreen button when fullscreen is unavailable (e.g. `iframe` usage)
* Fix: Ignore Safari for captions movement.
* Fix: `audio` UI height bugs
* Fix: Add missing setter for defaultHiddenCaptions prop.
* Fix: Clean up `crossOrigin` and `playsInline` usage while respecting defaults/availability.
* Fix: Make player interface compliant with more of `HTMLVideoElement` type expectations, even on initialization
* Fix: Handle removing/nil `playback-id`
* Fix: Add `preload` property support
* Fix: `title` property bug
* Fix: Use `CSS.registerProperty` on vars to declare them as colors for better resilience/fallback
* Fix: (Mux Player React) Resolve issues with `currentTime` prop
* Fix: (Mux Player React) Remove vestigial code for `tertiaryColor` prop

## 1.0.0-beta.0

* Feature: add `video` CSS part for styling the `<video>` element
* Feature: add `--controls-backdrop-color` CSS var to allow changing the backdrop color
* Feature: upgrade hls.js to version `v1.2.3`
* Feature: prefer Media Source Extensions on Android
* Feature: refresh seek backward and forward icons
* Fix: memory leak of hls.js instances
* Fix: `start-time` attribute now works on iOS
* Fix: a11y tab order of player controls
* Fix: control bar icon alignment was off by a few pixels
* Fix: restore right-click video menu

## 0.1.0-beta.27

* Feature: configure playback rates for the player
* Feature: add a title component to the player
* Feature: allow hiding controls based on CSS variables
* Feature: allow turning off keyboard shortcuts via the hotkeys attribute, don't allow seeking in live streams with the arrow keys
* Feature: use Media Chrome's poster image element for posters
* Fix: don't pollute global in SSR
* Fix: change position of the live indicator

## 0.1.0-beta.26

* Improvement: update the warning logged when an incorrect stream type is passed to the player.

## 0.1.0-beta.25

* Feature: add keyboard shortcuts and a `nohotkeys` attribute to turn off keyboard shortcuts.
* Feature: expose CSS parts for targeting controls via CSS.

## 0.1.0-beta.24

* Improvement: Improve time range behavior; add preview time code, smooth playhead progress and fine seek control, keep preview thumb in player bounding box.
* Improvement: Add Mux flavored cast icon.
* Feature: Add `defaultMuted` and `defaultPlaybackRate` properties.
* Feature: Add `textTracks` property, `addTextTrack()` and `removeTextTrack()` methods.

## 0.1.0-beta.23

* Update: Rely on Media Chrome availability states where appropriate.
  Remove unneeded code from `mux-player`.

## 0.1.0-beta.22

* Improvement: Optimize `mux-player` tests.

## 0.1.0-beta.21

* Update: Mux Player (and all Mux Elements) are now published under the `@mux` NPM scope. Please update `mux-player` references to `@mux/mux-player` as of `0.1.0-beta.21`.

## 0.1.0-beta.20

* Feature: Chromecast is built in -- via [castable-video](https://github.com/muxinc/castable-video). See docs in the Core Features section for details on how to enable it.

## 0.1.0-beta.19

* Fix: import for [castable-video](https://github.com/muxinc/castable-video) while we hammer on Chromecast.

## 0.1.0-beta.18

* Fix: Some captions shifting jankyness on live streams when shifting wasn't necessary.
* Fix: Captions offset for Safari
* Feature: Support for audio-only Mux assets with the `audio` attribute
* Feature: Experimental Chromecast support added with [castable-video](https://github.com/muxinc/castable-video). This is intentionally undocumented while we work out the kinks.
* Improvement: Better progress bar alignment.

## 0.1.0-beta.17

* Fix: Some recoverable errors were incorrectly being sent to Mux Data -- this caused an inflated playback error percentage metric in your Mux Data dashboard. This incorrect error tracking was especially prevalent on live streams. We fixed this after it was discovered at [TMI](https://tmi.mux.com/).

## 0.1.0-beta.16

* Fix: Log an error if a token is passed in with playback-id (playback tokens should be passed in via `playback-token` attribute)

## 0.1.0-beta.15

* Fix: update `commonjs` import files to `cjs.js`. This fixes some build systems that rely on the `cjs.js` extension

## 0.1.0-beta.14

* Improvement: Tweaked a few Hls.js configuration settings for live and low-latency live based on some recent testing (backed up by Mux Data, of course). This is the kind of thing the team working on Mux Player worries about so that you don't have to!

## 0.1.0-beta.13

* Fix: For live streams on non-Safari browsers the red (live) / gray (behind live) dot indicator was being a little too aggressive about switching to gray, which indicates the viewer is behind the live edge. This is fixed now, you shouldn't fall back from the live edge unless you pause or rebuffer.

## 0.1.0-beta.12

* Important fix for fullscreen. In previous versions if you entered fullscreen you would get stuck there
* Improve interaction so that clicks (not taps) anywhere on the video player will play/pause. Many people expected and asked for this behavior, so now you have it.

## 0.1.0-beta.11

* Added `thumbnail-time` optional attribute that can be used to set the poster image thumbnail (if you're not using signing tokens)
* Point to [github/template-parts@0.5.2](https://github.com/github/template-parts/releases/tag/v0.5.2) instead of Mux's fork because they were so kind to [get a fix in for us](https://github.com/github/template-parts/pull/55). Thanks GitHub!

## 0.1.0-beta.10

* Improvement: The progress bar now shows above the controls, it's cleaner 💅🏼
* Fix: when changing playback-id on an existing mux-player instance we had some leftover state around
* Fix: full screen was incorrectly using the controls layout depending on the size of the player before it entered full screen. That meant if the player was small and you went full screen you still saw the small controls. Bad!

## 0.1.0-beta.9

* Your beautiful errors will now flow nicely into Mux Data. Your Mux Data errors dashboard just got a whole lot more useful. This is a big one.
* Mux Player is now implemented as a Media Chrome "theme" under the hood. Laying some groundwork for some exciting Media Chrome things to come
* Fix for adding event listeners on `mux-player`, if mux-player JavaScript was loaded after your HTML, events wouldn't get registered. Sorry about that -- fixed now. And we have tests to make sure we don't accidentally introduce a regression down the road.
* The `.hls` property on `mux-player` is super-secret and should not be used unless you are a serious professional. We make no guarantee and your warranty is void if you use this property. To reflect this stance, it has been renamed to `_hls`.
* Fixed some seek to live behavior
* When the error dialog is open we no longer steal the focus of the document. Much better.

## 0.1.0-beta.8

* If you're using Webpack 4, maybe upgrade? But if not, we got you covered. Fixed package.json to point browser field at `mjs` so that Webpack 4 is happy

## 0.1.0-beta.7

* Fix: make mux-player size based on video element
* Fix: make mux-player errors more uniform

## 0.1.0-beta.6

* Fix: messed up the release in beta.5, quick follow-on

## 0.1.0-beta.5

* Fix: clear out some state that was hanging around when playback-id is changed on an existing Mux Player instance, and add some test coverage for this sort of thing
* Fix: mux-player web component metadata- attributes were not always propagating down
* Fix: prevent non-fatal Hls.js errors from propagating and causing error states

## 0.1.0-beta.4

* Paid off some technical debt to handle web components being upgraded after existing in the DOM
* Fix `primary-color` attribute so that it is used for all controls, both icons + text. Previously it was only being applied to icon colors

## 0.1.0-beta.3

* Fix developer log links that go to GitHub
* Make sure internal state monitoring setup happens when the element exists. Fixes a bug in React when the captions button was sometimes not showing.

## 0.1.0-beta.2

* Added descriptive error handling. This is important so that you and your viewers can easily and quickly understand why a video isn't playing. Is your local network connection offline? Is the signed URL expired? Maybe you mixed up Playback IDs and you have the wrong signed URL? Is it a problem specific to the media on your device? Oftentimes video playback errors are cryptic, and their root cause is difficult to understand. We put extra effort into this and we hope it helps you when things go wrong 💖.
* Fix conditional rendering bug when attributes are removed sometimes the template wasn't updating.

## 0.1.0-beta.1

* When the control bar is engaged, slide the captions/subtitles up so they are still visible and don't get obscured

## 0.1.0-beta.0

First beta tag release 🎉

* Extended autoplay options `autoplay`, `autoplay="muted"` and `autoplay="any"` are all options now. See docs above for details.
* Started tracking [Player Startup Time](/docs/guides/data-startup-time-metric#player-startup-time) with Mux Data. The mo' QoE data we can get, the better!
* Changed the behavior of the time display, it now defaults to ascending time (current time) and on click will toggle to show remaining time. Previously it showed only remaining time and that was confusing.
* Fixed a bug related to storyboards on the thumbnails track when the underlying source changed. This should have impacted exactly 0 developers but we wanted to make sure to squash it anyway. If you somehow ran into this bug then you're welcome.

## 0.1.0-alpha.7

* Support for Signed URLs (see advanced usage section)
* No longer require `env-key` to be passed in (Mux Data will infer environment based on the PlaybackID)


# Mux Player for iOS
Learn how to use Mux Player SDK to play video delivered by Mux in your iOS or iPadOS application
This guide will help you install the Mux Player SDK in your native iOS or iPadOS application. If you encounter any issues, please let us know by filing [an issue on GitHub](https://github.com/muxinc/mux-player-swift).

## Install the SDK

Let's start by installing the SDK. We'll use the Swift Package Manager. [Step-by-step guide on using Swift Package Manager in Xcode](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app).

Open your application's project in Xcode. In the Xcode menu bar, select File > Add Packages. In the top-right corner of the modal window that opens, enter the SDK repository URL: `https://github.com/muxinc/mux-player-swift`.

By default, Xcode will fetch the latest version of the SDK available on the `main` branch. If you need a specific package version, or want to restrict the range of package versions used in your application, select a different Dependency Rule. [Here's an overview of the different SPM Dependency Rules and their semantics](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app#Decide-on-package-requirements).
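If you manage dependencies through a `Package.swift` manifest instead of the Xcode UI, the equivalent declaration looks roughly like the sketch below (the `from: "1.0.0"` version and the `YourApp` target name are placeholders — use the dependency rule and target that fit your project):

```swift
// Package.swift (excerpt): declare the SDK package and link its product.
dependencies: [
    .package(url: "https://github.com/muxinc/mux-player-swift", from: "1.0.0")
],
targets: [
    .target(
        name: "YourApp", // placeholder target name
        dependencies: [
            .product(name: "MuxPlayerSwift", package: "mux-player-swift")
        ]
    )
]
```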

Click Add Package to begin resolving and downloading the SDK package. When that completes, select your application target as the destination for the `MuxPlayerSwift` package product. To use the SDK in your application, import its module.

```swift
import MuxPlayerSwift
```

## Stream video from a Mux asset

Use `MuxPlayerSwift` to set up an `AVPlayerViewController` or `AVPlayerLayer` that can download and stream a Mux asset with only a playback ID. The SDK also enables Mux Data monitoring to help you measure the performance and quality of your application's video experiences.

```swift
/// After you're done testing, you can check this video out to learn more about video and players (as well as some philosophy)
let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset, configured with the playback ID.
let playerViewController = AVPlayerViewController(playbackID: playbackID)

/// Prepare an AVPlayerLayer to stream and monitor a Mux asset, configured with the playback ID.
let playerLayer = AVPlayerLayer(playbackID: playbackID)
```

Your application can customize how Mux Video delivers video to the player using playback URL modifiers. A playback URL modifier is appended as a query parameter to a public playback URL. `MuxPlayerSwift` exposes a type-safe Swift API that constructs these URLs for you.

```swift
/// After you're done testing, you can check this video out to learn more about video and players (as well as some philosophy)
let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

/// Create playback options to limit resolution up to 720p.
let playbackOptions = PlaybackOptions(maximumResolutionTier: .upTo720p)

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset with playback options.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

The example above delegates constructing the playback URL and appending playback modifiers to the SDK.

When using the [`AVPlayerViewController`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/avkit/avplayerviewcontroller/#initializers) or [`AVPlayerLayer`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/avfoundation/avplayerlayer#initializers) convenience initializers provided by `MuxPlayerSwift`, no additional steps are required to enable Mux Data monitoring for video streamed from a Mux playback URL.

[See the below section](/docs/guides/mux-player-ios#monitor-media-playback) for more details and how to customize Mux Data monitoring.

### AVPlayerLayer-backed views

If you're using a `UIView` that is backed by an `AVPlayerLayer` to display video, `MuxPlayerSwift` [exposes APIs](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/avfoundation/avplayerlayer#instance-methods) to set up an existing `AVPlayerLayer` for playback.

```swift
/// Prepare an already initialized AVPlayerLayer to stream and monitor a Mux asset
func preparePlayerLayer(playbackID: String, in playerView: UIView) {
  // Check to make sure the player view backing layer is of the correct type and get a reference if it is.
  guard let playerLayer = playerView.layer as? AVPlayerLayer else {
    return
  }

  // Prepares the player layer to stream media and monitor playback with Mux Data.
  playerLayer.prepare(playbackID: playbackID)
}
```

Your application can also customize playback or monitoring with Mux Data by using the same parameters as shown for the `AVPlayerLayer` initializers above.

## Monitor media playback

By default, Mux Data metrics will be populated in the same environment as your playback ID. [Learn more about Mux Data metric definitions here](/docs/guides/understand-metric-definitions).

Read on for additional (and optional) setup steps to modify or extend the information Mux Data tracks.

Use [`MonitoringOptions`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/monitoringoptions) to set custom monitoring-related parameters.

If you're already using the Mux Data SDK for AVPlayer, [this initializer](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/monitoringoptions/init\(customerdata:playername:\)) allows you to reuse any of your existing logic for constructing `MUXSDKCustomerData`.

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let customEnvironmentKey = "ENV_KEY"

// Configure custom Mux Data player metadata.
let playerData = MUXSDKCustomerPlayerData()
playerData.environmentKey = customEnvironmentKey

// Configure custom Mux Data video metadata.
let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Video Behind the Scenes"
videoData.videoSeries = "Video101"

// Combine metadata into customer data for monitoring.
let customerData = MUXSDKCustomerData()
customerData.customerPlayerData = playerData
customerData.customerVideoData = videoData

// Build monitoring options and create the player.
let monitoringOptions = MonitoringOptions(customerData: customerData, playerName: "MyPlayer1")
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  monitoringOptions: monitoringOptions
)
```

## Secure your playback experience

Mux Video offers several levels of playback access control. [See here for more](/docs/guides/secure-video-playback).

### Signed Playback URLs

`MuxPlayerSwift` supports playback of assets enabled for access with signed playback URLs. Playing back assets with a signed playback policy requires the player to include a valid and unexpired JSON Web Token (JWT) when requesting media from Mux.

Your application should generate and sign the JWT in a trusted environment that you control, such as a server-side application. As a security measure, any playback modifiers must be passed through the JWT as additional claims.
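As a rough illustration of that claim structure, here's a minimal Python sketch that assembles the claims for a playback token (the `playback_claims` helper is ours, not part of any Mux SDK; the resulting claims must then be signed with a JWT library using the RSA signing key you create in Mux, whose key ID travels in the token's `kid` header):

```python
import time

def playback_claims(playback_id, ttl_seconds=3600, modifiers=None):
    """Assemble the claims for a Mux signed playback URL JWT."""
    claims = {
        "sub": playback_id,   # the playback ID this token grants access to
        "aud": "v",           # "v" means video playback
        "exp": int(time.time()) + ttl_seconds,
    }
    # Playback modifiers must travel inside the signed token as extra claims,
    # e.g. {"max_resolution": "1080p"} — not as plain query parameters.
    if modifiers:
        claims.update(modifiers)
    return claims

print(playback_claims("YOUR_PLAYBACK_ID", modifiers={"max_resolution": "1080p"}))
```

Because the modifiers live inside the signed payload, they cannot be altered by the client without invalidating the token.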

Once your application is in possession of the JWT, it can begin streaming.

To start playback, use the JWT to initialize `PlaybackOptions`. Then, initialize `AVPlayerViewController` or `AVPlayerLayer` with your playback ID, as in prior examples.

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let playbackToken = "YOUR_PLAYBACK_TOKEN"

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset
/// with a playback ID that has a signed playback policy.
let playbackOptions = PlaybackOptions(playbackToken: playbackToken)
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let playbackToken = "YOUR_PLAYBACK_TOKEN"

/// Prepare an AVPlayerLayer to stream and monitor a Mux asset
/// with a playback ID that has a signed playback policy.
let playbackOptions = PlaybackOptions(playbackToken: playbackToken)
let playerLayer = AVPlayerLayer(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

```swift
/// Prepare an already initialized AVPlayerLayer to stream and monitor a Mux asset
/// with a playback ID that has a signed playback policy.
func preparePlayerLayer(playbackID: String, playbackToken: String, in playerView: UIView) {
  let playbackOptions = PlaybackOptions(playbackToken: playbackToken)

  // Check to make sure the player view backing layer is of the correct type and get a reference if it is.
  guard let playerLayer = playerView.layer as? AVPlayerLayer else {
    return
  }

  // Prepares the player layer to stream media and monitor playback with Mux Data.
  playerLayer.prepare(playbackID: playbackID, playbackOptions: playbackOptions)
}
```

<Callout type="info">
  If your JWT includes a playback restriction, Mux will not be able to perform domain validation when the playback URL is loaded by `AVPlayer`, because no referrer information is supplied.

  To allow `AVPlayer` playback of referrer-restricted assets, set the `allow_no_referrer` boolean parameter to `true` when creating a playback restriction. Conversely, a playback restriction with `allow_no_referrer` set to `false` will disallow `AVPlayer` playback. [See here for more](/docs/guides/secure-video-playback#using-referer-http-header-for-validation).
</Callout>
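For reference, a referrer-based playback restriction that still permits `AVPlayer` can be created with a request along these lines (a sketch: `MUX_TOKEN_ID`/`MUX_TOKEN_SECRET` stand in for your API credentials, and the allowed domain is an example):

```shell
# Create a playback restriction that validates web referrers but also
# allows clients that send no referrer at all, such as AVPlayer.
curl -X POST https://api.mux.com/video/v1/playback-restrictions \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}" \
  -H "Content-Type: application/json" \
  -d '{
    "referrer": {
      "allowed_domains": ["*.example.com"],
      "allow_no_referrer": true
    }
  }'
```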

### Digital Rights Management

DRM (Digital Rights Management) provides an extra layer of content security for video content streamed from Mux.
For more details see the [Protect Your Videos with DRM guide](/docs/guides/protect-videos-with-drm).

<Callout type="warning">
  Mux uses the industry-standard FairPlay protocol for delivering DRM'd video content to native iOS and iPadOS applications. To play back DRM'd content on these platforms, you'll need to obtain a FairPlay deployment package (also called a "FairPlay certificate"); [see more details here](/docs/guides/protect-videos-with-drm#prerequisites-for-fairplay-drm-on-apple-devices). *Without this, DRM'd content will not be playable in your application on these platforms.*
</Callout>

```swift
let playbackID = "YOUR_PLAYBACK_ID"
let playbackToken = "YOUR_PLAYBACK_TOKEN"
let drmToken = "YOUR_DRM_TOKEN"

/// Configure playback options for a playback ID configured for DRM.
let playbackOptions = PlaybackOptions(
  playbackToken: playbackToken,
  drmToken: drmToken
)

/// Prepare an AVPlayerViewController to stream and monitor a Mux asset.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

## Customize playback

## More tools to control playback behavior

### Restrict Resolution Range

Mux Video gives you extra control over the available resolutions of your video.

`MuxPlayerSwift` exposes convenience APIs to adjust the [maximum](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/maxresolutiontier) and [minimum](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/minresolutiontier) resolutions if they are available.

Setting a maximum resolution helps reduce delivery costs while setting a minimum resolution helps ensure visual quality of your video. Maximum and minimum resolutions can be set independently or composed together.

<Callout type="info">
  If you're using signed URLs, you'll need to embed `min_resolution` and `max_resolution` into the JWT claims. [Full documentation available here](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url).
</Callout>

This example restricts the resolution range `AVPlayer` requests to be between 720p and 1080p.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure playback options to request renditions between 720p and 1080p.
let playbackOptions = PlaybackOptions(
  maximumResolutionTier: .upTo1080p,
  minimumResolutionTier: .atLeast720p
)

// Prepare an AVPlayerViewController to stream a Mux asset with this range.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

This example restricts the resolution `AVPlayer` requests to a single fixed resolution of 720p.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure playback options to request only the 720p rendition.
let playbackOptions = PlaybackOptions(singleRenditionResolutionTier: .only720p)

// Prepare an AVPlayerViewController to stream a Mux asset at a fixed 720p resolution.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

### Adjust Resolution Selection

When `AVPlayer` requests delivery of HLS content, it first downloads a series of text files, commonly called manifests or playlists, to describe the available qualities of a video and the location of all the segments that comprise the video.

The order of renditions in the initial manifest or playlist can influence the resolution level a player selects. When using the Mux Player Swift SDK, your application can manipulate that order in the [`PlaybackOptions`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/playbackoptions) provided to `AVPlayerViewController` or `AVPlayerLayer`.

If your Mux asset is publicly playable, specify a [`RenditionOrder`](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/renditionorder).

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure playback options to prioritize higher renditions first.
let playbackOptions = PlaybackOptions(renditionOrder: .descending)

// Prepare an AVPlayerViewController to stream a Mux asset with descending rendition order.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

The available values for rendition order are [listed here](https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/renditionorder#enumeration-cases).

<Callout type="info">
  If using signed playback URLs in your application, you'll need to include `"rendition_order": "desc"` in your JWT claims for descending rendition order. [Full documentation available here](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url).
</Callout>

See [the Mux blog](https://www.mux.com/blog/more-tools-to-control-playback-behavior-min-resolution-and-rendition-order) and [this guide](/docs/guides/control-playback-resolution#using-playback-modifiers-to-manipulate-playback-resolution) for more on these options.

### Instant Clips

Instant clips are an alternative to our [long-standing asset-based clipping feature](/docs/guides/create-clips-from-your-videos). Requesting instant clips using relative time is now [available for use with all video on-demand (VOD) assets](https://www.mux.com/blog/instant-clipping-update).

Instant clipping allows you to request a stream whose start time is at some later point in the video, relative to the start time of the asset. Likewise, you're able to request a stream that ends sooner than the underlying asset does. Instant clips do not incur the wait time or expense of creating a new asset.

Unlike asset-based clipping, instant clipping works by trimming your VOD asset's HLS manifest. This means that instant clipping operates at segment-level accuracy: the content you clip out may be several seconds longer than you've requested. Mux always includes the timestamps you request, but your clip may start a few seconds earlier and end a few seconds later.

Assets that originate from a livestream can also be converted into instant clips using program date time epochs. Support for these clips will be available in a future Mux Player Swift release.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Configure instant clipping for a highlight clip of your publicly viewable asset.
// The clip starts about 10 seconds after your asset starts and finishes approximately 10 more seconds after that.
// A few extra seconds of video may be included in the clip.
let clipping = InstantClipping(assetStartTimeInSeconds: 10, assetEndTimeInSeconds: 20)
let playbackOptions = PlaybackOptions(clipping: clipping)

// Create an AVPlayerViewController to stream the instant clip.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

<Callout type="info">
  If using signed playback URLs in your application, you'll need to include `asset_start_time` and `asset_end_time` keys in your JWT claims to enable instant clipping. [Full documentation available here](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url).
</Callout>

## Improve playback performance

`AVPlayer` supports playing back HTTP live streams without pre-loading. In this case, when the same video is played more than once, `AVPlayer` does not provide a means of ensuring cached video files are used for subsequent playback. This may result in higher network throughput and additional video delivery cost when the same video is played repeatedly.

One alternative is to [enable static renditions for your Mux asset](/docs/guides/enable-static-mp4-renditions) that your application can download and persist using `URLSession` and `FileManager` APIs.

Another is to preload streams before playback. As with static renditions, this requires your application to manage downloads and handle storage. (See the [Apple offline playback and storage documentation](https://developer.apple.com/documentation/avfoundation/offline_playback_and_storage?language=objc) for more details on pre-loading.)

Mux Player Swift offers a smart caching mechanism for streams that provide only a single rendition to the player. The cache is primarily tested against, and provides the most cost-savings benefit for, playback constrained to a single rendition. When using the smart cache, your application must select a resolution tier to which playback will be constrained.

This avoids the manual intervention required by the previous two options.

Like a browser’s cache, cached segments can be used by all players in your application, as long as they’re configured to use the smart cache.  If your application uses multiple player instances, you can enable smart cache for them by using the convenience initializers provided by the SDK.

The example below enables the smart cache with playback at a single fixed resolution.

```swift
let playbackID = "YOUR_PLAYBACK_ID"

// Select a single rendition tier for smart caching (for example, only 720p).
// Available values: https://devdocs.mux.dev/mux-player-swift/documentation/muxplayerswift/singlerenditionresolutiontier
let singleRenditionResolutionTier: SingleRenditionResolutionTier = .only720p

// Enable smart cache and constrain playback to the selected single rendition tier.
let playbackOptions = PlaybackOptions(
  enableSmartCache: true,
  singleRenditionResolutionTier: singleRenditionResolutionTier
)

// Prepare an AVPlayerViewController to stream a Mux asset with smart caching enabled.
let playerViewController = AVPlayerViewController(
  playbackID: playbackID,
  playbackOptions: playbackOptions
)
```

<Callout type="info">
  The smart cache will be automatically purged after your application is terminated. It may be purged by the operating system at any time when your application is suspended in the background.
</Callout>

Mux supports HLS video delivery with Transport Stream segments or Common Media Application Format (CMAF) chunks. Both are supported by the smart cache. [See this explainer for a general introduction to video delivery](https://howvideo.works/#delivery).


# Mux Player for iOS releases
Every new release of Mux Player for iOS is posted here with release notes
# Current release

## v1.5.0

New:

* Added `MuxPlayerContext` as a way to use an AVPlayer with Mux monitoring without needing to call `stopMonitoring` or do other teardown. Also supports AVQueuePlayer and has some basic Audio Session handling

Fixes:

* Fix several swift docs inaccuracies

# Previous releases

## v1.4.0

Improvements

* Resolves an issue where DRM playback would fail to start after a [media services reset](https://developer.apple.com/documentation/avfaudio/avaudiosession/mediaserviceswereresetnotification#Discussion). It is still required to recreate the player and assets after such an event.
* Improvements to the DRM registration process for more consistent startup latency

## v1.3.0

Updates

* Allow disabling Mux Data automatic error tracking via `MonitoringOptions`
* Allow creating `AVPlayerItem`s with Mux `PlaybackOptions` without using our `AVPlayerViewController` or `AVPlayerLayer` APIs. See the header docs for more details

## v1.2.1

Improvements

* Updated and relaxed `mux-stats-sdk-avplayer` dependency
* Added default message "No additional information." for `Monitor` `handleUpdatedPlayerError`

## v1.2.0

Improvements

* Support instant clipping using relative time for publicly playable assets

## v1.1.1

Improvements

* Capture additional details in Mux Data when experiencing an error playing video with DRM

## v1.1.0

Improvements

* [Adds DRM support](/docs/guides/protect-videos-with-drm)

## v1.0.0

Improvements

* Enable caching when streaming on-demand at a fixed resolution tier
* Updated Mux Data dependencies to meet App Store privacy manifest requirements

API Changes

* Add convenience initializers to constrain playback to a single rendition with a preset resolution tier
* Add additional `AVPlayerViewController` extensions that configure an already existing instance for playing back a video from Mux
* Remove `ascending` option for `RenditionOrder` this parameter is not supported by Mux Video

## v0.5.0

Improvements

* Sets player software name and version to default values when reporting playback events to Mux Data

## v0.4.0

API Changes

* Add `1440p` max resolution playback modifier

Fixes

* Correct SDK version

## v0.3.0

API Changes

* Add: maximum resolution playback modifiers for 1080p and 2160p

## v0.2.0

Additions

* Initialize an `AVPlayerLayer` to stream and monitor video with a public or signed playback ID
* Setup an already existing `AVPlayerLayer` to stream and monitor video with a public or signed playback ID

Breaking

* The SDK module has been renamed to `MuxPlayerSwift`.
  * Update SPM package links from `https://github.com/muxinc/mux-avplayer-sdk` to `https://github.com/muxinc/mux-player-swift`
  * Replace any import statements: `import MuxAVPlayerSDK` to `import MuxPlayerSwift`

This SDK is pre-release software and may contain issues or missing functionality. We recommend against submitting apps based on it to the App Store.

## v0.1.0

Initial Release

* Feature: setup `AVPlayerViewController` to stream and monitor video with a public or signed playback ID
* Feature: automatic Mux Data monitoring setup
* Feature: passthrough of all metadata supported by the `AVPlayer` Data SDK
* Feature: custom domains for playback
* Feature: support for limiting playback resolution to 720p

Known Issues

* Mux Data monitoring will not automatically stop when `AVPlayerViewController` is no longer in use; call `stopMonitoring` on `AVPlayerViewController` to stop monitoring manually.


# Mux Player for Android
This guide walks through integration with Mux's Player SDK for Android, which is built on media3
The Mux Player SDK is a thin wrapper on top of Google's media3 player SDK with convenient tools for Mux Video users. This SDK is not required to use Mux Video, but it can help you do things like control your data and delivery usage, play Mux assets by ID, automatically support player features like caching, and transparently track performance and engagement with [Mux Data](https://data.mux.com/).

This guide will help you install the Mux Video SDK in your app, use it to play a Mux Video asset, configure Mux Player for your specific app and media, and show you how to handle less-common scenarios like using Mux Video's [custom domains](/docs/guides/use-a-custom-domain-for-streaming).

## 1. Install the Mux Player SDK

## Add our repository to your Gradle project

Add Mux's Maven repository to your Gradle files. Newer projects require declaring this in `settings.gradle`, while older projects require it in the project-level `build.gradle`.

```gradle_groovy

// in a repositories {} block
maven {
  url 'https://muxinc.jfrog.io/artifactory/default-maven-release-local' 
}

```

```gradle_kts

// in a repositories {} block
maven {
  url = uri("https://muxinc.jfrog.io/artifactory/default-maven-release-local")
}

```



## Add the dependency to your app

Add our library to the `dependencies` block for your app.

```gradle_kts

implementation("com.mux.player:android:1.0.0")
  
```

```gradle_groovy

implementation "com.mux.player:android:1.0.0"
  
```



## 2. Play a Mux Asset

## Create a MuxPlayer

To use the SDK, you must create a `MuxPlayer` object using its `Builder`. The basic configuration enables all of Mux Video's features, and you can make additional configuration changes via the `Builder`. Almost all of our default config options are the same as ExoPlayer's; we only change defaults where needed to support a Mux Player feature.

```kotlin

val player: MuxPlayer = MuxPlayer.Builder(context = this)
  .enableLogcat(true) // Optional. Only applies to Mux. Media3 logging is not touched
  .applyExoConfig {
    // Call ExoPlayer.Builder methods here (but not build()!)
    setHandleAudioBecomingNoisy(true)
  }
  .build()
  
```

```java

MuxPlayer player = new MuxPlayer.Builder(context)
  .enableLogcat(true) // Optional. Only applies to Mux. Media3 logging is not touched
  .plusExoConfig((config) -> {
    // Call ExoPlayer.Builder methods here (but not build()!)
    config.setHandleAudioBecomingNoisy(true);
  })
  .build();
  
```



## Play a Mux Video asset

To play a Mux Video asset using this SDK, you can use our `MediaItems` API to create new instances of media3's `MediaItem` or `MediaItem.Builder`. For the basic example, we'll leave everything at its default and play an asset you've already uploaded to Mux Video.

```kotlin

// Use the MediaItems class instead of MediaItem.Builder()
val mediaItem = MediaItems.builderFromMuxPlaybackId("YOUR PLAYBACK ID")
  // It's just a MediaItem from here, so you can configure it however you like
  .setMediaMetadata(
    MediaMetadata.Builder()
      .setTitle("Hello from Mux Player on Android!")
      .build()
  )
  .build()

// From here, everything is exactly the same as ExoPlayer
player.setMediaItem(mediaItem)
player.prepare()
player.playWhenReady = true
  
```

```java

MediaMetadata metadata = new MediaMetadata.Builder()
    .setTitle("Hello from Mux Player on Android")
    .build();
// Use the MediaItems class instead of MediaItem.Builder()
MediaItem item = MediaItems.builderFromMuxPlaybackId("YOUR PLAYBACK ID")
    // It's just a MediaItem from here, so you can configure it however you like
    .setMediaMetadata(metadata)
    .build();

// From here, everything is exactly the same as ExoPlayer
player.setMediaItem(item);
player.setPlayWhenReady(true);
player.prepare();
  
```



### Protecting your content

Mux Video offers options for securing your content from unauthorized playing or recording. For more information, [see below](/docs/guides/mux-player-android#secure-your-playback-experience)

## Control Your Usage and Quality

## Enable smart caching to improve experience and decrease usage

Mux Player can cache content as it is requested from Mux Video and store it for later requests. Caching can reduce overall data usage and costs by storing some streamed video locally in a private directory on the device. This way content doesn't need to be downloaded again if the user watches the content over, when playback loops, or during seeking. Mux Player's caching is automatic when enabled, and we manage the cache files for you.

If you are interested in Mux Player's caching features, you can enable them when you build your `MuxPlayer`.

```kotlin

val player: MuxPlayer = MuxPlayer.Builder(context)
  // disabled by default
  .enableSmartCache(true)
  .build()
  
```

```java

MuxPlayer player = new MuxPlayer.Builder(context)
  // disabled by default
  .enableSmartCache(true)
  .build();
  
```



## Limit data and delivery usage

Depending on your use case and app, you may need to control either your Mux Video usage or your app's data bandwidth usage. Doing so can save costs and minimize playback interruptions for users on slower devices or data plans. Mux provides tools to manage costs and resource usage by limiting the maximum resolution your app can stream from Mux Video. To take advantage of this feature, supply a `PlaybackResolution` to our `MediaItems` class.

```kotlin

val mediaItem = MediaItems.builderFromMuxPlaybackId(
  "YOUR PLAYBACK ID",
  maxResolution = PlaybackResolution.FHD_1080, // limit playback resolution to 1080p
)
  // .. configure your MediaItem further if required
  .build()

// .. Add the MediaItem to your MuxPlayer like you normally would
  
```

```java

MediaItem mediaItem = MediaItems.builderFromMuxPlaybackId(
      "YOUR PLAYBACK ID",
      PlaybackResolution.FHD_1080 // limit playback resolution to 1080p
  )
  // .. configure your MediaItem further if required
  .build();

// .. Add the MediaItem to your MuxPlayer like you normally would

```



## Guarantee a minimum resolution

Some use cases require a minimum playback resolution. Applications like screen-sharing, for instance, may wish to preserve a certain level of visual quality even if playback has to be interrupted to buffer more data. Apps that need video playback to always be above a certain resolution, regardless of network conditions, can request a minimum resolution.

```kotlin

val mediaItem = MediaItems.builderFromMuxPlaybackId(
  "YOUR PLAYBACK ID",
  minResolution = PlaybackResolution.HD_720,
)
  // .. configure your MediaItem further if required
  .build()

// .. Add the MediaItem to your MuxPlayer like you normally would
  
```

```java

MediaItem mediaItem = MediaItems.builderFromMuxPlaybackId(
    "YOUR PLAYBACK ID",
    null, // null for default
    /*minResolution =*/ PlaybackResolution.HD_720
)
// .. configure your MediaItem further if required
.build();
  
```



For more information about controlling your data and platform usage, please see our [guide](/docs/guides/control-playback-resolution) on controlling playback resolution.

## Add or customize Mux Data metadata

The Mux Player SDK transparently integrates with Mux Data to monitor for issues and track engagement with your content. To verify this is working, play a video in your app and wait for the session to appear on the Mux Data dashboard. Your session should appear automatically in the same Mux Data environment as your video asset.

## Automatically-Detected Metadata

Mux will automatically collect information about your stream, playback environment, and current playback session (a "view") to send to Mux Data. Examples of the information collected include Mux asset and playback IDs, player and stream resolution, the start and end times of the view, and some basic information about the end user's device, like OS and model number.

## Customize metadata about your player, viewer, or playback session

The SDK can automatically detect a lot of information about the media you're playing, but you can customize this information if you need to, via the `CustomerData` class. Anything you specify this way will override metadata values that would ordinarily be detected automatically.

You can initialize your player with whatever custom metadata you like, and you can also update that metadata at any time.

```kotlin

private fun createPlayer(context: Context): MuxPlayer {
  return MuxPlayer.Builder(context)
    .addMonitoringData(
      CustomerData().apply {
        customerViewData = CustomerViewData().apply {
          viewSessionId = UUID.randomUUID().toString()
        }
        customerVideoData = CustomerVideoData().apply {
          videoSeries = "My Series"
          videoId = "abc1234zyxw"
        }
        customData = CustomData().apply {
          customData1 = "my custom metadata field"
          customData2 = "another custom metadata field"
          customData10 = "up to 10 custom fields"
        }
      }
    )
    .build()
}
  
```

```java

private MuxPlayer createPlayer(Context context) {
  CustomerData customerData = new CustomerData();
  CustomerVideoData videoData = new CustomerVideoData();
  videoData.setVideoTitle("Lots of custom data");
  videoData.setVideoSeries("my series");
  videoData.setVideoId("my app's id for the media");
  CustomData customData = new CustomData();
  customData.setCustomData1("my custom data field");
  customData.setCustomData2("another custom metadata field");
  customData.setCustomData10("up to 10 custom fields");

  customerData.setCustomerVideoData(videoData);
  customerData.setCustomData(customData);

  return new MuxPlayer.Builder(context)
      .addMonitoringData(customerData)
      .build();
}
  
```



## Secure your playback experience

Depending on the needs of your business and your users, you may need to secure your videos against unauthorized copying or viewing. Mux Video offers several options for securing your playback experience. The right option for your app depends on your use case, and is a trade-off between security, complexity, and loading time for the end user.

## Signed Playback URLs

Mux Player supports playing Mux Video assets with signed playback. Signed playback uses a JSON Web Token (JWT) signed on your application server with a signing key created using our APIs. For more information about how to set up signed playback, check out our [secure video playback guide](/docs/guides/secure-video-playback).

For this guide, we'll focus on what to do on the client, once you have the JWT from your app's backend server. To play the asset securely you can supply your JWT to `MediaItems.fromMuxPlaybackId` or `MediaItems.builderFromMuxPlaybackId`. The resulting `MediaItem` will be configured to play the asset securely using your token.

```kotlin
private fun playSomething(jwt: String, context: Context) {
  val player = createPlayer(context)
  val mediaItem = MediaItems.builderFromMuxPlaybackId(
    PlaybackIds.TEARS_OF_STEEL,
    playbackToken = jwt,
  )
    .setMediaMetadata(
      MediaMetadata.Builder()
        .setTitle("Private Playback ID Example")
        .build()
    )
    .build()
  player.setMediaItem(mediaItem)

  // .. Then prepare and play your media as normal
}
```

```java

MuxPlayer player = createPlayer(context);
MediaItem mediaItem = MediaItems.builderFromMuxPlaybackId(
    PlaybackIds.TEARS_OF_STEEL,
    PlaybackResolution.QHD_1440,
    PlaybackResolution.LD_540,
    RenditionOrder.Descending,
    /* domain = */ null, // null for default
    // put your Signed Playback Token here
    /*playbackToken = */ jwt
)
.setMediaMetadata(
    new MediaMetadata.Builder()
        .setTitle("Private Playback ID Example")
        .build()
)
.build();
player.setMediaItem(mediaItem);

// .. Then prepare and play your media as normal

```



## Digital Rights Management (DRM)

Mux Player for Android can be configured to protect videos from unauthorized use via Widevine DRM. Support for DRM is automatically enabled in the player. As long as you have both a signed playback token (see above) and a DRM token, your DRM-protected asset can be played using Mux Player. The process of setting up DRM is somewhat complex and is detailed in our [DRM Guide](/docs/guides/protect-videos-with-drm). This section focuses on what to do once you have obtained a Playback Token and DRM Token from your application server.

### 1. Setting up DRM for an asset

To use DRM playback for your asset, you'll need to set up a DRM configuration and DRM-enabled playback ID. The process for doing this is the same regardless of your player, and you can read more about it in our [DRM Guide](/docs/guides/protect-videos-with-drm). Once you have an environment and asset set up with DRM, you can use that asset's DRM Token, Playback Token, and Playback ID with Mux Player to do DRM playback transparently.

### 2. Playing a DRM-protected asset

To play your DRM-protected asset, simply provide the Playback Token and DRM Token you generated in the last step. You can provide them as parameters to `MediaItems.fromMuxPlaybackId()`. No other configuration is required in order to use DRM with Mux Player.

```kotlin
private fun playSomething(myPlaybackId: String, myPlaybackToken: String, myDrmToken: String, context: Context) {
  val player = createPlayer(context)
  val mediaItem = MediaItems.builderFromMuxPlaybackId(
    myPlaybackId,
    playbackToken = myPlaybackToken,
    drmToken = myDrmToken,
  )
    .setMediaMetadata(
      MediaMetadata.Builder()
        .setTitle("DRM Playback Example")
        .build()
    )
    .build()
  player.setMediaItem(mediaItem)

  // .. Then prepare and play your media as normal
}
```

```java
MuxPlayer player = createPlayer(context);
MediaItem mediaItem = MediaItems.builderFromMuxPlaybackId(
    PlaybackIds.TEARS_OF_STEEL,
    PlaybackResolution.QHD_1440,
    PlaybackResolution.LD_540,
    RenditionOrder.Descending,
    /* domain = */ null, // null for default
    // put your Signed Playback Token here
    /* playbackToken = */ jwt,
    // put your DRM Token here
    /* drmToken = */ drmToken
)
.setMediaMetadata(
    new MediaMetadata.Builder()
        .setTitle("DRM Playback Example")
        .build()
)
.build();
player.setMediaItem(mediaItem);

// .. Then prepare and play your media as normal
```



## Advanced Features

## Enable smart caching to improve experience and decrease usage

Mux Player can cache content as it is requested from Mux Video and store it for later requests. This caching is automatic, and we manage the cache content and cache files for you. To enable smart caching, all you need to do is call `enableSmartCache(true)` when you build your `MuxPlayer`.

```kotlin
val player: MuxPlayer = MuxPlayer.Builder(context)
  .enableSmartCache(true)
  .build()
```



## Use a custom Mux Video domain

If you are using a Mux Video [Custom Domain](/docs/guides/use-a-custom-domain-for-streaming), you can specify the domain on a per-`MediaItem` basis. The URL of the stream will have the specified domain and the `stream.` subdomain.

```kotlin
val mediaItem = MediaItems.builderFromMuxPlaybackId(
  "YOUR PLAYBACK ID",
  domain = "customdomain.com", // https://stream.customdomain.com/...
)
  // .. configure your MediaItem further if required
  .build()

// .. Add the MediaItem to your MuxPlayer like you normally would
```



## Use a specific Mux Data Environment Key

Ordinarily, Mux Data will record views and monitoring data in the same Environment as the Mux Video asset being played. If you are using a different Mux Data environment for some reason, you can specify another Mux Data Env Key for your player to use.

```kotlin
private fun createPlayer(context: Context): MuxPlayer {
  return MuxPlayer.Builder(context)
    .setMuxDataEnv("Another Mux Data Env Key") // replace with your other key
    .build()
}
```

```java
MuxPlayer player = new MuxPlayer.Builder(context)
    .setMuxDataEnv("Another Mux Data Env Key") // replace with your other key
    .build();
```



## Instant Clipping

### Instant Clips

Instant clips are an alternative to our [long-standing asset-based clipping feature](/docs/guides/create-clips-from-your-videos). Requesting instant clips using relative time is now [available for use with all video on-demand (VOD) assets](https://www.mux.com/blog/instant-clipping-update).

Instant clipping allows you to request a stream whose start time is at some later point in the video, relative to the start time of the asset. Likewise, you can request a stream that ends sooner than the underlying asset does. Instant clips do not incur the wait time or expense of creating a new asset.

Unlike asset-based clipping, instant clipping works by trimming your VOD asset's HLS manifest. This means that instant clipping operates at segment-level accuracy. You should expect that the content you clip out may be several seconds longer than you've requested. We always make sure to include the timestamps that you request, but your content may start a few seconds earlier and end a few seconds later.
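As a sketch, an instant clip is requested by adding relative start and end times as query parameters on the stream URL. The `asset_start_time` and `asset_end_time` parameter names below follow the instant clipping guide; verify them there before relying on this helper:

```javascript
// Sketch: build an instant clip URL by adding relative start/end times
// (in seconds, measured from the start of the asset) to the HLS URL.
function instantClipUrl(playbackId, startSeconds, endSeconds) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  if (startSeconds != null) url.searchParams.set('asset_start_time', String(startSeconds));
  if (endSeconds != null) url.searchParams.set('asset_end_time', String(endSeconds));
  return url.toString();
}

// A clip covering roughly 0:30-1:00 of the asset. Because clipping is
// segment-accurate, the actual clip may extend a few seconds on either side.
console.log(instantClipUrl('YOUR_PLAYBACK_ID', 30, 60));
```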

Assets that originate from a livestream can also be converted into instant clips using program date time epochs. Support for these clips will be available in a future Mux Player Android release.


# Mux Player for Android releases
Every new release of Mux Player for Android is posted here with release notes
# Current release

## v1.5.3

Improvements:

* Update media3 to v1.9.2 and Mux Data to v1.11.1 (#87)

# Previous releases

## v1.5.2

Fixes:

* Swap MuxPlayer to implement ExoPlayer

## v1.5.1

Fixes:

* FileNotFoundException from the cache stalls playback

## v1.5.0

Improvements:

* Update media3 version to 1.6.1
* Update Mux Data to v1.8.0

Fixes:

* fix: DRM playback broken

## v1.4.0

Updates:

* Add method for updating `CustomerData`

Improvements:

* Track player dimensions when using the default PlayerView (or SurfaceView or TextureView)

Notes:

* `MuxPlayer` now implements `Player` instead of `ExoPlayer`. Most people shouldn't have a problem, but if you referred to our player as an `ExoPlayer`, you'll need to change it to `Player` or `ExoPlayer`

## v1.3.0

Update:

* update: Update Mux Data to v1.6.2 and Media3 to v1.5.0

## v1.2.2

Fixes

* fix: Rendering issues on Compose UI & API 34 (upstream from media3: [link](https://github.com/androidx/media/issues/1237))

Improvements

* Update media3 to 1.4.1 + mux data to 1.6.0

## v1.2.1

Fixes

* Fix cache errors when switching sources extremely quickly

## v1.2.0

Improvements

* Add Instant Clipping asset relative time parameters to `MediaItems`

## v1.1.3

Improvements:

* fix: playback fails sometimes when changing videos

## v1.1.2

Please prefer to use v1.1.3

## v1.1.1

Please prefer to use v1.1.3

## v1.1.0

Improvements

* [Adds DRM support](/docs/guides/protect-videos-with-drm)

## v1.0.0

Updates:

* Bump to version 1.0.0
* Added a 'Default' rendition order

Fixes:

* Remove option for non-existent `Ascending` rendition order

Improvements:

* Misc API & code quality improvements
* Complete public API docs

## v0.3.1

Improvements:

* fix: Player should always request redundant\_streams
* feat: Set player software name as `mux-player-android`

## v0.3.0

New:

* new: Add max and min playback resolution

Updates:

* update: Improve example app appearance + misc updates


# Mux Background Video
Mux Background Video is a lightweight component for adding background videos to your web application
**Mux Background Video** is a lightweight component and HLS engine for creating background videos using Mux HLS streams.

* **React**: React component for easy integration
* **Web Component**: Custom element for easy integration
* **Lightweight**: Minimal bundle size with no dependencies
* **Preload Control**: Control video preloading behavior
* **Audio Control**: Optionally enable audio tracks for background videos
* **Resolution Control**: Set maximum resolution for optimal performance

## Quick start

## Installation

```bash
npm install @mux/mux-background-video
```

## Usage

<Callout type="info">
  Requires Mux [Basic](https://www.mux.com/docs/guides/use-video-quality-levels#basic) or [Premium](https://www.mux.com/docs/guides/use-video-quality-levels#premium) video quality currently because transmuxing of `.ts` segments is not supported.
</Callout>

### HTML Custom Element

The easiest way to use Mux Background Video is with the custom element:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Background Video</title>
  <style>
    html, 
    body {
      height: 100%;
    }

    body {
      margin: 0;
      padding: 0;
    }
    
    mux-background-video,
    img {
      display: block;
      width: 100%;
      height: 100%;
      object-fit: cover;
    }
  </style>
  <script type="module" src="https://cdn.jsdelivr.net/npm/@mux/mux-background-video/html/+esm"></script>
</head>
<body>
  <mux-background-video src="https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8">
    <img src="https://image.mux.com/YOUR_PLAYBACK_ID/thumbnail.webp?time=0" alt="Mux Background Video" />
  </mux-background-video>
</body>
</html>
```

### JavaScript Import

You can also import the custom element directly:

```ts
import '@mux/mux-background-video/html';

// The custom element is automatically registered
// You can now use <mux-background-video> in your HTML
```

### React Component

For React applications, use the React component:

```tsx
import { MuxBackgroundVideo } from '@mux/mux-background-video/react';

function HeroSection() {
  return (
    <MuxBackgroundVideo src="https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8">
      <img src="https://image.mux.com/YOUR_PLAYBACK_ID/thumbnail.webp?time=0" alt="Mux Background Video" />
    </MuxBackgroundVideo>
  );
}
```

## Analytics

To enable [Mux data](https://www.mux.com/data) collection for your background videos, include the Mux embed script in your HTML page before the Mux Background Video script:

```html
<script defer src="https://cdn.jsdelivr.net/npm/mux-embed"></script>
```

Once this script is included, Mux data will automatically be enabled for all background videos on the page, providing you with detailed analytics and insights about video performance.

## API Reference

### HTML Custom Element: `<mux-background-video>`

The `<mux-background-video>` element automatically handles HLS streaming.

#### Attributes

* **`src`**: The Mux HLS stream URL (required)
* **`max-resolution`**: Maximum resolution for the video (e.g., "720p", "1080p")
* **`audio`**: Enable audio track (default: false)
* **`preload`**: Controls video preloading behavior (default: auto)
  * `"none"`: No preloading
  * `"metadata"`: Preload only metadata
  * `"auto"`: Preload video data

#### HTML Structure

```html
<mux-background-video audio max-resolution="720p" src="YOUR_STREAM_URL">
  <img src="https://image.mux.com/YOUR_PLAYBACK_ID/thumbnail.webp?time=0" alt="Mux Background Video" />
</mux-background-video>
```

#### JavaScript Attributes

You can also set attributes programmatically:

```typescript
const element = document.querySelector('mux-background-video');

// Set maximum resolution
element.setAttribute('max-resolution', '1080p');

// Enable audio track
element.toggleAttribute('audio', true);

// Set preload behavior
element.setAttribute('preload', 'metadata');

// Set the stream URL
element.setAttribute('src', 'https://stream.mux.com/NEW_PLAYBACK_ID.m3u8');

// Get current values
console.log(element.getAttribute('src'));
console.log(element.getAttribute('max-resolution'));
console.log(element.hasAttribute('audio'));
console.log(element.getAttribute('preload'));
```

### React Component: `<MuxBackgroundVideo />`

#### Props

* **`src`**: The Mux HLS stream URL (required)
* **`maxResolution`**: Maximum resolution for the video (e.g., "720p", "1080p")
* **`audio`**: Enable audio track (default: false)
* **`preload`**: Controls video preloading behavior (default: auto)
  * `"none"`: No preloading
  * `"metadata"`: Preload only metadata
  * `"auto"`: Preload video data

#### Example

```tsx
<MuxBackgroundVideo
  src="https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8"
  maxResolution="720p"
  audio={true}
>
  <img src="https://image.mux.com/YOUR_PLAYBACK_ID/thumbnail.webp?time=0" alt="Mux Background Video" />
</MuxBackgroundVideo>
```


# Control playback resolution
Control the video resolution your users receive in order to give the best user experience as well as take advantage of Mux's resolution based pricing.
# Default playback URL

The default playback URL will contain all available resolutions of your video. The resolutions available will depend on the video source file.

By default, if the source file is 1080p or higher, the highest resolution provided by Mux will be 1080p. If the source file is lower than 1080p, the highest resolution available will be the resolution of the source.

You can also stream 4K content using Mux Video, which will be delivered at higher resolutions including 2.5K and 4K. For more details see the [guide to streaming 4K videos](/docs/guides/stream-videos-in-4k).

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

Use the default playback URL for most use cases. The video player will determine the best resolution based on the available bandwidth of the viewer.

# Using playback modifiers to manipulate playback resolution

Mux exposes a set of [playback modifiers](/docs/guides/modify-playback-behavior), which give you extra control over the available resolutions of your content.

## Specify maximum resolution

The playback URL below with the `max_resolution` query parameter modifies the resolutions available for the player to choose from.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?max_resolution=720p
```

The `max_resolution` parameter can be set to `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, or `2160p`. You may want to do this in order to reduce your delivery costs, or build a feature to your product where only certain viewers get lower resolution video.

*Please note that not all resolutions are available for all assets. If you specify a max resolution that is not available for the asset, Mux will limit the resolution to the highest resolution available below the one you specified. For example, if you specify `max_resolution=1080p` but the highest resolution available for the asset is 720p, then the manifest will be capped at 720p.*

## Specify minimum resolution

The playback URL below with the `min_resolution` query parameter modifies the resolutions available for the player to choose from.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?min_resolution=720p
```

The `min_resolution` parameter can be set to `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, or `2160p`. You may want to use this to omit the lowest quality renditions from the HLS manifest when the visual quality of your content is critical to the delivery, for example in live streams where detailed screen share content is present.

*Please note that not all resolutions are available for all assets. If you specify a min resolution that is not available for the asset, Mux will use the lowest resolution available above the one you specified. For example, if you specify `min_resolution=270p` but the lowest resolution available for the asset is 360p, then the manifest will start at 360p.*

## Specify rendition order

By default the first rendition in the playlist is one of the middle resolutions. Many players will start with the first one listed, so this default behavior strikes a balance by giving the player something that's not too low in terms of quality but also not too high in terms of bandwidth.

You may want to change this behavior by specifying `rendition_order=desc`, which sorts the list of renditions from highest (highest quality, most bandwidth) to lowest (lowest quality, least bandwidth). Players that start with the first rendition in the list will now attempt to start playback with the highest resolution. The tradeoff is that users on slow connections will experience increased startup time.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?rendition_order=desc
```
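The modifiers above can be combined on a single playback URL. As a small sketch (the helper name is illustrative):

```javascript
// Sketch: append playback modifiers to a Mux stream URL.
// Any combination of the modifiers described above may be used.
function playbackUrl(playbackId, { maxResolution, minResolution, renditionOrder } = {}) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  if (maxResolution) url.searchParams.set('max_resolution', maxResolution);
  if (minResolution) url.searchParams.set('min_resolution', minResolution);
  if (renditionOrder) url.searchParams.set('rendition_order', renditionOrder);
  return url.toString();
}

console.log(playbackUrl('YOUR_PLAYBACK_ID', { maxResolution: '720p' }));
// https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8?max_resolution=720p
```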

# Usage with signed URLs

If you are using `signed` Playback IDs according to the [Secure video playback guide](/docs/guides/secure-video-playback), then your playback modifiers must be encoded in the `token` that you generate on your server. [See the modify playback behavior guide](/docs/guides/modify-playback-behavior) about embedding extra params in your JWT.

# Using playback modifiers in Mux Player

Mux Player supports `min_resolution`, `max_resolution`, and `rendition_order` as attributes on the web component and props on the React component.

For example, to set the `max_resolution=` parameter with Mux Player, set the `max-resolution="720p"` attribute (`maxResolution="720p"` in React). When this attribute is set, Mux Player internally appends it as a query parameter on the streaming URL.

As with all playback modifiers, if you're using signed URLs, your parameters should be encoded in the `playback-token` attribute (`tokens.playback` in React).

# When using AVPlayer on iOS

Set the playback modifier by appending a `URLQueryItem` to the playback `URL`. [Initialize `AVPlayer` using the `URL` itself](https://developer.apple.com/documentation/avfoundation/avplayer/1385706-init) as shown in an example below using `max_resolution` or [initialize with an `AVPlayerItem` constructed with the URL](https://developer.apple.com/documentation/avfoundation/avplayer/1387104-init).

```objc


/// Creates a playback URL.
///
/// - Parameters:
///     - playbackID: playback ID for the asset.
///     - enableMaximumResolution: if true include a query parameter 
///     that sets a maximum resolution for the video.
///
/// - Returns: Playback URL with maximum resolution query param appended.
///
- (NSURL *)playbackURLWithPlaybackID:(NSString *)playbackID
           enableMaximumResolution:(BOOL)enableMaximumResolution {
  if (enableMaximumResolution) {
      NSURLComponents *components = [
          NSURLComponents componentsWithURL: [NSURL URLWithString: @"https://stream.mux.com/"]
          resolvingAgainstBaseURL: NO
      ];
      components.path = [NSString stringWithFormat: @"%@.m3u8", playbackID];

      NSURLQueryItem *queryItem = [
          NSURLQueryItem queryItemWithName: @"max_resolution"
          value: @"720p"
      ];
      components.queryItems = @[queryItem];

      return [components URL];
  } else {
      NSString *formattedURLString = [
          NSString stringWithFormat: @"https://stream.mux.com/%@.m3u8", playbackID
      ];
      return [NSURL URLWithString: formattedURLString];
  }
}

NSString *playbackID = @"qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4";

NSURL *url = [self playbackURLWithPlaybackID: playbackID
                     enableMaximumResolution: NO];

AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithURL: url];
AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem: playerItem];
 
```

```swift

import AVKit
import Foundation

let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

// Flag controlling if a max resolution is requested
let shouldLimitResolutionTo720p = true

let player = AVPlayer(
    using: playbackID,
    limitResolutionTo720p: shouldLimitResolutionTo720p
)

extension AVPlayer {
    /// Initializes a player configured to stream
    /// the provided asset's playback ID.
    /// - Parameters:
    ///   - playbackID: a playback ID of your asset
    ///   - limitResolutionTo720p: if true configures
    ///   the player to select a resolution no higher
    ///   than 720p. False by default. 
    convenience init(
        using playbackID: String,
        limitResolutionTo720p: Bool = false
    ) {
        let playbackURL = URL.makePlaybackURL(
            playbackID: playbackID,
            limitResolutionTo720p: limitResolutionTo720p
        )
        
        self.init(
            url: playbackURL
        )
    }
}

/// Convenience extensions for working with URLs
extension URL {
    /// Convenience initializer for a static URL
    /// - Parameters:
    ///   - staticString: a static representation
    ///   of a valid URL, supplying an invalid URL
    ///   results in precondition failure
    init(staticString: StaticString) {
        guard let url = URL(
            string: "\(staticString)"
        ) else {
            preconditionFailure("Invalid URL static string")
        }

        self = url
    }

    /// Convenience constructor for a playback URL with
    /// an optional 720p limit
    /// - Parameters:
    ///     - baseURL: either the Mux stream URL or can be
    ///     customized if using Custom Domains for Mux Video.
    ///     - playbackID: playback ID for the asset
    ///     - limitResolutionTo720p: set an upper threshold for the
    ///     resolution chosen by the player to 720p. By default no limit 
    ///     is set and the player can choose any available resolution.
    /// - Returns: a playback URL for a Mux Video asset with a resolution
    /// limit if it is requested
    static func makePlaybackURL(
        baseURL: StaticString = "https://stream.mux.com",
        playbackID: String,
        limitResolutionTo720p: Bool = false
     ) -> URL {
        var components = URLComponents(
            url: URL(
                staticString: baseURL
            ),
            resolvingAgainstBaseURL: false
        )

        components?.path = "/\(playbackID).m3u8"

        if limitResolutionTo720p {
            components?.queryItems = [
                URLQueryItem(
                    name: "max_resolution",
                    value: "720p"
                )
            ]
        }
    
        guard let playbackURL = components?.url else {
            preconditionFailure("Invalid playback URL component")
        }

        return playbackURL
     }
}

```



# Autoplay your videos
Use this guide to understand how to autoplay your videos on web-based players.
If you are autoplaying videos with any web-based player that uses the video element, it helps to understand how browsers handle autoplay so that you can provide the best experience for your users. This applies to video elements with the `autoplay` attribute, and any time you call `play()` on a video element (this includes all HTML5 players like Video.js, JW Player, Shaka Player, etc.).

Browser vendors frequently change their policies about when autoplay is and is not allowed, so your application should be prepared to deal with both scenarios, and we want to make sure we're tracking your views and errors accurately.

<Callout title="Autoplay with Mux Player">
  Mux Player has some extra options for helping with autoplay that do some of the following recommendations for you. Check out the [Mux Player autoplay guide](/docs/guides/player-core-functionality#autoplay) for details.
</Callout>

## Increase your chance of autoplay working

There are a few conditions that will increase your chance of autoplay working.

* Your video is muted with the muted attribute.
* The user has interacted with the page with a click or a tap.
* (Chrome - desktop) The user’s [Media Engagement Index](https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#mei) threshold has been crossed. Chrome keeps track of how often a user consumes media on a site and if a user has played a lot of media on this site then Chrome will probably allow autoplay.
* (Chrome - mobile) The user has added the site to their home screen.
* (Safari) Device is not in power-saving mode.

<Callout type="error" title="Autoplay will never work 100% of the time">
  Even if autoplay works when you test it out, you can never rely on it working for every one of your users. Your application must be prepared for autoplay to fail.
</Callout>

## Avoid the `autoplay` attribute

When you use the `autoplay` attribute on a `<video>` element (it looks like `<video autoplay>`), you have no way to know if the browser blocked or didn't block autoplay.

It is recommended to use `video.play()` instead, which returns a promise that tells you whether playback started successfully. If autoplay worked, the promise resolves; if it did not, the promise rejects with an error. The great thing about this approach is that you can choose what to do with the error.

For example: you can report the error to your own error tracking tools or update the UI to reflect this error. Note that Mux's custom error tracking is for tracking fatal errors, so you wouldn't want to report an autoplay failure to Mux because then it will be considered a fatal error.

```js
const video = document.querySelector('#my-video');

video.play().then(function () {
  // autoplay was successful!
}).catch(function (error) {
  // do something if you want to handle or track this error
});
```
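One common pattern when the `play()` promise rejects is to fall back to muted playback, which browsers are much more likely to allow (see the conditions listed above). A sketch, with an illustrative helper name:

```javascript
// Sketch: try unmuted autoplay, fall back to muted, and report which
// mode succeeded so the UI can react (e.g. show an unmute button).
async function attemptAutoplay(video) {
  try {
    await video.play();
    return 'unmuted';
  } catch (_) {
    try {
      video.muted = true; // muted autoplay is usually permitted
      await video.play();
      return 'muted';
    } catch (error) {
      return 'blocked'; // show a tap-to-play control instead
    }
  }
}
```

Remember that even the muted attempt can fail, so the `'blocked'` branch still needs a real UI fallback.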

For further reading, see [the mux blog post](https://mux.com/blog/video-autoplay-considered-harmful/) about this topic.


# Use your own custom domain
Use your own domain for live streaming ingest and live and on-demand playback. For live streaming ingest we support CNAME-ing or "canonical naming" `global-live.mux.com`.  For playback, we support the streaming of videos or serving images from your own domain.
## Use your own domain name for live ingest

CNAME-ing, short for "Canonical Naming", is a configuration that allows you to change the default domain name we provide.

Add a [CNAME](https://en.wikipedia.org/wiki/CNAME_record) to your domain's [DNS](https://en.wikipedia.org/wiki/Domain_Name_System) settings, and configure it to point to `global-live.mux.com`. After a short amount of time you should be able to use your own domain name for ingest.

<Callout type="warning" title="RTMP, RTMPS and SRT with custom domains">
  Mux supports RTMP, RTMPS, and SRT ingestion. RTMP and SRT support custom domains by configuring the CNAME record to point at the relevant ingest URL's domain (such as global-live.mux.com, or a [regional live ingest URL](/docs/guides/configure-broadcast-software#available-ingest-urls)). Custom domains will not work with RTMPS.

  Please reach out to [our support team](/support) with additional details of your requirements.
</Callout>

Here are a few popular domain services with CNAME-ing instructions. If your domain service is not listed, try searching their support resources.

* [Cloudflare](https://support.cloudflare.com/hc/en-us/articles/360020348832-Configuring-a-CNAME-setup)
* [Google Domains](https://support.google.com/a/answer/47283?hl=en)
* [AWS Route 53](https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/ResourceRecordTypes.html#CNAMEFormat)
* [GoDaddy](https://www.godaddy.com/help/add-a-cname-record-19236)

Note that the CNAME doesn't have to be `global-live`, it can be anything you want it to be.

After configuring your DNS settings it may take a few hours before the new configuration works, depending on your DNS provider.

Here are a few examples of RTMPS and RTMP CNAME URLs before and after they are changed to custom domains:

```text
# RTMPS examples
rtmps://global-live.mux.com:443/app
# RTMP examples
rtmp://global-live.mux.com:5222/app
rtmp://your-cname.your-site.com:5222/app
```

## Use your own domain for delivering videos and images

For delivery, a custom domain allows you to play videos or deliver images from your own domain, such as `media.mycustomdomain.com`, rather than stream.mux.com or image.mux.com.

You might be interested in this feature if you want a consistent brand presence across all your assets, want to sandbox your videos, or need to be allowlisted. If so, please reach out to your Mux Account team.

## Availability

Custom domains for playback are available to customers with an annual contract with Mux. If you do not have an annual contract with Mux, you can add on this feature for $100 per month. Please reach out to our [Support Team](/support) to get set up.


# Embed videos for social media
Learn how to embed your videos using Open Graph cards
## Introduction to Open Graph

The Open Graph protocol uses HTML meta tags in the `<head>` section of a webpage, allowing you to describe objects on your page, such as a video and its thumbnail, so they can be used in social media posts or appear in search results. Open Graph also helps search engines
find videos on your webpage that might otherwise be hidden by JavaScript.

Here is a list of Open Graph properties for video optimization:

| Property        | Description                                            |
| :---------------| :----------------------------------------------------- |
| `og:type`         | The object's type, e.g. video, audio                   |
| `og:url`          | The URL of the webpage                                 |
| `og:title`       | Title of the video                                     |
| `og:description`  | Description of the video                               |
| `og:image`        | Thumbnail of the video                                 |
| `og:video`        | The URL of the video                                   |
| `og:video:width`  | Width of the video in pixels                           |
| `og:video:height` | Height of the video in pixels                          |
| `og:site_name`    | The website name the contains the video                |

### Object types

You can also use subtypes. For example, if your object type is video and you want to create an Open Graph card
for an episode or a movie, you can use `video.episode` or `video.movie`.

```html
<meta property="og:type" content="video.episode">
<meta property="og:type" content="video.movie">
<meta property="og:type" content="video.tv_show">
```

### Optional meta tags

Use additional properties to provide more metadata about your object, such as the actor and director.

| Property        | Description                                         |
| :----------- | :----------------------------------------------------- |
| `video:actor` | profile array - Actors in the movie. |
| `video:actor:role` | string - The role they played. |
| `video:director` | profile array - Directors of the movie. |
| `video:writer` | profile array - Writers of the movie. |
| `video:duration` | integer >=1 - The movie's length in seconds. |
| `video:release_date` | datetime - The date the movie was released. |
| `video:tag` | string array - Tag words associated with this movie. |

### Integrate the Open Graph meta tags

To add the Open Graph meta tags to your website, place the meta tags in the `<head>` section of the webpage.
Below is an example of Open Graph tags:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta property="og:title" content="Mux Video" />
    <meta property="og:type" content="video.episode" />
    <meta property="og:description" content="MP4 video asset for Open Graph Cards" />
    <meta property="og:image" content="https://image.mux.com/aYKMM7VxaD2InrbhrKlhi00V6R9EpRmQNmBJ10200AK02bE/thumbnail.png" />
    <meta property="og:video" content="https://stream.mux.com/F9cP5Xgdcp7028hN4gQrOmlF62ZDHNloCTQQao8Pk00kk/medium.mp4" />
    <meta property="og:video:width" content="350">
    <meta property="og:video:height" content="200">
    <meta property="og:video:duration" content="300">
    <meta property="og:url" content="http://mux.com">
  </head>
  <body>
    <video
      id="my-player"
      controls
      style="width: 100%; max-width: 500px;"
      src="https://stream.mux.com/F9cP5Xgdcp7028hN4gQrOmlF62ZDHNloCTQQao8Pk00kk/medium.mp4"
    ></video>
  </body>
</html>
```

## Creating Twitter/X cards

Twitter cards are implemented using meta tags, but unlike Open Graph cards, they use different property names.

There are four different types of cards to choose from, defined in the `twitter:card` meta tag property:

* photo card
* player card
* summary card
* app card

The player card provides functionality to play external media files inside Twitter. Below are the definitions of the other Twitter meta tags you can use with the player card.

### Twitter/X meta tags

| Property        | Description                                         |
| :----------- | :----------------------------------------------------- |
| `twitter:card` | Type of Twitter card e.g., "player." |
| `twitter:title` | The title of your content as it should appear in the card |
| `twitter:site` | The Twitter @username the card should be attributed to |
| `twitter:description` | Description of the content (optional) |
| `twitter:player` | HTTPS URL to an iframe player; this must be an HTTPS URL that does not generate active mixed-content warnings in a web browser (the URL of the page hosting the player) |
| `twitter:player:width` | Width of the iframe specified in `twitter:player`, in pixels |
| `twitter:player:height` | Height of the iframe specified in `twitter:player`, in pixels |
| `twitter:image` | Image to be displayed in place of the player on platforms that don't support iframes or inline players; make this image the same dimensions as your player |

### Example HTML

```html
<meta name="twitter:card" content="player" />
<meta name="twitter:title" content="Some great video" />
<meta name="twitter:site" content="@twitter_username">
<meta name="twitter:description" content="Great video by @twitter_username" />
<meta name="twitter:player" content="https://link-to-a-videoplayer.com" />
<meta name="twitter:player:width" content="360" />
<meta name="twitter:player:height" content="200" />
<meta name="twitter:image" content="https://link-to-a-image.com/image.jpg" />
```

## Preview your Open Graph cards

You can preview your Open Graph cards using one of the many services that generate a preview from a URL you enter.

One such service is [Opengraph.xyz](https://opengraph.xyz), which lets you preview
what you have configured and also helps you generate more Open Graph meta tags.

Below is a preview of the example above in [Opengraph.xyz](https://opengraph.xyz)

<Image src="/docs/images/OpenGraph_preview.png" width={1790} height={491} alt="Preview Open Graph cards" />


# Use static MP4 and M4A renditions
Learn how to create downloadable MP4 and M4A files from your videos for offline playback, social media sharing, transcription services, and legacy device support.
<Callout type="info">
  This guide covers using `static_renditions` to create MP4 files, which replaces the deprecated `mp4_support` method. While `mp4_support` continues to function, we recommend using `static_renditions` for all new implementations.

  For details on the older method, see [enabling static mp4 renditions using mp4\_support](/docs/guides/enable-static-mp4-renditions-using-mp4-support).
</Callout>

## What are static MP4 and M4A renditions?

Static renditions are downloadable versions of your video assets in MPEG-4 video (`.mp4`) or audio (`.m4a`) format. These files are created alongside the default HLS streaming format and can be used for downloading or streaming the content.

Static renditions allow you to create downloadable files that can be used for:

* Supporting very old devices, like Android \< v4.0 (Less than 1% of Android users)
* Supporting assets that are very short in duration (e.g., \< 10s) on certain platforms
* Embedding a video in [Open Graph cards](/docs/guides/embed-videos-for-social-media) for sharing on sites like Facebook and Twitter
* Downloading videos for offline viewing

It also allows users to download M4A audio files, which may be useful for:

* Feeding into transcription services
* Delivering a streamable audio-only file to an audio element
* Downloading an audio-only file, useful for things like podcasts

In the majority of other cases, you'll want to use our default HLS (.m3u8) format, which provides a better viewing experience by dynamically adjusting the quality level to the viewer's connection speed.
The HLS version of a video will also be ready sooner than the MP4 versions, if time-to-ready is important.

## How video quality affects static renditions

Static renditions are created at the same quality level (Basic, Plus, or Premium) as your original Mux video, and will be either the highest quality video rendition possible, a specific desired resolution, or an audio-only version. For videos with multiple versions at the highest resolution (which can happen with Premium quality), we'll use the highest quality version available.

## How to enable static renditions

There are several points in an asset's lifecycle where you can enable static renditions. You can enable them when initially creating an asset, add them later to an existing asset, or configure them as part of a direct upload. The method you choose will depend on your workflow and when you determine you need the static renditions.

### During asset creation

You can add static renditions to an asset when <ApiRefLink href="/docs/api-reference/video/assets/create-asset">creating an asset</ApiRefLink> by including the `"static_renditions": []` array parameter and specifying a `{ "resolution": <option> }` object as an array element for each static rendition that should be created.

There are two types of static renditions: `standard` and `advanced`. Standard static rendition MP4s provide the most common options needed for generating static renditions: either the highest video rendition possible, or an audio-only copy of the content. Advanced static rendition MP4s allow you to specify the exact resolution of the static renditions that are generated. Standard static renditions incur a cost for the number of minutes [stored](/docs/pricing/video#static-rendition-mp4s-storage) and [delivered](/docs/pricing/video#static-renditions-mp4s). Advanced static renditions also incur a [cost per minute of MP4s that are generated](/docs/pricing/video#advanced-static-rendition-mp4s-encoding).

#### Standard Static Rendition Options

The standard static rendition options are:

* `highest`: Produces an MP4 file with the video resolution up to 4K (2160p).
* `audio-only`: Produces an M4A (audio-only MP4) file for a video asset.

One or both options can be specified.

* For an audio-only asset: The `audio-only` rendition option will produce an M4A file and `highest` is skipped.
* For a video-only asset: The `highest` rendition option will produce an MP4 file and `audio-only` is skipped.

Here's an example of creating an asset with `static_renditions` specified using the `highest` and `audio-only` options:

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "static_renditions" : [ 
    {
      "resolution" : "highest"
    },
    {
      "resolution" : "audio-only"
    }
  ]
}
```

#### Advanced Static Rendition Options

The advanced supported resolutions are:

* 270p
* 360p
* 480p
* 540p
* 720p
* 1080p
* 1440p
* 2160p

Mux will not upscale to produce MP4 renditions; renditions that would require upscaling are skipped.

Note that advanced explicit resolution static renditions cannot be mixed with the `highest` standard static rendition. However, they can be generated on the same asset as the `audio-only` rendition.
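Because these mixing rules are easy to trip over, you might validate a `static_renditions` request client-side before calling the API. This is an illustrative sketch, not part of any Mux SDK; it only encodes the constraints described above:

```python
# Illustrative client-side check of a static_renditions request (not a Mux
# SDK function). Explicit resolutions cannot be combined with "highest",
# but "audio-only" can accompany either kind.

ADVANCED = {"270p", "360p", "480p", "540p", "720p", "1080p", "1440p", "2160p"}

def validate_static_renditions(renditions):
    """Raise ValueError if the requested renditions mix incompatible options."""
    resolutions = {r["resolution"] for r in renditions}
    if "highest" in resolutions and resolutions & ADVANCED:
        raise ValueError("'highest' cannot be mixed with explicit resolutions")
    unknown = resolutions - ADVANCED - {"highest", "audio-only"}
    if unknown:
        raise ValueError(f"unknown resolution options: {sorted(unknown)}")
    return renditions
```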

Here's an example of creating an asset with `static_renditions` specified using the `720p`, `480p`, and `audio-only` options:

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "static_renditions" : [ 
    {
      "resolution" : "720p"
    },
    {
      "resolution" : "480p"
    },
    {
      "resolution" : "audio-only"
    }
  ]
}
```

### After asset creation

You can add static renditions to existing assets retroactively by calling the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-static-rendition">create static rendition API</ApiRefLink>, as shown below. The create static rendition API will need to be called for each static rendition you would like to add to the asset.

```json
// POST /video/v1/assets/{ASSET_ID}/static-renditions
{
  "resolution" : "highest"
}
```

### During direct upload

To enable static renditions for direct upload, you need to specify the same `static_renditions` field within `new_asset_settings`, as shown below:

```json
// POST /video/v1/uploads
{
  "cors_origin": "https://example.com/",
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "static_renditions" : [ 
      {
        "resolution" : "highest"
      }
    ]
  }
}
```

### During live stream creation

Static renditions can be created from the recorded version of a live stream. This is useful if you want to create downloadable files from a live stream soon after the live stream is finished.

If you want to enable static renditions from the recorded version of a future live stream soon after the live stream is finished, use the `static_renditions` property in the `new_asset_settings` when <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">creating the live stream</ApiRefLink>.

```json
// POST /video/v1/live-streams
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "static_renditions" : [
      {
        "resolution" : "highest"
      }
    ]
  }
}
```

### After live stream creation

To update the static renditions that are configured to be created from the recorded version of a future live stream, use the <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream-new-asset-settings-static-renditions">update live stream static renditions API</ApiRefLink>.

```json
// PUT /video/v1/live-streams/{LIVE_STREAM_ID}/new-asset-settings/static-renditions
{
  "static_renditions" : [
    {
      "resolution" : "highest"
    },
    {
      "resolution" : "audio-only"
    }
  ]
}
```

## Access static renditions

After adding static renditions, you'll see an additional key on the asset object called `static_renditions`. This is the object that will contain the information about which static renditions are available.

```json
{
  ...all asset details...
  "static_renditions" : [
    {
      "id" : "ABC123",
      "type" : "standard",
      "status" : "preparing",
      "resolution" : "highest",
      "name" : "highest.mp4",
      "ext":"mp4"
    },
    {
      "id" : "GHI678",
      "type" : "standard",
      "status" : "preparing",
      "resolution" : "audio-only",
      "name" : "audio.m4a",
      "ext":"m4a"
    }
  ]
}
```

<Callout type="info" title="Static rendition status">
  Static renditions take longer to create than our default HLS version of the video, so they will not be ready immediately when the asset status is `ready`.

  The `static_renditions[].status` parameter reflects the current processing status of each static rendition. Each static rendition is ready when its `static_renditions[].status` is `ready`, at which point a `video.asset.static_rendition.ready` webhook is fired.
</Callout>

You can build the streaming URL by combining the playback ID with the name of the static rendition. The URL follows this pattern:

```html
https://stream.mux.com/{PLAYBACK_ID}/{STATIC_RENDITION_NAME}
```

The `name` field in each static rendition object (like `highest.mp4` or `audio.m4a`) is what you'll use as the `STATIC_RENDITION_NAME`.

```
ex. https://stream.mux.com/abcd1234/highest.mp4
ex. https://stream.mux.com/abcd1234/audio.m4a
```

If you want a browser to download the MP4 or M4A file rather than attempt to stream it, you can provide a file name for the static rendition to save it via the `download` query parameter:

```
https://stream.mux.com/{PLAYBACK_ID}/{STATIC_RENDITION_NAME}?download={SAVED_FILE_NAME}
```

For example, if you want to save the `highest.mp4` file as `cats.mp4`, you can use the following URL:

```
ex. https://stream.mux.com/abcd1234/highest.mp4?download=cats
```

### Accessing static renditions of DRM enabled assets

You cannot access static renditions using the playback ID of a `drm` playback policy. If you want to use static renditions, you must add a `public` or `signed` advanced playback policy alongside the `drm` policy.

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "advanced_playback_policies": [
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }, 
    {
      "policy": "signed",
    }
  ],
  "static_renditions": [
    {
      "resolution": "highest"
    }
  ],
  "video_quality": "plus"
}
```

## Remove static renditions

### From an asset

To remove static renditions from an asset, you can use the <ApiRefLink href="/docs/api-reference/video/assets/delete-asset-static-rendition">delete static rendition API</ApiRefLink>. Call it with the `id` of each rendition you want to remove from the asset. The static rendition files are deleted when they are removed from an asset.

```json
// DELETE /video/v1/assets/{ASSET_ID}/static-renditions/{STATIC_RENDITION_ID}
```

To completely disable static renditions on an asset, delete all of the static renditions configured on the asset.

### From future live streams

To remove the static renditions that are configured to be created from the recorded version of a future live stream, use the <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream-new-asset-settings-static-renditions">delete live stream static renditions API</ApiRefLink>.

```json
// DELETE /video/v1/live-streams/{LIVE_STREAM_ID}/new-asset-settings/static-renditions
```

## Webhooks

Your application can be automatically updated with the status of static renditions for an asset through [webhooks](/docs/core/listen-for-webhooks).

There are five events you can receive, which can be fired for each individual static rendition.

| Webhook       | Description   |
| :------------ |:--------------|
|`video.asset.static_rendition.created` |Emitted when a static rendition entry is created and the file is being prepared. |
|`video.asset.static_rendition.ready` |Emitted when a static rendition is ready to be downloaded. |
|`video.asset.static_rendition.errored` |Emitted when a static rendition fails to be generated. |
|`video.asset.static_rendition.skipped` |Emitted when a static rendition is skipped because the requested resolution conflicts with the asset metadata. For example, specifying `audio-only` for a video-only asset or `highest` for an audio-only asset. |
|`video.asset.static_rendition.deleted` |Emitted when an individual static rendition is deleted. Note: This event is not emitted when the parent asset is deleted. |
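A webhook handler mostly needs to branch on the event's `type`. The sketch below assumes the event body carries the static rendition object under a `data` key; verify the exact payload shape against the webhook reference before relying on it:

```python
import json

# Minimal webhook dispatch sketch (illustrative, not a complete handler).
# Assumes the event JSON has a `type` field and the static rendition object
# under `data` -- check the Mux webhook reference for the exact shape.

def handle_static_rendition_event(body: str) -> str:
    event = json.loads(body)
    kind = event["type"]
    if kind == "video.asset.static_rendition.ready":
        # The rendition can now be streamed or downloaded by name.
        return f"ready: {event['data'].get('name')}"
    if kind == "video.asset.static_rendition.errored":
        return "errored"
    if kind == "video.asset.static_rendition.skipped":
        return "skipped"
    return "ignored"
```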

## Signed static rendition URLs

Mux videos have two types of playback policy, `public` or `signed`. If your `playback_id` is `signed`, you will need to also sign requests made for MP4 URLs.

You can check out how to do that in our [signed URLs guide](/docs/guides/secure-video-playback).

If you run into any trouble signing requests, please reach out to [Mux Support](/support) and we'll be able to help.

## Migrate from the deprecated `mp4_support` parameter

Previously, MP4 support was specified using the `mp4_support` parameter on an asset. This method continues to work, though it has been deprecated, and new functionality will use the `static_renditions` array.

The `mp4_support` parameter and the `static_renditions` array cannot be used at the same time on an asset.

To use the `static_renditions` array with assets that have MP4 support enabled using `mp4_support`, you need to first use the <ApiRefLink href="/docs/api-reference/video/assets/update-asset-mp4-support">update asset MP4 support APIs</ApiRefLink>, setting `mp4_support` to `none` to remove the `mp4_support`. Then you can create the static renditions individually as described above.

```json
// PUT /video/v1/assets/{ASSET_ID}/mp4-support
{
  "mp4_support": "none"
}
```

Similarly, the `mp4_support` parameter cannot be used if an asset has existing `static_renditions` specified. Delete the static renditions and the legacy `mp4_support` parameter can be enabled.

## Pricing

Additional storage fees apply for assets that have static renditions enabled.

Streaming of static renditions is charged at the same rate as HLS streaming.

[See pricing documentation for full details](/docs/pricing/video#static-renditions-mp4s)


# Download for offline editing
Learn how to download a master quality video for editing and post production or archiving
## Why download the master

When a video is ingested into Mux, we store a version of the video that's equivalent in quality to the original; we call this the master. The `max_resolution_tier` of an asset determines the master file's resolution, e.g., a `max_resolution_tier` of `2160p` results in a 4K master file. All of the streamed versions of the video are created from the master, and the master itself is never streamed to a video player because it's not optimized for streaming.

There are a few common use cases where Mux may have the only copy of the original video:

* You're using Mux live streaming and the only copy is the recorded asset after the event
* You're using Mux's [direct upload](/docs/guides/upload-files-directly) feature so Mux has the only copy
* You deleted the original version from your own cloud storage because Mux is already storing a high quality version for you

When this is the case, there are a number of reasons you may want to retrieve the master version from Mux, including:

* Allowing users to download the video and edit it in a tool like Final Cut Pro
* Archiving the video for the future, for example if you're un-publishing (deleting) a video asset from Mux
* Moving your videos to another service

Enabling master access will create a *temporary* URL to the master version as an MP4 file.
You can use this URL to download the video to your own hosting, or provide the URL to a user to download directly from Mux.

**The URL will expire after 24 hours, but you can enable master access on any asset at any time.**

<Callout type="warning" title="API Only!">
  The methods described here are available only via the Mux API; you won't find these features in the Mux Dashboard.
</Callout>

## Enable master access

If you want the master to be available soon after a video is uploaded, use the `master_access` property when <ApiRefLink href="/docs/api-reference/video/assets/create-asset">creating an asset</ApiRefLink>.

```json
{
  "inputs": [
    {
      "url": "VIDEO_URL"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "video_quality": "basic",
  "master_access": "temporary"
}
```

You can also add it afterward by <ApiRefLink href="/docs/api-reference/video/assets/update-asset-master-access">updating the asset</ApiRefLink>.

### Enable master access when a live stream finishes

If you want to download the recorded version of a live stream soon after the live stream is finished, use the `master_access` property in the `new_asset_settings` when <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">creating the live stream</ApiRefLink>.

```json
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic",
    "master_access": "temporary"
  }
}
```

## Retrieving the URL to the master

After master access has been requested, a new object called `master` will exist on the asset details object.

```json
{
  ...all asset details...
  "master_access": "temporary",
  "master": {
    "status": "preparing"
  }
}
```

Making the master available is an asynchronous process that happens after an Asset is `ready`, or in the case of a live streamed asset, after the `live_stream_completed` event. Because of this, the `master` object has a `status` property that gives the current state of the master, starting with `preparing`.

In most cases, the master will be available very quickly. When it's ready the `status` will be updated to `ready`, and a new `url` property will exist on the object. This is the URL you can use to download the master yourself, or to let a user download the master.

```json
{
  ...all asset details...
  "master_access": "temporary",
  "master": {
    "status": "ready",
    "url": "https://mezzanine.mux.com/ABC123/mezzanine.mp4?skid=foo&signature=bar"
  }
}
```

## Customizing the filename

The filename of the downloaded file can be controlled by appending the `download` query parameter to the URL returned from the API, for example:

```http
https://mezzanine.mux.com/ABC123/mezzanine.mp4?skid=foo&signature=bar&download=desired_filename.mp4
```

This will cause the browser to download the file with the name `desired_filename.mp4` instead of the default name.

<Callout type="warning" title="Important">
  It is important that you do not modify the URL or query parameters returned from the API in any other way than to add the `download` parameter.
</Callout>
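Since the returned URL already carries query parameters (`skid`, `signature`), append the `download` parameter without disturbing them. A hypothetical helper using Python's standard `urllib.parse`:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def with_download_filename(master_url, filename):
    """Append download=<filename> while preserving the existing parameters.

    Only adds the `download` parameter; everything else in the URL returned
    by the API is left untouched, per the warning above.
    """
    parts = urlsplit(master_url)
    query = parse_qsl(parts.query, keep_blank_values=True)
    query.append(("download", filename))
    return urlunsplit(parts._replace(query=urlencode(query)))
```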

## Webhooks for master access

Your application can be automatically updated with the status of master access for an asset through [webhooks](/docs/core/listen-for-webhooks).

There are four related events you can receive.

| Webhook       | Description   |
| :------------ |:--------------|
|`video.asset.master.preparing` | Received when master access is first requested |
|`video.asset.master.ready` |Received when the URL to the master is available |
|`video.asset.master.deleted` |Received if master access has been set to `none` via a PUT to the `master-access` endpoint |
|`video.asset.master.errored` |Received if an unexpected error happens while making the master available |


# Get thumbnails and images from a video
Learn how to get images from a video to show a preview thumbnail or poster image.
## Get an image from a video

To get an image from Mux, use a `playback_id` to make a request to `image.mux.com` in the following format:

```curl
https://image.mux.com/{PLAYBACK_ID}/thumbnail.{png|jpg|webp}
```

<Image src="/docs/images/thumbnail-1.png" width={1920} height={800} />

Images can be served in either `webp`, `png`, or `jpg` format. WebP is an image format that uses lossy and lossless compression methods to reduce image size while maintaining good quality. Images in the `webp` format will typically have a smaller file size than `png` or `jpg` images.

You can control how the image is created by including the following query string parameters with your request. If you don't include any, Mux will default to choosing an image from the middle of your video.

### Thumbnail Query String Parameters

| Parameter     | Type          | Description                                           |
| :-------------|:--------------|:------------------------------------------------------|
| `time`        | `float`       | The time (in seconds) of the video timeline where the image should be pulled. Defaults to the middle of the original video.|
| `width`       | `int32`       | The width of the thumbnail (in pixels). Defaults to the width of the original video. |
| `height`      | `int32`       | The height of the thumbnail (in pixels). Defaults to the height of the original video. |
| `rotate`      | `int32`       | Rotate the image clockwise by the given number of degrees. Valid values are `90`, `180`, and `270`. |
| `fit_mode`    | `string`      | How to fit a thumbnail within width + height. Valid values are `preserve`, `stretch`, `crop`, `smartcrop`, and `pad` (see below). |
| `flip_v`      | `boolean`     | Flip the image top-bottom after performing all other transformations. |
| `flip_h`      | `boolean`     | Flip the image left-right after performing all other transformations. |
| `latest`      | `boolean`     | When set to `true`, pulls the latest thumbnail from the playback ID of an ongoing live stream. Can only be used with live streams. [More details](#getting-the-latest-thumbnail-from-a-live-stream). |

The `fit_mode` parameter can have the following values:

* `preserve` : By default, Mux will preserve the aspect ratio of the video, while fitting the image within the requested width and height. For example, if the thumbnail width is 100, the height is 100, and the video's aspect ratio is 16:9, the delivered image will be 100x56 (16:9).
* `stretch` : The thumbnail will exactly fill the requested width and height, even if it distorts the image. Requires both width and height to be set. (Not very popular.)
* `crop` : The video image will be scaled up or down until it fills the requested width and height box. Pixels then outside of the box will be cropped off. The crop is always centered on the image. Requires both width and height to be set.
* `smartcrop` : An algorithm will attempt to find an area of interest in the image and center it within the crop, while fitting the requested width and height. Requires both width and height to be set.
* `pad` : Similar to preserve but Mux will "letterbox" or "pillar box" (add black padding to) the image to make it fit the requested width and height exactly. This is less efficient than preserve but allows for maintaining the aspect ratio while always getting thumbnails of the same size. Requires both width and height to be set.
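To illustrate `preserve`, the delivered size can be computed by scaling the video's aspect ratio into the requested box. This sketch is only a model of the behavior described above; it reproduces the 100x100 request on a 16:9 video yielding 100x56:

```python
def preserve_fit(video_w, video_h, box_w, box_h):
    """Model of fit_mode=preserve: fit the video's aspect ratio in the box.

    Scales the source dimensions by the largest factor that keeps both
    within the requested width and height.
    """
    scale = min(box_w / video_w, box_h / video_h)
    return round(video_w * scale), round(video_h * scale)
```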

#### Example with Query String Parameters

Here is an example request for an image including query parameters which:

* has a width of 400 and a height of 200
* uses the `smartcrop` fit mode
* is taken from the 35th second of the video
* is a PNG

```curl
https://image.mux.com/{PLAYBACK_ID}/thumbnail.png?width=400&height=200&fit_mode=smartcrop&time=35
```
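Assembling these URLs by hand gets error-prone once several parameters are involved; a small helper can do it. The function name is hypothetical, not part of a Mux SDK:

```python
from urllib.parse import urlencode

def thumbnail_url(playback_id, ext="png", **params):
    """Build an image.mux.com thumbnail URL.

    Keyword arguments map directly onto the query parameters documented
    above (width, height, fit_mode, time, ...).
    """
    url = f"https://image.mux.com/{playback_id}/thumbnail.{ext}"
    if params:
        url += "?" + urlencode(params)
    return url
```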

<Image src="/docs/images/thumbnail-2.png" width={400} height={200} />

<Callout type="info">
  Note that there is a default limit of 1 thumbnail and 1 GIF for every 10 seconds of duration per asset. For assets under 100 seconds in duration, the limit is 10 thumbnails and 10 GIFs. For example, you can retrieve 30 thumbnails and 30 GIFs for a 5 minute asset or 10 thumbnails and 10 GIFs for a 30 second asset.
</Callout>

## Get an animated GIF from a video

To get an animated `gif` or `webp` from Mux, use a `playback_id` associated with an asset or live stream to make a request to `image.mux.com` in the following format:

```curl
https://image.mux.com/{PLAYBACK_ID}/animated.{gif|webp}
```

<Image src="/docs/images/animated-image-1.gif" width={640} height={266} />

You can control how the image is created by including the following query string parameters with your request.

### Animated GIF Query String Parameters

| Parameters    | Type          | Description   |
| :------------ |:--------------|:--------------|
| `start`       | `float`       | The time (in seconds) of the video timeline where the animated GIF should begin. Defaults to 0. |
| `end`         | `float`       | The time (in seconds) of the video timeline where the GIF ends. Defaults to 5 seconds after the start. Maximum total duration of GIF is limited to 10 seconds; minimum total duration of GIF is 250ms. |
| `width`       | `int32`       | The width in pixels of the animated GIF. Default is 320px, or if height is provided, the width is determined by preserving aspect ratio with the height. Max width is 640px. |
| `height`      | `int32`       | The height in pixels of the animated GIF. The default height is determined by preserving aspect ratio with the width provided. Maximum height is 640px. |
| `fps`         | `int32`       | The frame rate of the generated GIF. Defaults to 15 fps. Max 30 fps. |

#### Example with Query String Parameters

Here is an example request for a GIF including query parameters which:

* set a width of 640
* set a frame rate of 5fps

```curl
https://image.mux.com/{PLAYBACK_ID}/animated.gif?width=640&fps=5
```

<Image src="/docs/images/animated-image-2.gif" width={640} height={360} />

## Common uses for image requests

Images and GIFs can be used anywhere in your project, but here are some examples of common ways you can use images from Mux.

<Callout type="error">
  Avoid using images for storyboards (timeline hover previews). To learn more about storyboards, you can view [this guide](/docs/guides/create-timeline-hover-previews).
</Callout>

### Add a poster image to your player

Most video players will default to showing a black frame with a play icon and other video controls before a user presses play to start the video playback.
You can add a poster image to most video players by feeding in an image URL from Mux. Here's an example using an HTML5 video element
with a poster image set up.

```html
<video id="my-video" width="640" height="360" poster="https://image.mux.com/{PLAYBACK_ID}/thumbnail.jpg" controls></video>
```

### Use a GIF to show a preview

When a user is picking a video from a catalogue, you could show a preview of the video using an animated GIF whilst they hover over a thumbnail of the video.

<Image src="/docs/images/animated-catalogue-example.gif" width={640} height={300} />

You could use pure CSS, some JavaScript, or another method which best fits with your application to achieve a similar result (the example above used CSS).

## Using signed URLs

Mux videos have two types of playback policy, `public` or `signed`. If your `playback_id` is `signed`, you will need to also sign requests made for images and animated GIFs.
You can check out how to do that in our [signed URLs guide](/docs/guides/secure-video-playback).

If you run into any trouble signing image requests, please [reach out](/support) and we'll be able to help.

## Getting the latest thumbnail from a live stream

When a live stream is active, you can use the `?latest=true` query string parameter to get the latest thumbnail from the live stream. This thumbnail is refreshed every 10 seconds.

This is useful for building moderation and classification workflows when working with user-generated live streams, but can also be used to show a discovery experience, showing the active live streams in your application.

Using the `latest` parameter on a VOD asset or non-live stream will return a 400 error.

[Read the blog post for more details and examples for this feature.](/blog/latest-thumbnail)


# Create timeline hover previews
Learn how to add hover image previews to your player.
## What are timeline hover previews?

Timeline hover previews, also known as trick play or scrub bar previews, make player operations like fast-forward, rewind, and seeking more visual to the user. Here it is in action:

<Image src="/docs/images/animated-storyboard.gif" width={640} height={360} />

Each image (also called a thumbnail or tile) you see when hovering over the scrub bar (or player timeline) on the video player is part of a larger image called a storyboard.
A storyboard is a collection of thumbnails or tiles, created from video frames selected at regular time intervals and arranged in a grid layout.

Below is an example storyboard for the video [Tears of Steel](https://mango.blender.org/), the same video used to demo timeline hover previews above:

<Image src="/docs/images/storyboard.png" width={1920} height={1600} />

## Add timeline hover previews to your player

There are a few different ways to add this functionality to your players, depending on which methods your chosen player exposes to support timeline hover previews.

The storyboard image can be requested from the following URL in either `webp`, `jpg`, or `png` format from Mux:

```curl
https://image.mux.com/{PLAYBACK_ID}/storyboard.{png|jpg|webp}
```

Each storyboard has an associated metadata file, which can be used as a `metadata` text track. The storyboard image is referenced from the metadata in this case.

The storyboard metadata provides the x-axis and y-axis coordinates of each image in the storyboard image and the corresponding time range. The metadata is available in both WebVTT and JSON format.

Storyboard images will contain 50 tiles within the image if the asset is less than 15 minutes in duration. If the asset is more than 15 minutes, then there will be 100 tiles populated in the storyboard image.
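The tile-count rule above can be sketched in a couple of lines. This is only an illustration of the stated rule; in practice, read the exact tile timings from the storyboard metadata rather than computing them.

```javascript
// Storyboards use 50 tiles for assets under 15 minutes, 100 otherwise.
function storyboardTileCount(durationSeconds) {
  return durationSeconds < 15 * 60 ? 50 : 100;
}

// Approximate time interval covered by each tile.
function secondsPerTile(durationSeconds) {
  return durationSeconds / storyboardTileCount(durationSeconds);
}
```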

### WebVTT

Most popular video players use a WebVTT file to describe the individual tiles of the storyboard image. You can request the WebVTT file for the storyboard like this:

```curl
GET https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt
```

```
WEBVTT

00:00:00.000 --> 00:01:06.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=0,0,256,160

00:01:06.067 --> 00:02:14.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=256,0,256,160

00:02:14.067 --> 00:03:22.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=512,0,256,160

00:03:22.067 --> 00:04:28.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=768,0,256,160

00:04:28.067 --> 00:05:36.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=1024,0,256,160

00:05:36.067 --> 00:06:44.067
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg#xywh=0,160,256,160
```
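Each cue's URL carries a spatial media fragment (`#xywh=x,y,w,h`) that tells the player where to crop the storyboard image for that time range. If you're wiring this up yourself, a small parser is enough; this is a sketch, not part of any player API.

```javascript
// Extract the x/y offset and tile size from a storyboard cue URL's
// spatial media fragment, e.g. "...storyboard.jpg#xywh=256,0,256,160".
function parseXywh(cueUrl) {
  const match = cueUrl.match(/#xywh=(\d+),(\d+),(\d+),(\d+)/);
  if (!match) return null;
  const [, x, y, w, h] = match.map(Number);
  return { x, y, w, h };
}
```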

By default, this request will generate a `jpg` for the storyboard image. If you'd like to change the format to `webp`, you can do so by adding `?format=webp` to the end of the request URL:

```curl
GET https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?format=webp
```

#### WebVTT Compatible Video Players

The list below shows video players that support WebVTT files for trick play. If your player isn't listed here, [please reach out](/support), and we'll help where we can!

* [VideoJS](https://videojs.com/) + [VTT Thumbnails plugin](https://www.npmjs.com/package/videojs-vtt-thumbnails)
* [JW Player](https://docs.jwplayer.com/players/docs/ios-add-preview-thumbnails)
* [THEOplayer](https://www.theoplayer.com/docs/theoplayer/how-to-guides/texttrack/how-to-implement-preview-thumbnails/)
* [Bitmovin](https://bitmovin.com/demos/thumbnail-seeking)
* [Flow Player](https://flowplayer.com/demos/video-thumbnails)
* [Plyr](https://plyr.io)

<Callout type="info">
  Using a WebVTT file may be limited to HTML5 browser-based video players and may not be supported in device-specific SDKs, including iOS and Android. iOS, Android, and other device platforms use an HLS I-frame playlist. Generating HLS I-frame playlists is on Mux's roadmap.
</Callout>

### JSON

There are many other scenarios for using storyboards. For instance:

* A quick way of previewing the entire video can save a video editor or reviewer's time without requiring full playback
* Ease of developing trick-play-like functionality in chromeless video players like [hls.js](https://github.com/video-dev/hls.js/)

Using a WebVTT file for metadata can be burdensome to implement. Storyboard metadata expressed in an easy-to-understand and widely supported format like JSON makes it easier to take advantage of storyboards in new ways. Mux provides the same storyboard metadata in JSON format.

```curl
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.json
```

```json
{
  "url": "https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.jpg",
  "tile_width": 256,
  "tile_height": 160,
  "duration": 6744.1,
  "tiles": [
    {
      "start": 0,
      "x": 0,
      "y": 0
    },
    {
      "start": 66.066667,
      "x": 256,
      "y": 0
    },
    {
      "start": 134.066667,
      "x": 512,
      "y": 0
    },
    {
      "start": 202.066667,
      "x": 768,
      "y": 0
    },
    {
      "start": 268.066667,
      "x": 1024,
      "y": 0
    },
    {
      "start": 336.066667,
      "x": 0,
      "y": 160
    }
  ]
}
```
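Working from the JSON metadata, showing a preview for a given playback time means finding the last tile whose `start` is at or before that time, then cropping the storyboard image at that tile's `x`/`y` offset. A minimal sketch (the CSS-positioning helper assumes you're using the storyboard as a `background-image`):

```javascript
// Given the parsed storyboard JSON and a playback time in seconds,
// return the tile that covers that time.
function tileForTime(storyboard, time) {
  let current = storyboard.tiles[0];
  for (const tile of storyboard.tiles) {
    if (tile.start <= time) current = tile;
    else break;
  }
  return current;
}

// Crop via CSS background positioning: offsets are negated so the
// desired tile lines up with the preview element's top-left corner.
function tileBackgroundPosition(tile) {
  return `${-tile.x}px ${-tile.y}px`;
}
```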

By default, this request will generate a `jpg` for the storyboard image. If you'd like to change the format to `webp`, you can do so by adding `?format=webp` to the end of the request URL:

```curl
https://image.mux.com/Dk8pvMnvTeqDk9dy5nqmXz02MM4YtdElW/storyboard.json?format=webp
```

## Cross-Origin Resource Sharing (CORS) requirements

Storyboard URLs use the `image.mux.com` hostname, while video playback URLs use the `stream.mux.com` hostname. Because the URLs use different hostnames, it's recommended to add the `crossorigin` attribute to the [`<video>` HTML tag](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/video) so the player can access the storyboard.

## Roku trick play

Roku announced changes to their [channel certification criteria](https://blog.roku.com/developer/channel-certification-criteria-updates-october-2020) mandating trick play for on-demand video longer than 15 minutes, starting October 1st, 2020.
To support this requirement, you can add this playback modifier when making a playback request for Roku devices:

```none
https://stream.mux.com/{PLAYBACK_ID}.m3u8?roku_trick_play=true
```

Mux will include an Image Media Playlist in the HLS manifest to support this requirement on Roku.

<Callout type="warning" title="Using `roku_trick_play` with signed URLs">
  If you are using [signed playback URLs](/docs/guides/secure-video-playback), make sure you include the extra `roku_trick_play` parameter in your signed token.
</Callout>

## Using signed URLs

Mux videos have two types of playback policy, `public` or `signed`. If your `playback_id` is `signed`, you will need to also sign requests made for storyboards.
You can check out how to do that in our [signed URLs guide](/docs/guides/secure-video-playback).


# Use Video.js with Mux
Learn what video.js kit is and how to use it in your application.
## 1. Introduction to video.js kit

Video.js kit is a project built on [Video.js](https://videojs.com) with additional Mux specific functionality built in.
This includes support for:

* Enabling [timeline hover previews](/docs/guides/create-timeline-hover-previews)
* [Mux Data integration](/docs/guides/monitor-video-js)
* `playback_id` helper (we'll figure out the full playback URL for you)

## 2. Integrate video.js kit

Video.js kit is hosted on npm. To install it, navigate to your project and run:

```text
# npm
npm install @mux/videojs-kit

# yarn
yarn add @mux/videojs-kit
```

Now import the JavaScript and CSS in your application like this:

```js
// include the video.js kit JavaScript and CSS
import videojs from '@mux/videojs-kit';
import '@mux/videojs-kit/dist/index.css';
```

If you're not using a package manager such as npm, there are hosted versions provided by [jsdelivr.com](https://www.jsdelivr.com/) available too.
Use the hosted versions by including this in your HTML page:

```html
<!-- script -->
<script src="https://cdn.jsdelivr.net/npm/@mux/videojs-kit@latest/dist/index.js"></script>
<!-- CSS -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@mux/videojs-kit@latest/dist/index.css">
```

Then, on your page include a `<video>` element where you want to add your player.

```html
<video 
  id="my-player" 
  class="video-js vjs-16-9" 
  controls 
  preload="auto" 
  width="100%"
  data-setup='{}'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

Replace the \{PLAYBACK\_ID} with the `playback_id` of your video from Mux.

### Integrate using Video.js's default playback engine

Video.js kit by default uses [hls.js](https://github.com/video-dev/hls.js).
As of version 0.8.0, you can now integrate with [Video.js's default playback engine](https://github.com/videojs/http-streaming).

To do so, you can follow the steps above but swap out the specific import file to be `index.vhs.js`.

For import:

```js
// include the video.js kit JavaScript and CSS
import videojs from '@mux/videojs-kit/dist/index.vhs.js';
```

For a script tag:

```html
<script src="https://unpkg.com/@mux/videojs-kit@latest/dist/index.vhs.js"></script>
```

## Source Code

Video.js kit is open source and can be found on GitHub here: [https://github.com/muxinc/videojs-mux-kit](https://github.com/muxinc/videojs-mux-kit)

## 3. Set configuration options

There are some built in additional features which can be set when you initialize video.js kit.

### Include a timeline hover preview

You can enable a timeline hover preview by including `timelineHoverPreviews: true` in the configuration options when you create your player.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%"
  data-setup='{
    "timelineHoverPreviews": true
  }'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

### Enable Mux Data

You can enable Mux Data by including the following options when you create your player.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%"
  data-setup='{
    "plugins": {
      "mux": {
        "debug": true,
        "data":{
          "env_key": "ENV_KEY",
          "video_title": "Example Title"
        }
      }
    }
  }'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

The `videojs-mux` data plugin is included by default, so you don't need to include this in addition to video.js kit. To link your data integration to your account,
you should replace the `ENV_KEY` in the configuration with an appropriate Mux environment key, as well as [set metadata](/docs/guides/monitor-video-js#3-make-your-data-actionable).

### Set the video source

You can set the video source using the `playback_id` from your video in Mux, and we'll figure out the fully formed playback URL automatically.

When you set the source for video.js, make sure you set the `type` as `video/mux` and the `src` as your `playback_id`.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>

```

### Initialize with JavaScript

The options above can also be initialized with JavaScript. Here's an example showing how you could enable all options with JavaScript.

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  timelineHoverPreviews: true,
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: "{PLAYBACK_ID}",
  type: "video/mux",
});
</script>
```

## 4. Using signed URLs

Playback of Mux videos with a signed playback policy is now supported from v0.3.0 onward.

Before continuing, ensure you have followed the [secure video playback](/docs/guides/secure-video-playback) guide, and are comfortable generating JSON Web Tokens (JWTs) to use with Mux.

### Play back a video with a signed URL

Use the `playback_id` from your video in Mux and then append your video JWT token
like this `{PLAYBACK_ID}?token={YOUR_VIDEO_JWT}` as your player source. When you set the source for video.js, make sure you set the `type` as `video/mux`.

In HTML:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%" data-setup="{}">
  <source src="{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}" type="video/mux" />
</video>

```

Or, achieve the same result using JavaScript:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: `{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}`,
  type: `video/mux`,
});

</script>
```

### Enable a timeline hover preview from a signed URL

Mux requires a separate JWT token to access the timeline hover preview storyboard URL. Unlike with public playback URLs, this isn't something Video.js kit can figure out automatically,
so we require the fully formed URL to be passed to the player.

To set up timeline hover previews with a signed URL, first make sure that `timelineHoverPreviews` is set to `false` or not set at all, which stops the automatic URL generation from taking place.
Then, either set the `timelineHoverPreviewsUrl` in the player configuration like this:

```html
<video id="my-player" 
  class="video-js vjs-16-9" 
  controls 
  preload="auto" 
  width="100%" 
  data-setup='{
    "timelineHoverPreviewsUrl": "https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?token={JWT_STORYBOARD_TOKEN}"
  }'>
  <source src="{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}" type="video/mux" />
</video>

```

Or, achieve the same result using JavaScript and use the `player.timelineHoverPreviews()` function:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: `{PLAYBACK_ID}?token={JWT_VIDEO_TOKEN}`,
  type: `video/mux`,
});

player.timelineHoverPreviews({
  enabled: true, 
  src: "https://image.mux.com/{PLAYBACK_ID}/storyboard.vtt?token={JWT_STORYBOARD_TOKEN}"
});

</script>
```

`player.timelineHoverPreviews` is a function that can be used to set, update or remove timeline hover previews from a player, and takes a single object parameter.

The object has two properties, `src` which should be a string pointing to the VTT file which contains the timeline hover previews information, and `enabled` which can be
either `true` for the player to attempt to use the provided source and setup the timeline hover previews, or `false` which will disable any timeline hover previews which are
currently configured on the player.

### Remove timeline hover previews

To switch off timeline hover previews, you can use the following API:

```js

player.timelineHoverPreviews({
  enabled: false, 
});
```

## 5. Enable a Quality Selector

As of [version v0.10.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.10.0), Video.js kit comes bundled with the [`videojs-contrib-quality-levels`][videojs-contrib-quality-levels] and [`videojs-http-source-selector`][videojs-http-source-selector] plugins. They are not enabled by default.

To enable them, you can pass it in as part of the plugins object:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%"
  data-setup='{
    "plugins": {
      "mux": {
        "debug": true,
        "data":{
          "env_key": "ENV_KEY",
          "video_title": "Example Title"
        }
      },
      "httpSourceSelector": {}
    }
  }'
>
  <source src="{PLAYBACK_ID}" type="video/mux" />
</video>
```

Or, call the `httpSourceSelector` function manually on the player:

```html
<video id="my-player" class="video-js vjs-16-9" controls preload="auto" width="100%">
</video>

<script>
const player = videojs('my-player', {
  timelineHoverPreviews: true,
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY',
        video_title: 'Example Title'
      }
    }
  }
});

player.src({
  src: "{PLAYBACK_ID}",
  type: "video/mux",
});

// enable the quality selector plugin
player.httpSourceSelector();
</script>
```

## 6. Configure webpack for other plugins

If you want to use another plugin but find that it isn't loading in Video.js when your page loads, you'll need to configure Webpack (or another bundler) specially.
This is due to the internals of how Video.js and Video.js kit are built. By default, Video.js kit doesn't use the default Video.js build, but rather Video.js's `core.js` build. This means that Video.js plugins need to be configured to use the same build.
This can be done with [Webpack's `resolve.alias` configuration](https://webpack.js.org/configuration/resolve/#resolvealias):

```js
config.resolve = {
  alias: {
    'video.js': 'video.js/core',
  }
};
```

## 7. Release notes

### Current release: v0.11.0

[View v0.11.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.11.0)

This release enables configuring hls.js via Video.js options:

```js
videojs('mux-default', {
  html5: {
    hls: {
      capLevelToPlayerSize: true
    }
  }
});
```

For advanced use cases, the `hlsjs` instance is exposed on the tech.

### Previous releases

#### v0.10.0

[View v0.10.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.10.0)

* This release includes [`videojs-contrib-quality-levels`][videojs-contrib-quality-levels] and [`videojs-http-source-selector`][videojs-http-source-selector] by default.

#### v0.9.3

[View v0.9.3](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.3)

* Update Video.js version to v7.18.1.

#### v0.9.2

[View v0.9.2](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.2)

* Update hls.js to [v1.1.5](https://github.com/video-dev/hls.js/releases/tag/v1.1.5)

#### v0.9.1

[View v0.9.1](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.1)

* As part of 0.7.0, tighter error handling integration with hls.js made all errors be triggered on the player. This meant that errors that don't inhibit playback and that hls.js handled automatically were treated the same as fatal errors that hls.js doesn't handle automatically. Now, only errors that hls.js considers fatal will trigger an error event.

#### v0.9.0

[View v0.9.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.9.0)

* Integrate with [`videojs-contrib-quality-levels`](https://github.com/videojs/videojs-contrib-quality-levels).

#### v0.8.0

[View v0.8.0](https://github.com/muxinc/videojs-mux-kit/releases/tag/v0.8.0)

* Introduce VHS build file

Check the [GitHub releases](https://github.com/muxinc/videojs-mux-kit/releases) page for the full version history.

[videojs-http-source-selector]: https://github.com/jfujita/videojs-http-source-selector

[videojs-contrib-quality-levels]: https://github.com/videojs/videojs-contrib-quality-levels


# Modify playback behavior
Use playback modifiers in your HLS urls to change the default playback behavior.
Playback modifiers are optional parameters added to the video playback URL. These modifiers allow you to change the behavior of the stream you receive from Mux.

Mux Video supports two types of playback policy: `public` and `signed`. Playback modifiers are supported for both, but the method to add them differs.

# Query String with `public` playback URL

```text
https://stream.mux.com/{PLAYBACK_ID}.m3u8?{MODIFIER_NAME}={MODIFIER_VALUE}
```

Replace `PLAYBACK_ID` with your asset's public policy playback ID. Replace `MODIFIER_NAME` and `MODIFIER_VALUE` with any of the supported modifiers listed below in this document.

# JWT Claim with `signed` playback URL

```text
https://stream.mux.com/{PLAYBACK_ID}.m3u8?token={TOKEN}
```

Replace `PLAYBACK_ID` with your asset's signed policy playback ID and `TOKEN` with the signature generated. Add modifiers to your claims body in the JWT payload. View the guide for [Secure video playback](/docs/guides/secure-video-playback#note-on-query-parameters-after-signing) for details about adding query parameters to signed tokens.

# Available playback modifiers

| Modifier | Available Values | Default Value | Description |
| :-- | :-- | :-- | :-- |
| `redundant_streams` | `true`, `false` | `false` | Includes redundant streams in the HLS manifest. See the [Play your videos](/docs/guides/play-your-videos#add-delivery-redundancy-with-redundant-streams) guide. |
| `roku_trick_play` | `true`, `false` | `false` | Adds support for timeline hover previews on Roku devices. See the [Create timeline hover previews](/docs/guides/create-timeline-hover-previews#roku-trick-play) guide. |
| `default_subtitles_lang` | A BCP 47 compliant language code | none | Sets which subtitles/captions language should be the default. See the [Add subtitles to your videos](/docs/guides/add-subtitles-to-your-videos#showing-subtitles-by-default) guide. |
| `max_resolution`| `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, `2160p` | none | Sets the maximum resolution of renditions included in the manifest. See the [Control playback resolution](/docs/guides/control-playback-resolution#specify-maximum-resolution) guide.|
| `min_resolution`| `270p`, `360p`, `480p`, `540p`, `720p`, `1080p`, `1440p`, `2160p` | none | Sets the minimum resolution of renditions included in the manifest. See the [Control playback resolution](/docs/guides/control-playback-resolution#specify-minimum-resolution) guide. |
| `rendition_order`| `desc` | Automatically ordered by Mux's internal logic. | Sets the logic to order renditions by in the HLS manifest. See [the blog post.](https://www.mux.com/blog/more-tools-to-control-playback-behavior-min-resolution-and-rendition-order#rendition_order)|
| `program_start_time` | An epoch timestamp | none | Sets the start time when using the [instant clipping feature](/docs/guides/create-instant-clips) with a live stream or an asset created from a live stream. |
| `program_end_time` | An epoch timestamp | none | Sets the end time when using the [instant clipping feature](/docs/guides/create-instant-clips) with a live stream or an asset created from a live stream. |
| `asset_start_time` | Time (in seconds) | none | Sets the relative start time of the asset when using the [instant clipping feature](/docs/guides/create-instant-clips). |
| `asset_end_time` | Time (in seconds) | none | Sets the relative end time of the asset when using the [instant clipping feature](/docs/guides/create-instant-clips). |
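For `public` playback IDs, the modifiers above are plain query parameters, so building a playback URL is straightforward. A minimal sketch (the playback ID is a placeholder; modifier names come from the table above):

```javascript
// Append playback modifiers to a public Mux HLS playback URL.
function playbackUrl(playbackId, modifiers = {}) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  for (const [name, value] of Object.entries(modifiers)) {
    url.searchParams.set(name, String(value));
  }
  return url.toString();
}

const src = playbackUrl('PLAYBACK_ID', {
  max_resolution: '720p',
  rendition_order: 'desc',
});
```

Remember that for `signed` playback IDs, these values go in the JWT claims instead of the query string.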


# Add metadata to your videos
Learn how to add titles and other metadata to your videos for better organization,
  discoverability, and actionable analytics.
## What is asset metadata?

Metadata provides additional descriptive information about your video assets. Mux currently supports three key optional metadata fields that help you organize and manage your video content across the API and dashboard:

* `title`: A descriptive name for your video content. We limit this to 512 code points.
* `creator_id`: A value you set to identify the creator or owner of the video. We limit this to 128 code points.
* `external_id`: Another value you set to reference this asset in your system, such as the video ID in your database. We limit this to 128 code points.

<Callout id="code-points" type="info">
  **What is a code point?** Many of us use the term "characters" when referring to letters in a string, but when storing those characters, some cost more than others. Each unit of storage is called a "code point". <br /><br />While each ASCII character can be stored as a single code point, some Unicode characters, such as a decomposed `é`, are stored as two code points: one for the `e` and one for the combining accent. You can easily test this in JavaScript, where `"é".length` (for the decomposed form) will be `2`.
</Callout>
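If you want to check a value against these limits before sending it to the API, you can count code points in JavaScript by spreading the string, since string iteration yields one entry per code point. A small sketch (the `512` below is the `title` limit from the list above):

```javascript
// Count code points, which is what Mux's metadata limits apply to.
function codePointLength(str) {
  return [...str].length; // string iteration yields one entry per code point
}

// "é" written as e + combining accent is two code points.
const decomposed = 'e\u0301';
const withinTitleLimit = codePointLength('My video title') <= 512;
```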

Here's an example of what a `meta` object might look like:

```json
{
   "title": "Guide: Adding metadata to videos",
   "creator_id": "user_23456",
   "external_id": "cdef2345"
}
```

<Callout type="info">
  **Note:** Do not include personally identifiable information in these fields. They will be accessible by browsers to display player UI.
</Callout>

Once set on an asset, you'll find this metadata on assets across the Mux API and dashboard, including asset management, [engagement](/beta/engagement) and [data](/data).

## Manage metadata through the Dashboard

We've deeply integrated asset metadata throughout the Mux dashboard:

<Player playbackId="trRCuyNyUHeYdQ5ZbvSsRf34Reuc301CDQzAxDUqog1w" thumbnailTime="10" title="Asset metadata demo" className="flex" />

* When uploading, we use your filename as the title, but you can change it at any time
* For live streams, you can set the default metadata for recordings on the stream details page
* When viewers watch your content, all metadata flows into Mux Data and the engagement dashboard, making it easy to find videos by title or filter by creator ID

## Manage metadata through the API

## Create an asset with metadata

When creating an asset you can include your metadata in the body of the request.

#### Example request

```json
// POST /video/v1/assets
{
    "inputs": [
        {
            "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
        }
    ],
    "playback_policies": [
        "public"
    ],
    "video_quality": "basic",
    "meta": {
        "title": "Mux demo video",
        "creator_id": "abcd1234",
        "external_id": "bcde2345"
    }
}
```
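To send that request from a server, you'd POST the JSON body above to the Mux API with Basic auth built from your token ID and secret. This sketch only assembles the request pieces without making a network call; the token values are placeholders for your own access token pair.

```javascript
// Build the pieces of a "create asset with metadata" request.
// MUX_TOKEN_ID / MUX_TOKEN_SECRET are placeholders for your token pair.
function buildCreateAssetRequest(tokenId, tokenSecret, meta) {
  return {
    url: 'https://api.mux.com/video/v1/assets',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Basic auth: base64-encoded "tokenId:tokenSecret"
      Authorization: 'Basic ' + Buffer.from(`${tokenId}:${tokenSecret}`).toString('base64'),
    },
    body: JSON.stringify({
      inputs: [{ url: 'https://storage.googleapis.com/muxdemofiles/mux.mp4' }],
      playback_policies: ['public'],
      video_quality: 'basic',
      meta,
    }),
  };
}

const req = buildCreateAssetRequest('MUX_TOKEN_ID', 'MUX_TOKEN_SECRET', {
  title: 'Mux demo video',
  creator_id: 'abcd1234',
  external_id: 'bcde2345',
});
```

You could pass the result straight to `fetch(req.url, req)` from a trusted server.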

#### Need more help?

* Check out our [getting started guide](/docs/core/stream-video-files) for a more thorough introduction to creating assets.
* Check out our <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Create an asset</ApiRefLink> for a list of all possible parameters.

## Update the metadata on an asset

Once an asset has been created the metadata can be changed at any time. Make a request to update the asset and include your metadata in the request body.

#### Example request

```json
// PATCH /video/v1/assets/{ASSET_ID}
{
    "meta": {
        "title": "Updated Mux demo video",
        "creator_id": "cdef3456",
        "external_id": "defg4567"
    }
}
```

#### Need more help?

* Check out our <ApiRefLink href="/docs/api-reference/video/assets/update-asset">Update asset API reference</ApiRefLink> for more details.

## Directly upload a video with metadata

Direct uploads are a [multi-step process](/docs/guides/upload-files-directly), and metadata should be attached in the very first step. When creating your authenticated URL in that first step you can include your metadata alongside the rest of the asset settings in `new_asset_settings`.

#### Example Request

```json
// POST /video/v1/uploads
{
    "new_asset_settings": {
        "playback_policies": [
            "public"
        ],
        "video_quality": "basic",
        "meta": {
            "title": "Mux demo video",
            "creator_id": "abcd1234",
            "external_id": "bcde2345"
        }
    },
    "cors_origin": "*"
}
```

#### Need more help?

* Check out our [direct upload guide](/docs/guides/upload-files-directly) for details on every step.

## Set live stream metadata defaults for creating assets

Mux automatically creates a new asset each time you connect to a live stream. When creating or updating your live stream, you can include metadata in the request body that gets automatically set on the generated assets.

#### Example "Create Live Stream" request

```json
// POST /video/v1/live-streams
{
    "playback_policies": [
        "public"
    ],
    "new_asset_settings": {
        "playback_policies": [
            "public"
        ]
    },
    "meta": {
        "title": "Mux demo live stream recording",
        "creator_id": "abcd1234",
        "external_id": "bcde2345"
    }
}
```

#### Need more help?

* Check out our "[start live streaming](/docs/guides/start-live-streaming)" guide for a deeper walkthrough.


# Add auto-generated captions to your videos and use transcripts
Learn how to add auto-generated captions to your on-demand Mux Video assets, to increase accessibility and to create transcripts for further processing.
## How auto-generated captions work

Mux uses [OpenAI's Whisper model](https://openai.com/index/whisper) to automatically generate captions for on-demand assets. This guide shows you how to enable this feature, what you can do with it, and some of the limitations you might encounter.

Generally, you should expect auto-generated captions to work well for content with reasonably clear audio. It may work less well with assets that contain a lot of non-speech audio (music, background noise, extended periods of silence).

We recommend that you try it out on some of your typical content, and see if the results meet your expectations.

This feature is designed to generate captions in the same language that your content's audio is produced in. It should not be used to programmatically generate translated captions in other languages.

## Enable auto-generated captions

When you <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create a Mux Asset</ApiRefLink>, you can add a `generated_subtitles` array to the API call, as follows:

```json
// POST /video/v1/assets
{
    "inputs": [
        {
            "url": "...",
            "generated_subtitles": [
                {
                    "language_code": "en",
                    "name": "English CC"
                }
            ]
        }
    ],
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic"
}
```

Mux supports the following languages and corresponding language codes for VOD generated captions. Languages labeled as "beta" may have lower accuracy.

| Language | Language Code | Status |
| :-- | :-- | :-- |
| English | en | Stable |
| Spanish | es | Stable |
| Italian | it | Stable |
| Portuguese | pt | Stable |
| German | de | Stable |
| French | fr | Stable |
| Automatic Detection | auto | Stable |
| Polish | pl | Beta |
| Russian | ru | Beta |
| Dutch | nl | Beta |
| Catalan | ca | Beta |
| Turkish | tr | Beta |
| Swedish | sv | Beta |
| Ukrainian | uk | Beta |
| Norwegian | no | Beta |
| Finnish | fi | Beta |
| Slovak | sk | Beta |
| Greek | el | Beta |
| Czech | cs | Beta |
| Croatian | hr | Beta |
| Danish | da | Beta |
| Romanian | ro | Beta |
| Bulgarian | bg | Beta |

You can also enable auto-generated captions if you're <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">using Direct Uploads</ApiRefLink> by specifying the `generated_subtitles` configuration in the first entry of the `inputs` list of the `new_asset_settings` object, like this:

```json
// POST /video/v1/uploads
{
    "new_asset_settings": {
        "playback_policies": [
            "public"
        ],
        "video_quality": "basic",
        "inputs": [
            {
                "generated_subtitles": [
                    {
                        "language_code": "en",
                        "name": "English CC"
                    }
                ]
            }
        ]
    },
    "cors_origin": "*"
}
```

Auto-captioning happens separately from the initial asset ingest, so that this doesn't delay the asset being available for playback. If you want to know when the text track for the captions is ready, listen for the `video.asset.track.ready` webhook for a track with `"text_source": "generated_vod"`.
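
As a sketch, the webhook filtering described above might look like this; the event shape follows the `video.asset.track.ready` webhook payload, and how you receive and route events is up to your application:

```python
# Sketch: pick out the generated-captions track from an incoming Mux
# webhook payload. Wire this into your own HTTP handler.
def generated_captions_track(event: dict):
    """Return (asset_id, track_id) if this event signals that an
    auto-generated captions track is ready, else None."""
    if event.get("type") != "video.asset.track.ready":
        return None
    data = event.get("data", {})
    if data.get("text_source") != "generated_vod":
        return None
    return data.get("asset_id"), data.get("id")
```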

### Retroactively enable auto-generated captions

You can retroactively add captions to any asset by POSTing to the `generate-subtitles` endpoint on the asset audio track that you want to generate captions for, as shown below:

```json
// POST /video/v1/assets/${ASSET_ID}/tracks/${AUDIO_TRACK_ID}/generate-subtitles

{
  "generated_subtitles": [
    {
      "language_code": "en",
      "name": "English (generated)"
    }
  ]
}
```

**For self-service customers:** You can add captions to any asset using this API.

**For contract customers:** If you need to add captions to assets older than 7 days, [please contact support](/support/human) and we'd be happy to help. Please note that there may be a charge for backfilling captions onto large libraries.
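
As a minimal sketch, the retroactive captioning request above can be assembled like this; sending it with your HTTP client, using your Token ID and Secret as HTTP Basic auth, is left out:

```python
# Sketch: build the URL and payload for the generate-subtitles endpoint.
# POST the payload to the URL with your Mux access token credentials.
def generate_subtitles_request(asset_id: str, audio_track_id: str,
                               language_code: str = "en",
                               name: str = "English (generated)"):
    url = (f"https://api.mux.com/video/v1/assets/{asset_id}"
           f"/tracks/{audio_track_id}/generate-subtitles")
    payload = {
        "generated_subtitles": [
            {"language_code": language_code, "name": name}
        ]
    }
    return url, payload
```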

## Retrieve a transcript

For assets that have a `ready` auto-generated captions track, you can also request a transcript (a plain text file) of the speech recognized in your asset.

To get this, use a playback ID for your asset and the track ID for the `generated_vod` text track:

<Callout>
  If you don't know the `TRACK_ID`, you can retrieve it by listing the asset's tracks using the{' '}
  <ApiRefLink href="/docs/api-reference/video/assets">Asset endpoint</ApiRefLink> under `tracks` and the corresponding `track.id`.
</Callout>

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.txt
```

Signed assets require a `token` parameter specifying a JWT with the same `aud` claim used for [video playback](/docs/guides/secure-video-playback#4-generate-a-json-web-token-jwt):

```
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.txt?token={JWT}
```

You can also retrieve a WebVTT version of the text track by replacing `.txt` with `.vtt` in the URL.
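
A small helper for building these transcript and caption URLs might look like this (a sketch; the `token` argument is only needed for signed playback IDs):

```python
from typing import Optional

# Sketch: build transcript (.txt) or WebVTT (.vtt) URLs for a ready
# generated captions track.
def transcript_url(playback_id: str, track_id: str,
                   fmt: str = "txt", token: Optional[str] = None) -> str:
    assert fmt in ("txt", "vtt")
    url = f"https://stream.mux.com/{playback_id}/text/{track_id}.{fmt}"
    if token:
        # Signed playback IDs need a JWT with the playback aud claim
        url += f"?token={token}"
    return url
```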

You might find this transcript useful for doing further processing in other systems. For example, content moderation, sentiment analysis, summarization, extracting insights from your content, and many more.

## FAQ

### How much does auto-generated captioning cost for on-demand assets?

There is no additional charge for this feature. It's included as part of the standard encoding and storage charges for Mux Video assets.

### How long does it take to generate captions?

It depends on the length of the asset, but generally it takes about 0.1x content duration. As an example, a 1 hour asset would take about 6 minutes to generate captions for.

### Help, the captions you generated are full of mistakes!

We're sorry to hear that! Unfortunately, though automatic speech recognition has improved enormously in recent years, sometimes it can still get things wrong.

One option you have is to edit and replace the mis-recognized speech in the captions track:

1. Download the full VTT file we generated at `https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt`
2. Edit the VTT file using your preferred text editor
3. Delete the autogenerated track with the <ApiRefLink href="/docs/api-reference/video/assets/delete-asset-track">`delete track` API</ApiRefLink>
4. Add a new track to your asset using the edited VTT file, using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">`create track` API</ApiRefLink>

### My content is in multiple languages

We currently do not recommend using this feature on mixed-language content.

### I want to generate captions in a different language to my content

We currently do not support automatic translation in generated captions; you should only generate captions in the language that matches your audio track.

### My content is in a language you don't support

We'd love to hear more about the languages that you'd like to see us support, please [reach out](/support) with details.


# Add subtitles/captions to videos
Learn how to add subtitles or captions to your videos for accessibility and multi-language support.
## Introduction to subtitles and captions

Subtitles and captions allow for text overlays on a video to be shown at a specified time. First, let's clarify these two terms which are often used interchangeably.

* **Subtitles** refers to text on screen for translation purposes.
* **Captions** refers to text on screen for use by deaf and hard of hearing audiences. If you see text like `[crowd cheers]`, you are seeing *captions* on your screen.

In any case, Mux supports both in the form of [WebVTT](https://www.w3.org/TR/webvtt1/) or [SRT](https://en.wikipedia.org/wiki/SubRip) and these files can be human or computer generated. From Mux's perspective these files are converted into "text tracks" associated with the asset. If the text track provided is *captions* then supply the attribute `closed_captions: true` when creating the text track.

The rest of this guide will use the term "subtitles" to refer to adding text tracks that can be either subtitles or captions.

## How to add subtitles to your video

You can add subtitles to any video asset in Mux. To add subtitles, you will need to provide either a `SRT` or `WebVTT` file containing the subtitle information to the Mux API.

Here's an example of what a WebVTT file looks like:

```text
WEBVTT

00:28.000 --> 00:30.000 position:90% align:right size:35%
...you have your robotics, and I
just want to be awesome in space.

00:31.000 --> 00:33.000 position:90% align:right size:35%
Why don't you just admit that
you're freaked out by my robot hand?
```

<Callout type="info">
  Mux can also [automatically generate your captions](/docs/guides/add-autogenerated-captions-and-use-transcripts)
</Callout>

## Create a subtitle track

When you <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create an asset</ApiRefLink> in Mux, you can also include text tracks as part of the input. There's no limit on the number of tracks you can include when you make the request.

The first input in your array of inputs must be the video file. After that, append the caption tracks to the list, each including the source URL of the caption file plus additional metadata. Here's an example of that order:

```json
{
    "inputs": [
      {
        "url": "{VIDEO_INPUT_URL}"
      },
      {
        "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-en.vtt",
        "type": "text",
        "text_type": "subtitles",
        "closed_captions": false,
        "language_code": "en",
        "name": "English"
      },
      {
        "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-fr.vtt",
        "type": "text",
        "text_type": "subtitles",
        "closed_captions": false,
        "language_code": "fr",
        "name": "Français"
      }
    ],
    "playback_policies": [
      "public"
    ],
    "video_quality": "basic"
}
```

This will enable WebVTT subtitles in the stream URL, which can then be used by many different players.

You can also add text tracks using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">create asset track API</ApiRefLink>. This can be helpful for adding captions to live stream recordings once they have finished, or if you need to update or remove additional languages for a video after it was first added to Mux. Assets must be in the `ready` state before you can use the create asset track API to add a text track.

## Showing subtitles by default

To show subtitles by default, you can include an additional playback modifier with the HLS stream request like this:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?default_subtitles_lang=en
```

The `default_subtitles_lang` playback modifier requires a valid [BCP-47](https://tools.ietf.org/rfc/bcp/bcp47.txt) language value, and sets the DEFAULT attribute to YES for that language's text track.
If there's no exact language match, the closest match in the same language is selected.

For instance, a subtitles text track with language `en-US` is selected for `default_subtitles_lang=en`. This accommodates regional variations and gives you more flexibility.
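
To picture the matching rule, here's an illustrative sketch only (not Mux's actual selection code): prefer an exact BCP-47 match, then fall back to any track sharing the primary language subtag.

```python
# Illustrative sketch of closest-match default subtitle selection.
def pick_default_track(requested: str, track_langs: list):
    if requested in track_langs:
        return requested  # exact BCP-47 match
    primary = requested.split("-")[0].lower()
    for lang in track_langs:
        # fall back to a regional variant of the same language
        if lang.split("-")[0].lower() == primary:
            return lang
    return None
```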

Video players will show the default text track for autoplaying videos, even when they're muted.

<Callout type="warning" title="Using `default_subtitles_lang` with signed URLs">
  If you are using [signed playback URLs](/docs/guides/secure-video-playback) make sure you include the extra parameter in your signed token.
</Callout>

## Accessibility

The [A11Y project](https://www.a11yproject.com/) is a community-driven effort to make digital accessibility easier and includes checking videos for accessibility.

With Mux videos, the `jsx-a11y/media-has-caption` rule fails because it looks for a `<track>` element on the player. However, Mux videos include subtitles within the HLS manifest when you request the stream.
If you have added text tracks to your Mux videos, you can safely disable this linting rule and still provide accessible video.

## Workflow for generating subtitles

You may want to generate subtitle tracks for your Mux assets. These might be machine-generated, or human-generated by yourself or a third party. Some example third-party services you might use to do this are [Rev.com](https://www.rev.com/) and [Simon Says](https://www.simonsays.ai/).

<Callout type="info">
  Mux can also [automatically generate your captions](/docs/guides/add-autogenerated-captions-and-use-transcripts)
</Callout>

Using static renditions and webhooks from Mux, your automated flow might look like this:

1. Create a Mux asset (either with a Direct Upload, an `input` parameter, or the recording of a live stream).
2. Enable `mp4_support` on your asset, either at asset creation time or after the asset has been created. See the [Download your videos guide](/docs/guides/download-your-videos) for details about how to do this.
3. Wait for the `video.asset.static_renditions.ready` webhook. This lets you know that the mp4 rendition(s) are now available.
4. Fire off a request to the 3rd party you are using for creating subtitles. You should pass along the mp4 file and get back either an SRT file or WebVTT file when the subtitle track is ready.
5. Wait for the subtitle track to be ready, when it is, make an API request to add this text track to your asset, as described above.
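
Step 3 of that workflow could be sketched like this; `submit_for_transcription` is a placeholder for whichever third-party service you use, and the `high.mp4` rendition name assumes standard `mp4_support`:

```python
# Sketch: react to the static renditions webhook by handing the MP4
# URL to a transcription service. The event shape and rendition name
# are assumptions; adapt them to your webhook payloads.
def handle_mux_event(event: dict, submit_for_transcription):
    if event.get("type") != "video.asset.static_renditions.ready":
        return None
    data = event.get("data", {})
    asset_id = data.get("id")
    playback_ids = data.get("playback_ids", [])
    if not playback_ids:
        return None
    mp4_url = f"https://stream.mux.com/{playback_ids[0]['id']}/high.mp4"
    submit_for_transcription(asset_id, mp4_url)  # your third party here
    return mp4_url
```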


# Add alternate audio tracks to videos
Learn how to use multi-track audio to add alternate audio tracks to your videos
## Introduction to multi-track audio

The multi-track audio feature allows you to add alternate audio tracks to the video assets in your Mux account.

Videos with multi-track audio can be used for increased accessibility or multi-language support, or just to allow viewers to opt into a different audio experience, like a director's commentary.

## (Optional) Set the language and name for your primary audio track

<Callout type="info">
  Optional but highly recommended to increase accessibility if you're delivering alternate audio tracks.
</Callout>

When you <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create an asset</ApiRefLink> in Mux, you can also specify the `language_code` and `name` of the primary audio track that's embedded in your first input file.

```json
// POST https://api.mux.com/video/v1/assets

{
  "inputs": [
    {
      "url": "{VIDEO_INPUT_URL}",
      "language_code" : "en",
      "name" : "English"
    }
  ],
  "playback_policies": [
    "public"
  ],
  "video_quality": "basic"
}
```

A `name` is optional but highly recommended. If you don't specify it, we'll generate it for you based on the `language_code` you provided. The `language_code` must be a [BCP-47 language tag](https://en.wikipedia.org/wiki/IETF_language_tag), such as `en` for English, or `es` for Spanish. You can find a list of common [BCP-47 language tags here](https://en.wikipedia.org/wiki/IETF_language_tag#List_of_common_primary_language_subtags).

You can still use multi-track audio with assets that don't have a language or name set on your initial upload; we'll just call your primary audio track "Default," with no language.

## Add alternate audio tracks to your asset

Once you've created your asset with a primary audio track, you can add alternate audio tracks using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-track">create asset track API</ApiRefLink>, specifying the URL of the audio file you wish to add, and the `language_code` of the alternate audio track. This is the same API that you can use to add captions to your assets.

Mux supports most audio file formats and codecs, such as M4A, WAV, or MP3, but for the fastest processing you should [use standard inputs wherever possible](/docs/guides/minimize-processing-time).

```json
// POST https://api.mux.com/video/v1/assets/${ASSET_ID}/tracks

{
  "url": "https://example.com/bar.m4a",
  "type": "audio",
  "language_code": "fr",
  "name": "Français"
}
```

Assets must be in the `ready` state before you can use the create asset track API to add the alternate audio track.

You always need to specify the `language_code` for an alternate audio track, but the `name` is optional. If you don't specify a `name`, we'll generate it for you based on the language code you provided.

You will need to call the API once for each alternate audio track that you want to add.
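
As a sketch, you could assemble one request per alternate track like this; the track list is a placeholder, and actually POSTing each payload with your Token ID and Secret is left to your HTTP client:

```python
# Sketch: build one create-track payload per alternate audio track.
def audio_track_requests(asset_id: str, tracks: list):
    url = f"https://api.mux.com/video/v1/assets/{asset_id}/tracks"
    payloads = []
    for t in tracks:
        payload = {
            "url": t["url"],
            "type": "audio",
            "language_code": t["language_code"],
        }
        if t.get("name"):
            payload["name"] = t["name"]  # optional; Mux derives it if omitted
        payloads.append(payload)
    return url, payloads
```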

## Play your videos with multi-track audio

When the alternate audio track has been processed, Mux will automatically add it to the HLS playback URL for your asset.

Many video players already support multi-track audio right out of the box, including [Mux Player](/docs/guides/mux-player-web), Video.js, ExoPlayer, and AVPlayer. So just drop your usual playback URL into your favorite video player, and click play. If your player doesn't support multi-track audio, you'll just hear the primary audio track.

Switching between audio tracks differs between video players, but it's usually done from a menu in the bottom right of the player. In Mux Player, shown below, click the waveform icon.

<Player playbackId="3x5wDUHxkd8NkEfspLUK3OpSQEJe3pom" thumbnailTime="0" title="Multi-track audio demo" />


# Introduction to video clipping
Overview of Mux's video clipping features, comparing instant clipping and asset-based clipping, with guidance on when to use each.
## Clipping video with Mux

Mux provides two approaches for creating clips from your video content: **Instant Clipping** and **Asset-Based Clipping**. Each method is designed for different use cases, offering flexibility depending on your needs for speed, accuracy, and workflow.

### Instant Clipping

[Instant Clipping](/docs/guides/create-instant-clips) allows you to create clips instantly by specifying start and end times directly in the playback URL (using query parameters or JWT claims). This approach does not require re-encoding or creating a new asset, so clips are available immediately and at no extra encoding cost. Instant clipping operates at the segment level, so clips may start or end a few seconds outside your requested range, depending on the stream's segment duration.

#### Key features:

* No additional encoding or asset creation required
* Clips are available instantly
* No extra cost for encoding or storage
* Segment-level accuracy (not frame-accurate)
* Ideal for live streams and quick highlight creation

**Learn more:** [Create instant clips](/docs/guides/create-instant-clips)

### Asset-Based Clipping

[Asset-Based Clipping](/docs/guides/create-clips-from-your-videos) creates a new, standalone asset from a portion of an existing video or live stream recording. This method involves a re-encoding process, resulting in a new asset with its own playback ID, and supports frame-accurate clipping. Asset-based clips incur encoding and storage costs and may take some time to process before they are ready for playback.

#### Key features:

* Creates a new asset with its own playback ID
* Frame-accurate clipping
* Supports additional features like watermarks and text tracks
* Incurs encoding and storage costs
* Suitable for polished, distributable clips or downloadable MP4s

**Learn more:** [Create asset-based clips](/docs/guides/create-clips-from-your-videos)

## Which clipping approach should you use?

Choose **Instant Clipping** if:

* You need clips to be available immediately
* You want to avoid extra encoding costs
* Segment-level accuracy is sufficient for your use case (e.g., live highlights, quick previews)
* You want to limit playback to a specific range without creating a new asset

Choose **Asset-Based Clipping** if:

* You require frame-accurate clips
* You need a new asset for distribution, download, or further processing
* You want to add watermarks or preserve text tracks in the clip
* You are willing to wait for processing and incur encoding/storage costs

| Feature | Instant Clipping | Asset-Based Clipping |
|---------|-----------------|---------------------|
| Availability | ✅ Immediate | ⏳ Requires processing |
| Frame Accuracy | ❌ Segment-level only | ✅ Frame-accurate |
| Additional Encoding Cost | ❌ No | ✅ Yes |
| Additional Storage Cost | ❌ No | ✅ Yes |
| Watermark Support | ❌ No | ✅ Yes |
| Text Track Support | ❌ No | ✅ Yes |
| Downloadable MP4s | ❌ No | ✅ Yes |
| Live Stream Support | ✅ Yes | ✅ Yes (recordings) |
| Unique Playback ID | ❌ No | ✅ Yes |

For a deeper dive into each approach, see the individual guides:

* [Create instant clips](/docs/guides/create-instant-clips)
* [Create asset-based clips](/docs/guides/create-clips-from-your-videos)


# Create instant clips
Learn how to create instant clips at no extra cost.
## Use cases for instant clipping

Instant clipping lets you set start and end times on the streaming URL to make clips that are instantly available, without the wait time or expense of creating a new asset. This feature can be used to build a variety of viewer-facing workflows.

<Callout type="info">
  If you require frame accurate clips, clipped masters, or clipped MP4s, you should use the [asset-based clipping feature](/docs/guides/create-clips-from-your-videos).
</Callout>

Here are examples of workflows that can be built with instant clipping:

### Pre-live workflows

Sometimes you need to connect your contribution encoder to a live stream and test that the video is working end-to-end before exposing the live stream to your audience. But when you have [DVR mode](/docs/guides/stream-recordings-of-live-streams) turned on for your stream, it's often necessary to prevent viewers from seeking back into the parts of the live stream where your announcers are saying "testing, testing, 1… 2… 3…".

Instant clipping can be used to specify a start time to allow playback of a live stream, stopping users from seeking back into the stream beyond where you want. You can also specify an end time if you're worried about extra content at the end of your live events.

### Post-live trimming without re-encoding

With our [asset-based clipping feature](/docs/guides/create-clips-from-your-videos) you're able to create clipped on-demand assets, which are shortened versions of a given asset - this is commonly called "top and tail editing". These assets always incur an encoding cost to process the clipped version, and can take some time to process.

With instant clipping, for any asset generated from a live stream, you can simply specify the start and end times of the content you want clipped directly during playback without the need for time-consuming and costly re-processing.

For example, if you broadcast multiple sports events back-to-back on a single live stream, you can use instant clipping to generate instant on-demand streams of each match as it ends for no extra cost.

### Highlight clips

Sometimes a really exciting moment happens on a live stream, and you want to clip out a short highlight for others to enjoy. You can use instant clipping to pull out short clips from a currently active asset for promoting on your homepage or embedding into news articles.

This can be used for example to instantly show just the 90th-minute equalizer goal on your home page while having extra time and penalties to watch live on your pay-to-view platform.

## How instant clipping works

### From a live stream

Every live stream or asset generated from a live stream contains a timestamp that is close (usually within a second) to the time that Mux received the source video from the contribution encoder. This timestamp is known as ["Program Date Time"](https://www.mux.com/video-glossary/pdt-program-date-and-time) or "PDT" for short.

<Callout type="info">
  "PDT" has nothing to do with the Pacific Daylight time zone; all times are represented in UTC or with unix timestamps.
</Callout>

Instant clipping works by trimming the HLS manifests from live streams and VOD assets originating from live streams using these PDT timestamps, without re-encoding any segments. This means that instant clipping operates at the segment level of accuracy, so you should expect that the content that you clip out may be several seconds longer than you've requested. We always make sure to include the timestamps that you request, but your content may start a few seconds earlier, and end a few seconds later. The exact accuracy depends on the latency settings of the live stream that you're clipping from.

### From a VOD asset

Regardless of whether an asset originated from a live stream or was uploaded, you can create instant clips using relative time markers for the start and end to generate the trimmed HLS manifest. The relative time markers are measured from the beginning of the asset, so specifying a range of `10` to `20` would result in a 10-second clip between `0:00:10` and `0:00:20`.

## Creating an instant clip URL

Instant clipping is controlled by passing [playback modifiers](/docs/guides/modify-playback-behavior) (query string arguments or JWT claims) to the playback URL of your live stream or VOD assets. If you're using signed URLs, these playback modifiers need to be embedded into your JWT.

### Live stream instant clips

While Mux timestamps video frames when they are received, there is a delay while enough frames are processed to form sufficient segments for a live stream to be started.

This means that you should expect some delay from wall-clock time to when you can use a given timestamp as a `program_start_time`.

For example, if a commentator presses a “Go Live” button at 13:00 UTC, which sets the `program_start_time` of a live stream to that timestamp, you should expect requests for the live stream's manifest to respond with an HTTP 412 error for up to 15 seconds afterwards (depending on the `latency_mode` of your live stream).
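
A client that tolerates this startup window might poll the manifest until the 412 clears. This is a sketch; `fetch_status` is a placeholder for your HTTP client:

```python
import time

# Sketch: poll the clipped manifest URL until it becomes available.
# A 412 means the requested program_start_time isn't streamable yet.
def wait_for_clip(url: str, fetch_status, retries: int = 10,
                  delay: float = 2.0) -> bool:
    for _ in range(retries):
        status = fetch_status(url)
        if status == 200:
            return True
        if status != 412:
            raise RuntimeError(f"unexpected status {status}")
        time.sleep(delay)
    return False
```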

The start and end time of your trimmed live stream or on-demand asset are specified by using the following two parameters:

#### Using `program_start_time`

This parameter accepts an epoch time and can be set on a playback URL, and sets the start time of the content within the live stream or asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?program_start_time=${EPOCH_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?program_start_time=1707740400
```

When used on a live stream, this will cause the live stream to behave as if it is idle prior to this time.

When used on an asset, this will trim the start of the streamed media to this timestamp if needed.

#### Using `program_end_time`

This parameter accepts an epoch time and can be set on a playback URL, and sets the end time of the content within the live stream or asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?program_end_time=${EPOCH_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?program_end_time=1707740460
```

When used on a live stream, this will cause the live stream to behave as if it is idle after this time.

When used on an asset, this will trim the end of the streamed media to this timestamp.

#### Combining `program_start_time` and `program_end_time`

These parameters can be used together to extract a specific clip of a live stream or asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?program_start_time=${EPOCH_TIME}&program_end_time=${EPOCH_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?program_start_time=1707740400&program_end_time=1707740460
```

### VOD instant clips

The start and end time of your trimmed on-demand asset are specified by using the following two parameters:

#### Using `asset_start_time`

This parameter accepts relative time and can be set on a Playback URL, and sets the start time of the content within the asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?asset_start_time=${RELATIVE_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?asset_start_time=10
```

#### Using `asset_end_time`

This parameter accepts relative time and can be set on a Playback URL, and sets the end time of the content within the asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?asset_end_time=${RELATIVE_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?asset_end_time=20
```

#### Combining `asset_start_time` and `asset_end_time`

You can also use both of these parameters to create an instant clip of a specific portion of your asset, for example:

```
# Format
https://stream.mux.com/${PLAYBACK_ID}.m3u8?asset_start_time=${RELATIVE_TIME}&asset_end_time=${RELATIVE_TIME}

# Example
https://stream.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq.m3u8?asset_start_time=10&asset_end_time=20
```

## Thumbnail & Storyboard support

### Images for VOD assets

To generate images for VOD assets, the `time` [query string parameter](/docs/guides/get-images-from-a-video#thumbnail-query-string-parameters) can be used to retrieve an image from the video, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/thumbnail.png?time=${RELATIVE_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/thumbnail.png?time=15
```

Storyboard generation for VOD assets supports these parameters as a way to generate storyboard tiles for frames between the `asset_start_time` and `asset_end_time` values, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/storyboard.png?asset_start_time=${RELATIVE_TIME}&asset_end_time=${RELATIVE_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.png?asset_start_time=10&asset_end_time=20
```

### Images for live streams

For thumbnails, you can now pass an absolute time using the `program_time` parameter, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/thumbnail.png?program_time=${EPOCH_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/thumbnail.png?program_time=1707740460
```

You can pass the same set of [playback modifiers](/docs/guides/modify-playback-behavior) (`program_start_time` and `program_end_time`) on a request for a storyboard and the storyboard will be trimmed appropriately, for example:

```
# Format
https://image.mux.com/${PLAYBACK_ID}/storyboard.png?program_start_time=${EPOCH_TIME}&program_end_time=${EPOCH_TIME}

# Example
https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.png?program_start_time=1707740400&program_end_time=1707740460
```

## Using instant clipping in Mux Player

We've also made sure it's easy to pass these parameters to Mux Player when you're using it for playback.

Instant clipping is supported in Mux Player through two paths:

### Using Public Playback IDs: Via extra source params

<Callout type="info">
  This feature was added in mux-player 2.3.0, but we recommend using the latest version at all times.
</Callout>

Here's an example of using the extra source params to pass the `asset_start_time` and `asset_end_time` parameters to mux-player, for both video delivery and storyboards:

```html
<mux-player
  playback-id="sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq"
  extra-source-params="asset_start_time=10&asset_end_time=20"
  metadata-video-title="Instant clipping demo (Public)"
  storyboard-src="https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.vtt?format=webp&asset_start_time=10&asset_end_time=20"
></mux-player>
```

The extra source params can also be used for instant clipping of live streams, again for both video and storyboards:

```html
<mux-player
  playback-id="sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq"
  extra-source-params="program_start_time=1707740400&program_end_time=1707740460"
  metadata-video-title="Instant clipping demo (Public)"
  storyboard-src="https://image.mux.com/sp9WNcgcktsmlvFLKgNm3jjSGRD00RPlq/storyboard.vtt?format=webp&program_start_time=1707740400&program_end_time=1707740460"
></mux-player>
```

### Via signed URLs

When using signed URLs, you must include the clipping parameters as [claims inside the respective JWTs](/docs/guides/modify-playback-behavior#jwt-claim-with-signed-playback-url) passed to Mux Player.

For the playback token and the storyboard token, the following parameters should be injected into the JWT claims:

* `asset_start_time` and/or `asset_end_time`
* `program_start_time` and/or `program_end_time`

For the thumbnail token, the `program_time` parameter should be injected into the JWT claim.
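
As a sketch, the claims for these tokens could be assembled like this before signing them with your Mux signing key (for example with PyJWT, using `RS256` and the signing key ID as the `kid` header). The `aud` values follow the playback (`v`), storyboard (`s`), and thumbnail (`t`) token types; for VOD clips, the `asset_start_time`/`asset_end_time` claims work the same way:

```python
import time
from typing import Optional

# Sketch: build JWT claims for signed instant-clipping URLs.
# Sign the returned dict with your signing key before use.
def clip_claims(playback_id: str, aud: str,
                start: Optional[int] = None, end: Optional[int] = None,
                ttl: int = 3600) -> dict:
    claims = {"sub": playback_id, "aud": aud,
              "exp": int(time.time()) + ttl}
    if aud in ("v", "s"):
        # playback and storyboard tokens take the range claims
        if start is not None:
            claims["program_start_time"] = start
        if end is not None:
            claims["program_end_time"] = end
    elif aud == "t" and start is not None:
        # thumbnail tokens take a single program_time claim
        claims["program_time"] = start
    return claims
```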

Then Mux Player can be loaded in the usual way, passing in the signed tokens:

```html
<mux-player
  playback-id="s6oiUXJ6W1JH02D9ThJZQtyg74ubYTiT7"
  playback-token="${PLAYBACK_TOKEN}"
  storyboard-token="${STORYBOARD_TOKEN}"
  thumbnail-token="${THUMBNAIL_TOKEN}"
  metadata-video-title="Instant clipping demo (Signed)"
></mux-player>
```

## Stream security considerations

We strongly recommend using this feature alongside [signed URLs](/docs/guides/secure-video-playback). When using this feature without signed URLs, it is possible for users to manipulate the manifest playback URL to expose parts of the media that you want to keep hidden.

## Choosing between asset clipping and instant clipping

Not sure if you should be [generating a new asset](/docs/guides/create-clips-from-your-videos) when clipping, or using instant clipping for your workflow? Here are some tips that can help you choose the right approach for your product.

Instant clipping is a great choice when:

* You require a clip to be instantly available
* You need the clips to not incur additional encoding fees
* You need to pre-emptively limit the availability of content to build pre-live workflows for live streaming

You should use our [asset-based clipping](/docs/guides/create-clips-from-your-videos) when:

* You require frame accuracy in your clips
* You require trimmed [MP4s](/docs/guides/enable-static-mp4-renditions) or [masters](/docs/guides/download-for-offline-editing)


# Create clips from your videos
Learn how to create clips from your video files or live stream event recordings.
To drive higher viewer engagement with the videos already on your service, you can create additional videos from your existing library or catalog. These videos could:

* Provide quick previews
* Highlight key moments
* Be polished versions of a live stream with the extra minutes trimmed from the beginning & end (aka preroll and postroll slates) for on-demand replays

Mux can help you quickly create these kinds of videos using the asset clipping functionality.

<Callout type="info">
  If you do not need frame-accurate clips, or you require immediate availability, the [instant clipping feature](/docs/guides/create-instant-clips) may meet your requirements.
</Callout>

## 1. Create a clip

When you [POST a new video](/docs/core/stream-video-files) or [start live streaming](/docs/guides/start-live-streaming), Mux creates a new asset for the video file or live stream event recording.
You can create a clip from an existing asset by making a <ApiRefLink href="/docs/api-reference/video/assets/create-asset">POST request to the /assets endpoint</ApiRefLink> and defining the `input` object's clipping parameters.

* `url` is defined using the `mux://assets/{asset_id}` template, where `asset_id` is the identifier of the source asset to create the clip from.
* `start_time` is the time offset in seconds from the beginning of the video, indicating the clip's start marker. The default value is 0 when not included.
* `end_time` is the time offset in seconds from the beginning of the video, indicating the clip's end marker. The default value is the duration of the video when not included.
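As an illustration, these parameters can be assembled into a request body with a small helper (hypothetical code; the field names match the create-asset request):

```python
def build_clip_request(source_asset_id, start_time=None, end_time=None,
                       playback_policy="public", video_quality="basic"):
    """Build a create-asset payload that clips an existing asset.

    When start_time or end_time is omitted, Mux defaults it to 0 or the
    source asset's duration, respectively, so we simply leave it out.
    """
    clip_input = {"url": f"mux://assets/{source_asset_id}"}
    if start_time is not None:
        clip_input["start_time"] = start_time
    if end_time is not None:
        clip_input["end_time"] = end_time
    return {
        "inputs": [clip_input],
        "playback_policies": [playback_policy],
        "video_quality": video_quality,
    }
```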

A request and response might look something like this:

### Example request

```bash
curl https://api.mux.com/video/v1/assets \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{
        "inputs": [
          {
            "url": "mux://assets/01itgOBvgjAbES7Inwvu4kEBtsQ44HFL6",
            "start_time": 10.0,
            "end_time": 51.10
          }
        ],
        "playback_policies": [
          "public"
        ],
        "video_quality" : "basic"
      }' \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}
```

### Example response

```json
{
  "data": {
    "status": "preparing",
    "playback_ids": [
      {
        "policy": "public",
        "id": "TXjw00EgPBPS6acv7gBUEJ14PEr5XNWOe"
      }
    ],
    "mp4_support": "none",
    "master_access": "none",
    "id": "kcP3wS3pKcEPywS5zjJk7Q1Clu99SS1O",
    "created_at": "1607876845",
    "video_quality" : "basic",
    "source_asset_id": "01itgOBvgjAbES7Inwvu4kEBtsQ44HFL6"
  }
}
```

Mux creates a new asset for the clip, and the response will include an **Asset ID** and a **Playback ID**.

* Asset IDs are used to manage assets using `api.mux.com` (e.g. to read or delete an asset).
* <ApiRefLink href="/docs/api-reference/video/playback-id">Playback IDs</ApiRefLink> are used to stream an asset to a video player through `stream.mux.com`. You can add multiple playback IDs to an asset to create playback URLs with different viewing permissions, and you can delete playback IDs to remove access without deleting the asset.
* `source_asset_id` is the video or live stream event recording asset used to create the clip. The `source_asset_id` can be useful for associating clips with the source video object in your CMS.

## 2. Wait for "ready" event

When the clip is ready for playback, the asset "status" changes to "ready".

The best way to know when that happens is via **webhooks**. Mux can send a webhook notification as soon as the asset is ready. See the [webhooks guide](/docs/core/listen-for-webhooks) for details.

If you can't use webhooks for some reason, you can manually **poll** the <ApiRefLink href="/docs/api-reference/video/assets/get-asset">asset API</ApiRefLink> to check the asset's status. Note that polling is only practical at low volume.

### Build your own request

<CodeExamples product="video" example="retrieveAsset" />

Please don't poll this API more than once per second.
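If you do poll, a sketch like the following respects that guidance (`fetch_status` is a hypothetical zero-argument callable wrapping a GET to the asset API and returning the `status` field):

```python
import time

def wait_for_ready(fetch_status, interval=1.0, timeout=600):
    """Poll an asset's status until it becomes "ready" (or errors out).

    fetch_status: a zero-argument callable returning the asset's current
    status string, e.g. wrapping GET /video/v1/assets/{ASSET_ID}.
    Returns True once ready, False if the timeout elapses first.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status == "ready":
            return True
        if status == "errored":
            raise RuntimeError("asset errored during preparation")
        time.sleep(interval)  # never poll faster than once per second
    return False
```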

## 3. Play your clip

To play back the video, create a playback URL using the `PLAYBACK_ID` you received when you created the clip.

```curl
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

struct ContentView: View {

    private let player = AVPlayer(
        url: URL.makePlaybackURL(
            playbackID: playbackID
        )
    )

    var body: some View {
        //  VideoPlayer comes from SwiftUI
        //  Alternatively, you can use AVPlayerLayer or AVPlayerViewController
        VideoPlayer(player: player)
            .onAppear() {
                player.play()
            }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

extension URL {
    static func makePlaybackURL(
        playbackID: String
    ) -> URL {
        guard let baseURL = URL(
            string: "https://stream.mux.com"
        ) else {
            preconditionFailure("Invalid base URL string")
        }

        guard let playbackURL = URL(
            string: "\(playbackID).m3u8",
            relativeTo: baseURL
        ) else {
            preconditionFailure("Invalid playback URL component")
        }

        return playbackURL
    }
}

```



See the [playback guide](/docs/guides/play-your-videos) for more information about how to integrate with a video player.

## FAQs

A few commonly asked questions:

### How many clips can be created from a single source asset?

Unlimited! Mux creates a new asset for each clip, so there is no limit to how many clips you can create from a single source.

### Is there a cost to create clips?

Each clip is a new asset and is considered an on-demand video. On-demand video pricing applies, including encoding, storage, and delivery usage.

### Can I use basic video quality on clips?

Yes! Clips can be created with either `basic` or `plus` video quality.

### Can I create clips when adding new video files?

Mux only allows creating clips from existing videos in your account. That means the clipping-specific parameters (`start_time` and `end_time`) in <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Asset Creation</ApiRefLink> only apply when `input.url` uses the `mux://assets/{asset_id}` format.

### Can I create clips from live streams?

Yes! Mux supports creating clips from the active asset being generated by a live stream while broadcasting. Keep in mind that the active asset is still growing during the broadcast, so if you don't provide `end_time`, it defaults to the end of the asset at the time the clip is created. For best results, always provide an `end_time` when clipping an active asset during a broadcast.

### My source asset has subtitles/captions text tracks. Will the clip have them?

Mux copies all the text tracks from the source asset to the new asset created for the clip. Mux also trims the text tracks to match the clip's start and end markers.

### What other data is copied from the source asset?

Mux copies the captions and watermark image from the source asset to the clips created. If your source asset does not have a watermark image and you want your clipped asset to have a watermark, pass it through in `overlay_settings`. See more details in the [watermark guide](/docs/guides/add-watermarks-to-your-videos).

All other fields, such as `passthrough`, are not copied over.

### What is the minimum duration for a clip?

Clips must have a duration of at least 500 milliseconds.


# Add watermarks to your videos
This guide will show how to add watermarks (overlays) to your videos. Watermarks can be added to assets, live streams and direct uploads.
A watermark is an image overlaid on a video, often used to brand a video or visually label a specific version of a video.

<Image src="/docs/images/watermark-img.jpg" width={1920} height={1080} />

You can add a watermark to your video using the `overlay_settings` in the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">asset creation API</ApiRefLink>. The first input in your array of inputs must be the video file you want to apply the watermark to, and the second should be the URL to the source watermark image along with placement details. Multiple watermarks are possible using additional inputs as described in our <ApiRefLink href="/docs/api-reference/video/assets/create-asset">API documentation for creating an asset</ApiRefLink>.

<Callout type="info">
  Valid file types for watermarks are `.png` and `.jpg`.

  Other file types such as `.gif`, `.webp`, and `.svg` are not supported at this time.
</Callout>

For a live stream, the `overlay_settings` must be embedded under the `input` array within `new_asset_settings` in the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">live stream creation API</ApiRefLink>, and the overlays will apply both to playback through the live stream's playback IDs *and* all assets created from the live stream. The watermark image will be retrieved from this URL at the start of each live stream, so you should make sure that the image will be available at that URL for as long as you plan to use the live stream.

```asset

{
  "inputs": [
    {
      "url": "https://muxed.s3.amazonaws.com/leds.mp4"
    },
    {
      "url": "https://muxed.s3.amazonaws.com/example-watermark.png",
      "overlay_settings": {
        "vertical_align": "bottom",
        "vertical_margin": "2%",
        "horizontal_align": "right",
        "horizontal_margin": "2%",
        "width": "25%",
        "opacity": "90%"
      }
    }
  ],
  "playback_policies": ["public"],
  "video_quality": "basic"
}

```

```direct_upload

{
  "cors_origin": "*",
  "new_asset_settings": {
    "playback_policies": ["public"],
    "video_quality": "basic",
    "inputs": [{
      "url": "https://muxed.s3.amazonaws.com/example-watermark.png",
      "overlay_settings": {
        "vertical_align": "bottom",
        "vertical_margin": "2%",
        "horizontal_align": "right",
        "horizontal_margin": "2%",
        "width": "25%",
        "opacity": "90%"
      }
    }]
  }
}

```

```live_stream

{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ],
    "inputs": [
      {
        "url": "https://muxed.s3.amazonaws.com/example-watermark.png",
        "overlay_settings": {
          "vertical_align": "bottom",
          "vertical_margin": "2%",
          "horizontal_align": "right",
          "horizontal_margin": "2%",
          "width": "25%",
          "opacity": "90%"
        }
      }
    ]
  }
}

```



## Positioning with percents vs. pixels

The overlay settings are designed to help you position and size a watermark consistently, no matter the size or shape of the input video. When setting the width, height, and margins, you can use either percents or pixels.

With percent values, the watermark `width` and `horizontal_margin` are relative to the width of the video, while the `height` and `vertical_margin` are relative to the height of the video. For example, if you set the watermark `horizontal_margin` to 10% for a video that is 1920 pixels wide, the watermark will be 192 pixels from the edge.
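To make the percent math concrete, here is a tiny sketch of how a percent value resolves against a video dimension:

```python
def percent_to_pixels(percent_value, video_dimension):
    """Resolve a percent-based overlay setting against a video dimension.

    width and horizontal_margin resolve against the video's width;
    height and vertical_margin resolve against its height.
    """
    fraction = float(percent_value.rstrip("%")) / 100.0
    return round(video_dimension * fraction)
```

For a 10% `horizontal_margin` on a 1920-pixel-wide video, `percent_to_pixels("10%", 1920)` gives 192 pixels.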

```json
{
  "inputs": [
    {
      "url": "{VIDEO_INPUT_URL}"
    },
    {
      "url": "{WATERMARK_URL}",
      "overlay_settings": {
        "vertical_align": "top",
        "vertical_margin": "10%",
        "horizontal_align": "left",
        "horizontal_margin": "10%"
      }
    }
  ],
  "playback_policies": ["public"]
}
```

<Image src="/docs/images/watermark-percent.png" width={1280} height={1486} />

While the result of using percents is probably easiest to understand, the one shortcoming is positioning a watermark with an exact margin. For example you may want your horizontal and vertical margins to be equal, or for there to be the same exact horizontal margin for vertical videos as with horizontal videos. Both of those examples can be a challenge with percents, where the actual result can be different depending on the width and height of the video.

Setting margins with pixels allows you to get exact with your margins, widths, and heights. However, you can't always control the size of the input video, and a watermark that is 80px wide would look very different on a video that is 960 pixels wide compared to a video that is 1920 pixels wide. For that reason, when you use pixel values in your overlay settings they will always be applied as if the video is first scaled to fit 1920x1080 for horizontal videos or 1080x1920 for vertical videos. So in the previous example, the watermark would be 80px wide on the 1920px wide video, and 40px wide on the 960px wide video.
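That scaling rule can be sketched as follows: pixel values are interpreted against a reference canvas of 1920x1080 (horizontal) or 1080x1920 (vertical) and scaled to the actual video width. This is a simplified model of the behavior described above, not the exact implementation:

```python
def effective_pixels(pixel_value, actual_width, actual_height):
    """Scale a pixel-based overlay value from the reference canvas.

    Pixel values are applied as if the video were scaled to fit
    1920x1080 (horizontal) or 1080x1920 (vertical), so the watermark
    looks the same regardless of the source resolution.
    """
    reference_width = 1920 if actual_width >= actual_height else 1080
    px = float(str(pixel_value).rstrip("px"))
    return round(px * actual_width / reference_width)
```

So `effective_pixels("80px", 1920, 1080)` is 80, while `effective_pixels("80px", 960, 540)` is 40, matching the example above.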

```json
{
  "inputs": [
    {
      "url": "{INPUT_URL}"
    },
    {
      "url": "{WATERMARK_URL}",
      "overlay_settings": {
        "width": "80px",
        "vertical_align": "top",
        "vertical_margin": "40px",
        "horizontal_align": "left",
        "horizontal_margin": "40px"
      }
    }
  ],
  "playback_policies": ["public"]
}
```

<Image src="/docs/images/watermark-pixel.png" width={1330} height={1550} />

The reason behind this is that your watermark should look the same no matter what the original size of the input video, and videos are most often scaled to fit the player window or the screen of the device.

## Center a watermark

<Image src="/docs/images/watermark-center.jpg" width={1920} height={1080} />

To center a watermark on the video, simply set `vertical_align` to "middle" and `horizontal_align` to "center".

```json
{
  "inputs": [
    {
      "url": "{INPUT_URL}"
    },
    {
      "url": "{WATERMARK_URL}",
      "overlay_settings": {
        "vertical_align": "middle",
        "horizontal_align": "center"
      }
    }
  ],
  "playback_policies": ["public"]
}
```


# Adjust audio levels
This guide will show how to adjust the audio level to your videos. Audio normalization can be added to on-demand assets.
## What is audio normalization?

Here at Mux, when we refer to audio normalization, we mean loudness normalization. Loudness normalization adjusts the recording based on perceived loudness.

Below is an audio stream *before* normalization:

<Image src="/docs/images/audio-norm-before.png" width={640} height={200} alt="Audio norm before" />

The same audio stream *after* normalization:

<Image src="/docs/images/audio-norm-after.png" width={640} height={200} alt="Audio norm after" />

LUFS, which stands for Loudness Units relative to Full Scale, is a measurement of loudness over the entire length of an audio stream. This is the measurement used in the normalization process.
The goal of normalizing is to apply the gain needed to bring the average loudness to a target level: the "norm".

## When to use audio normalization

The main use of audio normalization is to standardize the perceived loudness across your assets. Whether to use normalization at all depends on the content.
Note, however, that like any other audio or video processing, normalization will change your audio in some way, so make an informed decision about whether to enable this feature.

## How to turn on audio normalization

Currently, the only way to enable audio normalization on a Mux asset is through the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">create asset endpoint</ApiRefLink>; you cannot update it after the asset has been created. This option applies only to on-demand assets (including audio-only assets), not live streams.

To enable audio normalization on your asset ingest, set the `normalize_audio` key to `true` in the body of your asset creation. By default, this boolean is set to false.

A typical request and response might look something like this:

### Example request

```bash
curl https://api.mux.com/video/v1/assets \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{
        "inputs": [
          {
            "url": "https://example.com/myVideo.mp4"
          }
        ],
        "playback_policies": ["public"],
        "video_quality": "basic"
        "normalize_audio": true 
    }' \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}
```

### Example response

```json
{
    "data": {
        "status": "preparing",
        "playback_ids": [
            {
                "policy": "public",
                "id": "006Hx6bozgZv2sL9700Y8TT02MKdw4nq01ipMVawIGV9j000"
            }
        ],
        "normalize_audio": true,
        "mp4_support": "none",
        "master_access": "none",
        "id": "jlJydoVkYh01Z3JrLr02RGcp4mJdLvPRbk9n00000",
        "video_quality": "basic",
        "created_at": "1612979762"
    }
}
```

## Target loudness

Our target loudness value for audio normalization in our video stack is currently -24 LUFS. So, if possible, master your audio with this value in mind.
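For intuition, the gain applied during loudness normalization is roughly the difference between the target and the measured integrated loudness. This is a simplification that ignores details like true-peak limiting:

```python
def normalization_gain_db(measured_lufs, target_lufs=-24.0):
    """Gain (in dB) to bring a stream's integrated loudness to target.

    A stream louder than the target gets negative gain (attenuation);
    a quieter stream gets positive gain (boost).
    """
    return target_lufs - measured_lufs
```

A stream measured at -18 LUFS would be attenuated by 6 dB, while one measured at -30 LUFS would be boosted by 6 dB.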


# Start live streaming
In this guide you will learn how to build a live streaming platform with Mux live streaming.
Whether you’re looking to build “Twitch for X”, online classrooms, a news and sports broadcasting platform, or something the world’s never seen before, the Mux Live Streaming API makes it easy to build live video into your own software. With a simple API call you get everything you need to push a live stream and play it back at high quality for a global audience.

<Image src="/docs/images/live-streaming-overview-2.png" width={1798} height={1040} />

## 1. Get an API Access Token

<Callout type="info">
  For a guided example of how to make API Requests from your local environment, see the guide and watch this video tutorial: [ Make API Requests](/docs/core/make-api-requests).
</Callout>

The Mux Video API uses a token key pair that consists of a **Token ID** and **Token Secret** for authentication. If you haven't already, generate a new Access Token in the [Access Token settings](https://dashboard.mux.com/settings/access-tokens) of your Mux account dashboard.

<Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} alt="Mux access token settings" />

The access token should have Mux Video **Read** and **Write** permissions.

<Image src="/docs/images/new-access-token.png" width={760} height={376} alt="Mux Video access token permissions" sm />

Access Tokens also belong to an Environment. Be sure to use the same Environment when using Mux Video and Mux Data together, so the data from Mux Data can be used to optimize your Mux Video streams.

## 2. Create a unique Live Stream

<ApiRefLink href="/docs/api-reference/video/live-streams">Detailed API reference</ApiRefLink>

The Live Stream object in the Mux API is a record of a live stream of video that will be pushed to Mux. To create your first Live Stream, send a <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">POST request to the /live-streams endpoint</ApiRefLink>.

You can either replace `${MUX_TOKEN_ID}` and `${MUX_TOKEN_SECRET}` with your own access token details or make sure to export those environment variables with the correct values first.

```curl

curl https://api.mux.com/video/v1/live-streams \
  -H "Content-Type: application/json" \
  -X POST \
  -d '{ "playback_policies": ["public"], "new_asset_settings": { "playback_policies": ["public"] } }' \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}

```

```elixir

# config/dev.exs
config :mux,
  access_token_id: "MUX_TOKEN_ID",
  access_token_secret: "MUX_TOKEN_SECRET"

client = Mux.client()
{:ok, live_stream, _env} = Mux.Video.LiveStreams.create(client, %{playback_policy: "public", new_asset_settings: %{playback_policy: "public"}});

```

```go

import (
    muxgo "github.com/muxinc/mux-go"
)

client := muxgo.NewAPIClient(
    muxgo.NewConfiguration(
        muxgo.WithBasicAuth(os.Getenv("MUX_TOKEN_ID"), os.Getenv("MUX_TOKEN_SECRET")),
    ))

createAsset := muxgo.CreateAssetRequest{PlaybackPolicy: []muxgo.PlaybackPolicy{muxgo.PUBLIC}}
createLiveStream := muxgo.CreateLiveStreamRequest{NewAssetSettings: createAsset, PlaybackPolicy: []muxgo.PlaybackPolicy{muxgo.PUBLIC}}
live_stream, err := client.LiveStreamsApi.CreateLiveStream(createLiveStream)

```

```node

import Mux from '@mux/mux-node';
const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET
});

await mux.video.liveStreams.create({
  playback_policy: ['public'],
  new_asset_settings: { playback_policy: ['public'] },
});

```

```php

$config = MuxPhp\Configuration::getDefaultConfiguration()
  ->setUsername(getenv('MUX_TOKEN_ID'))
  ->setPassword(getenv('MUX_TOKEN_SECRET'));

$liveApi = new MuxPhp\Api\LiveStreamsApi(
    new GuzzleHttp\Client(),
    $config
);

$createAssetRequest = new MuxPhp\Models\CreateAssetRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC]]);
$createLiveStreamRequest = new MuxPhp\Models\CreateLiveStreamRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC], "new_asset_settings" => $createAssetRequest]);
$stream = $liveApi->createLiveStream($createLiveStreamRequest);

```

```python

import mux_python

configuration = mux_python.Configuration()
configuration.username = os.environ['MUX_TOKEN_ID']
configuration.password = os.environ['MUX_TOKEN_SECRET']

live_api = mux_python.LiveStreamsApi(mux_python.ApiClient(configuration))
new_asset_settings = mux_python.CreateAssetRequest(playback_policy=[mux_python.PlaybackPolicy.PUBLIC])
create_live_stream_request = mux_python.CreateLiveStreamRequest(playback_policy=[mux_python.PlaybackPolicy.PUBLIC], new_asset_settings=new_asset_settings)
create_live_stream_response = live_api.create_live_stream(create_live_stream_request)

```

```ruby

MuxRuby.configure do |config|
  config.username = ENV['MUX_TOKEN_ID']
  config.password = ENV['MUX_TOKEN_SECRET']
end

create_asset_request = MuxRuby::CreateAssetRequest.new
create_asset_request.playback_policy = [MuxRuby::PlaybackPolicy::PUBLIC]
create_live_stream_request = MuxRuby::CreateLiveStreamRequest.new
create_live_stream_request.new_asset_settings = create_asset_request
create_live_stream_request.playback_policy = [MuxRuby::PlaybackPolicy::PUBLIC]
create_live_stream_request.latency_mode = "reduced"

live_api = MuxRuby::LiveStreamsApi.new
live_stream = live_api.create_live_stream(create_live_stream_request)

```



The response will include a **Playback ID** and a **Stream Key**.

* <ApiRefLink href="/docs/api-reference/video/assets/get-asset-playback-id">Playback IDs</ApiRefLink> for a Live Stream can be used the same way as Playback IDs for an Asset. You can use them to [play video](/docs/guides/play-your-videos), [get images from a video](/docs/guides/get-images-from-a-video) or [build timeline hover previews with your player](/docs/guides/create-timeline-hover-previews).
* The **Stream Key** is a **secret** that can be used along with Mux's RTMP Server URL (see table below) to configure RTMP streaming software.

<Callout type="warning" title="Important">
  The *Stream Key* should be treated as a **private key for live streaming**. Anyone with the key can use it to stream video to the Live Stream it belongs to, so make sure your users know to keep it safe. If you lose control of a stream key, you can either delete the Live Stream or <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">reset the stream key</ApiRefLink>.
</Callout>

```json
{
  "data": {
    "id": "QrikEQpEXp3RvklQSHyHSYOakQkXlRId",
    "stream_key": "super-secret-stream-key",
    "status": "idle",
    "playback_ids": [
      {
        "policy": "public",
        "id": "OJxPwQuByldIr02VfoXDdX6Ynl01MTgC8w02"
      }
    ],
    "created_at": "1527110899"
  }
}
```

Mux also allows you to set a few additional options on your live stream to support more use cases.

| Option | Description |
|--------|-------------|
| `"latency_mode": "reduced"` | Mux live streams have an option for "reduced latency". When `"latency_mode": "reduced"` is enabled, we treat your stream a little differently to minimize glass-to-glass latency. The latency reduces to about 10-15 seconds compared to 30 seconds typically without enabling this option. For more details, please refer to the [Live Stream Latency guide](/docs/guides/reduce-live-stream-latency). |
| `"latency_mode": "low"` | Similar to `"reduced"` latency option, `"latency_mode": "low"` live streams reduce the glass-to-glass latency to as low as 5 seconds but the latency can vary depending on your viewer's geographical location and internet connectivity. For more details, please refer to the [Live Stream Latency guide](/docs/guides/reduce-live-stream-latency). |
| `audio_only` | Mux live streams is ready for Audio specific use cases too. For example, you can host Live Podcasts or broadcast Radio Shows. When `audio_only` is enabled, we only process the audio track, even dropping the video track if broadcast. |

<Callout type="warning">
  A live stream can be configured with only one latency mode: standard, reduced, or low.
</Callout>

You can find more details about these options in the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Create Live Stream API reference</ApiRefLink>.

## 3. Start your broadcast

Mux supports live streaming using the RTMP protocol, which is supported by most broadcast software/hardware as well as open source software for mobile applications.

Your users or your client app will need software that can push an RTMP stream. That software will be configured using the **Stream Key** from the prior step along with Mux's RTMP Server URL. Mux supports both RTMP and RTMPS:

| RTMP Server URL |  Description | Common Applications |
|---|---|---|
| rtmp://global-live.mux.com:5222/app | Mux's standard RTMP entry point. Compatible with the majority of streaming applications and services. |Open Source RTMP SDKs, most [app-store streaming applications](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/). |
| rtmps://global-live.mux.com:443/app | Mux's secure RTMPS entry point. Compatible with fewer streaming applications, but offers a higher level of security. | OBS, Wirecast, Streamaxia RTMP SDKs |

Learn more about:

* [Additional regional ingest URLs](/docs/guides/configure-broadcast-software#available-ingest-urls) for when you want control over the geographic region receiving your users' live streams
* [How to configure broadcast software](/docs/guides/configure-broadcast-software) for when users will be using their own streaming software, e.g. Twitch live streamers
* [How to live stream from a mobile app](/docs/guides/live-streaming-from-your-app) for when users will live stream using your mobile app

<Callout type="warning" title="Important">
  Mux's RTMP server URL uses port number 5222 and not the standard RTMP port number 1935. If your encoder does not provide a method to change the port number, please contact [our support team](/support) with your encoder details.
</Callout>

[Mux Video also supports Secure Reliable Transport (SRT) for receiving live streams](/docs/guides/use-srt-to-live-stream). If you want to live stream with a protocol other than RTMP or SRT, let us know!

<Image src="/docs/images/obs-setup.png" width="1954" height="1492" alt="obs setup" />

The broadcast software will describe how to start and stop an RTMP session. Once the session begins, the software will start pushing live video to Mux, and the Live Stream will change its status to `active`, indicating it is receiving the RTMP stream and is playable using the Playback ID.

### Broadcasting Webhooks

When a Streamer begins sending video and the Live Stream changes status, your application can respond by using [Webhooks](/docs/core/listen-for-webhooks). There are a few related events that Mux will send. Your application may benefit from some or none of these events, depending on the specific user experience you want to provide.

| Event | Description |
|-------|-------------|
| `video.live_stream.connected` | The Streamer's broadcasting software/hardware has successfully connected with Mux servers. Video is not yet being recorded and is not yet playable. |
| `video.live_stream.disconnected` | The Streamer's broadcasting software/hardware has disconnected from Mux servers, either intentionally or unintentionally because of a network drop. |
| `video.live_stream.recording` | Video is being recorded and prepared for playback. The recording of the live stream (the Active Asset) will include video sent after this point. If your UI has a red "recording" light, this would be the event that turns it on. |
| `video.live_stream.active` | The Live Stream is now playable using the Live Stream's Playback ID or the Active Asset's Playback ID |
| `video.live_stream.idle` | The Streamer's broadcasting software/hardware previously disconnected from Mux servers and the `reconnect_window` has now expired. The recording of the live stream (the Active Asset) will now be considered complete. The next time video is streamed using the same Stream Key it will create a new Asset for the recording. |
| `video.asset.live_stream_completed` | This event is fired by the Active Asset when the Live Stream enters the `idle` state and the Active Asset is considered complete. The Asset's playback URL will switch to being an "on-demand" (not live) video. |

## 4. Play back your live stream

To play back a live stream, use the `PLAYBACK_ID` that was returned when you created the Live Stream, along with `stream.mux.com`, to create an HTTP Live Streaming (HLS) playback URL.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

```android

implementation 'com.google.android.exoplayer:exoplayer-hls:2.X.X'

// Create a player instance.
SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
// Set the media item to be played.
player.setMediaItem(MediaItem.fromUri("https://stream.mux.com/{PLAYBACK_ID}.m3u8"));
// Prepare the player.
player.prepare();

```

```embed

<iframe
  src="https://player.mux.com/{PLAYBACK_ID}?metadata-video-title=Test%20video%20title&metadata-viewer-user-id=user-id-007"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html

<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<mux-player
  playback-id="{PLAYBACK_ID}"
  metadata-video-title="Test video title"
  metadata-viewer-user-id="user-id-007"
></mux-player>

```

```react

import MuxPlayer from '@mux/mux-player-react';

export default function VideoPlayer() {
  return (
    <MuxPlayer
      playbackId="{PLAYBACK_ID}"
      metadata={{
        video_id: "video-id-54321",
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}

```

```swift

import SwiftUI
import AVKit

let playbackID = "qxb01i6T202018GFS02vp9RIe01icTcDCjVzQpmaB00CUisJ4"

struct ContentView: View {

    private let player = AVPlayer(
        url: URL.makePlaybackURL(
            playbackID: playbackID
        )
    )

    var body: some View {
        //  VideoPlayer comes from SwiftUI
        //  Alternatively, you can use AVPlayerLayer or AVPlayerViewController
        VideoPlayer(player: player)
            .onAppear() {
                player.play()
            }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

extension URL {
    static func makePlaybackURL(
        playbackID: String
    ) -> URL {
        guard let baseURL = URL(
            string: "https://stream.mux.com"
        ) else {
            preconditionFailure("Invalid base URL string")
        }

        guard let playbackURL = URL(
            string: "\(playbackID).m3u8",
            relativeTo: baseURL
        ) else {
            preconditionFailure("Invalid playback URL component")
        }

        return playbackURL
    }
}

```



See the [playback guide](/docs/guides/play-your-videos) for more information about how to integrate with a video player.

After you have everything working, [integrate Mux Data](/docs/guides/track-your-video-performance) with your player for monitoring playback performance.

## 5. Stop your broadcast

When the Streamer is finished, they stop the broadcast software/hardware, which disconnects from the Mux servers. After the `reconnect_window` time (if any) runs out, the Live Stream transitions to a status of `idle`.

<Callout type="info">
  Mux automatically disconnects clients after 12 hours. Contact us if you require longer live streams.
</Callout>

## 6. Manage your Mux Live streams

After you have live streams created in your Mux environment, you may find some of these other endpoints handy:

* <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Create a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/list-live-streams">List live streams</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/get-live-stream">Retrieve a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream">Delete a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream-playback-id">Create a live stream playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream-playback-id">Delete a live stream playback ID</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">Reset a stream key for a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">Signal a live stream is finished</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">Disable a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">Enable a live stream</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream-simulcast-target">Create a live stream simulcast target</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/delete-live-stream-simulcast-target">Delete a live stream simulcast target</ApiRefLink>
* <ApiRefLink href="/docs/api-reference/video/live-streams/get-live-stream-simulcast-target">Retrieve a live stream simulcast target</ApiRefLink>

More Video methods and descriptions are available at the <ApiRefLink href="/docs/api-reference/video">API Docs</ApiRefLink>.

<GuideCard
  title="Play your live stream"
  description="Set up your iOS application, Android application or web application to start playing your Mux assets"
  links={[
    {title: "Read the guide", href: "/docs/guides/play-your-videos"},
  ]}
/>

<GuideCard
  title="Integrate Mux Data"
  description="Add the Mux Data SDK to your player and start collecting playback performance metrics."
  links={[
    {title: "Read the guide", href: "/docs/guides/track-your-video-performance"},
  ]}
/>


# Configure Broadcast Software
There are a number of popular (and even free) software encoders that you can use with the Mux live streaming API. Hardware encoders that allow for custom RTMP server configuration have similar settings. This guide details how to configure a few common encoders.
## Overview / configuration term glossary

Most broadcasting software uses a standard set of terms. Mux has chosen a set of terms that are very commonly used.

* **Server URL** - This is the URL of the Mux RTMP server, as listed in the table below.
* **Stream Key** - The Stream Key authenticates your live stream with the Mux RTMP server; treat it as a secret. Mux does not use additional authentication.

***

| RTMP Server URL | Description | Common Applications |
| :-- | :-- | :-- |
| rtmp://global-live.mux.com:5222/app | Mux's standard RTMP entry point. Compatible with the majority of streaming applications and services | Open Source RTMP SDKs, most [app-store streaming applications](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/) |
| rtmps://global-live.mux.com:443/app | Mux's secure RTMPS entry point. Compatible with fewer streaming applications, but offers a higher level of security | OBS, Wirecast, Streamaxia RTMP SDKs |

***

Here is a list of other terms that we have heard:

* **Stream Name** - A common alias and the technically correct term (in the RTMP specification) for *Stream Key*.
* **Location** or **URL** - Broadcast software that asks only for a location or a URL often wants a combination of the *Server URL* and the *Stream Key*, like `rtmp://global-live.mux.com:5222/app/{STREAM_KEY}`. If a location or URL is asked for alongside a separate stream name/key field, then it is an alias for *Server URL*.
* **FMS URL** - Flash Media Server URL, an alias for *Server URL*.
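
As an illustration, here is a hypothetical helper (not part of any Mux SDK) that combines the *Server URL* and *Stream Key* for software that asks for a single combined location:

```python
def make_rtmp_location(server_url: str, stream_key: str) -> str:
    """Combine a Server URL and Stream Key into a single RTMP location,
    for broadcast software that asks for one combined URL instead of
    separate server and key fields."""
    return f"{server_url.rstrip('/')}/{stream_key}"


location = make_rtmp_location(
    "rtmp://global-live.mux.com:5222/app", "{STREAM_KEY}"
)
# → rtmp://global-live.mux.com:5222/app/{STREAM_KEY}
```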

Seen or heard a term that you don't understand? Ask us! Think we missed something that you know? Leave a comment at the bottom of the page!

<Callout type="info">
  Mux's RTMP server URL uses port number 5222 and not the standard RTMP port number 1935. If your encoder does not provide a method to change the port number, please contact [support](/support) with your encoder details.
</Callout>

## Recommended encoder settings

Twitch has a clear and concise [guide to broadcast encoder settings](https://help.twitch.tv/s/article/broadcasting-guidelines?language=en_US). YouTube has [a bit more detailed guide](https://support.google.com/youtube/answer/2853702) as well. Here's a very simple recommendation of where to start, but we do recommend playing with your settings to see what works best for your content:

### Common

* **Video CODEC** - H.264 (Main Profile)
* **Audio CODEC** - AAC

### Great - 1080p 30fps

* **Bitrate** - 5000 kbps
* **Keyframe Interval** - 2 seconds

### Good - 720p 30fps

* **Bitrate** - 3500 kbps
* **Keyframe Interval** - 2 seconds

### Works - 480p 30fps

* **Bitrate** - 1000 kbps
* **Keyframe Interval** - 5 seconds

<Callout type="warning" title="Important">
  You should also consider your available upload bandwidth when choosing an encoder bitrate. For a more reliable connection, we recommend using no more than ~50% of the available upload bandwidth for your live stream ingest.
</Callout>
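
The ~50% headroom rule above can be sketched as a quick calculation (a hypothetical helper for illustration, not a Mux API):

```python
def max_safe_bitrate_kbps(upload_kbps: float, headroom: float = 0.5) -> float:
    """Recommended maximum encoder bitrate: use no more than ~50% of the
    available upload bandwidth for live stream ingest."""
    return upload_kbps * headroom


# A 10 Mbps uplink comfortably supports the 5000 kbps "Great" preset;
# a 7 Mbps uplink is better matched to the 3500 kbps "Good" preset.
max_safe_bitrate_kbps(10_000)  # → 5000.0
max_safe_bitrate_kbps(7_000)   # → 3500.0
```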

## Alternate ingest protocols

Mux Video also supports [Secure Reliable Transport (SRT) for receiving live streams](/docs/guides/use-srt-to-live-stream).

## Available Ingest URLs

Mux's regional ingest URLs let you manually select your ingest region. This may be useful if you notice DNS is not routing your traffic efficiently, or if you would like to manage your own failover process.

| Region | RTMP Ingest URL | SRT Ingest URL |
| :-- | :-- | :-- |
| Global (Auto-Select) | `rtmp://global-live.mux.com/app` | `srt://global-live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |
| U.S. East | `rtmp://us-east.live.mux.com/app` | `srt://us-east.live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |
| U.S. West | `rtmp://us-west.live.mux.com/app` | `srt://us-west.live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |
| Europe | `rtmp://eu-west.live.mux.com/app` | `srt://eu-west.live.mux.com:6001?streamid={STREAM_KEY}&passphrase={SRT_PASSPHRASE}` |

<Callout type="info" title="RTMPS support">
  All of these RTMP URLs support RTMPS.

  For example, `rtmp://us-east.live.mux.com/app` becomes `rtmps://us-east.live.mux.com/app`
</Callout>

### Choosing the right ingest URL

* If you want Mux to automatically route to the best region, use `global-live.mux.com`.
* If you prefer manual control over routing, use a specific regional ingest URL (e.g., `us-east.live.mux.com`).
* For redundancy, configure your encoder to failover to another regional endpoint.

### Using regional ingest URLs in OBS

To set up OBS with Mux Live Streaming:

1. Go to: Settings → Stream

2. Select "Custom..." as the service

3. Enter the Ingest URL based on your preferred region

   `rtmps://us-east.live.mux.com/app`

4. Enter your Stream Key (found in your Mux Live settings)

5. Click "Start Streaming"

### Building your SRT URL

Note: Before you use an SRT URL, make sure your encoder supports SRT Caller mode.

The SRT URL is composed of three parts.

1. The protocol and host: `srt://us-east.live.mux.com:6001`
2. A `streamid` query parameter
3. A `passphrase` query parameter

Here's an example:

```
srt://us-east.live.mux.com:6001?streamid=abc-123-def-456&passphrase=GHI789JKL101112
```
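
The three parts can be assembled programmatically. Below is a sketch (a hypothetical helper, not part of any Mux SDK) that also URL-encodes the query values for safety:

```python
from urllib.parse import urlencode


def make_srt_url(host: str, stream_key: str, passphrase: str,
                 port: int = 6001) -> str:
    """Build an SRT ingest URL from its three parts: protocol and host,
    a streamid query parameter, and a passphrase query parameter."""
    query = urlencode({"streamid": stream_key, "passphrase": passphrase})
    return f"srt://{host}:{port}?{query}"


url = make_srt_url("us-east.live.mux.com", "abc-123-def-456", "GHI789JKL101112")
# → srt://us-east.live.mux.com:6001?streamid=abc-123-def-456&passphrase=GHI789JKL101112
```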

For more information on SRT, check out our [Use SRT to live stream](https://www.mux.com/docs/guides/use-srt-to-live-stream) docs.

## Software encoders

Any encoder that supports RTMP should work with Mux Video.

* [OBS](https://obsproject.com/) (Free and Open Source)
* [Wirecast](https://www.telestream.net/wirecast/) (Commercial)
* [XSplit](https://www.xsplit.com/broadcaster) (Commercial)
* [vMix](https://www.vmix.com) (Commercial)

## Hardware encoders

Any encoder that supports RTMP should work with Mux Video.

* [VidiU](https://teradek.com/collections/vidiu-family)
* [DataVideo RTMP Encoders](https://www.datavideo.com/global/category/video-encoder)
* [Magewell Ultra Stream](https://www.magewell.com/ultra-stream)
* [Osprey Talon](https://www.ospreyvideo.com/talon-encoders) (contact [sales@ospreyvideo.com](mailto:sales@ospreyvideo.com) for documentation)
* [Videon](https://support.videonlabs.com/hc/en-us/articles/4408934112659-Stream-to-Mux-RTMP-)

## Mobile devices (iOS, Android)

If you just want a pre-built iOS application you can stream from, [check out our write up here](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/).

If you want to build your own application, [check out this documentation](/docs/guides/live-streaming-from-your-app).


# Use SRT to live stream
Learn how to use the Secure Reliable Transport (SRT) protocol to send a live stream to Mux
## Learn about SRT

SRT is a modern, common alternative to RTMP and is designed for high-quality, reliable point-to-point video transmission over unreliable networks.

The Mux Video SRT feature also supports using [HEVC (h.265) as the live stream input codec](#use-the-hevc-codec-with-srt), reducing inbound bitrate requirements.

SRT is supported by a wide range of free and commercial video encoders.

## Use SRT to connect to a live stream

Authentication for SRT is a little different than with RTMP, and requires two pieces of information:

1. `streamid`: This is the same `stream_key` attribute you know & love from RTMP.
2. `passphrase`: This is a new piece of information exposed in the Live Streams API called `srt_passphrase`. You'll need to use this when your encoder asks for a `passphrase`.

All new and existing live streams now expose the `srt_passphrase` field.

You can get this field through the API using the <ApiRefLink href="/docs/api-reference/video/live-streams/get-live-stream">Get Live Stream API call</ApiRefLink>:

```json
// GET https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}
{
  // [...]
  "stream_key": "abc-123-def-456",
  "srt_passphrase": "GHI789JKL101112",
  // [...]
}
```
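
As a sketch, here's how the two fields from that response map onto the SRT connection parameters (the response body is trimmed to the two relevant fields, with illustrative values):

```python
import json

# Trimmed Get Live Stream response (values are illustrative).
response_body = json.loads(
    '{"stream_key": "abc-123-def-456", "srt_passphrase": "GHI789JKL101112"}'
)

# streamid comes from stream_key; passphrase comes from srt_passphrase.
srt_url = (
    "srt://global-live.mux.com:6001"
    f"?streamid={response_body['stream_key']}"
    f"&passphrase={response_body['srt_passphrase']}"
)
# → srt://global-live.mux.com:6001?streamid=abc-123-def-456&passphrase=GHI789JKL101112
```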

You can also see the SRT connection details from the dashboard of any live stream:

<Image src="/docs/images/srt-dashboard.png" width={1193} height={472} alt="SRT connection details in the Mux Live Stream dashboard" />

## Configure your encoder

Depending on the encoder you're using, the exact path to setting the SRT configuration will vary.

Some encoders accept all configuration parameters in the form of an SRT URL, in which case you'll need to construct an SRT URL as below, substituting the `stream_key` and `srt_passphrase`.

```
srt://global-live.mux.com:6001?streamid={stream_key}&passphrase={srt_passphrase}
```

Mux's global SRT ingest URLs will connect you to the closest ingest region. While these ingest URLs typically provide optimal performance, you can also select a specific region using our [regional ingest URLs](/docs/guides/configure-broadcast-software#available-ingest-urls).

### Common Configuration values

Other encoders break out the SRT configuration into multiple fields; fill them out as follows:

| Field | Value |
| --- | --- |
| Hostname / URL / Port | `srt://global-live.mux.com:6001` |
| Stream ID | Use the `stream_key` from your live stream. |
| Passphrase | Use the `srt_passphrase` field from your live stream. |
| Mode | Should be set to `caller` if required. |
| Encryption Key Size / Length | Set to `128` if expressed as “bits”, or `16` if expressed as “pbkeylen” |

### Tuning

You may need some of the tuning settings below.

| Field | Value | Notes |
| --- | --- | --- |
| Latency | `500` is generally a safe starting value | Set to at least 4 x the RTT to `global-live.mux.com`  |
| Bandwidth | `25%` is generally a safe starting value | Set to the percentage of overhead you have available in your internet connection for bursts of retransmission. For example, if you have a 5Mbps internet connection, and you set your encoder's target bitrate to 4Mbps, a value of 25% would be appropriate, as it would allow the encoder to burst to 5Mbps for retransmission purposes. |
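
The two tuning rules above can be expressed as quick calculations (hypothetical helpers for illustration, not Mux APIs):

```python
def srt_latency_ms(rtt_ms: float, floor_ms: float = 500) -> float:
    """SRT latency: at least 4x the measured round-trip time,
    with 500 ms as a generally safe starting floor."""
    return max(floor_ms, 4 * rtt_ms)


def srt_bandwidth_overhead_pct(link_kbps: float, target_kbps: float) -> float:
    """Retransmission overhead available, as a percentage of the encoder's
    target bitrate (the Bandwidth tuning value)."""
    return (link_kbps - target_kbps) / target_kbps * 100


srt_latency_ms(80)                        # → 500.0 (floor applies)
srt_latency_ms(200)                       # → 800.0 (4 x RTT)
srt_bandwidth_overhead_pct(5000, 4000)    # → 25.0 (the example from the table)
```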

## Stream!

If you've configured your encoder correctly, you should be all set to connect your encoder and start streaming. You can then check that the live stream appears in your Mux Dashboard.

You will see all the usual state transitions, events, and webhooks that you'd expect when connecting from an RTMP source.

## Example Encoder Configuration

### OBS

OBS accepts the SRT endpoint as a single URL, which should be structured as shown below:

<Image src="/docs/images/srt-obs-config.png" width={677} height={150} alt="SRT configuration for OBS" />

*Stream Key should be left empty as both the Stream ID and the Passphrase are being set in the URL field.*

### Videon

Videon encoders need each parameter to be configured separately, as shown below:

<Image src="/docs/images/srt-videon-config.png" width={544} height={700} alt="SRT configuration for Videon" />

### Wirecast

Wirecast needs each parameter to be configured separately, as shown below:

<Image src="/docs/images/srt-wirecast-config.png" width={1037} height={538} alt="SRT configuration for Wirecast" />

### Larix Broadcaster on iOS and Android

Larix Broadcaster also needs each parameter to be configured separately, as shown below:

<Image src="/docs/images/srt-larix-combined-config.png" width={726} height={700} alt="SRT configuration for Larix Broadcaster on iOS and Android" />

### FFmpeg

FFmpeg takes an SRT URL with the parameters on the URL, for example:

```shell
ffmpeg \
  -f lavfi -re -i testsrc=size=1920x1080:rate=30 \
  -f lavfi -i "sine=frequency=1000:duration=3600" \
  -c:v libx264 -x264-params keyint=120:scenecut=0 \
  -preset superfast -b:v 5M -maxrate 6M -bufsize 3M -threads 4 \
  -c:a aac \
  -f mpegts 'srt://global-live.mux.com:6001?streamid={stream_key}&passphrase={srt_passphrase}'
```

### Gstreamer

Gstreamer takes an SRT URL with the parameters on the URL, for example:

```shell
gst-launch-1.0 -v videotestsrc ! queue ! video/x-raw, height=1080, width=1920 \
  ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=main \
  ! ts. audiotestsrc ! queue ! avenc_aac ! mpegtsmux name=ts \
  ! srtsink uri='srt://global-live.mux.com:6001?streamid={stream_key}&passphrase={srt_passphrase}'
```

## Use the HEVC codec with SRT

When sending a live stream for ingest over SRT, Mux supports [HEVC](https://www.mux.com/video-glossary/hevc-high-efficiency-video-coding) (h.265) as the contribution codec.

Using HEVC generally allows you to reduce the inbound bitrate of your live stream without sacrificing quality. The amount you can reduce the bitrate by will vary depending on the encoder that you're using, but generally this would be between 30% and 50%.

## Simulcast using SRT

You can now simulcast to SRT destinations for streams that are sent to Mux over SRT.

Streams that were sent over SRT can also be simulcast to RTMP destinations.

To configure a simulcast destination as SRT, you can simply pass the SRT URL in the `url` field when creating a Simulcast Target, as shown below:

```json
POST /video/v1/live-streams/{LIVE_STREAM_ID}/simulcast-targets

{
  "url" : "srt://my-srt-server.example.com:6001?streamid=streamid&passphrase=passphrase",
  "passthrough" : "My SRT Destination"
}
```
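
A sketch of building and sanity-checking that request body (the scheme check is this guide's illustration, not validation the API performs):

```python
import json
from urllib.parse import urlparse


def simulcast_target_body(url: str, passthrough: str) -> str:
    """Build the POST body for creating a simulcast target.
    Accepts rtmp://, rtmps://, and srt:// destination URLs."""
    scheme = urlparse(url).scheme
    if scheme not in {"rtmp", "rtmps", "srt"}:
        raise ValueError(f"unexpected simulcast scheme: {scheme!r}")
    return json.dumps({"url": url, "passthrough": passthrough})


body = simulcast_target_body(
    "srt://my-srt-server.example.com:6001?streamid=streamid&passphrase=passphrase",
    "My SRT Destination",
)
```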

## Simulcasting and HEVC over SRT

When simulcasting an inbound SRT stream sent over HEVC, Mux does not currently transcode the output stream, so you need to be confident that the simulcast destination supports the HEVC codec.

Below is a current list of the codecs and protocols supported by common simulcast destinations:

| Platform | Protocols | Codecs |
| --- | --- | --- |
| Facebook | RTMP(S) | h.264 |
| X (Twitter) | RTMP(S), HLS Pull | h.264 |
| YouTube | RTMP(S), HLS Pull, SRT (Closed Beta) | h.264, HEVC, AV1 |
| Twitch / IVS | RTMP(S) | h.264 |

## Known limitations

### Simulcast retains source codec

[See simulcasting notes above.](#simulcasting-and-hevc-over-srt)

### Cross-protocol and cross-codec reconnects

We do not support switching ingest protocols or codecs within a reconnect window. If you want to reuse the same Live Stream with a different protocol, you'll need to wait for the reconnect period to expire or call the <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">Complete Live Stream API</ApiRefLink>.

### Embedded Captions

Embedded captions (608) are not supported. Auto-generated captions *can* be used with SRT live streams.

### Multi-track audio

While we support multiple audio tracks in an SRT stream, we recommend against sending more than one, as there's no mechanism to configure which will be used. Mux will choose the first audio stream listed in the PMT; other audio streams will be dropped.

## Feedback

We'd love to hear your feedback as you use SRT. If you run into issues or have feedback, please contact [Mux Support](/support), and we'll get back to you.


# Live streaming from your app
Use this guide to set up your application for live streaming to Mux.
## Live building blocks

A recap from our [Start live streaming](/docs/guides/start-live-streaming) guide:

<Image sm src="/docs/images/live-streaming-overview-2.png" width={1798} height={1040} />

Live streaming from a native application requires software to capture the camera feed and stream it to the Mux live endpoint using the RTMP protocol. Fortunately for both iOS and Android you can find open source software to stream RTMP. The following open source applications can be used as a guide for building live streaming into your own app.

## iOS & Android examples

<Callout type="warning" title="Use current examples">
  Use the examples linked in this guide. They will contain the most current code and issue list. We may not provide support for outdated apps and dependencies.
</Callout>

* [iOS Live Streaming Example](https://github.com/muxinc/examples/tree/master/ios-live-streaming)
* [Android Live Streaming Example](https://github.com/muxinc/examples/tree/master/android-live-streaming)

Over time we'll build out more examples and SDKs for iOS and Android. If you have any feedback or requests please let us know.

If you're looking for a commercial solution, [Streamaxia's OpenSDK](https://www.streamaxia.com/) and [Larix Broadcaster](https://softvelum.com/larix/) are known to work well with Mux's RTMP ingest.

## Web app live streaming

There are currently no reliable open source solutions for building web-based encoders that stream out over RTMP. Check [the blog post](https://mux.com/blog/the-state-of-going-live-from-a-browser) for more information on going live from the browser.


# Reduce live stream latency
This guide covers types of latency, causes of latency, reconnect windows, and lower latency options.
Mux Video live streaming is built with RTMP ingest and HLS delivery. HLS inherently introduces latency.
To the broadcasting industry, this latency is called glass-to-glass latency. Standard glass-to-glass latency with HLS
is greater than 20 seconds and typically about 25 to 30 seconds.

To clarify some terminology and industry jargon:

* **Glass-to-glass latency**: Also sometimes referred to as end-to-end latency. This latency is defined as the time lag between when a camera captures an action and when that action reaches a viewer’s device.
* **Wall-clock time**: Also might be referred to as "realtime". If you have a clock on the wall where you are capturing video content, this would be the time on that clock.

The nature of HLS delivery means that clients are not necessarily synchronized. Some clients might be 15 seconds behind wall-clock time and others might be 30 seconds behind.

# Where does the latency come from?

You don't have to worry about these gritty details when using Mux for live streams, but to give you an idea of how a live stream works:

<Image src="/docs/images/live-stream-workflow.png" width={2250} height={1848} />

1. **Captured by a camera**
2. **Processed by an encoder** - If the computer running the encoder runs out of CPU, this process can fall behind and start lagging.
3. **Sent to an RTMP ingest server** - This server ingests the video content in real time. This part is called the "first mile": it happens over the internet, often on consumer or cellular network connections, so TCP packet loss and random network disconnects are always a risk.
4. **Ingest server decodes and encodes** - Assuming all the content is traveling over the internet fast enough, the encoder on the other end needs to keep up and have enough CPU available to package up segments of video as they come in. The encoder has to ingest video, build up a buffer of content and then start decoding, processing and encoding for HLS delivery.
5. **Manifest files and segments of video delivered** - After all of that, files are created and delivered over HTTP through multiple CDNs to reach end users. Each file becomes available after the entire segment's worth of data is ready. This part also happens over the internet where the same risks around packet-loss and network congestion are factors. Network issues are especially a factor for the last mile of delivery to the end user.
6. **Decoded and played on the client** - Once video makes it all the way to the client, the player has to decode and play back the video. Players do not play each segment as they receive it; they keep a buffer of playable video in memory, which also contributes to the glass-to-glass latency experienced by the end user.

When you consider each of the steps above, any point of that pipeline has the potential to slow down or get backed up. The more latency you can tolerate,
the safer the system is and the lower probability you have for an unhappy viewer. If any single step gets backed up momentarily, the whole system has a chance
to catch up before an interruption in playback. And, when everything is running smoothly, the player has extra time to spend downloading the higher quality version of your content.

<Callout type="info">
  As shown in the image above, latency is added at every step. However, Mux does not control latency added by video capture on the camera,
  encoder processing delays, or the video player's buffering and decoding time.
</Callout>

# Reconnect Window

When an end-user is streaming from their encoder to Mux, we need to know how to handle situations when the client disconnects unexpectedly.

There are situations when a client disconnects on purpose: for example, hitting "Stop streaming" in OBS.
Those are intentional disconnects; here we're talking about times when the client just stops sending video.
To handle this, live streams have a `reconnect_window`. After an unexpected disconnect,
Mux will keep the live stream "active" for the given period of time and wait for the client to reconnect and start streaming again.

When the `reconnect_window` expires, the live stream transitions back into the `idle` state. In HLS terminology,
Mux writes the `#EXT-X-ENDLIST` tag to the HLS manifest. At this point, your player can consider the live stream to have ended.
By default, `reconnect_window` is `60` (seconds) - you can set this as high as `1800` (30 minutes).
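
Those bounds can be captured in a small validation sketch (a hypothetical helper; the `60` default and `1800` maximum come from this guide, while the zero lower bound is this sketch's assumption):

```python
DEFAULT_RECONNECT_WINDOW = 60    # seconds, the API default
MAX_RECONNECT_WINDOW = 1800      # seconds (30 minutes), the documented maximum


def valid_reconnect_window(seconds: int) -> bool:
    """Check a reconnect_window value against the documented maximum.
    The zero lower bound is assumed here for illustration."""
    return 0 <= seconds <= MAX_RECONNECT_WINDOW


valid_reconnect_window(DEFAULT_RECONNECT_WINDOW)  # → True
valid_reconnect_window(3600)                      # → False (over 30 minutes)
```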

By adding a slate image, you can improve your viewers' playback experience during the reconnect window.
You can learn more about reconnect windows and slates [here](/docs/guides/handle-live-stream-disconnects#reconnect-window-and-slates).

# Lower latency Options

Mux live streams have options for "reduced" and "low" latency. The `"latency_mode": "reduced"` option gets your latency down to a range of 12-20 seconds, and
`"latency_mode": "low"` further reduces it to as low as 5 seconds. Your viewers might see some variance in the glass-to-glass latency, because
it depends on many factors, including player configuration, your viewer's geographic location, and their internet connection speed.

## Input Requirements

You should only set `latency_mode` to `reduced` or `low` if you have control over the following:

* the encoder software
* the hardware the encoder software is running on
* the network the encoder software is connected to

Typically, home networks in cities and mobile connections are not stable enough to reliably use `reduced` or `low` latency options.

## Create a live stream with the "reduced" latency option

Check out our <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Live Stream API reference</ApiRefLink> and find the `latency_mode` parameter and set the parameter to `"reduced"` in the request to create a live stream.

Set the latency mode on a live stream to `reduced` by making this POST request:

```json
//POST https://api.mux.com/video/v1/live-streams

{
    "latency_mode": "reduced",
    "reconnect_window": 60,
    "playback_policies": ["public"],
    "new_asset_settings": {
        "playback_policies": ["public"]
    }
}
```

You can read more about the reduced latency feature in this [blog post about Reduced Latency](https://mux.com/blog/reduced-latency-for-mux-live-streaming-now-available/).

## Create a live stream with the "low" latency option

<ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Create Live Stream</ApiRefLink> is also used to set the `"latency_mode": "low"` flag.

Set the latency mode on a live stream to `low` by making this POST request:

```json
//POST https://api.mux.com/video/v1/live-streams

{
    "latency_mode": "low",
    "reconnect_window": 60,
    "playback_policies": ["public"],
    "new_asset_settings": {
        "playback_policies": ["public"]
    }
}
```

<Callout type="warning">
  A live stream can be configured with only one latency mode: "standard", "reduced", or "low".
</Callout>
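
Putting the two requests above together, here's a sketch of building the create-live-stream body with the latency mode checked up front (a hypothetical helper for illustration; the field names match the request bodies shown above):

```python
import json

# The three mutually exclusive latency modes.
LATENCY_MODES = {"standard", "reduced", "low"}


def create_live_stream_payload(latency_mode: str = "standard") -> str:
    """Build the request body for POST /video/v1/live-streams,
    rejecting unknown latency modes before the request is sent."""
    if latency_mode not in LATENCY_MODES:
        raise ValueError(f"unsupported latency_mode: {latency_mode!r}")
    return json.dumps({
        "latency_mode": latency_mode,
        "reconnect_window": 60,
        "playback_policies": ["public"],
        "new_asset_settings": {"playback_policies": ["public"]},
    })


body = create_live_stream_payload("low")
```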

# Low-latency FAQs

## Is low-latency HLS different from standard latency HLS?

**Yes**. The `"latency_mode": "low"` mode uses [Apple's new low-latency HLS (LL-HLS) spec](https://developer.apple.com/documentation/http_live_streaming/enabling_low-latency_hls)
that allows your viewers to play live streams with as low as 5 seconds of glass-to-glass latency. In comparison, the other latency modes are based on an
earlier version of the HLS spec, which places a floor on the achievable glass-to-glass latency.

Mux has closely followed the new low-latency HLS spec and published about the development of the new spec a few times on our blog
[in June 2019](https://mux.com/blog/the-community-gave-us-low-latency-live-streaming-then-apple-took-it-away/)
and again [in January 2020](https://mux.com/blog/low-latency-hls-part-2/).

## Do video players support low-latency HLS?

**Yes**. Because `"latency_mode": "low"` mode uses a recent version of Apple's LL-HLS spec, you may need to upgrade your video player. Below is the list of the
most commonly used video players and the minimum version.

<Callout type="info">
  The video player may have supported LL-HLS spec in earlier versions. However, the minimum video player version mentioned below represents
  the version we used for evaluating Mux's low-latency feature.

  <br />

  <br />

  Additionally, the video player companies are continuously improving video
  playback experience as the LL-HLS spec is more widely adopted and used in the real world.
  We recommend updating your video player versions frequently whenever possible to get the latest fixes and improvements.
</Callout>

| Player | Version | Additional details |
| :-- | :-- | :-- |
| HLS.js | >= 1.1.5 | Other potentially relevant issues to track - ([#3596](https://github.com/video-dev/hls.js/issues/3596)) |
| JW Player | >= 8.20.5 | Setting the `liveSyncDuration` [configuration option](https://docs.jwplayer.com/players/reference/setup-options#behavior) can increase latency. So you should not set this option for low-latency playback. |
| THEOplayer | >= 6.0.0 | LL-HLS playback is enabled by default. |
| THEOplayer | >= 2.84.1 | Requires enabling `LL-HLS` add-on on your player instance and set `lowlatency` parameter to `true` in the player configuration. |
| VideoJS | >= 8.0.0 | LL-HLS playback is enabled by default. |
| VideoJS | >= 7.16.0 | Enabling low latency playback requires initializing `videojs-http-streaming` with the `experimentalLLHLS` flag. [See FAQs](/docs/guides/reduce-live-stream-latency#how-do-you-enable-ll-hls-playback-on-videojs-player-prior-to-800).|
| [Mux Player](/docs/guides/mux-player-web) | >= 1.0 | |
| [Mux Video.js Kit](/docs/guides/playback-videojs-with-mux) | >= 0.4 | Mux Video.js Kit uses HLS.js so the same issues apply. |
| Apple iOS (AVPlayer) | 13.\* | Requires `com.apple.developer.coremedia.hls.low-latency` app entitlement for your iOS apps. Also, there are known issues that occasionally cause playback failures. |
| Apple iOS (AVPlayer) | 14.\* | There are known issues that occasionally cause playback issues. |
| Apple iOS (AVPlayer) | 15.\* | |
| Android ExoPlayer | >= 2.14 | |
| Agnoplay | >= 1.0.33 | |

## My video player does not support the LL-HLS spec. Can it still play a low-latency live stream?

**Maybe**. Apple's LL-HLS specification is backward compatible, so your video player should fall back to playing standard HLS.
Those viewers will have noticeably higher glass-to-glass latency. However, your video player does need to support demuxed audio & video tracks
(each track requested separately) in MP4 format for this fallback to work. Most video players already support demuxed audio &
video tracks in MP4 format.

## My video player is running into issues when playing a low-latency live stream. Can I play the same live stream without the low-latency?

**Yes**. You can add the `low_latency=false` parameter to the video playback URL, and Mux will deliver the same live stream
using standard HLS instead. However, your video player does need to support demuxed audio & video tracks
(each track requested separately) in MP4 format for the `low_latency=false` parameter to work.

```text
https://stream.mux.com/{PLAYBACK_ID}.m3u8?low_latency=false
```

If your `playback_id` is `signed`, then all query parameters, including `low_latency`, need to be added to the claims body. Take a look at
the [signed URLs guide](/docs/guides/secure-video-playback) for details.
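
As a sketch, here is what the claims body might look like with `low_latency` folded in. The placeholder IDs, the 1-hour expiry, and the `"v"` audience value are assumptions for illustration; see the signed URLs guide for the authoritative claim set and signing steps.

```python
import time

playback_id = "YOUR_PLAYBACK_ID"        # hypothetical placeholder
signing_key_id = "YOUR_SIGNING_KEY_ID"  # hypothetical placeholder

# With a signed playback ID, query parameters move into the JWT claims body
# instead of the URL query string.
claims = {
    "sub": playback_id,
    "aud": "v",                          # video playback audience
    "exp": int(time.time()) + 3600,      # token valid for 1 hour
    "kid": signing_key_id,
    "low_latency": "false",              # replaces ?low_latency=false
}
# Sign `claims` with your private key using RS256, then request
# https://stream.mux.com/{PLAYBACK_ID}.m3u8?token={JWT}
```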

## How do you enable LL-HLS playback on VideoJS player prior to 8.0.0?

You can enable low latency playback using Apple's LL-HLS spec by initializing `videojs-http-streaming`
module with the `experimentalLLHLS` flag along with any other options.

```js
var player = videojs(video, {
  html5: {
    vhs: {
      experimentalLLHLS: true
    }
  }
});
```

## Can I change my existing live stream's latency to low latency?

Yes. You can change your existing live stream's latency to any of the available options
(`standard`, `reduced`, or `low`) using the <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">Live Stream PATCH</ApiRefLink>.
You can only call this API while the live stream status is `idle`, which is helpful for migrating your live stream based on the streamer's
requirements. After Mux successfully processes this request, your webhook endpoint also receives a `video.live_stream.updated` event.

```json
//PATCH https://api.mux.com/video/v1/live-streams/{LIVE_STREAM_ID}

{
    "latency_mode": "low"
}
```
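
As a sketch, the PATCH above could be sent from Python with only the standard library. The token and live stream ID variables are placeholders you would fill in with your own values.

```python
import base64
import json
import urllib.request

mux_token_id = "YOUR_TOKEN_ID"          # placeholder
mux_token_secret = "YOUR_TOKEN_SECRET"  # placeholder
live_stream_id = "YOUR_LIVE_STREAM_ID"  # placeholder

# Mux API authentication is HTTP Basic with the token ID and secret.
auth = base64.b64encode(f"{mux_token_id}:{mux_token_secret}".encode()).decode()

req = urllib.request.Request(
    f"https://api.mux.com/video/v1/live-streams/{live_stream_id}",
    data=json.dumps({"latency_mode": "low"}).encode(),
    method="PATCH",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {auth}",
    },
)
# With real credentials, send it (remember: the live stream must be idle):
# urllib.request.urlopen(req)
```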


# Show live stream health stats to your streamer
Learn how to get the live stream health stats using the Live Stream Stats API.
In this guide you will learn how to use the Live Stream Health Stats<BetaTag /> API to embed the health stats for a particular live stream ID into your applications. A common use case is showing the live stream stats to your streamer during a live event, so that the streamer can monitor the stream's status and take action when issues occur.

<Callout type="info">
  The Live Stream Stats API is not a 1:1 mapping of what you see on the [Live Stream Health page in the Mux dashboard](/docs/guides/debug-live-stream-issues).
</Callout>

You will use JSON Web Tokens to authenticate to this API.

## 1. Understand Live Stream Stats

The Live Stream Stats API returns **Stream Drift Session Average**, **Stream Drift Deviation From Rolling Average**, and **Status**. Before we dive into each of them, understanding a couple of terms here might be helpful:

* **Wallclock time**: Also called the real-world time.
* **Stream drift**: The difference between elapsed media time and elapsed wallclock time. For example, if your encoder has been connected for 10 seconds and it has sent 5 seconds of media during that time, then your current stream drift would be 5s.

Now keep reading below for the metrics the API returns and their definitions.

### Stream Drift Session Average

**Stream Drift Session Average** is the running average of **stream drift** for the lifetime of an ingest connection. It applies a smoothing function to the potentially jagged, fluctuating raw metric. Use this metric as an indication of the average offset between the elapsed wallclock time and media time throughout the whole session.

The value we return from the API is measured in milliseconds and is continuously updated with each measurement taken. It is reset whenever the encoder disconnects.

### Stream Drift Deviation From Rolling Average

To get an indication of whether the current drift is consistent (good) or growing (bad), use **Deviation From Rolling Average**. It is the difference between the current **stream drift** and the current **stream drift rolling average**. The rolling average only takes the last ~30s of data into account, so it represents the recent drift rather than measurements taken a potentially long time ago. Disparities between the current drift and the rolling average can be a good indicator because the session average moves more slowly and may not reflect the latest status.

Use this metric to understand whether the stream is experiencing issues at the moment. When it is, the **Deviation From Rolling Average** will likely be high.

### Status

The `status` returned from the Live Stream Health API could be any of the following values: `excellent`, `good`, `poor`, or `unknown`.

* `excellent`: The Stream Drift Deviation From Rolling Average is less than or equal to 500ms
* `good`: The Stream Drift Deviation From Rolling Average is less than or equal to 1s but greater than 500ms
* `poor`: The Stream Drift Deviation From Rolling Average is greater than 1s
* `unknown`: We are unable to calculate the stream drift. This is usually because the live stream is inactive and/or we have not received any data about it for a few minutes.

Use `status` as an indicator of the latest health status of the live stream ingest. A common use case is to render color coded UI for your streamer's ease-of-use based on the status information, such as green, yellow, or red. You can also check out our pre-built UI to monitor the status by going to the Mux Dashboard for the specific live stream.
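
A minimal sketch of that color-coded mapping. Only the four status values come from the API; the color choices themselves are up to you.

```python
# Map each documented status value to a traffic-light color for the UI.
STATUS_COLORS = {
    "excellent": "green",
    "good": "yellow",
    "poor": "red",
    "unknown": "gray",
}

def status_color(status: str) -> str:
    """Fall back to gray for anything unexpected."""
    return STATUS_COLORS.get(status, "gray")
```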

## 2. Create a Signing Key

Signing keys can be managed (created, deleted, listed) from the [Signing Keys settings](https://dashboard.mux.com/settings/signing-keys) of the Mux dashboard or via the Mux System API.

<Callout type="warning">
  When making a request to the System API to generate a signing key, the access
  token being used must have the System permission. You can confirm whether your
  access token has this permission by going to Settings > API Access Token. If
  your token doesn't have the System permission listed, you'll need to generate
  another access token with all of the permissions you need, including the
  System permission.
</Callout>

When creating a new signing key, the API will generate a 2048-bit RSA key pair and return the private key and a generated key ID; the public key will be stored at Mux to validate signed tokens. Store the private key in a secure manner.

You probably only need one signing key active at a time and can use the same signing key when requesting live stream stats for multiple live streams. However, you can create multiple signing keys to enable key rotation, creating a new key and deleting the old only after any existing signed URLs have expired.

### Example request

```bash
curl -X POST \
-H "Content-Type: application/json" \
-u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET} \
'https://api.mux.com/system/v1/signing-keys'
```

### Example response

```json
// POST https://api.mux.com/system/v1/signing-keys
{
  "data": {
    "private_key": "(base64-encoded PEM file with private key)",
    "id": "(unique signing-key identifier)",
    "created_at": "(UNIX Epoch seconds)”
  }
}
```

<Callout type="warning">
  Be sure that the signing key's environment (Staging, Production, etc.) matches
  the environment of the live streams you would like to call for! When creating a signing
  key via API, the environment of the access token used for authentication will
  be used.
</Callout>

This can also be done manually via the UI. If you choose to create and download your signing key as a PEM file from the UI, you will need to base64 encode it before using it with (most) libraries.

```bash
❯ cat /path/to/file/my_signing_key.pem | base64
LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktL...
```

## 3. Generate a JSON Web Token

The following JWT claims are required:

| Claim Code | Description                | Value                                                                                                                                                              |
| :--------- | :------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `sub`      | Subject of the JWT         | The live stream ID for which stats will be returned                                                                                                                |
| `aud`      | Audience (identifier type) | `live_stream_id` (Mux Video Live Stream ID) |
| `exp`      | Expiration time            | UNIX Epoch seconds when the token expires. Use this to ensure any tokens that are distributed become invalid after a period of time.                               |
| `kid`      | Key Identifier             | Key ID returned when signing key was created                                                                                                                       |

<Callout type="warning">
  Live Stream ID is available to Mux
  Video customers only and is generated by Mux. Be sure to double check both
  the query ID type and value!
</Callout>

### Expiration time

Expiration time should be at least the duration of the live stream. When the signed URL expires, you will no longer be able to receive live stream stats data from the API.

<Callout type="info">
  [See the related video documentation](/docs/guides/secure-video-playback#expiration-time)
</Callout>

## 4. Signing the JWT

The steps can be summarized as:

1. Load the private key used for signing
2. Assemble the claims (`sub`, `aud`, `exp`, `kid`, etc.) in a map
3. Encode and sign the JWT using the claims map, the private key, and the RS256 algorithm.

There are dozens of software libraries for creating and reading JWTs. Whether you’re writing in Go, Elixir, Ruby, or a dozen other languages, don’t fret, there’s probably a JWT library that you can rely on. For a list of open source libraries to use, check out [jwt.io](https://jwt.io/libraries).

<Callout type="warning">
  The following examples assume you're working with either a private key
  returned from the API, or copy & pasted from the Dashboard, **not** when
  downloaded as a PEM file. If you've downloaded it as a PEM file, you will need
  to base64 encode the file contents.
</Callout>

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"
    "github.com/golang-jwt/jwt/v4"
)

func main() {

    myId := ""                   // Enter the live stream id for which you would like to get stats here
    myIdType := "live_stream_id" // The Live Stream Stats API always uses live_stream_id as the audience
    keyId := ""      // Enter your signing key id here
    key := ""        // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": myId,
        "aud": myIdType,
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// using @mux/mux-node@8

import Mux from '@mux/mux-node';
const mux = new Mux();
const myId = ''; // Enter the id for which you would like to get counts here
const myIdType = ''; // Enter the type of ID provided in myId; one of video_id | asset_id | playback_id | live_stream_id
const signingKeyId = ''; // Enter your Mux signing key id here
const privateKeyBase64 = ''; // Enter your Mux base64 encoded private key here

const getViewerCountsToken = async () => {
    return await mux.jwt.signViewerCounts(myId, {
        expiration: '1 day',
        type: myIdType,
        keyId: signingKeyId,
        keySecret: privateKeyBase64,
    });
};

const sign = async () => {
    const token = await getViewerCountsToken();
    console.log(token);
};

sign();

```

```php

<?php

  // Using https://github.com/firebase/php-jwt

  use \Firebase\JWT\JWT;

  $myId = "";                    // Enter the live stream id for which you would like to get stats here
  $myIdType = "live_stream_id";  // The Live Stream Stats API always uses live_stream_id as the audience
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
  "sub" => $myId,
  "aud" => $myIdType,
  "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
  "kid" => $keyId
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

my_id = ''                     # Enter the live stream id for which you would like to get stats here
my_id_type = 'live_stream_id'  # The Live Stream Stats API always uses live_stream_id as the audience
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

payload = {
    'sub': my_id,
    'aud': my_id_type,
    'exp': int(time.time()) + 3600, # 1 hour
}
headers = {
    'kid': signing_key_id
}

encoded = jwt.encode(payload, private_key, algorithm="RS256", headers=headers)
print(encoded)

```

```ruby

require 'base64'
require 'jwt'

def sign_url(subject, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: subject, aud: audience, exp: expires.to_i, kid: signing_key_id}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

my_id = ''                       # Enter the live stream id for which you would like to get stats here
my_id_type = 'live_stream_id'    # The Live Stream Stats API always uses live_stream_id as the audience
signing_key_id = ''        # Enter your signing key id here
private_key_base64 = ''    # Enter your base64 encoded private key here

token = sign_url(my_id, my_id_type, Time.now + 3600, signing_key_id, private_key_base64)
puts token

```



## 5. Making a Request

Supply the JWT in the resource URL using the `token` query parameter. The API will inspect and validate the JWT to make sure the request is allowed.

Example:

```bash
curl 'https://stats.mux.com/live-stream-health?token={JWT}'
```

Response:

```json
{
  "data": [
    {
      "ingest_health": {
        "updated_at": "2022-11-14T17:32:23",
        "stream_drift_session_avg": 384,
        "stream_drift_deviation_from_rolling_avg": 12,
        "status": "excellent"
      }
    }
  ]
}
```

* `stream_drift_session_avg` is the session average of stream drift. Use this to represent the overall health of the stream.
* `stream_drift_deviation_from_rolling_avg` is the delta between the current stream drift and the rolling average. Use this to represent the latest stream health.
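
To illustrate how the deviation field relates to the documented `status` thresholds, here is a small sketch that classifies a deviation value the same way the API does. This is a reimplementation for illustration only, not the API's actual code; in practice you should read `status` directly from the response.

```python
def classify_deviation(deviation_ms):
    """Mirror the documented thresholds for
    stream_drift_deviation_from_rolling_avg (milliseconds)."""
    if deviation_ms is None:
        return "unknown"
    if deviation_ms <= 500:
        return "excellent"
    if deviation_ms <= 1000:
        return "good"
    return "poor"

# The sample response above, with a deviation of 12ms:
ingest_health = {
    "stream_drift_session_avg": 384,
    "stream_drift_deviation_from_rolling_avg": 12,
}
print(classify_deviation(ingest_health["stream_drift_deviation_from_rolling_avg"]))
# prints "excellent"
```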


# Manage stream keys
Learn how to manage stream keys and enable/disable access to go live.
When live streaming, a stream key is used by a broadcaster to receive a live stream for a Mux account. Stream keys, by nature, are private and should be handled with care. This means that access to the stream key should be reserved for the broadcaster and hidden from end users.

## 1. Use case - Single stream

Single live stream configurations are great for when only one stream will ever be active at a time, or for disposable, single-use live streams.

For example, if you are hosting a conference where the agenda is a back-to-back track of speakers, a single live stream fits this scenario.

## 2. Use case - Multiple streams

Multiple live stream configurations are used in situations where multiple live streams are expected. Some reasons you might choose this live stream configuration include:

* Multiple concurrent streams that overlap in when they go live
* User generated content where going live can happen at any time and there is no established schedule

## Concurrent live streams

When working with multiple streams that can overlap in real time, use multiple live stream configurations. Each live stream configuration can be tied to its own live stream event.

For example, if you are hosting different concurrent events, each event would need an individual live stream configuration created.

If you want to control the ability to accept a live stream, you can use the <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">enable live stream</ApiRefLink> and <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">disable live stream</ApiRefLink> API endpoints. These endpoints can be called based on your business logic from your CMS/backend to control your content creator's ability to go live.

## User generated content

If your solution allows your users to go live at any time, a live stream configuration for each potential content creator will need to be created. As described below, the Mux live stream configuration `id` will be tied to each content creator using your service.

When provisioning your user as a content creator, <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create a live stream</ApiRefLink> configuration that will be used solely by *this* content creator. The `data.id` response value needs to be stored within your CMS so that it can be used to deliver the live stream to end users when the content creator goes live. A live stream configuration created for a content creator can be reused by that content creator over their life span.

The `data.stream_key` value *could* also be stored in the CMS in case the content creator wants to recall the stream key at a later time.

Another option is to pass the stream key through to the content creator at provision time without storing it. A common related use case is resetting the stream key for a given live stream configuration. To do this, Mux offers a <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">reset stream key</ApiRefLink> API.

## Advanced options

### Reset stream key

If a stream key needs to be reset for a live stream configuration because it was lost or compromised, the <ApiRefLink href="/docs/api-reference/video/live-streams/reset-stream-key">reset stream key</ApiRefLink> can be used to regenerate the stream key.
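
A sketch of that call using only the Python standard library. The token and live stream ID variables are placeholders; check the reset stream key API reference for the authoritative request and response shape.

```python
import base64
import urllib.request

mux_token_id = "YOUR_TOKEN_ID"          # placeholder
mux_token_secret = "YOUR_TOKEN_SECRET"  # placeholder
live_stream_id = "YOUR_LIVE_STREAM_ID"  # placeholder

auth = base64.b64encode(f"{mux_token_id}:{mux_token_secret}".encode()).decode()

# POST to the reset-stream-key endpoint; the response body
# contains the regenerated stream_key.
req = urllib.request.Request(
    f"https://api.mux.com/video/v1/live-streams/{live_stream_id}/reset-stream-key",
    method="POST",
    headers={"Authorization": f"Basic {auth}"},
)
# urllib.request.urlopen(req)  # send once real credentials are in place
```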

### Complete live stream

Typically, when a content creator has ended their live stream session by stopping a stream, Mux will wait for the duration configured for the live stream's `reconnect_window` before making it available as an on-demand asset.

To make a live stream available immediately, you can <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">signal live stream complete</ApiRefLink> to immediately make the live stream available as an on-demand asset.

<Callout type="info">
  Mux does not close the encoder connection immediately. Encoders are often
  configured to re-establish connections immediately which would result in a new
  recorded asset. For this reason, Mux waits for 60s before closing the
  connection with the encoder. This 60s timeframe is meant to give encoder
  operators a chance to disconnect from their end.
</Callout>

## Enable live stream

To enable a live stream configuration so that it is able to receive an RTMP session, call the <ApiRefLink href="/docs/api-reference/video/live-streams/enable-live-stream">enable live stream</ApiRefLink> API endpoint.

<Callout type="info">
  By default, all newly created live stream configurations are enabled.
</Callout>

## Disable live stream

Should you want to disable a live stream configuration so that it no longer accepts RTMP sessions, call the <ApiRefLink href="/docs/api-reference/video/live-streams/disable-live-stream">disable live stream</ApiRefLink> API endpoint.

<Callout type="info">
  Unlike <ApiRefLink href="/docs/api-reference/video/live-streams/signal-live-stream-complete">signal live stream complete</ApiRefLink>, Mux closes the encoder connection immediately with this API. Any attempt
  from the encoder to re-establish the connection will fail until the live
  stream is re-enabled.
</Callout>
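
The two calls can be sketched with the standard library as follows. The token and live stream ID variables are placeholders, and the request shape (a bare `PUT` to `/enable` or `/disable`) should be confirmed against the API reference linked above.

```python
import base64
import urllib.request

mux_token_id = "YOUR_TOKEN_ID"          # placeholder
mux_token_secret = "YOUR_TOKEN_SECRET"  # placeholder
live_stream_id = "YOUR_LIVE_STREAM_ID"  # placeholder

auth = base64.b64encode(f"{mux_token_id}:{mux_token_secret}".encode()).decode()

def toggle_live_stream(action: str) -> urllib.request.Request:
    """Build an enable/disable request; action is 'enable' or 'disable'."""
    return urllib.request.Request(
        f"https://api.mux.com/video/v1/live-streams/{live_stream_id}/{action}",
        method="PUT",
        headers={"Authorization": f"Basic {auth}"},
    )

# urllib.request.urlopen(toggle_live_stream("disable"))  # stop accepting RTMP
# urllib.request.urlopen(toggle_live_stream("enable"))   # allow streaming again
```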


# Stream recordings of live streams
Every live stream on Mux is automatically recorded as an `asset`.
When playing back a live stream, Mux offers you two options for historical playback. Both options are available at any time; you can switch between the two at will.

1. **Non-DVR mode** - keep all users "live". Only a small portion of your live stream (approximately 30 seconds) will be exposed to your viewers through your player.
2. **DVR mode** - allow your users to scrub back to the beginning of the live stream whenever they want in your player.

The 3 Mux concepts we need to understand here are:

* <ApiRefLink href="/docs/api-reference/video/live-streams">Live streams</ApiRefLink>: This is the top level live streaming resource. Your stream key maps back to a single live stream. Live streams are reusable. Each Live stream has one or more playback IDs associated with it.
* <ApiRefLink href="/docs/api-reference/video/assets">Assets</ApiRefLink>: Assets are videos on demand. In Mux, assets get created by either: direct uploads, creating an asset with an input URL, or from recordings of live streams. Each Asset has one or more playback IDs associated with it.

{/* we have to add that darn <i></i> to disable gfm autolink literals */}

* <ApiRefLink href="/docs/api-reference/video/playback-id">Playback IDs</ApiRefLink>: Playback IDs are the resource that controls playback. A playback ID may point to either a live stream OR an asset and it can be either public or signed. More information on [signed playback IDs is here](/docs/guides/secure-video-playback). A playback ID is the identifier that you use in a `stream.mux.com` URL of the form: <pre>https://<i />stream.mux.com/{"{"}PLAYBACK\_ID{"}"}.m3u8</pre>.

## DVR mode vs. non-DVR mode

In non-DVR Mode, all users viewing the live stream will be viewing the most recent content. The player will have access to approximately the most recent 30 seconds of content.

In order to use non-DVR mode, construct your playback URL with the playback ID *associated with the live stream*.

<Callout type="success" title="Non-DVR Mode is most common">
  Most uses of live streaming opt for non-DVR mode and, if you are unsure about which to use, we recommend that you stick with non-DVR mode.
</Callout>

When using DVR mode, the player will have access to your stream's content going all the way back to the beginning of the live stream.

<Callout type="warning" title="Be careful with DVR Mode and long lived streams">
  Mux does not recommend DVR mode for live streams longer than four hours. If you expect long live streams, you should use non-DVR mode.
</Callout>

If you choose to use DVR mode, then you should construct your playback URL using the playback ID *associated with the live stream's `active_asset_id`*.
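
The two URL constructions can be summarized in a short sketch; the playback IDs here are placeholders.

```python
def playback_url(playback_id: str) -> str:
    """Build an HLS playback URL from a Mux playback ID."""
    return f"https://stream.mux.com/{playback_id}.m3u8"

# Non-DVR mode: use a playback ID attached to the live stream itself.
live_url = playback_url("LIVE_STREAM_PLAYBACK_ID")

# DVR mode: use a playback ID attached to the asset referenced by the
# live stream's active_asset_id.
dvr_url = playback_url("ACTIVE_ASSET_PLAYBACK_ID")
```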

## Assets created from live streams

Mux will automatically start creating an asset in the background when you begin broadcasting to your live stream. This asset has two purposes:

* You can use the asset directly in order to enable DVR mode playback.
* When the live stream is over, you can use the asset to play back the recording of the live stream.

Since assets are automatically created from every live stream and live streams can be re-used as many times as you want, Mux creates a new asset every time a live stream begins broadcasting. A single live stream can end up producing an indefinite number of assets.

The lifecycle of events produced by a Mux live stream is as below.

| Step | Event | Description |
|------|-------|-------------|
| 1 | Initial State | Live stream begins in status `idle` |
| 2 | `video.live_stream.connected` | The encoder has connected. At this point in time the live stream will have a new `active_asset_id`. The `active_asset_id` is the ID that points to a new asset that Mux is creating for this live stream. |
| 3 | `video.asset.created` | The asset corresponding to the `active_asset_id` from step 2 gets created. This asset has a `live_stream_id` that points back to the live stream it was created from. This asset does not have any content yet, it is a placeholder that will be getting content from the ingested live stream. |
| 4 | `video.live_stream.recording` | Mux has started recording the incoming content. The live stream's status will still be `idle` at this point. |
| 5 | `video.live_stream.active` | The live stream's state has transitioned to `active`. **When in non-DVR mode**, the live stream's playback ID can now be used to build a playback URL on `stream.mux.com`. |
| 6 | `video.asset.ready` | The asset (`active_asset_id`) from step 2 and 3 will be "ready" at around the same time that the live stream is "active". This asset only has about 10 seconds worth of content at this point. The `duration` on this asset reflects the current playable duration. If you are using DVR mode, it is at this point that you can use the `active_asset_id` to build a playback URL on `stream.mux.com`. |
| 7 | `video.live_stream.disconnected` | The encoder has disconnected, and the live stream status is still `active`. Please note that live streams that do not use the `"latency_mode": "reduced"` option will enter a reconnect window (defaulting to a duration of 60 seconds) after disconnecting. The encoder can re-connect within this reconnect window and, in doing so, pick back up where it left off with the same `active_asset_id`. For more information, please consult [handling live stream disconnects](/docs/guides/handle-live-stream-disconnects). |
| 8 | `video.live_stream.idle` | After the encoder has stayed disconnected for the duration of the reconnect window, the live stream will transition back to status `idle`. This live stream will no longer have an active asset associated with it, but for ease of use this event will include the `active_asset_id` of the asset that is just ending. The next time an encoder connects, this lifecycle will start back at step 1 with a new `active_asset_id`. |
| 9 | `video.asset.live_stream_completed` | This event fires at the same time as the live stream transitions back to `idle`. This event tells you that the asset is finalized. The `duration` of the asset will now be the full, finalized duration; you can use the playback ID in your player to play the recording of the live stream. |

Please note that some of these webhook events correspond to the `live_stream` resource and others correspond to the `asset`.

More information about configuring and using webhooks can be found in the [webhooks guide](/docs/core/listen-for-webhooks).
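
As a sketch, a webhook handler that tracks the lifecycle in the table above might dispatch on the event type like this. The function name and return values are illustrative, not part of the Mux API.

```python
def handle_mux_event(event: dict) -> str:
    """Route the lifecycle events from the table above."""
    event_type = event.get("type", "")
    if event_type == "video.live_stream.active":
        # Non-DVR playback can start with the live stream's playback ID.
        return "go_live"
    if event_type == "video.asset.ready":
        # DVR playback can start with the active asset's playback ID.
        return "dvr_ready"
    if event_type == "video.asset.live_stream_completed":
        # The recording is finalized; switch viewers to the on-demand asset.
        return "recording_ready"
    return "ignored"
```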


# Stream live to 3rd party platforms
Also known as Restreaming, Live Syndication, Rebroadcasting, or RTMP Passthrough.
## 1. What does Simulcasting do?

With the Simulcasting feature, developers can enable their users to publish live streams on social platforms.

The Mux Video API makes it easy for developers to build live streaming into their applications. Combined with simulcasting, existing features like Persistent Stream Keys and Automatic Live Stream Recording together provide a way to connect with a number of social sharing apps.

What Simulcasting can help you do:

* Forward a live stream on to social networks like YouTube, Facebook, and Twitch
* Let users publish user-generated live streams on social platforms
* Connect with a number of social sharing apps

<Callout type="info" title="Other names for Simulcasting">
  Other domains may use varying terminology to refer to the same general process including:

  * Restreaming
  * Live Syndication
  * Rebroadcasting
  * RTMP Passthrough
  * [Multistreams - a term used by Crowdcast](https://www.crowdcast.io/multistreams)
</Callout>

## 2. Select a Simulcast Target supported by Mux

Mux Simulcasting works with any arbitrary RTMP server. That means Mux will support Simulcast Targets from any platform that supports the RTMP or RTMPS protocol.

Targets that are supported include but are not limited to the following:

* Facebook Live
* YouTube Live
* Twitch
* Crowdcast
* Vimeo

Unfortunately the following Targets are not supported:

* Instagram (you can only go live from the Instagram app)

## 3. Add simulcasting to a Mux live stream

Use the Mux API to add simulcasting to a live stream.

The first step is to add a Simulcast Target. You can do this when the Live Stream object is first created, or anytime afterward. Note that Simulcast Targets can only be added while the Live Stream object is not active.

Here is an example of adding a Simulcast Target for each additional platform the stream should be published to:

```text
POST https://api.mux.com/video/v1/live-streams
```

```json
{
  "playback_policies": [
    "public"
  ],
  "new_asset_settings": {
    "playback_policies": [
      "public"
    ]
  },
  "simulcast_targets" : [
    {
      "url" : "rtmp://a.rtmp.youtube.com/live2",
      "stream_key" : "12345",
      "passthrough" : "YouTube Example"
    },
    {
      "url" : "rtmps://live-api-s.facebook.com:443/rtmp/",
      "stream_key" : "12345",
      "passthrough" : "Facebook Example"
    }
  ]
}
```
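
You can also add a target to an existing (non-active) live stream via the simulcast targets endpoint described in the API reference. A sketch with the standard library follows; the token and live stream ID variables are placeholders.

```python
import base64
import json
import urllib.request

mux_token_id = "YOUR_TOKEN_ID"          # placeholder
mux_token_secret = "YOUR_TOKEN_SECRET"  # placeholder
live_stream_id = "YOUR_LIVE_STREAM_ID"  # placeholder

auth = base64.b64encode(f"{mux_token_id}:{mux_token_secret}".encode()).decode()

target = {
    "url": "rtmp://a.rtmp.youtube.com/live2",
    "stream_key": "12345",
    "passthrough": "YouTube Example",
}

req = urllib.request.Request(
    f"https://api.mux.com/video/v1/live-streams/{live_stream_id}/simulcast-targets",
    data=json.dumps(target).encode(),
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {auth}",
    },
)
# urllib.request.urlopen(req)  # the live stream must not be active
```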

## 4. Find your RTMP Credentials on any supported platform

As defined in the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream-simulcast-target">Simulcast Targets API reference</ApiRefLink>, RTMP credentials consist of two parts:

* a `url`, which is the RTMP hostname, including the application name, for the third party live streaming service
* a `stream_key`, which is the password-like stream identifier that the third party live streaming service uses to receive the simulcast of the parent live stream.

Note that stream keys are sensitive and should be treated with caution the same way you would with an API key or a password.

<Callout type="info" title="Specific examples, steps, and setting recommendations">
New to live streaming? In this blog post we provide a step-by-step outline of how to get RTMP credentials from Twitch, YouTube, or Facebook.

  Not sure what settings to use?
As for settings, a recommendation for your end users is 4,000 kbps at 720p resolution with 2-second keyframe intervals. The post also provides an in-depth explanation of how to choose personalized settings.

  [Help Your Users be in 5 Places at Once: Your Guide to Simulcasting](https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/)
</Callout>

## 5. More information about simulcasting

### Pricing

Simulcasting has an added cost on top of live streaming, but like all of our pricing, you only pay for what you use.

See the [Pricing Page](https://mux.com/pricing/) for details.

### Availability

There's a limit of 6 simulcasts/restreams per live stream. Let us know if you have a use case that requires more.

### Blog Posts about Simulcasting

We have several blog posts that cover more topics about simulcasting products, if you want to read more:

* [Seeing double? Let your users simulcast (aka restream) to any social platform](https://mux.com/blog/seeing-double-let-your-users-simulcast-a-k-a-restream-to-any-social-platform/)
* [Help Your Users be in 5 Places at Once: Your Guide to Simulcasting](https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/)


# Handle Live Stream Disconnects
In this guide we will walk through how to handle disconnects that happen during live streams.
## How Mux handles disconnects

This guide assumes you have already created and set up a Live Stream:

* You have connected your encoder (for example OBS, Wirecast, or your live streaming app) to an RTMP ingest server as covered in this guide: [Configure Broadcast Software](/docs/guides/configure-broadcast-software).
* Mux sends the `video.live_stream.connected` event to your environment.
* When the encoder starts sending media to the ingest server, the webhook events `video.live_stream.recording` and then `video.live_stream.active` are delivered to your environment.

If everything goes smoothly, the encoder will keep sending media and the server will keep processing it, creating video segments and updating the HLS playlists with new pieces of video (to understand how this works, read [Reduce live stream latency](/docs/guides/reduce-live-stream-latency)). Since all of this streaming is happening live, the ingest server needs to know what it should do when the encoder disconnects unexpectedly.

What happens when the live stream disconnects either intentionally or due to a drop in the network? Mux sends the `video.live_stream.disconnected`
event for the live stream to your environment. This is where the `reconnect_window` comes into play.

## Reconnect Window

When you <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create the Live Stream</ApiRefLink> or <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">update the Live Stream</ApiRefLink>, you can set the `reconnect_window` parameter in the Request JSON.

The Reconnect Window is the time in seconds that Mux waits for the live stream broadcasting software to reconnect before considering the live stream finished and completing the recorded asset. By default, Mux sets `reconnect_window` to 60 seconds for Standard Latency streams and zero seconds for Reduced and Low Latency streams, but it can be adjusted to any value between 0 and 1800 seconds.

<Callout type="info">
  Reconnect Window is supported for all latency modes of the live stream: "standard", "reduced", and "low".
</Callout>
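As a minimal sketch, a create-live-stream request body with an explicit reconnect window might look like this. The helper function and its validation are illustrative, not part of the Mux API:

```javascript
// Sketch: build a create-live-stream body with a 5-minute reconnect window.
// The API rejects values outside 0-1800 seconds, so validate before sending.
function withReconnectWindow(body, seconds) {
  if (seconds < 0 || seconds > 1800) {
    throw new RangeError("reconnect_window must be between 0 and 1800 seconds");
  }
  return { ...body, reconnect_window: seconds };
}

const createBody = withReconnectWindow(
  { playback_policy: ["public"], new_asset_settings: { playback_policy: ["public"] } },
  300 // wait 5 minutes for the encoder to reconnect
);
```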

### Reconnect Window and slates

When you <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">create the Live Stream</ApiRefLink> or <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">update the Live Stream</ApiRefLink>,
you can set the `reconnect_slate_url` parameter with the URL of the slate image.

Slate insertion can help output a live stream for viewers without interruptions. Below are some examples where Mux receives an imperfect stream and how Mux handles the output:

* If the input contains only audio for the relevant time, the most recent video frame is duplicated
* If the input contains only video for the relevant time, Mux will output silent audio
* If a slate is inserted and the input has no audio or video (including because the encoder was disconnected), a slate period begins: Mux outputs silent audio and duplicates the most recent video frame. After 0.5 seconds, Mux switches to the slate image and continues to send silent audio. If the encoder is still connected, Mux disconnects it after 5 minutes. Mux then continues inserting slates for up to the duration of the `reconnect_window`. Viewers may experience a maximum slate duration of up to 5 minutes beyond the `reconnect_window` duration

When Mux stops receiving media, it adds the slate image as video frames to the live stream. Not receiving media also disconnects the encoder and starts the `reconnect_window` time interval.
Mux stops adding the slate image when it starts receiving media again or when the reconnect window expires.

Enable slates for `standard`, `reduced`, and `low` latency mode live streams:

* For `standard` latency live streams, set the `use_slate_for_standard_latency` parameter to `true` and make sure the `reconnect_window` parameter value is greater than 0s. Live streams created before the slate image functionality was available will not automatically start using slates until this parameter is set.
* For `reduced` and `low` latency mode live streams, set the `reconnect_window` parameter value to greater than 0s.
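Putting those rules together, here is a sketch of live stream settings that enable slates for a `standard` latency stream. The image URL is a placeholder:

```javascript
// Sketch: settings that enable slates on a standard latency live stream.
const slateSettings = {
  latency_mode: "standard",
  use_slate_for_standard_latency: true,  // required for standard latency streams
  reconnect_window: 120,                 // must be greater than 0 for slates
  reconnect_slate_url: "https://example.com/slate-16x9.png", // placeholder; must stay downloadable
};
```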

Mux selects one of the following images as the default slate image depending on the live stream's video aspect ratio. The default slate image is used unless you
set the `reconnect_slate_url` parameter. We recommend setting a slate image whose aspect ratio matches the live stream's video aspect ratio. You can modify
the `reconnect_slate_url` parameter using the <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">update Live Stream API</ApiRefLink>.

<Image src="/docs/images/slate-laptop-illustration-horizontal.png" width={640} height={360} caption="Horizontal Laptop Illustration" />

<Image src="/docs/images/slate-laptop-illustration-vertical.png" width={360} height={640} caption="Vertical Laptop Illustration" sm />

Mux downloads the slate image hosted at the `reconnect_slate_url` at the start of the live stream recording, so you must ensure the image is always
downloadable from that URL. When Mux cannot download the image, the default slate image (shown above) is used, and the `video.live_stream.warning` webhook
event for the live stream, as well as the `video.asset.warning` webhook event for the asset, is fired. Below is an example of the webhook event body:

```json
{
    "type": "video.live_stream.warning", // or "video.asset.warning"
    "object": {
      "type": "live",
      "id": "CiinCsHA2EbsU00XwzherzjWAek3VmtUz8"
    },
    "id": "3a56ac3d-33da-4366-855b-f592d898409d",
    "environment": {
      "name": "Production",
      "id": "j0863n"
    },
    "data": {
      "warning": {
        "type": "custom_slate_unavailable",
        "message": "Unable to download custom reconnect slate image from URL 'http://example.com/bad_url.png' -- using black frames for slate if needed."
      },
      "stream_key": "5203dc64-074a-5914-0dfc-ce007f5db53a",
      "status": "idle",  // or "preparing"
      "id": "CiinCsHA2EbsU00XwzherzjWAek3VmtUz8",
      "active_asset_id": "0201p02fGKPE7MrbC269XRD7LpcHhrmbu0002"
    },
    "created_at": "2022-07-14T21:08:27.000000Z",
    "accessor_source": null,
    "accessor": null,
    "request_id": null
  }
```

<Callout type="info">
  The `status` parameter in the webhook event body (shown above) is `idle` for the live stream event and `preparing` for the asset event, matching the corresponding object's `status` value.
</Callout>
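In a webhook handler, you might detect this warning with a small check like the following. This is a sketch that assumes the event body has already been parsed from JSON:

```javascript
// Sketch: detect the custom-slate warning in a parsed Mux webhook event.
function isCustomSlateWarning(event) {
  return (
    (event.type === "video.live_stream.warning" ||
      event.type === "video.asset.warning") &&
    event.data?.warning?.type === "custom_slate_unavailable"
  );
}
```

If the check matches, you could alert the stream owner that their slate URL is unreachable.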

## How to handle reconnects from the Player/Client side without a slate image

When Mux is not receiving any media, the viewer experience depends on whether Mux starts receiving media before the reconnect window expires.
We strongly recommend <ApiRefLink href="/docs/api-reference/video/live-streams/update-live-stream">updating the Live Stream</ApiRefLink> to add a slate image.
However, if you choose not to add a slate image, there are two possible scenarios.

### In scenario 1, the encoder reconnects

The ingest server will wait for the duration of the `reconnect_window` before it ends the live stream. While the encoder is disconnected, media is no longer being sent, so the HLS playlists are not getting new segments of video.

<Callout type="info">
  # Stalled player during live stream

  A stalled player during a live stream happens when the live stream is still active, but the HLS manifest file is not getting new video segments appended to it.

  The player will enter a `stalled` state if it runs out of buffer. To avoid this, consider adding extra buffer to your player.
</Callout>

If the encoder reconnects before the `reconnect_window` expires then the HLS playlist will resume appending new video segments to the live stream.

### In scenario 2, the encoder does not reconnect

If the encoder does not reconnect before the `reconnect_window` expires, the following events will occur:

1. Mux writes an `EXT-X-ENDLIST` tag to the HLS playlist. According to the HLS specification: *EXT-X-ENDLIST: Indicates that no more media files will be added to the playlist file*. This tells the player **this stream is over** and no more media is coming. Your player should emit an `ended` event, or something equivalent.
2. The live stream will transition from `active` back to `idle`
3. Mux will create a new asset. The `active_asset_id` while the live stream was active will be finalized. If the same live stream goes live *again* at a later time, then the live stream will get a new `active_asset_id` and a new asset will be created.
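As a sketch of point 1 above, a client can tell the stream is over by looking for the `EXT-X-ENDLIST` tag in the fetched media playlist (most HLS players do this for you and emit an `ended` event):

```javascript
// Sketch: check a fetched HLS media playlist for the end-of-stream tag.
function isStreamEnded(playlistText) {
  return playlistText
    .split("\n")
    .some((line) => line.trim() === "#EXT-X-ENDLIST");
}
```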


# Stream simulated live
Appear to be broadcasting live, but use a pre-recorded video.
<Callout type="warning" title="Other terms for Simulated Live">
  * Pre-Recorded Live
  * Scheduled Live
  * Playout Service
  * Simulated Live from VOD
  * Pseudo-live
  * Live Linear Channel
</Callout>

You may have a pre-recorded video and want to use Mux to broadcast it as if it were live.

For now, Mux does not support Simulated Live streaming directly as a feature. As a work-around, this guide provides a few options to implement your own Simulated Live streaming solution.

Simulated Live streaming is a common strategy to ensure reliability. For example, if your platform has groups of users watching content simultaneously, you will want to employ one of the following strategies.

<Callout type="info" title="Before reading on...">
  You should be familiar with how live streaming works:

  1. Create a live stream with the Mux API
  2. Get the unique stream key for that live stream
  3. Put the server URL and stream key into an encoder (OBS, Wirecast, etc.)
  4. Use the Playback ID to view your live stream in any player that supports HLS
</Callout>
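As a hedged sketch of steps 1 and 2, here is how you might assemble the create-live-stream request and read back the stream key. The credentials are placeholders, and the network call is commented out so nothing is actually sent:

```javascript
// Sketch: create a live stream and pull out the stream key.
// MUX_TOKEN_ID / MUX_TOKEN_SECRET are placeholders for your credentials.
const auth = Buffer.from("MUX_TOKEN_ID:MUX_TOKEN_SECRET").toString("base64");

const createBody = {
  playback_policy: ["public"],
  new_asset_settings: { playback_policy: ["public"] },
};

// const res = await fetch("https://api.mux.com/video/v1/live-streams", {
//   method: "POST",
//   headers: { Authorization: `Basic ${auth}`, "Content-Type": "application/json" },
//   body: JSON.stringify(createBody),
// });
// const { data } = await res.json();
// data.stream_key goes into the encoder (step 3);
// data.playback_ids[0].id goes into your HLS player (step 4).
```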

## Option 1: Use a 3rd party service to send an RTMP stream

The most straightforward and reliable option we recommend is to use a third party service built for Simulated Live streaming. The service will allow you to upload videos and send out an RTMP stream at a scheduled time.

Upload your video to the service, enter the Mux RTMP ingest server details, and schedule the time you want it to "go live".

For example, [restream.io](https://restream.io/) offers [this guide](https://support.restream.io/en/articles/2715850-getting-started-with-streaming-pre-recorded-videos) to get started with pre-recorded videos. Note that there is a cost associated with this option.

## Option 2: Build your own server to send an RTMP stream

The second option we recommend is to build your own server that is capable of uploading video and sending an RTMP stream to Mux.

To do this, run encoder software that can ingest a video file and send its output to a Mux RTMP ingest URL. Software you might use to build such a server includes [ffmpeg](https://ffmpeg.org/) and [GStreamer](https://gstreamer.freedesktop.org/).

If you are going the "roll your own" route, your program should:

* Handle network blips gracefully. Even if your server is running in a reliable cloud like AWS or Google, networking between commercial data centers may experience interruptions.

* Handle disconnects. In particular, `ffmpeg` does not have any built-in disconnect handling, so if you use that software you should make sure you have a solution to handle them.

* Hold up to rigorous testing. Test the program with different types of content and long running streams. Make sure what you built is reliable before you use it in production.

## Option 3: Use on-demand and simulate live in the UI

The final option is to skip the backend live streaming setup, use an on-demand video, and make it "appear live" in your UI. This is a work-around we have seen success with.

To simulate a Live Stream in the UI you could:

* Hide the timeline of the player so that users can't seek back and forth
* Have the client make requests to the server to check server-time and use the server-time to keep the playhead synced to the "current live" time
* Show a red dot that gives the impression to the user "this is live"
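A sketch of the playhead-syncing idea: given the server's current time and the scheduled start time, compute where the playhead should be and clamp it to the video's duration. The function name and inputs are illustrative:

```javascript
// Sketch: compute where the playhead should be for a "simulated live" viewer.
// serverNowMs comes from a time endpoint on your server; scheduledStartMs is
// when the broadcast is meant to begin.
function livePositionSeconds(serverNowMs, scheduledStartMs, durationSeconds) {
  const elapsed = (serverNowMs - scheduledStartMs) / 1000;
  // Before the start, hold at 0; after the end, hold at the final frame.
  return Math.min(Math.max(elapsed, 0), durationSeconds);
}
```

You would call this periodically and seek the player if its current time drifts too far from the computed position.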

## Provide your feedback

We'd love to hear what is working and what isn't working, so if you are using one of these solutions (or some other solution), please send your ideas.

If you are interested in Simulated Live streaming as a Mux feature, let us know about your use case and specific needs!


# Debug live stream issues
Learn how to debug live streams and identify the most commonly seen live stream issues with the Live Stream Input Health dashboard.
## Navigate to the Live Stream Input Health dashboard

The Live Stream Input Health dashboard is a real-time dashboard that provides visibility on how Mux receives your live stream from the encoder. When a sizable percentage of your viewers complain about their viewing experience or your configured Mux Data Alert fires, a good starting point for identifying the problem is understanding the live stream's health. The video below shows how to navigate to your Live Stream Input Health dashboard.

<Image src="/docs/images/navigate-to-live-stream-input-health.gif" width={1666} height={1088} alt="Navigate to live stream input health" />

## Healthy live stream

Let's first look at a healthy live stream in the dashboard.

<Image src="/docs/images/health-live-stream.png" width={2322} height={724} alt="Health live stream" />

A few key points to notice from this graph that indicate this is a healthy live stream:

* Mux is receiving consistent frames per second. Receiving inconsistent frames per second can introduce video stuttering and sometimes cause playback interruptions for all your viewers.
* A consistent non-zero audio bitrate is important for uninterrupted listening. A good encoder always produces a constant non-zero bitrate even when no one is speaking and no music is playing. A varying audio bitrate can result in a bad listening experience and is sometimes a good indicator of audio-video sync problems.
* Like audio, a consistent average video bitrate is equally important for a good viewing experience. A varying video bitrate does not necessarily cause a playback problem but can result in a bad viewing experience.
  * A low variance in the video bitrate typically means optimal network bandwidth availability and encoder hardware resource utilization.
  * A high variance in the video bitrate indicates that the encoder hardware cannot keep up with the encoding load. Try reducing the video bitrate and using a constant bitrate (CBR) for a more reliable live stream input. Alternatively, you can also switch to another encoder like [OBS, Wirecast, etc](/docs/guides/configure-broadcast-software#software-encoders).
  * An unstable/unreliable network bandwidth availability results in transient video bitrate drops, which can cause playback interruptions.

<Callout type="success">
  No actions required.
</Callout>

## Unhealthy live stream

Now let's look at a few examples of live stream issues and potential next steps for resolution.

## Example 1: High video bitrate variance

<Image src="/docs/images/unhealthy-live-stream-1.png" width={2322} height={722} alt="Unhealthy live stream high video bitrate variance" />

The constant frames per second and audio bitrate make this live stream look good at first, but the high variance in video bitrate and the drop in average video bitrate mid-stream can impact the viewer experience.

<Callout type="warning">
  # Use lower and constant video bitrate

  Configure your encoder to use a lower video bitrate and a constant video bitrate. Recommended encoder settings are [available here](/docs/guides/configure-broadcast-software#recommended-encoder-settings).
</Callout>

## Example 2: Intermittent loss

<Image src="/docs/images/unhealthy-live-stream-2.png" width={2322} height={722} alt="Unhealthy live stream intermittent loss" />

Mux is receiving mostly constant frames per second and audio/video bitrate. This indicates that when the encoder is connected, the stream is healthy. However, the small spikes, as well as the intermittent loss in receiving the live stream, indicate transient network bandwidth availability issues.

<Callout type="error">
  Try switching to a more reliable network and/or stop other network bandwidth consuming services for the duration of the live stream.
</Callout>

## Example 3: Spiky audio and video bitrate

<Image src="/docs/images/unhealthy-live-stream-3.png" width={2340} height={718} alt="Unhealthy live stream spiky audio and video bitrate" />

There is high variance in the received audio and video bitrate in this example. Because the connection never fully drops, the network is probably not the problem here. More likely, the encoder is unable to keep up at a fast enough pace to send consistent video and audio data. One cause of this is that the device running the encoder might be running out of available CPU.

<Callout type="error">
  Consider using any of these [recommended encoders](/docs/guides/configure-broadcast-software#software-encoders) for your live stream.
</Callout>

<Callout type="warning" title="Use lower and constant video bitrate">
  Configure your encoder to use a lower video bitrate and a constant video bitrate. Recommended encoder settings are [available here](/docs/guides/configure-broadcast-software#recommended-encoder-settings).
</Callout>

## Example 4: Spiky frame rate

<Image src="/docs/images/unhealthy-live-stream-4.png" width={2296} height={728} alt="Unhealthy live stream spiky frame rate" />

This is a good example of a very unhealthy live stream. There is high variance in the video bitrate and several instances of the frame rate dipping to nearly zero. The spiky video bitrate mid-stream indicates that the encoder is optimizing the video encoding based on the feed contents. This is not ideal for live streaming.

<Callout type="error">
  Try switching to a more reliable network and/or stop other network bandwidth consuming services for the duration of the live stream.
</Callout>

<Callout type="error" title="Use constant video bitrate">
  Configure your encoder to use a constant video bitrate. Recommended encoder settings are [available here](/docs/guides/configure-broadcast-software#recommended-encoder-settings).
</Callout>

## Integrate Live Stream Input Health data

<Callout type="info">
  Please note, this feature is only available to customers who have subscribed to it. [Contact our Sales team](https://www.mux.com/sales-contact) if you would like more information.
</Callout>

Live Stream Input Health data can be integrated with an Amazon Kinesis or Google Pub/Sub endpoint in your cloud account. Health and encoding metadata are sent to Kinesis or Pub/Sub as the events occur and are made available to retrieve from the stream at the same five-second interval as the Dashboard.

Each message is either a Live Stream input health update or a metadata update from the encoder. The data can be stored in your long-term storage for immediate display and historical reporting.

This method of access is most useful for customers who want to embed live stream health in a user-facing application feature or need to build an internal operational tool for stream reporting.

## Setting up a streaming export

Streaming exports can be configured in the **Streaming Exports** settings in your Mux dashboard. See the setup guide for your platform for more information on setting up an export:

* [Amazon Kinesis Data Streams](/docs/guides/export-amazon-kinesis-data-streams)
* [Google Cloud Pub/Sub](/docs/guides/export-google-cloud-pubsub)

## Message Format

Messages are formatted using Protobuf (proto2) encoding. Every message uses the `live_stream_input_health.v1.LiveStreamInputHealth` message type defined in the export Protobuf spec.

The protobuf definition for the Live Stream Input Health is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/live_stream_input_health/v1). Please subscribe to this repository for updates to the protobuf definition.

There are currently two types of updates, though new types may be added in the future. Each message contains one type of update:

* Encoder metadata sent by the RTMP encoder
* Stream Input Health data

The following are descriptions of the data provided by each type of update:

```javascript
RTMPMetadataEvent = {
  // Unless otherwise specified, all the data contained in `video_track` and `audio_track` is as
  // specified by the encoder (not as observed).
  "video_track": {                           // Video track, present for AV streams
      "width": 1280,                         // Width of the input video
      "data_rate": 4000,                     // Kbps data rate of the video
      "codec_id": "avc1",                    // Video codec
      "height": 720,                         // Height of the input video
      "frame_rate": 30                       // Number of frames per second
  },
  "audio_track": {                           // Audio track, present for AV and audio-only streams
      "sample_size": 16,                     // Bits per audio sample
      "sample_rate": 44100,                  // Sample rate
      "data_rate": 128,                      // Kbps data rate of the audio
      "codec_id": "mp4a",                    // Audio codec
      "channel_count": 1                     // Number of audio channels
  },
  "encoder": "ffmpeg",                     // The encoder used to transcode for the broadcast

  "live_stream_id": "uiwe7gZtIcuyYSCfjfpGjad02RPqN", // The Mux Live Stream Id for live stream
  "asset_id": "hfye6sBqRmR8MRJZaWYq602X1rB0"         // The Mux Asset Id for the asset where the input stream is stored
}
```

```javascript
HealthUpdateEvent = {
  "video_tracks": [                        // Video tracks, present for AV streams
    {
      "bytes_received": 3155737,           // Number of video bytes received during this interval
      "stream_start_ms": 4979091,          // Timestamp of the first video frame in this interval, as measured in milliseconds since start of the stream
      "stream_end_ms": 4985097,            // Timestamp of the last video frame in this interval, as measured in milliseconds since start of the stream
      "keyframes_received": 3,             // Number of keyframes that occurred during this interval
      "total_frames_received": 180         // Total number of video frames received during this interval
    }
  ],
  "audio_tracks": [                        // Audio tracks, present for AV and audio-only streams
    {
      "bytes_received": 94864              // Number of audio bytes received from the encoder during this interval
    }
  ],
  "caption_tracks": [                      // Caption tracks
    {
      "bytes_received": 12354,             // Number of captions bytes received from the encoder during this interval
      "channel_count": 1                   // Number of captions channels that received data during this interval
    }
  ],

  "measurement_start_ms": 1644313838000,       // Timestamp of the start of the interval in milliseconds since Unix epoch
  "measurement_end_ms": 1644313838000,         // Timestamp of the end of the interval in milliseconds since Unix epoch

  "live_stream_id": "uiwe7gZtIcuyYSCfjfpGjad02RPqN",   // The Mux Live Stream Id for live stream
  "asset_id": "hfye6sBqRmR8MRJZaWYq602X1rB0",           // The Mux Asset Id for the asset where the input stream is stored
  "asn": 25135,                            // The ASN number for the ingest IP address
  "asn_name": "VODAFONE_UK_ASN (AS2135)"   // The friendly name associated with the ASN number
}
```
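As an example of what you can derive from these updates, here is a small sketch that computes the average frame rate and video bitrate for a `HealthUpdateEvent` video track (field names as in the sample above):

```javascript
// Sketch: derive average fps and video bitrate (kbps) from a
// HealthUpdateEvent video track.
function videoTrackStats(track) {
  const seconds = (track.stream_end_ms - track.stream_start_ms) / 1000;
  return {
    fps: track.total_frames_received / seconds,
    kbps: (track.bytes_received * 8) / 1000 / seconds,
  };
}
```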

### Update Frequency

* Encoder metadata is sent when the RTMP stream connects to Mux. Some encoders also send metadata updates during the live stream.
* Live Stream Input Health updates occur every 5 seconds for each stream that is currently connected.


# Add your own live closed captions
Learn how to add your own closed captions to your live stream for accessibility.
## Why are closed captions important?

Closed captions are the visual display of the audio in a program. Closed captions make video more accessible to people who are deaf or hard of hearing, but the benefits go beyond accessibility. Closed captions empower your viewers to consume video content in whichever way is best for them, whether it be audio, text, or a combination.

## Supported live caption formats

There are many types of closed caption sources, and each streaming standard may use a different format for embedding captions on the output. Mux supports receiving closed captions embedded in the H.264 video stream using the CEA-608 standard for a single language.

CEA-608 stems from the analog era where closed captions data was carried directly in the transmission in a line of the video content that wasn’t displayed unless the decoder was told to look for it. These were often referred to as “Line 21” captions. CEA-608 is still the primary standard for transmitting closed captions within the same stream as audio/video content.

Most major live caption providers (e.g. AI-Media, EEG Falcon, 3Play, Verbit) will support the CEA-608 standard. Mux will translate the CEA-608 captions into WebVTT that will be delivered as part of the HLS stream/manifest, in a standard HLS-supported manner. We will continue to evaluate demand for supporting captions for multiple languages and other caption formats.

## Integrate your own closed captions

Add the `embedded_subtitles` array at time of stream creation or to an existing live stream. Closed captions are a type of subtitle. The resulting Asset's subtitle text track will have `closed_captions: true` set.

| Input Parameters | Type   | Description                                                                                                                                                                                    |
| ---------------- | ------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| name             | string | The name of the track containing a human-readable description. This value must be unique across all the text type and subtitles text type tracks. Defaults to `language_code` if not provided. |
| passthrough      | string | Arbitrary metadata set for the live closed caption track. Max 255 characters.                                                                                                                  |
| language\_code    | string | The language of the closed caption stream. Value must be BCP 47 compliant. Defaults to `en` if not provided.                                                                                   |
| language\_channel | string | CEA-608 caption channel to read caption data from. Possible values: "cc1"                                                                                                                      |

### Step 1A: Create a live stream in Mux

Create a live stream using the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Live Stream Creation API</ApiRefLink>. Let Mux know that closed captions will be embedded in the RTMP stream at time of live stream creation.

#### API Request

```json
POST /video/v1/live-streams

Request Body
{
  "playback_policy" : [
    "public"
  ],
  "embedded_subtitles" : [
    {
      "name": "English CC",
      "passthrough": "English closed captions",
      "language_code": "en-US",
      "language_channel" : "cc1"
    }
  ],
  "new_asset_settings" : {
    "playback_policy" : [
      "public"
    ]
  }
}
```

#### API Response

```json
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": ["public"]
    },
    "embedded_subtitles": [
      {
        "name": "English CC",
        "passthrough": "English closed captions",
        "language_code": "en-US",
        "language_channel": "cc1"
      }
    ],
    "id": "e00Ed01C9ws015d5SLU00ZsaUZzh5nYt02u",
    "created_at": "1624489336"
  }
}
```

### Step 1B: Configure live closed captions for an existing live stream

Use the Live Stream Closed Captions API to configure closed captions for an existing live stream. Live closed captions cannot be configured on an active live stream.

#### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/embedded-subtitles

Request Body
{
  "embedded_subtitles": [
    {
      "name": "en-US",
      "language_code": "en-US",
      "language_channel": "cc1"
    }
  ]
}
```

#### API Response

```json
Response
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": [
        "public"
      ]
    },
    "embedded_subtitles" : [
      {
        "name": "English",
        "language_code": "en-US",
        "language_channel": "cc1"
      }
    ],
    "id": "e00Ed01C9ws015d5SLU00ZsaUZzh5nYt02u",
    "created_at": "1624489336"
  }
}
```

### Step 2: Create an event with your preferred closed caption vendor

Log into your preferred closed caption provider account (e.g. AI-Media, 3Play, Verbit) and create an event that needs to be captioned. To create an event, you will need to provide the following inputs:

* Start date and time
* Language of audio to be captioned
* Destination Stream URL and Stream Key (Mux). The caption vendor will send video with captions encoded via the 608 standard to this destination.

| RTMP Server URL                     | Description                                                                                                          | Common Applications                                                                                                         |
| ----------------------------------- | -------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------- |
| rtmp://global-live.mux.com:5222/app | Mux's standard RTMP ingest URL. Compatible with the majority of streaming applications and services.                | Open Source RTMP SDKs, most [app-store streaming applications](https://mux.com/blog/guide-to-rtmp-broadcast-apps-for-ios/). |
| rtmps://global-live.mux.com:443/app | Mux's secure RTMPS ingest URL. Compatible with fewer streaming applications, but offers a higher level of security. | OBS, Wirecast, Streamaxia RTMP SDKs                                                                                         |

Mux's global RTMP or RTMPS ingest URLs will connect you to the closest ingest region. While these ingest URLs typically provide optimal performance, you can also select a specific region using our [regional ingest URLs](/docs/guides/configure-broadcast-software#available-ingest-urls).

Upon successful event creation, the closed caption provider will provide the following:

* Stream URL
* Stream Key

Learn more about:

* [How to setup live captions with AI-Media EEG Falcon](https://www.ai-media.tv/wp-content/uploads/Manual_Falcon-User-Guide-.pdf)
* [How to setup live captions with 3Play Media](https://support.3playmedia.com/hc/en-us/articles/360048839533-Live-Captions-Schedule-Live-Captions-for-an-RTMP-Stream)
* [How to setup live captions with Verbit](https://verbit-ai.zendesk.com/hc/en-us/articles/4403013880594-Verbit-s-RTMP-Solution-for-Livestreaming-Events)

### Step 3: Point your RTMP stream to your caption provider

Configure your video encoder with the Stream URL and Stream Key provided by the closed caption provider in Step 2.

### Step 4: Start your live stream

When the stream goes live, a new asset is created along with text tracks for the corresponding captions.

### Step 5: Monitor closed caption stream health

When your stream is live, visit the Live Health Dashboard to monitor closed caption stream health. The dashboard will show whether Mux is receiving closed captions. More details can be found at [Debug live stream issues](/docs/guides/debug-live-stream-issues).

## Update stream to not expect live captioning for future connections

Let Mux know not to expect closed captions when the live stream starts again by configuring your live stream to not have any captions. This request can only be made while the live stream is idle.

### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/embedded-subtitles

Request Body
{
  "embedded_subtitles" : []
}
```

## FAQ

### Are there any language restrictions?

Yes. The 608 standard only supports the following languages: English, Spanish, French, German, Dutch, Portuguese, and Italian. We currently support live closed captions for a single language only. We will evaluate supporting multiple languages based on customer feedback.

### Is the 608 standard supported by my closed caption vendor?

Caption vendors known to support 608 captions: 3Play, AI-Media EEG Falcon, Verbit

Caption vendors known not to support 608 captions: Rev.ai

### When can I edit my live closed caption configuration?

You can only edit your live caption configuration while the live stream is idle; you cannot make any changes while the live stream is active.

### Will formatting be preserved?

Mux will translate the CEA-608 captions into WebVTT. Though Mux attempts to preserve the caption formatting, some formatting may be lost.

### Do live captions work with audio-only streams?

No. If you have a use case for this, please let us know.

### How do I download my closed caption track?

```json
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt
```

More details can be found at [Advanced Playback features](/docs/guides/play-your-videos#advanced-playback-features).
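The URL template above can be wrapped in a small helper; the playback ID and track ID you pass in are placeholders for your own values:

```javascript
// Build the WebVTT download URL for a closed caption track.
// playbackId and trackId are placeholders for your own IDs.
function captionTrackUrl(playbackId, trackId) {
  return `https://stream.mux.com/${playbackId}/text/${trackId}.vtt`;
}
```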

### Does live closed captions work with low latency?

Not at this time. If you have a use case for this, please let us know.


# Add Auto-Generated Live Captions
In this guide you will learn how to add auto-generated live captions to your Mux live stream.
## Overview

Mux is excited to offer auto-generated live closed captions in English, French, German, Italian, Portuguese, and Spanish. Closed captions make video more accessible to people who are deaf or hard of hearing, but the benefits go beyond accessibility. Captions empower your viewers to consume video content in whichever way is best for them, whether it be audio, text, or a combination.

For auto-generated live closed captions, we use artificial intelligence based speech-to-text technology to generate the closed captions. Closed captions refer to the visual display of the audio in a program.

## 1. Is my content suitable for auto-generated live closed captions?

Non-technical content with clear audio and minimal background noise is most suitable for auto-generated live captions. Content with music, or with multiple speakers talking over each other, is not a good use case for auto-generated live captions.

<Callout type="info">Accuracy ranges for auto-generated live captions range from 70-95%.</Callout>

## 2. Increase accuracy of captions with transcription vocabulary

For all content, we recommend you provide transcription vocabulary of technical terms (e.g. CODEC) and proper nouns. By providing the transcription vocabulary beforehand, you can **increase the accuracy** of the closed captions.

The transcription vocabulary helps the speech-to-text engine transcribe terms that otherwise may not be part of its general library. Your use case may involve brand names or proper names that are not normally part of a language model's library (e.g. "Mux"). Or perhaps you have a term, say "Orchid", which is the brand name of a toy. The engine will recognize "orchid" as a flower, but you would want the word transcribed with proper capitalization in its context as a brand.

Please note that it can take up to 20 seconds for the transcription vocabulary to be applied to your live stream.

## 3. Create a new transcription vocabulary

You can create a new transcription vocabulary by making a `POST` request to the `/video/v1/transcription-vocabularies` endpoint and defining the input parameters. Each transcription vocabulary can have up to 1,000 phrases.

## Request Body Parameters

| Input parameters | Type     | Description                                                                                                                                           |
| ---------------- | -------- | ----------------------------------------------------------------------------------------------------------------------------------------------------- |
| name             | `string` | The human-readable description of the transcription vocabulary.                                                                                       |
| phrases          | `array`  | An array of phrases to populate the transcription vocabulary. A phrase can be one word or multiple words, usually describing a single object or concept. |

### API Request

```json
POST /video/v1/transcription-vocabularies
{
  "name": "TMI vocabulary",
  "phrases": ["Mux", "Demuxed", "The Mux Informational", "video.js", "codec", "rickroll"]
}
```

### API Response

```json
{
  "data": {
    "updated_at": "1656630612",
    "phrases": ["Mux", "Demuxed", "The Mux Informational", "video.js", "codec", "rickroll"],
    "name": "TMI vocabulary",
    "id": "4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR00cLKXFlc",
    "created_at": "1656630612"
  }
}
```
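As a sketch, the request above can be prepared in Node.js with HTTP Basic auth, where your Token ID and Token Secret form the credentials. `buildVocabularyRequest` is a hypothetical helper; sending it is then a single `fetch` call:

```javascript
// Sketch: build the request for creating a transcription vocabulary.
// Your Mux Token ID and Token Secret are the Basic auth credentials.
function buildVocabularyRequest(tokenId, tokenSecret, name, phrases) {
  const auth = Buffer.from(`${tokenId}:${tokenSecret}`).toString('base64');
  return {
    url: 'https://api.mux.com/video/v1/transcription-vocabularies',
    method: 'POST',
    headers: {
      Authorization: `Basic ${auth}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ name, phrases }),
  };
}

// Usage (MUX_TOKEN_ID / MUX_TOKEN_SECRET assumed in the environment):
// const req = buildVocabularyRequest(process.env.MUX_TOKEN_ID, process.env.MUX_TOKEN_SECRET,
//   'TMI vocabulary', ['Mux', 'Demuxed']);
// const res = await fetch(req.url, req);
```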

## 4. Enable auto-generated live closed captions

Add the `generated_subtitles` array at the time of stream creation, or add it to an existing live stream.

## Request Body Parameters

| Input parameters               | Type     | Description                                                                                                                                                                                                                 |
| ------------------------------ | -------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `name`                         | `string` | The human readable description for the generated subtitle track. This value must be unique across all the text type and subtitles text type tracks. If not provided, the name is generated from the chosen `language_code`. |
| `passthrough`                  | `string` | Arbitrary metadata set for the generated subtitle track.                                                                                                                                                                    |
| `language_code`                | `string` | BCP-47 language tag for captions. Defaults to `"en"`.                                                                       |
| `transcription_vocabulary_ids` | `array`  | The IDs of existing Transcription Vocabularies that you want to be applied to the live stream. If the vocabularies together contain more than 1,000 unique phrases, only the first 1,000 will be used.                      |

Mux supports the following languages and corresponding language codes for live generated captions.

| Language | Language Code |
| :-- | :-- |
| English | `"en"` |
| Spanish | `"es"` |
| Italian | `"it"` |
| Portuguese | `"pt"` |
| German | `"de"` |
| French | `"fr"` |

<Callout type="info">Locale codes such as `"en-US"` or `"es-MX"` are accepted and will be parsed down to their language code (e.g., `"en-US"` → `"en"`).</Callout>
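The parsing described above can be sketched as a small client-side helper (this is a convenience function for illustration, not part of the Mux API):

```javascript
// Supported language codes for live generated captions.
const SUPPORTED = ['en', 'es', 'it', 'pt', 'de', 'fr'];

// Reduce a BCP-47 tag like "en-US" to its bare language code,
// mirroring how the API parses locale codes down. Returns null
// for languages that live generated captions do not support.
function toLanguageCode(tag) {
  const code = tag.toLowerCase().split('-')[0];
  return SUPPORTED.includes(code) ? code : null;
}
```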

### Step 1A: Create a live stream in Mux

Create a live stream using the <ApiRefLink href="/docs/api-reference/video/live-streams/create-live-stream">Live Stream Creation API</ApiRefLink>. Let Mux know that you want auto-generated live closed captions.

### API Request

```json
POST /video/v1/live-streams

Request Body
{
  "playback_policy" : ["public"],
  "generated_subtitles": [
    {
      "name": "English CC (auto)",
      "passthrough": "English closed captions (auto-generated)",
      "language_code": "en",
      "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
    }
  ],
  "new_asset_settings" : {
    "playback_policy" : ["public"]
  }
}
```

### API Response

```json
Response
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": [
        "public"
      ]
    },
    "generated_subtitles" : [
      {
        "name": "English CC (auto)",
        "passthrough": "English closed captions (auto-generated)",
        "language_code": "en",
        "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
      }
    ],
    "id": "e00Ed01C9ws015d5SLU00ZsaUZzh5nYt02u",
    "created_at": "1624489336"
  }
}

```

### Step 1B: Configure live captions for an existing live stream

Use the Generated Subtitles API to configure auto-generated closed captions for an existing live stream. Live closed captions cannot be configured on an active live stream.

### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/generated-subtitles

Request Body
{
  "generated_subtitles": [
    {
      "name": "English CC (auto)",
      "passthrough": "{\"description\": \"English closed captions (auto-generated)\"}",
      "language_code": "en",
      "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
    }
  ]
}
```

### API Response

```json
Response
{
  "data": {
    "stream_key": "5bd28537-7491-7ffa-050b-bbb506401234",
    "playback_ids": [
      {
        "policy": "public",
        "id": "U00gVu02hfLPdaGnlG1dFZ00ZkBUm2m0"
      }
    ],
    "new_asset_settings": {
      "playback_policies": [
        "public"
      ]
    },
    "generated_subtitles": [
      {
        "name": "English CC (auto)",
        "passthrough": "{\"description\": \"English closed captions (auto-generated)\"}",
        "language_code": "en",
        "transcription_vocabulary_ids": ["4uCfJqluoYxl8KjXxNF00TgB56OyM152B5ZR"]
      }
    ]
  }
}
```

### Step 2: Start your live stream

* At the start of the Live Stream, two text tracks will be created for the active asset, with `text_source` attributes of `generated_live` and `generated_live_final`, respectively.

* While the stream is live, the `generated_live` track will be available and include predicted text for the audio.

* At the end of the stream, the `generated_live_final` track will transition from the `preparing` to the `ready` state; this track includes finalized text predictions, resulting in higher-accuracy, better-timed text.

* After the live event has concluded, the playback experience of the asset created will only include the more accurate `generated_live_final` track, but the sidecar VTT files for both tracks will continue to exist.

## 5. Update stream to not auto-generate closed captions for future connections

To prevent future connections to your live stream from receiving auto-generated closed captions, update the `generated_subtitles` configuration to `null` or an empty array.

### API Request

```json
PUT /video/v1/live-streams/{live_stream_id}/generated-subtitles

Request Body
{
  "generated_subtitles" : []
}
```

## 6. Manage and update your transcription vocabulary

### Update phrases in a transcription vocabulary

Phrases can be updated at any time, but the updates won't take effect for an active live stream that already has auto-generated live closed captions enabled with that transcription vocabulary applied. Instead, the updates will be applied the next time the stream goes active.

### API Request

```json
PUT /video/v1/transcription-vocabularies/$ID
{
  "phrases": ["Demuxed", "HLS.js"]
}
```

## FAQs

### What happens if my live stream has participants speaking languages other than the caption stream I've chosen?

If you've added an auto-generated English caption stream and your audio contains a non-English language, we will attempt to auto-generate captions for all the content in English. For example, if French and English are spoken, we will create captions for the French-language content using the English model, and the output will be incomprehensible.

### When can I edit my live caption configuration?

Only when the live stream is idle. You cannot make any changes while the live stream is active.

### How do I download my auto-generated closed caption track?

```json
https://stream.mux.com/{PLAYBACK_ID}/text/{TRACK_ID}.vtt
```

More details can be found at [Advanced Playback features](/docs/guides/play-your-videos#advanced-playback-features)

### Do live captions work with low latency live streams?

Not at this time.


# Live Streaming FAQs
Answers to common questions relating to Live Streaming.
## What is Mux’s latency for live streams?

For a standard live stream, latency is expected to be greater than 20 seconds and typically about 25 - 30 seconds. We offer a
[reduced latency](https://mux.com/blog/reduced-latency-for-mux-live-streaming-now-available/) mode which will reduce latency to about 12 - 20 seconds.
A low-latency live stream can go as low as 5 seconds of glass-to-glass latency but the latency can vary depending on your viewer's geographical
location and internet connectivity.

## Does Mux support WebRTC for live streaming ingest?

We currently do not support direct WebRTC ingest to a Live Stream.

We have focused on RTMP and RTMPS for live streams from an encoder as they are the most universal ingest protocols.

## How can I go live from a browser?

To [go live directly from a browser](https://mux.com/blog/the-state-of-going-live-from-a-browser/), you need to convert the browser stream into a format that can be consumed by RTMP on our end.

For example, we’ve had customers use Zoom to provide the video from a browser, and use its RTMP-out feature to broadcast a stream with Mux for streaming to a larger [conference-like audience](https://mux.com/blog/how-to-host-your-own-online-conference/).

## Is Mux Live Streaming API suitable for 2-way video communication applications?

Mux's Live Streaming API is not intended to provide 2-way video communication. For use cases that require 2-way video communication, we'd suggest looking at one of our partners, [LiveKit](https://livekit.io/).

## Is it possible to rewind live content while the live stream continues?

**DVR (Digital Video Recorder) mode** is a live stream feature that lets viewers rewind while the stream continues. Mux supports DVR and non-DVR modes for live streams.

**Non-DVR mode** is enabled by default for live streams; viewers only have access to the most recent 30 seconds of the live stream.

**DVR mode** is possible by utilizing the live stream's `active_asset_id`. When constructing the playback URL, the `playback_id` for the associated `active_asset_id` is used. When the live video ends, the <ApiRefLink href="/docs/api-reference/video/playback-id">Playback ID</ApiRefLink> associated with the `active_asset_id` will automatically transition to an on-demand asset for playback instead.
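As a sketch, DVR-style playback comes down to using a playback ID from the active asset (fetched via `GET /video/v1/assets/{active_asset_id}`) in the standard HLS URL. `dvrPlaybackUrl` here is a hypothetical helper operating on that asset response:

```javascript
// Given an asset object fetched from GET /video/v1/assets/{active_asset_id},
// build the DVR-capable playback URL from its public playback ID.
function dvrPlaybackUrl(asset) {
  const pub = asset.playback_ids.find((p) => p.policy === 'public');
  if (!pub) throw new Error('no public playback ID on asset');
  return `https://stream.mux.com/${pub.id}.m3u8`;
}
```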

For more information and caveats behind these two modes, refer to our [Stream recordings of live streams](/docs/guides/stream-recordings-of-live-streams) guide.

## What is the maximum live stream duration?

Currently, we have a 12 hour limit for continuous streaming to our live endpoints. The live stream is disconnected after 12 hours.

If the encoder reconnects, Mux will transition to a new asset with its own playback ID.

## Do I need to create stream keys for every live event?

No, stream keys can be re-used as many times as you want. It's common for applications to assign one stream key to each user (broadcaster) in their system and allow that user to re-use the same stream key over time.

## Is there a limit to creating stream keys and live streams?

There is no limit on how many stream keys and live streams you can create.

Once created, stream keys are persistent and can be used for any number of live events.

## Do you charge for creating stream keys?

We don’t charge for creating stream keys, only when sending us an active RTMP feed.

## Can I live stream a pre-recorded video?

Mux does not support generating simulated live from on-demand assets. Such a service is also called a "Playout service".

However, you can run a simulated live stream using a tool like [OBS](https://obsproject.com/) or Wirecast to send your on-demand asset to us as an RTMP stream. See how to configure your RTMP encoder on our [Configuring Broadcast Software docs page](/docs/guides/configure-broadcast-software).

For a more comprehensive guide and common options we recommend for work-arounds, see this guide of how to [Stream simulated live](/docs/guides/stream-simulated-live).

## Can I restream/simulcast my live stream to social platforms like Facebook?

Yes. Mux Video's live service supports up to six simultaneous restreams to third-party platforms that accept an RTMP feed.

Read more in this blog post: [Help Your Users be in 5 Places at Once: Your Guide to Simulcasting](https://mux.com/blog/help-your-users-be-in-5-places-at-once-your-guide-to-simulcasting/).

## Is my content saved after the live broadcast is over?

Yes. Mux will automatically create an <ApiRefLink href="/docs/api-reference/video/assets">on-demand (VOD) asset</ApiRefLink> when your live stream ends, which can be streamed again instantly.

## Can I get access to my live event's recording?

Yes, you can enable downloading of the entire event recording using the [Master access](/docs/guides/download-for-offline-editing) feature.

With Master access enabled, you will receive a Webhook notification after the live stream ends, indicating that the master copy of the video asset is available to download.

## Can I generate thumbnails/GIFs while the live stream is active?

Yes. You can use our [thumbnail and animated GIF API](/docs/guides/get-images-from-a-video) while the live event is active.

Many customers use thumbnails or GIFs to show what content is currently playing or as a way to promote the live stream.

## Can I test Mux live streaming for free?

On any paid plan, you can create [free test live streams](https://mux.com/blog/new-test-mux-video-features-for-free/) to help evaluate the Mux Video APIs without incurring any cost.

We give you access to create an unlimited number of test live streams. Test live streams are watermarked with the Mux logo, limited to 5 minutes, and disabled after 24 hours.

## Can I add multiple audio channels or tracks to my live stream?

No, we currently support only one audio track for live streams. On-demand video assets do support [multiple alternative audio tracks](/docs/guides/add-alternate-audio-tracks-to-your-videos).

You may want your users to be able to select a language in the player and view the same video content with different audio. One workaround: first, ingest multiple streams, one per language; then add logic to the player to switch between the different playback URLs when the user changes the language.
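That workaround can be sketched as a map of language to playback URL, swapping the player's source when the viewer switches (the playback IDs here are hypothetical placeholders):

```javascript
// One live stream ingested per language; playback IDs are placeholders.
const streamsByLanguage = {
  en: 'https://stream.mux.com/ENGLISH_PLAYBACK_ID.m3u8',
  es: 'https://stream.mux.com/SPANISH_PLAYBACK_ID.m3u8',
};

// Swap the player's source when the viewer picks a language.
function switchLanguage(videoEl, language) {
  const src = streamsByLanguage[language];
  if (!src) throw new Error(`no stream for language: ${language}`);
  videoEl.src = src;
  return src;
}
```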

## What happens if I live stream variable frame rate (VFR) content?

While Mux does not output variable frame rate (VFR) content for live streams, we will accept variable frame rate (VFR) content for ingest. Having said that, we recommend using constant frame rate (CFR) content for live streams to ensure the best playback experience.


# Monitor HTML5 video element
This guide walks through integration with any HTML5 video player to collect video performance metrics with Mux data. Use this if Mux does not have an SDK specific for your player.
## Features

The following data can be collected by the Mux Data SDK when you use the `mux-embed` SDK, as described below.

* Engagement metrics
* Quality of Experience metrics
* Web metrics such as Player Startup Time, Page Load Time, etc.
* Available for deployment from a package manager
* Custom Dimensions
* Custom Beacon Domain

Note: Live Latency is available for Native Safari HLS.

## 1. Install mux-embed

Include the Mux JavaScript SDK on every page of your web app that includes video. You can use the Mux-hosted version of the script or install via npm. `mux-embed` follows [semantic versioning](https://semver.org/) and the API will not change between major releases.

If possible, use the SDK for your particular player (e.g. Video.js, JW Player, etc.). While the HTML5 SDK works with any modern HTML5 video player, the player-specific Mux SDK is preferable because it offers a deeper integration and in most cases collects more pieces of data. If you don't see your player listed then use `mux-embed` and let us know so we can prioritize creating an SDK for the player that you are using.

```cdn

<script src="https://src.litix.io/core/4/mux.js"></script>

```

```npm

npm install --save mux-embed

```

```yarn

yarn add mux-embed

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```html

<script>
  if (typeof mux !== 'undefined') {
    window.muxPlayerInitTime = mux.utils.now();
  }
</script>

<video
  id="my-player"
  src="https://muxed.s3.amazonaws.com/leds.mp4"
  controls
  width="960"
  height="400"
></video>

<script>
  // Initialize Mux Data monitoring by passing in the "id" attribute of your video player
  if (typeof mux !== 'undefined') {
    mux.monitor('#my-player', {
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata fields
        player_name: 'Main Player', // any arbitrary string you want to use to identify this player
        player_init_time: window.muxPlayerInitTime // ex: 1451606400000
        // ...
      }
    });
  }
</script>

```

```javascript

import mux from 'mux-embed';

const playerInitTime = mux.utils.now();

// Initialize Mux Data monitoring by passing in the "id" attribute of your video player
mux.monitor('#my-player', {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata fields
    player_name: 'Main Player', // any arbitrary string you want to use to identify this player
    player_init_time: playerInitTime,
    // ...
  }
});

```

```react

import mux from 'mux-embed';
import React, { useEffect, useRef } from 'react';

export default function VideoPlayer () {
  const videoRef = useRef(null);

  useEffect(() => {
    if (videoRef.current) {
      const initTime = mux.utils.now();

      mux.monitor(videoRef.current, {
        debug: false,
        data: {
          env_key: 'ENV_KEY', // required
          // Metadata fields
          player_name: 'Main Player', // any arbitrary string you want to use to identify this player
          player_init_time: initTime,
          // ...
        }
      });
    }
  }, [videoRef]);

  return (
    <video
      controls
      ref={videoRef}
      src="https://muxed.s3.amazonaws.com/leds.mp4"
      style={{ width: '100%', maxWidth: '500px' }}
    />
  );
}

```



Call `mux.monitor`, passing in a valid CSS selector or the video element itself, followed by the SDK options and metadata. If you use a CSS selector that matches multiple elements, the first matching element in the document will be used.

Log in to the Mux dashboard and find the environment that corresponds to your `env_key` and look for video views. It takes about a minute or two from tracking a view for it to show up on the Metrics tab.

**If you aren't seeing data**, check to see if you have an ad blocker, tracking blocker or some kind of network firewall that prevents your player from sending requests to Mux Data servers.

## 3. Make your data actionable

The only required field in the `options` that you pass into `mux-embed` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key when calling `mux.monitor`.

```js
mux.monitor('#my-player', {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required

    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'

    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000

    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `monitor`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
const myPlayer = document.querySelector('#my-player');
myPlayer.src = 'https://muxed.s3.amazonaws.com/leds.mp4';

mux.emit('#my-player', 'videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
mux.emit('#my-player', 'programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
mux.monitor('#my-player', {
  debug: false,
  disableCookies: true,
  data: {
    env_key: 'ENV_KEY',
    // ... rest of metadata
  }
});
```

### Override 'Do Not Track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
mux.monitor('#my-player', {
  debug: false,
  respectDoNotTrack: true, // Disable tracking of browsers where Do Not Track is enabled
  data: {
    env_key: 'EXAMPLE_ENV_KEY',
    // ... rest of metadata
  }
});
```

### Customize error tracking behavior

By default, `mux-embed` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the mux monitor.

```js
mux.emit('#my-player', 'error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.
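For instance, the grouping above can be sketched with a hypothetical classifier that maps two network failure modes to one shared code with distinct messages:

```javascript
// Hypothetical error classifier: both network conditions share code 100
// with different messages, while other errors get their own code,
// per the grouping guidance above.
function classifyPlayerError(err) {
  if (err.type === 'network-timeout') {
    return { player_error_code: 100, player_error_message: 'Network error: request timed out' };
  }
  if (err.type === 'network-offline') {
    return { player_error_code: 100, player_error_message: 'Network error: connection lost' };
  }
  return { player_error_code: 101, player_error_message: 'Unknown playback error' };
}

// Usage: mux.emit('#my-player', 'error', classifyPlayerError(caughtError));
```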

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*

### Error translator

If your player emits error events that are not fatal to playback, or its errors are unclear or lack helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

mux.monitor('#my-player', {
  debug: false,
  errorTranslator: errorTranslator,
  data: {
    env_key: 'ENV_KEY', // required

    // ... additional metadata
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
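For example, a translator that drops anything your player flags as a warning (the `isWarning` property here is hypothetical) while passing other errors through unchanged might look like this:

```javascript
// Sketch: suppress non-fatal errors by returning false from the translator.
// `isWarning` is a hypothetical flag your player might attach to its errors.
function ignoreWarningsTranslator(error) {
  if (error.isWarning) {
    return false; // returning false means the error is not tracked
  }
  return error; // fatal errors are tracked unchanged
}
```

Pass this function as the `errorTranslator` option when calling `mux.monitor`, as in the example above.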

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
mux.monitor('#my-player', {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: 'ENV_KEY', // required

    // ... additional metadata
  }
});
```

### Use TypeScript with mux-embed  <BetaTag />

TypeScript support for mux-embed is currently in beta, so you'll need to take a couple extra steps in order to use it.

Use TypeScript's [triple slash `<reference path="..."/>` directive](https://www.typescriptlang.org/docs/handbook/triple-slash-directives.html#-reference-path-). At the top of your `.ts` or `.tsx` file where you want to use the types, add a line that looks like this:

```ts
/// <reference path="../../node_modules/mux-embed/dist/types/mux-embed.d.ts"/>
```

Note that the triple-slash directive requires passing in the relative path from your `.ts` or `.tsx` file to the source `.d.ts` file in `node_modules/`.

Also, if you have a linting rule that prevents you from using the triple-slash directive, you can disable it with an eslint-disable comment:

```ts
// eslint-disable-next-line @typescript-eslint/triple-slash-reference
```

Here's an example directory structure and component file:

```sh filename="Directory Structure"
├── node_modules/
│   └── mux-embed/
│       └── dist/
│           └── types/
│               └── mux-embed.d.ts
└── src/
    └── video-component/
        └── video-component.ts
```

```ts filename="video-component.ts"
// NOTE: You may also need to disable linter rules, such as this example for @typescript-eslint
// eslint-disable-next-line @typescript-eslint/triple-slash-reference
/// <reference path="../../node_modules/mux-embed/dist/types/mux-embed.d.ts"/>
import mux from 'mux-embed';

// ...

let videoEl: HTMLVideoElement | undefined;

// This should now be type valid, too!
videoEl?.mux.destroy();
```

This opt-in approach is temporary while we're in beta with TypeScript support. If you run into any issues with the types, please let us know so we can improve them.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
mux.monitor('#my-player', {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: 'ENV_KEY', // required
    // ... additional metadata
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v5.17.1

* fix issue where playing time might accumulate for paused players

### Previous releases

#### v5.17.0

* add compatibility for dash.js 5

#### v5.16.1

* Update parsing of initial value for player\_playback\_mode

#### v5.16.0

* Add Playback Range Tracker for new engagement metrics

#### v5.15.0

* Automatically detect playback mode changes for HTML 5 Video

#### v5.14.0

* Emit a renditionchange event at the start of views to enable updated rendition tracking.

#### v5.13.0

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * New playbackmodechange event
  * Two new metrics, ad\_playing\_time\_ms\_cumulative and view\_playing\_time\_ms\_cumulative, to track playing time by wall clock time

#### v5.12.0

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.

#### v5.11.0

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time

#### v5.10.0

* Adds support for cdnchange events

#### v5.9.1

* Submit Aggregate Startup Time when autoplay is set

#### v5.9.0

* Improve scaling calculation accuracy by using more events for tracking

#### v5.8.3

* add custom 11 through 20 to types

#### v5.8.2

* remove duplicate video\_source\_mime\_type from types

#### v5.8.1

* fix typo in types for viewer\_plan

#### v5.8.0

* Add support for video\_creator\_id

#### v5.7.0

* Add keys for new customer-defined dimensions

#### v5.6.0

* Fix issue where firefox did not send beacons, and some final beacons might not be sent

#### v5.5.0

* Update mechanism for generating unique IDs, used for `view_id` and others
* Use crypto.randomUUID(), when available, for generating UUID values

#### v5.4.3

* \[chore] internal build process fix (no functional changes)

#### v5.4.2

* feat(google-ima): Beta implementation of google-ima extension to mux-embed
* feat(mux-embed): Add methods for post-initialization overrides of functionality (for internal use only).
* fix(mux-embed): typecheck for dashjs.getSource is incorrect.

#### v5.4.1

* Expose `updateData` globally and fix types
* Fix an issue where views were not ended cleanly on long resume detection

#### v5.4.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

#### v5.3.3

* expose HEARTBEAT and DESTROY under mux.events

#### v5.3.2

* Fix type issues for error severity and business exception

#### v5.3.1

* fix(mux-embed): Remove 3rd party dependencies and replace with appropriately equivalent functionality.

#### v5.3.0

* Ignore request events when emitting heartbeat events
* Fix an issue where video quality metrics may not be calculated correctly on some devices

#### v5.2.1

* Send hb events regardless of errors

#### v5.2.0

* Bug fix to not de-dupe error event metadata
* Extend `errorTranslator` to work with `player_error_severity` and `player_error_business_exception`

#### v5.1.0

* Target ES5 for bundles and validate bundles are ES5

* fix an issue where seeking time before first play attempt counted towards video startup time

#### v5.0.0

* Add opt-in TypeScript Types to Mux Embed and use + refactor for other dependent data SDKs. Update published dists to include CJS and ESM.
* Mux Embed now provides (opt in) TypeScript types in its published package, as well as publishes CJS and ESM versions of the package.
* This allows us to provide a lower risk and iterative roll out of official TypeScript types for `mux-embed`. The export types updates were required to ensure actual matches between the dist package and corresponding TypeScript types.
* This *should* have no direct impact on users, though different build tools will now potentially select one of the new export types (e.g. the ESM "flavor" of `mux-embed`). TypeScript types *should not* be applied unless they are explicitly referenced in app (discussed in docs updates).

#### v4.30.0

* fix an issue causing certain network metrics to not be available for dashjs v4.x

* fix an issue where certain IDs used may cause a DOM exception to be raised

#### v4.29.0

* fix(mux-embed): avoid using element id for muxId. attach muxId to element.

#### v4.28.1

* fix an issue where beaconDomain deprecation line was incorrectly logged

#### v4.28.0

* Deprecate `beaconDomain` in favor of `beaconCollectionDomain`. The `beaconDomain` setting will continue to function, but integrations should change to `beaconCollectionDomain` instead.

#### v4.27.0

* Fix an issue where playback time was incorrectly counted during seeking and other startup activities
* Add events for the collection of ad clicks
* fix an issue where seek latency could be unexpectedly large
* fix an issue where seek latency does not include time at end of a view
* Add events for the collection of ad skips

#### v4.26.0

* muxData cookie expiration should be one year

#### v4.25.1

* Do not deduplicate ad IDs in ad events

#### v4.25.0

* Include ad watch time in playback time

#### v4.24.0

* Fix an issue where beacons over a certain size could get hung and not be sent

#### v4.23.0

* Collect Request Id from the response headers, when available, for HLS.js (`requestcompleted` and `requestfailed`) and Dash.js (`requestcompleted`). The following headers are collected: `x-request-Id`, `cf-ray` (Cloudflare), `x-amz-cf-id` (CloudFront), `x-akamai-request-id` (Akamai)

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update Headers type

#### v4.22.0

* Send errors, `requestfailed`, and `requestcancelled` events on Dash.js. Because of this change, you may see the number of playback failures increase as we now automatically track additional fatal errors.

#### v4.21.0

* Include Ad metadata in ad events

#### v4.20.0

* Support for new dimension, `view_has_ad`

#### v4.19.0

* End views after 5 minutes of rebuffering

#### v4.18.0

* Add audio, subtitle, and encryption key request failures for HLS.js
* Capture ad metadata for Video.js IMA
* Capture detailed information from HLS.js for fatal errors in the Error Context

#### v4.17.0

* Extend `errorTranslator` to work with `player_error_context`

#### v4.16.0

* Add new `renditionchange` fields to Shaka SDK
* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields
* Add frame drops to Shaka SDK
* Add new `renditionchange` info to Web SDKs
* Adds the new Media Collection Enhancement fields

#### v4.15.0

* update `mux.utils.now` to use `navigationStart` for timing reference

* fix issue where views after `videochange` might incorrectly accumulate rebuffering duration

* Resolved issue sending beacons when view is ended

* Record `request_url` and `request_id` with network events

#### v4.14.0

* Tracking FPS changes if specified in Manifest

#### v4.13.4

* Resolved issue sending beacons when paused

#### v4.13.3

* Fixed issue with monitoring network events for hls.js monitor

#### v4.13.2

* Fix an issue with sending unnecessary heartbeat events on the window `visibilitychange` event

#### v4.13.1

* Fixes an issue with accessing the global object

#### v4.13.0

* Collect the `x-request-id` header from segment responses to make it easier to correlate client requests to other logs

* Upgraded internal webpack version

* Flush events on window `visibilitychange` event

#### v4.12.1

* Use Fetch API for sending beacons

#### v4.12.0

* Generate a new unique view if the player monitor has not received any events for over an hour.

#### v4.11.0

* Detect fullscreen and player language

#### v4.10.0

* Replace query string dependency to reduce package size
* Remove `ImageBeacon` fallback, removing support for IE9

#### v4.9.4

* Generate all `view_id`'s internally

#### v4.9.3

* Use common function for generating short IDs

#### v4.9.2

* Fixed an issue around the `disablePlayheadRebufferTracking` option

#### v4.9.1

* Fix issue where `getStartDate` does not always return a date object

#### v4.9.0

* Support PDT and player\_live\_edge\_program\_time for Native Safari

* Set a max payload size in mux-embed

#### v4.8.0

* Add option `disablePlayheadRebufferTracking` to allow players to disable automatic rebuffering metrics.
  Players can emit their own `rebufferstart` or `rebufferend` events and track rebuffering metrics.

* Fix an issue with removing `player_error_code` and `player_error_message` when the error code is `1`.
  Also stops emitting `MEDIA_ERR_ABORTED` as errors.

* Now leaving Player Software Version for HTML5 Video Element unset rather than "No Versions" as it is no longer needed.

#### v4.7.0

* Add an option to specify beaconCollectionDomain for Data custom domains

#### v4.6.2

* Fix an issue with emitting heartbeat events while the player is not playing

#### v4.6.1

* Fix an issue with removing event listeners from window after the player monitor destroy event

#### v4.6.0

* Update hls.js monitor to record session data with fields prefixed as `io.litix.data.`
* Update the manifest parser to parse HLS session data tags

#### v4.5.0

* Add short codes to support internal video experiments
* Collect request header prefixed with `x-litix-*`
* Capture fatal hls.js errors
* Make `envKey` an optional parameter

#### v4.4.4

* Add a player events enum on the `mux` object (e.g. `mux.events.PLAY`)
* Use the browser `visibilitychange` listener instead of `unload` to handle destroying the player monitor.

#### v4.4.3

* Fix: Specify `video_source_is_live` for HLS.js monitor

#### v4.4.2

* Group events into 10 second batches before sending a beacon

#### v4.4.1

* Exclude latency metrics from beacons if `video_source_is_live` is not `true`

#### v4.4.0

* Add a lightweight HLS manifest parser to capture latency metrics for players that don't expose an API for accessing the manifest.
* Allow players to emit `player_program_time` instead of calculating internally

#### v4.3.0

* Add support for calculating latency metrics when streaming using HLS

#### v4.2.5

* Remove default `video_id` when not specified by the developer.

#### v4.2.4

* Add minified keys for latency metrics

#### v4.2.3

* Add minified keys for new program time metrics

#### v4.2.2

* Fix bug causing missing bitrate metrics using HLS.js {'>'}v1.0.0

#### v4.2.1

* (video element monitor) Fix an issue where some non-fatal errors thrown by the video were tracked as playback failures

#### v4.2.0

* Fix an issue where views triggered by `programchange` may not report metrics correctly
* Fix an issue where calling `el.mux.destroy()` multiple times in a row raised an exception

#### v4.1.1

* Fix an issue where `player_remote_played` wasn't functioning correctly

#### v4.1.0

* Add support for custom dimensions

#### v4.0.1

* Support HLS.js v1.0.0

#### v4.0.0

* Enable sending optional ad quartile events through.
* Move device detection server-side, improving data accuracy and reducing client SDK size.
* Fix an issue where jank may be experienced in some web applications when the SDK is loaded.

#### v3.4.0

* Setting to disable rebuffer tracking `disableRebufferTracking` that defaults to `false`.

#### v3.3.0

* Adds `viewer_connection_type` detection.

#### v3.2.0

* Adds support for `renditionchange`.

#### v3.1.0

* Add checks for window being undefined and expose a way for SDKs to pass in platform information. This work is necessary for compatibility with react-native-video.

#### v3.0.0

* Setting to disable Mux Data collection when Do Not Track is present now defaults to off
* Do not submit the source URL when a video is served using the data: protocol

#### v2.10.0

* Use Performance Timing API, when available, for view event timestamps

#### v2.9.1

* Fix an issue with server side rendering

#### v2.9.0

* Support for Dash.js v3

#### v2.8.0

* Submit Player Instance Id as a unique identifier

#### v2.7.3

* Fixed a bug where the source hostname was not properly collected when using `mux.monitor` with Hls.js or Dash.js.


# Monitor HLS.js
This guide walks through integration with [HLS.js](https://github.com/video-dev/hls.js) to collect video performance metrics with Mux data.
## Features

The following data can be collected by the Mux Data SDK when you use the HLS.js SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Available for deployment from a package manager
- Can infer CDN identification from response headers
- Custom Dimensions
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Customizable Error Tracking
- Custom Beacon Domain
- Extraction of HLS Session Data
- Live Stream Latency metric

```

Notes:

```md
Average Bitrate Metrics available in v3.2.0 and newer.
```

## 1. Install mux-embed

Include the Mux JavaScript SDK on every page of your web app that includes video. You can use the Mux-hosted version of the script or install via npm. `mux-embed` follows [semantic versioning](https://semver.org/) and the API will not change between major releases.

```cdn

<script src="https://src.litix.io/core/4/mux.js"></script>

```

```npm

npm install --save mux-embed

```

```yarn

yarn add mux-embed

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```html

<script>
  if (typeof mux !== 'undefined') {
    window.muxPlayerInitTime = mux.utils.now();
  }
</script>

<video
  id="my-player"
  controls
  width="960"
  height="400"
/>

<script>
  if (Hls.isSupported()) {
    let hls = new Hls();
    const videoEl = document.querySelector('#my-player');

    // we're using a Mux HLS URL in this example, but the Mux Data integration
    // with HLS.js works with any HLS url
    hls.loadSource('https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8');
    hls.attachMedia(videoEl);

    if (typeof mux !== 'undefined') {
      mux.monitor(videoEl, {
        debug: false,
        hlsjs: hls,
        Hls: Hls,
        data: {
          env_key: 'ENV_KEY', // required
          // Metadata fields
          player_name: 'Main Player', // any arbitrary string you want to use to identify this player
          player_init_time: window.muxPlayerInitTime // ex: 1451606400000
          // ...
        }
      });
    }
  }
</script>

```

```javascript

import Hls from "hls.js";
import mux from "mux-embed";

const muxPlayerInitTime = mux.utils.now();
const videoEl = document.querySelector('#my-player');

if (Hls.isSupported()) {
  let hls = new Hls();

  // we're using a Mux HLS URL in this example, but the Mux Data integration
  // with HLS.js works with any HLS url
  hls.loadSource('https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8');
  hls.attachMedia(videoEl);
  mux.monitor(videoEl, {
    debug: false,
    hlsjs: hls,
    Hls: Hls,
    data: {
      env_key: 'ENV_KEY', // required
      // Metadata fields
      player_name: 'Main Player', // any arbitrary string you want to use to identify this player
      player_init_time: muxPlayerInitTime // ex: 1451606400000
      // ...
    }
  });
}

```

```react

import React, { useEffect, useRef } from "react";
import Hls from "hls.js";
import mux from "mux-embed";

export default function VideoPlayer() {
  const videoRef = useRef(null);
  const src = "https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8";

  useEffect(() => {
    let hls;

    if (videoRef.current) {
      const video = videoRef.current;
      const initTime = mux.utils.now();

      if (video.canPlayType("application/vnd.apple.mpegurl")) {
        // This will run in safari, where HLS is supported natively
        video.src = src;
      } else if (Hls.isSupported()) {
        // This will run in all other modern browsers
        hls = new Hls();
        hls.loadSource(src);
        hls.attachMedia(video);

        mux.monitor(video, {
          debug: false,
          // pass in the 'hls' instance and the 'Hls' constructor
          hlsjs: hls,
          Hls,
          data: {
            env_key: "ENV_KEY", // required
            // Metadata fields
            player_name: "Main Player", // any arbitrary string you want to use to identify this player
            player_init_time: initTime
            // ...
          }
        });
      }
    }

    return () => {
      if (hls) {
        hls.destroy();
      }
    };
  }, [videoRef]);

  return (
    <video
      controls
      ref={videoRef}
      style={{ width: "100%", maxWidth: "500px" }}
    />
  );
}

```



Call `mux.monitor`, passing in a valid CSS selector or the video element itself, followed by the SDK options and metadata. If you use a CSS selector that matches multiple elements, the first matching element in the document will be used.

In the SDK options, be sure to pass in the `hlsjs` instance and the `Hls` constructor. If the `Hls` constructor is available on the global `window` object then it can be omitted from the SDK options.
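As a hedged sketch of that wiring, the setup can be written as a function so the dependencies are explicit (the function and argument names here are illustrative; `Hls` can be omitted from the options whenever the constructor is available on the global `window` object):

```javascript
// Sketch: wire up HLS.js playback and Mux monitoring in one place.
// The function and argument names are illustrative, not part of the SDK.
function setUpMonitoring(mux, Hls, videoEl, src, envKey) {
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(videoEl);

  mux.monitor(videoEl, {
    hlsjs: hls, // the HLS.js instance is always required
    data: { env_key: envKey }
  });
  return hls;
}
```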

Alternatively, if your player does not immediately have access to the HLS.js player instance, you can start monitoring HLS.js at any time in the future. In order to do this, you can call either of the following:

```js
mux.addHLSJS("#my-player", options)
// or
myVideoEl.mux.addHLSJS(options)
```

Log in to the Mux dashboard, find the environment that corresponds to your `env_key`, and look for video views. It takes a minute or two after a view is tracked for it to show up on the Metrics tab.

**If you aren't seeing data**, check to see if you have an ad blocker, tracking blocker or some kind of network firewall that prevents your player from sending requests to Mux Data servers.

## 3. Make your data actionable

The only required field in the `options` that you pass into `mux-embed` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key when calling `mux.monitor`.

```js
mux.monitor('#my-player', {
  debug: false,
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY', // required

    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'

    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000

    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `monitor`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
mux.emit('#my-player', 'videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
mux.emit('#my-player', 'programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
mux.monitor('#my-player', {
  debug: false,
  disableCookies: true,
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY',
    // ... rest of metadata
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
mux.monitor('#my-player', {
  debug: false,
  hlsjs: hls,
  Hls,
  respectDoNotTrack: true, // Disable tracking of browsers where Do Not Track is enabled
  data: {
    env_key: 'ENV_KEY',
    // ... rest of metadata
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning that they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `mux-embed` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the mux monitor.

```js
mux.emit('#my-player', 'error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*

### Error translator

If your player emits error events that are not fatal to playback, or its errors are unclear or lack helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

mux.monitor('#my-player', {
  debug: false,
  errorTranslator,
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY', // required

    // ... additional metadata
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
mux.monitor('#my-player', {
  debug: false,
  automaticErrorTracking: false,
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY', // required

    // ... additional metadata
  }
});
```

### Use TypeScript with mux-embed  <BetaTag />

`mux-embed` now provides TypeScript type definitions with the published package! If you want to opt in, you can check out how [here](/docs/guides/monitor-html5-video-element#opt-in-to-using-mux-embed-typescript-type-definitions--).

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
mux.monitor('#my-player', {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY', // required
    // ... additional metadata
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v5.17.1

* fix issue where playing time might accumulate for paused players

### Previous releases

#### v5.17.0

* add compatibility for dash.js 5

#### v5.16.1

* Update parsing of initial value for player\_playback\_mode

#### v5.16.0

* Add Playback Range Tracker for new engagement metrics

#### v5.15.0

* Automatically detect playback mode changes for HTML 5 Video

#### v5.14.0

* Emit a renditionchange event at the start of views to enable updated rendition tracking.

#### v5.13.0

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * New playbackmodechange event
  * Two new metrics, ad\_playing\_time\_ms\_cumulative and view\_playing\_time\_ms\_cumulative, to track playing time by wall clock time

#### v5.12.0

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.

#### v5.11.0

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time

#### v5.10.0

* Adds support for cdnchange events

#### v5.9.1

* Submit Aggregate Startup Time when autoplay is set

#### v5.9.0

* Improve scaling calculation accuracy by using more events for tracking

#### v5.8.3

* add custom 11 through 20 to types

#### v5.8.2

* remove duplicate video\_source\_mime\_type from types

#### v5.8.1

* fix typo in types for viewer\_plan

#### v5.8.0

* Add support for video\_creator\_id

#### v5.7.0

* Add keys for new customer-defined dimensions

#### v5.6.0

* Fix issue where firefox did not send beacons, and some final beacons might not be sent

#### v5.5.0

* Update mechanism for generating unique IDs, used for `view_id` and others
* Use crypto.randomUUID(), when available, for generating UUID values

#### v5.4.3

* \[chore] internal build process fix (no functional changes)

#### v5.4.2

* feat(google-ima): Beta implementation of google-ima extension to mux-embed
* feat(mux-embed): Add methods for post-initialization overrides of functionality (for internal use only).
* fix(mux-embed): typecheck for dashjs.getSource is incorrect.

#### v5.4.1

* Expose `updateData` globally and fix types
* Fix an issue where views were not ended cleanly on long resume detection

#### v5.4.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

#### v5.3.3

* expose HEARTBEAT and DESTROY under mux.events

#### v5.3.2

* Fix type issues for error severity and business exception

#### v5.3.1

* fix(mux-embed): Remove 3rd party dependencies and replace with appropriately equivalent functionality.

#### v5.3.0

* Ignore request events when emitting heartbeat events
* Fix an issue where video quality metrics may not be calculated correctly on some devices

#### v5.2.1

* Send hb events regardless of errors

#### v5.2.0

* Bug fix to not de-dupe error event metadata
* Extend `errorTranslator` to work with `player_error_severity` and `player_error_business_exception`

#### v5.1.0

* Target ES5 for bundles and validate bundles are ES5

* fix an issue where seeking time before first play attempt counted towards video startup time

#### v5.0.0

* Add opt-in TypeScript Types to Mux Embed and use + refactor for other dependent data SDKs. Update published dists to include CJS and ESM.
* Mux Embed now provides (opt in) TypeScript types in its published package, as well as publishes CJS and ESM versions of the package.
* This allows us to provide a lower risk and iterative roll out of official TypeScript types for `mux-embed`. The export types updates were required to ensure actual matches between the dist package and corresponding TypeScript types.
* This *should* have no direct impact on users, though different build tools will now potentially select one of the new export types (e.g. the ESM "flavor" of `mux-embed`). TypeScript types *should not* be applied unless they are explicitly referenced in app (discussed in docs updates).

#### v4.30.0

* fix an issue causing certain network metrics to not be available for dashjs v4.x

* fix an issue where certain IDs used may cause a DOM exception to be raised

#### v4.29.0

* fix(mux-embed): avoid using element id for muxId. attach muxId to element.

#### v4.28.1

* fix an issue where beaconDomain deprecation line was incorrectly logged

#### v4.28.0

* Deprecate `beaconDomain` in favor of `beaconCollectionDomain`. The `beaconDomain` setting will continue to function, but integrations should change to `beaconCollectionDomain` instead.

#### v4.27.0

* Fix an issue where playback time was incorrectly counted during seeking and other startup activities
* Add events for the collection of ad clicks
* fix an issue where seek latency could be unexpectedly large
* fix an issue where seek latency does not include time at end of a view
* Add events for the collection of ad skips

#### v4.26.0

* muxData cookie expiration should be one year

#### v4.25.1

* Do not deduplicate ad IDs in ad events

#### v4.25.0

* Include ad watch time in playback time

#### v4.24.0

* Fix an issue where beacons over a certain size could get hung and not be sent

#### v4.23.0

* Collect Request Id from the response headers, when available, for HLS.js (`requestcompleted` and `requestfailed`) and Dash.js (`requestcompleted`). The following headers are collected: `x-request-Id`, `cf-ray` (Cloudflare), `x-amz-cf-id` (CloudFront), `x-akamai-request-id` (Akamai)

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update Headers type

#### v4.22.0

* Send errors, `requestfailed`, and `requestcancelled` events on Dash.js. Because of this change, you may see the number of playback failures increase as we now automatically track additional fatal errors.

#### v4.21.0

* Include Ad metadata in ad events

#### v4.20.0

* Support for new dimension, `view_has_ad`

#### v4.19.0

* End views after 5 minutes of rebuffering

#### v4.18.0

* Add audio, subtitle, and encryption key request failures for HLS.js
* Capture ad metadata for Video.js IMA
* Capture detailed information from HLS.js for fatal errors in the Error Context

#### v4.17.0

* Extend `errorTranslator` to work with `player_error_context`

#### v4.16.0

* Add new `renditionchange` fields to Shaka SDK
* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields
* Add frame drops to Shaka SDK
* Add new `renditionchange` info to Web SDKs
* Adds the new Media Collection Enhancement fields

#### v4.15.0

* update `mux.utils.now` to use `navigationStart` for timing reference

* fix issue where views after `videochange` might incorrectly accumulate rebuffering duration

* Resolved issue sending beacons when view is ended

* Record `request_url` and `request_id` with network events

#### v4.14.0

* Tracking FPS changes if specified in Manifest

#### v4.13.4

* Resolved issue sending beacons when paused

#### v4.13.3

* Fixed issue with monitoring network events for hls.js monitor

#### v4.13.2

* Fix an issue with sending unnecessary heartbeat events on the window `visibilitychange` event

#### v4.13.1

* Fixes an issue with accessing the global object

#### v4.13.0

* Collect the `x-request-id` header from segment responses to make it easier to correlate client requests to other logs

* Upgraded internal webpack version

* Flush events on window `visibilitychange` event

#### v4.12.1

* Use Fetch API for sending beacons

#### v4.12.0

* Generate a new unique view if the player monitor has not received any events for over an hour.

#### v4.11.0

* Detect fullscreen and player language

#### v4.10.0

* Replace query string dependency to reduce package size
* Remove `ImageBeacon` fallback, removing support for IE9

#### v4.9.4

* Generate all `view_id`'s internally

#### v4.9.3

* Use common function for generating short IDs

#### v4.9.2

* Fixed an issue around the `disablePlayheadRebufferTracking` option

#### v4.9.1

* Fix issue where `getStartDate` does not always return a date object

#### v4.9.0

* Support PDT and player\_live\_edge\_program\_time for Native Safari

* Set a max payload size in mux-embed

#### v4.8.0

* Add option `disablePlayheadRebufferTracking` to allow players to disable automatic rebuffering metrics.
  Players can emit their own `rebufferstart` or `rebufferend` events and track rebuffering metrics.

* Fix an issue with removing `player_error_code` and `player_error_message` when the error code is `1`.
  Also stops emitting `MEDIA_ERR_ABORTED` as errors.

* Now leaving Player Software Version for HTML5 Video Element unset rather than "No Versions" as it is no longer needed.

#### v4.7.0

* Add an option to specify beaconCollectionDomain for Data custom domains

#### v4.6.2

* Fix an issue with emitting heartbeat events while the player is not playing

#### v4.6.1

* Fix an issue with removing event listeners from window after the player monitor destroy event

#### v4.6.0

* Update hls.js monitor to record session data with fields prefixed as `io.litix.data.`
* Update the manifest parser to parse HLS session data tags

#### v4.5.0

* Add short codes to support internal video experiments
* Collect request header prefixed with `x-litix-*`
* Capture fatal hls.js errors
* Make `envKey` an optional parameter

#### v4.4.4

* Add a player events enum on the `mux` object (e.g. `mux.events.PLAY`)
* Use the browser `visibilitychange` listener instead of `unload` to handle destructuring the player monitor.

#### v4.4.3

* Fix: Specify `video_source_is_live` for HLS.js monitor

#### v4.4.2

* Group events into 10 second batches before sending a beacon

#### v4.4.1

* Exclude latency metrics from beacons if `video_source_is_live` is not `true`

#### v4.4.0

* Add a lightweight HLS manifest parser to capture latency metrics for players that don't expose an API for accessing the manifest.
* Allow players to emit `player_program_time` instead of calculating internally

#### v4.3.0

* Add support for calculating latency metrics when streaming using HLS

#### v4.2.5

* Remove default `video_id` when not specified by the developer.

#### v4.2.4

* Add minified keys for latency metrics

#### v4.2.3

* Add minified keys for new program time metrics

#### v4.2.2

* Fix bug causing missing bitrate metrics using HLS.js {'>'}v1.0.0

#### v4.2.1

* (video element monitor) Fix an issue where some non-fatal errors thrown by the video were tracked as playback failures

#### v4.2.0

* Fix an issue where views triggered by `programchange` may not report metrics correctly
* Fix an issue where calling `el.mux.destroy()` multiple times in a row raised an exception

#### v4.1.1

* Fix an issue where `player_remote_played` wasn't functioning correctly

#### v4.1.0

* Add support for custom dimensions

#### v4.0.1

* Support HLS.js v1.0.0

#### v4.0.0

* Enable sending optional ad quartile events through.
* Move device detection server-side, improving data accuracy and reducing client SDK size.
* Fix an issue where jank may be experienced in some web applications when the SDK is loaded.

#### v3.4.0

* Setting to disable rebuffer tracking `disableRebufferTracking` that defaults to `false`.

#### v3.3.0

* Adds `viewer_connection_type` detection.

#### v3.2.0

* Adds support for `renditionchange`.

#### v3.1.0

* Add checks for window being undefined and expose a way for SDKs to pass in platform information. This work is necessary for compatibility with react-native-video.

#### v3.0.0

* Setting to disable Mux Data collection when Do Not Track is present now defaults to off
* Do not submit the source URL when a video is served using the data: protocol

#### v2.10.0

* Use Performance Timing API, when available, for view event timestamps

#### v2.9.1

* Fix an issue with server side rendering

#### v2.9.0

* Support for Dash.js v3

#### v2.8.0

* Submit Player Instance Id as a unique identifier

#### v2.7.3

* Fixed a bug where the source hostname was not properly collected when using `mux.monitor` with Hls.js or Dash.js.


# Monitor AVPlayer
This guide walks through integrating Mux Data with AVPlayer on iOS and tvOS to collect video performance metrics.
Mux Data integration for AVPlayer supports applications running on iOS 12.0 or newer, tvOS 12.0 or newer, and Mac Catalyst that use `AVPlayerViewController`, `AVPlayerLayer`, or a standalone `AVPlayer` that is playing audio or presented with a fixed size. Applications running on visionOS 1.0 and higher are also supported if they use `AVPlayerViewController` or a standalone `AVPlayer` that is playing audio or presented with a fixed size.

This integration uses Mux's core Objective-C SDK and the full source can be seen here: [muxinc/mux-stats-sdk-avplayer](https://github.com/muxinc/mux-stats-sdk-avplayer). This SDK is packaged as an xcframework.

## Features

The following data can be collected by the Mux Data SDK when you use the AVPlayer SDK, as described below.

* Engagement metrics
* Quality of Experience Metrics
* Available for deployment from a package manager
* Custom Dimensions
* Average Bitrate metrics and `renditionchange` events
* Request metrics
* Ads metrics
* Customizable Error Tracking
* Custom Beacon Domain
* Extraction of HLS Session Data
* Live Stream Latency metric

Notes:

* Packaged with CocoaPods, SPM, and Carthage.
* Request Latency is not available.

## 1. Install the Mux Data SDK


### Installing in Xcode with Swift Package Manager

1. In Xcode click "File" > "Swift Packages" > "Add Package Dependency..."
2. The package repository URL is `https://github.com/muxinc/mux-stats-sdk-avplayer.git`

```
https://github.com/muxinc/mux-stats-sdk-avplayer.git
```

3. Click `Next`.
4. Since `MUXSDKStats` follows SemVer, we recommend setting the "Rules" to install the latest version and choosing the "Up to Next Major" option. [Here's an overview of the different SPM Dependency Rules and their semantics](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app#Decide-on-package-requirements).

### Installing in Package.swift

Open your `Package.swift` file and add the following to `dependencies`:

```swift
    .package(
      url: "https://github.com/muxinc/mux-stats-sdk-avplayer",
      .upToNextMajor(from: "4.0.0")
    ),
```

Note that `MUXSDKStats` has a dependency on `MuxCore`, so you will see that `MuxCore` gets installed as well.

> As of Xcode 14.3.1 integrating the Mux SDKs as part of a shared framework using Swift Package Manager library targets is now supported. [An example for setting this up is available here](https://github.com/muxinc/examples/tree/main/swift-data-library-installation).

### Installing with CocoaPods

To install with CocoaPods, modify your Podfile to use frameworks by including `use_frameworks!` and then add the following pods to your Podfile:

```
pod 'Mux-Stats-AVPlayer', '~>4.0'
```

This will install `Mux-Stats-AVPlayer` and the latest current release of our [core Objective-C Library](https://github.com/muxinc/stats-sdk-objc).

Next, add the appropriate import statement to your application.

```objc

@import MUXSDKStats;

```

```swift

import MUXSDKStats

```



## 2. Initialize the monitor for your AVPlayer instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. It is not to be confused with API tokens, which are created in the admin settings dashboard and are meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

The example below uses `monitorAVPlayerViewController(_:withPlayerName:customerData:)`. If you are using `AVPlayerLayer`, use `monitorAVPlayerLayer(_:withPlayerName:customerData:)` instead.

The `playerName` parameter is a string that identifies this instance of your player. When calling `destroyPlayer(_:)` or `videoChange(forPlayer:with:)` later on, you will need this string. Each instance of a player that runs simultaneously in your application should have a different `playerName`.

<Callout type="warning">
  **If you are using SwiftUI**, attach the monitor in the `onAppear` action for your view. This ensures that the Mux Data SDK is able to get the dimensions of the view which is used to calculate video quality metrics.
</Callout>

```objc

MUXSDKCustomerPlayerData *playerData = [[MUXSDKCustomerPlayerData alloc] initWithPropertyKey:@"ENV_KEY"];

MUXSDKCustomerVideoData *videoData = [MUXSDKCustomerVideoData new];
// insert videoData metadata
videoData.videoTitle = @"Title1";
videoData.videoSeries = @"animation";

MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] initWithCustomerPlayerData:playerData
                                                                                videoData:videoData
                                                                                 viewData:nil
                                                                               customData:nil
                                                                               viewerData:nil];

_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController 
                                             withPlayerName:@"mainPlayer" 
                                               customerData:customerData];


```

```swift

let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
playerData?.playerName = "playerName"

let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Title1"
videoData.videoSeries = "animation"

guard let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: nil,
    customData: nil,
    viewerData: nil
) else {
    return
}

let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
    playerViewController,
    withPlayerName: "playerName",
    customerData: customerData
)
// if you're using AVPlayerLayer instead of AVPlayerViewController use this instead:
// MUXSDKStats.monitorAVPlayerLayer(playerLayer, withPlayerName: "playerName", customerData: customerData)

```



For more complete examples check the [demo app in the repo](https://github.com/muxinc/mux-stats-sdk-avplayer/tree/master/Examples).

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux Data dashboard. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

## 3. Make your data actionable

The only required field is `env_key`, but without additional metadata the metrics in your dashboard will lack the information needed to take meaningful action. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Metadata fields are provided via the `MUXSDKCustomerPlayerData` and `MUXSDKCustomerVideoData` objects.

For the full list of properties, see the MuxCore headers in the [latest stats-sdk-objc release](https://github.com/muxinc/stats-sdk-objc/releases/latest).

For more details about each property, view the [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) guide.

```objc

MUXSDKCustomerPlayerData *playerData = [[MUXSDKCustomerPlayerData alloc] initWithPropertyKey:@"ENV_KEY"];
playerData.viewerUserId = @"1234";
playerData.experimentName = @"player_test_A";
playerData.playerName = @"iOS AVPlayer";
playerData.playerVersion = @"1.0.0";

MUXSDKCustomerVideoData *videoData = [MUXSDKCustomerVideoData new];
videoData.videoTitle = @"Big Buck Bunny";
videoData.videoId = @"bigbuckbunny";
videoData.videoSeries = @"animation";
videoData.videoDuration = @(120000); // in milliseconds
videoData.videoIsLive = @NO;
videoData.videoCdn = @"cdn";

MUXSDKCustomerViewData *viewData= [[MUXSDKCustomerViewData alloc] init];
viewData.viewSessionId = @"some session id";

MUXSDKCustomData *customData = [[MUXSDKCustomData alloc] init];
[customData setCustomData1:@"my-data-string"];
[customData setCustomData2:@"my-custom-dimension-2"];

MUXSDKCustomerViewerData *viewerData = [[MUXSDKCustomerViewerData alloc] init];
viewerData.viewerApplicationName = @"MUX DemoApp";

MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] initWithCustomerPlayerData:playerData
                                                                                videoData:videoData
                                                                                 viewData:viewData
                                                                               customData:customData
                                                                               viewerData:viewerData];


_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController 
                                             withPlayerName:@"mainPlayer"
                                               customerData:customerData];

```

```swift

guard let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY") else {
    return
}
playerData.playerName = "playerName"
playerData.viewerUserId = "1234"
playerData.experimentName = "player_test_A"
playerData.playerVersion = "1.0.0"

let videoData = MUXSDKCustomerVideoData()
videoData.videoId = "abcd123"
videoData.videoTitle = "My Great Video"
videoData.videoSeries = "Weekly Great Videos"
videoData.videoDuration = 120000 // in milliseconds
videoData.videoIsLive = false
videoData.videoCdn = "cdn"

let viewerData = MUXSDKCustomerViewerData()
viewerData.viewerApplicationName = "MUX video-demo"

guard let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: nil,
    customData: nil,
    viewerData: viewerData
) else {
    return
}
let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
    playerViewController,
    withPlayerName: "playerName",
    customerData: customerData
)

// if you're using AVPlayerLayer instead of AVPlayerViewController use this instead:
// MUXSDKStats.monitorAVPlayerLayer(playerLayer, withPlayerName: "playerName", customerData: customerData)

```



## 4. Set or update metadata after monitor

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `monitorAVPlayerLayer`, `monitorAVPlayerViewController`, or `monitorAVPlayer`. Then, once you have the metadata, update it by calling `setCustomerData(_:forPlayer:)`.

```objc

// Sometime later before the player is destroyed you can do this:
// The player name ("mainPlayer" in this example) should be a player that
// you have already called one of the `monitorAVPlayer` methods with
// In this example we are updating videoData, but the same can be done
// for updating playerData, customData or viewData
// Note that the values in customerData passed as nil will keep previously set data
// Note that viewerData can't be updated

MUXSDKCustomerVideoData *videoData = [MUXSDKCustomerVideoData new];
videoData.videoTitle = @"Big Buck Bunny";
videoData.videoId = @"bigbuckbunny";

MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] init];
customerData.customerVideoData = videoData;

[MUXSDKStats setCustomerData:customerData
                   forPlayer:@"mainPlayer"];

```

```swift

// Example: after monitoring, you need to update `customerData` with new metadata.
// In this example we are updating `videoData`, but the same process can be 
// used for updating `playerData`, `customData` or `viewData`.

// Note: The player name ("mainPlayer" in this example) should be a player that
// has been defined in steps 1-3.
// Note: All values in customerData passed as nil will keep previously set data.
// Note: `viewerData` object cannot be updated.

let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Big Buck Bunny"
videoData.videoId = "bigbuckbunny"

guard let customerData = MUXSDKCustomerData(
    customerPlayerData: nil,
    videoData: videoData,
    viewData: nil,
    customData: nil,
    viewerData: nil
) else {
    return
}

MUXSDKStats.setCustomerData(customerData, forPlayer: "mainPlayer")

```



## 5. Advanced

## Changing the Video

There are two cases where the underlying tracking of the video view needs to be reset: first, when you load a new source URL into an existing player, and second, when the program within a single stream changes (such as a program within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `videoChange(forPlayer:with:)` which will remove all previous video data and reset all metrics for the video view. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

You must call `videoChange(forPlayer:with:)` immediately before telling the player which new source to play. This recommendation changed in `v1.2.0`.

You must also call `play()` on the player after replacing the current item.

If you have new player data you can include it in the `customerData` you pass to `videoChange(forPlayer:with:)`.

```swift

let videoData = MUXSDKCustomerVideoData()
videoData.videoId = "abcd123"
videoData.videoTitle = "My Great Video"
videoData.videoSeries = "Weekly Great Videos"
videoData.videoDuration = 120000 // in milliseconds
videoData.videoIsLive = false
videoData.videoCdn = "cdn"

let customerData = MUXSDKCustomerData(
    customerPlayerData: nil,
    videoData: videoData,
    viewData: nil
)

MUXSDKStats.videoChange(forPlayer: "AVPlayer", with: customerData)

// you should have AVPlayer and AVPlayerItem at this point
player.replaceCurrentItem(with: playerItem)

// call play() on the player here to start playback
player.play()

```



### New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `programChange(forPlayer:with:)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

```swift

let videoData = MUXSDKCustomerVideoData()
videoData.videoId = "abcd123"
videoData.videoTitle = "My Great Video"
videoData.videoSeries = "Weekly Great Videos"
videoData.videoDuration = 120000 // in milliseconds
videoData.videoIsLive = false
videoData.videoCdn = "cdn"

let customerData = MUXSDKCustomerData(
    customerPlayerData: nil,
    videoData: videoData,
    viewData: nil
)

MUXSDKStats.programChange(forPlayer: "AVPlayer", with: customerData)

```



## Usage with Google Interactive Media Ads (IMA)

If you are using Google Interactive Media Ads, specifically either the iOS SDK `GoogleAds-IMA-iOS-SDK` or the tvOS SDK `GoogleAds-IMA-tvOS-SDK`, then we have another plugin library that integrates tracking of ad playback events. You should have a fully functioning Google Ads IMA integration working in your iOS or tvOS application first.

<Callout type="info">
  The `v0.14.0` and higher releases of the Mux Google Ads IMA plugin expose a new API. If you've already integrated an earlier version [documentation is available to migrate to the new API](/docs/guides/monitor-avplayer#steps-to-migrate-existing-ima-integration-to-new-api).
</Callout>

### Installation

### Swift Package Manager

#### Installing in Xcode with Swift Package Manager

1. In Xcode click "File" > "Swift Packages" > "Add Package Dependency..."
2. The package repository URL is `https://github.com/muxinc/mux-stats-google-ima.git`

#### Installing as a dependency in Package.swift manifest

In order to install in your iOS application open your `Package.swift` file, add the following to dependencies:

```swift
.package(
  url: "https://github.com/muxinc/mux-stats-google-ima",
  .upToNextMajor(from: "0.14.0")
)
```

### Cocoapods

The Mux Google IMA plugin is available through CocoaPods. To install it, add the following line to your Podfile:

```ruby
pod 'Mux-Stats-Google-IMA'
```

### Steps for new IMA integrations

1. Import the SDK: `import MuxStatsGoogleIMAPlugin` in Swift, or `#import <MuxStatsGoogleIMAPlugin/MuxStatsGoogleIMAPlugin.h>` in Objective-C
2. After initializing the Mux monitor with `monitorAVPlayerViewController` or `monitorAVPlayerLayer`, save this value to a variable (below it's called `playerBinding`)
3. Create an `adListener` instance using the `playerBinding` you created above and your application's IMA ads loader by calling `MUXSDKIMAAdListener(playerBinding: playerBinding!, monitoringAdsLoader: yourAdsLoader)`.
4. Add `IMAAdsManager` monitoring by calling `adListener.monitorAdsManager(yourIMAAdsManager)`
5. Notify `adListener` when you send your ad request

* For client-side ads, the most common case, use `adListener.clientAdRequest(yourIMAAdsRequest)` to forward each `IMAAdsRequest` you initiate
* For server-side ads using [Dynamic Ad Insertion](https://developers.google.com/interactive-media-ads/docs/sdks/ios/dai), use `adListener.daiAdRequest(yourIMAStreamRequest)` to forward each `IMAStreamRequest` you initiate

6. `MUXSDKIMAAdListener` will automatically intercept `IMAAdsLoader` and `IMAAdsManager` delegate calls

### Steps to migrate existing IMA integration to new API

1. Replace calls to `MuxImaListener` with `MUXSDKIMAAdListener`. `MuxImaListener` supports the same new API, so this step is optional; the remaining steps also apply to `MuxImaListener`. As of `v0.14.0`, `MuxImaListener` is deprecated and will be removed in a future release.
2. Supply an `IMAAdsLoader` when calling the `MUXSDKIMAAdListener` initializer. Make sure your `IMAAdsLoader` delegate **is configured before this step**.
3. MUXSDKIMAAdListener will forward `IMAAdsLoaderDelegate` calls to your delegate.
4. When you've created a new `IMAAdsManager`, as you did with `IMAAdsLoader`, configure your own `IMAAdsManagerDelegate` first and then call `monitorAdsManager`.
5. MUXSDKIMAAdListener will forward `IMAAdsManagerDelegate` calls to your delegate.
6. Remove calls to `dispatchEvent`, `dispatchError`, and `onContentPauseOrResume` from your integration.

```objc

#import <MuxStatsGoogleIMAPlugin/MuxStatsGoogleIMAPlugin.h>

- (void)viewDidLoad {
  // Follow the instructions from pod 'GoogleAds-IMA-iOS-SDK' to set up
  // your adsLoader and set your ViewController as the delegate
  //
  // From your ViewController, when you call either
  //    monitorAVPlayerViewController:withPlayerName:customerData:
  //    monitorAVPlayerLayer:withPlayerName:customerData:
  //
  // You will get back a MUXSDKPlayerBinding object
  MUXSDKPlayerBinding *playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController
                                                                   withPlayerName:DEMO_PLAYER_NAME
                                                                     customerData:customerData];
  //
  // Use the MUXSDKPlayerBinding object to initialize the MUXSDKIMAAdListener
  //
  _adsListener = [[MUXSDKIMAAdListener alloc] initWithPlayerBinding:playerBinding
                                                   monitorAdsLoader:adsLoader];

  //...

  // When you send your ad request, you must report it to the IMA listener so it can properly track state
  [_adsListener clientAdRequest:request]; // for Client-Side Ads (the usual case)
  // OR
  [_adsListener daiAdRequest:daiRequest]; // for Dynamic Server-Side Ads (DAI/SSAI)
}
// when the adsLoader fires adsLoadedWithData you get a
// reference to the adsManager. Set your ViewController as the delegate
// for the adsManager
- (void)adsLoader:(IMAAdsLoader *)loader adsLoadedWithData:(IMAAdsLoadedData *)adsLoadedData {
    _adsManager = adsLoadedData.adsManager;
    // Set your adsManager delegate before passing it to adsListener for monitoring
    _adsManager.delegate = self;
    IMAAdsRenderingSettings *adsRenderingSettings = [[IMAAdsRenderingSettings alloc] init];
    adsRenderingSettings.webOpenerPresentingController = self;
    [_adsManager initializeWithAdsRenderingSettings:adsRenderingSettings];
    [_adsListener monitorAdsManager: _adsManager];
}

//
- (void)adsManager:(IMAAdsManager *)adsManager didReceiveAdEvent:(IMAAdEvent *)event {
    // When the SDK notifies us that ads have been loaded, play them.
    if (event.type == kIMAAdEvent_LOADED) {
        [_adsManager start];
    }
}

- (void)adsManager:(IMAAdsManager *)adsManager didReceiveAdError:(IMAAdError *)error {
    [_avplayer play];
}

- (void)adsManagerDidRequestContentPause:(IMAAdsManager *)adsManager {
    [_avplayer pause];
}

- (void)adsManagerDidRequestContentResume:(IMAAdsManager *)adsManager {
    [_avplayer play];
}

```

```swift

import MuxCore
import MUXSDKStats
import GoogleInteractiveMediaAds
import MuxStatsGoogleIMAPlugin

var adsListener: MUXSDKIMAAdListener?

override func viewDidLoad() {
  super.viewDidLoad()
  // Follow the instructions from pod 'GoogleAds-IMA-iOS-SDK' to set up
  // your adsLoader and set your ViewController as the delegate
  //
  // From your ViewController, when you call either
  //    monitorAVPlayerViewController
  //    monitorAVPlayerLayer
  //
  // You will get back a MUXSDKPlayerBinding object

  // Configure your ads loader delegate before initializing MUXSDKIMAAdListener
  adsLoader.delegate = self

  // Set up your content player's AVPlayerViewController or AVPlayerLayer
  let playerViewController = AVPlayerViewController()

  let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
    playerViewController,
    withPlayerName: playerName,
    customerData: customerData
  )

  // Use the MUXSDKPlayerBinding object to initialize the MUXSDKIMAAdListener class
  // Save a reference to adsListener; we'll use a property
  let adsListener = MUXSDKIMAAdListener(
    playerBinding: playerBinding!,
    monitorAdsLoader: adsLoader
  )
  self.adsListener = adsListener
  // ... 

  // When you send your ad request, you must report it to the IMA listener so it can properly track state
  adsListener.clientAdRequest(request) // for Client-Side Ads (the usual case)
  // OR
  adsListener.daiAdRequest(daiAdRequest) // for Dynamic Server-Side Ads (SSAI)
}

// the adsLoader delegate will fire this and give access to the adsManager
// the application needs to register as a delegate for the adsManager too
func adsLoader(_ loader: IMAAdsLoader!, adsLoadedWith adsLoadedData: IMAAdsLoadedData!) {
    adsManager = adsLoadedData.adsManager
    adsManager.delegate = self
    adsListener?.monitorAdsManager(adsManager)
}

// all of these events get fired by the adsManager delegate
// the application responds to them here, while the listener forwards
// the events to the Mux SDK automatically
func adsManager(_ adsManager: IMAAdsManager!, didReceive event: IMAAdEvent!) {
    if (event.type == kIMAAdEvent_LOADED) {
      adsManager.start()
    }
}

func adsManager(_ adsManager: IMAAdsManager!, didReceive error: IMAAdError!) {
    avPlayer.play()
}

func adsManagerDidRequestContentPause(_ adsManager: IMAAdsManager!) {
    avPlayer.pause()
}

func adsManagerDidRequestContentResume(_ adsManager: IMAAdsManager!) {
    avPlayer.play()
}

```



If you have enabled Picture in Picture support and are using the `IMAPictureInPictureProxy`, you will need an additional step in order to track ad-related metrics correctly.

```objc

#import <MuxStatsGoogleIMAPlugin/MuxStatsGoogleIMAPlugin.h>

- (void)viewDidLoad {
  // Follow the instructions from pod 'GoogleAds-IMA-iOS-SDK' to set up
  // your adsLoader and set your ViewController as the delegate
  //
  // From your ViewController, when you call either
  //    monitorAVPlayerViewController:withPlayerName:playerData:videoData:
  //    monitorAVPlayerLayer:withPlayerName:playerData:videoData:
  //
  // You will get back a MUXSDKPlayerBinding object
  MUXSDKPlayerBinding *playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController
                                                                   withPlayerName:DEMO_PLAYER_NAME
                                                                     customerData:customerData];
  //
  // Use the MUXSDKPlayerBinding object to initialize the MUXSDKIMAAdListener class
  //
  _adsListener = [[MUXSDKIMAAdListener alloc] initWithPlayerBinding:playerBinding 
                                                   monitorAdsLoader:adsLoader];
  [_adsListener setPictureInPicture:YES];

  //...

  // When you send your ad request, you must report it to the IMA listener so it can properly track state
  [_adsListener clientAdRequest:request]; // for Client-Side Ads (the usual case)
  // OR
  [_adsListener daiAdRequest:daiRequest]; // for Dynamic Server-Side Ads (DAI/SSAI)
}
// when the adsLoader fires adsLoadedWithData you get a
// reference to the adsManager. Set your ViewController as the delegate
// for the adsManager
- (void)adsLoader:(IMAAdsLoader *)loader adsLoadedWithData:(IMAAdsLoadedData *)adsLoadedData {
    _adsManager = adsLoadedData.adsManager;
    // Set your adsManager delegate before passing it to adsListener for monitoring
    _adsManager.delegate = self;
    IMAAdsRenderingSettings *adsRenderingSettings = [[IMAAdsRenderingSettings alloc] init];
    adsRenderingSettings.webOpenerPresentingController = self;
    [_adsManager initializeWithAdsRenderingSettings:adsRenderingSettings];
    [_adsListener monitorAdsManager: _adsManager];
}

//
- (void)adsManager:(IMAAdsManager *)adsManager didReceiveAdEvent:(IMAAdEvent *)event {
    // When the SDK notifies us that ads have been loaded, play them.
    if (event.type == kIMAAdEvent_LOADED) {
        [_adsManager start];
    }
}

- (void)adsManager:(IMAAdsManager *)adsManager didReceiveAdError:(IMAAdError *)error {
    [_avplayer play];
}

- (void)adsManagerDidRequestContentPause:(IMAAdsManager *)adsManager {
    [_avplayer pause];
}

- (void)adsManagerDidRequestContentResume:(IMAAdsManager *)adsManager {
    [_avplayer play];
}

```

```swift

import MuxCore
import MUXSDKStats
import GoogleInteractiveMediaAds
import MuxStatsGoogleIMAPlugin

var adsListener: MUXSDKIMAAdListener?

override func viewDidLoad() {
  super.viewDidLoad()
  // Follow the instructions from pod 'GoogleAds-IMA-iOS-SDK' to set up
  // your adsLoader and set your ViewController as the delegate
  //
  // From your ViewController, when you call either
  //    monitorAVPlayerViewController
  //    monitorAVPlayerLayer
  //
  // You will get back a MUXSDKPlayerBinding object

  // Configure your ads loader delegate before initializing MUXSDKIMAAdListener
  adsLoader.delegate = self

  // Set up your content player's AVPlayerViewController or AVPlayerLayer
  let playerViewController = AVPlayerViewController()

  let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
    playerViewController,
    withPlayerName: playerName,
    customerData: customerData
  )

  // Use the MUXSDKPlayerBinding object to initialize the MUXSDKIMAAdListener class
  // Save a reference to adsListener; we'll use a property
  let adsListener = MUXSDKIMAAdListener(
    playerBinding: playerBinding!,
    monitorAdsLoader: adsLoader
  )
  adsListener.setPictureInPicture(true)
  self.adsListener = adsListener
  // ... 

  // When you send your ad request, you must report it to the IMA listener so it can properly track state
  adsListener.clientAdRequest(request) // for Client-Side Ads (the usual case)
  // OR
  adsListener.daiAdRequest(daiAdRequest) // for Dynamic Server-Side Ads (SSAI)
}

// the adsLoader delegate will fire this and give access to the adsManager
// the application needs to register as a delegate for the adsManager too
func adsLoader(_ loader: IMAAdsLoader!, adsLoadedWith adsLoadedData: IMAAdsLoadedData!) {
    adsManager = adsLoadedData.adsManager
    adsManager.delegate = self
    adsListener?.monitorAdsManager(adsManager)
}

// all of these events get fired by the adsManager delegate
// the application responds to them here, while the listener forwards
// the events to the Mux SDK automatically
func adsManager(_ adsManager: IMAAdsManager!, didReceive event: IMAAdEvent!) {
    if (event.type == kIMAAdEvent_LOADED) {
      adsManager.start()
    }
}

func adsManager(_ adsManager: IMAAdsManager!, didReceive error: IMAAdError!) {
    avPlayer.play()
}

func adsManagerDidRequestContentPause(_ adsManager: IMAAdsManager!) {
    avPlayer.pause()
}

func adsManagerDidRequestContentResume(_ adsManager: IMAAdsManager!) {
    avPlayer.play()
}

```



For a complete example project written in Swift with UIKit, check out the `Example/DemoApp` folder of [muxinc/mux-stats-google-ima](https://github.com/muxinc/mux-stats-google-ima).

You can find more examples in the "/Examples" directory of [muxinc/mux-stats-sdk-avplayer](https://github.com/muxinc/mux-stats-sdk-avplayer) on GitHub. All of these apps have examples with Google IMA ads: `video-demo` is an iOS app written in Swift, and `TVDemoApp` is a tvOS app written in Objective-C.

## Track orientation change events

As of v1.3.0, Mux-Stats-AVPlayer can optionally track `orientationchange` events. To use this functionality, call the `orientationChange(forPlayer:with:)` method.

These events will show up in the events log on the view's page.

```objc

@implementation ViewController

  - (void)viewWillTransitionToSize:(CGSize)size
       withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
    [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
    [coordinator animateAlongsideTransition:^(id<UIViewControllerTransitionCoordinatorContext> context) {} completion:^(id<UIViewControllerTransitionCoordinatorContext> context) {
        [MUXSDKStats orientationChangeForPlayer:DEMO_PLAYER_NAME withOrientation:[self viewOrientationForSize:size]];

    }];
    }

  - (MUXSDKViewOrientation) viewOrientationForSize:(CGSize)size {
      return (size.width > size.height) ? MUXSDKViewOrientationLandscape : MUXSDKViewOrientationPortrait;
  }

@end

```

```swift

class VideoPlayerController: AVPlayerViewController {
    override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)
        MUXSDKStats.orientationChange(forPlayer: "playerName", with: self.viewOrientationForSize(size: size))
    }

    func viewOrientationForSize(size: CGSize) -> MUXSDKViewOrientation {
        return (size.width > size.height) ? MUXSDKViewOrientation.landscape : MUXSDKViewOrientation.portrait
    }
}

```



## Usage with AVQueuePlayer

To use the SDK with `AVQueuePlayer`, you will need to follow these steps:

1. Listen for `AVPlayerItemDidPlayToEndTime` in your application
2. When that notification fires, call `videoChange(forPlayer:with:)`

Here is an example that sets up an `AVQueuePlayer` with two items, listens for the first item to finish playing, and then passes in new `videoData`.

```swift

class AVQueuePlayerViewController: AVPlayerViewController {
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        guard let url1 = URL(string: "https://stream.mux.com/xwatk58X7MkCRBA47f01U9bGHt1lklEW5.m3u8") else {
            return
        }
        
        guard let url2 = URL(string: "https://stream.mux.com/HEp7gv53NFjtbZPZB8s02ZiNixXUgYgT7.m3u8") else {
            return
        }
        
        let item1 = AVPlayerItem(url: url1)
        let item2 = AVPlayerItem(url: url2)
        
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(self.playerItemDidReachEnd),
            name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
            object: item1
        )
        
        let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
        playerData?.playerName = "AVPlayer"
        
        let videoData = MUXSDKCustomerVideoData()
        videoData.videoIsLive = false
        videoData.videoTitle = "Title1"
        
        guard let customerData = MUXSDKCustomerData(
            customerPlayerData: playerData,
            videoData: videoData,
            viewData: nil
        ) else {
            return
        }
        
        MUXSDKStats.monitorAVPlayerViewController(self, withPlayerName: "AVPlayer", customerData: customerData)
        
        player = AVQueuePlayer(items: [item1, item2])
        player?.play()
    }
    
    @objc func playerItemDidReachEnd(notification: NSNotification) {
        let videoData = MUXSDKCustomerVideoData()
        videoData.videoTitle = "Title2"
        videoData.videoId = "RollerbladingCoder2"
        
        guard let customerData = MUXSDKCustomerData(
            customerPlayerData: nil,
            videoData: videoData,
            viewData: nil
        ) else {
            return
        }
        
        MUXSDKStats.videoChange(forPlayer: "AVPlayer", with: customerData)
    }
}

```



## Overriding device metadata

By default, the Mux Data SDK for iOS collects data about your users' devices to report on the dashboard. If you wish to provide your own device metadata, you can use `CustomerViewerData` to override the detected values.

```objc

// ... set up videoData, playerData, etc

MUXSDKCustomerViewerData *viewerData = [[MUXSDKCustomerViewerData alloc] init];
viewerData.viewerApplicationName = @"MUX DemoApp";
viewerData.viewerDeviceCategory = @"kiosk";
viewerData.viewerDeviceModel = @"ABC-12345";
viewerData.viewerDeviceManufacturer = @"Example Display Systems, Inc";
MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] initWithCustomerPlayerData:playerData
                                                                                videoData:videoData
                                                                                  viewData:viewData
                                                                                customData:customData
                                                                                viewerData:viewerData];
_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController withPlayerName:DEMO_PLAYER_NAME customerData:customerData];


```

```swift

// ... set up videoData, playerData, etc

let viewerData = MUXSDKCustomerViewerData()
viewerData.viewerDeviceCategory = "kiosk"
viewerData.viewerDeviceModel = "ABC-12345"
viewerData.viewerDeviceManufacturer = "Example Display Systems, Inc"

let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: viewData,
    customData: customData,
    viewerData: viewerData
)

```



## Handling errors manually

By default, `automaticErrorTracking` is enabled, which means the Mux SDK will catch errors that the player throws and track an `error` event. Error tracking is meant for fatal errors: when an error is thrown, it will mark the view as having encountered an error in the Mux dashboard and the view will no longer be monitored.

If you want to disable automatic error tracking and track errors manually, you can do so by passing `false` as the `automaticErrorTracking` parameter to the `monitorAVPlayerLayer`, `monitorAVPlayerViewController`, or `monitorAVPlayer` method you are using.

Whether automatic error tracking is enabled or disabled, you can dispatch errors manually with `dispatchError(_:withMessage:forPlayer:)`.

```objc

_avplayer = player;
_avplayerController.player = _avplayer;

NSString *playerName = @"iOS AVPlayer";
NSString *environmentKey = @"yourEnvironmentKey";

MUXSDKCustomData *customData = [[MUXSDKCustomData alloc] init];
[customData setCustomData1:@"my-custom-dimension"];

MUXSDKCustomerPlayerData *playerData = [[MUXSDKCustomerPlayerData alloc] initWithEnvironmentKey:environmentKey];

MUXSDKCustomerVideoData *videoData = [[MUXSDKCustomerVideoData alloc] init];
videoData.videoTitle = @"Your Video Title";

MUXSDKCustomerViewData *viewData = [[MUXSDKCustomerViewData alloc] init];
viewData.viewSessionId = @"some session id";

MUXSDKCustomerViewerData *viewerData = [[MUXSDKCustomerViewerData alloc] init];
viewerData.viewerApplicationName = @"MUX DemoApp";

MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] initWithCustomerPlayerData:playerData
                                                                                videoData:videoData
                                                                                 viewData:viewData
                                                                               customData:customData
                                                                               viewerData:viewerData];


_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController 
                                             withPlayerName:playerName 
                                               customerData:customerData
                                               automaticErrorTracking: NO];

// later you can dispatch an error yourself
[MUXSDKStats dispatchError: @"1234"
               withMessage: @"Something is not right"
                 forPlayer: playerName];

```

```swift

let playerName = "iOS AVPlayer"

let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")

let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Your Video Title"

guard let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: nil,
    customData: nil,
    viewerData: nil
) else {
    return
}

let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
    playerViewController,
    withPlayerName: playerName,
    customerData: customerData,
    automaticErrorTracking: false
)

// Later, you can dispatch an error yourself
MUXSDKStats.dispatchError(
    "1234",
    withMessage: "Something is not right",
    forPlayer: playerName
)

```



## Error Categorization

Set custom error metadata to distinguish between fatal errors and warnings, and to classify errors as playback failures or business exceptions. Errors categorized as warnings or as business exceptions are not considered playback failures, meaning these errors are excluded from alerting, giving a more accurate picture of the health of your system with less noise from alerts. You can find [more general information on Error Categorization here](/docs/guides/error-categorization).

This is an example of how to categorize an error event to be a warning.

```objc

_avplayer = player;
_avplayerController.player = _avplayer;

NSString *playerName = @"iOS AVPlayer";
NSString *environmentKey = @"yourEnvironmentKey";

MUXSDKCustomData *customData = [[MUXSDKCustomData alloc] init];
[customData setCustomData1:@"my-custom-dimension"];

MUXSDKCustomerPlayerData *playerData = [[MUXSDKCustomerPlayerData alloc] initWithEnvironmentKey:environmentKey];

MUXSDKCustomerVideoData *videoData = [MUXSDKCustomerVideoData new];
videoData.videoTitle = @"Your Video Title";

MUXSDKCustomerViewData *viewData = [[MUXSDKCustomerViewData alloc] init];
viewData.viewSessionId = @"some session id";

MUXSDKCustomerViewerData *viewerData = [[MUXSDKCustomerViewerData alloc] init];
viewerData.viewerApplicationName = @"MUX DemoApp";

MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] initWithCustomerPlayerData:playerData
                                                                                videoData:videoData
                                                                                 viewData:viewData
                                                                               customData:customData
                                                                               viewerData:viewerData];

_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController 
                                             withPlayerName:playerName 
                                               customerData:customerData];

// later you can dispatch an error yourself
[MUXSDKStats dispatchError: @"1234"
               withMessage: @"Something is not right"
                  severity: MUXSDKErrorSeverityWarning
              errorContext: @"Error context"
                 forPlayer: playerName];

```

```swift

let playerName = "iOS AVPlayer"

let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")

let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Your Video Title"

guard let customerData = MUXSDKCustomerData(
  customerPlayerData: playerData,
  videoData: videoData,
  viewData: nil,
  customData: nil,
  viewerData: nil
) else {
  return
}

let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
  playerViewController,
  withPlayerName: playerName,
  customerData: customerData
)

// Later, you can dispatch an error yourself
MUXSDKStats.dispatchError(
  "1234",
  withMessage: "Something is not right",
  severity: MUXSDKErrorSeverity.warning,
  errorContext: "Error Context",
  forPlayer: playerName
)

```



This is an example of how to categorize an error event as a business exception.

```objc

// Call this method from the source of the business exception with parameters appropriate to your integration.
- (void)dispatchBusinessExceptionWithPlayerName:(NSString *)playerName
                            playerErrorSeverity:(MUXSDKErrorSeverity)errorSeverity
                                playerErrorCode:(NSString *)playerErrorCode
                             playerErrorMessage:(NSString *)playerErrorMessage
                             playerErrorContext:(NSString *)playerErrorContext {
  [MUXSDKStats dispatchError:playerErrorCode
                 withMessage:playerErrorMessage
                    severity:errorSeverity
         isBusinessException:YES
                errorContext:playerErrorContext
                   forPlayer:playerName];
}

```

```swift

// Call this method from the source of the business exception with parameters appropriate to your integration.
func dispatchBusinessException(playerName: String,
                               playerErrorCode: String,
                               playerErrorMessage: String,
                               playerErrorContext: String) {
  MUXSDKStats.dispatchError(
    playerErrorCode,
    withMessage: playerErrorMessage,
    severity: MUXSDKErrorSeverity.warning,
    isBusinessException: true,
    errorContext: playerErrorContext,
    forPlayer: playerName
  )
}

```



## App Store warning: ITMS-90809: Deprecated API Usage

Users of our iOS library occasionally report receiving this warning from Apple:

> Apple will stop accepting submissions of apps that use UIWebView APIs. See https://developer.apple.com/documentation/uikit/uiwebview for more information.

If you run `grep -r "UIWebView" .` in your project you will see a match coming from the `dSYM/` directory in Mux-Core. At first glance, we too thought our SDK was triggering this warning.

However, after looking into this with several different applications we found that the warning was not being triggered by our SDK. In every case it was coming from another 3rd party.

Note that none of the Mux iOS libraries (including `Mux-Core` and `Mux-Stats-AVPlayer`) use `UIWebView`. If you are getting this warning you must have another SDK that is using `UIWebView`.

The reason there is some confusion around this, and the reason you get a match in the `dSYM/` directory in Mux-Core, is that our SDK links to `UIKit` and targets a version of iOS that *may include* `UIWebView`. The `dSYM` files are used for debugging purposes and do not contain any functional code. You may see that this same confusion came up in other SDKs like Mapbox and Stripe (listed below).
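To track down the dependency that is actually responsible, you can filter any `.dSYM` matches out of the `grep` results. The sketch below is illustrative: it builds a throwaway fake `Pods` tree (the `Mux-Core` and `SomeOtherSDK` paths are made up for the demo) so the filtering can be shown end to end; in a real project you would run only the final `grep` pipeline against your own `Pods` directory.

```shell
# Build a tiny fake Pods tree for illustration (paths are hypothetical).
demo=$(mktemp -d)
mkdir -p "$demo/Pods/Mux-Core/Mux-Core.dSYM" "$demo/Pods/SomeOtherSDK"
echo "UIWebView" > "$demo/Pods/Mux-Core/Mux-Core.dSYM/DWARF"  # debug symbols only: harmless
echo "UIWebView" > "$demo/Pods/SomeOtherSDK/Source.m"         # an actual source reference

# The pipeline to run in your project: list files matching UIWebView,
# then drop anything inside a .dSYM bundle.
grep -rl "UIWebView" "$demo/Pods" | grep -v "\.dSYM"
# prints only the SomeOtherSDK source file

rm -rf "$demo"
```

Any path that survives the filter points at a dependency with a real `UIWebView` reference, rather than harmless debug symbols.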

### Resources:

* [Mux issue #32](https://github.com/muxinc/mux-stats-sdk-avplayer/issues/32)
* [Mux issue #53](https://github.com/muxinc/mux-stats-sdk-avplayer/issues/53)
* [Mapbox issue #373](https://github.com/mapbox/ios-sdk-examples/issues/373)
* [Stripe issue #82](https://github.com/stripe/stripe-terminal-ios/issues/82#issuecomment-625406168)

<LinkedHeader step={steps[6]} />

### Current release

#### v4.11.0

Improvements:

* Observe and report changes in network connection type

Fixes:

* Use Foundation networking for request metrics to avoid missing values from AVMetrics

### Previous releases

#### v4.10.0

Updates:

* Add (incubating) playbackModeChange API
* Add cumulative ad playing time and total content time metric tracking. The metrics track the "wall-clock" time spent with video playing during a view, excluding buffering, seeking, and startup time.

#### v4.9.0

Improvements:

* Move calls to [`AVPlayerItem.currentDate()`](https://developer.apple.com/documentation/avfoundation/avplayeritem/currentdate\(\)) and [`AVPlayerItem.accessLog()`](https://developer.apple.com/documentation/avfoundation/avplayeritem/accesslog\(\)) off the main thread.

#### v4.8.2

Fixes:

* Disable Text-Based API (.tbd) generation for framework builds. CocoaPods fails to strip these, and this can lead to App Store upload errors.

#### v4.8.1

Fixes:

* Fix crash `Invalid parameter not satisfying: tag != nil` introduced in v4.8.0

#### v4.8.0

Improvements:

* Use AVMetrics to generate bandwidth metric events for HLS streams (iOS 18+, tvOS 18+, visionOS 2+). This also enables automatic CDN change tracking via the `X-CDN` header for these OS versions.

#### v4.7.0

Improvements:

* Fix crash when AVPlayer is used off the main thread
* Moves some calls to AVAsset to the background to avoid blocking the main thread
* Fixes mismatched bitrate/size reporting in rendition change events

#### v4.6.0

Improvements:

* Builds from source for both Swift Package Manager and CocoaPods for more flexible integration in your projects
* Various tidying-up of public header files

Fixes:

* Fixes an issue where some customer viewer data fields were not being sent

#### v4.4.0

Improvements:

* Updates the MuxCore dependency to 5.3.x
* preserve debugging symbols from framework build

Fixes:

* update package spec with missing macCatalyst platform

#### v4.3.0

Improvements:

* Update MuxCore to v5.2.0

#### v4.2.0

Fixes:

* Send ended when an AVPlayerItem finishes playing to completion

#### v4.1.2

Improvements:

* Update MuxCore to v5.1.2

#### v4.1.1

Improvements:

* Update MuxCore to v5.1.1

#### v4.1.0

Improvements:

* Automatically dispatch a `viewend` when a new `AVPlayerItem` becomes the player `currentItem`.

Fixes:

* No longer dispatch `viewinit` if the player `currentItem` is replaced with `nil`.

#### v4.0.0

New:

* Error events can be categorized with warning or fatal severity levels.
* Error events can be categorized as business exceptions.

Improvements:

* Player error details (same as listed above) are no longer deduplicated and are explicitly included with each error event sent to Mux.

API Changes:

* The minimum deployment targets for the SDK are now iOS 12 and tvOS 12.
* Removes deprecated `MUXSDKStats` APIs.

#### v3.6.2

Fixes:

* A crash that occurred when monitoring playback using AirPlay.

#### v3.6.1

Improvements:

* Include privacy manifest file

#### v3.6.0

Improvements:

* Applications running on `visionOS` can monitor metrics for `AVPlayerViewController` or `AVPlayer` with a fixed player size. We recommend testing your `visionOS` application's AVPlayer monitoring integration on both the simulator and a physical device prior to deploying to the App Store.

Fixes:

* A memory leak has been fixed that occurred when tearing down monitoring of a standalone `AVPlayer` with a fixed player size.

Known Issues:

* Installation using Cocoapods on `visionOS` applications is not currently supported. Installation on `iOS` and `tvOS` using Cocoapods is not affected.
* Monitoring `AVPlayerLayer` playback on `visionOS` applications is not supported at this time.
* Views from playback on `visionOS` will always indicate Used Fullscreen to be `false`.

#### v3.5.1

Fixes:

* Add referential safety checks when dispatching session data

#### v3.5.0

API Changes:

* Expose reporting an error context parameter alongside customer errors

Known Issues:

* Including a `SESSION-DATA` tag in the manifest of a monitored HLS stream may cause a crash in v3.5.0 or earlier of `MUXSDKStats`. To resolve the issue, limit `SESSION-DATA` tags to applications that use `MUXSDKStats` v3.5.1 or higher.

#### v3.4.2

Fixes:

* Pin MuxCore to specific version consistent across Cocoapods and Swift Package Manager

#### v3.4.1

Fixes:

* Add state check when dispatching viewend event

Improvements:

* Update MuxCore dependency

#### v3.4.0

API Changes:

* Monitor AVPlayer with a fixed player size
* Set custom player software name and version values when initializing a new binding

Improvements:

* Documentation revisions
* Audio-only monitoring example

#### v3.3.3

Fixes:

* Set the player width and height dimensions to the entire area of the screen where the player is present. Before this change, player width and height were set to the width and height of the video drawn on screen. Letterboxed or pillarboxed areas of the player were previously excluded as a result.

Player width and height dimensions are now equal to the `AVPlayerLayer` bounds or `AVPlayerViewController` view bounds, depending on which is used. Previously, `AVPlayerViewController` `videoBounds` or `AVPlayerLayer` `videoRect` were used to set the player width and height.

Upscale Percentage or Downscale Percentage calculations are not affected if the player draws the video with the same aspect ratio as the video resolution.

#### v3.3.2

Fixes:

* Crash when removing a time observer from the wrong AVPlayer instance during monitoring teardown

Improvements:

* Add Swift Package Manager example application

#### v3.3.1

Improvements:

* Update MuxCore with backfilled header nullability annotations to remove build warnings

#### v3.3.0

Updates:

* Add `drmType` to `MUXSDKCustomerViewData` so you can track this field if you wish

Improvements:

* System reliability updates during large events

#### v3.2.1

Fixes:

* Fix ad metadata not being reported

#### v3.2.0

Fixes:

* Fix wrong viewer time when finishing seek after an ad break

#### v3.1.0

Updates:

* Add Frame Drop Metrics (#172)
* Add 5 more Custom Dimensions (6 through 10) to `MUXSDKCustomData`

#### v3.0.0

Updates:

* Add fields to `CustomerViewerData` allowing them to override detected device metadata values
* Add Request ID metadata property to BandwidthMetricData
* Add Customer overrides for Device Metadata

Breaking:

* Due to Xcode 14, support for iOS and tvOS versions 9 and 10 has been removed. This may result in a warning for client applications with deployment versions below iOS/tvOS 11. For more information, [see the last 'Deprecations' block in the release notes](https://developer.apple.com/documentation/Xcode-Release-Notes/xcode-14-release-notes).

Improvements:

* Update to MuxCore 4.0.0, Xcode 14
* Improve HLS/DASH segment request metrics (#165)

#### v2.13.2

* Fix an issue with certain error conditions not being properly recognized on the data dashboard

#### v2.13.1

* Relax MuxCore pod dependency version; can now update to any 3.x version, 3.12 or higher
* Start a new View if a View receives events after 1 hour of inactivity

#### v2.12.1

* Fix: Crash in `AVMetadataItem` inspection when dispatching session data
* Fix: State check for `isPaused`

#### v2.12.0

* Fix: Register seek events when state is buffering
* Capture HLS session data and send event to core

#### v2.11.0

* Set Xcode build setting `APPLICATION_EXTENSION_API_ONLY = YES`
* Fix: Update rendition change logic to fire events after playback has started

#### v2.10.0

* Fix: Missing programmatic seek events for iOS 15.0
* Add picture in picture ads compatibility with mux-stats-google-ima 0.7.0

#### v2.9.0

* Fix: Missing programmatic seek latency metric and sequencing bugs
* Fix: Clear customer metadata stored under `playerName` when `destroyPlayer` is called
* Add Carthage binary project specification
* Add internal device detection properties

#### v2.8.0

* Fixes a bug that caused missing seek events when seeking programmatically

#### v2.7.0

* Add `player_live_edge_program_time`
* Add `player_program_time`

#### v2.6.0

* Allow overriding of viewer information (application name)
* Tests for AVQueuePlayer
* Custom beacon collection domains
* Adds `programChangeForPlayer:withCustomerData:`

#### v2.5.0

* Consolidates `MUXSDKCustomerViewData`, `MUXSDKCustomerVideoData`, and `MUXSDKCustomerPlayerData` into `MUXSDKCustomerData` and deprecates methods that treat these as separate arguments
* Adds support for custom dimensions

#### v2.4.2

* Replaces `identifierForVendor` with alternative UUID
* Fixes race condition when checking viewer connection type

#### v2.4.1

* Fixes a bug when disabling automatic video change that could sometimes result in views not being split apart and/or having a high seek latency.

#### v2.4.0

* Automatically build statically linked frameworks
* Removes use of categories
* Updates documentation

#### v2.3.2

* Adds a new method to disable built in `videochange` calls when using `AVQueuePlayer`. This method can be called as:

```objc
[MUXSDKStats setAutomaticVideoChange:PLAYER_NAME enabled:false];
```

#### v2.2.2

* Fixes a "code signing is missing" error for Mac Catalyst
* Fixes a crash from a KVO observer being removed incorrectly
* Fixes bugs in seeking tracking for tvOS

#### v2.2.1

* Fixes a bug where AirPlay rebuffering was incorrectly reported as paused

#### v2.2.0

* Add Swift PM support

#### v2.1.0

* Submits `viewer_device_model` field
* Updates our implementation of the Google IMA SDK in demo apps to work with the latest version
* Automated UI test for ads

#### v2.0.0

This release moves the build process to use [XCFramework bundle type](https://developer.apple.com/videos/play/wwdc2019/416/). For iOS, there are no changes required to your application code.

If you are using this SDK with tvOS, the name of the module has changed (the `Tv` suffix is no longer needed):

tvOS before 2.0:

```objc
@import MuxCoreTv;
@import MUXSDKStatsTv;
```

tvOS after 2.0:

```objc
@import MuxCore;
@import MUXSDKStats;
```

#### v1.7.0

* Adds support for `view_session_id`.
* Adds support for `player_remote_played` - this will be true when a video is shown over AirPlay.

#### v1.6.0

* Add `viewer_connection_type` for iOS (either `wifi` or `cellular`). Detecting `viewer_connection_type` is done off the main thread to make sure this doesn't interfere with the performance of your application. Note that `viewer_connection_type` is omitted on tvOS because in versions before tvOS 12 there is no reliable way to detect `wifi` vs. `ethernet`.

#### v1.4.1

* (bugfix) `monitorAVPlayerLayer` with optional argument `automaticErrorTracking` was misnamed to `withAutomaticErrorTracking`. This has been changed to the correct name which is consistent with the corresponding `monitorAVPlayerViewController` method (thanks @hlung in [#58](https://github.com/muxinc/mux-stats-sdk-avplayer/pull/58))
* (bugfix) nullability warnings for MUXSDKStats (thanks @hlung in [#58](https://github.com/muxinc/mux-stats-sdk-avplayer/pull/58))

#### v1.4.0

* add option to disable automatic error tracking when calling either `monitorAVPlayerViewController` or `monitorAVPlayerLayer`
* add `MUXSDKStats.dispatchError` method to manually dispatch an error

You probably will not need to use these features, but if your player is throwing noisy non-fatal errors or you want to catch the player errors yourself and take precise control over the error code and error message then you now have that ability.

Dispatching an error should only be used for fatal errors. When the player goes into the error state then it is no longer being tracked and the view will show up as having encountered an error in the Mux dashboard.

#### v1.3.8

* Performance updates that optimize main thread usage.

#### v1.3.7

* Bug fix: Update our framework build process to be compatible with `carthage 0.35.0`. See the [GitHub issue](https://github.com/muxinc/mux-stats-sdk-avplayer/issues/54) for more details. The gist of it is that Carthage no longer ignores dSYM files, so those need to be packaged up correctly with the framework.

#### v1.3.6

* Bug fix: Rebuild frameworks without importing UIKit (we don't use it). This came to our attention when it was reported that our SDK was triggering this warning from Apple “The App Store will no longer accept new apps using UIWebView as of April 2020 and app updates using UIWebView as of December 2020.”

#### v1.3.5

* Bug fix for usage with `AVQueuePlayer`. Unlike other methods of changing the `playerItem` on an `AVPlayer` instance, when `AVQueuePlayer` progresses from one item to the next the `rate` observer does not fire so we have to handle it in a special case. See instructions above for usage with `AVQueuePlayer`.

#### v1.3.4

* Update scaling logic to report upscaling based on logical resolution, not physical resolution. This will result in lower upscaling percentages, but correlates more closely with perceived visual quality

#### v1.3.3

* Fix a bug to make sure all needed header files are included in the `tvOS` framework

#### v1.3.2

* Fix a bug in request metrics tracking, request metric event timestamps should always be sent in Unix millisecond timestamps, not seconds.

#### v1.3.1

* Fix an issue where multiple AVPlayer instances that are tracked simultaneously report the same throughput metrics.

#### v1.3.0

* Add support for `orientationchange` events. This can be dispatched with `MUXSDKStats orientationChangeForPlayer: withOrientation:`
* Add support for automatically tracking `renditionchange` events. You can see this new event in the events list for a view.
* Improve implementation for bandwidth metrics calculation. Instead of polling for changes on the access log, use `AVPlayerItemNewAccessLogEntryNotification`
* Fix bug in `programChange` so that it works consistently now
* Dispatch `viewend` when `destroyPlayer` is called. Previously this was not dispatched, which didn't affect metrics but resulted in a `viewdropped` event in the events list.

#### v1.2.1

* Fix bug that prevents request metrics tracking from working. AVPlayer gives us `requestStart` and `requestResponseEnd`, so with those data points we can track throughput. This bug fix requires Mux-Stats-Core v2.1.3 or greater. Run `pod update Mux-Stats-AVPlayer`  and `pod update Mux-Stats-Core` to get the latest versions.

#### v1.2.0

* Fix bug in Mux-Stats-AVPlayer that prevents `videoChangeForPlayer` from working
* Fix bug in AVPlayer SDK where it misses initial play event at times if SDK is initialized too late. This could cause some iOS views to not be displayed in the monitoring dashboard, and to potentially have incomplete metrics such as Video Startup Time.
* Add ability to optionally pass in new player data when calling `videoChangeForPlayer`: `videoChangeForPlayer:withPlayerData:withVideoData`

#### v1.1.3

* Fix a bug to prevent an edge-case scenario where crashes can happen after calling `destroyPlayer` when observers have not yet been set up on the player instance.

#### v1.1.2

* bump dependency version of Mux-Stats-Core to 2.1

#### v1.1.1

* bugfix - report the correct Mux Plugin Version. This SDK was erroneously reporting the incorrect 'Mux Plugin Version' attribute for views

#### v1.1.0

* Added new static method to `MUXSDKStats` `updateCustomerDataForPlayer:withPlayerData:withVideoData`. This allows a developer to update customerPlayerData and/or customerVideoData after the SDK has been initialized. Not all metadata can be changed if it was previously set, but all metadata that was not set initially can be updated to the intended values.

#### v1.0.2

* Fix a bug that caused slowness when loading AVPlayer due to checking currentItem.asset.duration before the duration was loaded

#### v1.0.1

* Fix a bug with incorrect source video duration

#### v1.0.0

* Extract GoogleAds-IMA-iOS-SDK into a separate library (Mux-Stats-Google-IMA). The reason for this change was to remove the hard dependency on GoogleAds-IMA-iOS-SDK.
* To implement ad event tracking, follow the instructions for using this library (Mux-Stats-AVPlayer) in conjunction with Mux-Stats-Google-IMA and GoogleAds-IMA-iOS-SDK.

#### 0.1.5

* add support for tracking ad playback with GoogleAds-IMA-iOS-SDK

#### 0.1.1

* add support for AVPlayer monitoring


# Monitor AndroidX Media3
This guide walks through integration with Google's Media3 to collect video performance metrics with Mux Data.
The Mux Data SDK for Media3 integrates Mux Data with Google's [AndroidX Media3](https://developer.android.com/guide/topics/media/media3) SDK. Our SDK consists of a set of [open-source libraries](https://github.com/muxinc/mux-stats-sdk-media3) that observe Media3 for events and data related to your customers' playback experience.

This guide will walk you through a basic integration of Mux Data with your Media3 app. You will add the Mux Data SDK to your project, integrate the SDK with your Media3 `Player`, and, if necessary, learn to customize our SDK's functionality based on your specific needs.

## Features

The Mux Data SDK for Media3 can collect the following data:

* Engagement metrics
* Quality of Experience metrics
* Available for deployment from a package manager
* Can infer CDN identification from response headers
* Custom Dimensions
* Average Bitrate metrics and `renditionchange` events
* Request metrics
* Customizable Error Tracking
* Custom Beacon Domain
* Extraction of HLS Session Data

Note: Request Latency is not available.

<Callout type="warning" title="Players other than `ExoPlayer`">
  Most of this guide assumes you are using `ExoPlayer`, specifically, as opposed to a `MediaController` or a custom implementation of `Player`. Our SDK does offer support for players other than `ExoPlayer`, but this support is limited by the interface of `Player` and `Player.Listener`. You may supplement the data we are able to collect using your `Player`'s specific APIs by overriding `BaseMedia3Binding` and supplying that object when you create your `MuxStatsSdkMedia3<>` for your custom player.
</Callout>

## 1. Install the Mux Data SDK

## Add our repository to your Gradle project

Add Mux's Maven repository to your Gradle files. Newer projects declare repositories in `settings.gradle`; older projects declare them in the project-level `build.gradle`.

```gradle_groovy

// in a repositories {} block
maven {
  url 'https://muxinc.jfrog.io/artifactory/default-maven-release-local' 
}

```

```gradle_kts

// in a repositories {} block
maven {
  url = uri("https://muxinc.jfrog.io/artifactory/default-maven-release-local")
}

```



## Add a dependency for Mux Data

Add our library to the `dependencies` block for your app. Replace the string `[Current Version]` with the current version of the SDK from the [releases page](https://github.com/muxinc/mux-stats-sdk-media3/releases).

```gradle_kts

implementation("com.mux.stats.sdk.muxstats:data-media3:[Current Version]")
  
```

```gradle_groovy

implementation "com.mux.stats.sdk.muxstats:data-media3:[Current Version]"
  
```



## Stay on a version of Media3

By default, we try to support the latest minor release of media3 with our SDK. That is, 1.0, 1.1, etc. When media3 updates, we update our `data-media3` library to support the newest version. If you need an update to the Mux Data SDK, but can't update your media3 integration, you can use one of our `-atX_Y` variants. These variants of our Mux Data SDK receive all the same updates as the default version, but offer support for a specific version of media3.

To stay on a specific version of media3, add the appropriate version to the end of our `artifactId`. For example, to always use Media3 1.0.x, use the library at `com.mux.stats.sdk.muxstats:data-media3-at_1_0:[Current Version]`.

```gradle_kts

// Stay on media3 1.0 while getting the most-recent mux data
implementation("com.mux.stats.sdk.muxstats:data-media3-at_1_0:[Current Version]")
  
```

```gradle_groovy

// Stay on media3 1.0 while getting the most-recent mux data
implementation "com.mux.stats.sdk.muxstats:data-media3-at_1_0:[Current Version]"
  
```



### Officially Supported Media3 Versions

We try to support all production versions of media3. Currently, we support the following versions:

* 1.9.x
* 1.8.x
* 1.6.x
* 1.5.x
* 1.4.x
* 1.3.x
* 1.2.x
* 1.1.x
* 1.0.x

## 2. Integrate this SDK with Media 3 in your app

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

To monitor a `Player`, call `monitorWithMuxData()`. You must initialize your Mux Data integration with a valid Environment Key.

If your player is newly created, it's best to do this at around the same time that you call `prepare()` or `play()`. Ideally, you should do it synchronously, either right before those calls or right after you start preparing/playing.

```java

CustomerData myCustomerData = new CustomerData();
MuxStatsSdkMedia3<ExoPlayer> muxStats =
    new MuxStatsSdkMedia3<>(
        /* context = */ context,
        /* envKey = */ "YOUR MUX DATA ENV KEY HERE",
        /* customerData = */ myCustomerData,
        /* player = */ exoPlayer,
        /* playerView = */ playerView,
        /* playerBinding = */ new ExoPlayerBinding()
    );

// Do these after creating the monitor
exoPlayer.setPlayWhenReady(true);
exoPlayer.prepare();

// ... When you are done with your Player
muxStats.release();
exoPlayer.release();

```

```kotlin

val myCustomerData = CustomerData()
val muxStats = exoPlayer.monitorWithMuxData(
  context = context,
  envKey = "YOUR MUX DATA ENV KEY HERE",
  customerData = myCustomerData,
  playerView = playerView
)

// Do these after creating the monitor
exoPlayer.playWhenReady = true
exoPlayer.prepare()

// ... When you are done with your Player
muxStats.release()
exoPlayer.release()

```



### Reusing `Player` instances

If your `Player`'s state was `IDLE` before calling `monitorWithMuxData()` or `enable()`, you should follow the same advice as if you were working with a new `Player` instance: start monitoring either immediately before or right after calling `prepare()` and `play()`. When you start monitoring again, a new View will be created in Mux Data.

The easiest way to get your player into the `IDLE` state is to call `stop()`, though you'll have to call `prepare()` and `play()` again to start playing.

If you want to start monitoring a `Player` instance that was already created and prepared, start monitoring via `monitorWithMuxData()` or `enable()` immediately after you start the player again or set a new `MediaItem`. When you attach a monitor to the `Player` in this case, a new View will be created in Mux Data. The order is important here; you must monitor the player after setting a new `MediaItem` in order to properly count any buffering the player may do.
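As a rough sketch (assuming the `monitorWithMuxData()` extension and `CustomerData` setup from Step 2; `nextVideoUrl` is a placeholder for your own media URL), re-monitoring an existing player could look like:

```kotlin
// Sketch only: re-monitoring an existing player for a new View.
// Assumes the monitorWithMuxData() extension and CustomerData from Step 2;
// nextVideoUrl is a placeholder for your own media URL.
muxStats.release() // end monitoring for the previous View

exoPlayer.setMediaItem(MediaItem.fromUri(nextVideoUrl))

// Attach the new monitor *after* setting the MediaItem so buffering is counted
muxStats = exoPlayer.monitorWithMuxData(
  context = context,
  envKey = "YOUR MUX DATA ENV KEY HERE",
  customerData = CustomerData(),
  playerView = playerView
)

exoPlayer.prepare()
exoPlayer.play()
```

If the player was stopped (in the `IDLE` state), attaching the monitor right before `prepare()` and `play()` works equally well, per the guidance above.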

## 3. Add Metadata

You can make your data more informative and actionable by supplementing it with data of your own. To supply this data, you can use the `CustomerData` object you created in Step 2.

```java

CustomerData myCustomerData = new CustomerData();
CustomerVideoData customerVideoData = new CustomerVideoData();
customerVideoData.setVideoTitle("Sintel");
CustomerViewerData customerViewerData = new CustomerViewerData();
customerViewerData.setMuxViewerDeviceCategory("kiosk");
customerViewerData.setMuxViewerDeviceManufacturer("Example Display Systems");
customerViewerData.setMuxViewerOsVersion("1.2.3-dev");
CustomData customData = new CustomData();
// You can add up to 10 strings to track your own data
customData.setCustomData1("Hello");
customData.setCustomData2("World");
customData.setCustomData3("From");
customData.setCustomData4("Mux");
customData.setCustomData5(":)");
myCustomerData.setCustomerVideoData(customerVideoData);
myCustomerData.setCustomerViewerData(customerViewerData);
myCustomerData.setCustomData(customData);

// And now create your monitor object, like in step 2

```

```kotlin

val customerData = CustomerData().apply {
  customerVideoData = CustomerVideoData().apply {
    // Data about this video
    // Add or change properties here to customize video metadata such as title,
    //   language, etc
    videoId = "My Custom Video ID"
  }
  customerViewData = CustomerViewData().apply {
    // Data about this viewing session
    viewSessionId = UUID.randomUUID().toString()
  }
  customerViewerData = CustomerViewerData().apply {
    // Data about the Viewer and the device they are using
    muxViewerDeviceCategory = "kiosk"
    muxViewerDeviceManufacturer = "Example Display Systems"
    muxViewerOsVersion = "1.2.3-dev"
  }
  customData = CustomData().apply {
    // Add values for your Custom Dimensions.
    // Up to 10 strings can be set to track your own data
    customData1 = "Hello"
    customData2 = "World"
    customData3 = "From"
    customData4 = "Mux"
    customData5 = "Data"
  }
}

// And now call monitorWithMuxData, like in Step 2.

```



Those examples contain only a few of the fields available. For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

All metadata details are optional; however, you'll be able to compare and see more interesting results as you include more details. More metadata gives you more metrics and dimensions for your video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

Certain metadata can be collected automatically, such as the media title, source URL, and poster art.

## 4. Advanced Features

## Changing the video

There are two cases where the underlying tracking of the video view needs to be reset: first, when you load a new source URL into an existing player, and second, when the program within a single media stream changes (such as a program within a live stream, described more below).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

## New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `muxStats.videoChange(CustomerVideoData)`, which will remove all previous video data and reset all metrics for the video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video, but you should only need to update the values that start with `video`.

It's best to change the video info immediately after telling the player which new source to play.
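As a sketch (assuming the `muxStats` monitor from Step 2; `nextVideoUrl` and the title/ID values are placeholders), advancing to the next item in a playlist might look like:

```kotlin
// Sketch only: reset Mux Data's video info when the same player
// switches to a new source. nextVideoUrl is a placeholder.
exoPlayer.setMediaItem(MediaItem.fromUri(nextVideoUrl))

// Update Mux right after pointing the player at the new source
muxStats.videoChange(
  CustomerVideoData().apply {
    videoTitle = "Next Video Title" // placeholder values
    videoId = "next-video-id"
  }
)

exoPlayer.prepare()
exoPlayer.play()
```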

## New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `muxStats.programChange(CustomerVideoData)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video, but you should only need to update the values that start with `video`.

## Manually set when a video is being played full-screen

For most use cases, the SDK is capable of detecting whether or not a video is being played full-screen. Specifically, it can do so in the case where the player view is the same size as the device display (excepting ActionBars and other framework window decoration).

For other use cases (non-overlaid controls, window decoration via plain `View`s, etc.), you may need to tell the SDK when the user switches to full-screen.

If you are using `PlayerView` or a similar Media3 UI component, you can set the full-screen flag from the fullscreen button click listener.

```kotlin
  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

    // If you are using SimplePlayerView, StyledPlayerView, etc
    playerView = findViewById(R.id.my_player_view)

    playerView.setFullscreenButtonClickListener { isFullScreen ->
      // Set presentation based on which mode is requested
      if(isFullScreen) {
        muxStats.presentationChange(MuxSDKViewPresentation.FULLSCREEN)
      } else {
        muxStats.presentationChange(MuxSDKViewPresentation.NORMAL)
      }
      // Handle moving to fullscreen playback with your code
    }
  }
```

## Error tracking

By default, Mux's integration with ExoPlayer automatically tracks fatal errors as thrown by ExoPlayer. If a fatal error happens outside the context of ExoPlayer and you want to track it with Mux, you can call `muxStats.error(MuxErrorException)` like this:

```kotlin
// Error code: integer value for the generic type of error that
// occurred.
// Error message: String providing more information on the error
// that occurred.
// For an example, the HTML5 video element uses the
// following: https://developer.mozilla.org/en-US/docs/Web/API/MediaError
// for codes and messages. Feel free to use your own codes and messages
val errorCode = 1
val errorMessage = "A fatal error was encountered during playback"
val errorContext = "Additional information about the error such as a stack trace"
val error = MuxErrorException(errorCode, errorMessage, errorContext)
muxStats.error(error)
```

Note that `error(MuxErrorException e)` can be used with or without automatic error tracking. If your application has retry logic that attempts to recover from ExoPlayer errors, then you may want to disable automatic error tracking like this:

```kotlin
muxStats.setAutomaticErrorTracking(false)
```

<Callout type="warning">
  It is important that you only trigger an error when the playback has to be abandoned or aborted in an unexpected manner, as Mux tracks fatal playback errors only.
</Callout>

## Usage with Google Interactive Media Ads (IMA)

The Mux Data SDK for Media3 can observe events that occur during Ad playback. To enable this functionality, you need to attach an instance of `MuxStatsSdkMedia3<ExoPlayer>` to your `ImaAdsLoader`.

<Callout type="warning">
  The Mux Data SDK must take over the `AdErrorListener` and `AdEventListener` of your loader, but you can supply your own listeners, as shown in the example.
</Callout>

First, add Mux's Media3 IMA Extension to your build:

```gradle_kts

// in your app's dependencies
implementation("com.mux.stats.sdk.muxstats:data-media3-ima:0.7.1")
  
```

```gradle_groovy

// in your app's dependencies
implementation "com.mux.stats.sdk.muxstats:data-media3-ima:0.7.1"
  
```



Then, use the extension to monitor your IMA integration.

```kotlin

val newPlayer = ExoPlayer.Builder(this)
  .setMediaSourceFactory(
    DefaultMediaSourceFactory(DefaultDataSource.Factory(this))
      .setLocalAdInsertionComponents({ adsLoader }, view.playerView)
  )
  // ... rest of builder calls
  .build()
val customerData = CustomerData()
// optionally, set properties on CustomerData

muxStats = newPlayer.monitorWithMuxData(context, DATA_ENV_KEY, customerData)
adsLoader = ImaAdsLoader.Builder(this)
  // ... rest of builder calls
  .monitorWith(
    muxStats = muxStats!!,
    customerAdErrorListener = { /* Optional parameter, your custom logic */ },
    customerAdEventListener = { /* Optional parameter, your custom logic */ },
  )
  .build()
adsLoader.setPlayer(newPlayer)
  
```

```java

ExoPlayer player = new ExoPlayer.Builder(this)
  // ... Add IMA components
  .build();

MuxStatsSdkMedia3<ExoPlayer> muxStats =
  new MuxStatsSdkMedia3<>(
      /* context = */ this,
      /* envKey = */ "YOUR MUX DATA ENV KEY HERE",
      /* customerData = */ myCustomerData, // Populated as in Step 2 of the guide
      /* player = */ player,
      /* playerView = */ playerView,
      /* playerBinding = */ new ExoPlayerBinding()
  );

MuxImaAdsListener muxAdsListener = MuxImaAdsListener.newListener(
  muxStats,
  adEvent -> {}, // If you have handling logic for AdEvents, put it here
  adError -> {} // If you have handling logic for Ad Errors, put it here
);
adsLoader = new ImaAdsLoader.Builder(this)
  .setAdErrorListener(muxAdsListener)
  .setAdEventListener(muxAdsListener)
  // Set up rest of AdsLoader
  .build();
adsLoader.setPlayer(player);
  
```



## Manually set the screen orientation

The Mux SDK supports sending an event when the playback orientation changes. You can trigger this by calling `muxStats.orientationChange(MuxSDKViewOrientation orientation)`, passing either `MuxSDKViewOrientation.LANDSCAPE` or `MuxSDKViewOrientation.PORTRAIT`, depending on the current orientation of the player.
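For example (a sketch, assuming you report orientation from your `Activity`'s configuration callback and have the `muxStats` monitor from Step 2):

```kotlin
// Sketch only: dispatch orientationchange from an Activity's
// configuration callback, using the muxStats monitor from Step 2.
override fun onConfigurationChanged(newConfig: Configuration) {
  super.onConfigurationChanged(newConfig)
  when (newConfig.orientation) {
    Configuration.ORIENTATION_LANDSCAPE ->
      muxStats.orientationChange(MuxSDKViewOrientation.LANDSCAPE)
    Configuration.ORIENTATION_PORTRAIT ->
      muxStats.orientationChange(MuxSDKViewOrientation.PORTRAIT)
  }
}
```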

## Migrating from the Mux Data SDK for ExoPlayer

If you are updating from our ExoPlayer SDK, a short migration is required. The steps below should get you building again:

1. Change your Mux Data SDK dependency to `implementation "com.mux.stats.sdk.muxstats:data-media3:1.0.0"`
2. Change all mentions of `MuxStatsExoPlayer` to `MuxStatsSdkMedia3<ExoPlayer>`
3. **If you are using java**, add `new ExoPlayerBinding()` to the end of the parameters you set when creating your `muxStats`.
4. **If you are using the IMA Ads SDK**: You will need to rewrite your integration as explained in Step 4 of this guide.

<LinkedHeader step={steps[6]} />

### Current release

#### v1.11.1

New:

* Track which sections of content a user has watched
* Track mid-view changes to network type and send `networkchange` events

Improvements:

* Send `"no_connection"` as a network connection type if connectivity is momentarily lost during a stream

Internal Lib Updates:

* Update `muxstats:android` to 1.7.2
* Update `stats.muxcore` to 8.9.0

### Previous releases

#### v1.11.0

New:

* Detect changes to network connectivity during views and dispatch `networkchange` events

Internal Lib Updates:

* Update `stats.muxcore` to v8.8.0
* Update `muxstats:android` to v1.7.0

#### v1.10.1

Updates:

* Add support for media3 v1.9.x

Fixes:

* fix: manifestNewestTime reported as the earliest PDT, not the latest

#### v1.10.0

Updates:

* Add (incubating) `playbackModeChange` API methods to `MuxStatsSdkMedia3`.
* Add cumulative ad playing time and total content time metric tracking. The metrics track the "wall-clock" time spent with video playing during a view, excluding buffering, seeking, and startup time.
* library-ima: Detect the type of ad being played (preroll, midroll, or postroll)

Internal lib updates:

* Update `stats.muxcore` to 8.6.0
* Update `stats.android` to 1.5.0

#### v1.9.0

Improvements:

* Update Kotlin version from 1.9 to 2.2.10. This should be a backward-compatible change, but please reach out if you see issues

Internal Lib Updates:

* Update `stats.muxcore` to v8.5.2

#### v1.8.1

Updates:

* Add support for media3 v1.8

Fixes:

* Un-deprecate `CustomerVideoData.videoCdn`

Internal lib updates:

* Update `stats.muxcore` to v8.5.1

#### v1.8.0

Updates:

* Add automatic CDN-change detection, assuming your CDN is sending `x-cdn` response headers
* Improved Request Metrics tracking

Internal lib updates:

* Update `stats.muxcore` to v8.5.0

#### v1.7.4

Improvements:

* fix: AbstractMethodError in some apps

#### v1.7.3

Improvements:

* fix: viewerClientApplicationName and viewerClientApplicationVersion not reported

Internal lib updates:

* Update sdk.android to v1.4.10
* Update muxstats.java to v8.4.1
* This update also makes `sdk:android` and `muxstats.java` peers of each other. Previously, `sdk.android` depended on `muxstats.java`. This should be an internal-only change, but it's noted here in case you are tracking transitive dependencies in your build workflows

#### v1.7.2

Improvements:

* Add support for media3 v1.6.0

Fixes:

* fix: when showing multiple Players simultaneously, each should be counted as a separate view

Internal Lib Updates:

* Update `muxstats:android` to v1.4.9

#### v1.7.1

Updates

* Add `CustomerVideoData::videoCreatorId`

Internal lib updates:

* update `muxstats.java` to v8.4.0
* Update `sdk:android` to v1.4.8

#### v1.7.0

Updates

* Add more Standard Dimensions

Internal lib updates:

* Update `stats.muxcore` to v8.3.0
* Update `muxstats.android` to v1.4.7

#### v1.6.3

Improvements:

* Adds 10 more custom dimension slots for media customers

Internal lib updates:

* Update `stats.android` to v1.4.6 and `stats.muxcore` to v8.2.0

#### v1.6.2

Improvements:

* update: add support for media3 1.5.x
* fix: content `renditionchange`s during ad breaks must be deferred until after the ad break

Internal lib updates:

* update `android` to v1.4.5
* update `muxstats.core` to v8.4.1

#### v1.6.1

Improvements:

* fix: suppress some ad events when outside of an ad break
* fix: dropped frames not tracked

Internal Library Updates:

* Update `muxstats-android` to v1.4.4

#### v1.6.0

Updates:

* Better tracking of ad events. If you are using a `VideoPlayerAdCallback` supply it to `ImaAdsLoader.monitorWith`

Fixes:

* fix rebuffering not ended when seeking starts
* fix verbose logging causing bad views in some cases

Internal lib updates:

* Update `stats.java` to 8.1.2
* Update `muxstats.android` to 1.4.2

#### v1.5.2

Fixes:

* fix: media3 version reported as, eg, `1.2.x` instead of the real version

Improvements:

* Add support for media3 v1.4
* Handle nonfatal codec exceptions on API 21+

Internal lib updates:

* Update `android` lib 1.4.0
* Update `stats.java` lib to 8.1.0
* Remove `kt-utils` from the dependencies. It is no longer required

#### v1.5.1

Improvements:

* fix: incorrect startup time after enable, disable, and videoChange

#### v1.5.0

Improvements:

* Update Android Core to 1.3.0
* misc. local build updates

#### v1.4.0

New:

* Expose parameter to control logging level when initializing monitoring

#### v1.3.2

* fix: incorrect screen resolution reported in some cases
* Update to Mux Android Core 1.2.2
* Update to Mux Java Core 8.0.2

#### v1.3.1

Updates:

* update: Add support for media3 1.3.0

Fixes:

* fix: reported app hang due to event handling during beacon dispatch
* fix: crash when exoplayer HLS module not used

Improvements:

* Update Android Core to v1.2.1
* Update Java Core to v8.0.1

#### v1.3.0

New:

* `MuxErrorException` now allows you to report non-fatal and business-related errors

Improvements:

* update: Updated MuxCore to version 8.0.0
* update: Updated Android Core to version 1.2.0

Fixes:

* fix: renditionchange sent in situations where rendition was only reset
* fix: Capture IMA CSAI media failures with LOG events
* fix: rebuffering percentage inflated if client ads fail to load

#### v1.2.2

Fixes:

* fix: populate ad data even for non-preroll ads
* fix: seeking time included in time-to-first frame if user seeks before play starts

Improvements:

* remove extraneous androidx deps from the exoplayer lib (they are still required if using IMA)

#### v1.2.1

Updates:

* add support for media3 v1.2.x

#### v1.2.0

Updates:

* add support for media3 v1.2.x

#### v1.1.0

Updates:

* Expose `IDevice` and `INetworkRequest` for injection, as with the other player sdks

#### v1.0.3

Updates:

* update: Update compileSdkVersion and targetSdkVersion to 34.

#### v1.0.2

Fixes:

* fix: SSAI Ad events not properly reported

#### v1.0.1

Fixes:

* fix: setting playWhenReady to true while READY sends play but not playing

Improvements:

* Update to Core 1.0.1 - Fixes handling of leaving ads by seeking out of them

#### v1.0.0

New:

* Update this SDK to v1.0.0 (🎉)

Fixes:

* fix: Custom Domain implementation POSTs to wrong URL
* fix: viewstart may not be sent if monitor attached while idle && playWhenReady == true

#### v0.8.0

Improvements:

* feat: Detect Title, Source URL, and Poster Art


# Monitor ExoPlayer
This guide walks through integration with Google's ExoPlayer to collect video performance metrics with Mux data.
This guide documents integration with [Google's `ExoPlayer` library](https://github.com/google/ExoPlayer), version 2.x. `ExoPlayer` versions before 2.0 are not supported, and as of version 3.0.0 of Mux's `ExoPlayer` integration, only `ExoPlayer` versions 2.10.x and later are supported.

The Mux integration with `ExoPlayer` is built on top of Mux's core Java SDK, and the full code can be seen here: [muxinc/mux-stats-sdk-exoplayer](https://github.com/muxinc/mux-stats-sdk-exoplayer).

## Features

The following data can be collected by the Mux Data SDK when you use the ExoPlayer SDK, as described below.

* Engagement metrics
* Quality of Experience metrics
* Available for deployment from a package manager
* Can infer CDN identification from response headers
* Custom Dimensions
* Average Bitrate metrics and `renditionchange` events
* Request metrics
* Customizable Error Tracking
* Custom Beacon Domain
* Extraction of HLS Session Data
* Live Stream Latency metric

Notes:

Request Latency is not available.

## 1. Add a dependency on the Mux Data SDK

## Add Gradle dependency on the Mux ExoPlayer SDK

Add the Mux Maven repository to your Gradle file:

```gradle
repositories {
    maven {
        url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
    }
}
```

Next, add a dependency on the Mux Data ExoPlayer SDK. Supported versions of ExoPlayer are:

* r2.10.6
* r2.11.1
* r2.12.1
* r2.13.1
* r2.14.1
* r2.15.1
* r2.16.1
* r2.17.1
* r2.18.1
* r2.19.1
* `amznPort` (see below)

There is typically API compatibility within an ExoPlayer major-minor version, so you should be able to pair one of the versions listed above with any player sharing the same major-minor version (e.g., the ExoPlayer r2.12.1 version of the Mux ExoPlayer SDK works with ExoPlayer r2.12.0 and r2.12.2 equally well).

Add a dependency to your Gradle file using the Mux SDK version and an ExoPlayer version listed above in the following format:

```gradle
api 'com.mux.stats.sdk.muxstats:MuxExoPlayer_(ExoPlayer SDK version with underscores):(Mux SDK version)'
```

Example using Mux ExoPlayer SDK 2.7.2 and ExoPlayer version r2.16.1:

```gradle
api 'com.mux.stats.sdk.muxstats:MuxExoPlayer_r2_16_1:2.7.2'
```

## Configure ProGuard/R8

If you're using ProGuard or R8, you'll need to add the following line to your app's ProGuard rules file (e.g., `proguard-rules.pro`). This won't change anything about your app binary; it just suppresses a known warning:

```
-dontwarn com.google.ads.interactivemedia.v3.api.**
```

#### Amazon ExoPlayer Port

In addition to the versions above, the Mux Data ExoPlayer SDK also supports [Amazon's official ExoPlayer port for Amazon Devices](https://github.com/amzn/exoplayer-amazon-port). If you are monitoring ExoPlayer on an Amazon device, you can get that version with the following line:

```gradle
api 'com.mux.stats.sdk.muxstats:MuxExoPlayer_amznPort:(Mux SDK version)'
```

For an example integration, you can see the demo application within [muxinc/mux-stats-sdk-exoplayer](https://github.com/muxinc/mux-stats-sdk-exoplayer) which integrates Mux into the ExoPlayer demo application.

## 2. Initialize the monitor with your ExoPlayer instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different from your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

First, create a `CustomerData` object and populate it with the metadata objects, such as `CustomerVideoData`, appropriate for your current playback:

```kotlin
val customerData = CustomerData().apply {
        customerVideoData = CustomerVideoData().apply {
          // Data about this video
          // Add or change properties here to customize video metadata such as title,
          //   language, etc
          videoTitle = "Mux ExoPlayer Android Example"
          // ExoPlayer doesn't provide an API to obtain this, so it must be set manually
          videoSourceUrl = videoUrl
        }
        customerViewData = CustomerViewData().apply {
          // Data about this viewing session
          viewSessionId = UUID.randomUUID().toString()
        }
        customerViewerData = CustomerViewerData().apply {
          // Data about the Viewer and the device they are using
          muxViewerDeviceCategory = "kiosk"
          muxViewerDeviceManufacturer = "Example Display Systems"
          muxViewerOsVersion = "1.2.3-dev"
        }
        customData = CustomData().apply {
          // Add values for your Custom Dimensions.
          // Up to 5 strings can be set to track your own data
          customData1 = "Hello"
          customData2 = "World"
          customData3 = "From"
          customData4 = "Mux"
          customData5 = "Data"
        }
}
```

Next, create the `MuxStatsExoPlayer` object by calling `monitorWithMuxData` on your `ExoPlayer` instance, passing your `Context` (typically your `Activity`), your `ENV_KEY`, your player view, and the customer data object you just created.

```kotlin
muxStatsExoPlayer = exoPlayer.monitorWithMuxData(
      context = requireContext(),
      envKey = "YOUR_ENV_KEY_HERE",
      playerView = playerView,
      customerData = customerData
    )
```

If you haven't set your `playerView` already, do so now. We recommend this so the SDK can determine a number of viewer-context values and track the size of the video player.

```java
muxStatsExoPlayer.setPlayerView(simpleExoPlayerView.getVideoSurfaceView());
```

Finally, when you are destroying the player, call the `MuxStatsExoPlayer.release()` function.

```java
muxStatsExoPlayer.release();
```

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux Data dashboard. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

#### Note For ExoPlayer v2.15 and below

On older supported versions of ExoPlayer, Mux prefers that you pass an instance of `SimpleExoPlayer` specifically, rather than any `ExoPlayer`. If you pass a plain `ExoPlayer`, some metrics and errors, such as upscaling metrics, may not be available. Updating to ExoPlayer r2.16.0 or higher removes this limitation.

```kotlin
muxStatsExoPlayer = exoPlayer.monitorWithMuxData(
      context = requireContext(),
      envKey = "YOUR_ENV_KEY_HERE",
      playerView = playerView,
      customerData = customerData
    )
```

Or, in Java:

```java
// Make sure to monitor the player before calling `prepare` on the ExoPlayer instance
muxStatsExoPlayer = new MuxStatsExoPlayer(this, "YOUR_ENV_KEY_HERE", player, playerView, customerData);
```

## 3. Add Metadata

Options are provided to this SDK via the objects within the `CustomerData` object.

All metadata details are optional; however, you'll be able to compare and see more interesting results as you include more details. Richer metadata gives you more metrics and context about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

There is one caveat with ExoPlayer: it does not provide an API to retrieve the current source URL from the player. For this reason, `CustomerVideoData` lets you set it manually via `CustomerVideoData.setVideoSourceUrl(String url)`. Setting this value allows you to see the source URL, as well as the Source Hostname dimension, in the dashboard.
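For example, a minimal sketch of setting the source URL manually (here, `mediaUri` is a placeholder for whatever URI you load into the player):

```kotlin
// Build video metadata, setting the source URL by hand since
// ExoPlayer does not expose it from the player instance.
val videoData = CustomerVideoData().apply {
    videoTitle = "My Video"
    videoSourceUrl = mediaUri.toString()
}
```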

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Advanced

## Changing the video

There are two cases where the underlying tracking of the video view needs to be reset: first, when you load a new source URL into an existing player, and second, when the program changes within a single stream (such as a program within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

## New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `muxStatsExoPlayer.videoChange(CustomerVideoData)` which will remove all previous video data and reset all metrics for the video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

It's best to change the video info immediately after telling the player which new source to play.
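The steps above can be sketched as follows (a sketch, assuming `muxStatsExoPlayer` is the monitor created earlier and `nextVideoUrl` and the title are placeholder metadata):

```kotlin
// Tell the player to load the next item...
exoPlayer.setMediaItem(MediaItem.fromUri(nextVideoUrl))
exoPlayer.prepare()

// ...then immediately tell Mux about the new video, which resets
// the metrics and starts a fresh video view
muxStatsExoPlayer.videoChange(
    CustomerVideoData().apply {
        videoTitle = "Next Video in Playlist"
        videoSourceUrl = nextVideoUrl
    }
)
```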

## New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `muxStatsExoPlayer.programChange(CustomerVideoData)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.
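For example, when your live stream's own metadata indicates a new program has started (the title here is a placeholder):

```kotlin
// Same stream, new program: reset the view without reloading the player.
// All prior video metadata is discarded, so re-supply any `video*` values.
muxStatsExoPlayer.programChange(
    CustomerVideoData().apply {
        videoTitle = "Evening News"
    }
)
```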

## Manually set when a video is being played full-screen

For most use cases, the SDK is capable of detecting whether or not a video is being played full-screen. Specifically, it can do so in the case where the player view is the same size as the device display (excepting ActionBars and other framework window decoration).

For other use cases (non-overlaid controls, window decoration via plain `View`s, etc.), you may need to tell the SDK when the user switches to full-screen.

If you are using `PlayerView`, `StyledPlayerView`, or a similar ExoPlayer UI component, you can set the full-screen flag from the fullscreen button click listener.

```kotlin
  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)

    // If you are using PlayerView, StyledPlayerView, etc
    playerView = findViewById(R.id.my_player_view)

    playerView.setFullscreenButtonClickListener { isFullScreen ->
      // Set presentation based on which mode is requested
      if(isFullScreen) {
        muxStats.presentationChange(MuxSDKViewPresentation.FULLSCREEN)
      } else {
        muxStats.presentationChange(MuxSDKViewPresentation.NORMAL)
      }
      // Handle moving to fullscreen playback with your code
    }
  }
```

## Error tracking

By default, Mux's integration with ExoPlayer automatically tracks fatal errors as thrown by ExoPlayer. If a fatal error happens outside the context of ExoPlayer and you want to track it with Mux, you can call `muxStatsExoPlayer.error` like this:

```kotlin
// errorCode: an integer value for the general type of error that occurred
// errorMessage: a String providing more information about the error
// errorContext: additional details, such as a stack trace
// For example, the HTML5 video element uses the codes and messages at
// https://developer.mozilla.org/en-US/docs/Web/API/MediaError
// Feel free to use your own codes and messages
val errorCode = 1
val errorMessage = "A fatal error was encountered during playback"
val errorContext = "Additional information about the error such as a stack trace"
val error = MuxErrorException(errorCode, errorMessage, errorContext)
muxStatsExoPlayer.error(error)
```

Note that `muxStatsExoPlayer.error(MuxErrorException e)` can be used with or without automatic error tracking. If your application has retry logic that attempts to recover from ExoPlayer errors, you may want to disable automatic error tracking like this:

```kotlin
muxStatsExoPlayer.setAutomaticErrorTracking(false)
```

<Callout type="warning">
  It is important that you only trigger an error when the playback has to be abandoned or aborted in an unexpected manner, as Mux tracks fatal playback errors only.
</Callout>

## Usage with Google Interactive Media Ads (IMA)

If you are using Google's IMA SDK to play back ads within your Android application, you can configure Mux to monitor the ad performance by passing your instance of `AdsLoader` to `muxStatsExoPlayer.monitorImaAdsLoader(adsLoader)`.

### ExoPlayer r2.12.x and Up

```kotlin
// For example, within the r2.12.x demo application
// PlayerActivity.getAdsLoader
adsLoader = ImaAdsLoader.Builder(context = this)
    /*
     * This replaces the `monitorImaAdsLoader` method, because in r2.12.x
     * ImaAdsLoader creates its google.v3.AdsLoader on ad request, which means
     * monitorImaAdsLoader would always receive a null pointer and be unable
     * to receive ad events.
     */
    .setAdErrorListener(muxStats.getAdErrorEventListener())
    .setAdEventListener(muxStats.getAdEventListener())
    .build()
```

### ExoPlayer pre-r2.12.x

```kotlin
// Within setting up the AdsMediaSource
sdkFactory = ImaSdkFactory.getInstance()
adsLoader = sdkFactory.createAdsLoader(this)
muxStatsExoPlayer.monitorImaAdsLoader(adsLoader)
```

## Track orientation changes

As of version `1.3.0` and later, the Mux SDK for ExoPlayer supports firing an event when the playback orientation changes. You can trigger this by calling `muxStatsExoPlayer.orientationChange(MuxSDKViewOrientation orientation)`, passing either `MuxSDKViewOrientation.LANDSCAPE` or `MuxSDKViewOrientation.PORTRAIT` depending on the current orientation of the player.
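For example, a sketch from an `Activity` that handles orientation changes itself (assuming `android:configChanges` is set so the Activity is not recreated on rotation):

```kotlin
override fun onConfigurationChanged(newConfig: Configuration) {
    super.onConfigurationChanged(newConfig)
    // Map the Android orientation constant to the Mux orientation event
    when (newConfig.orientation) {
        Configuration.ORIENTATION_LANDSCAPE ->
            muxStatsExoPlayer.orientationChange(MuxSDKViewOrientation.LANDSCAPE)
        Configuration.ORIENTATION_PORTRAIT ->
            muxStatsExoPlayer.orientationChange(MuxSDKViewOrientation.PORTRAIT)
    }
}
```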

## Java Build Compatibility

## Java and Android Gradle Plugin Build Compatibility

Starting with version `2.6.0`, the Mux SDK for ExoPlayer requires JDK 11 and version 7.0 or greater of the Android Gradle Plugin. This is only a requirement for build compatibility. The Mux SDK for ExoPlayer will remain bytecode-compatible with Java 1.8.

If you are updating from version `2.5.9` or lower, you may need to:

* Update Android Studio to version `2020.x` or greater
* Update your dependency on the Android Build Tools plugin to `7.0.0` or greater
* Update Gradle in `gradle-wrapper.properties` to `7.0.2` or greater
* Ensure your Android Studio is using JDK 11:
  * Go to Android Studio Settings
  * Go to `Build, Execution, and Deployment` -> `Build Tools` -> `Gradle`
  * If the `Gradle JDK` option is not set to a Java 11 JDK, click the dropdown and select a Java 11 JDK. It should be the default on Studio `2020.x`

<LinkedHeader step={steps[6]} />

### Current release

#### v3.5.2

Fixes:

* fix rebuffering not ended when seeking starts
* fix extra-verbose logging causing crashes in some cases

Internal library updates:

* Update MuxCore to v8.1.2

### Previous releases

#### v3.5.1

Fixes:

* allow media3 and exoplayer Data SDKs to coexist in the same app build

#### v3.5.0

New:

* `MuxErrorException` now allows you to report non-fatal and business-related errors

Improvements:

* Updated MuxCore to version 8.0.0
* Updated Android Core to version 1.2.0

Fixes:

* fix: Capture IMA CSAI media failures with LOG events

#### v3.4.7

Fixes:

* fix: ad metadata not collected for mid and postrolls
* fix: time-to-first-frame incorrect if user seeks before play starts

#### v3.4.6

Fixes:

* fix: seeking not properly tracked on ExoPlayer 2.18 and 2.19

#### v3.4.5

Improvements:

* Add support for ExoPlayer 2.19

#### v3.4.4

Fixes:

* fix issue where beaconCollectionDomain wouldn't work correctly

#### v3.4.3

Improvements:

* chore: Cut support for ExoPlayer v2.10.x - v2.13.x
* fix: starting an ad break while rebuffering doesn't end rebuffering

#### v3.4.2

Improvements:

* collect segment response headers beginning with x-litix

#### v3.4.1

Improvements:

* Include ad playback time in total playback time

#### v3.4.0

Updated:

* Added `viewDrmType` to `CustomerViewData` so customers can provide their own value

Improvements:

* Under-the-hood reliability improvements during large events

#### v3.3.4

Improvements:

* Simplify some internal HTTP error handling. This change should not affect the majority of users

#### v3.3.3

Improvements:

* Update to MuxCore v7.8.0, adds `longBeaconDispatch` to `CustomOptions`. This feature should only be used in a small number of use cases, and your setting may be overridden by mux's backend servers

#### v3.3.2

Improvements:

* Update beacon batch interval from 5s to 10s (#277)

#### v3.3.1

Improvements:

* Update to Mux Core 7.7.2, Fixes bug in ad-metadata reporting

#### v3.3.0

New:

* Add ad-related metadata to ad events

Improvements:

* Update to Gradle 7.3 + Wrapper 7.4.0 + Simplify Demo Variant Names

#### v3.2.0

Updates:

* Improve Error Code Variety on ExoPlayer 2.15+
* Add Error Context and DRM Type for views
* Add API Total dropped frames

Improvements:

* Added extra values for Rendition lists.

#### v3.1.1

Fixes:

* Fix ArrayIndexOutOfBounds Exception after clearing the media item

#### v3.1.0

Updates:

* Override metadata about your users' device with `CustomerViewerData`

Fixes:

* Allow overriding Device Category metadata
* ExoPlayer 2.11: Fix `renditionchange` sent on non-video track changes
* Fix beacon dispatcher crashing when verbose logging is enabled

Improvements:

* Update to MuxCore 7.4.0 (improvements/fixes have been noted in the release notes)

#### v3.0.2

Improvements:

* Collect Request IDs for HLS segments for Error Tracking

#### v3.0.1

Fixes:

* Fix the Kotlin extension for MuxStatsExoPlayer to require envKey

#### v3.0.0

API Improvements:

* Automatic Screen Size Detection: You no longer have to manually input your device's screen size to see fullscreen/screen size metrics. Just pass in your `Activity` and `PlayerView` when you make your `MuxStatsExoPlayer`
* Supply your player view via constructor parameter
* Kotlin extension for monitoring ExoPlayer
* `ENV_KEY` is now a required parameter when creating a `MuxStatsExoPlayer`. The existing (non-env-key) constructors are now deprecated

Please refer to [the new usage guide](/docs/guides/monitor-exoplayer#2-initialize-the-monitor-with-your-exoplayer-instance) for more details

APIs Removed:

* Removed deprecated constructors of `MuxExoPlayer`. Use `CustomerData` instead
* Removed `MuxExoPlayer.setStreamType()` as it was no longer used
* Removed several methods, such as `getPlayerData()`, `getCurrentPosition()`, etc that are not meant for public use

The full list of removed methods is long, but the change is unlikely to impact you if you are using the SDK as documented. You can review the complete list of removed APIs [on our Release page on GitHub](https://github.com/muxinc/mux-stats-sdk-exoplayer/releases/tag/v3.0.0)

Commit Changelog:
Breaking:

* Remove Support for ExoPlayer 2.9.6
* Remove deprecated constructors (see above)

Updates:

* API Update: Add Environment Key via Constructor

Improvements:

* Convert to Kotlin, Refactor ExoPlayer interaction for maintainability, remove deprecations
* Remove non-ads demos, as the difference is not significant. This reduces CI time
* Remove Release Variants for test and demo apps. They are not required, and this reduces build/CI time
* Add GitHub Actions for Basic CI and Release Automation

#### v2.10.0

Fixes:

* Fix `setPlayerSize` to treat input as physical pixels, as documented. If you are using `setPlayerSize()`, you may have to update your code

#### v2.9.1

Improvements:

* Support for ExoPlayer `v2.18.1`
* Fix crashes in rare cases where the player is released asynchronously
* Update MuxCore to `v7.3.1`

MuxCore Changes:

* Split views with long periods of inactivity into multiple views

#### v2.9.0

Improvements:

* Add ability to override OS data values (incubating)
* Update to MuxCore `7.3.0`

MuxCore Changes:

* Support for overriding OS data values

#### v2.8.0

Improvements:

* Add support for Custom Data Domains
* Add support for manually tracking if a view was played automatically
* Update to MuxCore `v7.2.0`

Fixes:

* Fix Issue with HLS/DASH CDN tracking

MuxCore Changes:

* Custom Beacon Collection Domains
* Add Autoplay flag on CustomerPlayerData
* Fix serialization strategy for complex objects in beacons

#### v2.7.2

Fixes:

* Fix Build/Crash Issues When Used With Minimal/Custom ExoPlayers

#### v2.7.1

Fixes:

* Fix an issue where our core library wasn't being packaged properly

#### v2.7.0

Improvements:

* Add support for Experiment Tracking via manifest tags (HLS only)
* Add support for Amazon ExoPlayer Port
* Add support for ExoPlayer `v2.17.x`

Fixes:

* HLS/DASH: Fix CDN tracking when playlist and chunks are coming from different CDNs
* Rate-limit `requestcompleted` events to prevent ingestion errors when the `DataSource` enters a retry loop

MuxCore Changes:

* Add support for Experiment Tracking

#### v2.6.1

MuxCore 7.0.10 Fixes:

* Fix event-handling issues that can cause events to be dropped in rare cases

#### v2.6.0

Improvements:

* Add support for ExoPlayer r2.16.1
* Update to AGP 7.0
* Add additional logging for Event dispatching errors
* Add ability to override device name

Fixes

* Fix an issue with screen dimensions while in fullscreen

MuxCore 7.0.7 and 7.0.8 Changes:

* Fix potential packaging errors when used with androidX
* Fix bug related to the manual fullscreen API

#### v2.5.9

Improvements:

* Add support for measuring live stream glass-to-glass latency (#181)

MuxCore 7.0.6 Changes

* Added support for Live Latency

MuxCore 7.0.7 Changes

* Final API for Live latency

#### v2.5.8

Improvements:

* Add API to indicate whether video is shown fullscreen
* MuxCore:
  * Add support for latency metrics
  * Add a `Fullscreen` enum and API
  * Remove Sentry

Fixes:

* Fix for usage of legacy support libraries
* Added `-donotwarn` for ExoPlayer classes
* MuxCore:
  * Fix upscale percentages by clamping player size

#### v2.5.7

Improvements:

* Add support for ExoPlayer r2.15

Fixes:

* Updating to MuxCore 7.0.4 to fix ConcurrentModificationException when calling updateCustomerData.

#### v2.5.6

Fixes:

* Fix reference to packageVersionName in Gradle `deployVariant` task. Includes a change to the Gradle package layout, see example in docs.

#### v2.5.5

Fixes:

* Problem with ExoPlayer default implementation of methods on interfaces.

#### v2.5.4

Fixes:

* Reverts audio test improvements introduced in v2.5.3.

#### v2.5.3

Improvements:

* Upgrade Docker base image used for builds to JDK 8u302
* Audio test improvements

Fixes:

* Retain code obfuscation and mapping files
* Added pause event to be dispatched when player-stop is called

#### v2.5.2

* Updating to MuxCore 7.0.2 with fixes to code obfuscation

#### v2.5.1

* Fix packaging of ExoPlayer SDK AAR with MuxCore

#### v2.5.0

Improvements:

* Releasing process involving Artifactory
* MuxCore pulled from Maven instead of in bundled jar
* Support for overriding the beacon domain
* Javadoc coverage for public API
* For API version 30+ use context.getDisplay instead of WindowManager.

Fixes:

* Removed VideoComponent listener and now capturing firstFrameRendered
* Added conversion from physical `px` to `dpx` on `setScreen` size
* MuxCore:
  * Fix customer data null pointer exception
  * Fixed key name in setMuxEmbed function
  * Handle case where player size is larger than physical screen, treat as full-screen

#### v2.4.15

* Reduced the amount of messages sent each second to main thread.
* Additional logging for bandwidth metrics tests.

#### v2.4.14

* Support ExoPlayer 2.14

#### v2.4.13

* Add CustomerData class to ProGuard

#### v2.4.12

* Add `checkstyle` task to Gradle
* Replaced FrameRendererListener with VideoListener.
* Custom data update: deprecate MuxExoPlayer constructors that take a CustomerData argument separately, add custom-dimensions example to demo app

#### v2.4.11

* Run automated tests on real devices
* Fix MIME-type detection for HLS & DASH streams by allowing the server to make that determination.
* Upgrade MuxCore to 6.6.0, which includes:
  * Add support for custom dimensions in view metadata
  * Fix propagation of bandwidth metrics data by sending even when unchanged

#### v2.4.10

* Fix an issue where a null pointer exception may be raised when playing back DASH content (only present in v2.4.9)

#### v2.4.9 (deprecated)

* Added support for CDN header tracking, including mid-stream CDN switching
* Fix a null-pointer crash in the ads listener
* Updated the Mux Core library, added support for bandwidth metrics

#### v2.4.8

* Reset internal state when calling `videochange`, fixing an issue where rebuffering may be reported incorrectly after calling `videochange`

#### v2.4.7

* Fix an issue where metrics weren't tracked correctly sometimes when playback starts with a seek event
* Upgrade MuxCore to 6.3.0, which includes:
  * Reset error-tracking state when loading a new video.
* \[Internal] Fix automated tests for r2.13.1

#### v2.4.5

* Add support for ExoPlayer r2.13.x

#### v2.4.4

* Removed all content from res directory under MuxExoPlayer, ensuring smaller build size
* \[Internal] Added test for playback end events and view end event
* \[Maintenance] Reformat code with Google Java style
* Upgrade MuxCore to 6.2.0, which includes:
  * Added `viewEnd` event on player release.

#### v2.4.3

* Fix an issue where `customerViewData` was not propagated correctly through all constructors

#### v2.4.2

* Fix an issue where `customerViewData` was not propagated correctly through constructors

#### v2.4.1

* Fix an issue where detection of rebuffering after seeking was not working at times
* Use a random UUID stored in shared preferences for `mux_viewer_id`
* Fix an issue where `view_session_id` wasn't sent correctly

#### v2.4.0

* Fix an issue where additional icons and image files were included
* Fix an issue where the application would crash on Android 11
* Expose additional fatal playback errors

#### v2.3.1

* Fix an issue where AAR file size was too large due to inadvertent inclusion of a video file

#### v2.3.0

* Fix an issue where logical resolution was calculated incorrectly
* Report `wired` instead of `ethernet` for certain connection types
* \[internal] Integrate automated integration tests

#### v2.2.0

* Upgrade to Android Studio 4.1
* Upgrade to Gradle 6.1.1
* Update Dockerfile and build script for new tooling
* Support back to minAPI 16 via multidexing support

#### v2.1.0

* Support ExoPlayer r2.12.x flavors
* Expose CustomerViewData through ProGuard
* Ensure packages are scoped to com.mux.stats.sdk in ProGuard
* Update version reported by the plugin (v2.0.0 reported v1.5.0 unintentionally, now will report v2.1.0)
* Fix an issue where accessing ad integration could cause a crash
* Bump to MuxCore v6.0.0
* Fix invalid rebuffering reported for audio-only and playback
* Ensure that events are sent in a more timely manner (some events are held after a PauseEvent until
  the next active event)

#### v2.0.0

* Bump to v5.0.0 of MuxCore
  * Update ad handling logic to ensure that ad metrics and dimensions are tracked correctly
  * Retry sending failed beacons, rather than letting them drop
  * Fix issue where we were incorrectly calculating scaling metrics when screen or video resolution was negative
  * Fix an issue where watch time is incorrectly increasing after certain events
  * Make sure that time to first frame is not tracked for views that result from `programchange`
  * Add support for `viewer_connection_type`, which is a breaking change for `IDevice`, as it adds another method that must be implemented
  * Add support for `view_session_id`, which includes an additional `CustomerViewData` class. This changes the constructor for creating a `MuxStats` instance
* Drop support for ExoPlayer r2.7.x and r2.8.x
* Implement SeekingEvent directly in `MuxStatsExoPlayer`
* Fix issue where source type could be null and cause a crash
* Fix an issue where ad events are sent out of order in some cases
* Add connection type detection
* Report logical sizes for player size, rather than physical size
* Fix an issue where time to first frame was incorrectly measured in some cases, such as mid- or post-roll ad playback without a pre-roll
* Add support for `CustomerViewData`, including `setViewSessionId`

#### v1.5.0

* Fix an issue where if you were using `muxStatsExoPlayer.setPlayerSize(width, height)` those values were not used correctly. Note: If you call this, you must update the player size whenever that changes, as the SDK will no longer pull those values automatically.

#### v1.4.0

* Move `MuxSDKViewOrientation` to `com.mux.stats.sdk.core.MuxSDKViewOrientation` and expose it publicly

#### v1.3.0

* Add support for `RenditionChangeEvent`, which is tracked automatically
* Add support for `OrientationChangeEvent`, which can be triggered by calling `muxStatsExoPlayer.orientationChange(MuxSDKViewOrientation orientation)`. Supported orientations are `MuxSDKViewOrientation.LANDSCAPE` and `MuxSDKViewOrientation.PORTRAIT`.
* Fix an issue where full screen tracking was not working correctly

#### v1.2.0

* Add support for ExoPlayer 2.11.x
* Note: there is a known issue right now with ExoPlayer r2.11.x where ads are not tracked correctly. This is under development.

#### v1.1.0

* Add support for additional debug logging. See `muxStatsExoPlayer.enableMuxCoreDebug(Boolean enable, Boolean verbose)`
* Add the ability to update customerVideoData and customerPlayerData mid-stream, in cases that certain metadata may not be available at the beginning of playback. See `muxStatsExoPlayer.updateCustomerData(CustomerPlayerData customerPlayerData, CustomerVideoData customerVideoData)`
* Fix an issue where if `MuxStatsExoPlayer` is initialized too late, the stream is not tracked correctly
* Fix an issue where Mux Plugin Version is reported incorrectly
* Fix an issue where the `EndedEvent` is not sent to the backend
* Fix an issue where tracking playback is not correct when playWhenReady is set to false (i.e. non-autoplay playback)
* Fix an issue where events could be sent after playback completes, forcing the view to be active for longer than it actually was
* Utilize more accurate client timestamps for event timing

#### v1.0.0

* Add support for ExoPlayer 2.9.x
* Add support for ExoPlayer 2.10.x
* Fix issue where ExoPlayer versions 2.9.x and greater would log messages about accessing the player on the wrong thread
* **breaking change** Removed support for ExoPlayer 2.6.x and older (due to changes in build pipeline and Gradle configurations)
* Support Gradle 3.5.2

#### v0.5.1

* Clean up demo application
* Allow disabling of Sentry reporting for exceptions.

#### v0.5.0

* Deprecated method `muxStatsExoPlayer.getImaSDKListener` in favor of `muxStatsExoPlayer.monitorImaAdsLoader(adsLoader)`. The previous method will still work, but you should migrate to the new method as the deprecated method will be removed with the next major version.
* Fix an issue where Google IMA SDK was a hard requirement unintentionally.

#### v0.4.5

* Introduce support for tracking ads with Google's IMA SDK.

#### v0.4.3

* Fix an issue where a `NullPointerException` may occur during playback of a video while tracking bandwidth metrics.

#### v0.4.2

* Added API method `programChange(CustomerVideoData customerVideoData)`, for use when inside of a single stream the program changes. For instance, in a long-running live stream, you may have metadata indicating program changes which should be tracked as separate views within Mux. Previously, `videoChange` might have been used for this case, but this would not work correctly, and you would not necessarily have seen the subsequent views show up.
* Fixed a bug where under poor network conditions, an exception raised as a result of a network request could result in not tracking the view correctly subsequently (such as missing rebuffer tracking after this point).

#### v0.4.1

* Remove the listeners on the `ExoPlayer` object when `release` is called.
  * This fixes an issue where the application may crash after calling release
    if the ExoPlayer instance is removed while the SDK is still listening to
    it.

#### v0.4.0

* \[feature] Support bandwidth throughput metrics on video segment download
  for HLS and Dash streaming.
* **breaking change** The signature for `getAdaptiveMediaSourceEventListener`
  and `getExtractorMediaSourceEventListener` has been changed. These methods
  are used to enable throughput metrics tracking for ExoPlayer versions
  *before* r2.8.0, and now require that the streaming protocol type is
  passed as the first parameter. The type is the same as is returned from
  [this ExoPlayer API call](https://github.com/muxinc/stats-sdk-exoplayer/blob/release-v2/demo/src/main/java/com/google/android/exoplayer2/demo/PlayerActivity.java#L355).

#### v0.3.0

* **breaking change** The signature for the `MuxStatsExoPlayer` constructor
  has changed, and now requires an additional parameter (the first) to be
  an Android `Context` reference.
* abstract more core logic into mux-stats-sdk-java
* \[build] rename and copy build artifacts

#### v0.2.2

* add back in previously missing methods to `MuxStatsExoPlayer`:
  * `videoChange`
  * `setPlayerSize`
  * `error`
  * `setAutomaticErrorTracking`

#### v0.2.1

* add support for `ExoPlayer` r2.7.x
* add support for `ExoPlayer` r2.8.x
* update to v2.1.0 of mux-stats-sdk-java


# Monitor dash.js
This guide walks through integration with [dash.js](https://github.com/Dash-Industry-Forum/dash.js?) to collect video performance metrics with Mux data.
## Features

The following data can be collected by the Mux Data SDK when you use the dash.js SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Can infer CDN identification from response headers
- Custom Dimensions
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Customizable Error Tracking
- Custom Beacon Domain

```


## 1. Install mux-embed

Include the Mux JavaScript SDK on every page of your web app that includes video. You can use the Mux-hosted version of the script or install via npm. `mux-embed` follows [semantic versioning](https://semver.org/) and the API will not change between major releases.

```cdn

<script src="https://src.litix.io/core/4/mux.js"></script>

```

```npm

npm install --save mux-embed

```

```yarn

yarn add mux-embed

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```html

<script>
  if (typeof mux !== 'undefined') {
    window.muxPlayerInitTime = mux.utils.now();
  }
</script>

<video
  id="my-player"
  controls
  width="960"
  height="400"
></video>

<script>
  const videoEl = document.querySelector('#my-player');
  const dashjsPlayer = dashjs.MediaPlayer().create();
  dashjsPlayer.initialize(videoEl, 'https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd', true);

  // Initialize Mux Data monitoring by passing in the "id" attribute of your video player
  if (typeof mux !== 'undefined') {
    mux.monitor(videoEl, {
      debug: false,
      dashjs: dashjsPlayer,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata fields
        player_name: 'Main Player', // any arbitrary string you want to use to identify this player
        player_init_time: window.muxPlayerInitTime // ex: 1451606400000
        // ...
      }
    });
  }
</script>

```

```javascript

import dashjs from "dashjs";
import mux from "mux-embed";

const dashjsPlayer = dashjs.MediaPlayer().create();
const videoEl = document.querySelector('#my-player');

dashjsPlayer.initialize(videoEl, 'https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd', true);

mux.monitor(videoEl, {
  debug: false,
  dashjs: dashjsPlayer,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata fields
    player_name: 'Main Player', // any arbitrary string you want to use to identify this player
    player_init_time: window.muxPlayerInitTime // ex: 1451606400000
    // ...
  }
});

```

```react

import React, { useEffect, useRef } from "react";
import dashjs from "dashjs";
import mux from "mux-embed";

export default function VideoPlayer() {
  const videoRef = useRef(null);
  const src = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";

  useEffect(() => {
    let dashjsPlayer;

    if (videoRef.current) {
      const video = videoRef.current;
      const initTime = mux.utils.now();

      dashjsPlayer = dashjs.MediaPlayer().create();
      dashjsPlayer.initialize(video, src, true);

      mux.monitor(video, {
        debug: false,
        // pass in the 'dashjsPlayer' instance
        dashjs: dashjsPlayer,
        data: {
          env_key: "ENV_KEY", // required
          // Metadata fields
          player_name: "Main Player", // any arbitrary string you want to use to identify this player
          player_init_time: initTime
          // ...
        }
      });
    }

    return () => {
      if (dashjsPlayer) {
        dashjsPlayer.destroy();
      }
    };
  }, [videoRef]);

  return (
    <video
      controls
      ref={videoRef}
      style={{ width: "100%", maxWidth: "500px" }}
    />
  );
}

```



Call `mux.monitor`, passing in a valid CSS selector or the video element itself, followed by the SDK options and metadata. If you use a CSS selector that matches multiple elements, the first matching element in the document will be used.

In the SDK options, be sure to pass in the `dashjs` player instance.

Alternatively, if your player does not immediately have access to the dash.js player instance, you can start monitoring dash.js at any time in the future. In order to do this, you can call either of the following:

```js
mux.addDashJS("#my-player", options)
// or
myVideoEl.mux.addDashJS(options)
```

Log in to the Mux dashboard, find the environment that corresponds to your `env_key`, and look for video views. It takes a minute or two after a view is tracked for it to show up on the Metrics tab.

**If you aren't seeing data**, check to see if you have an ad blocker, tracking blocker or some kind of network firewall that prevents your player from sending requests to Mux Data servers.

## 3. Make your data actionable

The only required field in the `options` that you pass into `mux-embed` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key when calling `mux.monitor`.

```js
mux.monitor('#my-player', {
  debug: false,
  dashjs: dashjsPlayer,
  data: {
    env_key: 'ENV_KEY', // required

    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'

    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000

    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `monitor`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
mux.emit('#my-player', 'videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```
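As a sketch of that ordering with dash.js (the `playNextVideo` helper is hypothetical; `attachSource` is the dash.js call for loading a new manifest):

```javascript
// Hypothetical helper showing the recommended order of operations:
// point the player at the new source first, then emit `videochange`.
function playNextVideo(dashjsPlayer, muxSdk, url, metadata) {
  dashjsPlayer.attachSource(url);                     // load the new manifest
  muxSdk.emit('#my-player', 'videochange', metadata); // then reset the Mux view
}
```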

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
mux.emit('#my-player', 'programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
mux.monitor('#my-player', {
  debug: false,
  disableCookies: true,
  dashjs: dashjsPlayer,
  data: {
    env_key: 'ENV_KEY',
    // ... rest of metadata
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
mux.monitor('#my-player', {
  debug: false,
  dashjs: dashjsPlayer,
  respectDoNotTrack: true, // Disable tracking of browsers where Do Not Track is enabled
  data: {
    env_key: 'EXAMPLE_ENV_KEY',
    // ... rest of metadata
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `mux-embed` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the mux monitor.

```js
mux.emit('#my-player', 'error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.
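For instance, a sketch of how two distinct network failure conditions might share one code while keeping distinct messages (the `toMuxError` helper and its message strings are hypothetical; the `100` floor for custom codes is from the guidance above):

```javascript
// Hypothetical helper: map distinct network failure conditions to one
// shared player_error_code with different human-readable messages.
const NETWORK_ERROR_CODE = 100; // custom codes should be >= 100

function toMuxError(kind) {
  const messages = {
    timeout: 'Network timeout while fetching a segment',
    dns: 'DNS resolution failed for the stream host'
  };
  return {
    player_error_code: NETWORK_ERROR_CODE,
    player_error_message: messages[kind] || 'Unknown network error'
  };
}
```

Both conditions then aggregate under the same code in your dashboard, while the messages still distinguish them.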

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear or lack helpful information in their default messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

mux.monitor('#my-player', {
  debug: false,
  errorTranslator: errorTranslator,
  dashjs: dashjsPlayer,
  data: {
    env_key: 'ENV_KEY', // required

    // ... additional metadata
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
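A sketch of a translator that drops non-fatal errors, assuming (hypothetically) that this player reports warnings with codes below 100 and fatal failures at 100 and above:

```javascript
// Hypothetical convention: codes below 100 are non-fatal warnings,
// codes >= 100 are fatal playback failures.
function errorTranslator(error) {
  if (error.player_error_code < 100) {
    return false; // returning false tells mux-embed not to track this error
  }
  // pass fatal errors through unchanged
  return {
    player_error_code: error.player_error_code,
    player_error_message: error.player_error_message,
    player_error_context: error.player_error_context
  };
}
```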

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
mux.monitor('#my-player', {
  debug: false,
  automaticErrorTracking: false,
  dashjs: dashjsPlayer,
  data: {
    env_key: 'EXAMPLE_ENV_KEY', // required

    // ... additional metadata
  }
});
```

### Use TypeScript with mux-embed  <BetaTag />

`mux-embed` now provides TypeScript type definitions with the published package! If you want to opt in, you can check out how [here](/docs/guides/monitor-html5-video-element#use-typescript-with-mux-embed--).

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
mux.monitor('#my-player', {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  dashjs: dashjsPlayer,
  data: {
    env_key: 'EXAMPLE_ENV_KEY', // required
    // ... additional metadata
  }
});
```

## Release notes

### Current release

#### v5.17.1

* fix issue where playing time might accumulate for paused players

### Previous releases

#### v5.17.0

* add compatibility for dash.js 5

#### v5.16.1

* Update parsing of initial value for player\_playback\_mode

#### v5.16.0

* Add Playback Range Tracker for new engagement metrics

#### v5.15.0

* Automatically detect playback mode changes for HTML 5 Video

#### v5.14.0

* Emit a renditionchange event at the start of views to enable updated rendition tracking.

#### v5.13.0

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * New playbackmodechange event
  * Two new metrics, ad\_playing\_time\_ms\_cumulative and view\_playing\_time\_ms\_cumulative, to track playing time by wall clock time

#### v5.12.0

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.

#### v5.11.0

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time

#### v5.10.0

* Adds support for cdnchange events

#### v5.9.1

* Submit Aggregate Startup Time when autoplay is set

#### v5.9.0

* Improve scaling calculation accuracy by using more events for tracking

#### v5.8.3

* add custom 11 through 20 to types

#### v5.8.2

* remove duplicate video\_source\_mime\_type from types

#### v5.8.1

* fix typo in types for viewer\_plan

#### v5.8.0

* Add support for video\_creator\_id

#### v5.7.0

* Add keys for new customer-defined dimensions

#### v5.6.0

* Fix issue where firefox did not send beacons, and some final beacons might not be sent

#### v5.5.0

* Update mechanism for generating unique IDs, used for `view_id` and others
* Use crypto.randomUUID(), when available, for generating UUID values

#### v5.4.3

* \[chore] internal build process fix (no functional changes)

#### v5.4.2

* feat(google-ima): Beta implementation of google-ima extension to mux-embed
* feat(mux-embed): Add methods for post-initialization overrides of functionality (for internal use only).
* fix(mux-embed): typecheck for dashjs.getSource is incorrect.

#### v5.4.1

* Expose `updateData` globally and fix types
* Fix an issue where views were not ended cleanly on long resume detection

#### v5.4.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

#### v5.3.3

* expose HEARTBEAT and DESTROY under mux.events

#### v5.3.2

* Fix type issues for error severity and business exception

#### v5.3.1

* fix(mux-embed): Remove 3rd party dependencies and replace with appropriately equivalent functionality.

#### v5.3.0

* Ignore request events when emitting heartbeat events
* Fix an issue where video quality metrics may not be calculated correctly on some devices

#### v5.2.1

* Send hb events regardless of errors

#### v5.2.0

* Bug fix to not de-dupe error event metadata
* Extend `errorTranslator` to work with `player_error_severity` and `player_error_business_exception`

#### v5.1.0

* Target ES5 for bundles and validate bundles are ES5

* fix an issue where seeking time before first play attempt counted towards video startup time

#### v5.0.0

* Add opt-in TypeScript Types to Mux Embed and use + refactor for other dependent data SDKs. Update published dists to include CJS and ESM.
* Mux Embed now provides (opt in) TypeScript types in its published package, as well as publishes CJS and ESM versions of the package.
* This allows us to provide a lower risk and iterative roll out of official TypeScript types for `mux-embed`. The export types updates were required to ensure actual matches between the dist package and corresponding TypeScript types.
* This *should* have no direct impact on users, though different build tools will now potentially select one of the new export types (e.g. the ESM "flavor" of `mux-embed`). TypeScript types *should not* be applied unless they are explicitly referenced in app (discussed in docs updates).

#### v4.30.0

* fix an issue causing certain network metrics to not be available for dashjs v4.x

* fix an issue where certain IDs used may cause a DOM exception to be raised

#### v4.29.0

* fix(mux-embed): avoid using element id for muxId. attach muxId to element.

#### v4.28.1

* fix an issue where beaconDomain deprecation line was incorrectly logged

#### v4.28.0

* Deprecate `beaconDomain` in favor of `beaconCollectionDomain`. The `beaconDomain` setting will continue to function, but integrations should change to `beaconCollectionDomain` instead.

#### v4.27.0

* Fix an issue where playback time was incorrectly counted during seeking and other startup activities
* Add events for the collection of ad clicks
* fix an issue where seek latency could be unexpectedly large
* fix an issue where seek latency does not include time at end of a view
* Add events for the collection of ad skips

#### v4.26.0

* muxData cookie expiration should be one year

#### v4.25.1

* Do not deduplicate ad IDs in ad events

#### v4.25.0

* Include ad watch time in playback time

#### v4.24.0

* Fix an issue where beacons over a certain size could get hung and not be sent

#### v4.23.0

* Collect Request Id from the response headers, when available, for HLS.js (`requestcompleted` and `requestfailed`) and Dash.js (`requestcompleted`). The following headers are collected: `x-request-Id`, `cf-ray` (Cloudflare), `x-amz-cf-id` (CloudFront), `x-akamai-request-id` (Akamai)

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update Headers type

#### v4.22.0

* Send errors, `requestfailed`, and `requestcancelled` events on Dash.js. Because of this change, you may see the number of playback failures increase as we now automatically track additional fatal errors.

#### v4.21.0

* Include Ad metadata in ad events

#### v4.20.0

* Support for new dimension, `view_has_ad`

#### v4.19.0

* End views after 5 minutes of rebuffering

#### v4.18.0

* Add audio, subtitle, and encryption key request failures for HLS.js
* Capture ad metadata for Video.js IMA
* Capture detailed information from HLS.js for fatal errors in the Error Context

#### v4.17.0

* Extend `errorTranslator` to work with `player_error_context`

#### v4.16.0

* Add new `renditionchange` fields to Shaka SDK
* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields
* Add frame drops to Shaka SDK
* Add new `renditionchange` info to Web SDKs
* Adds the new Media Collection Enhancement fields

#### v4.15.0

* update `mux.utils.now` to use `navigationStart` for timing reference

* fix issue where views after `videochange` might incorrectly accumulate rebuffering duration

* Resolved issue sending beacons when view is ended

* Record `request_url` and `request_id` with network events

#### v4.14.0

* Tracking FPS changes if specified in Manifest

#### v4.13.4

* Resolved issue sending beacons when paused

#### v4.13.3

* Fixed issue with monitoring network events for hls.js monitor

#### v4.13.2

* Fix an issue with sending unnecessary heartbeat events on the window `visibilitychange` event

#### v4.13.1

* Fixes an issue with accessing the global object

#### v4.13.0

* Collect the `x-request-id` header from segment responses to make it easier to correlate client requests to other logs

* Upgraded internal webpack version

* Flush events on window `visibilitychange` event

#### v4.12.1

* Use Fetch API for sending beacons

#### v4.12.0

* Generate a new unique view if the player monitor has not received any events for over an hour.

#### v4.11.0

* Detect fullscreen and player language

#### v4.10.0

* Replace query string dependency to reduce package size
* Remove `ImageBeacon` fallback, removing support for IE9

#### v4.9.4

* Generate all `view_id`'s internally

#### v4.9.3

* Use common function for generating short IDs

#### v4.9.2

* Fixed an issue around the `disablePlayheadRebufferTracking` option

#### v4.9.1

* Fix issue where `getStartDate` does not always return a date object

#### v4.9.0

* Support PDT and player\_live\_edge\_program\_time for Native Safari

* Set a max payload size in mux-embed

#### v4.8.0

* Add option `disablePlayheadRebufferTracking` to allow players to disable automatic rebuffering metrics.
  Players can emit their own `rebufferstart` or `rebufferend` events and track rebuffering metrics.

* Fix an issue with removing `player_error_code` and `player_error_message` when the error code is `1`.
  Also stops emitting `MEDIA_ERR_ABORTED` as errors.

* Now leaving Player Software Version for HTML5 Video Element unset rather than "No Versions" as it is no longer needed.

#### v4.7.0

* Add an option to specify beaconCollectionDomain for Data custom domains

#### v4.6.2

* Fix an issue with emitting heartbeat events while the player is not playing

#### v4.6.1

* Fix an issue with removing event listeners from window after the player monitor destroy event

#### v4.6.0

* Update hls.js monitor to record session data with fields prefixed as `io.litix.data.`
* Update the manifest parser to parse HLS session data tags

#### v4.5.0

* Add short codes to support internal video experiments
* Collect request header prefixed with `x-litix-*`
* Capture fatal hls.js errors
* Make `envKey` an optional parameter

#### v4.4.4

* Add a player events enum on the `mux` object (e.g. `mux.events.PLAY`)
* Use the browser `visibilitychange` listener instead of `unload` to handle destructuring the player monitor.

#### v4.4.3

* Fix: Specify `video_source_is_live` for HLS.js monitor

#### v4.4.2

* Group events into 10 second batches before sending a beacon

#### v4.4.1

* Exclude latency metrics from beacons if `video_source_is_live` is not `true`

#### v4.4.0

* Add a lightweight HLS manifest parser to capture latency metrics for players that don't expose an API for accessing the manifest.
* Allow players to emit `player_program_time` instead of calculating internally

#### v4.3.0

* Add support for calculating latency metrics when streaming using HLS

#### v4.2.5

* Remove default `video_id` when not specified by the developer.

#### v4.2.4

* Add minified keys for latency metrics

#### v4.2.3

* Add minified keys for new program time metrics

#### v4.2.2

* Fix bug causing missing bitrate metrics using HLS.js {'>'}v1.0.0

#### v4.2.1

* (video element monitor) Fix an issue where some non-fatal errors thrown by the video were tracked as playback failures

#### v4.2.0

* Fix an issue where views triggered by `programchange` may not report metrics correctly
* Fix an issue where calling `el.mux.destroy()` multiple times in a row raised an exception

#### v4.1.1

* Fix an issue where `player_remote_played` wasn't functioning correctly

#### v4.1.0

* Add support for custom dimensions

#### v4.0.1

* Support HLS.js v1.0.0

#### v4.0.0

* Enable sending optional ad quartile events through.
* Move device detection server-side, improving data accuracy and reducing client SDK size.
* Fix an issue where jank may be experienced in some web applications when the SDK is loaded.

#### v3.4.0

* Setting to disable rebuffer tracking `disableRebufferTracking` that defaults to `false`.

#### v3.3.0

* Adds `viewer_connection_type` detection.

#### v3.2.0

* Adds support for `renditionchange`.

#### v3.1.0

* Add checks for window being undefined and expose a way for SDKs to pass in platform information. This work is necessary for compatibility with react-native-video.

#### v3.0.0

* Setting to disable Mux Data collection when Do Not Track is present now defaults to off
* Do not submit the source URL when a video is served using the data: protocol

#### v2.10.0

* Use Performance Timing API, when available, for view event timestamps

#### v2.9.1

* Fix an issue with server side rendering

#### v2.9.0

* Support for Dash.js v3

#### v2.8.0

* Submit Player Instance Id as a unique identifier

#### v2.7.3

* Fixed a bug where, when using `mux.monitor` with Hls.js or Dash.js, the source hostname was not properly collected.


# Monitor video.js
This guide walks through integration with [video.js](https://videojs.com/) to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the video.js SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Available for deployment from a package manager
- Can infer CDN identification from response headers
- Custom Dimensions
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Customizable Error Tracking
- Ads metrics
- Custom Beacon Domain
- Extraction of HLS Session Data
- Live Stream Latency metric

```

Notes:

```md
Request metrics and CDN identification are available using `videojs-contrib-hls` or Video.js v7+. Preroll Ads metrics & metadata available if using `videojs-ima`. Session Data is available with Video.js w/ HLS.js.
```

## 1. Install `videojs-mux`

Include the Mux JavaScript SDK on every page of your web app that includes video. You can use the Mux-hosted version of the script or install via npm. `videojs-mux` follows [semantic versioning](https://semver.org/) and the API will not change between major releases.

```cdn

<!-- Include videojs-mux after Video.js -->
<script src="/path/to/video.js"></script>
<!-- Include other videojs plugin files here -->
<script src="https://src.litix.io/videojs/4/videojs-mux.js"></script>

```

```npm

npm install --save videojs-mux

```

```yarn

yarn add videojs-mux

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Call video.js like you normally would and include the Mux plugin options.

```html

<video id="my-player" class="video-js vjs-default-skin" controls>
  <!--
      we're using a Mux HLS URL in this example, but the Mux Data integration
      with video.js works with any source that plays with video.js
  -->
  <source src="https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8" type="application/x-mpegURL">
</video>

<script>
  // EITHER initialize Mux Data monitoring like this
  videojs('my-player', {
    plugins: {
      mux: {
        debug: false,
        data: {
          env_key: 'ENV_KEY', // required

          // Metadata
          player_name: '', // ex: 'My Main Player'

          // ... and other metadata
        }
      }
    }
  });


  // OR call the mux function on the player instance
  // var player = videojs('my-player');
  // player.mux({
  //   debug: false,
  //   data: { ... }
  // });
</script>

```

```javascript

import videojs from "video.js";
import "video.js/dist/video-js.css";
import "videojs-mux";

videojs('my-player', {
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required

        // Metadata
        player_name: '', // ex: 'My Main Player'

        // ... and other metadata
      }
    }
  }
});

```

```react

import React, { useEffect, useRef } from "react";
import videojs from "video.js";
import "video.js/dist/video-js.css";
import "videojs-mux";

export default function VideoPlayer() {
  const videoRef = useRef(null);
  const playerRef = useRef(null);
  const src = "https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8";

  useEffect(() => {
    if (videoRef.current) {
      const video = videoRef.current;

      playerRef.current = videojs(video, {
        sources: [{ src, type: "application/x-mpegURL" }],
        plugins: {
          mux: {
            debug: false,
            data: {
              env_key: "ENV_KEY", // required
              // Metadata
              player_name: "", // ex: 'My Main Player'
              // ... and other metadata
            }
          }
        }
      });
    }

    return () => {
      if (playerRef.current) {
        playerRef.current.dispose();
      }
    };
  }, [videoRef]);

  return (
    <video
      controls
      ref={videoRef}
      style={{ width: "100%", maxWidth: "500px" }}
    />
  );
}

```



## 3. Make your data actionable

The only required field in the `options` that you pass into `videojs-mux` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` on initialization.

```js
videojs('#my-player', {
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Site Metadata
        viewer_user_id: '', // ex: '12345'
        experiment_name: '', // ex: 'player_test_A'
        sub_property_id: '', // ex: 'cus-1'
        // Player Metadata
        player_name: '', // ex: 'My Main Player'
        player_version: '', // ex: '1.0.0'
        // There is no need to provide player_init_time, tracked automatically
        // player_init_time: '', // ex: 1451606400000;
        // Video Metadata
        video_id: '', // ex: 'abcd123'
        video_title: '', // ex: 'My Great Video'
        video_series: '', // ex: 'Weekly Great Videos'
        video_duration: '', // in milliseconds, ex: 120000
        video_stream_type: '', // 'live' or 'on-demand'
        video_cdn: '' // ex: 'Fastly', 'Akamai'
      }
    }
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first initialize the Mux SDK. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
// player is the instance returned by the `videojs` function
player.mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by the `videojs` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```
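As a sketch of this flow, a hypothetical playlist advance might set the new source and then immediately emit `videochange` (the `playlist` array and `playNext` helper below are illustrative, not part of the SDK; `player.src` and `player.mux.emit` are the real video.js and `videojs-mux` calls):

```javascript
// Illustrative sketch: advance a player through a playlist and signal
// each new view to Mux. `playlist`, `index`, and `playNext` are hypothetical.
var playlist = [
  { src: 'https://stream.mux.com/VIDEO_ID_1.m3u8', video_id: 'abc345', video_title: 'My Other Great Video' },
  { src: 'https://stream.mux.com/VIDEO_ID_2.m3u8', video_id: 'def678', video_title: 'Yet Another Video' }
];
var index = 0;

function playNext(player) {
  index += 1;
  if (index >= playlist.length) return null;
  var item = playlist[index];
  // Tell the player which new source to play...
  player.src({ src: item.src, type: 'application/x-mpegURL' });
  // ...then immediately tell Mux that a new view is starting
  player.mux.emit('videochange', {
    video_id: item.video_id,
    video_title: item.video_title
  });
  return item;
}

// Wire it up so the playlist advances when a video ends:
// player.on('ended', function () { playNext(player); });
```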

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by the `videojs` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
videojs('#my-player', {
  plugins: {
    mux: {
      debug: false,
      disableCookies: true,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

### Override 'Do Not Track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
videojs('#my-player', {
  plugins: {
    mux: {
      debug: false,
      respectDoNotTrack: true,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `videojs-mux` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the mux monitor.

```js
// player is the instance returned by the `videojs` function
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
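As a sketch of this grouping, two different network failure conditions might share one error code while keeping distinct messages and instance-specific context (the code value `100` and the `describeNetworkError` helper are illustrative assumptions; only `player.mux.emit('error', ...)` is the real `videojs-mux` call):

```javascript
// Illustrative sketch: group similar failures under one error code.
// `describeNetworkError` and the code value are hypothetical.
function describeNetworkError(kind, detail) {
  return {
    player_error_code: 100, // one shared code for the "network" error category
    player_error_message: kind === 'timeout'
      ? 'Network request timed out'
      : 'Network request failed',
    player_error_context: detail // instance-specific, e.g. a segment id; not aggregated
  };
}

// Usage (player is the instance returned by the `videojs` function):
// player.mux.emit('error', describeNetworkError('timeout', 'segment-42'));
```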

### Error translator

If your player emits error events that are not fatal to playback, or its errors are unclear or lack helpful information in the default error message and code, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

videojs('#my-player', {
  plugins: {
    mux: {
      debug: false,
      errorTranslator,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
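For instance, a translator along these lines could drop errors it considers non-fatal while passing everything else through unchanged (which codes count as non-fatal is an assumption for illustration; code `1` is the standard `MEDIA_ERR_ABORTED`, which is often user-initiated):

```javascript
// Illustrative sketch: ignore non-fatal errors, pass fatal ones through.
// Which codes are treated as non-fatal here is a hypothetical choice.
var NON_FATAL_CODES = [1]; // MEDIA_ERR_ABORTED

function errorTranslator(error) {
  if (NON_FATAL_CODES.indexOf(error.player_error_code) !== -1) {
    return false; // returning false tells videojs-mux not to track this error
  }
  return {
    player_error_code: error.player_error_code,
    player_error_message: error.player_error_message,
    player_error_context: error.player_error_context
  };
}
```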

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
videojs('#my-player', {
  plugins: {
    mux: {
      debug: false,
      automaticErrorTracking: false,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

### Ads tracking with `videojs-mux`

If you are using [`videojs-ima`](https://github.com/googleads/videojs-ima), or Brightcove's IMA3, FreeWheel, or OnceUX plugins with Video.js, then `videojs-mux` will track ads automatically. No extra configuration is needed.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
videojs('#my-player', {
  plugins: {
    mux: {
      debug: false,
      beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v4.21.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v4.21.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v4.21.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v4.21.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v4.21.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v4.21.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v4.21.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v4.21.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v4.21.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v4.21.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v4.21.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v4.21.7

* Update `mux-embed` to v5.9.0

#### v4.21.6

* Update `mux-embed` to v5.8.3

#### v4.21.5

* Update `mux-embed` to v5.8.2

#### v4.21.4

* Update `mux-embed` to v5.8.1

#### v4.21.3

* Update `mux-embed` to v5.8.0

#### v4.21.2

* Update `mux-embed` to v5.7.0

#### v4.21.1

* Update `mux-embed` to v5.6.0

#### v4.21.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v4.20.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v4.20.2

* Update `mux-embed` to v5.4.2

#### v4.20.1

* Update `mux-embed` to v5.4.1

#### v4.20.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v4.19.4

* Update `mux-embed` to v5.3.3

#### v4.19.3

* Update `mux-embed` to v5.3.2

#### v4.19.2

* Update `mux-embed` to v5.3.1

#### v4.19.1

* Update `mux-embed` to v5.3.0

#### v4.19.0

* utilize onRequest rather than beforeSend for videojs 8.x

* Update `mux-embed` to v5.2.1

#### v4.18.1

* Update `mux-embed` to v5.2.0

#### v4.18.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v4.17.0

* Refactors for stricter data types (e.g. string vs. number) based on TypeScript types.

* Update `mux-embed` to v5.0.0

#### v4.16.4

* Update `mux-embed` to v4.30.0

#### v4.16.3

* Update `mux-embed` to v4.29.0

#### v4.16.2

* Update `mux-embed` to v4.28.1

#### v4.16.1

* Update `mux-embed` to v4.28.0

#### v4.16.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v4.15.3

* Update `mux-embed` to v4.26.0

#### v4.15.2

* Update `mux-embed` to v4.25.1

#### v4.15.1

* Update `mux-embed` to v4.25.0

#### v4.15.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v4.14.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v4.13.4

* Update `mux-embed` to v4.22.0

#### v4.13.3

* Update `mux-embed` to v4.21.0

#### v4.13.2

* Update `mux-embed` to v4.20.0

#### v4.13.1

* Update `mux-embed` to v4.19.0

#### v4.13.0

* Set Mux Error Context with error status from Video.js

#### v4.12.0

* Capture ad metadata for Video.js IMA

* Update `mux-embed` to v4.18.0

#### v4.11.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v4.10.1

* fix issue where VideoJS with hls.js might cause an exception when monitored

#### v4.10.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v4.9.1

* fix an issue where an exception may happen on certain Samsung TVs using `videojs-mux`

#### v4.9.0

* Register `beforesetup` hook to track `player_init_time` automatically. There is now no need to provide `player_init_time` in plugin initialization

* Record `request_url` and `request_id` with network events

* Update `mux-embed` to v4.15.0

#### v4.8.5

* Update `mux-embed` to v4.14.0

#### v4.8.4

* Update `mux-embed` to v4.13.4

#### v4.8.3

* Update `mux-embed` to v4.13.3

#### v4.8.2

* Update `mux-embed` to v4.13.2

#### v4.8.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v4.8.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v4.7.8

* Update `mux-embed` to v4.12.1

#### v4.7.7

* Update `mux-embed` to v4.12.0

#### v4.7.6

* Update `mux-embed` to v4.11.0

#### v4.7.5

* Update `mux-embed` to v4.10.0

#### v4.7.4

* Update `mux-embed` to v4.9.4

#### v4.7.3

* Use `videojs.Vhs` instead of `videojs.Hls` when available

#### v4.7.2

* Update `mux-embed` to v4.9.3

#### v4.7.1

* Update `mux-embed` to v4.9.2

#### v4.7.0

* HLS session and latency metrics

#### v4.6.6

* Update `mux-embed` to v4.9.1

#### v4.6.5

* Update `mux-embed` to v4.9.0

#### v4.6.4

* Fix an issue with removing `player_error_code` and `player_error_message` when the error code is `1`.
  Also stops emitting `MEDIA_ERR_ABORTED` as errors.
* Update `mux-embed` to v4.8.0

#### v4.6.3

* Update `mux-embed` to v4.7.0

#### v4.6.2

* Update `mux-embed` to v4.6.2

#### v4.6.1

* Update `mux-embed` to v4.6.1

#### v4.6.0

* Bump mux-embed to 4.6.0

#### v4.5.0

* Export a `register` function that takes a `videojs` instance to install the mux plugin on

#### v4.4.0

* Update `mux-embed` to v4.4.2

#### v4.3.0

* Update `mux-embed` to v4.3.0

#### v4.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v4.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v4.0.0

* Update `mux-embed` to v4.0.0
* Support server-side device detection
* Internal fixes and improvements

#### v3.1.4

* update logging around retrieving BANDWIDTH information

#### v3.1.3

* Bump `mux-embed` dependency to `3.4.3`.

#### v3.1.2

* Bump `mux-embed` dependency to `3.4.2`.


# Monitor React native video
This guide walks through integration with react-native-video to collect video performance metrics with Mux Data.
<Callout type="warning" title="Beta SDK">
  This SDK is currently in beta.
  See the [Known Issues](https://github.com/muxinc/mux-stats-sdk-react-native-video#known-issues) and [Caveats](https://github.com/muxinc/mux-stats-sdk-react-native-video#caveats) in the README on GitHub.
</Callout>

## Features

The following data can be collected by the Mux Data SDK when you use the react-native-video SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Available for deployment from a package manager

```

Notes:

```md
Video Quality metrics are not available.
```

## 1. Install Mux Data SDK

Install the Mux Data SDK for react-native-video in your application.

```npm
npm install --save @mux/mux-data-react-native-video
```

```yarn
yarn add @mux/mux-data-react-native-video
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Wrap your `Video` component with the `muxReactNativeVideo` higher-order-component.

```jsx
import app from './package.json' // this is your application's package.json
import Video from 'react-native-video'; // import Video from react-native-video like you normally would
import muxReactNativeVideo from '@mux/mux-data-react-native-video';

// wrap the `Video` component with Mux functionality
const MuxVideo = muxReactNativeVideo(Video);

// Pass the same props to `MuxVideo` that you would pass to the
// `Video` element. All of these props will be passed through to your underlying react-native-video component
// Include a new prop for `muxOptions`
<MuxVideo
  style={styles.video}
  source={{
    uri:
      'https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8',
  }}
  controls
  muted
  muxOptions={{
    application_name: app.name,            // (required) the name of your application
    application_version: app.version,      // the version of your application (optional, but encouraged)
    data: {
      env_key: 'YOUR_ENVIRONMENT_KEY',     // (required)
      video_id: 'My Video Id',             // (required)
      video_title: 'My awesome video',
      player_software_version: '5.0.2',     // (optional, but encouraged) the version of react-native-video that you are using
      player_name: 'React Native Player',  // See metadata docs for available metadata fields /docs/web-integration-guide#section-5-add-metadata
    },
  }}
/>
```

## 3. Make your data actionable

The required fields in the `muxOptions` that you pass into the `MuxVideo` component are `application_name`, `data.env_key` and `data.video_id`. However, without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` on initialization.

```js
  muxOptions={{
    application_name: app.name,            // (required) the name of your application
    application_version: app.version,      // the version of your application (optional, but encouraged)
    data: {
      env_key: 'ENV_KEY',
      // Site Metadata
      viewer_user_id: '', // ex: '12345'
      experiment_name: '', // ex: 'player_test_A'
      sub_property_id: '', // ex: 'cus-1'
      // Player Metadata
      player_name: '', // ex: 'My Main Player'
      player_version: '', // ex: '1.0.0'
      player_init_time: '', // ex: 1451606400000
      // Video Metadata
      video_id: '', // ex: 'abcd123'
      video_title: '', // ex: 'My Great Video'
      video_series: '', // ex: 'Weekly Great Videos'
      video_duration: '', // in milliseconds, ex: 120000
      video_stream_type: '', // 'live' or 'on-demand'
      video_cdn: '' // ex: 'Fastly', 'Akamai'
    },
  }}
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first initialize the Mux SDK. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
MuxVideo.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Advanced options

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
  muxOptions={{
    application_name: app.name,              // (required) the name of your application
    application_version: app.version,        // the version of your application (optional, but encouraged)
    beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
    data: {
      env_key: 'ENV_KEY',
      // Site Metadata
      viewer_user_id: '', // ex: '12345'
      experiment_name: '', // ex: 'player_test_A'
      sub_property_id: '', // ex: 'cus-1'
      // Player Metadata
      player_name: '', // ex: 'My Main Player'
      player_version: '', // ex: '1.0.0'
      player_init_time: '', // ex: 1451606400000
      // Video Metadata
      video_id: '', // ex: 'abcd123'
      video_title: '', // ex: 'My Great Video'
      video_series: '', // ex: 'Weekly Great Videos'
      video_duration: '', // in milliseconds, ex: 120000
      video_stream_type: '', // 'live' or 'on-demand'
      video_cdn: '' // ex: 'Fastly', 'Akamai'
    },
  }}
});
```

<LinkedHeader step={steps[6]} />

### Current release

#### v0.19.10

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v0.19.9

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v0.19.8

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v0.19.7

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v0.19.6

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v0.19.5

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v0.19.4

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v0.19.3

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v0.19.2

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v0.19.1

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v0.19.0

* React Native custom onProgress handling

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v0.18.3

* Fix for race condition between rebuffering and pause events

#### v0.18.2

* Update `mux-embed` to v5.9.0

#### v0.18.1

* fix issue where updateData wasn't exposed, and issues with player\_is\_paused reporting

#### v0.18.0

* expose updateData method on MuxVideo element

#### v0.17.6

* Update `mux-embed` to v5.8.3

#### v0.17.5

* Update `mux-embed` to v5.8.2

#### v0.17.4

* Update `mux-embed` to v5.8.1

#### v0.17.3

* Update `react-native-video` version and Add the Mux, Inc Apple team to the demo app
* Update `mux-embed` to v5.8.0

#### v0.17.2

* Update `mux-embed` to v5.7.0

#### v0.17.1

* Update `mux-embed` to v5.6.0

#### v0.17.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v0.16.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v0.16.2

* Update `mux-embed` to v5.4.2

#### v0.16.1

* Update `mux-embed` to v5.4.1

#### v0.16.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v0.15.6

* Update `mux-embed` to v5.3.3

#### v0.15.5

* Update `mux-embed` to v5.3.2

#### v0.15.4

* Update `mux-embed` to v5.3.1

#### v0.15.3

* Update `mux-embed` to v5.3.0

#### v0.15.2

* Update `mux-embed` to v5.2.1

#### v0.15.1

* Update `mux-embed` to v5.2.0

#### v0.15.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v0.14.4

* Update `mux-embed` to v5.0.0

#### v0.14.3

* Update `mux-embed` to v4.30.0

#### v0.14.2

* Update `mux-embed` to v4.29.0

#### v0.14.1

* Update `mux-embed` to v4.28.1

#### v0.14.0

* Add renditionchange events for Android

* Introduces error tracking

* Bug fix for rebuffering metrics

* Update `mux-embed` to v4.28.0

#### v0.13.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v0.12.3

* Update `mux-embed` to v4.26.0

#### v0.12.2

* Update `mux-embed` to v4.25.1

#### v0.12.1

* Update `mux-embed` to v4.25.0

#### v0.12.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v0.11.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v0.10.3

* Update `mux-embed` to v4.22.0

#### v0.10.2

* Update `mux-embed` to v4.21.0

#### v0.10.1

* Update `mux-embed` to v4.20.0

#### v0.10.0

* Improve accuracy of react-native-video rebuffer tracking

* Update `mux-embed` to v4.19.0

#### v0.9.0

* Allow for timeupdates less than 250ms

#### v0.8.1

* Update `mux-embed` to v4.18.0

#### v0.8.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v0.7.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v0.6.6

* Update `mux-embed` to v4.15.0

#### v0.6.5

* Update `mux-embed` to v4.14.0

#### v0.6.4

* Update `mux-embed` to v4.13.4

#### v0.6.3

* Update `mux-embed` to v4.13.3

#### v0.6.2

* Update `mux-embed` to v4.13.2

#### v0.6.1

* Update `mux-embed` to v4.13.1

#### v0.6.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v0.5.8

* Publish package to NPM

#### v0.5.7

* Update `mux-embed` to v4.12.1

#### v0.5.6

* Update `mux-embed` to v4.12.0

#### v0.5.5

* Update `mux-embed` to v4.11.0

#### v0.5.4

* Update `mux-embed` to v4.10.0

#### v0.5.3

* Update `mux-embed` to v4.9.4

#### v0.5.2

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v0.5.1

* Update `mux-embed` to v4.9.2

#### v0.5.0

* We now expose the emit function the SDK uses which allows developers to manually invoke an event emission.

#### v0.4.6

* Update `mux-embed` to v4.9.1

#### v0.4.5

* Update `mux-embed` to v4.9.0

#### v0.4.4

* Update `mux-embed` to v4.8.0

#### v0.4.3

* Update `mux-embed` to v4.7.0

#### v0.4.2

* Update `mux-embed` to v4.6.2

#### v0.4.1

* Update `mux-embed` to v4.6.1

#### v0.4.0

* Bump mux-embed to 4.6.0

#### v0.3.0

* Fix an issue where `playerID` is `null` when wrapping the component with react-native-video-controls.

#### v0.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v0.1.0

* Initial release


# Monitor Kaltura Web
This guide walks through integration with the [Kaltura web player](https://github.com/kaltura/kaltura-player-js) to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the Kaltura web SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Custom Beacon Domain
- Extraction of HLS Session Data
- Live Stream Latency metric

```


## 1. Install @mux/mux-data-kaltura

Include the Mux JavaScript SDK on every page of your web app that includes video.

```npm
npm install --save @mux/mux-data-kaltura
```

```yarn
yarn add @mux/mux-data-kaltura
```

```cdn
<script src="https://src.litix.io/kaltura/1/kaltura-mux.js"></script>
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Under the Kaltura `plugins` option, pass in the mux configuration with key `mux`.
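
As a minimal sketch (the `targetId`, `partnerId`, and other values here are placeholders — use the values for your own Kaltura account):

```javascript
var kalturaPlayer = KalturaPlayer.setup({
  targetId: 'kalturaPlayer',   // placeholder: id of your player's container element
  provider: {
    partnerId: 1234567,        // placeholder: your Kaltura partner ID
  },
  plugins: {
    mux: {
      data: {
        env_key: '<YOUR_ENV_KEY>', // required
      },
    },
  },
});
```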

Log in to the Mux dashboard, find the environment that corresponds to your `env_key`, and look for video views. It takes a minute or two after a view is tracked for it to show up on the Metrics tab.

**If you aren't seeing data**, check to see if you have an ad blocker, tracking blocker or some kind of network firewall that prevents your player from sending requests to Mux Data servers.

## 3. Make your data actionable

The only required field in the `options` that you pass into `@mux/mux-data-kaltura` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key, in the `mux` plugin configuration.

```html

<div id="kalturaPlayer" style="width: 560px; height: 395px"></div>
<script>
  var config = {
    targetId: 'kalturaPlayer',
    sources: {
      progressive: [
        {
          mimetype: 'video/mp4',
          url: 'https://muxed.s3.amazonaws.com/leds.mp4',
        },
      ],
    },
    provider: {
      partnerId: 4298703,
    },
    playback: {
      autoplay: false,
    },
    plugins: {
      mux: {
        data: {
          env_key: '<YOUR_ENV_KEY>', // required
          // Metadata
          player_name: 'Kaltura Player', // ex: 'My Main Player'
          // ... and other metadata
        },
      },
    },
  };
  var kalturaPlayer = KalturaPlayer.setup(config);
</script>

```

```javascript

import initKalturaMux from "@mux/mux-data-kaltura";

initKalturaMux(KalturaPlayer);

var config = {
  targetId: 'kalturaPlayer',
  sources: {
    progressive: [
      {
        mimetype: 'video/mp4',
        url: 'https://muxed.s3.amazonaws.com/leds.mp4',
      },
    ],
  },
  provider: {
    partnerId: "<PARTNER ID>",
  },
  playback: {
    autoplay: false,
  },
  plugins: {
    mux: {
      data: {
        env_key: '<YOUR_ENV_KEY>', // required
        // Metadata
        player_name: 'Kaltura Player', // ex: 'My Main Player'
        // ... and other metadata
      },
    },
  },
};

KalturaPlayer.setup(config);

```



For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first initialize the Mux SDK. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
// player is the instance returned by the `KalturaPlayer.setup` function
player.mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by the `KalturaPlayer.setup` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by the `KalturaPlayer.setup` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
var kalturaPlayer = KalturaPlayer.setup({
  // ...
  plugins: {
    mux: {
      debug: false,
      disableCookies: true,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when it is set within browsers. If you would like to change this behavior, pass `respectDoNotTrack: true` in the options passed to Mux (the default is `false`).

```js
var kalturaPlayer = KalturaPlayer.setup({
  // ...
  plugins: {
    mux: {
      respectDoNotTrack: true,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `@mux/mux-data-kaltura` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the Mux monitor.

```js
// player is the instance returned by the `KalturaPlayer.setup` function
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.
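
As an illustrative sketch of this grouping (the error types and helper here are hypothetical, not part of the Mux API):

```javascript
// Hypothetical helper that buckets player failures into shared codes.
// Codes >= 100 follow the custom-error convention described above.
function categorizeError(error) {
  // Two distinct network failures share one code but keep specific messages
  if (error.type === 'manifest_timeout' || error.type === 'segment_timeout') {
    return { player_error_code: 100, player_error_message: 'Network error: ' + error.type };
  }
  if (error.type === 'drm_license_denied') {
    return { player_error_code: 101, player_error_message: 'DRM license request failed' };
  }
  // Fallback bucket for anything uncategorized
  return { player_error_code: 199, player_error_message: 'Unknown player error' };
}

// You could then pass the result to the emitter shown above:
// player.mux.emit('error', categorizeError(err));
```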

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*

### Error translator

If your player emits error events that are not fatal to playback, or if its errors are unclear or lack helpful information in the default error message and code, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

var kalturaPlayer = KalturaPlayer.setup({
  // ...
  plugins: {
    mux: {
      errorTranslator,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
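
For example, a translator that drops errors your player flags as recoverable might look like this (the `recoverable` flag is hypothetical — use whatever signal your player provides):

```javascript
function errorTranslator (error) {
  // Hypothetical: skip tracking when the player marks the error recoverable
  if (error.recoverable) {
    return false; // returning false tells Mux not to track this error
  }
  // Pass fatal errors through unchanged
  return {
    player_error_code: error.player_error_code,
    player_error_message: error.player_error_message,
    player_error_context: error.player_error_context
  };
}
```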

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
var kalturaPlayer = KalturaPlayer.setup({
  // ...
  plugins: {
    mux: {
      automaticErrorTracking: false,
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

### Ads tracking with `@mux/mux-data-kaltura`

Mux supports Kaltura's playkit-js-ima plugin for pre-, mid-, and post-roll ads. Simply configure these plugins as you would normally, and Mux will track ads automatically. No additional configuration is needed.
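
As a sketch, that configuration might look like the following — the `ima` plugin options are defined by playkit-js-ima, and the ad tag URL shown is a placeholder:

```javascript
var kalturaPlayer = KalturaPlayer.setup({
  // ...
  plugins: {
    ima: {
      adTagUrl: '<YOUR_AD_TAG_URL>', // placeholder: your ad server's VAST tag
    },
    mux: {
      data: {
        env_key: '<YOUR_ENV_KEY>',
        // Ads are tracked automatically; no extra Mux configuration needed
      },
    },
  },
});
```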

Other Kaltura ad integrations have not been tested, but may work out of the box. Please contact us with any questions.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
var kalturaPlayer = KalturaPlayer.setup({
  // ...
  plugins: {
    mux: {
      beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
      data: {
        env_key: "ENV_KEY",
        // ...
      }
    }
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v1.9.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v1.9.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v1.9.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v1.9.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v1.9.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v1.9.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking
  * Updated dependency: `mux-embed` to v5.14.0

#### v1.9.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v1.9.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v1.9.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v1.9.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v1.9.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v1.9.7

* Update `mux-embed` to v5.9.0

#### v1.9.6

* Update `mux-embed` to v5.8.3

#### v1.9.5

* Update `mux-embed` to v5.8.2

#### v1.9.4

* Update `mux-embed` to v5.8.1

#### v1.9.3

* Update `mux-embed` to v5.8.0

#### v1.9.2

* Update `mux-embed` to v5.7.0

#### v1.9.1

* Update `mux-embed` to v5.6.0

#### v1.9.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v1.8.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v1.8.2

* Update `mux-embed` to v5.4.2

#### v1.8.1

* Update `mux-embed` to v5.4.1

#### v1.8.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v1.7.6

* Update `mux-embed` to v5.3.3

#### v1.7.5

* Update `mux-embed` to v5.3.2

#### v1.7.4

* Update `mux-embed` to v5.3.1

#### v1.7.3

* Update `mux-embed` to v5.3.0

#### v1.7.2

* Update `mux-embed` to v5.2.1

#### v1.7.1

* Update `mux-embed` to v5.2.0

#### v1.7.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v1.6.5

* Update `mux-embed` to v5.0.0

#### v1.6.4

* Update `mux-embed` to v4.30.0

#### v1.6.3

* Update `mux-embed` to v4.29.0

#### v1.6.2

* Update `mux-embed` to v4.28.1

#### v1.6.1

* Update `mux-embed` to v4.28.0

#### v1.6.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v1.5.3

* Update `mux-embed` to v4.26.0

#### v1.5.2

* Update `mux-embed` to v4.25.1

#### v1.5.1

* Update `mux-embed` to v4.25.0

#### v1.5.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v1.4.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v1.3.5

* Update `mux-embed` to v4.22.0

#### v1.3.4

* Update `mux-embed` to v4.21.0

#### v1.3.3

* Update `mux-embed` to v4.20.0

#### v1.3.2

* Update `mux-embed` to v4.19.0

#### v1.3.1

* Update `mux-embed` to v4.18.0

#### v1.3.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v1.2.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v1.1.6

* Record `request_url` and `request_id` with network events
* Update `mux-embed` to v4.15.0

#### v1.1.5

* Update `mux-embed` to v4.14.0

#### v1.1.4

* Update `mux-embed` to v4.13.4

#### v1.1.3

* Update `mux-embed` to v4.13.3

#### v1.1.2

* Update `mux-embed` to v4.13.2

#### v1.1.1

* Update `mux-embed` to v4.13.1

#### v1.1.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v1.0.14

* Publish package to NPM

#### v1.0.13

* Update `mux-embed` to v4.12.1

#### v1.0.12

* Update `mux-embed` to v4.12.0

#### v1.0.11

* Update `mux-embed` to v4.11.0

#### v1.0.10

* Update `mux-embed` to v4.10.0

#### v1.0.9

* Update `mux-embed` to v4.9.4

#### v1.0.8

* Update `mux-embed` to v4.9.3

#### v1.0.7

* Update `mux-embed` to v4.9.2

#### v1.0.6

* Update `mux-embed` to v4.9.1

#### v1.0.5

* Update `mux-embed` to v4.9.0

#### v1.0.4

* Update `mux-embed` to v4.8.0

#### v1.0.3

* Update `mux-embed` to v4.7.0

#### v1.0.2

* Update `mux-embed` to v4.6.2

#### v1.0.1

* Update `mux-embed` to v4.6.1

#### v1.0.0

* Bump mux-embed to 4.6.0

#### v1.0.0-beta.1

* Update mux-embed to v4.4.2 to support latency metrics

#### v1.0.0-beta.0

* First beta release of the Kaltura SDK for web


# Monitor Kaltura Player (iOS and tvOS)
This guide walks through integration with the iOS and tvOS Kaltura player to collect video performance metrics with Mux Data.
Mux Data `Mux-Stats-Kaltura` supports iOS 13.0 or newer and tvOS 13.0 or newer. The Mux integration with Kaltura is built on top of Mux's core Objective-C SDK, and the full code can be seen here: [muxinc/mux-stats-sdk-kaltura-ios](https://github.com/muxinc/mux-stats-sdk-kaltura-ios).

This SDK is built with `XCFramework` bundle type and supports Mac Catalyst.

## Features

The following data can be collected by the Mux Data SDK when you use the Kaltura iOS/tvOS SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Available for deployment from a package manager
- Custom Dimensions
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Customizable Error Tracking
- Ads metrics

```

Notes:

```md
Packaged with: CocoaPods.
```

## 1. Install the Mux Data SDK

### Installing with SwiftPM

1. In Xcode click "File" > "Swift Packages" > "Add Package Dependency..."
2. The package repository URL is `https://github.com/muxinc/mux-stats-sdk-kaltura-ios.git`
3. Click `next`.
4. Select dependency resolution options. We recommend setting the "Rules" to install the latest version and choosing the option "Up to Next Major".

Note that `MUXSDKStatsKaltura` has a dependency on `MuxCore`, so you will see that `MuxCore` gets installed as well.

> As of Xcode 14.3.1 integrating the Mux SDKs as part of a shared framework using Swift Package Manager is now supported.

### Installing with CocoaPods

To install with CocoaPods, modify your Podfile to use frameworks by including `use_frameworks!` and then add the following pods to your Podfile:

```
pod 'Mux-Stats-Kaltura', '~>3.0'
```

This will install `Mux-Stats-Kaltura` and the latest release of our [core Objective-C library](https://github.com/muxinc/stats-sdk-objc). There will be no breaking updates within a major version, so you can safely run `pod update` to pick up future releases.

Next, add the correct import statement to your application.

```swift
import MUXSDKKaltura
```

## 2. Initialize the monitor for your Kaltura player instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

The example below uses `monitorPlayer(player:playerName:customerData:)`.

The `playerName` parameter is a string that identifies this instance of your player. When calling `destroyPlayer` later on, you will need this string. Each instance of a player that runs simultaneously in your application should have a different `playerName`.

```swift

let playerName = "iOS KalturaPlayer"
let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
playerData?.playerName = self.playerName

let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Title Video Kaltura"
videoData.videoId = "my-video-id"

let viewData = MUXSDKCustomerViewData()
viewData.viewSessionId = "my-session-id"

let customData = MUXSDKCustomData()
customData.customData1 = "my-custom-data"

let viewerData = MUXSDKCustomerViewerData()
viewerData.viewerApplicationName = "my-app-name"

let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: viewData,
    customData: customData,
    viewerData: viewerData
)

guard let player = self.kalturaPlayer, let data = customerData else {
    return
}

MUXSDKStats.monitorPlayer(
    player: player,
    playerName: playerName,
    customerData: data
)

```



For more complete examples check the [demo apps in the repo](https://github.com/muxinc/mux-stats-sdk-kaltura-ios/tree/main/apps/DemoApp).

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux Data dashboard. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

## 3. Make your data actionable

The only required field is `env_key`. But without some more metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Metadata fields are provided via the `MUXSDKCustomerPlayerData` and `MUXSDKCustomerVideoData` objects.

For the full list of properties, view the header files for these interfaces:

* [MUXSDKCustomerPlayerData.h](https://github.com/muxinc/stats-sdk-objc/blob/master/XCFramework/MuxCore.xcframework/ios-arm64/MuxCore.framework/Headers/MUXSDKCustomerPlayerData.h)
* [MUXSDKCustomerVideoData.h](https://github.com/muxinc/stats-sdk-objc/blob/master/XCFramework/MuxCore.xcframework/ios-arm64/MuxCore.framework/Headers/MUXSDKCustomerVideoData.h)

For more details about each property, view the [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) guide.

```swift

let playerName = "My Main Player"
let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
playerData.experimentName = "player_test_A"
playerData.playerName = playerName
playerData.playerVersion = "1.0.0"

let videoData = MUXSDKCustomerVideoData()
videoData.videoId = "abcd123"
videoData.videoTitle = "My Great Video"
videoData.videoSeries = "Weekly Great Videos"
videoData.videoDuration = 120000 // in milliseconds
videoData.videoIsLive = false
videoData.videoCdn = "cdn"

let viewData = MUXSDKCustomerViewData()
viewData.viewSessionId = "my session id"

let customData = MUXSDKCustomData()
customData.customData1 = "Custom data 1"
customData.customData2 = "Custom Data 2"

let viewerData = MUXSDKCustomerViewerData()
viewerData.viewerApplicationName = "MUX Kaltura DemoApp"

let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: viewData,
    customData: customData,
    viewerData: viewerData
)

guard let player = self.kalturaPlayer, let data = customerData else {
    return
}

MUXSDKStats.monitorPlayer(
    player: player,
    playerName: playerName,
    customerData: data
)

```



## 4. Set or update metadata after monitor

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `monitorPlayer`. Then, once you have the metadata, you can update it with the `setCustomerDataForPlayer` method.

```swift

// Sometime later before the player is destroyed you can do this:
// The player name ("iOS KalturaPlayer" in this example) should be a player that
// you have already called `monitorPlayer` method with
let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Big Buck Bunny"
videoData.videoSeries = "Updated animation"
// In this example we are updating videoData, but the same can be done
// for updating playerData, customData or viewData
// the values in customerData passed as nil will keep previously set data
// viewerData can't be updated
guard let customerData = MUXSDKCustomerData(
    customerPlayerData: nil,
    videoData: videoData,
    viewData: nil,
    customData: nil,
    viewerData: nil
) else {
    return
}
MUXSDKStats.setCustomerDataForPlayer(name: "iOS KalturaPlayer", customerData: customerData)

```



## 5. Advanced

### Changing the Video

There are two cases where the underlying tracking of the video view needs to be reset: first, when you load a new source URL into an existing player, and second, when the program within a singular stream changes (such as a program change within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

#### New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `videoChangeForPlayer`, which will remove all previous video data and reset all metrics for the video view. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It is required to call `videoChangeForPlayer` immediately before telling the player which new source to play.

```swift

// Example of changing the media in Kaltura Player
// Call Mux's videoChange before stop, because PlayKit's stop will replace the current item with nil
let playerData = MUXSDKCustomerPlayerData(environmentKey: self.environmentKey)
playerData?.playerName = self.playerName
        
let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Apple Video Kaltura"
videoData.videoId = "apple"
videoData.videoSeries = "conference"
        
let viewData = MUXSDKCustomerViewData()
viewData.viewSessionId = "my second session id"
        
let customData = MUXSDKCustomData()
customData.customData1 = "Kaltura test video change"
        
let viewerData = MUXSDKCustomerViewerData()
viewerData.viewerApplicationName = "MUX Kaltura DemoApp"
        
guard let customerData = MUXSDKCustomerData(
    customerPlayerData: playerData,
    videoData: videoData,
    viewData: viewData,
    customData: customData,
    viewerData: viewerData
) else {
    return
}
        
MUXSDKStats.videoChangeForPlayer(name: "iOS KalturaPlayer", customerData: customerData)
// Change media in your player (your steps may vary)
// For example:

// Resets The Player And Prepares for Change Media
self.kalturaPlayer?.stop()
        
// Prepare PlayKit player
self.kalturaPlayer?.prepare(newMediaConfig)
        
// Wait for `canPlay` event to play
self.kalturaPlayer?.addObserver(self, events: [PlayerEvent.canPlay]) { event in
    self.kalturaPlayer?.play()
}

```




#### New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `programChangeForPlayer:name:customerData`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.
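
Mirroring the `videoChangeForPlayer` example above, a program change might be dispatched like this (the metadata values are illustrative):

```swift
// Build metadata for the new program within the continuous stream
let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "Evening News"        // illustrative program metadata
videoData.videoId = "evening-news-episode-1"

// Passing nil for the other customer data keeps previously set values
guard let customerData = MUXSDKCustomerData(
    customerPlayerData: nil,
    videoData: videoData,
    viewData: nil,
    customData: nil,
    viewerData: nil
) else {
    return
}

MUXSDKStats.programChangeForPlayer(name: "iOS KalturaPlayer", customerData: customerData)
```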

### Usage with Google Interactive Media Ads (IMA)

If you are using Google Interactive Media Ads and the `PlayKit_IMA` SDK, you can track ad playback events by installing the `mux-stats-google-ima-kaltura-ios` companion package.

Please note: A fully functioning `PlayKit_IMA` integration is required for ad playback tracking in your iOS or tvOS application.

Add the following to your Podfile and run `pod install`:

```
pod 'Mux-Stats-Google-IMA-Kaltura', '~> 2.0.0'
```

Initialize the Mux monitor with `MUXSDKStats.monitorPlayer`. Create a listener instance by calling `MUXSDKImaKalturaListener(playerBinding: playerBinding, player: player)`, then start dispatching events by calling `start` on the listener instance.

```swift
import MUXSDKStatsKaltura

// Follow the instructions from pod 'PlayKit_IMA' to set up
// your IMA plugin configuration in the loadPlayer method
//
// When you call `monitorPlayer:withPlayer:playerName:customerData:`
// from your ViewController, you will get back a MUXSDKPlayerBinding object
let playerBinding = MUXSDKStats.monitorPlayer(
  player: player,
  playerName: self.playerName,
  customerData: data
)

// Use the MUXSDKPlayerBinding object and the Player instance to initialize the MUXSDKImaKalturaListener class
// and call start on the listener object
let listener = MUXSDKImaKalturaListener(playerBinding: playerBinding, player: player)
listener.start()
```

You can find a [complete example here](https://github.com/muxinc/mux-stats-sdk-kaltura-ios/tree/main/apps/DemoApp).

## Track orientation change events

You can optionally track `orientationchange` events. To use this functionality, call the `orientationChangeForPlayer` method.

These events will show up in the events log on the video views page.

```swift
override func viewWillTransition(to size: CGSize, with coordinator: UIViewControllerTransitionCoordinator) {
    super.viewWillTransition(to: size, with: coordinator)
    let orientation = UIDevice.current.orientation.isLandscape ? MUXSDKViewOrientation.landscape : MUXSDKViewOrientation.portrait
    MUXSDKStats.orientationChangeForPlayer(name: "iOS KalturaPlayer", orientation: orientation)
}
```



## Handling errors manually

By default, `automaticErrorTracking` is enabled, which means the Mux SDK will catch errors that the player throws and track an `error` event. Error tracking is meant for fatal errors. When an error is thrown, it will mark the view as having encountered an error in the Mux dashboard, and the view will no longer be monitored.

If you want to disable automatic error tracking and track errors manually, you can do so by passing `automaticErrorTracking: false` to the `monitorPlayer` method that you are using.

Whether automatic error tracking is enabled or disabled, you can dispatch errors manually with `dispatchError`.

```swift
let playerName = "iOS KalturaPlayer"
let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
// ...insert player metadata

let videoData = MUXSDKCustomerVideoData()
// ...insert video metadata

let customerData = MUXSDKCustomerData(customerPlayerData: playerData, videoData: videoData, viewData: nil, customData: nil, viewerData: nil)

guard let player = self.kalturaPlayer, let data = customerData else {
    return
}

MUXSDKStats.monitorPlayer(player: player, playerName: playerName, customerData: data, automaticErrorTracking: false)

// Later, you can dispatch an error yourself
MUXSDKStats.dispatchErrorForPlayer(name: playerName, code: "1234", message: "Something is not right")
```



<LinkedHeader step={steps[6]} />

### Current release

#### v4.0.0

Improvements:

* Include privacy manifest file to satisfy [upcoming privacy requirements for App Store submissions](https://developer.apple.com/news/?id=3d8a9yyh)
* Update Mux Core and Kaltura player dependencies

### Previous releases

#### v3.0.0

Improvements:

* Repackage SDK as source distribution
* Add Swift Package Manager support
* Raise minimum deployment targets to iOS 13 and tvOS 13

Breaking:

* Rename module name from `Mux_Stats_Google_IMA_Kaltura` to `MUXSDKStatsKaltura`

Known Issues:

* CocoaPods pod spec linting fails on Xcode 14.3 and above due to CocoaPods/CocoaPods issue #11839. As a workaround, use `xcode-select` to switch to Xcode 14.2 before linting.

#### v2.0.1

Fixes:

* Fix build issues in react-native projects

#### v2.0.0

Fixes:
* Update MuxCore dependency and rebuild with recent tools. Some linkage changes have been necessary, but you shouldn't see any issues. You may require Xcode 14 to use this version of the Data SDK for Kaltura.

#### v1.1.1

* Fix: Change minimum deployment target to iOS 9.0 and tvOS 9.0

#### v1.1.0

* Test: Unit test for destroy player
* Feature: Support for Google IMA SDK Listener

#### v1.0.0

* Fix: Missing play event
* Fix: Improve rendition change detection
* Fix: Missing rebuffering metrics
* Test: Unit test infrastructure
* Test: Add test coverage

#### v0.3.0

* Third beta release of the Kaltura SDK for iOS
* Adds tvOS support
* Adds tvOS target to DemoApp and updates example project

#### v0.2.0

* Second beta release of the Kaltura SDK for iOS
* Adds `setCustomerDataForPlayer` to update metadata after the monitor call
* Adds `videoChangeForPlayer` to update metadata when a video change occurs
* Adds `programChangeForPlayer:name:customerData:` to update metadata when a program changes within a stream
* Adds `orientationChangeForPlayer` to track orientation changes
* Adds manual error tracking with `dispatchError`

#### v0.1.0

* First beta release of the Kaltura SDK for iOS


# Monitor Kaltura Player (Android)
This guide walks through integration with the [Kaltura PlayKit and TVPlayer for Android](https://github.com/kaltura/playkit-android), version v4.16.0 or higher, to collect video performance metrics with Mux Data.

The Mux integration with `Kaltura` is built on top of Mux's core Java SDK, and the full code can be seen here: [muxinc/mux-stats-sdk-kaltura-android](https://github.com/muxinc/mux-stats-sdk-kaltura-android).

## Features

The following data can be collected by the Mux Data SDK when you use the Kaltura Android SDK, as described below.

* Engagement metrics
* Quality of Experience Metrics
* Available for deployment from a package manager
* Custom Dimensions
* Average Bitrate metrics and `renditionchange` events
* Ads metrics
* Ads metadata


## 1. Install the Mux Data SDK

Add the Mux Maven repository to your Gradle file:

```text
repositories {
    maven {
        url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
    }
}
```

Next, add a dependency to your Gradle file using the Mux SDK version in the following format:

```text
api 'com.mux.stats.sdk.muxstats:MuxKalturaSDK:(Mux SDK version)'
```

Example using Mux Kaltura SDK 0.1.0:

```text
api 'com.mux.stats.sdk.muxstats:MuxKalturaSDK:0.1.0'
```

## 2. Initialize the monitor with your Kaltura player instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Our SDK supports Kaltura PlayKit and TVPlayer v4.16.0 or higher.

First, create the `CustomerPlayerData` and `CustomerVideoData` objects as appropriate for your current playback, and be sure to set your `ENV_KEY`.

```java
import com.mux.stats.sdk.core.model.CustomerData;
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.core.model.CustomerViewData;
// ...
CustomerPlayerData customerPlayerData = new CustomerPlayerData();
customerPlayerData.setEnvironmentKey("ENV_KEY");
CustomerVideoData customerVideoData = new CustomerVideoData();
customerVideoData.setVideoTitle("The most epic video ever");
CustomerViewData customerViewData = new CustomerViewData();
customerViewData.setViewSessionId("A26C4C2F-3C8A-46FB-885A-8D973F99A998");
CustomerData customerData = new CustomerData(customerPlayerData, customerVideoData, customerViewData);
```

Next, create the `MuxStatsKaltura` object by passing your Android `Context` (typically your `Activity`), the player instance, a player name, and the customer data object. The following example shows how to instantiate the SDK using the TVPlayer "KalturaPlayer" (represented by the variable `player`). For a PlayKit-only player just pass in your raw `com.kaltura.playkit.Player` reference in place of the KalturaPlayer.

```java
MuxNetworkRequests network = new MuxNetworkRequests();
muxStats = new MuxStatsKaltura(this, player, "my-player-name", customerData, new CustomOptions().setSentryEnabled(false), network);
```

In order to correctly monitor if the player is full-screen, provide the screen size to the `MuxStatsKaltura` instance.

```java
Point size = new Point();
getWindowManager().getDefaultDisplay().getSize(size);
muxStats.setScreenSize(size.x, size.y);
muxStats.enableMuxCoreDebug(true, false);
```

Finally, when you are destroying the player, call the `MuxStatsKaltura.release()` method.

```java
muxStats.release();
```

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux Data dashboard. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

## 3. Add Metadata

In the Java SDK, options are provided via the `CustomerPlayerData`, `CustomerVideoData`, and `CustomerViewData` objects.

All metadata details except for `envKey` are optional; however, you'll be able to compare and see more interesting results as you include more details. This gives you more metrics and metadata about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Advanced

## Changing the video

There are two cases where the underlying tracking of the video view must be reset. First, when you load a new source URL into an existing player, and second when the program within a singular stream changes (such as a program within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `MuxStatsKaltura.videoChange(CustomerVideoData)` which will remove all previous video data and reset all metrics for the video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

It's best to change the video info immediately after telling the player which new source to play.

### New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `MuxStatsKaltura.programChange(CustomerVideoData)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

## Error tracking

By default, Mux's integration with Kaltura automatically tracks fatal errors as thrown by the Kaltura player. If a fatal error happens outside the context of Kaltura player and you want to track it with Mux, you can call `MuxStatsKaltura.error` like this:

```java
// Error code: integer value for the generic type of error that
// occurred.
// Error message: String providing more information on the error
// that occurred.
// For example, the HTML5 video element uses the
// following: https://developer.mozilla.org/en-US/docs/Web/API/MediaError
// for codes and messages. Feel free to use your own codes and messages
int errorCode = 1;
String errorMessage = "A fatal error was encountered during playback";
MuxErrorException error = new MuxErrorException(errorCode, errorMessage);
muxStats.error(error);
```

Note that `MuxStatsKaltura.error(MuxErrorException e)` can be used with or without automatic error tracking. If your application has retry logic that attempts to recover from Kaltura player errors then you may want to disable automatic error tracking like this:

```java
muxStats.setAutomaticErrorTracking(false);
```

<Callout type="warning">
  It is important that you only trigger an error when the playback has to be abandoned or aborted in an unexpected manner, as Mux tracks fatal playback errors only.
</Callout>

### Sentry

In order to improve our SDKs, Mux utilizes [Sentry](https://sentry.io) to track exceptions that our SDK may throw. No personal data is captured by Mux's SDK in these error reports, but if you want to disable this functionality, you can. This should be managed through the `CustomOptions` object passed to the constructor.

```java
muxStats = new MuxStatsKaltura(this, player, "my-player-name", customerData, new CustomOptions().setSentryEnabled(false), network);
```

## Release notes

### Current release

#### v0.2.0

Improvements:

* Update to MuxCore 7.8, adds `CustomerViewerData` to `CustomerData`

### Previous releases

#### v0.1.0

Feature:

* First beta release of the Kaltura SDK for Android


# Monitor JW Player
This guide walks through integration with [JW Player](https://www.jwplayer.com/) for the web to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the JW Player SDK, as described below.

* Engagement metrics
* Quality of Experience Metrics
* Web metrics such as Player Startup Time, Page Load Time, etc
* Available for deployment from a package manager
* Custom Dimensions
* Average Bitrate metrics and `renditionchange` events
* Customizable Error Tracking
* Ads metrics
* Custom Beacon Domain
* Extraction of HLS Session Data
* Live Stream Latency metric


## 1. Install `@mux/mux-data-jwplayer`

Include the Mux JavaScript SDK on every page of your web app that includes video.

```npm
npm install --save @mux/mux-data-jwplayer
```

```yarn
yarn add @mux/mux-data-jwplayer
```

```cdn

<!-- Include jwplayer-mux after the core JW Player JavaScript file -->
<!--  Note that the KEY in the example should be replaced with the key
provided by JW Player for your account. -->
<script src="https://content.jwplatform.com/libraries/KEY.js"></script>
<script src="https://src.litix.io/jwplayer/4/jwplayer-mux.js"></script>

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```html

<!--Call jwplayer like you normally would and get a reference to the player.
Call initJWPlayerMux with the player reference and the SDK options.-->

<div id="my-player"></div>
<script>
  const conf = {
    // Insert JW Player configuration here
  };

  const playerInitTime = initJWPlayerMux.utils.now();
  const player = jwplayer('my-player').setup(conf);

  // Initialize Mux Data monitoring
  initJWPlayerMux(player, {
    debug: false,
    data: {
      env_key: 'EXAMPLE_ENV_KEY', // required

      // Metadata
      player_name: '', // ex: 'My Main Player'
      player_init_time: playerInitTime // ex: 1451606400000

      // ... and other metadata
    }
  });
</script>

```

```javascript

import initJWPlayerMux from '@mux/mux-data-jwplayer';

const conf = {
  // Insert JW Player configuration here
};
const playerInitTime = initJWPlayerMux.utils.now();
const player = jwplayer('my-player').setup(conf);

initJWPlayerMux(player, {
  debug: false,
  data: {
    env_key: 'EXAMPLE_ENV_KEY', // required

    // Metadata
    player_name: '', // ex: 'My Main Player'
    player_init_time: playerInitTime // ex: 1451606400000

    // ... and other metadata
  }
});

```



Be sure to call `initJWPlayerMux` immediately after initializing JW Player so that Mux can attach as soon as possible.

## 3. Make your data actionable

The only required field in the `options` that you pass into `@mux/mux-data-jwplayer` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key on initialization.

```js
initJWPlayerMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000, you can use `initJWPlayerMux.utils.now()`
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after video playback has started. In this case, omit those values when you first call `initJWPlayerMux`. Then, once you have the metadata, update it with the `updateData` method.

```js
// player is the instance returned by the `jwplayer` function
player.mux.updateData({ video_title: 'My Updated Great Video' });
```
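If the extra metadata arrives asynchronously, one pattern is to filter and forward only the fields you actually received. The sketch below is illustrative, not part of the SDK; `applyLateMetadata` is a hypothetical helper that forwards non-empty `video_`-prefixed fields to `updateData`:

```javascript
// Hypothetical helper: forward only the video_* fields that are present.
function applyLateMetadata(player, metadata) {
  const updates = {};
  for (const [key, value] of Object.entries(metadata)) {
    if (key.startsWith('video_') && value != null) {
      updates[key] = value;
    }
  }
  if (Object.keys(updates).length > 0) {
    // player is the instance returned by the `jwplayer` function
    player.mux.updateData(updates);
  }
  return updates;
}
```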

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).
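The rule above can be sketched as a small guard that emits `videochange` only when the content identity actually changes. `maybeEmitVideoChange` is a hypothetical helper for illustration, keyed on `video_id` as the assumed content identifier:

```javascript
// Hypothetical helper: reset the view only when the content itself changes.
function maybeEmitVideoChange(player, currentMeta, nextMeta) {
  if (currentMeta.video_id === nextMeta.video_id) {
    return false; // same content (e.g. a different resolution) — no reset
  }
  // player is the instance returned by the `jwplayer` function
  player.mux.emit('videochange', nextMeta);
  return true;
}
```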

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by the `jwplayer` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by the `jwplayer` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```
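Since `programchange` must not fire while the player is paused, you might guard the emit on the player state. This is a sketch assuming JW Player's `getState()` method; `emitProgramChange` is a hypothetical helper:

```javascript
// Hypothetical helper: only emit programchange while playback is active.
function emitProgramChange(player, videoMeta) {
  if (player.getState() === 'paused') {
    return false; // emitting while paused skews startup and watch time
  }
  player.mux.emit('programchange', videoMeta);
  return true;
}
```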

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
initJWPlayerMux(player, {
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
initJWPlayerMux(player, {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning that they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `@mux/mux-data-jwplayer` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the Mux monitor.

```js
// player is the instance returned by the `jwplayer` function
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
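As an illustration of this grouping, two distinct network failures can share one `player_error_code` while keeping distinct messages and instance-specific context. All names and code values below are illustrative, not defined by the SDK:

```javascript
// One code for the whole category; codes >= 100 avoid colliding with
// HTMLMediaElement MediaError codes.
const NETWORK_ERROR_CODE = 100;

// Hypothetical helper mapping a failure kind to the fields Mux expects.
function describeNetworkError(kind, detail) {
  const messages = {
    timeout: 'Network request timed out',
    dns: 'Network request failed to resolve host',
  };
  return {
    player_error_code: NETWORK_ERROR_CODE,
    player_error_message: messages[kind] || 'Network request failed',
    player_error_context: detail, // instance-specific, e.g. a segment URL
  };
}
```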

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear or lack helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

initJWPlayerMux(player, {
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
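For example, a translator might drop error codes that your player emits as recoverable warnings. The specific codes below are illustrative, not JW Player's:

```javascript
// Illustrative set of codes this application treats as non-fatal.
const NON_FATAL_CODES = new Set([301, 302]);

function errorTranslator(error) {
  if (NON_FATAL_CODES.has(error.player_error_code)) {
    return false; // returning false tells Mux not to track this error
  }
  return error; // pass fatal errors through unchanged
}
```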

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
initJWPlayerMux(player, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Ads tracking with `@mux/mux-data-jwplayer`

Mux supports JW Player's VAST integration for pre-, mid-, and post-roll ads. Simply configure these plugins as you would normally, and Mux will track ads automatically. No additional configuration is needed.

Other JW Player ad integrations, such as Google IMA and FreeWheel, have not been tested, but may work out of the box. Please contact us with any questions.

### Latency metrics with `@mux/mux-data-jwplayer`

Mux supports latency metrics by parsing the incoming HLS manifest. JW Player allows us to intercept the manifest response using an [`onXhrOpen` hook](https://docs.jwplayer.com/players/reference/playlists#playlistsources).
This is not available in Safari browsers where HLS is played natively.

```js
var player = jwplayer('my-player').setup({
  playlist: [{
    sources: [{
      file: 'video.m3u8',
      onXhrOpen: function(xhr, url) {
        player.mux && player.mux.onXhrOpen(xhr, url);
      }
    }]
  }]
});

// Initialize Mux Data monitoring
initJWPlayerMux(player, {
  // ...
});
```

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
initJWPlayerMux(player, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v4.20.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v4.20.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v4.20.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v4.20.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v4.20.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v4.20.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v4.20.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes
  * Updated dependency: `mux-embed` to v5.13.0

#### v4.20.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v4.20.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v4.20.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v4.20.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v4.20.7

* Update `mux-embed` to v5.9.0

#### v4.20.6

* Update `mux-embed` to v5.8.3

#### v4.20.5

* Update `mux-embed` to v5.8.2

#### v4.20.4

* Update `mux-embed` to v5.8.1

#### v4.20.3

* Update `mux-embed` to v5.8.0

#### v4.20.2

* Update `mux-embed` to v5.7.0

#### v4.20.1

* Update `mux-embed` to v5.6.0

#### v4.20.0

* Add error details from sourceError to error context

#### v4.19.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v4.18.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v4.18.2

* Update `mux-embed` to v5.4.2

#### v4.18.1

* Update `mux-embed` to v5.4.1

#### v4.18.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v4.17.7

* Update `mux-embed` to v5.3.3

#### v4.17.6

* add support for dropped frame count

#### v4.17.5

* Update `mux-embed` to v5.3.2

#### v4.17.4

* Update `mux-embed` to v5.3.1

#### v4.17.3

* Update `mux-embed` to v5.3.0

#### v4.17.2

* Update `mux-embed` to v5.2.1

#### v4.17.1

* Update `mux-embed` to v5.2.0

#### v4.17.0

* Collect additional data on rendition change: height, width, rendition name

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v4.16.0

* Add opt-in TypeScript Types to Mux Embed and use + refactor for other dependent data SDKs.

* Update `mux-embed` to v5.0.0

#### v4.15.4

* Update `mux-embed` to v4.30.0

#### v4.15.3

* Update `mux-embed` to v4.29.0

#### v4.15.2

* Update `mux-embed` to v4.28.1

#### v4.15.1

* Update `mux-embed` to v4.28.0

#### v4.15.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v4.14.3

* Update `mux-embed` to v4.26.0

#### v4.14.2

* Update `mux-embed` to v4.25.1

#### v4.14.1

* Update `mux-embed` to v4.25.0

#### v4.14.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v4.13.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v4.12.0

* Emit `requestfailed` events and include more detailed information from JW Player in the Mux Error Context

* Update `mux-embed` to v4.22.0

#### v4.11.4

* Update `mux-embed` to v4.21.0

#### v4.11.3

* Update `mux-embed` to v4.20.0

#### v4.11.2

* Update `mux-embed` to v4.19.0

#### v4.11.1

* Update `mux-embed` to v4.18.0

#### v4.11.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v4.10.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v4.9.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Record `request_url` and `request_id` with network events

* Update `mux-embed` to v4.15.0

#### v4.8.5

* Update `mux-embed` to v4.14.0

#### v4.8.4

* Update `mux-embed` to v4.13.4

#### v4.8.3

* Update `mux-embed` to v4.13.3

#### v4.8.2

* Update `mux-embed` to v4.13.2

#### v4.8.1

* Update `mux-embed` to v4.13.1

#### v4.8.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v4.7.12

* Publish package to NPM

#### v4.7.11

* Display an error message if the JW player is removed but the Mux monitor is not destroyed
* Update `mux-embed` to v4.12.1

#### v4.7.10

* Update `mux-embed` to v4.12.0

#### v4.7.9

* Update `mux-embed` to v4.11.0

#### v4.7.8

* Update `mux-embed` to v4.10.0

#### v4.7.7

* Update `mux-embed` to v4.9.4

#### v4.7.6

* Update `mux-embed` to v4.9.3

#### v4.7.5

* Update `mux-embed` to v4.9.2

#### v4.7.4

* Update `mux-embed` to v4.9.1

#### v4.7.3

* Update `mux-embed` to v4.9.0

#### v4.7.2

* Update `mux-embed` to v4.8.0

#### v4.7.1

* Update `mux-embed` to v4.7.0

#### v4.7.0

* Introducing HLS Session Data Support

* Update `mux-embed` to v4.6.2

#### v4.6.1

* Update `mux-embed` to v4.6.1

#### v4.6.0

* Bump mux-embed to 4.6.0

#### v4.5.0

* Update mux-embed to v4.4.2
* Use JW error codes for `player_error_code` on errors

#### v4.4.0

* Add support for latency metrics

#### v4.3.1

* Remove unneeded debug logging

#### v4.3.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v4.2.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v4.1.0

* Improve metrics by sending the `playing` and `adplaying` events at more appropriate times

#### v4.0.0

* Update mux-embed to v4.0.0
* Support server-side device detection

#### v3.1.0

* add rendition change event for getting bitrate metrics

#### v3.0.0

* bump `mux-embed` dependency to `3.0.0`


# Monitor JW Player (iOS)
This guide walks through integration with [JW Player](https://www.jwplayer.com/) for iOS to collect video performance metrics with Mux Data.
To integrate Mux Data tracking with JW Player, you need to be using JW Player `3.x` or later. You will also need a JW Player license key and an iOS app with a working implementation of `JWPlayer-SDK`.

## Features

The following data can be collected by the Mux Data SDK when you use the JW Player (iOS) SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Live Stream Latency metric

```


## 1. Install Mux Data SDK

```ruby
pod 'Mux-Stats-JWPlayer', '~> 0.3'
```

This will install `Mux-Stats-JWPlayer` and the latest release of our [core Objective-C library](https://github.com/muxinc/stats-sdk-objc).

## 2. Initialize the Mux monitor

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Next, import `MUXSDKStatsJWPlayer` into your application and call `MUXSDKStatsJWPlayer.monitorJWPlayerController`, passing in your JW player instance and metadata.

```swift
import MUXSDKStatsJWPlayer

class VideoPlayerController: UIViewController {
  var player: JWPlayerController?

  override func viewDidLoad() {
    super.viewDidLoad()
    let config = JWConfig()
    config.file = "http://example.com/hls.m3u8"
    player = JWPlayerController(config: config)
  }

  override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    player!.view!.frame = self.view.bounds
    view.addSubview(player!.view)

    let playName = "iOS JW player"
    let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
    // insert player metadata
    let videoData = MUXSDKCustomerVideoData()
    // insert video metadata
    MUXSDKStatsJWPlayer.monitorJWPlayerController(player!, name: playName, delegate: nil, playerData: playerData!, videoData: videoData)
    player!.play()
  }
}
```

## Register a delegate (optional)

If your own ViewController implements `<JWPlayerDelegate>` and you want to use it, then pass that in as the delegate argument to `monitorJWPlayerController`. See the example below:

```swift
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    player!.view!.frame = self.view.bounds
    view.addSubview(player!.view)

    let playName = "iOS JW player"
    let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")
    // insert player metadata
    let videoData = MUXSDKCustomerVideoData()
    // insert video metadata
    // pass in `self` as the delegate
    MUXSDKStatsJWPlayer.monitorJWPlayerController(player!, name: playName, delegate: self, playerData: playerData!, videoData: videoData)
    player!.play()
}

// example of implementing a delegate method
func onReady(_ event: JWEvent & JWReadyEvent) {
  // this will get called when JWPlayer triggers onReady
}
```

## 3. Make your data actionable

The only required field is `env_key`. But without some more metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Metadata fields are provided via the `MUXSDKCustomerPlayerData` and `MUXSDKCustomerVideoData` objects.

For the full list of properties, view the header files for these interfaces:

* [MUXSDKCustomerPlayerData.h](https://github.com/muxinc/stats-sdk-objc/blob/master/XCFramework/MuxCore.xcframework/ios-arm64/MuxCore.framework/Headers/MUXSDKCustomerPlayerData.h)
* [MUXSDKCustomerVideoData.h](https://github.com/muxinc/stats-sdk-objc/blob/master/XCFramework/MuxCore.xcframework/ios-arm64/MuxCore.framework/Headers/MUXSDKCustomerVideoData.h)

For more details about each property, view the [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) guide.

```swift
let playName = "iOS JW player"
let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")!
playerData.viewerUserId = "1234"
playerData.experimentName = "player_test_A"
// note that the 'playerName' field here is unrelated to the 'playName' variable above
playerData.playerName = "My Main Player"
playerData.playerVersion = "1.0.0"

let videoData = MUXSDKCustomerVideoData()
videoData.videoId = "abcd123"
videoData.videoTitle = "My Great Video"
videoData.videoSeries = "Weekly Great Videos"
videoData.videoDuration = 120000 // in milliseconds
videoData.videoIsLive = false
videoData.videoCdn = "cdn"

MUXSDKStatsJWPlayer.monitorJWPlayerController(player!, name: playName, delegate: self, playerData: playerData, videoData: videoData)
```


# Monitor Android MediaPlayer
This guide walks through integration with Android MediaPlayer to collect video performance metrics with Mux data.
This guide covers integration with Android's `MediaPlayer` class. The integration supports Android 4.2 (API level 17) and newer, though older versions of Android have spotty support for streaming protocols such as HLS and DASH.

The Mux integration with MediaPlayer is built on top of Mux's core Java SDK, and the full code can be seen here: [muxinc/mux-stats-sdk-mediaplayer](https://github.com/muxinc/mux-stats-sdk-mediaplayer).

## Features

The following data can be collected by the Mux Data SDK when you use the Android MediaPlayer SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Available for deployment from a package manager

```

Notes:

```md
Video Quality metrics are not available.
```

## 1. Install the Mux Data SDK

The easiest way to get the AAR is to download the latest version from: [muxinc/mux-stats-sdk-mediaplayer releases](https://github.com/muxinc/mux-stats-sdk-mediaplayer/releases).

If you would prefer to build it yourself, first clone [the repo](https://github.com/muxinc/mux-stats-sdk-mediaplayer). Then, you can do one of the following:

1. Open the project in Android Studio and build the release variant of the `MuxMediaPlayer` module. You can then find the AAR in `mux-stats-sdk-mediaplayer/MuxMediaPlayer/build/outputs/aar/MuxMediaPlayer-release.aar`.
2. Build the AAR directly:

```sh
./gradlew :MuxMediaPlayer:assembleRelease
```

We recommend using Android Studio's new-module tool, which can be accessed via `File > New > New Module...`. Select `Import .JAR/.AAR Package` and then choose the AAR that you downloaded or built. This correctly configures the IDE and updates your build configuration (Gradle/Maven).
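If you prefer to declare the dependency by hand instead of using the module tool, a minimal module-level `build.gradle` entry might look like the following sketch. The `libs/` location is an assumption; adjust the path to wherever you placed the AAR you downloaded or built.

```groovy
// Module-level build.gradle -- assumes the AAR was copied into this module's libs/ directory
dependencies {
    implementation files('libs/MuxMediaPlayer-release.aar')
}
```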

For an example integration, you can see the demo application within [this repo](https://github.com/muxinc/mux-stats-sdk-mediaplayer) which integrates Mux into the MediaPlayer demo application.

## 2. Initialize the monitor with your MediaPlayer instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

First, create the `CustomerPlayerData` and `CustomerVideoData` objects as appropriate for your current playback, and be sure to set your `ENV_KEY`.

```java
import com.mux.stats.core.models.CustomerPlayerData;
import com.mux.stats.core.models.CustomerVideoData;
// ...
CustomerPlayerData customerPlayerData = new CustomerPlayerData();
customerPlayerData.setEnvironmentKey("ENV_KEY");
CustomerVideoData customerVideoData = new CustomerVideoData();
customerVideoData.setVideoTitle("My great video");
```

Next, create the `MuxStatsMediaPlayer` object by passing your Android `Context` (typically your `Activity`), the `MediaPlayer` instance, a player name, and the customer data objects.

```java
import com.mux.stats.sdk.muxstats.mediaplayer.MuxStatsMediaPlayer;
// ...
muxStatsMediaPlayer = new MuxStatsMediaPlayer(this, player, "demo-player", customerPlayerData, customerVideoData);
```

In order to correctly monitor if the player is full-screen, provide the screen size to the `MuxStatsMediaPlayer` instance.

```java
Point size = new Point();
getWindowManager().getDefaultDisplay().getSize(size);
muxStatsMediaPlayer.setScreenSize(size.x, size.y);
```

In order to determine a number of viewer context values as well as track the size of the video player, set the player view.

```java
muxStatsMediaPlayer.setPlayerView(playerView);
```

To allow `MuxStatsMediaPlayer` to listen for various `MediaPlayer` events, add it as a listener. `MediaPlayer` only allows single listeners, so if your activity or application also needs to listen to these events, use the helper methods to wrap your listener implementation with `MuxStatsMediaPlayer`'s listener implementation.

```java
player.setOnCompletionListener(muxStatsMediaPlayer.getOnCompletionListener(myCompletionListener));
player.setOnErrorListener(muxStatsMediaPlayer.getOnErrorListener(myErrorListener));
player.setOnPreparedListener(muxStatsMediaPlayer.getOnPreparedListener(this));
player.setOnInfoListener(muxStatsMediaPlayer.getOnInfoListener(null));  // No wrapped listener.
player.setOnSeekCompleteListener(muxStatsMediaPlayer.getOnSeekCompleteListener(null));  // No wrapped listener.
player.setOnVideoSizeChangedListener(muxStatsMediaPlayer.getOnVideoSizeChangedListener(myVideoSizeChangedListener));
```

Finally, when you are destroying the player, call the `MuxStatsMediaPlayer.release()` method.

```java
muxStatsMediaPlayer.release();
```

## 3. Set up required events

`MediaPlayer` does not provide listener callbacks for all necessary events, so you must add explicit calls into `MuxStatsMediaPlayer` at the same time that certain `MediaPlayer` methods are invoked:

* `start`: [view docs](https://developer.android.com/reference/android/media/MediaPlayer.html#start%28%29)
* `pause`: [view docs](https://developer.android.com/reference/android/media/MediaPlayer.html#pause%28%29)
* `seekTo`: [view docs](https://developer.android.com/reference/android/media/MediaPlayer.html#seekTo%28int%29)

For example, in the demo, a [MediaController view](https://developer.android.com/reference/android/widget/MediaController) is used to control the `MediaPlayer` instance, and the appropriate `MuxStatsMediaPlayer` methods are invoked in the
[MediaPlayerControl](https://developer.android.com/reference/android/widget/MediaController.MediaPlayerControl) implementation used to link the two instances.

```java
private class MediaPlayerControl implements MediaController.MediaPlayerControl,
        MediaPlayer.OnBufferingUpdateListener {
    @Override
    public void start() {
        if (player != null) {
            player.start();
            muxStats.play();
        }
    }

    @Override
    public void pause() {
        if (player != null) {
            player.pause();
            muxStats.pause();
        }
    }

    @Override
    public void seekTo(int pos) {
        if (player != null) {
            player.seekTo(pos);
            muxStats.seeking();
        }
    }
}
```

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux data dashboard. Login to the dashboard and find the environment that corresponds to your `env_key` and look for video views.

## 4. Make your data actionable

In the MediaPlayer SDK, options are provided via the `CustomerPlayerData` and `CustomerVideoData` objects.

All metadata details except the environment key are optional; however, you'll be able to compare and see more interesting results as you include more details. Metadata gives you more context about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 5. Advanced options

## Changing the video

There are two cases where the underlying tracking of the video view needs to be reset: first, when you load a new source URL into an existing player, and second, when the program within a single stream changes (such as a program change within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New Source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `muxStatsMediaPlayer.videoChange(CustomerVideoData)` which will remove all previous video data and reset all metrics for the video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

It's best to change the video info immediately after telling the player which new source to play.

### New Program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `muxStatsMediaPlayer.programChange(CustomerVideoData)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.
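Concretely, the two calls look like this (the titles are illustrative; populate whatever `video`-prefixed fields apply to your content):

```java
// New source: the player is now playing a different video
CustomerVideoData newVideo = new CustomerVideoData();
newVideo.setVideoTitle("My next great video");
muxStatsMediaPlayer.videoChange(newVideo); // resets all metrics for the view

// New program within the same continuous stream
CustomerVideoData newProgram = new CustomerVideoData();
newProgram.setVideoTitle("Evening news");
muxStatsMediaPlayer.programChange(newProgram); // starts a new video view
```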

## Error tracking

By default, Mux's integration with MediaPlayer automatically tracks fatal errors as thrown by MediaPlayer. In some applications, however, you may want to disable this and track errors on your own, especially if you have retry logic in your application to try to recover from errors that MediaPlayer encounters.

In this case, there are two things that you need to do:

1. Turn off the automatic error tracking. To do this, call `muxStatsMediaPlayer.setAutomaticErrorTracking(false)`
2. When your application encounters a fatal error that it cannot recover from, call `muxStatsMediaPlayer.error(MuxErrorException e)`, including a message and a code.

The following is an example of firing a custom error.

```java
// Error code: integer value for the generic type of error that
// occurred.
// Error message: String providing more information on the error
// that occurred.
// For an example, the HTML5 video element uses the
// following: https://developer.mozilla.org/en-US/docs/Web/API/MediaError
// for codes and messages. Feel free to use your own codes and messages
int errorCode = 1;
String errorMessage = "A fatal error was encountered during playback";
MuxErrorException error = new MuxErrorException(errorCode, errorMessage);
muxStatsMediaPlayer.error(error);
```

It is important that you only trigger an error when the playback has to be abandoned or aborted in an unexpected manner, as Mux tracks fatal playback errors only.

<LinkedHeader step={steps[6]} />

### Current release

#### v0.1.0

* Initial integration with MediaPlayer


# Monitor Bitmovin player
This guide walks through integration with Bitmovin player to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the Bitmovin SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Available for deployment from a package manager
- Customizable Error Tracking
- Ads metrics
- Ads metadata
- Custom Beacon Domain

```


## 1. Install `@mux/mux-data-bitmovin`

Include the Mux JavaScript SDK on every page of your web app that includes video.

```npm
npm install --save @mux/mux-data-bitmovin
```

```yarn
yarn add @mux/mux-data-bitmovin
```

```cdn

<!-- Include bitmovin-mux after the core Bitmovin javascript file -->
<script src="https://cdn.bitmovin.com/player/web/8/bitmovinplayer.js"></script>
<script src="https://src.litix.io/bitmovin/5/bitmovin-mux.js"></script>

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Create the player with `bitmovin.player.Player` like you normally would, then call `initBitmovinMux` with the player reference and the SDK options.

```html

<div id="my-player"></div>
<script>
  // Record the player init time
  const playerInitTime = initBitmovinMux.utils.now();
  // Configure the player as appropriate for your version
  const conf = {
      // Insert player configuration here
  };
  // It is preferred to retrieve the reference from the return of
  // the initialization rather than on a player callback so that
  // Mux can track events as soon as possible.
  // For 5.x, 6.x, and 7.x this may look different
  const container = document.getElementById('my-player');
  const source = {
     // Insert source config here
  };
  var player = new bitmovin.player.Player(container, conf);
  player.load(source);

  initBitmovinMux(player, {
    debug: false,
    data: {
      env_key: 'ENV_KEY', // required
      // Metadata
      player_name: '', // ex: 'My Main Player'
      player_init_time: playerInitTime // ex: 1451606400000
      // ... and other metadata
    }
  });
</script>

```

```javascript

import initBitmovinMux from "@mux/mux-data-bitmovin";
// Record the player init time
const playerInitTime = initBitmovinMux.utils.now();
// Configure the player as appropriate for your version
const conf = {
    // Insert player configuration here
};
// It is preferred to retrieve the reference from the return of
// the initialization rather than on a player callback so that
// Mux can track events as soon as possible.
// For 5.x, 6.x, and 7.x this may look different
const container = document.getElementById('my-player');
const source = {
    // Insert source config here
};
var player = new bitmovin.player.Player(container, conf);
player.load(source);

initBitmovinMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata
    player_name: '', // ex: 'My Main Player'
    player_init_time: playerInitTime // ex: 1451606400000
    // ... and other metadata
  }
}, bitmovin);

```



## 3. Make your data actionable

The only required field in the `options` that you pass into `@mux/mux-data-bitmovin` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` on initialization.

```js
initBitmovinMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000, can use `initBitmovinMux.utils.now()`
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `initBitmovinMux`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
// player is the instance returned by the `bitmovin.player.Player` function
player.mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by the `bitmovin.player.Player` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by the `bitmovin.player.Player` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
initBitmovinMux(player, {
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
initBitmovinMux(player, {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `@mux/mux-data-bitmovin` tracks errors emitted from the video element as fatal errors. If a fatal error happens outside the context of the player, you can emit a custom error to the Mux monitor.

```js
// player is the instance returned by the `bitmovin.player.Player` function
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
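Putting these guidelines together, one approach is to normalize raw errors into a small set of aggregatable codes before emitting them, keeping instance-specific detail in `player_error_context`. The sketch below is illustrative only; the code values and the `rawError` shape are assumptions, not part of the Mux API:

```javascript
// Map raw player errors onto a small set of aggregatable codes (>= 100),
// keeping per-instance detail (like a stack trace) in player_error_context.
function normalizeError(rawError) {
  const message = rawError.message || 'Unknown playback error';
  let code;
  if (/network|timeout|fetch/i.test(message)) {
    code = 100; // all network-related failures share one code
  } else if (/decode|codec/i.test(message)) {
    code = 101; // all decoding failures share another
  } else {
    code = 102; // catch-all for other fatal errors
  }
  return {
    player_error_code: code,
    player_error_message: message,
    player_error_context: rawError.stack || '',
  };
}
```

You could then pass the result straight to the monitor, e.g. `player.mux.emit('error', normalizeError(rawError))`.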

### Error translator

If your player emits error events that are not fatal to playback, or its errors are unclear or lack helpful information in the default error messages and codes, you may find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
  };
}

initBitmovinMux(player, {
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
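As a concrete sketch, a translator that drops known non-fatal warning codes and passes everything else through untouched might look like this (the specific warning codes are hypothetical and depend on your player's error scheme):

```javascript
// Hypothetical codes your player emits as recoverable warnings
const NON_FATAL_CODES = new Set([1001, 1002]);

function errorTranslator(error) {
  if (NON_FATAL_CODES.has(error.player_error_code)) {
    return false; // non-fatal: Mux will not track this error
  }
  // Pass fatal errors through unchanged
  return {
    player_error_code: error.player_error_code,
    player_error_message: error.player_error_message,
  };
}
```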

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
initBitmovinMux(player, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Ads tracking with `@mux/mux-data-bitmovin`

Mux supports Bitmovin's VAST advertising client for pre-, mid-, and post-roll ads. Configure the advertising client as you normally would, and Mux will track ads automatically; no additional configuration is needed.

The metrics for preroll request and response times, as well as number of requests, are pending an update to Bitmovin's API. Everything else will operate normally, but those metrics may be missing.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
initBitmovinMux(player, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v6.4.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v6.4.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v6.4.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v6.4.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v6.4.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v6.4.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v6.4.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v6.4.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v6.4.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v6.4.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v6.4.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v6.4.7

* Update `mux-embed` to v5.9.0

#### v6.4.6

* Update `mux-embed` to v5.8.3

#### v6.4.5

* Update `mux-embed` to v5.8.2

#### v6.4.4

* Update `mux-embed` to v5.8.1

#### v6.4.3

* Update `mux-embed` to v5.8.0

#### v6.4.2

* Update `mux-embed` to v5.7.0

#### v6.4.1

* Update `mux-embed` to v5.6.0

#### v6.4.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v6.3.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v6.3.2

* Update `mux-embed` to v5.4.2

#### v6.3.1

* Update `mux-embed` to v5.4.1

#### v6.3.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v6.2.6

* Update `mux-embed` to v5.3.3

#### v6.2.5

* Update `mux-embed` to v5.3.2

#### v6.2.4

* Update `mux-embed` to v5.3.1

#### v6.2.3

* Update `mux-embed` to v5.3.0

#### v6.2.2

* Update `mux-embed` to v5.2.1

#### v6.2.1

* Update `mux-embed` to v5.2.0

#### v6.2.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v6.1.0

* Refactors for stricter data types (e.g. string vs. number) based on TypeScript types.

* Update `mux-embed` to v5.0.0

#### v6.0.3

* Update `mux-embed` to v4.30.0

#### v6.0.2

* Update `mux-embed` to v4.29.0

#### v6.0.1

* Update `mux-embed` to v4.28.1

#### v6.0.0

* fix an issue when using modular v8 imports for Bitmovin player

* Update `mux-embed` to v4.28.0

#### v5.12.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v5.11.3

* Update `mux-embed` to v4.26.0

#### v5.11.2

* Update `mux-embed` to v4.25.1

#### v5.11.1

* Update `mux-embed` to v4.25.0

#### v5.11.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v5.10.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v5.9.4

* Update `mux-embed` to v4.22.0

#### v5.9.3

* Update `mux-embed` to v4.21.0

#### v5.9.2

* Update `mux-embed` to v4.20.0

#### v5.9.1

* Update `mux-embed` to v4.19.0

#### v5.9.0

* Set Mux Error Context with additional error information from Bitmovin player

#### v5.8.1

* Update `mux-embed` to v4.18.0

#### v5.8.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v5.7.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v5.6.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Update `mux-embed` to v4.15.0

#### v5.5.5

* Update `mux-embed` to v4.14.0

#### v5.5.4

* Update `mux-embed` to v4.13.4

#### v5.5.3

* Update `mux-embed` to v4.13.3

#### v5.5.2

* Update `mux-embed` to v4.13.2

#### v5.5.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v5.5.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v5.4.8

* Publish package to NPM

#### v5.4.7

* Update `mux-embed` to v4.12.1

#### v5.4.6

* Update `mux-embed` to v4.12.0

#### v5.4.5

* Provide a more friendly error message if the Bitmovin instance is not available
* Update `mux-embed` to v4.11.0

#### v5.4.4

* Update `mux-embed` to v4.10.0

#### v5.4.3

* Update `mux-embed` to v4.9.4

#### v5.4.2

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v5.4.1

* Update `mux-embed` to v4.9.2

#### v5.4.0

* Support Bitmovin module-based player

#### v5.3.6

* Update `mux-embed` to v4.9.1

#### v5.3.5

* Update `mux-embed` to v4.9.0

#### v5.3.4

* Update `mux-embed` to v4.8.0

#### v5.3.3

* Update `mux-embed` to v4.7.0

#### v5.3.2

* Update `mux-embed` to v4.6.2

#### v5.3.1

* Update `mux-embed` to v4.6.1

#### v5.3.0

* Bump mux-embed to 4.6.0

#### v5.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v5.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v5.0.0

* Update mux-embed to v4.0.0
* Support server-side device detection

#### v4.0.0

* remove support for version 5 of the Bitdash player
* allow passing of global `bitmovin` object, rather than requiring it be on `window`

#### v3.1.1

* fix an issue where manifests with `EXT-X-PROGRAM-DATE-TIME` could cause issues with video startup time

#### v3.1.0

* bugfix for `aderror` tracking

#### v3.0.1

* fix ad tracking on latest releases of the Bitmovin v7 and v8 players
* improve ad tracking for Bitmovin v8

#### v3.0.0

* bump `mux-embed` dependency to `3.0.0`


# Monitor Bitmovin Player Android
This guide walks through integration with the Bitmovin Player Android SDK to collect video performance metrics with Mux Data.
This documents integration instructions for [Bitmovin's `Bitmovin Player` library](https://bitmovin.com/docs/player/api-reference/android/android-sdk-api-reference-v3#/player/android/3/docs/index.html), version 3.x and 2.x.

The Mux integration with `Bitmovin Player` is built on top of Mux's core Java SDK, and the full code can be seen here: [muxinc/mux-stats-sdk-bitmovin-android](https://github.com/muxinc/mux-stats-sdk-bitmovin-android).

## Features

The following data can be collected by the Mux Data SDK when you use the Bitmovin Player Android SDK, as described below.

* Engagement metrics
* Quality of Experience metrics
* Available for deployment from a package manager


## 1. Install the Mux Data SDK

Add the Mux SDK to your project using Gradle:

### Add Gradle dependency on the Mux Bitmovin Player SDK

Add the Mux Maven repository to your Gradle file:

```groovy
repositories {
    maven {
        url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
    }
}
```

Next, add a dependency on the Mux Data Bitmovin Player SDK. We support both `minapi16` and `minapi21` as separate artifacts.

The current version is `v0.5.2`. Additional releases can be found on our [releases page](https://github.com/muxinc/mux-stats-sdk-bitmovin-android/releases).

### Bitmovin Player support

We support version `3.11.1` of Bitmovin Player. Support for additional versions is planned.

```groovy
implementation 'com.mux.stats.sdk.muxstats:muxstatssdkbitmovinplayer_r3_11_1:[CurrentVersion]'
```

## 2. Initialize the monitor with your Bitmovin Player instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

First, create the `CustomerPlayerData` and `CustomerVideoData` objects as appropriate for your current playback, and be sure to set your `ENV_KEY`.

```java
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.core.model.CustomerViewData;
import com.mux.stats.sdk.core.model.CustomData;
import com.mux.stats.sdk.core.model.CustomerData;

CustomerPlayerData customerPlayerData = new CustomerPlayerData();
customerPlayerData.setEnvironmentKey("YOUR_ENVIRONMENT_KEY_HERE");

CustomerVideoData customerVideoData = new CustomerVideoData();
customerVideoData.setVideoTitle(intent.getStringExtra("YOUR_VIDEO_TITLE"));

CustomerViewData customerViewData = new CustomerViewData();
customerViewData.setViewSessionId("A26C4C2F-3C8A-46FB-885A-8D973F99A998");

CustomData customData = new CustomData();
customData.setCustomData1("YOUR_CUSTOM_STRING_HERE");

CustomerData customerData = new CustomerData(customerPlayerData, customerVideoData, customerViewData);
customerData.setCustomData(customData);
```

Next, create the `MuxStatsSDKBitmovinPlayer` object by passing your Android `Context` (typically your `Activity`), a Bitmovin `PlayerView` instance, a player name, and the customer data objects.

```java
import com.mux.stats.sdk.muxstats.MuxStatsSDKBitmovinPlayer;
...
// Make sure to monitor the player before calling `prepare` on the Bitmovin Player instance
muxStatsBitmovinPlayer = new MuxStatsSDKBitmovinPlayer(
  this, player, "demo-player", customerData);
```

To correctly monitor whether the player is full-screen, provide the screen size to the `MuxStatsSDKBitmovinPlayer` instance.

```java
Point size = new Point();
getWindowManager().getDefaultDisplay().getSize(size);
muxStatsBitmovinPlayer.setScreenSize(size.x, size.y);
```

In order to determine a number of viewer context values as well as track the size of the video player, set the player view.

```java
muxStatsBitmovinPlayer.setPlayerView(playerView);
```

Finally, when you are destroying the player, call the `MuxStatsSDKBitmovinPlayer.release()` function.

```java
muxStatsBitmovinPlayer.release();
```

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux Data dashboard. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

## 3. Add Metadata

In the Java SDK, options are provided via the objects within the `CustomerData` object.

All metadata details except for `envKey` are optional, however you'll be able to compare and see more interesting results as you include more details. This gives you more metrics and metadata about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Advanced

### Changing the video

There are two cases where the underlying tracking of the video view needs to be reset. First, when you load a new source URL into an existing player, and second, when the program within a singular stream changes (such as a program within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `muxStatsBitmovinPlayer.videoChange(CustomerVideoData)` which will remove all previous video data and reset all metrics for the video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

It's best to change the video info immediately after telling the player which new source to play.
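As a sketch (with hypothetical values), updating the video info after loading a new source might look like this, where `muxStatsBitmovinPlayer` is the monitor instance created in step 2:

```java
CustomerVideoData newVideoData = new CustomerVideoData();
newVideoData.setVideoId("abc345");                  // hypothetical video ID
newVideoData.setVideoTitle("My Other Great Video"); // hypothetical title

// Removes previous video data, resets all view metrics,
// and starts tracking the new video
muxStatsBitmovinPlayer.videoChange(newVideoData);
```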

### New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `muxStatsBitmovinPlayer.programChange(CustomerVideoData)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.
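For example (a sketch with hypothetical program metadata):

```java
CustomerVideoData programData = new CustomerVideoData();
programData.setVideoTitle("Evening News"); // hypothetical program title

// Closes out the previous view and starts a new one for this program
muxStatsBitmovinPlayer.programChange(programData);
```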

### Error tracking

By default, Mux's integration with Bitmovin Player automatically tracks fatal errors as thrown by Bitmovin Player. If a fatal error happens outside the context of Bitmovin Player and you want to track it with Mux, you can call `muxStatsBitmovinPlayer.error` like this:

```java
// Error code: integer value for the generic type of error that
// occurred.
// Error message: String providing more information on the error
// that occurred.
// For an example, the HTML5 video element uses the
// following: https://developer.mozilla.org/en-US/docs/Web/API/MediaError
// for codes and messages. Feel free to use your own codes and messages
int errorCode = 1;
String errorMessage = "A fatal error was encountered during playback";
MuxErrorException error = new MuxErrorException(errorCode, errorMessage);
muxStatsBitmovinPlayer.error(error);
```

Note that `muxStatsBitmovinPlayer.error(MuxErrorException e)` can be used with or without automatic error tracking. If your application has retry logic that attempts to recover from Bitmovin Player errors then you may want to disable automatic error tracking like this:

```java
muxStatsBitmovinPlayer.setAutomaticErrorTracking(false);
```

<Callout type="warning">
  It is important that you only trigger an error when the playback has to be abandoned or aborted in an unexpected manner, as Mux tracks fatal playback errors only.
</Callout>

<LinkedHeader step={steps[5]} />

### Current release

#### v0.5.2

Fixes:

* Fix `ANRs` during Position Checks (#9)

### Previous releases

#### v0.5.1

Improvements:

* Detect fullscreen using Bitmovin's size listeners instead of guessing from view & screen sizes (#5)

#### v0.5.0

* Initial release


# Monitor castLabs Player (Web)
This guide walks through integration with [castLabs PRESTOplay for Web](https://castlabs.com/prestoplay/web-apps/) to collect video performance metrics with Mux Data.
<Callout type="warning" title="Third-party integration">
  This integration is managed and operated by [castLabs](https://castlabs.com/).
  Feedback should be made by using the [contact form](https://castlabs.com/contact/) or creating a ticket in the [General Helpdesk](https://castlabs.atlassian.net/servicedesk/customer/portal/26).
</Callout>

# Mux Environment Key

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

# Integration Guide

castLabs maintains an online version of the official documentation which you can check out [here](https://demo.castlabs.com/#/docs/analytics#mux-data).


# Monitor castLabs Player (Android)
This guide walks through integration with [castLabs PRESTOplay for Android](https://castlabs.com/prestoplay/android/) to collect video performance metrics with Mux Data.
<Callout type="warning" title="Third-party integration">
  This integration is managed and operated by [castLabs](https://castlabs.com/).
  Feedback should be made by using the [contact form](https://castlabs.com/contact/) or creating a ticket in the [General Helpdesk](https://castlabs.atlassian.net/servicedesk/customer/portal/26).
</Callout>

# Mux Environment Key

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

# Integration Guide

castLabs maintains an online version of the official documentation which you can check out [here](https://players.castlabs.com/android/latest/docs/build/html/extensions.html?highlight=mux#id6).


# Monitor Akamai media player
This guide walks through integration with [Akamai Media Player](https://www.akamai.com/us/en/products/media-delivery/adaptive-media-player.jsp) to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the Akamai Media Player SDK, as described below.

* Engagement metrics
* Quality of Experience metrics
* Web metrics such as Player Startup Time, Page Load Time, etc.
* Custom Dimensions


## 1. Load `@mux/mux-data-akamai` as a plugin

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```npm
npm install --save @mux/mux-data-akamai
```

```yarn
yarn add @mux/mux-data-akamai
```

```cdn
<script src="https://src.litix.io/akamai/3/akamai-mux.js"></script>
```



Register the mux plugin with the `akamai` object.

```html

<div id="my-player"></div>
<script>
akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        // ... and other metadata
      }
    }
  }
});
</script>

```

```javascript

import initAkamaiMux from "@mux/mux-data-akamai";

initAkamaiMux(akamai);

akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        // ... and other metadata
      }
    }
  }
});

```



## 2. Make your data actionable

The only required field in the `data` key that you pass into `plugins.mux` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key on initialization.

```js
akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Site Metadata
        viewer_user_id: '', // ex: '12345'
        experiment_name: '', // ex: 'player_test_A'
        sub_property_id: '', // ex: 'cus-1'
        // Player Metadata
        player_name: '', // ex: 'My Main Player'
        player_version: '', // ex: '1.0.0'
        // Video Metadata
        video_id: '', // ex: 'abcd123'
        video_title: '', // ex: 'My Great Video'
        video_series: '', // ex: 'Weekly Great Videos'
        video_duration: '', // in milliseconds, ex: 120000
        video_stream_type: '', // 'live' or 'on-demand'
        video_cdn: '' // ex: 'Fastly', 'Akamai'
      }
    }
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 3. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by the `akamai.amp.AMP.create` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by the `akamai.amp.AMP.create` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 4. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      debug: false,
      disableCookies: true,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        // ... and other metadata
      }
    }
  }
});
```

### Override 'Do Not Track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      debug: false,
      respectDoNotTrack: true,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        // ... and other metadata
      }
    }
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. If errors are non-fatal, they should not be captured.
</Callout>

By default, `@mux/mux-data-akamai` will track errors emitted from the video element as fatal errors.

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
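As a sketch of putting these three fields together — assuming the `player.mux.emit` interface used for `videochange` above also accepts `error` events — a small helper can build the error payload (the code, message, and context values below are hypothetical):

```javascript
function buildMuxError(code, message, context) {
  return {
    player_error_code: code,       // integer category; use >= 100 for custom errors
    player_error_message: message, // general, aggregatable description
    player_error_context: context, // instance-specific detail; not aggregated
  };
}

// player is the instance returned by `akamai.amp.AMP.create`
// player.mux.emit('error', buildMuxError(100, 'Network error', 'segment 1234 failed to load'));
```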

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear and/or lack helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
  };
}

akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      debug: false,
      respectDoNotTrack: true,
      errorTranslator,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        // ... and other metadata
      }
    }
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.

### Disable automatic error tracking

If you want full control over which errors count as fatal, you can turn off Mux's automatic error tracking completely by passing `automaticErrorTracking: false` in the configuration object.

```js
akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      debug: false,
      respectDoNotTrack: true,
      automaticErrorTracking: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        // ... and other metadata
      }
    }
  }
});
```

### Ads tracking with `@mux/mux-data-akamai`

Ad events are tracked automatically if your player is configured for ads. No additional configuration is needed.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
akamai.amp.AMP.create("#my-player", {
  // ... other player configuration
  plugins: {
    mux: {
      resources: [
        {src: "https://src.litix.io/akamai/3/akamai-mux.js", type: "text/javascript"},
      ],
      // ... various configuration options
      beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        player_init_time: playerInitTime // ex: 1451606400000
        // ... and other metadata
      }
    }
  }
});
```

<LinkedHeader step={steps[5]} />

### Current release

#### v3.11.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v3.11.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v3.11.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v3.11.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v3.11.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v3.11.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v3.11.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v3.11.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v3.11.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v3.11.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v3.11.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v3.11.7

* Update `mux-embed` to v5.9.0

#### v3.11.6

* Update `mux-embed` to v5.8.3

#### v3.11.5

* Update `mux-embed` to v5.8.2

#### v3.11.4

* Update `mux-embed` to v5.8.1

#### v3.11.3

* Update `mux-embed` to v5.8.0

#### v3.11.2

* Update `mux-embed` to v5.7.0

#### v3.11.1

* Update `mux-embed` to v5.6.0

#### v3.11.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v3.10.10

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v3.10.9

* Update `mux-embed` to v5.4.2

#### v3.10.8

* Update `mux-embed` to v5.4.1

#### v3.10.7

* Update `mux-embed` to v5.4.0

#### v3.10.6

* Update `mux-embed` to v5.3.3

#### v3.10.5

* Update `mux-embed` to v5.3.2

#### v3.10.4

* Update `mux-embed` to v5.3.1

#### v3.10.3

* Update `mux-embed` to v5.3.0

#### v3.10.2

* Update `mux-embed` to v5.2.1

#### v3.10.1

* Update `mux-embed` to v5.2.0

#### v3.10.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v3.9.0

* Minor refactors to have strict typing and type inferences available.

* Update `mux-embed` to v5.0.0

#### v3.8.4

* Update `mux-embed` to v4.30.0

#### v3.8.3

* Update `mux-embed` to v4.29.0

#### v3.8.2

* Update `mux-embed` to v4.28.1

#### v3.8.1

* Update `mux-embed` to v4.28.0

#### v3.8.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v3.7.3

* Update `mux-embed` to v4.26.0

#### v3.7.2

* Update `mux-embed` to v4.25.1

#### v3.7.1

* Update `mux-embed` to v4.25.0

#### v3.7.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v3.6.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v3.5.5

* Update `mux-embed` to v4.22.0

#### v3.5.4

* Update `mux-embed` to v4.21.0

#### v3.5.3

* Update `mux-embed` to v4.20.0

#### v3.5.2

* Update `mux-embed` to v4.19.0

#### v3.5.1

* Update `mux-embed` to v4.18.0

#### v3.5.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v3.4.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v3.3.6

* Update `mux-embed` to v4.15.0

#### v3.3.5

* Update `mux-embed` to v4.14.0

#### v3.3.4

* Update `mux-embed` to v4.13.4

#### v3.3.3

* Update `mux-embed` to v4.13.3

#### v3.3.2

* Update `mux-embed` to v4.13.2

#### v3.3.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v3.3.0

* Upgraded internal webpack version

* Export a function to register the mux plugin with Akamai

* Update `mux-embed` to v4.13.0

#### v3.2.14

* Publish package to NPM

#### v3.2.13

* Update `mux-embed` to v4.12.1

#### v3.2.12

* Update `mux-embed` to v4.12.0

#### v3.2.11

* Update `mux-embed` to v4.11.0

#### v3.2.10

* Update `mux-embed` to v4.10.0

#### v3.2.9

* Update `mux-embed` to v4.9.4

#### v3.2.8

* Update `mux-embed` to v4.9.3

#### v3.2.7

* Update `mux-embed` to v4.9.2

#### v3.2.6

* Update `mux-embed` to v4.9.1

#### v3.2.5

* Update `mux-embed` to v4.9.0

#### v3.2.4

* Fix an issue with removing `player_error_code` and `player_error_message` when the error code is `1`.
  Also stops emitting `MEDIA_ERR_ABORTED` as errors.
* Update `mux-embed` to v4.8.0

#### v3.2.3

* Update `mux-embed` to v4.7.0

#### v3.2.2

* Update `mux-embed` to v4.6.2

#### v3.2.1

* Update `mux-embed` to v4.6.1

#### v3.2.0

* Bump mux-embed to 4.6.0

#### v3.1.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v3.0.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly


# Monitor NexPlayer
This guide walks through integration with [NexPlayer](https://nexplayersdk.com/) to collect video performance metrics with Mux Data.
<Callout type="warning">
  # Third-party integration

  This integration is managed and operated by [NexPlayer](https://github.com/NexPlayer/NexPlayer_HTML5_Mux).
  Feedback should be made on the GitHub repo's [Issues](https://github.com/NexPlayer/NexPlayer_HTML5_Mux/issues) page or by contacting NexPlayer support by [email](mailto:supportmadrid@nexplayer.com).
</Callout>

## 1. Install NexPlayer\_HTML5\_Mux

Add the NexPlayer\_HTML5\_Mux plugin to your project by cloning the [GitHub repo](https://github.com/NexPlayer/NexPlayer_HTML5_Mux) or installing using yarn/npm.

```npm
npm install --save https://github.com/NexPlayer/NexPlayer_HTML5_Mux.git
```

```yarn
yarn add https://github.com/NexPlayer/NexPlayer_HTML5_Mux.git
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Add NexPlayer to your solution as you normally would, including the recommended CSS styling. In addition, import the Mux SDK and `NexMuxHandShake.js` in the `<head />` and set `window.muxPlayerInitTime` to the current date/time.

<Callout type="warning">
  # NexPlayer minimum version

  Be sure to use NexPlayer SDK v5.5.3.1 or later, as it contains the functionality necessary to integrate with Mux.
</Callout>

```html
<head>
  <style type="text/css">
    #player_container {
      position: relative;
      padding-top: 28%;
      padding-bottom: 28%;
      left: 28%;
    }

    #player {
      background-color: #191828;
      position: absolute;
      top: 0%;
      width: 50%;
      height: 50%;
    }
  </style>
  <script type="text/javascript" src="https://src.litix.io/core/4/mux.js"></script>
  <script type="text/javascript" src="https://nexplayer.nexplayersdk.com/5.5.3.1/nexplayer.js"></script>
  <script type="text/javascript" src="../node_modules/NexPlayer_HTML5_Mux/app/NexMuxHandShake.js"></script>
  <script type="text/javascript">window.muxPlayerInitTime = Date.now();</script>
</head>
```

Initialize your instance of NexPlayer with a configuration that includes the NexPlayer\_HTML5\_Mux plugin, which activates Mux Data. Be sure to replace `ENV_KEY` and `NEXPLAYER_KEY` with your own values.

```html

<div id="player_container">
  <div id="player"></div>
</div>

<script type="text/javascript">
  var muxConfiguration = {
    debug: false,
    data: {
      env_key: 'ENV_KEY',

      // Metadata
      player_name: '', // ex: 'My Main Player'
      player_init_time: window.muxPlayerInitTime // ex: 1451606400000

      // ... and other metadata
    },
  };

  var player = null;
  var videoElem = null;
  let nexMux = null;

  var callBackWithPlayers = function (nexplayerInstance, videoElement) {
    player = nexplayerInstance;
    videoElem = videoElement;

    videoElem.addEventListener("loadeddata", function() {
      nexMux = new NexMuxHandShake();
      nexMux.useAdMetrics = true;
      nexMux.initMuxData(muxConfiguration);
    });
  }

  nexplayer.Setup({
    key: 'NEXPLAYER_KEY',
    div: document.getElementById('player'),
    callbacksForPlayer: callBackWithPlayers,
    src: 'https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8',
  });
</script>

```



## 3. Make your data actionable

The only required field in the options that you pass into the NexPlayer\_HTML5\_Mux plugin is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `muxConfiguration` on initialization.

```js
var muxConfiguration = {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: 'NexPlayer', // ex: 'My Main Player'
    player_version:  '', // ex: '1.0.0'
    player_init_time: window.muxPlayerInitTime, // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  },
};
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// nexMux is the instance returned by the 
// `new NexMuxHandShake()` in the above example
nexMux.videoChange({
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// nexMux is the instance returned by the 
// `new NexMuxHandShake()` in the above example
nexMux.programChange({
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```
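
A small guard can enforce that rule. This is a sketch (the helper name is hypothetical); `videoElem` and `nexMux` are the objects created in the setup example above:

```javascript
// Hypothetical guard: only emit programchange while playback is active.
// videoElem is the <video> element and nexMux is the NexMuxHandShake
// instance from the setup example above.
function safeProgramChange(videoElem, nexMux, metadata) {
  if (videoElem.paused) {
    // Emitting while paused skews startup time and watch time, so skip it
    return false;
  }
  nexMux.programChange(metadata);
  return true;
}
```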

## 5. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
var muxConfiguration = {
  debug: false,
  disableCookies: true,
  data: {
    env_key: 'ENV_KEY', // required
    // ...
  },
};
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
var muxConfiguration = {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: 'ENV_KEY', // required
    // ...
  },
};
```

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
var muxConfiguration = {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: 'ENV_KEY', // required
    // ...
  },
};
```

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
var muxConfiguration = {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: 'ENV_KEY', // required
    // ...
  },
};
```


# Monitor Ooyala player
This guide walks through integration with Ooyala player to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the Ooyala SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Custom Dimensions
- Customizable Error Tracking
- Ads metrics
- Custom Beacon Domain

```


## 1. Install `ooyala-mux`

Include the Mux JavaScript SDK on every page of your web app that includes video.

```html
<!-- Include ooyala-mux after the core Ooyala javascript files -->
<script src="https://player.ooyala.com/static/v4/stable/latest/core.min.js"></script>
<!-- Insert other Ooyala plugin files here -->
<script src="https://src.litix.io/ooyala/4/ooyala-mux.js"></script>
```

## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Call `OO.player.create` like you normally would. Call `initOoyalaMux` with the player reference in the `onCreate` callback.

```html
<div id="my-player"></div>
<script>
  let playerInitTime;

  // Use a callback for when the player is created to register Mux Data
  function onPlayerCreated (player) {
    initOoyalaMux(player, {
      debug: false,
      data: {
        env_key: 'ENV_KEY', // required
        // Metadata
        player_name: '', // ex: 'My Main Player'
        player_init_time: playerInitTime // ex: 1451606400000
        // ... and other metadata
      }
    });
  }

  const asset = {
    // Insert Ooyala asset configuration here
  };

  const playerConfig = {
    onCreate: onPlayerCreated,
    // Insert other Ooyala player configuration (e.g. autoplay etc) here
  };

  // Create the player with the Mux callback
  OO.ready(function() {
    playerInitTime = initOoyalaMux.utils.now();
    OO.player.create('my-player', asset, playerConfig);
  });
</script>
```

## 3. Make your data actionable

The only required field in the `options` that you pass into `ooyala-mux` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` on initialization.

```js
initOoyalaMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance received in the `onCreate` callback
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance received in the `onCreate` callback
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 5. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
initOoyalaMux(player, {
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
initOoyalaMux(player, {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning that they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `ooyala-mux` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the mux monitor.

```js
// player is the instance received in the `onCreate` callback
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.
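
As a sketch of that grouping (the condition names and helper below are hypothetical, not part of `ooyala-mux`), both network conditions map to one code with distinct messages:

```javascript
// Hypothetical helper: one player_error_code for the whole network
// category, with a distinct message per condition.
var NETWORK_ERROR_CODE = 100;

function buildNetworkError(condition) {
  var messages = {
    manifest_fetch_failed: 'Failed to fetch the manifest',
    segment_fetch_failed: 'Failed to fetch a media segment'
  };
  return {
    player_error_code: NETWORK_ERROR_CODE, // shared category code
    player_error_message: messages[condition] || 'Network error'
  };
}

// Usage:
// player.mux.emit('error', buildNetworkError('segment_fetch_failed'));
```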

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear or lack helpful information in the default error message and code, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: 'Additional context for the error'
  };
}

initOoyalaMux(player, {
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
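
For example, a translator might drop a known non-fatal error and pass everything else through unchanged. This is a sketch; treating code `1` (`MEDIA_ERR_ABORTED`) as non-fatal is an assumption for illustration:

```javascript
// Sketch: ignore a known non-fatal error, pass the rest through.
function errorTranslator(error) {
  if (error.player_error_code === 1) {
    // MEDIA_ERR_ABORTED: assumed non-fatal here, so do not track it
    return false;
  }
  return {
    player_error_code: error.player_error_code,
    player_error_message: error.player_error_message,
    player_error_context: error.player_error_context
  };
}
```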

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
initOoyalaMux(player, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Ads tracking with `ooyala-mux`

Mux has been tested with and supports Ooyala's `google-ima-ads-manager`. Configure these plugins as you would normally, and Mux will track ads automatically. No additional configuration is needed.

Other Ooyala ad integrations, such as FreeWheel and VAST/VPAID may work out of the box. Please contact us with any questions.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
initOoyalaMux(player, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

<LinkedHeader step={steps[6]} />

### Current release

#### v4.12.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v4.12.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v4.12.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v4.12.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v4.12.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v4.12.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v4.12.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes
  * Updated dependency: `mux-embed` to v5.13.0

#### v4.12.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v4.12.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v4.12.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v4.12.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v4.12.7

* Update `mux-embed` to v5.9.0

#### v4.12.6

* Update `mux-embed` to v5.8.3

#### v4.12.5

* Update `mux-embed` to v5.8.2

#### v4.12.4

* Update `mux-embed` to v5.8.1

#### v4.12.3

* Update `mux-embed` to v5.8.0

#### v4.12.2

* Update `mux-embed` to v5.7.0

#### v4.12.1

* Update `mux-embed` to v5.6.0

#### v4.12.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v4.11.10

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v4.11.9

* Update `mux-embed` to v5.4.2

#### v4.11.8

* Update `mux-embed` to v5.4.1

#### v4.11.7

* Update `mux-embed` to v5.4.0

#### v4.11.6

* Update `mux-embed` to v5.3.3

#### v4.11.5

* Update `mux-embed` to v5.3.2

#### v4.11.4

* Update `mux-embed` to v5.3.1

#### v4.11.3

* Update `mux-embed` to v5.3.0

#### v4.11.2

* Update `mux-embed` to v5.2.1

#### v4.11.1

* Update `mux-embed` to v5.2.0

#### v4.11.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v4.10.5

* Update `mux-embed` to v5.0.0

#### v4.10.4

* Update `mux-embed` to v4.30.0

#### v4.10.3

* Update `mux-embed` to v4.29.0

#### v4.10.2

* Update `mux-embed` to v4.28.1

#### v4.10.1

* Update `mux-embed` to v4.28.0

#### v4.10.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v4.9.3

* Update `mux-embed` to v4.26.0

#### v4.9.2

* Update `mux-embed` to v4.25.1

#### v4.9.1

* Update `mux-embed` to v4.25.0

#### v4.9.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v4.8.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v4.7.5

* Update `mux-embed` to v4.22.0

#### v4.7.4

* Update `mux-embed` to v4.21.0

#### v4.7.3

* Update `mux-embed` to v4.20.0

#### v4.7.2

* Update `mux-embed` to v4.19.0

#### v4.7.1

* Update `mux-embed` to v4.18.0

#### v4.7.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v4.6.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v4.5.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Update `mux-embed` to v4.15.0

#### v4.4.5

* Update `mux-embed` to v4.14.0

#### v4.4.4

* Update `mux-embed` to v4.13.4

#### v4.4.3

* Update `mux-embed` to v4.13.3

#### v4.4.2

* Update `mux-embed` to v4.13.2

#### v4.4.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v4.4.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v4.3.14

* Publish package to NPM

#### v4.3.13

* Update `mux-embed` to v4.12.1

#### v4.3.12

* Update `mux-embed` to v4.12.0

#### v4.3.11

* Update `mux-embed` to v4.11.0

#### v4.3.10

* Update `mux-embed` to v4.10.0

#### v4.3.9

* Update `mux-embed` to v4.9.4

#### v4.3.8

* Update `mux-embed` to v4.9.3

#### v4.3.7

* Update `mux-embed` to v4.9.2

#### v4.3.6

* Update `mux-embed` to v4.9.1

#### v4.3.5

* Update `mux-embed` to v4.9.0

#### v4.3.4

* Update `mux-embed` to v4.8.0

#### v4.3.3

* Update `mux-embed` to v4.7.0

#### v4.3.2

* Update `mux-embed` to v4.6.2

#### v4.3.1

* Update `mux-embed` to v4.6.1

#### v4.3.0

* Bump mux-embed to 4.6.0

#### v4.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v4.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v4.0.0

* Update `mux-embed` to v4.0.0
* Support server-side device detection

#### v3.0.0

* bump `mux-embed` to 3.0.0


# Monitor Shaka player
This guide walks through integration with Shaka player to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the Shaka Player SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Available for deployment from a package manager
- Can infer CDN identification from response headers
- Custom Dimensions
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Customizable Error Tracking
- Custom Beacon Domain
- Extraction of HLS Session Data

```

Notes:

```md
Request Latency is not available.
```

## 1. Install `@mux/mux-data-shakaplayer`

Include the Mux JavaScript SDK on every page of your web app that includes video.

```npm
npm install --save @mux/mux-data-shakaplayer
```

```yarn
yarn add @mux/mux-data-shakaplayer
```

```cdn
<script src="https://src.litix.io/shakaplayer/5/shakaplayer-mux.js"></script>
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Call `new shaka.Player` like you normally would and get the return value (a reference to the `player`). Call `initShakaPlayerMux` with the player reference and the SDK options.

```html

<div id="my-player"></div>
<script>
  const playerInitTime = initShakaPlayerMux.utils.now();
  const video = document.querySelector('#my-player');
  const player = new shaka.Player(video);

  // calling initShakaPlayerMux will return a shakaPlayerMux object
  // you will need this for handling any errors when calling
  // player.load()
  const shakaPlayerMux = initShakaPlayerMux(player, {
    debug: false,
    data: {
      env_key: 'ENV_KEY',
      // Metadata
      player_name: 'Custom Player', // ex: 'My Main Player',
      player_init_time: playerInitTime // ex: 1451606400000
      // ... and other metadata
    }
  });

  player.load('https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8').then(function () {
    // Successfully loaded the manifest. Mux data will begin tracking
  }).catch(function (error) {
    // There was an error loading this manifest. Call shakaPlayerMux.loadErrorHandler(error) so that Mux data can track this error.
    shakaPlayerMux.loadErrorHandler(error);
    // Do the rest of your error handling logic
  })

  // When you are ready to destroy the Shaka player, you must also destroy
  // the mux monitor:
  // player.destroy()
  // player.mux.destroy()
</script>

```

```javascript

import shaka from "shaka-player";
import initShakaPlayerMux from "@mux/mux-data-shakaplayer";

const playerInitTime = initShakaPlayerMux.utils.now();
const video = document.querySelector('#my-player');
const player = new shaka.Player(video);


// calling initShakaPlayerMux will return a shakaPlayerMux object
// you will need this for handling any errors when calling
// player.load()
const shakaPlayerMux = initShakaPlayerMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY',
    // Metadata
    player_name: 'Custom Player', // ex: 'My Main Player',
    player_init_time: playerInitTime // ex: 1451606400000
    // ... and other metadata
  }
}, shaka);

player.load('https://stream.mux.com/yb2L3z3Z4IKQH02HYkf9xPToVYkOC85WA.m3u8').then(function () {
  // Successfully loaded the manifest. Mux data will begin tracking
}).catch(function (error) {
  // There was an error loading this manifest. Call shakaPlayerMux.loadErrorHandler(error) so that Mux data can track this error.
  shakaPlayerMux.loadErrorHandler(error);
  // Do the rest of your error handling logic
})
// When you are ready to destroy shakaplayer, you must also destroy
// the mux monitor:
// player.destroy()
// player.mux.destroy()

```



## Passing in `shaka` global

You'll see the 3rd argument to `initShakaPlayerMux` is `shaka`. This is the global `shaka` object. If you are using a bundler and importing `shaka` with `require` or `import` then you'll need to pass in the `shaka` object.

If no `shaka` object is passed in, then `initShakaPlayerMux` will look for `shaka` on the global `window` object.

## 3. Make your data actionable

The only required field in the `options` that you pass into `@mux/mux-data-shakaplayer` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key on initialization.

```js
initShakaPlayerMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY',
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `initShakaPlayerMux`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
// player is the instance returned by `new shaka.Player`
player.mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal to the Mux SDK when a new video starts. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by `new shaka.Player`
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by `new shaka.Player`
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
// player is the instance returned by `new shaka.Player`
initShakaPlayerMux(player, {
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
// player is the instance returned by `new shaka.Player`
initShakaPlayerMux(player, {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning that they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `@mux/mux-data-shakaplayer` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the mux monitor.

```js
// player is the instance returned by `new shaka.Player`
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
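
The grouping guidance above can be sketched as a small helper that maps raw player errors onto shared codes with distinct messages. `categorizeError` and the error `type` values below are hypothetical application-level names, not part of the Mux SDK:

```javascript
// Hypothetical helper: group similar failures under one player_error_code,
// and keep instance-specific detail (stack traces, segment IDs) out of the
// message so error aggregation stays useful.
function categorizeError(error) {
  switch (error.type) {
    case 'manifest-timeout':
      return { player_error_code: 100, player_error_message: 'Network error: manifest request timed out' };
    case 'segment-http-error':
      return { player_error_code: 100, player_error_message: 'Network error: segment request failed' };
    default:
      return { player_error_code: 101, player_error_message: 'Unknown playback error' };
  }
}

// Usage sketch (player is the instance returned by `new shaka.Player`):
// player.mux.emit('error', {
//   ...categorizeError(rawError),
//   player_error_context: rawError.stack // instance-specific detail goes here
// });
```

Both network conditions share code `100` but carry different messages, while the stack trace is reserved for `player_error_context`.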

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear and do not have helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

// player is the instance returned by `new shaka.Player`
initShakaPlayerMux(player, {
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
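
As a sketch, a translator might drop errors your application knows are recoverable. The `RECOVERABLE_CODES` list here is a hypothetical application-defined set of codes, not an SDK convention:

```javascript
// Hypothetical list of error codes our application treats as non-fatal.
const RECOVERABLE_CODES = [7000, 7001];

function errorTranslator(error) {
  if (RECOVERABLE_CODES.includes(error.player_error_code)) {
    return false; // returning false tells Mux not to track this error
  }
  return error; // fatal errors are tracked unchanged
}
```

Pass this `errorTranslator` in the options object as shown above; recoverable errors are silently ignored while everything else is tracked as a fatal error.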

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
// player is the instance returned by `new shaka.Player`
initShakaPlayerMux(player, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Track Ad playback with a custom integration

Our integration for Shaka player does not have a built-in integration for tracking ad playback. If you would like to track ads played within Shaka player, you will need to build a custom integration, which is detailed here: [Build a Custom Integration](/docs/guides/build-a-custom-data-integration).

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
// player is the instance returned by `new shaka.Player`
initShakaPlayerMux(player, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

## Release notes

### Current release

#### v5.14.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v5.14.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v5.14.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v5.14.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v5.14.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v5.14.13

* Emit a `renditionchange` event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v5.14.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v5.14.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v5.14.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v5.14.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v5.14.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v5.14.7

* Update `mux-embed` to v5.9.0

#### v5.14.6

* Update `mux-embed` to v5.8.3

#### v5.14.5

* Update `mux-embed` to v5.8.2

#### v5.14.4

* Update `mux-embed` to v5.8.1

#### v5.14.3

* Update `mux-embed` to v5.8.0

#### v5.14.2

* Update `mux-embed` to v5.7.0

#### v5.14.1

* Update `mux-embed` to v5.6.0

#### v5.14.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v5.13.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v5.13.2

* Update `mux-embed` to v5.4.2

#### v5.13.1

* Update `mux-embed` to v5.4.1

#### v5.13.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v5.12.8

* Update `mux-embed` to v5.3.3

#### v5.12.7

* Update `mux-embed` to v5.3.2

#### v5.12.6

* Update `mux-embed` to v5.3.1

#### v5.12.5

* Update `mux-embed` to v5.3.0

#### v5.12.4

* fix an issue where `[Object object]` would be returned in error\_context at times

#### v5.12.3

* Update `mux-embed` to v5.2.1

#### v5.12.2

* Update `mux-embed` to v5.2.0

#### v5.12.1

* Resolve Shaka crash if `response.data` is not present

#### v5.12.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v5.11.0

* Add `@ts-ignore` comments due to new TypeScript types (types not fully applied yet)

* Update `mux-embed` to v5.0.0

#### v5.10.5

* Update `mux-embed` to v4.30.0

#### v5.10.4

* Update `mux-embed` to v4.29.0

#### v5.10.3

* Only submit requestcompleted events for the manifest, media, and encryption requests

#### v5.10.2

* Update `mux-embed` to v4.28.1

#### v5.10.1

* Update `mux-embed` to v4.28.0

#### v5.10.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v5.9.3

* Update `mux-embed` to v4.26.0

#### v5.9.2

* Update `mux-embed` to v4.25.1

#### v5.9.1

* Update `mux-embed` to v4.25.0

#### v5.9.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v5.8.6

* Update `mux-embed` to v4.23.0

#### v5.8.5

* Update `mux-embed` to v4.22.0

#### v5.8.4

* Update `mux-embed` to v4.21.0

#### v5.8.3

* Update `mux-embed` to v4.20.0

#### v5.8.2

* Update `mux-embed` to v4.19.0

#### v5.8.1

* Load error codes on-demand

#### v5.8.0

* Collect Shaka contextual error information

* Update `mux-embed` to v4.18.0

#### v5.7.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v5.6.0

* Add new `renditionchange` fields to Shaka SDK

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Add frame drops to Shaka SDK

* Update `mux-embed` to v4.16.0

#### v5.5.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Record `request_url` and `request_id` with network events

* Update `mux-embed` to v4.15.0

#### v5.4.5

* Update `mux-embed` to v4.14.0

#### v5.4.4

* Update `mux-embed` to v4.13.4

#### v5.4.3

* Update `mux-embed` to v4.13.3

#### v5.4.2

* Update `mux-embed` to v4.13.2

#### v5.4.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v5.4.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v5.3.14

* Publish package to NPM

#### v5.3.13

* Update `mux-embed` to v4.12.1

#### v5.3.12

* Update `mux-embed` to v4.12.0

#### v5.3.11

* Update `mux-embed` to v4.11.0

#### v5.3.10

* Update `mux-embed` to v4.10.0

#### v5.3.9

* Update `mux-embed` to v4.9.4

#### v5.3.8

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v5.3.7

* Update `mux-embed` to v4.9.2

#### v5.3.6

* Update `mux-embed` to v4.9.1

#### v5.3.5

* Update `mux-embed` to v4.9.0

#### v5.3.4

* Update `mux-embed` to v4.8.0

#### v5.3.3

* Update `mux-embed` to v4.7.0

#### v5.3.2

* Update `mux-embed` to v4.6.2

#### v5.3.1

* Update `mux-embed` to v4.6.1

#### v5.3.0

* Bump mux-embed to 4.6.0

#### v5.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v5.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v5.0.0

* Update `mux-embed` to v4.0.0
* Support server-side device detection

#### v4.0.1

* remove mime type detection, Mux will now detect this server-side based on the source
* HLS mime type changed from `application/vnd.apple.mpegurl` to `application/x-mpegurl`. This is part of a larger effort to standardize mime type detection across different players

#### v4.0.0

* Only send 'critical' errors to Mux. Previously, any error (including non-fatal errors) could be sent to Mux. See: https://shaka-player-demo.appspot.com/docs/api/shaka.util.Error.html#.Severity


# Monitor THEOplayer (Web)
This guide walks through integration with THEOplayer to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the THEOplayer SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Available for deployment from a package manager
- Average Bitrate metrics and `renditionchange` events
- Customizable Error Tracking
- Ads metrics
- Custom Beacon Domain
- Extraction of HLS Session Data
- Live Stream Latency metric

```


## 1. Install `@mux/mux-data-theoplayer`

Include the Mux JavaScript SDK on every page of your web app that includes video.

```npm
npm install --save @mux/mux-data-theoplayer
```

```yarn
yarn add @mux/mux-data-theoplayer
```

```cdn

<!-- Include theoplayer-mux after the core THEOplayer javascript files -->
<script type="text/javascript" src="https://cdn.theoplayer.com/latest/~yourlicense~/theoplayer.loader.js"></script>
<script src="https://src.litix.io/theoplayer/4/theoplayer-mux.js"></script>

```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Call `new THEOplayer.Player` like you normally would. Call `initTHEOplayerMux` with a reference to the player instance and the Mux SDK options.

```html

<div id="my-player" class='video-js theoplayer-skin theo-seekbar-above-controls'></div>
<script>
  const playerInitTime = initTHEOplayerMux.utils.now();
  const playerWrapper = document.querySelector('#my-player');

  // Get a reference to your player, and pass it to the init function
  const player = new THEOplayer.Player(playerWrapper, {
    // Insert player configuration here
  });

  player.src = 'https://muxed.s3.amazonaws.com/leds.mp4';

  initTHEOplayerMux(player, {
    debug: false,
    data: {
      env_key: 'ENV_KEY', // required
      // Metadata
      player_name: '', // ex: 'My Main Player'
      player_init_time: playerInitTime // ex: 1451606400000
      // ... and other metadata
    }
  });
</script>

```

```javascript

import * as THEOplayer from 'theoplayer';
import initTHEOplayerMux from '@mux/mux-data-theoplayer';

const playerInitTime = initTHEOplayerMux.utils.now();
const playerWrapper = document.querySelector('#my-player');

// Get a reference to your player, and pass it to the init function
const player = new THEOplayer.Player(playerWrapper, {
  // Insert player configuration here
});

player.src = 'https://muxed.s3.amazonaws.com/leds.mp4';

initTHEOplayerMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata
    player_name: '', // ex: 'My Main Player'
    player_init_time: playerInitTime // ex: 1451606400000
    // ... and other metadata
  }
}, THEOplayer);

```



## Passing in `THEOplayer` global

You'll see the 3rd argument to `initTHEOplayerMux` is `THEOplayer`. This is the global `THEOplayer` object. If you are using a bundler and importing `THEOplayer` with `require` or `import` then you'll need to pass in the `THEOplayer` object.

If no `THEOplayer` object is passed in, then `initTHEOplayerMux` will look for `THEOplayer` on the global `window` object.

## 3. Make your data actionable

The only required field in the `options` that you pass into `initTHEOplayerMux` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key on initialization.

```js
// player here is the instance of THEOplayer.Player
initTHEOplayerMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `initTHEOplayerMux`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
// player is the instance of THEOplayer.Player
let monitor = initTHEOplayerMux(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required

    video_id: 'abcd123',
  }
});

monitor.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

For THEOplayer, you do not need to emit the `videochange` event when the player's source property is updated. The `sourcechange` event that fires when you update the source property is handled automatically. However, you still need to pass the updated video metadata under `metadata.mux`, as shown in the example below.

When this is done, it removes all previous video data and resets all metrics for the video view. Note: the previous method using `changeMuxVideo` has been deprecated, but will continue to work for 2.x versions of this plugin.

```js
player.source = {
  sources: {
    // ...your source
  },
  metadata: {
    mux: {
      video_id: 'new-ID',
      video_title: 'New title',
      // ... other metadata
    }
  }
}
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance of THEOplayer.Player
let monitor = initTHEOplayerMux(player, {
  debug: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});

// emit `programchange` when the content within the stream changes
monitor.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
// player here is the instance of THEOplayer.Player
initTHEOplayerMux(player, {
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
// player is the instance of THEOplayer.Player
initTHEOplayerMux(player, {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `@mux/mux-data-theoplayer` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the Mux monitor.

```js
// player is the instance of THEOplayer.Player
let monitor = initTHEOplayerMux(player, {
  debug: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});

// emit the `error` event when an error occurs
monitor.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
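To illustrate these guidelines, a small hypothetical helper (its name and error kinds are illustrative, not part of `@mux/mux-data-theoplayer`) can group related failures under shared codes while keeping instance-specific detail in `player_error_context`:

```javascript
// Hypothetical helper: map an app's internal error kinds onto
// aggregatable Mux error fields. Codes >= 100 avoid colliding with
// the HTMLMediaElement error codes (1-4).
function toMuxError(kind, context) {
  // Both network conditions share one code but keep distinct messages,
  // so the dashboard aggregates them as one category
  const table = {
    manifest_timeout: { player_error_code: 100, player_error_message: 'Network error: manifest request timed out' },
    segment_timeout: { player_error_code: 100, player_error_message: 'Network error: segment request timed out' },
    drm_failure: { player_error_code: 101, player_error_message: 'DRM license request failed' }
  };
  // Instance-specific detail (segment id, stack trace) belongs in the
  // context field, which is not aggregated
  return { ...table[kind], player_error_context: context };
}
```

A call like `monitor.emit('error', toMuxError('segment_timeout', 'segment=chunk_0042.ts'))` then reports an aggregatable code and message along with the per-view context.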

### Error translator

If your player emits error events that are not fatal to playback, or its errors are unclear or lack helpful information in the default error messages and codes, you may find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

// player is the instance of THEOplayer.Player
initTHEOplayerMux(player, {
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
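As a concrete sketch, a translator that drops non-fatal errors could look like the following (the code-range convention here is an assumption about your own player's error codes, not something the SDK defines):

```javascript
// Hypothetical convention for this example: the player reports
// recoverable warnings with codes below 100, and only codes >= 100
// are fatal to playback.
function errorTranslator(error) {
  if (error.player_error_code < 100) {
    // Returning false means this error will not be tracked
    return false;
  }
  return {
    player_error_code: error.player_error_code,
    player_error_message: 'Playback failed: ' + error.player_error_message,
    player_error_context: error.player_error_context
  };
}
```

Pass this function as the `errorTranslator` option when calling `initTHEOplayerMux`, as shown in the example above.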

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
// player is the instance of THEOplayer.Player
initTHEOplayerMux(player, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Ads tracking with `@mux/mux-data-theoplayer`

Mux has been tested with and supports [THEOplayer's Ads integration](https://docs.theoplayer.com/how-to-guides/01-ads/00-introduction.md). Simply configure the ads as you would with THEOplayer normally, and Mux will track ads automatically. No additional configuration is needed.

Other THEOplayer ad integrations, such as Google IMA, may work out of the box but have not currently been tested. Please contact us with any questions.
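For reference, a THEOplayer source with an ad description can be sketched as a plain object like the one below; once the player is monitored, the resulting ad events are tracked with no extra Mux configuration (the URLs are placeholders, and the exact ad-description shape depends on your THEOplayer version):

```javascript
// Placeholder stream and VAST URLs; substitute your own.
const sourceDescription = {
  sources: [{
    src: 'https://example.com/stream.m3u8',
    type: 'application/x-mpegurl'
  }],
  // THEOplayer ad description; Mux picks up the resulting ad
  // lifecycle events from the player automatically
  ads: [{
    sources: 'https://example.com/vast.xml',
    timeOffset: 'start' // pre-roll
  }]
};
```

Assign it with `player.source = sourceDescription` as you normally would with THEOplayer.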

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
// player is the instance of THEOplayer.Player
initTHEOplayerMux(player, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Destroy the monitor

In some cases, you may want to stop tracking an instance of THEOplayer. For this, we provide a `destroy` method within the object returned by `initTHEOplayerMux`, which will immediately end the active Mux Data view and stop tracking the THEOplayer instance.

```js
// player is the instance of THEOplayer.Player
let monitor = initTHEOplayerMux(player, {
  debug: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});

// once ready to destroy the monitor
monitor.destroy();
```

<LinkedHeader step={steps[7]} />

### Current release

#### v5.4.4

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v5.4.3

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v5.4.2

* Update parsing of initial value for `player_playback_mode`
  * Updated dependency: `mux-embed` to v5.16.1

#### v5.4.1

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v5.4.0

* fix issue with sourcechange causing metadata conflicts

#### v5.3.15

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v5.3.14

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v5.3.13

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes
  * Updated dependency: `mux-embed` to v5.13.0

#### v5.3.12

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v5.3.11

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v5.3.10

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v5.3.9

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v5.3.8

* Fix issue with audio tracking where the player is not initialized

#### v5.3.7

* Update `mux-embed` to v5.9.0

#### v5.3.6

* Update `mux-embed` to v5.8.3

#### v5.3.5

* Update `mux-embed` to v5.8.2

#### v5.3.4

* Update `mux-embed` to v5.8.1

#### v5.3.3

* Update `mux-embed` to v5.8.0

#### v5.3.2

* Update `mux-embed` to v5.7.0

#### v5.3.1

* Update `mux-embed` to v5.6.0

#### v5.3.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v5.2.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v5.2.2

* Update `mux-embed` to v5.4.2

#### v5.2.1

* Update `mux-embed` to v5.4.1

#### v5.2.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v5.1.9

* Update `mux-embed` to v5.3.3

#### v5.1.8

* Update `mux-embed` to v5.3.2

#### v5.1.7

* Update `mux-embed` to v5.3.1

#### v5.1.6

* Update `mux-embed` to v5.3.0

#### v5.1.5

* fix an issue where video bitrate for renditionchange events could be calculated incorrectly for non-dash streams
* fix an issue where request/response interceptors were not removed on destroy

#### v5.1.4

* utilize width and height directly from THEOplayer's API for renditionchange events
* add support for detecting frame rate, name, and codec for renditionchange events
* Update `mux-embed` to v5.2.1

#### v5.1.3

* Update `mux-embed` to v5.2.0

#### v5.1.2

* Fix issue when videoTracks or audioTracks is undefined

#### v5.1.1

* Ensure seeking/seeked and rebuffering/rebuffered events are better distinguished.

#### v5.1.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v5.0.4

* Update `mux-embed` to v5.0.0

#### v5.0.3

* Update `mux-embed` to v4.30.0

#### v5.0.2

* Update `mux-embed` to v4.29.0

#### v5.0.1

* Update `mux-embed` to v4.28.1

#### v5.0.0

* use a new mechanism to track rebuffering for better accuracy
* fix an issue where player time was reported in the wrong units
* improved internal cleanup for memory management

* Update `mux-embed` to v4.28.0

#### v4.17.1

* Fixed the README files (public and internal) with correct information

#### v4.17.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v4.16.0

* Fix error context reporting for HLS manifests

#### v4.15.3

* Update `mux-embed` to v4.26.0

#### v4.15.2

* Update `mux-embed` to v4.25.1

#### v4.15.1

* Update `mux-embed` to v4.25.0

#### v4.15.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v4.14.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v4.13.4

* Update `mux-embed` to v4.22.0

#### v4.13.3

* Update `mux-embed` to v4.21.0

#### v4.13.2

* Update `mux-embed` to v4.20.0

#### v4.13.1

* Update `mux-embed` to v4.19.0

#### v4.13.0

* Set Mux Error Context with additional error information from THEOplayer

#### v4.12.1

* Fall back to player element size to get better player resolutions
* Update `mux-embed` to v4.18.0

#### v4.12.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v4.11.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v4.10.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Record `request_url` and `request_id` with network events

* Update `mux-embed` to v4.15.0

#### v4.9.5

* Update `mux-embed` to v4.14.0

#### v4.9.4

* Update `mux-embed` to v4.13.4

#### v4.9.3

* Update `mux-embed` to v4.13.3

#### v4.9.2

* Update `mux-embed` to v4.13.2

#### v4.9.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v4.9.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v4.8.6

* Publish package to NPM

#### v4.8.5

* Update `mux-embed` to v4.12.1

#### v4.8.4

* Update `mux-embed` to v4.12.0

#### v4.8.3

* Update `mux-embed` to v4.11.0

#### v4.8.2

* Update `mux-embed` to v4.10.0

#### v4.8.1

* Update `mux-embed` to v4.9.4

#### v4.8.0

* Allow for passing in the THEOplayer instance instead of using the instance on window

#### v4.7.6

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v4.7.5

* Update `mux-embed` to v4.9.2

#### v4.7.4

* Update `mux-embed` to v4.9.1

#### v4.7.3

* Update `mux-embed` to v4.9.0

#### v4.7.2

* Update `mux-embed` to v4.8.0

#### v4.7.1

* Update `mux-embed` to v4.7.0

#### v4.7.0

* Introducing HLS Session Data support

* Update `mux-embed` to v4.6.2

#### v4.6.1

* Update `mux-embed` to v4.6.1

#### v4.6.0

* Bump mux-embed to 4.6.0

#### v4.5.1

* Update mux-embed to v4.4.4
* Stops emitting a `requestcompleted` event for every manifest request

#### v4.5.0

* Update mux-embed to v4.4.2

#### v4.4.0

* Add support for bandwidth metrics

#### v4.3.1

* Fix an issue where normal events were being fired as ad events

#### v4.3.0

* Update mux-embed to v4.4.0
* Support latency metrics when using HLS

#### v4.2.0

* Update mux-embed to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v4.1.1

* Fix an issue where bitrate reported for HLS streams would be double the expected value

#### v4.1.0

* Update mux-embed to v4.1.1
* Add support for custom dimensions
* Fix an issue where `player_remote_played` may not be tracked correctly

#### v4.0.0

* Update mux-embed to v4.0.0
* Support server-side device detection

#### v3.1.0

* Add `renditionchange` tracking event

#### v3.0.1

* Inject metadata for certain edge case startup sequences

#### v3.0.0

* Update `mux-embed` to 3.0.0


# Monitor THEOplayer (iOS)
This guide walks through integration with THEOplayer to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the THEOplayer iOS SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Custom Dimensions
- Customizable Error Tracking

```

Notes:

```md
Packaged with: CocoaPods. Supports ad events; ads metadata is not available.
```

## 1. Install the Mux Data SDK

## Requirements

* THEOplayer.xcframework SDK for iOS (> 5.9)
* A working implementation of `THEOplayer` in your iOS app

Before integrating `Mux-Stats-THEOplayer` into your player, first make sure your THEOplayer implementation is working as expected.

Add `Mux-Stats-THEOplayer` to your Podfile:

```ruby
pod 'Mux-Stats-THEOplayer', '~> 0.8'
```

Run `pod install`, then import the `MuxCore` and `MUXSDKStatsTHEOplayer` modules into your application. Call `monitorTHEOplayer` and pass in a reference to your `THEOplayer` instance.

## 2. Initialize the monitor for your THEOplayer instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Below is an example configuration for a simple THEOplayer implementation. The key part to pay attention to is `monitorTHEOplayer`. This example is using ads with THEOplayer, which will also be tracked with Mux Data.

```swift
import MuxCore
import MUXSDKStatsTHEOplayer
import THEOplayerSDK
import UIKit

class ViewController: UIViewController {
    let playerName = "demoplayer"
    var player: THEOplayer!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        self.player = THEOplayer(configuration: THEOplayerConfiguration(chromeless: false))
        self.player.frame = view.bounds
        self.player.addAsSubview(of: view)

        let typedSource = TypedSource(
            src: "https://stream.mux.com/tqe4KzdxU6GLc8oowshXgm019ibzhEX3k.m3u8",
            type: "application/vnd.apple.mpegurl")

        let ad = THEOAdDescription(src: "https://pubads.g.doubleclick.net/gampad/ads?sz=640x480&iu=/124319096/external/ad_rule_samples&ciu_szs=300x250&ad_rule=1&impl=s&gdfp_req=1&env=vp&output=vmap&unviewed_position_start=1&cust_params=deployment%3Ddevsite%26sample_ar%3Dpremidpostpod&cmsid=496&vid=short_onecue&correlator=")

        let source = SourceDescription(source: typedSource, ads: [ad], textTracks: nil, poster: nil, analytics: nil, metadata: nil)
        self.player.source = source

        // TODO: Add your env key
        let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")!

        let videoData = MUXSDKCustomerVideoData()
        videoData.videoTitle = "Big Buck Bunny"
        videoData.videoId = "bigbuckbunny"
        videoData.videoSeries = "animation"

        MUXSDKStatsTHEOplayer.monitorTHEOplayer(self.player, name: playerName, playerData: playerData, videoData: videoData, softwareVersion: "1.1.1")
        self.player.play()
    }
}
```

## 3. Make your data actionable

The only required field is `env_key`, but without additional metadata the metrics in your dashboard will lack the information needed to take meaningful action. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Metadata fields are provided via the `MUXSDKCustomerPlayerData` and `MUXSDKCustomerVideoData` objects.

For the full list of properties, view the header files for these interfaces:

* [MUXSDKCustomerPlayerData.h](https://github.com/muxinc/stats-sdk-objc/blob/master/XCFramework/MuxCore.xcframework/ios-arm64/MuxCore.framework/Headers/MUXSDKCustomerPlayerData.h)
* [MUXSDKCustomerVideoData.h](https://github.com/muxinc/stats-sdk-objc/blob/master/XCFramework/MuxCore.xcframework/ios-arm64/MuxCore.framework/Headers/MUXSDKCustomerVideoData.h)

For more details about each property, view the [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) guide.

```swift
let playerData = MUXSDKCustomerPlayerData(environmentKey: "ENV_KEY")!
playerData.viewerUserId = "1234"
playerData.experimentName = "player_test_A"
// `playerName` is metadata describing the player software; it is
// distinct from the `name:` identifier passed to monitorTHEOplayer below
playerData.playerName = "My Main Player"
playerData.playerVersion = "1.0.0"

let videoData = MUXSDKCustomerVideoData()
videoData.videoId = "abcd123"
videoData.videoTitle = "My Great Video"
videoData.videoSeries = "Weekly Great Videos"
videoData.videoDuration = 120000 // in milliseconds
videoData.videoIsLive = false
videoData.videoCdn = "cdn"


MUXSDKStatsTHEOplayer.monitorTHEOplayer(self.player, name: playerName, playerData: playerData, videoData: videoData, softwareVersion: "1.1.1")
self.player.play()
```

## 4. Advanced options

## Changing the video

If you want to change the video in the player, you'll need to let the Mux SDK know by calling `videoChangeForPlayer`. From the perspective of Mux Data, this will create a new view.

```swift
let videoData = MUXSDKCustomerVideoData()
videoData.videoTitle = "New Video"
videoData.videoId = "newVideoId"
MUXSDKStatsTHEOplayer.videoChangeForPlayer(name: self.playerName, videoData: videoData)

let typedSource = TypedSource(src: "https://stream.mux.com/tNrV028WTqCOa02zsveBdNwouzgZTbWx5x.m3u8", type: "application/vnd.apple.mpegurl")
let source = SourceDescription(source: typedSource, ads: [], textTracks: nil, poster: nil, analytics: nil, metadata: nil)
self.player.source = source
self.player.play()
```

## Handling Errors manually

By default, `automaticErrorTracking` is enabled which means the Mux SDK will catch errors that the player throws and track an error event. Error tracking is meant for fatal errors. When an error is thrown it will mark the view as having encountered an error in the Mux dashboard and the view will no longer be monitored.

If you want to disable automatic error tracking and track errors manually, you can do so by passing `automaticErrorTracking: false` when calling `monitorTHEOplayer`.

Whether automatic error tracking is enabled or disabled, you can dispatch errors manually with `dispatchError`.

```swift
MUXSDKStatsTHEOplayer.monitorTHEOplayer(self.player, name: playerName, playerData: playerData, videoData: videoData, softwareVersion: "1.1.1", automaticErrorTracking: false)
MUXSDKStatsTHEOplayer.dispatchError(name: playerName, code: "1234", message: "Something is not right")
```

<LinkedHeader step={steps[5]} />

### Current release

#### v0.12.0

* Update range of supported THEOplayer versions to major version 8

### Previous Releases

#### v0.11.0

* Relax THEOplayer version constraint to allow installation alongside any version of THEOplayer whose major version is 7
* Add an ads integration presence check to remove console warning when no ads integration is present

#### v0.10.0

* Support use in tvOS applications
* Update minimum supported THEOplayer dependency to 7.1.0
* Update pinned MuxCore dependency to 4.7.1

#### v0.9.0

* Update minimum supported THEOplayer dependency to 6.12.1
* Update pinned MuxCore dependency to 4.7.0
* The minimum deployment target is now iOS 12.0

#### v0.8.0

* Add support for THEOplayer 5.9 and above
* Add support for installation with Swift Package Manager

#### v0.7.0

* Remove the THEOplayerSDK.framework from build artifact
* Add THEOplayerSDK.framework to .gitignore

#### v0.6.0

* Add MUXSDKCustomerData
* Custom data support through customer data object

#### v0.5.0

* Update to use xcframeworks to provide Xcode 13 and M1 compatibility

#### v0.4.1

* Fix an issue where an error message could be wrongly set when an AdError occurs

#### v0.4.0

* Fix an issue with error message and code in AdError events
* Fix compatibility with Xcode 12

#### v0.3.0

* Add error code tracking as well as error message when handling errors
* Bump the required THEOplayer.framework SDK for iOS to > v2.76

#### v0.2.0

* Add option to disable automatic error tracking when calling `monitorTHEOplayer`
* Add API to manually dispatch an error with `MUXSDKStatsTHEOplayer.dispatchError`

You probably will not need to use these features, but if your player is throwing noisy non-fatal errors, or you want to catch the player errors yourself and take precise control over the error code and message, you now have that ability.

* (bugfix) fix build script for frameworks for `AppStore error ITMS-90562: Invalid Bundle` in the `CFBundleSupportedPlatforms` plist
* (bugfix) fix crash that can happen when using Google IMA ads with THEOplayer

#### v0.1.0

* Initial release


# Monitor THEOplayer
This guide walks through integration with the THEOplayer Android SDK to collect video performance metrics with Mux Data.
These instructions cover [THEO Technologies' `THEOplayer` library](https://www.theoplayer.com/sdk/android).

The Mux integration with `THEOplayer` is built on top of Mux's core Java SDK, and the full code can be seen here: [muxinc/mux-stats-sdk-theoplayer-android](https://github.com/muxinc/mux-stats-sdk-theoplayer-android).

## Features

The following data can be collected by the Mux Data SDK when you use the THEOplayer Android SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Available for deployment from a package manager
- Custom Dimensions
- Average Bitrate metrics and `renditionchange` events
- Ads metrics

```

Notes:

```md
`renditionchange` events are tracked, bitrate metrics are not available
```

## 1. Install the Mux Data SDK

Add the Mux SDK to your project using one of the following approaches:

## Add Gradle dependency on the Mux THEOplayer SDK (preferred)

Add the Mux Maven repository to your Gradle file:

```groovy
repositories {
    maven {
        url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
    }
}
```

Next, add a dependency on the Mux Data THEOplayer SDK.

The latest version of our SDK can be found [here](https://github.com/muxinc/mux-stats-sdk-theoplayer-android/releases/latest)

```groovy
implementation 'com.mux.stats.sdk.muxstats:muxstatssdktheoplayer:[CurrentVersion]'
```

## 2. Initialize the monitor with your THEOplayer instance

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

First, create the `CustomerPlayerData` and `CustomerVideoData` objects as appropriate for your current playback, and be sure to set your `ENV_KEY`.

```java
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.core.model.CustomerViewData;
import com.mux.stats.sdk.core.model.CustomData;
import com.mux.stats.sdk.core.model.CustomerData;

CustomerPlayerData customerPlayerData = new CustomerPlayerData();
customerPlayerData.setEnvironmentKey("YOUR_ENVIRONMENT_KEY_HERE");

CustomerVideoData customerVideoData = new CustomerVideoData();
customerVideoData.setVideoTitle(intent.getStringExtra("YOUR_VIDEO_TITLE"));

CustomerViewData customerViewData = new CustomerViewData();
customerViewData.setViewSessionId("A26C4C2F-3C8A-46FB-885A-8D973F99A998");

CustomData customData = new CustomData();
customData.setCustomData1("YOUR_CUSTOM_STRING_HERE");

CustomerData customerData = new CustomerData(customerPlayerData, customerVideoData, customerViewData);
customerData.setCustomData(customData);
```

Next, create the `MuxStatsSDKTHEOPlayer` object by passing your Android `Context` (typically your `Activity`), a `THEOplayerView` instance, a player name, and the customer data objects.

```java
import com.mux.stats.sdk.muxstats.MuxStatsSDKTHEOPlayer;
...
// Make sure to monitor the player before calling `prepare` on the THEOplayer instance
muxStatsTHEOplayer = new MuxStatsSDKTHEOPlayer(
  this, player, "demo-player", customerData);
```

In order to correctly monitor if the player is full-screen, provide the screen size to the `MuxStatsSDKTHEOPlayer` instance.

```java
Point size = new Point();
getWindowManager().getDefaultDisplay().getSize(size);
muxStatsTHEOplayer.setScreenSize(size.x, size.y);
```

To determine a number of viewer context values and to track the size of the video player, set the player view.

```java
muxStatsTHEOplayer.setPlayerView(theoPlayerView);
```

Finally, when you are destroying the player, call the `MuxStatsSDKTHEOPlayer.release()` function.

```java
muxStatsTHEOplayer.release();
```

After you've integrated, start playing a video in your player. A few minutes after you stop watching, you'll see the results in your Mux Data dashboard. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

## 3. Add Metadata

In the Java SDK, options are provided via the objects within the `CustomerData` object.

All metadata details except for `envKey` are optional, however you'll be able to compare and see more interesting results as you include more details. This gives you more metrics and metadata about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Advanced

## Changing the video

There are two cases where the underlying tracking of the video view needs to be reset: first, when you load a new source URL into an existing player, and second, when the program within a singular stream changes (such as a program within a live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

## New source

When you change to a new video (in the same player) you need to update the information that Mux knows about the current video. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

This is done by calling `muxStatsTHEOplayer.videoChange(CustomerVideoData)` which will remove all previous video data and reset all metrics for the video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

It's best to change the video info immediately after telling the player which new source to play.

## New program (in single stream)

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, call `muxStatsTHEOplayer.programChange(CustomerVideoData)`. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video`.

## Detect when a video is being played full-screen

For most use cases, the SDK is capable of detecting whether or not a video is being played full-screen. Specifically, it can do so in the case where the player view is the same size as the device display (excepting ActionBars and other framework window decoration).

For other use cases (non-overlaid controls, window decoration via plain `View`s, etc.) you may need to tell the SDK when the user switches to full-screen.

```java
  @Override
  public void onCreate(@Nullable Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    // Find the THEOplayerView in your layout
    theoPlayerView = findViewById(R.id.my_player_view);

    theoPlayerView.getFullscreenManager().addFullscreenChangeListener(new FullscreenChangeListener() {
      @Override
      public void onEnterFullscreen() {
        muxStatsTHEOplayer.presentationChange(MuxSDKViewPresentation.FULLSCREEN);
      }
      @Override
      public void onExitFullscreen() {
        muxStatsTHEOplayer.presentationChange(MuxSDKViewPresentation.PORTRAIT);
      }
    });
  }
```

## Error tracking

By default, Mux's integration with THEOplayer automatically tracks fatal errors as thrown by THEOplayer. If a fatal error happens outside the context of THEOplayer and you want to track it with Mux, you can call `muxStatsTHEOplayer.error` like this:

```java
// Error code: integer value for the generic type of error that
// occurred.
// Error message: String providing more information on the error
// that occurred.
// For an example, the HTML5 video element uses the
// following: https://developer.mozilla.org/en-US/docs/Web/API/MediaError
// for codes and messages. Feel free to use your own codes and messages
int errorCode = 1;
String errorMessage = "A fatal error was encountered during playback";
MuxErrorException error = new MuxErrorException(errorCode, errorMessage);
muxStatsTHEOplayer.error(error);
```

Note that `muxStatsTHEOplayer.error(MuxErrorException e)` can be used with or without automatic error tracking. If your application has retry logic that attempts to recover from THEOplayer errors then you may want to disable automatic error tracking like this:

```java
muxStatsTHEOplayer.setAutomaticErrorTracking(false);
```

<Callout type="warning">
  It is important that you only trigger an error when the playback has to be abandoned or aborted in an unexpected manner, as Mux tracks fatal playback errors only.
</Callout>

<LinkedHeader step={steps[5]} />

### Current release

#### v0.4.2

Updates:

* update: rename library artifact to `muxstatssdktheoplayer`

### Previous releases

#### v0.4.1

Improvements:

* Use "Android TV" osFamily on tv devices

#### v0.4.0

Improvements:

* Update to Core 8.1.0

#### v0.3.0

Improvements:

* Add support for THEOplayer v7

Fixes:

* fix: NullPointerException with getPlayerData() inside MuxStats (#29)

#### v0.2.0

Updates:

* Support THEOPlayer v5 and higher

#### v0.1.3

Fixes:

* Update THEOplayer to 2.92.0 (#18)

#### v0.1.2

Improvements:

* Update to MuxCore 7.0.10

MuxCore Fixes:

* Fix event-handling errors in rare cases

#### v0.1.1

* Initial release


# Monitor Flowplayer
This guide walks through integration with Flowplayer to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the Flowplayer SDK, as described below.

* Engagement metrics
* Quality of Experience Metrics
* Web metrics such as Player Startup Time, Page Load Time, etc
* Custom Dimensions
* Customizable Error Tracking
* Ads metrics
* Custom Beacon Domain


## 1. Install `@mux/mux-data-flowplayer`

Include the Mux JavaScript SDK on every page of your web app that includes video.

```npm
npm install --save @mux/mux-data-flowplayer
```

```yarn
yarn add @mux/mux-data-flowplayer
```

```cdn
<!-- include Flowplayer and its skin -->
<link rel="stylesheet" href="https://releases.flowplayer.org/7.2.1/skin/skin.css">
<script src="https://releases.flowplayer.org/7.2.1/flowplayer.min.js"></script>
<!-- include flowplayer-mux after the other Flowplayer libraries -->
<script src="https://src.litix.io/flowplayer/3/flowplayer-mux.js"></script>
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Call `flowplayer` like you normally would and save a reference to the player. Call `initFlowplayerMux` with the player reference.

```html
<div id="my-player"></div>
<script>
  const playerInitTime = initFlowplayerMux.utils.now();
  const container = document.getElementById('my-player');
  const player = flowplayer(container, {
    // ... flowplayer config
  });

  // Make sure to call this immediately after the return from flowplayer
  initFlowplayerMux(player, container, {
    debug: false,
    data: {
      env_key: 'ENV_KEY', // required
      // Metadata
      player_name: '', // ex: 'My Main Player'
      player_init_time: playerInitTime
      // ... and other metadata
    }
  });
</script>
```

```javascript
import initFlowplayerMux from "@mux/mux-data-flowplayer";
import flowplayer from "@flowplayer/player";

const playerInitTime = initFlowplayerMux.utils.now();
const container = document.getElementById('my-player');
const player = flowplayer(container, {
  // ... flowplayer config
});

initFlowplayerMux(player, container, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata
    player_name: '', // ex: 'My Main Player'
    player_init_time: playerInitTime
    // ... and other metadata
  }
}, flowplayer);
```



## Passing in `flowplayer` global

You'll see that the third argument to `initFlowplayerMux` is `flowplayer`, the global `flowplayer` object. If you are using a bundler and importing `flowplayer` with `require` or `import`, you'll need to pass in the `flowplayer` object.

If no `flowplayer` object is passed in, then `initFlowplayerMux` will look for `flowplayer` on the global `window` object.
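
As a rough sketch, that lookup order is equivalent to the hypothetical helper below (this mirrors the documented behavior, not the SDK's actual source):

```javascript
// Hypothetical helper, not part of @mux/mux-data-flowplayer:
// prefer an explicitly passed `flowplayer` object, otherwise fall back
// to the global `window.flowplayer` if one exists.
function resolveFlowplayer(passedIn) {
  if (passedIn) return passedIn;
  return typeof window !== 'undefined' ? window.flowplayer : undefined;
}
```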

## 3. Make your data actionable

The only required field in the `options` that you pass into `@mux/mux-data-flowplayer` is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key on initialization.

```js
initFlowplayerMux(player, container, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000, can use `initFlowplayerMux.utils.now()`
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `initFlowplayerMux`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
// player is the instance that gets returned from the `flowplayer` function
player.mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal to the Mux SDK when a new video starts. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance that gets returned from the `flowplayer` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance that gets returned from the `flowplayer` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 6. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
// player is the instance that gets returned from the `flowplayer` function
initFlowplayerMux(player, container, {
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
// player is the instance that gets returned from the `flowplayer` function
initFlowplayerMux(player, container, {
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by mux are considered fatal meaning that they are the result of playback failures. If errors are non-fatal they should not be captured.
</Callout>

By default, `@mux/mux-data-flowplayer` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the Mux monitor.

```js
// player is the instance that gets returned from the `flowplayer` function
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
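
As an illustration of the grouping guidance above, a small classifier might map your app's internal error conditions onto shared codes and general messages (the condition names and code values here are hypothetical, not part of the SDK):

```javascript
// Hypothetical error classifier: group similar failures under one code,
// keep messages general, and put instance-specific detail in player_error_context.
function classifyPlaybackError(condition, detail) {
  // both network conditions share code 100; decode failures use 101
  const groups = {
    manifest_timeout: { code: 100, message: 'Network error while loading media' },
    segment_failed:   { code: 100, message: 'Network error while loading media' },
    decode_failed:    { code: 101, message: 'Media could not be decoded' }
  };
  const group = groups[condition] || { code: 102, message: 'Unknown playback error' };
  return {
    player_error_code: group.code,
    player_error_message: group.message,
    player_error_context: detail // e.g. a segment URL or stack trace; no viewer PII
  };
}
```

The resulting object can be passed directly to `player.mux.emit('error', ...)`.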

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear or lack helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

// player is the instance that gets returned from the `flowplayer` function
initFlowplayerMux(player, container, {
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
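
For example, a translator that drops errors it considers non-fatal while passing everything else through unchanged could look like this (reserving codes below 100 for recoverable warnings is a hypothetical convention for your own app, not an SDK rule):

```javascript
// Hypothetical translator: return false for errors we consider non-fatal
// so Mux does not track them; pass all other errors through untouched.
function errorTranslator(error) {
  // assumption: this app reserves codes below 100 for recoverable warnings
  if (error.player_error_code < 100) {
    return false; // not tracked
  }
  return error;
}
```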

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
initFlowplayerMux(player, container, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Ads tracking with `@mux/mux-data-flowplayer`

Mux has been tested with and supports Flowplayer's IMA and VAST plugins. No additional configuration is needed; Mux will track ads automatically.

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
initFlowplayerMux(player, container, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

<LinkedHeader step={steps[7]} />

### Current release

#### v3.14.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v3.14.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v3.14.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v3.14.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v3.14.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v3.14.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v3.14.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v3.14.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v3.14.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v3.14.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v3.14.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v3.14.7

* Update `mux-embed` to v5.9.0

#### v3.14.6

* Update `mux-embed` to v5.8.3

#### v3.14.5

* Update `mux-embed` to v5.8.2

#### v3.14.4

* Update `mux-embed` to v5.8.1

#### v3.14.3

* Update `mux-embed` to v5.8.0

#### v3.14.2

* Update `mux-embed` to v5.7.0

#### v3.14.1

* Update `mux-embed` to v5.6.0

#### v3.14.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v3.13.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v3.13.2

* Update `mux-embed` to v5.4.2

#### v3.13.1

* Update `mux-embed` to v5.4.1

#### v3.13.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v3.12.6

* Update `mux-embed` to v5.3.3

#### v3.12.5

* Update `mux-embed` to v5.3.2

#### v3.12.4

* Update `mux-embed` to v5.3.1

#### v3.12.3

* Update `mux-embed` to v5.3.0

#### v3.12.2

* Update `mux-embed` to v5.2.1

#### v3.12.1

* Update `mux-embed` to v5.2.0

#### v3.12.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v3.11.5

* Update `mux-embed` to v5.0.0

#### v3.11.4

* Update `mux-embed` to v4.30.0

#### v3.11.3

* Update `mux-embed` to v4.29.0

#### v3.11.2

* Update `mux-embed` to v4.28.1

#### v3.11.1

* Update `mux-embed` to v4.28.0

#### v3.11.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v3.10.3

* Update `mux-embed` to v4.26.0

#### v3.10.2

* Update `mux-embed` to v4.25.1

#### v3.10.1

* Update `mux-embed` to v4.25.0

#### v3.10.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v3.9.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v3.8.5

* Update `mux-embed` to v4.22.0

#### v3.8.4

* Update `mux-embed` to v4.21.0

#### v3.8.3

* Update `mux-embed` to v4.20.0

#### v3.8.2

* Update `mux-embed` to v4.19.0

#### v3.8.1

* Update `mux-embed` to v4.18.0

#### v3.8.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v3.7.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v3.6.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Update `mux-embed` to v4.15.0

#### v3.5.5

* Update `mux-embed` to v4.14.0

#### v3.5.4

* Update `mux-embed` to v4.13.4

#### v3.5.3

* Update `mux-embed` to v4.13.3

#### v3.5.2

* Update `mux-embed` to v4.13.2

#### v3.5.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v3.5.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v3.4.3

* Publish package to NPM

#### v3.4.2

* Update `mux-embed` to v4.12.1

#### v3.4.1

* Update `mux-embed` to v4.12.0

#### v3.4.0

* Add ability to pass in the Flowplayer instance to `initFlowplayerMux` function

* Update `mux-embed` to v4.11.0

#### v3.3.10

* Update `mux-embed` to v4.10.0

#### v3.3.9

* Update `mux-embed` to v4.9.4

#### v3.3.8

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v3.3.7

* Update `mux-embed` to v4.9.2

#### v3.3.6

* Update `mux-embed` to v4.9.1

#### v3.3.5

* Update `mux-embed` to v4.9.0

#### v3.3.4

* Update `mux-embed` to v4.8.0

#### v3.3.3

* Update `mux-embed` to v4.7.0

#### v3.3.2

* Update `mux-embed` to v4.6.2

#### v3.3.1

* Update `mux-embed` to v4.6.1

#### v3.3.0

* Bump mux-embed to 4.6.0

#### v3.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v3.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v3.0.0

* Update mux-embed to v4.0.0
* Support server-side device detection

#### v2.0.2

* Improve error handling/reporting

#### v2.0.1

* Detect the correct `video_source_url`

#### v2.0.0

* Bump `mux-embed` to 3.0.0


# Monitor Brightcove (Web)
This guide walks through integration with [Brightcove web player](https://player.support.brightcove.com/) to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use the `videojs-mux` SDK, as described below.

* Engagement metrics
* Quality of Experience Metrics
* Web metrics such as Player Startup Time, Page Load Time, etc
* Custom Dimensions
* Request metrics
* Ads metrics
* Ads metadata
* Custom Beacon Domain


## 1. Install `videojs-mux`

Include the Mux JavaScript SDK for video.js (`videojs-mux`) either via Brightcove Studio, by adding the script `https://src.litix.io/videojs/4/videojs-mux.js` as a new JavaScript line in your Plugins configuration, or by loading `videojs-mux` from the CDN on your web pages.

```npm
npm install --save videojs-mux
```

```yarn
yarn add videojs-mux
```

```cdn
<script src="https://src.litix.io/videojs/4/videojs-mux.js"></script>
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

Initialize the `videojs` player like you normally would and get a reference to the `player`. Call `player.mux` with the Mux plugin options to initialize monitoring.

```html
<video
  id="my-player"
  data-video-id="..."
  data-account="..."
  data-player="..."
  data-embed="default"
  data-application-id
  class="video-js"
  controls>
</video>

<script>
  const playerInitTime = Date.now();
  // Get a reference to your player, and pass it to the init function
  const player = videojs("my-player");
  player.mux({
    debug: false,
    data: {
      env_key: 'ENV_KEY', // required
      // Metadata
      player_name: '', // ex: 'My Main Player'
      player_init_time: playerInitTime // ex: 1451606400000
      // ... and other metadata
    }
  });
</script>
```

## 3. Make your data actionable

The only required field in the `data` options that you pass into the `player.mux` function is `env_key`. But without some metadata the metrics in your dashboard will lack the necessary information to take meaningful actions. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata under the `data` key on initialization.

```js
// player is the instance returned by the `videojs` function
player.mux({
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

If your application plays multiple videos back-to-back in the same video player, you need to signal to the Mux SDK when a new video starts. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

```js
// player is the instance returned by the `videojs` function
player.mux.emit('videochange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is currently not paused. If you emit this event while the player is paused, the resulting view will not track video startup time correctly, and may also have incorrect watch time. Do not emit this event while the player is paused.

```js
// player is the instance returned by the `videojs` function
player.mux.emit('programchange', {
  video_id: 'abc345',
  video_title: 'My Other Great Video',
  video_series: 'Weekly Great Videos',
  // ...
});
```

## 5. Advanced options

### Disable cookies

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies. For information about the specific data tracked in the cookie, please refer to: [What information is stored in Mux Data HTML cookies](/docs/guides/ensure-data-privacy-compliance#what-information-is-stored-in-mux-data-html-cookies).

This is done by setting `disableCookies: true` in the options.

```js
// player is the instance returned by the `videojs` function
player.mux({
  debug: false,
  disableCookies: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Override 'do not track' behavior

By default, Mux plugins for HTML5-based players do not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack: true`.

```js
// player is the instance returned by the `videojs` function
player.mux({
  debug: false,
  respectDoNotTrack: true,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning that they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `videojs-mux` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the Mux monitor.

```js
// player is the instance returned by the `videojs` function
player.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.

The error message and code are combined together and aggregated with all errors that occur in your environment in order to find the most common errors that occur. To make error aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error (such as stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error such as stack trace or segment-ids where an error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information from the viewer in this data.*
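
One way to keep `player_error_context` detailed but bounded is to truncate stack traces before emitting. The helper below is a hypothetical sketch, not part of `videojs-mux`:

```javascript
// Hypothetical helper: build a payload for player.mux.emit('error', ...),
// truncating the stack trace so the context stays short, and leaving out
// anything that could identify the viewer.
function buildMuxErrorPayload(code, message, err, maxContext = 500) {
  const stack = err && err.stack ? String(err.stack) : '';
  return {
    player_error_code: code,
    player_error_message: message,
    player_error_context: stack.slice(0, maxContext)
  };
}
```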

### Error translator

If your player emits error events that are not fatal to playback, or the errors are unclear or lack helpful information in the default error messages and codes, you might find it helpful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

// player is the return value from the `videojs` function
player.mux({
  debug: false,
  errorTranslator,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
// player is the return value from the `videojs` function
player.mux({
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
// player is the return value from the `videojs` function
player.mux({
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

<LinkedHeader step={steps[6]} />

### Current release

#### v4.21.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v4.21.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v4.21.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v4.21.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v4.21.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v4.21.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v4.21.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v4.21.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v4.21.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v4.21.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v4.21.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v4.21.7

* Update `mux-embed` to v5.9.0

#### v4.21.6

* Update `mux-embed` to v5.8.3

#### v4.21.5

* Update `mux-embed` to v5.8.2

#### v4.21.4

* Update `mux-embed` to v5.8.1

#### v4.21.3

* Update `mux-embed` to v5.8.0

#### v4.21.2

* Update `mux-embed` to v5.7.0

#### v4.21.1

* Update `mux-embed` to v5.6.0

#### v4.21.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v4.20.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v4.20.2

* Update `mux-embed` to v5.4.2

#### v4.20.1

* Update `mux-embed` to v5.4.1

#### v4.20.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v4.19.4

* Update `mux-embed` to v5.3.3

#### v4.19.3

* Update `mux-embed` to v5.3.2

#### v4.19.2

* Update `mux-embed` to v5.3.1

#### v4.19.1

* Update `mux-embed` to v5.3.0

#### v4.19.0

* utilize onRequest rather than beforeSend for videojs 8.x

* Update `mux-embed` to v5.2.1

#### v4.18.1

* Update `mux-embed` to v5.2.0

#### v4.18.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v4.17.0

* Refactors for stricter data types (e.g. string vs. number) based on TypeScript types.

* Update `mux-embed` to v5.0.0

#### v4.16.4

* Update `mux-embed` to v4.30.0

#### v4.16.3

* Update `mux-embed` to v4.29.0

#### v4.16.2

* Update `mux-embed` to v4.28.1

#### v4.16.1

* Update `mux-embed` to v4.28.0

#### v4.16.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v4.15.3

* Update `mux-embed` to v4.26.0

#### v4.15.2

* Update `mux-embed` to v4.25.1

#### v4.15.1

* Update `mux-embed` to v4.25.0

#### v4.15.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v4.14.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v4.13.4

* Update `mux-embed` to v4.22.0

#### v4.13.3

* Update `mux-embed` to v4.21.0

#### v4.13.2

* Update `mux-embed` to v4.20.0

#### v4.13.1

* Update `mux-embed` to v4.19.0

#### v4.13.0

* Set Mux Error Context with error status from Video.js

#### v4.12.0

* Capture ad metadata for Video.js IMA

* Update `mux-embed` to v4.18.0

#### v4.11.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v4.10.1

* fix issue where VideoJS with hls.js might cause an exception when monitored

#### v4.10.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v4.9.1

* fix an issue where an exception may happen on certain Samsung TVs using `videojs-mux`

#### v4.9.0

* Register `beforesetup` hook to track `player_init_time` automatically. There is now no need to provide `player_init_time` in plugin initialization

* Record `request_url` and `request_id` with network events

* Update `mux-embed` to v4.15.0

#### v4.8.5

* Update `mux-embed` to v4.14.0

#### v4.8.4

* Update `mux-embed` to v4.13.4

#### v4.8.3

* Update `mux-embed` to v4.13.3

#### v4.8.2

* Update `mux-embed` to v4.13.2

#### v4.8.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v4.8.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v4.7.8

* Update `mux-embed` to v4.12.1

#### v4.7.7

* Update `mux-embed` to v4.12.0

#### v4.7.6

* Update `mux-embed` to v4.11.0

#### v4.7.5

* Update `mux-embed` to v4.10.0

#### v4.7.4

* Update `mux-embed` to v4.9.4

#### v4.7.3

* Use `videojs.Vhs` instead of `videojs.Hls` when available

#### v4.7.2

* Update `mux-embed` to v4.9.3

#### v4.7.1

* Update `mux-embed` to v4.9.2

#### v4.7.0

* HLS session and latency metrics

#### v4.6.6

* Update `mux-embed` to v4.9.1

#### v4.6.5

* Update `mux-embed` to v4.9.0

#### v4.6.4

* Fix an issue with removing `player_error_code` and `player_error_message` when the error code is `1`.
  Also stops emitting `MEDIA_ERR_ABORTED` as errors.
* Update `mux-embed` to v4.8.0

#### v4.6.3

* Update `mux-embed` to v4.7.0

#### v4.6.2

* Update `mux-embed` to v4.6.2

#### v4.6.1

* Update `mux-embed` to v4.6.1

#### v4.6.0

* Bump mux-embed to 4.6.0

#### v4.5.0

* Export a `register` function that takes a `videojs` instance to install the mux plugin on

#### v4.4.0

* Update `mux-embed` to v4.4.2

#### v4.3.0

* Update `mux-embed` to v4.3.0

#### v4.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v4.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v4.0.0

* Update `mux-embed` to v4.0.0
* Support server-side device detection
* Internal fixes and improvements

#### v3.1.4

* update logging around retrieving BANDWIDTH information

#### v3.1.3

* Bump `mux-embed` dependency to `3.4.3`.

#### v3.1.2

* Bump `mux-embed` dependency to `3.4.2`.


# Monitor Brightcove (iOS)
This guide walks through integration with [Brightcove iOS player](https://player.support.brightcove.com/) to collect video performance metrics with Mux Data.
Brightcove's native SDK for iOS is based on `AVPlayerLayer`. You will need to be using Brightcove's iOS player version `6.x`.

## Features

The following data can be collected by the Mux Data SDK when you use this SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics

```


## 1. Install Mux Data SDK

```
pod 'Mux-Stats-AVPlayer', '~>3.0'
```

This will install `Mux-Stats-AVPlayer` and the latest release of our [core Objective-C library](https://github.com/muxinc/stats-sdk-objc). There will be no breaking changes within a major version, so you can safely run `pod update` to pick up future releases.

Next, add the correct import statement to your application.

## 2. Initialize AVPlayerLayer monitor

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

In your application, you will need to hook into Brightcove's SDK lifecycle events in order to access the underlying `AVPlayerLayer` instance.

```objc
@import BrightcovePlayerSDK;
@import MUXSDKStats;

@property (nonatomic, copy) NSString *trackedPlayerName;

- (void)playbackController:(id<BCOVPlaybackController>)controller didAdvanceToPlaybackSession:(id<BCOVPlaybackSession>)session
{
    // Destroy previous MUXSDKStats if this signifies the other view ended
    // Note: you may want to handle this in another lifecycle event, if you
    // have one that signifies when the video playback has ended/exited.
    if (self.trackedPlayerName != nil) {
        [MUXSDKStats destroyPlayer:self.trackedPlayerName];
    }

    MUXSDKCustomerPlayerData *playerData = [[MUXSDKCustomerPlayerData alloc] initWithEnvironmentKey:@"ENV_KEY"];
    [playerData setPlayerName: @"Brightcove SDK w/ Mux"];
    // set additional player metadata here
    MUXSDKCustomerVideoData *videoData = [MUXSDKCustomerVideoData new];
    [videoData setVideoId:@"EXAMPLE ID"];
    // set additional video metadata here
    self.trackedPlayerName = @"example_player_name";
    [MUXSDKStats monitorAVPlayerLayer:session.playerLayer withPlayerName:self.trackedPlayerName playerData:playerData videoData:videoData];
}
```

Refer to the detailed guide for AVPlayer to finish setup.

<GuideCard
  title="Detailed AVPlayer guide"
  description="After getting a reference to your AVPlayerLayer instance, finish configuring it."
  links={[
    {title: "Read the guide", href: "/docs/guides/monitor-avplayer"},
  ]}
/>


# Brightcove (Android)
This guide walks through integration with Brightcove's Android player to collect video performance metrics with Mux Data.
## Features

The following data can be collected by the Mux Data SDK when you use this SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Customizable Error Tracking

```


## 1. Install the Mux Data SDK

Brightcove's native SDK for Android has support for both the native `MediaPlayer` as well as `ExoPlayer`. In the case that you utilize `ExoPlayer` (via a class such as `BrightcoveExoPlayerVideoView`), monitoring basic video playback is relatively simple.

## Requirements

* Brightcove SDK for Android 6.x
* ExoPlayer-based Brightcove Player (e.g. `BrightcoveExoPlayerVideoView`)

## Integration Instructions

Brightcove's SDK for Android encapsulates an underlying `SimpleExoPlayer` instance. In order to integrate, you need to create an instance of `MuxStats` for each new video loaded into the player. This is best done by listening for the `didSetVideo` event that the `EventEmitter` emits.

Brightcove's current Android SDK (6.2.x) uses ExoPlayer r2.7.x, so you should include the appropriate AAR file from our releases page and in our [Monitor ExoPlayer guide](/docs/guides/monitor-exoplayer).

Note: `didSetVideo` is used in order to get the updated `Video` when a playlist of `Video` objects is being played, so that you can retrieve the updated metadata.

```java
// MainFragment.java (or MainActivity.java, wherever
// you have access to your `BrightcoveExoPlayerVideoView`

import com.mux.stats.sdk.core.model.CustomerData;
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.muxstats.MuxStatsExoPlayer;

public class MainFragment extends BrightcovePlayerFragment implements EventListener {

  public static final String TAG = MainFragment.class.getSimpleName();
  private MuxStatsExoPlayer muxStatsExoPlayer;

  @Override
  public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
    View result = inflater.inflate(R.layout.fragment_main, container, false);
    baseVideoView = (BrightcoveExoPlayerVideoView) result.findViewById(R.id.brightcove_video_view);
    super.onCreateView(inflater, container, savedInstanceState);
    baseVideoView.getEventEmitter().on("didSetVideo", this);

    // Set up your videos for playback here
    Video video = Video.createVideo("https://path/to/video.mp4", DeliveryType.HLS);

    baseVideoView.add(video);
    baseVideoView.start();
    return result;
  }

  @Override
  public void processEvent(Event event) {
    ExoPlayerVideoDisplayComponent videoDisplayComponent = (ExoPlayerVideoDisplayComponent) baseVideoView.getVideoDisplay();
    Video video = baseVideoView.getCurrentVideo();
    ExoPlayer exoPlayer = videoDisplayComponent.getExoPlayer();

    CustomerPlayerData customerPlayerData = new CustomerPlayerData();
    CustomerVideoData customerVideoData = new CustomerVideoData();
    customerVideoData.setVideoTitle(video.getId());
    CustomerData customerData = new CustomerData(customerPlayerData, customerVideoData, null);

    if (muxStatsExoPlayer != null) {
      muxStatsExoPlayer.release();
      muxStatsExoPlayer = null;
    }

    muxStatsExoPlayer = new MuxStatsExoPlayer(this, "YOUR_ENV_KEY_HERE", exoPlayer, baseVideoView, customerData);
  }
}
```


# Monitor CTS PDK
This guide walks through integration with Comcast Technology Solutions Player Development Kit (CTS PDK).
## Features

The following data can be collected by the Mux Data SDK when you use this SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Custom Dimensions
- Custom Beacon Domain

```

Notes:

```md
Video Quality metrics are not available.
```

## 1. Install \`cts-mux\`

If installing from the MPX Console, load `cts-mux` from the CDN:

```curl
https://src.litix.io/cts/3/cts-mux.js
```

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

If installing in the player embed, follow the example below

```html
<div class="tpPlayer"
     id="player"
     // ... other configuration options
     tp:muxPlugin = "priority=1|URL=https://src.litix.io/cts/3/cts-mux.js|env_key=ENV_KEY|debug=false">
</div>
<script>
  // Creates the Player object that builds the component.
  const player = new Player("player");
  player.bind("player");
</script>
```

## 2. Make your data actionable

The only required field in the SDK options is `env_key`. Mux will automatically pull some metadata fields like `video_id`, `video_title`, and `video_duration` from the player itself. You can optionally override these values in the plugin parameters. Providing useful metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

Pass in metadata fields separated by `|` with the plugin parameters.

```html
<div class="tpPlayer"
     id="player"
     // ... other configuration options
     tp:muxPlugin = "priority=1|URL=https://src.litix.io/cts/3/cts-mux.js|env_key=ENV_KEY|debug=false|player_name='EXAMPLE_PLAYER_NAME'|player_version=1.0.0">
</div>
<script>
  // Creates the Player object that builds the component.
  const player = new Player("player");
  player.bind("player");
</script>
```
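Because the plugin parameters are passed as a single pipe-delimited string, it can be convenient to build that string from a plain object. The helper below is a hypothetical convenience function, not part of the CTS SDK:

```javascript
// Hypothetical helper: build the pipe-delimited value for `tp:muxPlugin`
// from a plain object of options. Keys keep their insertion order.
function buildMuxPluginParams(options) {
  return Object.entries(options)
    .map(([key, value]) => `${key}=${value}`)
    .join('|');
}

const params = buildMuxPluginParams({
  priority: 1,
  URL: 'https://src.litix.io/cts/3/cts-mux.js',
  env_key: 'ENV_KEY',
  debug: false
});
// → "priority=1|URL=https://src.litix.io/cts/3/cts-mux.js|env_key=ENV_KEY|debug=false"
```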

Without metadata, the metrics in your dashboard will lack the information needed to take meaningful action. Metadata allows you to search and filter on important fields in order to diagnose issues and optimize the playback experience for your end users.

For more information, view [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata).

## 3. Advanced options

### Changing the video

If the underlying source changes of the video within the same player, `cts-mux` will track this change automatically. No extra configuration is needed.

### Disable cookies

By default, `cts-mux` uses a cookie to track playback across subsequent page views. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For instance, if your site or application is targeted towards children under 13, you should disable the use of cookies.

This is done by setting `disableCookies=true` in the options passed to the Mux plugin.

```html
<div class="tpPlayer"
     id="player"
     // ... other configuration options
     tp:muxPlugin = "priority=1|URL=https://src.litix.io/cts/3/cts-mux.js|env_key=ENV_KEY|debug=false|player_name='EXAMPLE_PLAYER_NAME'|disableCookies=true">
</div>
<script>
  // Creates the Player object that builds the component.
  const player = new Player("player");
  player.bind("player");
</script>
```

### Override 'do not track' behavior

By default, `cts-mux` does not respect [Do Not Track](https://www.eff.org/issues/do-not-track) when set within browsers. This can be enabled in the options passed to Mux, via a setting named `respectDoNotTrack`. The default for this is `false`. If you would like to change this behavior, pass `respectDoNotTrack=true`.

```html
<div class="tpPlayer"
     id="player"
     // ... other configuration options
     tp:muxPlugin = "priority=1|URL=https://src.litix.io/cts/3/cts-mux.js|env_key=ENV_KEY|debug=false|player_name='EXAMPLE_PLAYER_NAME'|respectDoNotTrack=true">
</div>
<script>
  // Creates the Player object that builds the component.
  const player = new Player("player");
  player.bind("player");
</script>
```

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning that they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

There is currently no way to change the default error tracking behavior. If this is something you need in your CTS PDK integration, please reach out.

### Ads tracking with `cts-mux`

Mux has been tested with CTS's VAST plugin for ad support. Configure the VAST plugin as you would with your PDK player normally, and Mux will track ads automatically. No additional configuration is needed.

<LinkedHeader step={steps[4]} />

### Current release

#### v3.13.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v3.13.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v3.13.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v3.13.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v3.13.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v3.13.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v3.13.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes:
  * Updated dependency: `mux-embed` to v5.13.0

#### v3.13.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v3.13.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v3.13.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v3.13.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v3.13.7

* Update `mux-embed` to v5.9.0

#### v3.13.6

* Update `mux-embed` to v5.8.3

#### v3.13.5

* Update `mux-embed` to v5.8.2

#### v3.13.4

* Update `mux-embed` to v5.8.1

#### v3.13.3

* Update `mux-embed` to v5.8.0

#### v3.13.2

* Update `mux-embed` to v5.7.0

#### v3.13.1

* Update `mux-embed` to v5.6.0

#### v3.13.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v3.12.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v3.12.2

* Update `mux-embed` to v5.4.2

#### v3.12.1

* Update `mux-embed` to v5.4.1

#### v3.12.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v3.11.6

* Update `mux-embed` to v5.3.3

#### v3.11.5

* Update `mux-embed` to v5.3.2

#### v3.11.4

* Update `mux-embed` to v5.3.1

#### v3.11.3

* Update `mux-embed` to v5.3.0

#### v3.11.2

* Update `mux-embed` to v5.2.1

#### v3.11.1

* Update `mux-embed` to v5.2.0

#### v3.11.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v3.10.0

* Refactors to properly enforce new TypeScript types and account for non-standard constructor usage by CTS.

* Update `mux-embed` to v5.0.0

#### v3.9.4

* Update `mux-embed` to v4.30.0

#### v3.9.3

* Update `mux-embed` to v4.29.0

#### v3.9.2

* Update `mux-embed` to v4.28.1

#### v3.9.1

* Update `mux-embed` to v4.28.0

#### v3.9.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v3.8.3

* Update `mux-embed` to v4.26.0

#### v3.8.2

* Update `mux-embed` to v4.25.1

#### v3.8.1

* Update `mux-embed` to v4.25.0

#### v3.8.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v3.7.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v3.6.5

* Update `mux-embed` to v4.22.0

#### v3.6.4

* Update `mux-embed` to v4.21.0

#### v3.6.3

* Update `mux-embed` to v4.20.0

#### v3.6.2

* Update `mux-embed` to v4.19.0

#### v3.6.1

* Update `mux-embed` to v4.18.0

#### v3.6.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v3.5.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v3.4.6

* Update `mux-embed` to v4.15.0

#### v3.4.5

* Update `mux-embed` to v4.14.0

#### v3.4.4

* Update `mux-embed` to v4.13.4

#### v3.4.3

* Update `mux-embed` to v4.13.3

#### v3.4.2

* Update `mux-embed` to v4.13.2

#### v3.4.1

* Update `mux-embed` to v4.13.1

#### v3.4.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v3.3.14

* Publish package to NPM

#### v3.3.13

* Update `mux-embed` to v4.12.1

#### v3.3.12

* Update `mux-embed` to v4.12.0

#### v3.3.11

* Update `mux-embed` to v4.11.0

#### v3.3.10

* Update `mux-embed` to v4.10.0

#### v3.3.9

* Update `mux-embed` to v4.9.4

#### v3.3.8

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v3.3.7

* Update `mux-embed` to v4.9.2

#### v3.3.6

* Update `mux-embed` to v4.9.1

#### v3.3.5

* Update `mux-embed` to v4.9.0

#### v3.3.4

* Update `mux-embed` to v4.8.0

#### v3.3.3

* Update `mux-embed` to v4.7.0

#### v3.3.2

* Update `mux-embed` to v4.6.2

#### v3.3.1

* Update `mux-embed` to v4.6.1

#### v3.3.0

* Bump mux-embed to 4.6.0

#### v3.2.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v3.1.0

* Update `mux-embed` to v4.1.1
* Fix an issue where `player_remote_played` would not be reported correctly

#### v3.0.0

* Update mux-embed to v4.0.0
* Support server-side device detection


# Monitor Chromecast
This guide walks through integration with Chromecast to collect video performance metrics with Mux data.
Mux Data is the best way to monitor video streaming performance.

Integration is easy - just initialize the Mux SDK, pass in some metadata, and you're up and running in minutes.

This documents integration instructions for Chromecast. For other players, see the additional Integration Guides.

## Features

The following data can be collected by the Mux Data SDK when you use this SDK, as described below.

```md
- Engagement metrics
- Quality of Experience Metrics
- Web metrics such as Player Startup Time, Page Load Time, etc
- Average Bitrate metrics and `renditionchange` events
- Request metrics
- Customizable Error Tracking
- Custom Beacon Domain

```

Notes:

```md
Average Bitrate metrics available in v4.2.11 and newer.
```

## 1. Include the Mux Data SDK

Mux supports Chromecast applications that are built on top of the Cast Application Framework [CAF](https://developers.google.com/cast/docs/caf_receiver_overview) Receiver SDK. The CAF Receiver SDK supports the following [streaming protocols](https://developers.google.com/cast/docs/media#delivery-methods-and-adaptive-streaming-protocols).

A Chromecast application contains two main components: a sender and a receiver. The Mux Data SDK is integrated on the receiver side; include the `chromecast-mux.js` JavaScript file within your custom receiver application. You can use the Mux-hosted version of the script to receive automatic updates. The API will not change within a major version, which is pinned in the script URL (as in `chromecast/MAJOR_VERSION/chromecast-mux.js`).

```npm
npm install --save @mux/mux-data-chromecast
```

```yarn
yarn add @mux/mux-data-chromecast
```

```cdn
<script src="//src.litix.io/chromecast/4/chromecast-mux.js"></script>
```



## 2. Initialize Mux Data

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

To monitor video playback within your Chromecast application, pass the `PlayerManager` instance to `initChromecastMux` along with SDK options and metadata.

You can initialize within a message interceptor for the `LOAD` event (recommended as of version 4.0.0) or immediately on app load, as in earlier versions.

```js
import initChromecastMux from '@mux/mux-data-chromecast';

var app = {
  init: function () {
    const context = cast.framework.CastReceiverContext.getInstance();
    const playerManager = context.getPlayerManager();
    let firstPlay = true;
    let playerInitTime = initChromecastMux.utils.now();

    playerManager.setMessageInterceptor(cast.framework.messages.MessageType.LOAD, loadRequestData => {
      if (firstPlay) {
        initChromecastMux(playerManager, {
          debug: false,
          data : {
            env_key: 'ENV_KEY', // required

            // Metadata
            player_name: 'Custom Player', // ex: 'My Main Player'
            player_init_time: playerInitTime,

            // ... additional metadata
          }
        });
      }

      return loadRequestData;
    });

    context.start();
  }
};

$(document).ready(function () {
  app.init();
});
```

After you've finished the integration, the quickest way to confirm the SDK is loaded is to pass `debug: true` in the options passed to the SDK. With this flag enabled, open the debug console and you should see debug statements from `[mux]` when you click play on the video.

A few minutes after you stop watching a video, you'll see the results in your Mux account. We'll also email you when your first video view has been recorded. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

Note that it may take a few minutes for views to show up in the Mux Data dashboard.

## 3. Make your data actionable

[Detailed Documentation](/docs/guides/make-your-data-actionable-with-metadata)

Options are provided via the `data` object passed in the call to `initChromecastMux`.

All metadata details except for `env_key` are optional, however you'll be able to compare and see more interesting results as you include more details. This gives you more metrics and metadata about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Set or update metadata after initialization

There are some cases where you may not have the full set of metadata until after the video playback has started. In this case, you should omit the values when you first call `initChromecastMux`. Then, once you have the metadata, you can update the metadata with the `updateData` method.

```js
playerManager.mux.updateData({ video_title: 'My Updated Great Video' });
```

## 5. Changing the video

There are two cases where the underlying tracking of the video view needs to be reset:

1. **New source:** When you load a new source URL into an existing player.
2. **New program:** When the program within a singular stream changes (such as a program change within a continuous live stream).

Note: You do not need to change the video info when changing to a different source of the same video content (e.g. different resolution or video format).

### New source

If your application plays multiple videos back-to-back in the same video player, you need to signal when a new video starts to the Mux SDK. Examples of when this is needed are:

* The player advances to the next video in a playlist
* The user selects a different video to play

In order to signal the Mux SDK that a new view is starting, you will need to emit a `videochange` event, along with metadata about the new video. See metadata in [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata) for the full list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

It's best to change the video info immediately after telling the player which new source to play.

The source change should be done by intercepting the `cast.framework.messages.MessageType.LOAD` message and doing the following:

```js
playerManager.setMessageInterceptor(cast.framework.messages.MessageType.LOAD, loadRequestData => {
  // It's important to only call this on subsequent videos being loaded, not
  // the first playback (where you call `initChromecastMux`).
  if (!firstVideo) {
    playerManager.mux.emit('videochange', { ... });
  }

  return loadRequestData;
});
```

### New program

In some cases, you may have the program change within a stream, and you may want to track each program as a view on its own. An example of this is a live stream that streams multiple programs back to back, with no interruptions.

In this case, you emit a `programchange` event, including the updated metadata for the new program within the continuous stream. This will remove all previous video data and reset all metrics for the video view, creating a new video view. See [Metadata](/docs/guides/make-your-data-actionable-with-metadata) for the list of video details you can provide. You can include any metadata when changing the video but you should only need to update the values that start with `video_`.

Note: The `programchange` event is intended to be used *only* while the player is not paused. If you emit it while the player is paused, the resulting view will not track video startup time correctly and may also report incorrect watch time.
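Emitting `programchange` can be sketched as follows. The `playerManager.mux` object below is a self-contained stub standing in for the monitor attached by `initChromecastMux`, and all metadata values are illustrative:

```javascript
// Stand-in stub for the CAF playerManager with Mux Data attached
// via initChromecastMux (so this sketch is self-contained).
const playerManager = {
  mux: {
    emit(type, data) { this.lastEvent = { type, data }; }
  }
};

// When the next program begins (while the player is not paused),
// emit updated video_* metadata for the new program.
playerManager.mux.emit('programchange', {
  video_id: 'program-2',        // illustrative values
  video_title: 'Evening News',
  video_series: 'News Block'
});
```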

## 6. Advanced options

### Customize error tracking behavior

<Callout type="error" title="Errors are fatal">
  Errors tracked by Mux are considered fatal, meaning they are the result of playback failures. Non-fatal errors should not be captured.
</Callout>

By default, `@mux/mux-data-chromecast` will track errors emitted from the video element as fatal errors. If a fatal error happens outside of the context of the player, you can emit a custom error to the Mux monitor.

```js
playerManager.mux.emit('error', {
  player_error_code: 100,
  player_error_message: 'Description of error',
  player_error_context: 'Additional context for the error'
});
```

When triggering an error event, it is important to provide values for `player_error_code` and `player_error_message`. The `player_error_message` should provide a generalized description of the error as it happened. The `player_error_code` must be an integer, and should provide a category of the error. If the errors match up with the [HTML Media Element Error](https://developer.mozilla.org/en-US/docs/Web/API/MediaError), you can use the same codes as the corresponding HTML errors. However, for custom errors, you should choose a number greater than or equal to `100`.

In general you should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages.
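For instance, two distinct network failure modes might share one code but carry different messages. A minimal sketch (the code and messages are illustrative):

```javascript
// Both network failure conditions share player_error_code 100;
// only player_error_message differs, so they aggregate together.
function describeNetworkError(kind) {
  if (kind === 'timeout') {
    return { player_error_code: 100, player_error_message: 'Network request timed out' };
  }
  return { player_error_code: 100, player_error_message: 'Network request failed' };
}
```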

The error message and code are combined and aggregated with all errors that occur in your environment to surface the most common errors. To make this aggregation as useful as possible, these values should be general enough to provide useful information but not specific to each individual error occurrence (such as a stack trace).

You can use `player_error_context` to provide instance-specific information derived from the error, such as a stack trace or segment IDs where the error occurred. This value is not aggregated with other errors and can be used to provide detailed information. *Note: Please do not include any personally identifiable information about the viewer in this data.*

### Error translator

If your player emits error events that are not fatal to playback, or the default error messages and codes are unclear or unhelpful, you may find it useful to use an error translator or to disable automatic error tracking altogether.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context)
  };
}

initChromecastMux(playerManager, {
  debug: false,
  errorTranslator: errorTranslator,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata
    player_name: 'Custom Player', // ex: 'My Main Player'
    // ... additional metadata
  }
});
```

If you return `false` from your `errorTranslator` function then the error will not be tracked. Do this for non-fatal errors that you want to ignore. If your `errorTranslator` function itself raises an error, then it will be silenced and the player's original error will be used.
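For example, a translator might drop errors it considers non-fatal by returning `false` and pass everything else through unchanged. A minimal sketch (the code threshold is illustrative):

```javascript
function errorTranslator(error) {
  // Treat codes below 100 as non-fatal and ignore them entirely.
  if (error.player_error_code < 100) {
    return false; // returning false means the error is not tracked
  }
  return error; // fatal errors are tracked as-is
}
```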

### Disable automatic error tracking

In the case that you want full control over what errors are counted as fatal or not, you may want to consider turning off Mux's automatic error tracking completely. This can be done by passing `automaticErrorTracking: false` in the configuration object.

```js
initChromecastMux(playerManager, {
  debug: false,
  automaticErrorTracking: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata
    player_name: 'Custom Player', // ex: 'My Main Player'
    // ... additional metadata
  }
});
```

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
initChromecastMux(playerManager, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  data: {
    env_key: "ENV_KEY",
    // ...
  }
});
```

## Destroying the Monitor

There are certain use cases where you want to stop monitoring playback within a player (for instance, the player is no longer being used, you are recycling players, or you are shutting down the application). In these cases, make sure to destroy the monitor by calling `playerManager.mux.destroy()`.

<LinkedHeader step={steps[7]} />

### Current release

#### v4.16.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v4.16.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v4.16.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v4.16.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v4.16.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v4.16.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v4.16.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes
  * Updated dependency: `mux-embed` to v5.13.0

#### v4.16.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v4.16.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v4.16.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v4.16.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v4.16.7

* Update `mux-embed` to v5.9.0

#### v4.16.6

* Update `mux-embed` to v5.8.3

#### v4.16.5

* Update `mux-embed` to v5.8.2

#### v4.16.4

* Update `mux-embed` to v5.8.1

#### v4.16.3

* Update `mux-embed` to v5.8.0

#### v4.16.2

* Update `mux-embed` to v5.7.0

#### v4.16.1

* Update `mux-embed` to v5.6.0

#### v4.16.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v4.15.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v4.15.2

* Update `mux-embed` to v5.4.2

#### v4.15.1

* Update `mux-embed` to v5.4.1

#### v4.15.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v4.14.6

* Update `mux-embed` to v5.3.3

#### v4.14.5

* Update `mux-embed` to v5.3.2

#### v4.14.4

* Update `mux-embed` to v5.3.1

#### v4.14.3

* Update `mux-embed` to v5.3.0

#### v4.14.2

* Update `mux-embed` to v5.2.1

#### v4.14.1

* Update `mux-embed` to v5.2.0

#### v4.14.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v4.13.0

* TypeScript type changes only.

* Update `mux-embed` to v5.0.0

#### v4.12.4

* Update `mux-embed` to v4.30.0

#### v4.12.3

* Update `mux-embed` to v4.29.0

#### v4.12.2

* Update `mux-embed` to v4.28.1

#### v4.12.1

* Update `mux-embed` to v4.28.0

#### v4.12.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v4.11.5

* Update `mux-embed` to v4.26.0

#### v4.11.4

* Update `mux-embed` to v4.25.1

#### v4.11.3

* \[advanced-use] Add option to turn off automatic ad tracking for Chromecast applications

#### v4.11.2

* Update `mux-embed` to v4.25.0

#### v4.11.1

* Fix an issue where certain ad providers may result in javascript errors

#### v4.11.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v4.10.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v4.9.0

* fix an issue where retrieving ad information on chromecast can throw an exception

* Update `mux-embed` to v4.22.0

#### v4.8.0

* Include Ad metadata in ad events

* Update `mux-embed` to v4.21.0

#### v4.7.0

* Added capturing player dimensions with device pixel ratio considered
* Added capturing dropped frames

* Update `mux-embed` to v4.20.0

#### v4.6.2

* Update `mux-embed` to v4.19.0

#### v4.6.1

* Update `mux-embed` to v4.18.0

#### v4.6.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v4.5.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v4.4.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Update `mux-embed` to v4.15.0

#### v4.3.5

* Update `mux-embed` to v4.14.0

#### v4.3.4

* Update `mux-embed` to v4.13.4

#### v4.3.3

* Update `mux-embed` to v4.13.3

#### v4.3.2

* Update `mux-embed` to v4.13.2

#### v4.3.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v4.3.0

* Upgraded internal webpack version

* Improve Chromecast rebuffering metrics

* Update `mux-embed` to v4.13.0

#### v4.2.15

* Publish package to NPM

#### v4.2.14

* Update `mux-embed` to v4.12.1

#### v4.2.13

* Update `mux-embed` to v4.12.0

#### v4.2.12

* Update `mux-embed` to v4.11.0

#### v4.2.11

* Listen for Chromecast BITRATE\_CHANGED event, update the video source width and height, then call Mux `renditionchange` with the new bitrate

#### v4.2.10

* Update `mux-embed` to v4.10.0

#### v4.2.9

* Update `mux-embed` to v4.9.4

#### v4.2.8

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v4.2.7

* Update `mux-embed` to v4.9.2

#### v4.2.6

* Update `mux-embed` to v4.9.1

#### v4.2.5

* Update `mux-embed` to v4.9.0

#### v4.2.4

* Update `mux-embed` to v4.8.0

#### v4.2.3

* Update `mux-embed` to v4.7.0

#### v4.2.2

* Update `mux-embed` to v4.6.2

#### v4.2.1

* Update `mux-embed` to v4.6.1

#### v4.2.0

* Bump mux-embed to 4.6.0

#### v4.1.1

* Fix an issue where `player.mux.destroy()` would raise an exception if called without any parameters.

#### v4.1.0

* Update `mux-embed` to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v4.0.0

* Remove automatic video change tracking. You must now emit `videochange` events to signal a change. This should be done inside an interceptor for the `LOAD` event.
* Fix an issue where `ended` events were sent at the wrong time.
* Ensure that tracking is paused on the Chromecast `STOPPED` event.

#### v3.1.0

* Update mux-embed to v4.1.1
* Add support for custom dimensions
* Fix an issue where `player_remote_played` was not functioning. This value defaults to `true` if not set

#### v3.0.0

* Update mux-embed to v4.0.0
* Update device model appropriately for various Chromecast devices
* Support server-side device detection

#### v2.0.1

* Bug fix: Ensure the `video_source_url` is detected

#### v2.0.0

* Support ad event tracking
* Default `videochange` detection to false - this can still be enabled if required
* Clean up error tracking to report only fatal errors
* Minor optimisations and bug fixes

#### v1.0.0

* Support customizing error handling (via configuring automaticErrorTracking and errorTranslator).
* Do not shut down on REQUEST\_STOP.
* Expose `playerManager.mux.destroy()` to stop monitoring the player instance.
* Clean up better and minor bug fix around destroying monitor.

#### v0.1.0

* Initial SDK created.


# Monitor Roku
This guide walks through integration with Roku to collect video performance metrics with Mux Data.
Mux's Roku integration supports Roku SceneGraph applications, in conjunction with standard `Video` nodes. Mux runs as a `Task` alongside the `Video` node, and supports instances where the `Video` nodes are reused with additional content as well as when the `Video` nodes are reset between content.

## Features

The following data can be collected by the Mux Data SDK when you use the Roku SDK, as described below.

* Engagement metrics
* Quality of Experience metrics
* Custom Dimensions

Notes:

* Video Quality metrics are not available.

## 1. Include the Mux Data SDK

Place the SDK file in your `libs` folder. The latest version of the SDK can be found here:

```sh
https://src.litix.io/roku/2/mux-analytics.brs
```

## 2. Set up a new Mux Task

Create a new `Task` XML named `MuxTask.xml` inside your `components` folder and give it the following interface. This is used to link the `mux-analytics.brs` file into your application.

```html
<component name="MuxTask" extends="Task">
  <interface>
    <field id="video" type="node" alwaysNotify="true"/>
    <field id="config" type="assocarray" alwaysNotify="true"/>
    <field id="rafEvent" type="assocarray" alwaysNotify="true"/>
    <field id="error" type="assocarray" alwaysNotify="true"/>
    <field id="view" type="String" alwaysNotify="true"/>
    <field id="exit" type="Boolean" alwaysNotify="true"/>
    <field id="exitType" type="String" alwaysNotify="true" value="hard"/>
    <field id="useRenderStitchedStream" type="Boolean" alwaysNotify="true" value="false"/>
    <field id="useSSAI" type="Boolean" alwaysNotify="true" value="false"/>
    <field id="disableAutomaticErrorTracking" type="Boolean" alwaysNotify="true" value="false"/>
    <field id="randomMuxViewerId" type="Boolean" value="false"/>
    <field id="cdn" type="String" alwaysNotify="true" />
    <field id="disablePlayheadRebufferTracking" type="Boolean" alwaysNotify="true" value="false" />
    <field id="disableDecoderStats" type="Boolean" alwaysNotify="true" value="false" />
    <field id="rebufferstart" type="Boolean" alwaysNotify="true" />
    <field id="rebufferend" type="Boolean" alwaysNotify="true" />
    <field id="playback_mode" type="assocarray" alwaysNotify="true" />
    <field id="request" type="assocarray" alwaysNotify="true" />
  </interface>
  <script type="text/brightscript" uri="pkg:/libs/mux-analytics.brs"/>
</component>
```

## 3. Set up the task to respond to video events

Within your main application, create the Mux Task node, and pass the `Video` node that you are tracking to it. This should be done before the content is set into the `Video` node so that Mux can track the load process.

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```js
m.mux = m.top.CreateNode("mux")
m.mux.setField("video", m.video)

muxConfig = {
  env_key: "ENV_KEY",
}

m.mux.setField("config", muxConfig)
m.mux.control = "RUN"

' Load the video into the Video node
```

After you've integrated, start playing a video in the player you've integrated with. A few minutes after you stop watching, you'll see the results in your Mux account. We'll also email you when your first video view has been recorded.

You can also test that Mux is receiving data in the Mux Data dashboard. Login to the dashboard and find the environment that corresponds to your `ENV_KEY` and look for video views.

Note that it may take a few minutes for views to show up in the Mux Data dashboard.

## 4. Debugging

To help you with the integration process and verify that you have successfully incorporated the SDK within your player, we provide a number of optional manifest attributes. These attributes help you understand how the Mux SDK event tracking works and show you the actual data being collected, as it occurs.

**NOTE:** The outputs illustrated below are printed on a single line within the terminal to reduce clutter.

<Callout type="info">
  Note that the following settings are configured in your application's `manifest` file, rather than passed into the config object.
</Callout>

### mux\_debug\_events

#### Values

`full`, `partial` or `none`

#### Description

Outputs the event at the time it occurs. Default value is `none`.

#### Example output

Property set to `partial`:

```sh
[mux-analytics] EVENT playerready
```

Property set to `full`:

```sh
[mux-analytics] EVENT playing
{
  viewer_application_name:Roku,
  mux_api_version:2.1,
  view_seek_duration:0,
  viewer_application_version:9.20,
  player_name:Reset Player,
  viewer_time:1582317809984,
  view_start:1582317808627,
  player_model_number:4660X,
  video_source_mime_type:mp4,
  event:playing,
  ...
```

***

### mux\_debug\_beacons

#### Values

`full`, `partial` or `none`

#### Description

Outputs the full beacon data (`full`) or just the event names (`partial`) at the time of sending. Default value is `none`.

#### Example output

Property set to `partial`:

```sh
[mux-analytics] BEACON (2) [  playerready viewstart ]
```

Property set to `full`:

```sh
[mux-analytics] BEACON (2)
[
  {
    viewer_application_name:Roku,
    mux_api_version:2.1,
    view_seek_duration:0,
    viewer_application_version:9.20,
    player_name:Reset Player,
    viewer_time:1582317809984,
    view_start:1582317808627,
    player_model_number:4660X,
    video_source_mime_type:mp4,
    event:playerready,
    ...
  }, {
    viewer_application_name:Roku,
    mux_api_version:2.1,
    view_seek_duration:0,
    viewer_application_version:9.20,
    player_name:Reset Player,
    viewer_time:1582317809984,
    view_start:1582317808627,
    player_model_number:4660X,
    video_source_mime_type:mp4,
    event:viewstart,
    ...
  }
]
```

***

### `mux_base_url`

#### Values

Protocol + domain name, e.g. `https://img.litix.io`

#### Description

Controls the domain to which data is sent. Useful for environment-specific builds of your project.

## 5. Make your data actionable

The Roku SDK supports adding metadata via two different mechanisms.

The majority of the metadata should be passed inside the `muxConfig` object that is passed to the Mux Task. You can read detailed information about the fields that are supported in [Metadata](/docs/guides/make-your-data-actionable-with-metadata). To update any field, update this within `muxConfig` and then call `m.mux.setField("config", muxConfig)`.
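For example, updating metadata for a new piece of content might look like the following sketch (the `video_*` values are illustrative; field names come from the Metadata guide):

```js
' Update video metadata in muxConfig, then push the updated config to the task
muxConfig.video_id = "video-2"
muxConfig.video_title = "Second Episode"
m.mux.setField("config", muxConfig)
```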

Some other underlying information is mapped from standard [Roku content metadata](https://developer.roku.com/docs/developer-program/getting-started/architecture/content-metadata.md), most of which you probably already set when creating your video. In particular, the metadata fields that you should set (if you do not already) are:

* *ContentType*
* *URL*
* *Live*
* *StreamFormat*
* *Length*

## 6. Advertising configuration

If advertising is to be used, you must send the appropriate events to the Mux Task, as shown below.

```js
function setUpRokuAdFramework()
  adIface.SetTrackingCallback(adTrackingCallback, adIface)
end function

function adTrackingCallback(obj = Invalid as Dynamic, eventType = Invalid as Dynamic, ctx = Invalid as Dynamic)
  m.mux = GetGlobalAA().global.findNode("mux")
  adUrl = Invalid
  if obj <> Invalid
    adUrl = obj.getAdUrl()
  end if
  m.mux.setField("rafEvent", {obj: { adurl: adUrl }, eventType:eventType, ctx:ctx})
end function
```

If you would like to pass your own ad parameters, you can do so by setting `ctx.mux` to an associative array with the values you desire. Currently the only value supported is `ad_type`.

```js
function adTrackingCallback(obj = Invalid as Dynamic, eventType = Invalid as Dynamic, ctx = Invalid as Dynamic)
  m.mux = GetGlobalAA().global.findNode("mux")
  adUrl = Invalid
  if obj <> Invalid
    adUrl = obj.getAdUrl()
  end if
  ' Add your ad_type if you desire
  ctx.mux = {}
  ctx.mux.ad_type = "preroll"
  m.mux.setField("rafEvent", {obj: { adurl: adUrl }, eventType:eventType, ctx:ctx})
end function
```

If you are utilizing RAF's `renderStitchedStream` method to stitch ads and content together client-side, you must tell the Mux SDK that this is in use by setting `useRenderStitchedStream` to `true` on the Mux Task:

```js
mux.setField("useRenderStitchedStream", true)
```

If you are *not* utilizing `renderStitchedStream` but instead controlling ad and content playback directly, then you need to set `useRenderStitchedStream` to `false`.

If you are utilizing server-side ad insertion (SSAI), you should signal that to the SDK by setting `useSSAI` to `true`:

```js
mux.setField("useSSAI", true)
```

## 7. Additional configuration

There are a number of options you can configure on MuxTask, which can be used to customize the SDK:

| Field Name | Type | Description |
|:-|:-|:-|
| `video` | `Node` | The `Video` node that is being tracked. |
| `config` | `assocarray` | Configuration for the Mux task, including env key and additional metadata. |
| `rafEvent` | `assocarray` | Used to forward RAF events from your application into the Mux SDK. See [Advertising Configuration](#6-advertising-configuration) for more information. |
| `view` | `String` | Used to control when views start and end directly. See [Controlling View Start and End Directly](#controlling-view-start-and-end-directly) for more information. |
| `disableAutomaticErrorTracking` | `Boolean` | Used to signal errors manually. See [Manually Tracking Errors](#manually-tracking-errors) for more information. |
| `error` | `assocarray` | Used to signal errors manually. See [Manually Tracking Errors](#manually-tracking-errors) for more information. |
| `exit` | `Boolean` | Used to exit tracking directly. See [Controlling View Start and End Directly](#controlling-view-start-and-end-directly) for more information. |
| `exitType` | `String` | Used to control the mechanism used while exiting tracking. See [Controlling View Start and End Directly](#controlling-view-start-and-end-directly) for more information. |
| `useRenderStitchedStream` | `Boolean` | Used to set whether `renderStitchedStream` is used for advertising in your application. See [Advertising Configuration](#6-advertising-configuration) for more information. |
| `useSSAI` | `Boolean` | Used to set whether Roku's built-in server-side-ad-insertion is used for advertising in your application. See [Advertising Configuration](#6-advertising-configuration) for more information. |
| `randomMuxViewerId` | `Boolean` | Used to instruct Mux to use a random viewer ID instead of an ID based on the device. By default, Mux utilizes the [device RIDA](https://developer.roku.com/docs/references/brightscript/interfaces/ifdeviceinfo.md#getrida-as-string) as the viewer ID. If this flag is enabled, unique viewer counts may be inflated.  |
| `cdn` | `String` | Used to manually signal when the CDN changes. See [CDN Tracking](#cdn-tracking) for more information. |
| `disablePlayheadRebufferTracking` | `Boolean` | Used to control rebuffering directly. See [Rebuffer Controls](#rebuffer-controls) for more information. |
| `rebufferstart` | `Boolean` | Used to control rebuffering directly. See [Rebuffer Controls](#rebuffer-controls) for more information. |
| `rebufferend` | `Boolean` | Used to control rebuffering directly. See [Rebuffer Controls](#rebuffer-controls) for more information. |
| `disableDecoderStats` | `Boolean` | By default, Mux sets `enableDecoderStats` to true on the `Video` node. Set this to `true` to disable decoder stats; dropped frames will no longer be tracked. |
| `playback_mode` | `assocarray` | Used to set and update [Playback Mode](#playback-mode). |
| `request` | `assocarray` | Used to trigger [custom network request events](#network-custom-request-event). |

### Controlling View Start and End Directly

In some situations, it is necessary to directly signal the beginning or ending of a `view` to Mux. This is necessary when the `Video` node is recycled (i.e. more pieces of content are loaded into the same node), or when using advertising, as ads run outside of the lifecycle of the `Video` node.

Note: A `view` is defined as the user watching a single piece of *content*, which includes any advertising.

```js
mux = GetGlobalAA().global.findNode("mux")

' To signal the start of a view:
mux.setField("view", "start")

' To signal the end of a view:
mux.setField("view", "end")
```

The `exitType` setting controls the behavior of the task when a request to exit/terminate the thread is invoked (via `mux.exit=true`). The default value of `exitType` is `hard`.

If the value is set to `hard`, the thread terminates immediately and any data that has not yet propagated to the Mux servers is lost.

If the value is set to `soft`, the thread sends all remaining data to the Mux servers and then terminates.

To change the value to `soft`, call `m.mux.setField("exitType", "soft")`.

NOTE: This means there may be a delay between calling `mux.exit=true` and the task thread actually terminating. Please ensure you have a single Mux Task running at any given time.
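Putting these together, a soft shutdown can be sketched as:

```js
' Flush any remaining data to Mux before the task thread terminates
m.mux.setField("exitType", "soft")
m.mux.setField("exit", true)
```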

### Manually Tracking Errors

The Mux SDK for Roku tracks error events from the Video node automatically, and reports them as fatal playback errors. If you would like to disable this automatic error tracking, you can set the following in your MuxTask.xml:

```html
<field id="disableAutomaticErrorTracking" type="Boolean" alwaysNotify="true" value="true"/>
```

While it is not advised to control this at runtime, you can also set this by calling

```js
mux.setField("disableAutomaticErrorTracking", true)
```

To emit error events, trigger errors directly by calling

```js
mux.setField("error", {
  player_error_code: errorCode,
  player_error_message: errorMessage,
  player_error_context: errorContext,
  player_error_severity: errorSeverity,
  player_error_business_exception: isBusinessException
})
```

The error code and message should always be provided, and you can set the other fields if desired. The possible values for `errorSeverity` are `"warning"` or `"fatal"`. See [Error Classification](/docs/guides/error-categorization) for more details.

### CDN Tracking

The Mux SDK for Roku can track CDN change events. To emit these events, signal CDN changes directly by calling

```js
' The "new_cdn" string should be the new CDN name
mux.setField("cdn", "new_cdn")
```

### Rebuffer Controls

<Callout type="warning">
  Mux does not suggest disabling the automatic rebuffer tracking. The following is for advanced usage only.
</Callout>

The Mux SDK for Roku tracks rebuffering from the `Video` node automatically. If you would like to disable this automatic rebuffer tracking, you can set the following in your MuxTask.xml:

```html
<field id="disablePlayheadRebufferTracking" type="Boolean" alwaysNotify="true" value="true"/>
```

While it is not advised to control this at runtime, you can also set this by calling

```js
mux.setField("disablePlayheadRebufferTracking", true)
```

Rebuffer start and end events can be emitted by calling the following

```js
mux.setField("rebufferstart", true)
```

```js
mux.setField("rebufferend", true)
```

### Playback Mode

A playback mode event can be emitted by calling the following

```js
mux.setField("playback_mode", {
  player_playback_mode: mode,
  player_playback_mode_data: data
})
```

The `mode` should always be provided; suggested values are `inline`, `fullscreen`, `mini`, and `pip`.
The `data` is optional; if provided, it should be a parse-able JSON string.

A `playbackmodechange` event will be emitted containing the following fields:

* `player_playback_mode` (containing the mode set)
* `player_playback_mode_data` (containing the data set)
* `ad_playing_time_ms_cumulative` (containing ad watch time)
* `view_playing_time_ms_cumulative` (containing the total content watch time plus ad watch time)

### Network custom request event

Custom request events can be emitted with custom data for network requests by calling the following

```js
mux.setField("request", manifestRequest)
```

The `manifestRequest` must be an associative array.

The following fields can be included in `manifestRequest`. All fields should be numeric values, not strings, except where noted:

* `type` (required, string) - The status of the request: `completed`, `failed`, or `canceled`
* `request_start` (numeric) - Timestamp in milliseconds since the Unix epoch when the request was initiated
* `request_response_start` (numeric) - Timestamp in milliseconds since the Unix epoch when the first byte of the response was received
* `request_response_end` (numeric) - Timestamp in milliseconds since the Unix epoch when the last byte of the response was received
* `request_bytes_loaded` (numeric) - The total number of bytes loaded as part of this request
* `request_hostname` (string) - The hostname portion of the URL that was requested
* `request_type` (string) - The type of content being requested. One of: `manifest`, `video`, `audio`, `video_init`, `audio_init`, `media`, `subtitle`, or `encryption`
* `request_id` (string) - A unique identifier for the request
* `request_url` (string) - The URL that was requested
* `request_labeled_bitrate` (numeric) - Labeled bitrate in bps of the video, audio, or media segment that was downloaded
* `request_response_headers` (object) - A map of response headers and their values
* `request_media_duration` (numeric) - The duration of the media loaded, in seconds. Should not be included for manifest requests
* `request_video_width` (numeric) - For events with `media` or `video` `request_type`, the width of the video included in the segment/fragment
* `request_video_height` (numeric) - For events with `media` or `video` `request_type`, the height of the video included in the segment/fragment
* `request_error` (string) - The name of the error event that occurred (e.g., `FragLoadError`)
* `request_error_code` (numeric) - The response code of the request that spawned the error (e.g., 401, 400, 500)
* `request_error_text` (string) - The message returned with the failed status code

Depending on the `type` value, a corresponding event will be emitted: `requestcompleted` (for `type: "completed"`), `requestfailed` (for `type: "failed"`), or `requestcanceled` (for `type: "canceled"`).

For more information about these fields, see [Network Request Data](/docs/guides/mux-data-playback-events#network-request-data).
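For example, reporting a completed video segment request might look like the following sketch (all values are illustrative):

```js
' Report a completed segment download to Mux
manifestRequest = {
  type: "completed",
  request_start: 1582317809000,
  request_response_start: 1582317809100,
  request_response_end: 1582317809400,
  request_bytes_loaded: 1200000,
  request_type: "video",
  request_hostname: "cdn.example.com",
  request_url: "https://cdn.example.com/seg/1.ts"
}
mux.setField("request", manifestRequest)
```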

#### Latency and Throughput Calculation

If you set `request_start`, `request_response_start`, `request_response_end`, and `request_bytes_loaded`, latency and throughput will be automatically calculated and sent in each event.

* Latency is calculated as the time between `request_start` and `request_response_start`.
* Throughput is calculated as `request_bytes_loaded` divided by the time between `request_response_start` and `request_response_end`.

#### Request Duration Calculation

If the request is completed (`type` is `"completed"`), the `request_type` is `"api"` or `"encryption"`, and both `request_start` and `request_response_end` are present as valid numeric values, the total request duration (`request_duration`) will be calculated as `request_response_end - request_start` and included in the event. Requests of type `"api"` or `"encryption"` are limited to a maximum of 5 per view; if this limit is exceeded, the event is discarded and not sent to Mux.
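Assuming millisecond timestamps, the calculations above can be sketched as follows. This is illustrative only, not the SDK's internal code; the field names match the `request_*` fields described earlier, and the throughput unit (bytes per second) is an assumption for the sketch.

```javascript
// Illustrative sketch of how latency, throughput, and request duration
// are derived from the request timing fields (all timestamps in ms).
function deriveRequestTiming(evt) {
  // Latency: time from request start to the first byte of the response
  var latency = evt.request_response_start - evt.request_start;

  // Throughput: bytes loaded over the response download time
  var downloadMs = evt.request_response_end - evt.request_response_start;
  var throughput = (evt.request_bytes_loaded / downloadMs) * 1000; // bytes/sec (assumed unit)

  // request_duration, reported for completed `api`/`encryption` requests
  var duration = evt.request_response_end - evt.request_start;

  return { latency: latency, throughput: throughput, duration: duration };
}

var timing = deriveRequestTiming({
  request_start: 1000,
  request_response_start: 1050, // first byte 50 ms in
  request_response_end: 1250,   // download finished 200 ms later
  request_bytes_loaded: 100000
});
// timing.latency -> 50 (ms), timing.duration -> 250 (ms),
// timing.throughput -> 500000 (bytes/sec)
```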

<LinkedHeader step={steps[8]} />

### Current release

#### v2.6.2

* Fix crash due to invalid starting playhead

### Previous releases

#### v2.6.1

* Fix crash when `debug_beacons` or `debug_events` are set to `full`

#### v2.6.0

* Fix inconsistent heartbeat management, which may prevent excessive beacons or view state issues
* Track playhead ranges for engagement tracking

#### v2.5.3

* Fix crash due to ad watch time metrics being invalid during ad breaks

#### v2.5.2

* fix an issue where playback\_mode may not have updated correctly on changes if set in initial config

#### v2.5.1

* fix an issue where request\_url was not included in certain request events
* expose mechanism to report `ad_type` within ads

#### v2.5.0

* fix issue where subsequent views could end up with metadata from previous views
* expose option for disabling tracking decoder stats
* fix typo in player\_error\_message
* remove warning when env\_key is not set

#### v2.4.1

* fix: max function doesn't exist

#### v2.4.0

* add network `request` field
* add network change events: Network change events are now sent when a network transition is detected (for example, from Wi-Fi to Ethernet).

#### v2.3.2

* fix issue where invalid playhead time would crash

#### v2.3.1

* Improve performance by reducing thread rendezvous
* fix: player\_playback\_mode not appearing in Dashboard
* fix: send codec in `video_source_codec` to have it show up in `renditionchange` events

#### v2.3.0

* add calculations for ad watch time, now sending analytics properties `ad_playing_time_ms_cumulative` as total ad watch time and `view_playing_time_ms_cumulative` as content watch time plus ad watch time
* add support for `playback_mode` user event, which triggers a `playbackmodechange` event, containing the playback mode and data
* `playbackmodechange` event sent before every `viewstart` event, containing `standard` playback mode

#### v2.2.2

* Fix issue where watch time was calculated incorrectly; a guard was added for big jumps in content playback time calculations

#### v2.2.1

* fix for `MAX_VIDEO_POSITION_JUMP` not being properly declared, which could cause app crashes

#### v2.2.0

* fix viewing data irregularities where watch time values could be near maxint values
* add support for disabling automatic rebuffer tracking and allow for rebuffer events to be emitted directly
* `video_codec` and `video_audio_codec` fields are now sent in `renditionchange` events

#### v2.1.0

* fix error code inconsistencies on `error` event handling
* `viewer_connection_type` is set to invalid if it cannot be determined, and to `other` if the connection is neither Wired nor Wireless
* `viewer_connection_type` is now rechecked and updated every 10 seconds
* add support for `cdn` user event, which triggers a `cdnchange` analytics event, containing the current and previous CDN

#### v2.0.1

* fix an issue where `client_application_name` and other newer metadata fields could not be set
* fix an issue where `disableAutomaticErrorTracking` was not settable at runtime

#### v2.0.0

* BREAKING: `disableAutomaticErrorTracking`, `useRenderStitchedStream`, and `useSSAI` have had their types changed to `Boolean`, so you will need to update your `MuxTask.xml` files to the right types, along with anywhere you set those values dynamically.
* add support for `disableAutomaticErrorTracking`
* add support for `useRandomMuxViewerId`
* fix issue where `ended` event was sent when it should not have been
* support severity and business exception in manual error handler

#### v1.8.0

* Fix an issue where beacon requests were not delayed upon retry
* Fix a couple of internal typos
* Various performance improvements to reduce the number of rendezvous

#### v1.7.1

* Fix issue where a crash may occur due to the Content node having an invalid URL

#### v1.7.0

* Fix issue where renditionchange was triggered too often when demuxed audio/video are used
* Fix issue where a memory leak was possible with some configurations with ads involved
* Add support for useSSAI to track ad breaks correctly when SSAI integrations are used

#### v1.6.0

* Add support for automatically detecting video changes and metadata when using a playlist within a single Content Node
* Add support for 8k devices
* Add support for Video Quality Metrics
* Add support for tracking individual network requests, throughput, and network errors

#### v1.5.1

* Remove unintended logging

#### v1.5.0

* Fix an issue where views were not tracked correctly when playing with advertisements via `renderStitchedStream`
* Fix an issue where `player_init_time` was expected as a string but would not work correctly
* Performance improvements
* Update sample app to have option for `renderStitchedStream`

#### v1.4.3

* Add support for collecting dropped frame counts automatically where possible

#### v1.4.2

* Add support for `beaconCollectionDomain`
* Add support for setting `env_key` instead of `property_key`

#### v1.4.1

* Fix a syntax issue causing compilation problems

#### v1.4.0

* Fix a misnamed ad event (`adpause` was incorrectly sent as `adpaused`)
* Add support for a few more ad events
* Fix an issue where ad play count was attributed at ad completion, rather than ads beginning to play

#### v1.3.3

* Fix an issue where certain env keys were not handled correctly

#### v1.3.2

* Fix an issue where hostname extraction did not work correctly for hostnames with `-`s

#### v1.3.1

* Fixes an issue where certain Roku devices would not correctly expose the model number

#### v1.3.0

Updates:

* Add `drmType` property to the Mux node. This value is automatically reported from the player if available (#44)
* Add `droppedFrames` property to the Mux node. This value must be reported from your player. (#44)
* Add `errorContext` field to Error Events. This value is automatically reported from the player if available (#44)

#### v1.2.1

* Fixes an issue that could cause incorrect playback reporting when seeking occurs during a view; also updates the SDK testing infrastructure.

#### v1.2.0

* Remove auto-generated `video_id` value; applications should pass their own `video_id` in the metadata.

#### v1.1.1

* Fix an issue where an invalid value provided for `player_init_time` could cause the application to crash.

#### v1.1.0

* Add support for custom dimensions

#### v1.0.3

* Fix an issue where properties from the Roku application (such as Director) that are not string types crash the application
* Fix an issue with the sample application running ads

#### v1.0.2

* Fix an issue where `viewer_device_model` was not populated correctly.

#### v1.0.1

* Fix an issue where the player playhead position was not reported. This has no impact on collected metrics, but fixes a display issue within the dashboard when viewing individual views.

#### v1.0.0

* Fix an issue where `viewer_user_id` was overwritten unintentionally.
* Fix an issue where `player_mux_plugin_name` and device type were set incorrectly.
* Fix an issue where the `seeked` event was incorrectly named.
* Provide updated device information to match the intended uses for each field.
* Fix an issue where certain metrics (large numbers) were sent in scientific notation, causing incorrect values to be stored.
* Fix an issue where error code and message were incorrectly sent with `aderror` events.

#### v0.2.0

* Remove the debug option of `mux_minification`. Setting it now has no effect: all events and beacons are logged un-minified, while everything sent to the Mux backend is minified.
* Update such that `player_instance_id` (controlled by the Mux SDK) is sent as a GUID rather than a different format of ID.

#### v0.1.0

* Add `exitType` configuration option
* Fix an issue where source duration is reported incorrectly
* Fix an issue where, on certain devices, the rebuffer percentage could be reported incorrectly (e.g. extremely high)
* Fix an issue where `watch_time` may have been calculated incorrectly in certain situations
* Fix an issue to allow correctly tracking exits before video start


# Samsung-Tizen
This guide walks through integration with Samsung Tizen to collect video performance metrics with Mux data.
Mux Data is the best way to monitor video streaming performance.

Integration is easy - just initialize the Mux SDK, pass in some metadata, and you're up and running in minutes.

This documents integration instructions for Samsung Tizen TVs. For other players, see the additional Integration Guides.

## Features

The following data can be collected by the Mux Data SDK when you use the Samsung Tizen SDK, as described below.

* Engagement metrics
* Quality of Experience Metrics
* Custom Beacon Domain

## 1. Include the Mux Data SDK

Mux Data supports applications built for Samsung Tizen TVs using JavaScript and Tizen's [AVPlay API](https://developer.samsung.com/tv/develop/api-references/samsung-product-api-references/avplay-api). The Samsung Tizen Smart TV SDK supports C++, JavaScript, and Microsoft .NET; this SDK is only compatible with JavaScript applications using AVPlay.

Include the Mux Data SDK by adding the `tizen-mux.js` JavaScript file to the `index.html` file that defines your application. You can use the Mux-hosted version of the script to receive automatic updates; the API will not change within a major version, which is pinned in the script path (as in `tizen/MAJOR_VERSION/tizen-mux.js`).

```html
<!-- place within the <head> of your index.html -->
<script src="//src.litix.io/tizen/2/tizen-mux.js"></script>
```

## 2. Initialize Mux Data

To monitor video playback within your Tizen application, pass the AVPlay player instance to `monitorTizenPlayer` along with SDK options and metadata.

```js
// Place in your application initialization code, around
// where you call `prepare`

var player = $('#my-player').get(0);
player.url = this.url;
var playerInitTime = monitorTizenPlayer.utils.now();
this.prepare();
monitorTizenPlayer(player, {
  debug: true,
  data: {
    env_key: 'ENV_KEY', // required
    // Metadata
    player_name: 'Custom Player', // ex: 'My Main Player'
    player_init_time: playerInitTime,
    // ... additional metadata
  },
  // Optional passthrough listener
  playbackListener: playbackListener
});
```

Tizen's AVPlay API does not allow multiple AVPlayPlaybackCallback listeners to be registered to a player. If you require your own listener, pass it in as `playbackListener` as shown above; Mux's SDK will proxy the calls to your listener. (Note: the location of this option changed in v1.0.0.)

To stop monitoring your player (e.g. when playback is complete), call `player.mux.stopMonitor()`.
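A minimal sketch of the passthrough pattern, assuming your callbacks follow Tizen's AVPlayPlaybackCallback shape (implement only the callbacks your app needs; the handler bodies below are placeholders):

```javascript
// A passthrough listener sketch; callback names follow Tizen's
// AVPlayPlaybackCallback (see the AVPlay API reference).
var playbackListener = {
  onbufferingstart: function () { /* e.g. show a buffering spinner */ },
  onbufferingcomplete: function () { /* e.g. hide the spinner */ },
  oncurrentplaytime: function (currentTime) { /* e.g. update the progress bar */ },
  onstreamcompleted: function () { /* e.g. advance to the next item */ },
  onerror: function (eventType) { /* your error handling */ }
};

// Pass it alongside the SDK options; Mux registers its own listener
// with AVPlay and proxies each callback through to yours:
// monitorTizenPlayer(player, {
//   data: { env_key: 'ENV_KEY' },
//   playbackListener: playbackListener
// });

// When playback is complete, stop monitoring:
// player.mux.stopMonitor();
```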

Log in to the Mux dashboard, find the environment that corresponds to your `env_key`, and look for video views. It takes a minute or two after a view is tracked for it to show up on the Metrics tab.

**If you aren't seeing data**, check whether you have an ad blocker, tracking blocker, or some kind of network firewall that prevents your player from sending requests to Mux Data servers.

## 3. Make your data actionable

[Detailed Documentation](/docs/guides/make-your-data-actionable-with-metadata)

Options are provided via the data object passed in the call to `monitorTizenPlayer`.

All metadata details except for `env_key` are optional; however, you'll be able to compare and see more interesting results as you include more details. This gives you more metrics and metadata about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

```js
monitorTizenPlayer(player, {
  debug: false,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 4. Advanced options

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
monitorTizenPlayer(player, {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', //ex: 'foo.bar.com'
  data: {
    env_key: 'ENV_KEY', // required
    // ,,,
  }
});
```

<LinkedHeader step={steps[5]} />

### Current release

#### v2.15.18

* fix issue where playing time might accumulate for paused players
  * Updated dependency: `mux-embed` to v5.17.1

### Previous releases

#### v2.15.17

* add compatibility for dash.js 5
  * Updated dependency: `mux-embed` to v5.17.0

#### v2.15.16

* Update parsing of initial value for player\_playback\_mode
  * Updated dependency: `mux-embed` to v5.16.1

#### v2.15.15

* Add Playback Range Tracker for new engagement metrics
  * Updated dependency: `mux-embed` to v5.16.0

#### v2.15.14

* Automatically detect playback mode changes for HTML 5 Video
  * Updated dependency: `mux-embed` to v5.15.0

#### v2.15.13

* Emit a renditionchange event at the start of views to enable updated rendition tracking.
  * Updated dependency: `mux-embed` to v5.14.0

#### v2.15.12

* Add ad type metadata to Ad Events
* Add support for the upcoming Playback Mode changes
  * Updated dependency: `mux-embed` to v5.13.0

#### v2.15.11

* SDKs will no longer immediately send error events that are flagged as warnings. Fatal errors will still immediately be sent.
  * Updated dependency: `mux-embed` to v5.12.0

#### v2.15.10

* Allow dev to specify page starting load and page finished loading times to calculate Page Load Time
  * Updated dependency: `mux-embed` to v5.11.0

#### v2.15.9

* Adds support for cdnchange events
  * Updated dependency: `mux-embed` to v5.10.0

#### v2.15.8

* Submit Aggregate Startup Time when autoplay is set
  * Updated dependency: `mux-embed` to v5.9.1

#### v2.15.7

* Update `mux-embed` to v5.9.0

#### v2.15.6

* Update `mux-embed` to v5.8.3

#### v2.15.5

* Update `mux-embed` to v5.8.2

#### v2.15.4

* Update `mux-embed` to v5.8.1

#### v2.15.3

* Update `mux-embed` to v5.8.0

#### v2.15.2

* Update `mux-embed` to v5.7.0

#### v2.15.1

* Update `mux-embed` to v5.6.0

#### v2.15.0

* Update mechanism for generating unique IDs, used for `view_id` and others

* Update `mux-embed` to v5.5.0

#### v2.14.3

* \[chore] internal build process fix (no functional changes)
* Update `mux-embed` to v5.4.3

#### v2.14.2

* Update `mux-embed` to v5.4.2

#### v2.14.1

* Update `mux-embed` to v5.4.1

#### v2.14.0

* Add updateData function that allows Mux Data metadata to be updated mid-view.

* Update `mux-embed` to v5.4.0

#### v2.13.6

* Update `mux-embed` to v5.3.3

#### v2.13.5

* Update `mux-embed` to v5.3.2

#### v2.13.4

* Update `mux-embed` to v5.3.1

#### v2.13.3

* Update `mux-embed` to v5.3.0

#### v2.13.2

* Update `mux-embed` to v5.2.1

#### v2.13.1

* Update `mux-embed` to v5.2.0

#### v2.13.0

* Target ES5 for bundles and validate bundles are ES5

* Update `mux-embed` to v5.1.0

#### v2.12.5

* Update `mux-embed` to v5.0.0

#### v2.12.4

* Update `mux-embed` to v4.30.0

#### v2.12.3

* Update `mux-embed` to v4.29.0

#### v2.12.2

* Update `mux-embed` to v4.28.1

#### v2.12.1

* Update `mux-embed` to v4.28.0

#### v2.12.0

* fix an issue where seek latency could be unexpectedly large

* fix an issue where seek latency does not include time at end of a view

* Update `mux-embed` to v4.27.0

#### v2.11.3

* Update `mux-embed` to v4.26.0

#### v2.11.2

* Update `mux-embed` to v4.25.1

#### v2.11.1

* Update `mux-embed` to v4.25.0

#### v2.11.0

* Fix an issue where beacons over a certain size could get hung and not be sent

* Update `mux-embed` to v4.24.0

#### v2.10.0

* Fix an issue where tracking rebuffering can get into an infinite loop

* Update `mux-embed` to v4.23.0

#### v2.9.5

* Update `mux-embed` to v4.22.0

#### v2.9.4

* Update `mux-embed` to v4.21.0

#### v2.9.3

* Update `mux-embed` to v4.20.0

#### v2.9.2

* Update `mux-embed` to v4.19.0

#### v2.9.1

* Update `mux-embed` to v4.18.0

#### v2.9.0

* Support `player_error_context` in `errorTranslator`

* Update `mux-embed` to v4.17.0

#### v2.8.0

* Adds support for new and updated fields: `renditionchange`, error, DRM type, dropped frames, and new custom fields

* Update `mux-embed` to v4.16.0

#### v2.7.0

* Expose `utils` on SDK initialization function to expose `utils.now()` for `player_init_time`

* Update `mux-embed` to v4.15.0

#### v2.6.5

* Update `mux-embed` to v4.14.0

#### v2.6.4

* Update `mux-embed` to v4.13.4

#### v2.6.3

* Update `mux-embed` to v4.13.3

#### v2.6.2

* Update `mux-embed` to v4.13.2

#### v2.6.1

* Fixes an issue with accessing the global object
* Update `mux-embed` to v4.13.1

#### v2.6.0

* Upgraded internal webpack version

* Update `mux-embed` to v4.13.0

#### v2.5.8

* Publish package to NPM

#### v2.5.7

* Update `mux-embed` to v4.12.1

#### v2.5.6

* Update `mux-embed` to v4.12.0

#### v2.5.5

* Update `mux-embed` to v4.11.0

#### v2.5.4

* Update `mux-embed` to v4.10.0

#### v2.5.3

* Update `mux-embed` to v4.9.4

#### v2.5.2

* Use common function for generating short IDs
* Update `mux-embed` to v4.9.3

#### v2.5.1

* Update `mux-embed` to v4.9.2

#### v2.5.0

* Improve rebuffering metrics by using Tizen buffering events instead of playhead tracking

#### v2.4.6

* Update `mux-embed` to v4.9.1

#### v2.4.5

* Update `mux-embed` to v4.9.0

#### v2.4.4

* Update `mux-embed` to v4.8.0

#### v2.4.3

* Update `mux-embed` to v4.7.0

#### v2.4.2

* Update `mux-embed` to v4.6.2

#### v2.4.1

* Update `mux-embed` to v4.6.1

#### v2.4.0

* Bump mux-embed to 4.6.0

#### v2.2.0

* Update mux-embed to v4.2.0
* Fix an issue where views that resulted from `programchange` may not have been tracked correctly
* Fix an issue where if `destroy` was called multiple times, it would raise an exception

#### v2.1.0

* Update mux-embed to v4.1.1

#### v2.0.0

* Update mux-embed to v4.0.0
* Support server-side device detection

#### v1.0.0

* Update to `mux-embed` v3.1.0
* The mechanism for registering your own AVPlayPlaybackCallback listener changed. Previously, you set this on the player itself, but in v1.0.0 and newer, simply pass it in when you call `monitorTizenPlayer`, alongside the `debug` and `data` options, as `playbackListener`

#### v0.3.0

* Support `programchange`
* Update to `mux-embed` v2.8.0
* Fix an issue where `play` event may not have been sent appropriately

#### v0.1.0

* Initial SDK released.


# Monitor LG
This guide walks through integration with LG Smart TVs to collect video performance metrics with Mux data.
## Features

The following data can be collected by the Mux Data SDK when you monitor playback on LG Smart TVs, as described below.

* Engagement metrics
* Quality of Experience Metrics

## 1. Integration overview

LG Smart TV applications are built on top of HTML5 video technology. To support video streaming, these applications can be integrated with player SDKs such as [HLS.js](https://video-dev.github.io/hls.js/) and [Dash.js](https://github.com/Dash-Industry-Forum/dash.js).

Due to the HTML5 nature of LG Smart TV applications, the Mux Data integration with LG televisions uses one of the HTML5 integrations, such as those listed above. When setting up your application, check which video player engine is used and, depending on that, use the appropriate integration point within `mux-embed`.

See these three web integration guides for more details:

* [HTML5 video element](/docs/guides/monitor-html5-video-element)
* [HLS.js](/docs/guides/monitor-hls-js)
* [Dash.js](/docs/guides/monitor-dash-js)

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

```js
// main.js
play: function() {
  var data = {
    env_key: 'ENV_KEY', // required
    player_name: 'My Custom Player',
    player_init_time: mux.utils.now(),
    // ... additional metadata
  };

  switch (this.playerEngine) {
    case this.PLAYENGINE_HLSJS:
      if (Hls.isSupported()) {
        const hls = new Hls();
        hls.loadSource('<your source file>');
        hls.attachMedia(this.player);
        hls.on(Hls.Events.MANIFEST_PARSED,function(e,d) {
          app.player.play();
        });
        mux.monitor('#my-player', {
          debug: true,
          hlsjs: hls,
          Hls: Hls,
          data: data
        });
        this.hls = hls;
      }
      break;
    case this.PLAYENGINE_DASHJS:
      const dashjsPlayer = dashjs.MediaPlayer().create();
      dashjsPlayer.getDebug().setLogToBrowserConsole(false);
      mux.monitor('#my-player', {
        debug: true,
        dashjs: dashjsPlayer,
        data: data
      });
      dashjsPlayer.initialize(this.player, 'http://dash.edgesuite.net/envivio/EnvivioDash3/manifest.mpd', true);
      this.dashjsPlayer = dashjsPlayer;
      break;
  }
}
```

After you've finished the integration, the quickest way to confirm that the SDK is loaded is to pass `debug: true` in the options passed to the SDK. With this flag enabled, open the debug console and you should see debug statements from \[mux] when you click play on the video.

After you stop watching a video, you'll see the results in your Mux account within a few minutes. We'll also email you when your first video view has been recorded. Log in to the dashboard, find the environment that corresponds to your `env_key`, and look for video views.

Note that it may take a few minutes for views to show up in the Mux Data dashboard.

## 2. Make your data actionable

Options are provided via the data object passed in the call to `mux.monitor`.

All metadata details except for `env_key` are optional; however, you'll be able to compare and see more interesting results as you include more details. This gives you more metrics and metadata about video streaming, and allows you to search and filter on important fields like the player version, CDN, and video title.

```js
mux.monitor('#my-player', {
  debug: false,
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY', // required
    // Site Metadata
    viewer_user_id: '', // ex: '12345'
    experiment_name: '', // ex: 'player_test_A'
    sub_property_id: '', // ex: 'cus-1'
    // Player Metadata
    player_name: '', // ex: 'My Main Player'
    player_version: '', // ex: '1.0.0'
    player_init_time: '', // ex: 1451606400000
    // Video Metadata
    video_id: '', // ex: 'abcd123'
    video_title: '', // ex: 'My Great Video'
    video_series: '', // ex: 'Weekly Great Videos'
    video_duration: '', // in milliseconds, ex: 120000
    video_stream_type: '', // 'live' or 'on-demand'
    video_cdn: '' // ex: 'Fastly', 'Akamai'
  }
});
```

For more information, see the [Metadata Guide](/docs/guides/make-your-data-actionable-with-metadata).

## 3. Advanced options

### Customize beacon collection domain

If you have [integrated a custom domain for Data collection](/docs/guides/integrate-a-data-custom-domain), specify your custom domain by setting `beaconCollectionDomain`.

```js
mux.monitor('#my-player', {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN', // ex: 'foo.bar.com'
  hlsjs: hls,
  Hls,
  data: {
    env_key: 'ENV_KEY', // required
    // ... additional metadata
  }
});
```


# Monitor Agnoplay player service
This guide walks through integration with Agnoplay to collect video performance metrics with Mux Data. Because Agnoplay has Mux Data fully pre-integrated, no development effort is needed to activate it.
<Callout type="warning" title="Third-party integration">
  This integration is managed and operated by [Agnoplay](https://agnoplay.com).
  Feedback should be made through your Agnoplay representative at [https://agnoplay.com](https://agnoplay.com) or [info@agnoplay.com](mailto:info@agnoplay.com).
</Callout>

# Environment key

Get your `ENV_KEY` from the [Mux environments dashboard](https://dashboard.mux.com/environments).

<Callout type="info" title="Env Key is different than your API token">
  `ENV_KEY` is a client-side key used for Mux Data monitoring. These are not to be confused with API tokens which are created in the admin settings dashboard and meant to access the Mux API from a trusted server.
</Callout>

<Image src="/docs/images/env-key.png" width={2004} height={250} />

# Contact Agnoplay

Contact your Agnoplay representative through [https://agnoplay.com](https://agnoplay.com) or [info@agnoplay.com](mailto:info@agnoplay.com), and provide them with your environment key.

# Wait for the magic

The Agnoplay support team will add your environment key to the configuration of your Agnoplay instance, after which your Mux Data environment will be populated with data within minutes. That's all there is to it.


# Understand metric definitions
Understand the playback metrics Mux uses to measure viewership and quality of experience.
Engagement and Quality of Experience metrics are tracked during each playback attempt and a value for each metric is assigned to the view that is generated. Metrics reports aggregate the values from individual, completed views that match a specified filter to calculate the metric for analysis.

Each report is defined by the metric being analyzed, the time range, and a filter that can focus the report on a specific subset of views. Engagement metrics use the start time of each view for determining if it should be included in a time range. Quality of Experience metrics are aggregated based on the end time of each view. Views are only included in a metric calculation if they have a valid value for the metric.

# Views and Watch Time

Views and Watch Time are key metrics that are often shown along with other engagement or Quality of Experience metrics in your Mux dashboard.

## Definition of a View

In Mux Data, a "View" is an attempt (successful or not) to play a video. A view is created when a viewer clicks play or when playback is started programmatically. If a user taps play and the video starts to load but fails, that counts as a single view. If a user taps play, starts watching the video, pauses, then resumes within 60 minutes, that also counts as a single view.

Each view is tracked until playback is explicitly ended or 60 minutes after playback stops. Playback can be explicitly ended by the SDK or if a viewer navigates off the page with the video being played.

A single video can be watched multiple times in a single view by looping or seeking back to the start of the video. If you see more views than expected in your dashboard, or see duplicate views, check the code that initializes the Mux Data SDK to make sure you are initializing it once per playback attempt.

After a view stops receiving playback events for 60 seconds, it is considered complete and is available in Metrics and exported via streaming exports. Playback events will be added to views if playback resumes within 60 minutes. After 60 minutes of inactivity, the view is finalized and new playback events will create a new view.

## How Watch Time is calculated

Watch Time is not currently an aggregated metric in Mux Data but is used in some of the metrics calculations. The Watch Time for a view is the cumulative amount of time the user spent watching or attempting to watch the video. This metric includes actively playing content, starting up, rebuffering, and seeking. It is similar to Playing Time, which is an aggregated metric available for analysis, but Watch Time also includes the time spent rebuffering.

If a user watches for 90 seconds, has 4 seconds of rebuffering, spends 2 seconds seeking by rewinding, and then watches 60 more seconds, that totals 156 seconds of watch time (90 + 4 + 2 + 60).

If a user watches a 2-minute video at 2x speed, Watch Time will be 1 minute (assuming no buffering, seeking, or startup time). This is because Watch Time measures how much time elapsed during playback, not how much video duration was watched.
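The arithmetic above can be sketched as a simple sum over time spent in playback states. This is illustrative only; the SDK tracks these durations internally, and the segment structure below is invented for the example.

```javascript
// Illustrative only: Watch Time as the sum of time spent in playback
// states (playing, startup, rebuffering, seeking), in seconds.
function watchTime(segments) {
  return segments.reduce(function (total, segment) {
    return total + segment.seconds;
  }, 0);
}

// The example above: 90s playing + 4s rebuffering + 2s seeking + 60s playing
var view = [
  { state: 'playing', seconds: 90 },
  { state: 'rebuffering', seconds: 4 },
  { state: 'seeking', seconds: 2 },
  { state: 'playing', seconds: 60 }
];
// watchTime(view) -> 156 seconds
```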

# Metrics

Mux has engagement metrics to track viewership and five top-level metrics to measure quality of experience. Detailed definitions and formulas can be found on these metric guide pages:

* [Viewer Engagement](/docs/guides/data-engagement-metric)
* [Overall Viewer Experience](/docs/guides/data-overall-viewer-experience-metric)
* [Playback Success](/docs/guides/data-playback-success-metric)
* [Startup Time](/docs/guides/data-startup-time-metric)
* [Smoothness](/docs/guides/data-smoothness-metric)
* [Video Quality](/docs/guides/data-video-quality-metric)

These metrics are available in the Mux Data Dashboard and via the Mux Data API. They can be used by your team to track KPIs and optimize the viewing experience for your end users.

This is what the metrics look like on the Mux Data Dashboard:

<Image alt="Mux top 6 metrics" width={321} height={800} src="/docs/images/top-6-metrics.png" />

In order to get the most value out of the metrics measured by Mux, make sure your data is actionable by providing [valuable metadata for each view](/docs/guides/make-your-data-actionable-with-metadata). Use this in conjunction with filters to segment data metrics.


# Understand Monitoring Metrics and Dimensions
Learn about Mux Data's real-time monitoring metrics and dimensions to measure viewer engagement and streaming performance
Mux Data Monitoring offers near real-time metrics to measure streaming performance and current viewers. Mux Data Monitoring metrics are available to Mux Data customers on a Media plan. Monitoring Metrics are offered at sub-20-second latency and are available for 24 hours.

## Monitoring Metrics

### Current Concurrent Viewers (CCV)

The number of viewers currently watching a video. This includes viewers currently waiting for the video to start playing, experiencing rebuffering, or who just experienced a playback failure. It does not include viewers that are paused or have been rebuffering for more than five consecutive minutes.

The Monitoring Dashboard includes CCV by geography and Top Titles by CCV. Additional breakdown dimensions are available via API.

### Video startup failures by startup attempts

The number of viewers who have just experienced a video startup failure (an error that prevents the user from seeing the first frame of video, be it ads or content) as a percent of Start Attempts. Start Attempts is defined as the number of viewers who are in the video loading state or have just experienced a jump from the video loading state to video startup success, video startup failure, or exits before video starts.

### Playback Failures by CCV

The number of viewers who have just experienced a playback failure (a fatal error that prevents future playback) as a percent of Current Concurrent Viewers (CCV). Errors defined as non-fatal are not included in this metric.

The Playback Failures by CCV metric is measured differently from the Playback Failure Percentage in Mux QoE Metrics.

### Exits Before Video Start by Start Attempts

The number of viewers who have just abandoned a video view while waiting at least one second for the video to start playing, as a percent of Start Attempts. Examples where this may happen include closing the app/browser or clicking to a different video before playback begins.

The Exits Before Video Start by Start Attempts metric is measured differently from Exits Before Video Start in the Mux QoE Metrics.

### Current Rebuffering Percentage

Current Rebuffering Percentage measures the amount of time viewers spent in the rebuffering state, from the last time measured to the current time, as a percentage of the total watch time. The total watch time is the amount of time viewers spent watching or attempting to watch video, which includes startup time, rebuffering time, and time actually watching the video (it does not include paused, errored, or exited states).

Current Rebuffering Percentage is measured differently from the Rebuffer Percentage in the Mux QoE Metrics.
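As a rough sketch, the calculation described above can be expressed as follows; the function and field names are illustrative, not part of the Mux API.

```python
# Minimal sketch of the Current Rebuffering Percentage calculation:
# rebuffering time as a share of total watch time over the interval.

def current_rebuffering_percentage(rebuffer_seconds: float, watch_seconds: float) -> float:
    """Rebuffering time / total watch time, as a percentage.

    watch_seconds includes startup, rebuffering, and playing time, but not
    paused, errored, or exited time.
    """
    if watch_seconds <= 0:
        return 0.0
    return 100 * rebuffer_seconds / watch_seconds

# 30 s of rebuffering across 1,000 s of total watch time -> 3.0%
pct = current_rebuffering_percentage(30, 1000)
```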

### Current Average Bitrate

The average of the video bitrates shown to viewers over the time period. The bitrate for a view is the value indicated in the video manifest for the rendition that is played for the viewer during the time period.

### Video Startup Time

Video Startup Time measures the current median startup time. This could be considered a "typical" startup time across viewers; half experience a faster startup time and half experience a slower startup time.

## Monitoring Dimensions

Current Concurrent Viewers, Current Rebuffering Percentage, Exits Before Video Start, Playback Failure Percentage, Current Average Bitrate, and Video Startup Failure Percentage can be filtered and broken down by the following dimensions.

| Dimension | Description |
|-----------|-------------|
| ASN | An autonomous system number (ASN) is a number assigned to a local network, registered into the carrier's routing community and placed under the umbrella of an administrative domain called an autonomous system. An ASN is often correlated with an ISP, though a single ISP may operate multiple ASNs. |
| CDN | The Content Delivery Network used to deliver the video. If using an SDK that supports CDN header extraction, this value will be auto-populated. |
| Operating System | Operating System (`iOS`, `Windows`, etc.) |
| Player Name | You can provide a name for the player (e.g. `My Player`) if you want to compare different configurations or types of players around your site or application. This is different from the player software (e.g. `Video.js`), which is tracked automatically by the SDK. |
| Region | A geographical subunit of a country. Examples include region, province, or state. |
| Stream Type | The type of video stream (e.g. `live` or `on-demand`) |
| Sub-property ID | A sub property is an optional way to group data within a property. For example, sub properties may be used by a video platform to group data by its own customers, or a media company might use them to distinguish between its many websites. |
| Video Series | The series of the video (e.g. `Season 1 of Awesome Show`) |
| Video Title | Title of the video (e.g. `Awesome Show: Pilot`) |
| View Has Ad | Tracks if an ad is present during a view. |
| Video ID | Your internal ID for the video |
| Mux Asset ID | Mux-generated ID for Mux Video Assets |
| Mux Livestream ID | Mux-generated ID for Mux Video Livestreams |
| Mux Playback ID | Mux-generated Playback ID used to stream videos and live streams from Mux. An Asset or Livestream may have more than one Mux Playback ID. |


# Viewer Engagement
Engagement metrics help you track the success of your videos by measuring how many people are watching and for how long.
<Image alt="Viewer Engagement Dashboard" width={2092} height={1686} src="/docs/images/viewer-engagement-dashboard.png" />

```md
## Views
Views counts the total number of views that started during the selected time interval. This is calculated differently from the views in Quality of Experience metrics, which counts the total number of views that ended during the selected time interval.

## Unique Viewers
Unique Viewers counts distinct viewers based on the start time of the associated view, using Viewer ID to determine uniqueness. If a Viewer ID isn't provided, the default Viewer ID generated by the Mux SDK is used. We recommend setting a meaningful, anonymized Viewer ID to get an accurate Unique Viewer count. The Viewer ID should not use any value that contains personally identifiable information (such as email address, username, etc).

## Playing Time
Playing Time is the total time (in hours) that viewers watched playing video content or ads, and excludes rebuffering, seeking, and paused time.

## Content Playing Time

Content Playing Time measures the time that is spent on content playback, excluding ad playback. It specifically measures the time when the player's playhead is progressing.


## Ad Playing Time

Ad Playing Time measures the time that is spent on ad playback. It specifically measures the time when the player's playhead is progressing during ad playback.


## Ad Attempts

The Ad Attempts metric counts the number of times that an ad attempted to start playing. The number of Ad Attempts helps you understand how often you attempted to show ads to viewers.

Ad Attempts occur when each individual ad is attempted during the ad break. If a viewer ends the stream before all ads scheduled for the ad break have been attempted, the ads not yet attempted will not be included as Ad Attempts. Not every Ad Attempt will result in the ad being shown because an error can occur or a viewer could leave before the ad starts playing.


## Ad Impressions

The Ad Impressions metric counts the number of times that an ad successfully started playing. The number of Ad Impressions helps you understand how often you showed ads to viewers.

The ad playing is counted as an impression if it plays any frames, regardless of how long the ad plays.


## Ad Breaks

The Ad Breaks metric counts the number of times that ad breaks occurred. The number of Ad Breaks helps you understand how often you broke away from content in order to show ads to viewers.

Ad breaks can contain multiple ads attempts and impressions. An Ad break is counted if it occurs during playback, even if there were no ads actually shown to the viewer due to no ads being scheduled or ad errors that prevented ads from being displayed.

```
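The Unique Viewers section above recommends setting a meaningful, anonymized Viewer ID. One common way to do that is to hash your internal user ID with an application-side salt before sending it, so no PII (email, username) reaches Mux. This is a sketch of that approach; the salt name and hashing scheme are illustrative choices, not a Mux requirement.

```python
# Derive an anonymized Viewer ID by salting and hashing an internal user
# ID. Keep the salt server-side so the mapping cannot be reversed by
# third parties. The scheme here is an illustrative choice.
import hashlib

APP_SALT = "replace-with-a-long-random-secret"  # keep this server-side

def anonymized_viewer_id(internal_user_id: str) -> str:
    digest = hashlib.sha256(f"{APP_SALT}:{internal_user_id}".encode("utf-8"))
    return digest.hexdigest()

# The same user always maps to the same opaque ID, so Unique Viewer
# counts remain accurate without exposing the underlying identifier.
vid = anonymized_viewer_id("user-12345")
```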


# Overall Viewer Experience
Overall Viewer Experience is a high-level score from 0 to 100 that measures the QoE (Quality of Experience).
<Image alt="Overall Viewer Experience dashboard" width={1204} height={940} src="/docs/images/overall-viewer-experience-dashboard.png" />

```md
## Overall Viewer Experience Score

Overall Viewer Experience Score is a metric that describes the overall Quality of Experience (QoE) of video streaming in a single number. A score of 100 means every viewer had a satisfying experience, and a score of 0 means that every viewer had a frustrating experience.

Overall Viewer Experience Score is based on four other Viewer Experience Scores, which each describe one of the four elements of video streaming performance: Playback Success, Startup Time, Playback Smoothness, and Video Quality.

Viewer Experience Scores are useful as a way to describe streaming performance from the perspective of an end viewer and not just from the perspective of system-level metrics. The Viewer Experience Scores are QoE (Quality of Experience) metrics, which describe the actual end-user experience of watching video. This is in contrast to QoS (Quality of Service) metrics, which describe a specific system’s performance without reference to user experience.

For example, "Downscaling Percentage" is a useful metric to track when it comes to QoS, since a high Downscaling Percentage means a service is delivering more video than necessary to fill the player or display. But users don't see downscaling, so Downscaling Percentage doesn't come into QoE when describing video quality. However Upscaling Percentage is a QoE metric because upscaling creates visual artifacts in the video, affecting the viewer’s experience.

Both QoE and QoS metrics are important for different purposes. QoS metrics are useful when troubleshooting specific system problems, while QoE metrics are useful when evaluating technologies or prioritizing problems to improve.

### Formula
Each individual video view is given an experience score of 0-100, and the Overall Viewer Experience Score is calculated by averaging the experience of all video views. Each view’s experience score is measured by first calculating a score between 0 and 100 for each of the elements of streaming performance (Playback Success, Smoothness, Startup Time, and Quality). Those scores are then averaged using higher weights for the more impactful elements of the viewer experience.

The weights are created by measuring the relative importance of each element of the experience. Mux conducted user surveys and research across millions of video views on the relative tradeoffs between increasing one metric at the expense of another. For example, you can increase Quality at the expense of Startup Time and vice versa. However, doing so would be a bad idea because Startup Time is more valuable than Quality. Generally, we found that Playback Success is the most important, followed by Smoothness then Startup Time, and finally Quality.

We want to make sure the Overall Score captures these complex relationships between metrics since developers may decide to make certain tradeoffs in order to improve their QoE scores. So, instead of just averaging the scores, Mux creates a set of tradeoff scores first and combines that into the overall score. The exception is Playback Success, which is a multiplier applied to the Overall Score.


The Overall Viewer Experience Score is defined as:
$$
Playback\ Success\ Score * \frac{T_{Sm, Q} + T_{Sm, Su} + T_{Su, Q}}{3}
$$
where
$$
T_{Sm, Q} = \text{Tradeoff(Smoothness, Quality)}
$$
$$
T_{Sm, Su} = \text{Tradeoff(Smoothness, Startup)}
$$
$$
T_{Su, Q} = \text{Tradeoff(Startup, Quality)}
$$

This way of combining the metrics is a more accurate representation of the viewer’s quality of experience, and increases the usefulness of Mux Data’s scoring system.

### Use this metric to:
* See trends in your overall viewer experience over time
* Begin prioritizing your efforts in areas of your platform (devices, regions, etc.) that have the lowest score and can use the most improvement

```
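The Overall Viewer Experience formula above can be sketched in code. Mux does not publish the `Tradeoff()` functions, so the version below stands in a simple mean of the two component scores; that substitution is an assumption purely for illustration of how the pieces combine, with Playback Success acting as a multiplier.

```python
# Sketch of the Overall Viewer Experience Score formula. The tradeoff()
# stand-in below is a plain average -- Mux's real tradeoff functions are
# proprietary and weighted, so treat this as structural illustration only.

def tradeoff(score_a: float, score_b: float) -> float:
    """Hypothetical stand-in for Mux's tradeoff function."""
    return (score_a + score_b) / 2

def overall_score(success: float, smoothness: float, startup: float, quality: float) -> float:
    """Playback Success multiplies the average of the three tradeoff scores."""
    t = (
        tradeoff(smoothness, quality)
        + tradeoff(smoothness, startup)
        + tradeoff(startup, quality)
    ) / 3
    return (success / 100) * t

# A view with perfect component scores gets a perfect overall score,
# and a playback failure (success = 0) zeroes the overall score.
perfect = overall_score(100, 100, 100, 100)
```

Note how the multiplier captures the special role of Playback Success: no matter how smooth or sharp the video was, a failed playback yields an overall score of zero.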


# Playback Success
Playback Success for a single view is a score of 0, 50, or 100 that measures if the user was able to successfully begin playback.
<Image alt="Playback Success" width={1189} height={881} src="/docs/images/playback-success-dashboard.png" />

````md
## Playback Success Score

Playback Success Score focuses on whether a video played back successfully.

Successful playback includes two components:

* Did the video play without an error?
* Did the user actually get to playback, or did they exit before playback started?

### Formula
**Playback Success Score** is fairly simple. A failure that ends playback is a `0`, while a video that plays through without failure is `100`. A view that is terminated by the viewer before playback starts (an “Exit Before Video Start,” or EBVS) is given a score of `50`.  EBVS views that occur in less than 1 second are given no score.

```
100: successful playback
50: exit before video start
0: playback failure
N/A: exit before video start <1 second
```

Why are EBVS views given a score of 50? The reason is that while exits can often point to streaming problems (e.g. the video took too long to load), some percentage of exits before video start are normal. A user might click the wrong video or see a link to a different video they want to watch more. If a view is abandoned in less than one second, we assume the video start was unintentional or programmatic and exclude those play attempts from the score.
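The scoring table above can be sketched directly; the view records and field names below are illustrative, and sub-one-second EBVS views are excluded (scored `None`) rather than counted as zero.

```python
# Sketch of Playback Success scoring: 100 for success, 0 for failure,
# 50 for an exit before video start (EBVS), no score for EBVS under 1 s.
from typing import Optional

def playback_success_score(outcome: str, seconds_waited: float = 0.0) -> Optional[float]:
    if outcome == "played":
        return 100.0
    if outcome == "failure":
        return 0.0
    if outcome == "ebvs":
        return 50.0 if seconds_waited >= 1.0 else None  # <1 s: no score
    raise ValueError(f"unknown outcome: {outcome}")

# Illustrative views: one success, two EBVS (one under 1 s), one failure
views = [("played", 0), ("ebvs", 3.2), ("ebvs", 0.4), ("failure", 0)]
scores = [s for o, w in views if (s := playback_success_score(o, w)) is not None]
avg = sum(scores) / len(scores)  # the 0.4 s EBVS view is excluded
```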

### Use this metric to:
* Understand how playback failures impact the overall viewer experience
* Compare playback success performance to other areas of viewer experience
* Find areas where playback success can be optimized and improved


## Exits Before Video Start

Viewers will sometimes abandon a video (e.g. close the page/app or click the back button) because it is taking too long to load. The Exits Before Video Start Percentage metric captures how frequently this happens.

For this metric we count the number of video views where the viewer clicked play (or the player began to autoplay) but the video never began to play back (Video Startup Time was never recorded), excluding playback failures.  We then divide that number by the total number of video views.

### Use this metric to:
* Watch how changes in Video Startup Time directly impact viewers abandoning the video
* Compare players and understand if factors other than startup time may be causing viewers to leave, for example visual cues like loading indicators and poster frames.

Note: Viewers may leave for reasons other than long startup times, for example deciding that they clicked the wrong video or clicking on a related video. Before becoming concerned with your platform’s specific percentage you should attempt to improve your Video Startup Time and see how that impacts your Exits Before Video Start.


## Playback Failure Percentage

The Playback Failure Percentage metric gives the percentage of video views that failed to play due to a fatal error. Playback failures can happen at any point during video playback, causing the playback to end prematurely.

An error is considered a Playback Failure if it has a severity of fatal and it is not due to a business rule exception.

### Use this metric to:
* Understand where playback failures are happening most frequently
* Watch for spikes in playback failures due to new errors

Visit the Errors section to see which specific errors are happening the most frequently.


## Video Startup Failure Percentage

The Video Start Failure Percentage metric is the percentage of video views that experienced an error that prevents the user from seeing the first frame of video, which could be either ads or content.

An error is considered a Startup Failure if it occurs before the first frame of video is shown, has a severity of fatal, and the error is not due to a business rule exception.

### Use this metric to:
* Understand where video startup failures are happening most frequently
* Watch for spikes in startup failures due to new errors

Visit the Errors section to see which specific errors are happening the most frequently when start failures occur.

## Business Exception Percentage

The Business Exception Percentage metric gives the percentage of video views that failed to play due to a fatal business rule exception. Business Rule Exceptions can happen at any point during video playback, causing the playback to end prematurely.

An error is considered a Business Rule Exception if it has a severity of fatal and it is specified as due to a business rule exception.

### Use this metric to:
* Understand where playback failures are happening most frequently
* Watch for spikes in playback failures due to new errors

Visit the Errors section to see which specific business rule exceptions are happening the most frequently.

## Video Startup Business Exception Percentage

The Video Startup Business Exception Percentage metric is the percentage of video views that experienced a business rule exception that prevents the user from seeing the first frame of video, which could be either ads or content.

An error is considered a Startup Business Rule Exception if it occurs before the first frame of video is shown, has a severity of fatal, and the error is specified as due to a business rule exception.

### Use this metric to:
* Understand where video startup business rule exceptions are happening most frequently
* Watch for spikes in startup business rule exceptions due to new errors

Visit the Errors section to see which specific errors are happening the most frequently when startup business rule exceptions occur.

## View Dropped Percentage

Video views sometimes end for unknown reasons. This could be caused by a technical issue such as an application or player crash, a loss of internet connection during the session, or browser behavior that may restrict analytics. The View Dropped Percentage metric captures how frequently this happens.

For this metric we count the number of video views where the viewing session ended without a clean exit. We then divide that number by the total number of video views.

### Use this metric to:
* Understand where dropped views are happening most frequently
* Watch for spikes in dropped views due to player or application updates

## Ad Errors

The Ad Errors metric counts the number of times that ad errors occurred when trying to play. The number of Ad Errors helps you understand how often ads you attempt to run have problems being viewed.

An Ad Error is not necessarily a playback failure; errors often result in the ad playback ending or the ad being skipped before returning to video content playback.


## Ad Error Percentage

The Ad Error Percentage metric gives the percentage of ad attempts that failed during playback due to an ad error. Ad Errors can happen at any point during the ad playback, often causing the ad to end prematurely.

### Use this metric to:
* Understand how often errors are occurring when showing ads
* Watch for spikes in ad errors that can occur due to a system issue


## Ad Breaks with Errors

The Ad Breaks with Errors metric counts the number of times that ad errors occurred during an ad break. The number of Ad Breaks with Errors helps you understand how often ad breaks have problems showing ads.

Some ad services will skip all remaining ads in the ad pod if an error happens and this metric can help you understand how often that occurs.


## Ad Breaks with Error Percentage

The Ad Breaks with Error Percentage metric gives the percentage of ad breaks where an ad error occurred during the ad break.

### Use this metric to:
* Understand how often ad breaks have errors that occur when showing ads
* Understand if ad errors are more concentrated within a smaller number of ad breaks or spread out across ad breaks


## Ad Startup Error Percentage

The Ad Startup Error Percentage metric gives the percentage of ad attempts in which a failure occurred before the ad started playing and an ad impression was recorded.

### Use this metric to:
* Understand how often errors prevent ads from playing for viewers


## Ad Exits Before Start Percentage

The Ad Exits Before Start Percentage metric gives the percentage of ad attempts in which the user stopped or left the stream before the ad started playing.

The Ad Exit Before Start metric is intended to capture views where the user exits explicitly; ad start failures are not considered an exit in this metric because the user is not choosing to leave before the ad is shown when an error occurs.

### Use this metric to:
* Understand how often viewers stop the video stream before an ad started playing


## Ad Playback Failure Percentage

The Ad Playback Failure Percentage metric gives the percentage of video views that failed to play due to a fatal error during an ad break, causing ad playback to end prematurely.

An error is considered an Ad Playback Failure if it occurs during ad playback, has a severity of fatal, and is not due to a business rule exception.

### Use this metric to:
* Understand where ad playback failures are happening most frequently
* Watch for spikes in ad playback failures due to new errors

Visit the Errors section to see which specific errors are happening the most frequently.


## Content Playback Failure Percentage

The Content Playback Failure Percentage metric gives the percentage of video views that failed to play due to a fatal error during content playback. Content playback failures can happen any time during video playback outside of ad breaks, causing content playback to end prematurely.

An error is considered a Content Playback Failure if it occurs during content playback, has a severity of fatal and it is not due to a business rule exception.

### Use this metric to:
* Understand where content playback failures are happening most frequently
* Watch for spikes in content playback failures due to new errors

Visit the Errors section to see which specific errors are happening the most frequently.

````
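The failure-percentage metrics in this section share one pattern: count the qualifying views and divide by total views, with business rule exceptions tracked separately from ordinary fatal errors. A sketch with made-up view records follows; the record fields are illustrative, since Mux computes these metrics server-side from beacon data.

```python
# Illustrative rollup of failure percentages from raw view records.
# The record shape is invented for this sketch.

views = [
    {"fatal_error": False, "business_exception": False},
    {"fatal_error": True,  "business_exception": False},
    {"fatal_error": True,  "business_exception": True},
    {"fatal_error": False, "business_exception": False},
]

total = len(views)

# Playback Failure: fatal error that is NOT a business rule exception
playback_failure_pct = 100 * sum(
    v["fatal_error"] and not v["business_exception"] for v in views
) / total

# Business Exception: fatal error that IS a business rule exception
business_exception_pct = 100 * sum(
    v["fatal_error"] and v["business_exception"] for v in views
) / total
```

Keeping the two buckets disjoint, as above, means a view counts toward exactly one of the two metrics, which matches the definitions in this section.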


# Startup Time
Startup Time is the time between when the user attempts to start playback and when they see the first frame of video.
<Image alt="Startup Time Dashboard" width={1204} height={993} src="/docs/images/startup-time-dashboard.png" />

```md
## Startup Time Score

Startup Time Score describes how happy or unhappy viewers are with startup time. Longer startup times mean lower scores, while shorter startup times mean higher scores. Once startup time reaches a certain point (around 8 seconds), we begin to decrease the rate of score decay since additional seconds of startup becomes less impactful for long startup times.

### Formula
$$
\frac{8}{8 + startup\_time\_in\_seconds} * 100
$$

Under this curve, the score falls most quickly over the first few seconds of startup and more slowly as startup times grow long.

Note that EBVS views do not receive a Startup Time Score.

Example values:
* 400 ms: 95
* 2 seconds: 80
* 8 seconds: 50
* 20 seconds: 29

### Use this metric to:
* Understand how problems with startup time impact the overall viewer experience
* Compare startup time performance to other areas of viewer experience
* Find areas where startup time can be optimized and improved


## Video Startup Time

Video Startup Time measures the time that the viewer waits for the video to play after the page is loaded and the player is ready. It specifically measures from when the player has been instructed to play the video, either directly by the user or via autoplay, to when the first frame of video is showing and the playhead is progressing. In the case that the player is configured with a pre-roll ad, Video Startup Time is the time until the first frame of the pre-roll ad is displayed.

Mux provides two percentiles of this metric:
* Median (50th Percentile) – Helps understand a typical experience (half are better than this number, half are worse)
* 95th Percentile – Helps understand what a poorer experience is like on your platform, excluding extreme outliers while still happening frequently enough (1 in 20 views) to be worth your attention.

Our data shows that viewers can be very impatient when waiting for a video to start, leaving in as little as two seconds for certain content types.

Network performance and initial rendition selection have the greatest impact on this number.

Preloading the video data before the viewer clicks play can also have a positive impact on this metric, however this should only be done when the video is the primary piece of content.


## Player Startup Time

Player Startup Time measures the time from when the player is first initialized in the page to when it is ready to receive further instructions.

While Player Startup Time is usually low, it can point to subtle differences in the operations of players. When combined with Page Load Time and Video Startup Time, we can see Aggregate Startup Time and understand the full amount of time a viewer waits on a video watch page.

To get Player Startup Time data, you must pass [`player_init_time`](https://docs.mux.com/guides/data/make-your-data-actionable-with-metadata) in your client integration.


## Page Load Time

Page Load Time measures the time from the initial user request for a page to the time when the video player is first initialized. Use this metric to understand the impact of new page resources (JavaScript files, CSS, etc.) on the viewer wait time. It can also be used to compare video players, since the size and loading speed of their files impacts the wait time.

Page Load Time is only recorded for the first video view on a page, so you may see a smaller number of total views for this metric.


## Aggregate Startup Time

Aggregate Startup Time combines Page Load Time, Player Startup Time, and Video Startup Time to show the total time a viewer waits for a video to play after requesting to watch the video on the previous screen or page.

On the web we often have web pages that are dedicated to individual videos. These pages are referred to as watch pages (e.g. `mydomain.com/watch?video=1234`). Viewers get to these pages by clicking on search results and lists of video thumbnails. In the case of watch pages we need to not only understand the Video Startup Time, but the full time that the viewer waited from when they clicked/tapped to watch the video.

Use this metric to optimize your watch pages for shorter wait times. For example, waiting to load secondary content (comments, related videos) until the video is playing, or choosing a video player that has the lowest impact on aggregate startup time.

This metric is only recorded for the first video view on a page, so you may see lower total view counts represented in this metric.


## Seek Latency

The Seek Latency metric measures the average amount of time that viewers wait for the video to start playing after seeking to a new time. Seek latency is calculated as the time between the start of the seek and either the end of the seeked event or the point at which the video is ready to resume playback. If a user pauses during the seek, the metric still measures the time until the video is ready to resume playback or the seeked event ends.

Seeking is any time the player is asked to jump backward or forward to a new time in the video outside of normal playback. Aside from the viewer clicking on the progress bar, seeking can also happen programmatically, for example when jumping ahead to resume where the viewer previously stopped watching.

### Use this metric to:
* Look for cases of extreme seek startup times
* Compare video players and their ability to respond quickly to a viewer’s seek request


## Video Startup Preroll Request Time

The Video Startup Preroll Request Time measures the total amount of Video Startup Time that is spent making ad requests, waiting for the ad responses, and parsing the VAST/VMAP response. Specifically, this measures the amount of time the viewer is waiting for the video to start in which the player does not yet know which ad to play, if any.

It is important to call out that this time _only_ includes time during video startup (i.e. after the user initiates playback, or auto-play does). Any time spent requesting ads before playback is initiated is not included in this metric.

### Use this metric to:
* Understand the performance of your ad server as it affects video startup time
* Attribute slow startup times to ad network performance, versus ad asset performance


## Video Startup Preroll Load Time

The Video Startup Preroll Load Time measures the total amount of Video Startup Time that is spent loading the first preroll ad asset. Specifically, this measures the amount of time the viewer is waiting for the first preroll ad to start playing after all ad responses have been received and parsed.

It is important to call out that this time _only_ includes time during video startup (i.e. after the user initiates playback, or auto-play does). Any time spent preloading the ad asset before playback is initiated is not included in this metric.

### Use this metric to:
* Understand the performance of your ad asset delivery as it affects video startup time
* Attribute slow startup times to ad asset performance, versus ad network performance


## Requests for First Preroll

This metric measures the number of ad requests that are made up to the point of preroll ad playback beginning. Depending on your ad architecture, it is possible that your player may make sequential ad requests in the case that the previous request returned no playable ad in order to fill all possible impressions. In this case, multiple ad requests can lead to slow startup times, which can potentially be improved by reducing the possible waterfall/fallback calls, ensuring playable ads are returned in the first request, or other means within your ad server.

### Use this metric to:
* Understand performance correlation with the number of ad requests made to retrieve the first playable ad


## Content Startup Time

Content Startup Time measures the time that the viewer waits for the video content (not including ads) to play after the page is loaded and the player is ready. It specifically measures from when the player has been instructed to play the video, either directly by the user or via autoplay, to when the first frame of playback content is showing and the playhead is progressing, which excludes the time spent loading and viewing preroll ads.

Mux provides two percentiles of this metric:
* Median (50th Percentile) – Helps understand a typical experience (half are better than this number, half are worse)
* 95th Percentile – Helps understand what a poorer experience is like on your platform, excluding extreme outliers while still happening frequently enough (1 in 20 views) to be worth your attention.


## Ad Preroll Startup Time

Ad Preroll Startup Time measures the time that the viewer waits for the preroll ad to load and start playing after the player is ready. It specifically measures from when the player has been instructed to play the preroll ad, either directly by the user or via autoplay, to when the first frame of ad content is showing and the playhead is progressing.

Mux provides two percentiles of this metric:
* Median (50th Percentile) – Helps understand a typical experience (half are better than this number, half are worse)
* 95th Percentile – Helps understand what a poorer experience is like on your platform, excluding extreme outliers while still happening frequently enough (1 in 20 views) to be worth your attention.

```
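The Startup Time Score formula above is simple enough to sketch directly. EBVS views get no score, mirrored here by returning `None`; the function name is an illustrative choice.

```python
# Sketch of the Startup Time Score formula: 8 / (8 + t) * 100, where t is
# startup time in seconds. EBVS views receive no score (None).
from typing import Optional

def startup_time_score(startup_seconds: Optional[float]) -> Optional[float]:
    if startup_seconds is None:  # EBVS view: no startup, no score
        return None
    return 8 / (8 + startup_seconds) * 100

# These reproduce the example values listed in the section (rounded):
s2 = startup_time_score(2)   # 80.0
s8 = startup_time_score(8)   # 50.0
```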


# Smoothness
Smoothness is a score based on the amount of rebuffering that happened during a view.
<Image alt="Smoothness Dashboard" width={1197} height={958} src="/docs/images/smoothness-dashboard.png" />

```md
## Smoothness Score

Smoothness Score measures the amount of rebuffering a viewer sees when watching video. A higher Smoothness Score means the viewer experiences less rebuffering, while a lower score means a viewer sees more rebuffering.

### Formula
Average of:
$$
\frac{1}{\sqrt{1 + \Big(\frac{\text{rebuffer\_count}}{2}\Big)^2}}*100
$$
and
$$
e^{-10 * rebuffer\_percentage} * 100
$$

Rebuffering can be measured as a combination of the number of rebuffering events and the rebuffering percentage. We measure rebuffering in both ways in order to account for views where rebuffering events are short but occur often. We consider multiple interruptions to be worse than a single interruption, even if total time spent rebuffering is the same. Averaging both measurements helps account for this, and provides a truer representation of the viewer experience.

Rebuffering time is measured as a percentage because a 5-second rebuffering duration is much more meaningful when viewing a 10-second clip versus a 2-hour movie. Our research shows that watch time rapidly decreases from just a single percentage point of rebuffering. We use an exponential curve to model this rapid initial decrease from rebuffering, which then slows down after a rebuffering percent of ~10%. This is because rebuffering is now high enough that incremental amounts of rebuffering becomes less impactful.

Rebuffering counts are slightly less negative under certain circumstances. For example, a single half-second rebuffering event during a long video might be barely noticeable. However, multiple rebuffering events do become noticeable and will rapidly decrease your score. This curve differs from the percentage curve in that a single rebuffering event is not harshly penalized. The score decreases rapidly in the 2 to 4 range, and then slows down past 5 rebuffering events due to the diminishing impact of additional events.

Both rebuffering percentage and rebuffering counts degrade the user experience, so we average the score. This helps balance for cases when either count or percentage is high while the other metric is low. A single rebuffering event of 1% isn’t great, but it’s better than 10 rebuffering events of 0.1%.

Note that EBVS (Exited Before Video Start) views do not receive a Smoothness Score.

Examples:
* No rebuffering: 100
* 5 minute video with a single 5s rebuffer: average of 80 and 90 = 85
* 20 minute video with four 15s rebuffers: average of 44 and 60 = 52

### Use this metric to:
* Understand how problems with rebuffering impact the overall viewer experience
* Compare rebuffering performance to other areas of viewer experience
* Find areas where rebuffering can be optimized and improved


## Rebuffer Percentage

Rebuffer Percentage measures the volume of rebuffering occurring across the platform or a given set of video views. Rebuffer Duration is the sum of the time viewers spend rebuffering. Watch Time is the sum of all time viewers spend watching video, inclusive of rebuffering time. Rebuffer Percentage then measures Rebuffer Duration as a percentage of Watch Time.

Rebuffering occurs when the video stalls while a viewer is attempting to play through content, most often because it is taking more time to download (buffer) the content than it takes to play it. Stalls can also occur when a viewer attempts to seek to different times in the media, which is treated as a separate metric in Mux Data called Seek Latency.

### Use this metric to:
* Understand how much time viewers spend waiting for videos to rebuffer
* Optimize an adaptive algorithm or rebuffering strategy
* Compare players and CDNs


## Rebuffer Frequency

Rebuffer Frequency measures how often rebuffering events happen. It’s important to track this number because it can reveal issues of video stuttering, where the player is being too aggressive when restarting playback and has to frequently stop to rebuffer. This issue can be lost when measuring the rebuffering time, but can be just as frustrating as longer rebuffering events.

Rebuffering occurs when the video stalls while a viewer is attempting to play through content, most often because it is taking more time to download (buffer) the content than it takes to play it. Stalls can also occur when a viewer attempts to seek to different times in the media, which is treated as a separate metric in Mux Data called Seek Latency.

### Use this metric to:
* Understand how frequently viewers are interrupted by rebuffering
* Optimize an adaptive algorithm or rebuffering strategy
* Compare players and CDNs


## Rebuffer Duration

Rebuffer Duration is the amount of time in seconds that viewers wait for rebuffering per video view. Videos with longer durations have more opportunities for rebuffering events to occur, which can make comparisons with shorter videos difficult, making Total Rebuffer Percentage the safer metric to optimize with. However, Rebuffer Duration can be a useful metric for understanding the true viewer experience because it’s measured in seconds as opposed to a percentage.

Rebuffering occurs when the video stalls while a viewer is attempting to play through content, most often because it is taking more time to download (buffer) the content than it takes to play it. Stalls can also occur when a viewer attempts to seek to different times in the media, which is treated as a separate metric in Mux Data called Seek Latency.

### Use this metric to:
* Understand how long viewers wait for videos to rebuffer per video view
* Optimize an adaptive algorithm or rebuffering strategy


## Rebuffer Count

Rebuffer Count shows the number of rebuffering events that happen during video views. Compared to Total Rebuffer Frequency, Rebuffer Count can help you easily understand how many views are seeing more than zero rebuffering events.

Rebuffering occurs when the video stalls while a viewer is attempting to play through content, most often because it is taking more time to download (buffer) the content than it takes to play it. Stalls can also occur when a viewer attempts to seek to different times in the media, which is treated as a separate metric in Mux Data called Seek Latency.

### Use this metric to:
* Understand how often viewers are interrupted by rebuffering per video view
* Optimize an adaptive algorithm or rebuffering strategy


## Rendition Change Count

Rendition Change Count shows the total number of rendition changes (both upshifts and downshifts in video quality) that occurred during playback per video view. This metric helps you understand how frequently adaptive bitrate streaming is adjusting video quality levels or renditions in response to network conditions.

### Use this metric to:
* Understand how stable video quality is during playback
* Identify views with excessive quality fluctuations
* Evaluate the effectiveness of adaptive bitrate algorithms


## Rendition Upshift Count

Rendition Upshift Count shows the number of times video quality shifted upward to a higher quality rendition during playback. Upshifts indicate that network conditions improved enough to support higher bitrate video, providing viewers with better quality.

### Use this metric to:
* Understand how often viewers experience quality improvements
* Measure the responsiveness of adaptive bitrate algorithms to improved network conditions
* Identify scenarios where upshifts correlate with positive viewer engagement


## Rendition Downshift Count

Rendition Downshift Count shows the number of times video quality shifted downward to a lower quality rendition during playback. Downshifts typically occur when network conditions degrade, and the player adapts to prevent rebuffering by reducing video quality.

### Use this metric to:
* Understand how often viewers experience quality degradation
* Correlate downshifts with rebuffering events to evaluate adaptive bitrate effectiveness
* Identify network or CDN issues causing quality problems

```
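The Smoothness Score formula defined above is simple enough to reproduce. Here's a minimal Python sketch, assuming `rebuffer_percentage` is expressed as a fraction (0.05 for 5%), an interpretation inferred from the worked examples:

```python
import math

def smoothness_score(rebuffer_count, rebuffer_percentage):
    """Sketch of the Smoothness Score: the average of a count-based
    component and a percentage-based component.
    rebuffer_percentage is a fraction of watch time (0.05 == 5%)."""
    count_component = 100 / math.sqrt(1 + (rebuffer_count / 2) ** 2)
    percentage_component = math.exp(-10 * rebuffer_percentage) * 100
    return (count_component + percentage_component) / 2

# 20-minute video with four 15s rebuffers (60s of 1200s watched)
smoothness_score(4, 60 / 1200)  # ≈ 52.7
```

The component values before rounding differ slightly from the rounded numbers shown in the examples, but the behavior matches: multiple short rebuffers drag the count component down faster than the percentage component.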


# Video Quality
Video Quality compares the resolution of the video stream to the dimensions of the player.
<Image alt="Video Quality Dashboard" width={2394} height={1946} src="/docs/images/video-quality-dashboard.png" />

````md
## Video Quality Score

Video Quality Score measures the visual quality a user sees by comparing the resolution of a video stream to the resolution of the player in which it is played. If a video stream is significantly upscaled, quality generally suffers, and viewers have an unacceptable experience.

Note that video quality is notoriously difficult to quantify, especially in a reference-free way (without comparing a video to a pristine master). Bitrate doesn't work, since the same bitrate may look excellent on one video and terrible on another.

Several factors contribute to actual video quality: bitrate, codec, content type, and the quality of the original source. However, if content is encoded well and at the right bitrates, upscaling correlates reasonably well to video quality. We use a combination of average and max upscaling in order to account for extreme drops in quality, even when it only occurs for brief moments.

### Formula
$$
e^{-0.33 * (0.15 * U_{m} + 0.85 * U_{a})} * 100
$$
where
$$
U_{m} = \text{Max Upscale Percentage}
$$
$$
U_{a} = \text{Average Upscale Percentage}
$$

Video Quality Score is inversely related to the upscaling percentage for each view. 85% of the score is based on average upscaling, and 15% is based on max (peak) upscaling.

Note that EBVS (Exited Before Video Start) views do not receive a Video Quality Score.

Examples:
* No upscaling: 100
* 50% upscaling throughout: 85
* 200% upscaling for the first 30 seconds, and no upscaling for the next 20 minutes: 92
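A minimal Python sketch of the formula above, assuming the upscale percentages are expressed as fractions (0.5 for 50%), which is consistent with the worked examples:

```python
import math

def video_quality_score(max_upscale, avg_upscale):
    """Sketch of the Video Quality Score; upscale values as fractions.
    85% of the weight goes to average upscaling, 15% to peak upscaling."""
    return math.exp(-0.33 * (0.15 * max_upscale + 0.85 * avg_upscale)) * 100

video_quality_score(0.5, 0.5)  # ≈ 84.8: 50% upscaling throughout
```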


### Use this metric to:
* Understand how video quality impacts the overall viewer experience
* Compare video quality to other areas of viewer experience
* Find areas where video quality can be optimized and improved


## Upscale Percentage

Upscaling is when the video player has to increase the size of the video to fill the player’s display. For example, if the video source is 320x240 and the player size is 640x480, the player will stretch the video to fill the player dimensions. In that process, the quality of the video degrades.

Upscale Percentage is measured as the change in one dimension, specifically the dimension that fits the player first when upscaling. In the 320x240 to 640x480 example, the Upscale Percentage would be 100%, calculated as (640-320) / 320.

However, while the video plays, the upscaling percentage may change if a new video rendition is selected or if the player goes fullscreen. For this reason, the total Upscale Percentage metric weights each upscale percentage by the amount of time the video was upscaled at that level. If the video was upscaled 100% for half of the view and 0% for the other half, the Total Upscale Percentage would be 50%.
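The time weighting described above can be sketched as follows (a hypothetical helper for illustration, not part of any Mux SDK):

```python
def total_upscale_percentage(segments):
    """segments: list of (upscale_percentage, seconds_played) pairs.
    Each upscale percentage is weighted by the time spent at that level."""
    total_time = sum(seconds for _, seconds in segments)
    return sum(pct * seconds for pct, seconds in segments) / total_time

# 100% upscaling for the first minute, none for the second
total_upscale_percentage([(100, 60), (0, 60)])  # 50.0
```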

### Use this metric to:
* Optimize the dimensions of videos to reduce poor quality due to stretching


## Downscale Percentage

Downscaling is the inverse of upscaling, measuring when the video source is too big for the player and has to be reduced in size to fit the display. In the process of shrinking the video, pixels are thrown out and essentially wasted. While this does not mean a reduction in video quality, it does mean wasted bandwidth for you and the viewer, and significant occurrences of downscaling should be addressed.

### Use this metric to:
* Optimize the dimensions of videos for bandwidth and cost savings


## Max Upscale Percentage

While Upscale Percentage helps you understand the volume of upscaling occurring on your platform, Max Upscale Percentage can help reveal points of significant upscaling, even if they don’t last the full video. It can also make it clearer which video rendition may be the culprit, as the percentage will exactly match the difference between a rendition and the player dimensions.

### Use this metric to:
* Optimize the dimensions of videos to reduce poor quality due to stretching


## Max Downscale Percentage

While Downscale Percentage helps you understand the volume of downscaling occurring on your platform, Max Downscale Percentage can help reveal points of significant downscaling, even if they don’t last the full video. It can also make it clearer which video rendition may be the culprit, as the percentage will exactly match the difference between a rendition and the player dimensions.

### Use this metric to:
* Optimize the dimensions of videos for bandwidth and cost savings


## Weighted Average Bitrate

Weighted Average Bitrate is the time-weighted average of the indicated bitrates that a viewer experiences during a video stream. The weighted average is calculated from the amount of time spent at each bitrate while a video is played. The bitrate value is the indicated bitrate from the video manifest for the rendition that is used for each segment of playback.

For example, if during a view lasting 3 minutes a video plays for 1 minute at 1Mbps and 2 minutes at 2Mbps, the Weighted Average Bitrate would be: [(1Mbps * 1min) + (2Mbps * 2min)] / 3min = 1.67Mbps

This metric only includes the bitrates (as indicated in the manifest) of video segments that are actually played. It does not include the segments that are downloaded but unplayed due to, for instance, an ABR algorithm that switches to a higher bitrate and discards previously downloaded lower bitrate segments in the cache.
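The worked example above can be sketched in code (a hypothetical helper; bitrates in Mbps, durations in minutes):

```python
def weighted_average_bitrate(segments):
    """segments: list of (bitrate_mbps, minutes_played) pairs.
    Only segments that were actually played should be included."""
    total_minutes = sum(minutes for _, minutes in segments)
    return sum(mbps * minutes for mbps, minutes in segments) / total_minutes

# 1 minute at 1Mbps, then 2 minutes at 2Mbps
weighted_average_bitrate([(1, 1), (2, 2)])  # ≈ 1.67 Mbps
```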

### Use this metric to:
* Measure and optimize the visual quality of the videos that viewers experience


## Live Stream Latency

Live Stream Latency measures the time it takes from when a camera captures an action in real life to when viewers of a live stream see it happen on their screen. This metric allows you to quantify the amount of latency viewers experience and to identify viewers that may be encountering high latency which would impact their viewing experience.

This value is sometimes referred to as the glass-to-glass latency for a live stream, but it is usually more accurately called ingest-to-render latency. The clock time for the live stream is determined using the time specified in `EXT-X-PROGRAM-DATE-TIME` tags embedded in the HLS manifest. The time specified in the HLS manifest is compared to the current UTC time, as specified by Mux Data servers.

Standard HLS streams usually have a latency of about 30 seconds, and Low Latency HLS (LL-HLS) streams target 5-10 seconds.

Note that if you are attempting to compare latency across different infrastructures or video platforms it is important to understand when during the video capture, ingest, or encoding the `EXT-X-PROGRAM-DATE-TIME` tags are inserted. For example, Mux Video inserts PDT tags when the video is ingested for streaming. Because of this behavior you should expect the latency measured for Mux Video streams to be around 1 second lower than the actual glass-to-glass latency. Some other platforms behave similarly and would also omit the time before ingest time from latency. For each streaming platform you use, you should assess at what point during the streaming pipeline the PDT tags get inserted so you know what is being measured and can take that into consideration.

We make an effort to only calculate the latency when the player is playing near the live edge segment, as identified by the `HOLD-BACK` and `PART-HOLD-BACK` tags in the playlist, if specified. Viewers playing more than 5 minutes behind the live edge will be excluded from the metric.

This metric will be calculated for any HLS or LL-HLS live stream that contains `EXT-X-PROGRAM-DATE-TIME` tags. For more information about `EXT-X-PROGRAM-DATE-TIME` tags please refer to the [HLS specification](https://datatracker.ietf.org/doc/html/draft-pantos-http-live-streaming#section-4.3.2.6).
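A simplified sketch of this measurement in Python; `live_stream_latency` is a hypothetical helper that compares a segment's `EXT-X-PROGRAM-DATE-TIME` value to the current UTC clock, ignoring details like `HOLD-BACK` handling and live-edge detection:

```python
from datetime import datetime, timezone

def live_stream_latency(program_date_time, playhead_offset_s, now=None):
    """program_date_time: the EXT-X-PROGRAM-DATE-TIME value (ISO-8601) of
    the segment currently playing; playhead_offset_s: seconds played into
    that segment. Latency = wall clock - absolute playhead time."""
    pdt = datetime.fromisoformat(program_date_time)
    if now is None:
        now = datetime.now(timezone.utc)
    return (now - pdt).total_seconds() - playhead_offset_s
```

Note that the player's clock may drift from the server clock, which is one reason the actual measurement uses the UTC time specified by Mux Data servers rather than the client alone.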


## Request Throughput

Request Throughput measures the average throughput, in Mbps, for all media requests that were completed. Throughput is measured as the number of bits received per second from the time a request is initiated until it is completed.

Note that request metrics are only available for certain SDKs and playbacks. See our docs for more information.

### Formula
Each request has a total time loaded and a total bits downloaded. All of the bits loaded are added up and divided by the total request time (the sum of all time spent downloading those bits). If two requests are sent in parallel, they still have independent measurements which are added to the totals respectively.

```
    sum(bits downloaded) / sum(request time)
```

For example, if there were three requests that each downloaded 40 Megabits (5MB) in one 4 second request and two 2 second requests then the request throughput is 15 Mbps, `(40 + 40 + 40) / (4 + 2 + 2)`.
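The calculation from the example can be sketched as follows (a hypothetical helper; sizes in megabits, times in seconds). Note that the totals are summed first and then divided, rather than averaging per-request rates:

```python
def request_throughput(requests):
    """requests: list of (megabits_downloaded, seconds_to_complete) pairs.
    Returns throughput in Mbps: sum(bits) / sum(request time)."""
    total_megabits = sum(mb for mb, _ in requests)
    total_seconds = sum(sec for _, sec in requests)
    return total_megabits / total_seconds

# Three 40-megabit requests taking 4s, 2s, and 2s
request_throughput([(40, 4), (40, 2), (40, 2)])  # 15.0 Mbps
```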

### Use this metric to:
* Compare the performance of multiple CDNs
* Troubleshoot throughput problems by CDN, ASN (ISP), or geography
* Understand the bandwidth of your users across different geographies, devices, etc.


## Request Latency

Request Latency measures the average time to first byte for media requests: the time from when the request is initiated to when the first byte of data is received from the server.

Note that request metrics are only available for certain SDKs and playbacks. See our docs for more information.

### Formula
```
    sum(time to first byte) / (number of requests)
```

For example, if the time to first byte for five requests was 100ms, 200ms, 100ms, 75ms, and 150ms, the Request Latency would be:
```
    (100 + 200 + 100 + 75 + 150) / 5 = 125
```

### Use this metric to:
* Compare the performance of multiple CDNs
* Troubleshoot latency problems by CDN, ASN (ISP), or geography


## Max Request Latency

Max Request Latency measures the maximum time to first byte for a media request, that is the maximum time an individual request took from the time it was initiated to the time when the first byte of data was received from the server.

Note that request metrics are only available for certain SDKs and playbacks. See our docs for more information.

### Formula
```
    max(time to first byte)
```

For example, if the time to first byte for five requests was 100ms, 200ms, 100ms, 75ms, and 150ms, the Max Request Latency would be 200ms.

### Use this metric to:
* Compare the performance of multiple CDNs
* Troubleshoot latency problems by CDN, ASN (ISP), or geography

````


# Make your dimensions actionable with metadata
Configure metadata with your SDK in order to populate dimensions to search, filter, and segment your video performance metrics.
One of Mux Data's core concepts is dimensions: the attributes of a video view that you can use to search, filter, and segment your video performance metrics.

While some of these dimensions are populated automatically, Mux Data allows you to provide details about the video and environment that either can't be detected automatically, can't be accessed if the video fails to load, or should be overridden.

Each dimension corresponds with a metadata key which can be used to set these values. While all metadata details except for `env_key` are optional, some may be necessary to calculate certain metrics and you'll see more helpful results as you include more.

## Dimension Details

#### Level

Each dimension is either considered `basic` or `advanced`. All dimensions are available for the standard retention period of 100 days. Long Term Metrics only support `basic` dimensions. If you are interested in Long Term Metrics, please reach out to [Mux Support](mailto:help@mux.com).

<Callout type="info">
  Long Term Metrics are available on **Mux Data Custom Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

#### Scoping

Many of Mux Data's dimensions are scoped to specific categories. Based on the category, they may have different behavior in terms of how these details are updated.

* Video details (prepended by `video_`) describe the current video that's playing and are all reset automatically when changing the video. This metadata might come from your internal CMS or video management system.
* Player details (prepended by `player_`) describe the player configuration that's being used and should be set whenever monitoring is started on a new player. They do not reset when the video is changed.
* All other details will persist until explicitly changed.

#### Type

There are three types of dimensions based on their availability.

**Tracking**: Enables tracking of additional metrics but are unavailable as dimensions.

**Limited**: Appear as attributes of a view on the individual view page as well as in the API.

**Full**: Can be used as filters and breakdowns in aggregate reports, in the video view page and in the API.

## High Priority Configurable Metadata

The following dimensions are the most important fields whose metadata keys you should populate in order to get the basic functionality from Mux Data.

<Callout type="info" title="Note about `viewer_user_id`">
  For `viewer_user_id` you should not use any value that is personally identifiable on its own (such as email address, username, etc.). Instead, you should supply an anonymized viewer ID which you have stored within your own system.
</Callout>

| Dimension Name | Key Name | Unit | Type | Level | Description |
|:-|:-|:-|:-:|:-:|:-|
| Environment | `env_key` | Unique ID | Required | N/A | Your env key from the Mux dashboard. This field ensures that your data goes into the correct environment. Note this was previously named `property_key` |
| Video ID | `video_id` | Text | Full | basic | Your internal ID for the video. Defaults to the Mux External ID if enabled for Assets and Livestreams hosted by Mux. |
| Video Title | `video_title` | Text | Full | basic | Title of the video being played (e.g.: `Awesome Show: Pilot`). Defaults to the Mux Video Title if enabled for Assets and Livestreams hosted by Mux. |
| Viewer ID | `viewer_user_id` | Unique ID | Full | adv | An ID representing the viewer who is watching the stream. Use this to look up video views for an individual viewer. If no value is specified, a unique ID will be generated by the SDK. Note: You should not use any value that is personally identifiable on its own (such as email address, username, etc.). Instead, you should supply an anonymized viewer ID which you have stored within your own system. |

## Optional Configurable Metadata

The following dimensions can be set manually using the metadata key name and will be reported by Mux Data. The key name provided below is the snake\_case used by Web SDKs. Key names for the iOS and Android SDKs use camelCase and may differ in some cases. Review the API documentation for API key names.

| Dimension Name | SDK Key Name | Unit | Type | Level | Description |
|:-|:-|:-|:-:|:-:|:-|
| Audio Codec | `video_audio_codec` | Text | Full | adv | The codec of the audio that played during the view. |
| CDN | `video_cdn` | Text | Full | basic | CDN delivering the video, either detected by Mux (via response `X-CDN` header) or specified in the view as `video_cdn`. Specifying a `video_cdn` value on the view does not override the detected value, if the `X-CDN` value is set on the segment response headers. |
| CDN Edge PoP | `view_cdn_edge_pop` | Text | Full | adv | Region where the CDN edge point of presence server is located or other origin server identification. |
| Content Type | `video_content_type` | Text | Full | basic | The type of content: e.g. `short`, `movie`, `episode`, `clip`, `trailer`, or `event` |
| Client Application Name | `view_client_application_name` | Text | Full | adv | Name of the customer application that the viewer is using to watch the content, e.g. 'OurBrand iOS App'. |
| Client Application Version | `view_client_application_version` | Text | Full | adv | Version of the customer application that the viewer is using to view the content. |
| DRM Type | `view_drm_type` | Text | Full | adv | The DRM SDK or service that is used for the video playback, such as `widevine` or `playready` |
| DRM Level | `view_drm_level` | Text | Full | adv | Security level of the specific DRM type. Some DRM types do not have levels. |
| Duration | `video_duration` | Milliseconds | Limited | (none) | The length of the video in milliseconds |
| Encoding Variant | `video_encoding_variant` | Text | Full | adv | Allows you to compare different encoders or encoding settings. This could designate the encoder used (e.g. `x264`, `hevc`, or `av1`), the preset used (e.g. `av1-0`, `av1-4`, or `av1-8`), or other properties of the encoding you want to track.  |
| Experiment Name | `experiment_name` | Text | Full | adv | You can use this field to separate views into different experiments, if you would like to filter by this dimension later. |
| Origin | `view_cdn_origin` | Text | Full | adv | Identifying name of the Content Origin or Region where the Origin server is located. |
| Page Type | `page_type` | Text | Full | adv | Provide the context of the page for more specific analysis. Values include `watchpage`, `iframe`, or leave empty. **`watchpage`** — A web page that is dedicated to playing a specific video (for example youtube.com/watch/ID or hulu.com/watch/ID) **iframe** — An iframe specifically used to embed a player on different sites/pages |
| Player Initialization Time | `player_init_time` | Milliseconds since Epoch | Tracking | N/A | If you are explicitly loading your player in page (perhaps as a response to a user interaction), include the timestamp (milliseconds since Jan 1 1970) when you initialize the player (or, for HTML5 video, right before you add the video element to the DOM) in order to accurately track page load time and player startup time. |
| Player Name | `player_name` | Text | Full | basic | You can provide a name for the player (e.g. `My Player`) if you want to compare different configurations or types of players around your site or application. This is different from the player software (e.g. `Video.js`), which is tracked automatically by the SDK. |
| Player Version | `player_version` | Text | Full | adv | As you make changes to your player you can compare how new versions of your player perform (e.g. `1.2.0`). This is not the player software version (e.g. `Video.js 5.0.0`), which is tracked automatically by the SDK. |
| Sub Property ID | `sub_property_id` | Text | Full | basic | A sub property is an optional way to group data within a property. For example, sub properties may be used by a video platform to group data by its own customers, or a media company might use them to distinguish between its many websites. |
| Time Shift Enabled | `view_time_shift_enabled` | Boolean | Full | adv | Boolean indicating if this view had timeshift enabled. |
| Used Captions | `player_captions_enabled` | Boolean | Full | adv | Boolean indicating if the player used captions at any time during the view. |
| Used PiP | `player_pip_enabled` | Boolean | Full | adv | Boolean indicating if the player used Picture in Picture at any time during the view. |
| Video Affiliate | `video_affiliate` | Text | Full | adv | Affiliate station that the viewer is watching or associated with the viewer. |
| Video Brand | `video_brand` | Text | Full | adv | Brand associated with the video or the brand of the streaming platform the viewer is using to watch the video. |
| Video Codec | `video_codec` | Text | Full | adv | The codec of the video that played during the view. |
| Video Dynamic Range Type | `video_dynamic_range_type` | Text | Full | adv | The format or type of dynamic range available on the video during the view. |
| Video Language | `video_language_code` | Text | Full | adv | The audio language of the video, assuming it's unchangeable after playing. |
| Video Producer | `video_producer` | Text | Full | adv | The producer of the video title |
| Video Series | `video_series` | Text | Full | basic | The series of the video (e.g.: `Season 1`) |
| Video Stream Type | `video_stream_type` | Text | Full | basic | The type of video stream (e.g: `live` or `on-demand`) |
| View Session ID | `view_session_id` | Unique ID | Full | adv | An ID that can be used to correlate the view with platform services upstream such as CDN or origin logs. |
| Video Variant Name | `video_variant_name` | Text | Full | adv | Allows you to monitor issues with the files of specific versions of the content, for example different audio translations or versions with hard-coded/burned-in subtitles. |
| Video Variant ID | `video_variant_id` | Text | Full | adv | Your internal ID for a video variant |
| Viewer Plan | `viewer_plan` | Text | Full | adv | Name of the viewer's customer-specific plan, product, or subscription. |
| Viewer Plan Status | `viewer_plan_status` | Text | Full | adv | Status pertaining to that viewer's subscription plan (e.g. subscriber, non-subscriber, SVOD, AVOD, free, standard, premium). |
| Viewer Plan Category | `viewer_plan_category` | Text | Full | adv | Category of the viewer's customer-specific subscription plan (e.g. bundle-type, subscription-campaign-id). |
| Custom Dimensions | `custom_1 - 10` | Text | Full | adv | Customer-defined metadata |

## Overridable Metadata

The following dimensions are populated automatically where the data is supported by the SDK. This data can be overridden by the SDK client implementation using the metadata key name, if needed.

| Dimension Name | Key Name | Unit | Type | Level | Description |
|:-|:-|:-|:-:|:-:|:-|
| Autoplay | `player_autoplay_on` | Boolean | Full | adv | Indicates whether the player was set to autoplay the video or not. This tracks whether the video has `autoplay=true` set; it is not always able to tell if the browser disregarded the setting, otherwise prevented the video from playing, or if the video play was triggered via a script. |
| Browser | `viewer_application_name` | Text | Full | basic | Browser used for the video view (`Safari`, `Chrome`, etc.). On Android and iOS applications this defaults to the bundle identifier. |
| Browser Version | `viewer_application_version` | Version | Full | adv | Browser version (e.g. `66.0.3359.158`). On Android and iOS applications this defaults to the bundle version. |
| Connection Type | `viewer_connection_type` | Text | Full | adv | The type of connection used by the player, as reported by the client when available: `cellular`, `other`, `wifi`, `wired` |
| Device Brand | `viewer_device_manufacturer` | Text | Full | basic | Device Manufacturer (e.g. `Apple`, `Microsoft`, etc.) |
| Device Category | `viewer_device_category` | Text | Full | basic | The form factor of the device: `camera`, `car browser`, `console`, `desktop`, `feature phone`, `peripheral`, `phone`, `portable media player`, `smart display`, `smart speaker`, `tablet`, `tv`, `wearable` |
| Device Model | `viewer_device_model` | Text | Full | adv | Device Model (e.g. `iPhone11,2`) |
| Device Name | `viewer_device_name` | Text | Full | basic | Device Name (e.g. `iPhone 12`) |
| Error Code | `player_error_code` | Text | Full | adv | Error code encountered by the player during playback. |
| Operating System | `viewer_os_family` | Text | Full | basic | Operating System (`iOS`, `Windows`, etc.) |
| Operating System Version | `viewer_os_version` | Version | Full | adv | Operating System version (e.g. `10.6`) |
| Page URL | `page_url` | URL | Limited | adv | Page URL |
| Player Height | `player_height` | Integer | Limited | adv | Height of the player as displayed, in logical pixels |
| Player Instance ID | `player_instance_id` | Unique ID | Limited | (none) | Identifies the instance of the Player class that is created when a video is initialized |
| Player Language | `player_language_code` | Text | Full | adv | Player's text language |
| Player Poster | `player_poster` | URL | Limited | (none) | The image shown as the pre-visualization before play |
| Player Software | `player_software_name` | Text | Full | basic | Player Software being used to play the Video (e.g. `Video.js`, `JW Player`, etc.). Note this was previously named `player_software` |
| Player Software Version | `player_software_version` | Text | Full | adv | Player Software Version (e.g. `2.45.5`) |
| Player Width | `player_width` | Integer | Limited | (none) | Width of the player as displayed, in logical pixels |
| Preload | `player_preload_on` | Boolean | Full | adv | Specifies if the player was configured to load the video when the page loads. |
| Remote Played | `player_remote_played` | Boolean | Full | adv | If the video is remote played to AirPlay as specified by the SDK. |
| Source Height | `player_source_height` | Integer | Limited | adv | Height of the source video being sent to the player, in pixels |
| Source Width | `player_source_width` | Integer | Limited | (none) | Width of the source video as seen by the player, in pixels |
| Source Type | `video_source_mime_type` | Text | Full | basic | Format of the source, as determined by the player. E.g. `application/dash+xml`, `x-application/mpegUrl`, `mp4`, etc. |
| Used Full Screen | `player_is_fullscreen` | Boolean | Limited | adv | Indicates whether the viewer used full screen to watch the video. |
| Video Creator ID | `video_creator_id` | Text | Full | adv | A unique identifier for the creator of the video. Defaults to the Mux Creator ID if enabled for Assets and Livestreams hosted by Mux. |

## Internal Metadata

The following dimensions are populated automatically by the SDK, and cannot be overridden by the SDK client implementation.

| Dimension Name | Unit | Type | Level | Description |
|:-|:-|:-:|:-:|:-|
| ASN | Integer | Full | adv | The Autonomous System Number (ASN) representing the network provider of the viewer. |
| Audio Codec Initial | Text | Full | adv | Initial codec of the audio that played. Derived from the first value of the audio\_codec dimension. |
| CDN Trace | Sequence | Full | adv | Populates all values of video\_cdn field over the course of a view. |
| Continent Code | Text | Full | basic | The continent from which the video was accessed, represented as a code. |
| Country | Text | Full | adv | The country from which the video was accessed, represented as a country code. |
| Exited Before Video Start | Boolean | Full | basic | Indicates whether the viewer exited before the video started playing. |
| Mux Asset ID | Unique ID | Full | basic | A unique identifier for the video asset being played. |
| Mux Live Stream ID | Unique ID | Full | basic | The unique identifier of the live stream being played. |
| Mux Playback ID | Text | Full | basic | A unique identifier for the playback ID used to play the video. |
| Mux Plugin | Text | Full | adv | The name of the Mux plugin used by the video player. |
| Mux Plugin Version | Text | Full | adv | The version of the Mux plugin used by the video player. |
| Playback Business Exception | Boolean | Full | adv | Indicates whether a business rule-related issue caused playback failure. |
| Playback Failure | Boolean | Full | adv | Indicates whether the playback failed for any reason. |
| Region | Text | Full | adv | The specific region or state where the video was accessed. |
| Source Bitrate | Integer | Limited | adv | Bitrate of the source video in bps. |
| Source Bitrate Initial | Integer | Limited | adv | Initial bitrate of the source video in bps. Derived from the first value of the player\_source\_bitrate dimension. |
| Source Framerate | Number | Limited | adv | Framerate of the source video. |
| Source Framerate Initial | Number | Limited | adv | Initial framerate of the source video. Derived from the first value of the player\_source\_fps dimension. |
| Source Height | Integer | Limited | adv | The height (in pixels) of the source currently loaded in the player, regardless of the size of the player. |
| Source Height Initial | Integer | Limited | adv | Initial height (in pixels) of the source currently loaded in the player, regardless of the size of the player. Derived from the first value of the player\_source\_height dimension. |
| Source Hostname | Text | Full | adv | The hostname of the video source, such as the CDN or media server. |
| Source Width | Integer | Limited | adv | The width (in pixels) of the source currently loaded in the player, regardless of the size of the player. |
| Source Width Initial | Integer | Limited | adv | Initial width (in pixels) of the source currently loaded in the player, regardless of the size of the player. Derived from the first value of the player\_source\_width dimension. |
| Video Codec Initial | Text | Full | adv | Initial codec of the video that played. Derived from the first value of the video\_codec dimension. |
| Video Dynamic Range Type Initial | Text | Full | adv | Initial format or type of dynamic range available on the video that played. Derived from the first value of the video\_dynamic\_range\_type dimension. |
| Video Startup Failure | Boolean | Full | basic | Indicates whether the video failed to start due to an error. |
| Video Startup Business Exception | Boolean | Full | adv | Indicates whether a business rule-related issue caused a video startup failure. |
| View Dropped | Boolean | Full | adv | Boolean indicating whether the view was finalized without an explicit viewend event. |
| View Has Ad | Boolean | Full | basic | Indicates whether the video view included an ad. |

## Set Metadata with Session Data

Metadata is normally set using code in the SDK configuration. However, some video metadata can also be set using Session Data key/value pairs in the HLS manifest. This method makes it easier to communicate values to the Mux player SDK without having to side-channel information to the client or change client-side code in order to configure metadata for a view.

Common use cases include setting a view session ID that comes from a backend system, which can be used to associate a playback view with the requests made to a CDN, or capturing which experiments a viewer is participating in without having to communicate that information to the player.

HLS Session Data, represented in an HLS master playlist using the `EXT-X-SESSION-DATA` tag, is a key/value pair that can be read by the player. When the master playlist is loaded into a video player integrated with a Mux Data SDK that supports extracting Session Data, Session Data keys that use the `io.litix.data` prefix are included in the Mux Data view as dimension metadata, just as if you had configured the values in the SDK configuration code.

<Callout type="info" title="Note about HLS Session Data for developers using Mux Video:">
  This feature is intended for developers using their own custom video delivery pipeline. HLS Session Data is set by Mux Video when videos are viewed; injecting your own HLS Session Data into Mux Video content is not currently supported.
</Callout>

The Session Data tags are interpreted as follows from the master playlist:

```text
Tag: #EXT-X-SESSION-DATA
Key: DATA-ID="io.litix.data.[dimension_name]"
Value: VALUE="dimension value"
```

The dimension names available to be set from the master playlist:

* `video_*`
* `custom_*`
* `experiment_name`
* `view_session_id`
* `viewer_user_id`

The following is an example of Session Data tags in a master playlist:

```text
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS

#EXT-X-SESSION-DATA:DATA-ID="io.litix.data.experiment_name",VALUE="abr_test:true"
#EXT-X-SESSION-DATA:DATA-ID="io.litix.data.view_session_id",VALUE="12345ABCD"

#EXT-X-STREAM-INF:BANDWIDTH=2516370,AVERAGE-BANDWIDTH=2516370,CODECS="mp4a.40.2,avc1.640020",RESOLUTION=1280x720
...
```

The Session Data tags in this master playlist would result in the `Experiment Name` dimension being set to `abr_test:true` and the `View Session ID` dimension being set to `12345ABCD`.
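As an illustration of how these tags map to dimensions, here is a minimal sketch (not part of any Mux SDK; the function name is hypothetical) that extracts `io.litix.data` Session Data from a master playlist. It assumes the `DATA-ID` attribute appears before `VALUE`, as in the example above.

```js
// Hypothetical sketch of how a player integration might pull `io.litix.data`
// Session Data out of a master playlist before attaching it to the view.
const MANIFEST = `#EXTM3U
#EXT-X-SESSION-DATA:DATA-ID="io.litix.data.experiment_name",VALUE="abr_test:true"
#EXT-X-SESSION-DATA:DATA-ID="io.litix.data.view_session_id",VALUE="12345ABCD"`;

function extractLitixSessionData(manifest) {
  const dimensions = {};
  for (const line of manifest.split("\n")) {
    if (!line.startsWith("#EXT-X-SESSION-DATA:")) continue;
    // Capture the dimension name after the io.litix.data prefix, and its value.
    const match = line.match(/DATA-ID="io\.litix\.data\.([^"]+)",VALUE="([^"]*)"/);
    if (match) dimensions[match[1]] = match[2];
  }
  return dimensions;
}

console.log(extractLitixSessionData(MANIFEST));
// { experiment_name: 'abr_test:true', view_session_id: '12345ABCD' }
```

A real integration would also handle attributes appearing in any order and ignore keys outside the allowed dimension names listed above.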

<Callout type="info">
  We're aware of a crash that may occur in [AVPlayer Data SDK](https://github.com/muxinc/mux-stats-sdk-avplayer) versions 2.12.0 - 3.5.0 when processing HLS Session Data that is prefixed with `io.litix.data`. AVPlayer Data SDK integrations that process HLS Session Data not prefixed with `io.litix.data` are not affected. Custom integrations that use the [Objective-C MuxCore SDK](https://github.com/muxinc/stats-sdk-objc) and do not depend on the AVPlayer Data SDK are not affected.
</Callout>

## iOS/Android Metadata

In the iOS and Android SDKs, field names are converted to lowerCamelCase setters and getters. For example, to set the Video Stream Type field in iOS or Android, use `videoStreamType` instead of `video_stream_type`.

In the Objective-C SDKs, options are provided via the `MUXSDKCustomerPlayerData`, `MUXSDKCustomerVideoData`, `MUXSDKCustomerViewData`, and `MUXSDKCustomData` objects. See the header directories in MuxCore.xcframework from the [latest release](https://github.com/muxinc/stats-sdk-objc/releases/latest) for a complete list of names.

In the Java SDK, options are provided via the `CustomerPlayerData`, `CustomerVideoData`, and `CustomData` objects. Use your IDE to inspect these objects' API.


# Extend Data with custom metadata
Configure your SDKs to track and report on custom metadata for views in Mux Data.
<Callout type="info">
  Limits

  The number of custom dimensions you can track depends on your plan. See the [pricing page](https://mux.com/pricing/data) for details.
</Callout>

## 1. What are Custom Dimensions?

There are many metadata dimensions that can be used to track information about video views such as Video Title, Video Series, or Encoding Variant. You can find the whole list on the [guide to making your data actionable](/docs/guides/make-your-data-actionable-with-metadata). Custom Dimensions allow you to define, submit, and report on metadata necessary to support your use case that are not in the pre-defined collection of metadata dimensions in Mux Data. Examples could include metadata such as the device firmware version or a subscription plan type.

Each custom dimension can have a display name and an assigned category. They also have a pre-defined field name, such as `custom_1`, that is used to refer to the dimension in code when submitting a value to track as part of a view. You'll use these field names when sending these values via an SDK integration or accessing a dimension value using the Mux Data API.

## 2. Configuring Custom Dimensions

The Custom Dimensions configuration is available from the Settings page by selecting the "Custom Dimensions" tab. You will see the list of dimensions that are available for reporting.

<Image sm src="/docs/images/custom-dimensions-tab.png" width={1211} height={767} />

To enable a dimension, click the switch to the left of the dimension name; to disable a dimension, click the switch off. When a dimension is disabled, its data continues to be collected from the SDKs but is not available for reporting in the Mux Dashboard.

<Image sm src="/docs/images/custom-dimensions-enable.png" width={1120} height={453} />

To edit a dimension, click the edit pencil to the right of the row. You can set the display name of the dimension to match your preferred definition and assign the most appropriate category. By default, Custom Dimensions are included in the Custom category, but they can be added to any existing dimension category.

The name and category of the dimension are used wherever dimensions are displayed, such as the Metrics Breakdown page, the View detail page, the Filter modal, or the dimensions list on the Compare page.

<Image src="/docs/images/edit-custom-dimension.png" width={545} height={233} alt="Mux Data edit custom dimension" />

## 3. Reporting on Custom Dimensions

Once configured to be visible, Custom Dimensions can be reported on in the same way as pre-defined dimensions. They are available for filtering, aggregation, and comparison from the Metrics Breakdown screen, in the category assigned to each visible dimension.

The Custom Dimension values are also available in the export files using the pre-defined field name (e.g. `custom_1`).

<Image sm src="/docs/images/custom-dimensions-breakdown.png" width={839} height={684} />

## 4. Submitting Custom Metadata from Mux Data SDKs

Custom Dimension data is configured in the Mux Data SDKs in a similar way to other view metadata.

Metadata is submitted to the SDKs using the pre-defined field name assigned to the dimension you have configured. For example, if you configured the `custom_1` dimension to have the display name "Secondary User Id," you submit that secondary user id value using the `custom_1` or `CustomData1` metadata field, depending on the platform.

<Callout type="info">
  Make sure you are using an up-to-date version of each Mux Data SDK to enable support for submitting Custom Dimensions.
</Callout>

### HTML5 Video Element and other web SDKs

In web-based SDKs, Custom Dimensions are set in the same `data` object as the other view metadata fields.

```js
mux.monitor('#test_video', {
  data: {
    // Set other view data
    video_title: 'Big Buck Bunny',
    player_init_time: playerInitTime,
    env_key: 'YOUR_ENVIRONMENT_KEY_HERE',

    // Set custom dimension data
    custom_1: 'My Custom Dimension Value'    // Set the custom value here
  }
});
```

For more guidance on using and configuring web-based SDKs, please refer to the guide on [monitoring the HTML5 video element](/docs/guides/monitor-html5-video-element).

Version 4.1.0 or later of the HTML5 Video Element monitor is necessary to support Custom Dimensions.

### ExoPlayer

In Android-based SDKs, Custom Dimensions are set in the `CustomData` object and attached to the `CustomerData` object that is used to initialize the Mux Data SDK.

```java
// Set other view data
CustomerPlayerData customerPlayerData = new CustomerPlayerData();
customerPlayerData.setEnvironmentKey("YOUR_ENVIRONMENT_KEY_HERE");
CustomerVideoData customerVideoData = new CustomerVideoData();
customerVideoData.setVideoTitle("Big Buck Bunny");

// Set custom dimension data
CustomData customData = new CustomData();
customData.setCustomData1("MY_CUSTOM_DIMENSION_VALUE");  // Set the custom value here
CustomerData customerData = new CustomerData(customerPlayerData, customerVideoData, null);
customerData.setCustomData(customData);

muxStats = new MuxStatsExoPlayer(this, player, "demo-player", customerData);
```

An example integration that includes Custom Dimensions can be found in the demo application for [muxinc/mux-stats-sdk-exoplayer](https://github.com/muxinc/mux-stats-sdk-exoplayer) which integrates Mux into an ExoPlayer demo application.

For more guidance on using and configuring Android SDKs, please refer to the guide on [monitoring ExoPlayer](/docs/guides/monitor-exoplayer).

Version 2.5.0 or later of the ExoPlayer monitor is necessary to support Custom Dimensions.

### AVPlayer

In iOS-based SDKs, Custom Dimensions are set in the `MUXSDKCustomData` object and attached to the `MUXSDKCustomerData` object that is used to initialize the Mux Data SDK.

```objc
// Set custom dimension data
MUXSDKCustomData *customData = [[MUXSDKCustomData alloc] init];
[customData setCustomData1:@"my-custom-dimension-value"];  // Set the custom value here

// Set other view data
MUXSDKCustomerPlayerData *playerData = [[MUXSDKCustomerPlayerData alloc] initWithPropertyKey:@"YOUR_ENVIRONMENT_KEY_HERE"];
MUXSDKCustomerVideoData *videoData = [MUXSDKCustomerVideoData new];
videoData.videoTitle = @"Big Buck Bunny";
MUXSDKCustomerViewData *viewData = [[MUXSDKCustomerViewData alloc] init];

MUXSDKCustomerData *customerData = [[MUXSDKCustomerData alloc] initWithCustomerPlayerData:playerData videoData:videoData viewData:viewData customData:customData];
_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController withPlayerName:@"demo-player" customerData:customerData];
```

An example integration that includes Custom Dimensions can be found in the demo application for [muxinc/mux-stats-sdk-avplayer](https://github.com/muxinc/mux-stats-sdk-avplayer/tree/master/Examples/MUXSDKStatsExampleSPM) which integrates Mux into an AVPlayer demo application.

For more guidance on using and configuring iOS SDKs, please refer to the guide on [monitoring AVPlayer](/docs/guides/monitor-avplayer).

Version 2.5.0 or later of the AVPlayer monitor is necessary to support Custom Dimensions.

### Roku

In the Roku SDK, Custom Dimensions are set in the same `muxConfig` object as the other view metadata fields.

```brightscript
m.mux = m.top.CreateNode("mux")
m.mux.setField("video", m.video)

muxConfig = {
  property_key: "YOUR_ENVIRONMENT_KEY_HERE",
  ' Set the custom dimension data
  custom_1: "my-custom-dimension-value"
}

m.mux.setField("config", muxConfig)
m.mux.control = "RUN"

' Load the video into the Video node
```

For more guidance on using and configuring the Roku SDK, please refer to the guide on [monitoring Roku](/docs/guides/monitor-roku).

Version 1.1.0 or later of the Roku monitor is necessary to support Custom Dimensions.


# Filter your Data
Learn how to use the `filters[]` parameter to filter your data with flexible syntax for different dimension types.
The `filters[]` parameter allows you to filter your data using flexible syntax that supports different types of operations depending on the dimension type.

## Filter Syntax Overview

The basic format for all filters is:

<code>
  filters\[]=<span className="bg-yellow text-white">\<operation></span><span className="bg-orange text-white">\<dimension></span>:<span className="bg-pink text-white">\<value></span>
</code>

* <span className="bg-yellow text-white font-mono">\<operation></span> is the optional prefix that defines the type of filter.
  Examples:
  * (none) → equals → `country:US`
  * `!` → not equals → `!country:US`
  * `+` → set contains → `+video_cdn_trace:fastly`
  * `-` → set omits → `-video_cdn_trace:cloudflare`

* <span className="bg-orange text-white font-mono">\<dimension></span> is the field or metric you want to filter on.
  Examples: `country`, `operating_system`, `video_cdn_trace`

* **`:`** acts as the separator between the dimension and the value.

* <span className="bg-pink text-white font-mono">\<value></span> is the value you're comparing against.
  Examples:
  * Scalar → `US`, `windows`
  * Trace → `[fastly,akamai]`
  * Empty trace → `[]`
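Putting the three pieces together, a small helper like the following (hypothetical; not part of any Mux library) can assemble `filters[]` values, rendering trace values in bracket notation and passing scalars through as-is:

```js
// Illustrative builder for filter strings: <operation><dimension>:<value>.
function buildFilter(dimension, value, operation = "") {
  // Trace values are arrays and render as [a,b,c]; scalar values pass through.
  const rendered = Array.isArray(value) ? `[${value.join(",")}]` : value;
  return `${operation}${dimension}:${rendered}`;
}

console.log(buildFilter("country", "US"));                         // country:US
console.log(buildFilter("country", "US", "!"));                    // !country:US
console.log(buildFilter("video_cdn_trace", "fastly", "+"));        // +video_cdn_trace:fastly
console.log(buildFilter("video_cdn_trace", ["fastly", "akamai"])); // video_cdn_trace:[fastly,akamai]
console.log(buildFilter("video_cdn_trace", []));                   // video_cdn_trace:[]
```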

## Supported Operations

### Scalar Operations

Scalar operations can be used with single-value dimensions or simple key-value pairs. Use these operations when you want to filter by an exact match or exclusion.

| Syntax | Operation | Example | Description |
| --- | --- | --- | --- |
| `dimension:value` | Equals | `filters[]=country:US` | Field equals value |
| `!dimension:value` | Not equals | `filters[]=!operating_system:windows` | Field does not equal value |

### Set Operations

Use for trace dimensions that can have multiple values in an ordered list. Use these operations when you want to check if a single value appears in the trace dimension.

| Syntax | Operation | Example | Description |
| --- | --- | --- | --- |
| `+dimension:value` | Has | `filters[]=+video_cdn_trace:fastly` | Set contains value |
| `-dimension:value` | Omits | `filters[]=-video_cdn_trace:cloudflare` | Set does NOT contain value |

Please note that set operations cannot be used as a wildcard for substring searches. For example, `filters[]=+video_cdn_trace:fas` *cannot* be used to return views with CDN traces that contain `fastly`.

### Trace Operations

Use for trace dimensions that can have multiple values in an ordered list. Use this operation when you want to filter for an exact, ordered match.

| Syntax | Operation | Example | Description |
| --- | --- | --- | --- |
| `dimension:[value1,value2]` | Equals | `filters[]=video_cdn_trace:[fastly,akamai]` | Trace equals exactly `[fastly, akamai]` |

## Practical Examples

### Scalar (Basic) Operations

Filter for views from the US:

```
filters[]=country:US
```

Exclude mobile operating systems:

```
filters[]=!operating_system:mobile
```

### Set Operations

Find views with `fastly` as a CDN value in the `video_cdn_trace` dimension:

```
filters[]=+video_cdn_trace:fastly
```

Exclude views that went through `cloudflare`:

```
filters[]=-video_cdn_trace:cloudflare
```

### Trace Operations

Find views that went through exactly `fastly` first, then `akamai`:

```
filters[]=video_cdn_trace:[fastly,akamai]
```

Find views where no CDN value was set:

```
filters[]=video_cdn_trace:[]
```

### Multiple Filters

You can combine multiple filters.

**Filters with different dimensions**

When you combine filters on different dimensions, they are combined with AND.

```
# Views from US AND went through fastly
filters[]=country:US
filters[]=+video_cdn_trace:fastly
```

You can also combine multiple filters on the same dimension. These are combined with OR.

```
# Views from US OR Canada
filters[]=country:US
filters[]=country:CA
```

However, if you combine negated filters on the same dimension, they are combined using AND.

```
# Views NOT from US AND NOT from Canada
filters[]=!country:US
filters[]=!country:CA
```
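The combination rules above can be sketched as a predicate. This is an illustration of the assumed semantics for scalar equals/not-equals filters only, not Mux's implementation:

```js
// Sketch: positive filters on the same dimension are OR'd, negated filters
// are AND'd, and filter groups on different dimensions are AND'd together.
function matchesFilters(view, filters) {
  const byDimension = new Map();
  for (const f of filters) {
    const negated = f.startsWith("!");
    const [dimension, value] = (negated ? f.slice(1) : f).split(":");
    if (!byDimension.has(dimension)) byDimension.set(dimension, []);
    byDimension.get(dimension).push({ negated, value });
  }
  for (const [dimension, clauses] of byDimension) {
    const positives = clauses.filter((c) => !c.negated);
    const negatives = clauses.filter((c) => c.negated);
    // Positive filters on one dimension: at least one must match (OR).
    if (positives.length && !positives.some((c) => view[dimension] === c.value)) return false;
    // Negated filters on one dimension: every exclusion must hold (AND).
    if (negatives.some((c) => view[dimension] === c.value)) return false;
  }
  return true; // Different dimensions are AND'd: every group must pass.
}

console.log(matchesFilters({ country: "US" }, ["country:US", "country:CA"]));   // true  (US OR CA)
console.log(matchesFilters({ country: "MX" }, ["!country:US", "!country:CA"])); // true  (NOT US AND NOT CA)
console.log(matchesFilters({ country: "US" }, ["!country:US", "!country:CA"])); // false
```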

### Value Formatting

* **Scalar (basic) values**: Plain strings (`country:US`)
* **Trace values**: Comma-separated values in brackets (`video_cdn_trace:[a,b,c]`)
* **Empty traces**: Use empty brackets (`video_cdn_trace:[]`)

## Common Errors

❌ **Don't use brackets with set operators:**

```
filters[]=+video_cdn_trace:[fastly]  # Invalid
```

✅ **Correct:**

```
filters[]=+video_cdn_trace:fastly    # Valid
```

❌ **Don't use scalar operator syntax with trace dimensions:**

```
filters[]=video_cdn_trace:fastly  # Invalid
```

✅ **Use trace or set operator syntax with trace dimensions:**

```
filters[]=video_cdn_trace:[fastly]  # Exact match
filters[]=+video_cdn_trace:fastly   # Contains check
```

## URL Encoding

When using filters in URLs, remember to properly encode the parameters:

```bash
# Single filter
/metrics?filters[]=country:US

# Multiple filters
/metrics?filters[]=country:US&filters[]=+tags:beta&filters[]=video_cdn_trace:[fastly,akamai]

# URL encoded
/metrics?filters%5B%5D=country%3AUS&filters%5B%5D=%2Btags%3Abeta&filters%5B%5D=video_cdn_trace%3A%5Bfastly%2Cakamai%5D
```

Here's an example of how you can URL encode the filter params using JavaScript:

```javascript
// Example filters
const filters = [
  "country:US",
  "+tags:beta",
  "video_cdn_trace:[fastly,akamai]"
];

// Use URLSearchParams to build query string
const params = new URLSearchParams();
filters.forEach(f => params.append("filters[]", f));

// Full URL
const url = `/metrics?${params.toString()}`;

console.log(url);
// /metrics?filters%5B%5D=country%3AUS&filters%5B%5D=%2Btags%3Abeta&filters%5B%5D=video_cdn_trace%3A%5Bfastly%2Cakamai%5D
```

## Common Use Cases

### Analytics and Debugging

#### Find problematic CDN paths:

```
# Views that went through cloudflare but not fastly
filters[]=-video_cdn_trace:fastly
filters[]=+video_cdn_trace:cloudflare
```

#### Debug specific video delivery paths:

```
# Exact CDN sequence analysis
filters[]=video_cdn_trace:[fastly,akamai,cloudfront]
```

### Performance Analysis

#### High-performance regions:

```
# Exclude slow CDN providers
filters[]=-video_cdn_trace:slow-cdn
filters[]=!operating_system:legacy
```

#### Mobile vs Desktop comparison:

```
# Mobile traffic analysis
filters[]=operating_system:ios
filters[]=operating_system:android
```

### Content Filtering

#### Live vs VOD content:

```
# Exclude recorded content
filters[]=!content_type:recorded
filters[]=content_type:live
```

#### Platform-specific analysis:

```
# Web platform only, excluding mobile apps
filters[]=platform:web
filters[]=!platform:ios
filters[]=!platform:android
```

## Error Handling

The API will return validation errors for:

* Invalid dimension names
* Incorrect operator usage for dimension type
* Malformed values (e.g., mismatched brackets, quotes)
* Invalid operator combinations (e.g., `!+dimension:value`)

Example error response:

```json
{
  "error": "Sequence dimensions require bracket notation. Use video_cdn_trace:[value] instead of video_cdn_trace:value"
}
```

## Advanced Tips

### Testing Your Filters

To ensure your filters are working as expected, it can be helpful to limit the dataset you're working with. For that reason, you may wish to test your filters with a small date range first:

```bash
/metrics?timeframe[]=24:hours&filters[]=country:US&filters[]=+video_cdn_trace:akamai
```
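If you're calling the API from Node, a sketch like the following builds that test query with proper encoding. The helper name and metric ID are illustrative, and the commented-out request assumes HTTP Basic auth with your Access Token ID and Secret:

```js
// Hypothetical helper: build an encoded metrics URL for a small test window.
function buildMetricsUrl(metricId, filters) {
  const params = new URLSearchParams();
  params.append("timeframe[]", "24:hours"); // limit the dataset while testing
  for (const f of filters) params.append("filters[]", f);
  return `https://api.mux.com/data/v1/metrics/${metricId}/breakdown?${params}`;
}

const url = buildMetricsUrl("video_startup_time", [
  "country:US",
  "+video_cdn_trace:akamai",
]);
console.log(url);

// To issue the request (Node 18+ with built-in fetch):
//
// const auth = Buffer.from(
//   `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
// ).toString("base64");
// const res = await fetch(url, { headers: { Authorization: `Basic ${auth}` } });
// console.log(res.status, await res.json());
```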


# Build a custom dashboard
Create custom dashboards in Mux Data to visualize and track the metrics that matter most to your video performance. Custom dashboards allow you to combine multiple metrics, apply filters, and organize data in a way that best serves your monitoring and analysis needs.
## What are Custom Dashboards?

Custom dashboards provide a centralized view of your video performance data through configurable components. You can create dashboards with multiple visualization types, apply filters, and customize time periods to focus on specific aspects of your video performance.

### Key features:

* Four component types: Timeseries, Bar charts, Lists, and Metric numbers
* 10 components per Dashboard
* Dashboard and component-level filtering
* Flexible time period selection
* Comparison intervals
* Dashboard sharing and duplication

<Callout type="info">
  Custom Dashboards are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

## Creating a Dashboard

To create a new custom dashboard:

1. Navigate to the **Dashboards** section in Mux Data
2. Select **Create Dashboard** from the left menu or main window
3. Enter a descriptive name for your dashboard
4. Select **Create Dashboard**

Your new dashboard will be created and ready for customization with components and filters.

<MultiImage
  images={[
  { src: "/docs/images/build-a-custom-dashboard-A.png", width: 3935, height: 2018, alt: "Dashboard screen in the Mux Video application showing an empty custom dashboards view with a “Create Dashboard” button." },
  { src: "/docs/images/build-a-custom-dashboard-B.png", width: 1464, height: 794, alt: "Dialog titled “Create Dashboard” in Mux, with a text input field filled in as “My first Dashboard” and two buttons at the bottom: “Cancel” and “Create Dashboard.”" },
]}
/>

## Dashboard Configuration

### Time Periods

Configure the time period for your entire dashboard to focus on specific date ranges:

* **Default**: Last 24 hours
* **Relative periods**: Choose from predefined options like last 7 days or last 30 days
* **Specific periods**: Set exact start and end dates for consistent historical analysis

Time period changes apply to all dashboard components. Save your dashboard to preserve time period settings.

<Callout type="info">
  Custom Dashboards are currently only available for the standard 100 days of data. Long-term Metrics are not yet available with Custom Dashboards.
</Callout>

### Dashboard Filters

Dashboard filters apply to all components within the dashboard, providing consistent data filtering across visualizations.

#### Dimension Filters

Filter by dimension values such as country, operating system, or player version:

1. Select the **Filter Dimensions** button
2. Search for and select the dimension type
3. Choose specific values to include or exclude
4. Multiple values use OR logic (e.g., selecting iOS and Android shows views from either platform)

<Image alt="New Dashboard creation screen in Mux, showing a dimension filter panel with “Device Model” selected. Viewer Device Model is filtered to “iPhone,” and view counts for different iPhone models are listed." src="/docs/images/build-a-custom-dashboard-C.png" width={2420} height={1164} />

#### Metric Filters

Filter by metric values to focus on specific performance thresholds:

1. Select the **Filter Metrics** button
2. Choose a metric (e.g., rebuffering percentage)
3. Select an operator (≤, ≥, =, etc.)
4. Set the value threshold

<Image alt="Metrics filter interface in Mux dashboard builder, showing a filter applied to only include results where Rebuffer Percentage is greater than 5%." src="/docs/images/build-a-custom-dashboard-D.png" width={1264} height={500} caption="Example: Filter for views with rebuffering percentage ≤ 5% to focus on high-quality playback experiences." />

Filter changes can be previewed without saving. Click **Save** at the bottom of the dashboard to apply filters permanently.

### Component Filters

Components can have their own filters in addition to dashboard filters. Dashboard filters act as parent filters affecting all components. Component-level filters are additive to dashboard filters but only apply to that component.

<Callout type="info">
  If dashboard and component filters conflict, the component may show no data. Ensure filter combinations are logical and compatible.
</Callout>

## Dashboard Components

Components visualize individual metrics within your dashboard. Each component type serves different analytical purposes and can be customized with specific filters and options.

1. To add a component to a new dashboard, select the **Create Component** button.
2. To add a component to an existing dashboard, select the **Edit** icon next to the date selector.

### Metric Numbers

Display key performance indicators in a prominent metrics bar at the top of your dashboard. Up to 5 Metric Numbers can be added per dashboard, and they collectively count as 1 component.

<Image alt="A dashboard titled “Platform Player Key Metrics” displaying metrics for the last 24 hours, including Views, Unique Viewers, Video Startup Failure Percentage, Playback Failure Percentage, and Rebuffer Percentage." src="/docs/images/build-a-custom-dashboard-E.png" width={1999} height={554} />

#### Configuration:

1. Select **Metric Number** as the component type
2. Choose the metric to display
3. Provide a descriptive name (50 character limit)
4. **Optional**: Add a comparison time period to show rate of change
5. **Optional**: Apply component-specific dimension or metric filters

<Callout type="info">
  Metric number components appear in creation order and cannot be reordered.
</Callout>

### Timeseries

<Image alt="Line graph showing “Video Startup Time” over a 24-hour period in Mux, comparing performance for “Last 24 hours” (orange line) versus “One day ago” (purple dashed line)." src="/docs/images/build-a-custom-dashboard-F.png" width={1134} height={832} />

Track metrics over time to identify trends, patterns, and anomalies in your video performance.

#### Configuration:

1. Select **Timeseries** as the component type
2. Choose the metric to chart over time
3. Set a descriptive component name
4. Select component size (half or full width)
5. **Optional**: Choose either:
   * **Comparison interval**: Compare current period with a previous timeframe
   * **Breakdown values**: Chart multiple values for a single dimension type (e.g., different device types)
6. **Optional**: Apply component-specific filters

<Callout type="info">
  Comparison intervals and breakdown values are mutually exclusive options. Also note that breakdown dimensions will take priority over dashboard and component filters of the same dimension.
</Callout>

### Bars

<Image alt="Bar chart titled “Video Startup Failure Percentage” broken down by browser. Chrome has the highest failure rate, followed by Firefox, Safari, and Edge. A tooltip highlights Firefox with a failure percentage of 1.39%." src="/docs/images/build-a-custom-dashboard-G.png" width={1134} height={834} />

Compare performance across different dimension values using horizontal bars.

#### Configuration:

1. Select **Bars** as the component type
2. Choose the metric to measure in the bars visualization
3. Select component size (half or full width)
4. Choose breakdown dimension type and values that you wish to display
5. **Optional**: Add a comparison interval to compare current period with a previous timeframe
6. **Optional**: Apply component-specific filters

<Callout type="info">
  Breakdown values must come from a single dimension category.
</Callout>

### Lists

Rank and organize data to quickly identify top performers or problem areas.

<Image alt="Table showing Rebuffer Percentage broken down by operating system. Windows has the highest rebuffer rate at 0.70%, followed by iOS and Android, with directional trend indicators in green or red." src="/docs/images/build-a-custom-dashboard-H.png" width={1190} height={770} />

#### Configuration:

1. Select **List** as the component type
2. Choose the metric to measure for each list item
3. Select the dimension to list (e.g., player names, video titles)
4. Set sort order (ascending or descending)
5. Specify the number of items to display in the list component
6. Provide a descriptive component name
7. **Optional**: Add a comparison interval
8. **Optional**: Apply component-specific filters

<Callout type="info">
  Lists are only available in half-width size.
</Callout>

## Dashboard Management

### Sharing Dashboards

When creating a new dashboard, you can choose to share it with everyone in your environment. Public dashboards appear in the Shared folder for all users in your environment. You can change the sharing level at any time from the More Options dropdown.

All users can view public dashboards. To save an editable version of a public dashboard, create a duplicate (see below).

### Sharing via Dashboard Link

Any dashboard can be shared with users who have access to your Mux environment via the dashboard link, even if it's not marked as public. Users who receive a link can:

* View the dashboard
* Favorite it to save it to their personal list
* Create a duplicate to make their own editable copy (see below)

### Editing Dashboard Permissions

Users can edit dashboards they own, but cannot edit public dashboards they do not own. Admins have full editing abilities for all dashboards.

Advanced role-based permissions are coming soon.

### Favoriting Dashboards

Favorite personal or shared dashboards for quick access to the dashboards you use most. You can have up to 20 favorited dashboards across your environments.

Favorite a dashboard by pressing the star in the dashboard menu. When a dashboard is favorited, the star is highlighted and the dashboard is added to the favorites section at the top of the custom dashboard navigation sidebar.

<Image sm alt="Star a custom dashboard by pressing the star icon" src="/docs/images/build-a-custom-dashboard-K.png" width={750} height={224} />

### Saving Dashboard Copies

Save a modified version without affecting the original:

1. Make your desired changes to the dashboard
2. Use the **Save As** option in the save menu
3. Provide a new name for the copy

<Image alt="Bottom section of a Mux dashboard displaying two metric widgets: one for “Exits Before Video Start” (line chart) and another for “Rebuffer Percentage” by Windows. Save, Save As, and Cancel buttons appear below." src="/docs/images/build-a-custom-dashboard-J.png" width={1999} height={436} />

### Exporting Dashboards

Export a dashboard to a PDF to save a snapshot of your dashboard:

1. Select the **More Options** menu (⋯) next to the favorite button
2. Choose **Export PDF**

<Image sm alt="Dropdown menu under the time range selector “Last 24 hours” with options to “Export PDF“, “Duplicate“ or “Delete” the dashboard." src="/docs/images/build-a-custom-dashboard-I.png" width={774} height={364} />

### Duplicating Dashboards

Create an exact copy of an existing dashboard:

1. Select the **More Options** menu (⋯) next to the favorite button
2. Choose **Duplicate**

<Callout type="info">
  Duplication is not available while a dashboard is being edited.
</Callout>

### Deleting Dashboards

Permanently remove dashboards you no longer need:

1. Select the **More Options** menu (⋯) next to the favorite button
2. Choose **Delete**
3. Confirm the deletion

<Callout type="warning">
  Deleting a dashboard removes it for all users. Duplicate dashboards are not affected.
</Callout>

### Dashboard Navigation

#### Exploring Metric Details

Access detailed metric analysis directly from dashboard components:

1. Select the **Go To Metrics** icon on any component
2. The metrics page opens with:
   * Selected filters from your dashboard applied
   * The component's metric pre-selected


# Save and share filter sets
Learn how to create filter sets to improve collaboration across teams, ensure data consistency, and create filtered views of Mux Data.
**What are filter sets?**

Filter sets let you save and share commonly used filter combinations to ensure data consistency and streamline operational workflows.

1. To create a filter set, select the filter set button on any dashboard that supports filters.
2. Select **Create new filter set** from the menu.

<Image alt="A Mux Data interface showing the filter configuration panel. The user has selected three dimensions: Operating System (iOS), Stream Type (LIVE), and Video ID (456182). A tag labeled “456182” appears under Video ID. The right panel shows filters for iOS, LIVE, and Video ID 456182, with an “Apply” button at the bottom." src="/docs/images/save-and-share-filter-sets-1.png" width={1999} height={800} />

3. Select a name for your filter set.

4. Choose whether your filter set is public or private. Private means that only you will see it in the menu, under your private filter sets. Public means that all users in the environment will be able to see and select it.

5. Select the filters that you wish to add to your filter set. The filter values that were selected on the page will automatically populate in this menu. You can remove or add new dimension and metric filters before saving.

<Image sm alt="A Mux Metrics dashboard with filters applied. The top bar includes options for Filters, Dimensions, and Metrics, with a dropdown showing “Clear all.” A date range selector is set to “Last 24 hours.” The graph area shows zero total video views and no data available. The “Create new filter set” option is visible in the open filter menu." src="/docs/images/save-and-share-filter-sets-2.png" width={454} height={675} />

**Add a new filter value to a filter set**

You can manually create a new filter value if it doesn’t yet exist in Mux Data. This is useful for an upcoming event or new product launch.

1. In the filter menu, select the dimension type on the left.
2. Type the value of the dimension that you wish to add.
3. The value you entered will appear in the results with zero views.
4. Select that value to add it to your filter set.
5. Select apply.
6. Select save.

This value will now be associated with your saved filter set. When you select this filter set, it will show zero views until there are views matching that filter.

<Image alt="A Mux interface showing the “Create Filter Set” modal. The form includes a field to name the filter set, a privacy selector (Private selected), and a summary of applied filters: Dimensions (iOS, LIVE) and Metrics (Page Load Time). Buttons at the bottom allow Cancel or Save." src="/docs/images/save-and-share-filter-sets-3.png" width={1962} height={902} />

**Navigating with filter sets**

When a filter set is selected, it persists as you navigate across dashboards in Mux Data. Not all filters are supported across all dashboards.

If you navigate to a dashboard on Mux that doesn't support a filter in your selected filter set, that filter is deactivated while you're on that dashboard. A warning message appears, and the filter set turns yellow when not all of its values are applicable to that page. The deactivated filter also shows in a disabled state in the filter display menu.

Once you navigate to a dashboard where that filter is applicable, it will be reactivated.

<Image alt="Mux Monitoring Overview dashboard in dark mode. A yellow alert in the top right reads “Filters Disabled – There are dimensions or metrics filters that are unavailable for this page.” The dashboard displays six empty charts for metrics such as Video Startup Failure, Playback Failures, Rebuffering Percentage, and Average Bitrate, all showing 0.00%." src="/docs/images/save-and-share-filter-sets-4.png" width={1999} height={1048} />

**Filter sets and Custom Dashboards**

When you build a Custom Dashboard, filters and filter sets are saved as a setting of that dashboard. When you navigate to a Custom Dashboard, all selected filters and filter sets will be reset to the values saved to that dashboard.

Filter sets can be added to custom dashboards. However, filter sets are not available to be applied to custom dashboard components.

**Delete a filter set**

1. Select the edit filter set icon for the filter set you wish to delete.
2. In the filter set menu, select the delete filter set icon.

<MultiImage
  images={[
  { sm: true, src: "/docs/images/save-and-share-filter-sets-5.png", width: 1130, height: 924, alt: "A Mux Metrics dashboard with the filter menu open. The filter dropdown lists private saved filter sets titled “IBC Conference” and “Engaged Views,” along with an option to create a new filter set. The dashboard displays graphs for views and overall viewer experience." },
  { sm: true, src: "/docs/images/save-and-share-filter-sets-6.png", width: 822, height: 1130, alt: "A Mux modal titled “Edit Filter Set.” The name field shows “IBC Conference.” Privacy is set to Private. One dimension filter is applied — a long video ID string. The bottom of the modal includes Cancel and Save buttons." },
]}
/>


# Focus your operational response with error categorization
Configure error categorization through the Mux Data Dashboard or your SDKs to track and report on custom error metadata for views in Mux Data.
## 1. What is Error Categorization?

Error Categorization allows you to set custom error metadata to make your error data more actionable. By categorizing errors, you can distinguish between fatal errors and warnings, and classify errors as playback failures or business exceptions. Errors categorized as warnings or business exceptions are not considered playback failures, so they are excluded from alerting, giving you a more accurate picture of the health of your system with less alert noise.

Playback Failure metrics (`Playback Failure Percentage` and `Video Startup Playback Failure Percentage`) only include fatal operational failures, while errors categorized as business exceptions and warnings are excluded. Errors that are categorized as a business exception will be included in the `Playback Business Exception Percentage` and `Video Startup Business Exception Percentage` metrics.

There are two dimensions, `Playback Business Exception` and `Video Startup Business Exception`, that are available as filters. Like the Playback Failure metrics, the `Playback Failure` and `Video Startup Failure` dimensions are not set for business exceptions and warnings.

The category information for errors can be set from the Mux Dashboard or from the individual player SDKs. You only need to set the categorization for an error in one place; categories set in the Dashboard override those set in the SDKs.

## 2. Configuring Error Categorization

Error categorization is configured from the Settings page under the "Categorize Errors" tab. You must be an admin user to add a new error code categorization.

<Image sm src="/docs/images/categorize-errors-tab.png" width={1254} height={436} />

In the configuration page, you can categorize errors by code. Click the "Add an error code" button. In the dropdown, you will see the error codes your environment has encountered. Select from this dropdown and press "Add" to create a new categorization. By default, errors will have fatal error severity and will be tagged as playback failures.

Type into the filter box to search for specific error codes. If you are configuring an error code not previously seen in this environment, you can press "Enter" to create a new categorization.

<Image sm src="/docs/images/categorize-errors-add.png" width={1254} height={477} />

## 3. Submitting Error metadata from Mux Data SDKs

### Attach severity and type to errors with Mux Data SDKs

Error Categorization can also be configured in the Mux Data SDKs, in a similar way to other error metadata. If an error code is already configured in the data dashboard, the settings from the dashboard take precedence.

### HTML5 Video Element and other web SDKs

In web-based SDKs, Error Categorizations can be set by passing an error translator function to the monitor. This function sets the relevant error metadata.

```js
function errorTranslator (error) {
  return {
    player_error_code: translateCode(error.player_error_code),
    player_error_message: translateMessage(error.player_error_message),
    player_error_context: translateContext(error.player_error_context),
    player_error_severity: translateSeverity(error.player_error_severity),
    player_error_business_exception: translateBusinessException(error.player_error_business_exception)
  };
}

mux.monitor('#my-player', {
  debug: false,
  errorTranslator: errorTranslator,
  data: {
    env_key: 'ENV_KEY', // required
    // ... additional metadata
  }
});
```
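As a concrete sketch, a translator might key off specific error codes. The codes and mapping below are hypothetical, and the `"warning"`/`"fatal"` severity strings are assumed here; substitute your player's real codes and confirm the accepted values for your SDK version:

```javascript
// Hypothetical example: codes 2 and 5 are made up; map your player's real
// codes here. Field names mirror the translator fields shown above.
function errorTranslator(error) {
  return {
    ...error,
    // Downgrade code 2 to a non-fatal warning
    player_error_severity: error.player_error_code === 2 ? "warning" : "fatal",
    // Flag code 5 as a business exception rather than a playback failure
    player_error_business_exception: error.player_error_code === 5,
  };
}

console.log(errorTranslator({ player_error_code: 2 }).player_error_severity); // "warning"
```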

For more guidance on using and configuring the error translator in web-based SDKs, please refer to the guide on [monitoring the HTML5 video element](/docs/guides/monitor-html5-video-element#error-translator).

Version 5.2.0 or later of the HTML5 Video Element monitor is necessary to support Error Categorization.

### Android

Error Categorization is supported for custom integrations that use the Core Java-based SDK `v8.0.0` or later.

This is an example of how to categorize an error event to be a warning.

```java
import com.mux.stats.sdk.core.events.EventBus;
import com.mux.stats.sdk.core.events.playback.ErrorEvent;
import com.mux.stats.sdk.core.model.PlayerData;
import com.mux.stats.sdk.muxstats.IPlayerListener;
import com.mux.stats.sdk.muxstats.MuxStats;

public class PlayerListener extends EventBus implements IPlayerListener {
    MuxStats muxStats;

    // Call from the source of the warning, or from the player callback meant
    // to trigger a warning, with parameters appropriate to your integration.
    // Dispatches an error event that Mux will categorize as a warning.
    public void onPlaybackWarning(String errorCode, String errorMessage, String errorContext) {
        PlayerData playerData = new PlayerData();
        playerData.setErrorCode(errorCode);
        playerData.setErrorMessage(errorMessage);

        ErrorEvent errorEvent = new ErrorEvent(playerData, errorContext, ErrorSeverity.ErrorSeverityWarning);

        dispatch(errorEvent);
    }
}
```

For more guidance and additional examples please refer to the guide on [custom integrations in Java](/docs/guides/data-custom-java-integration).

### Objective-C (iOS, tvOS, visionOS)

Error Categorization is supported when using the Mux `AVPlayer` integration `v4.0.0` or later and with custom integrations that use the Core Objective-C-based SDK `v5.0.0` or later.

#### AVPlayer Integration

This is an example of how to categorize an error event to be a warning.

```objc
- (void)dispatchPlaybackWarningWithPlayerName:(NSString *)playerName
                              playerErrorCode:(NSString *)playerErrorCode
                           playerErrorMessage:(NSString *)playerErrorMessage
                           playerErrorContext:(NSString *)playerErrorContext {
  [MUXSDKStats dispatchError:playerErrorCode
                 withMessage:playerErrorMessage
                    severity:MUXSDKErrorSeverityWarning
                errorContext:playerErrorContext
                   forPlayer:playerName];
}
```

For more guidance and additional examples please refer to the [AVPlayer monitoring guide](/docs/guides/monitor-avplayer).

#### Custom Integrations

This is an example of how to categorize an error event to be a warning.

```objc
// Call this method from the source of the playback warning (such as an `AVPlayer` key-value property observer, for example) with parameters appropriate to your integration.
- (void)dispatchPlaybackWarningWithPlayerName:(NSString *)playerName
                              playerErrorCode:(NSString *)playerErrorCode
                           playerErrorMessage:(NSString *)playerErrorMessage
                           playerErrorContext:(NSString *)playerErrorContext
                           playerPlayheadTime:(NSNumber *)playerPlayheadTime {
  MUXSDKErrorEvent *errorEvent = [[MUXSDKErrorEvent alloc] initWithSeverity:MUXSDKErrorSeverityWarning
                                                                    context:playerErrorContext];

  // Configure any custom video or view data if necessary
  MUXSDKPlayerData *playerData = [[MUXSDKPlayerData alloc] init];
  [playerData setPlayerErrorCode:playerErrorCode];
  [playerData setPlayerErrorMessage:playerErrorMessage];
  [playerData setPlayerPlayheadTime:playerPlayheadTime];
  // ... repeat for any other `MUXSDKPlayerData` properties if they've changed

  [MUXSDKCore dispatchEvent:errorEvent 
                  forPlayer:playerName];
}
```

For more guidance and additional examples please refer to the guide on [custom integrations in Objective-C](/docs/guides/data-custom-objectivec-integration).

### Roku

Error categorization is supported when using the Roku SDK v2.0.0 or later.

```js
mux.setField("error", {
  player_error_code: errorCode,
  player_error_message: errorMessage,
  player_error_context: errorContext,
  player_error_severity: errorSeverity,
  player_error_business_exception: isBusinessException
})
```

The possible values for `errorSeverity` are `"warning"` or `"fatal"`.

For more guidance on using and configuring the Roku SDK, please refer to the guide on [monitoring Roku](/docs/guides/monitor-roku).


# Export raw video view data
Understand how to export your video views data into your own data warehouse for processing and analysis.
View data can be exported from Mux Data for aggregation and reporting in your data infrastructure. Views are available individually using the <ApiRefLink href="/docs/api-reference/data/video-views/get-video-view">Views API</ApiRefLink> or in bulk with the export methods: daily CSV exports or streaming exports.

## Call the Export API to get daily aggregated data

Full data exports are available via the <ApiRefLink href="/docs/api-reference/data/exports/list-exports-views">Exports API</ApiRefLink>. This API is available for Mux Data customers on Media plans.

Use this API to get a list of CSV files available for download. Files are available to download for seven days after they are generated. Each CSV file is a single day of data and includes every single dimension collected by Mux, for each single video view. The table below details each of these data fields.

The Versions column indicates which fields are included in each export version. Newer export versions include the latest columns available. Some columns may be empty depending on the features enabled. From version 2 onward, fields are sorted in alphabetical order; older export versions may have fields in a different order, so refer to the export file itself for the authoritative ordering. Please contact support to change the export version that is generated.

**We strongly suggest you build your file import around field names rather than ordinal position, so additional fields can be added to the file without breaking your import.**
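Following that advice, here is a minimal import sketch (JavaScript) that keys records by header name rather than column position. The three-column sample is a hypothetical excerpt of an export file, and a production importer should use a real CSV parser that handles quoted fields:

```javascript
// Parse an export CSV by header name, so new fields added in future export
// versions don't break the import. Naive comma split for illustration only.
function parseExportCsv(csv) {
  const [headerLine, ...rows] = csv.trim().split("\n");
  const headers = headerLine.split(",");
  return rows.map((row) => {
    const values = row.split(",");
    // Build an object keyed by field name, e.g. record.view_id
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

// Hypothetical three-column excerpt of a daily export file
const sample = [
  "view_id,country,rebuffer_count",
  "abc-123,US,2",
  "def-456,DE,0",
].join("\n");

const views = parseExportCsv(sample);
console.log(views[0].country); // "US"
```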

## Stream views as they complete

<Callout type="info">
  Streaming Exports are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

Mux Data supports streaming exports of video views to an Amazon Kinesis Data Stream or Google Cloud Pub/Sub topic in your cloud account. Views are sent to Kinesis or Pub/Sub as they complete and are made available to retrieve from the stream within about one minute after the view ends.

Each message is a single view, with all of the metadata and metrics, and the event timeline for the view. The view data can be stored in your long-term storage for aggregation and reporting.

This method of access is most useful for customers who want to update metrics on a rolling basis throughout the day or are embedding metrics in a user-facing application feature and need faster updates than once per day.

## Setting up a streaming export

Streaming exports can be configured in the **Streaming Exports** settings in your Mux dashboard. See the setup guide for your platform for more information on setting up an export:

* [Amazon Kinesis Data Streams](/docs/guides/export-amazon-kinesis-data-streams)
* [Google Cloud Pub/Sub](/docs/guides/export-google-cloud-pubsub)

## Message format

Messages are in either JSON format or Protobuf (proto2) encoding. You can choose between the two formats when setting up the streaming export in the Mux Dashboard -> Settings -> Streaming Export -> New streaming export page.
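For the JSON format, each record's payload is one JSON document per view. A minimal decode sketch, assuming a transport that delivers the payload base64-encoded (as the raw Kinesis `GetRecords` API and Pub/Sub REST responses do); the field names in the sample message are illustrative:

```javascript
// Decode one streamed view message from its base64-encoded JSON payload.
function decodeViewMessage(base64Data) {
  const json = Buffer.from(base64Data, "base64").toString("utf8");
  return JSON.parse(json); // one complete view: metadata, metrics, events
}

// Simulate a delivered payload (illustrative fields)
const payload = Buffer.from(
  JSON.stringify({ view_id: "abc-123", country: "US" })
).toString("base64");

console.log(decodeViewMessage(payload).view_id); // "abc-123"
```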

For Protobuf encoding, every message uses the `VideoView` message type defined in the export Protobuf spec, which is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/video_view). Use the latest Protobuf spec when creating schemas or generating code.

The fields in the Protobuf definition match those used in the latest version of the Exports API. The available fields are noted in the table below.

## View handling

A view can be updated after it has been exported. When this happens, a new record containing the latest version of the view is emitted to the stream. When processing views, make sure you can handle multiple or duplicate records for each view ID (`view_id`). The `view_id` can be used as the unique primary key for each view record.
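A minimal deduplication sketch along these lines, keeping only the most recent record per `view_id`. The `updated_at` ordering field here is illustrative; use whatever versioning or arrival-order field your pipeline records:

```javascript
// Collapse duplicate view records down to the latest version per view_id.
function latestViews(records) {
  const byId = new Map();
  for (const record of records) {
    const existing = byId.get(record.view_id);
    if (!existing || record.updated_at > existing.updated_at) {
      byId.set(record.view_id, record);
    }
  }
  return [...byId.values()];
}

// Illustrative stream: v1 is emitted twice, the second record superseding the first
const records = [
  { view_id: "v1", updated_at: 100, rebuffer_count: 1 },
  { view_id: "v2", updated_at: 105, rebuffer_count: 0 },
  { view_id: "v1", updated_at: 200, rebuffer_count: 3 },
];

console.log(latestViews(records).length); // 2
```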

## Understand the data fields

**Mux API Value**: field name in the CSV file or streaming export

**Unit**: unit of the field, such as text, percentage, or bits per second. Note that all units of type `Time` are represented as timestamps in UTC.

**Type**:

* Dimension: metadata about the view
* Metric: metrics calculated by Mux
* Score: score calculated by Mux

**Versions**: export version in which the fields are included

| Mux API Value | Unit | Type | Definition | Versions |
|---------------|------|------|------------|----------|
|`asn` |Integer |Dim. | Autonomous System Number uniquely identifying each network| v1+ |
|`asset_id` |Text |Dim. | If Mux Video is used, the Asset Id of the video.| v4+ |
|`audio_codec` |Text |Dim. | The codec of the audio that played during the view. | v13+ |
|`browser` |Text |Dim.| Browser used for the video view (`Safari`, `Chrome`, etc.).| v2+ |
|`browser (viewer_application_name)` |Text |Dim.| Deprecated - see `browser`| v1 |
|`browser_version` |Version |Dim. | Browser version (e.g. `66.0.3359.158`).| v2+ |
|`browser_version (viewer_application_version)` |Version |Dim. | Deprecated - see `browser_version`| v1 |
|`cdn` |Text |Dim. | CDN delivering the video view, either determined by response header auto-detection or provided as video\_cdn.| v1+ |
|`city` |Text |Dim. | City of the viewer.| v1+ |
|`client_application_name` |Text |Dim. | Name of the customer application that the viewer is using to watch the content, e.g. 'OurBrand iOS App'. | v13+ |
|`client_application_version` |Text |Dim. | Version of the customer application that the viewer is using to view the content. | v13+ |
|`continent_code` |ISO Code |Dim. | 2-letter ISO code identifying the Continent of the viewer (e.g. `NA`, `EU`).| v1+ |
|`country` |ISO Code |Dim. | 2-letter Country Code.| v2+ |
|`country (country_code)` |ISO Code |Dim. | Deprecated - see `country`| v1 |
|`country_name` |Text |Dim. | Country of the viewer.| v1+ |
|`custom_1` |Text |Dim. | Customer-defined metadata.| v2+ |
|`custom_2` |Text |Dim. | Customer-defined metadata.| v2+ |
|`custom_3` |Text |Dim. | Customer-defined metadata.| v2+ |
|`custom_4` |Text |Dim. | Customer-defined metadata.| v2+ |
|`custom_5` |Text |Dim. | Customer-defined metadata.| v2+ |
|`custom_6` |Text |Dim. | Customer-defined metadata.| v5+ |
|`custom_7` |Text |Dim. | Customer-defined metadata.| v5+ |
|`custom_8` |Text |Dim. | Customer-defined metadata.| v5+ |
|`custom_9` |Text |Dim. | Customer-defined metadata.| v5+ |
|`custom_10` |Text |Dim. | Customer-defined metadata.| v5+ |
|`environment_id`|Unique ID |Dim. | Mux Environment ID, linked with a specific environment| v4+ |
|`error_type` |Unique ID |Dim. | Mux-internal ID used to categorize errors.| v2+ |
|`error_type (error_type_id)` |Unique ID |Dim. | Deprecated - see `error_type`| v1 |
|`exit_before_video_start` |Boolean |Metric | Identifies when a viewer abandons the video because it is taking too long to load.| v1+ |
|`experiment_name` |Text |Dim. | A/B Testing: use this field to separate views into different experiments.| v1+ |
|`isp` |Text |Dim. | Unused| v1+ |
|`latitude` |Degrees |Dim. | Latitude of the viewer, truncated to 1 decimal place.| v1+ |
|`live_stream_id` |Text |Dim. | If Mux Video is used, the Live Stream Id of the video.| v4+ |
|`live_stream_latency` |Integer |Metric | Live Stream Latency measuring the average time from ingest to display for the view.| v4+ |
|`longitude` |Degrees |Dim. | Longitude of the viewer, truncated to one decimal place.| v1+ |
|`max_downscale_percentage` | Percentage | Metric | Maximum Downscale Percentage at any point in time during a video view.| v2+ |
|`max_downscale_percentage (view_max_downscale_percentage)` | Percentage | Metric | Deprecated - see `max_downscale_percentage`| v1 |
|`max_upscale_percentage` | Percentage | Metric |  Maximum Upscale Percentage at any point in time during a video view.| v2+ |
|`max_upscale_percentage (view_max_upscale_percentage)` | Percentage | Metric | Deprecated - see `max_upscale_percentage`| v1 |
|`metro` |Text |Dim. | Unused| v1+ |
|`mux_api_version` | Text|Dim. | Ignore | v1+ |
|`mux_embed_version` |Version |Dim. | Internal version of the Mux Core SDK. Ignore| v1+ |
|`mux_viewer_id` |Unique ID |Dim. | A Mux Internal ID representing the viewer who is watching the stream.| v1+ |
|`operating_system` |Text |Dim. | Operating System (`iOS`, `Windows`, etc.).| v2+ |
|`operating_system (viewer_os_family)` |Text |Dim. | Deprecated - see `operating_system`| v1 |
|`operating_system_version` |Version |Dim. | Operating System version (e.g. `10.15`).| v2+ |
|`operating_system_version (viewer_os_version)` |Version |Dim. | Deprecated - see `operating_system_version`| v1 |
|`page_load_time` |Milliseconds |Metric | Measures the time from the initial user request for a page to the time when the video player is first initialized| v1+ |
|`page_type` |Text |Dim. | Provides the context of the page for more specific analysis. Values include `watchpage` or `iframe`.| v1+ |
|`page_url` |URL |Dim. | Page URL| v1+ |
|`platform_description` |Text |Dim. | Unused| v1+ |
|`playback_id` |Text |Dim. | If Mux Video is used, the Playback Id of the video.| v4+ |
|`playback_business_exception_error_type_id` |Unique ID |Dim. | An ID value that is present when a playback business exception occurs | v9+ |
|`playback_failure_error_type_id` |Unique ID |Dim. | An ID value that is present when a playback failure occurs | v9+ |
|`playback_success_score` |Decimal |Dim. | Playback Success Score| v2+ |
|`player_autoplay` |Boolean |Dim. | Indicates whether the player autoplayed the video or not| v1+ |
|`player_captions_enabled` |Boolean |Dim. | Boolean indicating if the player used captions at any time during the view. | v13+ |
|`player_error_code` |String |Dim. | An error code that represents a fatal error (one resulting in playback failure). Often an integer, but implementation-dependent.| v1+ |
|`player_error_context` |Text |Dim. | Error instance-specific information such as stack trace or segment number.| v5+ |
|`player_error_message` |Text |Dim. | Message sent by the player when an error has been fired up (associated with an error code)| v1+ |
|`player_height` |Integer |Dim. | Height of the player as displayed in page, in pixels| v1+ |
|`player_instance_id` |Unique ID |Dim. | Identifies the instance of the Player class that is created when a video is initialized| v1+ |
|`player_language` |Text |Dim. | Player's text language| v1+ |
|`player_load_time` |Milliseconds |Metric | Deprecated - see `player_startup_time`| v1+ |
|`player_mux_plugin_name` |Text |Dim. | Mux Integration Plugin name (e.g. `mux-player`)| v1+ |
|`player_mux_plugin_version` |Version |Dim. | Mux Integration Plugin version (e.g. `2.2.0`)| v2+ |
|`player_name` |Text |Dim. | Identifies different configurations or types of players around your site or application (e.g. `My Player`)| v1+ |
|`player_pip_enabled` |Boolean |Dim. | Boolean indicating if the player used Picture in Picture at any time during the view. | v13+ |
|`player_poster`|URL| Dim. | The image shown as the pre-visualization before play | v1+ |
|`player_preload` |Boolean |Dim. | Specifies if the player was configured to load the video when the page loads.| v1+ |
|`player_remote_played` |Boolean |Dim. | Specify from the SDK if the video is remote played to AirPlay or Chromecast.| v2+ |
|`player_software` |Text |Dim. | Player Software being used to play the Video (e.g. `Video.js`, `JW Player`, etc.)| v1+ |
|`player_software_version` |Text |Dim. | Player Software Version (e.g. `2.45.5`)| v1+ |
|`player_source_domain` |Text |Dim. | Video Source Domain (e.g. `myvideostreams.com`)| v1+ |
|`player_source_duration` |Milliseconds |Dim. | Video Source Duration| v1+ |
|`player_source_height` |Integer |Dim. | Height of the source video being sent to the player, in pixels| v1+ |
|`player_source_stream_type` |Text |Dim. | Unused| v1+ |
|`player_source_url` |URL |Dim. | Video Source URL| v1+ |
|`player_source_width` | Integer | Dim. | Width of the source video being as seen by the player | v1+ |
|`player_startup_time` |Milliseconds |Metric | Measures the time from when the player is first initialized in the page to when it is ready to receive further instructions.| v1+ |
|`player_version` |Text |Dim. | As you make changes to your player you can compare how new versions of your player perform. Set in combination with `player_name` (e.g. `1.2.0`) | v1+ |
|`player_view_count` |Integer |Dim. | View Count - equal to 1 in Full Exports (1 line = 1 video view)| v1+ |
|`player_width` |Integer |Dim. | Width of the player as displayed in page, in pixels| v1+ |
|`property_id` |Unique ID |Dim. | Mux Property ID, linked with a specific environment. Deprecated, please use `environment_id`. | v1+ |
|`rebuffer_count` |Integer |Metric | Number of rebuffering events that happen during the video view. | v2+ |
|`rebuffer_count (buffering_count)` |Integer |Metric | Deprecated - see `rebuffer_count` | v1 |
|`rebuffer_duration` |Milliseconds |Metric | Amount of time in milliseconds that viewers wait for rebuffering per video view. | v2+ |
|`rebuffer_duration (buffering_duration)` |Milliseconds |Metric | Deprecated - see `rebuffer_duration` | v1 |
|`rebuffer_frequency` |Events per millisecond |Metric | Measures how often rebuffering events happen. | v2+ |
|`rebuffer_frequency (buffering_rate)` |Events per millisecond |Metric | Deprecated - see `rebuffer_frequency` | v1 |
|`rebuffer_percentage` |Percentage |Metric | Volume of rebuffering that is occurring across the view| v1+ |
|`region` |Text |Dim. | Region of the viewer| v1+ |
|`session_id` |Unique ID |Dim. | Mux Session ID tracking a viewer's session| v1+ |
|`smoothness_score` |Decimal |Score | Smoothness Score| v2+ |
|`source_hostname` |Text |Dim. | Video Hostname (e.g. `media.myvideos.com`).| v2+ |
|`source_hostname (player_source_host_name)` |Text |Dim. | Deprecated - see `source_hostname`| v1 |
|`source_type` |Text |Dim. | Format of the source, as determined by the player. E.g. `application/dash+xml`, `x-application/mpegUrl`, `mp4`, etc.| v2+ |
|`source_type (player_source_type)` |Text |Dim. | Deprecated - see `source_type`| v1 |
|`startup_time_score` |Decimal |Score | Startup Time Score| v2+ |
|`stream_type` |Text |Dim. | Type of stream (e.g. `live` or `on-demand`).| v2+ |
|`stream_type (video_stream_type)` |Text |Dim. | Deprecated - see `stream_type`| v1 |
|`sub_property_id` |Text |Dim. | Sub Property Id| v2+ |
|`time_to_first_frame` |Milliseconds | Metric | Deprecated - see `video_startup_time`| v1 |
|`used_fullscreen` |Boolean |Dim. | Indicates whether the viewer used full screen to watch the video.| v1+ |
|`video_affiliate` |Text |Dim. | Affiliate station that the viewer is watching or associated with the viewer. | v13+ |
|`video_brand` |Text |Dim. | Brand associated with the video or the brand of the streaming platform the viewer is using to watch the video. | v13+ |
|`video_cdn_trace` |Array |Dim. | Sequential values of the video delivery CDN over the course of the view | v14+ |
|`video_codec` |Text |Dim. | The codec of the video that played during the view. | v13+ |
|`video_content_type` |Text |Dim. | Content Type (e.g. `short`, `movie`, `episode`, `clip`, `trailer`, or `event`).| v1+ |
|`video_creator_id` |Text |Dim. | A unique identifier for the creator of the video. Defaults to the Mux Creator ID if enabled for Assets and Livestreams hosted by Mux.| v13+ |
|`video_duration` |Milliseconds |Dim. | The length of the video supplied to Mux via custom metadata| v1+ |
|`video_dynamic_range_type` |Text |Dim. | The format or type of dynamic range available on the video during the view. | v13+ |
|`video_encoding_variant` |Text |Dim. | An optional detail that allows you to compare different encoding settings.| v1+ |
|`video_id` |Unique ID |Dim. | Your internal ID for the video| v1+ |
|`video_language` |Text|Dim. | The audio language of the video, assuming it's unchangeable after playing.| v1+ |
|`video_producer` |Text |Dim. | The producer of the video title| v1+ |
|`video_quality_score` |Decimal |Score | Video Quality Score| v2+ |
|`video_startup_business_exception_error_type_id` |Unique ID |Dim. | An ID value that is present when a video startup business exception occurs | v9+ |
|`video_series` |Text |Dim. | Series name (e.g. `The Girls`)| v1+ |
|`video_startup_time` |Milliseconds | Metric | (Video Startup Time on Mux Dashboards) Measures from when the player has been instructed to play the video, to when the first frame of video (either content or preroll ad) is showing and the playhead is progressing.| v2+ |
|`video_startup_failure` |Boolean | Metric | Identifies when a viewer encounters an error before the first frame of the video begins playback.| v7+ |
|`video_title` |Text |Dim. | Video Title| v1+ |
|`video_variant_id` |Unique ID |Dim. | Your internal ID for a video variant| v1+ |
|`video_variant_name` | Text |Dim. | An optional detail that allows you to monitor issues with the files of specific versions of the content, for example different audio translations or versions with hard-coded/burned-in subtitles.| v1+ |
|`view_cdn_edge_pop` |Text |Dim. | Region where the CDN edge point of presence server is located or other origin server identification. | v13+ |
|`view_cdn_origin` |Text |Dim. | Identifying name of the Content Origin or Region where the Origin server is located. | v13+ |
|`view_content_startup_time` |Milliseconds |Metric | Measures from when the player has been instructed to play the video, to when the first frame of video content is showing and the playhead is progressing.| v10+ |
|`view_content_watch_time` |Milliseconds |Metric | Total Content Watch Time across the view (includes Startup Time, Playing time, potential rebuffering).| v10+ |
|`view_downscaling_percentage` |Percentage |Metric | Downscale Percentage| v2+ |
|`view_drm_level` |Text |Dim. | Security level of the specific DRM type. Some DRM types do not have levels.| v13+ |
|`view_drm_type` |Text |Dim. | The type of DRM used during playback (e.g. `widevine` or `playready`).| v5+ |
|`view_dropped` |Boolean |Dim. | Boolean indicating whether the view was finalized without an explicit viewend event. | v11+ |
|`view_dropped_frame_count` |Integer |Metric | The number of frames that were dropped by the player during playback| v5+ |
|`view_end` |Time |Dim. | Date and Time at which the view ended, in UTC.| v1+ |
|`view_has_ad` |Boolean |Metric | Identifies if an advertisement played or attempted to play during the video view.| v6+ |
|`view_id` |Unique ID |Dim. | Unique View Identifier| v1+ |
|`view_max_playhead_position` |Milliseconds |Dim. | The furthest the video was played, indicated by the maximum time value of the playhead during the view.| v3+ |
|`view_playing_time` |Milliseconds |Metric | The amount of time the video spent playing during the view; this value does not include time spent joining, rebuffering, or seeking.| v3+ |
|`view_seek_count` |Integer |Dim. | The number of times that the viewer attempted to seek to a new location within the view.| v1+ |
|`view_seek_duration` |Milliseconds |Dim. | Total amount of time spent waiting for playback to resume after the viewer seeks to a new location. Seek Latency metric in the Dashboard is this value divided by `view_seek_count`.| v1+ |
|`view_session_id` |Unique ID |Dim. | An id that can be used to correlate the view with platform services upstream such as CDN or origin logs.| v2+ |
|`view_start` |Time |Dim. | Date and Time at which the view started, in UTC.| v1+ |
|`view_time_shift_enabled` |Boolean |Dim. | Boolean indicating if this view had time\_shift enabled. | v13+ |
|`view_total_content_playback_time` |Milliseconds |Dim. | Internal metric used in calculating upscale and downscale percentages.| v1+ |
|`view_total_downscaling` |Milliseconds |Dim. | Internal number used to calculate Downscale Percentage Metric. Downscale Percentage = `view_total_downscaling / view_total_content_playback_time` | v1+ |
|`view_total_upscaling` |Milliseconds |Dim. | Internal number used to calculate Upscale Percentage Metric. Upscale Percentage = `view_total_upscaling / view_total_content_playback_time`| v1+ |
|`view_upscaling_percentage` |Percentage |Metric | Upscale Percentage| v2+ |
|`viewer_application_engine` |Text |Dim. | Web Browser Engine (`Gecko`, `WebKit`, etc.)| v1+ |
|`viewer_connection_type` |Text |Dim. | The type of connection used by the player, as reported by the client when available: `cellular`, `other`, `wifi`, `wired`| v2+ |
|`viewer_device_category` |Text |Dim. | The form factor of the device: `camera`, `car browser`, `console`, `desktop`, `feature phone`, `peripheral`, `phone`, `portable media player`, `smart display`, `smart speaker`, `tablet`, `tv`, `wearable`| v1+ |
|`viewer_device_manufacturer` |Text |Dim. | Device Brand (e.g. `Apple`, `Microsoft`, etc.)| v1+ |
|`viewer_device_model` |Text |Dim. | Device Model (e.g. `iPhone11,2`)| v4+ |
|`viewer_device_name` |Text |Dim. | Device Name (e.g. `iPhone 12`)| v1+ |
|`viewer_experience_score` |Decimal |Score | Overall Viewer Experience Score| v2+ |
|`viewer_os_architecture` |Text |Dim. | No longer used. Ignore.| v1+ |
|`viewer_plan` |Text |Dim. | Name of the viewer's customer-specific plan, product, or subscription. | v13+ |
|`viewer_plan_category` |Text |Dim. | Category of the viewer's customer-specific subscription plan (e.g. bundle-type, subscription-campaign-id). | v13+ |
|`viewer_plan_status` |Text |Dim. | Status pertaining to that viewer's subscription plan (e.g. subscriber, non-subscriber, SVOD, AVOD, free, standard, premium). | v13+ |
|`viewer_user_agent` |Text |Dim. | User Agent (e.g. `Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0)`)| v1+ |
|`viewer_user_id` |Unique ID |Dim. | A Customer-defined ID representing the viewer who is watching the stream. Note: You should not use any value that is personally identifiable such as email address, username, etc. Instead, you should supply an anonymized viewer ID which you have stored within your own system.| v1+ |
|`watch_time` |Milliseconds |Dim. | Total Watch Time across the view (includes Startup Time, Playing time, potential rebuffering).| v1+ |
|`watched` |Boolean |Dim. | Ignore| v1+ |
|`weighted_average_bitrate` |bits/sec |Metric | Weighted Average Bitrate, expressed in bps.| v2+ |
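
A few of the fields above are related by the simple formulas noted in their definitions (Upscale/Downscale Percentage and Seek Latency). Expressed in Python:

```python
def downscale_percentage(total_downscaling_ms: float, total_content_playback_ms: float) -> float:
    """Downscale Percentage = view_total_downscaling / view_total_content_playback_time."""
    return total_downscaling_ms / total_content_playback_ms


def upscale_percentage(total_upscaling_ms: float, total_content_playback_ms: float) -> float:
    """Upscale Percentage = view_total_upscaling / view_total_content_playback_time."""
    return total_upscaling_ms / total_content_playback_ms


def seek_latency_ms(total_seek_duration_ms: float, seek_count: int) -> float:
    """Seek Latency (as shown in the Dashboard) = view_seek_duration / view_seek_count."""
    return total_seek_duration_ms / seek_count


# Example: 1,500 ms of downscaled playback over 60,000 ms of content playback
print(downscale_percentage(1_500, 60_000))  # → 0.025
```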

## Ad Metrics and Dimensions

| Mux API Value | Unit | Type | Definition | Versions |
|---------------|------|------|------------|----------|
|`ad_attempt_count` |Integer |Metric | The number of times that the player attempted to play an ad | v8+ |
|`ad_break_count` |Integer |Metric | The number of times that the player entered an ad break| v8+ |
|`ad_break_error_count` |Integer |Metric | The number of times that the viewer encountered an ad error during an ad break| v8+ |
|`ad_break_error_percentage` |Percentage |Metric | Percentage of views that contain ads that encountered an ad break error| v8+ |
|`ad_error_count` |Integer |Metric | The number of times that the player encountered an ad error| v8+ |
|`ad_error_percentage` |Percentage |Metric | Percentage of views that contain ads that encountered an ad error| v8+ |
|`ad_impression_count` |Integer |Metric | The number of times that the player began ad playback| v8+ |
|`ad_startup_error_count` |Integer |Metric | The number of times that the player errored on ad startup| v8+ |
|`ad_startup_error_percentage` |Percentage |Metric | Percentage of views that contain ads that encountered an ad startup error| v8+ |
|`ad_exit_before_start_count` |Integer |Metric | The number of times that the viewer exited before the ad started playback| v8+ |
|`ad_exit_before_start_percentage` |Percentage |Metric | Percentage of views that contain ads that encountered an ad exit before start| v8+ |
|`ad_playback_failure_error_type_id` |Unique ID |Dim. | An ID value that is present when an ad playback failure occurs | v10+ |
|`ad_preroll_startup_time` |Milliseconds |Metric | Measures from when the player has been instructed to play a preroll ad to when the first frame of the ad is showing and the playhead is progressing.| v10+ |
|`ad_watch_time` |Milliseconds |Metric | Total Watch Time for ad playback across the view (includes Ad Preroll Startup Time, ad playing time, potential rebuffering). | v10+ |
|`preroll_ad_asset_hostname` |Hostname |Dim. | Hostname of the Preroll Ad Asset.| v1+ |
|`preroll_ad_tag_hostname` |Hostname |Dim. | Hostname of a Preroll Ad Tag.| v1+ |
|`preroll_played` |Boolean |Dim. | Flag to identify video views for which a Preroll Ad has been successfully played.| v1+ |
|`preroll_requested` |Boolean |Dim. | Flag to identify video views for which a Preroll Ad has been requested.| v1+ |
|`requests_for_first_preroll` |Integer |Metric | Measures the number of ad requests that are made up to the point of preroll ad playback beginning.| v1+ |
|`video_startup_preroll_load_time` |Milliseconds |Metric | Total amount of Video Startup Time that is spent loading the first preroll ad asset.| v1+ |
|`video_startup_preroll_request_time` |Milliseconds |Metric | Total amount of Video Startup Time that is spent making preroll ad requests.| v1+ |

## Request-level Metrics

| Mux API Value | Unit | Type | Definition | Versions |
|---------------|------|------|------------|----------|
|`max_request_latency` |Milliseconds |Metric | Maximum time to first byte for a media request.| v2+ |
|`max_request_latency (view_max_request_latency)` |Milliseconds |Metric | Deprecated - see `max_request_latency`| v1 |
|`request_latency` |Milliseconds |Metric | Measures the average time to first byte for media requests.| v2+ |
|`request_latency (view_average_request_latency)` |Milliseconds |Metric | Deprecated - see `request_latency`| v1 |
|`request_throughput` |bits/sec |Metric | Measures the average throughput, in bits per second, for all media requests that were completed.| v2+ |
|`request_throughput (view_average_request_throughput)` |bits/sec |Metric | Deprecated - see `request_throughput`| v1 |

## CSV file formats

The daily CSV export files are generated based on the specific version that is configured and include the fields specified in the section above.

Sample CSV export files are available to download, for reference:

* [Version 2](/exports/export_v2_sample.csv)
* [Version 3](/exports/export_v3_sample.csv)
* [Version 4](/exports/export_v4_sample.csv)

## Streaming Export message format

The protobuf definition for Streaming Exports of video views is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/video_view). Please subscribe to this repository for updates to the protobuf definition.

The JSON format streaming export contains the same fields as the protobuf-encoded format.

## Streaming Export versioning

## Backward compatibility

The Streaming Export schema provided by Mux Data is backward compatible, meaning that each schema version will continue to work as future versions are released. Customers do not need to worry about breaking changes.

## When to upgrade the schema?

When Mux adds new fields to the Streaming Export, we will upgrade the schema version. If you take no action, you will not be impacted: the fields you already receive will keep working as normal, and the new fields introduced since your last upgrade will not be sent to you. The benefit of this design is that you will never start receiving new fields without knowing.

For customers who want to receive the new fields, the steps are described below.

## How to upgrade the schema?

### If integrated with Google Pub/Sub

If your Google Pub/Sub topic is **schematized**, once a schema is associated with a topic, you can no longer change that schema. This means that customers using Google Pub/Sub for Streaming Export must take a couple of steps to move to a new topic that is associated with a new schema.

* Create a new topic in Google Pub/Sub with the upgraded schema.
* Point the Mux Data Streaming Export to that new topic.
* Go to Mux Dashboard → Settings → Streaming Export → click Upgrade.

If your Google Pub/Sub topic is **schemaless**, which it must be if you want to use JSON, you do not need to create new topics or reconfigure your streaming export. However, to receive new fields released by Mux, you still need to complete the third step above and click Upgrade in the Mux Dashboard.

### If integrated with Amazon Kinesis

* If using the protobuf message format, make sure you have the latest protobuf definition from Mux's public repo. Subscribe to the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/video_view) to receive updates.
* Go to Mux Dashboard → Settings → Streaming Export → click the Upgrade button.


# Stream export data to an Amazon Kinesis data stream
Learn how to send streaming exports data from Mux to Amazon Kinesis Data Streams.
<Callout type="info">
  Streaming Exports are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

<Callout type="info">
  For a detailed walkthrough of the Amazon Kinesis Data Streams setup process, see this [blog post](https://www.mux.com/blog/mux-amazon-kinesis-integration).
</Callout>

In order to stream exports from Mux to Amazon Kinesis Data Streams, you’ll need to set up a data stream in your AWS account. This guide covers the high-level steps required for setup.

## 1. Add a new streaming export

To add a new streaming export, go to **Settings > Streaming Exports** in your Mux dashboard. From that tab, click **New streaming export** to open the configuration modal.

Select the type of data you want to export, the environment you want to send data from, the export format, and select **Amazon Kinesis Data Streams** as the service.

## 2. Set up a data stream in Amazon Kinesis

You'll need to complete the following setup in your AWS account before you can create a new streaming export in Mux:

1. Create an Amazon Kinesis data stream.
2. Create an IAM role for Mux’s AWS account. To create the IAM role, you'll need Mux's AWS account ID and an external ID, which are shown in the configuration modal. See [this AWS user guide](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-user_externalid.html) for more information about how the external ID is used. When creating the role, choose "AWS account" for the Trusted entity type. Select "Another AWS account" and enter Mux’s AWS account ID. Check "Require external ID" and paste in the "External ID" that Mux provided to you in the configuration modal.
3. Provide the IAM role you created with write access to your data stream. Here’s an example of an IAM policy that grants the necessary permissions (replace the resource with your data stream ARN):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
          "kinesis:ListShards",
          "kinesis:PutRecord",
          "kinesis:PutRecords"
      ],
      "Resource": [
        "arn:aws:kinesis:{region}:{account-id}:stream/{stream-name}"
      ]
    }
  ]
}
```
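
For reference, the trust policy on the IAM role from step 2 would look something like the following. The `{mux-account-id}` and `{external-id}` placeholders stand for the values shown in the configuration modal:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::{mux-account-id}:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "{external-id}"
        }
      }
    }
  ]
}
```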

## 3. Finish setup in Mux

In the configuration modal, provide the data stream ARN and IAM role ARN. Make sure the values you provide match these formats:

* Data stream ARN\
  `arn:aws:kinesis:{region}:{account-id}:stream/{data-stream-name}`
* IAM role ARN\
  `arn:aws:iam::{account-id}:role/{role-name}`
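
If you assemble these values programmatically, a rough format check can catch mistakes before you paste them into the modal. The regexes below are approximations of the formats above, not an official AWS grammar:

```python
import re

# Approximate patterns for the two ARN formats the configuration modal expects.
STREAM_ARN = re.compile(r"^arn:aws:kinesis:[a-z0-9-]+:\d{12}:stream/[a-zA-Z0-9_.-]+$")
ROLE_ARN = re.compile(r"^arn:aws:iam::\d{12}:role/[\w+=,.@/-]+$")


def looks_like_stream_arn(arn: str) -> bool:
    return STREAM_ARN.match(arn) is not None


def looks_like_role_arn(arn: str) -> bool:
    return ROLE_ARN.match(arn) is not None


print(looks_like_stream_arn("arn:aws:kinesis:us-east-1:123456789012:stream/mux-views"))  # → True
```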

Click **Enable export**, and your streaming export will be activated immediately. We will start streaming views as soon as they're completed.

## Process messages

With your export set up, you can begin consuming incoming messages. For more information on the message format and processing data, see the main [Export raw Mux data](/docs/guides/export-raw-video-view-data) guide.
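
As a sketch of the consuming side, assuming the JSON export format: when records are fetched via the Kinesis GetRecords REST API, each record's `Data` field is base64-encoded (SDKs such as boto3 decode it for you). A minimal parser for a batch of raw records might look like:

```python
import base64
import json


def parse_view_records(records):
    """Decode a batch of raw Kinesis records (each record's "Data" field is
    base64-encoded JSON) into view dicts. Assumes the streaming export was
    configured with the JSON message format."""
    views = []
    for record in records:
        payload = base64.b64decode(record["Data"])
        views.append(json.loads(payload))
    return views


# Example with a made-up, minimal view payload:
raw = {"Data": base64.b64encode(json.dumps({"view_id": "abc123"}).encode()).decode()}
print(parse_view_records([raw])[0]["view_id"])  # → abc123
```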


# Stream export data to a Google Cloud Pub/Sub topic
Learn how to send streaming exports data from Mux to Google Cloud Pub/Sub.
<Callout type="info">
  Streaming Exports are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

In order to stream exports from Mux to a Pub/Sub topic, you'll need to set up a topic in your Google Cloud account. Mux will write data to the topic as it becomes available. This guide covers the high-level steps required for setup.

## 1. Add a new streaming export

To add a new streaming export, go to **Settings > Streaming Exports** in your Mux dashboard. From that tab, click **New streaming export** to open the configuration modal.

Select the type of data you want to export, the environment you want to send data from, the export format, and select **Google Cloud Pub/Sub** as the service.

## 2. Set up a topic in Google Cloud Pub/Sub

You'll need to complete the following setup in your Google Cloud account before you can create a new streaming export in Mux:

1. *(Optional)* If you want to use a schema with your Pub/Sub topic, you can create one using the Protobuf spec for the data you are exporting, which is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf).
2. Create a Pub/Sub topic. If you're creating a topic with a schema, set the message encoding to **Binary**.
3. Add the Mux service account to the topic as a Principal with the **Pub/Sub Publisher** role. The Mux service account is shown in the configuration modal.

## 3. Finish setup in Mux

In the configuration modal, provide the Pub/Sub topic name. This should be the full topic name, including the project ID, and match the format `projects/{project-id}/topics/{topic-id}`.
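
If you build the topic name programmatically, a rough format check can catch mistakes early. The pattern below approximates Google Cloud's project-ID and topic-ID naming rules and is a hypothetical helper, not an official validator:

```python
import re

# Approximation: project IDs are 6-30 lowercase letters, digits, and hyphens,
# starting with a letter; topic IDs are 3-255 characters starting with a letter.
TOPIC_NAME = re.compile(
    r"^projects/[a-z][a-z0-9-]{4,28}[a-z0-9]/topics/[A-Za-z][\w.~+%-]{2,254}$"
)


def is_valid_topic_name(name: str) -> bool:
    return TOPIC_NAME.match(name) is not None


print(is_valid_topic_name("projects/my-project/topics/mux-views"))  # → True
```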

Click **Enable export**, and your streaming export will be activated immediately. We will start streaming data as soon as it becomes available.

## Process messages

With your export set up, you can begin consuming incoming messages. For more information on the message format and processing data, see the main [Export raw Mux data](/docs/guides/export-raw-video-view-data) guide.


# Set up alerts
Set up alerts so your team can be notified when certain conditions occur on your video platform.
## Define terms

**Rule:** Criteria for when an alert should be triggered based on a metric, threshold, and filter criteria.

**Incident:** A specific instance when the conditions of an alert Rule are met and an alert is triggered.

**Start Time:** The timestamp of when a metric initially crosses over an alert rule threshold.

**End Time:** The timestamp of when a metric crosses out of an alert rule threshold.

**Trigger Interval:** The time period from when a metric initially crosses over an alert rule threshold to when an alert incident notification occurs.

**Resolution Interval:** The time period from when a metric crosses out of an alert rule threshold to when an alert incident notification occurs.

**Incident Duration:** The total length of time spent in an incident, from the start of the open interval to the start of the close interval.

## Alert rules

There are two types of alerts supported by Mux Data: Anomaly and Threshold.

**Anomaly** alerts are configured automatically based on the historical failure rate of views in the organization.

**Threshold** alerts allow you to define specific criteria for viewer experience metrics that will trigger notifications.

## Anomaly Alerts

Anomaly alerts are generated when failures are elevated over a historical level. The historical level is determined by measuring the overall failure rate of videos in each organization over the recent past using a moving window.

There are two levels of playback failures that are used for anomaly detection:

* Organization-wide: The failure rate is calculated using all video views that are tracked within the organization.
* Per Video Title: The failure rates are calculated using the video views of every video title tracked within each environment separately.

To determine whether failure rates are elevated, the anomaly detector groups ongoing views into buckets ordered by time and compares the failure rate of each bucket to the historical organization-wide failure rate. If the failure rate of a bucket of views is determined to be an extreme outlier, the failed views are flagged as anomalous and an incident is opened.

The bucket sizes are based on the anomaly alert level:

* Organization-wide: 1000 views
* Per Video Title: 100 views for each video title

The outlier determinations are set dynamically based on your historical values, so there is no need to configure specific thresholds for anomaly alerts.
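
Mux runs this detection server-side; purely to illustrate the bucketing described above, here is a simplified sketch. The fixed outlier multiplier is a stand-in for the bounds Mux derives dynamically from your history:

```python
def flag_anomalous_buckets(view_failures, bucket_size, historical_rate, tolerance=3.0):
    """Group a time-ordered list of view outcomes (True = failed view) into
    fixed-size buckets and flag buckets whose failure rate far exceeds the
    historical rate. `tolerance` is an illustrative stand-in for Mux's
    dynamically computed outlier bounds."""
    flagged = []
    for start in range(0, len(view_failures), bucket_size):
        bucket = view_failures[start:start + bucket_size]
        if len(bucket) < bucket_size:
            break  # wait until a full bucket of views has been collected
        failure_rate = sum(bucket) / bucket_size
        if failure_rate > historical_rate * tolerance:
            flagged.append(start // bucket_size)
    return flagged


# Per-title buckets of 100 views; the second bucket fails at 10% vs 1% historically
views = [False] * 100 + [True] * 10 + [False] * 90
print(flag_anomalous_buckets(views, bucket_size=100, historical_rate=0.01))  # → [1]
```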

Anomaly alert incidents are automatically closed when the error rate returns to normal levels. An incident for a video title or organization that hasn't received views in the last 8 days will be marked as expired.

## Threshold Alerts

<Callout type="info">
  Threshold Alerts are available on **Mux Data Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

Threshold alerts allow you to define specific criteria for alerts that will trigger incident notifications.

Alert rules can be created for metrics collected in the Monitoring Dashboard:

* Failures
* Rebuffering Percentage
* Video Start Time
* Concurrent Viewers

<Image src="/docs/images/alert-rules-1.png" width={1217} height={643} />

From the menu on the right of each list item on the Alert Rules list page, the following actions can be taken:

* Edit
* Duplicate
* Delete

<Image src="/docs/images/alert-rules-dropdown.png" width={1217} height={643} />

### Filters

Filters are applied to the alert definition to track only the specific data you want considered for the alert. Data for alert rules can be included or excluded from the following dimensions:

* ASN
* CDN
* Country
* Mux Asset ID
* Mux Live Stream ID
* Mux Playback ID
* Operating System
* Player Name
* Region / State
* Stream Type
* Sub Property ID
* Video Series
* Video Title

### Value ("trigger if")

The threshold value the metric must cross for the alert condition to be met.

### Above/Below ("rises above/falls below")

For alerts on the Concurrent Viewers metric, the criteria can be set to trigger when the metric rises above or falls below a threshold. All other metrics trigger when the metric rises above the threshold.

### Alert Interval ("for at least X minutes")

The amount of time the threshold criteria needs to be met in order to open or close an alert incident. The interval length can be set between 1 and 60 minutes.

### Minimum Audience ("with a minimum audience of X average concurrent views")

The average number of concurrent viewers must be over the specified value in order to enter or exit an alert incident.

If the number of concurrent viewers falls below the specified minimum audience during an alert incident, the incident will continue. The number of concurrent viewers needs to be over the minimum audience in order for the alert incident to close.

**Note:** If a rule definition is changed, any open incident based on that rule is automatically closed. A new incident will be opened after the alert interval time if the updated rule criteria is met.
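
To show how the settings above interact, here is a simplified sketch of a "rises above" rule evaluated over per-minute samples. This is illustrative only, not Mux's implementation, and it simplifies how incidents close:

```python
def evaluate_rule(samples, threshold, interval_minutes, min_audience):
    """Return the minute indices at which an incident opens, given per-minute
    (metric_value, concurrent_viewers) samples for a "rises above" rule.

    The rule "fires" in a minute when the metric is above the threshold and
    the audience is at least min_audience; an incident opens once the rule
    has fired for interval_minutes consecutive minutes. Closing is
    simplified here (a real incident also requires the interval to close)."""
    opened_at = []
    consecutive = 0
    in_incident = False
    for minute, (value, viewers) in enumerate(samples):
        firing = value > threshold and viewers >= min_audience
        if firing and not in_incident:
            consecutive += 1
            if consecutive >= interval_minutes:
                in_incident = True
                opened_at.append(minute)
        elif not firing:
            consecutive = 0
            in_incident = False
    return opened_at


# Metric above 5 for 2 consecutive minutes with at least 10 concurrent viewers:
print(evaluate_rule([(6, 20), (6, 20), (6, 20), (2, 20)],
                    threshold=5, interval_minutes=2, min_audience=10))  # → [1]
```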

## Manage incidents

## Listing Incidents

When Anomaly and Threshold alert incidents are generated, they are listed in the "Incidents" tab.

You can choose the type of alert in the list:

* Threshold
* Anomaly

By default, currently "Open" incidents are shown, but all historical incidents can be viewed by choosing "All".

<Image src="/docs/images/incident-page-1.png" width={1252} height={496} />

## Incidents

When an alert is triggered, the metric performance is captured in an Incident. The Incident page provides a place to see the characteristics of the alert and the metric behavior at the start and end of the incident.

<Image src="/docs/images/incident-open-page.png" width={931} height={1153} />

Incidents contain the following information:

**Alert Name:** Name for the alert as defined in the rule definition

**Started:** The timestamp when the metric first crossed over the threshold defined in the alert rule.

**Ended:** The timestamp when the metric first crossed out of the metric threshold defined in the alert rule.

**Duration:** The length of time the alert was firing; the time between the Started and Ended timestamps.

**\[Metric]:** The value of the metric when the alert incident is triggered.

**At End:** The value of the metric when the alert incident is resolved.

**Peak:** The peak value of the metric while the alert is firing.

Some data, such as the Ended time and Duration, is only shown once an alert is closed.

### Incident Start/Close Charts

Charts and additional details are captured in the Incident page when an alert incident is opened and closed.

**Incident Start:**

**\[Metric]:** The value of the metric when the alert incident is triggered.

**Concurrent Viewers:** The number of viewers when the incident started.

In order to provide context on your video platform's performance as the incident occurs, the chart shows up to 10 minutes before the incident starts and up to 5 minutes after the incident is opened. The lead-up time may be shorter than 10 minutes if the alert rule triggers an alert immediately after it is saved.

**Incident Close:**

**\[Metric]:** The value of the metric when the alert incident is resolved.

**Concurrent Viewers:** The number of viewers when the incident ended.

For post-mortem reviews, the performance of the metric is captured in a chart that shows up to 10 minutes before the incident is resolved and the 5 minutes after it ends.

Incidents can be queried via the <ApiRefLink href="/docs/api-reference/data/incidents/list-incidents">List Incidents API</ApiRefLink>.
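
For example, with `curl`, using HTTP Basic auth with your Access Token ID and Secret (see the API reference for the supported query parameters):

```shell
# Replace with your own Access Token ID and Secret.
curl https://api.mux.com/data/v1/incidents \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}"
```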

**Note:** If an incident is open when its rule definition is modified, the incident will be automatically closed. Any configuration data about the incident, such as the threshold value or filters applied, will reflect the rule configuration as of when the incident was opened.

## Notify your team

Mux Data can send notifications when alert incidents are opened and closed. Notifications are sent to channels that define the method and address of the services where the notifications should be delivered.

Channels are available for:

* Email
* Slack
* [PagerDuty](/docs/guides/pagerduty-alert-notifications)

Notification configuration can be found on the Alerts page in the "Notification Channels" tab. To set up a new channel, click the "Add Channel" button. From there, choose the notification channel type and enter the destination address (email, Slack channel, or PagerDuty integration key).

You can choose what types of alerts are sent to each channel. Choose "Anomaly" if you only want notifications for alerts generated automatically by Mux Data, "Threshold" to be notified only of alerts configured in the environment, or "All" to receive notifications for all alerts.


# PagerDuty Alert Notifications
Use Mux Data Alerts with PagerDuty to alert your team about open incidents.
## How the integration works

Alerts can be defined by specified thresholds or based on levels dynamically defined by machine learned anomaly detection.

When video metrics cause the creation of a new incident in Mux Data, a trigger event is also sent to PagerDuty, which generates a new incident in the configured service or event rule set.

When an alert incident is resolved in Mux Data, a resolve event is sent to PagerDuty and the associated PagerDuty incident will be closed.

## Integration walk-through in PagerDuty

There are two ways to integrate with PagerDuty: via Global Event Routing or on a PagerDuty Service.

If you are adding this integration to an existing PagerDuty service, please skip to the Integrating with a PagerDuty Service section of this guide.

### Integrating with Global Event Routing

Integrating with Global Event Routing enables you to route events to specific services based on the payload of the event from your tool. If you would like to learn more, please visit the PagerDuty article on Global Event Routing.

1. From the Configuration menu, select Event Rules.

2. On the Event Rules screen, click on the arrow next to Incoming Event Source to display the Integration key information. Copy your Integration Key. This is the same integration key you will use for any other tool you want to integrate with using event rules. When you have finished setting up the integration in your tool, you will return to this interface to specify how to route events from your tool to services in PagerDuty.

<Image src="/docs/images/pd-event-routing.png" width={2392} height={1278} />

### Integrating With a PagerDuty Service

1. From the **Configuration** menu, select **Services**.
2. There are two ways to add an integration to a service:
   * **If you are adding your integration to an existing service**: Click the **name** of the service you want to add the integration to. Then, select the **Integrations** tab and click the **New Integration** button.
   * **If you are creating a new service for your integration**: Please read our documentation in section [Configuring Services and Integrations](https://support.pagerduty.com/docs/services-and-integrations#section-configuring-services-and-integrations) and follow the steps outlined in the [Create a New Service](https://support.pagerduty.com/docs/services-and-integrations#section-create-a-new-service) section, selecting ***Mux Data*** as the **Integration Type** in step 4. Continue with the ***In Mux Data*** section (below) once you have finished these steps.
3. Enter an **Integration Name** in the format `monitoring-tool-service-name` (e.g. ***Mux Data-Production***) and select ***Mux Data*** from the Integration Type menu.
4. Click the **Add Integration** button to save your new integration. You will be redirected to the Integrations tab for your service.
5. An **Integration Key** will be generated on this screen. Keep this key saved in a safe place, as it will be used when you configure the integration with Mux Data in the next section.

<Image src="/docs/images/pd-integrations.png" width={2530} height={1054} />

## Integration walk-through in Mux Data

1. From the navigation menu, choose ***Alerts*** and then ***Notifications***.

<Image src="/docs/images/pd-mux-1.png" width={2530} height={1054} />

2. Click on the ***Add Channel*** button to create a new notification channel that sends alerts to PagerDuty.

<Image src="/docs/images/pd-mux-2.png" width={1640} height={498} />

3. In the New Channel dialog, for the ***Service*** choose `PagerDuty`, enter the ***Integration Key*** for the Service or Event Rule from the steps above, and choose which types of Mux Data alerts you would like sent to PagerDuty. `Anomaly` will send all automatically generated Anomaly alerts to the PagerDuty service, `Threshold` will send the notifications generated by the alerts you explicitly configure, and `All` will send all alert notifications generated in Mux Data to PagerDuty.

<Image src="/docs/images/pd-mux-3.png" width={1152} height={866} />

4. Click ***Add Channel*** to create the notification channel for PagerDuty.

## How to Uninstall

To stop notifying PagerDuty of alert incidents, delete the PagerDuty Notification Channel in Mux Data.

1. From the Alerts ***Notification Channels*** tab, scroll to the PagerDuty channel you would like to delete. Click the ***garbage can*** icon to delete the channel.

<Image src="/docs/images/pd-uninstall.png" width={1800} height={152} />

## FAQ

### Can you trigger incidents for more than one PagerDuty service or event rule from Mux Data?

To send alert notifications to multiple services or event rules, you can create more than one PagerDuty Notification Channel in Mux Data. Each PagerDuty Notification Channel in Mux Data can be set with the Integration Key from the desired service or event rule that should be notified when an alert is triggered or resolved.

## Requirements

* Mux Data integrations require access to Anomaly or Threshold Alerts. If you do not have access to this feature, please contact Mux for more information.

## Support

If you need help with this integration or information about Mux, please contact:

* Technical Support: https://www.mux.com/support
* Information: info@mux.com


# Enable automatic CDN detection
See how to configure your CDN so that Mux Data can detect the CDN when tracking network requests
Mux has the capability to track each network request made by the player in order to expose network-level metrics such as throughput and latency measurements. In addition, Mux is able to auto-detect the CDN used to serve each manifest, segment, or fragment by inspecting certain response headers.  Enabling CDN auto-detection requires some minor configuration at each of your CDNs.

# Player SDK Integration

Mux currently supports automatic CDN detection for the following player integrations.

## Web

* [HLS.js](/docs/guides/monitor-hls-js)
* [Dash.js](/docs/guides/monitor-dash-js)
* [Video.js](/docs/guides/monitor-video-js)
* [Shaka player](/docs/guides/monitor-shaka-player)

## Android

* [ExoPlayer](/docs/guides/monitor-exoplayer)
* [AndroidX Media3](/docs/guides/monitor-androidx-media3)

Simply integrate the player SDK and each network request will be tracked.

For platforms or SDKs that do not support automatic CDN detection using response headers (e.g. iOS, Roku), you can configure your SDK to pass a CDN value to the corresponding SDK key if the player is aware of which CDN is delivering content. Learn more in our [metadata guide](/docs/guides/make-your-data-actionable-with-metadata) or in the relevant SDK documentation.

# CDN Configuration for automatic CDN detection

In order for Mux to automatically detect which CDN is serving the content to the player, you need to make a few configuration changes to each of your CDNs. These changes are necessary to expose two specific headers.

| Header | Description |
| --- | --- |
| `X-CDN` | This is a custom header that you need to add to *all* responses from each of your CDNs. The value of this should be a name describing that specific CDN; you should lowercase the name and replace spaces with `_`s. For example: `fastly`, `cloudfront`, `level3`, etc. |
| `Access-Control-Expose-Headers` | This should be set on each response, with the value being a comma-separated string of headers to expose to the client. At a minimum, set this to `X-CDN`. It is also suggested that you add other identifying headers that your CDN may use, such as `X-Cache`, `X-Served-By`, `Via`, or similar headers. For more information on `Access-Control-Expose-Headers`, see https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Expose-Headers. |
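
The `Access-Control-Expose-Headers` entry matters because, on cross-origin requests, client-side JavaScript can only read `X-CDN` from a response when the CDN has exposed it. The sketch below demonstrates the read with a synthetic `Response`; the `fastly` value is a placeholder for whatever name you configure.

```javascript
// A hedged sketch: for cross-origin fetches, scripts can only read X-CDN
// when the CDN also lists it in Access-Control-Expose-Headers
function cdnFromResponse(res) {
  // Header names are case-insensitive; returns null if absent or not exposed
  return res.headers.get('x-cdn');
}

// Demonstrated with a synthetic Response (works in browsers and Node 18+)
const res = new Response('', { headers: { 'x-cdn': 'fastly' } });
console.log(cdnFromResponse(res)); // 'fastly'
```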

# Mid-stream CDN switching and automatic CDN detection

Mid-stream CDN switching changes which CDN is used for content requests. If Mux is automatically detecting the CDN used for video delivery via network events, all detected CDN values used to deliver video content are placed in the `CDN Trace` dimension, in the sequential order in which they were detected over the course of the view. A `cdn_change` event is also created when the SDK detects that the CDN has changed in the network events.


# Show how many people are watching your videos
Learn how to get the latest view and unique viewer counts for a video using the Engagement Counts API.
In this guide you will learn how to use the Engagement Counts<BetaTag /> API in order to embed the latest view and unique viewer counts for a particular video ID into your applications.

You will use JSON Web Tokens to authenticate to this API.

## 1. Create a Signing Key

Signing keys can be managed (created, deleted, listed) from the [Signing Keys settings](https://dashboard.mux.com/settings/signing-keys) of the Mux dashboard or via the Mux System API.

<Callout type="warning">
  When making a request to the System API to generate a signing key, the access
  token being used must have the System permission. You can confirm whether your
  access token has this permission by going to Settings > API Access Token. If
  your token doesn't have the System permission listed, you'll need to generate
  another access token with all of the permissions you need, including the
  System permission.
</Callout>

When creating a new signing key, the API will generate a 2048-bit RSA key pair and return the private key and a generated key ID; the public key will be stored at Mux to validate signed tokens. Store the private key in a secure manner.

You probably only need one signing key active at a time and can use the same signing key when requesting counts for multiple videos. However, you can create multiple signing keys to enable key rotation, creating a new key and deleting the old only after any existing signed URLs have expired.

### Example request

```bash
curl -X POST \
-H "Content-Type: application/json" \
-u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET} \
'https://api.mux.com/system/v1/signing-keys'
```

### Example response

```json
// POST https://api.mux.com/system/v1/signing-keys
{
  "data": {
    "private_key": "(base64-encoded PEM file with private key)",
    "id": "(unique signing-key identifier)",
    "created_at": "(UNIX Epoch seconds)"
  }
}
```

<Callout type="warning">
  Be sure that the signing key's environment (Staging, Production, etc.) matches
  the environment of the views you would like to count! When creating a signing
  key via API, the environment of the access token used for authentication will
  be used.
</Callout>

This can also be done manually via the dashboard UI. If you choose to create and download your signing key as a PEM file from the UI, you will need to base64-encode it before using it with (most) libraries.

```bash
❯ cat /path/to/file/my_signing_key.pem | base64
LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktL...
```

## 2. Generate a JSON Web Token

The following JWT claims are required:

| Claim Code | Description                | Value                                                                                                                                                              |
| :--------- | :------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `sub`      | Subject of the JWT         | The ID for which counts will be returned                                                                                                                           |
| `aud`      | Audience (identifier type) | `video_id` (Mux Data Video ID) <br /> `asset_id` (Mux Video Asset ID) <br /> `playback_id` (Mux Video Playback ID) <br /> `live_stream_id` (Mux Video Live Stream ID) |
| `exp`      | Expiration time            | UNIX Epoch seconds when the token expires. Use this to ensure any tokens that are distributed become invalid after a period of time.                               |
| `kid`      | Key Identifier             | Key ID returned when signing key was created                                                                                                                       |

<Callout type="warning">
  Each of these ID types (used for the `aud` claim) are distinct and cannot be
  used interchangeably. Video ID is an optional Data dimension provided by the
  customer (you!). For more information on leveraging Video ID, see how to [Make your data actionable](/docs/guides/make-your-data-actionable-with-metadata). Mux Video Asset ID, Playback ID and Live Stream ID are available to Mux
  Video customers only and are generated by Mux. Be sure to double check both
  the query ID type and value!
</Callout>

### Expiration time

Expiration time should be at least the duration of the video or the expected duration of the live stream. When the signed URL expires, you will no longer receive counts from the API.

Your application should consider cases where the user loads a video, leaves your application, and comes back some time later to play the video again. You will likely want to detect this behavior and fetch a new signed URL so that the counts displayed in your application continue to update.
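
One way to implement that check, sketched below, is to decode the token's `exp` claim client-side; no signature verification is needed just to decide whether to request a fresh token. The `base64url` decoding via `Buffer` assumes Node, so adapt for browsers as needed.

```javascript
// A hedged sketch (Node): read the exp claim to decide whether a stored
// token needs replacing before making the counts request
function tokenExpired(jwt, nowSeconds = Math.floor(Date.now() / 1000)) {
  // A JWT is header.payload.signature; the payload is base64url-encoded JSON
  const payload = JSON.parse(
    Buffer.from(jwt.split('.')[1], 'base64url').toString('utf8')
  );
  return payload.exp <= nowSeconds;
}
```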

<Callout type="info">
  [See the related video documentation](/docs/guides/secure-video-playback#expiration-time)
</Callout>

## 3. Signing the JWT

The steps can be summarized as:

1. Load the private key used for signing
2. Assemble the claims (`sub`, `aud`, `exp`, `kid` etc) in a map
3. Encode and sign the JWT using the claims map and private key and the RS256 algorithm.

There are dozens of software libraries for creating and reading JWTs. Whether you’re writing in Go, Elixir, Ruby, or a dozen other languages, don’t fret, there’s probably a JWT library that you can rely on. For a list of open source libraries to use, check out [jwt.io](https://jwt.io/libraries).

<Callout type="warning">
  The following examples assume you're working with either a private key
  returned from the API, or copy & pasted from the Dashboard, **not** when
  downloaded as a PEM file. If you've downloaded it as a PEM file, you will need
  to base64 encode the file contents.
</Callout>

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"
    "github.com/golang-jwt/jwt/v4"
)

func main() {

    myId := ""       // Enter the id for which you would like to get counts here
    myIdType := ""   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
    keyId := ""      // Enter your signing key id here
    key := ""        // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": myId,
        "aud": myIdType,
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// using @mux/mux-node@8

import Mux from '@mux/mux-node';
const mux = new Mux();
const myId = ''; // Enter the id for which you would like to get counts here
const myIdType = ''; // Enter the type of ID provided in myId; one of video_id | asset_id | playback_id | live_stream_id
const signingKeyId = ''; // Enter your Mux signing key id here
const privateKeyBase64 = ''; // Enter your Mux base64 encoded private key here

const getViewerCountsToken = async () => {
    return await mux.jwt.signViewerCounts(myId, {
        expiration: '1 day',
        type: myIdType,
        keyId: signingKeyId,
        keySecret: privateKeyBase64,
    });
};

const sign = async () => {
    const token = await getViewerCountsToken();
    console.log(token);
};

sign();

```

```php

<?php

  // Using https://github.com/firebase/php-jwt

  use \Firebase\JWT\JWT;

  $myId = "";       // Enter the id for which you would like to get counts here
  $myIdType = "";   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
  "sub" => $myId,
  "aud" => $myIdType,
  "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
  "kid" => $keyId
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

my_id = ''              # Enter the id for which you would like to get counts here
my_id_type = ''         # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

payload = {
    'sub': my_id,
    'aud': my_id_type,
    'exp': int(time.time()) + 3600, # 1 hour
}
headers = {
    'kid': signing_key_id
}

encoded = jwt.encode(payload, private_key, algorithm="RS256", headers=headers)
print(encoded)

```

```ruby

require 'base64'
require 'jwt'

def sign_url(subject, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: subject, aud: audience, exp: expires.to_i, kid: signing_key_id}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

my_id = ''                 # Enter the id for which you would like to get counts here
my_id_type = ''            # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''        # Enter your signing key id here
private_key_base64 = ''    # Enter your base64 encoded private key here

token = sign_url(my_id, my_id_type, Time.now + 3600, signing_key_id, private_key_base64)

```



## 4. Making a Request

Supply the JWT in the resource URL using the `token` query parameter. The API will inspect and validate the JWT to make sure the request is allowed.

Example:

```bash
curl 'https://stats.mux.com/counts?token={JWT}'
```

Response:

```json
{
  "data": [{ "views": 95, "viewers": 94, "updated_at": "2021-09-28T18:21:19Z" }]
}
```

* `views` is the total (non-unique) number of views currently happening
* `viewers` is the number of unique viewers currently watching

Uniqueness is determined by the `viewer_user_id` metadata field. See the [Metadata guide](/docs/guides/make-your-data-actionable-with-metadata) for details on adding metadata fields.
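
Putting steps 3 and 4 together in the browser might look like the sketch below; the formatting and polling interval are illustrative choices, not part of the API.

```javascript
// A hedged sketch of reading the counts payload from the response shape above
function formatCounts(body) {
  const { views, viewers } = body.data[0];
  return `${viewers} unique viewers (${views} views)`;
}

// A hedged polling loop, assuming `token` holds a signed JWT from step 3
async function pollCounts(token, onUpdate, intervalMs = 10000) {
  const res = await fetch(`https://stats.mux.com/counts?token=${token}`);
  onUpdate(formatCounts(await res.json()));
  setTimeout(() => pollCounts(token, onUpdate, intervalMs), intervalMs);
}
```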


# Build a Custom Integration
If Mux does not have an SDK specific to your player, you may want to build a custom integration.
Mux provides pre-built SDKs and integrations for most major platforms, but there are some platforms for which there is no pre-built integration. In this case, Mux provides core SDKs for multiple languages, including JavaScript, Java, and Objective-C. These core libraries encapsulate the majority of the business and metric calculation logic, while exposing a common API for plugging in individual player integrations.

# Integration Overview

Mux Data SDKs operate by tracking the playback events that occur through the idea of a `Player`. To Mux, a `Player` is an object that encapsulates the playback of videos, exposing APIs for playback events and retrieving playback state information.

In most cases, the `Player` is a single object exposed by the player technology. For instance, for our Video.js integration (`videojs-mux`), the `Player` is just the Video.js [Player object](https://docs.videojs.com/player). However, in some scenarios, there may be one or more underlying player instances that are unified through a single composite API/object. In these cases, the `Player` would be that higher-level object.

There are three major steps for building an integration for a `Player`:

1. Initialize a monitor for the `Player` that is being tracked.
2. Provide a set of callbacks for the core SDK to retrieve player/device information
3. Emit events for each of the important playback events.

The core SDKs share the above common architecture, but there are differences driven primarily by each programming language. The individual documentation for each will describe the exact steps for integration:

* [JavaScript - Building a custom Integration](/docs/guides/data-custom-javascript-integration)
* [Java - Building a custom Integration](/docs/guides/data-custom-java-integration)
* [Objective-C - Building a custom Integration](/docs/guides/data-custom-objectivec-integration)

Read on for an overview of each of these steps.

# Initialize a Player monitor

Because each core SDK supports the idea of tracking multiple `Player`s (for instance, if more than one video is being played in the same view/web page), each `Player` must be identifiable with a unique ID. This ID is used when initializing the monitor, as well as when emitting events to the core SDK.

The first step that a custom integration must do is initialize a monitor for the `Player`. This is done differently for each core SDK, but the goal is just to allow the core library to prepare the state necessary for tracking a `Player`.

In this step, some information must be provided:

* the `Player` ID
* some integration-specific metadata
* methods to retrieve information from the `Player` (more on this in a later section)

## Integration-level Metadata

When initializing a monitor for a `Player`, metadata about the integration itself should be passed. The possible fields that should be passed are the following (all are strings):

* `player_software_name`: the name of the underlying player software (i.e. 'Video.js')
* `player_software_version`: the version of this player software
* `player_mux_plugin_name`: the name of the plugin
* `player_mux_plugin_version`: the version of the plugin

# Provide Callbacks

To ease the burden of sending a lot of data with each event that is emitted, the Mux core SDKs accept callbacks that allow the core to retrieve information from the player when necessary. The callbacks required differ by core SDK, so read the appropriate section for the core SDK that you are developing with:

* [JavaScript SDK Callbacks](/docs/guides/data-custom-javascript-integration#provide-callbacks)
* [Java SDK Callbacks](/docs/guides/data-custom-java-integration#provide-callbacks)
* [Objective-C SDK Callbacks](/docs/guides/data-custom-objectivec-integration#provide-callbacks)

# Emit events

The majority of the work in an integration is creating and emitting the specific playback events as playback occurs. Most players have a concept of `events` such as `play`, `pause`, `error`, and others, but these events are often named differently depending on the player in use. The Mux core SDKs expect events named in a consistent manner, as defined in [Mux Playback Events](/docs/guides/mux-data-playback-events).

Each core SDK has a different mechanism for emitting these events, so read the appropriate section for the core SDK that you are developing with:

* [JavaScript SDK Emit Events](/docs/guides/data-custom-javascript-integration#emit-events)
* [Java SDK Emit Events](/docs/guides/data-custom-java-integration#emit-events)
* [Objective-C SDK Emit Events](/docs/guides/data-custom-objectivec-integration#emit-events)


# Custom JavaScript integration
This is a guide for building a custom JavaScript integration with Mux Data. Build a custom integration if Mux does not already have an SDK for your player.
Mux has pre-built integrations with many of the HTML5-based video players available in the market. Check the SDKs in the [Track your video performance](/docs/guides/track-your-video-performance) guide to see whether there is a pre-built integration for your player.

If there is no integration for a given player, you can install the Mux core JavaScript library (`mux-embed`) and build a custom Mux Data integration.

## Important related docs

Before proceeding, read the following overview: [Building a Custom Integration](/docs/guides/build-a-custom-data-integration).

In addition, Mux has made available a [template repository](https://github.com/muxinc/web-player-framework). This repo is intended to provide the basics for creating a working integration, after implementing the necessary callbacks and methods.

## Include the `mux-embed` library

### Install via yarn or npm (preferred)

Mux utilizes NPM to distribute the core Mux library, `mux-embed`. This library includes the internal state machine for tracking playback, as well as helper methods that may be useful while building the integration. Include `mux-embed` via `yarn` or `npm`, whichever you prefer.

```sh
yarn add mux-embed
```

This will add `mux-embed` as a dependency to your package, and will allow you to upgrade it at any time as new versions are released. Mux follows the semver standard, so updates within a major version will not have any breaking changes.

### Load from the CDN (not preferred)

If you do not use a package manager, you can include the source file from https://src.litix.io/core/4/mux.js directly in a vendor folder. The script has been built to support `npm`/`yarn`, but will also work in a standalone environment.

In either case, once the script is included in your library, you can import it as follows:

```js
import mux from 'mux-embed';

// mux.log - logs message
// mux.utils - includes multiple helper methods for massaging data
```

## Initialize the SDK

Loading and importing `mux` will initialize the SDK. However, for each new player that is being tracked, you need to initialize the SDK for that player. This is done by calling

```js
mux.init(playerID, options);
```

The core `mux` library can track multiple players at once, so it is important to pass in a unique player ID for each player that you want to track. This ID is going to be used in all future calls to the `mux` library for each player.

The `init` method also takes an optional `options` JSON object. This JSON object supports the following keys:

| Property | Required | Type | Description |
| --- | --- | --- | --- |
| `debug` | No | boolean | Controls whether debug log statements are logged to the console |
| `getPlayheadTime` | Yes | function | Callback for playhead position (see below) |
| `getStateData` | Yes | function | Callback for player state (see below) |
| `data` | No | object | Data about the viewer, video, and integration |

Within the `data` object, you should pass any information that is listed in [Metadata](/docs/guides/make-your-data-actionable-with-metadata), which is typically about the viewer or the video itself. In addition, the following should be provided:

| Property | Description | Example |
| --- | --- | --- |
| `player_software_name` | The name of the underlying player software | `'Video.js'` |
| `player_software_version` | The version of the underlying player software | `'1.0.1'` |
| `player_mux_plugin_name` | The name of the plugin being built | Some descriptive string |
| `player_mux_plugin_version` | The version of the plugin being built | A version string |

The only required property underneath `data` is `env_key`, which you can find for each environment at https://dashboard.mux.com/environments.

For most integrations, some `data` is passed down to the integration at runtime in the page/application, such as viewer information, video information, and often the `env_key`. This information should be merged with the above four properties before being passed to `mux.init`.

See the [JavaScript Integration Framework](https://github.com/muxinc/web-player-framework) for an example of how this is done.
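
A minimal sketch of that merge, with placeholder values for anything not defined by the SDK:

```javascript
// Integration-level fields the plugin itself knows (names per the table above,
// values illustrative)
const pluginData = {
  player_software_name: 'My Player',
  player_software_version: '1.0.0',
  player_mux_plugin_name: 'my-player-mux',
  player_mux_plugin_version: '0.1.0',
};

// Runtime metadata supplied by the page/application (hypothetical values)
const runtimeData = {
  env_key: 'ENV_KEY',
  viewer_user_id: 'user-123',
  video_title: 'Example Title',
};

// Merge before passing as `options.data` to mux.init
const data = { ...pluginData, ...runtimeData };
```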

## Provide callbacks

The JavaScript Core SDK expects two callback functions to be passed in the `options` object to `mux.init`: `getPlayheadTime` and `getStateData`. These callbacks make it so that additional data does not need to be provided when emitting most events.

The `getPlayheadTime` callback is a simple function that should return the accurate playhead position, in milliseconds.
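
For example, if your player reports the playhead in seconds (a common convention, assumed here along with the hypothetical `player.currentTime()` API), convert before returning:

```javascript
// Stand-in for a real player object that reports the playhead in seconds
const player = { currentTime: () => 12.5 };

// mux-embed expects the playhead position in milliseconds
const getPlayheadTime = () => Math.round(player.currentTime() * 1000);

console.log(getPlayheadTime()); // 12500
```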

The `getStateData` callback is a function that should return the following properties:

```js
options.getStateData = () => {
  return {
    // Required properties - these must be provided every time this is called
    // You _should_ only provide these values if they are defined (i.e. not 'undefined')
    player_is_paused: player.isPaused(), // Return whether the player is paused, stopped, or complete (i.e. in any state that is not actively trying to play back the video)
    player_width: player.getWidth(), // Return the width, in pixels, of the player on screen
    player_height: player.getHeight(), // Return the height, in pixels, of the player on screen
    video_source_height: player.currentSource().height, // Return the height, in pixels, of the current rendition playing in the player
    video_source_width: player.currentSource().width, // Return the width, in pixels, of the current rendition playing in the player

    // Preferred properties - these should be provided in this callback if possible
    // If any are missing, that is okay, but this will be a lack of data for the customer at a later time
    player_is_fullscreen: player.isFullscreen(), // Return true if the player is fullscreen
    player_autoplay_on: player.autoplay(), // Return true if the player is autoplay
    player_preload_on: player.preload(), // Return true if the player is preloading data (metadata, on, auto are all "true")
    video_source_url: player.src().url, // Return the playback URL (i.e. URL to master manifest or MP4 file)
    video_source_mime_type: player.src().mimeType, // Return the mime type (if possible), otherwise the source type (hls, dash, mp4, flv, etc)
    video_source_duration: secondsToMs(player.getDuration()), // Return the duration of the source as reported by the player (could be different than is reported by the customer)

    // Optional properties - if you have them, send them, but if not, no big deal
    video_poster_url: player.poster().url(), // Return the URL of the poster image used
    player_language_code: player.language() // Return the language code (e.g. `en`, `en-us`)
  };
};
```

## Emit events

The [Playback Events](/docs/guides/mux-data-playback-events) should be emitted as the events are defined. For the JavaScript core SDK, all events are emitted via `mux.emit`. This method takes three arguments:

* the player ID (the same one used in the call to `mux.init`)
* the event name (e.g. `play`)
* (optional) additional data to send along with the event.

All playback events should be emitted as defined except for one: `viewinit` does not need to be emitted for custom JavaScript integrations. This is handled directly by the call to `mux.init`, and also within the helper `mux.emit(playerId, 'videochange', data)`.

For the basic [Playback Events](/docs/guides/mux-data-playback-events), no additional metadata is necessary, as it will be pulled via the callbacks defined above. However, for the ad event and network events, there are additional data fields that should be sent, as documented.

Lastly, when changing the video, the new video metadata should be included within the third parameter.

For instance:

```js
// Emit the `play` event
mux.emit('playerId', 'play');

// Emit an ad event, with additional ad metadata
mux.emit('playerId', 'adrequest', {
  ad_tag_url: "https://pubads.g.doubleclick.net/ads/..."
});

// Changing a video
mux.emit('playerId', 'videochange', {
  video_title: 'New Video Title',
  // ... all other metadata about the video
});
```

## Tearing Down

When you are tearing down the player and want to stop monitoring it, make sure to remove any listeners that you have on the player for sending events to `mux`. After this, make sure to call `mux.emit('playerId', 'destroy');` for your player, so that the core library can clean up any monitoring and end the view.


# Custom Objective-C integration
This is a guide for building a custom integration with Mux Data in Objective-C.
Mux has a pre-built integration with Apple's `AVPlayer` for iOS and tvOS applications; for these players, see here: [iOS Integration Guide](/docs/guides/monitor-avplayer).

If the player that you use does not expose the `AVPlayer` instance directly, swaps between multiple instances during playback, or uses some other playback mechanism completely, a custom integration may be needed.

## Important Related Docs

Before proceeding, read the following overview: [Building a Custom Integration](/docs/guides/build-a-custom-data-integration).

In addition, the source code for Mux's integration with Apple's AVPlayer is open source and can be found in the [Mux-Stats-AVPlayer GitHub repository](https://github.com/muxinc/mux-stats-sdk-avplayer). This project is a good example of how to use the Objective-C core library in building a player integration.

## Include the Mux-Core library

### Installing in Xcode with Swift Package Manager

1. In your Xcode project click "File" > "Add Package"
2. In the top-right corner of the modal window paste in the SDK repository URL:

```
https://github.com/muxinc/stats-sdk-objc.git
```

3. Click `Next`.
4. Since `MuxCore` follows SemVer, we recommend installing the latest version by setting the dependency "Rules" to "Up to Next Major". [Here's an overview of the different SPM Dependency Rules and their semantics](https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app#Decide-on-package-requirements).

### Installing in Package.swift

Open your Package.swift file, add the following to `dependencies`:

```swift
    .package(
      url: "https://github.com/muxinc/stats-sdk-objc",
      .upToNextMajor(from: "5.0.1")
    ),
```

### Installing with CocoaPods

To include the core Objective-C library via CocoaPods, modify your Podfile to use frameworks by including `use_frameworks!` and then add the following pod to your Podfile:

```ruby
pod "Mux-Stats-Core", "~> 5.0"
```

This will include our current release of the [core Objective-C library](https://github.com/muxinc/stats-sdk-objc). There will be no breaking updates within major versions of this library, so you can safely run `pod update`.

Since version 3, `Mux-Stats-Core` has been updated for Xcode 12 and [XCFrameworks bundle type](https://developer.apple.com/videos/play/wwdc2019/416/).

### Including Manually (not preferred)

If you do not use CocoaPods and wish to include the library manually, download the framework from the `XCFramework` directory in the [Mux Objective-C Core SDK](https://github.com/muxinc/stats-sdk-objc) and drag it into your Xcode project.

## Initialize the SDK

There is no need to initialize a player monitor for each player that is being tracked, as this happens automatically when events are emitted for a specific player. For the Objective-C library, the Environment and Viewer-specific data should be emitted to the SDK globally as follows.

```objc
MUXSDKEnvironmentData *environmentData = [[MUXSDKEnvironmentData alloc] init];
[environmentData setMuxViewerId:[[[UIDevice currentDevice] identifierForVendor] UUIDString]];
MUXSDKViewerData *viewerData = [[MUXSDKViewerData alloc] init];
NSString *bundleId = [[NSBundle mainBundle] bundleIdentifier];
if (bundleId) {
  [viewerData setViewerApplicationName:bundleId];
}
// Set additional Viewer data as above
MUXSDKDataEvent *dataEvent = [[MUXSDKDataEvent alloc] init];
[dataEvent setEnvironmentData:environmentData];
[dataEvent setViewerData:viewerData];
[MUXSDKCore dispatchGlobalDataEvent:dataEvent];
```

The only field that should be modified within `MUXSDKEnvironmentData` is the `muxViewerId`, via `setMuxViewerId`, which should be a device-specific string. This field is used within the Mux Dashboard as the Viewer ID in the case that a user-specific value is not provided in the metadata, via `[MUXSDKCustomerViewerData setViewerUserId:]`.

If you are monitoring playback and delivery of Mux Video assets, you may opt-in to Mux Data inferring your environment details from player HTTP traffic. To opt-in, initialize `MUXSDKCustomerPlayerData` with `environmentKey` set to `nil`.

For `MUXSDKViewerData`, the fields that may be provided are the following.

```objc
@property (nullable) NSString *viewerApplicationEngine;
@property (nullable) NSString *viewerApplicationName;
@property (nullable) NSString *viewerApplicationVersion;
@property (nullable) NSString *viewerConnectionType;
@property (nullable) NSString *viewerDeviceCategory;
@property (nullable) NSString *viewerDeviceManufacturer;
@property (nullable) NSString *viewerDeviceName;
@property (nullable) NSString *viewerOsArchitecture;
@property (nullable) NSString *viewerOsFamily;
@property (nullable) NSString *viewerOsVersion;
```

See the [AVPlayer integration](https://github.com/muxinc/mux-stats-sdk-avplayer/blob/a7459d7113a8427ddbe9fd7cb252d7bc891fe8d9/Sources/MUXSDKStats/MUXSDKStats.m#L86) for example values used.

## Emit events

For the Objective-C core SDK, there are two types of events that should be emitted: data events and playback events. Data events are events that update metadata about the video or view, whereas playback events are those described here: [Mux Playback Events](/docs/guides/mux-data-playback-events).

All events are emitted to a specific `Player`, so make sure to include the unique player ID with each event emitted.

### Data Events

Data events are emitted via `[MUXSDKCore dispatchEvent: forPlayer:]`, and should be emitted when any of the following pieces of metadata change:

* `MUXSDKVideoData`
* `videoSourceWidth` - width of the video currently being played, in pixels
* `videoSourceHeight` - height of the video currently being played, in pixels
* `videoSourceIsLive` - whether the video currently being played is live or not
* `videoSourceDuration` - the duration, in milliseconds, of the video currently being played
* `videoSourceAdvertisedBitrate` - the bitrate of the current rendition being played, in bits per second
* `videoSourceFrameDrops` - the total number of dropped video frames for the current View
* Anything in `MUXSDKCustomerPlayerData`, as defined here: [Metadata](/docs/guides/make-your-data-actionable-with-metadata#iosandroid-metadata)
* Anything in `MUXSDKCustomerVideoData`, as defined here: [Metadata](/docs/guides/make-your-data-actionable-with-metadata#iosandroid-metadata)
* Anything in `MUXSDKCustomerViewData`, as defined here: [Metadata](/docs/guides/make-your-data-actionable-with-metadata#iosandroid-metadata)

When any of the above fields change, do the following:

* Create one or more instances of `MUXSDKVideoData`, `MUXSDKCustomerPlayerData`, `MUXSDKCustomerVideoData`, and `MUXSDKCustomerViewData` depending on what changed
* Assign all properties with the most recent value via the helper methods to the appropriate instance of data
* Attach these to an instance of `MUXSDKDataEvent`
* Emit this `MUXSDKDataEvent` via `[MUXSDKCore dispatchEvent: forPlayer:]`

For example, when the resolution of the video being played back changes (such as in adaptive streaming), the following should be done:

```objc
// Prepare the data update
MUXSDKVideoData *videoData = [[MUXSDKVideoData alloc] init];
[videoData setVideoSourceWidth:[NSNumber numberWithInt:width]];
[videoData setVideoSourceHeight:[NSNumber numberWithInt:height]];
// Put it within a MUXSDKDataEvent
MUXSDKDataEvent *dataEvent = [[MUXSDKDataEvent alloc] init];
[dataEvent setVideoData:videoData];
// Emit the event
[MUXSDKCore dispatchEvent:dataEvent forPlayer:_playerName];
```

### Playback Events

The [Mux Playback Events](/docs/guides/mux-data-playback-events) should be emitted as defined in the referenced document. Event names should align with those in the document, with the following changes: `MUXSDK` is prepended to the name, the name itself is PascalCased, and `Event` is appended at the end. For instance, for `playerready`, the corresponding event is `MUXSDKPlayerReadyEvent`, as defined in `MUXSDKPlayerReadyEvent.h`.

With each playback event that is emitted, the following fields within `MUXSDKPlayerData` should be included with the latest values:

* `playerMuxPluginName` - The name of the integration being built, as a string
* `playerMuxPluginVersion` - The version of the integration being built, as a string
* `playerSoftwareName` - The name of the player software (e.g. `AVPlayer`, `AVPlayerLayer`, etc)
* `playerSoftwareLanguageCode` - The language code (e.g. en-US) of the player UI localization
* `playerWidth` - The width of the player, in logical pixels
* `playerHeight` - The height of the player, in logical pixels
* `playerIsFullscreen` - Boolean of whether the player is currently displayed in full screen or not
* `playerIsPaused`- Boolean of whether the player is currently paused (i.e. not playing or trying to play)
* `playerPlayheadTime` - The current playhead time of the player, in milliseconds

For instance, when emitting the `MUXSDKPlayerReady` event, it should look like the following:

```objc
// Get the player data
MUXSDKPlayerData *playerData = [[MUXSDKPlayerData alloc] init];
// Set the player data information
[playerData setPlayerMuxPluginName:@"Sample Custom Player"];
// ... repeat the above for all values within `MUXSDKPlayerData`
// Emit the event
MUXSDKPlayerReadyEvent *event = [[MUXSDKPlayerReadyEvent alloc] init];
[event setPlayerData:playerData];
[MUXSDKCore dispatchEvent:event forPlayer:_playerName];
```

In addition to the above data fields, for ad and network events there are additional data fields that should be sent. These are documented alongside the events described in [Mux Playback Events](/docs/guides/mux-data-playback-events), and follow similar naming conventions.

In particular:

* Network throughput events should be emitted as `MUXSDKRequestBandwidthEvent`s, with the addition of `MUXSDKBandwidthMetricData` set on the event via `[MUXSDKRequestBandwidthEvent setBandwidthMetricData:]`.
* If your player gives you access to your stream's rendition list, you can use the `renditionLists` property of `MUXSDKBandwidthMetricData` to track a stream's renditions with their resolution, framerate, bitrate, and RFC `CODECS` tag ([ref](https://datatracker.ietf.org/doc/html/rfc6381)).
* Ad events are emitted via a special method, `dispatchAdEvent`, and details can be seen within [Mux's IMA integration for AVPlayer](https://github.com/muxinc/mux-stats-google-ima/blob/master/Sources/MuxStatsGoogleIMAPlugin/MuxImaListener.m)

Lastly, for the `MUXSDKRenditionChangeEvent`, you should make sure to dispatch a `MUXSDKDataEvent` with the latest updated `MUXSDKVideoData` immediately before dispatching the `MUXSDKRenditionChangeEvent`.
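As a sketch, the rendition-change sequence looks like the following (the `newWidth`, `newHeight`, `newBitrate`, `playerData`, and `playerName` variables are assumed to come from your own integration):

```objc
// 1. Dispatch the updated video data first
MUXSDKVideoData *videoData = [[MUXSDKVideoData alloc] init];
[videoData setVideoSourceWidth:@(newWidth)];
[videoData setVideoSourceHeight:@(newHeight)];
[videoData setVideoSourceAdvertisedBitrate:@(newBitrate)];
MUXSDKDataEvent *dataEvent = [[MUXSDKDataEvent alloc] init];
[dataEvent setVideoData:videoData];
[MUXSDKCore dispatchEvent:dataEvent forPlayer:playerName];

// 2. Then dispatch the rendition change itself
MUXSDKRenditionChangeEvent *renditionChangeEvent = [[MUXSDKRenditionChangeEvent alloc] init];
[renditionChangeEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:renditionChangeEvent forPlayer:playerName];
```

Dispatching the data event first ensures the new rendition's dimensions and bitrate are associated with the rendition change.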

## Sample event sequence

There are multiple steps in setting up and tracking a view correctly. A minimal sequence of events for tracking basic playback looks like the following:

1. Dispatch a global data event with the environment and viewer data
2. Dispatch the `MUXSDKViewInitEvent` with the current state of the player and video
3. Dispatch a `MUXSDKDataEvent` with the updated `MUXSDKCustomerVideoData` and `MUXSDKCustomerPlayerData` for the current video view
4. Dispatch the rest of the [Mux Playback Events](/docs/guides/mux-data-playback-events) (e.g. `MUXSDKPlayerReadyEvent`, `MUXSDKPlayEvent`, `MUXSDKPlayingEvent`, `MUXSDKTimeUpdateEvent`, etc), each time with the updated current state of the player

Note: For each Playback Event and `MUXSDKViewInitEvent` that is dispatched, the current state of the player and video data (`MUXSDKPlayerData` and `MUXSDKVideoData`) should be attached to the event prior to dispatching the event.

```objc
// First, emit the global data event setting up the information about
// the player. This will likely only be called once within your application
// and does not need to be called for each player that is tracked.
MUXSDKDataEvent *globalDataEvent = [[MUXSDKDataEvent alloc] init];
[globalDataEvent setEnvironmentData:environmentData];
[globalDataEvent setViewerData:viewerData];
[MUXSDKCore dispatchGlobalDataEvent:globalDataEvent];

// Prepare the view before you emit any other playback events
MUXSDKViewInitEvent *viewInitEvent = [[MUXSDKViewInitEvent alloc] init];
[viewInitEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:viewInitEvent forPlayer:playerName];

// Dispatch data about the view itself.
// Note: customerPlayerData must include your environment key.
MUXSDKDataEvent *dataEvent = [[MUXSDKDataEvent alloc] init];
[dataEvent setCustomerPlayerData:customerPlayerData];
[dataEvent setCustomerVideoData:customerVideoData];
[MUXSDKCore dispatchEvent:dataEvent forPlayer:playerName];

// Emit playback events
MUXSDKPlayerReadyEvent *playerReadyEvent = [[MUXSDKPlayerReadyEvent alloc] init];
[playerReadyEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:playerReadyEvent forPlayer:playerName];

// When the player begins to attempt playback
MUXSDKPlayEvent *playEvent = [[MUXSDKPlayEvent alloc] init];
[playEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:playEvent forPlayer:playerName];

// When the player actually displays the first moving frame
MUXSDKPlayingEvent *playingEvent = [[MUXSDKPlayingEvent alloc] init];
[playingEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:playingEvent forPlayer:playerName];

// ... and repeat for all of the playback events
```

## Additional methods

Most of the events are signaled as listed above. However, there are a few cases of events that require additional work.

### Reporting Network Conditions

Versions 5.8.0 and later include `MUXSDKNetworkChangeEvent`. This allows you to report changes in network connection type and optionally if the connection is in low data mode. This event should be posted as soon as network information is available after `viewinit` (`MUXSDKViewInitEvent`) and whenever connection type or low data mode changes.

A complete example implementation is available in the [AVPlayer monitoring SDK's `NetworkMonitor.swift`](https://github.com/muxinc/mux-stats-sdk-avplayer/blob/master/Sources/MUXSDKStatsInternal/NetworkMonitor.swift).

### Changing the Video

In order to change the video within a player, there are a few events that need to be fired in sequence. You can see the implementation of this within the [muxinc/mux-stats-sdk-avplayer](https://github.com/muxinc/mux-stats-sdk-avplayer/blob/a7459d7113a8427ddbe9fd7cb252d7bc891fe8d9/Sources/MUXSDKStats/MUXSDKStats.m#L526) code. You should do the following:

1. Dispatch a `viewend` event
2. Dispatch a `viewinit` event
3. Dispatch a `MUXSDKDataEvent` with the new video's `MUXSDKCustomerVideoData`, with the `videoChange` property set to `YES`.

If at various times the same underlying video stream needs to be monitored as effectively separate videos and separate Data views, two additional events, `play` and `playing`, need to be dispatched.

See an example implementation of this in the [muxinc/mux-stats-sdk-avplayer](https://github.com/muxinc/mux-stats-sdk-avplayer/blob/a7459d7113a8427ddbe9fd7cb252d7bc891fe8d9/Sources/MUXSDKStats/MUXSDKStats.m#L565) code.

The following are the required steps from start to finish:

1. As before, dispatch a `viewend` event
2. As before, dispatch a `viewinit` event
3. As before, dispatch a `MUXSDKDataEvent` with the new video's `MUXSDKCustomerVideoData`, with the `videoChange` property set to `YES`.
4. Dispatch a `play` event
5. Dispatch a `playing` event
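Following the event-naming convention described earlier, the full sequence can be sketched as follows (the `MUXSDKViewEndEvent` class name follows that naming rule, `setVideoChange:` is assumed to be the setter for the `videoChange` property, and `newCustomerVideoData`, `playerData`, and `playerName` are assumed variables from your integration):

```objc
// 1-2. End the old view and initialize the new one
MUXSDKViewEndEvent *viewEndEvent = [[MUXSDKViewEndEvent alloc] init];
[viewEndEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:viewEndEvent forPlayer:playerName];

MUXSDKViewInitEvent *viewInitEvent = [[MUXSDKViewInitEvent alloc] init];
[viewInitEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:viewInitEvent forPlayer:playerName];

// 3. Send the new video's metadata, flagged as a video change
MUXSDKDataEvent *dataEvent = [[MUXSDKDataEvent alloc] init];
[dataEvent setCustomerVideoData:newCustomerVideoData];
[dataEvent setVideoChange:YES];
[MUXSDKCore dispatchEvent:dataEvent forPlayer:playerName];

// 4-5. Restart playback tracking for the new view
MUXSDKPlayEvent *playEvent = [[MUXSDKPlayEvent alloc] init];
[playEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:playEvent forPlayer:playerName];

MUXSDKPlayingEvent *playingEvent = [[MUXSDKPlayingEvent alloc] init];
[playingEvent setPlayerData:playerData];
[MUXSDKCore dispatchEvent:playingEvent forPlayer:playerName];
```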

### Sending Error events

Your custom integration can dispatch error events associated with the current view. These errors can be used to trigger alerts and are visually indicated on the event timeline shown for that view.

When dispatching errors your custom integration can provide additional error metadata with Error Categorization. This section will cover several examples of dispatching errors using the Objective-C SDK. You can find [more general information on Error Categorization here](/docs/guides/error-categorization).

<Callout type="info">
  Any error categories specified by your custom integration can be configured to be overridden based on the player error code. [See the Error Categorization guide for more details](/docs/guides/error-categorization#2-configuring-error-categorization).
</Callout>

This example dispatches an error event that Mux will categorize as a fatal playback error unless a different default for the player error code applies.

```objc
// Call this method from the source of the fatal playback error (such as an `AVPlayer` key-value property observer, for example) with parameters appropriate to your integration.
- (void)dispatchPlaybackErrorWithPlayerName:(NSString *)playerName
                            playerErrorCode:(NSString *)playerErrorCode
                         playerErrorMessage:(NSString *)playerErrorMessage
                         playerErrorContext:(NSString *)playerErrorContext
                         playerPlayheadTime:(NSNumber *)playerPlayheadTime {
  MUXSDKErrorEvent *errorEvent = [[MUXSDKErrorEvent alloc] initWithContext:playerErrorContext];

  // Configure any custom video or view data if necessary
  MUXSDKPlayerData *playerData = [[MUXSDKPlayerData alloc] init];
  [playerData setPlayerErrorCode:playerErrorCode];
  [playerData setPlayerErrorMessage:playerErrorMessage];
  [playerData setPlayerPlayheadTime:playerPlayheadTime];
  // ... repeat for any other `MUXSDKPlayerData` properties if they've changed
  [errorEvent setPlayerData:playerData];

  [MUXSDKCore dispatchEvent:errorEvent
                  forPlayer:playerName];
}
```

This example dispatches an error that Mux will categorize as a warning unless a different default for the player error code applies.

```objc
// Call this method from the source of the playback warning (such as an `AVPlayer` key-value property observer, for example) with parameters appropriate to your integration.
- (void)dispatchPlaybackWarningWithPlayerName:(NSString *)playerName
                              playerErrorCode:(NSString *)playerErrorCode
                           playerErrorMessage:(NSString *)playerErrorMessage
                           playerErrorContext:(NSString *)playerErrorContext
                           playerPlayheadTime:(NSNumber *)playerPlayheadTime {
  MUXSDKErrorEvent *errorEvent = [[MUXSDKErrorEvent alloc] initWithSeverity:MUXSDKErrorSeverityWarning
                                                                    context:playerErrorContext];

  // Configure any custom video or view data if necessary
  MUXSDKPlayerData *playerData = [[MUXSDKPlayerData alloc] init];
  [playerData setPlayerErrorCode:playerErrorCode];
  [playerData setPlayerErrorMessage:playerErrorMessage];
  [playerData setPlayerPlayheadTime:playerPlayheadTime];
  // ... repeat for any other `MUXSDKPlayerData` properties if they've changed
  [errorEvent setPlayerData:playerData];

  [MUXSDKCore dispatchEvent:errorEvent
                  forPlayer:playerName];
}
```

This example dispatches an error that Mux will categorize as a business exception unless a different default for the player error code applies.

```objc
// Call this method from the source of the business exception with parameters appropriate to your integration.
- (void)dispatchBusinessExceptionWithPlayerName:(NSString *)playerName
                                  playerErrorCode:(NSString *)playerErrorCode
                                  playerErrorMessage:(NSString *)playerErrorMessage
                                  playerErrorContext:(NSString *)playerErrorContext
                                  playerPlayheadTime:(NSNumber *)playerPlayheadTime {

  // This method does not set an explicit error severity, see below for an example method that does.
  MUXSDKErrorEvent *errorEvent = [[MUXSDKErrorEvent alloc] initWithContext:playerErrorContext];
  [errorEvent setIsBusinessException: YES];

  // Configure any custom video or view data if necessary
  MUXSDKPlayerData *playerData = [[MUXSDKPlayerData alloc] init];
  [playerData setPlayerErrorCode:playerErrorCode];
  [playerData setPlayerErrorMessage:playerErrorMessage];
  [playerData setPlayerPlayheadTime:playerPlayheadTime];
  // ... repeat for any other `MUXSDKPlayerData` properties if they've changed
  [errorEvent setPlayerData:playerData];

  [MUXSDKCore dispatchEvent:errorEvent
                  forPlayer:playerName];
}

// Call this method from the source of the business exception with parameters appropriate to your integration.
- (void)dispatchBusinessExceptionWithPlayerName:(NSString *)playerName
                                  playerErrorSeverity:(MUXSDKErrorSeverity)errorSeverity
                                  playerErrorCode:(NSString *)playerErrorCode
                                  playerErrorMessage:(NSString *)playerErrorMessage
                                  playerErrorContext:(NSString *)playerErrorContext
                                  playerPlayheadTime:(NSNumber *)playerPlayheadTime {
  MUXSDKErrorEvent *errorEvent = [[MUXSDKErrorEvent alloc] initWithSeverity:errorSeverity
                                                                    context:playerErrorContext];
  [errorEvent setIsBusinessException:YES];

  // Configure any custom video or view data if necessary
  MUXSDKPlayerData *playerData = [[MUXSDKPlayerData alloc] init];
  [playerData setPlayerErrorCode:playerErrorCode];
  [playerData setPlayerErrorMessage:playerErrorMessage];
  [playerData setPlayerPlayheadTime:playerPlayheadTime];
  // ... repeat for any other `MUXSDKPlayerData` properties if they've changed
  [errorEvent setPlayerData:playerData];

  [MUXSDKCore dispatchEvent:errorEvent
                  forPlayer:playerName];
}
```

### Destroying the Monitor

When you are tearing down the player and want to stop monitoring it, make sure to remove any listeners that you have on the player for sending events to `MUXSDKCore`. After this, make sure to call `[MUXSDKCore destroyPlayer: _name];` for the name of your player, so that the core library can clean up any monitoring and end the view session.
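As a minimal sketch (assuming `_playerName` holds the name your integration used when dispatching events):

```objc
// Remove your own player observers/listeners first, then end monitoring so the
// core library can clean up and close out the view session.
[MUXSDKCore destroyPlayer:_playerName];
```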

## Release notes

### Current release

#### v5.10.0

Improvements

* Continues tracking cumulative playing time after `MUXSDKSeekedEvent` and `MUXSDKRebufferEndEvent` when their `playerData.playerIsPaused` is true.
* Updates the constant value for `MUXSDKConnectionTypeNoConnection`

### Previous releases

#### v5.9.0

Improvements

* Adds `viewerDeviceName` property to `MUXSDKCustomerViewerData`

#### v5.8.1

Fixes

* Resolves a leak (retain cycle) involving an internal class (`MUXSDKCoreView`).

#### v5.8.0

Improvements

* Adds `MUXSDKNetworkChangeEvent` and predefined values for connection type via `MUXSDKConnectionType`

#### v5.7.1

Fixes:

* Ensure `playbackmodechange` events are sent
* Restores tvOS-specific seeking detection behavior, where a best-effort attempt is made to count the preceding pause as part of the seek. This behavior was missing in 5.7.0, potentially causing changes in metrics.

#### v5.7.0

Updates:

* `playbackmodechange` event added for tracking changes to the presentation of a video (ie, fullscreen, pip, etc)
* Added timing metrics for playing time and ad playing time. These metrics track the wall-clock time spent playing (excluding startup time, rebuffering, seeking, etc)

Improvements:

* Added nullability annotations and nil-handling improvements to most public APIs
* Made most properties of `MUXSDKPlayerData`, `MUXSDKVideoData`, `CustomerData`, et al `nonatomic`
* Handle trackable events via typed delegate method
* Enable link-time optimization by thinLTO
* MUXSDKCore methods may now be called from any thread
* Numerous internal improvements

Fixes:

* Fix HTTP retry delay 1000x higher than intentional
* Wait for HTTP connectivity for beacons

#### v5.6.0

Improvements:

* Allow disabling playhead-based rebuffer tracking via `-[MUXSDKCore disablePlayheadRebufferTrackingForPlayerID:]` and manual dispatch of `MUXSDKRebufferStartEvent` and `MUXSDKRebufferEndEvent`

Fixes:

* Prevent overriding `muxEmbed`, `muxEmbedVersion`, and `muxApiVersion` from `MUXSDKEnvironmentData`
* Individual frameworks within the XCFramework are no longer separately code signed. The parent XCFramework is still signed.

#### v5.5.1

Improvements:

* Removes the deprecation on `MUXSDKCustomerVideoData.videoCDN` (added in v5.5.0)

#### v5.5.0

Improvements:

* Adds CDN change tracking, including automatically via `X-CDN` headers
* The MuxCore frameworks are now built as Mergeable Libraries. See this [WWDC session](https://developer.apple.com/videos/play/wwdc2023/10268/) and the [official docs](https://developer.apple.com/documentation/xcode/configuring-your-project-to-use-mergeable-libraries) for more info. This is not enabled for the CocoaPods build.

Fixes:

* Fixes missing videoData in some instances, including after a rendition change
* Corrects a typo in `viewPrerollAdAssetHostname` causing it not to be sent
* Fixes an issue leading to incorrect `viewPlayingTime` and/or `viewMaxPlayheadPosition`
* Resolves a few naming issues that could cause builds to fail on case-sensitive filesystems

#### v5.4.1

Fixes:

* fix rebuffering ending while player is still buffering in some cases

#### v5.4.0

Improvements:

* Add customer viewer data to `MUXSDKDataEvent` and `MUXSDKTrackableEvent`
* Use consistent umbrella header path so `#import <MuxCore/MuxCore.h>` works on all platforms

#### v5.3.1

Improvements:

* Add `MUXSDKCustomerVideoData.videoCreatorId`

#### v5.3.0

Improvements:

* adds additional dimensions

Fixes:

* adds missing macCatalyst platform to package spec

#### v5.2.0

Improvements:

* expose additional custom dimensions

#### v5.1.2

Fixes:

* dispatch queued up events when receiving `adbreakend`, `aderror` events
* patch memory leak when a player monitor is created and destroyed

#### v5.1.1

Fixes:

* Fully resets all player metrics associated with a previous view that had ended due to a time out when receiving a `viewinit` event.

#### v5.1.0

Improvements:

* Include codec and rendition name in `renditionchange` events
* Add safety checks when player identifier is `nil`

#### v5.0.1

Improvements:

* Include privacy manifest file

#### v5.0.0

Improvements:

* Error events can be categorized with warning or fatal severity levels
* Error events can be categorized as business exceptions
* An error translator can be configured to extend or customize the Core SDK error handling logic

Fixes:

* Player error details such as error code, error context, error message, error severity, and whether the error is a business exception are only sent to Mux when an error event is dispatched.
* Player error details (same as listed above) are no longer deduplicated and are explicitly included with each error event sent to Mux.
* The SDK continues to track watch time after an error event is dispatched based on player playhead progression. To explicitly indicate that watch time should no longer be tracked after an error during a playback session please dispatch a `ViewEnd` event.

#### v4.7.1

Improvements:

* Include privacy manifest file

#### v4.7.0

Improvements:

* Add support for monitoring media on `visionOS`. We recommend testing your `visionOS` SDK integration on both the simulator and a physical device prior to deploying to the App Store.

Fixes:

* Compute correct Video Startup Time if `AdPlayingEvent` occurs a significant time after the view has started
* Ensure seeks are excluded from Video Startup Time in all cases

Known Issues:

* Installation using Cocoapods on `visionOS` is not currently supported. Installation on `iOS` and `tvOS` using Cocoapods is not affected.

#### v4.6.0

API Changes:

* Expose player software name and player software version on `MUXSDKPlayerData`

Improvements:

* Bump beacon interval to 10 seconds to match the other Core SDKs

#### v4.5.2

Improvements:

* Backfill header nullability annotations

#### v4.5.1

Fixes:

* Include at playback time in the calculation for total playback time

#### v4.5.0

Updated:

* Add DRM Type to `MUXSDKCustomerViewData` so you can specify it from another source

Deprecated:

* `MUXSDKDispatcher`'s `handleBatch beaconCollectionDomain: osFamily: jsonDict: callback:` has been deprecated in favor of an overload that takes headers for requests to the collection domain
* `MUXSDKNetworkRequestBuilding` `buildRequestFromURL: eventsJsonDict: error:` has been deprecated in favor of an overload that takes headers for requests to the collection domain

Improvements:

* Performance + Reliability improvements during large events

#### v4.4.2

Fixes:

* Fix ad metadata not being reported

#### v4.4.1

Improvements:

* Update beacon interval from 5s to 10s

#### v4.4.0

* Add per-ad metadata for Ad events
* Fix strange views when a user seeks into an ad break

#### v4.3.0

* Add DRM Type and Error Context metadata fields

#### v4.2.0

* Add more Custom Dimensions

#### v4.1.1

* Fix Rendition::name misnamed

#### v4.1.0

* Add framerate, codec, and name to rendition properties

#### v4.0.0

* Due to Xcode 14, support for iOS and tvOS versions 9 and 10 have been removed. For more information see the last 'Deprecations' block in the release notes. This may result in a warning for client applications with deployment versions below iOS/tvOS 11

#### v3.14.0

* Split Views with >1 hour of inactivity into new views

#### v3.11.0

* Add inferred environment key support for users of Mux Data and Mux Video
* Expose `MUXSDKEndedEvent` in the public headers

#### v3.10.1

* Add weak self/strong self in closure block to avoid any retain cycles

#### v3.10.0

* Capture experiments values from HLS Session Data

#### v3.9.0

* Add Experiment Fields
* Log sent beacons in debug mode
* Set Xcode build setting `APPLICATION_EXTENSION_API_ONLY` = YES

#### v3.8.0

* Add internal device detection properties
* Add project binary specification file for Carthage support

#### v3.7.0

* Use synchronized to make query data objects thread safe.

#### v3.6.0

* Add transmission time and round trip time to beacon requests
* Add `player_live_edge_program_time`
* Add `player_program_time`

#### v3.5.0

* Allow overriding of viewer information (application name)
* Add nullability specifiers
* Custom beacon collection domains

#### v3.4.0

* Adds the `MUXSDKCustomerData` model
* Adds support for setting custom dimensions

#### v3.3.0

* Automatically build statically linked frameworks
* Remove dependency on `UIKit`

#### v3.2.0

* Add Swift PM support

#### v3.1.0

* Submits a new `mux_embed` field
* Fixes bugs with video start-up time for midroll or postroll ads
* Updates ad tracking to be more accurate
* Tracks `view_playing_time`

#### v3.0.3

* No functional changes, just generating a new release on CocoaPods

#### v3.0.2

* Include linker flags that enable the framework to be built without support for modules.
* Move instance variables out of headers

#### v3.0.0

This release moves the build process to use [XCFramework bundle type](https://developer.apple.com/videos/play/wwdc2019/416/). For iOS, there are no changes required to your application code.

If you are using this SDK with TVOS the name of the module has changed (the `Tv` suffix is no longer needed):

TVOS before 3.0:

```obj-c
@import MuxCoreTv;
```

TVOS after 3.0:

```obj-c
@import MuxCore;
```

#### v2.4.1

* (bugfix) Works around an issue where a view with no pre-roll ads, but with midroll and/or postroll ads will cause Mux to update the TTFF value erroneously

#### v3.0.0-beta.0

This release moves the build process to use [XCFramework bundle type](https://developer.apple.com/videos/play/wwdc2019/416/). For iOS, there are no changes required to your application code.

If you are using this SDK with TVOS the name of the module has changed (the `Tv` suffix is no longer needed):

TVOS before 3.0:

```obj-c
@import MuxCoreTv;
```

TVOS after 3.0:

```obj-c
@import MuxCore;
```

#### v2.4.0

* Adds support for `player_remote_played` and `view_session_id`.
* In addition to existing options that are provided via the `MUXSDKCustomerPlayerData` and `MUXSDKCustomerVideoData` objects, there is now support for `MUXSDKCustomerViewData`. The `view_session_id` may be set on `MUXSDKCustomerViewData`.

#### v2.3.0

* Update build process for Xcode 12 to exclude arm\_64 architectures when building for simulator. Before Xcode 12, xcodebuild never built arm\_64 slices for the simulator. Now, it does (in preparation for Apple silicon). Because arm\_64 slices now get built for the simulator, `lipo` errors out, because it can't have the same architecture for two different platforms (it already has arm\_64 for the device platform). This is a temporary work around until a later major version release which will use the new `XCFramework` hotness
* bump iOS deploy target to '9.0' in `podspec` and project build settings for Xcode 12 compatibility

#### v2.2.2

* bugfix: Removes erroneously committed logs from the compiled frameworks

#### v2.2.1

* bugfix - ignore scaling calculations when player or source width or height dimension is 0

#### v2.2.0

* Add support for `renditionchange` events
* Add support for `orientationchange` events

#### v2.1.3

* bugfix for request metrics calculation. If we don't have `responseStart`, fallback to `requestStart` in order to calculate throughput

#### v2.1.2

* bugfix - Use monotonically increasing time in Objc client library. Avoids a bug if system time changes during a view.

#### v2.1.1

* Expose `videoSourceUrl` on `MUXSDKCustomerVideoData`. This allows a user to set the `videoSourceUrl` (along with their other VideoData), in which case any `videoSourceUrl` that is inferred from the player will be ignored.

#### v2.1.0

* Fix build process for Xcode 11
* Make `player_instance_id` a full uuid-style string
* Make sure to always send `player_instance_id`
* Bump Mux API versions for new collectors/processors


# Custom Java integration
This is a guide for building a custom integration with Mux Data in Java. Build a custom integration if Mux does not already have an SDK for your player.
Mux has a pre-built integration with Google's [ExoPlayer v2](/docs/guides/monitor-exoplayer) and [Android Media Player](/docs/guides/monitor-android-media-player) for Android applications.

If the player that you use does not expose the `ExoPlayer` instance directly, swaps between multiple instances during playback, or uses some other playback mechanism completely (for instance, outside of Android), a custom integration may be needed.

## Important Related Docs

Before proceeding, read the following overview: [Building a Custom Integration](/docs/guides/build-a-custom-data-integration).

In addition, the source code for Mux's integration with Google's ExoPlayer is open source and can be found in the [Mux-Stats-SDK-ExoPlayer GitHub repository](https://github.com/muxinc/mux-stats-sdk-exoplayer). This project is a good example of how to use the Java core library in building a player integration.

## Include the Library

The Mux Core Java library is distributed as a JAR file and can be included using one of the following methods:

## Option 1: Add Gradle dependency (preferred)

Add the Mux Maven repository to your Gradle file:

```text
repositories {
    maven {
        url "https://muxinc.jfrog.io/artifactory/default-maven-release-local"
    }
}
```

Next, add a dependency on Mux Core (current version is 8.8.0):

```
api 'com.mux:stats.muxcore:8.8.0'
```

## Option 2: Add Maven dependency

Add the Mux repository to your Maven pom.xml:

```xml
<repository>
    <id>mux</id>
    <name>Mux Maven Repository</name>
    <url>https://muxinc.jfrog.io/artifactory/default-maven-release-local</url>
    <releases>
        <enabled>true</enabled>
    </releases>
    <snapshots>
        <enabled>false</enabled>
    </snapshots>
</repository>
```

Next, add a dependency on Mux Core (current version is 8.8.0):

```xml
<dependency>
    <groupId>com.mux</groupId>
    <artifactId>stats.muxcore</artifactId>
    <version>8.8.0</version>
</dependency>
```

## Initialize the SDK

The core Java SDK is initialized by implementing certain interfaces and providing these back to the SDK. In general, follow the structure used within [MuxBaseExoPlayer](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/main/java/com/mux/stats/sdk/muxstats/MuxBaseExoPlayer.java#L89): create a class that extends `EventBus` and implements `IPlayerListener`, then follow these general steps.

```java
import java.lang.ref.WeakReference;

import com.mux.stats.sdk.core.events.EventBus;
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.core.model.CustomerViewData;
import com.mux.stats.sdk.muxstats.IPlayerListener;
import com.mux.stats.sdk.muxstats.MuxStats;

public class PlayerListener extends EventBus implements IPlayerListener {
    MuxStats muxStats;
    WeakReference<ExoPlayer> player;
    PlayerState state;

    PlayerListener(Context ctx, ExoPlayer player, String playerName, CustomerPlayerData customerPlayerData, CustomerVideoData customerVideoData, CustomerViewData customerViewData) {
        super();
        this.player = new WeakReference<>(player);
        state = PlayerState.INIT;
        MuxStats.setHostDevice(new MuxDevice(ctx));
        MuxStats.setHostNetworkApi(new MuxNetworkRequests());
        muxStats = new MuxStats(this, playerName, customerPlayerData, customerVideoData, customerViewData);
        addListener(muxStats);
    }
}
```

The above does the following:

1. Initializes the `EventBus` superclass
2. Sets the host device to a new instance of a class that implements `IDevice`
3. Sets the host network API to a new instance of a class that implements `INetworkRequest`
4. Instantiates a new instance of `MuxStats`, passing itself (a class that implements `IPlayerListener`) along with metadata
5. Adds muxStats as a listener for `this`'s events (via EventBus)

The `IDevice`, `INetworkRequest`, and `IPlayerListener` interfaces are described in the next section, as they provide the majority of the functionality aside from the actual emitting of events.

## Provide Callbacks

The core Java SDK relies heavily on callbacks, via implemented interfaces. These interfaces provide necessary metadata as well as core functionality that may be different depending on your Java environment.

## `IDevice`

The `IDevice` interface provides device-specific information to the core library, which is used as metadata attached to each view.

```java
package com.mux.stats.sdk.muxstats;

public interface IDevice {
    // Return the hardware name (e.g. Build.HARDWARE)
    String getHardwareArchitecture();
    // Return the OS (e.g. Android)
    String getOSFamily();
    // Return the OS version
    String getOSVersion();
    // Return the device manufacturer (e.g. Build.MANUFACTURER)
    String getManufacturer();
    // Return the model name (e.g. Build.MODEL)
    String getModelName();
    // Return the player version
    String getPlayerVersion();
    // Return a unique identifier for this device
    String getDeviceId();
    // Return the name of the running application
    String getAppName();
    // Return the version of the running application
    String getAppVersion();
    // Return the name of the plugin (e.g. exoplayer-mux)
    String getPluginName();
    // Return the version of the plugin
    String getPluginVersion();
    // Return the player software (e.g. 'ExoPlayer')
    String getPlayerSoftware();
    // Return the network connection type (e.g. 'wifi', 'cellular', 'ethernet')
    String getNetworkConnectionType();
    // Return milliseconds since epoch, ideally from a
    // monotonically increasing clock. For instance, in
    // ExoPlayer and Android, we suggest
    // android.os.SystemClock.elapsedRealtime
    long getElapsedRealtime();
    // Provide a mechanism to log output, for instance to logcat
    void outputLog(String tag, String msg);
}
```

There must be an instance of a class that implements the `IDevice` interface, and this should be provided to `MuxStats.setHostDevice` prior to instantiating an instance of `MuxStats`.

You can see the implementation of `IDevice` within Mux's ExoPlayer integration within [MuxBaseExoPlayer.java](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/main/java/com/mux/stats/sdk/muxstats/MuxBaseExoPlayer.java#L605).
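As a rough, self-contained illustration, a device-info provider might look like the sketch below. All returned values are placeholders of my own choosing; on Android you would read them from `android.os.Build`, the `PackageManager`, and `ConnectivityManager` instead. In a real project the class would also declare `implements IDevice`; the interface is left off here so the sketch stands alone.

```java
import java.util.UUID;

// Sketch of a device-info provider for the core Java SDK. Every value below is
// an illustrative placeholder; replace each with the real platform lookup.
public class StubDevice {
    // Generated per instance for brevity; in practice, generate once and
    // persist it so the device ID is stable across sessions.
    private final String deviceId = UUID.randomUUID().toString();

    public String getHardwareArchitecture() { return "arm64-v8a"; }   // e.g. Build.HARDWARE
    public String getOSFamily() { return "Android"; }
    public String getOSVersion() { return "14"; }                     // e.g. Build.VERSION.RELEASE
    public String getManufacturer() { return "ExampleCorp"; }         // e.g. Build.MANUFACTURER
    public String getModelName() { return "Example Phone"; }          // e.g. Build.MODEL
    public String getPlayerVersion() { return "1.0.0"; }              // your player's version
    public String getDeviceId() { return deviceId; }
    public String getAppName() { return "com.example.app"; }
    public String getAppVersion() { return "1.2.3"; }
    public String getPluginName() { return "my-player-mux"; }         // your integration's name
    public String getPluginVersion() { return "0.1.0"; }
    public String getPlayerSoftware() { return "MyPlayer"; }
    public String getNetworkConnectionType() { return "wifi"; }       // query ConnectivityManager in practice
    public long getElapsedRealtime() { return System.nanoTime() / 1_000_000L; } // SystemClock.elapsedRealtime() on Android
    public void outputLog(String tag, String msg) { System.out.println(tag + ": " + msg); }
}
```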

## INetworkRequest

The `INetworkRequest` interface defines the methods that the Mux core SDK requires in order to make the necessary network requests.

```java
package com.mux.stats.sdk.muxstats;

/**
 * <b>MuxStats</b> will use this interface implementation to send events and metrics to the backend.
 * The integrating player SDK needs to implement this interface and set it on <b>MuxStats</b> via the
 * {@link MuxStats#setHostNetworkApi(INetworkRequest)} method.
 * Always set this interface before instantiating the <b>MuxStats</b> instance.
 */
public interface INetworkRequest {

  /**
   * This interface is used to learn from the network implementation whether
   * {@link #postWithCompletion(String, String, String, Hashtable, IMuxNetworkRequestsCompletion)}
   * succeeded or not.
   *
   * @deprecated please prefer {@link IMuxNetworkRequestsCompletion2}
   */
  @Deprecated
  interface IMuxNetworkRequestsCompletion {

    /**
     * Called by the implementation object when
     * {@link #postWithCompletion(String, String, String, Hashtable,
     * IMuxNetworkRequestsCompletion)} is called.
     *
     * @param result if post was completed successfully or not.
     */
    void onComplete(boolean result);
  }

  interface IMuxNetworkRequestsCompletion2 {
    void onComplete(boolean result, Map<String, List<String>> headers);
  }

  /**
   * Perform a HTTP GET request.
   *
   * @param url url to send get request to.
   */
  void get(URL url);

  /**
   * Perform HTTP POST request.
   *
   * @param url url to send post request to.
   * @param body post request body.
   * @param headers post request headers.
   */
  void post(URL url, JSONObject body, Hashtable<String, String> headers);

  /**
   * Perform network request with confirmation callback, type of request is left to the
   * implementation.
   *
   * @param domain domain to send beacons to.
   * @param envKey backend key used to authenticate with backend.
   * @param body request body.
   * @param headers request headers.
   * @param callback callback triggered after the request signalling the request status.
   */
  void postWithCompletion(String domain, String envKey, String body,
      Hashtable<String, String> headers, IMuxNetworkRequestsCompletion callback);

  /**
   * Perform a network request with the given completion handler. If implemented, the completion
   * handler will also report the response headers for the call
   *
   * This method has a default implementation, which does not report response headers, and delegates
   * to the other postWithCompletion
   *
   * @param domain domain to send beacons to.
   * @param envKey backend key used to authenticate with backend.
   * @param body request body.
   * @param headers request headers.
   * @param callback callback triggered after the request signalling the request status.
   */
  default void postWithCompletion(String domain, String envKey, String body,
      Hashtable<String, String> headers, IMuxNetworkRequestsCompletion2 callback) {
    postWithCompletion(domain, envKey, body, headers, result -> callback.onComplete(result, null));
  }
}
```

There must be an instance of a class that implements the `INetworkRequest` interface, and this should be provided to `MuxStats.setHostNetworkApi` prior to instantiating an instance of `MuxStats`.

You can see the implementation of `INetworkRequest` within Mux's ExoPlayer integration within [MuxNetworkRequests.java](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/main/java/com/mux/stats/sdk/muxstats/MuxNetworkRequests.java).
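The transport itself can be as simple as a blocking `HttpURLConnection` POST run off the main thread. The helper below is a self-contained sketch of that POST path only; the class and method names are my own, not SDK API. An `INetworkRequest` implementation could call something like this from a background thread and hand the boolean result to the completion callback.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Minimal blocking JSON POST helper (sketch). Real implementations should run
// this off the main thread and consider retry/backoff for transient failures.
public class SimplePoster {
    public static boolean postJson(URL url, String body, Map<String, String> headers) {
        try {
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");
            if (headers != null) {
                for (Map.Entry<String, String> h : headers.entrySet()) {
                    conn.setRequestProperty(h.getKey(), h.getValue());
                }
            }
            try (OutputStream os = conn.getOutputStream()) {
                os.write(body.getBytes(StandardCharsets.UTF_8));
            }
            int status = conn.getResponseCode();
            conn.disconnect();
            // Report success for any 2xx response.
            return status >= 200 && status < 300;
        } catch (IOException e) {
            return false;
        }
    }
}
```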

## IPlayerListener

The `IPlayerListener` interface defines the callbacks that `MuxStats` will utilize to retrieve player state information.

```java
package com.mux.stats.sdk.muxstats;

public interface IPlayerListener {
    // Return the current playhead position in milliseconds
    // The playhead position must be updated at least every 250 milliseconds,
    // but can be updated more often than this.
    long getCurrentPosition();
    // Return the MIME type of the content being played (e.g. "video/mp4"
    // or "application/x-mpegUrl" etc)
    String getMimeType();
    // Return the width of the source, in pixels
    Integer getSourceWidth();
    // Return the height of the source, in pixels
    Integer getSourceHeight();
    // Return the current advertised bitrate, in bits per second
    Integer getSourceAdvertisedBitrate();
    // Return the current advertised framerate
    Float getSourceAdvertisedFramerate();
    // Return the current codec string
    String getSourceCodec();
    // Return the source duration, in milliseconds
    Long getSourceDuration();
    // Return whether the player is currently paused (i.e. not actively
    // trying to play the content). This should return true if the player
    // is not actively playing, rebuffering, or starting up.
    boolean isPaused();
    // Return whether the player is currently buffering content (e.g. not
    // playing back because the buffer is not full enough).
    boolean isBuffering();
    // Return the width of the player, in logical pixels
    int getPlayerViewWidth();
    // Return the height of the player, in logical pixels
    int getPlayerViewHeight();
    // Return the current playback position as based off of the PDT tags
    Long getPlayerProgramTime();
    // Return the time of the furthest position in the manifest as based
    // off of the PDT tags in the stream
    Long getPlayerManifestNewestTime();
    // Return the configured holdback value for a live stream (ms)
    Long getVideoHoldback();
    // Return the configured holdback value for parts in a low latency live
    // stream (ms)
    Long getVideoPartHoldback();
    // Return the configured target duration for parts in a low latency
    // live stream (ms)
    Long getVideoPartTargetDuration();
    // Return the configured target duration for segments in a live
    // stream (ms)
    Long getVideoTargetDuration();
}
```

The class that implements `IPlayerListener` serves as the interface between `MuxStats` and the actual player API, and is provided when creating an instance of `MuxStats`.

You can see the implementation of `IPlayerListener` within Mux's ExoPlayer integration within [MuxBaseExoPlayer.java](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/main/java/com/mux/stats/sdk/muxstats/MuxBaseExoPlayer.java#L64). This superclass is used to handle the base API interaction, and is subclassed by each individual `MuxStatsExoPlayer.java` to handle the varying APIs that ExoPlayer exposes with each of its minor versions (such as [this one for r2.11.1](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/r2_11_1/java/com/mux/stats/sdk/muxstats/MuxStatsExoPlayer.java)).

## Emit Events

## Playback Events

For the Java core SDK, the [Mux Playback Events](/docs/guides/mux-data-playback-events) are emitted via the `dispatch` method that is inherited from the `EventBus` class. In order to emit a given event, you must first instantiate an instance of the event class that you are trying to emit.

```java
import com.mux.stats.sdk.core.events.EventBus;
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.muxstats.IPlayerListener;
import com.mux.stats.sdk.core.events.playback.PlayEvent;

public class PlayerListener extends EventBus implements IPlayerListener {
    MuxStats muxStats;

    PlayerListener(Context ctx, ExoPlayer player, String playerName, CustomerPlayerData customerPlayerData, CustomerVideoData customerVideoData) {
        super();
        this.player = new WeakReference<>(player);
        state = PlayerState.INIT;
        MuxStats.setHostDevice(new MuxDevice(ctx));
        MuxStats.setHostNetworkApi(new MuxNetworkRequests());
        muxStats = new MuxStats(this, playerName, customerPlayerData, customerVideoData);
        addListener(muxStats);
    }

    // When the player begins trying to play back the video
    public void onPlay() {
        dispatch(new PlayEvent(null));
    }
}
```

While not necessary, each playback event can take an optional parameter of `PlayerData`, if certain information of the player has changed. This object has the following properties:

| Property | Description |
| --- | --- |
| `playerMuxPluginName` | The name of the integration being built, as a string |
| `playerMuxPluginVersion` | The version of the integration being built, as a string |
| `playerSoftwareName` | The name of the player software (e.g. `ExoPlayer`) |
| `playerSoftwareLanguageCode` | The language code (e.g. en-US) of the player UI localization |
| `playerWidth` | The width of the player, in logical pixels |
| `playerHeight` | The height of the player, in logical pixels |
| `playerIsFullscreen` | Boolean of whether the player is currently displayed in full screen or not |
| `playerIsPaused` | Boolean of whether the player is currently paused (i.e. not playing or trying to play) |
| `playerPlayheadTime` | The current playhead time of the player, in milliseconds |

Most of these properties are pulled automatically via the `IPlayerListener` interface, so there is no need to provide these values. You will need to emit all required [Playback Events](/docs/guides/mux-data-playback-events) in order to make a working integration.

<Callout type="info">
  Prior to v5.0.0, the SeekingEvent was not necessary. As of v5.0.0, this is now a required event to be emitted by the player integration.

  Prior to v6.0.0, the RebufferStartEvent and RebufferEndEvent were not necessary. As of v6.0.0 and newer, these events must be emitted by the player integration.
</Callout>

## Data Events

There is an additional type of event that is permissible, the `DataEvent`. This event is emitted the same way (via `EventBus.dispatch`), but should be used when some metadata has changed outside of a playback event, for example when any of the metadata within `CustomerVideoData`, `CustomerPlayerData`, `EnvironmentData`, `VideoData`, or `ViewerData` changes. This event likely will not be needed, but it is provided in case it might be useful. Mux does not use it within the [ExoPlayer integration](https://github.com/muxinc/mux-stats-sdk-exoplayer).

### Experiment Values

Values for Experiments can be tracked via the tags of an HLS stream's main playlist. The values in the `SessionTags` will override the values provided via objects like `CustomerPlayerData` or `CustomerVideoData`. When your player has loaded the experiment values (such as through an HLS stream's `X-SESSION-DATA` tags), you may pass them to `MuxStats::setSessionData(List<SessionTag>)`.

## Bandwidth Throughput Events

For the bandwidth throughput and latency related events, the structure is slightly different. Rather than having a specific class for each event, there is one high level network event, the `RequestBandwidthEvent`. This event exposes a method, `setBandwidthMetricData(BandwidthMetricData)`, which is used to provide all information about the event. In particular, the `BandwidthMetricData` class exposes a property (via a getter/setter) named `requestEventType`, which is a string that will match the event names as defined in [Playback Events - Bandwidth Throughput Events](/docs/guides/mux-data-playback-events#bandwidth-throughput-events).

The implementation of these events for the Mux ExoPlayer integration can be found in [MuxBaseExoPlayer.java](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/main/java/com/mux/stats/sdk/muxstats/MuxBaseExoPlayer.java#L751), from the linked line to the end of the file. This can serve as a good example of how to implement these events, though they are not necessary for a functioning integration.
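To make the dispatch pattern concrete, the sketch below models it with tiny local stand-in classes so that it compiles on its own. The real `BandwidthMetricData` and `RequestBandwidthEvent` live in the core SDK and carry many more request fields (timings, byte counts, headers), so treat the shapes here as illustrative assumptions, not SDK API.

```java
// Local stand-ins that mirror the shape of the SDK's BandwidthMetricData and
// RequestBandwidthEvent so this sketch stands alone. Only requestEventType is
// modeled here.
class BandwidthMetricDataStub {
    private String requestEventType;

    public void setRequestEventType(String type) { requestEventType = type; }
    public String getRequestEventType() { return requestEventType; }
}

class RequestBandwidthEventStub {
    private BandwidthMetricDataStub data;

    public void setBandwidthMetricData(BandwidthMetricDataStub d) { data = d; }
    public BandwidthMetricDataStub getBandwidthMetricData() { return data; }
}

public class BandwidthEventSketch {
    // Called when a media segment request finishes. A real integration would
    // populate timings and byte counts from the player's network stack and
    // then dispatch the event on the EventBus.
    static RequestBandwidthEventStub onSegmentRequestCompleted() {
        BandwidthMetricDataStub data = new BandwidthMetricDataStub();
        data.setRequestEventType("requestcompleted"); // one of the bandwidth throughput event names
        RequestBandwidthEventStub event = new RequestBandwidthEventStub();
        event.setBandwidthMetricData(data);
        return event; // a real integration would call dispatch(event) here
    }
}
```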

## Ad Events

In the case that your player supports advertising, you should instrument the ad events that are defined in [Mux Playback Events - Ad Events](/docs/guides/mux-data-playback-events#ad-events). Ad events are emitted just as normal events would be, but the ad events should have the ad metadata included via an `AdData` instance that is attached to each event via `setAdData`. For instance, to emit an `AdPlayEvent`, you should do the following:

```java
AdData adData = new AdData();
adData.setAdCreativeId(creativeId);
adData.setAdId(adId);
AdPlayEvent adPlayEvent = new AdPlayEvent(null);
adPlayEvent.setAdData(adData);
dispatch(adPlayEvent);
```

The implementation of ad events within Mux's ExoPlayer integration, on top of Google's IMA SDK, can be found within [AdsImaSDKListener.java](https://github.com/muxinc/mux-stats-sdk-exoplayer/blob/14a65c0b365a1245e500543b976b3b9be1101aaa/MuxExoPlayer/src/main/java/com/mux/stats/sdk/muxstats/AdsImaSDKListener.java), and can serve as a good example.

## Changing the video

Rather than requiring an event to be emitted for changing the video, `MuxStats` exposes two helper methods: `videoChange` and `programChange`. These methods encapsulate the logic necessary to end a view and start a new one, and both take an instance of `CustomerVideoData` containing the metadata about the new video being played.

You should call one of these methods when a new video is being loaded into an already-tracked player.

There is one critical difference between `videoChange` and `programChange`: `programChange` is intended to be used in the case that the underlying video changes *within the same stream*. An example of this would be live linear playback, where the underlying program changes without the player having to reload a new stream.

In the case that the player is loading a new HLS/Dash/MP4 video, you should use `videoChange`.

```java
CustomerVideoData customerVideoData = new CustomerVideoData(null);
customerVideoData.setVideoTitle("New Video Title");
// Add other video metadata here
muxStats.videoChange(customerVideoData);
```

## Reporting Network Changes

If your player is able to detect changes in network connectivity (for instance, switching from cellular to wifi), you can report these changes to Mux by calling the `networkChange` method on your instance of `MuxStats`. This method takes a single parameter, a string representing the new network connection type. Valid values are `"wifi"`, `"cellular"`, `"wired"`, `"other"`, and `"no_connection"`.

There's an overload of `networkChange` that also takes a boolean called `isLowDataMode`, which indicates whether the current network connection is in low data mode. This can be useful for mobile connections where the user has enabled a low data usage setting. This can be null if you don't know whether low data mode is enabled.

```java
// change to wifi
muxStats.networkChange("wifi");
// change to cellular with low data mode enabled
muxStats.networkChange("cellular", true);
```

## Sending Error events

Your custom integration is able to dispatch error events associated with the current view. These errors can be alerted on and are also visually indicated on the event timeline shown for that view.

When dispatching errors, your custom integration can provide additional error metadata with Error Categorization. This section covers several examples of dispatching errors using the Java SDK. You can find [more general information on Error Categorization here](/docs/guides/error-categorization).

This example illustrates how to construct and send different categories of error events.

<Callout type="info">
  Any error categories specified by your custom integration can be configured to be overridden based on the player error code. [See the Error Categorization guide for more details](/docs/guides/error-categorization#2-configuring-error-categorization).
</Callout>

```java
import com.mux.stats.sdk.core.events.EventBus;
import com.mux.stats.sdk.core.model.CustomerPlayerData;
import com.mux.stats.sdk.core.model.CustomerVideoData;
import com.mux.stats.sdk.muxstats.IPlayerListener;
import com.mux.stats.sdk.core.events.playback.PlayEvent;
import com.mux.stats.sdk.core.events.playback.ErrorEvent;

public class PlayerListener extends EventBus implements IPlayerListener {
    MuxStats muxStats;

    PlayerListener(Context ctx, ExoPlayer player, String playerName, CustomerPlayerData customerPlayerData, CustomerVideoData customerVideoData) {
        super();
        this.player = new WeakReference<>(player);
        state = PlayerState.INIT;
        MuxStats.setHostDevice(new MuxDevice(ctx));
        MuxStats.setHostNetworkApi(new MuxNetworkRequests());
        muxStats = new MuxStats(this, playerName, customerPlayerData, customerVideoData);
        addListener(muxStats);
    }

    // When the player begins trying to play back the video
    public void onPlay() {
        dispatch(new PlayEvent(null));
    }

    // Call from onPlayerError() with parameters appropriate to your integration. Dispatches an error event that Mux will categorize as a fatal playback error by default
    public void onPlaybackError(String errorCode, String errorMessage, String errorContext) {
        PlayerData playerData = new PlayerData();
        playerData.setErrorCode(errorCode);
        playerData.setErrorMessage(errorMessage);

        ErrorEvent errorEvent = new ErrorEvent(playerData, errorContext);

        dispatch(errorEvent);
    }

    // Call from onPlayerError() with parameters appropriate to your integration. Dispatches an error event that Mux will categorize as a warning by default
    public void onPlaybackWarning(String errorCode, String errorMessage, String errorContext) {
        PlayerData playerData = new PlayerData();
        playerData.setErrorCode(errorCode);
        playerData.setErrorMessage(errorMessage);

        ErrorEvent errorEvent = new ErrorEvent(playerData, errorContext, ErrorSeverity.ErrorSeverityWarning);

        dispatch(errorEvent);
    }

    // Call from onPlayerError() with parameters appropriate to your integration. Dispatches an error event that Mux will categorize as a business exception by default
    public void onBusinessException(String errorCode, String errorMessage, String errorContext) {
        PlayerData playerData = new PlayerData();
        playerData.setErrorCode(errorCode);
        playerData.setErrorMessage(errorMessage);

        // This method does not set an explicit error severity, see below for an example method that does.
        ErrorEvent errorEvent = new ErrorEvent(playerData, errorContext);
        errorEvent.setIsBusinessException(true);

        dispatch(errorEvent);
    }

    // Call from onPlayerError() with parameters appropriate to your integration. Dispatches an error event that Mux will categorize as a business exception by default
    public void onBusinessException(String errorCode, String errorMessage, String errorContext, ErrorSeverity severity) {
        PlayerData playerData = new PlayerData();
        playerData.setErrorCode(errorCode);
        playerData.setErrorMessage(errorMessage);

        ErrorEvent errorEvent = new ErrorEvent(playerData, errorContext, severity, true);

        dispatch(errorEvent);
    }
}
```

## Tearing Down

There is no `destroy` event for the core Java SDK. Instead, `MuxStats` exposes a `release` method that cleans up all tracking and releases all references held within the core library. Call this method when you release your player instance; after calling `release`, the `muxStats` instance is unusable.

<LinkedHeader step={steps[4]} />

### Current release

#### v8.9.1

Fixes:

* Remove long-unused dependency on commons-math. If you were getting commons-math from us, you'll need to add a dependency for it

### Previous releases

#### v8.9.0

New:

* Track ranges of content played during a view via `video_playback_range` metric

Improvements:

* NetworkChangeEvent no longer accepts null values. If the network disconnects, use `no_connection`
* `null` values from `IDevice.getNetworkConnectionType` are now coerced to `"no_connection"`

#### v8.8.2

Fixes:

* count cumulative playing time after `seeked` events

#### v8.8.1

Fixes:

* start cumulative time tracking on rebufferend if player wasn't paused

#### v8.8.0

New:

* Add `MuxStats.networkChange()` API for tracking network connectivity changes during a view

#### v8.7.0

Updates:

* Allow error codes as String values
* Add overloads of `MuxStats.error()` which take code, message, error context and flags directly
* Add `ErrorSeverity.WARNING` and `ErrorSeverity.FATAL`. Deprecate `ErrorSeverity.errorSeverityWarning` and `ErrorSeverity.erorrSeverityFatal`

#### v8.6.0

Updates:

* Add `playbackModeChange` API methods to `MuxStats`. You can specify your own arbitrary playback modes, or use one of the presets in `PlaybackMode`
* Add cumulative ad playing time and total content time metric tracking. The metrics track the "wall-clock" time spent with video playing during a view, and exclude time spent buffering or paused.
* Add `AdData.adType` for indicating whether an ad is a preroll, midroll, or postroll

#### v8.5.3

Fixes:

* do not dedupe error code and message if they were included

#### v8.5.2

Improvements:

* Prevent overriding `mux_embed_version`, `mux_api_version`, and `mux_embed`
* Do not flush beacons for non-fatal `error` events

#### v8.5.1

Fixes:

* Un-deprecate `CustomerVideoData.videoCdn`

#### v8.5.0

New:

* Add `CdnChangeEvent`, which will be sent automatically if using Request Metrics and sending your `x-cdn` header

Fixes:

* fix: Beacons not sent in v8.4.2 of the SDK

#### v8.4.2

Fixes:

* fix: duplicate events and incorrect metadata when resuming after a long time
* fix: error severity not reported correctly

#### v8.4.1

Fixes:

* fix: Incorrect minified key for ViewerClientApplicationName and ViewerClientApplicationVersion

#### v8.4.0

Updates:

* Add `CustomerVideoData::videoCreatorId`

#### v8.3.0

Updates:

* Add new Standard Dimensions

#### v8.2.0

Updates:

* support 10 more custom dimensions

#### v8.1.4

Fixes:

* fix: Always send metadata on 'renditionchange'
* fix: resolve conflicting UUIDs in rare cases

#### v8.1.3

Fixes:

* fix: flush beacons when ad breaks end

#### v8.1.2

Fixes:

* fix: end rebuffering on seek

#### v8.1.1

Fixes:

* fix: verbose debug logging can break beacon dispatch
* fix: seeking should end any active rebuffering

#### v8.1.0

Updates:

* update: expose `enable` and `disable` methods for pausing and resuming data collection

#### v8.0.2

Improvements:

* size metrics are now ignored if values are set to -1

#### v8.0.1

Fixes:

* fix: reported application hang due to event handling

#### v8.0.0

Improvements

* Error events can be categorized with warning or fatal severity levels
* Error events can be categorized as business exceptions
* An error translator can be configured to extend or customize the Core SDK error handling logic

Fixes:

* Player error details such as error code, error context, error message, error severity, and whether the error is a business exception are only sent to Mux when an error event is dispatched.
* Player error details (same as listed above) are no longer deduplicated and are explicitly included with each error event sent to Mux.
* The SDK continues to track watch time after an error event is dispatched based on player playhead progression. To explicitly indicate that watch time should no longer be tracked after an error during a playback session please dispatch a `ViewEnd` event.

#### v7.13.2

Fixes:

* Update json.org to 20231013

#### v7.13.1

Fixes:

* Update json.org to 20230227

#### v7.13.0

Fixes:

* fix issue where seeking time would be included in time-to-first-frame if user seeked before playback started

#### v7.12.0

Updates:

* add `update()` method for `CustomerData`

#### v7.11.0

New:

* Support video source codec in `IPlayerListener`

#### v7.10.0

New:

* Add ability to set lower-priority video data, for auto-detected metadata

#### v7.9.1

Improvements:

* Additional improvements in reliability during large events

#### v7.9.0

Improvements:

* Added `drmType` to `CustomerViewData` so customers can override it
* Added `x-litix-shard-id` header populated with device ID

#### v7.8.0

New:

* Add a field to `CustomOptions` for controlling beacon update interval. Very few cases require `longBeaconDispatch`.

#### v7.7.4

Fixes:

* Fix Beacon interval incorrectly being 10 minutes

#### v7.7.3

Improvements:

* Beacon interval changed from 5s to 10s

#### v7.7.2

Improvements:

* Fix Ad metadata not being reported properly

#### v7.7.0

New:

* Add `AdEvent` with `AdData` to represent data about individual, non-preroll ad events during play


# Mux playback events
This guide is a canonical list of playback events. This is useful if building a custom integration.
## Playback events overview

The main component of a player integration revolves around events. Most players trigger or fire events for the common playback events such as `play`, `pause`, `error`, and others, but these events are typically named differently on different platforms. The primary purpose of each player integration is to translate these events into the events that the core libraries expect.

Each language library has a slightly different naming scheme for the events, but they should be in line with each other aside from slight differences in syntax.

Optional events provide additional detail in tracking views, but are not necessarily required for base Quality of Experience tracking within a player.

Each event occurrence contains a common set of time values that are submitted to the server and contained in the view exports:

| Field | Description |
|-------|-------------|
| `viewer_time` | The wall clock time from the device when the event occurred, in milliseconds since unix epoch |
| `playback_time` | The playhead position at the time of the event, in milliseconds |
| `event_time` | The wall clock time on the server when the event is received, in milliseconds since unix epoch (populated in the view exports, not submitted with the event) |

## General playback events

The main playback events that Mux SDKs expect are defined as follows:

### `playerready`

Signals that the player initialization process has completed, and the player is ready for interaction. A video may or may not have been loaded in the player; this event refers specifically to the player having completed any tasks in its initial startup.

### `viewinit`

Signals that a new view is beginning and should be recorded. This must be emitted before any other playback events. Note that this should only be emitted for the first view within a player; for a change of videos within the same player, `videochange` should be used.

This event is only required for building integrations using the Objective-C Core SDK. This is handled automatically as a side effect of initialization of the JavaScript and Java Core SDKs.

### `videochange`

Signals that the video being played in the player has changed. This must be called if a new video is loaded within the same player. The event should be fired immediately after the new video has been given to the player.

This event is only available within the JavaScript Core SDK. For Objective-C, see the section on changing the video in [Custom Objective-C Integration](/docs/guides/data-custom-objectivec-integration). For Java, there are helper methods for this exposed within `MuxStats`.

### `play`

Signals that the player is beginning its attempt to play back the video. The video is not yet showing on the screen (or moving forward in the case of a resume). The buffer may be empty or full depending on the pre-loading strategy.

For the HTML5 video element, this correlates to the [play](https://developer.mozilla.org/en-US/docs/Web/Events/play) event on the video element.

For ad playback, once resuming from the ad break, the `play` event should be fired immediately after the `adbreakend` event, assuming the player will continue playing content after the ad break without interaction from the viewer.

### `playing`

Signals that the video is now actually playing. The buffer is full enough that the player has decided it can start showing frames. In other words, this is when the first moving frame is displayed to the end user.

For the HTML5 video element, this correlates to the [playing](https://developer.mozilla.org/en-US/docs/Web/Events/playing) event on the video element.

### `pause`

Signals that playback has been intentionally delayed, either by the viewer or by the player (e.g. starting an ad).

For the HTML5 video element, this correlates to the [pause](https://developer.mozilla.org/en-US/docs/Web/Events/pause) event on the video element.

In the case of playback breaking to play an ad, the `pause` event should be fired just before the `adbreakstart` event is fired.

### `timeupdate`

Signals that playback has advanced some non-zero amount forward. This event should be emitted *at least* every 250 milliseconds, but can be sent more often.

For the HTML5 video element, this correlates to the [`timeupdate`](https://developer.mozilla.org/en-US/docs/Web/Events/timeupdate) event on the video element.

If the `timeupdate` event is not sent, the integration must provide the ability to retrieve the playhead time in the player callback for the SDK; see each language SDK for details on this callback. In all SDKs, emitting the `timeupdate` event is preferred, even if the playhead time callback is provided. On Java platforms, you must also provide the `getCurrentPosition` callback within the `IPlayerListener` interface, even when emitting `timeupdate`.

If the `timeupdate` event is sent, you must include the playhead position, in milliseconds, via the following mechanisms:

* JavaScript: provided as `player_playhead_time` key within the data object passed along with `timeupdate` to the call to `emit`.
* Java: provided via `PlayerData.setPlayerPlayheadTime` on the `PlayerData` emitted with the event.
* Objective-C: provided via `[MUXSDKPlayerData setPlayerPlayheadTime: time]` in the `MUXSDKPlayerData` object emitted with the event.

For integrations using the Objective-C Core SDK, this event is required to be sent.
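To make the playhead mechanics concrete, here is a minimal JavaScript sketch of emitting `timeupdate` with `player_playhead_time`. The `emit` function is a local stand-in for the core SDK's emit call, and the player callback name is hypothetical; only the event name and data key come from this guide.

```javascript
// Collect emitted events locally; in a real integration, `emit` would be
// the core SDK's emit call.
const events = [];
function emit(type, data) {
  events.push({ type, data });
}

// Hypothetical hook: suppose the player reports its playhead in seconds.
function onPlayerTimeUpdate(playheadSeconds) {
  emit("timeupdate", {
    // Mux expects the playhead position in milliseconds.
    player_playhead_time: Math.round(playheadSeconds * 1000),
  });
}

onPlayerTimeUpdate(12.345);
```

Wire `onPlayerTimeUpdate` to a player callback that fires at least every 250 milliseconds so the cadence requirement above is met.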

### `seeking`

Signals that the user has attempted to seek forward or backward within the timeline of the video.

For the HTML5 video element, this correlates to the [seeking](https://developer.mozilla.org/en-US/docs/Web/Events/seeking) event on the video element.

### `seeked`

Signals that the player has the video data for the new playback position, and is ready to immediately start playing at this new position.

For the HTML5 video element, this correlates to the [`seeked`](https://developer.mozilla.org/en-US/docs/Web/Events/seeked) event on the video element.

### `rebufferstart`

Signals that the player has stopped playing back content when it is expected that playback should be progressing.

<Callout type="info">
  For JavaScript and Objective-C/Swift integrations, this event is internal to the core library and must not be emitted by the player integration.

  For Java integrations, after v6.0.0 of the core library, this event must be emitted by the player integration.
</Callout>

### `rebufferend`

Signals that the player has resumed playing back content after playback previously stalled.

<Callout type="info">
  For JavaScript and Objective-C/Swift integrations, this event is internal to the core library and must not be emitted by the player integration.

  For Java integrations, after v6.0.0 of the core library, this event must be emitted by the player integration.
</Callout>

### `error`

Signals an error that will be associated with the view. Error severity can be set to fatal (i.e. not recoverable) or warning. Errors are assumed to be playback failures within Mux by default, but can be categorized as business exceptions on either the client or the server. [See the Error Categorization guide for more details](/docs/guides/error-categorization).

For the HTML5 video element, this correlates to the [error](https://developer.mozilla.org/en-US/docs/Web/Events/error) event on the video element.

This specific event should be accompanied by the following metadata:

| Field | Description |
|-------|-------------|
| `player_error_code` | An integer that provides a category of the error. You should not send a distinct code for each possible error message, but rather group similar errors under the same code. For instance, if your library has two different conditions for network errors, both should have the same `player_error_code` but different messages. |
| `player_error_message` | Details about the error encountered, though should remain relatively generic. It shouldn't include a full stack trace, for instance, as this field is used to group like errors together. |
| `player_error_context` | Used to provide instance-specific details for the error, such as stack trace, segment number, or URL. |
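As a sketch of the grouping rule above: two different network failures share one `player_error_code`, the generic message goes in `player_error_message`, and instance-specific details go in `player_error_context`. The `emit` function and the code value are hypothetical stand-ins.

```javascript
// Local stand-in for the core SDK's emit call.
const events = [];
function emit(type, data) { events.push({ type, data }); }

// Hypothetical grouping code shared by all network-related failures.
const NETWORK_ERROR_CODE = 2;

function onNetworkError(message, failedUrl) {
  emit("error", {
    player_error_code: NETWORK_ERROR_CODE,
    player_error_message: message,            // generic, used for grouping
    player_error_context: `url=${failedUrl}`, // instance-specific detail
  });
}

onNetworkError("Segment request failed", "https://example.com/seg1.ts");
```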

### `ended`

Signals that the current video has played to completion.

For the HTML5 video element, this correlates to the [ended](https://developer.mozilla.org/en-US/docs/Web/Events/ended) event on the video element.

### `renditionchange` (optional)

Signals that the current rendition that is actively being played has changed. Note that this event should be triggered when the playing rendition changes, not necessarily when the player logic has started requesting a different rendition.

This specific event should be accompanied by the following metadata:

| Field | Required | Description |
|-------|----------|-------------|
| `video_source_bitrate` | Required | The current rendition's bitrate (combined video and audio), in bits per second (bps) |
| `video_source_width` | Optional for web/Java, Required for iOS | Optional for web and Java integrations, assuming `video_source_width` is returned by the appropriate callback (e.g. `getStateData` for web) |
| `video_source_height` | Optional for web/Java, Required for iOS | Optional for web and Java integrations, assuming `video_source_height` is returned by the appropriate callback (e.g. `getStateData` for web) |
| `video_source_codec` | Optional for web/Java, Required for iOS | |
| `video_source_fps` | Optional for web/Java, Required for iOS | |
| `video_source_name` | Optional for web/Java, Required for iOS | |
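A minimal sketch of emitting `renditionchange` with the required metadata: the field names come from the table above, while `emit` and the shape of the `rendition` object are hypothetical.

```javascript
// Local stand-in for the core SDK's emit call.
const events = [];
function emit(type, data) { events.push({ type, data }); }

// Fire when the *playing* rendition changes, not when the ABR logic
// merely starts requesting a different one.
function onRenditionChange(rendition) {
  emit("renditionchange", {
    video_source_bitrate: rendition.bitrate, // combined audio+video, in bps
    video_source_width: rendition.width,
    video_source_height: rendition.height,
  });
}

onRenditionChange({ bitrate: 2_400_000, width: 1280, height: 720 });
```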

### `orientationchange` (optional)

Signals that the device orientation has changed during the view. On most platforms this information is not available directly to the player SDK, so the customer implementation notifies the Mux SDK when the orientation changes, and Mux fires an event based on that notification.

This specific event should be accompanied by the following metadata:

* `viewer_device_orientation`: The device's orientation after the change, expressed as an `(x, y, z)` coordinate system. The most common orientations are `(0,0,90)` for portrait and `(0,0,0)` for landscape.

### `playbackmodechange` (optional)

Signals that the mode of playback has changed. You can use a defined preset or a custom value based on your playback offering.

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `playbackmode` | string | Required | The name of the playback mode. Presets include `standard`, `inline`, `fullscreen`, `pip`, `miniplayer`, and `background`. You can also pass a custom string. |
| `playbackmodedata` | string (JSON) | Optional | An optional JSON-encoded string containing metadata that can be used for more detailed analysis. Non-JSON strings are ignored. |
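Because `playbackmodedata` must be a valid JSON string, serialize any structured metadata rather than passing an object. A hedged sketch, where `emit` and the metadata keys are hypothetical:

```javascript
// Local stand-in for the core SDK's emit call.
const events = [];
function emit(type, data) { events.push({ type, data }); }

function onPlaybackModeChange(mode, detail) {
  emit("playbackmodechange", {
    playbackmode: mode,                    // preset or custom string
    playbackmodedata: JSON.stringify(detail), // must be valid JSON, as a string
  });
}

onPlaybackModeChange("pip", { trigger: "user" });
```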

### `heartbeat`

Internal event that is used to provide periodic updates on the playback state, while the player is not paused. Each core library emits heartbeat events (`hb`) automatically, and custom integrations should not need to emit this.

### `viewend`

Internal event that is used to signal the end of a view tracked by Mux. Each core library emits the `viewend` event automatically as a result of either tearing down the SDK or changing the video (`videochange`). Custom integrations do not need to emit this manually.

## Ad Events

For players that support ad playback, the following events are expected. If you do not provide these events, playback will still be monitored, but there will not be ad-specific metrics or knowledge of ads vs content.

These events require additional data to be provided. See [Building a Custom Integration](/docs/guides/build-a-custom-data-integration).

### `adrequest` (optional)

Signals that an ad request is about to be made, or was just made but the response has not been received.

In the process of the player retrieving an ad payload, multiple `adrequest` and `adresponse` events may be fired (either due to waterfall, or for an ad break that has multiple ads). In the case that these requests are made in parallel, the player integration must send an `ad_request_id` in the data along with each `adrequest` and `adresponse` event, so that Mux can match them up correctly.
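A sketch of pairing parallel requests and responses with a shared `ad_request_id`; the `emit` stand-in and the id-generation scheme are hypothetical, but the event names and the `ad_request_id` key come from this guide.

```javascript
// Local stand-in for the core SDK's emit call.
const events = [];
function emit(type, data) { events.push({ type, data }); }

let nextRequestId = 0;

// Fire `adrequest` just before making the network call, and reuse the
// same `ad_request_id` on the matching `adresponse` so Mux can pair them.
function requestAd(adTagUrl) {
  const ad_request_id = String(++nextRequestId); // hypothetical id scheme
  emit("adrequest", { ad_request_id, ad_tag_url: adTagUrl });
  return ad_request_id;
}

function onAdResponse(ad_request_id, adTagUrl) {
  emit("adresponse", { ad_request_id, ad_tag_url: adTagUrl });
}

const id = requestAd("https://ads.example.com/vast.xml");
onAdResponse(id, "https://ads.example.com/vast.xml");
```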

### `adresponse` (optional)

Signals that a response was received from the ad server.

In the process of the player retrieving an ad payload, multiple `adrequest` and `adresponse` events may be fired (either due to waterfall, or for an ad break that has multiple ads). In the case that these requests are made in parallel, the player integration must send an `ad_request_id` in the data object along with each `adrequest` and `adresponse` event, so that Mux can match them up correctly.

The `adresponse` event can only be fired by the player integration if the `adrequest` events are fired as well.

### `adbreakstart`

Signals that an ad break has begun. This coincides with the playback of the video being paused in order to display the ads at the current position. This event should come immediately after the `pause` event is fired due to attempting to play back an ad break, and before any `adplay`, `adplaying`, `adpause`, or `adended`.

The `adbreakstart` event may come before, during, or after the `adrequest`/`adresponse` events, depending on the player’s configuration for making ad requests.

### `adplay`

Signals that the player is beginning its attempt to play back an individual advertisement video. The ad is not yet showing on the screen (or moving forward in the case of a resume). The buffer may be empty or full depending on the pre-loading strategy.

This event is the ad-specific equivalent of `play`.

### `adplaying`

Signals that an advertisement is now actually playing. The buffer is full enough that the player has decided it can start showing frames for the ad.

This event is the ad-specific equivalent of `playing`.

### `adpause`

Signals that playback of an advertisement has been intentionally delayed, either by the viewer or by the player (e.g. user pressing pause on the ad player controls).

This event is the ad-specific equivalent of `pause`.

### `adfirstquartile` (optional)

Signals that the current advertisement has progressed past the first quartile in playback. This event should coincide with the point in time that the ad integration would fire the `firstQuartile` ad tracking beacon (in VAST terminology).

### `admidpoint` (optional)

Signals that the current advertisement has progressed past the midpoint in playback. This event should coincide with the point in time that the ad integration would fire the midpoint ad tracking beacon (in VAST terminology).

### `adthirdquartile` (optional)

Signals that the current advertisement has progressed past the third quartile in playback. This event should coincide with the point in time that the ad integration would fire the `thirdQuartile` ad tracking beacon (in VAST terminology).

### `adended`

Signals that the advertisement has played to completion.

This event is the ad-specific equivalent of `ended`.

### `adbreakend`

Signals that all ads in the ad break have completed, and playback is about to resume on the main content. This event should come immediately after the last `adended` event in the ad break, and before the `play` event that signals the main content is resuming.

There may be multiple `adplay`/`adended` combinations within a single ad break.
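For illustration, the expected ordering around a single-ad mid-roll break can be sketched as a flat list of event names, using a local stand-in for the SDK's emit call:

```javascript
// Expected ordering of events around a single-ad mid-roll break.
const events = [];
const emit = (type) => events.push(type);

emit("pause");        // content pauses just before the break
emit("adbreakstart");
emit("adplay");       // adplay/adplaying/adended may repeat per ad
emit("adplaying");
emit("adended");
emit("adbreakend");
emit("play");         // content resumes without viewer interaction
emit("playing");
```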

### `aderror`

Signals that an error has occurred that relates to the ad break currently in play or the ad request/response.

## Bandwidth throughput events

Like the ad-specific events, these events are not required. However, if you include any of them, you must include all of them. Each of these events refers to a network request made for some component of the media playback. This includes, but depending on your exact configuration may not be limited to:

* manifests and content segment requests for HLS playback
* manifests, init fragment, and content fragment requests for DASH playback

These events should *not* be fired for ad requests and require additional data to be sent along with them. See [Network Request Data](#network-request-data).

### `requestcompleted`

Signals that a network request for a piece of content returned successfully.

### `requestfailed`

Signals that a network request for a piece of content returned unsuccessfully.

### `requestcanceled`

Signals that a network request for a piece of content was aborted before it could return (either successfully or unsuccessfully).

## Accompanying Data

Each core SDK has its own mechanism for providing data along with each event. This data is used to provide information such as player state (e.g. paused or playhead time), and potentially to override the data that is pulled automatically from the player.

Most data is retrieved automatically, and you will not need to provide any accompanying data. The notable exceptions are ad information and network request information.

See the following guides for each library on how to provide additional data with each event.

## Ad-Specific Data

The following data should be sent while emitting the ad-specific events, where possible.

### `ad_type`

The type of ad used during playback: `preroll`, `midroll`, `postroll`

### `ad_asset_url`

The URL for the current ad being played. For example, in a VAST response, this would correspond with the MediaFile URL that is being played.

Note: this data should only be included alongside `adplay`, `adplaying`, `adpause`, `adended`, `adfirstquartile`, `admidpoint`, `adthirdquartile` events, as they are the only events that correlate with the ad asset that is being played.

### `ad_tag_url`

The URL for the current ad tag/ad request being made. For example, this could be the URL that is expected to return a VMAP or VAST document detailing what ad(s) to play.

Note: this data should only be included alongside `adrequest` and `adresponse` events, as those are the only events that correlate with the ad tag URL being used currently.

### `ad_creative_id`

The Creative Id of the ad. This usually is the Ad-Id of the selected creative in the VAST response.

### `ad_id`

The Id of the ad. This usually is unique in the Ad Provider's system and specified in the VAST response.

### `ad_universal_id`

The Universal Id of the ad. This usually is globally unique for the ad across all Ad Providers.

Note: the three fields above can be included in all ad events except for the `adrequest` and `adresponse` events.

## Network Request Data

The following data should be sent along with any of the network events (`request*`).

### `request_start`

Timestamp that the request was initiated, in milliseconds since the Unix epoch.

Include alongside: `requestcompleted`, `requestfailed`, `requestcanceled`

### `request_bytes_loaded`

The total number of bytes loaded as part of this request.

Include alongside: `requestcompleted`

### `request_response_start`

Timestamp that the response to the request began (i.e. the first byte was received), in milliseconds since the Unix epoch.

Include alongside: `requestcompleted`

### `request_response_end`

Timestamp that the response was fully received (i.e. the last byte was received), in milliseconds since the Unix epoch.

Include alongside: `requestcompleted`
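Putting the three timestamps and the byte count together, a `requestcompleted` emission might be sketched as follows; `emit` and the wrapper function are hypothetical, while the data keys come from this guide. All timestamps are milliseconds since the Unix epoch.

```javascript
// Local stand-in for the core SDK's emit call.
const events = [];
function emit(type, data) { events.push({ type, data }); }

// Report a successful segment fetch with its timing and size.
function reportCompleted({ start, responseStart, responseEnd, bytes }) {
  emit("requestcompleted", {
    request_start: start,                  // request initiated
    request_response_start: responseStart, // first byte received
    request_response_end: responseEnd,     // last byte received
    request_bytes_loaded: bytes,
  });
}

reportCompleted({
  start: 1700000000000,
  responseStart: 1700000000050,
  responseEnd: 1700000000250,
  bytes: 512_000,
});
```

The gap between `request_start` and `request_response_start` reflects time to first byte, and the gap between `request_response_start` and `request_response_end` (together with `request_bytes_loaded`) reflects download throughput.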

### `request_type` (optional but recommended)

The type of content being requested. Specifying `video` as the `request_type` for video segments is recommended to ensure CDN tracking accuracy. One of the following:

| Type | Description |
|------|-------------|
| `manifest` | Used when the request is for a master or rendition manifest in HLS, or a DASH manifest. |
| `video` | Used when the request is for a video-only segment/fragment |
| `audio` | Used when the request is for an audio-only segment/fragment |
| `video_init` | Used when the request is for the video init fragment (DASH only) |
| `audio_init` | Used when the request is for the audio init fragment (DASH only) |
| `media` | Used when the type of content being requested cannot be determined, is audio+video, or is some other type. |
| `subtitle` | Used when the request is for subtitle or caption content |
| `encryption` | Used when the request is for a DRM encryption key |

Include alongside: `requestcompleted`, `requestfailed`, `requestcanceled`

### `request_hostname`

The hostname portion of the URL that was requested.

Include alongside: `requestcompleted`, `requestfailed`, `requestcanceled`

### `request_id` (optional)

The id for identifying the individual request. CDNs often include a request id in their responses which can be used for correlating requests across the player and CDN.

Include alongside: `requestcompleted`, `requestfailed`, `requestcanceled`

### `request_url` (optional)

The URL that was requested.

Include alongside: `requestcompleted`

### `request_labeled_bitrate` (optional)

Labeled bitrate (in bps) of the video, audio, or media segment that was downloaded.

Include alongside: `requestcompleted`

### `request_response_headers` (optional)

A map of response headers and their values. You should include whatever headers are available to the client, as this information may be used to determine routing of each request. The most important header, though, is the X-CDN header as described in [CDN Configuration for Request-Level Metadata](/docs/guides/enable-automatic-cdn-detection).

Include alongside: `requestcompleted`

### `request_media_duration` (optional)

The duration of the media loaded, in seconds. Should not be included for `requestcompleted` events for manifests.

Include alongside: `requestcompleted`

### `request_video_width` (optional)

For events with `media` or `video` `request_type`, the width of the video included in the segment/fragment that was downloaded.

Include alongside: `requestcompleted`

### `request_video_height` (optional)

For events with `media` or `video` `request_type`, the height of the video included in the segment/fragment that was downloaded.

Include alongside: `requestcompleted`

### `request_error`

The name of the error event that occurred. Note this is not the status code of the request itself, but rather something along the lines of `FragLoadError`.

Include alongside: `requestfailed`

### `request_error_code`

The response code of the request that spawned the error (e.g. 401, 400, 500).

Include alongside: `requestfailed`

### `request_error_text`

The message returned with the failed status code.

Include alongside: `requestfailed`

## Sample Sequence of Events

A sample sequence of events for an integration would look like the following:

| Event | Description |
|-------|-------------|
| `playerready` | |
| `viewinit` | When the video is about to be loaded in a player |
| `play` | When the user presses play to attempt playing back the video |
| `playing` | When the first frame of video is displayed |
| `timeupdate` | At least every 250 ms with progress of the playhead time |
| `pause` | When the viewer presses pause |
| `play` | When the viewer resumes playback |
| `playing` | When the first frame is displayed after resuming |
| `timeupdate` | |
| `ended` | When the video playback is complete |
| `viewend` | When the view is complete - e.g. the user is no longer attempting to watch the video |

At the end, if the viewer loads a new video into the player, a `videochange` event should be emitted instead of the `viewend` event, with the new video data.
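The sequence in the table above can be sketched as a flat list of event names, again using a local stand-in for the SDK's emit call:

```javascript
// The lifecycle from the table above, for a single view with one pause.
const events = [];
const emit = (type) => events.push(type);

emit("playerready");
emit("viewinit");   // video about to be loaded in the player
emit("play");       // user presses play
emit("playing");    // first frame displayed
emit("timeupdate"); // at least every 250 ms while progressing
emit("pause");      // viewer presses pause
emit("play");       // viewer resumes
emit("playing");    // first frame after resuming
emit("timeupdate");
emit("ended");      // playback complete
emit("viewend");    // view is complete
```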


# Export Monitoring data for integration
Understand how to export your monitoring data into your own system for processing and taking action.
<Callout type="info">
  The Monitoring Samples Stream is only available on **Mux Custom Media** plans. Learn more about [Mux Data Plans](https://data.mux.com/pricing) or [contact support](https://mux.com/support).
</Callout>

Mux provides a mechanism for customers to subscribe to a near-realtime, video view-level data stream of events and measurements related to the quality of service for customers with a Mux Data integration.

This can be used to identify service-level problems, such as widespread rebuffering or playback failures. It can also be used to integrate Mux data with multi-CDN switching platforms or alerting systems, or to construct your own version of the Mux Data Monitoring dashboard.

## Monitoring Sample Messages

A single Monitoring Samples payload may contain multiple samples. Each sample corresponds to a single active video view, with a different view id per sample. The sample can contain multiple records, where each record contains metrics for a point in time for the video view. A record specifies a time period and metrics measured over that time period. All metrics inside a single record will apply to the time range implied by the `start` timestamp field plus the `duration_ms` field. If the duration field is zero, the record includes instantaneous metrics. A record MUST contain at least one metric.

<Image sm src="/docs/images/monitoring-stream-format.png" width={640} height={480} />
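The record timing semantics above can be sketched in a few lines; the `recordWindow` helper and the record object shape are hypothetical, but the `start` and `duration_ms` fields come from the description above.

```javascript
// A record's metrics apply to [start, start + duration_ms], where `start`
// is a millisecond timestamp. A duration of zero marks an instantaneous
// metric (e.g. PLAYBACK_ERROR).
function recordWindow(record) {
  return {
    start: record.start,
    end: record.start + record.duration_ms,
    instantaneous: record.duration_ms === 0,
  };
}

const w = recordWindow({ start: 1700000000000, duration_ms: 30000 });
```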

## Metrics Included

### START\_LATENCY\_MS

Also known as Time To First Frame (TTFF). This is Mux's Video Startup Time, which measures the time the viewer waits for the video to play after the page is loaded and the player is ready.

### EXIT\_BEFORE\_VIDEO\_START

Instantaneous event metric that is sent when a playback drop is detected. This is sent when Mux has detected an intent to play but playback never begins. Inherently has a delay (up to 1 minute) while waiting to detect play start. The value field contains the playhead time of the player at the time of exit, in milliseconds; typically this value is 0 for videos starting from the beginning. This is NOT sent when playback is halted due to a PLAYBACK\_ERROR.

### WATCH\_DURATION\_MS

Watch Duration is the amount of time in milliseconds that viewers spend attempting to watch a video. This includes all time spent waiting for video to load, including rebuffering and seeking. It does not include time spent paused.

### SEEK\_LATENCY\_MS

The Seek Latency metric measures the average amount of time that viewers wait for the video to start playing again after seeking to a new time. Seeking is any time the player is asked to jump backward or forward to a new time in the video, outside of normal playback.

### REBUFFER\_DURATION\_MS

Rebuffer Duration is the amount of time in milliseconds that viewers spend rebuffering during the record window.

### REBUFFER\_COUNT

Rebuffer Count is the number of independent rebuffer events encountered over the record time window.

### PLAYBACK\_ERROR

Instantaneous event metric that is sent when playback has failed due to a fatal technical error. The value is the player playhead timestamp in milliseconds when the error occurred. Non-fatal technical errors and business errors are not included in the Monitoring stream.

## Continuously stream data

Mux Data supports streaming the Monitoring Samples to an Amazon Kinesis Data Stream in your cloud account. Monitoring data is sent to the configured destination at 30-second intervals.

The samples stream data can be stored in your long-term storage for processing and aggregation. This method of access is most useful for customers who want real-time updates of the current performance that can be used for aggregations that inform real-time CDN switching, custom alerting, or internal NOC tools.

## Setting up a Monitoring Samples stream

Monitoring Samples streams are enabled by working with the Mux team; they are *not* currently configured in the **Streaming Exports** settings in your Mux dashboard. Generally, the steps for configuring realtime sample exports are as follows:

* Mux works with you to generate the AWS account details.
* You create the destination and security artifacts in AWS.
* You send the AWS ARNs to Mux.
* Mux enables real-time sample exports to your Kinesis stream in production and staging.

For more information on setting up an export, refer to the [Amazon Kinesis Data Streams](/docs/guides/export-amazon-kinesis-data-streams) setup guide.

## Message format

Messages are in either JSON format or Protobuf (proto2) encoding. You can choose between the two formats when setting up the data stream with Mux support.

For Protobuf encoding, every message uses the `com.mux.realtime.Samples` message type defined in the export Protobuf spec, which is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/video_view). Use the latest Protobuf spec when creating schemas or generating code.

## Monitoring Samples message format

The protobuf definition for the Monitoring Samples stream is available in the [mux-protobuf repository](https://github.com/muxinc/mux-protobuf/tree/main/monitoring_samples). Please subscribe to this repository for updates to the protobuf definition.

The JSON format payload contains the same fields as the protobuf-encoded format.

## Versioning

### Backward compatibility

The schema provided by Mux Data is backward compatible: each schema version is guaranteed to keep working through future upgrades, so customers do not need to worry about breaking changes.

### When to upgrade the schema?

When Mux adds new fields or metrics to the Monitoring Samples stream, we will upgrade the schema version. Without taking any action, new fields will automatically be included in the data stream. For JSON formatted data, the new fields will be included in the data objects as they are added to the stream. For proto encoded streams, the new fields will be available once you upgrade to the latest [proto definition](https://github.com/muxinc/mux-protobuf/tree/main/monitoring_samples).


# Ensure privacy compliance with Mux Data
Collect the information you need to be successful while ensuring privacy for your viewers and complying with GDPR and other privacy regulations.
## Is Mux Data GDPR/CCPA/VPPA compliant and privacy preserving?

Mux takes privacy expectations seriously; we are compliant with the Video Privacy Protection Act (VPPA), California Consumer Privacy Act (CCPA), and General Data Protection Regulation (GDPR). We also make available a [detailed Data Processing Addendum (DPA)](https://mux.com/dpa/) that details the measures we've taken to achieve compliance. Mux has attested to our data privacy protections and has been certified under the [Data Privacy Framework (DPF)](https://www.dataprivacyframework.gov/) program.

Mux works to ensure the privacy of viewers while providing development teams using Mux Data with the visibility they need to track audience engagement and their viewers' quality of experience. We don't believe that privacy and insights should be a trade-off for developers or video viewers.

We also want to ensure that the metadata that developers send to us is also properly anonymized in order to reduce the possibility of personally identifying a viewer with their activity. We strongly urge developers using Mux Data to provide an anonymized viewer id - using a non-personally identifiable id from a system only the customer has access to - that is meaningful to the developer but not to Mux as part of the view metadata.

## Do you track sensitive personally identifiable information?

No. Mux Data does not store information about the user such as email, name, or built-in device identifier (such as the Id for Analytics on iOS). For more information about the data we store - which doesn't include personal viewer information - please reference our [Data Processing Addendum](https://mux.com/dpa/).

## Does Mux have a Data Protection Addendum (DPA)?

Yes. The Mux Data Protection Addendum (DPA) is available at https://mux.com/dpa/ .

## Does Mux participate in the Data Privacy Framework?

Yes, Mux participates in the EU-U.S. [Data Privacy Framework (DPF)](https://www.dataprivacyframework.gov/), having self-certified our compliance. The DPF enables lawful transfers of personal data from the EU to the U.S. and is designed to ensure strong privacy protections. As a U.S.-based company handling data from international customers, our participation in the DPF underscores our commitment to data privacy and provides reassurance that we meet the standards required under EU law. You can find more detailed information and view Mux’s certification on the official DPF website at https://www.dataprivacyframework.gov/list .

## How do I make a GDPR or CCPA data erasure request?

Mux Data does not knowingly store personally identifiable information, but GDPR and CCPA data erasure requests can be sent to gdpr@mux.com. This email is monitored, and you will receive a response from us confirming that the viewer's data is being removed, if any can be identified.

## Is it possible to keep my viewers' IP address data in Europe?

Yes. Mux Data has an ingest location in the European Union (EU) that can be used for processing video views. The full IP addresses will only be processed at our location in Germany and the post-processed view data, including corresponding truncated IP address with the last octet removed, will be sent to the United States for aggregation and reporting. For more information on using the EU location, please reach out to your sales contact or email sales@mux.com.

## What information does Mux Data collect?

Mux Data collects non-personally identifiable information about the viewer experience that allows you to track engagement and the quality of experience for your audience.

* IP address: We process a viewer's IP address in order to look up coarse location information and do bot detection. After processing, we pseudonymize the IP address by truncating it (to /24 for IPv4) and then we store only the pseudonymized value.
* Geographic location and Autonomous Systems Number (ASN): We generate coarse location information at the country and state-level from the IP address, but we do not collect fine grained latitude/longitude information nor do we access geo-location features of mobile devices.
* Viewer ID: We generate a unique, random identifier for a viewer that is used as a viewer id if none is provided by the developer implementing the Mux Data SDK. We do not associate these IDs with any activity other than the video views and we do not associate the id with any advertising profile data. Because we do not store identifiable information about viewers, we are not able to associate the video view history with a specific individual.
* Device information: Information about the device that is used to access video playback, including model, device type, operating system, and browser used.
* Details about video content watched: Metadata such as type of stream: live or VOD, video format, autoplay status, etc. A list of [additional metadata](/docs/guides/make-your-data-actionable-with-metadata) is available for reference and most metadata is optional, to be set by the developer implementing Mux Data.

## How long does Mux Data store viewership data?

Pseudonymized video view data is stored for up to 100 days and is then deleted from our systems.

## Is Mux Data appropriate for applications targeted to children?

Yes. Mux does not store personally identifiable data, use viewer data for advertising, or sell user identifiable data. The Mux Data and Mux Video SDKs can be used in applications that receive approval as children's apps on the app stores.

## What information is stored in Mux Data's HTTP cookies?

By default, Mux plugins for HTML5-based players use a cookie to track playback across subsequent page views in order to understand viewing sessions. This cookie includes information about the tracking of the viewer, such as an anonymized viewer ID that Mux generates for each user. None of this information is personally-identifiable, but you can disable the use of this cookie if desired. For example, if your site or application is targeted towards children under 13, you should disable the use of cookies. Please refer to the documentation for the specific Mux Data SDK you are using for info on how to disable cookies.

The cookie is set as a first party cookie on the domain of the website that is embedding the player and Mux Data SDK. For example, if the video player with Mux Data integrated is located on the page: `http://example.com/demo.html` the cookie will be set on the domain `example.com`. The cookies are only available on each individual customer's domain and cannot be used to track viewers across Mux customers.

The Mux Data cookie contains the following information:

* `mux_viewer_id`: a randomly generated viewer id that is used as the default anonymous Viewer ID.
* `msn`: random value used to decide if the viewer will be sampled (tracked) or not
* `sid`: randomly generated anonymous session id
* `sst`: the time the session started
* `sex`: the time at which the session will expire

## Do I need to ask permission to track on iOS when I use a Mux Data SDK in my app?

Mux does not access the Identifier for Advertisers (IDFA) in any SDK, nor does it use viewer data for advertising or advertising efficiency measurement so the Apple AppTrackingTransparency (ATT) framework does not require a tracking permission request to use the Mux SDK.

As of version 2.4.2 of the [Mux Data for AVPlayer SDK](https://github.com/muxinc/mux-stats-sdk-avplayer), the Identifier for Vendors (IDFV) is no longer used and the Mux Data SDK generates a random unique identifier on the device for the default Viewer ID. We do not sell the identifier data or attempt to track users across Mux customers.

As of version 3.6.1 of the [Mux Data for AVPlayer SDK](https://github.com/muxinc/mux-stats-sdk-avplayer) and versions 4.7.1 and 5.0.1 of the [Mux Data Objective-C Core SDK](https://github.com/muxinc/stats-sdk-objc), a privacy manifest file is included that satisfies [Apple’s requirements for third-party SDKs](https://developer.apple.com/support/third-party-SDK-requirements/) by outlining the privacy practices associated with their use. Customers who export data from Mux for additional processing may need to include additional privacy manifest entries with their application, subject to their specific practices.

## Does my app need to access a hardware id on Android when I use a Mux Data SDK?

As of version 2.4.1 of the [Mux Data for ExoPlayer SDK](https://github.com/muxinc/mux-stats-sdk-exoplayer), the Mux Data SDK generates a random unique identifier on the device for the default Viewer Id. We do not sell the identifier data or attempt to track users across Mux customers.


# Integrate a Data custom domain
Learn how to integrate a Data custom domain for beacon collection.
In this guide you will learn how to configure a custom domain used for submitting Mux Data beacons from SDK clients. Video view data will be sent to the specified custom domain rather than the default Mux domain.

You might choose to do this for a few reasons, such as allowing analytics traffic to bypass school or other network firewall restrictions (via a known domain), [zero-rating](https://en.wikipedia.org/wiki/Zero-rating) this traffic, or keeping performance tracking working when ad blockers are in place.

<Callout type="info">
  Custom Domains for Mux Data are available on select plans, such as **Mux Data Media**. [Reach out](mailto:help@mux.com) if you have any questions.
</Callout>

## 1. Point your custom domain to the Mux domain

After selecting your desired custom domain, you will need to create CNAME records with your DNS provider to alias the custom domain to a Mux-controlled one and allow Mux to issue TLS certificates for your selected domain. After providing your Customer Success Manager with the desired subdomain, Mux will provide you with the specific DNS records required to enable custom domains (including the value for `${KEY}` below). The records will have the following basic format:

```
subdomain.yourdomain.com 300 IN CNAME ${KEY}.customdomains.litix.io
_acme-challenge.subdomain.yourdomain.com 300 IN CNAME ${KEY}.validations.customdomains.litix.io
```

Notify Mux after these records have been created so we can issue TLS certificates to terminate beacon traffic sent to your selected custom domain. You will be notified by Mux when the domain has been successfully provisioned.

## 2. Configure your SDK integration to use a custom beacon domain

You can verify whether the custom domain is operational by using `curl` to query your domain:

```
$ curl https://subdomain.yourdomain.com -s -w "%{http_code}"
200
```

<Callout type="info">
  Make sure that you have upgraded to the latest versions of each SDK to ensure Custom Domains function correctly.
</Callout>

It may take some time for DNS records to propagate before this request will work. Once propagation is complete, configure your SDK integrations to use your custom domain by setting the `beaconCollectionDomain` property.

Depending on your SDK, you can set the value for `beaconCollectionDomain` in various ways.

```brightscript

m.mux = m.top.CreateNode("mux")
m.mux.setField("video", m.video)
muxConfig = {
  env_key: "ENV_KEY",
  beaconCollectionDomain: "CUSTOM_DOMAIN"
}
m.mux.setField("config", muxConfig)
m.mux.control = "RUN"

```

```javascript

mux.monitor('#my-player', {
  debug: false,
  beaconCollectionDomain: 'CUSTOM_DOMAIN',
  data: {
    env_key: 'ENV_KEY', //required
    // ...
  }
});

```

```kotlin

val customOptions = CustomOptions().apply {
  beaconCollectionDomain = "CUSTOM_DOMAIN"
}
muxStatsExoPlayer = exoPlayer.monitorWithMuxData(
  context = requireContext(),
  envKey = "YOUR_ENV_KEY_HERE",
  playerView = playerView,
  customerData = customerData,
  customOptions = customOptions
)

```

```objc

_playerBinding = [MUXSDKStats monitorAVPlayerViewController:_avplayerController 
                                             withPlayerName:@"mainPlayer" 
                                               customerData:customerData
                                     automaticErrorTracking:YES
                                     beaconCollectionDomain:@"CUSTOM_DOMAIN"];

```

```swift

let playerBinding = MUXSDKStats.monitorAVPlayerViewController(
  self,
  withPlayerName: "mainPlayer",
  customerData: customerData,
  automaticErrorTracking: true,
  beaconCollectionDomain: "CUSTOM_DOMAIN"
)

```



# Track autoplaying videos
Use this guide to understand best practices around autoplay and make sure your autoplaying videos are correctly tracked.
If you autoplay videos with any web-based player that uses the video element, make sure you read this guide so that Mux can accurately track your videos' startup time. This applies to video elements with the `autoplay` attribute and any time you call `play()` on a video element (this includes all HTML5 players like Video.js, JW Player, Shaka Player, etc.).

Browser vendors frequently change their policies for when autoplay is and is not allowed, so your application should be prepared to deal with both scenarios, and we want to make sure we're tracking your views and errors accurately.

# Increase your chance of autoplay working

There are a few conditions that will increase your chance of autoplay working.

* Your video is muted with the muted attribute.
* The user has interacted with the page with a click or a tap.
* (Chrome - desktop) The user’s [Media Engagement Index](https://developers.google.com/web/updates/2017/09/autoplay-policy-changes#mei) threshold has been crossed. Chrome keeps track of how often a user consumes media on a site and if a user has played a lot of media on this site then Chrome will probably allow autoplay.
* (Chrome - mobile) The user has added the site to their home screen.
* (Safari) Device is not in power-saving mode.

<Callout type="error" title="Autoplay will never work 100% of the time">
  Even if autoplay works when you test it out, you can never rely on it working for every one of your users. Your application must be prepared for autoplay to fail.
</Callout>

# Avoid the `autoplay` attribute

When you use the `autoplay` attribute (it looks like `<video autoplay>`), you (and Mux) have no way to know whether the browser blocked autoplay.

The issue is that when using the `autoplay` attribute, the `video` element sometimes does not send the `play` event when it should, which can result in incorrect Video Startup Time measurements.

To avoid this issue, call `video.play()` instead, which returns a promise that tells you whether playback started successfully. If autoplay worked, the promise resolves; if autoplay did not work, the promise rejects with an error. The great thing about this approach is that you can choose what to do with that error.

For example, you can report the error to your own error tracking tools or update the UI to reflect it. Note that Mux's custom error tracking is for tracking fatal errors, so you wouldn't want to report an autoplay failure to Mux, because it would then be counted as a fatal error.

```js
const video = document.querySelector('#my-video');
mux.monitor(
  /*
    see the web-integration-guide HTML5 to set this up
  */
);

video.play().then(function () {
  // autoplay was successful!
}).catch(function (error) {
  // do something if you want to handle or track this error
});
```
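
If autoplay fails, a common fallback is to retry muted playback, which browsers allow far more often. The sketch below uses a hypothetical `attemptAutoplay` helper (not part of any Mux SDK); the argument can be any object exposing a `muted` flag and a promise-returning `play()`, such as an HTMLVideoElement:

```javascript
// Hypothetical helper: try unmuted autoplay first, then retry muted.
async function attemptAutoplay(video) {
  try {
    await video.play();
    return 'playing'; // unmuted autoplay worked
  } catch (err) {
    video.muted = true; // mute and retry, which browsers usually allow
    try {
      await video.play();
      return 'playing-muted';
    } catch (err2) {
      return 'blocked'; // autoplay fully blocked; show a play button instead
    }
  }
}
```

Handle the `'blocked'` case in your own UI or error tracking rather than reporting it to Mux, for the reason described above.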

For further reading, see [the mux blog post](https://mux.com/blog/video-autoplay-considered-harmful/) about this topic.


# Mux Data FAQs
Answers to common questions about Mux Data.
## Why use Mux Data?

Mux Data uncovers four key dimensions of video quality of service: playback failures, startup time, rebuffering, and video quality. If your aim is broadcast-quality video streaming, Mux Data enables you to monitor these critical video metrics.

With each Mux Data metric, you can monitor and track what matters to your viewers. For example, [Overall Viewer Experience Score](/docs/guides/data-overall-viewer-experience-metric#overall-viewer-experience-score) is a metric that quickly summarizes your video platform's performance.

To get familiar with more of the features of Mux Data, see this [introduction to Mux Data](https://mux.com/data).

## What is the Mux Data Dashboard and what can you do with it?

The Mux Data Dashboard is an interface that lets you set filters and view graphs that monitor each specific key metric you are interested in. With each metric, you can monitor and track what matters to your viewers.

You can also see what is happening before your users do with [Anomaly Alerts](/docs/guides/setup-alerts#anomaly-alerts) and [Threshold Alerts](/docs/guides/setup-alerts#threshold-alerts). These alerts are easy to set up for prompt notifications, and you can check your dashboard to track down the source or sources.

You may want to apply [Filters](/docs/guides/setup-alerts#filters) to the alert definition to track only specific data. Finally, you can also use the <ApiRefLink href="/docs/api-reference/data/metrics/list-insights">List Insights</ApiRefLink> feature as a way of *impact sorting* which browsers, devices, regions, CDNs, players, ads, and videos are creating the most problems for your viewers.

## What is the Monitoring Dashboard and what can you do with it?

The [Mux Data Monitoring Dashboard](https://data.mux.com/real-time-monitoring/), previously called the Real-time Dashboard, allows you to monitor your critical metrics in one operational dashboard that updates in real-time. This lets you respond to major streaming issues quickly.

<Callout type="success">
  Read this blog post as a great example and resource:

  [Respond to and Resolve Incidents with the Monitoring (formerly Real-time) Dashboard](https://mux.com/blog/respond-to-and-resolve-incidents-with-the-real-time-dashboard/).

  It dives into how to use tools on the Monitoring Dashboard to investigate the incident, communicate with stakeholders, resolve the issue, and improve your resiliency.
</Callout>

## Can I access Mux Data via an API?

Yes, all Mux Data views and metrics are available through the Data API. Raw video view data can be [exported via the API](/docs/guides/export-raw-video-view-data). Additionally, here is a detailed blog post describing how to [create graphs using the Mux API](https://mux.com/blog/use-the-mux-data-api-to-create-graphs-in-react/).

## Where do I find Mux Data Pricing? What features are included in the Pay-as-you-go, Media, and Custom Media Plans?

Choose a Mux Data pricing plan on the [Data Pricing page](https://mux.com/data/#DataPricing). Here you can view a breakdown of all features that Mux includes with each plan including Pay-as-you-go, Media, and Custom Media.

Or, [contact our Sales team](https://mux.com/sales-contact) to acquire more detailed information.

## Where do I find all supported metrics, dimensions, and devices?

You can find more [Technical Specs here](https://mux.com/data/#TechSpecs) covering all tracked video metrics, available filters, and supported players.

## Can I use Mux Data to monitor audio-only content?

Yes, Mux Data can be used to monitor audio content that uses the `<audio>` element. Mux Data will track Engagement metrics, such as the number of plays and length of playback time, as well as basic Quality of Experience metrics including Startup Time, Rebuffering Percentage, and others. Video Quality metrics are not calculated for audio content.

## Is Mux Video delivery usage API similar to watch time in Mux Data?

No, these two measurements are quite different. Mux Video's <ApiRefLink href="/docs/api-reference/video/delivery-usage">Delivery Usage API</ApiRefLink> is based on the number of minutes delivered to clients, a server-side (CDN) metric, whereas Mux Data collects metrics from the client side and calculates watch time based on the user's interaction with the player.

If a user watches a video, rewinds, and watches the video again, that content was only delivered once to the device but was watched multiple times. In this scenario, Mux Video's delivery usage would be lower than the watch time in Mux Data.

More commonly, the client will build up a buffer of downloaded video content. The user will watch some of it and then leave before watching the full length of the video. In this scenario Mux Video's minutes delivered would be higher than the watched time in Mux Data because the client downloaded more minutes of video than it watched.
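
A toy calculation (with made-up numbers) makes the difference between the two measurements concrete:

```javascript
// Scenario 1: a viewer downloads a 10-minute video once, watches it,
// rewinds, and watches it all again.
const rewindDelivered = 10;    // server-side: minutes sent to the device
const rewindWatched = 10 + 10; // client-side: minutes actually played
console.log(rewindWatched > rewindDelivered); // true: watch time exceeds delivery

// Scenario 2: the player buffers 10 minutes but the viewer leaves after 4.
const bufferDelivered = 10;
const bufferWatched = 4;
console.log(bufferDelivered > bufferWatched); // true: delivery exceeds watch time
```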

Another factor to keep in mind is that because Mux Data runs as a client-side SDK, it is susceptible to being blocked by ad-blockers.

## How should I use Mux environments?

Environments allow you to separate data collected from players to more accurately analyze your video engagement and performance. Development and Production environments are created automatically when you sign up, and this is the most common way of organizing environments. You can rename your environments or add additional environments as needed, but we recommend keeping development and production data separate.

Multiple sites or apps can use the same environment and Mux Data environment key. For example, if you have both web and mobile players, and want to view and compare metrics across them, you should use the same environment. Additionally, if you are using Mux Video, use the same environment for Mux Data. Views tracked by Mux Data for videos or live streams streamed from Mux Video are automatically populated with Mux Video identifiers when they’re within the same environment. This allows you to easily view metrics for your assets and live streams in your Mux dashboard. Learn more in our [blog post](https://www.mux.com/blog/giving-developers-more-with-mux-data-mux-video) on Data features for Mux Video.

## How are Mux Data environment keys used?

Each environment has a client-side key associated with it, which you can find on your Environments page. You’ll also see it in Get Started with Data (accessed from the Overview page) for any environment you haven’t integrated yet. When integrating a Mux Data SDK, your environment key allows us to associate the views collected with that SDK to the correct environment. Environment keys are not secret. In rare cases where you would like to change your environment key, [contact us](/support) and we can change it for you.


# Signing JWTs
JSON Web Tokens are an open, industry standard method for representing claims securely between two parties. Mux APIs leverage JWTs to authenticate requests.
## What is a JWT?

JWTs are made up of a header, a payload, and a signature. The header contains metadata about the token, such as the signing algorithm used. The payload contains claims, which carry the token's configuration options. The signature is generated from a signing key-pair and lets Mux verify that the token has not been tampered with. More information can be found at [jwt.io](https://jwt.io/).

In order to sign the JWT you must create a signing key. Signing keys can be created from the [Signing Keys section](https://dashboard.mux.com/settings/signing-keys) of the Mux Dashboard or via the <ApiRefLink href="/docs/api-reference/system/signing-keys">Mux System API</ApiRefLink>. This key-pair will be used by a cryptographic function to sign JWTs.
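
To see those three parts concretely, here is a short sketch that builds and decodes an unsigned demo token. The claim names mirror the signing examples later in this guide; the token below is a throwaway illustration, not a real Mux token:

```javascript
// Base64url-encode a JSON object, as JWT headers and payloads are encoded.
const toB64Url = (obj) => Buffer.from(JSON.stringify(obj)).toString('base64url');

// Decode (without verifying) a JWT into its three dot-separated parts.
function decodeJwt(token) {
  const [header, payload, signature] = token.split('.');
  const fromB64Url = (s) => JSON.parse(Buffer.from(s, 'base64url').toString('utf8'));
  return { header: fromB64Url(header), payload: fromB64Url(payload), signature };
}

const demoToken = [
  toB64Url({ alg: 'RS256', typ: 'JWT' }),                      // header: signing metadata
  toB64Url({ sub: 'PLAYBACK_ID', aud: 'v', exp: 1700000000 }), // payload: claims
  'signature-goes-here',                                       // signature (fake here)
].join('.');

console.log(decodeJwt(demoToken).payload); // { sub: 'PLAYBACK_ID', aud: 'v', exp: 1700000000 }
```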

## Signing JWTs during Development

While developing an app, you may want an easy way to generate JWTs locally because you're not yet ready to set up a full blown production system that signs JWTs for client-side applications. There are a few different options for generating these JWTs.

### Web Based JWT Signer

<Callout type="warning">
  Pasting credentials into a web browser is generally a bad practice. This web-based tool signs JWTs on the client which means your credentials never leave your machine. This is a tool designed by Mux, intended to be used with Mux credentials, and will always be hosted on a Mux domain. **Never use a tool like this if it is hosted on a non-Mux domain.**
</Callout>

Mux provides a web based JWT Signer at https://jwt.mux.dev. Simply input the Signing key-pair and configure the claims you wish to test your app with. Then, copy the JWT into your application code and run it.

<Image src="/docs/images/jwt-signer.gif" width={600} height={440} alt="Mux's JWT Signer" />

### Node based CLI

Mux provides a [Node.js based CLI](https://github.com/muxinc/cli) for performing common tasks including signing JWTs for [playback IDs](https://github.com/muxinc/cli#mux-sign-playback-id).

After [installing Node.js](https://nodejs.org/), the Mux CLI must be initialized with an Access Token. Follow [this guide](/docs/core/make-api-requests#http-basic-auth) to create an Access Token. With your newly created Access Token, initialize the Mux CLI.

```
npx @mux/cli init
```

Now that the Mux CLI is initialized with your credentials, you can sign a JWT for [Video Playback](https://github.com/muxinc/cli#mux-sign-playback-id).

```
npx @mux/cli sign PLAYBACK-ID
```

For more details, refer to https://github.com/muxinc/cli.

<Callout type="warning">
  You should only sign a JWT on the server, where you can keep your signing key secret. You should not put your signing key in the client itself.
</Callout>

<Callout type="success">
Set up a REST endpoint behind your own authentication system that provides your client-side code with signed JWTs. That way, the sensitive secret from the signing key-pair stays on the server instead of being included in the client.
</Callout>
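
As a rough sketch of that pattern, with hypothetical `isAuthenticated` and `signJwt` stand-ins for your session check and server-side signer (for example, `mux.jwt.signPlaybackId` from `@mux/mux-node`), the endpoint's logic boils down to:

```javascript
// Hypothetical request handler: only authenticated viewers receive a token,
// and the signing key itself never leaves the server.
function handleTokenRequest(isAuthenticated, signJwt, playbackId) {
  if (!isAuthenticated) {
    return { status: 401, body: { error: 'not signed in' } };
  }
  // Only the short-lived JWT is sent to the client.
  return { status: 200, body: { token: signJwt(playbackId) } };
}
```

Wire this into whatever HTTP framework you use; the client then requests a fresh, short-lived token from this endpoint before starting playback.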

## Signing JWTs for Production

Once you're ready for customers to start using your app, you need a way to sign JWTs securely at scale. Use the code examples below for the Mux product you would like to sign JWTs for.

### Sign Video Playback JWTs

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"

    "github.com/golang-jwt/jwt/v4"
)

func main() {

    playbackId := "" // Enter your signed playback id here
    keyId      := "" // Enter your signing key id here
    key        := "" // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": playbackId,
        "aud": "v",
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// We've created some helper functions for Node to make your signing-life easier
const Mux = require('@mux/mux-node');
const mux = new Mux();

async function createTokens () {
  const playbackId = ''; // Enter your signed playback id here

  // Set some base options we can use for a few different signing types
  // Type can be either video, thumbnail, gif, or storyboard
  let baseOptions = {
    keyId: '', // Enter your signing key id here
    keySecret: '', // Enter your base64 encoded private key here
    expiration: '7d', // E.g 60, "2 days", "10h", "7d", numeric value interpreted as seconds
  };

  const token = await mux.jwt.signPlaybackId(playbackId, { ...baseOptions, type: 'video' });
  console.log('video token', token);

  // Now the signed playback url should look like this:
  // https://stream.mux.com/${playbackId}.m3u8?token=${token}

  // If you wanted to pass in params for something like a gif, use the
  // params key in the options object
  const gifToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'gif',
    params: { time: '10' },
  });
  console.log('gif token', gifToken);

  // Then, use this token in a URL like this:
  // https://image.mux.com/${playbackId}/animated.gif?token=${gifToken}

  // A final example, if you wanted to sign a thumbnail url with a playback restriction
  const thumbnailToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'thumbnail',
    params: { playback_restriction_id: YOUR_PLAYBACK_RESTRICTION_ID },
  });
  console.log('thumbnail token', thumbnailToken);

  // When used in a URL, it should look like this:
  // https://image.mux.com/${playbackId}/thumbnail.png?token=${thumbnailToken}
}

createTokens();

```

```php

<?php
  // Using Composer and https://github.com/firebase/php-jwt
  require __DIR__ . '/vendor/autoload.php';
  use \Firebase\JWT\JWT;

  $playbackId = ""; // Enter your signed playback id here
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
    "sub" => $playbackId,
    "aud" => "t",          // v = video, t = thumbnail, g = gif.
    "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
    "kid" => $keyId,

    // Optional, include any additional manipulations
    "time"     => 10,
    "width"    => 640,
    "fit_mode" => "smartcrop"
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

token = {
    'sub': playback_id,
    'exp': int(time.time()) + 3600, # 1 hour
    'aud': 'v'
}
headers = {
    'kid': signing_key_id
}

json_web_token = jwt.encode(
    token, private_key, algorithm="RS256", headers=headers)

print(json_web_token)

```

```ruby

require 'base64'
require 'jwt'

def sign_url(playback_id, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: playback_id, exp: expires.to_i, kid: signing_key_id, aud: audience}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

token = sign_url(playback_id, 'v', Time.now + 3600, signing_key_id, private_key_base64)

```



### Sign Data JWTs

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"
    "github.com/golang-jwt/jwt/v4"
)

func main() {

    myId := ""       // Enter the id for which you would like to get counts here
    myIdType := ""   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
    keyId := ""      // Enter your signing key id here
    key := ""        // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": myId,
        "aud": myIdType,
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// using @mux/mux-node@8

import Mux from '@mux/mux-node';
const mux = new Mux();
const myId = ''; // Enter the id for which you would like to get counts here
const myIdType = ''; // Enter the type of ID provided in myId; one of video_id | asset_id | playback_id | live_stream_id
const signingKeyId = ''; // Enter your Mux signing key id here
const privateKeyBase64 = ''; // Enter your Mux base64 encoded private key here

const getViewerCountsToken = async () => {
    return await mux.jwt.signViewerCounts(myId, {
        expiration: '1 day',
        type: myIdType,
        keyId: signingKeyId,
        keySecret: privateKeyBase64,
    });
};

const sign = async () => {
    const token = await getViewerCountsToken();
    console.log(token);
};

sign();

```

```php

<?php

  // Using Composer and https://github.com/firebase/php-jwt
  require __DIR__ . '/vendor/autoload.php';
  use \Firebase\JWT\JWT;

  $myId = "";       // Enter the id for which you would like to get counts here
  $myIdType = "";   // Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
    "sub" => $myId,
    "aud" => $myIdType,
    "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
    "kid" => $keyId
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

my_id = ''              # Enter the id for which you would like to get counts here
my_id_type = ''         # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

payload = {
    'sub': my_id,
    'aud': my_id_type,
    'exp': int(time.time()) + 3600, # 1 hour
}
headers = {
    'kid': signing_key_id
}

encoded = jwt.encode(payload, private_key, algorithm="RS256", headers=headers)
print(encoded)

```

```ruby

require 'base64'
require 'jwt'

def sign_url(subject, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: subject, aud: audience, exp: expires.to_i, kid: signing_key_id}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

my_id = ''                 # Enter the id for which you would like to get counts here
my_id_type = ''            # Enter the type of ID provided in my_id; one of video_id | asset_id | playback_id | live_stream_id
signing_key_id = ''        # Enter your signing key id here
private_key_base64 = ''    # Enter your base64 encoded private key here

token = sign_url(my_id, my_id_type, Time.now + 3600, signing_key_id, private_key_base64)

```



# Secure video playback
In this guide you will learn how to use signed URLs for securing video playback.
If you add an asset or start a live stream through Mux without passing a playback policy, you'll be unable to access it in your browser using a URL. This may seem counterintuitive at first; however, it gives you the ability to be explicit about who can access your content, as well as exactly how, where, or when they can access it.

There may be instances where you upload a video to Mux that is not intended to be made available for public viewing. For example, maybe you have a membership site that your users must join to access your videos, or you are offering a pay-to-access live stream.

For these scenarios, Mux offers **playback policies** that allow you to control the different ways users can view and interact with your content.

## Understanding playback policies

When you upload a video or initiate a live stream through Mux, you also have the option to define what type of fine-grained access should apply to your content. This is done by specifying a playback policy.

Mux offers two kinds of playback policies: `public` and `signed`.

* **Public** playback policies will enable playback URLs that can be watched anywhere, at any time, without any restrictions. This option is perfect for sharing your viral cat videos with the whole world.
* **Signed** playback policies will enable playback URLs that require a valid JSON Web Token (JWT) to gain access. The JWT should be signed and generated by your application on a protected server, not on a public client.

A playback policy can be specified when you create a new asset or live stream, or can be added to an existing asset or live stream.

Once an asset or live stream has been assigned a playback policy, the asset will be issued a new playback ID that's associated with its corresponding playback policy. It's possible for each asset or live stream to have multiple playback IDs.

See <ApiRefLink href="/docs/api-reference/video/assets/create-asset-playback-id">Create a playback ID</ApiRefLink> to learn how to add a new playback policy and ID to an existing Asset or Live Stream.
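
As a sketch, that request looks roughly like the following; `{ASSET_ID}` is a placeholder, and you should confirm the exact path and body in the API reference:

```json
// POST https://api.mux.com/video/v1/assets/{ASSET_ID}/playback-ids

{
  "policy": "signed"
}
```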

**Public** playback policies are pretty self-explanatory, so let’s dig into the signed playback policies.

## A closer look at signed playback policies

When you apply a signed playback policy to your content, there are two distinct ways you can restrict video playback:

1. **Expiration time** (required) allows you to specify a point in time when your issued JWT should be considered expired. Viewers with a valid token can watch videos until your specified expiration time value passes. All HTTP requests made to access your content past the expiration time are denied.
2. **Playback Restrictions** (optional) allow you to implement additional rules for playing videos. For example, let’s consider Referrer Validation. When you create a signed playback policy, you can supply a list of websites that are allowed to host and serve your content. Any requests from domains that aren't on the allow list are denied if they attempt to play back your content.

Referrer and User-Agent Validation Playback Restrictions are supported today; Mux plans to add more types of restrictions in the future.
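
Once a valid JWT is issued, the client attaches it to the playback URL as a `token` query parameter. A minimal sketch, using the URL formats shown in the Signing JWTs examples:

```javascript
// Build signed Mux URLs from a playback ID and a server-issued JWT.
const signedStreamUrl = (playbackId, token) =>
  `https://stream.mux.com/${playbackId}.m3u8?token=${token}`;

const signedThumbnailUrl = (playbackId, token) =>
  `https://image.mux.com/${playbackId}/thumbnail.png?token=${token}`;

console.log(signedStreamUrl('PLAYBACK_ID', 'TOKEN'));
// https://stream.mux.com/PLAYBACK_ID.m3u8?token=TOKEN
```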

Let’s walk through a typical workflow for creating a valid JWT used to access a Mux asset with a signed playback policy.

## 1. Create an Asset or Live Stream with a signed playback policy

Let’s start from scratch and add a new asset to our Mux account using a standard authenticated API call. Notice how we set the `playback_policies` value to `signed` during this **Create Asset** request:

```json
// POST https://api.mux.com/video/assets

{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux-video-intro.mp4"
    }
  ],
  "playback_policies": [
    "signed"
  ],
  "video_quality": "basic"
}
```
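This request can be sent with any HTTP client, authenticating with HTTP Basic auth built from your Token ID and Token Secret. Here's a minimal sketch in Python; the `basic_auth_header` and `create_asset_body` helpers are illustrative, not part of a Mux SDK:

```python
import base64
import json

def basic_auth_header(token_id: str, token_secret: str) -> str:
    """Build the Authorization header value for Mux API requests."""
    credentials = f"{token_id}:{token_secret}".encode()
    return "Basic " + base64.b64encode(credentials).decode()

def create_asset_body(input_url: str) -> str:
    """Build the JSON body for a Create Asset request with a signed policy."""
    return json.dumps({
        "inputs": [{"url": input_url}],
        "playback_policies": ["signed"],
        "video_quality": "basic",
    })
```

You would then POST `create_asset_body(...)` to `https://api.mux.com/video/v1/assets` with the `Authorization` header set to `basic_auth_header(...)`.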

## 2. Create a signing key for your Mux account environment

Next, we'll need to create a Mux signing key. Signing keys are used to generate valid JWTs for accessing your content. Signing keys can be managed (created, deleted, listed) from the [Signing Keys settings](https://dashboard.mux.com/settings/signing-keys) of the Mux dashboard or via the Mux System API.

<Callout type="info">
  Remember: Mux signing keys are different from Mux API keys.
</Callout>

When you create a new signing key, the API generates a 2048-bit RSA key-pair and returns the private key and a generated key-id. You should securely store the private key for signing the token, while Mux stores the public key to validate the signed tokens.

Signing keys are created and deleted independently of assets. You probably only need one signing key active at a time, but you can create multiple to enable key rotation: create a new key, and delete the old one only after any existing signed URLs have expired.

See <ApiRefLink href="/docs/api-reference/system/signing-keys">Create a URL signing key</ApiRefLink> for full documentation.

```json
// POST https://api.mux.com/system/v1/signing-keys

{
  "data": {
    "private_key": "(base64-encoded PEM file with private key)",
    "id": "(unique signing-key identifier)",
    "created_at": "(UNIX Epoch seconds)"
  }
}
```

## 3. Create an optional Playback Restriction for your Mux account environment

Mux supports two types of playback restriction:

* Referrer Validation: Controls which domains, as reported in the HTTP `Referer` request header, are allowed to play your content, and whether playback is allowed when no referrer domain is present.
* User-Agent Validation: Controls whether requests with a high-risk `User-Agent` request header, or with no user agent at all, are allowed to play your content.

During playback, a restriction is applied using a JWT claim, which will be covered in the next two sections.

<Callout type="info">
  Playback restrictions exist at the environment level. However, creating a playback restriction in an environment does not mean all assets are automatically restricted by it.

  Instead, you should apply a given restriction to a playback by referencing it in the token you create for a signed playback ID.
</Callout>

If you don’t need to use playback restrictions for your content, feel free to jump to the next step.

### Create a Playback Restriction

A common case is wanting the videos in your Mux account to be watched only on your own website, `https://example.com`. To do so, create a new Playback Restriction that lists `example.com` as the only domain allowed to play your videos.

See <ApiRefLink href="/docs/api-reference/video/playback-restrictions">Playback Restriction</ApiRefLink> for full documentation.

### Example API Request

```json
// POST https://api.mux.com/video/v1/playback-restrictions

{
  "referrer": {
    "allowed_domains": [
      "example.com"
    ],
    "allow_no_referrer": false
  }
}
```

### Example API Response

```json
{
  "data": {
    "updated_at": "1634595679",
    "referrer": {
      "allowed_domains": [
        "example.com"
      ]
    },
    "id": "JL88SKXTr7r2t9tovH7SoYS8iLBVsjZ2qTuFS8NGAQY",
    "created_at": "1634595679"
  }
}
```

Store the `id` value from the API response above as `PLAYBACK_RESTRICTION_ID` in your application for later use when generating the signed JWT.

### Playback Restriction Syntax

When you create a playback restriction, you may specify referrer and/or user agent restrictions in the same request: the `referrer` field holds the referrer restrictions and the `user_agent` field holds the user agent restrictions. For the referrer `allowed_domains` list, you may specify up to 100 unique domains or subdomains where your videos will be embedded, using valid DNS-style wildcard syntax in the `referrer.allowed_domains` array. For example:

```json
{
  "referrer": {
    "allowed_domains": [
      "*.example.com",
      "foo.com"
    ],
    "allow_no_referrer": false
  },
  "user_agent": {
    "allow_no_user_agent": false,
    "allow_high_risk_user_agent": false
  }
}
```

Choose from the following options:

* To deny video playback requests from all domains, use an empty array: `[]`
* To allow playback on `example.com` and all of its subdomains, use: `["*.example.com", "example.com"]`
* To allow video playback requests from any domain, use a single wildcard entry: `["*"]`
* A wildcard matches exactly one subdomain level. For instance, `["*.example.com"]` allows `foo.example.com` but denies playback from `xyz.foo.example.com`.
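The wildcard rules above can be sketched as a small matching function. This is an illustrative approximation of the matching behavior, not Mux's actual implementation:

```python
def domain_allowed(referrer_domain: str, allowed_domains: list[str]) -> bool:
    """Approximate the DNS-style wildcard matching described above."""
    for pattern in allowed_domains:
        if pattern == "*":
            return True  # any domain is allowed
        if pattern.startswith("*."):
            base = pattern[2:]
            remainder = referrer_domain.removesuffix("." + base)
            # A wildcard matches exactly one subdomain level,
            # so the remaining prefix must not contain another dot.
            if remainder != referrer_domain and "." not in remainder:
                return True
        elif referrer_domain == pattern:
            return True
    return False
```

For example, `domain_allowed("foo.example.com", ["*.example.com"])` is allowed, while `xyz.foo.example.com` and the bare `example.com` are not (hence the recommendation to list both `"*.example.com"` and `"example.com"`).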

### Playback Restriction considerations

Here are some things to consider when using Playback Restrictions.

* You can create up to 100 different Playback Restrictions per environment on your Mux account.

* You can use a Playback Restriction ID for playing a single video or a group of videos.

* You have a lot of flexibility for associating Playback Restrictions with videos. For instance, you can create one Playback Restriction for each of your clients if your service or application supports multiple clients.

* You can add up to 100 different domains to each Playback Restriction.

* You can restrict playing video on domains added to the Playback Restriction. For instance, if you want multiple partner sites to play a video, you can add the partner site domain to the same Playback Restriction, thereby restricting playback only on those domains.

* If your player supports Chromecast, like Mux Player, make sure you add the Chromecast domain (`www.gstatic.com`) to your playback restrictions, otherwise casting will fail.

* If your player supports AirPlay, like Mux Player, you will only be able to AirPlay to third-party devices if you add the AirPlay domain (`mediaservices.cdn-apple.com`) to your playback restrictions. Because first-party Apple devices never forward the `Referer` header, `allow_no_referrer` must be set to `true` for AirPlay to work on those devices; otherwise it will fail.

[Reach out to Mux Support](mailto:support@mux.com) if you have a use case that requires more than 100 Playback Restrictions or want to add more than 100 domains per Playback Restriction.

### Using `Referer` HTTP Header for validation

Web browsers send the address of the website requesting the video in the [`Referer` HTTP header](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Referer).
Mux matches the domains configured in the Playback Restriction against the domain in the `Referer` HTTP header. No video is delivered if there is no match.

The `Referer` HTTP header is only sent by web browsers, while native iOS and Android applications do not send this header. Therefore, Mux cannot perform domain validations on any requests from native iOS and Android applications. For this reason, you can configure the Playback Restrictions to allow or deny all HTTP requests without the `Referer` HTTP header by setting the `allow_no_referrer` boolean parameter.

First-party Apple devices, like the Apple TV 4K, never set a referrer header regardless of the source. If you need to AirPlay to first-party Apple devices, `allow_no_referrer` must therefore be set to `true`.

Please note that setting `allow_no_referrer` to `true` can result in content playback from unauthorized locations. As such, we strongly recommend creating two Playback Restriction objects, one with `allow_no_referrer` set to `true` and one with it set to `false`, and setting the appropriate Playback Restriction ID in the JWT for web vs. native iOS and/or Android applications.
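With the two Playback Restriction objects described above, your token-signing code can pick the appropriate ID per client platform. A minimal sketch, where both IDs and the helper are hypothetical placeholders:

```python
# Hypothetical IDs for the two Playback Restriction objects: one with
# allow_no_referrer=false (web browsers, which send a Referer header) and
# one with allow_no_referrer=true (native apps, which never send one).
WEB_RESTRICTION_ID = "restriction-for-web-browsers"
NATIVE_RESTRICTION_ID = "restriction-for-native-apps"

def playback_restriction_for(platform: str) -> str:
    """Pick which playback_restriction_id to embed in the JWT claims."""
    if platform in ("ios", "android", "tvos"):
        return NATIVE_RESTRICTION_ID
    return WEB_RESTRICTION_ID
```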

### Using `User-Agent` HTTP Header for validation

The `User-Agent` HTTP header value is used to validate against a playback restriction. If the `allow_no_user_agent` field is set to `false`, playback is denied when the request does not include a `User-Agent` value. For the `allow_high_risk_user_agent` validation, Mux maintains a list of user agents known to be associated with higher-risk video playback, such as playback devices that are not associated with legitimate end users of most systems. For more information, please reach out to [Mux Support](/support).

## 4. Generate a JSON Web Token (JWT)

All signed requests have a JWT with the following standard claims:

| Claim Code | Description | Value |
| :-- | :-- | :-- |
| sub | Subject of the JWT | Mux Video Playback ID |
| aud | Audience (intended application of the token) | `v` (Video or Subtitles/Closed Captions) <br /> `t` (Thumbnail) <br /> `g` (GIF) <br /> `s` (Storyboard) <br /> `d` (DRM License)|
| exp | Expiration time | UNIX Epoch seconds when the token expires. This should always exceed the current time plus the duration of the video; otherwise portions of the video may be unplayable. |
| kid | Key Identifier | Key ID returned when signing key was created |

You can also include the following optional claims depending on the type of request.

| Claim Code | Description | Value |
| :-- | :-- | :-- |
| playback\_restriction\_id | Playback Restriction Identifier | `PLAYBACK_RESTRICTION_ID` from the previous step. Mux performs validations when `PLAYBACK_RESTRICTION_ID` is present in the JWT claims body. This claim is supported for all `aud` types. |

The Image (Thumbnails, Animated GIFs, Storyboards, and others) API accepts several options to control image selection and transformations. More details on generating JWTs for images can be found [here](/docs/guides/secure-video-playback#note-on-query-parameters-after-signing).

For Playback IDs that use a public policy, the thumbnail options are supplied as query parameters on the request URL.

For Playback IDs that use a signed policy, the thumbnail options must instead be specified in the JWT claims. This ensures that the thumbnail options cannot be altered, such as changing the timestamp or the dimensions of the thumbnail image. For example, if you uploaded a 4K video and wanted to restrict a thumbnail to a width of 600 pixels and a specific timestamp, you would include the `width` and `time` keys in the JWT claims.
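As a sketch, the claims body for such a signed thumbnail might be assembled like this (the playback ID and key ID are placeholders, and the resulting map is what you would then sign with RS256):

```python
import time

def thumbnail_claims(playback_id: str, key_id: str, width: int, time_s: int) -> dict:
    """Claims for a signed thumbnail: image options live in the token,
    not in the URL query string, so viewers cannot alter them."""
    return {
        "sub": playback_id,             # playback ID
        "aud": "t",                     # t = thumbnail
        "exp": int(time.time()) + 600,  # valid for 10 minutes
        "kid": key_id,                  # signing key ID
        "width": width,                 # locked image options
        "time": time_s,
    }
```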

## A note on expiration time

Expiration time should be at least the duration of the Asset or the expected duration of the Live Stream. When the signed URL expires, the URL will no longer be playable, even if playback has already started. Make sure you set the expiration to be sufficiently far in the future so that users do not experience an interruption in playback.

Your application should consider cases where the user loads a video, leaves your application, then comes back later and tries to play the video again. You will likely want to detect this behavior and make sure you fetch a new signed URL to make sure playback can start.

## 5. Sign the JSON Web Token (JWT)

The steps can be summarized as:

1. Load the private key used for signing
2. Assemble the claims (`sub`, `exp`, `kid`, `aud`, etc.) in a map
3. Encode and sign the JWT using the claims map, the private key, and the RS256 algorithm

There are dozens of software libraries for creating and reading JWTs. Whether you’re writing in Go, Elixir, Ruby, or a dozen other languages, don’t fret: there is most likely a JWT library you can rely on.

<Callout type="warning">
  The following examples assume you're working with a private key either returned from the <ApiRefLink href="/docs/api-reference/system/signing-keys">Signing Keys API</ApiRefLink> or copied from the Dashboard, **not** one downloaded as a PEM file.
</Callout>

```go

package main

import (
    "encoding/base64"
    "fmt"
    "log"
    "time"

    "github.com/golang-jwt/jwt/v4"
)

func main() {

    playbackId := "" // Enter your signed playback id here
    keyId      := "" // Enter your signing key id here
    key        := "" // Enter your base64 encoded private key here

    decodedKey, err := base64.StdEncoding.DecodeString(key)
    if err != nil {
        log.Fatalf("Could not base64 decode private key: %v", err)
    }

    signKey, err := jwt.ParseRSAPrivateKeyFromPEM(decodedKey)
    if err != nil {
        log.Fatalf("Could not parse RSA private key: %v", err)
    }

    token := jwt.NewWithClaims(jwt.SigningMethodRS256, jwt.MapClaims{
        "sub": playbackId,
        "aud": "v",
        "exp": time.Now().Add(time.Minute * 15).Unix(),
        "kid": keyId,
    })

    tokenString, err := token.SignedString(signKey)
    if err != nil {
        log.Fatalf("Could not generate token: %v", err)
    }

    fmt.Println(tokenString)
}

```

```node

// We've created some helper functions for Node to make your signing-life easier
const Mux = require('@mux/mux-node');
const mux = new Mux();

async function createTokens () {
  const playbackId = ''; // Enter your signed playback id here

  // Set some base options we can use for a few different signing types
  // Type can be either video, thumbnail, gif, or storyboard
  let baseOptions = {
    keyId: '', // Enter your signing key id here
    keySecret: '', // Enter your base64 encoded private key here
    expiration: '7d', // E.g 60, "2 days", "10h", "7d", numeric value interpreted as seconds
  };

  const token = await mux.jwt.signPlaybackId(playbackId, { ...baseOptions, type: 'video' });
  console.log('video token', token);

  // Now the signed playback url should look like this:
  // https://stream.mux.com/${playbackId}.m3u8?token=${token}

  // If you wanted to pass in params for something like a gif, use the
  // params key in the options object
  const gifToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'gif',
    params: { time: '10' },
  });
  console.log('gif token', gifToken);

  // Then, use this token in a URL like this:
  // https://image.mux.com/${playbackId}/animated.gif?token=${gifToken}

  // A final example, if you wanted to sign a thumbnail url with a playback restriction
  const thumbnailToken = await mux.jwt.signPlaybackId(playbackId, {
    ...baseOptions,
    type: 'thumbnail',
    params: { playback_restriction_id: YOUR_PLAYBACK_RESTRICTION_ID },
  });
  console.log('thumbnail token', thumbnailToken);

  // When used in a URL, it should look like this:
  // https://image.mux.com/${playbackId}/thumbnail.png?token=${thumbnailToken}
}

```

```php

<?php
  // Using Composer and https://github.com/firebase/php-jwt
  require __DIR__ . '/vendor/autoload.php';
  use \Firebase\JWT\JWT;

  $playbackId = ""; // Enter your signed playback id here
  $keyId = "";      // Enter your signing key id here
  $keySecret = "";  // Enter your base64 encoded private key here

  $payload = array(
    "sub" => $playbackId,
    "aud" => "t",          // v = video, t = thumbnail, g = gif.
    "exp" => time() + 600, // Expiry time in epoch - in this case now + 10 mins
    "kid" => $keyId,

    // Optional, include any additional manipulations
    "time"     => 10,
    "width"    => 640,
    "fit_mode" => "smartcrop"
  );

  $jwt = JWT::encode($payload, base64_decode($keySecret), 'RS256');

  print "$jwt\n";

?>

```

```python

# This example uses pyjwt / cryptography:
# pip install pyjwt
# pip install cryptography

import jwt
import base64
import time

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

private_key = base64.b64decode(private_key_base64)

token = {
    'sub': playback_id,
    'exp': int(time.time()) + 3600, # 1 hour
    'aud': 'v'
}
headers = {
    'kid': signing_key_id
}

json_web_token = jwt.encode(
    token, private_key, algorithm="RS256", headers=headers)

print(json_web_token)

```

```ruby

require 'base64'
require 'jwt'

def sign_url(playback_id, audience, expires, signing_key_id, private_key, params = {})
    rsa_private = OpenSSL::PKey::RSA.new(Base64.decode64(private_key))
    payload = {sub: playback_id, exp: expires.to_i, kid: signing_key_id, aud: audience}
    payload.merge!(params)
    JWT.encode(payload, rsa_private, 'RS256')
end

playback_id = ''        # Enter your signed playback id here
signing_key_id = ''     # Enter your signing key id here
private_key_base64 = '' # Enter your base64 encoded private key here

token = sign_url(playback_id, 'v', Time.now + 3600, signing_key_id, private_key_base64)

```



## 6. Include the JSON Web Token (JWT) in the media URL

Supply the JWT in the resource URL using the `token` query parameter. The Mux Video service will inspect and validate the JWT to make sure the request is allowed.

Video URL example:

```sh
https://stream.mux.com/{playback-id}.m3u8?token={JWT}
```

Thumbnail options are supplied as query parameters when using a public policy. When using a signed policy, the thumbnail options must be specified as claims in the JWT following the same naming conventions as with query parameters.

Thumbnail URL example:

```sh
https://image.mux.com/{playback-id}/thumbnail.{format}?token={JWT}
```

<Callout type="warning" title="Passing `token` for public playback IDs will fail">
  If you include a `token=` query parameter for a `"public"` playback ID, the URL will fail. This is intentional, so as not to create the false appearance of security when using a public playback ID.

  If your application uses a mix of "public" and "signed" playback IDs, you should save the playback policy type in your database and include the token parameter only for signed playback IDs.
</Callout>

## Note on query parameters after signing

When you're signing a URL, you're signing the parameters for that URL as well. After the parameters are signed for a playback ID, the resulting signed URL should *only* contain the `token` parameter. This is important because leaving the parameters in the URL would both:

* expose more information about the underlying asset than you may want
* result in an incorrect signature since the extraneous parameters would alter the URL.

<Callout type="warning" title="Be sure to include `params` in your `claims` body">
  While the JWT helper we expose in our Node SDK passes in additional parameters as an extra hash, when working with the JWT directly, these `params` should be embedded directly in your `claims` body.
</Callout>

## Example

Let's say we're taking the following public example and making a signed URL:

* `https://image.mux.com/{public_playback_id}/thumbnail.jpg?time=25`

Generate a signed URL with `{time: 25}` in the **claims body**. Using the helper example we wrote above, this would look like:

* `sign(signedPlaybackId, { ...requiredTokenOptions, params: { time: 25 } })`

**Correct** Signed URL:

* `https://image.mux.com/{signed_playback_id}/thumbnail.jpg?token={token}`

**Bad** Signed URL:

* `https://image.mux.com/{signed_playback_id}/thumbnail.jpg?time=25&token={token}`

Including query parameters in the token also applies to playback modifiers like `default_subtitles_lang`, `redundant_streams`, and `roku_trick_play`. The JWT claims body must include the extra parameter:

```json
{
  "sub": "{PLAYBACK_ID}",
  "aud": "{AUDIENCE_TYPE}",
  "exp": "{EXPIRATION_TIME}",
  "redundant_streams": true
}
```

## Passing custom parameters to a signed token

With signed URLs, you can pass extra parameters via a `custom` key in the claims body, as in the example below.

This may be useful in order to identify bad actors who share signed URLs in an unauthorized way outside of your application. If you find that a signed URL has been shared, you can decode the parameters and trace it back to the user who shared it. When including extra parameters like this, be sure to respect the following guidelines:

* Do NOT under any circumstances include personally identifiable information (PII) like a name or email address.
* Put your custom parameters nested inside the `"custom"` key.

```json
{
  "sub": "{PLAYBACK_ID}",
  "aud": "{AUDIENCE_TYPE}",
  "exp": "{EXPIRATION_TIME}",
  "custom": {
    "session_id": "xxxx-123"
  }
}
```


# Protect videos with DRM
Learn how to leverage Digital Rights Management (DRM) to protect your videos
## What is DRM?

<Callout type="info">
  Check out our blog on ["What is DRM"](https://www.mux.com/blog/what-is-drm) to learn more about the concepts of DRM.
</Callout>

DRM (Digital Rights Management) provides an extra layer of content security for video content streamed from Mux.

Leveraging DRM blocks or limits the impact of:

* Screen recording
* Screen sharing
* Downloading tools

Mux uses industry-standard protocols for delivering DRM-protected video content, specifically Google Widevine, Microsoft PlayReady, and Apple FairPlay.

DRM requires the use of [Signed URLs](/docs/guides/secure-video-playback), and when combined with [Domain and User-Agent restrictions](/docs/guides/secure-video-playback#3-create-an-optional-playback-restriction-for-your-mux-account-environment), can provide a very strong content protection story, up to and including security levels that satisfy the requirements of Hollywood studios.

### How Mux DRM Protects Your Content

Mux Video's DRM is built to support the strongest protection available for each device without affecting playability. This protection comes in three parts.

* **Video encryption:** ensures you can't play the video without the proper license.
* **Screen capture protection:** ensures that you can't take screenshots or record the screen.
* **HDCP:** prevents recording video from video outputs like HDMI.

However, not every device supports all three protection layers. Mux has configured DRM at a security level that ensures broad device compatibility while still providing meaningful protection. The following table shows the types of protection you can expect across different devices:

| Device Type | Encrypted Video | Screen Capture Protection | HDCP Enforced | Details |
| ----- | ----- | ----- | ----- | ----- |
| **iPhone/iPad** | ✅ Yes | ✅ Yes | ✅ Yes | Apple supports hardware-level protection on all devices created since the iPhone 5s. |
| **Modern Android devices** | ✅ Yes | ✅ Usually | ❌ No | Newer devices such as Google Pixel phones or Samsung phones with Android 12+, or any device with Widevine level 1 support can prevent screen capture. |
| **Older and lower-end Android devices** | ✅ Yes |  Sometimes | ❌ No | Many lower-end Android devices are missing the secure hardware necessary for Widevine level 1 and hardware decryption. Many of these lower-end devices still try to block screen capture, but it's not nearly as secure. |
| **Chrome/Edge browsers on desktop** | ✅ Yes | Sometimes | ❌ No | Browser-based playback usually relies on software decryption. Many of these devices still try to block screen capture but it's not nearly as secure. |

#### Additional protections

If you're distributing premium content with strict security requirements (like major studio releases), you may need additional types of protection. For this you have a few options:

* **Device-level security upgrades:** Some players allow you to request stronger DRM when devices support it. This ensures hardware-backed devices get enhanced protection while others play at baseline levels.
* **Custom DRM configuration**: We're investigating more granular security controls and looking for early partners to help test these features. [Reach out](https://www.mux.com/support/human) for more information.
* **Video watermarking**: Add visible watermarks using Mux's [watermarking](https://www.mux.com/docs/guides/add-watermarks-to-your-videos) feature, which embeds them directly into the video and prevents misattribution. This doesn't support per-user or forensic watermarking, but you can add per-user watermarks by overlaying images on your player - just know these are easier to bypass since they're not baked into the video. Forensic watermarking isn't currently available, but if it would be valuable for your use case, [let us know](https://www.mux.com/support/human).  We'd love to hear your feedback.
* **Trust the DRM Configuration**: Attempting to increase security by detecting device capabilities or security levels yourself will be painful. The device landscape changes constantly and maintaining accuracy is nearly impossible. This is better addressed with custom DRM configurations.

If you need any of these additional protections, or something we've overlooked, [contact us](https://www.mux.com/support/human). We'd love to hear from you.

## Prerequisites

Before you can start using Mux DRM you must complete the onboarding process. The following is a quick overview of the entire process, and you can find additional detail later in this guide.

1. [Request a FairPlay certificate](#step-1-request-a-fairplay-certificate) for playback on Apple Devices. Don't worry about Widevine and PlayReady certificates, we'll handle those for you.
2. While waiting on FairPlay approval, go to Settings -> Digital Rights Management in your Mux dashboard to request DRM access.
3. After DRM is enabled on your environments we'll send you a DRM configuration ID and tell you how to securely send us your FairPlay certificates.

Once these steps are complete you will have your DRM configuration ID and be able to test DRM playback on non-Apple devices. Once you've sent us your FairPlay certificates you can test on Apple devices.

### Step 1: Request a FairPlay certificate

DRM playback will work out of the box on every [supported platform](#supported-platforms) except Apple devices. Apple requires you to request your own FairPlay Streaming Deployment package (FPS, often simply referred to as a "FairPlay certificate"). An FPS package can only be requested if you meet the following requirements.

1. You have an Apple Developer account with an active subscription.
2. If you're part of a team account, you must be logged in as the owner of the team.

Once you've met these requirements, you will need to fill out a form. The initial questions ask about your DRM infrastructure, which is Mux. Here's some guidance on how to answer those questions.

|    |    |
| :---- | :---- |
| **Does your organization have a working FPS development server where you'll use the FPS certificate?** | Select "Yes". You will use Mux's verified FPS implementation. |
| **Do you have a third-party streaming distribution partner?** | Select "Yes". Mux is that third-party partner. |
| **Streaming Distribution (DRM License Server) Partner Name** | Enter "Mux, Inc.". |
| **Streaming Distribution (DRM License Server) Partner Website** | Enter "https://mux.com". |
| **Your Company** | Describe your company and the services it provides. |
| **Your Content** | Describe the type of content you will be protecting with FairPlay and why that content needs DRM. |
| **Do you own the content you want to stream?** | If you hold full copyright ownership of your content, select "Yes". Otherwise select "No" and answer the following two additional questions that appear: |
| **Do you have a content licensing agreement with the owner of the content?** | If you license third-party content, select "Yes". |
| **Your Content Provider** | If you license third-party content, include the name of that provider and a description of the rights you have to use their content. |
| **Is this your first request for FPS credentials?** | If this is your first time submitting this form or requesting a FairPlay certificate, select "Yes". |
| **Do you assert that the account holder of this developer account owns, or has a license to use, the content that you will be streaming?** | Select the most appropriate answer for your situation. |

Once you've submitted the request, approval may take several days. Once approved, Apple will provide documentation for generating the final certificate. This includes generating a private key via your terminal and filling out a form.

Once you've created your FairPlay deployment package, [contact us](/support/human) and we'll walk you through securely sending us the files.

<Callout type="info">
  You do not need to request a Widevine or PlayReady certificate, Mux manages these for you.
</Callout>

### Step 2: Request DRM for your environment

Go to Settings -> Digital Rights Management to request DRM access. This page will walk you through the necessary requirements and allow you to request access. You will receive a response via email with next steps.

### Step 3: Receive your DRM configuration ID

After we enable DRM on your environment you can find your DRM configuration ID in Settings -> Digital Rights Management. You'll need this when you add a DRM playback ID to an asset.

You can also use the <ApiRefLink href="/docs/api-reference/video/drm-configurations">DRM Configurations API</ApiRefLink> to list the DRM Configurations available to your account.

## Create a DRM protected asset or live stream

Mux Video supports applying DRM to both live streams and assets.

### Creating a DRM protected asset

When using the <ApiRefLink href="/docs/api-reference/video/assets/create-asset">Create Asset API</ApiRefLink>, you can add DRM protection by including `advanced_playback_policies` with your DRM configuration ID. Make sure to set the `video_quality` to `plus` or `premium`, as DRM is only supported on these quality levels.

```json
// POST /video/v1/assets
{
  "inputs": [
    {
      "url": "https://storage.googleapis.com/muxdemofiles/mux.mp4"
    }
  ],
  "advanced_playback_policies": [
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }
  ],
  "video_quality": "plus"
}
```

When working with `advanced_playback_policies`, keep in mind that you can't use both the `playback_policy` and `advanced_playback_policies` fields in the same request. When working with DRM, stick to `advanced_playback_policies`. If you need more than one playback policy, such as for static renditions, you can include multiple policies in the `advanced_playback_policies` array.

If you need to add DRM protection to an existing asset, you can use the <ApiRefLink href="/docs/api-reference/video/assets/create-asset-playback-id">Playback IDs API</ApiRefLink> to retroactively add a DRM playback policy. This works for any asset created after DRM was enabled in your environment.

### Creating a DRM protected live stream

Just like a DRM protected asset, a DRM protected live stream requires the DRM configuration ID to be set in the live stream's `advanced_playback_policies`.

In the example below, we set `new_asset_settings` to use DRM as well, so any [DVR assets](/docs/guides/live-streaming-faqs#is-it-possible-to-rewind-live-content-while-the-live-stream-continues) and on-demand assets also have DRM applied.

```json
// POST /video/v1/live-streams
{
  "advanced_playback_policies": [
    {
      "policy": "drm",
      "drm_configuration_id": "your-drm-configuration-id"
    }
  ],
  "new_asset_settings": {
    "advanced_playback_policies": [
      {
        "policy": "drm",
        "drm_configuration_id": "your-drm-configuration-id"
      }
    ]
  }
}
```

## Play DRM protected videos

Mux supports three types of DRM: Widevine, FairPlay, and PlayReady. These three DRM systems cover the vast majority of devices in use today, including [desktop browsers](#desktop-browsers), [mobile browsers](#mobile-browsers), and [living room devices (OTT)](#living-room-devices-ott).

### Supported Platforms

Before you start to build your DRM integration, make sure Mux's DRM supports your target platforms. Mux's DRM is verified to work on all of the following platforms, but likely works on additional platforms. If you would like us to verify an additional platform, please [contact us](/support/human).

##### Desktop Browsers

The following desktop browsers support Mux DRM via the [Mux Web Player](#mux-web-player), or any of the players listed in our [player documentation](#playback-in-mux-players).

* Chrome (macOS and Windows)
* Firefox (macOS and Windows)
* Safari (macOS)
* Edge (Windows)
* Legacy Edge (Windows)

##### Mobile Browsers

The following mobile browsers support Mux DRM via the [Mux Web Player](#mux-web-player), or any of the players listed in our [player documentation](#playback-in-mux-players).

* Chrome on Android
* Firefox on Android
* All browsers on iOS

##### Native Mobile Apps

* Android apps using [Mux Player for Android](#mux-player-android)
* iOS apps using [Mux Player for iOS](#mux-player-ios)

##### Living room devices (OTT)

The following living room devices support Mux DRM. Each entry links to the relevant documentation.

* [Chromecast](#chromecast)
* [Google TV](#mux-player-android)
* [Apple TV (tvOS)](#mux-player-ios)
* [Roku](#roku)
* [Fire TV](#mux-player-android)

### Creating your playback and license tokens

To successfully play back content protected by Mux DRM, you will need your asset's playback ID and two secure tokens: a playback token and a DRM license token. Both tokens are signed JWTs that follow the requirements laid out in our [secure video playback guide](/docs/guides/secure-video-playback#2-create-a-signing-key-for-your-mux-account-environment).

If you're using our Node library to sign your license URLs, we offer helper functions:

```js
const mux = new Mux({
  tokenId: "your-access-token-id",
  tokenSecret: "your-access-token-secret",
  jwtSigningKey: "your-environment-signing-key-id",
  jwtPrivateKey: "your-environment-signing-private-key"
});

const playbackToken = await mux.jwt.signPlaybackId("your-playback-id", {expiration: '7d'});
const drmLicenseToken = await mux.jwt.signDrmLicense("your-playback-id", {expiration: '7d'});
```

Once you have created your signing tokens, you can use them directly in a [Mux player](#playback-in-mux-players). If you are using a non-Mux player, use these tokens to build your [playback and license URLs](#creating-license-urls).

### Playback in Mux players

Now that you have the necessary tokens, we can hook them into a Mux Player.

#### Mux Web Player

To play back DRM protected content, instantiate the player with the `drm-token` attribute set to the DRM license token that you generated [previously](#creating-your-playback-and-license-tokens). In Mux Player React, you'll use the `tokens` prop to set the `drm` token.

<Callout type="info">
  Support for DRM in Mux Player was added in version 2.8.0.
</Callout>

```embed

<iframe
  src="https://player.mux.com/your-playback-id?drm-token=your-drm-token&playback-token=your-playback-token&thumbnail-token=your-thumbnail-token&storyboard-token=your-storyboard-token"
  style="aspect-ratio: 16/9; width: 100%; border: 0;"
  allow="accelerometer; gyroscope; autoplay; encrypted-media; picture-in-picture;"
  allowfullscreen="true"
></iframe>

```

```html
<mux-player
  playback-id="your-playback-id"
  playback-token="your-playback-token"
  drm-token="your-drm-token"
  thumbnail-token="your-thumbnail-token"
  storyboard-token="your-storyboard-token"
></mux-player>
```

```react

<MuxPlayer
  playbackId="your-playback-id"
  tokens={{
    playback: "your-playback-token",
    drm: "your-drm-token",
    thumbnail: "your-thumbnail-token",
    storyboard: "your-storyboard-token",
  }}
/>
```



[You can see a demo of this working in codesandbox here.](https://codesandbox.io/p/sandbox/mux-player-drm-test-5qh2pm?file=%2Findex.html%3A19%2C7)

With your new tokens all wired up correctly, you should be able to play back your freshly DRM protected content!

[*Here's a demo page with some pre-prepared DRM protected content you can also test a device against.*](https://5qh2pm.csb.app/)

[Full documentation for using DRM with Mux Player for web can be found here.](/docs/guides/player-advanced-usage#using-digital-rights-management-drm)

#### Mux Player iOS

<Callout type="info">
  Support for DRM in Mux Player for iOS was added in version 1.1.0.
</Callout>

The DRM license token can be configured on `PlaybackOptions` using the following API:

```swift
let playbackOptions = PlaybackOptions(
  drmToken: "your-drm-license-token",
  playbackToken: "your-playback-token"
)

let playerItem = AVPlayerItem(
  playbackID: "your-playback-id",
  playbackOptions: playbackOptions
)
```

[Full documentation for using DRM with Mux Player for iOS can be found here.](/docs/guides/mux-player-ios#secure-your-playback-experience)

#### Mux Player Android

The DRM license token can be configured when instantiating a `MediaItem` using the `MediaItems` factory class as follows:

<Callout type="info">
  Support for DRM in Mux Player for Android was added in version 1.1.0.
</Callout>

```kotlin
// You don't need to add your own DrmSessionManager, we take care of this

val player = // Whatever you were already doing

val mediaItem = MediaItems.mediaItemFromPlaybackId(
  playbackId = "your-playback-id",
  playbackToken = "your-playback-token",
  drmToken = "your-drm-license-token"
)

// Normal media3 boilerplate
player.setMediaItem(mediaItem)
player.prepare()
```

[Full documentation for using DRM with Mux Player for Android can be found here.](/docs/guides/mux-player-android#secure-your-playback-experience)

### Playback in third-party players

If you can't use one of the [Mux players](#playback-in-mux-players), you still have options. Mux's DRM is compatible with a wide range of third-party players. Because these players don't know exactly how Mux's DRM works, you'll need to build the correct [playback and license URLs](#creating-license-urls), then add them to the player.

#### Creating license URLs

The following examples demonstrate the license URL structure for each of the supported DRM providers.

##### Widevine

```
https://license.mux.com/license/widevine/{playback-id}?token={drm-license-token}
```

##### FairPlay

Before you can use FairPlay DRM, you must [request the proper certificate from Apple](#step-1-request-a-fairplay-certificate). Once FairPlay is enabled on your account, you will use one license URL and one certificate URL.

**License URL**

```
https://license.mux.com/license/fairplay/{playback-id}?token={drm-license-token}
```

**Certificate URL**

```
https://license.mux.com/appcert/fairplay/{playback-id}?token={drm-license-token}
```

##### PlayReady

```
https://license.mux.com/license/playready/{playback-id}?token={drm-license-token}
```
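
If you're constructing these URLs in application code, a small helper keeps the scheme names consistent. A minimal sketch, where the playback ID and token arguments are your own values:

```javascript
// Build a Mux DRM license URL following the structure shown above.
function muxLicenseUrl(scheme, playbackId, drmLicenseToken) {
  const supported = ['widevine', 'fairplay', 'playready'];
  if (!supported.includes(scheme)) {
    throw new Error(`Unsupported DRM scheme: ${scheme}`);
  }
  return `https://license.mux.com/license/${scheme}/${playbackId}?token=${drmLicenseToken}`;
}

console.log(muxLicenseUrl('widevine', 'your-playback-id', 'your-drm-license-token'));
// → https://license.mux.com/license/widevine/your-playback-id?token=your-drm-license-token
```

Remember that FairPlay additionally needs the `appcert` certificate URL shown above.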

#### Third party players

The following third-party players have been tested with Mux DRM. If you are using a player not listed here, check out our notes on [other players](#other-players).

##### Roku

To play back DRM protected content on Roku, add your DRM configuration to your content node. This involves two steps:

1. Add the following two fields to your channel's manifest:
   ```
   requires_widevine_drm=1
   requires_widevine_version=1.0
   ```
2. When preparing your `contentNode`, ensure you reference the DRM configuration and license URL as follows:

```brightscript
drmParams = {
  keySystem: "Widevine",
  licenseServerURL: "https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_JWT}"
}

contentNode = CreateObject("roSGNode", "ContentNode")
contentNode.url = "<content URL>"
contentNode.drmParams = drmParams
contentNode.title = "<your title>"
contentNode.length = <duration in seconds>

' other contentNode properties can be added here, 
' then play your video as you normally would
```

##### Chromecast

Chromecast devices use Google Cast to send videos from one device to another. There are quite a few steps to set up Google Cast, so we recommend you check out our [Google Cast guide](/docs/guides/play-drm-protected-videos-on-google-cast) for more details.

##### HLS.js

[HLS.js](https://github.com/video-dev/hls.js?tab=readme-ov-file) supports DRM via configuration keys in any browser with native [MSE](https://developer.mozilla.org/en-US/docs/Web/API/Media_Source_Extensions_API) support. For browsers that do not support MSE (e.g. Safari on iOS), you will need to use the native video element. Both flows are included in the example below.

```js
// This browser supports MSE and EME, so we can use hls.js
if (Hls.isSupported()) {
  var hls = new Hls({
    emeEnabled: true,
    drmSystems: {
      'com.widevine.alpha': {
        licenseUrl: 'https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
      },
      'com.microsoft.playready': {
        licenseUrl: 'https://license.mux.com/license/playready/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
      },
      'com.apple.fps': {
        licenseUrl: 'https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
        serverCertificateUrl: 'https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      }
    }
  });

  // Attach hls.js to the video element and load the HLS stream.
  hls.loadSource(mediaUrl);
  hls.attachMedia(video);
// This browser supports EME but not MSE, so we need to use the native video element
} else if (video.canPlayType('application/x-mpegURL')) {
  video.src = mediaUrl;

  video.addEventListener('encrypted', async function(event) {
    const initDataType = event.initDataType;
    const initData = event.initData;

    // Retrieve a MediaKeySystemAccess object to interact with the DRM system
    const access = await navigator.requestMediaKeySystemAccess('com.apple.fps', [{
        initDataTypes: [initDataType],
        videoCapabilities: [{ contentType: 'application/vnd.apple.mpegurl', robustness: '' }],
        distinctiveIdentifier: 'not-allowed',
        persistentState: 'not-allowed',
        sessionTypes: ['temporary'],
    }]);

    if (!access) {
        console.error('Cannot play DRM-protected content with current security configuration on this browser. Try playing in another browser.');
        return;
    }

    // Create DRM keys
    const keys = await access.createMediaKeys();

    // Get the FairPlay license and certificate
    const certificate = await fetch('https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}').then(async (res) => {
        const keyBuffer = await res.arrayBuffer();
        return new Uint8Array(keyBuffer);
    });

    if (!certificate) {
        console.error('Failed to fetch certificate');
        return;
    }

    // Attach the certificate to the DRM keys
    await keys.setServerCertificate(certificate);

    // Attach the keys to the video element
    await video.setMediaKeys(keys);

    // Create a playback session
    const session = (video.mediaKeys).createSession();

    // Create the data necessary to make a DRM license request
    const message = await new Promise((resolve, reject) => {
        session.generateRequest(initDataType, initData);
        session.addEventListener('message', (messageEvent) => {
            resolve(messageEvent.message);
        }, { once: true });
    });

    // Get a DRM license
    const response = await fetch('https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}', {
        method: 'POST',
        headers: { 'Content-type': 'application/octet-stream' },
        body: message,
    });

    // Attach the license key to the session
    const licenseData = await response.arrayBuffer();
    await session.update(licenseData);
  });
}
```

For more details, check out [the HLS.js DRM docs](https://github.com/video-dev/hls.js/blob/master/docs/API.md#drmsystems).

##### Video.js

[Video.js](http://videojs.com) supports DRM via the [`videojs-contrib-eme` plugin](https://github.com/videojs/videojs-contrib-eme).

```js
const player = videojs('vid1', {});

player.eme();
player.src({
  src: 'https://stream.mux.com/{playback-id}.m3u8?token={JWT}',
  type: 'application/x-mpegURL',
  keySystems: {
    'com.widevine.alpha': 'https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
    'com.apple.fps.1_0': {
      certificateUri: 'https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      licenseUri: 'https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
    },
    'com.microsoft.playready': 'https://license.mux.com/license/playready/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
  }
});
```

For more details, check out the [videojs-contrib-eme docs](https://github.com/videojs/videojs-contrib-eme?tab=readme-ov-file#initialization).

##### Shaka player

Shaka player supports DRM via configuration keys.

```js
player.configure({
  drm: {
    servers: {
      'com.widevine.alpha': 'https://license.mux.com/license/widevine/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      'com.apple.fps.1_0': 'https://license.mux.com/license/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}',
      'com.microsoft.playready': 'https://license.mux.com/license/playready/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
    },
    advanced: {
      'com.apple.fps.1_0': {
        serverCertificateUri: 'https://license.mux.com/appcert/fairplay/${PLAYBACK_ID}?token=${DRM_LICENSE_TOKEN}'
      }
    }
  }
});
```

For more details, check out the [Shaka player DRM docs](https://shaka-player-demo.appspot.com/docs/api/tutorial-drm-config.html).

##### Other players

While we have only tested playback with the players above, there are many others that will work just fine. If your platform supports playback of CMAF-packaged HLS streams with Widevine, PlayReady, or FairPlay DRM using CBCS encryption, then Mux's DRM will likely work. If you would like us to support additional platforms, [let us know](/support/human).

### Testing that DRM is working

Checking that your video is DRM protected is pretty simple: just take a screenshot! If DRM is working correctly, you should see the video replaced with either a black rectangle or a single frame from the start of the video.

## Configure DRM security levels

Mux's DRM feature currently defaults to a balance of security and playability, automatically leveraging higher security levels on devices where they are available.

You may want to adjust this balance to increase security levels, for example to meet contractual Hollywood studio security requirements. [Please contact us](/support) if you need to discuss or adjust the security levels used.

In the future, we will allow self-serve adjustment of security levels through the DRM Configurations API.

In line with common industry practices, only video tracks are currently DRM protected, meaning that audio-only assets and audio-only live streams are not protected by DRM.

## Pricing

DRM is an add-on feature to Mux Video, with a $100/month access fee + $0.003 "[per license](#what-is-a-drm-license)", and discounts available for high volumes.

### What is a DRM license?

One DRM license request typically corresponds to one video view. When a viewer starts watching a DRM-protected video, the player requests a license to decrypt and play the content. While licenses and video views usually line up, the exact count can vary depending on each player's caching behavior.
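
Putting the two numbers together, an estimated monthly cost at list price is a flat fee plus per-license charges. A rough sketch, before any volume discounts:

```javascript
// Estimate monthly Mux DRM cost in USD at list price, before volume discounts.
const ACCESS_FEE = 100;      // flat monthly access fee
const PER_LICENSE = 0.003;   // per DRM license, roughly one per video view

const monthlyDrmCost = (licenses) => ACCESS_FEE + licenses * PER_LICENSE;

console.log(monthlyDrmCost(100000)); // → 400
```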


# Play DRM protected videos on Google Cast Devices
Learn how to use Google Cast with DRM-protected Mux Video content.
Google Cast is a popular method for sending video from one device to be played on another, often from a phone to a TV. Most players support Google Cast out of the box, but if your video is protected by DRM, you will need to do a little more work.

## Overview

Google Cast integrations are made up of two parts, a "sender" and a "receiver". In the example of a phone casting to a TV, the player in a mobile browser is the sender and the receiver is a webpage sent to the TV.

There are quite a few steps in this process, so here's a quick overview:

1. Create a sender, either with the Mux Player or by writing your own
2. Create your playback ID, playback token and DRM token, and add them to the sender
3. Register your test device in the Google Cast dashboard
4. Create a custom receiver and register it on the Google Cast dashboard
5. Add the registered receiver ID to the custom sender

## Sender Setup

A Google Cast sender is an app with a "cast" button. Clicking the cast button performs two actions.

1. If you're not yet connected to another device, the cast button helps set up that connection.
2. Once you're connected to a device, the sender sends everything needed to play a video to the receiver.

In this section we'll walk through how to write your own web-based Google Cast sender.

### Mux Player Sender

The easiest way to set up your own sender is with [Mux Player for Web](/docs/guides/mux-player-web). In addition to the usual `playback-id` property, you'll need to include additional security and Google Cast fields:

* `playback-token`: A signed playback token, as documented in our [Guide to Secured Video Playback](/docs/guides/secure-video-playback)
* `drm-token`: A signed DRM token, as documented in the [Sign a DRM license token](/docs/guides/protect-videos-with-drm#sign-a-drm-license-token) section of our DRM guide
* `cast-receiver`: The application ID of your custom receiver, as documented in the [Custom Receiver](#receiver-setup) section below

<Callout type="info">
  This is only supported in Mux Player version 3.4.1 or greater
</Callout>

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>
<mux-player
  id="player"
  playback-id="your-playback-id"
  playback-token="your-playback-token"
  drm-token="your-drm-token"
  cast-receiver="your-cast-receiver-app-id"
></mux-player>
```

### Custom Web Sender

If you're not using Mux Player, Google offers SDKs for [Web](https://developers.google.com/cast/docs/web_sender), [Android](https://developers.google.com/cast/docs/android_sender) and [iOS](https://developers.google.com/cast/docs/ios_sender). In this section we'll walk through installing the Web SDK and sending a video with DRM to a custom receiver.

#### Requirements

Before you build your own custom web sender, you'll need to create a [playback token](https://www.mux.com/docs/guides/secure-video-playback) and a [DRM license token](https://www.mux.com/docs/guides/protect-videos-with-drm#sign-a-drm-license-token).

**Note:** Google Cast and DRM only works in secure contexts, such as HTTPS or localhost.

#### Import Cast SDK

Include the following script wherever you want to show a cast button.

```html
<script src="https://www.gstatic.com/cv/js/sender/v1/cast_sender.js?loadCastFramework=1"></script>
```

#### Configure Cast SDK

Before we can cast any videos we need to configure the cast context. The cast framework gives us a great place to do that, in the global `__onGCastApiAvailable` function.

```javascript
window['__onGCastApiAvailable'] = function(isAvailable) {
  if (isAvailable) {
    cast.framework.CastContext.getInstance().setOptions({
      receiverApplicationId: 'your-receiver-app-id',
      autoJoinPolicy: chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED,
    });
  }
};
```

For DRM, you'll need to build your own receiver and add the ID to `receiverApplicationId`. We can help you with that in our [custom receiver guide](#receiver-setup).

#### Send Video to Receiver

Let's write a function to encapsulate sending our video over to the receiver.

First we're going to collect all the data we need to play the video in variables at the top.

```javascript
function playVideo(context) {
  const playbackId = 'your-playback-id';
  const playbackToken = 'your-playback-token';
  const drmToken = 'your-drm-token';
  const mediaUrl = `https://stream.mux.com/${playbackId}.m3u8?token=${playbackToken}`;
```

Each of these variables helps identify a single asset.

* `playbackId`: An asset can have one or many playback IDs. This is different from the asset ID. You can find it in the API, or in the Mux Dashboard.
* `playbackToken`: A signed playback token, as documented in our [Guide to Secured Video Playback](https://www.mux.com/docs/guides/secure-video-playback).
* `drmToken`: A signed DRM token, as documented in the [Sign a DRM license token](https://www.mux.com/docs/guides/protect-videos-with-drm#sign-a-drm-license-token) section of our DRM guide.

Then we'll build a `MediaInfo` object to include all the information Google Cast needs to play the video.

```javascript
let mediaInfo = new chrome.cast.media.MediaInfo(mediaUrl, 'application/x-mpegurl');

// Mux HLS URLs with DRM will always use `fmp4` segments.
mediaInfo.hlsSegmentFormat = chrome.cast.media.HlsSegmentFormat.FMP4;
mediaInfo.hlsVideoSegmentFormat = chrome.cast.media.HlsVideoSegmentFormat.FMP4;

// Send the information needed to create a new license url.
mediaInfo.customData = {
  mux: {
    playbackId,
    tokens: {
      drm: drmToken
    }
  }
}
```

And finally we'll ask the receiver to load the video.

```javascript
const request = new chrome.cast.media.LoadRequest(mediaInfo);

// Cast the video.
context.getCurrentSession().loadMedia(request).then(() => {
  console.log('Successfully loaded the media');
}).catch((err) => {
  console.log(`Media playback error code: ${err}`);
});
```

Here's the function in its entirety:

<CollapsibleRoot>
  <CollapsibleTrigger>
    View example code
  </CollapsibleTrigger>

  <CollapsibleContent>
    ```javascript
    function playVideo(context) {
      const playbackId = 'your-playback-id';
      const playbackToken = 'your-playback-token';
      const drmToken = 'your-drm-token';
      const mediaUrl = `https://stream.mux.com/${playbackId}.m3u8?token=${playbackToken}`;
      let mediaInfo = new chrome.cast.media.MediaInfo(mediaUrl, 'application/x-mpegurl');

      // Mux HLS URLs with DRM will always use `fmp4` segments.
      mediaInfo.hlsSegmentFormat = chrome.cast.media.HlsSegmentFormat.FMP4;
      mediaInfo.hlsVideoSegmentFormat = chrome.cast.media.HlsVideoSegmentFormat.FMP4;

      // Send the information needed to create a new license url.
      mediaInfo.customData = {
        mux: {
          playbackId,
          tokens: {
            drm: drmToken
          }
        }
      }

      const request = new chrome.cast.media.LoadRequest(mediaInfo);

      // Cast the video.
      context.getCurrentSession().loadMedia(request).then(() => {
        console.log('Successfully loaded the media');
      }).catch((err) => {
        console.log(`Media playback error code: ${err}`);
      });
    }
    ```
  </CollapsibleContent>
</CollapsibleRoot>

Now we need to hook it up to the cast action. In this example we'll send it as soon as possible by listening for the cast session to start.

```javascript
let context = cast.framework.CastContext.getInstance();
context.addEventListener(cast.framework.CastContextEventType.SESSION_STATE_CHANGED, function(event) {
  switch (event.sessionState) {
    case cast.framework.SessionState.SESSION_STARTED:
    case cast.framework.SessionState.SESSION_RESUMED:
      playVideo(context);
      break;
  }
});
```

#### Add Cast Button

Once all your code is hooked up, adding the button is the easiest part. Put this in the HTML of your page and you're good to go.

```html
<google-cast-launcher>Launch</google-cast-launcher>
```

**More Docs**

* [JavaScript SDK](https://developers.google.com/cast/docs/web_sender)
* [iOS SDK](https://developers.google.com/cast/docs/ios_sender)
* [Android SDK](https://developers.google.com/cast/docs/android_sender)

## Receiver Setup

A Google Cast receiver is a web page that receives data from a sender and plays your video. This is always a webpage, written in HTML and JavaScript, even if you're using a mobile SDK for casting.

### Receiver Prerequisites

Before we can write the code, we need to take care of a couple prerequisites.

1. You need a way for devices to access this web page with a public URL. We recommend either you host the files yourself, or use [ngrok](https://ngrok.com/docs/getting-started/) during development to expose local files on a public URL.
2. You need to register the receiver's public URL with Google in the [Google Cast SDK Developer Console](https://cast.google.com/u/1/publish/). If you have not joined the Google Cast Developer program, do so now.
3. Once you have registered your receiver, you will see your app listed in the [dashboard](https://cast.google.com/u/0/publish/) with a unique Application ID. This is the same receiver ID you will configure in the [Mux Player](https://www.mux.com/docs/guides/mux-player-web) or [Custom Sender](#sender-setup).
4. Until your app is published, you can only cast to registered development devices. This registration requires the destination device's serial number. If you can't find the serial number on the outside of the device, you can use the Chrome browser to cast the [dashboard](https://developers.google.com/cast/codelabs/cast-receiver) directly to the device. This will show a new screen, prominently displaying the device's serial number.

### Receiver Implementation

Google's custom receiver SDK has a lot of functionality built in, so we don't have to do a lot of work. The only thing we need to do is manage the DRM license URL. Before we start working with license URLs, we'll want a small HTML file to describe the playback UI. Here's a minimal example.

```html
<!DOCTYPE html>
<html lang="en">
    <head>
        <meta charset="utf-8">
        <title></title>
        <!-- Web Receiver SDK -->
        <script src="//www.gstatic.com/cast/sdk/libs/caf_receiver/v3/cast_receiver_framework.js"></script>
        <!-- Cast Debug Logger -->
        <script src="//www.gstatic.com/cast/sdk/libs/devtools/debug_layer/caf_receiver_logger.js"></script>
    </head>

    <body>
        <cast-media-player></cast-media-player>
        <footer>
            <script src="js/receiver.js"></script>
        </footer>
    </body>
</html>
```

In the above example, the two script tags in the header load the SDK and the Google Cast logger. Further on you'll see a `<cast-media-player>`, which the Google Cast SDK automatically turns into a video player, and a script for managing the DRM.

Now let's create that script. To start, we're going to grab a reference to the cast context and configure the logger.

```javascript
const context = cast.framework.CastReceiverContext.getInstance();

/**
 * DEBUGGING
 */
const castDebugLogger = cast.debug.CastDebugLogger.getInstance();
const LOG_TAG = 'MUX';
castDebugLogger.setEnabled(true);

// Debug overlay on tv screen.
// You don't need this if you're debugging using the cast tool (https://casttool.appspot.com/cactool) as it will show the logs in your browser.
castDebugLogger.showDebugLogs(true);

castDebugLogger.loggerLevelByTags = {
  [LOG_TAG]: cast.framework.LoggerLevel.DEBUG,
};
```

Next we're going to intercept all playback requests and see if the request includes DRM license information.

```javascript
context.getPlayerManager().setMediaPlaybackInfoHandler((loadRequest, playbackConfig) => {
  const customData = loadRequest.media.customData || {};

  if(customData.mux && customData.mux.tokens.drm){
```

In that conditional, let's build our license URL and add it to the `playbackConfig`.

```javascript
    playbackConfig.licenseUrl = `https://license.mux.com/license/widevine/${customData.mux.playbackId}?token=${customData.mux.tokens.drm}`;
  }

  playbackConfig.protectionSystem = cast.framework.ContentProtection.WIDEVINE;
  castDebugLogger.debug(LOG_TAG, 'license url', playbackConfig.licenseUrl);

  return playbackConfig;
});
```

Finally, we start listening for incoming requests with the line

```javascript
context.start();
```

Click the button below to view the full JavaScript file.

<CollapsibleRoot>
  <CollapsibleTrigger>
    View example code
  </CollapsibleTrigger>

  <CollapsibleContent>
    ```javascript
    const context = cast.framework.CastReceiverContext.getInstance();

    /**
     * DEBUGGING
     */
    // https://developers.google.com/cast/docs/debugging/cast_debug_logger
    const castDebugLogger = cast.debug.CastDebugLogger.getInstance();
    const LOG_TAG = 'MUX';
    castDebugLogger.setEnabled(true);

    // Debug overlay on tv screen. You don't need this if you're debugging using the cast tool (https://casttool.appspot.com/cactool) as it will show the logs in your browser.
    castDebugLogger.showDebugLogs(true);

    castDebugLogger.loggerLevelByTags = {
        [LOG_TAG]: cast.framework.LoggerLevel.DEBUG,
    };

    /**
     * DRM SUPPORT
     */
    context.getPlayerManager().setMediaPlaybackInfoHandler((loadRequest, playbackConfig) => {
      const customData = loadRequest.media.customData || {};

      if(customData.mux && customData.mux.tokens.drm){
        castDebugLogger.debug(LOG_TAG, 'Setting license URL.');
        playbackConfig.licenseUrl = `https://license.mux.com/license/widevine/${customData.mux.playbackId}?token=${customData.mux.tokens.drm}`;
      }

      playbackConfig.protectionSystem = cast.framework.ContentProtection.WIDEVINE;

      castDebugLogger.debug(LOG_TAG, 'license url', playbackConfig.licenseUrl);

      return playbackConfig;
    });

    /**
     * START LISTENING FOR CASTS
     */
    context.start();
    ```
  </CollapsibleContent>
</CollapsibleRoot>

## More docs

* [Google's custom receiver docs](https://developers.google.com/cast/codelabs/cast-receiver#0)
* [Debug logger docs](https://developers.google.com/cast/docs/debugging/cast_debug_logger)
* [Adding the Mux Data SDK to Chromecast](https://www.mux.com/docs/guides/monitor-chromecast)

## Testing

Once you've completed every step (triple-check the [overview](#overview) steps!), load up your sender, click the cast button, and choose to cast to your test device. After a quick loading screen, your DRM-protected video will start playing.

If you're testing our example, the video will automatically start playing behind the cast debug log. You can remove the log by commenting out the line `castDebugLogger.setEnabled(true);` in your custom receiver.


# Limit which Environments a user has access to in the Dashboard
Learn how to restrict which Environments a user can see in the Dashboard
This feature allows Admins to limit which Environments a given user can access within the Dashboard.

## Admin use of Environment restrictions

All management of Environment access is done in the Dashboard under **User > Organization**.

Admins have the following permissions:

* Access to all Environments
* View what Environments any given user has access to
* See what users can access a specific Environment
* Apply Environment restrictions to a user invitation
* Manage Environment restrictions for all users

## Inviting new users with Environment restrictions

Upon inviting a new user to an organization, Admins must provide at least one (1) Environment that the new user can access.

## Modify Environment access for existing users

Admins can modify the Environment restrictions for users at any time. Clicking on a user under **User > Organization** will display a list of the Environments that user has access to in the Dashboard, which can be toggled on/off and applied on Save. All users must have access to at least one (1) environment.


# Add high-performance video to your Next.js application
Use our API and components to handle embedding, storing, and streaming video in your Next.js application
<Callout type="info">
  Mux is available as a native integration through the [Vercel Marketplace](https://vercel.com/marketplace/mux). Visit the [Vercel documentation](https://vercel.com/docs) for specific guidance related to getting up and running with Mux on Vercel.
</Callout>

## When should you use Mux with Next.js?

When adding video to your Next.js app, you'll encounter some common hurdles. First, videos are large. Storing them in your public directory can lead to excessive bandwidth consumption and poor Git repository performance. Next, it's important to compress and optimize your videos for the web. Then, as network conditions change, you might want to adapt the quality of your video to ensure a smooth playback experience for your users. Finally, you may want to integrate additional features like captions, thumbnails, and analytics.

You might consider using Mux's APIs and components to handle these challenges, [and more](https://www.mux.com/features).

## Quickly drop in a video with next-video

[`next-video`](https://next-video.dev) is a React component, [maintained by Mux](https://github.com/muxinc/next-video), for adding video to your Next.js application. It extends both the `<video>` element and your Next app with features to simplify video uploading, storage, and playback.

To get started...

1. Run the install script: `npx -y next-video init`. This will install the `next-video` package, update your `next.config.js` and TypeScript configuration, and create a `/videos` folder in your project.
2. Add a video to your `/videos` folder. Mux will upload, store, and optimize it for you.
3. Add the component to your app:

```jsx
import Video from 'next-video';
import myVideo from '/videos/my-video.mp4'; 
 
export default function Page() { 
 return <Video src={myVideo} />;
}
```

Check out the [`next-video` docs](https://next-video.dev/docs) to learn more.

## Use the API and our components for full control

If you're looking to build your own video workflow that enables uploading, playback, and more in your application, you can use the Mux API and components like [Mux Player](/docs/guides/mux-player-web) and [Mux Uploader](/docs/guides/mux-uploader).

### Example: allowing users to upload video to your app

One reason you might want to build your own video workflow is when you want to allow users to upload video to your app.

Let's start by adding a new page where users can upload videos. This will involve using the [Mux Uploader](/docs/guides/mux-uploader) component, which will upload videos to a Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Direct Uploads URL</ApiRefLink>.

In the code sample below, we'll create an upload URL using the [Mux Node SDK](https://github.com/muxinc/mux-node-sdk) and the Direct Uploads URL API. We'll pass that URL to the Mux Uploader component, which will handle uploading for us.

```appDirJs

import Mux from '@mux/mux-node';
import MuxUploader from '@mux/mux-uploader-react';

const client = new Mux({
  tokenId: process.env['MUX_TOKEN_ID'],
  tokenSecret: process.env['MUX_TOKEN_SECRET'],
});

export default async function Page() {
  const directUpload = await client.video.uploads.create({
    cors_origin: '*',
    new_asset_settings: {
      playback_policy: ['public'],
    },
  });

  return <MuxUploader endpoint={directUpload.url} />;
}

```

```appDirTs

import Mux from '@mux/mux-node';
import MuxUploader from '@mux/mux-uploader-react';

const client = new Mux({
  tokenId: process.env['MUX_TOKEN_ID'],
  tokenSecret: process.env['MUX_TOKEN_SECRET'],
});

export default async function Page() {
  const directUpload = await client.video.uploads.create({
    cors_origin: '*',
    new_asset_settings: {
      playback_policy: ['public'],
    },
  });

  return <MuxUploader endpoint={directUpload.url} />;
}

```

```pagesDirJs

import Mux from '@mux/mux-node';
import MuxUploader from '@mux/mux-uploader-react';

const client = new Mux({
  tokenId: process.env['MUX_TOKEN_ID'],
  tokenSecret: process.env['MUX_TOKEN_SECRET'],
});

export const getServerSideProps = async () => {
  const directUpload = await client.video.uploads.create({
    cors_origin: '*',
    new_asset_settings: {
      playback_policy: ['public'],
    },
  });

  return {
    props: {
      directUpload,
    },
  };
}

export default function Page({ directUpload }) {
  return <MuxUploader endpoint={directUpload.url} />;
}
```

```pagesDirTs

import type { InferGetServerSidePropsType, GetServerSideProps } from 'next'

import Mux, { type Upload } from '@mux/mux-node';
import MuxUploader from '@mux/mux-uploader-react';

const client = new Mux({
  tokenId: process.env['MUX_TOKEN_ID'],
  tokenSecret: process.env['MUX_TOKEN_SECRET'],
});

export const getServerSideProps = (async () => {
  const directUpload = await client.video.uploads.create({
    cors_origin: '*',
    new_asset_settings: {
      playback_policy: ['public'],
    },
  });

  return {
    props: {
      directUpload,
    },
  };
}) satisfies GetServerSideProps<{ directUpload: Upload }>

export default function Page({
  directUpload
}: InferGetServerSidePropsType<typeof getServerSideProps>) {
  return <MuxUploader endpoint={directUpload.url} />;
}
```



<Callout type="warning">
  In production, you'll want to apply additional security measures to your upload URL. Consider protecting the route with authentication to prevent unauthorized users from uploading videos. Also, use `cors_origin` and consider [`playback_policy`](/docs/guides/secure-video-playback) to further restrict where uploads can be performed and who can view uploaded videos.
</Callout>
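As a sketch of that hardening, the options passed to `client.video.uploads.create` might look like the following in production. The domain here is a placeholder for your own, and a `signed` playback policy means viewers will need a signed playback token:

```javascript
// Hypothetical production-hardened options for client.video.uploads.create.
// Replace the origin with your own domain; the "signed" playback policy
// requires issuing signed playback tokens on the playback side.
const directUploadOptions = {
  cors_origin: 'https://your-domain.com', // only allow uploads from your site
  new_asset_settings: {
    playback_policy: ['signed'], // viewers need a signed token to play back
  },
};

console.log(directUploadOptions.cors_origin);
// → https://your-domain.com
```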

Next, we'll create an API endpoint that will [listen for Mux webhooks](/docs/core/listen-for-webhooks). When we receive the notification that the video has finished uploading and is ready for playback, we'll add the video's metadata to our database.

```appDirJs

export async function POST(request) {
  const body = await request.json();
  const { type, data } = body;

  if (type === 'video.asset.ready') {
    await saveAssetToDatabase(data);
  } else {
    /* handle other event types */
  }
  return Response.json({ message: 'ok' });
}

```

```appDirTs

export async function POST(request: Request) {
  const body = await request.json();
  const { type, data } = body;

  if (type === 'video.asset.ready') {
    await saveAssetToDatabase(data);
  } else {
    /* handle other event types */
  }
  return Response.json({ message: 'ok' });
}

```

```pagesDirJs

export default async function muxWebhookHandler (req, res) {
  const { method, body } = req;

  switch (method) {
    case 'POST': {
      const { data, type } = body;

      if (type === 'video.asset.ready') {
        await saveAssetToDatabase(data);
      } else {
        /* handle other event types */
      }
      res.json({ message: 'ok' });
      break;
    }
    default:
      res.setHeader('Allow', ['POST']);
      res.status(405).end(`Method ${method} Not Allowed`);
  }
}

```

```pagesDirTs

import { NextApiRequest, NextApiResponse } from 'next';

export default async function muxWebhookHandler (req: NextApiRequest, res: NextApiResponse): Promise<void> {
  const { method, body } = req;

  switch (method) {
    case 'POST': {
      const { data, type } = body;

      if (type === 'video.asset.ready') {
        await saveAssetToDatabase(data);
      } else {
        /* handle other event types */
      }
      res.json({ message: 'ok' });
      break;
    }
    default:
      res.setHeader('Allow', ['POST']);
      res.status(405).end(`Method ${method} Not Allowed`);
  }
}

```
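The handlers above trust the request body as-is. In production, you'll also want to verify that the payload actually came from Mux — the Mux Node SDK's `mux.webhooks.unwrap` validates the signature against a signing secret from your dashboard's webhook settings — and you may want to route more event types than just `video.asset.ready`. Here's a minimal sketch of the routing half (the verification call is shown as a comment, since it needs a live secret; the function and return values are illustrative, not a Mux API):

```javascript
// Hypothetical event router for a Mux webhook handler.
// In the real handler you would first verify and parse the raw body, e.g.:
//   const event = mux.webhooks.unwrap(rawBody, request.headers, process.env.MUX_WEBHOOK_SIGNING_SECRET);
function routeMuxEvent(event) {
  switch (event.type) {
    case 'video.upload.asset_created':
      // the upload finished; event.data.asset_id can be saved to your database
      return 'save-asset-id';
    case 'video.asset.ready':
      // the asset is encoded; its playback IDs are ready to use
      return 'save-playback-id';
    default:
      // many more event types exist; acknowledge them with a 2xx either way
      return 'ignore';
  }
}

console.log(routeMuxEvent({ type: 'video.asset.ready', data: {} }));
// → save-playback-id
```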



Finally, let's make a playback page. We retrieve the video metadata from our database, and play it by passing its `playbackId` to [Mux Player](/docs/guides/mux-player-web):

```appDirJs

import Mux from '@mux/mux-node';
import MuxPlayer from '@mux/mux-player-react';

const mux = new Mux();

export default async function Page({ params }) {
  /* Get the asset metadata from your database here or directly from Mux like below. */
  const asset = await mux.video.assets.retrieve(params.id);
  return <MuxPlayer playbackId={asset.playback_ids?.[0].id} accentColor="#ac39f2" />;
}

```

```appDirTs

import Mux from '@mux/mux-node';
import MuxPlayer from '@mux/mux-player-react';

const mux = new Mux();

export default async function Page({ params }: { params: { id: string } }) {
  /* Get the asset metadata from your database here or directly from Mux like below. */
  const asset = await mux.video.assets.retrieve(params.id);
  return <MuxPlayer playbackId={asset.playback_ids?.[0].id!} accentColor="#ac39f2" />;
}

```

```pagesDirJs

import Mux from '@mux/mux-node';
import MuxPlayer from '@mux/mux-player-react';

const mux = new Mux();

// getStaticProps on a dynamic route also requires getStaticPaths
export const getStaticPaths = async () => ({
  paths: [],
  fallback: 'blocking',
});

export const getStaticProps = async ({ params }) => {
  /* Get the asset metadata from your database here or directly from Mux like below. */
  const asset = await mux.video.assets.retrieve(params.id);
  return {
    props: {
      asset,
    },
  };
}

export default function Page({ asset }) {
  return <MuxPlayer playbackId={asset.playback_ids?.[0].id} accentColor="#ac39f2" />;
}

```

```pagesDirTs

import type { InferGetStaticPropsType, GetStaticProps, GetStaticPaths } from 'next';

import Mux from '@mux/mux-node';
import MuxPlayer from '@mux/mux-player-react';

const mux = new Mux();

// getStaticProps on a dynamic route also requires getStaticPaths
export const getStaticPaths = (async () => ({
  paths: [],
  fallback: 'blocking',
})) satisfies GetStaticPaths;

export const getStaticProps = (async ({ params }) => {
  /* Get the asset metadata from your database here or directly from Mux like below. */
  const asset = await mux.video.assets.retrieve(params.id);
  return {
    props: {
      asset,
    },
  };
}) satisfies GetStaticProps<{ asset: Awaited<ReturnType<typeof mux.video.assets.retrieve>> }>;

export default function Page({
  asset
}: InferGetStaticPropsType<typeof getStaticProps>) {
  return <MuxPlayer playbackId={asset.playback_ids?.[0].id!} accentColor="#ac39f2" />;
}

```



And we've got upload and playback. Nice!

What's next? You can [integrate with your CMS](/docs/integrations/cms). You can [optimize your loading experience](/docs/guides/player-lazy-loading). Or get started with an example project below:

## Example projects

<GuideCard
  title="Video Course Starter Kit"
  description={<p>If you’re a developer, you’ve probably seen and used platforms like <a href="https://egghead.io/">Egghead</a>, <a href="https://leveluptutorials.com/">LevelUp Tutorials</a>, <a href="https://www.coursera.org/">Coursera</a>, etc. This is your starter kit for building something similar with Next.js + Mux, complete with GitHub OAuth, course creation, video lessons, and progress tracking for viewers.</p>}
  links={[
    {
      title: "View project →",
      href: "https://github.com/muxinc/video-course-starter-kit",
    },
  ]}
/>

<GuideCard
  title="with-mux-video"
  description={<>
    <p>This is a bare-bones starter application with Next.js that uses:</p>
    <ul>
      <li>Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads">Direct Uploads</ApiRefLink></li>
      <li>Mux <a href="/docs/guides/video" title="Mux Video">Video</a> + Mux <a href="/docs/guides/data" title="Mux Data">Data</a></li>
      <li>Mux <a href="/docs/guides/mux-player-web" title="Mux Player">Player</a></li>
    </ul>
  </>}
  links={[
    {
      title: "View project →",
      href: "https://github.com/vercel/next.js/tree/931eee87be8af86bd95336deade5870ad5e04669/examples/with-mux-video",
    },
  ]}
/>

<GuideCard
  title="stream.new"
  description={<>
    <p>Stream.new is an open source Next.js application that features:</p>
    <ul>
      <li>Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads">Direct Uploads</ApiRefLink></li>
      <li>Content Moderation with Google Vision or Hive.ai (<a href="https://www.mux.com/blog/you-either-die-an-mvp-or-live-long-enough-to-build-content-moderation">Read more</a>)</li>
    </ul>
  </>}
  links={[
    {
      title: "View project →",
      href: "https://github.com/muxinc/stream.new",
    },
  ]}
/>


# Add high-performance video to your Remix.js application
Use our API and components to handle embedding, storing, and streaming video in your Remix.js application
## When should you use Mux with Remix.js?

When adding video to your Remix.js app, you'll encounter some common hurdles. First, videos are large. Storing them in your public directory can lead to excessive bandwidth consumption and poor Git repository performance. Next, it's important to compress and optimize your videos for the web. Then, as network conditions change, you might want to adapt the quality of your video to ensure a smooth playback experience for your users. Finally, you may want to integrate additional features like captions, thumbnails, and analytics.

You might consider using Mux's APIs and components to handle these challenges, [and more](https://www.mux.com/features).

## Quickly drop in a video with Mux Player

The quickest way to add a video to your site is with [Mux Player](/docs/guides/mux-player-web). Here's what Mux Player looks like in action:

```jsx
import MuxPlayer from "@mux/mux-player-react";

export default function Page() {
  return (
    <MuxPlayer
      playbackId="jwmIE4m9De02B8TLpBHxOHX7ywGnjWxYQxork1Jn5ffE"
      metadata={{
        video_title: "Test video title",
        viewer_user_id: "user-id-007",
      }}
    />
  );
}
```

If your site has just a few videos, you might upload them to Mux directly through the dashboard. In the [Mux Dashboard](https://dashboard.mux.com/), on your video assets page, select "Create New Asset". On the next screen, you can upload a video directly to Mux.

<Image src="/docs/images/dashboard-create-new-asset.png" width={2404} height={350} alt="In the upper-right corner of the Mux Dashboard is a button labeled &#x22;Create New Asset&#x22;" />

You'll then be able to see your new asset on your video assets page. When you click on the asset, you can find the asset's playback ID in the "Playback and Thumbnails" tab. This playback ID can be used in the `playbackId` prop of the Mux Player component.

<Image src="/docs/images/dashboard-playback-id.png" width={2404} height={350} alt="In the playback and thumbnails tab of an asset you can find the playback ID, as well as more information on how to play the video." />
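Under the hood, a public playback ID maps to an HLS stream on `stream.mux.com`, which is what Mux Player loads for you. If you ever need the raw stream URL — for a different player, say — it can be built like this (a small sketch; the helper name is ours, and it only applies to public playback IDs):

```javascript
// Build the HLS stream URL for a public playback ID.
// Mux Player does this for you when you pass playbackId.
function hlsUrl(playbackId) {
  return `https://stream.mux.com/${playbackId}.m3u8`;
}

console.log(hlsUrl('jwmIE4m9De02B8TLpBHxOHX7ywGnjWxYQxork1Jn5ffE'));
// → https://stream.mux.com/jwmIE4m9De02B8TLpBHxOHX7ywGnjWxYQxork1Jn5ffE.m3u8
```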

You can read more about Mux Player, including how to customize its look and feel, over in the [Mux Player guides](/docs/guides/mux-player-web).

If you're managing more videos, you might take a look at our [CMS integrations](/docs/integrations/cms).

Finally, if you need more control over your video workflow, read on.

## Use the API to build your video workflow

If you're looking to build your own video workflow that enables uploading, playback, and more in your application, you can use the Mux API and components like [Mux Player](/docs/guides/mux-player-web) and [Mux Uploader](/docs/guides/mux-uploader).

### Example: allowing users to upload video to your app

One reason you might want to build your own video workflow is when you want to allow users to upload video to your app.

Let's start by adding a new page where users can upload videos. This will involve using the [Mux Uploader](/docs/guides/mux-uploader) component, which will upload videos to a Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Direct Uploads URL</ApiRefLink>.

In the code sample below, we'll create an upload URL using the [Mux Node SDK](https://github.com/muxinc/mux-node-sdk) and the Direct Uploads URL API. We'll pass that URL to the Mux Uploader component, which will handle uploading for us.

```jsx

import { json } from "@remix-run/node";
import { useLoaderData } from "@remix-run/react";
import MuxUploader from "@mux/mux-uploader-react";
import mux from "~/lib/mux.server";

export const loader = async () => {
  const upload = await mux.video.uploads.create({
    new_asset_settings: {
      playback_policy: ["public"],
      video_quality: "basic",
    },
    cors_origin: "*",
  });
  return json({ url: upload.url });
};

export default function UploadPage() {
  const { url } = useLoaderData();
  return <MuxUploader endpoint={url} />
}

```

```tsx

import { json } from "@remix-run/node";
import { useLoaderData } from "@remix-run/react";
import MuxUploader from "@mux/mux-uploader-react";
import mux from "~/lib/mux.server";

export const loader = async () => {
  const upload = await mux.video.uploads.create({
    new_asset_settings: {
      playback_policy: ["public"],
      video_quality: "basic",
    },
    cors_origin: "*",
  });
  return json({ url: upload.url });
};

export default function UploadPage() {
  const { url } = useLoaderData<typeof loader>();
  return <MuxUploader endpoint={url} />
}
```



<Callout type="warning">
  In production, you'll want to apply additional security measures to your upload URL. Consider protecting the route with authentication to prevent unauthorized users from uploading videos. Also, use `cors_origin` and consider [`playback_policy`](/docs/guides/secure-video-playback) to further restrict where uploads can be performed and who can view uploaded videos.
</Callout>

Next, we'll create an API endpoint that will [listen for Mux webhooks](/docs/core/listen-for-webhooks). When we receive the notification that the video has finished uploading and is ready for playback, we'll add the video's metadata to our database.

```js

import { json } from "@remix-run/node";
import Mux from "@mux/mux-node";

// this reads your MUX_TOKEN_ID and MUX_TOKEN_SECRET
// from your environment variables
// https://dashboard.mux.com/settings/access-tokens
const mux = new Mux();

// Mux webhooks POST, so let's use an action
export const action = async ({ request }) => {
  if (request.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }

  const body = await request.text();
  // mux.webhooks.unwrap will validate that the given payload was sent by Mux and parse the payload.
  // It will also provide type-safe access to the payload.
  // Generate MUX_WEBHOOK_SIGNING_SECRET in the Mux dashboard
  // https://dashboard.mux.com/settings/webhooks
  const event = mux.webhooks.unwrap(
    body,
    request.headers,
    process.env.MUX_WEBHOOK_SIGNING_SECRET
  );

  // you can also unwrap the payload yourself:
  // const event = await request.json();
  switch (event.type) {
    case "video.upload.asset_created":
      // we might use this to know that an upload has been completed
      // and we can save its assetId to our database
      break;
    case "video.asset.ready":
      // we might use this to know that a video has been encoded
      // and we can save its playbackId to our database
      break;
    // there are many more Mux webhook events
    // check them out at https://www.mux.com/docs/webhook-reference
    default:
      break;
  }

  return json({ message: "ok" });
};
```

```ts

import { json, type ActionFunctionArgs } from "@remix-run/node";
import Mux from "@mux/mux-node";

// this reads your MUX_TOKEN_ID and MUX_TOKEN_SECRET
// from your environment variables
// https://dashboard.mux.com/settings/access-tokens
const mux = new Mux();

// Mux webhooks POST, so let's use an action
export const action = async ({ request }: ActionFunctionArgs) => {
  if (request.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }

  const body = await request.text();
  // mux.webhooks.unwrap will validate that the given payload was sent by Mux and parse the payload.
  // It will also provide type-safe access to the payload.
  // Generate MUX_WEBHOOK_SIGNING_SECRET in the Mux dashboard
  // https://dashboard.mux.com/settings/webhooks
  const event = mux.webhooks.unwrap(
    body,
    request.headers,
    process.env.MUX_WEBHOOK_SIGNING_SECRET
  );

  // you can also unwrap the payload yourself:
  // const event = await request.json();
  switch (event.type) {
    case "video.upload.asset_created":
      // we might use this to know that an upload has been completed
      // and we can save its assetId to our database
      break;
    case "video.asset.ready":
      // we might use this to know that a video has been encoded
      // and we can save its playbackId to our database
      break;
    // there are many more Mux webhook events
    // check them out at https://www.mux.com/docs/webhook-reference
    default:
      break;
  }

  return json({ message: "ok" });
};
```



Finally, let's make a playback page. We retrieve the video metadata from our database, and play it by passing its `playbackId` to [Mux Player](/docs/guides/mux-player-web):

```jsx
import { json } from "@remix-run/node";
import MuxPlayer from "@mux/mux-player-react";
import { useLoaderData, useParams } from "@remix-run/react";

export const loader = async ({ params, request }) => {
  // getAssetFromDatabase and getUser are stand-ins for your own helpers
  const { title } = await getAssetFromDatabase(params);
  const userId = await getUser(request);
  return json({ title, userId });
};

export default function Page() {
  const { title, userId } = useLoaderData();
  const { playbackId } = useParams();
  return (
    <MuxPlayer
      playbackId={playbackId}
      metadata={{
        video_title: title,
        viewer_user_id: userId
      }}
    />
  );
}
```

```tsx

import MuxPlayer from "@mux/mux-player-react";
import { json, type LoaderFunctionArgs } from "@remix-run/node";
import { useLoaderData, useParams } from "@remix-run/react";

export const loader = async ({ params, request }: LoaderFunctionArgs) => {
  // getAssetFromDatabase and getUser are stand-ins for your own helpers
  const { title } = await getAssetFromDatabase(params);
  const userId = await getUser(request);
  return json({ title, userId });
};

export default function Page() {
  const { title, userId } = useLoaderData<typeof loader>();
  const { playbackId } = useParams();
  return (
    <MuxPlayer
      playbackId={playbackId}
      metadata={{
        video_title: title,
        viewer_user_id: userId
      }}
    />
  );
}
```



And we've got upload and playback. Nice!

What's next? You can [integrate with your CMS](/docs/integrations/cms). You can [optimize your loading experience](/docs/guides/player-lazy-loading). Or get started with the example project below:

## Example projects

<GuideCard
  title="remix-examples/mux-video"
  description={<>
    <p>This is a bare-bones starter application with Remix.js that uses:</p>
    <ul>
      <li>Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads">Direct Uploads</ApiRefLink> and <a href="/docs/guides/mux-uploader">Mux Uploader</a></li>
      <li>Mux <a href="/docs/guides/video" title="Mux Video">Video</a> + Mux <a href="/docs/guides/data" title="Mux Data">Data</a></li>
      <li>Mux <a href="/docs/guides/mux-player-web" title="Mux Player">Player</a></li>
    </ul>
  </>}
  links={[
    {
      title: "View project →",
      href: "https://github.com/remix-run/examples/tree/main/mux-video",
    },
  ]}
/>


# Add high-performance video to your SvelteKit application
Use our API and components to handle embedding, storing, and streaming video in your SvelteKit application
## When should you use Mux with Svelte?

When adding video to your SvelteKit app, you'll encounter some common hurdles. First, videos are large. Storing them in your public directory can lead to excessive bandwidth consumption and poor Git repository performance. Next, it's important to compress and optimize your videos for the web. Then, as network conditions change, you might want to adapt the quality of your video to ensure a smooth playback experience for your users. Finally, you may want to integrate additional features like captions, thumbnails, and analytics.

You might consider using Mux's APIs and components to handle these challenges, [and more](https://www.mux.com/features).

## Quickly drop in a video with Mux Player

The quickest way to add a video to your site is with [Mux Player](/docs/guides/mux-player-web). Here's what Mux Player looks like in action:

```svelte
<script lang="ts">
  import "@mux/mux-player";
</script>

<mux-player
  playback-id="jwmIE4m9De02B8TLpBHxOHX7ywGnjWxYQxork1Jn5ffE"
  metadata-video-title="Test VOD"
  metadata-viewer-user-id="user-id-007"
></mux-player>
```

If your site has just a few videos, you might upload them to Mux directly through the dashboard. In the [Mux Dashboard](https://dashboard.mux.com/), on your video assets page, select "Create New Asset". On the next screen, you can upload a video directly to Mux.

<Image src="/docs/images/dashboard-create-new-asset.png" width={2404} height={350} alt="In the upper-right corner of the Mux Dashboard is a button labeled &#x22;Create New Asset&#x22;" />

You'll then be able to see your new asset on your video assets page. When you click on the asset, you can find the asset's playback ID in the "Playback and Thumbnails" tab. This playback ID can be used in the `playbackId` prop of the Mux Player component.

<Image src="/docs/images/dashboard-playback-id.png" width={2404} height={350} alt="In the playback and thumbnails tab of an asset you can find the playback ID, as well as more information on how to play the video." />

You can read more about Mux Player, including how to customize its look and feel, over in the [Mux Player guides](/docs/guides/mux-player-web).

If you're managing more videos, you might take a look at our [CMS integrations](/docs/integrations/cms).

Finally, if you need more control over your video workflow, read on.

## Use the API to build your video workflow

If you're looking to build your own video workflow that enables uploading, playback, and more in your application, you can use the Mux API and components like [Mux Player](/docs/guides/mux-player-web) and [Mux Uploader](/docs/guides/mux-uploader).

### Example: allowing users to upload video to your app

One reason you might want to build your own video workflow is when you want to allow users to upload video to your app.

Let's start by adding the Mux Node SDK to your project. We'll be using this a lot.

```typescript filename=lib/mux.ts
import Mux from '@mux/mux-node';
import { MUX_TOKEN_ID, MUX_TOKEN_SECRET } from '$env/static/private';

const mux = new Mux({
	tokenId: MUX_TOKEN_ID,
	tokenSecret: MUX_TOKEN_SECRET
});

export default mux;
```

Now, we can add a new page where users can upload videos. This will involve using the [Mux Uploader](/docs/guides/mux-uploader) component, which will upload videos to a Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Direct Uploads URL</ApiRefLink>.

We'll start by creating an upload URL using the Direct Uploads URL API.

```js

import mux from '$lib/mux';

export const load = async () => {
	// Create an endpoint for MuxUploader to upload to
	const upload = await mux.video.uploads.create({
		new_asset_settings: {
			playback_policy: ['public'],
			video_quality: 'basic'
		},
		// in production, you'll want to change this origin to your-domain.com
		cors_origin: '*'
	});
	return { id: upload.id, url: upload.url };
}

```

```ts

import mux from '$lib/mux';
import type { PageServerLoad } from './$types';

export const load = (async () => {
	// Create an endpoint for MuxUploader to upload to
	const upload = await mux.video.uploads.create({
		new_asset_settings: {
			playback_policy: ['public'],
			video_quality: 'basic'
		},
		// in production, you'll want to change this origin to your-domain.com
		cors_origin: '*'
	});
	return { id: upload.id, url: upload.url };
}) satisfies PageServerLoad;
```



<Callout type="warning">
  In production, you'll want to apply additional security measures to your upload URL. Consider protecting the route with authentication to prevent unauthorized users from uploading videos. Also, use `cors_origin` and consider [`playback_policy`](/docs/guides/secure-video-playback) to further restrict where uploads can be performed and who can view uploaded videos.
</Callout>

Then, we'll pass that URL to the Mux Uploader component, which will handle uploading for us.

```svelte4Js

<script>
	import '@mux/mux-uploader';
	export let data;
</script>

<mux-uploader endpoint={data.url} />

```

```svelte4Ts

<script lang="ts">
	import '@mux/mux-uploader';
	import type { PageData } from './$types';
	export let data: PageData;
</script>

<mux-uploader endpoint={data.url} />

```

```svelte5Js

<script>
	import '@mux/mux-uploader';
	let { data } = $props();
</script>

<mux-uploader endpoint={data.url} />

```

```svelte5Ts

<script lang="ts">
	import '@mux/mux-uploader';
	import type { PageData } from './$types';

	let { data }: { data: PageData } = $props();
</script>

<mux-uploader endpoint={data.url} />

```



Next, we'll create an API endpoint that will [listen for Mux webhooks](/docs/core/listen-for-webhooks). When we receive the notification that the video has finished uploading and is ready for playback, we'll add the video's metadata to our database.

```js

import mux from '$lib/mux';
import { json } from '@sveltejs/kit';
import { MUX_WEBHOOK_SIGNING_SECRET } from '$env/static/private';

export const POST = async ({ request }) => {
	const body = await request.text();
	// mux.webhooks.unwrap will validate that the given payload was sent by Mux and parse the payload.
	// Generate MUX_WEBHOOK_SIGNING_SECRET in the Mux dashboard
	// https://dashboard.mux.com/settings/webhooks
	const event = mux.webhooks.unwrap(body, request.headers, MUX_WEBHOOK_SIGNING_SECRET);

	// you can also unwrap the payload yourself:
	// const event = await request.json();
	switch (event.type) {
		case 'video.upload.asset_created':
			// we might use this to know that an upload has been completed
			// and we can save its assetId to our database
			break;
		case 'video.asset.ready':
			// we might use this to know that a video has been encoded
			// and we can save its playbackId to our database
			break;
		// there are many more Mux webhook events
		// check them out at https://www.mux.com/docs/webhook-reference
		default:
			break;
	}

	return json({ message: 'ok' });
};

```

```ts

import mux from '$lib/mux';
import { json, type RequestHandler } from '@sveltejs/kit';
import { MUX_WEBHOOK_SIGNING_SECRET } from '$env/static/private';

export const POST: RequestHandler = async ({ request }) => {
	const body = await request.text();
	// mux.webhooks.unwrap will validate that the given payload was sent by Mux and parse the payload.
	// It will also provide type-safe access to the payload.
	// Generate MUX_WEBHOOK_SIGNING_SECRET in the Mux dashboard
	// https://dashboard.mux.com/settings/webhooks
	const event = mux.webhooks.unwrap(body, request.headers, MUX_WEBHOOK_SIGNING_SECRET);

	// you can also unwrap the payload yourself:
	// const event = await request.json();
	switch (event.type) {
		case 'video.upload.asset_created':
			// we might use this to know that an upload has been completed
			// and we can save its assetId to our database
			break;
		case 'video.asset.ready':
			// we might use this to know that a video has been encoded
			// and we can save its playbackId to our database
			break;
		// there are many more Mux webhook events
		// check them out at https://www.mux.com/docs/webhook-reference
		default:
			break;
	}

	return json({ message: 'ok' });
};

```



Finally, let's make a playback page. We retrieve the video metadata from our database, and play it by passing its `playbackId` to [Mux Player](/docs/guides/mux-player-web):

```svelte4Js

<script>
	import '@mux/mux-player';
	export let data;
</script>

<mux-player
	playback-id={data.playbackId}
	accent-color="#FF3E00"
/>

```

```svelte4Ts

<script lang="ts">
	import '@mux/mux-player';
	import type { PageData } from './$types';
	export let data: PageData;
</script>

<mux-player
	playback-id={data.playbackId}
	accent-color="#FF3E00"
/>

```

```svelte5Js

<script>
	import '@mux/mux-player';
	let { data } = $props();
</script>

<mux-player
	playback-id={data.playbackId}
	accent-color="#FF3E00"
/>

```

```svelte5Ts

<script lang="ts">
	import '@mux/mux-player';
	import type { PageData } from './$types';

	let { data }: { data: PageData } = $props();
</script>

<mux-player
	playback-id={data.playbackId}
	accent-color="#FF3E00"
/>

```



And we've got upload and playback. Nice!

What's next? You can [integrate with your CMS](/docs/integrations/cms). You can [optimize your loading experience](/docs/guides/player-lazy-loading). Or get started with the example project below:

## Example projects

<GuideCard
  title="muxinc/examples/sveltekit-uploader-and-player"
  description={<>
    <p>This is a bare-bones starter application with SvelteKit that uses:</p>
    <ul>
      <li>Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads">Direct Uploads</ApiRefLink> and <a href="/docs/guides/mux-uploader">Mux Uploader</a></li>
      <li>Mux <a href="/docs/guides/video" title="Mux Video">Video</a> + Mux <a href="/docs/guides/data" title="Mux Data">Data</a></li>
      <li>Mux <a href="/docs/guides/mux-player-web" title="Mux Player">Player</a></li>
    </ul>
  </>}
  links={[
    {
      title: "View project →",
      href: "https://github.com/muxinc/examples/tree/main/sveltekit-uploader-and-player",
    },
  ]}
/>


# Add high-performance video to your Astro application
Use our API and components to handle embedding, storing, and streaming video in your Astro application
## When should you use Mux with Astro?

When adding video to your Astro app, you'll encounter some common hurdles. First, videos are large. Storing them in your public directory can lead to excessive bandwidth consumption and poor Git repository performance. Next, it's important to compress and optimize your videos for the web. Then, as network conditions change, you might want to adapt the quality of your video to ensure a smooth playback experience for your users. Finally, you may want to integrate additional features like captions, thumbnails, and analytics.

You might consider using Mux's APIs and components to handle these challenges, [and more](https://www.mux.com/features).

## Use the API to build your video workflow

If you're looking to build your own video workflow that enables uploading, playback, and more in your application, you can use the Mux API and components like [Mux Player](/docs/guides/mux-player-web) and [Mux Uploader](/docs/guides/mux-uploader).

### Example: allowing users to upload video to your app

One reason you might want to build your own video workflow is when you want to allow users to upload video to your app.

<Callout type="info">
  Much of the work described here is done on the server and is unique for every user. Make sure your Astro app is [in SSR mode](https://docs.astro.build/en/guides/server-side-rendering/) before you begin.
</Callout>

Let's start by adding a new page where users can upload videos. This will involve using the [Mux Uploader](/docs/guides/mux-uploader) component, which will upload videos to a Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads/create-direct-upload">Direct Uploads URL</ApiRefLink>.

First, install the Astro version of Mux Uploader:

```bash
npm install @mux/mux-uploader-astro
```

In the code sample below, we'll create an upload URL using the [Mux Node SDK](https://github.com/muxinc/mux-node-sdk) and the Direct Uploads URL API. We'll pass that URL to the native Astro `<MuxUploader />` component, which will handle uploading for us.

```astro filename=src/pages/index.astro
---
import Layout from '../layouts/Layout.astro';
import Mux from "@mux/mux-node";
import MuxUploader from "@mux/mux-uploader-astro";

const mux = new Mux({
  tokenId: import.meta.env.MUX_TOKEN_ID,
  tokenSecret: import.meta.env.MUX_TOKEN_SECRET
});

const upload = await mux.video.uploads.create({
  new_asset_settings: {
    playback_policy: ['public'],
    video_quality: 'basic',
  },
  cors_origin: '*',
});
---

<Layout title="Upload a video to Mux">
  <MuxUploader endpoint={upload.url} />
</Layout>
```

<Callout type="warning">
  In production, you'll want to apply additional security measures to your upload URL. Consider protecting the route with authentication to prevent unauthorized users from uploading videos. Also, use `cors_origin` and consider [`playback_policy`](/docs/guides/secure-video-playback) to further restrict where uploads can be performed and who can view uploaded videos.
</Callout>

Next, we'll create an API endpoint that will [listen for Mux webhooks](/docs/core/listen-for-webhooks). When we receive the notification that the video has finished uploading and is ready for playback, we'll add the video's metadata to our database.

```ts filename=src/pages/mux-webhook.json.ts
import type { APIRoute } from 'astro';
import mux from '../lib/mux';

export const POST: APIRoute = async ({ request }) => {
  const body = await request.text();
  // mux.webhooks.unwrap will validate that the given payload was sent by Mux and parse the payload.
  // It will also provide type-safe access to the payload.
  // Generate MUX_WEBHOOK_SIGNING_SECRET in the Mux dashboard
  // https://dashboard.mux.com/settings/webhooks
  const event = mux.webhooks.unwrap(
    body,
    request.headers,
    import.meta.env.MUX_WEBHOOK_SIGNING_SECRET
  );

  // you can also unwrap the payload yourself:
  // const event = await request.json();
  switch (event.type) {
    case 'video.upload.asset_created':
      // we might use this to know that an upload has been completed
      // and we can save its assetId to our database
      break;
    case 'video.asset.ready':
      // we might use this to know that a video has been encoded
      // and we can save its playbackId to our database
      break;
    // there are many more Mux webhook events
    // check them out at https://www.mux.com/docs/webhook-reference
    default:
      break;
  }

  return new Response(JSON.stringify({ message: 'ok' }), {
    headers: {
      'Content-Type': 'application/json',
    },
  });
};
```

Finally, let's make a playback page. We retrieve the video metadata from our database, and play it by passing its `playbackId` to [Mux Player](/docs/guides/mux-player-web).

First, install the Astro version of Mux Player:

```bash
npm install @mux/mux-player-astro
```

Now create your playback page using the native Astro `<MuxPlayer />` component:

```astro filename=src/pages/playback/[playbackId].astro
---
import Layout from '../../layouts/Layout.astro';
import MuxPlayer from "@mux/mux-player-astro";

const { playbackId } = Astro.params;
---

<Layout>
  <MuxPlayer
    playbackId={playbackId}
    metadata={{video_title: 'My Video'}}
  />
</Layout>
```

And we've got upload and playback. Nice!

## Retrieving asset data with the Mux Node SDK

You can use the [Mux Node SDK](https://github.com/muxinc/mux-node-sdk) to retrieve information about your videos and pass that data to your Astro components. This is useful for displaying video metadata like title, duration, and upload date.

```bash
npm install @mux/mux-node
```

Here's an example of fetching video asset data and using it in your component:

```astro filename=src/pages/video/[assetId].astro
---
import Layout from '../../layouts/Layout.astro';
import Mux from "@mux/mux-node";
import MuxPlayer from "@mux/mux-player-astro";

const mux = new Mux({
  tokenId: import.meta.env.MUX_TOKEN_ID,
  tokenSecret: import.meta.env.MUX_TOKEN_SECRET,
});

const { assetId } = Astro.params;
const asset = await mux.video.assets.retrieve(assetId);

const playbackId = asset.playback_ids?.find((id) => id.policy === "public")?.id;
const videoTitle = asset?.meta?.title;
const createdAt = Number(asset?.created_at);
const duration = Number(asset?.duration);

const date = new Date(createdAt * 1000).toDateString();
const time = new Date(Math.round(duration) * 1000).toISOString().substring(14, 19);
---

<Layout>
  <h1>My Video Page</h1>
  <p>Title: {videoTitle}</p>
  <p>Upload Date: {date}</p>
  <p>Length: {time}</p>

  <MuxPlayer
    playbackId={playbackId}
    metadata={{video_title: videoTitle}}
  />
</Layout>
```

## Using Mux video element

If you prefer a simpler alternative to Mux Player that provides browser support for HLS playback with automatic Mux Data analytics, you can use the `<mux-video>` web component:

```bash
npm install @mux/mux-video
```

```astro filename=src/components/SimpleVideoPlayer.astro
<script>import '@mux/mux-video'</script>

<mux-video
  playback-id="FOTbeIxKeMPzyhrob722wytaTGI02Y3zbV00NeFQbTbK00"
  metadata-video-title="My Astro Video"
  controls
></mux-video>
```

## Event handling for uploads

You can listen for upload events and handle them with client-side scripts. Here's an example of handling upload events:

```astro filename=src/pages/upload-with-events.astro
---
import Layout from '../layouts/Layout.astro';
import Mux from "@mux/mux-node";
import MuxUploader from "@mux/mux-uploader-astro";

const mux = new Mux({
  tokenId: import.meta.env.MUX_TOKEN_ID,
  tokenSecret: import.meta.env.MUX_TOKEN_SECRET
});

const upload = await mux.video.uploads.create({
  new_asset_settings: {
    playback_policy: ['public'],
    video_quality: 'basic',
  },
  cors_origin: '*',
});
---

<Layout title="Upload with Event Handling">
  <MuxUploader
    id="my-uploader"
    endpoint={upload.url}
    pausable
    maxFileSize={50000}
  />

  <script>
    import type { MuxUploaderElement } from '@mux/mux-uploader-astro';

    const uploader = document.getElementById('my-uploader') as MuxUploaderElement;

    uploader.addEventListener('uploadstart', (event) => {
      console.log('Upload started!', event.detail);
    });

    uploader.addEventListener('success', (event) => {
      console.log('Upload successful!', event.detail);
    });

    uploader.addEventListener('uploaderror', (event) => {
      console.error('Upload error!', event.detail);
    });

    uploader.addEventListener('progress', (event) => {
      console.log('Upload progress: ', event.detail);
    });
  </script>
</Layout>
```

What's next? You can [integrate with your CMS](/docs/integrations/cms). You can [optimize your loading experience](/docs/guides/player-lazy-loading). Or get started with the example project below:

## Example projects

<GuideCard
  title="muxinc/examples/astro-uploader-and-player"
  description={<>
    <p>This is a bare-bones starter application with Astro that uses:</p>
    <ul>
      <li>Mux <ApiRefLink href="/docs/api-reference/video/direct-uploads">Direct Uploads</ApiRefLink> and <a href="/docs/guides/mux-uploader">Mux Uploader</a></li>
      <li>Mux <a href="/docs/guides/video" title="Mux Video">Video</a> + Mux <a href="/docs/guides/data" title="Mux Data">Data</a></li>
      <li>Mux <a href="/docs/guides/mux-player-web" title="Mux Player">Player</a></li>
    </ul>
  </>}
  links={[
    {
      title: "View project →",
      href: "https://github.com/muxinc/examples/tree/main/astro-uploader-and-player",
    },
  ]}
/>


# Add high-performance video to your Laravel application
Use our API and components to handle embedding, storing, and streaming video in your Laravel application
Laravel is one of the most popular PHP frameworks for building websites, but it doesn't have a built-in path for integrating video.

Mux is a video API for developers that makes it easy to upload and manage your library of video and audio content. We'll handle the video end-to-end for you from upload, encoding, generating thumbnails and captions, right through to playback and customising the player experience.

Here we've outlined some techniques and libraries you can use to make integrating with Mux as smooth as possible.

## Listening for webhooks

Everything that happens to your videos on Mux triggers a webhook that notifies you of the change. This can include an asset becoming ready for playback, a live stream connecting, an asset being deleted, and [many others](/docs/webhook-reference).

Read our [webhook guide](/docs/core/listen-for-webhooks) to learn how to get set up to handle webhooks in general.

In Laravel you would set up a route that looks like this:

```php
// routes/api.php
use App\Http\Controllers\WebhookController;

Route::post('webhook/endpoint', [WebhookController::class, 'handle']);
```

Which references this `WebhookController`:

```php
// app/Http/Controllers/WebhookController.php
namespace App\Http\Controllers;

use Illuminate\Http\Request;

class WebhookController extends Controller
{
    public function handle(Request $request)
    {
        $event = $request->json()->all();

        switch ($event['type'] ?? null) {
            case 'video.asset.ready':
                // Save the asset ID and playback ID to your database
                break;
            case 'video.upload.errored':
                // Store the failed state so you can surface it to users
                break;
        }

        return response()->json(['success' => true]);
    }
}
```

This controller will be in charge of storing references to your videos that have successfully been uploaded and processed. It will also be notified if an upload fails for any reason; you might want to store this state too so it can be shown to users if needed.

You should store at least the Asset ID and Playback ID in your database so that you can use them to embed the videos for playback in your page templates. You will use the Asset ID whenever you need to interact with the asset with the Mux API and you will need the Playback ID for playback on the front-end.

## Uploading from the front-end with Direct Uploads

[Direct Uploads](/docs/guides/upload-files-directly) allow you to upload a video from the browser to your Mux account. To start, call the Mux API to generate an upload URL that is provided to the front-end.

We can use the [Mux PHP library](https://github.com/muxinc/mux-php) to make it easier to create these upload URLs:

```php
// $uploadsApi is an instance of MuxPhp\Api\DirectUploadsApi
// configured with your Mux access token credentials
$createAssetRequest = new MuxPhp\Models\CreateAssetRequest(["playback_policy" => [MuxPhp\Models\PlaybackPolicy::_PUBLIC]]);
$createUploadRequest = new MuxPhp\Models\CreateUploadRequest(["new_asset_settings" => $createAssetRequest]);
$upload = $uploadsApi->createDirectUpload($createUploadRequest);

print "Upload URL: " . $upload->getData()->getUrl();
```

You'll want to add this script to one of your API routes and return the upload URL instead of printing it.
On the front-end, you can use [Mux Uploader](/docs/guides/mux-uploader), a web component that gives you a simple UI to make uploading a video easier. Pass the upload URL as its `endpoint` attribute:

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-uploader"></script>

<mux-uploader endpoint="{direct_upload_url}"></mux-uploader>
```

## Video playback

If your webhook is already storing Playback IDs in your database, you can play back videos on the front-end using [Mux Player](/docs/guides/mux-player-web). Your blade template for this might look like:

```html
<script src="https://cdn.jsdelivr.net/npm/@mux/mux-player" defer></script>

<!-- The `metadata-` attributes are optional  -->
<mux-player
  playback-id="{{ $playbackId }}"
  metadata-video-title="{{ $title }}" 
  metadata-viewer-user-id="{{ $userId }}"
></mux-player>
```

Mux Player comes with lots of [features](/docs/guides/player-core-functionality) and [customisability](/docs/guides/player-customize-look-and-feel) out of the box.

The controller for this page might look something like this:

```php
<?php
namespace App\Http\Controllers;

use App\Models\Video; // your own Eloquent model for stored videos
use Illuminate\Http\Request;

class PlaybackController extends Controller
{
    public function show($videoId)
    {
        // Fetch the video from your database (adjust to your own schema)
        $video = Video::findOrFail($videoId);
        $playbackId = $video->playback_id;
        $title = $video->title;

        // Get user (replace with your actual authentication logic)
        $userId = auth()->id();

        return view('playback', [
            'playbackId' => $playbackId,
            'title' => $title,
            'userId' => $userId,
        ]);
    }
}
```

## Community contributions and libraries

### `mux-php-laravel`

`mux-php-laravel` is a [library](https://github.com/martinbean/mux-php-laravel) that helps you set up sensible defaults for working with Mux in your Laravel project.

### Statamic Mux

[Statamic](https://statamic.com/) is a popular CMS built on top of Laravel. There is a community [Mux integration](https://statamic-mux.daun.ltd/) for making it easier to get your videos into Mux using the CMS.


# Play video in React Native
Get a Mux video playing in your React Native app in five minutes or less.
## 1. Install `expo-video`

Mux delivers video using HLS (HTTP Live Streaming), which is natively supported on both iOS and Android. To play Mux videos in React Native, you'll use [`expo-video`](https://docs.expo.dev/versions/latest/sdk/video/), a cross-platform, performant video component with native support for React Native and Expo.

Install the package using your preferred package manager:

```bash
# npm
npm install expo-video

# yarn
yarn add expo-video

# pnpm
pnpm add expo-video
```

<Callout type="info">
  This guide assumes you're using Expo. If you're using bare React Native without Expo, you'll need to install the `expo` package first and configure your project for Expo modules. See the [Expo documentation](https://docs.expo.dev/bare/installing-expo-modules/) for details.
</Callout>

For iOS in a bare workflow, install the native dependencies:

```bash
cd ios && pod install && cd ..
```

## 2. Create a video player component

Create a new file called `components/video-player.tsx` in your project and add the following code. You'll need a Mux **playback ID** to construct the video URL.

If you don't have a video in Mux yet, you can use this demo playback ID for testing: `OfjbQ3esQifgboENTs4oDXslCP5sSnst`

```tsx
import React from 'react';
import { StyleSheet, View } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoPlayer() {
  // Replace with your own playback ID from https://dashboard.mux.com
  const playbackId = 'OfjbQ3esQifgboENTs4oDXslCP5sSnst';
  const videoSource = `https://stream.mux.com/${playbackId}.m3u8`;

  const player = useVideoPlayer(videoSource, (player) => {
    player.loop = false;
    player.play();
  });

  return (
    <View style={styles.container}>
      <VideoView
        player={player}
        style={styles.video}
        allowsFullscreen
        allowsPictureInPicture
        nativeControls
        contentFit="contain"
      />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
});
```

### Understanding the video URL

Mux videos are streamed using the format:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

* `{PLAYBACK_ID}` is the unique identifier for your video
* `.m3u8` is the HLS manifest file format

<Callout type="info">
  New to Mux? Learn about [playback IDs](/docs/guides/play-your-videos) and [creating video assets](/docs/core/stream-video-files) in the main Mux docs.
</Callout>
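If you construct these URLs in several places, a small helper keeps the format in one spot. This is just a sketch; the `toHlsUrl` name is ours, and the optional `token` parameter only applies if you use [signed playback IDs](/docs/guides/secure-video-playback):

```ts
// Build the HLS streaming URL for a Mux playback ID.
// The optional token is only needed for signed playback policies.
function toHlsUrl(playbackId: string, token?: string): string {
  const base = `https://stream.mux.com/${playbackId}.m3u8`;
  return token ? `${base}?token=${token}` : base;
}
```

For example, `toHlsUrl('OfjbQ3esQifgboENTs4oDXslCP5sSnst')` produces the same URL used in the component above.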

## 3. Run your app

Import and use the `VideoPlayer` component in your app. If you used `create-expo-app`, you'll likely find your main screen at `app/(tabs)/index.tsx` or `app/index.tsx`. Import and add the component:

```tsx
import VideoPlayer from '@/components/video-player';

export default function HomeScreen() {
  return <VideoPlayer />;
}
```

Then run your app:

```bash
# Start Expo dev server
npx expo start

# Press 'i' for iOS or 'a' for Android
# Or scan the QR code with Expo Go
```

You should see your video playing with native controls! The video will stream using HLS with adaptive bitrate, automatically adjusting quality based on the viewer's network conditions.

## Common next steps

Now that you have basic playback working, here are some common things you'll want to do:

### Add a poster image (thumbnail)

Mux provides thumbnails for your videos using the same playback ID. Display a poster image that the user taps to start playback:

```tsx highlight=1-2 add=6,9,16-19,31-39,55-59 remove=13
import React, { useState } from 'react';
import { StyleSheet, View, Image, Pressable } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoPlayer() {
  const [showPoster, setShowPoster] = useState(true);
  const playbackId = 'OfjbQ3esQifgboENTs4oDXslCP5sSnst';
  const videoSource = `https://stream.mux.com/${playbackId}.m3u8`;
  const posterSource = `https://image.mux.com/${playbackId}/thumbnail.png?time=0`;

  const player = useVideoPlayer(videoSource, (player) => {
    player.loop = false;
    // Don't autoplay - wait for user to tap poster
  });

  const handlePosterPress = () => {
    setShowPoster(false);
    player.play();
  };

  return (
    <View style={styles.container}>
      <VideoView
        player={player}
        style={styles.video}
        allowsFullscreen
        allowsPictureInPicture
        nativeControls
        contentFit="contain"
      />
      {showPoster && (
        <Pressable onPress={handlePosterPress} style={styles.poster}>
          <Image
            source={{ uri: posterSource }}
            style={styles.poster}
            resizeMode="cover"
          />
        </Pressable>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  poster: {
    position: 'absolute',
    width: '100%',
    aspectRatio: 16 / 9,
  },
});
```

The thumbnail URL format is:

```
https://image.mux.com/{PLAYBACK_ID}/thumbnail.png?time={SECONDS}
```

Set `time` to capture a frame at a specific timestamp (e.g., `time=5` for 5 seconds in).

### Handle player events

Track loading, playback progress, and errors using `expo-video`'s event system:

```tsx highlight=2 add=3,13,16-40,44-46,55-63,79-94
import React from 'react';
import { StyleSheet, View, Text, ActivityIndicator } from 'react-native';
import { useEvent } from 'expo';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoPlayer() {
  const playbackId = 'OfjbQ3esQifgboENTs4oDXslCP5sSnst';
  const videoSource = `https://stream.mux.com/${playbackId}.m3u8`;

  const player = useVideoPlayer(videoSource, (player) => {
    player.loop = false;
    player.play();
    player.timeUpdateEventInterval = 0.5; // Update time every 0.5 seconds
  });

  // Listen to status changes (loading, readyToPlay, error)
  const { status, error } = useEvent(player, 'statusChange', {
    status: player.status,
  });

  // Listen to playback progress
  const timeUpdate = useEvent(player, 'timeUpdate');
  const currentTime = timeUpdate?.currentTime ?? 0;

  // Listen to playing state changes
  const { isPlaying } = useEvent(player, 'playingChange', {
    isPlaying: player.playing,
  });

  if (status === 'error' && error) {
    return (
      <View style={styles.container}>
        <Text style={styles.errorText}>Failed to load video: {error.message}</Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      {status === 'loading' && (
        <ActivityIndicator size="large" color="#fff" style={styles.loader} />
      )}
      <VideoView
        player={player}
        style={styles.video}
        allowsFullscreen
        allowsPictureInPicture
        nativeControls
        contentFit="contain"
      />
      <View style={styles.info}>
        <Text style={styles.infoText}>Status: {status}</Text>
        <Text style={styles.infoText}>
          Time: {Math.floor(currentTime)}s / {Math.floor(player.duration)}s
        </Text>
        <Text style={styles.infoText}>
          {isPlaying ? 'Playing' : 'Paused'}
        </Text>
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  loader: {
    position: 'absolute',
  },
  info: {
    marginTop: 20,
    padding: 10,
  },
  infoText: {
    color: '#fff',
    fontSize: 14,
    marginBottom: 5,
  },
  errorText: {
    color: '#fff',
    fontSize: 16,
  },
});
```

### Support different aspect ratios

For portrait videos (like Stories or Reels), adjust the aspect ratio in your styles:

```tsx highlight=4
const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 9 / 16, // Portrait mode
  },
});
```

## Platform considerations

### iOS vs Android

Both iOS and Android have native HLS support, so `expo-video` works seamlessly on both platforms. However, there are a few differences to be aware of:

* **iOS**: HLS playback is handled by AVPlayer
* **Android**: HLS playback uses ExoPlayer (Media3)
* **Web**: Uses HTML5 video with HLS.js for HLS support

These differences are handled automatically by `expo-video`, but you may notice slight variations in buffering behavior or UI controls across platforms.

### Expo Go limitations

`expo-video` works with Expo Go for basic playback, but for advanced features like Picture-in-Picture or background playback, you'll need to create a [development build](https://docs.expo.dev/develop/development-builds/introduction/).

<Callout type="warning">
  Features like Picture-in-Picture (`allowsPictureInPicture`) and background playback require configuration through the [config plugin](#configuration) and a custom development build. These features will not work in Expo Go.
</Callout>

### Configuration

To enable advanced features, add the `expo-video` config plugin to your `app.json`:

```json
{
  "expo": {
    "plugins": [
      [
        "expo-video",
        {
          "supportsBackgroundPlayback": true,
          "supportsPictureInPicture": true
        }
      ]
    ]
  }
}
```

After adding the config plugin, rebuild your app with `eas build` or `npx expo run:ios`/`npx expo run:android`.

## What you've learned

You now know how to:

* Install and set up `expo-video`
* Create a video player using the `useVideoPlayer` hook
* Play Mux videos using playback IDs
* Display poster images (thumbnails)
* Handle player events with the `useEvent` hook (status, progress, playback state)
* Adjust for different aspect ratios
* Configure advanced features like Picture-in-Picture

## Next Steps

<GuideCard
  title="Video playback deep dive"
  description="Learn about managing video state, custom controls, error handling, and optimizing playback in React Native"
  links={[
    {title: "Read the guide", href: "/docs/frameworks/react-native-video-playback"},
  ]}
/>

<GuideCard
  title="Upload videos to Mux"
  description="Learn how to upload videos from React Native or ingest videos from URLs for AI-generated content"
  links={[
    {title: "Read the guide", href: "/docs/frameworks/react-native-uploading-videos"},
  ]}
/>

<GuideCard
  title="Build a Stories UI"
  description="Create an Instagram Stories or TikTok-style vertical video feed with swipe navigation"
  links={[
    {title: "Read the guide", href: "/docs/frameworks/react-native-stories-reels-ui"},
  ]}
/>


# Video playback in React Native
Build a production-ready video player with state management, error handling, and custom controls in React Native using expo-video.
This guide covers everything you need to know to build a robust, production-ready video player in React Native using Mux and expo-video. If you haven't already, start with the [quickstart guide](/docs/frameworks/react-native-quickstart) to get basic playback working.

## Understanding HLS playback

Mux delivers video using HLS (HTTP Live Streaming), which is natively supported on both iOS and Android. This means:

* Videos stream in segments, not as a single large file
* Quality automatically adapts to network conditions ([ABR - Adaptive Bitrate](https://www.mux.com/video-glossary/abr-adaptive-bitrate))
* Playback can start before the entire video downloads
* Works seamlessly on cellular networks

<Callout type="info">
  Learn more about [how Mux handles video streaming](/docs/guides/play-your-videos) in the main documentation.
</Callout>

## Playback IDs and URLs

Every Mux video has a **playback ID** that you use to construct the streaming URL:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8
```

### Public vs Signed playback

Mux supports two types of playback policies:

* **Public playback IDs**: Anyone with the URL can play the video
* **Signed playback IDs**: Requires a JWT token for access control

For signed playback, you'll need to generate a JWT on your backend and include it as a query parameter:

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?token={JWT}
```

<Callout type="info">
  Learn how to [secure video playback with signed URLs](/docs/guides/secure-video-playback) including JWT generation and domain restrictions.
</Callout>

For React Native apps, handle signed URLs by fetching the token from your backend before playing:

```tsx
import React, { useState, useEffect } from 'react';
import { View, ActivityIndicator, StyleSheet } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

function SecureVideoPlayer({ playbackId }: { playbackId: string }) {
  const [videoUrl, setVideoUrl] = useState<string | null>(null);

  useEffect(() => {
    // Fetch signed URL from your backend
    fetch('https://your-api.com/video/signed-url', {
      method: 'POST',
      body: JSON.stringify({ playbackId }),
    })
      .then(res => res.json())
      .then(data => setVideoUrl(data.url));
  }, [playbackId]);

  const player = useVideoPlayer(videoUrl, player => {
    player.play();
  });

  if (!videoUrl) {
    return <ActivityIndicator />;
  }

  return (
    <VideoView
      player={player}
      style={styles.video}
      nativeControls
    />
  );
}

const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
});
```

## Managing player state

Building a robust video player requires handling multiple states: loading, playing, paused, buffering, and errors. The `expo-video` library uses an event-based system with hooks from the `expo` package.

```tsx
import React from 'react';
import { View, ActivityIndicator, Text, StyleSheet } from 'react-native';
import { useEvent } from 'expo';
import { useVideoPlayer, VideoView } from 'expo-video';

interface VideoPlayerProps {
  playbackId: string;
}

export default function VideoPlayer({ playbackId }: VideoPlayerProps) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.loop = false;
      player.play();
    }
  );

  const { status, error } = useEvent(player, 'statusChange', {
    status: player.status,
  });

  const { isPlaying } = useEvent(player, 'playingChange', {
    isPlaying: player.playing,
  });

  if (status === 'error') {
    return (
      <View style={styles.container}>
        <Text style={styles.errorText}>
          {error?.message || 'Failed to load video. Please try again.'}
        </Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      {status === 'loading' && (
        <ActivityIndicator
          size="large"
          color="#fff"
          style={styles.loader}
        />
      )}
      <VideoView
        player={player}
        style={styles.video}
        nativeControls
        contentFit="contain"
      />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    position: 'relative',
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  loader: {
    position: 'absolute',
    top: '50%',
    left: '50%',
    marginLeft: -20,
    marginTop: -20,
    zIndex: 10,
  },
  errorText: {
    color: '#fff',
    textAlign: 'center',
    padding: 20,
  },
});
```

### Listening to player events

The `expo-video` player emits events that you can listen to using hooks from the `expo` package:

#### Using the `useEvent` hook

Creates a listener that returns a stateful value for use in components:

```tsx
import { useEvent } from 'expo';

const { status, error } = useEvent(player, 'statusChange', {
  status: player.status,
});

const { isPlaying } = useEvent(player, 'playingChange', {
  isPlaying: player.playing,
});
```

#### Using the `useEventListener` hook

For side effects when events occur:

```tsx
import { useEventListener } from 'expo';

useEventListener(player, 'statusChange', ({ status, error }) => {
  console.log('Player status changed:', status);
  if (error) {
    console.error('Player error:', error);
  }
});

useEventListener(player, 'playToEnd', () => {
  console.log('Video finished playing');
  player.replay();
});
```

### Key player events

| Event | When it fires | Use case |
|-------|---------------|----------|
| `statusChange` | Player status changes (idle, loading, readyToPlay, error) | Show loading states, handle errors |
| `playingChange` | Play/pause state changes | Update play/pause button |
| `timeUpdate` | Periodically during playback | Update progress bar |
| `sourceLoad` | Video source finishes loading | Get duration, available tracks |
| `playToEnd` | Video finishes playing | Auto-play next video, show replay |

## Poster images and thumbnails

Mux automatically generates thumbnails for your videos. Display a poster image that users tap to start playback:

```tsx
import React, { useState } from 'react';
import { View, Image, Pressable, StyleSheet } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoWithPoster({ playbackId }: { playbackId: string }) {
  const [showPoster, setShowPoster] = useState(true);
  const posterUrl = `https://image.mux.com/${playbackId}/thumbnail.png?time=0`;

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.loop = false;
      // Don't autoplay - wait for user to tap poster
    }
  );

  const handlePosterPress = () => {
    setShowPoster(false);
    player.play();
  };

  return (
    <View style={styles.container}>
      <VideoView
        player={player}
        style={styles.video}
        nativeControls
        contentFit="contain"
      />
      {showPoster && (
        <Pressable onPress={handlePosterPress} style={styles.poster}>
          <Image
            source={{ uri: posterUrl }}
            style={styles.poster}
            resizeMode="cover"
          />
        </Pressable>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    position: 'relative',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  poster: {
    position: 'absolute',
    width: '100%',
    aspectRatio: 16 / 9,
  },
});
```

### Thumbnail URL options

```
https://image.mux.com/{PLAYBACK_ID}/thumbnail.{format}?{params}
```

**Common parameters:**

* `time` - Timestamp in seconds (e.g., `time=5` for 5 seconds in)
* `width` - Thumbnail width in pixels (e.g., `width=640`)
* `height` - Thumbnail height in pixels (e.g., `height=360`)
* `fit_mode` - How to resize: `preserve`, `stretch`, `crop`, `smartcrop`

**Example:**

```tsx
const thumbnail = `https://image.mux.com/${playbackId}/thumbnail.jpg?time=5&width=1280&fit_mode=smartcrop`;
```
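If you build thumbnail URLs in more than one place, a small helper keeps the parameters consistent. This is only a sketch based on the URL format above; the `muxThumbnailUrl` name and option shape are our own:

```typescript
// Hypothetical helper: builds a Mux thumbnail URL from the parameters above.
type ThumbnailOptions = {
  time?: number;
  width?: number;
  height?: number;
  fitMode?: 'preserve' | 'stretch' | 'crop' | 'smartcrop';
  format?: 'jpg' | 'png' | 'webp';
};

function muxThumbnailUrl(playbackId: string, opts: ThumbnailOptions = {}): string {
  const { format = 'jpg', fitMode, ...rest } = opts;
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(rest)) {
    if (value !== undefined) params.set(key, String(value));
  }
  if (fitMode) params.set('fit_mode', fitMode); // API uses snake_case
  const query = params.toString();
  return `https://image.mux.com/${playbackId}/thumbnail.${format}${query ? `?${query}` : ''}`;
}
```

For example, `muxThumbnailUrl(playbackId, { time: 5, width: 1280, fitMode: 'smartcrop' })` produces the same URL as the snippet above.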

<Callout type="info">
  Learn more about [thumbnail options and image transformations](/docs/guides/get-images-from-a-video) in the main docs.
</Callout>

## Aspect ratios for different use cases

Choose the right aspect ratio based on your app's design:

### Landscape video (16:9)

Standard for most video content:

```tsx
const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 16 / 9, // 1.777
  },
});
```

### Portrait video (9:16)

For Stories, Reels, or TikTok-style feeds:

```tsx
const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 9 / 16, // 0.5625
  },
});
```

### Square video (1:1)

For social feeds:

```tsx
const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 1, // 1.0
  },
});
```

### Dynamic aspect ratio

Match the video's actual dimensions using the `sourceLoad` event:

```tsx
import { View, StyleSheet } from 'react-native';
import { useEvent } from 'expo';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function DynamicVideoPlayer({ playbackId = "TPsqaPkOFCKQHVGQ00Khp0256fLo4FAsEHjCTeWi02JyrM" }: { playbackId: string }) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  const loadedMetadata = useEvent(player, 'sourceLoad');

  // Calculate aspect ratio from available video tracks
  const aspectRatio = (() => {
    const tracks = loadedMetadata?.availableVideoTracks;
    if (tracks && tracks.length > 0) {
      const { width, height } = tracks[0].size;
      return width / height;
    }
    return 16 / 9; // Default fallback
  })();

  return (
    <VideoView
      player={player}
      style={[styles.video, { aspectRatio }]}
      nativeControls
    />
  );
}

const styles = StyleSheet.create({
  video: {
    width: '100%',
  },
});
```

<Callout type="warning">
  The `sourceLoad` event and video track metadata work reliably on iOS and Android. On web, this event may not fire consistently. If you need dynamic aspect ratios across all platforms, consider fetching video dimensions from the Mux API or storing them alongside your playback ID.
</Callout>
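One option from the callout above is storing the video's dimensions yourself. The Mux Assets API returns an `aspect_ratio` string such as `"16:9"`; if you save it alongside your playback ID, a small parser (the helper name here is our own) converts it into the number that React Native's `aspectRatio` style expects:

```typescript
// Parses a Mux-style aspect ratio string ("16:9") into a number suitable
// for React Native's aspectRatio style. Falls back when input is malformed.
function parseAspectRatio(ratio: string | undefined, fallback = 16 / 9): number {
  if (!ratio) return fallback;
  const [w, h] = ratio.split(':').map(Number);
  if (!Number.isFinite(w) || !Number.isFinite(h) || h === 0) return fallback;
  return w / h;
}
```

You could then style the player with `{ width: '100%', aspectRatio: parseAspectRatio(asset.aspect_ratio) }` without relying on the `sourceLoad` event.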

## Fullscreen support

Enable fullscreen mode using the `VideoView` ref methods:

```tsx
import React, { useRef } from 'react';
import { View, TouchableOpacity, Text, StyleSheet } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoPlayerWithFullscreen({ playbackId }: { playbackId: string }) {
  const videoRef = useRef<VideoView>(null);

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  const enterFullscreen = async () => {
    await videoRef.current?.enterFullscreen();
  };

  const exitFullscreen = async () => {
    await videoRef.current?.exitFullscreen();
  };

  return (
    <View>
      <VideoView
        ref={videoRef}
        player={player}
        style={styles.video}
        nativeControls={false}
        allowsFullscreen
        onFullscreenEnter={() => console.log('Entered fullscreen')}
        onFullscreenExit={() => console.log('Exited fullscreen')}
      />
      <TouchableOpacity onPress={enterFullscreen} style={styles.button}>
        <Text style={styles.buttonText}>Go big or go home</Text>
      </TouchableOpacity>
    </View>
  );
}

const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  button: {
    backgroundColor: '#ec9430ff',
    padding: 12,
    borderRadius: 8,
    marginTop: 10,
    alignItems: 'center',
  },
  buttonText: {
    color: '#fff',
    fontWeight: 'bold',
  },
});
```

<Callout type="info">
  Fullscreen behavior is handled natively by the platform. On iOS, this uses AVPlayerViewController. On Android, this uses ExoPlayer's fullscreen controller.
</Callout>

## Error handling

Network issues are common on mobile. Implement robust error handling:

```tsx
import React, { useState, useCallback } from 'react';
import { View, Text, TouchableOpacity, StyleSheet } from 'react-native';
import { useEvent } from 'expo';
import { useVideoPlayer, VideoView } from 'expo-video';

function VideoPlayerWithRetry({ playbackId }: { playbackId: string }) {
  const [retryKey, setRetryKey] = useState(0);

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  const { status, error } = useEvent(player, 'statusChange', {
    status: player.status,
  });

  const retry = useCallback(() => {
    player.replay();
    setRetryKey(prev => prev + 1);
  }, [player]);

  const getErrorMessage = (error: any) => {
    // Categorize errors based on the error message
    const message = error?.message || '';
    if (message.includes('network') || message.includes('ENOTFOUND')) {
      return 'Network error. Check your connection.';
    } else if (message.includes('403') || message.includes('forbidden')) {
      return 'This video is not available.';
    }
    return 'Failed to load video.';
  };

  if (status === 'error') {
    return (
      <View style={styles.errorContainer}>
        <Text style={styles.errorText}>{getErrorMessage(error)}</Text>
        <TouchableOpacity style={styles.retryButton} onPress={retry}>
          <Text style={styles.retryText}>Retry</Text>
        </TouchableOpacity>
      </View>
    );
  }

  return (
    <VideoView
      key={retryKey}
      player={player}
      style={styles.video}
      nativeControls
    />
  );
}

const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  errorContainer: {
    backgroundColor: '#000',
    padding: 40,
    alignItems: 'center',
    justifyContent: 'center',
    aspectRatio: 16 / 9,
  },
  errorText: {
    color: '#fff',
    fontSize: 16,
    textAlign: 'center',
    marginBottom: 20,
  },
  retryButton: {
    backgroundColor: '#fff',
    paddingHorizontal: 20,
    paddingVertical: 10,
    borderRadius: 5,
  },
  retryText: {
    color: '#000',
    fontWeight: 'bold',
  },
});
```

### Common error scenarios

| Error | Cause | Solution |
|-------|-------|----------|
| Network timeout | Slow/no connection | Show retry button, check network status |
| 403 Forbidden | Invalid playback ID or signed URL expired | Refresh token, verify playback ID |
| Video not loading | Asset still processing | Check asset status, show "processing" message |
| Playback stalled | Poor network | HLS handles this automatically via ABR |
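For the "asset still processing" row, one approach is to poll your own backend until the asset reports ready. The `/api/asset-status` endpoint below is hypothetical; your backend would call the Mux API with its server-side credentials and return the asset's `status`. Never call the Mux API directly from the app:

```typescript
// Hypothetical: polls your own backend until the Mux asset reports "ready".
// The /api/asset-status route is an assumption -- your backend proxies the
// asset status and returns a JSON body like { status: "preparing" | "ready" | "errored" }.
async function waitForAssetReady(
  assetId: string,
  { intervalMs = 3000, maxAttempts = 40 }: { intervalMs?: number; maxAttempts?: number } = {}
): Promise<boolean> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetch(`https://your-api.com/api/asset-status?assetId=${assetId}`);
    const { status } = await res.json();
    if (status === 'ready') return true;
    if (status === 'errored') return false;
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  return false; // Timed out -- keep showing the "processing" message
}
```

In practice a webhook-driven push (covered in the async processing guide) scales better than polling, but this works fine for a first version.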

## Custom controls

Build custom video controls by setting `nativeControls={false}` and tracking playback state with events. This example creates a control bar with a play/pause button, scrubbing slider, and current/total time displays using `@react-native-community/slider`:

```tsx
import React from 'react';
import { View, TouchableOpacity, Text, StyleSheet } from 'react-native';
import { useEvent } from 'expo';
import { useVideoPlayer, VideoView } from 'expo-video';
import Slider from '@react-native-community/slider';

export default function CustomControlsPlayer({ playbackId }: { playbackId: string }) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.timeUpdateEventInterval = 0.25; // Update every 250ms
    }
  );

  const { isPlaying } = useEvent(player, 'playingChange', {
    isPlaying: player.playing,
  });

  const timeUpdate = useEvent(player, 'timeUpdate');
  const currentTime = timeUpdate?.currentTime ?? 0;
  const duration = player.duration;

  const handleSeek = (time: number) => {
    player.currentTime = time;
  };

  const togglePlayback = () => {
    if (isPlaying) {
      player.pause();
    } else {
      player.play();
    }
  };

  const formatTime = (seconds: number) => {
    const mins = Math.floor(seconds / 60);
    const secs = Math.floor(seconds % 60);
    return `${mins}:${secs.toString().padStart(2, '0')}`;
  };

  return (
    <View style={styles.container}>
      <VideoView
        player={player}
        style={styles.video}
        nativeControls={false}
        contentFit="contain"
      />

      <View style={styles.controls}>
        <TouchableOpacity onPress={togglePlayback}>
          <Text style={styles.controlText}>
            {isPlaying ? '⏸' : '▶'}
          </Text>
        </TouchableOpacity>

        <Text style={styles.time}>{formatTime(currentTime)}</Text>

        <Slider
          style={styles.slider}
          value={currentTime}
          minimumValue={0}
          maximumValue={duration || 1}
          onSlidingComplete={handleSeek}
          minimumTrackTintColor="#fff"
          maximumTrackTintColor="#666"
          thumbTintColor="#fff"
        />

        <Text style={styles.time}>{formatTime(duration || 0)}</Text>
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  controls: {
    flexDirection: 'row',
    alignItems: 'center',
    padding: 10,
    backgroundColor: 'rgba(0,0,0,0.7)',
  },
  controlText: {
    color: '#fff',
    fontSize: 24,
    marginRight: 10,
  },
  time: {
    color: '#fff',
    fontSize: 12,
    marginHorizontal: 5,
  },
  slider: {
    flex: 1,
    marginHorizontal: 10,
  },
});
```

### Playback speed control

Allow users to adjust playback speed for faster or slower viewing:

```tsx
import React, { useState, useCallback } from 'react';
import { View, TouchableOpacity, Text, StyleSheet } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

const PLAYBACK_SPEEDS = [0.5, 0.75, 1, 1.25, 1.5, 2];

export default function VideoPlayerWithSpeed({ playbackId }: { playbackId: string }) {
  const [speedIndex, setSpeedIndex] = useState(2); // Default to 1x

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  const cycleSpeed = useCallback(() => {
    const nextIndex = (speedIndex + 1) % PLAYBACK_SPEEDS.length;
    setSpeedIndex(nextIndex);
    player.playbackRate = PLAYBACK_SPEEDS[nextIndex];
  }, [player, speedIndex]);

  return (
    <View style={styles.container}>
      <VideoView
        player={player}
        style={styles.video}
        nativeControls
        contentFit="contain"
      />
      <TouchableOpacity onPress={cycleSpeed} style={styles.speedButton}>
        <Text style={styles.speedText}>
          Speed: {PLAYBACK_SPEEDS[speedIndex]}x
        </Text>
      </TouchableOpacity>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  speedButton: {
    backgroundColor: '#ec9430ff',
    padding: 12,
    margin: 10,
    borderRadius: 8,
    alignItems: 'center',
  },
  speedText: {
    color: '#fff',
    fontSize: 14,
    fontWeight: 'bold',
  },
});
```

**Tip:** Use `player.preservesPitch = true` (default) to maintain audio pitch at higher speeds, or set to `false` for a "chipmunk" effect.

## Picture-in-Picture support

Enable Picture-in-Picture mode for background video playback:

```tsx
import React, { useRef, useState, useCallback } from 'react';
import { View, TouchableOpacity, Text, StyleSheet, Platform } from 'react-native';
import { useVideoPlayer, VideoView, isPictureInPictureSupported } from 'expo-video';

export default function VideoPlayerWithPiP({ playbackId }: { playbackId: string }) {
  const videoRef = useRef<VideoView>(null);
  const [isInPiP, setIsInPiP] = useState(false);

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  const togglePiP = useCallback(() => {
    if (!isInPiP) {
      videoRef.current?.startPictureInPicture();
    } else {
      videoRef.current?.stopPictureInPicture();
    }
  }, [isInPiP]);

  // Check PiP support (function only exists on iOS and Android)
  const pipSupported = Platform.OS !== 'web' && isPictureInPictureSupported();

  if (!pipSupported) {
    return (
      <View style={styles.container}>
        <Text style={styles.errorText}>
          Picture-in-Picture is not supported on this platform.
        </Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <VideoView
        ref={videoRef}
        player={player}
        style={styles.video}
        nativeControls
        allowsPictureInPicture
        startsPictureInPictureAutomatically
        onPictureInPictureStart={() => setIsInPiP(true)}
        onPictureInPictureStop={() => setIsInPiP(false)}
      />
      <TouchableOpacity onPress={togglePiP} style={styles.button}>
        <Text style={styles.buttonText}>
          {isInPiP ? 'Exit' : 'Enter'} Picture-in-Picture
        </Text>
      </TouchableOpacity>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    backgroundColor: '#000',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  button: {
    backgroundColor: '#ec9430ff',
    padding: 12,
    margin: 10,
    borderRadius: 8,
    alignItems: 'center',
  },
  buttonText: {
    color: '#fff',
    fontWeight: 'bold',
  },
  errorText: {
    color: '#fff',
    textAlign: 'center',
    padding: 20,
  },
});
```

<Callout type="warning">
  Picture-in-Picture requires configuration in your app.json:

  ```json
  {
    "expo": {
      "plugins": [
        ["expo-video", { "supportsPictureInPicture": true }]
      ]
    }
  }
  ```
</Callout>

## iOS vs Android considerations

While `expo-video` abstracts most platform differences, be aware of:

### iOS (AVPlayer)

* Native HLS support
* Picture-in-Picture available on iOS 14+
* Smooth fullscreen transitions
* AirPlay support via `allowsExternalPlayback`
* Respects system audio settings

### Android (ExoPlayer)

* Native HLS support via ExoPlayer
* Picture-in-Picture on Android 12+
* Configurable surface type (SurfaceView vs TextureView)
* May require additional permissions for background playback

<Callout type="warning">
  Test your video player on both iOS and Android physical devices, not just simulators. Network behavior and video codecs can differ between simulators and real devices.
</Callout>

### Platform-specific configuration

Optimize playback for each platform by detecting the OS and adjusting player settings. This example shows iOS-specific AirPlay support, platform-specific buffer configurations, and Android's TextureView for overlapping videos:

```tsx
import { Platform } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function PlatformAwarePlayer({ playbackId }: { playbackId: string }) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();

      // iOS-specific settings
      if (Platform.OS === 'ios') {
        player.allowsExternalPlayback = true; // Enable AirPlay
      }

      // Configure buffer options
      player.bufferOptions = {
        preferredForwardBufferDuration: Platform.OS === 'ios' ? 0 : 20,
        minBufferForPlayback: 2,
      };
    }
  );

  return (
    <VideoView
      player={player}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      nativeControls
      // Android-specific: use TextureView for overlapping videos
      surfaceType={Platform.OS === 'android' ? 'textureView' : undefined}
    />
  );
}
```

## Performance tips

### 1. Pause playback when the app is backgrounded

```tsx
import { useEffect } from 'react';
import { AppState } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoPlayer({ playbackId }: { playbackId: string }) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.staysActiveInBackground = false;
    }
  );

  useEffect(() => {
    const subscription = AppState.addEventListener('change', (nextAppState) => {
      if (nextAppState === 'active') {
        player.play();
      } else {
        player.pause();
      }
    });

    return () => subscription.remove();
  }, [player]);

  return (
    <VideoView
      player={player}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      nativeControls
    />
  );
}
```

### 2. Preload videos for smoother transitions

```tsx
import { useVideoPlayer, VideoView, VideoSource } from 'expo-video';
import { useState, useCallback } from 'react';
import { TouchableOpacity, Text, View, StyleSheet } from 'react-native';

const video1: VideoSource = 'https://stream.mux.com/PLAYBACK_ID_1.m3u8';
const video2: VideoSource = 'https://stream.mux.com/PLAYBACK_ID_2.m3u8';

export default function PreloadingPlayer() {
  // Create both players - the second one preloads in the background
  const player1 = useVideoPlayer(video1, player => {
    player.play();
  });

  const player2 = useVideoPlayer(video2, player => {
    player.currentTime = 0; // Preload from the start
  });

  const [currentPlayer, setCurrentPlayer] = useState(player1);

  const switchVideo = useCallback(() => {
    currentPlayer.pause();
    if (currentPlayer === player1) {
      setCurrentPlayer(player2);
      player2.play();
    } else {
      setCurrentPlayer(player1);
      player1.play();
    }
  }, [currentPlayer, player1, player2]);

  return (
    <View>
      <VideoView player={currentPlayer} style={styles.video} nativeControls />
      <TouchableOpacity onPress={switchVideo} style={styles.button}>
        <Text style={styles.buttonText}>Switch Video</Text>
      </TouchableOpacity>
    </View>
  );
}

const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  button: {
    backgroundColor: '#4630ec',
    padding: 12,
    borderRadius: 8,
    marginTop: 10,
    alignItems: 'center',
  },
  buttonText: {
    color: '#fff',
    fontWeight: 'bold',
  },
});
```

### 3. Enable video caching

If your app frequently replays the same videos, enable caching to minimize network usage and improve playback performance. The cache is persistent and managed on a least-recently-used (LRU) basis:

```tsx
import { useVideoPlayer, VideoView, VideoSource } from 'expo-video';

function CachedVideoPlayer({ playbackId }: { playbackId: string }) {
  const videoSource: VideoSource = {
    uri: `https://stream.mux.com/${playbackId}.m3u8`,
    useCaching: true,
    metadata: {
      title: 'My Video',
    },
  };

  const player = useVideoPlayer(videoSource, player => {
    player.play();
  });

  return (
    <VideoView
      player={player}
      style={{ width: '100%', aspectRatio: 16 / 9 }}
      nativeControls
    />
  );
}
```

**How caching works:**

* The cache is persistent across app launches
* Videos are evicted on a least-recently-used basis when the cache size limit is reached (default: 1GB)
* The system may clear the cache when device storage is low
* Cached videos can be played offline until the cached data is exhausted

**Managing the cache:**

```tsx
import {
  setVideoCacheSizeAsync,
  getCurrentVideoCacheSize,
  clearVideoCacheAsync
} from 'expo-video';

// Set cache size to 500MB (must be called when no players exist)
await setVideoCacheSizeAsync(500 * 1024 * 1024);

// Get current cache size
const cacheSize = getCurrentVideoCacheSize();
console.log(`Cache is using ${cacheSize} bytes`);

// Clear all cached videos (must be called when no players exist)
await clearVideoCacheAsync();
```

<Callout type="warning">
  **Caching limitations:**

  * HLS video sources cannot be cached on iOS due to platform limitations
  * DRM-protected videos cannot be cached on Android and iOS
  * Cache management functions can only be called when no `VideoPlayer` instances exist
</Callout>

### 4. Optimize poster image loading

Use lower resolution thumbnails for poster images to reduce initial load time:

```tsx
const posterUrl = `https://image.mux.com/${playbackId}/thumbnail.jpg?width=640&time=0`;
```

## Additional expo-video features

This guide covers the most common video playback patterns with Mux. The `expo-video` library offers many additional capabilities beyond what's covered here.

For advanced features and patterns, see the official [expo-video examples repository](https://github.com/expo/expo/tree/main/apps/native-component-list/src/screens/Video):

* **[Playback controls](https://github.com/expo/expo/blob/main/apps/native-component-list/src/screens/Video/VideoPlaybackControlsScreen.tsx)** - Volume sliders, AirPlay button, and `seekBy()` / `replay()` methods
* **[DRM and content protection](https://github.com/expo/expo/blob/main/apps/native-component-list/src/screens/Video/VideoDRMScreen.tsx)** - Widevine and FairPlay integration
* **[Subtitles and closed captions](https://github.com/expo/expo/blob/main/apps/native-component-list/src/screens/Video/VideoSubtitlesScreen.tsx)** - Adding text tracks to videos

<Callout type="info">
  Mux supports most of these features natively. For example, Mux can automatically generate subtitles, provide DRM protection, and deliver multiple audio tracks. Learn more in the [Video features documentation](/docs/guides/video).
</Callout>

## Next Steps

<GuideCard
  title="Upload videos to Mux"
  description="Learn how to upload videos from React Native or ingest videos from URLs for AI-generated content"
  links={[
    {title: "Read the guide", href: "/docs/frameworks/react-native-uploading-videos"},
  ]}
/>

<GuideCard
  title="Build a Stories UI"
  description="Create an Instagram Stories or TikTok-style vertical video feed with swipe navigation"
  links={[
    {title: "Read the guide", href: "/docs/frameworks/react-native-stories-reels-ui"},
  ]}
/>


# Upload videos to Mux from React Native
Learn how to upload videos from React Native devices or ingest videos from URLs for AI-generated content workflows.
There are two primary ways to get videos into Mux from a React Native application:

1. **Direct upload from device** - User records or selects a video, which is uploaded directly from their mobile device
2. **Upload from URL** - Your backend creates a Mux asset from a video URL (ideal for AI-generated content)

This guide covers both approaches and helps you choose the right one for your use case.

## Choosing an upload method

| Method | Use Case | React Native Role | Backend Required | User Experience |
|--------|----------|-------------------|------------------|-----------------|
| **Direct Upload** | User-generated content (camera, library) | High - handles file upload | Yes - generates upload URL | Shows upload progress |
| **Upload from URL** | AI-generated videos, pre-hosted content | Low - just displays result | Yes - creates asset | Background process |

### When to use direct upload

Use direct upload when:

* Users record videos with their device camera
* Users select videos from their photo library
* You need to show upload progress to the user
* The video file is on the user's device

### When to use upload from URL

Use upload from URL when:

* Videos are generated by AI services (Runway, Pika, Fal.ai, etc.)
* Videos are already hosted elsewhere (S3, GCS, etc.)
* You want to ingest videos without user intervention
* The video generation happens on your backend

<Callout type="info">
  For apps that use on-demand generative AI video, **upload from URL** is the right choice since videos are generated by AI services and returned as URLs.
</Callout>

***

## Direct upload from mobile device

Direct uploads allow users to upload videos directly from their React Native app to Mux without the file touching your backend servers.

### Architecture

```
User Device → Your Backend (generate upload URL) → Mux
                ↓
            Upload URL returned
                ↓
User Device → Mux (upload file directly)
                ↓
            Mux processes video
                ↓
            Webhook → Your Backend (asset ready)
```

### Step 1: Generate a signed upload URL (backend)

Your backend must generate a signed upload URL using the Mux API:

```javascript
// Backend: Node.js + Mux SDK
import Mux from '@mux/mux-node';

const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET,
});

// API endpoint: POST /api/generate-upload-url
export async function generateUploadUrl(req, res) {
  try {
    const upload = await mux.video.uploads.create({
      cors_origin: '*', // Or specify your app's origin
      new_asset_settings: {
        playback_policies: ['public'],
        video_quality: "basic"
      },
    });

    res.json({
      uploadUrl: upload.url,
      uploadId: upload.id,
    });
  } catch (error) {
    console.error('Failed to create upload URL:', error);
    res.status(500).json({ error: 'Failed to generate upload URL' });
  }
}
```

<Callout type="warning">
  Never expose your Mux API credentials in your React Native app. Always generate upload URLs from your backend.
</Callout>

### Step 2: Record or select video (React Native)

Use Expo Camera or ImagePicker to get a video file:

```tsx
import * as ImagePicker from 'expo-image-picker';
import { useState } from 'react';

export function useVideoPicker() {
  const [videoUri, setVideoUri] = useState<string | null>(null);

  const pickVideo = async () => {
    const result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.Videos,
      allowsEditing: true,
      quality: 1,
    });

    if (!result.canceled) {
      setVideoUri(result.assets[0].uri);
    }
  };

  const recordVideo = async () => {
    const result = await ImagePicker.launchCameraAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.Videos,
      allowsEditing: true,
      quality: 1,
    });

    if (!result.canceled) {
      setVideoUri(result.assets[0].uri);
    }
  };

  return { videoUri, pickVideo, recordVideo };
}
```

### Step 3: Upload to Mux (React Native)

Upload the video file using Expo FileSystem:

```tsx
import * as FileSystem from 'expo-file-system';
import { useState } from 'react';

interface UploadResult {
  uploadId: string;
  assetId?: string;
}

export function useVideoUpload() {
  const [uploading, setUploading] = useState(false);
  const [uploadProgress, setUploadProgress] = useState(0);
  const [error, setError] = useState<string | null>(null);

  const uploadVideo = async (videoUri: string): Promise<UploadResult | null> => {
    setUploading(true);
    setUploadProgress(0);
    setError(null);

    try {
      // Step 1: Get upload URL from your backend
      const response = await fetch('https://your-api.com/generate-upload-url', {
        method: 'POST',
      });
      const { uploadUrl, uploadId } = await response.json();

      // Step 2: Upload video file to Mux with progress tracking
      const uploadTask = FileSystem.createUploadTask(
        uploadUrl,
        videoUri,
        {
          httpMethod: 'PUT',
          uploadType: FileSystem.FileSystemUploadType.BINARY_CONTENT,
        },
        (uploadProgress) => {
          const progress = uploadProgress.totalBytesSent / uploadProgress.totalBytesExpectedToSend;
          setUploadProgress(Math.round(progress * 100));
        }
      );

      const uploadResponse = await uploadTask.uploadAsync();

      if (!uploadResponse || uploadResponse.status !== 200) {
        throw new Error('Upload failed');
      }

      setUploading(false);

      return { uploadId };
    } catch (err) {
      console.error('Upload error:', err);
      setError('Failed to upload video');
      setUploading(false);
      setUploadProgress(0);
      return null;
    }
  };

  return { uploadVideo, uploading, uploadProgress, error };
}
```

### Step 4: Handle upload completion

The video asset won't be ready immediately after upload. You'll need to:

1. Listen for the `video.asset.ready` webhook on your backend
2. Update your database with the playback ID
3. Notify the React Native app (via polling, realtime DB, or push notification)

See the [async processing guide](/docs/frameworks/react-native-async-processing) for implementation details.
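As a sketch of step 1, the `video.asset.ready` payload carries everything you need to link the asset back to the upload. The extraction below is framework-agnostic (`extractReadyAsset` is our own name); in an Express route you would call it with `req.body` after verifying the `Mux-Signature` header:

```typescript
// Hypothetical shape of the fields we read from a `video.asset.ready` event.
type MuxWebhookEvent = {
  type: string;
  data: {
    id: string;
    upload_id?: string;
    playback_ids?: { id: string; policy: string }[];
  };
};

// Pure extraction step, easy to unit test. Returns null for events we ignore.
function extractReadyAsset(event: MuxWebhookEvent) {
  if (event.type !== 'video.asset.ready') return null;
  return {
    assetId: event.data.id,
    uploadId: event.data.upload_id ?? null, // Links back to the direct upload
    playbackId: event.data.playback_ids?.[0]?.id ?? null,
  };
}
```

Persist the result keyed by `uploadId`, then respond with a 2xx quickly so Mux doesn't retry the delivery.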

### Complete upload example

```tsx
import React, { useState } from 'react';
import {
  View,
  TouchableOpacity,
  Text,
  ActivityIndicator,
  StyleSheet,
} from 'react-native';
import * as ImagePicker from 'expo-image-picker';
import * as FileSystem from 'expo-file-system';

export default function VideoUploader() {
  const [uploading, setUploading] = useState(false);
  const [uploadProgress, setUploadProgress] = useState(0);
  const [uploadId, setUploadId] = useState<string | null>(null);

  const selectAndUploadVideo = async () => {
    // Step 1: Select video
    const result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.Videos,
      quality: 1,
    });

    if (result.canceled) return;

    const videoUri = result.assets[0].uri;
    setUploading(true);
    setUploadProgress(0);

    try {
      // Step 2: Get upload URL
      const response = await fetch('https://your-api.com/generate-upload-url', {
        method: 'POST',
      });
      const { uploadUrl, uploadId: newUploadId } = await response.json();

      // Step 3: Upload to Mux with progress tracking
      const uploadTask = FileSystem.createUploadTask(
        uploadUrl,
        videoUri,
        {
          httpMethod: 'PUT',
          uploadType: FileSystem.FileSystemUploadType.BINARY_CONTENT,
        },
        (progress) => {
          const percentage = progress.totalBytesSent / progress.totalBytesExpectedToSend;
          setUploadProgress(Math.round(percentage * 100));
        }
      );

      await uploadTask.uploadAsync();

      setUploadId(newUploadId);
      setUploading(false);

      // Video is now processing - see async processing guide
      // for how to get notified when it's ready
    } catch (error) {
      console.error('Upload failed:', error);
      setUploading(false);
      setUploadProgress(0);
    }
  };

  return (
    <View style={styles.container}>
      {uploading ? (
        <View style={styles.uploadingContainer}>
          <ActivityIndicator size="large" />
          <Text style={styles.progressText}>Uploading: {uploadProgress}%</Text>
        </View>
      ) : uploadId ? (
        <Text style={styles.text}>
          Video uploaded! Processing...
        </Text>
      ) : (
        <TouchableOpacity style={styles.button} onPress={selectAndUploadVideo}>
          <Text style={styles.buttonText}>Select Video</Text>
        </TouchableOpacity>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    padding: 20,
    alignItems: 'center',
  },
  uploadingContainer: {
    alignItems: 'center',
  },
  button: {
    backgroundColor: '#007AFF',
    padding: 15,
    borderRadius: 8,
  },
  buttonText: {
    color: '#fff',
    fontSize: 16,
    fontWeight: 'bold',
  },
  text: {
    fontSize: 16,
  },
  progressText: {
    marginTop: 10,
    fontSize: 16,
    color: '#666',
  },
});
```

***

## Upload from URL (for AI-generated videos)

This approach is ideal when videos are generated by AI services or already hosted elsewhere. The video file never passes through the React Native app; your backend handles everything.

### Architecture

```
User submits prompt → Your Backend → AI Service (Fal.ai, Runway, etc.)
                                            ↓
                                      AI returns video URL
                                            ↓
                      Your Backend → Mux (create asset from URL)
                                            ↓
                                     Mux ingests & processes
                                            ↓
                      Webhook → Your Backend (asset ready)
                                            ↓
                      Realtime DB → React Native App
```

### Step 1: Create asset from URL (backend)

Your backend receives the AI-generated video URL and creates a Mux asset:

```javascript
// Backend: Node.js + Mux SDK
import Mux from '@mux/mux-node';

const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET,
});

// API endpoint: POST /api/create-video-from-url
export async function createVideoFromUrl(req, res) {
  const { videoUrl, userId, prompt } = req.body;

  try {
    // Create Mux asset from URL
    const asset = await mux.video.assets.create({
      input: [{ url: videoUrl }],
      playback_policies: ['public'],
      video_quality: 'basic',
    });

    // Store in your database
    const video = await db.videos.create({
      id: generateId(),
      userId,
      prompt,
      muxAssetId: asset.id,
      status: 'processing', // Will be updated via webhook
      createdAt: new Date(),
    });

    res.json({
      videoId: video.id,
      assetId: asset.id,
      status: 'processing',
    });
  } catch (error) {
    console.error('Failed to create asset:', error);
    res.status(500).json({ error: 'Failed to create video asset' });
  }
}
```

<Callout type="info">
  The video URL must be publicly accessible. Mux will fetch the video file from that URL to ingest it.
</Callout>

### Step 2: Handle asset ready webhook (backend)

When Mux finishes processing, it sends a webhook to your backend:

```javascript
// Backend: Webhook handler

// check the mux-node-sdk docs for details
// https://github.com/muxinc/mux-node-sdk/blob/master/api.md#webhooks
const mux = new Mux();

// API endpoint: POST /api/webhooks/mux
export async function handleMuxWebhook(req, res) {
  const webhookSecret = process.env.MUX_WEBHOOK_SECRET;
  const signature = req.headers['mux-signature'];

  // Verify webhook signature (the SDK expects the raw, unparsed request body)
  try {
    mux.webhooks.verifySignature(req.body, req.headers, webhookSecret);
  } catch (error) {
    console.error('Invalid webhook signature');
    return res.status(401).json({ error: 'Invalid signature' });
  }

  const event = req.body;

  // Handle video.asset.ready
  if (event.type === 'video.asset.ready') {
    const { id, playback_ids, duration } = event.data;

    // Update database
    await db.videos.update({
      where: { muxAssetId: id },
      data: {
        status: 'ready',
        playbackId: playback_ids[0].id,
        duration,
      },
    });

    // Video is now ready to play!
    // Your realtime database will notify the React Native app
  }

  // Handle video.asset.errored
  if (event.type === 'video.asset.errored') {
    const { id } = event.data;

    await db.videos.update({
      where: { muxAssetId: id },
      data: {
        status: 'failed',
        error: 'Video processing failed',
      },
    });
  }

  res.json({ received: true });
}
```

<Callout type="warning">
  Always verify webhook signatures to ensure requests are actually from Mux. See the [webhooks guide](/docs/core/listen-for-webhooks) for details.
</Callout>

### Step 3: React Native subscribes to status updates

Your React Native app doesn't handle the upload - it just waits for the video to be ready:

```tsx
import React, { useEffect, useState } from 'react';
import { View, Text, ActivityIndicator, StyleSheet } from 'react-native';
import { supabase } from './supabase'; // or Firebase, etc.
import { useVideoPlayer, VideoView } from 'expo-video';

interface VideoGenerationProps {
  videoId: string;
}

export default function VideoGeneration({ videoId }: VideoGenerationProps) {
  const [status, setStatus] = useState<'processing' | 'ready' | 'failed'>('processing');
  const [playbackId, setPlaybackId] = useState<string | null>(null);

  const player = useVideoPlayer(
    playbackId ? `https://stream.mux.com/${playbackId}.m3u8` : null,
    (player) => {
      player.loop = false;
    }
  );

  useEffect(() => {
    // Subscribe to video status changes using Supabase Realtime
    const subscription = supabase
      .channel('video-updates')
      .on(
        'postgres_changes',
        {
          event: 'UPDATE',
          schema: 'public',
          table: 'videos',
          filter: `id=eq.${videoId}`,
        },
        (payload) => {
          const video = payload.new;
          setStatus(video.status);
          if (video.status === 'ready') {
            setPlaybackId(video.playbackId);
          }
        }
      )
      .subscribe();

    return () => {
      subscription.unsubscribe();
    };
  }, [videoId]);

  if (status === 'failed') {
    return (
      <View style={styles.container}>
        <Text style={styles.errorText}>
          Video generation failed. Please try again.
        </Text>
      </View>
    );
  }

  if (status === 'processing' || !playbackId) {
    return (
      <View style={styles.container}>
        <ActivityIndicator size="large" color="#007AFF" />
        <Text style={styles.text}>Generating your video...</Text>
      </View>
    );
  }

  return (
    <VideoView
      player={player}
      style={styles.video}
      nativeControls
    />
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    padding: 20,
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  text: {
    marginTop: 10,
    fontSize: 16,
    color: '#666',
  },
  errorText: {
    fontSize: 16,
    color: '#ff0000',
    textAlign: 'center',
  },
});
```

### Complete AI video workflow example

Here's the full flow for an async video generation example app:

```tsx
import React, { useState } from 'react';
import {
  View,
  TextInput,
  TouchableOpacity,
  Text,
  ActivityIndicator,
  StyleSheet,
} from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function AIVideoGenerator() {
  const [prompt, setPrompt] = useState('');
  const [generating, setGenerating] = useState(false);
  const [videoId, setVideoId] = useState<string | null>(null);
  const [playbackId, setPlaybackId] = useState<string | null>(null);

  const player = useVideoPlayer(
    playbackId ? `https://stream.mux.com/${playbackId}.m3u8` : null,
    (player) => {
      player.loop = false;
    }
  );

  const generateVideo = async () => {
    if (!prompt.trim()) return;

    setGenerating(true);

    try {
      // Step 1: Submit prompt to your backend
      const response = await fetch('https://your-api.com/generate-video', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt }),
      });

      const { videoId } = await response.json();
      setVideoId(videoId);

      // Step 2: Your backend handles:
      // - Calling AI service (Fal.ai, Runway, etc.)
      // - Getting video URL from AI service
      // - Creating Mux asset from URL
      // - Waiting for Mux webhook

      // Step 3: Poll or subscribe for updates
      const checkStatus = async () => {
        const statusResponse = await fetch(
          `https://your-api.com/videos/${videoId}`
        );
        const video = await statusResponse.json();

        if (video.status === 'ready') {
          setPlaybackId(video.playbackId);
          setGenerating(false);
        } else if (video.status === 'failed') {
          setGenerating(false);
          alert('Video generation failed');
        } else {
          // Still processing, check again in 3 seconds
          setTimeout(checkStatus, 3000);
        }
      };

      checkStatus();
    } catch (error) {
      console.error('Generation failed:', error);
      setGenerating(false);
    }
  };

  if (playbackId) {
    return (
      <View style={styles.container}>
        <VideoView
          player={player}
          style={styles.video}
          nativeControls
        />
        <TouchableOpacity
          style={styles.button}
          onPress={() => {
            setPrompt('');
            setPlaybackId(null);
            setVideoId(null);
          }}
        >
          <Text style={styles.buttonText}>Generate Another</Text>
        </TouchableOpacity>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <TextInput
        style={styles.input}
        placeholder="Enter video prompt..."
        value={prompt}
        onChangeText={setPrompt}
        multiline
        editable={!generating}
      />

      {generating ? (
        <>
          <ActivityIndicator size="large" color="#007AFF" />
          <Text style={styles.statusText}>
            Generating your video...
          </Text>
          <Text style={styles.subText}>
            This usually takes 30-60 seconds
          </Text>
        </>
      ) : (
        <TouchableOpacity style={styles.button} onPress={generateVideo}>
          <Text style={styles.buttonText}>Generate Video</Text>
        </TouchableOpacity>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    padding: 20,
    justifyContent: 'center',
  },
  input: {
    borderWidth: 1,
    borderColor: '#ccc',
    borderRadius: 8,
    padding: 15,
    fontSize: 16,
    marginBottom: 20,
    minHeight: 100,
  },
  button: {
    backgroundColor: '#007AFF',
    padding: 15,
    borderRadius: 8,
    alignItems: 'center',
  },
  buttonText: {
    color: '#fff',
    fontSize: 16,
    fontWeight: 'bold',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
    marginBottom: 20,
  },
  statusText: {
    fontSize: 16,
    marginTop: 10,
    textAlign: 'center',
  },
  subText: {
    fontSize: 14,
    color: '#666',
    marginTop: 5,
    textAlign: 'center',
  },
});
```

## Error handling

### Direct upload errors

Mobile networks are unreliable, so robust error handling is essential. Here are common upload errors and how to handle them:

**Common error scenarios:**

| Error | Cause | Solution |
|-------|-------|----------|
| Network timeout | Slow/unstable connection | Implement retry logic, allow resumable uploads |
| 403 Forbidden | Upload URL expired (valid for 48 hours) | Request a new upload URL from your backend |
| Connection lost | User switched from WiFi to cellular | Cancel upload, show option to retry |
| File too large | Video exceeds Mux limits | Validate file size before upload, compress if needed |
| Out of storage | Device storage full | Check available storage before upload |

**Enhanced error handling:**

```tsx
import { useState } from 'react';
import * as FileSystem from 'expo-file-system';
import * as Network from 'expo-network';

interface UploadError {
  message: string;
  canRetry: boolean;
  shouldRequestNewUrl: boolean;
}

export function useVideoUploadWithRetry() {
  const [uploading, setUploading] = useState(false);
  const [uploadProgress, setUploadProgress] = useState(0);
  const [error, setError] = useState<string | null>(null);
  const [retryCount, setRetryCount] = useState(0);

  const MAX_RETRIES = 3;

  const handleUploadError = (error: any): UploadError => {
    const message = error.message?.toLowerCase() || '';

    // Network errors - can retry
    if (message.includes('network') || message.includes('connection')) {
      return {
        message: 'Network error. Check your connection and try again.',
        canRetry: true,
        shouldRequestNewUrl: false,
      };
    }

    // Timeout errors - can retry
    if (message.includes('timeout')) {
      return {
        message: 'Upload timed out. Try again or use a shorter video.',
        canRetry: true,
        shouldRequestNewUrl: false,
      };
    }

    // Upload URL expired - need new URL
    if (message.includes('403') || message.includes('forbidden')) {
      return {
        message: 'Upload URL expired. Requesting a new one...',
        canRetry: true,
        shouldRequestNewUrl: true,
      };
    }

    // Server errors - can retry
    if (message.includes('500') || message.includes('502') || message.includes('503')) {
      return {
        message: 'Server error. Retrying...',
        canRetry: true,
        shouldRequestNewUrl: false,
      };
    }

    // Client errors - cannot retry
    if (message.includes('400') || message.includes('413')) {
      return {
        message: 'Invalid video file or file too large.',
        canRetry: false,
        shouldRequestNewUrl: false,
      };
    }

    // Generic error
    return {
      message: 'Upload failed. Please try again.',
      canRetry: true,
      shouldRequestNewUrl: false,
    };
  };

  const uploadVideoWithRetry = async (
    videoUri: string,
    getUploadUrl: () => Promise<{ uploadUrl: string; uploadId: string }>
  ): Promise<{ uploadId: string } | null> => {
    setUploading(true);
    setUploadProgress(0);
    setError(null);
    setRetryCount(0);

    let uploadUrl: string;
    let uploadId: string;

    // Get initial upload URL
    try {
      const result = await getUploadUrl();
      uploadUrl = result.uploadUrl;
      uploadId = result.uploadId;
    } catch (err) {
      setError('Failed to get upload URL');
      setUploading(false);
      return null;
    }

    // Retry loop
    for (let attempt = 0; attempt < MAX_RETRIES; attempt++) {
      try {
        setRetryCount(attempt);

        // Check network connectivity before upload
        const networkState = await Network.getNetworkStateAsync();
        if (!networkState.isConnected) {
          throw new Error('No network connection');
        }

        // Create upload task
        const uploadTask = FileSystem.createUploadTask(
          uploadUrl,
          videoUri,
          {
            httpMethod: 'PUT',
            uploadType: FileSystem.FileSystemUploadType.BINARY_CONTENT,
          },
          (progress) => {
            const percentage =
              progress.totalBytesSent / progress.totalBytesExpectedToSend;
            setUploadProgress(Math.round(percentage * 100));
          }
        );

        const uploadResponse = await uploadTask.uploadAsync();

        if (!uploadResponse || uploadResponse.status !== 200) {
          throw new Error(`Upload failed with status ${uploadResponse?.status}`);
        }

        // Success!
        setUploading(false);
        return { uploadId };
      } catch (err: any) {
        console.error(`Upload attempt ${attempt + 1} failed:`, err);

        const errorInfo = handleUploadError(err);
        setError(errorInfo.message);

        // If we can't retry or we've exhausted retries, give up
        if (!errorInfo.canRetry || attempt === MAX_RETRIES - 1) {
          setUploading(false);
          setUploadProgress(0);
          return null;
        }

        // If URL expired, get a new one
        if (errorInfo.shouldRequestNewUrl) {
          try {
            const result = await getUploadUrl();
            uploadUrl = result.uploadUrl;
            uploadId = result.uploadId;
          } catch {
            setError('Failed to get new upload URL');
            setUploading(false);
            setUploadProgress(0);
            return null;
          }
        }

        // Wait before retrying (exponential backoff)
        const delay = Math.min(1000 * Math.pow(2, attempt), 10000);
        await new Promise((resolve) => setTimeout(resolve, delay));
      }
    }

    setUploading(false);
    setUploadProgress(0);
    return null;
  };

  return { uploadVideoWithRetry, uploading, uploadProgress, error, retryCount };
}
```

**Using the retry hook:**

```tsx
import * as ImagePicker from 'expo-image-picker';

function VideoUploader() {
  const { uploadVideoWithRetry, uploading, uploadProgress, error, retryCount } =
    useVideoUploadWithRetry();

  const selectAndUpload = async () => {
    const result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.Videos,
    });

    if (result.canceled) return;

    const getUploadUrl = async () => {
      const response = await fetch('https://your-api.com/generate-upload-url', {
        method: 'POST',
      });
      return response.json();
    };

    await uploadVideoWithRetry(result.assets[0].uri, getUploadUrl);
  };

  return (
    <View>
      <TouchableOpacity onPress={selectAndUpload} disabled={uploading}>
        <Text>Select and Upload Video</Text>
      </TouchableOpacity>
      {uploading && (
        <View>
          <Text>Uploading: {uploadProgress}%</Text>
          {retryCount > 0 && <Text>Retry attempt {retryCount + 1}</Text>}
        </View>
      )}
      {error && <Text style={{ color: 'red' }}>{error}</Text>}
    </View>
  );
}
```

<Callout type="info">
  Install `expo-network` to check connectivity: `npx expo install expo-network`
</Callout>

### URL upload errors

Common issues when creating assets from URLs:

* **Invalid URL**: Ensure the URL is publicly accessible
* **Unsupported format**: Mux supports MP4, MOV, AVI, and more
* **File too large**: Check Mux's file size limits
* **URL expired**: Some AI services return temporary URLs
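
Most of these can be caught on your backend before calling `mux.video.assets.create`. The sketch below uses a hypothetical `isLikelyIngestibleUrl` helper to reject the obvious failure modes; a `HEAD` request against the URL is still worth doing for a definitive accessibility check, since this can't guarantee Mux's fetch will succeed.

```typescript
// Hypothetical pre-flight check before sending a URL to Mux.
// Catches the obvious failure modes listed above.
function isLikelyIngestibleUrl(videoUrl: string): { ok: boolean; reason?: string } {
  let url: URL;
  try {
    url = new URL(videoUrl);
  } catch {
    return { ok: false, reason: 'Not a valid URL' };
  }

  // Mux fetches over the public internet, so only http(s) URLs work
  if (url.protocol !== 'http:' && url.protocol !== 'https:') {
    return { ok: false, reason: `Unsupported protocol: ${url.protocol}` };
  }

  // Localhost / loopback addresses are not reachable by Mux
  if (url.hostname === 'localhost' || url.hostname === '127.0.0.1') {
    return { ok: false, reason: 'URL is not publicly accessible' };
  }

  return { ok: true };
}
```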

## Best practices

### For direct upload

1. **Show upload progress** - Use `FileSystem.createUploadTask` for progress updates
2. **Validate file size** - Check before uploading (e.g., max 5GB)
3. **Handle retries** - Network issues are common on mobile
4. **Compress videos** - Consider client-side compression for large files
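
The file-size check can run before the upload starts. This is a minimal sketch assuming you obtain the size in bytes first (e.g. via `expo-file-system`'s `getInfoAsync(uri, { size: true })`); `validateVideoSize` and the 5GB limit are illustrative, not Mux's actual limits.

```typescript
const MAX_UPLOAD_BYTES = 5 * 1024 * 1024 * 1024; // example limit: 5GB

// Hypothetical helper: decide whether a picked video is worth uploading.
// sizeBytes would come from FileSystem.getInfoAsync(uri, { size: true }).
function validateVideoSize(sizeBytes: number): { ok: boolean; message?: string } {
  if (sizeBytes <= 0) {
    return { ok: false, message: 'Could not determine file size' };
  }
  if (sizeBytes > MAX_UPLOAD_BYTES) {
    const gb = (sizeBytes / 1024 ** 3).toFixed(1);
    return { ok: false, message: `Video is ${gb}GB; please pick a smaller file` };
  }
  return { ok: true };
}
```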

### For URL upload

1. **Validate URLs** - Ensure they're publicly accessible before sending to Mux
2. **Handle temporary URLs** - Some AI services return URLs that expire quickly
3. **Store original URL** - Keep a reference in case you need to re-ingest
4. **Set timeouts** - AI video generation can take 30-120 seconds

### General

1. **Use webhooks** - More reliable than polling for asset status
2. **Store metadata** - Save prompt, user ID, timestamps in your database
3. **Handle failures gracefully** - Show clear error messages and retry options
4. **Monitor costs** - Track encoding and storage usage

<Callout type="info">
  Learn more about [Mux encoding tiers and pricing](https://www.mux.com/pricing) to optimize costs for your use case.
</Callout>


# Handle async video processing
Learn how to notify your React Native app when Mux videos are ready using webhooks, realtime databases, and push notifications.
Video processing is asynchronous - whether you're uploading from a device or ingesting from a URL, there's always a delay while Mux processes the video. This guide shows you how to handle this gracefully in React Native.

## Why video processing is async

After you upload a video or create an asset from a URL, Mux needs time to:

1. Download the video (if from URL)
2. Transcode it into multiple formats and qualities
3. Generate thumbnails and storyboards
4. Prepare it for adaptive bitrate streaming

This can take anywhere from **a few seconds** (short videos) to **several minutes** (long, high-resolution videos).

## Asset states

Mux assets go through several states:

| State | Meaning | Action |
|-------|---------|--------|
| `preparing` | Video is being processed | Show loading UI |
| `ready` | Video is ready to play | Display video player |
| `errored` | Processing failed | Show error message |

There are other intermediate states, but these are the main ones you'll need to handle in your app.
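
If your backend stores its own status column (as the examples in this guide do), a small mapping from Mux's asset states to your app-level states keeps the two in sync. A sketch, assuming the three-state model used throughout this guide:

```typescript
type AppVideoStatus = 'processing' | 'ready' | 'failed';

// Map a Mux asset status (event.data.status in a webhook payload)
// to the app-level status used by the UI examples in this guide.
function toAppStatus(muxStatus: string): AppVideoStatus {
  switch (muxStatus) {
    case 'ready':
      return 'ready';
    case 'errored':
      return 'failed';
    default:
      // 'preparing' and any intermediate states all render as a spinner
      return 'processing';
  }
}
```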

## Getting notified when videos are ready

Your backend receives webhook notifications from Mux when asset states change. Your React Native app then needs to know about these changes. There are three main patterns:

### Pattern comparison

| Pattern | Best For | Pros | Cons |
|---------|----------|------|------|
| **Realtime Database** | Production apps | Instant updates, efficient | Requires realtime infrastructure |
| **Polling** | Simple apps, prototypes | Easy to implement | Server load, delayed updates |
| **Push Notifications** | Long processes (>60s) | Works when app backgrounded | Requires notification permissions |

***

## Pattern 1: Realtime database (recommended)

The best approach for production apps is to use a realtime database like Supabase or Firebase. Your backend updates the database via webhooks, and React Native subscribes to changes.

### Architecture

```
Mux → Webhook → Your Backend → Database
                                   ↓
                              Realtime Update
                                   ↓
                           React Native App
```

### Backend: Handle Mux webhook

See the [uploading videos guide](/docs/frameworks/react-native-uploading-videos) for the complete webhook handler. Here's the key part:

```javascript
// Backend: Webhook handler
export async function handleMuxWebhook(req, res) {
  // Verify signature (see main docs)
  const event = req.body;

  if (event.type === 'video.asset.ready') {
    const { id, playback_ids, duration } = event.data;

    // Update your database
    await db.videos.update({
      where: { muxAssetId: id },
      data: {
        status: 'ready',
        playbackId: playback_ids[0].id,
        duration,
        updatedAt: new Date(),
      },
    });
    // Database realtime will notify subscribed clients automatically
  }

  res.json({ received: true });
}
```

### React Native: Subscribe to changes (Supabase)

```tsx
import React, { useEffect, useState } from 'react';
import { View, Text, ActivityIndicator, StyleSheet, Image } from 'react-native';
import { supabase } from './lib/supabase';
import { useVideoPlayer, VideoView } from 'expo-video';

interface Video {
  id: string;
  status: 'processing' | 'ready' | 'failed';
  playbackId: string | null;
  duration: number | null;
}

interface VideoStatusProps {
  videoId: string;
}

export default function VideoStatus({ videoId }: VideoStatusProps) {
  const [video, setVideo] = useState<Video | null>(null);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    // Fetch initial video state
    const fetchVideo = async () => {
      const { data, error } = await supabase
        .from('videos')
        .select('*')
        .eq('id', videoId)
        .single();

      if (error) {
        console.error('Failed to fetch video:', error);
      }
      if (data) {
        setVideo(data);
      }
      // Clear the spinner even if the fetch failed
      setLoading(false);
    };

    fetchVideo();

    // Subscribe to realtime updates
    const subscription = supabase
      .channel(`video-${videoId}`)
      .on(
        'postgres_changes',
        {
          event: 'UPDATE',
          schema: 'public',
          table: 'videos',
          filter: `id=eq.${videoId}`,
        },
        (payload) => {
          console.log('Video updated:', payload.new);
          setVideo(payload.new as Video);
        }
      )
      .subscribe();

    return () => {
      subscription.unsubscribe();
    };
  }, [videoId]);

  if (loading) {
    return (
      <View style={styles.container}>
        <ActivityIndicator size="large" />
      </View>
    );
  }

  if (video?.status === 'failed') {
    return (
      <View style={styles.container}>
        <Text style={styles.errorText}>
          Video processing failed. Please try again.
        </Text>
      </View>
    );
  }

  if (video?.status === 'processing' || !video?.playbackId) {
    return (
      <View style={styles.container}>
        <ActivityIndicator size="large" color="#007AFF" />
        <Text style={styles.statusText}>Processing your video...</Text>
        <Text style={styles.subText}>This usually takes 30-60 seconds</Text>
      </View>
    );
  }

  return <VideoPlayer playbackId={video.playbackId} />;
}

function VideoPlayer({ playbackId }: { playbackId: string }) {
  const [showPoster, setShowPoster] = useState(true);
  const posterUrl = `https://image.mux.com/${playbackId}/thumbnail.png?time=0`;

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  return (
    <View style={styles.videoContainer}>
      <VideoView
        player={player}
        style={styles.video}
        nativeControls
        contentFit="contain"
        onFirstFrameRender={() => setShowPoster(false)}
      />
      {showPoster && (
        <Image
          source={{ uri: posterUrl }}
          style={[styles.video, styles.poster]}
          resizeMode="cover"
        />
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    padding: 20,
  },
  videoContainer: {
    position: 'relative',
    width: '100%',
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  poster: {
    position: 'absolute',
    top: 0,
    left: 0,
  },
  statusText: {
    fontSize: 16,
    marginTop: 15,
    color: '#333',
  },
  subText: {
    fontSize: 14,
    marginTop: 5,
    color: '#666',
  },
  errorText: {
    fontSize: 16,
    color: '#ff0000',
    textAlign: 'center',
  },
});
```

### React Native: Subscribe to changes (Firebase)

```tsx
import React, { useEffect, useState } from 'react';
import { View, ActivityIndicator, StyleSheet } from 'react-native';
import firestore from '@react-native-firebase/firestore';
import { useVideoPlayer, VideoView } from 'expo-video';

export default function VideoStatus({ videoId }: { videoId: string }) {
  const [status, setStatus] = useState<'processing' | 'ready' | 'failed'>('processing');
  const [playbackId, setPlaybackId] = useState<string | null>(null);

  useEffect(() => {
    // Subscribe to Firestore document changes
    const unsubscribe = firestore()
      .collection('videos')
      .doc(videoId)
      .onSnapshot((documentSnapshot) => {
        const data = documentSnapshot.data();
        if (data) {
          setStatus(data.status);
          if (data.status === 'ready') {
            setPlaybackId(data.playbackId);
          }
        }
      });

    return () => unsubscribe();
  }, [videoId]);

  if (status === 'processing' || !playbackId) {
    return <ActivityIndicator size="large" />;
  }

  return <VideoPlayer playbackId={playbackId} />;
}

function VideoPlayer({ playbackId }: { playbackId: string }) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  return (
    <VideoView
      player={player}
      style={styles.video}
      nativeControls
    />
  );
}

const styles = StyleSheet.create({
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
});
```

<Callout type="info">
  **Setup required:** Both Supabase and Firebase require configuration. See [Supabase Realtime docs](https://supabase.com/docs/guides/realtime) or [Firebase Firestore docs](https://firebase.google.com/docs/firestore) for setup instructions.
</Callout>

***

## Pattern 2: Polling from React Native

If you don't have realtime infrastructure, you can poll your backend for status updates. This is simpler but less efficient.

<Callout type="warning">
  Polling creates unnecessary server load and provides slower updates compared to realtime databases. Use this only for prototypes or simple apps.
</Callout>

```tsx
import React, { useEffect, useState, useRef } from 'react';
import { View, Text, ActivityIndicator, StyleSheet } from 'react-native';
import { useVideoPlayer, VideoView } from 'expo-video';

interface VideoPollerProps {
  videoId: string;
  pollInterval?: number; // milliseconds
}

export default function VideoPoller({
  videoId,
  pollInterval = 3000, // Poll every 3 seconds
}: VideoPollerProps) {
  const [status, setStatus] = useState<'processing' | 'ready' | 'failed'>('processing');
  const [playbackId, setPlaybackId] = useState<string | null>(null);
  const [attempts, setAttempts] = useState(0);
  const maxAttempts = 60; // Stop after 3 minutes (60 * 3s)
  const intervalRef = useRef<NodeJS.Timeout | null>(null);
  // Mirror attempts in a ref so the effect below doesn't restart the
  // interval on every state update
  const attemptsRef = useRef(0);

  useEffect(() => {
    const stopPolling = () => {
      if (intervalRef.current) {
        clearInterval(intervalRef.current);
        intervalRef.current = null;
      }
    };

    const recordAttempt = () => {
      attemptsRef.current += 1;
      setAttempts(attemptsRef.current);
      if (attemptsRef.current >= maxAttempts) {
        // Timeout - stop polling
        stopPolling();
        setStatus('failed');
      }
    };

    const checkVideoStatus = async () => {
      try {
        const response = await fetch(
          `https://your-api.com/videos/${videoId}/status`
        );
        const data = await response.json();

        if (data.status === 'ready') {
          setStatus('ready');
          setPlaybackId(data.playbackId);
          stopPolling();
        } else if (data.status === 'failed') {
          setStatus('failed');
          stopPolling();
        } else {
          recordAttempt();
        }
      } catch (error) {
        console.error('Failed to check video status:', error);
        // Count failed checks too, so polling can't run forever
        recordAttempt();
      }
    };

    // Initial check, then poll on an interval
    checkVideoStatus();
    intervalRef.current = setInterval(checkVideoStatus, pollInterval);

    return stopPolling;
  }, [videoId, pollInterval]);

  if (status === 'failed') {
    return (
      <View style={styles.container}>
        <Text style={styles.errorText}>
          Video processing failed or timed out.
        </Text>
      </View>
    );
  }

  if (status === 'processing' || !playbackId) {
    return (
      <View style={styles.container}>
        <ActivityIndicator size="large" color="#007AFF" />
        <Text style={styles.statusText}>
          Processing... ({Math.floor((attempts * pollInterval) / 1000)}s)
        </Text>
      </View>
    );
  }

  return <VideoPlayer playbackId={playbackId} />;
}

function VideoPlayer({ playbackId }: { playbackId: string }) {
  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  return (
    <VideoView
      player={player}
      style={styles.video}
      nativeControls
    />
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    padding: 20,
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  statusText: {
    fontSize: 16,
    marginTop: 10,
    color: '#666',
  },
  errorText: {
    fontSize: 16,
    color: '#ff0000',
    textAlign: 'center',
  },
});
```

### Polling best practices

1. **Set a maximum number of attempts** - Don't poll forever
2. **Use reasonable intervals** - 3-5 seconds is typical
3. **Stop polling when done** - Clean up intervals on unmount
4. **Handle errors gracefully** - Network issues happen
5. **Show elapsed time** - Help users understand progress
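
The practices above can be folded into one small, framework-agnostic helper. This is a sketch, not part of any Mux SDK — `checkStatus` stands in for whatever request your app makes (for example, fetching the video row from your backend):

```typescript
// A polling helper illustrating the practices above: capped attempts,
// a fixed interval, tolerated errors, and a clean stop on success or timeout.
type PollResult<T> = { ok: true; value: T } | { ok: false; reason: 'timeout' };

async function pollUntil<T>(
  checkStatus: () => Promise<T | null>, // resolve null while still processing
  { intervalMs = 3000, maxAttempts = 60 } = {}
): Promise<PollResult<T>> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      const value = await checkStatus();
      if (value !== null) return { ok: true, value };
    } catch {
      // Network hiccups are expected; keep polling until maxAttempts
    }
    if (attempt < maxAttempts) {
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
  return { ok: false, reason: 'timeout' };
}
```

In a component you would call this from an effect and surface the `timeout` case as a failed state, the same way the hook above does.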

***

## Pattern 3: Push notifications

For longer processing times (AI video generation can take 30-120 seconds), push notifications ensure users are notified even if they navigate away or background the app.

### Set up Expo Notifications

```bash
npx expo install expo-notifications expo-device expo-constants
```

### Request notification permissions

```tsx
import * as Notifications from 'expo-notifications';
import * as Device from 'expo-device';
import Constants from 'expo-constants';
import { Platform } from 'react-native';

async function registerForPushNotificationsAsync() {
  let token;

  if (Platform.OS === 'android') {
    await Notifications.setNotificationChannelAsync('default', {
      name: 'default',
      importance: Notifications.AndroidImportance.MAX,
    });
  }

  if (Device.isDevice) {
    const { status: existingStatus } = await Notifications.getPermissionsAsync();
    let finalStatus = existingStatus;

    if (existingStatus !== 'granted') {
      const { status } = await Notifications.requestPermissionsAsync();
      finalStatus = status;
    }

    if (finalStatus !== 'granted') {
      alert('Failed to get push token for push notification!');
      return;
    }

    // Recent Expo SDKs require your EAS project ID to issue a push token
    token = (
      await Notifications.getExpoPushTokenAsync({
        projectId: Constants.expoConfig?.extra?.eas?.projectId,
      })
    ).data;
  }

  return token;
}
```

### Send notification when video is ready (backend)

```javascript
// Backend: After video is ready
async function notifyUserVideoReady(userId, videoId, playbackId) {
  // Get user's push token from your database
  const user = await db.users.findUnique({ where: { id: userId } });

  if (user?.pushToken) {
    await fetch('https://exp.host/--/api/v2/push/send', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        to: user.pushToken,
        title: 'Your video is ready! 🎉',
        body: 'Tap to watch your AI-generated video',
        data: { videoId, playbackId },
      }),
    });
  }
}
```

### Handle notification tap (React Native)

```tsx
import { useEffect, useRef } from 'react';
import { useNavigation } from '@react-navigation/native';
import * as Notifications from 'expo-notifications';

export function useNotificationHandler() {
  const navigation = useNavigation();
  const notificationListener = useRef<any>();
  const responseListener = useRef<any>();

  useEffect(() => {
    // Handle notification when app is foregrounded
    notificationListener.current = Notifications.addNotificationReceivedListener(
      (notification) => {
        console.log('Notification received:', notification);
      }
    );

    // Handle notification tap
    responseListener.current = Notifications.addNotificationResponseReceivedListener(
      (response) => {
        const { videoId } = response.notification.request.content.data;

        // Navigate to video screen
        navigation.navigate('Video', { videoId });
      }
    );

    return () => {
      notificationListener.current?.remove();
      responseListener.current?.remove();
    };
  }, [navigation]);
}
```

<Callout type="info">
  Push notifications require additional setup including APNs (iOS) and FCM (Android) configuration. See [Expo Notifications docs](https://docs.expo.dev/push-notifications/overview/) for details.
</Callout>

***

## UI patterns for processing states

### Loading with progress indicator

```tsx
import React, { useState, useEffect } from 'react';
import { View, Text, ActivityIndicator, StyleSheet } from 'react-native';
import { LinearGradient } from 'expo-linear-gradient';

export function ProcessingIndicator({ estimatedTime = 60 }: { estimatedTime?: number }) {
  const [elapsedTime, setElapsedTime] = useState(0);

  useEffect(() => {
    const interval = setInterval(() => {
      setElapsedTime((prev) => prev + 1);
    }, 1000);

    return () => clearInterval(interval);
  }, []);

  const progress = Math.min((elapsedTime / estimatedTime) * 100, 95);

  return (
    <View style={styles.container}>
      <ActivityIndicator size="large" color="#007AFF" />
      <Text style={styles.title}>Generating your video</Text>
      <Text style={styles.subtitle}>This usually takes {estimatedTime}s</Text>

      <View style={styles.progressBar}>
        <View style={[styles.progressFill, { width: `${progress}%` }]} />
      </View>

      <Text style={styles.time}>{elapsedTime}s elapsed</Text>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    padding: 40,
    alignItems: 'center',
  },
  title: {
    fontSize: 18,
    fontWeight: 'bold',
    marginTop: 20,
  },
  subtitle: {
    fontSize: 14,
    color: '#666',
    marginTop: 5,
  },
  progressBar: {
    width: '100%',
    height: 4,
    backgroundColor: '#e0e0e0',
    borderRadius: 2,
    marginTop: 20,
    overflow: 'hidden',
  },
  progressFill: {
    height: '100%',
    backgroundColor: '#007AFF',
  },
  time: {
    fontSize: 12,
    color: '#999',
    marginTop: 10,
  },
});
```

### Success animation

```tsx
import React, { useEffect } from 'react';
import { View, Text, StyleSheet } from 'react-native';
import Animated, {
  useSharedValue,
  useAnimatedStyle,
  withSpring,
  withSequence,
} from 'react-native-reanimated';

export function SuccessAnimation() {
  const scale = useSharedValue(0);

  useEffect(() => {
    scale.value = withSequence(
      withSpring(1.2, { damping: 2 }),
      withSpring(1)
    );
  }, []);

  const animatedStyle = useAnimatedStyle(() => ({
    transform: [{ scale: scale.value }],
  }));

  return (
    <View style={styles.container}>
      <Animated.View style={[styles.checkmark, animatedStyle]}>
        <Text style={styles.checkmarkText}>✓</Text>
      </Animated.View>
      <Text style={styles.text}>Video ready!</Text>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    alignItems: 'center',
    padding: 20,
  },
  checkmark: {
    width: 80,
    height: 80,
    borderRadius: 40,
    backgroundColor: '#4CAF50',
    justifyContent: 'center',
    alignItems: 'center',
  },
  checkmarkText: {
    fontSize: 48,
    color: '#fff',
  },
  text: {
    fontSize: 18,
    fontWeight: 'bold',
    marginTop: 15,
  },
});
```

### Error with retry

```tsx
import React from 'react';
import { View, Text, TouchableOpacity, StyleSheet } from 'react-native';

interface ErrorStateProps {
  message: string;
  onRetry: () => void;
}

export function ErrorState({ message, onRetry }: ErrorStateProps) {
  return (
    <View style={styles.container}>
      <Text style={styles.icon}>⚠️</Text>
      <Text style={styles.title}>Processing Failed</Text>
      <Text style={styles.message}>{message}</Text>

      <TouchableOpacity style={styles.button} onPress={onRetry}>
        <Text style={styles.buttonText}>Try Again</Text>
      </TouchableOpacity>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    padding: 40,
    alignItems: 'center',
  },
  icon: {
    fontSize: 48,
  },
  title: {
    fontSize: 20,
    fontWeight: 'bold',
    marginTop: 10,
  },
  message: {
    fontSize: 14,
    color: '#666',
    textAlign: 'center',
    marginTop: 10,
  },
  button: {
    backgroundColor: '#007AFF',
    paddingHorizontal: 30,
    paddingVertical: 12,
    borderRadius: 8,
    marginTop: 20,
  },
  buttonText: {
    color: '#fff',
    fontSize: 16,
    fontWeight: 'bold',
  },
});
```

## Complete workflow example

Putting it all together - AI video generation with async handling:

```tsx
import React, { useState, useEffect } from 'react';
import { View, TextInput, TouchableOpacity, Text, StyleSheet, Image } from 'react-native';
import { supabase } from './lib/supabase';
import { useVideoPlayer, VideoView } from 'expo-video';
import { ProcessingIndicator } from './ProcessingIndicator';
import { SuccessAnimation } from './SuccessAnimation';
import { ErrorState } from './ErrorState';

export default function AIVideoGenerator() {
  const [prompt, setPrompt] = useState('');
  const [videoId, setVideoId] = useState<string | null>(null);
  const [status, setStatus] = useState<'idle' | 'generating' | 'processing' | 'ready' | 'failed'>('idle');
  const [playbackId, setPlaybackId] = useState<string | null>(null);
  const [error, setError] = useState<string | null>(null);

  useEffect(() => {
    if (!videoId) return;

    // Subscribe to video status updates
    const subscription = supabase
      .channel(`video-${videoId}`)
      .on(
        'postgres_changes',
        {
          event: 'UPDATE',
          schema: 'public',
          table: 'videos',
          filter: `id=eq.${videoId}`,
        },
        (payload) => {
          const video = payload.new;
          setStatus(video.status);

          if (video.status === 'ready') {
            setPlaybackId(video.playback_id);
          } else if (video.status === 'failed') {
            setError(video.error || 'Video generation failed');
          }
        }
      )
      .subscribe();

    return () => {
      subscription.unsubscribe();
    };
  }, [videoId]);

  const generateVideo = async () => {
    if (!prompt.trim()) return;

    setStatus('generating');
    setError(null);

    try {
      const response = await fetch('https://your-api.com/generate-video', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt }),
      });

      const data = await response.json();
      setVideoId(data.videoId);
      setStatus('processing');
    } catch (err) {
      setStatus('failed');
      setError('Failed to start video generation');
    }
  };

  const reset = () => {
    setPrompt('');
    setVideoId(null);
    setStatus('idle');
    setPlaybackId(null);
    setError(null);
  };

  if (status === 'ready' && playbackId) {
    return (
      <View style={styles.container}>
        <SuccessAnimation />
        <VideoPlayer playbackId={playbackId} />
        <TouchableOpacity style={styles.button} onPress={reset}>
          <Text style={styles.buttonText}>Generate Another</Text>
        </TouchableOpacity>
      </View>
    );
  }

  if (status === 'failed') {
    return (
      <ErrorState
        message={error || 'Something went wrong'}
        onRetry={reset}
      />
    );
  }

  if (status === 'generating' || status === 'processing') {
    return (
      <View style={styles.container}>
        <ProcessingIndicator estimatedTime={60} />
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <Text style={styles.title}>Generate AI Video</Text>
      <TextInput
        style={styles.input}
        placeholder="Describe your video..."
        value={prompt}
        onChangeText={setPrompt}
        multiline
        numberOfLines={4}
      />
      <TouchableOpacity
        style={[styles.button, !prompt.trim() && styles.buttonDisabled]}
        onPress={generateVideo}
        disabled={!prompt.trim()}
      >
        <Text style={styles.buttonText}>Generate</Text>
      </TouchableOpacity>
    </View>
  );
}

function VideoPlayer({ playbackId }: { playbackId: string }) {
  const [showPoster, setShowPoster] = useState(true);
  const posterUrl = `https://image.mux.com/${playbackId}/thumbnail.png?time=0`;

  const player = useVideoPlayer(
    `https://stream.mux.com/${playbackId}.m3u8`,
    player => {
      player.play();
    }
  );

  return (
    <View style={styles.videoContainer}>
      <VideoView
        player={player}
        style={styles.video}
        nativeControls
        contentFit="contain"
        onFirstFrameRender={() => setShowPoster(false)}
      />
      {showPoster && (
        <Image
          source={{ uri: posterUrl }}
          style={[styles.video, styles.poster]}
          resizeMode="cover"
        />
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    padding: 20,
  },
  title: {
    fontSize: 24,
    fontWeight: 'bold',
    marginBottom: 20,
  },
  input: {
    borderWidth: 1,
    borderColor: '#ddd',
    borderRadius: 8,
    padding: 15,
    fontSize: 16,
    minHeight: 120,
    textAlignVertical: 'top',
  },
  button: {
    backgroundColor: '#007AFF',
    padding: 15,
    borderRadius: 8,
    alignItems: 'center',
    marginTop: 15,
  },
  buttonDisabled: {
    backgroundColor: '#ccc',
  },
  buttonText: {
    color: '#fff',
    fontSize: 16,
    fontWeight: 'bold',
  },
  videoContainer: {
    position: 'relative',
    width: '100%',
    marginVertical: 20,
  },
  video: {
    width: '100%',
    aspectRatio: 16 / 9,
  },
  poster: {
    position: 'absolute',
    top: 0,
    left: 0,
  },
});
```

## Best practices

1. **Always clean up subscriptions** - Prevent memory leaks
2. **Show meaningful progress** - Elapsed time, estimated time remaining
3. **Handle edge cases** - What if the user navigates away?
4. **Set timeouts** - Don't wait forever (max 3-5 minutes)
5. **Provide feedback** - Loading states, success animations, error messages
6. **Allow cancellation** - Let users cancel long operations
7. **Test on slow networks** - Video processing + slow network = long waits
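
As a concrete example of the timeout point, a small wrapper can bound any async wait (a polling loop, a realtime subscription, an upload) so the UI always falls back to an error state. This is a sketch; the five-minute cap and error message are illustrative:

```typescript
// Race an operation against a timer so the app never waits forever.
// The timer is cleared whether the operation wins or loses the race.
async function withTimeout<T>(
  operation: Promise<T>,
  ms = 5 * 60 * 1000 // illustrative cap: give up after five minutes
): Promise<T> {
  let timer: ReturnType<typeof setTimeout>;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('Processing timed out')), ms);
  });
  try {
    return await Promise.race([operation, timeout]);
  } finally {
    clearTimeout(timer!);
  }
}
```

Catching the rejection is where you would set `status` to `failed` and show the retry UI.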

<Callout type="info">
  For more details on webhook setup and verification, see the [listen for webhooks guide](/docs/core/listen-for-webhooks) in the main Mux documentation.
</Callout>


# Build a Stories/Reels UI in React Native
Create an Instagram Stories or TikTok-style vertical video feed with full-screen videos, swipe navigation, and gesture controls.
This guide shows you how to build a vertical, full-screen video feed similar to Instagram Stories, TikTok, or Snapchat Spotlight. Users swipe up and down to navigate between videos, with automatic playback and gesture controls.

<Callout type="info">
  This is one of the most engaging UX patterns for short-form video. It's perfect for AI-generated content, user stories, or any vertical video feed.
</Callout>

## What we're building

A Stories-style interface with:

* **Full-screen vertical videos** (one video visible at a time)
* **Swipe navigation** (swipe up/down to see next/previous video)
* **Auto-play** (current video plays automatically)
* **Gesture controls** (tap to pause, double-tap to like)
* **UI overlay** (username, caption, stats, actions)
* **Smooth transitions** between videos
* **Preloading** for seamless playback

***

## Architecture overview

The Stories UI uses these key components:

```
StoriesScreen
  ├── FlatList (pagingEnabled, vertical)
  │   └── StoryItem (full-screen video + overlay)
  │       ├── MuxVideo (with auto-play logic)
  │       ├── VideoOverlay (username, stats, actions)
  │       └── GestureDetector (tap, double-tap)
  └── PreloadManager (loads next videos)
```

***

## Step 1: Configure FlatList for full-screen paging

The foundation is a FlatList configured for vertical paging:

```tsx
import React, { useRef, useState, useCallback } from 'react';
import {
  FlatList,
  Dimensions,
  StyleSheet,
  ViewToken,
  View,
} from 'react-native';

const { height: SCREEN_HEIGHT } = Dimensions.get('window');

// Use a distinct name for the data shape so it can't clash with the
// `Video` player component imported in later steps
interface VideoItem {
  id: string;
  playbackId: string;
  title: string;
  username: string;
  userId: string;
  viewCount: number;
  likeCount: number;
}

interface StoriesFeedProps {
  videos: VideoItem[];
}

export default function StoriesFeed({ videos }: StoriesFeedProps) {
  const [currentIndex, setCurrentIndex] = useState(0);
  const flatListRef = useRef<FlatList>(null);

  const onViewableItemsChanged = useCallback(
    ({ viewableItems }: { viewableItems: ViewToken[] }) => {
      if (viewableItems.length > 0) {
        const index = viewableItems[0].index;
        if (index !== null) {
          setCurrentIndex(index);
        }
      }
    },
    []
  );

  const viewabilityConfig = useRef({
    itemVisiblePercentThreshold: 50, // Item is "visible" when 50% is on screen
  }).current;

  return (
    <FlatList
      ref={flatListRef}
      data={videos}
      renderItem={({ item, index }) => (
        <StoryItem
          video={item}
          isActive={index === currentIndex}
        />
      )}
      keyExtractor={(item) => item.id}
      pagingEnabled
      showsVerticalScrollIndicator={false}
      snapToInterval={SCREEN_HEIGHT}
      snapToAlignment="start"
      decelerationRate="fast"
      onViewableItemsChanged={onViewableItemsChanged}
      viewabilityConfig={viewabilityConfig}
      getItemLayout={(data, index) => ({
        length: SCREEN_HEIGHT,
        offset: SCREEN_HEIGHT * index,
        index,
      })}
      windowSize={3} // Render 1 above, 1 current, 1 below
      maxToRenderPerBatch={2}
      removeClippedSubviews
    />
  );
}
```

### Key FlatList props

| Prop | Purpose |
|------|---------|
| `pagingEnabled` | Snaps to full screens |
| `snapToInterval={SCREEN_HEIGHT}` | Ensures exact screen alignment |
| `snapToAlignment="start"` | Aligns to top of screen |
| `decelerationRate="fast"` | Quick snap to next video |
| `windowSize={3}` | Renders 3 items (prev, current, next) |
| `getItemLayout` | Optimizes scroll performance |
| `removeClippedSubviews` | Unmounts off-screen items (memory optimization) |

***

## Step 2: Build the StoryItem component

Each story item is a full-screen video with controls:

```tsx
import React, { useRef, useState, useEffect } from 'react';
import { View, StyleSheet, Dimensions } from 'react-native';
import Video, { VideoRef } from 'react-native-video';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';

const { width: SCREEN_WIDTH, height: SCREEN_HEIGHT } = Dimensions.get('window');

// `Video` here is the player component from react-native-video, so the data
// shape needs its own name (same shape as the interface from Step 1)
interface VideoItem {
  id: string;
  playbackId: string;
  title: string;
  username: string;
  userId: string;
  viewCount: number;
  likeCount: number;
}

interface StoryItemProps {
  video: VideoItem;
  isActive: boolean;
}

export function StoryItem({ video, isActive }: StoryItemProps) {
  const videoRef = useRef<VideoRef>(null);
  const [paused, setPaused] = useState(!isActive);
  const [liked, setLiked] = useState(false);

  // Auto-play when active, pause when not
  useEffect(() => {
    setPaused(!isActive);
  }, [isActive]);

  // Single tap: pause/play
  const singleTap = Gesture.Tap()
    .numberOfTaps(1)
    .onEnd(() => {
      setPaused((prev) => !prev);
    });

  // Double tap: like
  const doubleTap = Gesture.Tap()
    .numberOfTaps(2)
    .onEnd(() => {
      setLiked(true);
      // TODO: Call API to like video
    });

  const taps = Gesture.Exclusive(doubleTap, singleTap);

  return (
    <View style={styles.container}>
      <GestureDetector gesture={taps}>
        <View style={styles.videoContainer}>
          <Video
            ref={videoRef}
            source={{ uri: `https://stream.mux.com/${video.playbackId}.m3u8` }}
            poster={`https://image.mux.com/${video.playbackId}/thumbnail.png?time=0`}
            posterResizeMode="cover"
            style={styles.video}
            paused={paused}
            repeat={true} // Loop the video
            resizeMode="cover"
            onError={(error) => console.error('Video error:', error)}
          />

          {/* Overlay UI */}
          <VideoOverlay
            username={video.username}
            title={video.title}
            viewCount={video.viewCount}
            likeCount={video.likeCount}
            liked={liked}
            onLike={() => setLiked(!liked)}
          />

          {/* Like animation (shown on double-tap) */}
          {liked && <LikeAnimation />}
        </View>
      </GestureDetector>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    width: SCREEN_WIDTH,
    height: SCREEN_HEIGHT,
    backgroundColor: '#000',
  },
  videoContainer: {
    flex: 1,
    position: 'relative',
  },
  video: {
    position: 'absolute',
    top: 0,
    left: 0,
    right: 0,
    bottom: 0,
  },
});
```

<Callout type="info">
  **Install gesture handler** if you haven't already:

  ```bash
  npx expo install react-native-gesture-handler
  ```
</Callout>

***

## Step 3: Create the video overlay

The overlay displays video metadata and actions:

```tsx
import React from 'react';
import { View, Text, StyleSheet, TouchableOpacity } from 'react-native';
import { LinearGradient } from 'expo-linear-gradient';

interface VideoOverlayProps {
  username: string;
  title: string;
  viewCount: number;
  likeCount: number;
  liked: boolean;
  onLike: () => void;
}

export function VideoOverlay({
  username,
  title,
  viewCount,
  likeCount,
  liked,
  onLike,
}: VideoOverlayProps) {
  return (
    <>
      {/* Top gradient for better text readability */}
      <LinearGradient
        colors={['rgba(0,0,0,0.6)', 'transparent']}
        style={styles.topGradient}
      />

      {/* Bottom gradient and content */}
      <LinearGradient
        colors={['transparent', 'rgba(0,0,0,0.8)']}
        style={styles.bottomGradient}
      >
        <View style={styles.bottomContent}>
          {/* Left side: User info and caption */}
          <View style={styles.leftContent}>
            <Text style={styles.username}>@{username}</Text>
            <Text style={styles.title} numberOfLines={2}>
              {title}
            </Text>
            <Text style={styles.views}>
              {formatNumber(viewCount)} views
            </Text>
          </View>

          {/* Right side: Actions */}
          <View style={styles.rightContent}>
            <ActionButton
              icon={liked ? '❤️' : '🤍'}
              label={formatNumber(likeCount)}
              onPress={onLike}
            />
            <ActionButton
              icon="💬"
              label="Comment"
              onPress={() => {/* TODO */}}
            />
            <ActionButton
              icon="🔗"
              label="Share"
              onPress={() => {/* TODO */}}
            />
          </View>
        </View>
      </LinearGradient>
    </>
  );
}

function ActionButton({
  icon,
  label,
  onPress,
}: {
  icon: string;
  label: string;
  onPress: () => void;
}) {
  return (
    <TouchableOpacity style={styles.actionButton} onPress={onPress}>
      <Text style={styles.actionIcon}>{icon}</Text>
      <Text style={styles.actionLabel}>{label}</Text>
    </TouchableOpacity>
  );
}

function formatNumber(num: number): string {
  if (num >= 1000000) return `${(num / 1000000).toFixed(1)}M`;
  if (num >= 1000) return `${(num / 1000).toFixed(1)}K`;
  return num.toString();
}

const styles = StyleSheet.create({
  topGradient: {
    position: 'absolute',
    top: 0,
    left: 0,
    right: 0,
    height: 150,
    zIndex: 1,
  },
  bottomGradient: {
    position: 'absolute',
    bottom: 0,
    left: 0,
    right: 0,
    paddingBottom: 40,
    zIndex: 1,
  },
  bottomContent: {
    flexDirection: 'row',
    padding: 20,
    justifyContent: 'space-between',
    alignItems: 'flex-end',
  },
  leftContent: {
    flex: 1,
    marginRight: 20,
  },
  username: {
    color: '#fff',
    fontSize: 16,
    fontWeight: 'bold',
    marginBottom: 5,
  },
  title: {
    color: '#fff',
    fontSize: 14,
    marginBottom: 8,
  },
  views: {
    color: 'rgba(255,255,255,0.7)',
    fontSize: 12,
  },
  rightContent: {
    alignItems: 'center',
    gap: 20,
  },
  actionButton: {
    alignItems: 'center',
  },
  actionIcon: {
    fontSize: 32,
    marginBottom: 4,
  },
  actionLabel: {
    color: '#fff',
    fontSize: 12,
    fontWeight: '500',
  },
});
```

***

## Step 4: Add like animation

Show a heart animation when users double-tap:

```tsx
import React, { useEffect } from 'react';
import { StyleSheet } from 'react-native';
import Animated, {
  useSharedValue,
  useAnimatedStyle,
  withSpring,
  withSequence,
  runOnJS,
} from 'react-native-reanimated';

export function LikeAnimation({ onComplete }: { onComplete?: () => void }) {
  const scale = useSharedValue(0);
  const opacity = useSharedValue(1);

  useEffect(() => {
    scale.value = withSequence(
      withSpring(1.2, { damping: 10 }),
      withSpring(1, { damping: 10 }),
      withSpring(0, { damping: 10 }, () => {
        if (onComplete) {
          runOnJS(onComplete)();
        }
      })
    );

    opacity.value = withSequence(
      withSpring(1),
      withSpring(1),
      withSpring(0)
    );
  }, []);

  const animatedStyle = useAnimatedStyle(() => ({
    transform: [{ scale: scale.value }],
    opacity: opacity.value,
  }));

  return (
    <Animated.View style={[styles.container, animatedStyle]}>
      <Animated.Text style={styles.heart}>❤️</Animated.Text>
    </Animated.View>
  );
}

const styles = StyleSheet.create({
  container: {
    position: 'absolute',
    top: '50%',
    left: '50%',
    marginLeft: -50,
    marginTop: -50,
    zIndex: 10,
  },
  heart: {
    fontSize: 100,
  },
});
```

***

## Step 5: Add progress indicator

Show progress at the top (like Instagram Stories):

```tsx
import React, { useEffect } from 'react';
import { View, StyleSheet, Dimensions } from 'react-native';
import Animated, {
  useSharedValue,
  useAnimatedStyle,
  withTiming,
  cancelAnimation,
  Easing,
} from 'react-native-reanimated';

const { width: SCREEN_WIDTH } = Dimensions.get('window');

interface ProgressIndicatorProps {
  duration: number; // Video duration in seconds
  paused: boolean;
}

export function ProgressIndicator({ duration, paused }: ProgressIndicatorProps) {
  const progress = useSharedValue(0);

  useEffect(() => {
    if (paused) {
      // Freeze the bar in place while the video is paused
      cancelAnimation(progress);
    } else {
      // Animate the remaining distance over the remaining time
      progress.value = withTiming(1, {
        duration: (1 - progress.value) * duration * 1000,
        easing: Easing.linear,
      });
    }
  }, [paused, duration]);

  const animatedStyle = useAnimatedStyle(() => ({
    width: `${progress.value * 100}%`,
  }));

  return (
    <View style={styles.container}>
      <Animated.View style={[styles.progress, animatedStyle]} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    position: 'absolute',
    top: 50, // Below status bar
    left: 0,
    right: 0,
    height: 2,
    backgroundColor: 'rgba(255,255,255,0.3)',
    zIndex: 10,
  },
  progress: {
    height: '100%',
    backgroundColor: '#fff',
  },
});
```

***

## Step 6: Preload adjacent videos

Preload the next video for seamless transitions:

```tsx
import { useEffect } from 'react';

interface VideoItem {
  playbackId: string;
}

interface PreloadManagerProps {
  videos: VideoItem[];
  currentIndex: number;
}

export function PreloadManager({ videos, currentIndex }: PreloadManagerProps) {
  useEffect(() => {
    // react-native-video has no preload API, so warm the cache by fetching
    // the HLS manifest ourselves. This gets DNS/TLS setup and the manifest
    // request out of the way before the user swipes.
    const warm = (index: number) => {
      const video = videos[index];
      if (!video) return;
      fetch(`https://stream.mux.com/${video.playbackId}.m3u8`).catch(() => {
        // Best-effort: ignore network failures while preloading
      });
    };

    warm(currentIndex + 1); // next video
    warm(currentIndex - 1); // previous video
  }, [currentIndex, videos]);

  return null;
}
```

<Callout type="info">
  HLS streaming means videos don't need to fully download before playing. Warming the manifest ahead of time lets the player begin fetching segments immediately, so startup feels instant.
</Callout>

***

## Complete StoriesFeed implementation

Putting it all together:

```tsx
import React, { useRef, useState, useCallback, useEffect } from 'react';
import {
  FlatList,
  Dimensions,
  StyleSheet,
  ViewToken,
  View,
  StatusBar,
} from 'react-native';
import Video, { VideoRef } from 'react-native-video';
import { Gesture, GestureDetector } from 'react-native-gesture-handler';
import Animated, { FadeIn, FadeOut } from 'react-native-reanimated';
import { SafeAreaView } from 'react-native-safe-area-context';

// Components built in the previous steps (adjust the paths to your project)
import { VideoOverlay } from './VideoOverlay';
import { ProgressIndicator } from './ProgressIndicator';
import { LikeAnimation } from './LikeAnimation';
import { PreloadManager } from './PreloadManager';

const { width: SCREEN_WIDTH, height: SCREEN_HEIGHT } = Dimensions.get('window');

// Named VideoItem so it doesn't collide with the imported `Video` component
interface VideoItem {
  id: string;
  playbackId: string;
  title: string;
  username: string;
  userId: string;
  viewCount: number;
  likeCount: number;
  duration: number;
}

interface StoriesFeedProps {
  videos: VideoItem[];
  initialIndex?: number;
}

export default function StoriesFeed({ videos, initialIndex = 0 }: StoriesFeedProps) {
  const [currentIndex, setCurrentIndex] = useState(initialIndex);
  const flatListRef = useRef<FlatList>(null);

  // Hide status bar for full-screen experience
  useEffect(() => {
    StatusBar.setHidden(true);
    return () => StatusBar.setHidden(false);
  }, []);

  const onViewableItemsChanged = useCallback(
    ({ viewableItems }: { viewableItems: ViewToken[] }) => {
      if (viewableItems.length > 0) {
        const index = viewableItems[0].index;
        if (index !== null && index !== currentIndex) {
          setCurrentIndex(index);
        }
      }
    },
    [currentIndex]
  );

  const viewabilityConfig = useRef({
    itemVisiblePercentThreshold: 50,
  }).current;

  const renderItem = useCallback(
    ({ item, index }: { item: VideoItem; index: number }) => {
      const isActive = index === currentIndex;
      return <StoryItem video={item} isActive={isActive} />;
    },
    [currentIndex]
  );

  return (
    <SafeAreaView style={styles.container} edges={['top']}>
      <FlatList
        ref={flatListRef}
        data={videos}
        renderItem={renderItem}
        keyExtractor={(item) => item.id}
        pagingEnabled
        showsVerticalScrollIndicator={false}
        snapToInterval={SCREEN_HEIGHT}
        snapToAlignment="start"
        decelerationRate="fast"
        onViewableItemsChanged={onViewableItemsChanged}
        viewabilityConfig={viewabilityConfig}
        getItemLayout={(data, index) => ({
          length: SCREEN_HEIGHT,
          offset: SCREEN_HEIGHT * index,
          index,
        })}
        windowSize={3}
        maxToRenderPerBatch={2}
        removeClippedSubviews
        initialScrollIndex={initialIndex}
      />

      {/* Preload adjacent videos */}
      <PreloadManager videos={videos} currentIndex={currentIndex} />
    </SafeAreaView>
  );
}

function StoryItem({ video, isActive }: { video: VideoItem; isActive: boolean }) {
  const videoRef = useRef<VideoRef>(null);
  const [paused, setPaused] = useState(!isActive);
  const [liked, setLiked] = useState(false);
  const [showLikeAnimation, setShowLikeAnimation] = useState(false);

  useEffect(() => {
    setPaused(!isActive);
  }, [isActive]);

  const singleTap = Gesture.Tap()
    .numberOfTaps(1)
    .onEnd(() => {
      setPaused((prev) => !prev);
    });

  const doubleTap = Gesture.Tap()
    .numberOfTaps(2)
    .onEnd(() => {
      if (!liked) {
        setLiked(true);
        setShowLikeAnimation(true);
        // TODO: Call API to like video
      }
    });

  const taps = Gesture.Exclusive(doubleTap, singleTap);

  return (
    <View style={styles.storyContainer}>
      <GestureDetector gesture={taps}>
        <View style={styles.videoWrapper}>
          <Video
            ref={videoRef}
            source={{ uri: `https://stream.mux.com/${video.playbackId}.m3u8` }}
            poster={`https://image.mux.com/${video.playbackId}/thumbnail.png?time=0`}
            posterResizeMode="cover"
            style={styles.video}
            paused={paused}
            repeat
            resizeMode="cover"
            onError={(error) => console.error('Video error:', error)}
          />

          {/* Progress indicator */}
          {isActive && (
            <ProgressIndicator duration={video.duration} paused={paused} />
          )}

          {/* Overlay */}
          <VideoOverlay
            username={video.username}
            title={video.title}
            viewCount={video.viewCount}
            likeCount={video.likeCount}
            liked={liked}
            onLike={() => setLiked(!liked)}
          />

          {/* Like animation */}
          {showLikeAnimation && (
            <Animated.View
              entering={FadeIn}
              exiting={FadeOut}
              style={StyleSheet.absoluteFill}
            >
              <LikeAnimation
                onComplete={() => setShowLikeAnimation(false)}
              />
            </Animated.View>
          )}
        </View>
      </GestureDetector>
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#000',
  },
  storyContainer: {
    width: SCREEN_WIDTH,
    height: SCREEN_HEIGHT,
  },
  videoWrapper: {
    flex: 1,
    position: 'relative',
  },
  video: {
    position: 'absolute',
    top: 0,
    left: 0,
    right: 0,
    bottom: 0,
  },
});
```

***

## Performance optimization

### 1. Memory management

```tsx
// Clean up videos when they're far from view
const [loadedVideos, setLoadedVideos] = useState<Set<number>>(new Set([0]));

const onViewableItemsChanged = useCallback(
  ({ viewableItems }: { viewableItems: ViewToken[] }) => {
    // item.index can be null, and filter(Boolean) would also drop index 0
    const visibleIndices = viewableItems
      .map((item) => item.index)
      .filter((i): i is number => i !== null);

    // Load current + adjacent
    const indicesToLoad = new Set([
      ...visibleIndices,
      ...visibleIndices.map(i => i - 1),
      ...visibleIndices.map(i => i + 1),
    ].filter(i => i >= 0 && i < videos.length));

    setLoadedVideos(indicesToLoad);
  },
  [videos.length]
);

// In renderItem:
if (!loadedVideos.has(index)) {
  return <VideoPlaceholder />;
}
```
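The windowing logic above boils down to a pure function; factoring it out keeps `onViewableItemsChanged` small and makes the behavior easy to unit-test. A minimal sketch (function and parameter names are illustrative):

```typescript
// Compute the set of indices that should stay loaded: every visible index
// plus its immediate neighbors, clamped to the bounds of the list.
function indicesToLoad(visibleIndices: number[], videoCount: number): Set<number> {
  const wanted = visibleIndices.flatMap((i) => [i - 1, i, i + 1]);
  return new Set(wanted.filter((i) => i >= 0 && i < videoCount));
}

// With item 2 visible in a 4-item list, items 1, 2, and 3 stay loaded.
const loaded = indicesToLoad([2], 4); // Set {1, 2, 3}
```

Inside `onViewableItemsChanged`, you'd then call `setLoadedVideos(indicesToLoad(visibleIndices, videos.length))`.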

### 2. Use React.memo

```tsx
const StoryItem = React.memo(
  ({ video, isActive }: StoryItemProps) => {
    // ... component code
  },
  (prevProps, nextProps) => {
    return (
      prevProps.video.id === nextProps.video.id &&
      prevProps.isActive === nextProps.isActive
    );
  }
);
```

### 3. Optimize re-renders

```tsx
// Use useCallback for all event handlers
const handleLike = useCallback(() => {
  // API call
}, []);

const handleShare = useCallback(() => {
  // Share logic
}, []);
```

***

## Advanced features

### Horizontal swipe for user navigation

Add horizontal swipes to jump between users:

```tsx
const panGesture = Gesture.Pan()
  .onEnd((event) => {
    if (Math.abs(event.translationX) > 100) {
      if (event.translationX > 0) {
        // Swipe right - previous user
        goToPreviousUser();
      } else {
        // Swipe left - next user
        goToNextUser();
      }
    }
  });
```

### Multiple videos per user

Track user stories and show progress bars:

```tsx
<View style={styles.progressBars}>
  {userVideos.map((video, index) => (
    <ProgressBar
      key={video.id}
      filled={index < currentVideoIndex}
      active={index === currentVideoIndex}
    />
  ))}
</View>
```

### Mute toggle

```tsx
const [muted, setMuted] = useState(false);

<Video
  source={videoSource}
  muted={muted}
/>

<TouchableOpacity onPress={() => setMuted(!muted)}>
  <Text>{muted ? '🔇' : '🔊'}</Text>
</TouchableOpacity>
```

***

## Best practices

### 1. Handle video errors gracefully

```tsx
const [error, setError] = useState(false);

<Video
  source={videoSource}
  onError={() => setError(true)}
/>

{error && (
  <View style={styles.errorOverlay}>
    <Text>Video unavailable</Text>
    <Button title="Skip" onPress={skipToNext} />
  </View>
)}
```

### 2. Respect device orientation

```tsx
import { useOrientation } from './hooks/useOrientation';

const orientation = useOrientation();

// Only show Stories UI in portrait mode
if (orientation === 'landscape') {
  return <Text>Please rotate your device</Text>;
}
```

### 3. Pause on app background

```tsx
import { AppState } from 'react-native';

useEffect(() => {
  const subscription = AppState.addEventListener('change', (nextAppState) => {
    if (nextAppState !== 'active') {
      setPaused(true);
    }
  });

  return () => subscription.remove();
}, []);
```

### 4. Test on real devices

Stories UI requires testing on physical devices:

* Gesture responsiveness varies between simulator and device
* Video performance is better on real hardware
* Test on both iOS and Android
* Test on different screen sizes

<Callout type="warning">
  The iOS simulator and Android emulator don't accurately represent real-device video performance. Always test Stories UI on physical devices before shipping.
</Callout>

***

## Troubleshooting

### Videos don't auto-play

* Check `isActive` prop is updating correctly
* Verify `paused` state changes when `isActive` changes
* Ensure `onViewableItemsChanged` fires (add console.log)

### Stuttering between videos

* Increase `windowSize` to preload more videos
* Verify `getItemLayout` is set correctly
* Enable `removeClippedSubviews` for memory management
* Check network conditions (poor network = stuttering)

### High memory usage

* Reduce `windowSize` (a value of 3 is ideal for Stories; the FlatList default is much larger)
* Use `removeClippedSubviews={true}`
* Implement video unloading for far-away items
* Monitor with Xcode Instruments / Android Profiler

### Gestures not working

* Wrap root component with `<GestureHandlerRootView>`
* Check `react-native-gesture-handler` is installed correctly
* Verify gesture detector wraps the video container


# Using the Mux MCP Server
Use the Mux Model Context Protocol (MCP) Server to bring Mux's Video and Data platform capabilities directly to your AI tools.
The Mux MCP ([Model Context Protocol](https://modelcontextprotocol.io/introduction)) Server brings Mux's Video and Data platform capabilities directly to your AI tools. Once set up, you can upload videos, manage live streams, analyze video performance, and access practically all of Mux's video infrastructure through natural language prompts in supported AI clients.

This guide walks you through the functionality of Mux's MCP server and shows you how to connect it to various AI clients.

## Tools & Routes

The local Mux MCP Server supports the following tools and API routes:

* **Video API**: Create assets, uploads, live streams, playback URLs
* **Data API**: Query metrics, dimensions, real-time data
* **Webhook management**: List and verify webhook signatures
* **Asset management**: Retrieve, update video metadata
* **Live streaming**: Create streams, manage recordings
* **Analytics**: Performance metrics, viewer data, error tracking

The local Mux MCP Server doesn't currently support the following tools and routes. Generally speaking, these are endpoints that can execute deletions, and they're disabled for safety:

* Asset deletion endpoints
* Live stream deletion endpoints
* Webhook deletion endpoints

## Prompt examples

**Video Management**

* Using the Mux tool, create a webpage where I can upload a video to Mux
* Give me the playback URL for the most recently uploaded video to my Mux account, use Mux MCP
* List all my video assets and their current status (using the Mux MCP tool)
* With the Mux tool: Show me recent video uploads
* Using Mux MCP, generate a subtitles track for asset ID: `ASSET_ID`

**Mux Data Analytics and Performance**

* Using the Mux MCP, tell me the best performing country for video streaming over the last month
* Show me video performance metrics for the last week using the Mux tool
* With the Mux tool: what are the top performing videos by view count?
* Using Mux, which countries have the highest video engagement?
* What are the most common video errors in my account (use the Mux MCP)?
* Show me breakdown values for video quality metrics using the Mux MCP tool
* List all available data dimensions I can filter by, use the Mux MCP to answer this prompt

## Prerequisites

Before utilizing the Mux MCP Server, make sure you meet the following prerequisites:

* A Mux account (sign up at [mux.com](https://mux.com/) if you don't have one)
* Claude Desktop, Cursor, or any other client that supports remote MCP servers, installed and updated to the latest version

## Configuring the Mux MCP Server

Mux's MCP server is hosted at https://mcp.mux.com. When you use this remote MCP server, authentication is handled automatically, so there's no need to grab Access Token information from the Dashboard. To configure the Mux MCP server in your client, add an MCP server, which is sometimes called a "connector" (Claude/Claude Code/ChatGPT), an "extension" (Goose), or simply an MCP Server (VSCode), and enter the URL `https://mcp.mux.com` as the location.
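For clients that are configured through a JSON file rather than a UI, a remote server entry typically looks something like this (Cursor's `~/.cursor/mcp.json` shown; the exact key names vary by client):

```json
{
  "mcpServers": {
    "mux": {
      "url": "https://mcp.mux.com"
    }
  }
}
```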

Once configured, the LLM client and our MCP server should negotiate authentication and authorization, prompting you automatically to:

* Log in to https://dashboard.mux.com via whatever means you normally log in (this is skipped if you're already logged in)
* Choose which environment you want to authorize this connection for

When you're already logged in, your experience will look something like this:

<Image src="https://image.mux.com/gBuQlI7JC3zQwMlrhSTWPOfEGF02KaVni/animated.gif?start=4&end=11&width=640" width={640} height={640} />

And that's it, you're good to go!

### Configuration options

By default, https://mcp.mux.com is configured in the simplest manner (though this may change in the future), exposing the full set of tools Mux offers. That said, depending on your workflow, you may want to limit this set of tools in some way. For that reason, Mux supports query parameters to configure the MCP server. A more complete set of configuration options can be [seen here](https://github.com/muxinc/mux-node-sdk/tree/master/packages/mcp-server#exposing-endpoints-to-your-mcp-client), and most of them work simply as query params. However, a few bear mentioning directly:

* `tools`: options are `all` (default), and `dynamic`.
  * Use `dynamic` if you want to expose tools meant to [allow the LLMs to dynamically discover endpoints and tools](https://github.com/muxinc/mux-node-sdk/tree/master/packages/mcp-server#exposing-endpoints-to-your-mcp-client), which can aid in controlling context windows and speeding up processing when a lot of tools are available.
* `resource`: array of resources (sets of APIs) to expose, such as `video.*`. These act as an inclusion set rather than an exclusion set, so you can chain multiple resources to expand the list of tools. Some options include:
  * `video.*`: all Mux Video APIs
  * `data.*`: all Mux Data APIs
  * `system.*`: all System APIs, such as managing Signing Keys
  * `video.asset.*`: the APIs used to manage Mux Video assets
  * ...and so on
* `client`: options are `claude` (default), `claude-code`, `cursor`, `openai-agents`.
  * Each LLM has varying support for capabilities related to complex JSON schemas, and these are tested defaults for each of the known clients. You can read more about this in [this doc by Stainless](https://www.stainless.com/docs/guides/generate-mcp-server-from-openapi#client-capabilities).

You can also chain these together. For instance, if you want to configure an MCP server that exposes *only* the Video APIs, but does it in a dynamic way, for Cursor, you'd just use `https://mcp.mux.com?client=cursor&resource=video.*&tools=dynamic` as your remote URL.

## A note on remote MCP support

These days, most LLM clients directly support remote MCP servers (rather than locally installed ones), so you shouldn't have much trouble getting set up. That said, some clients (particularly older versions) still lack built-in remote MCP support, such as Goose at the time of writing. For those situations, you have two options:

1. You can still [install our MCP server locally](/docs/integrations/installing-mcp-server-locally)
2. You can utilize [mcp-remote](https://www.npmjs.com/package/mcp-remote), which brings support for remote MCP servers to practically any LLM client (and may perform better than the built-in remote MCP support depending on the client).
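For the second option, a typical setup wraps the remote URL with `mcp-remote` via `npx` in the client's MCP config (a sketch; the exact key names vary by client):

```json
{
  "mcpServers": {
    "mux": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.mux.com"]
    }
  }
}
```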

## Having trouble?

If you run into issues or have questions:

* Check the [Model Context Protocol documentation](https://modelcontextprotocol.io/quickstart/user) for general MCP setup guidance
* Review [Claude's MCP documentation](https://docs.anthropic.com/en/docs/agents-and-tools/mcp) for Claude-specific configuration
* Visit our [API reference](https://docs.mux.com/api-reference) for detailed endpoint documentation
* Contact support: [mux.com/support](/support)


# Install the local Mux MCP Server
Set up the Mux Model Context Protocol (MCP) Server locally to bring Mux's Video and Data platform capabilities directly to your AI tools.
<Callout type="info">
  If you're interested in getting started quickly, or want to read more about the MCP server, check out [this guide](/docs/integrations/mcp-server). The guide below walks you through installing the Mux MCP Server locally on your machine and connecting it to various AI clients.
</Callout>

The Mux MCP ([Model Context Protocol](https://modelcontextprotocol.io/introduction)) Server brings Mux's Video and Data platform capabilities directly to your AI tools. Once installed, you can upload videos, manage live streams, analyze video performance, and access practically all of Mux's video infrastructure through natural language prompts in supported AI clients.

## Prerequisites

Before installing the Mux MCP Server, make sure you meet the following prerequisites:

* Node.js installed locally on your machine (instructions available [here](https://nodejs.org/en/download))
* A Mux account (sign up at [mux.com](https://mux.com/) if you don't have one)
* Your Mux API access token and secret key from the [Mux Dashboard](https://dashboard.mux.com/settings/access-tokens) (detailed instructions are available below)
* Claude Desktop, Cursor, or any other client that supports local MCP servers, installed and updated to the latest version

## Installation

### Get your Mux API credentials and configure access

1. Log into your [Mux Dashboard](https://dashboard.mux.com/)
2. Navigate to Settings → Access Tokens
3. Generate a new access token or use an existing one
4. Copy your **Access Token ID** and **Secret Key**; you'll need both for the configuration

#### Required Scopes

* Your Mux access token should be configured for your desired Environment and read/write access
* We recommend clearly labeling this access token in Mux, for example: `MCP Access Token`

**Important:** Replace the placeholder values when adding to your AI client's config using the templates provided below:

* Replace `your_access_token_id` with your actual Mux Access Token ID
* Replace `your_secret_key` with your actual Mux Secret Key

<Callout type="info">
  **Note:** If you're using a tool that manages Node versions like Mise, you'll probably need to make sure you execute the npx commands found in the following examples from within that context. An example Mise command could look something like this:

  `mise x node@20 -- npx -y @mux/mcp@latest`

  Accordingly, the following examples would need to be changed similarly to below:

  ```json
        "command": "mise",
        "args": ["x", "node@20", "--", "npx", "-y", "@mux/mcp@latest","--tools=dynamic","--client=claude"],
  ```
</Callout>

### For Claude

You must use Claude's Desktop app to install local MCP servers.

We support the recently released [Claude Desktop Extensions](https://www.anthropic.com/engineering/desktop-extensions) format, so you can download [this DXT file](https://github.com/muxinc/mux-node-sdk/releases/download/v12.1.0/mux-mcp.dxt) and open it with Claude Desktop to install it. Once it's installed, configure the environment variables you need and you're good to go.

If you'd like to configure it manually, follow the next steps.

#### Step A: Configure Claude Desktop

Follow [Claude's instructions](https://docs.anthropic.com/en/docs/agents-and-tools/mcp) to locate your Claude Desktop configuration file on your machine.

**macOS/Linux:**

```
~/Library/Application\ Support/Claude/claude_desktop_config.json
```

**Windows:**

```
%APPDATA%\Claude\claude_desktop_config.json
```

#### Step B: Add the MCP Server configuration

Add this configuration block to your `claude_desktop_config.json` file:

```json
{
  "globalShortcut": "",
  "mcpServers": {
    "mux": {
      "command": "npx",
      "args": ["-y", "@mux/mcp@latest","--tools=dynamic","--client=claude"],
      "env": {
        "MUX_TOKEN_ID": "your_access_token_id",
        "MUX_TOKEN_SECRET": "your_secret_key"
      }
    }
  }
}
```

#### Step C: Restart Claude Desktop

Close and reopen Claude Desktop to load the new MCP server configuration.

### For Cursor

#### Step A: Locate the Settings File

Follow the paths below to locate your Cursor MCP configuration file. If the file does not exist, you can create it.

**macOS/Linux:**

```
~/.cursor/mcp.json
```

**Windows:**

```
C:/Users/<username>/.cursor/mcp.json
```

#### Step B: Add the MCP Server Configuration

```json
{
  "mcpServers": {
    "mux": {
      "command": "npx",
      "args": ["-y", "@mux/mcp@latest","--tools=dynamic","--client=cursor"],
      "env": {
        "MUX_TOKEN_ID": "your_access_token_id",
        "MUX_TOKEN_SECRET": "your_secret_key"
      }
    }
  }
}
```

### For VSCode

To add the server to all of your workspaces globally, add the server configuration to your `settings.json` file.

#### Step A: Locate the Settings File

**macOS:**

```
~/Library/Application\ Support/Code/User/settings.json
```

**Linux:**

```
~/.config/Code/User/settings.json
```

**Windows:**

```
%APPDATA%\Code\User\settings.json
```

#### Step B: Add the MCP Server Configuration

```json
{
  "mcp": {
    "servers": {
      "mux": {
        "command": "npx",
        "args": ["-y", "@mux/mcp@latest","--tools=dynamic"],
        "env": {
          "MUX_TOKEN_ID": "your_access_token_id",
          "MUX_TOKEN_SECRET": "your_secret_key"
        }
      }
    }
  }
}
```

#### Step C: Starting the MCP Server

In VSCode, click the `Start` button on the MCP server entry to start the server. You can do this directly from the settings file, or from the Command Palette with `MCP: List Servers`.

## Verify installation

Test that the Mux MCP Server is working by asking your AI client:

> Give me the details for the most recently created Mux Video asset (using the Mux tool)

or

> Using the Mux MCP, list the best performing countries for video streaming over the last month using Mux Data

If the installation was successful, Claude will connect to the Mux API through the MCP server and return information about your video performance or assets.

## Troubleshooting

**Startup Issues**

If the MCP server fails to start:

* Make sure you have a recent Node.js version installed, and that npx is accessible in your PATH (`npx -v`)

**Connection Issues**

If Claude can't connect to the MCP server:

* Double-check that the `command` and `args` fields in your configuration are correct
* Verify your Mux credentials are correct and properly formatted
* Make sure there are no extra spaces or characters in your token values
* Confirm your API tokens have the necessary permissions in your Mux account

**Claude Desktop Issues**

If MCP features don't appear in Claude:

* Ensure you're using the latest version of Claude Desktop; older versions may not support MCP
* Verify your JSON configuration is valid (no missing commas or brackets)
* Check that Claude Desktop has restarted completely after configuration changes

## Getting help

If you run into issues or have questions:

* Check the [Model Context Protocol documentation](https://modelcontextprotocol.io/quickstart/user) for general MCP setup guidance
* Review [Claude's MCP documentation](https://docs.anthropic.com/en/docs/agents-and-tools/mcp) for Claude-specific configuration
* Visit our [API reference](https://docs.mux.com/api-reference) for detailed endpoint documentation
* Contact support: [mux.com/support](/support)


# Mux CLI
Use the Mux CLI to manage your video assets, live streams, and more directly from the terminal.
The Mux CLI is a command-line interface for interacting with the Mux API, designed to provide a first-class development experience for working with Mux services locally. With the CLI, you can upload videos, manage live streams, generate signed URLs, query analytics data, and access Mux's video infrastructure without leaving your terminal.

## Installation

The Mux CLI can be installed via Homebrew, npm, a shell installer, or by downloading pre-built binaries.

### Homebrew (macOS)

```bash
brew install muxinc/tap/mux
```

### npm

Install globally to use the `mux` command anywhere:

```bash
npm install -g @mux/cli
```

Or run directly without installing using `npx`:

```bash
npx @mux/cli
```

### Shell installer

Run the install script to automatically download and set up the CLI:

```bash
curl -fsSL https://raw.githubusercontent.com/muxinc/cli/main/install.sh | bash
```

### Binary download

Platform-specific binaries are available for macOS (Apple Silicon and Intel) and Linux (x64 and arm64) from the [GitHub Releases page](https://github.com/muxinc/cli/releases). These are self-contained executables with no external dependencies.

## Shell completions

Enable tab completion for commands, subcommands, and options in your shell:

**Bash** (add to `~/.bashrc`):

```bash
source <(mux completions bash)
```

**Zsh** (add to `~/.zshrc`):

```bash
source <(mux completions zsh)
```

**Fish** (add to `~/.config/fish/config.fish`):

```bash
source (mux completions fish | psub)
```

Restart your shell or source the config file to activate completions.

## Authentication

The CLI requires Mux API credentials to interact with your account. You can get your Access Token ID and Secret Key from the [Mux Dashboard](https://dashboard.mux.com/settings/access-tokens).

### Interactive login

Run `mux login` to authenticate interactively:

```bash
mux login
```

You'll be prompted to enter your Access Token ID and Secret Key. Credentials are stored securely in `~/.config/mux/config.json` with owner-only file permissions.

### Using environment variables

The CLI can read credentials from a `.env` file or environment variables:

```bash
MUX_TOKEN_ID=your-token-id
MUX_TOKEN_SECRET=your-token-secret
```

Login from a `.env` file:

```bash
mux login --env-file .env
```

### Named environments

For managing multiple Mux accounts (production, staging, development), you can configure named environments:

```bash
mux login --name production
mux login --name staging --env-file .env.staging
```

The first environment you add becomes the default. Switch between environments:

```bash
mux env switch staging
mux env list
```

Remove an environment:

```bash
mux logout staging
```

## Common options

These options are available on most commands:

| Option | Description |
|--------|-------------|
| `--json` | Output raw JSON instead of pretty-printed format. Useful for scripting and piping to `jq`. |
| `--compact` | One-line-per-item output, grep-friendly. Available on `list` commands. |
| `--limit <n>` | Number of results to return (default: 25). Available on `list` commands. |
| `--page <n>` | Page number for pagination (default: 1). Available on `list` commands. |
| `-f, --force` | Skip confirmation prompts on destructive actions. |
| `--wait` | Poll until the resource is ready before returning. Available on `create` commands. |

## Webhook forwarding

Listen for Mux webhook events in real-time and forward them to your local development server. Events are stored locally for replay during development.

<Callout type="warning">
  CLI webhook commands are for **local development only** and provide **no delivery guarantees**. In production, you must configure a webhook endpoint in the [Mux Dashboard](https://dashboard.mux.com) that points to your server's webhook URL.
</Callout>

### `mux webhooks listen`

Connect to Mux's event stream and optionally forward events to a local URL.

```bash
# Listen and print events
mux webhooks listen

# Forward to local dev server
mux webhooks listen --forward-to http://localhost:3000/api/webhooks/mux
```

| Option | Description |
|--------|-------------|
| `--forward-to <url>` | POST received events to a local URL in real-time |
| `--json` | Output raw JSON per event |

When using `--forward-to`, the CLI displays a webhook signing secret and signs each forwarded request with a `mux-signature` header. Set `MUX_WEBHOOK_SECRET` in your app's environment to [verify these signatures](/docs/core/verify-webhook-signatures):

```typescript
const event = mux.webhooks.unwrap(body, headers, process.env.MUX_WEBHOOK_SECRET);
```

The signing secret is unique per environment and persisted between sessions, so you only need to configure it once.
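If you'd rather see what the SDK's verification does under the hood: the `mux-signature` header follows a `t=<unix timestamp>,v1=<hex HMAC>` scheme, where the HMAC-SHA256 is computed over `<timestamp>.<raw body>` using the signing secret. A minimal sketch in Node (for illustration; prefer the SDK's built-in verification in real applications):

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Verify a `mux-signature` header of the form `t=<unix-ts>,v1=<hex hmac>`,
// where the HMAC-SHA256 is computed over `<timestamp>.<raw body>`.
function verifyMuxSignature(rawBody: string, header: string, secret: string): boolean {
  const parts = new Map(header.split(',').map((kv) => kv.split('=', 2) as [string, string]));
  const timestamp = parts.get('t');
  const signature = parts.get('v1');
  if (!timestamp || !signature) return false;

  const expected = createHmac('sha256', secret)
    .update(`${timestamp}.${rawBody}`)
    .digest('hex');

  // Constant-time comparison to avoid leaking signature bytes via timing
  const a = Buffer.from(expected, 'hex');
  const b = Buffer.from(signature, 'hex');
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Note that you must verify against the raw request body bytes (for example, via `express.raw()`); re-serialized JSON may not match what was signed.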

### `mux webhooks events list`

List locally stored webhook events captured during `listen` sessions. The CLI stores the last 100 events.

```bash
mux webhooks events list
mux webhooks events list --limit 50
```

### `mux webhooks events replay [event-id]`

Replay stored webhook events. Useful for re-testing your webhook handler without creating new resources.

```bash
# Replay a specific event to your local server
mux webhooks events replay abc123-event-id --forward-to http://localhost:3000/api/webhooks/mux

# Replay all stored events
mux webhooks events replay --all --forward-to http://localhost:3000/api/webhooks/mux

# View event payload without forwarding
mux webhooks events replay abc123-event-id
```

| Option | Description |
|--------|-------------|
| `--forward-to <url>` | POST event(s) to a local URL |
| `--all` | Replay all stored events |
| `--json` | Output JSON instead of pretty format |

### `mux webhooks trigger <event-type>`

Send a synthetic webhook event to a local URL for testing. No API call is made — the payload is generated locally and signed with the per-environment signing secret.

```bash
# Send an example video.asset.ready event
mux webhooks trigger video.asset.ready --forward-to http://localhost:3000/api/webhooks/mux

# Send a live stream event
mux webhooks trigger video.live_stream.active --forward-to http://localhost:3000/api/webhooks/mux
```

| Option | Description |
|--------|-------------|
| `--forward-to <url>` | Local URL to POST the example event to (required) |
| `--json` | Output JSON instead of pretty format |

Run `mux webhooks trigger <invalid-type>` to see all supported event types.

## Commands

### Asset management

Create, list, update, and delete video assets.

```bash
# Create from URL
mux assets create --url https://example.com/video.mp4 --playback-policy public

# Upload local files (glob supported)
mux assets create --upload ./videos/*.mp4 --playback-policy public

# Create from JSON config file (for overlays, subtitles, multiple inputs)
mux assets create --file asset-config.json

# Wait for processing to complete
mux assets create --url https://example.com/video.mp4 --playback-policy public --wait

# List, get, update, delete
mux assets list
mux assets get <asset-id>
mux assets update <asset-id> --title "My Video" --passthrough "my-custom-id"
mux assets delete <asset-id>
```

The interactive asset manager opens a terminal UI (TUI) for browsing and managing your video library:

```bash
mux assets manage
```

<CollapsibleRoot>
  <CollapsibleTrigger>
    View all asset create options
  </CollapsibleTrigger>

  <CollapsibleContent>
    | Option | Description |
    |--------|-------------|
    | `--url <url>` | Video URL to ingest from the web |
    | `--upload <path>` | Local file(s) to upload (supports glob patterns like `*.mp4`) |
    | `--file, -f <path>` | JSON configuration file for complex asset creation |
    | `--playback-policy <policy>` | `public` or `signed` (repeatable) |
    | `--test` | Create test asset (watermarked, 10s limit, deleted after 24h) |
    | `--passthrough <string>` | User metadata (max 255 characters) |
    | `--static-renditions <resolution>` | e.g. `1080p`, `720p`, `highest`, `audio-only` (repeatable) |
    | `--video-quality <quality>` | `basic`, `plus`, or `premium` |
    | `--normalize-audio` | Normalize audio loudness level |
    | `-y, --yes` | Skip confirmation prompts |
  </CollapsibleContent>
</CollapsibleRoot>

<CollapsibleRoot>
  <CollapsibleTrigger>
    View all asset update options
  </CollapsibleTrigger>

  <CollapsibleContent>
    | Option | Description |
    |--------|-------------|
    | `--title <string>` | Set `meta.title` (max 512 characters) |
    | `--creator-id <string>` | Set `meta.creator_id` (max 128 characters) |
    | `--external-id <string>` | Set `meta.external_id` (max 128 characters) |
    | `--passthrough <string>` | Set `passthrough` (max 255 characters) |
  </CollapsibleContent>
</CollapsibleRoot>

<CollapsibleRoot>
  <CollapsibleTrigger>
    Asset sub-resources: playback IDs, static renditions, tracks
  </CollapsibleTrigger>

  <CollapsibleContent>
    #### Playback IDs

    Manage playback IDs on assets. Each asset can have multiple playback IDs with different policies.

    ```bash
    mux assets playback-ids list <asset-id>
    mux assets playback-ids create <asset-id> --policy signed
    mux assets playback-ids delete <asset-id> <playback-id>
    ```

    #### Static renditions

    Static renditions are downloadable MP4 versions of your video assets at specific resolutions.

    ```bash
    mux assets static-renditions list <asset-id>
    mux assets static-renditions create <asset-id> --resolution 1080p --wait
    mux assets static-renditions delete <asset-id> <rendition-id>
    ```

    Resolution options: `highest`, `audio-only`, `2160p`, `1440p`, `1080p`, `720p`, `540p`, `480p`, `360p`, `270p`

    #### Tracks

    Add text and audio tracks (subtitles, captions, audio) to video assets.

    ```bash
    # Add a subtitle track
    mux assets tracks create <asset-id> \
      --url https://example.com/subs.vtt \
      --type text \
      --language-code en \
      --text-type subtitles

    # Generate subtitles from an audio track
    mux assets tracks generate-subtitles <asset-id> <track-id> \
      --language-code en \
      --name "English (auto)"

    # Delete a track
    mux assets tracks delete <asset-id> <track-id>
    ```

    #### Other asset commands

    ```bash
    mux assets input-info <asset-id>                                      # view input file details
    mux assets update-master-access <asset-id> --master-access temporary  # enable master access
    ```
  </CollapsibleContent>
</CollapsibleRoot>

### Live stream management

Create and manage live streams for broadcasting via RTMP.

```bash
# Create a live stream
mux live create --playback-policy public

# Create with options
mux live create \
  --playback-policy public \
  --latency-mode low \
  --reconnect-window 60

# List, get, delete
mux live list
mux live get <live-stream-id>
mux live delete <live-stream-id>

# Stream lifecycle
mux live complete <live-stream-id>
mux live enable <live-stream-id>
mux live disable <live-stream-id>

# Reset stream key
mux live reset-stream-key <live-stream-id>
```

Once created, stream using:

* **RTMP URL:** `rtmp://global-live.mux.com/app`
* **Stream Key:** returned in the response

<CollapsibleRoot>
  <CollapsibleTrigger>
    View all live stream create options
  </CollapsibleTrigger>

  <CollapsibleContent>
    | Option | Description |
    |--------|-------------|
    | `--playback-policy <policy>` | `public` or `signed` (repeatable) |
    | `--new-asset-settings <settings>` | Auto-create asset from stream. Use `none` to disable, or JSON string |
    | `--reconnect-window <seconds>` | Reconnect timeout (default: 60) |
    | `--latency-mode <mode>` | `low`, `reduced`, or `standard` (default: `low`) |
    | `--test` | Create test stream (deleted after 24h) |
  </CollapsibleContent>
</CollapsibleRoot>

<CollapsibleRoot>
  <CollapsibleTrigger>
    View all live stream update options
  </CollapsibleTrigger>

  <CollapsibleContent>
    | Option | Description |
    |--------|-------------|
    | `--latency-mode <mode>` | `low`, `reduced`, or `standard` |
    | `--reconnect-window <seconds>` | Reconnect window (0-1800) |
    | `--max-continuous-duration <seconds>` | Max continuous duration (60-43200) |
    | `--passthrough <string>` | Passthrough metadata (max 255 characters) |
    | `--reconnect-slate-url <url>` | Image to display during reconnect |
    | `--use-slate-for-standard-latency` | Display slate for standard latency streams |
    | `--title <string>` | Title for the live stream |
  </CollapsibleContent>
</CollapsibleRoot>

<CollapsibleRoot>
  <CollapsibleTrigger>
    Live stream sub-resources: simulcast targets, subtitles, playback IDs
  </CollapsibleTrigger>

  <CollapsibleContent>
    #### Simulcast targets

    Restream a live stream to third-party platforms (e.g., YouTube, Twitch).

    ```bash
    mux live simulcast-targets create <stream-id> --url rtmp://live.twitch.tv/app --stream-key live_xxxxx
    mux live simulcast-targets get <stream-id> <target-id>
    mux live simulcast-targets delete <stream-id> <target-id>
    ```

    #### Embedded subtitles (CEA-608)

    ```bash
    mux live update-embedded-subtitles <stream-id> \
      --language-channel cc1 \
      --language-code en \
      --name "English CC"
    ```

    #### Generated subtitles (ASR)

    ```bash
    mux live update-generated-subtitles <stream-id> \
      --language-code en \
      --name "English (auto)"
    ```

    Use `--clear` on either command to remove subtitle settings.

    #### New asset static renditions

    Configure static renditions for assets automatically created from a live stream.

    ```bash
    mux live update-new-asset-static-renditions <stream-id> --resolution 1080p --resolution 720p
    mux live delete-new-asset-static-renditions <stream-id>
    ```

    #### Playback IDs

    ```bash
    mux live playback-ids list <stream-id>
    mux live playback-ids create <stream-id> --policy signed
    mux live playback-ids delete <stream-id> <playback-id>
    ```
  </CollapsibleContent>
</CollapsibleRoot>

### Direct uploads

Create direct upload URLs for client-side video uploading.

```bash
mux uploads create --cors-origin "https://example.com" --playback-policy public
mux uploads list
mux uploads get <upload-id>
mux uploads cancel <upload-id>
```

| Option | Description |
|--------|-------------|
| `--cors-origin <origin>` | Allowed CORS origin for the upload (required) |
| `-p, --playback-policy <policy>` | `public` or `signed` |
| `--timeout <seconds>` | Seconds before the upload times out (default: 3600) |
| `--test` | Create a test upload (asset deleted after 24 hours) |
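
Once `mux uploads create` returns its one-time upload `url`, the client sends the file bytes to that URL with an HTTP `PUT`. A minimal stdlib sketch — the URL and file bytes below are placeholders:

```python
import urllib.request

def build_upload_request(upload_url, data):
    # Direct uploads take the raw file bytes as the body of a PUT request
    return urllib.request.Request(upload_url, data=data, method="PUT")

# Placeholder values; use the `url` field from the create response and your real file
req = build_upload_request("https://storage.googleapis.com/video-storage/example", b"<file bytes>")
# Performing the upload would be: urllib.request.urlopen(req)
```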

### Signing keys and secure playback

For assets with signed playback policies, the CLI can generate secure URLs.

#### Create a signing key

```bash
mux signing-keys create
```

<Callout type="warning">
  The private key is only returned once during creation. The CLI automatically stores it in your current environment configuration.
</Callout>

```bash
mux signing-keys list
mux signing-keys get <key-id>
mux signing-keys delete <key-id>
```

#### Generate signed URLs

```bash
# Sign for video playback
mux sign <playback-id>

# Sign with custom expiration
mux sign <playback-id> --expiration 24h

# Sign a thumbnail with parameters
mux sign <playback-id> --type thumbnail --param time=14 --param width=100

# Sign a GIF
mux sign <playback-id> --type gif

# Output token only (no URL)
mux sign <playback-id> --token-only

# Pass JWT claims as JSON
mux sign <playback-id> --params-json '{"custom": {"session_id": "xxxx-123"}}'
```

| Option | Description |
|--------|-------------|
| `-e, --expiration <duration>` | Token expiration (default: `7d`). Examples: `7d`, `24h`, `30m` |
| `-t, --type <type>` | `video` (default), `thumbnail`, `gif`, `storyboard` |
| `-p, --param <key=value>` | JWT claim as key=value (repeatable) |
| `--params-json <json>` | JWT claims as JSON object |
| `--token-only` | Output only the JWT token |
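
Under the hood, `mux sign` builds a JWT whose claims follow Mux's signed-URL scheme: `sub` is the playback ID, `aud` encodes the resource type (`v` video, `t` thumbnail, `g` gif, `s` storyboard), `kid` is the signing key ID, and `exp` is the expiry. A sketch of assembling those claims — the resulting payload still has to be signed with your private key using RS256 (e.g. with a JWT library), which is omitted here:

```python
import time

AUD_BY_TYPE = {"video": "v", "thumbnail": "t", "gif": "g", "storyboard": "s"}

def build_claims(playback_id, signing_key_id, resource_type="video",
                 expires_in=7 * 24 * 3600, extra_params=None):
    claims = {
        "sub": playback_id,                  # the playback ID being signed
        "aud": AUD_BY_TYPE[resource_type],   # resource type code
        "kid": signing_key_id,               # which signing key to verify against
        "exp": int(time.time()) + expires_in,
    }
    claims.update(extra_params or {})        # e.g. --param time=14 --param width=100
    return claims

claims = build_claims("PLAYBACK_ID", "SIGNING_KEY_ID", "thumbnail",
                      expires_in=24 * 3600, extra_params={"time": 14, "width": 100})
```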

<CollapsibleRoot>
  <CollapsibleTrigger>
    View thumbnail parameters
  </CollapsibleTrigger>

  <CollapsibleContent>
    These parameters can be passed via `--param` when using `--type thumbnail`:

    | Parameter | Description |
    |-----------|-------------|
    | `time` | Video timestamp in seconds |
    | `width` | Width in pixels |
    | `height` | Height in pixels |
    | `rotate` | Clockwise rotation: 90, 180, or 270 |
    | `fit_mode` | `preserve`, `stretch`, `crop`, `smartcrop`, `pad` |
    | `flip_v` | Flip vertically |
    | `flip_h` | Flip horizontally |
  </CollapsibleContent>
</CollapsibleRoot>

### Playback ID lookup

Look up which asset or live stream a playback ID belongs to:

```bash
mux playback-ids <playback-id>
mux playback-ids <playback-id> --expand  # fetch the full asset or live stream object
```

### Playback restrictions

Control where and how your content can be played.

```bash
# Create a restriction
mux playback-restrictions create \
  --allowed-domains "example.com" \
  --allowed-domains "*.example.com"

# List, get, delete
mux playback-restrictions list
mux playback-restrictions get <restriction-id>
mux playback-restrictions delete <restriction-id>

# Update referrer rules
mux playback-restrictions update-referrer <restriction-id> \
  --allowed-domains "example.com" \
  --allow-no-referrer

# Update user agent rules
mux playback-restrictions update-user-agent <restriction-id> \
  --allow-no-user-agent true \
  --allow-high-risk-user-agent false
```

### Transcription vocabularies

Manage custom vocabularies to improve automatic speech recognition accuracy for domain-specific terms.

```bash
# Create a vocabulary
mux transcription-vocabularies create \
  --phrase "Mux" --phrase "HLS" --phrase "RTMP" \
  --name "Streaming Terms"

# List, get, update, delete
mux transcription-vocabularies list
mux transcription-vocabularies get <vocabulary-id>
mux transcription-vocabularies update <vocabulary-id> --phrase "Mux" --phrase "DASH"
mux transcription-vocabularies delete <vocabulary-id>
```

### Delivery usage

List delivery usage reports for video assets and live streams.

```bash
mux delivery-usage list
mux delivery-usage list --asset-id <id>
mux delivery-usage list --live-stream-id <id>
```

### DRM configurations

View DRM configurations for your Mux environment. DRM configurations are provisioned by Mux and are read-only.

```bash
mux drm-configurations list
mux drm-configurations get <drm-configuration-id>
```

### Mux Data

Commands for video analytics, monitoring, and incident tracking via the Mux Data API.

#### Video views

```bash
mux video-views list
mux video-views list --filters "country:US" --timeframe "24:hours"
mux video-views list --viewer-id <id>
mux video-views get <view-id>
```

#### Metrics

```bash
# List available metrics
mux metrics list

# Breakdown by dimension
mux metrics breakdown <metric-id> --group-by country --measurement median

# Overall metric values
mux metrics overall <metric-id> --measurement avg

# Timeseries data
mux metrics timeseries <metric-id> --group-by hour

# Performance insights
mux metrics insights <metric-id> --measurement 95th
```

Common options for metric commands: `--measurement <95th|median|avg|count|sum>`, `--filters`, `--metric-filters`, `--timeframe`

#### Monitoring

Real-time monitoring data from Mux Data.

```bash
mux monitoring dimensions
mux monitoring metrics
mux monitoring breakdown <metric-id> --dimension <d>
mux monitoring breakdown-timeseries <metric-id> --dimension <d>
mux monitoring histogram-timeseries --filters ...
mux monitoring timeseries <metric-id>
```

#### Incidents

```bash
mux incidents list
mux incidents list --status open --severity alert
mux incidents get <incident-id>
mux incidents related <incident-id>
```

#### Annotations

Mark significant events (deployments, config changes) on your analytics timeline.

```bash
mux annotations create --date 1700000000 --note "Deployed v2.1.0"
mux annotations list
mux annotations get <annotation-id>
mux annotations update <annotation-id> --date <timestamp> --note <text>
mux annotations delete <annotation-id>
```
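
The `--date` flag takes a Unix timestamp in seconds. For example, converting a UTC datetime with Python's stdlib:

```python
from datetime import datetime, timezone

# An annotation for a deploy that happened at 2023-11-14 22:13:20 UTC
deploy_time = datetime(2023, 11, 14, 22, 13, 20, tzinfo=timezone.utc)
unix_seconds = int(deploy_time.timestamp())
# unix_seconds == 1700000000, matching the command above:
#   mux annotations create --date 1700000000 --note "Deployed v2.1.0"
```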

#### Dimensions, errors, and exports

```bash
# List available dimensions and their values
mux dimensions list
mux dimensions values <dimension-id> --timeframe "24:hours"

# List errors
mux errors list --filters ... --timeframe ...

# List video view export files
mux exports list
```

## Getting help

View available commands:

```bash
mux --help
```

Get help for a specific command:

```bash
mux assets --help
mux assets create --help
```

## Full documentation

For more information, see the [Mux CLI repository](https://github.com/muxinc/cli) on GitHub.


# Add high-performance video to your Node application
Use our API and components to handle embedding, storing, and streaming video in your Node application
## Installation

Add a dependency on the `@mux/mux-node` package via npm or yarn.

```bash
npm install @mux/mux-node
```

## Quickstart

To start, you'll need a Mux access token. Once you've got that, you're off to the races!

```javascript
import Mux from '@mux/mux-node';
const mux = new Mux({
  tokenId: process.env.MUX_TOKEN_ID,
  tokenSecret: process.env.MUX_TOKEN_SECRET
});

const asset = await mux.video.assets.create({
  input: [{ url: 'https://storage.googleapis.com/muxdemofiles/mux-video-intro.mp4' }],
  playback_policy: ['public'],
});
```

## Full documentation

Check out the [Mux Node SDK docs](https://github.com/muxinc/mux-node-sdk) for more information.


# Add high-performance video to your Python application
Use our API and components to handle embedding, storing, and streaming video in your Python application
## Installation

Install this module using either `pip` or by installing from source.

```bash
# Via pip
pip install git+https://github.com/muxinc/mux-python.git

# Via source
git clone https://github.com/muxinc/mux-python.git
cd mux-python
python setup.py install --user
```

## Quickstart

To start, you'll need a Mux access token. Once you've got that, you're off to the races!

```python
import os
import mux_python
from mux_python.rest import ApiException

# Authentication Setup
configuration = mux_python.Configuration()
configuration.username = os.environ['MUX_TOKEN_ID']
configuration.password = os.environ['MUX_TOKEN_SECRET']

# API Client Initialization
assets_api = mux_python.AssetsApi(mux_python.ApiClient(configuration))

# List Assets
print("Listing Assets: \n")
try:
    list_assets_response = assets_api.list_assets()
    for asset in list_assets_response.data:
        print('Asset ID: ' + asset.id)
        print('Status: ' + asset.status)
        print('Duration: ' + str(asset.duration) + "\n")
except ApiException as e:
    print("Exception when calling AssetsApi->list_assets: %s\n" % e)
```
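
Once an asset is `ready`, each of its playback IDs maps to playback and thumbnail URLs with a fixed pattern. Small helpers, assuming a `public` playback policy (the playback ID below is a placeholder):

```python
def playback_url(playback_id):
    # Public playback IDs stream as HLS from stream.mux.com
    return f"https://stream.mux.com/{playback_id}.m3u8"

def thumbnail_url(playback_id):
    return f"https://image.mux.com/{playback_id}/thumbnail.jpg"

print(playback_url("abcd1234"))   # https://stream.mux.com/abcd1234.m3u8
print(thumbnail_url("abcd1234"))  # https://image.mux.com/abcd1234/thumbnail.jpg
```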

## Full documentation

Check out the [Mux Python SDK docs](https://github.com/muxinc/mux-python) for more information.


# Add high-performance video to your PHP application
Use our API and components to handle embedding, storing, and streaming video in your PHP application
## Installation

We publish Mux PHP to Packagist. You should depend on Mux PHP by adding us to your `composer.json` file.

```bash
composer require mux/mux-php
```

## Quickstart

To start, you'll need a Mux access token. Once you've got that, you're off to the races!

```php
// Authentication Setup
$config = MuxPhp\Configuration::getDefaultConfiguration()
    ->setUsername(getenv('MUX_TOKEN_ID'))
    ->setPassword(getenv('MUX_TOKEN_SECRET'));

// API Client Initialization
$assetsApi = new MuxPhp\Api\AssetsApi(
    new GuzzleHttp\Client(),
    $config
);

// Create Asset Request
$input = new MuxPhp\Models\InputSettings(["url" => "https://storage.googleapis.com/muxdemofiles/mux-video-intro.mp4"]);
$createAssetRequest = new MuxPhp\Models\CreateAssetRequest(["input" => $input, "playback_policy" => [MuxPhp\Models\PlaybackPolicy::PUBLIC_PLAYBACK_POLICY] ]);

// Ingest
$result = $assetsApi->createAsset($createAssetRequest);

// Print URL
print "Playback URL: https://stream.mux.com/" . $result->getData()->getPlaybackIds()[0]->getId() . ".m3u8\n";
```

## Full documentation

Check out the [Mux PHP SDK docs](https://github.com/muxinc/mux-php) for more information.


# Add high-performance video to your Ruby application
Use our API and components to handle embedding, storing, and streaming video in your Ruby application
## Installation

Add `mux_ruby` to your project's `Gemfile`.

```ruby
gem 'mux_ruby'
```

## Quickstart

To start, you'll need a Mux access token. Once you've got that, you're off to the races!

```ruby
require 'mux_ruby'

# Auth Setup
openapi = MuxRuby.configure do |config|
  config.username = ENV['MUX_TOKEN_ID']
  config.password = ENV['MUX_TOKEN_SECRET']
end

# API Client Init
assets_api = MuxRuby::AssetsApi.new

# List Assets
puts "Listing Assets in account:\n\n"

assets = assets_api.list_assets()
assets.data.each do | asset |
  puts "Asset ID: #{asset.id}"
  puts "Status: #{asset.status}"
  puts "Duration: #{asset.duration.to_s}\n\n"
end
```

## Full documentation

Check out the [Mux Ruby SDK docs](https://github.com/muxinc/mux-ruby) for more information.


# Add high-performance video to your Elixir application
Use our API and components to handle embedding, storing, and streaming video in your Elixir application
## Installation

Add `mux` to your list of dependencies in `mix.exs`.

```elixir
def deps do
  [
    {:mux, "~> 3.2.1"}
  ]
end
```

## Quickstart

To start, we'll need a Mux access token. We'll put our access token in our application configuration.

```elixir
# config/dev.exs
config :mux,
  access_token_id: "abcd1234",
  access_token_secret: "efghijkl"
```

Then use this config to initialize a new client in your application.

```elixir
client = Mux.client()
```

You can also pass the access token ID and secret directly to the `client/2` function if you'd prefer:

```elixir
client = Mux.client("access_token_id", "access_token_secret")
```

Now we can use the client to upload new videos, manage playback IDs, etc.

```elixir
params = %{
  input: "https://example.com/video.mp4"
}
{:ok, asset, _} = Mux.Video.Assets.create(client, params)
```

## Full documentation

Check out the [Mux Elixir SDK docs](https://github.com/muxinc/mux-elixir) for more information.


# Add high-performance video to your Java application
Use our API and components to handle embedding, storing, and streaming video in your Java application
## Installation

There are several ways to add the Mux Java SDK to your project:

### Maven

Add this dependency to your project's POM:

```xml
<dependency>
  <groupId>com.mux</groupId>
  <artifactId>mux-sdk-java</artifactId>
  <version>1.0.0</version>
  <scope>compile</scope>
</dependency>
```

### Gradle

Add this dependency to your project's build file:

```gradle
compile "com.mux:mux-sdk-java:1.0.0"
```

## Quickstart

To start, you'll need a Mux access token. Once you've got that, you're off to the races!

```java
// Import classes:
import com.mux.ApiClient;
import com.mux.ApiException;
import com.mux.Configuration;
import com.mux.auth.*;
import com.mux.models.*;
import com.mux.sdk.AssetsApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = Configuration.getDefaultApiClient();
    defaultClient.setBasePath("https://api.mux.com");
    
    // Configure HTTP basic authorization: accessToken
    HttpBasicAuth accessToken = (HttpBasicAuth) defaultClient.getAuthentication("accessToken");
    accessToken.setUsername("YOUR USERNAME");
    accessToken.setPassword("YOUR PASSWORD");

    AssetsApi apiInstance = new AssetsApi(defaultClient);
    CreateAssetRequest createAssetRequest = new CreateAssetRequest()
        .input(java.util.Collections.singletonList(new InputSettings().url("https://muxed.s3.amazonaws.com/leds.mp4")))
        .playbackPolicy(java.util.Collections.singletonList(PlaybackPolicy.PUBLIC))
        .videoQuality(CreateAssetRequest.VideoQualityEnum.BASIC);
    try {
      AssetResponse result = apiInstance.createAsset(createAssetRequest)
            .execute();
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling AssetsApi#createAsset");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}

```

## Full documentation

Check out the [Mux Java SDK docs](https://github.com/muxinc/mux-java) for more information.


# Add high-performance video to your C# application
Use our API and components to handle embedding, storing, and streaming video in your C# application
## Frameworks supported

* .NET Core >=1.0
* .NET Framework >=4.6
* Mono/Xamarin >=vNext

## Dependencies

* [RestSharp](https://www.nuget.org/packages/RestSharp) - 106.11.4 or later
* [Json.NET](https://www.nuget.org/packages/Newtonsoft.Json/) - 12.0.3 or later
* [JsonSubTypes](https://www.nuget.org/packages/JsonSubTypes/) - 1.7.0 or later
* [System.ComponentModel.Annotations](https://www.nuget.org/packages/System.ComponentModel.Annotations) - 4.7.0 or later

The DLLs included in the package may not be the latest version. We recommend using [NuGet](https://docs.nuget.org/consume/installing-nuget) to obtain the latest version of the packages:

```powershell
Install-Package RestSharp
Install-Package Newtonsoft.Json
Install-Package JsonSubTypes
Install-Package System.ComponentModel.Annotations
```

NOTE: RestSharp versions greater than 105.1.0 have a bug which causes file uploads to fail. See [RestSharp#742](https://github.com/restsharp/RestSharp/issues/742).

## Installation

Generate the DLL using your preferred tool (e.g. `dotnet build`).

Then include the DLL (under the `bin` folder) in the C# project, and use the namespaces:

```csharp
using Mux.Csharp.Sdk.Api;
using Mux.Csharp.Sdk.Client;
using Mux.Csharp.Sdk.Model;
```

## Usage

<Callout type="warning" title="Usage With Webhooks">
  At this moment, this SDK is not suitable for parsing or modeling webhook payloads, due to some incompatibilities in our API spec and our SDK generation tooling. We are working on resolving these issues, but for now you should only use this SDK for Mux's REST APIs.
</Callout>

To use the API client with an HTTP proxy, set up a `System.Net.WebProxy`:

```csharp
Configuration c = new Configuration();
System.Net.WebProxy webProxy = new System.Net.WebProxy("http://myProxyUrl:80/");
webProxy.Credentials = System.Net.CredentialCache.DefaultCredentials;
c.Proxy = webProxy;
```

## Getting Started

```csharp
using System.Collections.Generic;
using System.Diagnostics;
using Mux.Csharp.Sdk.Api;
using Mux.Csharp.Sdk.Client;
using Mux.Csharp.Sdk.Model;

namespace Example
{
    public class Example
    {
        public static void Main()
        {

            Configuration config = new Configuration();
            config.BasePath = "https://api.mux.com";
            // Configure HTTP basic authorization: accessToken
            config.Username = "YOUR_USERNAME";
            config.Password = "YOUR_PASSWORD";

            var apiInstance = new AssetsApi(config);
            var createAssetRequest = new CreateAssetRequest(); // CreateAssetRequest | 

            try
            {
                // Create an asset
                AssetResponse result = apiInstance.CreateAsset(createAssetRequest);
                Debug.WriteLine(result);
            }
            catch (ApiException e)
            {
                Debug.Print("Exception when calling AssetsApi.CreateAsset: " + e.Message );
                Debug.Print("Status Code: "+ e.ErrorCode);
                Debug.Print(e.StackTrace);
            }

        }
    }
}
```

## Full documentation

Check out the [Mux C# SDK docs](https://github.com/muxinc/mux-csharp) for more information.


# Integrate with your CMS
Mux has a growing collection of CMS integrations that make it easy to incorporate your Mux video details into your application.
While Mux stores basic video metadata like titles and creator IDs, a content management system (CMS) can help you manage additional content around your videos - like descriptions, categories, related content, and other rich metadata that helps organize your video content within your application's broader content strategy.

Mux integrates with the following CMS applications to help you easily manage and deliver video content in your applications. These integrations allow you to easily incorporate Mux videos into your existing CMS workflow.

## Headless CMS integrations

* [Sanity](/docs/integrations/sanity)
* [Contentful](/docs/integrations/contentful)
* [WordPress](/docs/integrations/wordpress)
* [Strapi](/docs/integrations/strapi)
* [Cosmic](/docs/integrations/cosmic)
* [DatoCMS](/docs/integrations/datocms)
* [Prepr](/docs/integrations/prepr)

## Third-party integrations

These are integrations that are not officially supported by Mux, but are community-driven and maintained.

* [PayloadCMS](https://github.com/oversightstudio/payload-plugins/tree/main/packages/mux-video)
* [Statamic](https://github.com/daun/statamic-mux)

If there’s an integration you’d like to see or if you’d like to partner with us, [let us know](mailto:info@mux.com)!


# Integrate with Sanity
Learn how to integrate Mux video with your Sanity studio. If your team is using Sanity as a CMS this integration will allow them to upload videos to Mux without leaving the Sanity studio.
<Callout type="info" title="Prerequisites">
  This guide assumes you already have a Sanity Studio set up. If you haven't created your Sanity Studio yet, follow the [Sanity Studio quickstart guide](https://www.sanity.io/docs/sanity-studio-quickstart/setting-up-your-studio) to get started.
</Callout>

## 1. Install Mux plugin

Run this command in your Sanity project folder:

```sh
npm i sanity-plugin-mux-input
```

## 2. Use in a schema

To use Mux video in your Sanity schemas, you'll need to create a schema type, import it to your schema types index, and configure the Mux plugin in your Sanity configuration file.

### 2.1. Create a schema type

Create a new file in your `schemaTypes` directory (or `schemas` directory, depending on your setup). For example, create a file called `videoBlogPost.ts`:

```typescript
// schemaTypes/videoBlogPost.ts
import { defineType, defineField } from 'sanity'

export default defineType({
  title: 'Video blog post',
  name: 'videoBlogPost',
  type: 'document',
  fields: [
    defineField({
      name: 'title',
      type: 'string',
      title: 'Title'
    }),
    defineField({
      name: 'video',
      type: 'mux.video',
      title: 'Video file'
    })
  ]
})
```

### 2.2. Import the schema type

Import your new schema type in your schema types index file (usually `schemaTypes/index.ts` or `schemas/index.ts`):

```typescript
// schemaTypes/index.ts
import videoBlogPost from './videoBlogPost'

export const schemaTypes = [videoBlogPost]
```

### 2.3. Configure the Mux plugin

Add the Mux plugin to your Sanity configuration file (`sanity.config.ts` or `sanity.config.js`):

```typescript
// sanity.config.ts
import { defineConfig } from 'sanity'
import { structureTool } from 'sanity/structure'
import { muxInput } from 'sanity-plugin-mux-input'
import { schemaTypes } from './schemaTypes'

export default defineConfig({
  name: 'default',
  title: 'My Sanity Project',
  
  projectId: 'your-project-id',
  dataset: 'production',
  
  plugins: [
    structureTool(),
    muxInput()
  ],
  
  schema: {
    types: schemaTypes,
  },
})
```

## 3. Enter Mux credentials

Generate a new Access Token by going to the Access Token settings of your Mux account dashboard.

<Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} />

The access token should have Mux Video Read and Write permissions as well as Mux Data (read-only).
If you want to use signed playback, you need to enable both **Read** and **Write** permissions for the `System` section. For more information, check out the [Signed Tokens](/docs/integrations/sanity#signed-tokens) section.

<Image src="/docs/images/new-cms-token.png" width={608} height={480} alt="Mux Video and Mux Data access token permissions" sm />

Back in Sanity Studio, navigate to the **Videos** section in your studio menu, then click on **Configure plugin**. Enter your Access Token ID and Secret Key in the configuration settings.

<Image src="/docs/images/sanity-configure-plugin.png" width={800} height={400} />

You'll also see an option to **Enable signed URLs**. This feature allows you to create videos with signed playback policies for additional security. If you're unsure, you can leave this disabled for now—you can learn more about this feature in the [Signed Tokens](#signed-tokens) section below.

## 4. Upload video

Use the select button to open the file explorer on your system, drag the file right into the input area, or paste the URL to the video in the field. Once it's done uploading, you can select the thumbnail you want for the preview.

<Player playbackId="TEgRQ00yGgc6GflbsK4Z44HwZDMIxKqY1" thumbnailTime="0" title="Sanity - Mux Video input - Upload" />

<Callout type="success" title="Congratulations!">
  You now have the ability to upload content to Mux through Sanity CMS!
</Callout>

To retrieve your video for playback, check out the [Sanity docs](https://www.sanity.io/blog/first-class-responsive-video-support-with-the-new-mux-plugin) for instructions.

## 5. Explore advanced options

## Signed Tokens

<Callout type="warning" title="Warning! Requires generating JWT on your server">
  Enabling signed URLs in Sanity will require you to generate your own signing tokens on your application **server**. This involves creating a signing key and using that to generate JSON web tokens when you want to access your videos and thumbnails outside of Sanity.
</Callout>

By default, all assets uploaded to Mux through Sanity will be created with a playback policy of `"public"`. This means that your videos and thumbnails are accessible with `https://stream.mux.com/{PLAYBACK_ID}.m3u8` and `https://image.mux.com/{PLAYBACK_ID}/thumbnail.jpg`.

If you want more control over playback and thumbnail access, you can enable signed URLs in the Sanity configuration popover:

<Image src="/docs/images/sanity-signed-urls.png" width={852} height={640} sm />

When you enable this feature, the following things will happen:

1. The Mux Plugin in Sanity will use the Mux API to create a URL signing key and save this with your `secrets` document.
2. Any assets that get created while this feature is enabled will be created with `playback_policy: "signed"` (instead of `"public"`).
3. The signing key from Step 1 will be used by the Mux Plugin to preview content inside the Sanity UI.
4. When you access your content in your own application, use the `MuxAsset.data.playback_ids` property to determine if the asset has a `signed` or `public` policy.

```json
{
  "_id": "0779365f-bbd1-46ab-9d78-c55feeb28faa",
  "_type": "mux.videoAsset",
  "assetId": "fNMFNYMq48EwgJM7AIn1rNldiFBcVIdK",
  "data": {
    "playback_ids": [
      {
        "id": "01cBJKm5KoeQii00YYGU7Rvpzvh6V01l4ZK",
        "policy": "public"
      }
    ]
  },
  "status": "ready"
}
```

5. You should use the signed `playbackId` to create URLs for playback and for thumbnail generation.

* Playback `https://stream.mux.com/{SIGNED_PLAYBACK_ID}.m3u8?token={TOKEN}`
* Thumbnails `https://image.mux.com/{SIGNED_PLAYBACK_ID}/thumbnail.jpg?token={TOKEN}`

6. The `TOKEN` parameter for the above URLs is something you create on your server according to Step 2 in [Secure video playback](/docs/guides/secure-video-playback).
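
Putting those steps together in application code: read the playback ID and policy from the stored asset data, and append a token only when the policy is `signed`. A minimal sketch — the token itself still comes from your server-side signing step:

```python
def signed_aware_playback_url(playback_id, policy, token=None):
    url = f"https://stream.mux.com/{playback_id}.m3u8"
    if policy == "signed":
        if token is None:
            raise ValueError("signed playback requires a token")
        url += f"?token={token}"
    return url

# Using the playback ID from the mux.videoAsset document shown above
playback = {"id": "01cBJKm5KoeQii00YYGU7Rvpzvh6V01l4ZK", "policy": "public"}
print(signed_aware_playback_url(playback["id"], playback["policy"]))
```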

Note that in the Sanity UI, assets using a signed URL display this green notice:

<Image src="/docs/images/sanity-signed-playback.png" width={562} height={783} sm />

## Encoding Tiers

When uploading a new video, you can select which Encoding Tier is used when preparing the Asset. Possible selections are `Smart` and `Baseline`. When choosing `Smart`, additional options are made available for maximum resolutions (1080p, 2K or 4K).

More details can be found in our [Use Encoding Tiers](/docs/guides/use-video-quality-levels) guide.

## Static Renditions

When using the `Smart` Encoding Tier, an option to enable downloadable MP4s will be available. This option will create [Static Renditions](/docs/guides/enable-static-mp4-renditions) for the Asset and will make MP4 files available for download to client devices using a formatted URL.

## Max Video Resolution

You can specify the maximum resolution to encode the uploaded video. This option is particularly important in managing costs when uploaded videos are higher than `1080p` resolution and also allows you to encode and play videos in 2k or 4k resolutions.
More information on the feature is available in [our docs](/docs/guides/stream-videos-in-4k). Also, read more on this feature announcement in our [blog post](https://www.mux.com/blog/more-pixels-fewer-problems-introducing-4k-support-for-mux-video).

## Captions and Subtitles

You can add captions to your videos in two ways: during the initial upload or after the video has been uploaded. Both auto-generated and custom captions are supported, and you can use both types on the same asset.

### Adding captions during upload

When uploading a new video, you can configure auto-generated captions in the upload modal before the file is uploaded to Mux. This allows you to set up auto-generated captions right from the start.

<Image src="/docs/images/sanity-captions-1.png" width={500} height={700} />

### Adding captions to existing videos

For videos that have already been uploaded, you can add or manage captions in two ways:

* **From the Videos section:** Go to **Videos** in your studio menu, find the video in the list, and open it to view its details and caption options.
* **From the document:** Open the document that contains the video field, click the three-dots menu on the video input, then select **Captions**.

<Image src="/docs/images/sanity-captions-2.png" width={1200} height={500} />

### Types of captions

#### Auto-generated captions

For auto-generated captions, select the language of the spoken audio in the video. Mux will generate the captions automatically while it prepares the asset. The display name you choose is what will appear in the player when users select the caption track.

<Callout type="warning" title="Auto-generate a single caption track">
  The auto-generated option should only be used to generate one caption track per asset. The language selected must match the spoken language in the video.
</Callout>

<Image src="/docs/images/sanity-captions-3.png" width={1200} height={500} />

More details: [Add auto-generated captions and use transcripts](/docs/guides/add-autogenerated-captions-and-use-transcripts).

#### Custom captions

You can add custom captions and subtitles by providing a public URL to a `.vtt` or `.srt` file. Enter the URL in the caption configuration and set the caption name and language. You can host the file in Sanity's Media Library or any other public URL.

<Image src="/docs/images/sanity-captions-4.png" width={1200} height={500} />

More details: [Add subtitles/captions to videos](/docs/guides/add-subtitles-to-your-videos).

### Managing captions

Caption tracks can be added and removed at any time. Changes are reflected in the stored asset data. If you need to edit auto-generated captions, you can download the VTT file, make your edits, and re-upload it as a custom caption.

<GuideCard title="Set up playback" description="Set up your iOS application, Android application or web application to start playing your Mux assets" links={[{ title: 'Read the guide', href: '/docs/guides/play-your-videos' }]} />

<GuideCard title="Preview your video" description="Now that you have Mux assets, build rich experiences into your application by extracting images from your videos" links={[{ title: 'Read the guide', href: '/docs/guides/get-images-from-a-video' }]} />

<GuideCard title="Integrate Mux Data" description="Add the Mux Data SDK to your player and start collecting playback performance metrics." links={[{ title: 'Read the guide', href: '/docs/guides/track-your-video-performance' }]} />

## Release notes

### Current release

#### v2.14.0

* Add pagination in "Videos" library
* Add text tracks management

### Previous releases

<a href="https://github.com/sanity-io/sanity-plugin-mux-input/releases">See GitHub</a>


# Integrate with Contentful
The Mux Contentful app connects Contentful with your Mux account so that Mux can handle uploading and streaming of all videos.
This is a detailed guide for integrating the Contentful Mux app. To read more about the Mux app and why you may want to use it to power videos in your CMS, read [the announcement blog post on Contentful's blog](https://www.contentful.com/blog/improve-video-streaming-using-hls-with-mux-contentful/).

## 1. Enter Mux credentials

Create an access token in your Mux account - you will need both the access token ID and the access token secret in the Contentful application. The access token should have **Read** and **Write** permissions for Mux Video, and also **Read** for Mux Data.

<Image src="/docs/images/new-cms-token.png" width={608} height={480} alt="Mux Video access token permissions" sm />

## 2. Install App

In the Contentful dashboard click “Apps > Manage Apps” in the top navigation bar. Scroll down and find the Mux app, click it to start the installation flow.

Next you will see the configuration screen. You can come back to this screen after the app is installed to update the app configuration. Enter your Mux access token and Mux token secret. These are the same credentials you would use to make API requests yourself.

Assign Mux to JSON fields from your content model. In this example I am assigning Mux to take over the “Video Upload” field in my Blog post model. If you add new JSON fields later you can always come back to this configuration screen and assign Mux to the new fields.

<Image src="/docs/images/contentful-v110-install.png" width={1506} height={953} alt="Add API keys to integration." />

You can also assign Mux fields from the configuration of a particular Content model if you create a JSON Object type field and then edit its appearance as follows:

<Image src="/docs/images/contentful-v200-install-2.png" width={1506} height={953} alt="Edit field appearance to add Mux." />

### 2.1. Assign Mux Sidebar to a Content model

Once the plugin is installed and your Content Models are configured, you can add the Mux sidebar by clicking the plus sign and placing it wherever you want. Then click save. This sidebar is used to visualize pending changes with the publication of the Entry in Contentful. This is explained in more detail in the [Features and Important Notes](#7-mux-sidebar) section.

<Image src="/docs/images/contentful-v200-install-3.png" width={1506} height={953} alt="Add Mux sidebar to a Content model." />

## 3. Upload video

Create a new entry for your content model. You should see a drag and drop zone and file picker to select an audio or video file:

<Image src="/docs/images/contentful-v200-upload-1.png" width={1256} height={834} alt="Empty Video field" />

Select a video file and a modal will appear with expandable configuration options you can use to configure the video before it's uploaded. The available options are:

* **Video Quality Settings:** Allows you to define the video quality. More information: [Use different video quality levels](/docs/guides/use-video-quality-levels)

* **Privacy Settings:** Allows you to define the video visibility between public or protected. More information: [Secure video playback](/docs/guides/secure-video-playback)

* **Metadata:** Allows you to add the title in the video metadata. This is also useful for having the title defined in the Mux Dashboard. More information: [Add metadata to your videos](/docs/guides/add-metadata-to-your-videos)

* **Captions:** Allows you to add captions to your videos, both custom and auto-generated captions.
  * To generate auto-generated captions, you need to specify the video language and it will automatically generate captions.
  * For custom captions, you need to specify the URL where the captions are located and the caption language.

More information in [Captions and Subtitles](#5-captions-and-subtitles) section.

* **MP4 Generation:** Allows you to generate static renditions for your videos, both audio-only and audio with video. More information: [Enable static MP4 renditions](/docs/guides/enable-static-mp4-renditions)

After configuring these options, click on the upload button and wait for the file to upload to Mux. The amount of time this takes will depend on your network upload speed and the size of the file. **Don't close the tab until the upload is complete.**

Alternatively, you can enter the Asset ID of an existing Mux video, or the URL of an audio or video file, directly into the input field.

<Image src="/docs/images/contentful-v200-upload-2.png" width={1256} height={834} alt="Empty Video field" />

<Image src="/docs/images/contentful-v110-2.png" width={1256} height={833} alt="Uploading video." />

After the upload is complete you will see a message that says "Waiting for asset to be playable" while Mux is processing your video. Normal video files should only take a minute or so; longer files, or files that need more processing, may take more time.

<Image src="/docs/images/contentful-v110-3.png" width={1256} height={785} alt="Waiting for video to process." />

Your video is now playable via Mux. You will see a player with your video in the Contentful UI.

<Image src="/docs/images/contentful-v200-upload-3.png" width={1287} height={1295} alt="After a video is finished and ready to play." />

## 4. Playback

When you query your Mux video through the Contentful API, you will get back a JSON object like the following (also viewable in the Data tab):

```json
{
  "version": 3,
  "uploadId": "some-upload-id",
  "assetId": "some-asset-id",
  "playbackId": "YOUR_PLAYBACK_ID",
  "ready": true,
  "ratio": "16:9",
  "max_stored_resolution": "HD",
  "max_stored_frame_rate": 29.928,
  "duration": 23.857167,
  "audioOnly": false,
  "created_at": 1664219467,
  "audioTracks": [
    {
      "type": "audio",
      "primary": true,
      "max_channels": 2,
      "max_channel_layout": "stereo",
      "id": "some-audio-track-id",
      "duration": 10.026667
    }
  ],
  "meta": {
    "title": "some-video-title",
    "external_id": "some-external-id"
  }
}
```

You will need to pull out the `playbackId` property and construct a URL like the following, which you can use with any player that supports HLS:

```text
https://stream.mux.com/{YOUR_PLAYBACK_ID}.m3u8
```

View Mux's [Playback docs](/docs/guides/play-your-videos) for more information about players.
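For example, a minimal sketch in Python that pulls the `playbackId` out of the stored object and builds the HLS URL:

```python
def hls_url(mux_field: dict) -> str:
    """Build the HLS streaming URL from the Mux JSON object stored in Contentful."""
    playback_id = mux_field["playbackId"]
    return f"https://stream.mux.com/{playback_id}.m3u8"

# Using the object returned by the Contentful API:
video = {"playbackId": "YOUR_PLAYBACK_ID", "ready": True, "ratio": "16:9"}
hls_url(video)  # "https://stream.mux.com/YOUR_PLAYBACK_ID.m3u8"
```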

## Using Mux Player

We made it easy to play back video using Mux Player by including the same code used to play the video in the Contentful dashboard. Head to the **Player Code** tab, click the copy button, and paste the code into a website for quick testing and development.
You can also add optional parameters such as autoplay, mute, and loop by clicking the corresponding checkboxes; the example code updates accordingly.

Additionally, there's an option to get iframe example code for embedding the video, providing an alternative integration method.

<Image src="/docs/images/contentful-v200-player-code-1.png" width={1249} height={934} alt="Get the Mux Player code." />

## 5. Captions and Subtitles

Captions can be added before uploading an asset (see the [Upload video](#3-upload-video) section) and after uploading in the Captions tab. There are two ways to add captions: auto-generated and custom captions, and they can be used together if desired.

You must select the type of captions to upload from the dropdown menu.

<Image src="/docs/images/contentful-v200-captions-dropdown.png" width={1256} height={585} alt="Select caption type from dropdown" />

### Auto-Generated Captions

For auto-generated captions, you need to select the Language Code. This is automatically populated based on the Audio Name you select. It's important to select the same language as the spoken audio in the video so that captions are generated correctly.

The Audio Name is what will appear in the player when selecting the caption. You can use any name you want.

<Image src="/docs/images/contentful-v200-captions-auto.png" width={1256} height={585} alt="Auto-generated captions configuration" />

### Custom Captions

Custom captions and subtitles can come from any publicly accessible URL. Enter the URL of the `.vtt` or `.srt` caption file, set the caption name, and choose whether to mark the track as closed captions.

<Image src="/docs/images/contentful-v200-captions-custom.png" width={1256} height={585} alt="Adding custom captions/subtitles" />

One way to upload captions is to use the Contentful Media Manager. After uploading the file to the Manager, right click on the download button and select `copy link` then paste this link into the URL field.

<Image src="/docs/images/contentful-v110-cc-2.png" width={1447} height={1100} alt="Copying the file URL in Media Manager." />

### Managing Captions

Caption files can be added or deleted, and files can be downloaded for further editing. The stored JSON object will also reflect additional caption files. Existing captions are displayed after clicking the `Resync` button under the Data tab or when you open the entry. Deleting a caption will appear in the Mux sidebar and requires publishing to take effect in Mux.

<Image src="/docs/images/contentful-v200-captions-result.png" width={1256} height={1151} alt="Captions or subtitles are working on video." />

## 6. Audio Tracks

You can add new audio tracks to an existing asset in the Audio Tracks section. This is useful for providing multiple language audio tracks or alternative audio content for your videos.

<Image src="/docs/images/contentful-v200-audio-tracks-tab.png" width={1256} height={585} alt="Audio Tracks tab" />

### Adding Audio Tracks

To upload a new audio track, you need to provide a public URL of an audio file. One way to obtain this is by using the Contentful Media Manager, in the same way as described in the [Captions section](#custom-captions).

You need to specify the Language Code, which is automatically populated when you indicate the Audio Name. The Audio Name can contain any text and is what will be displayed in the player when users want to select from the available audio tracks.

<Image src="/docs/images/contentful-v200-audio-tracks-upload.png" width={1256} height={585} alt="Adding new audio track" />

### Managing Audio Tracks

Audio tracks can be added or deleted, and the stored JSON object will reflect the additional audio tracks. Deleting an audio track will appear in the Mux sidebar and require publishing to take effect in Mux.

<Image src="/docs/images/contentful-v200-audio-tracks-result.png" width={1256} height={585} alt="Audio tracks in player" />

## 7. Mux Sidebar

The Mux sidebar provides a visual interface to track the synchronization status between your Contentful entries and Mux assets. This sidebar displays:

* Pending actions that need to be synchronized with Mux
* Any changes that require publishing to take effect in Mux

<Image src="/docs/images/contentful-v200-feature-sidebar-1.png" width={350} height={953} alt="Mux sidebar." sm />

<Image src="/docs/images/contentful-v200-feature-sidebar-2.png" width={350} height={953} alt="Mux sidebar." sm />

## 8. Publishing Requirements and Breaking Changes

As part of the [version 2.0 plugin release](https://github.com/contentful/apps/pull/9826), changes were made to maintain data integrity and consistency between what is published in Contentful and what is stored in Mux. For example, changing a video from public to protected regenerates its playback ID; previously, if that video was already published in Contentful, there was no way to capture the new playback ID.

To solve this, some actions that we consider "breaking changes" will not be executed in Mux until the user clicks "Publish" on the pending changes.

### Breaking Changes That Require Publishing

The following changes will appear as pending in the Mux sidebar and will only be applied to Mux when you click "Publish changes" in Contentful:

* **Delete Video** - Removing a video asset
* **Delete Captions** - Removing caption/subtitle files
* **Delete Audio** - Removing audio tracks
* **Change Metadata Title** - Modifying the title in the metadata section
* **Delete MP4 Renditions** - Removing static MP4 files
* **Change Video Visibility** - Switching between public/protected settings

<Image src="/docs/images/contentful-v200-pending-changes-sidebar.png" width={1506} height={953} alt="Mux sidebar showing pending changes" />

### Publishing Workflow

For example, if you want to change the visibility of an existing video, this will be a pending change that appears in the sidebar and the change will be made in Mux when you click "Publish changes" in Contentful.

<Image src="/docs/images/contentful-v200-publish-changes-process.png" width={1506} height={953} alt="Publishing changes process" />

The same happens with actions like deletions - if you want to delete a caption, it will appear marked for deletion but will only be removed when you publish. You can also undo deletions to prevent them from being deleted when you publish.

## Explore advanced options

## Advanced: Signed URLs

<Callout type="warning" title="Warning! Requires generating JWT on your server">
  Enabling signed URLs in Contentful will require you to generate your own signing tokens on your application **server**. This involves creating a signing key and using that to generate JSON web tokens when you want to access your videos and thumbnails outside of Contentful.
</Callout>

By default, all assets uploaded to Mux through Contentful will be created with a single playback policy of `"public"`. This means that your videos and thumbnails are accessible with `https://stream.mux.com/{PLAYBACK_ID}.m3u8` and `https://image.mux.com/{PLAYBACK_ID}/thumbnail.jpg`.

If you want more control over playback and thumbnail access, you can enable this feature on the Contentful configuration page:

<Image src="/docs/images/contentful-v110-config.png" width={1505} height={1158} alt="Additional global configuration options." />

When you enable this feature, the following things will happen:

1. The Mux App in Contentful will use the Mux API to create a URL signing key and save this with your Contentful configuration.
2. When uploading an asset, you can select "Protected" in the Privacy Settings section of the configuration modal. If you select this option, the asset will be created with `playback_policy: "signed"` (instead of `"public"`).
3. The signing key from Step 1 will be used by the Mux App to preview content inside the Contentful UI.
4. When you access your content in your own application, outside of Contentful, the Mux asset will no longer have the key `playbackId`; it will now be called `signedPlaybackId`.

```json
{
  "uploadId": "some-upload-id",
  "assetId": "some-asset-id",
  "signedPlaybackId": "YOUR_SIGNED_PLAYBACK_ID",
  "ready": true,
  "ratio": "16:9"
}
```

5. You should use the value from `signedPlaybackId` to create URLs for playback and for thumbnail generation.

* Playback `https://stream.mux.com/{SIGNED_PLAYBACK_ID}.m3u8?token={TOKEN}`
* Thumbnails `https://image.mux.com/{SIGNED_PLAYBACK_ID}/thumbnail.jpg?token={TOKEN}`

6. The `TOKEN` parameter for the above URLs is something you create on your server according to Step 2 in [Security: Signed URLs](/docs/guides/secure-video-playback).
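Once your server mints those tokens, assembling the URLs is mechanical. A minimal Python sketch, assuming separate tokens for playback and thumbnails (each JWT is scoped to a single resource type, so a playback token will not authorize a thumbnail request):

```python
def signed_stream_url(signed_playback_id: str, playback_token: str) -> str:
    # playback_token must be minted for video playback (see the Signed URLs guide)
    return f"https://stream.mux.com/{signed_playback_id}.m3u8?token={playback_token}"

def signed_thumbnail_url(signed_playback_id: str, thumbnail_token: str) -> str:
    # Thumbnails need their own token; a playback token will not work here
    return f"https://image.mux.com/{signed_playback_id}/thumbnail.jpg?token={thumbnail_token}"
```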

Note that in the Contentful UI when an asset is using a signed URL you will see this notice.

<Image src="/docs/images/contentful-v110-notice.png" width={1444} height={573} alt="Signed Notice" />

Public and signed playback IDs can be toggled per-entry under the Data tab. Each time the IDs are toggled, the old playback ID is deleted, and a new ID is created in its place.

## Advanced: DRM

DRM (Digital Rights Management) provides the highest level of content protection using industry-standard encryption. To use DRM, you must first [request access](https://www.mux.com/support/human). Once approved, you'll receive a DRM Configuration ID.

[Learn more about DRM](/docs/guides/protect-videos-with-drm).

### Enabling DRM in Contentful

To enable DRM support, go to the Contentful Mux app configuration screen. You will see a new **"Advanced: DRM"** section where you can:

1. Enable or disable DRM support by checking the **Enable DRM** checkbox
2. Enter your **DRM Configuration ID** provided by Mux after your DRM access is approved

<Image src="/docs/images/contentful-drm-1.png" width={1444} height={573} alt="Enable DRM in Contentful" />

This configuration is stored alongside your other installation parameters and applies to all videos uploaded through this Contentful space.

### Uploading DRM-protected videos

Once DRM is enabled and configured, a new **"DRM Protected"** option will appear in the **Privacy Settings** section when uploading or editing a video asset.

* The DRM option is only available when both DRM is enabled **and** a valid DRM Configuration ID is present
* If DRM is enabled but no Configuration ID is set, the option will appear disabled with contextual help text
* When DRM is enabled and configured, **DRM Protected** is selected by default (users can change it to Public or Protected if needed)

<Image src="/docs/images/contentful-drm-2.png" width={1444} height={573} alt="DRM Protected option in Privacy Settings" />

**Note:** DRM is only available for video assets. If you attempt to upload an audio-only file with DRM selected, the app will display a warning indicating that DRM is not supported for audio files. Use the **Protected** option for secure playback of audio content.

### Viewing DRM-protected videos

To view DRM-protected videos in Contentful, you must also have **Signed URLs enabled** in the app configuration. If Signed URLs are not enabled, you will see an error message when trying to view DRM content. See [Advanced: Signed URLs](#advanced-signed-urls) for setup instructions.

Due to browser iframe security restrictions, **DRM-protected videos cannot be played directly within the Contentful preview**. When viewing a DRM-protected asset, you will see a notice explaining this limitation.

The app provides an **"Open in external player"** link that opens the video in a standalone browser tab where DRM playback is permitted. This external player uses a Blob URL to load the player with the required DRM tokens.

<Image src="/docs/images/contentful-drm-3.png" width={1444} height={573} alt="DRM warning in preview" />

**Note:** DRM playback incurs additional costs. See the [DRM pricing](/docs/guides/protect-videos-with-drm#pricing) section for details.

### Using DRM-protected videos in your application

When you query a DRM-protected video through the Contentful API, the JSON object will include DRM-related information. The generated player code in the **Player Code** tab will automatically include the necessary DRM tokens (`playback`, `thumbnail`, `storyboard`, and `drm` tokens), so you can copy and paste it directly into your application.

For more detailed information about implementing DRM playback, including token generation and player configuration, see the [DRM guide](/docs/guides/protect-videos-with-drm).

## Note about version 2.0 release

In August 2025, [version 2.0 of the plugin was released](https://github.com/contentful/apps/pull/9826). No action is required, as the plugin updates automatically.

There are no changes to the previous data structure, but new fields have been added that are necessary to support new features such as audio tracks, MP4 renditions, and others.

If you were already using the plugin, you may notice that opening an entry that has a video changes it to 'Changed' status. This occurs because the new video data is retrieved and stored in Contentful, effectively an automatic resync.

This is expected behavior and only occurs with videos uploaded before the new version or if changes are made to videos outside of Contentful, as it synchronizes automatically.

## Note about migrating from the old Contentful extension

Before releasing the Contentful App, Mux had an officially supported [Contentful extension](https://github.com/contentful/extensions/pull/316).

The underlying data structure has not changed, so you can safely migrate to the app without losing data by following these steps:

1. Uninstall the extension (now your video fields should look like raw JSON)
2. Install the app
3. On the configuration screen, apply the Mux App to the video fields that you had before

<GuideCard title="Set up playback" description="Set up your iOS application, Android application or web application to start playing your Mux assets" links={[{ title: 'Read the guide', href: '/docs/guides/play-your-videos' }]} />

<GuideCard title="Preview your video" description="Now that you have Mux assets, build rich experiences into your application by extracting images from your videos" links={[{ title: 'Read the guide', href: '/docs/guides/get-images-from-a-video' }]} />

<GuideCard title="Integrate Mux Data" description="Add the Mux Data SDK to your player and start collecting playback performance metrics." links={[{ title: 'Read the guide', href: '/docs/guides/track-your-video-performance' }]} />


# Integrate with WordPress
Learn how to integrate Mux with WordPress. The Mux Video Uploader plugin by 2Coders connects WordPress with your Mux account so you can upload, manage, and embed videos on your site or application from your WordPress account.
This guide explains how to integrate Mux with WordPress using the Mux Video Uploader plugin by 2Coders. This integration enables you to leverage Mux's powerful video infrastructure while maintaining the familiar WordPress content management experience.

Follow the steps below to integrate Mux with your WordPress site.

## 1. Install the Plugin

The plugin can be installed either from the WordPress Plugin Directory or manually by uploading a zipped plugin file.

<Callout type="info">
You need to be on the WordPress.com Business plan or higher to install the Mux Video Uploader plugin. There is no such requirement for self-hosted WordPress instances.
</Callout>

### A. From the WordPress Plugin Directory

1. In your WordPress admin panel, navigate to `Plugins > Add Plugin` on the sidebar

2. Search and select "Mux Video Uploader by 2Coders"

   <Image src="/docs/images/wordpress-plugin-search.jpg" width={2404} height={526} />

3. Click the `Install and activate` button on the plugin page.

   <Image src="/docs/images/wordpress-plugin-install.jpg" width={2404} height={526} />

### B: Manual Installation

1. Download the plugin ZIP file from the [WordPress.org site](https://downloads.wordpress.org/plugin/2coders-integration-mux-video.latest-stable.zip).

2. In your WordPress admin panel, go to `Plugins > Add Plugin`

3. Click Upload Plugin and select the downloaded ZIP file

   <Image src="/docs/images/wordpress-plugin-upload.jpg" width={2404} height={526} />

4. Click Install Now and then Activate Plugin

After the plugin is properly installed, you should see the Mux Video Uploader plugin on the `Installed Plugins` page.

<Image src="/docs/images/wordpress-plugin-installed.jpg" width={2404} height={526} />

## 2. Create a Mux Account

If you don't already have a Mux account:

1. Sign up at [mux.com](https://dashboard.mux.com/signup)
2. After creating your account, navigate to your dashboard
3. Generate [API Access Token](https://www.mux.com/docs/core/stream-video-files#1-get-an-api-access-token). You'll need both an Access Token ID and Secret Key for the plugin to make API requests.

   <Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} />

   The access token should have Mux Video Read and Write permissions as well as Mux Data (read-only).

   <Image src="/docs/images/new-cms-token.png" width={1102} height={468} alt="Mux Video and Mux Data access token permissions" />

## 3. Connect WordPress to Mux

In your WordPress admin panel, locate the new Mux Video menu item

1. Go to Mux Video > Settings
2. Enter your Mux API credentials (API ID and Secret Key)
3. Click Save Settings

   <Image src="/docs/images/wordpress-plugin-token.jpg" width={2404} height={526} />

## 4. Upload Video

The video below shows how to upload a video on WordPress using the Mux Video Uploader plugin. You can also enable advanced features, like Signed URLs, Subtitles & Captions, and MP4 generation during asset creation or later on from the `Assets List` page.

<Player playbackId="K1UOWbEJzRTNNiwAVywTS5CBuZD02iSIM" muted loop thumbnailTime={0} title="How to upload a video on Wordpress using Mux Video Uploader plugin" />

## 5. Play Video

### Using the Gutenberg Block

The uploaded video can be added to your WordPress site using the Gutenberg block Editor:

1. Add or Edit a post or page
2. Click the + icon to add a new block
3. Search for "Mux Video" and select the block
4. Choose the asset from the list and click `Insert`
5. Preview and publish your content

When using the Gutenberg block, the video is embedded onto the page with the default Mux Player configuration.

<Player playbackId="ita1G6KmqTIOlQJ017m6mDJwJver31Cwi" muted loop thumbnailTime={0} title="Embed Video with Gutenberg block using Mux Video Uploader WordPress plugin" />

### Using the Shortcode Block

The same uploaded videos can also be added using the Shortcode block. With the Shortcode, you can customize the Mux Player configuration instead of using the default configuration.

<Player playbackId="aI1gVG4LX01bmbh5USPdm5OTGsqgBoJsz" muted loop thumbnailTime={0} title="Embed Video with Shortcode block using Mux Video Uploader WordPress plugin" />

## Advanced video options

### Video Quality Levels

When uploading a new video, you can select which Video Quality Level is used when preparing the asset. The options are `Basic`, `Plus`, and `Premium`. More details can be found in our [Use Video Quality Options](/docs/guides/use-video-quality-levels) guide.

### MP4 Generation

Each asset can be configured to generate downloadable MP4s. You can select `Highest`, `Audio-Only`, or both. This creates [Static Renditions](/docs/guides/enable-static-mp4-renditions) for the asset and makes MP4 files available for download to client devices using a formatted URL.
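When the renditions are ready, the MP4 files are served from predictable URLs. A sketch, assuming the `highest` and `audio` rendition names that correspond to the plugin's `Highest` and `Audio-Only` options (see the static renditions guide for the definitive naming):

```python
def mp4_download_url(playback_id: str, rendition: str = "highest") -> str:
    # rendition: "highest" (video + audio) or "audio" (audio-only) -- assumed names
    if rendition not in ("highest", "audio"):
        raise ValueError("unknown rendition name")
    return f"https://stream.mux.com/{playback_id}/{rendition}.mp4"
```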

### Signed Tokens

When uploading a new video, you can select `Protected` option when you want to secure the video playback and `Public` to make the video publicly available. Learn more about [Secure video playback](/docs/guides/secure-video-playback).

<Image src="/docs/images/wordpress-plugin-signed-url.jpg" width={2404} height={526} />

The Mux Video Uploader plugin creates a signing key when you configure the Access Token on the plugin's Settings page. When the `Protected` option is selected during upload, the plugin generates Signed URLs, which are available on the Asset page as shown in the image below.

<Image src="/docs/images/wordpress-plugin-view-asset-signed-url.jpg" width={2404} height={526} />

### Auto-Generate and Upload Custom Captions/Subtitles

With Mux's auto-generated captions, you can easily add captions to uploaded videos by selecting the language of the spoken audio. Mux can generate captions automatically while preparing the asset, or later: go to the asset's entry in the Asset List section of the plugin and click `Add Captions`. More details can be found in the [Add auto-generated captions to your videos and use transcripts](/docs/guides/add-autogenerated-captions-and-use-transcripts) section of our documentation.

<Callout type="warning" title="Warning! Auto-generate a single caption track">
The "Auto-generated" captions configuration should only be used to generate a single-language caption track. The language selected must match the spoken language.
</Callout>

Additionally, you can upload one or more custom caption files (during asset creation step or later) for a single asset.


# Integrate with Strapi
Strapi is an open source content management system that allows you to define your own schemas for your content.
The [Mux Video Uploader](https://www.npmjs.com/package/strapi-plugin-mux-video-uploader) plugin allows editors to upload content directly to Mux from within the Strapi interface, then associate those videos with their custom collection types.

## Requirements

* A working installation of Strapi that is publicly accessible through a hostname
* An [Access Token and Secret Key](/docs/integrations/strapi#2-create-access-in-mux), which are provisioned within the Mux Dashboard
* A [Webhooks listener](/docs/integrations/strapi#3-configure-webhook-listener) configured within the Mux Dashboard so that Strapi can be informed of upload progress

## 1. Install the Mux Video Uploader plugin

With your existing Strapi installation, run the install command for your Strapi version (below) in the root of your Strapi project. Be sure to restart Strapi for the plugin to take effect.

<Callout type="info" title="Version notice">
As of version 2.1.0 of this plugin, only Strapi v4 is supported. To use Strapi v3, please use version 2.0.0 of the plugin.
</Callout>

## Install instructions for Strapi v5

Run this command in your project folder if you are using NPM:

```sh
npm i strapi-plugin-mux-video-uploader@latest
```

Or this command if you are using yarn with your project:

```sh
yarn add strapi-plugin-mux-video-uploader@latest
```

## Install instructions for Strapi v4

Run this command in your project folder if you are using NPM:

```sh
npm i strapi-plugin-mux-video-uploader@2.8.4
```

Or this command if you are using yarn with your project:

```sh
yarn add strapi-plugin-mux-video-uploader@2.8.4
```

## Install instructions for Strapi v3

Run this command in your project folder if you are using NPM:

```sh
npm i strapi-plugin-mux-video-uploader@2.0.0
```

Or this command if you are using yarn with your project:

```sh
yarn add strapi-plugin-mux-video-uploader@2.0.0
```

## 2. Create access token in Mux

Generate a new Access Token by going to the Access Token settings of your Mux account dashboard.

<Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} alt="Mux access token settings" />

The access token should have Mux Video **Read** and **Write** permissions.

<Image src="/docs/images/new-access-token.png" width={760} height={376} alt="Mux Video access token permissions" sm />

After clicking the "Generate Token" button, save the "Access Token ID" and "Secret Key" to be used later.

## 3. Configure Webhook listener

Part of the upload process includes Mux updating Strapi with the completion of the upload and processing. In order for Mux to make this communication, a Webhook needs to be established so that events are sent to your Strapi installation.

Create a new Webhook configuration in the Mux Dashboard. There will be a space to add a "URL to notify". This value should be based on your Strapi installation's hostname:

```txt
{YOUR_STRAPI_DOMAIN_HERE}/mux-video-uploader/webhook-handler
```

After saving, copy the "Signing Secret" which will be used later.
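The signing secret lets the receiver confirm that webhook requests really come from Mux. The plugin's webhook handler does this for you, but for illustration, verification looks roughly like this sketch (assuming Mux's `Mux-Signature: t=<timestamp>,v1=<hex HMAC>` header format, where the HMAC-SHA256 is computed over `<timestamp>.<raw body>` with the signing secret):

```python
import hashlib
import hmac

def verify_mux_signature(raw_body: bytes, signature_header: str, signing_secret: str) -> bool:
    """Check a webhook request body against the Mux signing secret."""
    # Header format (assumed): "t=<unix timestamp>,v1=<hex HMAC-SHA256>"
    parts = dict(item.split("=", 1) for item in signature_header.split(","))
    payload = parts["t"].encode() + b"." + raw_body
    expected = hmac.new(signing_secret.encode(), payload, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing attacks
    return hmac.compare_digest(expected, parts["v1"])
```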

## 4. Setup configuration in Strapi

In Strapi, visit the Settings page and navigate to the `MUX VIDEO UPLOADER` section.

Using the details saved in the previous step, fill in the fields with the appropriate values.

<Image src="/docs/images/strapi-config.png" width={1600} height={819} />

Click the "Save" button to persist the changes.

## 5. Upload video

Use the Mux Video Uploader page that is now available in Strapi's menu to upload either with a remote URL or directly using a local video file.

<Image src="/docs/images/strapi-upload.png" width={1600} height={819} />

From here, relationships of Mux assets can be modeled to custom collection types within Strapi to tie metadata with playable content.

<Player playbackId="kBwjf6dl6028FjELH4p0297dLcPM6AvQ1D" thumbnailTime="0" title="Strapi - Mux Video Uploader - Upload" />

<Callout type="success" title="Congratulations!">
  You now have the ability to upload content to Mux through Strapi CMS!
</Callout>

At this point, querying Strapi using REST or GraphQL will give you access to the `playback_id` information. This `playback_id` can be used by your client applications to stream content or retrieve thumbnails.
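
As a minimal sketch (assuming the asset has a public playback policy; signed assets also need a `?token=` parameter, covered under "Signed tokens" below), your client can turn a `playback_id` into playback and thumbnail URLs like this:

```javascript
// Sketch: build streaming and thumbnail URLs from a playback_id returned
// by a Strapi REST/GraphQL query. Assumes a public playback policy.
function muxUrls(playbackId) {
  return {
    stream: `https://stream.mux.com/${playbackId}.m3u8`,
    thumbnail: `https://image.mux.com/${playbackId}/thumbnail.jpg`,
  };
}

const { stream, thumbnail } = muxUrls('aAqXNee00zlfzR2Rsw01NmGBvxSg1Ocs3g008YChvtG6aM');
console.log(stream);
console.log(thumbnail);
```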

## 6. Explore advanced options

## Signed tokens

<Callout type="warning" title="Warning! Requires generating JWT on your server">
  Enabling signed URLs in Strapi will require you to generate your own signing tokens on your application **server**. This involves creating a signing key and using that to generate JSON web tokens when you want to access your videos and thumbnails outside of Strapi.
</Callout>

By default, all assets uploaded to Mux through Strapi will be created with a playback policy of `"public"`. This means that your videos and thumbnails are accessible with `https://stream.mux.com/{PLAYBACK_ID}.m3u8` and `https://image.mux.com/{PLAYBACK_ID}/thumbnail.jpg`.

If you want more control over playback and thumbnail access, you can enable signed URLs in the Strapi settings for the Mux Video Uploader.

When you enable this feature, the following things will happen:

1. The Mux Plugin in Strapi will save the signing keys that you've generated, and they will be available immediately.
2. Any Assets that get created with the Signed Playback URL setting enabled will be created with `playback_policy: "signed"` (instead of `"public"`).
3. The signing key from Step 1 will be used by the Mux Plugin to preview content inside the Strapi UI.
4. When you access your content in your own application, check the `MuxAsset.signed` property to determine whether the asset is signed (`true`) or not (`false`).

```json
{
  "id": 9,
  "upload_id": null,
  "asset_id": "H9H01yni83yRLuu6cKaf8jQI8XW01SPp5XI7WrGsD37n00",
  "playback_id": "aAqXNee00zlfzR2Rsw01NmGBvxSg1Ocs3g008YChvtG6aM",
  "signed": true,
  "isReady": true,
  "duration": 25.492133,
  "aspect_ratio": "16:9",
  "createdAt": "2024-04-01T23:48:19.760Z",
  "updatedAt": "2024-04-01T23:48:21.605Z"
}
```

5. You should use the signed `playback_id` to create URLs for playback and for thumbnail generation.

* Playback `https://stream.mux.com/{SIGNED_PLAYBACK_ID}.m3u8?token={TOKEN}`
* Thumbnails `https://image.mux.com/{SIGNED_PLAYBACK_ID}/thumbnail.jpg?token={TOKEN}`

6. The `TOKEN` parameter for the above URLs is something you create on your server, according to Step 2 in [Secure video playback](/docs/guides/secure-video-playback).

Note that in the Strapi UI, assets that use a signed URL display a lock icon in the asset list.

## Encoding Tiers

When uploading a new video, you can select which Encoding Tier is used when preparing the Asset. Possible selections are `Smart` and `Baseline`. When you choose `Smart`, additional options become available for the maximum stream resolution (1080p, 2K, or 4K).

More details can be found in our [Use Encoding Tiers](/docs/guides/use-video-quality-levels) section.

## Static Renditions

When using the `Smart` Encoding Tier, an option to enable downloadable MP4s will be available. This option will create [Static Renditions](/docs/guides/enable-static-mp4-renditions) for the Asset and will make MP4 files available for download to client devices using a formatted URL.

## Captions/Subtitles

With Mux's auto-generated captions, editors can easily add captions to videos being uploaded from Strapi. When using the "Auto-generated" option, Mux will generate captions automatically while it prepares the Asset. More details can be found in the [Add auto-generated captions to your videos and use transcripts](/docs/guides/add-autogenerated-captions-and-use-transcripts) section of our documentation.

If you choose to upload a "Custom" captions file (supported formats are `.vtt` and `.srt`), your file will be uploaded to your instance of Strapi, and Mux will pull it via a public URL from your Strapi instance. Take a look at our [Add subtitles/captions to videos](/docs/guides/add-subtitles-to-your-videos) guide for more details.

<GuideCard title="Set up playback" description="Set up your iOS application, Android application or web application to start playing your Mux assets" links={[{ title: 'Read the guide', href: '/docs/guides/play-your-videos' }]} />

<GuideCard title="Preview your video" description="Now that you have Mux assets, build rich experiences into your application by extracting images from your videos" links={[{ title: 'Read the guide', href: '/docs/guides/get-images-from-a-video' }]} />

<GuideCard title="Integrate Mux Data" description="Add the Mux Data SDK to your player and start collecting playback performance metrics." links={[{ title: 'Read the guide', href: '/docs/guides/track-your-video-performance' }]} />

## Release notes

### Current release

#### v3.0.1

* Upgrade to Strapi v5
* **Breaking change**: Plugin configuration is now done using Strapi's file-based config. Details on how to do this can be found in the `README.md`.
* Refreshed library versions to latest
* Both auto-generated and custom captions can be added on upload
* Resolved a previous issue with the preview player; the plugin now uses Mux Player again


# Integrate with Cosmic
With the Mux Video integration for Cosmic JS, you can upload videos directly to Mux from your Cosmic JS Dashboard.
## 1. Install the Mux Extension

Log in to your Cosmic JS account and navigate to Your Bucket > Settings > Extensions. Click the Extensions tab and find the Mux Videos Extension. Hit Install.

<Image src="/docs/images/cosmic-install.png" width={1304} height={828} />

## 2. Enter Mux credentials

After installing, you will be redirected to the Extension settings page. Under Query Parameters, provide the API credentials for your Mux account (`mux_access_token`, `mux_secret`).

<Image src="/docs/images/cosmic-credentials.png" width={1304} height={828} />

If you need to generate a new Access Token, go to the Access Token settings of your Mux account dashboard.

<Image src="/docs/images/settings-api-access-tokens.png" width={500} height={500} />

The access token should have **Read** and **Write** permissions for Mux Video.

<Image src="/docs/images/new-access-token.png" width={760} height={376} alt="Mux Video access token permissions" sm />

Go back to the Cosmic Extensions setting page, enter your Mux credentials, and save your Extension.

## 3. Upload video

After installing the Extension and setting your Mux account keys, click the Mux Videos Extension link in the left-hand nav. Next, upload your videos.

<Image src="/docs/images/cosmic-upload.gif" width={1273} height={781} />

The Extension saves the uploaded video data to the Mux Videos Object Type. Now you can add your Mux Videos to any Object using an Object metafield. Then you can fetch Mux data into your application by using the `mux_playback_url` property located in the Object metadata.
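
Since the `mux_playback_url` follows the standard `https://stream.mux.com/{PLAYBACK_ID}.m3u8` format, you can also derive a thumbnail URL from it. A minimal sketch, assuming that URL shape (the helper name here is ours, not part of the Extension):

```javascript
// Sketch: derive an image.mux.com thumbnail URL from the mux_playback_url
// metadata field. Assumes the standard stream.mux.com/{PLAYBACK_ID}.m3u8 shape.
function thumbnailFromPlaybackUrl(muxPlaybackUrl) {
  const match = muxPlaybackUrl.match(/stream\.mux\.com\/([^/.]+)\.m3u8/);
  if (!match) throw new Error('Unrecognized Mux playback URL');
  return `https://image.mux.com/${match[1]}/thumbnail.jpg`;
}

console.log(thumbnailFromPlaybackUrl('https://stream.mux.com/abc123.m3u8'));
// https://image.mux.com/abc123/thumbnail.jpg
```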

<Image src="/docs/images/cosmic-edit.gif" width={1272} height={827} />

## 4. Playback

To retrieve your video for playback, check out the [Cosmic docs](https://www.cosmicjs.com/articles/mux-videos-extension-overview-jqpvec5d) to see how to add the Mux playback URL to your HTML Video player.

<GuideCard title="Set up playback" description="Set up your iOS application, Android application or web application to start playing your Mux assets" links={[{ title: 'Read the guide', href: '/docs/guides/play-your-videos' }]} />

<GuideCard title="Preview your video" description="Now that you have Mux assets, build rich experiences into your application by extracting images from your videos" links={[{ title: 'Read the guide', href: '/docs/guides/get-images-from-a-video' }]} />

<GuideCard title="Integrate Mux Data" description="Add the Mux Data SDK to your player and start collecting playback performance metrics." links={[{ title: 'Read the guide', href: '/docs/guides/track-your-video-performance' }]} />


# Integrate with DatoCMS
With every DatoCMS project you also get a native integration with Mux without any manual intervention.
Mux is by default enabled in every new DatoCMS project. The integration allows you to upload videos directly from DatoCMS dashboard or using the [REST API](https://www.datocms.com/docs/content-management-api). The CMS interface will then allow you to use the videos in the content, while on the API side you’ll be able to retrieve the Mux Video URLs and the thumbnail.

## 1. Upload video

Just drag and drop a video in DatoCMS media area, like this:

<Image src="/docs/images/datocms-upload.gif" width={640} height={360} />

## 2. Fetch video information via GraphQL

For every video that you upload, you can get on the API a custom video object with the following properties:

* HLS video streaming URL.
* High-, medium-, and low-quality MP4 versions of the video to support legacy browsers that do not support HLS.
* Duration and frame rate of the video.
* Thumbnail URL: resizable, croppable, and available in JPEG, PNG, and GIF formats.

See the full page of this embedded example [here in the GraphQL explorer](https://cda-explorer.datocms.com/?embed=\&apitoken=faeb9172e232a75339242faafb9e56de8c8f13b735f7090964\&query=%7B%0A%20%20allUploads%28filter%3A%20%7Btype%3A%20%7Beq%3A%20video%7D%2C%20resolution%3A%20%7B%7D%2C%20smartTags%3A%20%7B%7D%7D%29%20%7B%0A%20%20%20%20video%20%7B%0A%20%20%20%20%20%20streamingUrl%0A%20%20%20%20%20%20mp4High%3A%20mp4Url%28res%3A%20high%29%0A%20%20%20%20%20%20mp4Med%3A%20mp4Url%28res%3A%20medium%29%0A%20%20%20%20%20%20mp4Low%3A%20mp4Url%28res%3A%20low%29%0A%20%20%20%20%20%20duration%0A%20%20%20%20%20%20framerate%0A%20%20%20%20%20%20thumbJpg%3A%20thumbnailUrl%28format%3A%20jpg%29%0A%20%20%20%20%20%20thumbPng%3A%20thumbnailUrl%28format%3A%20png%29%0A%20%20%20%20%20%20thumbGif%3A%20thumbnailUrl%28format%3A%20gif%29%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A).

<iframe src="https://cda-explorer.datocms.com/?embed=&apitoken=faeb9172e232a75339242faafb9e56de8c8f13b735f7090964&query=%7B%0A%20%20allUploads%28filter%3A%20%7Btype%3A%20%7Beq%3A%20video%7D%2C%20resolution%3A%20%7B%7D%2C%20smartTags%3A%20%7B%7D%7D%29%20%7B%0A%20%20%20%20video%20%7B%0A%20%20%20%20%20%20streamingUrl%0A%20%20%20%20%20%20mp4High%3A%20mp4Url%28res%3A%20high%29%0A%20%20%20%20%20%20mp4Med%3A%20mp4Url%28res%3A%20medium%29%0A%20%20%20%20%20%20mp4Low%3A%20mp4Url%28res%3A%20low%29%0A%20%20%20%20%20%20duration%0A%20%20%20%20%20%20framerate%0A%20%20%20%20%20%20thumbJpg%3A%20thumbnailUrl%28format%3A%20jpg%29%0A%20%20%20%20%20%20thumbPng%3A%20thumbnailUrl%28format%3A%20png%29%0A%20%20%20%20%20%20thumbGif%3A%20thumbnailUrl%28format%3A%20gif%29%0A%20%20%20%20%7D%0A%20%20%7D%0A%7D%0A" title="CDA GraphQL Explorer | DatoCMS" width="100%" height="500px" style={{ border: "none" }} />
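
For reference, the query behind the embedded explorer above looks like this (empty filter clauses omitted):

```gql
{
  allUploads(filter: { type: { eq: video } }) {
    video {
      streamingUrl
      mp4High: mp4Url(res: high)
      mp4Med: mp4Url(res: medium)
      mp4Low: mp4Url(res: low)
      duration
      framerate
      thumbJpg: thumbnailUrl(format: jpg)
      thumbPng: thumbnailUrl(format: png)
      thumbGif: thumbnailUrl(format: gif)
    }
  }
}
```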

<GuideCard title="Set up playback" description="Set up your iOS application, Android application or web application to start playing your Mux assets" links={[{ title: 'Read the guide', href: '/docs/guides/play-your-videos' }]} />

<GuideCard title="Preview your video" description="Now that you have Mux assets, build rich experiences into your application by extracting images from your videos" links={[{ title: 'Read the guide', href: '/docs/guides/get-images-from-a-video' }]} />

<GuideCard title="Integrate Mux Data" description="Add the Mux Data SDK to your player and start collecting playback performance metrics." links={[{ title: 'Read the guide', href: '/docs/guides/track-your-video-performance' }]} />


# Integrate with Prepr
Prepr is a headless content management system created for adaptive websites. Prepr works with Mux out of the box. No configuration needed.
Mux is enabled for every new Prepr account by default. You can upload your videos to Prepr, add them to a content model and query their URLs to display them on your website. Follow the steps below to get started.

# Using Mux with Prepr

## Upload video content to Prepr

1. [Create a free Prepr account](https://signup.prepr.io/) before you get started.
2. Log in to Prepr and navigate to the *Media* page.
3. Drag and drop audio/video files from your local folders directly onto the *Media* page.

<Image src="/docs/images/prepr-demo-1.png" width={800} height={450} />

Once uploaded, the videos are ready to be used in content items.

## Add live streams to Prepr

1. Navigate to the *Media* page.
2. Click the *Upload asset* dropdown and choose the *Add live stream* option.

<Image src="/docs/images/prepr-demo-2.png" width={800} height={450} />

3. Enter the broadcasting details as described in the [Start broadcasting a live stream](/docs/guides/start-live-streaming#3-start-broadcasting) guide.

The livestream asset is now ready to be used in content items with an asset field.

## Add videos to content items

Once your video(s) have been uploaded, you can add them to a content item. Follow the steps below to do this.

1. Navigate to the *Content* page.
2. Create a new content item or open one of your existing content items with an assets field.
3. Drag and drop audio/video files from your local folders directly into the field, or click the assets field to add a video you previously uploaded to the *Media* page.
4. Save or publish the content item to make the video available to the front-end application.

<Image src="/docs/images/prepr-demo-3.gif" width={800} height={450} />

## Querying the GraphQL API

Now you can query the URLs or playback IDs of your videos to embed them on your website.

To learn how to play video content on your website, follow Mux's [Play your videos](/docs/guides/play-your-videos) guide.

Your query could look something like the example below. In this example, `Posts` is the name of your content model and `videos` is the name of the assets field. It has various options:

* The HLS streaming URL is returned by default as the `url` field.
* The playback ID can be returned by using the `playback_id` field.
* You can use the `res` option to request MP4 versions in high, medium and/or low quality to support legacy browsers that do not support HLS.
* You can query the duration of video content using the `duration` option.
* The cover image can be requested using the `cover` field. It is adjustable using width, height, animated, and time arguments.

```gql
{
  Posts {
    items {
      videos {
        hls: url
        playback_id
        mp4High: url(res: "high")
        mp4Medium: url(res: "medium")
        mp4Low: url(res: "low")
        duration
        cover
      }
    }
  }
}
```

# Using additional Mux features in Prepr

## Static Renditions

By default, Prepr uses the [plus quality level](/docs/guides/use-video-quality-levels), and MP4 support is enabled on all accounts. This option will create Static Renditions for the Asset and will make MP4 files available for download to client devices using the `url` field.

## Captions/Subtitles

While editing an asset from the *Media* page, content editors can easily upload their own captions file (supported formats are `.vtt` and `.srt`) by clicking the **+ Add subtitles** link. Take a look at [Add subtitles/captions to videos](/docs/guides/add-subtitles-to-your-videos) for more details.

<GuideCard title="Set up playback" description="Set up your iOS application, Android application or web application to start playing your Mux assets" links={[{ title: 'Read the guide', href: '/docs/guides/play-your-videos' }]} />

<GuideCard title="Preview your video" description="Now that you have Mux assets, build rich experiences into your application by extracting images from your videos" links={[{ title: 'Read the guide', href: '/docs/guides/get-images-from-a-video' }]} />

<GuideCard title="Integrate Mux Data" description="Add the Mux Data SDK to your player and start collecting playback performance metrics." links={[{ title: 'Read the guide', href: '/docs/guides/track-your-video-performance' }]} />


# Database schema design
Recommended database schemas for storing Mux video data in your application. Copy-paste schemas for PostgreSQL, MySQL, and MongoDB.
When you build a video application with Mux, the video infrastructure (encoding, storage, and delivery) is managed for you. Your application's database stores references to Mux resources (assets, uploads, webhook events) along with your own metadata like titles, user ownership, and visibility.

This guide provides recommended schemas you can copy directly into your database.

## What to store

Most Mux integrations need these tables:

| Table | Purpose |
|---|---|
| **mux\_assets** | Video assets synced from Mux, including status, playback IDs, duration, and resolution |
| **mux\_uploads** | Direct upload records that track upload status and the resulting asset ID |
| **mux\_webhook\_events** | Webhook event log for debugging and ensuring reliable event processing |
| **videos** (optional) | Your app-level metadata like title, description, user ownership, visibility, and tags |

The first three tables mirror data from the Mux API, kept in sync via [webhooks](/docs/core/listen-for-webhooks). The `videos` table is your application's layer on top, linking Mux assets to your users and adding metadata that Mux doesn't manage.

<Callout type="info">
  If you're using [Supabase](/docs/integrations/supabase) or [Convex](/docs/integrations/convex), these tables are created for you automatically. This guide is for teams managing their own database schema.
</Callout>

## Recommended schema

### Assets

The `mux_assets` table stores video asset data synced from Mux. This is the most important table because it tracks every video in your system.

```postgresql
CREATE TABLE mux_assets (
  mux_asset_id TEXT PRIMARY KEY,
  status TEXT NOT NULL DEFAULT 'preparing',
  playback_ids JSONB DEFAULT '[]',
  duration NUMERIC,
  aspect_ratio TEXT,
  resolution_tier TEXT,
  max_stored_frame_rate NUMERIC,
  video_quality TEXT,
  upload_id TEXT,
  live_stream_id TEXT,
  passthrough TEXT,
  tracks JSONB DEFAULT '[]',
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_mux_assets_status ON mux_assets (status);
CREATE INDEX idx_mux_assets_upload_id ON mux_assets (upload_id);
CREATE INDEX idx_mux_assets_created_at ON mux_assets (created_at);
```

```mysql
CREATE TABLE mux_assets (
  mux_asset_id VARCHAR(255) PRIMARY KEY,
  status VARCHAR(50) NOT NULL DEFAULT 'preparing',
  playback_ids JSON DEFAULT ('[]'),
  duration DECIMAL(12, 4),
  aspect_ratio VARCHAR(20),
  resolution_tier VARCHAR(20),
  max_stored_frame_rate DECIMAL(8, 4),
  video_quality VARCHAR(20),
  upload_id VARCHAR(255),
  live_stream_id VARCHAR(255),
  passthrough VARCHAR(255),
  tracks JSON DEFAULT ('[]'),
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

CREATE INDEX idx_mux_assets_status ON mux_assets (status);
CREATE INDEX idx_mux_assets_upload_id ON mux_assets (upload_id);
CREATE INDEX idx_mux_assets_created_at ON mux_assets (created_at);
```

```mongodb
db.createCollection("mux_assets");

db.mux_assets.createIndex({ mux_asset_id: 1 }, { unique: true });
db.mux_assets.createIndex({ status: 1 });
db.mux_assets.createIndex({ upload_id: 1 });
db.mux_assets.createIndex({ created_at: -1 });

// Example document structure:
// {
//   mux_asset_id: "abc123",
//   status: "ready",
//   playback_ids: [{ id: "playback123", policy: "public" }],
//   duration: 120.5,
//   aspect_ratio: "16:9",
//   resolution_tier: "1080p",
//   max_stored_frame_rate: 29.97,
//   video_quality: "plus",
//   upload_id: "upload123",
//   live_stream_id: null,
//   passthrough: "my-custom-id",
//   tracks: [
//     { id: "track1", type: "video", max_width: 1920, max_height: 1080 },
//     { id: "track2", type: "audio", max_channels: 2 }
//   ],
//   created_at: ISODate("2026-01-15T10:30:00Z"),
//   updated_at: ISODate("2026-01-15T10:35:00Z")
// }
```



#### Key fields

| Field | Description |
|---|---|
| `mux_asset_id` | The unique identifier from Mux. Use this as your primary key or unique constraint. |
| `status` | Asset lifecycle state: `preparing`, `ready`, or `errored`. Only serve videos with status `ready`. |
| `playback_ids` | Array of `{ id, policy }` objects. The `id` is used to construct playback URLs. `policy` is `public`, `signed`, or `drm`. |
| `duration` | Video length in seconds. |
| `aspect_ratio` | Format string like `16:9` or `4:3`. Useful for sizing players before the video loads. |
| `resolution_tier` | The highest resolution stored: `audio-only`, `720p`, `1080p`, `1440p`, or `2160p`. |
| `video_quality` | Encoding quality tier: `basic`, `plus`, or `premium`. |
| `upload_id` | Reference to the direct upload that created this asset, if applicable. |
| `live_stream_id` | Reference to the live stream that created this asset, if applicable. |
| `passthrough` | A string you set when creating the asset. Useful for storing your own external ID. |
| `tracks` | Array of video, audio, and text track metadata. |

### Uploads

The `mux_uploads` table tracks [direct uploads](/docs/guides/upload-files-directly) from your users' browsers. When an upload completes, Mux creates an asset and sets the `mux_asset_id`.

```postgresql
CREATE TABLE mux_uploads (
  mux_upload_id TEXT PRIMARY KEY,
  status TEXT NOT NULL DEFAULT 'waiting',
  mux_asset_id TEXT REFERENCES mux_assets(mux_asset_id),
  timeout INTEGER DEFAULT 3600,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_mux_uploads_status ON mux_uploads (status);
CREATE INDEX idx_mux_uploads_asset_id ON mux_uploads (mux_asset_id);
```

```mysql
CREATE TABLE mux_uploads (
  mux_upload_id VARCHAR(255) PRIMARY KEY,
  status VARCHAR(50) NOT NULL DEFAULT 'waiting',
  mux_asset_id VARCHAR(255),
  timeout INTEGER DEFAULT 3600,
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  FOREIGN KEY (mux_asset_id) REFERENCES mux_assets(mux_asset_id)
);

CREATE INDEX idx_mux_uploads_status ON mux_uploads (status);
CREATE INDEX idx_mux_uploads_asset_id ON mux_uploads (mux_asset_id);
```

```mongodb
db.createCollection("mux_uploads");

db.mux_uploads.createIndex({ mux_upload_id: 1 }, { unique: true });
db.mux_uploads.createIndex({ status: 1 });
db.mux_uploads.createIndex({ mux_asset_id: 1 });

// Example document structure:
// {
//   mux_upload_id: "upload123",
//   status: "asset_created",
//   mux_asset_id: "abc123",
//   timeout: 3600,
//   created_at: ISODate("2026-01-15T10:30:00Z"),
//   updated_at: ISODate("2026-01-15T10:30:00Z")
// }
```



#### Key fields

| Field | Description |
|---|---|
| `mux_upload_id` | The unique identifier for the direct upload. |
| `status` | Upload state: `waiting`, `asset_created`, `errored`, `cancelled`, or `timed_out`. |
| `mux_asset_id` | The asset created from this upload. Only set after status becomes `asset_created`. |
| `timeout` | How long the upload URL remains valid, in seconds (default: 3600). |

### Webhook events

The `mux_webhook_events` table logs every [webhook](/docs/core/listen-for-webhooks) received from Mux. This is essential for debugging sync issues and ensuring you don't process the same event twice.

```postgresql
CREATE TABLE mux_webhook_events (
  id SERIAL PRIMARY KEY,
  mux_event_id TEXT UNIQUE,
  type TEXT NOT NULL,
  object_type TEXT,
  object_id TEXT,
  payload JSONB NOT NULL,
  received_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_mux_events_type ON mux_webhook_events (type);
CREATE INDEX idx_mux_events_object ON mux_webhook_events (object_type, object_id);
```

```mysql
CREATE TABLE mux_webhook_events (
  id INT AUTO_INCREMENT PRIMARY KEY,
  mux_event_id VARCHAR(255) UNIQUE,
  type VARCHAR(255) NOT NULL,
  object_type VARCHAR(50),
  object_id VARCHAR(255),
  payload JSON NOT NULL,
  received_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX idx_mux_events_type ON mux_webhook_events (type);
CREATE INDEX idx_mux_events_object ON mux_webhook_events (object_type, object_id);
```

```mongodb
db.createCollection("mux_webhook_events");

db.mux_webhook_events.createIndex({ mux_event_id: 1 }, { unique: true, sparse: true });
db.mux_webhook_events.createIndex({ type: 1 });
db.mux_webhook_events.createIndex({ object_type: 1, object_id: 1 });
db.mux_webhook_events.createIndex({ received_at: -1 });

// Example document structure:
// {
//   mux_event_id: "evt_abc123",
//   type: "video.asset.ready",
//   object_type: "asset",
//   object_id: "abc123",
//   payload: { /* full webhook body */ },
//   received_at: ISODate("2026-01-15T10:35:00Z")
// }
```



#### Key fields

| Field | Description |
|---|---|
| `mux_event_id` | Unique event ID from Mux. Use this to deduplicate by checking if the event already exists before processing. |
| `type` | Event type string, like `video.asset.ready` or `video.upload.asset_ready`. |
| `object_type` | The resource type: `asset`, `upload`, or `live_stream`. |
| `object_id` | The ID of the related resource. |
| `payload` | The full webhook body. Store this as JSON so you can re-process events if your handler logic changes. |
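
For example, in PostgreSQL the unique constraint on `mux_event_id` lets you insert and deduplicate in a single statement; a sketch with placeholder values:

```postgresql
-- Insert the event, skipping it if this mux_event_id was already logged.
-- Process the event only when the INSERT actually returns a row.
INSERT INTO mux_webhook_events (mux_event_id, type, object_type, object_id, payload)
VALUES ('evt_abc123', 'video.asset.ready', 'asset', 'abc123', '{"type": "video.asset.ready"}')
ON CONFLICT (mux_event_id) DO NOTHING
RETURNING id;
```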

### Videos (application metadata)

The `videos` table is optional but recommended. It stores your application-specific metadata, linking Mux assets to your users and adding information that Mux doesn't manage.

```postgresql
CREATE TABLE videos (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  mux_asset_id TEXT NOT NULL REFERENCES mux_assets(mux_asset_id),
  user_id TEXT NOT NULL,
  title TEXT,
  description TEXT,
  visibility TEXT NOT NULL DEFAULT 'private',
  tags TEXT[] DEFAULT '{}',
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_videos_mux_asset_id ON videos (mux_asset_id);
CREATE INDEX idx_videos_user_id ON videos (user_id);
CREATE INDEX idx_videos_visibility ON videos (visibility);
```

```mysql
CREATE TABLE videos (
  id CHAR(36) PRIMARY KEY DEFAULT (UUID()),
  mux_asset_id VARCHAR(255) NOT NULL,
  user_id VARCHAR(255) NOT NULL,
  title TEXT,
  description TEXT,
  visibility VARCHAR(20) NOT NULL DEFAULT 'private',
  tags JSON DEFAULT ('[]'),
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  FOREIGN KEY (mux_asset_id) REFERENCES mux_assets(mux_asset_id)
);

CREATE INDEX idx_videos_mux_asset_id ON videos (mux_asset_id);
CREATE INDEX idx_videos_user_id ON videos (user_id);
CREATE INDEX idx_videos_visibility ON videos (visibility);
```

```mongodb
db.createCollection("videos");

db.videos.createIndex({ mux_asset_id: 1 });
db.videos.createIndex({ user_id: 1 });
db.videos.createIndex({ visibility: 1 });
db.videos.createIndex({ mux_asset_id: 1, user_id: 1 }, { unique: true });

// Example document structure:
// {
//   mux_asset_id: "abc123",
//   user_id: "user_456",
//   title: "My First Video",
//   description: "A short description",
//   visibility: "public",
//   tags: ["tutorial", "getting-started"],
//   created_at: ISODate("2026-01-15T10:30:00Z"),
//   updated_at: ISODate("2026-01-15T10:30:00Z")
// }
```



#### Key fields

| Field | Description |
|---|---|
| `mux_asset_id` | Foreign key to the `mux_assets` table. Links your metadata to the Mux video. |
| `user_id` | Reference to the user in your application who owns this video. |
| `title` / `description` | User-facing metadata for display in your UI. |
| `visibility` | Access control: `public`, `unlisted`, or `private`. Enforce this in your application logic. |
| `tags` | Categorization for filtering and search. |
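
A typical read path joins the two tables, serving only videos that are both publicly visible and ready to play; a PostgreSQL sketch:

```postgresql
-- List public, playable videos with the data needed to render a player.
SELECT v.id, v.title, a.playback_ids, a.duration, a.aspect_ratio
FROM videos v
JOIN mux_assets a ON a.mux_asset_id = v.mux_asset_id
WHERE v.visibility = 'public'
  AND a.status = 'ready'
ORDER BY v.created_at DESC
LIMIT 20;
```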

## Keeping data in sync

Use [Mux webhooks](/docs/core/listen-for-webhooks) to keep your database in sync with Mux. When Mux finishes processing a video, delivers an upload, or encounters an error, it sends a webhook to your server.

The key events to listen for:

| Event | Action |
|---|---|
| `video.asset.created` | Insert a new row in `mux_assets` with status `preparing` |
| `video.asset.ready` | Update the asset's status to `ready` and populate `playback_ids`, `duration`, `tracks` |
| `video.asset.errored` | Update the asset's status to `errored` |
| `video.asset.deleted` | Delete or mark the asset as deleted |
| `video.upload.asset_ready` | Update the upload's `mux_asset_id` and status |
| `video.upload.errored` | Update the upload's status to `errored` |

Here's a simplified example of a webhook handler that syncs asset data:

```javascript
// Example: Webhook handler for syncing Mux assets
app.post('/webhooks/mux', async (req, res) => {
  const event = req.body;

  // 1. Log the event for debugging, skipping duplicates
  //    (mux_event_id has a unique constraint)
  const alreadySeen = await db.muxWebhookEvents.findOne({
    mux_event_id: event.id,
  });
  if (alreadySeen) {
    return res.status(200).send('ok');
  }
  await db.muxWebhookEvents.create({
    mux_event_id: event.id,
    type: event.type,
    object_type: event.object?.type,
    object_id: event.object?.id,
    payload: event,
  });

  // 2. Sync the resource
  if (event.type.startsWith('video.asset.')) {
    const asset = event.data;
    await db.muxAssets.upsert({
      mux_asset_id: asset.id,
      status: asset.status,
      playback_ids: asset.playback_ids,
      duration: asset.duration,
      aspect_ratio: asset.aspect_ratio,
      resolution_tier: asset.resolution_tier,
      tracks: asset.tracks,
      passthrough: asset.passthrough,
    });
  }

  res.status(200).send('ok');
});
```

<Callout type="info">
  Always verify webhook signatures in production to ensure events are genuinely from Mux. See the [webhooks guide](/docs/core/listen-for-webhooks) for details on signature verification.
</Callout>

## Extending for live streaming

If your application supports live streaming, add a `mux_live_streams` table to track stream configurations:

```postgresql
CREATE TABLE mux_live_streams (
  mux_live_stream_id TEXT PRIMARY KEY,
  status TEXT NOT NULL DEFAULT 'idle',
  stream_key TEXT,
  playback_ids JSONB DEFAULT '[]',
  reconnect_window_seconds INTEGER DEFAULT 60,
  active_asset_id TEXT,
  recent_asset_ids JSONB DEFAULT '[]',
  latency_mode TEXT DEFAULT 'standard',
  passthrough TEXT,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_mux_live_streams_status ON mux_live_streams (status);
```

```mysql
CREATE TABLE mux_live_streams (
  mux_live_stream_id VARCHAR(255) PRIMARY KEY,
  status VARCHAR(50) NOT NULL DEFAULT 'idle',
  stream_key VARCHAR(255),
  playback_ids JSON DEFAULT ('[]'),
  reconnect_window_seconds INTEGER DEFAULT 60,
  active_asset_id VARCHAR(255),
  recent_asset_ids JSON DEFAULT ('[]'),
  latency_mode VARCHAR(20) DEFAULT 'standard',
  passthrough VARCHAR(255),
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

CREATE INDEX idx_mux_live_streams_status ON mux_live_streams (status);
```

```mongodb
db.createCollection("mux_live_streams");

db.mux_live_streams.createIndex({ mux_live_stream_id: 1 }, { unique: true });
db.mux_live_streams.createIndex({ status: 1 });

// Example document structure:
// {
//   mux_live_stream_id: "stream123",
//   status: "idle",
//   stream_key: "sk_us-east-1_abc...",
//   playback_ids: [{ id: "playback123", policy: "public" }],
//   reconnect_window_seconds: 60,
//   active_asset_id: null,
//   recent_asset_ids: ["asset1", "asset2"],
//   latency_mode: "standard",
//   passthrough: null,
//   created_at: ISODate("2026-01-15T10:30:00Z"),
//   updated_at: ISODate("2026-01-15T10:30:00Z")
// }
```



<Callout type="warning">
  The `stream_key` is sensitive. Treat it like a password. Anyone with the stream key can broadcast to your live stream. Store it encrypted or restrict access in your application.
</Callout>

## Next steps

* [Integrate with Supabase](/docs/integrations/supabase): managed Postgres with automatic Mux sync
* [Integrate with Convex](/docs/integrations/convex): real-time database with built-in Mux component
* [Listen for webhooks](/docs/core/listen-for-webhooks): set up webhook handling for real-time sync
* [Secure video playback](/docs/guides/secure-video-playback): protect your content with signed playback IDs
* [API reference](/docs/api-reference): full details on all Mux resource fields


# Integrate with Supabase
The Mux Supabase integration syncs your Mux assets, live streams, and uploads to your Supabase database using webhooks and edge functions.
The `@mux/supabase` package provides a CLI that integrates your Mux account with Supabase. It creates a `mux` schema in your database with tables for assets, live streams, uploads, and webhook events, keeping everything in sync automatically.

## What you'll get

After setup, your Supabase database will have a `mux` schema containing:

* **assets** - Mux video assets and their metadata
* **live\_streams** - Live stream configurations and status
* **uploads** - Direct upload records
* **webhook\_events** - Webhook event history for debugging and auditing

## Prerequisites

Before you begin, make sure you have:

* A [Mux account](https://dashboard.mux.com/signup) with API credentials
* A [Supabase project](https://supabase.com/dashboard)
* The [Supabase CLI](https://supabase.com/docs/guides/cli) installed

## 1. Initialize the integration

First, make sure Supabase is initialized in your project, then run the Mux initialization:

```bash
npx supabase init
npx @mux/supabase init
```

The initialization process will:

1. Create a `mux` schema with tables for assets, live streams, uploads, and webhook events
2. Generate an edge function at `supabase/functions/mux-webhook`
3. Create a `.env` file at `supabase/functions/.env` with placeholder values for your Mux credentials
4. Update `config.toml` to expose the `mux` schema and disable JWT verification for the webhook function

## 2. Configure environment variables

After initialization, update the `.env` file in the `supabase/functions` directory with your actual Mux API credentials:

```bash
MUX_TOKEN_ID=your-mux-token-id
MUX_TOKEN_SECRET=your-mux-token-secret
MUX_WEBHOOK_SECRET=your-webhook-secret
```

You can find your API credentials in the [Mux Dashboard](https://dashboard.mux.com/settings/access-tokens). The webhook secret will be generated when you configure the webhook in step 4.

## 3. Local development

Start your local Supabase instance:

```bash
npx supabase start
```

To test the webhook locally, serve the edge function:

```bash
npx supabase functions serve mux-webhook
```

<Callout type="info">
  To receive webhooks locally, you'll need to expose your local server using a tool like [ngrok](https://ngrok.com/). Configure the ngrok URL as your webhook endpoint in the Mux dashboard.
</Callout>

## 4. Configure the Mux webhook

In the [Mux Dashboard](https://dashboard.mux.com/settings/webhooks), create a new webhook with the following settings:

* **URL**: Your Supabase edge function URL (e.g., `https://your-project.supabase.co/functions/v1/mux-webhook`)
* **Events**: Select the events you want to sync (recommended: all video asset and live stream events)

After creating the webhook, copy the signing secret and add it to your environment variables as `MUX_WEBHOOK_SECRET`.
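The generated edge function verifies this signature for you, but it helps to know what it's checking: Mux sends a `Mux-Signature` header of the form `t=<timestamp>,v1=<hex digest>`, where the digest is an HMAC-SHA256 of `<timestamp>.<raw request body>` computed with your signing secret. A minimal verification sketch using Node's `crypto` module (the header parsing is simplified for illustration):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verify a Mux webhook signature (header format: "t=<unix ts>,v1=<hex hmac>").
// Mux signs the string "<timestamp>.<raw request body>" with your webhook secret.
function verifyMuxSignature(rawBody: string, signatureHeader: string, secret: string): boolean {
  const parts = Object.fromEntries(
    signatureHeader.split(",").map((kv) => kv.split("=") as [string, string])
  );
  const timestamp = parts["t"];
  const expected = parts["v1"];
  if (!timestamp || !expected) return false;
  const computed = createHmac("sha256", secret)
    .update(`${timestamp}.${rawBody}`)
    .digest("hex");
  const a = Buffer.from(computed, "hex");
  const b = Buffer.from(expected, "hex");
  // Constant-time comparison to avoid leaking the digest via timing.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

A production handler would also reject stale timestamps to guard against replay attacks.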

## 5. Backfill existing data

If you have existing assets in Mux that you want to sync to Supabase, run the backfill command:

```bash
npx @mux/supabase backfill
```

This will import all your existing Mux assets and live streams into your Supabase database.

### Programmatic backfill

You can also run the backfill programmatically using the sync engine:

```typescript
import { MuxSync } from '@mux/sync-engine';

const muxSync = new MuxSync({
  databaseUrl: 'your-supabase-database-url',
  muxTokenId: 'your-mux-token-id',
  muxTokenSecret: 'your-mux-token-secret',
  muxWebhookSecret: 'your-mux-webhook-secret',
});

const result = await muxSync.syncBackfill({ object: 'all' });
```

## 6. Deploy to production

When you're ready to deploy:

1. **Push database migrations**:
   ```bash
   npx supabase db push
   ```

2. **Set production secrets**:
   ```bash
   npx supabase secrets set MUX_TOKEN_ID=your-token-id
   npx supabase secrets set MUX_TOKEN_SECRET=your-token-secret
   npx supabase secrets set MUX_WEBHOOK_SECRET=your-webhook-secret
   ```

3. **Deploy the edge function**:
   ```bash
   npx supabase functions deploy mux-webhook
   ```

4. **Update your Mux webhook URL** in the dashboard to point to your production edge function URL.

## Querying your data

Once synced, you can query your Mux data directly from Supabase. The `mux` schema uses Row Level Security (RLS) that blocks access from `anon` and `authenticated` roles by default, so you'll need to use the service role key for server-side queries:

```typescript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  'your-supabase-url',
  'your-supabase-service-role-key',
  { db: { schema: 'mux' } }
);

// Get all ready assets
const { data: assets } = await supabase
  .from('assets')
  .select('*')
  .eq('status', 'ready');

// Get a specific asset by ID
const { data: asset } = await supabase
  .from('assets')
  .select('*')
  .eq('id', 'asset-id')
  .single();
```

<Callout type="warning">
  The service role key bypasses RLS and should only be used in server-side code. Never expose it in client-side applications. If you need client-side access to Mux data, add your own RLS policies to the tables in the `mux` schema.
</Callout>

## Using with PostgreSQL

If you're using PostgreSQL without Supabase, you can use the `@mux/sync-engine` package directly:

```bash
npm install @mux/sync-engine
```

The sync engine works with any PostgreSQL database and provides the same synchronization capabilities.

## Resources

* [GitHub repository](https://github.com/muxinc/supabase)
* [Supabase documentation](https://supabase.com/docs)
* [Mux API reference](/docs/api-reference)


# Integrate with Convex
The Mux Convex component syncs your Mux assets, live streams, and uploads to your Convex database with real-time updates via webhooks.
The `@mux/convex` package is a reusable Convex component that bridges Mux video services with the Convex backend. It provides database tables for Mux resources, webhook handling, and query helpers for building video applications.

## What you'll get

After setup, your Convex database will have tables for:

* **assets** - Mux video assets and their metadata
* **uploads** - Direct upload records
* **liveStreams** - Live stream configurations and status
* **events** - Webhook event history
* **videoMetadata** - App-level metadata (user ID, title, description, visibility, tags, custom fields)

## Prerequisites

Before you begin, make sure you have:

* A [Mux account](https://dashboard.mux.com/signup) with API credentials
* A [Convex project](https://dashboard.convex.dev) with a `convex/` directory already initialized
* The Convex CLI installed (`npm install -g convex`)
* A TypeScript web application (e.g., [Next.js](https://docs.convex.dev/quickstart/nextjs), [React](https://docs.convex.dev/quickstart/react), or any [Convex-supported framework](https://docs.convex.dev/quickstarts))

<Callout type="info">
  New to Convex? Follow the [Convex quickstart](https://docs.convex.dev/quickstarts) for your framework first, then come back here to add Mux integration.
</Callout>

## 1. Install packages

Install the required packages in your project:

```bash
npm install convex @mux/convex @mux/mux-node
```

## 2. Generate project files

Run the initialization script to generate the required Convex files:

```bash
npx @mux/convex init --component-name mux
```

This creates four files in your `convex` directory:

* `convex.config.ts` - Mounts the Mux component into your Convex app
* `migrations.ts` - Backfill function for syncing existing Mux assets
* `muxWebhook.ts` - Webhook handler for receiving Mux events
* `http.ts` - HTTP route that exposes the webhook endpoint

If any of these files already exist, the CLI will skip them. Use `--force` to overwrite existing files.

<Callout type="info">
  If you already have a `convex/convex.config.ts` or `convex/http.ts`, use `--skip-config` or `--skip-http` to skip those files and manually integrate the Mux component into your existing configuration.
</Callout>

## 3. Set environment variables

Configure your Mux API credentials in Convex:

```bash
npx convex env set MUX_TOKEN_ID your-mux-token-id
npx convex env set MUX_TOKEN_SECRET your-mux-token-secret
```

You can find your API credentials in the [Mux Dashboard](https://dashboard.mux.com/settings/access-tokens).

## 4. Start development and backfill

Start the Convex development server:

```bash
npx convex dev
```

In another terminal, run the backfill migration to sync your existing Mux assets:

```bash
npx convex run migrations:backfillMux '{}'
```

This will import your existing Mux assets into the Convex `assets` and `videoMetadata` tables. You can customize the backfill with options:

```bash
npx convex run migrations:backfillMux '{"maxAssets": 500, "defaultUserId": "my-user-id", "includeVideoMetadata": true}'
```

<Callout type="info">
  The backfill currently syncs Mux assets only. Live streams and uploads will be synced in real time once you configure the webhook in the next step.
</Callout>

## 5. Configure the Mux webhook

In the [Mux Dashboard](https://dashboard.mux.com/settings/webhooks), create a new webhook with your Convex HTTP endpoint as the URL:

```
https://your-deployment.convex.site/mux/webhook
```

Mux will send all events to this endpoint. The component automatically handles routing asset, live stream, and upload events to the appropriate tables and ignores unsupported event types.

After creating the webhook, copy the signing secret and add it to your Convex environment:

```bash
npx convex env set MUX_WEBHOOK_SECRET your-webhook-signing-secret
```

## Verify the setup

Check your [Convex dashboard](https://dashboard.convex.dev) to verify the tables are populated:

1. Navigate to your project's Data tab
2. You should see the Mux tables: `assets`, `uploads`, `liveStreams`, `events`, and `videoMetadata`
3. Upload a test video in Mux and verify it appears in your Convex database

## Querying your data

The component syncs Mux data into your Convex database via webhooks. To access that data, you wrap the component's built-in queries with your own Convex functions, then call those functions from your frontend.

### Define your queries

Create a new file in your `convex` directory (e.g., `convex/videoQueries.ts`) to wrap the component queries:

```typescript
import { query } from './_generated/server';
import { components } from './_generated/api';
import { v } from 'convex/values';

// Returns an array of asset objects, each containing:
// muxAssetId, status, playbackIds, durationSeconds, aspectRatio, tracks, etc.
export const listAssets = query({
  handler: async (ctx) => {
    return await ctx.runQuery(components.mux.catalog.listAssets, {});
  },
});

// Returns a single asset object or null
export const getAsset = query({
  args: { muxAssetId: v.string() },
  handler: async (ctx, args) => {
    return await ctx.runQuery(components.mux.catalog.getAssetByMuxId, {
      muxAssetId: args.muxAssetId,
    });
  },
});

// Returns an array of { metadata, asset } pairs for a given user
export const getUserVideos = query({
  args: { userId: v.string() },
  handler: async (ctx, args) => {
    return await ctx.runQuery(components.mux.videos.listVideosForUser, {
      userId: args.userId,
    });
  },
});
```

### Use queries in your frontend

Call these queries from your React components using Convex's `useQuery` hook:

```typescript
import { useQuery } from 'convex/react';
import { api } from '../convex/_generated/api';

function VideoList() {
  const assets = useQuery(api.videoQueries.listAssets);

  if (!assets) return <div>Loading...</div>;

  return (
    <ul>
      {assets.map((asset) => (
        <li key={asset.muxAssetId}>
          {asset.muxAssetId} — {asset.status}
        </li>
      ))}
    </ul>
  );
}
```

These queries are reactive — your UI will update automatically when the underlying data changes in Convex, such as when a webhook updates an asset's status.

### Available queries

The component exposes the following queries through `components.mux`:

| Query | Description |
|---|---|
| `catalog.listAssets` | List all synced assets, sorted by most recently updated |
| `catalog.getAssetByMuxId` | Get a single asset by its Mux asset ID |
| `catalog.listLiveStreams` | List all synced live streams |
| `catalog.getLiveStreamByMuxId` | Get a single live stream by its Mux live stream ID |
| `catalog.listUploads` | List all synced direct uploads |
| `catalog.getUploadByMuxId` | Get a single upload by its Mux upload ID |
| `catalog.listRecentEvents` | List recent webhook events |
| `videos.listVideosForUser` | List assets with app-level metadata for a specific user |
| `videos.getVideoByMuxAssetId` | Get an asset with its associated metadata |

All list queries accept an optional `limit` parameter (default: 50).

## Adding video metadata

The catalog queries above return Mux-level data synced via webhooks. The video metadata system lets you layer your own app-level data — like titles, descriptions, user ownership, and visibility — on top of those synced assets.

```typescript
import { mutation } from './_generated/server';
import { components } from './_generated/api';
import { v } from 'convex/values';

export const setVideoMetadata = mutation({
  args: {
    muxAssetId: v.string(),
    userId: v.string(),
    title: v.optional(v.string()),
    description: v.optional(v.string()),
    visibility: v.optional(
      v.union(v.literal('public'), v.literal('private'), v.literal('unlisted'))
    ),
    tags: v.optional(v.array(v.string())),
  },
  handler: async (ctx, args) => {
    await ctx.runMutation(components.mux.videos.upsertVideoMetadata, {
      muxAssetId: args.muxAssetId,
      userId: args.userId,
      title: args.title,
      description: args.description,
      visibility: args.visibility,
      tags: args.tags,
    });
  },
});
```

Once metadata is set, queries like `videos.listVideosForUser` and `videos.getVideoByMuxAssetId` will return both the Mux asset data and your app-level metadata together.

## Important notes

<Callout type="info">
  The backfill is a one-time synchronization. After that, webhooks maintain near real-time updates between Mux and Convex.
</Callout>

<Callout type="warning">
  Make sure to use consistent component names across all configuration steps. If you change the component name from `mux`, regenerate the wrapper files with the matching name: `npx @mux/convex init --component-name yourName --force`
</Callout>

## Resources

* [GitHub repository](https://github.com/Joshalphonse/mux-convex-component)
* [Convex documentation](https://docs.convex.dev)
* [Mux API reference](/docs/api-reference)


# Understanding Mux Video Pricing
Learn how Mux Video pricing works and what levers and modifiers there are for you to control costs.
Mux pricing is split up into three categories: input, storage, and delivery. In other words, you're charged by how much video you upload every month, how much video you store every month, and how much video your users stream every month.

As you read, keep your eye out for what we call "pricing levers": ways you can suit your costs to your use case. For example, we offer discounts based on volume and resolution. More on those [near the end](#pricing-levers-and-add-ons).

Finally, Mux charges by minute of video inputted, stored, and delivered. [Learn more](https://www.mux.com/blog/why-we-still-price-in-minutes-for-video) about why we charge in minutes instead of bytes.

Let's get started by talking about the first category of pricing: Input.

<Callout id="higher-usage">
  If you find yourself with higher usage than the tiers described below, [we'd love to talk to you about how we can customize your pricing.](https://www.mux.com/sales-contact?utm_source=docs\&utm_campaign=video-pricing)
</Callout>

## Input

Videos can come in all sorts of different formats, containers, codecs, or countless other variations. When a video is uploaded to Mux, we process it and create a high-quality, standardized version of the video through a process called "encoding." We use that standardized version to deliver any number of bitrates and resolutions based on the viewer's needs, but more on that [later](#delivery).

Mux supports a configurable video quality level on each asset, with three levels: basic, plus, and premium.

The **basic** video quality level uses a reduced encoding ladder, with a lower target video quality, suitable for simpler video use cases. There is no charge for video input when using basic quality. Basic assets are optimized for video use cases with simpler streaming needs, such as social or user-generated content, where high encoding costs may limit your business model.

The **plus** video quality level encodes your video at a consistently high quality. Assets encoded with the plus quality use an AI-powered per-title encoding technology that boosts bitrates for high-complexity content to ensure high-quality video, while reducing bitrates for lower-complexity content to save bandwidth without sacrificing quality. Plus assets are well suited to professional or branded content. The plus quality level incurs a cost per video minute of encoding.

The **premium** video quality level uses the same AI-powered per-title encoding technology as plus, but is tuned for the presentation of premium media content where superior video quality is required, such as studio or cinematic projects. The premium quality level incurs a higher cost per video minute of encoding.

<Callout id="live-streams">
  Live Streams are not available with the basic video quality level; Mux only supports plus and premium video quality levels for Live Streams.
</Callout>

The default video quality level for assets in new accounts is basic. You can configure your organization's default video quality level in the Settings pane in the dashboard, or after confirming the payment method when you change your plan. This configuration option is only available to account admins. You can also override the default video quality level on a per-asset basis at the time of asset creation.

[Learn more about setting video quality levels.](/docs/guides/use-video-quality-levels)

Mux charges by minute of video encoded.

### Basic quality input

[Learn more about setting video quality levels.](/docs/guides/use-video-quality-levels)

On-demand video only.

| Monthly volume tiers  | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :-------------------- | :--------- | :---- | :------    | :--------- | :---------
| All volumes | Free | Free | Free | Free | Free

### Plus quality input

[Learn more about setting video quality levels.](/docs/guides/use-video-quality-levels)

Live video is supported up to 1080p and on-demand video up to 4K.

Pricing is per minute.

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :------------------- | :--------- | :---- | :--------- | :--------- | :---------
| First 5,000 minutes | $0.025000 | $0.031250 | $0.050000 | $0.100000 | $0.002500
| Next 10,000 minutes | $0.023750 | $0.029688 | $0.047500 | $0.095000 | $0.002375
| Next 10,000 minutes | $0.023125 | $0.028906 | $0.046250 | $0.092500 | $0.002313
| Over 25,000 minutes | $0.022500 | $0.028125 | $0.045000 | $0.090000 | $0.002250
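Volume tiers are graduated: each band of minutes is billed at its own rate rather than your entire volume being billed at a single rate (the volume discounts section later in this guide walks through an example). A sketch of how a month of plus-quality 1080p input could be priced, using the 1080p rates from the table above:

```typescript
// Graduated tier pricing: each band of minutes is billed at its own rate.
// Rates are the plus-quality 1080p input rates from the table above.
type Tier = { upTo: number; rate: number };

const plus1080pInputTiers: Tier[] = [
  { upTo: 5_000, rate: 0.03125 },     // first 5,000 minutes
  { upTo: 15_000, rate: 0.029688 },   // next 10,000 minutes
  { upTo: 25_000, rate: 0.028906 },   // next 10,000 minutes
  { upTo: Infinity, rate: 0.028125 }, // over 25,000 minutes
];

function tieredCost(minutes: number, tiers: Tier[]): number {
  let cost = 0;
  let prev = 0;
  for (const { upTo, rate } of tiers) {
    const band = Math.min(minutes, upTo) - prev; // minutes that fall in this band
    if (band <= 0) break;
    cost += band * rate;
    prev = upTo;
  }
  return cost;
}

// 30,000 minutes of plus 1080p input:
// 5,000 × $0.031250 + 10,000 × $0.029688 + 10,000 × $0.028906 + 5,000 × $0.028125 ≈ $882.82
const cost = tieredCost(30_000, plus1080pInputTiers);
```

The same graduated calculation applies to the storage and delivery tables below, with their own tier boundaries and rates.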

### Premium quality input

[Learn more about setting video quality levels.](/docs/guides/use-video-quality-levels)

Live video is supported up to 1080p, on-demand video up to 4K.

Pricing is per minute.

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :------------------- | :--------- | :---- | :--------- | :--------- | :---------
| First 5,000 minutes | $0.037500 | $0.046875 | $0.075000 | $0.150000 | $0.002500
| Next 10,000 minutes | $0.035625 | $0.044531 | $0.071250 | $0.142500 | $0.002375
| Next 10,000 minutes | $0.034688 | $0.043359 | $0.069375 | $0.138750 | $0.002313
| Over 25,000 minutes | $0.033750 | $0.042188 | $0.067500 | $0.135000 | $0.002250

## Advanced static rendition (MP4s) preparation

Advanced static renditions are priced per minute of content, per static rendition, based on the resolution of the static rendition that is generated.

Note that there is no charge for generating standard static rendition MP4s. See the [enabling static MP4 renditions](/docs/guides/enable-static-mp4-renditions) guide for more information.

### Basic and plus quality advanced static rendition preparation

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K)
| :------------------- | :--------- | :---- | :--------- | :---------
| First 5,000 minutes | $0.008000 | $0.010000 | $0.016000 | $0.032000
| Next 10,000 minutes | $0.007600 | $0.009500 | $0.015200 | $0.030400
| Next 10,000 minutes | $0.007400 | $0.009250 | $0.014800 | $0.029600
| Over 25,000 minutes | $0.007200 | $0.009000 | $0.014400 | $0.028800

### Premium quality advanced static rendition preparation

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K)
| :------------------- | :--------- | :---- | :--------- | :---------
| First 5,000 minutes | $0.012000 | $0.015000 | $0.024000 | $0.048000
| Next 10,000 minutes | $0.011400 | $0.014250 | $0.022800 | $0.045600
| Next 10,000 minutes | $0.011100 | $0.013875 | $0.022200 | $0.044400
| Over 25,000 minutes | $0.010800 | $0.013500 | $0.021600 | $0.043200

## Storage

When we talked about input, we mentioned that Mux creates a single, high-quality, standardized version of each video. That step is when most traditional video solutions or providers will create all the different versions of your video for streaming to different devices, which means storing all those different versions indefinitely. Mux, on the other hand, only creates and stores one version of your video because Mux is able to deliver the right versions of the video when your viewers need it.

With [Automatic Cold Storage](#automatic-cold-storage), Mux automatically applies discounts to infrequently accessed assets.

Storage is calculated by minute of video stored. Storage is prorated by the percentage of the month that the video is stored. For example, if a 10-minute asset is stored for only half a month, you will be charged for only 5 minutes.
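In code, the proration works out to a single multiplication (a trivial sketch of the example above):

```typescript
// Storage proration: charged minutes = asset minutes × fraction of the month stored.
function proratedStorageMinutes(assetMinutes: number, fractionOfMonthStored: number): number {
  return assetMinutes * fractionOfMonthStored;
}

// The example from the text: a 10-minute asset stored for half a month.
const charged = proratedStorageMinutes(10, 0.5); // 5 minutes
```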

The cost of video storage also includes the storage of primary audio, metadata, and captions. When you pay for the storage of a video, you can also transcode or transmux to normalize inputs, create metadata or thumbnails, and access it in the dashboard or through the API.

Mux live streams can use either the plus or premium quality level; live streaming is supported up to 1080p.

Mux will automatically start creating an on-demand asset in the background when you begin broadcasting to your live stream. These assets are created and stored as assets with the video quality level you chose for encoding.

### Basic and plus quality storage

[Learn more about setting video quality levels.](/docs/guides/use-video-quality-levels)

**Basic quality** level assets have a minimum storage charge of one month and are prorated thereafter. Basic supports on-demand video only, up to 4K.
**Plus quality** level assets have no minimum storage charge. Plus supports live video up to 1080p and on-demand video up to 4K.

Pricing is per minute per month.

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :------------------- | :--------- | :---- | :--------- | :--------- | :---------
| First 50,000 minutes | $0.002400 | $0.003000 | $0.004800 | $0.009600 | $0.000240
| Next 100,000 minutes | $0.002320 | $0.002900 | $0.004640 | $0.009280 | $0.000232
| Next 100,000 minutes | $0.002280 | $0.002850 | $0.004560 | $0.009120 | $0.000228
| Over 250,000 minutes | $0.002240 | $0.002800 | $0.004480 | $0.008960 | $0.000224

### Premium quality storage

[Learn more about setting video quality levels.](/docs/guides/use-video-quality-levels)

Live video is supported up to 1080p and on-demand video up to 4K.

Pricing is per minute per month.

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :------------------- | :--------- | :---- | :--------- | :--------- | :---------
| First 50,000 minutes | $0.003600 | $0.004500 | $0.007200 | $0.014400 | $0.000240
| Next 100,000 minutes | $0.003480 | $0.004350 | $0.006960 | $0.013920 | $0.000232
| Next 100,000 minutes | $0.003420 | $0.004275 | $0.006840 | $0.013680 | $0.000228
| Over 250,000 minutes | $0.003360 | $0.004200 | $0.006720 | $0.013440 | $0.000224

## Static rendition (MP4s) storage

Static renditions are priced per minute of content, per static rendition, per month stored. The pricing is also based on the resolution of the static rendition.

Static rendition storage also benefits from the automatic cold storage feature when content is not viewed, see [Automatic Cold Storage](#automatic-cold-storage) for more information.

### Basic and plus quality static rendition storage

| Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :--------- | :---- | :--------- | :--------- | :---------
| $0.000600 | $0.000750 | $0.001200 | $0.002400 | $0.000060

### Premium quality static rendition storage

| Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :--------- | :---- | :--------- | :--------- | :---------
| $0.000900 | $0.001125 | $0.001800 | $0.003600 | $0.000060

## Automatic cold storage

With Automatic Cold Storage, we programmatically transition a video or audio-only asset to a different storage level based on how long it has been since it was last viewed. The colder the asset gets, the lower the billing rate becomes.

An asset transitions to `Infrequent` if it has not been played in the last 30 days, and will receive a 40% discount off of the applicable usage-based rate.

An asset transitions to `Cold` if it has not been played in the last 90 days, and will receive a 60% discount off of the applicable usage-based rate.

When an asset is first created, it instantly transitions into the `Cold` tier until the first time it is played.

All assets, including those with static rendition (MP4s) enabled, are eligible for Automatic Cold Storage. Viewing an asset through either the static rendition or the HLS URL will reset the cold storage timer for the entire asset.

Note: [Downloading a master](/docs/guides/download-for-offline-editing) of an asset in infrequent or cold storage will cause the asset to be returned to the frequent storage class, and the cold storage timer to be reset.
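The transition rules above boil down to a lookup on days since the asset was last played. A sketch (tier names and discounts come from the text and tables in this section; `null` models a newly created, never-played asset, which sits in `Cold`):

```typescript
// Cold storage tier based on days since the asset was last played.
// A never-played asset (lastPlayedDaysAgo = null) sits in the Cold tier.
type StorageTier = "frequent" | "infrequent" | "cold";

function storageTier(lastPlayedDaysAgo: number | null): StorageTier {
  if (lastPlayedDaysAgo === null) return "cold"; // never played since creation
  if (lastPlayedDaysAgo >= 90) return "cold";
  if (lastPlayedDaysAgo >= 30) return "infrequent";
  return "frequent";
}

// Discount off the applicable usage-based storage rate.
const discount: Record<StorageTier, number> = {
  frequent: 0,
  infrequent: 0.4, // 40% off after 30 days without a play
  cold: 0.6,       // 60% off after 90 days without a play
};
```

Any playback (static rendition or HLS) moves the asset back to `frequent` and resets the timer for the whole asset.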

### Basic and plus quality automatic cold storage

Pricing is per minute per month.

| Asset last viewed | Storage tier | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :---------------- | :----------- | :--------- | :---- | :--------- | :--------- | :---------
| 30+ days ago | `Infrequent` | $0.001440 | $0.001800 | $0.002880 | $0.005760 | $0.000144
| 90+ days ago | `Cold` | $0.000960 | $0.001200 | $0.001920 | $0.003840 | $0.000096

### Premium quality automatic cold storage

Pricing is per minute per month.

| Asset last viewed | Storage tier | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :---------------- | :----------- | :--------- | :---- | :--------- | :--------- | :---------
| 30+ days ago | `Infrequent` | $0.002160 | $0.002700 | $0.004320 | $0.008640 | $0.000144
| 90+ days ago | `Cold` | $0.001440 | $0.001800 | $0.002880 | $0.005760 | $0.000096

### Basic and plus quality static rendition automatic cold storage

Pricing is per minute per month.

| Asset last viewed | Storage tier | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :---------------- | :----------- | :--------- | :---- | :--------- | :--------- | :---------
| 30+ days ago | `Infrequent` | $0.000360 | $0.000450 | $0.000720 | $0.001440 | $0.000036
| 90+ days ago | `Cold` | $0.000240 | $0.000300 | $0.000480 | $0.000960 | $0.000024

### Premium quality static rendition automatic cold storage

Pricing is per minute per month.

| Asset last viewed | Storage tier | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :---------------- | :----------- | :--------- | :---- | :--------- | :--------- | :---------
| 30+ days ago | `Infrequent` | $0.000540 | $0.000675 | $0.001080 | $0.002160 | $0.000036
| 90+ days ago | `Cold` | $0.000360 | $0.000450 | $0.000720 | $0.001440 | $0.000024

## Delivery

When someone wants to watch a video on Mux, we use a process called “just-in-time encoding,” where we turn that standard, single video file into any number of bitrates and resolutions based on the viewer's needs. This process happens instantly.

In order to deliver video, Mux partners with multiple [CDNs](https://www.mux.com/video-glossary/cdn-content-delivery-network). Videos are delivered over HTTP-based streaming formats like [HLS](https://www.mux.com/video-glossary/hls-http-live-streaming). Video can be delivered to all major video players. If you're looking for a place to start with players, we suggest [Mux Player](https://www.mux.com/player). [Mux Data](https://data.mux.com/) is included with delivery, giving you the ability to monitor your video, including user engagement and quality of experience.

Cost is per minute of video delivered. To calculate video delivered, we measure the number of seconds of video delivered to a video player. Note that if a segment of video is delivered, it is charged, even if the viewer doesn't actually watch the video. For example, if a video player buffers 20 seconds of video ahead of the player, Mux Video still has to deliver those 20 seconds regardless of whether they are watched, and so those seconds are charged.

Mux live streams can use either the plus or premium quality level; live streaming is supported up to 1080p.

### Basic and plus quality delivery

**Basic quality** level assets support on-demand video only, up to 4K.
**Plus quality** level supports live video up to 1080p and on-demand video up to 4K.

**The first 100,000 minutes delivered each month, regardless of quality or resolution, are free.**

Pricing is per minute.

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :------------------- | :--------- | :---- | :--------- | :--------- | :---------
| First 500,000 minutes | $0.000800 | $0.001000 | $0.001600 | $0.003200 | $0.000080
| Next 500,000 minutes | $0.000760 | $0.000950 | $0.001520 | $0.003040 | $0.000076
| Next million minutes | $0.000720 | $0.000900 | $0.001440 | $0.002880 | $0.000072
| Next 4 million minutes | $0.000670 | $0.000838 | $0.001340 | $0.002680 | $0.000067
| Next 4 million minutes | $0.000610 | $0.000763 | $0.001220 | $0.002440 | $0.000061
| Over 10 million minutes | $0.000560 | $0.000700 | $0.001120 | $0.002240 | $0.000056

### Premium quality delivery

Live video is supported up to 1080p and on-demand video up to 4K.

**The first 100,000 minutes delivered each month, regardless of quality or resolution, are free.**

Pricing is per minute.

| Monthly volume tiers | Up to 720p | 1080p | 1440p (2K) | 2160p (4K) | Audio-only
| :------------------- | :--------- | :---- | :--------- | :--------- | :---------
| First 500,000 minutes | $0.001200 | $0.001500 | $0.002400 | $0.004800 | $0.000080
| Next 500,000 minutes | $0.001140 | $0.001425 | $0.002280 | $0.004560 | $0.000076
| Next million minutes | $0.001080 | $0.001350 | $0.002160 | $0.004320 | $0.000072
| Next 4 million minutes | $0.001005 | $0.001256 | $0.002010 | $0.004020 | $0.000067
| Next 4 million minutes | $0.000915 | $0.001144 | $0.001830 | $0.003660 | $0.000061
| Over 10 million minutes | $0.000840 | $0.001050 | $0.001680 | $0.003360 | $0.000056

## Pricing levers and add-ons

Mux offers a few ways to suit your pricing to your use case: pricing levers you can pull to move our standard encoding, storage, and delivery pricing up and down, and add-ons you can use to do more with your assets.

### Video quality level

The first pricing lever you should consider is video quality level. Whether you should pick basic, plus, or premium video quality depends on your streaming needs. Read more in the [Input section](#input).

### Resolution-based pricing

Resolution-based pricing tiers are determined by the number of pixels in the video, calculated by multiplying height by width. Tiers apply to encoding, storage, and delivery. An asset may be delivered in multiple resolutions, in which case it will be billed based on minutes delivered in each resolution. Resolution-based discounts are automatically applied.

| Pricing tier | Pixels | Typical resolution |
| :----------- | :----- | :----------------- |
| Up to 720p | Up to 921,600 pixels | 1280x720 |
| 1080p | 921,601 to 2,073,600 pixels | 1920x1080 |
| 1440p (2K) | 2,073,601 to 4,194,304 pixels | 2560x1440 |
| 2160p (4K) | 4,194,305 to 8,294,400 pixels | 3840x2160 |
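As a sketch, the tier boundaries above can be expressed as a lookup on the pixel count (the function name is illustrative, and dimensions above 4K are not covered by the table):

```javascript
// Classify a video's pricing tier from its dimensions, using the
// pixel boundaries from the table above.
function pricingTier(width, height) {
  const pixels = width * height;
  if (pixels <= 921600) return 'Up to 720p';
  if (pixels <= 2073600) return '1080p';
  if (pixels <= 4194304) return '1440p (2K)';
  return '2160p (4K)';
}

console.log(pricingTier(1280, 720));  // → 'Up to 720p'
console.log(pricingTier(1920, 1080)); // → '1080p'
```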

You can control what resolution gets played with [playback modifiers](/docs/guides/modify-playback-behavior).

2K and 4K resolutions are available for on-demand assets only.

### Volume discounts

As an example of how volume discounts apply, suppose you have a total of 300,000 stored minutes, broken down into the following:

* 5,000 basic minutes at 720p
* 55,000 basic minutes at 1080p
* 100,000 plus minutes at 1080p
* 140,000 plus minutes at 4K

With volume discounts automatically applied, your storage discounts would be applied like this:

* 5,000 basic quality minutes at 720p - These minutes fall into the first 50,000-minute tier, where no volume discounts apply.
* 55,000 basic quality minutes at 1080p - The first 50,000 minutes of 1080p assets are charged at the first tier; the remaining 5,000 minutes fall into the next tier.
* 100,000 plus quality minutes at 1080p - Similarly, no discounts apply to the first 50,000 minutes of plus 1080p, but they do apply to the next 50,000 minutes.
* 140,000 plus quality minutes at 4K - 50,000 of the minutes get no discount; the remaining 90,000 do.
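The mechanics of splitting a usage total across tiers can be sketched like this (the tier sizes here are illustrative, not Mux's actual price book):

```javascript
// Split a usage total across volume tier boundaries.
// tierSizes is the capacity of each tier, in order; any remainder
// falls into a final unbounded tier.
function splitAcrossTiers(minutes, tierSizes) {
  const result = [];
  let remaining = minutes;
  for (const size of tierSizes) {
    const inTier = Math.min(remaining, size);
    result.push(inTier);
    remaining -= inTier;
    if (remaining <= 0) break;
  }
  if (remaining > 0) result.push(remaining);
  return result;
}

// 140,000 minutes against a 50,000-minute first tier and a
// 50,000-minute second tier:
console.log(splitAcrossTiers(140000, [50000, 50000, 1000000]));
// → [ 50000, 50000, 40000 ]
```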

If you find yourself with higher usage than the tiers outlined in the Input, Storage, and Delivery sections above, [we'd love to talk to you about how we can customize your pricing.](https://www.mux.com/sales-contact?utm_source=docs&utm_campaign=video-pricing)

### Audio-only

All on-demand audio-only assets and alternate audio tracks are calculated at 1/10th the cost of 720p basic video for encoding, storage, and delivery, regardless of the quality level assigned to that asset.

When an asset is set to the basic quality level, encoding is free for both audio-only and video assets.

Your free first 100,000 minutes delivered each month include video and audio minutes.

### Auto-generated live captions

The first 6,000 minutes per month are free; after that, auto-generated live captions cost $0.024 per minute.

Learn how to [add auto-generated live captions](/docs/guides/add-autogenerated-live-captions).

### Live simulcasting

$0.020 per minute per simulcast target.

Learn about [simulcasting](/docs/guides/stream-live-to-3rd-party-platforms).

### Multi-track audio

The primary audio track uploaded with your video file will be included with the encoding, storage, and delivery cost as part of your video. Any additional audio tracks uploaded will be charged at the audio-only rates for encoding, storage, and delivery.

Learn more about [multi-track audio](/docs/guides/add-alternate-audio-tracks-to-your-videos).

### Static renditions (MP4s)

Static renditions are a paid add-on feature.

[Static renditions come in two types: standard and advanced](/docs/guides/enable-static-mp4-renditions). Standard static renditions are free to generate, while advanced static renditions are charged per minute of preparation. [See advanced static rendition preparation pricing](#advanced-static-rendition-mp4s-preparation).

All static renditions are billed per static rendition, per month stored. Billing is based on the resolution of the static rendition. See static rendition storage pricing in the [Storage section](#static-rendition-mp4s-storage).

For static renditions, each minute downloaded counts as a minute streamed and will be charged according to the video quality level. [Learn more about cost of delivery](/docs/pricing/video#delivery).

[Learn more about enabling MP4 renditions](/docs/guides/enable-static-mp4-renditions).

### Digital Rights Management (DRM)

DRM is an add-on feature to Mux Video, with a $100/month access fee plus $0.003 per license, and discounts available for high volumes. For more details on DRM licenses, see our [DRM pricing documentation](/docs/guides/protect-videos-with-drm#what-is-a-drm-license).

Learn more about [DRM](/docs/guides/protect-videos-with-drm).

## FAQs

### What's the difference between the pay as you go plan and pre-pay credits?

When you pre-pay for credits, you get Mux usage at a discounted rate. For example, Launch credits give you $100 of monthly usage for $20 a month, and Scale credits give you $1,000 of monthly usage for $500 a month.

Credits are automatically applied to your invoice at the usage rates outlined above. For any usage above your credit amount, you will be billed at pay as you go rates. Credits reset at the beginning of your billing cycle, as long as you’re subscribed to that credits plan.
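As an illustrative sketch using the Launch credits numbers above, a month's bill under a credits plan works out like this (the function and variable names are hypothetical):

```javascript
// A month's bill on a credits plan: you pay the credit price,
// plus pay-as-you-go rates for any usage above the credit amount.
function monthlyBill(usageDollars, creditAmount, creditPrice) {
  const overage = Math.max(0, usageDollars - creditAmount);
  return creditPrice + overage;
}

// $150 of usage with Launch credits ($100 of usage for $20):
console.log(monthlyBill(150, 100, 20)); // → 70
// $80 of usage stays within the credit amount:
console.log(monthlyBill(80, 100, 20)); // → 20
```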

Otherwise, API access, features, and everything else are the same on both plans.

You don’t need to purchase pre-pay credits. If you expect your spend to be less than $40, we encourage you to stay on pay as you go. You can upgrade to a pre-pay credits plan at any time.

### If pricing is per minute, what happens if I upload a 30-second video?

You pay for the exact number of seconds of video. We don't have a minimum or round to the nearest minute.

### Do I pay for every quality/bitrate that is delivered?

No - if you encode a two-minute video, you pay for two minutes, even if Mux Video delivers that same video in 8 different formats or qualities.

### Is support included in my price?

Our engineers provide hands-on support via email and chat for everyone. We also offer support packages with Slack and phone support. [Reach out](https://www.mux.com/sales-contact?utm_source=docs&utm_campaign=video-pricing) to us for more information about support packages.

### How does Mux Data fit in?

As a Mux Video customer, monitoring your views with Mux Data is included at no additional charge as long as the videos are hosted on Mux. You have access to all data features available on the Data Pay As You Go Plan. When you integrate Mux Data SDKs into your video player or use Mux Player, you'll start getting engagement and QoE data for your videos.

### Do you offer non-profit discounts?

We offer one-time credits for non-profit customers to help them start using Mux. [Get in touch](https://www.mux.com/sales-contact?utm_source=docs&utm_campaign=video-pricing) to find out more.

### Do you offer custom, contract pricing?

Yes! We do have custom plans that are well-suited for scaling and enterprise customers. These plans begin at $3,000/month. [Get in touch](https://www.mux.com/sales-contact?utm_source=docs&utm_campaign=video-pricing) to find out more.

### Do you charge for MP4 downloads?

Yes, see the [Static Renditions (MP4s) section](#static-renditions-mp4s) for more information.

### Does Low-Latency Live Streaming cost extra?

Nope! At Mux, all live streamed video, whether it's standard or low latency, has the same pricing.


# Report on your Mux costs
Use Mux's Billing Breakdown to report on platform costs
Mux provides a billing breakdown which gives you transparency into your Mux costs, with historical context that helps you spot trends and make informed decisions. From this page, you can chart your costs and purchases over time, track your plan credit spending, and see your credit balances. It covers your last six billing cycles: any invoices issued in those cycles are shown, but usage in the current cycle that has not yet been invoiced is not included.

You can find the Billing Breakdown in the Billing area of the Mux dashboard.

### Overview

<Image src="/docs/images/billing-breakdown-overview-chart.png" width={2042} height={966} alt="Last 6 Billing Cycles chart" />

The Last 6 Billing Cycles chart makes it easy to see how the costs of usage fluctuate month-to-month. The billing period can include multiple invoices if more than one was generated by Mux.

In the bar chart:

* **Committed Charges (Orange):** Your plan’s committed costs; these can be Video, Data, or general committed amounts. For example, on a Starter Plan the commitment would be $10 per month.
* **Previous Cycle’s Overages (Purple):** Costs from usage in the previous month that went above the committed amount, if any.
* **Credits Applied (Purple-striped):** Reductions of the dollar amount that is billed in that month from plan credits that were purchased or promotional credits received.
* **Tax (Red):** Taxes added to the bill, based on your local jurisdiction.

If you hover over the bar chart, the tooltip shows the exact dollar amounts for each of these values and the total amount that is billed for each month.

The highlight box on the right side of the page shows the total dollar amount owed in the most recent billing period, the change from the previous month, and any promo credits that were applied.

<Image src="/docs/images/billing-breakdown-overview-table.png" width={1920} height={1040} alt="Last 6 Billing Cycles table" />

The detailed breakdown shows you exactly how your costs break down across the major areas of spending. Not all of these will be present for every customer and they won’t be shown if they don’t apply.

* **Video Usage:** the dollar amount of consumption-based Video SKUs that were used.
* **Data Usage:** the dollar amount of consumption-based Data SKUs that were used.
* **Video Committed:** the dollar amount that was committed to spend on Video usage.
* **Data Committed:** the dollar amount that was committed to spend on Data usage, which is usually Monitored Views.
* **Support:** the cost for support. For some customers, this is only charged once per year so it may not be included in the table, depending on the date of the charge.
* **Plan Credits Purchase:** the amount of plan credits that were purchased. These may be used over multiple months and the balance will be shown in the “Plan Credits Remaining”. For some customers, this is only charged once per year so it may not be included in the table, depending on the date of the charge.
* **Plan Credits Applied:** the dollar amount that was paid from a plan credits balance.
* **Total:** the total dollar amount that was billed to you from your Mux usage.
* **Plan Credits Remaining:** the amount of plan credits that remain after the month’s charges were applied to an existing balance.

Rows that are highly variable include a percentage below the value which indicates the percentage change, increase or decrease, from the previous month.

While the overview gives you the overall picture, we also provide breakdowns at the product-level. The detailed breakdowns show you exactly how your costs break down within each product.

### Mux Video

<Image src="/docs/images/billing-breakdown-video.png" width={1920} height={1824} alt="Mux Video Billing chart and table" />

Mux Video billing is broken down into the major areas of Mux Video usage and spending:

* **Input VOD**: Input of recorded assets
* **Input Live**: Input of live streams
* **Advanced Static Rendition Generation:** Generation of advanced static renditions; generating standard static renditions is free.
* **Storage**: Storage for all assets, excluding static renditions
* **Static Rendition Storage:** Storage for standard and advanced static renditions
* **Delivery:** All delivery, including audio-only and video assets, live streams, and static renditions
* **Other:** DRM, simulcasting, auto-generated live captions

### Mux Data

<Image src="/docs/images/billing-breakdown-data.png" width={1920} height={1224} alt="Mux Data Billing chart and table" />

Mux Data billing shows the breakdown of pre-committed costs for Data views and consumption-based view pricing.

For more information about Mux Video billing, see our main [pricing page](/docs/pricing/video) or refer to the billing breakdown page for your account.


# Estimating your Mux Video costs
Learn how to use Mux's pricing to estimate costs under different scenarios
If there’s one thing to take away from Mux’s video pricing model, it’s that minutes are everything. Minutes encoded, minutes stored, and minutes delivered are the only things that matter when it comes to billing. We’ve [written previously](https://www.mux.com/blog/why-we-still-price-in-minutes-for-video) about why we think this is a better way of charging for video.

You can find all of our costs on our [pricing page](/docs/pricing/video), but you may be wondering how you can calculate estimates using these numbers. We’re going to outline a couple ways you can do this so you can be confident that you can predict your costs at any time.

### User-generated content platform

If you’re building a user-generated content (UGC) platform, then encoding and storage are going to be big considerations for you. Most UGC platforms follow some kind of power-law distribution, where a small percentage of the content makes up a large share of the views (in YouTube’s case, for example, much less than 1% of the content uploaded accounts for much more than 99% of the views).

Your split might not be as extreme; maybe it’s closer to 95/5, 90/10, or even 80/20, but this is the general tendency we see for UGC platforms. You will want to consider using the [basic video quality level](/docs/guides/use-video-quality-levels), which has $0 encoding, and pairing that with [Automatic Cold Storage](https://www.mux.com/blog/introducing-our-coolest-pricing-lever-yet-automatic-cold-storage) so that you get a cheaper storage rate for assets that are rarely viewed.

### Use high, medium and low ranges to make your estimates

If you have an existing application, you can use your existing usage patterns to estimate how much video your users might watch. If you don’t have any existing users to benchmark off of, estimating will be a little trickier. For example:

* Out of 1,000 monthly active users, we think 25% of them will engage with our new video product. Out of those 250 users, we think 100 of them will stream 10 minutes of video and 150 of them might stream 25 minutes of video.

If you’re launching something entirely new, then we recommend making 3 separate estimates where you model scenarios that account for how popular your video might be. Here’s some examples:

* Low end: we think in the first few months we’ll get 150 active users on our product. Out of those 150 we think they’ll each stream 45 minutes of video per month.
* Middle of the road: we think in the first few months we’ll get 400 active users and they’ll be streaming an hour and a half of video per month.
* Moonshot: in the best-case scenario we think we’ll get 1,000 active users and they’ll stream 2 and a half hours of video per month.

Now, for each of those 3 scenarios you can plug the results into the calculator and get a range of costs. That range might be large, but you will have a good idea of how your costs will look depending on the uptake of your users.
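To make that concrete, here is a quick sketch of monthly delivered minutes for each scenario, using the numbers from the examples above:

```javascript
// Monthly delivered minutes for each scenario described above.
const scenarios = [
  { name: 'Low end', users: 150, minutesPerUser: 45 },
  { name: 'Middle of the road', users: 400, minutesPerUser: 90 },  // 1.5 hours
  { name: 'Moonshot', users: 1000, minutesPerUser: 150 },          // 2.5 hours
];

for (const s of scenarios) {
  console.log(`${s.name}: ${s.users * s.minutesPerUser} minutes delivered per month`);
}
// → Low end: 6750, Middle of the road: 36000, Moonshot: 150000
```

Those per-scenario totals are the numbers you would plug into the calculator as your delivery estimate.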

### Working with Gigabytes instead of minutes

Some services charge for video based on file sizes, either stored or as bandwidth for delivery. There are a couple of ways you can compare these costs with Mux’s minute-based pricing. These will only be a guide, because 1GB of video can vary in duration depending on the bitrate, but we can use some estimates that work for common video encoding settings and go from there.

1 minute of 1080p video averages around 38MB (at 5Mbps), which works out to roughly 25 minutes of video per gigabyte.

Here are some example conversions based on how many gigabytes you might have, using this as a base:

| Video (1080p, 5Mbps) | Estimated minutes |
| --- | --- |
| 1GB | 25 minutes |
| 10GB | 250 minutes |
| 100GB | 2,500 minutes |

Taking how many gigabytes you have and multiplying it by 25 for 1080p content should give you an estimate in minutes that you can plug into the calculator.

Here are some estimates you can use for different resolutions:

| Resolution | Estimated minutes per GB |
| --- | --- |
| 720p (3.5Mbps) | 40 minutes |
| 1080p (5Mbps) | 25 minutes |
| 1440p (2K, 8Mbps) | 15 minutes |
| 2160p (4K, 12Mbps) | 10 minutes |
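Putting the table above into code, a rough conversion helper might look like this (the per-GB figures are the estimates above, not exact values):

```javascript
// Estimated minutes of video per gigabyte at common bitrates
// (rough figures from the table above).
const minutesPerGB = {
  '720p': 40,
  '1080p': 25,
  '1440p': 15,
  '2160p': 10,
};

function estimateMinutes(gigabytes, resolution) {
  return gigabytes * minutesPerGB[resolution];
}

console.log(estimateMinutes(100, '1080p')); // → 2500
```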

## What to consider when estimating your delivery

### Views (mostly) don’t matter

You might be used to thinking of delivery in terms of how many views a video had, as that’s a good metric for how popular a video is. From Mux’s perspective, 1 person viewing a video for 10 minutes is identical to 10 users watching a 1 minute video each. When you add it all up, 10 minutes of video has been delivered, and that’s how it will appear on your bill.

### Minutes delivered, not minutes watched

Mux bills on minutes delivered even if they weren’t watched. If a viewer starts playing a 20 minute video, they might only watch 5 minutes. Additionally, the player might have preloaded an extra minute of video that the viewer never saw. From a billing perspective, this is 6 minutes of delivery even though that extra minute was never seen, because we still had to deliver it to the client as requested by the player.

### Are looping videos charged for each time they repeat?

Whether a looping video is charged for one playthrough or for each repeat depends on the caching behavior of the browser and player being used. If the browser doesn't clear out its buffers while the video repeats, subsequent loops are not charged for delivery, because our infrastructure never sees new requests for the video as it loops.

It's difficult to predict and control this browser behavior though. There are also physical limitations as to how much video can be stored in memory before some has to be removed.

In general, the shorter a video is, and the fewer renditions that are being switched between during playback, the more likely that the video will remain in the browser's buffers. Videos that are longer than roughly 60 seconds are likely to stretch what can fit in a browser's video buffer and lead to more requests (and delivery charges).

Configuring your player to use a single rendition instead of multiple ones can make it easier for a browser to cache video, but at the cost of forcing a single resolution onto users regardless of their bandwidth. If your videos are particularly short, you could try using [static MP4s](/docs/guides/enable-static-mp4-renditions) instead of the default HLS delivery.

For more information about Mux video billing see our main [pricing page](/docs/pricing/video).


# Optimizing your Mux Video costs
Learn how to use different pricing strategies and levers to optimize your costs under different scenarios
You can find all of our costs on our [pricing page](/pricing), but you may be wondering how you can optimize your costs and potentially find ways to reduce your bill. We’re going to outline a few ways you can optimize your usage of Mux so that you can keep your costs as low as possible.

# Cost levers you can leverage through the Mux Video product

Mux offers a few ways to optimize your costs depending on your use case. We have many features that you can take advantage of that will influence your encoding, storage, and delivery costs. There are also add-ons you can opt-in to using so that you only pay for the features you need.

## Use Basic Video Quality

There is no charge for video encoding when using the basic quality level, so videos uploaded at this quality level are free to encode.

The basic video quality level uses a reduced encoding ladder with a lower target video quality and is suitable for simpler video use cases, particularly those that have a lot of user generated content.

You can learn more about [video quality](/docs/guides/use-video-quality-levels#supported-features) and what features are supported.

<Callout type="info">
  Basic quality level assets have a minimum storage charge of one month and are prorated thereafter. Storage is prorated by the percentage of the month that the video is stored. For example, if a 10-minute asset is stored for only half a month, you will be charged for only 5 minutes.
</Callout>

## Automatic Cold Storage

With Automatic Cold Storage, we automatically transition a video or audio-only asset to a different storage level based on how long it has been since it was last viewed. The colder the asset gets, the lower the billing rate becomes.

See more about [Automatic Cold Storage](/docs/pricing/video#automatic-cold-storage)

## Capping maximum delivery resolution

By setting a maximum delivery resolution, you can take advantage of our [resolution based pricing](https://www.mux.com/blog/introducing-resolution-based-pricing).

The playback URL below with the `max_resolution` query parameter modifies the resolutions available for the player to choose from.

```
https://stream.mux.com/{PLAYBACK_ID}.m3u8?max_resolution=720p
```

The `max_resolution` parameter can be set to `720p`, `1080p`, `1440p`, or `2160p`. You may want to do this to reduce your delivery costs, or to build a feature into your product where only certain viewers get lower-resolution video.
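As a small sketch, you could build the modified playback URL like this (the playback ID and function name are placeholders):

```javascript
// Append a max_resolution modifier to a Mux HLS playback URL.
function playbackUrl(playbackId, maxResolution) {
  const url = new URL(`https://stream.mux.com/${playbackId}.m3u8`);
  if (maxResolution) url.searchParams.set('max_resolution', maxResolution);
  return url.toString();
}

console.log(playbackUrl('abc123', '720p'));
// → https://stream.mux.com/abc123.m3u8?max_resolution=720p
```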

See more [here](/docs/guides/modify-playback-behavior)

## Capping upload resolution

If the video being captured in your app doesn't need to be played back in full resolution, specify a lower resolution when recording to take advantage of Mux's resolution-dependent pricing.

When uploading from a mobile device ([Android](/docs/guides/upload-video-directly-from-android#setting-a-maximum-resolution), [iOS or iPadOS](/docs/guides/upload-video-directly-from-ios-or-ipados#setting-a-maximum-resolution)), you can utilize our upload SDKs to adjust the resolution of your video input locally before it is uploaded to Mux. By default the SDK will adjust the input resolution to 1920 x 1080 for any inputs that are larger.

[Control recording resolution](/docs/guides/control-recording-resolution)

## Preload

If you want to reduce delivery costs for users who might delay watching a video (or not watch it at all), you can set `preload="none"` in Mux Player (or another compatible player). This means that no video will be preloaded until the user plays the video. You could also use `preload="metadata"`, which only loads the minimum amount of data the player needs to get basic information about the video, like its duration.

The tradeoff with using `preload="metadata"` or `preload="none"` is that when the user plays the video they will experience a slower startup time because the video has to load before playback can start.

<Callout type="info">
Mobile browsers, especially on iOS and Android, often ignore `preload="auto"` and `preload="metadata"` due to data-saving policies.

  While preload serves as a hint, browsers ultimately decide how to handle video loading. If you need precise control, consider managing video loading via JavaScript.
</Callout>

## Lazy loading

Lazy loading can be beneficial because you can opt to only load the player when the user is ready to watch the video, like scrolling it into view. If the player isn't loaded, you're not charged for any video delivery yet. See our guide on how to implement lazy loading for Mux Player [here](/docs/guides/player-lazy-loading).

## Delivery Usage API

This is not a cost optimization feature, but is a way to get asset level delivery visibility. You can utilize the <ApiRefLink href="/docs/api-reference/video/delivery-usage">Delivery Usage API</ApiRefLink> to retrieve information about the delivery of a specific video in a given time period. The Delivery Usage API allows you to get delivery and streaming usage details for each asset and across all assets.

Delivery usage details are aggregated every hour at the top of the hour and can be requested for a specified time window within the last 90 days starting at 12 hours prior to when the request is made.
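As a sketch, you could construct a request URL for a given window like this; the `timeframe[]` parameter names follow the API reference, the timestamps are illustrative, and the request itself must be authenticated with your Access Token:

```javascript
// Build a Delivery Usage API request URL for a given time window.
// The timeframe[] parameters are Unix timestamps (start, end).
function deliveryUsageUrl(startUnix, endUnix) {
  const params = new URLSearchParams([
    ['timeframe[]', String(startUnix)],
    ['timeframe[]', String(endUnix)],
  ]);
  return `https://api.mux.com/video/v1/delivery-usage?${params}`;
}

console.log(deliveryUsageUrl(1708300800, 1708387200));
```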

# Cost levers you can leverage on your own

## Player buffer length

A player has a buffer for the media it plays. Segments are downloaded into the buffer, decoded, and then played. The forward buffer is the media that has not yet been played. In most modern web players, you can set the buffer length of the playback engine.

The main tradeoff when customizing these parameters is performance. Shortening the buffer length leaves your player vulnerable to rebuffering and the viewer waiting if there's a temporary network disconnection or hiccup and that buffer runs out. This is an advanced option, so please keep that in mind.

By reducing this value, you save on the delivered-minutes portion of your bill because you're reducing the actual video delivery from the player. The mechanism to control this differs from player to player, but in Mux Player and hls.js you can set it in a couple of places:

```javascript
const player = document.querySelector('mux-player');
// The values below are examples; tune them for your use case.
player._hls.config.maxBufferLength = 10;              // seconds
player._hls.config.maxBufferSize = 30 * 1000 * 1000;  // bytes
player._hls.config.maxMaxBufferLength = 30;           // seconds
```

[maxBufferLength](https://github.com/video-dev/hls.js/blob/master/docs/API.md#maxbufferlength) = Maximum buffer length in seconds. If the buffer length becomes less than this value, a new fragment will be loaded.

[maxBufferSize](https://github.com/video-dev/hls.js/blob/master/docs/API.md#maxbuffersize) = 'Minimum' maximum buffer size in bytes. If buffer size upfront is bigger than this value, no fragment will be loaded.

[maxMaxBufferLength](https://github.com/video-dev/hls.js/blob/master/docs/API.md#maxmaxbufferlength) = Maximum buffer length in seconds. hls.js will never exceed this value, even if maxBufferSize is not reached yet. hls.js tries to buffer up to a maximum number of bytes (60 MB by default) rather than up to a maximum number of seconds.

For more information, see the [hls.js documentation](https://github.com/video-dev/hls.js/blob/master/docs/API.md) on these options.

<Callout type="error">
These options apply to hls.js and Mux Player; your own player and playback engine may differ.
</Callout>

## Delete live stream assets when streaming ends

To save on storage costs, you can delete the resulting asset that gets created once your live stream has completed. This way you will limit storage charges and prevent further delivery costs. The ingest/encoding cost is still the same once the live stream has completed; this only affects storage.

<Callout type="info">
  Storage is calculated by minutes of video stored. Storage is prorated by the percentage of the month that the video is stored. For example, if a 10-minute asset is stored for only half a month, you will be charged for only 5 minutes.
</Callout>

## Pause when out of viewport

One way of reducing your delivery costs is to reduce the time viewers spend having your video play and buffer. You could implement a way to pause your video player when the viewer's browser window is out of focus or not visible. This can prevent unnecessary playback and delivery charges.

You can achieve this by listening to the `visibilitychange` event on the window object:

```javascript
// Replace the selector and pause() call with the equivalents
// for the player you're using.
const player = document.querySelector('mux-player');

document.addEventListener("visibilitychange", function () {
    if (document.visibilityState !== "visible") {
        console.log("Window is inactive, pausing video player");
        player.pause();
    }
});
```

## Are you still watching?

Many streaming services want to reduce their bandwidth and streaming delivery costs so they have implemented an "Are you still watching?" dialog popup that interrupts playback when the viewer has been watching on autoplay for an extended period of time with no interaction.

You could implement this in your own application as well. Below is a small proof of concept on how you might achieve this using React.

```jsx
import { useState } from "react";
import MuxPlayer from "@mux/mux-player-react";

export default function App() {
  const [lastPlayedTimestamp, setLastPlayedTimestamp] = useState(0);

  const playbackId = "g11xsFT2MA9E92016CuQTSh8kv01aaUhJK"
  const secondsToStopVideo = 10; // timer in seconds

  const handleAllUserActivity = (event) => {
    setLastPlayedTimestamp(event.target.currentTime) // reset the last played timestamp after each play
  };

  const handleTimeUpdate = (event) => {
    const player = event.target;
    const timeElapsed = player.currentTime - lastPlayedTimestamp;
    if (!player.paused && timeElapsed > secondsToStopVideo) {
      player.pause();
      alert("Are you still watching?");
    }
  };

  return (
    <>
      <MuxPlayer
        playbackId={playbackId}
        onPlaying={handleAllUserActivity}
        onSeeking={handleAllUserActivity}
        onRateChange={handleAllUserActivity}
        onVolumeChange={handleAllUserActivity}
        onTimeUpdate={handleTimeUpdate}
      />
    </>
  );
}
```

## Avoid loading multiple videos on one webpage

Since Mux customers are charged for any delivered video, if a video player is loaded on a webpage it *may* pre-load some amount of video before playback has been initiated.

This would result in minutes delivered just on page load before the viewer even hits the play button.

<Callout type="info">
  If you're displaying multiple videos on page load for each viewer, this could end up multiplying your bill as many videos are causing delivery charges at once. This could be very costly.
</Callout>

## Limit the duration of your uploads

If you're looking to put a duration cap on your videos, you can set duration limits upon upload. This is not supported directly in Mux's API, but you can set it up on your end by checking the duration of the video before uploading it to Mux and rejecting any videos that are too long. This is a good way to limit your costs by not uploading videos that are unnecessarily long.

<Callout type="info">
  This is usually done by UGC platforms (social media) given their short-form content focus, but also by platforms looking to make sure they're not paying for unnecessary ingest costs.
</Callout>

Below is a small proof of concept on how you might achieve this using React.

```jsx
import React, { useRef, useState } from 'react';
import * as UpChunk from '@mux/upchunk';

function VideoUpload() {
  const pickerRef = useRef(null);
  const [uploading, setUploading] = useState(false);

  const getVideoDuration = (file) => {
    return new Promise((resolve) => {
      const video = document.createElement('video');
      video.preload = 'metadata';

      video.onloadedmetadata = () => {
        window.URL.revokeObjectURL(video.src);
        resolve(video.duration);
      };

      video.src = URL.createObjectURL(file);
    });
  };

  const getUploadUrl = () =>
    fetch('/the-backend-endpoint').then((res) => res.text());

  const handleUpload = async () => {
    const file = pickerRef.current?.files[0];
    
    if (!file) {
      alert('Please select a file');
      return;
    }

    console.log(file);
    const duration = await getVideoDuration(file);

    if (duration > 300) {
      // 5 minutes
      console.log(duration);
      alert('Video must be under 5 minutes');
      pickerRef.current.value = '';
      return;
    }

    setUploading(true);

    const upchunkUpload = UpChunk.createUpload({
      endpoint: getUploadUrl,
      file: file,
      chunkSize: 5120, // Uploads the file in ~5mb chunks
    });

    // subscribe to events
    upchunkUpload.on('error', (err) => {
      console.error('💥 🙀', err.detail);
      setUploading(false);
    });

    upchunkUpload.on('success', () => {
      console.log('Upload complete! 🎉');
      setUploading(false);
      pickerRef.current.value = '';
    });

    upchunkUpload.on('progress', (progress) => {
      console.log(`Upload progress: ${progress.detail}%`);
    });
  };

  return (
    <div>
      <input
        ref={pickerRef}
        id="picker"
        type="file"
        accept="video/*, audio/*"
        disabled={uploading}
      />
      <button 
        id="send" 
        onClick={handleUpload}
        disabled={uploading}
      >
        {uploading ? 'Uploading...' : 'Upload'}
      </button>
    </div>
  );
}

export default VideoUpload;
```
