Published on January 18, 2022

Make a stats video your MoM would be proud of

By Dave Kiss · 25 min read · Engineering

There's a quantifiable number for everything. That fact has always been mind-bending to me.

It's hard to fathom the number of grains of sand that are on Second Beach, or the number of times Roy Kent cusses in a football season, or the number of times my wife has had to remind me where I left my water bottle—but each of these does have a finite number.

Live footage of me counting my mistakes

Numbers are all around us, but they're meaningless unless they're surfaced in a way that lets us draw conclusions from them.

I've always loved using charts as a way to visualize data. There's something so satisfying about checking how many steps I have, watching my year-in-review listening trends video, and refreshing my website analytics.


So I had a thought. Mux Data makes it easy to get data about video user engagement and quality of experience, and there's an API to pull that data for further analysis. Why not create something similar to expose interesting trends from my Mux account? I was intrigued.

This article details what I came up with. I'll walk you through a method to leverage the React chops you already have to create a video exposing your Mux account's MoM (month-over-month) performance.

You’ll look like a hero in front of your boss (you can tell them the idea was all yours—really, I don't mind) and you'll have a conversation piece to show your family over this year's Christmas dinner so that maybe, just maybe, this will be the year they finally understand what it is that you do for a living.

What will I make?

We’ll build and render a video highlighting some viewing trends pulled right from your Mux Data account. Watch the video below for an example of the final product:

Can't you just gimme the code?

For those of you who'd rather peruse the codebase than read this heckin' chonka article, you can find it available here: https://github.com/davekiss/mux-remotion-demo

I don't cut corners. Let's get building.

I like your attitude.

Install ffmpeg

brew install ffmpeg

If you don't have ffmpeg installed on your system, you'll need to get that going first. FFmpeg is a cross-platform solution to record, convert, and stream audio and video. We’ll be using it in this project to render our canvas into a video format.

The install process differs based on your operating system, but if you're on a Mac with Homebrew, you can install it by typing brew install ffmpeg into your terminal.

This is gonna take a minute, so I'd recommend you go for a walk around the block, drink some water, call someone you love, and take 10 deep breaths. Ready? I feel better already.

Clone the starter repo

We'll be using a tool called Remotion to make this stats video. Remotion is a neat library that allows you to create videos by writing React code.

To save some time with setup and configuration, we've provided a starter repo that's ready to roll. You can download and install it by issuing the following commands in your terminal:

shell
cd ~ && git clone https://github.com/davekiss/mux-remotion-demo
cd mux-remotion-demo && npm install

Add your Mux access tokens

In order to pull the data from your Mux account, you'll need to obtain access tokens to establish this secure connection.

To do this, sign in to your Mux account, and within your dashboard, visit Settings → API Access Tokens. You can create a new token pair by clicking on the Generate new token button in the top right.

Now, click the button to download your credentials as an .env file. This is the file that we'll use in the project repo.

Move this newly downloaded file to the root of the mux-remotion-demo directory and rename it to .env. Since the renamed file now begins with a ., it may no longer appear in Finder. Don’t worry, it’s still there! By default, these “dotfiles” are hidden from view. You can toggle the visibility of your dotfiles in Finder by pressing Command + Shift + .

We’re ready to fetch the data from your Mux account!

Querying and caching the Mux Data API

Remotion works by loading up your React app in a headless Chrome instance, snapshotting the visual representation of the current frame, and then reloading the page with a request to progress to the next frame. This means that your data will be reloaded on every single frame render, which is no bueno for maintaining data consistency or avoiding rate limits from the Mux API.

Instead of making a request to the Mux Data API on every frame render, we'll pre-bake the data that we need to populate our video by running a script that will fetch the expected responses and save the results as JSON files on our local system.

We've added a few dependencies that will be helpful in achieving this task.

yarn add --dev axios date-fns dotenv

Note: If you don't have any Mux Data in your account, never fear! We've included seed data in the project repository so you can still follow along. Simply copy the mock data JSON files from the src/seeds directory into the src/data directory (create it if it doesn't exist) and skip ahead to the Visualizing your Mux data in a video section below.

There’s a data fetcher script located at src/scripts/hydrate.ts and an entry in the package.json scripts called hydrate that will allow us to run the script.

javascript
{ "scripts": { ..., "hydrate": "ts-node ./src/scripts/hydrate.ts" } }

The hydration script is the backbone of this project and provides a means of interacting with the Mux Data API.

In this script, we're describing 5 separate requests that will be made to fetch some unique datasets that give us an idea of how users are interacting with the video content in our Mux account. The 5 datasets that we care about for the purposes of this article are:

  1. Overall video viewership stats
  2. Number of views by device
  3. Number of views by video title
  4. Unique viewers by country
  5. Unique viewers by browser

For each of these datasets, we're going to fetch 2 date ranges—one describing the recorded values for the past 30 days, and another describing the values from the 30 days before that, which we can use for trend comparison, for a total of 10 separate requests.

For this script, we're going to dynamically build the Mux endpoint string value that will be targeted for each request. Let's take a look at one of these requests so we understand each part.

javascript
const now = new Date();
const one_month_ago = getUnixTime(subMonths(now, 1));
const two_months_ago = getUnixTime(subMonths(now, 2));

const pastMonthTimeframe = `timeframe[]=${one_month_ago}&timeframe[]=${getUnixTime(now)}`;
const previousMonthTimeframe = `timeframe[]=${two_months_ago}&timeframe[]=${one_month_ago}`;

const fetchData = async (metric: Metric, type: DataType, querystring: string) => {
  try {
    const [pastMonthResponse, previousMonthResponse] = await Promise.all([
      axios.get(`https://api.mux.com/data/v1/metrics/${metric}/${type}?${pastMonthTimeframe}${querystring}`, { headers }),
      axios.get(`https://api.mux.com/data/v1/metrics/${metric}/${type}?${previousMonthTimeframe}${querystring}`, { headers })
    ]);

    return [pastMonthResponse.data, previousMonthResponse.data];
  } catch (error) {
    if (axios.isAxiosError(error)) {
      // Access to config, request, and response
    } else {
      console.log(error);
    }
    return [];
  }
}

By using JavaScript string interpolation, we can substitute values on a per-request basis. Mux's various API endpoints make it easy to expose just the data you need in a format that is repeatable and easy to understand.
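One thing the snippet above glosses over is the headers constant passed to axios. Here's a minimal sketch of how it could be built from the tokens in your .env file, assuming the MUX_TOKEN_ID and MUX_TOKEN_SECRET variable names that Mux's downloaded credentials file typically uses (adjust if yours differ). Mux's REST APIs authenticate with HTTP Basic auth built from your token ID and secret.

javascript
import "dotenv/config";

// Assumption: these variable names match the .env file downloaded from the Mux dashboard.
const token = Buffer.from(
  `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
).toString("base64");

// Mux authenticates API requests with HTTP Basic auth using the token pair.
const headers = { Authorization: `Basic ${token}` };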

javascript
// Views by video title
{
  metric: "views",
  type: "breakdown",
  outputFilename: "views_by_title.json",
  group_by: "video_title",
  order_by: "views",
  limit: 10,
}

For this Views by video title request, we're specifying that we'd like a breakdown of views grouped by video_title, ordered by the views value, and limited to only 10 rows.

We'll loop through each of these 5 request definitions, build the appropriate API endpoint based on the request values, and fire off queries to the Mux Data API. Then, we'll save the responses to our local project so we can use them later.

javascript
const hydrate = async () => {
  const dir = "./src/data";
  await fs.rm(dir, { recursive: true, force: true });
  await fs.mkdir("./src/data");

  await Promise.all(
    REQUESTS.map(async ({ type, metric, group_by, limit, order_by, order_direction = "desc", filters = [], outputFilename }) => {
      const params = type === "breakdown"
        ? `&group_by=${group_by}&limit=${limit}&order_by=${order_by}&order_direction=${order_direction}`
        : "";

      const filterStrings = filters.map(({ type, key, value }) =>
        `&filters[]=${type === "exclude" ? "!" : ""}${key}:${value}`
      );

      const querystring = params + filterStrings.join('');
      const response = await fetchData(metric, type, querystring);

      await fs.writeFile(`./src/data/${outputFilename}`, JSON.stringify(response));
    })
  );
}

hydrate();

Now that you understand what this script is doing, let's run it by issuing the following command to your terminal:

npm run hydrate

If all goes well, you'll now have 5 different JSON files that contain the data we'll use to visualize what's happening in your Mux account. Radical!

If you're following along from scratch, note that we’ve added "resolveJsonModule": true to the tsconfig.json to allow importing of the output JSON files in the Remotion app.
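For reference, a minimal excerpt of what that tsconfig.json entry looks like (your other compiler options will differ):

javascript
{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}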

Visualizing your Mux data in a video

In order to work on the contents of our video, we'll need to spin up a Remotion development environment, which will provide some additional tooling for us to leverage. To do that, go back to your terminal and start the engines with the following command:

npm run start

Your browser should open up, and you’ll see a video editor appear on screen. Woah, what? Super cool stuff, Remotion. Now we’ve gotta figure out how to make some video clips and put our Mux Data to good use.

Understanding Remotion compositions and sequences

There are 2 concepts specific to Remotion to understand up front: compositions and sequences. A composition is a collection of video clips that will play back to back, and a sequence is an individual video clip that appears in a composition. The video you create in this project will have 1 composition with several different sequences.

In our starter repo, we’ve created a composition in src/Video.tsx and specified that the component to use for our timeline exists in src/Timeline.tsx. We’ve also set the total composition duration to 1180 frames at 30 frames per second, for a total video length of just under 40 seconds.
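As a rough sketch, the registration in src/Video.tsx could look something like this. The duration, frame rate, and dimensions come from the starter repo, but the composition id and import names here are placeholders rather than the exact values in the repo.

javascript
import React from "react";
import { Composition } from "remotion";
import { Timeline } from "./Timeline"; // placeholder import; use the export name from the starter repo

export const Video: React.FC = () => {
  return (
    <Composition
      id="MuxStats"            // placeholder id
      component={Timeline}
      durationInFrames={1180}  // ~39 seconds at 30 fps
      fps={30}
      width={1920}
      height={1080}
    />
  );
};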

Inside the Timeline component, we then import all of our sequences and tell Remotion to play each of them back to back using a Series component. We also set the durationInFrames prop for each Sequence to ensure each clip plays back for our desired length.
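A simplified sketch of that Timeline looks like the following. The clip imports and frame counts here are illustrative, not the exact values from the repo:

javascript
import React from "react";
import { Series } from "remotion";
import { Intro } from "./clips/1-Intro";
import { Overall } from "./clips/2-Overall"; // illustrative export name

export const Timeline: React.FC = () => {
  return (
    <Series>
      <Series.Sequence durationInFrames={150}>
        <Intro />
      </Series.Sequence>
      <Series.Sequence durationInFrames={180}>
        <Overall />
      </Series.Sequence>
      {/* ...the remaining clips follow in the same back-to-back fashion */}
    </Series>
  );
};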

There’s a bit more to it under the hood, but for the purposes of this article, you’re now ready to rock in Remotion. Let’s go!

Table of contents

  • Clip 1: Video intro
  • Clip 2: Stats overview
  • Clip 3: Viewing device breakdown
  • Clip 4: Top 10 videos
  • Clip 5: USA viewer heatmap
  • Clip 6: Viewing trends by browser
  • Clip 7: Video outro

Clip 1: Introduction

For the beginning of the video, let’s set the scene and tell our viewers exactly what they’re going to gain by watching this video with an introductory clip. We’ve created a new file at src/clips/1-Intro.tsx to do just that.

In this clip, let’s show a title for our video, the date range of the data that we’ll be presenting in the video, and a few additional design elements. We’ll also animate these elements on screen for that added touch of flair.

javascript
import React from 'react';
import Layout from "../components/Layout";
import { format } from 'date-fns';
import { interpolate, useCurrentFrame, Easing } from "remotion";
import data from "../data/views_by_title.json";

export const Intro: React.FC = () => {
  const { timeframe } = data[0];
  const frame = useCurrentFrame();

  const titleOpacity = interpolate(frame, [5, 25], [0, 1]);
  const subtitleOpacity = interpolate(frame, [30, 50], [0, 1]);
  const titleY = interpolate(frame, [5, 25], [50, 0], { extrapolateRight: "clamp", easing: Easing.out(Easing.ease) });
  const subtitleY = interpolate(frame, [30, 50], [50, 0], { extrapolateRight: "clamp", easing: Easing.out(Easing.ease) });

  return (
    <Layout background="white">
      <div className="border-b border-mux-gray mt-20 pb-8">
        <h1
          className="text-mux-black leading-none tracking-tight"
          style={{ fontSize: "120px", opacity: titleOpacity, transform: `translateY(${titleY}px)` }}
        >
          Video stats overview
        </h1>
        <h2
          className="text-mux-gray mb-48 tracking-tight"
          style={{ fontSize: "120px", opacity: subtitleOpacity, transform: `translateY(${subtitleY}px)` }}
        >
          {format(new Date(timeframe[0] * 1000), 'MMM. dd')} – {format(new Date(timeframe[1] * 1000), 'MMM. dd, yyyy')}
        </h2>
      </div>
      <div>
        <h2 className="font-mono text-mux-gray uppercase text-3xl my-16 tracking-widest">Powered by Mux Data</h2>
        <div className="grid grid-cols-5 h-4">
          <div className="bg-mux-pink" />
          <div className="bg-mux-green" />
          <div className="bg-mux-blue" />
          <div className="bg-mux-lavendar" />
          <div className="bg-mux-yellow" />
        </div>
      </div>
    </Layout>
  );
};

The first thing to notice is that this component is formatted just like any other React component that returns JSX. We’ve created a shared Layout component that each video clip will render to establish a common look and feel. We’re also importing some helper functions from remotion and date-fns along with one of our Mux Data JSON files. Let’s break down what these helper functions are doing.

First, we want our clip to be aware of the current frame that the video playback is on. Remember, each frame in our video is rendered as its own snapshot, so we need updated values for every frame. The current frame value will be used in our interpolation (huh? don’t worry, we’ll get there) to calculate which opacity and transform values to use on a per-frame basis.

Remotion provides a nice utility helper for this called useCurrentFrame, which returns—you guessed it—the number value representing the current frame of the video clip.

Now that we have the current playback frame, we can pass it to the Remotion interpolate function to come up with our desired CSS values for each frame.

Interpol-what?

Let’s first understand what we’re trying to achieve here. We want to fade in the title of the video, so we need a way to know what opacity value should be used for each playback frame to emulate a fading effect.

javascript
const frame = useCurrentFrame();
const titleOpacity = interpolate(frame, [5, 25], [0, 1]);

Interpolation is the process of mapping a value from one range to another—here, mapping the playback frame range to an opacity value somewhere between 0 and 1.

When the clip playback reaches frame 5, we want to start transitioning the title’s opacity value from its initial value of 0 toward its destination value of 1. This opacity value transition should be completed when the clip playback reaches frame 25.

In other words, the interpolate function takes the current frame value as its first argument, the expected input range for that value ([5, 25]) as its second, and remaps it onto an output range of [startValue, endValue].
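To make that concrete, here's what interpolate returns at a few sample frames for the title fade (no clamping is applied here, matching the opacity example above):

javascript
import { interpolate } from "remotion";

interpolate(5, [5, 25], [0, 1]);  // 0    – the fade hasn't started yet
interpolate(15, [5, 25], [0, 1]); // 0.5  – halfway through the fade
interpolate(25, [5, 25], [0, 1]); // 1    – fully faded in
// Without clamping, frames past the input range keep extrapolating (frame 35 would give 1.5),
// which is why the position animation below adds extrapolateRight: "clamp".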

We have a similar scenario for the title’s position animation:

javascript
const titleY = interpolate(frame, [5, 25], [50, 0], {
  extrapolateRight: "clamp",
  easing: Easing.out(Easing.ease)
});

When the clip playback reaches frame 5, we want to start transitioning the title’s translateY value from its initial value of 50 toward its destination value of 0. This transition should also be completed when the clip playback reaches frame 25.

We’re also passing a configuration with extrapolateRight: "clamp" to prevent the result going outside the output range, and an easing method to make the animation nice and smooth. 🥞

We can then pass this value directly into our JSX as a style value on our title:

javascript
<h1 className="text-mux-black leading-none tracking-tight" style={{ fontSize: "120px", opacity: titleOpacity, transform: `translateY(${titleY}px)` }}>Video stats overview</h1>

Clip 2: A high-level overview

Right out of the gate, it’d be cool to show an overall summary of our video performance over the past 30 days. Then, as our video progresses, we can get a bit more granular and dig into the preferences and origins of the unique viewers who are engaging with the content within our Mux account.

Let's start our video by showing off some high-level, overall stats. We'll expose our playback trends for the content within our Mux account and compare it to our viewership trends from the month before.

A common effect in numerical visualizations is a quick count-up to the final displayed value. Let's implement that here in src/clips/2-Overall.tsx.

javascript
import { useCurrentFrame, useVideoConfig, spring, interpolate } from 'remotion';

const frame = useCurrentFrame();
const { fps } = useVideoConfig();

const driver = spring({
  frame,
  fps,
  config: {
    damping: 60,
    mass: 0.4,
    overshootClamping: true,
  },
});

const getCurrentValue = (spring: number, endValue: number) =>
  Math.ceil(
    interpolate(spring, [0, 1], [0, endValue], {
      extrapolateRight: 'clamp',
    })
  );

const totalViews = getCurrentValue(driver, data[0].data.total_views);
const totalWatchTime = getCurrentValue(driver, data[0].data.total_watch_time);

Say whaaat? Let's break this down line by line so we understand what's going on.

First, we’re going to use useCurrentFrame again to get the value of the current frame being rendered. The only difference this time is that we’re going to use a spring instead of the interpolate helper.

The second value we'll want to pass to the spring animation calculator is the video’s fps value. fps stands for “frames per second” and represents how many individual frames need to be rendered for each second of video played back. We can retrieve the fps value by destructuring the returned object from another utility helper provided by Remotion called useVideoConfig.

OK, now we’re ready to bounce right in. Strap on your moon shoes: it's time to set up the spring animation. (That’s it for the puns... for now.)


If you’ve animated elements on the web before, you may have previously used CSS keyframes to set up a basic animation. These are typically defined by specifying the duration of the animation in addition to the to and from values for a given CSS property. In other words, you describe the starting value, the ending value, and how long it takes to get from point A to point B.

Spring animations are a little different than your average linear interpolation. Springs take into consideration several different numerical values that you provide, which describe the physical attributes of the item you are animating. For example, if you are animating a wrecking ball on screen, you would provide that spring constructor a much higher mass value than you would for, say, a beach ball.

There's sometimes a bit of trial and error here in order to get the spring animation looking how you'd like, but the end result is a much more organic-looking animation that mimics imperfect real-world physics rather than precise time-based animation.


Let's set up a spring animation for creating an organic count-up animation to the final number value.

javascript
const driver = spring({
  frame,
  fps,
  config: {
    damping: 60,
    mass: 0.4,
    overshootClamping: true,
  },
});

Here, we're passing the current frame, fps, and a config object, which gives our count-up animation an increased damping value and a decreased mass. This allows the number to count up quickly but ease slowly into its final value. We also set overshootClamping to true in order to avoid displaying a number higher than the actual final value.

In order to get the current count-up value to show in the current frame, let's write a utility function that takes our spring and our final value and calculates a somewhere-in-between output value.

javascript
const getCurrentValue = (spring: number, endValue: number) =>
  Math.ceil(
    interpolate(spring, [0, 1], [0, endValue], {
      extrapolateRight: 'clamp',
    })
  );

By default, our spring will calculate a value from 0 to 1 based on the current frame, but we want to map that value to a value between 0 and our endValue—which will produce the counting-up effect.

The interpolate function will take the current spring value as a first argument, specify the possible range for this value as [0, 1], and remap it to a range of [0, endValue]. We pass a configuration with extrapolateRight: "clamp" to prevent the result going outside the output range.

Lastly, we're rounding up the calculated value so we only get whole numbers.

Phew! Got all that? Don’t worry, springs are the toughest part of this whole project (after all, you are pretending to be an actual physicist). If it doesn’t make much sense, give it another read, or email me and tell me to do a better job with my writing.

Let's put it to the test! We'll pass the spring and the end value to this helper function in order to calculate the current value for both the totalViews and the totalWatchTime:

javascript
const totalViews = getCurrentValue(driver, data[0].data.total_views);
const totalWatchTime = getCurrentValue(driver, data[0].data.total_watch_time);

Now we can use these 2 values right in our React JSX template. You’ll see that as the video playhead progresses, the spring value will be recalculated, and an updated value will be rendered to the screen. Nice!
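The markup itself can be as simple as dropping those values into the clip's return block. Here's a hypothetical fragment; the real clip uses the shared Layout component and the repo's Tailwind classes:

javascript
// Hypothetical JSX inside the Overall clip's return statement
<div className="grid grid-cols-2">
  <div>
    <h2 className="font-mono uppercase">Total views</h2>
    <p style={{ fontSize: "120px" }}>{totalViews}</p>
  </div>
  <div>
    <h2 className="font-mono uppercase">Total watch time</h2>
    <p style={{ fontSize: "120px" }}>{totalWatchTime}</p>
  </div>
</div>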

Our total views and total watch time numbers are great, but they’d be even better if we could see how they compared to the values from the previous 30 days. Let’s grab those numbers from our data response and render them in a new Trend component in src/components/Trend.tsx.

javascript
import React from 'react';
import { useCurrentFrame, useVideoConfig, spring } from 'remotion';

const Trend = ({
  border = false,
  color,
  previousMonthValue,
  pastMonthValue
}: {
  border: boolean;
  color: string;
  previousMonthValue: number;
  pastMonthValue: number;
}) => {
  const frame = useCurrentFrame();
  const { fps } = useVideoConfig();

  const y = spring({
    frame,
    from: 100,
    to: 0,
    fps,
    config: {
      stiffness: 100,
    },
  });

  const delta = Math.abs((1 - pastMonthValue / previousMonthValue) * 100).toFixed(1);
  const isTrendingUp = pastMonthValue > previousMonthValue;
  const prefix = isTrendingUp ? "+" : "-";

  return (
    <div
      className={`mt-4 text-3xl px-4 py-3 rounded-lg ${border ? "border-2" : ""} border-mux-${color}-darker font-mono uppercase`}
      style={{ width: "fit-content", transform: `translateY(${y}px)` }}
    >
      <span className={`text-mux-${color}-darkest tracking-widest`}>{prefix}{delta}% from last month</span>
    </div>
  );
}

export default Trend;

We’re creating a basic spring here, but this time we're animating from 100 to 0. We then pass the calculated value for the current frame to the transform CSS property. As this value decreases, the component will appear to move upwards on-screen.

We’re also calculating the percentage change from the previousMonthValue to the pastMonthValue and rendering out a + or - depending on the direction of the trend. Cool!
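For example, with made-up numbers:

javascript
// previousMonthValue = 2,000 views last month, pastMonthValue = 2,500 views this month
Math.abs((1 - 2500 / 2000) * 100).toFixed(1); // "25.0"
// pastMonthValue > previousMonthValue, so the prefix is "+" and we render "+25.0% from last month"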

Clip 3: Viewing device breakdown

In this clip, we want to answer the question "which devices are my videos being watched on the most?" Understanding the environments in which your content is consumed gives you helpful context about your viewers' relationship with it.

Mux Data tracks device information out of the box, so all we need to do in order to surface this data is make an API call and render out the response fields.

For this clip, let's animate a bar chart for each individual device. Each bar will represent a device's views as a percentage of the leading device's views within the current dataset.

First, let's figure out the number of views for the leading device. This can be achieved by sorting through the array and taking the value of the first result.

javascript
const leadingDeviceViews = data[0].data
  .sort((a, b) => b.views - a.views)[0].views;

Now that we know the max device views across this dataset, we can figure out what the percentage of this max is for any individual device's views value.

javascript
(device.views / leadingDeviceViews) * 100;

Psst. This is elementary school math, but I still have to look up this equation every time. Don't feel bad if you do, too. Solidarity.
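Here's the same math with made-up numbers:

javascript
// leadingDeviceViews = 5,000 and this device has 1,250 views
(1250 / 5000) * 100; // 25 – this device's bar will be 25% the width of the leading device's bar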

Next, we can create a reusable component file at src/components/Measure.tsx that will render a bar chart visualization for each value.

javascript
import React from 'react';
import { useCurrentFrame, useVideoConfig, spring } from 'remotion';

const Measure = ({ index, value }: { index: number, value: number }) => {
  const frame = useCurrentFrame();
  const { fps } = useVideoConfig();

  const width = spring({
    frame: frame - 20 - index * 8, // delay the starting frame of the animation
    from: 0,
    to: value,
    fps,
    config: {
      damping: 60,
    },
  });

  return (
    <div className="absolute inset-0 bg-white" style={{ width: `${width}%` }} />
  );
};

export default Measure;

As in the previous clip, we're creating a spring animation that uses the current frame value and the fps.

However, this time, we're changing 2 things:

  1. We’ve modified the to value to be equal to the views value for the current device iteration instead of the default spring to value of 1, and
  2. We're passing a modified value as the current frame value. By default, springs start animating at frame 0, so by tweaking this value and making it negative, we're telling the spring that it isn't time to start animating just yet.

You might notice the frame value contains something of a unique equation:

frame: frame - 20 - index * 8

By factoring in the current device index, we create a cascading delay for each instance of the Measure component: subtracting 20 delays every bar by 20 frames, and multiplying the index by 8 adds another 8 frames of delay for each subsequent bar, as illustrated below.
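Worked out with a hypothetical helper, the stagger looks like this:

javascript
// frame - 20 - index * 8 stays negative (the spring waits) until frame reaches 20 + index * 8
const startFrame = (index: number) => 20 + index * 8;

startFrame(0); // 20 – the first bar starts animating at frame 20
startFrame(1); // 28
startFrame(4); // 52 – each subsequent bar waits 8 more frames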

Let’s also create a trickle-down effect for the table row for each device. We’ll create a spring inside a new inline Stat component, which will calculate the Y-axis offset of the row, along with a simple interpolation for the opacity of the row, to produce a slide-and-fade-in effect.

javascript
const Stat = ({ index, children }: { index: number, children: React.ReactNode }) => {
  const frame = useCurrentFrame();
  const { fps } = useVideoConfig();

  // Scale the index value up by a factor of 8
  const scale = index * 8;

  const offset = spring({
    frame: frame - 10 - scale, // delay the starting frame of the animation
    from: -100,
    to: 0,
    fps,
    config: {
      damping: 60,
      mass: 0.4
    }
  });

  const opacity = interpolate(frame, [10 + scale, 20 + scale], [0, 1]);

  return (
    <div
      className="flex items-center border-t-2 border-mux-green-darker p-4 relative"
      style={{ transform: `translateY(${offset}px)`, opacity }}
    >
      {children}
    </div>
  );
}

Again, another spring animation, but this time we're animating from -100 to 0 and passing the resulting value to the CSS transform.

Clip 4: Top 10 videos

This clip will show the top 10 most popular video titles by views. This will give us an idea of the performance of specific pieces of content within our video library over the past 30 days.

For the presentation, we’ll build upon the lessons we’ve already learned in previous clips—only this time, we’ll render the results across 2 separate columns, modifying each Measure component’s delay by passing the result’s index to the rendered bar chart.

javascript
import React from 'react';
import Layout from "../components/Layout";
import Measure from "../components/Measure";
import { formatNumber } from '../utils';
import data from "../data/views_by_title.json";

const Stat = ({ children }: { children: React.ReactNode }) => (
  <div className="flex border-t-2 border-mux-purple pt-5 px-4 text-4xl relative h-36">{children}</div>
)

const Value = ({ children }: { children: React.ReactNode }) => (
  <div className="font-normal z-10 tracking-tight">{children}</div>
)

const Label = ({ children }: { children: React.ReactNode }) => (
  <div className="text-mux-black flex-1 z-10 mr-10 tracking-tight">{children}</div>
)

const Index = ({ children }: { children: React.ReactNode }) => (
  <div className="text-mux-purple mr-8 z-10 w-10">{children}.</div>
)

export const VideoTitles: React.FC = () => {
  const maxDatasetViews = data[0].data.sort((a, b) => b.views - a.views)[0].views;

  return (
    <Layout
      bodyClass="bg-mux-lavendar"
      title="Top 10 videos by viewership"
      timeframe={data[0].timeframe}
    >
      <div className="grid grid-cols-2 gap-x-10">
        <div>
          {data[0].data.slice(0, 5).map((video_title, i) => (
            <Stat key={video_title.field}>
              <Measure index={i} value={(video_title.views / maxDatasetViews) * 100} />
              <Index>{i + 1}</Index>
              <Label>{video_title.field}</Label>
              <Value>{formatNumber(video_title.views)}</Value>
            </Stat>
          ))}
        </div>
        <div>
          {data[0].data.slice(5, 10).map((video_title, i) => (
            <Stat key={video_title.field}>
              <Measure index={i + 5} value={(video_title.views / maxDatasetViews) * 100} />
              <Index>{i + 6}</Index>
              <Label>{video_title.field}</Label>
              <Value>{formatNumber(video_title.views)}</Value>
            </Stat>
          ))}
        </div>
      </div>
    </Layout>
  );
};

Clip 5: Where in the world???

For this clip, we'll be using the react-simple-maps library to create a map view of the United States of America. States with stronger viewership numbers will be represented in a darker color, while states with fewer viewers appear lighter. You can install it with the following command:

yarn add react-simple-maps

The react-simple-maps library provides a lot of niceties right out of the box. This includes both geographical data and React components to represent visuals such as graticules (longitude and latitude lines), state boundaries, and more.

Let’s put it to use in a new MapChart component file located at src/components/MapChart.tsx.

javascript
import { ComposableMap, Geographies, Geography } from "react-simple-maps";
import { interpolateColors } from "remotion";

const geoUrl = "https://cdn.jsdelivr.net/npm/us-atlas@3/states-10m.json";

const MapChart = ({ data }: { data: StateData[] }) => {
  return (
    <ComposableMap
      width={1000}
      height={500}
      projection="geoAlbersUsa"
    >
      {data.length > 0 && (
        <Geographies geography={geoUrl}>
          {({ geographies }) =>
            geographies.map((geo) => {
              const d = data.findIndex((s) => s.field === geo.properties.name);
              const val = d + 1 || 50;
              const color = interpolateColors(val, [1, 50], ["#FB501D", "#FFF3C7"]);

              return (
                <Geography
                  key={geo.rsmKey}
                  geography={geo}
                  fill={color}
                  stroke="#FED32F"
                />
              );
            })
          }
        </Geographies>
      )}
    </ComposableMap>
  );
};

export default MapChart;

Our MapChart component follows react-simple-maps conventions to render a ComposableMap with a child Geographies component. The geography prop is fed the URL of a TopoJSON file containing the boundary data for the US, and the component provides a geographies array via its render prop, with an entry for each state in the US.

By iterating over the geographies we can:

  1. Search our imported Mux data for the current state geography by name and find its index (which tells us how it ranks in viewership compared to the rest of the US).
  2. Adjust the index by 1 (so the range is from 1-50 instead of 0-49).
  3. Pass the adjusted index to Remotion’s interpolateColors function. This will generate a hex color value that falls somewhere between #FB501D and #FFF3C7 based on the value of the index.
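For instance, here's roughly how interpolateColors behaves at the ends and the middle of that range (it returns a CSS color string you can pass straight to fill):

javascript
import { interpolateColors } from "remotion";

interpolateColors(1, [1, 50], ["#FB501D", "#FFF3C7"]);  // the top-ranked state gets the full #FB501D
interpolateColors(50, [1, 50], ["#FB501D", "#FFF3C7"]); // the lowest-ranked (or missing) state gets #FFF3C7
interpolateColors(25, [1, 50], ["#FB501D", "#FFF3C7"]); // states in between get a blend of the two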

With the color generated, we can render out a Geography component that has a unique fill color based on the state’s viewership ranking within Mux Data. 🔥🔥🔥

Clip 6: Visualizing unique viewers across browsers

Phew, hang tight, we’re almost there!

For our last clip, we want to show the usage stats for specific web browsers and learn more about what browsers our audience is using to view our content.

We'll be leveraging an NPM package that brings high-quality browser logos to your project. Here's the command we used for the install:

yarn add alrra/browser-logos#70.4.0

There's an awesome video by the folks at Sparkbox that explains how to create a pie chart using SVGs. Rather than describe the process here, I'd highly encourage following along with their code-along video and learning something new.

Let’s implement the pie chart by creating a new inline PieChart component:

javascript
const PieChart = ({
  index,
  value,
  percentage,
}: {
  index: number,
  value: number,
  percentage: number,
}) => {
  const frame = useCurrentFrame();
  const { fps } = useVideoConfig();

  const measure = spring({
    frame: frame - 10 - (index * 3), // delay the starting frame of the animation
    from: 0,
    to: percentage,
    fps
  });

  return (
    <div className="relative w-72 h-72 items-center flex justify-center mb-5">
      <svg className="w-72 h-72 absolute" viewBox="0 0 20 20">
        <circle
          r="5"
          cx="10"
          cy="10"
          fill="transparent"
          stroke="white"
          strokeWidth="10"
          strokeDasharray={`calc(${measure} * 31.42 / 100) 31.42`}
          transform="rotate(-90) translate(-20)"
        />
      </svg>
      <div className="z-10 flex flex-col items-center justify-center">
        <div className="font-sans font-normal flex items-start">
          <span className="tracking-tight" style={{ fontSize: "100px" }}>{percentage.toFixed(1)}</span>
          <span style={{ fontSize: "30px", transform: "translateY(36px)" }}>%</span>
        </div>
        <span className="font-sans font-normal tracking-tight" style={{ fontSize: "36px" }}>{formatNumber(value)} views</span>
      </div>
    </div>
  );
}

You're a spring expert by now. We're once again creating a spring animation, this time animating only from 0 to percentage—where percentage is the value that can be attributed to the current browser's usage amongst the rest of the browsers within the dataset.
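The 31.42 in the strokeDasharray isn't magic: it's the circle's circumference, which is how a percentage becomes an arc length.

javascript
// The SVG circle has r = 5, so its circumference is 2 * Math.PI * 5 ≈ 31.42
2 * Math.PI * 5;    // ≈ 31.4159
(25 / 100) * 31.42; // ≈ 7.86 – a 25% share paints roughly a quarter of the circle
// strokeDasharray={`calc(${measure} * 31.42 / 100) 31.42`} draws an arc covering `measure`
// percent of the circumference, followed by a gap that covers the rest.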

Now you can get a sense of how to put this pie chart to use, along with the way we can look up the logo associated with each data entry:

javascript
import React from 'react';
import { Img } from "remotion";
import Layout from "../components/Layout";
import Trend from "../components/Trend";
import { formatNumber } from '../utils';
import Chrome from '../../node_modules/browser-logos/src/chrome/chrome_256x256.png';
import Firefox from '../../node_modules/browser-logos/src/firefox/firefox_256x256.png';
import Safari from '../../node_modules/browser-logos/src/safari/safari_256x256.png';
import Edge from '../../node_modules/browser-logos/src/edge/edge_256x256.png';
import Opera from '../../node_modules/browser-logos/src/opera/opera_256x256.png';
import data from "../data/unique_viewers_by_browser.json";

const LOGO_LOOKUP: Record<string, string> = {
  Chrome,
  Safari,
  Opera,
  Edge,
  Firefox,
}

export const Browsers: React.FC = () => {
  const totalDatasetViewers = data[0].data
    .map(d => d.value)
    .reduce((previousValue, currentValue) => previousValue + currentValue);

  return (
    <Layout bodyClass="bg-mux-blue" title="Top browsers by views" timeframe={data[0].timeframe}>
      <div className="grid grid-cols-4">
        {data[0].data.map((row, i) => {
          const icon = LOGO_LOOKUP[row.field];
          if (!icon) return null;

          const previousMonthViews = data[1].data.find(d => d.field === row.field)?.value || 0;

          return (
            <div key={row.field} className={`flex flex-col items-center ${i < 3 ? "border-r-2 border-mux-blue-darker" : ""}`}>
              <PieChart index={i} value={row.value} percentage={(row.value / totalDatasetViewers) * 100} />
              <div className="w-80 text-center mb-16">
                <Trend border={false} color="blue" pastMonthValue={row.value} previousMonthValue={previousMonthViews} />
              </div>
              <div className="w-32 h-32 bg-white p-5 rounded-lg mb-10">
                <Img src={icon} className="mb-4" />
              </div>
              <p className="text-5xl text-mux-black font-sans tracking-tight">{row.field}</p>
            </div>
          );
        })}
      </div>
    </Layout>
  );
};

Clip 7: Outro

We’re done here! Let’s show a simple prerecorded video of the Mux logo with a cool particle animation. Remotion’s Video component makes it easy to embed an existing .webm video file directly into your video clip.

javascript
import React from 'react';
import { Video } from "remotion";
import video from "../static/MuxLogo.webm";

export const Outro: React.FC = () => {
  return (
    <div className="flex items-center justify-center w-full bg-white px-48">
      <div className="py-72 w-full flex items-center justify-center">
        <Video src={video} style={{ height: 1080 / 2, width: 1920 / 2 }} />
      </div>
    </div>
  );
};

Time to render!

Now that we have some real numbers and the code is hopefully bug-free, let's push the "do it" button and tell our computer to stitch our clips together into an award-winning video. For that, we'll use the familiar command you've likely run into before if you've done any work with JavaScript:

yarn build

The video file will be output to your /out folder. Go ahead, take a moment and bask in the rad video you just created—all without opening After Effects or touching a camera.

Post and share

Here's the part where you get to show off and take all of the credit for this project. You can surprise-airdrop your video to a synced company folder, post it in Slack, or host it somewhere on the internets (a dead-simple home for adding your video is Mux's own https://stream.new—but, of course, we're biased).

But wait—there's more!

Hopefully this gives you a good idea of how to get started with creating a video on the fly with real video stats from your Mux account. However, you don't have to stop here! Mux provides many different metrics that can be surfaced for visualization.

Here are a few more ideas to try that might enhance your video:

  • Auto-generate a new rendition using a serverless function every month and auto-email it to your boss.
  • Loop this video on big-screen displays around the office and be pumped about how many compliments you get.
  • Analyze your own custom metadata that you’ve sent to Mux Data. For example, you might aggregate on a provided user.plan_name parameter so you can see how your viewer trends differ across your company's tiered offerings.
  • Build a report based on your content’s stream_type: are most people watching on-demand, live, or via Mux’s low-latency offering?
  • Expose the most popular playback pages. Grab a screenshot of the playback page URL using something like Puppeteer and render it on demand, directly in your video.

Questions? Comments? Feedback?

If you liked this guide, have additional questions, got stuck somewhere, or just want to say hey, feel free to drop me a line at dave@mux.com—I’d love to hear from you!

P.S. Our very own Rob Mach is responsible for all of the awesome graphics you see in the video and the featured image. I didn't know someone could possess that much talent, did you? Thanks for all your help, Rob.

Written By

Dave Kiss – Senior Community Engineering Lead

Fake programmer. Likes bikes, trees, and cookie dough custard. Collects wood scraps. Lots of spiders live in his basement.
