January 18, 2022
It's hard to fathom the number of grains of sand that are on Second Beach, or the number of times Roy Kent cusses in a football season, or the number of times my wife has had to remind me where I left my water bottle—but each of these does have a finite number.
Live footage of me counting my mistakes
I've always loved using charts as a way to visualize data. There's something so satisfying about checking how many steps I have, watching my year-in-review listening trends video, and refreshing my website analytics.
So I had a thought. Mux Data makes it easy to get data about video user engagement and quality of experience, and there's an API to pull that data for further analysis. Why not create something similar to expose interesting trends from my Mux account? I was intrigued.
This article details what I came up with. I'll walk you through a method to leverage the React chops you already have to create a video exposing your Mux account's MoM (month-over-month) performance.
You’ll look like a hero in front of your boss (you can tell them the idea was all yours—really, I don't mind) and have a conversation piece to show your family over this year's Christmas dinner, so that maybe, just maybe, this will be the year they finally understand what it is that you do for a living.
We’ll build and render a video highlighting some viewing trends pulled right from your Mux Data account. Watch the video below for an example of the final product:
For those of you who'd rather peruse the codebase than read this heckin' chonka article, you can find it available here: https://github.com/davekiss/mux-remotion-demo
I like your attitude.
brew install ffmpeg
If you don't have ffmpeg installed on your system, you'll need to get that going first. FFmpeg is a cross-platform solution to record, convert, and stream audio and video. We’ll be using it in this project to render our canvas into a video format.
The install process differs based on your operating system, but if you're on a Mac with Homebrew, you can install it by typing brew install ffmpeg into your terminal.
This is gonna take a minute, so I'd recommend you go for a walk around the block, drink some water, call someone you love, and take 10 deep breaths. Ready? I feel better already.
We'll be using a tool called Remotion to make this stats video. Remotion is a neat library that allows you to create videos by writing React code.
To save some time with setup and configuration, we've provided a starter repo that's ready to roll. You can download and install it by issuing the following commands in your terminal:
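Assuming a standard Node.js setup, cloning the demo repo linked above and installing its dependencies looks like this:

```shell
git clone https://github.com/davekiss/mux-remotion-demo.git
cd mux-remotion-demo
npm install
```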
In order to pull the data from your Mux account, you'll need to obtain access tokens to establish this secure connection.
To do this, sign in to your Mux account, and within your dashboard, visit Settings → API Access Tokens. You can create a new token pair by clicking on the Generate new token button in the top right.
Now, click the button to download your credentials as an .env file. This is the file that we'll use in the project repo.
Move this newly downloaded file to the root of the mux-remotion-demo directory and rename the file to .env. Since your renamed file now begins with a ., it may no longer appear in Finder. Don’t worry, it’s still there! By default, these “dotfiles” are hidden from view. You can toggle the visibility of your dotfiles in Finder by pressing Command + Shift + . (period). We’re ready to fetch the data from your Mux account!
Remotion works by loading up your React app in a headless Chrome instance, snapshotting the visual representation of the current frame, and then reloading the page with a request to progress to the next frame. This means that your data would be reloaded on every single frame render, which is no bueno for maintaining data consistency or avoiding rate limits from the Mux API.
Instead of making a request to the Mux Data API on every frame render, we'll pre-bake the data that we need to populate our video by running a script that will fetch the expected responses and save the results as JSON files on our local system.
We've added a few dependencies that will be helpful in achieving this task.
yarn add --dev axios date-fns dotenv
Note: If you don't have any Mux Data in your account, never fear! We've included seed data in the project repository so you can still follow along. Simply copy the mock data JSON files from the src/seeds directory into the src/data directory (create it if it doesn't exist) and skip ahead to the Visualizing your Mux data in a video section below.
There’s a data fetcher script located at scripts/hydrate.ts and an entry to the package.json scripts called hydrate that will allow us to run the script.
The hydration script is the backbone of this project and provides a means of interacting with the Mux Data API.
In this script, we're describing 5 separate requests that will be made to fetch some unique datasets that give us an idea of how users are interacting with the video content in our Mux account. The 5 datasets that we care about for the purposes of this article are:

1. Overall views and watch time
2. Views by video title
3. Views by device
4. Views by US state
5. Views by browser
For each of these datasets, we're going to fetch 2 date ranges—one describing the recorded values for the past 30 days, and another describing the values from the 30 days before that, which we can use for trend comparison, for a total of 10 separate requests.
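Concretely, the two windows can be computed like this. This is a sketch using plain Date math (the actual hydrate script leans on date-fns, and `thirtyDayWindows` is an illustrative name, not one from the repo):

```typescript
// Compute Unix timestamps (in seconds) for the past 30 days and the
// 30 days before that, relative to a given "now".
// Function and field names here are illustrative, not necessarily
// the exact ones used in scripts/hydrate.ts.
const DAY = 24 * 60 * 60; // seconds in a day

function thirtyDayWindows(now: Date) {
  const end = Math.floor(now.getTime() / 1000);
  const pastMonth = { from: end - 30 * DAY, to: end };
  const previousMonth = { from: end - 60 * DAY, to: end - 30 * DAY };
  return { pastMonth, previousMonth };
}
```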
For this script, we're going to dynamically build the Mux endpoint string value that will be targeted for each request. Let's take a look at one of these requests so we understand each part.
For this Views by video title request, we're specifying that we'd like a breakdown of views grouped by video_title, ordered by the views value, and limited to only 10 rows.
We'll loop through each of these 5 request definitions, build the appropriate API endpoint based on the request values, and fire off queries to the Mux Data API. Then, we'll save the responses to our local project so we can use them later.
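The endpoint-building step boils down to serializing each request definition into a query string. Here's a sketch of the idea; the `RequestDef` shape and `buildEndpoint` name are illustrative, and the exact code in scripts/hydrate.ts may differ:

```typescript
// Build a Mux Data breakdown URL from a request definition.
// The shape and names here are illustrative sketches.
type RequestDef = {
  metric: string;   // e.g. "views"
  groupBy?: string; // e.g. "video_title"
  orderBy?: string; // e.g. "views"
  limit?: number;   // e.g. 10
};

function buildEndpoint(def: RequestDef): string {
  const params = new URLSearchParams();
  if (def.groupBy) params.set("group_by", def.groupBy);
  if (def.orderBy) params.set("order_by", def.orderBy);
  if (def.limit) params.set("limit", String(def.limit));
  return `https://api.mux.com/data/v1/metrics/${def.metric}/breakdown?${params}`;
}
```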
Now that you understand what this script is doing, let's run it by issuing the following command to your terminal:
npm run hydrate
If all goes well, you'll now have 5 different JSON files that contain the data we'll use to visualize what's happening in your Mux account. Radical!
In order to work on the contents of our video, we'll need to spin up a Remotion development environment, which will provide some additional tooling for us to leverage. To do that, go back to your terminal and start the engines with the following command:
npm run start
Your browser should open up, and you’ll see a video editor appear on screen. Woah, what? Super cool stuff, Remotion. Now we’ve gotta figure out how to make some video clips and put our Mux Data to good use.
There are 2 concepts specific to Remotion to understand up front: compositions and sequences. A composition is a collection of video clips that will play back to back, and a sequence is an individual video clip that appears in a composition. The video you create in this project will have 1 composition with several different sequences.
In our starter repo, we’ve created a composition in src/Video.tsx and specified that the component to use for our timeline exists in src/Timeline.tsx. We’ve also set the total composition duration to 1180 frames at 30 frames per second, for a total video length of just under 40 seconds.
Inside the Timeline component, we then import all of our sequences and tell Remotion to play each of them back to back using a Series component. We also set the durationInFrames prop for each Sequence to ensure each clip plays back for our desired length.
There’s a bit more to it under the hood, but for the purposes of this article, you’re now ready to rock in Remotion. Let’s go!
For the beginning of the video, let’s set the scene and tell our viewers exactly what they’re going to gain by watching this video with an introductory clip. We’ve created a new file at src/clips/1-Intro.tsx to do just that.
In this clip, let’s show a title for our video, the date range of the data that we’ll be presenting in the video, and a few additional design elements. We’ll also animate these elements on screen for that added touch of flair.
The first thing to notice is that this component is formatted just like any other React component that returns JSX. We’ve created a shared Layout component that each video clip will render to establish a common look and feel. We’re also importing some helper functions from remotion and date-fns along with one of our Mux Data JSON files. Let’s break down what these helper functions are doing.
First, we want our clip to be aware of the current frame that the video playback is on. Remember, each frame in our video contains a snapshot of data, and we need to incrementally render out frames with updated values for each frame. The current frame value will be used in our interpolation (huh? don’t worry, we’ll get there) to calculate which opacity and transform values to use on a per-frame basis.
Remotion provides a nice utility helper for this called useCurrentFrame, which returns—you guessed it—the number value representing the current frame of the video clip.
Now that we have the current playback frame, we can pass it to the Remotion interpolate function to come up with our desired CSS values for each frame.
Let’s first understand what we’re trying to achieve here. We want to fade in the title of the video, so we need a way to know what opacity value should be used for each playback frame to emulate a fading effect.
Interpolation is the method of mapping our playback range integers to an opacity value somewhere between 0 and 1.
When the clip playback reaches frame 5, we want to start transitioning the title’s opacity value from its initial value of 0 toward its destination value of 1. This opacity value transition should be completed when the clip playback reaches frame 25.
In other words, the interpolate function will take the current frame value as a first argument, specify the possible range for this value as [5, 25], and remap it to a range of [startValue, endValue].
We have a similar scenario for the title’s position animation:
When the clip playback reaches frame 5, we want to start transitioning the title’s translateY value from its initial value of 50 toward its destination value of 0. This transition should also be completed when the clip playback reaches frame 25.
We’re also passing a configuration with extrapolateRight: "clamp" to prevent the result going outside the output range, and an easing method to make the animation nice and smooth. 🥞
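Under the hood, this mapping is plain linear interpolation. Here's a minimal standalone version of the same math. Remotion's real interpolate supports multi-segment ranges, easing, and more; this simplified stand-in handles a single segment and optional right-clamping only:

```typescript
// Map `value` from inputRange to outputRange linearly.
// A simplified stand-in for Remotion's interpolate(), to show the math.
function lerp(
  value: number,
  [inMin, inMax]: [number, number],
  [outMin, outMax]: [number, number],
  clampRight = false
): number {
  let t = (value - inMin) / (inMax - inMin); // progress through the input range
  if (clampRight && t > 1) t = 1;            // emulate extrapolateRight: "clamp"
  return outMin + t * (outMax - outMin);
}
```

For the title fade, `lerp(frame, [5, 25], [0, 1], true)` holds at 0 until frame 5, ramps up, and stays pinned at 1 after frame 25.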
We can then pass this value directly into our JSX as a style value on our title:
Right out of the gate, it’d be cool to show an overall summary of our video performance over the past 30 days. Then, as our video progresses, we can get a bit more granular and dig into the preferences and origins of the unique viewers who are engaging with the content within our Mux account.
Let's start our video by showing off some high-level, overall stats. We'll expose our playback trends for the content within our Mux account and compare it to our viewership trends from the month before.
A common effect in numerical visualizations is a quick count-up to the final displayed value. Let's implement that here in src/clips/2-Overall.tsx.
Say whaaat? Let's break this down line by line so we understand what's going on.
First, we’re going to use useCurrentFrame again to get the value of the current frame being rendered. The only difference this time is that we’re going to use a spring instead of the interpolate helper.
The second value we'll want to pass to the spring animation calculator is the video’s fps value. fps stands for “frames per second” and represents how many individual frames need to be rendered for each second of video played back. We can retrieve the fps value by destructuring the returned object from another utility helper provided by Remotion called useVideoConfig.
OK, now we’re ready to bounce right in. Strap on your moon shoes: it's time to set up the spring animation. (That’s it for the puns... for now.)
If you’ve animated elements on the web before, you may have previously used CSS keyframes to set up a basic animation. These are typically defined by specifying the duration of the animation in addition to the to and from values for a given CSS property. In other words, you describe the starting value, the ending value, and how long it takes to get from point A to point B.
Spring animations are a little different than your average linear interpolation. Springs take into consideration several different numerical values that you provide, which describe the physical attributes of the item you are animating. For example, if you are animating a wrecking ball on screen, you would provide that spring constructor a much higher mass value than you would for, say, a beach ball.
There's sometimes a bit of trial and error here in order to get the spring animation looking how you'd like, but the end result is a much more organic-looking animation that mimics imperfect real-world physics rather than precise time-based animation.
Let's set up a spring animation for creating an organic count-up animation to the final number value.
Here, we're passing the current frame, fps, and a config object, which assigns our count-up animation with an increased damping value and decreased mass. This allows the number to count up quickly but slowly ease in to its final value. We also set overshootClamping to true in order to avoid displaying a number higher than the actual final value that we're going to display.
In order to get the current count-up value to show in the current frame, let's write a utility function that takes our spring and our final value and calculates a somewhere-in-between output value.
By default, our spring will calculate a value from 0 to 1 based on the current frame, but we want to map that value to a value between 0 and our endValue—which will produce the counting-up effect.
The interpolate function will take the current spring value as a first argument, specify the possible range for this value as [0, 1], and remap it to a range of [0, endValue]. We pass a configuration with extrapolateRight: "clamp" to prevent the result going outside the output range.
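The count-up math can be sketched in isolation. `countUpValue` is an illustrative name, and the rounding method is an assumption (the real helper may round differently):

```typescript
// Map a spring's 0..1 progress onto 0..endValue and round to a whole
// number, producing the count-up effect.
function countUpValue(springProgress: number, endValue: number): number {
  // Clamp so over/undershoot never displays a value outside 0..endValue.
  const clamped = Math.min(Math.max(springProgress, 0), 1);
  return Math.round(clamped * endValue);
}
```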
Lastly, we're rounding up the calculated value so we only get whole numbers.
Phew! Got all that? Don’t worry, springs are the toughest part of this whole project (after all, you are pretending to be an actual physicist). If it doesn’t make much sense, give it another read, or email me and tell me to do a better job with my writing.
Let's put it to the test! We'll pass the spring and the end value to this helper function in order to calculate the current value for both the totalViews and the totalWatchTime:
Now we can use these 2 values right in our React JSX template. You’ll see that as the video playhead progresses, the spring value will be recalculated, and an updated value will be rendered to the screen. Nice!
Our total views and total watch time numbers are great, but they’d be even better if we could see how they compared to the values from the previous 30 days. Let’s grab those numbers from our data response and render them in a new Trend component in src/components/Trend.tsx.
We’re creating a basic spring here, but this time we're animating from 100 to 0. We then pass the calculated value for the current frame to the transform CSS property. As this value decreases, the component will appear to move upwards on-screen.
We’re also calculating the percentage change from the previousMonthValue to the pastMonthValue and rendering out a + or - depending on the direction of the trend. Cool!
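Here's the trend math in isolation. Names and formatting are illustrative; the real Trend component may present the number differently:

```typescript
// Percentage change between the two 30-day windows, with a sign prefix.
// `trendLabel` is an illustrative name, not one from Trend.tsx.
function trendLabel(pastMonthValue: number, previousMonthValue: number): string {
  const change = ((pastMonthValue - previousMonthValue) / previousMonthValue) * 100;
  const sign = change >= 0 ? "+" : "-";
  return `${sign}${Math.abs(change).toFixed(1)}%`;
}
```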
In this clip, we want to answer the question "which devices are my videos being watched on the most?" Understanding the viewing environment in which your content is consumed can be helpful to understand the context of your viewer's relationship with your content.
Mux Data tracks device information out of the box, so all we need to do in order to surface this data is make an API call and render out the response fields.
For this clip, let's animate a bar chart for each individual device. Each bar will represent a device's percentage of the views for the leading device that we're visualizing within the current dataset.
First, let's figure out the number of views for the leading device. This can be achieved by sorting through the array and taking the value of the first result.
Now that we know the max device views across this dataset, we can figure out what the percentage of this max is for any individual device's views value.
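The two steps above fit in one small helper. This is a sketch (the field names and `withPercentOfMax` are illustrative):

```typescript
// Sort devices by views descending, take the leader's count as the max,
// then express each device's views as a percentage of that max.
type DeviceRow = { device: string; views: number };

function withPercentOfMax(rows: DeviceRow[]) {
  const sorted = [...rows].sort((a, b) => b.views - a.views);
  const max = sorted[0].views;
  return sorted.map((row) => ({ ...row, percent: (row.views / max) * 100 }));
}
```

The leading device always lands at 100%, so its bar spans the full width and every other bar is sized relative to it.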
Psst. This is elementary school math, but I still have to look up this equation every time. Don't feel bad if you do, too. Solidarity.
Next, we can create a reusable component file at src/components/Measure.tsx that will render a bar chart visualization for each value.
As in the previous clip, we're creating a spring animation that uses the current frame value and the fps.
However, this time, we're changing 2 things:
You might notice the frame value contains something of a unique equation:
frame: frame - 20 - index * 8
By subtracting an offset based on the current device index, we create a cascading delay effect for each instance of the Measure component: the first bar’s spring starts at frame 20, and multiplying the index by 8 pushes each subsequent bar’s start 8 frames later than the one before it.
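That offset boils down to a tiny helper. In the repo it's an inline expression rather than a named function; `staggeredFrame` is an illustrative name:

```typescript
// Delay each bar's spring by its index: bar 0 starts at frame 20,
// and each subsequent bar starts 8 frames later. A spring fed a
// negative frame simply hasn't started yet.
function staggeredFrame(frame: number, index: number): number {
  return frame - 20 - index * 8;
}
```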
Let’s also create a trickle-down effect for the table row for each device. We’ll create a spring inside a new inline Stat component, which will calculate the Y-axis offset of the row, along with a simple interpolation for the opacity of the row, to produce a slide-and-fade-in effect.
Again, another spring animation, but this time we're animating from -100 to 0 and passing the resulting value to the CSS transform.
This clip will show the top 10 most popular video titles by views. This will give us an idea of the performance of specific pieces of content within our video library over the past 30 days.
For the presentation, we’ll build upon the lessons we’ve already learned in previous clips—only this time, we’ll render the results across 2 separate columns, modifying the Measure component’s delay by passing each result’s index to the rendered bar chart.
For this clip, we'll be using the react-simple-maps library to create a map view of the United States of America. States with stronger viewership numbers will be represented in a darker color, with lighter viewership represented by a light color. You can install it with the following command:
yarn add react-simple-maps
The react-simple-maps library provides a lot of niceties right out of the box. This includes both geographical data and React components to represent visuals such as graticules (longitude and latitude lines), state boundaries, and more.
Let’s put it to use in a new MapChart component file located at src/components/MapChart.tsx.
Our MapChart component is following react-simple-maps practices to render out a ComposableMap with a child Geographies component. The geography prop is fed a JSON file that contains the boundary data for the US and produces a geographies render prop. This is an array containing information about each state in the US.
By iterating over the geographies, we can look up each state’s viewership from our Mux Data response and generate a fill color that reflects where that state ranks.
With the color generated, we can render out a Geography component that has a unique fill color based on the state’s viewership ranking within Mux Data. 🔥🔥🔥
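One simple way to derive that fill is a linear lightness scale. This is a hypothetical sketch; the actual color scale in MapChart.tsx may well differ:

```typescript
// Darken a base hue as a state's share of the max views grows:
// lightness runs from 90% (few views) down to 40% (the leading state).
// An illustrative sketch, not the repo's exact scale.
function stateFill(views: number, maxViews: number): string {
  const t = maxViews === 0 ? 0 : views / maxViews; // 0..1
  const lightness = 90 - t * 50;
  return `hsl(260, 60%, ${lightness}%)`;
}
```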
Phew, hang tight, we’re almost there!
For our last clip, we want to show the usage stats for specific web browsers and learn more about what browsers our audience is using to view our content.
We'll be leveraging an NPM package that brings high-quality browser logos to your project. Here's the command we used for the install:
yarn add alrra/browser-logos#70.4.0
There's an awesome video by the folks at Sparkbox that explains how to create a pie chart using SVGs. Rather than describe the process here, I'd highly encourage following along with their code-along video and learning something new.
Let’s implement the pie chart by creating a new inline PieChart component:
You're a spring expert by now. We're once again creating a spring animation, this time animating only from 0 to percentage—where percentage is the value that can be attributed to the current browser's usage amongst the rest of the browsers within the dataset.
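The core of the SVG pie trick is converting that percentage into a stroke-dasharray. A common variant (and the one sketched here, as an assumption about the Sparkbox approach rather than a transcription of it) picks a circle radius of 15.9155 so the circumference works out to ~100, meaning the dash length equals the percentage directly:

```typescript
// Convert a percentage into an SVG stroke-dasharray string for the
// pie-slice trick: with radius 15.9155, circumference ≈ 100, so a
// 25% slice is simply a 25-unit dash. Illustrative sketch.
const RADIUS = 15.9155;
const CIRCUMFERENCE = 2 * Math.PI * RADIUS; // ≈ 100

function pieDashArray(percentage: number): string {
  const dash = (percentage / 100) * CIRCUMFERENCE;
  return `${dash.toFixed(2)} ${CIRCUMFERENCE.toFixed(2)}`;
}
```

Feeding the spring's animated value (0 up to the browser's percentage) into `pieDashArray` makes the slice sweep open as the clip plays.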
Now you can get a sense of how to put this pie chart to use, along with the way we can look up the logo associated with each data entry:
We’re done here! Let’s show a simple prerecorded video of the Mux logo with a cool particle animation. Remotion’s Video component makes it easy to embed an existing .webm video file directly into your video clip.
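With all the clips in place, kick off a render from your terminal. The first command assumes the starter wires up Remotion's usual build script; if yours doesn't, the commented-out CLI invocation (with an illustrative entry point and composition ID) is the direct route:

```shell
# If the starter defines a build script (common in Remotion templates):
npm run build

# Or invoke the Remotion CLI directly (entry point and composition ID
# below are illustrative — check your repo for the real values):
# npx remotion render src/index.tsx Main out/video.mp4
```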
The video file will be output to your /out folder. Go ahead, take a moment and bask in the rad video you just created—all without opening After Effects or touching a camera.
Here's the part where you get to show off and take all of the credit for this project. You can surprise-airdrop your video to a synced company folder, post it in Slack, or host it somewhere on the internets (a dead-simple home for adding your video is Mux's own https://stream.new—but, of course, we're biased).
Hopefully this gives you a good idea of how to get started with creating a video on the fly with real video stats from your Mux account. However, you don't have to stop here! Mux provides many different metrics that can be surfaced for visualization.
Here are a few more ideas to try that might enhance your video:
If you liked this guide, have additional questions, got stuck somewhere, or just want to say hey, feel free to drop me a line at email@example.com—I’d love to hear from you!
P.S. Our very own Rob Mach is responsible for all of the awesome graphics you see in the video and the featured image. I didn't know someone could possess that much talent, did you? Thanks for all your help, Rob.