March 27, 2025
Adaptive Video Streaming With Dash.js In React

I was recently tasked with creating video reels that needed to play smoothly on slow networks and low-end devices. I started with the native HTML5 <video> tag but quickly hit a wall: it simply doesn’t cut it when connections are slow or devices are underpowered.

After some research, I found that adaptive bitrate streaming was the solution I needed. But here’s the frustrating part: finding a comprehensive, beginner-friendly guide was surprisingly difficult. The resources on MDN and other websites were helpful but lacked the end-to-end tutorial I was looking for.

That’s why I’m writing this article: to provide you with the step-by-step guide I wish I had found. I’ll bridge the gap between writing FFmpeg scripts, encoding video files, and implementing the DASH-compatible video player (Dash.js) with code examples you can follow.

Going Beyond The Native HTML5 <video> Tag

You might be wondering why you can’t simply rely on the HTML <video> element. There’s a good reason for that. Let’s compare how a native <video> element behaves against adaptive video streaming in browsers.

Progressive Download

With progressive downloading, your browser downloads the video file linearly from the server over HTTP and starts playback as soon as it has buffered enough data. This is the default behavior of the <video> element.

<video src="rabbit320.mp4" />

When you play the video, check your browser’s network tab, and you’ll see multiple requests with the 206 Partial Content status code.

It uses HTTP 206 Range Requests to fetch the video file in chunks. The server sends specific byte ranges of the video to your browser. When you seek, the browser will make more range requests asking for new byte ranges (e.g., “Give me bytes 1,000,000–2,000,000”).

In other words, it doesn’t fetch the entire file all at once. Instead, it delivers partial byte ranges from the single MP4 video file on demand. This is still considered a progressive download because only a single file is fetched over HTTP — there is no bandwidth or quality adaptation.

If the server or browser doesn’t support range requests, the entire video file will be downloaded in a single request, returning a 200 OK status code. In that case, the video can only begin playing once the entire file has finished downloading.
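You can see this behavior outside the <video> element, too. Here’s a small sketch (run it in a browser console or a module) that requests a single byte range and logs the server’s response; the URL is a placeholder for any file on a server that supports range requests:

const res = await fetch('https://example.com/rabbit320.mp4', {
  headers: { Range: 'bytes=1000000-2000000' },
})

// 206 when the server honors the range; 200 means it sent the whole file
console.log(res.status)

// e.g., "bytes 1000000-2000000/<total size>"
console.log(res.headers.get('Content-Range'))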

The problem? If you’re on a slow connection trying to watch high-resolution video, you’ll be waiting a long time before playback starts.

Adaptive Bitrate Streaming

Instead of serving one single video file, adaptive bitrate (ABR) streaming splits the video into multiple segments at different bitrates and resolutions. During playback, the ABR algorithm will automatically select the highest quality segment that can be downloaded in time for smooth playback based on your network connectivity, bandwidth, and other device capabilities. It continues adjusting throughout to adapt to changing conditions.

This magic happens through two key browser technologies:

Media Source Extensions (MSE)
It allows a MediaSource object to be attached to the src attribute of <video>, so the player can append video segments to the element through multiple SourceBuffer objects.

Media Capabilities API
It provides information on your device’s video decoding and encoding abilities, enabling ABR to make informed decisions about which resolution to deliver.

Together, they enable the core functionality of ABR, serving video chunks optimized for your specific device limitations in real time.
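As a sketch of how the two APIs fit together, the example below feature-detects MSE, then asks the Media Capabilities API whether a VP9 stream (the codec we’ll encode with later) can be decoded smoothly. The dimensions and bitrate here are illustrative, not prescriptive:

async function canPlayVp9Smoothly() {
  // MSE feature detection, including the specific container/codec pair
  if (!('MediaSource' in window) || !MediaSource.isTypeSupported('video/webm; codecs="vp9"')) {
    return false
  }

  // Ask the device how well it can decode this configuration
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: 'media-source',
    video: {
      contentType: 'video/webm; codecs="vp9"',
      width: 576,
      height: 1024,
      bitrate: 1500000,
      framerate: 30,
    },
  })

  // `smooth` and `powerEfficient` help an ABR algorithm pick a sensible rendition
  return info.supported && info.smooth
}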

Streaming Protocols: MPEG-DASH Vs. HLS

As mentioned above, to stream media adaptively, a video is split into chunks at different quality levels across various time points. We need a standard way to switch between these segments in real time, and ABR streaming relies on specific protocols for that. The two most common ABR protocols are:

MPEG-DASH,
HTTP Live Streaming (HLS).

Both of these protocols deliver video over HTTP, so they are compatible with standard web servers.

This article focuses on MPEG-DASH. However, it’s worth noting that DASH isn’t natively supported on Apple devices or browsers, as mentioned in Mux’s article.

MPEG-DASH

MPEG-DASH enables adaptive streaming through:

A Media Presentation Description (MPD) file
This XML manifest file contains information on how to select and manage streams based on adaptive rules.
Segmented Media Files
Video and audio files are divided into segments at different resolutions and durations using MPEG-DASH-compliant codecs and formats.

On the client side, a DASH-compliant video player reads the MPD file and continuously monitors network bandwidth. Based on available bandwidth, the player selects the appropriate bitrate and requests the corresponding video chunk. This process repeats throughout playback, ensuring smooth, optimal quality.
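To make that selection step concrete, here’s a deliberately simplified sketch of the decision. Real players, Dash.js included, also weigh buffer levels and throughput history, so treat this as an illustration rather than the actual algorithm; the rendition list mirrors the files we’ll generate later:

// Renditions ordered from highest to lowest bitrate (bits per second)
const representations = [
  { url: 'input_video_576x1024_1500k.webm', bandwidth: 1500000 },
  { url: 'input_video_480x854_1000k.webm', bandwidth: 1000000 },
  { url: 'input_video_360x640_750k.webm', bandwidth: 750000 },
]

function pickRepresentation(measuredBps) {
  // Highest bitrate that fits the measured throughput, else the lowest rendition
  return (
    representations.find((r) => r.bandwidth <= measuredBps) ??
    representations[representations.length - 1]
  )
}

// On a ~1.2 Mbps connection, the 480p rendition wins
console.log(pickRepresentation(1200000).url) // input_video_480x854_1000k.webm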

Now that you understand the fundamentals, let’s build our adaptive video player!

Steps To Build An Adaptive Bitrate Streaming Video Player

Here’s the plan:

Transcode the MP4 video into audio and video renditions at different resolutions and bitrates with FFmpeg.
Generate an MPD file with FFmpeg.
Serve the output files from the server.
Build the DASH-compatible video player to play the video.

Install FFmpeg

For macOS users, install FFmpeg using Homebrew by running the following command in your terminal:

brew install ffmpeg

For other operating systems, please refer to FFmpeg’s documentation.

Generate Audio Rendition

Next, run the following script to extract the audio track and encode it in WebM format for DASH compatibility:

ffmpeg -i "input_video.mp4" -vn -acodec libvorbis -ab 128k "audio.webm"

-i "input_video.mp4": Specifies the input video file.
-vn: Disables the video stream (audio-only output).
-acodec libvorbis: Uses the libvorbis codec to encode audio.
-ab 128k: Sets the audio bitrate to 128 kbps.
"audio.webm": Specifies the output audio file in WebM format.

Generate Video Renditions

Run this script to create three video renditions with varying resolutions and bitrates. The largest rendition should match the resolution of the input file. For example, if the input video is 576×1024 at 30 frames per second (fps), the script generates renditions optimized for vertical video playback.

ffmpeg -i "input_video.mp4" -c:v libvpx-vp9 -keyint_min 150 -g 150 \
-tile-columns 4 -frame-parallel 1 -f webm \
-an -vf scale=576:1024 -b:v 1500k "input_video_576x1024_1500k.webm" \
-an -vf scale=480:854 -b:v 1000k "input_video_480x854_1000k.webm" \
-an -vf scale=360:640 -b:v 750k "input_video_360x640_750k.webm"

-c:v libvpx-vp9: Uses libvpx-vp9, the VP9 video encoder for WebM.
-keyint_min 150 and -g 150: Set a 150-frame keyframe interval (approximately every 5 seconds at 30 fps). This allows bitrate switching every 5 seconds.
-tile-columns 4 and -frame-parallel 1: Optimize encoding performance through parallel processing.
-f webm: Specifies the output format as WebM.

In each rendition:

-an: Excludes audio (video-only output).
-vf scale=576:1024: Scales the video to a resolution of 576×1024 pixels.
-b:v 1500k: Sets the video bitrate to 1500 kbps.

WebM is chosen as the output format because its files are smaller yet still widely compatible with most modern web browsers.

Generate MPD Manifest File

Combine the video renditions and audio track into a DASH-compliant MPD manifest file by running the following script:

ffmpeg \
-f webm_dash_manifest -i "input_video_576x1024_1500k.webm" \
-f webm_dash_manifest -i "input_video_480x854_1000k.webm" \
-f webm_dash_manifest -i "input_video_360x640_750k.webm" \
-f webm_dash_manifest -i "audio.webm" \
-c copy \
-map 0 -map 1 -map 2 -map 3 \
-f webm_dash_manifest \
-adaptation_sets "id=0,streams=0,1,2 id=1,streams=3" \
"input_video_manifest.mpd"

-f webm_dash_manifest -i "…": Specifies each input file so that the DASH video player can switch between them dynamically based on network conditions.
-map 0 -map 1 -map 2 -map 3: Includes all video (0, 1, 2) and audio (3) in the final manifest.
-adaptation_sets: Groups streams into adaptation sets:
id=0,streams=0,1,2: Groups the video renditions into a single adaptation set.
id=1,streams=3: Assigns the audio track to a separate adaptation set.

The resulting MPD file (input_video_manifest.mpd) describes the streams and enables adaptive bitrate streaming in MPEG-DASH.

<?xml version="1.0" encoding="UTF-8"?>
<MPD
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xmlns="urn:mpeg:DASH:schema:MPD:2011"
  xsi:schemaLocation="urn:mpeg:DASH:schema:MPD:2011"
  type="static"
  mediaPresentationDuration="PT81.166S"
  minBufferTime="PT1S"
  profiles="urn:mpeg:dash:profile:webm-on-demand:2012">

  <Period id="0" start="PT0S" duration="PT81.166S">
    <AdaptationSet
      id="0"
      mimeType="video/webm"
      codecs="vp9"
      lang="eng"
      bitstreamSwitching="true"
      subsegmentAlignment="false"
      subsegmentStartsWithSAP="1">

      <Representation id="0" bandwidth="1647920" width="576" height="1024">
        <BaseURL>input_video_576x1024_1500k.webm</BaseURL>
        <SegmentBase indexRange="16931581-16931910">
          <Initialization range="0-645" />
        </SegmentBase>
      </Representation>

      <Representation id="1" bandwidth="1126977" width="480" height="854">
        <BaseURL>input_video_480x854_1000k.webm</BaseURL>
        <SegmentBase indexRange="11583599-11583986">
          <Initialization range="0-645" />
        </SegmentBase>
      </Representation>

      <Representation id="2" bandwidth="843267" width="360" height="640">
        <BaseURL>input_video_360x640_750k.webm</BaseURL>
        <SegmentBase indexRange="8668326-8668713">
          <Initialization range="0-645" />
        </SegmentBase>
      </Representation>

    </AdaptationSet>

    <AdaptationSet
      id="1"
      mimeType="audio/webm"
      codecs="vorbis"
      lang="eng"
      audioSamplingRate="44100"
      bitstreamSwitching="true"
      subsegmentAlignment="true"
      subsegmentStartsWithSAP="1">

      <Representation id="3" bandwidth="89219">
        <BaseURL>audio.webm</BaseURL>
        <SegmentBase indexRange="921727-922055">
          <Initialization range="0-4889" />
        </SegmentBase>
      </Representation>

    </AdaptationSet>
  </Period>
</MPD>

After completing these steps, you’ll have:

Three video renditions (576×1024, 480×854, 360×640),
One audio track, and
An MPD manifest file.

input_video.mp4
audio.webm
input_video_576x1024_1500k.webm
input_video_480x854_1000k.webm
input_video_360x640_750k.webm
input_video_manifest.mpd

The original video input_video.mp4 should also be kept to serve as a fallback video source later.

Serve The Output Files

These output files can now be uploaded to cloud storage (e.g., AWS S3 or Cloudflare R2) for playback. While they can be served directly from a local folder, I highly recommend storing them in cloud storage and leveraging a CDN to cache the assets for better performance. Both AWS and Cloudflare support HTTP range requests out of the box.
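If you’d rather test locally first, any static file server that supports range requests will do. Here’s a minimal sketch using Express (an assumed dependency, installed with npm i express), assuming the FFmpeg outputs live in a ./media folder:

import express from 'express'

const app = express()

// Allow cross-origin requests while testing locally; tighten this for production
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', '*')
  next()
})

// express.static serves the MPD and WebM files and handles range requests
app.use(express.static('media'))

app.listen(8080, () => console.log('Serving media at http://localhost:8080'))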

Building The DASH-Compatible Video Player In React

There’s nothing like a real-world example to help understand how everything works. There are different ways we can implement a DASH-compatible video player, but I’ll focus on an approach using React.

First, install the Dash.js npm package by running:

npm i dashjs

Next, create a component called <DashVideoPlayer /> and initialize the Dash MediaPlayer instance by pointing it to the MPD file when the component mounts.

The ref callback runs when the component mounts. Inside it, playerRef is assigned the actual Dash MediaPlayer instance, and the event listeners are bound to it. We also include the original MP4 URL in the <source> element as a fallback in case the browser doesn’t support MPEG-DASH.

If you’re using the Next.js App Router, remember to add the 'use client' directive to enable client-side hydration, as the video player is only initialized on the client side.

Here is the full example:

import dashjs from 'dashjs'
import { useCallback, useRef } from 'react'

export const DashVideoPlayer = () => {
  const playerRef = useRef(null)

  const callbackRef = useCallback((node) => {
    if (node !== null) {
      // Create the Dash.js MediaPlayer instance and attach it to the <video> element
      playerRef.current = dashjs.MediaPlayer().create()

      // Arguments: the video element, the MPD manifest URL, and an autoplay flag
      playerRef.current.initialize(node, 'https://example.com/uri/to/input_video_manifest.mpd', false)

      playerRef.current.on('canPlay', () => {
        // the video is ready to play
      })

      playerRef.current.on('error', (e) => {
        // handle error
      })

      playerRef.current.on('playbackStarted', () => {
        // handle playback started
      })

      playerRef.current.on('playbackPaused', () => {
        // handle playback paused
      })

      playerRef.current.on('playbackWaiting', () => {
        // handle playback buffering
      })
    }
  }, [])

  return (
    <video ref={callbackRef} width={310} height={548} controls>
      <source src="https://example.com/uri/to/input_video.mp4" type="video/mp4" />
      Your browser does not support the video tag.
    </video>
  )
}

Result

Observe how the video quality changes when network connectivity is throttled from Fast 4G to 3G using Chrome DevTools. The player switches from 480p down to 360p, showing how the experience is optimized for the available bandwidth.
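If you’d like to observe those switches programmatically rather than in the network tab, Dash.js also emits a quality-change event. Here’s a small sketch you could drop into the callbackRef from earlier; the payload shape can vary between Dash.js versions, so double-check it against the version you install:

// Log every rendition switch the player renders
playerRef.current.on(dashjs.MediaPlayer.events.QUALITY_CHANGE_RENDERED, (e) => {
  console.log(`${e.mediaType} switched to quality index ${e.newQuality}`)
})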

Conclusion

That’s it! We just implemented a working DASH-compatible video player in React that streams video with adaptive bitrates. Again, the benefits of this are rooted in performance. When we adopt ABR streaming, we request the video in smaller chunks, allowing for more immediate playback than we’d get if we needed to fully download the video file first. And we’ve done it in a way that supports multiple versions of the same video, allowing us to serve the best format for the user’s device.

References

“HTTP Range Request And MP4 Video Play In Browser,” Zeng Xu
“Setting up adaptive streaming media sources,” MDN Web Docs
“DASH Adaptive Streaming for HTML video,” MDN Web Docs
