Overview

Supabase and Trigger.dev can be used together to create powerful workflows triggered by real-time changes in your database tables:

  • A Supabase Database Webhook triggers an Edge Function when a row containing a video URL is inserted into a table
  • The Edge Function triggers a Trigger.dev task, passing the video_url column data from the new table row as the payload
  • The Trigger.dev task then:
    • Uses FFmpeg to extract the audio track from the video URL
    • Uses Deepgram to transcribe the extracted audio
    • Updates the original table row in Supabase with the new transcription, using the row's record.id
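
The steps above hinge on the Database Webhook payload that the Edge Function receives. Roughly, an INSERT payload looks like the sketch below (illustrative values; the exact shape is defined by Supabase's webhook documentation):

```typescript
// Rough shape of the Database Webhook payload the Edge Function receives
// on an INSERT into video_transcriptions (illustrative values)
const examplePayload = {
  type: "INSERT",
  table: "video_transcriptions",
  schema: "public",
  record: {
    id: 1,
    video_url: "https://example.com/video.mp4",
    transcription: null,
  },
  old_record: null, // null for INSERT events
};

// The Edge Function reads these two fields to trigger the task:
const videoUrl = examplePayload.record.video_url;
const id = examplePayload.record.id;
```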

Prerequisites

  • A Trigger.dev account with a project set up
  • A Supabase account and the Supabase CLI installed
  • A Deepgram account and API key

Initial setup

Optional step 1: create a new Supabase project

If you already have a Supabase project on your local machine you can skip this step.

You can create a new project by running the following command in your terminal using the Supabase CLI:

supabase init

If you are using VS Code, be sure to answer ‘y’ when asked to generate VS Code settings for Deno, and install any recommended extensions.

Optional step 2: create a package.json file

If your project does not already have a package.json file (e.g. if you are using Deno), create it manually in your project’s root folder.

If your project has a package.json file you can skip this step.

This is required for the Trigger.dev SDK to work correctly.

package.json
{
  "devDependencies": {
    "typescript": "^5.6.2"
  }
}
Update your TypeScript version to the latest version available.

Run the CLI `init` command

The easiest way to get started is to use the CLI. It will add Trigger.dev to your existing project, create a /trigger folder and give you an example task.

Run this command in the root of your project to get started:
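
The init command as documented by Trigger.dev is shown below (the @latest tag is an assumption; pin a specific version if you prefer):

```shell
npx trigger.dev@latest init
```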

It will do a few things:

  1. Log you into the CLI if you’re not already logged in.
  2. Create a trigger.config.ts file in the root of your project.
  3. Ask where you’d like to create the /trigger directory.
  4. Create the /trigger directory with an example task, /trigger/example.[ts/js].

Choose “None” when prompted to install an example task. We will create a new task for this guide.

Create a new table in your Supabase database

First, in the Supabase project dashboard, you’ll need to create a new table to store the video URL and transcription.

To do this, click on ‘Table Editor’ in the left-hand menu and create a new table.

Call your table video_transcriptions.

Add two new columns: one called video_url with the type text, and another called transcription, also with the type text.

Create and deploy the Trigger.dev task

Generate the Database type definitions

To allow you to use TypeScript to interact with your table, you need to generate the type definitions for your Supabase table using the Supabase CLI.

supabase gen types --lang=typescript --project-id <project-ref> --schema public > database.types.ts

Replace <project-ref> with your Supabase project reference ID. This can be found in your Supabase project settings under ‘General’.
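
For the video_transcriptions table, the generated file will look roughly like the hand-written approximation below (not actual CLI output — your generated file will contain more tables and metadata):

```typescript
// Hand-written approximation of the generated types for video_transcriptions;
// the real database.types.ts produced by the CLI contains more metadata.
export type Database = {
  public: {
    Tables: {
      video_transcriptions: {
        Row: { id: number; video_url: string | null; transcription: string | null };
        Insert: { id?: number; video_url?: string | null; transcription?: string | null };
        Update: { id?: number; video_url?: string | null; transcription?: string | null };
      };
    };
  };
};

// Passing this to createClient<Database>() gives you typed .from() queries:
type Row = Database["public"]["Tables"]["video_transcriptions"]["Row"];
const example: Row = { id: 1, video_url: "https://example.com/v.mp4", transcription: null };
```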

Create the transcription task

Create a new task file in your /trigger folder. Call it videoProcessAndUpdate.ts.

This task takes a video from a public video URL, extracts the audio using FFmpeg, and transcribes the audio using Deepgram. The transcription is then written back to the original row in the video_transcriptions table in Supabase.

You will need to install some additional dependencies for this task:
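
A typical install looks like this (package names taken from the imports in the task below; exact versions are up to you):

```shell
npm install @deepgram/sdk @supabase/supabase-js fluent-ffmpeg
npm install -D @types/fluent-ffmpeg
```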

These dependencies will allow you to interact with the Deepgram and Supabase APIs and extract audio from a video using FFmpeg.

/trigger/videoProcessAndUpdate.ts
// Install any missing dependencies below
import { createClient as createDeepgramClient } from "@deepgram/sdk";
import { createClient as createSupabaseClient } from "@supabase/supabase-js";
import { logger, task } from "@trigger.dev/sdk/v3";
import ffmpeg from "fluent-ffmpeg";
import fs from "fs";
import { Readable } from "node:stream";
import os from "os";
import path from "path";
import { Database } from "../../database.types";

// Create a single Supabase client for interacting with your database
// 'Database' supplies the type definitions to supabase-js
const supabase = createSupabaseClient<Database>(
  // These details can be found in your Supabase project settings under `API`
  process.env.SUPABASE_PROJECT_URL as string, // e.g. https://abc123.supabase.co - replace 'abc123' with your project ID
  process.env.SUPABASE_SERVICE_ROLE_KEY as string // Your service role secret key
);

// Your DEEPGRAM_SECRET_KEY can be found in your Deepgram dashboard
const deepgram = createDeepgramClient(process.env.DEEPGRAM_SECRET_KEY);

export const videoProcessAndUpdate = task({
  id: "video-process-and-update",
  run: async (payload: { videoUrl: string; id: number }) => {
    const { videoUrl, id } = payload;

    logger.log(`Processing video at URL: ${videoUrl}`);

    // Generate temporary file names
    const tempDirectory = os.tmpdir();
    const outputPath = path.join(tempDirectory, `audio_${Date.now()}.wav`);

    // Fetch the video
    const response = await fetch(videoUrl);

    // Extract the audio
    await new Promise((resolve, reject) => {
      if (!response.body) {
        return reject(new Error("Failed to fetch video"));
      }

      ffmpeg(Readable.from(response.body))
        .outputOptions([
          "-vn", // Disable video output
          "-acodec pcm_s16le", // Use PCM 16-bit little-endian encoding
          "-ar 44100", // Set audio sample rate to 44.1 kHz
          "-ac 2", // Set audio channels to stereo
        ])
        .output(outputPath)
        .on("end", resolve)
        .on("error", reject)
        .run();
    });

    logger.log(`Audio extracted from video`, { outputPath });

    // Transcribe the audio using Deepgram
    const { result, error } = await deepgram.listen.prerecorded.transcribeFile(
      fs.readFileSync(outputPath),
      {
        model: "nova-2", // Use the Nova 2 model
        smart_format: true, // Automatically format the transcription
        diarize: true, // Enable speaker diarization
      }
    );

    if (error) {
      throw error;
    }

    // Extract the transcript text from the result
    const transcription = result.results.channels[0].alternatives[0].paragraphs?.transcript;

    logger.log(`Transcription: ${transcription}`);

    // Delete the temporary audio file
    fs.unlinkSync(outputPath);
    logger.log(`Temporary audio file deleted`, { outputPath });

    const { error: updateError } = await supabase
      .from("video_transcriptions")
      // Set the transcription column to the new transcription
      .update({ transcription: transcription, video_url: videoUrl })
      // Find the row by its ID
      .eq("id", id);

    if (updateError) {
      throw new Error(`Failed to update transcription: ${updateError.message}`);
    }

    return {
      message: `Transcription: ${transcription}`,
      result,
    };
  },
});

When updating your tables from a Trigger.dev task that was itself triggered by a database change, be extremely careful not to cause an infinite loop. Ensure you have the correct conditions in place to prevent this.
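
One simple safeguard, sketched below as a hypothetical helper you could add to the Edge Function: only trigger the task for rows that have a video URL but no transcription yet, so the task's own update to the row cannot re-trigger processing (relevant if your webhook ever fires on UPDATE events as well as INSERT).

```typescript
// Hypothetical guard: only process rows with a video URL and no
// transcription yet, so the task's own update can't re-trigger work.
type WebhookRecord = {
  id: number;
  video_url: string | null;
  transcription: string | null;
};

function shouldProcess(record: WebhookRecord): boolean {
  return record.video_url !== null && record.transcription === null;
}

// A freshly inserted row qualifies; an already-transcribed row does not.
const fresh = shouldProcess({ id: 1, video_url: "https://example.com/v.mp4", transcription: null });
const done = shouldProcess({ id: 1, video_url: "https://example.com/v.mp4", transcription: "Hello." });
```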

Adding the FFmpeg build extension

Before you can deploy the task, you’ll need to add the FFmpeg build extension to your trigger.config.ts file.

trigger.config.ts
// Add this import
import { ffmpeg } from "@trigger.dev/build/extensions/core";
import { defineConfig } from "@trigger.dev/sdk/v3";

export default defineConfig({
  project: "<project ref>", // Replace with your project ref
  // Your other config settings...
  build: {
    // Add the FFmpeg build extension
    extensions: [ffmpeg()],
  },
});

Build extensions allow you to hook into the build system and customize the build process or the resulting bundle and container image (in the case of deploying). You can use pre-built extensions or create your own.

You’ll also need to add @trigger.dev/build to your package.json file under devDependencies if you don’t already have it there.

Add your Deepgram and Supabase environment variables to your Trigger.dev project

You will need to add your DEEPGRAM_SECRET_KEY, SUPABASE_PROJECT_URL and SUPABASE_SERVICE_ROLE_KEY as environment variables in your Trigger.dev project. This can be done in the ‘Environment Variables’ page in your project dashboard.

Deploying your task

You can now deploy your task using the following command:
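
The standard CLI deploy command is shown below (the @latest tag is an assumption; pin a version if you prefer):

```shell
npx trigger.dev@latest deploy
```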

Create and deploy the Supabase Edge Function

Add your Trigger.dev prod secret key to the Supabase dashboard

Go to your Trigger.dev project dashboard and copy the prod secret key from the API keys page.

Then, in Supabase, select the project you want to use, navigate to ‘Project settings’, click ‘Edge Functions’ in the configurations menu, and then click the ‘Add new secret’ button.

Add TRIGGER_SECRET_KEY with the pasted value of your Trigger.dev prod secret key.

Create a new Edge Function using the Supabase CLI

Now create an Edge Function using the Supabase CLI. Call it video-processing-handler. This function will be triggered by the Database Webhook.

supabase functions new video-processing-handler

functions/video-processing-handler/index.ts
// Setup type definitions for built-in Supabase Runtime APIs
import "jsr:@supabase/functions-js/edge-runtime.d.ts";
import { tasks } from "npm:@trigger.dev/sdk@latest/v3";
// Import the videoProcessAndUpdate task from the trigger folder
import type { videoProcessAndUpdate } from "../../../src/trigger/videoProcessAndUpdate.ts";
//     👆 type only import

// Sets up a Deno server that listens for incoming JSON requests
Deno.serve(async (req) => {
  const payload = await req.json();

  // This payload will contain the video url and id from the new row in the table
  const videoUrl = payload.record.video_url;
  const id = payload.record.id;

  // Trigger the videoProcessAndUpdate task with the videoUrl payload
  await tasks.trigger<typeof videoProcessAndUpdate>("video-process-and-update", { videoUrl, id });
  console.log(payload);

  return new Response("ok");
});

Tasks in the trigger folder use Node, so they must stay in that folder or they will not run, especially if you are using a different runtime like Deno. Also, do not add “npm:” to imports inside your task files, for the same reason.

Deploy the Edge Function

Now deploy your new Edge Function with the following command:

supabase functions deploy video-processing-handler

Follow the CLI instructions, selecting the same project you added your prod secret key to, and once complete you should see your new Edge Function deployment in your Supabase Edge Functions dashboard.

There will be a link to the dashboard in your terminal output.

Create the Database Webhook

In your Supabase project dashboard, click ‘Project settings’, then the ‘API’ tab, and copy the anon public API key from the table.

Then, go to ‘Database’, click on ‘Webhooks’, and then click ‘Create a new hook’.

Call the hook edge-function-hook.

Select the new table you have created: public video_transcriptions.

Choose the insert event.

Under ‘Webhook configuration’, select ‘Supabase Edge Functions’.

Under ‘Edge Function’, choose POST and select the Edge Function you have created: video-processing-handler.

Under ‘HTTP Headers’, add a new header with the key Authorization and the value Bearer <your-api-key> (replace <your-api-key> with the anon public API key you copied earlier).

Supabase Edge Functions require a JSON Web Token (JWT) in the Authorization header. This ensures that only authorized users can access your Edge Functions.
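
To sanity-check the header setup before wiring up the webhook, you can call the deployed function directly; a hypothetical curl invocation is sketched below (replace the placeholders with your own project ref and anon key — note this will actually trigger the task):

```shell
# Hypothetical manual invocation -- replace <project-ref> and <anon-key>
curl -X POST "https://<project-ref>.supabase.co/functions/v1/video-processing-handler" \
  -H "Authorization: Bearer <anon-key>" \
  -H "Content-Type: application/json" \
  -d '{"record": {"id": 1, "video_url": "https://example.com/video.mp4"}}'
```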

Click ‘Create webhook’.

Your Database Webhook is now ready to use.

Triggering the entire workflow

Your video-processing-handler Edge Function is now set up to trigger the videoProcessAndUpdate task every time a new row is inserted into your video_transcriptions table.

To do this, go back to your Supabase project dashboard, click on ‘Table Editor’ in the left-hand menu, click on the video_transcriptions table, and then click ‘Insert’, ‘Insert Row’.

Add a new item under video_url with a public video URL.

You can use the following public video URL for testing: https://content.trigger.dev/Supabase%20Edge%20Functions%20Quickstart.mp4.

Once the new table row has been inserted, check your cloud.trigger.dev project ‘Runs’ list, and you should see a processing videoProcessAndUpdate task, triggered when you added the new row with the video URL to your video_transcriptions table.

Once the run has completed successfully, go back to your Supabase video_transcriptions table, and you should see that in the row containing the original video URL, the transcription has now been added to the transcription column.

Congratulations! You have completed the full workflow from Supabase to Trigger.dev and back again.

Learn more about Supabase and Trigger.dev

Full walkthrough guides from development to deployment

Task examples with code you can copy and paste