This workflow automates the creation of long AI-generated videos from prompts, merges the generated clips into a single video, and automatically distributes the final content across multiple platforms.
The process starts with a Google Sheet that acts as the control panel for the workflow. Each row in the sheet contains a prompt, the duration of the clip, and a starting frame. The workflow reads this data and generates video clips sequentially.
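The row-selection logic can be sketched as a small function, as it might appear in an n8n Code node. The column names (`prompt`, `duration`, `startImage`, `videoUrl`) are illustrative assumptions, not the template's actual sheet headers:

```javascript
// Sketch: decide which sheet rows still need a clip generated.
// Field names here are assumptions — match them to your own sheet columns.
function rowsToProcess(rows) {
  // A row needs processing if it has a prompt but no generated video yet.
  return rows.filter((row) => row.prompt && !row.videoUrl);
}

// Example rows as they might come back from a Google Sheets "read" node:
const rows = [
  { prompt: 'A sunrise over mountains', duration: 5, startImage: 'frame0.png', videoUrl: 'clip1.mp4' },
  { prompt: 'The camera pans to a lake', duration: 5, startImage: '', videoUrl: '' },
];
console.log(rowsToProcess(rows).length); // 1
```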
Using the RunPod WAN 2.5 video generation API, the workflow creates individual video segments based on the prompt and input image. Each segment is then stored and tracked in the spreadsheet.
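A request to the generation API might be assembled like the sketch below. The payload fields and header scheme are assumptions for illustration only — they are not RunPod's documented WAN 2.5 schema, so check the template's HTTP node for the real request shape:

```javascript
// Illustrative only: payload field names are assumptions, not the
// actual WAN 2.5 API schema.
function buildGenerationRequest(row, apiKey) {
  return {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      prompt: row.prompt,     // text prompt from the sheet row
      duration: row.duration, // clip length in seconds
      image: row.startImage,  // starting frame for visual continuity
    }),
  };
}

const req = buildGenerationRequest(
  { prompt: 'A sunrise over mountains', duration: 5, startImage: 'frame0.png' },
  'MY_API_KEY'
);
console.log(JSON.parse(req.body).duration); // 5
```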
Once all clips are generated, the workflow uses the Fal.run FFmpeg API to merge them into a single long video. After merging, the final video is retrieved automatically.
The workflow also extracts the last frame of each generated clip to use as the starting frame for the next clip, ensuring smooth visual continuity between scenes.
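The continuity step amounts to shifting each clip's extracted last frame into the next row's starting image. A minimal sketch of that chaining (field names `lastFrame` and `startImage` are assumptions):

```javascript
// Sketch: after clip N is generated, its last frame becomes the
// starting image of row N+1, so scenes flow into each other.
function chainStartFrames(rows) {
  const updated = rows.map((row) => ({ ...row })); // don't mutate the input
  for (let i = 0; i < updated.length - 1; i++) {
    if (updated[i].lastFrame) {
      updated[i + 1].startImage = updated[i].lastFrame;
    }
  }
  return updated;
}

const scenes = [
  { prompt: 'Scene 1', startImage: 'seed.png', lastFrame: 'scene1-end.png' },
  { prompt: 'Scene 2', startImage: '', lastFrame: '' },
];
console.log(chainStartFrames(scenes)[1].startImage); // "scene1-end.png"
```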
Finally, the completed video is automatically saved to Google Drive and published across multiple platforms.
This creates a fully automated pipeline that transforms prompts in a spreadsheet into a finished long-form video distributed across multiple platforms.
The workflow automates the entire process of generating, assembling, and publishing videos, eliminating manual editing and upload steps.
Using Google Sheets as the input system makes the workflow easy to manage and scalable. Users can create or modify video scenes simply by editing rows in the sheet.
The workflow can generate multiple clips and combine them into longer videos, enabling the creation of long-form content from short AI-generated segments.
By extracting the last frame of each clip and using it as the starting frame for the next scene, the workflow maintains visual continuity between segments.
The Fal.run FFmpeg API merges all generated clips into a single final video without requiring external editing tools.
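Before calling the merge API, the workflow has to collect the clip URLs in sheet order. A sketch of that step (the `video_urls` payload shape is an assumption — Fal's actual FFmpeg API schema may differ):

```javascript
// Sketch: gather generated clips flagged for merging, preserving sheet order.
// The "x" flag in a MERGE column is taken from the workflow's convention.
function buildMergePayload(rows) {
  const clips = rows
    .filter((row) => row.merge === 'x' && row.videoUrl)
    .map((row) => row.videoUrl);
  return { video_urls: clips }; // assumed payload shape, not Fal's documented schema
}

const payload = buildMergePayload([
  { merge: 'x', videoUrl: 'clip1.mp4' },
  { merge: '',  videoUrl: 'clip2.mp4' },
  { merge: 'x', videoUrl: 'clip3.mp4' },
]);
console.log(payload.video_urls); // ['clip1.mp4', 'clip3.mp4']
```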
Once the video is completed, it is automatically uploaded and published to multiple platforms, significantly reducing the time needed for content distribution.
The final video is saved to Google Drive, providing organized and secure storage for the generated content.
The workflow continuously checks the status of generation and processing tasks, waiting and retrying until the job is completed.
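The wait-and-retry logic can be sketched synchronously so the rules are easy to see; in the actual workflow, Wait nodes sit between attempts. Status strings and the attempt limit are assumptions:

```javascript
// Sketch of status polling: keep checking until the job completes,
// fails, or the attempt budget runs out. `checkStatus` stands in for
// whatever the workflow's status-check HTTP call returns.
function pollUntilDone(checkStatus, maxAttempts = 10) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const status = checkStatus();
    if (status === 'COMPLETED') return { done: true, attempts: attempt };
    if (status === 'FAILED') throw new Error('generation failed');
    // otherwise the job is still in progress — wait and try again
  }
  return { done: false, attempts: maxAttempts };
}

// Simulated job that completes on the third check:
const statuses = ['IN_PROGRESS', 'IN_PROGRESS', 'COMPLETED'];
const result = pollUntilDone(() => statuses.shift());
console.log(result); // { done: true, attempts: 3 }
```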
This workflow automates the creation of long videos by generating multiple clips from a Google Sheet and merging them together. Here's the process:
Trigger & Data Loading: When manually executed, the workflow reads a Google Sheet containing video generation parameters (prompts, durations, and starting images).
Video Generation Loop: For each row marked for processing, it:
- Sends the prompt, duration, and starting image to the RunPod WAN 2.5 API
- Waits and polls until the clip has been generated
- Stores the resulting video in the corresponding spreadsheet row
Frame Extraction: After each video is generated, it extracts the last frame using Fal.ai's FFmpeg API and updates the next row's starting image (creating visual continuity).
Video Merging: Once all individual clips are generated (marked with "x" in the MERGE column), the workflow:
- Sends the clip URLs to the Fal.run FFmpeg API to merge them into one long video
- Polls until merging is finished and retrieves the final file automatically
Distribution: The final long video is:
- Saved to Google Drive for organized storage
- Published automatically across multiple platforms
Google Sheet Setup: Create a sheet with one row per scene, including columns for the prompt, clip duration, and starting image, plus a MERGE column for flagging rows ready for assembly.
API Credentials Required: Google Sheets and Google Drive, RunPod (WAN 2.5 video generation), and Fal.ai (FFmpeg merging and frame extraction).
Configure Nodes: Point the Google Sheets nodes at your spreadsheet and add your RunPod and Fal.ai credentials to the nodes that call those services.
Test: Run the workflow manually to generate your first long video sequence
👉 Subscribe to my new YouTube channel, where I share videos and Shorts with practical tutorials and FREE templates for n8n.
Contact me for consulting and support, or add me on LinkedIn.