A cutting-edge SaaS platform designed to generate high-quality photos and videos effortlessly. Powered by state-of-the-art technology, this tool is perfect for creatives, marketers, and businesses looking to streamline media production.
- AI-powered photo and video generation
- Download of generated photos in high quality
- User-friendly interface for easy customization
- High-resolution outputs for professional use
- Scalable architecture for enterprise users
- Integration-ready APIs for developers
- Clone the Repository

  ```bash
  git clone https://github.com/rawani123/saas-video-gen.git
  ```
- Install Dependencies: navigate to the project directory and install the packages:

  ```bash
  cd saas-video-gen
  npm install
  ```
- Run the Application: start the development server:

  ```bash
  npm run dev
  ```
- Run the Python Backend: start the Flask backend in a separate terminal:

  ```bash
  cd backend
  python main.py
  ```
- Build for Production: create an optimized production build:

  ```bash
  npm run build
  ```
- Next.js: Used for its server-side rendering and static site generation, which enhance performance and improve SEO.
- Tailwind CSS: Provides a utility-first CSS framework that enables rapid and consistent design implementation.
- Remotion.js: Enables dynamic video creation directly in React, allowing for easy customization and integration into the platform.
- Node.js: Facilitates the creation of scalable and efficient APIs to handle concurrent requests and support real-time features.
- Flask: A lightweight Python web framework that serves the backend API, including dynamic MP3 file generation.
- PostgreSQL: A robust relational database system that handles structured data, ensuring reliability and scalability for storing user profiles, project metadata, and analytics.
- HuggingFace API: Provides state-of-the-art pre-trained models for image and video generation, reducing development time and delivering high-quality results.
- Vercel: Perfect for deploying the Next.js frontend, offering serverless functions and optimized performance.
- Render: A reliable platform for hosting Flask-based backend applications, ensuring scalability and consistent uptime.
For questions or suggestions, please contact:
Author: Aniruddha Rawool
Email: [email protected]