Video tutorial coming soon.
Deploy Flowise on Ubuntu with Docker. Flowise is a drag-and-drop LangChain pipeline builder for creating AI chatbots, agents, and RAG workflows visually: connect Ollama, OpenAI, or any other LLM and build production-ready AI applications without writing code.
Grab the automated bash script from GitHub to follow along with the video.
```bash
wget https://raw.githubusercontent.com/mhmdali94/Docker/main/ai/flowise/flowise-ubuntu.sh
chmod +x flowise-ubuntu.sh
sudo bash flowise-ubuntu.sh
```
The script installs Docker and deploys Flowise with a persistent storage volume for your saved chatflows and credentials.
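If you would rather deploy manually instead of running the script, the setup roughly corresponds to a single `docker run` against the official `flowiseai/flowise` image. The container name and volume name below are illustrative choices, not outputs of the script:

```shell
# Pull and run Flowise, publishing the web UI/API on port 3000.
# The named volume persists saved chatflows and credentials
# (Flowise keeps its data under /root/.flowise inside the container).
docker run -d \
  --name flowise \
  -p 3000:3000 \
  -v flowise_data:/root/.flowise \
  --restart unless-stopped \
  flowiseai/flowise
```

`--restart unless-stopped` keeps Flowise running across server reboots, which is what you generally want for a self-hosted tool.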
Open your browser and navigate to the Flowise interface:
http://<your-server-ip>:3000
Go to Credentials in the sidebar and add your API keys — OpenAI, Anthropic, or your Ollama server URL. Credentials are encrypted and stored locally.
Click "Add New" to create a chatflow. Drag nodes from the left panel — choose an LLM node, a memory node, and a chat input — connect them, and click Save. Your chatflow is instantly available as an API endpoint and embeddable chat widget.
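To call a saved chatflow from outside the UI, you can hit its prediction endpoint (`POST /api/v1/prediction/<chatflow-id>`). The chatflow ID comes from the "API Endpoint" dialog on your saved flow; the question text here is just an example:

```shell
# Send a question to a saved chatflow and print the JSON response.
# Replace <your-server-ip> and <chatflow-id> with your own values.
curl -s http://<your-server-ip>:3000/api/v1/prediction/<chatflow-id> \
  -H "Content-Type: application/json" \
  -d '{"question": "Hello, what can you help me with?"}'
```

The response is a JSON object containing the model's answer, so any script or application that can make an HTTP request can use your chatflow.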
| Port | Purpose |
|---|---|
| 3000 | Flowise Web UI & API |