Deploying Large Models With Ollama and Open WebUI
Tutorial Introduction

This tutorial is a one-click run package of Ollama + Open WebUI. Simply follow the steps below and enter the commands to run a large model.
The models currently included are:
- Qwen 1.5 14B
- Qwen 1.5 32B
- LLaVA 1.6 34B (a multimodal model)
These three models are stored in a public space and do not take up your personal storage.
After starting Ollama and Open WebUI as described below, you can access the interface through the "API address" shown on the right.

Video tutorial reference: No need to install, just clone and run large models with one click – Run Ollama and Open WebUI on Cloud GPU
Start Ollama
OLLAMA_MODELS=/openbayes/home/ollama-models ./ollama serve
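To confirm the service started correctly, you can open a second terminal and check which models Ollama sees. This is just a sketch and assumes Ollama is listening on its default address 127.0.0.1:11434.

# list the models the running server can see
./ollama list
# or query the REST API directly
curl http://127.0.0.1:11434/api/tags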
Start Open WebUI
bash /openbayes/input/input1/open-webui/backend/start.sh
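If you want to check that the backend is up before opening the page, a quick request from another terminal is enough. This is only a sketch: it assumes the Open WebUI backend listens on its default port 8080, which start.sh may override.

# should return an HTTP response once the backend is ready
curl -sI http://127.0.0.1:8080/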
Open Page
- Copy the API address shown on the right and paste it into your browser to open the Open WebUI page.

- Log in with admin@example / adminadmin

Adding new models
In the command line, use the command ./ollama pull to download new models. Newly downloaded models will take up your own storage space.
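For example (the model tag here is only an illustration; pick any tag from the Ollama model library), the following downloads a new model. Pulled models are stored under the directory given by OLLAMA_MODELS when the server was started, i.e. /openbayes/home/ollama-models.

# example: pull the 7B variant of Qwen
./ollama pull qwen:7b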