Run Mistral-7B-v0.3 Demo Online
Deploy the Mistral:7B-v0.3 model with Ollama and Open WebUI
Tutorial Introduction
This tutorial is a one-click run package of Ollama + Open WebUI. Just follow the steps below and enter the commands to run the large model Mistral:7B-v0.3 with one click.
Compared with Mistral:7B-v0.2, Mistral:7B-v0.3 has the following advantages:
- The vocabulary size has been expanded from 32000 to 32768
- Supports the v3 tokenizer
- Supports function calling (see the example below)
The model has been placed in a public space, so it does not take up any personal storage space.
After starting Ollama and Open WebUI as described below, you can open the service through the "API address" on the right.
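
Function calling goes through Ollama's /api/chat endpoint. Below is a minimal sketch; the model tag mistral:7b-v0.3 and the get_weather tool are illustrative assumptions (use the tag reported by `ollama list` and your own tool schema), and the tools field requires a recent Ollama release:

# Example tool-calling request (hypothetical tool; adjust the model tag to what `ollama list` reports)
curl http://localhost:11434/api/chat -d '{
  "model": "mistral:7b-v0.3",
  "messages": [{"role": "user", "content": "What is the weather in Paris today?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get the current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string", "description": "City name"}},
        "required": ["city"]
      }
    }
  }],
  "stream": false
}'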

How to run
1. Download Ollama
curl -fsSL https://ollama.com/install.sh | sh
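
If the install script succeeds, the ollama binary should be on your PATH; you can verify with:

ollama --version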
2. Start Ollama
OLLAMA_MODELS=/openbayes/input/input0 ollama serve
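
The OLLAMA_MODELS variable points Ollama at the dataset mounted at /openbayes/input/input0, which already contains the model weights, so nothing needs to be pulled. From another terminal you can check that the server is up and lists the model (assuming the default port 11434):

curl http://localhost:11434/api/tags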
3. Open a new terminal and start Open WebUI
cd open-webui
OLLAMA_MODELS=/openbayes/input/input0 bash backend/start.sh
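
The backend start script typically serves Open WebUI on port 8080 (this may vary between builds); a quick local check that it is running:

curl -I http://localhost:8080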
4. Open the page
- Copy the API address on the right and paste it into your browser to open the Open WebUI page

- Log in using admin@example.com / adminadmin

- After selecting the model, you can start using it (or call the model directly through the Ollama API, as shown below)
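
You can also skip the web page and query the model directly through the Ollama API. A minimal sketch, assuming Ollama is listening on its default port 11434 and the model tag is mistral:7b-v0.3 (check with `ollama list`):

# Simple one-off generation request; adjust the model tag as needed
curl http://localhost:11434/api/generate -d '{
  "model": "mistral:7b-v0.3",
  "prompt": "Explain what a tokenizer does in one sentence.",
  "stream": false
}'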
