HyperAI

Deploying the II-Medical-8B Medical Reasoning Model with vLLM + Open WebUI

1. Tutorial Introduction

II-Medical-8B is a large language model newly developed by Intelligent Internet to strengthen AI capabilities in medical reasoning. It builds on the earlier II-Medical-7B-Preview and delivers significantly better medical question answering. The model is based on Qwen/Qwen3-8B and is optimized in two stages: supervised fine-tuning (SFT) on medical-domain reasoning datasets, followed by DAPO (a reinforcement-learning optimization method) training on hard reasoning datasets. The related paper is "1.4 Million Open-Source Distilled Reasoning Dataset to Empower Large Language Model Training".

This tutorial uses a single RTX 4090 as its compute resource; a minimal vLLM loading sketch is given below for reference.
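The tutorial container already ships with vLLM and Open WebUI configured, so no manual setup is required. For readers who want to reproduce the backend themselves, the following is a minimal sketch of loading II-Medical-8B with vLLM's offline Python API on a single 24 GB GPU; the model ID, memory settings, and sampling parameters are illustrative assumptions, not the tutorial's exact configuration.

```python
# Sketch: load II-Medical-8B with vLLM's offline Python API on one 24 GB GPU.
# Model ID and all numeric settings below are assumptions for illustration.
from vllm import LLM, SamplingParams

llm = LLM(
    model="Intelligent-Internet/II-Medical-8B",  # assumed Hugging Face model ID
    dtype="bfloat16",                            # 8B weights fit in roughly 16 GB at bf16
    gpu_memory_utilization=0.90,                 # leave headroom on a 24 GB RTX 4090
    max_model_len=8192,                          # conservative context length
)

sampling = SamplingParams(temperature=0.6, top_p=0.9, max_tokens=1024)

prompt = (
    "A 45-year-old patient presents with chest pain radiating to the left arm. "
    "What are the most likely diagnoses?"
)
outputs = llm.generate([prompt], sampling)
print(outputs[0].outputs[0].text)
```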

2. Project Examples

3. Operation Steps

1. After starting the container, click the API address to enter the Web interface

If "Model" is not displayed, it means the model is being initialized. Since the model is large, please wait about 1-2 minutes and refresh the page.

2. After entering the webpage, you can start a conversation with the model

How to use
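Besides chatting in the Web UI, you can also query the model programmatically. The sketch below assumes the container exposes vLLM's OpenAI-compatible API at the address shown; the base URL, API key, and served model name are illustrative assumptions.

```python
# Sketch: send a chat request to the vLLM OpenAI-compatible endpoint instead
# of using the Web UI. URL, key, and model name are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Intelligent-Internet/II-Medical-8B",  # assumed served model name
    messages=[
        {"role": "system", "content": "You are a careful medical reasoning assistant."},
        {"role": "user", "content": "What first-line treatments are recommended for newly diagnosed type 2 diabetes?"},
    ],
    temperature=0.6,
    max_tokens=1024,
)
print(response.choices[0].message.content)
```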

4. Discussion

🖌️ If you come across a high-quality project, please leave us a message to recommend it! We have also set up a tutorial exchange group: scan the QR code and add the note [SD Tutorial] to join the group, discuss technical issues, and share application results↓

Citation Information

Thanks to GitHub user xxxjjjyyy1 for deploying this tutorial. The citation information for this project is as follows:

@misc{2025II-Medical-8B,
      title={II-Medical-8B: Medical Reasoning Model}, 
      author={Intelligent Internet},
      year={2025}
}