GPTOpt: Towards Efficient LLM-Based Black-Box Optimization

Jamison Meindl, Yunsheng Tian, Tony Cui, Veronika Thost, Zhang-Wei Hong, Jie Chen, Wojciech Matusik, Mina Konaković Luković


Abstract

Global optimization of expensive, derivative-free black-box functions demands extreme sample efficiency. Classical methods such as Bayesian Optimization (BO) can be effective, but they often require careful parameter tuning for each application domain. At the same time, Large Language Models (LLMs) have shown broad capabilities, yet state-of-the-art models remain limited in solving continuous black-box optimization tasks. We introduce GPTOpt, an LLM-based optimization method that equips LLMs with continuous black-box optimization capabilities. By fine-tuning large language models on extensive synthetic datasets derived from diverse BO parameterizations, GPTOpt leverages LLM pre-training to generalize across optimization tasks. On a variety of black-box optimization benchmarks, GPTOpt surpasses traditional optimizers, highlighting the capacity of LLMs for advanced numerical reasoning and introducing a flexible framework for global optimization without parameter tuning.
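The abstract describes the training pipeline only at a high level: run Bayesian Optimization with varied parameterizations on synthetic functions, then fine-tune an LLM on the resulting optimization trajectories. The sketch below is not the authors' code; it is a minimal illustration of how such synthetic trajectories could be generated, assuming functions drawn from a GP prior on [0, 1], a simple expected-improvement BO loop, and randomly drawn hyperparameters (kernel length scale, exploration offset). All function names, ranges, and parameter choices are hypothetical.

```python
import json
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def rbf_kernel(a, b, length_scale):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def sample_synthetic_function(length_scale, n_grid=256):
    """Draw a random smooth function from a GP prior (stand-in for a black box)."""
    xs = np.linspace(0.0, 1.0, n_grid)
    K = rbf_kernel(xs, xs, length_scale) + 1e-8 * np.eye(n_grid)
    ys = rng.multivariate_normal(np.zeros(n_grid), K)
    return lambda x: np.interp(x, xs, ys)

def gp_posterior(x_train, y_train, x_query, length_scale, noise=1e-6):
    """Posterior mean and std of a zero-mean GP with an RBF kernel."""
    K = rbf_kernel(x_train, x_train, length_scale) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_query, x_train, length_scale)
    K_inv = np.linalg.inv(K)
    mu = K_s @ K_inv @ y_train
    var = 1.0 - np.sum((K_s @ K_inv) * K_s, axis=1)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def run_bo_trajectory(f, length_scale, xi, n_init=3, n_iters=12):
    """Run a simple EI-based BO loop (minimization); return the (x, y) history."""
    x = rng.uniform(0.0, 1.0, n_init)
    y = np.array([f(v) for v in x])
    for _ in range(n_iters):
        cand = rng.uniform(0.0, 1.0, 512)                 # random candidate pool
        mu, sigma = gp_posterior(x, y, cand, length_scale)
        improvement = y.min() - mu - xi                   # xi encourages exploration
        z = improvement / sigma
        ei = improvement * norm.cdf(z) + sigma * norm.pdf(z)
        x_next = cand[np.argmax(ei)]
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    return list(zip(x.tolist(), y.tolist()))

# Each trajectory uses a different, randomly drawn BO parameterization,
# mirroring the "diverse BO parameterizations" mentioned in the abstract.
dataset = []
for _ in range(100):
    ls = rng.uniform(0.05, 0.3)    # kernel length scale (hypothetical range)
    xi = rng.uniform(0.0, 0.1)     # EI exploration offset (hypothetical range)
    f = sample_synthetic_function(ls)
    traj = run_bo_trajectory(f, ls, xi)
    # Serialize the trajectory as text, the kind of record an LLM could be fine-tuned on.
    dataset.append(json.dumps({"history": traj}))

print(dataset[0][:200])
```

In this reading, the LLM is fine-tuned to map a partial optimization history to the next query point, so that at inference time it acts as the acquisition strategy itself, with no per-task parameter tuning.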
