LLM API Tester

Browser mode sends requests directly from the page to the vendor API, which is often blocked by CORS. Server mode proxies the request through a Next.js route (`/api/llm/run`), so the call originates from the server and CORS does not apply.
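
Below is a minimal sketch of what such a proxy route could look like, assuming the Next.js App Router. The request body fields (`url`, `method`, `headers`, `body`) are illustrative assumptions, not the tester's actual contract.

```ts
// app/api/llm/run/route.ts — hypothetical sketch of a server-side proxy route.
// The body shape ({ url, method, headers, body }) is an assumption for illustration.
export async function POST(req: Request) {
  const { url, method = "GET", headers = {}, body } = await req.json();

  // The fetch runs on the server, so the vendor never sees a browser Origin
  // header and CORS restrictions do not apply.
  const upstream = await fetch(url, {
    method,
    headers,
    body: body ? JSON.stringify(body) : undefined,
  });

  // Pass the upstream status and body back to the client unchanged.
  const text = await upstream.text();
  return new Response(text, {
    status: upstream.status,
    headers: {
      "Content-Type": upstream.headers.get("Content-Type") ?? "application/json",
    },
  });
}
```

The client then calls `/api/llm/run` with the target URL in the payload instead of calling the vendor directly, which is what makes server mode immune to CORS.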

Default request URL: `https://api.openai.com/v1/models`. The response pane reads "Send a request to see the response…" until a request completes.