Search results for "OpenAI-compatible APIs"
Navigate OpenAI-compatible providers with ease.
OpenAI-compatible LLM serving on Ascend NPUs.
GPU-accelerated LocalAI for local AI API.
Deploy AI models with NVIDIA NIM anywhere.
Gemini via OpenAI API: easy integration.
High-throughput LLM inference.
Self-host OpenAI-compatible APIs.
Deploy vLLM with Docker/GPU for fast AI inference.
Venice.ai API integration for privacy-first apps.
OpenAI-compatible API for local Ollama.
YandexGPT via OpenAI proxy.
Run and connect to local or cloud LLMs.
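All of the providers listed above expose the same OpenAI-style `/v1/chat/completions` endpoint, so a single client can target any of them by swapping the base URL. A minimal stdlib-only sketch, assuming a hypothetical local server (the URL, model name, and API key below are placeholders, not values from any specific provider):

```python
import json
import urllib.request

# Assumed local endpoint -- replace with your provider's base URL
# (e.g. an Ollama, vLLM, or LocalAI server).
BASE_URL = "http://localhost:11434/v1"
API_KEY = "not-needed-for-local"  # many local servers ignore the key

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a standard chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("llama3", "Hello!")
# To actually send it: urllib.request.urlopen(req)
```

Because the request shape is identical across providers, only `BASE_URL`, `API_KEY`, and the model name change when switching backends.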