Search results for "list-models"
Query semantic models quickly and clearly.
Manage AI models from CLI with ease.
List supported models in an instant.
Enable local Ollama MCP tools.
Enable local Ollama models as fast AI tools.
Run local Ollama models in the agent.
Integrate local LLMs for cheaper tasks.
Integrate local Ollama models.
Run local Ollama models via MCP for fast results.
Streamline OpenAI API workflows.
Organize LLM configs across providers in Notion.