ollama-local

Community

Run LLMs locally with Ollama.

Author: ollieb89
Version: 1.0.0
Installs: 0

System Documentation

What problem does it solve?

This Skill allows you to manage and interact with Large Language Models (LLMs) running locally on your machine using the Ollama platform, enabling powerful AI capabilities without relying on external cloud services.

Core Features & Use Cases

  • Model Management: Easily list, pull, and remove Ollama models.
  • Inference: Perform chat-based conversations and text completions with local LLMs.
  • Embeddings: Generate vector embeddings for text, useful for semantic search and retrieval.
  • Tool Use: Leverage models capable of function calling for structured interactions.
  • Sub-Agent Integration: Spawn local LLM-powered sub-agents within the OpenClaw framework for complex workflows.
  • Example Use Case: Run a coding assistant locally to review your code, generate documentation, or answer technical questions, all while keeping your data private. (A minimal API sketch follows this list.)
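
For a feel of what inference and embeddings look like against a local Ollama server, here is a minimal sketch that calls the server's REST API directly. It assumes Ollama is running at its default address (http://localhost:11434), that a chat-capable model such as llama3 has already been pulled, and that the requests library is installed; the skill's own scripts/ollama.py wrapper may expose these operations through its own commands instead.

    import requests

    OLLAMA = "http://localhost:11434"

    # Chat completion: send a message list, read back the assistant reply.
    chat = requests.post(f"{OLLAMA}/api/chat", json={
        "model": "llama3",   # assumed model name; use one you have pulled
        "messages": [{"role": "user", "content": "Explain what a vector embedding is."}],
        "stream": False,     # return one JSON object instead of a token stream
    }).json()
    print(chat["message"]["content"])

    # Embeddings: turn text into a vector for semantic search or retrieval.
    emb = requests.post(f"{OLLAMA}/api/embeddings", json={
        "model": "llama3",   # a dedicated embedding model (e.g. nomic-embed-text) usually works better
        "prompt": "Run LLMs locally with Ollama.",
    }).json()
    print(len(emb["embedding"]), "dimensions")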

Quick Start

List all the Ollama models you have installed by running python3 scripts/ollama.py list.
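The list command reports the models known to your local Ollama server. If you want to query the same information directly, the server exposes it over its REST API; a minimal sketch, assuming Ollama is running at the default http://localhost:11434 and that the requests library is available (the skill script itself may implement listing differently):

    import requests

    # Ask the local Ollama server for its installed models (the same data `list` reports).
    resp = requests.get("http://localhost:11434/api/tags").json()
    for model in resp.get("models", []):
        print(model["name"], f'{model["size"] / 1e9:.1f} GB')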

Dependency Matrix

Required Modules

None required

Components

  • scripts
  • references

💻 Claude Code Installation

Recommended: Let Claude install the Skill automatically. Simply copy and paste the text below into Claude Code.

Please help me install this Skill:
Name: ollama-local
Download link: https://github.com/ollieb89/claw_imperium/archive/main.zip#ollama-local

Please download this .zip file, extract it, and install it in the .claude/skills/ directory.
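
If you prefer to install the Skill by hand, the steps above amount to downloading the archive, extracting it, and copying the skill folder into your skills directory. A rough sketch in Python, assuming the skill lives in a top-level ollama-local folder inside the archive and that your skills directory is ~/.claude/skills/ (adjust both paths if your layout differs):

    import io, shutil, zipfile, urllib.request
    from pathlib import Path

    URL = "https://github.com/ollieb89/claw_imperium/archive/main.zip"
    SKILLS_DIR = Path.home() / ".claude" / "skills"

    # Download the repository archive and extract it to a temporary folder.
    data = urllib.request.urlopen(URL).read()
    tmp = Path("claw_imperium_tmp")
    zipfile.ZipFile(io.BytesIO(data)).extractall(tmp)

    # Copy the skill folder into the skills directory; the in-archive path is an assumption.
    src = tmp / "claw_imperium-main" / "ollama-local"
    shutil.copytree(src, SKILLS_DIR / "ollama-local", dirs_exist_ok=True)
    shutil.rmtree(tmp)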
