v0.1.5 · May 2026
Repository source: website/docs/en/skill.md

This page covers the Vercel Labs skills CLI commands for GitHub-based installation and verification of the OmniLLM skill across agent runtimes.

Skill Guide

OmniLLM ships with a first-party agent skill in the repository's skill/ directory. The skill teaches agents the crate's real boundaries:

  • runtime generation through Gateway, ProviderEndpoint, and EndpointProtocol (see the sketch after this list)
  • provider primitive runtime calls through PrimitiveRequest, PrimitiveProviderEndpoint, and Gateway::primitive_*
  • protocol parsing, emission, and transcoding through parse_*, emit_*, and transcode_*
  • typed multi-endpoint conversion through ApiRequest, ApiResponse, and WireFormat
  • replay fixture sanitization through ReplayFixture and sanitize_*
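
For orientation, here is what the canonical runtime path might look like. Only the names Gateway, GatewayBuilder, ProviderEndpoint, EndpointProtocol, and KeyConfig come from this page; every method, signature, enum variant, and the module layout below are assumptions for illustration, not the crate's confirmed API.

// Hypothetical sketch. Gateway, GatewayBuilder, ProviderEndpoint,
// EndpointProtocol, and KeyConfig are named on this page; every method,
// signature, variant, and the module layout here are assumed.
use omnillm::{EndpointProtocol, Gateway, GatewayBuilder, KeyConfig, ProviderEndpoint};

fn build_gateway() -> Gateway {
    GatewayBuilder::new()
        // Assumed constructor: a base URL plus the protocol the endpoint speaks.
        .endpoint(ProviderEndpoint::new(
            "https://api.openai.com/v1",  // assumed example URL
            EndpointProtocol::OpenAiChat, // assumed variant name
        ))
        // Assumed helper: build a key pool from an environment variable.
        .keys(KeyConfig::from_env("OPENAI_API_KEY"))
        .build()
}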

If you only need the Rust crate, go back to the Usage Guide. This page is specifically about installing the OmniLLM skill into coding agents.

Install With Vercel Labs Skills

These instructions use the Vercel Labs skills installer.

The skill is declared as omnillm. When you install with --skill omnillm, the installer creates the correct target directory name automatically.

Agent runtimes require only:

  • SKILL.md
  • references/
  • assets/

The installer may also add README.md next to the skill files and a project-level skills-lock.json.
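
As a concrete picture, a project-level --copy install for Claude Code would plausibly land the files as shown below. The your-project name is a placeholder, and the .claude/skills/ base path reflects Claude Code's usual skills convention rather than output quoted from this installer:

your-project/
├── .claude/
│   └── skills/
│       └── omnillm/
│           ├── SKILL.md
│           ├── README.md
│           ├── references/
│           └── assets/
└── skills-lock.json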

The commands below install directly from GitHub, so you do not need to clone the repository first. They use --copy so the installed skill stays self-contained in the target agent's directory.

Claude Code

npx skills add https://github.com/aiomni/omnillm --skill omnillm --agent claude-code --copy

Add -g for a user-level install.

Codex

npx skills add https://github.com/aiomni/omnillm --skill omnillm --agent codex --copy

Add -g for a user-level install.

OpenCode

npx skills add https://github.com/aiomni/omnillm --skill omnillm --agent opencode --copy

Add -g for a user-level install.

Verify Installation

Use the installer to confirm that the skill is present for the agent you care about:

npx skills ls -a codex --json

Replace codex with claude-code or opencode as needed.
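
If jq is installed, you can pipe the listing through it to pretty-print the output; this assumes only that --json writes JSON to stdout:

npx skills ls -a codex --json | jq .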

Then start a new session in your chosen agent and ask for something OmniLLM-specific, for example:

  • scaffold a GatewayBuilder flow with ProviderEndpoint and KeyConfig
  • configure an EndpointProtocol::*_compat runtime endpoint for an OpenAI-compatible wrapper that requires messages[].content[]
  • debug an OpenAI Chat compat stream where delta.role and the first delta.content arrive in the same SSE frame
  • pass through wrapper-specific OpenAI top-level fields such as enable_thinking with LlmRequest.vendor_extensions
  • explain when canonical Gateway APIs, provider primitive APIs, or transcode_* are correct
  • route a provider-native PrimitiveRequest through primitive_call, primitive_stream, or primitive_realtime
  • debug NoAvailableKey, BudgetExceeded, or Protocol(...) (a hedged sketch follows this list)
  • emit an ApiRequest into a provider wire format

If the skill does not appear immediately, restart the session and rerun npx skills ls -a <agent>.